Category Archives: Interactive

Loving this build by Dennis P Paul, an instrument that uses the profile of everyday objects to trigger loops.

The video below documents a live video compositor developed in collaboration with James Moore; I was the sole AS3 programmer on the project. The AIR app uses FFmpeg to break videos uploaded to the system into individual frames, and also to render out the finished edit (render out is still in development). FFmpeg is accessed through the NativeProcess functionality of the AIR runtime. The overall installation was set up using a Matrox TripleHead running three projectors. The project is a lot bigger than it looks in the video.
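The frame-extraction step the AIR app hands off to FFmpeg boils down to a single command line. A minimal sketch of the equivalent call in Python (the file paths and frame rate here are placeholders, not the actual project settings):

```python
import subprocess

def frame_extract_cmd(video_path, out_dir, fps=25):
    """Build the FFmpeg command that splits a video into numbered frames."""
    return [
        "ffmpeg",
        "-i", video_path,             # input video uploaded to the system
        "-vf", f"fps={fps}",          # sample at a fixed frame rate
        f"{out_dir}/frame_%05d.png",  # one numbered PNG per frame
    ]

cmd = frame_extract_cmd("uploads/clip.mp4", "frames")
print(" ".join(cmd))
# subprocess.run(cmd, check=True)  # uncomment where FFmpeg is installed
```

In the AIR app the same command would be passed to a `NativeProcess` instance instead of `subprocess`, but the FFmpeg invocation itself is identical.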

I really enjoyed the collaboration with James. It was nice to take a back seat in the decision making and just concentrate on how the application would be coded.

The left projection functions as a media viewer allowing the user to browse through files in the assets folder. The center screen allows the user to edit the content before it is played back. The edit choices are stylised and restrictive to fit in with the overall aesthetic of the system which is based around a minimalist grid. The right side projection shows a playback of the video being edited with the changes appearing in real time.

A beautiful way of getting your music out into the community.

Something new for Cornwall!

A hackerspace for anyone, based at the Tremough Campus of University College Falmouth.

If you're interested in joining us, here is a link to the site:

CORNWALL HACKERSPACE

Occasionally, I ponder a machine capable of reading all the chaotic thoughts inside my head at any given time. Once the thoughts have been collated, this device should find a tangible yet beautiful medium through which to output them in a cohesive manner. The closest technology has come to this so far, that I am aware of, is reconstructing the brain's visual experience. At UC Berkeley, scientists have used an fMRI system to record the blood flow through subjects' visual cortices as they watch YouTube videos. Using this data, the scientists have been able to reconstruct the visuals the subjects were being fed.

article via gizmodo

There are some obvious concerns when it comes to human rights and violation of privacy, but that's not what concerns me the most. Once they have invented this mind-reading device, what if my thoughts output like this:

Article via MAKE

Whilst researching artificial intelligence I came across this article. David Cope is a composer who wrote a computer program he called Experiments in Musical Intelligence (EMI). The program was written to analyse past compositions by great composers such as Bach or Beethoven and read the notes like data. Once the data has been read in, the program identifies common patterns and styles within the compositions and recreates those techniques in new arrangements and compositions. The thing I find most interesting about EMI is the reaction the software received from audiences of music lovers. If a machine can mimic and create works of beauty equal to those of the greats, then does this discredit and invalidate the idea of the masterpiece? It reminds me of the core Marxist belief of metaphysical materialism, which states:

the mind, or thought, is purely the product of the material composition of the brain.

Quote from site
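EMI's actual analysis is far more sophisticated than this, but the core loop it describes, learn which patterns follow which in the source material, then walk those patterns to produce a new arrangement, can be illustrated with a toy first-order Markov chain over note names (the melody and note labels below are made up for the example):

```python
import random
from collections import defaultdict

def learn_transitions(melody):
    """Count which note follows which -- the 'common patterns' in the data."""
    table = defaultdict(list)
    for a, b in zip(melody, melody[1:]):
        table[a].append(b)
    return table

def recompose(table, start, length, seed=0):
    """Walk the learned transition table to generate a new arrangement."""
    rng = random.Random(seed)
    out = [start]
    for _ in range(length - 1):
        options = table.get(out[-1])
        if not options:
            break
        out.append(rng.choice(options))
    return out

source_melody = ["C", "E", "G", "E", "C", "G", "E", "C"]
table = learn_transitions(source_melody)
new_melody = recompose(table, "C", 8)
print(new_melody)
```

Every interval in the output was observed somewhere in the input, which is roughly the property that makes EMI's output sound stylistically "like Bach" to listeners.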

Recently I have been looking into cybernetics. The website http://www.v2.nl/ seems to be an invaluable resource, so I thought I would post it here in case it helps others as well.

Melvin the Magical Mixed Media Machine (or just Melvin the Machine) is best described as a Rube Goldberg machine with a twist. Besides doing what Rube Goldberg machines do best, performing a simple task as inefficiently as possible, often in the form of a chain reaction, Melvin has an identity. Actually, the only purpose of this machine is promoting its own identity.

SITE

Yet another Rube Goldberg machine, but I really love the social media twist on this one.

Well worth a watch.

Our camera uses 36 fixed-focus 2 megapixel mobile phone camera modules. The camera modules are mounted in a robust, 3D-printed, ball-shaped enclosure that is padded with foam and handles just like a ball. Our camera contains an accelerometer which we use to measure launch acceleration. Integration lets us predict rise time to the highest point, where we trigger the exposure. After catching the ball camera, pictures are downloaded in seconds using USB and automatically shown in our spherical panoramic viewer. This lets users interactively explore a full representation of the captured environment.

Amazing idea