
Tag Archives: Interactive

2012 Yeosu EXPO HYUNDAI MOTOR GROUP – Hyper-Matrix from yangsookyun on Vimeo.

Epic shapeshifting pixel wall designed by the Jonpasang media group, consisting of:

Jin-Yo Mok
Sookyun Yang
Earl Park
Jin-Wook Yeo


Recently I have been looking into cybernetics. The website http://www.v2.nl/ seems to be an invaluable resource, so I thought I would post it here in case it helps others as well.


Melvin the Magical Mixed Media Machine (or just Melvin the Machine) is best described as a Rube Goldberg machine with a twist. Besides doing what Rube Goldberg machines do best – performing a simple task as inefficiently as possible, often in the form of a chain reaction – Melvin has an identity. In fact, the only purpose of this machine is promoting its own identity.

SITE

Yet another Rube Goldberg machine, but I really love the social media twist on this one.

 

Life Writer by Christa Sommerer and Laurent Mignonneau.

Website

We are artists working since 1991 on the creation of interactive computer installations for which we design metaphoric, emotional, natural, intuitive and multi-modal interfaces. The interactive experiences we create are situated between art, design, entertainment and edutainment. One of our key concepts is to create interactive artworks that constantly change and evolve and adapt to the users’ interaction input [1]. For several of our interactive systems we have therefore applied Artificial Life and Complex Systems principles and used genetic programming to create open-ended systems that can evolve and develop over time through user interaction.
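The idea of an open-ended system that evolves through user interaction can be sketched with a toy genetic algorithm. This is not the artists' actual code, just an illustration of the principle: a population of genomes is repeatedly selected and mutated towards whatever the "interaction input" currently is, so the system drifts with the user over time.

```python
import random

def evolve(population, user_input, mutation_rate=0.1, rng=None):
    """One generation of a toy evolution step.

    Each genome is a list of floats; fitness is closeness to the
    user's interaction input (also a list of floats).
    """
    rng = rng or random.Random()

    def fitness(genome):
        # Negative squared distance to the user input: closer scores higher.
        return -sum((g - u) ** 2 for g, u in zip(genome, user_input))

    # Keep the fitter half of the population as parents.
    parents = sorted(population, key=fitness, reverse=True)[: len(population) // 2]

    # Refill the population with mutated copies of the parents.
    children = [[g + rng.gauss(0, mutation_rate) for g in p] for p in parents]
    return parents + children

# A tiny run: genomes drift toward whatever the "user" is doing.
rng = random.Random(42)
pop = [[rng.uniform(-1, 1) for _ in range(3)] for _ in range(8)]
target = [0.5, -0.2, 0.8]  # stand-in for interaction input
for _ in range(50):
    pop = evolve(pop, target, rng=rng)
```

Because the parents survive each generation unmutated, the best genome found so far is never lost, so the population steadily tracks the input.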


I love Kate’s outlook on life. I think we should all try to be a little more glacier-like!

Kate Hartman @ TED Talks

Over the next academic year I intend to show a lot more of the UCF student work on this blog. The staff and students put their heart and soul into the work that is produced at Falmouth. Personally, I get a massive sense of pride from what I do and want to share the results with the world. So here is the first of many: a video documenting the first Kinect-based student project to come from UCF.

I am pretty sure that I have posted about this technique before, but this is a particularly good example. The waterfall is by far the biggest installation of this type I have seen, and the lighting really emphasises the lines produced by the falling water.

Another example, only this time wrapped into a cylinder:

The video above is a proof of concept for a Kinect guitar pedal project I have begun in collaboration with Jem Mackay, another technical instructor at UCF. The general plan is to use the Kinect controller to trigger as much functionality as possible within Apple’s MainStage. MainStage is capable of doing some awesome things such as live loop recording and backing track control, along with the main feature I intend to utilise: live guitar effects processing. Following on from the video above, I intend to work on the sensitivity of the pedal so that using it feels more tangible. The pedal’s functionality is pointless if it does not perform fast and accurately enough to fulfil the needs of a live performer. Eventually it might be good to add some sort of visual feedback to show where the pedals are. This could be done using a mini projector to project the pedal boundaries and their functions onto the floor where the guitarist is standing.
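The core mapping could be sketched roughly as follows (hypothetical zone sizes and names, not the actual project code): a tracked foot position is mapped onto a row of virtual pedal zones on the floor, with a height threshold so a press only registers when the foot is close to the floor, which is one simple way to make the pedal feel more deliberate.

```python
# Hypothetical sketch: map a tracked foot position (in metres, roughly
# as a depth-camera skeleton might report it) onto virtual pedal zones.
PEDAL_COUNT = 4
ROW_LEFT, ROW_RIGHT = -0.6, 0.6      # lateral extent of the pedal row
PRESS_HEIGHT = 0.05                  # foot must be within 5 cm of floor

def pedal_for_foot(x, y):
    """Return the index of the pedal under the foot, or None.

    x: lateral foot position, y: foot height above the floor.
    """
    if y > PRESS_HEIGHT:
        return None                  # foot is hovering, not pressing
    if not (ROW_LEFT <= x < ROW_RIGHT):
        return None                  # outside the pedal row entirely
    width = (ROW_RIGHT - ROW_LEFT) / PEDAL_COUNT
    return int((x - ROW_LEFT) // width)
```

So a foot at the far left of the row near the floor selects pedal 0, one at the far right selects pedal 3, and a hovering foot triggers nothing.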

We were very lucky recently to have Kim Cascone visit UCF.

Wikipedia says it better than I ever could: Kim Cascone

Kim was a very intense and provocative speaker who, there was no doubt in my mind, has tremendous passion for his work and field of expertise. He seemed to be hyper-observant, at a level where no detail was left unscrutinised. He took us on journeys through past memories, recalling the tiniest details, from the intrusive tones of coins dropping onto a hard sidewalk to the noise of agitated, overactive birds. I was really impressed by his work with World Building. Never before had I thought about the complexities of the sound design behind films. Kim explained what he called scope and focus as key concepts for understanding the situation of the listener. From his explanation, my interpretation of these concepts goes as follows:

Focus is a directional aim of attention from the listener on certain points in the environment. Scope is almost like the circumference around the focus point: the bigger the scope, the larger the area in which the listener is able to hear sounds. I am sure that my definition is not quite right, but the way I imagine this to look visually is almost like a cone protruding away from the listener, with the wide end furthest away. As the scope and focus get larger and less specific, the end of the cone becomes larger, allowing a lot more sounds to be heard. If the cone’s base becomes smaller, then the listener can really focus in on very specific sounds.
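That cone picture can be turned into a toy calculation (entirely my own interpretation, not Kim's formulation): a sound is "heard" if the angle between the listener's focus direction and the source is within the scope.

```python
import math

def audible(focus_dir, scope_deg, source):
    """Toy test of the cone picture.

    focus_dir: (x, y) direction the listener attends to.
    scope_deg: half-angle of the cone; a big scope admits more sounds.
    source:    (x, y) position of the sound relative to the listener.
    """
    dot = focus_dir[0] * source[0] + focus_dir[1] * source[1]
    norm = math.hypot(*focus_dir) * math.hypot(*source)
    if norm == 0:
        return True  # a source at the listener is always heard
    angle = math.degrees(math.acos(max(-1.0, min(1.0, dot / norm))))
    return angle <= scope_deg
```

With a narrow 10° scope only a source dead ahead is heard; widen the scope to 80° and an off-axis source falls inside the cone too.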

I was very interested in the battle that seemed persistent in Kim’s work between the auditory field, which is 3D, and the stereo recording, which exists only in 2D. Kim used the term ‘grain’ to explain how, if done well, a 2D stereo signal can be amplified into a 3D experience by the listener. Grain follows the listener: past experiences and sensations amplify and reconstruct the 2D signal.

A small blog entry won’t do this man justice, so if you ever get the chance to see Kim talk then it is well worth going.

For the last 2-3 months, in the corner of my office, a project has been slowly growing as ordered bits and pieces have been arriving sporadically from eBay’s global sellers. Finally, this morning I found an hour or two at work to make a concerted effort to get everything pieced together so that I could test that the project would work. The video above is a proof of concept for my first attempt at making a frustrated total internal reflection (FTIR) touch screen table. The theory is simple: FTIR works by shining infrared light into the sides of a sheet of acrylic so that it reflects internally around the inside of the acrylic. Internal reflection continues until something on the surface of the acrylic sheet disturbs it, deflecting the infrared out of the acrylic and allowing an infrared camera (a bodged webcam) to spot it, thus detecting where the object is on the surface of the table. There are hundreds of articles online about FTIR, all of them more concise and better worded:

http://modin.yuri.at/teaching/TangibleWorkshop/papers/multitouch.pdf

http://en.wikipedia.org/wiki/Multi-touch

http://wiki.nuigroup.com/Multitouch_Technologies
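The camera side of the trick can be sketched in a few lines (a simplified stand-in; real multitouch tracking software does far more filtering and calibration): threshold the infrared frame and report the centroid of each bright blob as a touch point.

```python
def find_touches(frame, threshold=128):
    """Minimal sketch of FTIR touch detection: bright blobs in an
    infrared frame are touches. `frame` is a 2D list of pixel
    brightnesses (a stand-in for a webcam image); returns one
    (row, col) centroid per connected bright region.
    """
    rows, cols = len(frame), len(frame[0])
    seen = [[False] * cols for _ in range(rows)]
    touches = []
    for r in range(rows):
        for c in range(cols):
            if frame[r][c] >= threshold and not seen[r][c]:
                # Flood-fill this blob and average its pixel coordinates.
                stack, pixels = [(r, c)], []
                seen[r][c] = True
                while stack:
                    y, x = stack.pop()
                    pixels.append((y, x))
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < rows and 0 <= nx < cols
                                and frame[ny][nx] >= threshold
                                and not seen[ny][nx]):
                            seen[ny][nx] = True
                            stack.append((ny, nx))
                cy = sum(p[0] for p in pixels) / len(pixels)
                cx = sum(p[1] for p in pixels) / len(pixels)
                touches.append((cy, cx))
    return touches

# Simulated IR frame: two fingertips disturbing the internal reflection.
frame = [[0] * 8 for _ in range(6)]
frame[1][1] = frame[1][2] = 200
frame[4][5] = 255
touches = find_touches(frame)
```

Each finger shows up as one centroid, so the two simulated touches above come back as two (row, col) points.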

As a first run I am very pleased with the results. Obviously there is still a lot to be done: I need to work on the rear projection surface and on what is called a compliant surface between the acrylic and the projection surface. The good news from today’s experimenting is that I know it works.
