
Category Archives: Interactive

Zach Lieberman is one of the minds behind the openFrameworks toolkit. Here he is talking about some of his projects. The main reason I have posted this video is for the project that is introduced six minutes in. Zach has been collaborating with an old-school graffiti artist who goes by the name of Tempt. The twist is that Tempt now suffers from Lou Gehrig’s disease and is paralysed. Zach Lieberman and a few others created a system for tracking Tempt’s eyes and relaying the movements back to a drawing package. The accuracy is quite amazing and the project is very inspirational.


Life Writer by Christa Sommerer and Laurent Mignonneau.

Website

We are artists working since 1991 on the creation of interactive computer installations for which we design metaphoric, emotional, natural, intuitive and multi-modal interfaces. The interactive experiences we create are situated between art, design, entertainment and edutainment. One of our key concepts is to create interactive artworks that constantly change and evolve and adapt to the users’ interaction input [1]. For several of our interactive systems we have therefore applied Artificial Life and Complex Systems principles and used genetic programming to create open-ended systems that can evolve and develop over time through user interaction.


Recent experiments with the AIR runtime environment, NativeProcess and FFMPEG got me thinking about digital video. The digital strands of narrative intertwine in a confusion of noughts and ones. Auditory vibes harmonise with the luminosity of performing pixels, conducted by semiconductors fluent in machine. FFMPEG is a decoder/encoder for digital content capable of converting one video format to another, separating audio from video, breaking video up into frames as JPEGs and so much more. One of the most basic features that interests me the most about FFMPEG is -s, the size parameter, which FFMPEG uses to scale the video as it is being converted.

As a person of great ability in the art of procrastination, instead of the task in hand I began contemplating the consequence of encoding a video into a containing dimension of 1×1 pixels. After some experimentation I disproved my first naive/romantic hypothesis of what this 1×1 video might produce. Without considering the repercussions in depth, I had thought that the result of this scaling might be a colour narrative: a timeline of mood brought forth by hue, saturation and brightness, presented by a single pixel against an axis of time. The reality is that FFMPEG is only able to resize and scale video in factors of two, so next I tried a 2×2 pixel square. Still the notion of a colour narrative was far out of reach: once encoding of the 2×2 video was complete, the playback result was a grayscale blur, definitely not a consequence of the colours within the video. I decided I would try the process one more time with a 4×4 pixel result so that more of the colour detail was kept. I was extremely pleased with the result: at 4×4 the mood of the video was very apparent, though the detail had definitely been dissolved. As an extra bonus, the audio had been preserved to complement the mood of the frames.

I intend to follow this up with some experimental data visualisations of the pixel colour over time, very similar to this example by Brendan Dawes, but for now see below for the result of the 4×4 square scaled back up to 320×240.
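The colour-narrative idea can be sketched in a few lines of pure Python. This is not FFMPEG's actual scaler (which does far more sophisticated filtering); it only illustrates the principle that shrinking a frame towards a single pixel produces, in essence, the mean colour of that frame. The frame format here (a flat list of RGB tuples) is a made-up stand-in for real decoded video data.

```python
# Collapse each frame to one "average" colour -- the intuition behind
# scaling a video down towards a single pixel per frame.

def average_colour(frame):
    """Mean colour of a frame given as a list of (r, g, b) tuples."""
    n = len(frame)
    r = sum(p[0] for p in frame) // n
    g = sum(p[1] for p in frame) // n
    b = sum(p[2] for p in frame) // n
    return (r, g, b)

def colour_narrative(frames):
    """One averaged colour per frame: a timeline of mood over time."""
    return [average_colour(f) for f in frames]

# Two tiny 2x2 "frames": one mostly red, one mostly blue.
frames = [
    [(255, 0, 0), (255, 0, 0), (255, 0, 0), (0, 0, 255)],
    [(0, 0, 255), (0, 0, 255), (0, 0, 255), (255, 0, 0)],
]
print(colour_narrative(frames))  # -> [(191, 0, 63), (63, 0, 191)]
```

Plotting these tuples against time would give exactly the hue/saturation/brightness timeline described above.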

Over the next academic year I intend to show a lot more of the UCF student work on this blog. The staff and students put their heart and soul into the work that is produced at Falmouth. Personally, I get a massive sense of pride from what I do and want to share the results with the world. So here is the first of many: a video documenting the first Kinect-based student project to come from UCF.

Just a few photos and a video from the three days of training I took part in, which started on Monday the 11th of July. Over the three days we put together two machines: a gantry-based plotter and a turntable-based scanner. The interesting part was tooling the plotter to pour slip over clay. The pictures below are just a few photos I took over the three days. Mat, another attendee from the CNC workshop, documented the whole process better than I did; you can find his photos here

Recently I attended a three-day training session on building home-brew CNC machines, run by Dave Turtle from the RCA. It was an amazing three days and I will post the photos and videos of the results as soon as possible. One of the hurdles that came crashing in on day two was the limitation of running the kit from the parallel port. There really aren’t many computers these days that still ship with a parallel port as standard, and my nice shiny Mac certainly does not come equipped with one. I was sure the solution to this problem was the trusty Arduino. There have been many projects where the Arduino has been used as the heart of a CNC machine; the first that comes to mind is the RepRap. There is also another CNC project called Contraptor which utilises the Arduino at its heart. The home site for the Contraptor project has a lot of useful information, and it was there I came across Grbl.

I tried the RepRap g-code interpreter, FiveD, but I could not get it to compile for the Arduino (any tips would be gratefully received). I also tried a few other interpreters with varying success: teapot, rsteppercontrol and arduino-gcode-interpreter-new. I really struggled, probably partly due to my lack of understanding when it comes to g-code. I had no success over the three days of training, but I did find Grbl, though I didn’t have the kit to test it. Grbl seemed like a very simple solution; the main hurdle when it comes to implementing it is that you need to use avrdude to flash it to the Arduino rather than just sending it over USB from the Arduino IDE. I have never done this before, so I let the Arduino rest for the remainder of the training with a mind to try it as soon as possible.

Today I started messing around with flashing Grbl to the Arduino and was caught out by several issues which slowed my progress. There are already several sites with information on how to do this, but I found I needed bits from all my sources to get the job done, so I thought I would document my process in case anyone else finds it useful.

First off the sites that proved to be most useful:

http://www.sparkfun.com/tutorials/247

http://www.arduino.cc/en/Hacking/Bootloader

http://dank.bengler.no/-/page/show/5471_gettinggrbl

I started by downloading the prebuilt hex files for Grbl here

I then downloaded Crosspack-AVR from here which installs a version of AVRDUDE (used to handle flashing the data to the Arduino)

The Arduino that is going to act as a programmer needs to have the programming firmware uploaded to it. This is a very simple task as it is all built into the Arduino IDE: open the Arduino IDE, go to File -> Examples -> ArduinoISP, then upload the sketch to the Arduino. The Arduino is now fully set up to flash another Arduino.

The next step was to wire one Arduino to the other to use as a programmer. I found the wiring diagram from Sparkfun here and the picture below is my version of the wiring. One thing that Sparkfun didn’t explain is that you must disable auto-reset on serial connection; I found out how to do this here. I could not find a cable to suit so unfortunately I had to solder directly to the ICSP headers (not pretty).

Wiring for flashing the Arduino

Now all the setup is done it is time to put AVRDUDE to work; on a Mac this is done via the Terminal.

I found the terminal commands for AVRDUDE on Sparkfun here, about halfway down the page.

Command one (make sure the Arduino is ready for Grbl):

avrdude -P /dev/tty.usbserial-A9007VP6 -b 19200 -c avrisp -p m328p -v -e -U efuse:w:0x05:m -U hfuse:w:0xD6:m -U lfuse:w:0xFF:m

Change /dev/tty.usbserial-A9007VP6 to the name of the serial port that your Arduino is plugged into.

Command two (load Grbl):

avrdude -P /dev/tty.usbserial-A9007VP6 -b 19200 -c avrisp -p m328p -v -e -U flash:w:grbl.hex -U lock:w:0x0F:m

Change grbl.hex to the location of the Grbl hex file on your computer.

Hopefully that’s it: Grbl is now installed!

If you want to test that Grbl is working properly then you can download CoolTerm, a Mac GUI for sending and receiving information over serial ports.
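For anyone who wants to drive Grbl from a script rather than a terminal, its basic protocol is call-and-response: send one g-code line ending in a newline, then wait for Grbl to reply "ok" (or "error:…") before sending the next. Here is a minimal sketch of that idea; the `port` object is assumed to be a pyserial-style serial connection, and only the pure-Python line cleaning is guaranteed to run anywhere.

```python
# Sketch of Grbl's simple line-by-line streaming protocol.

def clean_line(line):
    """Strip comments and whitespace so only bare g-code is sent.
    (';' starts a comment in g-code.)"""
    return line.split(";", 1)[0].strip()

def stream(lines, port):
    """Send cleaned lines one at a time, waiting for Grbl's reply.
    `port` is assumed to behave like a pyserial Serial object."""
    for raw in lines:
        line = clean_line(raw)
        if not line:
            continue  # skip blank and comment-only lines
        port.write((line + "\n").encode())
        while True:
            reply = port.readline().decode().strip()
            if reply.startswith(("ok", "error")):
                break  # Grbl is ready for the next line

print(clean_line("G1 X10 Y5 ; move to start"))  # -> G1 X10 Y5
```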

I come from an art/design background but sometimes I wish I had some training as an engineer. So many of my past projects would have benefited from a little extra knowledge. I have bodged rickety frameworks from dowelling and 2×2, and hacked up sheets of metal to make mechanisms, all the time thinking there must be a better way. I have often thought how awesome it would be to own a laser cutter so I could bash out prototypes to my heart’s content. Unfortunately I don’t have the sort of money that it takes to acquire this sort of equipment. I am slowly getting my knowledge of engineering resources to a higher standard, but it is very slow progress. I came across MicroMax today and it looks very interesting, so I thought I would paste it up here in case it can help anyone else. The material looks awesome for rapidly prototyping small projects that need accurate and strong frameworks. The website can be found here

The video above is a proof of concept for a Kinect guitar pedal project I have begun in collaboration with Jem Mackay, another technical instructor at UCF. The general plan is to use the Kinect controller to trigger as much functionality as possible within the software Logic MainStage. MainStage is capable of doing some awesome things such as live loop recording and backing-track control, and the main feature I intend to utilise is the live guitar effects processing. Following on from the video above, I intend to work on the sensitivity of the pedal so that using it feels more tangible. The pedal’s functionality is pointless if it does not perform quickly and accurately enough to fulfil the needs of the live performer. Eventually it might be good to add some sort of visual feedback to show where the pedals are. This could be done using a mini projector to project the pedal boundaries and functions onto the floor where the guitarist is standing.
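The core of a virtual pedal board like this is a simple hit test: the tracker reports a foot position on the floor, and we check which pedal zone (if any) it falls inside. The sketch below is entirely hypothetical; the zone names, coordinates and sizes are made up, and the real project's Kinect tracking and MainStage plumbing are not shown.

```python
# Hypothetical pedal zones on a normalised floor plane, as
# (x, y, width, height) rectangles.
PEDALS = {
    "distortion": (0.0, 0.0, 0.3, 0.3),
    "delay":      (0.4, 0.0, 0.3, 0.3),
    "looper":     (0.8, 0.0, 0.3, 0.3),
}

def pedal_at(x, y, pedals=PEDALS):
    """Return the name of the pedal under the tracked foot, or None."""
    for name, (px, py, w, h) in pedals.items():
        if px <= x <= px + w and py <= y <= py + h:
            return name
    return None

print(pedal_at(0.5, 0.1))   # foot over the middle zone -> delay
print(pedal_at(0.35, 0.5))  # foot outside every zone   -> None
```

Projecting these same rectangles onto the floor with a mini projector would give the visual feedback mentioned above, since the drawing and the hit test would share one set of coordinates.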

We were very lucky recently to have Kim Cascone visit UCF.

Wikipedia says it better than I ever could: Kim Cascone

Kim was a very intense and provocative speaker who, there was no doubt in my mind, has tremendous passion for his work and field of expertise. He seemed to be hyper-observant, at a level where no detail was left unscrutinised. He took us on journeys through past memories, reminiscing over the tiniest details, from the intrusive tones of coins dropping onto a hard sidewalk to the noise of agitated, overactive birds. I was really impressed by his work in world building. Never before had I thought about the complexities of the sound design behind films. Kim explained what he called scope and focus as key concepts for understanding the situation of the listener. From his explanation, my interpretation of these concepts goes as follows:

Focus is a directional aim of attention from the listener on certain points in the environment. Scope is almost like the circumference around the focus point: the bigger the scope, the larger the area within which the listener is able to hear sounds. I am sure my definition is not quite right, but the way I imagine this to look visually is almost like a cone protruding away from the listener with the wide end furthest away. As the scope and focus get larger and less specific, the end of the cone becomes larger, allowing a lot more sounds to be heard. If the cone’s base becomes smaller then the listener can really focus in on very specific sounds.

I was very interested in the battle that seemed to be persistent in Kim’s work between the auditory field, which is 3D, and the stereo recording, which exists only in 2D. Kim used the term ‘grain’ to explain how, if done well, a 2D stereo signal can be amplified into a 3D experience by the listener. Grain follows the listener; past experiences and sensations amplify and reconstruct the 2D signal.

A small blog entry won’t do this man justice so if you ever get the chance to see Kim talk then it is well worth going.

Screen shot of the NaturalDocs command line tool

Recently, I have been really enjoying AS3 because of its versatility. The projects I have been working on are growing quite complex and hard to manage. The main issue I have been having is remembering what each class does. I have been very careful to comment code, but sometimes it would be nice just to have a document that describes all the variables, methods etc. Today I started looking at automated systems to create documentation from AS3 project code and I came across NaturalDocs. It’s a simple-to-use command line tool which goes through all my project code reading the comments and structure; NaturalDocs then creates a full-blown website documenting the code.

A couple of weeks ago I wrote an application in AS3 that used FFMPEG for video manipulation. The application utilised the NativeProcess class from Adobe AIR to execute native processes via the command line. The technique really adds another level to the versatility of AS3. Since the video app I have been itching to use the technique again, so I decided to make a very simple GUI for NaturalDocs. The picture above is a screen shot of an AIR app with NaturalDocs embedded in it. The app is very simple: all the user has to do is click the top box and locate the project code, then click the second box and define a destination folder for the docs. Once “create docs” is clicked, NaturalDocs goes ahead and creates the documentation for the code. It’s very simple but hopefully it will be quite useful. I was very impressed with the work that Greg Valure has done on NaturalDocs; it’s a very intuitive and efficient system to use.
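The pattern the AIR app uses (shelling out to a command line tool via NativeProcess) translates to almost any language. Here is the same idea sketched in Python, purely as an illustration: launch an external command, capture its output, and inspect the exit code. The NaturalDocs invocation in the comment is an assumption about its flags and may differ by version; the runnable example uses `echo` so it works anywhere.

```python
import subprocess

def run_tool(args):
    """Run a native process and return (exit_code, stdout_text)."""
    result = subprocess.run(args, capture_output=True, text=True)
    return result.returncode, result.stdout

# Hypothetical NaturalDocs call (input dir, output format/dir, project dir):
# run_tool(["NaturalDocs", "-i", "src", "-o", "HTML", "docs", "-p", "proj"])

code, out = run_tool(["echo", "docs generated"])
print(code, out.strip())  # -> 0 docs generated
```

A GUI wrapper like the one described above is then just two folder pickers feeding paths into a call like this.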

Some of the AS3 resources I have been working on are becoming very useful. I can’t wait to start documenting some of the more universal classes I have written and then releasing them into the great wild west that is the internet.

A screen shot of the documentation that is generated:

Screen shot of the documentation that is generated by NaturalDocs

IMPORTANT: None of the work done for NaturalDocs is mine; I am only responsible for the GUI used to shortcut the functionality of NaturalDocs.

I will upload the app and post a link to download it soon. I don’t think this would break the GPL licence that the NaturalDocs project is released under.
