
Category Archives: My work

The video below is documentation for a live video compositor project developed in collaboration with James Moore. I was the sole AS3 programmer on the project. The AIR app takes advantage of FFmpeg to break videos uploaded to the system down into individual frames, and eventually to render the edited result back out (the render-out stage is still in development). FFmpeg was accessed via the NativeProcess functionality of the AIR runtime. The overall installation was set up using a Matrox TripleHead driving three projectors. The project is a lot bigger than it looks in the video.
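For flavour, this is roughly the kind of FFmpeg invocation involved in the frame-splitting step. The project drove FFmpeg from AS3 via NativeProcess; this little Python sketch just builds an equivalent argument list, and the paths, filenames and frame rate are hypothetical.

```python
# Illustrative only: the real project launched FFmpeg from AS3 via
# NativeProcess. This sketch shows the sort of argument list used to
# break an uploaded video into numbered frames. Paths are made up.

def frame_extract_args(video_path, out_dir, fps=25):
    """Build an FFmpeg argument list that splits a video into numbered PNG frames."""
    return [
        "ffmpeg",
        "-i", video_path,              # input video uploaded to the system
        "-vf", f"fps={fps}",           # sample at a fixed frame rate
        f"{out_dir}/frame_%05d.png",   # numbered output frames
    ]

args = frame_extract_args("uploads/clip.mov", "frames")
```

Handing a list like this to NativeProcess (or a shell) gives you a folder of frames the compositor can then work on.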

I really enjoyed the collaboration with James. It was nice to take a back seat in the decision making and just concentrate on how the application would be coded.

The left projection functions as a media viewer allowing the user to browse through files in the assets folder. The center screen allows the user to edit the content before it is played back. The edit choices are stylised and restrictive to fit in with the overall aesthetic of the system which is based around a minimalist grid. The right side projection shows a playback of the video being edited with the changes appearing in real time.

Android 1 – iPhone 0 (eventually sorted though)

I have used PhoneGap before without the aid of Dreamweaver, and although it works well the setup process is a little complex and I never really got it working for iPhone on my Mac. A lot of students have been asking for training sessions on writing apps, so I thought the CS5.5 / PhoneGap integration might streamline the whole process. As with most things Adobe do, the workflow is very simple, and when using the jQuery Mobile template and exporting to Android everything worked really well. The problems came when I started trying to export for the iPhone (surprise, surprise). The main issue was that once I pointed the setting to the developer folder at the root of my hard drive and clicked build and emulate I got an error:

The build of application failed to complete successfully

After hunting round in the settings for a while I decided to hit up Google. I found the solution in the Adobe forums, here is the link:

I was unsure whether I was supposed to change the Camera.h file at:

/Applications/Adobe Dreamweaver CS5.5/Configuration/NativeAppFramework/DWPhoneGap

So I changed them both for good measure, and everything now works fine.

Something new for Cornwall!

A hackerspace for anyone, based at University College Falmouth's Tremough Campus.

If you're interested in joining us, here is a link to the site:


I learnt so much in the homebrew CNC workshop, and I am very aware that if I don't do something with my new-found knowledge soon I will forget most of it. I am a collector of broken junk, and in my collection I have an A3 scanner and an A4 scanner. I have decided to turn these deserted and tired old pieces of equipment into a homebrew plotter / engraver (if all goes well). The first step was to take the Grbl-ready Arduino from a previous post and make it control a stepper motor. I have been lent three stepper motor drivers to experiment with; the documentation for the drivers can be found here. The A4983 stepper motor driver is a very compact driver, perfect for smaller CNC projects. Its documentation is very good, so it was easy to connect it up to the Arduino and the stepper motor. The pictures below show my setup with the driver running full steps. I have not experimented with microsteps yet, but the results are very promising.

The wiring:

Mac to Arduino - Arduino to stepper motor
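As a rough mental model of what the Arduino is doing for the A4983: the driver advances the motor one increment on each rising edge of its STEP pin, with DIR setting the direction. Grbl generates this timing in interrupts on real hardware; the Python sketch below just models the pulse train as a list of pin states, and everything in it is illustrative rather than actual project code.

```python
# Hypothetical model of STEP/DIR signalling for a driver like the A4983:
# one rising edge on STEP = one (full) step; DIR sets direction.

def step_pulses(steps, direction=1):
    """Return a list of (dir_pin, step_pin) states for a run of full steps."""
    pulses = []
    for _ in range(steps):
        pulses.append((direction, 1))  # rising edge: motor moves one step
        pulses.append((direction, 0))  # falling edge: idle until next pulse
    return pulses

one_rev = step_pulses(200)  # 200 full steps = one revolution on a 1.8-degree motor
```

Microstepping just multiplies the number of pulses per revolution; the signalling pattern stays the same.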

Just a few photos and a video from the three days of training I took part in, starting on Monday the 11th of July. Over the three days we put together two machines: a gantry-based plotter and a turntable-based scanner. The interesting part was tooling the plotter to pour slip over clay. The pictures below are just a few photos I took over the three days. Mat, another attendee from the CNC workshop, documented the whole process better than I did; you can find his photos here.

The video above is a proof of concept for a Kinect guitar pedal project I have begun in collaboration with Jem Mackay, another technical instructor at UCF. The general plan is to use the Kinect controller to trigger as much functionality as possible within Logic's MainStage. MainStage is capable of doing some awesome things such as live loop recording and backing track control, as well as the main feature I intend to utilise: live guitar effects processing. From the video above, I intend to work on the sensitivity of the pedal so that using it feels more tangible. The pedal's functionality is pointless if it does not perform fast and accurately enough to fulfil the needs of a live performer. Eventually it might be good to add some sort of visual feedback to show where the pedals are. This could be done using a mini projector to project the pedal boundaries and their functions onto the floor where the guitarist is standing.
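The core of the pedal idea is just mapping a tracked foot position onto floor rectangles, one per virtual pedal. This is a hypothetical sketch of that zone logic, not the project's actual code, and the pedal names and zone sizes are made up.

```python
# Hypothetical pedal-zone logic: the Kinect gives a foot position in
# floor coordinates; each virtual pedal is a strip of floor. All names
# and dimensions here are invented for illustration.

PEDALS = {
    "distortion": (0.0, 0.5),  # (x_min, x_max) in metres across the floor
    "delay":      (0.5, 1.0),
    "loop":       (1.0, 1.5),
}

def pedal_under_foot(x):
    """Return the pedal whose zone contains foot position x, or None."""
    for name, (lo, hi) in PEDALS.items():
        if lo <= x < hi:
            return name
    return None
```

Tuning the sensitivity then becomes a question of how much hysteresis to add around each zone boundary so a foot hovering on an edge doesn't flicker between pedals.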

For the last two to three months a project has been slowly growing in the corner of my office, as ordered bits and pieces have arrived sporadically from eBay's global sellers. Finally, this morning I found an hour or two at work to make a concerted effort to get everything pieced together and test that the project would work. The video above is a proof of concept for my first attempt at making a frustrated total internal reflection (FTIR) touch screen table. The theory is simple: FTIR works by shining infrared light into the sides of a sheet of acrylic so that it reflects internally around the inside of the sheet. Internal reflection continues until something on the surface of the acrylic disturbs it, deflecting the infrared out of the acrylic and allowing an infrared camera (a bodged webcam) to spot it, thus detecting where the object is on the surface of the table. There are hundreds of articles online about FTIR, all of them more concise and better worded.

As a first run I am very pleased with the results. Obviously there is still a lot to be done. I need to work on the rear projection surface and what is called a compliant surface between the acrylic and the projection surface. The good news from today's experimenting is that I know it works.
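On the camera side, the whole trick reduces to spotting pixels much brighter than the background. A minimal sketch of that thresholding step, assuming the IR camera frame arrives as a 2D grid of brightness values (real tracking software does blob detection on top of this):

```python
# Minimal sketch of FTIR touch detection: a fingertip frustrating the
# internal reflection shows up as pixels far brighter than the
# background, so a simple threshold finds candidate touch points.

def find_touches(frame, threshold=200):
    """Return (row, col) positions of pixels bright enough to be touches."""
    return [
        (r, c)
        for r, row in enumerate(frame)
        for c, value in enumerate(row)
        if value >= threshold
    ]

frame = [
    [10, 12, 11],
    [13, 250, 14],  # bright spot where a finger touches the acrylic
    [11, 12, 10],
]
```

In practice you would subtract a background frame first and group adjacent bright pixels into blobs, but the principle is exactly this.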

A morning spent murdering Nirvana – Come As You Are.

A couple of months ago a good friend bought me a Stylophone for my birthday. I had a blast snarling out noises that were close to songs we all know and love. Unfortunately, due to my clumsy, inaccurate nature, I never managed to play a whole song at the correct tempo without playing wrong notes. The novelty soon wore off, and the Stylophone was left to gather dust on a shelf in my office.

I have a list of tasks as long as my arm to do at work, but when I got in this morning my motivation levels were at an all-time low. Instead of doing anything useful I decided it was time to put the Stylophone to good use. Knowing that my ability to manipulate the stylus over those circuit board keys was never going to improve, I decided I would cheat and automate the circuit connections that are made when the stylus touches one of the keys.

The video attached shows the result of today's procrastination. So far I have only automated 10 of the keys directly from the Arduino. If I get time in the near future I will extend its functionality using an 8-bit shift register so that the Arduino can play all the keys. I also intend to write a Processing sketch interface so that inputting songs is easier and more intuitive (and not murderous to classic rock songs).
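The song-input side boils down to a lookup: each automated key maps to an Arduino pin, and "playing" a note means closing that key's circuit for the note's duration. The sketch below is a hypothetical model of that mapping; the pin numbers, note layout and riff are all invented.

```python
# Hypothetical model of the automation: each Stylophone key is switched
# from an Arduino pin, so a song is just a list of (pin, beats) events.
# Pin numbers and note layout are made up for illustration.

KEY_PINS = {note: pin for pin, note in enumerate(
    ["A", "A#", "B", "C", "C#", "D", "D#", "E", "F", "F#"], start=2)}

def song_to_pins(song):
    """Translate (note, beats) pairs into (pin, beats) switching events."""
    return [(KEY_PINS[note], beats) for note, beats in song]

riff = song_to_pins([("E", 1), ("E", 1), ("F", 1)])
```

A Processing front end would only need to emit lists like `riff` over serial for the Arduino to step through.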

Written using Swype on HTC feature HD

I have been busy running through ideas for automated instruments I could use to enrich my performances at open mic nights. One of the main points of interest for me is percussion, as it is usually quite overlooked at open mic nights apart from the occasional set of bongos. I have been drawing up sketches for a snare drum played by dropping marbles and also for a cassette player hack. The main hurdle for any automated instrument is how it will be sequenced to play itself. Last night I sat down for a while and coded a very basic sequencer in Processing that controlled an Arduino with Firmata installed. There is nothing fancy about the code, but I believe this will be a good solid starting point for most of the automated instruments I could ever imagine. There are some images below of the basic setup and a video of the sequencer on screen with the Arduino carrying out the sequence using LEDs. I am quite happy to publish the source code on request.
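The logic of a step sequencer like this is tiny: a grid of on/off steps per instrument, and on every tick you fire the pins of whichever rows are switched on at that step. A bare-bones Python model of that idea (the actual sketch was Processing talking to Firmata; grid, pins and instrument names here are hypothetical):

```python
# Bare-bones step-sequencer model: grid rows are instruments, columns
# are steps; each tick returns the Arduino pins to fire. The real
# version lived in Processing and drove the pins over Firmata.

def tick_pins(grid, step, pins):
    """Return the pins to fire at this step of the sequence."""
    return [pins[row] for row, steps in enumerate(grid) if steps[step]]

# Two instruments, four steps each (1 = trigger on that step).
grid = [
    [1, 0, 1, 0],  # e.g. a snare solenoid on pin 8
    [1, 1, 1, 1],  # e.g. a metronome LED on pin 13
]

fired = [tick_pins(grid, s, pins=[8, 13]) for s in range(4)]
```

Swapping an LED for a solenoid or a marble-release servo changes nothing in this loop, which is why it makes a good common starting point.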

This made me smile, so I thought I would post it. One of the projects I am working on at the minute is a snare drum played using marbles dropping from chutes. I had been wondering how to return the marbles to the top of the chute until I saw this video posted online. It's such a simple idea, I can't wait to see if it will work for my project!
