Roughly two years ago there was a big hype around Augmented Reality. The main driving force behind this was the transition of augmented reality from being bound to desktop software to the ARToolkit being ported to Flash and therefore able to run in the browser. I jumped on the bandwagon and wrote some samples in AS3 using the FLARToolkit, which is an AS3 port of the ARToolkit, and the results were very nice. That's pretty much where it ended.
In the last couple of weeks I have had more and more requests for code that involves augmented reality, most notably a project involving a wide open space being mapped as an AR maze. The challenge is to have a persistent 3D virtual maze to walk around. I am not sure if it is even possible with the equipment we have here, but I have begun testing some ideas.
The video above is the first experiment in a long time involving AR. I wanted to check how well Processing would handle the ARToolkit instead of Flash. I used the wrapper class of the NyARToolkit as a base, but I went with the adaptation by cpbotha.net, who has added multiple-marker functionality to the library. The end result is a lot slower than I remember AS3 being, but I am loading an .STL file with 97,000 triangles. I am using the unlekker library to load the .STL file, but I am thinking about switching to the OBJLoader library instead because of its texture support.
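As an aside on why a 97,000-triangle mesh is heavy: you can read the triangle count straight out of a binary .STL without any library, because the format is fixed — an 80-byte header, a 4-byte little-endian unsigned triangle count, then 50 bytes per triangle. This is not the unlekker library's code, just a minimal hand-rolled sketch of the format:

```java
import java.nio.ByteBuffer;
import java.nio.ByteOrder;

public class StlInfo {
    // Binary STL layout: 80-byte header, then a 4-byte little-endian
    // unsigned int holding the triangle count, then 50 bytes per triangle.
    static long triangleCount(byte[] stl) {
        ByteBuffer buf = ByteBuffer.wrap(stl, 80, 4).order(ByteOrder.LITTLE_ENDIAN);
        return buf.getInt() & 0xFFFFFFFFL; // mask to treat the int as unsigned
    }

    public static void main(String[] args) {
        // Build a tiny fake STL in memory: header + count + one 50-byte record
        ByteBuffer fake = ByteBuffer.allocate(80 + 4 + 50).order(ByteOrder.LITTLE_ENDIAN);
        fake.position(80);
        fake.putInt(1); // declare one triangle
        System.out.println(triangleCount(fake.array())); // prints 1
    }
}
```

For the real doneShell.stl this would report the ~97,000 triangles mentioned above, which is what the renderer has to push every frame.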
The .STL was a download from thingiverse.com. I can’t find the name of the person to credit anymore, but the file is called doneShell.stl.
It's been a while since I have used Processing for any of my projects, so today I took some time to have a good look at some of the new libraries available for the framework. I was amazed at how far Processing has come: there are now libraries for everything from sound to reading the accelerometer built into a MacBook. I was very pleased to see that Box2D has been wrapped to integrate simply into Processing, as it is a very powerful 2D physics engine. Here is the link to the libraries section of the Processing site for anybody that is interested.
Try to keep up! That's the statement I have been telling myself a lot recently. The shift to HTML5 and CSS3 has left me scrambling for browser-support comparisons and video-encoding specifications. Web technologies are shifting like tectonic plates: some struggling to find their place while others emerge powerful with claims of modularity and future-proofing. All this change has rekindled my love for web development, and there are a couple of changes that I really want to shout about, so here it goes:
Starting with the most obvious: jQuery.
I have been busy running through ideas for automated instruments I could use to enrich my performances at open mic nights. One of the main points of interest for me is percussion, as it is usually quite overlooked at open mic nights apart from the occasional set of bongos. I have been drawing up sketches for a snare drum played using dropping marbles and also for a cassette-player hack. The main hurdle for any automated instrument is how it will be sequenced to play itself. Last night I sat down for a while and coded a very basic sequencer in Processing that controlled an Arduino with Firmata installed. There is nothing fancy about the code, but I believe this will be a good solid starting point for most of the automated instruments I could ever imagine. There are some images below of the basic setup and a video of the sequencer on the screen and the Arduino carrying out the sequence using LEDs. I am quite happy to publish the source code on request.
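The core of a sequencer like this is nothing more than a boolean grid stepped by a clock. This is not the original source (which is available on request, as noted above), just a minimal sketch of that idea with names of my own choosing; in the real setup each firing track would set a Firmata digital pin HIGH on the Arduino instead of just being returned:

```java
public class StepSequencer {
    // pattern[track][step] == true means: trigger that instrument on that step
    private final boolean[][] pattern;
    private int step = 0;

    StepSequencer(int tracks, int steps) {
        pattern = new boolean[tracks][steps];
    }

    // Flip one cell of the grid on or off (what clicking the screen would do)
    void toggle(int track, int step) {
        pattern[track][step] = !pattern[track][step];
    }

    // Advance one step and report which tracks fire on it
    boolean[] tick() {
        boolean[] fired = new boolean[pattern.length];
        for (int t = 0; t < pattern.length; t++) fired[t] = pattern[t][step];
        step = (step + 1) % pattern[0].length;
        return fired;
    }

    public static void main(String[] args) {
        StepSequencer seq = new StepSequencer(2, 4); // 2 instruments, 4 steps
        seq.toggle(0, 0); // snare on step 0
        seq.toggle(1, 2); // cassette hack on step 2
        for (int i = 0; i < 4; i++) {
            boolean[] fired = seq.tick();
            System.out.println("step " + i + ": " + fired[0] + " " + fired[1]);
        }
    }
}
```

A timer (or Processing's `draw()` loop) calling `tick()` at a fixed interval is all the "clock" an instrument like the marble snare needs.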
I ran through the tutorial I posted a while ago on projectile motion. This post is just a quick example of how to apply that theory in code. I wrote the code for this animation in Processing, which is a Java-based framework for graphic designers; I have touched on Processing in previous posts. The animation was created by letting the code run and saving every frame as a .jpg. Here is the final animation:
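The heart of the animation code is a per-frame update that applies gravity to the vertical velocity and then moves the particle by its velocity. This is a hedged sketch of that step, not my original sketch source; the class and constant names are my own:

```java
public class Projectile {
    double x, y;              // position
    double vx, vy;            // velocity
    static final double G = 9.81;      // gravitational acceleration
    static final double DT = 1.0 / 30; // one frame at 30 frames per second

    Projectile(double speed, double angleDeg) {
        // Split the launch speed into horizontal and vertical components
        vx = speed * Math.cos(Math.toRadians(angleDeg));
        vy = speed * Math.sin(Math.toRadians(angleDeg));
    }

    // Euler step: gravity pulls vy down each frame, position follows velocity
    void update() {
        vy -= G * DT;
        x += vx * DT;
        y += vy * DT;
    }

    public static void main(String[] args) {
        Projectile p = new Projectile(20, 60); // 20 units/s at 60 degrees
        for (int frame = 0; frame < 60; frame++) {
            p.update();
            System.out.printf("frame %d: x=%.2f y=%.2f%n", frame, p.x, p.y);
        }
    }
}
```

In the Processing sketch the equivalent of `update()` runs inside `draw()`, with each frame saved out as a .jpg for the final cut.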
I am no moving-image expert by a long shot. The animation was compiled in Final Cut Pro. I am very pleased with the output from my code: it looks very natural, and the particles all fall taking gravity into account. I am now working on a firework class, so I might post that up soon.
This is a very poetic project and a very beautiful idea. The author and creator of the program decided it would be nice to have a way to convert the beauty of the retina in an eye into music. He is using Processing as the backbone to create OSC messages, which are then picked up by SuperCollider.
I have not looked at SuperCollider yet, but it does look like an excellent piece of software for real-time audio synthesis and algorithmic composition. So watch out for future experiments on my blog using this software.
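In Processing one would normally use an OSC library rather than build packets by hand, but the wire format itself is simple and fixed by the OSC specification: a NUL-terminated address string padded to a multiple of 4 bytes, a type-tag string starting with a comma (padded the same way), then big-endian arguments. A minimal sketch, with names of my own invention, of encoding the kind of one-integer message SuperCollider could receive:

```java
import java.nio.ByteBuffer;

public class OscNote {
    // Pad an OSC string: NUL-terminated, length rounded up to a multiple of 4
    static byte[] pad(String s) {
        int len = (s.length() / 4 + 1) * 4;
        byte[] out = new byte[len]; // trailing bytes stay 0, which is the padding
        System.arraycopy(s.getBytes(), 0, out, 0, s.length());
        return out;
    }

    // Encode an OSC message carrying a single int32, e.g. "/note 60"
    static byte[] message(String address, int value) {
        byte[] addr = pad(address);
        byte[] tags = pad(",i"); // type tag: one int32 argument
        ByteBuffer buf = ByteBuffer.allocate(addr.length + tags.length + 4);
        buf.put(addr).put(tags).putInt(value); // OSC ints are big-endian, ByteBuffer's default
        return buf.array();
    }

    public static void main(String[] args) {
        byte[] packet = OscNote.message("/note", 60);
        System.out.println("packet length: " + packet.length + " bytes");
    }
}
```

Sending the resulting bytes over UDP to SuperCollider's listening port is all that separates this from a playable instrument.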
Here's the video showing the EyeSequencer:
This project was found on Makezine.com.
Makezine found this project on: http://blog.califaudio.com
I have recently been looking for the best way to visualise data and make it come to life, and I came across an amazing piece of software. Processing 1.0 (BETA) has a very simple interface which is so similar to the Arduino interface that it must have something to do with Arduino. As soon as I realised the similarities I felt very at home with Processing. There are so many good examples of how to use the software that I know the learning experience should be smooth and exciting. I really liked this example.
I also found some really nice examples of Processing being used in the design world. Ole Kristensen produced a video entitled ‘Body Navigation’ which has some brilliant examples of how Processing can manipulate what a camera sees and then project a reaction onto any surface. Check out the video HERE.