The video below is documentation for a live video compositor project developed in collaboration with James Moore. I was the sole AS3 programmer on this project. The AIR app takes advantage of FFMPEG to break videos uploaded to the system down into frames, and eventually to render them back out (the render-out feature is still in development). FFMPEG is accessed via the NativeProcess functionality of the AIR runtime environment. The overall installation was set up using a Matrox TripleHead driving three projectors. The project is a lot bigger than it looks in the video.
I really enjoyed the collaboration with James. It was nice to take a back seat in the decision making and just concentrate on how the application would be coded.
The left projection functions as a media viewer, allowing the user to browse through files in the assets folder. The centre screen allows the user to edit the content before it is played back. The edit choices are stylised and restrictive to fit in with the overall aesthetic of the system, which is based around a minimalist grid. The right-side projection shows a playback of the video being edited, with the changes appearing in real time.
Roughly two years ago there was big hype around augmented reality. The main driving force behind this was the transition from augmented reality being bound to desktop software to the ARToolkit being ported to Flash and therefore being able to run in the browser. I jumped on the bandwagon and wrote some samples in AS3 using the FLARToolkit, which is an AS3 port of the ARToolkit, and the results were very nice. That's pretty much where it ended.
In the last couple of weeks I have had more and more requests for code that involves augmented reality, most notably a project involving a wide-open space being mapped as an AR maze. The challenge is to have a persistent 3D virtual maze to walk around. I am not sure if it is even possible with the equipment we have here, but I have begun testing some ideas.
The video above is the first experiment in a long time involving AR. I wanted to check how well Processing would handle the ARToolkit instead of Flash. I used the wrapper class of the NyARToolkit as a base, but ended up with the adaptation by cpbotha.net, who has added multiple-marker functionality to the library. The end result is a lot slower than I remember AS3 being, but I am loading an .STL file with 97,000 triangles. I am using the unlekker library to load the .STL file, though I am thinking about switching to the OBJLoader library because of its texture support.
The .STL was downloaded from thingiverse.com. I can’t find the name of the person to credit anymore, but the file is called doneShell.stl.
Recent experiments with the AIR runtime environment, NativeProcess and FFMPEG got me thinking about digital video. The digital strands of narrative intertwine in a confusion of noughts and ones. Auditory vibes harmonise with the luminosity of performing pixels, conducted by semiconductors fluent in machine code. FFMPEG is a decoder/encoder for digital content capable of converting one video format to another, separating audio from video, breaking video up into frames as JPEGs and so much more. One of the most basic features, and the one that interests me the most, is -s, the size parameter, which FFMPEG uses to scale the video as it is being converted.

As a person of great ability in the art of procrastination, instead of the task at hand I began contemplating the consequence of encoding a video into a containing dimension of 1×1 pixels. After some experimentation I disproved my first naive/romantic hypothesis of what this 1×1 video might produce. Without considering the repercussions in depth, I thought the scaling might produce a colour narrative: a timeline of mood brought forth by hue, saturation and brightness, presented by a single pixel against an axis of time. In reality, FFMPEG would only resize video to dimensions that are multiples of 2, so next I tried a 2×2 pixel square. The notion of a colour narrative was still far out of reach: once encoding of the 2×2 video was complete, the playback was a greyscale blur, and definitely not a consequence of the colours within the video.

I decided to try the process one more time, at 4×4 pixels, so that more of the colour detail was kept. I was extremely pleased with the result: at 4×4 the mood of the video was very apparent, though the detail had definitely been dissolved. As an extra bonus, the audio had been preserved to complement the mood of the frames.
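For reference, the whole experiment hinges on that single size flag. Here is a minimal Python sketch of the command being run (the filenames are placeholders, and ffmpeg is assumed to be on the PATH; the AIR app does the equivalent through NativeProcess):

```python
import subprocess

def downscale_cmd(src, dst, size):
    """Build the FFMPEG argv that re-encodes a video at size x size
    pixels; audio is carried across by the default encoder settings."""
    return ["ffmpeg", "-i", src, "-s", f"{size}x{size}", dst]

# The 4x4 "mood" version described above:
cmd = downscale_cmd("input.mp4", "mood4x4.mp4", 4)
# subprocess.run(cmd, check=True)  # uncomment to actually run ffmpeg
```

Scaling the 4×4 result back up for viewing is the same call with a larger size on the tiny file.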
I intend to follow this up with some experimental data visualisations of pixel colour over time, very similar to this example by Brendan Dawes, but for now see below for the result of the 4×4 square scaled back up to 320×240.
Recently, I have been really enjoying AS3 because of its versatility, but the projects I have been working on are growing quite complex and hard to manage. The main issue I have been having is remembering what each class does. I have been very careful to comment code, but sometimes it would be nice just to have a document that describes all the variables, methods and so on. Today I started looking at automated systems for creating documentation from AS3 project code, and I came across NaturalDocs. It's a very simple to use command-line tool which goes through all my project code, reading the comments and structure. NaturalDocs then creates a full-blown website documenting the code.
A couple of weeks ago I wrote an application in AS3 that used FFMPEG for video manipulation. The application utilised the NativeProcess class from Adobe AIR to execute native processes via the command line, a technique that really adds another level to the versatility of AS3. Since the video app I have been itching to use the technique again, so I decided to make a very simple GUI for NaturalDocs. The picture above is a screenshot of an AIR app with NaturalDocs embedded in it. The app is very simple: all the user has to do is click the top box and locate the project code, then click the second box and define a destination folder for the docs. Once “create docs” is clicked, NaturalDocs goes ahead and creates the documentation for the code. It's very simple, but hopefully it will be quite useful. I was very impressed with the work that Greg Valure has done on NaturalDocs; it's a very intuitive and efficient system to use.
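Under the hood the GUI just shells out to the NaturalDocs command line. A rough Python equivalent of the call it makes (the directory names are placeholders, and the NaturalDocs script is assumed to be on the PATH; check the NaturalDocs docs for the exact flags your version expects):

```python
import subprocess

def naturaldocs_cmd(src_dir, out_dir, project_dir):
    """Build the argv for a NaturalDocs run: -i points at the source,
    -o names an output format plus destination, and -p is a working
    directory where NaturalDocs keeps its own configuration files."""
    return ["NaturalDocs",
            "-i", src_dir,
            "-o", "HTML", out_dir,
            "-p", project_dir]

cmd = naturaldocs_cmd("src", "docs", "nd-project")
# subprocess.run(cmd, check=True)  # requires NaturalDocs to be installed
```

In the AIR app the same argv is handed to NativeProcess instead of subprocess.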
Some of the AS3 resources I have been working on are becoming very useful. I can’t wait to start documenting some of the more universal classes I have written and then releasing them into the great wild west that is the internet.
A screen shot of the documentation that is generated:
IMPORTANT: None of the work done on NaturalDocs is mine. I am only responsible for the GUI used to shortcut the functionality of NaturalDocs.
I will upload the app and post a link to download it soon. I don’t think this would break the GPL licence that the NaturalDocs project is distributed under.
Try to keep up! That's the statement I have been telling myself a lot recently. The shift to HTML5 and CSS3 has left me scrambling for browser-support comparisons and video-encoding specifications. Web technologies, like tectonic plates, are shifting: some struggling to find their place while others emerge powerful, with claims of modularity and future-proofing. All this change has rekindled my love for web development, and there are a couple of changes that I really want to shout about, so here goes:
Starting with the most obvious: jQuery
Above is one of my favourite examples of augmented reality I have seen so far. It was coded by Saqoosha, to whom I am very thankful, as he was one of my main sources of information and documentation for the FLARToolkit.
I have spent the last two days bashing about with the FLARToolkit for AS3. It's very addictive, and there are some very good examples online. In the end I produced some working examples by mixing the examples given by Mikko Haapoja and saqoosha.net into one AS3 file. I can’t wait to see where these experiments will take me, and I will hopefully post them up here soon.
One of the nicest data visualisation projects I have come across in my research so far is Well-Formed.Eigenfactor.org, designed by Moritz Stefaner using the AS3 framework Flare. Moritz has tapped into the massive database created by Eigenfactor.org and produced some very beautiful and intuitive Flash applications. The Flare toolkit is comprehensive, versatile and capable of producing some very nice visuals; I can’t wait to start playing with it.
Recently I have spent a considerable amount of time researching data visualization techniques using Flash and AS3. The main driving force for this research was to inspire first year Digital Media students without allowing them to be intimidated by AS3. To make the process of visualizing data in AS3 less daunting I have started writing a toolkit of classes the students can use to bypass some of the fundamental and structural elements of visualizing data. This way the students can concentrate on the creative and experimental aspects of the visualizations and achieve results at a much faster rate.
Here is one of the classes I have completed that I believe to be quite useful. The purpose of the class is to convert time into units along an axis. The class takes the start time and end time for a period of data collection and maps that period across either the x or y axis. The class has a getPoint method to which you can pass any time within the data-collection period, and it will return the point in pixels along the specified axis.
Once an instance of timeToAxis.as has been defined, only two lines are required to start utilising the class. For example:
converter.setAxis("12:00:00", // the start time
                  "12:05:00", // the end time
                  "x");       // the axis to apply the units to
trace(converter.getPoint("12:02:00")); // get a value along the defined axis
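For anyone curious about the maths, getPoint is just a linear interpolation from the time window onto the axis length. A rough Python equivalent (the function names here are mine, not the class's API, and the 400-pixel axis length is an assumed example value):

```python
def time_to_seconds(hhmmss):
    """Convert an "HH:MM:SS" string to a count of seconds."""
    h, m, s = (int(part) for part in hhmmss.split(":"))
    return h * 3600 + m * 60 + s

def get_point(start, end, t, axis_length):
    """Linearly map a clock time within [start, end] to a pixel
    position along an axis of the given length in pixels."""
    t0 = time_to_seconds(start)
    t1 = time_to_seconds(end)
    return (time_to_seconds(t) - t0) * axis_length / (t1 - t0)

# 12:02 is 2/5 of the way through a five-minute window:
print(get_point("12:00:00", "12:05:00", "12:02:00", 400))  # → 160.0
```

The AS3 class does the same mapping, with the axis length taken from the stage dimension chosen in setAxis.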
Anybody is welcome to use this class; I just hope other people find it as useful as I do. I would also love to hear about any improvements that could be made to the class or to how it has been implemented.
DOWNLOAD class and examples