About a year ago, I went along to the Code the City Hack at the Informatics Forum in Edinburgh. This was the fourth Code the City event and the theme was the environment. I joined a team working on making rubbish collection more fun and we built an Arduino-based talking rubbish bin. At the end of the weekend the bin had to be dismantled, as we didn’t own any of the components. I thought it might be a good idea to try to rebuild the bin and write up a set of instructions.
The Bin plays sound effects when you drop rubbish into it. The effects can be recorded as .wav files and uploaded to the device over USB. We themed our Bin as a monster that likes eating rubbish, so that’s the theme I’ll be using for this tutorial.
Last week I went up to Tiree Tech Wave, a unique tech conference held on the Hebridean island of Tiree. The event was very relaxed, both in feel and structure. It was a bit like a five-day hack event with a bring-your-own-gadgets policy.
One of the gadgets that was brought along was a Kinect that was compatible with my Mac. I have been wanting to experiment with the Kinect since I attended the Digital Revolution show at the Barbican earlier in the year, but I’d had trouble finding a model that would work with my computer. Luckily Adam van Sertima from TAG had already solved many of the set-up problems and we were able to get it running with only a short wait for all the drivers to download over Tiree’s limited internet connection.
We decided to work on a project exploring how virtual representations of our bodies can suddenly undermine our expectations. The player is rendered as a grid of differently sized dots, using the Kinect depth camera to pull out their outline. They get used to the rules that govern this representation and how it reflects their own movements. Then, after a set time, the dots transform into physics objects which collapse under gravity, hopefully inducing a sense of physical disintegration in the player. I think it worked pretty well as a first step and it’s good fun to play around with.
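The core of the dot-grid idea is just thresholding the depth image and sampling it on a coarse grid. Here is a minimal plain-Java sketch of that logic — the actual project runs in Processing with real Kinect depth frames, and the threshold, cell size, and fake depth data below are all illustrative assumptions:

```java
// Illustrative sketch: turning a depth frame into a grid of "player" dots.
// The real project uses Processing and live Kinect data; here a fake depth
// array stands in so the core thresholding logic can run on its own.
public class DotGrid {
    // Depth readings closer than this (millimetres) count as the player.
    static final int THRESHOLD_MM = 1500; // assumed value
    static final int CELL = 8;            // sample every 8th pixel

    // One boolean per grid cell: true if the sampled depth pixel falls
    // inside the threshold, i.e. is part of the player's outline.
    static boolean[][] playerCells(int[][] depth) {
        int rows = depth.length / CELL;
        int cols = depth[0].length / CELL;
        boolean[][] cells = new boolean[rows][cols];
        for (int r = 0; r < rows; r++) {
            for (int c = 0; c < cols; c++) {
                int d = depth[r * CELL][c * CELL];
                cells[r][c] = d > 0 && d < THRESHOLD_MM; // 0 = no reading
            }
        }
        return cells;
    }

    public static void main(String[] args) {
        // Fake 16x16 depth frame: everything far away except one near pixel.
        int[][] depth = new int[16][16];
        for (int[] row : depth) java.util.Arrays.fill(row, 3000);
        depth[0][0] = 1000; // near pixel -> cell (0,0) becomes a player dot
        boolean[][] cells = playerCells(depth);
        System.out.println(cells[0][0] + " " + cells[1][1]); // true false
    }
}
```

In the running sketch, each `true` cell gets drawn as a dot, and when the timer fires those dots are handed over to the physics engine as free-falling bodies.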
Here is a video of the project running on my laptop. If you fancy having a go at getting it to run on your own Kinect, I have released the source code on GitHub with installation instructions. If you run into trouble with it, you might want to check out Daniel Shiffman’s guide to using the Kinect in Processing.
This is my second project for Creative Programming for Digital Media & Mobile Apps on Coursera. It’s an audio visualiser inspired by An Optical Poem (1938) by Oskar Fischinger. The coursework was to create a visualiser, so I thought it would be interesting to look back at the history of abstract animations set to music. I quickly discovered Fischinger and decided to have a go at making something in his style.
I don’t think I was able to get anywhere near the dynamism of the original but perhaps it captures something of the more relaxed moments of his film. Here’s a link to a video of it running.
I wrote the soundtrack to emphasise high and low frequencies at different parts of the composition, in order to create a more noticeable change in the visualisation.
To generate the graphics, the music is transformed into frequency bands and the power of each band is used to change the sizes of a particular set of particles. Perlin noise is used to set velocities and sine waves are used to parameterise the animations. I ended up tuning the parameters quite heavily, so I’m not sure how well the system will work with other soundtracks.
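The band-to-size mapping is the heart of the visualiser, and it boils down to a linear remap of each band’s power onto a radius range. A small plain-Java sketch of that step, under assumed minimum/maximum radii and with made-up band powers standing in for the FFT output (the `map()` helper mirrors the one Processing provides):

```java
// Illustrative sketch of mapping frequency-band power to particle sizes.
// The real project gets band powers from an FFT in Processing; here they
// are just given numbers, and the radius range (2..40 px) is an assumption.
public class BandSizes {
    // Linearly remap v from [inLo, inHi] to [outLo, outHi],
    // like Processing's built-in map() function.
    static float map(float v, float inLo, float inHi, float outLo, float outHi) {
        return outLo + (outHi - outLo) * (v - inLo) / (inHi - inLo);
    }

    // Each frequency band drives the radius of one set of particles:
    // a louder band means bigger dots, clamped to the chosen range.
    static float[] radiiFromBands(float[] bandPower, float maxPower) {
        float[] radii = new float[bandPower.length];
        for (int i = 0; i < bandPower.length; i++) {
            float p = Math.min(bandPower[i], maxPower); // clamp spikes
            radii[i] = map(p, 0, maxPower, 2.0f, 40.0f);
        }
        return radii;
    }

    public static void main(String[] args) {
        float[] radii = radiiFromBands(new float[]{0f, 50f, 100f}, 100f);
        // silent band -> 2.0, half power -> 21.0, full power -> 40.0
        System.out.println(radii[0] + " " + radii[1] + " " + radii[2]);
    }
}
```

The heavy parameter tuning mentioned above lives in choices like `maxPower` and the radius range, which is exactly why the system may not transfer cleanly to other soundtracks.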
I’m taking the Creative Programming course on Coursera at the moment and thought that I would post my first piece of coursework. It’s based on a demo application called Sonic Painter put together by the course tutors. My variation on it uses Perlin noise to create a kind of organic motion. The code is available on my GitHub page.
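The organic feel comes from sampling smooth noise at slowly increasing offsets rather than using raw random numbers. Processing gives you `noise()` for free; since plain Java doesn’t, the sketch below substitutes a simplified 1D value noise (random values at integer points, smoothly interpolated between) to show the same idea — the seed, canvas size, and step sizes are all illustrative assumptions:

```java
// Illustrative stand-in for Processing's noise(): a simplified 1D value
// noise, used to drift a brush point organically across a canvas.
public class OrganicMotion {
    static final float[] lattice = new float[256];
    static {
        java.util.Random rng = new java.util.Random(42); // fixed seed (assumed)
        for (int i = 0; i < lattice.length; i++) lattice[i] = rng.nextFloat();
    }

    // Smooth noise in [0,1): interpolate between lattice values with
    // a smoothstep easing curve, so nearby inputs give nearby outputs.
    static float noise(float x) {
        int i = (int) Math.floor(x);
        float t = x - i;
        t = t * t * (3 - 2 * t); // smoothstep easing
        float a = lattice[i & 255];
        float b = lattice[(i + 1) & 255];
        return a + t * (b - a);
    }

    public static void main(String[] args) {
        // Drift a point over a 400px canvas (size assumed): sampling the
        // noise at a large offset for y makes the two axes move independently,
        // the same trick the Sonic Painter variation uses with noise().
        for (float t = 0; t < 0.5f; t += 0.1f) {
            float x = 400 * noise(t);
            float y = 400 * noise(t + 100);
            System.out.printf("%.1f %.1f%n", x, y);
        }
    }
}
```

Because consecutive samples are close together on a smooth curve, the point wanders rather than jumps, which is what reads as "organic" on screen.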
I’m really enjoying the course so far and would recommend it. It’s aimed at people with no previous programming skills but if you already know Processing then you can just skip those bits.