Tim Thompson > InstrumentLab #3

It may sound crazy to be spending a sunny Easter Sunday programming and experimenting in the bowels of STEIM, but that is what I came here to do.  In the first few days of the Instrument Lab, we heard a variety of presentations, all of them thought-provoking.  The best part of the week has been interacting with and seeing the work of the other Lab participants – their instruments and ideas cover a wide spectrum.  I also gave a presentation about my work, which people seemed to enjoy.

I originally thought I wouldn’t be able to bring my Space Palette instrument because of its size, but I was able to cut it into four pieces and bring it along.  It was designed as a “casual instrument” for people to enjoy at festivals (like Burning Man), but over the last few months I’ve been experimenting with different arrangements of the controls and features of the sound generation to make it more suitable for performances.  It’s always useful to have deadlines (such as the Tuesday night performance here) to force myself to make progress.  I doubt I will ever find the perfect arrangement of controls, but I’ve been able to experiment with several different approaches while I’ve been here.

When describing my work in one of the presentations, I mentioned that one of my ideas was to use the outline of the hands (or body, or any other physical object you stick in front of the Kinect) directly as a waveform.  It’s a relatively simple idea that seems to have a lot of potential for creating and evolving sound in a direct and intimate way, and I was happy to hear Kristina Andersen echo my enthusiasm for it.  I haven’t been able to implement it completely yet, but I made some progress toward that goal – TUIO 2.0 specifies an OSC message that conveys the outline of an object, and I implemented that in the MultiMultiTouchTouch software that forms the basis of the Space Palette.  It isn’t clear to me what approach I should use for converting the outline to audio in realtime, but SuperCollider is a very strong possibility, and this finally gives me a good excuse to learn that language.
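To make the idea concrete, here is a minimal SuperCollider sketch of one possible approach.  It assumes the outline has already been reduced to a flat list of normalized values arriving on a plain OSC address (/palette/outline is a made-up address for the sketch, not the actual TUIO 2.0 message, which is bundled and more involved), and it simply treats those values as a single-cycle wavetable:

    // Minimal sketch, not the real TUIO 2.0 handler: receive a flat list of
    // normalized outline values over OSC and scan them as a wavetable.
    (
    s.waitForBoot {
        // Osc.ar needs a wavetable-format buffer; 512 samples become 1024 frames.
        ~buf = Buffer.alloc(s, 1024);

        OSCdef(\outline, { |msg|
            var points = msg.drop(1);          // drop the OSC address itself
            var cycle = points.resamp1(512);   // resample the contour to a fixed length
            ~buf.loadCollection(Signal.newFrom(cycle).asWavetable);
        }, '/palette/outline');                // hypothetical address, not the TUIO message name

        SynthDef(\outlineOsc, { |freq = 110, amp = 0.2|
            // Scan the contour buffer as an oscillator waveform.
            Out.ar(0, Osc.ar(~buf, freq, 0, amp) ! 2);
        }).add;
    };
    )

    // Then: x = Synth(\outlineOsc); and change pitch with x.set(\freq, 220);

Moving a hand in front of the Kinect would then continuously rewrite the wavetable while the oscillator keeps scanning it, which is exactly the kind of direct, intimate sound evolution I have in mind.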

One of the Lab participants (Iris van der Ende) has a harp that can trigger sounds created from stars.  She brought some short star-related video clips (for example, time-lapses of the night-time sky), which I was able to convert and play from within Resolume.  Soon we’re going to try triggering these video clips using the MIDI output of her harp.
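Resolume can map MIDI notes to clips directly, but if we end up wanting more control over the mapping, a small bridge that turns harp note-ons into Resolume’s OSC clip triggers is another option.  Here’s a rough SuperCollider sketch, where the port, the note-to-clip mapping, and the OSC address are all assumptions to be checked against the Resolume version in use:

    // Hypothetical MIDI-to-OSC bridge: harp note-ons trigger Resolume clips.
    (
    MIDIClient.init;
    MIDIIn.connectAll;
    ~resolume = NetAddr("127.0.0.1", 7000);    // port is a guess; check Resolume's OSC preferences

    MIDIdef.noteOn(\harpTrigger, { |vel, note|
        var clip = (note % 4) + 1;             // fold the harp's range onto four clips (arbitrary)
        // The clip-trigger address varies by Resolume version; verify it before relying on it.
        ~resolume.sendMsg("/layer1/clip%/connect".format(clip), 1);
    });
    )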

…Tim…
