Icarus (with Daniel Jones) – navigable 2D musical surfaces

During Icarus’ last residency, in February 2011, we created the underlying software for our parametric album Fake Fish Distribution (FFD). Using the FFD software we were able to create alternative trajectories through a kind of operable musical structure. Ultimately we mapped our various trajectories to a single parameter, the version index, resulting in an album that varied smoothly across a fixed set of versions. Each version defined an arrangement of the track, with different parameters being arranged over time in different ways.

Although FFD explored a variety of ways to map the version index onto the structure of a track, one of the easiest to work with involved simply plotting the phases of an arrangement in 1D or 2D space. 1D arrangements are basically timelines, read from left to right. Our 1D trajectories could simply scrub back and forth through the musical structure, producing linear, partially palindromic arrangements. 2D spaces offer significantly more variety: you can trace a line back and forth as before, but you can also bypass structures, and travel in loops that drift, take short-cuts and bifurcate.

The idea of laying out musical compositions on 2D surfaces in order to realise them as pathways is not new, but we realised that it remains largely unexplored in the obvious application domain of the 2D multitouch surface (of course things get even more interesting in 3D, but that will have to wait). So in this project we set about creating a live performance tool related to the FFD album concept, but with a whole load of interest and potential of its own.
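To give a feel for the simplest of these mappings, here is a minimal sketch of the 1D case: arrangement phases plotted at positions along a line, with a trajectory parameter scrubbing across them and interpolating their musical parameters. All names and values here are hypothetical illustrations, not the FFD code.

```java
// Sketch: a 1D arrangement as phases on a line, scrubbed by a trajectory.
import java.util.Arrays;

public class ScrubDemo {

    /** A phase in the arrangement: a position on the line plus parameter values. */
    static class Phase {
        final double position;   // where this phase sits, in [0, 1]
        final double[] params;   // e.g. layer gains, filter settings, ...

        Phase(double position, double... params) {
            this.position = position;
            this.params = params;
        }
    }

    /** Linearly interpolate parameters at trajectory position t in [0, 1]. */
    static double[] paramsAt(Phase[] phases, double t) {
        // Assumes phases are sorted by position.
        if (t <= phases[0].position) return phases[0].params.clone();
        for (int i = 1; i < phases.length; i++) {
            if (t <= phases[i].position) {
                Phase a = phases[i - 1], b = phases[i];
                double mix = (t - a.position) / (b.position - a.position);
                double[] out = new double[a.params.length];
                for (int j = 0; j < out.length; j++)
                    out[j] = a.params[j] + mix * (b.params[j] - a.params[j]);
                return out;
            }
        }
        return phases[phases.length - 1].params.clone();
    }

    public static void main(String[] args) {
        Phase[] phases = {
            new Phase(0.0, 0.2, 0.0),   // intro: quiet, no percussion
            new Phase(0.5, 0.9, 1.0),   // peak: loud, full percussion
            new Phase(1.0, 0.1, 0.3),   // outro
        };
        // Scrubbing out and back again yields a palindromic arrangement.
        for (double t : new double[] {0.0, 0.25, 0.5, 0.25, 0.0})
            System.out.println(t + " -> " + Arrays.toString(paramsAt(phases, t)));
    }
}
```

Scrubbing the trajectory out to 0.5 and back again produces exactly the kind of partially palindromic arrangement described above; a 2D version would interpolate over a surface rather than a line.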

We created an experimental Android app (with an iOS port to come), way too experimental to post here. It can load any structure consisting of non-overlapping polygons; these act as multitouch buttons which you can press, or drag across to produce trajectories. The surface generates OSC messages in response to touch control, so it can be used to control any musical system. Surfaces can be zoomed to any level of detail and played with multiple fingers, or by automated agents that navigate the space according to simple procedures (so far we have explored loopers, flocks and simple exploratory agents). With the framework for the system in place, we experimented with a number of ways to procedurally generate interesting surface structures and map them to procedurally generated musical structures.
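For the curious, here is a self-contained sketch of that touch-to-OSC flow: a ray-casting point-in-polygon hit test, and an OSC message hand-encoded and sent over UDP. The /surface/touch address, the argument layout and the target port are our own illustrative assumptions, not the app's actual protocol.

```java
// Sketch: hit-test a touch against a polygon "button" and emit an OSC message.
import java.net.DatagramPacket;
import java.net.DatagramSocket;
import java.net.InetAddress;
import java.nio.ByteBuffer;

public class SurfaceOsc {

    /** Ray-casting point-in-polygon test; xs/ys hold the vertex coordinates. */
    static boolean contains(double[] xs, double[] ys, double px, double py) {
        boolean inside = false;
        for (int i = 0, j = xs.length - 1; i < xs.length; j = i++) {
            if ((ys[i] > py) != (ys[j] > py)
                    && px < (xs[j] - xs[i]) * (py - ys[i]) / (ys[j] - ys[i]) + xs[i])
                inside = !inside;
        }
        return inside;
    }

    /** Encode a minimal OSC message: address, then int32 id and two float32 args. */
    static byte[] oscMessage(String address, int id, float x, float y) {
        ByteBuffer buf = ByteBuffer.allocate(256);  // big-endian by default, as OSC requires
        putPaddedString(buf, address);
        putPaddedString(buf, ",iff");               // OSC type tags: int32, float32, float32
        buf.putInt(id).putFloat(x).putFloat(y);
        byte[] out = new byte[buf.position()];
        buf.rewind();
        buf.get(out);
        return out;
    }

    /** OSC strings are null-terminated and padded to a multiple of 4 bytes. */
    static void putPaddedString(ByteBuffer buf, String s) {
        buf.put(s.getBytes(java.nio.charset.StandardCharsets.US_ASCII));
        int pad = 4 - (s.length() % 4);
        for (int i = 0; i < pad; i++) buf.put((byte) 0);
    }

    public static void main(String[] args) throws Exception {
        // One triangular "button" on the surface (hypothetical coordinates in [0, 1]).
        double[] xs = {0.1, 0.9, 0.5}, ys = {0.1, 0.1, 0.8};

        double touchX = 0.5, touchY = 0.4;  // a finger lands here
        if (contains(xs, ys, touchX, touchY)) {
            byte[] msg = oscMessage("/surface/touch", 0, (float) touchX, (float) touchY);
            DatagramSocket socket = new DatagramSocket();
            socket.send(new DatagramPacket(msg, msg.length,
                    InetAddress.getByName("127.0.0.1"), 57120));
            socket.close();
        }
    }
}
```

The receiving end can be anything that speaks OSC; 57120 happens to be SuperCollider's default port. An automated agent would drive the same code path, simply substituting its own computed position for the finger's.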


The work reached a rough but functional prototype, sufficient for an experimental try-out at our live performance for Sonic Acts, on February 25th at Paradiso. Unfortunately the performance plan was derailed: our only Android tablet (plus one of our laptops) went missing from the venue just a couple of hours before the show.

The project is under continuing development, and we aim to co-release an app with STEIM later in the year.
