Pan::Ik (ongoing motion-tracking experiments)

The project IkPan started during our residency at Trasformatorio in Sicily as a study in combining multimedia and performing art. We are three artists coming from different fields: Federica Dauri (performer/dancer), Alberto Novello (Composer/Digital Artist) and Antony Raijekoff (Composer/Digital Artist/Theater maker). We are interested in developing a flexible system to better combine sound and movement in a structured improvisational performance. Our main focus is to extend the creative and performative possibilities rather than merely experiment with technology.

We used several wireless sensors (newly developed at STEIM, Amsterdam) or 3D image cameras, which allow us to track movements over relatively large distances. The bodily information drives the sound design and the visuals, so that the performer is in charge of generating the sounds and visuals through her movements. The aesthetic imposed by the developed technology revolves around the concept of synchronicity of machine-human // action-reaction. Mostly the performer decides when to trigger sound events; sometimes, however, a machine error triggers unexpected sounds that influence the performer's bodily reaction. This aspect adds a random creative element that enriches the performative discourse. The performer's actions were inspired by giving/receiving sound events, and she had autonomous control over the rhythm and length of the composition. In these first studies, we adapted our work to a few selected spaces of the small Sicilian village in which we were residing, and we used sounds recorded from relevant aspects of rural Sicilian culture to add site-specificity to the improvisations.

At STEIM, during a 3-week residency, we experimented with different resources available in the STEIM facilities (quadraphonic sound and beamers for body projections). The main purpose of the residency was to develop a set of tools for future performances, as our goal is to collect site-specific material and create a different performance in each space.
We followed different dramaturgical approaches and mapping strategies to see how body language would change: describing the stage as a 3D space in which the coordinates of the performer's position affect the sound, using accents in limb movements as triggers for loops/short sounds, connecting pitch with the height of the performer or reverb with the distance between the legs; we even tried more complex mappings that consider the amount of movement and count accents in a machine-learning fashion. It is difficult to find the right balance between a too-obvious mapping, in which the connection becomes boring for the audience and traps the performer's expressivity, and a too-complex mapping, which hides the real control that is one of the goals of motion tracking.
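As a rough illustration of the simpler, direct mappings listed above, here is a minimal Python sketch. The function names, value ranges and normalisation are hypothetical placeholders; the actual mappings were built with the tools mentioned in this report.

```python
# Minimal sketch of two direct mappings described above (hypothetical names
# and ranges). Tracker coordinates are assumed normalised to 0..1.

def height_to_pitch(head_y, low_hz=110.0, high_hz=880.0):
    """Map the performer's head height to an oscillator pitch in Hz."""
    head_y = min(max(head_y, 0.0), 1.0)          # clamp to the tracked range
    return low_hz + (high_hz - low_hz) * head_y  # linear interpolation

def leg_distance_to_reverb(left_foot_x, right_foot_x, max_spread=0.8):
    """Map the horizontal distance between the feet to a reverb mix (0..1)."""
    spread = abs(left_foot_x - right_foot_x)
    return min(spread / max_spread, 1.0)

# Example frame from the tracker (made-up values):
print(height_to_pitch(0.6))               # ~572 Hz
print(leg_distance_to_reverb(0.2, 0.75))  # ~0.69 reverb mix
```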
Another problem was the noisiness of the system. We tried different filters to reduce the internal jitter of the accelerometers (Kalman, lattice) with some success. We also experimented with body projections, trying to achieve a 3D body-shape tracker across the whole stage with some trigonometric calculations. We found some interesting results that need to be developed further. In particular, the 3D body mapping still needs work and remains a complex goal to reach. Especially tricky is the calibration of the system, but once the geometric characteristics are fixed (the distance between camera and beamer with respect to the stage size), we can set up in a few hours.
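For those curious about the filtering step, below is a minimal sketch of a one-dimensional Kalman filter of the kind mentioned above; the process and measurement variances are illustrative placeholders, not the values used in our patch.

```python
# One-dimensional Kalman filter for smoothing a jittery accelerometer axis
# (constant-state model; the variances below are illustrative only).

class Kalman1D:
    def __init__(self, process_var=1e-3, meas_var=1e-1):
        self.q = process_var   # how quickly we expect the true signal to drift
        self.r = meas_var      # how noisy we believe the sensor readings are
        self.x = 0.0           # current estimate
        self.p = 1.0           # estimate uncertainty

    def update(self, z):
        # Predict: state assumed constant, uncertainty grows by q.
        self.p += self.q
        # Correct: blend prediction and measurement by the Kalman gain.
        k = self.p / (self.p + self.r)
        self.x += k * (z - self.x)
        self.p *= (1.0 - k)
        return self.x

f = Kalman1D()
noisy = [0.1, 0.9, 0.2, 0.8, 0.5, 0.4]   # jittery accelerometer samples
smoothed = [f.update(z) for z in noisy]
print(smoothed)
```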

FUTURE PLANS
We intend to use the experience acquired during this artistic residency to present, in the future at STEIM, a workshop on VVVV and sound manipulation and synthesis through motion tracking, as well as a performance demonstrating our artistic direction in the field of motion tracking and mapping strategies. It could be a very stimulating occasion for an open discussion about the issues and solutions connected to different motion-tracking approaches.
