jamie griffiths – Instrument Lab #1 – Instrument Building Residency Aug-Sep 2011

from Performance I.V.Y. & me

I work with imagery, sound and light, as well as live cameras, video tracking & 3D live drawing in fields of Visual Music, Abstract Cinema and New Media; collaborative & solo performance & installations.

This was my 4th residency at STEIM since 2008. The first was with Alex Nowitz, experimenting with the use of Wii controllers for a video/audio duet; since then I have been refining my video-tracking and software mapping to various MIDI and Bluetooth controllers, including a custom-built wireless video-spotlight called the ‘Videmote’.
In August 2011, I began a new leg of my creative journey by choosing to design a custom wireless instrument for live performance to replace my off-the-shelf controllers. I wanted freestyle control requiring layered physical gestural skills, a greater degree of complexity in software mapping, housed within a first prototype that I could grow with and adapt to include future technologies.

‘Instrumentality’ was the issue I first needed to address during the residency:

Frank Mauceri instrument 'Fluffy'

'manifestation' instrument

  • what is an instrument?
  • why do I want one?
  • what makes a good instrument?
  • what will it look like?
  • how will I hold it?
  • how will I ‘play’ it?

WEEK ONE INSTRUMENT LAB#1  16-23 August 2011
http://www.jamiegriffiths.com/steim-instrument-building-workshop/
In order to understand the problems associated with conceiving, designing and building an instrument, we studied:

  •  The philosophy of instrument building
  •  Hardware-software interaction principles
  •  Practical considerations for live, virtuosic electronic performance
  •  Software: JunXion & LiSa for instrument mapping & audio sampling

We were mentored by key STEIM staff and artistic personnel:

  • Kristina Anderson: STEIM history, events & instrument archives.

    Arduino into JunXion

  • Daniel Schorno:  STEIM Instrument showcase including Crackleboxes and analog synths in the underground ‘bunker’ http://vimeo.com/28155973
  • Takuro Mizuta Lippit: Thinking about new instruments
  • Frank Baldé: JunXion and LiSa software mapping, real-time computer music.
  • Kristina Anderson: Imagining your instrument. Draw a sound and build the instrument that might conceptually be able to make that sound.
  • Frank Baldé:  JunXion into Arduino
  • Marije Baalman: Sense/Stage XBee wireless device.
  • Joel Ryan:  Vizualizing complexity and the Inside-out trombone.
  • LoVid: Workshop & performance with Tali Hinkis and Kyle Lapidus. Building a skin-sensor circuit using a 1/4 inch jack and circuit board (affectionately called a ‘Taco-Jack’).

All of this led to tangible results... I began to draw up a requirements list of my priorities for the new instrument. I knew I wanted acrylic for the prototype form factor and researched local suppliers, but during the Instrument Lab week I had a lucky break. I stumbled across a 1970s folded acrylic vase in an outdoor Sunday market that jumped into my hands. It was perfect... and cost 2 Euros.

pink-instrument with taco jack

‘Fluffy Pink Box’: Presentation & Improvised Performance by Jamie Griffiths, Frank Mauceri (Maine, Ohio) and Sinan Kestelli (Istanbul, Turkey).

I added the new Taco Jack (built just that afternoon) to a pink balloon I found on the street, tucking the circuit board inside the acrylic vase and taping the potentiometer to the top. By wetting the surface of the balloon with saliva I could alter the conductivity between two fingers on the balloon, and hence change the sound coming out of the Taco Jack. On the last night we did a workshop performance, using the skin sensors built during the session with LoVid and some custom mapping of sound and video files put together earlier that day.  http://www.jamiegriffiths.com/fluffy-pink-box-at-steim-instrument-lab-aug-2011/
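For the electrically curious: the skin sensor behaves roughly like a voltage divider, where wetter (more conductive) skin lowers the resistance between the copper contacts and raises the voltage the microcontroller reads. Here is a hedged Python simulation of that idea; the component values and function name are illustrative, not measurements of the actual Taco-Jack circuit.

```python
def divider_reading(r_skin_ohms, r_fixed_ohms=10_000, vcc=5.0):
    """Simulate a 10-bit ADC reading for a skin-resistance voltage divider.

    Lower skin resistance (wetter skin) pulls the output voltage,
    and therefore the reading, higher. Values are illustrative only.
    """
    v_out = vcc * r_fixed_ohms / (r_fixed_ohms + r_skin_ohms)
    return round(v_out / vcc * 1023)

# Wet skin conducts better than dry skin, so it reads higher.
wet = divider_reading(50_000)
dry = divider_reading(500_000)
```

This is why saliva on the balloon changed the sound: the sensor never reports a fixed value, only a continuously shifting resistance.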

Frank Mauceri’s Blog http://www.steim.org/projectblog/?p=3432

Sinan Kestelli’s Blog http://www.steim.org/projectblog/?p=3591

Group Instrument Jam: Aug 23, Last Day of the Workshop http://soundcloud.com/fmauceri/sets/jam-session-at-steim-amsterdam/  Sinan Kestelli (TR) – Sampling and digital processing, Peter Edwards (US) – Electronics, Frank Mauceri (US) – Saxophone and digital processing, Benjamin Bacon (US) – percussion, Jamie Griffiths (CA, UK) – Video and digital processing.

Much of Joel Ryan’s talk had struck deep chords with me.  “CLARITY OF LOGIC DOES NOT PRODUCE GOOD MUSIC”.  Joel pointed out that musical timing is intuitive & is known to be measured in micro-seconds (faster than a neuron can trigger).  Body timing is therefore DIFFUSE involving millions of micro-bodily choices & rapid intuitive responses whereas computer timing is PRECISE/LINEAR. Very different. The design of an instrument works best when this diffuse mode of performance is enfolded into the design, such as with a violin, for instance.  So how can I introduce diffusion to a visual performance? Through complexities in software such as parallel processing streams with marginal randomisation around those numbers? Gestural embodiment in the physical instrument’s form factor? At the quantum level, there are no absolutes. The ‘numbers’ in living systems are constantly randomising around the illusion of fixed numbers.  
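As a thought experiment (not something from my actual patch), the ‘marginal randomisation’ idea can be sketched in a few lines of Python: a nominally fixed control value becomes a stream that drifts around its target, the way a body never repeats a gesture exactly. The function name and spread value here are my own invention.

```python
import random

def diffuse(value, spread=0.02, lo=0.0, hi=1.0):
    """Return value with a small random drift, clamped to [lo, hi].

    Mimics 'diffuse' bodily timing: the nominal number is never
    exactly repeated, but always hovers around its target.
    """
    jittered = value + random.gauss(0.0, spread)
    return max(lo, min(hi, jittered))

# A fixed control value becomes a living stream of near-values.
stream = [diffuse(0.5) for _ in range(5)]
```

Feeding such a stream into several parallel processing chains, each with its own spread, would be one concrete way to keep the ‘machine’ from being static.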

Static = Death
How to enfold gestural complexity into the ‘machine’, keeping this in mind?

WEEK TWO ASSIMILATION: a week of solo reflection, experimentation, programming and design.

I have been using JunXion since 2008, but LiSa was new to me.  I had used live audio sampling with Ableton Live in two of my major works, so I was keen to find out whether LiSa was a compelling alternative.  I experimented with OSC mapping from LiSa through to Isadora (my main software for visuals).  I was intrigued by LiSa's live vocal aspect and its greater complexity for rapid sampling playbacks.

LiSa won’t run on Lion (since Apple has made Rosetta obsolete), but there is a workaround: running a Snow Leopard partition on a Lion machine.  Frank Baldé has also released companion software called LisaX, which can run a LiSa patch on Lion once the patch has been authored and saved on a Snow Leopard machine.

There were some pre-existing performance requirements that the new prototype instrument needed to accommodate.  In addition to controlling live and pre-recorded media, it had to control DMX theatre lighting, including my custom video spotlight, which was at the time mapped to a Wii Nunchuk joystick.  I also contemplated whether to add a wireless camera to the instrument, plus some other ‘optional’ add-ons such as infra-red LEDs to allow it to be seen & tracked in the dark by an IR camera.
That was all I had time for before teaching the Isadora workshop intensive in week 3.

WEEK THREE Four day Isadora Workshop Intensive 

Interactive graphical programming with Isadora software for 13 artists.
Jennifer Kanary, Sjoerd Leijten, Han Halewijn, Julia Mihaly, Sander Trspel (Amsterdam), Per Sacklen (Sweden), Pinar Temiz (Germany), Eva Auster, Nick Blackburn (England), Laura Mahon (Ireland), Chris Duplech (France), Sneja Dobrosavljevic (Belgium), Valeria Marraco (Argentina/Amsterdam)

Isadora Intensive Workshop

It was a fantastic group with many outstanding ideas and projects already in development, including a bicycle-beamer, remote cars, multi-channel installations, psychosis labyrinth, gestural music, projection-mapping with the kinect camera, interactive lighting and more…

WEEK FOUR Artist Residency 5-11 SEPT  2011

naked I.T. (Interactive Tool)

I am lucky to be working with the very talented Marije Baalman http://www.nescivi.nl/ (Sense/Stage hardware & software), with additional support from Frank Baldé (JunXion software), to build the prototype for my new instrument.  An important goal I had identified was to create a live performance instrument on which I could develop a playing style that would carry over from one performance to the next. Acquiring the gestural skills over time would hopefully lead to a sense of virtuosity with the instrument that an audience could detect. The controls would still need to be adaptable for one-off projects, with the ability to quickly alter the mapping as desired.

I set up my performance system in a downstairs STEIM studio in order to test which aspects of performance I most urgently wanted to map to the new instrument.  The plan was to work with Marije’s Sense/Stage PCB hardware along with Python, SuperCollider and Isadora software.
http://sensestage.hexagram.ca/getting-started-with-sense-stage

The Sense/Stage XBee hardware was then embedded inside the empty acrylic instrument. From there it broadcast its data to my laptop over radio (not Bluetooth), thanks to the Sense/Stage firmware installed by Marije and her Python/SuperCollider client code, and from there I unpacked the data via OSC into an Isadora patch.  Once the data was received I could use it to control any aspect of my performance, such as video, sound or lights.
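For anyone curious what the OSC unpacking amounts to at the byte level, here is a minimal Python sketch of a float-only OSC message being built and taken apart, following the OSC rule that strings are null-terminated and padded to 4-byte boundaries. The ‘/minibee/data’ address is purely illustrative; the real Sense/Stage namespace and my Isadora patch handle this for me.

```python
import struct

def _pad(b: bytes) -> bytes:
    """Null-pad to a 4-byte boundary, as the OSC spec requires."""
    return b + b"\x00" * (-len(b) % 4)

def encode_osc_floats(address: str, floats) -> bytes:
    """Build a minimal OSC message whose arguments are all float32."""
    msg = _pad(address.encode("ascii") + b"\x00")
    msg += _pad(("," + "f" * len(floats)).encode("ascii") + b"\x00")
    msg += struct.pack(">%df" % len(floats), *floats)
    return msg

def decode_osc_floats(data: bytes):
    """Recover (address, [floats]) from a float-only OSC message."""
    end = data.index(b"\x00")
    address = data[:end].decode("ascii")
    offset = (end // 4 + 1) * 4                  # skip padded address
    tag_end = data.index(b"\x00", offset)
    n = tag_end - offset - 1                     # type tags minus the ','
    offset += ((tag_end - offset) // 4 + 1) * 4  # skip padded tag string
    return address, list(struct.unpack(">%df" % n, data[offset:offset + 4 * n]))

# Hypothetical sensor packet: three scaled sensor values.
msg = encode_osc_floats("/minibee/data", [0.5, 0.25, 1.0])
addr, vals = decode_osc_floats(msg)
```

In practice the SuperCollider client does this parsing and forwards clean values; the sketch only shows why every OSC message length is a multiple of four bytes.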

I tried adding a long handle adapted from a hollow aluminum crutch, but I abandoned that concept after a couple of days: it was unnecessary and unwieldy. I had thought it might help steady the instrument if I decided to add a camera, but it felt best when held freely in the hands. I experimented with different types of sensors and controls by connecting them to an Arduino Uno, via JunXion software, into Isadora.  Once I found the sensors that I liked, I added them to the instrument, and we connected them to the XBee.  The sensors included:

getting dressed...

  • 8 buttons for the left hand
  • 1 potentiometer/dial on the top left
  • IR proximity sensor for the right hand
  • photo-led sensor for the left hand
  • flexion sensor for the right index finger
  • touch paper on both sides for left and right thumbs

I tried out some piezo mics but rejected them, preferring the touch paper.  I also rejected a position sensor that ought to have allowed a glide/slide/touch control but failed to generate usable data. I may retry this with a different brand later on; perhaps it was just a dud.
I would have liked to add a joystick, but ran out of time. We did experiment with a small track pad, but it reported the absolute rather than relative x,y position when touched, and the plastic was too flimsy, so we opted to put that aside as well.

I had enjoyed using touch paper in the instrument workshop, so I decided to try adding it in a position that would work well with my thumbs.  Here is a video link to a demo of the touch paper connected to an Arduino Uno, mapped to JunXion and Isadora software for control of video and audio: http://vimeo.com/28446091  When you press down on the paper it changes the resistance between the two copper strips. The Arduino converts the electrical signal into digital data in JunXion software, which I then route into Isadora using the MIDI (or OSC) protocol. I could set up the Arduino to communicate directly with Isadora, but since I had already learned how to use JunXion for this in the Instrument workshop, I left it that way to save time.  Isadora controls the video clips, and for the audio I took advantage of the QuickTime Synthesiser inside JunXion, which can be selected as an output.
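The conversion along that chain is conceptually simple: the Arduino's analog inputs read 10-bit values (0–1023), while MIDI continuous controllers are 7-bit (0–127). A hedged Python sketch of the mapping JunXion performs in spirit (not its actual code):

```python
def analog_to_midi(raw, in_lo=0, in_hi=1023):
    """Map a 10-bit Arduino analog reading (0-1023) to a MIDI value (0-127).

    The touch paper acts as a variable resistor: pressing harder
    changes the resistance and shifts the reading, which lands here
    as a controller value. Out-of-range readings are clamped.
    """
    raw = max(in_lo, min(in_hi, raw))
    return (raw - in_lo) * 127 // (in_hi - in_lo)
```

Eight bits of resolution are thrown away in the MIDI path, which is one reason OSC (which carries full floats) can feel smoother for continuous gestures.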

ready for the ball...

Holes were drilled into the acrylic, as needed, for attachment of the tactile controls (sliders, dials, touch screen etc).  Velcro was used for the Sense/Stage XBee itself, although this will be locked down more rigidly once the perfect position for it has been determined; it does, however, still need to be easily removable.  The rechargeable battery was also attached by velcro to the inside of the instrument. The XBee board contains a built-in accelerometer, to which Marije added a gyroscope, plus a board connected to the main device that allowed wires to run to the other sensors, located around the outside of the instrument in suitable positions for my fingers and hands.

The remainder of the week was spent refining the layout of the controls and adjusting the firmware of the Sense/Stage as needed.  I built a custom user actor in Isadora for the instrument’s incoming data, in order to unpack the numbers in a convenient way within the patch.

Immediate next steps beyond the time available during the instrument residency:

  • Scale the controller values to a useful data range for physical gestures
  • Map the data to suitable aspects of my live performance
  • Build a control panel in Isadora that reflects the internal mapping of the data, but that is easy to see while standing a couple of feet away from the laptop.
  • Practice with gestural performance  ie ‘learning’ the subtleties and complexities of the accelerometer and gyroscope data flows triggered by different kinds of physical gestures
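The first two of those steps might look something like this in code. This is a hypothetical helper, not code from my system: the bounds widen as new extremes arrive, so the mapping ‘learns’ the range of a gesture, and a simple exponential moving average tames accelerometer jitter.

```python
class GestureScaler:
    """Rescale a raw sensor stream to 0.0-1.0 and smooth it slightly.

    Illustrative sketch: min/max bounds adapt to the incoming data,
    and an exponential moving average (smoothing in (0, 1]) softens
    jitter; smoothing=1.0 means no smoothing at all.
    """
    def __init__(self, smoothing=0.3):
        self.lo = None
        self.hi = None
        self.smoothing = smoothing
        self.value = 0.0

    def feed(self, raw):
        if self.lo is None:
            self.lo = self.hi = raw
        self.lo = min(self.lo, raw)
        self.hi = max(self.hi, raw)
        span = self.hi - self.lo
        scaled = (raw - self.lo) / span if span else 0.0
        self.value += self.smoothing * (scaled - self.value)
        return self.value
```

Whether adaptive bounds are actually desirable in performance is an open question: a range that keeps widening mid-show changes the feel of the instrument under your hands.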

Wish list for Additions to the Instrument: in order of perceived urgency

  • Track pad (absolute x,y) or… a docking station for an iPod as a Bluetooth device and touchpad
  • 8-button set for the right hand
  • Joystick (relative x,y) to control my DMX video-spotlight, the Videmote
  • Infra-red LEDs to wash the instrument internally with IR light, allowing it to be tracked by a Kinect or other IR camera
  • Wireless mic for vocal or breath triggers
  • Wireless camera: an IR pinhole camera, or an SD camera with a wireless Eye-Fi card

Thoughts on the Form Factor of the Instrument:  
The shape, size and weight of the instrument are all good. I may still revise the shape at a later development stage, but mainly for aesthetic rather than functional reasons.  I do get some comments about my ‘vase’… so a unique form factor would prevent the visual recognition that is perhaps either a bit of a distraction for the audience, or perhaps part of the fun!
Sanding or chemically etching the exterior of the acrylic surface may render it more refractive/opaque for effective use of internal color LEDs or infra red LED lights.

PEOPLE & PLACE

TUTORIALS

SUPPLIERS



About Jamie Griffiths

Interactive visuals and film art for live performance. Interested in consciousness and spiritualities. I perform live films with wireless controllers in opera, orchestra, ensemble, new music, dance and experimental theatre projects. My research and development for new projection technologies and screen designs includes a prototype interactive video-spotlight called the VIDEMOTE, 3D video projections for use in live performance, and Lanbox lighting. I teach workshops to artists on using Isadora software for live performance visual design, interactive lighting and video tracking. I am also an adjunct Professor doing research at the University of British Columbia in Vancouver, Canada, in the Department of Film and Theatre. In addition to Canadian projects, I am currently spending substantial time on projects in the EU.
