Adinda van 't Klooster > EMOTION LIGHTS with Ken Brown, Marc Boon and others

Emotion Light, work in progress, (c) Adinda van 't Klooster, 2009

Project team:

Artist: Adinda van 't Klooster (concept, sculptural design, porcelain model, electro-acoustic composition and project management)

Hardware design and advice: Ben Knapp, Ken Brown, Marc Boon, Bob Young, Belvin Ho

Software: Vincent Akkermans, Ken Brown, Miguel Angel Ortiz-Perez & Nick Ward

Rapid prototyping advice: Alan Stafford, AMAP, University of Sunderland

3D modelling: Neil Milburn, Iain Barrett, Dave Knapton, AMAP, University of Sunderland

The residency at STEIM was made possible by a small grant for the arts from Arts Council England.

STEIM RESIDENCY 7-22 April 2009

Adinda van 't Klooster is an international artist and currently an AHRC-funded PhD researcher at CRUMB, University of Sunderland. She creates responsive artworks using sensors, light and sound. Her contextual research focuses in particular on artworks that use biofeedback. http://www.axisweb.org/seCVPG.aspx?ARTISTID=8405

The aim of the residency at STEIM was to further develop the first prototype ‘Emotion Light’: a sculptural light which uses biofeedback technology to visualize the holder’s physiological state in changing light patterns emerging from a portable, wireless sculpture.

To keep the experience non-invasive, we chose to track physiological data that could be captured from touch, like GSR (galvanic skin response) and heart rate, which both change continuously depending on physical effort and emotional state.

We combined this system with a 3-axis accelerometer to detect movement. Code on an Arduino microcontroller was then used to translate the biosignals into changing light patterns.
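
As an illustration of this first stage, the minimal Arduino-style sketch below reads a GSR value on an analog input and maps it straight to LED brightness. The pin assignments, the voltage-divider front end and the smoothing factor are assumptions made for the sake of example, not the actual Emotion Light firmware.

// Minimal sketch: read the GSR electrodes through a voltage divider on A0
// and map the (smoothed) reading to the brightness of one LED channel.
// Pin numbers and scaling are illustrative assumptions.

const int GSR_PIN = A0;   // GSR electrodes via voltage divider (assumed)
const int LED_PIN = 9;    // PWM pin driving one LED channel (assumed)

float smoothed = 0.0;     // exponentially smoothed GSR reading

void setup() {
  pinMode(LED_PIN, OUTPUT);
  Serial.begin(9600);
}

void loop() {
  int raw = analogRead(GSR_PIN);            // 0..1023
  smoothed = 0.95 * smoothed + 0.05 * raw;  // simple low-pass filter
  int brightness = map((long)smoothed, 0, 1023, 0, 255);
  analogWrite(LED_PIN, brightness);         // higher skin conductance -> brighter light
  Serial.println(smoothed);                 // log for inspection on the laptop
  delay(20);                                // roughly 50 Hz update rate
}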

Work also began at STEIM on a musical sound sequence designed to evoke emotions in the viewer. The final artwork will allow viewers to see how they respond to the sound, as this will be reflected in the changing behaviour of the light.

This project aimed to create interesting art by working in an interdisciplinary way with people from computing, psychology, neuroscience, electronics, the arts, and music.

Rather than working with high-end medical sensors, which come with their own software and physical restrictions such as visible leads, we wanted to create a portable system that would be adaptable to creative contexts and relatively cheap. We investigated custom-built sensors that could easily be inserted into or merged with sculptural shapes.

The shape of the sculpture has gone through different phases, but for this version I chose a uterus-like shape which reminds one simultaneously of a ram's head, fallopian tubes and spermatoids. The shape was cast in porcelain and glazed in parts with a gold lustre to create a conductive surface functioning as the touch part of the sensor. This allowed technology and art to merge seamlessly. This part was developed prior to the residency at the National Glass Centre, part of the University of Sunderland. A rapid prototype of this shape is also being developed with AMAP in Sunderland.

At STEIM, I worked with Vincent Akkermans to further develop the software side of this project. Marc Boon also worked with us to advise us on any hardware issues we encountered.

I brought most of the hardware for this system with me from the University of Sunderland, where Ken Brown, Bob Hogg and Belvin Ho had advised me on how to build an LED driver system.

During the STEIM residency, we further adapted some of the hardware. For the GSR sensor, we used an electrical diagram provided by Ben Knapp and made the touch parts of the sensor by putting a conductive glaze on a porcelain sculpture and wiring this up.

To obtain heart rate, we initially worked with a sensor that could capture heart rate from touch, based on a circuit diagram published by Elektor magazine (http://www.elektor.com/magazines/2006/october/ecg-using-a-sound-card.58566.lynkx). For the touch part of the sensor we used a conductive glaze. However, thorough testing of this circuit made it clear that this sensor was too unreliable and gave too noisy a signal to be useful for our purposes.

We then explored the possibility of working with a pulse plethysmograph. We used the diagram on the following website (http://www.picotech.com/experiments/calculating_heart_rate/index.html) and polymorph to create the sensor, which proved quite reliable, and we used it throughout the residency. The downside of this sensor is that it also picks up part of the LED signal as its input, which is problematic. Also, any movement of the finger can be misinterpreted by the computer as a heartbeat, so further research into which sensor could be used to obtain heart rate from touch is continuing.
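
To make the heart-rate step concrete, the sketch below shows one common way of turning a plethysmograph signal into beats per minute: a rising-edge threshold with a refractory period, which also suppresses some of the movement artefacts mentioned above. The pin number, threshold and refractory time are illustrative assumptions, not the values used in the prototype.

// Sketch of simple threshold-based beat detection for the plethysmograph.
// PULSE_PIN, THRESHOLD and REFRACTORY are illustrative assumptions.

const int PULSE_PIN = A1;              // plethysmograph output (assumed)
const int THRESHOLD = 550;             // rising-edge threshold on the 0..1023 reading
const unsigned long REFRACTORY = 300;  // ms between beats, i.e. at most ~200 bpm

bool armed = true;                     // ready to detect the next rising edge
unsigned long lastBeat = 0;

void setup() {
  Serial.begin(9600);
}

void loop() {
  int sample = analogRead(PULSE_PIN);
  unsigned long now = millis();

  if (armed && sample > THRESHOLD && (now - lastBeat) > REFRACTORY) {
    unsigned long interval = now - lastBeat;   // time since the previous beat
    lastBeat = now;
    armed = false;
    if (interval < 2000) {                     // ignore the unrealistic first interval
      Serial.print("BPM: ");
      Serial.println(60000.0 / interval);
    }
  } else if (!armed && sample < THRESHOLD - 20) {
    armed = true;                              // small hysteresis before re-arming
  }
  delay(5);
}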

In terms of programming, the first methodology at STEIM was to build a database to train a classifier that would be able to distinguish between eight different emotions. There are many different systems of emotion and no general consensus as to which system best represents reality. We used the taxonomy of eight emotions by Manfred Clynes (Clynes, 1989). The emotions are: no emotion, anger, hate, grief, love, sexual desire, joy and reverence. Rosalind Picard had reported high success rates when training a classifier on one person using this taxonomy (Picard, Vyzas and Healey, ud).

For our database we used twenty volunteers who were asked to listen to the Sentic Cycles by Manfred Clynes (http://senticcycles.org/) whilst we recorded their heart rate and galvanic skin response. Sentic Cycle kits were first published in the 1970s but are still available online, as a photocopied publication with a CD containing two sentic cycles, recorded with so much white noise that one can't help being transported back to the 1970s. On the recording a male voice neutrally reads out eight emotions, each followed by a series of little knocks of a hammer, at which point the listener has to bring on the emotion internally. The kit comes with a finger rest on which the emotion should be expressed with the index finger. The total time spent per emotion is slightly over three minutes. For this experiment we did not supply the volunteers with a finger rest but asked them to just feel the emotion at each knock of the hammer, whilst holding the uterus shape, which was wired up to capture their GSR and heart rate.

Capturing GSR and heart rate for the database

Consequently, the following clustering and classification algorithms were tested on the database: Simple K-Means, Support Vector Machines, Trees, Linear Logistic Regression Models and Naive Bayes. The features of the signals were means, variances and deltas (at different d's), plus some explorations of these features in relation to the 'no emotion' state. Using a classifier for eight emotions gave success rates only slightly higher than chance level, i.e. the computer could not detect these emotions from the biosignals. We only got success rates of 75% when limiting the classification to two classes, no emotion and anger, and looking at GSR and heart rate for those two. This would be such an oversimplification that we decided the classifier method was not suitable in our context. A major concern with the induction method was that the participants reported difficulty in truly feeling some of the emotions, and that some of the emotional descriptors were culture specific (for example, reverence did not mean much to some people). We noted extreme differences in people's biosignals across the emotional categories, except for anger, which was easy for most people to feel on demand.
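
For reference, the features named above can be written out compactly. The small C++ sketch below computes the per-window mean, variance and delta (the average difference over a lag d) of a recorded signal segment; it is only a restatement of those feature definitions, not the offline analysis code used on the database, and the sample values in it are made up.

#include <cstddef>
#include <iostream>
#include <vector>

// Per-window features used in the classifier experiments: mean, variance and
// delta (average difference over a lag d). Offline illustration only.
struct Features {
  double mean = 0.0;
  double variance = 0.0;
  double delta = 0.0;   // mean of x[i] - x[i-d] over the window
};

Features extractFeatures(const std::vector<double>& x, std::size_t d) {
  Features f;
  if (x.empty()) return f;

  for (double v : x) f.mean += v;
  f.mean /= x.size();

  for (double v : x) f.variance += (v - f.mean) * (v - f.mean);
  f.variance /= x.size();

  if (d > 0 && x.size() > d) {
    for (std::size_t i = d; i < x.size(); ++i) f.delta += x[i] - x[i - d];
    f.delta /= (x.size() - d);
  }
  return f;
}

int main() {
  // A short made-up GSR segment, purely to show the feature values.
  std::vector<double> window = {0.41, 0.42, 0.44, 0.47, 0.46, 0.49, 0.53};
  Features f = extractFeatures(window, 2);
  std::cout << "mean=" << f.mean << " var=" << f.variance << " delta=" << f.delta << "\n";
  return 0;
}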

Clynes claims that on his tape the time intervals between the ticks of the hammer, the order of the emotions and the finger rest, which provides a vehicle for expressing the emotion in question, all help to make people feel the emotion. We used the sentic cycle without a finger rest and concluded that Clynes' induction method does not work without one. It may not work with a finger rest either, as most people need to be induced into emotions by external means unless they are trained.

I then decided to use the biodata directly as input to steer the colour and pulse of the light, rather than use a classifier to detect emotion.

A direct mapping approach has the benefit of staying closer to the raw data, leaving interpretation more open and thus avoiding having to buy into a system of discrete emotions. In the domain of psychology there are contrasting opinions on how to classify emotions. By visualizing the output in coloured light patterns, we don't have to opt for a system of discrete emotions but can be more ambiguous. I decided to map the tonic component of the GSR signal to hue and the phasic component to more or less change in the colour selection. The heart rate was directly reflected in the light, which pulses in time with the person's heartbeat. As more emotional information can be obtained from the heart rate signal by looking at it over a longer period of time, we reverted to using Max/MSP on the laptop together with an Arduino BT for outputting wirelessly to coloured light. We also added an accelerometer to our system, as we needed to detect when the signal was unreliable due to too much movement. The more movement, the darker the light output becomes; when the sculpture is held still the viewer is rewarded with brighter light.
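
The sketch below restates this direct mapping as Arduino-style code for readability: a very slow running average stands in for the tonic GSR level (driving hue), the deviation from it for the phasic activity (driving how much the colour wanders), and the accelerometer magnitude dims the output. In the prototype the mapping itself ran in Max/MSP, with the Arduino BT handling input and output; the pins, time constants and HSV conversion here are illustrative assumptions, and the per-beat pulsing is only indicated in a comment.

// Illustrative restatement of the direct mapping: tonic GSR -> hue,
// phasic GSR -> colour variation, movement -> darker light.
// In the prototype this logic ran in Max/MSP; all pins and constants are assumptions.

const int GSR_PIN = A0;
const int ACC_X = A2, ACC_Y = A3, ACC_Z = A4;   // analog 3-axis accelerometer (assumed)
const int R_PIN = 9, G_PIN = 10, B_PIN = 11;    // PWM channels to the LED driver (assumed)

float tonic = 0.0;   // very slow-moving baseline of the GSR signal

void setup() {
  pinMode(R_PIN, OUTPUT);
  pinMode(G_PIN, OUTPUT);
  pinMode(B_PIN, OUTPUT);
}

// Small HSV (all components 0..1) to RGB PWM helper.
void setHSV(float h, float s, float v) {
  h = h - floor(h);                       // wrap hue into 0..1
  int i = (int)(h * 6.0);
  float f = h * 6.0 - i;
  float p = v * (1 - s), q = v * (1 - f * s), t = v * (1 - (1 - f) * s);
  float r, g, b;
  switch (i % 6) {
    case 0: r = v; g = t; b = p; break;
    case 1: r = q; g = v; b = p; break;
    case 2: r = p; g = v; b = t; break;
    case 3: r = p; g = q; b = v; break;
    case 4: r = t; g = p; b = v; break;
    default: r = v; g = p; b = q; break;
  }
  analogWrite(R_PIN, (int)(r * 255));
  analogWrite(G_PIN, (int)(g * 255));
  analogWrite(B_PIN, (int)(b * 255));
}

void loop() {
  float gsr = analogRead(GSR_PIN);

  // Split the GSR into tonic and phasic parts with two different time constants.
  tonic = 0.999 * tonic + 0.001 * gsr;    // slow baseline (tonic component)
  float phasic = gsr - tonic;             // short-term deviation (phasic component)

  // Tonic level selects the hue; phasic activity makes the colour wander more.
  float hue = constrain(tonic / 1023.0, 0.0, 1.0);
  float jitter = constrain(fabs(phasic) / 200.0, 0.0, 1.0);
  hue = hue + jitter * ((random(100) / 1000.0) - 0.05);   // small random walk around the hue

  // Movement darkens the light; holding the sculpture still keeps it bright.
  float ax = analogRead(ACC_X) - 512, ay = analogRead(ACC_Y) - 512, az = analogRead(ACC_Z) - 512;
  float motion = sqrt(ax * ax + ay * ay + az * az) / 512.0;
  float brightness = constrain(1.0 - motion, 0.1, 1.0);

  // Heart-rate pulsing would modulate brightness here: detect beats as in the
  // earlier sketch and apply a short brightness envelope on each beat (omitted).

  setHSV(hue, 1.0, brightness);
  delay(20);
}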

Conclusions:

There is no one-to-one relationship between emotions and body data, but body signals do reveal something of the emotional state of the person in question.

In the context of this artwork, it became clear that direct mapping from physiological changes to light can already create an interesting feedback loop between artwork and participant. GSR provides instant feedback indicative of emotional changes and instinctive physiological responses to the external environment.

Public presentation:

I gave a lecture at STEIM (http://www.vimeo.com/4216238) in the middle of the residency period.

Further development:

Artistically: there is plenty of scope for designing other sculptural shapes that could contain the same technology but would perhaps have slightly different behaviours. This is planned for future work. The interaction will also be experimented with further and the mapping expanded upon.

It would also be interesting to develop a closer collaboration with neuroscientists, psychologists and physiologists as a deeper understanding of the data will enrich the possibilities for the artwork.

Technically:

There were issues with the Arduino BT and the LED driver chip we were using: each time we moved from the Arduino Diecimila to the Arduino BT, the LEDs would start flickering and then the chips would give up.

Ideally we will convert back to using just an Arduino, rather than Max/MSP and an Arduino, as that way we would not need a laptop. Once all the mappings have been finalized in Max/MSP we will investigate the possibility of converting the code to Arduino code only, using the Arduino Mega, as this has 4 k of memory rather than the 2 k of the latest Arduino Diecimila and may be able to do all we need it to do. However, this would require substantial reprogramming, so we need to find additional funding for this.

Since the residency we have had our own PCB made to reduce noise in the signal as much as possible.

As all the electronic components need a bespoke place inside the sculpture, there are considerable benefits in working with a rapid-prototyped shape rather than ceramics, which break easily and are very labour intensive to create. The benefit of using ceramics, on the other hand, is that it creates an extra layer of meaning, referencing tableware and fragility. For the moment we have decided to rapid prototype the shape in vinyl, as many people will handle it at ISEA09 in Belfast, where the prototype will first be launched to a large audience, so we have to make it as robust as possible.

Many thanks to STEIM, Arts Council England, the volunteers for the database, the University of Sunderland and the advisory team:

· Curator and academic writer on media art: Beryl Graham, University of Sunderland

· Psychology researcher: Lieselotte van Leeuwen, University of Sunderland

· Sensors and biofeedback engineer: Ben Knapp, SARC, Queen's University Belfast

Background information + References:

Boehner, K., DePaula, R., Dourish, P., Sengers, P. (ud), "How Emotion is Made and Measured"

Clynes, M. (1989), Sentics: The Touch of Emotions, Dorset: Prism Press

Evans, D. (2001), Emotion: A Very Short Introduction, Oxford: Oxford University Press

Gage, J. (1999), Colour and Meaning: Art, Science and Symbolism, Singapore, London: Thames and Hudson

Le Groux, S., Valjamae, A., Manzolli, J., Verschure, P. (2008), "Implicit Physiological Interaction for the Generation of Affective Musical Sounds", Pompeu Fabra University

Gomez, P. and Danuser, B. (2007), "Relationships between Musical Structures and Psychophysiological Measures of Emotions", The American Psychological Association, Vol. 7, No. 2

Healey, J. and Picard, R. (2000), "SmartCar: Detecting Driver Stress", Proceedings of ICPR '00, Barcelona, Spain

Isbister, K., Höök, K., Laaksolahti, J., Sharp, M. (2006), "The sensual evaluation instrument: Developing a trans-cultural self-report measure of affect", International Journal of Human-Computer Studies 65 (2007), 315-328, Elsevier

Juslin P.N., Sloboda, J.A (2001) Music and Emotion, Theory and Research. Oxford University Press

Lim, C.L., Rennie, C., Barry, R.B., Bahramali, H., Lazzaro, I., Manor, B., Gordon, E. (1997), "Decomposing Skin Conductance into Tonic and Phasic Components", International Journal of Psychophysiology 25 (1997), 97-109

Nakatsu, R., Nicholson, J., Tosa, N. (ud), "Emotion Recognition and Its Application to Computer Agents with Spontaneous Interactive Capabilities", Kyoto, Japan: ATR Media Integration & Communications Research Laboratories

Oatley, K., Keltner, D., Jenkins, J.M. (2006), Understanding Emotions, 2nd edition, USA, UK & Australia: Blackwell Publishing

Picard, R.W., Vyzas, E., Healey, J. (ud), "Toward Machine Emotional Intelligence: Analysis of Affective Physiological State", MIT Media Laboratory, Perceptual Computing Section Technical Report No. 536; to appear in IEEE Transactions on Pattern Analysis and Machine Intelligence

Posner, J., Russell, J., Peterson, B. (2005), “The circumplex model of affect: An integrative approach to affective neuroscience, cognitive development and psychopathology”, Development and Psychopathology 17, Cambridge University Press

Sadat Shami, N., Hancock, J.T., Peter, C., Muller, M., Mandryk, R. (2008), "Measuring Affect in HCI: Going Beyond the Individual", CHI 2008 Proceedings – Workshops, Florence, Italy

Van ‘t Klooster, A. (2009), Emotion Lights: from biosignals to light art, Belfast, ISEA2009 proceedings

This report was written by Adinda van ‘t Klooster and published on 20-06-09.

Holding the Emotion Light, work in progress, (c) Adinda van 't Klooster, 2009
