Feel: Trace 2007


Synopsis of “Feel Trace”: “FEEL:TRACE” is a psychophysiologically responsive video installation synthesizing art, neuroscience and technology. The project explores new, more embodied languages of interactive and emotional communication, investigating the inter-relationship of the internal body and the external world. Biosensors monitor the participant’s heart rate responses while an audio narrative, created by a clinical hypnotist, induces varying emotional responses in the viewer. The sensors begin to diagnose patterns of internal arousal in the body, picking up how it is responding to the imagery. Depending on the heart rate of the viewer, clips are triggered from a pre-shot database of salient affective video stimuli.

This database was tested scientifically for its potent psychophysiological effects on the central and autonomic nervous systems (using medical diagnostic imaging devices such as fMRI, EEG and MEG). The viewer’s bodily reactions continue to trigger changes in the projected video and audio content. The video footage constantly adjusts in response to the viewer’s internal state, creating a biofeedback dialogue between the image and the participant. The software reads the responses of the body, choosing either to calm the viewer or to stress them further.

Building of “Feel Trace”: “Feel Trace” was the first sketch we embarked on in the Feel Series. The prototype used small biosensors from the London-based company Health Smart. The ‘heart’ sensor, attached to the participant’s finger, monitored both heart rate and heart rate variability (HRV). I worked with Critchley to understand the signals the sensors generated. Heart rate is rather simple to analyse: a rise in heart rate indicates that one may be stressed. As Critchley notes, “increased smooth variability in heart rate / interbeat interval is seen as being physiologically desirable, associated with calmness”. But how immediate are the signals of heart rate and HRV? If you showed the viewer a video, how long would it take to affect the physiology of the body? With “Feel Trace”, I aimed to build a feeling of ‘control’ in the participant, which meant the change in physiology needed to be as immediate as possible. In discussion, Critchley states:

Hugo Critchley writes: “Heart rate changes within the first 2 beats (2 seconds) of seeing a potent stimulus and may peak in its changes about 4 beats later. Increased smooth variability in heart rate / interbeat interval is seen as being physiologically desirable, associated with calmness. HRV is not a response as such, rather a frequency power: the longer you measure, the more accurate your estimation of power; the standard advice is to measure over 3 min”.
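Critchley’s points can be sketched numerically. The snippet below is a minimal illustration with hypothetical data and function names, using RMSSD as a simple time-domain stand-in for the frequency-power HRV measure he describes; it shows how heart rate and a basic variability estimate could be derived from a sensor’s stream of interbeat intervals:

```python
# Sketch (hypothetical data): deriving heart rate and a simple HRV
# estimate from interbeat intervals, as "Feel Trace" needed to do
# with the Health Smart sensor stream.
import math

def heart_rate_bpm(ibis):
    """Mean heart rate in beats per minute from interbeat intervals (seconds)."""
    return 60.0 / (sum(ibis) / len(ibis))

def rmssd(ibis):
    """RMSSD: root mean square of successive IBI differences, in ms.
    Higher values indicate smoother, calmer variability."""
    diffs = [(b - a) * 1000.0 for a, b in zip(ibis, ibis[1:])]
    return math.sqrt(sum(d * d for d in diffs) / len(diffs))

# A calm subject: slow, smoothly varying beats.
calm = [1.00, 1.05, 0.98, 1.06, 0.99, 1.04]
# A stressed subject: fast, rigid beats.
stressed = [0.60, 0.61, 0.60, 0.60, 0.61, 0.60]

print(heart_rate_bpm(calm), rmssd(calm))          # slower rate, higher HRV
print(heart_rate_bpm(stressed), rmssd(stressed))  # faster rate, lower HRV
```

As Critchley notes, a reliable power estimate needs minutes of data, which is why a two-beat heart-rate change was the only signal fast enough for the sense of immediacy the work aimed at.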

How potent would the video need to be in order to affect the heart rate and HRV? When first discussing the work with Critchley’s research group, they agreed that if you wanted stronger arousal readings from the sensors, you would need to create potent, arousing footage.

I worked with Dr Neil Harrison[i] to develop the video content. Critchley had asked me to build three viscerally directed video stimulus sets (VDVSS) for specific research experiments exploring the basis of patterned physiological responses to disgust stimuli. I created two databases: one of body boundary violation (mostly videos of people undergoing surgery) and one of ingestive disgust (videos of people vomiting, and eating vomit). Twelve healthy participants were scanned while they watched videos displaying disgust-evoking imagery, combining functional magnetic resonance imaging of the brain with autonomic monitoring. Measures of heart rate (using pulse oximetry) and gastric activity (using electrogastrography, EGG) were acquired during scanning, along with subjective ratings of the videos. These stimuli formed the majority of the content of “Feel Trace” and also became part of a research paper[ii]. I was interested in the power of images to elicit a visceral response in the viewer; I imagined that feelings of discomfort and squeamishness would arise from showing images of physically painful events.

For the sound, I created an audio narrative with clinical hypnotist and psychologist Dr David Oakley. I hoped his hypnotic voice and word structure would induce varying emotional responses in the viewer. The background sounds for the track came from ‘emotional sound’ databases created by voice neuroscientist Dr Sophie Scott; these databases are used frequently in voice-neuroscience studies[iii].

Working with Sean Gomer, an engineer at Health Smart, and Doron Friedman, a computer scientist in UCL’s Computer Science Department, we created the software that would recognise the sensor and trigger the imagery. The software was created in Flash; over a three-minute period it tracked the participant’s responses, triggering a range of emotionally potent images that were slowly revealed to the viewer.
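The core trigger logic can be sketched as follows. This is a minimal illustration in Python rather than the original Flash; the clip names, the 10% arousal threshold and the function name are all hypothetical:

```python
# Minimal sketch of the "Feel Trace" trigger loop. The original was
# written in Flash; clip names and the 10% threshold are hypothetical.
import random

CALMING = ["sea_01.mov", "clouds_02.mov", "breath_03.mov"]
AROUSING = ["surgery_01.mov", "vomit_01.mov", "surgery_02.mov"]

def next_clip(hr_bpm, baseline_bpm, stress_mode=False):
    """Choose the next video clip from the participant's heart rate.

    In stress mode the software escalates with arousing footage;
    otherwise it acts as biofeedback, answering arousal with calm.
    """
    aroused = hr_bpm > baseline_bpm * 1.1  # ~10% above resting baseline
    if stress_mode:
        return random.choice(AROUSING)
    return random.choice(CALMING if aroused else AROUSING)

# Example: a participant whose heart rate has risen above baseline
# is steered back toward calm.
print(next_clip(85, 70))  # a clip drawn from CALMING
```

The design choice is the biofeedback dialogue described in the synopsis: the body’s signal selects the next stimulus, and the stimulus in turn reshapes the body’s signal.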

The first prototype of the series, “Feel Trace”, investigated the use of heart rate variability (HRV) and heart rate to monitor the arousal of the audience, using this data to trigger the video footage.

Conclusion of “Feel Trace”: The piece was tested five times, including at the CYNETART festival in Dresden, the Science Museum and the Whitechapel Gallery in London.


At each exhibition, I dressed each participant with the sensors and asked them to sit in a chair in a darkened room facing the projected video. Although the sensors were small and aesthetic, they proved to be problematic and unstable: any movement of the participant caused the signal to drop out. Disappointingly, the sensors were not wireless, and the participant had to be connected to a computer via a cable, constraining the naturalistic agenda of interaction I had initially envisaged. Using only heart rate gave little insight into specific emotional states, indicating mostly arousal; to take this project further, we will need to look at multi-modal wireless sensors. Interestingly, the arousal responses were affected by applying the sensors: dressing and directing the user instigated a rise in heart rate. This had to be countered in the display of content (showing calmer, more neutral footage for a longer period before the artwork began using heart rate to trigger video) and in the software design (not reading the physiological data until 90 seconds had elapsed). When I discussed the work with Rosalind Picard, she suggested that this was not long enough, and that at least three to five minutes was needed; I had only envisaged a five-minute ‘experience’ for the work.
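The settling strategy described above can be sketched as a simple phase schedule. The 90-second and five-minute figures come from the text; the constants and function names are my own:

```python
# Sketch of the settling logic: sensor data is ignored for the first
# 90 seconds while neutral footage plays, because attaching the
# sensors itself raised participants' heart rates.
SETTLE_SECONDS = 90       # window during which sensor data is ignored
EXPERIENCE_SECONDS = 300  # the planned five-minute experience

def should_read_sensors(elapsed_s):
    """Only act on heart-rate data once the settling window has passed."""
    return elapsed_s >= SETTLE_SECONDS

def phase(elapsed_s):
    """Which stage of the experience the participant is in."""
    if elapsed_s < SETTLE_SECONDS:
        return "settling: neutral footage, sensors ignored"
    if elapsed_s < EXPERIENCE_SECONDS:
        return "biofeedback: heart rate triggers video"
    return "end"

print(phase(30))
print(phase(120))
```

Picard’s suggestion of a three-to-five-minute baseline would mean raising `SETTLE_SECONDS` to most or all of the planned experience, which is the tension the text identifies.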

Conversely, some users reported that they enjoyed the anticipation of being dressed and directed for the work, stating that they thought it was more performative.

For the sensors to interact with the computer, the code was written in Flash, which had a difficult time triggering QuickTime video. I had to design the content so that the video worked at 320 x 480 pixels. Even so, the program crashed often, and the interface required a restart for each participant. The code would need to be rewritten if the project were taken further, preferably in a program such as Max/MSP with Jitter.

The video content I created was very potent, producing significant rises in the heart rate variability of the participants, but did the audience feel the work was responding to their physiology? “… it was quite dramatic to feel the installation reflecting your own bodily responses. I did feel as though I had no control over what would happen although I did attempt to do so. I felt as though the installation was leading me into a sort-of more emotional feverishness (mainly through the use of the soundtrack as I remember it!) which was very difficult to then calm down…”[iv] The potent nature of the piece was confronting for the participants (and myself) to sit through. “… it is hard to describe how, at the same time, I wanted to rush out but could not move my eyes from the screen… It was filled with body parts and squeaking, resounding noise, a mix of revulsion and attraction, antinomic but automatic; this artwork elicited what was unsettling”[v]. There was a sense that the imagery was reflecting and provoking the ‘inner body’ of the participants, though most users felt they had no control over the piece, that it was taking them on a journey. I wondered if this might have been due to the two-second latency of using heart rate to gather a response.

[i] Dr Neil Harrison is part of Professor Hugo Critchley’s research lab, based at the Functional Imaging Laboratory (FIL) at the Institute of Neurology (IoN), University College London (UCL).

[ii] Neil A Harrison, Marcus A Gray, Peter J Gianaros and Hugo D Critchley, “Central mechanisms for organ-specific control of visceral responses to emotive stimuli revealed by neuroimaging and autonomic dissociation of disgust”, 14th World Congress of Psychophysiology (“The Olympics of the Brain”), St. Petersburg, Russia, 8–13 September 2008.

Emotions are embodied in physiological response patterns. We explored the basis to patterned physiological responses to disgust stimuli, combining functional magnetic resonance imaging of brain with autonomic monitoring. Twelve healthy participants were scanned while they watched videos displaying disgust-evoking imagery, divided into body boundary violation and ingestive disgust. Measures of heart rate (using pulse oximetry) and gastric activity (using electrogastrography, EGG) were acquired during scanning with subjective ratings of the videos.

Distinct patterns of cardiac and gastric sympathetic and parasympathetic responses were observed to the two forms of disgust. Neuroimaging revealed an association between all disgust and activation of insular cortex, neostriatum, thalamus and PAG. However, ingestive disgust evoked greater activity within insula, amygdala and brainstem, while body boundary violation disgust evoked greater activity within somatomotor cortices and posterior insula / S2. Activity changes within right insula predicted gastric (tachygastria) responses and ratings of ingestive disgust, while activity changes in sensorimotor cortex related to cardiac response and ratings of boundary violation disgust.

These findings illustrate dissociation within both the brain responses and peripheral physiological reactions to two forms of disgust stimuli, with implications for understanding mechanisms for individual vulnerability to particular psychosomatic disorders.

[iii] The research team played a series of sounds to volunteers whilst measuring their brains’ responses using an fMRI scanner. Some of the sounds were positive, such as laughter or triumph, whilst others were unpleasant, such as screaming or retching. All of the sounds triggered a response in the volunteers’ premotor cortex, the region that prepares the muscles of the face to respond accordingly. Interestingly, the response was greater for positive sounds, suggesting that these were more contagious than negative ones.

[iv] Transcribed from a post-exhibition email interview with Rachael Maddock, who took part in the UCL exhibition.

[v] Transcribed from a post-exhibition email interview with Thiery Chamina, who took part in the Whitechapel exhibition.

Exhibition

CYNETART, Dresden, GERMANY 2006
Short listed for Harries National Digital Art Award, AUSTRALIA 2006
QUT gallery, Brisbane, AUSTRALIA 2006
Science Museum, Node. London, UK 2006
Whitechapel Gallery, Wormhole Salon, London, UK 2006
