Raewyn Turner, Tony Brooks

Multisensory Spaces

The Four Senses performances with a symphony orchestra were a translation of sound into light, colour, and smell.

Four Senses Performance, collaboration 2002

Technical Description

The pre-programmed light states were created with a lighting plan and a PC-based stage-lighting program, producing multiple sequences and cues for use in improvisation. During the performances, improvisation took place within the bounds of the lighting states and smells created for each piece of music. The canvas of the orchestra was dressed in white and underlit with ultraviolet light. Divided into sound groups, each section was assigned a colour and its complementary to achieve a high degree of retinal stimulation, brightness, and afterimage.
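As a minimal sketch of the colour pairing described above: a complementary colour can be computed by inverting each RGB channel, and each section paired with its base colour and complement. The section names and colour values here are hypothetical, for illustration only; the actual show was programmed in a stage-lighting package, not in code.

```python
def complementary(rgb):
    """Return the complementary of an (r, g, b) triple, 0-255 per channel."""
    r, g, b = rgb
    return (255 - r, 255 - g, 255 - b)

def build_cues(section_colours):
    """Pair each orchestra section's base colour with its complementary,
    giving the high-contrast colour/afterimage pairs assigned per section."""
    return {section: {"base": colour, "complement": complementary(colour)}
            for section, colour in section_colours.items()}

# Hypothetical section assignments for illustration only.
cues = build_cues({"strings": (255, 0, 0),    # red pairs with cyan
                   "brass": (255, 255, 0)})   # yellow pairs with blue
```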


The efficient dispersal of smell is important to the comfort of a closed-room audience. The fragrances, processed by an industrial chemist into aerosol sprays, were applied directly into the air-conditioning system, which distributed them evenly through the auditorium on cue. The fragrances were mixed or cleared away using the building's air-conditioning fans.
Warning notices were included in the advertising for the event, informing audiences of the presence of smells.

The software comprised various image manipulators applied to the eight camera feeds, blended into interactive paint programs on the three upstream video-mixer computers, with the final composition achieved through real-time improvised manipulation of EyesWeb parameters.
Technical process: movement was taken from various sections of the orchestra.
- Three SoundScapes sensors controlled an RGB filter system. These were set up under the orchestra conductor's hands so that his conducting "painted the scenography".
- Multi-head arrays of SoundScapes sensors (various) were set up on stage.
- Multiple microphone inputs from the stage provided source signals for manipulating the images.
- Five static cameras and three roving camera operators were directed via headphones by me.
These signals served a switching bay that I created and a bank of monitors for observing all signals. Downstream, video mixers fed five computers (Mac and PC), which manipulated the material I was blending in real time. These were mapped to further blends downstream, with video feedback etc., and then to seven large EPSON LCD projectors. Feeds from the sonic aspects of the orchestra also manipulated the images in real time.
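The mapping from the three conductor-hand sensors to the RGB filter system can be sketched as follows. This is an illustrative reconstruction, not the actual SoundScapes code; the normalised 0.0-1.0 sensor range is an assumption.

```python
def sensors_to_rgb(red_sensor, green_sensor, blue_sensor):
    """Map three normalised (0.0-1.0) motion-sensor readings, one per
    sensor head, to an 8-bit RGB value for the colour filter system."""
    def to_channel(v):
        v = max(0.0, min(1.0, v))   # clamp out-of-range readings
        return int(v * 255 + 0.5)   # scale and round to 0-255
    return (to_channel(red_sensor),
            to_channel(green_sensor),
            to_channel(blue_sensor))
```

In this sketch each sensor drives one colour channel independently, so a gesture over one sensor head brightens only that channel of the projected colour.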
We also created a foyer installation for a "mini performance" by dancers, who triggered image and sound via my sensor system and camera. In this space we photographed audience members experiencing the VIRTUAL INTERACTIVE SPACE © and printed the images on a series of Epson printers supplied with photo-quality paper.

The light performances were mostly improvised in real time, as we found the computer programming too restrictive: subtle changes from one of us affected the other's input, so there was much talking on the headsets for timing. The two spectra of my projected images and the lights integrated well.

Hardware / Software

Software: EyesWeb
Laboratorio di Informatica Musicale (InfoMus Lab)
The EyesWeb open software platform.
Copyright © 1999-2002 - Lab. InfoMus - DIST - University of Genoa
http://www.infomus.dist.unige.it/

7 LCD video projectors, video mixers, television monitors, a sound system, 5 computers (both Macintosh and PC platforms)
Martin Intelligent Lighting system
LightJockey stage-lighting programming and control
Essential oils packaged by an industrial chemist into aerosol cans.