EEG data in Interactive Art


This paper reflects upon the interaction of music and/or digital media with brainwave data obtained from performers or an audience via an EEG interface, and contextualises such works as dynamical systems, as defined by Abraham and Shaw. A brief historical overview is given, with examples from the past 45 years –including one of my own– based on biofeedback methods developed from the late 1960s onwards.



In 1875, Richard Caton discovered the electrical currents of the brain by experimenting on animal brains with a galvanometer, setting the path for the development of the EEG by the German psychiatrist Hans Berger in the early 1920s. Berger was the first to record electrical impulses from human brains, in 1924 (he called these recordings Elektroenkephalograms); he was also the first to observe alpha wave activity (8 to 12 Hz), which is accentuated during relaxation. In the 1960s, Neal E. Miller applied such measurements to therapeutic treatment, which resulted in the creation of the biofeedback method. This method has been used since then to measure the physiological activity of patients –such as brainwaves, heartbeat or muscle tension– so that they can learn to control those activities and improve their health by increasing their body and mind awareness. Schwartz and Andrasik state [3] that in 1969, Joe Kamiya, who was studying the alpha and beta brain states, reported that:

“one could voluntarily control alpha waves –a feat that was previously believed impossible.”

With his research, Kamiya gave a vital impulse to the use of the EEG biofeedback method, most commonly known as neurofeedback. Even before that, in the mid-1960s, the EEG had been incorporated into art by composers such as Alvin Lucier and Richard Teitelbaum, whose first performances utilising brainwaves gave EEG data, for the first time, a use other than scientific or therapeutic. The artworks described in the next section show how the use of EEG data in real-time pieces results in dynamical interactive works: their evolution is in constant modification, changing with the emotional and/or mental states present at each performance. This variability implies a dynamical system, as Rosenboom [2] explains:

 “Dynamical systems may be thought of as those involving forms or behaviors that change over time. The study of such changing forms may also be termed morphodynamics.”


In 1965, Lucier composed Music for Solo Performer - for Enormously Amplified Brain Waves and Percussion, which is considered the first musical piece using brainwaves. For this piece, he attached electrodes to the performer’s scalp, measuring only the Alpha rhythm waves, which were sent to amplifiers and loudspeakers connected to a large set of percussion instruments, in order to produce vibrations in these instruments via sympathetic resonance.

Other important examples are Teitelbaum’s [4] pieces Organ Music and IN TUNE, both created in 1968. In both, the composer used the EEG signal mainly to control voltages on the Moog synthesizer, in some cases employing not only the alpha range but a signal that

“comprised the broad spectrum from DC to about 50 cycles per second (Hz). It was applied chiefly to frequency modulate four voltage controlled oscillators, and also to control the amplitude and filtering of these audio signals.”

For On being invisible (1976/77) David Rosenboom [2] developed his own real-time software to create

“a self-organizing, dynamical system, rather than a fixed musical composition.”

His dynamical, real-time system extracted patterns from the performer’s brain activity that appeared with a certain regularity and compared them with patterns stored previously. This comparison determined their periodicity, which was then used to influence different musical parameters. The software detected and analysed event-related potentials (ERPs) in the brain activity –responses directly associated with an internal or external stimulus– in order to activate instruments, creating a circuit between the performer and the sound environment produced by the system.
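A comparison of this kind can be sketched generically. The following is an assumed illustration only –not Rosenboom’s actual software– in which incoming windows of EEG samples are scored against stored templates by normalised correlation, and a recurring pattern’s period is estimated from the autocorrelation:

```python
# Generic sketch (hypothetical function names and window handling):
# score an EEG window against a stored template, and estimate the
# period of a recurring pattern from the autocorrelation peak.
import numpy as np

def similarity(window, template):
    """Normalised correlation between an incoming window and a stored template."""
    w = (window - window.mean()) / (window.std() + 1e-12)
    t = (template - template.mean()) / (template.std() + 1e-12)
    return float(np.dot(w, t) / len(w))

def estimate_period(signal, fs):
    """Estimate the dominant period (in seconds) from the autocorrelation."""
    x = signal - signal.mean()
    ac = np.correlate(x, x, mode="full")[len(x) - 1:]
    # Skip past the zero-lag peak: search after the first negative value.
    first_neg = int(np.argmax(ac < 0))
    lag = first_neg + int(np.argmax(ac[first_neg:]))
    return lag / fs
```

A window scoring near 1.0 against a stored template would count as a recognised pattern, and its estimated period could then be mapped to a musical parameter.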

The EEG has also inspired artists working with other types of media. For example, in On being invisible II (1994/95) Rosenboom included lights, video and slide projectors. In Mariko Mori's Wave UFO (1999–2002), the brainwaves of three participants inside a sculptural object –a spaceship– drive the audiovisual elements, creating an interactive experience that invites them to immerse themselves in a deeper state of consciousness and to interconnect with themselves and the universe, representing the Buddhist concept of oneness.

All these examples show rather different approaches to using this technology for artistic purposes since the 1960s. Interfacing EEG data with multimedia continues to attract artists, aided nowadays by devices and software such as the OpenEEG and Arduino projects, as my piece INsideOUT –explained below– shows.

INsideOUT (2009) – Performance (EEG and real-time media)

The name of this project refers to the expression of the self, turning the subject’s imagination from the inside to the outside; it keeps the original intention of the EEG project by materialising the performer’s thoughts and feelings on stage. The stage is a place for the appearance of the invisible. Michael Haerdter and Sumie Kawai [1] quote Yasu Ohashi as stating that:

“the actors aim at our senses, our body and our unconscious and not at our intellect. Their gestures try to envision THE INVISIBLE WORLD.”

INsideOUT was created during an artist in residence program at KHM (Germany). The performer interacts with sound and images using an EEG interface, which measures the performer’s brain activity. Sounds and images –some already stored in the computer and some produced live– are continuously modified by the values from two electrode combinations via MAX/MSP-Jitter. Hence, the performer determines how those combinations will be revealed to the audience. Images are projected to a screen and also onto the performer, while sounds are projected in surround.

The open source Olimex EEG interface used here measures the brain activity and consists of two assembled boards: one analogue and one digital. It is possible to connect up to three boards, each with two EEG channels; only two channels are used for the piece, though: frontal and occipital. The rubber cap and the contact electrodes of the interface are those typically used in medical applications.

I received technical support from Lasse Scherffig and Martin Nawrath at Lab3-KHM to adapt the interface. They modified the OpenEEG device by replacing the Atmel microcontroller with one running the Arduino firmware and changing the quartz clock to 16 MHz accordingly.

Scherffig wrote a program in Processing that reads the values of both EEG channels via serial communication. The modified OpenEEG interface sends ASCII-formatted data representing the voltages of both channels at a frequency of 100 Hz. In Processing, a Fast Fourier Transform is applied to the data, extracting the bins for frequencies between 0 and 50 Hz. From these, the median of the alpha-band bins (8-12 Hz) is taken, smoothed with a low-pass filter and transmitted via OSC, where it is received by the OSC-route object in MAX.
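The analysis chain can be sketched as follows. This is a minimal Python illustration, not the original Processing code; the one-second window size and the filter coefficient are assumptions:

```python
# Sketch of the signal chain: FFT over a window of samples, median of
# the alpha-band (8-12 Hz) magnitudes, one-pole low-pass smoothing.
import numpy as np

FS = 100        # sample rate in Hz, as stated in the text
WINDOW = 100    # one-second analysis window (assumed)

def alpha_value(samples, fs=FS):
    """Return the median magnitude of the 8-12 Hz bins of one window."""
    spectrum = np.abs(np.fft.rfft(samples))
    freqs = np.fft.rfftfreq(len(samples), d=1.0 / fs)
    band = spectrum[(freqs >= 8) & (freqs <= 12)]
    return float(np.median(band))

def smooth(values, coeff=0.1):
    """Simple one-pole low-pass filter over successive alpha values."""
    out, state = [], 0.0
    for v in values:
        state += coeff * (v - state)
        out.append(state)
    return out

# Example: a synthetic 10 Hz "alpha" signal analysed over one window.
t = np.arange(WINDOW) / FS
alpha = alpha_value(np.sin(2 * np.pi * 10 * t))
```

The smoothed alpha value would then be sent as a single OSC message per window, exactly the kind of slowly varying control stream that MAX/MSP-Jitter can map onto sound and image parameters.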

For the performance of INsideOUT, I have tried to train my brain to control the media combinations on stage, making visible different emotional and mental states that could not be conveyed without the input of my own brainwave data. However, this conscious control is never completely attained, due to the enormous and uncontrollable stream of feelings that generally arises under such circumstances.


Based on my own experience of performing pieces using EEG data, their common ground is the presence of a behaviour that produces a dynamical system, as explained at the start of this paper. Whether the data are mapped to the media directly or via more complex systems (e.g. Rosenboom’s software), the structure of the pieces is never fixed; instead, it evolves dynamically. This evolution depends on the performer’s or participant’s emotional and/or mental state, which is affected by the perception of their own self and of the piece's environment. The result therefore varies strongly between performances of the same piece, making them impossible to replicate.

The aim of my research in this field is mainly focused on raising awareness of the artistic possibilities of consciously controlling brain activity on stage to steer multimedia events and, at the same time, to allow feelings –the real creators of each dynamical system– to flow freely.

References and Notes: 
  1. Michael Haerdter and Sumie Kawai, Rebellion des Körpers, Butoh, ein Tanz aus Japan (Berlin: Alexander Verlag, 1998), 25.
  2. David Rosenboom, "Extended Musical Interface with the Human Nervous System," in LEA Electronic Monographs, no. 1 (ISAST, 1997), (accessed June 11, 2011).
  3. Mark S. Schwartz and Frank Andrasik, Biofeedback: A Practitioner’s Guide (New York, London: The Guilford Press, 2003), 9.
  4. Richard Teitelbaum, "In Tune: Some Early Experiments in Biofeedback Music (1966-1974)," in Biofeedback and the Arts, Results of Early Experiments, ed. D. Rosenboom, 39 (Vancouver: Aesthetic Research Center of Canada Publications, 1976).