
The 26th International Conference on Auditory Display (ICAD 2021), June 25–28, 2021, Virtual Conference

BRAIN.WAV: EEG BIOFEEDBACK IN PERFORMANCE AND IMPROVISATION

Nicolas D’Aleman Arango

Hasso-Plattner-Institute for Digital Engineering
University of Potsdam

Potsdam, Germany
[email protected]

Shama Rahman

Hasso-Plattner-Institute for Digital Engineering
University of Potsdam

Potsdam, Germany
[email protected]

ABSTRACT

Brain.wav brings their unique process iBoS (Input Brain Output Sound, patent pending) to improvising with both brain and acoustic waves, creating biosonic loops. Shama and Nico will present an exclusive teaser, tailor-made for the virtual world, merging analog and digital. Through digital signal processing, Nico captures and sonifies Shama's brainwaves into electronica sounds, while Shama incorporates the sonified brainwaves into her improvisation, generating a recursive loop. Of course, listening to her own brainwaves changes her brainwaves, so it is a tightrope act. Nico and Shama's unique process disrupts the market of contemporary improvisatory music and creates orthogonal value for their stockholders (a.k.a. the audience). We invite you to take part in this world-premiere teaser and become a member of the BrainClub, directly from the streets of Berlin into your living rooms.

1. INTRODUCTION

During the last decade or so, we have seen a surge in consumer-level devices that are capable of measuring brainwave activity in one way or another. Their uses range from basic tasks such as meditation, relaxation, and sleep aid to experimentation and educational purposes through the programming of microcontrollers. One of the main appeals of these devices is that they can stream and present data in real time, making them attractive to consumers. Real-time operation is a great vehicle for the sonification of data, as the data is immediately available and direct interaction with the device can elicit a recognizable sonic response.

One of the most logical applications for these devices is their use as Brain-Computer Musical Interfaces (BCMIs). A thorough review of the ideas behind BCMI can be found in [1]. According to the authors, the idea of a BCMI is to create a device that controls musical parameters, just as regular BCIs are used to control other computational devices (e.g., by patients with locked-in syndrome). As a form of control, mapping is the main tool for producing significant changes in sound signals and musical events. As the capacity of computers and devices to process information grows, the number of parameters and indicators that can be mapped to music grows as well, making the mappings more flexible.

This work is licensed under a Creative Commons Attribution Non Commercial 4.0 International License. The full terms of the License are available at http://creativecommons.org/licenses/by-nc/4.0/

2. IBOS

[Figure: iBoS streaming setup]

We created an autonomous project of musical improvisation that merges synthesized sounds controlled by brainwaves with traditional acoustic instruments such as the sitar. Through digital signal processing we capture and sonify brainwaves into electronic sounds, incorporating them into the acoustic improvisation and generating a biofeedback loop. The auditory cues, presented through sound spatialization and sonic immersion, function as the medium to explore the nuances of our emotional perception. This generative production of space is dictated by AI algorithms that react to the signals captured in the room, i.e., the improvisations created by the performers and their soundwaves.
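A practical concern in any biofeedback loop of this kind is that raw band-power estimates fluctuate rapidly, so the control signal is usually smoothed before it drives sound. As a minimal illustrative sketch (not the actual processing in the iBoS patch), an exponential moving average keeps the loop responsive while damping jitter:

```python
def ema(values, alpha=0.1):
    """Exponential moving average over a stream of control values.
    alpha in (0, 1]; smaller alpha means heavier smoothing."""
    smoothed = []
    state = None
    for v in values:
        # First sample initializes the state; later samples blend in.
        state = v if state is None else alpha * v + (1 - alpha) * state
        smoothed.append(state)
    return smoothed
```

In a live patch the same recurrence would run per incoming OSC message rather than over a list; the choice of alpha trades feedback stability against responsiveness.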

Brain.wav is the first creative output of the iBoS project. It is an improvisation between sonified brainwaves and acoustic instruments. The setup consists of a Muse2 headband connected to a smartphone running the MindMonitor application, a commercial app developed by James Clutterbuck and designed to be used in conjunction with the Muse device. The application serves mainly as a bridge between the Muse2 and the main Max/MSP patch running on a computer. The data is transmitted as OSC messages over a WiFi connection. MindMonitor allows the streaming of different kinds of data, including the raw data from the four main electrodes as well as an FFT analysis of each electrode, resulting in power bands for the Delta, Theta, Alpha, Beta, and Gamma frequency ranges. It also recognizes blinks and jaw clenches, and transmits gyroscope and accelerometer data.
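The OSC messages carrying this data follow a simple binary layout: a null-padded address string, a comma-prefixed type-tag string, then big-endian arguments. The sketch below encodes and decodes such a message; the address shown is an assumption about MindMonitor's naming for absolute alpha band power, given for illustration rather than taken from this paper.

```python
import struct

def _pad(s: bytes) -> bytes:
    # OSC strings are null-terminated and padded to a 4-byte boundary.
    s += b"\x00"
    return s + b"\x00" * (-len(s) % 4)

def build_osc(address: str, *floats: float) -> bytes:
    """Encode a simple OSC message carrying only float32 arguments."""
    return (_pad(address.encode())
            + _pad(("," + "f" * len(floats)).encode())
            + b"".join(struct.pack(">f", f) for f in floats))

def parse_osc(packet: bytes):
    """Decode an OSC message with float32 arguments only."""
    end = packet.index(b"\x00")
    address = packet[:end].decode()
    off = (end + 4) & ~3                      # skip padding after the address
    tend = packet.index(b"\x00", off)
    tags = packet[off + 1:tend].decode()      # drop the leading comma
    off = (tend + 4) & ~3                     # skip padding after the type tags
    values = [struct.unpack(">f", packet[off + 4 * i:off + 4 * i + 4])[0]
              for i, t in enumerate(tags) if t == "f"]
    return address, values
```

In the actual setup this parsing happens inside Max/MSP's OSC objects; the sketch only makes the wire format concrete.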

Once the OSC data was coming in, we tried different approaches. Some of the most common mappings were tested, including audification of the different frequency parameters (mapping each frequency band to an audible frequency band in a specific range), creating a sort of pre-established harmonic floor; mapping to filter frequencies, resembling voice formants; mapping the raw signals to the amplitudes of the channels; and so on.
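A mapping of the first kind can be sketched as follows; the normalized power range and the target pitch range are illustrative assumptions, not the values used in the piece. Interpolating exponentially rather than linearly makes equal steps in band power correspond to equal musical intervals:

```python
def band_to_freq(power, p_min=0.0, p_max=1.0, f_lo=110.0, f_hi=880.0):
    """Map a normalized band power onto an audible frequency range.
    Exponential interpolation keeps the mapping perceptually even."""
    t = (power - p_min) / (p_max - p_min)
    t = min(max(t, 0.0), 1.0)                 # clamp out-of-range readings
    return f_lo * (f_hi / f_lo) ** t
```

With one such mapping per band, the five band powers yield five concurrent pitches, which is one way to obtain the "harmonic floor" described above.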


3. PERFORMANCE PERSPECTIVES

Initially, the goal was to produce an event that would showcase the results in the form of a multichannel audio concert or sound installation, but those prospects have been postponed due to Covid-19. Six speakers distributed in an architectural space, each representing a brain area, reproduce sound translated from the data of different EEG measurements. This allows the audience not only to hear but to properly embody the reading of the brain activity. The multichannel mapping would reflect the position of brainwave activity in different areas of the brain. The first iteration of iBoS / Brain.wav was presented at the Polymath festival in a virtual format in February 2021, so the piece and the process are already available to perform in a virtual space.
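One simple way to project activity from a given electrode site onto a ring of six speakers is equal-power panning between the two speakers adjacent to the site's azimuth. This is a hypothetical sketch of that idea, not the spatialization algorithm used in the installation:

```python
import math

def ring_gains(azimuth_deg, n_speakers=6):
    """Equal-power pan a source at azimuth_deg across a speaker ring.
    Speaker k sits at k * 360/n degrees; only the two nearest get gain."""
    step = 360.0 / n_speakers
    pos = (azimuth_deg % 360.0) / step        # fractional speaker index
    k = int(pos)                              # lower neighbor
    frac = pos - k                            # 0 at speaker k, 1 at k+1
    gains = [0.0] * n_speakers
    theta = frac * math.pi / 2                # equal-power crossfade curve
    gains[k] = math.cos(theta)
    gains[(k + 1) % n_speakers] = math.sin(theta)
    return gains
```

Because the two gains are cosine and sine of the same angle, their squares always sum to one, so the perceived loudness stays constant as activity moves around the ring.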

4. FIGURES

[Figure: Brainwave Sonic Instrument prototype]

[Figure: Binaural setup]

5. MEDIA LINKS

• https://www.youtube.com/watch?v=T3_-IKE4jtw

• https://soundcloud.com/nicodaleman/brainwav

6. REFERENCES

[1] J. Eaton and E. R. Miranda, "On mapping EEG information into music," in Guide to Brain-Computer Music Interfacing, E. R. Miranda and J. Castet, Eds. London: Springer-Verlag, 2014, pp. 221–254.