The Space Between Us: Evaluating a multi-user affective brain-computer music interface
Music as a mechanism for neuro-feedback presents an interesting medium for artistic exploration, especially with regard to passive BCI control. Passive control in a brain-computer music interface (BCMI) provides a means of approximating mental states that can be used to select musical phrases, creating a system for real-time musical neuro-feedback. This article presents a BCMI for measuring the affective states of two users, a performer and an audience member, during a live performance of the piece titled The Space Between Us. The system adapts to the affective states of the users and selects sequences from a pre-composed musical score. By affect-matching music to mood and subsequently plotting affective musical trajectories across a two-dimensional model of affect, the system attempts to measure the affective interactions of the users, derived from arousal and valence recorded in EEG. An Affective Jukebox, the work of a previous study, validates the method used to read emotions across two dimensions in EEG in response to music. Results from a live performance of The Space Between Us indicate that, across both users, measures of arousal may be manipulated by emotionally charged music, whereas measures of valence are less responsive to musical stimuli. As such, an affective BCMI presents a novel platform for designing individualized musical performance and composition tools in which the selection of music can both reflect and induce affect in users. Furthermore, an affective channel of communication shows potential for enhancing collaboration in music-making in a wider context, for example in the roles of therapist and patient.
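The affect-matching step described above can be sketched in code. The following is a minimal, hypothetical illustration (not the authors' implementation): estimated valence and arousal values from EEG are classified into a quadrant of a two-dimensional affect model, and a matching pre-composed section of the score is selected. The quadrant labels and section names are assumptions for illustration only.

```python
# Hypothetical sketch of affect-matching in a passive BCMI: classify an
# estimated (valence, arousal) state into a quadrant of a two-dimensional
# affect model, then select a pre-composed score section that targets
# that affect. All names here are illustrative, not from the article.

def affect_quadrant(valence: float, arousal: float) -> str:
    """Classify a (valence, arousal) pair, each assumed in [-1, 1],
    into one of four quadrants of a circumplex-style affect model."""
    if arousal >= 0:
        return "excited" if valence >= 0 else "tense"
    return "calm" if valence >= 0 else "sad"

# Illustrative mapping from affect quadrant to a pre-composed section.
SCORE_SECTIONS = {
    "excited": "section_A",
    "tense": "section_B",
    "sad": "section_C",
    "calm": "section_D",
}

def select_phrase(valence: float, arousal: float) -> str:
    """Return the score section whose target affect matches the
    users' current estimated state (affect-matching)."""
    return SCORE_SECTIONS[affect_quadrant(valence, arousal)]
```

In a two-user setting such as the one described, the system could run this selection on each user's state (or on a combination of the two) at each decision point, producing the affective musical trajectory plotted across the model.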