Abstract

Brain-computer interfaces (BCIs) aim to establish a communication medium that is independent of muscle control. This project investigates how BCIs can be harnessed for musical applications. The impact of such systems is twofold: (i) they offer a novel mechanism of control for musicians during performance, and (ii) they benefit patients with motor disabilities. Several challenges are encountered when attempting to move these technologies from laboratories to real-world scenarios. Additionally, BCIs differ significantly from conventional computer interfaces and achieve only low communication rates. This project considers these challenges and uses a dry, wireless electroencephalogram (EEG) headset to detect neural activity. It adopts a paradigm called steady-state visually evoked potential (SSVEP) to provide the user with control, and aims to encapsulate all brain-computer music interface (BCMI) operations in a stand-alone application, which would improve the portability of BCMIs. This project addresses various engineering problems that arise while developing a stand-alone BCMI. Efficient presentation of the visual stimulus for SSVEP requires hardware-accelerated rendering. EEG data is received from the headset over Bluetooth, so a dedicated thread is designed to receive the signal stream. As this thesis does not use medical-grade equipment to detect EEG, signal processing techniques need to be examined to improve the signal-to-noise ratio (SNR) of the brain waves. This project adopts canonical correlation analysis (CCA), a multivariate statistical technique, and explores filtering algorithms to improve the communication rates of BCMIs. Furthermore, it delves into optimising biomedical engineering parameters, such as the placement of the EEG headset and the size of the visual stimulus. After implementing these optimisations, the mean accuracies of the BCMI are 97.92±2.22% and 88.02±9.30% for time windows of 4 s and 2 s respectively. The obtained information transfer rate (ITR) is 36.56±9.17 bits min⁻¹, which surpasses the communication rates of earlier BCMIs. The thesis concludes by building a system with a novel control flow that allows the user to play a musical instrument by gazing at it.
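
Since the abstract reports CCA-based SSVEP detection and an information transfer rate (ITR), the following minimal Python sketch illustrates the general technique rather than the thesis's actual implementation: multichannel EEG is correlated with sinusoidal reference templates via CCA to pick the attended stimulus frequency, and the standard Wolpaw ITR is computed from accuracy and window length. The sampling rate, stimulus frequencies, target count and all function names are hypothetical, and scikit-learn's CCA is used as a stand-in.

    # Illustrative sketch only; parameters and names are assumptions, not the thesis code.
    import numpy as np
    from sklearn.cross_decomposition import CCA

    FS = 250                              # assumed sampling rate (Hz)
    STIM_FREQS = [7.0, 9.0, 11.0, 13.0]   # hypothetical SSVEP stimulus frequencies
    N_HARMONICS = 2

    def reference_signals(freq, n_samples, fs=FS, n_harmonics=N_HARMONICS):
        """Sine/cosine reference templates for one stimulus frequency."""
        t = np.arange(n_samples) / fs
        refs = []
        for h in range(1, n_harmonics + 1):
            refs.append(np.sin(2 * np.pi * h * freq * t))
            refs.append(np.cos(2 * np.pi * h * freq * t))
        return np.column_stack(refs)      # shape: (n_samples, 2 * n_harmonics)

    def classify_ssvep(eeg):
        """Return the stimulus frequency whose CCA correlation with the EEG is largest.

        eeg: array of shape (n_samples, n_channels).
        """
        scores = []
        for f in STIM_FREQS:
            Y = reference_signals(f, eeg.shape[0])
            cca = CCA(n_components=1)
            x_c, y_c = cca.fit_transform(eeg, Y)
            scores.append(np.corrcoef(x_c[:, 0], y_c[:, 0])[0, 1])
        return STIM_FREQS[int(np.argmax(scores))], scores

    def itr_bits_per_min(n_targets, accuracy, window_s):
        """Standard Wolpaw ITR in bits per minute."""
        p, n = accuracy, n_targets
        if p >= 1.0:
            bits = np.log2(n)
        else:
            bits = (np.log2(n) + p * np.log2(p)
                    + (1 - p) * np.log2((1 - p) / (n - 1)))
        return bits * (60.0 / window_s)

    # Example: 4 hypothetical targets, 88% accuracy, 2 s selection window
    print(itr_bits_per_min(4, 0.88, 2.0))

The number of targets and the accuracy in the example are placeholders chosen to show the calculation; they are not claimed to reproduce the figures reported in the abstract.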

Keywords

Brain-computer Interface (BCI), Brain-computer Music Interface (BCMI), Dry Electroencephalogram (EEG), Computer Music, Musical Composition, Musical Performance, Optimising Parameters, Stand-alone

Document Type

Thesis

Publication Date

2019

DOI

10.24382/397
