
dc.contributor.supervisor: Miranda, Eduardo
dc.contributor.author: Venkatesh, Satvik
dc.contributor.other: Faculty of Arts, Humanities and Business
dc.date.accessioned: 2019-08-05T11:41:27Z
dc.date.issued: 2019
dc.identifier: 10577651
dc.identifier.uri: http://hdl.handle.net/10026.1/14747
dc.description.abstract:

Brain-computer interfaces (BCIs) aim to establish a communication medium that is independent of muscle control. This project investigates how BCIs can be harnessed for musical applications. The impact of such systems is twofold: (i) they offer a novel mechanism of control for musicians during performance, and (ii) they benefit patients suffering from motor disabilities. Several challenges are encountered when attempting to move these technologies from laboratories to real-world scenarios. Additionally, BCIs differ significantly from conventional computer interfaces and achieve only low communication rates. This project considers these challenges and uses a dry, wireless electroencephalogram (EEG) headset to detect neural activity. It adopts a paradigm called steady-state visually evoked potential (SSVEP) to provide the user with control, and aims to encapsulate all brain-computer music interface (BCMI) operations into a stand-alone application, which improves the portability of BCMIs.
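The SSVEP paradigm relies on on-screen targets that flicker at distinct, fixed frequencies; gazing at a target evokes EEG activity at that target's frequency. The following is a minimal sketch of frame-based flicker scheduling on a fixed-refresh display; the refresh rate, instrument labels, and flicker frequencies are illustrative assumptions, not values taken from the thesis.

```python
# Sketch: square-wave flicker scheduling for SSVEP targets.
# Refresh rate, labels, and frequencies below are assumptions for illustration.
REFRESH_HZ = 60                                            # assumed display refresh rate
TARGET_FREQS = {"drum": 7.5, "piano": 10.0, "synth": 12.0, "stop": 15.0}

def target_is_lit(freq_hz, frame_index, refresh_hz=REFRESH_HZ):
    """Return True if the target should be drawn on this frame (50% duty cycle)."""
    frames_per_period = refresh_hz / freq_hz
    phase = (frame_index % frames_per_period) / frames_per_period
    return phase < 0.5

# In the render loop, each target is drawn only on frames where target_is_lit(...)
# is True, so fixating on it evokes an SSVEP at that target's flicker frequency.
```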

This project addresses various engineering problems that arise while developing a stand-alone BCMI. Presenting the visual stimulus for SSVEP efficiently requires hardware-accelerated rendering. EEG data is received from the headset over Bluetooth, so a dedicated thread is designed to receive the signals. As this thesis does not use medical-grade equipment to detect EEG, signal processing techniques need to be examined to improve the signal-to-noise ratio (SNR) of the brain waves. The project adopts canonical correlation analysis (CCA), a multivariate statistical technique, and explores filtering algorithms to improve the communication rates of BCMIs.
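As a rough illustration of the detection step, the sketch below scores a window of multi-channel EEG against sinusoidal reference signals for each candidate flicker frequency using CCA, and selects the best-matching frequency. The sampling rate, candidate frequencies, and harmonic count are assumptions for illustration, not the thesis's actual configuration.

```python
# Sketch: SSVEP frequency detection with canonical correlation analysis (CCA).
# Sampling rate, candidate frequencies, and harmonic count are assumed values.
import numpy as np
from sklearn.cross_decomposition import CCA

FS = 250                            # assumed EEG sampling rate (Hz)
FREQS = [7.5, 10.0, 12.0, 15.0]     # assumed candidate flicker frequencies (Hz)
N_HARMONICS = 2                     # harmonics included in the reference set

def reference_signals(freq, n_samples, fs=FS, n_harmonics=N_HARMONICS):
    """Build sine/cosine references for one flicker frequency and its harmonics."""
    t = np.arange(n_samples) / fs
    refs = []
    for h in range(1, n_harmonics + 1):
        refs.append(np.sin(2 * np.pi * h * freq * t))
        refs.append(np.cos(2 * np.pi * h * freq * t))
    return np.column_stack(refs)            # shape: (n_samples, 2 * n_harmonics)

def classify_epoch(eeg):
    """Return the flicker frequency whose references correlate best with the EEG.

    eeg: array of shape (n_samples, n_channels), e.g. a 2 s or 4 s window.
    """
    scores = []
    for freq in FREQS:
        refs = reference_signals(freq, eeg.shape[0])
        x_c, y_c = CCA(n_components=1).fit_transform(eeg, refs)
        # Correlation of the first pair of canonical variates.
        scores.append(np.corrcoef(x_c[:, 0], y_c[:, 0])[0, 1])
    return FREQS[int(np.argmax(scores))]
```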

Furthermore, this project delves into optimising biomedical engineering parameters, such as the placement of the EEG headset and the size of the visual stimulus. After implementing the optimisations, the mean accuracies of the BCMI are 97.92 ± 2.22% for a 4 s time window and 88.02 ± 9.30% for a 2 s window. The obtained information transfer rate (ITR) is 36.56 ± 9.17 bits min⁻¹, which surpasses the communication rates of earlier BCMIs. The thesis concludes by building a system with a novel control flow that allows the user to play a musical instrument by gazing at it.
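For reference, BCI communication rates are commonly reported with the Wolpaw ITR formula, sketched below. The abstract does not state how many selectable targets the BCMI offers, so the target count in the example is an assumption and the printed figure is illustrative rather than a reproduction of the reported 36.56 bits min⁻¹.

```python
# Sketch: the standard Wolpaw ITR calculation for a BCI selection task.
# The number of targets below is an assumption for illustration only.
import math

def itr_bits_per_min(n_targets, accuracy, window_s):
    """Bits per selection (Wolpaw formula) scaled to selections per minute."""
    p, n = accuracy, n_targets
    bits = math.log2(n)
    if 0 < p < 1:
        bits += p * math.log2(p) + (1 - p) * math.log2((1 - p) / (n - 1))
    return bits * (60.0 / window_s)

# Example (assumed 4 targets): 88% accuracy over a 2 s window.
print(round(itr_bits_per_min(4, 0.88, 2.0), 1))   # roughly 38 bits per minute
```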

dc.description.sponsorship: The School of Humanities and Performing Arts, University of Plymouth
dc.language.iso: en
dc.publisher: University of Plymouth
dc.subject: Brain-computer Interface (BCI)
dc.subject: Brain-computer Music Interface (BCMI)
dc.subject: Dry Electroencephalogram (EEG)
dc.subject: Computer Music
dc.subject: Musical Composition
dc.subject: Musical Performance
dc.subject: Optimising Parameters
dc.subject: Stand-alone
dc.subject.classification: ResM
dc.title: Investigation into Stand-alone Brain-computer Interfaces for Musical Applications
dc.type: Thesis
plymouth.version: publishable
dc.identifier.doi: http://dx.doi.org/10.24382/397
dc.rights.embargodate: 2020-08-05T11:41:27Z
dc.rights.embargoperiod: 12 months
dc.type.qualification: Masters
rioxxterms.version: NA
plymouth.orcid_id: https://orcid.org/0000-0001-5244-3020



