
dc.contributor.author: Daly, I
dc.contributor.author: Williams, D
dc.contributor.author: Malik, A
dc.contributor.author: Weaver, J
dc.contributor.author: Kirke, Alexis
dc.contributor.author: Hwang, F
dc.contributor.author: Miranda, Eduardo
dc.contributor.author: Nasuto, SJ
dc.date.accessioned: 2018-09-10T09:37:09Z
dc.date.available: 2018-09-10T09:37:09Z
dc.date.issued: 2020-01-01
dc.identifier.issn: 1949-3045
dc.identifier.uri: http://hdl.handle.net/10026.1/12294
dc.description.abstract:

Brain-computer music interfaces (BCMIs) may be used to modulate affective states, with applications in music therapy, composition, and entertainment. However, for such systems to work they must reliably detect their user's current affective state. We present a method for personalised affective state detection for use in BCMIs. We compare it to a population-based detection method trained on 17 users and demonstrate that personalised affective state detection is significantly (p < 0.01) more accurate, with average improvements in accuracy of 10.2 percent for valence and 9.3 percent for arousal. We also compare a hybrid BCMI (one that combines physiological signals with neurological signals) to a conventional BCMI design (one based on EEG features alone) and demonstrate that the hybrid design yields a significant (p < 0.01) 6.2 percent improvement for arousal classification and a significant (p < 0.01) 5.9 percent improvement for valence classification.
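
The following is a minimal illustrative sketch (not the authors' code) of the two design axes the abstract compares: personalised versus population-based affective state classifiers, and hybrid EEG+GSR versus EEG-only feature sets. The synthetic data, feature counts, SVM classifier, and leave-one-user-out protocol are all assumptions made for illustration; the paper's actual features, preprocessing, and classifier are not reproduced here.

# Illustrative sketch only: placeholder data, assumed SVM classifier and
# assumed leave-one-user-out protocol for the population-based model.
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_users, trials_per_user = 17, 40      # 17 users, as in the population model described
n_eeg_feat, n_gsr_feat = 10, 3         # assumed feature counts

def make_user_data():
    """Placeholder EEG/GSR features and binary valence labels for one user."""
    eeg = rng.normal(size=(trials_per_user, n_eeg_feat))
    gsr = rng.normal(size=(trials_per_user, n_gsr_feat))
    y = rng.integers(0, 2, size=trials_per_user)       # e.g. low/high valence
    return eeg, gsr, y

users = [make_user_data() for _ in range(n_users)]

def personalised_accuracy(hybrid: bool) -> float:
    """Train and cross-validate a separate classifier for each user."""
    accs = []
    for eeg, gsr, y in users:
        X = np.hstack([eeg, gsr]) if hybrid else eeg
        accs.append(cross_val_score(SVC(), X, y, cv=5).mean())
    return float(np.mean(accs))

def population_accuracy(hybrid: bool) -> float:
    """Leave-one-user-out: train on the other 16 users, test on the held-out user."""
    accs = []
    for i, (eeg_te, gsr_te, y_te) in enumerate(users):
        train = [u for j, u in enumerate(users) if j != i]
        X_tr = np.vstack([np.hstack([e, g]) if hybrid else e for e, g, _ in train])
        y_tr = np.concatenate([y for _, _, y in train])
        X_te = np.hstack([eeg_te, gsr_te]) if hybrid else eeg_te
        accs.append(SVC().fit(X_tr, y_tr).score(X_te, y_te))
    return float(np.mean(accs))

for hybrid in (False, True):
    label = "EEG+GSR" if hybrid else "EEG-only"
    print(f"{label}: personalised={personalised_accuracy(hybrid):.2f}, "
          f"population={population_accuracy(hybrid):.2f}")

With real (non-random) features, the two comparisons reported in the abstract correspond to personalised versus population accuracy and to the hybrid versus EEG-only feature sets.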

dc.format.extent: 1-1
dc.language.iso: en
dc.publisher: Institute of Electrical and Electronics Engineers (IEEE)
dc.subject: EEG
dc.subject: GSR
dc.subject: affective state detection
dc.subject: BCMI
dc.subject: personalised affective state detection
dc.title: Personalised, Multi-modal, Affective State Detection for Hybrid Brain-Computer Music Interfacing
dc.type: journal-article
dc.type: Journal Article
plymouth.author-url: https://www.webofscience.com/api/gateway?GWVersion=2&SrcApp=PARTNER_APP&SrcAuth=LinksAMR&KeyUT=WOS:000521989700008&DestLinkType=FullRecord&DestApp=ALL_WOS&UsrCustomerID=11bb513d99f797142bcfeffcc58ea008
plymouth.issue: 1
plymouth.volume: 11
plymouth.publication-status: Published
plymouth.journal: IEEE Transactions on Affective Computing
dc.identifier.doi: 10.1109/TAFFC.2018.2801811
plymouth.organisational-group: /Plymouth
plymouth.organisational-group: /Plymouth/Faculty of Arts, Humanities and Business
plymouth.organisational-group: /Plymouth/Faculty of Arts, Humanities and Business/School of Society and Culture
plymouth.organisational-group: /Plymouth/REF 2021 Researchers by UoA
plymouth.organisational-group: /Plymouth/REF 2021 Researchers by UoA/UoA33 Music, Drama, Dance, Performing Arts, Film and Screen Studies
plymouth.organisational-group: /Plymouth/Users by role
plymouth.organisational-group: /Plymouth/Users by role/Academics
dcterms.dateAccepted: 2018-01-05
dc.identifier.eissn: 1949-3045
dc.rights.embargoperiod: Not known
rioxxterms.funder: Engineering and Physical Sciences Research Council
rioxxterms.identifier.project: Brain-Computer Interface for Monitoring and Inducing Affective States
rioxxterms.versionofrecord: 10.1109/TAFFC.2018.2801811
rioxxterms.licenseref.uri: http://www.rioxx.net/licenses/all-rights-reserved
rioxxterms.type: Journal Article/Review
plymouth.funder: Brain-Computer Interface for Monitoring and Inducing Affective States::Engineering and Physical Sciences Research Council

