Personalised, Multi-modal, Affective State Detection for Hybrid Brain-Computer Music Interfacing
dc.contributor.author | Daly, I | |
dc.contributor.author | Williams, D | |
dc.contributor.author | Malik, A | |
dc.contributor.author | Weaver, J | |
dc.contributor.author | Kirke, Alexis | |
dc.contributor.author | Hwang, F | |
dc.contributor.author | Miranda, Eduardo | |
dc.contributor.author | Nasuto, SJ | |
dc.date.accessioned | 2018-09-10T09:37:09Z | |
dc.date.available | 2018-09-10T09:37:09Z | |
dc.date.issued | 2020-01-01 | |
dc.identifier.issn | 1949-3045 | |
dc.identifier.uri | http://hdl.handle.net/10026.1/12294 | |
dc.description.abstract |
Brain-computer music interfaces (BCMIs) may be used to modulate affective states, with applications in music therapy, composition, and entertainment. However, for such systems to work they need to be able to reliably detect their user's current affective state. We present a method for personalised affective state detection for use in BCMI. We compare it to a population-based detection method trained on 17 users and demonstrate that personalised affective state detection is significantly (p < 0.01) more accurate, with average improvements in accuracy of 10.2 percent for valence and 9.3 percent for arousal. We also compare a hybrid BCMI (a BCMI that combines physiological signals with neurological signals) to a conventional BCMI design (one based upon the use of only EEG features) and demonstrate that the hybrid design results in a significant (p < 0.01) 6.2 percent improvement in performance for arousal classification and a significant (p < 0.01) 5.9 percent improvement for valence classification. | |
dc.format.extent | 1-1 | |
dc.language.iso | en | |
dc.publisher | Institute of Electrical and Electronics Engineers (IEEE) | |
dc.subject | EEG | |
dc.subject | GSR | |
dc.subject | affective state detection | |
dc.subject | BCMI | |
dc.subject | personalised affective state detection | |
dc.title | Personalised, Multi-modal, Affective State Detection for Hybrid Brain-Computer Music Interfacing | |
dc.type | journal-article | |
dc.type | Journal Article | |
plymouth.author-url | https://www.webofscience.com/api/gateway?GWVersion=2&SrcApp=PARTNER_APP&SrcAuth=LinksAMR&KeyUT=WOS:000521989700008&DestLinkType=FullRecord&DestApp=ALL_WOS&UsrCustomerID=11bb513d99f797142bcfeffcc58ea008 | |
plymouth.issue | 1 | |
plymouth.volume | 11 | |
plymouth.publication-status | Published | |
plymouth.journal | IEEE Transactions on Affective Computing | |
dc.identifier.doi | 10.1109/TAFFC.2018.2801811 | |
plymouth.organisational-group | /Plymouth | |
plymouth.organisational-group | /Plymouth/Faculty of Arts, Humanities and Business | |
plymouth.organisational-group | /Plymouth/Faculty of Arts, Humanities and Business/School of Society and Culture | |
plymouth.organisational-group | /Plymouth/REF 2021 Researchers by UoA | |
plymouth.organisational-group | /Plymouth/REF 2021 Researchers by UoA/UoA33 Music, Drama, Dance, Performing Arts, Film and Screen Studies | |
plymouth.organisational-group | /Plymouth/Users by role | |
plymouth.organisational-group | /Plymouth/Users by role/Academics | |
dcterms.dateAccepted | 2018-01-05 | |
dc.identifier.eissn | 1949-3045 | |
dc.rights.embargoperiod | Not known | |
rioxxterms.funder | Engineering and Physical Sciences Research Council | |
rioxxterms.identifier.project | Brain-Computer Interface for Monitoring and Inducing Affective States | |
rioxxterms.versionofrecord | 10.1109/TAFFC.2018.2801811 | |
rioxxterms.licenseref.uri | http://www.rioxx.net/licenses/all-rights-reserved | |
rioxxterms.type | Journal Article/Review | |
plymouth.funder | Brain-Computer Interface for Monitoring and Inducing Affective States::Engineering and Physical Sciences Research Council |