Affective brain–computer music interfacing
dc.contributor.author | Daly, I | |
dc.contributor.author | Williams, D | |
dc.contributor.author | Kirke, Alexis | |
dc.contributor.author | Weaver, J | |
dc.contributor.author | Malik, A | |
dc.contributor.author | Hwang, F | |
dc.contributor.author | Miranda, Eduardo | |
dc.contributor.author | Nasuto, SJ | |
dc.date.accessioned | 2016-10-13T16:01:01Z | |
dc.date.available | 2016-10-13T16:01:01Z | |
dc.date.issued | 2016-08-01 | |
dc.identifier.issn | 1741-2560 | |
dc.identifier.issn | 1741-2552 | |
dc.identifier.other | ARTN 046022 | |
dc.identifier.uri | http://hdl.handle.net/10026.1/6517 | |
dc.description.abstract |
Objective. We aim to develop and evaluate an affective brain–computer music interface (aBCMI) for modulating the affective states of its users. Approach. An aBCMI is constructed to detect a userʼs current affective state and attempt to modulate it in order to achieve specific objectives (for example, making the user calmer or happier) by playing music which is generated according to a specific affective target by an algorithmic music composition system and a case-based reasoning system. The system is trained and tested in a longitudinal study on a population of eight healthy participants, with each participant returning for multiple sessions. Main results. The final online aBCMI is able to detect its usersʼ current affective states with classification accuracies of up to 65% (three classes, p < 0.01) and modulate its usersʼ affective states significantly above chance level (p < 0.05). Significance. Our system represents one of the first demonstrations of an online aBCMI that is able to accurately detect and respond to its usersʼ affective states. Possible applications include use in music therapy and entertainment. | |
dc.format.extent | 046022-046022 | |
dc.format.medium | Print-Electronic | |
dc.language | eng | |
dc.language.iso | en | |
dc.publisher | IOP Publishing | |
dc.subject | brain-computer music interfacing | |
dc.subject | EEG | |
dc.subject | music therapy | |
dc.subject | affective computing | |
dc.subject | passive brain-computer interfacing | |
dc.title | Affective brain–computer music interfacing | |
dc.type | journal-article | |
dc.type | Journal Article | |
dc.type | Research Support, Non-U.S. Gov't | |
plymouth.author-url | https://www.webofscience.com/api/gateway?GWVersion=2&SrcApp=PARTNER_APP&SrcAuth=LinksAMR&KeyUT=WOS:000380668900025&DestLinkType=FullRecord&DestApp=ALL_WOS&UsrCustomerID=11bb513d99f797142bcfeffcc58ea008 | |
plymouth.issue | 4 | |
plymouth.volume | 13 | |
plymouth.publication-status | Published | |
plymouth.journal | Journal of Neural Engineering | |
dc.identifier.doi | 10.1088/1741-2560/13/4/046022 | |
plymouth.organisational-group | /Plymouth | |
plymouth.organisational-group | /Plymouth/Faculty of Arts, Humanities and Business | |
plymouth.organisational-group | /Plymouth/Faculty of Arts, Humanities and Business/School of Society and Culture | |
plymouth.organisational-group | /Plymouth/REF 2021 Researchers by UoA | |
plymouth.organisational-group | /Plymouth/REF 2021 Researchers by UoA/UoA33 Music, Drama, Dance, Performing Arts, Film and Screen Studies | |
plymouth.organisational-group | /Plymouth/Users by role | |
plymouth.organisational-group | /Plymouth/Users by role/Academics | |
dc.publisher.place | England | |
dcterms.dateAccepted | 2016-06-21 | |
dc.rights.embargodate | 2017-07-11 | |
dc.identifier.eissn | 1741-2552 | |
dc.rights.embargoperiod | Not known | |
rioxxterms.funder | Engineering and Physical Sciences Research Council | |
rioxxterms.identifier.project | Brain-Computer Interface for Monitoring and Inducing Affective States | |
rioxxterms.versionofrecord | 10.1088/1741-2560/13/4/046022 | |
rioxxterms.licenseref.uri | http://www.rioxx.net/licenses/all-rights-reserved | |
rioxxterms.licenseref.startdate | 2016-08-01 | |
rioxxterms.type | Journal Article/Review | |
plymouth.funder | Brain-Computer Interface for Monitoring and Inducing Affective States::Engineering and Physical Sciences Research Council |