
dc.contributor.author: Daly, I
dc.contributor.author: Williams, D
dc.contributor.author: Hallowell, J
dc.contributor.author: Hwang, F
dc.contributor.author: Kirke, Alexis
dc.contributor.author: Malik, A
dc.contributor.author: Weaver, J
dc.contributor.author: Miranda, Eduardo
dc.contributor.author: Nasuto, SJ
dc.date.accessioned: 2016-10-13T16:06:12Z
dc.date.available: 2016-10-13T16:06:12Z
dc.date.issued: 2015-12
dc.identifier.issn: 0278-2626
dc.identifier.issn: 1090-2147
dc.identifier.uri: http://hdl.handle.net/10026.1/6519
dc.description.abstract:

It is widely acknowledged that music can communicate and induce a wide range of emotions in the listener. However, music is a highly complex audio signal composed of a wide range of complex time- and frequency-varying components. Additionally, music-induced emotions are known to differ greatly between listeners. Therefore, it is not immediately clear what emotions a piece of music will induce in a given individual. We attempt to predict the music-induced emotional response in a listener by measuring the activity in the listener's electroencephalogram (EEG). We combine these measures with acoustic descriptors of the music, an approach that allows us to consider music as a complex set of time-varying acoustic features, independently of any specific music theory. We find regression models that allow us to predict the music-induced emotions of our participants with a correlation between the actual and predicted responses of up to r = 0.234, p < 0.001. This regression fit suggests that over 20% of the variance of the participants' music-induced emotions can be predicted from their neural activity and the properties of the music. Given the large amount of noise, non-stationarity, and non-linearity in both EEG and music, this is an encouraging result. Additionally, the combination of measures of brain activity and acoustic features describing the music played to our participants allows us to predict music-induced emotions with significantly higher accuracy than either feature type alone (p < 0.01).
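As a rough illustration of the kind of pipeline the abstract describes (a regression model fit on combined EEG and acoustic features, scored by the Pearson correlation between actual and predicted emotional responses), the sketch below uses synthetic data. The feature counts, the use of scikit-learn's LinearRegression, and the noise model are assumptions for demonstration only, not the authors' actual features or method.

```python
# Minimal sketch, NOT the authors' pipeline: synthetic data, assumed
# feature dimensions, and a plain linear regression stand in for the
# paper's EEG/acoustic features and regression models.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split
from scipy.stats import pearsonr

rng = np.random.default_rng(0)

# Hypothetical per-trial features: EEG measures (e.g. band powers) and
# acoustic descriptors of the music (e.g. tempo, spectral measures).
n_trials = 200
eeg_features = rng.normal(size=(n_trials, 16))
acoustic_features = rng.normal(size=(n_trials, 8))

# Synthetic emotion ratings that depend only weakly on the features,
# mimicking the noisy relationship the abstract reports.
X = np.hstack([eeg_features, acoustic_features])
true_weights = rng.normal(size=X.shape[1])
ratings = 0.1 * (X @ true_weights) + rng.normal(size=n_trials)

X_train, X_test, y_train, y_test = train_test_split(
    X, ratings, test_size=0.3, random_state=0)

model = LinearRegression().fit(X_train, y_train)
predicted = model.predict(X_test)

# Evaluate as in the abstract: Pearson correlation between actual and
# predicted emotional responses on held-out trials.
r, p = pearsonr(y_test, predicted)
print(f"r = {r:.3f}, p = {p:.3g}")
```

With real data, the synthetic arrays would be replaced by per-trial EEG and audio descriptors; a weak but statistically significant r on held-out trials is exactly the pattern the abstract reports.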

dc.format.extent: 1-11
dc.format.medium: Print-Electronic
dc.language: en
dc.language.iso: eng
dc.publisher: Elsevier BV
dc.subject: Music
dc.subject: Affective state prediction
dc.subject: EEG
dc.subject: Acoustic features
dc.subject: Machine learning
dc.title: Music-induced emotions can be predicted from a combination of brain activity and acoustic features
dc.type: journal-article
dc.type: Journal Article
dc.type: Research Support, Non-U.S. Gov't
plymouth.author-url: https://www.webofscience.com/api/gateway?GWVersion=2&SrcApp=PARTNER_APP&SrcAuth=LinksAMR&KeyUT=WOS:000366784300001&DestLinkType=FullRecord&DestApp=ALL_WOS&UsrCustomerID=11bb513d99f797142bcfeffcc58ea008
plymouth.volume: 101
plymouth.publication-status: Published
plymouth.journal: Brain and Cognition
dc.identifier.doi: 10.1016/j.bandc.2015.08.003
plymouth.organisational-group: /Plymouth
plymouth.organisational-group: /Plymouth/Faculty of Arts, Humanities and Business
plymouth.organisational-group: /Plymouth/Faculty of Arts, Humanities and Business/School of Society and Culture
plymouth.organisational-group: /Plymouth/REF 2021 Researchers by UoA
plymouth.organisational-group: /Plymouth/REF 2021 Researchers by UoA/UoA33 Music, Drama, Dance, Performing Arts, Film and Screen Studies
plymouth.organisational-group: /Plymouth/Users by role
plymouth.organisational-group: /Plymouth/Users by role/Academics
dc.publisher.place: United States
dcterms.dateAccepted: 2015-08-04
dc.identifier.eissn: 1090-2147
dc.rights.embargoperiod: Not known
rioxxterms.funder: Engineering and Physical Sciences Research Council
rioxxterms.identifier.project: Brain-Computer Interface for Monitoring and Inducing Affective States
rioxxterms.versionofrecord: 10.1016/j.bandc.2015.08.003
rioxxterms.licenseref.uri: http://www.rioxx.net/licenses/all-rights-reserved
rioxxterms.licenseref.startdate: 2015-12
rioxxterms.type: Journal Article/Review
plymouth.funder: Brain-Computer Interface for Monitoring and Inducing Affective States::Engineering and Physical Sciences Research Council



