The McGill Face Database: Validation and Insights Into the Recognition of Facial Expressions of Complex Mental States
dc.contributor.author | Kang, Junghee | |
dc.contributor.author | Jennings, BJ | |
dc.contributor.author | Sandra, DA | |
dc.contributor.author | Pollock, J | |
dc.contributor.author | Gold, I | |
dc.date.accessioned | 2020-09-11T10:41:37Z | |
dc.date.available | 2020-09-11T10:41:37Z | |
dc.date.issued | 2020-03 | |
dc.identifier.issn | 0301-0066 | |
dc.identifier.issn | 1468-4233 | |
dc.identifier.other | ARTN 0301006620901671 | |
dc.identifier.uri | http://hdl.handle.net/10026.1/16239 | |
dc.description.abstract | <jats:p> Current databases of facial expressions represent only a small subset of expressions, usually the basic emotions (fear, disgust, surprise, happiness, sadness, and anger). To overcome these limitations, we introduce a database of pictures of facial expressions reflecting the richness of mental states. A total of 93 expressions of mental states were interpreted by two professional actors, and high-quality pictures were taken under controlled conditions in front and side view. The database was validated in two experiments. First, a four-alternative forced-choice paradigm was employed to test the ability to select a term associated with each expression. Second, the task was to locate each face within a 2-D space of valence and arousal. Results from both experiments demonstrate that subjects can reliably recognize a great diversity of emotional states from facial expressions. While subjects’ performance was better for front view images, the advantage over the side view was not dramatic. This is the first demonstration of the high degree of accuracy human viewers exhibit when identifying complex mental states from only partially visible facial features. The McGill Face Database provides a wide range of facial expressions that can be linked to mental state terms and can be accurately characterized in terms of arousal and valence. </jats:p> | |
dc.format.extent | 310-329 | |
dc.format.medium | Print-Electronic | |
dc.language | en | |
dc.language.iso | en | |
dc.publisher | SAGE Publications | |
dc.subject | faces | |
dc.subject | face database | |
dc.subject | emotions | |
dc.subject | mental states | |
dc.subject | theory of mind | |
dc.title | The McGill Face Database: Validation and Insights Into the Recognition of Facial Expressions of Complex Mental States | |
dc.type | journal-article | |
dc.type | Journal Article | |
plymouth.author-url | https://www.webofscience.com/api/gateway?GWVersion=2&SrcApp=PARTNER_APP&SrcAuth=LinksAMR&KeyUT=WOS:000514036400001&DestLinkType=FullRecord&DestApp=ALL_WOS&UsrCustomerID=11bb513d99f797142bcfeffcc58ea008 | |
plymouth.issue | 3 | |
plymouth.volume | 49 | |
plymouth.publication-status | Published | |
plymouth.journal | Perception | |
dc.identifier.doi | 10.1177/0301006620901671 | |
plymouth.organisational-group | /Plymouth | |
plymouth.organisational-group | /Plymouth/Faculty of Health | |
plymouth.organisational-group | /Plymouth/Faculty of Health/School of Health Professions | |
plymouth.organisational-group | /Plymouth/REF 2021 Researchers by UoA | |
plymouth.organisational-group | /Plymouth/REF 2021 Researchers by UoA/UoA03 Allied Health Professions, Dentistry, Nursing and Pharmacy | |
plymouth.organisational-group | /Plymouth/Research Groups | |
plymouth.organisational-group | /Plymouth/Research Groups/Institute of Health and Community | |
plymouth.organisational-group | /Plymouth/Users by role | |
plymouth.organisational-group | /Plymouth/Users by role/Academics | |
dc.publisher.place | United States | |
dcterms.dateAccepted | 2019-12-17 | |
dc.rights.embargodate | 2020-09-15 | |
dc.identifier.eissn | 1468-4233 | |
dc.rights.embargoperiod | Not known | |
rioxxterms.versionofrecord | 10.1177/0301006620901671 | |
rioxxterms.licenseref.uri | http://www.rioxx.net/licenses/all-rights-reserved | |
rioxxterms.licenseref.startdate | 2020-03 | |
rioxxterms.type | Journal Article/Review |