Abstract

Current databases of facial expressions represent only a small subset of expressions, usually the basic emotions (fear, disgust, surprise, happiness, sadness, and anger). To overcome this limitation, we introduce a database of pictures of facial expressions reflecting the richness of mental states. A total of 93 expressions of mental states were interpreted by two professional actors, and high-quality pictures were taken under controlled conditions in front and side views. The database was validated in two experiments. First, a four-alternative forced-choice paradigm was employed to test subjects’ ability to select the term associated with each expression. Second, the task was to locate each face within a 2-D space of valence and arousal. Results from both experiments demonstrate that subjects can reliably recognize a great diversity of emotional states from facial expressions. While subjects’ performance was better for front-view images, the advantage over the side view was not dramatic. This is the first demonstration of the high degree of accuracy human viewers exhibit when identifying complex mental states from only partially visible facial features. The McGill Face Database provides a wide range of facial expressions that can be linked to mental state terms and can be accurately characterized in terms of arousal and valence.

DOI

10.1177/0301006620901671

Publication Date

2020-03-01

Publication Title

Perception

Volume

49

Issue

3

ISSN

0301-0066

Embargo Period

2020-09-15

Organisational Unit

School of Health Professions

First Page

310

Last Page

329
