Show simple item record

dc.contributor.author: Ritchie, K
dc.contributor.author: Kramer, R
dc.contributor.author: Mileva, Mila
dc.contributor.author: Sandford, A
dc.contributor.author: Burton, AM
dc.date.accessioned: 2021-02-16T12:02:04Z
dc.date.issued: 2021-06
dc.identifier.issn: 0010-0277
dc.identifier.issn: 1873-7838
dc.identifier.other: 104632
dc.identifier.uri: http://hdl.handle.net/10026.1/16886
dc.description.abstract:

Previous research has shown that exposure to within-person variability facilitates face learning. A different body of work has examined potential benefits of providing multiple images in face matching tasks. Viewers are asked to judge whether a target face matches a single face image (as when checking photo-ID) or multiple face images of the same person. The evidence here is less clear, with some studies finding a small multiple-image benefit, and others finding no advantage. In four experiments, we address this discrepancy in the benefits of multiple images from learning and matching studies. We show that multiple-image arrays only facilitate face matching when arrays precede targets. Unlike simultaneous face matching tasks, sequential matching and learning tasks involve memory and require abstraction of a stable representation of the face from the array, for subsequent comparison with a target. Our results show that benefits from multiple-image arrays occur only when this abstraction is required, and not when array and target images are available at once. These studies reconcile apparent differences between face learning and face matching and provide a theoretical framework for the study of within-person variability in face perception.

dc.format.extent: 104632-104632
dc.format.medium: Print-Electronic
dc.language: en
dc.language.iso: en
dc.publisher: Elsevier
dc.subject: Face matching
dc.subject: Face learning
dc.subject: Variability
dc.title: Multiple-image arrays in face matching tasks with and without memory
dc.type: journal-article
dc.type: Journal Article
plymouth.author-url: https://www.webofscience.com/api/gateway?GWVersion=2&SrcApp=PARTNER_APP&SrcAuth=LinksAMR&KeyUT=WOS:000641973200003&DestLinkType=FullRecord&DestApp=ALL_WOS&UsrCustomerID=11bb513d99f797142bcfeffcc58ea008
plymouth.volume: 211
plymouth.publication-status: Published
plymouth.journal: Cognition
dc.identifier.doi: 10.1016/j.cognition.2021.104632
plymouth.organisational-group: /Plymouth
plymouth.organisational-group: /Plymouth/Faculty of Health
plymouth.organisational-group: /Plymouth/REF 2021 Researchers by UoA
plymouth.organisational-group: /Plymouth/REF 2021 Researchers by UoA/UoA04 Psychology, Psychiatry and Neuroscience
plymouth.organisational-group: /Plymouth/REF 2021 Researchers by UoA/UoA04 Psychology, Psychiatry and Neuroscience/UoA04 Psychology, Psychiatry and Neuroscience MANUAL
plymouth.organisational-group: /Plymouth/Users by role
plymouth.organisational-group: /Plymouth/Users by role/Academics
dc.publisher.place: Netherlands
dcterms.dateAccepted: 2021-02-10
dc.rights.embargodate: 2022-02-20
dc.identifier.eissn: 1873-7838
dc.rights.embargoperiod: Not known
rioxxterms.versionofrecord: 10.1016/j.cognition.2021.104632
rioxxterms.licenseref.uri: http://www.rioxx.net/licenses/all-rights-reserved
rioxxterms.licenseref.startdate: 2021-06
rioxxterms.type: Journal Article/Review


Files in this item




All items in PEARL are protected by copyright law.
Author manuscripts deposited to comply with open access mandates are made available in accordance with publisher policies. Please cite only the published version using the details provided on the item record or document. In the absence of an open licence (e.g. Creative Commons), permissions for further reuse of content should be sought from the publisher or author.