
The identification of unfolding facial expressions

Fiorentini, Chiara; Schmidt, Susanna; Viviani, Paola

Description

We asked whether the identification of emotional facial expressions (FEs) involves the simultaneous perception of the facial configuration or the detection of emotion-specific diagnostic cues. We recorded at high speed (500 frames s-1) the unfolding of the FE in five actors, each expressing six emotions (anger, surprise, happiness, disgust, fear, sadness). Recordings were coded every 10 frames (20 ms of real time) with the Facial Action Coding System (FACS; Ekman et al 2002, Salt Lake City, UT: Research Nexus eBook) to identify the facial actions contributing to each expression, and their intensity changes over time. Recordings were shown in slow motion (1/20 of recording speed) to one hundred observers in a forced-choice identification task. Participants were asked to identify the emotion during the presentation, as soon as they felt confident to do so. Responses were recorded along with the associated response times (RTs). The RT probability density functions for both correct and incorrect responses were correlated with the facial activity during the presentation. There were systematic correlations between facial activities, response probabilities, and RT peaks, and significant differences in RT distributions for correct and incorrect answers. The results show that a reliable response is possible long before the full FE configuration is reached. This suggests that identification is reached by integrating in time individual diagnostic facial actions, and does not require perceiving the full apex configuration.

dc.contributor.author: Fiorentini, Chiara
dc.contributor.author: Schmidt, Susanna
dc.contributor.author: Viviani, Paola
dc.date.accessioned: 2015-12-10T23:27:18Z
dc.identifier.issn: 1035-1841
dc.identifier.uri: http://hdl.handle.net/1885/68167
dc.description.abstract: We asked whether the identification of emotional facial expressions (FEs) involves the simultaneous perception of the facial configuration or the detection of emotion-specific diagnostic cues. We recorded at high speed (500 frames s-1) the unfolding of the FE in five actors, each expressing six emotions (anger, surprise, happiness, disgust, fear, sadness). Recordings were coded every 10 frames (20 ms of real time) with the Facial Action Coding System (FACS; Ekman et al 2002, Salt Lake City, UT: Research Nexus eBook) to identify the facial actions contributing to each expression, and their intensity changes over time. Recordings were shown in slow motion (1/20 of recording speed) to one hundred observers in a forced-choice identification task. Participants were asked to identify the emotion during the presentation, as soon as they felt confident to do so. Responses were recorded along with the associated response times (RTs). The RT probability density functions for both correct and incorrect responses were correlated with the facial activity during the presentation. There were systematic correlations between facial activities, response probabilities, and RT peaks, and significant differences in RT distributions for correct and incorrect answers. The results show that a reliable response is possible long before the full FE configuration is reached. This suggests that identification is reached by integrating in time individual diagnostic facial actions, and does not require perceiving the full apex configuration.
dc.publisher: Ceramics Art and Perception
dc.source: Ceramics: Art and Perception
dc.subject: Keywords: adult; article; association; attention; color vision; emotion; facial expression; female; human; male; pattern recognition; perceptive discrimination; reaction time; Adult; Attention; Color Perception; Cues; Discrimination (Psychology); Emotions; Facial Expression; Action units; Facial expressions; Identification
dc.title: The identification of unfolding facial expressions
dc.type: Journal article
local.description.notes: Imported from ARIES
local.identifier.citationvolume: 41
dc.date.issued: 2012
local.identifier.absfor: 170112 - Sensory Processes, Perception and Performance
local.identifier.ariespublication: f5625xPUB1637
local.type.status: Published Version
local.contributor.affiliation: Fiorentini, Chiara, College of Medicine, Biology and Environment, ANU
local.contributor.affiliation: Schmidt, Susanna, University of Turin
local.contributor.affiliation: Viviani, Paola, University of Geneva
local.description.embargo: 2037-12-31
local.bibliographicCitation.issue: 5
local.bibliographicCitation.startpage: 532
local.bibliographicCitation.lastpage: 555
local.identifier.doi: 10.1068/p7052
local.identifier.absseo: 970117 - Expanding Knowledge in Psychology and Cognitive Sciences
dc.date.updated: 2016-02-24T08:49:18Z
local.identifier.scopusID: 2-s2.0-84865467293
local.identifier.thomsonID: 000303918500001
Collections: ANU Research Publications

Download

File: 01_Fiorentini_The_identification_of_2012.pdf
Size: 395.93 kB
Format: Adobe PDF
Access: Request a copy


Items in Open Research are protected by copyright, with all rights reserved, unless otherwise indicated.

Updated: 19 May 2020 / Responsible Officer: University Librarian / Page Contact: Library Systems & Web Coordinator