Fashioning the face: sensorimotor simulation contributes to facial expression recognition

Adrienne Wood, Magdalena Rychlowska, Sebastian Korb, Paula M. Niedenthal

Research output: Contribution to journal › Article


Abstract

When we observe a facial expression of emotion, we often mimic it. This automatic mimicry reflects underlying sensorimotor simulation that supports accurate emotion recognition. Why this happens is becoming clearer: emotions are patterns of expressive, behavioral, physiological, and subjective feeling responses. Activation of one component can therefore automatically activate other components. When people simulate a perceived facial expression, they partially activate the corresponding emotional state in themselves, which provides a basis for inferring the underlying emotion of the expresser. We integrate recent evidence in favor of a role for sensorimotor simulation in emotion recognition. We then connect this account to a domain-general understanding of how sensory information from multiple modalities is integrated to generate perceptual predictions in the brain.
Original language: English
Pages (from-to): 227-240
Journal: Trends in Cognitive Sciences
Volume: 20
Issue number: 3
DOIs
Publication status: Published - 01 Mar 2016
