**ABSTRACT**

Affective incongruence between face and body interferes with emotion recognition. Existing research has focused on interference effects for universally recognizable bodily expressions. However, it remains unknown whether learned, conventional gestures can interfere with facial expression processing. In Study 1, participants (N = 62, within-subject) viewed videos of facial expressions accompanied by hand gestures and indicated the valence of either the face or the hand. Responses were slower and less accurate when the face-hand pairing was incongruent than when it was congruent. We hypothesized that hand gestures might exert an even stronger influence on facial expression processing when other routes to understanding the meaning of a facial expression, such as sensorimotor simulation, are disrupted. Participants in Study 2 (N = 127) completed the same task as in Study 1, but the facial mobility of some participants was restricted, a manipulation that disrupted facial expression processing in prior work. The hand-face congruency effect from Study 1 was replicated. The facial mobility manipulation affected males only, and it did not moderate the congruency effect. The present work suggests that the affective meaning of conventional gestures is processed automatically and can interfere with face perception, but perceivers do not seem to rely more on gestures when sensorimotor face processing is disrupted.

**NOTES ON USING OUR MATERIALS**

The following materials are provided as supplementary materials to the manuscript. Feel free to use our stimuli or experiment materials, but please cite this manuscript if you do:

*Wood, A., Martin, J., Alibali, M., & Niedenthal, P. (under review). A sad thumbs up: Incongruent gestures and disrupted sensorimotor activity both slow processing of facial expressions*

* **All data files** (in .csv and .txt formats). See the R code for information on how we handled and analyzed the data; a minimal loading sketch also appears after this list.
* **Complete R code**, in the form of an R Markdown file ("gesture_analyses_for_pub2.Rmd")
* **R Markdown output**, which shows the output of analyses and graphs alongside the R code ("gesture_analyses_for_pub2.html")
* **All materials necessary to run the experiment**, including the Python experiment file ("gesture_done.py"), supporting files, and the video stimuli used in the two studies, uploaded as a compressed zip file (Gesture_Study_materials.zip)

**NOTE**: To run the Python experiment (built by Jared Martin using the PsychoPy library), we recommend installing Python following the instructions provided by Gary Lupyan: [http://sapir.psych.wisc.edu/wiki/index.php/Psych711][1]

[1]: http://sapir.psych.wisc.edu/wiki/index.php/Psych711
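If you would like a quick look at the data outside of the provided R workflow, a minimal Python sketch along the following lines may help. The file name and column names here ("study1_data.csv", "congruency", "rt", "correct") are hypothetical placeholders for illustration only; the actual variable names and data handling are documented in gesture_analyses_for_pub2.Rmd.

```python
# Minimal sketch: inspect mean response time and accuracy by face-hand congruency.
# NOTE: file name and column names are hypothetical placeholders; see
# gesture_analyses_for_pub2.Rmd for the actual variable names and preprocessing.
import pandas as pd

# Load one of the .csv data files provided in the supplementary materials
data = pd.read_csv("study1_data.csv")  # hypothetical file name

# Mean reaction time and accuracy, split by congruent vs. incongruent trials
summary = data.groupby("congruency")[["rt", "correct"]].mean()
print(summary)
```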