Dissociating facial electromyographic correlates of visual and verbal induced rumination
Category: Project
Description: Previous research has shown that mental rumination, considered a form of repetitive and negative inner speech, is associated with increased facial muscular activity. However, the relation between these muscular activations and the underlying mental processes remains unclear. In this study, we aimed to dissociate the facial electromyographic correlates of induced rumination related to either i) mechanisms of (inner) speech production or ii) rumination as a state of pondering on negative affects. To this end, we compared two groups of participants who underwent two types of rumination induction (85 female undergraduate students without excessive depressive symptoms in total). The first induction was designed to induce rumination specifically in a verbal modality, whereas the second was designed to induce rumination in a visual modality. Following the *motor simulation view* of inner speech production, we hypothesised that the verbal rumination induction would result in a greater increase in activity of the speech-related muscles than the non-verbal rumination induction. We also hypothesised that relaxation focused on the orofacial area would be more effective at reducing rumination (when experienced in a verbal modality) than relaxation focused on a non-orofacial area. Our results do not corroborate these hypotheses: both rumination inductions resulted in a similar increase in peripheral muscular activity relative to baseline levels, and the two relaxation types were similarly effective at reducing rumination, whatever the rumination induction. We discuss these results in relation to the inner speech literature and suggest that, because rumination is a habitual and automatic form of emotion regulation, it might be a strongly internalised and condensed form of inner speech. The pre-registered protocol, preprint, data, and reproducible code and figures are available at: https://osf.io/c9pag/.