


Category: Project

Description: Misinformation on social media is a key challenge to effective and timely public health responses. Existing mitigation measures include flagging misinformation and providing links to correct information, but they have not yet targeted social processes. Here, we examine whether providing balanced social reference cues in addition to flagging misinformation reduces sharing behavior. In three field experiments on Twitter (N=886, N=322, and N=278), we show that highlighting which content others within one's personal network share and, more importantly, do not share, combined with misinformation flags, significantly and meaningfully reduces the amount of misinformation shared (Studies 1-3). We show that this reduction is driven by a change in injunctive social norms (Study 2) but not by social identity (Study 3). Social reference cues, combined with misinformation flags, are a feasible and scalable means of curbing the sharing of misinformation on social media.

Wiki


Files


Citation

Recent Activity

