Description: Misinformation has the capacity to negatively impact people’s beliefs and behaviours (Ecker et al., 2022). The advent of social media has arguably increased the quantity of misinformation people may be exposed to, which not only increases the potential impact of misinformation, but makes it increasingly difficult to provide targeted interventions to counteract specific misinformation exposure (e.g., debunking; Pennycook & Rand, 2022a). As such, in recent years there has been an increased focus on developing and implementing general interventions which are easily scalable to the social-media environment (Kozyreva et al., 2022; Pennycook et al., 2020; Roozenbeek et al., 2022). Many of these proposed interventions are based on nudge theory, which posits that small changes in choice architecture can meaningfully impact decision-making processes (Lin et al., 2017). In the realm of misinformation, these nudge-based interventions typically attempt to reduce sharing of misinformation that may occur due to inattentiveness (Pennycook & Rand, 2022a) by providing prompts that prime people to consider (1) the accuracy of encountered information (i.e., accuracy prompts; Pennycook & Rand, 2022b) or (2) the attitudes or behaviours of others (e.g., descriptive social-norm prompts highlighting that most people avoid sharing misinformation; Epstein et al., 2021; Prike et al., 2023). Within experimental settings, nudge-based misinformation interventions have generally been shown to have a beneficial (though small) impact on sharing behaviour (e.g., Epstein et al., 2021; Pennycook & Rand, 2022b; Prike et al., 2023; Roozenbeek et al., 2021), either by directly reducing sharing of false information or by improving “sharing discernment” (i.e., the proportion of true compared to false information participants report they would share).
Although these findings are arguably positive, the assessment of the effectiveness of nudge-based interventions often fails to appropriately consider the structure of the social-media information environment. Specifically, within these studies (1) the proportion of false information is often artificially high (i.e., around 50% of the claims presented), (2) participants are often only exposed to true and false news-based information (typically in the form of headlines), and (3) participants are usually required to rate their sharing intent for every headline presented. In the real world, the quantity of misinformation people are exposed to on social media is negligible compared to true or non-news-based (e.g., personal or opinion-based) information (Altay et al., 2023), and the quantity of information people are exposed to on social media is greater than that which they are inclined or able to critically appraise (Pennycook & Rand, 2022a; Zollo & Quattrociocchi, 2018). Given these factors, it is unclear whether the influence of nudge-based interventions observed in typical experimental settings will translate to information environments that more accurately resemble real-world social media environments. In fact, nudges, especially those which aim to prime a certain behaviour, are known to be highly context-specific and quick to decay, if they are effective at all (Chater & Loewenstein, 2022; Lin et al., 2017). Thus, it is plausible that nudge-based interventions that aim to reduce misinformation sharing may be effective when the level of misinformation in the environment is high but not when the level of misinformation is lower (Roozenbeek et al., 2021). The overarching aim of the current experiment is to assess the effectiveness of a scalable, nudge-based intervention (specifically, a combined accuracy prompt and social-norm intervention) in settings with greater external validity. 
Specifically, we aim to assess whether the effect of an intervention is influenced by the quantity of misinformation (compared to other types of information) people are exposed to. To this end, we will implement the nudge intervention in a mock social-media environment; participants will encounter social-media posts, with a varying number of posts containing false information. Two factors will be manipulated: (1) the presence of the nudge intervention (present or absent), and (2) the proportion of false claims presented (ranging from 12.5% to 50%). In the condition with 50% false claims, 40 false and 40 true headlines will be presented; the 20% condition will feature 10 false and 40 true headlines. To control for the difference in the total number of posts (which may affect overall nudge efficacy due to decay), an additional 12.5% condition will use 10 false headlines, 40 true headlines, and 30 non-verifiable claims. True and false headlines will be similar in style to those used in other nudging studies (e.g., Pennycook et al., 2021; Pennycook & Rand, 2019, 2022); non-verifiable claims will be similar to common non-news posts on social media (e.g., buy/sell/swap, birthday, or holiday posts). The 10 false headlines used in the 20% and 12.5% false conditions will be kept constant across participants. To further increase external validity, posts will be presented in a mock feed using a realistic social-media simulator (Butler et al., 2023). Participants’ engagement with the posts will be measured using the options available within the simulation (specifically, participants can choose to “like” or “share” posts). Participants will not be forced to engage with posts, but rather will be asked to engage with information as they typically would on social media.
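The stimulus composition of the three proportion conditions can be summarised with a short sketch. The counts below are taken directly from the text; the dictionary layout and helper name are our own and purely illustrative.

```python
# Post counts per proportion condition, as described in the design above.
CONDITIONS = {
    "50%":   {"false": 40, "true": 40, "non_verifiable": 0},
    "20%":   {"false": 10, "true": 40, "non_verifiable": 0},
    "12.5%": {"false": 10, "true": 40, "non_verifiable": 30},
}

def false_proportion(counts: dict) -> float:
    """Proportion of posts in the feed that contain false claims."""
    total = sum(counts.values())
    return counts["false"] / total

for label, counts in CONDITIONS.items():
    print(label, false_proportion(counts))
```

Note that the 12.5% condition matches the 50% condition in total feed length (80 posts) while matching the 20% condition in the absolute number of false headlines (10), which is what allows feed length and misinformation proportion to be disentangled.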
Given engagement behaviours are, to a degree, incentivised on social media (Globig et al., 2023), there will be small, ostensible changes to participants’ simulated follower count based on their engagement with the posts. These changes will be randomly sampled from a normal distribution; the distribution parameters will be identical across post types to avoid changes interacting with the nudge effects.
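A minimal sketch of the follower-count feedback described above: changes are drawn from a normal distribution whose parameters are identical for every post type, so the feedback cannot systematically interact with the nudge manipulation. The mean and standard deviation here are illustrative placeholders, not values specified in the text.

```python
import random

# Illustrative distribution parameters (the text does not specify them);
# crucially, the same parameters apply to every post type.
MEAN, SD = 2.0, 1.0

def follower_delta(post_type: str, rng: random.Random) -> int:
    """Ostensible change in follower count after engaging with a post.

    The post_type argument (true / false / non-verifiable) is deliberately
    ignored when sampling, so the distribution is identical across types.
    """
    return round(rng.gauss(MEAN, SD))

# Usage: update a simulated follower count as a participant engages.
rng = random.Random(42)
followers = 100
for post_type in ["true", "false", "non_verifiable"]:
    followers += follower_delta(post_type, rng)
```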

License: CC-By Attribution-ShareAlike 4.0 International

