Category: Project

Description: As society relies on algorithms to guide decisions from observations of humans, attempts to influence human behavior can also influence those algorithms. Here I report a field experiment that observes an “AI nudge,” an intervention that influences algorithm behavior by nudging human behavior. In an online community of 14 million, I test whether encouraging readers to fact-check articles causes recommendation algorithms to interpret verification as popularity and promote those articles. Interventions encouraged readers to (a) fact-check articles or (b) fact-check and vote to influence a recommendation algorithm. While both encouragements increased fact-checking behavior, only the fact-checking condition reduced an article’s algorithmic ranking on average over time, contrary to expectations. Since AI nudges are possible, they have pragmatic and theoretical importance for understanding human and machine behavior.

MARCH 26, 2020: I am making this pre-print available during the COVID-19 pandemic to encourage people working on fact-checking to also consider second-order outcomes on ranking algorithms. After completing the draft in 2018, I had set this dissertation paper aside while on the academic job market. I plan to submit it for review and publication once things slow down.

