Nudging Algorithms by Influencing Human Behavior: Effects of Encouraging Fact-Checking on News Rankings
Category: Project
Description: As society relies on algorithms to guide decisions based on observations of humans, attempts to influence human behavior can also influence those algorithms. Here I report a field experiment that observes an “AI nudge,” an intervention that influences algorithm behavior by nudging human behavior. In an online community of 14 million people, I test whether encouraging readers to fact-check articles causes recommendation algorithms to interpret verification as popularity and promote those articles. Interventions encouraged readers to (a) fact-check articles or (b) fact-check articles and vote to influence a recommendation algorithm. While both encouragements increased fact-checking behavior, only the fact-checking condition reduced an article’s algorithmic ranking on average over time, contrary to expectations. Because AI nudges are possible, they have pragmatic and theoretical importance for understanding human and machine behavior.

Note (March 26, 2020): I am making this pre-print available during the COVID-19 pandemic to encourage people working on fact-checking to also consider second-order outcomes on ranking algorithms. After completing the draft in 2018, I set this dissertation paper aside while on the academic job market. I plan to submit it for review and publication once things slow down.