## Overview
Longitudinal crowdsourcing relies on the same workers completing a task more than once, yet keeping those workers engaged across sessions is difficult. We surveyed 300 workers on Amazon Mechanical Turk, Prolific, and Toloka, collecting 547 detailed accounts of longitudinal participation. A mixed-methods analysis reveals platform-specific retention patterns, motivational drivers, and structural barriers, distilled into 17 take-home messages, 8 recommendations for task requesters, and 5 best practices for platform operators.
## Methodology
- Survey of 300 workers on Amazon Mechanical Turk, Prolific, and Toloka
- Questionnaire covered demographics, experience with longitudinal tasks, motivations, and obstacles
- Dataset comprises 547 reported experiences with longitudinal tasks (workers could describe more than one); a sketch of the quantitative tabulation follows this list
- Combined quantitative statistics and qualitative thematic coding to uncover retention patterns and barriers
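To make the quantitative side of the mixed-methods analysis concrete, the sketch below shows how per-platform retention rates could be tabulated from a survey export. The file name and column names (`platform`, `completed_all_sessions`) are hypothetical illustrations, not artifacts of the study; the paper's actual analysis code is not published.

```python
# Minimal sketch, assuming a hypothetical CSV export with one row per
# reported experience, a "platform" column (mturk / prolific / toloka)
# and a boolean "completed_all_sessions" column. Names are illustrative.
import pandas as pd

responses = pd.read_csv("longitudinal_survey_responses.csv")

# Share of reported experiences in which the worker stayed through the
# final session, broken down by marketplace.
retention_by_platform = (
    responses.groupby("platform")["completed_all_sessions"]
    .mean()
    .sort_values(ascending=False)
)

print(retention_by_platform)
```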
## Results
- **Platform-specific retention**: retention rates differ across the three marketplaces, indicating that the platform itself shapes continued participation
- **Motivational factors**: workers remain engaged when compensation is fair, topics are interesting, and communication is clear
- **Barriers**: dropout is driven by poor communication, vague instructions, and low pay
- **Recommendations**: requesters can improve retention by building trust, providing precise instructions, and ensuring fair compensation
- **Best practices**: the study distills 8 guidelines for task requesters and 5 practices for platform operators to better support longitudinal studies