Download Manuscript Here
Overview
Psychologists rely on undergraduate participant pools as their primary source of participants. Most pools are made up of undergraduate students taking introductory psychology courses, and each semester the pool refreshes with a new group of students. In most pools, students decide for themselves when to participate, whether for extra credit or to meet a course requirement. Because of this self-selection, participant characteristics may vary across the semester, influencing both the composition of samples and the likelihood of detecting effects.
This issue is relevant to all behavioral researchers who use university participant pools. As a result, there are pervasive superstitions, lay theories, and anecdotal claims about the best time to collect data for a particular effect. For example, some researchers never collect data at the end of the semester, while others collect data only then. Remarkably, there is little to no systematic evidence on whether the timing of pool data collection affects the power and sensitivity of experimental designs. Part of the reason is the difficulty of obtaining enough data to test the question.
This crowdsourced project examined time-of-semester variation in 10 known effects, 10 individual differences, and 3 data quality indicators over the course of the academic semester in 20 participant pools (N = 2,696) and in an online sample (N = 737). Seven of the 10 effects did not replicate; three of those were interaction effects for which a main effect did replicate. Weak time-of-semester effects were observed on data quality indicators, participant sex, and a few individual differences: conscientiousness, mood, and stress. However, there was little evidence that time of semester qualified experimental or correlational effects. This suggests a provocative conclusion: mean characteristics of pool samples change slightly during the semester, but those changes are mostly irrelevant for detecting effects.
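The core analytic question above, whether time of semester qualifies an effect, amounts to testing an interaction (moderation) term. The sketch below is a minimal illustration with simulated data, not the project's analysis; the actual analyses are in the ML3 Data Analysis Scripts linked under Data and Analysis Scripts, and they additionally account for multiple collection sites. The variable names (condition, semester_time, outcome) are hypothetical.

```python
# Minimal sketch of a time-of-semester moderation test on simulated data.
# Regress the outcome on condition, time of semester, and their interaction;
# the interaction coefficient indexes whether the effect changes over the term.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 400

# Hypothetical data: a two-condition experiment with participation date
# expressed as the proportion of the semester elapsed (0 = start, 1 = end).
df = pd.DataFrame({
    "condition": rng.integers(0, 2, n),      # 0 = control, 1 = treatment
    "semester_time": rng.uniform(0, 1, n),   # proportion of semester elapsed
})
df["outcome"] = 0.4 * df["condition"] + rng.normal(0, 1, n)  # main effect, no true moderation

# condition:semester_time is the term of interest.
model = smf.ols("outcome ~ condition * semester_time", data=df).fit()
print(model.summary().tables[1])
```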
Key Project Files
Below are links to the key files associated with this project.
Preregistration Materials
Final protocol, effect description, and analysis plans: ML3_Protocol_9152014.pdf
Preregistration of Protocol: Registration
Summary of effects and individual difference measures: ML3: Selected Effects and Individual Difference Measures
Study Materials
Video demo of procedure: UVa_ML3_Demo.MOV.zip
Video demos from collection sites: Channel
Scripts to administer the experimental procedure: ML3_Lab_Script.docx
Packets of stimuli and questions for in-lab portion: ML3_In-Lab_Packetsrevised.docx
Data and Analysis Scripts
Data: ML3 Final Data
Variable Codebook: ML3 Variable Codebook
Analysis Scripts: ML3 Data Analysis Scripts
Manuscript Materials
Many Labs 3 Manuscript: ManyLabs3 Manuscript
Many Labs 3 Tables: ManyLabs3 Manuscript Tables
Many Labs 3 Supplement: ManyLabs3 Supplementary Materials
Many Labs 3 Supplement Figure and Tables: Many Labs 3 Supplementary Figure and Tables
Many Labs 3 Key Figure: ManyLabs3 Figure