Download Manuscript Here

Overview

Psychologists rely on undergraduate participant pools as their primary source of research participants. Most pools consist of students taking introductory psychology courses, and each semester the pool refreshes with a new group of students. In most pools, students decide for themselves when during the semester to participate, either for extra credit or to fulfill a course requirement. Because of this self-selection, participant characteristics may vary across the semester, influencing both the composition of samples and the likelihood of detecting effects.

This issue is relevant to all behavioral researchers who use university participant pools. Not surprisingly, there are pervasive superstitions, lay theories, and anecdotal examples about when during the semester it is best to collect data for a particular effect. For example, some researchers never collect data at the end of the semester, while others collect data only then. Remarkably, there is little systematic evidence on whether the timing of pool data collection affects the power and sensitivity of experimental designs, partly because of the difficulty of obtaining enough data to test the question.

This crowdsourced project examined time-of-semester variation in 10 known effects, 10 individual differences, and 3 data quality indicators across the academic semester in 20 participant pools (N = 2,696) and in an online sample (N = 737). Seven of the 10 effects did not replicate; three of those were interaction effects for which a main effect did replicate. Weak time-of-semester effects were observed on data quality indicators, participant sex, and a few individual differences (conscientiousness, mood, and stress). However, there was little evidence that time of semester qualified experimental or correlational effects. This suggests a provocative conclusion: mean characteristics of pool samples change slightly during the semester, but those changes are mostly irrelevant to detecting effects.
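
As a rough illustration of what it means for time of semester to "qualify" an effect, the sketch below fits a moderation model in which an experimental effect is allowed to interact with week of the semester. This is an illustrative example on simulated data, not the project's actual analysis scripts (which are linked below); the variable names (dv, condition, week) and the data-generating assumptions are hypothetical.

```python
# Illustrative sketch (not the ML3 analysis pipeline): test whether an
# experimental effect is moderated by time of semester.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 400

# Simulated data: 'week' indexes time of semester, 'condition' is a
# two-cell manipulation, and 'dv' carries a condition effect that does
# not depend on week (mirroring the project's main finding).
df = pd.DataFrame({
    "week": rng.integers(1, 15, size=n),
    "condition": rng.integers(0, 2, size=n),
})
df["dv"] = 0.4 * df["condition"] + rng.normal(size=n)

# The condition-by-week interaction term tests whether the effect size
# changes across the semester; a near-zero interaction coefficient
# means time of semester does not qualify the effect.
model = smf.ols("dv ~ condition * week", data=df).fit()
print(model.summary().tables[1])
```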

Key Project Files

Below are links to the key files associated with this project.

Preregistration Materials

Final protocol, effect descriptions, and analysis plans: ML3_Protocol_9152014.pdf

Preregistration of Protocol: Registration

Summary of effects and individual difference measures: ML3: Selected Effects and Individual Difference Measures

Study Materials

Video demo of procedure: UVa_ML3_Demo.MOV.zip

Video demos from collection sites: Channel

Scripts to administer the experimental procedure: ML3_Lab_Script.docx

Packets of stimuli and questions for the in-lab portion: ML3_In-Lab_Packetsrevised.docx

Data and Analysis Scripts

Data: ML3 Final Data

Variable Codebook: ML3 Variable Codebook

Analysis Scripts: ML3 Data Analysis Scripts

Manuscript Materials

Many Labs 3 Manuscript: ManyLabs3 Manuscript

Many Labs 3 Tables: ManyLabs3 Manuscript Tables

Many Labs 3 Supplement: ManyLabs3 Supplementary Materials

Many Labs 3 Supplement Figure and Tables: Many Labs 3 Supplementary Figure and Tables

Many Labs 3 Key Figure: ManyLabs3 Figure
