**Download [Manuscript Here][1]**

**Overview**

Psychologists rely on undergraduate participant pools as their primary source of participants. Most participant pools consist of undergraduate students taking introductory psychology courses, and each semester the pool refreshes with a new group of students. In most pools, students decide for themselves when to participate, either for extra credit or to meet a course requirement. Because of this self-selection, the qualities of participants may vary across the semester, influencing both sample characteristics and the likelihood of detecting effects. This issue is relevant to all behavioral researchers who use university participant pools. As a result, there are pervasive superstitions, lay theories, and anecdotal examples about when is the best time to collect data for a particular effect. For example, some researchers never collect data at the end of the semester, while others collect data only then. Remarkably, there is little to no systematic evidence on whether the timing of pool data collection affects the power and sensitivity of experimental designs, in part because of the difficulty of obtaining sufficient data to test the question.

This crowdsourced project examined time-of-semester variation in 10 known effects, 10 individual differences, and 3 data quality indicators over the course of the academic semester in 20 participant pools (N = 2,696) and in an online sample (N = 737). Seven of the 10 effects did not replicate; three of those were interaction effects for which a main effect did replicate. Weak time-of-semester effects were observed on data quality indicators, participant sex, and a few individual differences: conscientiousness, mood, and stress. However, there was little evidence for time of semester qualifying experimental or correlational effects. This suggests a provocative conclusion: mean characteristics of pool samples change slightly during the semester, but those changes are mostly irrelevant for detecting effects.

**Files**

All project files are available in the files section.

- Final protocol, effect descriptions, and analysis plans: [ML3_Protocol_9152014.pdf][2]
- Experimenter script: [ML3_Computer_Scripts_by_collection_site.zip][3]
- Video demo of procedure: [UVa_ML3_Demo.MOV.zip][4]
- Scripts to administer the experimental procedure: [ML3_Lab_Script.docx][5]
- Packets of stimuli and questions for the in-lab portion: [ML3_In-Lab_Packetsrevised.docx][6]
- Summary of effects and individual difference measures: [ML3: Selected Effects and Individual Difference Measures][7]
- Many Labs 3 Manuscript: [ManyLabs3 Manuscript][8]
- Many Labs 3 Tables: [ManyLabs3 Tables][9]
- Many Labs 3 Supplement: [ManyLabs3 Supplementary Materials][10]
- Many Labs 3 Supplement Tables: [Many Labs 3 Supplementary Tables][11]
- Many Labs 3 Key Figure: [ManyLabs3 Figure][12]
- Data: [ML3 Final Data][13]

[1]: https://osf.io/s59bg/
[2]: https://osf.io/wamcn/
[3]: https://osf.io/vsxkg/
[4]: https://osf.io/w358c/
[5]: https://osf.io/ud3na/
[6]: https://osf.io/2zatr/
[7]: https://docs.google.com/spreadsheets/d/1db7KIJws0ttGc3vlD5WxwXsvK61o2o67JMIzzP1aaqk/edit?usp=sharing
[8]: https://osf.io/s59bg/
[9]: https://osf.io/qpwf2/
[10]: https://osf.io/ruct4/
[11]: https://osf.io/9keh3/
[12]: https://osf.io/j9ady/
[13]: https://osf.io/bxw8j/