[RP:P home][1]

## RP:P Analyses

The Reproducibility Project: Psychology (RP:P) was conducted to maximize transparency of the process, materials, and data, and to maximize reproducibility of the observed results. To that end, documentation of the analysis process is available here, along with scripts to reproduce the entire analysis.

### Reproducing individual results

Every study identified a key statistic to serve as the basis for evaluating replication success in that study. These inferential tests became data points in the meta-analysis of the 100 replications in the summary report. Replication teams conducted their own analyses, and those analyses were verified through an independent analysis audit by another team member. The audit analyses were completed in R (with a couple of exceptions) so that the results could be reproduced easily and independently by anyone. In addition, a controller R script was written to reproduce the findings of all audit scripts at once. Analysis auditors followed [these instructions][2]. Each replication's audit script appears on the study's OSF project.

- [Replication Audit Scripts][3]: Links to each replication's audit script can be found here.
- [Audit output][4]: Output from the individual audit scripts is available here.
- [Controller Script][5]: The controller script for reproducing all audit analyses at once is available here.
- [Controller Output][6]: Output from the controller script is available here.

### Reproducing the article's findings

The data file representing the aggregate RP:P results is also available for re-use. This file includes the outcomes of the replications and many other variables about the original team, publication, replication team, and characteristics of the finding and replication process. The analysis scripts needed to reproduce the data file, reported table values, and figures are available in the files section of this component.

- [README][7]: A README describing the analysis process is available here; portions of it are reproduced below.
- [Original dataset][8]: A CSV of the published dataset is available here.
- [Codebook][9]: A CSV explaining each of the variables in the dataset is available here.
- [Figures and tables][10]: Figures and tables from the publication are available here.
- [Additional figures and graphs][11]: Additional charts are available for download in the files section of this project.
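For anyone re-using the aggregate data directly, a minimal sketch in R is shown below. It assumes the dataset and codebook CSVs have been downloaded from the links above; `rpp_data.csv` matches the filename used in the analysis repository, while `rpp_codebook.csv` is a placeholder name for the codebook download.

```r
# Minimal sketch: load the aggregate RP:P dataset and its codebook for re-use.
# Assumes both CSVs were downloaded from the OSF links above and saved locally;
# "rpp_data.csv" is the name used in the analysis repository, and
# "rpp_codebook.csv" is a placeholder for whatever name the codebook download receives.
rpp_data <- read.csv("rpp_data.csv", stringsAsFactors = FALSE)
codebook <- read.csv("rpp_codebook.csv", stringsAsFactors = FALSE)

# Orientation: how many rows (replication outcomes) and which variables are recorded.
nrow(rpp_data)
names(rpp_data)

# Preview the codebook to see how each variable is defined before analysis.
head(codebook)
```

From there, each variable in the data file can be cross-referenced against its codebook entry before building any analysis on top of it.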
### A living dataset

At the time of publication, RP:P included results from 100 replications. Several additional replications were still ongoing and could not be included in the summary analysis and report. As additional studies from the sampling frame are conducted and reported following the RP:P protocol, their findings can be added to the dataset. Also, any complex project is likely to contain errors. Despite a rigorous auditing process, we expect that errors could still be identified in the individual replication data or analyses, or in the aggregate data or analyses. If you believe that you have found an error, please email reproducibilityproject@cos.io to report it. We will maintain a versioned dataset so that both the "original" and the most recent versions of the dataset can be analyzed.

- [Original dataset][12]: A copy of the data as it was at the time of publication is available here.
- [Updated Dataset][13]: The most up-to-date copy of the data is available here. It includes additional completed studies and error corrections.
- [CHANGELOG][14]: A CHANGELOG documenting the changes to the updated datafile is available here.
- [Analysis Discussions][16]: A record of discussions between original authors and replicators regarding particular studies and potential alternative methods of analysis.

### README excerpt

**Running the analyses for the RP:P project**

The full README is available [here][15]. There are two ways to get the files required to reproduce all analyses in the RPP manuscript:

1. Download the zip file rpp_reproduce.zip and extract the folder (for non-git users). You can use this link to do that.
2. Clone this git repository and run masterscript.R (for git users). The command to do this is `git clone https://github.com/centerforopenscience/rpp FOLDERNAME`, where FOLDERNAME is the name of the folder the files will be placed in (note your working directory so you know where this folder will be created).

Once the files are downloaded, running the analyses is straightforward (please make sure you have the R statistical package installed, downloadable here):

1. Open the masterscript.R file in R.
2. Run all.
3. Select the directory where you downloaded the files (i.e., the folder where masterscript.R, functions.R, RPP_figures.R, and rpp_data.csv are located).
4. Now you can run all the results. (A minimal sketch of these steps appears below.)

[1]: https://osf.io/ezcuj/wiki/home/
[2]: https://osf.io/nefzv/
[3]: https://osf.io/ezcuj/wiki/Replicated%20Studies/
[4]: https://osf.io/4ybmd/
[5]: https://osf.io/fkmwg/
[6]: https://osf.io/pkj9a/
[7]: https://github.com/CenterForOpenScience/rpp
[8]: https://osf.io/fgjvw/
[9]: https://osf.io/bhcsf/
[10]: https://osf.io/ezum7/files/
[11]: https://osf.io/ytpuq/files/
[12]: https://osf.io/fgjvw/
[13]: https://osf.io/yt3gq/
[14]: https://osf.io/3qdkg/wiki/home/
[15]: https://github.com/CenterForOpenScience/rpp
[16]: https://osf.io/3qdkg/wiki/Analysis%20Discussions
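The numbered steps above can also be run non-interactively from a plain R session. The sketch below is only an illustration: it assumes the cloned repository or extracted zip lives in a local folder (the path shown is a placeholder) and that masterscript.R, functions.R, RPP_figures.R, and rpp_data.csv sit together in that folder, as described in step 3.

```r
# Minimal sketch of steps 1-4 above, run from a plain R session.
# The path is a placeholder for wherever the repository or zip was extracted.
rpp_dir <- "~/projects/rpp"

# Step 3: point R at the folder containing masterscript.R, functions.R,
# RPP_figures.R, and rpp_data.csv.
setwd(rpp_dir)

# Steps 2 and 4: source the master script, which runs the full set of analyses
# and echoes each command as it executes.
source("masterscript.R", echo = TRUE)
```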