This issue invited proposals for highly powered, direct replications of important results in social psychology. The review process focused on the soundness of the design and analysis, regardless of the significance level. The issue is now available [here][1].

**What are important results?**

Importance is subjective but demonstrable. Proposals had to justify the replication value of the finding to be replicated. To merit publication in this issue, the original result had to be important (e.g., highly cited, a topic of intense scholarly or public interest, a challenge to established theories) but also of uncertain truth value (e.g., few confirmations, imprecise estimates of effect sizes). The prestige of the original publishing journal was not sufficient to justify replication value.

**What replication formats were encouraged by the special issue?**

Proposals were for direct replications that faithfully reproduced the original procedure, materials, and analysis for verification. Conceptual replications that attempted to improve theoretical understanding by changing the operationalization of the constructs were not considered for this issue. Articles in the issue could take several forms:

(1) Registered replication. Authors submitted the introduction, methods, and analysis plan for a replication study or studies. These proposals were reviewed for their importance and soundness. Once a proposal was provisionally accepted, if the authors completed the study as proposed, the results were published without regard to the outcome. Registered replication proposals could also include: (a) collaborations between two or more laboratories independently attempting to replicate an effect with the same materials, (b) joint replication by the original laboratory and another laboratory, or (c) adversarial collaborations in which laboratories with competing expectations prepared a joint registered proposal and conducted independent replications. Only adequately powered tests of results with high replication value were considered. Registered replication projects were eligible to apply for grant funding from the Center for Open Science.

(2) Aggregation of existing replication attempts. Authors submitted a proposal to aggregate existing replication attempts from the proverbial file drawer. Even if individual replication attempts were underpowered, aggregated results could increase the precision of estimates of an effect (see the brief sketch below). Ideal proposals aggregated replication attempts from multiple laboratories. They could also explore the impact of procedural variations, replicator expertise, or other factors on replication success.

(3) Other approaches. Authors were encouraged to propose novel replication strategies or methods to improve the precision of estimates of important effects in social psychology.

**How were replication projects proposed?**

Interested authors contacted the guest editors before preparing a formal proposal. Deadlines for the formal proposal and final manuscript depended on the type of project. Instructions for authors preparing a preregistered proposal are [here][2]. Once a proposal was accepted, instructions for the registration process were available [here][3].
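As a minimal illustration of the precision argument behind format (2): pooling several small replication attempts with standard inverse-variance (fixed-effect) weighting yields an aggregate estimate whose standard error is smaller than any single study's. The effect sizes and variances below are hypothetical, and this weighting scheme is a generic meta-analytic sketch, not a method prescribed by the special issue.

```python
import numpy as np

# Hypothetical effect-size estimates (Cohen's d) from five underpowered
# replication attempts, each with its own sampling variance.
d = np.array([0.21, 0.35, 0.10, 0.28, 0.18])
var = np.array([0.040, 0.055, 0.048, 0.062, 0.045])

# Fixed-effect (inverse-variance weighted) aggregation.
w = 1.0 / var
d_pooled = np.sum(w * d) / np.sum(w)
se_pooled = np.sqrt(1.0 / np.sum(w))

print("Individual SEs:", np.round(np.sqrt(var), 3))
print(f"Pooled estimate: {d_pooled:.3f} (SE = {se_pooled:.3f})")
# The pooled standard error is smaller than any individual study's,
# so the aggregated estimate of the effect is more precise.
```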
[Home][4] | [About the Issue][5] | [The Articles][6]

[1]: http://www.psycontent.com/content/l67413865317/?p=3cc4b2ccfeb847dab4d86d0309a3237e&pi=0
[2]: https://docs.google.com/document/d/1rNHYMpyhEzSPh6opCVagCEJzFKtjgoZdIF3srKU4ZPg/edit#bookmark=id.2206xramif68
[3]: https://docs.google.com/document/d/19htjV1XBeqq1xHs-EEdOe-NJeLeUNo9EyxRAjZpslrQ/edit
[4]: http://bit.ly/14LBc1N
[5]: http://bit.ly/13kB1Ms
[6]: http://bit.ly/17NBnHz