
Category: Project

Description: Severe testing is often hampered by unspecified options in the specification of a hypothesis and in design, data processing, and analysis, which can be misused to fish for a desired result. In such cases, it is unlikely that a hypothesis claimed to be confirmed in a paper could have turned out to be false. As an intervention, we aim to develop a condensed form that summarises the core information about severe testing and highlights ambiguities in its reporting. The form is intended to inform 1) authors when planning a study and preparing a manuscript, 2) journals in the submission/review process, and 3) reviewers of published papers.

In particular, the form asks about the following:

- Is it clear where the hypothesis comes from?
- Is it clear how the hypothesis gave rise to the analysis carried out?
- What results other than those found would have led to reporting the hypothesis as not confirmed?
- Were alternative explanations to the hypothesis (being true) investigated?
- Can the claims on these issues be checked through transparency measures?

The form includes items from the following domains:

1. Origin of the hypothesis
2. Data processing, choice and coding of the variables used in the analysis
3. Analysis
4. Alternative explanations (design issues; bias due to measurement, selection, and confounding)

November 20th: We have reached a consensus on the content of the form and its items. In December we plan to collect collaborative feedback from methodological experts in psychology and beyond, selected personally by the members of this project. Initiative groups for better science and journal editors will also be invited. Based on the feedback, we shall revise the form in early 2025 and prepare a paper on it. After publication, the form will be publicly available.

The present work builds, among others, on these publications:

Devezer, B., & Buzbas, E. (2021). Minimum viable experiment to replicate. [Preprint]
Greenland, S. (2017). For and against methodologies: Some perspectives on recent causal and statistical inference debates. Eur J Epidemiol, 32(1), 3-20. doi:10.1007/s10654-017-0230-6
Guest, O. (2024). What makes a good theory, and how do we make a theory good? Comput Brain Behav. https://doi.org/10.1007/s42113-023-00193-2
Höfler, M., Scherbaum, S., Kanske, P., McDonald, B., & Miller, R. (2022). Means to valuable exploration I: The blending of confirmation and exploration and how to resolve it. Meta-Psychology, 2, 6. doi:10.15626/MP.2021.2837
Lakens, D. (2019). The value of preregistration for psychological science: A conceptual analysis. doi:10.31234/osf.io/jbh4w
Mayo, D. (2018). Statistical Inference as Severe Testing: How to Get Beyond the Statistics Wars. Cambridge: Cambridge University Press.
Scheel, A. M. (2022). Why most psychological research findings are not even wrong. Infant and Child Development, 31, Article e2295. https://doi.org/10.1002/icd.2295

License: CC0 1.0 Universal

Wiki

This Wiki records the history of the project.

The form and the history of its construction are stored here: https://docs.google.com/presentation/d/1L0Ih2qUMq-l7NvpYjOxDExf3mhQPqs6pR3-py8mGj_4/edit#slide=id.g2ca77c579a9_1_75

July 24: Work on the form continues here: https://docs.google.com/document/d/1PxJ5bqnYFACdC140jgLQHlI2Uq6r5wxFT2c9p01kqgY/edit


Tags

Inference, Open Science, Replication
