<h2><strong>Overview</strong></h2> <p><a href="http://PsychDisclosure.org" rel="nofollow">PsychDisclosure.org</a> provides a platform for authors of published articles in psychology to publicly disclose four categories of important methodological details that are not required to be disclosed under current reporting standards, but which are essential for interpreting research findings.</p> <p>Recently, several common research practices in psychology have been highlighted as potentially impeding knowledge development and hurting the reputation of our field. For instance, it has become acceptable -- and action editors have often required authors -- to selectively exclude and report measures, manipulations, samples, and analyses on the basis of whether these practices yield significant results or tell more compelling stories, rather than for principled reasons (John, Loewenstein, & Prelec, 2012; <a href="http://www.tilburguniversity.edu/nl/nieuws-en-agenda/finalreportLevelt.pdf" rel="nofollow">Stapel final report, 2012</a>). (Of course, many methodological details also go unreported for reasons that have nothing to do with increasing the statistical significance or compellingness of the story.)</p> <p>Regardless of the source of these suboptimal research practices, <strong>it is our belief that many of us would appreciate the opportunity to provide more details about the methods actually used to obtain findings reported in published articles (indeed, over 40% of contacted authors have provided such details). 
Our initiative provides this opportunity.</strong> Our effort builds upon an initiative recently proposed by <a href="http://papers.ssrn.com/sol3/papers.cfm?abstract_id=2160588" rel="nofollow">Simmons, Nelson, and Simonsohn (2012)</a>, wherein authors submitting manuscripts for publication voluntarily include a 21-word disclosure statement regarding crucial methodological details that are not required to be disclosed under currently accepted reporting standards. </p> <p>The primary benefits of our initiative include the following: </p> <ol> <li><strong>increasing the information value of recently published articles to allow for more accurate interpretation of the reported findings,</strong></li> <li><strong>making visible what goes on under the radar of official publications, and</strong></li> <li><strong>promoting sounder research practices by raising awareness of the ineffective and out-of-date reporting standards in our field, with the hope that our website inspires journal editors to change editorial policies so that the 4 categories of methodological details disclosed on this website become a required component of submitted manuscripts.</strong></li> </ol> <p>We aim to achieve this by inviting (via email) a subset of corresponding authors of recently published articles (2012 and onward) in prominent psychology journals to publicly disclose important methodological details, derived from the four methodological categories of Simmons et al.'s 21-word disclosure statement initiative. Responses are posted to <a href="http://PsychDisclosure.org" rel="nofollow">PsychDisclosure.org</a>.</p> <p><strong>DISCLOSURE QUESTIONS</strong>:<br> For all studies in your recently published article titled [publication title], please endorse the following statements: <em>(please type an X to indicate your answer)</em></p> <ol> <li> <p>We reported the total number of observations which were excluded (if any) and the criterion for doing so. 
(If no observations were excluded, please indicate Yes)<br> Yes: <strong><em>_</em></strong> No: <strong><em>_</em></strong> <br> <strong>If no, please report this information here</strong> (e.g., data from 3 participants in Study 2 excluded due to computer malfunction; 4 participants in Study 1 excluded for not following instructions):</p> </li> <li> <p>We reported all tested experimental conditions, including failed manipulations. Yes: <strong><em>_</em></strong> No: <strong><em>_</em></strong> <br> <strong>If no, please provide a brief explanation for not reporting this information</strong> (e.g., critical software implementation error; editorial request):</p> </li> <li> <p>We reported all administered measures/items. Yes: <strong><em>_</em></strong> No: <strong><em>_</em></strong><br> <strong>If no, please provide a brief explanation for not reporting this information</strong> (e.g., measures not related to research question; scores from unreported measure insufficiently reliable):</p> </li> <li> <p>We reported (a) how we determined our sample size <strong>and</strong> (b) our data collection stopping rule. Yes: <strong><em>_</em></strong> No: <strong><em>_</em></strong><br> <strong>If no, please describe (a) the basis for the sample sizes used and (b) how you decided to stop collecting data</strong> (e.g., decided ahead of time to collect data until minimum sample size achieved and this was followed; sample size determined by power analysis but didn’t achieve it by the end of term):</p> </li> </ol> <p>Our initiative has received appropriate ethics clearance in accordance with APA guidelines. To protect the anonymity of non-respondents, only a randomly determined subset (i.e., half) of the corresponding authors in your journal and issue will be contacted. 
</p> <p>We emphasize that the additional information requested is not intended to question or stigmatize published research, but to give a more accurate picture of the actual methods used to obtain the findings, correcting for artificially rigid standards of evidence in publication. The project is committed to transparency and open science practices (see all project materials below).</p> <p><strong>Appendix</strong><br> Proposed 21-word disclosure statement: “We report how we determined our sample size, all data exclusions (if any), all manipulations, and all measures in the study.” <a href="http://papers.ssrn.com/sol3/papers.cfm?abstract_id=2160588" rel="nofollow">Simmons, J. P., Nelson, L. D., & Simonsohn, U. (2012)</a>. A 21 Word Solution. SPSP Dialogue, 26(2), Fall 2012 issue. </p> <p>Or, alternatively (and the one we prefer), an 18-word disclosure statement: "We report all measures in the study, all manipulations, any data exclusions, and the sample size determination rule."</p> <p><strong>Frequently Asked Questions (FAQ)</strong><br> <strong>Q: Won’t people just lie when providing additional information for the public spreadsheet?</strong><br> A: We think they won’t. We are confident that, in spite of recent high-profile cases, actual data and reporting fraud is very infrequent in psychology. Much more common, and therefore more worrisome, is the extent to which post-hoc reporting decisions based on significance have become acceptable. We think almost all researchers want to play by the rules of the game; we just propose realigning the rules so that they conform to statistical reality and common-sense ideas of honesty.<br> <strong>Q: I’m concerned that if I report the practices we actually used, people will question my research.</strong><br> A: Your article represents a great achievement. It has addressed an interesting question and passed through a tough process of peer review. 
We don’t think supplying additional information will take that achievement away from you. However, we hope you will agree that as scientists, our primary commitment is to the truth – not to any given idea, no matter how personally invested we are in it. Ultimately, truth does not reside in any one study, any one paper, or any one lab, but in the overall body of evidence. More accurate reporting in any one article will allow better assessment of this body of evidence overall.<br> <strong>Q: What exactly is sound reporting practice?</strong><br> A: We have chosen to focus this project on the goal of increasing the amount of information in a research report rather than defining what is good and bad research. However, our basic assumption is that sound practice is principled. That is, it accurately reports any ideas and procedures determined a priori; if these were modified in the course of the research, it reports this fact, along with the reason for doing so. We are not seeking to impose standards that further restrict how people can run studies or analyze their data. We are encouraging standards under which people can feel free to report honestly what they did and why.<br> <strong>Q: If I don’t respond, will my research be judged as dishonest?</strong><br> A: Because only 50% of the authors in your journal and issue have been randomly chosen to take part, there will be no way of identifying you if you choose not to participate. Information regarding who was and was not contacted will be kept strictly confidential as research data. 
Using this selection approach, we are emphasizing the positive benefits of adding information to your article, rather than the negative judgment of not doing so.<br> <strong>Q: Which journals are being targeted and why?</strong><br> A: We are focusing on articles published in all 2012 issues (and onward) of Journal of Personality and Social Psychology, Psychological Science, Journal of Experimental Psychology: Learning, Memory, and Cognition, and Journal of Experimental Psychology: General, because they represent prominent, widely read journals in psychology. However, our effort may eventually expand to other psychology journals and other publication years.<br> <strong>Q: Can I submit disclosure information for articles not covered by your initiative?</strong><br> A: Yes! Please see the Contact Us page for details on how to e-mail us your information.<br> <strong>Q: Won't journals or editors be upset because some of the disclosures will involve information on how the reviewing process itself sometimes leads to selective reporting?</strong><br> A: Yes, it is possible that disclosures revealing questionable editorial practices (QEPs) might irk some journals, given that these practices have now been identified as clearly inadmissible or indefensible. However, we view this possibility as a small cost compared to the much larger benefit that disclosing this information will have for improving research practices in our field.<br> <strong>Q: But why is an independent group of researchers requesting this information? Shouldn't journals be asking for it?</strong><br> A: We couldn't agree more that it is indeed the journals that <strong>SHOULD</strong> be asking for this information rather than an independent group of researchers! (Trust me, I'd definitely rather be doing science than spending hundreds of hours e-mailing hundreds of corresponding authors!) 
The fact of the matter is, however, that journals are <strong>NOT</strong> asking for this information at this time. As a (bottom-up) strategy to move us closer to this reality, our initiative aims to raise awareness of our journals' completely ineffective reporting standards, with the hope that our website inspires journal editors to consider changing editorial policies so that the 4 categories of methodological details disclosed on this website become a required component of submitted manuscripts. Indeed, there is evidence that our website is already having such an effect.<br> <strong>Q: I've been randomly selected to disclose the methodological information and I support your cause, but isn't it unfair that opponents in our intense theoretical debate can now use this information about minor procedural imperfections to advance their cause?</strong><br> A: We sympathize with this concern; however, we believe that it is completely scientifically justified to then turn around and email your opponents requesting the same methodological details (not required to be disclosed) for papers containing findings implicated in the debate! This is not only fair but scientifically reasonable, given that knowing these methodological details is crucial for accurately interpreting the published findings.<br></p> <h2><strong>Project Materials</strong></h2> <ol> <li>Word document of the current email sent to corresponding authors (see Files, Email_template_-_Version8_Current_Version.pdf)</li> <li>Ethics form protocol document (see Files)</li> </ol>