Reading lists

Wanting to start a journal club, but don't have time to find readings? Why not borrow someone else's reading list?

Here is a collection of readings on open science gathered from around the web; let us know if you find more!

We are in the process of turning this into a [Zotero library here](https://www.zotero.org/groups/2336553/anzorn).

## Existing reading lists

- [Srivastava (2016) - Everything is fucked](https://thehardestscience.com/2016/08/11/everything-is-fucked-the-syllabus/)
- [David Mellor's "Open Science in the Literature"](https://osf.io/kgnva/wiki/Open%20Science%20Literature/)
- ["Replication Concerns are not new"](https://morn.netlify.com/replconcernsnotnew)
- [How to become a Bayesian in eight easy steps](https://alexanderetz.com/2016/02/07/understanding-bayes-how-to-become-a-bayesian-in-eight-easy-steps/)
- [Eight Easy Steps to Open Science: An Annotated Reading List](https://psyarxiv.com/cfzyx)
- [Brent Roberts and Dan Simons' exhaustive reading list](https://docs.google.com/document/d/14lBD0aZDPij2Z6AOpAharOAtmt6ZBI0EuF3_tu8m66I/)
- [Reproducibility Bibliography](https://reproducibility.dash.umn.edu/)
- [The Global Benefits of Open Research](https://www.mdpi.com/books/pdfview/edition/914)

### Open Science

- [A Short (Personal) Future History of Revolution 2.0](https://journals.sagepub.com/doi/full/10.1177/1745691615609918) - recommended "intro to open science" article
- [Practical Tools and Strategies for Researchers to Increase Replicability](https://psyarxiv.com/emyux/) - how to get started with open research practices
- [Open science: What, Why, and how?](https://psyarxiv.com/ak6jr/)
- [Implications of the Credibility Revolution for Productivity, Creativity, and Progress](https://psyarxiv.com/2yphf)
- [Open Science Is Liberating and Can Foster Creativity](http://journals.sagepub.com/doi/abs/10.1177/1745691618767878)
- [How open science helps researchers succeed](https://cdn.elifesciences.org/articles/16800/elife-16800-v1.pdf) - what open research practices can do for you (cynically)
- [Open Science challenges, benefits and tips in early career and beyond](https://psyarxiv.com/3czyt/)
- [The new statistics: why and how](https://www.ncbi.nlm.nih.gov/pubmed/24220629)
- [The Tao of open science for ecology](https://esajournals.onlinelibrary.wiley.com/doi/abs/10.1890/ES14-00402.1)
- [Open Education Science](http://journals.sagepub.com/doi/full/10.1177/2332858418787466)
- [PsychDisclosure.org: Grassroots support for reforming reporting standards in psychology](https://etiennelebel.com/documents/lbghprs(2013,pps).pdf)
- [Enhancing transparency of the research process to increase accuracy of findings: A guide for relationship researchers](https://etiennelebel.com/documents/cl&l(2014,pr).pdf)
- [A new replication norm for psychology](https://etiennelebel.com/documents/l(2015,collabra).pdf)
- [Psychological and institutional obstacles toward more transparent reporting of psychological science](https://etiennelebel.com/documents/l&j(inpress,chapter).pdf)
- [Benefits of open and high-powered research outweigh costs](https://etiennelebel.com/documents/lcl(2017,jpsp).pdf)
- [Reproducibility of Scientific Results](https://plato.stanford.edu/entries/scientific-reproducibility/)
- [A Model-Centric Analysis of Openness, Replication, and Reproducibility](https://arxiv.org/abs/1811.04525) - complex, but addresses the relationship between openness and reproducibility in great detail
- [Discovery of truth is not implied by reproducibility but facilitated by innovation and epistemic diversity in a model-centric framework](https://arxiv.org/abs/1803.10118)
- [Sound Inference in Complicated Research: A Multi-Strategy Approach](https://psyarxiv.com/bwr48)
- [Reproducibility and Replicability in Science](https://doi.org/10.17226/25303) by the National Academies of Sciences, Engineering, and Medicine

### p-values and statistical significance

- [Redefine statistical significance](https://www.nature.com/articles/s41562-017-0189-z)
- [Justify Your Alpha](https://psyarxiv.com/9s3y6/)
- [Redefine or Justify? Comments on the alpha debate](https://psyarxiv.com/rbm8y)
- [Remove, rather than redefine, statistical significance](https://www.researchgate.net/publication/320021624_Remove_rather_than_redefine_statistical_significance)
- [Scientists rise up against statistical significance](https://www.nature.com/articles/d41586-019-00857-9)
- [Is the call to abandon p-values the red herring of the replicability crisis?](https://www.frontiersin.org/articles/10.3389/fpsyg.2015.00245/full)

### Theory testing

- [What Social Scientists Don't Understand](http://meehl.umn.edu/sites/g/files/pua1696/f/128socscientistsdontunderstand.pdf)

### False-positive inflation

- [Measuring the prevalence of questionable research practices with incentives for truth telling](http://journals.sagepub.com/doi/abs/10.1177/0956797611430953)
- [Researcher Requests for Inappropriate Analysis and Reporting: A U.S. Survey of Consulting Biostatisticians](http://annals.org/aim/article-abstract/2706170/researcher-requests-inappropriate-analysis-reporting-u-s-survey-consulting-biostatisticians)
- [False-positive psychology: Undisclosed flexibility in data collection and analysis allows presenting anything as significant](http://journals.sagepub.com/doi/pdf/10.1177/0956797611417632)
- [Fearing the future of empirical psychology: Bem's (2011) evidence of psi as a case study of deficiencies in modal research practice](https://etiennelebel.com/documents/l&p(2011,rgp).pdf)

### Replicability and Credibility

- [The Turing Way: a how-to guide to reproducible data science](https://github.com/alan-turing-institute/the-turing-way#about-the-project)
- [Minimizing Mistakes in Psychological Science](https://psyarxiv.com/gxcy5)
- [The Economics of Reproducibility in Preclinical Research](https://journals.plos.org/plosbiology/article?id=10.1371/journal.pbio.1002165)
- [Reproducibility in science: improving the standard for basic and preclinical research](https://www.ncbi.nlm.nih.gov/pubmed/25552691)
- [Estimating the reproducibility of psychological science](http://science.sciencemag.org/content/349/6251/aac4716)
- [Replication in Empirical Economics: The Journal of Money, Credit and Banking Project (1986)](https://www.jstor.org/stable/1806061)
- [A unified framework to quantify the credibility of scientific findings](https://etiennelebel.com/documents/lebeletal(2018,ampss)a-unified-framework-to-quantify-the-credibility-of-scientific-findings.pdf)
- [Raise standards for preclinical cancer research](https://www.nature.com/articles/483531a)

## Publication reforms

- [Exploratory reports: A new article type for Cortex](https://www.sciencedirect.com/science/article/pii/S0010945217302393)
- [A practical guide for transparency in psychological science](https://www.collabra.org/articles/10.1525/collabra.158/)

### Preregistration

- [Bold conjectures meet open science: Accelerating theory development in social psychology by testing riskier predictions via pre-registration](https://etiennelebel.com/documents/l&c(2015,abstract).pdf)
- [The preregistration revolution](https://www.pnas.org/content/115/11/2600)
- [An Agenda for Purely Confirmatory Research](https://journals.sagepub.com/doi/full/10.1177/1745691612463078)
- [Preregistration is redundant, at best](https://psyarxiv.com/x36pz) - a contrarian but thoughtful discussion of the role of preregistration in science

### Open Materials

- [Practical Solutions for Sharing Data and Materials From Psychological Research](http://journals.sagepub.com/doi/pdf/10.1177/2515245917746500)

### Open Data

- [Practical Tips for Ethical Data Sharing](https://journals.sagepub.com/doi/full/10.1177/2515245917747656)
- [Nine simple ways to make it easier to (re)use your data](https://ojs.library.queensu.ca/index.php/IEE/article/view/4608)
- [Data Organization in Spreadsheets](https://www.tandfonline.com/doi/abs/10.1080/00031305.2017.1375989)
- [Data Authorship as an Incentive to Data Sharing](https://www.nejm.org/doi/full/10.1056/NEJMsb1616595)
- [How to share data for collaboration](https://peerj.com/preprints/3139/)
- [Responsible practices for data sharing](https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5967383/)
- [Badges for sharing data and code at Biostatistics: an observational study](https://f1000research.com/articles/7-90/v1)
- [What incentives increase data sharing in health and medical research? A systematic review](https://researchintegrityjournal.biomedcentral.com/articles/10.1186/s41073-017-0028-9)
- [The FAIR Guiding Principles for scientific data management and stewardship](https://www.nature.com/articles/sdata201618)
- [Using OSF to Share Data: A Step-by-Step Guide](https://journals.sagepub.com/doi/abs/10.1177/2515245918757689)

### Open Publication

- [The open access advantage considering citation, article usage and social media attention](https://link.springer.com/article/10.1007/s11192-015-1547-0)

### Open Educational Resources

- [Models for Sustainable Open Educational Resources](https://www.learntechlib.org/p/44796/)

## Research Process Changes

- [Striving for transparent and credible research: practical guidelines for behavioral ecologists](https://academic.oup.com/beheco/article/28/2/348/3069145)
- [Robust modeling in cognitive science](https://psyarxiv.com/dmfhk/)
- [A Quick Guide to Organizing Computational Biology Projects](https://journals.plos.org/ploscompbiol/article?id=10.1371/journal.pcbi.1000424)

### Computational Reproducibility

- [Good enough practices in scientific computing](https://journals.plos.org/ploscompbiol/article?id=10.1371/journal.pcbi.1005510)
- [Best practices in computational science](https://openresearchsoftware.metajnl.com/articles/10.5334/jors.ay/)
- [What is missing from descriptions of treatment in trials and reviews?](https://www.bmj.com/content/336/7659/1472)
- [An empirical analysis of journal policy effectiveness for computational reproducibility](http://www.pnas.org/content/115/11/2584)
- [Packaging data analytical work reproducibly using R (and friends)](https://peerj.com/preprints/3192/)
- [How rOpenSci uses code review to promote reproducible science](https://ropensci.org/blog/2017/09/01/nf-softwarereview/)
- [An introduction to Docker for reproducible research, with examples from the R environment](https://arxiv.org/abs/1410.0846)

### Version Control

- [Git can facilitate greater reproducibility and increased transparency in science](https://scfbm.biomedcentral.com/articles/10.1186/1751-0473-8-7)
- [Excuse me, do you have a moment to talk about version control?](https://peerj.com/preprints/3159/)

### Open Source

- [Building Software, Building Community: Lessons from the rOpenSci Project](https://openresearchsoftware.metajnl.com/articles/10.5334/jors.bu/print/)
- [Our path to better science in less time using open data science tools](https://www.nature.com/articles/s41559-017-0160)

### Teaching Open Practices

- [Teaching Replication](https://journals.sagepub.com/doi/10.1177/1745691612460686)

### Examples of practice

- [Civic Laboratory for Environmental Action Research (CLEAR)](https://civiclaboratory.nl/)
- [Onboarding in the Language and Cognition Lab](http://babieslearninglanguage.blogspot.com/2017/01/onboarding.html)