Link to email list of those interested in and working on this project: https://docs.google.com/document/d/1HNHRJIn4X1EZbGUzSmN0iXAzuHgc_eMOyWLLuN2NN0E/edit?usp=sharing

Link to Quality Assessment for Meta-Analysis/Systematic Review: https://docs.google.com/document/d/15tJ60Ha5vfza3xXVyanKXUmI3cJITLfIl2tHJrwiTnQ/edit?usp=sharing

Link to Pre-Registration Draft (NAC, June 29, 2018): https://docs.google.com/document/d/1SqvBzmd2YSmFuFQY0sB7bvq3YcUlsv3waSseMAB44IA/edit?usp=sharing

Unstructured notes (KSC, June 25, 2018)

- Link to the pre-registration of the ultimately published meta-analysis by Brandt et al. on the relationship of brightness to recall of moral behavior: https://osf.io/er8ki/
- One idea is to form a consortium for psychology similar to Cochrane and Campbell.
- Cochrane runs a citizen-science service, Cochrane Crowd (crowd.cochrane.org).
- ZPID is running an open-access platform that will do registered reports for meta-analyses (https://leibniz-psychology.org/en/); they will provide a generic structure for publishing Stage 1 protocols.
- Develop standards for registered-report protocols for meta-analyses, and standards for reviewing them.
- PROSPERO protocols (https://www.crd.york.ac.uk/prospero/)
- Quality standards need to be articulated (we have PRISMA and MARS).
- We need a list of best practices as a resource for people. John Sakaluk has a syllabus document that we crowdsourced.
- Reporting standards are badly needed. Per Greg Webster, 51 meta-analyses have been published in PSPR, and their quality is very low.
- There is a need for a risk-of-bias or critical-appraisal tool for psychology. The Study DIAD (Valentine & Cooper, 2008) exists for experiments, but it is cumbersome to use, and we need extensions for correlational or observational studies.
- Here is the syllabus John Sakaluk mentioned: https://docs.google.com/document/d/1oImg-ojUFqak5KyZ-ETD2qGvkvUgx8Ym6b8gG4GwfM8/edit#
- Evidence gap maps / mapping reviews (http://gapmaps.3ieimpact.org/evidence-maps/intimate-partner-violence-prevention-evidence-gap-map)
- What do we do about weak reporting in primary studies? Complex designs also make it difficult to choose the appropriate contrast.
- More specificity in inclusion/exclusion criteria would be beneficial, perhaps with attention to studies that have primary data available.
- Could a consortium that reviews Stage 1 protocols work as a service for journals?

Existing Guidelines

- Conduct standards: Cochrane MECIR (http://methods.cochrane.org/sites/default/files/public/uploads/mecir_printed_booklet_final_v1.02.pdf)
- Reporting standards: PRISMA / MARS
- Search quality review checklist: PRESS (https://www.cadth.ca/resources/finding-evidence/press)

Unstructured notes (KSC, June 26, 2018)

- A consortium could develop standards and practices similar to a registered-reports model.
- Searches in psychology are often incomplete.
- Health researchers use PROSPERO to log their Stage 1 protocols; you check PROSPERO before you commence your project.
- EVIDENT framework: a paper about different methods of research synthesis.
- A policy or position paper could be framed as best practices in meta-analysis and systematic review.
- We could use a template or guide for registered reports at journals in psychology.
- We could use guidance on interpreting effect sizes; this would be a best practice.
- Rayyan QCRI: a dual-coding screening system (see also the metagear R package).
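The dual-coding workflow in the last note (two screeners independently code each record, as in Rayyan or metagear) is typically audited with an inter-rater agreement statistic before disagreements are reconciled. A minimal sketch in Python computing Cohen's kappa over binary include/exclude decisions; the function name and screener data are illustrative, not taken from either tool:

```python
def cohens_kappa(coder_a, coder_b):
    """Cohen's kappa for two coders' categorical decisions on the same items."""
    assert len(coder_a) == len(coder_b) and coder_a
    n = len(coder_a)
    # Observed agreement: proportion of items where the two coders agree.
    observed = sum(a == b for a, b in zip(coder_a, coder_b)) / n
    # Chance agreement expected from each coder's marginal label rates.
    labels = set(coder_a) | set(coder_b)
    expected = sum(
        (coder_a.count(lab) / n) * (coder_b.count(lab) / n) for lab in labels
    )
    return (observed - expected) / (1 - expected)

# Hypothetical include/exclude decisions (1 = include) from two screeners
# on ten abstracts.
screener_1 = [1, 1, 0, 0, 1, 0, 0, 1, 1, 0]
screener_2 = [1, 0, 0, 0, 1, 0, 1, 1, 1, 0]
print(round(cohens_kappa(screener_1, screener_2), 3))  # prints 0.6
```

Here the screeners agree on 8 of 10 abstracts (0.8 observed) against 0.5 expected by chance, giving kappa = 0.6; items they disagree on would go to discussion or a third coder.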