In 2009, John Hattie released Visible Learning, a meta-meta-review that summarized 800 meta-analyses into 138 possible influences on student achievement. All influences were re-coded to a standard metric (Cohen's d) and ranked by effect size, ranging from negative effects (e.g., retention), through little or no effect (e.g., student personality), to strong positive influences on student achievement (e.g., response to intervention). To this day, the general criticism has focused on identifying individual examples of flaws in Hattie's approach, a strategy that proponents of Visible Learning have dismissed as cherry-picking. The purpose of this project is instead to conduct a rigorous, systematic assessment of the presented material. This talk will walk through the syntheses made in Visible Learning and explain how the quality of the material is assessed. For example, previous research indicates that several influences combine meta-analyses that do not share similar populations, interventions, comparison groups, outcomes, and study types (PICOS). The talk will also include a practical demonstration of the code-sheet and of how the influences are coded. The approach provides resources for conducting or assessing any type of meta-review.
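For readers unfamiliar with the standard metric mentioned above, the following is a minimal sketch of how a standardized mean difference (Cohen's d) is computed from two groups' summary statistics using a pooled standard deviation. The function name and example values are purely illustrative and are not taken from the project's code-sheet.

```python
import math

def cohens_d(mean_treatment, mean_control, sd_treatment, sd_control,
             n_treatment, n_control):
    """Standardized mean difference (Cohen's d) with pooled SD."""
    pooled_sd = math.sqrt(
        ((n_treatment - 1) * sd_treatment ** 2 +
         (n_control - 1) * sd_control ** 2) /
        (n_treatment + n_control - 2)
    )
    return (mean_treatment - mean_control) / pooled_sd

# Hypothetical example: a treatment group scoring 0.4 pooled SDs above control.
print(cohens_d(52.0, 48.0, 10.0, 10.0, n_treatment=100, n_control=100))  # ~0.4
```

In a meta-review context, effect sizes reported on other metrics (correlations, odds ratios, etc.) would first be converted to this common scale before influences can be compared and ranked.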