Epistemic concepts of statistical tools such as p-values are complex. We investigate educational researchers' interpretations and conclusions when a p-value exceeds the a priori defined cut-off, commonly p > .05, referred to as non-significant. A non-significant p-value does not license any conclusion, yet researchers in various fields often make the mistake of interpreting non-significant p-values as indicating the absence of an effect. A second common misinterpretation arises when one parameter estimate yields p > .05 while another yields p < .05: researchers conclude that the two parameters differ from each other without conducting an appropriate statistical test. We examined the frequency of these two common misinterpretations of p-values in recent research in educational psychology, focusing on researchers' suggestions regarding educational theory, practical implications, and policy implications based on these misinterpretations. We reviewed 30 articles randomly sampled from the 2017 volumes of three educational psychology journals and found that misinterpretations of p-values > .05, and the incorrect implications inferred from them, are common in this field. We identify researchers' misconceptions of p-values in two distinct cases and discuss how to improve the application and interpretation of p-values. This includes a constructive rather than critical explanation of why the reviewed misinterpretations can lead to adverse outcomes, and a discussion of the role of power analysis in educational psychology research. Importantly, this study shows that correctly interpreting p-values is not mere statistical nitpicking; rather, it examines the extent to which misleading conclusions arising from these misinterpretations can have adverse consequences for educational theory and policy. The digital poster format will allow us to provide an interactive Shiny visualization illustrating why these misinterpretations of p-values exist and why they matter.
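The second misinterpretation described above (treating a difference in significance as a significant difference) can be illustrated numerically. The following sketch uses hypothetical effect estimates and standard errors, not data from the reviewed studies: two estimates of the same effect, one of which crosses the p < .05 threshold while the other does not, even though a direct test shows no evidence that the two estimates differ.

```python
import math

def two_sided_p(z):
    """Two-sided p-value for a z statistic under the standard normal distribution."""
    return math.erfc(abs(z) / math.sqrt(2))

# Hypothetical effect estimates and standard errors (illustrative only)
est_a, se_a = 0.20, 0.10   # effect estimate in hypothetical study arm A
est_b, se_b = 0.15, 0.10   # the same effect estimated in hypothetical arm B

p_a = two_sided_p(est_a / se_a)   # about .046 -> labeled "significant"
p_b = two_sided_p(est_b / se_b)   # about .134 -> labeled "non-significant"

# The appropriate comparison tests the difference between the two estimates
se_diff = math.sqrt(se_a**2 + se_b**2)
p_diff = two_sided_p((est_a - est_b) / se_diff)  # about .72 -> no evidence of a difference

print(f"p_a = {p_a:.3f}, p_b = {p_b:.3f}, p_diff = {p_diff:.3f}")
```

Although arm A is "significant" and arm B is not, the test of their difference is far from significant, so concluding that the two parameters differ would be exactly the misinterpretation the abstract describes.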