
Category: Project

Description:

1. Researchers are often uncertain about the extent to which it may be acceptable to violate the assumption of normality of errors, which underlies the most frequently used tests for statistical significance (regression, t-test, ANOVA, and linear mixed models with Gaussian error).

2. Here we use Monte Carlo simulations to show that such Gaussian models are remarkably robust to even the most dramatic deviations from normality.

3. We find that P-values are generally reliable if either the dependent variable Y or the predictor X is normally distributed, and that biased P-values occur only if both are heavily skewed (resulting in outliers in both X and Y). In the latter case, judgement of significance at an α-level of 0.05 is still safe unless the sample size is very small. Yet with more stringent significance criteria, as used when conducting numerous tests (e.g. α = 0.0001), there is a greater risk of making erroneous judgements. Parameter estimates are generally unbiased and precise, and there is no drop in power as long as the predictor X is not heavily skewed (resulting in high-leverage observations). Still, parameter estimates may not always be appropriate for interpretation because ceiling and floor restrictions are not accounted for (e.g. extrapolation to negative counts).

4. Overall, we conclude that violating the normality assumption appears to be the lesser of two evils when compared with alternative solutions that are either unable to account for levels of non-independence in the data (most non-parametric tests) or much less robust (e.g. Poisson models, which require control of overdispersion and sometimes sophisticated bootstrap resampling). We argue that the latter may pose a more substantial threat to the reliability of research findings, pragmatically acknowledging that in the majority of research projects statistical expertise is limited.
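The Monte Carlo approach described above can be sketched in a few lines. The following is a minimal illustration, not the authors' actual simulation code: the distributions (exponential as a heavily skewed case), sample size, and number of simulations are assumptions chosen for demonstration. It estimates the type-I error rate of the slope test in a simple linear regression when X and Y are truly independent.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

def type1_error_rate(n=50, n_sim=2000, alpha=0.05,
                     skew_x=False, skew_y=False):
    """Fraction of simulations in which the regression slope test
    rejects H0 at the given alpha, even though X and Y are
    independent (so the true slope is zero)."""
    rejections = 0
    for _ in range(n_sim):
        # Exponential draws stand in for "heavily skewed";
        # normal draws for the well-behaved case.
        x = rng.exponential(size=n) if skew_x else rng.normal(size=n)
        y = rng.exponential(size=n) if skew_y else rng.normal(size=n)
        res = stats.linregress(x, y)
        if res.pvalue < alpha:
            rejections += 1
    return rejections / n_sim

# Compare the nominal case with the "both heavily skewed" case;
# rates close to alpha indicate reliable P-values.
print(type1_error_rate())                          # X, Y normal
print(type1_error_rate(skew_x=True, skew_y=True))  # both skewed
```

Under the abstract's conclusions, one would expect the first rate to sit near the nominal 0.05, and noticeable inflation to appear mainly when both variables are skewed and the sample size is small.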

License: MIT License

