A meta-analysis of score differences in ability assessments in proctored and unproctored settings



Category: Project

Description: Unproctored, web-based assessments are frequently compromised by a lack of control over participants' test-taking behavior. Participants are likely to cheat when the personal consequences of the assessment are high. This meta-analysis summarizes findings on context effects in unproctored and proctored ability assessments and examines mean score differences and correlations between both assessment contexts. As potential moderators, we consider (a) the perceived consequences of the assessment, (b) countermeasures against cheating, (c) the susceptibility to cheating of the measure itself, and (d) the use of different test media. For standardized mean differences, a three-level random-effects meta-analysis based on 108 effect sizes from 49 studies (total N = 100,434) identified a pooled effect of Δ = 0.20, 95% CI [0.10, 0.31], indicating higher scores in unproctored assessments. Moderator analyses revealed significantly smaller effects for measures that are difficult to research on the Internet. Regarding rank order stability, a small subsample of studies (n = 5) providing 15 effect sizes (total N = 1,280) indicated considerable rank order changes (ρ = .58, 95% CI [.38, .78]). These results demonstrate that unproctored ability assessments are markedly biased by cheating. Unproctored assessments may be most suitable for measures that are difficult to research on the Internet.

License: CC-By Attribution 4.0 International


