Structured Abstract
-------------------

**Background.** Cyber security user studies have been scrutinized in recent years for their reporting completeness for statistical inferences as well as their statistical reporting fidelity. However, other benchmarks of sound research, such as statistical power, estimates of Positive Predictive Value (PPV), and publication bias, have been largely absent from the meta-research on the field.

**Aim.** We aim to estimate the distribution of power, PPV, and publication bias over an SLR-derived sample of cyber security user studies.

**Method.** Based on an earlier published SLR of 146 cyber security user studies, we will extract correctly reported test triplets (test statistic, degrees of freedom, and $p$-value), the overall study sample sizes, and the group sizes of statistical tests, in addition to test families and multiple-comparison corrections. From these data we will compute effect sizes for parametric comparisons between conditions in the form of $t$-tests, $\chi^2$-tests, or one-way $F$-tests. We will convert all such effect sizes into Standardized Mean Differences (SMD, Hedges' $g$) for comparisons across studies. Based on these post-hoc effect size estimates, we will estimate confidence intervals as well as funnel plots for the estimation of publication bias. Furthermore, we will evaluate detection sensitivity, statistical power, and PPV in the face of parametrized *a priori* effect size thresholds.

**Anticipated Results.** While we expect, based on earlier results, that the sample will only partially yield usable effect size estimates (and thereby estimates for further benchmarks), we anticipate that the results will offer a wealth of data characterizing the field.

**Anticipated Conclusions.** We anticipate that the benchmarks provided will offer an empirical evidence base to inform the community how we are doing and substantiate recommendations on how to advance the field.
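The two core conversions in the method can be sketched as follows. This is a minimal illustration, assuming equal-variance independent-samples $t$-tests; the function names and the example numbers (a triplet $t(38) = 2.5$ with two groups of 20, and pre-study odds $R = 0.25$) are hypothetical, not drawn from the SLR sample. Cohen's $d$ is recovered from the $t$ statistic and group sizes, Hedges' $g$ applies the standard small-sample correction, and PPV follows the usual formula from power, $\alpha$, and pre-study odds.

```python
import math

def hedges_g(t, n1, n2):
    """Convert an independent-samples t statistic to Hedges' g.

    First recovers Cohen's d from the t statistic and the group
    sizes, then applies the small-sample correction factor J.
    """
    d = t * math.sqrt(1.0 / n1 + 1.0 / n2)    # Cohen's d from the triplet
    df = n1 + n2 - 2
    j = 1.0 - 3.0 / (4.0 * df - 1.0)          # small-sample correction J
    return j * d

def ppv(power, alpha, r):
    """Positive Predictive Value: ((1 - beta) * R) / ((1 - beta) * R + alpha),
    where R is the pre-study odds of a true effect."""
    return (power * r) / (power * r + alpha)

# Hypothetical triplet t(38) = 2.5 with two groups of 20 participants:
g = hedges_g(2.5, 20, 20)      # roughly 0.77

# Hypothetical scenario: 80% power, alpha = .05, pre-study odds R = 0.25:
p = ppv(0.80, 0.05, 0.25)
```

For $\chi^2$ and one-way $F$ comparisons the same idea applies with the appropriate effect-size conversion in place of the $t$-based one, so that all estimates land on the common SMD scale before the funnel-plot and power analyses.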