### **Rationale behind QA tools**

An overview of bias types can be found [here][1]. Two tools for Risk of Bias (RoB) / Quality Assessment (QA) were used in TRACE, based on the types of primary studies included in this meta-analysis.

---

#### **I. Primary study types in TRACE**

Experimental papers that compared learning/memory in PTSD patients (for clinical studies) or models (for preclinical studies) to controls were included in this meta-analysis. As a result, different types of primary study are present in TRACE:

* The included **clinical studies** are observational, *case-control* studies that compare learning/memory performance between cases (PTSD patients) and (healthy or trauma-exposed) controls.
* Due to the nature of PTSD models in animals (i.e., controlled exposure to a traumatic experience), the included **preclinical studies** can be considered *experimental, controlled trials* that compare the effect of 'the PTSD model' on learning/memory performance.

For an overview of these study types, see [Song et al., 2010][2].

---

#### **II. RoB/QA tools used in TRACE**

##### *Preclinical studies:*

The RoB in animal studies was assessed using **SYRCLE's risk of bias tool** for animal studies ([Hooijmans et al., 2014][3]). This tool was specifically developed to measure selection bias, performance bias, detection bias, attrition bias, and reporting bias in animal intervention studies. Each item is scored "yes" (low RoB), "no" (high RoB), or "unclear". Two extra criteria were added for QA (a coding sketch follows this list):

1. validated PTSD model? (based on [Borghans & Homberg, 2015][4] and [Flandreau & Toth, 2018][5])
2. (validated) behavioral measure of learning/memory?
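Since the scoring rules above are concrete (fixed items, three allowed scores), here is a minimal Python sketch of how a per-study SYRCLE-style assessment with TRACE's two extra QA criteria could be recorded. All names are illustrative assumptions, not TRACE's actual coding sheet (for that, see the QA template on OSF under section III below).

```python
# Minimal sketch: recording a SYRCLE-style RoB/QA assessment per study.
# Item names and the class itself are illustrative, not TRACE's template.
from dataclasses import dataclass, field

# The five bias domains named above (Hooijmans et al., 2014),
# plus TRACE's two extra QA criteria.
SYRCLE_DOMAINS = [
    "selection_bias",
    "performance_bias",
    "detection_bias",
    "attrition_bias",
    "reporting_bias",
]
EXTRA_QA_CRITERIA = ["validated_ptsd_model", "validated_learning_memory_measure"]
VALID_SCORES = {"yes", "no", "unclear"}  # yes = low RoB, no = high RoB


@dataclass
class PreclinicalRoB:
    study_id: str
    scores: dict = field(default_factory=dict)  # item -> "yes"/"no"/"unclear"

    def rate(self, item: str, score: str) -> None:
        if item not in SYRCLE_DOMAINS + EXTRA_QA_CRITERIA:
            raise ValueError(f"unknown RoB/QA item: {item}")
        if score not in VALID_SCORES:
            raise ValueError(f"score must be one of {VALID_SCORES}")
        self.scores[item] = score

    def high_rob_items(self) -> list:
        """Items explicitly rated as high risk of bias ('no')."""
        return [item for item, s in self.scores.items() if s == "no"]


# Usage with a hypothetical study:
rob = PreclinicalRoB(study_id="example_study_2015")
rob.rate("selection_bias", "unclear")
rob.rate("validated_ptsd_model", "yes")
print(rob.high_rob_items())  # -> []
```

Restricting scores to the three SYRCLE values at entry time keeps the coded data consistent across raters, which matters when assessments are later aggregated.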
##### *Clinical studies:*

According to the *GRADE guidelines*, there are over 200 tools to measure the quality of observational studies ([Guyatt et al., 2011][6]). They summarize the key criteria of these tools (to assess the limitations of observational studies) in [Table 2][7]:

1. Failure to develop and apply appropriate eligibility criteria (inclusion of control population)
   - Under- or overmatching in case-control studies
   - Selection of exposed and unexposed in cohort studies from different populations
2. Flawed measurement of both exposure and outcome
   - Differences in measurement of exposure (e.g., recall bias in case-control studies)
   - Differential surveillance for outcome in exposed and unexposed in cohort studies
3. Failure to adequately control confounding
   - Failure of accurate measurement of all known prognostic factors
   - Failure to match for prognostic factors and/or lack of adjustment in statistical analysis
4. Incomplete follow-up

For practical reasons, we selected one of these tools: the **Newcastle-Ottawa Scale (NOS)** ([Wells et al., 2019][8], accessed 11.12.19). This tool covers the criteria above and, together with ROBINS-I, is the most used RoB/QA tool for assessing the quality of non-randomized studies (in humans) in meta-analyses ([Farrah et al., 2019][9]). Note that ROBINS-I (and the derived ROBINS-E) was not selected, as it is suboptimal for observational studies ([Bero et al., 2018][10]).

The NOS has two versions: case-control and cohort (for a critical review of the NOS cohort version, see [Stang, 2010][11]). We adapted the case-control version into a [RoB/QA tool for the clinical data in TRACE][12]. Note that the NOS has been adapted for specific projects before (for an example, see [Wang et al., 2019][13]).

---

#### **III. RoB/QA data in TRACE**

TRACE's [template for QA](https://osf.io/qceum) and [QA data](https://osf.io/tckb5) are available on OSF.

[1]: https://catalogofbias.org/biases/
[2]: https://journals.lww.com/plasreconsurg/fulltext/2010/12000/Observational_Studies__Cohort_and_Case_Control.58.aspx
[3]: https://doi.org/10.1186/1471-2288-14-43
[4]: https://www.wjgnet.com/2220-3206/full/v5/i4/387.htm
[5]: https://doi.org/10.1007/7854_2016_65
[6]: https://doi.org/10.1016/j.jclinepi.2010.07.017
[7]: https://doi.org/10.1016/j.jclinepi.2010.07.017
[8]: http://www.ohri.ca/programs/clinical_epidemiology/oxford.asp
[9]: https://doi.org/10.1186/s13643-019-1172-8
[10]: https://doi.org/10.1186/s13643-018-0915-2
[11]: https://link.springer.com/article/10.1007%2Fs10654-010-9491-z#Sec1
[12]: https://osf.io/pnfhj/
[13]: https://bmcpsychiatry.biomedcentral.com/articles/10.1186/s12888-019-2302-5
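As a convenience for working with the files linked under section III: OSF generally exposes a direct-download route by appending `/download` to a file GUID URL. The sketch below relies on that convention; the local filenames are arbitrary choices of ours, and the files' format (CSV vs. spreadsheet) is not stated on this page, so inspect the downloads before parsing them.

```python
# Sketch: fetch TRACE's QA template and QA data from OSF.
# Assumes OSF's standard "/download" route for file GUIDs; the output
# filenames are placeholders (the real file extensions depend on what
# was uploaded, so check on OSF before parsing).
import urllib.request

FILES = [
    ("https://osf.io/qceum/download", "trace_qa_template"),
    ("https://osf.io/tckb5/download", "trace_qa_data"),
]

for url, filename in FILES:
    with urllib.request.urlopen(url) as response:
        payload = response.read()
    with open(filename, "wb") as fh:  # binary write; extension unknown here
        fh.write(payload)
    print(f"saved {filename} ({len(payload)} bytes)")
```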