# ATC Interruptions Retention

This repository contains the code and data required to run the analyses reported in _Prospective memory performance in simulated air traffic control: Robust to interruptions but reduced by retention interval_.

Authors: Michael Wilson, Luke Strickland, Simon Farrell, Troy A.W. Visser, & Shayne Loft

## Requirements

Run the following snippet to install the packages required for this project:

```r
requirements <- c("dplyr", "effects", "ggeffects", "ggplot2", "lme4",
                  "lmerTest", "lubridate", "rmarkdown", "scales", "sjPlot",
                  "stringr", "tidyverse", "viridis")
to.install <- requirements[!(requirements %in% installed.packages()[, "Package"])]
if (length(to.install)) install.packages(to.install)
```

## Repository Overview

The project is designed to be opened as an RStudio project; the project file is `atc-interruptions-derde.Rproj`. The directories in the repository contain the following:

- `analysis` - main script directory; run the scripts in their numbered order
- `data` - all the CSVs used for analysing the current data
- `reports` - R Markdown documents, containing most of the in-text analyses
- `R` - utility R functions

## Quick Run Guide

1. Open `reports/deferred_exclusions.Rmd` to inspect the exclusion criteria for the PM task.
2. Open `analysis/02_dataparse_deferred.R` to inspect the deferred handoff models and wrangling.
3. Models associated with the deferred handoff results can be found in `reports/reported_analyses.Rmd`.
4. Open `analysis/03_dataparse_conflicts.R` to inspect how conflict cost was calculated; then view the models in `analysis/20_conflicts_models.R`.
5. Run each of the Rmd documents in `reports` (interactively, if you prefer) to reproduce the results. Some exploratory model comparisons are presented throughout.

## Reproducing Analyses

The repository is relatively straightforward to use, and there shouldn't be anything too quirky about getting it all working. All `source()` calls are relative to the project root.
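To run the scripts non-interactively, something like the following sketch should work (an assumption on my part: it presumes the numbered filename prefixes define a safe execution order and that you are at the project root):

```r
# Sketch: source the analysis scripts in their numbered order from the
# project root. sort() orders by the numeric filename prefixes.
scripts <- sort(list.files("analysis", pattern = "\\.R$", full.names = TRUE))
for (f in scripts) source(f)
```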
- The raw data is large (> 15 GB) and is not available in the repository.
- Main wrangling and data preparation occurs in `analysis/01_main_dataload.R`. Almost all files source this to obtain the data related to the deferred handoff PM task. If you have concerns about exclusions or wrangling, this is the file to check.

**In terms of R Markdown**, it is important to note that all model analyses are reported using two convenience functions, `glmer_report` and `lmer_report`, which are contained in `R/rmarkdown_lme4.R`. In short, these take two `lme4` model objects, perform frequentist model comparisons, and return a list containing the two models and the results. Then, within a knitr document, you can call `rmdprint(report_object)` and it will print the models in APA format.

### Deferred Handoff Task

**Main wrangling**: The file `analysis/01_main_dataload.R` performs a number of exclusions and joins related to the deferred handoff task.

**Exclusions**: All information relating to the data cleaning/exclusions reported at the start of the results section is listed in `reports/deferred_exclusions.Rmd`. This is well documented and justified.

**Modelling**: `analysis/11_defhand_models.R` contains the model definitions reported in the paper (and a few unreported models testing slopes, etc.).

**Rmarkdown/Reproducibility**: `reports/deferred_results_main.Rmd` contains the write-up of the reported lme4 models. However, it became too much of a burden to keep updating this text with edits from other authors; the results are therefore identical to the paper, but the wording is not.

### Routine Acceptance and Handoff

Each of these has a unique file in the `reports` directory (`routine_*`).

## Experimental Description

**Participants.** Undergraduate students (*N* = 64; X female; median age = X years) from the University of Western Australia participated in the study in exchange for partial course credit or $25.
All participants viewed an information sheet prior to giving informed consent. The information sheet and consent form were approved by the UWA Ethics Committee.

**Design.** The study used a 2 x 2 x 2 within-subjects design. There were two interruption conditions: no-interruption (no interruption administered) and ATC-interruption (interrupted by an additional ATC scenario); two retrieval timing conditions (short: handoff flashed immediately after the interruption epoch; long: handoff flashed 50 s after the interruption epoch); and two encoding timing conditions (short: handoff encoded 10 s before the interruption epoch; long: handoff encoded 40 s before the interruption epoch). This resulted in 8 within-subjects conditions.

Participants completed 16 air traffic control scenarios per day, over the course of two days. The scenarios on the second day were the same as on the first day, except that the aircraft names were randomized. Simulation trials lasted 5 min each, with four trials per within-subjects condition. To counterbalance all 32 simulation trials across the eight conditions, two 8 x 8 Latin square schemes were used. Specifically, we first divided the 16 scenarios into eight scenario groups. The ordering of these eight scenario groups was then counterbalanced in an 8 (columns) x 8 (rows) Latin square (see Table 1 for a visual depiction). The columns of the Latin square were used to allocate a condition to each scenario group, while the rows were used to allocate equal-sized groups of participants to a condition-scenario scheme. This resulted in an even allocation of interruption and timing conditions across the 16 simulation trials. The exact order in which the scenarios were presented was randomized for each participant. For the second day, we simply reversed the column order. Consequently, for any given scenario seen on the first day, participants received the opposite interruption condition and timing condition allocation on the second day.
For example, if a participant completed scenario 5 as ATC-SS on the first day, they would receive scenario 5 as No Interruption-LL on the second day.
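The day-2 reversal can be illustrated with a toy snippet. This is only an illustration of the pairing, not the allocation code used in the study, and the condition labels are my own shorthand (interruption x encoding x retrieval):

```r
# Illustrative sketch (assumed labels; not the study's actual code).
# Eight conditions, ordered so that condition i and condition 9 - i
# are exact opposites on all three factors.
conditions <- c("NI-SS", "NI-SL", "NI-LS", "NI-LL",
                "ATC-SS", "ATC-SL", "ATC-LS", "ATC-LL")

# Day 1: one condition per scenario group (one row of the 8 x 8
# Latin square; here the identity ordering for simplicity).
day1 <- conditions[1:8]

# Day 2: reversing the column order swaps condition i with 9 - i,
# giving each scenario group the opposite allocation.
day2 <- conditions[8:1]

# e.g. scenario group 5: ATC-SS on day 1, NI-LL on day 2
```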