**Rationale**

Advancements in virtual reality (VR) technology have led to a variety of exciting implementations in training programs. VR is already showing great promise in training drivers (Cox et al., 2010) and surgeons (Aïm et al., 2016), and more recently it has been considered as a training tool in aviation. A crucial skill needed by drivers, surgeons, and pilots alike is the ability to distribute their peripheral attention: to work effectively in these environments, one must be able to monitor multiple peripheral cues at once. However, there has been little research exploring how this distributed peripheral attention (also referred to as perceptual span) differs in VR. The use of this new technology is certainly exciting; however, it is imperative to ensure that VR actually enhances training. To date, very little research has compared how people respond in VR with how they respond in traditional, computer-based environments. In particular, little is known about how perceptual span differs across these modalities.

----------

**The Current Study**

This experiment investigates whether there are significant differences in distributed visual attention (perceptual span) in VR, as opposed to the more traditional training medium of a computer screen. A dual-attention task has been developed based on an existing, computer-based paradigm (Ryan, Keane & Wallis, 2019). By administering this task both in VR and on a computer screen, we hope to identify any differences in perceptual span across the two modalities.

People with schizophrenia show various impairments in how they process visual information, including deficits in visual attention (Fuller et al., 2006). The personality dimensions underlying schizophrenia can also be observed in non-clinical populations as schizotypy, and individuals who exhibit more schizotypal traits perform significantly worse on perceptual (Ettinger et al., 2015) and attentional tasks (Lenzenweger, Cornblatt & Putnick, 1991). Therefore, if participants higher in trait schizotypy perform significantly worse on our dual-attention task, this would provide evidence for its validity.

----------

**Update - January 2023**

During pilot testing (undertaken in December 2022), we discovered that the task was too difficult for some participants and that there was not enough time for our staircase procedure to converge on an accurate threshold. To rectify this, trials have been made longer, and the experiment has been split into two parts to accommodate the increased duration. In part one, fifty participants will complete the VR version of the task alongside the schizotypy questionnaire within a one-hour time slot. In part two, ten participants will complete both the VR and the computer versions of the task within a one-and-a-half-hour time slot. This will decrease overall testing time while still allowing us to perform the relevant regression and ANOVA analyses.

----------

**Materials + Methods**

This experiment is based on the paradigm of Ryan, Keane & Wallis (2019). The stimuli (shown below) are based loosely on instruments in the cockpit of an airplane. Participants must respond as soon as a white bar moves out of the grey target zone and into the black bar. The white bars move pseudo-randomly, driven by a sum-of-sines formula (a simplified sketch is given below). Eye tracking is used to ensure participants fixate on the middle of the screen, confirming that they are indeed distributing their attention. Stimuli were scaled using the M-scaling formula developed by Rovamo and Virsu (1979), which allows participants to see the stimuli clearly even at large eccentricities, ensuring that we measure perceptual span rather than visual ability alone. Full calculations will be available in the project files.
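The following is a minimal, illustrative Python sketch (not the actual Unity or MATLAB experiment code) of the two stimulus computations described above: pseudo-random bar motion generated from a sum of sines, and M-scaling of stimulus size with eccentricity. All frequencies, amplitudes, and sampling values below are placeholders, and the magnification coefficients are the commonly quoted nasal-field values attributed to Rovamo and Virsu (1979); the parameters actually used are documented in the project files.

```python
import numpy as np

def sum_of_sines(t, freqs_hz, amps, phases):
    """Bar displacement at time(s) t as a sum of sinusoids."""
    t = np.atleast_1d(t)
    return sum(a * np.sin(2 * np.pi * f * t + p)
               for f, a, p in zip(freqs_hz, amps, phases))

def m_scaled_size(base_size_deg, eccentricity_deg, a=0.33, b=0.00007):
    """Enlarge a foveal stimulus so its cortical projection stays roughly
    constant at eccentricity E, assuming a magnification function of the
    form M(E) = M0 / (1 + a*E + b*E**3). The default a and b are the
    nasal-field coefficients often quoted from Rovamo & Virsu (1979) and
    are placeholders here, not the experiment's exact values."""
    return base_size_deg * (1 + a * eccentricity_deg + b * eccentricity_deg ** 3)

# Example: a 30 s trajectory sampled at 60 Hz, built from three
# non-harmonic sinusoids so the motion looks unpredictable.
rng = np.random.default_rng(0)
freqs = [0.11, 0.23, 0.37]                  # Hz (placeholder values)
amps = [1.0, 0.6, 0.4]                      # arbitrary position units
phases = rng.uniform(0, 2 * np.pi, size=3)
t = np.arange(0, 30, 1 / 60)
bar_position = sum_of_sines(t, freqs, amps, phases)

# Example: an instrument drawn 20 degrees from fixation is enlarged
# relative to its 2-degree foveal size.
print(m_scaled_size(base_size_deg=2.0, eccentricity_deg=20.0))
```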
Trait schizotypy will be measured using the Schizotypal Personality Questionnaire (SPQ; Raine, 1991). The virtual reality version of the experiment runs on the Vive Pro Eye headset and was coded in Unity 2019 using SRanipal eye tracking; the computer version was coded in MATLAB and Simulink. We aim to recruit 55 participants across parts one and two of the experiment.

----------

**Dependent Variables**

There are three dependent variables in this experiment: scores on the Schizotypal Personality Questionnaire (SPQ), the number of correct responses, and reaction time (ms). Scores on the SPQ will only be analysed in part one of the experiment.

----------

**Hypotheses**

*Hypothesis 1*

We predict a significant difference in reaction times between the VR and computer versions of the experiment, with faster responses on the computer. This hypothesis is based on the theory that stimuli are processed differently in a virtual reality environment. Because binocular depth cues associated with the dorsal visual pathway (such as vergence and accommodation) conflict in VR environments, observers may rely more on monocular depth cues (such as shadow and texture) when making depth judgements. As the dorsal pathway is associated with action-oriented movements, this reduced reliance on binocular cues has been theorized to lead to slower reaction times in VR (Harris, Buckingham, Wilson & Vine, 2019).

*Hypothesis 2*

We predict no significant difference in the number of correct responses between the VR and computer-based versions of the experiment.

*Hypothesis 3*

We predict that task performance will be predicted by schizotypy scores, whereby higher schizotypy scores are associated with worse performance.

----------

**Analysis Plan**

The following analysis plan is tentative and based on our current predictions; if the actual analyses deviate from it, this section will be updated. To test Hypotheses 1 and 2, two within-subjects ANOVAs will be conducted to assess differences in reaction time and correct responses between the computer and VR versions of the task. To assess Hypothesis 3, two hierarchical multiple regressions will be conducted to examine the relationship between schizotypy scores and task performance: the first will regress reaction time on schizotypy scores, and the second will regress correct responses on schizotypy scores. An exploratory analysis may examine whether schizotypy moderates any effect of task modality (VR vs. computer). A sketch of these analyses is given below.
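Below is an illustrative sketch of these analyses in Python with statsmodels, not the final analysis script. It assumes hypothetical summary files and column names (participant, modality, rt_ms, n_correct, spq_total), and only the SPQ step of each hierarchical regression is shown, since the covariate blocks are not specified here.

```python
import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.stats.anova import AnovaRM

# Hypothetical summary files: one row per participant (part one) or per
# participant x modality (part two). File and column names are placeholders.
part1 = pd.read_csv("part1_summary.csv")
part2 = pd.read_csv("part2_summary.csv")

# Hypotheses 1 and 2: within-subjects ANOVAs with modality (VR vs PC)
# as the repeated factor.
rt_anova = AnovaRM(part2, depvar="rt_ms", subject="participant",
                   within=["modality"]).fit()
acc_anova = AnovaRM(part2, depvar="n_correct", subject="participant",
                    within=["modality"]).fit()
print(rt_anova.anova_table)
print(acc_anova.anova_table)

# Hypothesis 3: regress reaction time and correct responses on SPQ scores.
# Earlier covariate blocks of the hierarchical models would be added as
# extra terms in the formulas.
rt_model = smf.ols("rt_ms ~ spq_total", data=part1).fit()
acc_model = smf.ols("n_correct ~ spq_total", data=part1).fit()
print(rt_model.summary())
print(acc_model.summary())
```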
----------

**Ethics**

This project has ethics approval from the University of Queensland, Project ID: 2022/HE001599.

----------

**References**

Aïm, F., Lonjon, G., Hannouche, D., & Nizard, R. (2016). Effectiveness of virtual reality training in orthopaedic surgery. *Arthroscopy: The Journal of Arthroscopic & Related Surgery*, *32*(1), 224-232.

Cox, D., Davis, M., Singh, H., Barbour, B., Nidiffer, F. D., Trudel, T., Mourant, M., & Moncrief, R. (2010). Driving rehabilitation for military personnel recovering from traumatic brain injury using virtual reality driving simulation: A feasibility study. *Military Medicine*, *175*(6), 411-416.

Ettinger, U., Mohr, C., Gooding, D. C., Cohen, A. S., Rapp, A., Haenschel, C., & Park, S. (2015). Cognition and brain function in schizotypy: A selective review. *Schizophrenia Bulletin*, *41*(2), S417-S426.

Fuller, R. L., Luck, S. J., Braun, E. L., Robinson, B. M., McMahon, R. P., & Gold, J. M. (2006). Impaired control of visual attention in schizophrenia. *Journal of Abnormal Psychology*, *115*(2), 266.

Harris, D. J., Buckingham, G., Wilson, M. R., & Vine, S. J. (2019). Virtually the same? How impaired sensory information in virtual reality may disrupt vision for action. *Experimental Brain Research*, *237*(11), 2761-2766.

Lenzenweger, M. F., Cornblatt, B. A., & Putnick, M. (1991). Schizotypy and sustained attention. *Journal of Abnormal Psychology*, *100*(1), 84.

Raine, A. (1991). The SPQ: A scale for the assessment of schizotypal personality based on DSM-III-R criteria. *Schizophrenia Bulletin*, *17*(4), 555-564.

Rovamo, J., & Virsu, V. (1979). An estimation and application of the human cortical magnification factor. *Experimental Brain Research*, *37*(3), 495-510.

Ryan, A. E., Keane, B., & Wallis, G. (2019). Microsaccades and covert attention: Evidence from a continuous, divided attention task. *Journal of Eye Movement Research*, *12*(6).

[1]: https://files.osf.io/v1/resources/jyhb2/providers/osfstorage/63db0a9278a623029b99e6e8?mode=render