Description: Atypical social attention is a key behavioral marker of autism (Chita-Tegmark, 2016) and may constitute a process in infancy and childhood that contributes to the later development of atypical social cognitive abilities more generally (Jones et al., 2014). Numerous studies have demonstrated atypical attention patterns in autistic individuals, based on distinct eye-movement patterns while they view images (Wang et al., 2015) or videos (Klin et al., 2002; Keles et al., 2022). Quantifying these gaze patterns could thus provide an objective method for screening, contribute to diagnosis, inform intervention, and deepen our understanding of the mechanisms underlying autism. Moreover, these gaze patterns are thought to reflect underlying genetic influences (Constantino et al., 2017; Kennedy et al., 2017) that could be associated with trait-like individual differences in visual attention or preferences for specific visual features. Eye-tracking data would thus offer insights into both genetic predisposition and phenotypic expression, especially with regard to social behavior, and could help characterize individual differences and even potentially discover autism subtypes. Yet despite their potential, most eye-tracking studies are constrained by the need for sophisticated, expensive in-lab equipment that requires complex calibration and in-person laboratory visits by participants. This is a critical obstacle because large samples together with longitudinal data will be required to fully assess the variability in gaze patterns among autistic people, to discover possible subtypes of autism, and to link differential gaze patterns with biological factors. The present study aims to tackle this limitation by utilizing a new technology, webcam-based eye tracking over the internet, to characterize social attention in a larger autistic population.
We will investigate gaze patterns while participants watch video recordings of group conversations on teleconferencing platforms such as Zoom, a format that permits partitioning of the presentation screen into individual speaker windows in the video stimuli. To investigate specific factors of interest, we scripted the videos and had them performed by professional actors to incorporate social features such as turn-taking, listener reactions, and gestures, as well as nonsocial features such as irrelevant and distracting background objects. To plan the present study, we collected and analyzed data from a preliminary study with 97 autistic and 140 non-autistic participants who were recruited online. Preliminary results suggested that autistic individuals may be more prone to distraction by irrelevant nonsocial events in our video stimuli. This preregistered study seeks to replicate and extend these initial findings by recruiting a broader and more diverse sample of autistic and non-autistic participants (from multiple populations) and by more comprehensively examining the relationship between autistic traits, visual attention, and individual differences.