This repository provides data from a large scene-corpus experiment with 200 participants, who memorized scenes in one session and searched for objects in a different set of scenes in another session. Half of the participants saw the scenes in color; the other half saw the same scenes in grayscale. In each session, high or low spatial frequencies were gaze-contingently attenuated in central or peripheral vision while participants inspected the scenes. Scenes were displayed in one of five conditions: with a central low-pass filter, a central high-pass filter, a peripheral low-pass filter, a peripheral high-pass filter, or without any filtering. For related publications, see the Wiki below.
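To illustrate what low-pass and high-pass spatial-frequency filtering means here, the sketch below attenuates high frequencies with a Gaussian blur and recovers the high-frequency residual by subtraction. This is only a minimal illustration of the concept; the repository does not specify the study's actual filter implementation (the experiment likely used calibrated gaze-contingent filters, not this simple Gaussian decomposition), and the image and cutoff parameter are placeholders.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def low_pass(img, sigma=8.0):
    # Low-pass: Gaussian smoothing keeps coarse scene structure,
    # attenuating fine detail (high spatial frequencies).
    return gaussian_filter(img, sigma=sigma)

def high_pass(img, sigma=8.0):
    # High-pass: subtracting the low-pass image leaves only
    # fine detail (edges, texture), removing coarse structure.
    return img - low_pass(img, sigma=sigma)

# Placeholder "scene": random grayscale image, not from the corpus.
rng = np.random.default_rng(0)
scene = rng.random((64, 64))

lp = low_pass(scene)
hp = high_pass(scene)
# By construction, the two bands sum back to the original image.
```

In the experiment, one of these two bands was attenuated gaze-contingently, i.e. the filtered region moved with the participant's point of gaze, restricted to either central or peripheral vision.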
Cajar, A., Engbert, R., & Laubrock, J. (2019). How spatial frequencies and color drive object search in real-world scenes: A new eye-movement corpus. http://arxiv.org/abs/1910.09904