# Object-based attention during scene perception elicits boundary contraction in memory

Journal Article [link](https://rdcu.be/dHWZk). <br> Github Repository [link](https://github.com/ehhall/object-based-memories). If you use this code or dataset in your research, please cite this paper.

```
@article{hall2024object,
  title={Object-based attention during scene perception elicits boundary contraction in memory},
  author={Hall, Elizabeth H and Geng, Joy J},
  journal={Memory \& cognition},
  pages={1--13},
  year={2024},
  publisher={Springer}
}
```

## Abstract

Boundary contraction and extension are two types of scene transformations that occur in memory. In extension, viewers extrapolate information beyond the edges of the image, whereas in contraction, viewers forget information near the edges. Recent work suggests that image composition influences the direction and magnitude of boundary transformation. We hypothesized that selective attention at encoding is an important driver of boundary transformation effects, with selective attention to specific objects at encoding leading to boundary contraction. In this study, one group of participants (N = 36) memorized 15 scenes while searching for targets, while a separate group (N = 36) only memorized the scenes. Both groups then drew the scenes from memory with as much object and spatial detail as they could remember. We asked online workers to rate the boundary transformations in the drawings, as well as how many objects each drawing contained and the precision of remembered object size and location. We found that drawings from the search condition showed significantly greater boundary contraction than drawings of the same scenes from the memorize condition. Search drawings were also significantly more likely to contain the target objects, and the likelihood of recalling other objects in a scene decreased as a function of their distance from the target. These findings suggest that selective attention to a specific object due to a search task at encoding leads to significant boundary contraction.

## Data

The repository contains the drawings done from memory, the 15 scenes used in the experiment with segmentations for the objects in the scenes, eye-tracking fixations from the study phase, and ratings from the three online AMT tasks.

## Code

- *Settings.py* defines the directories.
- *Attention.ipynb* includes code for the eye-tracking analyses.
- *Boundary.ipynb* calculates boundary transformations in the drawings.
- *Corners.ipynb* is used to define the scale of the scanned-in drawings.
- *Location.ipynb* calculates the shift in remembered object location from the memory drawings.
- *Memory.ipynb* includes models of what determines whether an object in the image will be drawn from memory.
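Since the notebooks all resolve data paths through *Settings.py*, a minimal sketch of what such a settings module might look like is shown below. The directory names (`drawings`, `scenes`, `fixations`, `ratings`) and the helper function are illustrative assumptions, not the repository's actual layout; consult *Settings.py* in the repo for the real paths.

```python
# Hypothetical sketch of a Settings.py-style module that centralizes
# dataset directories for the analysis notebooks. All names below are
# assumptions for illustration, not the repository's actual paths.
from pathlib import Path

BASE_DIR = Path(__file__).resolve().parent
DRAWINGS_DIR = BASE_DIR / "drawings"    # memory drawings (both conditions)
SCENES_DIR = BASE_DIR / "scenes"        # 15 scene images + object segmentations
FIXATIONS_DIR = BASE_DIR / "fixations"  # eye-tracking data from the study phase
RATINGS_DIR = BASE_DIR / "ratings"      # ratings from the 3 online AMT tasks


def get_scene_path(scene_name: str, ext: str = "jpg") -> Path:
    """Build the path to a scene image from its name (illustrative helper)."""
    return SCENES_DIR / f"{scene_name}.{ext}"
```

Keeping paths in one module like this means each notebook only needs `from Settings import SCENES_DIR` rather than hard-coding directories in every analysis.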