Natural Walking and Eye Tracking in different Virtual Reality Tasks
Category: Data
Description: This dataset provides tracking data from a virtual reality experiment with 18 participants. During each 15-minute session, participants performed multiple tasks, including natural walking on straight and curved paths, searching for a target, and avoiding obstacles. During these tasks, data was gathered from multiple sensors, including eye tracking, head position, trunk orientation, and controller position and orientation.
Related publications:
N. Stein, G. Bremer, and M. Lappe, "Eye Tracking-based LSTM for Locomotion Prediction in VR," 2022 IEEE Conference on Virtual Reality and 3D User Interfaces (VR), 2022, pp. 493-503, doi: 10.1109/VR51125.2022.00069.
G. Bremer, N. Stein, and M. Lappe, "Predicting Future Position From Natural Walking and Eye Movements with Machine Learning," Proceedings of the IEEE International Conference on Artificial Intelligence and Virtual Reality (AIVR), 2021, pp. 19-28, doi: 10.1109/AIVR52153.2021.00013.
N. Stein, "Analyzing Visual Perception and Predicting Locomotion using Virtual Reality and Eye Tracking," 2021 IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops (VRW), 2021, pp. 727-728, doi: 10.1109/VRW52623.2021.00246.
We use this wiki as a codebook to describe the different columns of the open dataset in more detail.
Walking Data
vp_gender
gender of the participant [male/female]
vp_code
two-digit participant number; subjects 1, 2, and 4 were authors of the experiment
vp_sight
whether the participant had [normal / k (corrected) / u (uncorrected)] vision during the experiment. Participants with impaired vision were free …
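
As a reading aid, here is a minimal sketch of inspecting the participant columns described above with pandas. It assumes the data is shipped as a CSV containing these columns; the file name walking_data.csv is hypothetical, so adjust it to the actual files in the project.

```python
import pandas as pd

# Hypothetical file name; substitute the actual data file from this project.
df = pd.read_csv("walking_data.csv")

# vp_gender: gender of the participant [male/female]
print(df["vp_gender"].value_counts())

# vp_code: two-digit participant number; subjects 1, 2, and 4 were authors,
# so flag their rows in case you want to exclude them from analyses.
authors = {1, 2, 4}
df["is_author"] = df["vp_code"].astype(int).isin(authors)

# vp_sight: vision during the experiment (normal / k = corrected / u = uncorrected)
sight_labels = {"normal": "normal", "k": "corrected", "u": "uncorrected"}
df["vp_sight_label"] = df["vp_sight"].map(sight_labels)
```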