Natural Walking and Eye Tracking in different Virtual Reality Tasks
Category: Data
Description: This dataset provides tracking data from a virtual reality experiment with 18 participants. During each 15-minute session, participants performed multiple tasks, including natural walking on straight and curved paths, searching for a target, and avoiding obstacles. During these tasks, data were gathered from multiple sensors, including eye tracking, head position, trunk orientation, and controller position and orientation.

Related publications:
Stein, N., Bremer, G., & Lappe, M. (2022). Eye Tracking-based LSTM for Locomotion Prediction in VR. 2022 IEEE Conference on Virtual Reality and 3D User Interfaces (VR), 493-503. doi: 10.1109/VR51125.2022.00069
Bremer, G., Stein, N., & Lappe, M. (2021). Predicting Future Position From Natural Walking and Eye Movements with Machine Learning. Proceedings of the IEEE International Conference on Artificial Intelligence and Virtual Reality (AIVR), 19-28. doi: 10.1109/AIVR52153.2021.00013
Stein, N. (2021). Analyzing Visual Perception and Predicting Locomotion using Virtual Reality and Eye Tracking. 2021 IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops (VRW), 727-728. doi: 10.1109/VRW52623.2021.00246
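Because the sensor streams (eye tracking, head position, trunk orientation, controller pose) are recorded independently, a common first step when working with such data is to align the streams on their timestamps. The sketch below shows nearest-timestamp alignment on tiny synthetic samples; the dataset's actual file format and field names are not documented here, so the record layout used (lists of timestamp/value pairs) is purely an illustrative assumption.

```python
from bisect import bisect_left

# Hypothetical sketch: the dataset's on-disk format is not specified on this
# page, so the sample records and field names below are illustrative only.

def nearest_sample(timestamps, t):
    """Index of the sample whose timestamp is closest to t."""
    i = bisect_left(timestamps, t)
    if i == 0:
        return 0
    if i == len(timestamps):
        return len(timestamps) - 1
    # Pick whichever neighbor is temporally closer to t
    return i if timestamps[i] - t < t - timestamps[i - 1] else i - 1

# Synthetic eye-tracking samples (t, gaze_x) and head-position samples
eye = [(0.00, 0.10), (0.012, 0.20), (0.02, 0.30)]
head_t = [0.005, 0.015]
head_x = [1.00, 1.10]

# Attach the temporally closest head sample to each eye sample
aligned = [(t, g, head_x[nearest_sample(head_t, t)]) for t, g in eye]
print(aligned)  # → [(0.0, 0.1, 1.0), (0.012, 0.2, 1.1), (0.02, 0.3, 1.1)]
```

The same nearest-neighbor lookup works for any pair of streams; for higher-rate streams, interpolating between the two bracketing samples instead of snapping to the nearest one may be preferable.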