**Data & pre-trained versions of DeepMReye**

**DeepMReye is an open-source software package for magnetic-resonance-based eye tracking. It can be used to perform eye tracking without a camera in (f)MRI experiments.**

The paper that formally describes DeepMReye:

Frey M.† & Nau M.†ǂ, Doeller C.F.ǂ (2021). Magnetic resonance-based eye tracking using deep neural networks. *Nature Neuroscience*. https://doi.org/10.1038/s41593-021-00947-w

The full code can be found on GitHub: https://github.com/DeepMReye

---

**Image source data:**

The "source_data" folder contains the source data of the figures in the published article.

- Fig. 1 shows eyeball-voxel intensities for different gaze positions, which we provide as volumetric NIFTI files (.nii).
- Fig. 4 and Extended Data Fig. 10 show brain activity explained by eye movements measured with camera-based eye tracking and with DeepMReye, with and without HRF convolution. For these figures, we provide volumetric data in MNI space (.nii) as well as surface-based data coregistered to the fsaverage surface (.mgh).
- Source data of all other figures are provided as individual Excel files (.xlsx).

---

**Exemplary data:**

The data inside the "exemplary_data" folder in this repository are required to execute the notebook illustrating the use of DeepMReye: https://github.com/DeepMReye/DeepMReye/blob/main/notebooks/deepmreye_example_usage.ipynb

Each of the twelve sub-NDARAXXXXXXX folders contains the data of a single participant, with two scanning runs stored as 4D NIFTI files (.nii). In addition, each folder contains a NumPy file with the preprocessed eyeball data of both runs. Participants fixated at various locations on the screen while functional MRI data were acquired.

These data were kindly shared by Alexander, Escalera, Ai et al. (2017) as part of a larger data release. The full dataset can be downloaded here:
http://fcon_1000.projects.nitrc.org

If you use these data in a publication, please cite the following paper:

Alexander, L., Escalera, J., Ai, L. et al. An open resource for transdiagnostic research in pediatric mental health and learning disorders. *Sci Data* 4, 170181 (2017). https://doi.org/10.1038/sdata.2017.181

---

**Processed data:**

The "Processed data" folder includes the preprocessed fMRI data of the eyeballs as well as the preprocessed eye-tracking data of all datasets used in our article. For each participant, the respective subdirectory contains one zipped NumPy file (.npz) containing both the fMRI data and the eye-tracking data, split into the respective functional volumes.

---

**Pre-trained model weights:**

This folder contains pre-trained model weights, which were estimated using each of the following individual datasets as well as a combination of all datasets. In certain scenarios, they enable decoding viewing behavior with DeepMReye in existing datasets without re-training the model. See the [Online Documentation][1] for details.

- Dataset 1: Guided fixations (Alexander et al. 2017, *Scientific Data*, https://doi.org/10.1038/sdata.2017.181)
- Dataset 2: Smooth pursuit (Nau et al. 2018, *NeuroImage*, https://doi.org/10.1016/j.neuroimage.2018.04.012)
- Dataset 3: Smooth pursuit (Polti & Nau et al. 2021, *eLife*, https://elifesciences.org/articles/79027)
- Dataset 4: Smooth pursuit (Nau et al. 2018, *Nature Neuroscience*, https://doi.org/10.1038/s41593-017-0050-8)
- Dataset 5: Visual search (Julian et al. 2018, *Nature Neuroscience*, https://doi.org/10.1038/s41593-017-0049-1)

[1]: https://deepmreye.slite.com/p/channel/MUgmvViEbaATSrqt3susLZ
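As a minimal sketch of how one might inspect such a per-participant .npz file with NumPy: note that the file name, array keys, and array shapes below are purely illustrative assumptions, not the actual DeepMReye preprocessing output; on the real files, check `data.files` to see which arrays are stored.

```python
import numpy as np

# Hypothetical stand-in for one participant's preprocessed file.
# Keys ("eyeball_voxels", "eye_tracking") and shapes are illustrative only.
fmri = np.random.rand(100, 20, 20, 20).astype(np.float32)  # volumes x eyeball voxel grid
gaze = np.random.rand(100, 10, 2).astype(np.float32)       # volumes x subsamples x (x, y)
np.savez_compressed("sub-example.npz", eyeball_voxels=fmri, eye_tracking=gaze)

# np.load on an .npz returns a lazy archive; index it by key.
data = np.load("sub-example.npz")
print(data.files)                    # names of the stored arrays
print(data["eyeball_voxels"].shape)  # one entry per functional volume
print(data["eye_tracking"].shape)    # eye-tracking split per functional volume
```

The useful invariant here is that the fMRI and eye-tracking arrays share their first axis (functional volumes), which is what lets the model pair each volume with its gaze labels.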