# Freiburg Screen-Free BCI for Robotic Object Selection: EEG Dataset

In this dataset, we present data from a screen-free Brain-Computer Interface (BCI) [1], in which a robot highlighted candidate objects in the environment using a laser pointer, such that the user's goal could be decoded from the resulting event-related potentials (ERPs) in the electroencephalogram (EEG). This dataset contains EEG recordings from 19 participants.

## Citing this dataset

If you use this dataset, please cite [1] and [2].

## License

This dataset is licensed under the ODC Open Database License (ODbL). The full text of the license can be found in the accompanying LICENSE file. A human-readable summary is available at https://opendatacommons.org/licenses/odbl/summary/index.html.

## Paradigm

Participants attended to their goal object while the robot highlighted candidate objects directly in the environment using a laser pointer. Having the robot present stimuli in the environment allows for more direct commands than traditional BCIs that rely on graphical user interfaces. However, in realistic application environments, candidate objects have heterogeneous optical properties. These affect the elicited brain responses and lead to different ERP distributions across objects, forming a subclass structure in the classification task [2].

## Experiments

Experiments were performed in three groups:

Trials of participants in group 1 (subject numbers starting with 1) had two conditions. In the first condition ("hom_objects_soa250" or HOM1), objects with homogeneous surface properties were highlighted with a stimulus-onset asynchrony (SOA) of 250 ms. In the second condition ("het_objects_soa250" or HET1), objects with heterogeneous surface properties were highlighted with an SOA of 250 ms. Sessions of group 1 consisted of 6 runs with 8 trials each. Each run was balanced between the two conditions and the target objects.

Trials of participants in group 2 (subject numbers starting with 2) also had two conditions. In the first condition ("hom_objects_soa500" or HOM2), objects with homogeneous surface properties were highlighted with an SOA of 500 ms. In the second condition ("het_objects_soa250" or HET2), objects with heterogeneous surface properties were highlighted with an SOA of 250 ms. Sessions of group 2 consisted of 6 runs with 8 trials each. Each run was balanced between the two conditions and the target objects.

Trials of participants in group 3 (subject numbers starting with 3) had only a single condition ("hom_objects_soa500" or HOM3). In all trials, objects with homogeneous surface properties were highlighted with an SOA of 500 ms. Sessions of group 3 consisted of 8 runs with 6 trials each.

## Format

This dataset is formatted according to the Brain Imaging Data Structure (BIDS). More information on BIDS can be found at https://bids-specification.readthedocs.io/. Sample code for loading the dataset in Python (using MNE-Python and MNE-BIDS) can be found in the code subdirectory; a minimal loading sketch is also shown below, after the references. Please also see dataset_description.json and task-objectselection_eeg.json.

## References

[1] Kolkhorst, Henrich, Michael Tangermann, and Wolfram Burgard. 2018. "Guess What I Attend: Interface-Free Object Selection Using Brain Signals." In 2018 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), 7111–16. https://doi.org/10.1109/IROS.2018.8593992.

[2] Kolkhorst, Henrich, Joseline Veit, Wolfram Burgard, and Michael Tangermann. 2020. "A Robust Screen-Free Brain-Computer Interface for Robotic Object Selection." Frontiers in Robotics and AI 7: 38. https://doi.org/10.3389/frobt.2020.00038.
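
## Loading sketch

The authoritative loading scripts live in the code subdirectory; the following is only a minimal sketch of how one recording might be read with MNE-BIDS and cut into stimulus-locked epochs. The task label "objectselection" follows the task-objectselection_eeg.json file named above; the dataset root, the subject and run labels, and the epoching window are placeholder assumptions to be adapted to the actual BIDS entities in this dataset.

```python
# Minimal sketch: load one recording from this BIDS dataset with MNE-BIDS
# and cut epochs around the highlighting stimuli.
import mne
from mne_bids import BIDSPath, read_raw_bids

# Adjust to the local copy of the dataset.
bids_root = "path/to/this/dataset"

# Subject and run labels are placeholders; check participants.tsv and the
# sub-*/ directories for the actual entity values.
bids_path = BIDSPath(
    subject="101",           # e.g., a group-1 participant (assumed label)
    task="objectselection",  # matches task-objectselection_eeg.json
    run="1",                 # drop if recordings are not split by run
    datatype="eeg",
    root=bids_root,
)

raw = read_raw_bids(bids_path=bids_path, verbose=False)
print(raw.info)

# Stimulus markers are exposed as annotations; the exact labels depend on
# this dataset's events.tsv files, so inspect them before epoching.
print(set(raw.annotations.description))

events, event_id = mne.events_from_annotations(raw)
epochs = mne.Epochs(
    raw,
    events,
    event_id=event_id,
    tmin=-0.2,   # 200 ms pre-stimulus baseline (an arbitrary choice here)
    tmax=0.8,    # post-stimulus window; adjust to the SOA of interest
    baseline=(None, 0),
    preload=True,
)
print(epochs)
```

Note that in the 250 ms SOA conditions, neighboring highlighting events fall inside a 0.8 s post-stimulus window, so overlapping epochs are expected there; shorter windows or deconvolution-style analyses may be preferable depending on the question.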