<h1>Eye Gaze Drone Racing Dataset</h1>
<p>This is the dataset accompanying the paper:</p>
<pre class="highlight"><code>C. Pfeiffer and D. Scaramuzza, &quot;Human-Piloted Drone Racing: Visual Processing and Control,&quot; in IEEE Robotics and Automation Letters, vol. 6, no. 2, pp. 3467-3474, April 2021, doi: 10.1109/LRA.2021.3064282.</code></pre>
<h2>Coordinate Frames</h2>
<p><strong><em>World</em></strong> frame coordinates are x=Forward, y=Left, z=Up. The world origin is at the center of the race track; the floor is at z=0.</p>
<p><strong><em>Body</em></strong> frame coordinates are x=Forward, y=Left, z=Up.</p>
<p><strong><em>Image</em></strong> frame coordinates are x=right, y=down (z=forward); the origin is the upper-left image corner.</p>
<h2>FPV Camera</h2>
<p>The first-person-view (FPV) camera was rotated -30 degrees about the quadrotor y-axis (i.e., a 30-degree uptilt angle) and was offset by (x=0.2, y=0, z=-0.1) meters from the drone body-frame origin.</p>
<pre class="highlight"><code>camera_uptilt = 30.  # degrees
camera_transform = [0.2, 0., -0.1]  # meters</code></pre>
<p>The FPV camera matrix and distortion coefficients for normalized image coordinates (0-1 range) are:</p>
<pre class="highlight"><code>camera_matrix = np.array([[0.41, 0., 0.5],
                          [0., 0.56, 0.5],
                          [0., 0., 1.0]])
dist_coefs = np.array([[0., 0., 0., 0.]])</code></pre>
<h2>Data</h2>
<p>Data are stored as CSV files in <code>/data</code>, in one subfolder per subject.</p>
<p>Data files are named as follows:</p>
<p><code>&lt;SUBJECT-ID&gt;_&lt;TRACK-NAME&gt;&lt;RUN-NUMBER&gt;_&lt;DATA-TYPE&gt;.csv</code></p>
<p>where:</p>
<ul>
<li><strong>SUBJECT-ID</strong> is a 3-letter string, e.g. <code>RUB</code></li>
<li><strong>TRACK-NAME</strong> is either <code>FLAT</code> or <code>WAVE</code></li>
<li><strong>RUN-NUMBER</strong> is a digit between <code>1</code> and <code>6</code></li>
<li><strong>DATA-TYPE</strong> is one of <code>DRONE</code>, <code>CAMERA</code>, <code>EVENTS</code>, <code>GAZE</code>, <code>AOI</code>, <code>VECTORS</code></li>
</ul>
<p>Data Types:</p>
<h3><code>DRONE</code>: Drone pose and control commands</h3>
<ul>
<li><code>t</code>: Time in seconds.</li>
<li><code>p</code>: Drone position in meters (world frame).</li>
<li><code>q</code>: Drone rotation quaternion (world frame).</li>
<li><code>v</code>: Drone velocity in m/s (world frame).</li>
<li><code>a</code>: Drone acceleration in m/s² (world frame).</li>
<li><code>w</code>: Drone body rates in rad/s (body frame).</li>
<li><code>throttle</code>: Control command for collective thrust (0-1 range).</li>
<li><code>roll</code>, <code>pitch</code>, <code>yaw</code>: Control commands for body rates (-1 to 1 range).</li>
</ul>
<h3><code>CAMERA</code>: Camera pose</h3>
<ul>
<li><code>t</code>: Time in seconds.</li>
<li><code>p</code>: Camera position in meters (world frame).</li>
<li><code>q</code>: Camera rotation quaternion (world frame).</li>
</ul>
<h3><code>GAZE</code>: Gaze position in screen coordinates</h3>
<ul>
<li><code>t</code>: Time in seconds.</li>
<li><code>gx</code>: Normalized gaze position along the x axis (image frame, 0-1 range).</li>
<li><code>gy</code>: Normalized gaze position along the y axis (image frame, 0-1 range).</li>
</ul>
<h3><code>VECTORS</code>: Velocity, Camera, Gaze, and Thrust vectors</h3>
<ul>
<li><code>t</code>: Time in seconds.</li>
<li><code>camera_vector</code>: Unit vector indicating the camera focal point orientation in 3D (world frame).</li>
<li><code>gaze_vector</code>: Unit vector indicating gaze orientation in 3D (world frame).</li>
<li><code>thrust_vector</code>: Unit vector indicating thrust orientation in 3D (world frame).</li>
<li><code>velocity_vector</code>: Unit vector
indicating velocity orientation in 3D (world frame).</li>
</ul>
<h3><code>AOI</code>: Area-of-interest hits and labels</h3>
<ul>
<li><code>t</code>: Time in seconds.</li>
<li><code>gaze_origin</code>: Gaze vector origin (= camera position) (world frame).</li>
<li><code>gaze_vector</code>: Unit vector indicating gaze orientation in 3D (world frame).</li>
<li><code>aoi_label</code>: Label of the area-of-interest object currently hit by the gaze vector:
<ul>
<li><code>NONE</code>: No object currently fixated / hit by the gaze vector.</li>
<li><code>FLOOR</code>: Ground floor fixated / hit by the gaze vector.</li>
<li><code>GATE</code>: Gate object fixated / hit by the gaze vector.</li>
</ul>
</li>
<li><code>aoi_hit</code>: Location of the AOI hit in the object coordinate frame (where x=right, y=down).</li>
<li><code>distance</code>: Distance in meters between <code>gaze_origin</code> and the AOI-gaze vector intersection point.</li>
</ul>
<h3><code>EVENTS</code>: Gate passing and collision events</h3>
<ul>
<li><code>t</code>: Time in seconds.</li>
<li><code>label</code>: Event label:
<ul>
<li><code>START</code>: Liftoff of the drone from the start podium.</li>
<li><code>PASS</code>: Gate-passing event (gates 0-9).</li>
<li><code>COLL</code>: Collision event.</li>
</ul>
</li>
</ul>
<h2>Track</h2>
<p>Track information is saved in <code>/tracks</code>.
Each file lists the gate positions, rotations, and dimensions in sequential order.</p>
<h2>Videos</h2>
<p>Videos are stored in <code>/videos</code> in subfolders for each subject.</p>
<p>Note that <code>FPVCAM</code> videos for subjects <code>RUB</code> and <code>REW</code> are missing due to technical problems during data acquisition.</p>
<p>Video files are named as follows:</p>
<p><code>&lt;SUBJECT-ID&gt;_&lt;TRACK-NAME&gt;&lt;RUN-NUMBER&gt;_&lt;DATA-TYPE&gt;.&lt;FILE-TYPE&gt;</code></p>
<p>Video File types:</p>
<ul>
<li><code>.csv</code>: contains the timestamp <code>t</code> and frame number <code>frame</code>, allowing the video frames to be related to the data in the data folder.</li>
<li><code>.mp4</code>: the video itself.</li>
</ul>
<p>Video Data types:</p>
<ul>
<li><code>FPVCAM</code>: Screen-capture videos of the FPV camera, recorded at 60 FPS with 1080 x 960 resolution.</li>
</ul>
<h2>Citation</h2>
<p>Please cite our work as follows:</p>
<p>Plain text:</p>
<pre class="highlight"><code>C. Pfeiffer and D. Scaramuzza, &quot;Human-Piloted Drone Racing: Visual Processing and Control,&quot; in IEEE Robotics and Automation Letters, vol. 6, no. 2, pp. 3467-3474, April 2021, doi: 10.1109/LRA.2021.3064282.</code></pre>
<p>BibTex:</p>
<pre class="highlight"><code>@ARTICLE{9372809,
  author={C. {Pfeiffer} and D. {Scaramuzza}},
  journal={IEEE Robotics and Automation Letters},
  title={Human-Piloted Drone Racing: Visual Processing and Control},
  year={2021},
  volume={6},
  number={2},
  pages={3467-3474},
  doi={10.1109/LRA.2021.3064282}}</code></pre>
<h2>Maintainer</h2>
<p><code>christian&lt;dot&gt;pfeiffer&lt;at&gt;uzh&lt;dot&gt;ch</code></p>
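As a minimal sketch of how the camera intrinsics above can be used: the normalized gaze coordinates (`gx`, `gy`) in the `GAZE` files can be back-projected through `camera_matrix` to obtain a unit gaze direction in the camera frame (x=right, y=down, z=forward). The function name and NumPy usage are illustrative, not part of the dataset tooling; rotating the resulting ray by the `CAMERA` quaternion would yield a world-frame direction comparable to `gaze_vector`.

```python
import numpy as np

# FPV camera intrinsics for normalized (0-1) image coordinates,
# as given in the dataset description above.
camera_matrix = np.array([[0.41, 0.,   0.5],
                          [0.,   0.56, 0.5],
                          [0.,   0.,   1.0]])

def gaze_ray_from_screen(gx, gy):
    """Back-project a normalized gaze point (gx, gy) from a GAZE file
    into a unit direction in the camera frame (x=right, y=down, z=forward).
    Illustrative helper, not part of the dataset tooling."""
    direction = np.linalg.solve(camera_matrix, np.array([gx, gy, 1.0]))
    return direction / np.linalg.norm(direction)
```

For example, a gaze point at the image center (0.5, 0.5) maps onto the camera's optical axis, i.e. the camera-frame direction (0, 0, 1).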