**Variables Collected**

* **`workerID`**: The subject's worker ID as assigned by Mechanical Turk. In the posted data, this variable will be replaced with a **`subjectID`** column with arbitrary numerical assignment to protect subject anonymity.
* **`completed_crossings`**: Integer. The number of completed crossings (should be `8` for all subjects).
* **`total_score`**: Integer. The subject's final score.
* **`total_time`**: Integer. Time in milliseconds the subject spent on the game part of the experiment.
* **`unex_motion`**: String. The motion path of the unexpected object. Can be:
    * `closer` if the object spawned far from the player and moved closer
    * `farther` if it spawned close to the player and moved farther away
* **`unex_velocity_x`**: Integer. The horizontal direction of the unexpected object's velocity.
    * `1` corresponds to left-to-right
    * `-1` corresponds to right-to-left
* **`unex_velocity_y`**: Integer. The vertical direction of the unexpected object's velocity.
    * `1` corresponds to top-to-bottom
    * `-1` corresponds to bottom-to-top
* **`unex_location_relative`**: String. The location of the unexpected object relative to the player (`high` or `low`).
* **`unex_location_absolute_x`**: Integer. The absolute x coordinate of the probe, in pixels, when it onsets in the display ((0, 0) is top-left, (900, 900) is bottom-right).
* **`unex_location_absolute_y`**: Integer. The absolute y coordinate of the probe, in pixels, when it onsets in the display ((0, 0) is top-left, (900, 900) is bottom-right).
* **`unex_crossing_num`**: Integer. On which crossing the unexpected object appeared. Can be `5` or `6` in this experiment, of 8 crossings total.
* **`unex_deploy_time`**: Integer. Time since the beginning of the game (in milliseconds) when the probe appeared in the display.
* **`unex_crossing_origin`**: String. Which side the player was crossing from when the unexpected object appeared (`right` or `left`).
* **`unex_start_color`**: String. The color of the unexpected object at spawn. Can be `green` or `yellow`.
* **`unex_shape`**: String. The shape of the unexpected object. `diamond` for all subjects in this experiment.
* **`dist_at_offset`**: Integer. The Euclidean distance between the probe and the player at probe offset.
* **`concurrent_movement`**: Float. The proportion of time both the player and the probe were in motion; `1` means the player was moving the entire time the probe was, while `0` means the player did not move at all while the probe was onscreen.
* **`stops`**: Semicolon-delimited list of comma-delimited integer pairs (a parsing sketch appears after this list). The first number in each pair is the time in milliseconds since the onset of the probe that the player stopped, and the second is how long the stop lasted. If the cell is empty, the player did not stop while the probe was onscreen.
* **`collisions_during_unex`**: Integer. A count of how many times the player collided with an obstacle while the probe was onscreen.
* **`all_collisions`**: Comma-delimited list of integers. Each value is the time in milliseconds since the start of the game that the player collided with an object.
* **`report_notice`**: Boolean. The subject's report of having noticed the object. `1` indicates the subject reported noticing something new, `0` indicates no such noticing.
* **`report_moving`**: Boolean. The subject's report of whether the object was moving (`1`) or not moving (`0`).
* **`report_direction`**: String. The string code for the direction of the arrow image the subject chose. Can be:
    * `up`
    * `down`
    * `left` (left-to-right)
    * `right` (right-to-left)
    * `up_left`
    * `up_right`
    * `down_left`
    * `down_right`
* **`report_shape`**: String. The subject's report of the unexpected object's shape. One of `rectangle`, `triangle`, `diamond`, `circle`, `cross`, `t`, `l`, `b`, or `v`.
* **`report_color`**: String. The subject's report of the unexpected object's color when they first noticed it. One of `red`, `green`, `blue`, `purple`, `yellow`, `gray`, `black`, `white`, or `brown`.
* **`gender`**: String. The subject's self-reported gender, `male` or `female`.
* **`age`**: Integer. The subject's self-reported age range.
    * `0` = under 18
    * `1` = 18-24
    * `2` = 25-49
    * `3` = 50-80
    * `4` = over 80
* **`vision`**: Integer. Whether the subject requires vision correction and was wearing it during the task.
    * `0` = normal vision
    * `1` = corrected-to-normal vision, correction worn
    * `2` = corrected-to-normal vision, correction not worn
* **`color_vision`**: Integer. Whether the subject is colorblind (self-report).
    * `0` = normal color vision
    * `1` = red-green colorblindness
    * `2` = blue-yellow colorblindness
    * `3` = some other color vision issue
* **`ishihara`**: Integer. The subject's response for the number contained in Ishihara Plate 38. Can be `5`, `21` (the answer typically given by those with red-green colorblindness), `74` (the answer typically given by those with normal color vision), `122`, or `0` ("I don't see a number").
* **`lagging`**: Boolean. Whether the subject experienced any lagging that impacted their ability to play the game (`1` = yes, `0` = no).
* **`freezing`**: Boolean. Whether the subject experienced any freezing that impacted their ability to play the game (`1` = yes, `0` = no).
* **`other_issues`**: Boolean. Whether the subject experienced some other technical issue that impacted their ability to play the game (`1` = yes, `0` = no).
* **`other_text`**: String. A text description of the "other" technical issue.
* **`prior`**: Boolean. Whether the subject had prior experience with any similar inattentional blindness tasks (`1` = yes, `0` = no).
* **`prior_text`**: String. A text description of the subject's prior inattentional blindness experience.
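Most of these columns hold a single scalar, but `stops` and `all_collisions` pack multiple values into one cell and need a little parsing before analysis. Below is a minimal sketch in Python, assuming the posted data loads with pandas and that empty cells read back as `NaN`; the file name and helper names are hypothetical.

```python
import pandas as pd

def parse_stops(cell):
    """Parse `stops` into a list of (onset_ms, duration_ms) pairs."""
    if pd.isna(cell) or str(cell).strip() == "":
        return []  # the player never stopped while the probe was onscreen
    return [tuple(int(v) for v in pair.split(","))
            for pair in str(cell).split(";")]

def parse_all_collisions(cell):
    """Parse `all_collisions` into a list of collision times in milliseconds."""
    if pd.isna(cell) or str(cell).strip() == "":
        return []
    return [int(v) for v in str(cell).split(",")]

df = pd.read_csv("posted_data.csv")  # hypothetical file name
df["stops_parsed"] = df["stops"].apply(parse_stops)
df["all_collisions_parsed"] = df["all_collisions"].apply(parse_all_collisions)
```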
**Exclusion Rules**

Subjects will be excluded for meeting at least one of the following criteria:

* Incorrect reporting of the value of Ishihara Plate 38
* Reporting needing glasses or contacts but not wearing them during the experiment
* Reporting any issue with color vision
* Being under 18 years of age
* Reporting that the game lagged, froze, or had some other issue that affected the subject's ability to play
* Reporting prior experience with inattentional blindness experiments
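As a companion to the rules above, here is one way the exclusions might be applied to the DataFrame `df` from the parsing sketch, using the codebook's column codings. Treating `74` as the correct Ishihara Plate 38 response is an assumption based on the codebook's note that it is the answer typically given by those with normal color vision.

```python
# One possible encoding of the preregistered exclusion rules.
excluded = (
    (df["ishihara"] != 74)        # incorrect Ishihara report (assumed answer key)
    | (df["vision"] == 2)         # correction needed but not worn
    | (df["color_vision"] != 0)   # any reported color vision issue
    | (df["age"] == 0)            # under 18
    | (df["lagging"] == 1)        # game lagged
    | (df["freezing"] == 1)       # game froze
    | (df["other_issues"] == 1)   # some other technical issue
    | (df["prior"] == 1)          # prior inattentional blindness experience
)
analysis_df = df[~excluded].copy()
```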
**Determining Noticing**

We will create two different classifications for "noticing"--noticing for motion and noticing for color.

For a subject to be labeled as having noticed the motion of the unexpected object, they must:

* Respond "yes" to the question, "During the game, did you notice any object appear in the display that was NOT a red car, a blue pedestrian, a flower, the seed basket, the barn, or the avatar you controlled?"
* Respond "yes" to the question, "Was the new object moving?"
* Respond with the correct direction when asked, "What direction was the new object moving in?"

For a subject to be labeled as having noticed the color of the unexpected object, they must:

* Respond "yes" to the question, "During the game, did you notice any object appear in the display that was NOT a red car, a blue pedestrian, a flower, the seed basket, the barn, or the avatar you controlled?"
* Report that the object was either green or yellow when they first noticed it.

**Collisions during probe**

We will not exclude subjects who experienced a collision while the probe was onscreen from our main analysis, but as with noticing we will look at the results both with and without these subjects (we do not expect much of a difference, as previous studies indicate such collisions are very rare).

**Analysis Plan**

We will calculate noticing rates and 95% bootstrapped confidence intervals for the two conditions (starting far and ending near vs. starting near and ending far), separately for motion noticing and color noticing. Within the color-noticing group, we will also calculate the rate at which the first vs. the last color is reported (with 95% bootstrapped confidence intervals) for the two conditions. A sketch of these calculations appears after this section.

Before separating out the conditions, we will look at the noticing rate for motion according to which color was reported amongst the color noticers. If people report noticing the first color, are they also more likely to correctly identify the direction of motion? We will also examine the reverse--are subjects who correctly identify the motion direction of the unexpected object more likely to report the first color?

We will also look for differences by condition (whether the object spawns nearby and moves away, or spawns farther away and moves closer). Here, we have two primary questions. First, will we replicate the findings of the previous experiment? There, we found a dramatic difference in subjects' ability to identify the motion direction of the unexpected object depending on the condition to which they were assigned. Subjects for whom the unexpected object spawned nearby and then moved away noticed the motion of the object about 70% of the time, while those who had a faraway object come closer noticed the motion just 35% of the time. However, when we examined noticing for color, there was only about a 3% difference between the groups.

This leads to our second question: is the large discrepancy in noticing for motion, with no comparable discrepancy for color, due to the timecourse of noticing? It could be that the near-spawn group notices the object much earlier than the far-spawn group, and so is able to track its entire trajectory and correctly identify the motion. By contrast, the far-spawn group may only notice the object once it gets close, and as a result can correctly report the color but not the motion. If this is the case, we expect the far-spawn group, for whom the object gets closer over time, to report the second color more often than the first color, while the near-spawn group should report the first color more often than the second.
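As a rough illustration of the planned calculations, the sketch below classifies motion and color noticers in `analysis_df` (defined above) and computes percentile bootstrap confidence intervals for the noticing rate in each condition. The mapping from the velocity sign columns to the arrow codes is an assumption that would need to be checked against the stimulus code, as is the diagonal-only motion path.

```python
import numpy as np

rng = np.random.default_rng(2024)  # arbitrary seed for reproducibility

def boot_ci(values, n_boot=10_000, alpha=0.05):
    """Percentile bootstrap CI for a proportion (values are 0/1)."""
    values = np.asarray(values, dtype=float)
    idx = rng.integers(0, len(values), size=(n_boot, len(values)))
    rates = values[idx].mean(axis=1)
    return np.quantile(rates, [alpha / 2, 1 - alpha / 2])

def correct_direction(row):
    """Map velocity signs to the codebook's arrow codes. Note the codebook's
    convention: `left` codes left-to-right motion and `right` codes
    right-to-left, so x = 1 maps to "left". Assumes diagonal motion."""
    x = "left" if row["unex_velocity_x"] == 1 else "right"
    y = "down" if row["unex_velocity_y"] == 1 else "up"
    return f"{y}_{x}"  # e.g., "down_left"

analysis_df["motion_noticed"] = (
    (analysis_df["report_notice"] == 1)
    & (analysis_df["report_moving"] == 1)
    & (analysis_df["report_direction"]
       == analysis_df.apply(correct_direction, axis=1))
)
analysis_df["color_noticed"] = (
    (analysis_df["report_notice"] == 1)
    & analysis_df["report_color"].isin(["green", "yellow"])
)

for cond, grp in analysis_df.groupby("unex_motion"):  # "closer" vs. "farther"
    for measure in ["motion_noticed", "color_noticed"]:
        rate = grp[measure].mean()
        lo, hi = boot_ci(grp[measure].to_numpy())
        print(f"{cond} {measure}: {rate:.2f} [95% CI {lo:.2f}, {hi:.2f}]")
```

Within the color-noticing group, the first-vs.-last-color rates would follow the same pattern, comparing `report_color` against `unex_start_color` instead of pooling the two colors.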