Lab Log
-------

*Provide details about your testing here. This is a good place to note any problems during testing. We recommend recording the time and date of any problems to allow easier verification of your records here.*

Participant #160 in the raw data file (corresponding to the 10/16/2015 data collection date, 9:46 am) should be thrown out. Column Q in the raw data file says "throw out this data." That row came from a session we had opened several days earlier that no one had started; we entered test data (including the "throw out this data" note) to check whether the session was still active, confirmed that it was, and flagged that erroneous line of data for removal.

**(Note: the notes below are copied and pasted from the "differences from official protocol" section of the Implementation page.)**

We experienced difficulty recruiting participants for this study. We are at a smaller campus with fewer psychology majors (and a smaller subject pool) than I have experienced at other, larger universities. As such, we have not been able to run sessions of 20 people. Instead, our breakdown across the fall 2015 semester of data collection is below. When we first started data collection, we turned people away, per the protocol (e.g., we had 15 people for the 9/22/15 and 10/6/15 sessions, but turned 3 away each day to stay in multiples of 4). However, once we realized we might not be able to finish data collection in the time allotted (by May 1, 2016), we decided to let people participate even in odd multiples, and even if we dipped below 8 people at a time (so long as we had at least 4). The justification for this decision, and how we carried it out while still staying within the primary constraints of the protocol, is described in detail below.
- 9/22/2015: 12 people
- 10/6/2015: 12 people
- 10/20/2015: 13 people
- 11/2/2015: 16 people
- 11/10/2015: 7 people
- 11/16/2015: 9 people
- 11/17/2015: 10 people
- 11/18/2015: 8 people
- 11/23/2015: 6 people
- 11/24/2015: 6 people
- 11/30/2015: 4 people
- 12/1/2015: 8 people
- 12/2/2015: 13 people
- 1/20/16: 8 people
- 2/3/16: 11 people
- 2/17/16: 7 people
- 3/2/16: 10 people
- 3/16/16: 9 people

Range of people per session = 4 to 16; mean = 9.39.

When we had non-multiples of 4, we changed the payment calculation as follows. After participants made their contributions and were progressing through the remainder of the survey, we used the Qualtrics "responses in progress" page to identify each participant's contribution. We logged each contribution into our Excel spreadsheet for that day's data collection, and once all contributions were entered, we sorted the data by a random number associated with each person in the Excel file. We then grouped people into groups of 4 and calculated their payment according to the original equation:

4 - (their contribution) + Total Group Contribution / 2

This equation is a reduction of the full equation, which is actually:

4 - (their contribution) + (Total Group Contribution * 2) / 4

We used the original equation for as many groups of 4 as were present that day. For the remaining people, we edited the equation appropriately. For example, for a group of 3:

4 - (their contribution) + (Total Group Contribution * 2) / 3

And for a group of 5:

4 - (their contribution) + (Total Group Contribution * 2) / 5

This slight alteration of the protocol allowed us to collect more data than we otherwise could have, and to finish data collection on time.

We made one other alteration to the protocol (although it is condoned in the official protocol) in an effort to solicit higher numbers of participants.
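The payment rule above generalizes naturally by replacing the divisor 4 with the actual group size. As a minimal sketch (not the lab's actual spreadsheet), the calculation and the random grouping into fours, with any small remainder folded into the last group, could look like the following; the merge rule for leftover groups smaller than 3 is an assumption for illustration, since the log only describes groups of 3, 4, and 5.

```python
import random

def payment(contribution, group_total, group_size):
    """Payment for one participant:
    4 - (own contribution) + (total group contribution * 2) / (group size).
    For a standard group of 4 this reduces to 4 - c + total / 2."""
    return 4 - contribution + (group_total * 2) / group_size

def assign_groups(participants, seed=None):
    """Shuffle participants (the 'sort by a random number' step),
    then split into groups of 4. Hypothetical rule: if the remainder
    group has fewer than 3 people, merge it into the previous group
    so every group has size 3, 4, or 5."""
    rng = random.Random(seed)
    order = list(participants)
    rng.shuffle(order)
    groups = [order[i:i + 4] for i in range(0, len(order), 4)]
    if len(groups) > 1 and len(groups[-1]) < 3:
        groups[-2].extend(groups.pop())
    return groups
```

For example, a 9-person session yields one group of 4 and one of 5, and a participant who contributed 2 in a group of 4 whose total contribution was 10 would receive 4 - 2 + (10 * 2) / 4 = 7.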
Specifically, after our 11/10/2015 collection session we began posting fliers in the dorm and classroom-building hallways to encourage participation. We made clear on the fliers that students did not have to be psychology majors to participate, and that we would pay non-psychology majors an extra $5 show-up fee in lieu of the psychology research credits that psych majors received. We finished data collection by the May 1 deadline, using the procedures with the slight modifications described above in this "differences from official protocol" section. I have also uploaded to this Implementation page the "Procedure for data collection" document we used in running this study (see attached).

There was one other difference from the official protocol to note. The room in which we collected data used a networked computer-security scheme (an ASU University Technology Office protocol) that made it impossible for us to use computer IP addresses as specified in the original protocol. The IP addresses were dynamic in this environment, so we could not use them to identify individual participants in a given session in order to calculate payments per the official protocol. After consultation with the ASU University Technology Office and Dr. Dan Simons (email dated 10/6/2015), we implemented a slightly different procedure for identifying participants in the Qualtrics "surveys in progress" page for a given session. Our solution was to tape post-it notes labeled "computer number 1," "computer number 2," and so on to the desks at which the computers were placed. We altered the Qualtrics survey so that the first question on the opening page asked "What is your computer number?" For each session, we entered the computer number on that first screen and then left each computer open to the informed consent page for participants as they arrived.
This solution enabled us to see each participant's computer number in the active sessions, so we could identify individual people and pay them specific amounts per the formulas in the official protocol and the slight deviations described above in this "differences" section.