# Audience Motion

![Audience by Caroline Bittencourt][1]

Both physical and virtual audience members were given the opportunity to actively participate in research by having their motion recorded. This was done mostly through their smartphones, although some physical audience members chose to use a wearable sensor instead.

## MusicLab App Motion Data

![RITMO members using App][2]

### Description

The MusicLab App is a mobile application that uses participants' smartphones' inertial measurement units (IMUs) to measure their motion. More information on the app is available [here][3]. The app also collects survey responses via Nettskjema (see the "Surveys" section for more information). The first time the app is opened, it asks participants to provide their consent. The app is available for both Android and Apple devices.

Participants were instructed by email to download the app before the concert, and QR codes and bit.ly links enabled quick downloading or updating in the participant preparation room, located immediately left of the ticket table. Participants were greeted by several volunteer teams designated as either Apple or Android helpers, based on their personal expertise. As part of the installation and consent process, participants could allow the application to record the accelerometer, gyroscope, and/or geolocation of the phone during the event. The app transmits the permitted sensor data from the phone to a server at one-minute intervals.

Volunteers assisted participants by providing them with RITMO phone holders, configured as necklaces that placed the phones high on the participants' chests. Remote participants were also instructed to download the application; their instructions are available [here][4]. Remote audience members also answered questionnaires before and after the concert and after each piece (see the "Surveys" section for more details).
Volunteers were provided with an instruction guide to help them prepare participants' phones for data collection. The guide covered fitting the phone holder, entering the participant's unique ID code in the app, and optimizing the phone for recording the whole concert: exiting other apps, disabling notifications, muting the volume, dimming the screen, and entering low-power mode. These steps were taken because turning the screen off pushes the app to the background, and on some phones (both Android and Apple, depending on the participant's settings) recording does not continue in the background. Participants were therefore instructed to leave their phone screens on during the concert. If participants did not wish to use their phone, or their phone could not successfully run the MusicLab app, they were provided with an Axivity accelerometer (see the section "AX3s").

### Data

Each device receives an anonymous random identification key when the app is installed. We attribute these to individual participants, assuming the assigned key does not change over the course of the concert recording period. The MusicLab App collects internal accelerometer and gyroscope sensor readings at rates around 60 Hz, released in one-minute batches. These are labeled with the device ID (or installation ID) as well as a unique submission number. Recordings were sent to the Nettskjema server once every minute. Raw data collected during the concert period was reorganized and aligned (if possible) into single participant files.
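The reorganisation step can be sketched as follows: group the one-minute batches by device ID and concatenate their samples in timestamp order. The batch structure and field names below are illustrative assumptions, not the actual Nettskjema server export format.

```python
# Sketch: merge per-minute sensor batches into single-participant records.
# The batch structure (device_id, submission number, samples) is assumed
# for illustration; the real server export may differ.
from collections import defaultdict

def merge_batches(batches):
    """Group batches by device ID and return samples sorted by timestamp."""
    per_device = defaultdict(list)
    for batch in batches:
        per_device[batch["device_id"]].extend(batch["samples"])
    # Sort each participant's samples by the device epoch timestamp,
    # since batches may arrive out of submission order.
    return {dev: sorted(samples, key=lambda s: s["device_timestamp"])
            for dev, samples in per_device.items()}

batches = [
    {"device_id": "abc123", "submission": 2,
     "samples": [{"device_timestamp": 60.0, "x": 0.1, "y": 0.0, "z": 9.8}]},
    {"device_id": "abc123", "submission": 1,
     "samples": [{"device_timestamp": 0.0, "x": 0.0, "y": 0.0, "z": 9.8}]},
]
merged = merge_batches(batches)
print([s["device_timestamp"] for s in merged["abc123"]])  # [0.0, 60.0]
```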
MusicLab alignment methods for time-locking the participants' phone signals with the concert audio can be found [on GitHub][5].

Start_time = '2021-10-26 17:30:00+0000' <br>
End_time = '2021-10-26 20:30:00+0000'

If it was possible to synchronize the device time to concert time (participant motion quality code Q or R), the CSV file contains the following fields:

- **time:** time of measurement in concert time (ms)
- **datetime:** time of measurement in concert time (datetime)
- **x, y, z:** accelerometer measurement dimensions, floats
- **alpha, beta, gamma:** gyroscope measurement dimensions, floats
- **device_timestamps:** epoch timestamp values from the recording device (Android or iOS mobile)

If it was NOT possible to synchronize the device time to concert time (participant motion quality code S, T, or U), the CSV file contains the following fields:

- **device_time:** approximate time of measurements in ms, without alignment correction
- **device_datetime:** approximate datetime of measurements, without alignment correction
- **x, y, z:** accelerometer measurement dimensions, floats
- **alpha, beta, gamma:** gyroscope measurement dimensions, floats
- **device_timestamps:** epoch timestamp values from the recording device (Android or iOS mobile)

NOTE: the sensor recordings are consecutive but not isochronous. Some gaps between samples are minutes long; others are only a little above the mode sample interval. Interpolate with care! We recommend inserting NaNs in gaps too large for a given analysis.

## Accelerometry Sensors (AX3s)

![AX3 sensor station][6]

### Description

Audience members who were unable or did not wish to use their smartphone could instead participate by wearing a sensor. The [AX3 sensor by Axivity][7] is a data-logging accelerometer. Participants were fitted with the sensor prior to the concert.
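As an aside on working with the app CSVs described above, trimming samples to the concert window and inserting NaNs in oversized gaps might be sketched as below. The gap threshold and the simple (timestamp, value) sample layout are simplifying assumptions for illustration.

```python
# Sketch: trim samples to the concert window and mark large gaps with NaN,
# per the gap-handling note above. The one-second threshold and the
# (timestamp, value) layout are assumptions, not the project's exact format.
import math
from datetime import datetime, timezone

START = datetime(2021, 10, 26, 17, 30, tzinfo=timezone.utc).timestamp()
END = datetime(2021, 10, 26, 20, 30, tzinfo=timezone.utc).timestamp()

def trim_and_mark_gaps(samples, max_gap=1.0):
    """Keep (t, x) samples inside the concert window; insert one NaN sample
    wherever consecutive timestamps are more than max_gap seconds apart."""
    kept = [(t, x) for t, x in samples if START <= t <= END]
    out = []
    for i, (t, x) in enumerate(kept):
        if i > 0 and t - kept[i - 1][0] > max_gap:
            out.append((kept[i - 1][0] + max_gap, float("nan")))
        out.append((t, x))
    return out

samples = [(START - 10, 0.0),    # before the concert: dropped
           (START + 1, 0.1), (START + 2, 0.2),
           (START + 120, 0.3)]   # a roughly two-minute gap
cleaned = trim_and_mark_gaps(samples)
print(sum(math.isnan(x) for _, x in cleaned))  # 1 NaN marker inserted
```

Interpolating across the remaining samples then leaves the marked gaps empty instead of bridging them.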
The sensors were positioned in the pouch of the phone holders and worn around participants' necks, much as the phones were. Note that these sensors are much lighter than a smartphone and therefore may not have lain as flat on the participants' chests. The Axivity sensors were armed for recording using [OmGui][8], and recording was started a few hours before the concert (around 15:00). They were prepared ahead of time by placing them in the pouches of phone holders specially marked with bright orange stickers to aid retrieval of the sensors. Participants first read and completed a paper consent form, marking it with an X, and the sensor's unique identifier was written on their paper survey. The phone holders were positioned on 7 participants' chests before the concert. After the concert, the AX3s were collected, and the data was downloaded and converted to all available data formats. Eleven sensors were recording but not worn by participants; the recordings from these sensors were deleted.

### Data

Raw data is in the format of ".cwa" files. These files were downloaded from the sensors through OmGui and exported as raw WAV, CSV, and resampled CSV (all available file formats). Motion data was then aligned to match the MusicLab App data.

- AX3s (n = 18 prepared; n = 7 used)
- Raw format: .cwa, Axivity's own data format (stands for Continuous Wave Accelerometer data)
- Processed format: CSV
- Settings: 100 Hz

CSV files are named for the participant identifier.
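Since the resampled export is nominally 100 Hz, consecutive timestamps should sit about 10 ms apart. A quick sanity check along those lines, assuming millisecond timestamps, might look like:

```python
# Sketch: verify that a resampled AX3 export is close to its nominal rate.
# Millisecond timestamp units are an assumption for illustration.

def estimate_rate_hz(timestamps_ms):
    """Estimate the sample rate from the median inter-sample interval."""
    gaps = sorted(b - a for a, b in zip(timestamps_ms, timestamps_ms[1:]))
    median_gap_ms = gaps[len(gaps) // 2]
    return 1000.0 / median_gap_ms

# 100 Hz implies samples every 10 ms.
ts = [i * 10.0 for i in range(1000)]
print(estimate_rate_hz(ts))  # 100.0
```

The median interval is used rather than the mean so that a few long dropouts do not skew the estimate.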
If it was possible to synchronize the device time to concert time (participant motion quality code Q or R), the CSV file contains the following fields:

- **time:** time of measurement in concert time (ms)
- **datetime:** time of measurement in concert time (datetime)
- **x, y, z:** accelerometer measurement dimensions, floats
- **device_timestamps:** epoch timestamp values from the recording device

If it was NOT possible to synchronize the device time to concert time (participant motion quality code S, T, or U), the CSV file contains the following fields:

- **device_time:** approximate time of measurements in ms, without alignment correction
- **device_datetime:** approximate datetime of measurements, without alignment correction
- **x, y, z:** accelerometer measurement dimensions, floats
- **device_timestamps:** epoch timestamp values from the recording device

This data is consented for open sharing and has no copyright associated with it.

[1]: https://www.uio.no/ritmo/english/news-and-events/events/musiclab/2021/dsq/photos/bittencourt/dsq-festival-2021-by-caroline-bittencourt-27.jpg "Audience by Caroline Bittencourt"
[2]: https://www.uio.no/ritmo/english/news-and-events/events/musiclab/2021/dsq/photos/20211026_p1023677.jpg "Phones worn example"
[3]: https://www.uio.no/ritmo/english/research/labs/fourms/software/musiclab-app/
[4]: https://www.uio.no/ritmo/english/news-and-events/events/musiclab/2021/dsq/for-participants/
[5]: https://github.com/finn42/MusicLab_aligning
[6]: https://www.uio.no/ritmo/english/news-and-events/events/musiclab/2021/dsq/photos/ritmo/20211026_p1023818.jpg "AX3 sensor station"
[7]: https://axivity.com/product/ax3 "AX3 product page"
[8]: https://github.com/digitalinteraction/openmovement/wiki/AX3-GUI "OmGui GitHub"