# EMO2018-vis-verb

Files and scripts related to the EMO2018 data manuscript "Processing emotions from faces and words measured by event-related brain potentials".

## Manuscript

The accompanying manuscript can be found in the files list and at https://www.tandfonline.com/doi/full/10.1080/02699931.2023.2223906

## General Information

All data were collected in 2018-2020 in the Laboratory of Experimental Psychology at the University of Tartu, Ravila 14a, 50411 Tartu, Estonia.

This work was funded by the Estonian Research Council institutional research grant IUT2-13 "Psychological mechanisms of cognition and personality processes", the personal research grant PRG1151 "Pre-attentive information processing in the brain: relationship to state and trait variables, and behaviour", and the personal research grant PUT638 "Enhancing human-computer interaction using multimodal emotion recognition based on gesture, face, and speech".

The authors have no potential conflicts of interest to disclose.

## Data and File Overview

The repository consists of structured data files (output folder) and R scripts (scripts folder).

Datasets are structured as follows (a minimal R loading sketch appears after the variable descriptions):

- data-long6.RData - a cleaned version of the raw data with average electrocortical amplitudes (0-600 ms), with each electrode in a separate column (used for topographical plotting).
- data-threshold6-ROI.RData - a cleaned version of the raw data with average electrocortical amplitudes (0-600 ms) divided into 2 ms segments (used for pairwise analyses).
- data2_1.csv - self-reported confidence ratings for task success.
- data_plot.csv - data to create Figure 4.
- supplementary material containing data for 301 pairwise F-tests run on the emotion-minus-neutral values in the visual and auditory ROIs. Contains pairwise F-test statistics, contrast-effect confidence intervals, R2 values, and FDR- and FWER-controlling multiple-testing-corrected p-values using the Benjamini-Hochberg (BH) procedure for each individual 2 ms paired test.
- input_topo.csv - data to create Figure 5.

Variable descriptions:

- Subj - [001-120]; unique number for each participant.
- Group - [ee/ru/en]; ee - Estonian; ru - Russian; en - English; the participant's primary language group.
- Sex - [m/f]; m - male; f - female.
- Marker - 14 unique numbers representing condition + emotion, used to recode the Condition and Emotion columns.
- Condition - [Ekman/Under]; Ekman - face condition, Under - word condition; experimental condition.
- Emotion - [anger, disgust, fear, happiness, sadness, surprise, neutral]; emotion used in the condition.
- Time - [-199.22 to 1033.52]; 615 timepoints for each individual, corresponding to the temporal progression of the ERP wave from approximately -200 ms to 1000 ms.
- ROI - [ROI1/ROI2/ROI3]; single electrodes aggregated into the occipital (occipital, occipito-parietal, parietal) area (electrodes P7, P5, P3, P1, PO7, PO3, POz, Oz, O1, P2, P4, P6, P8, PO4, PO8, O2), the central (central, centro-parietal) area (electrodes C5, C3, C1, Cz, C2, C4, C6, CP5, CP3, CP1, CPz, Pz, CP2, CP4, CP6), and the left temporal area (electrodes F7, FT7, T7, TP7, P9).
- FP1 to O2 - electrode designations, corresponding to a 64-electrode set (international 10-20 electrode placement system), i.e., https://www.biosemi.com/pin_electrode.htm
- T_1 to T_615 - 2 ms time intervals (matching those of the "Time" column) used to run the pairwise F-tests.
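As a quick orientation (this is not part of the repository's scripts), the sketch below shows one way to load these files in R. The relative paths and the object names stored inside the .RData files are assumptions; `load()` returns the names of the objects it restored, so check those first.

```r
# Minimal R sketch for loading the repository's data files.
# NOTE: the paths and the object names inside the .RData files
# are assumptions; inspect the names returned by load().
loaded <- load("output/data-long6.RData")        # long-format amplitudes, one column per electrode
print(loaded)                                    # names of the restored objects

loaded_roi <- load("output/data-threshold6-ROI.RData")  # 2 ms segments for pairwise analyses
print(loaded_roi)

confidence <- read.csv("output/data2_1.csv")     # self-reported confidence ratings
plot_data  <- read.csv("output/data_plot.csv")   # data behind Figure 4
topo_data  <- read.csv("output/input_topo.csv")  # data behind Figure 5

str(confidence)                                  # check variables against the descriptions above
```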
Script files are structured as follows:

- input_topo_graphs.R - steps to create Figure 5.
- script_02_st_selfrep-graphs.R - steps to create Figure 2.
- script_05_lj_new-ROIs.R - steps to create Figure 3, and steps to create Figure 4 from data-threshold6-ROI.RData. First, it creates the file data_plot.csv (already in the output folder). Then, it uses the ERP package (Causeur et al., 2019) to plot figures depicting point-wise significance analyses with ggplot2 (a minimal sketch of this kind of point-wise analysis follows this list).

These files were created in June and July 2022.
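The actual analyses live in script_05_lj_new-ROIs.R and the ERP package; the following is only a minimal base-R sketch of the point-wise idea described above: one paired test per 2 ms bin on the emotion-minus-neutral difference, with Benjamini-Hochberg correction across bins. The matrix names (`amp_emotion`, `amp_neutral`) are hypothetical; for a single paired two-condition contrast, the squared paired t statistic used here is equivalent to the pairwise F-test reported in the supplementary material.

```r
# Hypothetical point-wise analysis: one test per 2 ms bin on the
# emotion-minus-neutral difference, then BH correction across bins.
# `amp_emotion` and `amp_neutral` are assumed matrices of
# participants (rows) x 615 time bins (columns) for one ROI.
pointwise_bh <- function(amp_emotion, amp_neutral, alpha = 0.05) {
  diffs <- amp_emotion - amp_neutral                       # paired differences per bin
  pvals <- apply(diffs, 2, function(x) t.test(x)$p.value)  # one-sample test per bin (t^2 = F)
  p_bh  <- p.adjust(pvals, method = "BH")                  # Benjamini-Hochberg FDR correction
  data.frame(bin         = seq_along(pvals),
             p_raw       = pvals,
             p_BH        = p_bh,
             significant = p_bh < alpha)
}
```

The ERP package provides its own point-wise testing and plotting routines; the base-R version above is shown only to make the per-bin testing and the BH step explicit.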
## Licenses or restrictions placed on the data

## Links to publications that cite or use the data

https://www.tandfonline.com/doi/full/10.1080/02699931.2023.2223906

## Recommended citation for the data

## Methodology

### Participants

119 healthy volunteers participated in the study. The data of 3 participants were excluded due to technical errors, and the final sample consisted of 116 participants (45 men; M = 25.02; SD = 6.34; age range: 18-49) with normal or corrected-to-normal eyesight. The sample consisted of 91 Estonian-, 17 Russian-, and 8 English-speaking participants. 107 participants described themselves as right-handed, 8 as left-handed, and 1 (main hand right) as ambidextrous. A preliminary ANOVA on the self-reported ratings of the groups based on sex, primary language, and handedness showed no significant differences, and all groups were included in the final analyses. Participation was reimbursed with a 15 EUR department store gift card. The study was approved by the Research Ethics Committee of the University of Tartu per the ethical standards of the Code of Ethics of the World Medical Association (Declaration of Helsinki), and a written signed consent form was obtained from all participants.

## Measures and Procedure

The recordings were conducted in an electrically shielded, quiet room in the Laboratory of Experimental Psychology at the University of Tartu. Stimuli were presented in Psychtoolbox (MATLAB, MathWorks, Natick, Massachusetts, United States) on a standard PC monitor (LCD display, 19" diagonal). Throughout the experiment, subjective reports, EEG, skin conductance (SCR), and video recordings of the participants' facial expressions were collected. Given the scope of the current paper, the subjective reports, video material, and SCR are not analysed further. The participant was instructed to view either a written word or a picture of a face on the screen, to then express the related emotion (i.e., produce a facial expression), and to use a mouse to answer how well they expressed the emotion (see Figure 1 for examples of stimuli and the experiment set-up).

## Stimuli and Experiment Set-up

Visual stimuli were selected from Ekman's standard set of facial expressions, JACFEE (Matsumoto & Ekman, 1988), and words (e.g., "happiness", "anger"; see Figure 1) were taken from the emotion lexicon. Both experimental conditions consisted of 7 (types of emotion) × 6 repetitions (different variants of a face in the visual condition and repetitions of an emotional word in the verbal condition). Different variants of pictures within an emotion category were used to attain variability in emotional expressions as well as to avoid habituation, which has been shown to diminish effects in brain imaging recordings (Breiter et al., 1996; Feinstein et al., 2002). The stimuli were presented semi-randomly.

## EEG Recordings and Pre-Processing

A BioSemi ActiveTwo (BioSemi, Amsterdam, The Netherlands) active electrode system was used to record signals from 64 scalp locations. Two reference electrodes were placed behind the earlobes, and four ocular electrodes (above and below the left eye and near the outer canthi of both eyes) were used to correct for eye movements and blinks. The data were recorded with a 0.16-100 Hz band-pass filter and a 512 Hz sampling rate. The placement of the electrodes followed the international 10-20 system (Jasper, 1958).

The EEG data were processed offline in BrainVision Analyzer 2.1 (Brain Products GmbH, Munich, Germany). The data were referenced to the earlobe electrodes, and correction for eye blinks (Gratton et al., 1983) and a Butterworth zero-phase filter (0.16-30 Hz; 24 dB/oct) were applied. The data were separated into segments from -200 ms to 1000 ms, and baseline correction was applied at 0-50 ms post stimulus onset. Segments with amplitudes lower than 0.5 μV or higher than 100 μV were removed, resulting in the loss of 2.15% of the raw data, and ERPs were calculated by averaging the segments from the six stimulus repetitions.
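The pre-processing above was done in BrainVision Analyzer rather than in R, but the baseline-correction and averaging logic is simple to illustrate. The sketch below is a hypothetical, stand-alone illustration, assuming a matrix of trials by samples spanning -200 ms to 1000 ms in ~2 ms steps (615 samples, as in the shared data); it is not part of the repository's scripts.

```r
# Hypothetical illustration of the baseline-correction and averaging
# steps described above (the actual pre-processing was done in
# BrainVision Analyzer, not in R).
# `segments` is assumed: a matrix of trials (rows) x samples (columns),
# spanning -200 ms to 1000 ms in ~2 ms steps.
times <- seq(-200, 1000, length.out = 615)        # approximate sample times in ms

baseline_correct <- function(segments, times, from = 0, to = 50) {
  idx <- which(times >= from & times <= to)       # 0-50 ms post stimulus onset
  # subtract each trial's mean baseline amplitude from that trial
  sweep(segments, 1, rowMeans(segments[, idx, drop = FALSE]))
}

# ERP = average of the baseline-corrected segments from the six
# stimulus repetitions of one condition/emotion:
# erp <- colMeans(baseline_correct(segments, times))
```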
For questions or enquiries, please contact:

Kairi Kreegipuu, PhD
Professor of Experimental Psychology
Institute of Psychology
University of Tartu
Näituse 2, Tartu, 50409, Estonia
Email: kairi.kreegipuu@ut.ee