The dataset used for this project comes from the study “The NimStim set of facial expressions: Judgments from untrained research participants” by Tottenham et al. (2009). The NimStim Set of Facial Expressions is a broad dataset comprising 672 naturally posed photographs of 43 professional actors (18 female, 25 male) aged 21 to 30 years. The actors were chosen from a diverse sample: African-American (N = 10), Asian-American (N = 6), European-American (N = 25), and Latino-American (N = 2). The images depict eight emotional expressions: neutral, angry, disgust, surprise, sad, calm, happy, and afraid. Both open- and closed-mouth versions were provided for all expressions, with the exception of surprise (open mouth only) and happy (an additional high-arousal open-mouth/exuberant version was provided). The face stimuli can be accessed for free at http://www.macbrain.org/resources.htm.

The participants’ responses in the data files have been divided into multiple categories: first by whether the expression was open- or closed-mouth, then by which of the eight emotions was conveyed, and finally by the actor’s gender. Two Excel files contain the data for this study, in addition to the coding key. These files are available on the study’s Open Science Framework (OSF) project page: https://osf.io/w98qj/. Coding instructions can be found in the file “individual_scores_MODELs1_20.xls”. The coding scheme for this dataset was as follows: “surprise” = 2, “happy” = 3, “neutral” = 4, “calm” = 5, “disgust” = 6, “angry” = 7, “sad” = 8, “none of the above” = 9, “afraid” = 10. Notably, responses of “calm” and “neutral” were considered interchangeable, and hence both were marked as correct.
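The coding scheme above can be sketched as a simple lookup plus a scoring rule. This is a minimal illustration, not code from the original study: the function name `is_correct` and its signature are our own, and it only assumes the code-to-label mapping and the calm/neutral interchangeability described in the coding key.

```python
# Coding key for participant responses, as stated in the coding instructions
# ("individual_scores_MODELs1_20.xls").
CODES = {
    2: "surprise", 3: "happy", 4: "neutral", 5: "calm",
    6: "disgust", 7: "angry", 8: "sad", 9: "none of the above",
    10: "afraid",
}

# "calm" and "neutral" responses were interchangeable, so either one
# counts as correct when the target expression is one of the two.
INTERCHANGEABLE = {"calm", "neutral"}

def is_correct(response_code: int, target_emotion: str) -> bool:
    """Return True if a coded participant response matches the target emotion."""
    response = CODES.get(response_code)
    if response is None:
        raise ValueError(f"Unknown response code: {response_code}")
    if target_emotion in INTERCHANGEABLE:
        return response in INTERCHANGEABLE
    return response == target_emotion
```

For example, a response coded 5 (“calm”) to a neutral target would be scored as correct under this rule, whereas a response coded 7 (“angry”) to a happy target would not.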
The file “individual_scores_MODELs1_20.xls” contains the participants’ responses to the face stimuli for the first 20 actors in the dataset; the file “individual_scores_MODELs1_21-43.xls” contains the responses for the remaining 23 actors. The NimStim Set of Facial Expressions is widely known in the literature: it has been used extensively and cited over 2,000 times in publications. The dataset has been applied in various research areas, such as the investigation of working memory (Schweizer et al., 2018), self-regulation (Casey et al., 2011), and even the treatment of clinical disorders such as schizophrenia (Kurtz et al., 2016). In particular, it has contributed greatly to research using neuroimaging techniques, especially those involving amygdala activation (Tsuchiya, Moradi, Felsen, Yamazaki, & Adolphs, 2009; Tottenham et al., 2011). Studies like these can help expand our knowledge of how emotional expressions are interpreted by different cohorts.

REFERENCES

Casey, B. J., Somerville, L. H., Gotlib, I. H., Ayduk, O., Franklin, N. T., Askren, M. K., ... & Glover, G. (2011). Behavioral and neural correlates of delay of gratification 40 years later. Proceedings of the National Academy of Sciences, 108(36), 14998-15003.

Kurtz, M. M., Gagen, E., Rocha, N. B., Machado, S., & Penn, D. L. (2016). Comprehensive treatments for social cognitive deficits in schizophrenia: A critical review and effect-size analysis of controlled studies. Clinical Psychology Review, 43, 80-89.

Schweizer, S., Satpute, A. B., Atzil, S., Field, A., Hitchcock, C., Black, M., ... & Dalgleish, T. (2018). The behavioral and neural effects of affective information on working memory performance: A pair of meta-analytic reviews.

Tottenham, N., Hare, T. A., Millner, A., Gilhooly, T., Zevin, J. D., & Casey, B. J. (2011). Elevated amygdala response to faces following early deprivation. Developmental Science, 14(2), 190-204.
Tottenham, N., Tanaka, J. W., Leon, A. C., McCarry, T., Nurse, M., Hare, T. A., ... & Nelson, C. (2009). The NimStim set of facial expressions: Judgments from untrained research participants. Psychiatry Research, 168(3), 242-249.

Tsuchiya, N., Moradi, F., Felsen, C., Yamazaki, M., & Adolphs, R. (2009). Intact rapid detection of fearful faces in the absence of the amygdala. Nature Neuroscience, 12(10), 1224.