The attached files accompany the publication by Skiendziel, Rösch, and Schultheiss. Some are data files generated with FaceReader 7 from the movies of 80 target subjects, each enacting six basic emotions (anger, happiness, disgust, sadness, fear, surprise) in short video clips. These enactments were also coded by two certified coders using the Facial Action Coding System (FACS); their codings are included as well. The analysis syntax was written for SYSTAT 13, is annotated extensively, and can be opened in any ASCII editor even if you do not have SYSTAT. It documents how the original data files were processed and merged and how the results reported in the paper "Assessing the convergent validity between the automated emotion recognition software Noldus FaceReader 7 and manual Facial Action Coding System Scoring" were generated. The attached files also include the final processed and aggregated data files, in both SYSTAT and SPSS formats, for easy access and for retracing our analytical steps.
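For readers who prefer to inspect the processed data outside of SYSTAT or SPSS, the following is a minimal sketch of how the aggregated SPSS file could be loaded in Python with pandas (the file name used here is a placeholder; substitute the actual .sav file included in this archive):

    import pandas as pd  # reading SPSS files additionally requires the pyreadstat package

    # Placeholder file name -- replace with the aggregated .sav file from this archive
    data = pd.read_spss("aggregated_data.sav")

    print(data.shape)    # number of rows (cases) and columns (variables)
    print(data.columns)  # variable names as defined in the SPSS file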
Finally, we include sample pictures from the SMoFEE picture set to illustrate all types of emotional expressions generated for this set (Smofee_stimulus_conditions.pptx). For more information on the validation of the SMoFEE picture set, please see https://opus4.kobv.de/opus4-fau/frontdoor/index/index/year/2012/docId/2304. The picture set can be obtained from oliver.schultheiss@fau.de, provided that users agree in writing to the terms of use for the set. These terms are intended to protect the rights of the individuals depicted in the set.
On September 30, 2019, we uploaded the SYSTAT syntax and output files (in both Word and PDF format) for the extended analyses requested by the reviewers. These requests led to the inclusion of Table 4 in the manuscript. The syntax and output document how the indices reported in this table were calculated, in addition to everything that already formed the basis of the previous manuscript version.