Join Zoom Meeting: https://us04web.zoom.us/j/75008053837?pwd=TXVBOXB6c0J3VmduN2loVksvMVNtdz09
Meeting ID: 750 0805 3837
Password: 1H0mES
Abstract
Musicians usually want to express emotions with their music. In turn, their listeners want to detect these emotions, and usually they can. Research suggests that the ability to detect music-expressed emotion varies across individuals, much like the ability to detect emotions conveyed through vocal expression (Mayer, Roberts, & Barsade, 2007). Differences in the ability to recognise music-expressed emotions have been linked to personality traits of the listener (Vuoskoski & Eerola, 2011). This study examines which of the Big Five personality traits best predicts the ability to discriminate emotions in music. We used a cross-sectional sample from the longitudinal LongGold project consisting of 762 secondary-school students (mean age = 13.48 years, SD = 2.01 years; 69.03% female) from the UK (n = 455) and Germany (n = 218). Personality was measured with the Ten-Item Personality Inventory (TIPI; Gosling, Rentfrow, & Swann, 2003), and emotion discrimination ability was assessed with the Emotion Discrimination Task (EDT; MacGregor & Müllensiefen, 2019). Correlational analyses indicated associations between EDT scores and agreeableness (r = 0.12, p < .01), openness (r = 0.19, p < .01), and conscientiousness (r = 0.11, p < .01), whereas a multiple regression showed that only openness (b = .02, p < .01) and emotional stability (b = .01, p < .05) predicted musical emotion discrimination ability, with about 5% of the variance explained. A random forest regression model produced comparable results, explaining 7% of the variance. The results suggest that openness is the best predictor of emotion detection in music, but that this trait explains only a small proportion of the variance in this ability.
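For readers curious how such an analysis pipeline is typically assembled, the sketch below runs the three reported steps (zero-order correlations, a simultaneous multiple regression, and a random forest regression with cross-validated variance explained) in Python on synthetic data. The variable names, trait distributions, and simulated effect sizes are illustrative assumptions only; the LongGold data are not reproduced here, and the actual study may have used different software.

import numpy as np
import pandas as pd
from scipy.stats import pearsonr
import statsmodels.api as sm
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
n = 762  # sample size reported in the abstract
traits = ["extraversion", "agreeableness", "conscientiousness",
          "emotional_stability", "openness"]

# Hypothetical TIPI-like trait scores (the TIPI uses a 1-7 scale)
X = pd.DataFrame(rng.normal(4.0, 1.2, size=(n, 5)), columns=traits)

# Synthetic EDT score weakly driven by openness and emotional stability,
# mimicking the small effects reported (~5-7% of variance explained)
y = (0.2 * X["openness"] + 0.1 * X["emotional_stability"]
     + rng.normal(0.0, 1.0, n))

# Step 1: zero-order correlations between each trait and the EDT score
for t in traits:
    r, p = pearsonr(X[t], y)
    print(f"{t:>20}: r = {r:.2f}, p = {p:.3f}")

# Step 2: multiple regression with all five traits entered simultaneously
ols = sm.OLS(y, sm.add_constant(X)).fit()
print(ols.summary().tables[1])
print(f"OLS R^2 = {ols.rsquared:.3f}")

# Step 3: random forest regression; cross-validated R^2 approximates
# out-of-sample variance explained
rf = RandomForestRegressor(n_estimators=500, random_state=1)
print(f"RF CV R^2 = {cross_val_score(rf, X, y, cv=5, scoring='r2').mean():.3f}")

Comparing the cross-validated R^2 of the random forest against the OLS R^2, as in the abstract, indicates whether non-linear or interaction effects add predictive value beyond the linear model.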