During the [BCEM poster session][1], Wed 20 May 2020, 14.45-15.30 BST, we are all available on Zoom to discuss the poster!

Join Zoom Meeting: [https://us04web.zoom.us/j/76809657232?pwd=bWdUdTNvQnlEcmI5SFI3dkYydWRPUT09][2]
Meeting ID: 768 0965 7232
Password: musicemoti

Or contact us by email: gral2 / a.k.jordanous / c.li @kent.ac.uk

George Langroudi* - School of Computing, University of Kent
Anna Jordanous* - School of Computing, University of Kent
Ling Li* - School of Computing, University of Kent
*presenting author

Music Emotion Capture: Ethical issues around emotion-based music generation

People's emotions are not always detectable, e.g. if a person has difficulties or lacks the skills to express emotions, or if people are geographically separated or communicating online. Brain-computer interfaces (BCI) could enhance non-verbal communication of emotion, particularly in detecting and responding to users' emotions, e.g. in music therapy or interactive software. Our pilot study Music Emotion Capture [1] detects, models and sonifies people's emotions based on their real-time emotional state, measured by mapping EEG feedback onto a valence-arousal emotional model [2] based on [3] (an illustrative sketch of this kind of mapping is given after the references).

Though many practical applications emerge, the work raises several ethical questions that need careful consideration. This poster discusses these ethical issues. Are the work's benefits (e.g. improved user experiences; music therapy; increased emotion communication abilities; enjoyable applications) important enough to justify navigating the ethical issues that arise (e.g. privacy issues; control of the representation of, and reaction to, users' emotional state; consequences of detection errors; the loop in which emotion is used to generate music and the music in turn affects emotion, with the human in the process as an "intruder")?

References

[1] Langroudi, G., Jordanous, A., & Li, L. (2018). Music Emotion Capture: emotion-based generation of music using EEG. Emotion Modelling and Detection in Social Media and Online Interaction symposium @ AISB 2018, Liverpool.
[2] Paltoglou, G., & Thelwall, M. (2012). Seeing stars of valence and arousal in blog posts. IEEE Transactions on Affective Computing, 4(1).
[3] Russell, J. A. (1980). A circumplex model of affect. Journal of Personality and Social Psychology, 39.

[1]: https://bcem-conference.weebly.com/posters.html
[2]: https://us04web.zoom.us/j/76809657232?pwd=bWdUdTNvQnlEcmI5SFI3dkYydWRPUT09
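To make the valence-arousal idea concrete, the following is a minimal sketch of how a (valence, arousal) reading might be translated into simple musical parameters. It is purely illustrative: the function name `map_emotion_to_music`, the thresholds and the parameter ranges are assumptions chosen for this example, not the mapping used in the Music Emotion Capture system [1].

```python
# Illustrative sketch only: a hypothetical mapping from a valence-arousal
# reading to simple musical parameters. Thresholds and ranges are assumptions
# for illustration, not the mapping described in [1].
from dataclasses import dataclass


@dataclass
class MusicParams:
    tempo_bpm: float   # faster for higher arousal
    mode: str          # major for positive valence, minor for negative
    base_pitch: int    # MIDI note number, shifted slightly with valence


def map_emotion_to_music(valence: float, arousal: float) -> MusicParams:
    """Map a (valence, arousal) pair in [-1, 1] x [-1, 1] to music parameters."""
    # Clamp inputs to the expected range.
    valence = max(-1.0, min(1.0, valence))
    arousal = max(-1.0, min(1.0, arousal))

    tempo = 90 + 40 * arousal             # 50-130 BPM across the arousal axis
    mode = "major" if valence >= 0 else "minor"
    base_pitch = 60 + round(6 * valence)  # around middle C (MIDI note 60)
    return MusicParams(tempo_bpm=tempo, mode=mode, base_pitch=base_pitch)


if __name__ == "__main__":
    # Example: a calm, mildly positive reading.
    print(map_emotion_to_music(valence=0.4, arousal=-0.3))
```

A mapping of this shape also makes some of the ethical questions above tangible: whoever chooses the thresholds decides how a user's emotional state is represented and responded to, and any error in the estimated valence or arousal propagates directly into the generated music.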