Emotion Recognition based on EEG Signals in Response to Bilingual Music Tracks

Rida Zainab and Muhammad Majid

Department of Computer Engineering, University of Engineering and Technology Taxila, Pakistan

Abstract: Emotions are vital for communication in daily life, and their recognition is important in the field of artificial intelligence. Music helps evoke human emotions, and brain signals can effectively describe them. This study utilizes Electroencephalography (EEG) signals to recognize four emotions, namely happy, sad, angry, and relaxed, in response to bilingual (English and Urdu) music. Five genres of English music (rap, rock, hip-hop, metal, and electronic) and five genres of Urdu music (ghazal, qawwali, famous, melodious, and patriotic) are used as external stimuli. Twenty-seven participants gave informed consent, took part in the experiment, listened to three songs of two minutes each, and recorded self-assessments. A commercially available four-channel Muse headband is used for EEG data recording. Frequency-domain and time-domain features are fused to construct a hybrid feature vector, which is then used by classifiers to recognize the emotional response. It is observed that the hybrid features give better results than features from either domain alone, and that the most common and easily recognized emotion is happy. Three classifiers, namely Multilayer Perceptron (MLP), Random Forest, and Hyper Pipes, are used, and the highest accuracy of 83.95% is achieved with the Hyper Pipes classifier.
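The sketch below is a minimal illustration of the feature-fusion pipeline described in the abstract: time-domain statistics and frequency-domain band powers are computed per channel and concatenated into a hybrid feature vector before classification. It is not the authors' code; the sampling rate, channel count, chosen statistics, frequency bands, and synthetic data are assumptions, and Random Forest stands in for the three classifiers compared in the paper (Hyper Pipes is a Weka classifier with no scikit-learn equivalent).

```python
# Illustrative sketch of a hybrid (time + frequency domain) EEG feature pipeline.
# Feature choices, sampling rate, and the synthetic data are assumptions.
import numpy as np
from scipy.signal import welch
from scipy.stats import skew, kurtosis
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

FS = 256            # assumed Muse headband sampling rate (Hz)
CHANNELS = 4        # the Muse headset records four channels (TP9, AF7, AF8, TP10)
BANDS = {"theta": (4, 8), "alpha": (8, 13), "beta": (13, 30), "gamma": (30, 45)}

def time_features(x):
    """Simple time-domain statistics for one channel."""
    return [x.mean(), x.std(), skew(x), kurtosis(x)]

def band_powers(x, fs=FS):
    """Frequency-domain band powers from a Welch power spectrum."""
    freqs, psd = welch(x, fs=fs, nperseg=fs * 2)
    return [psd[(freqs >= lo) & (freqs < hi)].mean() for lo, hi in BANDS.values()]

def hybrid_features(epoch):
    """Fuse time- and frequency-domain features of all channels into one vector."""
    feats = []
    for ch in epoch:                       # epoch shape: (channels, samples)
        feats.extend(time_features(ch))
        feats.extend(band_powers(ch))
    return np.asarray(feats)

# Synthetic stand-in for epoched EEG trials: 120 trials, 4 channels, 10 s each.
rng = np.random.default_rng(0)
epochs = rng.standard_normal((120, CHANNELS, FS * 10))
labels = rng.integers(0, 4, size=120)      # 0=happy, 1=sad, 2=angry, 3=relaxed

X = np.stack([hybrid_features(e) for e in epochs])
scores = cross_val_score(RandomForestClassifier(n_estimators=100), X, labels, cv=5)
print(f"Mean cross-validated accuracy: {scores.mean():.2f}")
```

With real EEG recordings in place of the synthetic trials, the same hybrid vectors could be fed to an MLP or other classifiers for the kind of comparison reported in the paper.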

Keywords: Emotion recognition, electroencephalography, feature extraction, classification, bilingual music.

Received September 16, 2019; accepted July 26, 2020

https://doi.org/10.34028/iajit/18/3/4