
Improving Emotion Analysis for Speech-Induced EEGs Through EEMD-HHT-Based Feature Extraction and Electrode Selection

Author(s): Jing Chen (Harbin Institute of Technology, China), Haifeng Li (Harbin Institute of Technology, China), Lin Ma (Harbin Institute of Technology, China) and Hongjian Bo (Shenzhen Academy of Aerospace Technology, China)
Copyright: 2021
Volume: 12
Issue: 2
Pages: 18
Source title: International Journal of Multimedia Data Engineering and Management (IJMDEM)
Editor(s)-in-Chief: Chengcui Zhang (University of Alabama at Birmingham, USA) and Shu-Ching Chen (University of Missouri-Kansas City, United States)
DOI: 10.4018/IJMDEM.2021040101

Abstract

Emotion detection using EEG signals has the advantage of eliminating social masking, giving a better view of underlying emotions. This paper presents the cognitive response to emotional speech and emotion recognition from EEG signals. A framework is proposed to recognize mental states from EEG signals induced by emotional speech. First, a speech-evoked emotion cognition experiment is designed and an EEG dataset is collected. Second, power-related features are extracted using EEMD-HHT, which reflects the instantaneous frequency of the signal more accurately than STFT or WT. An extensive analysis of the relationships between frequency bands and the emotional annotations of the stimuli is presented using MIC and statistical analysis. The strongest correlations with the EEG signals are found in the lateral and medial orbitofrontal cortex (OFC). Finally, the performance of different combinations of feature sets and classifiers is evaluated, and the experiments show that the proposed framework can effectively recognize emotion from EEG signals, with an accuracy of 75.7% for valence and 71.4% for arousal.
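The abstract does not spell out the EEMD-HHT step, but the following is a minimal sketch of how power-related band features could be computed from a single EEG channel using ensemble empirical mode decomposition followed by a Hilbert transform. It assumes the third-party PyEMD (EMD-signal) and SciPy packages; the function name eemd_hht_band_power, the band edges, and the normalisation are illustrative assumptions, not the authors' implementation.

import numpy as np
from scipy.signal import hilbert
from PyEMD import EEMD

# Canonical EEG bands in Hz (band edges are illustrative, not taken from the paper).
BANDS = {"delta": (1, 4), "theta": (4, 8), "alpha": (8, 13), "beta": (13, 30), "gamma": (30, 45)}

def eemd_hht_band_power(signal, fs, trials=100):
    """Per-band power features for one EEG channel via EEMD plus Hilbert transform."""
    # 1) Ensemble empirical mode decomposition into intrinsic mode functions (IMFs).
    eemd = EEMD(trials=trials)
    imfs = eemd.eemd(signal)

    band_power = {name: 0.0 for name in BANDS}
    for imf in imfs:
        # 2) Hilbert transform gives the analytic signal of each IMF.
        analytic = hilbert(imf)
        amplitude = np.abs(analytic)
        phase = np.unwrap(np.angle(analytic))
        # Instantaneous frequency (Hz) from the phase derivative.
        inst_freq = np.diff(phase) * fs / (2.0 * np.pi)
        inst_power = amplitude[1:] ** 2

        # 3) Accumulate instantaneous power into the band each sample's frequency falls in.
        for name, (lo, hi) in BANDS.items():
            mask = (inst_freq >= lo) & (inst_freq < hi)
            band_power[name] += inst_power[mask].sum()

    # Normalise by signal length so features are comparable across trials.
    n = len(signal)
    return {name: p / n for name, p in band_power.items()}

# Example: features for a synthetic 3-second channel sampled at 250 Hz.
if __name__ == "__main__":
    fs = 250
    t = np.arange(0, 3, 1 / fs)
    x = np.sin(2 * np.pi * 10 * t) + 0.5 * np.random.randn(len(t))
    print(eemd_hht_band_power(x, fs))

In this sketch the per-band powers from each selected electrode would be concatenated into a feature vector before classification; electrode selection and the choice of classifier follow the evaluation described in the paper.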
