IRMA-International.org
Information Resources Management Association

Improving Emotion Analysis for Speech-Induced EEGs Through EEMD-HHT-Based Feature Extraction and Electrode Selection

Author(s): Jing Chen (Harbin Institute of Technology, China), Haifeng Li (Harbin Institute of Technology, China), Lin Ma (Harbin Institute of Technology, China) and Hongjian Bo (Shenzhen Academy of Aerospace Technology, China)
Copyright: 2021
Volume: 12
Issue: 2
Pages: 18
Source title: International Journal of Multimedia Data Engineering and Management (IJMDEM)
Editor(s)-in-Chief: Chengcui Zhang (University of Alabama at Birmingham, USA) and Shu-Ching Chen (Florida International University, USA)
DOI: 10.4018/IJMDEM.2021040101


Abstract

Emotion detection from EEG signals has the advantage of bypassing social masking, giving a more direct view of underlying emotions. This paper examines the cognitive response to emotional speech and emotion recognition from EEG signals. A framework is proposed to recognize mental states from EEG signals induced by emotional speech. First, a speech-evoked emotion cognition experiment is designed and an EEG dataset is collected. Second, power-related features are extracted using EEMD-HHT, which reflects the instantaneous frequency of the signal more accurately than STFT or WT. An extensive analysis of the relationships between frequency bands and the emotional annotation of the stimuli is presented using MIC and statistical analysis. The strongest correlations with the EEG signals are found in the lateral and medial orbitofrontal cortex (OFC). Finally, the performance of different feature-set and classifier combinations is evaluated; the experiments show that the proposed framework effectively recognizes emotion from EEG signals, with an accuracy of 75.7% for valence and 71.4% for arousal.
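The abstract's EEMD-HHT step can be illustrated with a minimal sketch of the Hilbert-transform half of the pipeline: given one intrinsic mode function (IMF), the analytic signal yields instantaneous amplitude and frequency, from which band-limited power features can be pooled. This is not the authors' code; the EEMD decomposition itself is omitted (a real pipeline would use an EEMD library to produce the IMFs first), and the function names `hht_features` and `band_power` are illustrative assumptions.

```python
import numpy as np
from scipy.signal import hilbert

def hht_features(imf, fs):
    """Instantaneous amplitude and frequency of a single IMF via the Hilbert transform."""
    analytic = hilbert(imf)                      # analytic signal x + j*H[x]
    amp = np.abs(analytic)                       # instantaneous amplitude (envelope)
    phase = np.unwrap(np.angle(analytic))        # continuous instantaneous phase
    inst_freq = np.diff(phase) * fs / (2 * np.pi)  # Hz; one sample shorter than imf
    return amp, inst_freq

def band_power(amp, inst_freq, lo, hi):
    """Mean squared envelope over samples whose instantaneous frequency lies in [lo, hi)."""
    mask = (inst_freq >= lo) & (inst_freq < hi)
    return float(np.mean(amp[:-1][mask] ** 2)) if mask.any() else 0.0

# Demo on a synthetic 10 Hz component (alpha-band range), 2 s at 256 Hz.
fs = 256
t = np.arange(0, 2, 1 / fs)
imf = np.sin(2 * np.pi * 10 * t)
amp, freq = hht_features(imf, fs)
# Away from the window edges, the instantaneous frequency stays near 10 Hz,
# so the power concentrates in the 8-13 Hz band.
alpha_power = band_power(amp, freq, 8, 13)
```

Unlike an STFT bin or a wavelet scale, the instantaneous frequency here is defined per sample, which is the resolution advantage the abstract attributes to EEMD-HHT.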
