• 1. Jiangxi Provincial Key Laboratory of Image Processing and Pattern Recognition, Nanchang Hangkong University, Nanchang 330063, P. R. China;
  • 2. College of Instrument Science and Optoelectronic Engineering, Nanchang Hangkong University, Nanchang 330063, P. R. China;
  • 3. School of Aviation Services and Music, Nanchang Hangkong University, Nanchang 330063, P. R. China;
JIE Lilin, Email: jielilin@nchu.edu.cn

Existing emotion recognition research is typically limited to static laboratory settings and does not fully handle changes in emotional state in dynamic scenarios. To address this problem, this paper proposes a method for dynamic continuous emotion recognition based on electroencephalography (EEG) and eye movement signals. First, an experimental paradigm was designed to cover six dynamic emotion transition scenarios: happy to calm, calm to happy, sad to calm, calm to sad, nervous to calm, and calm to nervous. EEG and eye movement data were collected simultaneously from 20 subjects to fill the gap in current multimodal dynamic continuous emotion datasets. In the valence-arousal two-dimensional space, emotion ratings for the stimulus videos were collected every five seconds on a scale of 1 to 9, and the resulting dynamic continuous emotion labels were normalized. Frequency band features were then extracted from the preprocessed EEG and eye movement data. A cascade feature fusion approach was used to combine the EEG and eye movement features into an information-rich multimodal feature vector. This feature vector was fed into four regression models, namely support vector regression with a radial basis function kernel, decision tree, random forest, and K-nearest neighbors, to build the dynamic continuous emotion recognition model. The results showed that the proposed method achieved the lowest mean squared error for valence and arousal across the six dynamic continuous emotion transitions. This approach can accurately recognize various emotion transitions in dynamic situations and offers higher accuracy and robustness than using either EEG or eye movement signals alone, making it well suited to practical applications.
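The pipeline described above can be sketched as follows. This is an illustrative example only, not the authors' implementation: the data are synthetic stand-ins, the feature dimensions and train/test split are assumptions, and "cascade fusion" is realized here as simple column-wise concatenation of the two feature sets before regression.

```python
# Hypothetical sketch of the cascade feature fusion + regression pipeline,
# using synthetic data in place of real EEG / eye movement features.
import numpy as np
from sklearn.svm import SVR
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(0)

# Assumed stand-ins: 100 five-second windows with 32 EEG band-power
# features and 8 eye-movement features per window.
eeg_feats = rng.normal(size=(100, 32))
eye_feats = rng.normal(size=(100, 8))

# Continuous valence labels rated 1-9, normalized to [0, 1]
# as the abstract describes for the dynamic continuous labels.
valence = (rng.uniform(1, 9, size=100) - 1) / 8

# Cascade (concatenation) fusion: stack the two modalities column-wise
# into one multimodal feature vector per window.
fused = np.hstack([eeg_feats, eye_feats])  # shape (100, 40)

# RBF-kernel support vector regression, one of the four models compared;
# the decision tree, random forest, and KNN regressors plug in the same way.
model = SVR(kernel="rbf").fit(fused[:80], valence[:80])
pred = model.predict(fused[80:])
mse = mean_squared_error(valence[80:], pred)
```

The arousal dimension would be modeled the same way with its own regressor, and comparing this fused-feature MSE against models trained on `eeg_feats` or `eye_feats` alone reproduces the kind of unimodal-versus-multimodal comparison the abstract reports.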

Copyright © the editorial department of Journal of Biomedical Engineering of West China Medical Publisher. All rights reserved