Emotion is a crucial physiological attribute in humans, and emotion recognition technology can significantly assist individuals in self-awareness. To address the challenge of substantial inter-subject variability in electroencephalogram (EEG) signals, we introduce a novel mechanism into the traditional whale optimization algorithm (WOA) to accelerate its optimization and convergence. The improved whale optimization algorithm (IWOA) was then applied to search for the optimal training configuration of an extreme learning machine (ELM) model, encompassing the best feature set, training parameters, and EEG channels. By testing 24 common EEG emotion features, we found that the optimal features exhibited a certain level of subject specificity while also showing some commonality across subjects. The proposed method achieved an average recognition accuracy of 92.19% in EEG emotion recognition, substantially reducing the manual tuning workload and offering higher accuracy with shorter training times than the control method. It outperformed existing methods and introduces a novel perspective for decoding EEG signals, thereby contributing to the field of EEG-based emotion research.
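The abstract does not detail the IWOA's acceleration mechanism; as a point of reference, the standard WOA update rules it builds on can be sketched as a minimal NumPy implementation. The fitness function, population size, and iteration budget below are placeholder assumptions; in the paper's setting the fitness would instead score an encoded candidate (feature set, channel subset, ELM parameters) by validation error.

```python
import numpy as np

def woa_minimize(fitness, dim, bounds, n_whales=10, n_iter=50, seed=0):
    """Standard whale optimization algorithm (minimization sketch)."""
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    pos = rng.uniform(lo, hi, size=(n_whales, dim))
    best = min(pos, key=fitness).copy()
    best_score = fitness(best)
    for t in range(n_iter):
        a = 2.0 - 2.0 * t / n_iter              # control parameter: 2 -> 0
        for i in range(n_whales):
            A = 2.0 * a * rng.random(dim) - a   # coefficient vectors
            C = 2.0 * rng.random(dim)
            if rng.random() < 0.5:
                if np.all(np.abs(A) < 1.0):     # exploit: encircle the best
                    pos[i] = best - A * np.abs(C * best - pos[i])
                else:                           # explore: follow a random whale
                    rand = pos[rng.integers(n_whales)]
                    pos[i] = rand - A * np.abs(C * rand - pos[i])
            else:                               # bubble-net spiral toward best
                l = rng.uniform(-1.0, 1.0)
                pos[i] = (np.abs(best - pos[i]) * np.exp(l)
                          * np.cos(2.0 * np.pi * l) + best)
            pos[i] = np.clip(pos[i], lo, hi)
            score = fitness(pos[i])
            if score < best_score:
                best_score, best = score, pos[i].copy()
    return best, best_score
```

On a simple test function such as the sphere function, this loop converges toward the origin within a few dozen iterations, which is the behavior the IWOA's added mechanism is meant to speed up further.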
Objective To explore the association between behavioral and emotional problems and life events among adolescents, and to determine which life-event factors correlate most strongly with these problems. Method A total of 1,325 adolescents were surveyed with the Youth Self-Report (YSR) of Achenbach’s behavior checklist and the Adolescent Self-Rating Life Events Checklist (ASLEC), and the data were analyzed with canonical correlation analysis. Results The canonical correlations were statistically significant. The correlation coefficients of the first pair of canonical variables in the male and female groups were 0.6313 and 0.6211, respectively, and the cumulative proportion explained by the first two pairs of canonical variables was above 0.95. In the first pair of canonical variables, the loadings of anxious/depressed, interpersonal sensitivity and study pressure were highest, while in the second pair, withdrawal and punishment were the most important factors. Conclusions The effects of life events on emotional problems were mainly attributable to interpersonal sensitivity and study pressure.
Current studies on electroencephalogram (EEG) emotion recognition primarily concentrate on discrete stimulus paradigms under controlled laboratory settings, which cannot adequately represent the dynamic transition characteristics of emotional states during multi-context interactions. To address this issue, this paper proposes a novel method for emotion transition recognition that leverages a cross-modal feature fusion and global perception network (CFGPN). First, an experimental paradigm encompassing six types of emotion transition scenarios was designed, and EEG and eye movement data were simultaneously collected from 20 participants, with each recording annotated with dynamic continuous emotion labels. Subsequently, deep canonical correlation analysis integrated with a cross-modal attention mechanism was employed to fuse features from the EEG and eye movement signals, yielding multimodal feature vectors enriched with highly discriminative emotional information. These vectors were then input into a parallel hybrid architecture combining convolutional neural networks (CNNs) and Transformers: the CNN captures local time-series features, while the Transformer leverages its robust global perception capability to model long-range temporal dependencies, enabling accurate dynamic emotion transition recognition. The results demonstrate that the proposed method achieves the lowest mean square error in both valence and arousal recognition tasks on the dynamic emotion transition dataset and on a classic multimodal emotion dataset, and it exhibits superior recognition accuracy and stability compared with five existing unimodal and six multimodal deep learning models. The approach enhances both adaptability and robustness in recognizing emotional state transitions in real-world scenarios, showing promising potential for applications in biomedical engineering.
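The cross-modal attention step can be illustrated schematically. In the minimal single-head NumPy sketch below, the feature sequences are used directly as queries, keys, and values; the learned projection matrices of the actual CFGPN, its DCCA stage, and the sequence lengths and feature sizes are all omitted or assumed.

```python
import numpy as np

def cross_modal_attention(eeg, eye):
    """EEG queries attend over eye-movement keys/values (single head)."""
    d_k = eye.shape[1]
    scores = eeg @ eye.T / np.sqrt(d_k)        # scaled dot-product scores
    w = np.exp(scores - scores.max(axis=1, keepdims=True))
    w /= w.sum(axis=1, keepdims=True)          # row-wise softmax
    return w @ eye, w                          # attended features, weights

rng = np.random.default_rng(0)
eeg_seq = rng.normal(size=(10, 8))   # 10 time steps x 8 EEG features (assumed)
eye_seq = rng.normal(size=(10, 8))   # matching eye-movement features (assumed)
attended, weights = cross_modal_attention(eeg_seq, eye_seq)
```

Each EEG time step thus receives a weighted summary of the eye-movement sequence, which is the mechanism by which one modality's features are enriched with information from the other before fusion.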
Objectives To systematically review the association between pubertal development progression and emotional and behavioral problems. Methods The VIP, CNKI, CBM, WanFang Data, PubMed, Web of Science and EBSCO databases were electronically searched to collect studies on the relationship between pubertal tempo or trajectory and emotional and behavioral problems from inception to December 31st, 2019. Two reviewers independently screened the literature, extracted data and assessed the risk of bias of the included studies. Qualitative methods were then used to analyze the data. Results A total of 14 cohort studies were included. Depression was the most studied emotional problem: 2 of the 3 relevant studies found a significant association between faster pubertal tempo and more depressive symptoms in juvenile males, whereas no association was found in 3 of the 4 studies on juvenile females. The behavioral problems covered by the included studies were broad, including internalizing and externalizing problems, substance abuse, attention problems, self-control, first sexual experience, delinquency, conduct disorder, peer relationships, etc. However, few studies examined the same behaviors, so the relationship between pubertal tempo and behavioral problems remains unclear. Conclusions Faster pubertal tempo may be associated with depression in juvenile males. The association between pubertal tempo and behavioral problems in males and females remains to be determined by further studies.
There are two modes for displaying panoramic movies in a virtual reality (VR) environment: non-stereoscopic (2D) and stereoscopic (3D). Whether these two continuous display modes differ in their activation effect on emotional arousal, and what characterizes the related neural activity, has not been fully studied. In this paper, we designed a cognitive psychology experiment to compare the effects of VR-2D and VR-3D viewing on emotional arousal by analyzing synchronously collected scalp electroencephalogram (EEG) signals. We used a support vector machine (SVM) to verify the neurophysiological differences between the two modes in the VR environment. The results showed that, compared with VR-2D films, VR-3D films evoked significantly higher EEG power (mainly reflected in α and β activities). The significantly higher β-wave power in VR-3D mode indicates that 3D vision elicits more intense cortical activity, which might lead to higher arousal. At the same time, the more intense α activity in the occipital region of the brain also suggests that VR-3D films might cause greater visual fatigue. By means of neurocinematics, this paper demonstrates that EEG activity can well reflect the effects of different vision modes on the characteristics of viewers’ neural activity. The current study provides theoretical support not only for future exploration of image language from the VR perspective, but also for future VR film shooting methods and human emotion research.
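The analysis pipeline implied here, band-power extraction followed by SVM classification of viewing mode, can be sketched as follows. The sampling rate, epoch construction, and the synthetic α/β mixtures are illustrative assumptions standing in for the study's recordings; only the relative β-power difference between the two classes mirrors the reported finding.

```python
import numpy as np
from scipy.signal import welch
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
fs = 250                                  # sampling rate in Hz (assumed)

def band_power(sig, lo, hi):
    """Mean Welch power spectral density inside a frequency band."""
    f, pxx = welch(sig, fs=fs, nperseg=2 * fs)
    return pxx[(f >= lo) & (f < hi)].mean()

def make_epoch(beta_gain):
    """Synthetic 2-s epoch: fixed alpha (10 Hz) plus scaled beta (20 Hz)."""
    t = np.arange(2 * fs) / fs
    return (np.sin(2 * np.pi * 10 * t)
            + beta_gain * np.sin(2 * np.pi * 20 * t)
            + rng.normal(scale=0.5, size=t.size))

X, y = [], []
for label, gain in [(0, 0.5), (1, 1.5)]:  # 0 = "VR-2D", 1 = "VR-3D" (toy)
    for _ in range(40):
        epoch = make_epoch(gain)
        X.append([band_power(epoch, 8, 13), band_power(epoch, 13, 30)])
        y.append(label)
acc = cross_val_score(SVC(kernel="rbf"), np.array(X), np.array(y), cv=5).mean()
```

With a clear β-power gap between the two conditions, cross-validated accuracy approaches ceiling, which is the sense in which an SVM can "verify" that the two viewing modes are neurophysiologically separable.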
Objective To identify the effects of the transition to siblinghood (TTS) on firstborn children’s emotions and behaviors, and to define the time span of TTS. Methods CBM, VIP, CNKI, WanFang Data, PubMed, Web of Science and EBSCO were electronically searched to collect studies on the emotional and behavioral characteristics of firstborn children during TTS from inception to December 31st, 2019. Two reviewers independently screened the literature, extracted data and assessed the risk of bias of the included studies. Qualitative methods were then used to analyze the studies. Results A total of 13 studies involving 980 children were included. Twelve behavior-related studies explored the firstborn children’s own behavior during TTS, and 3 studies focused on interaction behavior between the firstborn children and their parents, and between the firstborn and secondborn children. The review found that TTS had both positive and negative effects on the behavioral characteristics of firstborn children, with negative effects predominating. Firstborn children’s anxiety, confrontation and attachment each showed 3 different patterns over time. Two studies showed an increase in negative emotions of firstborn children during TTS. The time range of TTS was mainly concentrated from the third trimester to 12 months after the birth of the second child. Conclusions The current evidence shows that TTS primarily increases the negative emotions and behaviors of firstborn children, and that these behaviors change over time. Due to the limited quality and quantity of the included studies, more high-quality studies are required to verify the above conclusions.
To accurately capture and effectively integrate the spatiotemporal features of electroencephalogram (EEG) signals and thereby improve the accuracy of EEG-based emotion recognition, this paper proposes a new method combining independent component analysis and recurrence plots with an improved EfficientNet version 2 (EfficientNetV2). First, independent component analysis is used to extract independent components containing spatial information from key channels of the EEG signals. These components are then converted into two-dimensional images using recurrence plots to better extract emotional features from the temporal information. Finally, the two-dimensional images are input into an improved EfficientNetV2, which incorporates a global attention mechanism and a triplet attention mechanism, and the emotion classification is output by the fully connected layer. To validate the effectiveness of the proposed method, this study conducted comparative experiments, channel selection experiments and ablation experiments on the Shanghai Jiao Tong University Emotion EEG Dataset (SEED). The results demonstrate that the average recognition accuracy of our method is 96.77%, significantly superior to existing methods, offering a novel perspective for research on EEG-based emotion recognition.
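The recurrence-plot conversion step, turning a one-dimensional component into a two-dimensional image, can be sketched as follows. The embedding dimension, delay, and threshold heuristic below are illustrative assumptions; the paper's exact parameters are not given here.

```python
import numpy as np

def recurrence_plot(signal, dim=3, tau=2, eps=None):
    """Binary recurrence plot via time-delay embedding."""
    n = len(signal) - (dim - 1) * tau
    # delay-embedded trajectory: each row is (x_t, x_{t+tau}, x_{t+2*tau}, ...)
    emb = np.stack([signal[i * tau: i * tau + n] for i in range(dim)], axis=1)
    dists = np.linalg.norm(emb[:, None, :] - emb[None, :, :], axis=-1)
    if eps is None:
        eps = 0.2 * dists.max()           # heuristic threshold (assumed)
    return (dists <= eps).astype(np.uint8)

t = np.linspace(0, 4 * np.pi, 200)
rp = recurrence_plot(np.sin(t))           # periodic signal -> diagonal bands
```

The resulting binary matrix encodes which pairs of time points revisit the same region of state space, which is the temporal structure the downstream image classifier learns from.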
Objective To investigate the effect of positive family behavior support on emotional and behavioral problems in preschool children with epilepsy. Methods A total of 80 preschool children with epilepsy and their parents, admitted to the Department of Neurology of our hospital from October 2022 to February 2023, were selected as study participants and divided into an experimental group and a control group of 40 cases each using a random number table. The control group received routine neurology nursing, and the experimental group received a positive family behavior support intervention in addition to the routine nursing. The scores on the family intimacy and adaptability scale, the strengths and difficulties questionnaire, medication compliance and quality of life of the children were compared between the two groups before and after the intervention. Results After the intervention, the strengths and difficulties questionnaire scores in the experimental group were lower than those in the control group (P<0.05), while the family intimacy and adaptability, quality of life and medication compliance scores in the experimental group were higher than those in the control group (all P<0.05). Conclusion A positive family behavior support program can reduce the occurrence of emotional and behavioral problems, improve family intimacy and adaptability, and improve medication compliance and quality of life in preschool children with epilepsy.
Objective To explore and clarify the relationship between epileptic seizures and their inducing factors, so that inducing factors can be avoided, seizures reduced, and the quality of life of patients with epilepsy improved. Methods Clinical data of 604 patients diagnosed with epilepsy in Xijing Hospital of Air Force Military Medical University from January 2018 to January 2019 were collected, and the patients were followed up for 6 months. Results Among the 604 patients, 318 (52.6%) were seizure-free in the previous 6 months and 286 (47.4%) had seizures; of those with seizures, 169 (59.1%) had at least one inducing factor. Common inducing factors were sleep disorders (123 cases, 72.8%), emotional changes (114 cases, 67.5%), irregular medication (87 cases, 51.5%), diet-related factors (97 cases, 57.4%), and menstruation and pregnancy (33 cases, 19.5%). By the χ² test, seizure occurrence showed no statistically significant differences by age or gender (P > 0.05), but inducing factors differed significantly by seizure type. Among generalized seizures, tonic-clonic seizures were associated with sleep deprivation (χ²=0.189), and absence seizures with anger (χ²=0.237), pressure (χ²=0.203) and irregular life (χ²=0.214). Among focal seizures, focal motor seizures were correlated with coffee consumption (χ²=0.145), and focal sensory seizures with cold (χ²=0.235), electronic equipment use (χ²=0.153) and satiety (χ²=0.257). Complex partial seizures were correlated with anger (χ²=0.229), stress (χ²=0.187) and cold (χ²=0.198). Secondarily generalized seizures were correlated with missed medication (χ²=0.231), sleep deprivation (χ²=0.158), stress (χ²=0.161), cold (χ²=0.263) and satiety (χ²=0.182). Among the inducing factors themselves, sleep deprivation was correlated with anger (χ²=0.167), fatigue (χ²=0.283) and stress (χ²=0.230). Conclusions Epileptic seizures were usually induced by a variety of factors. Generalized seizures were associated with sleep disorders, emotional changes, stress, irregular life, etc.
Focal seizures, in contrast, were associated with stress, emotional changes, sleep disorders, cold, satiety, etc. Analysis of the triggers found that sleep deprivation was associated with anger, fatigue, and stress. Therefore, clarifying the inducing factors of epileptic seizures and avoiding them as much as possible can reduce the harm caused by seizures and improve patients' quality of life.
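The seizure-type-versus-trigger association test used above can be illustrated with SciPy on a hypothetical 2×2 contingency table. The counts below are invented for illustration and are not the study's data; the Pearson contingency coefficient is computed alongside χ² as one common way to express the strength of such an association.

```python
import numpy as np
from scipy.stats import chi2_contingency

# Hypothetical 2x2 table (counts invented for illustration):
# rows = tonic-clonic seizure yes/no, cols = sleep deprivation reported yes/no
table = np.array([[30, 20],
                  [25, 94]])
chi2, p, dof, expected = chi2_contingency(table)
n = table.sum()
# Pearson contingency coefficient, an effect-size measure derived from chi2
contingency_coef = np.sqrt(chi2 / (chi2 + n))
```

A small p-value indicates the seizure type and the trigger are not independent, while the contingency coefficient (bounded between 0 and 1) summarizes how strong that dependence is.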
Existing emotion recognition research is typically limited to static laboratory settings and has not fully addressed the changes in emotional states that occur in dynamic scenarios. To address this problem, this paper proposes a method for dynamic continuous emotion recognition based on electroencephalography (EEG) and eye movement signals. First, an experimental paradigm was designed to cover six dynamic emotion transition scenarios: happy to calm, calm to happy, sad to calm, calm to sad, nervous to calm, and calm to nervous. EEG and eye movement data were collected simultaneously from 20 subjects to fill the gap in multimodal dynamic continuous emotion datasets. In the valence-arousal two-dimensional space, the stimulus videos were rated every five seconds on a scale of 1 to 9, and the dynamic continuous emotion labels were normalized. Subsequently, frequency-band features were extracted from the preprocessed EEG and eye movement data, and a cascade feature fusion approach was used to combine them into an information-rich multimodal feature vector. This feature vector was input into four regression models (support vector regression with a radial basis function kernel, decision tree, random forest, and K-nearest neighbors) to develop the dynamic continuous emotion recognition model. The results showed that the proposed method achieved the lowest mean square error for valence and arousal across the six dynamic continuous emotions. This approach can accurately recognize various emotion transitions in dynamic situations, offering higher accuracy and robustness than either EEG or eye movement signals alone, making it well suited for practical applications.
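The cascade (concatenation) fusion and regression stage can be sketched as follows. The feature dimensions, the synthetic label model, and the train/test split below are illustrative assumptions standing in for the real EEG and eye-movement features and continuous valence ratings.

```python
import numpy as np
from sklearn.svm import SVR
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(2)
n = 400                                    # illustrative number of windows
eeg_feat = rng.normal(size=(n, 8))         # stand-in EEG band-power features
eye_feat = rng.normal(size=(n, 4))         # stand-in eye-movement features
# hypothetical continuous valence label driven by both modalities
valence = eeg_feat[:, 0] + 0.5 * eye_feat[:, 0] + 0.1 * rng.normal(size=n)

fused = np.hstack([eeg_feat, eye_feat])    # cascade (concatenation) fusion
Xtr, Xte, ytr, yte = train_test_split(fused, valence, random_state=0)
pred = SVR(kernel="rbf").fit(Xtr, ytr).predict(Xte)
mse = mean_squared_error(yte, pred)
```

Because the label depends on both modalities, regressing on the fused vector yields a lower mean square error than either block alone could achieve, which mirrors the paper's multimodal-versus-unimodal comparison.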