Document Type: Original/Review Paper

Authors

Electrical and Computer Engineering Department, Semnan University, Semnan, Iran.

Abstract

The classification of emotions from electroencephalography (EEG) signals is inherently challenging due to the intricate nature of brain activity. Overcoming inter-subject inconsistencies in EEG signals and establishing a universally applicable emotion recognition model are essential objectives. This study introduces a novel approach to cross-subject emotion recognition that employs a genetic algorithm (GA) to eliminate non-informative frames. The optimal frames identified by the GA then undergo spatial feature extraction using common spatial patterns (CSP) and the logarithm of variance. These features are fed into a Transformer network to capture spatial-temporal dependencies, and emotion classification is performed by a fully connected (FC) layer with a softmax activation function. The contributions of this paper are thus threefold: emotion classification from a limited number of channels without sacrificing accuracy, selection of optimal signal segments using the GA, and use of a Transformer network for fast, high-accuracy classification. The proposed method is evaluated on two publicly available datasets, SEED and SEED-V, under two distinct scenarios. It attains mean accuracies of 99.96% and 99.51% in the cross-subject scenario, and 99.93% and 99.43% in the multi-subject scenario, for the SEED and SEED-V datasets respectively, outperforming the state of the art (SOTA) in both scenarios on both datasets. Per-subject accuracy comparisons with previous work in the cross-subject scenario further confirm the superiority of the proposed method on both datasets.
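The CSP-plus-log-variance feature extraction named in the abstract can be sketched as follows. This is a minimal illustrative sketch of the standard two-class CSP formulation, not the paper's implementation: the channel count, trial shapes, and function names are assumptions, and multi-class settings (as in SEED) would typically apply it in a one-vs-rest fashion.

```python
import numpy as np

def avg_cov(trials):
    """Average trace-normalized spatial covariance over a list of (channels, samples) trials."""
    covs = [x @ x.T / np.trace(x @ x.T) for x in trials]
    return np.mean(covs, axis=0)

def csp_filters(trials_a, trials_b, n_pairs=2):
    """Compute 2*n_pairs CSP spatial filters discriminating class A from class B."""
    ca, cb = avg_cov(trials_a), avg_cov(trials_b)
    # Whiten the composite covariance ca + cb.
    d, u = np.linalg.eigh(ca + cb)
    p = np.diag(d ** -0.5) @ u.T
    # Eigendecompose the whitened class-A covariance; extreme eigenvalues
    # correspond to directions of maximal variance ratio between classes.
    vals, vecs = np.linalg.eigh(p @ ca @ p.T)
    order = np.argsort(vals)
    idx = np.concatenate([order[:n_pairs], order[-n_pairs:]])
    return vecs[:, idx].T @ p  # shape: (2*n_pairs, channels)

def log_var_features(trial, w):
    """Project one trial through the CSP filters and take the log of normalized variance."""
    z = w @ trial
    v = np.var(z, axis=1)
    return np.log(v / v.sum())
```

In this formulation each selected frame yields a compact feature vector of length 2*n_pairs, which is what would then be passed on to a downstream sequence model.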

