Article title

EEG based emotion analysis using reinforced spatio‐temporal attentive graph neural and contextnet techniques

Content
Identifiers
Title variants
Publication languages
EN
Abstracts
EN
EEG-based emotion classification is used to identify and monitor mental states and emotions, with applications in medicine, security, and other domains. Several deep learning and machine learning strategies have been employed to classify emotions from EEG signals, but they do not provide sufficient accuracy and suffer from high complexity and error rates. In this manuscript, a novel combination of Reinforced Spatio-Temporal Attentive Graph Neural Networks (RSTAGNN) and ContextNet for emotion classification with EEG signals is proposed (RSTAGNN-ContextNet-GWOA-EEG-EA). Here, the input EEG signals are taken from two benchmark datasets, namely the DEAP and K-EmoCon datasets. The input EEG signals are pre-processed, and features are extracted using ContextNet with Global Principal Component Analysis (GPCA). The EEG signal emotions are then classified using the Reinforced Spatio-Temporal Attentive Graph Neural Networks method, with the RSTAGNN weight parameters optimized by the Glowworm Swarm Optimization Algorithm (GWOA). The proposed model classifies EEG signal emotions with high accuracy. On the DEAP dataset, the proposed method attains 24.05% and 12.64% higher accuracy than existing systems, namely multi-domain feature fusion for emotion classification (DWT-SVM-EEG-EA-DEAP) and EEG emotion recognition using a fusion model of graph CNN with LSTM (GCNN-LSTM-EEG-EA-DEAP), respectively. On the K-EmoCon dataset, it attains 32.64% and 15.65% higher accuracy than existing systems, namely toward robust wearable emotion recognition with contrastive representation learning (CAT-EEG-EA-K-EmoCon) and human emotion recognition using physiological signals (CAT-EEG-EA-K-EmoCon), respectively.
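The abstract describes a four-stage pipeline: pre-processing the raw EEG, extracting features with ContextNet plus GPCA, classifying emotions with RSTAGNN, and tuning the network weights with GWOA. As a loose illustration of the feature-reduction and swarm-optimization ideas only, the sketch below pairs plain PCA (a stand-in for the paper's GPCA step) with a glowworm-swarm-style derivative-free search over the weights of a linear readout. All function names, parameters, and the toy data are hypothetical; this is not the authors' implementation.

```python
# Minimal sketch, assuming a generic EEG pipeline: spectral features ->
# PCA reduction -> linear readout tuned by a glowworm-swarm-style search.
import numpy as np

def bandpass_features(eeg, low=4.0, high=45.0, fs=128.0):
    """Crude spectral features: mean 4-45 Hz power per channel."""
    spectrum = np.abs(np.fft.rfft(eeg, axis=-1)) ** 2
    freqs = np.fft.rfftfreq(eeg.shape[-1], d=1.0 / fs)
    band = (freqs >= low) & (freqs <= high)
    return spectrum[..., band].mean(axis=-1)  # shape: (trials, channels)

def pca_reduce(X, k=8):
    """PCA on centred features; a simple stand-in for the GPCA step."""
    Xc = X - X.mean(axis=0)
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:k].T

def glowworm_optimize(loss, dim, n=20, steps=100, step_size=0.05, rng=None):
    """Glowworm-swarm-style search: each agent moves toward a brighter
    (lower-loss) neighbour; luciferin is simply the negative loss here."""
    rng = np.random.default_rng(0) if rng is None else rng
    pos = rng.normal(size=(n, dim))
    for _ in range(steps):
        lucif = -np.array([loss(p) for p in pos])
        for i in range(n):
            brighter = np.flatnonzero(lucif > lucif[i])
            if brighter.size:
                d = pos[rng.choice(brighter)] - pos[i]
                pos[i] += step_size * d / (np.linalg.norm(d) + 1e-12)
    return min(pos, key=loss)

# Toy usage: random "EEG" trials, two emotion classes, weights tuned by
# the swarm instead of gradient descent.
rng = np.random.default_rng(1)
eeg = rng.normal(size=(60, 32, 256))   # (trials, channels, samples)
y = rng.integers(0, 2, size=60)        # binary emotion labels
Z = pca_reduce(bandpass_features(eeg), k=8)

def loss(w):
    logits = Z @ w[:-1] + w[-1]
    return np.mean((logits > 0).astype(int) != y)

w_best = glowworm_optimize(loss, dim=Z.shape[1] + 1)
print("training error:", loss(w_best))
```

The point the swarm step illustrates is that GWOA-style optimization treats the model as a black box: agents only compare loss values and drift toward better-scoring neighbours, so no gradients through the classifier are required.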
Authors
  • Department of Information Technology, PSG College of Technology, Coimbatore, India
  • Department of Information Technology, PSG College of Technology, Coimbatore, India
Bibliography
  • [1] S. Kim, H. Yang, N. Nguyen, S. Prabhakar and S. Lee, “WeDea: A New EEG-Based Framework for Emotion Recognition,” IEEE Journal of Biomedical and Health Informatics, vol. 26, no. 1, 2022, pp. 264-275. doi: 10.1109/jbhi.2021.3091187.
  • [2] N. Salankar, P. Mishra and L. Garg, “Emotion Recognition From EEG Signals Using Empirical Mode Decomposition And Second-Order Difference Plot,” Biomedical Signal Processing and Control, vol. 65, 2021, p. 102389. doi: 10.1016/j.bspc.2020.102389.
  • [3] A. Subasi, T. Tuncer, S. Dogan, D. Tanko and U. Sakoglu, “EEG-Based Emotion Recognition Using Tunable Q Wavelet Transform And Rotation Forest Ensemble Classifier,” Biomedical Signal Processing and Control, vol. 68, 2021, p. 102648. doi: 10.1016/j.bspc.2021.102648.
  • [4] P.V. and A. Bhattacharyya, “Human Emotion Recognition Based On Time–Frequency Analysis Of Multivariate EEG Signal,” Knowledge-Based Systems, vol. 238, 2022, p. 107867. doi: 10.1016/j.knosys.2021.107867.
  • [5] J. Wang and M. Wang, “Review Of The Emotional Feature Extraction And Classification Using EEG Signals,” Cognitive Robotics, vol. 1, 2021, pp. 29-40. doi: 10.1016/j.cogr.2021.04.001.
  • [6] X. Zhou, X. Tang and R. Zhang, “Impact Of Green Finance On Economic Development And Environmental Quality: A Study Based On Provincial Panel Data From China,” Environmental Science and Pollution Research, vol. 27, no. 16, 2020, pp. 19915-19932. doi: 10.1007/s11356-020-08383-2.
  • [7] N. Garcia, B. Renoust and Y. Nakashima, “ContextNet: representation and exploration for painting classification and retrieval in context,” International Journal of Multimedia Information Retrieval, vol. 9, no. 1, 2019, pp. 17-30. doi: 10.1007/s13735-019-00189-4.
  • [8] F. Zhou, Q. Yang, K. Zhang, G. Trajcevski, T. Zhong and A. Khokhar, “Reinforced Spatiotemporal Attentive Graph Neural Networks for Traffic Forecasting,” IEEE Internet of Things Journal, vol. 7, no. 7, 2020, pp. 6414-6428. doi: 10.1109/jiot.2020.2974494.
  • [9] A. Chowdhury and D. De, “Energy-efficient coverage optimization in wireless sensor networks based on Voronoi-Glowworm Swarm Optimization-K-means algorithm,” Ad Hoc Networks, vol. 122, 2021, p. 102660. doi: 10.1016/j.adhoc.2021.102660.
  • [10] M. Khateeb, S. Anwar and M. Alnowami, “Multi-Domain Feature Fusion for Emotion Classification Using DEAP Dataset,” IEEE Access, vol. 9, 2021, pp. 12134-12142. doi: 10.1109/access.2021.3051281.
  • [11] Y. Yin, X. Zheng, B. Hu, Y. Zhang and X. Cui, “EEG emotion recognition using fusion model of graph convolutional neural networks and LSTM,” Applied Soft Computing, vol. 100, 2021, p. 106954. doi: 10.1016/j.asoc.2020.106954.
  • [12] V. Dissanayake, S. Seneviratne, R. Rana, E. Wen, T. Kaluarachchi and S. Nanayakkara, “SigRep: Toward Robust Wearable Emotion Recognition With Contrastive Representation Learning,” IEEE Access, vol. 10, 2022, pp. 18105-18120. doi: 10.1109/access.2022.3149509.
  • [13] K. Yang, B. Tag, Y. Gu, C. Wang, T. Dingler, G. Wadley and J. Goncalves, “Mobile emotion recognition via multiple physiological signals using convolution-augmented transformer,” in Proceedings of the 2022 International Conference on Multimedia Retrieval, 2022, pp. 562-570. doi: 10.1145/3512527.3531385.
  • [14] S. Koelstra, C. Muhl, M. Soleymani, J.S. Lee, A. Yazdani, T. Ebrahimi, T. Pun, A. Nijholt and I. Patras, “DEAP: A Database for Emotion Analysis Using Physiological Signals,” IEEE Transactions on Affective Computing, vol. 3, no. 1, 2012, pp. 18-31. doi: 10.1109/t-affc.2011.15.
  • [15] C.Y. Park, N. Cha, S. Kang, A. Kim, A.H. Khandoker, L. Hadjileontiadis, A. Oh, Y. Jeong and U. Lee, “K-EmoCon, a multimodal sensor dataset for continuous emotion recognition in naturalistic conversations,” Scientific Data, vol. 7, no. 1, 2020. doi: 10.1038/s41597-020-00630-y.
Notes
Record prepared with funds from the Ministry of Science and Higher Education (MNiSW), agreement no. POPUL/SP/0154/2024/02, under the programme "Social Responsibility of Science II" - module: Science popularization (2025).
Document type
Bibliography
YADDA identifier
bwmeta1.element.baztech-9475d999-6120-4021-8ae3-0c00f2a072a1