

Article title

Comparison of different feature extraction methods for EEG-based emotion recognition

Publication languages
EN
Abstract
EN
EEG-based emotion recognition is a challenging and active research area in affective computing. We used a three-dimensional (arousal, valence and dominance) model of emotion to recognize the emotions induced by music videos. The participants watched a video (1 min long) while their EEG was recorded. The main objective of the study is to identify the features that best discriminate between emotions. Power, entropy, fractal dimension, statistical features and wavelet energy were extracted from the EEG signals. The effects of these features were investigated and the best features identified. The performance of two feature selection methods, the Relief-based algorithm and principal component analysis (PCA), was compared. PCA was adopted because of its better performance, and the efficacy of the selected features was validated using support vector machine, K-nearest neighbors and decision tree classifiers. Our system achieves an overall best classification accuracy of 77.62%, 78.96% and 77.60% for valence, arousal and dominance, respectively. Our results demonstrate that time-domain statistical characteristics of EEG signals can efficiently discriminate between emotional states. Moreover, the three-dimensional emotion model is able to separate similar emotions that were not correctly classified by a two-dimensional model (e.g. anger and fear). The results of this study can support the development of real-time EEG-based emotion recognition systems.
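The abstract describes a feature-extraction → feature-selection → classification pipeline. A minimal sketch of that flow, using time-domain statistical features, PCA and an SVM, is shown below. All data here are synthetic, and the feature set, dimensions and hyperparameters are illustrative assumptions, not the paper's actual configuration (which uses DEAP-style recordings and several additional feature families).

```python
# Sketch: statistical EEG features -> PCA -> SVM (synthetic data).
import numpy as np
from sklearn.decomposition import PCA
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)

def stat_features(epoch):
    """Time-domain statistics of one EEG epoch (channels x samples)."""
    return np.concatenate([
        epoch.mean(axis=1),                           # mean amplitude per channel
        epoch.std(axis=1),                            # standard deviation per channel
        np.abs(np.diff(epoch, axis=1)).mean(axis=1),  # mean first difference
    ])

# Synthetic binary task (e.g. high vs. low arousal): 200 epochs,
# 32 channels, 128 samples each, with a class-dependent offset so
# that the two classes are separable.
labels = rng.integers(0, 2, size=200)
epochs = rng.standard_normal((200, 32, 128)) + labels[:, None, None] * 0.5
X = np.array([stat_features(e) for e in epochs])

X_tr, X_te, y_tr, y_te = train_test_split(X, labels, random_state=0)

# Standardize, keep components explaining 95% of the variance, classify.
clf = make_pipeline(StandardScaler(), PCA(n_components=0.95), SVC(kernel="rbf"))
clf.fit(X_tr, y_tr)
acc = clf.score(X_te, y_te)
```

Passing a float to `PCA(n_components=...)` keeps just enough components to reach that fraction of explained variance, which mirrors the dimensionality-reduction role PCA plays in the abstract.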
Authors
author
  • Department of Electronic Engineering, Faculty of Engineering and Green Technology, Universiti Tunku Abdul Rahman (UTAR), Kampar, Malaysia
  • Department of Electronic Engineering, Faculty of Engineering and Green Technology, Universiti Tunku Abdul Rahman (UTAR), Kampar, Malaysia
  • Faculty of Engineering and Green Technology (FEGT), Department of Electronic Engineering, Universiti Tunku Abdul Rahman (UTAR), Jalan Universiti, Kampar 31900, Perak, Malaysia
  • Department of Electronic Engineering, Faculty of Engineering and Green Technology, Universiti Tunku Abdul Rahman (UTAR), Kampar, Malaysia
Bibliography
  • [1] Jenke R, Peer A, Buss M. Feature extraction and selection for emotion recognition from EEG. IEEE Trans Affect Comput 2014;5:327–39.
  • [2] Thammasan N, Moriyama K, Fukui K, Numao M. Continuous music-emotion recognition based on electroencephalogram. IEICE Trans Inf Syst 2016;99:1234–41.
  • [3] Black MJ, Yacoob Y. Recognizing facial expressions in image sequences using local parameterized models of image motion. Int J Comput Vis 1997;25:23–48.
  • [4] Anderson K, McOwan PW. A real-time automated system for the recognition of human facial expressions. IEEE Trans Syst Man Cybern Part B 2006;36:96–105.
  • [5] Soleymani M, Asghari-Esfeden S, Fu Y, Pantic M. Analysis of EEG signals and facial expressions for continuous emotion detection. IEEE Trans Affect Comput 2016;17–28.
  • [6] Brosschot JF, Thayer JF. Heart rate response is longer after negative emotions than after positive emotions. Int J Psychophysiol 2003;50:181–7.
  • [7] Kim KH, Bang SW, Kim SR. Emotion recognition system using short-term monitoring of physiological signals. Med Biol Eng Comput 2004;42:419–27.
  • [8] Wang X-W, Nie D, Lu B-L. Emotional state classification from EEG data using machine learning approach. Neurocomputing 2014;129:94–106.
  • [9] Mohammadi Z, Frounchi J, Amiri M. Wavelet-based emotion recognition system using EEG signal. Neural Comput Appl 2017;28:1985–90.
  • [10] Yin Z, Zhao M, Wang Y, Yang J, Zhang J. Recognition of emotions using multimodal physiological signals and an ensemble deep learning model. Comput Methods Programs Biomed 2017;140:93–110.
  • [11] Zhang Y, Ji X, Zhang S. An approach to EEG-based emotion recognition using combined feature extraction method. Neurosci Lett 2016;633:152–7.
  • [12] Al-Nafjan A, Hosny M, Al-Wabil A, Al-Ohali Y. Classification of human emotions from electroencephalogram (EEG) signal using deep neural network. Int J Adv Comput Sci Appl 2017;8:419–25.
  • [13] Koelstra S, Muhl C, Soleymani M, Lee J-S, Yazdani A, Ebrahimi T, et al. Deap: a database for emotion analysis; using physiological signals. IEEE Trans Affect Comput 2012;3:18–31.
  • [14] Bhatti AM, Majid M, Anwar SM, Khan B. Human emotion recognition and analysis in response to audio music using brain signals. Comput Human Behav 2016;65:267–75.
  • [15] Harmon-Jones E, Gable PA, Peterson CK. The role of asymmetric frontal cortical activity in emotion-related phenomena: a review and update. Biol Psychol 2010;84:451–62.
  • [16] Pizzagalli D, Regard M, Lehmann D. Rapid emotional face processing in the human right and left brain hemispheres: an ERP study. Neuroreport 1999;10:2691–8.
  • [17] Lan Z, Sourina O, Wang L, Scherer R, Müller-Putz G. Unsupervised feature learning for EEG-based emotion recognition. 2017 Int. Conf. Cyberworlds. 2017. pp. 182–5.
  • [18] Schaaff K, Schultz T. Towards emotion recognition from electroencephalographic signals. 3rd Int. Conf. Affective Computing and Intelligent Interaction and Workshops (ACII 2009). 2009. pp. 1–6.
  • [19] Hadjidimitriou SK, Hadjileontiadis LJ. Toward an EEG-based recognition of music liking using time–frequency analysis. IEEE Trans Biomed Eng 2012;59:3498–510.
  • [20] Petrantonakis PC, Hadjileontiadis LJ. Emotion recognition from EEG using higher order crossings. IEEE Trans Inf Technol Biomed 2010;14:186–97.
  • [21] Liu Y, Sourina O, Nguyen MK. Real-time EEG-based emotion recognition and its applications. Transaction on computational science XII. Springer; 2011. p. 256–77.
  • [22] Tuomas E, Vuoskoski K. A review of music and emotion studies: approaches, emotion models, and stimuli. Music Percept An Interdiscip J 2013;30:307–40. http://dx.doi.org/10.1525/jams.2009.62.1.145.
  • [23] Cowen AS, Keltner D. Self-report captures 27 distinct categories of emotion bridged by continuous gradients. Proc Natl Acad Sci U S A 2017;114:E7900–9.
  • [24] Russell JA. Affective space is bipolar. J Pers Soc Psychol 1979;37:345–56. http://dx.doi.org/10.1037/0022-3514.37.3.345.
  • [25] Liu Y, Sourina O. Real-time subject-dependent EEG-based emotion recognition algorithm. Transaction on computational science XXIII. Springer; 2014. p. 199–223.
  • [26] Mehrabian A. Pleasure-arousal-dominance: A general framework for describing and measuring individual differences in temperament. Curr Psychol 1996;14:261–92.
  • [27] Liu Y, Sourina O. EEG-based dominance level recognition for emotion-enabled interaction. 2012 IEEE Int. Conf. on Multimed. Expo (ICME). 2012. pp. 1039–44.
  • [28] Tandle AL, Joshi MS, Dharmadhikari AS, Jaiswal SV. Mental state and emotion detection from musically stimulated EEG. Brain Informatics 2018;5:14.
  • [29] Scherer KR. Which emotions can be induced by music? What are the underlying mechanisms? And how can we measure them? J New Music Res 2004;33:239–51.
  • [30] Fernández-Sotos A, Fernández-Caballero A, Latorre JM. Influence of tempo and rhythmic unit in musical emotion regulation. Front Comput Neurosci 2016;10:80.
  • [31] Hubert W, de Jong-Meyer R. Autonomic, neuroendocrine, and subjective responses to emotion-inducing film stimuli. Int J Psychophysiol 1991;11:131–40.
  • [32] Zhuang N, Zeng Y, Tong L, Zhang C, Zhang H, Yan B. Emotion recognition from EEG signals using multidimensional information in EMD domain. Biomed Res Int 2017. http://dx.doi.org/10.1155/2017/8317357.
  • [33] Ros T, Munneke MAM, Parkinson LA, Gruzelier JH. Neurofeedback facilitation of implicit motor learning. Biol Psychol 2014;95:54–8.
  • [34] Phneah SW, Nisar H. EEG-based alpha neurofeedback training for mood enhancement. Australas Phys Eng Sci Med 2017;40:325–36.
  • [35] Hou Y, Chen S. Distinguishing different emotions evoked by music via electroencephalographic signals. Comput Intell Neurosci 2019;2019. 18 pages.
  • [36] Candra H, Yuwono M, Chai R, Handojoseno A, Elamvazuthi I, Nguyen HT, et al. Investigation of window size in classification of EEG-emotion signal with wavelet entropy and support vector machine. 2015 37th Annu. Int. Conf. IEEE Eng. Med. Biol. Soc. 2015. pp. 7250–3.
  • [37] Picard RW, Vyzas E, Healey J. Toward machine emotional intelligence: Analysis of affective physiological state. IEEE Trans Pattern Anal Mach Intell 2001;1175–91.
  • [38] Lan Z, Sourina O, Wang L, Liu Y. Real-time EEG-based emotion monitoring using stable features. Vis Comput 2016;32:347–58.
  • [39] Wang W, Gill EW. Comparison of a modified periodogram and standard periodogram for current estimation by an hf surface radar. Ocean 2014-TAIPEI. 2014. pp. 1–7.
  • [40] Lin Y-P, Wang C-H, Jung T-P, Wu T-L, Jeng S-K, Duann J-R, et al. EEG-based emotion recognition in music listening. IEEE Trans Biomed Eng 2010;57:1798–806.
  • [41] Li M, Lu B-L. Emotion classification based on gamma-band EEG. 2009 Annu. Int. Conf. IEEE Eng. Med. Biol. Soc. 2009. pp. 1223–6.
  • [42] Nawaz R, Nisar H, Voon YV. The effect of music on human brain; Frequency domain and time series analysis using electroencephalogram. IEEE Access 2018;6. http://dx.doi.org/10.1109/ACCESS.2018.2855194.
  • [43] Bandt C, Pompe B. Permutation entropy: a natural complexity measure for time series. Phys Rev Lett 2002;88:174102.
  • [44] Olofsen E, Sleigh JW, Dahan A. Permutation entropy of the electroencephalogram: a measure of anaesthetic drug effect. Br J Anaesth 2008;101:810–21. http://dx.doi.org/10.1093/bja/aen290.
  • [45] Inouye T, Shinosaki K, Sakamoto H, Toi S, Ukai S, Iyama A, et al. Quantification of EEG irregularity by use of the entropy of the power spectrum. Electroencephalogr Clin Neurophysiol 1991;79:204–10.
  • [46] Grassberger P, Schreiber T, Schaffrath C. Nonlinear time sequence analysis. Int J Bifurc Chaos 1991;1:521–47.
  • [47] Sleigh JW, Olofsen E, Dahan A, De Goede J, Steyn-Ross DA. Entropies of the EEG: the effects of general anaesthesia; 2001.
  • [48] Hosseini SA, Naghibi-Sistani MB. Emotion recognition method using entropy analysis of EEG signals. Int J Image Graph Signal Process 2011;3:30.
  • [49] Richman JS, Moorman JR. Physiological time-series analysis using approximate entropy and sample entropy. Am J Physiol Heart Circ Physiol 2000;278:H2039–49.
  • [50] Jie X, Cao R, Li L. Emotion recognition based on the sample entropy of EEG. Biomed Mater Eng 2014;24:1185–92.
  • [51] Puthankattil SD, Joseph PK. Analysis of EEG signals using wavelet entropy and approximate entropy: a case study on depression patients. Int J Med Heal Biomed Pharm Eng 2014;8:420–4.
  • [52] García-Martínez B, Martínez-Rodrigo A, Alcaraz R, Fernández-Caballero A. A review on nonlinear methods using electroencephalographic recordings for emotion recognition. IEEE Trans Affect Comput 2019.
  • [53] Esteller R, Vachtsevanos G, Echauz J, Litt B. A comparison of waveform fractal dimension algorithms. IEEE Trans Circuits Syst I Fundam Theory Appl 2001;48:177–83.
  • [54] Thammasan N, Fukui K, Numao M. Multimodal fusion of eeg and musical features in music-emotion recognition. Thirty-First AAAI Conf. Artif. Intell.; 2017.
  • [55] Petrosian A. Kolmogorov complexity of finite sequences and recognition of different preictal EEG patterns. Proc. Eighth IEEE Symp Comput Med Syst. 1995. pp. 212–7.
  • [56] Higuchi T. Approach to an irregular time series on the basis of the fractal theory. Phys D Nonlinear Phenom 1988;31:277–83.
  • [57] Li M, Xu H, Liu X, Lu S. Emotion recognition from multichannel EEG signals using K-nearest neighbor classification. Technol Heal Care 2018;26:509–19.
  • [58] Urbanowicz RJ, Meeker M, La Cava W, Olson RS, Moore JH. Relief-based feature selection: Introduction and review. J Biomed Inform 2018;85:189–203.
  • [59] Bolón-Canedo V, Sánchez-Maroño N, Alonso-Betanzos A. A review of feature selection methods on synthetic data. Knowl Inf Syst 2013;34:483–519.
  • [60] Song F, Guo Z, Mei D. Feature selection using principal component analysis. 2010 Int Conf Syst Sci Eng Des Manuf Inform vol. 1. 2010. pp. 27–30.
  • [61] Kira K, Rendell LA. A practical approach to feature selection. Mach Learn Proc 1992. Elsevier; 1992. p. 249–56.
  • [62] Kira K, Rendell LA, et al. The feature selection problem: Traditional methods and a new algorithm. Aaai 1992;2:129–34.
  • [63] Yang Y-H, Lin Y-C, Su Y-F, Chen HH. A regression approach to music emotion recognition. IEEE Trans Audio Speech Lang Process 2008;16:448–57.
  • [64] Kononenko I, Šikonja MR. Non-myopic feature quality evaluation with (R) ReliefF. Comput Methods Featur Sel 2008;169–91.
  • [65] Kooperberg C, Dai JY, Hsu L, Tzeng J-Y. Statistical approaches to Gene X environment interactions for complex phenotypes. MIT press; 2016.
  • [66] Al Zoubi O, Awad M, Kasabov NK. Anytime multipurpose emotion recognition from EEG data using a Liquid State Machine based framework. Artif Intell Med 2018;86:1–8.
  • [67] Chang C-C, Lin C-J. LIBSVM: a library for support vector machines. ACM Trans Intell Syst Technol 2011;2:27.
  • [68] Mert A, Akan A. Emotion recognition from EEG signals by using multivariate empirical mode decomposition. Pattern Anal Appl 2018;21:81–9.
  • [69] Chao H, Dong L, Liu Y, Lu B. Emotion recognition from multiband EEG signals using CapsNet. Sensors 2019;19:2212.
  • [70] Yan J, Chen S, Deng S. A EEG-based emotion recognition model with rhythm and time characteristics. Brain Informat 2019;6:7.
Notes
Record developed with funds from the Ministry of Science and Higher Education (MNiSW), agreement No. 461252, under the programme "Social Responsibility of Science" - module: Popularisation of science and promotion of sport (2020).
Document type
Bibliography
YADDA identifier
bwmeta1.element.baztech-0e87f612-4d41-4998-a126-7c5c05c05990