Article title

Development of a real time emotion classifier based on evoked EEG

Publication languages
EN
Abstracts
EN
Our quality of life depends more on our emotions than on physical comfort alone, which is motivation enough to classify emotions using electroencephalogram (EEG) signals. This paper describes the acquisition of evoked EEG signals for the classification of emotions into four quadrants. The EEG signals have been collected from 24 subjects at three electrodes (Fz, Cz and Pz) along the central line. The absolute and differential attributes of single-trial ERPs have been used to classify emotions, with the single-trial ERP attributes collected from each electrode used to develop an emotion classifier for each subject. The accuracy of classification into four classes lies between 62.5% and 83.3% for single trials. A subject-independent analysis has also been carried out using the absolute and differential attributes of single-trial ERP signals, yielding an overall accuracy of 55% on the Fz electrode for multi-subject trials. The methodology of fixing the attributes used to classify emotions brings us a step closer to a real-time emotion recognition system, with applications including Brain-Computer Interfaces for locked-in subjects and emotion classification for highly sensitive jobs such as fighter piloting.
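
A minimal sketch, in Python, of the kind of pipeline the abstract outlines: simple absolute and differential attributes are computed from single-trial ERPs recorded at Fz, Cz and Pz and passed to a generic classifier for the four quadrants. The particular features (peak amplitude and latency, pairwise amplitude differences), the linear SVM, and the placeholder data shapes are illustrative assumptions only; the abstract does not specify which attributes or learning algorithm the authors actually used.

import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

def erp_attributes(trial, fs=256):
    """Absolute and differential attributes of one single-trial ERP.
    trial: array of shape (3, n_samples) for electrodes Fz, Cz and Pz."""
    peaks = trial.max(axis=1)                # absolute: peak amplitude per electrode
    latencies = trial.argmax(axis=1) / fs    # absolute: peak latency in seconds
    diffs = np.array([peaks[0] - peaks[1],   # differential: amplitude differences
                      peaks[1] - peaks[2],   # between the electrode pairs
                      peaks[0] - peaks[2]])  # Fz-Cz, Cz-Pz, Fz-Pz
    return np.concatenate([peaks, latencies, diffs])

# Placeholder data standing in for real recordings: 40 single trials,
# 3 electrodes, 512 samples each, labelled with one of four quadrants.
rng = np.random.default_rng(0)
X_raw = rng.standard_normal((40, 3, 512))
y = np.repeat(np.arange(4), 10)

X = np.array([erp_attributes(t) for t in X_raw])
clf = SVC(kernel="linear")                   # illustrative classifier choice
print("CV accuracy:", cross_val_score(clf, X, y, cv=5).mean())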
Authors
author
  • Department of Electrical and Instrumentation Engineering, Thapar University, Patiala 147004, India
author
  • Department of Electrical and Instrumentation Engineering, Thapar University, Patiala 147004, India
Notes
Prepared from funds of the Ministry of Science and Higher Education (MNiSW) under agreement 812/P-DUN/2016 for activities popularizing science (2017 tasks).
Document type
Bibliography
YADDA identifier
bwmeta1.element.baztech-0d3889e3-2ca0-47be-976f-a326a1d9bb60