Article title

Transition of emotions from the negatively excited state to positive unexcited state: an ERP perspective

Content
Identifiers
Title variants
Publication languages
EN
Abstracts
EN
Cognitive aspects such as perception, problem-solving, thinking, and task performance are strongly influenced by emotions, which makes the study of emotions essential. The best emotional state is the positive unexcited state, also known as the High Valence Low Arousal (HVLA) state. Psychologists endeavour to bring subjects from a negatively excited state of emotion (Low Valence High Arousal, LVHA) to a positive unexcited state (HVLA). In the first part of this study, a four-class subject-independent emotion classifier was developed with a polynomial-kernel SVM using average Event Related Potential (ERP) and differential average ERP attributes. Visually evoked Electroencephalogram (EEG) signals were acquired from 24 subjects. The four-class classification accuracy was 83% using average ERP attributes and 77% using differential average ERP attributes. In the second part of the study, a meditative intervention was applied to 20 subjects who declared themselves negatively excited (in the LVHA state). EEG signals were acquired before and after the meditative intervention. The four-class subject-independent emotion classifier developed in Study 1 correctly classified these 20 subjects as being in a negatively excited state. After the intervention, 16 subjects self-assessed themselves to be in the positive unexcited (HVLA) state, giving an intervention accuracy of 80%. Testing the classifier on the EEG data acquired after the intervention validated 13 of the 16 subjects as being in a positive unexcited state, yielding an accuracy of 81.3%.
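The four-class classifier described above can be sketched as follows. This is a minimal illustration, not the authors' code: the feature matrix here is synthetic (random clusters standing in for the average-ERP attributes), the class labels are the four valence-arousal quadrants implied by the abstract, and scikit-learn's `SVC` with a polynomial kernel is assumed as the "SVM polynomial classifier".

```python
# Hedged sketch of a four-class polynomial-kernel SVM, as in the abstract.
# All feature values are synthetic stand-ins for average-ERP attributes.
import numpy as np
from sklearn.svm import SVC
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
classes = ["HVHA", "HVLA", "LVHA", "LVLA"]  # valence-arousal quadrants

# One synthetic cluster of 8-dimensional "ERP feature" vectors per class.
X = np.vstack([rng.normal(loc=i, scale=0.5, size=(60, 8)) for i in range(4)])
y = np.repeat(classes, 60)

# Polynomial-kernel SVM with feature scaling, evaluated by cross-validation
# (a rough analogue of the subject-independent evaluation in the study).
clf = make_pipeline(StandardScaler(), SVC(kernel="poly", degree=3))
scores = cross_val_score(clf, X, y, cv=5)
print(round(scores.mean(), 2))
```

On these well-separated synthetic clusters the cross-validated accuracy is near 1.0; the 83% / 77% figures in the abstract come from the real ERP data, not from this toy setup.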
Keywords
Year
Pages
405-423
Physical description
Bibliography: 45 items, figures, tables.
Authors
  • Thapar Institute of Engineering and Technology, P.O. Box 32, Patiala, Pin - 147004, India
  • Thapar Institute of Engineering and Technology, P.O. Box 32, Patiala, Pin - 147004, India
Bibliography
  • [1] Du, N., Zhou, K, Pulver, K. M., Tilbury, D. M., Robert, L. P., Pradhan, A. K., & Yang, X. J. (2020). Examining the effects of emotional valence and arousal on takeover performance in conditionally automated driving. Transportation Research Part C: Emerging Technologies, 112, 78-87. https://doi.org/10.1016/j.trc.2020.01.006
  • [2] Cahill, J., Cullen, P., Anwer, S., Wilson, S., & Gaynor, K. (2021). Pilot work related stress (WRS), effects on wellbeing and mental health, and coping methods. The International Journal of Aerospace Psychology, 31(2), 87-109. https://doi.org/10.1080/24721840.2020.1858714
  • [3] Viczko, J., Tarrant, J., & Jackson, R. (2021). Effects on Mood and EEG States After Meditation in Augmented Reality With and Without Adjunctive Neurofeedback. Frontiers in Virtual Reality, 2, 618381. https://doi.org/10.3389/frvir.2021.618381
  • [4] Tarrant, J., Viczko, J., & Cope, H. (2018). Virtual reality for anxiety reduction demonstrated by quantitative EEG: a pilot study. Frontiers in Psychology, 9, 1280. https://doi.org/10.3389/fpsyg.2018.01280
  • [5] Tarrant, J. (2020). Neuromeditation: The Science and Practice of Combining Neurofeedback and Meditation for Improved Mental Health. In E. D.-Y. Liao (Ed.), Smart Biofeedback-Perspectives and Applications. IntechOpen. https://doi.org/10.5772/intechopen.93781
  • [6] Kato, N., & Hagiwara, M. (2016). An emotion transition model using fuzzy inference. International Journal of Affective Engineering, 15(3), 305-311. https://doi.org/10.5057/ijae.IJAE-D-16-00001
  • [7] Xiaolan, P., Lun, X., Xin, L., & Zhiliang, W. (2013). Emotional state transition model based on stimulus and personality characteristics. China Communications, 10(6), 146-155. https://doi.org/10.1109/CC.2013.6549266
  • [8] Prasetio, B. H., Tamura, H., & Tanno, K. (2020). Deep time-delay Markov network for prediction and modeling the stress and emotions state transition. Scientific Reports, 10(1), 1-12. https://doi.org/10.1038/s41598-020-75155-w
  • [9] Hansen, J. H. (1999). SUSAS. (LDC99S78) [Data set]. Philadelphia: Linguistic Data Consortium. https://doi.org/10.35111/x4at-ff87
  • [10] Griessmair, M. (2017). Ups and downs: Emotional Dynamics in Negotiations and their Effects on (In)Equity. Group Decision and Negotiation, 26(6), 1061-1090. https://doi.org/10.1007/s10726-017-9541-y
  • [11] Van Kleef, G. A., De Dreu, C. K., & Manstead, A. S. (2006). Supplication and appeasement in conflict and negotiation: The interpersonal effects of disappointment, worry, guilt, and regret. Journal of Personality and Social Psychology, 91(1), 124. https://doi.org/10.1037/0022-3514.91.1.124
  • [12] Ichimura, T., & Mera, K. (2013). Emotion-oriented agent in mental state transition learning network. International Journal of Computational Intelligence Studies, 2(1), 26-51. https://doi.org/10.1504/IJCISTUDIES.2013.054773
  • [13] Xiang, H., Ren, F., Kuroiwa, S., & Jiang, P. (2005, June). An experimentation on creating a mental state transition network. In 2005 IEEE International Conference on Information Acquisition (pp. 5-pp). IEEE. https://doi.org/10.1109/ICIA.2005.1635127
  • [14] Aftanas, L. I., & Golosheikin, S. A. (2003). Changes in cortical activity in altered states of consciousness: the study of meditation by high-resolution EEG. Human Physiology, 29(2), 143-151. https://doi.org/10.1023/A:1022986308931
  • [15] Goshvarpour, A., & Goshvarpour, A. (2012). Classification of Electroencephalographic changes in meditation and rest: using correlation dimension and wavelet coefficients. IJ Information Technology and Computer Science, 4(3), 24-30. https://doi.org/10.5815/ijitcs.2012.03.04
  • [16] Kimmatkar, N. V., & Babu, B. V. (2021). Novel Approach for Emotion Detection and Stabilizing Mental State by Using Machine Learning Techniques. Computers, 10(3), 37. https://doi.org/10.3390/computers10030037
  • [17] Filipowicz, A., Barsade, S., & Melwani, S. (2011). Understanding emotional transitions: the interpersonal consequences of changing emotions in negotiations. Journal of Personality and Social Psychology, 101(3), 541-556. https://doi.org/10.1037/a0023545
  • [18] Jenke, R., Peer, A., & Buss, M. (2014). Feature extraction and selection for emotion recognition from EEG. IEEE Transactions on Affective Computing, 5(3), 327-339. https://doi.org/10.1109/TAFFC.2014.2339834
  • [19] Koelstra, S., & Patras, I. (2013). Fusion of facial expressions and EEG for implicit affective tagging. Image and Vision Computing, 31(2), 164-174. https://doi.org/10.1016/j.imavis.2012.10.002
  • [20] Frantzidis, C. A., Bratsas, C., Papadelis, C. L., Konstantinidis, E., Pappas, C., & Bamidis, P. D. (2010). Toward emotion aware computing: an integrated approach using multichannel neurophysiological recordings and affective visual stimuli. IEEE Transactions on Information Technology in Biomedicine, 14(3), 589-597. https://doi.org/10.1109/TITB.2010.2041553
  • [21] Singh, M. I., & Singh, M. (2017). Development of low-cost event marker for EEG-based emotion recognition. Transactions of the Institute of Measurement and Control, 39(5), 642-652. https://doi.org/10.1177/0142331215620698
  • [22] Lang, P. J., Bradley, M. M., & Cuthbert, B. N. (2008). International Affective Picture System (IAPS): affective ratings of pictures and instruction manual (Technical Report No. A-8). University of Florida, Gainesville.
  • [23] Singh, M. I., & Singh, M. (2020). Development of emotion classifier based on absolute and differential attributes of averaged signals of visually stimulated event related potentials. Transactions of the Institute of Measurement and Control, 42(11), 2057-2067. https://doi.org/10.1177/0142331220904889
  • [24] Singh, M. I., & Singh, M. (2021). Emotion Recognition: An Evaluation of ERP Features Acquired from Frontal EEG Electrodes. Applied Sciences, 11(9), 4131. https://doi.org/10.3390/app11094131
  • [25] Singh, M. I., & Singh, M. (2017). Development of a real time emotion classifier based on evoked EEG. Biocybernetics and Biomedical Engineering, 37(3), 498-509. https://doi.org/10.1016/j.bbe.2017.05.004
  • [26] Chanel, G., Kronegg, J., Grandjean, D., & Pun, T. (2006, September). Emotion assessment: Arousal evaluation using EEG's and peripheral physiological signals. In International Workshop on Multimedia Content Representation, Classification and Security (pp. 530-537). Springer, Berlin, Heidelberg. https://doi.org/10.1007/11848035_70
  • [27] Horlings, R., Datcu, D., & Rothkrantz, L. J. (2008, June). Emotion recognition using brain activity. In Proceedings of the 9th International Conference on Computer Systems and Technologies and Workshop for PhD Students in Computing. https://doi.org/10.1145/1500879.1500888
  • [28] Savran, A., Ciftci, K., Chanel, G., Mota, J., Hong Viet, L., Sankur, B., Akarun, L., Caplier, A., & Rombaut, M. (2006). Emotion detection in the loop from brain signals and facial images. In Proceedings of the eNTERFACE 2006 Workshop, http://www.enterface.net/enterface06/docs/results/eNTERFACE06_proceedings.pdf
  • [29] Jatupaiboon, N., Pan-Ngum, S., & Israsena, P. (2013). Real-time EEG-based happiness detection system. The Scientific World Journal, 2013. https://doi.org/10.1155/2013/618649
  • [30] Dan-Glauser, E. S., & Scherer, K. R. (2011). The Geneva affective picture database (GAPED): A new 730-picture database focusing on valence and normative significance. Behavior Research Methods, 43(2), 468-477. https://doi.org/10.3758/s13428-011-0064-1
  • [31] Hidalgo-Muñoz, A. R., López, M. M., Santos, I. M., Pereira, A. T., Vázquez-Marrufo, M., Galvao-Carmona, A., & Tomé, A. M. (2013). Application of SVM-RFE on EEG signals for detecting the most relevant scalp regions linked to affective valence processing. Expert Systems with Applications, 40(6), 2102-2108. https://doi.org/10.1016/j.eswa.2012.10.013
  • [32] Liu, Y. H., Wu, C. T., Cheng, W. T., Hsiao, Y. T., Chen, P. M., & Teng, J. T. (2014). Emotion recognition from single-trial EEG based on kernel Fisher's emotion pattern and imbalanced quasiconformal kernel support vector machine. Sensors, 14(8), 13361-13388. https://doi.org/10.3390/s140813361
  • [33] Mehmood, R. M., & Lee, H. J. (2015). EEG based emotion recognition from human brain using Hjorth parameters and SVM. International Journal of Bio-Science and Bio-Technology, 7(3), 23-32. http://dx.doi.org/10.14257/ijbsbt.2015.7.3.03
  • [34] Wei, Y., Wu, Y., & Tudor, J. (2017). A real-time wearable emotion detection headband based on EEG measurement. Sensors and Actuators A: Physical, 263, 614-621. https://doi.org/10.1016/j.sna.2017.07.012
  • [35] Marín-Morales, J., Higuera-Trujillo, J. L., Greco, A., Guixeres, J., Llinares, C., Scilingo, E. P., Alcañiz, M., & Valenza, G. (2018). Affective computing in virtual reality: emotion recognition from brain and heartbeat dynamics using wearable sensors. Scientific Reports, 8(1), 1-15. https://doi.org/10.1038/s41598-018-32063-4
  • [36] Apicella, A., Arpaia, P., Mastrati, G., & Moccaldi, N. (2021). EEG-based detection of emotional valence towards a reproducible measurement of emotions. Scientific Reports, 11(1), 1-16. https://doi.org/10.1038/s41598-021-00812-7
  • [37] Kurdi, B., Lozano, S., & Banaji, M. R. (2017). Introducing the open affective standardized image set (OASIS). Behavior Research Methods, 49(2), 457-470. https://doi.org/10.3758/s13428-016-0715-3
  • [38] Yao, L., Wang, M., Lu, Y., Li, H., Zhang, X. (2021). EEG-Based Emotion Recognition by Exploiting Fused Network Entropy Measures of Complex Networks across Subjects. Entropy, 23(8), 984. https://doi.org/10.3390/e23080984
  • [39] Gannouni, S., Aledaily, A., Belwafi, K., & Aboalsamh, H. (2021). Emotion detection using electroencephalography signals and a zero-time windowing-based epoch estimation and relevant electrode identification. Scientific Reports, 11(1), 1-17. https://doi.org/10.1038/s41598-021-86345-5
  • [40] Woodman, G. F. (2010). A brief introduction to the use of event-related potentials in studies of perception and attention. Attention, Perception, & Psychophysics, 72(8), 2031-2046. https://doi.org/10.3758/BF03196680
  • [41] Lakens, D., Fockenberg, D. A., Lemmens, K. P., Ham, J., & Midden, C. J. (2013). Brightness differences influence the evaluation of affective pictures. Cognition & Emotion, 27(7), 1225-1246. https://doi.org/10.1080/02699931.2013.781501
  • [42] Balsamo, M., Carlucci, L., Padulo, C., Perfetti, B., & Fairfield, B. (2020). A Bottom-Up Validation of the IAPS, GAPED, and NAPS Affective Picture Databases: Differential Effects on Behavioral Performance. Frontiers in Psychology, 11, 2187. https://doi.org/10.3389/fpsyg.2020.02187
  • [43] Eroğlu, K., Kayıkçioğlu, T., & Osman, O. (2020). Effect of brightness of visual stimuli on EEG signals. Behavioural Brain Research, 382, 112486. https://doi.org/10.1016/j.bbr.2020.112486
  • [44] Meiselman, H. L. (Ed.). (2016). Emotion Measurement. Woodhead Publishing. https://doi.org/10.1016/C2014-0-03427-2
  • [45] Marchewka, A., Żurawski, L., Jednoróg, K., & Grabowska, A. (2014). The Nencki Affective Picture System (NAPS): Introduction to a novel, standardized, wide-range, high-quality, realistic picture database. Behavior Research Methods, 46(2), 596-610. https://doi.org/10.3758/s13428-013-0379-1
Notes
Record developed with funds from MEiN (the Polish Ministry of Education and Science), agreement no. SONP/SP/546092/2022, under the "Społeczna odpowiedzialność nauki" ("Social Responsibility of Science") programme - module: popularisation of science and promotion of sport (2022-2023).
Document type
Bibliography
YADDA identifier
bwmeta1.element.baztech-feedf7a0-43e1-475a-948a-cbcac02687ec