Article title

Electrodermal activity measurements for detection of emotional arousal

Publication languages
EN
Abstracts
EN
In this article, we present a comprehensive measurement system that determines the level of a user's emotional arousal from the analysis of electrodermal activity (EDA). EDA recordings were collected while emotions were elicited with specially selected movie sequences. Data from 16 participants, combined with their personal questionnaires, were used to derive a set of 20 EDA features for assessing the user's emotional state. Feature selection was performed with signal processing and analysis methods, taking the participants' self-reports into account. A series of experiments confirmed the suitability of the designed system for detecting the level of emotional arousal. The average classification accuracy for the two classes of the least and the most stimulating movies falls within the range of 61–72%.
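This record does not spell out the paper's 20 EDA features or its classifier, but the pipeline the abstract describes (tonic/phasic decomposition of the skin conductance signal, feature extraction, two-class discrimination of the least vs. most arousing clips) can be illustrated. The Python sketch below is an assumption-laden illustration, not the authors' method: the 32 Hz sampling rate, the 0.05 Hz tonic cut-off, the five features, the synthetic signals, and the SVM classifier are all placeholders chosen for demonstration.

```python
# Illustrative two-class EDA arousal pipeline (see caveats above).
import numpy as np
from scipy.signal import butter, filtfilt, find_peaks
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

FS = 32  # assumed sampling rate (Hz); not stated in this record

def eda_features(sig, fs=FS):
    """A small, generic EDA feature set (not the paper's 20 features)."""
    # Tonic component (skin conductance level): 0.05 Hz low-pass filter.
    b, a = butter(2, 0.05 / (fs / 2), btype="low")
    tonic = filtfilt(b, a, sig)
    phasic = sig - tonic  # fast skin conductance responses (SCRs)
    peaks, props = find_peaks(phasic, height=0.03)  # SCRs above 0.03 uS
    duration = len(sig) / fs
    return np.array([
        tonic.mean(),                                         # mean SCL
        tonic.std(),                                          # SCL variability
        len(peaks) / duration,                                # SCR rate (1/s)
        props["peak_heights"].mean() if len(peaks) else 0.0,  # mean SCR amplitude
        phasic.std(),                                         # phasic activity
    ])

# Synthetic stand-ins for 60 s recordings: "calm" clips trigger few SCRs,
# "arousing" clips trigger many. Real data would replace this entirely.
rng = np.random.default_rng(0)

def synth_recording(n_scrs):
    n = 60 * FS
    sig = 2.0 + 0.002 * np.arange(n) / FS + 0.01 * rng.standard_normal(n)
    for onset in rng.choice(n - 5 * FS, size=n_scrs, replace=False):
        x = np.arange(5 * FS) / FS                 # 5 s SCR template
        sig[onset:onset + 5 * FS] += 0.1 * x * np.exp(-x / 2)
    return sig

X = np.array([eda_features(synth_recording(rng.integers(1, 4)))
              for _ in range(16)] +
             [eda_features(synth_recording(rng.integers(8, 15)))
              for _ in range(16)])
y = np.array([0] * 16 + [1] * 16)  # 0 = least, 1 = most stimulating

clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
print("Cross-validated accuracy:", cross_val_score(clf, X, y, cv=4).mean())
```

On real recordings, `synth_recording` would be replaced by windows of measured skin conductance aligned to each movie clip, and the self-report questionnaires would inform the feature selection step.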
Pages
813–826
Physical description
Bibliography: 67 items, figures, tables
Authors
  • Warsaw University of Technology, Institute of Theory of Electrical Engineering, Measurement and Information Systems, 75 Koszykowa St., Warsaw 00-662, Poland
author
  • Warsaw University of Technology, Institute of Theory of Electrical Engineering, Measurement and Information Systems, 75 Koszykowa St., Warsaw 00-662, Poland
author
  • Warsaw University of Technology, Institute of Theory of Electrical Engineering, Measurement and Information Systems, 75 Koszykowa St., Warsaw 00-662, Poland
author
  • Warsaw University of Technology, Institute of Theory of Electrical Engineering, Measurement and Information Systems, 75 Koszykowa St., Warsaw 00-662, Poland
author
Notes
Record compiled with funding from the Polish Ministry of Science and Higher Education (MNiSW), agreement No. 461252, under the "Social Responsibility of Science" programme, module: Popularisation of Science and Promotion of Sport (2020).
YADDA identifier
bwmeta1.element.baztech-e856c13f-1e07-47d1-8d49-9d474b611b24