Article title

Towards effective music therapy for mental health care using machine learning tools: human affective reasoning and music genres

Content
Identifiers
Title variants
Publication languages
EN
Abstracts
EN
Music can evoke different emotions in people, and these emotions are reflected in their physiological signals. Advances in affective computing have introduced computational methods to analyse these signals and understand the relationship between music and emotion in greater detail. We analyse Electrodermal Activity (EDA), Blood Volume Pulse (BVP), Skin Temperature (ST) and Pupil Dilation (PD) collected from 24 participants while they listened to 12 pieces from 3 different genres of music. A set of 34 features was extracted from each signal, and 6 different feature selection methods were applied to identify useful features. Empirical analysis shows that a neural network (NN) with a set of features extracted from the physiological signals can achieve 99.2% accuracy in differentiating among the 3 music genres. The model also reaches 98.5% accuracy in classification based on participants' subjective ratings of emotion. The paper also identifies features that improve the accuracy of the classification models. Furthermore, we introduce a new technique called 'Gingerbread Animation' to visualise the recorded physiological signals as a video, making them more comprehensible to the human eye and suitable for computer vision techniques such as Convolutional Neural Networks (CNNs). Overall, our results provide strong motivation to investigate the relationship between physiological signals and music, which can lead to improvements in music therapy for mental health care and in musicogenic epilepsy reduction (our long-term goal).
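The pipeline the abstract describes, feature vectors extracted from physiological signals fed to a neural network that classifies music genre, can be sketched as follows. This is a minimal illustration only: the data here is synthetic, and the network architecture, feature counts per class shift, and train/test split are assumptions, not the paper's actual features, selection methods, or model.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.preprocessing import StandardScaler

# Synthetic stand-in for per-recording feature vectors:
# 24 participants x 12 pieces = 288 samples, 34 features each
# (mirroring the counts in the abstract), 3 genre labels.
rng = np.random.default_rng(0)
n_samples, n_features, n_genres = 288, 34, 3

X = rng.normal(size=(n_samples, n_features))
y = rng.integers(0, n_genres, size=n_samples)
# Shift each class's feature means so the sketch has learnable structure.
X += y[:, None] * 1.5

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0, stratify=y
)

# Standardise features, then train a small feed-forward network
# (hidden layer size is an arbitrary choice for this sketch).
scaler = StandardScaler().fit(X_train)
clf = MLPClassifier(hidden_layer_sizes=(32,), max_iter=1000, random_state=0)
clf.fit(scaler.transform(X_train), y_train)

acc = clf.score(scaler.transform(X_test), y_test)
print(f"test accuracy: {acc:.3f}")
```

In practice the feature-selection step the abstract mentions would sit between extraction and training, reducing the 34 features per signal to the useful subset before the classifier sees them.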
Year
Pages
5-20
Physical description
Bibliography: 52 items, figures.
Authors
  • Research School of Computer Science, The Australian National University, Canberra, Australia
  • Research School of Computer Science, The Australian National University, Canberra, Australia
  • Research School of Computer Science, The Australian National University, Canberra, Australia
  • Research School of Computer Science, The Australian National University, Canberra, Australia
  • Research School of Computer Science, The Australian National University, Canberra, Australia
Bibliography
  • [1] A. Bardekar and A. A. Gurjar, Study of Indian Classical Ragas Structure and its Influence on Human Body for Music Therapy, in 2016 2nd International Conference on Applied and Theoretical Computing and Communication Technology (iCATccT), 2016, pp. 119-123: IEEE.
  • [2] C. L. Baldwin and B. A. Lewis, Positive valence music restores executive control over sustained attention, PLOS ONE, vol. 12, no. 11, p. e0186231, 2017.
  • [3] L. Harmat, J. Takács, and R. Bodizs, Music improves sleep quality in students, Journal of advanced nursing, vol. 62, no. 3, pp. 327-335, 2008.
  • [4] G. Coppola et al., Mozart’s music in children with drug-refractory epileptic encephalopathies, Epilepsy & Behavior, vol. 50, pp. 18-22, 2015.
  • [5] M. Z. Hossain, Observer’s galvanic skin response for discriminating real from fake smiles, 2016.
  • [6] L. Chen, T. Gedeon, M. Z. Hossain, and S. Caldwell, Are you really angry?: detecting emotion veracity as a proposed tool for interaction, presented at the Proceedings of the 29th Australian Conference on Computer-Human Interaction, Brisbane, Queensland, Australia, 2017.
  • [7] J. A. Healey and R. W. Picard, Detecting stress during real-world driving tasks using physiological sensors, IEEE Transactions on intelligent transportation systems, vol. 6, no. 2, pp. 156-166, 2005.
  • [8] Y. Nagai, L. H. Goldstein, P. B. Fenwick, and M. R. Trimble, Clinical efficacy of galvanic skin response biofeedback training in reducing seizures in adult epilepsy: a preliminary randomized controlled study, Epilepsy & Behavior, vol. 5, no. 2, pp. 216-223, 2004.
  • [9] L. Harrison and P. Loui, Thrills, chills, frissons, and skin orgasms: toward an integrative model of transcendent psychophysiological experiences in music, Frontiers in psychology, vol. 5, p. 790, 2014.
  • [10] D. Huron and E. Margulis, Musical Expectancy and Thrills, in Handbook of Music and Emotion: Theory, Research, Applications, pp. 575-604, 2011.
  • [11] M. Guhn, A. Hamm, and M. Zentner, Physiological and musico-acoustic correlates of the chill response, Music Perception: An Interdisciplinary Journal, vol. 24, no. 5, pp. 473-484, 2007.
  • [12] D. G. Craig, An exploratory study of physiological changes during “chills” induced by music, Musicae scientiae, vol. 9, no. 2, pp. 273-287, 2005.
  • [13] K. H. Kim, S. W. Bang, and S. R. Kim, Emotion recognition system using short-term monitoring of physiological signals, Medical and biological engineering and computing, vol. 42, no. 3, pp. 419-427, 2004.
  • [14] M. Z. Hossain, T. Gedeon, and R. Sankaranarayana, Using temporal features of observers’ physiological measures to distinguish between genuine and fake smiles, IEEE Transactions on Affective Computing, pp. 1-1, 2018.
  • [15] A. Haag, S. Goronzy, P. Schaich, and J. Williams, Emotion recognition using bio-sensors: First steps towards an automatic system, in Tutorial and research workshop on affective dialogue systems, 2004, pp. 36-48: Springer.
  • [16] J. S. Rahman, T. Gedeon, S. Caldwell, R. Jones, M. Z. Hossain, and X. Zhu, Melodious Micro-frissons: Detecting Music Genres from Skin Response, in International Joint Conference on Neural Networks (IJCNN), Budapest, Hungary, 2019: IEEE.
  • [17] J. R. Hughes and J. J. Fino, The Mozart effect: distinctive aspects of the music—a clue to brain coding?, Clinical Electroencephalography, vol. 31, no. 2, pp. 94-103, 2000.
  • [18] L. C. Lin et al., Parasympathetic activation is involved in reducing epileptiform discharges when listening to Mozart music, Clin Neurophysiol, vol. 124, no. 8, pp. 1528-35, Aug 2013.
  • [19] R. McCraty, The effects of different types of music on mood, tension, and mental clarity.
  • [20] YouTube. (2016). Gamma Brain Energizer - 40 Hz - Clean Mental Energy - Focus Music - Binaural Beats. Available: https://www.youtube.com/watch?v=9wrFk5vuOsk
  • [21] YouTube. (2017). Serotonin Release Music with Alpha Waves - Binaural Beats Relaxing Music, Happiness Frequency. Available: https://www.youtube.com/watch?v=9TPSs16DwbA
  • [22] N. Hurless, A. Mekic, S. Pena, E. Humphries, H. Gentry, and D. Nichols, Music genre preference and tempo alter alpha and beta waves in human non-musicians.
  • [23] Billboard Year End Chart. Available: https://www.billboard.com/charts/year-end
  • [24] D. J. Thurman et al., Standards for epidemiologic studies and surveillance of epilepsy, Epilepsia, vol. 52, pp. 2-26, 2011.
  • [25] Y. Shi, N. Ruiz, R. Taib, E. Choi, and F. Chen, Galvanic skin response (GSR) as an index of cognitive load, in CHI’07 extended abstracts on Human factors in computing systems, 2007, pp. 2651-2656: ACM.
  • [26] T. Lin, M. Omata, W. Hu, and A. Imamiya, Do physiological data relate to traditional usability indexes?, in Proceedings of the 17th Australia conference on Computer-Human Interaction: Citizens Online: Considerations for Today and the Future, 2005, pp. 1-10: Computer-Human Interaction Special Interest Group (CHISIG) of Australia.
  • [27] S. Reisman, Measurement of physiological stress, in Proceedings of the IEEE 1997 23rd Northeast Bioengineering Conference, 1997, pp. 21-23: IEEE.
  • [28] R. A. McFarland, Relationship of skin temperature changes to the emotions accompanying music, Biofeedback and Self-regulation, vol. 10, no. 3, pp. 255-267, 1985.
  • [29] T. Partala and V. Surakka, Pupil size variation as an indication of affective processing, International journal of human-computer studies, vol. 59, no. 1-2, pp. 185-198, 2003.
  • [30] R. S. Larsen and J. Waters, Neuromodulatory correlates of pupil dilation, Frontiers in neural circuits, vol. 12, p. 21, 2018.
  • [31] J. Zhai and A. Barreto, Stress Recognition Using Non-invasive Technology, in FLAIRS Conference, pp. 395-401, 2006.
  • [32] M. W. Weiss, S. E. Trehub, E. G. Schellenberg, and P. Habashi, Pupils dilate for vocal or familiar music, Journal of Experimental Psychology: Human Perception and Performance, vol. 42, no. 8, p. 1061, 2016.
  • [33] E4 wristband from empatica. Available: https://www.empatica.com/research/e4/
  • [34] The Eye Tribe. Available: http://theeyetribe.com/about/index.html
  • [35] J. L. Walker, Subjective reactions to music and brainwave rhythms, Physiological Psychology, vol. 5, no. 4, pp. 483-489, 1977.
  • [36] D. F. Alwin, Feeling thermometers versus 7-point scales: Which are better?, Sociological Methods & Research, vol. 25, no. 3, pp. 318-340, 1997.
  • [37] J. A. Russell, A circumplex model of affect, Journal of personality and social psychology, vol. 39, no. 6, p. 1161, 1980.
  • [38] J. Kim and E. Andre, Emotion recognition based on physiological changes in music listening, IEEE Trans Pattern Anal Mach Intell, vol. 30, no. 12, pp. 2067-83, Dec 2008.
  • [39] S. Jerritta, M. Murugappan, R. Nagarajan, and K. Wan, Physiological signals based human emotion recognition: a review, in 2011 IEEE 7th International Colloquium on Signal Processing and its Applications, 2011, pp. 410-415: IEEE.
  • [40] R. W. Picard, E. Vyzas, and J. Healey, Toward machine emotional intelligence: Analysis of affective physiological state, IEEE transactions on pattern analysis and machine intelligence, vol. 23, no. 10, pp. 1175-1191, 2001.
  • [41] U. R. Acharya et al., Characterization of focal EEG signals: a review, Future Generation Computer Systems, vol. 91, pp. 290-299, 2019.
  • [42] R. Chowdhury, M. Reaz, M. Ali, A. Bakar, K. Chellappan, and T. Chang, Surface electromyography signal processing and classification techniques, Sensors, vol. 13, no. 9, pp. 12431-12466, 2013.
  • [43] C. D. Katsis, N. Katertsidis, G. Ganiatsas, and D. I. Fotiadis, Toward Emotion Recognition in Car-Racing Drivers: A Biosignal Processing Approach, IEEE Transactions on Systems, Man, and Cybernetics - Part A: Systems and Humans, vol. 38, no. 3, pp. 502-512, 2008.
  • [44] T. Triwiyanto, O. Wahyunggoro, H. A. Nugroho, and H. Herianto, An investigation into time domain features of surface electromyography to estimate the elbow joint angle, Advances in Electrical and Electronic Engineering, vol. 15, no. 3, pp. 448-458, 2017.
  • [45] R. Kohavi and G. H. John, Wrappers for feature subset selection, Artificial intelligence, vol. 97, no. 1-2, pp. 273-324, 1997.
  • [46] J. Pohjalainen, O. Räsänen, and S. Kadioglu, Feature selection methods and their combinations in high-dimensional classification of speaker likability, intelligibility and personality traits, Computer Speech & Language, vol. 29, no. 1, pp. 145-171, 2015.
  • [47] J. Yang and V. Honavar, Feature subset selection using a genetic algorithm, in Feature extraction, construction and selection: Springer, 1998, pp. 117-136.
  • [48] F. J. Valverde-Albacete and C. Peláez-Moreno, 100% classification accuracy considered harmful: The normalized information transfer factor explains the accuracy paradox, PloS one, vol. 9, no. 1, p. e84217, 2014.
  • [49] K. He, X. Zhang, S. Ren, and J. Sun, Deep residual learning for image recognition, in Proceedings of the IEEE conference on computer vision and pattern recognition, 2016, pp. 770-778.
  • [50] M. G. N. Bos, P. Jentgens, T. Beckers, and M. Kindt, Psychophysiological response patterns to affective film stimuli, (in eng), PloS one, vol. 8, no. 4, pp. e62661-e62661, 2013.
  • [51] S. Jerritta, M. Murugappan, K. Wan, and S. Yaacob, Emotion Detection from QRS Complex of ECG Signals Using Hurst Exponent for Different Age Groups, in 2013 Humaine Association Conference on Affective Computing and Intelligent Interaction, 2013, pp. 849-854.
  • [52] J. S. Rahman, T. Gedeon, S. Caldwell and R. Jones, Brain Melody Informatics: Analysing Effects of Music on Brainwave Patterns, in International Joint Conference on Neural Networks (IJCNN), Glasgow, United Kingdom, 2020: IEEE.
Notes
Record prepared with funds from MNiSW (the Polish Ministry of Science and Higher Education), agreement No. 461252, under the "Społeczna odpowiedzialność nauki" (Social Responsibility of Science) programme, module: Popularisation of science and promotion of sport (2021).
Document type
Bibliography
YADDA identifier
bwmeta1.element.baztech-843ebc82-43ad-4474-b8c1-ceb4dec0493c