Article title
Content
Full texts:
Identifiers
Title variants
Publication languages
Abstracts
Medical history highlights that myocardial infarction is one of the leading causes of death in human beings. Angina pectoris (chest pain) is a prominent vital sign of myocardial infarction (MI). Medical reports suggest that chest pain experienced during a heart attack alters facial muscle activity, producing characteristic patterns of facial expression. This work develops an automatic facial expression detection system that identifies the severity of chest pain as a vital sign of MI, using an algorithmic approach implemented with state-of-the-art convolutional neural networks (CNNs). Two advanced lightweight object detection CNN models, Single Shot Detector (SSD) MobileNet V2 and SSD Inception V2, were used to design the MI vital-sign model from a private dataset of 500 RGB color images. The authors developed a cardiac emergency health-monitoring system using Edge Artificial Intelligence ("Edge AI") on NVIDIA's Jetson Nano embedded GPU platform. The proposed model focuses on low cost and low power consumption for onboard, real-time detection of the vital signs of myocardial infarction. The evaluated model achieves a mean Average Precision of 85.18%, an Average Recall of 88.32%, and 6.85 frames per second for the generated detections.
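For illustration only, the lines below give a minimal inference sketch of the kind of pipeline the abstract describes: an SSD MobileNet V2 detector exported with the TensorFlow Object Detection API and run on single camera frames. The saved-model path, the two-class label map, and the camera source are hypothetical placeholders, not the authors' code or dataset.

# Minimal sketch: run a hypothetical exported SSD MobileNet V2 detector on one camera frame.
import numpy as np
import tensorflow as tf
import cv2

MODEL_DIR = "exported_ssd_mobilenet_v2/saved_model"  # hypothetical export path
LABELS = {1: "pain", 2: "no_pain"}                    # hypothetical two-class label map

detect_fn = tf.saved_model.load(MODEL_DIR)

def detect(frame_bgr, score_threshold=0.5):
    """Return (box, label, score) tuples above the score threshold."""
    rgb = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2RGB)
    inp = tf.convert_to_tensor(rgb[np.newaxis, ...], dtype=tf.uint8)
    out = detect_fn(inp)  # standard TF Object Detection API output dictionary
    boxes = out["detection_boxes"][0].numpy()
    scores = out["detection_scores"][0].numpy()
    classes = out["detection_classes"][0].numpy().astype(int)
    return [(boxes[i], LABELS.get(classes[i], "?"), float(scores[i]))
            for i in np.where(scores >= score_threshold)[0]]

cap = cv2.VideoCapture(0)  # default camera; on a Jetson Nano a GStreamer pipeline string is typical
ok, frame = cap.read()
if ok:
    for box, label, score in detect(frame):
        print(label, round(score, 2), box)
cap.release()

Throughput figures such as the reported 6.85 frames per second would typically be obtained by timing repeated calls to a function like detect() over a live video stream.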
Year
Volume
Pages
40--55
Physical description
Bibliography: 56 items, figures
Authors
author
- HMI, Digital Shark Technology Pvt. Ltd, Bangalore, Karnataka, India
author
- Tecplix Technologies Pvt. Ltd, Bangalore, Karnataka, India
author
- Government Polytechnic, Kampli, Karnataka, India
author
- Dept. of CSE, SJB Institute of Technology, Bangalore, VTU, India
Notes
Record prepared with funds from the Ministry of Education and Science (MEiN), agreement no. SONP/SP/546092/2022, under the "Społeczna odpowiedzialność nauki" (Social Responsibility of Science) programme, module: popularisation of science and promotion of sport (2022-2023).
Document type
Bibliography
YADDA identifier
bwmeta1.element.baztech-72a268bb-188c-4df1-acbe-e1b665fcb932