Article title

Human activity recognition using improved complete ensemble EMD with adaptive noise and long short-term memory neural networks

Identifiers
Title variants
Publication languages
EN
Abstracts
EN
The recognition of human activities is a topic of great relevance due to its wide range of applications. Different approaches have been proposed to recognize human activities, ranging from the comparison of signals with thresholds to the application of machine and deep learning techniques. In this work, six human activities (walking, walking downstairs, walking upstairs, standing, sitting, and lying down) are classified using bidirectional LSTM networks that exploit the intrinsic mode function (IMF) representation of inertial signals. Records of inertial signals (accelerometer and gyroscope) of 2.56 s, available at the UCI Machine Learning Repository, were collected from 30 subjects using a smartphone. First, the inertial signals were standardized to bring them to the same scale and were decomposed into IMFs using the improved complete ensemble empirical mode decomposition with adaptive noise (ICEEMDAN). The IMFs were then split into nine segments of 1.28 s with 12.5% overlap and fed to a first network with four outputs that identifies the three dynamic activities and groups the static ones into a single class called "statics", giving 98.86% accuracy. The non-segmented IMFs of the records assigned to the statics class were then fed to a second network that classifies the three static activities, giving an accuracy of 88.46%. Overall, 92.91% accuracy was obtained in classifying the six human activities. This performance is attributed to the fact that ICEEMDAN extracted information embedded in the signal, and that segmenting the IMFs allowed the first network to discriminate between static and dynamic activities.
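The preprocessing described in the abstract (standardization followed by segmentation of each 2.56 s record) can be sketched as below. This is a minimal illustration, not the authors' code: the 50 Hz sampling rate and the 128-sample record length come from the UCI HAR dataset documentation, and the segmentation parameters are inferred from the abstract (nine 64-sample segments of a 128-sample record imply an 8-sample shift, i.e. 12.5% of the segment length). The ICEEMDAN decomposition step is omitted here, as it requires an external implementation.

```python
import numpy as np

FS = 50            # Hz; UCI HAR inertial signals are sampled at 50 Hz
RECORD_LEN = 128   # samples per record (2.56 s)
SEG_LEN = 64       # samples per segment (1.28 s)
STEP = 8           # shift between segments: 12.5% of SEG_LEN

def standardize(x):
    # Zero-mean, unit-variance scaling so all channels share one scale
    return (x - x.mean()) / x.std()

def segment(signal, seg_len=SEG_LEN, step=STEP):
    # Slide a seg_len window over the record in steps of `step` samples;
    # for a 128-sample record this yields nine overlapping 64-sample segments
    starts = range(0, len(signal) - seg_len + 1, step)
    return np.stack([signal[s:s + seg_len] for s in starts])

# Example on a synthetic record standing in for one inertial channel
record = standardize(np.random.randn(RECORD_LEN))
segments = segment(record)
print(segments.shape)  # (9, 64)
```

In the paper's pipeline, each IMF (rather than the raw channel) would be segmented this way before being fed to the first bidirectional LSTM network.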
Authors
  • Faculty of Electrical and Electronic Engineering, Pontifical Bolivarian University, Bucaramanga, Colombia
  • Faculty of Electrical and Electronic Engineering, Pontifical Bolivarian University, Bucaramanga, Colombia
  • Faculty of Electrical and Electronic Engineering, Pontifical Bolivarian University, Bucaramanga, Colombia
Notes
Record created with funds from the Ministry of Science and Higher Education (MNiSW), agreement No. 461252, under the programme "Social Responsibility of Science" - module: Popularisation of science and promotion of sport (2020).
Document type
Bibliography
YADDA identifier
bwmeta1.element.baztech-46f81f15-71e4-4287-b55f-4b37412c736e