

Identifiers
Title variants
Languages of publication
EN
Abstracts
EN
Human activity recognition (HAR) from wearable motion sensor data is a promising research field due to its applications in healthcare, athletics, lifestyle monitoring, and computer–human interaction. Smartphones are an obvious platform for deploying HAR algorithms. This paper provides an overview of the state of the art in the following aspects: relevant signals, data capture and preprocessing, ways to deal with unknown on-body locations and orientations, selecting the right features, activity models and classifiers, metrics for quantifying activity execution, and ways to evaluate the usability of a HAR system. The survey covers detection of repetitive activities, postures, falls, and inactivity.
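The processing chain the abstract enumerates (signal capture, preprocessing, feature selection, classification) can be illustrated with a minimal, hypothetical sketch. The window length, overlap, and the mean/standard-deviation magnitude features below are illustrative choices, not the survey's prescription:

```python
import math

def magnitude(sample):
    """Euclidean norm of one 3-axis accelerometer sample (x, y, z)."""
    return math.sqrt(sum(v * v for v in sample))

def window_features(samples, window=50, overlap=25):
    """Slide a fixed-length window over the signal and extract two
    simple time-domain features per window: the mean and standard
    deviation of the acceleration magnitude."""
    features = []
    step = window - overlap
    for start in range(0, len(samples) - window + 1, step):
        mags = [magnitude(s) for s in samples[start:start + window]]
        mean = sum(mags) / window
        var = sum((m - mean) ** 2 for m in mags) / window
        features.append((mean, math.sqrt(var)))
    return features

# Synthetic stand-in for captured data: 100 samples near gravity
# (9.81 m/s^2) with a small alternating perturbation; a real system
# would read the smartphone accelerometer instead.
signal = [(0.0, 0.0, 9.81 + 0.1 * (i % 2)) for i in range(100)]
features = window_features(signal)
```

The resulting feature vectors would then feed a classifier (decision tree, SVM, neural network), as in the systems the survey covers; the choice of features and window size affects both recognition accuracy and battery drain.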
Authors
author
  • University of Texas at San Antonio, BSE 1.500, One UTSA Circle, San Antonio, TX 78249, United States
author
  • University of Texas at San Antonio, BSE 1.500, One UTSA Circle, San Antonio, TX 78249, United States
Bibliography
  • [1] Shoaib M, Bosch S, Durmaz Incel O, Scholten H, Havinga PJ. A survey of online activity recognition using mobile phones. Sensors 2015;15(1):2059–85.
  • [2] Wang Y, Lin J, Annavaram M, Jacobson QA, Hong J, Krishnamachari B, et al. A framework of energy efficient mobile sensing for automatic user state recognition. MobiSys'09 Proceedings of the 7th International Conference on Mobile Systems, Applications, and Services; 2009.
  • [3] Bao L, Intille S. Activity recognition from user-annotated acceleration data. Pervasive Computing; 2004.
  • [4] Ravi N, Dandekar N, Mysore P, Littman ML. Activity recognition from accelerometer data. In: Bruce Porter, editor. Proceedings of the 17th Conference on Innovative Applications of Artificial Intelligence (IAAI'05), vol. 3, 2005.
  • [5] Karantonis DM, Narayanan MR, Mathie M, Lovell NH, Celler BG. Implementation of a real-time human movement classifier using a triaxial accelerometer for ambulatory monitoring. IEEE Trans Inf Technol Biomed 2006;10(1):156–67.
  • [6] Lee S-W, Mase K. Activity and location recognition using wearable sensors. IEEE Pervasive Comput 2002;1(3):24–32.
  • [7] Parkka J, Ermes M, Korpipaa P, Mantyjarvi J, Peltola J, Korhonen I. Activity classification using realistic data from wearable sensors. IEEE Trans Inf Technol Biomed 2006;10(1):119–28.
  • [8] Ward J, Lukowicz P, Troster G, Starner T. Activity recognition of assembly tasks using body-worn microphones and accelerometers. IEEE Trans Pattern Anal Mach Intell 2006;28(10):1553–67.
  • [9] Munguia Tapia E, Intille SS, Haskell W, Larson K, Wright J, King A, et al. Real-time recognition of physical activities and their intensities using wireless accelerometers and a heart rate monitor. 2007 11th IEEE International Symposium on Wearable Computers; 2007.
  • [10] Mannini A, Sabatini A. Machine learning methods for classifying human physical activity from on-body accelerometers. Sensors 2010;10(2):1154–75.
  • [11] Martin H, Bernardos AM, Iglesias J, Casar JR. Activity logging using lightweight classification techniques in mobile devices. Pers Ubiquitous Comput 2013;17(4):675–95.
  • [12] Berchtold M, Braunschweig T, Budde M, Gordon D, Schmidtke H. ActiServ: Activity Recognition Service for mobile phones. International Symposium on Wearable Computers (ISWC) 2010; 2010.
  • [13] Moncada-Torres A, Leuenberger K, Gonzenbach R, Luft A, Gassert R. Activity classification based on inertial and barometric pressure sensors at different anatomical locations. Physiol Meas 2014;35(7):1245–63.
  • [14] Hwang K, Lee S. Environmental audio scene and activity recognition through mobile-based crowdsourcing. IEEE Trans Consum Electron 2012;58(2):700–5.
  • [15] Google. ActivityRecognitionApi; 27 June 2016, Available: https://developers.google.com/android/reference/com/google/android/gms/location/ActivityRecognitionApi#public-methods [online; accessed 02.07.16].
  • [16] Kwapisz JR, Weiss GM, Moore SA. Activity recognition using cell phone accelerometers. ACM SIGKDD Explor Newsl 2011;12(2):74–82.
  • [17] Hynes M, Wang H, Kilmartin L. Off-the-shelf mobile handset environments for deploying accelerometer based gait and activity analysis algorithms. Annual International Conference of the Engineering in Medicine and Biology Society, 2009. EMBC 2009; 2009.
  • [18] Henpraserttae A, Thiemjarus S, Marukatat S. Accurate activity recognition using a mobile phone regardless of device orientation and location. 2011 International Conference on Body Sensor Networks (BSN); 2011.
  • [19] Anguita D, Ghio A, Oneto L, Parra X, Reyes-Ortiz JL. Human activity recognition on smartphones using a multiclass hardware-friendly support vector machine. IWAAL'12 Proceedings of the 4th International Conference on Ambient Assisted Living and Home Care; 2012.
  • [20] Siirtola P, Roning J. Recognizing human activities user-independently on smartphones based on accelerometer data. Int J Artif Intell Interact Multimed 2012;1(5):38–45.
  • [21] Brezmes T, Gorricho J-L, Cotrina J. Activity recognition from accelerometer data on a mobile phone. Distributed computing, artificial intelligence, bioinformatics, soft computing, and ambient assisted living. Berlin/Heidelberg: Springer-Verlag; 2009. p. 796–9.
  • [22] Antonsson EK, Mann RW. The frequency content of gait. J Biomech 1985;18(1):39–47.
  • [23] Bhattacharya A, McCutcheon E, Shvartz E, Greenleaf J. Body acceleration distribution and O2 uptake in humans during running and jumping. J Appl Physiol: Respir Environ Exerc Physiol 1980;49(5):881–7.
  • [24] Alvarez de la Concepcion M, Soria Morillo L, Gonzalez-Abril L, Ortega Ramirez J. Discrete techniques applied to low-energy mobile human activity recognition. A new approach. Expert Syst Appl 2014;41(14):6138–46.
  • [25] Yan Z, Subbaraju V, Chakraborty D, Misra A, Aberer K. Energy-efficient continuous activity recognition on mobile phones: an activity-adaptive approach. Proceedings of the 2012 16th International Symposium on Wearable Computers (ISWC); 2012.
  • [26] Viet VQ, Thang HM, Choi DJ. Balancing precision and battery drain in activity recognition on mobile phone. 2012 IEEE 18th International Conference on Parallel and Distributed Systems (ICPADS); 2012.
  • [27] Lee Y-S, Cho S-B. Activity recognition using hierarchical hidden Markov models on a smartphone with 3D accelerometer. Hybrid artificial intelligent systems. Berlin/ Heidelberg: Springer; 2011. p. 460–7.
  • [28] Dernbach S, Das B, Krishnan NC, Thomas BL, Cook DJ. Simple and complex activity recognition through smart phones. 2012 8th International Conference on IEEE Intelligent Environments (IE); 2012.
  • [29] Mathie M. Monitoring and interpreting human movement patterns using a triaxial accelerometer [A dissertation submitted in fulfilment of the requirements for the degree of Doctor of Philosophy]. The University of New South Wales; 2003.
  • [30] OpenSignal. Mobile sensors database. OpenSignal; 2016, Available: https://opensignal.com/sensors/library/ [online; accessed 17.07.16].
  • [31] Shoaib M, Scholten H, Havinga PJM. Towards physical activity recognition using smartphone sensors. 2013 IEEE 10th International Conference on Ubiquitous Intelligence and Computing and 10th International Conference on Autonomic and Trusted Computing (UIC/ATC); 2013.
  • [32] Mustafa K, Incel O, Ersoy C. Online human activity recognition on smart phones. 2nd International Workshop on Mobile Sensing; 2012.
  • [33] Morales J, Akopian D, Agaian S. Human activity recognition by smartphones regardless of device orientation. Mobile Devices and Multimedia: Enabling Technologies, Algorithms, and Applications 2014, Proc. of SPIE-IS&T Electronic Imaging, SPIE vol. 9030; 2014.
  • [34] Lu H, Yang J, Liu Z, Lane ND, Choudhury T, Campbell AT. The Jigsaw continuous sensing engine for mobile phone applications. Proceedings of the 8th ACM Conference on Embedded Networked Sensor Systems; 2010.
  • [35] Mashita T, Komaki D, Iwata M, Shimatani K, Miyamoto H, Hara T, et al. A content search system for mobile devices based on user context recognition. 2012 IEEE Virtual Reality Short Papers and Posters (VRW); 2012.
  • [36] Mashita T, Shimatani K, Iwata M, Miyamoto H, Komaki D, Hara T, et al. Human activity recognition for a content search system considering situations of smartphone users. 2012 IEEE Virtual Reality Short Papers and Posters (VRW); 2012.
  • [37] Oja E. A simplified neuron model as a principal component analyzer. J Math Biol 1982;15(3):267–73.
  • [38] Becker S, Plumbley M. Unsupervised neural network learning procedures for feature extraction and classification. J Appl Intell 1996;6:185–203.
  • [39] Gjoreski H, Bizjak J, Gjoreski M, Gams M. Comparing deep and classical machine learning methods for human activity recognition using wrist accelerometer. Available online: http://www.cc.gatech.edu/~alanwags/DLAI2016/2.%20(Gjoreski+)%20Comparing%20Deep%20and%20Classical%20Machine%20Learning%20Methods%20for%20Human%20Activity%20Recognition%20using%20Wrist%20Accelerometer.pdf.
  • [40] Lane ND, Georgiev P. Can deep learning revolutionize mobile sensing? Proceedings of the 16th International Workshop on Mobile Computing Systems and Applications; 2015.
  • [41] Lara ÓD, Pérez AJ, Labrador MA, Posada JD. Centinela: a human activity recognition system based on acceleration and vital sign data. Pervasive Mob Comput 2012;8(5):717–29.
  • [42] Figo D, Diniz P, Ferreira D, Cardoso J. Preprocessing techniques for context recognition from accelerometer data. Pers Ubiquitous Comput 2010;14(7):645–62.
  • [43] Yun X, Calusdian J, Bachmann ER, McGhee RB. Estimation of human foot motion during normal walking using inertial and magnetic sensor measurement. IEEE Trans Instrum Meas 2012;61(7):2059–72.
  • [44] Kwon Y, Kang K, Bae C. Analysis and evaluation of smartphone-based human activity recognition using a neural network approach. 2015 International Joint Conference on Neural Networks (IJCNN); 2015.
  • [45] Riboni D, Bettini C. Cosar: hybrid reasoning for context-aware activity recognition. Pers Ubiquitous Comput 2011;15:271–89.
  • [46] Spruyt V. The Curse of Dimensionality in classification; 2014, Available: http://www.visiondummy.com/2014/04/curse-dimensionality-affect-classification/ [online; accessed 02.07.16].
  • [47] Gutierrez-Osuna R. Introduction to pattern recognition. Available: http://psi.cse.tamu.edu/teaching/lecture_notes/ [online; accessed 02.07.16].
  • [48] Hall M, Smith L. Feature selection for machine learning: comparing a correlation-based filter approach to the wrapper. Proceedings of the Twelfth International Florida Artificial Intelligence Research Society Conference; 1999.
  • [49] Lara OD, Labrador MA. A survey on human activity recognition using wearable sensors. IEEE Commun Surv Tutor 2012;15(3):1192–209.
  • [50] Quinlan JR. C4.5: Programs for machine learning. Morgan Kaufmann; 1993.
  • [51] Nielsen MA. Neural networks and deep learning. Determination Press; 2015, Available: http://neuralnetworksanddeeplearning.com/chap4.html [online; accessed 02.07.16].
  • [52] Hall M, Frank E, Holmes G, Pfahringer B, Reutemann P, Witten I. The WEKA data mining software: an update. SIGKDD Explor Newsl 2009;11(1):10–8.
  • [53] Arndt H, Bundschus M, Nägele A. Java Data Mining Package, a library for machine learning and big data analytics; 2016, Available: https://jdmp.org/ [online; accessed 02.07.16].
  • [54] Kuncheva LI. Combining pattern classifiers: methods and algorithms. John Wiley & Sons, Inc.; 2004 [online book].
  • [55] Collobert R, Bengio S, Mariéthoz J. Torch: a modular machine learning software library. IDIAP research report; 2002, Martigny, Switzerland.
  • [56] Jiang W, Yin Z. Human activity recognition using wearable sensors by deep convolutional neural networks. Proceedings of the 23rd ACM International Conference on Multimedia (MM'15); 2015.
  • [57] Ravi D, Wong C, Lo B, Yang GZ. Deep learning for human activity recognition: a resource efficient implementation on low-power devices. 2016 IEEE 13th International Conference on Wearable and Implantable Body Sensor Networks (BSN); 2016.
  • [58] Frigo M, Johnson SG. The design and implementation of FFTW3. Proc IEEE 2005;93(2):216–31.
  • [59] Rabiner L. A tutorial on hidden Markov models and selected applications in speech recognition. Proc IEEE 1989;77(2):257–86.
  • [60] Bouten C, Westerterp K, Verduin M, Janssen J. Assessment of energy expenditure for physical activity using a triaxial accelerometer. Med Sci Sports Exerc 1994;26(12):1516–23.
  • [61] Hendelman D, Miller K, Baggett C, Debold E, Freedson P. Validity of accelerometry for the assessment of moderate intensity physical activity in the field. Med Sci Sports Exerc 2000;32(9 Suppl.):S442–9.
  • [62] Lazzer S, Busti C, Galli R, Boniello S, Agosti F, Lafortuna C, et al. Physical activity ratios for various commonly performed sedentary and physical activities in obese adolescents. J Endocrinol Investig 2009;32(1):79–82.
  • [63] He Z, Jin L. Activity recognition from acceleration data based on discrete cosine transform and SVM. IEEE International Conference on Systems, Man and Cybernetics, 2009. SMC 2009; 2009.
  • [64] Reyes-Ortiz J-L, Oneto L, Samà A, Parra X, Anguita D. Transition-aware human activity recognition using smartphones. Neurocomputing 2016;171:754–67.
  • [65] Bird S, Klein E, Loper E. Chapter 6: Learning to classify text. Natural language processing with python; July 2015. Available: http://www.nltk.org/book/ch06.html [online; accessed 07.07.16].
Notes
Prepared from MNiSW funds under agreement 812/P-DUN/2016 for activities popularizing science (2017 tasks).
Document type
Bibliography
YADDA identifier
bwmeta1.element.baztech-4146e6fe-53b2-4ca0-b7d6-4cae2a47e23c