

Article title

Human action recognition using simple geometric features and a finite state machine

Identifiers
Title variants
Publication languages
EN
Abstracts
EN
This paper describes a human action recognition algorithm that uses background generation with shadow elimination, silhouette description based on simple geometrical features, and a finite state machine for recognizing particular actions. The performed tests indicate that this approach achieves an 81% correct recognition rate while allowing real-time processing of a 360 × 288 video stream.
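The final stage of the pipeline described above — feeding simple geometric silhouette features into a finite state machine, frame by frame — can be sketched as follows. This is a minimal illustration under stated assumptions, not the authors' implementation: the "fall" action, the bounding-box aspect-ratio feature, the state names, and all thresholds are invented for the example.

```python
class FallFSM:
    """Toy finite state machine that recognizes a hypothetical 'fall' action:
    an upright silhouette (tall bounding box, low width/height ratio) that
    turns horizontal (wide box) and stays that way for several frames.

    States: 'standing' -> 'falling' -> 'lying'. All thresholds are assumed.
    """

    def __init__(self, upright_max=0.7, lying_min=1.5, hold_frames=5):
        self.state = "standing"
        self.hold = 0                    # frames spent in candidate 'falling' pose
        self.upright_max = upright_max   # aspect ratio below this = upright
        self.lying_min = lying_min       # aspect ratio above this = horizontal
        self.hold_frames = hold_frames   # frames required to confirm the action

    def step(self, aspect_ratio):
        """Feed one frame's bounding-box width/height ratio.

        Returns the name of a recognized action ('fall') or None.
        """
        if self.state == "standing":
            if aspect_ratio >= self.lying_min:
                self.state = "falling"   # candidate transition; start counting
                self.hold = 0
        elif self.state == "falling":
            if aspect_ratio >= self.lying_min:
                self.hold += 1
                if self.hold >= self.hold_frames:
                    self.state = "lying"
                    return "fall"        # action confirmed after sustained pose
            else:
                self.state = "standing"  # pose not sustained; reject candidate
        elif self.state == "lying":
            if aspect_ratio <= self.upright_max:
                self.state = "standing"  # person got back up
        return None


# Usage: a sequence of per-frame aspect ratios simulating an upright person
# who suddenly ends up horizontal; only the sustained horizontal pose fires.
fsm = FallFSM()
events = [fsm.step(r) for r in [0.5, 0.5, 1.8, 1.8, 1.8, 1.8, 1.8, 1.8]]
```

In a real system the aspect ratio would come from the bounding box of the foreground silhouette produced by the background-subtraction stage; requiring the pose to persist for several frames is one simple way to make the machine robust to single-frame segmentation noise.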
Authors
author
  • AGH University of Science and Technology, al. Mickiewicza 30, 30-059 Kraków
author
  • AGH University of Science and Technology, al. Mickiewicza 30, 30-059 Kraków
author
  • AGH University of Science and Technology, al. Mickiewicza 30, 30-059 Kraków
Bibliography
  • [1] J.K. Aggarwal, M.S. Ryoo, Human activity analysis: A review, ACM Comput. Surv., vol. 43, no. 3, pp. 16:1-16:43, 2011
  • [2] M. Ekinci, E. Gedikli, Silhouette based human motion detection and analysis for real-time automated video surveillance, Turkish Journal of Electrical Engineering, vol. 12, no. 2, pp. 200-229, 2005
  • [3] T. Horprasert, D. Harwood, L.S. Davis, A statistical approach for real-time robust background subtraction and shadow detection, In Proceedings: International Conference on Computer Vision (ICCV), pp. 1-19, 1999
  • [4] J.-W. Hsieh, Y.-T. Hsu, H.-Y.M. Liao, C.-C. Chen, Video-based human movement analysis and its application to surveillance systems, IEEE Transactions on Multimedia, vol. 10, no. 3, pp. 372-384, 2008
  • [5] S.-R. Ke, H.L.U. Thuc, Y.-J. Lee, J.-N. Hwang, J.-H. Yoo, K.-H. Choi, A review on video-based human activity recognition, Computers, vol. 2, no. 2, pp. 88-131, 2013
  • [6] T. Kryjak, M. Komorkiewicz, M. Gorgoń, Real-time background generation and foreground object segmentation for high definition colour video stream in FPGA device, Journal of Real-Time Image Processing, pp. 1-17, 2012
  • [7] A.P.B. Lopes, R.S. Oliveira, J.M. de Almeida, A. de A. Araújo, Spatio-temporal frames in a bag-of-visual-features approach for human actions recognition, In Proceedings of the 2009 XXII Brazilian Symposium on Computer Graphics and Image Processing (SIBGRAPI '09), pp. 315-321, 2009
  • [8] P. Natarajan, R. Nevatia, Coupled hidden semi-Markov models for activity recognition, In IEEE Workshop on Motion and Video Computing (WMVC '07), pp. 10-10, 2007
  • [9] OpenCV, Open Source Computer Vision, website: http://opencv.org/ (last access: 16.11.2013), 2013
  • [10] M.D. Rodriguez, J. Ahmed, M. Shah, Action MACH: a spatio-temporal maximum average correlation height filter for action recognition, In IEEE Conference on Computer Vision and Pattern Recognition (CVPR 2008), pp. 1-8, 2008
  • [11] M.S. Ryoo, J.K. Aggarwal, Semantic representation and recognition of continued and recursive human activities, Int. J. Comput. Vision, vol. 82, no. 1, pp. 1-24, 2009
  • [12] Y. Sheikh, M. Sheikh, M. Shah, Exploring the space of a human action, In Tenth IEEE International Conference on Computer Vision (ICCV 2005), vol. 1, pp. 144-149, 2005
  • [13] R. Tadeusiewicz, Artificial intelligence applied to the intelligent buildings, In 6th International Congress on Intelligent Building Systems, pp. 1-11, 2011
  • [14] C. Wren, A. Azarbayejani, T. Darrell, A. Pentland, Pfinder: Real-time tracking of the human body, IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 19, pp. 780-785, 1997
  • [15] L. Xia, C.-C. Chen, J.K. Aggarwal, View invariant human action recognition using histograms of 3D joints, In The 2nd International Workshop on Human Activity Understanding from 3D Data (HAU3D), in conjunction with IEEE CVPR 2012, 2012
Document type
Bibliography
YADDA identifier
bwmeta1.element.baztech-911a91f9-0e1f-462d-ae8f-fcb420db9719