

Article title

Aspects of Microsoft Kinect sensor application to servomotor control

Content
Identifiers
Title variants
Publication languages
EN
Abstracts
EN
This paper presents the design process of a gesture control system based on the Microsoft Kinect sensor. An environment enabling implementation of the integrated system with a variety of hardware and software was selected and prepared. A method for integrating the sensor with the Arduino environment is also discussed. Algorithms for remote gesture control of a given servodrive angle and of the robot arm gripper position were prepared. The results of several experiments are presented; these were carried out to determine the optimal method for starting, controlling, and stopping the drive and to assess the accuracy of the proposed arm-control method.
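The record does not reproduce any source code; purely as a rough illustration of the Kinect-to-Arduino link described in the abstract, the sketch below shows how the receiving (Arduino) side of such a system might look. It assumes the gesture-tracking host maps the tracked hand position to an angle of 0-180 degrees and sends it as a single byte over a 9600-baud serial link; the board then applies that angle to a hobby servo on pin 9. The pin number, baud rate, and one-byte protocol are assumptions made for this example, not details taken from the paper.

    // Hypothetical Arduino-side sketch (illustration only, not the authors' code).
    // Assumes the Kinect host sends the requested servo angle (0-180)
    // as one byte per update over the serial port.
    #include <Servo.h>

    Servo gripperServo;              // servo driving the gripper (assumed on pin 9)
    const int SERVO_PIN = 9;

    void setup() {
      Serial.begin(9600);            // serial link to the gesture-tracking host
      gripperServo.attach(SERVO_PIN);
      gripperServo.write(90);        // start from the mid position
    }

    void loop() {
      if (Serial.available() > 0) {
        int angle = Serial.read();           // one byte = requested angle
        angle = constrain(angle, 0, 180);    // clamp to the servo's range
        gripperServo.write(angle);           // move the servo to the requested angle
      }
    }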
Year
Pages
595–601
Physical description
Bibliography: 25 items, figures, photographs, charts.
Authors
author
  • Poznan University of Technology, Chair of Control and System Engineering, Division of Signal Processing and Electronic Systems, 24 Jana Pawla II, 60-965 Poznan, Poland
  • Poznan University of Technology, Chair of Control and System Engineering, Division of Signal Processing and Electronic Systems, 24 Jana Pawla II, 60-965 Poznan, Poland
  • Poznan University of Technology, Chair of Control and System Engineering, Division of Signal Processing and Electronic Systems, 24 Jana Pawla II, 60-965 Poznan, Poland
Bibliography
  • [1] T. Marciniak, A. Dabrowski, A. Chmielewska, and A.A. Krzykowska, “Selection of parameters in iris recognition system”, Multimedia Tools and Applications 68 (1), 193–208 (2014).
  • [2] P. Pawlowski, K. Borowczyk, T. Marciniak, and A. Dabrowski, “Real-time object tracking using motorized camera”, Proc. 13th IEEE Conference on Signal Processing: Algorithms, Architectures, Arrangements, and Applications (SPA) 1, 110–115 (2009).
  • [3] S. Kumar and J. Segen, “Gesture based 3D man-machine interaction using a single camera”, Proc. Int. Conf. on Multimedia Computing and Systems 1, 630–635 (1999).
  • [4] A. Prieto, F. Bellas, R.J. Duro, and F. Lopez-Pena, “An adaptive visual gesture based interface for human machine interaction in intelligent workspaces”, Proc. Int. Conf. on Virtual Environments, Human Computer Interfaces and Measurement Systems 1, 43–48 (2006).
  • [5] Y.V. Parkale, “Gesture based operating system control”, Proc. Int. Conf. on Advanced Computing and Communication Technologies 1, 318–323 (2012).
  • [6] D. Ionescu, B. Ionescu, C. Gadea, and S. Islam, “An intelligent gesture interface for controlling TV sets and set-top boxes”, Proc. Symp. on Applied Computational Intelligence and Informatics 1, 159–164 (2011).
  • [7] M. Lech and B. Kostek, “Gesture-based computer control system applied to the interactive whiteboard”, Proc. Int. Conf. on Industrial Technology 1, 75–78 (2012).
  • [8] M. Nishino, Y. Nakanishi, M. Sugi, Y. Soo, J. Ota, and T. Arai, “Design of gesture interface for deskwork support system”, Proc. Int. Joint Conf. ICROS-SICE 1, 2260–2267 (2009).
  • [9] B. Mrazovac, M.Z. Bjelica, D. Simic, S. Tikvic, and I. Papp, “Gesture based hardware interface for RF lighting control”, Proc. Int. Symp. on Intelligent Systems and Informatics 1, 309–314 (2011).
  • [10] M. Manigandan and I.M. Jackin, “Wireless vision based mobile robot control using hand gesture recognition through perceptual color space”, Proc. Int. Conf. on Advances in Computer Entertainment Technology 1, 95–99 (2010).
  • [11] A. Wakabayashi, S. Motomura, and S. Kato, “Communicative humanoid robot control systems reflecting human body movement”, Proc. Int. Symp. on Micro-NanoMechatronics and Human Science 1, 122–127 (2011).
  • [12] W. Song, X. Guo, F. Jiang, S. Jang, G. Jiang, and Y. Shi, “Teleoperation humanoid robot control system based on Kinect sensor”, Proc. Int. Conf. on Intelligent Human-Machine Systems and Cybernetics 2, 264–267 (2012).
  • [13] N. Helmi and M. Helmi, “Applying a neuro-fuzzy classifier for gesture-based control using a single wrist-mounted accelerometer”, Proc. Int. Symp. on Computational Intelligence in Robotics and Automation 1, 216–221 (2009).
  • [14] A. Wojciechowski, “Hand’s poses recognition as a mean of communication within natural user interfaces”, Bull. Pol. Ac.: Tech. 60 (2), 955–971 (2012).
  • [15] A. Babiarz, R. Bieda, K. Jaskot, and J. Klamka, “The dynamics of the human arm with an observer for the capture of body motion parameters”, Bull. Pol. Ac.: Tech. 61 (4), 955–971 (2013).
  • [16] Kinect for Windows: Product Features, http://www.microsoft.com/en-us/kinectforwindows/discover/features.aspx (accessed on 29.09.2013).
  • [17] Microsoft Developer Network: Kinect Sensor, http://msdn.microsoft.com/en-us/library/hh438998.aspx (accessed on 29.09.2013).
  • [18] T. Kowalczyk, Kinect SDK – Introduction, Microsoft Knowledge Base, http://msdn.microsoft.com/pl-pl/library/kinect-sdkwprowadzenie (accessed on 29.09.2013).
  • [19] Educational Robot: Robot Arm Pro. Mounting Instructions: Model RA1-PRO, Arexx Engineering, http://www.arexx.com/robot arm/html/en/index.htm (accessed on 29.09.2013).
  • [20] The Documentation of the Processing Programming Environment and Libraries, http://processing.org/reference/ (accessed on 11.06.2013).
  • [21] The Documentation of the OpenNI Framework, http://www.openni.org/reference-guide/ (accessed on 11.06.2013).
  • [22] NITE Libraries Official Website, http://www.primesense.com/solutions/nite-middleware/ (accessed on 11.06.2013).
  • [23] P. Korohoda and A. Dabrowski, “Generalized convolution as a tool for the multi-dimensional filtering tasks”, Multidimensional Systems and Signal Processing 19 (3–4), 361–377 (2008).
  • [24] A. Dabrowski, A. Menzi, and G.S. Moschytz, “Design of switched-capacitor FIR filters with application to a low-power MFSK receiver”, IEE Proc.-G Circuits, Devices, and Systems 139 (4), 450–466 (1992).
  • [25] A. Dabrowski, R. Dlugosz, and P. Pawlowski, “Integrated CMOS GSM baseband channel selecting filters realized using switched capacitor finite impulse response technique”, Microelectronics Reliability 46 (5–6), 949–958 (2006).
Document type
Bibliography
YADDA identifier
bwmeta1.element.baztech-a04e42d5-53df-4b11-9774-9502faf04668