2023 | Vol. 30, no. 4 | 737-754
Article title

A yaw tracking algorithm for head movement from inertial sensors data

Content
Title variants
Publication languages
EN
Abstracts
EN
Monitoring head movements is important in many areas of life, from medicine and rehabilitation to sports and VR entertainment. In this study, we used recordings from two inertial sensors, i.e. an accelerometer and a gyroscope, to calculate the angles of movement of the gesturing person's head. For the yaw motion, we proposed an original algorithm that uses only these two inertial sensors together with the motion type detected by a pre-trained SVM classifier. Combining the gyroscope data with the detected motion type allowed us to calculate the yaw angle without the need for additional sensors such as a magnetometer or a video camera. To verify the accuracy of our algorithm, we used a robotic arm that simulated head gestures, with reference angle values read out from the robot kinematics. The calculated yaw angles differed from the robot's readings with a mean absolute error of approx. 1 degree, and the proportion of differences exceeding 5 degrees remained well below 1 percent, except for one outlier at 1.12%. This level of accuracy is sufficient for many applications, such as VR systems, human-system interfaces, or rehabilitation.
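The abstract only outlines the idea of combining gyroscope integration with a classifier-detected motion type. A minimal sketch of that gating scheme is shown below; it is not the authors' implementation — the function names, the "yaw" motion label, the rectangular integration rule, and the MAE helper are all assumptions for illustration.

```python
import numpy as np

def track_yaw(gyro_z, motion_types, dt, yaw_label="yaw"):
    """Integrate the gyroscope yaw rate only for samples the (pre-trained)
    classifier labels as a yaw gesture; hold the angle otherwise.

    gyro_z       -- angular rate about the vertical axis, in deg/s
    motion_types -- per-sample motion label from the classifier
    dt           -- sampling interval, in seconds
    Returns the yaw-angle trajectory in degrees.
    """
    yaw = np.zeros(len(gyro_z))
    angle = 0.0
    for i, (rate, label) in enumerate(zip(gyro_z, motion_types)):
        if label == yaw_label:
            angle += rate * dt  # simple rectangular integration
        yaw[i] = angle          # angle is held during non-yaw motion
    return yaw

def mean_absolute_error(estimated, reference):
    """MAE between estimated yaw angles and reference (e.g. robot) readings."""
    est = np.asarray(estimated, dtype=float)
    ref = np.asarray(reference, dtype=float)
    return float(np.mean(np.abs(est - ref)))
```

Gating the integration on the detected motion type is what limits drift here: gyroscope bias is only accumulated during the (short) intervals actually classified as yaw gestures, which is why no magnetometer is needed to stabilize the heading.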
Publisher

Year
Pages
737-754
Physical description
Bibliography: 49 items, figures, tables, charts
Authors
  • Łódź University of Technology, Faculty of Electrical, Electronic, Computer and Control Engineering, Institute of Electronics, Al. Politechniki 10, 93-590 Łódź, Poland, anna.borowska-terka@p.lodz.pl
  • Łódź University of Technology, Faculty of Electrical, Electronic, Computer and Control Engineering, Institute of Electronics, Al. Politechniki 10, 93-590 Łódź, Poland, pawel.strumillo@p.lodz.pl
Remarks
We would like to thank the staff of the Institute of Automatic Control of Łódź University of Technology,
in particular prof. Grzegorz Granosik and Łukasz Chlebowicz, M.Sc., for providing access to the robot arm
KUKA LBR iiwa 14 R820, and for their assistance in carrying out the measurements.
Document type
Bibliography
Identifiers
YADDA identifier
bwmeta1.element.baztech-f70f92fc-26ee-4c74-8fb1-7399173c5432