Article title

System sensoryczny wielomodalnej percepcji otoczenia - koncepcja i analiza przypadków użycia

Identifiers
Title variants
EN
Sensory system of multimodal perception of the environment - concept and analysis of use cases
Publication languages
PL
Abstracts
PL
Niniejszy referat prezentuje koncepcję multimodalnej percepcji otoczenia opartą na trzech typach sensorów tj. sonarze ultradźwiękowym, uproszczonej kamerze 3D oraz uproszczonej kamerze termowizyjnej. Podstawowym celem takiego wyboru jest otrzymanie układu sensorycznego, który redukuje ilość informacji podlegającej przetworzeniu. Jednocześnie układ ma dostarczyć lokalnie precyzyjnych danych o otoczeniu. Takie podejście umożliwi realizację nawigacji robota mobilnego, zadania SLAM oraz interakcji z ludźmi. Pod kątem przedstawionych założeń dokonano przeglądu wspominanych typów sensorów oraz metod przetwarzania danych. Całość zamknięto podsumowaniem, w którym wskazano konkretne sensory, które mogą stanowić podstawę proponowanego multimodalnego systemu sensorycznego.
EN
This paper presents a concept of multimodal perception of the environment based on three types of sensors: an ultrasonic sonar, a simplified 3D camera and a simplified thermal imaging camera. The primary purpose of this choice is to obtain a sensory system that reduces the amount of information to be processed while at the same time providing locally precise data about the environment. This approach is intended to enable mobile robot navigation, SLAM tasks and interaction with people. With these assumptions in mind, the above-mentioned sensor types and the associated data processing methods are reviewed. The paper closes with a summary that indicates specific sensors which can form the basis of the proposed multimodal sensory system.
Year
Pages
63-76
Physical description
Bibliography: 44 items, figures, tables, charts
Authors
  • Katedra Cybernetyki i Robotyki Wydziału Elektroniki, Fotoniki i Mikrosystemów Politechniki Wrocławskiej
  • Katedra Cybernetyki i Robotyki Wydziału Elektroniki, Fotoniki i Mikrosystemów Politechniki Wrocławskiej
  • Katedra Cybernetyki i Robotyki Wydziału Elektroniki, Fotoniki i Mikrosystemów Politechniki Wrocławskiej
Bibliography
  • 1. C. Abah et al. A multi-modal sensor array for safe human-robot interaction and mapping. In: 2019 International Conference on Robotics and Automation (ICRA). Proceedings, May 2019, pp. 3768-3774.
  • 2. Gianni Allevato et al. Embedded air-coupled ultrasonic 3D sonar system with GPU acceleration. In: 2020 IEEE SENSORS. Proceedings, 2020, pp. 1-4.
  • 3. Aisha Fahad Alraeeso et al. Privacy-preserved social distancing system using low-resolution thermal sensors and deep learning. In: 2021 IEEE International Conference on Systems, Man, and Cybernetics (SMC). Proceedings, 2021, pp. 66-71.
  • 4. B. Barshan, R. Kuc. Differentiating sonar reflections from corners and planes by employing an intelligent sensor. IEEE Trans. on Pattern Anal. and Mach. Intell., June 1990, vol. 12, no. 6, pp. 560-569.
  • 5. Long Chen et al. RGB-T SLAM: A flexible SLAM framework by combining appearance and thermal information. In: 2017 IEEE International Conference on Robotics and Automation (ICRA). Proceedings. IEEE, 5, 2017, pp. 5682-5687.
  • 6. Wenqiang Chen et al. EIL-SLAM: Depth-enhanced edge-based infrared-lidar SLAM. Journal of Field Robotics, 3, 2022, vol. 39, pp. 117-130.
  • 7. Hong Cheng, Lu Yang, Zicheng Liu. Survey on 3D Hand Gesture Recognition. IEEE Transactions on Circuits and Systems for Video Technology, 2016, vol. 26, no. 9, pp. 1659-1673.
  • 8. Cheng Chi. Underwater Real-Time 3D Acoustical Imaging. Signal and Communication Technology. Springer, 2019.
  • 9. Kacper Chmiel. Stanowisko laboratoryjne miniaturowej kamery termowizyjnej. Engineering diploma thesis, Politechnika Wrocławska, 2021.
  • 10. A. Elfes. Sonar-based real-world mapping and navigation. IEEE Journal on Robotics and Automation, 6, 1987, vol. 3, pp. 249-265.
  • 11. Christine Evers, Alastair H. Moore, Patrick A. Naylor. Acoustic simultaneous localization and mapping (a-SLAM) of a moving microphone array and its surrounding speakers. In: 2016 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP). Proceedings. IEEE, 3, 2016, vol. 2016-May, pp. 6-10.
  • 12. Dariush Forouher, Marvin Grose Besselmann, Erik Maehle. Sensor fusion of depth camera and ultrasound data for obstacle detection and robot navigation. In: 2016 14th International Conference on Control, Automation, Robotics and Vision (ICARCV). Proceedings. IEEE, 11, 2016, pp. 1-6.
  • 13. K. Kalgaonkar, B. Raj. One-handed gesture recognition using ultrasonic Doppler sonar. In: Acoustics, Speech and Signal Processing, 2009. ICASSP 2009. IEEE International Conference on. Proceedings, 2009, pp. 1889-1892.
  • 14. R. Kerstens, D. Laurijssen, J. Steckel. Low-cost one-bit MEMS microphone arrays for in-air acoustic imaging using FPGA's. In: 2017 IEEE SENSORS. Proceedings, October 2017, pp. 1-3.
  • 15. R. Kerstens, D. Laurijssen, J. Steckel. ERTIS: A fully embedded real time 3D imaging sonar sensor for robotic applications. In: 2019 International Conference on Robotics and Automation (ICRA). Proceedings, May 2019, pp. 1438-1443.
  • 16. Muhammad Shahzad Alam Khan et al. Investigation of widely used SLAM sensors using analytical hierarchy process. Journal of Sensors, 1, 2022, vol. 2022, pp. 1-15.
  • 17. L. Kleeman, R. Kuc. Mobile robot sonar for target localization and classification. Int. J. Robotics Res., August 1995, vol. 4, pp. 295-318.
  • 18. Da Kong, Yu Zhang, Weichen Dai. Direct near-infrared-depth visual SLAM with active lighting. IEEE Robotics and Automation Letters, 10, 2021, vol. 6, pp. 7057-7064.
  • 19. Bogdan Kreczmer. Gestures recognition by using ultrasonic range-finders. In: Methods and Models in Automation and Robotics (MMAR), 2011 16th International Conference on. Proceedings, August 2011, pp. 363-368.
  • 20. Bogdan Kreczmer. Estimation of the azimuth angle of the arrival direction for an ultrasonic signal by using indirect determination of the phase shift. Archives of Acoustics, 2019, vol. 44, no. 3.
  • 21. Bogdan Kreczmer. Influence of signal interference in determining direction of arrival by using the indirect phase determination method. In: Automation 2021: Recent Achievements in Automation, Robotics and Measurement Techniques. Proceedings. Eds. Roman Szewczyk, Cezary Zieliński, Małgorzata Kaliczyńska. Cham, Springer International Publishing, 2021, pp. 319-328.
  • 22. Bogdan Kreczmer, Piotr Portasiak. Experimental comparison of selected triangulation and ToF optical distance sensors. In: Automation 2022: New Solutions and Technologies for Automation, Robotics and Measurement Techniques. Proceedings. Eds. Roman Szewczyk, Cezary Zieliński, Małgorzata Kaliczyńska. Cham, Springer International Publishing, 2022, pp. 285-295.
  • 23. Miranda Krekovic, Ivan Dokmanic, Martin Vetterli. EchoSLAM: Simultaneous localization and mapping with acoustic echoes. In: 2016 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP). Proceedings. IEEE, 3, 2016, vol. 2016-May, pp. 11-15.
  • 24. Miranda Krekovic, Ivan Dokmanic, Martin Vetterli. Omnidirectional bats, point-to-plane distance, and the price of uniqueness. In: 2017 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP). Proceedings. IEEE, 3, 2017, pp. 3261-3265.
  • 25. Kengo Kuroki et al. A remote conversation support system for deaf-mute persons based on bimanual gestures recognition using finger-worn devices. In: 2015 IEEE International Conference on Pervasive Computing and Communication Workshops (PerCom Workshops). Proceedings, 2015, pp. 574-578.
  • 26. C. Mollaret et al. A multi-modal perception based assistive robotic system for the elderly. Computer Vision and Image Understanding, 2016, vol. 149, pp. 78-97. Special issue on Assistive Computer Vision and Robotics – "Assistive Solutions for Mobility, Communication and HMI".
  • 27. F. Noroozi et al. Survey on emotional body gesture recognition. IEEE Transactions on Affective Computing, April 2021, vol. 12, no. 2, pp. 505-523.
  • 28. Mohammad Obaid et al. A framework for user-defined body gestures to control a humanoid robot. International Journal of Social Robotics, 08, 2014, vol. 6, pp. 383-396.
  • 29. Hossein Raeis, Mohammad Kazemi, Shervin Shirmohammadi. Human activity recognition with device-free sensors for well-being assessment in smart homes. IEEE Instrumentation & Measurement Magazine, 2021, vol. 24, no. 6, pp. 46-57.
  • 30. Siddharth Swarup Rautaray, Anupam Agrawal. Vision based hand gesture recognition for human computer interaction: a survey. Artificial Intelligence Review, 2012, vol. 43, pp. 1-54.
  • 31. Y. Sang, L. Shi, Y. Liu. Micro Hand Gesture Recognition System Using Ultrasonic Active Sensing. ArXiv e-prints, December 2017.
  • 32. Shane Saunderson, Goldie Nejat. How Robots Influence Humans: A Survey of Nonverbal Communication in Social Human-Robot Interaction. International Journal of Social Robotics, 01, 2019, vol. 11.
  • 33. Young-Sik Shin, Ayoung Kim. Sparse depth enhanced direct thermal-infrared SLAM beyond the visible spectrum. IEEE Robotics and Automation Letters, 7, 2019, vol. 4, pp. 2918-2925.
  • 34. Sruthy Skaria et al. Deep-learning for hand-gesture recognition with simultaneous thermal and radar sensors. In: 2020 IEEE SENSORS. Proceedings, 2020, pp. 1-4.
  • 35. Shining Song, Dongsong Yan, Yongjun Xie. Design of control system based on hand gesture recognition. In: 2018 IEEE 15th International Conference on Networking, Sensing and Control (ICNSC). Proceedings, 2018, pp. 1-4.
  • 36. J. Steckel, A. Boen, H. Peremans. Broadband 3-D sonar system using a sparse array for indoor navigation. Robotics, IEEE Transactions on, February 2013, vol. 29, no. 1, pp. 161-171.
  • 37. Toposens. ECHO ONE DK – Toposens next-generation 3D ultrasonic sensing technology. https://toposens.com/echo-one-dk/, 2022.
  • 38. H.L. Van Trees. Optimum Array Processing: Part IV of Detection, Estimation, and Modulation Theory. Detection, Estimation, and Modulation Theory. Wiley, 2004.
  • 39. T. Verellen et al. URTIS: A small 3D imaging sonar sensor for robotic applications. In: ICASSP 2020 – 2020 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP). Proceedings, 2020, pp. 4801-4805.
  • 40. Thomas Verellen, Robin Kerstens, Jan Steckel. High-resolution ultrasound sensing for robotics using dense microphone arrays. IEEE Access, 2020, vol. 8, pp. 190083-190093.
  • 41. Stephen Vidas, Sridha Sridharan. Hand-held monocular SLAM in thermal-infrared. In: 2012 12th International Conference on Control Automation Robotics & Vision (ICARCV). Proceedings. IEEE, 12, 2012, pp. 859-864.
  • 42. C. Walter, H. Schweinzer. Locating of objects with discontinuities, boundaries and intersections using a compact ultrasonic 3D sensor. In: 2014 International Conference on Indoor Positioning and Indoor Navigation. Proceedings, October 2014, pp. 99-102.
  • 43. Indika Wijayasinghe et al. Human-robot gesture analysis for objective assessment of autism spectrum disorder. International Journal of Social Robotics, 11, 2016, vol. 8.
  • 44. Mubariz Zaffar et al. Sensors, SLAM and long-term autonomy: A review. In: 2018 NASA/ESA Conference on Adaptive Hardware and Systems (AHS). Proceedings. IEEE, 8, 2018, pp. 285-290.
Notes
Record created with funds from the Ministry of Education and Science (MEiN), agreement no. SONP/SP/546092/2022, under the "Społeczna odpowiedzialność nauki" (Social Responsibility of Science) programme, module: popularisation of science and promotion of sport (2024).
Document type
Bibliography
YADDA identifier
bwmeta1.element.baztech-d42509e4-2dca-4abe-b5f4-e29d63d887fc