Article title

Processing of LiDAR and IMU data for target detection and odometry of a mobile robot

Publication language
EN
Abstracts
EN
This paper demonstrates the processing of data from a 3D light detection and ranging (LiDAR) sensor mounted on a mobile robot, introducing an innovative methodology for managing the data and extracting useful information. The robot has a modular design that allows the number and arrangement of its wheels to be changed easily; it was designed to travel through several types of environment and saves energy by adapting its wheel configuration to each one. In addition, the robot can recognize landmarks in a structured environment by applying a classification technique to each frame acquired by the LiDAR. Furthermore, based on experimental tests, a new, simple algorithm is proposed that fuses the processed LiDAR data with inertial (IMU) data through a Kalman filter to estimate the robot's pose relative to fixed landmarks in the surrounding environment. Finally, the limits of the proposed algorithm are analyzed, highlighting prospective improvements that would permit autonomous navigation and environment perception with a simple, modular, and low-cost device.
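The abstract does not detail the structure of the proposed filter, so the following is only a minimal Python sketch, under stated assumptions, of one common way such a LiDAR/IMU fusion can be arranged: an extended-Kalman-style prediction step driven by IMU-derived speed and yaw rate, and an update step that consumes an absolute pose obtained by matching fixed landmarks in a LiDAR frame. The state choice, noise values, and the landmark-derived pose measurement are illustrative assumptions, not the paper's implementation.

# Minimal sketch of the kind of LiDAR/IMU pose fusion the abstract describes.
# All state choices and noise values are assumptions for illustration.
import numpy as np

class PoseKalmanFilter:
    """Kalman filter over a planar pose state [x, y, theta]."""

    def __init__(self, q=0.01, r=0.05):
        self.x = np.zeros(3)       # pose estimate [x, y, theta]
        self.P = np.eye(3)         # state covariance
        self.Q = q * np.eye(3)     # process noise (IMU integration drift)
        self.R = r * np.eye(3)     # measurement noise (LiDAR landmark fit)

    def predict(self, v, omega, dt):
        """Propagate the pose with IMU-derived speed v and yaw rate omega."""
        theta = self.x[2]
        self.x = self.x + np.array([v * np.cos(theta) * dt,
                                    v * np.sin(theta) * dt,
                                    omega * dt])
        # Jacobian of the unicycle motion model w.r.t. the state
        # (angle wrapping is omitted to keep the sketch short)
        F = np.array([[1.0, 0.0, -v * np.sin(theta) * dt],
                      [0.0, 1.0,  v * np.cos(theta) * dt],
                      [0.0, 0.0,  1.0]])
        self.P = F @ self.P @ F.T + self.Q

    def update(self, z):
        """Correct with an absolute pose z = [x, y, theta], assumed to come
        from matching fixed landmarks in the current LiDAR frame."""
        H = np.eye(3)                          # pose is measured directly
        S = H @ self.P @ H.T + self.R
        K = self.P @ H.T @ np.linalg.inv(S)
        self.x = self.x + K @ (z - H @ self.x)
        self.P = (np.eye(3) - K @ H) @ self.P

# Usage: predict at the IMU rate, update whenever a LiDAR frame
# yields a landmark-based pose fix.
kf = PoseKalmanFilter()
kf.predict(v=0.2, omega=0.05, dt=0.1)          # IMU step
kf.update(np.array([0.021, 0.001, 0.0052]))    # LiDAR landmark step
print(kf.x)

In this arrangement the IMU drives the high-rate prediction while the lower-rate LiDAR landmark fixes bound the accumulated drift, which matches the abstract's description of characterizing the pose against fixed landmarks.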
Authors
  • Department of Innovation Engineering, University of Salento, Lecce, 73100, Italy
  • Department of Control Engineering, Kyushu Institute of Technology, Kitakyushu, Japan
  • Department of Innovation Engineering, University of Salento, Lecce, 73100, Italy
  • Universidad Panamericana, Aguascalientes, Ags., 20290, Mexico
  • Department of Innovation Engineering, University of Salento, Lecce, 73100, Italy
Bibliography
  • [1] T. Goelles, B. Schlager, S. Muckenhuber, "Fault Detection, Isolation, Identification and Recovery (FDIIR) Methods for automotive perception sensors including a detailed literature survey for LiDAR", Sensors, vol. 20, 3662, 2020, pp. 1–21.
  • [2] T. Raj, F. Hanim Hashim, A. Baseri Huddin, M. Faisal Ibrahim, A. Hussain, "A survey on LiDAR scanning mechanisms", Electronics, vol. 9, 741, 2020, pp. 1–25.
  • [3] C. Debeunne, D. Vivet, "A review of visual-LiDAR fusion based simultaneous localization and mapping", Sensors, vol. 20, 2068, 2020, pp. 1–20.
  • [4] P. Forsman, A. Halme, “3-D mapping of natural environments with trees by means of mobile perception”, IEEE Transactions on Robotics, vol. 21, 2005, pp. 482–490. 10.1109/TRO.2004.838003
  • [5] T. Tsubouchi, A. Asano, T. Mochizuki, S. Kondou, K. Shiozawa, M. Matsumoto, S. Tomimura, S. Nakanishi, A. Mochizuki, Y. Chiba, K. Sasaki, T. Hayami, "Forest 3D Mapping and Tree Sizes Measurement for Forest Management Based on Sensing Technology for Mobile Robots", Berlin, Heidelberg: Springer Berlin Heidelberg, 2014, pp. 357–368.
  • [6] J. Billingsley, A. Visala, M. Dunn, "Robotics in agriculture and forestry", Springer Handbook of Robotics, Springer, 2008, pp. 1065–1077.
  • [7] X. Liang, P. Litkey, J. Hyyppa, H. Kaartinen, M. Vastaranta, M. Holopainen, “Automatic Stem Mapping Using Single-Scan Terrestrial Laser Scanning”, IEEE Transactions on Geoscience and Remote Sensing, vol. 50, 2012, pp. 661–670. 10.1109/TGRS.2011.2161613
  • [8] M. A. Juman, Y. W. Wong, R. K. Rajkumar, L. J. Goh, “A novel tree trunk detection method for oil-palm plantation navigation”, Computers and Electronics in Agriculture, vol. 128, 2016, pp. 172–180. 10.1016/j.compag.2016.09.002
  • [9] S. Li, H. Feng, K. Chen, K. Chen, J. Lin, L. Chou, “Auto-maps generation through self path generation in ROS based Robot Navigation”, Journal of Applied Science and Engineering, vol. 21, no. 3, 2018, pp. 351–360.
  • [10] M. Ocando, N. Certad, S. Alvarado, A. Terrones, "Autonomous 2D SLAM and 3D mapping of an environment using a single 2D Lidar and ROS", Proc. of 2017 Latin American Robotics Symposium, 2017.
  • [11] Y. Wang, C. Peng, A. Ravankar, A. Ravankar, "A single LiDAR-based feature fusion indoor localization algorithm", Sensors, vol. 18, 1294, 2018, pp. 1–19.
  • [12] A. Lay-Ekuakille, A. Trotta, “Binomial Filtering to Improve Backscattered Lidar Signal Content”, Proc. of XVII IMEKO World Congress, June 22–27, 2003, Dubrovnik, Croatia.
  • [13] N. I. Giannoccaro, T. Nishida, "The Design, Fabrication and Preliminary Testing of a Variable Configuration Mobile Robot", International Journal of Robotics and Automation Technology, vol. 6, 2019, pp. 47–54.
  • [14] N.I. Giannoccaro, T. Nishida, “Analysis of the surrounding environment using an innovative algorithm based on lidar data on a modular mobile robot”, Journal of Automation, Mobile Robotics and Intelligent Systems, vol. 14, no. 4, 2020.
  • [15] N.I. Giannoccaro, L. Spedicato, C. Di Castri, "A new strategy for spatial reconstruction of orthogonal planes using a rotating array of ultrasonic sensors", IEEE Sensors Journal, vol. 12, no. 5, 2012, pp. 1307–1316.
  • [16] N.I. Giannoccaro, L. Spedicato, “Exploratory data analysis for robot perception of room environments by means of an in-air sonar scanner”, Ultrasonics, vol. 53, no. 6, 2013, pp. 1163–1173.
  • [17] N.I. Giannoccaro, L. Spedicato, L. Aiello, "Kernel PCA and approximate pre-images to extract the closest ultrasonic arc from the scanning of indoor specular environments", Measurement: Journal of the International Measurement Confederation, vol. 58, no. 1, 2014, pp. 46–60.
  • [18] W. Farag, “Real-Time Autonomous Vehicle Localization Based on Particle and Unscented Kalman Filters”, Journal of Control, Automation and Electrical Systems, vol. 32, no. 2, 2021, pp. 309–325.
  • [19] W. Xu, F. Zhang, "FAST-LIO: A fast, robust, LiDAR-inertial odometry package by tightly-coupled iterated Kalman filter", IEEE Robotics and Automation Letters, vol. 6, no. 2, 2021, pp. 3317–3324.
  • [20] P. Chauchat, A. Barrau, S. Bonnabel, "Factor Graph-Based Smoothing without Matrix Inversion for Highly Precise Localization", IEEE Transactions on Control Systems Technology, vol. 29, no. 3, 2021, pp. 1219–1232.
  • [21] S. Muro, I. Yoshida, M. Hashimoto, T. Takahashi, "Moving-object detection and tracking by scanning LiDAR mounted on motorcycle based on dynamic background subtraction", Artificial Life and Robotics, vol. 26, no. 4, 2021, pp. 412–422.
  • [22] H. Zhang, N. Chen, Z. Dai, G. Fan, "A Multi-level Data Fusion Localization Algorithm for SLAM", Jiqiren/Robot, vol. 43, no. 6, 2021, pp. 641–652.
  • [23] J. Zhang, L. Xu, C. Bao, “An Adaptive Pose Fusion Method for Indoor Map Construction”, International Journal of Geo-Information, vol. 10, 2021, pp. 1–22.
  • [24] M. Morita, T. Nishida, Y. Arita, M. Shige-eda, E. di Maria, R. Gallone, N.I. Giannoccaro, "Development of Robot for 3D Measurement of Forest Environment", Journal of Robotics and Mechatronics, vol. 30, 2018, pp. 145–154. 10.20965/jrm.2018.p0145.
  • [25] https://hokuyo-usa.com/application/files/7815/9111/2405/YVT-35LX-FK_Specifications.pdf
  • [26] https://www.invensense.com/products/motion-tracking/6-axis/mpu-6500/
  • [27] https://www.cytron.io/p-10amp-5v-30v-dc--motor-driver-2-channels
  • [28] M. Quigley, K. Conley, B. Gerkey, J. Faust, T. Foote, J. Leibs, R. Wheeler, A. Y. Ng, “ROS: an open-source Robot Operating System”, Proc. of ICRA workshop on open source software, Kobe, Japan, vol. 3, no. 5, 2009.
  • [29] https://www.vizrt.com/en/products/viz-virtual-studio
  • [30] Matlab, “Computer Vision Toolbox”, The Mathworks, 2018.
Notes
Record created with funds from the Ministry of Education and Science (MEiN), agreement no. SONP/SP/546092/2022, under the "Social Responsibility of Science" programme, module: Popularisation of science and promotion of sport (2022–2023).
Document type
Bibliography
YADDA identifier
bwmeta1.element.baztech-c5899b4e-1175-46a4-8eca-0ce49ff87429