Article title

On the application of RGB-D SLAM systems for practical localization of mobile robots

Full text / Content
Identifiers
Title variants
Publication languages
EN
Abstracts
EN
This paper considers the practical application of RGB-D Simultaneous Localization and Mapping (SLAM) techniques to the localization of mobile robots. We attempt to answer the question of how the quality of the estimated sensor trajectory depends on the approach to RGB-D data processing in the SLAM system when RGB-D frames acquired by a real mobile robot are used. Experiments are performed on data obtained from robots of different classes and from different types of environment in order to expose the problems characteristic of RGB-D data. Conclusions as to the robustness of particular architectures and solutions applied in SLAM are drawn on the basis of the experimental results. Publicly available data sets and well-established performance metrics are used to ensure that the results are verifiable, reproducible and relevant.
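The performance metrics are not named in this record; given ref. [35] in the bibliography below, a likely candidate is the Absolute Trajectory Error (ATE) of the TUM RGB-D benchmark. The following Python sketch is an illustrative assumption, not the authors' code: it shows how ATE RMSE is typically computed, by rigidly aligning the estimated trajectory to the ground truth with a least-squares (Horn/Kabsch) fit and reporting the root-mean-square translational error of the aligned positions.

import numpy as np

def align_rigid(gt, est):
    # Least-squares rigid alignment (rotation R, translation t) of the
    # estimated positions to the ground truth; gt and est are (N, 3) arrays
    # of timestamp-associated positions (Kabsch/Horn method).
    mu_gt, mu_est = gt.mean(axis=0), est.mean(axis=0)
    H = (est - mu_est).T @ (gt - mu_gt)
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T
    t = mu_gt - R @ mu_est
    return R, t

def ate_rmse(gt, est):
    # Root-mean-square translational error after rigid alignment.
    R, t = align_rigid(gt, est)
    residuals = gt - (est @ R.T + t)
    return np.sqrt((residuals ** 2).sum(axis=1).mean())

# Usage (hypothetical arrays; poses must first be associated by timestamp):
# print("ATE RMSE [m]:", ate_rmse(gt_xyz, est_xyz))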
Keywords
Authors
author
  • Poznań University of Technology, Institute of Control and Information Engineering, ul. Piotrowo 3A, 60-965 Poznań, Poland
author
  • Poznań University of Technology, Institute of Control and Information Engineering, ul. Piotrowo 3A, 60-965 Poznań, Poland
  • Poznań University of Technology, Institute of Control and Information Engineering, ul. Piotrowo 3A, 60-965 Poznań, Poland
Bibliography
  • [1] S. Baker, I. Matthews, ”Lucas-Kanade 20 years on: A unifying framework”, Int. Journal of Computer Vision, vol. 56, no. 3, 2004, 221–255. DOI: 10.1023/B:VISI.0000011205.11775.fd.
  • [2] H. Bay, A. Ess, T. Tuytelaars, L. Van Gool, ”Speeded-up robust features (SURF)”, Computer Vision and Image Understanding, vol. 110, no. 3, 2008, 346–359. DOI: 10.1016/j.cviu.2007.09.014.
  • [3] D. Belter, M. Nowicki, P. Skrzypczyński, ”On the performance of pose-based RGB-D visual navigation systems”, In: Computer Vision – ACCV 2014 (D. Cremers et al., eds.), LNCS 9004, Springer, 2015, 407–423. DOI: 10.1007/978-3-319-16808-1_28.
  • [4] D. Belter, M. Nowicki, P. Skrzypczyński, ”Accurate map-based RGB-D SLAM for mobile robots”, In: Robot 2015: Advances in Robotics (L. P. Reis et al., eds.), AISC 418, Springer, 2015, 533–545. DOI: 10.1007/978-3-319-27149-1_41.
  • [5] D. Belter, M. Nowicki, P. Skrzypczyński, ”Evaluating map-based RGB-D SLAM on an autonomous walking robot”, In: Challenges in Automation, Robotics and Measurement Techniques (R. Szewczyk et al., eds.), AISC 440, Springer, 2016, 469–481. DOI: 10.1007/978-3-319-29357-8_42.
  • [6] D. Belter, M. Nowicki, P. Skrzypczyński, ”Improving accuracy of feature-based RGB-D SLAM by modeling spatial uncertainty of point features”. In: Proc. IEEE Int. Conf. on Robotics and Automation, Stockholm, 2016, 1279–1284. DOI: 10.1109/ICRA.2016.7487259.
  • [7] P. Čížek, J. Faigl, ”On localization and mapping with RGB-D sensor and hexapod walking robot in rough terrains”. In: Proc. IEEE Int. Conf. on Systems, Man, and Cybernetics, Budapest, 2016, 2273–2278. DOI: 10.1109/SMC.2016.7844577.
  • [8] B. Curless, M. Levoy, ”A volumetric method for building complex models from range images”. In: Proc. 23rd Conf. on Computer Graphics and Interactive Techniques (SIGGRAPH), New Orleans, 1996, 303–312. DOI: 10.1145/237170.237269.
  • [9] I. Dryanovski, R. Valenti, J. Xiao, ”Fast visual odometry and mapping from RGB-D data”. In: Proc. IEEE Int. Conf. on Robotics & Automation, Karlsruhe, 2013, 5704–5711. DOI: 10.1109/ICRA.2013.6630889.
  • [10] D. W. Eggert, A. Lorusso, R. B. Fisher, ”Estimating 3-D rigid body transformations: a comparison of four major algorithms”, Machine Vision and Applications, vol. 9, no. 5–6, 1997, 272–290. DOI: 10.1007/s001380050048.
  • [11] F. Endres, J. Hess, J. Sturm, D. Cremers, W. Burgard, ”3-D mapping with an RGB-D camera”, IEEE Trans. on Robotics, vol. 30, no. 1, 2014, 177–187. DOI: 10.1109/TRO.2013.2279412.
  • [12] M. Fallon, H. Johannsson, M. Kaess, J. J. Leonard, ”The MIT Stata Center dataset”, Int. Journal of Robotics Research, vol. 32, no. 14, 2013, 1695–1699. DOI: 10.1177/0278364913509035.
  • [13] G. Grisetti, R. Kümmerle, C. Stachniss, W. Burgard, ”A tutorial on graph-based SLAM”, IEEE Intelligent Transportation Systems Magazine, vol. 2, no. 4, 2010, 31–43. DOI: 10.1109/MITS.2010.939925.
  • [14] A. Handa, T. Whelan, J. D. McDonald, A. J. Davison, ”A benchmark for RGB-D visual odometry, 3D reconstruction and SLAM”, IEEE Int. Conf. on Robotics & Automation, Hong Kong, 2014, 1524–1531. DOI: 10.1109/ICRA.2014.6907054.
  • [15] A. Hornung, K. O. Wurm, M. Bennewitz, C. Stachniss, W. Burgard, ”OctoMap: An efficient probabilistic 3D mapping framework based on octrees”, Autonomous Robots, vol. 34, no. 3, 2013, 189–206. DOI: 10.1007/s10514-012-9321-0.
  • [16] A. Kostusiak, ”The comparison of keypoint detectors and descriptors for registration of RGB-D data”, In: Challenges in Automation, Robotics and Measurement Techniques (R. Szewczyk et al., eds.), AISC 440, Springer, 2016, 609–622. DOI: 10.1007/978-3-319-29357-8_53.
  • [17] A. Kostusiak, M. Nowicki, P. Skrzypczyński, ”On the use of RGB-D SLAM for mobile robots localization”, Zeszyty Naukowe Politechniki Warszawskiej, no. 195, Postępy robotyki, tom 2, 2016, 387–396 (in Polish).
  • [18] M. Kraft, M. Nowicki, R. Penne, A. Schmidt, P. Skrzypczyński, ”Efficient RGB-D data processing for feature-based self-localization of mobile robots”, Int. Journal of Applied Mathematics and Computer Science, vol. 26, no. 1, 2016, 63–79. DOI: 10.1515/amcs-2016-0005.
  • [19] M. Kraft, M. Nowicki, A. Schmidt, M. Fularz, P. Skrzypczyński, ”Toward evaluation of visual navigation algorithms on RGB-D data from the first- and second-generation Kinect”, Machine Vision and Applications, vol. 28, no. 1, 2017, 61–74. DOI: 10.1007/s00138-016-0802-6.
  • [20] R. Kümmerle, G. Grisetti, H. Strasdat, K. Konolige, W. Burgard, ”g2o: A general framework for graph optimization”, Proc. IEEE Int. Conf. on Robotics & Automation, Shanghai, 2011, 3607–3613. DOI: 10.1109/ICRA.2011.5979949.
  • [21] R. Maier, J. Sturm, D. Cremers, ”Submap-based bundle adjustment for 3D reconstruction from RGB-D data”, In: Pattern Recognition: GCPR 2014 (X. Jiang et al., eds.), LNCS 8753, Springer, 2014, 54–65. DOI: 10.1007/978-3-319-11752-2_5.
  • [22] R. Mur-Artal, J. M. M. Montiel, J. D. Tardós, ”ORB-SLAM: A versatile and accurate monocular SLAM system”, IEEE Trans. on Robotics, vol. 31, no. 5, 2015, 1147–1163. DOI: 10.1109/TRO.2015.2463671.
  • [23] R. Mur-Artal, J. D. Tardós, ”ORB-SLAM2: An open-source SLAM system for monocular, stereo and RGB-D cameras”, arXiv preprint, arXiv:1610.06475v1, 2016.
  • [24] M. Nowicki, P. Skrzypczyński, ”Experimental verification of a walking robot self-localization system with the Kinect sensor”, Journal of Automation, Mobile Robotics & Intelligent Systems, vol. 7, no. 4, 2013, 42–51. DOI: 10.14313/JAMRIS_4-2013/43.
  • [25] Point Cloud Library, http://pointclouds.org/
  • [26] E. Rublee, V. Rabaud, K. Konolige, G. Bradski, ”ORB: an efficient alternative to SIFT or SURF”, IEEE Int. Conf. on Computer Vision, Barcelona, 2011, 2564–2571. DOI: 10.1109/ICCV.2011.6126544.
  • [27] D. Scaramuzza, F. Fraundorfer, ”Visual odometry. Part I: The first 30 years and fundamentals”, IEEE Robotics & Automation Magazine, vol. 18, no. 4, 2011, 80–92. DOI: 10.1109/MRA.2011.943233.
  • [28] A. Schmidt, M. Kraft, M. Fularz, Z. Domagala, ”Comparative assessment of point feature detectors and descriptors in the context of robot navigation”, Journal of Automation, Mobile Robotics & Intelligent Systems, vol. 7, no. 1, 2013, 11–20.
  • [29] A. Schmidt, M. Kraft, M. Fularz, Z. Domagala, ”The registration system for the evaluation of indoor visual SLAM and odometry algorithms”, Journal of Automation, Mobile Robotics & Intelligent Systems, vol. 7, no. 2, 2013, 46–51.
  • [30] A. Schmidt, A. Kasiński, M. Kraft, M. Fularz, Z. Domagala, ”Calibration of the multi-camera registration system for visual navigation benchmarking”, Int. Journal of Advanced Robotic Systems, vol. 11, no. 83, 2014. DOI: 10.5772/58471.
  • [31] P. Skrzypczyński, ”Mobile robot localization: where we are and what are the challenges?”, In: Automation 2017. Innovation in Automation, Robotics and Measurement Techniques (R. Szewczyk et al., eds.), AISC 550, Springer, 2017. DOI: 10.1007/978-3-319-54042-9.
  • [32] F. Steinbrücker, J. Sturm, D. Cremers, ”Volumetric 3D mapping in real-time on a CPU”. In: Proc. IEEE Int. Conf. on Robotics & Automation, Hong Kong, 2014, 2021–2028. DOI: 10.1109/ICRA.2014.6907127.
  • [33] H. Strasdat, J. M. M. Montiel, A. J. Davison, ”Visual SLAM: Why filter?”, Image and Vision Computing, vol. 30, no. 2, 2012, 65–77. DOI: 10.1016/j.imavis.2012.02.009.
  • [34] H. Strasdat, ”Local accuracy and global consistency for efficient visual SLAM”, PhD Dissertation, Imperial College, London, 2012.
  • [35] J. Sturm, N. Engelhard, F. Endres, W. Burgard, D. Cremers, ”A benchmark for the evaluation of RGB-D SLAM systems”. In: Proc. IEEE/RSJ Int. Conf. on Intelligent Robots & Systems, Vilamoura, 2012, 573–580. DOI: 10.1109/IROS.2012.6385773.
  • [36] B. Triggs, P. F. McLauchlan, R. I. Hartley, A. W. Fitzgibbon, ”Bundle adjustment – a modern synthesis”, In: Vision Algorithms: Theory and Practice, LNCS 1883, Springer, 2000, 298–372. DOI: 10.1007/3-540-44480-7_21.
  • [37] T. Whelan, M. Kaess, H. Johannsson, M. Fallon, J. J. Leonard, J. B. McDonald, ”Real-time large-scale dense RGB-D SLAM with volumetric fusion”, Int. Journal of Robotics Research, vol. 34, no. 4–5, 2015, 598–626. DOI: 10.1177/0278364914551008.
  • [38] A. Wilkowski, T. Kornuta, M. Stefańczyk, W. Kasprzak, ”Efficient generation of 3D surfel maps using RGB-D sensors”, Int. Journal of Applied Mathematics and Computer Science, vol. 26, no. 1, 2016, 99–122. DOI: 10.1515/amcs-2016-0007.
Notes
Prepared with funds from the Ministry of Science and Higher Education (MNiSW) under agreement 812/P-DUN/2016 for activities disseminating science (2017 tasks).
Document type
Bibliography
YADDA identifier
bwmeta1.element.baztech-cabb511a-7d5b-4009-ac3c-1c3b3104e572