Article title

Simultaneous localization and mapping for tracked wheel robots combining monocular and stereo vision

Authors
Content / Full text
Identifiers
Title variants
Publication languages
EN
Abstracts
EN
This paper addresses an online 6D SLAM method for a tracked wheel robot in an unknown and unstructured environment. While the robot pose is represented by its position and orientation in 3D space, the environment is mapped with natural landmarks in the same space, collected autonomously using visual data from feature detectors. The observation model opportunistically employs features detected from either monocular or stereo vision, represented using an inverse depth parametrization. The motion model uses odometry readings from motor encoders and orientation changes measured with an IMU. A dimension-bounded EKF (DBEKF) is introduced here, which keeps the dimension of the state bounded. A new landmark classifier based on a Temporal Difference Learning methodology is used to identify undesired landmarks and remove them from the state. By enforcing an upper bound on the number of landmarks in the EKF state, the computational complexity is bounded by a constant without compromising the filter's integrity. All experimental work was done using real data from RAPOSA-NG, a tracked wheel robot developed for Search and Rescue missions.
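As a rough illustration of two ideas mentioned in the abstract, the sketch below shows (1) the conversion of an inverse-depth landmark to a Euclidean point, following one common convention of the parametrization in Civera et al. [2], and (2) a pruning step that keeps the EKF state dimension bounded by dropping the lowest-scored landmark. This is a minimal sketch, not the authors' implementation: the variable names, the state layout, and the prune function with its score input are illustrative assumptions, and the paper's Temporal Difference Learning classifier is not reproduced here.

```python
# Minimal sketch, not the paper's DBEKF implementation. Assumed conventions:
# - an inverse-depth landmark is (x, y, z, theta, phi, rho): camera centre at
#   first observation, azimuth/elevation of the viewing ray, inverse depth;
# - the EKF mean stacks the robot pose first, then landmarks of fixed size.

import numpy as np


def inverse_depth_to_euclidean(y):
    """Convert a 6-vector (x, y, z, theta, phi, rho) to a 3D point."""
    x0 = y[:3]                              # camera centre at first observation
    theta, phi, rho = y[3], y[4], y[5]
    m = np.array([np.cos(phi) * np.sin(theta),   # unit ray direction
                  -np.sin(phi),
                  np.cos(phi) * np.cos(theta)])
    return x0 + m / rho                     # point = centre + depth * direction


def prune_landmark(mu, Sigma, scores, robot_dim=6, lm_dim=6, max_landmarks=30):
    """Drop the lowest-scored landmark if the landmark bound is exceeded.

    mu     : EKF mean, robot pose followed by landmarks (lm_dim entries each)
    Sigma  : EKF covariance with the same ordering
    scores : one quality score per landmark (assumed given by some classifier)
    """
    n_lm = (mu.size - robot_dim) // lm_dim
    if n_lm <= max_landmarks:
        return mu, Sigma, scores
    worst = int(np.argmin(scores))
    start = robot_dim + worst * lm_dim
    keep = np.r_[0:start, start + lm_dim:mu.size]   # indices to retain
    return mu[keep], Sigma[np.ix_(keep, keep)], np.delete(scores, worst)
```

Keeping the landmark count below a fixed maximum is what makes the per-update cost independent of how long the robot has been mapping, which is the constant-complexity claim made in the abstract.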
Contributors
author
  • Institute for Systems and Robotics, Instituto Superior Técnico, Av. Rovisco Pais, 1, Lisbon, Portugal
author
  • Institute for Systems and Robotics, Instituto Superior Técnico, Av. Rovisco Pais, 1, Lisbon, Portugal
Bibliography
  • [1] H. Bay, A. Ess, T. Tuytelaars, L. Van Gool, “Speeded-up robust features (SURF)”, Comput. Vis. Image Underst., 110(3), 2008, pp. 346–359.
  • [2] J. Civera, A.J. Davison, and J. Montiel, “Inverse depth parametrization for monocular SLAM”, IEEE Transactions on Robotics, 24(5), 2008, pp. 932–945.
  • [3] A.J. Davison, I.D. Reid, N.D. Molton, O. Stasse, “MonoSLAM: Real-Time Single Camera SLAM”, IEEE Trans. Pattern Anal. Mach. Intell., 29(6), 2007, pp. 1052–1067.
  • [4] R.I. Hartley, A. Zisserman, “Multiple View Geometry in Computer Vision”, Cambridge University Press, 2nd edition, 2004.
  • [5] J.J. Leonard, H.F. Durrant-Whyte, “Simultaneous map building and localization for an autonomous mobile robot”. In: IEEE/RSJ International Workshop on Intelligent Robots and Systems, 1991, pp. 1442–1447.
  • [6] D. Oram, “Rectification for any epipolar geometry”. In: BMVC’01, 2001, pp. 653–662.
  • [7] P. Pinies, T. Lupton, S. Sukkarieh, J.D. Tardos, “Inertial aiding of inverse depth SLAM using a monocular camera”. In: IEEE International Conference on Robotics and Automation, 2007, pp. 2797–2802.
  • [8] E. Rublee, V. Rabaud, K. Konolige, G. Bradski, “ORB: An efficient alternative to SIFT or SURF”. In: International Conference on Computer Vision, Barcelona, 2011.
  • [9] N. Trawny, S.I. Roumeliotis, “Indirect Kalman filter for 3D attitude estimation”, Technical Report, University of Minnesota, Dept. of Comp. Sci. & Eng., 2005.
  • [10] O.J. Woodman, “An introduction to inertial navigation”, Technical Report, UCAM-CL-TR-696, University of Cambridge, Computer Laboratory, 2007.
Document type
YADDA identifier
bwmeta1.element.baztech-85c1da02-8492-4dbd-a0af-e88467ab6fb3