Article title

Uncertainty Models of Vision Sensors in Mobile Robot Positioning

Publication languages
EN
Abstracts
EN
This paper discusses how uncertainty models of vision-based positioning sensors can be used to support the planning and optimization of positioning actions for mobile robots. Two sensor types are considered: global vision using overhead cameras, and an on-board camera observing artificial landmarks. The developed sensor models are applied to optimize robot positioning actions in a distributed system of mobile robots and monitoring sensors, and to plan the sequence of actions for a robot cooperating with the external infrastructure supporting its navigation.
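To give a flavour of the idea summarised in the abstract, the sketch below shows how a pose-dependent uncertainty model of a landmark-observing camera can be used to rank candidate positioning actions by their predicted positional uncertainty. It is a minimal illustration under assumptions of our own: the covariance model, the names landmark_covariance and best_action, and all numeric parameters are hypothetical and not taken from the paper.

```python
# Illustrative sketch only (not the paper's actual sensor model): a sensor
# uncertainty model maps the robot's pose relative to a landmark to a 2x2
# position covariance, and the planner picks the positioning action whose
# predicted covariance has the smallest trace.
import numpy as np

def landmark_covariance(distance, bearing, sigma_px=1.0, focal=500.0):
    """Hypothetical model: pixel noise in the image grows with distance and
    with oblique viewing angles, and is propagated to the ground plane."""
    sigma_r = sigma_px * distance**2 / focal        # range uncertainty
    sigma_t = sigma_px * distance / focal           # lateral uncertainty
    sigma_t /= max(np.cos(bearing), 0.1)            # worse at grazing angles
    c, s = np.cos(bearing), np.sin(bearing)
    R = np.array([[c, -s], [s, c]])                 # rotate into the world frame
    return R @ np.diag([sigma_r**2, sigma_t**2]) @ R.T

def best_action(candidate_poses, landmark):
    """Choose the candidate observation pose (x, y, heading) with the lowest
    expected positional uncertainty (trace of the predicted covariance)."""
    def cost(pose):
        dx, dy = landmark[0] - pose[0], landmark[1] - pose[1]
        distance = np.hypot(dx, dy)
        bearing = np.arctan2(dy, dx) - pose[2]
        return np.trace(landmark_covariance(distance, bearing))
    return min(candidate_poses, key=cost)

# Example: pick where to stop and observe a landmark placed at (5, 2).
candidates = [(1.0, 0.0, 0.0), (3.0, 1.0, 0.3), (4.5, 2.0, 1.2)]
print(best_action(candidates, landmark=(5.0, 2.0)))
```

The paper's models are considerably richer and are embedded in a distributed planning framework; the snippet only illustrates the selection criterion (minimum trace of the predicted covariance).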
Pages
73–88
Physical description
Bibliography: 38 items, figures, charts
Authors
  • Skrzypczyński, Piotr, Institute of Control and Information Engineering, Poznan University of Technology, ul. Piotrowo 3A, 60-965 Poznan, Poland, ps@ar-kari.put.poznan.pl
Bibliography
  • [1] Adam A., Rivlin E. and Shimshoni I. (2001): Computing the sensory uncertainty field of a vision-based localization sensor. — IEEE Trans. Robot. Automat., Vol. 17, No. 3, pp. 258–267.
  • [2] Ahuja R., Magnanti T. and Orlin J. (1993): Network Flows: Theory, Algorithms and Applications. — Englewood Cliffs: Prentice Hall.
  • [3] Bączyk R. and Skrzypczyński P. (2001): Mobile robot localization by means of an overhead camera. — Proc. Conf. Automation 2001, Warsaw, Poland, pp. 220–229.
  • [4] Bączyk R. (2001): Methods of correcting image distortions in a localization system of a mobile robot. — Proc. 7th Nat. Conf. Robotics, Wrocław, Poland, pp. 185–194.
  • [5] Bączyk R., Kasiński A. and Skrzypczyński P. (2003): Vision-based mobile robot localization with simple artificial landmarks. — Prep. 7th IFAC Symp. Robot Control, Wrocław, Poland, pp. 217–222.
  • [6] Bączyk R. and Skrzypczyński P. (2003): A framework for vision-based positioning in a distributed robotic system. — Proc. Europ. Conf. Mobile Robots, Warsaw, Poland, pp. 153–158.
  • [7] Brzykcy G., Martinek J., Meissner A. and Skrzypczyński P. (2001): Multi-agent blackboard architecture for a mobile robot. — Proc. IEEE/RSJ Int. Conf. Intelligent Robots and Systems, Maui, USA, pp. 2369–2374.
  • [8] Castellanos J. and Tardós J. (1999): Mobile Robot Localization and Map Building. A Multisensor Fusion Approach. — Dordrecht: Kluwer.
  • [9] Coulouris G., Dollimore J. and Kindberg T. (1996): Distributed Systems. Concepts and Design. — Boston: Addison Wesley.
  • [10] Crowley J. L. (1996): Mathematical foundations of navigation and perception for an autonomous mobile robot, In: Reasoning with Uncertainty in Robotics (L. Dorst, Ed.). — Berlin: Springer.
  • [11] DeSouza G. and Kak A. C. (2002): Vision for mobile robot navigation: A survey. — IEEE Trans. Pattern Anal. Mach. Intell., Vol. 24, No. 2, pp. 237–267.
  • [12] Feng L., Borenstein J. and Everett H. (1996): “Where am I?” Sensors and methods for autonomous mobile robot positioning. — Tech. Rep., Univ. of Michigan.
  • [13] Haralick R. M. (1996): Propagating covariance in computer vision. — Int. J. Pattern Recog. Artif. Intell., Vol. 10, No. 5, pp. 561–572.
  • [14] Heikkilä J. (2000): Geometric camera calibration using circular control points. — IEEE Trans. Pattern Anal. Mach. Intell., Vol. 22, No. 10, pp. 1066–1077.
  • [15] Ishiguro H. (1997): Distributed vision system: A perceptual information infrastructure for robot navigation. — Proc. Int. Joint Conf. Artif. Intell., Nagoya, Japan, pp. 36–43.
  • [16] Jain R., Kasturi R. and Schunck B. (1995): Machine Vision. — Singapore: McGraw-Hill.
  • [17] Kasiński A. and Bączyk R. (2001): Robust landmark recognition with application to navigation. — Proc. Conf. Computer Recognition Systems (KOSYR), Wrocław, Poland, pp. 401–407.
  • [18] Kasiński A. and Hamdy A. (2003): Efficient illumination suppression in a sequence by motion detection combined with homomorphic filtering. — Proc. 27th Workshop AAPR Vision in a Dynamic World, Laxenburg, Austria, pp. 19–26.
  • [19] Kasiński A. and Skrzypczyński P. (1998): Cooperative perception and world-model maintenance in mobile navigation tasks, In: Distributed Autonomous Robotic Systems 3 (T. Lüth et al., Eds.). — Berlin: Springer, pp. 173–182.
  • [20] Kasiński A. and Skrzypczyński P. (2001): Perception network for the team of indoor mobile robots: Concept, architecture, implementation. — Eng. Appl. Artif. Intell., Vol. 14, No. 2, pp. 125–137.
  • [21] Kasiński A. and Skrzypczyński P. (2002): Communication mechanism in a distributed system of mobile robots, In: Distributed Autonomous Robotic Systems 5 (H. Asama et al., Eds.). — Tokyo: Springer, pp. 51–60.
  • [22] Kruse E., Gutsche R. and Wahl F. (1998): Intelligent mobile robot guidance in time varying environments by using a global monitoring system. — Proc. IFAC Symp. Intell. Autonomous Vehicles, Madrid, Spain, pp. 509–514.
  • [23] Kuipers F. A., Korkmaz T., Krunz M. and Van Mieghem P. (2002): A review of constraint-based routing algorithms. — Tech. Rep., Delft Univ. Technol.
  • [24] Lambert A. and Fraichard T. (2000): Landmark-based safe path planning for car-like robots. — IEEE Int. Conf. Robot. Automat., San Francisco, pp. 2046–2051.
  • [25] Lazanas A. and Latombe J.-C. (1995): Motion planning with uncertainty: A landmark approach. — Artif. Intell., Vol. 76, No. 1–2, pp. 287–317.
  • [26] Lorenz D. and Raz D. (2001): A simple efficient approximation scheme for the restricted shortest path problem. — Oper. Res. Lett., Vol. 28, No. 5, pp. 213–219.
  • [27] Latombe J.-C. (1991): Robot Motion Planning. — Dordrecht: Kluwer.
  • [28] Miura J. and Shirai Y. (1993): An uncertainty model of stereo vision and its application to vision-motion planning of robot. — Proc. Int. Joint Conf. Artif. Intell., Chambéry, France, pp. 1618–1623.
  • [29] Moon I., Miura J. and Shirai Y. (1999): On-line viewpoint and motion planning for efficient visual navigation under uncertainty. — Robot. Autonom. Syst., Vol. 28, No. 2, pp. 237–248.
  • [30] Müller J. (1996): The Design of Intelligent Agents: A Layered Approach. — Berlin: Springer.
  • [31] Shah S. and Aggarwal J. (1994): A simple calibration procedure for fish-eye (high distortion) lens camera. — IEEE Int. Conf. Robot. Automat., San Diego, pp. 3422–3427.
  • [32] Skrzypczyński P. (2004a): A team of mobile robots and monitoring sensors – From concept to experiment. — Adv. Robot., Vol. 18, No. 6, pp. 583–610.
  • [33] Skrzypczyński P. (2004b): Using sensor uncertainty models to optimize the robot positioning actions, In: Intelligent Autonomous Systems 8 (F. Groen et al., Eds.). — Amsterdam: IOS Press, pp. 299–308.
  • [34] Smith R. and Cheeseman P. (1987): On the estimation and representation of spatial uncertainty. — Int. J. Robot. Res., Vol. 5, No. 4, pp. 56–68.
  • [35] Smith R. G. (1980): The contract net protocol: High-level communication and control in a distributed problem solver.— IEEE Trans. Comput., Vol. 29, No. 12, pp. 1104–1113.
  • [36] Sysło M. M., Deo N. and Kowalik J. S. (1983): Discrete Optimization Algorithms with Pascal Programs. — Englewood Cliffs: Prentice-Hall.
  • [37] Takahashi O. and Schilling R. J. (1989): Motion planning in a plane using generalized Voronoi diagrams. — IEEE Trans. Robot. Automat., Vol. 5, No. 2, pp. 143–150.
  • [38] Takeda H., Facchinetti C. and Latombe J.-C. (1994): Planning the motions of a mobile robot in a sensory uncertainty field. — IEEE Trans. Pattern Anal. Mach. Intell., Vol. 16, No. 10, pp. 1002–1017.
Document type
Bibliography
YADDA identifier
bwmeta1.element.baztech-article-BPZ1-0008-0017