

Article title

Auditory occupancy grids with a mobile robot

Authors
Content
Identifiers
Title variants
Publication languages
EN
Abstracts
EN
This paper presents the use of auditory occupancy grids (AOGs) for mapping a mobile robot’s acoustic environment. An AOG is a probabilistic map of sound source locations built from multiple measurements using techniques from both probabilistic robotics and sound localization. The mapping is simulated, tested for robustness, and then successfully implemented on a three-microphone mobile robot with four sound sources. Using the robot’s inherent advantage of mobility, the AOG correctly locates the sound sources from only nine measurements. The resulting map is then used to intelligently position the robot within the environment and to maintain auditory contact with a moving target.
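The fusion idea the abstract describes (each grid cell accumulates probabilistic evidence from bearing measurements taken at several robot poses, so cells where the beams intersect converge toward "source present") can be sketched with a standard log-odds occupancy-grid update. This is a minimal illustration under assumed parameters, not the authors' implementation: the grid resolution, beam width, and the sensor-model probabilities `p_hit`/`p_miss` are placeholder values.

```python
import numpy as np

def update_aog(log_odds, sensor_xy, bearing_deg, grid_res=0.25,
               p_hit=0.7, p_miss=0.45, beam_width_deg=15.0):
    """One Bayesian update of an auditory occupancy grid (AOG).

    log_odds    : 2-D array of per-cell log-odds that a sound source is there
    sensor_xy   : (x, y) position of the microphone array, in metres
    bearing_deg : estimated direction of arrival, in degrees

    Cells inside the measured beam receive evidence FOR a source (p_hit);
    cells outside receive weak evidence AGAINST one (p_miss).
    All parameter values here are illustrative assumptions.
    """
    ny, nx = log_odds.shape
    ys, xs = np.mgrid[0:ny, 0:nx]
    # Cell centres in world coordinates.
    cx = (xs + 0.5) * grid_res
    cy = (ys + 0.5) * grid_res
    # Angle from the sensor to every cell, and its offset from the bearing,
    # wrapped into [-180, 180).
    ang = np.degrees(np.arctan2(cy - sensor_xy[1], cx - sensor_xy[0]))
    diff = np.abs((ang - bearing_deg + 180.0) % 360.0 - 180.0)
    inside = diff <= beam_width_deg / 2.0
    p = np.where(inside, p_hit, p_miss)
    # Standard log-odds Bayes update: prior of 0.5 per cell cancels out.
    log_odds += np.log(p / (1.0 - p))
    return log_odds
```

Starting from an all-zeros grid (uniform 0.5 prior) and fusing bearings obtained from several robot poses, only the cells near the true sources accumulate positive log-odds, which mirrors how the paper's nine measurements suffice to localize four sources.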
Creators
author
Bibliography
  • [1] DeJong B.P., “Auditory occupancy grids: Sound localization on a mobile robot”. In: IASTED International Conference on Robotics and Applications, 2010.
  • [2] Martinson E., Arkin R., “Noise maps for acoustically sensitive navigation”. In: Society of Photo-Optical Instrumentation Engineers, 2004.
  • [3] Thrun S., “Probabilistic algorithms in robotics”, AI Magazine, vol. 21, no. 4, 2000.
  • [4] Thrun S., Burgard W., Fox D., Probabilistic Robotics, MIT Press, Cambridge, MA, 2005.
  • [5] Grabowski R., Khosla P., Choset H., “Autonomous exploration via regions of interest”. In: IEEE/RSJ International Conference on Intelligent Robots and Systems, 2003.
  • [6] Thrun S., “Learning occupancy grid maps with forward sensor models”, Autonomous Robots, vol. 15, no. 2, 2003.
  • [7] Leon-Garcia A., Probability and Random Processes for Electrical Engineering, 2nd ed. Addison-Wesley, 2004.
  • [8] West M., Harrison J., Bayesian Forecasting and Dynamic Models, 2nd ed. Springer-Verlag, New York, 1997.
  • [9] Elfes A., “Sonar-based real-world mapping and navigation”, IEEE Transactions on Robotics and Automation, 1987.
  • [10] Moravec H.P., “Sensor fusion in certainty grids for mobile robots”, AI Magazine, vol. 9, no. 2, 1988.
  • [11] Schultz A.C., Adams W., “Continuous localization using evidence grids”. In: IEEE International Conference on Robotics and Automation, 1998.
  • [12] Wolf D.F., Sukhatme G.S., “Mobile robot simultaneous localization and mapping in dynamic environments”, Autonomous Robots, 2005.
  • [13] Valin J.-M., Michaud F., Rouat J., “Robust localization and tracking of simultaneous moving sound sources using beamforming and particle filtering”, Robotics and Autonomous Systems, vol. 55, no. 3, 2007.
  • [14] Rabinkin D.V., Renomeron R.J., Dahl A., et al., “A DSP implementation of source location using microphone arrays”, The Journal of the Acoustical Society of America, vol. 99, no. 4, 1996.
  • [15] Sasaki Y., Kagami S., Mizoguchi H., “Multiple sound source mapping for a mobile robot by self-motion triangulation”. In: IEEE/RSJ International Conference on Intelligent Robots and Systems, 2006.
  • [16] Kagami S., Tamai Y., Mizoguchi H., Kanade T., “Microphone array for 2D sound localization and capture”. In: IEEE International Conference on Robotics and Automation, 2004.
  • [17] DiBiase J.H., Silverman H.F., Brandstein M.S., “Robust localization in reverberant rooms”, Microphone Arrays: Signal Processing Techniques and Applications, Springer, 2001.
  • [18] Knapp C.H., Carter G., “The generalized correlation method for estimation of time delay”, IEEE Transactions on Acoustics, Speech and Signal Processing, 1976.
  • [19] Aarabi P., “The fusion of distributed microphone arrays for sound localization”. Journal of Applied Signal Processing, vol. 4, 2003.
  • [20] Stillman S., Essa I., “Towards reliable multimodal sensing in aware environments”. In: ACM International Conference, 2001.
  • [21] Mungamuru B., Aarabi P., “Enhanced sound localization”, IEEE Transactions on Systems, Man, and Cybernetics, vol. 34, no. 3, 2004.
  • [22] Nakadai K., Matsuura D., Okuno H., Kitano H., “Applying scattering theory to robot audition system: Robust sound source localization and extraction”. In: IEEE/RSJ International Conference on Intelligent Robots and Systems, 2003.
  • [23] Nakadai K., Lourens T., Okuno H., Kitano H., “Active audition for humanoid”. In: National Conference on Artificial Intelligence, 2000.
  • [24] Nakadai K., Hidai K., Okuno H., Kitano H., “Epipolar geometry based sound localization and extraction for humanoid audition”. In: IEEE/RSJ International Conference on Intelligent Robots and Systems, 2001.
  • [25] Huang J., Ohnishi N., Sugie N., “A biomimetic system for localization and separation of multiple sound sources”, IEEE Transactions on Instrumentation and Measurement, vol. 44, no. 3, 1995.
  • [26] Huang J., Supaongprapa T., Terakura I., Wang F., Ohnishi N., Sugie N., “A model based sound localization system and its application to robot navigation”, Robotics and Autonomous Systems, no. 27, 1999.
  • [27] Valin J.-M., Michaud F., Rouat J., Letourneau D., “Robust sound source localization using a microphone array on a mobile robot”. In: IEEE/RSJ International Conference on Intelligent Robots and Systems, 2003.
  • [28] Martinson E., Schultz A., “Auditory evidence grids”. In: IEEE/RSJ International Conference on Intelligent Robots and Systems, 2006.
  • [29] Martinson E., Schultz A., “Robotic discovery of the auditory scene”. In: IEEE International Conference on Robotics and Automation, 2007.
  • [30] Martinson E., “Improving human-robot interaction through adaption to the auditory scene”. In: ACM/IEEE International Conference on Human-Robot Interaction, 2007.
  • [31] Graves K., Adams W., Schultz A., “Continuous localization in changing environments”. In: IEEE International Symposium on Computational Intelligence in Robotics and Automation, 1997.
Document type
Bibliography
YADDA identifier
bwmeta1.element.baztech-article-BUJ8-0016-0013