Article title

Self-supervised learning of motion-induced acoustic noise awareness in social robots

Publication language
EN
Abstract (EN)
With the growing presence of robots in human-populated environments, it becomes necessary to render their presence natural rather than invasive. To do so, robots need to ensure that the acoustic noise induced by their motion does not disturb nearby people. Along these lines, this paper proposes a method that allows a robot to learn to control the amount of noise it produces, taking into account the environmental context and the robot's mechanical characteristics. Concretely, the robot adapts its motion to a speed at which it produces less noise than the environment's background noise, hence avoiding disturbing nearby humans. For that purpose, before executing a given task in the environment, the robot learns how much acoustic noise it produces at different speeds in that environment by gathering acoustic information through a microphone. The proposed method was successfully validated in several environments with different background noise levels. In addition, a PIR sensor was installed on the robot to test its ability to trigger the noise-aware speed control procedure when a person enters the sensor's field of view. The use of such a simple sensor demonstrates that the proposed system can be deployed on minimalistic robots, such as micro unmanned aerial vehicles.
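As an illustration of the speed-selection step described in the abstract, the following minimal Python sketch picks the fastest candidate speed whose measured motion-induced noise stays below the ambient background level. It is a sketch under stated assumptions, not the authors' implementation: the function names, the candidate speed list, the safety margin, and the toy noise model standing in for real microphone measurements are all hypothetical.

from typing import Callable, Sequence

def pick_quiet_speed(
    candidate_speeds: Sequence[float],
    noise_at_speed: Callable[[float], float],
    ambient_db: float,
    margin_db: float = 1.0,
) -> float:
    """Return the fastest candidate speed whose measured motion noise
    stays at least margin_db below the ambient background level;
    fall back to the slowest candidate if none qualifies."""
    quiet = [v for v in sorted(candidate_speeds)
             if noise_at_speed(v) <= ambient_db - margin_db]
    return quiet[-1] if quiet else min(candidate_speeds)

if __name__ == "__main__":
    # Toy stand-in for per-speed microphone measurements (hypothetical):
    # assume self-noise grows roughly linearly with speed, in dB.
    fake_noise = lambda v: 35.0 + 20.0 * v    # dB at speed v (m/s)
    ambient = 50.0                            # measured background level, dB
    v = pick_quiet_speed([0.2, 0.4, 0.6, 0.8, 1.0], fake_noise, ambient)
    print(f"selected speed: {v:.1f} m/s")     # -> 0.6 m/s in this toy setup

In a deployment such as the one described, pick_quiet_speed would be re-run whenever the PIR sensor reports a person entering its field of view, with the ambient level re-measured while the robot is stationary.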
Authors
  • Instituto de Telecomunicações and ISCTE-Instituto Universitário de Lisboa, Lisboa, PORTUGAL
  • Instituto de Telecomunicações and ISCTE-Instituto Universitário de Lisboa, Lisboa, PORTUGAL
  • Instituto de Telecomunicações and ISCTE-Instituto Universitário de Lisboa, Lisboa, PORTUGAL
Bibliography
  • [1] J. Andrade, P. Santana, and A. Almeida, “Motion-induced acoustic noise awareness for socially-aware robot navigation”. In: Proceedings of the IEEE International Conference on Autonomous Robot Systems and Competitions (ICARSC), 2018, 24–29, 10.1109/ICARSC.2018.8374155.
  • [2] R. Bajcsy, “Active perception”, Proceedings of the IEEE, vol. 76, no. 8, 1988, 966–1005, 10.1109/5.5968.
  • [3] M. Bajracharya, A. Howard, L. H. Matthies, B. Tang, and M. Turmon, “Autonomous off-road navigation with end-to-end learning for the LAGR program”, Journal of Field Robotics, vol. 26, no. 1, 2009, 3–25, 10.1002/rob.20269.
  • [4] J. Baleia, P. Santana, and J. Barata, “On exploiting haptic cues for self-supervised learning of depth-based robot navigation affordances”, Journal of Intelligent & Robotic Systems, vol. 80, no. 3-4, 2015, 455–474, 10.1007/s10846-015-0184-4.
  • [5] D. H. Ballard, “Animate vision”, Artificial Intelligence, vol. 48, no. 1, 1991, 57–86, 10.1016/0004-3702(91)90080-4.
  • [6] M. Basner, W. Babisch, A. Davis, M. Brink, C. Clark, S. Janssen, and S. Stansfeld, “Auditory and non-auditory effects of noise on health”, The Lancet, vol. 383, no. 9925, 2014, 1325–1332, 10.1016/S0140-6736(13)61613-X.
  • [7] M. Brambilla, E. Ferrante, M. Birattari, and M. Dorigo, “Swarm robotics: a review from the swarm engineering perspective”, Swarm Intelligence, vol. 7, no. 1, 2013, 1–41, 10.1007/s11721-012-0075-2.
  • [8] J. Gibson, “The concept of affordances”, Perceiving, Acting, and Knowing, 1977, 67–82.
  • [9] E. Hall, The Hidden Dimension: Man’s Use of Space in Public and Private, Anchor Books, 1969, 217.
  • [10] H. Heidarsson and G. Sukhatme, “Obstacle detection from overhead imagery using self-supervised learning for autonomous surface vehicles”. In: Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), 2011, 3160–3165, 10.1109/IROS.2011.6094610.
  • [11] L. Jamone, E. Ugur, A. Cangelosi, L. Fadiga, A. Bernardino, J. Piater, and J. Santos-Victor, “Affordances in psychology, neuroscience and robotics: a survey”, IEEE Transactions on Cognitive and Developmental Systems, 2016, 10.1109/TCDS.2016.2594134.
  • [12] N. Kallakuri, J. Even, Y. Morales, C. Ishi, and N. Hagita, “Probabilistic approach for building auditory maps with a mobile microphone array”. In: Proceedings of the IEEE International Conference on Robotics and Automation (ICRA), 2013, 2270–2275, 10.1109/ICRA.2013.6630884.
  • [13] N. Kallakuri, J. Even, Y. Morales, C. Ishi, and N. Hagita, “Using sound reflections to detect moving entities out of the field of view”. In: Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), 2013, 5201–5206, 10.1109/IROS.2013.6697108.
  • [14] T. Kruse, A. K. Pandey, R. Alami, and A. Kirsch, “Human-aware robot navigation: A survey”, Robotics and Autonomous Systems, vol. 61, no. 12, 2013, 1726–1743, 10.1016/j.robot.2013.05.007.
  • [15] S. Lacey, J. Hall, and K. Sathian, “Are surface properties integrated into visuohaptic object representations?”, European Journal of Neuroscience, vol. 31, no. 10, 2010, 1882–1888, 10.1111/j.1460-9568.2010.07204.x.
  • [16] M. Luber, L. Spinello, J. Silva, and K. O. Arras, “Socially-aware robot navigation: A learning approach”. In: Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), 2012, 902–907, 10.1109/IROS.2012.6385716.
  • [17] F. Marques, D. Gonçalves, J. Barata, and P. Santana, “Human-aware navigation for autonomous mobile robots for intra-factory logistics”. In: Proceedings of the 6th International Workshop on Symbiotic Interaction, 2017, 10.1007/978-3-319-91593-7_9.
  • [18] E. Martinson, “Hiding the acoustic signature of a mobile robot”. In: Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), 2007, 985–990, 10.1109/IROS.2007.4399264.
  • [19] E. Martinson and A. Schultz, “Auditory evidence grids”. In: Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), 2006, 1139–1144, 10.1109/IROS.2006.281843.
  • [20] G. Michalos, S. Makris, P. Tsarouchi, T. Guasch, D. Kontovrakis, and G. Chryssolouris, “Design considerations for safe human-robot collaborative workplaces”, Procedia CIRP, vol. 37, 2015, 248–253, 10.1016/j.procir.2015.08.014.
  • [21] B. Paden, M. Čáp, S. Z. Yong, D. Yershov, and E. Frazzoli, “A survey of motion planning and control techniques for self-driving urban vehicles”, IEEE Transactions on Intelligent Vehicles, vol. 1, no. 1, 2016, 33–55, 10.1109/TIV.2016.2578706.
  • [22] E. Pinto, F. Marques, R. Mendonça, A. Lourenço, P. Santana, and J. Barata, “An autonomous surface-aerial marsupial robotic team for riverine environmental monitoring: Benefiting from coordinated aerial, underwater, and surface level perception”. In: Proceedings of the IEEE International Conference on Robotics and Biomimetics (ROBIO), 2014, 443–450.
  • [23] R. Pombeiro, R. Mendonça, P. Rodrigues, F. Marques, A. Lourenço, E. Pinto, P. Santana, and J. Barata, “Water detection from downwash-induced optical flow for a multirotor UAV”. In: OCEANS’15 MTS/IEEE Washington, 2015, 1–6, 10.23919/OCEANS.2015.7404458.
  • [24] M. Quigley, K. Conley, B. Gerkey, J. Faust, T. Foote, J. Leibs, R. Wheeler, and A. Y. Ng, “ROS: an open-source robot operating system”. In: ICRA Workshop on Open Source Software, vol. 3, no. 3.2, 2009, 5.
  • [25] J. Rios-Martinez, A. Spalanzani, and C. Laugier, “From proxemics theory to socially-aware navigation: A survey”, International Journal of Social Robotics, vol. 7, no. 2, 2015, 137–153, 10.1007/s12369-014-0251-1.
  • [26] J. Schwenkler, “Do things look the way they feel?”, Analysis, vol. 73, no. 1, 2013, 86–96.
  • [27] L. Takayama and C. Pantofaru, “Influences on proxemic behaviors in human-robot interaction”. In: Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), 2009, 5495–5502, 10.1109/IROS.2009.5354145.
  • [28] E. Uğur and E. Şahin, “Traversability: A case study for learning and perceiving affordances in robots”, Adaptive Behavior, vol. 18, no. 3-4, 2010, 258–284, 10.1177/1059712310370625.
  • [29] K. van Hecke, G. de Croon, L. van der Maaten, D. Hennes, and D. Izzo, “Persistent self-supervised learning: From stereo to monocular vision for obstacle avoidance”, International Journal of Micro Air Vehicles, vol. 10, no. 2, 2018, 186–206, 10.1177/1756829318756355.
  • [30] K. M. Wurm, H. Kretzschmar, R. Kümmerle, C. Stachniss, and W. Burgard, “Identifying vegetation from laser data in structured outdoor environments”, Robotics and Autonomous Systems, vol. 62, no. 5, 2012, 675–684, 10.1016/j.robot.2012.10.003.
Notes
Record prepared under agreement No. 509/P-DUN/2018 with funds from the Ministry of Science and Higher Education (MNiSW) allocated to science-dissemination activities (2019).
Document type
Bibliography
YADDA identifier
bwmeta1.element.baztech-be3307bc-95ca-46a5-a836-02a8abc526cb