Article title

The utilization of spherical camera in simulation for service robotics

Identifiers
Title variants
Publication languages
EN
Abstracts
EN
Safety is one of the most critical factors in robotics, especially when robots have to collaborate with people in a shared environment. Testing physical systems, however, must cover much more than just software. A common step in robotic system development is the use of simulators, which work very well for tasks such as navigation or manipulation. Testing vision systems is more challenging, as simulated data is often far from real camera readings. In this paper, we show the advantages of using a spherical camera to record sequences of test images and present a way to integrate those recordings with an existing robotic simulator. The presented system can also be extended with rendered objects to further improve its usability.
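To make the idea concrete, the listing below is a minimal, hypothetical sketch (not taken from the paper) of the core operation such a pipeline needs: sampling a virtual pinhole-camera view from an equirectangular spherical recording for a given simulated camera orientation. It assumes Python with numpy and OpenCV; the function name, parameters, and file names are illustrative assumptions only.

    # Hypothetical sketch: render a perspective view from an equirectangular
    # panorama frame, so a recorded spherical sequence can stand in for a
    # simulated camera. Names and parameters are illustrative assumptions.
    import numpy as np
    import cv2

    def equirect_to_pinhole(pano, yaw_deg, pitch_deg, fov_deg=90.0, out_size=(640, 480)):
        """Sample a pinhole view (given yaw/pitch and horizontal FOV) from a panorama."""
        w_out, h_out = out_size
        h_pano, w_pano = pano.shape[:2]

        # Pixel grid of the virtual pinhole camera, expressed as unit rays.
        f = 0.5 * w_out / np.tan(np.radians(fov_deg) / 2.0)
        u, v = np.meshgrid(np.arange(w_out), np.arange(h_out))
        x = (u - w_out / 2.0) / f
        y = (v - h_out / 2.0) / f
        rays = np.stack([x, y, np.ones_like(x)], axis=-1)
        rays /= np.linalg.norm(rays, axis=-1, keepdims=True)

        # Rotate the rays by the requested yaw (around the vertical axis) and pitch.
        yaw, pitch = np.radians(yaw_deg), np.radians(pitch_deg)
        R_yaw = np.array([[np.cos(yaw), 0, np.sin(yaw)],
                          [0, 1, 0],
                          [-np.sin(yaw), 0, np.cos(yaw)]])
        R_pitch = np.array([[1, 0, 0],
                            [0, np.cos(pitch), -np.sin(pitch)],
                            [0, np.sin(pitch), np.cos(pitch)]])
        rays = rays @ (R_yaw @ R_pitch).T

        # Convert ray directions to longitude/latitude, then to panorama pixels.
        lon = np.arctan2(rays[..., 0], rays[..., 2])        # [-pi, pi]
        lat = np.arcsin(np.clip(rays[..., 1], -1.0, 1.0))   # [-pi/2, pi/2]
        map_x = ((lon / np.pi + 1.0) * 0.5 * w_pano).astype(np.float32)
        map_y = ((lat / (np.pi / 2) + 1.0) * 0.5 * h_pano).astype(np.float32)

        return cv2.remap(pano, map_x, map_y, cv2.INTER_LINEAR)

    # Example: one frame of a recorded spherical sequence, viewed 30 degrees
    # to the left of the recording direction (file name is hypothetical).
    # frame = cv2.imread("pano_frame_0001.png")
    # view = equirect_to_pinhole(frame, yaw_deg=-30.0, pitch_deg=0.0)

Such a reprojection can then be published in place of a simulated camera topic, while rendered objects are composited on top of the sampled view.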
Year
Pages
17–26
Physical description
Bibliography: 13 items, figures, tables, charts
Authors
  • Warsaw University of Technology, Institute of Control and Computation Engineering, Warsaw, Poland
  • Warsaw University of Technology, Institute of Control and Computation Engineering, Warsaw, Poland
Bibliography
  • [1] W. Dudek et al., "Task harmonisation for a single-task robot controller," in 12th International Workshop on Robot Motion and Control (RoMoCo), Proceedings. IEEE, 2019, pp. 86–91.
  • [2] N. Greene, "Environment mapping and other applications of world projections," IEEE Computer Graphics and Applications, vol. 6, no. 11, pp. 21–29, Nov. 1986.
  • [3] S. Haddadin, A. Albu-Schäffer, and G. Hirzinger, "Safety evaluation of physical human-robot interaction via crash-testing," in Robotics: Science and Systems, Proceedings. Citeseer, 2007, vol. 3, pp. 217–224.
  • [4] N. Koenig and A. Howard, "Design and use paradigms for Gazebo, an open-source multi-robot simulator," in 2004 IEEE/RSJ International Conference on Intelligent Robots and Systems, Proceedings, Sep. 2004, vol. 3, pp. 2149–2154.
  • [5] S. Levine et al., "Learning hand-eye coordination for robotic grasping with deep learning and large-scale data collection," The International Journal of Robotics Research, vol. 37, no. 4-5, pp. 421–436, 2018.
  • [6] P. Martinez-Gonzalez et al., "UnrealROX: an extremely photorealistic virtual reality environment for robotics simulations and synthetic data generation," Virtual Reality, pp. 1–18, 2019.
  • [7] J. Pages, L. Marchionni, and F. Ferro, "Tiago: the modular robot that adapts to different research needs," in International Workshop on Robot Modularity, IROS, Proceedings, 2016.
  • [8] D. Seredynski, K. Banachowicz, and T. Winiarski, "Graph-based potential field for the end-effector control within the torque-based task hierarchy," in 2016 21st International Conference on Methods and Models in Automation and Robotics (MMAR), Proceedings. IEEE, 2016, pp. 645–650.
  • [9] D. Seredynski and W. Szynkiewicz, "Fast Grasp Learning for Novel Objects," in Recent Advances in Automation, Robotics and Measuring Techniques, Proceedings. Springer, 2016, vol. 440 of Advances in Intelligent Systems and Computing (AISC), pp. 681–692.
  • [10] J. Skinner et al., "High-fidelity simulation for evaluating robotic vision performance," in 2016 IEEE/RSJ International Conference on Intelligent Robots and Systems, Proceedings, 2016, pp. 2737–2744.
  • [11] A. Staranowicz and G. L. Mariottini, "A survey and comparison of commercial and open-source robotic simulator software," in Proceedings of the 4th International Conference on PErvasive Technologies Related to Assistive Environments (PETRA '11). Association for Computing Machinery, 2011.
  • [12] T. Winiarski et al., "Automated inspection of door parts based on fuzzy recognition system," in 21st IEEE International Conference on Methods and Models in Automation and Robotics (MMAR 2016), Proceedings. IEEE, 2016, pp. 478–483.
  • [13] M. Wasik, M. Rostkowska, and P. Skrzypczynski, "Embedded, GPU-based omnidirectional vision for a walking robot," in Advances in Cooperative Robotics, pp. 339–347, World Scientific, 2017.
Notes
Record prepared with funds from the Ministry of Education and Science (MEiN), agreement no. SONP/SP/546092/2022, under the "Social Responsibility of Science" programme, module: Popularisation of Science and Promotion of Sport (2024).
Document type
Bibliography
YADDA identifier
bwmeta1.element.baztech-b3978b0a-f894-4cf5-9679-33d69fa5513c