Article title

Intuitive User Interfaces for Mobile Manipulation Tasks

Content
Identifiers
Title variants
Publication languages
EN
Abstracts
EN
This article describes interactive methods that can ease difficult manipulation tasks in Search & Rescue operations. We discuss the requirements that a telemanipulation system must meet to be used successfully. These include not only the correctness of the generated motion but also the ergonomics, mobility and interactivity of the operator’s interface. We show that grippers with one or more degrees of freedom can be controlled intuitively by different interface mechanisms supported by 3D vision systems. Tests are performed both in a simulation environment and with real grippers. A practical pipeline for direct control and for system learning is also presented.
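The abstract stays at a high level, so the following minimal Python sketch (not taken from the article) illustrates the kind of mapping it describes: a 3D hand-tracking reading converted into a gripper opening command, plus a toy grasp-type classifier standing in for the learning stage, in the spirit of the scikit-learn and LIBLINEAR tools cited below. All function names, thresholds and training data here are illustrative assumptions, not the authors' implementation.

    # Illustrative sketch only: sensor values, thresholds and training data are assumed.
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    def hand_aperture(thumb_tip, index_tip):
        """Distance between thumb and index fingertip positions (metres)."""
        return float(np.linalg.norm(np.asarray(thumb_tip) - np.asarray(index_tip)))

    def aperture_to_gripper_width(aperture, min_ap=0.02, max_ap=0.10, max_width=0.08):
        """Linearly map the measured hand aperture to a gripper opening width [m]."""
        a = np.clip((aperture - min_ap) / (max_ap - min_ap), 0.0, 1.0)
        return a * max_width

    # Toy grasp-type classifier (0 = precision grasp, 1 = power grasp) trained on
    # made-up finger-flexion features; it stands in for the learning step.
    X = np.array([[0.1, 0.2, 0.1], [0.8, 0.9, 0.7], [0.2, 0.1, 0.2], [0.9, 0.8, 0.9]])
    y = np.array([0, 1, 0, 1])
    grasp_clf = LogisticRegression().fit(X, y)

    if __name__ == "__main__":
        ap = hand_aperture([0.0, 0.0, 0.0], [0.0, 0.06, 0.0])
        print("commanded gripper width [m]:", aperture_to_gripper_width(ap))
        print("predicted grasp type:", grasp_clf.predict([[0.15, 0.2, 0.1]])[0])

In a real setup the commanded width would be sent to the gripper driver (for example over a ROS topic, since ROS is used in the cited work) and the classifier would be trained on recorded operator demonstrations rather than the made-up samples above.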
Keywords
Authors
author
  • Lodz University of Technology, Stefanowskiego 18/22, 90-924 Lodz, www.robotyka.p.lodz.pl
author
  • Lodz University of Technology, Stefanowskiego 18/22, 90-924 Lodz, www.robotyka.p.lodz.pl
Bibliography
  • [1] E. Ackerman. “Willow garage introduces velo 2g adaptive gripper”. http://spectrum.ieee.org/automaton/robotics/robotics-hardware/willow-garage-introduces-velo-2g-adaptive-gripper.
  • [2] Autecsafety.com. “Autec: industrial radio remote controls for all applications”. http://www.autecsafety.com/en/products, 2015.
  • [3] R. Baer. “Recording crt light gun and method”. http://www.google.com/patents/US3599221, August 10, 1971. US Patent 3,599,221.
  • [4] D. Balek, R. Kelley, “Using gripper mounted infrared proximity sensors for robot feedback control”. In: Proc. of the 1985 IEEE Int. Conf. on Robotics and Automation, vol. 2, 1985, 282–287. DOI: 10.1109/ROBOT.1985.1087328.
  • [5] B. Bruggemann, B. Gaspers, A. Ciossek, J. Pellenz, and N. Kroll, “Comparison of different control methods for mobile manipulation using standardized tests”. In: IEEE Int. Symposium on Safety, Security, and Rescue Robotics (SSRR), 2013, 1–2. DOI: 10.1109/SSRR.2013.6719378.
  • [6] M. Ciocarlie, C. Goldfeder, and P. K. Allen, “Dimensionality reduction for hand-independent dexterous robotic grasping”. In: 2007 IEEE/RSJ International Conference on Intelligent Robots and Systems, San Diego, CA, 29 October–2 November 2007, 2007, 3270–3275. DOI: 10.1109/IROS.2007.4399227.
  • [7] Shadow Robot Company. “Shadow dexterous hand”. http://www.shadowrobot.com/products/dexterous-hand/.
  • [8] M. Cutkosky, “On grasp choice, grasp models, and the design of hands for manufacturing tasks”, IEEE Transactions on Robotics and Automation, vol. 5, no. 3, 1989, 269–279. DOI: 10.1109/70.34763.
  • [9] S. Ekvall, D. Kragic, “Grasp recognition for programming by demonstration”. In: Proceedings of the 2005 IEEE International Conference on Robotics and Automation (ICRA 2005), 2005, 748–753. DOI: 10.1109/ROBOT.2005.1570207.
  • [10] R.-E. Fan, K.-W. Chang, C.-J. Hsieh, X.-R. Wang, C.-J. Lin, “LIBLINEAR: A library for large linear classification”, Journal of Machine Learning Research, vol. 9, 2008, 1871–1874.
  • [11] T. Fischer, “Sensor system for controlling a multifingered gripper on a robot arm”. In: Proc. of the Int. Conf. on Intelligent Robots and Systems, vol. 3, 1998, 1509–1514. DOI: 10.1109/IROS.1998.724809.
  • [12] Forcedimension.com. “Force dimension – product brochures”. http://www.forcedimension.com/products, 2015.
  • [13] Futaba-rc.com. “Futaba® product manuals”. http://www.futaba-rc.com/downloads/manuals.html, 2015.
  • [14] O. Gibaru. “Control of a ur10 robot and a 3-finger robot gripper thanks to a leapmotion”. https://www.youtube.com/watch?v=YJ5s3wxKZ30, 2014.
  • [15] G. Gioioso, G. Salvietti, M. Malvezzi, D. Prattichizzo, “Mapping synergies from human to robotic hands with dissimilar kinematics: An approach in the object domain”, IEEE Transactions on Robotics, vol. 29, no. 4, 2013, 825–837. DOI: 10.1109/TRO.2013.2252251.
  • [16] S. Goza, R. Ambrose, M. A. Diftler, and I. M. Spain, “Telepresence control of the nasa/darpa robonaut on a mobility platform”. In: Proc. of the SIGCHI Conference on Human Factors in Computing Systems, 2004, 623–629. DOI: 10.1145/985692.985771.
  • [17] W. B. Griffin, R. P. Findley, M. L. Turner, and M. R. Cutkosky, “Calibration and mapping of a human hand for dexterous telemanipulation”. In: ASME IMECE 2000 Symposium on Haptic Interfaces for Virtual Environments and Teleoperator Systems, 2000, 1–8.
  • [18] A. Kowalski, Wykorzystanie czujnika leapmotion do sterowania chwytaka (Using the leapmotion sensor to control the gripper), BSc Thesis, Lodz University of Technology, 2014. (in Polish)
  • [19] A. Leeper, K. Hsiao, M. Ciocarlie, L. Takayama, D. Gossow, “Strategies for human-in-the-loop robotic grasping”. In: 7th ACM/IEEE Int. Conf. on Human-Robot Interaction (HRI), 2012, 1–8. DOI: 10.1145/2157689.2157691.
  • [20] H. Liu, K. Wu, P. Meusel, et al., “Multisensory five-finger dexterous hand: The DLR/HIT hand II”. In: IEEE/RSJ Int. Conf. on Intelligent Robots and Systems, IROS, 2008, 3692–3697. DOI: 10.1109/IROS.2008.4650624.
  • [21] J. Liu, Y. Zhang, “Mapping human hand motion to dexterous robotic hand”. In: IEEE International Conference on Robotics and Biomimetics, ROBIO 2007, 2007, 829–834.
  • [22] M. J. Lum, J. Rosen, H. King, et al., “Telesurgery via unmanned aerial vehicle (UAV) with a field deployable surgical robot”, Stud Health Technol Inform, vol. 125, 2007, 313–315.
  • [23] R. R. Ma, A. M. Dollar, “On dexterity and dexterous manipulation”. In: 15th Int. Conf. on Advanced Robotics (ICAR), 2011, 1–7. DOI: 10.1109/ICAR.2011.6088576.
  • [24] R. Marks. “Prop input device and method for mapping an object from a two-dimensional camera image to a three-dimensional space for controlling action in a game program”. http://www.google.com/patents/EP1176559A2?cl=en, January 30, 2002. EP Patent App. EP20,010,306,264.
  • [25] D. McNeill, “So you think gestures are nonverbal?”, Psychological Review, vol. 92, no. 3, 1985, 350.
  • [26] Leap Motion. “Leap motion release notes and known issues”. https://developer.leapmotion.com/features/faq. Accessed 27 Dec. 2014.
  • [27] M. A. Nacenta, Y. Kamber, Y. Qiang, and P. O. Kristensson, “Memorability of pre-designed and user-defined gesture sets”. In: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, New York, NY, USA, 2013, 1099–1108. DOI: 10.1145/2470654.2466142.
  • [28] K. Nagatani, S. Kiribayashi, Y. Okada, et al., “Redesign of rescue mobile robot quince”. In: IEEE Int. Symposium on Safety, Security, and Rescue Robotics (SSRR), 2011, 13–18. DOI: 10.1109/SSRR.2011.6106794.
  • [29] J. R. Napier, “The prehensile movements of the human hand”, Journal of Bone and Joint Surgery, vol. 38, no. 4, 1956, 902–913.
  • [30] NASA. “Robonaut 2, the next generation dexterous robot”. http://www.nasa.gov/mission_pages/station/multimedia/robonaut_photos_prt.htm.
  • [31] U.S. Navy. “US Navy 090512-N-2013O-013: A Mark II TALON robot from Explosive Ordnance Disposal Mobile Unit 5, Det. Japan, is used to inspect a suspicious package during a force protection-antiterrorism training exercise”. http://commons.wikimedia.org/wiki/. Accessed 16 Feb 2014.
  • [32] A. Okamura. ME 327 Design and Control of Haptic Systems: Lecture Notes. Stanford University, Stanford, USA, March 2014.
  • [33] F. Pedregosa, G. Varoquaux, A. Gramfort, et al., “Scikit-learn: Machine learning in Python”, Journal of Machine Learning Research, vol. 12, 2011, 2825–2830.
  • [34] A. Peer, S. Einenkel, and M. Buss, “Multi-fingered telemanipulation – mapping of a human hand to a three finger gripper”. In: The 17th IEEE International Symposium on Robot and Human Interactive Communication (RO-MAN), 2008, 465–470. DOI: 10.1109/ROMAN.2008.4600710.
  • [35] M. Shimamoto, “Teleoperator/telepresence system (TOPS) concept verification model (CVM) development”. In: NASA. Lyndon B. Johnson Space Center, The Sixth Annual Workshop on Space Operations Applications and Research (SOAR 1992) (SEE N 93-32097 12-99), 1993, 149–155.
  • [36] B. Siciliano and O. Khatib, Springer handbook of robotics, Springer, 2008. DOI: 10.1007/978-3-540-30301-5.
  • [37] H. I. Son, L. Chuang, A. Franchi, J. Kim, D. Lee, S.-W. Lee, H. Bulthoff, and P. Giordano, “Measuring an operator’s maneuverability performance in the haptic teleoperation of multiple robots”. In: 2011 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), 3039–3046. DOI: 10.1109/IROS.2011.6094618.
  • [38] Spectrum.ieee.org. “Fukushima robot operator writes tell-all blog – IEEE Spectrum”. http://spectrum.ieee.org/automaton/robotics/industrial-robots/fukushima-robot-operator-diaries, 2014.
  • [39] Taurob.com. “Universal teleoperation console | taurob”. http://taurob.com/produkte/universal-teleoperation-console/, 2015.
  • [40] Barrett Technology. “BH8-282 datasheet”. http://web.barrett.com.
  • [41] R. Tibshirani, “Regression shrinkage and selection via the lasso”, Journal of the Royal Statistical Society. Series B (Methodological), 1996, 267–288.
  • [42] T. Veltrop. “Kinect teleoperation of humanoid robot”. https://www.youtube.com/watch?v=GdSfLyZl4N0, 2011.
  • [43] T. Wojtara and K. Nonami, “Hand posture detection by neural network and grasp mapping for a master slave hand system”. In: 2004 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS 2004), Proceedings, vol. 1, 2004, 866–871. DOI: 10.1109/IROS.2004.1389461.
  • [44] I. Zubrycki and G. Granosik, “Grip recognition and control of 3-finger gripper with sensor glove”. In: Proc. of Int. Conference on Robotics and Artificial Intelligence. Problems and Perspective (RAIPAP’13), Brest, Belarus, 4–6 November 2013, 2013.
  • [45] I. Zubrycki and G. Granosik, “Test setup for multi-finger gripper control based on robot operating system (ROS)”. In: Proc. of 9th Int. Workshop on Robot Motion and Control, 2013, 135–140. DOI: 10.1109/RoMoCo.2013.6614598.
  • [46] I. Zubrycki and G. Granosik. “Demo of sensor glove to dexterous gripper mapping”. https://www.youtube.com/watch?v=e5bEyEErEc4, 2014. Accessed 24 Dec. 2014.
  • [47] I. Zubrycki and G. Granosik, “Hybrid control of Schunk Dexterous Hand”. https://www.youtube.com/watch?v=CPTsFYXaRLs, 2014. Accessed 24 Dec. 2014.
  • [48] I. Zubrycki and G. Granosik, “Leap motion sensor for gripper control”. https://www.youtube.com/watch?v=OvO4DKdigds, 2014. Accessed 24 Dec. 2014.
  • [49] I. Zubrycki and G. Granosik, “Using integrated vision systems: three gears and leap motion, to control a 3-Finger dexterous gripper”. In: Recent Advances in Automation, Robotics and Measuring Techniques, 2014, 553–564. DOI: 10.1007/978-3-319-05353-0_52.
Document type
Bibliography
YADDA identifier
bwmeta1.element.baztech-9b26c6d4-99be-4498-bc9e-e7ef26c6418d