Article title

Path planning optimization and object placement through visual servoing technique for robotics application

Content
Identifiers
Title variants
Publication languages
EN
Abstract (EN)
Visual servoing defines a methodology for vision-based control in robotics. Vision-based control involves a sequence of actions that move a robot in response to the results of camera image analysis; this process helps the robot achieve a specific goal. The main purpose of visual servoing is to treat the vision system as a sensor dedicated to closing the servo control loop around a task. In this article, three visual control schemes are illustrated: Image-Based Visual Servoing (IBVS), Position-Based Visual Servoing (PBVS) and Hybrid-Based Visual Servoing (HBVS). The different terminologies are presented through an effective robot-vision workflow. IBVS concentrates on image features that are immediately available in the image; this experiment is performed by estimating the distance between the camera and the object. PBVS uses the 3-D parameters of a moving object to estimate the measurement; this paper showcases PBVS using a KUKA robot model. HBVS combines 2-D and 3-D servoing through visual sensors and overcomes the challenges of the previous two methods; this paper presents HBVS using an IPR communication robot model.
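The IBVS scheme described in the abstract regulates an image-feature error to zero by mapping it to a camera velocity through the interaction (image Jacobian) matrix. A minimal sketch of this classical control law, v = -λ L⁺ (s - s*), is given below; the point-feature coordinates, depth estimates, and gain are illustrative assumptions, not values from the paper.

```python
import numpy as np

def interaction_matrix(x, y, Z):
    """Interaction matrix (image Jacobian) of one point feature at
    normalized image coordinates (x, y) with estimated depth Z."""
    return np.array([
        [-1.0 / Z, 0.0, x / Z, x * y, -(1.0 + x**2), y],
        [0.0, -1.0 / Z, y / Z, 1.0 + y**2, -x * y, -x],
    ])

def ibvs_velocity(features, desired, depths, gain=0.5):
    """Classical IBVS law: v = -gain * pinv(L) @ (s - s*).

    features, desired: lists of (x, y) current and goal coordinates.
    depths: estimated depth Z of each feature (an IBVS assumption).
    Returns the 6-DOF camera velocity (vx, vy, vz, wx, wy, wz).
    """
    # Stack one 2x6 interaction matrix per point feature.
    L = np.vstack([interaction_matrix(x, y, Z)
                   for (x, y), Z in zip(features, depths)])
    error = (np.asarray(features) - np.asarray(desired)).ravel()
    return -gain * np.linalg.pinv(L) @ error
```

At the desired configuration the feature error vanishes, so the commanded camera velocity is zero; away from it, the pseudo-inverse drives the features toward their goal positions in the image.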
Authors
  • U and P U. Patel Department of Computer Engineering, Chandubhai S. Patel Institute of Technology (CSPIT), Charotar University of Science and Technology (CHARUSAT), Gujarat, India
  • Information Technology Department, R. C. Technical Institute, Ahmedabad, India
Author
  • Department of Information Technology, Chandubhai S. Patel Institute of Technology (CSPIT), Charotar University of Science and Technology (CHARUSAT), Gujarat, India
Notes
Record compiled with MNiSW funds, agreement No. 461252, under the programme "Social responsibility of science" – module: Popularisation of science and promotion of sport (2020).
Document type
Bibliography
YADDA identifier
bwmeta1.element.baztech-910f3e27-f54e-4311-9ebb-28a8d21ea717