Article title

An effective programming by demonstration method for SMEs’ industrial robots

Identifiers
Title variants
Publication languages
EN
Abstracts
EN
Traditional programming methods often require expertise and a significant time investment, which does not suit the nature of Small and Medium-sized Enterprises (SMEs), where High-Mix, Low-Volume (HMLV) orders are usually encountered. In this research, a Programming by Demonstration (PbD) method is proposed that aims to reduce programming time and complexity while maintaining a suitable level of execution accuracy. For this purpose, a special teaching tool was designed and manufactured. The tool carries five spherical passive markers that indicate the position and orientation along the desired 3D path. An optical tracking system using a stereo camera captures the 3D pose of the teaching tool. The capturing algorithm is based on the Circle Hough Transform (CHT) and Singular Value Decomposition (SVD). The developed tool and programming method were tested experimentally. The results show successful capturing of the desired path points with a competitive level of accuracy compared with other methods.
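The SVD step mentioned in the abstract is typically the rigid-body fit between the known marker layout on the teaching tool and the triangulated marker positions, as in the least-squares method of Arun et al. (reference [30]). Below is a minimal sketch of that fit, not the authors' implementation; the marker coordinates are illustrative placeholders, and detection (CHT) and stereo triangulation are assumed to have already produced the 3D points.

```python
import numpy as np

def fit_rigid_transform(P, Q):
    """Least-squares R (rotation) and t (translation) with Q ~= R @ P + t,
    via the SVD method of Arun et al. P and Q are 3xN corresponding points."""
    P = np.asarray(P, dtype=float)   # marker model points (tool frame)
    Q = np.asarray(Q, dtype=float)   # triangulated points (camera frame)
    cp = P.mean(axis=1, keepdims=True)
    cq = Q.mean(axis=1, keepdims=True)
    H = (P - cp) @ (Q - cq).T        # 3x3 cross-covariance of centered sets
    U, _, Vt = np.linalg.svd(H)
    # Guard against a reflection solution (det = -1).
    D = np.diag([1.0, 1.0, np.linalg.det(Vt.T @ U.T)])
    R = Vt.T @ D @ U.T
    t = cq - R @ cp
    return R, t

# Illustrative five-marker layout (non-coplanar, hypothetical coordinates).
P = np.array([[0.0, 1.0, 0.0, 0.0, 0.5],
              [0.0, 0.0, 1.0, 0.0, 0.5],
              [0.0, 0.0, 0.0, 1.0, 0.2]])
theta = 0.3
R_true = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                   [np.sin(theta),  np.cos(theta), 0.0],
                   [0.0,            0.0,           1.0]])
t_true = np.array([[0.1], [-0.2], [0.05]])
Q = R_true @ P + t_true              # simulated noise-free measurement
R_est, t_est = fit_rigid_transform(P, Q)
```

With noise-free correspondences the fit recovers the pose exactly; in practice the triangulated points carry measurement noise, and the same formula gives the least-squares optimum.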
Year
Pages
86–98
Physical description
Bibliography: 30 items, figures
Authors
author
  • Mechanical Engineering Department, College of Engineering and Technology-Cairo Campus, Arab Academy for Science, Technology and Maritime Transport (AASTMT), Heliopolis, Cairo, Egypt
  • Mechanical Engineering Department, College of Engineering and Technology-Cairo Campus, Arab Academy for Science, Technology and Maritime Transport (AASTMT), Heliopolis, Cairo, Egypt
  • Mechanical Power Engineering Department, Faculty of Engineering, El Mataria, Helwan University, Cairo, Egypt
Bibliography
  • [1] DIETZ T., 2012, Programming System for Efficient Use of Industrial Robots for Deburring in SME Environments, ROBOTIK 2012; 7th German Conference on Robotics, VDE.
  • [2] KULIĆ D., OTT C., LEE D., 2012, Incremental Learning of Full Body Motion Primitives and Their Sequencing Through Human Motion Observation, The International Journal of Robotics Research, 31/3, 330–345.
  • [3] TAKANO W., MURAKAMI Y., NAKAMURA Y., 2020, Representation and Classification of Whole-Body Motion Integrated with Finger Motion, Robotics and Autonomous Systems, 124, 103378.
  • [4] ALEXANDERSON S., O’SULLIVAN C., 2017, Real-Time Labeling of Non-Rigid Motion Capture Marker Sets, Computers & Graphics, 69, 59–67.
  • [5] ANTONELLI D., 2013, Training by Demonstration for Welding Robots by Optical Trajectory Tracking, Procedia CIRP, 12, 145–150.
  • [6] ANTONELLI D., ASTANIN S., 2016, Qualification of a Collaborative Human-Robot Welding Cell, Procedia CIRP, 41, 352–357.
  • [7] MUELLER F., DEUERLEIN C., KOCH M., 2019, Intuitive Welding Robot Programming via Motion Capture and Augmented Reality, IFAC-PapersOnLine, 52/10, 294–299.
  • [8] CHEN C., 2020, A Virtual-Physical Collision Detection Interface for AR-Based Interactive Teaching of Robot, Robotics and Computer-Integrated Manufacturing, 64, 101948.
  • [9] ALEOTTI J., CASELLI S., 2012, Grasp Programming by Demonstration in Virtual Reality with Automatic Environment Reconstruction, Virtual Reality, 16/2, 87–104.
  • [10] SKOGLUND A., ILIEV B., PALM R., 2010, Programming-by-Demonstration of Reaching Motions: A Next-State-Planner Approach, Robotics and Autonomous Systems, 58/5, 607–621.
  • [11] RUPPEL P., HENDRICH N., ZHANG J., 2019, Low-Cost Multi-View Pose Tracking Using Active Markers, IEEE International Conference on Industrial Cyber Physical Systems (ICPS), 18882306.
  • [12] FERREIRA M., 2016, Stereo-Based Real-Time 6-DoF Work Tool Tracking for Robot Programing by Demonstration, The International Journal of Advanced Manufacturing Technology, 85/1–4, 57–69.
  • [13] DUQUE D.A., 2019, Trajectory Generation for Robotic Assembly Operations Using Learning by Demonstration, Robotics and Computer-Integrated Manufacturing, 57, 292–302.
  • [14] KYRARINI M., 2019, Robot Learning of Industrial Assembly Task via Human Demonstrations, Autonomous Robots, 43, 239–257.
  • [15] CACCAVALE R., 2019, Kinesthetic Teaching and Attentional Supervision of Structured Tasks in Human–Robot Interaction, Autonomous Robots, 43, 1291–1307.
  • [16] LUO J., 2020, A Task Learning Mechanism for the Telerobots, International Journal of Humanoid Robotics, 16/2, 1950009.
  • [17] FANG H., ONG S., NEE A., 2012, Interactive Robot Trajectory Planning and Simulation Using Augmented Reality, Robotics and Computer-Integrated Manufacturing, 28/2, 227–237.
  • [18] FANG H., ONG S.K., NEE A.Y-C., 2009, Robot Programming Using Augmented Reality, International Conference on CyberWorlds, IEEE, 13–20.
  • [19] RABBI I., 2017, Analysing the Attributes of Fiducial Markers for Robust Tracking in Augmented Reality Applications, International Journal of Computational Vision and Robotics, 7/1–2, 68–82.
  • [20] KUTS V., 2016, Robot Manipulator Usage for Measurement in Production Areas, Journal of Machine Engineering, 16/1, 57–67.
  • [21] SANDAK J., ORLOWSKI K.A., 2018, Machine Vision Detection of the Circular Saw Vibrations, Journal of Machine Engineering, 18/3, 67–77.
  • [22] HARTLEY R.I., STURM P., 1997, Triangulation, Computer Vision and Image Understanding, 68/2, 146–157.
  • [23] HARTLEY R., ZISSERMAN A., 2003, Multiple View Geometry in Computer Vision, Cambridge University Press.
  • [24] YUEN H., 1990, Comparative Study of Hough Transform Methods for Circle Finding, Image and Vision Computing, 8/1, 71–77.
  • [25] DAVIES E.R., 2004, Machine Vision: Theory, Algorithms, Practicalities, Elsevier, Morgan Kaufmann.
  • [26] PEDERSEN S.J.K., 2007, Circular Hough Transform, Aalborg University, Vision, Graphics, and Interactive Systems, 123/6.
  • [27] KERBYSON D., ATHERTON T., 1995, Circle Detection Using Hough Transform Filters, Fifth International Conference on Image Processing and its Applications, 370–374.
  • [28] CUEVAS E., 2012, Fast Algorithm for Multiple-Circle Detection on Images Using Learning Automata, IET Image Processing, 6/8, 1124–1135.
  • [29] ATHERTON T.J., KERBYSON D.J., 1999, Size Invariant Circle Detection, Image and Vision Computing, 17/11, 795–803.
  • [30] ARUN K.S., 1987, Least-Squares Fitting of Two 3-D Point Sets, IEEE Transactions on Pattern Analysis and Machine Intelligence, PAMI-9/5, 698–700.
Notes
Record compiled with funds from the Ministry of Science and Higher Education (MNiSW), agreement No. 461252, under the programme "Social Responsibility of Science", module: Popularisation of Science and Promotion of Sport (2021).
Document type
Bibliography
YADDA identifier
bwmeta1.element.baztech-4f531253-e413-4e45-bac9-d2429a91d0db