Article title

The Influence of Light Intensity on the Operation of Vision System in Collaborative Robot

Publication languages
EN
Abstracts
EN
Human-robot collaboration can be a powerful tool for increasing productivity in production systems by combining the strengths of humans and robots. Assembly operations, in particular, have shown great potential for utilizing the unique abilities of both parties. However, for robots to perform assembly tasks efficiently, components and parts must be presented in a known location and orientation, which is achieved through a process called parts feeding. Traditional automation methods for parts feeding, such as vibratory bowl feeders, are limited in their ability to accommodate variations in part design, shape, location, and orientation, making them less flexible for use in human-robot collaboration. Recent advancements in machine vision technology have opened up new possibilities for flexible feeding systems in human-robot assembly cells. This paper explores the application of the integrated vision system of the ABB YuMi collaborative robot and its object-detection capability. The characteristics of the vision system were determined experimentally by varying the light intensity on a test rig. The system was also validated to determine whether the angle of incidence of light affects its stability. The results of the study demonstrate the efficiency of the vision system in a collaborative robot and provide insights into its industrial application.
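The illumination experiment described in the abstract can be sketched as a simple simulation: a detector is rerun on the same scene while the scene brightness is scaled, and the fraction of successful detections is recorded as a stability score. This is a hypothetical illustration only — the toy threshold detector, the synthetic scene, and all thresholds below are assumptions for demonstration, not the ABB Integrated Vision toolchain used in the paper.

```python
import numpy as np

def detect_object(image, threshold=128, min_pixels=50):
    """Toy detector: the object is 'found' when enough pixels exceed threshold.
    (Stand-in for the real vision system; parameters are illustrative.)"""
    return np.count_nonzero(image > threshold) >= min_pixels

def detection_rate(base_image, intensity_scales):
    """Re-run detection while scaling scene brightness; return success rate."""
    hits = 0
    for s in intensity_scales:
        # Scale pixel intensities to emulate a change in light intensity,
        # clipping to the valid 8-bit range.
        lit = np.clip(base_image.astype(np.float64) * s, 0, 255).astype(np.uint8)
        if detect_object(lit):
            hits += 1
    return hits / len(intensity_scales)

# Synthetic scene: a dark 100x100 background with a bright 10x10 "part".
scene = np.full((100, 100), 40, dtype=np.uint8)
scene[45:55, 45:55] = 200

# Sweep illumination from 20% to 150% of nominal in 14 steps.
rate = detection_rate(scene, np.linspace(0.2, 1.5, 14))
print(f"detection rate across the sweep: {rate:.2f}")
```

At low scale factors the part's pixels fall below the detection threshold, so the rate is below 1.0 — mirroring the paper's finding that light intensity bounds the operating range of the vision system.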
Authors
  • Faculty of Mechanical Engineering, Lublin University of Technology, ul. Nadbystrzycka 36, 20-618 Lublin, Poland
  • Faculty of Mechanical Engineering, Lublin University of Technology, ul. Nadbystrzycka 36, 20-618 Lublin, Poland
Notes
Record created with funds from the Ministry of Education and Science (MEiN), agreement no. SONP/SP/546092/2022, under the programme "Social Responsibility of Science" – module: Popularisation of science and promotion of sport (2022–2023).
YADDA identifier
bwmeta1.element.baztech-422425b2-3b9e-40a2-b0e6-8550621d9fff