Article title

Optomechanical industrial-level camera modifications for repeatable thermal image drift

Publication language
EN
Abstract
EN
Thermal image drift is observed in prevalent industrial-level cameras because their optomechanical design is not optimised to reduce this phenomenon. In this paper, the effect of temperature on industrial-level cameras is investigated, focusing on the thermal image drift resulting from ambient temperature changes and the warm-up process. Standard methods for reducing thermal image drift are reviewed, concentrating on the non-repeatability of this drift. Repeatable thermal image drift is crucial for applying a compensation model, as random thermal deformations in sensors cannot be compensated. Moreover, the possible cause of this issue is explored, and novel optomechanical camera modifications are proposed that maintain the thermal degrees of freedom of the deforming sensor, limiting the non-repeatable component of thermal image drift to a low level. The improvement is verified experimentally using a specialised test stand equipped with an invar frame and a thermal chamber. After applying the polynomial compensation model, the standard deviation of the central shifts of image drift is reduced by a factor of 3.99, and the absolute range of image drift is reduced by a factor of 2.53.
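The abstract's key premise is that only the repeatable part of thermal image drift can be removed by a compensation model. A minimal sketch of that idea, using entirely hypothetical temperature and drift values (the paper's actual model, data, and polynomial order are not given here), fits a polynomial to drift versus sensor temperature and subtracts the prediction:

```python
import numpy as np

# Hypothetical measurements: sensor temperature (deg C) and image drift (px).
# If the drift is repeatable, it follows temperature smoothly.
temps = np.array([20.0, 25.0, 30.0, 35.0, 40.0, 45.0])
drift = np.array([0.00, 0.12, 0.27, 0.45, 0.66, 0.90])

# Fit a 2nd-order polynomial compensation model: drift ~ a*T^2 + b*T + c
coeffs = np.polyfit(temps, drift, deg=2)

# Compensate: subtract the model prediction from each measurement
residual = drift - np.polyval(coeffs, temps)

# For repeatable drift the compensated spread is far smaller than the raw one;
# a random (non-repeatable) component would remain in the residual.
print(np.std(residual) < np.std(drift))  # prints True for this smooth data
```

This is only an illustration of why repeatability matters: if the drift contained a large random component, the residual spread would stay close to the raw spread regardless of the polynomial order.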
Pages
art. no. e150185
Physical description
Bibliography: 42 items, figures, charts
Authors
  • Warsaw University of Technology, Faculty of Mechatronics, Institute of Micromechanics and Photonics, ul. Andrzeja Boboli 8, 02-525 Warsaw, Poland
author
  • Warsaw University of Technology, Faculty of Mechatronics, Institute of Micromechanics and Photonics, ul. Andrzeja Boboli 8, 02-525 Warsaw, Poland
Bibliography
  • [1] Coffey, V. C. Machine vision: The eyes of industry 4.0. Opt. Photon. News 29, 42-49 (2018). https://doi.org/10.1364/OPN.29.7.000042.
  • [2] Szeliski, R. Computer Vision: Algorithms and Applications 2nd Edition (Springer, 2021).
  • [3] Emmer, C., Glaesner, K.-H., Pfouga, A. & Stjepandić, J. Advances in 3d measurement data management for industry 4.0. Procedia Manuf. 11, 1335-1342 (2017). https://doi.org/10.1016/j.promfg.2017.07.262.
  • [4] Machine vision market. https://www.marketsandmarkets.com/Market-Reports/industrial-machine-vision-market-234246734.html (2021). Accessed: 2021-09-26.
  • [5] Lenty, B., Sioma, A. & Kwiek, P. Quality control automation of electric cables using machine vision. In Romaniuk, R. S. & Linczuk, M. (eds.) Photonics Applications in Astronomy, Communications, Industry, and High-Energy Physics Experiments 2018, 129 (SPIE, 2018). https://doi.org/10.1117/12.2501562.
  • [6] Davies, E. 4-machine vision in the food industry. In Caldwell, D. G. (ed.) Robotics and Automation in the Food Industry, 75-110 (Woodhead Publishing, 2013). https://doi.org/10.1533/9780857095763.1.75.
  • [7] Siekański, P. et al. On-line laser triangulation scanner for wood logs surface geometry measurement. Sensors (Switzerland) 19, 1074 (2019). https://doi.org/10.3390/s19051074.
  • [8] Peisheng, T., Ronggui, D. & Yubin, Z. A stereoscopic warehouse stocktaking method based on machine vision. In J. Phys.: Conf. Ser., 1627 (IOP Publishing Ltd, 2020). https://doi.org/10.1088/1742-6596/1627/1/012015.
  • [9] Michoński, J., Glinkowski, W., Witkowski, M. & Sitnik, R. Automatic recognition of surface landmarks of anatomical structures of back and posture. J. Biomed. Opt. 17, 056015 (2012). https://doi.org/10.1117/1.JBO.17.5.056015.
  • [10] Glinkowski, W. M. et al. Posture and low back pain during pregnancy-3d study. Ginekol. Polska 87, 575–580 (2016). https://doi.org/10.5603/GP.2016.0047.
  • [11] Galata, D. L. et al. Applications of machine vision in pharmaceutical technology: A review. Eur. J. Pharm. Sci. 159, 105717 (2021). https://doi.org/10.1016/j.ejps.2021.105717.
  • [12] Karaszewski, M., Adamczyk, M. & Sitnik, R. Assessment of next-best-view algorithms performance with various 3d scanners and manipulator. ISPRS J. Photogramm. Remote Sens. 119, 320-333 (2016). https://doi.org/10.1016/j.isprsjprs.2016.06.015.
  • [13] Karaszewski, M., Lech, K., Bunsch, E. & Sitnik, R. In the pursuit of perfect 3d digitization of surfaces of paintings: Geometry and color optimization. In Ioannides, M. et al. (eds.) Digital Heritage. Progress in Cultural Heritage: Documentation, Preservation, and Protection, 25-34 (Springer International Publishing, 2014).
  • [14] Wang, S., Liang, J., Li, X., Su, F. & Zhao, Z. A calibration method on 3d measurement based on structuredlight with single camera. In Nomura, T., Liu, J., Jia, B., Yao, X. & Wang, Y. (eds.) 2019 International Conference on Optical Instruments and Technology: Optical Systems and Modern Optoelectronic Instruments, 109 (SPIE, 2020). https://doi.org/10.1117/12.2550235.
  • [15] Liberadzki, P., Adamczyk, M., Witkowski, M. & Sitnik, R. Structured-light-based system for shape measurement of the human body in motion. Sensors 18, 2827 (2018). https://doi.org/10.3390/s18092827.
  • [16] Lenar, J. et al. Lower body kinematics evaluation based on a multidirectional four-dimensional structured light measurement. J. Biomed. Opt. 18, 056014 (2013). https://doi.org/10.1117/1.jbo.18.5.056014.
  • [17] Zou, H., Cao, K. & Jiang, C. Spatio-temporal visual analysis for urban traffic characters based on video surveillance camera data. ISPRS Int. J. Geo-Inf. 10, 177 (2021). https://doi.org/10.3390/ijgi10030177.
  • [18] Stylios, I., Kokolakis, S., Thanou, O. & Chatzis, S. Behavioral biometrics and continuous user authentication on mobile devices: A survey. Inf. Fusion 66, 76-99 (2021). https://doi.org/10.1016/j.inffus.2020.08.021.
  • [19] Adamczyk, M., Sieniło, M., Sitnik, R. & Woźniak, A. Hierarchical, three-dimensional measurement system for crime scene scanning. J. Forensic Sci. 62, 889-899 (2017). https://doi.org/10.1111/1556-4029.13382.
  • [20] Sitnik, R. New method of structure light measurement system calibration based on adaptive and effective evaluation of 3d phase distribution. In Proc. SPIE, Optical Measurement Systems for Industrial Inspection IV, vol. 5856, 109-117 (2005). https://doi.org/10.1117/12.613017.
  • [21] Chen, R. et al. Accurate calibration method for camera and projector in fringe patterns measurement system. Appl. Opt. 55, 4293-4300 (2016). https://doi.org/10.1364/AO.55.004293.
  • [22] Sładek, J., Sitnik, R., Kupiec, M. & Błaszczyk, P. The hybrid coordinate measurement system as a response to industrial requirements. Metrol. Meas. Syst. XVII, 537-547 (2010). https://doi.org/10.2478/v10178-012-0001-3.
  • [23] Zhang, Z. A flexible new technique for camera calibration. IEEE Trans. Pattern Anal. Mach. Intell. 22, 1330-1334 (2000). https://doi.org/10.1109/34.888718.
  • [24] Elias, M., Eltner, A., Liebold, F. & Maas, H. G. Assessing the influence of temperature changes on the geometric stability of smartphone- and Raspberry Pi cameras. Sensors (Switzerland) 20, 643 (2020). https://doi.org/10.3390/s20030643.
  • [25] Yu, Q. et al. The effects of temperature variation on videometric measurement and a compensation method. Image Vis. Comput. 32, 1021-1029 (2014). https://doi.org/10.1016/j.imavis.2014.08.011.
  • [26] Handel, H. Compensation of thermal errors in vision based measurement systems using a system identification approach. In 2008 9th International Conference on Signal Processing, 1329-1333 (IEEE, 2008). https://doi.org/10.1109/ICOSP.2008.4697377.
  • [27] Handel, H. Analyzing the influences of camera warm-up on image acquisition. IPSJ Trans. Comput. Vis. Appl. 1, 12-20 (2009). https://doi.org/10.1109/ICOSP.2008.4697377.
  • [28] Podbreznik, P. & Potočnik, B. Assessing the influence of temperature variations on the geometrical properties of a low-cost calibrated camera system by using computer vision procedures. Mach. Vis. Appl. 23, 953-966 (2012). https://doi.org/10.1007/s00138-011-0330-3.
  • [29] Pan, B., Shi, W. & Lubineau, G. Effect of camera temperature variations on stereo-digital image correlation measurements. Appl. Opt. 54, 10089–10095 (2015). https://doi.org/10.1364/AO.54.010089.
  • [30] Adamczyk, M., Kamiński, M., Sitnik, R., Bogdan, A. & Karaszewski, M. Effect of temperature on calibration quality of structured-light three-dimensional scanners. Appl. Opt. 53, 5154 (2014). https://doi.org/10.1364/AO.53.005154.
  • [31] Adamczyk, M., Liberadzki, P. & Sitnik, R. Temperature compensation method for digital cameras in 2D and 3D measurement applications. Sensors 18, 1-17 (2018). https://doi.org/10.3390/s18113685.
  • [32] UI-6280SE-C-HQ Rev.3. https://en.ids-imaging.com/download-details/AB.0010.1.54800.24.html. Accessed: 2021-09-26.
  • [33] Factory Automation/Machine Vision-Fixed Focal. https://www.fujifilm.com/us/en/business/optical-devices/machine-vision-lens/hf-ha-1s-series. Accessed: 2021-09-26.
  • [34] Sitnik, R., Kujawinska, M. & Woznicki, J. M. Digital fringe projection system for large-volume 360-deg shape measurement. Opt. Eng. 41, 443-449 (2002). https://doi.org/10.1117/1.1430422.
  • [35] Sitnik, R. & Kujawińska, M. From cloud-of-point coordinates to three-dimensional virtual environment: the data conversion system. Opt. Eng. 41 (2002). https://doi.org/10.1117/1.1430419.
  • [36] Yoder, P. Opto-Mechanical Systems Design (CRC Press, 2005).
  • [37] Vukobratovich, D. & Yoder, P. Fundamentals of Optomechanics (CRC Press, 2018).
  • [38] Noda, T. Temperature compensation with DOE for zoom lens. In Proc. SPIE, 11106, 111060 (2019). https://doi.org/10.1117/12.2527944.
  • [39] Thermoelectric Cooled Camera - VP Series | Area Scan Camera | Vieworks. http://www.vieworks.com/eng/product.html?pid=32. Accessed: 2021-09-26.
  • [40] Akurat Lighting - S8 MARK2. https://www.akurat.lighting/en/products/soft-panels/s8-mark-2-strong-lenticular-130w-led-panel. Accessed: 2022-01-03.
  • [41] Lake, M. S. & Hachkowski, M. R. Design of Mechanisms for Deployable, Optical Instruments: Guidelines for Reducing Hysteresis. In NTRS - NASA Technical Reports Server, 20000032952 (2000). https://ntrs.nasa.gov/api/citations/20000032952/downloads/20000032952.pdf.
  • [42] UI-5282SE-C Rev.4. https://en.ids-imaging.com/store/ui-5282se-rev-4.html. Accessed: 2022-01-07.
Notes
1. This research was funded by CB POB FOTECH of Warsaw University of Technology within the Excellence Initiative: Research University (IDUB) program.
2. Record prepared with funds from MNiSW, agreement no. SONP/SP/546092/2022, under the programme "Społeczna odpowiedzialność nauki" (Social Responsibility of Science) - module: popularisation of science and promotion of sport (2024).
Document type
Bibliography
YADDA identifier
bwmeta1.element.baztech-f71f1c2f-6283-4acc-913e-f724ef2f4dfa