Article title
Content
Full texts:
Identifiers
Title variants
Publication languages
Abstracts
The precise location of the needle tip is critical in robot-assisted percutaneous needle interventions. An automatic needle tip measuring system based on binocular vision, offering non-contact operation, high accuracy and high stability, is designed and evaluated. First, the measurement requirements of the prostate intervention robot are introduced. A laser interferometer serves as the reference for measuring the needle tip position, whose relative variation over time is described as the needle tip distance. The parameters of the binocular cameras are obtained by Zhang’s calibration method. A robust needle tip extraction algorithm is then designed to detect the pixel coordinates of the needle tip without attaching marker points. Once the binocular cameras have completed stereo matching, the 3D coordinates of the needle tip are estimated. Measurement capability analysis (MCA) is used to evaluate the performance of the proposed system. The accuracy of the system is kept within 0.3621 mm. Agreement is assessed with Bland-Altman analysis, and the Pearson correlation coefficient is 0.999847. The P/T ratio in the repeatability analysis is 16.42%. The results indicate that the accuracy and stability of the binocular vision needle tip measuring system meet the requirements of needle tip measurement in percutaneous interventions.
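For orientation, the following minimal Python sketch (not the authors' code) illustrates the two computational steps summarized in the abstract: triangulating the needle-tip position from a calibrated stereo pair, and checking agreement of the resulting distances against a reference series with the Pearson correlation coefficient and Bland-Altman limits of agreement. The OpenCV/NumPy/SciPy usage and all variable names are illustrative assumptions; the calibration parameters would come from Zhang's method and the tip pixel coordinates from the paper's extraction algorithm, both treated here as given.

import numpy as np
import cv2
from scipy.stats import pearsonr

def triangulate_tip(K1, K2, R, T, tip_left, tip_right):
    # Left camera is the reference frame: P1 = K1 [I | 0], P2 = K2 [R | T].
    P1 = K1 @ np.hstack([np.eye(3), np.zeros((3, 1))])
    P2 = K2 @ np.hstack([R, T.reshape(3, 1)])
    # cv2.triangulatePoints expects 2xN pixel coordinates and returns 4xN homogeneous points.
    pts_h = cv2.triangulatePoints(
        P1, P2,
        np.asarray(tip_left, dtype=np.float64).reshape(2, 1),
        np.asarray(tip_right, dtype=np.float64).reshape(2, 1))
    return (pts_h[:3] / pts_h[3]).ravel()  # 3D tip position in the left-camera frame

def agreement_stats(vision_mm, reference_mm):
    # Pearson correlation plus Bland-Altman bias and 95% limits of agreement
    # between vision-based needle-tip distances and the interferometer reference.
    v = np.asarray(vision_mm, dtype=float)
    ref = np.asarray(reference_mm, dtype=float)
    r, _ = pearsonr(v, ref)
    diff = v - ref
    bias = diff.mean()
    half_width = 1.96 * diff.std(ddof=1)
    return r, bias, (bias - half_width, bias + half_width)

The P/T ratio reported for the repeatability study would additionally require the tolerance band and the measurement-system standard deviation from the MCA, which are not reproduced in this sketch.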
Journal
Year
Volume
Pages
495-512
Physical description
Bibliography: 35 items, figures, tables, charts, formulas
Authors
author
- Nanjing University of Aeronautics and Astronautics, State Key Laboratory of Mechanics and Control of Mechanical Structures, Nanjing 210016, China
author
- Nanjing University of Aeronautics and Astronautics, State Key Laboratory of Mechanics and Control of Mechanical Structures, Nanjing 210016, China
author
- Northwest A&F University, College of Mechanical and Electronic Engineering, Yangling 712100, China
author
- Nanjing University of Aeronautics and Astronautics, State Key Laboratory of Mechanics and Control of Mechanical Structures, Nanjing 210016, China
author
- Nanjing University of Aeronautics and Astronautics, State Key Laboratory of Mechanics and Control of Mechanical Structures, Nanjing 210016, China
- Pingdingshan University, Institute of Electrical and Mechanical Engineering, Pingdingshan 467000, China
Bibliography
- [1] Bray, F., Ferlay, J., Soerjomataram, I., Siegel, R.L., Torre, L.A., Jemal, A. (2018). Global cancer statistics 2018: GLOBOCAN estimates of incidence and mortality worldwide for 36 cancers in 185 countries. CA: A Cancer Journal for Clinicians, 68, 394-424.
- [2] Adam, A. (2002). Interventional radiology in the treatment of hepatic metastases. Cancer Treatment Reviews, 28(2), 93-99.
- [3] Aschoff, A.J., Merkle, E.M., Emancipator, S.N., Petersilge, C.A., Duerk, J.L., Lewin, J.S. (2002). Femur: MR imaging-guided radio-frequency ablation in a porcine model-feasibility study. Radiology, 225(2), 471-478.
- [4] Wang, Q., Wang, Z., Yao, Z., Forrest, J., Zhou, W. (2016). An improved measurement model of binocular vision using geometrical approximation. Measurement Science and Technology, 27(12), 125013.
- [5] McGuire, K., Croon, G.D., Wagter, C.D., Tuyls, K., Kappen, B. (2017). Efficient Optical Flow and Stereo Vision for Velocity Estimation and Obstacle Avoidance on an Autonomous Pocket Drone. IEEE Robotics & Automation Letters, 2(2), 1070-1076.
- [6] Sanchez-Rodriguez, J.P., Aceves-Lopez, A. (2018). A survey on stereo vision-based autonomous navigation for multi-rotor MUAVs. Robotica, 36(8), 1225-1243.
- [7] Li, X., Liu, W., Pan, Y., Ma, J., Wang, F. (2019). Binocular vision-based 3D method for detecting high dynamic and wide-range contouring errors of CNC machine tools. Measurement Science and Technology, 30(12), 125019.
- [8] Bellandi, P., Docchio, F., Sansoni, G. (2013). Roboscan: a combined 2D and 3D vision system for improved speed and flexibility in pick-and-place operation. The International Journal of Advanced Manufacturing Technology, 69(5), 1873-1886.
- [9] Luo, Z., Zhang, K., Wang, Z., Zheng, J., Chen, Y. (2017). 3D pose estimation of large and complicated workpieces based on binocular stereo vision. Applied Optics, 56(24), 6822-6836.
- [10] Xia, R., et al. (2020). An accurate and robust method for the measurement of circular holes based on binocular vision. Measurement Science and Technology, 31(2), 025006.
- [11] Pełczyński, P., Ostrowski, B., Rzeszotarski, D. (2012). Motion Vector Estimation of a Stereovision Camera with Inertial Sensors. Metrology and Measurement Systems, 19(1), 141-150.
- [12] Lin, G., Tang, Y., Zou, X., Xiong, J., Fang, Y. (2019). Color-, depth-, and shape-based 3D fruit detection. Precision Agriculture, 21, 1-17.
- [13] Tang, Y., Chen, M., Wang, C., Luo, L., Zou, X. (2020). Recognition and Localization Methods for Vision-Based Fruit Picking Robots: A Review. Frontiers in Plant Science, 11, 1-17.
- [14] Xiang, R., He, W., Zhang, X., Wang, D., Shan, Y. (2018). Size measurement based on a two-camera machine vision system for the bayonets of automobile brake pads. Measurement, 122, 106-116.
- [15] Li, J., Bennett, B.L., Karam, L.J., Pettinato, J.S. (2016). Stereo Vision Based Automated Solder Ball Height and Substrate Coplanarity Inspection. IEEE Transactions on Automation Science and Engineering, 13(2), 757-771.
- [16] Lin, Q., Cai, K., Yang, R., Chen, H., Wang, Z., Zhou, J. (2016). Development and Validation of a Near-Infrared Optical System for Tracking Surgical Instruments. Journal of Medical Systems, 40(107), 1-14.
- [17] Jiang, G., Luo, M., Bai, K. (2019). Optical positioning technology of an assisted puncture robot based on binocular vision. International Journal of Imaging Systems and Technology, 29(2), 180-190.
- [18] Zhou, Z., Wu, B., Duan, J., Zhang, X., Zhang, N., Liang, Z. (2017). Optical surgical instrument tracking system based on the principle of stereo vision. Journal of Biomedical Optics, 22(6), 065005.
- [19] Stamey, T.A., Freiha, F.S., McNeal, J.E., Redwine, E.A., Whittemore, A.S., Schmid, H.P. (1993). Localized prostate cancer. Relationship of tumor volume to clinical significance for the treatment of prostate cancer. Cancer, 71(S3), 933-938.
- [20] Stamey, T.A., McNeal, J.E., Freiha, F.S., Redwine, E. (1988). Morphometric and clinical studies on 68 consecutive radical prostatectomies. Journal of Urology, 139(6), 1235-1240.
- [21] Krieger, A., Song, S., Cho, N.B., Iordachita, I., Guion, P., Fichtinger, G. (2013). Development and Evaluation of an Actuated MRI-Compatible Robotic System for MRI-Guided Prostate Intervention. IEEE/ASME Transactions on Mechatronics, 18(1), 273-284.
- [22] Elhawary, H., Tse, Z.T.H., Rea, M., Zivanovic, A., Davies, B.L., Besant, C., Souza, N., McRobbie, D., Young, I., Lamperth, M.U. (2010). Robotic System for Transrectal Biopsy of the Prostate: Real-Time Guidance Under MRI. IEEE Engineering in Medicine and Biology Magazine, 29(2), 78-86.
- [23] Seifabadi, R., Song, S., Krieger, A., Cho, N.B., Tokuda, J., Fichtinger, G., Iordachita, I. (2012). Robotic system for MRI-guided prostate biopsy: feasibility of teleoperated needle insertion and ex vivo phantom study. International Journal of Computer Assisted Radiology and Surgery, 7(2), 181-190.
- [24] Chen, Y., Squires, A., Seifabadi, R., Xu, S., Agrawal, H., Bernardo, M., Pinto, P., Choyke, P., Wood, B., Tse, Z.T.H. (2017). Robotic System for MRI-Guided Focal Laser Ablation in the Prostate. IEEE/ASME Transactions on Mechatronics, 22(1), 107-114.
- [25] Zhang, Z. (2000). A flexible new technique for camera calibration. IEEE Transactions on Pattern Analysis and Machine Intelligence, 22(11), 1330-1334.
- [26] Raman, M., Aggarwal, H. (2009). Study and Comparison of Various Image Edge Detection Techniques. International Journal of Image Processing, 3(1), 1-11.
- [27] Zhang, X.H., Li, G., Li, C.L., Zhang, H., Zhao, J., Hou, Z.X. (2015). Stereo Matching Algorithm Based on 2D Delaunay Triangulation. Mathematical Problems in Engineering, 2015, 137191-137193.
- [28] Canny, J. (1986). A computational approach to edge detection. IEEE Transactions on Pattern Analysis and Machine Intelligence, 8(6), 679-698.
- [29] Lu, J., Cai, H., Lou, J., Li, J. (2007). An Epipolar Geometry-Based Fast Disparity Estimation Algorithm for Multiview Image and Video Coding. IEEE Transactions on Circuits and Systems for Video Technology, 17(6), 737-750.
- [30] Han, J. H., Park, J. S. (2000). Contour matching using epipolar geometry. IEEE Transactions on Pattern Analysis and Machine Intelligence, 22(4), 358-370.
- [31] Zhang, Z. (1998). Determining the Epipolar Geometry and its Uncertainty: A Review. International Journal of Computer Vision, 27(2), 161-195.
- [32] Schmid, C., Zisserman, A. (2000). The Geometry and Matching of Lines and Curves Over Multiple Views. International Journal of Computer Vision, 40(3), 199-233.
- [33] Dong, S.C. (2006). Measurement System Analysis: Theory, Method and Applications. China Metrology Press, 143-148.
- [34] Wu, Z.G. (2004). Measurement System Analysis. China Standards Press, 86-95.
- [35] Bland, J.M., Altman, D.G. (1986). Statistical methods for assessing agreement between two methods of clinical measurement. The Lancet, 1(8476), 307-310.
Notes
EN
1. This paper is supported by the National Natural Science Foundation of China (Grant No. 51975282).
PL
2. Record prepared using funds of the Ministry of Science and Higher Education (MNiSW), agreement No. 461252, under the programme "Social Responsibility of Science" - module: Popularisation of science and promotion of sport (2020).
Document type
Bibliography
YADDA identifier
bwmeta1.element.baztech-db1313ed-edfd-45ed-b87a-e4cc027d7595