Article title

Real-time camera pose estimation based on volleyball court view

Publication languages
EN
Abstract
EN
The use of technology in sports has increased in recent years. Among the most influential of these technologies are referee support systems. Team sports such as volleyball require accurate and robust tracking systems that affect neither the players nor the court. This paper introduces the application of intrinsic and extrinsic camera calibration in a 12-camera volleyball referee system. Intrinsic parameters are calculated using the classic pinhole model and Zhang's method. To perform extrinsic calibration in real time, the volleyball court is treated as a global calibration artifact. Calibration keypoints are defined as court-line intersections. In addition, a new keypoint detection algorithm is proposed. It enables accurate estimation of the camera pose with respect to the court. With all 12 cameras calibrated in a common coordinate system, stereo camera pairs can be created dynamically. Therefore, with known 2D image coordinates of the ball, its real 3D coordinates can be reconstructed and its trajectory estimated. The performance of the proposed method is tested on a synthetic data set rendered in 3ds Max and in real-data scenarios. The mean camera pose error calculated for data biased with keypoint detection errors is approximately equal to 0.013% of the measurement volume. In the real-data experiment with a human hand phantom, the presence of the phantom can be determined on the basis of the ball's reflection attitude.
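The stereo reconstruction step summarized in the abstract, recovering a 3D ball position from its 2D image coordinates in two calibrated cameras, can be sketched with a standard linear (DLT) triangulation. This is not the paper's implementation; the intrinsic matrix, camera poses, and point below are illustrative values only.

```python
import numpy as np

def triangulate_point(P1, P2, x1, x2):
    """Linear (DLT) triangulation of one 3D point from two views.

    P1, P2 : 3x4 camera projection matrices (intrinsics @ [R | t]).
    x1, x2 : 2D pixel coordinates of the same point in each view.
    Returns the 3D point in the common (court) coordinate system.
    """
    # Each view contributes two linear equations in the homogeneous point X.
    A = np.array([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    # The solution is the right singular vector of the smallest singular value.
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X[:3] / X[3]

def project(P, X):
    """Project a 3D point with camera matrix P; returns pixel coordinates."""
    x = P @ np.append(X, 1.0)
    return x[:2] / x[2]

# Toy stereo pair: two cameras with identical (assumed) intrinsics,
# offset horizontally, observing a point above the court.
K = np.array([[1000.0, 0.0, 640.0],
              [0.0, 1000.0, 360.0],
              [0.0, 0.0, 1.0]])
Rt1 = np.hstack([np.eye(3), np.array([[0.0], [0.0], [5.0]])])
Rt2 = np.hstack([np.eye(3), np.array([[-1.0], [0.0], [5.0]])])
P1, P2 = K @ Rt1, K @ Rt2

X_true = np.array([0.3, -0.2, 2.0])
X_rec = triangulate_point(P1, P2, project(P1, X_true), project(P2, X_true))
print(np.allclose(X_rec, X_true, atol=1e-6))  # True
```

With noiseless synthetic observations the linear system has an exact null vector, so the reconstruction matches the ground truth to machine precision; with real detections the same least-squares formulation yields the minimizer of the algebraic error.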
Pages
202–212
Physical description
Bibliography: 53 items, figures, charts, tables
Authors
author
  • Faculty of Mechatronics, Warsaw University of Technology, 8 Św. Andrzeja Boboli St., Warsaw 02-525, Poland
author
  • Faculty of Mechatronics, Warsaw University of Technology, 8 Św. Andrzeja Boboli St., Warsaw 02-525, Poland
author
  • Faculty of Mechatronics, Warsaw University of Technology, 8 Św. Andrzeja Boboli St., Warsaw 02-525, Poland
author
  • Faculty of Mechatronics, Warsaw University of Technology, 8 Św. Andrzeja Boboli St., Warsaw 02-525, Poland
Bibliography
  • [1] J.M. Frahm, K. Köser, R. Koch, Pose estimation for multi-camera systems, in: Joint Pattern Recognition Symposium, Springer, Berlin, Heidelberg, 2004, 286–293.
  • [2] K.W. Nam, J. Park, I.Y. Kim, K.G. Kim, Application of stereo-imaging technology to medical field, J. Healthc. Inf. Res. 18 (3) (2012) 158–163.
  • [3] M. Witkowski, R. Sitnik, M. Kujawińska, W. Rapp, M. Kowalski, B. Haex, S. Mooshake, 4D measurement system for automatic location of anatomical structures, Biophotonics and New Therapy Frontiers, SPIE 6191 (2006) 61910H.
  • [4] M. Achtelik, T. Zhang, K. Kuhnlenz, M. Buss, Visual tracking and control of a quadcopter using a stereo camera system and inertial sensors, IEEE Int. Conf. Robot Autom., 2009.
  • [5] B. Bal, G. Dureja, Hawkeye: a logical innovative technology use in sports for effective decision making, Sport Sci. Rev. 21 (1-2) (2012) 107–119.
  • [6] G. Bleser, H. Wuest, D. Stricker, Online camera pose estimation in partially known and dynamic scenes, in: IEEE/ACM ISMAR, 2006, 56–65.
  • [7] S. Ohayon, E. Rivlin, Robust 3D head tracking using camera pose estimation, ICPR, 1, IEEE, 2006, 1063–1066.
  • [8] V. Lepetit, P. Lagger, P. Fua, Randomized Trees for Real-Time Keypoint Recognition, IEEE 2, 2007, 775–781.
  • [9] Wikipedia Page, Goal Line Technology, 2017.
  • [10] V-challenge TDS International System Webpage, 2019 (Accessed 20 February 2018) http://www.tdsinternational.eu/about-us.html.
  • [11] P. Tan, Y. Li, Z.Y. Huang, Feasibility analysis of “Hawkeye” technique employed in volleyball competition, J. Mianyang Normal Univ. 8 (2010) 028.
  • [12] D. Scaramuzza, A. Harati, R. Siegwart, Extrinsic self calibration of a camera and a 3d laser range finder from natural scenes, IEEE/RSJ IROS, 2007, 4164–4169.
  • [13] B. Triggs, Detecting Keypoints with Stable Position, Orientation, and Scale under Illumination Changes. ECCV, Springer, Berlin, Heidelberg, 2004, 100–113.
  • [14] C.G. Harris, M. Stephens, A combined corner and edge detector, AVC 15 (50) (1988) 10–5244.
  • [15] H. Zhou, Y. Yuan, C. Shi, Object tracking using SIFT features and mean shift, CVIU 113 (3) (2009) 345–352.
  • [16] S. Leutenegger, M. Chli, R. Siegwart, BRISK: Binary Robust Invariant Scalable Keypoints, IEEE ICCV, 2011, 2548–2555.
  • [17] R. Brinkmann, The Art and Science of Digital Compositing: Techniques for Visual Effects, Animation and Motion Graphics, Morgan Kaufmann, 2008.
  • [18] D.G. Lowe, Distinctive image features from scale-invariant keypoints, IJCV 60 (2) (2004) 91–110.
  • [19] V. Lepetit, F. Moreno-Noguer, P. Fua, EPnP: efficient perspective-n-point camera pose estimation, IJCV 81 (2009) 155–166.
  • [20] F. Moreno-Noguer, V. Lepetit, P. Fua, Accurate Non-Iterative O(n) Solution to the PnP Problem, IEEE ICCV, 2007, 1–8.
  • [21] C.P. Lu, G.D. Hager, E. Mjolsness, Fast and globally convergent pose estimation from video images, IEEE Trans. Pattern Anal. Mach. Intell. 22 (6) (2000) 610–622.
  • [22] L. Quan, Z. Lan, Linear n-point camera pose determination, IEEE Trans. Pattern Anal. Mach. Intell. 21 (8) (1999) 774–780.
  • [23] V. Sharma, P.C. Barnum, U.S. Patent No. 9,237,340, U.S. Patent and Trademark Office, Washington, DC, 2016.
  • [24] D.L. Ruderman, W. Bialek, Statistics of natural images: scaling in the woods, Adv. Neural Inf. Process. Syst. (1994) 551–558.
  • [25] S. Belongie, J. Malik, J. Puzicha, Shape matching and object recognition using shape contexts, IEEE Trans. Pattern Anal. Mach. Intell. 4 (2002) 509–522.
  • [26] B.K. Horn, B.G. Schunck, Determining optical flow, Artif. Intell. 17 (1-3) (1981) 185–203.
  • [27] A. Litvin, J. Konrad, W.C. Karl, Probabilistic video stabilization using Kalman filtering and mosaicing, Int. Society for Optics and Photonics 5022 (2003) 663–675.
  • [28] J.J. Moré, The Levenberg-marquardt Algorithm: Implementation and Theory. Numer. Anal, Springer, Berlin, Heidelberg, 1978, 105–116.
  • [29] M.A. Fischler, R.C. Bolles, Random sample consensus: a paradigm for model fitting with applications to image analysis and automated cartography, Commun. ACM 24 (6) (1981) 381–395.
  • [30] D. Svedberg, S. Carlsson, Calibration, pose and novel views from single images of constrained scenes, Pattern. Recognit. Lett. 21 (13-14) (2000) 1125–1133.
  • [31] C. Colombo, D. Comanducci, A. Del Bimbo, Camera Calibration with Two Arbitrary Coaxial Circles. ECCV, Springer, Berlin, Heidelberg, 2006, 265–276.
  • [32] A.J. Davison, I.D. Reid, N.D. Molton, O. Stasse, MonoSLAM: real-time single camera SLAM, IEEE Trans. Pattern Anal. Mach. Intell. 6 (2007) 1052–1067.
  • [33] C. Tomasi, T. Kanade, Shape and motion from image streams under orthography: a factorization method, IJCV 9 (2) (1992) 137–154.
  • [34] Official FIVB Rules for Volleyball, 2019 (Accessed 20 February 2018) http://www.fivb.org/EN/Refereeing-Rules/documents/FIVB-Volleyball_Rules2013-EN_20121214.pdf.
  • [35] D. Ding, C. Lee, K.Y. Lee, An Adaptive Road ROI Determination Algorithm for Lane Detection (TENCON 2013), IEEE, 2013, 1–4.
  • [36] L.A. Fernandes, M.M. Oliveira, Real-time line detection through an improved Hough transform voting scheme, Pattern Recogn. 41 (1) (2008) 299–314.
  • [37] E. Rosten, T. Drummond, Machine Learning for High-speed Corner Detection. ECCV, Springer, Berlin, Heidelberg, 2006, 430–443.
  • [38] F. Mokhtarian, R. Suomela, Curvature scale space for robust image corner detection, Proc. ICPR (Cat. No. 98EX170) 2 (1998) 1819–1821.
  • [39] J. Illingworth, J. Kittler, A survey of the Hough transform, Comput. Gr. Image Process. 44 (1) (1988) 87–116.
  • [40] A. Chambolle, R.A. De Vore, N.Y. Lee, B.J. Lucier, Nonlinear wavelet image processing: variational problems, compression, and noise removal through wavelet shrinkage, IEEE Trans. Image Process. 7 (3) (1998) 319–335.
  • [41] M. Axholt, Pinhole Camera Calibration in the Presence of Human Noise (Doctoral Dissertation), Linköping University Electronic Press, 2011.
  • [42] Z. Zhang, A flexible new technique for camera calibration, IEEE Trans. Pattern Anal. Mach. Intell. 22 (11) (2000) 1330–1334.
  • [43] J. Heikkila, Geometric camera calibration using circular control points, IEEE Trans. Pattern Anal. Mach. Intell. 22 (10) (2000) 1066–1077.
  • [44] K. Szelag, G. Maczkowski, R. Gierwialo, A. Gebarska, R. Sitnik, Robust geometric, phase and colour structured light projection system calibration, Opto-Electron Rev. 25 (4) (2017) 326–336.
  • [45] R. Sitnik, M. Kujawinska, W. Załuski, 3DMADMAC system: optical 3D shape acquisition and processing path for VR applications, Optical Methods for Arts and Archaeology, SPIE 5857 (2005) 58570E.
  • [46] X. Mei, S. Yang, J. Rong, X. Ying, S. Huang, H. Zha, Radial Lens Distortion Correction Using Cascaded One-parameter Division Model, ICIP, IEEE, 2015, 3615–3619.
  • [47] Y. Tang, Y. Li, J. Luo, Parametric distortion-adaptive neighborhood for omnidirectional camera, Appl. Opt. 54 (23) (2015) 6969–6978.
  • [48] Y.I. Abdel-Aziz, H.M. Karara, M. Hauck, Direct linear transformation from comparator coordinates into object space coordinates in close-range photogrammetry, Photogramm. Eng. Remote Sens. 81 (2) (2015) 103–107.
  • [49] X.S. Gao, X.R. Hou, J. Tang, H.F. Cheng, Complete solution classification for the perspective-three-point problem, IEEE Trans. Pattern Anal. Mach. Intell. 25 (8) (2003) 930–943.
  • [50] S. Li, C. Xu, M. Xie, A robust O(n) solution to the perspective-n-point problem, IEEE Trans. Pattern Anal. Mach. Intell. 34 (7) (2012) 1444–1450.
  • [51] D.W. Marquardt, An algorithm for least-squares estimation of nonlinear parameters, J. Soc. Ind. Appl. Math. 11 (2) (1963) 431–441.
  • [52] R. Hartley, J. Trumpf, Y. Dai, H. Li, Rotation averaging, IJCV 103 (3) (2013) 267–305.
  • [53] Wikipedia page, Coordinate Measuring Machine, 2017.
Notes
Record prepared under agreement No. 509/P-DUN/2018 from the funds of the Ministry of Science and Higher Education (MNiSW) allocated to science dissemination activities (2019).
Document type
Bibliography
YADDA identifier
bwmeta1.element.baztech-4096849a-a86e-4e97-a803-cc5169962a70