Article title

Immersive feedback in fencing training using mixed reality

Authors
Content
Identifiers
Title variants
Publication languages
EN
Abstracts
EN
During sports training, providing athletes with real-time feedback based on the automatic analysis of motion is both useful and challenging. In this work, a novel system based on mixed reality is proposed and verified. The system provides immersive, real-time visual feedback in fencing training. Novel methods are introduced for tracking the blade in 3D from a single RGB camera, creating weapon-action models by recording the actions of a coach, and evaluating the trainee's performance against these models. Augmented reality glasses with see-through displays are employed, and a method for coordinate mapping between the virtual and real environments is proposed, which allows real-time visual cues and feedback to be provided by overlaying virtual trajectories on the real-world view. The system was verified experimentally in fencing bladework training under the supervision of a fencing coach. The results indicate that the proposed system allows novice fencers to perform their exercises more precisely.
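The evaluation step described in the abstract (comparing a trainee's blade trajectory against a recorded coach model) can be sketched in a few lines. The Python fragment below is a minimal illustration, not the authors' implementation: the trajectory format (a sequence of 3D blade-tip positions), the arc-length resampling, and the 5 cm threshold are all assumptions made for this example.

import numpy as np

def resample(trajectory, n=50):
    # Resample a (k, 3) point sequence to n points, uniform in arc length.
    deltas = np.linalg.norm(np.diff(trajectory, axis=0), axis=1)
    s = np.concatenate([[0.0], np.cumsum(deltas)])  # cumulative arc length
    targets = np.linspace(0.0, s[-1], n)
    return np.stack([np.interp(targets, s, trajectory[:, d]) for d in range(3)], axis=1)

def trajectory_error(trainee, coach):
    # Mean point-wise distance (in metres) after resampling both trajectories.
    a, b = resample(trainee), resample(coach)
    return float(np.mean(np.linalg.norm(a - b, axis=1)))

# Usage: flag an attempt whose mean deviation from the coach model exceeds 5 cm.
coach_model = np.cumsum(np.random.randn(80, 3) * 0.01, axis=0)  # stand-in data
attempt = coach_model[::2] + np.random.randn(40, 3) * 0.005
if trajectory_error(attempt, coach_model) > 0.05:
    print("repeat the exercise")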
Publisher
Journal
Year
Volume
Pages
37–62
Physical description
Bibliography: 35 items, figures, tables
Contributors
  • AGH University of Science and Technology, Institute of Computer Science, Krakow, Poland
Notes
Record created with funding from the Ministry of Education and Science (MEiN), agreement no. SONP/SP/546092/2022, under the "Social responsibility of science" programme, module: Popularisation of science and promotion of sport (2022-2023).
Document type
YADDA identifier
bwmeta1.element.baztech-7d14039e-94d8-45d5-87a3-57dbebd03e7b