Article title
Authors
Content
Full texts:
Identifiers
Title variants
Publication languages
Abstracts
Folk dances, integral components of intangible cultural heritage (ICH), are both fleeting and fragile. With the rapid advancement of computer vision techniques, however, an opportunity arises to document and safeguard these cultural expressions for future generations. This study aims to identify the distinctive dance sequences and characteristics of Zeibekiko, a popular Greek folk solo dance found in variations across Greece, Cyprus, and the Aegean region of Asia Minor, and to translate them into a virtual 3D environment. Precise recordings of the Zeibekiko dance were obtained with a state-of-the-art optical motion capture system featuring active markers (the PhaseSpace X2E system). The three-dimensional spatial data derived from the dancer's movements serve as the foundation for classification, accomplished with a Spatial Temporal Graph Convolutional Network with Multi Attention Modules (ST-GCN-MAM). This architecture strategically employs attention modules to extract key features of the dance from the primary regions of the upper and lower parts of the human body. The proposed tool detected and recognized Zeibekiko sequences with high accuracy. Ensuring the precise alignment of captured points with the corresponding bones or anatomical features of the 3D dancer model is essential for seamless and authentic animation. Advanced visualization and animation techniques then translate these points into smooth, realistic character movements, preserving their inherent dynamics and expression. As a result, a faithful virtual rendition of the dance is achieved, capturing its authenticity and beauty. Such a solution holds potential applications in gaming, video production, and virtual museum exhibits dedicated to showcasing folk dances.
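As an illustration of the classification stage summarized in the abstract, the sketch below shows one spatial-temporal graph convolution block combined with a channel-attention module, written in PyTorch. It is not the authors' published ST-GCN-MAM implementation; the marker count, channel sizes, learnable adjacency matrix, and squeeze-and-excitation style attention are assumptions made purely for the example.

```python
# Minimal, illustrative sketch of a spatial-temporal graph convolution block
# with a channel-attention module (not the authors' published code).
import torch
import torch.nn as nn

class STGCNAttentionBlock(nn.Module):
    def __init__(self, in_channels, out_channels, num_joints, kernel_t=9):
        super().__init__()
        # Learnable adjacency over the skeleton graph (assumed fully learnable here).
        self.adjacency = nn.Parameter(torch.eye(num_joints))
        self.spatial_conv = nn.Conv2d(in_channels, out_channels, kernel_size=1)
        pad = (kernel_t - 1) // 2
        self.temporal_conv = nn.Conv2d(out_channels, out_channels,
                                       kernel_size=(kernel_t, 1), padding=(pad, 0))
        # Squeeze-and-excitation style channel attention (an assumed stand-in
        # for the multi attention modules described in the abstract).
        self.attention = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),
            nn.Conv2d(out_channels, out_channels // 4, kernel_size=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(out_channels // 4, out_channels, kernel_size=1),
            nn.Sigmoid(),
        )
        self.relu = nn.ReLU(inplace=True)

    def forward(self, x):
        # x: (batch, channels, frames, joints), e.g. 3D joint coordinates over time.
        x = self.spatial_conv(x)
        x = torch.einsum('nctv,vw->nctw', x, self.adjacency)  # graph convolution
        x = self.temporal_conv(x)                              # temporal convolution
        x = x * self.attention(x)                              # re-weight channels
        return self.relu(x)

# Example: a clip of 300 frames with an assumed 38-joint skeleton in 3D.
clip = torch.randn(1, 3, 300, 38)
block = STGCNAttentionBlock(in_channels=3, out_channels=64, num_joints=38)
print(block(clip).shape)  # torch.Size([1, 64, 300, 38])
```

In a full classifier, several such blocks would be stacked and followed by global pooling and a softmax layer over the Zeibekiko sequence labels; separate attention branches could be applied to upper-body and lower-body joint subsets, as the abstract suggests.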
Publisher
Year
Volume
Pages
217-227
Physical description
Bibliography: 43 items, figures, tables
Contributors
- Department of Computer Science, Lublin University of Technology, Nadbystrzycka 36B, 20-618 Lublin, Poland
author
- Department of Computer Science, Lublin University of Technology, Nadbystrzycka 36B, 20-618 Lublin, Poland
author
- Department of Computer Science, Lublin University of Technology, Nadbystrzycka 36B, 20-618 Lublin, Poland
author
- Department of Computer Science, Lublin University of Technology, Nadbystrzycka 36B, 20-618 Lublin, Poland
author
- University of Cyprus, 75 Kallipoleos Str., 1678 Nicosia, Cyprus
Bibliography
- 1. Aristidou A., Chalmers A., Chrysanthou Y., Loscos C., Multon F., Parkins J.E., Sarupuri B., Stavrakis E. Safeguarding our Dance Cultural Heritage. Eurographics Tutorials, April 26, 2022.
- 2. Guo Y., Wang X. A Spatiotemporal data acquisition and processing method for Ansai waist drum based on motion capture. In: E3S Web of Conferences. 2020, 179, 01024.
- 3. Reshma M.R., Kannan B., Jagathy Raj V.P., Shailesh S. Cultural heritage preservation through dance digitization: A review. Digit. Appl. Archaeol. Cult. Herit. 2023, 28, e00257. https://doi.org/10.1016/j.daach.2023.e00257.
- 4. Ami-Williams T., Serghides C.-G., Aristidou A. Digitizing Traditional Dances Under Extreme Clothing: The Case Study of Eyo. J. Cult. Herit. 2024, 67, 145-157.
- 5. Kico I., Zelníček D., Liarokapis F. Assessing the learning of folk dance movements using immersive virtual reality. In: 2020 24th International Conference Information Visualisation (IV). IEEE. 2020, 587-59.
- 6. Jie G., Chaetnalao A. Application of motion capture technology in the digital recreation of Tang Dynasty mural art: a case study of Han Xiu’s Tomb. J. Arts Technol. 2023, 14(4), 298-319.
- 7. Peng Y. Research on dance teaching based on motion capture system. Math. Probl. Eng., 2022, 1-8. https://doi.org/10.1155/2022/1455849.
- 8. Sun K. Research on Dance Motion Capture Technology for Visualization Requirements. Sci. Program. 2022, 1–8. https://doi.org/10.1155/2022/2062791.
- 9. Aristidou A., Shamir A., Chrysanthou Y. Digital dance ethnography: Organizing large dance collections. J. Cult. Herit. 2019, 12, 1-27.
- 10. Aristidou A., Andreou N., Charalambous L., Yiannakidis A., Chrysanthou Y. Virtual Dance Museums: the case of Greek/Cypriot folk dancing. In: EUROGRAPHICS Workshop on Graphics and Cultural Heritage, GCH’21, Bournemouth, United Kingdom, November 2021.
- 11. Hariharan D., Acharya T., Mitra S. Recognizing hand gestures of a dancer. In: Pattern Recognition and Machine Intelligence: 4th International Conference, PReMI 2011, Moscow, Russia, June 27-July 1, 2011. Proceedings 4, 186-192. Springer Berlin Heidelberg.
- 12. Saha S., Ghosh L., Konar A., Janarthanan R. Fuzzy L membership function based hand gesture recognition for Bharatanatyam dance. In: 2013 5th International Conference on Computational Intelligence and Communication Networks. 2013, 331-335. IEEE.
- 13. Anbarsanti N., Prihatmanto A. S. Dance modelling, learning and recognition system of aceh traditional dance based on hidden Markov model. In: 2014 International Conference on Information Technology Systems and Innovation (ICITSI). 2014, 86-92. IEEE.
- 14. Ofli F., Erzin E., Yemez Y., Tekalp A.M., Erdem C.E., Erdem A.T., Abaci T., Ozkan M.K. Unsupervised dance figure analysis from video for dancing avatar animation. In: 2008 15th IEEE International Conference on Image Processing (ICIP 2008), IEEE, San Diego, CA, 1484-1487. https://doi.org/10.1109/ICIP.2008.4712047.
- 15. Cai W., Liu W. Innovative Strategies of Virtual Reality Technology in Ethnic Dance Inheritance. Appl. Math. Nonlinear Sci. 2024, 9(1).
- 16. Chaudhry H., Tabia K., Rahim S. A., BenFerhat S. Automatic annotation of traditional dance data using motion features. In: 2017 International Conference on Digital Arts, Media and Technology (ICDAMT). 2017, 254-258. IEEE.
- 17. Li M., Miao Z., Lu Y. LabanFormer: Multi-scale graph attention network and transformer with gated recurrent positional encoding for labanotation generation. Neurocomputing. 2023, 539, 126203.
- 18. Ma-Thi C., Tabia K., Lagrue S., Le-Thanh H., Bui-The D., Nguyen-Thanh T. Annotating movement phrases in Vietnamese folk dance videos. In: Advances in Artificial Intelligence: From Theory to Practice: 30th International Conference on Industrial Engineering and Other Applications of Applied Intelligent Systems, IEA/AIE 2017, Arras, France, June 27-30, 2017, Proceedings, Part II 30. 2017, 3-11. Springer International Publishing.
- 19. Samanta S., Purkait P., Chanda B. Indian classical dance classification by learning dance pose bases. In: 2012 IEEE Workshop on the Applications of Computer Vision (WACV). 2012, 265-270. IEEE.
- 20. Zhang N. Identification Model of Writhing Posture of Classical Dance Based on Motion Capture Technology and Few-Shot Learning. Comput. Intell. Neurosci. 2022, 1, 8239905.
- 21. Mindoro J. N., Festijo E. D., de Guzman M. T. G. A comparative study of deep transfer learning techniques for cultural (aeta) dance classification utilizing skeleton-based choreographic motion capture data. In: 2021 International Conference on Computational Intelligence and Knowledge Economy (ICCIKE). 2021, 74-79. IEEE.
- 22. Stavrakis E., Aristidou A., Savva M., Himona S.L., Chrysanthou Y. Digitization of Cypriot folk dances. In: Progress in Cultural Heritage Preservation: 4th International Conference, EuroMed 2012, Limassol, Cyprus, October 29–November 3, 2012. 404-413. Springer Berlin Heidelberg.
- 23. Nowomiejska K., Powroznik P., Skublewska-Paszkowska M., Adamczyk K., Concilio M., Sereikaite L., Zemaitiene R., Toro M.D., Rejdak R. Residual Attention Network for distinction between visible optic disc drusen and healthy optic discs. Opt. Lasers Eng. 2024, 176, 108056.
- 24. Skublewska-Paszkowska M., Powroznik P., Lukasik E. Attention Temporal Graph Convolutional Network for Tennis Groundstrokes Phases Classification. In: 2022 IEEE International Conference on Fuzzy Systems (FUZZ-IEEE), July 2022, 1-8. IEEE.
- 25. Liu S., Dai S., Sun J., Mao T., Zhao J., Zhang H. Multicomponent Spatial-Temporal Graph Attention Convolution Networks for Traffic Prediction with Spatially Sparse Data. Comput. Intell. Neurosci. 2021, 1, 9134942.
- 26. Bai J., Zhu Y., Song Y., Zhao L., Hou Z., Du R., Li H. A3T-GCN: Attention Temporal Graph Convolutional Network for Traffic Forecasting. ISPRS Int. J. Geo-Inf. 2021, 10, 485.
- 27. Skublewska-Paszkowska M., Powroznik P. Temporal pattern attention for multivariate time series of tennis strokes classification. Sensors. 2023, 23(5), 2422.
- 28. Skublewska-Paszkowska M., Powroznik P., Lukasik E. Attention Temporal Graph Convolutional Network for Tennis Groundstrokes Phases Classification. In: 2022 IEEE International Conference on Fuzzy Systems (FUZZ-IEEE). 2022, 1-8. IEEE.
- 29. Xiao J.L., Ye H., He X., Zhang H., Wu F., Chua T. Attentional Factorization Machines: Learning the Weight of Feature Interactions via Attention Networks. arXiv 2017, arXiv:1708.04617.
- 30. Pappas N., Popescu-Belis A. Multilingual Hierarchical Attention Networks for Document Classification. arXiv 2017, arXiv:1707.00896.
- 31. Bahdanau D., Cho K., Bengio Y. Neural Machine Translation by Jointly Learning to Align and Translate. arXiv 2014, arXiv:1409.0473.
- 32. Wu Z., Pan S., Long G., Jiang J., Zhang C. Graph WaveNet for deep spatial-temporal graph modeling. arXiv 2019, arXiv:1906.00121.
- 33. Fang S., Zhang Q., Meng G., Xiang S., Pan C. GST-Net: Global spatial-temporal network for traffic flow prediction. In: IJCAI, August 2019, 2286-2293.
- 34. Powroznik P., Czerwinski D. Spectral methods in Polish emotional speech recognition. Adv. Sci. Technol. Res. J. 2016, 10(32), 73-81.
- 35. Skublewska-Paszkowska M., Powroznik P., Barszcz M., Dziedzic K. Dual Attention Graph Convolutional Neural Network to Support Mocap Data Animation. Adv. Sci. Technol. Res. J. 2023, 17(5), 313-325.
- 36. Miao Y. Dance pose capture and recognition based on heterogeneous sensors. Procedia Comput. Sci. 2023, 228, 171-184. https://doi.org/10.1016/j.procs.2023.11.021.
- 37. Qianwen L. Application of motion capture technology based on wearable motion sensor devices in dance body motion recognition. Meas.: Sens. 2024, 32, 101055.
- 38. Al-Faris M., Chiverton J., Ndzi D., Ahmed A.I. A Review on Computer Vision-Based Methods for Human Action Recognition. J. Imaging 2020, 6, 46. https://doi.org/10.3390/jimaging6060046.
- 39. Gupta S., Singh S. Indian dance classification using machine learning techniques: A survey. Entertain. Comput. 2024, 50, 100639. https://doi.org/10.1016/j.entcom.2024.100639.
- 40. Muhamada A.W., Mohammed A.A. Review on recent Computer Vision Methods for Human Action Recognition. ADCAIJ. 2022, 10, 361-379. https://doi.org/10.14201/ADCAIJ2021104361379.
- 41. Patrona F., Chatzitofis A., Zarpalas D., Daras P. Motion analysis: Action detection, recognition and evaluation based on motion capture data. Pattern Recognit. 2018, 76, 612-622. https://doi.org/10.1016/j.patcog.2017.12.007.
- 42. Wang M., Yu R. Digital production and realization for traditional dance movements based on Motion Capture Technology. The Frontiers of Society, Science and Technology. 2022, 4(11).
- 43. Eldar R., Fisher-Gewirtzman D. Ergonomic design visualization mapping: developing an assistive model for design activities. Int. J. Ind. Ergon. 2019, 74, 102859.
Notes
Record developed with funds of the Ministry of Science and Higher Education (MNiSW), agreement no. POPUL/SP/0154/2024/02, under the programme "Społeczna odpowiedzialność nauki II" (Social Responsibility of Science II), module: Popularisation of Science (2025).
Document type
Bibliography
YADDA identifier
bwmeta1.element.baztech-bb12a427-e144-4d18-85b3-8ced87ee4351