2022 | Vol. 42, no. 1 | 325--340
Article title

EMGHandNet: A hybrid CNN and Bi-LSTM architecture for hand activity classification using surface EMG signals

Title variants
Publication languages
EN
Abstracts
EN
Recently, Convolutional Neural Networks (CNNs) have been used to classify hand activities from surface electromyography (sEMG) signals. However, the sEMG signal exhibits spatial sparsity, due to the placement of the electrodes over the hand muscles, and temporal dependency, because each activity is performed over a period of time. A CNN can extract spatial features but is limited in capturing temporal dependencies, whereas a Long Short-Term Memory (LSTM) network encodes the temporal relations in sequential data. Hence, in this paper, we propose EMGHandNet, a hybrid CNN and Bidirectional LSTM (Bi-LSTM) architecture that encodes the inter-channel and temporal dependencies of sEMG signals for hand activity classification. First, CNN layers extract deep features from the sEMG signals; these feature maps are then processed by the Bi-LSTM, which extracts sequential information in both the forward and backward directions. Thus, the proposed model learns both inter-channel and bidirectional temporal information in an end-to-end manner. The model is trained and tested on five benchmark datasets: NinaPro DB1, NinaPro DB2, NinaPro DB4, BioPatRec DB2 and UCI Gesture. The average classification accuracies for NinaPro DB1, NinaPro DB2, NinaPro DB4 and UCI Gesture are 95.77%, 95.9%, 91.65% and 98.33%, respectively, corresponding to improvements of 4.42%, 12.2%, 18.65% and 1.33% over the respective state-of-the-art models. Moreover, for the BioPatRec DB2 dataset, a comparable performance (91.29%) is observed. The experimental results and comparisons confirm the superiority of the proposed model for hand activity classification from sEMG signals.
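
To make the described CNN-then-Bi-LSTM pipeline concrete, below is a minimal PyTorch sketch of a hybrid classifier for fixed-length sEMG windows. It is not the authors' implementation: the layer sizes, the 10-channel 200-sample input window, and the 52-class output are illustrative assumptions only.

import torch
import torch.nn as nn


class HybridCnnBiLstm(nn.Module):
    # Illustrative CNN + Bi-LSTM classifier for sEMG windows; all sizes are placeholder assumptions.
    def __init__(self, n_channels=10, n_classes=52, cnn_features=64, lstm_hidden=128):
        super().__init__()
        # 1-D convolutions over time extract local feature maps jointly across all sEMG channels.
        self.cnn = nn.Sequential(
            nn.Conv1d(n_channels, 32, kernel_size=5, padding=2),
            nn.BatchNorm1d(32),
            nn.ReLU(),
            nn.MaxPool1d(2),
            nn.Conv1d(32, cnn_features, kernel_size=5, padding=2),
            nn.BatchNorm1d(cnn_features),
            nn.ReLU(),
            nn.MaxPool1d(2),
        )
        # A bidirectional LSTM reads the CNN feature sequence forward and backward
        # to capture temporal dependencies in both directions.
        self.bilstm = nn.LSTM(
            input_size=cnn_features,
            hidden_size=lstm_hidden,
            batch_first=True,
            bidirectional=True,
        )
        self.classifier = nn.Linear(2 * lstm_hidden, n_classes)

    def forward(self, x):
        # x: (batch, n_channels, n_samples) raw sEMG window
        feats = self.cnn(x)              # (batch, cnn_features, time')
        feats = feats.permute(0, 2, 1)   # (batch, time', cnn_features)
        out, _ = self.bilstm(feats)      # (batch, time', 2 * lstm_hidden)
        last = out[:, -1, :]             # features at the last time step as a simple sequence summary
        return self.classifier(last)     # class logits

if __name__ == "__main__":
    model = HybridCnnBiLstm()
    window = torch.randn(8, 10, 200)     # batch of 8 dummy 10-channel, 200-sample windows
    print(model(window).shape)           # torch.Size([8, 52])

Training such a model end-to-end would use a standard cross-entropy loss over the logits; the structure mirrors the abstract's description, while every numeric choice above is a placeholder rather than a value from the paper.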
Publisher

Year
Pages
325--340
Physical description
Bibliography: 63 items, figures, tables, charts
Authors
  • Bio-Signal Analysis Group, Department of Electronics and Communication Engineering, Indian Institute of Information Technology, Sri City, Andhra Pradesh, India
  • Computer Vision and Biometrics Laboratory (CVBL), Department of Information Technology, Indian Institute of Information Technology, Allahabad, Uttar Pradesh, India
  • Bio-Signal Analysis Group, Department of Electronics and Communication Engineering, Indian Institute of Information Technology, Sri City 517646, Andhra Pradesh, India, anish.turlapaty@iiits.in
  • Visualizations and Computing Advanced Research Center (ViCAR), Department of Computational Data Science and Engineering, North Carolina A and T State University, Greensboro, NC, United States
Bibliography
  • [1] Guo W, Yao P, Sheng X, Zhang D, Zhu X. An enhanced human-computer interface based on simultaneous sEMG and NIRS for prostheses control. In: 2014 IEEE International Conference on Information and Automation (ICIA). p. 204–7.
  • [2] Fan Y, Yin Y. Active and progressive exoskeleton rehabilitation using multisource information fusion from EMG and force-position EPP. IEEE Trans Biomed Eng. 2013;60 (12):3314–21.
  • [3] Li Y, Chen X, Zhang X, Wang K, Wang ZJ. A sign-component-based framework for Chinese sign language recognition using accelerometer and sEMG data. IEEE Trans Biomed Eng 2012;59(10):2695–704.
  • [4] Cheng J, Chen X, Lu Z, Wang K, Shen M. Key-press gestures recognition and interaction based on sEMG signals. In: International Conference on Multimodal Interfaces and the Workshop on Machine Learning for Multimodal Interaction. ICMI-MLMI ’10. New York, NY, USA: Association for Computing Machinery; 2010. p. 1–4.
  • [5] Qi W, Su H, Aliverti A. A smartphone-based adaptive recognition and real-time monitoring system for human activities. IEEE Trans Hum Mach Syst 2020;50(5):414–23.
  • [6] Wen R, Tay W, Nguyen BP, Chng CB, Chui CK. Hand gesture guided robot-assisted surgery based on a direct augmented reality interface. Comput Methods Programs Biomed 2014;116 (2):68–80.
  • [7] Wachs JP, Kölsch M, Stern H, Edan Y. Vision-based hand-gesture applications. Commun ACM 2011;54(2):60–71.
  • [8] Lu Z, Chen X, Li Q, Zhang X, Zhou P. A hand gesture recognition framework and wearable gesture-based interaction prototype for mobile devices. IEEE Trans Hum Mach Syst 2014;44(2):293–9.
  • [9] Lu Z, Tong K, Shin H, Li S, Zhou P. Advanced myoelectric control for robotic hand-assisted training: outcome from a stroke patient. Front Neurol 2017;8:107.
  • [10] Li Z, Xia Y, Su C. Intelligent networked teleoperation control. Springer; 2015.
  • [11] Su H, Qi W, Li Z, Chen Z, Ferrigno G, De Momi E. Deep neural network approach in EMG-based force estimation for human-robot interaction. IEEE Trans Artif Intell 2021;2 (5):404–12.
  • [12] Schabron B, Desai J, Yihun Y. Wheelchair-mounted upper limb robotic exoskeleton with adaptive controller for activities of daily living. Sensors 2021;21(17).
  • [13] Kim K, Park S, Lim T, Lee SJ. Upper-limb electromyogram classification of reaching-to-grasping tasks based on convolutional neural networks for control of a prosthetic hand. Front Neurosci 2021;15.
  • [14] Su H, Ovur SE, Zhou X, Qi W, Ferrigno G, De Momi E. Depth vision guided hand gesture recognition using electromyographic signals. Adv Robot 2020;34(15):985–97.
  • [15] Phinyomark A, Scheme E. An investigation of temporally inspired time domain features for electromyographic pattern recognition. In: 2018 40th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC); 2018. p. 5236–5240.
  • [16] Phinyomark A, Phukpattaranont P, Limsakul C. Feature reduction and selection for EMG signal classification. Expert Syst Appl 2012;39(8):7420–31.
  • [17] Shenoy P, Miller KJ, Crawford B, Rao RPN. Online electromyographic control of a robotic prosthesis. IEEE Trans Biomed Eng 2008;55(3):1128–35.
  • [18] Al-Timemy AH, Khushaba RN, Bugmann G, Escudero J. Improving the performance against force variation of EMG controlled multifunctional upper-limb prostheses for transradial amputees. IEEE Trans Neural Syst Rehabil Eng 2016;24(6):650–61.
  • [19] Shi WT, Lyu ZJ, Tang ST, Chia TL, Yang CY. A bionic hand controlled by hand gesture recognition based on surface EMG signals: A preliminary study. Biocybern Biomed Eng 2018;38 (1):126–35.
  • [20] Waris A, Niazi IK, Jamil M, Englehart K, Jensen W, Kamavuako EN. Multiday evaluation of techniques for EMG-based classification of hand motions. IEEE J Biomed Health Inform 2019;23(4):1526–34.
  • [21] Tuncer T, Dogan S, Subasi A. Surface EMG signal classification using ternary pattern and discrete wavelet transform based feature extraction for hand movement recognition. Biomed Signal Process Control 2020;58:101872.
  • [22] Fatimah B, Singh P, Singhal A, Pachori RB. Hand movement recognition from sEMG signals using fourier decomposition method. Biocybern Biomed Eng 2021;41(2):690–703.
  • [23] Karnam NK, Turlapaty AC, Dubey SR, Gokaraju B. Classification of sEMG signals of hand gestures based on energy features. Biomed Signal Process Control 2021;70:102948.
  • [24] LeCun Y, Bengio Y, Hinton G. Deep learning. Nature 2015;521 (7553):436–44.
  • [25] Atzori M, Cognolato M, Müller H. Deep learning with convolutional neural networks applied to electromyography data: A resource for the classification of movements for prosthetic hands. Front Neurorobot 2016;10:9.
  • [26] Atzori M, Gijsberts A, Castellini C, Caputo B, Hager AGM, Elsig S, et al. Electromyography data for non-invasive naturally-controlled robotic hand prostheses. Sci Data 2014;1(1):140053.
  • [27] Geng W, Du Y, Jin W, Wei W, Hu Y, Li J. Gesture recognition by instantaneous surface EMG images. Sci Rep 2016;6(1):36571.
  • [28] Wei W, Wong Y, Du Y, Hu Y, Kankanhalli M, Geng W. A multi-stream convolutional neural network for sEMG-based gesture recognition in muscle-computer interface. Pattern Recognit Lett 2019;119:131–8.
  • [29] Olsson AE, Björkman A, Antfolk C. Automatic discovery of resource-restricted convolutional neural network topologies for myoelectric pattern recognition. Comput Biol Med 2020;120:103723.
  • [30] Côté-Allard U, Fall CL, Drouin A, Campeau-Lecours A, Gosselin C, Glette K, et al. Deep learning for electromyographic hand gesture signal classification using transfer learning. IEEE Trans Neural Syst Rehabil Eng 2019;27 (4):760–71.
  • [31] Qi S, Wu X, Chen WH, Liu J, Zhang J, Wang J. sEMG-based recognition of composite motion with convolutional neural network. Sens Actuators A Phys 2020;311:112046.
  • [32] Betthauser JL, Krall JT, Bannowsky SG, Lévay G, Kaliki RR, Fifer MS, et al. Stable responsive EMG sequence prediction and adaptive reinforcement with temporal convolutional networks. IEEE Trans Biomed Eng 2020;67(6):1707–17.
  • [33] Gautam A, Panwar M, Wankhede A, Arjunan SP, Naik GR, Acharyya A, et al. Locomo-Net: A low-complex deep learning framework for sEMG-based hand movement recognition for prosthetic control. IEEE J Transl Eng Health Med 2020;8:1–12.
  • [34] Koch P, Dreier M, Maass M, Phan H, Mertins A. RNN with stacked architecture for sEMG based sequence-to-sequence hand gesture recognition. In: 2020 28th European Signal Processing Conference (EUSIPCO). p. 1600–4.
  • [35] Ketykó I, Kovács F, Varga KZ. Domain adaptation for sEMG-based gesture recognition with recurrent neural networks. In: 2019 International Joint Conference on Neural Networks (IJCNN). p. 1–7.
  • [36] Hu Y, Wong Y, Wei W, Du Y, Kankanhalli M, Geng W. A novel attention-based hybrid CNN-RNN architecture for sEMG-based gesture recognition. PLoS One 2018;13(10):1–18.
  • [37] Wang Y, Wu Q, Dey N, Fong S, Ashour AS. Deep back propagation–long short-term memory network based upper-limb sEMG signal classification for automated rehabilitation. Biocybern Biomed Eng 2020;40(3):987–1001.
  • [38] Donahue J, Anne Hendricks L, Guadarrama S, Rohrbach M, Venugopalan S, Saenko K, et al. Long-term recurrent convolutional networks for visual recognition and description. In: Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (CVPR). p. 2625–34.
  • [39] Bao T, Zaidi SAR, Xie S, Yang P, Zhang ZQ. A CNN-LSTM hybrid model for wrist kinematics estimation using surface electromyography. IEEE Trans Instrum Meas 2021;70:1–9.
  • [40] Chen X, Li Y, Hu R, Zhang X, Chen X. Hand gesture recognition based on surface electromyography using convolutional neural network with transfer learning method. IEEE J Biomed Health Inform 2021;25(4):1292–304.
  • [41] Krizhevsky A, Sutskever I, Hinton GE. ImageNet classification with deep convolutional neural networks. Adv Neural Inf Process Syst 2012;25:1097–105.
  • [42] Bengio Y, Simard P, Frasconi P. Learning long-term dependencies with gradient descent is difficult. IEEE Trans Neural Netw 1994;5(2):157–66.
  • [43] Hochreiter S, Schmidhuber J. Long short-term memory. Neural Comput 1997;9(8):1735–80.
  • [44] Scheme E, Englehart K. On the robustness of EMG features for pattern recognition based myoelectric control; A multi-dataset comparison. In: 2014 36th Annual International Conference of the IEEE Engineering in Medicine and Biology Society; 2014. p. 650–653.
  • [45] Scheme EJ, Hudgins BS, Englehart KB. Confidence-based rejection for improved pattern recognition myoelectric control. IEEE Trans Biomed Eng 2013;60(6):1563–70.
  • [46] Amsüss S, Goebel PM, Jiang N, Graimann B, Paredes L, Farina D. Self-correcting pattern recognition system of surface EMG signals for upper limb prosthesis control. IEEE Trans Biomed Eng 2014;61(4):1167–76.
  • [47] Lu Z, Tong K, Zhang X, Li S, Zhou P. Myoelectric pattern recognition for controlling a robotic hand: A feasibility study in stroke. IEEE Trans Biomed Eng 2019;66(2):365–72.
  • [48] Criswell E. Cram’s introduction to surface electromyography. Burlington: Jones & Bartlett Publishers; 2010.
  • [49] Reifinger S, Wallhoff F, Ablassmeier M, Poitschke T, Rigoll G. Static and dynamic hand-gesture recognition for augmented reality applications. In: International Conference on Human-Computer Interaction. Springer; 2007. p. 728–37.
  • [50] Leonardis D, Barsotti M, Loconsole C, Solazzi M, Troncossi M, Mazzotti C, et al. An EMG-controlled robotic hand exoskeleton for bilateral rehabilitation. IEEE Trans Haptics 2015;8(2):140–51.
  • [51] Dunai L, Novak M, García Espert C. Human hand anatomy-based prosthetic hand. Sensors 2021;21(1):137.
  • [52] Pizzolato S, Tagliapietra L, Cognolato M, Reggiani M, Müller H, Atzori M. Comparison of six electromyography acquisition setups on hand movement classification tasks. PLoS One 2017;12(10):1–17.
  • [53] Ortiz-Catalan M, Brånemark R, Håkansson B. BioPatRec: A modular research platform for the control of artificial limbs based on pattern recognition algorithms. Source Code Biol Med 2013;8(1):11.
  • [54] Lobov S, Krilova N, Kastalskiy I, Kazantsev V, Makarov VA. Latent factors limiting the performance of sEMG-interfaces. Sensors 2018;18(4):1122.
  • [55] Yang K, Xu M, Yang X, Yang R, Chen Y. A novel EMG-based hand gesture recognition framework based on multivariate variational mode decomposition. Sensors 2021;21(21):7002.
  • [56] Cheng Y, Li G, Yu M, Jiang D, Yun J, Liu Y, et al. Gesture recognition based on surface electromyography-feature image. Concurr Comput 2021;33(6):e6051.
  • [57] Wei W, Dai Q, Wong Y, Hu Y, Kankanhalli M, Geng W. Surface-electromyography-based gesture recognition by multi-view deep learning. IEEE Trans Biomed Eng 2019;66(10):2964–73.
  • [58] Josephs D, Drake C, Heroy A, Santerre J. sEMG gesture recognition with a simple model of attention. In: Alsentzer E, McDermott MBA, Falck F, Sarkar SK, Roy S, Hyland SL, editors. Proceedings of the Machine Learning for Health NeurIPS Workshop. vol. 136 of Proceedings of Machine Learning Research. PMLR. p. 126–38.
  • [59] Potekhin VV, Unal O. Development of machine learning models to determine hand gestures using EMG signals. Ann DAAAM Proceedings 2020;7(1).
  • [60] Nazemi A, Maleki A. Artificial neural network classifier in comparison with LDA and LS-SVM classifiers to recognize 52 hand postures and movements. In: 2014 4th International Conference on Computer and Knowledge Engineering (ICCKE). IEEE; 2014. p. 18–22.
  • [61] Rubio AM, Grisales JAA, Tabares-Soto R, Orozco-Arias S, Varón CFJ, Buriticá JIP. Identification of hand movements from electromyographic signals using machine learning; 2020.
  • [62] Foody GM. Thematic map comparison. Photogramm Eng Remote Sensing 2004;70(5):627–33.
  • [63] Dumoulin V, Visin F. A guide to convolution arithmetic for deep learning; 2018.
Document type
Bibliography
Identifiers
YADDA identifier
bwmeta1.element.baztech-3e533278-331c-4ea9-8cbf-78b12e342fec