2022 | Vol. 42, no. 2 | 630–645
Article title

The quantitative application of channel importance in movement intention decoding

Title variants
Publication languages
EN
Abstracts
EN
The complex brain network consists of multiple collaborating regions that are activated to varying degrees by motor imagery (MI), and the induced electroencephalogram (EEG), recorded by an array of scalp electrodes, is usually decoded to drive a rehabilitation system. Either all channels or a partially selected subset are applied with equal weight to recognize movement intention, which may be incompatible with the individual differences among channels at different locations. In this paper, a channel-importance-based imaging method, denoted CIBI, is proposed. For each electrode of MI-EEG, the power over the 8–30 Hz band is calculated from the discrete Fourier spectrum and input to the random forest (RF) algorithm to quantify its contribution, namely the channel importance (CI). CI is then used to weight the powers of the α and β rhythms, each of which is interpolated to a 32 × 32 grid using the Clough-Tocher method, generating two main-band images with time-frequency-space information. In addition, a dual-branch fusion convolutional neural network (DBFCNN) is developed to match the characteristics of the two MI images, realizing the extraction, fusion, and classification of comprehensive features. Extensive experiments are conducted on two public datasets with four classes of MI-EEG; relatively high average accuracies are obtained, with improvements of 23.95% and 25.14%, respectively, when channel importance is used. Statistical analyses are also performed using the Kappa value, confusion matrix, and receiver operating characteristic. The experimental results show that the personalized channel importance helps to enhance inter-class separability and that the proposed method has outstanding decoding ability for multiple MI tasks.
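The pipeline outlined in the abstract (band power from the discrete Fourier spectrum → random-forest channel importance → CI-weighted Clough-Tocher imaging) can be sketched as follows. This is a minimal illustration, not the authors' implementation: the EEG data, electrode coordinates, trial counts, sampling rate, and forest size are synthetic placeholders.

```python
import numpy as np
from scipy.interpolate import CloughTocher2DInterpolator
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)

# Synthetic MI-EEG: 200 trials x 16 channels x 640 samples (placeholders).
n_trials, n_channels, n_samples, fs = 200, 16, 640, 160
eeg = rng.standard_normal((n_trials, n_channels, n_samples))
labels = rng.integers(0, 4, n_trials)            # four MI classes

# 1) Per-channel power over the 8-30 Hz band from the DFT spectrum.
freqs = np.fft.rfftfreq(n_samples, d=1.0 / fs)
band = (freqs >= 8) & (freqs <= 30)
spectrum = np.abs(np.fft.rfft(eeg, axis=-1)) ** 2
band_power = spectrum[:, :, band].sum(axis=-1)   # shape: (trials, channels)

# 2) Channel importance (CI): quantify each channel's contribution with
#    a random forest trained on the band powers.
rf = RandomForestClassifier(n_estimators=200, random_state=0)
rf.fit(band_power, labels)
ci = rf.feature_importances_                     # normalized, sums to 1

# 3) Weight the alpha-rhythm power of one trial by CI and interpolate it
#    onto a 32 x 32 grid with the Clough-Tocher method (the 2-D electrode
#    coordinates here are random placeholders, not a real montage).
coords = rng.uniform(-1, 1, (n_channels, 2))
alpha = (freqs >= 8) & (freqs <= 13)
alpha_power = spectrum[0][:, alpha].sum(axis=-1) * ci

interp = CloughTocher2DInterpolator(coords, alpha_power, fill_value=0.0)
gx, gy = np.meshgrid(np.linspace(-1, 1, 32), np.linspace(-1, 1, 32))
image = interp(gx, gy)                           # one 32 x 32 band image
```

In the paper this imaging step is applied to both the α and β rhythms, yielding the two images that feed the two branches of the DBFCNN; the β branch would repeat step 3 with a 13–30 Hz mask.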
Publisher

Year
Pages
630–645
Physical description
Bibliography: 42 items; figures, tables, charts
Authors
author
  • Faculty of Information Technology, Beijing University of Technology, Beijing, China
author
  • Faculty of Information Technology, Beijing University of Technology, Beijing 100124, China, limingai@bjut.edu.cn
  • Beijing Key Laboratory of Computational Intelligence and Intelligent System, Beijing, China
  • Engineering Research Center of Digital Community, Ministry of Education, Beijing, China
Bibliography
  • [1] Tariq M, Trivailo PM, Simic M. Eeg-based bci control schemes for lower-limb assistive-robots. Front Human Neurosci 2018;12:312. https://doi.org/10.3389/fnhum.2018.00312.
  • [2] Bright D, Nair A, Salvekar D, Bhisikar S. Eeg-based brain controlled prosthetic arm. 2016 Conference on Advances in Signal Processing (CASP) 2016:479–83. https://doi.org/10.1109/CASP.2016.7746219.
  • [3] Qiu S, Li Z, He W, Zhang L, Yang C, Su C-Y. Brain-machine interface and visual compressive sensing-based teleoperation control of an exoskeleton robot. IEEE Trans Fuzzy Syst 2017;25(1):58–69. https://doi.org/10.1109/TFUZZ.2016.2566676.
  • [4] Sawangjai P, Hompoonsup S, Leelaarporn P, Kongwudhikunakorn S, Wilaiprasitporn T. Consumer grade eeg measuring sensors as research tools: A review. IEEE Sens J 2020;20(8):3996–4024. https://doi.org/10.1109/JSEN.2019.2962874.
  • [5] Chaisaen R, Autthasan P, Mingchinda N, Leelaarporn P, Kunaseth N, Tammajarung S, Manoonpong P, Mukhopadhyay SC, Wilaiprasitporn T. Decoding eeg rhythms during action observation, motor imagery, and execution for standing and sitting. IEEE Sens J 2020;20(22):13776–86. https://doi.org/10.1109/JSEN.2020.3005968.
  • [6] Saha S, Baumert M. Intra- and inter-subject variability in eeg-based sensorimotor brain computer interface: A review. Front Comput Neurosci 2020;13:87. https://doi.org/10.3389/fncom.2019.00087.
  • [7] Zhang Y, Zhang S, Ji X. Eeg-based classification of emotions using empirical mode decomposition and autoregressive model. Multimed Tools Appl 2018:26697–710.
  • [8] Sadiq MT, Yu X, Yuan Z, Aziz MZ, Siuly S, Ding W. Toward the development of versatile brain-computer interfaces. IEEE Trans Artif Intell 2021;2(4):314–28. https://doi.org/10.1109/TAI.2021.3097307.
  • [9] Yu X, Aziz MZ, Sadiq MT, Fan Z, Xiao G. A new framework for automatic detection of motor and mental imagery eeg signals for robust bci systems. IEEE Trans Instrum Meas 2021;70:1–12. https://doi.org/10.1109/TIM.2021.3069026.
  • [10] Dora C, Biswal PK. Correlation-based ecg artifact correction from single channel eeg using modified variational mode decomposition. Comput Methods Programs Biomed 2020;183:105092. https://doi.org/10.1016/j.cmpb.2019.105092.
  • [11] Sadiq MT, Yu X, Yuan Z, Aziz MZ. Identification of motor and mental imagery eeg in two and multiclass subject-dependent tasks using successive decomposition index. Sensors 2020;20(18):5283. https://doi.org/10.3390/s20185283.
  • [12] Alyasseri ZAA, Khadeer AT, Al-Betar MA, Abasi A, Makhadmeh S, Ali NS. The effects of eeg feature extraction using multi-wavelet decomposition for mental tasks classification. In: Proceedings of the International Conference on Information and Communication Technology. New York, NY, USA: Association for Computing Machinery; 2019. p. 139–46. https://doi.org/10.1145/3321289.3321327.
  • [13] Sadiq MT, Yu X, Yuan Z, Aziz MZ, Siuly S, Ding W. A matrix determinant feature extraction approach for decoding motor and mental imagery eeg in subject specific tasks. IEEE Trans Cogn Develop Syst 2020:1. https://doi.org/10.1109/TCDS.2020.3040438.
  • [14] Youngjoo K, Jiwoo R, Keun KK, Took CC, Mandic DP, Cheolsoo P. Motor imagery classification using mu and beta rhythms of eeg with strong uncorrelating transform based complex common spatial patterns. Comput Intell Neurosci 2016;2016:1–13. https://doi.org/10.1155/2016/1489692.
  • [15] Ang KK, Chin ZY, Wang C, Guan C, Zhang H. Filter bank common spatial pattern algorithm on bci competition iv datasets 2a and 2b. Front Neurosci 2012;6:39. https://doi.org/10.3389/fnins.2012.00039.
  • [16] Sakhavi S, Guan C, Yan S. Learning temporal information for brain-computer interface using convolutional neural networks. IEEE Trans Neural Networks Learn Syst 2018;29 (11):5619–29. https://doi.org/10.1109/TNNLS.2018.2789927.
  • [17] Lotte F, Bougrain L, Cichocki A, Clerc M, Congedo M, Rakotomamonjy A, Yger F. A review of classification algorithms for EEG-based brain–computer interfaces: a 10 year update. J Neural Eng 2018;15(3):031005. https://doi.org/10.1088/1741-2552/aab2f2.
  • [18] Hajibabazadeh M, Azimirad V. Brain-robot interface: Distinguishing left and right hand eeg signals through svm. Second RSI/ISM International Conference on Robotics and Mechatronics (ICRoM) 2014;2014:813–6. https://doi.org/10.1109/ICRoM.2014.6991004.
  • [19] Hsu W-Y. Eeg-based motor imagery classification using neuro-fuzzy prediction and wavelet fractal features. J Neurosci Methods 2010;189(2):295–302. https://doi.org/10.1016/j.jneumeth.2010.03.030.
  • [20] Lawhern VJ, Solon AJ, Waytowich NR, Gordon SM, Hung CP, Lance BJ. EEGNet: a compact convolutional neural network for EEG-based brain-computer interfaces. J Neural Eng 2018;15(5):056013. https://doi.org/10.1088/1741-2552/aace8c.
  • [21] Dose H, Møller JS, Iversen HK, Puthusserypady S. An end-to-end deep learning approach to mi-eeg signal classification for bcis. Expert Syst Appl 2018;114:532–42. https://doi.org/10.1016/j.eswa.2018.08.031.
  • [22] Schirrmeister RT, Springenberg JT, Fiederer LDJ, Glasstetter M, Eggensperger K, Tangermann M, Hutter F, Burgard W, Ball T. Deep learning with convolutional neural networks for eeg decoding and visualization. Hum Brain Mapp 2017;38(11):5391–420. https://doi.org/10.1002/hbm.23730.
  • [23] Dai G, Zhou J, Huang J, Wang N. HS-CNN: a CNN with hybrid convolution scale for EEG motor imagery classification. J Neural Eng 2020;17(1):016025. https://doi.org/10.1088/1741-2552/ab405f.
  • [24] Autthasan P, Chaisaen R, Sudhawiyangkul T, Kiatthaveephong S, Rangpong P, Dilokthanakul N, Bhakdisongkhram G, Phan H, Guan C, Wilaiprasitporn T. Min2net: End-to-end multi-task learning for subject-independent motor imagery eeg classification. IEEE Trans Biomed Eng 2021:1. https://doi.org/10.1109/TBME.2021.3137184.
  • [25] Uktveris T, Jusas V. Application of convolutional neural networks to four-class motor imagery classification problem. Inf Technol Control 2017;46(2):260–73. https://doi.org/10.5755/j01.itc.46.2.17528.
  • [26] Dai M, Zheng D, Na R, Wang S, Zhang S. Eeg classification of motor imagery using a novel deep learning framework. Sensors 2019;19(3):551. https://doi.org/10.3390/s19030551.
  • [27] Mammone N, Ieracitano C, Morabito FC. A deep cnn approach to decode motor preparation of upper limbs from time-frequency maps of eeg signals at source level. Neural Networks 2020;124:357–72. https://doi.org/10.1016/j.neunet.2020.01.027.
  • [28] Bashivan P, Rish I, Yeasin M, Codella N. Learning Representations from EEG with Deep Recurrent-Convolutional Neural Networks, arXiv e-prints; 2015. arXiv:1511.06448. URL:https://ui.adsabs.harvard.edu/abs/2015arXiv151106448B.
  • [29] Li M-A, Han J-F, Duan L-J. A novel mi-eeg imaging with the location information of electrodes. IEEE Access 2020;8:3197–211. https://doi.org/10.1109/ACCESS.2019.2962740.
  • [30] Li M-A, Peng W-M, Yang J-F. Key band image sequences and a hybrid deep neural network for recognition of motor imagery eeg. IEEE Access 2021;9:86994–7006. https://doi.org/10.1109/ACCESS.2021.3085865.
  • [31] Riyad M, Khalil M, Adib A. A novel multi-scale convolutional neural network for motor imagery classification. Biomed Signal Process Control 2021;68:102747. https://doi.org/10.1016/j.bspc.2021.102747.
  • [32] Zhao X, Zhang H, Zhu G, You F, Kuang S, Sun L. A multi-branch 3d convolutional neural network for eeg-based motor imagery classification. IEEE Trans Neural Syst Rehabil Eng 2019;27(10):2164–77. https://doi.org/10.1109/TNSRE.2019.2938295.
  • [33] Li D, Xu J, Wang J, Fang X, Ji Y. A multi-scale fusion convolutional neural network based on attention mechanism for the visualization analysis of eeg signals decoding. IEEE Trans Neural Syst Rehabil Eng 2020;28(12):2615–26. https://doi.org/10.1109/TNSRE.2020.3037326.
  • [34] Li Y, Yang H, Li J, Chen D, Du M. Eeg-based intention recognition with deep recurrent-convolution neural network: Performance and channel selection by grad-cam. Neurocomputing 2020;415:225–33. https://doi.org/10.1016/j.neucom.2020.07.072.
  • [35] Handiru VS, Prasad VA. Optimized bi-objective eeg channel selection and cross-subject generalization with brain-computer interfaces. IEEE Trans Human-Mach Syst 2016;46(6):777–86. https://doi.org/10.1109/THMS.2016.2573827.
  • [36] Jin J, Miao Y, Daly I, Zuo C, Hu D, Cichocki A. Correlation-based channel selection and regularized feature optimization for mi-based bci. Neural Networks 2019;118:262–70. https://doi.org/10.1016/j.neunet.2019.07.008.
  • [37] Alyasseri ZAA, Khader AT, Al-Betar MA, Alomari OA. Person identification using eeg channel selection with hybrid flower pollination algorithm. Pattern Recogn 2020;105:107393. https://doi.org/10.1016/j.patcog.2020.107393.
  • [38] Breiman L. Random forests. Mach Learn 2001;45(1):5–32. https://doi.org/10.1023/A:1010933404324.
  • [39] Alfeld P. A trivariate clough-tocher scheme for tetrahedral data. Comput Aided Geometric Design 1984;1(2):169–81. https://doi.org/10.1016/0167-8396(84)90029-3.
  • [40] Goldberger AL, Amaral LAN, Glass L, Hausdorff JM, Ivanov PC, Mark RG, Mietus JE, Moody GB, Peng C-K, Stanley HE. Physiobank, physiotoolkit, and physionet. Circulation 2000;101(23):e215–20. https://doi.org/10.1161/01.CIR.101.23.e215.
  • [41] Li M, Wang R, Xu D. An improved composite multiscale fuzzy entropy for feature extraction of mi-eeg. Entropy 2020;22(12):1356. https://doi.org/10.3390/e22121356.
  • [42] Tangermann M, Müller K-R, Aertsen A, Birbaumer N, Braun C, Brunner C, Leeb R, Mehring C, Miller K, Mueller-Putz G, Nolte G, Pfurtscheller G, Preissl H, Schalk G, Schlögl A, Vidaurre C, Waldert S, Blankertz B. Review of the bci competition iv. Front Neurosci 2012;6:55. https://doi.org/10.3389/fnins.2012.00055.
Document type
Bibliography
Identifiers
YADDA identifier
bwmeta1.element.baztech-983346e5-c367-4a1a-9baf-a7be422e84fb