


2024 | Vol. 44, no. 3 | 470-480
Article title

Parallel collaboration and closed-loop control of a cursor using multimodal physiological signals

Title variants
Publication languages
EN
Abstracts
EN
This paper explores the parallel collaboration of multimodal physiological signals, combining eye tracker output signals, motor imagery, and error-related potentials to control a computer mouse. Specifically, a parallel working mechanism is implemented in the decision layer, where the eye tracker manages cursor movements and motor imagery manages click functions. Meanwhile, the eye tracker output signals are integrated with electroencephalography data to detect the idle state for asynchronous control. Additionally, error-related potentials evoked by visual feedback are detected to reduce the cost of error corrections. To efficiently collect data and provide continuous evaluation, we performed offline training and online testing in the designed paradigm. To further validate practicability, we conducted online experiments on a real-world computer, focusing on a scenario of opening and closing files. The experiments involved seventeen subjects. The results showed that the stability of the eye tracker was improved from 67.6% to 95.2% by the designed filter, providing support for parallel control. The accuracy of motor imagery performed simultaneously with fixations reached 93.41 ± 2.91%, proving the feasibility of parallel control. Furthermore, the real-world experiments took 45.86 ± 14.94 s to complete three movements and clicks, a significant improvement over the baseline experiment without automatic error correction, validating the practicability of the system and the efficacy of error-related potential detection. Moreover, the system freed users from the stimulus paradigm, enabling more natural interaction. In summary, the parallel collaboration of multimodal physiological signals is novel and feasible, and the designed mouse is practical and promising.
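The decision-layer fusion described in the abstract (gaze drives the cursor, motor imagery triggers clicks, and a detected error-related potential cancels the erroneous click) can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: all class and function names are hypothetical, and the moving-average smoother merely stands in for the paper's eye-tracker stability filter.

```python
# Hypothetical sketch of the decision-layer parallel control mechanism:
# gaze samples set the cursor position, a motor-imagery (MI) classifier
# decision issues a click, and a detected error-related potential (ErrP)
# undoes the most recent click (automatic error correction).

from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class CursorState:
    pos: Tuple[float, float] = (0.0, 0.0)
    clicks: List[Tuple[float, float]] = field(default_factory=list)

def moving_average(points: List[Tuple[float, float]], k: int = 5) -> Tuple[float, float]:
    """Smooth the last k gaze samples (stand-in for the paper's filter)."""
    recent = points[-k:]
    n = len(recent)
    return (sum(p[0] for p in recent) / n, sum(p[1] for p in recent) / n)

def decision_layer(state: CursorState,
                   gaze_history: List[Tuple[float, float]],
                   mi_click: bool,
                   errp_detected: bool) -> CursorState:
    """One control cycle: gaze sets position, MI clicks, ErrP cancels."""
    state.pos = moving_average(gaze_history)
    if mi_click:
        state.clicks.append(state.pos)
    if errp_detected and state.clicks:
        state.clicks.pop()  # undo the click flagged as erroneous
    return state
```

For example, a cycle with gaze samples `[(0, 0), (2, 2), (4, 4)]` and an MI click registers a click at the smoothed position `(2.0, 2.0)`; a later cycle with a detected ErrP removes it again, so cursor movement and click correction run in parallel without any stimulus paradigm.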
Publisher

Year
Pages
470-480
Physical description
Bibliography: 64 items; figures, tables, charts.
Authors
author
  • College of Intelligence Science and Technology, National University of Defense Technology, Changsha, 410073, China, yezeqigfkd@nudt.edu.cn
author
  • College of Intelligence Science and Technology, National University of Defense Technology, Changsha, 410073, China, yuyangnudt@hotmail.com
author
  • College of Computer Science and Technology, National University of Defense Technology, Changsha, 410073, China, zhangyiyun213@163.com
author
  • College of Intelligence Science and Technology, National University of Defense Technology, Changsha, 410073, China, 2209449676@qq.com
  • College of Intelligence Science and Technology, National University of Defense Technology, Changsha, 410073, China, sunjianxiang@nudt.edu.cn
  • College of Intelligence Science and Technology, National University of Defense Technology, Changsha, 410073, China, narcz@163.com
  • College of Intelligence Science and Technology, National University of Defense Technology, Changsha, 410073, China, zengphd@nudt.edu.cn
Document type
Identifiers
YADDA identifier
bwmeta1.element.baztech-781509f5-21a4-40c5-8f24-12c2c5319910