Publication languages
EN
Abstracts
EN
This paper presents a novel, efficient single-channel electrooculography (EOG) based human–machine interface (HMI) that helps individuals suffering from severe paralysis or motor-degenerative diseases regain mobility. We propose a robust system that generates control commands from only one type of asynchronous eye activity (the voluntary eye blink) to navigate a wheelchair, with no need for a graphical user interface. The work demonstrates a simple yet robust and effective multi-level threshold strategy that derives control commands from multiple features of single, double and triple voluntary eye blinks, mapped to predefined actions (forward, right turn, left turn and stop). Experimental trials were carried out on able-bodied and disabled subjects to validate the universal applicability of the algorithms. The system achieved an average command detection and execution accuracy of 93.89% with an information transfer rate (ITR) of 62.64 bits/min, demonstrating the robustness, sensitivity and responsiveness of the interface. Compared with established state-of-the-art HMI systems of this kind, our system achieves a better trade-off between accuracy and ITR while maintaining better performance on all qualitative and quantitative criteria. The results confirm that the proposed system offers a user-friendly, cost-effective and reliable alternative to existing EOG-based HMIs.
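For context, the ITR figure quoted above is conventionally computed with Wolpaw's formula, a standard BCI/HMI metric not restated in this record. With N possible commands, per-selection accuracy P, and T seconds per selection:

    \mathrm{ITR} \;=\; \frac{60}{T}\left[\log_2 N + P\log_2 P + (1-P)\log_2\frac{1-P}{N-1}\right] \quad \text{bits/min}

Taking N = 4 (one per action, an assumption based on the abstract's four actions) and P = 0.9389 gives about 1.57 bits per selection, so the reported 62.64 bits/min is consistent with roughly 1.5 s per command; T itself is not stated in the abstract.

The sketch below illustrates the kind of multi-level threshold blink counting the abstract describes: count threshold crossings of a single-channel EOG trace inside a short decision window and map the blink count to a wheelchair command. It is a minimal illustration only; the sampling rate, amplitude threshold, window length, refractory period and blink-to-command mapping are assumptions, not the authors' parameters.

    import numpy as np

    FS = 250                 # assumed sampling rate (Hz)
    AMP_THRESHOLD = 150.0    # assumed blink amplitude threshold (uV)
    REFRACTORY_S = 0.1       # crossings closer than this are merged into one blink
    # Assumed blink-count-to-command mapping; the paper's fourth action (stop)
    # would be driven by another blink pattern not modelled here.
    COMMANDS = {1: "forward", 2: "right turn", 3: "left turn"}

    def count_blinks(eog, fs=FS):
        """Count rising threshold crossings, merging those within the refractory period."""
        above = eog > AMP_THRESHOLD
        rising = np.flatnonzero(~above[:-1] & above[1:])
        if rising.size == 0:
            return 0
        return 1 + int(np.count_nonzero(np.diff(rising) > REFRACTORY_S * fs))

    def classify_window(eog_window, fs=FS):
        """Map the blink count in one decision window to a command (None = no command)."""
        return COMMANDS.get(count_blinks(eog_window, fs))

    # Usage: a 1.5 s synthetic window with two 200 uV blink-like pulses 0.5 s apart.
    t = np.arange(0, 1.5, 1 / FS)
    eog = 200.0 * (np.exp(-((t - 0.4) / 0.03) ** 2) + np.exp(-((t - 0.9) / 0.03) ** 2))
    print(classify_window(eog))  # -> right turn (double blink) under the assumed mapping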
Authors
  • Department of Mechanical Engineering, Shri Guru Gobind Singhji Institute of Engineering and Technology, Nanded (M.S.), India
  • Center of Excellence in Signal and Image Processing, Shri Guru Gobind Singhji Institute of Engineering and Technology, Nanded (M.S.), India
  • Department of Mechanical Engineering, Shri Guru Gobind Singhji Institute of Engineering and Technology, Nanded (M.S.), India
  • ImViA/IFTIM, Université de Bourgogne, Dijon, France
Bibliography
  • [1] Barea R, Boquete L, Bergasa LM, López E, Mazo M. Electrooculographic guidance of a wheelchair using eye movements codification. Int J Robot Res 2003;22(7–8):641–52.
  • [2] World report on disability; 2011, https://www.unicef.org/protection/ World_report_on_disability_eng.pdf [accessed 02.01.18].
  • [3] Wang Y, Dong P. The design and implementation of the voice control system of smart home based on iOS. 2016 IEEE International Conference on Mechatronics and Automation; 2016. pp. 133–8.
  • [4] Koo Y, Kim G, Jang S, Lee W, Kim H, Han S. A study on travelling control of mobile robot by voice commend. 2015 15th International Conference on Control, Automation and Systems (ICCAS); 2015. pp. 1250–2.
  • [5] Begalinova A, Shintemirov A. Design of embedded gesture recognition system for robotic applications. 2014 IEEE 8th International Conference on Application of Information and Communication Technologies (AICT); 2014. pp. 1–4.
  • [6] Tsagaris A, Manitsaris S, Hatzikos E, Manitsaris A. Methodology for finger gesture control of mechatronic systems. Proceedings of 15th International Conference MECHATRONIKA; 2012. pp. 1–6.
  • [7] Katevas NI, Sgouros NM, Tzafestas SG, Papakonstantinou G, Beattie P, Bishop JM, et al. The autonomous mobile robot SENARIO: a sensor aided intelligent navigation system for powered wheelchairs. IEEE Robot Autom Mag 1997;4(4):60–70.
  • [8] Deng LY, Hsu C-L, Lin T-C, Tuan J-S, Chang S-M. EOG-based human-computer interface system development. Expert Syst Appl 2010;37(4):3337–43.
  • [9] Huang Q, He S, Wang Q, Gu Z, Peng N, Li K, et al. An EOG based human machine interface for wheelchair control. IEEE Trans Biomed Eng 2017;1.
  • [10] Nakanishi M, Mitsukura Y. Wheelchair control system by using electrooculogram signal processing. 2013 19th Korea–Japan Joint Workshop on Frontiers of Computer Vision (FCV); 2013. pp. 137–42.
  • [11] Tu Y, Hung YS, Hu L, Huang G, Hu Y, Zhang Z. An automated and fast approach to detect single-trial visual evoked potentials with application to brain–computer interface. Clin Neurophysiol 2014;125(12):2372–83.
  • [12] Wu JF, Ang AMS, Tsui KM, Wu HC, Hung YS, Hu Y, et al. Efficient implementation and design of a new single-channel electrooculography-based human–machine interface system. IEEE Trans Circuits Syst II: Express Briefs 2015;62(2):179–83.
  • [13] Guo X, Pei W, Wang Y, Chen Y, Zhang H, Wu X, et al. A human–machine interface based on single channel EOG and patchable sensor. Biomed Signal Process Control 2016;30:98–105.
  • [14] Punsawad Y, Wongsawat Y, Parnichkun M. Hybrid EEG-EOG brain–computer interface system for practical machine control. Engineering in Medicine and Biology Society (EMBC), 2010 Annual International Conference of the IEEE; 2010. pp. 1360–3.
  • [15] Farwell L, Donchin E. Talking off the top of your head: toward a mental prosthesis utilizing event-related brain potentials. Electroencephalogr Clin Neurophysiol 1988;70(6):510–23.
  • [16] Guger C, Daban S, Sellers E, Holzner C, Krausz G, Carabalona R, et al. How many people are able to control a P300-based brain–computer interface (BCI)? Neurosci Lett 2009;462(1):94–8.
  • [17] Kaufmann T, Schulz SM, Grünzinger C, Kübler A. Flashing characters with famous faces improves ERP-based brain–computer interface performance. J Neural Eng 2011;8(5):056016.
  • [18] Long J, Li Y, Wang H, Yu T, Pan J, Li F. A hybrid brain computer interface to control the direction and speed of a simulated or real wheelchair. IEEE Trans Neural Syst Rehabil Eng 2012;20(5):720–9.
  • [19] Pfurtscheller G, da Silva FL. Event-related EEG/MEG synchronization and desynchronization: basic principles. Clin Neurophysiol 1999;110(11):1842–57.
  • [20] Bin G, Gao X, Yan Z, Hong B, Gao S. An online multi-channel SSVEP-based brain–computer interface using a canonical correlation analysis method. J Neural Eng 2009;6(4):046002.
  • [21] Middendorf M, McMillan G, Calhoun G, Jones K. Brain–computer interfaces based on the steady-state visual-evoked response. IEEE Trans Rehabil Eng 2000;8(2):211–4.
  • [22] Birbaumer N, Ghanayim N, Hinterberger T, Iversen I, Kotchoubey B, Kübler A, et al. A spelling device for the paralysed. Nature 1999;398(6725):297–8.
  • [23] Doud AJ, Lucas JP, Pisansky MT, He B. Continuous three-dimensional control of a virtual helicopter using a motor imagery based brain–computer interface. PLoS ONE 2011;6(10):e26322.
  • [24] Duan F, Lin D, Li W, Zhang Z. Design of a multimodal EEG-based hybrid BCI system with visual servo module. IEEE Trans Auton Mental Dev 2015;7(4):332–41.
  • [25] Koo B, Nam Y, Choi S. A hybrid EOG-P300 BCI with dual monitors. 2014 International Winter Workshop on Brain–Computer Interface (BCI); 2014. pp. 1–4.
  • [26] Ma J, Zhang Y, Cichocki A, Matsuno F. A novel EOG/EEG hybrid human machine interface adopting eye movements and ERPs: application to robot control. IEEE Trans Biomed Eng 2015;62(3):876–89.
  • [27] Royer AS, Doud AJ, Rose ML, He B. EEG control of a virtual helicopter in 3-dimensional space using intelligent control strategies. IEEE Trans Neural Syst Rehabil Eng 2010;18(6):581–9.
  • [28] Yong X, Fatourechi M, Ward RK, Birch GE. The design of a point-and-click system by integrating a self-paced brain–computer interface with an eye-tracker. IEEE J Emerg Sel Topics Circuits Syst 2011;1(4):590–602.
  • [29] Galán F, Nuttin M, Lew E, Ferrez P, Vanacker G, Philips J, et al. A brain-actuated wheelchair: asynchronous and non-invasive brain–computer interfaces for continuous control of robots. Clin Neurophysiol 2008;119(9):2159–69.
  • [30] Grewal H, Matthews A, Tea R, George K. LiDAR-based autonomous wheelchair. Sensors Applications Symposium (SAS), 2017 IEEE; 2017. pp. 1–6.
  • [31] Li Z, Lei S, Su C-Y, Li G. Hybrid brain/muscle-actuated control of an intelligent wheelchair. 2013 IEEE International Conference on Robotics and Biomimetics (ROBIO); 2013. pp. 19–25.
  • [32] Rebsamen B, Guan C, Zhang H, Wang C, Teo C, Ang MH, et al. A brain controlled wheelchair to navigate in familiar environments. IEEE Trans Neural Syst Rehabil Eng 2010;18(6):590–8.
  • [33] Yu Y, Zhou Z, Yin E, Jiang J, Tang J, Liu Y, et al. Toward brain-actuated car applications: self-paced control with a motor imagery-based brain–computer interface. Comput Biol Med 2016;77:148–55.
  • [34] Xu F, Zhou W, Zhen Y, Yuan Q. Classification of motor imagery tasks for electrocorticogram based brain–computer interface. Biomed Eng Lett 2014;4(2):149–57.
  • [35] Zhang Z, Duan F, Solé-Casals J, Dinarès-Ferran J, Cichocki A, Yang Z, et al. A novel deep learning approach with data augmentation to classify motor imagery signals. IEEE Access 2019;7:15945–54.
  • [36] Barea R, Boquete L, Mazo M, Lopez E. System for assisted mobility using eye movements based on electrooculography. IEEE Trans Neural Syst Rehabil Eng 2002;10(4):209–18.
  • [37] Bastos-Filho TF, Cheein FA, Muller SMT, Celeste WC, de la Cruz C, Cavalieri DC, et al. Towards a new modality-independent interface for a robotic wheelchair. IEEE Trans Neural Syst Rehabil Eng 2014;22(3):567–84.
  • [38] Al-Haddad A, Sudirman R, Omar C, Hui KY, Jimin MR. Wheelchair motion control guide using eye gaze and blinks based on PointBug algorithm. 2012 Third International Conference on Intelligent Systems, Modelling and Simulation (ISMS); 2012. pp. 37–42.
  • [39] Duguleana M, Mogan G. Using eye blinking for EOG-based robot control. IFIP Advances in Information and Communication Technology. Springer Berlin Heidelberg; 2010. pp. 343–50.
  • [40] Shen H-M, Hu L, Lee K-M, Fu X. Multi-motion robots control based on bioelectric signals from single-channel dry electrode. Proc Inst Mech Eng Part H: J Eng Med 2015;229(2):124–36.
  • [41] El-Halabi M, Haidar R, El Kadri R, Lahoud C. Eye-blinks communication vehicle: a prototype. 2017 Fourth International Conference on Advances in Biomedical Engineering (ICABME); 2017. pp. 1–4.
  • [42] Borghetti D, Bruni A, Fabbrini M, Murri L, Sartucci F. A low-cost interface for control of computer functions by means of eye movements. Comput Biol Med 2007;37(12):1765–70.
  • [43] Królak A, Strumillo P. Eye-blink detection system for human–computer interaction. Univers Access Inf Soc 2011;11(4):409–19.
  • [44] Usakli AB, Gurkan S, Aloise F, Vecchiato G, Babiloni F. On the use of electrooculogram for efficient human computer interfaces. Comput Intell Neurosci 2010;2010:1–5.
  • [45] Yamagishi K, Hori J, Miyakawa M. Development of EOG-based communication system controlled by eight-directional eye movements. Engineering in Medicine and Biology Society, 2006. EMBS'06. 28th Annual International Conference of the IEEE; 2006. pp. 2574–7.
  • [46] Aungsakul S, Phinyomark A, Phukpattaranont P, Limsakul C. Evaluating feature extraction methods of electrooculography (EOG) signal for human–computer interface. Proc Eng 2012;32:246–52.
  • [47] Aungsakun S, Phinyomark A, Phukpattaranont P, Limsakul C. Robust eye movement recognition using EOG signal for human–computer interface. Software Engineering and Computer Systems. Springer Berlin Heidelberg; 2011. pp. 714–23.
  • [48] Barea R, Boquete L, Ortega S, López E, Rodríguez-Ascariz J. EOG-based eye movements codification for human computer interaction. Expert Syst Appl 2012;39(3):2677–83.
  • [49] Heo J, Yoon H, Park K. A novel wearable forehead EOG measurement system for human computer interfaces. Sensors 2017;17(7):1485.
  • [50] Ang AMS, Zhang ZG, Hung YS, Mak JNF. A user-friendly wearable single-channel EOG-based human–computer interface for cursor control. 2015 7th International IEEE/EMBS Conference on Neural Engineering (NER); 2015. pp. 565–8.
  • [51] Ning B, Li M-j, Liu T, Shen H-m, Hu L, Fu X. Human brain control of electric wheelchair with eye-blink electrooculogram signal. Intelligent Robotics and Applications. Springer Berlin Heidelberg; 2012. pp. 579–88.
  • [52] Iturrate I, Antelis J, Minguez J. Synchronous EEG brain-actuated wheelchair with automated navigation. IEEE International Conference on Robotics and Automation, 2009. ICRA'09; 2009. pp. 2318–25.
  • [53] Wolpaw J, Ramoser H, McFarland D, Pfurtscheller G. EEG-based communication: improved accuracy by response verification. IEEE Trans Rehabil Eng 1998;6(3):326–33.
Notes
Record compiled under agreement 509/P-DUN/2018 with funds from the Polish Ministry of Science and Higher Education (MNiSW) allocated to activities popularizing science (2019).
Document type
Bibliography
YADDA identifier
bwmeta1.element.baztech-d8809819-c0f0-4757-919a-901c8546126b