Article title

Detection of eye closing/opening from EOG and its application in robotic arm control

Publication languages
EN
Abstracts
EN
Detection of eye closing/opening from alpha-blocking in the EEG of the occipital region has been used to build human-machine interfaces. This paper presents an alternative method for detecting eye closing/opening from EOG signals in an online setting. The accuracies for correct detection of eye closing and opening with the proposed techniques were found to be 95.6% and 91.9%, respectively, for 8 healthy subjects. These techniques were then combined with the detection of eye blinks, whose accuracy turned out to be 96.9%. The combined detector was used to build an interface for robotic arm control in a pick-and-place task. The same task was also carried out using a haptic device as a master. The speed and accuracy of the two methods were then compared to quantitatively assess the ease of using the proposed interface. The interface appears likely to be very useful for persons with neurodegenerative disorders who can still perform eye closing/opening and eye blinks.
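The abstract describes detecting blinks and eye closing/opening from EOG signals. As a rough illustration of the general idea (not the paper's actual method), the sketch below implements a simple amplitude-threshold blink detector on a single synthetic EOG channel; the sampling rate, threshold, and merge window are illustrative assumptions.

```python
import numpy as np

def detect_blinks(eog, fs, threshold=200.0, min_gap=0.1):
    """Return sample indices where the EOG amplitude first crosses
    a fixed threshold, merging crossings closer than min_gap seconds.
    The threshold (in microvolts) is illustrative; a real interface
    would calibrate it per subject."""
    above = np.flatnonzero(eog > threshold)
    events = []
    for idx in above:
        if not events or idx - events[-1] > min_gap * fs:
            events.append(int(idx))
    return events

# Synthetic 2 s EOG trace at 250 Hz: baseline noise plus two
# blink-like Gaussian pulses centred at 0.5 s and 1.5 s.
rng = np.random.default_rng(0)
fs = 250
t = np.arange(0, 2, 1 / fs)
eog = 20.0 * rng.standard_normal(t.size)   # ~20 uV noise floor
for centre in (0.5, 1.5):
    eog += 400.0 * np.exp(-((t - centre) ** 2) / (2 * 0.02 ** 2))

blink_samples = detect_blinks(eog, fs)
print([s / fs for s in blink_samples])     # approximate blink onsets, in seconds
```

A deployed system would add band-pass filtering and separate rising/falling-edge logic to distinguish a brief blink from sustained eye closing or opening, which is why the paper reports separate accuracies for the three events.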
Authors
author
  • Homi Bhabha National Institute, India
author
  • National Brain Research Centre, Manesar, India
  • Homi Bhabha National Institute, India
Notes
Record prepared with funding from MNiSW, agreement No. 461252, under the programme "Społeczna odpowiedzialność nauki" (Social Responsibility of Science), module: popularisation of science and promotion of sport (2020).
Document type
Bibliography
YADDA identifier
bwmeta1.element.baztech-424d65c6-4d94-41c5-8535-6ab14d75a2de