Article title

Single web camera robust interactive eye-gaze tracking method

Publication languages
EN
Abstracts
EN
Eye-gaze tracking is an aspect of human-computer interaction that is still growing in popularity. Tracking the human gaze point can help control user interfaces and may help evaluate graphical user interfaces. At the same time, professional eye-trackers are very expensive and thus unavailable to most user interface researchers and small companies. The paper presents a very effective, low-cost, computer-vision-based, interactive eye-gaze tracking method. In contrast to other authors' results, the method achieves very high precision (about 1.5 deg horizontally and 2.5 deg vertically) at 20 fps, using a simple HD web camera under reasonable environmental restrictions. The paper describes the algorithms used in the eye-gaze tracking method and the results of experimental tests, covering both static absolute point-of-interest estimation and dynamic, functional gaze-controlled cursor steering.
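
The algorithms themselves are detailed in the paper. Purely as an illustration of the kind of single-webcam pipeline the abstract outlines (face and eye localization, pupil segmentation, and a calibrated mapping to screen coordinates), the minimal Python/OpenCV sketch below combines two techniques cited in the bibliography: Viola-Jones Haar-cascade detection [64] and Otsu thresholding [67]. All parameters (cascade settings, blur kernel, normalization by the eye box) are illustrative assumptions, not the authors' method, and the interactive calibration that would map the normalized pupil position to an absolute screen point is only indicated in a comment.

    # Minimal single-webcam gaze sketch: Haar cascades [64] + Otsu [67].
    # Illustrative assumptions throughout; not the method from the paper.
    import cv2

    face_cascade = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
    eye_cascade = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_eye.xml")

    def pupil_center(eye_gray):
        # Otsu-threshold the smoothed eye patch [67]; the pupil is the
        # dominant dark blob, so invert and take the mask centroid.
        blur = cv2.GaussianBlur(eye_gray, (5, 5), 0)
        _, mask = cv2.threshold(blur, 0, 255,
                                cv2.THRESH_BINARY_INV + cv2.THRESH_OTSU)
        m = cv2.moments(mask)
        if m["m00"] == 0:
            return None
        return (m["m10"] / m["m00"], m["m01"] / m["m00"])

    def gaze_sample(frame):
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        # Viola-Jones cascade detection [64]; 1.3 / 5 are assumed settings.
        for (fx, fy, fw, fh) in face_cascade.detectMultiScale(gray, 1.3, 5):
            upper = gray[fy:fy + fh // 2, fx:fx + fw]  # eyes lie in the upper face half
            for (ex, ey, ew, eh) in eye_cascade.detectMultiScale(upper):
                c = pupil_center(upper[ey:ey + eh, ex:ex + ew])
                if c is not None:
                    # Pupil position normalized to the eye box; an interactive
                    # calibration (user fixating known screen points) would map
                    # this pair to absolute screen coordinates.
                    return (c[0] / ew, c[1] / eh)
        return None

    cap = cv2.VideoCapture(0)  # single commodity (HD) web camera
    ok, frame = cap.read()
    if ok:
        print(gaze_sample(frame))
    cap.release()

In practice a pipeline of this kind would add temporal filtering of the pupil trajectory and a per-user calibration phase before it could approach the precision and frame-rate figures reported above.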
Pages
879–886
Physical description
Bibliography: 69 items, figures, charts, tables.
Authors
  • Institute of Information Technology, Lodz University of Technology, 215 Wólczańska St., 90-924 Łódź, Poland
  • Institute of Information Technology, Lodz University of Technology, 215 Wólczańska St., 90-924 Łódź, Poland
Bibliography
  • [1] A. Duchowski, Eye Tracking Methodology: Theory and Practice, Springer, Berlin, 2007.
  • [2] J.M. Henderson, “Human gaze control during real-world scene perception”, Trends in Cognitive Sciences 7 (11), 498-504 (2003).
  • [3] K. Rayner, “Eye movements and the perceptual span in beginning and skilled readers”, J. Experimental Child Psychology 41, 211-236 (1986).
  • [4] D.M. Snodderly, I. Kagan, and M. Gur, “Selective activation of visual cortex neurons by fixational eye movements: implications for neural coding”, Visual Neuroscience 18, 259-277 (2001).
  • [5] S. Josephson and M.E. Holmes, “Clutter or content? How onscreen enhancements affect how TV viewers scan and what they learn”, Proc. Eye Tracking Research and Applications ETRA 1, 155-162 (2006).
  • [6] G.L. Lohse, “Consumer eye movement patterns on yellow pages advertising”, J. Advertising 26 (1), 61-73 (1997).
  • [7] K. Ślot, Selected Aspects of Biometry, WKŁ, Warsaw, 2008, (in Polish).
  • [8] M.M. Hayhoe, D.H. Ballard, J. Triesch, H. Shinoda, P. Aivar, and B. Sullivan, “Vision in natural and virtual environments”, Proc. ACM Symp. on Eye Tracking Research and Application 1, 7-13 (2002).
  • [9] G. Ho, C.T. Scialfa, J.K. Caird, and T. Graw, “Visual search for traffic signs: the effects of clutter, luminance and aging”, Human Factors 43 (3), 194-207 (2001).
  • [10] R.J. Jacob, “What you look at is what you get: eye movement-based interaction techniques”, ACM Conf. Proc. Human Factors in Computing Systems: CHI 90, 11-18 (1990).
  • [11] I. Starker and R.A. Bolt, “A gaze-responsive self-disclosing display”, ACM Conf. Proc. on Human Factors in Computing Systems 1, 3-9 (1990).
  • [12] V. Tanriverdi and R.J.K. Jacob, “Interacting with eye movements in virtual environments”, ACM Conf. Proc. on Human Factors in Comp. Sci. 1, 265-272 (2000).
  • [13] P. Kotze, M. Eloff, A. Adesina-Ojo, and J. Eloff, “Accessible computer interaction for people with disabilities: the case of quadriplegics”, 6th Int. Conf. on Ent. Inf. Sys. 1, 14-17 (2004).
  • [14] P. Majaranta and K.J. Raiha, “Twenty years of eye typing: systems and design issues”, Proc. ETRA 1, 15-22 (2002).
  • [15] A.J. Hornof and A. Cavender, “EyeDraw: enabling children with severe motor impairments to draw with their eyes”, Proc. SIGCHI Conference on Human Factors in Computing Systems, ACM Press 1, 161-170 (2005).
  • [16] A. Santella, M. Agrawala, D. DeCarlo, D. Salesin, and M. Cohen, “Gaze-based interaction for semi-automatic photo cropping”, Proc. SIGCHI Conf. on Human Factors in Computing Systems, ACM Press 1, 781-790 (2006).
  • [17] B.H. McCormick, D.A. Batte, and A.T. Duchowski, “A virtual environment: exploring the brain forest”, Computación y Sistemas 1 (1), 5-13 (1997).
  • [18] B. Watson, N. Walker, L.F. Hodges, and A. Worden, “Managing level of detail through peripheral degradation: effects on search performance with a head-mounted display”, ACM Transactions on CHI 4 (4), 323-346 (1997).
  • [19] A. Krolak and P. Strumiłło, “Eye-blink detection system for human-computer interaction”, Univers. Access Inf. Soc. 10, 1-11 (2011).
  • [20] B. Watson, N. Walker, and L.F. Hodges, “Supra threshold control of peripheral LOD”, Trans. on Graphics 23 (3), 750-759 (2004).
  • [21] E.C. Chang, S. Mallat, and C. Yap, “Wavelet foveation”, J. Appl. Comp. Harmonic Analysis 9 (3), 312-335 (2000).
  • [22] A.T. Duchowski, “Acuity-matching resolution degradation through wavelet coefficient scaling”, IEEE Trans. on Image Processing 9 (8), 1437-1440 (2000).
  • [23] M. Levoy and R. Whitaker, “Gaze-directed volume rendering”, Comp. Grap. (SIGGRAPH’90) 1, 217-223 (1990).
  • [24] R. Danforth, A. Duchowski, R. Geist, and E. McAliley, “A platform for gaze-contingent virtual environments”, Smart Graphics (Papers from the 2000 AAAI Spring Symp., Technical Report) SS-00-04, 66-70 (2000).
  • [25] D. Luebke and C. Erikson, “View-dependent simplification of arbitrary polygonal environments”, Computer Graphics (SIGGRAPH ’97) 1, 199-208 (1997).
  • [26] H. Murphy and A.T. Duchowski, “Gaze-contingent level of detail”, Eurographics 1, 1-10 (2001).
  • [27] S. Wiak and P. Napieralski, “Visualization method of magnetic fields with dynamic particle systems”, Electrotechnical Review 88 (7B), 56-59 (2012).
  • [28] C. O’Sullivan and J. Dingliana, “Collisions and perception”, ACM Trans. on Graphics 20 (3), 151-168 (2001).
  • [29] C. O’Sullivan, J. Dingliana, and S. Howlett, “Gaze-contingent algorithms for interactive graphics”, in eds. J. Hyona, R. Radach, H. Deubel, The Mind’s Eyes: Cognitive and Applied Aspects of Eye Movement Research, Elsevier Science, Oxford, 2002.
  • [30] R.J.K. Jacob, A. Girouard, L.M. Hirsfield, M.S. Horn, O. Shaer, E.T. Solovey, and J. Zigelbaum, “Reality-based interaction: a framework for post-WIMP interfaces”, Proc. ACM CHI Conf. 1, 201-210 (2008).
  • [31] J. Nielsen, “Noncommand user interfaces”, Communications of the ACM 36 (4), 83-99 (1993).
  • [32] G. Demenko, B. Mobius, and K. Klessa, “Implementation of Polish speech synthesis for the BOSS system”, Bull. Pol. Ac.: Tech. 58 (3), 371-376 (2010).
  • [33] T. Marciniak, R. Weychan, A. Stankiewicz, and A. Dąbrowski, “A biometric speech signal processing in a system with digital signal processor”, Bull. Pol. Ac.: Tech. 62 (3), 589-595 (2014).
  • [34] S. Zhai, C. Morimoto, and S. Ihde, “Manual and gaze input cascaded (MAGIC) pointing”, Proc. ACM SIGCHI Conf. on Human Factors in Computing Systems 1, 246-253 (1999).
  • [35] A. Wojciechowski, “Hand’s poses recognition as a mean of communication within natural user interfaces”, Bull. Pol. Ac.: Tech. 60 (2), 331-336 (2012).
  • [36] G. Glonek and M. Pietruszka, “Natural user interfaces (NUI): review”, J. Applied Comp. Science 20 (2), 27-46 (2012).
  • [37] P.J. Durka, R. Kuś, J. Żygierewicz, M. Michalska, P. Milanowski, M. Łabęcki, T. Spustek, D. Laszuk, A. Duszyk, and M. Kruszyński, “User-centered design of brain-computer interfaces: openBCI.pl and BCI appliance”, Bull. Pol. Ac.: Tech. 60 (3), 427-431 (2012).
  • [38] C.S. Lin, C.C. Huan, C.N. Chan, M.S. Yeh, and C.C. Chiu, “Design of a computer game using an eye-tracking device for eye’s activity rehabilitation”, Optics and Lasers in Engineering 42 (1), 91-108 (2004).
  • [39] N. Schneider, E. Barth, P. Bex, and M. Dorr, “An open-source low-cost eye-tracking system for portable real-time and offline tracking”, Proc. 2011 Conf. on Novel Gaze-Controlled Applications 1, 1-4 (2011).
  • [40] Y. Ishiguro, A. Mujibiya, T. Miyaki, and J. Rekimoto, “Aided eyes: eye activity sensing for daily life”, Proc. 1st ACM Augmented Human Int. Conf. 1, 25-31 (2010).
  • [41] K.H. Tan, D.J. Kriegman, and N. Ahuja, “Appearance-based eye gaze estimation”, Proc. Sixth IEEE Workshop on Applications of Computer Vision 1, 191-195 (2002).
  • [42] M.R.M. Mimica and C.H. Morimoto, “A computer vision framework for eye gaze tracking”, Proc. XVI Brazilian Symp. on Comp. Graph. and Image Proc. 1, 40-412 (2003).
  • [43] C.H. Morimoto and M.R.M. Mimica, “Eye gaze tracking techniques for interactive applications”, Computer Vision and Image Understanding 98 (1), 4-24 (2005).
  • [44] Z. Zhu and Q. Ji, “Eye gaze tracking under natural head movements”, IEEE CS Conf. on Computer Vision and Pattern Recognition 1, 918-923 (2005).
  • [45] M. Reale, T. Hung, and L. Yin, “Pointing with the eyes: gaze estimation using a static/active camera system and 3d iris disk model”, IEEE Int. Conf. on Multimedia and Expo (ICME) 1, 280-285 (2010).
  • [46] C. Hennessey, B. Noureddin, and P. Lawrence, “A single camera eye-gaze tracking system with free head motion”, Proc. 2006 ACM Symp. on Eye Tracking Research & Applications 1, 87-94 (2006).
  • [47] D.W. Hansen and Q. Ji, “In the eye of the beholder: a survey of models for eyes and gaze”, IEEE Trans. on Pattern Analysis and Machine Intelligence 32 (3), 478-500 (2010).
  • [48] A. Villanueva and R. Cabeza, “Models for gaze tracking systems”, J. Image and Video Processing 2007 (3), 4-19 (2007).
  • [49] D.H. Yoo and M.J. Chung, “Non-intrusive eye gaze estimation without knowledge of eye pose”, Proc. Sixth IEEE Int. Conf. on Automatic Face and Gesture Recognition 1, 785-790 (2004).
  • [50] Y. Zhou, X. Zhao, S. Zhang, and Y. Zhang, “Design of eye tracking system for real scene”, IEEE Pacific-Asia Workshop on Computational Intelligence and Industrial Application 1, 708–711 (2008).
  • [51] L.E. Sibert and R.J. Jacob, “Evaluation of eye gaze interaction”, Proc. ACM SIGCHI Conf. on Human Factors in Comp. Systems 1, 281–288 (2000).
  • [52] K.N. Kim and R.S. Ramakrishna, “Vision-based eye-gaze tracking for human computer interface”, Proc. IEEE Int. Conf. on Systems Man and Cybernetics 1, 324–329 (1999).
  • [53] B.L. Nguyen, C. Tijus, F. Jouen, M. Molina, and Y. Chahir, “Eye gaze tracking with free head movements using a single camera”, Proc. 2010 Symp. on Information and Communication Technology 1, 108–113 (2010).
  • [54] O.M.C. Williams, A. Blake, and R. Cipolla, “Sparse and semi-supervised visual mapping with the S3GP”, Proc. IEEE Conf. Comp. Vision and Pattern Rec. 1, 230–237 (2006).
  • [55] M. Betke, J. Gips, and P. Fleming, “The camera mouse: visual tracking of body features to provide computer access for people with severe disabilities”, IEEE Trans. on Neural Systems and Rehabilitation Engineering 10 (1), 1–10 (2002).
  • [56] W. Sewell and O. Komogortsev, “Real-time eye gaze tracking with an unmodified commodity webcam employing a neural network”, ACM CHI’10 Extended Abstracts on Human Factors in Computing Systems 1, 3739–3744 (2010).
  • [57] T. Liu and C. Pang, “Eye-gaze tracking research based on image processing”, IEEE Congress on Image and Signal Processing 1, 176–180 (2008).
  • [58] C.W. Kao, C.W. Yang, Y.W. Chen, K.C. Fan, B.J. Hwang, and C.P. Huang, “Eye gaze tracking based on pattern voting scheme for mobile device”, Proc. IEEE Int. Conf. on Instrumentation, Measurement, Computer, Communication and Control 1, 337–340 (2011).
  • [59] J.J. Magee, M.R. Scott, B.N. Waber, and M. Betke, “Eyekeys: a real-time vision interface based on gaze detection from a low-grade video camera”, IEEE Computer Vision and Pattern Recognition Workshop 1, 159–159 (2004).
  • [60] H. Yamazoe, A. Utsumi, T. Yonezawa, and S. Abe, “Remote gaze estimation with a single camera based on facial-feature tracking without special calibration actions”, Proc. Symp. Eye Tracking Research and Applications 1, 140–145 (2008).
  • [61] T. Ishikawa, S. Baker, I. Matthews, and T. Kanade, “Passive driver gaze tracking with active appearance models”, Proc. 11th World Congress Int. Transport. Syst. 1, 159–164 (2004).
  • [62] S.A. Mahmoudi, M. Kierzynka, P. Manneback, and K. Kurowski, “Real-time motion tracking using optical flow on multiple GPUs”, Bull. Pol. Ac.: Tech. 62 (1), 139–150 (2014).
  • [63] R. Lienhart and J. Maydt, “An extended set of Haar-like features for rapid object detection”, IEEE ICIP 2002 1, 900-903 (2002).
  • [64] P.A. Viola and M.J. Jones, “Rapid object detection using a boosted cascade of simple features”, IEEE Computer Society Conf. on Computer Vision and Pattern Recognition 1, 511–518 (2001).
  • [65] O. Oguz, “The proportion of the face in younger adults using the thumb rule of Leonardo da Vinci”, J. Surg. Radiol. Anat. 18 (2), 111–114 (1996).
  • [66] M. Sezgin and B. Sankur, “Survey over image thresholding techniques and quantitative performance evaluation”, J. Electronic Imaging 13 (1), 146–165 (2004).
  • [67] N. Otsu, “A threshold selection method from gray-level histograms”, IEEE Trans. Sys., Man., Cyber. 9 (1), 62–66 (1979).
  • [68] M. Taghizadeh and M.R. Mahzoun, “Bidirectional image thresholding algorithm using combined edge detection and p-tile algorithms”, TJMCS 2 (2), 255–261 (2011).
  • [69] A. Tomczyk and P.S. Szczepaniak, “Adaptive potential active contours”, Pattern Analysis and Application 14, 425–440 (2011).
Document type
YADDA identifier
bwmeta1.element.baztech-3831b0b0-de25-46cb-ae18-da1e93eb3da4