

Article title

Detecting gaze direction using robot-mounted and mobile-device cameras

Full text / Content
Identifiers
Title variants
Languages of publication
EN
Abstracts
EN
Two common channels through which humans communicate are speech and gaze. Eye gaze is an important mode of communication: it allows people to better understand each other's intentions, desires, interests, and so on. The goal of this research is to develop a framework for gaze-triggered events that can be executed on a robot and on mobile devices and that supports running experiments. We experimentally evaluate the framework and the techniques implemented in it for extracting gaze direction from a robot-mounted camera or a mobile-device camera. We investigate the impact of light on the accuracy of gaze estimation, as well as how the overall accuracy depends on user eye and head movements. Our research shows that light intensity is important and that the placement of a light source is crucial. All the robot-mounted gaze-detection modules we tested were found to be similar with regard to their accuracy. The framework we developed was tested in a human-robot interaction experiment involving a job-interview scenario. The flexible structure of this scenario allowed us to test different components of the framework in varied real-world settings, which was very useful for progressing towards our long-term research goal of designing intuitive gaze-based interfaces for human-robot communication.
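To make the abstract's notion of extracting gaze direction from a single camera more concrete, the following minimal sketch in Python (OpenCV) illustrates one crude approach; it is not the authors' implementation. OpenCV's stock Haar cascades locate a face and an eye, the darkest point inside the eye region is taken as a rough pupil proxy, and its horizontal offset within the eye box is mapped to a coarse left/center/right gaze label. The thresholds and the webcam used as a stand-in for the robot-mounted or mobile-device camera are illustrative assumptions only.

# Minimal illustrative sketch (not the authors' implementation): coarse
# horizontal gaze estimation from a single RGB frame using OpenCV's stock
# Haar cascades and a darkest-point pupil heuristic. All thresholds are
# arbitrary assumptions chosen for illustration.
import cv2

face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
eye_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_eye.xml")

def coarse_gaze(frame_bgr):
    """Return 'left', 'center', or 'right' for the first detected eye, or None."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    for (fx, fy, fw, fh) in face_cascade.detectMultiScale(gray, 1.3, 5):
        face_roi = gray[fy:fy + fh, fx:fx + fw]
        for (ex, ey, ew, eh) in eye_cascade.detectMultiScale(face_roi, 1.1, 10):
            eye = cv2.GaussianBlur(face_roi[ey:ey + eh, ex:ex + ew], (7, 7), 0)
            # Crude pupil proxy: location of the darkest pixel in the eye box.
            _, _, (px, _), _ = cv2.minMaxLoc(eye)
            offset = (px - ew / 2.0) / (ew / 2.0)  # -1 (far left) .. 1 (far right)
            if offset < -0.25:
                return "left"
            if offset > 0.25:
                return "right"
            return "center"
    return None

if __name__ == "__main__":
    cap = cv2.VideoCapture(0)   # default webcam as a stand-in for the
    ok, frame = cap.read()      # robot-mounted / mobile-device camera
    cap.release()
    if ok:
        print(coarse_gaze(frame))

In practice, robust gaze estimation (as surveyed in [2] and [8] below) relies on facial-landmark and head-pose models rather than such a pixel-intensity heuristic; the sketch only conveys the general pipeline of locating the eye and relating pupil position to gaze direction.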
Publisher
Journal
Year
Pages
453--474
Physical description
Bibliography: 18 items, figures, tables
Authors
  • AGH University of Science and Technology, Faculty of Computer Science, Electronics, and Telecommunications, Department of Computer Science, al. A. Mickiewicza 30, 30-059 Krakow, Poland
Bibliography
  • [1] Baltrusaitis T., Robinson P., Morency L.P.: Constrained local neural fields for robust facial landmark detection in the wild. In: Proceedings of the IEEE International Conference on Computer Vision Workshops, pp. 354-361. 2013.
  • [2] Chennamma H., Yuan X.: A survey on eye-gaze tracking techniques. Indian Journal of Computer Science and Engineering, vol. 4(5), pp.388-393, 2013.
  • [3] Droege D., Paulus D.: Pupil center detection in low resolution images. In: Proceedings of the 2010 Symposium on Eye-Tracking Research & Applications, pp. 169-172. ACM, 2010.
  • [4] Fuhl W., Geisler D., Santini T., Rosenstiel W., Kasneci E.: Evaluation of state-of-the-art pupil detection algorithms on remote eye images. In: Proceedings of the 2016 ACM International Joint Conference on Pervasive and Ubiquitous Computing: Adjunct, pp. 1716-1725. ACM, 2016.
  • [5] Fuhl W., Santini T.C., Kubler T., Kasneci E.: ElSe: Ellipse selection for robust pupil detection in real-world environments. In: Proceedings of the Ninth Biennial ACM Symposium on Eye Tracking Research & Applications, pp. 123-130. ACM, 2016.
  • [6] Gee A., Cipolla R.: Determining the gaze of faces in images. Image and Vision Computing, vol. 12(10), pp. 639-647, 1994.
  • [7] George A., Routray A.: Fast and accurate algorithm for eye localisation for gaze tracking in low-resolution images. IET Computer Vision, vol. 10(7), pp. 660-669, 2016.
  • [8] Kar A., Corcoran P.: A Review and Analysis of Eye-Gaze Estimation Systems, Algorithms and Performance Evaluation Methods in Consumer Platforms. IEEE Access, vol. 5, pp. 16495-16519, 2017.
  • [9] Kazemi V., Sullivan J.: One millisecond face alignment with an ensemble of regression trees. In: Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (CVPR 2014), Columbus, OH, USA, pp. 1867-1874. IEEE Computer Society, 2014.
  • [10] Mohamed A.O., Da Silva M.P., Courboulay V.: A history of eye gaze tracking. Rapport Interne, 2007. https://hal.archives-ouvertes.fr/hal-00215967.
  • [11] Sagonas C., Antonakos E., Tzimiropoulos G., Zafeiriou S., Pantic M.: 300 faces in-the-wild challenge: Database and results, Image and Vision Computing, vol. 47, pp. 3-18, 2016.
  • [12] Sapienza M., Camilleri K.: Fasthpe: a recipe for quick head pose estimation. Systems and Control Engineering, Department of Systems and Control Engineering, University of Malta, Msida, Malta, 2011.
  • [13] Timm F., Barth E.: Accurate Eye Centre Localisation by Means of Gradients. VISAPP, vol. 11, pp. 125-130, 2011.
  • [14] Valenti R., Gevers T.: Accurate eye center location and tracking using isophote curvature. In: IEEE Conference on Computer Vision and Pattern Recognition, CVPR 2008, pp. 1-8. IEEE, 2008.
  • [15] Wood E., Baltrusaitis T., Zhang X., Sugano Y., Robinson P., Bulling A.: Rendering of eyes for eye-shape registration and gaze estimation. In: Proceedings of the IEEE International Conference on Computer Vision, pp. 3756-3764. 2015.
  • [16] Wood E., Bulling A.: EyeTab: Model-based gaze estimation on unmodified tablet computers. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 207-210. ACM, 2014.
  • [17] Yamazoe H., Utsumi A., Yonezawa T., Abe S.: Remote gaze estimation with a single camera based on facial-feature tracking without special calibration actions. In: Proceedings of the 2008 Symposium on Eye Tracking Research & Applications, pp. 245-250. ACM, 2008.
  • [18] Zhang X., Sugano Y., Fritz M., Bulling A.: Appearance-based gaze estimation in the wild. In: Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, pp. 4511-4520. 2015.
Document type
YADDA identifier
bwmeta1.element.baztech-1bc09794-a281-483d-8252-be39bf45e817