Article title

MEMS mirror based eye tracking: simulation of the system parameter effect on the accuracy of pupil position estimation

Publication languages
EN
Abstracts
EN
Eye tracking systems are mostly video-based and require significant computation to achieve good accuracy. An alternative method with comparable accuracy but lower computational cost is 2D microelectromechanical system (MEMS) mirror scanning. However, this technology is relatively new and few publications describe it. The purpose of this study was to examine how the individual parameters of the system components affect the accuracy of pupil position estimation. The study was conducted with a virtual simulator. It was shown that the optimal detector field of view (FOV) depends on the frequency ratio of the MEMS mirror axes. For a ratio of 1:13, the smallest errors occurred at FOV values of 0.1°, 1.65°, 2.3°, and 2.95°. For signal sampling rates above 3 kHz, the error stabilizes at 0.065° and no longer changes as the number of samples increases. The error as a function of the axis frequency ratio grows linearly in the range 0.065°-0.1° up to a ratio of 1:230; above this, it jumps abruptly to an average of 0.3°. The research provides guidance for selecting component parameters when building MEMS mirror-based eye tracking systems.
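The scanning principle summarized in the abstract can be illustrated with a short simulation. The sketch below is a minimal, hypothetical model, not the simulator used in the study: a 2D MEMS mirror traces a Lissajous pattern, a binary detector flags the samples where the beam falls on the pupil, and the pupil centre is estimated as the centroid of those samples. All parameter values (axis frequencies, scan amplitude, pupil model, centroid estimator) are illustrative assumptions.

```python
# Minimal sketch of a Lissajous-scan pupil-position simulation.
# Hypothetical model: parameters and the centroid estimator are
# illustrative assumptions, not the authors' simulator.
import numpy as np

def simulate(pupil_center=(2.0, -1.5),    # true pupil centre [deg]
             pupil_radius=2.0,            # pupil angular radius [deg]
             scan_amplitude=10.0,         # mirror half-angle per axis [deg]
             f_fast=1300.0, f_slow=100.0, # axis frequencies [Hz], ratio 1:13
             fs=3000.0,                   # detector sampling rate [Hz]
             duration=1.0):               # simulated scan time [s]
    t = np.arange(0.0, duration, 1.0 / fs)
    # Lissajous trajectory traced by the beam deflected off the 2D mirror.
    x = scan_amplitude * np.sin(2.0 * np.pi * f_fast * t)
    y = scan_amplitude * np.sin(2.0 * np.pi * f_slow * t)
    # Binary detector model: reflectance drops when the beam hits the pupil.
    inside = (x - pupil_center[0])**2 + (y - pupil_center[1])**2 <= pupil_radius**2
    if not inside.any():
        return float("nan")  # pupil never scanned at these settings
    # Estimate the pupil centre as the centroid of the "dark" samples.
    ex, ey = x[inside].mean(), y[inside].mean()
    # Angular estimation error [deg].
    return float(np.hypot(ex - pupil_center[0], ey - pupil_center[1]))

if __name__ == "__main__":
    for fs in (1000.0, 3000.0, 10000.0):
        print(f"fs = {fs:6.0f} Hz -> error = {simulate(fs=fs):.3f} deg")
```

In such a model the estimation error falls as the sampling rate rises, until it is limited by the scan-pattern coverage rather than by sampling, which is consistent with the plateau above 3 kHz reported in the abstract.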
Pages
711-724
Physical description
Bibliography: 20 items, figures, graphs
Authors
  • Military University of Technology, Institute of Optoelectronics, 2 Kaliskiego St., 00-908 Warsaw, Poland
  • Military University of Technology, Institute of Optoelectronics, 2 Kaliskiego St., 00-908 Warsaw, Poland
  • Military University of Technology, Institute of Optoelectronics, 2 Kaliskiego St., 00-908 Warsaw, Poland
Bibliography
  • [1] Duchowski, A. T., (2017). Eye tracking methodology: Theory and practice. Springer. https://doi.org/10.1007/978-3-319-57883-5
  • [2] Judd, T., Ehinger, K., Durand, F., & Torralba, A. (2009, September). Learning to predict where humans look. IEEE 12th International Conference on Computer Vision (pp. 2106-2113). IEEE. https://doi.org/10.1109/ICCV.2009.5459462
  • [3] Goldberg, J. H., & Kotval, X. P. (1999). Computer interface evaluation using eye movements: methods and constructs. International Journal of Industrial Ergonomics, 24(6), 631-645. https://doi.org/10.1016/S0169-8141(98)00068-7
  • [4] Hansen, D. W., & Ji, Q. (2009). In the eye of the beholder: A survey of models for eyes and gaze. IEEE Transactions on Pattern Analysis and Machine Intelligence, 32(3), 478-500. https://doi.org/10.1109/TPAMI.2009.30
  • [5] Carvalho, N., Laurent, E., Noiret, N., Chopard, G., Haffen, E., Bennabi, D., & Vandel, P. (2015). Eye movement in unipolar and bipolar depression: A systematic review of the literature. Frontiers in Psychology, 6, 1809. https://doi.org/10.3389/fpsyg.2015.01809
  • [6] Bittencourt, J., Velasques, B., Teixeira, S., Basile, L. F., Salles, J. I., Nardi, A. E., Budde, H., Cagy, M., Piedade, R., & Ribeiro, P. (2013). Saccadic eye movement applications for psychiatric disorders. Neuropsychiatric Disease and Treatment, 9, 1393. https://doi.org/10.2147/NDT.S45931
  • [7] Duchowski, A. T., Medlin, E., Gramopadhye, A., Melloy, B., & Nair, S. (2001, November). Binocular eye tracking in VR for visual inspection training. Proceedings of the ACM symposium on Virtual reality software and technology (pp. 1-8). https://doi.org/10.1145/505008.505010
  • [8] Blattgerste, J., Renner, P., & Pfeiffer, T. (2018, June). Advantages of eye-gaze over head-gaze-based selection in virtual and augmented reality under varying field of views. Proceedings of the Workshop on Communication by Gaze Interaction (pp. 1-9). https://doi.org/10.1145/3206343.3206349
  • [9] Păsărică, A., Bozomitu, R. G., Cehan, V., Lupu, R. G., & Rotariu, C. (2015, October). Pupil detection algorithms for eye tracking applications. 2015 IEEE 21st International Symposium for Design and Technology in Electronic Packaging (SIITME) (pp. 161-164). IEEE. https://doi.org/10.1109/SIITME.2015.7342317
  • [10] Stengel, M., Grogorick, S., Eisemann, M., Eisemann, E., & Magnor, M. A. (2015, October). An affordable solution for binocular eye tracking and calibration in head-mounted displays. Proceedings of the 23rd ACM international conference on Multimedia (pp. 15-24). https://doi.org/10.1145/2733373.2806265
  • [11] Wen, Q., Bradley, D., Beeler, T., Park, S., Hilliges, O., Yong, J., & Xu, F. (2020). Accurate Real-time 3D Gaze Tracking Using a Lightweight Eyeball Calibration. Computer Graphics Forum, 39(2), 475-485. https://doi.org/10.1111/cgf.13945
  • [12] Lee, G. J., Jang, S. W., & Kim, G. Y. (2020). Pupil detection and gaze tracking using a deformable template. Multimedia Tools and Applications, 79(19), 12939-12958. https://doi.org/10.1007/s11042-020-08638-7
  • [13] Gegenfurtner, A., Lehtinen, E., & Säljö, R. (2011). Expertise differences in the comprehension of visualizations: A meta-analysis of eye-tracking research in professional domains. Educational Psychology Review, 23(4), 523-552. https://doi.org/10.1007/s10648-011-9174-7
  • [14] Sarkar, N., O’Hanlon, B., Rohani, A., Strathearn, D., Lee, G., Olfat, M., & Mansour, R. R. (2017, January). A resonant eye-tracking microsystem for velocity estimation of saccades and foveated rendering. IEEE 30th International Conference on Micro Electro Mechanical Systems (MEMS) (pp. 304-307). IEEE. https://doi.org/10.1109/MEMSYS.2017.7863402
  • [15] Bartuzel, M. M., Wróbel, K., Tamborski, S., Meina, M., Nowakowski, M., Dalasiński, K., Szkulmowska, A., & Szkulmowski, M. (2020). High-resolution, ultrafast, wide-field retinal eye-tracking for enhanced quantification of fixational and saccadic motion. Biomedical Optics Express, 11(6), 3164-3180. https://doi.org/10.1364/BOE.392849
  • [16] Meyer, J., Schlebusch, T., Fuhl, W., & Kasneci, E. (2020). A novel camera-free eye tracking sensor for augmented reality based on laser scanning. IEEE Sensors Journal, 20(24), 15204-15212. https://doi.org/10.1109/JSEN.2020.3011985
  • [17] Pomianek, M., Piszczek, M., Maciejewski, M., & Krukowski, P. (2020, October). Pupil Position Estimation Error in an Eye Tracking System Based on the MEMS Mirror Scanning Method. Proceedings of the 3rd International Conference on Microelectronic Devices and Technologies (MicDAT’ 2020) (pp. 28-30). IFSA.
  • [18] Pengfei, Y., Zhengming, C., Jing, T., & Lina, Q. (2016). Virtual Simulation System of Cutter Suction Dredger Based on Unity3D. Journal of Systems Simulation, 28(9), 2069-2075.
  • [19] Richards, D., & Taylor, M. (2015). A Comparison of learning gains when using a 2D simulation tool versus a 3D virtual world: An experiment to find the right representation involving the Marginal Value Theorem. Computers & Education, 86, 157-171. https://doi.org/10.1016/j.compedu.2015.03.009
  • [20] Müller, L. M., Mandon, K., Gliesche, P., Weiß, S., & Heuten, W. (2020, November). Visualization of Eye Tracking Data in Unity3D. 19th International Conference on Mobile and Ubiquitous Multimedia (pp. 343-344). https://doi.org/10.1145/3428361.3431194
Notes
1. The research was funded by the Military University of Technology (Grant Number ZBW/08-893/2020/WAT) and Remmed VR.
2. The record was developed with funds from the Ministry of Science and Higher Education (MNiSW), agreement No. 461252, under the "Społeczna odpowiedzialność nauki" (Social Responsibility of Science) programme, module: popularisation of science and promotion of sport (2021).
YADDA identifier
bwmeta1.element.baztech-e64e6df2-997e-4433-9172-3a41d81a616b