Article title

Natural User Interfaces (NUI): review

Publication languages
EN
Abstracts
EN
The article summarizes and systematizes knowledge concerning natural user interfaces. The most important facts related to this topic are supplemented with examples of possible practical uses of this type of human-computer communication. Moreover, the article describes the three most popular controllers: Microsoft Kinect, Nintendo Wii and Sony Move.
Pages
27–45
Physical description
Bibliography: 33 items
Authors
author
  • Łódź University of Technology, Institute of Information Technology, Wólczańska 215, 90-924 Łódź, Poland, grzegorz@glonek.net.pl
Bibliography
  • [1] Liu, W., Natural user interface-next mainstream product user interface, In: IEEE 11th International Conference on Computer-Aided Industrial Design & Conceptual Design (CAIDCD), Vol. 1, IEEE, 2010, pp. 203–205.
  • [2] Han, J., TED 2006: Multi-touch interaction experiments, TED Conference, February 2006.
  • [3] Rauterberg, M., Bichsel, M., Meier, M., and Fjeld, M., A gesture based interaction technique for a planning tool for construction and design, In: IEEE International Workshop on Robot and Human Communication, IEEE, 1997, pp. 212–217.
  • [4] Rauterberg, M., Mauch, T., and Stebler, R., The Digital Playing Desk: a case study for augmented reality, In: IEEE International Workshop on Robot and Human Communication, IEEE, 1996, pp. 410–415.
  • [5] Rauterberg, M. and Steiger, P., Pattern recognition as a key technology for the next generation of user interfaces, In: IEEE International Conference on Systems, Man, and Cybernetics, Vol. 4, IEEE, 1996, pp. 2805–2810.
  • [6] http://wiibrew.org/wiki/Wiimote.
  • [7] LaViola Jr., J. J. and Keefe, D. F., 3D Spatial Interaction: Applications for Art, Design, and Science, Interfaces, 2011.
  • [8] Radzieński, M., Cyfrowe przetwarzanie obrazów, http://atol.am.gdynia.pl/tc/Radzienski/scienianie.htm, 2007.
  • [9] The Kinect Effect: How the world is using Kinect, http://www.xbox.com/en-US/Kinect/Kinect-Effect, 2011.
  • [10] Chang, Y.-J., Chen, S.-F., and Huang, J.-D., A Kinect-based system for physical rehabilitation: a pilot study for young adults with motor disabilities, Research in Developmental Disabilities, Vol. 32, No. 6, Dec. 2011, pp. 2566–2570.
  • [11] Rego, P., Moreira, P., and Reis, L., Natural user interfaces in serious games for rehabilitation, In: 6th Iberian Conference on Information Systems and Technologies CISTI 2011, IEEE, 2011.
  • [12] Loureiro, B. and Rodrigues, R., Multi-touch as a Natural User Interface for elders: A survey, In: 6th Iberian Conference on Information Systems and Technologies CISTI 2011, IEEE, 2011, pp. 1–6.
  • [13] Marnik, J., Rozpoznawanie znaków polskiego alfabetu palcowego, No. 12, 2003, pp. 51–66.
  • [14] Myśliński, S. and Flasiński, M., Rozpoznawanie obrazów dłoni za pomocą gramatyk klasy ETPL(k) w systemach wizyjnych analizy języka migowego, Ph.D. thesis, Akademia Górniczo-Hutnicza im. Stanisława Staszica w Krakowie, 2009.
  • [15] Ong, S. C. W. and Ranganath, S., Automatic sign language analysis: a survey and the future beyond lexical meaning, IEEE Transactions on Pattern Analysis and Machine Intelligence, Vol. 27, No. 6, June 2005, pp. 873–891.
  • [16] Szokal-Egird, K., Budowa interaktywnej tablicy dla platformy e-learningowej na bazie kontrolera Nintendo Wii Remote, Master’s thesis, Lodz University of Technology, Institute of Computer Science, 2009.
  • [17] Lee, J. C., Low-Cost Multi-touch Whiteboard using the Wiimote, http://www.youtube.com/watch?v=5s5EvhHy7eQ, 2008.
  • [18] Challinor, R., SYNAPSE, http://synapsekinect.tumblr.com/, 2011.
  • [19] Veltrop, T., Improved Humanoid Robot Teleoperation with NAO and Kinect, http://www.youtube.com/watch?v=TmTW61MLm68, 2011.
  • [20] Kinect 2 may be so accurate it can lip read, http://venturebeat.com/2011/11/29/kinect-2-will-be-so-accurate-it-can-lip-read/, November 2011.
  • [21] Lee, C. and Lee, G. G., Emotion recognition for affective user interfaces using natural language dialogs, In: 16th IEEE International Conference on Robot & Human Interactive Communication, 2007, pp. 798–801.
  • [22] Malcangi, M., Smart recognition and synthesis of emotional speech for embedded systems with natural user interfaces, In: International Joint Conference on Neural Networks, IEEE, San Jose, 2011, pp. 867–871.
  • [23] Fang, T., Zhao, X., Ocegueda, O., Shah, S., and Kakadiaris, I., 3D facial expression recognition: A perspective on promises and challenges, In: IEEE International Conference on Automatic Face & Gesture Recognition Workshops FG 2011, IEEE, 2011, pp. 603–610.
  • [24] Fasel, B. and Luettin, J., Automatic facial expression analysis: a survey, Pattern Recognition, Vol. 36, No. 1, Jan. 2003, pp. 259–275.
  • [25] Gallagher, G., Kinect Hand Detection, 2010.
  • [26] Liu, S., Tian, Y., and Li, D., New research advances of facial expression recognition, In: International Conference on Machine Learning and Cybernetics, Vol. 2, IEEE, 2009, pp. 1150–1155.
  • [27] Li, K. et al., Building and using a scalable display wall system, IEEE Computer Graphics and Applications, Vol. 20, No. 4, 2000, pp. 29–37.
  • [28] Norman, D., Natural user interfaces are not natural, interactions, Vol. 17, No. 3, 2010, pp. 6–10.
  • [29] Sidik, M. K. B. M., Sunar, M. S. B., Ismail, I. B., Mokhtar, M. K. B., and Jusoh, N. B. M., A Study on Natural Interaction for Human Body Motion Using Depth Image Data, 2011 Workshop on Digital Media and Digital Content Management, Vol. 2, May 2011, pp. 97–102.
  • [30] Wang, H. and Dai, G., A gesture language for collaborative conceptual design, The 7th International Conference on Computer Supported Cooperative Work in Design, 2002, pp. 205–209.
  • [31] Wojciechowski, A., Hand’s poses recognition as a mean of communication within natural user interfaces, Bulletin of the Polish Academy of Sciences, Technical Sciences, Vol. 60, No. 2, pp. 331–336.
  • [32] Królak, A. and Strumiłło, P., Wizyjny system monitorowania mrugania powiekami w zastosowaniach interfejsów człowiek-komputer, 2008.
  • [33] Petersen, N. and Stricker, D., Continuous natural user interface: Reducing the gap between real and digital world, In: IEEE International Symposium on Mixed and Augmented Reality, IEEE, Orlando, 2009, pp. 23–26.
Document type
Bibliography
YADDA identifier
bwmeta1.element.baztech-article-LODD-0002-0002