Article title
Camera navigation support in a virtual environment

Languages of publication
EN
Abstract (EN)
Virtual camera navigation is the most ubiquitous and fundamental interaction task within a virtual environment, and efficient, intuitive scene navigation directly affects the performance of all other tasks. Although many researchers have formulated valuable design guidelines for navigation techniques, navigation remains a challenging and demanding process, especially for novice users. Imprecise input hardware, the cognitive burden of operating the interface, and the lack of a direct mapping between the user's physical movements and the virtual camera's motion all lead to a discrepancy between the desired and the actual camera position and orientation. This paper presents a new potential-field-based method of camera navigation support. The specially designed potential fields support not only collision resolution but also goal-profiled attraction and camera manoeuvring. The method works in both static and dynamic environments, can be accelerated on the GPU, and can easily be adapted to novice or advanced users and to a variety of navigation tasks.
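The abstract only outlines the idea. As a rough, hedged illustration of how a potential-field navigation step of this general kind can be computed, the sketch below uses the classic formulation of Khatib [39] (quadratic attraction to a goal, inverse-distance repulsion from obstacles), not the paper's own goal-profiled fields or GPU evaluation; all function names, gain constants, and the toy scene are chosen for the example only.

```python
# Minimal sketch of a potential-field camera steering step, assuming the
# classic attractive/repulsive formulation [39]; not the paper's field design.
import numpy as np

def attractive_grad(cam, goal, k_att=1.0):
    # Gradient of U_att = 0.5 * k_att * |cam - goal|^2; pulls the camera to the goal.
    return k_att * (cam - goal)

def repulsive_grad(cam, obstacles, k_rep=1.0, influence=2.0):
    # Gradient of U_rep = 0.5 * k_rep * (1/d - 1/d0)^2 inside the influence radius d0;
    # descending it pushes the camera away from nearby obstacles.
    grad = np.zeros(3)
    for obs in obstacles:
        diff = cam - obs
        d = np.linalg.norm(diff)
        if 1e-6 < d < influence:
            grad += -k_rep * (1.0 / d - 1.0 / influence) / d**2 * (diff / d)
    return grad

def navigation_step(cam, goal, obstacles, step=0.05, max_norm=5.0):
    # One camera update: descend the summed gradient; capping the norm keeps the
    # step stable when repulsion grows very large close to an obstacle.
    g = attractive_grad(cam, goal) + repulsive_grad(cam, obstacles)
    n = np.linalg.norm(g)
    if n > max_norm:
        g *= max_norm / n
    return cam - step * g

# Toy usage: the camera travels toward a goal while skirting one obstacle.
cam = np.array([0.0, 0.0, 0.0])
goal = np.array([10.0, 0.0, 0.0])
obstacles = [np.array([5.0, 0.2, 0.0])]
for _ in range(300):
    cam = navigation_step(cam, goal, obstacles)
print(cam)  # ends up near the goal without passing through the obstacle
```

In a dynamic scene the obstacle list would simply be refreshed every frame before the step is taken, and since each obstacle's contribution is independent of the others, such fields lend themselves naturally to per-obstacle parallel (GPU) evaluation, which is consistent with the properties claimed in the abstract.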
Pages
871-884
Physical description
Bibliography: 55 items, figures, tables, illustrations
Authors
  • Institute of Information Technology, Lodz University of Technology, 215 Wólczańska St., 90-924 Łódź, Poland
Bibliography
  • [1] D.A. Bowman, “Principles for the design of performance-oriented interaction techniques”, in Handbook of Virtual Environments: Design, Implementation and Applications, Lawrence Erlbaum Ass., Mahwah, New Jersey, 2002.
  • [2] D.J. McFarland and J.R. Wolpaw, “Brain-computer interfaces for communication and control”, Communications of the ACM 54 (5), 60-66 (2011).
  • [3] E. Foxlin, “Motion tracking requirements and technologies”, in Handbook of Virtual Environments: Design, Implementation and Applications, Lawrence Erlbaum Ass., Mahwah, New Jersey, 2002.
  • [4] J. Wilder, G.K. Kung, M.M. Tremaine, and M. Kaur, “Eye tracking in virtual environments”, in Handbook of Virtual Environments: Design, Implementation and Applications, Lawrence Erlbaum Ass., Mahwah, New Jersey, 2002.
  • [5] W.T. Nelson and R.S. Bolia, “Technological considerations in the design of multisensory virtual environments: the virtual field of dreams will have to wait”, in Handbook of Virtual Environments: Design, Implementation and Applications, Lawrence Erlbaum Ass., Mahwah, New Jersey, 2002.
  • [6] F. Quek, D. McNeill, R. Bryll, S. Duncan, X.F. Ma, C. Kirbas, K.E. McCullough, and R. Ansari, “Multimodal human discourse: gesture and speech”, ACM Trans. on Computer-Human Interaction 9 (3), 171-193 (2002).
  • [7] J.P. Wachs, M. Kolsch, H. Stern, and Y. Edan, “Vision-based hand-gesture applications”, Communications of the ACM 54 (2), 60-71 (2011).
  • [8] A. Wojciechowski, “Hand’s poses recognition as a mean of communication within natural user interfaces”, Bull. Pol. Ac.: Tech. 60 (2), 331-336 (2012).
  • [9] G. Glonek and M. Pietruszka, “Natural user interfaces (NUI): review”, JACS 20 (2), 27-46 (2012).
  • [10] K. Szabat, T. Orłowska-Kowalska, and K.P. Dyrcz, “Extended Kalman filters in the control structure of two-mass drive system”, Bull. Pol. Ac.: Tech. 54 (3), 315-325 (2006).
  • [11] K. Guzek and P. Napieralski, “Measurement of noise in the Monte Carlo point sampling method”, Bull. Pol. Ac.: Tech. 59 (1), 15-19 (2011).
  • [12] K. Murakami and H. Taguchi, “Gesture recognition using recurrent neural networks”, Proc. Int. Conf. Human Factors in Computer Systems CHI 91, 237-242 (1991).
  • [13] A. Sandberg, “Gesture recognition using neural networks”, Technical Report, KTH University, Stockholm, 1997.
  • [14] K. Symeonidis, “Hand gesture recognition using neural networks”, Neural Networks 13, 5-10 (1996).
  • [15] A. Niewiadomski, “On finity, countability, cardinalities, and cylindric extensions of Type-2 fuzzy sets in linguistic summarization of databases”, IEEE Trans. on Fuzzy Systems 18 (3), 532-545 (2010).
  • [16] D. Rutkowska, Intelligent Computational Systems: Genetic Algorithms and Neural Networks in Fuzzy Systems, PLJ, Warszawa, 1997, (in Polish).
  • [17] L. Rutkowski, Methods and Techniques of Artificial Intelligence, PWN, Warszawa, 2012.
  • [18] A. Wojciechowski and G. Wroblewski, “GPU calculated camera collisions detection within a dynamic environment”, LNCS 6375, Springer, Berlin, 2010.
  • [19] A. Wojciechowski, “Potential field based camera collisions detection within dynamically moving 3D environment”, in LNCS 5337, Springer, Berlin, 2009.
  • [20] A. Wojciechowski, “Potential field based camera collisions’ detection in a 3D environment”, Int. J. MGV 15, 665-672 (2006).
  • [21] M.R. Mine, “Virtual environment interaction techniques”, Technical Report 95-018, Department of Computer Science, University of North Carolina, Chapel Hill, 1995.
  • [22] D.A. Bowman and L.F. Hodges, “Formalizing the design, evaluation, and application of interaction techniques for immersive virtual environments”, J. Visual Languages and Computing 10 (1), 37-53 (1999).
  • [23] D.A. Bowman, “Interaction techniques for common tasks in immersive virtual environments”, Ph.D. Thesis, Georgia Institute of Technology, Georgia, 1999.
  • [24] R.P. Darken and B. Peterson, “Spatial orientation, wayfinding and representation”, in Handbook of Virtual Environments: Design, Implementation and Applications, Lawrence Erlbaum Ass., Mahwah, New Jersey, 2002.
  • [25] J.C. Latombe, Robot Motion Planning, Kluwer, Amsterdam, 1991.
  • [26] R.R. Murphy, Introduction to AI Robotics, MIT Press, Cambridge, 2000.
  • [27] I. Dulęba, Methods and Algorithms of Motion Planning for Mobile and Manipulation Robots, Exit, Warszawa, 2004, (in Polish).
  • [28] I. Dulęba, “Impact of control representations on efficiency of local nonholonomic motion planning”, Bull. Pol. Ac.: Tech. 59 (2), 213-218 (2011).
  • [29] Z. Hendzel, “Collision free path planning and control of wheeled mobile robot using Kohonen self-organizing map”, Bull. Pol. Ac.: Tech. 53 (1), 39-47 (2005).
  • [30] K.Z. Haigh and M. Veloso, “Route planning by analogy”, Proc. Int. Conf. on Case-Based Reasoning 160-180 (1995).
  • [31] T.Y. Li and H.K. Ting, “An intelligent user interface with motion planning for 3D navigation”, Proc. Conf. IEEE Virtual Reality 2000 1, 177-184 (2000).
  • [32] C. Urdiales, A. Bandera, F. Arrebola, and F. Sandoval, “Multilevel path planning algorithm for autonomous robots”, Electronics Letters 34 (2), 223-224 (1998).
  • [33] C. Urdiales, A. Bandera, E.J. Perez, A. Poncela, and F. Sandoval, “Hierarchical planning in a mobile robot for map learning and navigation”, Autonomous Robotic Systems - Soft Computing and Hard Computing Methodologies and Applications, eds. D. Maravall, D. Ruan, and C. Zhou, Springer, Berlin, 2003.
  • [34] J.S. Zelek, “Dynamic issues for mobile robot realtime discovery and path planning”, Proc. Int. Conf. Computational Intelligence in Robotics and Automation 1, 232-237 (1999).
  • [35] S. Bandi and D. Thalmann, “Space discretization for efficient human navigation”, Proc. Eurographics 17, 195-206 (1998).
  • [36] S.M. Drucker and D. Zeltzer, “Intelligent camera control in a virtual environment”, Proc. Graphics Interface 1, 190-199 (1994).
  • [37] P.K. Egbert and H. Winkler, “Collision-free object movement using vector fields”, IEEE Computer Graphics and Applications 16 (4), 18-22 (1996).
  • [38] S.S. Ge and Y.J. Cui, “Dynamic motion planning for mobile robots using potential field method”, Autonomous Robots 13, 207-222 (2002).
  • [39] O. Khatib, “Real-time obstacle avoidance for manipulators and mobile robots”, Int. J. Robotics Research 5 (1), 90-99 (1986).
  • [40] M. Khatib and R. Chatila, “An extended potential field approach for mobile robot sensor-based motions”, Proc. Int. Conf. Intelligent Autonomous Systems IAS 4, CD-ROM (1995).
  • [41] D. Xiao and R. Hubbold, “Navigation guided by artificial force fields”, Proc. Int. Conf. CHI’98 1, 179-186 (1998).
  • [42] T.Y. Li and H.C. Chou, “Improving navigation efficiency with artificial force fields”, Proc. Int. Conf. Computer Vision, Graphics and Image Processing 1, 1-7 (2001).
  • [43] T.Y. Li and S.W. Hsu, “An intelligent 3D user interface adapting to user control behaviours”, Proc. 9th Int. Conf. Intelligent User Interface 1, 184-190 (2004).
  • [44] S. Beckhaus, “Dynamic potential fields for guided exploration in virtual environments”, Ph.D. Thesis, Otto von Guericke University, Magdeburg, 2002.
  • [45] S. Beckhaus, F. Ritter, and T. Strothotte, “CubicalPath - dynamic potential fields for guided exploration in virtual environments”, Proc. IEEE Int. Conf. Computer Graphics and Applications PG’00 1, 389-457 (2000).
  • [46] R.C. Arkin, “Motor schema-based mobile robot navigation”, Int. J. Robotics Research 8, 92-112 (1989).
  • [47] B. Bederson, J. Hollan, K. Perlin, J. Meyer, D. Bacon, and G. Furnas, “Pad++: a zoomable graphical sketchpad for exploring alternate interface physics”, J. Visual Languages and Computing 7, 3-31 (1996).
  • [48] J. Borenstein and Y. Koren, “Real-time obstacle avoidance for fast mobile robots”, IEEE Trans. on Systems, Man and Cybernetics 19 (5), 1179-1187 (1989).
  • [49] S. Beckhaus, F. Ritter, and T. Strothotte, “Guided exploration with dynamic potential fields: the cubical path method”, Computer Graphics Forum 20 (4), 201-210 (2001).
  • [50] J. Barraquand and J.C. Latombe, “Robot motion planning: a distributed representation approach”, Technical Report, Department of Computer Science, Stanford University, Stanford, 1989.
  • [51] S. Beckhaus, G. Eckel, and T. Strothotte, “Guided exploration in virtual environments”, Proc. Int. Conf. Electronic Imaging’01 4297, 426-435 (2001).
  • [52] D. Benyon and K. Hook, “Navigation in information spaces: supporting the individual”, Proc. Int. Conf. Human Computer Interaction: INTERACT’97 1, 39-46 (1997).
  • [53] J. Nielsen, “Finding usability problems through heuristic evaluation”, Proc. Int. Conf. ACM CHI’92 1, 373-380 (1992).
  • [54] J. Nielsen, Usability Engineering, Academic Press, New York, 1993.
  • [55] D. Bowman, J. Gabbard, and D. Hix, “A survey of usability evaluation in virtual environments: classification and comparison of methods”, Presence: Teleoperators and Virtual Environments 11 (4), 404-424 (2002).
YADDA identifier
bwmeta1.element.baztech-bf3e99fc-d4aa-4639-97a9-8e8ac99c7182