Article title

Intercultural differences in decoding facial expressions of the android robot Geminoid F

Languages of publication
EN
Abstracts
EN
As android robots become increasingly sophisticated in both their technical and artistic design, their non-verbal expressiveness is approaching that of real humans. Accordingly, this paper presents the results of two online surveys designed to evaluate a female android's facial display of five basic emotions. As we were interested in intercultural differences, we prepared both surveys in English, German, and Japanese. We found not only that our designs of the emotional expressions "fearful" and "surprised" were often confused in general, but also that Japanese participants confused "angry" with "sad" more often than the German and English participants did. Although facial displays of the same emotions portrayed by the model person of Geminoid F achieved higher recognition rates overall, portraying fear proved similarly difficult for her. Finally, from the analysis of the free responses that participants were invited to give, a number of further conclusions are drawn that help to clarify how intercultural differences impact the interpretation of facial displays of an android's emotions.
Year
Pages
215–231
Physical description
Bibliography: 21 items, figures.
Authors
  • Research Group on the Foundations of AI, Department of Computer Science University of Freiburg, Germany
author
  • Department of Adaptive Machine Systems Osaka University, Japan
Bibliography
  • [1] C. Becker-Asano and H. Ishiguro, "Evaluating facial displays of emotion for the android robot Geminoid F." Workshop on Affective Computational Intelligence. Paris: IEEE, 2011. 22–29.
  • [2] K. Dautenhahn, I. Nourbakhsh, and T. Fong, "A Survey of Socially Interactive Robots." Robotics and Autonomous Systems 42 (2003): 143–166.
  • [3] C. Breazeal, "Emotion and sociable humanoid robots." International Journal of Human-Computer Studies 59 (2003): 119–155.
  • [4] C. Becker-Asano and I. Wachsmuth, "Affective computing with primary and secondary emotions in a virtual human." Autonomous Agents and Multi-Agent Systems 20, no. 1 (2010): 32–49.
  • [5] H. Ishiguro, "Android Science: Toward a new cross-interdisciplinary framework." Proc. of the CogSci 2005 Workshop "Toward Social Mechanisms of Android Science". Stresa, Italy, 2005. 1–6.
  • [6] R. W. Picard, Affective Computing. The MIT Press, 1997.
  • [7] S. Nishio, H. Ishiguro, and N. Hagita, "Geminoid: Teleoperated Android of an Existing Person." In Humanoid Robots, New Developments, 343–352. I-Tech, 2007.
  • [8] T. Kanda, T. Hirano, D. Eaton, and H. Ishiguro, "Interactive Robots as Social Partners and Peer Tutors for Children: A Field Trial." Human-Computer Interaction (Special issue on human-robot interaction) 19 (2004): 61–84.
  • [9] T. Kanda, H. Ishiguro, T. Ono, M. Imai, and K. Mase, "Development and Evaluation of an Interactive Robot 'Robovie'." IEEE International Conference on Robotics and Automation. 2002. 1848–1855.
  • [10] K. F. MacDorman and H. Ishiguro, "The uncanny advantage of using androids in cognitive and social science research." Interaction Studies, 2006: 297–337.
  • [11] C. Becker-Asano, K. Ogawa, S. Nishio, and H. Ishiguro, "Exploring the uncanny valley with Geminoid HI-1 in a real world application." IADIS Intl. Conf. on Interfaces and Human Computer Interaction. Freiburg, Germany: IADIS, 2010. 121–128.
  • [12] P. Ekman, "Basic Emotions." Chap. 3 in Handbook of Cognition and Emotion, 45–60. John Wiley & Sons, 1999.
  • [13] C. Bartneck, J. Reichenbach, and A. Breemen, "In your face, robot! The influence of a character's embodiment on how users perceive its emotional expressions." Proceedings of Design and Emotion. Ankara, 2004.
  • [14] C. Bartneck, "How convincing is Mr. Data's smile?: Affective Expressions of Machines." User Modeling and User-Adapted Interaction, 2001: 279–295.
  • [15] P. Ekman, "Facial Expressions." Chap. 16 in Handbook of Cognition and Emotion, 301–320. John Wiley & Sons, 1999.
  • [16] M. Yuki, W. Maddux, and T. Masuda, "Are the windows to the soul the same in the East and West? Cultural differences in using the eyes and mouth as cues to recognize emotions in Japan and the United States." Journal of Experimental Social Psychology, 2007: 303–311.
  • [17] Y. Zhang and Q. Ji, "Active and Dynamic Information Fusion for Facial Expression Understanding from Image Sequences." IEEE Transactions on Pattern Analysis and Machine Intelligence, 2005: 699–714.
  • [18] J. Tolksdorf, C. Becker-Asano, and S. Kopp, "Do You Know How I Feel? Evaluating Emotional Display of Primary and Secondary Emotions." Edited by H. Prendinger, J. Lester, and M. Ishizuka. Intelligent Virtual Agents. Springer, 2008. 548–549.
  • [19] T. Koda, "Cross-Cultural Study of Avatars' Facial Expressions and Design Considerations Within Asian Countries." Proc. of Intercultural Collaboration. Springer, 2007. 207–220.
  • [20] J. A. Russell, "Is There Universal Recognition of Emotion From Facial Expression? A Review of the Cross-Cultural Studies." Psychological Bulletin 115 (1994): 102–141.
  • [21] M. A. Tamanoi, "Women's Voices: Their Critique of the Anthropology of Japan." Annual Review of Anthropology (1990): 17–37.
Document type
Bibliography
YADDA identifier
bwmeta1.element.baztech-40f45a6b-0792-4790-94e7-a3382c227b36