Article title

Designing Social Robots for Interaction at Work: Socio-Cognitive Factors Underlying Intention to Work with Social Robots

Publication languages
EN
Abstracts
EN
This paper discusses the effects of robot design (machine-like, humanoid, android) and users’ gender on the intention to work with social robots in the near future. For that purpose, the theoretical framework afforded by the theory of planned behavior (TPB) is used. Results showed effects of both robot design and users’ gender: the more human-like the robot, the lower the intention to work with it, and female participants reported lower intention to work with social robots. These effects are mediated by the TPB variables. Perceived behavioral control and subjective norm are the main predictors of the intention to work with social robots in the near future.
Authors
author
  • University of Algarve, Faro, Portugal
author
  • Department of Psychology and Sciences of Education, University of Algarve and Research Centre for Spatial and Organizational Dynamics – CIEO
author
  • Virtual Reality and Psychophysiology Lab, Institute of Psychology, Polish Academy of Sciences, Warsaw, 00-378, Poland
author
  • Institute of Automatic Control and Robotics (IAiR), Faculty of Mechatronics, Warsaw University of Technology, Warsaw, Poland
Bibliography
  • [1] S. Thrun, “Towards A Framework for Human-Robot Interaction”, Human-Computer Interaction, vol. 19, no. 1–2, 2004, 9–24. DOI: 10.1080/07370024.2004.9667338.
  • [2] C. Bartneck, J. Forlizzi, “A Design-Centred Framework for Social Human-Robot Interaction”, ROMAN 2004: 13th IEEE International Workshop on Robot and Human Interactive Communication, 2004, 591–594. DOI: 10.1109/ROMAN.2004.1374827.
  • [3] C. Breazeal, “Toward sociable robots”, Robotics and Autonomous Systems, vol. 42, no. 3, 2003, 167–175. DOI: 10.1016/S0921-8890(02)00373-1.
  • [4] K. Dautenhahn, “Socially intelligent robots: dimensions of human–robot interaction”, Philosophical Transactions of the Royal Society B: Biological Sciences, vol. 362, no. 1480, 2007, 679–704. DOI: 10.1098/rstb.2006.2004.
  • [5] T. Fong, I. Nourbakhsh, K. Dautenhahn, “A survey of socially interactive robots”, Robotics and Autonomous Systems, vol. 42, no. 3, 2003, 143–166. DOI: 10.1016/S0921-8890(02)00372-X.
  • [6] N. Piçarra, J.-C. Giger, G. Pochwatko, G. Gonçalves, “Making sense of social robots: A structural analysis of the layperson’s social representation of robots”, European Review of Applied Psychology/Revue Européenne de Psychologie Appliquée, vol. 66, no. 6, 2016, 277–289. DOI: 10.1016/j.erap.2016.07.001.
  • [7] C. Breazeal, A. Takanishi, T. Kobayashi, “Social robots that interact with people”. In: B. Siciliano, O. Khatib (Eds.), Springer Handbook of Robotics, Springer, 2008, 1349–1369. DOI: 10.1007/978-3-540-30301-5_59.
  • [8] D. Bernstein, K. Crowley, I. Nourbakhsh, “Working with a robot. Exploring relationship potential in human–robot systems”, Interaction Studies, vol. 8, no. 3, 2007, 465–482. DOI: 10.1075/is.8.3.09ber.
  • [9] C. F. DiSalvo, F. Gemperle, J. Forlizzi, S. Kiesler, “All robots are not created equal: the design and perception of humanoid robot heads”, Proceedings of the 4th Conference on Designing Interactive Systems: Processes, Practices, Methods, and Techniques, 2002, 321–326. DOI: 10.1145/778712.778756.
  • [10] M. Blow, K. Dautenhahn, A. Appleby, C. Nehaniv, D. Lee, “Perception of Robot Smiles and Dimensions for Human-Robot Interaction Design”, ROMAN 2006: The 15th IEEE International Symposium on Robot and Human Interactive Communication, 2006, 469–474. DOI: 10.1109/ROMAN.2006.314372.
  • [11] M. K. Lee, J. Forlizzi, P. E. Rybski, F. Crabbe, W. Chung, J. Finkle, E. Glaser, S. Kiesler, “The snackbot: documenting the design of a robot for long-term Human-Robot Interaction”, Proceedings of the 4th ACM/IEEE International Conference on Human-Robot Interaction, 2009, 7–14. DOI: 10.1145/1514095.1514100.
  • [12] M. Walters, K. Koay, D. Syrdal, K. Dautenhahn, R. Boekhorst, “Preferences and perceptions of robot appearance and embodiment in Human-Robot Interaction trials”, Artificial Intelligence and Simulation of Behaviour, AISB’09 Convention, 2009, 136–143. Retrieved from http://hdl.handle.net/2299/3795.
  • [13] A. Powers, A. Kramer, S. Lim, J. Kuo, S. Lee, S. Kiesler, “Eliciting Information from people with a gendered humanoid robot”, IEEE International Workshop on Robot and Human Interactive Communication ROMAN, 2005, 158–163. DOI: 10.1109/ROMAN.2005.1513773.
  • [14] F. Eyssel, D. Kuchenbrandt, S. Bobinger, L. de Ruiter, F. Hegel, “If You Sound Like Me, You Must Be More Human: On the Interplay of Robot and User Features on Human Robot Acceptance and Anthropomorphism”, HRI ’12: Proceedings of the 7th Annual ACM/IEEE International Conference on Human-Robot Interaction, 2012, 125–126. DOI: 10.1145/2157689.2157717.
  • [15] D. Syrdal, K. Dautenhahn, S. Woods, M. Walters, K. Koay, “Looking good? Appearance preferences and robot personality inferences at zero acquaintance”, Proceedings of AAAI Spring Symposia, 2007, 86–92. Retrieved from https://www.aaai.org/Papers/Symposia/Spring/2007/SS-07-07/SS07-07-019.
  • [16] F. Eyssel, D. Kuchenbrandt, F. Hegel, L. de Ruiter, “Activating elicited agent knowledge: How robot and user features shape the perception of social robots”, IEEE RO-MAN: The 21st IEEE International Symposium on Robot and Human Interactive Communication, 2012, 851–857. DOI: 10.1109/ROMAN.2012.6343858.
  • [17] K. Nakagawa, M. Shiomi, K. Shinozawa, R. Matsumura, H. Ishiguro, N. Hagita, “Effect of Robot’s Whispering Behavior on People’s Motivation”, International Journal of Social Robotics, vol. 5, no. 1, 2013, 5–16. DOI: 10.1007/s12369-012-0141-3.
  • [18] A. Niculescu, B. Dijk, A. Nijholt, S. L. See, “The influence of voice pitch on the evaluation of a social robot receptionist”, International Conference on User Science and Engineering (i-USEr), 2011. DOI: 10.1109/iUSEr.2011.6150529.
  • [19] M. Salem, F. Eyssel, K. Rohlfing, S. Kopp, F. Joublin, “To Err is Human(-like): Effects of Robot Gesture on Perceived Anthropomorphism and Likability”, International Journal of Social Robotics, vol. 5, no. 3, 2013, 313–323. DOI: 10.1007/s12369-013-0196-9.
  • [20] J. Ham, R. H. Cuijpers, J.-J. Cabibihan, “Combining Robotic Persuasive Strategies: The Persuasive Power of a Storytelling Robot that Uses Gazing and Gestures”, International Journal of Social Robotics, vol. 7, no. 4, 2015, 479–487. DOI: 10.1007/s12369-015-0280-4.
  • [21] C. Bartneck, T. Kanda, O. Mubin, A. Al Mahmud, “Does the design of a robot influence its animacy and perceived intelligence?”, International Journal of Social Robotics, vol. 1, no. 2, 2009, 195–204. DOI: 10.1007/s12369-009-0013-7.
  • [22] J. Carpenter, J. Davis, N. Erwin-Stewart, T. Lee, J. Bransford, N. Vye, “Gender representation and humanoid robots designed for domestic use”, International Journal of Social Robotics, vol. 1, no. 3, 2009, 261–265. DOI: 10.1007/s12369-009-0016-4.
  • [23] J. Carpenter, M. Eliot, D. Schultheis, “Machine or friend: understanding users’ preferences for and expectations of a humanoid robot companion”, Proceedings of 5th conference on Design and Emotion, 2006. Retrieved from http://citeseerx.ist.psu.edu.
  • [24] E. Broadbent, Y. Lee, R. Stafford, I. Kuo, B. MacDonald, “Mental Schemas of Robots as More Human-Like Are Associated with Higher Blood Pressure and Negative Emotions in a Human-Robot Interaction”, International Journal of Social Robotics, vol. 3, no. 3, 2011, 291–297. DOI: 10.1007/s12369-011-0096-9.
  • [25] J. Kätsyri, K. Förger, M. Mäkäräinen, T. Takala, “A review of empirical evidence on different Uncanny Valley hypotheses: support for perceptual mismatch as one road to the valley of eeriness”, Frontiers in Psychology, vol. 6, no. 390, 2015. DOI: 10.3389/fpsyg.2015.00390.
  • [26] K. MacDorman, “Mortality Salience and the Uncanny Valley”, Proceedings of the 5th IEEE-RAS International Conference on Humanoid Robots, 2005, 399–405. DOI: 10.1109/ICHR.2005.1573600.
  • [27] J. Seyama, R. Nagayama, “The Uncanny Valley: Effect of Realism on the Impression of Artificial Human Faces”, Presence, vol. 16, no. 4, 2007, 337–351. DOI: 10.1162/pres.16.4.337.
  • [28] K. MacDorman, R. Green, C. Ho, C. Koch, “Too real for comfort? Uncanny responses to computer generated faces”, Computers in Human Behavior, vol. 25, no. 3, 2009, 695–710. DOI: 10.1016/j.chb.2008.12.026.
  • [29] S. Steckenfinger, A. Ghazanfar, “Monkey visual behavior falls into the Uncanny Valley”, Proceedings of the National Academy of Sciences of the United States of America, vol. 106, no. 43, 2009, 18362–18366. DOI: 10.1073/pnas.0910063106.
  • [30] D. Lewkowicz, A. Ghazanfar, “The Development of the Uncanny Valley in Infants,” Developmental Psychobiology, vol. 54, no. 2, 2012, 124–132. DOI:10.1002/dev.20583.
  • [31] S. Turkle, “Computational reticence: why women fear the intimate machine”. In: C. Kramarae (ed.), Technology and Women’s Voices, New York: Pergamon Press, 1986, 40–61.
  • [32] D. Gefen, D. Straub, “Gender difference in the perception and use of E-Mail: an extension to the technology acceptance model”, MIS Quarterly, vol. 21, no. 4, 1997, 389–400. DOI: 10.2307/249720.
  • [33] F. Davis, “Perceived usefulness, perceived ease of use and user acceptance of information technology”, MIS Quarterly, vol. 13, no. 3, 1989, 319–340. DOI: 10.2307/249008.
  • [34] V. Venkatesh, M. G. Morris, “Why don’t men ever stop to ask for directions? Gender, social influence, and their role in technology acceptance and usage behavior”, MIS Quarterly, vol. 24, no. 1, 2000, 115–139. DOI: 10.2307/3250981.
  • [35] V. Venkatesh, M. G. Morris, P. L. Ackerman, “A Longitudinal Field Investigation of Gender Differences in Individual Technology Adoption Decision-Making Processes”, Organizational Behavior and Human Decision Processes, vol. 83, no. 1, 2000, 33–60. DOI: 10.1006/obhd.2000.2896.
  • [36] I. Ajzen, “The theory of planned behavior”, Organizational Behavior and Human Decision Processes, vol. 50, no. 2, 1991, 179–211. DOI: 10.1016/0749-5978(91)90020-T.
  • [37] I. H. Kuo, J. M. Rabindran, E. Broadbent, Y. I. Lee, N. Kerse, R. M. Q. Stafford, B. A. MacDonald, “Age and gender factors in user acceptance of healthcare robots”, The 18th IEEE International Symposium on Robot and Human Interactive Communication, ROMAN 2009, 2009, 214–219. DOI: 10.1109/ROMAN.2009.5326292.
  • [38] C.A. Chung, “Human issues influencing the successful implementation of advanced manufacturing technology”, Journal of Engineering and Technology Management, vol. 13, no. 3–4, 1996, 283–299. DOI: 10.1016/S0923-4748(96)01010-7.
  • [39] I. Ajzen, M. Fishbein, “Attitude-Behavior relations: a theoretical analysis and review of empirical research”, Psychological Bulletin, vol. 84, no. 5, 1977, 888–918. DOI: 10.1037/0033-2909.84.5.888.
  • [40] M. Perugini, R. Bagozzi, “The distinction between desires and intentions”, European Journal of Social Psychology, vol. 34, no. 1, 2004, 69–84. DOI: 10.1002/ejsp.186.
  • [41] I. Ajzen, “The theory of planned behaviour: Reactions and reflections”, Psychology & Health, vol. 26, no. 9, 2011, 1113–1127. DOI: 10.1080/08870446.2011.613995.
  • [42] I. Ajzen, “The theory of planned behavior”. In: P. Lange, A. Kruglanski, E. Higgins (Eds.), Handbook of Theories of Social Psychology, vol. 1, London, UK: Sage, 2012, 438–459.
  • [43] I. Ajzen, “The theory of planned behaviour is alive and well, and not ready to retire: a commentary on Sniehotta, Presseau, and Araújo-Soares”, Health Psychology Review, vol. 9, no. 2, 2014, 131–137. DOI: 10.1080/17437199.2014.883474.
  • [44] C. Blue, “The predictive capacity of the theory of reasoned action and the theory of planned behavior in exercise behavior: An integrated literature review”, Research in Nursing & Health, vol. 18, no. 2, 1995, 105–121. DOI: 10.1002/nur.4770180205.
  • [45] G. Godin, “Theories of reasoned action and planned behavior: usefulness for exercise promotion”, Medicine and Science in Sports and Exercise, vol. 26, no. 11, 1994, 1391–1394. DOI: 10.1249/00005768-199411000-00014.
  • [46] G. Godin, G. Kok, “The theory of planned behavior: a review of its implications to health related behaviors”, American Journal of Health Promotion, vol. 11, no. 2, 1996, 87–98. DOI: 10.4278/0890-1171-11.2.87.
  • [47] M. Cannière, P. Pelsmacker, M. Geuens, “Relationship Quality and the Theory of Planned Behavior models of behavioral intentions and purchase behavior”, Journal of Business Research, vol. 62, no. 1, 2009, 82–92. DOI: 10.1016/j.jbusres.2008.01.001.
  • [48] S. Taylor, P. Todd, “Decomposition and crossover effects in the theory of planned behavior: A study of consumer adoption intentions”, International Journal of Research in Marketing, vol. 12, no. 2, 1995, 137–155. DOI: 10.1016/0167-8116(94)00019-K.
  • [49] K. Mathieson, “Predicting User Intentions: Comparing the Technology Acceptance Model with the Theory of Planned Behavior”, Information Systems Research, vol. 2, no. 3, 1991, 173–191. DOI: 10.1287/isre.2.3.173.
  • [50] T. Hansen, J. M. Jensen, H. S. Solgaard, “Predicting online grocery buying intention: a comparison of the theory of reasoned action and the theory of planned behavior”, International Journal of Information Management, vol. 24, no. 6, 2004, 539–550. DOI: 10.1016/j.ijinfomgt.2004.08.004.
  • [51] M.-H. Hsu, C.-H. Yen, C.-M. Chiu, C.-M. Chang, “A longitudinal investigation of continued online shopping behavior: An extension of the theory of planned behavior”, International Journal of Human-Computer Studies, vol. 64, no. 9, 2006, 889–904. DOI: 10.1016/j.ijhcs.2006.04.004.
  • [52] E. Grandón, S. Nasco, P. Mykytyn, “Comparing theories to explain e-commerce adoption”, Journal of Business Research, vol. 64, no. 3, 2011, 292–298. DOI: 10.1016/j.jbusres.2009.11.015.
  • [53] A. d’Astous, F. Colbert, D. Montpetit, “Music Piracy on the Web – How Effective Are Anti-Piracy Arguments? Evidence from the theory of planned behavior”, Journal of Consumer Policy, vol. 28, no. 3, 2005, 289–310. DOI: 10.1007/s10603-005-8489-5.
  • [54] T. Kwong, M. Lee, “Behavioral intention model for the exchange mode internet music piracy”, Proceedings of the 35th Hawaii International Conference on System Sciences, HICSS, 2002, 2481–2490. DOI: 10.1109/HICSS.2002.994187.
  • [55] C. Liao, H.-N. Lin, Y-P. Liu, “Predicting the Use of Pirated Software: A Contingency Model Integrating Perceived Risk with the Theory of Planned Behavior”, Journal of Business Ethics, vol. 91, no. 2, 2010, 237–252. DOI: 10.1007/s10551-009-0081-5.
  • [56] C. Yoon, “Theory of Planned Behavior and Ethics Theory in Digital Piracy: An integrated Model”, Journal of Business Ethics, vol. 100, no. 3, 2011, 405–417. DOI: 10.1007/s10551-010-0687-7.
  • [57] C. Yoon, “Digital piracy intention: a comparison of theoretical models”, Behaviour & Information Technology, vol. 31, no. 6, 2012, 565–576. DOI:10.1080/0144929X.2011.602424.
  • [58] S. N. Woods, M. L. Walters, K. L. Koay, K. Dautenhahn, “Methodological Issues in HRI: a comparison of live and video based methods in robot to human approach direction trials”, ROMAN 2006 – The 15th IEEE International Symposium on Robot and Human Interactive Communication, 2006, 51–58. DOI: 10.1109/ROMAN.2006.314394.
  • [59] G. Pochwatko, J.-C. Giger, M. Różańska-Walczuk, J. Świdrak, K. Kukiełka, J. Możaryn, N. Piçarra, “Polish Version of the Negative Attitude Toward Robots Scale (NARS-PL)”, Journal of Automation, Mobile Robotics & Intelligent Systems, vol. 9, no. 3, 2015, 65–72. DOI: 10.14313/JAMRIS_3-2015/25.
  • [60] H. Cramer, N. Kemper, A. Amin, V. Evers, B. Wielinga, “‘Give me a hug’: The effects of touch and autonomy on people’s responses to embodied social agents”, Computer Animation and Virtual Worlds, vol. 20, no. 2–3, 2009, 437–445. DOI: 10.1002/cav.317.
  • [61] K. Tsui, M. Desai, H. Yanco, H. Cramer, N. Kemper, “Measuring attitudes towards telepresence robots”, International Journal of Intelligent Control and Systems, vol. 16, no. 2, 2011, 1–11. Retrieved from http://www.ezconf.net/newfiles/IJICS/203/nars-telepresence-IJICS-cameraReady.pdf.
  • [62] N. Piçarra, J-C. Giger, G. Gonçalves, G. Pochwatko, “Validation of the Portuguese Version of the Negative Attitudes Towards Robots Scale”, European Review of Applied Psychology/ Revue Européenne de Psychologie Appliquée, vol. 65, no. 2, 2015, 93–104. DOI: 10.1016/j.erap.2014.11.002.
  • [63] M. Perugini, M. Conner, “Predicting and understanding behavioral volitions: the interplay between goals and behaviors”, European Journal of Social Psychology, vol. 30, no. 5, 2000, 705–731. DOI: 10.1002/1099-0992(200009/10)30:5<705::AID-EJSP18>3.0.CO;2-#.
  • [64] C. Bartneck, D. Kulic, E. Croft, S. Zoghbi, “Measurement instruments for the anthropomorphism, animacy, likeability, perceived intelligence, and perceived safety of robots”, International Journal of Social Robotics, vol. 1, no. 1, 2009, 71–81. DOI: 10.1007/s12369-008-0001-3.
  • [65] A. Field, “Discovering Statistics Using IBM SPSS Statistics”, SAGE, 2014, 645–649.
Notes
Prepared from funds of the Polish Ministry of Science and Higher Education (MNiSW) under agreement 812/P-DUN/2016 for activities disseminating science.
Document type
Bibliography
YADDA identifier
bwmeta1.element.baztech-b0d902bf-4a2e-4e86-a961-1f66ed6e8a1d