Identifiers
Title variants
Publication languages
Abstracts
This paper concerns measurement procedures on an emotion monitoring stand designed for tracking human emotions in Human-Computer Interaction using physiological characteristics. The paper addresses the key problem of physiological measurements being disturbed by motions typical of human-computer interaction, such as keyboard typing or mouse movements. An original experiment is described that aimed at a practical evaluation of the measurement procedures performed at the emotion monitoring stand constructed at GUT. Different sensor locations were considered and evaluated for suitability and measurement precision in Human-Computer Interaction monitoring. Alternative locations (ear lobes and forearms) for skin conductance, blood volume pulse and temperature sensors were proposed and verified. The alternative locations correlated with the traditional ones and proved less sensitive to movements such as typing or mouse moving; they can therefore be a better solution for monitoring Human-Computer Interaction.
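As an illustration of the kind of comparison the abstract describes, the sketch below computes the Pearson correlation between two synchronized signals from a traditional and an alternative sensor placement, together with a simple motion-sensitivity ratio. This is a minimal sketch on synthetic data, not the authors' actual analysis procedure; the signal names, the sampling setup, and the sensitivity measure are all assumptions made here for illustration.

```python
# Minimal sketch (synthetic data, not the paper's actual procedure) of
# comparing two sensor placements: correlation between the signals plus
# a variance ratio measuring sensitivity to motion artifacts.
import numpy as np

def pearson_correlation(a: np.ndarray, b: np.ndarray) -> float:
    """Pearson correlation between two equally sampled signals."""
    return float(np.corrcoef(a, b)[0, 1])

def motion_sensitivity(signal: np.ndarray, moving: np.ndarray) -> float:
    """Ratio of signal variance during movement to variance at rest.
    `moving` is a boolean mask marking samples recorded while the
    participant typed or moved the mouse; values near 1 suggest the
    placement is robust to motion artifacts (assumed measure)."""
    return float(np.var(signal[moving]) / np.var(signal[~moving]))

# Synthetic example: a skin-conductance-like trend recorded at a
# traditional (finger) and an alternative (forearm) placement.
rng = np.random.default_rng(0)
t = np.linspace(0, 60, 6000)          # 60 s at 100 Hz (assumed rate)
trend = 2.0 + 0.5 * np.sin(0.1 * t)   # shared slow component
moving = (t > 20) & (t < 40)          # typing happens in this window

finger = trend + 0.05 * rng.standard_normal(t.size)
finger[moving] += 0.3 * rng.standard_normal(moving.sum())  # motion artifacts
forearm = trend + 0.06 * rng.standard_normal(t.size)       # noisier overall,
                                                           # but motion-robust

print(f"correlation(finger, forearm): {pearson_correlation(finger, forearm):.3f}")
print(f"motion sensitivity, finger:   {motion_sensitivity(finger, moving):.2f}")
print(f"motion sensitivity, forearm:  {motion_sensitivity(forearm, moving):.2f}")
```

A correlation near 1 combined with a much lower motion-sensitivity ratio for the alternative channel would mirror, qualitatively, the kind of result the abstract reports for the ear lobe and forearm placements.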
Journal
Year
Volume
Pages
719-732
Physical description
Bibliography: 26 items, figures, tables, charts
Authors
author
- Gdańsk University of Technology, Faculty of Electronics, Telecommunications and Informatics, Software Engineering Department, Narutowicza 11/12, 80-233 Gdańsk, Poland
Bibliography
- [1] Boehner, K., dePaula, R., Dourish, P., Sengers, P. (2007). How emotion is made and measured. Int. J. Human-Computer Studies, 65, 275-291.
- [2] Landowska, A. (2013). Affect-awareness Framework for Intelligent Tutoring Systems. In Proc. of HSI, Gdańsk, Poland, 540-547.
- [3] Gunes, H., Schuller, B. (2013). Categorical and dimensional affect analysis in continuous input: Current trends and future directions. Image and Vision Computing, 31, 120-136.
- [4] Szwoch, W. (2013). Using physiological signals for emotion recognition. In Proc. of HSI, Gdańsk, Poland, 556-561.
- [5] Wang, J., Yin, L., Wei, X., Sun, Y. (2006). 3D facial expression recognition based on primitive surface feature distribution. Comp. Vision and Pattern Recogn., 2, 1399-1406.
- [6] Zhang, J., Lipp, O., Oei, T., Zhou, R. (2011). The effects of arousal and valence on facial electromyographic asymmetry during blocked picture viewing. International Journal of Psychophysiology, 79, 378-384.
- [7] Gunes, H., Piccardi, M. (2005). Affect Recognition from Face and Body: Early Fusion vs. Late Fusion. IEEE International Conference on Systems, Man and Cybernetics, 4, 3437-3443.
- [8] Vizer, L. M., Zhou, L., Sears, A. (2009). Automated stress detection using keystroke and linguistic features. Int. Journal of Human-Computer Studies, 67, 870-886.
- [9] Kołakowska, A. (2013). A review of emotion recognition methods based on keystroke dynamics and mouse movements. In Proc. of HSI, Gdańsk, Poland, 548-555.
- [10] Zeng, Z., Pantic, M., Roisman, G., Huang, T. (2009). A survey of affect recognition methods: Audio, visual, and spontaneous expressions. IEEE Transactions on Pattern Analysis and Machine Intelligence, 31(1), 39-58.
- [11] Healey, J., Picard, R. (2005). Detecting stress during real-world driving tasks using physiological sensors. IEEE Trans. on Intelligent Transportation Systems, 6(2), 156-166.
- [12] Lisetti, C., Nasoz, F. (2004). Using noninvasive wearable computers to recognize human emotions from physiological signals. EURASIP Journal on Applied Signal Processing, 11, 1672-1687.
- [13] Mauss, I., Robinson, M. (2009). Measures of emotion: A review. Cogn. Emot., 23(2), 209-237.
- [14] Jang, E., Rak, B., Kim, S., Sohn, J. (2012). Emotion classification by Machine Learning Algorithm using Physiological Signals. In Proc. of Computer Science and Information Technology, Singapore, 25, 1-5.
- [15] BioGraph Infiniti and FlexComp Infiniti User Manual. Thought Technology, Canada. www.thoughttechnology.com (accessed July 2013).
- [16] Boxtel, A. (2010). Facial EMG as a Tool for Inferring Affective States. In Proc. of Measuring Behavior, Eindhoven, The Netherlands, 104-108.
- [17] Bach, D., Flandin, G., Friston, K., Dolan, R. (2010). Modelling event-related skin conductance responses. Int. J. Psychophysiol., 75(3), 349-356.
- [18] Yousefi, R., Nourani, M., Ostadabbas, S., Panahi, I. (2013). A motion-tolerant adaptive algorithm for wearable photoplethysmographic biosensors. IEEE J. Biomed. Health Inform., 18, 670-681.
- [19] Sweeney, K. T., Ward, T. E., McLoone, S. F. (2012). Artifact Removal in Physiological Signals - Practices and Possibilities. IEEE Transactions on Information Technology in Biomedicine, 16(3), 488-500.
- [20] Sweeney, K. T., Kelly, D., Ward, T. E., McLoone, S. F. (2011). A Review of the State of the Art in Artifact Removal Technologies as used in an Assisted Living Domain. IET Conference on Assisted Living.
- [21] Sweeney, K., McLoone, S., Ward, T. (2010). A simple bio-signals quality measure for in-home monitoring. 7th IASTED International Conference.
- [22] Emotions in Human-Computer Interaction Research Group. www.emorg.eu (accessed January 2014).
- [23] Balakrishnan, G., Durand, F., Guttag, J. (2013). Detecting Pulse from Head Motions in Video. IEEE Conf. on Computer Vision and Pattern Recognition (CVPR), 3430-3437.
- [24] Nimwegen, C., Leuven, K. (2009). Unobtrusive physiological measures to adapt system behavior: The GSR mouse. Presented at Key Issues in Sensory Augmentation, Institute for Media Studies, KU Leuven, Netherlands.
- [25] Landowska, A. (2013). Affective computing and affective learning - methods, tools and prospects. EduAction. Electronic education magazine, 1(5), 16-31.
- [26] Kołakowska, A., Landowska, A., Szwoch, M., Szwoch, W., Wrobel, M. R. (2013). Emotion Recognition and its Application in Software Engineering. In Proc. of HSI, Poland, 532-539.
Notes
EN
This study was partially supported by the Foundation for Polish Science within Grant No. 173/UD/SKILLS/2012, by the Polish-Norwegian Research Programme operated by the National Centre for Research and Development under the Norwegian Financial Mechanism 2009-2014 within Project Contract No. Pol-Nor/210629/51/2013, and by the DS Programs of the ETI Faculty, Gdańsk University of Technology.
Document type
Bibliography
YADDA identifier
bwmeta1.element.baztech-81a98ef2-0248-46b7-b48c-9e323d219430