Tytuł artykułu

Adaptacyjny system rozpoznawania emocji na podstawie wyrazów twarzy

Authors
Identifiers
Title variants
EN
Adaptive facial emotion recognition system
Publication languages
PL
Abstracts
PL
The article describes and evaluates an adaptive emotion recognition system based on a neural network trained in on-line mode. Preprocessing uses a hybrid approach: a 3D model is fitted to the face in the image with the FaceTracker application, and geometric features are extracted with the extensive feature set method. The developed system achieved 96.8% accuracy for known subjects and 84.9% for unknown subjects on the MUG facial expression database. The experiments indicate that the system adapts effectively to changes in its environment while maintaining high classification accuracy.
EN
The paper presents an adaptive facial emotion recognition system based on an on-line learning neural network. The preprocessing stage uses a hybrid approach: a 3D face model is fitted to the image with the FaceTracker application, and geometric features are then extracted with the extensive feature set method. The classification rate on the MUG facial expression database was 96.8% for known subjects and 84.9% for unknown subjects. The results suggest that the system adapts well to changes in the environment while maintaining a high classification rate.
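The abstracts outline a pipeline of landmark-based geometric feature extraction followed by an incrementally trained classifier. Below is a minimal sketch of that idea, assuming 2-D facial landmark coordinates (e.g. from FaceTracker) are already available; the pairwise-distance features, the eye-corner indices, and the scikit-learn MLPClassifier are illustrative assumptions standing in for the paper's extensive feature set and on-line neural network, not the authors' implementation.

```python
# Illustrative sketch only: a stand-in for the approach described in the
# abstract, not the authors' implementation. Assumes 2-D facial landmarks
# (e.g. produced by FaceTracker) are already available as an (N, 2) array.
from itertools import combinations

import numpy as np
from sklearn.neural_network import MLPClassifier

EMOTIONS = ["anger", "disgust", "fear", "happiness", "sadness", "surprise", "neutral"]

# Hypothetical indices of the outer eye corners, used for scale normalization.
LEFT_EYE, RIGHT_EYE = 36, 45

def geometric_features(landmarks: np.ndarray) -> np.ndarray:
    """Redundant geometric feature set: all pairwise landmark distances,
    normalized by the inter-ocular distance to reduce scale dependence."""
    iod = np.linalg.norm(landmarks[LEFT_EYE] - landmarks[RIGHT_EYE]) + 1e-8
    return np.array([np.linalg.norm(landmarks[i] - landmarks[j]) / iod
                     for i, j in combinations(range(len(landmarks)), 2)])

# A small multilayer perceptron updated incrementally, standing in for the
# paper's on-line learning neural network.
clf = MLPClassifier(hidden_layer_sizes=(50,), random_state=0)

def update(landmarks: np.ndarray, label: str) -> None:
    """On-line update: adapt the classifier with a single labelled sample."""
    x = geometric_features(landmarks).reshape(1, -1)
    if not hasattr(clf, "classes_"):
        clf.partial_fit(x, [label], classes=EMOTIONS)  # first call needs the label set
    else:
        clf.partial_fit(x, [label])

def predict(landmarks: np.ndarray) -> str:
    """Classify one face given its landmark coordinates."""
    return clf.predict(geometric_features(landmarks).reshape(1, -1))[0]
```

With FaceTracker-style landmark sets this produces a few thousand pairwise distances per frame, which a feature selection step such as the correlation-based method cited as [9] could prune before classification.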
Year
Pages
213-222
Physical description
Bibliography: 24 items, figures, tables, charts.
Contributors
  • Instytut Informatyki, Automatyki i Robotyki, ul. Z. Janiszewskiego 11/17, 50-372 Wrocław
Bibliography
  • [1] Weka 3: Data Mining Software in Java, http://www.cs.waikato.ac.nz/ml/weka/
  • [2] N. Aifanti, C. Papachristou, A. Delopoulos. The MUG facial expression database. In: 11th Int. Workshop on Image Analysis for Multimedia Interactive Services. Proceedings. Desenzano, Italy, April 12-14, 2010.
  • [3] T. Cootes, G. Edwards, C. Taylor. Active Appearance Models. IEEE Trans. Pattern Anal. Mach. Intell., June, 2001, Vol. 23, No. 6, pp. 681-685.
  • [4] T. F. Cootes et al. Active Shape Models - their training and application. Comput. Vis. Image Underst., January, 1995, Vol. 61, No. 1, pp. 38-59.
  • [5] S. Dubuisson, F. Davoine, M. Masson. A solution for facial expression representation and recognition. Sig. Proc.: Image Comm., 2002, Vol. 17, No. 9, pp. 657-673.
  • [6] P. Ekman, W.V. Friesen. I can see it all over your face! Constants across cultures in the face and emotion. Journal of Personality and Social Psychology, 11, 124-129, 1971.
  • [7] Paul Ekman. Emotions Revealed: Recognizing Faces and Feelings to Improve Communication and Emotional Life. Owl Books 2004.
  • [8] I. Essa, A. Pentland. Facial expression recognition using a dynamic model and motion energy. In: ICCV. Proceedings. IEEE Computer Society, 1995, pp. 360-367.
  • [9] M. A. Hall. Correlation-based Feature Subset Selection for Machine Learning. PhD thesis, University of Waikato, Hamilton, New Zealand, 1998.
  • [10] S. Ioannou et al. Adaptive on-line neural network retraining for real life multimodal emotion recognition. International Conference on Artificial Neural Networks (ICANN), Athens, Greece, September 2006.
  • [11] R.E. Jack et al. Facial expressions of emotion are not culturally universal. Proceedings of the National Academy of Sciences of the United States of America, 109.19:7241-7244, 2012.
  • [12] I. Kotsia, I. Pitas. Facial expression recognition in image sequences using geometric deformation features and Support Vector Machines. IEEE Transactions on Image Processing, 2007, Vol. 16, No. 1, pp. 172-187.
  • [13] M. Lyons et al. Coding facial expressions with Gabor wavelets. 1998, pp. 200-205.
  • [14] L. Ma, K. Khorasani. Facial expression recognition using constructive feedforward neural networks. IEEE Transactions on Systems, Man and Cybernetics, Part B, 2004, Vol. 34, No. 3, pp. 1588-1595.
  • [15] P. Michel, R. Kaliouby. Real time facial expression recognition in video using Support Vector Machines, 2003.
  • [16] M. Pantic, L. Rothkrantz. Automatic analysis of facial expressions: The state of the art. IEEE Trans. Pattern Anal. Mach. Intell., December, 2000, Vol. 22, No. 12, pp. 1424-1445.
  • [17] J. Saragih, S. Lucey, J. Cohn. Face alignment through subspace constrained mean-shifts. In: ICCV. Proceedings, 2009, pp. 1034-1041.
  • [18] J. Saragih, S. Lucey, J. Cohn. Deformable model fitting by regularized landmark mean-shift. International Journal of Computer Vision, 2011, Vol. 91, No. 2, pp. 200-215.
  • [19] S. Widen, J. Russell, A. Brooks. Anger and disgust: Discrete or overlapping categories? Presented at the 2004 APS Annual Convention, Chicago, IL, 2004.
  • [20] L. Wiskott et al. Face recognition by elastic bunch graph matching. IEEE Transactions on Pattern Analysis and Machine Intelligence, 1997, Vol. 19, pp. 775-779.
  • [21] Jia-Jun Wong, Siu-Yeung Cho. Facial emotion recognition by adaptive processing of tree structures. In: Proceedings of the 2006 ACM Symposium on Applied Computing (SAC'06), New York, NY, USA, ACM, 2006, pp. 23-30.
  • [22] M. Zarkowski. Extensive feature set approach in facial expression recognition in static images. Journal of Automation, Mobile Robotics and Intelligent Systems, 2013, Vol. 7, No. 3, pp. 52-58.
  • [23] Z. Zhang et al. Comparison between geometry-based and Gabor-wavelets-based facial expression recognition using multi-layer perceptron. In: Proceedings of the 3rd International Conference on Face & Gesture Recognition (FG'98), Washington, DC, USA, IEEE Computer Society, 1998, pp. 454-459.
  • [24] J. Saragih. FaceTracker, http://web.mac.com/jsaragih/FaceTracker/FaceTracker.html
Document type
YADDA identifier
bwmeta1.element.baztech-3707082c-d7b1-4ffe-8d74-3291833e2cd5