Article title
Content
Full texts:
Identifiers
Title variants
Publication languages
Abstracts
Facial emotion recognition (FER) is an important topic in the fields of computer vision and artificial intelligence owing to its significant academic and commercial potential. Nowadays, emotional factors are as important as classic functional aspects of customer purchasing behavior: purchasing choices and decisions result from a careful analysis of product advantages and disadvantages as well as of affective and emotional aspects. This paper presents a novel method for human emotion classification and recognition. We generate seven referential faces, one for each kind of facial emotion, based on perfect face ratios and some classical averages. The basic idea is to extract perfect face ratios as features for the emotional face and for each referential face, and to calculate the distance between them using the fuzzy Hamming distance. To extract the perfect face ratios, we locate landmark points on the face, from which sixteen features are extracted. An experimental evaluation demonstrates the satisfactory performance of our approach on the WSEFEP dataset, and the method can be applied to any existing facial emotion dataset. The proposed algorithm is competitive with related approaches; the recognition rate reaches more than 90%.
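As an illustration of the classification step described in the abstract, the following Python sketch matches a query face against seven referential faces using a simplified fuzzy Hamming distance. This is a minimal sketch, not the authors' implementation: landmark detection and the construction of the sixteen perfect-face-ratio features are not shown, the exponential degree-of-difference form and the alpha parameter are assumptions, and the scalar distance is only an approximation of the fuzzy-cardinality definition used in the cited fuzzy Hamming distance papers.

import numpy as np

# Seven referential emotions; the label names are assumed here for illustration.
EMOTIONS = ["anger", "disgust", "fear", "happiness", "sadness", "surprise", "neutral"]

def degree_of_difference(x, y, alpha=2.0):
    """Fuzzy membership of 'x and y differ' for one ratio feature.
    The form 1 - exp(-alpha * d^2) and the value of alpha are assumptions."""
    return 1.0 - np.exp(-alpha * (x - y) ** 2)

def fuzzy_hamming_distance(u, v, alpha=2.0):
    """Simplified scalar fuzzy Hamming distance between two feature vectors:
    the sum of per-component degrees of difference."""
    u, v = np.asarray(u, dtype=float), np.asarray(v, dtype=float)
    return float(np.sum(degree_of_difference(u, v, alpha)))

def classify_emotion(face_ratios, referential_ratios, alpha=2.0):
    """Return the emotion whose referential face is closest to the query face.
    face_ratios: the sixteen perfect-face-ratio features of the query face.
    referential_ratios: dict mapping emotion name -> sixteen reference features."""
    distances = {
        emotion: fuzzy_hamming_distance(face_ratios, ref, alpha)
        for emotion, ref in referential_ratios.items()
    }
    return min(distances, key=distances.get)

In this sketch the query face is simply assigned the label of the referential face at minimum fuzzy Hamming distance; the feature vectors themselves would come from the landmark-based ratio extraction described in the paper.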
Year
Volume
Pages
37–44
Physical description
Bibliography: 11 items, figures
Authors
author
- Department of Informatics, Faculty of Sciences, Ibn Tofail University, Kenitra, Morocco
author
- Department of Informatics, Faculty of Sciences, Ibn Tofail University, Kenitra, Morocco
author
- Department of Informatics, Faculty of Sciences, Ibn Tofail University, Kenitra, Morocco
author
- Department of Informatics, Faculty of Sciences, Ibn Tofail University, Kenitra, Morocco
Bibliography
- [1] P. Ekman, W. V. Friesen, M. O’Sullivan, A. Chan, I. Diacoyanni-Tarlatzis, K. Heider, R. Krause, W. A. LeCompte, T. Pitcairn and P. E. Ricci-Bitti, “Universals and cultural differences in the judgments of facial expressions of emotion”, Journal of Personality and Social Psychology, vol. 53, no. 4, 1987, 712–717, DOI: 10.1037/0022-3514.53.4.712.
- [2] A. Bookstein, S. T. Klein and T. Raita, “Fuzzy Hamming Distance: A New Dissimilarity Measure”. In: G. M. Landau and A. Amir (eds.), Combinatorial Pattern Matching, 2001, 86–97, DOI: 10.1007/3-540-48194-X_7.
- [3] K. Ounachad, A. Sadiq and A. Souhar, “Fuzzy Hamming Distance and Perfect Face Ratios Based Face Sketch Recognition”. In: 2018 IEEE 5th International Congress on Information Science and Technology (CiSt), 2018, 317–322, DOI: 10.1109/CIST.2018.8596665.
- [4] M. Ionescu and A. Ralescu, “Fuzzy Hamming Distance Based Banknote Validator”. In: The 14th IEEE International Conference on Fuzzy Systems, 2005. FUZZ ‘05, 2005, 300–305, DOI: 10.1109/FUZZY.2005.1452410.
- [5] M. Ionescu, “Image clustering for a fuzzy hamming distance based cbir system”. In: Proceedings of the Sixteenth Midwest Artificial Intelligence and Cognitive Science Conference, 2005, 102–108.
- [6] M. Ionescu and A. Ralescu, “Fuzzy hamming distance in a content-based image retrieval system”. In: 2004 IEEE International Conference on Fuzzy Systems, vol. 3, 2004, 1721–1726, DOI: 10.1109/FUZZY.2004.1375443.
- [7] G. S. Shehu, A. M. Ashir and A. Eleyan, “Character recognition using correlation hamming distance”. In: 2015 23rd Signal Processing and Communications Applications Conference (SIU), 2015, 755–758, DOI: 10.1109/SIU.2015.7129937.
- [8] S. Xiao, S. Yan and A. A. Kassim, “Facial Landmark Detection via Progressive Initialization”. In: 2015 IEEE International Conference on Computer Vision Workshop (ICCVW), 2016, 986–993, DOI: 10.1109/ICCVW.2015.130.
- [9] K. Ounachad, M. Oualla, A. Souhar and A. Sadiq, “Structured learning and prediction in face sketch gender classification and recognition”, International Journal of Computational Vision and Robotics, vol. 10, no. 6, 2020, DOI: 10.1504/IJCVR.2020.110645.
- [10] R. Chellappa, C. L. Wilson and S. Sirohey, “Human and machine recognition of faces: a survey”, Proceedings of the IEEE, vol. 83, no. 5, 1995, 705–741, DOI: 10.1109/5.381842.
- [11] M. Olszanowski, G. Pochwatko, K. Kuklinski, M. Scibor-Rylski, P. Lewinski and R. K. Ohme, “Warsaw set of emotional facial expression pictures: a validation study of facial display photographs”, Frontiers in Psychology, vol. 5, 2015, DOI: 10.3389/fpsyg.2014.01516.
Notes
Record developed with funds from the Ministry of Science and Higher Education (MNiSW), agreement No. 461252, under the programme "Social Responsibility of Science", module: Popularisation of science and promotion of sport (2021).
Document type
Bibliography
YADDA identifier
bwmeta1.element.baztech-ff550dd5-94c6-4a97-8374-d6cc83687ebf