Article title

Support vector machine in gender recognition

Publication languages
EN
Abstracts
EN
In the paper, Support Vector Machine (SVM) methods are discussed. The SVM algorithm is a powerful classification tool, and its capability in gender recognition is compared here with that of other methods. Different sets of facial features derived from the frontal facial image, such as eye corners, nostrils and mouth corners, are taken into account. The efficiency of these feature sets in gender recognition using the SVM method is examined.
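As a rough illustration of the approach described in the abstract, the sketch below trains an SVM classifier on geometric facial-feature vectors and reports its accuracy. It is a minimal example, not the authors' pipeline: it assumes scikit-learn's SVC (a wrapper around LIBSVM [41], which appears in the bibliography below), and the load_features() helper with its placeholder data is a hypothetical stand-in for feature vectors actually measured from frontal-face landmarks (eye corners, nostrils, mouth corners).

import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.svm import SVC
from sklearn.metrics import accuracy_score

def load_features():
    # Hypothetical loader: each row would be a vector of distances/ratios
    # computed from frontal-face landmarks; labels are 0 (female) / 1 (male).
    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 10))    # placeholder feature vectors
    y = rng.integers(0, 2, size=200)  # placeholder gender labels
    return X, y

X, y = load_features()
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=42, stratify=y)

# Scale the features, then train an RBF-kernel SVM; the kernel and its
# parameters are assumptions, and different facial-feature sets can be
# compared simply by changing which columns go into X.
model = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0, gamma="scale"))
model.fit(X_train, y_train)
print("test accuracy:", accuracy_score(y_test, model.predict(X_test)))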
Pages
318–329
Physical description
Bibliography: 45 items; figures, tables
Authors
author
  • Department of Computer Science, Faculty of Physics and Applied Informatics, University of Lodz, Poland
  • Department of Computer Science, Faculty of Physics and Applied Informatics, University of Lodz, Poland
Bibliography
  • [1] Stawska Z., Milczarski P. (2013) Algorithms and Methods Used in Skin and Face Detection Suitable for Mobile Applications, ISIM: ISSN: 2084-5537, Vol.2, No 3, pp. 227-238
  • [2] Milczarski P., Stawska Z. (2014) Complex Colour Detection Methods Used In Skin Detection Systems, ISIM: ISSN: 2084-5537, Vol.3, No 1, pp. 40-52
  • [3] Viola P., Jones M. (2001) Rapid object detection using a boosted cascade of simple features In: Proc. IEEE Computer Society Conf. on Computer Vision and Pattern Recognition (CVPR’01), vol. 1, pp. 511–518.
  • [4] Brunelli R., Poggio T. (1993) Face recognition: features versus templates, IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 15, no. 10, pp. 1042-1052
  • [5] Abdi H., Valentin D., Edelman B., O’Toole A.J. (1995) More about the difference between men and women: Evidence from linear neural network and principal component approach, Neural Comput. 7 (6), 1160–1164.
  • [6] Wiskott L., Fellous J. M., Krüger N., von der Malsburg C. (1997) Face recognition by elastic bunch graph matching, In: Sommer, G., Daniilidis, K., Pauli, J. (Eds.), 7th International Conference on Computer Analysis of Images and Patterns, CAIP’97, Kiel. Springer-Verlag, Heidelberg, pp. 456–463.
  • [7] Cottrell G.W., Metcalfe J. (1990) EMPATH: Face, emotion, and gender recognition using holons, In: Lippmann, R., Moody, J.E., Touretzky, D.S. (Eds.), Proc. Advances in Neural Information Processing Systems 3 (NIPS). Morgan Kaufmann, pp. 564–571.
  • [8] Golomb B.A., Lawrence D.T., Sejnowski T.J. (1990) SEXNET: A neural network identifies sex from human faces, In: Lippmann, R., Moody, J.E., Touretzky, D.S. (Eds.), Proc. Advances in Neural Information Processing Systems 3 (NIPS). Morgan Kaufmann, pp. 572–579.
  • [9] Lyons M., Budynek J., Plante A., Akamatsu S. (2000) Classifying facial attributes using a 2-d Gabor wavelet representation and discriminant analysis, In: Proc. Internat. Conf. on Automatic Face and Gesture Recognition (FG’00), IEEE, Grenoble, France, pp. 202–207.
  • [10] Shakhnarovich G., Viola P.A., Moghaddam B. (2002) A unified learning framework for real time face detection and classification. In: Proc. Internat. Conf. on Automatic Face and Gesture Recognition (FGR’02). IEEE, pp. 14–21.
  • [11] Sun Z., Bebis G., Yuan X., Louis S.J. (2002) Genetic feature subset selection for gender classification: A comparison study. In: Proc. IEEE Workshop on Applications of Computer Vision (WACV’02), pp. 165–170.
  • [12] Wu B., Ai H., Huang C. (2003) LUT-based Adaboost for gender classification. In: Proc. Internat. Conf. on Audio and Video-based Biometric Person Authentication (AVBPA’03), Guildford, United Kingdom, pp. 104–110.
  • [13] Sun N., Zheng W., Sun C., Zou C., Zhao L. (2006) Gender classification based on boosting local binary pattern. In: Proc. 3rd Internat. Symposium on Neural Networks (ISNN’06), Chengdu, China, vol. 2, pp. 194–201.
  • [14] Fellous J.M. (1997) Gender discrimination and prediction on the basis of facial metric information, Vision Research, vol. 37, no. 14, pp. 1961–1973.
  • [15] Moghaddam B., Yang M.H. (2002) Learning gender with support faces, Pattern Analysis and Machine Intelligence, IEEE Transactions on, vol. 24, no. 5, pp. 707–711.
  • [16] Castrillon M., Deniz O., Hernandez D., Dominguez A. (2003) Identity and gender recognition using the ENCARA real-time face detector, in: Conferencia de la Asociación Española para la Inteligencia Artificial, vol. 3.
  • [17] Buchala S., Loomes M.J., Davey N., Frank R.J. (2005) The role of global and feature based information in gender classification of faces: a comparison of human performance and computational models, International Journal of Neural Systems, vol. 15, pp. 121–128.
  • [18] Jain A., Huang J., Fang S. (2005) Gender identification using frontal facial images, in: Multimedia and Expo (ICME 2005), IEEE International Conference on, p. 4.
  • [19] Baluja S., Rowley H.A. (2007) Boosting sex identification performance, International Journal of Computer Vision, vol. 71, no. 1, pp. 111–119.
  • [20] Fok T.H.C., Bouzerdoum A. (2006) A Gender Recognition System using Shunting Inhibitory Convolutional Neural Networks, in: The 2006 IEEE International Joint Conference on Neural Network Proceedings, pp. 5336–5341.
  • [21] Aghajanian J., Warrell J., Prince S.J., Rohn J.L., Baum B. (2009) Patch-based within-object classification, in: 2009 IEEE 12th International Conference on Computer Vision, pp. 1125–1132.
  • [22] Demirkus M., Toews M., Clark J.J., Arbel T. (2010) Gender classification from unconstrained video sequences, in: Computer Vision and Pattern Recognition Workshops (CVPRW), 2010 IEEE Computer Society Conference on, pp. 55–62.
  • [23] Wang J.G., Li J., Lee C.Y., Yau W.Y. (2010) Dense SIFT and Gabor descriptors-based face representation with applications to gender recognition, in: Control Automation Robotics & Vision (ICARCV), 2010 11th International Conference on, pp. 1860–1864.
  • [24] Alexandre L.A. (2010) Gender recognition: A multiscale decision fusion approach, Pattern Recognition Letters, vol. 31, no. 11, pp. 1422–1427.
  • [25] Li B., Lian X.-C., Lu B.-L. (2011) Gender classification by combining clothing, hair and facial component classifiers, Neurocomputing, pp. 1–10.
  • [26] Zheng J., Lu B.L. (2011) A support vector machine classifier with automatic confidence and its application to gender classification, Neurocomputing, vol. 74, no. 11, pp. 1926–1935.
  • [27] Shan C. (2012) Learning local binary patterns for gender classification on real-world face images, Pattern Recognition Letters, vol. 33, no. 4, pp. 431–437.
  • [28] Burton A.M., Bruce V., Dench N. (1993) What's the difference between men and women? Evidence from facial measurements, Perception, 22, pp. 153–176.
  • [29] Brunelli R., Poggio T. (1992) Hyperbf networks for gender classification, DARPA Image Understanding Workshop, pp. 311–314.
  • [30] Milczarski P. (2011) A new method for face identification and determining facial asymmetry, in: Semantic Methods for Knowledge Management and Communication, eds. R. Katarzyniak et al., Studies in Computational Intelligence series, vol. 381, ISBN: 978-3-642-23417-0, pp. 329–340. Springer-Verlag, Berlin Heidelberg.
  • [31] Phillips P.J., Moon H., Rizvi S.A., Rauss P.J. (2000) The FERET evaluation methodology for face-recognition algorithms, Pattern Analysis and Machine Intelligence, IEEE Transactions on, vol. 22, no. 10, pp. 1090–1104.
  • [32] Mäkinen E., Raisamo R. (2008) An experimental comparison of gender classification methods, Pattern Recognition Letters, 29, pp. 1544–1556.
  • [33] Stawska Z., Milczarski P. (2016) Gender Recognition Methods Useful in Mobile Authentication Applications, Information Systems in Management, Vol. 5, 2016, No. 2, pp. 248-259
  • [34] Milczarski P., Kompanets L., Kurach D. (2010) An Approach to Brain Thinker Type Recognition Based on Facial Asymmetry, in: L. Rutkowski et al. (Eds.): ICAISC 2010, Part I, LNCS 6113, pp. 643–650. Springer, Heidelberg.
  • [35] Kompanets L., Milczarski P., Kurach D. (2013) Creation of the fuzzy three-level adapting Brainthinker, in: Human System Interaction (HSI), 2013 6th International Conference on, DOI: 10.1109/HSI.2013.6577865, pp. 459–465.
  • [36] Boser B. E., Guyon I. M., Vapnik V. N. (1992) A training algorithm for optimal margin classifiers. Proceedings of the fifth annual workshop on Computational learning theory – COLT '92. p. 144
  • [37] Cortes C., Vapnik V. (1995) Support-vector network. Machine Learning. 20 (3): 273–297.
  • [38] Vapnik V. N., Kotz S. (2006) Estimation of Dependences Based on Empirical Data, Springer, ISBN 0-387-30865-2
  • [39] Lian H. C., Lu B. L. (2006) Multi-view gender classification using local binary patterns and support vector machines. In International Symposium on Neural Networks (pp. 202-209). Springer Berlin Heidelberg.
  • [40] Bocklet T., Maier A., Bauer J. G., Burkhardt F., Noth E. (2008) Age and gender recognition for telephone applications based on gmm supervectors and support vector machines. In Acoustics, Speech and Signal Processing, 2008. ICASSP 2008. IEEE International Conference on (pp. 1605-1608). IEEE.
  • [41] Chang C.-C., Lin C.-J. (2011) LIBSVM: A library for support vector machines. ACM Transactions on Intelligent Systems and Technology 2 (3).
  • [42] Campbell C., Ying Y. (2011) Learning with Support Vector Machines, Morgan and Claypool, ISBN 978-1-60845-616-1.
  • [43] Martinez A.M., Benavente R. (1998) The AR Face Database. CVC Technical Report #24.
  • [44] Muldashev E.R. (2002) Whom Did We Descend From? OLMA Press, Moscow (In Russian)
  • [45] Ouarda W., Trichili H., Alimi A.M., Solaiman B. (2014) Face recognition based on geometric features using Support Vector Machines, in: 6th International Conference of Soft Computing and Pattern Recognition (SoCPaR), Tunis, pp. 89–95.
Notes
Record compiled under agreement 509/P-DUN/2018 from funds of the Ministry of Science and Higher Education (MNiSW) allocated to activities popularizing science (2018).
Document type
Bibliography
YADDA identifier
bwmeta1.element.baztech-a9d2eefb-ee9d-4cc8-8141-8be7af55e2f8