Article title

Extracting the symmetry of the human face from digital photographs

Abstract
By defining a midline and selecting six pairs of landmarks of the human face on digital photographs, we extracted the symmetry of the human face by means of digital techniques. As a first approach, the distances and the tilts from the midline between corresponding landmarks were computed and averaged, respectively. As a second approach, Procrustes analysis and the histogram of oriented gradients (HOG) were applied to patches around the six pairs of landmarks. To better estimate the symmetry of the whole face, the grayscale and color photographs were cut into pairs of strips, equally spaced from the midline, and the strips were compared with the HOG feature extractor. The symmetry of the human face was extracted from 89 photographs of human faces (37 females and 52 males, ages 28.67 ± 6.65 and 35.65 ± 12.2 years, respectively). The HOG feature extractor applied to strips of the color and grayscale photographs provided the most reliable values for facial symmetry, which correlated well with the values assigned by photographers and physiotherapists. In addition, an experiment was performed to evaluate attractiveness as a function of facial symmetry: two groups of men and women were asked to sort digital photographs of women and men according to the attractiveness of the person in each photograph. The results show that the most frequently selected photographs were those with the highest symmetry scores.
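The strip-based comparison described in the abstract can be sketched in code. The following is a minimal illustration, not the authors' implementation: it assumes scikit-image's `hog` descriptor, and the helper `strip_symmetry_score` is a hypothetical name introduced here. Each left-side strip is mirrored about the midline and its HOG vector is compared with that of the matching right-side strip by cosine similarity.

```python
import numpy as np
from skimage.feature import hog  # assumed dependency: scikit-image

def strip_symmetry_score(gray, midline_x, strip_width=16, n_strips=4):
    """Mean cosine similarity of HOG descriptors over mirrored strip pairs.

    `gray` is a 2D grayscale image; strips are taken on both sides of the
    vertical midline at `midline_x`, equally spaced from it. A score of 1.0
    means identical gradient-orientation structure on both sides.
    """
    if n_strips * strip_width > midline_x or \
       midline_x + n_strips * strip_width > gray.shape[1]:
        raise ValueError("strips do not fit in the image")
    scores = []
    for k in range(n_strips):
        # k-th strip on each side of the midline
        left = gray[:, midline_x - (k + 1) * strip_width: midline_x - k * strip_width]
        right = gray[:, midline_x + k * strip_width: midline_x + (k + 1) * strip_width]
        # Mirror the left strip so matching features align column-wise,
        # then describe both strips with HOG.
        h_left = hog(np.fliplr(left), orientations=9,
                     pixels_per_cell=(8, 8), cells_per_block=(2, 2))
        h_right = hog(right, orientations=9,
                      pixels_per_cell=(8, 8), cells_per_block=(2, 2))
        denom = np.linalg.norm(h_left) * np.linalg.norm(h_right)
        scores.append(float(h_left @ h_right / denom) if denom else 1.0)
    return float(np.mean(scores))
```

A perfectly mirror-symmetric image scores 1.0, while any asymmetry lowers the score; the paper aggregates such per-strip comparisons into a whole-face symmetry value.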
Physical description
Bibliography: 17 items; figures, charts.
  • Instituto Nacional de Astrofísica, Óptica y Electrónica, Luis Enrique Erro No. 1, Santa María Tonantzintla, 72840 Puebla, Pue., Mexico
  • Instituto Nacional de Astrofísica, Óptica y Electrónica, Puebla, Pue., Mexico
  • Instituto Nacional de Astrofísica, Óptica y Electrónica, Puebla, Pue., Mexico
  • 1. Thornhill R, Gangestad SW. Facial attractiveness. Trends Cognit Sci 1999;3:452–60.
  • 2. Folgero PO, Hodne L, Johansson C, Andresen AE, Satren LC, Specht K, et al. Effects of facial symmetry and gaze direction on perception of social attributes: a study in experimental art history. Front Hum Neurosci 2016;10:452.
  • 3. Komori M, Kawamura S, Ishihara S. Averageness or symmetry: which is more important for facial attractiveness? Acta Psychol 2009;131:136–42.
  • 4. Henderson J, Holzleitner IJ, Talamas SN, Perrett DI. Perception of health from facial cues. Philos Trans R Soc B 2016;371:20150380.
  • 5. Penke L, Bates TC, Gow AJ, Pattie A, Starr JM, Benedict C, et al. Symmetric faces are a sign of successful cognitive aging. Evol Hum Behav 2009;30:429–37.
  • 6. Wu J, Heike C, Birgfeld C, Evans K, Maga M, Morrison C, et al. Measuring symmetry in children with unrepaired cleft lip: defining a standard for the three-dimensional mid-facial reference plane. Cleft Palate Craniofac J 2016;53:695–704.
  • 7. Schmid K, Marx D, Samal A. Computation of a face attractiveness index based on neoclassical canons, symmetry, and golden ratios. Pattern Recognit 2008;41:2710–7.
  • 8. Bogin B, Varela-Silva MI. Leg length, body proportion, and health: a review with a note on beauty. Int J Environ Res Public Health 2010;7:1047–75.
  • 9. Ozdemir S. The concept of anthropometric facial asymmetry. In: Preedy VR, editor. Handbook of anthropometry. New York: Springer, 2012:625–39.
  • 10. Bock MT, Bowman AW. On the measurement and analysis of asymmetry with applications to facial modelling. Appl Stat 2006;55:77–91.
  • 11. Romero-H R-A, Renero-C F-J. A deformable model to search characteristic facial points. In: Bayro-Corrochano E, Hancock E, editors. Progress in pattern recognition, image analysis, computer vision, and applications, vol. 8827, Lecture Notes in Computer Science. Switzerland: Springer International Publishing, 2014:933–9.
  • 12. Liu Y, Mitra S. Experiments with quantified facial asymmetry for human identification. Technical Report CMU-RI-TR-02-24, The Robotics Institute, Carnegie Mellon University, Pittsburgh, PA, 2001.
  • 13. Liu Y, Schmidt KL, Cohn JF, Mitra S. Facial asymmetry quantification for expression invariant human identification. Comput Vis Image Understand 2003;91:138–59.
  • 14. Jing X-Y, Wong H-S, Zhang D. Face recognition based on 2D Fisherface approach. Pattern Recogn 2006;39:707–10.
  • 15. Goodall C. Procrustes methods in the statistical analysis of shape. J R Stat Soc Ser B (Methodol) 1991;53:285–339.
  • 16. Li B, Huo G. Face recognition using locality sensitive histograms of oriented gradients. Optik 2016;127:3489–94.
  • 17. Suard F, Rakotomamonjy A, Bensrhair A, Broggi A. Pedestrian detection using infrared images and histograms of oriented gradients. Intelligent Vehicles Symposium, Tokyo, Japan, 2006.