Article title

Machine-learning at the service of plastic surgery: a case study evaluating facial attractiveness and emotions using R language

Identifiers
Title variants
Conference
Federated Conference on Computer Science and Information Systems (14th; 01-04.09.2019; Leipzig, Germany)
Publication languages
EN
Abstracts
EN
Since plastic surgery should take into account that the impression a face makes always depends on its current emotional expression, the study verified how precisely facial images can be classified into sets of defined facial emotions.
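The question posed in the abstract, namely how precisely facial images described by landmark-derived features can be sorted into defined emotion categories, amounts to a supervised multi-class classification task in R. The sketch below is only an illustration of that kind of workflow under assumed, simulated data; the feature names (mouth_width_ratio, eye_opening_ratio, brow_height_ratio), the simulated labels, and the choice of a classification tree from the rpart package are placeholders for illustration, not the authors' actual features, data, or models.

```r
# A minimal sketch (not the authors' code): classify faces, described by
# landmark-derived features, into a set of defined emotions.
# All data below are simulated placeholders.
set.seed(42)

n <- 300
emotions <- c("neutral", "happy", "sad", "surprised")

# hypothetical landmark-derived ratios (e.g. mouth width / face width)
faces <- data.frame(
  mouth_width_ratio = runif(n, 0.3, 0.6),
  eye_opening_ratio = runif(n, 0.1, 0.4),
  brow_height_ratio = runif(n, 0.2, 0.5),
  emotion = factor(sample(emotions, n, replace = TRUE))
)

# split into training and held-out test sets
train_idx <- sample(seq_len(n), size = 0.7 * n)
train <- faces[train_idx, ]
test  <- faces[-train_idx, ]

# fit a simple classification tree (rpart ships with standard R distributions)
library(rpart)
fit <- rpart(emotion ~ ., data = train, method = "class")

# estimate classification accuracy on the held-out set
pred <- predict(fit, newdata = test, type = "class")
mean(pred == test$emotion)
```

On real data, the predictors would come from automatically detected facial landmarks (e.g. via dlib, reference 22 below), and classification accuracy would typically be estimated with cross-validation rather than a single train/test split.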
Year
Volume
Pages
107–112
Physical description
Bibliography: 28 items, illustrations, tables
Authors
  • Department of Biomedical Informatics, Faculty of Biomedical Engineering, Czech Technical University in Prague, Kladno, Czech Republic
  • Department of Plastic Surgery, First Faculty of Medicine, Charles University and Na Bulovce Hospital, Prague, Czech Republic
author
  • Department of Biomedical Informatics, Faculty of Biomedical Engineering, Czech Technical University in Prague, Kladno, Czech Republic
Bibliography
  • 1. Leslie G. Farkas, Tania A. Hreczko, John C. Kolar, et al. “Vertical and Horizontal Proportions of the Face in Young Adult North American Caucasians”. In: Plastic and Reconstructive Surgery 75.3 (Mar. 1985), pp. 328–337. DOI: 10.1097/00006534-198503000-00005.
  • 2. Kendra Schmid, David Marx, and Ashok Samal. “Computation of a face attractiveness index based on neoclassical canons, symmetry, and golden ratios”. In: Pattern Recognition 41.8 (Aug. 2008), pp. 2710–2717. DOI: 10.1016/j.patcog.2007.11.022.
  • 3. Mounir Bashour. Is an objective measuring system for facial attractiveness possible. Boca Raton, Fla: Dissertation.com, 2007. ISBN: 978-158-1123-654.
  • 4. Randy Thornhill and Steven W. Gangestad. “Human facial beauty”. In: Human Nature 4.3 (Sept. 1993), pp. 237–269. DOI: 10.1007/bf02692201.
  • 5. A. C. Little, B. C. Jones, and L. M. DeBruine. “Facial attractiveness: evolutionary based research”. In: Philosophical Transactions of the Royal Society B: Biological Sciences 366.1571 (May 2011), pp. 1638–1659. DOI: 10.1098/rstb.2010.0404.
  • 6. D. I. Perrett, K. J. Lee, I. Penton-Voak, et al. “Effects of sexual dimorphism on facial attractiveness”. In: Nature 394.6696 (Aug. 1998), pp. 884–887. DOI: 10.1038/29772.
  • 7. Farhad Naini. Facial aesthetics: concepts & clinical diagnosis. Chichester, West Sussex, UK; Ames, Iowa: Wiley-Blackwell, 2011. ISBN: 978-1-405-18192-1.
  • 8. Charles Darwin. The expression of the emotions in man and animals. Oxford; New York: Oxford University Press, 1998. ISBN: 9780195158069.
  • 9. Silvan Tomkins. Affect imagery consciousness: the complete edition. New York: Springer Pub, 2008. ISBN: 978-0826144041.
  • 10. Paul Ekman and Wallace V. Friesen. “Constants across cultures in the face and emotion”. In: Journal of Personality and Social Psychology 17.2 (1971), pp. 124–129. DOI: 10.1037/h0030377.
  • 11. Paul Ekman. Unmasking the face: a guide to recognizing emotions from facial clues. Cambridge, MA: Malor Books, 2003. ISBN: 1883536367.
  • 12. B. Fasel and Juergen Luettin. “Automatic facial expression analysis: a survey”. In: Pattern Recognition 36.1 (Jan. 2003), pp. 259–275. DOI: 10.1016/s0031-3203(02)00052-3.
  • 13. Ming-Hsuan Yang, D. J. Kriegman, and N. Ahuja. “Detecting faces in images: a survey”. In: IEEE Transactions on Pattern Analysis and Machine Intelligence 24.1 (2002), pp. 34–58. DOI: 10.1109/34.982883.
  • 14. A. Lanitis, C. J. Taylor, and T. F. Cootes. “Automatic face identification system using flexible appearance models”. In: Image and Vision Computing 13.5 (June 1995), pp. 393–401. DOI: 10.1016/0262-8856(95)99726-h.
  • 15. Henry A. Rowley, Shumeet Baluja, and Takeo Kanade. “Neural Network-Based Face Detection”. In: IEEE Transactions on Pattern Analysis and Machine Intelligence 20.1 (Jan. 1998), pp. 23–38. ISSN: 0162-8828. DOI: 10.1109/34.655647.
  • 16. Xiaoming Zhao and Shiqing Zhang. “A Review on Facial Expression Recognition: Feature Extraction and Classification”. In: IETE Technical Review 33.5 (Jan. 2016), pp. 505–517. DOI: 10.1080/02564602.2015.1117403.
  • 17. T. F. Cootes, C. J. Taylor, D. H. Cooper, et al. “Active Shape Models - Their Training and Application”. In: Computer Vision and Image Understanding 61.1 (Jan. 1995), pp. 38–59. DOI: 10.1006/cviu.1995.1004.
  • 18. Ethem Alpaydin. Introduction to machine learning. Cambridge, Mass: MIT Press, 2010. ISBN: 9780262012430.
  • 19. Pavel Kasal, Patrik Fiala, Lubomír Štěpánek, et al. “Application of Image Analysis for Clinical Evaluation of Facial Structures”. In: Medsoft 2015 (2015), pp. 64–70. URL: http://www.creativeconnections.cz/medsoft/2015/Medsoft_2015_kasal.pdf.
  • 20. Lubomir Stepanek, Pavel Kasal, and Jan Mestak. “Evaluation of facial attractiveness for purposes of plastic surgery using machine-learning methods and image analysis”. In: 20th IEEE International Conference on e-Health Networking, Applications and Services (Healthcom 2018), Ostrava, Czech Republic, September 17-20, 2018. 2018, pp. 1–6. DOI: 10.1109/HealthCom.2018.8531195.
  • 21. R Core Team. R: A Language and Environment for Statistical Computing. R Foundation for Statistical Computing, Vienna, Austria, 2013. ISBN: 3-900051-07-0. URL: http://www.R-project.org/.
  • 22. Davis E. King. “Dlib-ml: A Machine Learning Toolkit”. In: Journal of Machine Learning Research 10 (Dec. 2009), pp. 1755–1758. ISSN: 1532-4435. URL: http://dl.acm.org/citation.cfm?id=1577069.1755843.
  • 23. John Chambers. Statistical models in S. Boca Raton, Fla: Chapman & Hall/CRC, 1992. ISBN: 041283040X.
  • 24. Nir Friedman, Dan Geiger, and Moises Goldszmidt. “Bayesian Network Classifiers”. In: Machine Learning 29.2-3 (Nov. 1997), pp. 131–163. ISSN: 0885-6125. DOI: 10.1023/A:1007465528199.
  • 25. Leo Breiman. Classification and regression trees. New York: Chapman & Hall, 1993. ISBN: 0412048418.
  • 26. Warren S. McCulloch and Walter Pitts. “A logical calculus of the ideas immanent in nervous activity”. In: The Bulletin of Mathematical Biophysics 5.4 (Dec. 1943), pp. 115–133. DOI: 10.1007/bf02478259.
  • 27. Johannes Stallkamp, Marc Schlipsing, Jan Salmen, et al. “The German Traffic Sign Recognition Benchmark: A multi-class classification competition”. In: The 2011 International Joint Conference on Neural Networks. IEEE, July 2011. DOI: 10.1109/ijcnn.2011.6033395.
  • 28. Sridhar Ramaswamy, Pablo Tamayo, Ryan Rifkin, et al. “Multiclass cancer diagnosis using tumor gene expression signatures”. In: Proceedings of the National Academy of Sciences 98.26 (2001), pp. 15149–15154. ISSN: 0027-8424. DOI: 10.1073/pnas.211566398.
Notes
1. Track 1: Artificial Intelligence and Applications
2. Technical Session: 14th International Symposium Advances in Artificial Intelligence and Applications
3. Record processed with funds from the Ministry of Science and Higher Education (MNiSW), agreement No. 461252, under the programme "Social Responsibility of Science" - module: Popularisation of science and promotion of sport (2020).
Document type
Bibliography
YADDA identifier
bwmeta1.element.baztech-98167c23-1a7d-4c60-b0f5-e963cea1fa9c