Search results
Query keywords: facial features
Results found: 3
1. EDFCES: A new example-driven 3D face construction and editing system
This paper presents an automatic runtime system for generating varied, realistic face models by synthesizing a global face shape and local facial features according to intuitive, high-level control parameters. Our method takes 3D face scans as examples in order to exploit the parameter-to-geometry correlations present in real faces. To establish correspondences among the scanned models, we use a three-step model-fitting approach that conforms a generic head mesh onto each scanned model. We transform the resulting data sets of global face shapes and local feature shapes into vector-space representations by applying principal component analysis (PCA). We compute a set of face anthropometric measurements to parameterize the example shapes in the measurement spaces. Using PCA coefficients as a compact shape representation, we approach the shape-synthesis problem by forming scattered-data interpolation functions designed to generate the desired face shape from anthropometric parameters given as input. At runtime, the interpolation functions are evaluated for the input parameter values to produce new face geometries at an interactive rate. Correspondence among all example face textures is obtained by parameterizing the 3D generic mesh over a 2D image domain, and a new feature texture with the desired attributes is synthesized by interpolating the example textures. The resulting system offers intuitive, fine-grained control. We demonstrate our method by applying different parameters to generate a wide range of face models.
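The abstract describes a concrete pipeline for the geometry path: PCA over corresponded example shapes, then scattered-data interpolation from anthropometric measurements to PCA coefficients. The Python sketch below illustrates that shape-synthesis stage only; it is not the authors' implementation. The mesh correspondence and measurement extraction are assumed to be done already, the arrays are random stand-ins, and the RBF kernel and component count are illustrative choices.

```python
# Minimal sketch of the shape-synthesis stage, assuming example meshes are
# already in dense correspondence and anthropometric measurements are given.
import numpy as np
from sklearn.decomposition import PCA
from scipy.interpolate import RBFInterpolator

# Hypothetical example data: n_examples face scans, each flattened to a
# 3*n_vertices vector, plus a few anthropometric measurements per scan.
n_examples, n_vertices, n_measurements = 50, 5000, 6
shapes = np.random.rand(n_examples, 3 * n_vertices)        # stand-in for fitted scans
measurements = np.random.rand(n_examples, n_measurements)  # stand-in for measurements

# 1. Compact shape representation: PCA coefficients of the example shapes.
pca = PCA(n_components=20)
coeffs = pca.fit_transform(shapes)                          # (n_examples, 20)

# 2. Scattered-data interpolation from measurement space to PCA coefficients.
interp = RBFInterpolator(measurements, coeffs, kernel="thin_plate_spline")

# 3. Runtime synthesis: evaluate the interpolant for user-supplied
#    anthropometric parameters and map back to vertex positions.
new_params = np.random.rand(1, n_measurements)              # intuitive control values
new_coeffs = interp(new_params)
new_shape = pca.inverse_transform(new_coeffs).reshape(n_vertices, 3)
print(new_shape.shape)  # (5000, 3) vertex positions of the synthesized face
```

The paper applies the same interpolation idea to example textures as well; the sketch above covers only the geometry path.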
2.
People often assess other people's personality traits merely on the basis of their emotional expressions or the physical features of their faces. In this paper we review the evidence of biases in judgments of trustworthiness and confidence formed from two types of facial characteristics. One line of evidence documents the influence of emotional expressions, which represent an individual's motivational state and reflect the agent's intentions; people's judgments about the trustworthiness or attractiveness of others largely depend on the emotions expressed. The second line of evidence describes how facial appearance (e.g., cues of physical strength or resemblance to one's own face) affects inferences of personality traits. The two experiments described in this paper investigated the interplay between these two factors (i.e., facial features and emotional expression) and their combined influence on social judgments. We hypothesized and tested how facial features conveying trustworthiness (vs. dominance) and a smiling (vs. neutral) expression jointly influence judgments of trustworthiness and confidence (Study 1). We also tested the influence of facial resemblance when forming judgments during an interaction with a smiling individual (Study 2). We found that relatively static facial features conveying trust had more impact on judgments of trustworthiness than emotional expressions did, whereas emotional expressions appeared more impactful for judgments of dominance. The results of both studies are discussed from a sociocognitive perspective.
3. An Age-Group Ranking Model for Facial Age Estimation
Age prediction has become an important computer vision task. Although the task requires the age of an individual to be predicted from a given face, research has shown that it is more intuitive and easier for humans to decide which of two individuals is older than to decide how old an individual is. This work follows that intuition and aids the age prediction of a face by exploiting the age information available from other faces. It further explores the statistical relationships between facial features within age groups to compute age-group ranks for a given face. The resulting age-group rank is low-dimensional and age-discriminatory, and therefore improves age-prediction accuracy when fed into an age predictor. Experiments on publicly available facial ageing datasets (FGnet, PAL, and Adience) show the effectiveness of the proposed age-group ranking model when used with traditional machine learning algorithms as well as deep learning algorithms. Cross-dataset validation, i.e. training and testing on entirely different datasets, was also employed to further investigate the effectiveness of the method.
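The abstract gives the recipe only at a high level: compare a face's features against per-age-group statistics to obtain a low-dimensional, age-discriminatory rank vector, then feed that vector to a conventional predictor. The Python sketch below is one plausible reading of that idea, not the paper's model; the age-group bins, the Mahalanobis-distance similarity, and the random-forest regressor are assumptions made for illustration.

```python
# Sketch of an age-group ranking feature, under assumed bins and similarity.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

AGE_GROUPS = [(0, 12), (13, 19), (20, 35), (36, 55), (56, 100)]  # assumed bins

def group_rank_vector(x, group_means, group_covs_inv):
    """Score a feature vector against each age group (negative squared
    Mahalanobis distance to the group mean), then rank the groups."""
    scores = np.array([
        -float((x - m) @ c_inv @ (x - m))
        for m, c_inv in zip(group_means, group_covs_inv)
    ])
    # Convert raw scores into ranks (0 = least similar group).
    return scores.argsort().argsort().astype(float)

def fit_age_estimator(feats, ages):
    """feats: (n_faces, d) facial features; ages: (n_faces,) ground-truth ages."""
    means, covs_inv = [], []
    for lo, hi in AGE_GROUPS:
        grp = feats[(ages >= lo) & (ages <= hi)]
        means.append(grp.mean(axis=0))
        covs_inv.append(np.linalg.pinv(np.cov(grp, rowvar=False)))
    ranks = np.array([group_rank_vector(x, means, covs_inv) for x in feats])
    reg = RandomForestRegressor(n_estimators=200).fit(ranks, ages)
    return means, covs_inv, reg

def predict_age(x, means, covs_inv, reg):
    """Predict the exact age of one face from its age-group rank vector."""
    return reg.predict(group_rank_vector(x, means, covs_inv)[None, :])[0]

# Illustrative usage with synthetic data (random features, random ages).
rng = np.random.default_rng(0)
feats = rng.normal(size=(300, 10))
ages = rng.integers(0, 80, size=300)
model = fit_age_estimator(feats, ages)
print(predict_age(feats[0], *model))
```

The rank vector has one entry per age group, which keeps the representation low-dimensional regardless of the size of the raw feature descriptor, in line with the abstract's motivation.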