
ANALYSIS AND CONTROL OF FACIAL EXPRESSIONS USING DECOMPOSABLE NONLINEAR GENERATIVE MODELS

Authors :
Chan-Su Lee
Dimitris Samaras
Source :
International Journal of Pattern Recognition and Artificial Intelligence. 28:1456009
Publication Year :
2014
Publisher :
World Scientific Publishing Co Pte Ltd, 2014.

Abstract

Facial expressions convey personal characteristics and subtle emotional states. This paper presents a new framework for modeling the subtle facial motions of different people performing different types of expressions, learned from high-resolution facial expression tracking data, in order to synthesize new stylized subtle facial expressions. A conceptual facial motion manifold provides a unified representation of facial motion dynamics for both three-dimensional (3D) high-resolution and two-dimensional (2D) low-resolution facial motions. Variations in subtle facial motion across different people and expression types are modeled by nonlinear mappings from the embedded conceptual manifold to the input facial motions using empirical kernel maps. We represent facial expressions by a factorized nonlinear generative model that decomposes expression style factors and expression type factors for different people with multiple expressions. We also provide a mechanism to control the high-resolution facial motion model from the tracking and analysis of low-resolution facial video sequences. Using the decomposable generative model with a common motion manifold embedding, we can estimate parameters that control 3D high-resolution facial expressions from 2D tracking results, which allows performance-driven control of high-resolution facial expressions.
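To make the abstract's construction concrete, the sketch below illustrates one plausible form of a decomposable nonlinear generative model: a facial-motion frame generated from a point on a conceptual motion manifold through an empirical (RBF) kernel map, modulated by separate style and expression-type factors combined through a core tensor. The kernel width, tensor shapes, and all variable names are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def empirical_kernel_map(x, centers, sigma=0.1):
    """psi(x): RBF kernel values of a manifold point x against fixed centers (assumed kernel form)."""
    d2 = np.sum((centers - x) ** 2, axis=1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def synthesize(core, style_vec, expr_vec, x, centers):
    """Generate one facial-motion frame: y = core x_1 style x_2 expr x_3 psi(x)."""
    psi = empirical_kernel_map(x, centers)                 # (K,) kernel-map features
    # core has shape (S, E, K, D): style, expression type, kernel dim, output dim
    return np.einsum('s,e,k,sekd->d', style_vec, expr_vec, psi, core)

# Toy dimensions (assumed): 3 style factors, 2 expression types,
# 20 kernel centers on the conceptual manifold, 60 facial-motion coordinates.
rng = np.random.default_rng(0)
core = rng.standard_normal((3, 2, 20, 60))                 # stand-in for a learned core tensor
centers = rng.uniform(0.0, 1.0, size=(20, 1))              # 1D manifold embedding in this toy

style = np.array([0.7, 0.2, 0.1])                          # person-specific style weights
expr = np.array([0.9, 0.1])                                # expression-type weights
frame = synthesize(core, style, expr, np.array([0.35]), centers)
print(frame.shape)                                         # (60,): one synthesized motion frame
```

In this reading, performance-driven control amounts to estimating the style and expression-type weights (and the manifold coordinate) from 2D tracking results and then evaluating the generative mapping to drive the 3D high-resolution model; the estimation procedure itself is not sketched here.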

Details

ISSN :
1793-6381 and 0218-0014
Volume :
28
Database :
OpenAIRE
Journal :
International Journal of Pattern Recognition and Artificial Intelligence
Accession number :
edsair.doi...........782090616c2009590f2e8f5dcba7b10c