Learning-Based Animation of Clothing for Virtual Try-On
Date
2020-04-17
Journal title
Journal ISSN
Volume title
Publisher
Abstract
This paper presents a learning-based clothing animation method for highly efficient virtual try-on simulation. Given a garment,
we preprocess a rich database of physically-based dressed character simulations, for multiple body shapes and animations.
Then, using this database, we train a learning-based model of cloth drape and wrinkles, as a function of body shape and
dynamics. We propose a model that separates global garment fit, due to body shape, from local garment wrinkles, due to
both pose dynamics and body shape. We use a recurrent neural network to regress garment wrinkles, and we achieve highly
plausible nonlinear effects, in contrast to the blending artifacts suffered by previous methods. At runtime, dynamic virtual
try-on animations are produced in just a few milliseconds for garments with thousands of triangles. We show qualitative and
quantitative analysis of results.
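To make the separation described above concrete, the sketch below illustrates one plausible way to structure the two-part model: a static network that regresses global garment fit displacements from body shape, and a recurrent network that regresses per-frame wrinkle displacements from a pose-and-shape sequence, with both added to a template garment mesh. This is a minimal sketch assuming a PyTorch implementation; the layer sizes, input dimensions (e.g., 10 shape and 72 pose parameters, 4000 vertices), the choice of a GRU, and the skinning step omitted here are illustrative assumptions, not details taken from the abstract.

```python
import torch
import torch.nn as nn

class GarmentFitRegressor(nn.Module):
    """Static branch (assumed architecture): body shape -> global garment fit displacements."""
    def __init__(self, num_shape_params=10, num_verts=4000, hidden=256):
        super().__init__()
        self.num_verts = num_verts
        self.mlp = nn.Sequential(
            nn.Linear(num_shape_params, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, num_verts * 3),
        )

    def forward(self, shape):
        # shape: (batch, num_shape_params) -> (batch, num_verts, 3) fit displacements
        return self.mlp(shape).view(-1, self.num_verts, 3)

class GarmentWrinkleRegressor(nn.Module):
    """Dynamic branch (assumed architecture): pose + shape sequence -> per-frame wrinkles."""
    def __init__(self, num_pose_params=72, num_shape_params=10, num_verts=4000, hidden=512):
        super().__init__()
        self.num_verts = num_verts
        self.rnn = nn.GRU(num_pose_params + num_shape_params, hidden, batch_first=True)
        self.out = nn.Linear(hidden, num_verts * 3)

    def forward(self, pose_seq, shape):
        # pose_seq: (batch, frames, num_pose_params); shape: (batch, num_shape_params)
        shape_seq = shape.unsqueeze(1).expand(-1, pose_seq.size(1), -1)
        h, _ = self.rnn(torch.cat([pose_seq, shape_seq], dim=-1))
        return self.out(h).view(pose_seq.size(0), pose_seq.size(1), self.num_verts, 3)

# Hypothetical usage: both displacement fields are added to a template garment mesh,
# which would then be skinned to the posed body (skinning omitted in this sketch).
num_verts = 4000
template_verts = torch.zeros(num_verts, 3)     # placeholder garment template
fit_net = GarmentFitRegressor(num_verts=num_verts)
wrinkle_net = GarmentWrinkleRegressor(num_verts=num_verts)

shape = torch.randn(2, 10)                     # body-shape coefficients (assumed dimension)
pose_seq = torch.randn(2, 30, 72)              # 30 frames of pose parameters (assumed dimension)

fit = fit_net(shape)                           # (2, num_verts, 3), static per body shape
wrinkles = wrinkle_net(pose_seq, shape)        # (2, 30, num_verts, 3), varies per frame
deformed = template_verts + fit.unsqueeze(1) + wrinkles   # (2, 30, num_verts, 3)
```

Keeping the fit and wrinkle branches separate matches the abstract's motivation: the recurrent wrinkle regressor can capture nonlinear, dynamics-dependent detail without the blending artifacts of earlier methods, while the static fit term depends only on body shape.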
Description
"This is the peer reviewed version of the following article: [FULL CITE], which has been published in final form at [Link to final article using the DOI]. This article may be used for non-commercial purposes in accordance with Wiley Terms and Conditions for Use of Self-Archived Versions."
Keywords
Citation
Collections
Unless otherwise noted, the item's license is described as Attribution-NonCommercial-ShareAlike 4.0 International