
Keyframe Control of Music-driven 3D Dance Generation

Authors :
Yang, Zhipeng
Wen, Yu-Hui
Chen, Shu-Yu
Liu, Xiao
Gao, Yuan
Liu, Yong-Jin
Gao, Lin
Fu, Hongbo
Publication Year :
2023

Abstract

Choreography with artificial intelligence has recently attracted increasing attention from 3D animators. However, most existing deep learning methods rely mainly on music for dance generation and offer insufficient control over the generated dance motions. To address this issue, we introduce the idea of keyframe interpolation for music-driven dance generation and present a novel transition generation technique for choreography. Specifically, this technique synthesizes visually diverse and plausible dance motions by using normalizing flows to learn the probability distribution of dance motions conditioned on a piece of music and a sparse set of key poses. Thus, the generated dance motions respect both the input musical beats and the key poses. To achieve robust transitions of varying lengths between the key poses, we introduce a time embedding at each timestep as an additional condition. Extensive experiments show that our model generates more realistic, diverse, and beat-matching dance motions than the compared state-of-the-art methods, both qualitatively and quantitatively. Our experimental results demonstrate the superiority of keyframe-based control for improving the diversity of the generated dance motions.
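The core mechanism the abstract describes, a normalizing flow over dance poses conditioned on music features, key poses, and a per-timestep time embedding, can be illustrated with a minimal sketch. The snippet below shows a single affine coupling layer (a standard invertible building block of normalizing flows) whose scale and shift depend on such a condition vector. All dimensions, the sinusoidal time embedding, and the tiny linear "coupling network" are illustrative assumptions, not the authors' actual architecture.

```python
import numpy as np

def time_embedding(t, dim=8):
    """Sinusoidal embedding of timestep t, used here as the extra
    condition that lets the flow handle transitions of varying length."""
    freqs = np.exp(-np.arange(dim // 2) * np.log(10000.0) / (dim // 2))
    angles = t * freqs
    return np.concatenate([np.sin(angles), np.cos(angles)])

rng = np.random.default_rng(0)
POSE_DIM, MUSIC_DIM, T_DIM = 6, 4, 8          # toy sizes (assumptions)
COND_DIM = MUSIC_DIM + POSE_DIM + T_DIM       # music + key pose + time
HALF = POSE_DIM // 2

# A toy linear conditioner standing in for the learned coupling network.
W_s = rng.normal(scale=0.1, size=(COND_DIM + HALF, HALF))
W_t = rng.normal(scale=0.1, size=(COND_DIM + HALF, HALF))

def coupling_forward(x, cond):
    """Map a pose x to latent z; invertible given the same condition."""
    x1, x2 = x[:HALF], x[HALF:]
    h = np.concatenate([x1, cond])
    s, t = np.tanh(h @ W_s), h @ W_t          # bounded scale, free shift
    return np.concatenate([x1, x2 * np.exp(s) + t])

def coupling_inverse(z, cond):
    """Exact inverse: generate a pose from latent z under the condition."""
    z1, z2 = z[:HALF], z[HALF:]
    h = np.concatenate([z1, cond])
    s, t = np.tanh(h @ W_s), h @ W_t
    return np.concatenate([z1, (z2 - t) * np.exp(-s)])

# Condition = music feature + a nearby key pose + time embedding for step t.
music = rng.normal(size=MUSIC_DIM)
key_pose = rng.normal(size=POSE_DIM)
cond = np.concatenate([music, key_pose, time_embedding(t=5, dim=T_DIM)])

pose = rng.normal(size=POSE_DIM)
z = coupling_forward(pose, cond)              # encode a training pose
recovered = coupling_inverse(z, cond)         # sampling direction
```

Because the coupling layer is exactly invertible, sampling a new latent `z` and applying the inverse under a given music/key-pose/time condition yields a pose consistent with that condition, which is the property the generation method relies on.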

Details

Database :
OAIster
Notes :
English
Publication Type :
Electronic Resource
Accession number :
edsoai.on1452721261
Document Type :
Electronic Resource