Virtual Reality & Intelligent Hardware (Oct 2024)
Music-stylized hierarchical dance synthesis with user control
Abstract
Background: Synthesizing dance motions to match musical inputs is a significant challenge in animation research. Compared to functional human motions, such as locomotion, dance motions are creative and artistic, often influenced by music, and can be independent body language expressions. Dance choreography requires motion content to follow a general dance genre, whereas dance performances under musical influence are infused with diverse impromptu motion styles. Considering the high expressiveness and variations in space and time, providing accessible and effective user control for tuning dance motion styles remains an open problem.

Methods: In this study, we present a hierarchical framework that decouples the dance synthesis task into independent modules. We use a high-level choreography module, built as a Transformer-based sequence model, to predict the long-term structure of a dance genre, and a low-level realization module that implements dance stylization and synchronization to match the musical input or user preferences. This framework allows the individual modules to be trained separately. Because of the decoupling, dance composition can fully utilize existing high-quality dance datasets that lack musical accompaniment, and the dance realization can conveniently incorporate user controls and edit motions through a decoder network. Each module is replaceable at runtime, which adds flexibility to the synthesis of dance sequences.

Results: Synthesized results demonstrate that our framework generates high-quality, diverse dance motions that are well adapted to varying musical conditions and user controls.
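The decoupled two-module design described in the Methods can be sketched as follows. This is a minimal toy illustration, not the paper's implementation: the class names, the phrase-latent interface, and the scalar "music energy" and "style weight" inputs are all assumptions made for the example, and a random linear map stands in for the trained Transformer and decoder networks.

```python
import numpy as np

class ChoreographyModule:
    """High-level choreography: predicts the long-term structure of a
    dance genre as a sequence of phrase-level latent codes. The paper
    uses a Transformer-based sequence model; a toy random predictor
    stands in here, so no musical accompaniment is required."""

    def __init__(self, n_phrases=4, latent_dim=8, seed=0):
        self.n_phrases = n_phrases
        self.latent_dim = latent_dim
        self.rng = np.random.default_rng(seed)

    def predict_structure(self, genre_id):
        # One latent code per choreographic phrase; genre_id is a toy
        # stand-in for conditioning on the dance genre.
        codes = self.rng.standard_normal((self.n_phrases, self.latent_dim))
        return codes + 0.1 * genre_id


class RealizationModule:
    """Low-level realization: decodes phrase latents into pose frames,
    synchronized to music features and modulated by a user style weight."""

    def __init__(self, pose_dim=24, frames_per_phrase=30, seed=1):
        self.pose_dim = pose_dim
        self.frames_per_phrase = frames_per_phrase
        rng = np.random.default_rng(seed)
        # Fixed random "decoder" weights (a trained network in practice).
        self.W = rng.standard_normal((8, pose_dim)) * 0.1

    def decode(self, structure, music_energy, style_weight=0.5):
        frames = []
        for code, energy in zip(structure, music_energy):
            base_pose = code @ self.W                      # (pose_dim,)
            t = np.linspace(0.0, 1.0, self.frames_per_phrase)
            # Oscillate around the base pose; music energy scales the
            # amplitude, style_weight scales the user's stylization.
            motion = np.sin(2 * np.pi * t)[:, None] * base_pose
            frames.append(base_pose + style_weight * energy * motion)
        return np.concatenate(frames, axis=0)              # (frames, pose_dim)


# Usage: the two modules communicate only through the phrase-latent
# interface, so either side can be swapped at runtime.
choreo = ChoreographyModule()
realize = RealizationModule()
structure = choreo.predict_structure(genre_id=2)   # composed without music
music_energy = [0.2, 0.9, 0.5, 0.7]                # one toy value per phrase
motion = realize.decode(structure, music_energy, style_weight=0.8)
print(motion.shape)  # (120, 24): 4 phrases x 30 frames of 24-D poses
```

The key point the sketch mirrors is the training/runtime split: `ChoreographyModule` never sees audio, so it can be trained on dance datasets without musical accompaniment, while `RealizationModule` is the only place music features and user controls enter.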