
    Diverse Dance Synthesis via Keyframes with Transformer Controllers

    View/Open
    v40i7pp071-083.pdf (4.033Mb)
    paper1031_mm.mp4 (233.0Mb)
    Date
    2021
    Author
    Pan, Junjun
    Wang, Siyuan
    Bai, Junxuan
    Dai, Ju
    Abstract
    Existing keyframe-based motion synthesis mainly focuses on the generation of cyclic actions or short-term motion, such as walking, running, and transitions between close postures. However, these methods will significantly degrade the naturalness and diversity of the synthesized motion when dealing with complex and impromptu movements, e.g., dance performance and martial arts. In addition, current research lacks fine-grained control over the generated motion, which is essential for intelligent human-computer interaction and animation creation. In this paper, we propose a novel keyframe-based motion generation network based on multiple constraints, which can achieve diverse dance synthesis via learned knowledge. Specifically, the algorithm is mainly formulated based on the recurrent neural network (RNN) and the Transformer architecture. The backbone of our network is a hierarchical RNN module composed of two long short-term memory (LSTM) units, in which the first LSTM is utilized to embed the posture information of the historical frames into a latent space, and the second one is employed to predict the human posture for the next frame. Moreover, our framework contains two Transformer-based controllers, which are used to model the constraints of the root trajectory and the velocity factor respectively, so as to better utilize the temporal context of the frames and achieve fine-grained motion control. We verify the proposed approach on a dance dataset containing a wide range of contemporary dance. The results of three quantitative analyses validate the superiority of our algorithm. The video and qualitative experimental results demonstrate that the complex motion sequences generated by our algorithm can achieve diverse and smooth motion transitions between keyframes, even for long-term synthesis.
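The abstract describes a hierarchical RNN backbone (one LSTM that embeds the pose history into a latent space, a second that predicts the next-frame posture) steered by two Transformer controllers for the root-trajectory and velocity constraints. A minimal NumPy sketch of that data flow is below; it is an illustrative assumption, not the authors' implementation — the dimensions, weight shapes, and the single scaled dot-product attention standing in for each Transformer controller are all simplifications.

```python
import numpy as np

P, H = 4, 8  # pose dimension and LSTM hidden size (assumed values)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h, c, W):
    """One LSTM step; W packs the four gate matrices row-wise."""
    i, f, o, g = np.split(W @ np.concatenate([x, h]), 4)
    c = sigmoid(f) * c + sigmoid(i) * np.tanh(g)
    return sigmoid(o) * np.tanh(c), c

def attend(query, context):
    """Scaled dot-product attention of one query over the past latent frames."""
    scores = context @ query / np.sqrt(query.size)
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()
    return weights @ context

def synthesize(keyframe, steps, rng):
    """Roll the two-level RNN forward from a keyframe pose (hypothetical weights)."""
    W_embed = 0.1 * rng.standard_normal((4 * H, P + H))      # LSTM 1: pose -> latent
    W_pred  = 0.1 * rng.standard_normal((4 * H, 3 * H + H))  # LSTM 2: -> next posture
    W_out   = 0.1 * rng.standard_normal((P, H))
    q_traj  = rng.standard_normal(H)  # root-trajectory controller query (stand-in)
    q_vel   = rng.standard_normal(H)  # velocity-factor controller query (stand-in)

    pose = keyframe
    h1 = c1 = h2 = c2 = np.zeros(H)
    context, out = [], []
    for _ in range(steps):
        h1, c1 = lstm_step(pose, h1, c1, W_embed)  # embed history into latent space
        context.append(h1)
        ctx = np.stack(context)
        traj = attend(q_traj, ctx)  # trajectory constraint from temporal context
        vel = attend(q_vel, ctx)    # velocity constraint from temporal context
        h2, c2 = lstm_step(np.concatenate([h1, traj, vel]), h2, c2, W_pred)
        pose = W_out @ h2           # predicted posture for the next frame
        out.append(pose)
    return np.stack(out)

motion = synthesize(np.zeros(P), steps=16, rng=np.random.default_rng(0))
print(motion.shape)  # (16, 4)
```

The key structural point the sketch captures is that both controllers attend over the whole latent history before the second LSTM predicts, which is how the paper's design exploits temporal context for fine-grained control.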
    BibTeX
    @article{10.1111:cgf.14402,
    journal = {Computer Graphics Forum},
    title = {{Diverse Dance Synthesis via Keyframes with Transformer Controllers}},
    author = {Pan, Junjun and Wang, Siyuan and Bai, Junxuan and Dai, Ju},
    year = {2021},
    publisher = {The Eurographics Association and John Wiley \& Sons Ltd.},
    ISSN = {1467-8659},
    DOI = {10.1111/cgf.14402}
    }
    URI
    https://doi.org/10.1111/cgf.14402
    https://diglib.eg.org:443/handle/10.1111/cgf14402
    Collections
    • 40-Issue 7

    Eurographics Association copyright © 2013 - 2023 
    Send Feedback | Contact - Imprint | Data Privacy Policy | Disable Google Analytics
    Theme by @mire NV
    System hosted at  Graz University of Technology.