VMP: Versatile Motion Priors for Robustly Tracking Motion on Physical Characters

Date
2024
Journal Title
Computer Graphics Forum
Journal ISSN
1467-8659
Publisher
The Eurographics Association and John Wiley & Sons Ltd.
Abstract
Recent progress in physics-based character control has made it possible to learn policies from unstructured motion data. However, it remains challenging to train a single control policy that works with diverse and unseen motions and can be deployed to real-world physical robots. In this paper, we propose a two-stage technique that enables the control of a character with a full-body kinematic motion reference, with a focus on imitation accuracy. In the first stage, we extract a latent-space encoding by training a variational autoencoder on short windows of motion from unstructured data. In the second stage, we use the time-varying latent code to train a conditional policy, providing a mapping from kinematic input to dynamics-aware output. By keeping the two stages separate, we benefit from self-supervised methods for better latent codes and from explicit imitation rewards to avoid mode collapse. We demonstrate the efficiency and robustness of our method in simulation, with unseen user-specified motions, and on a bipedal robot, where we bring dynamic motions to the real world.
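
The two-stage structure described in the abstract can be illustrated with a minimal sketch, assuming PyTorch: a VAE encodes a short window of reference motion into a latent code, and a separate policy network is conditioned on that code together with the simulated character state. All module names, layer sizes, and dimensions below are hypothetical and do not reproduce the authors' implementation.

```python
# Minimal two-stage sketch (hypothetical, not the authors' code): stage 1 learns a
# latent motion prior from short motion windows; stage 2 conditions a control policy
# on the time-varying latent code.
import torch
import torch.nn as nn

class MotionWindowVAE(nn.Module):
    """Stage 1: encode a short window of kinematic motion into a latent code."""
    def __init__(self, window_dim: int, latent_dim: int = 32):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Linear(window_dim, 256), nn.ELU(),
            nn.Linear(256, 256), nn.ELU(),
        )
        self.to_mu = nn.Linear(256, latent_dim)
        self.to_logvar = nn.Linear(256, latent_dim)
        self.decoder = nn.Sequential(
            nn.Linear(latent_dim, 256), nn.ELU(),
            nn.Linear(256, window_dim),
        )

    def encode(self, window: torch.Tensor) -> torch.Tensor:
        h = self.encoder(window)
        mu, logvar = self.to_mu(h), self.to_logvar(h)
        # Reparameterization trick: sample z while keeping gradients.
        return mu + torch.randn_like(mu) * torch.exp(0.5 * logvar)

    def forward(self, window: torch.Tensor) -> torch.Tensor:
        return self.decoder(self.encode(window))

class ConditionalPolicy(nn.Module):
    """Stage 2: map character state + time-varying latent code to an action."""
    def __init__(self, state_dim: int, latent_dim: int, action_dim: int):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(state_dim + latent_dim, 512), nn.ELU(),
            nn.Linear(512, 512), nn.ELU(),
            nn.Linear(512, action_dim),
        )

    def forward(self, state: torch.Tensor, z: torch.Tensor) -> torch.Tensor:
        return self.net(torch.cat([state, z], dim=-1))

# Usage: the stage-1 encoder supplies a latent code for every control step; the
# stage-2 policy would then be trained with reinforcement learning and explicit
# imitation rewards (dimensions below are placeholders).
vae = MotionWindowVAE(window_dim=10 * 60)       # e.g. 10 frames x 60 kinematic features
policy = ConditionalPolicy(state_dim=120, latent_dim=32, action_dim=30)
window = torch.randn(1, 10 * 60)                # short reference-motion window
state = torch.randn(1, 120)                     # simulated character state
action = policy(state, vae.encode(window))
```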
CCS Concepts: Computing methodologies → Learning from demonstrations; Learning latent representations; Reinforcement learning; Physical simulation; Animation; Control methods

@article{10.1111:cgf.15175,
  journal   = {Computer Graphics Forum},
  title     = {{VMP: Versatile Motion Priors for Robustly Tracking Motion on Physical Characters}},
  author    = {Serifi, Agon and Grandia, Ruben and Knoop, Espen and Gross, Markus and Bächer, Moritz},
  year      = {2024},
  publisher = {The Eurographics Association and John Wiley \& Sons Ltd.},
  ISSN      = {1467-8659},
  DOI       = {10.1111/cgf.15175}
}