Volume 41 (2022)
Browsing Volume 41 (2022) by Subject "animation"
Item: Learning Camera Control in Dynamic Scenes from Limited Demonstrations (© 2022 Eurographics ‐ The European Association for Computer Graphics and John Wiley & Sons Ltd., 2022)
Hanocka, R.; Assa, J.; Cohen‐Or, D.; Giryes, R.; Hauser, Helwig and Alliez, Pierre
In this work, we present our strategy for camera control in dynamic scenes with multiple people (sports teams). We learn a generic model of the player dynamics offline in simulation, and use only a few sparse demonstrations of a user's camera control policy to learn a reward function that drives camera motion in an ongoing dynamic scene. Key to our approach is a low‐dimensional representation of the scene dynamics that is independent of the environment's actions and rewards, which enables the reward function to be learned from only a small number of examples. We cast the user‐specific control objective as an inverse reinforcement learning problem, aiming to learn an expert's intention from a small number of demonstrations. The learned reward function is used in combination with a visual model predictive controller (MPC). Because the generic scene dynamics model is agnostic to the user‐specific reward, the same dynamics model can be reused for different camera control policies. We show the effectiveness of our method on simulated and real soccer matches.

Item: Narrow‐Band Screen‐Space Fluid Rendering (© 2022 Eurographics ‐ The European Association for Computer Graphics and John Wiley & Sons Ltd., 2022)
Oliveira, Felipe; Paiva, Afonso; Hauser, Helwig and Alliez, Pierre
This paper presents a novel, practical screen‐space liquid rendering method for particle‐based fluids in real‐time applications. Our rendering pipeline performs particle filtering only in a narrow band around the boundary particles, providing a smooth liquid surface with volumetric rendering effects. We also introduce a novel boundary detection method that allows the user to select particle layers from the liquid interface. The proposed approach is simple, fast, memory‐efficient, easy to code, and straightforward to adapt into standard screen‐space rendering pipelines, including on GPU architectures. Through a set of experiments, we show how prior screen‐space techniques benefit from our approach.

Item: Transition Motion Synthesis for Object Interaction based on Learning Transition Strategies (© 2022 Eurographics ‐ The European Association for Computer Graphics and John Wiley & Sons Ltd., 2022)
Hwang, Jaepyung; Park, Gangrae; Kwon, Taesoo; Ishii, Shin; Hauser, Helwig and Alliez, Pierre
In this study, we develop a motion synthesis framework that generates a natural transition motion between two different behaviours for interacting with a moving object. Specifically, the proposed framework generates a transition motion bridging from a locomotive behaviour to an object interaction behaviour. The transition motion must adapt online to the spatio‐temporal variation of the target object so that the two behaviours connect naturally. To this end, we propose a framework that combines a regression model with a transition motion planner. The neural network‐based regression model estimates a reference transition strategy that guides the reference pattern of the transition, adapted to the varying situation. The transition motion planner then reconstructs the transition motion from the reference pattern while satisfying dynamic constraints that avoid footskate, as well as interaction constraints. The proposed framework is validated on the synthesis of various transition motions that adapt to the spatio‐temporal variation of the object, using object‐grasping motions and athletic motions in soccer.

Item: Wassersplines for Neural Vector Field-Controlled Animation (The Eurographics Association and John Wiley & Sons Ltd., 2022)
Zhang, Paul; Smirnov, Dmitriy; Solomon, Justin; Dominik L. Michels; Soeren Pirk
Much of computer-generated animation is created by manipulating meshes with rigs. While this approach works well for animating articulated objects like animals, it has limited flexibility for animating less structured, free-form objects. We introduce Wassersplines, a novel trajectory inference method for animating unstructured densities based on recent advances in continuous normalizing flows and optimal transport. The key idea is to train a neurally parameterized velocity field that represents the motion between keyframes. Trajectories are then computed by advecting keyframes through the velocity field. We solve an additional Wasserstein barycenter interpolation problem to guarantee strict adherence to keyframes. Our tool can stylize trajectories through a variety of PDE-based regularizers to create different visual effects. We demonstrate our tool on various keyframe interpolation problems to produce temporally coherent animations without meshing or rigging.
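The advection step described in the Wassersplines abstract, transporting keyframe samples through a velocity field, can be sketched as follows. This is a minimal illustration only: the `rotation_field` stand-in and the explicit Euler integrator are assumptions for the sake of a runnable example, not the paper's neurally parameterized field or its solver.

```python
import numpy as np

def advect(points, velocity, t0, t1, steps=2000):
    """Advect sample points through a time-dependent velocity field
    using explicit Euler integration. In Wassersplines the field would
    be a trained neural network; here it is any callable v(p, t)."""
    dt = (t1 - t0) / steps
    p = points.astype(float).copy()
    t = t0
    for _ in range(steps):
        p += dt * velocity(p, t)  # one Euler step along the flow
        t += dt
    return p

def rotation_field(p, t):
    """Illustrative stand-in field: rigid rotation about the origin."""
    return np.stack([-p[:, 1], p[:, 0]], axis=1)

# A "keyframe" density represented by samples; a quarter-turn advection
# carries the point (1, 0) to approximately (0, 1).
keyframe = np.array([[1.0, 0.0]])
frame = advect(keyframe, rotation_field, 0.0, np.pi / 2)
```

With enough integration steps, `frame` lands close to `[[0, 1]]`; higher-order integrators (or the continuous-normalizing-flow solvers the paper builds on) would reduce the drift of plain Euler stepping.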