Browsing by Author "Cho, Kyungmin"
Now showing 1 - 3 of 3
Item: Online Avatar Motion Adaptation to Morphologically-similar Spaces
(The Eurographics Association and John Wiley & Sons Ltd., 2023)
Authors: Choi, Soojin; Hong, Seokpyo; Cho, Kyungmin; Kim, Chaelin; Noh, Junyong
Editors: Myszkowski, Karol; Niessner, Matthias
Abstract: In avatar-mediated telepresence systems, a similar environment is assumed for the involved spaces, so that the avatar in a remote space can imitate the user's motion with the proper semantic intention performed in a local space. For example, a touch on the desk by the user should be reproduced by the avatar in the remote space to convey the intended meaning correctly. It is unlikely, however, that the two involved physical spaces are exactly the same in terms of room size or the locations of the placed objects. Therefore, naively mapping the user's joint motion to the avatar will not create semantically correct avatar motion in relation to the remote environment. Existing studies have addressed the problem of retargeting human motions to an avatar for telepresence applications. Few studies, however, have focused on retargeting continuous full-body motions such as locomotion and object interaction in a unified manner. In this paper, we propose a novel motion adaptation method that generates the full-body motions of a human-like avatar on the fly in the remote space. The proposed method handles locomotion and object interaction motions, as well as smooth transitions between them, according to given user actions, under the condition of a bijective environment mapping between morphologically-similar spaces. Our experiments show the effectiveness of the proposed method in generating plausible and semantically correct full-body motions of an avatar in a room-scale space.

Item: Recurrent Motion Refiner for Locomotion Stitching
(© 2023 Eurographics - The European Association for Computer Graphics and John Wiley & Sons Ltd., 2023)
Authors: Kim, Haemin; Cho, Kyungmin; Hong, Seokhyeon; Noh, Junyong
Editors: Hauser, Helwig; Alliez, Pierre
Abstract: Stitching different character motions is one of the most commonly used animation techniques, as it allows the user to build new animations that fit a given purpose from pieces of motion. However, current motion stitching methods often produce unnatural motion with foot-sliding artefacts, depending on the performance of the interpolation. In this paper, we propose a novel motion stitching technique based on a recurrent motion refiner (RMR) that connects discontinuous locomotions into a single natural locomotion. Our model receives different locomotions as input, in which the root of the last pose of the previous motion and that of the first pose of the next motion are aligned. During runtime, the model slides through the sequence, editing frames window by window to output a smoothly connected animation. Our model consists of a two-layer recurrent network that sits between a simple encoder and decoder. To train this network, we created a sufficient amount of paired data with a newly designed data generation process. This process employs a K-nearest neighbour search that explores a predefined motion database to create the input corresponding to the ground truth. Once trained, the model can connect locomotion sequences of various lengths into a single natural locomotion.

Item: Synthesizing Character Animation with Smoothly Decomposed Motion Layers
(© 2020 Eurographics - The European Association for Computer Graphics and John Wiley & Sons Ltd, 2020)
Authors: Eom, Haegwang; Choi, Byungkuk; Cho, Kyungmin; Jung, Sunjin; Hong, Seokpyo; Noh, Junyong
Editors: Benes, Bedrich; Hauser, Helwig
Abstract: The processing of captured motion is an essential task in the synthesis of high-quality character animation. The motion decomposition techniques investigated in prior work extract meaningful motion primitives that help facilitate this process. Carefully selected motion primitives can play a major role in various motion-synthesis tasks, such as interpolation, blending, warping, editing, or the generation of new motions. Unfortunately, for a complex character motion, finding generic motion primitives by decomposition is an intractable problem due to the compound nature of such characters' behaviours. Additionally, decomposed motion primitives tend to be too limited for the chosen model to cover a broad range of motion-synthesis tasks. To address these challenges, we propose a generative motion decomposition framework in which the decomposed motion primitives are applicable to a wide range of motion-synthesis tasks. Technically, the input motion is smoothly decomposed into three motion layers: a base-level motion, a layer of controllable motion displacements, and a layer of high-frequency residuals. The final motion can be synthesized simply by changing a single user parameter linked to the layer of controllable motion displacements, or by imposing suitable temporal correspondences on the decomposition framework. Our experiments show that this decomposition provides a great deal of flexibility in several motion-synthesis scenarios: denoising, style modulation, upsampling and time warping.
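The layered decomposition described in the last abstract can be illustrated with a minimal sketch. This is not the authors' implementation: the function names (`smooth`, `decompose`, `synthesize`), the moving-average filtering, and the window sizes are all assumptions chosen only to show the general idea of splitting a motion curve into a smooth base layer, a controllable displacement layer, and a high-frequency residual, where summing the layers reconstructs the input exactly and scaling the displacement layer modulates the motion.

```python
# Illustrative sketch (not the paper's method): decompose a 1-D joint
# trajectory into three layers using moving-average filters of two widths.
# base + displacement + residual reconstructs the input exactly; a single
# gain on the displacement layer exaggerates or attenuates the motion.
import math


def smooth(signal, half_window):
    """Moving-average filter with edge clamping."""
    n = len(signal)
    out = []
    for i in range(n):
        lo = max(0, i - half_window)
        hi = min(n, i + half_window + 1)
        out.append(sum(signal[lo:hi]) / (hi - lo))
    return out


def decompose(signal, base_half_window=8, disp_half_window=2):
    """Split a motion curve into (base, displacement, residual) layers."""
    base = smooth(signal, base_half_window)   # low-frequency base motion
    mid = smooth(signal, disp_half_window)    # base plus displacement
    displacement = [m - b for m, b in zip(mid, base)]
    residual = [s - m for s, m in zip(signal, mid)]  # high-frequency detail
    return base, displacement, residual


def synthesize(base, displacement, residual, gain=1.0):
    """Recombine the layers; gain scales only the displacement layer."""
    return [b + gain * d + r for b, d, r in zip(base, displacement, residual)]


if __name__ == "__main__":
    # A toy trajectory: slow swing plus small high-frequency wobble.
    curve = [math.sin(0.3 * t) + 0.1 * math.sin(3.0 * t) for t in range(60)]
    base, disp, res = decompose(curve)
    recon = synthesize(base, disp, res, gain=1.0)
    # With gain=1.0 the sum of the layers reproduces the input exactly.
    assert all(abs(a - b) < 1e-9 for a, b in zip(curve, recon))
```

By construction the three layers sum back to the input, so a gain other than 1.0 changes only the mid-band displacement component, which is one simple way to realize the "single user parameter" style control mentioned in the abstract.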