Show simple item record

dc.contributor.author	Albrecht, Irene
dc.date.accessioned	2016-01-20T15:36:51Z
dc.date.available	2016-01-20T15:36:51Z
dc.date.issued	2005
dc.identifier.uri	http://diglib.eg.org/handle/10.2312/14683
dc.description.abstract	In order to be believable, virtual human characters must be able to communicate in a realistic, human-like fashion. This dissertation contributes to improving and automating several aspects of virtual conversations. We have proposed techniques to add non-verbal, speech-related facial expressions to audiovisual speech, such as head nods for emphasis. During conversation, humans experience shades of emotion far more frequently than the strong Ekmanian basic emotions. This prompted us to develop a method that interpolates between facial expressions of emotions, based on an emotion model, to create new ones. In the area of facial modeling, we have presented a system that generates plausible 3D face models from vague mental images; it makes use of a morphable model of faces and exploits correlations among facial features. The hands also play a major role in human communication. Since the basis for every realistic animation of gestures must be a convincing model of the hand, we devised a physics-based anatomical hand model in which a hybrid muscle model drives the animations. The model was used to visualize complex hand movements captured using multi-exposure photography.	en_US
dc.language.iso	en	en_US
dc.title	Faces And Hands - Modeling and Animating Anatomical and Photorealistic Models with Regard to the Communicative Competence of Virtual Humans	en_US
dc.type	Thesis	en_US


