
Browsing by Author "Zhang, Yan"

Now showing 1 - 3 of 3
    Fictional Worlds, Real Connections: Developing Community Storytelling Social Chatbots through LLMs
    (The Eurographics Association, 2023) Sun, Yuqian; Wang, Hanyi; Chan, Pok Man; Tabibi, Morteza; Zhang, Yan; Lu, Huan; Chen, Yuheng; Lee, Chang Hee; Asadipour, Ali; Pelechano, Nuria; Liarokapis, Fotis; Rohmer, Damien; Asadipour, Ali
    We address the integration of storytelling and Large Language Models (LLMs) to develop engaging and believable Social Chatbots (SCs) in community settings. Motivated by the potential of fictional characters to enhance social interactions, we introduce Storytelling Social Chatbots (SSCs) and the concept of story engineering to transform fictional game characters into "live" social entities within player communities. Our story engineering process includes three steps: (1) Character and story creation, defining the SC's personality and worldview, (2) Presenting Live Stories to the Community, allowing the SC to recount challenges and seek suggestions, and (3) Communication with community members, enabling interaction between the SC and users. We employed the LLM GPT-3 to drive our SSC prototypes, "David" and "Catherine," and evaluated their performance in an online gaming community, "DE (Alias)," on Discord. Our mixed-method analysis, based on questionnaires (N=15) and interviews (N=8) with community members, reveals that storytelling significantly enhances the engagement and believability of SCs in community settings.
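The three-step story engineering process in the abstract above can be pictured as plain prompt assembly. This is a minimal sketch under assumptions: the function names, prompt wording, and example persona are illustrative, not the authors' actual GPT-3 prompts.

```python
# Sketch of the three story-engineering steps as prompt assembly.
# All names and wording here are illustrative assumptions.

def make_persona(name, worldview):
    # Step 1: character and story creation -- fix the SC's personality and worldview.
    return f"You are {name}, a fictional character. Worldview: {worldview}."

def make_live_story(persona, challenge):
    # Step 2: present a live story -- recount a challenge and seek suggestions.
    return (persona + "\nTell the community about this challenge and ask "
            "for their suggestions: " + challenge)

def make_reply_prompt(persona, history, user_message):
    # Step 3: communication -- fold the chat history into the next completion prompt.
    log = "\n".join(history)
    return f"{persona}\nConversation so far:\n{log}\nUser: {user_message}\nReply:"
```

Each string would be sent to the LLM as a completion prompt, and the community's replies appended to `history` for the next turn.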
    GPU-based Real-time Cloth Simulation for Virtual Try-on
    (The Eurographics Association, 2018) Su, Tongkui; Zhang, Yan; Zhou, Yu; Yu, Yao; Du, Sidan; Fu, Hongbo and Ghosh, Abhijeet and Kopf, Johannes
    We present a novel real-time approach for dynamic detailed clothing simulation on a moving body. The most distinctive feature of our method is that it divides dynamic simulation into two parts: local driving and static cloth simulation. In local driving, the feature points of the clothing are handled between two consecutive frames; static cloth simulation is then applied to the specific frame. Both parts are executed fully in parallel. In practice, our system achieves real-time virtual try-on using a depth camera to capture the moving body model while maintaining high fidelity. Experimental results indicate that our method achieves significant speedups over prior related techniques.
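The two-part per-frame scheme described in the abstract above (local driving of feature points, then a static cloth pass) can be illustrated on a toy 1-D particle chain. This is a minimal sketch under assumed simplifications, a single driven feature point and a follow-the-leader distance constraint, not the paper's GPU solver.

```python
# Toy 1-D illustration of the two-part scheme: "local driving" places the
# feature (anchor) point for the new frame, then a "static cloth" pass
# restores rest lengths. The follow-the-leader projection is an assumed
# simplification, not the paper's method.

def static_pass(points, rest):
    # Static cloth simulation for one frame: restore each link to rest length,
    # propagating outward from the driven feature point.
    pts = list(points)
    for i in range(1, len(pts)):
        d = pts[i] - pts[i - 1]
        sign = 1.0 if d >= 0 else -1.0
        pts[i] = pts[i - 1] + sign * rest
    return pts

def simulate_frame(points, anchor_target, rest=1.0):
    # Local driving: move the feature point between two consecutive frames,
    # then run the static pass on the resulting configuration.
    pts = list(points)
    pts[0] = anchor_target
    return static_pass(pts, rest)
```

For example, driving the anchor of the chain [0, 1, 2] to position 5 yields [5, 4, 3]: the feature point snaps to the new body pose and the static pass restores unit rest lengths. In the paper's setting, both parts run in parallel on the GPU over the full garment mesh.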
    Text2Mat: Generating Materials from Text
    (The Eurographics Association, 2023) He, Zhen; Guo, Jie; Zhang, Yan; Tu, Qinghao; Chen, Mufan; Guo, Yanwen; Wang, Pengyu; Dai, Wei; Chaine, Raphaëlle; Deng, Zhigang; Kim, Min H.
    Specific materials are often associated with a certain type of object in the real world. They simulate the way the surface of that object interacts with light and are named after that type of object. We observe that the text labels of materials carry high-level semantic information, which can be used as guidance to assist the generation of specific materials. Based on this, we propose Text2Mat, a text-guided material generation framework. To meet the demand for material generation based on text descriptions, we construct a large set of PBR materials with specific text labels. Each material contains detailed text descriptions that match its visual appearance. Furthermore, to control the texture and spatial layout of generated materials through text, we introduce texture attribute labels and extra attributes describing regular materials. Using this dataset, we train a neural network adapted from Stable Diffusion to achieve text-based material generation. Extensive experiments and rendering results demonstrate that Text2Mat can generate materials whose spatial layout and texture style closely correspond to the text descriptions.
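The conditioning text described above (a material label plus texture-attribute and spatial-layout labels) can be sketched as simple caption assembly. The field names and join format are assumptions; the abstract does not give the dataset's actual schema.

```python
# Sketch of assembling a text condition from a material label, texture
# attribute labels, and an optional spatial-layout attribute, as the
# abstract describes. Field names and formatting are assumptions.

def build_caption(material, texture_attrs=(), layout=None):
    # Concatenate the material label with its attribute labels into one
    # comma-separated caption for the text encoder.
    parts = [material] + list(texture_attrs)
    if layout is not None:
        parts.append(f"arranged in a {layout} pattern")
    return ", ".join(parts)
```

The resulting caption would be fed to the text encoder of the Stable-Diffusion-based generator as the guidance signal for producing the PBR maps.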

Eurographics Association © 2013-2025  |  System hosted at Graz University of Technology
DSpace software copyright © 2002-2025 LYRASIS
