Neural Networks for Digital Materials and Radiance Encoding
dc.contributor.author | Rodriguez-Pardo, Carlos | |
dc.date.accessioned | 2023-08-01T09:10:29Z | |
dc.date.available | 2023-08-01T09:10:29Z | |
dc.date.issued | 2023-07 | |
dc.description.abstract | Realistic virtual scenes are becoming increasingly prevalent in our society, with a wide range of applications in areas such as manufacturing, architecture, fashion design, and entertainment, including movies, video games, and augmented and virtual reality. Generating realistic images of such scenes requires highly accurate illumination, geometry, and material models, which can be challenging to obtain. Traditionally, such models have been created manually by skilled artists, but this process can be prohibitively slow and costly. Alternatively, real-world examples can be captured, but this approach presents its own challenges in terms of accuracy and scalability. Moreover, while realism and accuracy are crucial, rendering efficiency is also a key requirement, so that lifelike images can be generated at the speed demanded by many real-world applications. One of the most significant challenges in this regard is the acquisition and representation of materials, which are a critical component of our visual world and, by extension, of virtual representations of it. However, existing approaches for material acquisition and representation fall short in efficiency and accuracy, which constrains their real-world impact. Data-driven approaches that leverage machine learning may provide viable solutions to these challenges. Nevertheless, designing and training machine learning models that meet all these competing requirements remains difficult, requiring careful consideration of trade-offs between quality and efficiency. In this thesis, we propose novel learning-based solutions to several key challenges in physically-based rendering and material digitization. Our approach leverages various forms of neural networks to introduce new algorithms for radiance encoding and for digital material generation, editing, and estimation. First, we present a visual attribute transfer framework for digital materials that generalizes effectively to new illumination conditions and geometric distortions, and we showcase a use case of this method for high-resolution material acquisition with a custom capture device. Additionally, we propose a generative model capable of synthesizing tileable textures from a single input image, which improves the quality of material rendering. Building upon recent work in neural fields, we also introduce a material representation that accurately encodes material reflectance while offering powerful editing and propagation capabilities. Beyond reflectance, we present a novel method for global illumination encoding that leverages carefully designed generative models to achieve significantly faster sampling than previous work. Finally, we propose two methods for low-cost material digitization. Using flatbed scanners as our capture device, we present a generative model that provides high-resolution material reflectance estimates from a single image, together with an uncertainty quantification algorithm that increases its reliability and efficiency. We also present a novel method for digitizing the mechanical properties of fabrics from depth images, which we extend with a perceptually validated drape similarity metric.
Overall, the contributions of this thesis represent significant advances in the fields of radiance encoding and digital material acquisition and editing, enhancing the quality, scalability, and efficiency of physically-based rendering pipelines. | en_US |
dc.description.sponsorship | Thesis fully funded by SEDDI. | en_US |
dc.identifier.uri | https://diglib.eg.org:443/handle/10.2312/3543873 | |
dc.language.iso | en_US | en_US |
dc.subject | Deep Learning | en_US |
dc.subject | Digital Materials | en_US |
dc.subject | Reflectance | en_US |
dc.subject | Radiance | en_US |
dc.subject | Machine Learning | en_US |
dc.subject | Neural Networks | en_US |
dc.subject | Material Acquisition | en_US |
dc.subject | BRDF | en_US |
dc.subject | Global Illumination | en_US |
dc.subject | Generative Models | en_US |
dc.subject | Fabrics | en_US |
dc.subject | Attribute Transfer | en_US |
dc.subject | Texture Synthesis | en_US |
dc.subject | Intrinsic Decomposition | en_US |
dc.title | Neural Networks for Digital Materials and Radiance Encoding | en_US |
dc.type | Thesis | en_US |
Files
Original bundle (1 file)
- Name: _PhD_Thesis__2023___.pdf
- Size: 224.95 MB
- Format: Adobe Portable Document Format
License bundle (1 file)
- Name: license.txt
- Size: 1.79 KB
- Description: Item-specific license agreed upon to submission