43-Issue 4
Browsing 43-Issue 4 by Subject "Computing methodologies → Reflectance modeling"
MatUp: Repurposing Image Upsamplers for SVBRDFs
(The Eurographics Association and John Wiley & Sons Ltd., 2024) Gauthier, Alban; Kerbl, Bernhard; Levallois, Jérémy; Faury, Robin; Thiery, Jean-Marc; Boubekeur, Tamy; Garces, Elena; Haines, Eric
We propose MATUP, an upsampling filter for material super-resolution. Our method takes as input a low-resolution SVBRDF and upscales its maps so that their rendering under various lighting conditions fits upsampled renderings inferred in the radiance domain by pre-trained RGB upsamplers. We formulate our local filter as a compact multilayer perceptron (MLP), which acts on a small window of the input SVBRDF and is optimized using a data-fitting loss defined over upsampled radiance at various locations. This optimization is performed entirely at the scale of a single, independent material. In doing so, MATUP leverages the reconstruction capabilities that pre-trained RGB models acquire over large collections of natural images, and it provides regularization over self-similar structures. In particular, our lightweight neural filter avoids retraining complex architectures from scratch or accessing any large collection of low-/high-resolution material pairs, which do not actually exist at the scale at which RGB upsamplers are trained. As a result, MATUP produces fine and coherent details in the upscaled material maps, as shown in the extensive evaluation we provide.

Neural Appearance Model for Cloth Rendering
(The Eurographics Association and John Wiley & Sons Ltd., 2024) Soh, Guan Yu; Montazeri, Zahra; Garces, Elena; Haines, Eric
The realistic rendering of woven and knitted fabrics has posed significant challenges for many years. Fiber-based micro-appearance models have previously achieved considerable success in attaining high levels of realism. However, rendering such models remains complex due to the intricate internal scattering among the hundreds of fibers within a yarn, which requires vast amounts of memory and time to render.
In this paper, we introduce a new framework that captures aggregated appearance by tracing many light paths through the underlying fiber geometry. We then employ lightweight neural networks to accurately model the aggregated BSDF, which allows for the precise modeling of a diverse array of materials while offering substantial improvements in speed and reductions in memory. Furthermore, we introduce a novel importance sampling scheme to further accelerate convergence. We validate the efficacy and versatility of our framework through comparisons with preceding fiber-based shading models as well as the most recent yarn-based model.

Neural SSS: Lightweight Object Appearance Representation
(The Eurographics Association and John Wiley & Sons Ltd., 2024) Tg, Thomson; Tran, Duc Minh; Jensen, Henrik W.; Ramamoorthi, Ravi; Frisvad, Jeppe Revall; Garces, Elena; Haines, Eric
We present a method for capturing the BSSRDF (bidirectional scattering-surface reflectance distribution function) of arbitrary geometry with a neural network. We demonstrate how a compact neural network can represent the full 8-dimensional light transport within an object, including heterogeneous scattering. We develop an efficient rendering method using importance sampling that is able to render complex translucent objects under arbitrary lighting. Our method can also leverage the common planar half-space assumption, which allows a single BSSRDF model to be used across a variety of geometries.
Our results demonstrate that we can render heterogeneous translucent objects under arbitrary lighting and obtain results that match references rendered using volumetric path tracing.

Practical Appearance Model for Foundation Cosmetics
(The Eurographics Association and John Wiley & Sons Ltd., 2024) Lanza, Dario; Padrón-Griffe, Juan Raúl; Pranovich, Alina; Muñoz, Adolfo; Frisvad, Jeppe Revall; Jarabo, Adrian; Garces, Elena; Haines, Eric
Cosmetic products have found their place in various aspects of human life, yet their digital appearance reproduction has received little attention. We present an appearance model for cosmetics, in particular for foundation layers, that reproduces a range of existing foundation appearances: from a glossy to a matte to an almost velvety look. Our model is a multilayered BSDF that reproduces the stacking of multiple layers of cosmetics. Inspired by the microscopic particulates used in cosmetics, we model each individual layer as a stochastic participating medium with two types of scatterers that mimic the most prominent visual features of cosmetics: spherical diffusers, which produce a uniform distribution of radiance, and platelets, which are responsible for the glossy look of certain cosmetics. We implement our model on top of the position-free Monte Carlo framework, which allows us to include multiple scattering. We validate our model against measured reflectance data and demonstrate its versatility and expressiveness by thoroughly exploring the range of appearances it can produce.

VMF Diffuse: A Unified Rough Diffuse BRDF
(The Eurographics Association and John Wiley & Sons Ltd., 2024) d'Eon, Eugene; Weidlich, Andrea; Garces, Elena; Haines, Eric
We present a practical analytic BRDF that approximates scattering from a generalized microfacet volume with a von Mises-Fisher NDF.
Our BRDF blends seamlessly from smooth Lambertian, through moderately rough height fields with Beckmann-like statistics, to highly rough/porous behaviours that have been lacking from prior models. At maximum roughness, our model reduces to the recent Lambert-sphere BRDF. We validate our model by comparing against simulations of scattering from geometries of randomly placed Lambertian spheres, and show an improvement over a rough Beckmann BRDF at very high roughness.
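The per-material optimization that MatUp describes (a compact MLP acting on a small SVBRDF window, fit with a data-fitting loss) can be illustrated with a toy sketch. Everything below is an illustrative assumption rather than the authors' code: the window size, channel count, network width, and the synthetic linear target standing in for the upsampled-radiance loss are all made up for the example.

```python
import numpy as np

# Toy sketch (not the MatUp code): a compact one-hidden-layer MLP acting
# as a local filter on a flattened window of SVBRDF values, optimized by
# plain gradient descent against a synthetic target. In the real method
# the target comes from renderings upsampled by a pre-trained RGB model.

rng = np.random.default_rng(0)

WIN = 3 * 3 * 4   # assumed 3x3 window with 4 channels per texel
HID = 32          # assumed hidden width of the compact MLP
OUT = 4           # assumed output channels for the upscaled texel

W1 = rng.normal(0, 0.2, (WIN, HID)); b1 = np.zeros(HID)
W2 = rng.normal(0, 0.1, (HID, OUT)); b2 = np.zeros(OUT)

def forward(x):
    h = np.maximum(x @ W1 + b1, 0.0)   # ReLU hidden layer
    return h @ W2 + b2, h

# Synthetic "data-fitting" problem: regress random windows onto a fixed
# linear target (a stand-in for the paper's radiance-domain loss).
X = rng.normal(size=(256, WIN))
T = rng.normal(size=(WIN, OUT))
Y = X @ T

lr = 0.02
losses = []
for step in range(2000):
    P, H = forward(X)
    E = P - Y                              # residual
    losses.append(float((E ** 2).mean()))
    # Manual backprop through the two layers.
    gW2 = H.T @ E / len(X); gb2 = E.mean(0)
    dH = (E @ W2.T) * (H > 0)              # ReLU gradient mask
    gW1 = X.T @ dH / len(X); gb1 = dH.mean(0)
    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1

print(losses[0], losses[-1])
```

The point of the sketch is only the shape of the optimization: a single small network fit per material, rather than a large architecture trained across a dataset of low/high-resolution pairs.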
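Both the cloth and Neural SSS abstracts lean on importance sampling to make their models practical. Their actual schemes target the learned BSDF/BSSRDF and are not given here; as background, this is the textbook cosine-weighted hemisphere sampler, the simplest instance of drawing directions proportionally to a factor of the rendering integrand.

```python
import numpy as np

# Textbook cosine-weighted hemisphere sampling -- a standard building
# block for BSDF importance sampling. This is NOT either paper's scheme;
# it only illustrates sampling directions with pdf proportional to
# cos(theta), which cancels the cosine factor in the rendering equation.

def sample_cosine_hemisphere(u1, u2):
    """Map uniform (u1, u2) in [0,1)^2 to directions with pdf cos(theta)/pi."""
    r = np.sqrt(u1)                            # radius on the unit disk
    phi = 2.0 * np.pi * u2
    x, y = r * np.cos(phi), r * np.sin(phi)
    z = np.sqrt(np.maximum(0.0, 1.0 - u1))     # lift to the hemisphere
    return np.stack([x, y, z], axis=-1)

rng = np.random.default_rng(42)
d = sample_cosine_hemisphere(rng.random(200_000), rng.random(200_000))

# Sanity checks: unit length, upper hemisphere, and E[cos(theta)] = 2/3
# for a cosine-weighted density.
print(d[:, 2].min(), d[:, 2].mean())
```

Sampling proportionally to part of the integrand is exactly what reduces variance; the papers apply the same idea to their learned distributions instead of the cosine term.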
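VMF Diffuse is built around a von Mises-Fisher NDF. For reference, this is the standard vMF probability density on the unit sphere (the textbook distribution only, not the paper's BRDF):

```python
import numpy as np

# Standard von Mises-Fisher density on S^2: the distribution family that
# VMF Diffuse uses as its NDF. mu is the mean direction, kappa >= 0 the
# concentration; kappa -> 0 gives the uniform sphere, large kappa a
# sharply peaked lobe around mu.

def vmf_pdf(w, mu, kappa):
    """Density of vMF(mu, kappa) at unit direction w."""
    if kappa < 1e-6:
        return 1.0 / (4.0 * np.pi)                   # uniform-sphere limit
    c = kappa / (4.0 * np.pi * np.sinh(kappa))       # normalization constant
    return c * np.exp(kappa * np.dot(mu, w))

mu = np.array([0.0, 0.0, 1.0])
print(vmf_pdf(mu, mu, 1e-9))   # near the 1/(4*pi) uniform limit
print(vmf_pdf(mu, mu, 8.0))    # strongly peaked at the mean direction
```

The unified behaviour the abstract describes (smooth Lambertian through porous, very rough surfaces) corresponds to sweeping this concentration parameter; how the density enters the full BRDF is the subject of the paper itself.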