MAM2017: Eurographics Workshop on Material Appearance Modeling
Item: Challenges in Appearance Capture and Predictive Modeling of Textile Materials (The Eurographics Association, 2017)
Authors: Castillo, Carlos; Aliaga, Carlos; López-Moreno, Jorge
Editors: Reinhard Klein and Holly Rushmeier
The appearance of cloth is the result of complex light interactions within the structures present in textile materials, which are particularly challenging due to their multi-scale nature. In addition to the inherent complexity of cloth rendering, there is a lack of connection between computer graphics techniques and the manufacturing processes followed in industry. We discuss existing techniques and pose questions about the right paths to follow for a better synergy between CG and textile research, including (but not restricted to): defining a standard set of properties required to predict the appearance of cloth to be manufactured; developing both acquisition techniques that are reliable and suitable for industrial processes, and frameworks focused on inexpensive capture (e.g., based on single pictures or Pantone labels); finding material representations that are robust in the absence of several low-level parameters; creating a standard for color depth depending on the dye type and dyeing technique; and developing a standard to account for the effect of post-processing steps (washing, chemical treatments, etc.) on the mechanical and optical properties of the textiles.

Item: Image-based Remapping of Material Appearance (The Eurographics Association, 2017)
Authors: Sztrajman, Alejandro; Krivánek, Jaroslav; Wilkie, Alexander; Weyrich, Tim
Editors: Reinhard Klein and Holly Rushmeier
Digital 3D content creation requires the ability to exchange assets across multiple software applications. For many 3D asset types, standard formats and interchange conventions are available. For material definitions, however, inter-application exchange is still hampered by different software packages supporting different BRDF models. To make matters worse, even if nominally identical BRDF models are supported, these often differ in their implementation due to optimisations and safeguards in individual renderers. To facilitate appearance-preserving translation between different BRDF models whose precise implementation is not known (arguably the standard case with commercial systems), we propose a robust translation scheme which leaves BRDF evaluation to the targeted rendering system and expresses BRDF similarity in image space. As we will show, even naïve applications of a nonlinear fit using such an image-space residual metric work well in some cases; however, they suffer from instabilities for certain material parameters. We propose strategies to mitigate these instabilities and perform reliable parameter remappings between differing BRDF definitions (see the fitting sketch below). We report on experiences with this remapping scheme, both with respect to robustness and to visual differences of the fits.

Item: Diffraction Prediction in HDR Measurements (The Eurographics Association, 2017)
Authors: Lucat, Antoine; Hegedus, R.; Pacanowski, Romain
Editors: Reinhard Klein and Holly Rushmeier
Modern imaging techniques have proved to be very efficient at recovering scenes with high dynamic range. However, this high dynamic range can introduce star-burst patterns around highlights, arising from diffraction by the camera aperture. The spatial extent of this effect can be very wide and it alters pixel values which, in a measurement context, are no longer reliable. To address this problem, we introduce a novel algorithm that predicts, from a closed-form PSF, where diffraction will affect the pixels of an HDR image, making it possible to discard them from the measurement. Our approach gives better results than common deconvolution techniques, and the uncertainty values (convolution kernel and noise) of the algorithm output are recovered.
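For the diffraction-prediction item, a minimal sketch of the pixel-discarding step under simplifying assumptions: highlights are spread by a precomputed closed-form aperture PSF, and pixels where that predicted contribution exceeds the noise floor are flagged as unreliable. The PSF array and both thresholds are placeholders, not the paper's actual criterion.

```python
# Sketch: flag HDR pixels likely contaminated by aperture diffraction.
# `psf` is assumed to be a precomputed, normalized closed-form diffraction
# PSF (2-D array); `noise_level` stands in for the sensor noise floor.
import numpy as np
from scipy.signal import fftconvolve

def diffraction_mask(hdr, psf, highlight_level, noise_level):
    # Keep only the highlights, which are the dominant diffraction sources.
    sources = np.where(hdr > highlight_level, hdr, 0.0)
    # Predicted stray light spread by the aperture PSF across the image.
    predicted = fftconvolve(sources, psf, mode="same")
    # A pixel is unreliable where the predicted diffraction contribution
    # rises above the noise floor; such pixels are discarded from the
    # measurement rather than deconvolved.
    return predicted > noise_level
```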
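Relating to the remapping item above ('Image-based Remapping of Material Appearance'), a minimal sketch of what an image-space fit might look like, assuming a black-box renderer for the target BRDF model. The `render_target` callable, the parameter vector, and the robust loss are illustrative assumptions, not the paper's actual implementation.

```python
# Sketch: remap BRDF parameters by minimizing an image-space residual.
# `render_target` is a caller-supplied callable that renders a fixed scene
# with the target BRDF model for a given parameter vector (H x W x 3 floats);
# it stands in for whatever black-box renderer the target system provides.
import numpy as np
from scipy.optimize import least_squares

def remap(render_target, reference_image, initial_params):
    ref = np.asarray(reference_image, dtype=float).ravel()

    def residual(params):
        # Image-space metric: per-pixel difference between the target
        # model's rendering and the reference rendering of the source.
        return render_target(params).ravel() - ref

    # A robust loss is one generic way to damp the instabilities of a naive
    # fit; the paper proposes its own mitigation strategies.
    fit = least_squares(residual, initial_params, loss="soft_l1")
    return fit.x
```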
Item: Intuitive Editing of Visual Appearance from Real-World Datasets (The Eurographics Association, 2017)
Authors: Marco, Julio; Serrano, Ana; Jarabo, Adrian; Masia, Belen; Gutierrez, Diego
Editors: Reinhard Klein and Holly Rushmeier
Computer-generated imagery is ubiquitous, spanning fields such as games and movies, architecture, engineering, and virtual prototyping, while also helping create novel ones such as computational materials. With the increase in computational power and the improvement of acquisition techniques, there has been a paradigm shift in the field towards data-driven techniques, which has yielded an unprecedented level of realism in visual appearance. Unfortunately, this leads to a series of problems. First, there is a disconnect between the mathematical representation of the data and any meaningful parameters that humans understand; the captured data is machine-friendly, but not human-friendly. Second, the many different acquisition systems lead to heterogeneous formats and very large datasets. And third, real-world appearance functions are usually nonlinear and high-dimensional. As a result, visual appearance datasets are increasingly unfit for editing operations, which limits the creative process for scientists, engineers, artists, and practitioners in general. There is an immense gap between the complexity, realism, and richness of the captured data and the flexibility to edit such data. The current research path leads to a fragmented space of isolated solutions, each tailored to a particular dataset and problem. To define intuitive and predictable editing spaces, algorithms, and workflows, we must investigate at the theoretical, algorithmic, and application levels, putting the user at the core and learning the key relevant appearance features in terms humans understand.

Item: High-Quality Multi-Spectral Reflectance Acquisition with X-Rite TAC7 (The Eurographics Association, 2017)
Authors: Merzbach, Sebastian; Weinmann, Michael; Klein, Reinhard
Editors: Reinhard Klein and Holly Rushmeier
When relighting digitized objects, strong color deviations can arise depending on the illumination conditions if the object's reflectance is only captured in RGB. To guarantee color-correct simulations, it is therefore of great importance to perform appearance capture with a finer spectral sampling than the three broad-band RGB channels. Capturing both shape and multi-spectral reflectance at high quality is a challenging task and, to the best of our knowledge, has not yet been performed at the quality and speed of our approach. We acquire surface geometry and multi-spectral, spatially varying reflectance of objects up to a few centimeters in height with the TAC7 device, which has recently become commercially available. We demonstrate the improvements in color accuracy and the overall quality of the appearance capture by relighting our accurately digitized objects under varying illumination conditions.
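As a rough illustration of why the finer spectral sampling matters for the relighting described in the TAC7 item, a sketch of spectral relighting under simple assumptions: per-wavelength reflectance is multiplied by the new illuminant's spectral power distribution and integrated against CIE color-matching functions. The array shapes and the shared wavelength sampling are assumptions; none of this is the TAC7 pipeline itself.

```python
# Sketch: color-correct relighting from multi-spectral reflectance.
# reflectance: (H, W, S) spectral reflectance sampled at S wavelengths
# illuminant:  (S,)     spectral power distribution of the new light source
# cmf:         (S, 3)   CIE color-matching functions at the same samples
# All inputs are assumed to share the same wavelength sampling.
import numpy as np

def relight_spectral(reflectance, illuminant, cmf, delta_lambda):
    radiance = reflectance * illuminant            # per-wavelength radiance
    xyz = (radiance @ cmf) * delta_lambda          # integrate to CIE XYZ
    # Normalize so that a perfect white reflector has luminance Y = 1.
    white = (illuminant @ cmf) * delta_lambda
    return xyz / white[1]

# An RGB-only capture instead bakes one illuminant into three numbers per
# pixel, so relighting under a spiky illuminant (e.g. fluorescent light)
# produces the color deviations the abstract describes.
```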
Item: Towards Sparse and Multiplexed Acquisition of Material BTFs (The Eurographics Association, 2017)
Authors: Brok, Dennis den; Weinmann, Michael; Klein, Reinhard
Editors: Reinhard Klein and Holly Rushmeier
We present preliminary results of our effort to combine sparse and illumination-multiplexed acquisition of bidirectional texture functions (BTFs) for material appearance. Each of the two existing acquisition paradigms addresses a single specific problem: reducing either the number of images to be obtained while maintaining artifact-free renderings, or the shutter times required to capture the full dynamic range of a material's appearance. These problems have so far been solved by means of data-driven models. We demonstrate that the way these models are derived prevents combined sparse and multiplexed acquisition, and we introduce a novel model that circumvents this obstruction (a baseline demultiplexing sketch appears below). As a result, we achieve acquisition times on the order of minutes, in comparison to the few hours required with sparse acquisition or multiplexed illumination alone.

Item: The Effects of Digital Cameras Optics and Electronics for Material Acquisition (The Eurographics Association, 2017)
Authors: Holzschuch, Nicolas; Pacanowski, Romain
Editors: Reinhard Klein and Holly Rushmeier
For material acquisition, we use digital cameras and process the pictures. We usually treat the cameras as perfect pinhole cameras, with each pixel providing a point sample of the incoming signal. In this paper, we study the impact of the camera's optical and electronic systems. Optical system effects are modelled by the Modulation Transfer Function (MTF); electronic system effects are modelled by the Pixel Response Function (PRF). The former is convolved with the incoming signal, the latter is multiplied with it. We provide a model for both effects and study their impact on the measured signal. For high-frequency incoming signals, the convolution results in a significant decrease in measured intensity, especially at grazing angles. We show that this model explains the strange behaviour observed in the MERL BRDF database at grazing angles.
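For the camera optics item ('The Effects of Digital Cameras Optics and Electronics for Material Acquisition'), a small numerical sketch of the attenuation effect it describes: blurring a very narrow peak with an optical kernel (whose Fourier transform plays the role of the MTF) noticeably lowers the measured maximum, while a broad, low-frequency signal is barely affected. The Gaussian kernel is a stand-in, not the paper's MTF model.

```python
# Sketch: high-frequency signals lose peak intensity after the optical blur.
import numpy as np
from scipy.ndimage import gaussian_filter1d

x = np.linspace(0.0, 1.0, 2000)
sharp_peak = np.exp(-((x - 0.5) / 0.0005) ** 2)   # high-frequency highlight
broad_peak = np.exp(-((x - 0.5) / 0.05) ** 2)     # low-frequency signal

sigma_px = 3.0  # stand-in optical blur; its spectrum plays the role of the MTF
print(gaussian_filter1d(sharp_peak, sigma_px).max())  # well below 1.0
print(gaussian_filter1d(broad_peak, sigma_px).max())  # close to 1.0
```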
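Relating to the multiplexed-BTF item above ('Towards Sparse and Multiplexed Acquisition of Material BTFs'), a minimal sketch of plain illumination demultiplexing: each exposure lights several LEDs according to a known pattern matrix, and single-light images are recovered by solving the resulting linear system. This generic least-squares recovery is a textbook baseline, not the data-driven model the paper introduces.

```python
# Sketch: recover single-light images from multiplexed exposures.
# measurements: (M, P) stack of M multiplexed images with P pixels each
# mux_matrix:   (M, L) 0/1 pattern of which of L lights were on per shot
import numpy as np

def demultiplex(measurements, mux_matrix):
    # Each measured image is a linear combination of single-light images:
    #   measurements = mux_matrix @ single_light_images
    # Solve in the least-squares sense (exact if mux_matrix is invertible).
    single_light, *_ = np.linalg.lstsq(mux_matrix, measurements, rcond=None)
    return single_light  # (L, P) demultiplexed single-light images
```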
Item: Appearance of Interfaced Lambertian Microfacets, using STD Distribution (The Eurographics Association, 2017)
Authors: Ribardière, M.; Meneveaux, D.; Bringier, B.; Simonot, L.
Editors: Reinhard Klein and Holly Rushmeier
This paper presents the use of Student's t-distribution (STD) with interfaced Lambertian (IL) microfacets. The resulting model broadens the range of representable materials while providing a very accurate adjustment of appearance. STD has recently been proposed as a generalized microfacet distribution that includes Beckmann and GGX, both widely used in computer graphics; IL corresponds to a physical representation of a Lambertian substrate covered with a flat Fresnel interface. We illustrate the appearance variations that can be observed and discuss the advantages of using such a combination.

Item: Experimental Analysis of BSDF Models (The Eurographics Association, 2017)
Authors: Kurt, Murat
Editors: Reinhard Klein and Holly Rushmeier
The Bidirectional Scattering Distribution Function (BSDF) describes the appearance of an optically thin, translucent material by its interaction with light at a surface point. Various BSDF models have been proposed to represent BSDFs. In this paper, we experimentally analyze several BSDF models in terms of their accuracy in representing measured BSDFs, their storage requirements, and their computation times. To make a fair comparison of BSDF models, we measured three samples of optically thin, translucent materials (Hunter Douglas, orange glass, structured glass) using the pgII gonio-photometer. Based on rendered images, required storage sizes, and computation times, we compare the performance of the BSDF models. We show that data-driven BSDF models give a more accurate representation of measured BSDFs, but require much more storage and longer computation times. We also show that BSDF measurements from highly anisotropic translucent materials cannot be represented in a visually correct manner by an analytical BSDF model.

Item: MAM 2017: Frontmatter (Eurographics Association, 2017)
Editors: Reinhard Klein; Holly Rushmeier