Browsing by Author "Galerne, B."
Now showing 1 - 2 of 2
Item On Demand Solid Texture Synthesis Using Deep 3D Networks (© 2020 Eurographics ‐ The European Association for Computer Graphics and John Wiley & Sons Ltd, 2020) Gutierrez, J.; Rabin, J.; Galerne, B.; Hurtut, T.; Benes, Bedrich and Hauser, Helwig

This paper describes a novel approach for on demand volumetric texture synthesis based on a deep learning framework that allows for the generation of high‐quality three‐dimensional (3D) data at interactive rates. Given a few example images of textures, a generative network is trained to synthesize coherent portions of solid textures of arbitrary sizes that reproduce the visual characteristics of the examples along some directions. To cope with the memory limitations and computational complexity inherent to both high resolution and 3D processing on the GPU, only 2D textures, referred to as 'slices', are generated during the training stage. These synthetic textures are compared to the exemplar images using a perceptual loss function based on a pre‐trained deep network. The proposed network is very light (fewer than 100k parameters); it therefore requires only a short training time (a few hours) and is capable of very fast generation (around a second for 256³ voxels) on a single GPU. Integrated with a spatially seeded pseudo‐random number generator (PRNG), the generator network directly returns a color value for a given set of 3D coordinates. The synthesized volumes are of a visual quality at least equivalent to that of state‐of‐the‐art patch‐based approaches; they are seamlessly tileable by construction and can be fully generated in parallel.
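The on-demand property described in this abstract rests on deriving the generator's noise input deterministically from spatial position, so that any voxel or sub-volume can be evaluated independently of its neighbours. The sketch below illustrates that idea with a simple coordinate hash; the hash constants, channel count, and function names are illustrative assumptions for this sketch, not the paper's actual scheme.

```python
import numpy as np

def seeded_noise(x, y, z, channels=8, base_seed=0):
    """Deterministic noise vector for the integer lattice point (x, y, z).

    Hashing the coordinates into a PRNG seed (constants here are
    illustrative) means the noise at any location can be regenerated on
    demand, which is what enables parallel, seamlessly tileable
    evaluation of arbitrary sub-volumes.
    """
    seed = (x * 73856093 ^ y * 19349663 ^ z * 83492791 ^ base_seed) % (2 ** 63)
    rng = np.random.default_rng(seed)
    return rng.standard_normal(channels)

# Any block of the noise field can be built without touching its neighbours;
# a trained generator network would then map such a block to color values.
block = np.array([[[seeded_noise(x, y, z)
                    for z in range(4)]
                   for y in range(4)]
                  for x in range(4)])
print(block.shape)  # (4, 4, 4, 8)
```

Because the same coordinates always produce the same noise, adjacent blocks generated separately agree on their shared boundary, which is what makes the output naturally tileable and parallelizable.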
Item A Stochastic Film Grain Model for Resolution‐Independent Rendering (© 2017 The Eurographics Association and John Wiley & Sons Ltd., 2017) Newson, A.; Delon, J.; Galerne, B.; Chen, Min and Zhang, Hao (Richard)

The realistic synthesis and rendering of film grain is a crucial goal for many amateur and professional photographers and film‐makers whose artistic works require the authentic feel of analogue photography. The objective of this work is to propose an algorithm that reproduces the visual aspect of film grain texture on any digital image. Previous approaches to this problem either propose unrealistic models or simply blend scanned images of film grain with the digital image, in which case the result is inevitably limited by the quality and resolution of the initial scan. In this work, we introduce a stochastic model that approximates the physical reality of film grain, and propose a resolution‐free rendering algorithm to simulate realistic film grain for any digital input image. By varying the parameters of this model, we can achieve a wide range of grain types. We demonstrate this by comparing our results with film grain examples from dedicated software, and show that our rendering results closely resemble these real film emulsions. In addition to realistic grain rendering, our resolution‐free algorithm allows for any desired zoom factor, even down to the scale of the microscopic grains themselves.
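To make the "stochastic model" and "resolution‐free rendering" of this abstract concrete, here is a minimal sketch in the spirit of a Boolean model of disk-shaped grains rendered by Monte Carlo sampling. It is an illustrative simplification under stated assumptions, not the authors' implementation: the intensity formula is the standard Boolean-model coverage relation, the coordinate hash and all parameter names are assumptions, and the paper's filtering and color handling are omitted.

```python
import numpy as np

def grain_intensity(u, radius):
    """Poisson intensity giving expected disk coverage equal to gray level u.

    Solves 1 - exp(-lambda * pi * r^2) = u for lambda, the usual
    Boolean-model coverage formula (an assumption made for this sketch).
    """
    u = np.clip(u, 0.0, 0.999)
    return -np.log(1.0 - u) / (np.pi * radius ** 2)

def cell_grains(cx, cy, lam, base_seed=0):
    """Grain centers inside unit cell (cx, cy), regenerated deterministically.

    Re-deriving the PRNG seed from the cell coordinates is what makes the
    rendering resolution-free: every zoom level sees the same grains.
    """
    seed = (cx * 73856093 ^ cy * 19349663 ^ base_seed) % (2 ** 63)
    rng = np.random.default_rng(seed)
    n = rng.poisson(lam)
    return cx + rng.random(n), cy + rng.random(n)

def render(u, radius=0.25, zoom=4, samples=32, base_seed=0):
    """Monte Carlo rendering of the grain layer at `zoom` times u's resolution."""
    h, w = u.shape
    lam = grain_intensity(u, radius)
    out = np.zeros((h * zoom, w * zoom))
    rng = np.random.default_rng(base_seed + 1)
    reach = int(np.ceil(radius))  # how many neighbouring cells a grain can reach
    for oy in range(h * zoom):
        for ox in range(w * zoom):
            # Sample points inside this output pixel's footprint in image space.
            sx = (ox + rng.random(samples)) / zoom
            sy = (oy + rng.random(samples)) / zoom
            covered = np.zeros(samples, dtype=bool)
            for cy in range(int(sy.min()) - reach, int(sy.max()) + reach + 1):
                for cx in range(int(sx.min()) - reach, int(sx.max()) + reach + 1):
                    if not (0 <= cy < h and 0 <= cx < w):
                        continue
                    gx, gy = cell_grains(cx, cy, lam[cy, cx], base_seed)
                    for x0, y0 in zip(gx, gy):
                        covered |= (sx - x0) ** 2 + (sy - y0) ** 2 <= radius ** 2
            # The rendered value is the covered area fraction of the pixel.
            out[oy, ox] = covered.mean()
    return out
```

Calling `render(u, zoom=8)` on the same input zooms into the same grain realization, since the grains are regenerated from the cell-coordinate hash rather than stored; this is the sense in which such a rendering supports any desired zoom factor.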