The Delta Radiance Field
Date
2015-09-11
Abstract
The wide availability of mobile devices capable of computing high fidelity
graphics in real-time has sparked a renewed interest in the development and
research of Augmented Reality applications. Within the large spectrum of
mixed real and virtual elements, one specific area is dedicated to producing realistic
augmentations with the aim of presenting virtual copies of existing real
objects or soon-to-be-produced products. Surprisingly though, the current
state of this area leaves much to be desired: augmented objects in current
systems are often presented without any reconstructed lighting whatsoever
and therefore give the impression of being glued onto a camera image
rather than augmenting reality. In light of the advances in the movie industry,
which has handled mixed-reality cases across the entire spectrum, it is
legitimate to ask why such advances have not fully carried over to
Augmented Reality simulations as well.
Generally understood to be real-time applications which reconstruct the spatial
relation between real-world elements and virtual objects, Augmented Reality
has to deal with several uncertainties. Among them, unknown illumination
and unknown real scene conditions are the most important. Any real-world
property reconstructed in an ad-hoc manner must likewise be incorporated,
just as quickly, into an algorithm responsible for shading virtual objects and
transferring virtual light to real surfaces. The immersiveness of an
Augmented Reality simulation is, next to its realism and accuracy, primarily
dependent on its responsiveness: any computation affecting the final image
must run in real time. This condition rules out many of the methods
used for movie production.
The remaining real-time options face three problems: the shading of virtual
surfaces under real natural illumination, the relighting of real surfaces
according to the change in illumination caused by introducing a new
object into the scene, and the believable global interaction of real and virtual
light. This dissertation presents contributions to address these problems.
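As an illustration of the first problem, shading a virtual surface under real natural illumination is commonly expressed with the rendering equation over a captured environment map; this is a standard image-based lighting formulation and the notation is illustrative rather than taken from the dissertation:

L_o(x, \omega_o) = \int_{\Omega} f_r(x, \omega_i, \omega_o) \, L_{env}(\omega_i) \, V(x, \omega_i) \, (n \cdot \omega_i) \, d\omega_i

where L_env is the incident radiance captured from the real environment, f_r is the BRDF of the virtual surface, and V accounts for visibility. The difficulty in Augmented Reality is that both L_env and the real geometry behind V must be estimated on the fly, per frame.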
Current state-of-the-art methods build on Differential Rendering techniques
to fuse global illumination algorithms into AR environments. This simple approach
has a computationally costly downside, which limits the options for
believable light transfer even further. This dissertation explores new shading
and relighting algorithms built on a mathematical foundation replacing
Differential Rendering. The result not only offers a more efficient competitor
to the current state of the art in global illumination relighting, but also
advances the field with the ability to simulate effects that contemporary
publications have not yet demonstrated.
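To make the cost argument concrete, a common way to write the Differential Rendering composite, i.e. the baseline approach this dissertation replaces, is to render the reconstructed local scene twice per frame, once with and once without the virtual object, and add the difference to the camera image; the notation below is illustrative and not taken from the thesis:

C(x) = L_{cam}(x) + \left( L_{full}(x) - L_{local}(x) \right)

for pixels showing the real scene, while pixels covered by the virtual object take L_{full}(x) directly. Here L_{cam} is the camera image, L_{full} is a global illumination solution of the reconstructed real scene including the virtual object, and L_{local} is the same solution without it. The need for two full global illumination solutions per frame is precisely the computational downside noted above.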