Physics-Informed Neural Fields with Neural Implicit Surface for Fluid Reconstruction

Date: 2024
Publisher: The Eurographics Association
Abstract
Recovering fluid density and velocity from multi-view RGB videos poses a formidable challenge. Existing solutions typically assume knowledge of obstacles and lighting, or are designed for simple fluid scenes without obstacles or complex lighting. Addressing these challenges, our study presents a novel hybrid model named PINFS, which fuses the capabilities of Physics-Informed Neural Fields (PINF) and Neural Implicit Surfaces (NeuS) to accurately reconstruct scenes containing smoke. By combining the realistic smoke representations of SIREN-NeRFt in PINF with the accuracy of NeuS in depicting solid obstacles, PINFS provides detailed reconstructions of smoke scenes with improved visual authenticity and physical precision. PINFS distinguishes itself by incorporating the solid's view-independent opaque density and by addressing Neumann boundary conditions through signed distances from NeuS. This results in a more realistic and physically plausible depiction of smoke behavior in dynamic scenarios. Comprehensive evaluations on synthetic and real-world datasets confirm the model's superior performance in complex scenes with obstacles. PINFS introduces a novel framework for realistic and physically consistent rendering of complex fluid dynamics scenarios, pushing the boundaries of mixed physical and neural-based approaches. The code is available at https://github.com/zduan3/pinfs_code.
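The paper's implementation lives in the linked repository; as a rough illustration of one idea from the abstract, the sketch below shows how signed distances from a solid's SDF could drive a Neumann (free-slip) boundary penalty on a fluid velocity field, forcing the normal velocity component to vanish near the surface. All names here (`sphere_sdf`, `neumann_bc_penalty`, the band width) are illustrative assumptions, not the authors' API; a simple analytic sphere SDF stands in for the NeuS network.

```python
import numpy as np

def sphere_sdf(p, radius=1.0):
    # Signed distance to a sphere centered at the origin (negative inside).
    # A placeholder for the learned NeuS signed-distance field.
    return np.linalg.norm(p, axis=-1) - radius

def sdf_normal(sdf, p, eps=1e-4):
    # Central-difference gradient of the SDF, normalized to a unit normal.
    grad = np.stack(
        [(sdf(p + eps * np.eye(3)[i]) - sdf(p - eps * np.eye(3)[i])) / (2 * eps)
         for i in range(3)],
        axis=-1,
    )
    return grad / np.linalg.norm(grad, axis=-1, keepdims=True)

def neumann_bc_penalty(velocity, points, sdf, band=0.05):
    # Penalize the normal component of the velocity for samples lying in a
    # thin band around the zero level set, approximating v . n = 0 at the
    # solid boundary (free-slip Neumann condition).
    d = sdf(points)
    n = sdf_normal(sdf, points)
    v_dot_n = np.sum(velocity * n, axis=-1)
    mask = np.abs(d) < band
    if not mask.any():
        return 0.0
    return float(np.mean(v_dot_n[mask] ** 2))

# Two points on the unit sphere: a tangential flow incurs no penalty,
# a flow straight into the surface does.
pts = np.array([[1.0, 0.0, 0.0], [0.0, 1.0, 0.0]])
tangential = np.array([[0.0, 1.0, 0.0], [1.0, 0.0, 0.0]])
radial = np.array([[1.0, 0.0, 0.0], [0.0, 1.0, 0.0]])
print(neumann_bc_penalty(tangential, pts, sphere_sdf))  # near 0
print(neumann_bc_penalty(radial, pts, sphere_sdf))      # near 1
```

In a training loop such a term would be added to the physics losses, weighted and evaluated only at samples near the reconstructed surface, so that smoke velocities do not penetrate obstacles.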
CCS Concepts: Computing methodologies → Scene understanding; Neural networks; Physical simulation

        
@inproceedings{10.2312:pg.20241298,
  booktitle = {Pacific Graphics Conference Papers and Posters},
  editor    = {Chen, Renjie and Ritschel, Tobias and Whiting, Emily},
  title     = {{Physics-Informed Neural Fields with Neural Implicit Surface for Fluid Reconstruction}},
  author    = {Duan, Zheng and Ren, Zhong},
  year      = {2024},
  publisher = {The Eurographics Association},
  ISBN      = {978-3-03868-250-9},
  DOI       = {10.2312/pg.20241298}
}