Browsing by Author "Thuerey, Nils"
Deep Fluids: A Generative Network for Parameterized Fluid Simulations
(The Eurographics Association and John Wiley & Sons Ltd., 2019)
Kim, Byungsoo; Azevedo, Vinicius C.; Thuerey, Nils; Kim, Theodore; Gross, Markus; Solenthaler, Barbara. Edited by Alliez, Pierre and Pellacini, Fabio.

This paper presents a novel generative model to synthesize fluid simulations from a set of reduced parameters. A convolutional neural network is trained on a collection of discrete, parameterizable fluid simulation velocity fields. Due to the capability of deep learning architectures to learn representative features of the data, our generative model is able to accurately approximate the training data set, while providing plausible interpolated in-betweens. The proposed generative model is optimized for fluids by a novel loss function that guarantees divergence-free velocity fields at all times. In addition, we demonstrate that we can handle complex parameterizations in reduced spaces, and advance simulations in time by integrating in the latent space with a second network. Our method models a wide variety of fluid behaviors, thus enabling applications such as fast construction of simulations, interpolation of fluids with different parameters, time re-sampling, latent space simulations, and compression of fluid simulation data. Reconstructed velocity fields are generated up to 700x faster than re-simulating the data with the underlying CPU solver, while achieving compression rates of up to 1300x.

Exploring Physical Latent Spaces for High-Resolution Flow Restoration
(The Eurographics Association, 2023)
Paliard, Chloé; Thuerey, Nils; Um, Kiwon. Edited by Guthe, Michael and Grosch, Thorsten.

We explore training deep neural network models in conjunction with physics simulations via partial differential equations (PDEs), using the simulated degrees of freedom as a latent space for a neural network.
In contrast to previous work, this paper treats the degrees of freedom of the simulated space purely as tools to be used by the neural network. We demonstrate this concept for learning reduced representations, as it is extremely challenging to faithfully preserve correct solutions over long time spans with traditional reduced representations, particularly for solutions with large amounts of small-scale features. This work focuses on the use of such a physical, reduced latent space for the restoration of fine simulations, by training models that can modify the content of the reduced physical states as much as needed to best satisfy the learning objective. This autonomy allows the neural networks to discover alternate dynamics that significantly improve performance on the given tasks. We demonstrate this concept for various fluid flows, ranging from different turbulence scenarios to rising smoke plumes.

Frontmatter: ACM SIGGRAPH / Eurographics Symposium on Computer Animation 2018
(The Eurographics Association and John Wiley & Sons Ltd., 2018)
Thuerey, Nils; Beeler, Thabo. Edited by Thuerey, Nils and Beeler, Thabo.

Latent Space Physics: Towards Learning the Temporal Evolution of Fluid Flow
(The Eurographics Association and John Wiley & Sons Ltd., 2019)
Wiewel, Steffen; Becher, Moritz; Thuerey, Nils. Edited by Alliez, Pierre and Pellacini, Fabio.

We propose a method for the data-driven inference of temporal evolutions of physical functions with deep learning. More specifically, we target fluid flow problems, and we propose a novel LSTM-based approach to predict the changes of the pressure field over time. The central challenge in this context is the high dimensionality of Eulerian space-time data sets. We demonstrate for the first time that dense 3D+time functions of physics systems can be predicted within the latent spaces of neural networks, and we arrive at a neural-network-based simulation algorithm with significant practical speed-ups.
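The latent-space simulation idea behind these works can be illustrated with a toy sketch. Everything here is a hypothetical stand-in: random linear maps play the roles of the trained encoder, decoder, and temporal predictor (an LSTM in the paper), and the field and latent sizes are made up.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-ins for the trained networks: a linear "encoder"
# compressing a flattened pressure field to a small latent vector, a
# linear "decoder" mapping back, and a linear one-step latent predictor
# standing in for the paper's stacked LSTM layers.
FIELD_DIM, LATENT_DIM = 64 * 64, 16
encode = rng.normal(size=(LATENT_DIM, FIELD_DIM)) / np.sqrt(FIELD_DIM)
decode = rng.normal(size=(FIELD_DIM, LATENT_DIM)) / np.sqrt(LATENT_DIM)
step = rng.normal(size=(LATENT_DIM, LATENT_DIM)) / np.sqrt(LATENT_DIM)

def rollout(pressure0, n_steps):
    """Advance the simulation entirely in latent space, decoding only at the end."""
    z = encode @ pressure0          # compress the initial state once
    for _ in range(n_steps):
        z = step @ z                # cheap time integration in latent space
    return decode @ z               # reconstruct the full field

p0 = rng.normal(size=FIELD_DIM)
p_future = rollout(p0, n_steps=10)
print(p_future.shape)  # (4096,)
```

The speed-up reported in the abstract comes from exactly this structure: each time step touches only the small latent vector instead of the full Eulerian grid.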
We highlight the capabilities of our method with a series of complex liquid simulations and with a set of single-phase buoyancy simulations. With a set of trained networks, our method is more than two orders of magnitude faster than a traditional pressure solver. Additionally, we present and discuss a series of detailed evaluations for the different components of our algorithm.

Latent Space Subdivision: Stable and Controllable Time Predictions for Fluid Flow
(The Eurographics Association and John Wiley & Sons Ltd., 2020)
Wiewel, Steffen; Kim, Byungsoo; Azevedo, Vinicius; Solenthaler, Barbara; Thuerey, Nils. Edited by Bender, Jan and Popa, Tiberiu.

We propose an end-to-end trained neural network architecture to robustly predict the complex dynamics of fluid flows with high temporal stability. We focus on single-phase smoke simulations in 2D and 3D based on the incompressible Navier-Stokes (NS) equations, which are relevant for a wide range of practical problems. To achieve stable predictions for long-term flow sequences with linear execution times, a convolutional neural network (CNN) is trained for spatial compression, in combination with a temporal prediction network that consists of stacked Long Short-Term Memory (LSTM) layers. Our core contribution is a novel latent space subdivision (LSS) that separates the respective input quantities into individual parts of the encoded latent space domain. As a result, the encoded quantities can be altered individually without interfering with the remaining latent space values, which maximizes external control. By selectively overwriting parts of the predicted latent space points, our proposed method is capable of robustly predicting long-term sequences of complex physics problems, like the flow of fluids. In addition, we highlight the benefits of recurrent training for the latent space creation, which is performed by the spatial compression network.
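The subdivision-and-overwrite mechanism can be sketched with plain arrays: the latent vector is partitioned into per-quantity sub-ranges, and external control overwrites only one sub-range of a predicted latent point. The split sizes and quantity names below are illustrative assumptions, not taken from the paper.

```python
import numpy as np

LATENT_DIM = 32
# Illustrative subdivision of the latent vector: first half encodes density,
# second half velocity (the paper's training enforces such a separation;
# the exact layout here is hypothetical).
DENSITY = slice(0, 16)
VELOCITY = slice(16, 32)

def apply_control(z_predicted, z_control):
    """Overwrite only the velocity part of a predicted latent point,
    leaving the density part untouched."""
    z = z_predicted.copy()
    z[VELOCITY] = z_control[VELOCITY]
    return z

rng = np.random.default_rng(1)
z_pred = rng.normal(size=LATENT_DIM)   # output of the temporal predictor
z_ctrl = rng.normal(size=LATENT_DIM)   # encoding of a user-specified state
z = apply_control(z_pred, z_ctrl)
assert np.allclose(z[DENSITY], z_pred[DENSITY])    # prediction preserved
assert np.allclose(z[VELOCITY], z_ctrl[VELOCITY])  # control imposed
```

Because the quantities occupy disjoint parts of the latent space, imposing control on one quantity cannot corrupt the encoding of the others, which is what makes long rollouts steerable.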
Furthermore, we thoroughly evaluate and discuss several different components of our method.

Visualizing Optimizers using Chebyshev Proxies and Fatou Sets
(The Eurographics Association, 2022)
Winchenbach, Rene; Thuerey, Nils. Edited by Bender, Jan; Botsch, Mario; Keim, Daniel A.

With recent advances in optimization, many different optimization approaches have been proposed, especially regarding the optimization of weights for neural networks. However, comparing these approaches in a visually succinct and intuitive manner is difficult, especially without relying on simplified toy examples that may not be representative. In this paper, we present a visualization toolkit that uses a modified variant of Fatou sets of functions in the complex domain to directly visualize the convergence behavior of an optimizer across a large range of input values. Furthermore, we propose an approach for generating test functions based on polynomial Chebyshev proxies, with polynomial degrees up to 11217, and a modification of these proxies that yields functions that are strictly positive with known global minima, i.e., roots. Our proposed toolkit is provided as a cross-platform open source framework in C++ using OpenMP for parallelization. Finally, for meromorphic functions the process generates visually interesting fractals, which might also be of interest from an artistic standpoint.
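One plausible reading of the test-function construction (the toolkit itself is C++; this NumPy sketch is an illustration, not the authors' exact recipe) is: take a Chebyshev-basis polynomial p and use p² as the objective, which is non-negative everywhere and attains its known global minima, value 0, exactly at the roots of p. The degree here is tiny compared to the paper's 11217, and the paper's modification may differ in detail.

```python
import numpy as np
from numpy.polynomial import chebyshev as C

rng = np.random.default_rng(2)

# A random degree-6 polynomial in the Chebyshev basis on [-1, 1],
# standing in for a high-degree Chebyshev proxy of some target function.
p = C.Chebyshev(rng.normal(size=7))

# Squaring makes the objective non-negative, with global minima (value 0)
# exactly at the real roots of p -- so the minimizers are known in advance.
q = p ** 2

# Known global minima: the real roots of p inside the domain.
real_roots = [r.real for r in p.roots()
              if abs(r.imag) < 1e-9 and abs(r.real) <= 1]
for r in real_roots:
    assert abs(q(r)) < 1e-8        # objective vanishes at each known minimum

x = np.linspace(-1, 1, 201)
assert np.all(q(x) >= -1e-9)       # non-negative up to floating-point roundoff
```

An optimizer can then be run from every starting point on a grid, and each start colored by which known root it converges to, which is the basin-of-attraction picture the Fatou-set visualization generalizes.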