Real-time 3D Fluid Rendering
Summer 2019 | Mini project
This is a technical experiment exploring the real-time 3D fluid rendering capabilities of TouchDesigner.
The experiment originates from my personal interest in this topic and from technical research on Vincent Houzé's interactive installation works. At the time, there were almost no tutorials online for this particular pipeline, so the research and implementation relied largely on Houzé's approach and on common real-time fluid rendering techniques in computer graphics.
As a newcomer to TouchDesigner, I am happy with the result. Despite being a technical experiment, it can be easily integrated into any future project.
Designed & Implemented by Yufeng Zhao.
Inspired by Vincent Houzé.
Done in Summer 2019.
Fluid Structure (2017) by Vincent Houzé is the case study for this research. His impressive yet clean implementation offers methodological insight into the use of TouchDesigner.
The theoretical study is based on conventional computer graphics approaches, which can be found online as papers and technical handbooks.
Real-time interactive fluid visuals require two parts: simulation and rendering.
Real-time 3D Fluid Simulation
Building the simulation from scratch requires an understanding of basic fluid dynamics and its discrete implementation on the GPU. The common approach is to implement a Lagrangian (particle-based) scheme in a GPGPU structure. The technique is widely used for simulating not only liquids but also smoke, fire, and fabric.
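To make the Lagrangian idea concrete, here is a minimal CPU sketch of the per-frame update such a solver performs for every particle. This is an illustrative semi-implicit Euler integrator with gravity and a ground plane only; a real GPU solver such as FleX additionally resolves particle–particle constraints in parallel.

```python
import numpy as np

def step_particles(pos, vel, dt=1.0 / 60.0, gravity=(0.0, -9.8, 0.0)):
    """Advance a particle system one frame with semi-implicit Euler.

    pos, vel: (N, 3) arrays of particle positions and velocities.
    A production solver also handles density/pressure constraints;
    this sketch only integrates gravity and a ground-plane collision.
    """
    g = np.asarray(gravity)
    vel = vel + g * dt          # update velocity first (semi-implicit)
    pos = pos + vel * dt        # then advect positions with the new velocity
    # crude ground-plane collision at y = 0: clamp position, damp and
    # reflect the vertical velocity
    below = pos[:, 1] < 0.0
    pos[below, 1] = 0.0
    vel[below, 1] *= -0.3
    return pos, vel
```

On the GPU, the same update would run once per particle per frame, with positions and velocities stored in textures or buffers.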
Houzé's approach is to integrate Nvidia FleX, a real-time particle solver by Nvidia, into TouchDesigner through a custom C++ plugin. Houzé released FlexChop on GitHub; it can be used fairly easily in TouchDesigner to specify a particle system and acquire its subsequent states in real time.
Screen-Space Fluid Rendering
Acquiring the particles' states alone is not enough: they need to be rendered together as a continuous fluid. Traditional surface reconstruction algorithms, such as marching cubes, are too slow for real-time rendering. Instead, the industry takes a screen-space approach.
Referencing Nvidia's talk at GDC, the implementation includes several passes that render the particles as spheres to generate a depth map, a normal map, and a density map, which are then used to calculate reflection and refraction.
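The normal pass, for example, can be derived directly from the depth map by differentiating it in screen space. A minimal numpy sketch of that idea, with central differences standing in for the shader derivatives (in the real pipeline this runs per pixel in GLSL, and the function name here is illustrative):

```python
import numpy as np

def normals_from_depth(depth):
    """Estimate per-pixel view-space normals from a depth map.

    depth: (H, W) array of eye-space depth values rendered from the
    particle spheres. The surface normal at each pixel is taken from
    the local depth gradient: normal ~ normalize(-dz/dx, -dz/dy, 1).
    """
    dz_dy, dz_dx = np.gradient(depth)                 # screen-space slopes
    n = np.dstack([-dz_dx, -dz_dy, np.ones_like(depth)])
    n /= np.linalg.norm(n, axis=2, keepdims=True)     # normalize per pixel
    return n
```

In practice the depth map is smoothed first (e.g. with a bilateral blur) so the individual sphere silhouettes merge into one fluid surface before normals are computed; the density (thickness) map then attenuates the refraction color.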