[Bf-committers] flame density sampling problem

Raul Fernandez Hernandez raulf at info.upr.edu.cu
Thu Jan 21 21:14:24 CET 2010

Hi :)

> Hello again,
> I made some significant progress with the fire simulation system I
> announced here earlier, thanks to the help of jahka, who convinced me
> to integrate this into the particle system. Particles now form the
> basis of the fire's flames, so we can use them to further simulate
> fire spread and other effects at a later time. See these videos for a
> small impression of how this simulation works:
> http://www.vimeo.com/album/166313

 Congratulations on your fast development pace :)

> Now, in order to turn this physical simulation into actual images of
> fire, I started work on the second big part of this fire system, which
> is rendering. Since fire is essentially a fluid, a volumetric
> rendering approach is a natural choice. So my first attempt to render
> the flames was to use a point density texture, which describes the
> flame density, emission and color from the underlying filament data
> (those curves you see in the videos) and an associated density and
> turbulence function. However, after a little test implementation and
> after wrapping my head around how the creators of the original method
> actually did this, I figured that this is not a good approach, not
> least because rendering times get _way_ too long (here's the
> original paper again:
> http://reference.kfupm.edu.sa/content/s/t/structural_modeling_of_flames_for_a_prod_96921.pdf).

 Well, rendering times could get longer, but it is not such a bad
approach. I think that whatever method you come up with, a volumetric
rendering option should still be provided. Maya's flame effects are very
beautiful and are done with volumetric rendering; if you need further
help on this, feel free to contact me at raulf at info.upr.edu.cu.

> So now i need some help to figure out how this could be done with
> Blender's current rendering methods. I need to get a little deeper
> into the math here:
> The problem is that directly evaluating the flame
> density/illumination/etc. at a point in space is difficult, because of
> two turbulence functions applied to the raw density function:
> 1. Let x be a point relative to a flame filament
> 2. The density/color/etc. at this _untransformed_ point x is then
> given by a function d(x)
> 3. The point x is then transformed by a turbulence function T to a
> point x' = T(x). This simulates turbulence of the surrounding air as
> well as fluctuation of the combustion process and is essential for
> realistic fire!
> 4. In order to determine the actual density at an arbitrary sample
> point s in space (which is what we need to do for volumetric
> rendering), we'd have to calculate d( T_inv( s ) ), where T_inv is the
> inverse of T, which is, as far as I know, not easily computable ...
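In code, the four steps above read roughly as follows (a toy sketch: the sine-based "turbulence" and the Gaussian filament density are illustrative stand-ins, not the actual functions from the paper):

```python
import math

def density(x):
    # Step 2: raw density d(x) at an *untransformed* point, relative to
    # a filament; here a toy Gaussian falloff from the y-axis.
    return math.exp(-4.0 * (x[0] ** 2 + x[2] ** 2))

def T(x, amplitude=0.1):
    # Step 3: turbulence transform x' = T(x), a smooth displacement
    # simulating air turbulence and combustion fluctuation.
    return (x[0] + amplitude * math.sin(3.0 * x[1]),
            x[1] + amplitude * math.sin(3.0 * x[2]),
            x[2] + amplitude * math.sin(3.0 * x[0]))

# Step 4, seen from the renderer's side: a ray marcher only knows the
# *transformed* sample position s. The density that belongs there is
# d(T_inv(s)), but only the forward map T is available.
x = (0.1, 0.5, -0.2)   # untransformed point near the filament
s = T(x)               # where that density actually sits in space
print(density(x))      # the correct value for sample point s
print(density(s))      # evaluating d at s directly gives a different, wrong value
```

The gap between the last two values is exactly the problem: without T_inv, the renderer cannot map its sample point back into the space where d is defined.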

 Pardon my ignorance of flame physics, but since T is a turbulence
function, its inverse should also be a turbulence function. And while
the simulation step needs some level of accuracy, for rendering some
tradeoffs can be made: why not try rendering d( T( s ) ), that is, using
the same turbulence function in place of its inverse? It could be
nonsense, but you could try it ;)

> The solution chosen by the authors of said paper is to generate a
> complete set of point samples for the rendered flame volume in advance
> and then transform these points all together. This creates a new set
> of sample points, which is still sufficiently sampled due to the
> coherence of the turbulence function. This however is more
> complicated, since I'd have to influence the way samples for the
> volumetric material are generated instead of just creating a new
> texture method.
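The forward approach described above can be sketched like this (again with toy stand-ins for d and T, not the paper's functions): densities are evaluated once in untransformed space, every sample point is then pushed through T so its density travels with it, and the renderer interpolates in the displaced point cloud instead of ever needing T_inv:

```python
import math

def density(p):
    # Toy raw flame density d(x): Gaussian falloff from the y-axis.
    return math.exp(-4.0 * (p[0] ** 2 + p[2] ** 2))

def T(p, amplitude=0.1):
    # Toy forward turbulence transform x' = T(x).
    return (p[0] + amplitude * math.sin(3.0 * p[1]),
            p[1] + amplitude * math.sin(3.0 * p[2]),
            p[2] + amplitude * math.sin(3.0 * p[0]))

# 1. Pre-sample the untransformed volume: evaluate d(x) once per point.
n = 8
samples = []
for i in range(n):
    for j in range(n):
        for k in range(n):
            x = (-1.0 + 2.0 * i / (n - 1),
                 -1.0 + 2.0 * j / (n - 1),
                 -1.0 + 2.0 * k / (n - 1))
            samples.append((x, density(x)))

# 2. Transform every sample point forward; its density travels with it.
#    Coherence of the turbulence keeps the displaced cloud densely
#    sampled, so the renderer can interpolate within it.
transformed = [(T(x), d) for x, d in samples]
print(len(transformed))  # 512 points carrying their densities
```

This is why it touches the sampling machinery rather than just the texture system: the renderer has to look up densities in this displaced point set (e.g. via a spatial tree) instead of calling a texture function at arbitrary positions.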

  My first implementation of volumetrics (the discarded path) had the
flexibility to easily allow this, i.e. custom-tailored sampling
transformations in the volumetric sampling space, though at the time I
thought it would never be needed :)
  The current implementation is more hardcoded/integrated and does not
allow this, because it is oriented toward physical light behavior (easy
to use, intuitive), while the first volumetric code was oriented toward
visualization (flexible, harder to use).
  For example, a feature lost in the swap of volumetric implementations
was adaptive sampling, which placed more samples in high-detail zones
and fewer samples in uniform zones.

  But those are features we can live without :) ; more important is
that other approaches exist.

> I also thought about simply assuming that, due to the
> random nature of noise, the inverse transformation essentially looks
> similar to the transformation itself, so one could just perturb the
> samples before calculating the density, but I guess this would still
> look odd.

 Wow! I should have read the whole message before answering you :) ; this
is basically what I proposed above.

> I am really just starting to get into the raytracing matter, so i
> might just have overlooked a simple solution to this problem. Maybe
> there actually _is_ an inverse to the turbulence transformation and I
> didn't get it?
> Any help is appreciated :)
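On the question of whether an inverse exists: when T has the form T(x) = x + tau(x) with a small, smooth displacement tau, an approximate inverse can be computed numerically by fixed-point iteration (a standard contraction-mapping argument, not something taken from the paper). A toy Python sketch, with a sine-based stand-in for tau:

```python
import math

def turbulence(p, amplitude=0.1):
    # Toy displacement field tau(x); a stand-in for real turbulence.
    x, y, z = p
    return (amplitude * math.sin(3.0 * y + 1.0),
            amplitude * math.sin(3.0 * z + 2.0),
            amplitude * math.sin(3.0 * x + 3.0))

def T(p):
    # Forward turbulence transform: x' = T(x) = x + tau(x).
    dx, dy, dz = turbulence(p)
    return (p[0] + dx, p[1] + dy, p[2] + dz)

def T_inv(s, iterations=12):
    # Solve s = x + tau(x) for x by iterating x <- s - tau(x).
    # This converges whenever tau is a contraction, i.e. its amplitude
    # is small relative to its spatial frequency.
    x = s
    for _ in range(iterations):
        dx, dy, dz = turbulence(x)
        x = (s[0] - dx, s[1] - dy, s[2] - dz)
    return x

s = (0.5, -0.2, 1.3)
residual = max(abs(a - b) for a, b in zip(T(T_inv(s)), s))
print(residual)  # tiny: T(T_inv(s)) recovers s to high precision
```

The caveat is that strong turbulence can fold space (the Jacobian of T becomes singular), at which point no inverse exists at all; presumably that is one reason the paper's authors transform sample points forward instead.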

  How is your progress on that? I'm currently digging into isosurface
generation from particles, and an idea came to my mind... what about
using several turbulent isosurface layers for the flames? I think I have
seen a paper that uses an isosurface for the "solid" part of a flame.
   Anyway, those are just ideas.

                    Cheers   Raul
