[Bf-cycles] build with OSL support

storm kartochka22 at yandex.ru
Fri Nov 11 23:06:55 CET 2011


On Fri, 11/11/2011 at 19:43 +0100, Brecht Van Lommel wrote:
> Hi,
> 
> On Fri, Nov 11, 2011 at 5:47 PM, storm <kartochka22 at yandex.ru> wrote:
> > Eval has no relation to sampling; it is used for completely different
> > events. Sampling generates the primary integrator event: "What happened
> > to the current photon packet (or importance particle, in the case of
> > backtracing) during this timeslice?". Eval says: "If you fire a photon
> > packet in that direction, it will be attenuated by that value because
> > of some surface property". In the case of ideal, perfect importance
> > sampling they MUST have different values, because the shape of the PDF
> > controls the density of rays, not the value of the PDF. If they had the
> > same values, we would get a brighter image in the normal direction in
> > the case of a diffuse material.
> >
> > And the trick with cos(theta) is the only reason we need non-uniform
> > sampling at all in the diffuse sampler.
> 
> We must be using different definitions here?
> 
> What I mean is, if you look at kernel_path.h and how eval and pdf
> affect the throughput, it gets multiplied by eval/pdf, which is 1
> here. So there are no extra factors influencing the throughput, which
> must mean either the sampling already perfectly matches the BSDF, or
> there's a bug in the current code?

I doubt there is a bug; the rendered images are perfect. I suspect my
interpretation of the actual spatial ray density. I cannot visualize the ray
density to check it visually, but I have an idea: create a coarse 3D texture
covering the whole scene, maybe 64x64x64 or less, and "draw" every non-zero
path ray chain into it with a 3D DDA by adding a constant, simulating
self-illuminating laser beams in dust. Texture voxels with bright values show
where the paths are denser. Then maybe use that texture to adjust the
sampling to speed up render time. Sort of "MLT w/o MLT".
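
Something like this is what I have in mind, as a rough sketch in plain C (not
Cycles code; the unit-cube scene bounds, the RES constant and the names
density_grid / splat_segment are all made up for illustration):

/* Rough sketch of the density-texture idea, not Cycles code.
 * Assumes the scene is normalized into the unit cube [0,1)^3 and uses a
 * RES^3 float grid; splat_segment() walks every voxel a path segment
 * passes through (Amanatides & Woo style 3D DDA) and adds a constant. */
#include <math.h>

#define RES 64
static float density_grid[RES][RES][RES];

static void splat_segment(const float p0[3], const float p1[3], float amount)
{
    float dir[3], t_max[3], t_delta[3];
    int cell[3], step[3], i;

    for (i = 0; i < 3; i++)
        dir[i] = p1[i] - p0[i];
    if (dir[0] == 0.0f && dir[1] == 0.0f && dir[2] == 0.0f)
        return;

    for (i = 0; i < 3; i++) {
        float x = p0[i] * RES;              /* start position in voxel units */
        cell[i] = (int)floorf(x);
        if (cell[i] < 0) cell[i] = 0;
        if (cell[i] > RES - 1) cell[i] = RES - 1;

        if (dir[i] > 0.0f) {
            step[i] = 1;
            t_max[i] = ((cell[i] + 1) - x) / (dir[i] * RES);
            t_delta[i] = 1.0f / (dir[i] * RES);
        }
        else if (dir[i] < 0.0f) {
            step[i] = -1;
            t_max[i] = (cell[i] - x) / (dir[i] * RES);
            t_delta[i] = -1.0f / (dir[i] * RES);
        }
        else {
            step[i] = 0;
            t_max[i] = INFINITY;
            t_delta[i] = INFINITY;
        }
    }

    /* t parameterizes the segment: position(t) = p0 + t*dir, t in [0,1]. */
    float t = 0.0f;
    while (t < 1.0f) {
        density_grid[cell[0]][cell[1]][cell[2]] += amount;

        /* Advance along the axis whose voxel boundary is crossed first. */
        int axis = 0;
        if (t_max[1] < t_max[axis]) axis = 1;
        if (t_max[2] < t_max[axis]) axis = 2;

        t = t_max[axis];
        cell[axis] += step[axis];
        if (cell[axis] < 0 || cell[axis] >= RES)
            break;
        t_max[axis] += t_delta[axis];
    }
}

Every non-zero contribution path would then call splat_segment() once per
bounce-to-bounce segment, and the grid can be dumped as a 3D texture
afterwards.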

> 
> > __device_inline void sample_cos_hemisphere(const float3 N,
> >        float randu, float randv, float3 *omega_in, float *pdf)
> > {
> >        // Default closure BSDF implementation: uniformly sample
> >        // cosine-weighted hemisphere above the point.
> >        to_unit_disk(&randu, &randv);
> >        float costheta = sqrtf(max(1.0f - randu * randu - randv * randv, 0.0f));
> >        float3 T, B;
> >        make_orthonormals(N, &T, &B);
> >        *omega_in = randu * T + randv * B + costheta * N;
> >        *pdf = costheta * M_1_PI_F;
> > }
> >
> > It generates a UNIFORM point on the hemisphere ("to_unit_disk"), LATER
> > calculates the cosine (of the uniform point), linearly transforms it to
> > respect the input vector, and adjusts the pdf. That is called uniform
> > sampling by definition.
> 
> From what I understand, to_unit_disk generates a uniform point in a
> disk/circle.

I was wrong, to_unit_disk generates a uniform 2D disk. I don't know where I
even got the idea that it returns a point projected from a uniformly sampled
hemisphere onto the unit disk. Perhaps I misread the comment above
to_unit_disk without checking it in a book. Proof:
http://s017.radikal.ru/i425/1111/49/5b08e67f23b5.png (no visual increase of
point density near the edge after many tries).
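
The same thing can also be checked numerically instead of visually. A
throwaway sketch in plain C (my assumption here is the usual polar mapping
r = sqrt(v), phi = 2*pi*u for to_unit_disk; for a uniform unit disk the
fraction of points with radius < 0.5 must approach 0.25 and the mean radius
must approach 2/3):

/* Throwaway numerical check, not Cycles code. */
#include <math.h>
#include <stdio.h>
#include <stdlib.h>

#ifndef M_PI
#define M_PI 3.14159265358979323846
#endif

int main(void)
{
    const int N = 1000000;
    int inner = 0;
    double mean_r = 0.0;

    for (int i = 0; i < N; i++) {
        double u = (rand() + 0.5) / ((double)RAND_MAX + 1.0);
        double v = (rand() + 0.5) / ((double)RAND_MAX + 1.0);

        /* The assumed to_unit_disk mapping: uniform point on the unit disk. */
        double r = sqrt(v);
        double phi = 2.0 * M_PI * u;
        double x = r * cos(phi);
        double y = r * sin(phi);

        double radius = sqrt(x * x + y * y);
        if (radius < 0.5)
            inner++;
        mean_r += radius;
    }

    printf("fraction with r < 0.5: %f (expect 0.25)\n", inner / (double)N);
    printf("mean radius:           %f (expect 0.667)\n", mean_r / N);
    return 0;
}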

You ruined my perfect conspiracy theory!

>  If you project that up onto a hemisphere, it gives cosine
> weighted samples. This code is more complicated than it could be, but
> as far as I can see that's what it is doing.
> 

So, we have a uniformly distributed point on the unit disc, with two
coordinates, x and y. Then we calculate the last coordinate, z, by moving the
point from the disc along the normal until it hits the unit sphere. That's
easy: x*x + y*y + z*z = 1*1. At the same time, z is cos(theta) by definition.

We get less point density near the plane, and more near the center (the
normal direction). I assume it is cosine weighted (that mathematics is too
hard for me, sorry).
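
For reference, the textbook argument (Malley's method) for why this is
exactly cosine weighted goes like this (my own sketch, not something from the
Cycles source). A uniform point on the unit disk has area density

\[
p_A(x, y) = \frac{1}{\pi}.
\]

On the disk $r = \sin\theta$, so the area element is

\[
dA = r \, dr \, d\phi = \sin\theta \cos\theta \, d\theta \, d\phi,
\]

while the solid-angle element on the hemisphere is
$d\omega = \sin\theta \, d\theta \, d\phi$. Converting the density from area
to solid angle gives

\[
p(\omega) = p_A \, \frac{dA}{d\omega}
          = \frac{1}{\pi} \cdot
            \frac{\sin\theta \cos\theta \, d\theta \, d\phi}
                 {\sin\theta \, d\theta \, d\phi}
          = \frac{\cos\theta}{\pi},
\]

which is exactly the pdf that sample_cos_hemisphere returns.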

We already have importance sampling. Nice.

Only one question is left: why do we divide by the variable pdf? I mean,
bsdf_eval/bsdf_pdf looks exactly like the MC integrator equation, 1/N *
sum(f(x)/p(x)), doesn't it? bsdf_eval == f(x), bsdf_pdf == p(x), so where is
the trick? If you already include the pdf in bsdf_eval, then bsdf_pdf is not
the final pdf but some abstract constant (with some relation to the pdf)?
That makes sense; I need to double-check all my experiments. I have trouble
with hot pixels in phase function importance sampling: water-like particles
have tiny strong beams in the pdf, so importance sampling is vital in that
case.
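
To make that concrete for myself, here is the simplest possible case as a toy
sketch (my own code, not Cycles): a constant environment L_in = 1 integrated
against a Lambertian BRDF f = albedo/pi with cosine-weighted samples. The
per-sample weight f(x) * cos(theta) / p(x) collapses to the constant albedo,
so eval/pdf being 1 (times the closure weight) is not a trick; it just means
the sampling matches f * cos(theta) exactly:

/* Toy Monte Carlo estimator sketch, not Cycles code. */
#include <math.h>
#include <stdio.h>
#include <stdlib.h>

#ifndef M_PI
#define M_PI 3.14159265358979323846
#endif

int main(void)
{
    const int N = 100000;
    const double albedo = 0.8;
    double sum = 0.0;

    for (int i = 0; i < N; i++) {
        /* Cosine-weighted hemisphere sample: only cos(theta) matters here. */
        double u = (rand() + 0.5) / ((double)RAND_MAX + 1.0);
        double cos_theta = sqrt(1.0 - u);

        double f = albedo / M_PI;       /* Lambertian BRDF, f(x) */
        double pdf = cos_theta / M_PI;  /* cosine-weighted pdf, p(x) */
        double L_in = 1.0;              /* constant environment */

        sum += f * cos_theta * L_in / pdf;   /* == albedo for every sample */
    }

    /* The analytic answer for a constant environment is just albedo. */
    printf("estimate: %f (expect %f)\n", sum / N, albedo);
    return 0;
}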


