[Bf-committers] Blender and laser scanner - amendment

Jed jedfrechette at gmail.com
Tue Apr 30 23:18:16 CEST 2013


For a good real-world example of how this type of data is being used in
modern VFX pipelines, I would suggest watching Scott Metzger's fxguidetv talk
about set capture on Beautiful Creatures, which used scanner data in Mari and
Maya. I've been working on a somewhat similar pipeline that relies heavily on
Blender, and with a few changes Blender could completely replace both of
those applications as well as a couple of other commercial tools.

http://www.fxguide.com/fxguidetv/fxguidetv-165-scott-metzger-on-mari-and-hdr/


Jace Priester-2 wrote
> I wrote in point cloud support for Blender over a year ago, but I have
> been told repeatedly by the developers that there is no interest in the
> community and therefore they aren't interested in implementing it.

That's unfortunate, but I'll add the voice of another user who would be very
interested in seeing these areas developed. In addition, I think some of the
required changes would have benefits for users well beyond the small subset
working with scanner data. When Ton asked for potential GSOC stakeholders
about a month ago, I posted a message with my specific ideas:

http://lists.blender.org/pipermail/bf-committers/2013-March/039585.html


Jace Priester-2 wrote
> There are entire software libraries and programs devoted to that task
> (creating textured meshes) and frankly all of them fail pretty horribly
> given anything except laboratory-grade clean point cloud data. Under
> normal circumstances with a normal amount of noise and holes in the data,
> point clouds do not mesh well.

I don't disagree with your main point that Blender shouldn't be in the lidar
processing business, but I don't think building textured meshes is quite as
hard as you make it seem. For example, the open-source tool PoissonRecon
(http://www.cs.jhu.edu/~misha/Code/PoissonRecon/Version4.51/) can be used to
produce meshes that are largely equivalent to a dense single-resolution
sculpt. Using structure from motion, with the scanner geometry as a
reference, to reconstruct camera positions and project photo textures is a
natural extension of the camera tracking tools already in Blender.
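
For anyone who wants to experiment, a minimal sketch of driving the
PoissonRecon command-line tool from Python might look like the following.
The binary name and the --in/--out/--depth flags are taken from the version
linked above and should be double-checked against whatever build you end up
compiling; the file names are just placeholders.

    import subprocess

    def poisson_reconstruct(points_ply, mesh_ply, depth=10):
        # PoissonRecon expects an oriented point cloud (positions + normals)
        # and writes out a watertight triangle mesh.
        subprocess.check_call([
            "PoissonRecon",         # path to the compiled binary
            "--in", points_ply,     # input .ply with per-point normals
            "--out", mesh_ply,      # output mesh .ply
            "--depth", str(depth),  # octree depth; higher = denser mesh
        ])

    poisson_reconstruct("scan_points.ply", "scan_mesh.ply", depth=11)

The resulting .ply can then be brought into Blender with the standard PLY
importer.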


Bastien Montagne wrote
> For render, I think the simplest solution would be to tweak a bit
> particles (we already have billboards, just would have to make them be
> able to take the color from actual vertices and not only from loops), a
> system with as much particles as vertices, emitted from those all at once
> and with lifetime=scene duration is a suitable way to render a points
> cloud.

In general, I'm more interested in point clouds (a.k.a. particle data sets
with arbitrary attributes) in the viewport, but for rendering you may want
to look at some of the algorithms in CloudCompare
(http://www.danielgm.net/cc). They do as good a job as, or better than, any
of the commercial point cloud rendering software I've used. Perhaps a
separate rendering engine, a la Freestyle, would be more appropriate than
trying to hack it into the current system.
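
For what it's worth, the particle setup Bastien describes could be roughed
out from Python along these lines. This is only a sketch against the current
bpy API, the object name is made up for the example, and taking the
billboard color from the emitting vertex is still the missing piece he
mentions.

    import bpy

    obj = bpy.data.objects["PointCloud"]     # mesh whose vertices are the scan points
    obj.modifiers.new("PointCloudParticles", type='PARTICLE_SYSTEM')
    settings = obj.particle_systems[-1].settings

    settings.count = len(obj.data.vertices)  # one particle per vertex
    settings.emit_from = 'VERT'              # emit from the vertices themselves
    settings.use_emit_random = False         # keep the original ordering
    settings.frame_start = 1                 # emit everything on frame 1...
    settings.frame_end = 1
    settings.lifetime = bpy.context.scene.frame_end  # ...and keep it for the whole scene
    settings.physics_type = 'NO'             # particles stay on their vertices
    settings.render_type = 'HALO'            # or 'BILLBOARD', per Bastien's note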

Best,

--
Jed Frechette

Lidar Guys




