[Robotics] Blender and UAVs

Herman Bruyninckx Herman.Bruyninckx at mech.kuleuven.be
Thu Jan 28 10:49:30 CET 2010


On Thu, 28 Jan 2010, Séverin Lemaignan wrote:

> I'm adding the robotics at blender.org mailing-list to this interesting
> discussion.
>
>> Currently I'm building a virtual lab in Blender to test my human motion
>> tracking algorithms. This includes exporting virtual camera data to my
>> computer vision algorithm (as the sensor data from multiple cameras), but
>> also controlling the models' armatures from my CV program.
>
> Excellent project. I'd be very glad to see some videos/papers about that!
>
>> In this context I started looking into the possibilities of using Blender
>> as a "component" (in our Orocos naming scheme).
>> This is something I would like to discuss with you all, and hopefully
>> present some ideas and goals at the Annual Meeting, in order to combine
>> our efforts towards a common goal, because this goes way further than
>> what a single person can accomplish.
>> Some things that I (and Herman) think are related to such a problem:
>>
>> - It means there must be an FSM around Blender (configuring, starting and
>> updating as primary functions, timing for example as secondary)
>
> What do you call an "FSM"? A Finite State Machine, I guess.
Yes.

> But then what do you mean by "timing"? You would like to have something
> like a common clock line for all your components?

When making a simulation system that involves multiple heterogeneous
components, each of them should offer the possibility to be driven by a
"virtual, simulation time". One simulation time trigger will send out
events that each component's FSM translates into an execution of the
functionality of that component.

The FSM is also needed for (re)configuration, at startup, but often also
at runtime.
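To make the "FSM plus simulation time" idea concrete, here is a minimal,
hypothetical Python sketch (all names are illustrative; this is not an
existing Orocos or Blender API): one simulation-time trigger broadcasts
"tick" events, and each component's FSM decides whether that event results
in an execution of its functionality.

```python
from enum import Enum, auto

class State(Enum):
    CONFIGURING = auto()
    RUNNING = auto()
    STOPPED = auto()

class Component:
    """A toy component whose FSM reacts to events from a simulation clock."""
    def __init__(self, name):
        self.name = name
        self.state = State.CONFIGURING
        self.updates = 0  # how many times our functionality was executed

    def on_event(self, event, sim_time=None):
        if event == "configure":
            self.state = State.CONFIGURING
        elif event == "start" and self.state is State.CONFIGURING:
            self.state = State.RUNNING
        elif event == "tick" and self.state is State.RUNNING:
            self.updates += 1  # execute this component's functionality
        elif event == "stop":
            self.state = State.STOPPED

class SimulationClock:
    """One simulation-time trigger driving every registered component's FSM."""
    def __init__(self):
        self.time = 0.0
        self.components = []

    def tick(self, dt):
        self.time += dt
        for c in self.components:
            c.on_event("tick", sim_time=self.time)
```

A component that has not been started simply ignores the ticks, which is
exactly the (re)configuration behaviour mentioned above: the FSM gates
which events reach the component's functionality.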

> In any case, we call at LAAS/ONERA this control component the
> "simulation supervisor", and we are currently implementing something
> like that in the Blender UI. We wrote a partial specification for that,
> but I can imagine that your needs are likely to be a bit specific.

We don't know yet :-) But apparently, it _is_ a topic that we could have a
more open discussion about. If we succeeded in wrapping Blender in a
"component interface", it would make the program far more attractive for
many applications, since people currently only know it as a standalone
desktop application.

>> - It means we should write our communication as middleware-independent
>> as possible, and make it compatible with a number of middlewares:
>>
>>   * YARP (LAAS)
>>   * ROS ( Leuven, TUM, ...) (I'm currently testing this)
>>   * Standard TCP/UDP data dumps
>>   * Corba (Leuven) (we already have implementations based on Corba)
>>   * ... whatever I might have overlooked
>
> Yes, this point (middleware independence) was very high on the
> requirements list for ORS (OpenRobots Simulator) from the beginning.

We have some ideas about that. Topic for further discussion :-)
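One way to sketch such middleware independence (purely hypothetical names,
not an actual Orocos or Genom interface): components talk only to an
abstract transport, and each middleware in the list above — YARP, ROS,
CORBA, raw TCP/UDP — would get its own concrete back end behind the same
interface.

```python
from abc import ABC, abstractmethod

class Transport(ABC):
    """Middleware-agnostic port; concrete subclasses could wrap
    YARP, ROS, CORBA, or raw TCP/UDP sockets."""
    @abstractmethod
    def publish(self, topic, data): ...

    @abstractmethod
    def subscribe(self, topic, callback): ...

class LoopbackTransport(Transport):
    """In-process stand-in: useful for testing components
    without any real middleware installed."""
    def __init__(self):
        self._subs = {}

    def publish(self, topic, data):
        for cb in self._subs.get(topic, []):
            cb(data)

    def subscribe(self, topic, callback):
        self._subs.setdefault(topic, []).append(callback)
```

Component code would depend only on `Transport`, so selecting a middleware
becomes a configuration choice rather than a code change — which is also
the spirit of generating the bindings automatically, as Genom3 intends.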

> The path we would like to take relies on LAAS's Genom3: an important,
> under-development software project that aims to automatically generate
> modules for a wide range of middlewares from a single module specification.

I know the predecessors of Genom3, so I am _very_ interested in the
concepts that you want to introduce.

> Genom3 is not yet available (we are in the process of converting the
> prototype to a full-featured application), so we've been implementing
> "by hand" support for YARP (and raw TCP sockets, if I'm right. Gilberto?).
>
> I would like to add that, generally speaking, LAAS does *not* rely on
> YARP. We are merely YARP users because of several European projects we
> are involved in. But our everyday "middleware" (inter-component
> communication framework, should I say) is an internal tool called Pocolibs.
>
>> - It should mean that we both need to get input into and _output_ out of Blender
>> - In the long term it should mean that we could load a unified
>> description of our virtual environment from the FSM (for example, load a
>> Collada file, ...):
>>
>>   * the model/meshes
>>   * the kinematics/armatures
>>   * the dynamics of such models
>>   * lighting and camera parameters
>>   * user interface (if needed)
>>   * ... think I forgot a whole lot here
>
> This point on Collada files appears very important to me. I'll write
> another mail on this topic later on.
Good!

>> PS2: Herman, should we include people from TUM in this discussion or are
>> they coming to the Annual Meeting?
> I'm currently discussing it with them. They are using Gazebo as their
> primary simulation platform, and are not likely to change anytime soon, I think.

We will convince them (and others) with an unbeatable feature set of the Blender
approach! :-) That's my major goal for the Blender for robotics workshop at
the EURON Annual Meeting...

> At least not until it's well integrated with ROS with an easy path to
> convert current models to models usable in Blender.

This is a two-way interaction: ROS is not using Collada in the most
appropriate way yet (nor is Orocos), but again we have some very concrete
suggestions about what that more appropriate way could be :-)

Thanks for the constructive responses! It seems it's time to start some
more focused, single-topic follow-up posts :-) Maybe we can do that
already now, or wait until after our face-to-face meeting in Toulouse in
two weeks, during which we can prepare a more consistent and concrete
discussion document. What are your preferences?

Herman

