[Robotics] Introduction

Herman Bruyninckx Herman.Bruyninckx at mech.kuleuven.be
Thu Aug 27 17:41:45 CEST 2009


On Thu, 27 Aug 2009, Séverin Lemaignan wrote:

> I start with the end of your mail:
> > I also want to repeat my suggestion of two weeks ago, to organize a
> > workshop or conference around open source robotics software. I have
> > received close to zero reactions to that suggestion, which is in contrast
> > with the self-proclaimed openness and cooperation of the open source
> > community... :-(
>
> I started to answer you, but after a Thunderbird crash, I need to start
> over again. In brief, I was wondering how the RoSta project and the
> outcomes of RoSta could bear on this symposium.

RoSta was a very useful project to identify a lot of the problems that the
robotics community was not really aware of, such as: there is no
interoperability at all at this moment; there is no such thing as _the_
robotics architecture; the problem of robot software interoperability is
about more than just standardizing APIs (there is also a need for semantic,
ontological activities, and there are significant differences between
object-oriented libraries, component-based systems, and service-oriented
web services, etc.).

> More precisely, regarding interoperability, it seems to me that it's
> something everybody talks about, even thinks about, but is not always
> committed to making happen. For several good reasons, including the work
> it requires.
Indeed.

> I think that was one of the conclusions of the RoSta
> project. Maybe you could go into some detail about what kind of results
> you would expect from the symposium.
I think these are a couple of low-level, realistic goals:
- to agree on data structures for (instantaneous) motion and "geo-spatial"
   maps (see the sketch below)
- to discuss concrete suggestion(s) on how to interface Blender, and on
   what a "robotics-oriented UI" should look like
- a critical discussion of the ROS Concepts
     <http://www.ros.org/wiki/ROS/Concepts>
   to see to what extent they serve all use cases in robotics.
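
For the first goal, here is a minimal sketch of what such a motion data
structure could look like (the naming is mine, purely illustrative, not a
proposed standard): a 6D twist that carries its reference frame along, so
that the semantics travel with the data.

    # Sketch only: field names and conventions are assumptions.
    class Twist:
        def __init__(self, ref_frame, linear, angular):
            self.ref_frame = ref_frame   # name of the coordinate frame
            self.linear = linear         # (vx, vy, vz) in m/s
            self.angular = angular       # (wx, wy, wz) in rad/s

    # Example: motion of a robot base, expressed in an 'odom' frame.
    t = Twist("odom", (0.5, 0.0, 0.0), (0.0, 0.0, 0.1))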

> Now, regarding more specifically the simulation: we are still at a very
> early design stage, more like experimenting than anything else. But an
> open discussion on this topic with other teams (especially teams
> developing middlewares) could be very interesting.
> In this case, I see something more like a workshop than a symposium.

Fine!

> To your remarks, now:
>
>> - there is a significant difference between the game engine and the
>>    simulation mode. The former is the most important one for robotics
>>    (because it offers interaction possibilities with the outside world), and
>>    hence we should concentrate on that.
>
> That's an important aspect. We actually work with the GameEngine
> because we need the physics simulation. That's even our main reason
> for choosing Blender. But I agree with you: interactions with the
> simulation are not easy in GameEngine mode, and that's a problem.

I am not sure why we really need the simulation mode...

> I once wrote a mail to Ton to ask whether Blender 2.5 could allow
> the GameEngine to run while keeping the Blender interface
> active. It's still unclear to me, but it would be a very welcome
> improvement for us.

Can you say which use case this would solve for you?

> In the meantime, and in our use case, I see the workflow like that:
>
> 1- You design your robot from the "normal" Blender interface, made of
> reusable components (I'll come back to this topic later) that you can
> drag & drop (imagine a component library integrated in the Blender UI).

I really _do_ imagine such an interface! :-) I think it is definitely
needed, to help roboticists make good use of Blender's
functionality...

> You add a
> GPS, a camera, an arm... like that. Each component (sensor or actuator)
> may be configured.
I fully approve of this description... :-)
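
To make it concrete: I imagine each such component carrying a small,
declarative description of its configurable parameters, something like
the following (all names are illustrative assumptions, nothing more):

    # Hypothetical descriptor for a drag & drop GPS component.
    gps_component = {
        "name": "GPS",
        "type": "sensor",
        "update_rate_hz": 1.0,            # configurable per instance
        "parameters": {
            "noise_stddev_m": 2.5,        # simulated position noise
        },
    }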

> 2- You design the whole scene in the same way, importing your
> previously-made robots in the scene. You configure here some global
> parameters (like a geo-reference for the GPS, or the middleware you'll
> be using).

I think importing a 'world' model from somewhere else should also be possible
at this stage. In a later phase, that world model should be updated
online...

> 3- You start the simulation by starting the GameEngine, and start
> talking to your robot through your middleware.
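
For step 3, the Blender side of that bridge could start as simple as the
sketch below, called once per frame by a Python controller in the
GameEngine. The UDP socket and the JSON encoding are only stand-ins for a
real middleware binding; the message encoding is exactly the part that
still has to be agreed on between frameworks.

    import json
    import socket

    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    MIDDLEWARE_ADDR = ("127.0.0.1", 9000)   # assumed port, illustrative only

    def publish(sensor_name, data):
        # Send one sensor reading to the outside world.
        msg = json.dumps({"sensor": sensor_name, "data": data})
        sock.sendto(msg.encode(), MIDDLEWARE_ADDR)

    # e.g. publish("gps", {"lat": 50.86, "lon": 4.68})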

>> - Blender has a very difficult user interface for beginners, and for "poor"
>>    robotics engineers. So, coming up with a 'robotics panel' would be a good
>>    idea, I think. The new 2.50 provides lots of opportunities for customized
>>    panels.
>
> We plan to rely heavily on this feature.
We too! So this is certainly something we should cooperate on.

> In particular, in our current approach, each component would come with
> its own configuration panel (as a specific Python file).
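
That fits what I have in mind. Measured against the still-moving 2.5
Python API (so the details may well differ), such a per-component panel
could look roughly like this:

    import bpy

    # Hypothetical configuration panel for a GPS component
    # (registration with Blender is omitted here).
    class RobotGPSPanel(bpy.types.Panel):
        bl_label = "GPS Component"
        bl_space_type = 'PROPERTIES'
        bl_region_type = 'WINDOW'

        def draw(self, context):
            layout = self.layout
            layout.label(text="GPS configuration")
            # Real settings (update rate, noise model, geo-reference)
            # would be exposed here as Blender properties.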

>> - we all want to interface Blender with our own middleware, hence it makes
>>    sense to develop together the Blender part of that interface such that it
>>    is compatible with all possible external middleware projects, with just a
>>    small amount of configuration.
>> - this interface middleware is just half of the story: besides being able
>>    to send "messages" to, and receive from, the outside world, we also have
>>    to come up with a "standard" about the meaning of the data structures in
>>    the messages.
>
> That's one of the big challenges. Our current idea is to rely on Genom3
> (Herman, I think you attended the presentation at ICAR; cf. the attached
> slides): it's basically a code generator that takes as input a
> description of the services (and datatypes) your component offers, a set
> of so-called "codels" that implement the actual computation required to
> generate the sensor output or actuator actions, and a generic template
> per middleware.
>
> With that, someone who wants to use the simulator with their own
> middleware just needs to write their own template and everything works, and
> someone who wants to add a new component just describes the services the
> component offers and writes the Python code to be executed in Blender to
> generate the data.

That means the Genom3 data structures become "the standard"... And I am not so
sure that that's the right choice. (The same remark holds for other
frameworks, of course.) The real challenge is not so much to make Blender
interoperable with one particular framework, but to make it interoperable
with _all_ frameworks! That requires a higher level of standardization
between these frameworks, of course. And Blender could be the catalyst to
start such interoperability work...
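
To make that point concrete: what I would like to see is a
framework-neutral description of the message types, from which each
framework's template (Genom3, ROS, ...) is generated, instead of one
framework's native types becoming the de-facto standard. A sketch (all
names are mine, purely illustrative):

    POSE_2D = {
        "name": "Pose2D",
        "fields": [
            ("x", "float64", "m"),
            ("y", "float64", "m"),
            ("theta", "float64", "rad"),
        ],
        # The semantics must be part of the standard, not just the layout.
        "semantics": "pose of the 'body' frame w.r.t. the 'world' frame",
    }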

> More details on that on the wiki in a few days.
>
>> - the game engine already has very interesting sensors for robotics, the
>>    "message" and the "distance sensor" being the first ones that come to my
>>    mind. Also, making artificial "camera images" should be rather easy. But
>>    more complex robotics sensors should be added, and we should be able to
>>    share developments. For example, inertia sensors, laser range scans,
>>    force and touch sensors, ...
>
> The first components we want to implement to validate the approach are a
> GPS and a stereo head.
Good!
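
For the GPS, one simple recipe (a sketch under my own assumptions, using a
flat-earth approximation only) is to read the robot's position in the
scene and offset it against the geo-reference configured in your step 2:

    import math

    EARTH_RADIUS = 6378137.0   # WGS84 equatorial radius, in metres

    def fake_gps(x, y, ref_lat_deg, ref_lon_deg):
        # x: metres east of the geo-reference, y: metres north of it.
        lat = ref_lat_deg + math.degrees(y / EARTH_RADIUS)
        lon = ref_lon_deg + math.degrees(
            x / (EARTH_RADIUS * math.cos(math.radians(ref_lat_deg))))
        return lat, lon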

> Laser scanners are actually very close to cameras
> (since we can generate fake laser range scanner output from an OpenGL
> depth map).
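
Indeed. As a sketch of that trick (reading the depth buffer itself is left
out here, since it depends on the Blender/GameEngine version): linearize
one row of the OpenGL depth buffer, then correct each pixel for its ray
angle. 'near', 'far' and 'fov_rad' must match the simulated camera.

    import math

    def row_to_scan(depth_row, near, far, fov_rad):
        # depth_row: one row of the depth buffer, values in [0, 1].
        n = len(depth_row)
        scan = []
        for i, z_b in enumerate(depth_row):
            z_eye = near * far / (far - z_b * (far - near))  # linearize depth
            theta = (float(i) / (n - 1) - 0.5) * fov_rad     # ray angle
            scan.append(z_eye / math.cos(theta))             # range along ray
        return scan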

Thanks for your informative message :-)

Herman

