[Robotics] Blender and academic world

Herman Bruyninckx Herman.Bruyninckx at mech.kuleuven.be
Wed Apr 15 18:31:41 CEST 2009


On Tue, 14 Apr 2009, Damien Marchal wrote:

> I'm a research engineer at IRCICA-CNRS, specializing in virtual
> reality. At IRCICA we have some good haptic devices (Phantoms) as
> well as a room with ART tracking:
> http://www2.lifl.fr/pirvi/pirvi_salleRV.php
> and of course touch tables. I initiated the use of Blender in my
> institute, and we have just released a specific version of Blender
> supporting the TUIO protocol in the game engine.
> You can have a look at this news item:
> http://www.blendernation.com/2009/04/11/blendertuio-blender-multi-touch-screen/
>
> For the moment, blenderTUIO can receive sensor data and inject it
> into the game engine. Initially we tried to use Python scripts, but
> that was too slow, so we added direct support in C++, which is now
> working nicely. Currently only data from the physical sensors is
> mapped to game-engine sensors, so we can update object
> position/orientation based on the sensor data. A possible next step
> could be to send force/torque from the game engine to a physical
> actuator (in my case a Phantom). I also plan to add richer
> information from the sensors; for example, we have many cameras from
> which we get a 2D polygonal model, and I would like to inject that
> into the game engine (in my case for VR interaction).

Great! All this makes direct sense for robotics applications too :-)
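
To make the mapping concrete: the TUIO protocol delivers normalized 2D
cursor coordinates, which then have to be converted into scene coordinates
before driving an object. A minimal sketch (illustrative Python, not the
actual BlenderTUIO C++ code; the `Cursor` type and surface dimensions are
hypothetical):

```python
# Sketch: map a normalized TUIO 2D-cursor onto a world-space position.
# This mirrors the sensor-to-object mapping Damien describes; all names
# here are hypothetical, not BlenderTUIO internals.

from dataclasses import dataclass

@dataclass
class Cursor:
    session_id: int
    x: float  # normalized [0, 1], TUIO convention
    y: float  # normalized [0, 1], y axis pointing down

def cursor_to_world(cur: Cursor, width: float, height: float):
    """Convert a normalized TUIO cursor into world-space (x, y).

    TUIO's y axis points down, so it is flipped here.
    """
    return (cur.x * width, (1.0 - cur.y) * height)

# Example: a cursor at the centre of a 4x3 interaction surface
pos = cursor_to_world(Cursor(session_id=1, x=0.5, y=0.5),
                      width=4.0, height=3.0)
```

The real implementation would update the object's transform every logic
tick from the latest cursor state, but the coordinate conversion is the
essential step.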

> A few days ago I noticed your project on BlenderNation:
> http://www.blendernation.com/2009/04/10/blender-for-robotics-is-shaping-up/
>
> I was wondering how much overlap there is between our respective work,
> how much we can reuse from each other, and whether we could collaborate.
> If you are going to add support at the C++ layer (is this a direction
> you want to go?)

Yes, if possible! And "possible" means:
- if the Blender maintainers accept such patches
- if we can come up with interfaces for Game Engine 'sensors' and
   'actuators' that are sufficiently generic but at the same time
   sufficiently performant to be relevant to "real" robotics applications.

> maybe it could make sense to define a shared interface for all kinds of
> external sensors and actuators, as well as their Python exposure
> (normalized naming)?

I fully agree! A first step towards "normalized" interfaces and naming is
to identify the class of "sensors" that we want to interface with. We could
start this on the mailing list, but since Lille is rather close to Leuven,
you could come over for a day or two and have a discussion here. (Or the
other way around.)

In order to start this discussion, let me make some comments and suggestions:
- 'actuators' in robotics have basically four "modes": position, velocity,
   acceleration, or torque. The latter requires a dynamic model of the
   "robot" that the actuator is acting on. Position is currently the
   simplest to interface in Blender. Velocity and acceleration require some
   form of 'state information' to be integrated into the Game Engine. (Or
   does that exist already?)
   Anyway, (most) actuator interfaces looked rather simple to me.
- 'sensors' in robotics can be (roughly) of two different kinds:
   - proprioceptive: that means, 'sensing the internal state' of the robot,
     and that almost always boils down to the complement of the 'actuators'
     above, that is the sensing of the position, velocity, acceleration
     and/or the torque at the robot's joints.
   - exteroceptive, that means, 'sensing the interaction of the robot with
     its environment'. This category carries many sub-categories:
     - distance sensors (laser range scanners, for example)
     - force sensors (at the "wrist" of a robot arm, the "foot" of a
       humanoid robot, or the "fingertip" of a robothand)
     - cameras, which take images of the scene around the robot.
     Again, it is not too difficult to give each of these categories an
     interface. If I am not mistaken, this kind of functionality is what
     the people at LAAS want to achieve, via their local "Blender for
     robotics" project and programmer...
- 'controllers': the set of robot control algorithms is rich, but still
   "enumerably finite", so some initial standardization would be possible,
   e.g.: 1-DOF PID control; simple trajectory planning to reach goals; full
   dynamic control of humanoid robot armatures; ...
   (Support for the latter is in progress, to some extent, via a Blender
    project run by Game Engine developer Benoit Bolsee and sponsored
    by K.U.Leuven.)
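
To make the actuator/sensor part of this more tangible, here is a rough
sketch of what a "normalized" interface with the four actuator modes and a
proprioceptive sensor could look like. This is purely hypothetical Python,
just to seed the discussion; none of these names exist in Blender or the
Game Engine:

```python
# Hypothetical "normalized" robotics interface sketch: an actuator with
# the four modes discussed above, and a proprioceptive sensor that reads
# back the joint state. All class and method names are invented here.

from abc import ABC, abstractmethod
from enum import Enum

class Mode(Enum):
    POSITION = "position"
    VELOCITY = "velocity"
    ACCELERATION = "acceleration"
    TORQUE = "torque"  # would need a dynamic model of the robot

class Actuator(ABC):
    @abstractmethod
    def command(self, mode: Mode, value: float) -> None: ...

class Sensor(ABC):
    @abstractmethod
    def read(self) -> float: ...

class SimpleJoint(Actuator):
    """Toy 1-DOF joint: position commands are direct, velocity commands
    are integrated over time by step()."""
    def __init__(self):
        self.position = 0.0
        self.velocity = 0.0

    def command(self, mode: Mode, value: float) -> None:
        if mode is Mode.POSITION:
            self.position = value
        elif mode is Mode.VELOCITY:
            self.velocity = value
        # ACCELERATION / TORQUE would need extra state / a dynamic model

    def step(self, dt: float) -> None:
        self.position += self.velocity * dt

class JointPositionSensor(Sensor):
    """Proprioceptive sensor: the complement of the actuator above."""
    def __init__(self, joint: SimpleJoint):
        self.joint = joint

    def read(self) -> float:
        return self.joint.position

# A velocity command integrated over half a second:
joint = SimpleJoint()
joint.command(Mode.VELOCITY, 1.0)
joint.step(0.5)  # joint.position is now 0.5
```

Exteroceptive sensors (range scanners, force sensors, cameras) would be
further `Sensor` subclasses returning richer data types than a single
float, which is exactly where the naming discussion gets interesting.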
_The_ problem will be to get agreement about all little details, I guess :-)
And we should be worried a bit by the fact that the "real" robotics world
has also failed to come up with normalized interfaces ...
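
For the simplest item in the controller list, 1-DOF PID control, a minimal
textbook sketch looks like this (the gains and the toy first-order plant
are illustrative values, not anything tuned for a real robot):

```python
# Minimal 1-DOF PID controller, the first entry in the controller list
# above. Gains and the toy plant are illustrative only.

class PID:
    def __init__(self, kp: float, ki: float, kd: float):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_error = None

    def update(self, setpoint: float, measurement: float, dt: float) -> float:
        error = setpoint - measurement
        self.integral += error * dt
        derivative = 0.0 if self.prev_error is None \
            else (error - self.prev_error) / dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

# Drive a toy first-order plant (velocity = controller output) to a
# setpoint of 1.0 over 20 simulated seconds:
pid = PID(kp=2.0, ki=0.5, kd=0.1)
position, dt = 0.0, 0.01
for _ in range(2000):
    position += pid.update(1.0, position, dt) * dt
# position has converged close to the setpoint 1.0
```

In a Game Engine context, the `update()` call would run once per logic
tick, with the measurement coming from a (proprioceptive) sensor and the
output going to an actuator.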

> I would dream of a demo, based on blender, in which a set of real robots
> are navigating in the building, with some of them controlled from our VR
> room using the art-tracks gloves and touchtable... so I will follow the
> blender-robotics page news with a lot of care.

Judging from your message, you can do much more than "follow": you can
"lead" :-)

> Have fun and best regards,
> Damien.
> PS: I'm not sure we have the same definitions for words like 'sensor',
> etc. :) Please correct me if this is not the case.

I am rather happy with the semantics of "sensor" and "actuator" and
"controller" in the Game Engine, since these meanings correspond _very_
well to the common meaning in robotics :-)

Summary of my reply: your stuff is a _perfect_ fit for what the "blender
for robotics founders" want to do, so please keep on "bothering" us with
your ideas and suggestions and code! :-)

Best regards,

Herman Bruyninckx

--
   K.U.Leuven, Mechanical Eng., Mechatronics & Robotics Research Group
     <http://people.mech.kuleuven.be/~bruyninc> Tel: +32 16 322480
   Coordinator of EURON (European Robotics Research Network)
     <http://www.euron.org>
   Open Realtime Control Services <http://www.orocos.org>

