[Robotics] Integration of middlewares

Damien Marchal damien.marchal at lifl.fr
Fri Apr 17 19:05:22 CEST 2009


Hi all,

First, great work Severin on the imagegrabber, I really like it.

> I think there is a middle way, that most people want to follow: to have
> some "scripting" possibilities to customize the available features, without
> having to "hack". So, I think that the robotics community should focus on
> (i) what "hacking" is required to extend the set of current features to a
> level that is very attractive to robotics (and computer animators!), and
> (ii) to work on a framework of Python scripts that can be put together in a
> "Blender robotics script repository", and whose interoperability is
> guaranteed.
>   
Scripting capability is good. But in the current state of Blender (2.48)
you need to hack a bit if you want some nice features (combined with
performance). So yes: minimize the amount of change in Blender and
maximize the amount of scripting.

> Would this (de)serialization functionality not be something that belongs in
> the Blender code? Here you seem to suggest that each controller writer must
> do it itself...
>
>   
>>  6) InOutActuator
>>             |
>>             | (udp, tcp, shmem)
>>             |
>>  7) component_wrapper: deserialization of data to the framework data
>> format.
>>  8) the real actuator
>>     
>
> Can you explain which of these 8 points your C++ patches to the GameEngine
> are covering?
>   
TUIO describes what 2D and 3D objects and 2D and 3D cursors are.
For example, TUIO says that a 2D cursor is a list of floats [x, y,
dx, dy, fx, fy, angularvel]. This information is then sent between
peers (the touchtable or the Wiimote sends to Blender) using a
protocol named OSC (it runs over UDP).
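To make the wire format concrete, here is a minimal sketch of how such a cursor list could be encoded and decoded as a single OSC message (null-padded address and typetag, then big-endian 32-bit floats). The address `/tuio/2Dcur` and the exact float list follow the description above; this is an illustration of the OSC framing, not the full TUIO bundle protocol:

```python
import struct

def _pad(b: bytes) -> bytes:
    # OSC strings are null-terminated and padded to a 4-byte boundary.
    return b + b"\x00" * (4 - len(b) % 4)

def encode_cursor(addr, values):
    """Encode a list of floats as one OSC message (big-endian)."""
    typetag = "," + "f" * len(values)
    msg = _pad(addr.encode()) + _pad(typetag.encode())
    for v in values:
        msg += struct.pack(">f", v)
    return msg

def decode_cursor(packet):
    """Decode an OSC message whose arguments are all floats."""
    end = packet.index(b"\x00")
    addr = packet[:end].decode()
    off = (end + 4) & ~3          # skip padding after the address
    tend = packet.index(b"\x00", off)
    typetag = packet[off:tend].decode()
    off = (tend + 4) & ~3         # skip padding after the typetag
    n = typetag.count("f")
    values = list(struct.unpack(">" + "f" * n, packet[off:off + 4 * n]))
    return addr, values
```

A packet built with `encode_cursor("/tuio/2Dcur", [x, y, dx, dy, fx, fy, angularvel])` round-trips through `decode_cursor` unchanged.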

So what I did was write a receiver that gets the UDP packets, extracts
the information contained in each packet and stores the data. The scene
objects that are interested in these data have to use a TUIOSensor
(most of the time connected to a Python Controller). The Controller
then has access, through a Python binding, to the TUIOSensor data, and
thus can process the values (e.g. changing the location of the object).
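The receive-store-poll pattern described above could be sketched like this. Everything here is hypothetical (the class name, the packet layout of a 32-bit session id followed by the seven cursor floats, and the polling interface are assumptions for illustration, not the actual blenderTUIO code):

```python
import socket
import struct
import threading

class CursorStore:
    """Hypothetical receiver: stores the latest cursor values per
    session id so a Controller can poll them through a sensor."""

    def __init__(self, host="127.0.0.1", port=0):
        self.sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
        self.sock.bind((host, port))
        self._lock = threading.Lock()
        self._cursors = {}

    @property
    def address(self):
        return self.sock.getsockname()

    def poll_once(self):
        # Receive one packet: a 32-bit session id followed by the
        # seven floats [x, y, dx, dy, fx, fy, angularvel].
        data, _ = self.sock.recvfrom(4096)
        sid = struct.unpack(">i", data[:4])[0]
        values = list(struct.unpack(">7f", data[4:32]))
        with self._lock:
            self._cursors[sid] = values

    def latest(self, sid):
        # What a Python Controller would read via the sensor binding.
        with self._lock:
            return self._cursors.get(sid)
```

In a real setup `poll_once` would run in its own thread or be driven once per logic tick by the game loop.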


>> But I totally agree that there is plenty of other protocol to transmit
>> normalized data (corba, ws, ace, verse, osc(tuio), yarp).
>>     
>
> Blender should not know about any of those... :-)
>   
I was thinking of having something called RawSensor/RawActuator that
would behave like in and out ports through which the data flows into
and out of Blender. But this RawSensor/RawActuator needs to use
something (UDP, TCP, shmem) to exchange data with the external
component_wrapper, doesn't it?
The component_wrapper hides the external frameworks (YARP, etc.) from
Blender, but at least one communication framework has to be used to
exchange data between Blender and the component_wrapper. This can be a
really low-level framework just exchanging byte streams or arrays, or
a higher-level one (exchanging messages, synchronous/asynchronous,
etc.). In this context I was citing Verse just because it already
exists in Blender. ACE seems often cited and rather modern.
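The low-level end of that spectrum could look like this minimal sketch of a raw port over UDP: Blender just moves uninterpreted byte arrays, and all meaning lives in the external component_wrapper. The class name and methods are assumptions for illustration:

```python
import socket

class RawPort:
    """Hypothetical RawSensor/RawActuator port: exchanges opaque
    byte arrays with an external component_wrapper. Blender never
    looks inside the payload."""

    def __init__(self, local=("127.0.0.1", 0)):
        self.sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
        self.sock.bind(local)

    @property
    def address(self):
        return self.sock.getsockname()

    def send_raw(self, peer, payload: bytes):
        # RawActuator side: push a byte stream out of Blender.
        self.sock.sendto(payload, peer)

    def recv_raw(self) -> bytes:
        # RawSensor side: pull a byte stream into Blender.
        data, _ = self.sock.recvfrom(65536)
        return data
```

Swapping UDP for TCP or shared memory would only change this class; the sensor/actuator interface seen by the rest of Blender stays the same.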

>> Let's suppose that the data_structure is bound to a "class" name. When
>> the component_wrapper connects
>> to a sensor or an actuator it advertise this class name to Blender.
>> Something like: I will send you "6dofphanom" or
>> "framebufferimage". 
>>     
> Do you think there is a need for _Blender_ to have to know these class
> names? Or do you think only the GameEngine Actors have to know, and Blender
> should just pass the data uninterpreted?
>   
The best would be to have the actor's controller know how to interpret
the raw_structure. This would clearly keep Blender out of the
user/robotics-specific complexity. The only drawback of doing so is
that any function that is not supported in the controller will not be
possible (unless the Python API is extended and/or the simulation loop
is changed).
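Controller-side interpretation keyed on the advertised class name could be sketched as a small decoder registry. The class name "6dofphantom", the six-float layout, and the helper names are all assumptions made up for this example:

```python
import struct

# Hypothetical registry: the component_wrapper advertises a class
# name at connection time, and the controller picks the matching
# decoder. Blender itself only passes the raw bytes through.
DECODERS = {}

def decoder(class_name):
    def register(fn):
        DECODERS[class_name] = fn
        return fn
    return register

@decoder("6dofphantom")
def decode_6dof(raw: bytes):
    # Assumed layout: six big-endian floats (x, y, z, rx, ry, rz).
    return struct.unpack(">6f", raw[:24])

def interpret(class_name, raw):
    """What a Python Controller would call on incoming sensor data."""
    return DECODERS[class_name](raw)
```

Adding support for a new advertised class ("framebufferimage", say) then means registering one more decoder function, with no change to Blender itself.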

>> Instead of having blender implementing a
>> "data_structure" wrapper it would be possible to have
>> the data wrapper provided by the component_wrapper. For exemple at
>> connexion the component_wrapper can send a python script
>> implementing the decoding/access to the data structure.
>>     
>
> I don't understand what would be going on exactly in this scenario that you
> describe... More specifically, the sentence "send a python script
> implementing the decoding/access to the data structure" is not clear to me
>   
Sorry, it was off topic. It was a silly idea in which the script that
is needed to interpret the raw_structure is provided by the
component_wrapper (sending the text file when a sensor needs it, and
letting Blender load it as a Python script). But this was not really
relevant nor important.

>> Now something to keep in mind is that all the sensors are not only
>> "data_structure" but they can also do
>> complex processing like, in blenderTUIO i have sensor reacting to
>> condition like "finger_is_over_objet"
>> in which you have to test intersection with object of the scene. This is
>> something important to keep in mind
>> (in my case but maybe not for robotics).
>>     
>
> Of course also for robotics! :-) For example, grasp simulation comes to
> mind: you could let a Blender Actor do grasping, and send the position of,
> and forces on, the fingers to an external "haptic interface" or grasp
> controller, or estimator, or...
>   
This is funny: I have just been asked today to illustrate, for the
15th of May, a simulation of grasping for Christian Duriez:
http://www2.lifl.fr/~duriez/EG2008/contactSkinning.pdf
(but the simulation is done entirely using their framework:
http://www.sofa-framework.org/)

Bye for now and best regards,
Damien
