[Robotics] Integration of middlewares

Herman Bruyninckx Herman.Bruyninckx at mech.kuleuven.be
Fri Apr 17 17:34:18 CEST 2009


On Fri, 17 Apr 2009, Damien Marchal wrote:

> All the previous discussion are really interesting.
Indeed :-)

> A few more details about the GameEngine:
> In the GameEngine we have the following components: a Game contains
> several Scenes that contain several Actors. An actor has a set of
> Sensors connected to Controllers that are connected to Actuators.

I think this is a very often overlooked feature of the GameEngine: it _has_
already a "multi-agent" architecture which corresponds very well to what
people in robotics like to have... :-)
_Maybe_ there might be a need to assign Actors to individual "threads",
sooner or later; for example, one Actor could have to interface with a
slower external sensor, via some "busy waiting"... I seem to remember that
Blender 2.50 will have a more configurable threading support, but I am not
at all sure about that... (You seem to mention threading support in your
blenderTUIO documentation below...)

> An actor also has a State (I'm a bit weak on this point as I only use
> one State) that describes how the Sensors/Controllers/Actuators are
> connected and which ones are active.

> This kind of approach was chosen to allow people to design a game without
> writing a line of code. This is nice for human beings, but as a programmer
> you will always want to do something that is not supported and have to
> spend time on hacking :)

I think there is a middle way, which most people want to follow: to have
some "scripting" possibilities to customize the available features, without
having to "hack". So, I think that the robotics community should focus on
(i) identifying what "hacking" is required to extend the set of current
features to a level that is very attractive to robotics (and computer
animators!), and (ii) working on a framework of Python scripts that can be
put together in a "Blender robotics script repository", and whose
interoperability is guaranteed.
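
To give an idea of what such guaranteed interoperability could mean in
practice, here is a minimal sketch (all names and signatures below are
hypothetical, not a proposal): the repository scripts would simply agree on
one small Python interface, for example:

class RoboticsScript:
    """Hypothetical base class that every repository script derives from."""

    def configure(self, settings):
        """Receive a plain dict of settings; no Blender internals exposed."""
        raise NotImplementedError

    def update(self, inputs):
        """Called once per logic tick with a dict of named inputs;
        must return a dict of named outputs."""
        raise NotImplementedError


class PassThrough(RoboticsScript):
    """Toy example: forward only the inputs whose name matches a prefix."""

    def configure(self, settings):
        self.prefix = settings.get("prefix", "")

    def update(self, inputs):
        return {name: value for name, value in inputs.items()
                if name.startswith(self.prefix)}


script = PassThrough()
script.configure({"prefix": "finger"})
print(script.update({"finger_1": (0.1, 0.2, 0.3), "camera": None}))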

>>> To better define the problem, could you write down on the Wiki what a
>>> "generic interface" in the context of Blender would look like for a
>>> sensor like a camera, for instance? I'm not sure, for example, that the
>>> framerate of a virtual (Blender) camera is something we can easily
>>> control -> should it appear in the interface? etc...
>>>
>>
>> These are the right questions! Let me try to give a partial "answer"... :-)
>> (This "answer" is basically a summary of the reasons why "component based
>> programming" is needed in addition to "object oriented programming".)
>> - different classes of sensors provide "data structures" that can be shared
>>    by all members of the class. For example, in the camera case: an image is
>>    a matrix with pixel values, with some configuration parameters such as
>>    size, time stamp, gray value intervals, ...
>> - these "data structures" would be the only thing that should be encoded
>>    in Blender.
>> - Blender should get a "Read" and "Write" interface to the "outside world",
>>    where any activity within Blender can input or output such data
>>    structures in a _non blocking_ way.
>> - Blender should get a "component wrapper" around itself (whose use or
>>    not-use could be a simple configuration setting) that allows connecting the
>>    above-mentioned "Read"/"Write" data ports to "ports" offered by
>>    communication middleware. In the worst case, each middleware project that
>>    wants to integrate with Blender provides its own "Blender device driver"
>>    to make this happen. In the best case, Blender can have a "generic"
>>    interface to such middleware. (And the concrete middleware projects then
>>    have to add a "device driver" for this generic interface to their own
>>    code.)
>> - The "component wrapper" to communication middleware can have several
>>    (configurable!) policies: (un)buffered, FIFO, overwrite when new
>>    data arrives, ...
>> - The paragraphs above only deal with "data flow"; the next step is to
>>    introduce "event flow".
>>
>
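
To make the "Read"/"Write" port idea quoted above a bit more concrete, here
is a minimal sketch in plain Python (no Blender API; all names are
hypothetical) of a non-blocking data port with a configurable policy:

from collections import deque


class DataPort:
    """One-directional, non-blocking data port between Blender and a
    middleware "component wrapper" (names and API are hypothetical)."""

    def __init__(self, policy="overwrite", depth=10):
        # "overwrite": keep only the newest sample
        # "fifo":      keep up to 'depth' samples, drop the oldest first
        self.queue = deque(maxlen=1 if policy == "overwrite" else depth)

    def write(self, sample):
        """Non-blocking write: never waits, may silently drop old samples."""
        self.queue.append(sample)

    def read(self):
        """Non-blocking read: returns the next sample or None, never waits."""
        return self.queue.popleft() if self.queue else None


# The middleware side writes; a GameEngine controller reads when it runs.
camera_port = DataPort(policy="fifo", depth=3)
camera_port.write({"stamp": 0.04, "size": (640, 480), "pixels": b"..."})
print(camera_port.read())   # -> the image dict, or None when nothing arrived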
> I like the flexible way Herman has defined for Blender to get/set
> external data.
>
> Here is an example of such a schema:
> 1) A real-sensor
>  2) component_wrapper: from the framework data format to a blender
> in/out normalized format
>               |
>               | (udp, tcp, shmem)
>               |
"pipe", inter-process "message passing" (udp and tcp are only two specific
protocols in this context), CORBA ...

>  4) InOutSensor
>  5) controller: deserializes the data,
>                       interprets the data,
>                       processes something,
>                       serializes data for the actuators.

Would this (de)serialization functionality not be something that belongs in
the Blender code? Here you seem to suggest that each controller writer must
do it themselves (see the sketch below)...

>  6) InOutActuator
>             |
>             | (udp, tcp, shmem)
>             |
>  7) component_wrapper: converts the data back to the framework data
> format.
>  8) the real actuator

Can you explain which of these 8 points your C++ patches to the GameEngine
are covering?
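
To make my remark about point 5) concrete: if Blender does not offer the
(de)serialization itself, every controller script would have to repeat
boilerplate of roughly this shape (only a sketch; JSON is just a stand-in
for whatever encoding the middleware uses, and all names are made up):

import json


def controller_logic(sensor_bytes):
    """What every controller writer would have to repeat if Blender does
    not do the (de)serialization itself."""
    # 1. deserialize the data coming in through the InOutSensor
    sample = json.loads(sensor_bytes.decode("utf-8"))

    # 2. interpret and process it (here: a trivial proportional command)
    error = sample["target"] - sample["position"]
    command = {"velocity": 0.5 * error}

    # 3. serialize the result again for the InOutActuator
    return json.dumps(command).encode("utf-8")


print(controller_logic(b'{"position": 0.2, "target": 1.0}'))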

> In the context of Blender, maybe a first attempt can be made to support
> the Verse protocol for the network part.
> Verse seems "stalled" for years, but it is already part of mainstream
> Blender, which is definitely a plus to avoid
> maintaining a fork. After a quick look it seems that Verse can exchange
> texture data (in which you can store a bytestream).

Yes, but "re-using" one data flow to send another one "in disguise" is
asking for maintenance and semantics problems!!!

> But I totally agree that there are plenty of other protocols to transmit
> normalized data (CORBA, WS, ACE, Verse, OSC (TUIO), YARP).

Blender should not know about any of those... :-)

> About the "data_structure" that has to be implemented in Blender, I
> wonder if other options are possible.

"other options" than what, exactly?

> Let's suppose that the data_structure is bound to a "class" name. When
> the component_wrapper connects to a sensor or an actuator, it advertises
> this class name to Blender. Something like: I will send you "6dofphanom"
> or "framebufferimage".
Do you think there is a need for _Blender_ to have to know these class
names? Or do you think only the GameEngine Actors have to know, and Blender
should just pass the data uninterpreted?
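
To illustrate what I mean by "pass the data uninterpreted": Blender would
only forward an opaque (class name, payload) pair, and only the Actor's
controller would map class names to decoders. A toy sketch (all names are
hypothetical):

# Blender's only job: forward an opaque (class_name, payload) pair.
def blender_forward(port, class_name, payload):
    port.append((class_name, payload))


# Only the Actor's controller knows what the class names mean.
DECODERS = {
    "framebufferimage": lambda raw: {"kind": "image", "bytes": raw},
    "6dofpose":         lambda raw: {"kind": "pose", "values": list(raw)},
}


def actor_controller(port):
    while port:
        class_name, payload = port.pop(0)
        decoder = DECODERS.get(class_name)
        if decoder is not None:        # unknown classes: Blender never cared
            print(decoder(payload))


port = []
blender_forward(port, "6dofpose", (0.0, 0.1, 0.2, 0.0, 0.0, 1.0))
actor_controller(port)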

> Instead of having Blender implement a
> "data_structure" wrapper, it would be possible to have
> the data wrapper provided by the component_wrapper. For example, at
> connection time the component_wrapper can send a Python script
> implementing the decoding of / access to the data structure.

I don't understand what would be going on exactly in this scenario that you
describe... More specifically, the sentence "send a python script
implementing the decoding/access to the data structure" is not clear to me.
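
Do you mean something like the following, where the component_wrapper sends
the *source* of a small decoder once, at connection time, and Blender only
stores and executes it without having to understand the data structure?
(This is pure guesswork on my side; everything in the sketch is
hypothetical.)

# The wrapper would send this *text* once, when it connects:
DECODER_SOURCE = """
def decode(raw_bytes):
    # turn the wire format into a plain dict that Actor scripts can use
    x, y = raw_bytes.split(b';')
    return {'x': float(x), 'y': float(y)}
"""

# Blender's side: store and execute the received decoder without
# interpreting it...
namespace = {}
exec(DECODER_SOURCE, namespace)
decode = namespace["decode"]

# ...and later apply it to every sample that arrives on the data port.
print(decode(b"0.25;0.75"))    # -> {'x': 0.25, 'y': 0.75}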

> The result is that the data_structure encoding/decoding is now defined
> outside of Blender. You can even combine the two approaches by using a
> factory to select whether your wrapper is implemented as a hardcoded C++
> version or as a Python version.

> Now something to keep in mind is that sensors are not only about
> "data_structures": they can also do
> complex processing. For example, in blenderTUIO I have a sensor reacting
> to a condition like "finger_is_over_object",
> for which you have to test intersections with objects of the scene. This
> is something important to keep in mind
> (in my case, but maybe not for robotics).

Of course also for robotics! :-) For example, grasp simulation comes to
mind: you could let a Blender Actor do grasping, and send the position of,
and forces on, the fingers to an external "haptic interface" or grasp
controller, or estimator, or...
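
In terms of scripting, that could be as small as something like this per
logic tick (only a sketch; the finger poses and contact forces would come
from the GameEngine, the out-port from the component wrapper, and all names
are made up):

def publish_grasp_state(sim_time, fingers, out_port):
    """Collect finger poses and contact forces from the simulation and push
    them to an external grasp controller / haptic device (hypothetical API)."""
    message = {
        "stamp": sim_time,
        "fingers": [{"name": f["name"],
                     "position": f["position"],       # (x, y, z), world frame
                     "force": f["contact_force"]}     # (fx, fy, fz)
                    for f in fingers],
    }
    out_port.append(message)   # non-blocking write, as discussed above


out_port = []
publish_grasp_state(
    12.34,
    [{"name": "index", "position": (0.10, 0.00, 0.30),
      "contact_force": (0.0, 0.0, 1.5)}],
    out_port)
print(out_port)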

> Does collision detection have to be exposed to the Python API?

> About grabbing the Blender rendering context, this may require some
> tricks (to be sure to get the last frame),
> but it sounds possible. I have read that some people are reshaping the GE
> so that we can plug in much more easily to
> such events. A kind of Sensor triggered during
> "on_begin_frame/on_end_frame" could do this.

> Now about working together, do we already have a shared repository for
> source code? I can set one up, or maybe this can be asked of the Blender
> Foundation.
>
> I'm going on holiday for the next 2 weeks. So here is most of the code
> (for 2.48) for blenderTUIO:
> http://forge.lifl.fr/PIRVI/raw-attachment/wiki/MTUtils/blenderTUIO/sourcecode_blenderTUIO.zip
> I wasn't prepared to share this source code in its current state (sorry
> for its lame status), but I
> think it can be a hint about where to look to support new
> sensors/actuators.
>
>
> =========== Some low level details about blenderTUIO =============
> In blenderTUIO I have a TUIO sensor.
> A TUIOReceiver is shared by all the sensors and runs a thread receiving
> UDP packets. These UDP packets contain TUIO frames; the frames are
> decoded and then passed to the Sensors based on an observer/observable
> pattern. Using this information, all the TUIO sensors (that are connected
> to the objects) are fed with values like the position of the user's hand
> in 2D or 3D space, etc...
>
> For a touchtable, the sensor can be configured to trigger a controller if:
> - on_cursor: if a finger is on the surface, the sensor triggers its
>   controllers. The controllers then have to use the Python API to access
>   the sensor's values.
> - on_cursor_over: triggers the controller if a finger is over the object.
>   The Python API is different as it allows returning the collision
>   properties: "location of the hit, normal to the hit... etc."
> - on_cursor_over_any: triggers the controller if one of the fingers is
>   over any object in the scene... I don't like this one because it should
>   be a scene property and not an object one.
> Remark: on_cursor_over requires the geometrical description of the
> scene, while on_cursor is just a kind of measurement tool. Implementing
> it properly is a bit tricky, as the complete scene has to be tested for
> collision with the fingers. Currently this is done for each active
> on_cursor_over_* sensor, while it should be done once per scene.
>
> OK, so now let's have a quick look at the main GE 2.48a loop. The
> description may not be very precise, but it allows a better understanding
> of how/where we can "plug in".
>
> In gameengine/Ketsji/KX_KetsjiEngine.cpp most of the simulation loop is
> in the NextFrame method. It works this way:
>   - update physics geometry,
>   - update scene geometry,
>   - KX_Scene.cpp::LogicBeginFrame -> SCA_LogicManager.cpp::BeginFrame
>         1) for each sensor manager::BeginFrame (it will check for each
>            of its active sensors if they need to trigger); if so, insert
>            its controller into a "to_process" list.
>         2) for each entry in the to_process list: apply the controller.
>   - update (again) scene geometry to take the controller actions into
>     account.
>   - KX_Scene.cpp::LogicUpdateFrame -> SCA_LogicManager.cpp::UpdateFrame
>         3) apply the actuators.
>   - update the scene geometry and physics parameters
>   - physical simulation, integrate timestep
>   - update scene geometry
> ================ End of low level details ==================
Ok, thanks! I will need a lot more time to digest this, but I appreciate
your efforts, because I think they are an extremely useful first step to
help others that also want to improve Blender features!
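
Just to check my own reading of the loop you describe, here is a paraphrase
in plain Python (this is not the real C++ code, only the ordering as I
understood it, with throw-away stub classes):

class Sensor:
    def __init__(self, triggers, controllers):
        self.triggers, self.controllers = triggers, controllers

    def needs_to_trigger(self):
        return self.triggers


class Controller:
    def apply(self):
        print("controller applied")


class Actuator:
    def apply(self):
        print("actuator applied")


def next_frame(sensors, actuators, update):
    update("physics geometry")
    update("scene geometry")

    # LogicBeginFrame: 1) collect the controllers of triggering sensors...
    to_process = [c for s in sensors if s.needs_to_trigger()
                  for c in s.controllers]
    # ...and 2) apply them.
    for controller in to_process:
        controller.apply()

    update("scene geometry (controller actions)")

    # LogicUpdateFrame: 3) apply the actuators.
    for actuator in actuators:
        actuator.apply()

    update("scene geometry and physics parameters")
    update("physics: integrate timestep")
    update("scene geometry")


next_frame([Sensor(True, [Controller()])], [Actuator()],
           lambda what: print("update:", what))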

> Best regards,
> Damien.

Herman

