[Robotics] Middlewares: the sequel

Benoit Bolsee benoit.bolsee at online.be
Thu Sep 17 22:42:20 CEST 2009


Nice stuff! I see clearly how it will work for controllers, but what
about sensors? There is currently no 'Python' sensor. Of course, you can
always replace a sensor with a Python controller that runs every frame
and does some detection work.
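
For instance, here is a minimal sketch of such a pseudo-sensor: a GPS-like
Python controller, assuming the 2.49 GameLogic API, an Always sensor in
pulse mode wired to the controller, and a made-up 'gps' property name:

  # Python controller acting as a pseudo-sensor: it runs every frame
  # (triggered by an Always sensor in pulse mode) and samples the
  # owner's position, storing it in a game property.
  import GameLogic

  cont = GameLogic.getCurrentController()
  own = cont.owner

  # Read the robot's position in world coordinates.
  x, y, z = own.worldPosition

  # Store the reading in the (hypothetical) 'gps' property so another
  # controller -- or a middleware bridge -- can pick it up.
  own['gps'] = (x, y, z)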

But for performance reasons you might want to implement some C++ sensors.
It's fairly easy to simulate a camera sensor: all the code needed to
render the scene from a camera object is already present in the GE; it
just needs to be put together. Using OpenGL filters, you could even
simulate noise, a black-and-white camera, etc. It would even be possible
to simulate an ultrasonic or radar sensor by processing the Z-buffer.
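
In the meantime, a crude single-ray approximation of an ultrasonic sensor
can already be done from a Python controller with rayCast(); this is only
a sketch under the same assumptions as above (2.49 API, Always sensor in
pulse mode), with a made-up 'range' property and an arbitrary 10 m range:

  # Cast a ray along the sensor's local +Y axis and report the
  # distance to the first obstacle hit, if any.
  import GameLogic

  MAX_RANGE = 10.0  # metres, arbitrary for the example

  cont = GameLogic.getCurrentController()
  own = cont.owner

  origin = own.worldPosition
  direction = own.getAxisVect([0.0, 1.0, 0.0])
  target = [origin[i] + direction[i] * MAX_RANGE for i in range(3)]

  hit_object, hit_point, hit_normal = own.rayCast(target, origin, MAX_RANGE)

  if hit_object is not None:
      own['range'] = own.getDistanceTo(hit_point)
  else:
      own['range'] = MAX_RANGE

Processing the full Z-buffer would of course give a proper range image
rather than a single ray, but that part is better left to C++.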

/benoit

> -----Original Message-----
> From: robotics-bounces at blender.org 
> [mailto:robotics-bounces at blender.org] On Behalf Of Séverin Lemaignan
> Sent: Tuesday, 15 September 2009 9:57
> To: Blender and Robotics
> Subject: [Robotics] Middlewares: the sequel
> 
> 
> Hello everyone,
> 
> I've put on the wiki the current state of our design ideas for a
> component-based, middleware-independent simulator (check the last two
> sections on this page:
> http://wiki.blender.org/index.php/Robotics:Simulators/OpenRobots).
> 
> We are currently implementing the first examples of such components
> (starting with a GPS and a simple mobile robot), but it's still at the
> drafting stage, and your reviews are very welcome.
> 
> It would be especially interesting to think about how to integrate what
> has been done on the arm into this workflow (if it's relevant). I'm also
> interested in Stefaan's views regarding the possible use of channels
> within the component specification.
> 
> By the way, we have a meeting tomorrow at LAAS-CNRS and ONERA to update
> the roadmap. We'll keep you posted, but the target for a first, usable
> release of the OpenRobots simulator is likely to be mid-March.
> 
> Best regards,
> Severin
> 


