[Robotics] Update on ORS (article for BlenderNation)

Séverin Lemaignan severin.lemaignan at laas.fr
Tue Apr 20 10:21:16 CEST 2010


Hello everyone!

I'm about to send an update to BlenderNation (I promised it to Bart 
one year ago :-)) regarding what has been achieved so far in the 
"Blender for Robotics" project.

Can you read and complete it?

I'd like to send it this afternoon, so please comment before 16:00.

Cheers,
Severin

----------

Hello BlenderNation!

A bit more than a year later, here is an update on the Blender for 
Robotics project: behind the scenes, a lot has happened over the last 
months. Blender for Robotics is an effort by several European robotics 
labs to build, on top of the Blender Game Engine, a set of tools for 
roboticists, including a new, multi-purpose simulator.

The first round of development started at KU Leuven, where Benoit 
Bolsee worked for six months last year to rewrite the IK solver with 
the Game Engine in mind, resulting in the so-called iTaSC branch. 
Demos here: http://vimeo.com/5771805, http://vimeo.com/5857397

Then, at the beginning of July 2009, two full-time engineers, Nicolas 
Lassabe at ONERA and Gilberto Echeverria at LAAS-CNRS (both in 
Toulouse), started working on the simulator itself. We have called it 
the OpenRobot Simulator (ORS).

ORS has been designed to be multi-purpose (simulation of field 
robotics, indoor robotics, multi-robot systems) and to allow 
simulation at different levels of abstraction: on the sensor side, 
from raw sensory output to higher-level semantic information; on the 
actuator side, from elementary commands processed by the physics 
engine to higher-level (less realistic) motions.

To build a robot, a set of Blender files (one for each sensor or 
actuator) is assembled together with a model of a robotic base (we 
currently have outdoor iRobot ATRVs, a helicopter model and a 
custom-made one-arm indoor robot).
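
To make this concrete, here is a purely illustrative sketch (the file 
names, dictionary keys and helper below are made up for the example, 
not taken from the actual ORS code) of how such an assembly could be 
described in Python before the component .blend files are linked into 
the simulation scene:

    # Hypothetical description of a simulated robot: a robotic base
    # plus the component .blend files (sensors, actuators) attached
    # to it.
    atrv_robot = {
        "base": "atrv.blend",               # outdoor iRobot ATRV base
        "components": [
            {"file": "stereo_camera.blend", "parent": "mast",
             "position": (0.0, 0.0, 0.9)},  # metres, relative to parent
            {"file": "gps.blend", "parent": "body",
             "position": (0.0, 0.0, 0.4)},
            {"file": "gyroscope.blend", "parent": "body",
             "position": (0.0, 0.0, 0.2)},
        ],
    }

    def list_blend_files(robot):
        """Return every .blend file to link into the simulation scene."""
        return [robot["base"]] + [c["file"] for c in robot["components"]]

    print(list_blend_files(atrv_robot))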

Each sensor or actuator can work at different levels of realism: for 
instance, a camera can be set to export the raw OpenGL buffer, a 
depth map, a pre-segmented image or even a "cognitive map" with the 
names and positions of the objects the robot sees. These different 
levels of realism are very important when working on large robotics 
projects, since each researcher may want to test their specific 
algorithms without the burden of running the complete processing stack.
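
As a rough illustration of how one sensor can serve several levels of 
realism (again, a sketch only: the level names and dummy processing 
functions are invented for the example, not taken from ORS), a camera 
component could simply dispatch on a configurable output level:

    # Illustrative only: one camera, several output levels.
    def raw_image(scene):
        """Would read back the OpenGL colour buffer rendered by the BGE."""
        return scene["framebuffer"]

    def depth_map(scene):
        """Would read back the depth buffer instead."""
        return scene["zbuffer"]

    def cognitive_map(scene):
        """Skip image processing entirely: return the names and
        positions of the visible objects, as known to the simulator."""
        return [(obj["name"], obj["position"]) for obj in scene["visible"]]

    OUTPUT_LEVELS = {"raw": raw_image,
                     "depth": depth_map,
                     "semantic": cognitive_map}

    def camera_output(scene, level="raw"):
        return OUTPUT_LEVELS[level](scene)

A researcher working on, say, navigation can then run the "semantic" 
level and never pay the cost of rendering and segmenting real images.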

For now, we can simulate mono and stereo cameras, laser scanners, 
gyroscopes and GPS. We also have simple actuators to move robots by 
setting speeds or by following waypoints.
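
At its core, a waypoint actuator just turns a target position into 
velocity commands. Here is a generic sketch of such a controller (a 
plain proportional controller, not the ORS implementation; the gains 
are arbitrary):

    import math

    def waypoint_controller(x, y, yaw, wx, wy,
                            max_speed=1.0, tolerance=0.2):
        """Given the robot pose (x, y, yaw) and a waypoint (wx, wy),
        return a (linear, angular) speed command."""
        dx, dy = wx - x, wy - y
        distance = math.hypot(dx, dy)
        if distance < tolerance:
            return 0.0, 0.0                       # waypoint reached
        heading_error = math.atan2(dy, dx) - yaw
        # wrap the error to [-pi, pi]
        heading_error = math.atan2(math.sin(heading_error),
                                   math.cos(heading_error))
        linear = min(max_speed, distance)
        angular = 2.0 * heading_error             # arbitrary gain
        return linear, angular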

To be useful, a simulator must be able to communicate with the real 
robot control stack. This is usually achieved by relying on so-called 
middlewares that abstract the components of the robot (it is thus 
possible, for example, to replace a real camera with a simulated one 
without altering the vision algorithms that use it). Since there is 
no standard middleware, ORS is planned to rely on an open-source tool 
currently developed at LAAS and called Genom3, a template-based 
module generator. Once a template has been written for a given 
middleware, all the simulator modules can be generated automatically: 
simulator developers only need to write each sensor and actuator 
once, and the whole robotics community can use them.
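
The design idea is that a simulated component only talks to an 
abstract middleware layer, while the concrete binding (YARP, 
Pocolibs, and later others) is what gets generated from a Genom3 
template. A minimal sketch of that abstraction, with made-up class 
and port names:

    class Middleware(object):
        """Abstract output channel for a simulated sensor (illustrative)."""
        def publish(self, port, data):
            raise NotImplementedError

    class ConsoleMiddleware(Middleware):
        """Stand-in binding that just prints; a YARP or Pocolibs
        binding would instead send the data on the robot's network."""
        def publish(self, port, data):
            print("%s <- %r" % (port, data))

    def export_gps(middleware, position):
        # The sensor never needs to know which middleware is plugged in.
        middleware.publish("/robot/gps", position)

    export_gps(ConsoleMiddleware(), (3.2, -1.5, 0.0))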

This feature is not complete yet; for our development needs, we have 
so far implemented only two middlewares, YARP (and Orocos via YARP) 
and Pocolibs.

Some more videos: http://vimeo.com/9825826, http://vimeo.com/9825888 
(in this video, note the screen on the wall that streams the robot's 
camera, thanks to Benoit's VideoTexture extension).

By the way, as you may have noticed in the videos, we are still 
transitioning from Blender 2.49 to Blender 2.5 (it took a while 
because some features important to us had not been ported yet), but 
rest assured that ORS will run on Blender 2.5!

The OpenRobot Simulator is not ready yet for broad distribution (we 
will let you know when we release version 1.0!). Mostly missing: good 
documentation explaining how to build and start a simulation scene, a 
dedicated UI with drag & drop of components (robots, sensors, 
actuators...), integration with Genom3 and templates for other 
middlewares like ROS (if some ROS developers are interested in 
helping, please drop us a mail!), and a lot of polish! However, if 
you are in a hackish mood, you can check out the public Git 
repository:

    git clone http://trac.laas.fr/git/robots/morse.git

and hang out on morse-dev at laas.fr (but please accept that we won't 
answer all user questions until we make a first official release!).

-- 
            Séverin Lemaignan - severin.lemaignan at laas.fr
    [00]    PhD student on Cognitive Robotics
   /|__|\   LAAS-CNRS - RIS group / Technische Uni München - IAS group
     ''     +33561337844 / +498928917780
            http://www.laas.fr/~slemaign


