[Robotics] Middlewares: the sequel

Benoit Bolsee benoit.bolsee at online.be
Tue Sep 29 12:12:00 CEST 2009


Hi all,

I just found out that authorization is no longer necessary to get edit
rights in the wiki. As soon as you register, you automatically get edit
rights. That's true democracy :-)

You just need to create an account on
http://wiki.blender.org/index.php?title=Special:Userlogin&returnto=Robotics:Index

Enjoy,
Benoit

> -----Original Message-----
> From: robotics-bounces at blender.org
> [mailto:robotics-bounces at blender.org] On Behalf Of Gilberto Echeverria
> Sent: lundi 28 septembre 2009 18:25
> To: Blender and Robotics
> Subject: Re: [Robotics] Middlewares: the sequel
> 
> 
> Hello,
> 
> I have made an example of using VideoTexture to export the video from
> a camera and send it through a YARP port. This way it is no longer
> necessary to have additional viewports for every camera in the scene.
> I'm now documenting the process in the Blender for Robotics wiki, and
> will have the example posted there ... as soon as I have permission to
> upload images and files to the wiki. Who should I ask for that
> authorization?
> 
> Cheers.
> 
> Gilberto
> 
> Benoit Bolsee wrote:
> > Hi Gilberto,
> >
> > First, your question about the Blender module: yes, it's gone in 2.5
> > and replaced by RNA. You have the (still partial) documentation here:
> > http://www.graphicall.org/ftp/ideasman42/html/bpy-module.html
> >
> > The principle is that you get a reference to the data block you want
> > to manipulate from the context
> > (http://www.graphicall.org/ftp/ideasman42/html/bpy.types.Context-class.html):
> > you start from the main and dig through it.
> >
> > Afaik it's not available in the GE yet. So in the GE you must stick
> > to the GE API only, which hasn't changed.
> >
> > I don't have information about the OpenGL functions, I will ask on
> > IRC.
> >
> > Regarding your stereo camera sensor, you need to look at the
> > VideoTexture module; it can do a custom render from a selected
> > camera. The main functions of the VideoTexture module are video
> > streaming to a GPU texture and Render to Texture. The latter feature
> > can be intercepted to retrieve the custom render in Python buffers.
> >
> > There is a wiki page about VideoTexture but it doesn't cover the
> > advanced features:
> > http://wiki.blender.org/index.php/Dev:Source/GameEngine/2.49/VideoTexture
> >
> > In particular, Render to Texture is not covered, but you can find a
> > lot of details in the commit log:
> > http://lists.blender.org/pipermail/bf-blender-cvs/2008-November/016793.html
> >
> > These examples assume that you want to update a GPU texture, but it
> > is possible to intercept the image buffer and bypass the Texture
> > update. Instead of calling the Texture object's refresh function, you
> > can directly take the image buffer with this command sequence:
> >
> > image = GameLogic.video.source.image
> > if image:
> >     # ... custom processing of the raw image buffer
> >     pass
> > GameLogic.video.source.refresh()
> >
> > Here 'video' is the Texture object and 'source' is the ImageRender
> > object. Note that this method allows you to keep the image at its
> > original render size at no cost: just set
> > GameLogic.video.source.scale = False during initialization. If you
> > want to apply the render to a texture and also process the image, you
> > need the following sequence:
> >
> > GameLogic.video.refresh(False)  # as opposed to refresh(True)
> > image = GameLogic.video.source.image
> > if image:
> >     # ... custom processing of the raw image buffer
> >     pass
> > GameLogic.video.source.refresh()
> >
> > Note that in that case you must set GameLogic.video.source.scale =
> > True during initialization, otherwise scaling is much too slow; the
> > size of the image is then reduced to the largest power of 2 that is
> > equal to or lower than the render size. Applying the render to a
> > texture allows you to display what the robot sees inside the GE scene
> > (for example on a wall, or on a panel on the robot itself). It can be
> > handy for debugging or just for fun.
> >
> > One last comment: GameLogic.video.source.image builds and returns a
> > very large string buffer. For performance reasons, you should access
> > it only once per script. The format is a string containing the image
> > as binary RGBA data, line by line.
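Since the buffer is plain binary RGBA stored line by line, it can be unpacked with no GE-specific code at all; a small sketch, where the buffer would come from GameLogic.video.source.image (the two-pixel test buffer below is made up for illustration):

```python
import struct

def rgba_rows(buf, width, height):
    # Split a raw RGBA byte string, stored line by line, into rows of
    # (r, g, b, a) tuples. In the GE, buf would be the string returned
    # by GameLogic.video.source.image.
    if len(buf) != width * height * 4:
        raise ValueError("buffer size does not match width x height")
    return [[struct.unpack_from("4B", buf, (y * width + x) * 4)
             for x in range(width)]
            for y in range(height)]

# Hypothetical 2x1 image: one opaque red pixel, one opaque blue pixel.
buf = bytes([255, 0, 0, 255, 0, 0, 255, 255])
print(rgba_rows(buf, 2, 1))  # -> [[(255, 0, 0, 255), (0, 0, 255, 255)]]
```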
> >
> > I hope you can find your way through this massive amount of
> > information!
> >
> > /benoit
> >
> >
> >> -----Original Message-----
> >> From: robotics-bounces at blender.org 
> >> [mailto:robotics-bounces at blender.org] On Behalf Of
> Gilberto Echeverria
> >> Sent: lundi 21 septembre 2009 15:31
> >> To: Blender and Robotics
> >> Subject: Re: [Robotics] Middlewares: the sequel
> >>
> >>
> >> Hi Benoit,
> >>
> >> We are currently trying to implement a stereo camera sensor. As you
> >> mentioned, there are no Python sensors at the moment, so our current
> >> approach is to use the Game Engine sensors connected to a Python
> >> controller.
> >>
> >> The way we're trying to implement the stereo camera is by having
> >> two Blender camera objects on the robot, plus the scene camera. We
> >> want to generate screen captures of what the cameras on board the
> >> robot see, but without changing the user's view of the scene. As in
> >> Severin's example, we can capture a portion of the Blender screen
> >> using glReadPixels, but this requires the view from the robot camera
> >> to be shown on-screen. So what we need is off-screen rendering of
> >> the additional cameras. I'm looking at how this can be done in
> >> Blender 2.5.
> >>
> >> From the other threads I see that Blender 2.5 no longer includes
> >> the Blender module. So where can I now find the BGL or equivalent
> >> module to make use of OpenGL functions? Is there any documentation
> >> available at the moment?
> >>
> >> Cheers
> >> Gilberto
> >>
> >>
> >> Benoit Bolsee wrote:
> >>> Nice stuff! I see clearly how it will work for controllers, but
> >>> what about sensors? There is currently no 'python' sensor. Of
> >>> course you can always replace a sensor with a Python controller
> >>> running every frame and doing some detection work.
> >>>
> >>> But for performance reasons you might want to implement some C++
> >>> sensors. It's fairly easy to simulate a camera sensor: all the code
> >>> necessary to render the scene from a camera object is present in
> >>> the GE. It just needs to be put together. Using OGL filters, you
> >>> could even simulate noise, a black & white camera, etc. It would
> >>> even be possible to simulate an ultrasonic or radar sensor by
> >>> processing the Z-buffer.
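The Z-buffer idea amounts to inverting the perspective projection: a normalized depth value can be mapped back to an eye-space distance, which is what a range sensor would report. A sketch under the standard OpenGL depth convention (the 0.1 and 100.0 clip planes are assumed values for illustration, not GE defaults):

```python
def linearize_depth(d, znear, zfar):
    # Map a normalized depth-buffer value d in [0, 1] back to an
    # eye-space distance, given the camera's near/far clip planes,
    # by inverting the standard OpenGL perspective projection.
    z_ndc = 2.0 * d - 1.0  # window depth -> normalized device coords
    return (2.0 * znear * zfar) / (zfar + znear - z_ndc * (zfar - znear))

# d = 0 lies on the near plane, d = 1 on the far plane.
print(linearize_depth(0.0, 0.1, 100.0))  # near-plane distance (~0.1)
print(linearize_depth(1.0, 0.1, 100.0))  # far-plane distance (~100.0)
```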
> >>>
> >>> /benoit
> >>>
> >>>> -----Original Message-----
> >>>> From: robotics-bounces at blender.org
> >>>> [mailto:robotics-bounces at blender.org] On Behalf Of Séverin
> >>>> Lemaignan
> >>>> Sent: mardi 15 septembre 2009 9:57
> >>>> To: Blender and Robotics
> >>>> Subject: [Robotics] Middlewares: the sequel
> >>>>
> >>>>
> >>>> Hello everyone,
> >>>>
> >>>> I've put on the wiki the current state of our design ideas for a
> >>>> component-based, middleware-independent simulator (check the last
> >>>> two sections on this page:
> >>>> http://wiki.blender.org/index.php/Robotics:Simulators/OpenRobots).
> >>>>
> >>>> We are currently implementing the first examples of such
> >>>> components (starting with a GPS and a simple mobile robot), but
> >>>> it's still at the drafting stage, and your reviews are very
> >>>> welcome.
> >>>>
> >>>> It would be especially interesting to think of how to integrate
> >>>> what has been done on the arm into this workflow (if it's
> >>>> relevant). I'm also interested in Stefaan's views regarding the
> >>>> possible use of channels within the component specification.
> >>>>
> >>>> By the way, we have a meeting tomorrow at LAAS-CNRS and ONERA to
> >>>> update the roadmap. We'll keep you posted, but the target for a
> >>>> first usable release of the OpenRobots simulator is likely to be
> >>>> mid-March.
> >>>>
> >>>> Best regards,
> >>>> Severin
> >>>>
> >>> _______________________________________________
> >>> Robotics mailing list
> >>> Robotics at blender.org 
> >>> http://lists.blender.org/mailman/listinfo/robotics
> >
> 


