[Robotics] Camera calibration parameters

Simon Lacroix Simon.Lacroix at laas.fr
Wed May 12 19:30:53 CEST 2010


	Benoit,

I am not sure the 'near' attribute can be considered a focal
length. AFAIK, it is a clipping attribute, and modifying it does not
influence the field of view -- as a focal length modification would.
One dimension is actually missing from the Blender camera attributes to
establish a one-to-one link between a Blender camera and a real one.
The good thing is that in computer vision we only need the ratio
f/pixSize, which suffices to define the central projection that models
the camera, and which is defined by the 'lens' attribute (what muddies
things a bit is that the documentation says this attribute is in mm --
but not being a Blender user, I did not dig much further).

Simon

On 12 May 10, at 18:24, Benoit Bolsee wrote:

> Hi Simon,
>
> You got the point, I think.  If you want the physical pixel size,
> it is easily computed with
>
> pixSize_u = near * 32.0 / lens / capsize[0]
>
> which means that the focal distance is camera.near if I insert this
> value in your a_u formula.
>
> Camera.near is the distance from the camera position to the projection
> plane where the pixels are formed. For a real camera, that's where the
> CCD plane would be.  I guess it makes sense to call it the focal
> distance.
>
>
>
>> -----Original Message-----
>> From: robotics-bounces at blender.org
>> [mailto:robotics-bounces at blender.org] On Behalf Of Simon Lacroix
>> Sent: Wednesday 12 May 2010 17:59
>> To: Blender and Robotics
>> Subject: Re: [Robotics] Camera calibration parameters
>>
>>
>> 	Benoit,
>>
>> Thanks for the information, it answers our question!
>> To make things clear: the 'pixsize' as defined by your formula is the
>> size of the scene element corresponding to one pixel in the image,
>> whereas the pixel size Gilberto was originally referring to is "the
>> size of one pixel of the image plane", i.e. of a CCD / CMOS imaging
>> device.
>>
>> The clue we were missing is the following:
>>
>> On 11 May 10, at 20:55, Benoit Bolsee wrote:
>>> [...]
>>> The lens attribute is a value that represents the
>>> distance in Blender units at which the largest image
>>> dimension is 32.0 Blender units.
>>
>> Explanation: the intrinsic calibration matrix K of a camera is:
>>
>> ( a_u   0    u_0 )
>> (  0   a_v   v_0 )
>> (  0    0     1  )
>>
>> This matrix transforms a 3D point P(X,Y,Z) in the 3D camera frame to
>> a 2D point p(u,v) in the image plane, in pixel coordinates: p = KP
>> (see e.g.
>> http://homepages.inf.ed.ac.uk/rbf/CVonline/LOCAL_COPIES/OWENS/LECT9
>> for frame and notation conventions, but this is not necessary here).
>>
>> (u_0, v_0) represent the intersection of the optical axis with the
>> image plane, in pixel coordinates (for a Blender "ideal" camera, they
>> are capsize[0]/2 and capsize[1]/2).
>> (a_u, a_v) represent the focal length expressed in pixels:
>> a_u = f / pixSize_u, where pixSize_u is the physical size of a pixel
>> of the imaging device along the u image direction, and f is the lens
>> focal length.
>>
>> Now that we know the meaning of the attribute 'lens', we can compute
>> a_[u|v]:
>> a_u = capsize[0] * lens / 32.0
>> (and we have to take the aspect ratio into account if the pixels are
>> non-square).
>>
>> Simon
>>
>>
>>
>>>
>>>> -----Original Message-----
>>>> From: robotics-bounces at blender.org
>>>> [mailto:robotics-bounces at blender.org] On Behalf Of Gilberto
>>>> Echeverria
>>>> Sent: Tuesday 11 May 2010 16:03
>>>> To: robotics at blender.org
>>>> Subject: Re: [Robotics] Camera calibration parameters
>>>>
>>>>
>>>> Thanks for the suggestions. It would indeed be strange to calibrate
>>>> the camera in the simulation. We expected to avoid this step inside
>>>> Blender, since the required data should already be defined
>>>> somewhere in Blender. The focal length seems to be a modifiable
>>>> property of the camera object in Blender, but the image size is
>>>> still eluding us. For the moment we'll try to estimate this
>>>> measurement from the pattern image.
>>>>
>>>> Thanks.
>>>> Gilberto
>>>>
>>>>
>>>> On 05/11/2010 03:28 PM, Paul Fitzpatrick wrote:
>>>>> Another (somewhat strange, but fun) method would be to run a
>>>>> standard calibration program (e.g. from OpenCV).  You'd need to
>>>>> make a plane with a pattern like this on it:
>>>>> http://www.vision.caltech.edu/bouguetj/calib_doc/htmls/pattern.pdf
>>>>> Then light it, animate it moving around, record several views from
>>>>> the camera, and run the calibration program.
>>>>>
>>>>> Cheers,
>>>>> Paul
>>>>>
>>>>> On 05/11/2010 06:41 AM, koen buys wrote:
>>>>>
>>>>>> Hi Gilberto,
>>>>>>
>>>>>> I once figured this out in a previous version of Blender,
>>>>>> together with Herve Legrand. I will forward you our experiments
>>>>>> from back then. You need both the intrinsic and the extrinsic
>>>>>> calibration matrix. The extrinsic one should be fairly easy to
>>>>>> find. The intrinsic one was more difficult. I hope they didn't
>>>>>> change the API of that part.
>>>>>>
>>>>>> Best regards,
>>>>>>
>>>>>> Koen Buys
>>>>>>
>>>>>>
>>>>>> On 11 May 2010 11:14, Gilberto Echeverria
>>>>>> <gilberto.echeverria at laas.fr>  wrote:
>>>>>>
>>>>>>    Hello everyone,
>>>>>>
>>>>>>    As part of the Open Robots Simulator (now called MORSE), we
>>>>>>    are now connecting Blender with other robotics software
>>>>>>    modules that do the processing of the data sent by the
>>>>>>    simulation. In the case of image data from a camera, these
>>>>>>    modules expect the image to have an associated calibration
>>>>>>    matrix.
>>>>>>
>>>>>>    The calibration matrix is composed of the focal point, the
>>>>>>    focal length, and the dimensions of the acquired image as a
>>>>>>    number of pixels per unit of distance. For the simulated
>>>>>>    camera in Blender, we can consider the focal point to be
>>>>>>    (0,0). For the other two parameters we have been unable to
>>>>>>    find the data in Blender. We are using the Blender camera
>>>>>>    object and the VideoTexture module to generate captured
>>>>>>    images. Does anyone know where this information could be
>>>>>>    found in Blender?
>>>>>>
>>>>>>    Best regards
>>>>>>
>>>>>>    Gilberto
>>>>>>    _______________________________________________
>>>>>>    Robotics mailing list
>>>>>>    Robotics at blender.org
>>>>>>    http://lists.blender.org/mailman/listinfo/robotics
>>>>>>


