Calibration Best Practices

Accurate calibration is of key importance for performance in most machine vision and computer vision tasks. The following is a list of the best practices we have arrived at through extensive experimentation and theoretical considerations.

  1. Choose the right size calibration target. It must be large enough to properly constrain the camera parameters; ideally, it should cover at least half of the image area when seen fronto-parallel in the camera images.
  2. Perform calibration at the approximate working distance (WD) of your final application. The camera should be focused at this distance, and lens focus and aperture must not be changed during or after calibration.
  3. The target should have a high feature count, so fine patterns are preferable. However, at some point detection robustness suffers. Our recommendation is to use fine patterns only for cameras above 3 MPx, and only if the lighting is controlled and good.
  4. Collect images from different areas and tilts. Move the target to fully cover the image area and aim for even coverage. Lens distortion can be determined from fronto-parallel images alone, but estimating focal length and principal point depends on observing foreshortening. Include both fronto-parallel images and images taken with the board tilted up to +/- 45 degrees in both the horizontal and vertical directions. Tilting more is usually not a good idea, as feature localization accuracy suffers and can become biased.
  5. Use good lighting. This is often overlooked, but hugely important. The calibration target should preferably be diffusely lit by means of controlled photography lighting. Strong point light sources cause uneven illumination, which can make detection fail and wastes the camera's dynamic range. Shadows can have the same effect.
  6. Have enough observations. Usually, calibration should be performed on at least 6 observations (images) of the calibration target. If a higher-order camera or distortion model is used, more observations are beneficial.
  7. Consider using uniquely coded targets such as ChArUco boards. These allow you to gather observations from the very edges of the camera sensor and lens, and hence constrain the distortion parameters very well. They also allow you to collect data even when some of the feature points do not fulfil the other requirements.
  8. Calibration is only as accurate as the calibration target used. Use laser- or inkjet-printed targets only to validate and test.
  9. Mount the calibration target and camera properly. To minimize distortion and bow in larger targets, mount them either vertically or lying flat on a rigid support, and consider moving the camera instead of the target in these cases. Use a quality tripod, and avoid touching the camera during acquisitions.
  10. Remove bad observations. Carefully inspect reprojection errors, both per-view and per-feature. If any of these appear as outliers, exclude them and recalibrate.
  11. Obtaining a low reprojection error does not equal a good camera calibration; it merely indicates that the provided data/evidence can be described with the used model, which could be due to overfitting. Parameter uncertainties indicate how well the chosen camera model was constrained.
  12. Analyse the individual reprojection errors. Their direction and magnitude should not correlate with position, i.e. they should point chaotically in all directions. Our Camera Calibrator software provides powerful visualizations to investigate the reprojection errors.
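As a quick back-of-the-envelope check on points 1 and 2, the fronto-parallel coverage of a target at the working distance can be estimated from the pinhole model. The focal length, target size, and image width below are hypothetical examples, not recommendations:

```python
def coverage_fraction(focal_px, target_size_m, working_distance_m, image_size_px):
    """Fronto-parallel width of the target on the sensor, as a fraction of image width.

    Simple pinhole geometry: projected size in pixels = f * size / distance.
    """
    projected_px = focal_px * target_size_m / working_distance_m
    return projected_px / image_size_px

# Hypothetical setup: 1200 px focal length, 0.4 m wide target,
# 1 m working distance, 1920 px wide image.
frac = coverage_fraction(1200.0, 0.4, 1.0, 1920)
print(f"target covers {frac:.0%} of the image width")  # 25% -> consider a larger target
```

If the result falls well below the roughly 50% coverage recommended in point 1, a larger target (or a shorter working distance, if the application allows it) is worth considering.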
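Points 10 and 12 can be sketched in plain NumPy: project a synthetic board through an assumed pinhole model, compute per-view RMS reprojection errors, and drop views that stand out as outliers. The intrinsics, noise level, and outlier threshold below are illustrative assumptions, not a prescribed recipe:

```python
import numpy as np

def project_pinhole(K, pts3d):
    """Project camera-frame 3D points through an ideal, distortion-free pinhole."""
    norm = pts3d / pts3d[:, 2:3]        # divide by depth
    return (K @ norm.T).T[:, :2]        # apply intrinsic matrix, keep (u, v)

def per_view_rms(observed, reprojected):
    """RMS reprojection error of one view, in pixels."""
    return float(np.sqrt(np.mean(np.sum((observed - reprojected) ** 2, axis=1))))

# Hypothetical intrinsics and a flat 3x3 grid of board corners 1 m from the camera.
K = np.array([[1200.0, 0.0, 640.0],
              [0.0, 1200.0, 512.0],
              [0.0, 0.0, 1.0]])
xs, ys = np.meshgrid(np.arange(3) * 0.05, np.arange(3) * 0.05)
board = np.column_stack([xs.ravel(), ys.ravel(), np.ones(9)])

ideal = project_pinhole(K, board)
rng = np.random.default_rng(0)
views = [ideal + rng.normal(0.0, 0.1, ideal.shape) for _ in range(5)]
views.append(ideal + 5.0)               # one corrupted view, e.g. motion blur

errors = [per_view_rms(v, ideal) for v in views]
kept = [e for e in errors if e < 3.0 * np.median(errors)]  # simple outlier cut
print(f"kept {len(kept)} of {len(errors)} views; worst RMS = {max(errors):.2f} px")
```

In a real workflow the "observed" points come from the detector and the "reprojected" points from the fitted model, but the same per-view inspection applies: an outlier view should be excluded and the calibration rerun.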

Following these practices should ensure the most accurate and precise calibration possible. 


Have any questions, comments or additional insights? Post them below.


  • Do you need to take images at different distances for good range coverage as well?

  • @Brian: It sounds like you are working with a camera that can do some light internal image processing (e.g. sharpening filters). These alter the raw image to make it appear sharper. Generally, we would recommend turning off all image processing during calibration. Depending on the exact implementation, these algorithms could bias the found locations of image saddle points or circles. However, the effects could very well be negligible.
  • First off, I would like to thank your team for making such great calibration targets! My question deals with digital camera settings. I am using a fixed focal length camera and basically following the guidelines that you have set out in this document. Provided that you are getting a “good image”, do settings like sharpness, gamma, white balance, etc., impact the results of camera calibration? If so, do you have any recommendations for such settings?
    I keep going back and forth on whether the sharpness setting impacts the detection of chessboard corner intersections.

  • @Min-An Chao, yes, you are correct that the pinhole camera model is ideal in the sense that it is focus-free. Only with very small apertures do standard cameras begin to exhibit pinhole-like characteristics. Unfortunately, in most cases we are required to open the aperture in order to collect more light, and as a result we deviate from the pinhole model. However, in these cases we can assume pinhole-like characteristics where the image is in focus, and we perform a calibration at that focusing distance (fixed focal length) in order to estimate the pinhole model in that condition. By re-focusing we are effectively changing the distance between lens and sensor, which affects the focal length directly. The change in focal length is quite small, but for precision applications it can put a measurement setup out of spec. For ideal optics, the principal point (cx,cy) will theoretically not change. But in the real world, it might.

    To see the change in focal length caused by focusing, put your camera in manual focusing mode, close the aperture as much as you can, and use a lot of light to illuminate the scene. Now manually adjust the focus dial and observe how the image zooms in and out.
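    As a rough numerical sketch of this effect, the thin-lens equation predicts how the lens-to-sensor distance (which the pinhole focal length approximates) shifts when the focus distance changes. The 50 mm lens and the object distances below are hypothetical:

    ```python
    # Thin-lens sketch of the refocusing effect described above.
    # f_mm is the optical focal length of the lens; all numbers are hypothetical.
    def image_distance_mm(f_mm, object_distance_mm):
        """Lens-to-sensor distance needed to focus at the given object distance,
        from the thin-lens equation 1/f = 1/d_o + 1/d_i."""
        return f_mm * object_distance_mm / (object_distance_mm - f_mm)

    # A 50 mm lens focused at 2 m, then refocused at 1 m:
    far = image_distance_mm(50.0, 2000.0)   # ~51.28 mm
    near = image_distance_mm(50.0, 1000.0)  # ~52.63 mm
    print(f"refocusing from 2 m to 1 m shifts the sensor plane by {near - far:.2f} mm")
    ```

    A shift of over a millimetre on a 50 mm lens is small in absolute terms, but, as noted above, it can put a precision measurement setup out of spec.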
  • Hello,
    Thanks for the great guidelines and the informative Q&A section here. I'd like to follow up on @Tim's question, since the possibility of calibrating an autofocus camera has bothered me several times. Because the pinhole camera model has no notion of focus, applying this model to describe a camera matrix means that fine-tuning the distance between the image sensor and the lens set should not change the pinhole focal length f, nor the principal point (cx, cy) on the image plane. So what would autofocus affect or harm in the calibration results? Personally, I did the calibration for one camera where I can set manual focus digitally and precisely, repeated the calibration with precisely controlled motorized stages, and could not find significant differences in the camera matrix and distortion parameters with OpenCV scripts. Could you shed some light here? Thanks!


    Min-An Chao
