OpenCV  4.1.2-pre
Open Source Computer Vision
Interactive camera calibration application

With the classical calibration technique, the user must collect all data first and then run the cv::calibrateCamera function to obtain the camera parameters. If the average re-projection error is huge, or if the estimated parameters seem to be wrong, the process of selecting or collecting data and running cv::calibrateCamera has to be repeated.

The interactive calibration process instead lets the user see the results and error estimates after each new portion of data. The last data portion can be deleted, and finally, once the calibration dataset is big enough, the automatic data selection process starts.

Main application features

The sample application will:

Supported patterns:

Description of parameters

The application has two groups of parameters: primary (passed through the command line) and advanced (passed through an XML file).

Primary parameters:

All of these parameters are passed to the application through the command line.

-[parameter]=[default value]: description

Advanced parameters:

By default, the values of the advanced parameters are stored in defaultConfig.xml:

<?xml version="1.0"?>
<opencv_storage>
<camera_resolution>1280 720</camera_resolution>
</opencv_storage>

Note: charuco_dict, charuco_square_lenght and charuco_marker_size are used for chAruco pattern generation (see Aruco module description for details: Aruco tutorials)

Default chAruco pattern:


Dual circles pattern

To make this pattern you need the standard OpenCV circles pattern and a binary-inverted copy of it. Place the two patterns on one plane so that all horizontal lines of circles in one pattern are continuations of the corresponding lines in the other. Measure the distance between the patterns as shown in the picture below and pass it as the dst command line parameter. Also measure the distance between the centers of the nearest circles and pass this value as the sz command line parameter.


This pattern is very sensitive to production quality and measurement accuracy.

Data filtration

When the size of the calibration dataset exceeds max_frames_num, the data filter starts working. It tries to remove "bad" frames from the dataset: the filter removes the frame on which \(loss\_function\) takes its maximum.

\[loss\_function(i)=\alpha RMS(i)+(1-\alpha)reducedGridQuality(i)\]

RMS is the average re-projection error calculated for frame i, and reducedGridQuality is the scene-coverage quality evaluated without frame i. \(\alpha\) equals frame_filter_conv_param.
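A minimal sketch of this filtering step, assuming the per-frame RMS and reducedGridQuality values have already been computed (the numbers below are made up):

```python
# alpha corresponds to frame_filter_conv_param; rms[i] and
# reduced_grid_quality[i] are assumed to be precomputed per frame.
def worst_frame(rms, reduced_grid_quality, alpha):
    losses = [alpha * r + (1.0 - alpha) * q
              for r, q in zip(rms, reduced_grid_quality)]
    return max(range(len(losses)), key=losses.__getitem__)

rms = [0.4, 1.9, 0.5, 0.6]                   # per-frame re-projection error
reduced_grid_quality = [0.2, 0.3, 1.5, 0.2]  # coverage quality without frame i
idx = worst_frame(rms, reduced_grid_quality, alpha=0.1)
print(idx)  # this frame would be removed from the dataset
```

A redundant frame (one whose removal still leaves coverage quality high) or a frame with a large re-projection error gets the largest loss and is dropped first.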

Calibration process

To start calibration, just run the application. Place the pattern in front of the camera and hold it fixed in some pose. Then wait for capturing (a message like "Frame #i captured" will be shown). The current focal distance and re-projection error are shown on the main screen. Move the pattern to the next position and repeat the procedure. Try to cover the image plane uniformly, and do not show the pattern at sharp angles to the image plane.
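To make "cover the image plane uniformly" concrete, here is a toy coverage measure (not the application's actual metric): split the image into a grid and count the fraction of cells hit by detected pattern points.

```python
def coverage(points, width, height, n=10):
    """Fraction of the n x n grid cells containing at least one point."""
    cells = {(int(x * n / width), int(y * n / height))
             for (x, y) in points
             if 0 <= x < width and 0 <= y < height}
    return len(cells) / float(n * n)

# Views clustered in one corner cover almost nothing...
corner = [(20 + i, 15 + j) for i in range(8) for j in range(6)]
# ...while views spread over the frame cover every cell.
spread = [(x, y) for x in range(64, 1280, 128) for y in range(36, 720, 72)]
print(coverage(corner, 1280, 720), coverage(spread, 1280, 720))
```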


If the calibration seems to be successful (the confidence intervals and the average re-projection error are small, and the frame-coverage quality and the number of pattern views are big enough), the application will show a message like the one on the screen below.


Hot keys:


As a result, you will get the camera parameters and confidence intervals for them.

Example of output XML file:

<?xml version="1.0"?>
<opencv_storage>
<calibrationDate>"Thu 07 Apr 2016 04:23:03 PM MSK"</calibrationDate>
<cameraResolution>
1280 720</cameraResolution>
<cameraMatrix type_id="opencv-matrix">
<rows>3</rows>
<cols>3</cols>
<dt>d</dt>
<data>
1.2519588293098975e+03 0. 6.6684948780852471e+02 0.
1.2519588293098975e+03 3.6298123112613683e+02 0. 0. 1.</data></cameraMatrix>
<cameraMatrix_std_dev type_id="opencv-matrix">
<data>
0. 1.2887048808572649e+01 2.8536856683866230e+00</data></cameraMatrix_std_dev>
<dist_coeffs type_id="opencv-matrix">
<data>
1.3569117181595716e-01 -8.2513063822554633e-01 0. 0.</data></dist_coeffs>
<dist_coeffs_std_dev type_id="opencv-matrix">
<data>
1.5570675523402111e-02 8.7229075437543435e-02 0. 0.</data></dist_coeffs_std_dev>
</opencv_storage>
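Such a file can be read back with cv::FileStorage. As an illustration of the opencv-matrix node layout, here is a stdlib-only Python sketch that parses a camera matrix node with the same values as above (the embedded XML string is a self-contained example, not the full output file):

```python
import xml.etree.ElementTree as ET

xml_text = """<?xml version="1.0"?>
<opencv_storage>
<cameraMatrix type_id="opencv-matrix">
  <rows>3</rows><cols>3</cols><dt>d</dt>
  <data>
    1.2519588293098975e+03 0. 6.6684948780852471e+02
    0. 1.2519588293098975e+03 3.6298123112613683e+02
    0. 0. 1.</data></cameraMatrix>
</opencv_storage>"""

root = ET.fromstring(xml_text)
node = root.find("cameraMatrix")
rows = int(node.find("rows").text)
cols = int(node.find("cols").text)
# The <data> element holds all matrix entries in row-major order.
values = [float(v) for v in node.find("data").text.split()]
camera_matrix = [values[r * cols:(r + 1) * cols] for r in range(rows)]
print(camera_matrix[0][0], camera_matrix[1][2])
```

Here camera_matrix[0][0] is the focal length fx in pixels, and camera_matrix[0][2], camera_matrix[1][2] are the principal point coordinates.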