Single Camera Calibrator App
Camera Calibrator Overview
You can use the Camera Calibrator app to estimate camera
intrinsics, extrinsics, and lens distortion parameters. You can use these camera
parameters for various computer vision applications. These applications include removing
the effects of lens distortion from an image, measuring planar objects, or
reconstructing 3-D scenes from multiple cameras.
The suite of calibration functions used by the Camera Calibrator app provides the workflow for camera calibration. You can
use these functions directly in the MATLAB® workspace. For a list of functions, see Single and Stereo Camera Calibration.
Single Camera Calibration
Follow this workflow to calibrate your camera using the app:
Prepare images, camera, and calibration pattern.
Add images and select standard or fisheye camera model.
Calibrate the camera.
Evaluate calibration accuracy.
Adjust parameters to improve accuracy (if necessary).
Export the parameters object.
In some cases, the default values work well, and you do not need to make
any improvements before exporting parameters. You can also make improvements using the
camera calibration functions directly in the MATLAB workspace. For a list of functions, see Single and Stereo Camera Calibration.
Open the Camera Calibrator
MATLAB Toolstrip: On the Apps tab, in the
Image Processing and Computer Vision section, click the
Camera Calibrator icon.
MATLAB command prompt: Enter cameraCalibrator
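For example, you can open an empty session or open the app preloaded with a set of images. This is a minimal sketch; the folder name and the 25 mm square size are placeholders for your own values.

% Open an empty calibrator session:
cameraCalibrator

% Open the app preloaded with calibration images (folder name and square
% size in millimeters are assumptions for this sketch):
cameraCalibrator('calibrationImages', 25)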
Prepare the Pattern, Camera, and Images
For best results, use between 10 and 20 images of the calibration pattern. The
calibrator requires at least three images. Use uncompressed images or lossless
compression formats such as PNG. The calibration pattern and the camera setup must
satisfy a set of requirements to work with the calibrator. For greater calibration
accuracy, follow these instructions for preparing the pattern, setting up the camera,
and capturing the images.
Note
The Camera Calibrator app supports only checkerboard patterns. If you are
using a different type of calibration pattern, you can still calibrate your
camera using the estimateCameraParameters
function. Using a different type of pattern requires that you supply your own
code to detect the pattern points in the image.
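As a rough illustration of that workflow, the sketch below detects points with a hypothetical detectMyPattern helper and passes them to estimateCameraParameters; the pattern grid and file names are assumptions, and only estimateCameraParameters is a toolbox function.

% Sketch: calibrate from points detected by your own pattern detector.
% imageFileNames is assumed to be a cell array of calibration image files.
numImages = numel(imageFileNames);
[X, Y] = meshgrid(0:20:180, 0:20:140);            % example 10-by-8 grid, 20 mm spacing
worldPoints = [X(:), Y(:)];                        % M-by-2 planar pattern coordinates
imagePoints = zeros(size(worldPoints, 1), 2, numImages);
for i = 1:numImages
    imagePoints(:,:,i) = detectMyPattern(imread(imageFileNames{i}));  % hypothetical helper
end
cameraParams = estimateCameraParameters(imagePoints, worldPoints);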
Prepare the Checkerboard Pattern
The Camera Calibrator app uses a checkerboard pattern. A checkerboard pattern
is a convenient calibration target. If you want to use a different pattern to
extract key points, you can use the camera calibration MATLAB functions directly. See Single and Stereo Camera Calibration for
the list of functions.
You can print (from MATLAB) and use the
checkerboard pattern provided.
Tip
Use a checkerboard that contains an even number of squares along one edge and
an odd number of squares along the other edge. A pattern that is not square
contains two black corners along one side and two white corners on the opposite
side. This criterion enables the app to determine the orientation of the pattern
and the origin. The calibrator assigns the longer side to be the
x-direction. A square pattern can produce unexpected
results for camera extrinsics.
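As an alternative to the provided printout, the sketch below generates a compliant pattern image with the Image Processing Toolbox checkerboard function; the square size in pixels and the file name are assumptions. Print the image at a known scale, and then measure one square, as described in the steps that follow.

% Generate an odd-by-even checkerboard (7-by-10 squares) and save it for printing.
squareSizePx = 200;                               % pixels per square (assumed)
I = checkerboard(squareSizePx, 4, 5) > 0.5;       % 8-by-10 squares, pure black and white
I = I(1:7*squareSizePx, :);                       % crop one row of squares to get an odd side
imwrite(I, 'checkerboardPattern.png');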
To prepare the checkerboard pattern:
Attach the checkerboard printout to a flat surface.
Imperfections on the surface can affect the accuracy of the calibration.
Measure one side of the checkerboard square. You need
this measurement for calibration. The size of the squares can vary
depending on printer settings.
To improve the detection speed, set up the pattern
with as little background clutter as possible.
Camera Setup
To calibrate your camera, follow these rules:
Keep the pattern in focus, but do not use autofocus.
If you change zoom settings between images, the focal length
changes.
Capture Images
For best results, use between 10 and 20 images of the calibration pattern. The calibrator
requires at least three images. Use uncompressed images or images in lossless
compression formats such as PNG. For greater calibration accuracy:
Capture the images of the pattern at a distance roughly equal to the
distance from your camera to the objects of interest. For example, if
you plan to measure objects from 2 meters, keep your pattern
approximately 2 meters from the camera.
Place the checkerboard at an angle less than 45 degrees relative to
the camera plane.
Do not modify the images (for example, do not crop them).
Do not use autofocus or change the zoom settings between
images.
Capture the images of a checkerboard
pattern at different orientations relative to the
camera.
Capture a variety of images of the pattern so that you have accounted
for as much of the image frame as possible. Lens distortion increases
radially from the center of the image and sometimes is not uniform
across the image frame. To capture this lens distortion, the pattern
must appear close to the edges of the captured images.
The calibrator works with a range of checkerboard square sizes. As a
general rule, your checkerboard should fill at least 20% of the captured image. For
example, the preceding images were taken with a checkerboard square size of 108 mm.
Add Images and Select Camera Model
To begin calibration, you must add images. You can add saved images from a folder or add
images directly from a camera. The calibrator analyzes the images to ensure they meet
the calibrator requirements. The calibrator then detects the points on the
checkerboard.
Add Images from File
On the Calibration tab, in the File section, click
Add Images, and then select From file. You can add images from multiple
folders by clicking Add Images for each folder.
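If your images are spread across several folders, a sketch like this can gather the file names in the workspace first; the folder names are assumptions.

% Collect calibration image file names from several folders.
imds = imageDatastore({'session1', 'session2'}, 'FileExtensions', '.png');
imageFileNames = imds.Files;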
Acquire Live Images
To begin calibration, you must add images. You can acquire live
images from a webcam using the MATLAB Webcam support. To use this feature, you must install MATLAB Support Package for USB Webcams. See Install the MATLAB Support Package for USB Webcams (Image Acquisition Toolbox) for information on installing the support package. To add live images, follow these
steps.
On the Calibration tab, in the File section,
click Add Images, then select From camera.
This action opens the Camera tab. If you have only one webcam connected to your
system, it is selected by default and a live preview window opens. If you have
multiple cameras connected and you want to use one other than the default, select
that camera in the Camera list.
Optionally, set properties for the camera to control the image. Click
Camera Properties to open a menu of the properties for the
selected camera. This list varies, depending on your device.
Use the sliders or drop-down list to change any available property settings. The
Preview window updates dynamically when you change a setting. When you are done setting
properties, click anywhere outside of the menu box to dismiss the properties
list.
Enter a location for the acquired image files in the Save
Location box by typing the path to the folder or using the
Browse button. You must have permission to write to the
folder you select.
Set the capture parameters.
To set the number of seconds between image captures, use the
Capture Interval box or slider. The default is 5
seconds, the minimum is 1 second, and the maximum is 60 seconds.
To set the number of image captures, use the Number of images to
capture box or slider. The default is 20 images, the minimum
is 2 images, and the maximum is 100 images.
In the default configuration, a total of 20 images are captured, one every 5
seconds.
The Preview window shows the live images streamed as RGB data. After you adjust any
device properties and capture settings, use the Preview window as a guide to line up the
camera to acquire the checkerboard pattern image you want to capture.
Click the Capture button. The specified number of images are captured, and
thumbnails of the snapshots appear in the Data Browser pane. The
images are automatically named incrementally and are saved as .png files.
You can optionally stop the image capture before the designated number of images are
captured by clicking Stop Capture.
When you are capturing images of a checkerboard, after the designated number of images
are captured, a Checkerboard Square Size dialog box displays. Specify the size of the
checkerboard square, then click OK.
The detection results are then calculated and displayed.
Click OK to dismiss the Detection Results dialog box.
When you have finished acquiring live images, click Close Image
Capture to close the Camera tab.
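With the same support package, you can also script the acquisition in the workspace. This is a minimal sketch; the output folder is an assumption, and the capture count and interval mirror the defaults described above.

% Acquire calibration snapshots programmatically with the
% MATLAB Support Package for USB Webcams.
cam = webcam;                              % connect to the default webcam
outDir = 'calibrationImages';              % assumed output folder
if ~exist(outDir, 'dir'), mkdir(outDir); end
for k = 1:20
    img = snapshot(cam);                   % capture one RGB frame
    imwrite(img, fullfile(outDir, sprintf('image%02d.png', k)));
    pause(5);                              % wait 5 seconds between captures
end
clear cam                                  % release the webcam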
Analyze Images
After you add the images, the Checkerboard Square Size dialog box appears. Specify
the size of the checkerboard square by entering the length of one side of a square from
the checkerboard pattern.
The calibrator attempts to detect a checkerboard in each of the added images,
displaying an Analyzing Images progress bar window, indicating detection progress.
If any of the images are rejected, the Detection Results dialog box appears, which
contains diagnostic information. The results indicate how many total images were
processed, and of those processed, how many were accepted, rejected, or skipped. The
calibrator skips duplicate images.
To view the rejected images, click View images. The
calibrator rejects images in which it cannot detect the entire checkerboard.
Possible reasons for a failed detection are a blurry image or an extreme angle of
the pattern. Detection takes longer with larger images and with patterns that
contain a large number of squares.
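The detection the app performs corresponds to the checkerboard detection functions in the workspace. A minimal sketch, assuming imageFileNames is a cell array of your calibration images and a 25 mm square size:

% Detect checkerboard corners and generate the matching world coordinates.
[imagePoints, boardSize, imagesUsed] = detectCheckerboardPoints(imageFileNames);
squareSize = 25;                                           % length of one square side, in mm
worldPoints = generateCheckerboardPoints(boardSize, squareSize);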
View Images and Detected Points
The Data Browser pane displays a list of images with IDs.
These images contain a detected pattern. To view an image, select it from the
Data Browser pane.
The Image window displays the selected
checkerboard image with green circles to indicate detected points. You can verify
that the corners were detected correctly using the zoom controls. The yellow square
indicates the (0,0) origin. The X and Y arrows indicate the checkerboard axes
orientation.
Calibrate
Once you are satisfied with the accepted images, click the Calibrate
button on the Calibration tab. The default calibration settings
assume the minimum set of camera parameters. Start by running the calibration with the
default settings. After evaluating the results, you can try to improve calibration
accuracy by adjusting the settings and adding or removing images and then calibrating
again. If you switch between the standard and fisheye camera models, you must
recalibrate.
Select Camera Model
You can select either a standard or fisheye camera model. On the
Calibration tab, in the Camera Model
section, select Standard or Fisheye.
You can switch camera models at any point in the session. You must calibrate again
after any changes you make to the app's settings. Click Options
to access settings and optimizations for either camera model.
Standard Model Options
When the camera has severe lens distortion, the app can fail to compute the initial values for
the camera intrinsics. If you have the manufacturer’s specifications for your camera
and know the pixel size, focal length, or lens characteristics, you can manually set
initial guesses for camera intrinsics and radial distortion. To set initial guesses,
click Options > Optimization Options.
Select the top checkbox and then enter a 3-by-3 matrix to specify
initial intrinsics. If you do not specify an initial guess, the function
computes the initial intrinsic matrix using linear least squares.
Select the bottom checkbox and then enter a 2- or 3-element vector to
specify the initial radial distortion. If you do not provide a value,
the function uses 0
as the initial value for all the
coefficients.
Fisheye Model Options
In the Camera Model section, with
Fisheye selected, click Options.
Select Estimate Alignment to enable estimation of the axes
alignment when the optical axis of the fisheye lens is not perpendicular to the
image plane.
For details about the fisheye camera model calibration algorithm, see Fisheye Calibration Basics.
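In the workspace, the corresponding function is estimateFisheyeParameters. A minimal sketch, assuming imagePoints and worldPoints from the checkerboard detection functions and a cell array of image file names:

% Fisheye calibration with axes-alignment estimation enabled.
I = imread(imageFileNames{1});
imageSize = [size(I, 1), size(I, 2)];
fisheyeParams = estimateFisheyeParameters(imagePoints, worldPoints, imageSize, ...
    'EstimateAlignment', true);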
Calibration Algorithm
For fisheye camera model calibration, see Fisheye Calibration Basics.
The standard camera model calibration algorithm assumes a pinhole camera model:

w * [x; y; 1] = K * [R, t] * [X; Y; Z; 1]

(X,Y,Z): world coordinates of a point.

(x,y): image coordinates of the corresponding image point in pixels.

w: arbitrary homogeneous coordinates scale factor.

K: camera intrinsic matrix, defined as:

K = [fx   s  cx
      0  fy  cy
      0   0   1]

The coordinates (cx, cy) represent the optical center (the principal point), in
pixels. When the x- and y-axes are exactly perpendicular, the skew parameter,
s, equals 0. The matrix elements are defined as:

fx = F*sx
fy = F*sy

F is the focal length in world units, typically expressed in millimeters.
[sx, sy] are the number of pixels per world unit in the x and
y directions, respectively. fx and fy are expressed in pixels.
R: matrix representing the 3-D rotation of the camera.
t: translation of the camera relative to the world
coordinate system.
The camera calibration algorithm estimates the values of the
intrinsic parameters, the extrinsic parameters, and the distortion
coefficients. Camera calibration involves these steps:
Solve for the intrinsics and extrinsics in closed
form, assuming that lens distortion is zero. [1]
Estimate all parameters simultaneously, including
the distortion coefficients, using nonlinear least-squares minimization
(Levenberg–Marquardt algorithm). Use the closed-form solution
from the preceding step as the initial estimate of the intrinsics
and extrinsics. Set the initial estimate of the distortion coefficients
to zero. [1][2]
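The same estimation is available in the workspace through estimateCameraParameters. A minimal sketch, assuming imagePoints and worldPoints from the detection step and options that mirror the app defaults:

% Estimate camera parameters with the default app settings.
[cameraParams, imagesUsedForCalib, estimationErrors] = estimateCameraParameters( ...
    imagePoints, worldPoints, ...
    'NumRadialDistortionCoefficients', 2, ...
    'EstimateSkew', false, ...
    'EstimateTangentialDistortion', false, ...
    'WorldUnits', 'millimeters');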
Evaluate Calibration Results
You can evaluate calibration accuracy by examining the reprojection errors, examining the
camera extrinsics, or viewing the undistorted image. For best calibration results, use
all three methods of evaluation.
Examine Reprojection Errors
The reprojection errors are the distances, in pixels, between the
detected and the reprojected points. The Camera Calibrator app calculates
reprojection errors by projecting the checkerboard points from world coordinates,
defined by the checkerboard, into image coordinates. The app then compares the
reprojected points to the corresponding detected points. As a general rule, mean
reprojection errors of less than one pixel are acceptable.
The Camera Calibrator app displays the reprojection errors, in pixels, as a
bar graph. The graph helps you to identify which images adversely contribute to
the calibration. You can select a bar graph entry and then remove the corresponding
image from the list of images in the Data Browser pane.
Reprojection Errors Bar Graph
The bar graph
displays the mean reprojection error per image, along with the overall mean error.
The bar labels correspond to the image IDs. The highlighted bars correspond to the
selected images.
Select an image in one of these ways:
Click a corresponding bar in the graph.
Select an image from the list of images in the Data
Browser pane.
Adjust the overall mean error. Click and slide the red line up or down
to select outlier images.
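For an exported cameraParameters object, you can reproduce this bar graph in the workspace:

% Plot per-image mean reprojection errors and report the overall mean.
showReprojectionErrors(cameraParams);
meanErrorInPixels = cameraParams.MeanReprojectionError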
Examine Extrinsic Parameter Visualization
The 3-D extrinsic parameters plot provides a camera-centric view of the patterns and a
pattern-centric view of the camera. The camera-centric view is helpful if the camera
was stationary when the images were captured. The pattern-centric view is helpful if
the pattern was stationary. To rotate the figure, click and drag with the rotate
cursor. Click a checkerboard (or camera) to
select it. The highlighted data in the visualizations correspond to the selected
image in the list. Examine the relative positions of the pattern and the camera to
determine if they match what you expect. For example, a pattern that appears behind
the camera indicates a calibration error.
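You can produce the same visualizations in the workspace from an exported cameraParameters object:

% Camera-centric view (camera stationary) and pattern-centric view (pattern stationary).
figure; showExtrinsics(cameraParams, 'CameraCentric');
figure; showExtrinsics(cameraParams, 'PatternCentric');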
View Undistorted Image
To view the effects of removing lens distortion, click Show Undistorted
in the View section of the Calibration
tab. If the calibration was accurate, the distorted lines in the image become
straight.
Checking the undistorted images is important even if the
reprojection errors are low. For example, if the pattern covers only a small
percentage of the image, the distortion estimation might be incorrect, even though
the calibration resulted in low reprojection errors. The following image shows an
example of this type of incorrect estimation for a single camera calibration.
While viewing the undistorted images, you can examine the fisheye images more
closely by selecting Fisheye Scale in the
View section of the Calibration tab.
Use the slider in the Scale Factor window to adjust the scale of the image.
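In the workspace, undistortImage applies the same correction. A minimal sketch, assuming I is one of your calibration images and cameraParams is the exported parameters object:

% Remove lens distortion and compare the original and corrected images.
I = imread(imageFileNames{1});
J = undistortImage(I, cameraParams);
figure; imshowpair(I, J, 'montage');
% For fisheye calibrations, use undistortFisheyeImage with the fisheye intrinsics.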
Improve Calibration
To improve the calibration, you can remove high-error images,
add more images, or modify the calibrator settings.
Add or Remove Images
Consider adding more images if:
You have fewer than 10 images.
The patterns do not cover enough of the image frame.
The patterns do not have enough variation in orientation
with respect to the camera.
Consider removing images if the images:
Have a high mean reprojection error.
Are blurry.
Contain a checkerboard at an angle greater than 45 degrees
relative to the camera plane.
Contain incorrectly detected checkerboard points.
Standard Model: Change the Number of Radial Distortion Coefficients
You can specify two or three radial distortion coefficients. On the
Calibration tab, in the Camera Model
section, with Standard selected, click
Options. Select the Radial Distortion
as either 2 Coefficients or 3
Coefficients. Radial distortion occurs when light
rays bend more near the edges of a lens than they do at its optical center. The
smaller the lens, the greater the distortion.
The radial distortion coefficients model this type of distortion. The distorted
points are denoted as (x_distorted, y_distorted):

x_distorted = x * (1 + k1*r^2 + k2*r^4 + k3*r^6)

y_distorted = y * (1 + k1*r^2 + k2*r^4 + k3*r^6)
x, y —
Undistorted pixel locations. x and y are
in normalized image coordinates. Normalized image coordinates are
calculated from pixel coordinates by translating to the optical center
and dividing by the focal length in pixels. Thus, x and y are
dimensionless.
k1, k2,
and k3 — Radial distortion
coefficients of the lens.
r^2 = x^2 + y^2
Typically, two coefficients are sufficient for calibration.
For severe distortion, such as in wide-angle lenses, you can select
3 coefficients to include k3.
The undistorted pixel locations x and y are in normalized image coordinates,
with the origin at the optical center. Because these coordinates are normalized
by the focal length in pixels, they are dimensionless.
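As a numeric illustration, this sketch applies the two-coefficient radial model to one normalized point; the coefficient and point values are assumptions (the coefficients correspond to the RadialDistortion property of an exported cameraParameters object).

% Apply the radial distortion model to one undistorted normalized point.
k1 = -0.2; k2 = 0.05;                 % example coefficient values (assumed)
x = 0.3; y = -0.1;                    % undistorted normalized coordinates (assumed)
r2 = x^2 + y^2;
xDistorted = x*(1 + k1*r2 + k2*r2^2);
yDistorted = y*(1 + k1*r2 + k2*r2^2);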
Standard Model: Compute Skew
When you select the Compute Skew
check box, the calibrator estimates the image axes skew. Some camera sensors contain
imperfections that cause the x- and y-axes of the image to
not be perpendicular. You can model this defect using a skew parameter. If you do not select the
check box, the image axes are assumed to be perpendicular, which is the case for most modern
cameras.
Standard Model: Compute Tangential Distortion
Tangential distortion occurs when the lens and the image plane
are not parallel. The tangential distortion coefficients model this
type of distortion.
The distorted points are denoted as (x_distorted, y_distorted):

x_distorted = x + [2*p1*x*y + p2*(r^2 + 2*x^2)]

y_distorted = y + [p1*(r^2 + 2*y^2) + 2*p2*x*y]
x, y —
Undistorted pixel locations. x and y are
in normalized image coordinates. Normalized image coordinates are
calculated from pixel coordinates by translating to the optical center
and dividing by the focal length in pixels. Thus, x and y are
dimensionless.
p1 and p2 —
Tangential distortion coefficients of the lens.
r^2 = x^2 + y^2
When you select the Compute Tangential Distortion check
box, the calibrator estimates the tangential distortion coefficients.
Otherwise, the calibrator sets the tangential distortion coefficients
to zero.
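As a numeric illustration of these formulas, here is a short sketch with assumed p1 and p2 values (they correspond to the TangentialDistortion property of an exported cameraParameters object).

% Apply the tangential distortion terms to one undistorted normalized point.
p1 = 0.001; p2 = -0.0005;             % example coefficient values (assumed)
x = 0.3; y = -0.1;                    % undistorted normalized coordinates (assumed)
r2 = x^2 + y^2;
xDistorted = x + (2*p1*x*y + p2*(r2 + 2*x^2));
yDistorted = y + (p1*(r2 + 2*y^2) + 2*p2*x*y);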
Fisheye Model: Estimate Alignment
In the Camera Model section, with
Fisheye selected, click Options.
Select Estimate Alignment to enable estimation of the axes
alignment when the optical axis of the fisheye lens is not perpendicular to the
image plane.
Export Camera Parameters
When you are satisfied with the calibration accuracy, click Export Camera
Parameters. You can either export the camera parameters to an
object in the MATLAB workspace or generate the camera
parameters as a MATLAB script.
Export Camera Parameters
Select Export Camera Parameters to create a cameraParameters
object in your workspace. The object contains the
intrinsic and extrinsic parameters of the camera and the distortion coefficients.
You can use this object for various computer vision tasks, such as image
undistortion, measuring planar objects, and 3-D reconstruction. See Measuring Planar Objects with a Calibrated Camera. You can optionally export the cameraCalibrationErrors
object, which contains the standard errors of
estimated camera parameters, by selecting the Export estimation
errors check box.
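After export, you can inspect and save the object in the workspace. The variable name cameraParams is the one used in this sketch; yours may differ.

% Inspect a few estimated quantities and save the object for later use.
cameraParams.MeanReprojectionError      % overall mean reprojection error, in pixels
cameraParams.RadialDistortion           % estimated radial distortion coefficients
save('cameraParams.mat', 'cameraParams');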
Generate MATLAB Script
Select Generate MATLAB script to save your camera parameters to a MATLAB script, enabling you to reproduce the steps from your calibration
session.
References
[1] Zhang, Z. “A Flexible New Technique for Camera
Calibration.” IEEE Transactions on Pattern Analysis and Machine
Intelligence. Vol. 22, No. 11, 2000, pp. 1330–1334.
[2] Heikkila, J. and O. Silven. “A Four-step Camera Calibration
Procedure with Implicit Image Correction.” IEEE International
Conference on Computer Vision and Pattern Recognition.
1997.
[3] Scaramuzza, D., A. Martinelli, and R. Siegwart. "A Toolbox for Easy Calibrating
Omnidirectional Cameras." Proceedings of the IEEE International Conference on
Intelligent Robots and Systems (IROS 2006). Beijing, China, October
7–15, 2006.
[4] Urban, S., J. Leitloff, and S. Hinz. "Improved Wide-Angle, Fisheye and
Omnidirectional Camera Calibration." ISPRS Journal of Photogrammetry and
Remote Sensing. Vol. 108, 2015, pp. 72–79.
See Also
Camera Calibrator | cameraParameters
| detectCheckerboardPoints
| estimateCameraParameters
| generateCheckerboardPoints
| showExtrinsics
| showReprojectionErrors
| Stereo Camera Calibrator | stereoParameters
| undistortImage