Estimate camera projection matrix from world-to-image point correspondences
camMatrix = estimateCameraMatrix(imagePoints,worldPoints) returns the camera projection matrix determined from known world points and their corresponding image projections by using the direct linear transformation (DLT) approach.
[camMatrix,reprojectionErrors] = estimateCameraMatrix(imagePoints,worldPoints) also returns the reprojection error that quantifies the accuracy of the projected image coordinates.
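The following sketch shows both syntaxes on synthetic correspondences. The intrinsic values, camera pose, and point set are hypothetical, chosen only to produce a consistent pair of inputs; it assumes the toolbox convention w*[x y 1] = [X Y Z 1]*camMatrix with a 4-by-3 projection matrix.

% Build a hypothetical ground-truth projection matrix and correspondences.
rng(0);                                        % reproducible synthetic points
K = [800 0 0; 0 800 0; 320 240 1];             % hypothetical intrinsics (row-vector form)
trueCamMatrix = [eye(3); 0 0 20] * K;          % 4-by-3: w*[x y 1] = [X Y Z 1]*P
worldPoints = [randn(20,2), 2*rand(20,1)];     % 20 noncoplanar 3-D points (M-by-3, M >= 6)
proj = [worldPoints, ones(20,1)] * trueCamMatrix;
imagePoints = proj(:,1:2) ./ proj(:,3);        % corresponding 2-D projections (M-by-2)

% First syntax: projection matrix only.
camMatrix = estimateCameraMatrix(imagePoints, worldPoints);

% Second syntax: also return the per-point reprojection errors.
[camMatrix, reprojectionErrors] = estimateCameraMatrix(imagePoints, worldPoints);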
You can use the estimateCameraMatrix function to estimate a camera projection matrix:
If the world-to-image point correspondences are known but the camera intrinsic and extrinsic parameters are not, use estimateCameraMatrix. When those parameters are known, the cameraMatrix function computes the projection matrix directly. To compute 2-D image points from 3-D world points, refer to the equations given for the camMatrix output.
For use with the findNearestNeighbors object function of the pointCloud object. The use of a camera projection matrix speeds up the nearest neighbors search in a point cloud generated by an RGB-D sensor, such as Microsoft® Kinect®.
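The point-cloud tip in sketch form: pass the estimated projection matrix to findNearestNeighbors to accelerate the search. The point cloud ptCloud is assumed to be an organized point cloud from an RGB-D sensor, and the query point and neighbor count are hypothetical; the camMatrix argument applies only to organized point clouds.

camMatrix = estimateCameraMatrix(imagePoints, worldPoints);    % from known correspondences
queryPoint = [0.4 0.3 2.0];                                    % hypothetical 3-D query point
numNeighbors = 10;
[indices, dists] = findNearestNeighbors(ptCloud, queryPoint, numNeighbors, camMatrix);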
Given the world points X and the image points x, the camera projection matrix C is obtained by solving the equation λx = CX.
The equation is solved using the direct linear transformation (DLT) approach [1]. This approach formulates a homogeneous linear system of equations, and the solution is obtained through generalized eigenvalue decomposition.
Because the image point coordinates are given in pixel values, the approach for computing the camera projection matrix is sensitive to numerical errors. To avoid numerical errors, the input image point coordinates are normalized so that their centroid is at the origin and the root mean squared distance of the image points from the origin is √2. These steps summarize the process for estimating the camera projection matrix.
1. Normalize the input image point coordinates with the transform T.
2. Estimate the camera projection matrix CN from the normalized image points.
3. Compute the denormalized camera projection matrix C as C = CN T^(-1).
4. Compute the reprojected image point coordinates xE as xE = CX.
5. Compute the reprojection errors as reprojectionErrors = |x − xE|.
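A compact sketch of these steps, assuming MATLAB's row-vector convention w*[x y 1] = [X Y Z 1]*C with a 4-by-3 matrix. This is not the toolbox's internal implementation; in particular, it solves the homogeneous system with an SVD rather than the generalized eigenvalue decomposition mentioned above, and the helper name dltCameraMatrix is hypothetical.

function [C, reprojErrors] = dltCameraMatrix(imagePoints, worldPoints)
% Hypothetical helper illustrating the normalized DLT steps above.
M  = size(imagePoints, 1);
Xh = [worldPoints, ones(M,1)];                 % homogeneous world points (M-by-4)

% Step 1: normalize image points (centroid at origin, RMS distance sqrt(2)).
c  = mean(imagePoints, 1);
s  = sqrt(2) / sqrt(mean(sum((imagePoints - c).^2, 2)));
T  = [s 0 0; 0 s 0; -s*c(1) -s*c(2) 1];        % normalization transform (row-vector form)
xn = [imagePoints, ones(M,1)] * T;             % normalized homogeneous image points

% Step 2: estimate the normalized matrix CN from the homogeneous system A*p = 0.
A = zeros(2*M, 12);
for i = 1:M
    A(2*i-1, :) = [Xh(i,:), zeros(1,4), -xn(i,1)*Xh(i,:)];
    A(2*i,   :) = [zeros(1,4), Xh(i,:), -xn(i,2)*Xh(i,:)];
end
[~, ~, V] = svd(A, 0);
CN = reshape(V(:,end), 4, 3);                  % singular vector of the smallest singular value

% Step 3: denormalize, C = CN * T^(-1).
C = CN / T;

% Steps 4 and 5: reproject the world points and measure the errors.
projected    = Xh * C;
xE           = projected(:,1:2) ./ projected(:,3);
reprojErrors = vecnorm(imagePoints - xE, 2, 2);
end

On noise-free correspondences such as those in the earlier sketch, this should recover the ground-truth matrix up to an overall scale factor.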
[1] Hartley, R., and A. Zisserman. Multiple View Geometry in Computer Vision. Cambridge, UK: Cambridge University Press, 2000.
cameraCalibrationErrors | cameraIntrinsics | extrinsicsEstimationErrors | intrinsicsEstimationErrors | stereoParameters
cameraMatrix | detectCheckerboardPoints | estimateCameraParameters | estimateEssentialMatrix | estimateFundamentalMatrix | estimateWorldCameraPose | findNearestNeighbors | generateCheckerboardPoints | showExtrinsics | showReprojectionErrors | undistortImage