Description
Camera calibration is an essential task in computer vision. It solves the problem of how a 3D point in the world corresponds to a 2D pixel coordinate (intrinsic camera calibration) and how the cameras are located with respect to each other and to other sensors (extrinsic camera calibration). For intrinsic calibration, the mapping between a 3D point and its 2D pixel coordinate depends strongly on the lens used. In the past, several camera models have been proposed to describe different types of lenses, such as the pinhole camera model with radial and tangential distortion, wide-angle camera models, and single and non-single viewpoint models.
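As an illustration of such a parametric model, the sketch below implements the standard pinhole projection with radial and tangential (Brown-Conrady) distortion. The function name, parameter names, and numeric values are purely illustrative and are not part of the proposed method.

```python
import numpy as np

# Minimal sketch of the pinhole camera model with radial and tangential
# (Brown-Conrady) distortion; all names and values are illustrative.
def project_pinhole(X_cam, fx, fy, cx, cy, k1, k2, p1, p2):
    """Project a 3D point in camera coordinates to a 2D pixel coordinate."""
    x, y = X_cam[0] / X_cam[2], X_cam[1] / X_cam[2]        # normalized coordinates
    r2 = x * x + y * y
    radial = 1.0 + k1 * r2 + k2 * r2 * r2                  # radial distortion
    x_d = x * radial + 2 * p1 * x * y + p2 * (r2 + 2 * x * x)  # + tangential
    y_d = y * radial + p1 * (r2 + 2 * y * y) + 2 * p2 * x * y
    return np.array([fx * x_d + cx, fy * y_d + cy])        # pixel coordinates

# Example: project a point roughly 2 m in front of the camera
print(project_pinhole(np.array([0.1, -0.05, 2.0]),
                      fx=800, fy=800, cx=320, cy=240,
                      k1=-0.2, k2=0.05, p1=0.001, p2=-0.0005))
```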
One of the biggest challenges is how to evaluate and compare these camera models. To address this problem, a new method for intrinsic camera calibration is proposed. The proposed method is model-free and highly accurate, and it can serve as ground truth for evaluating the standard approaches.
The proposed approach uses a projector that displays fringe patterns and whose position can be varied. These fringe patterns are used to measure, for each camera pixel, the corresponding position on the projector. From several measured projector poses, the viewing ray of each pixel can be reconstructed.
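To make the ray reconstruction step concrete, the sketch below fits a viewing ray to the 3D points that a single camera pixel observes at several projector poses, using a least-squares line fit via SVD. This fitting choice is an assumption made for illustration and is not necessarily the exact procedure of the proposed method.

```python
import numpy as np

# Illustrative sketch: reconstruct one pixel's viewing ray from the 3D points
# it observed at several projector poses (least-squares line fit, assumed here).
def fit_viewing_ray(points):
    """points: (N, 3) array of 3D points observed by one camera pixel."""
    centroid = points.mean(axis=0)               # a point on the ray
    _, _, vt = np.linalg.svd(points - centroid)  # principal direction of spread
    direction = vt[0] / np.linalg.norm(vt[0])    # unit direction of the ray
    return centroid, direction

# Example: slightly noisy points along a ray through the origin
t = np.linspace(0.5, 2.0, 5)
true_dir = np.array([0.1, -0.05, 1.0]) / np.linalg.norm([0.1, -0.05, 1.0])
pts = t[:, None] * true_dir + 1e-4 * np.random.randn(5, 3)
origin, direction = fit_viewing_ray(pts)
print(origin, direction)
```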