Instrumentation, Techniques, and Measurement

Method for calibration accuracy improvement of projector-camera-based structured light system

Author Affiliations
Lei Nie, Yuping Ye

Chinese Academy of Sciences, Shenzhen Key Laboratory of Computer Vision and Virtual Reality Technology, Shenzhen Institutes of Advanced Technology, Shenzhen, China

Zhan Song

Chinese Academy of Sciences, Shenzhen Key Laboratory of Computer Vision and Virtual Reality Technology, Shenzhen Institutes of Advanced Technology, Shenzhen, China

The Chinese University of Hong Kong, Hong Kong SAR, China

Opt. Eng. 56(7), 074101 (Jul 04, 2017). doi:10.1117/1.OE.56.7.074101
History: Received April 20, 2017; Accepted June 19, 2017

Open Access

Abstract.  Calibration is a critical step for the projector-camera-based structured light system (SLS). Conventional SLS calibration methods usually use a calibrated camera to calibrate the projector, and the calibration parameters are optimized by minimizing two-dimensional (2-D) reprojection errors. In this paper, a three-dimensional (3-D)-based method is proposed for the optimization of SLS calibration parameters. The system is first calibrated with traditional calibration methods to obtain the primary calibration parameters. Then, a reference plane with precisely printed markers is used to optimize the primary calibration results. Three metric error criteria are introduced to evaluate the 3-D reconstruction accuracy of the reference plane. By treating all the system parameters as a global optimization problem and using the primary calibration parameters as initial values, a nonlinear multiobjective optimization problem is established and solved. Compared with conventional calibration methods, which minimize the 2-D reprojection errors for the camera and projector separately, the proposed calibration procedure yields a globally optimal calibration result. Experimental results show that, with the optimized calibration parameters, the measurement accuracy and 3-D reconstruction quality of the system can be greatly improved.


The structured light system (SLS) has become an important contactless three-dimensional (3-D) measurement technology owing to its high accuracy and efficiency.1–4 A basic SLS consists of one projector and one camera. The projector projects a set of predefined pattern images onto the target surface, and the camera captures the scene synchronously. By extracting the projected features from the captured images, accurate and dense correspondences can be established between the camera and projector reference frames. With these correspondences, 3-D information can be retrieved via the triangulation principle.5,6

For the SLS, calibration of the system parameters is usually the first and most crucial step because the calibration result directly determines the final 3-D measurement precision.7–9 To perform the triangulation, we need to know the intrinsic parameters of both the projector and camera, as well as the extrinsic parameters between them. Several research works have addressed this classical problem.10–13 The main difficulty in calibrating the projector-camera-based SLS is how to precisely calibrate the projector. Because the projector cannot “see” the calibration target as the camera does, existing camera calibration methods cannot be applied to it directly. A common approach is to treat the projector as an “inverse” camera: the camera is calibrated first and is then used to calibrate the projector. However, with such a strategy, calibration errors of the camera propagate to the projector calibration stage14 and thus decrease the overall calibration accuracy of the SLS. Moreover, minimization of the two-dimensional (2-D) reprojection error of the reference points is the usual criterion for optimizing the calibration results, especially the lens distortion parameters.15,16 Such an optimization procedure is usually applied to the camera and projector separately and cannot reflect the real 3-D reconstruction accuracy.

In this paper, a 3-D-based optimization method is studied to improve the calibration accuracy of the projector-camera-based SLS. The system is first calibrated by traditional means with a printed checkerboard pattern. Then, a planar surface with precisely printed markers is used for parameter optimization. Based on this reference plane, three 3-D metric error criteria are defined: the planarity error, the distance error, and the angular error. A multiobjective optimization problem is established by considering all system parameters as variables. Using the primary calibration results as initial values, optimal calibration parameters with minimum 3-D measurement errors can be solved for. In the experiments, the optimized calibration parameters are evaluated qualitatively and quantitatively. The results show that calibration accuracy can be greatly improved by the proposed approach compared with classical calibration methods.

This paper is organized as follows: a brief review of existing calibration methods of the projector-camera-based SLS is presented in Sec. 2. In Sec. 3, the calibration procedure and the parameter optimization are introduced. Experimental results are provided and evaluated in Sec. 4. Finally, the conclusion is offered in Sec. 5.

Camera calibration is a classical topic in the computer vision domain. The most widely used camera calibration methods are Tsai’s method17 and Zhang’s method.18 Tsai’s method uses a precise external 3-D calibration object with respect to which a reference coordinate frame is defined. In Zhang’s method, the calibration object is simplified to a planar surface with printed patterns, whose position and orientation can be changed freely in the visual field of the camera. With an adequate number of calibration images, the camera’s intrinsic parameters and its extrinsic parameters with respect to the calibration plane can be estimated. Such a calibration procedure can also be applied to multiple-camera stereo vision systems.19

However, for the projector-camera-based SLS, both the camera and the projector must be accurately calibrated. Calibration of the camera can follow traditional means. Because the projector cannot see the calibration object, camera calibration methods cannot be applied to it directly. In previous works, a popular approach is to use the camera calibration information to calibrate the projector. The operation contains two steps: (1) the calibration plane with printed patterns is imaged by the camera and (2) the calibration plane is kept static while an additional pattern is projected onto it and imaged by the camera. By changing the position and pose of the calibration plane, a group of image pairs is captured. The images with only the printed patterns are used to calibrate the camera, yielding its intrinsic (e.g., focal length, principal point, and lens distortions) and extrinsic (e.g., rotation and translation with respect to the calibration plane) parameters. As a result, the 3-D geometry of the calibration plane at each calibration position can be calculated with respect to the camera reference frame, and hence the 3-D coordinates of the projected pattern features. Because the projector-image coordinates of the projected pattern features are known a priori, the intrinsic and extrinsic parameters of the projector can then be estimated via traditional camera calibration procedures.

In Ref. 20, a printed checkerboard pattern was used for the calibration of a projector-camera-based SLS. The calibration plane contained two regions: one region with a printed pattern was used for the camera calibration, and the other was left blank and used as the projector screen. The four corners of the plane were marked with colors to ease feature detection. In Ref. 21, a planar calibration object with 140 uniformly distributed physical markers was used. The markers were manufactured with a precisely known spacing and were first used to calibrate the camera. Then, a series of sinusoidal phase-shifting patterns was projected onto the object and captured by the camera. Through the phase decoding procedure, one-to-one correspondences can be established between the projector and the camera. By interpolating the image positions of the markers on the projector’s image plane, their projector coordinates can be calculated. A similar idea was reported in Ref. 22, which extended the concept so that the projector can be treated as if it “captures” images. In this method, three sinusoidal phase-shifting fringe patterns were projected onto the object sequentially and captured by the camera. To construct the one-to-one correspondence between the camera and projector, both vertical and horizontal fringe patterns were used; the calibration of the projector can then be carried out on the regenerated projector images. As a continuation of this work, an improved calibration approach was introduced in Ref. 23 to deal with the projector defocus problem. The authors showed that, with defocused pattern projection, a one-to-one correspondence between the projector and camera cannot be established in the spatial domain; however, the mapping in the phase domain between the central points of a projector pixel and a camera pixel remains valid. Without considering the nonlinear distortion of the projector, an improved calibration result was obtained via traditional calibration methods. In Ref. 24, a planar board with evenly distributed circular markers was placed on a motion table and used to calibrate the projector-camera-based SLS. By defining the calibration board as the world coordinate system, the 3-D coordinates of the circular markers can be precisely calculated and used for the calibration of the camera. To calibrate the projector, gray code and phase-shifting patterns were also used. The sum of the reprojection errors of all reference points onto the camera and projector image planes was used to optimize the calibration parameters of the camera and projector. In Ref. 25, dense correspondences between the projector and camera were first generated by gray code and phase-shifting patterns. The intrinsic and extrinsic parameters of the projector and camera were then estimated by decomposing a radial fundamental matrix, and the 2-D reprojection error was adopted for the parameter optimization.

In Ref. 26, a calibration method for the fringe projection profilometry system was studied. Unlike previous stereo vision calibration methods, a bundle adjustment strategy was introduced into the calibration procedure and used to adjust the coordinates of the benchmarks. The results showed that the side effects caused by benchmark inaccuracy could be effectively reduced, resulting in more reliable calibration parameters. In Ref. 27, a nonlinear iterative optimization method was proposed to correct the errors caused by lens distortion. Simulated and experimental results showed that the calibration accuracy can be improved compared with the conventional linear model method. In Ref. 28, a residual error compensation scheme was proposed to improve the calibration accuracy. The compensation scheme was applied to a reference plane onto which circular control points were projected by the projector. The planarity of the control points was used to rectify the remaining distortions that are not predicted by the projector lens distortion model. With such a feedback scheme, the systematic error could be reduced and the robustness improved. Instead of using projected features, a reference plane with precisely printed markers was used for the rectification of the primary calibration parameters.29 Based on this work, a more comprehensive framework for the optimization of the projector-camera-based SLS parameters is investigated and evaluated in this paper.

In addition, some projector-camera-based SLS calibration tools are widely used in the research community, such as the “Procam-calib” tool30,31 and the “SLS-calib” tool.32,33 The Procam-calib tool first calibrates the camera via Zhang’s method. A checkerboard pattern is then projected onto the calibration board, and the corners of the projected pattern are detected. By applying the ray-plane intersection method, the 3-D position of each projected corner can be calculated and used for the calibration of the projector. The SLS-calib tool improves on this by using local homographies to transfer each checkerboard corner individually from the camera image plane to the projector image plane. Each local homography is valid only within its neighborhood and is used to transfer a single corner point. In this way, all pattern corners are transferred from the camera to the projector plane independently of one another, reducing the effect of lens distortions. The experimental results showed that local homographies can successfully handle projector lens distortion and improve the overall calibration accuracy.
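To make the local-homography idea concrete, the following sketch (Python with NumPy and OpenCV, used here purely for illustration; the actual tools are MATLAB-based) transfers one detected checkerboard corner from the camera image to the projector image using only the decoded correspondences in its neighborhood. The function name, the neighborhood radius, and the array layout are illustrative assumptions.

```python
import numpy as np
import cv2

def transfer_corner_to_projector(corner_cam, cam_pts, proj_pts, radius=30.0):
    """Map one checkerboard corner from the camera image to the projector
    image with a local homography (the idea behind the SLS-calib tool).

    corner_cam        : (2,) corner location in camera pixels.
    cam_pts, proj_pts : (N, 2) decoded camera<->projector correspondences,
                        e.g., from gray-code or phase-shift decoding.
    radius            : neighborhood size in camera pixels; only nearby
                        correspondences are used, so lens distortion is
                        approximately negligible within the fit.
    """
    corner_cam = np.asarray(corner_cam, dtype=np.float64)
    d = np.linalg.norm(cam_pts - corner_cam[None, :], axis=1)
    sel = d < radius
    if sel.sum() < 4:
        raise ValueError("not enough correspondences near the corner")

    # Homography valid only within this small neighborhood.
    H, _ = cv2.findHomography(cam_pts[sel].astype(np.float32),
                              proj_pts[sel].astype(np.float32), cv2.RANSAC)

    # Apply it to the single corner (homogeneous coordinates).
    p = H @ np.array([corner_cam[0], corner_cam[1], 1.0])
    return p[:2] / p[2]
```

Because every corner is transferred with its own homography, lens distortion only has to be negligible over a small neighborhood rather than over the whole image, which is why this strategy copes with projector distortion better than a single global homography would.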

The proposed calibration approach for the projector-camera-based SLS consists of two steps: primary calibration and parameter optimization in 3-D space. To model the system, a full pinhole model that contains radial and tangential lens distortions is adopted for both the camera and projector. The calibration procedures described in Refs. 30–33 can be used for the primary calibration of the system. Then, a reference plane with some precisely printed markers is used for the optimization of the primary calibration parameters.

Primary Calibration of the Structured Light System

The geometric model of the projector-camera-based SLS is depicted in Fig. 1. The parameters to be estimated include the intrinsic parameters of both the camera and projector, as well as the extrinsic parameters between them. To describe the system model more accurately, radial and tangential distortions are considered for both the camera and projector.

Fig. 1: Geometric model of a typical projector-camera-based SLS.

Let $M_c = [X_c\ Y_c\ Z_c]^T$ denote the 3-D coordinates of a spatial point with respect to the camera reference frame, and let $m_c = [u_c\ v_c]^T$ denote its corresponding pixel coordinates on the camera image plane. According to the pinhole model, the normalized form of $m_c$ can be written as

$$\tilde{m}_c = [\tilde{u}_c\ \ \tilde{v}_c]^T = [X_c/Z_c\ \ Y_c/Z_c]^T. \tag{1}$$

Considering the radial and tangential lens distortions, the distorted form of $\tilde{m}_c$ can be expressed as

$$L(\tilde{m}_c) = \tilde{m}_c \cdot \left(1 + k_{c1} r_c^2 + k_{c2} r_c^4 + k_{c3} r_c^6\right) + \Delta t(\tilde{m}_c), \tag{2}$$

where $r_c^2 = \tilde{u}_c^2 + \tilde{v}_c^2$ and $\Delta t(\tilde{m}_c)$ is the tangential distortion vector

$$\Delta t(\tilde{m}_c) = \begin{bmatrix} 2 k_{c4} \tilde{u}_c \tilde{v}_c + k_{c5}\left(r_c^2 + 2\tilde{u}_c^2\right) \\ k_{c4}\left(r_c^2 + 2\tilde{v}_c^2\right) + 2 k_{c5} \tilde{u}_c \tilde{v}_c \end{bmatrix}. \tag{3}$$

The homogeneous pixel coordinate $\bar{x}_c$ of the camera point $m_c$, with lens distortions taken into account, is expressed as

$$\bar{x}_c = [x_c\ y_c\ 1]^T = K_c \cdot L(\tilde{m}_c), \tag{4}$$

with $L(\tilde{m}_c)$ written in homogeneous form, where $K_c$ is the intrinsic parameter matrix of the camera,

$$K_c = \begin{bmatrix} f_{xc} & \alpha_c \cdot f_{xc} & c_{xc} \\ 0 & f_{yc} & c_{yc} \\ 0 & 0 & 1 \end{bmatrix}. \tag{5}$$

The same model applies to the projector. For a complete model with lens distortions, there are 10 parameters to be estimated for each device, i.e., $\{f_x, f_y, c_x, c_y, k_1, k_2, k_3, k_4, k_5, \alpha\}$. The parameter $\alpha$ refers to the skewness of the sensor axes, which can be assumed to be 0 for most modern imaging sensors.18,31,33
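As a numerical illustration of Eqs. (1)–(5), the short sketch below (Python/NumPy, used here only for illustration) projects a 3-D point expressed in the camera frame to pixel coordinates. The distortion ordering follows the notation above ($k_1$–$k_3$ radial, $k_4$–$k_5$ tangential), which differs from, e.g., OpenCV's $(k_1, k_2, p_1, p_2, k_3)$ convention; the skew $\alpha$ is taken as zero.

```python
import numpy as np

def project_point(M, K, k):
    """Project a 3-D point M (camera frame) to pixel coordinates with the
    pinhole-plus-distortion model of Eqs. (1)-(5).

    M : (3,) point [X, Y, Z] in the camera reference frame.
    K : (3, 3) intrinsic matrix of Eq. (5), skew assumed zero.
    k : (5,) distortion vector [k1, k2, k3, k4, k5]: k1-k3 radial,
        k4-k5 tangential (the paper's ordering, not OpenCV's).
    """
    X, Y, Z = M
    u, v = X / Z, Y / Z                                   # Eq. (1): normalization
    r2 = u * u + v * v
    radial = 1.0 + k[0] * r2 + k[1] * r2**2 + k[2] * r2**3
    du = 2.0 * k[3] * u * v + k[4] * (r2 + 2.0 * u * u)   # Eq. (3)
    dv = k[3] * (r2 + 2.0 * v * v) + 2.0 * k[4] * u * v
    xd, yd = u * radial + du, v * radial + dv             # Eq. (2)
    p = K @ np.array([xd, yd, 1.0])                       # Eq. (4)
    return p[:2] / p[2]

# Example with illustrative numbers only (not the calibrated values):
# K = np.array([[1500.0, 0.0, 1040.0], [0.0, 1500.0, 776.0], [0.0, 0.0, 1.0]])
# print(project_point(np.array([50.0, 30.0, 700.0]), K, np.zeros(5)))
```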

The extrinsic parameters are expressed by a rotation matrix $R$ and a translation vector $T$:

$$R = \begin{bmatrix} R_{11} & R_{12} & R_{13} \\ R_{21} & R_{22} & R_{23} \\ R_{31} & R_{32} & R_{33} \end{bmatrix}, \qquad T = \begin{bmatrix} T_1 \\ T_2 \\ T_3 \end{bmatrix}. \tag{6}$$

Therefore, the coordinates $M_c$ and $M_p$ of a point with respect to the camera and projector reference frames are related as

$$\begin{bmatrix} M_c \\ 1 \end{bmatrix} = \begin{bmatrix} R & T \\ 0 & 1 \end{bmatrix} \cdot \begin{bmatrix} M_p \\ 1 \end{bmatrix}. \tag{7}$$

There are a total of 12 extrinsic parameters to be estimated between the camera and projector. For each pair of corresponding points $[x_c\ y_c\ 1]^T$ and $[x_p\ y_p\ 1]^T$ on the camera and projector image planes, a closed-form expression for the depth $Z_c$ in the camera reference frame is derived as34

$$Z_c = \frac{T_1 - x_p T_3}{\left\langle -R[1] + x_p R[3],\ \bar{x}_c \right\rangle}, \tag{8}$$

where $R[1] = [R_{11}\ R_{12}\ R_{13}]^T$ and $R[3] = [R_{31}\ R_{32}\ R_{33}]^T$ denote the first and third rows of $R$, and $\langle \cdot, \cdot \rangle$ is the inner product.
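The triangulation of Eq. (8) is a one-liner once the correspondence is expressed in normalized coordinates. The sketch below (Python/NumPy, illustrative only) assumes that $(R, T)$ map camera coordinates into the projector frame, which is the convention under which the reconstructed form of Eq. (8) holds; if the extrinsics are defined in the opposite direction, they must be inverted first.

```python
import numpy as np

def triangulate_depth(x_c, y_c, x_p, R, T):
    """Camera-frame depth Z_c from a camera-projector correspondence, Eq. (8).

    x_c, y_c : undistorted normalized camera coordinates of the point.
    x_p      : normalized horizontal projector coordinate of the same point.
    R, T     : extrinsics assumed to map camera coordinates to the projector
               frame (invert them first if defined the other way around).
    Returns the full 3-D point M_c = Z_c * [x_c, y_c, 1]^T.
    """
    x_bar = np.array([x_c, y_c, 1.0])
    num = T[0] - x_p * T[2]                        # T_1 - x_p * T_3
    den = np.dot(-R[0, :] + x_p * R[2, :], x_bar)  # <-R[1] + x_p R[3], x_bar>
    Z_c = num / den
    return Z_c * x_bar
```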

To calibrate the projector-camera-based SLS, the camera is first calibrated with a printed checkerboard pattern via the method in Ref. 18. The checkerboard corners ($m_c$) are extracted, and their corresponding 3-D points ($M_c$) on the calibration plane are estimated. Then, a closed-form solution is applied to solve for the intrinsic and extrinsic parameters of the camera. By including the lens distortion parameters, the calibration is refined by minimizing the reprojection error

$$\sum_{i=1}^{n} \sum_{j=1}^{m} \left\| m_{ij} - \mathrm{proj}(K, R_i, T_i, M_j) \right\|^2, \tag{9}$$

where $n$ is the number of calibration images, $m$ is the number of pattern feature points on each calibration plane, and $\mathrm{proj}(K, R_i, T_i, M_j)$ is the projection of the 3-D point $M_j$ onto the $i$'th calibration image.
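As a concrete reading of Eq. (9), the snippet below evaluates the summed 2-D reprojection error with OpenCV (a sketch only; routines such as cv2.calibrateCamera already minimize this cost internally, and all variable names are illustrative).

```python
import numpy as np
import cv2

def total_reprojection_error(obj_pts, img_pts, K, dist, rvecs, tvecs):
    """Evaluate the 2-D reprojection cost of Eq. (9).

    obj_pts : list of (m, 3) arrays of pattern points, one per calibration pose.
    img_pts : list of (m, 2) arrays of detected corners in the image.
    K, dist : intrinsic matrix and distortion coefficients (OpenCV layout).
    rvecs, tvecs : per-pose extrinsics, e.g., returned by cv2.calibrateCamera.
    """
    err = 0.0
    for obj, img, rvec, tvec in zip(obj_pts, img_pts, rvecs, tvecs):
        proj, _ = cv2.projectPoints(obj.astype(np.float32), rvec, tvec, K, dist)
        err += np.sum((proj.reshape(-1, 2) - np.asarray(img)) ** 2)
    return err
```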

To calibrate the projector, the calibrated camera can be used. With the calibration result of the camera, the 3-D geometry of the calibration plane can be calculated, and hence the 3-D coordinates of the checkerboard corners of the projected patterns with respect to the camera reference frame. The correspondences $\{m_p, M_p\}$ can therefore be obtained, and the calibration of the projector can be performed following the camera calibration procedure; the extrinsic parameters $R$ and $T$ are then calculated from Eq. (7). In existing calibration methods for the projector-camera-based SLS, optimization of the calibration parameters is applied to the camera and projector separately, with respect to the 2-D reprojection errors of the pattern feature points as given in Eq. (9). A sketch of this inverse-camera calibration step is given below; Sec. 3.2 then describes how the parameters can be optimized in 3-D space to further improve the system calibration accuracy.
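One possible implementation of this inverse-camera step is sketched below in Python with OpenCV (the original work relies on the Procam-calib/SLS-calib tools, so this is only an outline of the underlying procedure, and all variable names are assumptions). The recovered 3-D corner coordinates and their known projector-image coordinates are fed to a standard camera calibration routine, and a stereo calibration with fixed intrinsics yields R and T.

```python
import numpy as np
import cv2

def calibrate_projector(pts3d, cam_pts, proj_pts, K_cam, dist_cam, proj_size):
    """Calibrate the projector as an 'inverse' camera.

    pts3d    : list of (m, 3) arrays, 3-D coordinates of the projected corners
               recovered per pose with the already-calibrated camera.
    cam_pts  : list of (m, 2) arrays, the corners seen in the camera image.
    proj_pts : list of (m, 2) arrays, the same corners in the projector image
               (known a priori from the projected pattern).
    proj_size: (width, height) of the projector image, e.g., (1024, 768).
    """
    obj = [p.astype(np.float32) for p in pts3d]
    cam = [p.astype(np.float32) for p in cam_pts]
    prj = [p.astype(np.float32) for p in proj_pts]

    # Projector intrinsics via the standard planar calibration routine.
    _, K_proj, dist_proj, _, _ = cv2.calibrateCamera(obj, prj, proj_size,
                                                     None, None)

    # Extrinsics between camera and projector from the shared 3-D points;
    # with CALIB_FIX_INTRINSIC only R and T are refined.
    stereo = cv2.stereoCalibrate(obj, cam, prj, K_cam, dist_cam,
                                 K_proj, dist_proj, proj_size,
                                 flags=cv2.CALIB_FIX_INTRINSIC)
    R, T = stereo[5], stereo[6]   # map camera coordinates to the projector frame
    return K_proj, dist_proj, R, T
```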

Optimization of Primary Calibration Parameters

As described in Sec. 3.1, optimizing the calibration parameters with respect to the 2-D reprojection error criterion has been a standard step in existing calibration methods, not only for cameras but also for projector-camera-based SLSs. However, such a procedure is applied to the camera and projector separately and cannot reflect the real metric errors. In this section, an additional optimization procedure performed in 3-D space is introduced to improve the calibration accuracy. The underlying principle of the proposed method is to treat all calibration parameters of the projector-camera-based SLS as a single global optimization problem. The primary calibration results of Sec. 3.1 are used as the initial values, and objective functions are constructed to minimize the 3-D metric errors. By solving the resulting nonlinear multiobjective optimization problem, the optimal calibration parameters can be obtained. The workflow of the proposed calibration procedure is shown in Fig. 2.

Fig. 2: Flowchart of the proposed parameter optimization procedure.

The object used for the optimization is very simple. To guarantee high flatness of the surface, a flat glass plate with homogeneous reflectance is used. Markers are uniformly printed on the glass surface with a precise spacing $D$, as shown in Fig. 3. The reference plane with markers is first scanned with a group of structured light patterns.35 According to the coding strategy of Ref. 35, the first image contains no pattern information and is purely white. Based on this image, a random downsampling is applied to obtain a group of image points ($p_{opt}$). Then, a threshold is applied to separate the marker areas, and the centroids of the markers ($p_m$) are calculated with subpixel accuracy. With the primary calibration parameters, the 3-D coordinates of $p_{opt}$ and $p_m$ can be calculated; they are denoted as $P_{opt}$ and $P_m$, respectively. Based on the reconstructed 3-D points $P_{opt}$ and $P_m$, three objective functions are constructed to evaluate the 3-D reconstruction accuracy as follows:

  1. Planarity error: The reference plane used for reconstruction can be regarded as a perfect plane, so in the absence of calibration and reconstruction errors the planarity error of $P_{opt}$ should be zero. Based on this prior knowledge, a least-squares plane is fitted to $P_{opt}$. Let $S$ be the number of sampled points and $d_i$ the distance from the $i$'th sampled point to the fitted plane; the mean absolute fitting residual $E_p$ is defined as
    $$E_p = \frac{1}{S}\sum_{i=1}^{S} |d_i|. \tag{10}$$
  2. Distance error: For each marker point $P_j \in P_m$, its average distance to all adjacent marker points in the horizontal and vertical directions is calculated and denoted as $d_j$. With $J$ marker points on the reference plane, the distance error objective function is defined as
    $$E_d = \frac{1}{J}\sum_{j=1}^{J} |d_j - D|. \tag{11}$$
  3. Angular error: Because inaccurate primary calibration parameters may introduce an affine distortion, the last objective function evaluates the angles between the marker points. For each marker point $P_j \in P_m$, the included angles $\theta$ formed by the connections to its adjacent marker points are calculated. The ground-truth value is $\theta_0 = 90\ \mathrm{deg}$; with $\theta_j$ denoting the average of the angles calculated at the $j$'th marker, the angular error objective function is expressed as
    $$E_\theta = \frac{1}{J}\sum_{j=1}^{J} |\theta_j - \theta_0|. \tag{12}$$
    A computational sketch of these three criteria is given after this list.
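The three criteria of Eqs. (10)–(12) reduce to a few lines of array arithmetic. The sketch below (Python/NumPy, illustrative only) assumes the reconstructed sample points $P_{opt}$ are given as an $(N, 3)$ array and the reconstructed marker centers $P_m$ as an $(r, c, 3)$ grid ordered as printed; the distance term is computed over all neighbor gaps, a slight simplification of the per-marker averaging in Eq. (11).

```python
import numpy as np

def fit_plane(P):
    """Total-least-squares plane through points P (N, 3); returns the unit
    normal n and centroid c so that signed distances are (P - c) @ n."""
    c = P.mean(axis=0)
    # The smallest singular vector of the centered points is the plane normal.
    _, _, Vt = np.linalg.svd(P - c)
    return Vt[-1], c

def planarity_error(P_opt):
    """E_p of Eq. (10): mean absolute distance of the sampled points to the
    fitted plane."""
    n, c = fit_plane(P_opt)
    return float(np.mean(np.abs((P_opt - c) @ n)))

def distance_error(P_m, D):
    """E_d of Eq. (11). P_m is an (r, c, 3) grid of reconstructed marker
    centers ordered as printed; D is the printed marker spacing."""
    dh = np.linalg.norm(P_m[:, 1:] - P_m[:, :-1], axis=2)   # horizontal gaps
    dv = np.linalg.norm(P_m[1:, :] - P_m[:-1, :], axis=2)   # vertical gaps
    d = np.concatenate([dh.ravel(), dv.ravel()])
    return float(np.mean(np.abs(d - D)))

def angular_error(P_m, theta0=90.0):
    """E_theta of Eq. (12): mean absolute deviation (deg) of the angle between
    the horizontal and vertical neighbor directions from 90 deg."""
    errs = []
    r, c, _ = P_m.shape
    for i in range(r - 1):
        for j in range(c - 1):
            u = P_m[i, j + 1] - P_m[i, j]      # toward the horizontal neighbor
            v = P_m[i + 1, j] - P_m[i, j]      # toward the vertical neighbor
            cosang = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
            ang = np.degrees(np.arccos(np.clip(cosang, -1.0, 1.0)))
            errs.append(abs(ang - theta0))
    return float(np.mean(errs))
```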

Fig. 3: Illustration of the pattern structure used for parameter optimization.

There are a total of 30 parameters to be optimized: eight intrinsic parameters in $\{f_c, c_c, f_p, c_p\}$, 12 extrinsic parameters in $R$ and $T$, and 10 lens distortion parameters in $\{k_c, k_p\}$; the sensor skewness factors ($\alpha_c$ and $\alpha_p$) are assumed to be 0 and are not considered in the optimization. A vector $x$ is defined to collect all the parameters to be optimized:

$$x = [f_c, c_c, f_p, c_p, R, T, k_c, k_p]^T. \tag{13}$$

With conventional calibration procedures,30–33 we obtain the initial value of $x$, denoted $x_0$. A multiobjective optimization problem is then established as

$$\begin{aligned}
&\min\ \{E_p(x),\ \alpha \cdot E_d(x),\ \beta \cdot E_\theta(x)\},\\
&\text{s.t.}\quad R \cdot R^T = I,\\
&\qquad l_b \cdot [x_1^0 \cdots x_{20}^0] \le [x_1 \cdots x_{20}] \le l_u \cdot [x_1^0 \cdots x_{20}^0],\\
&\qquad l_{kb} \le [x_{21} \cdots x_{25}] \le l_{ku},\\
&\qquad l_{kb} \le [x_{26} \cdots x_{30}] \le l_{ku},
\end{aligned} \tag{14}$$

where the first constraint $R \cdot R^T = I$ guarantees the orthogonality of the matrix $R$, and the others bound the ranges of the parameters. For $[x_1 \cdots x_{20}]$, which refers to the parameters $[f_c, c_c, f_p, c_p, R, T]$, the empirical values $l_b = 0.9$ and $l_u = 1.1$ are adopted. $l_{kb}$ and $l_{ku}$ are set to the fixed bounds $[-0.1, -0.5, -0.5, -0.5, -0.5]$ and $[0.1, 0.5, 0.5, 0.5, 0.5]$, respectively. The weighting factors $\alpha$ and $\beta$ balance the contributions of the three error criteria and are chosen empirically so that $E_p \approx \alpha \cdot E_d \approx \beta \cdot E_\theta$; in our experiments, both $\alpha$ and $\beta$ are set to 1. This multiobjective optimization problem can be solved with off-the-shelf mathematical tools. In our work, the MATLAB function “fminsearch” is used, which implements the Nelder–Mead simplex search method described in Ref. 36.
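The paper solves Eq. (14) with MATLAB's fminsearch. A rough Python analogue using SciPy's Nelder–Mead routine is sketched below, in which the three criteria are combined into a single weighted scalar cost (a common simplification of the multiobjective formulation) and the bound and orthogonality constraints are omitted for brevity. Here reconstruct_plane_points is a hypothetical routine that rebuilds $P_{opt}$ and $P_m$ from a candidate parameter vector $x$ and the decoded scan data, and the three error functions are those of the previous sketch.

```python
import numpy as np
from scipy.optimize import minimize

def calibration_cost(x, scan_data, D, alpha=1.0, beta=1.0):
    """Scalarized form of Eq. (14): E_p + alpha * E_d + beta * E_theta.

    x         : 30-element parameter vector of Eq. (13).
    scan_data : decoded camera/projector correspondences of the reference plane
                (whatever the hypothetical reconstruction routine requires).
    D         : printed marker spacing (100 mm in the experiments).
    """
    # Hypothetical step: triangulate the sampled points and the marker centers
    # of the reference plane with the candidate parameters x.
    P_opt, P_m = reconstruct_plane_points(x, scan_data)
    return (planarity_error(P_opt)            # E_p, Eq. (10)
            + alpha * distance_error(P_m, D)  # E_d, Eq. (11)
            + beta * angular_error(P_m))      # E_theta, Eq. (12)

def optimize_parameters(x0, scan_data, D):
    """Refine the primary calibration x0 with a derivative-free simplex search
    (Nelder-Mead), the method behind MATLAB's fminsearch (Ref. 36)."""
    res = minimize(calibration_cost, x0, args=(scan_data, D),
                   method="Nelder-Mead",
                   options={"maxiter": 20000, "xatol": 1e-6, "fatol": 1e-6})
    return res.x
```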

The experimental setup consists of one camera (Point Grey FL3-U3-32S2C-CS, with a resolution of 2080×1552 pixels, a USB 3.0 interface, and 60 fps), one digital light processing (DLP) projector (BenQ GP1, with a resolution of 1024×768 pixels, an HDMI interface, and 60 Hz), and a rotation table, as shown in Fig. 4. The camera is fitted with a 10-mm lens. The rotation table is used for multiple-view 3-D scanning. The working distance of the system is about 700 mm, and the scanning field is about 400×300 mm. The system is first calibrated with a printed checkerboard pattern via the conventional calibration methods,30–33 in which the 2-D reprojection error criterion is used to optimize the system parameters. A flat glass plate with homogeneous reflectance is used for the parameter optimization; circular markers are uniformly printed on the glass surface with a precise spacing of 100 mm. The structured light method described in Ref. 35 is used for 3-D scanning. The first experiment is conducted on the reference plane, and the 3-D measurement results with respect to the three error criteria are used to evaluate the different calibration parameters. The second experiment uses the rotational 3-D scanning system to evaluate the 3-D reconstruction results qualitatively.

Fig. 4: The experimental projector-camera-based SLS, which contains one camera, one projector, and a rotation table.

Calibration Results with and without Optimization

The Procam-calib tool30,31 and the SLS-calib tool,32,33 two widely used calibration tools for projector-camera-based SLSs, are used for the primary calibration in our work. Figure 5 shows the checkerboard calibration plane and the planar surface with markers used for parameter optimization. The reference plane is scanned five times at different positions and poses in the working volume. With the downsampling procedure, 10,000 points are randomly selected and reconstructed for the calculation of the planarity error. The calibration parameters obtained with the SLS-calib tool are used as the initial values, and the optimization algorithm is implemented in MATLAB 2012. The calibration parameters obtained with the Procam-calib tool, the SLS-calib tool, and the proposed method are given in Table 1. From the results, we can see that the major differences among the three calibration results appear in the distortion factors $k_c$ and $k_p$.

Fig. 5: (a) The printed calibration pattern used for the primary calibration and (b) the planar surface with markers used for parameter optimization.

Table 1: Calibration parameters by the Procam-calib tool, the SLS-calib tool, and the proposed method.
Evaluation of Calibration Accuracy

The reference plane is also used to evaluate the accuracy of the different calibration parameters. The plane is placed at a position and pose different from those used in the optimization stage and is reconstructed with each of the three sets of calibration parameters listed in Table 1. The scanning region is about 400×300 mm. By fitting a plane to each of the three reconstructed point clouds, the distributions of the fitting errors are obtained as displayed in Fig. 6. From the results, we can see that, with the classical calibration parameters, distinct reconstruction errors arise at the plane corners and boundaries, as shown in Figs. 6(a) and 6(b). This is mainly caused by inaccurately calibrated lens distortion parameters. The values of the metric error terms $E_p$, $E_d$, and $E_\theta$ are also calculated and given in Table 2. From the results, we can see that the planarity error with the optimized calibration parameters is improved by a factor of 5 to 10 over the classical calibration methods. The distance error with the optimized parameters is very close to 0, compared with 0.219 and 0.166 mm for the other two methods. The angular error is also improved. To evaluate the robustness of the optimized calibration parameters, the 3-D reconstruction procedure is repeated with different positions and poses of the target plane, and similar measurement results are obtained.

Fig. 6: Distribution of plane-fitting errors by the calibration results of (a) the Procam-calib tool, (b) the SLS-calib tool, and (c) the proposed method.

Table 2: Evaluation of measurement accuracy by different calibration parameters.
Qualitative Evaluation of Calibration Parameters

This experiment evaluates the 3-D reconstruction quality obtained with the different calibration parameters. For objects with free-form surfaces, calibration accuracy is difficult to evaluate from a single 3-D scan. To make the comparison possible, a rotation table is added to the projector-camera-based SLS, as shown in Fig. 4. The object is rotated in fixed steps of 30 deg, and the rotation axis is calibrated by the method in Ref. 37. A complete scan therefore yields 12 surface patches. To align these surface patches, a rigid transformation is applied using the calibrated rotation axis and the known rotation angle.
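A minimal sketch of this alignment step is given below (Python/NumPy, illustrative only), assuming the turntable axis has already been calibrated as a point on the axis and a unit direction (e.g., by the method of Ref. 37); the $k$'th patch is rotated back by $k \times 30$ deg about that axis using Rodrigues' formula so that all patches share the frame of the first scan.

```python
import numpy as np

def axis_angle_rotation(axis, angle_deg):
    """Rotation matrix for a rotation of angle_deg about a unit axis
    (Rodrigues' formula)."""
    a = np.asarray(axis, dtype=float)
    a = a / np.linalg.norm(a)
    t = np.radians(angle_deg)
    K = np.array([[0.0, -a[2], a[1]],
                  [a[2], 0.0, -a[0]],
                  [-a[1], a[0], 0.0]])
    return np.eye(3) + np.sin(t) * K + (1.0 - np.cos(t)) * (K @ K)

def align_patches(patches, axis_point, axis_dir, step_deg=30.0):
    """Bring the k-th scanned patch (an (N_k, 3) array) into the frame of the
    first scan by rotating it back about the calibrated turntable axis."""
    aligned = []
    for k, P in enumerate(patches):
        R = axis_angle_rotation(axis_dir, -k * step_deg)  # undo k rotation steps
        aligned.append((P - axis_point) @ R.T + axis_point)
    return aligned
```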

A plaster pot is used in this experiment, as shown in Fig. 7(a). The calibration parameters obtained with the Procam-calib tool, the SLS-calib tool, and the proposed approach are each used for 3-D reconstruction. The registered 3-D models are shown in Figs. 7(b)–7(d). Red lines on the 3-D models indicate gaps between adjacent surface patches, i.e., regions where the reconstructed patches are distorted and cannot be well aligned. From Fig. 7(b), we can see that the 3-D model reconstructed with the Procam-calib parameters has distinct distortions near the surface patch boundaries; in these areas, adjacent scanned patches cannot be well registered. The reconstruction quality is improved with the SLS-calib parameters, as shown in Fig. 7(c), but a few surface regions still cannot be well aligned. Figure 7(d) shows the 3-D model reconstructed with the calibration parameters of our method, in which most of the surface patches are precisely registered. Figure 8(a) shows another 3-D object with abundant surface detail, such as hair and some carved text. Some areas of the model are enlarged in Fig. 8(b) for close observation. The results show that tiny features are precisely registered, which is attributable to the accurate calibration parameters. More results are provided in Fig. 9 to show the high 3-D reconstruction quality obtained with the optimized calibration parameters. These evaluations demonstrate the calibration accuracy of the proposed method.

Fig. 7: Evaluation of 3-D reconstruction quality by various calibration methods. (a) Experimental target, (b) registered 3-D model by the parameters of the Procam-calib tool, (c) registered 3-D model by the parameters of the SLS-calib tool, and (d) registered 3-D model by the proposed calibration method.

Fig. 8: 3-D reconstruction of a plaster model with the proposed calibration method. (a) The reconstructed 3-D model under various viewpoints and (b) enlarged surface areas for close observation.

Fig. 9: Reconstructed 3-D models of additional objects obtained with the proposed calibration method.

In this work, an accurate and practical calibration method is introduced for the projector-camera-based SLS. A conventional calibration method is first applied for the primary calibration of the system. Then, a planar surface with precisely printed markers is used for parameter optimization. All the intrinsic parameters of the camera and projector, together with the extrinsic parameters between them, are treated as a single global optimization problem. Compared with classical calibration methods, which optimize the parameters in 2-D image space by minimizing reprojection errors, the proposed optimization is executed directly in 3-D space. Three error criteria are introduced as the objective functions: the planarity error, the distance error, and the angular error. Using the primary calibration parameters as initial values, the nonlinear multiobjective optimization problem is solved to obtain the optimal calibration parameters.

The first experiment is conducted on the planar surface. The results show that, with the proposed calibration method, the 3-D measurement accuracy is improved by a factor of 5 to 10 compared with classical calibration methods. The second experiment evaluates the 3-D reconstruction results qualitatively and shows that 3-D models of higher quality are obtained with the optimized calibration parameters. These comparisons demonstrate the improvement in calibration accuracy achieved by the proposed method. The method is simple and easy to implement and can be widely applied to SLS calibration to improve measurement accuracy.

This work was supported in part by the National Natural Science Foundation of China (Nos. 61375041 and U1613213) and the Shenzhen Science Plan (Nos. JCYJ20140509174140685, JCYJ20150401150223645, and JSGG20150925164740726).

References

1. Salvi J. et al., "A state of the art in structured light patterns for surface profilometry," Pattern Recognit. 43(8), 2666–2680 (2010).
2. Gupta M. et al., "A practical approach to 3D scanning in the presence of interreflections, subsurface scattering and defocus," Int. J. Comput. Vision 102(1–3), 33–55 (2013).
3. Jeught S. V. and Dirckx J. J., "Real-time structured light profilometry: a review," Opt. Lasers Eng. 87(12), 18–31 (2016).
4. Hansen K. et al., "A structured light scanner for hyper flexible industrial automation," in 2nd Int. Conf. on 3D Vision (3DV 2014), pp. 401–408 (2014).
5. Ribo M. and Brandner M., "State of the art on vision-based structured light systems for 3D measurements," in Int. Workshop on Robotic Sensors: Robotic and Sensor Environments, Canada, pp. 2–6 (2005).
6. Sung M. H. et al., "Image unprojection for 3D surface reconstruction: a triangulation-based approach," in 20th IEEE Int. Conf. on Image Processing (ICIP 2013), pp. 161–165 (2013).
7. Yang Z. M. and Wang Y. F., "Error analysis of 3D shape construction from structured lighting," Pattern Recognit. 29(2), 189–206 (1996).
8. Walch A. and Eitzinger C., "A combined calibration of 2D and 3D sensors—a novel calibration for laser triangulation sensors based on point correspondences," in Int. Joint Conf. on Computer Vision, Imaging and Computer Graphics Theory and Applications, Vol. 1, pp. 89–95 (2014).
9. Bird N. and Papanikolopoulos N., "Optimal image-based Euclidean calibration of structured light systems in general scenes," IEEE Trans. Autom. Sci. Eng. 8(4), 815–823 (2011).
10. Hamadou A. B. et al., "Flexible calibration of structured-light systems projecting point patterns," Comput. Vision Image Understanding 117(10), 1468–1481 (2013).
11. Lin H. B., Nie L., and Song Z., "A single-shot structured light means by encoding both color and geometrical features," Pattern Recognit. 54(1), 178–189 (2016).
12. Xie Z. X., Wang X. M., and Chi S. K., "Simultaneous calibration of the intrinsic and extrinsic parameters of structured-light sensors," Opt. Lasers Eng. 58(7), 9–18 (2014).
13. Zhang S. and Huang P. S., "Novel method for structured light system calibration," Opt. Eng. 45(8), 083601 (2006).
14. Léandry I., Brèque C., and Valle V., "Calibration of a structured-light projection system: development to large dimension objects," Opt. Lasers Eng. 50(3), 373–379 (2012).
15. Chen R. et al., "A self-recalibration method based on scale-invariant registration for structured light measurement systems," Opt. Lasers Eng. 88(1), 75–81 (2017).
16. Liu Z. et al., "Calibration method for line-structured light vision sensor based on a single ball target," Opt. Lasers Eng. 69(6), 20–28 (2015).
17. Tsai R. Y., "A versatile camera calibration technique for high-accuracy 3D machine vision metrology using off-the-shelf TV cameras and lenses," IEEE J. Robot. Autom. 3(4), 323–344 (1987).
18. Zhang Z., "A flexible new technique for camera calibration," IEEE Trans. Pattern Anal. Mach. Intell. 22(11), 1330–1334 (2000).
19. Kim H. and Hilton A., "3D scene reconstruction from multiple spherical stereo pairs," Int. J. Comput. Vision 104(1), 94–116 (2013).
20. Ribo M. and Brandner M., "State of the art on vision-based structured light systems for 3D measurements," in Int. Workshop on Robotic and Sensor Environments, pp. 2–6 (2005).
21. Legarda-Sáenz R., Bothe T., and Jüptner W. P., "Accurate procedure for the calibration of a structured light system," Opt. Eng. 43(2), 464–471 (2004).
22. Gao W., Wang L., and Hu Z., "Flexible method for structured light system calibration," Opt. Eng. 47(8), 083602 (2008).
23. Li B., Karpinsky N., and Zhang S., "Novel calibration method for structured-light system with an out-of-focus projector," Appl. Opt. 53(16), 3415–3426 (2014).
24. Chen X. B. et al., "Accurate calibration for a camera-projector measurement system based on structured light projection," Opt. Lasers Eng. 47(3–4), 310–319 (2009).
25. Yamazaki S., Mochimaru M., and Kanade T., "Simultaneous self-calibration of a projector and a camera using structured light," in IEEE Conf. on Computer Vision and Pattern Recognition, pp. 60–67 (2011).
26. Yin Y. et al., "Calibration of fringe projection profilometry with bundle adjustment strategy," Opt. Lett. 37, 542–544 (2012).
27. Ma S. D. et al., "Flexible structured-light-based three-dimensional profile reconstruction method considering lens projection-imaging distortion," Appl. Opt. 51(13), 2419–2428 (2012).
28. Han D., Chimienti A., and Menga G., "Improving calibration accuracy of structured light systems using plane-based residual error compensation," Opt. Eng. 52, 104106 (2013).
29. Ye Y. P. and Song Z., "A practical means for the optimization of structured light system calibration parameters," in IEEE Int. Conf. on Image Processing (2016).
30. Falcao G. et al., "Projector-camera calibration toolbox," http://code.google.com/p/procamcalib (2009).
31. Bouguet J. Y., "MATLAB camera calibration toolbox," http://www.vision.caltech.edu/bouguetj/calib_doc/index.html (2015).
32. Moreno D. and Taubin G., "Simple, accurate, and robust projector-camera calibration," in Second Int. Conf. on 3D Imaging, Modeling, Processing, Visualization and Transmission (3DIMPVT 2012), pp. 464–471 (2012).
33. Moreno D. and Taubin G., "Structured light calibration tool," http://mesh.brown.edu/calibration/ (2012).
34. Bouguet J.-Y., "Visual methods for three-dimensional modeling," PhD Thesis, California Institute of Technology, pp. 107–110 (1999).
35. Song Z., Chung R., and Zhang X. T., "An accurate and robust strip-edge based structured light means for shiny surface micro-measurement in 3D," IEEE Trans. Ind. Electron. 60(3), 1023–1032 (2013).
36. Lagarias J. C. et al., "Convergence properties of the Nelder–Mead simplex method in low dimensions," SIAM J. Optim. 9(1), 112–147 (1998).
37. Pang X. F. et al., "A tool-free calibration method for turntable-based 3D scanning systems," IEEE Comput. Graphics Appl. 36(1), 52–61 (2016).

Lei Nie received his BS degree in computer science from Beihang University in 2010. He is a PhD student at the Shenzhen Institutes of Advanced Technology, Chinese Academy of Sciences (CAS). His current research interests include machine learning and stereo vision.

Yuping Ye received his BS degree in electronics from Wuhan University in 2013. He is a master’s student at the Shenzhen Institutes of Advanced Technology, CAS. His current research interest is structured light-based 3-D sensing technology.

Zhan Song received his PhD in mechanical and automation engineering from Chinese University of Hong Kong, Hong Kong, in 2008. He is currently with the Shenzhen Institutes of Advanced Technology, CAS, as a professor. His current research interests include structured-light-based sensing, image processing, 3-D face recognition, and human-computer interaction.

© The Authors. Published by SPIE under a Creative Commons Attribution 3.0 Unported License. Distribution or reproduction of this work in whole or in part requires full attribution of the original publication, including its DOI.
