Instrumentation, Techniques, and Measurement

Parallax correction of texture image in fringe projection profilometry

Zhuang Lu, Jun Zhou, Hongwei Guo

Shanghai University, Laboratory of Applied Optics and Metrology, Department of Precision Mechanical Engineering, 149 Yanchang Road, Shanghai 200072, China

Opt. Eng. 54(8), 084107 (Aug 13, 2015). doi:10.1117/1.OE.54.8.084107
History: Received February 25, 2015; Accepted July 21, 2015

Open Access

Abstract. In fringe projection profilometry, the measurement system generally consists of a projector for casting fringes onto a measured object and a monochrome camera for capturing the deformed fringe patterns. In addition to these components, we can add a color camera for capturing the texture of the object simultaneously. For implementing texture mapping on the reconstructed three-dimensional (3-D) surface, the parallax between the views of the texture camera and the measuring camera has to be corrected. For this purpose, we analyze the geometry of the fringe projection system with a color texture camera and further suggest a system calibration method. Using this method, the corresponding relationship between the texture and the 3-D data and the mapping relationship between the depths and the fringe phases are determined simultaneously, so that the time required for system calibration is reduced. Data processing with this method has low computational complexity because it involves only linear minimizations. Using the calibration results, we can transform the texture image from the view of the color camera to that of the measuring camera and precisely map it onto the reconstructed object surface. Experimental results demonstrate that this method is effective in correcting the parallax of the texture image.

Introduction

Fringe projection profilometry1 has been extensively developed to meet the demands of various applications. It has advantages over other methods in being noncontacting and providing whole-field information. With it, periodic fringe patterns are projected onto the object surface, then the distorted patterns caused by the depth variations of the surface are recorded from different directions. Analyzing the distorted patterns allows us to reconstruct the depth map of the object. In addition to the depth measurement, we can capture the object texture and map it on the reconstructed three-dimensional (3-D) surface in order to enhance its realism.

Combined with texture mapping, the fringe projection technique finds a broader range of applications, especially in fields such as virtual reality,2–4 commercial advertisement and entertainment,5 medical diagnosis,6–8 cosmetics,9 anthropology,10 and cultural heritage preservation.11–14

Many methods have been developed for acquiring the texture of a measured object. Among them, the simplest is to use a single camera to capture both the deformed fringe patterns and the texture image. As typical examples, Ref. 15 proposed a Gray-code fringe method and Ref. 16 proposed a marker-encoded fringe pattern method for measuring 3-D shapes with abrupt steps; in both, the fringe patterns and the textures are recorded in turn using the same camera. To decrease the image-capturing time in dynamic measurement, the texture is usually calculated from the deformed fringe patterns rather than captured directly by the camera. For example, in Fourier transform profilometry,17 one can recover the depth map from the spectrum of a single fringe pattern and simultaneously retrieve the (blurred) texture image by use of a notch filter or a low-pass filter.18–20 With the phase-shifting technique,21 which requires capturing a sequence of fringe patterns, simply averaging the captured fringe patterns produces the texture image, i.e., the distribution of the background intensities with the fringes removed.22–24 These methods can also be used for calculating a color texture image, which is more realistic than a gray one, as long as a color camera is used and the object is illuminated with white light.22,23 In the more complex case of a colored light source illuminating a colorful object, for example when the color-coded fringe projection technique is used, a more sophisticated algorithm is required to retrieve the texture from the colored fringe patterns.25 With these single-camera methods, however, the Bayer color processing and chromatic balance may destroy the response linearity of the camera, leading to difficulties in fringe pattern analysis or degrading the measurement accuracy.

To overcome the aforementioned problems, the measurement system can be equipped with two separate cameras, one monochrome and one color. The monochrome camera, with its high linearity, is used to capture the deformed fringe patterns in order to guarantee the 3-D measurement accuracy. The color camera is used for capturing the texture; its parameters are adjusted independently, thus adapting the color texture image to the observation of human eyes, whose response is nonlinear. In Ref. 26, a beam splitter is mounted on the optical axis of a black-and-white (B/W) camera, by which the texture can be captured with a color camera from the same view. In Refs. 27 and 28, a multiple-chip CCD camera is used to capture the infrared fringe patterns and the color texture simultaneously. This camera can be thought of as a combination of two cameras that have different channels but share the same optical axis. With these methods, the texture data align exactly with the recovered 3-D data since the texture and the fringe patterns are recorded from the same view.

In comparison with the aforementioned techniques, a more flexible method is to use two cameras with separate optical axes.29,30 In Ref. 30, for example, an infrared camera is used to capture the deformed fringe patterns, and a color camera is used for capturing the color texture. Such a system is much easier to fabricate. However, the texture and the 3-D shape data are obtained from different views, and therefore a parallax exists between them. One solution to this problem is to perform a registration between the texture image and the 3-D model by matching their features (e.g., points, lines, and edges).31,32 This method is not always effective, however, especially when the features on the 3-D model are too weak. Another method is to employ a binocular geometry.30 The mapping relationship between object depths and fringe phases is determined by calibrating the projector-camera system. In addition, the intrinsic and extrinsic parameters of the texture camera and the measuring camera have to be calibrated, so that the corresponding relation between the texture image and the 3-D model can be determined. The mainstream camera calibration techniques33,34 are flexible enough to meet the demands of various applications and can also be used to calibrate the texture camera in the fringe projection system. However, these methods usually involve complex nonlinear minimization problems, implying that they are somewhat complicated to implement and program unless existing software is available.

In this paper, we analyze the geometry of the fringe projection system with a color texture camera and suggest a simple calibration method, by which the corresponding relationship between the texture and the 3-D data and the mapping relationship between the phases and depths are determined simultaneously, so that the time required for system calibration is reduced. In addition, the data processing involves only linear minimizations, which have low computational complexity. Using the calibration results, we can transform the texture image from the view of the color camera to that of the measuring camera, with its parallax removed, and then precisely map the texture onto the reconstructed object surface. Experimental results demonstrate that this method is effective in correcting the parallax of the texture.

Measurement System and the Parallax of Texture Image

Figure 1 shows the measurement system, which mainly consists of a projector, a B/W camera, a color camera, and a stage mounted on a track. The projector casts sinusoidal fringe patterns onto the object surface. The B/W camera (i.e., the measuring camera) is used to record the deformed fringe patterns. This measuring camera should have a linear response to intensities in order to guarantee the accuracy of phase measuring. In addition, the color camera (i.e., the texture camera) is used to capture the color texture of the measured object. In addition to these components, we have a standard plane that serves as the reference plane, a benchmark for depth measurement. In the calibration procedure, it is also used as the calibration board, mounted on the stage and shifted by the track.

In Fig. 1, the measuring and texture cameras have different optical axes, and therefore a parallax exists between their views. In other words, the color photograph captured by the texture camera cannot be used directly for texture mapping. In this situation, parallax correction is a necessary step for texture mapping: through it, the texture image is transformed from the view of the texture camera to that of the measuring camera, thus aligning it exactly with the reconstructed 3-D data.

The parallax of the texture image depends not only on the geometry of the measurement system, but also on the 3-D shape of the measured object. Parallax correction of the texture image has to be implemented concurrently with 3-D data reconstruction. Therefore, in the next two subsections, we shall introduce the principle of depth measurement, including steps of fringe analysis and the phase to depth conversion. In Sec. 2.4, the relationships between the object lateral coordinates and the camera pixel coordinates will be determined. Using these results allows us to deduce in Sec. 2.5 a method for texture parallax correction.

Phase Measuring

Phase measuring is an important step in the measurement; it affects the accuracy of both depth reconstruction and texture parallax correction. There are many methods for extracting phases from fringe patterns. In this paper, we presume that the phase-shifting technique is used. With it, the phase-shifted fringe patterns captured by the measuring camera can be represented as

I_n(u,v) = J(u,v) + \gamma(u,v)\cos[\varphi(u,v) + 2\pi n/N], \qquad (1)

where N is the number of phase shifts and n = 0, 1, …, N−1 denotes the phase-shift index. (u,v) are the pixel coordinates on the image plane of the camera, and J(u,v) and γ(u,v) denote the recorded background intensity and the modulation at the pixel (u,v), respectively. φ(u,v) is the phase at (u,v). The wrapped phase is calculated using a standard phase-shifting algorithm,21

\varphi_{\mathrm{wrapped}}(u,v) = \arctan\!\left[\frac{\sum_{n=0}^{N-1} I_n(u,v)\sin(2\pi n/N)}{\sum_{n=0}^{N-1} I_n(u,v)\cos(2\pi n/N)}\right]. \qquad (2)
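
For readers who implement this step numerically, the sums in Eq. (2) map directly onto array operations. The following minimal sketch is our own illustration, not code from the paper; it assumes the N captured patterns are stacked in a NumPy array and evaluates Eq. (2) pixelwise, with numpy.arctan2 used so that the wrapped phase falls in (−π, π].

```python
import numpy as np

def wrapped_phase(patterns):
    """Evaluate Eq. (2) for a stack of N phase-shifted fringe patterns.

    patterns: array of shape (N, H, W) holding I_n(u, v) for n = 0..N-1.
    Returns the wrapped phase map in radians, in the range (-pi, pi].
    """
    N = patterns.shape[0]
    n = np.arange(N).reshape(-1, 1, 1)          # phase-shift indexes
    num = np.sum(patterns * np.sin(2 * np.pi * n / N), axis=0)
    den = np.sum(patterns * np.cos(2 * np.pi * n / N), axis=0)
    return np.arctan2(num, den)                  # wrapped phase map
```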

The phase map calculated using Eq. (2) is not continuous but is constrained to its principal values within the range from −π to π rad. The discontinuities in a phase map are usually removed by implementing a two-dimensional (2-D) phase unwrapping.35 In fringe projection profilometry, however, conventional spatial phase unwrapping algorithms are not always effective, because abrupt steps in the measured shape may induce shadows, over-dense fringes, and fringe-order ambiguities in the fringe patterns.

In this work, we use the temporal phase unwrapping algorithm36 to overcome the problems described above. We project several groups of phase-shifting fringe patterns onto the object. From one group to the next, the spatial frequency of the fringes increases by a fixed ratio of 2. In the first group, each pattern contains only one fringe, so there is no ambiguity in the fringe order. The phase map of each group is calculated using Eq. (2), so we obtain a sequence of wrapped phase maps with increasing sensitivities. In principle, their unwrapped phases at each pixel should form a geometric progression with a common ratio of 2. Therefore, the phase at each pixel can be unwrapped over time, independently of other pixels. Using this temporal phase-unwrapping technique, the unwrapped phase map of the last group, which has the highest sensitivity, is used for measurement purposes. In this procedure, the invalid regions induced by shadows and other factors are segmented and excluded from the fringe patterns by setting a threshold on the fringe modulations.37 This temporal phase-unwrapping method is very stable in providing an absolute measure of surface height, but it has the drawback that a relatively large number of fringe patterns have to be captured.
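
A compact way to picture this temporal unwrapping is the following sketch, which is an illustrative implementation under our own assumptions rather than the authors' code. It takes the wrapped phase maps of the successive groups, whose fringe frequencies double from one group to the next, and unwraps each pixel independently by propagating the fringe order from the coarsest map to the finest.

```python
import numpy as np

def temporal_unwrap(wrapped_maps):
    """Temporally unwrap a sequence of wrapped phase maps.

    wrapped_maps: list of 2-D arrays, coarsest group first; the fringe
    frequency (and hence the true phase) is assumed to double between
    consecutive maps, and the first map contains a single fringe.
    Returns the unwrapped phase map of the finest (last) group.
    """
    phi = wrapped_maps[0]                # one fringe only: already unambiguous
    for wrapped in wrapped_maps[1:]:
        predicted = 2.0 * phi            # expected phase at the doubled frequency
        # choose the fringe order that brings the wrapped value closest to the prediction
        order = np.round((predicted - wrapped) / (2 * np.pi))
        phi = wrapped + 2 * np.pi * order
    return phi
```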

A more efficient method that requires fewer fringe patterns is to use the Gray-code technique combined with phase-shifting fringe projection.38 In this method, the Gray-code fringes are exploited to detect the surface discontinuities without any ambiguity, and the phase-shifting technique is used to achieve a high measurement resolution. The limitation of this technique is that it involves an image binarization procedure, which is sensitive to illumination nonuniformity and noise.

For convenience, the same phase measurement method is also applied to the reference plane. The unwrapped phase maps of the object and the reference plane are denoted by φ_obj and φ_ref, respectively, so the phase difference is

\Delta\varphi(u,v) = \varphi_{\mathrm{obj}}(u,v) - \varphi_{\mathrm{ref}}(u,v), \qquad (3)
from which the depth map of the measured surface relative to the reference plane can be calculated and the 3-D shape of the object can be reconstructed.

Depth Map Measuring

Fringe projection profilometry is a triangulation-based technique. When the projector and the B/W camera in Fig. 1 are located arbitrarily, the mapping relationship between the depths and the phase differences depends on the system parameters and becomes very complex. Even so, according to the geometric analysis in Ref. 39, the relationship at each pixel (u,v) can always be formulated as

\Delta\varphi(u,v) = \frac{a(u,v)\,h(u,v)}{1 + b(u,v)\,h(u,v)}, \qquad (4)

where h(u,v) is the object depth relative to the reference plane at the pixel (u,v). The coefficients a(u,v) and b(u,v) are functions of the pixel coordinates (u,v), and their distributions depend on the system parameters such as the pitch of the fringes and the positions and orientations of the projector and the camera. The coefficients a(u,v) and b(u,v) can be determined through a system calibration and saved in look-up tables for use in measurement.

According to Eq. (4), the depths of the measured object are reconstructed with

z(u,v) = h(u,v) = \frac{\Delta\varphi(u,v)}{a(u,v) - b(u,v)\,\Delta\varphi(u,v)}. \qquad (5)
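
As a numerical illustration (our own sketch, not taken from the paper), Eq. (5) becomes a single vectorized expression once the calibrated coefficient maps a(u,v) and b(u,v) are stored as arrays:

```python
import numpy as np

def depth_from_phase(dphi, a, b):
    """Convert a phase-difference map into a depth map via Eq. (5).

    dphi, a, b: 2-D arrays of identical shape; a and b are the per-pixel
    coefficients obtained from the system calibration (look-up tables).
    Invalid pixels (e.g., low fringe modulation) should be masked beforehand.
    """
    return dphi / (a - b * dphi)
```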

Lateral Coordinate Calculating

The depth map h(u,v) gives only 2.5-dimensional data about the shape of the measured object; it is also necessary to calculate the lateral coordinates at each pixel (u,v). Figure 2 shows a simple geometry of a camera in the system with a reference plane. The uv plane is the image plane of the camera, and C is its lens center. For convenience, we assume a fictitious plane at a distance l from the point C. The coordinate system O′x′y′z′ is bound to this fictitious plane, with the z′ axis coinciding with the optical axis of the camera and the x′ and y′ axes parallel to the u and v axes, respectively. A pair (x′, y′) on the x′O′y′ plane corresponds to the image pixel (u,v) through a relation of the form

\begin{cases} x' = \kappa u \\ y' = \kappa v \end{cases}, \qquad (6)

where κ is a scale factor.

Graphic Jump Location
Fig. 2
F2 :

Geometry for camera calibration.

The world coordinate system Oxyz is established with the xOy plane coinciding with the reference plane, so the transformation between Oxyz and O′x′y′z′ is

\begin{bmatrix} x \\ y \\ z \\ 1 \end{bmatrix} = \begin{bmatrix} r_1 & r_2 & r_3 & r_4 \\ r_5 & r_6 & r_7 & r_8 \\ r_9 & r_{10} & r_{11} & r_{12} \\ 0 & 0 & 0 & 1 \end{bmatrix} \begin{bmatrix} x' \\ y' \\ z' \\ 1 \end{bmatrix}. \qquad (7)

Among the 12 entries of the transformation matrix, only six are independent; they are determined by the six degrees of freedom of rotation and translation.

Assume that D is a point in the object field, with its depth relative to the reference plane being h, so its coordinates in Oxyz are (x_D, y_D, z_D) with z_D = h. This point produces its image at (u,v). The line DC crosses the plane x′O′y′ at the point E, whose coordinates in O′x′y′z′ are

(x'_E, y'_E, 0) = (\kappa u, \kappa v, 0). \qquad (8)

Using Eq. (7), the coordinates of E in Oxyz are obtained as (x_E, y_E, z_E) with

\begin{bmatrix} x_E \\ y_E \\ z_E \end{bmatrix} = \begin{bmatrix} r_1\kappa u + r_2\kappa v + r_4 \\ r_5\kappa u + r_6\kappa v + r_8 \\ r_9\kappa u + r_{10}\kappa v + r_{12} \end{bmatrix}. \qquad (9)

The coordinates of C in O′x′y′z′ are (0, 0, l), and in Oxyz they are calculated with

\begin{bmatrix} x_C \\ y_C \\ z_C \end{bmatrix} = \begin{bmatrix} r_3 l + r_4 \\ r_7 l + r_8 \\ r_{11} l + r_{12} \end{bmatrix}. \qquad (10)

The equation of the line CE in Oxyz is

\frac{x - x_C}{x_E - x_C} = \frac{y - y_C}{y_E - y_C} = \frac{z - z_C}{z_E - z_C}. \qquad (11)

By using Eqs. (9) and (10) and substituting (x_D, y_D, z_D) into Eq. (11), we have

x_D = \frac{z_D(r_1\kappa u + r_2\kappa v + r_4 - r_3 l - r_4) - (r_{11}l + r_{12})(r_1\kappa u + r_2\kappa v + r_4 - r_3 l - r_4) + (r_3 l + r_4)(r_9\kappa u + r_{10}\kappa v + r_{12} - r_{11}l - r_{12})}{r_9\kappa u + r_{10}\kappa v + r_{12} - r_{11}l - r_{12}} \qquad (12)

and

y_D = \frac{z_D(r_5\kappa u + r_6\kappa v + r_8 - r_7 l - r_8) - (r_{11}l + r_{12})(r_5\kappa u + r_6\kappa v + r_8 - r_7 l - r_8) + (r_7 l + r_8)(r_9\kappa u + r_{10}\kappa v + r_{12} - r_{11}l - r_{12})}{r_9\kappa u + r_{10}\kappa v + r_{12} - r_{11}l - r_{12}}. \qquad (13)

Using general coordinates [x(u,v), y(u,v), z(u,v)] instead of (x_D, y_D, z_D), Eqs. (12) and (13) can be simplified as

x(u,v) = \frac{q_3 + q_4 u + q_5 v}{1 + q_1 u + q_2 v}\, z(u,v) + \frac{q_6 + q_7 u + q_8 v}{1 + q_1 u + q_2 v} \qquad (14)

and

y(u,v) = \frac{q_9 + q_{10} u + q_{11} v}{1 + q_1 u + q_2 v}\, z(u,v) + \frac{q_{12} + q_{13} u + q_{14} v}{1 + q_1 u + q_2 v}, \qquad (15)
with the coefficients being q_1 = (κr_9)/(r_{11}l), q_2 = (κr_{10})/(r_{11}l), q_3 = r_3/r_{11}, q_4 = (κr_1)/(r_{11}l), q_5 = (κr_2)/(r_{11}l), q_6 = r_4 + (r_3 r_{12})/r_{11}, q_7 = r_1 + (r_1 r_{12})/(r_{11}l) − (r_3 r_9)/r_{11} − (r_4 r_9)/(r_{11}l), q_8 = r_2 + (r_2 r_{12})/(r_{11}l) − (r_3 r_{10})/r_{11} − (r_4 r_{10})/(r_{11}l), q_9 = r_7/r_{11}, q_{10} = (κr_5)/(r_{11}l), q_{11} = (κr_6)/(r_{11}l), q_{12} = r_8 + (r_7 r_{12})/r_{11}, q_{13} = r_5 + (r_5 r_{12})/(r_{11}l) − (r_7 r_9)/r_{11} − (r_8 r_9)/(r_{11}l), and q_{14} = r_6 + (r_6 r_{12})/(r_{11}l) − (r_7 r_{10})/r_{11} − (r_8 r_{10})/(r_{11}l).

Equations (14) and (15) give the relationship between a pixel (u,v) and the 3-D coordinates of its corresponding object point. These equations use 14 new coefficients, q_1, q_2, …, q_{14}, instead of the original parameters r_1, r_2, …, r_{12}, κ, and l. Once these coefficients have been determined through a calibration procedure, Eqs. (14) and (15) allow us to calculate the lateral coordinates of the object points from their measured depths.
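
The sketch below is our own illustration of this step, with hypothetical variable names; it evaluates Eqs. (14) and (15) over the whole image, taking the 14 calibrated coefficients and the measured depth map and returning the lateral coordinate maps. The assumption that the image column index plays the role of u and the row index the role of v is ours.

```python
import numpy as np

def lateral_coordinates(q, z):
    """Evaluate Eqs. (14) and (15) pixelwise.

    q: sequence of the 14 calibrated coefficients q_1..q_14 (q[0] is q_1).
    z: 2-D depth map z(u, v); pixel indices serve as the (u, v) coordinates.
    Returns the lateral coordinate maps x(u, v) and y(u, v).
    """
    v, u = np.indices(z.shape, dtype=float)      # row index -> v, column index -> u
    den = 1.0 + q[0] * u + q[1] * v
    x = (q[2] + q[3] * u + q[4] * v) / den * z + (q[5] + q[6] * u + q[7] * v) / den
    y = (q[8] + q[9] * u + q[10] * v) / den * z + (q[11] + q[12] * u + q[13] * v) / den
    return x, y
```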

Parallax Correction of the Texture Image

In Fig. 1, the texture image captured with the color camera cannot be used directly for texture mapping, because the texture and measuring cameras have different optical axes. Therefore, it is necessary to correct the parallax between them. We denote the pixel coordinates on the image plane of the color camera as (s,t). Similar to Eqs. (14) and (15), we have the relations between a pixel (s,t) and the 3-D coordinates of its corresponding object point:

x(s,t) = \frac{p_3 + p_4 s + p_5 t}{1 + p_1 s + p_2 t}\, z(s,t) + \frac{p_6 + p_7 s + p_8 t}{1 + p_1 s + p_2 t} \qquad (16)

and

y(s,t) = \frac{p_9 + p_{10} s + p_{11} t}{1 + p_1 s + p_2 t}\, z(s,t) + \frac{p_{12} + p_{13} s + p_{14} t}{1 + p_1 s + p_2 t}, \qquad (17)

where p_1, p_2, …, p_{14} are the coefficients related to the parameters of the color camera. They are determined through a calibration procedure.

As shown in Fig. 3, the pixel (u,v) of the measuring camera has a corresponding pixel [s(u,v), t(u,v)] in the image plane of the texture camera. These two pixels correspond to the same object point with the coordinates [x(u,v), y(u,v), z(u,v)]. Using Eqs. (16) and (17), we have

s(u,v) = \frac{\begin{vmatrix} p_3 z(u,v) - x(u,v) + p_6 & p_2 x(u,v) - p_5 z(u,v) - p_8 \\ p_9 z(u,v) - y(u,v) + p_{12} & p_2 y(u,v) - p_{11} z(u,v) - p_{14} \end{vmatrix}}{\begin{vmatrix} p_1 x(u,v) - p_4 z(u,v) - p_7 & p_2 x(u,v) - p_5 z(u,v) - p_8 \\ p_1 y(u,v) - p_{10} z(u,v) - p_{13} & p_2 y(u,v) - p_{11} z(u,v) - p_{14} \end{vmatrix}} \qquad (18)

and

t(u,v) = \frac{\begin{vmatrix} p_1 x(u,v) - p_4 z(u,v) - p_7 & p_3 z(u,v) - x(u,v) + p_6 \\ p_1 y(u,v) - p_{10} z(u,v) - p_{13} & p_9 z(u,v) - y(u,v) + p_{12} \end{vmatrix}}{\begin{vmatrix} p_1 x(u,v) - p_4 z(u,v) - p_7 & p_2 x(u,v) - p_5 z(u,v) - p_8 \\ p_1 y(u,v) - p_{10} z(u,v) - p_{13} & p_2 y(u,v) - p_{11} z(u,v) - p_{14} \end{vmatrix}}. \qquad (19)

Fig. 3: Corresponding pixels between the measuring and texture cameras.

Using Eqs. (18) and (19) allows us to match the corresponding pixels between the two cameras. As a result, the color texture image from the view of the measuring camera can be calculated, with its intensities in the three primary colors being

\begin{cases} R(u,v) = R[s(u,v), t(u,v)] \\ G(u,v) = G[s(u,v), t(u,v)] \\ B(u,v) = B[s(u,v), t(u,v)] \end{cases}, \qquad (20)

where R, G, and B on the right-hand side are the intensities of the primary colors (i.e., red, green, and blue) in the texture image taken by the color camera. Generally, the calculated coordinate pair [s(u,v), t(u,v)] does not coincide exactly with a pixel of the captured texture image. In this case, its values are calculated from the four closest pixels using bilinear interpolation. Using Eq. (20), the parallax of the texture image is corrected.
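
For completeness, the following sketch (again our own illustration under the same assumptions as the previous snippets) solves Eqs. (18) and (19) by Cramer's rule for every measuring-camera pixel and then samples the color image with bilinear interpolation as required by Eq. (20); scipy.ndimage.map_coordinates with order=1 performs the bilinear lookup. Which image axis corresponds to s and which to t is an assumption on our part.

```python
import numpy as np
from scipy.ndimage import map_coordinates

def correct_texture_parallax(p, x, y, z, color_image):
    """Transform the color image to the view of the measuring camera.

    p: the 14 calibrated coefficients p_1..p_14 of the texture camera (p[0] is p_1).
    x, y, z: coordinate maps of the object in the measuring-camera view.
    color_image: array of shape (H, W, 3) captured by the texture camera.
    Returns a texture image aligned with the measuring camera, shape (h, w, 3).
    """
    # Rearranged Eqs. (16)-(17): a 2x2 linear system in the unknowns (s, t).
    a11 = p[0] * x - p[3] * z - p[6]
    a12 = p[1] * x - p[4] * z - p[7]
    a21 = p[0] * y - p[9] * z - p[12]
    a22 = p[1] * y - p[10] * z - p[13]
    b1 = p[2] * z - x + p[5]
    b2 = p[8] * z - y + p[11]
    det = a11 * a22 - a12 * a21
    s = (b1 * a22 - a12 * b2) / det            # Eq. (18)
    t = (a11 * b2 - b1 * a21) / det            # Eq. (19)
    # Eq. (20): bilinear sampling of the three color channels at (s, t).
    channels = [map_coordinates(color_image[..., c], [t, s], order=1)
                for c in range(3)]
    return np.stack(channels, axis=-1)
```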

System Calibration

As noted in the previous section, both the 3-D measurement and the texture parallax correction depend on coefficients related to the geometry of the measurement system. We implement the calibration in order to obtain these coefficients. We use a standard plane with a diffuse surface as the calibration board, on which a 2-D array of black circular markers is placed, as shown in Fig. 4. The lateral coordinates of the centers of these markers are known in advance as (X_k, Y_k), with k = 1, 2, …, K being their indexes and K being the number of markers. Using this calibration board, the calibration procedure is summarized as follows.

  1. We position the calibration board at z = 0, as shown in Fig. 5, and take an image of the circular marker array using the B/W camera. In the world coordinate system, the center of the k'th circular marker has the coordinates (X_k, Y_k, 0). In the image, the pixel coordinates of the centroid of this marker are calculated as (U_{k,0}, V_{k,0}).
  2. At the same calibration board position, we take an image with the texture camera. In this image, the pixel coordinates of the centroid of the k'th marker are calculated as (S_{k,0}, T_{k,0}).
  3. According to the principle in Sec. 2.2, we project several groups of phase-shifting fringe patterns onto the board using the projector, with their frequencies increasing from low to high by a fixed ratio of 2. We capture the distorted patterns using the B/W camera and analyze them to calculate their absolute phase distribution. In this step, since the markers are black and of low reflectivity, their phases cannot be accurately calculated. To solve this problem, we segment these markers out of the calculated phase map by setting a threshold on the fringe modulations and then calculate the full-field phase map by fitting the segmented result with a rational function.40 This full-field phase map is denoted as φ_0(u,v); it also serves as the reference phase distribution φ_ref(u,v) in measurement.
  4. As shown in Fig. 5, we shift the calibration board along the perpendicular direction (i.e., the z direction) to at least two positions with different depths h_n (n = 1, 2, …, N). In this procedure, the circular markers keep the fixed lateral coordinates (X_k, Y_k), and their vertical coordinates are Z_k = h_n. Repeating Steps 1 through 3, we obtain the pixel coordinates of the marker centroids, (U_{k,n}, V_{k,n}) in the image plane of the measuring camera and (S_{k,n}, T_{k,n}) in the image plane of the texture camera, with k = 1, 2, …, K and n = 1, 2, …, N. At the same time, the phase distributions φ_n(u,v) for n = 1, 2, …, N are also measured.
  5. We calculate the phase differences

     \Delta\varphi_n(u,v) = \varphi_n(u,v) - \varphi_0(u,v), \qquad (21)

     where n = 1, 2, …, N. For each pixel (u,v), substituting h_n and Δφ_n(u,v) with n = 1, 2, …, N into Eq. (5) results in a system of linear equations

     h_n a(u,v) - \Delta\varphi_n(u,v)\, h_n b(u,v) = \Delta\varphi_n(u,v), \qquad (22)

     with a(u,v) and b(u,v) being the unknowns. Solving it in the least-squares sense, we obtain the distributions of a(u,v) and b(u,v). By this step, the mapping relationship between the phase differences and the object depths is determined. (A minimal numerical sketch of this and the following least-squares steps is given after this list.)
  6. Substituting the center coordinates of the markers (X_k, Y_k, h_n) and the centroid coordinates (U_{k,n}, V_{k,n}), with k = 1, 2, …, K and n = 0, 1, …, N, into Eqs. (14) and (15) produces 2K(N+1) simultaneous linear equations

     \begin{cases} -X_k U_{k,n} q_1 - X_k V_{k,n} q_2 + h_n q_3 + h_n U_{k,n} q_4 + h_n V_{k,n} q_5 + q_6 + U_{k,n} q_7 + V_{k,n} q_8 = X_k \\ -Y_k U_{k,n} q_1 - Y_k V_{k,n} q_2 + h_n q_9 + h_n U_{k,n} q_{10} + h_n V_{k,n} q_{11} + q_{12} + U_{k,n} q_{13} + V_{k,n} q_{14} = Y_k \end{cases}, \qquad (23)

     from which the coefficients q_1, q_2, …, q_{14} are estimated in the least-squares sense.
  7. Similar to Step 6, substituting (X_k, Y_k, h_n) and (S_{k,n}, T_{k,n}), with k = 1, 2, …, K and n = 0, 1, …, N, into Eqs. (16) and (17) gives 2K(N+1) simultaneous linear equations

     \begin{cases} -X_k S_{k,n} p_1 - X_k T_{k,n} p_2 + h_n p_3 + h_n S_{k,n} p_4 + h_n T_{k,n} p_5 + p_6 + S_{k,n} p_7 + T_{k,n} p_8 = X_k \\ -Y_k S_{k,n} p_1 - Y_k T_{k,n} p_2 + h_n p_9 + h_n S_{k,n} p_{10} + h_n T_{k,n} p_{11} + p_{12} + S_{k,n} p_{13} + T_{k,n} p_{14} = Y_k \end{cases}, \qquad (24)

     from which the coefficients p_1, p_2, …, p_{14} are estimated in the least-squares sense.
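
The least-squares problems in Steps 5 through 7 are small linear systems that can be solved directly. The sketch below is a minimal illustration under our own assumptions (the marker centroids and phase samples are presumed to be already extracted, and the argument names are hypothetical): the first function solves Eq. (22) pixel by pixel for a(u,v) and b(u,v), and the second sets up Eq. (23) and solves it with numpy.linalg.lstsq; the same function applies to Eq. (24) when fed the texture-camera centroids.

```python
import numpy as np

def calibrate_phase_to_depth(dphi_stack, depths):
    """Solve Eq. (22) at every pixel for the coefficients a(u, v) and b(u, v).

    dphi_stack: array of shape (N, H, W) holding the phase-difference maps
                measured at the N calibration-board depths.
    depths: array of shape (N,) holding the corresponding depths h_n.
    """
    N, H, W = dphi_stack.shape
    a = np.empty((H, W))
    b = np.empty((H, W))
    for i in range(H):                       # plain loops kept for clarity
        for j in range(W):
            dphi = dphi_stack[:, i, j]
            # Design-matrix rows: [h_n, -h_n * dphi_n]; right-hand side: dphi_n.
            A = np.column_stack([depths, -depths * dphi])
            sol, *_ = np.linalg.lstsq(A, dphi, rcond=None)
            a[i, j], b[i, j] = sol
    return a, b

def calibrate_camera_coefficients(X, Y, h, U, V):
    """Estimate the 14 coefficients of Eq. (23) in the least-squares sense.

    X, Y: 1-D arrays of marker center coordinates (one entry per marker per
          board position); h: the board depth of each observation;
    U, V: the corresponding centroid pixel coordinates in the camera image.
    """
    zeros = np.zeros_like(X, dtype=float)
    ones = np.ones_like(X, dtype=float)
    rows_x = np.column_stack([-X * U, -X * V, h, h * U, h * V, ones, U, V,
                              zeros, zeros, zeros, zeros, zeros, zeros])
    rows_y = np.column_stack([-Y * U, -Y * V, zeros, zeros, zeros, zeros,
                              zeros, zeros, h, h * U, h * V, ones, U, V])
    A = np.vstack([rows_x, rows_y])
    rhs = np.concatenate([X, Y])
    q, *_ = np.linalg.lstsq(A, rhs, rcond=None)
    return q                                  # q[0] is q_1, ..., q[13] is q_14
```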

Through this calibration procedure, the corresponding relationship between the texture and the 3-D data and the mapping relationship between the phases and depths are determined simultaneously. The data processing involves only linear minimizations that have low computational complexity. Even so, it should be noted that with this procedure we have to adjust the orientation of the calibration board carefully in order to make it exactly perpendicular to the direction of the track.

Fig. 4: Circular markers with known center coordinates on the calibration board.

Fig. 5: Shifting the calibration board to different depths in the calibration procedure.

Three-Dimensional Measurement and Texture Parallax Correction

When the measurement system has been calibrated, we can use it to measure the 3-D shape, correct the parallax of the texture, and map the texture onto the 3-D shape. The result is, at each pixel, a set of six values, with [x(u,v), y(u,v), z(u,v)] describing the 3-D data and [R(u,v), G(u,v), B(u,v)] representing the color texture. The overall procedure is summarized as follows:

  1. Capture the texture image with the color camera.
  2. Project several groups of phase-shifting fringe patterns, with their frequencies increasing by a fixed ratio of 2, onto the object surface using the projector. Capture the distorted fringe patterns with the B/W camera.
  3. Analyze the captured fringe patterns using the phase-shifting method and the temporal phase-unwrapping technique introduced in Sec. 2.2 in order to retrieve the object phase map φ_obj(u,v); calculate the phase difference Δφ(u,v) using Eq. (3); and, finally, calculate the z coordinates through Eq. (5), whose coefficients a(u,v) and b(u,v) were calibrated in the previous subsection.
  4. Using the pixel coordinates (u,v) and the vertical coordinates z(u,v) obtained in Step 3, calculate the lateral coordinates of the object points [i.e., x(u,v) and y(u,v)] through Eqs. (14) and (15), whose coefficients q_1, q_2, …, q_{14} were obtained in calibration. Up to this step, the 3-D data of the object, [x(u,v), y(u,v), z(u,v)], have been obtained.
  5. For each pixel (u,v), calculate its corresponding pixel [s(u,v), t(u,v)] in the captured texture image by using Eqs. (18) and (19). By reading the color intensities at the pixel [s(u,v), t(u,v)] in the captured texture image and using Eq. (20), we obtain the intensities of the three primary colors at the pixel (u,v). As a result, we have a new color image [R(u,v), G(u,v), B(u,v)] as the texture image with its parallax corrected. Finally, we can map this texture image onto the surface of the 3-D shape. (An illustrative driver combining these steps with the earlier code sketches follows this list.)
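
Putting the pieces together, the measurement procedure above can be expressed as a short driver function. This is purely illustrative: it reuses the hypothetical helper functions sketched earlier in this article, and the argument names are assumptions rather than part of the authors' implementation.

```python
import numpy as np

def measure_object(fringe_groups, reference_phase, texture_image, a, b, q, p):
    """Illustrative end-to-end pipeline following Steps 1-5 above.

    fringe_groups: list of (N, H, W) arrays, one per fringe-frequency group.
    reference_phase: unwrapped phase map of the reference plane.
    texture_image: (H', W', 3) image from the color camera.
    a, b, q, p: calibration results obtained as in Sec. 3.1.
    """
    # Steps 2-3: fringe analysis (phase shifting + temporal unwrapping).
    wrapped_maps = [wrapped_phase(group) for group in fringe_groups]
    phi_obj = temporal_unwrap(wrapped_maps)
    dphi = phi_obj - reference_phase              # Eq. (3)
    z = depth_from_phase(dphi, a, b)              # Eq. (5)
    # Step 4: lateral coordinates, Eqs. (14)-(15).
    x, y = lateral_coordinates(q, z)
    # Step 5: parallax-corrected texture, Eqs. (18)-(20).
    rgb = correct_texture_parallax(p, x, y, z, texture_image)
    return x, y, z, rgb
```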

Experiments

Experiments are carried out to verify the validity of the suggested method. Our measurement system mainly consists of an LCD projector (HITACHI HCP-3250X, 1024 × 768 pixels), a B/W camera (DH HV1351UM, 1280 × 1024 pixels), and a color camera (DH HV1351UC, 1280 × 1024 pixels). The focal length of the lenses (Computar M1620-MPV, F1.6) is 16 mm. The distance between the cameras and the measured object is around 800 mm. In both the system calibration and the measurement, we use the phase-shifting technique and the temporal phase-unwrapping technique introduced in Sec. 2.2 for analyzing the distorted fringe patterns. Here, we use eight groups of phase-shifting fringe patterns with their spatial frequencies increasing from low to high by a fixed ratio of 2. Each pattern in the first group contains only one fringe with no ambiguity in its fringe order, and consequently each pattern in the last group contains 128 fringes. Only the phase map of the last group, which has the highest sensitivity, is retained for measurement purposes. According to the sampling theory of the phase-shifting technique, the N-frame algorithm is insensitive to harmonics up to the (N−2)th order41,42 induced by the luminance nonlinearity43 of the projector. Therefore, in the last group the number of phase steps is 8, and the phase increment between consecutive frames is π/4 rad in order to guarantee the measurement accuracy. In each group before the last, the number of phase steps is 4 in order to reduce the number of fringe patterns.

We begin the experiment by calibrating the measurement system using the procedure presented in Sec. 3.1. Figure 6 shows the calibration board with a circular marker array, on which the spacing between two neighboring markers is 40 mm in both the horizontal and vertical directions. In the calibration procedure, the calibration board is perpendicularly shifted, within the depth of focus of the cameras, to nine different depth positions with a depth increment of 5 mm between neighboring positions. At each calibration board position, we take a photograph using the color camera and capture the deformed fringe patterns using the B/W camera. Figures 7(a) and 7(b), respectively, show a photograph from the color camera and one of the fringe patterns captured using the B/W camera when the calibration board is at the position z = 0. By comparing their marker positions, the parallax between the views of the two cameras is evident. From the captured fringe patterns, we calculate the absolute phase map for each calibration board position and then estimate the coefficients a(u,v) and b(u,v), whose distributions are illustrated in Fig. 8. These coefficients determine the mapping relationship between the depths and phase differences through Eq. (5). From the images of the circular markers, we estimate their centroids and further calculate the coefficients q_1, q_2, …, q_{14} for the B/W camera and p_1, p_2, …, p_{14} for the color camera. These are used for determining the relationship between the pixel coordinates and the lateral coordinates of the object points.

Fig. 6: Calibration board used in this experiment.

Fig. 7: (a) The image captured using the color camera. (b) The image captured using the B/W camera with fringes projected on it. From these views, it can be observed that the circular markers have different pixel coordinates in the two images, implying a parallax between the views of the two cameras.

Fig. 8: Calibration results with (a) a(u,v) and (b) b(u,v) being the coefficients determining the mapping relationship between the depths and phase differences.

The first experiment examines the accuracy of this method by using the calibration board in Fig. 6 as the measured object. As mentioned in Sec. 2, parallax correction of the texture image must be implemented concurrently with the depth reconstruction because the parallax is dependent on the 3-D shape of the object. We position the measured board at 0, 16, and 32 mm depths. By measuring it and correcting its texture image, we can evaluate the accuracies of depth reconstruction and the texture parallax correction.

For example, when the measured board is positioned at the depth of 16 mm, one of the fringe patterns captured with the measuring camera is shown in Fig. 9(a). From these fringe patterns, we recover the unwrapped phase map using the method in Sec. 2.2; the result is shown in Fig. 9(b). Subtracting the reference phases from it yields the phase differences, whose distribution is illustrated in Fig. 9(c). Furthermore, we calculate the depth map through Eq. (5), giving the result shown in Fig. 10. From Sec. 2.5, we know that the depth measurement accuracy strongly affects the texture parallax correction through Eqs. (18) and (19). Table 1 quantifies the precision of the depth measurement results. The depth maps of the measured board reconstructed from the fringe patterns may deviate from the nominal depth values, namely 0, 16, and 32 mm. From this table, we see that the mean values of the deviations are very small, the RMS (root-mean-square) values of the deviations are at the 0.1-mm level, and the maximum deviations are around 0.5 mm. The depth measurement errors should be smaller than these values, since the deviations are caused not only by the measurement errors but also by the flatness error of the board.

Table 1: Precision of depth reconstruction and texture parallax correction for the measured board.

Fig. 9: Fringe analysis for measuring the board positioned at the depth of 16 mm: (a) a deformed fringe pattern captured with the B/W camera, (b) the unwrapped phase map in radians, and (c) the distribution of phase differences in radians.

Fig. 10: The depth map reconstructed from Fig. 9.

Regarding texture parallax correction, Fig. 11(a) shows an image of the measured board captured using the texture camera when the measured board is positioned at the depth of 16 mm, and Fig. 11(b) is the image of the same board captured using the measuring camera, with its fringes having been removed by averaging all the captured fringe patterns. Comparing the pixel positions of the circular markers in Figs. 11(a) and 11(b), it is evident that there is a parallax between the views of the two cameras. Using the procedure introduced in Sec. 3.2, we transform the image in Fig. 11(a) from the view of the texture camera to that of the measuring camera; the result is shown in Fig. 11(c). We observe that the circular markers in Fig. 11(c) are exactly aligned with those in Fig. 11(b). This fact implies that the parallax of the texture image has been corrected successfully. Table 1 also lists the deviations of the circular marker centroids of the texture image after parallax correction from those of the measuring camera image. The mean and RMS deviations are smaller than 0.2 pixels, and the maximum deviations are not more than 0.5 pixels (corresponding to about 0.2 mm in lateral coordinates). These results demonstrate that the texture parallax correction method can achieve a satisfactory accuracy.

Fig. 11: Parallax correction of the texture image: (a) an image captured using the texture camera with the measured board positioned at the depth of 16 mm; and (b) the image of the same board captured using the measuring camera with its fringes removed by averaging all fringe patterns. By comparing the pixel positions of the circular markers in (a) and (b), it is evident that there is a parallax between the views of the two cameras. (c) The result of transforming the image in (a) to the view of the measuring camera. The circular markers in (c) align with those in (b) exactly, implying that the parallax of the texture image has been corrected.

Continuing the experiment, we use the same method to measure a plastic bottle, which has a freeform curved surface. Figure 12(a) shows one of the fringe patterns captured with the B/W camera. The unwrapped phase map recovered from the fringe patterns is shown in Fig. 12(b). By subtracting the reference phases from it, Fig. 12(c) gives the phase difference distribution. The reconstructed 3-D shape of the object is illustrated in Fig. 13.

Fig. 12: Fringe analysis for measuring a plastic bottle: (a) a deformed fringe pattern captured with the B/W camera, (b) the unwrapped phase map in radians, and (c) the distribution of phase differences in radians.

Fig. 13: Reconstructed 3-D shape of the measured plastic bottle (in mm).

The last step of this experiment is to map the texture onto the surface of the 3-D shape just obtained. Figure 14(a) is the texture photograph taken by the color camera. Here, the calibration board is kept behind the measured object, making it easy to observe the parallax between the views of the color and B/W cameras by comparing Figs. 14(a) and 12(a). Without correcting the parallax, directly mapping the captured texture image in Fig. 14(a) onto the 3-D object surface shown in Fig. 13 leads to a faulty result. As shown in Fig. 14(b), at the handle and lid positions of the measured bottle, the displacement of the texture from its real position, induced by the pixel misalignment, is evident. To solve this problem, we transform the texture image from the view of the color camera to that of the B/W camera. Figure 14(c) shows the texture image with the parallax corrected. Using it for texture mapping, the result is illustrated in Fig. 14(d), in which the misalignment faults have been removed. Figure 14(e) shows the image difference obtained by subtracting Fig. 14(a) from Fig. 14(c); the parallax between the two images can be observed directly from this difference. Figure 14(f) is the result of mapping the image difference onto the reconstructed 3-D shape. These results demonstrate that the method introduced in this paper is effective in correcting the parallax of the texture when the texture camera and the measuring camera have different optical axes.

Fig. 14: Texture mapping for the measured plastic bottle: (a) the texture image captured using the color camera, (b) the texture mapping result obtained directly from (a) with the parallax not corrected, (c) the texture image transformed to the view of the B/W camera, (d) the texture mapping result using (c) with the parallax corrected, (e) the image difference obtained by subtracting (a) from (c), from which the parallax between (a) and (c) can be directly observed, and (f) the image difference in (e) mapped onto the reconstructed 3-D shape.

Regarding the measurement efficiency, data processing with this method involves only linear minimizations with low computational complexity, but we still took about 30 min to calibrate the system and less than 2 min to measure an object in the experiments. The overwhelming majority of this time was spent capturing fringe patterns, because we used the temporal phase-unwrapping technique,36 which requires capturing a large number of fringe patterns, to measure an object with abrupt steps. Using a more efficient technique, such as the Gray-code technique combined with phase shifting,38 may significantly reduce the time needed for image capture. In addition, manually shifting the calibration board to different depths along the track also occupied a portion of the system calibration time. Using an automatically controlled track would help improve the efficiency and accuracy of the system calibration.

Conclusions

In this paper, we have analyzed the geometry of the fringe projection system with a color texture camera and suggested a system calibration method. This method determines the corresponding relationship between the 2-D texture and the 3-D data and the mapping relationship between the object depths and phase differences simultaneously, thus reducing the time required for system calibration. Its data processing involves only linear minimizations and has low computational complexity. The experimental results demonstrate that the suggested method allows us to transform the texture image from the view of the color camera to that of the measuring camera, thereby correcting the parallax of the texture image and precisely mapping the texture onto the reconstructed object surface.

Acknowledgments

This work was supported by the National Natural Science Foundation of China (61178045 and 61433016) and the National High Technology Research and Development Program (863 Program) of China (2012AA040507).

References

1. Gorthi, S. S. and Rastogi, P., "Fringe projection techniques: whither we are?" Opt. Lasers Eng. 48(2), 133–140 (2010).
2. Garbat, P. and Kujawińska, M., "Combining fringe projection method of 3D object monitoring with virtual reality environment: concept and initial results," in IEEE Proc. First Int. Symp. on 3D Data Processing Visualization and Transmission, pp. 504–508 (2002).
3. Garbat, P., Kujawińska, M., and Wegiel, M., "3D visualization of true variable in time objects based on data from optical measurement system," Proc. SPIE 6196, 61960J (2006).
4. El-Hakim, S. F. et al., "System for indoor 3D mapping and virtual environments," Proc. SPIE 3174, 21–35 (1997).
5. Kujawińska, M. and Pawlowski, M., "Application of shape-measuring optical methods in animation," Proc. SPIE 3407, 522–527 (1998).
6. Pazos, M. V. et al., "Accuracy assessment of human trunk surface 3D reconstructions from an optical digitising system," Med. Biol. Eng. Comput. 43(1), 11–15 (2005).
7. Ares, M. et al., "Handheld 3D scanning system for in-vivo imaging of skin cancer," in The 5th Int. Conf. 3D Body Scanning Technologies, Lugano, Switzerland, pp. 231–236 (2014).
8. Frankowski, G., Chen, M., and Huth, T., "Phase shift rapid in-vivo measuring of human skin (PRIMOS) by digital fringe projection with micromirror display devices DMD," Proc. SPIE 4778, 66–73 (2002).
9. Callaghan, T. M. and Wilhelm, K. P., "A review of ageing and an examination of clinical methods in the assessment of ageing skin. Part 2: clinical perspectives and clinical methods in the evaluation of ageing skin," Int. J. Cosmet. Sci. 30(5), 323–332 (2008).
10. Slizewski, A., Friess, M., and Semal, P., "Surface scanning of anthropological specimens: nominal-actual comparison with low cost laser scanner and high end fringe light projection surface scanning systems," Quartär 57, 179–187 (2010).
11. Przybilla, H. J. and Peipe, J., "3D modeling of heritage objects by fringe projection and laser scanning systems," in CIPA Heritage Documentation: Best Practices and Applications, Stylianidis, E., Patias, P., and Quintero, M. S., Eds., pp. 35–39, Ziti Publications, Thessaloniki, Greece (2011).
12. Kersten, T. and Stallmann, D., "Automatic texture mapping of architectural and archaeological 3D models," Int. Arch. Photogramm. Remote Sens. Spatial Inf. Sci. 39(5), 273–278 (2012).
13. Chambard, J.-P. et al., "Digitization of art pieces based on 3D, colour and texture parameters," Proc. SPIE 6618, 66180C (2007).
14. Peipe, J. and Przybilla, H. J., "Modeling the Golden Madonna," in CIPA 2005 XX Int. Symp., Torino, Italy, pp. 934–936 (2005).
15. Sitnik, R., Kujawińska, M., and Woźnicki, J., "Digital fringe projection system for large-volume 360-deg shape measurement," Opt. Eng. 41(2), 443–449 (2002).
16. Budianto, B., Lun, P. K., and Hsung, T.-C., "Marker encoded fringe projection profilometry for efficient 3D model acquisition," Appl. Opt. 53(31), 7442–7453 (2014).
17. Takeda, M. and Mutoh, K., "Fourier transform profilometry for the automatic measurement of 3-D object shapes," Appl. Opt. 22(24), 3977–3982 (1983).
18. Peng, X. et al., "A simple method of generating pseudo-solid texture for optical 3-D image," Optik 110(7), 317–322 (1999).
19. Zhang, Z. et al., "Color texture extraction from fringe image based on full-field projection," Opt. Eng. 42(7), 1935–1939 (2003).
20. Zhang, Z., Towers, D. P., and Towers, C. E., "Snapshot color fringe projection for absolute three-dimensional metrology of video sequences," Appl. Opt. 49(31), 5947–5953 (2010).
21. Srinivasan, V., Liu, H. C., and Halioua, M., "Automated phase-measuring profilometry of 3-D diffuse object," Appl. Opt. 23(18), 3105–3108 (1984).
22. Zhang, S. and Yau, S.-T., "Simultaneous three-dimensional geometry and color texture acquisition using single color camera," Opt. Eng. 47(12), 123604 (2008).
23. Zhang, S., "Recent progresses on real-time 3D shape measurement using digital fringe projection techniques," Opt. Lasers Eng. 48(2), 149–158 (2010).
24. Shi, H. et al., "Shape and deformation measurement system by combining fringe projection and digital image correlation," Opt. Lasers Eng. 51(1), 47–53 (2013).
25. Zhang, Z., Towers, C. E., and Towers, D. P., "Robust color and shape measurement of full color artifacts by RGB fringe projection," Opt. Eng. 51(2), 021109 (2012).
26. Zhang, S. and Huang, P. S., "High-resolution, real-time three-dimensional shape measurement," Opt. Eng. 45(12), 123601 (2006).
27. Akasaka, K., Sagawa, R., and Yagi, Y., "A sensor for simultaneously capturing texture and shape by projecting structured infrared light," in IEEE Proc. Sixth Int. Conf. 3-D Digital Imaging and Modeling, pp. 375–381 (2007).
28. Xu, Y. J. et al., "Simultaneously measuring 3D shape and colour texture of moving objects using IR and colour fringe projection techniques," Opt. Lasers Eng. 61, 1–7 (2014).
29. Weise, T., Leibe, B., and Van Gool, L., "Fast 3D scanning with automatic motion compensation," in IEEE Conf. Computer Vision and Pattern Recognition, pp. 1–8 (2007).
30. Ou, P. et al., "Flexible real-time natural 2D colour and 3D shape measurement," Opt. Express 21(14), 16736–16741 (2013).
31. Liu, L. and Stamos, I., "Automatic 3D to 2D registration for the photorealistic rendering of urban scenes," in IEEE Computer Society Conf. Computer Vision and Pattern Recognition, Vol. II, pp. 137–143 (2005).
32. Stamos, I. and Allen, P. K., "Automatic registration of 2-D with 3-D imagery in urban environments," in Eighth IEEE Int. Conf. Computer Vision, Vol. 2, pp. 731–737 (2001).
33. Tsai, R. Y., "An efficient and accurate camera calibration technique for 3D machine vision," in IEEE Proc. Conf. Computer Vision and Pattern Recognition, pp. 364–374 (1986).
34. Zhang, Z. Y., "A flexible new technique for camera calibration," IEEE Trans. Pattern Anal. Mach. Intell. 22(11), 1330–1334 (2000).
35. Ghiglia, D. C. and Pritt, M. D., Two-Dimensional Phase Unwrapping: Theory, Algorithms, and Software, Wiley, New York (1998).
36. Saldner, H. O. and Huntley, J. M., "Profilometry using temporal phase unwrapping and a spatial light modulator-based fringe projector," Opt. Eng. 36(2), 610–615 (1997).
37. Su, X., Von Bally, G., and Vukicevic, D., "Phase-stepping grating profilometry: utilization of intensity modulation analysis in complex objects evaluation," Opt. Commun. 98(1), 141–150 (1993).
38. Sansoni, G., Carocci, M., and Rodella, R., "Three-dimensional vision based on a combination of Gray-code and phase-shift light projection: analysis and compensation of the systematic errors," Appl. Opt. 38(31), 6565–6573 (1999).
39. Guo, H. et al., "Least-squares calibration method for fringe projection profilometry," Opt. Eng. 44(3), 033603 (2005).
40. Guo, H., Chen, M., and Zheng, P., "Least squares fitting of carrier phase distribution using rational function in fringe projection profilometry," Opt. Lett. 31(24), 3588–3590 (2006).
41. Stetson, K. A. and Brohinsky, W. R., "Electrooptic holography and its application to hologram interferometry," Appl. Opt. 24(21), 3631–3637 (1985).
42. Guo, H. and Chen, M., "Fourier analysis of the sampling characteristics of the phase-shifting algorithm," Proc. SPIE 5180, 437–444 (2003).
43. Guo, H., He, H., and Chen, M., "Gamma correction for digital fringe projection profilometry," Appl. Opt. 43(14), 2906–2914 (2004).

Zhuang Lu received his BS degree from Shenyang University of Chemical Technology, China, in 2012 and his MS degree from Shanghai University, China, in 2015. His research interests are optical 3-D measurement and computer vision.

Jun Zhou received her BS degree from Tianjin Polytechnic University, China, in 2009 and her MS degree from Shanghai University, China, in 2012. Her research interests are optical 3-D measurement and computer vision.

Hongwei Guo received his BS degree in mechanical engineering from Xi’an Jiaotong University, China, in 1990, and his PhD in mechanical-electronic engineering from Shanghai University, China, in 2001. Currently, he is a professor in the Laboratory of Applied Optics and Metrology, the Department of Precision Mechanical Engineering, at Shanghai University, China. His research interests include optical metrology, computer vision, and signal and image processing.

© The Authors
