Optical Design and Engineering

Accuracy improvement in laser stripe extraction for large-scale triangulation scanning measurement system

Author Affiliations
Yang Zhang, Wei Liu, Xiaodong Li, Fan Yang, Peng Gao, Zhenyuan Jia

Dalian University of Technology, Key Laboratory for Precision and Non-Traditional Machining Technology of the Ministry of Education, No. 2 Linggong Road, Dalian 116024, China

Opt. Eng. 54(10), 105108 (Oct 15, 2015). doi:10.1117/1.OE.54.10.105108
History: Received May 31, 2015; Accepted September 10, 2015

Open Access

Abstract.  Large-scale triangulation scanning measurement systems are widely used to measure the three-dimensional profile of large-scale components and parts. The accuracy and speed of the laser stripe center extraction are essential for guaranteeing the accuracy and efficiency of the measuring system. However, in the process of large-scale measurement, multiple factors can cause deviation of the laser stripe center, including the spatial light intensity distribution, material reflectivity characteristics, and spatial transmission characteristics. A center extraction method is proposed for improving the accuracy of the laser stripe center extraction based on image evaluation of Gaussian fitting structural similarity and analysis of the multiple source factors. First, according to the features of the gray distribution of the laser stripe, evaluation of the Gaussian fitting structural similarity is estimated to provide a threshold value for center compensation. Then using the relationships between the gray distribution of the laser stripe and the multiple source factors, a compensation method of center extraction is presented. Finally, measurement experiments for a large-scale aviation composite component are carried out. The experimental results for this specific implementation verify the feasibility of the proposed center extraction method and the improved accuracy for large-scale triangulation scanning measurements.

Geometric accuracies of large-scale aircraft components or parts, including tails and wings, are the essential specifications for determining the airworthiness of the major subassemblies or subsystems of an aircraft.1–3 Geometrical measurements of large-scale aircraft components are fundamental for both aircraft assembly and aircraft reliability testing.4 Nevertheless, the inspection of large-scale aircraft subsystems remains a delicate task because it requires a high measurement range and accuracy.5–7 Machine-vision technologies have become important tools for the three-dimensional measurement of product structures because they can be used for noncontact measurements that produce results with high efficiency and accuracy.8 Large-scale triangulation scanning measurement systems, which are based on machine-vision technology, are widely used in industry for accurately measuring the three-dimensional profile of large-scale components.9,10

Triangulation scanning measurement systems are characterized by the reconstruction of dynamic laser stripes on scanned large-scale components. The three-dimensional profile is accurately measured by analyzing image sequences of laser stripes recorded by charge-coupled device or complementary metal oxide semiconductor (CMOS) cameras. Because the laser stripe center is a unique feature in the images, the extraction accuracy of the laser stripe center is a decisive factor for measurement accuracy.11–13 However, due to the large size of aircraft components, the laser stripe covers a long scan range. In addition to variations due to multiple lighting effects (illumination, reflectivity of the object, light source characteristics, etc.), errors are easily introduced into the center extraction results of large-scale laser stripes. Because conventional center extraction methods cannot extract laser stripes with sufficiently high accuracy for large-scale parts, the development of a highly accurate laser stripe extraction method is essential for measuring large-scale aircraft parts.14

To improve the accuracy of laser stripe extraction, conventional center extraction methods, such as geometric center extraction, barycenter extraction, and Gaussian fitting extraction, have been enhanced. Lukas et al. proposed an enhanced Gaussian fitting extraction method.15 In this method, the initial center of the laser stripe was extracted using the conventional extraction method. Then the laser stripe center was extracted using the Gaussian fitting method within a range of 5 pixels around the initial center. Though this method could extract the laser stripe with high accuracy, it was only applicable to laser stripes with a uniform gray distribution and width. Jang and Hong proposed a new method for detecting curvilinear structures.16 The edge of the input image was extracted using a Canny edge detector,17 and the distance from each pixel to the nearest edge of the feature was calculated based on a Euclidean distance mapping.18 Thus, the light center could be obtained by removing extraneous points after the extracted curve was refined into a 1-pixel-wide stripe.16 The method proposed by Jang and Hong could be used for general natural images with good robustness;16 however, it is inappropriate for high-accuracy measurement in industry because it can only reach pixel-level accuracy. Steger first obtained the normal of the laser stripe center using a Hessian matrix; the maximum gray value along this normal was then calculated as a subpixel center.19 The resulting method showed high extraction accuracy with high stability; however, because it requires a large number of arithmetic operations, it is inappropriate for high-speed center extraction.
Finally, Wei proposed a robust automatic method that combines erosion, thinning, and the least-median square algorithm to overcome the interference of partial serious reflection for laser stripe center extraction; however, it does not work well for measuring large-scale components.20

In this study, a novel laser stripe extraction method is proposed with high accuracy and efficiency. After analyzing the Gaussian fitting structural similarity and image features of laser stripes, the deviation in the laser stripe extraction can be corrected, thus improving the measurement accuracy of a large-scale triangulation scanning system.

Characteristics of a large-scale triangulation scanning measurement system and laser stripe images are described and analyzed in Sec. 2. To determine deviations in the laser stripe extraction, an image evaluation method based on the structural similarity between Gaussian fittings is presented in Sec. 3. In Sec. 4, the relationships between the gray distribution of a laser stripe and the multiple source factors are discussed. Then the center compensation model is established for laser stripes on the surface of composite materials to improve the accuracy of laser stripe extraction. According to the method of laser stripe extraction, specific experimental implementations are performed in Sec. 5. Moreover, the improvement in the accuracy of the large-scale triangulation scanning measurement system is verified by the measurement results of large-scale aircraft components. Some conclusions are discussed in Sec. 6.

Active Triangulation Scanning Measurement System

The active triangulation scanning measurement system using laser stripes is composed of two cameras with high speed and resolution, one linear diode laser transmitter with high stability, one automatically controlled platform with high accuracy, and one graphic workstation, as shown in Fig. 1. During the measurement, linear diode laser stripes are projected onto the surface of objects. With the rotation of the automatically controlled platform, the laser stripes are scanned over the object. The image sequences of the laser stripes are captured by binocular cameras. After establishing the three-dimensional system of coordinates through the binocular camera calibration, the profile of a large-scale object can be obtained by extracting the center lines of the laser stripes in the image sequences. Thus, the extraction accuracy of the laser stripe is the decisive factor in improving the accuracy of active triangulation scanning measurements.

Fig. 1: The schematic diagram of the active triangulation scanning measurement system.

Gray Distribution Features of Laser Stripe

Because the image information in the laser stripes is expressed by the intensity pattern (gray distribution), we will demonstrate the gray distribution features of the laser stripe in detail.

According to the emission principle of the linear diode laser, a dot laser beam passing through a cylindrical lens generates a continuous optical space. Laser stripes with a certain width are then formed where the measured object surface intersects the laser optical space.

The fundamental transverse mode of the linear laser, which is an important type of solid laser with high stability, has been widely used in visual measuring systems. According to laser principles, the intensity distribution of the fundamental transverse mode follows a single Gaussian distribution in the space domain (cross-section of the laser beam).10 The gray distribution of the laser stripe is shown as curve a in Fig. 2. When the laser stripe is overexposed, the gray distribution of the stripe is a Gaussian curve with a platform (curve b); however, the nonplatform portion of the curve still agrees with the Gaussian distribution (curve c). When the laser stripe is projected vertically on the measured object and the observation direction is perpendicular to the irradiated surface, the gray distribution is axisymmetric in the space domain. Moreover, the laser stripe center will coincide with the geometric center of the laser stripe when the laser, camera, and normal vector of the measured surface have the same direction. However, the gray distribution is asymmetric under practical measurement conditions. With changes in the incident angle of the laser transmitter, the laser stripe center deviates from the geometric center, as shown in Fig. 2(b). Thus, this laser stripe center deviation should be considered in the extraction method to further improve the measurement accuracy.

Fig. 2: Schematic diagram of the active triangulation scanning measurement system showing the gray-scale distribution: (a) laser stripe with the incident angle and (b) reference laser stripe.

During large-scale component measurements, laser stripes scanned over objects can have a large range of movement. Therefore, a larger incidence angle is produced. Due to the large deviation angle as well as the characteristics of the laser, measured object, and cameras, the center of the laser stripe can significantly deviate from the geometric center. Additionally, the gray distribution is not similar to the standard Gaussian distribution of laser stripes. Therefore, the compensation for the laser stripe deviation must be considered for different incident angles.

Because the gray distribution of a laser stripe is asymmetric and a significant deviation in the laser stripe center could occur, we propose an image evaluation method for the laser stripe extraction to determine the degree of deviation between the center of the captured laser stripe and the geometric center.

Theory of Structural Similarity

The structural similarity (SSIM) index provides a method for measuring the similarity between an evaluated image x and a reference image y.21 The reference image is a distortion-free image. In particular, the SSIM index combines comparisons of the luminance l(x,y), contrast c(x,y), and structure s(x,y) between images x and y. The SSIM index is given by

SSIM(x,y) = [l(x,y)]^α [c(x,y)]^β [s(x,y)]^γ,  (1)

where α, β, and γ are the adaptive scaling exponents for the luminance, contrast, and structure comparisons, respectively. The three comparisons can be expressed as follows:

l(x,y) = (2 μ_x μ_y + C_1) / (μ_x^2 + μ_y^2 + C_1),  (2)

c(x,y) = (2 σ_x σ_y + C_2) / (σ_x^2 + σ_y^2 + C_2),  (3)

s(x,y) = (σ_xy + C_3) / (σ_x σ_y + C_3),  (4)

where μ_x and μ_y are the mean luminance intensities of images x and y, given by μ_x = (1/m) Σ_{t=1}^{m} x_t and μ_y = (1/m) Σ_{t=1}^{m} y_t, respectively. Similarly, σ_x and σ_y are the standard deviations of images x and y, given by σ_x = sqrt[(1/(m−1)) Σ_{t=1}^{m} (x_t − μ_x)^2] and σ_y = sqrt[(1/(m−1)) Σ_{t=1}^{m} (y_t − μ_y)^2], respectively. Finally, σ_xy is the image covariance that represents the structural comparison, and C_1, C_2, and C_3 are small constants used to prevent a zero denominator. The method for estimating C_1, C_2, and C_3 is found elsewhere.21
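To make the computation concrete, the comparison terms of Eqs. (2)-(4) and their combination in Eq. (1) can be sketched in NumPy as follows; the gray-value vectors, the constants C1-C3, and the exponents are illustrative placeholders, not values from the paper:

```python
import numpy as np

def ssim(x, y, alpha=1.0, beta=1.0, gamma=1.0, C1=1e-4, C2=1e-4, C3=5e-5):
    """Structural similarity between two gray-value vectors x and y, Eqs. (1)-(4)."""
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    mu_x, mu_y = x.mean(), y.mean()
    sd_x, sd_y = x.std(ddof=1), y.std(ddof=1)
    cov_xy = np.cov(x, y, ddof=1)[0, 1]
    l = (2 * mu_x * mu_y + C1) / (mu_x**2 + mu_y**2 + C1)   # luminance, Eq. (2)
    c = (2 * sd_x * sd_y + C2) / (sd_x**2 + sd_y**2 + C2)   # contrast, Eq. (3)
    s = (cov_xy + C3) / (sd_x * sd_y + C3)                  # structure, Eq. (4)
    return l**alpha * c**beta * s**gamma                    # Eq. (1)
```

By construction, comparing a vector with itself yields a similarity of 1, and any structural difference drives the index below 1.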

Gaussian Fitting Structural Similarity

For a triangulation scanning measurement system, the gray distribution of the laser stripe has a specific Gaussian distribution. To evaluate the degree of laser stripe deviation, we propose the following image evaluation method using the structural similarity of the gray distribution and the Gaussian fitting:

  1. A reference image is defined in which the gray distribution of the laser stripe is approximately a Gaussian distribution (see Sec. 2.2). The laser stripe images are compared with the reference image, and the luminance, contrast, and structure comparisons are calculated.
  2. The average gray distribution of the laser stripe is analyzed and fitted by a Gaussian curve. The comparison between the gray distribution of the laser stripe and the Gaussian curve can be written as

    g(y) = 1 − sqrt{ (1/T) Σ_{u=1}^{T} [ ((1/M) Σ_{v=1}^{M} x_{u,v} − A_u e^{−r_u^2/w_u^2}) / ((1/M) Σ_{v=1}^{M} x_{u,v}) ]^2 },  (5)

    where A_u e^{−r_u^2/w_u^2} is the Gaussian curve describing the gray distribution, (1/M) Σ_{v=1}^{M} x_{u,v} is the average gray value of the laser stripe, and T is the total number of pixels in the cross-sections of the laser stripe. According to the theory of structural similarity,21 the evaluation model of Gaussian fitting structural similarity can be expressed as follows:

    GFSSIM(x,y) = [l(x,y)]^α [c(x,y)]^β [s(x,y)]^γ [g(y)]^λ,  (6)

    where λ is the adaptive scaling exponent for the Gaussian fitting comparison. When the gray distribution of the laser stripe is similar to the reference image, the value of the Gaussian fitting structural similarity is 1.
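The Gaussian fitting term g(y) of Eq. (5) and the combined index of Eq. (6) can be sketched as follows; the (T, M) data layout and the per-row fit parameters A, r, w are our assumptions about how the per-cross-section quantities are stored:

```python
import numpy as np

def gaussian_fit_term(stripe, A, r, w):
    """Gaussian-fitting comparison g(y) of Eq. (5).

    stripe  : (T, M) array; row u holds M gray samples of cross-section u.
    A, r, w : length-T arrays of fitted Gaussian amplitude, offset, and width.
    """
    stripe = np.asarray(stripe, dtype=float)
    mean_gray = stripe.mean(axis=1)            # (1/M) sum_v x_{u,v}
    gauss = A * np.exp(-(r**2) / (w**2))       # A_u e^{-r_u^2/w_u^2}
    rel_err = (mean_gray - gauss) / mean_gray  # relative fitting error per row
    return 1.0 - np.sqrt(np.mean(rel_err**2))  # Eq. (5)

def gfssim(l, c, s, g, alpha=1.0, beta=1.0, gamma=1.0, lam=1.0):
    """Eq. (6): the SSIM terms extended by the Gaussian-fitting term g(y)."""
    return l**alpha * c**beta * s**gamma * g**lam
```

When the average gray values coincide exactly with the fitted Gaussian, g(y) evaluates to 1 and the index reduces to the plain SSIM of Eq. (1).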

When the laser scans the measured objects over a large-scale measuring range, a large incident angle will lead to a large laser stripe center deviation. On the other hand, in measuring ranges with a smaller incident angle, the laser stripe center has a smaller deviation from the geometric center. Thus, for small incident angles, the laser stripe center can be extracted using the geometric center extraction method. However, in most cases, the gray distribution of the laser stripe significantly deviates from that of the reference image. When the value of the Gaussian fitting structural similarity is less than a certain threshold, the laser stripe center is no longer close to the geometric center. Therefore, compensation should be applied to the laser stripe center after the geometric center extraction is performed.

Threshold Value of Gaussian Fitting Structural Similarity

The threshold value of Gaussian fitting structural similarity is used for determining the similarity between the gray distribution of the laser stripe and the reference distribution. In this section, the relationship between the Gaussian fitting structural similarity and the centerline deviation of the laser stripe is analyzed after calculating the Gaussian fitting structural similarity of laser stripes from different incident angles.

First, images of laser stripes are captured from different angles, and the gray distributions of laser stripes with various incident angles between 0 and 40 deg are shown in Fig. 3. Gray distribution curves for different incident angles are expressed using different colors. The red dashed line shows the center of the reference laser stripe, while the blue dashed line shows the center of the laser stripe with the maximum (40 deg) incident angle.

Fig. 3: The gray distribution of laser stripes with different incident angles.

Then the laser stripe images are processed using the median filter. Additionally, the reference laser stripe image is set as the initial position, which is the position of the laser stripe that is vertically projected on the measured object with a camera observing the light from a vertical position. The spatial relationship between the reference laser stripe and the projected laser stripe is shown in Fig. 4.

Fig. 4: The spatial relationship between the reference laser stripe and the projected laser stripe.

The distance between the incident position of the laser and the surface of the measured object is defined as d. When the incident angle of the laser stripe is i, the theoretical distance from the center of the projected laser stripe to the reference laser stripe is d·tan i. Therefore, the deviation of the laser stripe center is the difference between the theoretical distance and the actual distance. The standard deviation σ_ers,i for the deviation of the laser stripe center can be calculated using Peters' equation:22

σ_ers,i = 1.253 · Σ_{v=1}^{n} |Δl_{i,v} − d tan i| / sqrt(n(n−1)),  (7)

where n is the number of gray columns in the laser stripe images and Δl_{i,v} is the deviation between the captured laser stripe center and the reference laser stripe center. The relationship between the Gaussian fitting structural similarity and the standard deviation of the centerline is shown in Fig. 5.
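Peters' equation (7) is straightforward to evaluate; a minimal sketch, with the column offsets, distance, and angle as placeholder inputs:

```python
import numpy as np

def peters_std(delta_l, d, incident_angle):
    """Standard deviation of the centerline deviation via Peters' equation, Eq. (7).

    delta_l        : measured center offsets, one per gray column.
    d              : laser-to-surface distance (same length unit as delta_l).
    incident_angle : laser incidence angle in radians.
    """
    delta_l = np.asarray(delta_l, dtype=float)
    n = delta_l.size
    # Absolute residuals against the theoretical offset d * tan(i).
    residuals = np.abs(delta_l - d * np.tan(incident_angle))
    return 1.253 * residuals.sum() / np.sqrt(n * (n - 1))
```

If every column offset equals the theoretical d·tan i exactly, the residuals vanish and the returned standard deviation is zero.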

Fig. 5: The relationship between the Gaussian fitting structural similarity and the standard deviation.

As shown in Fig. 5, with increasing incident angles, the Gaussian fitting structural similarity decreases and the centerline deviation of the laser stripe increases. The Gaussian fitting structural similarity of the stripe is relatively high when the centerline deviation is small, and the value of similarity sharply decreases when the centerline offset exceeds a particular threshold. The relationship between the Gaussian fitting structural similarity and the standard deviation of the laser stripe center deviation can be described by a high-order polynomial curve. Thus, when the centerline deviation equals one third of the permissible error ε, the corresponding value of the Gaussian fitting structural similarity can be obtained from the fitted curve.
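The polynomial relationship described above can be sketched as follows; the sample points are hypothetical stand-ins for the Fig. 5 data, and the cubic degree is an assumption:

```python
import numpy as np

# Hypothetical (GFSSIM, centerline standard deviation) samples standing in for
# the Fig. 5 measurements; the real values come from the experiment.
gfssim = np.array([0.9995, 0.999, 0.998, 0.996, 0.992, 0.985])
sigma = np.array([0.02, 0.04, 0.08, 0.18, 0.40, 0.80])   # mm

# Fit sigma as a polynomial in (1 - GFSSIM); the variable is rescaled so that
# the cubic least-squares fit stays well conditioned.
u = (1.0 - gfssim) * 100.0
coeffs = np.polyfit(u, sigma, deg=3)

def deviation_at(similarity):
    """Predicted centerline standard deviation for a given GFSSIM value."""
    return np.polyval(coeffs, (1.0 - similarity) * 100.0)
```

Evaluating the fitted curve at the similarity corresponding to the permissible deviation then yields the threshold value used in the compensation decision.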

The center of the laser stripe deviates from the geometric center due to the material surface reflectivity, laser spatial transmission, camera imaging characteristics, and the incidence angle from the camera. When the deviation of the laser stripe center exceeds a certain threshold, the accuracy of the laser center extraction is further decreased by using the geometric center extraction method. Thus, we propose a center compensation method based on the analysis of the multiple source factors (reflectivity characteristics of the material surfaces, spatial transmission characteristics of the laser, and imaging characteristics of cameras) for improving the accuracy of laser center extraction.

Relationships Between the Gray Distribution of the Laser Stripe and Multiple Source Factors

The gray distribution of laser stripes is affected in real-time by the spatial light intensity distribution, material reflectivity characteristics, and spatial transmission characteristics. Thus, the real-time information for a laser stripe can be quantifiably expressed by a laser stripe model that is based on associations between the gray distribution of a laser stripe and the multiple source factors. The imaging mechanism for the image sensor is shown in Fig. 6.

Fig. 6: The imaging mechanism of the complementary metal oxide semiconductor image sensor.

Based on the sensitivity characteristics of the image sensor, the model of the optical-electronic conversion is expressed as follows:23,24

E = a · H^ρ + b,  (8)

where E is the electrical output signal of the sensor, H is the accumulated light energy, a is the coefficient of the electrical output signal, b is the electrical output signal of the sensor without lighting, and ρ is the index of the optical-electronic conversion, which is, in general, approximately equal to 1. Thus, the electrical output signal of the sensor has a linear relationship with the accumulated light energy, and the gray distribution of images captured by the image sensor reflects the spatial light energy of the laser.

In the actual measuring space, the laser stripes are projected on the measuring surface of objects at a certain incident angle. Then the laser stripes are reflected off the surface and captured by the camera. The laser stripe images are affected by the physical and geometric quantities shown in Fig. 7.

Fig. 7: The schematic diagram of laser reflection on the surface of composite material T800.

The relationship between the gray distribution of the laser stripe and the multiple source factors can be written as

E_lc = f[φ_las(d,i,k), φ_cm(d,i,k), φ_cam(d,i,k), φ_eni],  (9)

where f(φ_las, φ_cm, φ_cam, φ_eni) is a compound function of laser stripe energy that includes the influence of the laser spatial transmission characteristics, material reflectivity characteristics, imaging characteristics of the cameras, and the spatial light intensity distribution. Moreover, φ_las(d,i,k), φ_cm(d,i,k), and φ_cam(d,i,k) represent the functions of laser stripe energy affected by the laser spatial transmission characteristics, material reflectivity characteristics, and imaging characteristics of the cameras, respectively, with spatial distance d, incident angle i, and physical characteristic coefficient k.

First, we analyze the laser spatial transmission characteristics. The envelope of a laser beam follows a hyperbolic curve along the propagation direction. Because the laser transmitters used in large-scale triangulation scanning measurement systems have high beam quality, the divergence angle of these lasers is quite small. Thus, we assume that the laser is transmitted in a straight line (with no divergence).

Figure 8 shows the laser stripe transmission in space. With a certain laser flare angle, the relationship between the lengths of the laser stripes l_p, l_q and the projection distances d_p, d_q agrees with the theory of similar triangles:

l_q / l_p = d_q / d_p.  (10)

Fig. 8: The laser stripe transmission in space.

Additionally, the measurement range of a large-scale triangulation scanning measurement system is limited. Within a certain measurement distance, the luminous flux of the laser in a given cross-section is relatively constant. Therefore, the luminous flux ϕ_p in the cross-section at projection distance d_p is nearly equal to the luminous flux ϕ_q in the cross-section at projection distance d_q. Thus, the light intensity of the laser is inversely proportional to the transmission length of the laser stripe. The light intensity E_p over a segment of length η of a laser stripe at projection distance d_p can be expressed as

E_p = [ϕ_p / (l_p ω_p)] (η ω_p),  (11)

where ω_p is the width of laser stripe p. On the basis of Eqs. (10) and (11), the relationship between the light intensity E_p of laser stripe p and the light intensity E_q of laser stripe q can be deduced as

E_p / E_q = [ϕ_p / (l_p ω_p)] (η ω_p) / {[ϕ_q / (l_q ω_q)] (η ω_q)} = l_q / l_p = d_q / d_p,  (12)

where ω_p and ω_q are the widths of laser stripes p and q, respectively. Compared with the reference image, the light intensity of the laser stripe is given by

E_las = e_st / d_las + ε_st,  (13)

where e_st is the coefficient of light intensity, d_las is the projection distance of the reference laser stripe, and ε_st is the fitting error.
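Because Eq. (13) is linear in 1/d, its two coefficients can be recovered by ordinary least squares; a sketch with synthetic data (the coefficient values e_st = 5.0e4 and ε_st = 12.0 are assumed for illustration, not measured):

```python
import numpy as np

# Synthetic brightness samples following Eq. (13): E = e_st/d + eps_st.
d = np.array([800.0, 1000.0, 1200.0, 1500.0, 2000.0])   # projection distance, mm
E = 5.0e4 / d + 12.0                                    # stripe brightness, gray levels

# Eq. (13) is linear in the regressor 1/d, so ordinary least squares on the
# design matrix [1/d, 1] recovers both the slope e_st and the offset eps_st.
A = np.vstack([1.0 / d, np.ones_like(d)]).T
e_st, eps_st = np.linalg.lstsq(A, E, rcond=None)[0]
```

The same fit applied to the measured brightness values yields the e_st and ε_st used later in Eqs. (15) and (16).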

After projection by the laser transmitter, the laser stripe is reflected by the surface of the measured object and then captured by the image sensor. The reflected laser stripes are affected by the incident angle, the reflection characteristics of the measured material, and the observation angle of the cameras. According to the reflection characteristics, the reflected light primarily comprises specular reflection light and diffuse reflection light.25,26 The energy of the diffuse reflection light is determined by the incident angle of the laser i and the material reflection coefficient k_d, while the energy of the specular reflection light is related to the material reflection coefficient k_s and the observation angle of the camera θ. When the intensity of the incident light is E_las, the intensity of the reflected light E can be expressed as follows:

E = E_las {k_d cos i + k_s [cos(θ − i)]^h},  (14)

where h is the exponent of the specular reflection light.

Moreover, when the distance from the laser transmitter to the plane of the object is d and the incident angle is i, the laser transmission distance is d / cos i. According to our analysis of the multiple source factors (laser spatial transmission characteristics, material reflectivity characteristics, imaging characteristics, and spatial light intensity distribution), the relationship between the gray distribution of the laser stripe and the multiple source factors can be expressed as

E_lc = f(φ_las, φ_cm, φ_cam, φ_eni) = [e_st / (d / cos i) + ε_st] {k_d cos i + k_s [cos(θ − i)]^h} + E_eni,  (15)

where E_lc is the light intensity of the captured laser stripes and E_eni is the light intensity of the environment. Because the light intensity of the environment is much lower than the intensity of the laser, the environmental term can be ignored, and Eq. (15) reduces to

E_lc = [e_st / (d / cos i) + ε_st] {k_d cos i + k_s [cos(θ − i)]^h}.  (16)
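Once the coefficients are fitted, Eq. (16) can be evaluated directly; a minimal sketch in which all coefficient values passed in are placeholders:

```python
import numpy as np

def stripe_intensity(d, i, theta, e_st, eps_st, k_d, k_s, h):
    """Captured stripe intensity following Eq. (16).

    d     : perpendicular laser-to-surface distance (path length is d/cos i).
    i     : laser incidence angle (rad); theta : camera observation angle (rad).
    e_st, eps_st, k_d, k_s, h : coefficients fitted for the material and setup.
    """
    transmission = e_st / (d / np.cos(i)) + eps_st            # Eq. (13) at d/cos i
    reflection = k_d * np.cos(i) + k_s * np.cos(theta - i) ** h
    return transmission * reflection
```

At normal incidence and normal observation (i = θ = 0), the expression collapses to (e_st/d + ε_st)(k_d + k_s), which provides a quick sanity check against the reference configuration.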

Center Compensation Method Based on Geometric Analysis

In the image capturing process, the position of the camera is fixed. The scanning plane is produced by the motion of the laser stripe over different incident angles. A schematic diagram of a large-scale triangulation scanning measurement system is shown in Fig. 9.

Fig. 9: Schematic diagram of the scanning measurement system.

Because the gray distribution of the laser stripe is symmetric in the reference image, the center of the laser stripe coincides with its geometric center. As discussed above, when the incident angle of the laser transmitter varies, the center of the laser stripe deviates from the geometric center, and the width of the laser stripe changes. The center of the reference laser stripe is the initial position for our system, and the position of the laser stripe center is defined as the distance between this initial position and the center of the extracted laser stripe. Half of the flare angle of the laser is denoted by α. For a distance d from the laser transmitter to the plane of the measured object and an incident angle i, the position of the laser stripe center extracted by the geometric center extraction method, i.e., the midpoint l_lf between the two edges of the stripe, is

l_lf = (1/2) d [tan(i + α) + tan(i − α)].  (17)

However, the actual position of the laser stripe center is

l′_lf = d tan i.  (18)

Thus, the deviation of the laser stripe center can be expressed as

Δl_f = l_lf − l′_lf = d [ (1/2) tan(i + α) + (1/2) tan(i − α) − tan i ].  (19)

In the measurement space, the imaging principle of the camera approximates the pinhole model, and the measured objects are projected onto the imaging plane through the optical center of the lens. When the angle between the direction of observation and the normal vector of the measured object plane is θ, the deviation of the laser stripe can be expressed as

Δ = Δl_f · cos θ = d [ (1/2) tan(i + α) + (1/2) tan(i − α) − tan i ] cos θ.  (20)

Because the incident angle is controlled by the rotating platform, the incident angle is a known quantity. Additionally, the observation angle can be calculated by the relationship between the gray distribution of the laser stripe and the multiple source factors. Thus, the center of the laser stripe can be compensated according to Eq. (20).
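The compensation of Eq. (20) reduces to a few trigonometric terms; a minimal sketch (angles in radians, distances in the units of d):

```python
import numpy as np

def center_compensation(d, i, alpha, theta):
    """Deviation Delta of Eq. (20) between the geometric and actual stripe centers.

    d     : laser-to-plane distance;  i : incidence angle;
    alpha : half of the laser flare angle;
    theta : angle between the viewing direction and the surface normal.
    """
    geometric = 0.5 * d * (np.tan(i + alpha) + np.tan(i - alpha))   # Eq. (17)
    actual = d * np.tan(i)                                          # Eq. (18)
    return (geometric - actual) * np.cos(theta)                     # Eqs. (19), (20)
```

At i = 0 the two edge terms cancel by symmetry and the compensation vanishes, which matches the reference configuration in which the stripe center and geometric center coincide.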

The large-scale triangulation scanning measurement system includes two CMOS cameras (VC-25MC-M/C 30, Korea Vieworks Company) with a resolution of 4096 × 3072 pixels and a pixel size of 5.5 μm, linear solid lasers with wavelengths of 450 nm, lenses (AF-S 24-70 mm f/2.8G, Nikkor), a controlled platform, and an imaging workstation (Z820, HP). A flat plate of the composite material T800 is selected as the measured object because this is the primary material used in aircraft components. The experimental system is shown in Fig. 10.

Fig. 10: Diagram of the experimental system.

Using this measurement system, the accuracy of the proposed center compensation method is verified through a large number of tests. Moreover, a flat tail of an airplane was measured in the assembly workshop in an aviation manufacturing company to further validate the proposed measuring method.

Threshold Value of Gaussian Fitting Structural Similarity

The Gaussian fitting structural similarity of corresponding stripes is calculated to analyze the gray distribution of different stripes. The relationship between the gray distribution of the laser stripes on T800 aviation composite materials and the corresponding Gaussian fitting structural similarity should be discussed to confirm the threshold value for the Gaussian fitting structural similarity.

Images of laser stripes on the composite material T800 are shown in Fig. 11. For capturing the reference image of the laser stripes, the laser incidence direction, the camera observation direction, and the measured surface normal vector are set to the same direction, and the laser incident angle is defined as 0 deg. In this situation, the distance from the measured plane to the laser transmitter is measured by a laser rangefinder. Then, utilizing the electric rotary platform with a repeated positioning accuracy of 0.003 deg, the incident angle of the laser is changed in steps of 2 deg. The laser stripes with different incident angles are projected on the surface. The original images and gray distribution images of laser stripes with different incident angles are shown in Fig. 11.

Fig. 11: The original images (top row) and gray distribution images (bottom row) of laser stripes with different incident angles: (a) 0 deg, (b) 12 deg, and (c) 30 deg.

Then the Gaussian fitting structural similarity is calculated for the corresponding stripes with different incident angles. Moreover, the movement of the laser stripe on the measured plane can be obtained through the relative vertical distance and the incident angle. Because part of the laser stripe is saturated, the gray intensity of the saturated portion is recovered by Gaussian fitting of the gray intensity of the unsaturated portion. The movement distance and the values of Gaussian fitting structural similarity are shown in Table 1.
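The recovery of the saturated portion by Gaussian fitting of the unsaturated part can be sketched as follows: in log space a Gaussian is a parabola, so a quadratic fit of the unsaturated pixels yields the amplitude, center, and width. The synthetic profile below is illustrative only:

```python
import numpy as np

def recover_gaussian(x, g, sat_level=255.0):
    """Recover (A, mu, w) of g = A*exp(-(x-mu)^2/w^2) from unsaturated samples.

    log g is quadratic in x: log g = -(1/w^2) x^2 + (2 mu/w^2) x + (log A - mu^2/w^2),
    so an ordinary quadratic fit over the unsaturated pixels recovers the peak.
    """
    mask = g < sat_level                       # keep only unsaturated pixels
    c2, c1, c0 = np.polyfit(x[mask], np.log(g[mask]), 2)
    w = np.sqrt(-1.0 / c2)                     # c2 = -1/w^2
    mu = c1 * w**2 / 2.0                       # c1 = 2*mu/w^2
    A = np.exp(c0 + mu**2 / w**2)              # c0 = log A - mu^2/w^2
    return A, mu, w

# Synthetic saturated cross-section: true amplitude 400, clipped at 255.
x = np.arange(-20.0, 21.0)
g = np.minimum(400.0 * np.exp(-(x - 1.5) ** 2 / 36.0), 255.0)
A, mu, w = recover_gaussian(x, g)
```

The fitted amplitude, center, and width then stand in for the clipped pixels when the stripe statistics are computed.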

Table 1: The movement distance and the values of Gaussian fitting structural similarity.

The relationship between the deviation and Gaussian fitting structural similarity is illustrated in Fig. 12(b). In accordance with the accuracy requirement for the actual measurement, the maximal error of the laser stripe extraction is 0.15 mm. Thus, the value of the Gaussian fitting structural similarity should be >0.998 according to the curve fitting values in Fig. 12(b). The relationship between the incident angles and the Gaussian fitting structural similarity is illustrated in Fig. 12(a).

Fig. 12: The relationship between the Gaussian fitting structural similarity and (a) the incident angle and (b) the deviation.

Figure 12 shows that when the value of Gaussian fitting structural similarity is 0.998, the incident angle is 20 deg. In this situation, the deviation from the actual laser center to the geometric center is beyond the acceptable range. Thus, the extraction center of the laser stripe should be compensated.

Compensation for the Laser Stripe Center

The relationship between the gray distribution of the laser stripe and the multiple source factors is verified on the measurement system. To simplify the calculation of the correlation coefficients of the model, the incident angle of the laser and the observation angle of the camera are set to 0 deg according to Eq. (16); the laser incidence direction, the camera observation direction, and the normal vector of the measured surface are therefore the same. The laser transmitter is moved parallel to this direction, so the projection distance between the laser transmitter and the object surface changes, and the spatial light intensity distribution of the laser must be considered. The gray values of the laser stripes are shown in Fig. 13.

Fig. 13: The relationship between laser intensity and distance.

Figure 13 shows that the brightness of the laser stripe is linear in the reciprocal of the projection distance; the fitted curve is based on Eq. (13). These results confirm the theoretical analysis of the laser spatial transmission characteristics described in Sec. 4.1.
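
The linear dependence on the reciprocal distance can be checked with a small least-squares fit. The readings below are synthetic, constructed to satisfy I = k/d + b exactly; Eq. (13)'s actual coefficients are not reproduced in this excerpt:

```python
import numpy as np

# hypothetical brightness readings I at projection distances d (mm);
# synthetic data following I = k/d + b with k = 160000, b = 10
d = np.array([800.0, 1000.0, 1250.0, 1600.0, 2000.0])
I = np.array([210.0, 170.0, 138.0, 110.0, 90.0])

# fit brightness against the reciprocal distance, as the text describes
k, b = np.polyfit(1.0 / d, I, 1)
I_pred = k / d + b
max_residual = np.abs(I - I_pred).max()
```

With real measurements the residual indicates how well the inverse-distance model of the spatial transmission characteristics holds over the working range.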

Then, with a constant projection distance and a fixed incident angle, the laser stripes are captured from different observation directions. The relationship between the camera offset angle and the light brightness is shown in Fig. 14; the fitted curve depends on Eq. (14). As the incident angle changes, the coefficients of the material reflectivity characteristics and the spatial transmission characteristics are calculated using Eq. (16). Thus, a laser stripe gray distribution model based on the analysis of the multiple source factors is obtained for this experimental condition.
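
The angular calibration step can be sketched under an assumed model. Eq. (14) is not reproduced in this excerpt, so the snippet substitutes a cosine-power falloff I(phi) = I0*cos(phi)^n, which becomes a straight line in log space and can be fitted the same way; the readings and parameters are synthetic:

```python
import numpy as np

# synthetic readings following an assumed cosine-power falloff
# I = I0 * cos(phi)**n, i.e. log I = log I0 + n * log cos(phi)
phi = np.radians(np.array([5.0, 10.0, 15.0, 20.0, 25.0, 30.0]))
I = 180.0 * np.cos(phi) ** 3.0          # I0 = 180, n = 3 (assumed values)

# linear fit in log space recovers the falloff exponent and peak brightness
n, log_I0 = np.polyfit(np.log(np.cos(phi)), np.log(I), 1)
I0 = np.exp(log_I0)
```

Whatever form Eq. (14) takes, the same pattern applies: choose a parameterization that is linear in some transform of the data, fit it from a sweep of observation angles, and feed the coefficients into the gray distribution model.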

Fig. 14: The relationship between camera viewing angle and laser light intensity.

Based on this model, both the center extraction method and the compensation method are applied to extract the center of the laser stripe when the laser stripe has a certain angle of incidence. The results, together with the theoretical and actual deviations of the laser stripe, are shown in Table 2.

Table 2: Experimental results.

Based on the center extraction and compensation methods, the centers of the laser stripes are extracted; the reconstruction of the measured plane is shown in Fig. 15. The compensation method based on the multiple source factors decreases the center deviation of the laser stripe, and the measurement accuracy is improved by up to 99.86% compared with the center extraction method alone.
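
For reference, the baseline center extraction step can be sketched as a column-wise gray-centroid (gravity-center) extractor. This is a standard method, not the paper's exact pipeline; the paper's contribution is the model-based compensation applied on top of such an extractor. The stripe image below is synthetic:

```python
import numpy as np

def centroid_centers(img, threshold=30.0):
    """Column-wise gray-centroid stripe center, a common baseline extractor.

    For each image column, background pixels below `threshold` are
    suppressed and the center is the intensity-weighted mean row index.
    """
    rows = np.arange(img.shape[0], dtype=float)[:, None]
    w = np.where(img >= threshold, img, 0.0)       # suppress background
    mass = w.sum(axis=0)
    centers = np.full(img.shape[1], np.nan)        # NaN where no stripe found
    valid = mass > 0
    centers[valid] = (rows * w).sum(axis=0)[valid] / mass[valid]
    return centers

# synthetic vertical stripe: every column is a Gaussian centered at row 24,
# so the centroid should coincide with the true center
r = np.arange(50, dtype=float)
profile = 220.0 * np.exp(-(r - 24.0) ** 2 / (2 * 3.0 ** 2))
img = np.tile(profile, (10, 1)).T                  # shape (rows, cols)
c = centroid_centers(img)
```

On a symmetric, unsaturated stripe the centroid equals the true center; the deviations analyzed in the paper arise precisely when saturation, oblique incidence, or reflectivity skew the profile away from this ideal.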

Fig. 15: The reconstruction of the measured plane: (a) reconstruction result of the traditional extraction method, (b) reconstruction result of the proposed extraction method, (c) reconstruction error of the traditional extraction method, and (d) reconstruction error of the proposed extraction method.

Field Experiment Validation

In the assembly workshop of an aviation manufacturing company, the flat tail of an airplane is measured to test the proposed center compensation method. The profile of the composite part is within 1200 mm × 1000 mm. The cameras are calibrated using the plane target calibration method, and the intrinsic and extrinsic parameters of the two industrial cameras are determined. The experimental results are shown in Table 3, and the reconstruction of the measured plane is shown in Fig. 16. The binocular vision measurement method based on laser scanning is then used to achieve a high-precision reconstruction of the geometric parameters; the measurement accuracy is up to 99.75% compared with the theoretical size.
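
The binocular reconstruction step can be sketched with textbook linear (DLT) triangulation from the two calibrated projection matrices. This is an illustrative stand-in, not the system's actual reconstruction code, and the camera parameters below are synthetic:

```python
import numpy as np

def triangulate(P1, P2, uv1, uv2):
    """Linear (DLT) triangulation of one point seen by two calibrated cameras.

    P1, P2 are 3x4 projection matrices from calibration; uv1, uv2 are the
    extracted stripe-center image coordinates in each view.  Each view
    contributes two linear constraints u*(p3.X) - (p1.X) = 0 and
    v*(p3.X) - (p2.X) = 0; the SVD null vector solves the stacked system.
    """
    A = np.vstack([
        uv1[0] * P1[2] - P1[0],
        uv1[1] * P1[2] - P1[1],
        uv2[0] * P2[2] - P2[0],
        uv2[1] * P2[2] - P2[1],
    ])
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X[:3] / X[3]                            # dehomogenize

# two synthetic cameras: identity pose and a 100 mm baseline along x
K = np.array([[1000.0, 0.0, 320.0], [0.0, 1000.0, 240.0], [0.0, 0.0, 1.0]])
P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = K @ np.hstack([np.eye(3), np.array([[-100.0], [0.0], [0.0]])])
Xw = np.array([50.0, -20.0, 1000.0])              # ground-truth point (mm)
p1 = P1 @ np.append(Xw, 1.0); uv1 = p1[:2] / p1[2]
p2 = P2 @ np.append(Xw, 1.0); uv2 = p2[:2] / p2[2]
X = triangulate(P1, P2, uv1, uv2)
```

Because the triangulated point depends directly on the matched stripe-center coordinates, sub-pixel errors in center extraction propagate into millimeter-level errors at these working distances, which is why the compensation step matters for the field measurement.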

Table 3: The results of the field experiment validation.

Fig. 16: Reconstruction of the measured plane.

Conclusion

In this study, we propose a laser stripe center extraction method based on the analysis of multiple source factors. The experimental results show that the proposed method significantly improves the accuracy of laser stripe extraction for large-scale triangulation scanning measurement systems. To achieve this result, the laser stripe evaluation method (Gaussian fitting structural similarity) effectively provides a threshold value for center compensation by evaluating the similarity between the measured images and the reference image. When the value of the Gaussian fitting structural similarity is beyond the defined threshold value, the geometric center deviates from the actual center of the laser stripe. This deviation is resolved by the proposed center compensation method, which is based on our analysis of the spatial light intensity distribution, material reflectivity characteristics, imaging characteristics, and spatial transmission characteristics. The laboratory experiments were conducted successfully, and the method has also been successfully applied to the measurement of aircraft components.

Acknowledgments

This work was supported by the National Basic Research Program of China 973 Project (Grant No. 2014CB046504), the National Natural Science Foundation of China (Grant Nos. 51227004 and 51375075), the Liaoning Provincial Natural Science Foundation of China (Grant No. 2014028010), and the Science Fund for Creative Research Groups (No. 51321004).

Yang Zhang is a PhD student at Dalian University of Technology. She received her BE degree in mechanical engineering from Dalian University of Technology in 2012. Her interests include three-dimensional measurement, binocular stereo vision, and digital image processing.

Wei Liu is an assistant professor at Dalian University of Technology. He received his BE degree in mechanical engineering from North China Electric Power University in 2001 and his PhD in mechanical engineering from Dalian University of Technology in 2007. He is the author of more than 50 journal papers and has written one book chapter. His current research interests include precision measurement and precision control.

Xiaodong Li is a master's student at Dalian University of Technology. He received his BE degree in mechanical design, manufacturing, and automation from Dalian University of Technology in 2013. His interests include large view field measurement, binocular stereo vision, and measurement system calibration.

Fan Yang is a master's student at Dalian University of Technology. He received his BE degree in mechanical design, manufacturing, and automation from Dalian Maritime University in 2014. His interests include camera calibration, three-dimensional measurement, binocular stereo vision, and aircraft assembly.

Peng Gao is a master’s student at Dalian University of Technology. He received his BE degree in mechanical engineering from Dalian University of Technology in 2014. His interests include camera calibration, three-dimensional measurement, binocular stereo vision, and aircraft assembly.

Zhenyuan Jia is a professor at Dalian University of Technology. He received his BE, MS, and PhD degrees in mechanical engineering from Dalian University of Technology in 1980, 1984, and 1987, respectively. His interests include precision and nontraditional machining, precision measurement, and precision control.

© The Authors. Published by SPIE under a Creative Commons Attribution 3.0 Unported License. Distribution or reproduction of this work in whole or in part requires full attribution of the original publication, including its DOI.

Figures

Fig. 1: The schematic diagram of the active triangulation scanning measurement system.
Fig. 2: Schematic diagram of the active triangulation scanning measurement system showing the gray-scale distribution: (a) laser stripe with the incident angle and (b) reference laser stripe.
Fig. 3: The gray distribution of laser stripes with different incident angles.
Fig. 4: The spatial relationship between the reference laser stripe and the projected laser stripe.
Fig. 5: The relationship between the Gaussian fitting structural similarity and the standard deviation.
Fig. 6: The imaging mechanism of the complementary metal oxide semiconductor (CMOS) image sensor.
Fig. 7: The schematic diagram of laser reflection on the surface of composite material T800.
Fig. 8: The laser stripe transmission in space.
Fig. 9: Schematic diagram of the scanning measurement system.
Fig. 10: Diagram of the experimental system.
Fig. 11: The original images (top row) and gray distribution images (bottom row) of laser stripes with different incident angles: (a) 0 deg, (b) 12 deg, and (c) 30 deg.
Fig. 12: The relationship between the Gaussian fitting structural similarity and (a) the incident angle and (b) the deviation.
Fig. 13: The relationship between laser intensity and distance.
Fig. 14: The relationship between camera viewing angle and laser light intensity.
Fig. 15: The reconstruction of the measured plane: (a) reconstruction result of the traditional extraction method, (b) reconstruction result of the proposed extraction method, (c) reconstruction error of the traditional extraction method, and (d) reconstruction error of the proposed extraction method.
Fig. 16: Reconstruction of the measured plane.
