Open Access
15 October 2015
Accuracy improvement in laser stripe extraction for large-scale triangulation scanning measurement system
Yang Zhang, Wei Liu, Xiaodong Li, Fan Yang, Peng Gao, Zhenyuan Jia
Abstract
Large-scale triangulation scanning measurement systems are widely used to measure the three-dimensional profile of large-scale components and parts. The accuracy and speed of the laser stripe center extraction are essential for guaranteeing the accuracy and efficiency of the measuring system. However, in the process of large-scale measurement, multiple factors can cause deviation of the laser stripe center, including the spatial light intensity distribution, material reflectivity characteristics, and spatial transmission characteristics. A center extraction method is proposed for improving the accuracy of the laser stripe center extraction based on image evaluation of Gaussian fitting structural similarity and analysis of the multiple source factors. First, according to the features of the gray distribution of the laser stripe, evaluation of the Gaussian fitting structural similarity is estimated to provide a threshold value for center compensation. Then using the relationships between the gray distribution of the laser stripe and the multiple source factors, a compensation method of center extraction is presented. Finally, measurement experiments for a large-scale aviation composite component are carried out. The experimental results for this specific implementation verify the feasibility of the proposed center extraction method and the improved accuracy for large-scale triangulation scanning measurements.

1.

Introduction

Geometric accuracies of large-scale aircraft components or parts, including the tail and wings, are essential specifications for determining the airworthiness of the major subassemblies or subsystems of an aircraft.1–3 Geometrical measurements of large-scale aircraft components are fundamental for both aircraft assembly and aircraft reliability testing.4 Nevertheless, the inspection of large-scale aircraft subsystems remains a delicate task because it requires both a large measurement range and high accuracy.5–7 Machine-vision technologies have become important tools for the three-dimensional measurement of product structures because they enable noncontact measurements with high efficiency and accuracy.8 Large-scale triangulation scanning measurement systems, which are based on machine-vision technology, are widely used in industry for accurately measuring the three-dimensional profile of large-scale components.9,10

Triangulation scanning measurement systems are characterized by the reconstruction of dynamic laser stripes on scanned large-scale components. The three-dimensional profile is accurately measured by analyzing image sequences of laser stripes recorded by charge-coupled device/complementary metal oxide semiconductor (CMOS) cameras. Because the laser stripe center is a unique feature in the images, the extraction accuracy of the laser stripe center is a decisive factor for measurement accuracy.11–13 However, due to the large size of aircraft components, the laser stripe covers a long scan range. In addition to variations due to multiple lighting effects (illumination, reflectivity of the object, light source characteristics, etc.), errors are easily introduced into the center extraction results of large-scale laser stripes. Because conventional center extraction methods cannot extract laser stripes with sufficiently high accuracy for large-scale parts, the development of a highly accurate laser stripe extraction method is essential for measuring large-scale aircraft parts.14

To improve the accuracy of laser stripe extraction, conventional center extraction methods, such as geometric center extraction, barycenter extraction, and Gaussian fitting extraction, have been enhanced. Lukas et al. proposed an enhanced Gaussian fitting extraction method.15 In this method, the initial center of the laser stripe was extracted using a conventional extraction method; the laser stripe center was then extracted using the Gaussian fitting method within a range of 5 pixels around the initial center. Although this method could extract the laser stripe with high accuracy, it was only applicable to laser stripes with a uniform gray distribution and width. Jang and Hong proposed a new method for detecting curvilinear structures.16 The edge of the input image was extracted using a Canny edge detector,17 and the distance from each pixel to the nearest edge of the feature was calculated based on Euclidean distance mapping.18 Thus, the light center could be obtained by removing extraneous points after the extracted curve was refined into a 1-pixel-wide stripe.16 The method proposed by Jang and Hong could be used for general natural images with good robustness;16 however, it is considered inappropriate for high-accuracy measurement in industry because it can only reach pixel-level accuracy. Steger first obtained the normal of the laser stripe center using a Hessian matrix; the position of the maximum gray value along this normal could then be calculated as a subpixel center.19 The resulting method showed high extraction accuracy with high stability; however, because it requires a large number of arithmetic operations, it is inappropriate for high-speed center extraction. Finally, Wei proposed a robust automatic method that combines erosion, thinning, and the least-median-of-squares algorithm to overcome the interference of severe partial reflections in laser stripe center extraction; however, it does not work well for measuring large-scale components.20

In this study, a novel laser stripe extraction method is proposed with high accuracy and efficiency. After analyzing the Gaussian fitting structural similarity and image features of laser stripes, the deviation in the laser stripe extraction can be corrected, thus improving the measurement accuracy of a large-scale triangulation scanning system.

Characteristics of a large-scale triangulation scanning measurement system and laser stripe images are described and analyzed in Sec. 2. To determine deviations in the laser stripe extraction, an image evaluation method based on the structural similarity between Gaussian fittings is presented in Sec. 3. In Sec. 4, the relationships between the gray distribution of a laser stripe and the multiple source factors are discussed. Then the center compensation model is established for laser stripes on the surface of composite materials to improve the accuracy of laser stripe extraction. According to the method of laser stripe extraction, specific experimental implementations are performed in Sec. 5. Moreover, the improvement in the accuracy of the large-scale triangulation scanning measurement system is verified by the measurement results of large-scale aircraft components. Some conclusions are discussed in Sec. 6.

2.

Characteristics of Measurement System and Laser Stripe Images

2.1.

Active Triangulation Scanning Measurement System

The active triangulation scanning measurement system using laser stripes is composed of two cameras with high speed and resolution, one linear diode laser transmitter with high stability, one automatically controlled platform with high accuracy, and one graphic workstation, as shown in Fig. 1. During the measurement, linear diode laser stripes are projected onto the surface of objects. With the rotation of the automatically controlled platform, the laser stripes are scanned over the object. The image sequences of the laser stripes are captured by binocular cameras. After establishing the three-dimensional system of coordinates through the binocular camera calibration, the profile of a large-scale object can be obtained by extracting the center lines of the laser stripes in the image sequences. Thus, the extraction accuracy of the laser stripe is the decisive factor in improving the accuracy of active triangulation scanning measurements.

Fig. 1

The schematic diagram of active triangulation scanning measurement system.

OE_54_10_105108_f001.png

2.2.

Gray Distribution Features of Laser Stripe

Because the image information in the laser stripes is expressed by the intensity pattern (gray distribution), we will demonstrate the gray distribution features of the laser stripe in detail.

According to the luminous principle of the linear diode laser, a point laser beam passing through a cylindrical lens spreads into a continuous optical space. Laser stripes with a certain width are then formed where the measured object surface intersects this laser optical space.

The linear laser operating in the fundamental transverse mode, an important type of solid-state laser with high stability, is widely used in visual measuring systems. According to laser principles, the intensity distribution of the fundamental transverse mode follows a single Gaussian distribution in the space domain (cross-section of the laser beam).10 The gray distribution of the laser stripe is shown as curve a in Fig. 2. When the laser stripe is overexposed, the gray distribution of the stripe is a Gaussian curve with a plateau (curve b). However, the curve of the nonplateau area still agrees with the Gaussian distribution (curve c). When the laser stripe is projected vertically onto the measured object and the observation direction is perpendicular to the irradiated surface, the gray distribution is axisymmetric in the space domain. Moreover, the laser stripe center coincides with the geometric center of the laser stripe when the laser, the camera, and the normal vector of the measured surface have the same direction. However, the gray distribution is asymmetric under practical measurement conditions. With changes in the incident angle of the laser transmitter, the laser stripe center deviates from the geometric center, as shown in Fig. 2(b). Thus, this laser stripe center deviation should be considered in the extraction method to further improve the measurement accuracy.

Fig. 2

Schematic diagram of the active triangulation scanning measurement system showing the gray-scale distribution: (a) laser stripe with the incident angle and (b) reference laser stripe.

OE_54_10_105108_f002.png

During large-scale component measurements, laser stripes scanned over objects can have a large range of movement. Therefore, a larger incidence angle is produced. Due to the large deviation angle as well as the characteristics of the laser, measured object, and cameras, the center of the laser stripe can significantly deviate from the geometric center. Additionally, the gray distribution is not similar to the standard Gaussian distribution of laser stripes. Therefore, the compensation for the laser stripe deviation must be considered for different incident angles.
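To make this deviation concrete, the following Python sketch (illustrative only, not part of the paper's measurement system) simulates a mildly skewed Gaussian stripe cross-section and compares the gray-weighted geometric center with the position of the intensity peak; all parameter values are arbitrary.

```python
# Illustrative sketch (not from the paper): a skewed Gaussian stripe cross-section
# and the difference between the gray-weighted geometric center and the peak.
import numpy as np

x = np.arange(200, dtype=float)                          # pixel positions across the stripe
peak, width = 100.0, 12.0
gauss = 255.0 * np.exp(-((x - peak) ** 2) / width ** 2)  # ideal Gaussian cross-section
skew = 1.0 + 0.004 * (x - peak)                          # mild asymmetry, e.g., from an oblique incident angle
profile = np.clip(gauss * skew, 0.0, 255.0)

barycenter = np.sum(x * profile) / np.sum(profile)       # gray-weighted geometric center
peak_pixel = x[np.argmax(profile)]                       # position of the maximum gray value
print(f"barycenter = {barycenter:.2f} px, peak = {peak_pixel:.2f} px, "
      f"deviation = {barycenter - peak_pixel:.2f} px")
```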

3.

Evaluation of Laser Stripe Images

Because the gray distribution of a laser stripe is asymmetric and a significant deviation in the laser stripe center could occur, we propose an image evaluation method for the laser stripe extraction to determine the degree of deviation between the center of the captured laser stripe and the geometric center.

3.1.

Theory of Structural Similarity

The structural similarity (SSIM) index provides a method for measuring the similarity between the evaluating image x and reference image y.21 The reference image is a distortion-free image. In particular, the SSIM index includes a comparison of the luminance l(x,y), contrast c(x,y), and structure s(x,y) between images x and y. The SSIM index is given by

Eq. (1)

$\mathrm{SSIM}(x,y)=[l(x,y)]^{\alpha}\,[c(x,y)]^{\beta}\,[s(x,y)]^{\gamma},$
where α, β, and γ are the adaptive scaling indices for the luminance, contrast, and structure comparisons, respectively. The luminance l(x,y), contrast c(x,y), and structure s(x,y) comparisons can be expressed as follows:

Eq. (2)

$l(x,y)=\frac{2\mu_x\mu_y+C_1}{\mu_x^2+\mu_y^2+C_1},$

Eq. (3)

$c(x,y)=\frac{2\sigma_x\sigma_y+C_2}{\sigma_x^2+\sigma_y^2+C_2},$

Eq. (4)

$s(x,y)=\frac{\sigma_{xy}+C_3}{\sigma_x\sigma_y+C_3},$
where $\mu_x$ and $\mu_y$ are the mean luminance intensities of images x and y, given by $\mu_x=\frac{1}{m}\sum_{t=1}^{m}x_t$ and $\mu_y=\frac{1}{m}\sum_{t=1}^{m}y_t$, respectively. Similarly, $\sigma_x$ and $\sigma_y$ are the standard deviations of images x and y, given by $\sigma_x=\sqrt{\frac{1}{m-1}\sum_{t=1}^{m}(x_t-\mu_x)^2}$ and $\sigma_y=\sqrt{\frac{1}{m-1}\sum_{t=1}^{m}(y_t-\mu_y)^2}$, respectively. Finally, $\sigma_{xy}$ is the image covariance that represents the structural comparison, and $C_1$, $C_2$, and $C_3$ are small constants used to prevent a zero denominator. The method for estimating $C_1$, $C_2$, and $C_3$ is found elsewhere.21
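For reference, a minimal Python sketch of Eqs. (1)–(4) is given below. It assumes x and y are same-sized gray-level arrays; the stabilizing constants C1, C2, and C3 are illustrative values, not those recommended in Ref. 21.

```python
# Minimal sketch of Eqs. (1)-(4). x and y are same-sized gray-level arrays;
# C1, C2, and C3 are illustrative stabilizing constants, not the values of Ref. 21.
import numpy as np

def ssim_terms(x, y, C1=6.5, C2=58.5, C3=29.25):
    x = np.asarray(x, dtype=float).ravel()
    y = np.asarray(y, dtype=float).ravel()
    mu_x, mu_y = x.mean(), y.mean()
    sig_x, sig_y = x.std(ddof=1), y.std(ddof=1)
    sig_xy = np.cov(x, y, ddof=1)[0, 1]
    l = (2 * mu_x * mu_y + C1) / (mu_x ** 2 + mu_y ** 2 + C1)      # luminance, Eq. (2)
    c = (2 * sig_x * sig_y + C2) / (sig_x ** 2 + sig_y ** 2 + C2)  # contrast, Eq. (3)
    s = (sig_xy + C3) / (sig_x * sig_y + C3)                       # structure, Eq. (4)
    return l, c, s

def ssim(x, y, alpha=1.0, beta=1.0, gamma=1.0):
    l, c, s = ssim_terms(x, y)
    return (l ** alpha) * (c ** beta) * (s ** gamma)               # Eq. (1)
```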

3.2.

Gaussian Fitting Structural Similarity

For a triangulation scanning measurement system, the gray distribution of the laser stripe has a specific Gaussian distribution. To evaluate the degree of laser stripe deviation, we propose the following image evaluation method using the structural similarity of the gray distribution and its Gaussian fitting:

  • 1. A reference image is defined in which the gray distribution of the laser stripe is approximately a Gaussian distribution (see Sec. 2.2). The laser stripe images are compared with the reference image, and the luminance, contrast, and structure comparisons are calculated.

  • 2. The average gray distribution of the laser stripe is analyzed and fitted by a Gaussian curve. The comparison between the gray distribution of the laser stripe and the Gaussian curve can be written as

    Eq. (5)

    $g(y)=1-\frac{1}{T}\sum_{u=1}^{T}\left[\left(\frac{1}{M}\sum_{v=1}^{M}x_{u,v}-A_u e^{-r_u^2/w_u^2}\right)\Big/\left(\frac{1}{M}\sum_{v=1}^{M}x_{u,v}\right)\right]^2,$
    where $A_u e^{-r_u^2/w_u^2}$ is the Gaussian curve describing the gray distribution and $\frac{1}{M}\sum_{v=1}^{M}x_{u,v}$ is the average gray value of the laser stripe; T is the total number of pixels in the cross-sections of the laser stripe. According to the theory of structural similarity,21 the laser evaluation model of Gaussian fitting structural similarity can be expressed as follows:

    Eq. (6)

    $\mathrm{GFSSIM}(x,y)=[l(x,y)]^{\alpha}\,[c(x,y)]^{\beta}\,[s(x,y)]^{\gamma}\,[g(y)]^{\lambda},$
    where λ is the adaptive scaling index for the Gaussian fitting comparison. When the gray distribution of the laser stripe is similar to the reference image, the value of the Gaussian fitting structural similarity is 1. (A computational sketch of Eqs. (5) and (6) follows this list.)
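The sketch below illustrates how Eqs. (5) and (6) could be evaluated. It assumes the stripe image is a two-dimensional gray-level array whose rows run along the stripe and whose T columns run across the stripe width, reuses the ssim_terms() helper from the Sec. 3.1 sketch, and uses illustrative exponents.

```python
# Sketch of the Gaussian-fitting term g(y) of Eq. (5) and of GFSSIM, Eq. (6).
# Assumes `stripe` and `ref` are 2-D gray-level arrays of equal size, with rows
# running along the stripe and the T columns running across the stripe width.
# Reuses ssim_terms() from the Sec. 3.1 sketch; exponents are illustrative.
import numpy as np
from scipy.optimize import curve_fit

def gaussian(u, A, u0, w):
    return A * np.exp(-((u - u0) ** 2) / w ** 2)

def gaussian_fit_term(stripe):
    profile = stripe.astype(float).mean(axis=0)           # (1/M) * sum_v x_{u,v}
    u = np.arange(profile.size, dtype=float)
    p0 = [profile.max(), float(np.argmax(profile)), 5.0]  # rough initial guess
    (A, u0, w), _ = curve_fit(gaussian, u, profile, p0=p0)
    valid = profile > 1e-6                                 # avoid dividing by empty background
    rel = (profile[valid] - gaussian(u[valid], A, u0, w)) / profile[valid]
    return 1.0 - np.mean(rel ** 2)                         # Eq. (5)

def gfssim(stripe, ref, alpha=1.0, beta=1.0, gamma=1.0, lam=1.0):
    l, c, s = ssim_terms(stripe, ref)                      # from the Sec. 3.1 sketch
    g = gaussian_fit_term(stripe)
    return (l ** alpha) * (c ** beta) * (s ** gamma) * (g ** lam)   # Eq. (6)
```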

When the laser scans the measured objects over a large-scale measuring range, a large incident angle will lead to a large deviation of the laser stripe center. In contrast, in measuring ranges with a smaller incident angle, the laser stripe center deviates less from the geometric center. Thus, for small incident angles, the laser stripe center can be extracted using the geometric center extraction method. However, in most cases, the gray distribution of the laser stripe deviates significantly from that of the reference image. When the value of the Gaussian fitting structural similarity is less than a certain value, the laser stripe center is no longer close to the geometric center. Therefore, compensation should be applied to the laser stripe center after the geometric center extraction is performed.

3.3.

Threshold Value of Gaussian Fitting Structural Similarity

The threshold value of Gaussian fitting structural similarity is used for determining the similarity between the gray distribution of the laser stripe and the reference distribution. In this section, the relationship between the Gaussian fitting structural similarity and the centerline deviation of the laser stripe is analyzed after calculating the Gaussian fitting structural similarity of laser stripes from different incident angles.

First, images of laser stripes are captured from different angles, and the gray distributions of laser stripes with various incident angles between 0 and 40 deg are shown in Fig. 3. Gray distribution curves for different incident angles are expressed using different colors. The red dashed line shows the center of the reference laser stripe, while the blue dashed line shows the center of the laser stripe with the maximum (40 deg) incident angle.

Fig. 3

The gray distribution of laser stripes with different incident angles.

OE_54_10_105108_f003.png

Then the laser stripe images are processed using the median filter. Additionally, the reference laser stripe image is set as the initial position, which is the position of the laser stripe that is vertically projected on the measured object with a camera observing the light from a vertical position. The spatial relationship between the reference laser stripe and the projected laser stripe is shown in Fig. 4.

Fig. 4

The spatial relationship between the reference laser stripe and projected laser stripe.

OE_54_10_105108_f004.png

The distance between the incident position of the laser and the surface of the measured object is defined as d. When the incident angle of the laser stripe is i, the theoretical distance from the center of the projected laser stripe to the reference laser stripe is $d\tan i$. Therefore, the deviation of the laser stripe center is the difference between the theoretical distance and the actual distance. The standard deviation $\sigma_{ers_i}$ of the laser stripe center deviation can be calculated using Peters' equation.22

Eq. (7)

$\sigma_{ers_i}=1.253\,\frac{\sum_{v=1}^{n}\left|\Delta l_{i,v}-d\tan i\right|}{\sqrt{n(n-1)}},$
where n is the number of gray columns in the laser stripe images and $\Delta l_{i,v}$ is the deviation between the captured laser stripe center and the reference laser stripe center. The relationship between the Gaussian fitting structural similarity and the standard deviation of the centerline is shown in Fig. 5.

Fig. 5

The relationship between the Gaussian fitting structural similarity and the standard deviation.

OE_54_10_105108_f005.png

As shown in Fig. 5, with increasing incident angles, the Gaussian fitting structural similarity decreases and the centerline deviation of the laser stripe increases. The Gaussian fitting structural similarity of the stripe is relatively high when the centerline deviation is small, and the similarity decreases sharply once the centerline offset exceeds a particular threshold. The relationship between the Gaussian fitting structural similarity and the standard deviation of the laser stripe center deviation can be described by a high-order polynomial curve. Thus, when the permissible centerline deviation is (1/3)ε, the corresponding value of the Gaussian fitting structural similarity can be obtained from the fitting curve.
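As an illustration of this procedure, the sketch below computes the centerline standard deviation with Peters' equation, Eq. (7), and then reads a similarity threshold off a polynomial fit. The per-column center positions and the tolerance ε are hypothetical, while the (σ, GFSSIM) pairs are those later reported in Table 1.

```python
# Sketch of Sec. 3.3: Peters' equation, Eq. (7), followed by a polynomial fit
# GFSSIM = p(sigma) used to read off a threshold at the permissible deviation
# sigma = eps/3. Center positions and eps are hypothetical; the (sigma, GFSSIM)
# pairs are those of Table 1.
import numpy as np

def peters_std(centers, d, i_deg):
    """Standard deviation of the centerline deviation for one incident angle i."""
    centers = np.asarray(centers, dtype=float)       # extracted centers per gray column (mm from reference)
    n = centers.size
    theoretical = d * np.tan(np.radians(i_deg))      # d * tan(i)
    return 1.253 * np.sum(np.abs(centers - theoretical)) / np.sqrt(n * (n - 1))

print(peters_std([138.12, 138.20, 138.15, 138.18], d=650.0, i_deg=12.0))  # example values only

sigma = np.array([0.01, 0.01, 0.09, 0.20, 0.39, 0.68])                    # Table 1, std. dev. (mm)
gfssim_vals = np.array([0.9997, 0.9996, 0.9988, 0.9983, 0.9960, 0.9886])  # Table 1, GFSSIM

coeffs = np.polyfit(sigma, gfssim_vals, deg=3)       # high-order polynomial fit of Fig. 5
eps = 0.45                                           # hypothetical tolerance (mm)
threshold = np.polyval(coeffs, eps / 3.0)
print(f"GFSSIM threshold at sigma = {eps / 3.0:.2f} mm: {threshold:.4f}")
```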

4.

Center Compensation Method Based on the Analysis of Multiple Source Factors

The center of the laser stripe deviates from the geometric center due to the material surface reflectivity, laser spatial transmission, camera imaging characteristics, and the incidence angle from the camera. When the deviation of the laser stripe center exceeds a certain threshold, the accuracy of the laser center extraction is further decreased by using the geometric center extraction method. Thus, we propose a center compensation method based on the analysis of the multiple source factors (reflectivity characteristics of the material surfaces, spatial transmission characteristics of the laser, and imaging characteristics of cameras) for improving the accuracy of laser center extraction.

4.1.

Relationships Between the Gray Distribution of the Laser Stripe and Multiple Source Factors

The gray distribution of laser stripes is affected in real-time by the spatial light intensity distribution, material reflectivity characteristics, and spatial transmission characteristics. Thus, the real-time information for a laser stripe can be quantifiably expressed by a laser stripe model that is based on associations between the gray distribution of a laser stripe and the multiple source factors. The imaging mechanism for the image sensor is shown in Fig. 6.

Fig. 6

The imaging mechanism of the complementary metal oxide semiconductor (CMOS) image sensor.

OE_54_10_105108_f006.png

In terms of the sensitivity characteristics of the image sensor, the model of the optical-electronic converter is expressed as follows:23,24

Eq. (8)

$E=a\times H^{\rho}+b,$
where E is the electrical output signal of the sensor, H is the accumulated light energy, a is the coefficient of the electrical output signal, b is the electrical output signal of the sensor without illumination, and ρ is the index of the optical-electronic converter, which is generally approximately equal to 1. Thus, the electrical output signal of the sensor has a linear relationship with the accumulated light energy, and the gray distribution of images captured by the image sensor therefore represents the spatial light energy of the laser.

In the actual measuring space, the laser stripes are projected on the measuring surface of objects at a certain incident angle. Then the laser stripes are reflected off the surface and captured by the camera. The laser stripe images are affected by the physical and geometric quantities shown in Fig. 7.

Fig. 7

The schematic diagram of laser reflection on the surface of composite material T800.

OE_54_10_105108_f007.png

The relationship between the gray distribution of the laser stripe and the multiple source factors can be written as

Eq. (9)

$E_{lc}=f[\varphi_{las}(d,i,k),\,\varphi_{cm}(d,i,k),\,\varphi_{cam}(d,i,k),\,\varphi_{eni}],$
where $f(\varphi_{las},\varphi_{cm},\varphi_{cam},\varphi_{eni})$ is a compound function of the laser stripe energy that includes the influence of the laser spatial transmission characteristics, material reflectivity characteristics, imaging characteristics of the cameras, and the spatial light intensity distribution. Moreover, $\varphi_{las}(d,i,k)$, $\varphi_{cm}(d,i,k)$, and $\varphi_{cam}(d,i,k)$ represent the functions of the laser stripe energy affected by the laser spatial transmission characteristics, the material reflectivity characteristics, and the imaging characteristics of the cameras, respectively, with spatial distance d, incident angle i, and physical characteristic coefficient k.

First, we analyze the laser spatial transmission characteristics. The envelope of the laser beam follows a hyperbolic curve along the propagation direction. Because the laser transmitters used in large-scale triangulation scanning measurement systems have high beam quality, the divergence angle of these lasers is quite small. Thus, we assume that the laser is transmitted in a straight line (with no deviation).

Figure 8 shows the laser stripe transmission in space. For a fixed laser flare angle, the relationship between the laser stripe lengths $l_p$ and $l_q$ and the projection distances $d_p$ and $d_q$ agrees with the theory of similar triangles.

Eq. (10)

$l_q/l_p=d_q/d_p.$

Fig. 8

The laser stripe transmission in space.

OE_54_10_105108_f008.png

Additionally, the measurement range of a large-scale triangulation scanning measurement system is limited. Within this measurement distance, the luminous flux of the laser through a given cross-section is relatively constant. Therefore, the luminous flux $\phi_p$ in the cross-section at projection distance $d_p$ is nearly equal to the luminous flux $\phi_q$ in the cross-section at projection distance $d_q$. Thus, the light intensity of the laser is inversely proportional to the length of the laser stripe. The light intensity $E_p$ over a segment of length η of a laser stripe at projection distance $d_p$ can be expressed as

Eq. (11)

$E_p=\frac{\phi_p}{l_p\,\omega_p}\,(\eta\,\omega_p),$
where $\omega_p$ is the width of laser stripe p. On the basis of Eqs. (10) and (11), the relationship between the light intensity $E_p$ of laser stripe p and the light intensity $E_q$ of laser stripe q can be deduced as

Eq. (12)

$E_p/E_q=\left[\frac{\phi_p}{l_p\,\omega_p}(\eta\,\omega_p)\right]\Big/\left[\frac{\phi_q}{l_q\,\omega_q}(\eta\,\omega_q)\right]=l_q/l_p=d_q/d_p,$
where $\omega_p$ and $\omega_q$ are the widths of laser stripes p and q, respectively. Relative to the reference image, the light intensity of the laser stripe is given by

Eq. (13)

$E_{las}=e_{st}/d_{las}+\epsilon_{st},$
where $e_{st}$ is the coefficient of light intensity, $d_{las}$ is the projection distance of the reference laser stripe, and $\epsilon_{st}$ is the fitting error.
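As a quick numerical check of Eqs. (10)–(13), the sketch below evaluates the inverse-distance model at two hypothetical projection distances; the coefficient e_st and the fitting error ε_st are placeholders, not calibrated values.

```python
# Quick numerical check of Eqs. (10)-(13): the stripe intensity falls off with the
# reciprocal of the projection distance, so E_p/E_q approaches d_q/d_p when the
# fitting error eps_st is small. e_st and eps_st are hypothetical values.
e_st, eps_st = 1.2e4, 0.2

def stripe_intensity(d_las):
    return e_st / d_las + eps_st                         # Eq. (13)

d_p, d_q = 1500.0, 2000.0                                # projection distances in mm (hypothetical)
print(stripe_intensity(d_p) / stripe_intensity(d_q))     # close to d_q/d_p = 1.33
```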

After projection by the laser transmitter, the laser stripe is reflected by the surface of the measured object and then captured by the image sensor. The reflected laser stripe is affected by the incident angle, the reflection characteristics of the measured material, and the observation angle of the camera. According to the reflection characteristics, the reflected light primarily includes specular reflection light and diffuse reflection light.25,26 The energy of the specular reflection light is determined by the incident angle of the laser i and the material reflection characteristic $k_d$, whereas the energy of the diffuse reflection light is related to the material reflection characteristic $k_s$ and the observation angle of the camera θ. When the intensity of the incident light is $E_{las}$, the intensity of the reflected light E can be expressed as follows:

Eq. (14)

$E=E_{las}\left\{k_d\cos i+k_s[\cos(\theta-i)]^{h}\right\},$
where h is the index of the diffuse reflection light.

Moreover, when the distance from the laser transmitter to the plane of the object is d and the incident angle is i, the laser transmission distance is $d/\cos i$. According to our impact analysis of the multiple source factors (laser spatial transmission characteristics, material reflectivity characteristics, imaging characteristics, and spatial light intensity distribution), the relationship between the gray distribution of the laser stripe and the multiple source factors can be expressed as

Eq. (15)

$E_{lc}=f(\varphi_{las},\varphi_{cm},\varphi_{cam},\varphi_{eni})=\left[\frac{e_{st}}{d/\cos i}+\epsilon_{st}\right]\left\{k_d\cos i+k_s[\cos(\theta-i)]^{h}\right\}+E_{eni},$
where $E_{lc}$ is the light intensity of the captured laser stripes and $E_{eni}$ is the light intensity of the environment. Because the light intensity of the environment is much lower than that of the laser, the environmental term can be ignored. Thus, Eq. (15) can be simplified as

Eq. (16)

$E_{lc}=\left[\frac{e_{st}}{d/\cos i}+\epsilon_{st}\right]\left\{k_d\cos i+k_s[\cos(\theta-i)]^{h}\right\}.$
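A minimal sketch of the gray-distribution model of Eq. (16) is given below; the environmental term is neglected as in the text, and all coefficient values are placeholders, whereas in the paper they are obtained from the calibration experiments of Sec. 5.2.

```python
# Minimal sketch of the gray-distribution model of Eq. (16); E_eni is neglected.
# All coefficients are placeholders; in the paper they come from the calibration
# experiments of Sec. 5.2.
import numpy as np

def stripe_intensity_model(d, i_deg, theta_deg,
                           e_st=1.2e4, eps_st=0.2,     # spatial transmission terms, Eq. (13)
                           k_d=0.6, k_s=0.3, h=4.0):   # reflection terms, Eq. (14)
    i = np.radians(i_deg)
    theta = np.radians(theta_deg)
    transmission = e_st / (d / np.cos(i)) + eps_st     # the laser travels the distance d / cos(i)
    reflection = k_d * np.cos(i) + k_s * np.cos(theta - i) ** h
    return transmission * reflection

print(stripe_intensity_model(d=1500.0, i_deg=20.0, theta_deg=35.0))
```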

4.2.

Center Compensation Method Based on Geometric Analysis

In the image capturing process, the position of the camera is fixed. The scanning plane is produced by the motion of the laser stripe over different incident angles. A schematic diagram of a large-scale triangulation scanning measurement system is shown in Fig. 9.

Fig. 9

Schematic diagram of the scanning measurement system.

OE_54_10_105108_f009.png

Because the gray distribution of the laser stripe is symmetric in the reference image, the center of the laser stripe coincides with its geometric center. As discussed above, when the incident angle of the laser transmitter varies, the center of the laser stripe deviates from the geometric center, and the width of the laser stripe changes. The center of the reference laser stripe is the initial position of our system, and the distance between this initial position (reference stripe) and the center of the extracted laser stripe is defined as the position of the laser stripe center. Half of the flare angle of the laser is set as α. For a distance d from the laser transmitter to the plane of the measured object and an incident angle i, the laser incident light is projected on the surface of the object at a certain angle. The position of the laser stripe center extracted by the geometric center extraction method, i.e., the distance $l_{lf}'$ from the initial position to the midpoint between the two edges of the stripe, is described by

Eq. (17)

$l_{lf}'=\frac{1}{2}d\left[\tan(i+\alpha)+\tan(i-\alpha)\right].$

However, the actual position of the laser stripe center is

Eq. (18)

$l_{lf}=d\tan i.$

Thus, the deviation of the laser stripe center can be expressed as

Eq. (19)

$\Delta l_f=l_{lf}'-l_{lf}=d\left[\frac{1}{2}\tan(i+\alpha)+\frac{1}{2}\tan(i-\alpha)-\tan i\right].$

In the measurement space, the imaging principle of the camera approximates the pinhole imaging principle, and the measured objects are projected on the imaging plane through the optic center of the lens. When the angle between the direction of observation and the plane vector of the measured object is θ, the deviation of the laser stripe can be expressed as

Eq. (20)

$\Delta=\Delta l_f\cdot\cos\theta=d\left[\frac{1}{2}\tan(i+\alpha)+\frac{1}{2}\tan(i-\alpha)-\tan i\right]\cos\theta.$

Because the incident angle is controlled by the rotating platform, the incident angle is a known quantity. Additionally, the observation angle can be calculated by the relationship between the gray distribution of the laser stripe and the multiple source factors. Thus, the center of the laser stripe can be compensated according to Eq. (20).
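A minimal sketch of the compensation of Eqs. (17)–(20) is given below; arguments are in millimeters and degrees, and the example values are chosen arbitrarily.

```python
# Minimal sketch of the center compensation of Eqs. (17)-(20): the offset between
# the geometric center and the actual stripe center, projected along the camera
# viewing direction. Arguments are in mm and degrees; example values are arbitrary.
import numpy as np

def center_compensation(d, i_deg, alpha_deg, theta_deg):
    i = np.radians(i_deg)
    alpha = np.radians(alpha_deg)        # half of the laser flare angle
    theta = np.radians(theta_deg)        # camera observation angle
    delta_lf = d * (0.5 * np.tan(i + alpha) + 0.5 * np.tan(i - alpha) - np.tan(i))  # Eq. (19)
    return delta_lf * np.cos(theta)      # Eq. (20)

# The compensated center is the geometrically extracted center minus this offset.
print(center_compensation(d=1500.0, i_deg=30.0, alpha_deg=0.5, theta_deg=20.0))
```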

5.

Experiment

The large-scale triangulation scanning measurement system includes two CMOS cameras (VC-25MC-M/C 30, Vieworks, Korea) with a resolution of 4096 × 3072 pixels and a pixel size of 5.5 μm, linear solid-state lasers with a wavelength of 450 nm, lenses (AF-S 24-70 mm f/2.8G, Nikkor), a controlled platform, and an imaging workstation (Z820, HP). A flat plate of the composite material T800 is selected as the measured object because this is the primary material used in aircraft components. The experimental system is shown in Fig. 10.

Fig. 10

Diagram of experimental system.

OE_54_10_105108_f010.png

Using this measurement system, the accuracy of the proposed center compensation method is verified through a large number of tests. Moreover, a flat tail of an airplane was measured in the assembly workshop in an aviation manufacturing company to further validate the proposed measuring method.

5.1.

Threshold Value of Gaussian Fitting Structural Similarity

The Gaussian fitting structural similarity of corresponding stripes is calculated to analyze the gray distribution of different stripes. The relationship between the gray distribution of the laser stripes on T800 aviation composite materials and the corresponding Gaussian fitting structural similarity should be discussed to confirm the threshold value for the Gaussian fitting structural similarity.

Images of laser stripes on the composite material T800 are shown in Fig. 11. To capture the reference image of the laser stripes, the laser incidence direction, the camera observation direction, and the measured surface normal vector are set to the same direction, and the laser incident angle is defined as 0 deg. In this situation, the distance from the measured plane to the laser transmitter is measured by a laser rangefinder. Then, utilizing the electric rotary platform with a repeat positioning accuracy of 0.003 deg, the incident angle of the laser is changed in steps of 2 deg, and laser stripes with different incident angles are projected onto the surface. The original images and gray distribution images of laser stripes with different incident angles are shown in Fig. 11.

Fig. 11

The original images (top row) and gray distribution images (bottom row) of laser stripes with different incident angles: (a) 0 deg, (b) 12 deg, and (c) 30 deg.

OE_54_10_105108_f011.png

Then the Gaussian fitting structural similarity is calculated for the corresponding stripes with different incident angles. Moreover, the movement of the laser stripe on the measured plane can be obtained from the relative vertical distance and the incident angle. Because part of the laser stripe is saturated, the gray intensity of the saturated region is recovered by Gaussian fitting to the gray intensity of the unsaturated part. The movement distances and the values of the Gaussian fitting structural similarity are shown in Table 1.

Table 1

The movement distance and the values of Gaussian fitting structural similarity.

Number                                     1        4        7        10       13       16
Incident angle (deg)                       0        6        12       18       24       30
Gaussian fitting structural similarity     0.9997   0.9996   0.9988   0.9983   0.9960   0.9886
Theoretical deviation (mm)                 0        68.28    138.09   211.08   289.23   375.05
Actual deviation (mm)                      0.01     68.29    138.16   211.24   289.54   375.59
Standard deviation (mm)                    0.01     0.01     0.09     0.20     0.39     0.68

The relationship between the deviation and Gaussian fitting structural similarity is illustrated in Fig. 12(b). In accordance with the accuracy requirement for the actual measurement, the maximal error of the laser stripe extraction is 0.15 mm. Thus, the value of the Gaussian fitting structural similarity should be >0.998 according to the curve fitting values in Fig. 12(b). The relationship between the incident angles and the Gaussian fitting structural similarity is illustrated in Fig. 12(a).

Fig. 12

The relationship between the Gaussian fitting structural similarity and (a) the incident angle and (b) the deviation.

OE_54_10_105108_f012.png

Figure 12 shows that when the value of Gaussian fitting structural similarity is 0.998, the incident angle is 20 deg. In this situation, the deviation from the actual laser center to the geometric center is beyond the acceptable range. Thus, the extraction center of the laser stripe should be compensated.

5.2.

Compensation for the Laser Stripe Center

The relationship between the gray distribution of the laser stripe and these multiple source factors is verified using the measurement system. To simplify the calculation of the correlation coefficients of the model, the incident angle of the laser and the observation angle of the camera are set to 0 deg in Eq. (16). Therefore, the laser incidence direction, the camera observation direction, and the measured surface normal vector direction are the same. The laser transmitter is then moved parallel to this direction so that the projection distance between the laser transmitter and the object surface changes. The spatial light intensity distribution of the laser must also be considered. The gray values of the laser stripes are shown in Fig. 13.

Fig. 13

The relationship between laser intensity and distance.

OE_54_10_105108_f013.png

Figure 13 shows that the brightness of the laser stripe has a linear relationship with the reciprocal of the projection distance; the fitted curve is based on Eq. (13). These results confirm the theoretical analysis of the laser spatial transmission characteristics described in Sec. 4.1.

Then with a constant projection distance and a certain incident angle, the laser stripes are captured at different observation directions. The relationship between the camera offset angle and light brightness is shown in Fig. 14, and the fitted curve depends on Eq. (14). With the change in the incident angle, the coefficients of the material reflectivity characteristics and spatial transmission characteristics are calculated using Eq. (16). Thus, we can obtain a laser stripe gray distribution model based on the analysis of the multiple source factors for this experimental condition.

Fig. 14

The relationship between camera viewing angle and light intensity of laser.

OE_54_10_105108_f014.png
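How coefficients of the kind behind Figs. 13 and 14 could be fitted is sketched below; the data arrays are synthetic placeholders generated from the models themselves, not the measured values of the paper.

```python
# Sketch of fitting the coefficients of Eqs. (13) and (14) to brightness data such as
# that behind Figs. 13 and 14. The data below are synthetic placeholders.
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(0)

# Fig. 13 counterpart: stripe brightness vs projection distance, Eq. (13)
d_vals = np.linspace(1200.0, 2000.0, 9)                             # mm (hypothetical)
E_dist = 2.4e5 / d_vals + 8.0 + rng.normal(0.0, 1.0, d_vals.size)   # synthetic gray levels
(e_st, eps_st), _ = curve_fit(lambda d, e, eps: e / d + eps, d_vals, E_dist)

# Fig. 14 counterpart: brightness vs camera offset angle at fixed distance and incident angle, Eq. (14)
i = np.radians(10.0)                                                # fixed incident angle (hypothetical)

def reflect_model(theta, A, B, h):                                  # A = E_las*k_d, B = E_las*k_s
    return A * np.cos(i) + B * np.cos(theta - i) ** h

theta_vals = np.radians(np.linspace(0.0, 40.0, 9))
E_theta = reflect_model(theta_vals, 100.0, 140.0, 3.0) + rng.normal(0.0, 1.0, theta_vals.size)
(A, B, h), _ = curve_fit(reflect_model, theta_vals, E_theta, p0=[80.0, 120.0, 2.0])

print(f"e_st = {e_st:.0f}, eps_st = {eps_st:.2f}, A = {A:.1f}, B = {B:.1f}, h = {h:.2f}")
```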

Based on this model, both the center extraction method and the compensation method are applied to extract the center of the laser stripe when the laser stripe has a certain angle of incidence. The results, together with the theoretical deviation and the actual deviation of the laser stripe, are shown in Table 2.

Table 2

Experimental results.

Number                                     1        4        7        10       13       16
Incident angle (deg)                       0        6        12       18       24       30
Theoretical deviation (mm)                 0        68.28    138.09   211.08   289.23   375.05
Deviation by center method (mm)            0.01     68.29    138.16   211.24   289.54   375.59
Deviation by compensation method (mm)      0.01     68.28    138.13   211.12   289.34   375.16

Based on the center extraction and compensation methods, the centers of the laser stripes are extracted. The reconstruction of the measured plane is shown in Fig. 15. The compensation method based on multiple source factors decreases the center deviation of the laser stripe, and the accuracy of measurement is improved by up to 99.86% compared to the center extraction method.

Fig. 15

The reconstruction of measured plane: (a) reconstruction result of traditional extraction method, (b) reconstruction result of proposed extraction method, (c) reconstruction error of traditional extraction method, and (d) reconstruction error of proposed extraction method.

OE_54_10_105108_f015.png

5.3.

Field Experiment Validation

In the assembly workshop of an aviation manufacturing company, a flat tail of an airplane is measured to test the proposed center compensation method. The profile of the composite part is within 1200 mm × 1000 mm. The cameras are calibrated using the plane target calibration method, and the intrinsic and extrinsic parameters of the two industrial cameras are determined. The experimental results are shown in Table 3, and the reconstruction of the measured plane is shown in Fig. 16. The binocular vision measurement method based on laser scanning is then used to realize high-precision reconstruction of the geometric parameters; the accuracy of measurement is up to 99.75% compared with the theoretical size.

Table 3

The results of field experiment validation.

Edge                       AB         BC         CD         DA
Theoretical length (mm)    1118       1040       1270       967
Measured length (mm)       1116.54    1038.67    1268.08    964.60

Fig. 16

Reconstruction of measured plane.

OE_54_10_105108_f016.png

6.

Conclusion

In this study, we propose a laser stripe center extraction method based on the analysis of multiple source factors. The experimental results show that the proposed method significantly improves the accuracy of laser stripe extraction for large-scale triangulation scanning measurement systems. To achieve this result, the laser stripe evaluation method (Gaussian fitting structural similarity) effectively provides a threshold value for center compensation by evaluating the similarity between the measured images and the reference image. When the value of the Gaussian fitting structural similarity falls below the defined threshold value, the geometric center deviates from the actual center of the laser stripe. This deviation is resolved by the proposed center compensation method, which is based on our analysis of the spatial light intensity distribution, material reflectivity characteristics, imaging characteristics, and spatial transmission characteristics. The laboratory experiments were conducted successfully, and the method has also been successfully applied to the measurement of aircraft components.

Acknowledgments

This work was supported by the National Basic Research Program of China 973 Project (Grant No. 2014CB046504), the National Natural Science Foundation of China (Grant Nos. 51227004 and 51375075), the Liaoning Provincial Natural Science Foundation of China (Grant No. 2014028010), and the Science Fund for Creative Research Groups (Grant No. 51321004).

References

1. B. Marguet and B. Ribere, "Measurement-assisted assembly applications on airbus final assembly lines," (2003).

2. B. J. Marsh, "Laser tracker assisted aircraft machining and assembly," (2008).

3. M. Saadat and L. Cretin, "Measurement systems for large aerospace components," Sens. Rev., 22 (3), 199–206 (2002). http://dx.doi.org/10.1108/02602280210433025

4. J. E. Muelaner and P. Maropoulos, "Large scale metrology in aerospace assembly," in 5th Int. Conf. on Digital Enterprise Technology (2008).

5. J. E. Muelaner, B. Cai and P. G. Maropoulos, "Large-volume metrology instrument selection and measurability analysis," 853–868 (2010).

6. P. G. Maropoulos et al., "Large volume metrology process models: a framework for integrating measurement with assembly planning," CIRP Ann. Manuf. Technol., 57 (1), 477–480 (2008). http://dx.doi.org/10.1016/j.cirp.2008.03.017

7. W. Cuypers et al., "Optical measurement techniques for mobile and large-scale dimensional metrology," Opt. Laser Eng., 47 (3), 292–300 (2009). http://dx.doi.org/10.1016/j.optlaseng.2008.03.013

8. Z. Liu et al., "Fast and flexible movable vision measurement for the surface of a large-sized object," Sensors, 15 (3), 4643–4657 (2015). http://dx.doi.org/10.3390/s150304643

9. H. L. Fu et al., "Innovative optical scanning technique and device for three-dimensional full-scale measurement of wind-turbine blades," Opt. Eng., 53 (12), 122411 (2014). http://dx.doi.org/10.1117/1.OE.53.12.122411

10. W. Liu et al., "Fast dimensional measurement method and experiment of the forgings under high temperature," J. Mater. Process. Technol., 211 (2), 237–244 (2011). http://dx.doi.org/10.1016/j.jmatprotec.2010.09.015

11. L. Qi et al., "Statistical behavior analysis and precision optimization for the laser stripe center detector based on Steger's algorithm," Opt. Express, 21 (11), 13442–13449 (2013). http://dx.doi.org/10.1364/OE.21.013442

12. Q. Xue et al., "Improving the measuring accuracy of structured light measurement system," Opt. Eng., 53 (11), 112204 (2014). http://dx.doi.org/10.1117/1.OE.53.11.112204

13. H. Yousef et al., "An innovative approach in structured light systems," Proc. SPIE, 7864, 78640N (2011). http://dx.doi.org/10.1117/12.872394

14. W. Zhang, N. Cao and H. Guo, "Novel sub-pixel feature point extracting algorithm for three-dimensional measurement system with linear-structure light," Proc. SPIE, 7656, 76563V (2010). http://dx.doi.org/10.1117/12.864563

15. J. Lukáš, J. Fridrich and M. Goljan, "Detecting digital image forgeries using sensor pattern noise," Proc. SPIE, 6072, 60720Y (2006). http://dx.doi.org/10.1117/12.640109

16. J. Jang and K. Hong, "Detection of curvilinear structures and reconstruction of their regions in gray-scale images," Pattern Recognit., 35 (4), 807–824 (2002). http://dx.doi.org/10.1016/S0031-3203(01)00073-5

17. J. Canny, "A computational approach to edge detection," IEEE Trans. Pattern Anal. Mach. Intell., PAMI-8, 679–698 (1986). http://dx.doi.org/10.1109/TPAMI.1986.4767851

18. P. E. Danielsson, "Euclidean distance mapping," Comput. Graph. Image Process., 14 (3), 227–248 (1980). http://dx.doi.org/10.1016/0146-664X(80)90054-4

19. C. Steger, "An unbiased detector of curvilinear structures," IEEE Trans. Pattern Anal. Mach. Intell., 20 (2), 113–125 (1998). http://dx.doi.org/10.1109/34.659930

20. W. Z. Z. Guangjun, "A robust automatic method for extracting the centric line of straight structured-light stripe," Chin. J. Sci. Instrum., 2 (26), 244–247 (2004).

21. Z. Wang et al., "Image quality assessment: from error visibility to structural similarity," IEEE Trans. Image Process., 13 (4), 600–612 (2004). http://dx.doi.org/10.1109/TIP.2003.819861

22. Y. T. Fei, Error Theory and Data Processing, China Machine Press, Beijing, China (2010).

23. Y. He and X. Li, "Error analysis of laser beam quality measured with CCD sensor and choice of the optimal threshold," Opt. Laser Technol., 45, 671–677 (2013). http://dx.doi.org/10.1016/j.optlastec.2012.05.013

24. W. Liu et al., "An image acquiring method for position and attitude measurement of high-speed target in wind tunnel," Sens. Transducers, 160 (12), 635 (2013).

25. D. M. Guo et al., "Illumination model for fast measurement of free-form surface," Chin. J. Mech. Eng., 38, 7–11 (2002). http://dx.doi.org/10.3901/JME.2002.supp.007

26. Z. G. Liang et al., "Sub-pixel feature extraction and edge detection in 3-D measuring using structured lights," Chin. J. Mech. Eng., 40 (12), 96–99 (2004). http://dx.doi.org/10.3901/JME.2004.12.096

Biography

Yang Zhang is a PhD student at Dalian University of Technology. She received her BE degree in mechanical engineering from Dalian University of Technology in 2012. Her interests include three-dimensional measurement, binocular stereo vision, and digital image processing.

Wei Liu is an assistant professor at Dalian University of Technology. He received his BE degree in mechanical engineering from North China Electric Power University in 2001 and his PhD in mechanical engineering from Dalian University of Technology in 2007. He is the author of more than 50 journal papers and has written one book chapter. His current research interests include precision measurement and precision control.

Xiaodong Li is a master’s student at Dalian University of Technology. He received his BE degree in mechanism design, manufacturing, and automatization from Dalian University of Technology in 2013. His interests include large view field measurement, binocular stereo vision, and measurement system calibration.

Fan Yang is a master’s student at Dalian University of Technology. He received his BE degree in mechanical design and manufacturing and automatization from Dalian Maritime University in 2014. His interests include camera calibration, three-dimensional measurement, binocular stereo vision, and aircraft assembly.

Peng Gao is a master’s student at Dalian University of Technology. He received his BE degree in mechanical engineering from Dalian University of Technology in 2014. His interests include camera calibration, three-dimensional measurement, binocular stereo vision, and aircraft assembly.

Zhenyuan Jia is a professor at Dalian University of Technology. He received his BE, MD, and PhD degrees in mechanical engineering from Dalian University of Technology in 1980, 1984, and 1987, respectively. His interests include precision and nontraditional machining, precision measurement, and controlling.

CC BY: © The Authors. Published by SPIE under a Creative Commons Attribution 4.0 Unported License. Distribution or reproduction of this work in whole or in part requires full attribution of the original publication, including its DOI.
Yang Zhang, Wei Liu, Xiaodong Li, Fan Yang, Peng Gao, and Zhenyuan Jia "Accuracy improvement in laser stripe extraction for large-scale triangulation scanning measurement system," Optical Engineering 54(10), 105108 (15 October 2015). https://doi.org/10.1117/1.OE.54.10.105108
Published: 15 October 2015
Keywords: Cameras, Reflectivity, Image analysis, Imaging systems, Optical engineering, Optical scanning systems, Transmitters
