Article

RGB Colour Encoding Improvement for Three-Dimensional Shapes and Displacement Measurement Using the Integration of Fringe Projection and Digital Image Correlation

by Luis Felipe-Sesé 1,*, Ángel Jesús Molina-Viedma 2, Elías López-Alba 2 and Francisco A. Díaz 2

1 Department of Mechanical Engineering, Campus Científico Tecnológico de Linares, University of Jaén, 23700 Linares, Spain
2 Department of Mechanical and Mining Engineering, Campus Las Lagunillas, University of Jaén, 23071 Jaén, Spain
* Author to whom correspondence should be addressed.
Sensors 2018, 18(9), 3130; https://doi.org/10.3390/s18093130
Submission received: 1 August 2018 / Revised: 10 September 2018 / Accepted: 13 September 2018 / Published: 17 September 2018
(This article belongs to the Special Issue Depth Sensors and 3D Vision)

Abstract

Three-dimensional digital image correlation (3D-DIC) has become the most popular full-field optical technique for measuring 3D shapes and displacements in experimental mechanics. The integration of fringe projection and two-dimensional digital image correlation (FP + DIC) has recently been established as a smart, low-cost alternative to 3D-DIC that overcomes the drawbacks of a stereoscopic system. Experimentally, it relies on the colour encoding of the fringe and speckle patterns required for FP and DIC, respectively. In the present work, innovations in the FP + DIC experimental methodology aimed at more accurate results are presented. They are based on improving the colour pattern encoding by employing either a multisensor camera or laser structured illumination. Both alternatives are analysed and evaluated. The results show that both improve the three-dimensional and in-plane displacement measurements. Nonetheless, multisensor high-speed cameras are uncommon, so laser structured illumination stands out as an important improvement when low uncertainty is required for 2D-displacement measurement: the uncertainty is shown to be reduced by up to 50% compared with results obtained with previous experimental implementations of FP + DIC.

1. Introduction

Nowadays, full-field measurements of the shape and displacements occurring on surface elements are accessible thanks to optical techniques based on digital cameras [1,2]. These techniques have provided interesting applications in experimental mechanics to observe and explain the mechanical behaviour of solids and to validate theoretical and/or numerical models.
Although the theoretical basis of most optical techniques was established decades ago, their performance and capabilities are still improving thanks to technological advances, e.g., high-speed cameras, higher computational performance and improved data-processing algorithms.
Several optical techniques are employed in experimental mechanics, depending on the mechanical variables of interest. For instance, digital photoelasticity [3,4,5] shows a high level of sensitivity in the direct measurement of shear strain in photoelastic materials. Besides this, electronic speckle photography and interferometry techniques [2,6] offer high sensitivity for detecting displacements on the order of fractions of the wavelength of the employed laser. In addition, thermoelasticity [2] allows the variation of the first stress invariant to be obtained experimentally from the thermal emission of a specimen under cyclic loading.
Other techniques are specifically focused on the measurement of shape and three-dimensional displacement, even in industrial environments, and are applicable to dynamic testing. Fringe projection (FP) [7,8,9,10,11] is a technique based on the Moiré effect which measures 3D shape, from which out-of-plane displacements can be extracted. It takes advantage of a fringe pattern (usually black and white) obliquely projected over the specimen, whose image is recorded with a single camera. It is used in a wide range of applications, with sensitivities from microns to metres depending on the optical set-up [11,12,13,14,15]. A more recent technique is three-dimensional digital image correlation (3D-DIC) [16], which measures 3D shapes and displacements, from which strain fields can be calculated. This technique relies on the stereo-positioning and stereo-tracking of small facets of the image over time. A random speckle pattern on the specimen surface is required (black dots randomly distributed on a white background). This technique has multiple applications in experimental mechanics, both for static and high-speed testing [17,18,19,20,21,22,23,24]. Its sensitivity strongly depends on the resolution of the camera and the optical arrangement, resulting in a range from microns to centimetres. Additionally, some work has been performed on digital image correlation with projected speckle patterns [25,26], but two cameras are still required.
Cameras employed in the 3D-DIC technique require perfect synchronisation (especially important for dynamic applications) and a complete calibration of the optical set-up to determine their relative orientation and internal parameters (i.e., extrinsic and intrinsic parameters) [16]. Computational and algorithmic requirements usually make 3D-DIC difficult to implement, and the use of commercial software is often required. Additionally, the cost of the set-up increases significantly when high-speed cameras are required for dynamic events. In contrast, only a single camera is required for the 2D version of DIC (2D-DIC) [27,28,29,30], but it is limited to in-plane displacement (Δx and Δy) measurement on flat specimens. Therefore, 2D-DIC and FP require lower equipment costs (only one camera and an additional projector), but each presents its own restrictions compared to 3D-DIC. However, these restrictions are complementary and can be overcome by integrating the two techniques.
In fact, an alternative technique to 3D-DIC combining 2D-DIC and FP was investigated by previous researchers [31,32,33,34,35,36,37,38], but those works were usually limited to static testing or to a specific specimen size, or presented some measurement errors. The authors of this work proposed a new integration methodology to measure large 3D shapes and displacements in solids [39,40]. This improved methodology overcame the independent restrictions presented by 2D-DIC and FP. Therefore, it is now possible to measure in-plane displacements (Δx, Δy with 2D-DIC) and out-of-plane displacements or shapes (Δz with FP) simultaneously using only a single camera and an additional fringe projector.
One of the most important issues in this integration is the distortion of in-plane displacements caused by 3D surfaces or out-of-plane displacements. This aspect was neglected by some previous authors, while others tried to minimise it by employing telecentric lenses [31]. The authors of the current work developed a more practical optical correction which also resulted in a more economical and user-friendly solution and a feasible alternative to 3D-DIC [39]. The developed method employs a CCD camera with conventional lenses and a post-processing correction of the in-plane displacements, which are distorted by the out-of-plane displacements or the shape of the specimen. The adopted experimental set-up is simple but requires precise alignment and calibration to obtain the fringe constant and the other parameters required for the in-plane correction [40]. Additionally, images to be processed with FP require a clear fringe pattern to operate adequately. On the other hand, DIC requires a speckle pattern free from periodic features, such as fringes, in order to allow proper correlation. Dynamic testing has attracted a great deal of interest recently for different applications (e.g., impact or vibration testing), so both patterns should represent the same time instant and, hence, each image should contain the speckle and fringe patterns simultaneously. To make this possible without interference between the two patterns, colour encoding with an RGB camera is employed [37]. Specifically, the specimen presents a white background (as in typical 2D or 3D-DIC testing) but, in this case, a red speckle is sprayed onto it, and a white-and-blue fringe pattern is obliquely projected. RGB cameras with a single sensor (CCD or CMOS) record one image with a Bayer colour pattern [41], providing three different interpolated images, one sensitive to red, another to green and the other to blue. This methodology allows the fringe pattern to be obtained from the red image and the speckle pattern from the blue one. However, this separation is not perfect, and some residual noise can be present in the images. Finally, the image processing required to combine FP and 2D-DIC is straightforward and implements very efficient and simple algorithms available in the literature [7,42]. The result is a simple 3D measurement system with a spatial resolution comparable to commercial 3D-DIC. This methodology has been successfully applied to different static and dynamic tests on specimens of different sizes [39,43,44].
However, as remarked above, RGB pattern encoding presents some aspects that can be improved. Recently, Mares et al. [45] analysed several colour combinations for FP and DIC, employing the correction algorithm developed by the authors of this work [39]. In their case, the speckle pattern was not sprayed over the specimen; instead, the specimen's own texture was cleverly employed as the speckle pattern to be processed with DIC algorithms. One of their main conclusions is that the best performance is obtained by employing red and blue colours: specifically, blue fringes and a red light background. This interesting idea is only applicable to highly textured (rough) surfaces and to small displacements for which no significant shadows appear. In addition, it requires additional oblique illumination, reducing its applicability.
In the present paper, two novel improvements in the integration of FP + 2D-DIC are presented and evaluated. In the first alternative, a three-CCD-sensor camera is employed instead of a Bayer colour camera (with a single sensor). This technology, through an optical prism, provides separate images that are sensitive to each RGB colour, leading to clearer fringe and speckle patterns and improving the measurement accuracy. Nevertheless, some industrial applications require high-speed image capturing, and the availability of high-speed cameras with three sensors is very limited. Hence, an alternative to those cameras is additionally proposed. This second alternative consists of employing a single-sensor RGB camera but projecting a red laser fringe pattern instead of using a traditional LCD projector. This novel combination offers a clear differentiation of the fringe pattern from the speckle pattern, leading to a better resolution of the latter and a lower measurement uncertainty.
To evaluate the uncertainty improvement provided by both alternatives, two sets of tests were performed. In the first test, both methods were employed to analyse the rigid-body motion of a vibrating specimen in order to compare them with the previous approach. After that, the focus was placed on laser projection compared with LCD projection using a single-sensor camera, as this will be the case when high-speed cameras are required. For this second case, the 3D displacements during a bending test of a flexible blade were analysed. For both sets of tests, the results show a notable reduction in measurement uncertainty, especially for the camera with three sensors.

2. Fundamentals

2.1. Fringe Projection

The principle that underlies fringe projection (FP) was first proposed in the mid-1980s [8,9] and is essentially based on parallel fringes projected over the specimen surface to determine out-of-plane shape changes or surface deformation. The fringes are projected obliquely so that an out-of-plane displacement of the specimen produces an in-plane shift of their projection. Images of a flat reference state and of the deformed surface are analysed to extract a continuous map of out-of-plane displacements or shape.
More specifically, as shown in Figure 1a, parallel, straight and equidistant fringes are projected over the specimen surface with a non-zero incidence angle (α). The fringes have to be perpendicular to the incidence plane. The projected fringes are viewed by a camera placed perpendicular to the specimen surface. If the surface is completely flat (reference surface), the fringes are straight with a certain pitch value p. Furthermore, the intensity profile across the projected fringes has to be sinusoidal. When the specimen surface is displaced out of plane, the projected fringes viewed by the camera move according to their oblique incidence angle, as illustrated in Figure 1b. Since the fringes have a sinusoidal profile, their displacement can be measured as a phase shift (ϕ), which is related to the out-of-plane movement (z) as expressed in Equation (1):
z = K_f \, \phi = \dfrac{p}{2\pi \tan\alpha} \, \phi        (1)
where Kf is the fringe constant, with units of mm/rad, which depends on the pitch of the projected fringe pattern and its incidence angle. For instance, fringes with a pitch of p = 5 mm projected at α = 45° would give Kf ≈ 0.8 mm/rad, of the same order as the values used in the tests of Section 3.
Hence, the idea of projecting sinusoidal fringes is to encode the displacement of the fringes in the phase of the recorded fringe pattern.
The intensity profile along the x-direction (perpendicular to the fringes) can then be expressed as Equation (2):
i(x, y) = r(x, y) + b(x, y)\,\cos\!\left(\dfrac{2\pi}{p}\,x + \phi(x, y)\right)        (2)
where r is the background illumination and b is the fringe amplitude. Several methods exist for retrieving the phase from the fringe pattern. These can be classified into spatial and temporal methods, with spatial methods being suitable for dynamic testing [10].
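As an illustration of the spatial approach used later in this work (Fourier transform profilometry [8]), the following minimal NumPy sketch extracts the wrapped phase of a vertical-fringe image and converts the phase change into out-of-plane displacement through Equation (1). The vertical-fringe orientation, the known carrier frequency and the simple rectangular band-pass filter are assumptions of this example, not details of the authors' implementation.

    import numpy as np

    def ftp_phase(fringe_img, carrier, half_bandwidth):
        # Fourier transform profilometry (Takeda): FFT each row (perpendicular to the
        # vertical fringes), keep only the +1 order around the carrier frequency and
        # take the angle of the inverse FFT to obtain the wrapped phase map.
        spectrum = np.fft.fft(fringe_img.astype(float), axis=1)
        freqs = np.fft.fftfreq(fringe_img.shape[1])        # spatial frequency in cycles/pixel
        keep = np.abs(freqs - carrier) < half_bandwidth    # rectangular band-pass around the carrier
        analytic = np.fft.ifft(spectrum * keep[np.newaxis, :], axis=1)
        return np.angle(analytic)

    def out_of_plane(phase_def, phase_ref, Kf):
        # Equation (1): z = Kf * delta_phi, after unwrapping the phase difference.
        dphi = np.unwrap(np.unwrap(phase_def - phase_ref, axis=1), axis=0)
        return Kf * dphi

For example, with the LCD projector arrangement of Section 3.2.1 (Kf ≈ 0.88 mm/rad), a phase change of 1 rad corresponds to roughly 0.9 mm of out-of-plane displacement.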

2.2. Digital Image Correlation

Digital image correlation (DIC) is a full-field optical technique for measuring surface displacements and was first presented in the early 1980s [27]. The technique is light-intensity based and relies on digital cameras and image processing. Areas of the recorded images are grouped into virtual sets of pixels called facets. The illustration in Figure 2a represents the specimen surface, which is divided into facets (Figure 2b) that are tracked in subsequent images when displacement occurs. Each surface element corresponding to each group of pixels is unambiguously identified by randomly painting the specimen surface with speckle, as shown in Figure 2b. Each surface element can then be identified since the intensity pattern around it is unique and changes very little with surface deformation [16].
When a single camera is used, only in-plane displacements can be measured; this is known as 2D-DIC. A schematic representation of the 2D-DIC set-up is shown in Figure 2a. Digital images of the undeformed and deformed specimen surface are acquired during load application. Each facet of the undeformed image is then located in the deformed image using an image correlation algorithm [42]. The in-plane displacement (i.e., on the camera sensor) is obtained from the difference in the pixel position of the facet centre. The real displacement of the corresponding surface element is obtained by scaling the pixel displacement with the lateral magnification of the lens. When measuring in-plane displacement, it has to be considered that an additional out-of-plane displacement can affect the results (if telecentric lenses are not used), since the lateral magnification changes with the distance between the lens and the specimen surface [31].
The sensitivity of DIC depends primarily on the magnification of the lens and the pixel size. Another key point of DIC is the image correlation algorithm that identifies and localizes the speckle facets and the corresponding pixels. It is worth pointing out that digital image correlation algorithms allow the measurement of sub-pixel displacements with a resolution of about 0.1 ± 0.01 pixels [46]. For the speckle pattern, a good contrast between speckle and background is desirable, and the speckle should be slightly bigger than the pixel size (if it is smaller, a diffuse grey pattern with low contrast is observed, and inaccurate results would be obtained).
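To make the facet-tracking idea concrete, the sketch below locates one facet of the reference image inside the deformed image by maximising the zero-normalised cross-correlation. It is a deliberately simplified, integer-pixel illustration; the commercial software used later in this work (VIC-2D) relies on sub-pixel optimisation and shape functions to reach the resolutions quoted above.

    import numpy as np

    def track_facet(ref_img, def_img, centre, half_size, search):
        # Reference facet, normalised to zero mean and unit variance.
        cy, cx = centre
        f = ref_img[cy - half_size:cy + half_size + 1,
                    cx - half_size:cx + half_size + 1].astype(float)
        f = (f - f.mean()) / f.std()
        best, best_score = (0, 0), -np.inf
        # Exhaustive integer-pixel search around the original facet position.
        for dy in range(-search, search + 1):
            for dx in range(-search, search + 1):
                g = def_img[cy + dy - half_size:cy + dy + half_size + 1,
                            cx + dx - half_size:cx + dx + half_size + 1].astype(float)
                g = (g - g.mean()) / g.std()
                score = np.mean(f * g)          # zero-normalised cross-correlation, in [-1, 1]
                if score > best_score:
                    best_score, best = score, (dy, dx)
        return best                              # (Δy, Δx) of the facet centre, in whole pixels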

2.3. In-Plane Displacement Correction

As previously mentioned, the in-plane displacements (Δx and Δy) are distorted by the shape or the z-displacement (Δz) of the specimen and must be corrected using the out-of-plane information. The correction method adopted in the present work [39] obtains the real in-plane displacements from the out-of-plane displacement measured with FP. The correction is formulated on the basis of a pinhole model (Figure 3), with the assumption that the relatively small out-of-plane displacement does not degrade image quality through defocusing.
For each pixel, this method relates the corrected in-plane displacements (Δx and Δy) to the uncorrected ones (ΔxCCD, ΔyCCD) obtained directly with 2D-DIC and to the out-of-plane displacement (Δz) measured with FP, as expressed in Equation (3):
\Delta x = L \left[ \Delta x_{CCD} - \left( x_{2,CCD}\,\dfrac{\Delta z_2}{z_0} - x_{1,CCD}\,\dfrac{\Delta z_1}{z_0} \right) \right]
\Delta y = L \left[ \Delta y_{CCD} - \left( y_{2,CCD}\,\dfrac{\Delta z_2}{z_0} - y_{1,CCD}\,\dfrac{\Delta z_1}{z_0} \right) \right]        (3)
where (x1,CCD, y1,CCD) and (x2,CCD, y2,CCD) are the initial and final positions of a displaced pixel on the element surface, Δz1 and Δz2 are the corresponding out-of-plane displacements at that pixel, z0 is the distance between the reference surface and the optical centre of the camera lens, and L is the inverse of the lateral magnification at z0.
To perform the correction of Equation (3), some parameters are required: the distance z0, the relation L (mm/pixel), and the intersection of the optical axis with the CCD plane (i.e., the centre of the CCD), from which the pixel positions are referenced. In addition, FP requires the fringe constant Kf (mm/radian) and the phase change Δϕ produced by the fringe displacement due to the z-displacement. All these parameters can be obtained following a calibration process [40,47].
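A minimal sketch of how Equation (3) can be applied pixel-wise is given below. It assumes that the 2D-DIC and FP maps have already been evaluated on the same grid, that pixel positions are referenced to the intersection of the optical axis with the CCD, and that Δz is measured from the reference surface towards the camera; the function and variable names are illustrative, not the authors' implementation.

    import numpy as np

    def correct_in_plane(dx_ccd, dy_ccd, x1, y1, dz1, dz2, z0, L):
        # Equation (3): remove the perspective distortion that out-of-plane motion
        # introduces into the in-plane 2D-DIC result. Positions and displacements
        # (x1, y1, dx_ccd, dy_ccd) are in pixels from the CCD centre; dz1, dz2 and
        # z0 are in millimetres; L is the inverse lateral magnification (mm/pixel).
        x2, y2 = x1 + dx_ccd, y1 + dy_ccd              # final pixel positions
        dx = L * (dx_ccd - (x2 * dz2 - x1 * dz1) / z0)
        dy = L * (dy_ccd - (y2 * dz2 - y1 * dz1) / z0)
        return dx, dy                                  # corrected displacements in mm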

3. Experimental Procedure

3.1. RGB Colour Pattern Encoding

As presented above, to implement digital image correlation, the specimen surface has to be prepared in advance by spraying it with a random speckle pattern. In addition, for fringe projection, vertical fringes have to be projected over the specimen surface. Thus, when the surface is deformed, both the speckle and the projected fringes move accordingly. Two-dimensional DIC computes speckle displacements in a plane perpendicular to the optical axis, while FP computes the surface displacement along the optical axis from the lateral shift of the projected fringes. Therefore, when FP and 2D-DIC are performed at the same time, the two patterns have to be conveniently separated so that both techniques can be applied simultaneously.
The adopted method for the simultaneous measurement of in- and out-of-plane displacements and shapes employs the projection of colour fringes and a colour (i.e., RGB) digital camera [37]. In previous approaches, the specimen was painted with red speckle over a white background and blue–white fringes with a sinusoidal intensity profile were projected. According to this procedure, a single image is acquired at each loading state, as illustrated in Figure 4a. Using the Bayer filter implemented in the colour camera, speckle and fringes are then separated into two different images based on their colour: fringe images without speckle are obtained through the R-channel (see Figure 4b) and speckle images without fringes are obtained through the B-channel (see Figure 4c), as illustrated in the sketch below. This procedure gave appropriate results in previous applications, but in this work two different improvements are presented and evaluated.
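For illustration, the following sketch shows this channel separation for a single demosaiced colour frame: the R channel carries the fringe pattern (for FP) and the B channel carries the speckle pattern (for 2D-DIC). The R, G, B channel ordering of the array is an assumption of this example.

    import numpy as np

    def split_patterns(rgb_frame):
        # One colour frame per loading state: red speckle over a white background
        # plus blue-white projected fringes.
        fringe_img = rgb_frame[..., 0].astype(float)    # R channel: fringes, no speckle
        speckle_img = rgb_frame[..., 2].astype(float)   # B channel: speckle, no fringes
        return fringe_img, speckle_img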
As mentioned, the basis of the colour decoding is the manner in which RGB cameras employ a Bayer filter to obtain three interpolated images from an original raw image. This interpolation can also make the colour separation more difficult. A Bayer filter divides the camera sensor into groups of four pixels; within each group, one pixel records only red, another records only blue, and the remaining two record green. Hence, colour cameras present a lower effective resolution, since an interpolation is required to reconstruct a full-size colour image. Only ¼ of the pixels are employed to reconstruct the red image (i.e., the fringe pattern) and another ¼ of the pixels reconstruct the blue image (i.e., the speckle pattern). The previous RGB pattern-encoding procedure therefore neglects ½ of the total number of sensor pixels, and a lower resolution is obtained from the images.
In this work, this drawback is overcome by exploiting more of the sensor's pixel information, and the resulting improvement is evaluated.

3.1.1. Three-Sensor Camera

The first proposed alternative is the employment of a camera with a beam splitter and three sensors (in this case, CCD sensors). Each sensor has a colour filter so that it is sensitive to one specific colour: red, blue or green. Thus, the camera performs no interpolation, which facilitates better colour separation and yields full-resolution images. Hence, an image with the same red speckle and blue fringes (Figure 5a) is decomposed into a clearer fringe pattern (Figure 5b) and a clearer speckle pattern (Figure 5c), both at a higher resolution.

3.1.2. Laser Projection

The second approach aims to obtain a better-resolved speckle pattern by employing the green images of a single-sensor RGB camera, which use twice as many pixels as the red or blue images. For that purpose, it is necessary to project the fringe pattern with a narrow-band colour far from the green sensitivity of the camera, so that the fringes do not appear in the green images. Hence, a red laser fringe pattern was employed, with the laser acting as a structured light source (Figure 6a). With this laser fringe pattern, the speckle pattern is observed with better resolution (twice that of the original approach) and with no remnants of the fringes (as observed in Figure 6c), while the fringes are observed in the red channel of the camera (Figure 6b). This combination could be more appropriate for high-speed testing, since single-sensor cameras are more common and laser illumination can be free from flickering. The sketch below illustrates how the green information could be collected directly from the raw sensor data.
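The resolution gain comes from the fact that half of the photosites in a Bayer mosaic are green. The sketch below shows how the two green sub-lattices could be collected directly from a raw frame, without demosaicing interpolation; an RGGB pixel arrangement is assumed purely for illustration.

    import numpy as np

    def green_subimages(raw_bayer):
        # For an assumed RGGB mosaic, rows alternate R G R G ... and G B G B ...,
        # so the green photosites form two quarter-resolution sub-images that
        # together use half of the sensor, twice as many pixels as red or blue alone.
        g1 = raw_bayer[0::2, 1::2]    # green pixels on the red rows
        g2 = raw_bayer[1::2, 0::2]    # green pixels on the blue rows
        return g1, g2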

3.2. Experimental Set-Up

In this work, two different tests were performed to evaluate the proposed improvements, namely a vibration test and a bending test. For both tests, speckle images were processed employing a commercial digital image correlation algorithm (VIC-2D from Correlated Solutions Inc., Irmo, SC, USA) and fringe patterns were processed employing Fourier transform profilometry [8]. Each experimental set-up is explained separately below.

3.2.1. Vibration Test

The first test consisted of measuring in-plane (Δx, Δy) and out-of-plane (Δz) displacements during the vibration of a rigid composite specimen. The aim was to quantify the improvements provided by the multisensor and laser-projection alternatives. In this test, displacements measured using single-sensor (one-sensor) and multi-sensor (three-sensor) cameras are compared when employing both traditional LCD fringe projection and laser fringe projection. The three-sensor camera was a JAI AT-200 GE (JAI Ltd., Yokohama, Japan) with a resolution of 1624 × 1236 pixels and a 25 mm focal length lens. The one-sensor camera was an Allied Stingray F080 (Allied Vision, Exton, PA, USA) with a resolution of 1032 × 776 pixels and a 25 mm focal length lens. Both cameras were arranged together and acquired images synchronously. Capturing was performed stroboscopically so that a high acquisition frame rate was not required. The optical set-up was adjusted to obtain optical parameters as similar as possible; the resulting L (inverse of the lateral magnification) was 0.1103 mm/pixel and 0.097 mm/pixel for the one-sensor and three-sensor cameras, respectively. Fringe projection was performed with an Epson EB W32 LCD projector (Seiko Epson Corporation, Suwa, Japan) for traditional fringe projection and with a Stingray laser (Coherent Inc., Santa Clara, CA, USA) of 660 nm wavelength and 50 mW for the new proposal in which structured light is projected. The Kf obtained was 0.88 mm/rad and 0.94 mm/rad for the projector and the laser, respectively. The specimen was a 1 cm thick, 400 mm × 250 mm methacrylate plate sprayed with white paint and speckled with red paint (RAL 3020).
The dynamic test consisted of exciting the plate harmonically at a relatively low frequency, 5 Hz, in order to avoid the natural frequencies of the specimen and, hence, its deformation. The plate was screwed onto the armature of an electrodynamic shaker, model GW-V20/PA30E (Data Physics Corporation, San Jose, CA, USA). The shaker and the plate were positioned so that the main motion was out-of-plane from the cameras' point of view, as shown in Figure 7. A tri-axial accelerometer, Bruel & Kjaer 4520-001 (Bruel and Kjaer Sound & Vibration Measurement A/S, Nærum, Denmark), was fixed on the surface hidden from the cameras, as seen in Figure 7b, to monitor the plate motion. The accelerometer was employed for the closed-loop control of the shaker excitation to generate a 3 mm peak-to-peak amplitude using a Spider 80X machine condition monitoring system (Crystal Instruments, Santa Clara, CA, USA). The excitation frequency was low enough to neglect the plate deformation due to inertia effects; hence, it is assumed that the amplitude was constant over the whole surface.

3.2.2. Bending Test

The second test focused on evaluating the improvement when no multi-sensor camera is available (for instance, at high frame rates) and laser projection is used during a 3D deformation. For that purpose, controlled in-plane displacements were also required. A bending test was performed in which large Δy displacements were induced together with large Δz displacements, as presented in Figure 8a. The specimen was a 500 mm × 125 mm × 2 mm polycarbonate beam rigidly clamped at its lower end, as observed in Figure 8b. The displacement producing the bending was applied and micrometrically controlled at the beam tip, as presented in Figure 8a. In this case, the same LCD projector and laser projector were employed together with the single-sensor RGB camera previously described and a 25 mm focal length lens. This optical set-up resulted in a Kf of 1.12 and 1.08 mm/radian for laser and LCD projection, respectively, and an L parameter of 0.1351 mm/pixel.

4. Results

4.1. Vibration Test

For illustrative purposes, Figure 9 shows the displacement maps measured with LCD projection and the 3-CCD camera during the vibration test at the instant of maximum amplitude (Figure 9a). As observed, the in-plane displacement maps measured with 2D-DIC (Δx and Δy in Figure 9b,c, respectively) show some distortion due to the out-of-plane displacement. These displacements are corrected employing Equation (3) to obtain uniform displacement maps, as presented in Figure 9d,e.
The correlation parameters of the speckle images depend on the optical arrangement and on the presence of a remnant fringe pattern on the speckle. For the 3-sensor camera, speckle images were processed employing a facet size of 35 pixels for both LCD and laser projection (no important remnant fringes were observed on the speckle pattern in either case). For the 1-sensor camera, facet sizes were 41 and 35 pixels for LCD and laser projection, respectively (some remnant fringes appeared on the speckle pattern when employing the LCD). In all cases, the step size was 1 pixel.
In Figure 10, the mean displacements measured on the surface of the specimen are presented together with the accelerometer measurement during the sinusoidal motion of 3 mm peak-to-peak amplitude (1.5 mm peak). The 3-sensor (green) and 1-sensor (red) cameras obtain results similar to those of the accelerometer (blue) both when employing the LCD projector (Figure 10a) and when employing the laser projector (Figure 10b). Some differences can be observed in Figure 10a, quantified as 0.073 and 0.051 mm for the 1-sensor and 3-sensor cameras, respectively. When employing laser projection, the differences are 0.080 and 0.056 mm for the same cameras, which are small compared with the displacement amplitude of 3 mm.
In order to analyse and evaluate those results in more depth, and taking into account that the specimen is not deformed and only rigid-body displacements occur, the measurement uncertainty is estimated as the standard deviation of each full-field displacement map, which should ideally be spatially uniform. In addition, to allow a fair comparison independent of the small differences in resolution between the cameras and projection systems, the comparison is performed in pixels and radians by taking into account the respective fringe constant (Kf) and inverse of lateral magnification (L).
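The estimate itself is straightforward; the sketch below computes the spatial standard deviation of a full-field map obtained during a rigid-body motion and, optionally, expresses it in pixels or radians by dividing by L or Kf. The function name and optional arguments are illustrative.

    import numpy as np

    def map_uncertainty(disp_map_mm, L=None, Kf=None):
        # For a rigid-body motion the map should be spatially uniform, so its
        # standard deviation reflects measurement noise.
        sigma_mm = np.nanstd(disp_map_mm)
        if L is not None:
            return sigma_mm / L     # in-plane uncertainty expressed in pixels
        if Kf is not None:
            return sigma_mm / Kf    # out-of-plane uncertainty expressed in radians of fringe phase
        return sigma_mm             # millimetres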
In Figure 11, the standard deviation of the Δz displacement maps is presented. Good results are obtained (in millimetres, below 1.8% of the displacement range) and, as expected, larger standard deviations occur when larger Δz values are measured. Additionally, the 3-sensor camera always gives lower, and therefore better, values, with either LCD or laser projection (Figure 11a,b, respectively). It is also observed that the laser projection system always yields a higher uncertainty than LCD projection (up to 16% higher for the 1-sensor camera). Nonetheless, Figure 11b shows that the uncertainty is more constant than with LCD projection (especially for the 1-sensor camera). This indicates that the uncertainty is mainly affected by the high-frequency noise inherent to laser illumination, which could be reduced by employing different filtering methodologies.
In Figure 12, the uncertainty of the in-plane displacements, calculated as the vector magnitude of the Δx and Δy uncertainties, is presented. As with the Δz uncertainty, good results are obtained (always below 0.55 pixels); however, it is especially significant that the uncertainty decreases drastically when employing either laser projection or the 3-sensor camera. This reduction reaches up to 60% for the 3-sensor camera with LCD projection (Figure 12a) and 40% for laser projection, even with the same 1-sensor camera. This shows that the interpolation inherent to the Bayer filter mainly affects the speckle pattern.

4.2. Bending Test

As previously indicated, most high-speed cameras include only one sensor. Since it is therefore practically not possible to perform high-speed tests with 3-sensor cameras, the use of laser fringe projection with a single-sensor camera was evaluated for a case in which both in-plane and out-of-plane displacements occur. The bending test allows a deeper analysis of the laser projection proposal when not only large Δz displacements but also Δy displacements take place. In this case, an image of the completely unloaded specimen was captured as a reference, and then a displacement of 130 mm was applied at the tip of the cantilever, located 413 mm from the gripped area. The camera was focused on the area close to the clamp, covering up to 96.7 mm from the grip, where the displacement varies from negligible to large.
Speckle images were processed employing facet sizes of 27 and 35 pixels for laser and LCD projection, respectively. As previously stated, this difference was necessary because speckle images obtained with LCD projection present residual noise from the fringe pattern, requiring bigger facets. It is important to highlight that a smaller facet represents a more realistic and accurate sampling of the displacements occurring at the surface of the specimen.
In Figure 13, the theoretical displacement maps in the y- (a) and z- (d) directions are presented, together with the differences between the theoretical maps and those measured employing laser or LCD projection. For the Δy displacements, the differences obtained with the LCD (Figure 13c) are noisier than those obtained with laser projection (Figure 13b). In fact, differences of up to 0.1 mm are found, which represents an error of 0.74 pixels, compared with a maximum error of 0.44 pixels when employing the laser. For the Δz displacements, the differences between theory and experiment when employing the laser are slightly larger, as expected from the previous test. The maximum error employing the laser was 0.47 mm (4.95% of the maximum measured displacement) which, taking into account the Kf factor, corresponds to 0.41 radians (6.5% of the 2π period of the projected fringe pattern).
In order to illustrate the behaviour of the Δx displacement, the Δx displacements measured along a horizontal profile where a large Δz occurred (the red dashed line in Figure 13d) are presented in Figure 14. As observed in the graph, the displacements measured by 2D-DIC (red), which should be zero, are clearly distorted both when employing laser projection (dashed line) and LCD projection (continuous line). Nonetheless, by employing Equation (3), the results are adequately fitted to the actual displacement. Additionally, the results employing LCD projection are much noisier than those obtained employing laser projection (dashed line).
To summarise the results, Table 1 shows the mean differences between the theoretical and experimental displacements together with the standard deviation over the full-field displacement. The results are consistent with those obtained in the vibration test: Δz displacements are slightly better when employing LCD projection, whereas the Δx and Δy displacements are noticeably improved with laser projection, with the uncertainty reduced to around 40% of the LCD-projection values.
As observed, laser projection offers an interesting improvement for in-plane displacements, although the out-of-plane displacement can be slightly affected by the inherently noisy nature of laser illumination. Thus, laser projection is especially interesting when high accuracy is required for the in-plane displacements, as is the case for the calculation of strains. Strain maps are computed from strain tensors based on spatial derivatives of the displacement fields, and differentiation amplifies noise, so noise clearly affects the final results. It is important to highlight that no filtering has been applied during image processing; hence, filtering of the laser fringe patterns is a point for further work. Moreover, an additional improvement of the Δz measurement could be achieved by decreasing the value of the Kf factor, which can be done simply by increasing the projection angle or reducing the pitch of the fringe pattern (Equation (1)). Both alternatives would improve the results for out-of-plane displacements. However, this improvement has not been implemented in this work, since the same optical set-up was intentionally kept to allow a fair comparison between both alternatives.

5. Conclusions

This work presents and evaluates two improvements to the experimental methodology for 3D displacement measurement combining fringe projection with 2D digital image correlation. Both alternatives improve the accuracy and reduce the uncertainty of the original experimental procedure. Of the two, the 3-sensor camera has been demonstrated to give the better results. Nonetheless, this kind of camera usually entails a considerable cost compared with single-sensor cameras and is, in practice, not available for high-speed testing, which is currently one of the most interesting fields of research in experimental mechanics. For those cases in which multi-sensor cameras are not suitable, an interesting alternative has been presented based on projecting a laser fringe pattern. In this case, the in-plane displacements are unquestionably improved, so laser projection can significantly improve the results. It has also been shown that laser projection entails some increase in the uncertainty of the Δz displacement; however, it is important to highlight that no filtering has been applied to the images in this case.

Author Contributions

L.F.-S. performed the conceptualization and organized the research work; L.F.-S., Á.J.M.-V. and E.L.-A. performed the tests, analysed the data and wrote the first version of the manuscript. Finally, L.F.-S. and F.A.D. managed the Funding Acquisition and wrote the final version of the manuscript. All authors approved the manuscript.

Funding

This research was funded by University of Jaén and Caja Rural de Jaén with the grant number UJA2015/06/06 corresponding to the program “Acción 6 Financiación de Primeros Proyectos de Investigación dentro del Programa de Apoyo a actividades I+D+I del Plan de Apoyo a la Investigación, Desarrollo Tecnológico e Innovación de la Universidad de Jaén R6/6/2015”.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Cloud, G. Optical methods in experimental mechanics. Exp. Tech. 2011, 35, 3–6. [Google Scholar] [CrossRef]
  2. Sharpe, W.N. Springer Handbook of Experimental Solid Mechanics; Springer: New York, NY, USA, 2008. [Google Scholar]
  3. Patterson, E.A.; Wang, Z.F. Simultaneous observation of phase-stepped images for automated photoelasticity. J. Strain Anal. Eng. Des. 2005, 33, 1–15. [Google Scholar] [CrossRef]
  4. Patterson, E.A.; Wang, Z.F. Towards full field automated photoelastic analysis of complex components. Strain 1991, 27, 49–53. [Google Scholar] [CrossRef]
  5. Ramesh, K. Digital Photoelasticity—Advanced Techniques and Applications; Springer: Berlin/Heidelberg, Germany, 2000; ISBN 9783642640995. [Google Scholar]
  6. Sjödahl, M. Electronic Speckle Photography: Increased Accuracy by Nonintegral Pixel Shifting. Appl. Opt. 1994, 33, 6667–6673. [Google Scholar] [CrossRef] [PubMed]
  7. Takeda, M.; Ina, H.; Kobayashi, S. Fourier-transform method of fringe-pattern analysis for computer-based topography and interferometry. J. Opt. Soc. Am. 1982, 72, 156–160. [Google Scholar] [CrossRef]
  8. Takeda, M.; Mutoh, K. Fourier transform profilometry for the automatic measurement of 3-D object shapes. Appl. Opt. 1983, 22, 3977–3982. [Google Scholar] [CrossRef] [PubMed]
  9. Srinivasan, V.; Liu, H.C.; Halioua, M. Automated phase-measuring profilometry: A phase mapping approach. Appl. Opt. 1985, 24, 185–188. [Google Scholar] [CrossRef] [PubMed]
  10. Gorthi, S.S.; Rastogi, P. Fringe projection techniques: Whither we are? Opt. Lasers Eng. 2010, 48, 133–140. [Google Scholar] [CrossRef] [Green Version]
  11. Heredia-Ortiz, M.; Patterson, E.A. On the Industrial Applications of Moire and Fringe Projection Techniques. Strain 2003, 39, 95–100. [Google Scholar] [CrossRef]
  12. Kühmstedt, P.; Munckelt, C.; Heinze, M.; Bräuer-Burchardt, C.; Notni, G.H.G.; Notni, G.H.G.; Huang, P.S. Digital Fringe Projection in 3D Shape Measurement—An Error Analysis. Opt. Eng. 2003, 5144, 66160B. [Google Scholar] [CrossRef]
  13. Notni, G.H.; Notni, G. Digital Fringe Projection in 3D Shape Measurement—An Error Analysis. In Proceedings of the SPIE 5144, Munich, Germany, 30 May 2003. [Google Scholar]
  14. Chen, L.C.; Huang, C.C. Miniaturized 3D surface profilometer using digital fringe projection. Meas. Sci. Technol. 2005, 16, 1061. [Google Scholar] [CrossRef]
  15. Huang, P.S. High-speed 3-D shape measurement based on digital fringe projection. Opt. Eng. 2003, 42, 163–169. [Google Scholar] [CrossRef]
  16. Sutton, M.A.; Orteu, J.J.; Schreier, H. Image Correlation for Shape, Motion and Deformation Measurements: Basic Concepts, Theory and Applications; Springer US: Boston, MA, USA, 2009; 322p, ISBN 978-0-387-78746-6. [Google Scholar]
  17. Molina-Viedma, A.J.; Felipe-Sesé, L.; López-Alba, E.; Díaz, F. High frequency mode shapes characterisation using Digital Image Correlation and phase-based motion magnification. Mech. Syst. Signal Process. 2018, 102, 245–261. [Google Scholar] [CrossRef]
  18. Molina-Viedma, Á.J.; López-Alba, E.; Felipe-Sesé, L.; Díaz, F.A. Full-field modal analysis during base motion excitation using high-speed 3D digital image correlation. Meas. Sci. Technol. 2017, 28, 105402. [Google Scholar] [CrossRef] [Green Version]
  19. Molina-Viedma, A.; López-Alba, E.; Felipe-Sesé, L.; Díaz, F.; Rodríguez-Ahlquist, J.; Iglesias-Vallejo, M. Modal Parameters Evaluation in a Full-Scale Aircraft Demonstrator under Different Environmental Conditions Using HS 3D-DIC. Materials 2018, 11, 230. [Google Scholar] [CrossRef] [PubMed]
  20. Molina-Viedma, A.J.; Felipe-Sesé, L.; López-Alba, E.; Díaz, F.A. 3D mode shapes characterisation using phase-based motion magnification in large structures using stereoscopic DIC. Mech. Syst. Signal Process. 2018, 108, 140–155. [Google Scholar] [CrossRef]
  21. Molina-Viedma, Á.; López-Alba, E.; Felipe-Sesé, L.; Díaz, F. Modal Identification in an Automotive Multi-Component System Using HS 3D-DIC. Materials 2018, 11, 241. [Google Scholar] [CrossRef] [PubMed]
  22. Felipe-Sesé, L.; López-Alba, E.; Hannemann, B.; Schmeer, S.; Diaz, F.A. A validation approach for quasistatic numerical/experimental indentation analysis in soft materials using 3D digital image correlation. Materials 2017, 10, 722. [Google Scholar] [CrossRef] [PubMed]
  23. Patterson, E.A.; Patki, A.S. On characterizing strain fields in impact-damaged composites using shape descriptors. Procedia IUTAM 2012, 4, 126–132. [Google Scholar] [CrossRef]
  24. Sebastian, C.; Hack, E.; Patterson, E. An approach to the validation of computational solid mechanics models for strain analysis. J. Strain Anal. Eng. Des. 2012, 48, 36–47. [Google Scholar] [CrossRef]
  25. Baronea, S.; Neria, P.; Paolia, A.; Razionale, A. Digital Image Correlation based on projected pattern for high frequency vibration measurements. Procedia Manuf. 2017, 11, 1592–1599. [Google Scholar] [CrossRef]
  26. Dekiff, M.; Berssenbrügge, P.; Kemper, B.; Denz, C.; Dirksen, D. Three-dimensional data acquisition by digital correlation of projected speckle patterns. Appl. Phys. B Lasers Opt. 2010, 99, 449–456. [Google Scholar] [CrossRef]
  27. Sutton, M.; Wolters, W.; Peters, W.; Ranson, W.; McNeill, S. Determination of displacements using an improved digital correlation method. Image Vis. Comput. 1983, 1, 133–139. [Google Scholar] [CrossRef]
  28. Sutton, M.A.; Chao, Y.J. Measurement of Strains in a Paper Tensile Specimen Using Computer Vision and Digital Image Correlation. Tappi J. 1988, 71, 173–175. [Google Scholar]
  29. Pan, B.; Wang, Z.; Lu, Z. Genuine full-field deformation measurement of an object with complex shape using reliability-guided digital image correlation. Opt. Express 2010, 18, 1011–1023. [Google Scholar] [CrossRef] [PubMed]
  30. Pan, B. Full-field strain measurement using a two-dimensional Savitzky-Golay digital differentiator in digital image correlation. Opt. Eng. 2007, 46, 033601. [Google Scholar] [CrossRef]
  31. Mares, C.; Barrientos, B.; Blanco, A. Measurement of transient deformation by color encoding. Opt. Express 2011, 19, 25712–25722. [Google Scholar] [CrossRef] [PubMed]
  32. Barrientos, B.; Cerca, M.; García-Márquez, J.; Hernández-Bernal, C. Three-dimensional displacement fields measured in a deforming granular-media surface by combined fringe projection and speckle photography. J. Opt. A Pure Appl. Opt. 2008, 10, 104027. [Google Scholar] [CrossRef]
  33. Tay, C.J.; Quan, C.; Wu, T.; Huang, Y.H. Integrated method for 3-D rigid-body displacement measurement using fringe projection. Opt. Eng. 2004, 43, 1152–1160. [Google Scholar] [CrossRef]
  34. Shi, H.; Ji, H.; Yang, G.; He, X. Shape and deformation measurement system by combining fringe projection and digital image correlation. Opt. Lasers Eng. 2013, 51, 47–53. [Google Scholar] [CrossRef]
  35. Tay, C.J.; Quan, C.; Huang, Y.H.; Fu, Y. Digital image correlation for whole field out-of-plane displacement measurement using a single camera. Opt. Commun. 2005, 251, 23–36. [Google Scholar] [CrossRef]
  36. Quan, C.; Tay, C.J.; Huang, Y.H. 3-D deformation measurement using fringe projection and digital image correlation. Opt. Int. J. Light Electron Opt. 2004, 115, 164–168. [Google Scholar] [CrossRef]
  37. Siegmann, P.; Álvarez-Fernández, V.; Díaz-Garrido, F.; Patterson, E.A. A simultaneous in- and out-of-plane displacement measurement method. Opt. Lett. 2011, 36, 10–12. [Google Scholar] [CrossRef] [PubMed]
  38. Nguyen, T.N.; Huntley, J.M.; Burguete, R.L.; Coggrave, C.R. Multiple-view shape and deformation measurement by combining fringe projection and digital image correlation. Strain 2012, 48, 256–266. [Google Scholar] [CrossRef]
  39. Felipe-Sesé, L.; Siegmann, P.; Díaz, F.A.; Patterson, E.A. Simultaneous in-and-out-of-plane displacement measurements using fringe projection and digital image correlation. Opt. Lasers Eng. 2014, 52, 66–74. [Google Scholar] [CrossRef]
  40. Felipe-Sesé, L.; Siegmann, P.; Díaz, F.A.; Patterson, E.A. Integrating fringe projection and digital image correlation for high-quality measurements of shape changes. Opt. Eng. 2014, 53, 044106. [Google Scholar] [CrossRef]
  41. Ramanath, R.; Snyder, W.E.; Bilbro, G.L.; Sander, W.A. Demosaicking methods for Bayer color arrays. J. Electron. Imaging 2002, 11, 306–316. [Google Scholar] [CrossRef]
  42. Pan, B.; Qian, K.; Xie, H.; Asundi, A. Two-dimensional digital image correlation for in-plane displacement and strain measurement: A review. Meas. Sci. Technol. 2009, 20, 1–17. [Google Scholar] [CrossRef]
  43. Felipe-Sesé, L.; Díaz, F.A. Damage methodology approach on a composite panel based on a combination of Fringe Projection and 2D Digital Image Correlation. Mech. Syst. Signal Process. 2018, 101, 467–479. [Google Scholar] [CrossRef]
  44. Felipe-Sesé, L.; Díaz, F.A.; Patterson, E.A. Exploiting measurement-based validation for a high-fidelity model of dynamic indentation of a hyperelastic material. Int. J. Solids Struct. 2016, 97, 520–529. [Google Scholar] [CrossRef]
  45. Mares, C.; Barrientos, B.; Valdivia, R. Three-dimensional displacement in multi-colored objects. Opt. Express 2017, 25, 11652–11672. [Google Scholar] [CrossRef] [PubMed]
  46. Sutton, M.A.; Yan, J.H.; Tiwari, V.; Schreier, H.W.; Orteu, J.J. The effect of out-of-plane motion on 2D and 3D digital image correlation measurements. Opt. Lasers Eng. 2008, 46, 746–757. [Google Scholar] [CrossRef] [Green Version]
  47. Siegmann, P.; Felipe-Sese, L.; Diaz-Garrido, F. Improved 3D displacement measurements method and calibration of a combined fringe projection and 2D-DIC system. Opt. Lasers Eng. 2017, 88, 255–264. [Google Scholar] [CrossRef]
Figure 1. (a) Fringe projection schematic set-up; (b) illustration showing the physical principle of the fringe projection technique.
Figure 2. (a) Two-dimensional digital image correlation (2D-DIC) schematic set-up; (b) digital image correlation theory scheme.
Figure 3. Schematic illustration showing the in-plane error induced in 2D-DIC when the specimen experiences out-of-plane displacements during a deformation process from the Q1 to Q2 position.
Figure 4. Original image RGB decomposing process. (a) Original RGB image; (b) fringe image; (c) speckle image.
Figure 5. Original image RGB decomposing process employing the three-sensor camera. (a) Original RGB image; (b) fringe image; (c) speckle image.
Figure 6. Image RGB decomposing employing the laser projection process. (a) Original RGB image; (b) fringe image; (c) speckle image.
Figure 7. Set-up employed in vibration test. (a) Actual set-up; (b) schematic distribution.
Figure 8. Set-up employed in bending test. (a) Schematic displacement inferred to cantilever beam; (b) actual set-up; (c) schematic distribution.
Figure 9. Displacement maps measured with LCD projection and a 3-CCD camera at the maximum Δz displacement of 1.5 mm. (a) Δz displacement map; (b,c) Δx and Δy displacement maps measured by 2D-DIC; (d,e) corrected Δx and Δy displacement maps (field of view 157.5 mm × 119.9 mm).
Figure 10. Mean Δz displacement measured by 3-sensor camera (green) and 1-sensor camera (red) during vibration test employing (a) an LCD projector and (b) a laser projector together with accelerometer data (blue).
Figure 11. Standard deviation in Δz displacement maps measured by 3-sensor camera (green) and 1-sensor camera (red) during vibration testing employing (a) an LCD projector and (b) a laser projector.
Figure 12. Standard deviation for in-plane displacement maps measured by 3-sensor camera (green) and 1-sensor camera (red) during vibration testing employing (a) an LCD projector and (b) a laser projector.
Figure 13. (a) Theoretical Δy displacement map and differences with the displacement measured employing (b) laser and (c) LCD projection. (d) Theoretical Δz displacement map and differences with the displacement measured employing (e) laser and (f) LCD projection (field of view 139.4 mm × 104.8 mm).
Figure 14. Δx displacements measured with 2D-DIC (red) and with 2D-DIC corrected with fringe projection (FP) (green) along the red dashed profile in Figure 13.
Table 1. Mean and standard deviation obtained for the LCD projection and laser projection system during bending test.
Displacement    LCD Projection                                   Laser Projection
                Mean Error (mm)   Standard Deviation (mm)        Mean Error (mm)   Standard Deviation (mm)
Δx              0.0268            0.0112 (0.083 px)              0.0148            0.0055 (0.041 px)
Δy              0.0218            0.0164 (0.1214 px)             0.0084            0.0064 (0.047 px)
Δz              0.07              0.0071 (0.058 rad)             0.11              0.0084 (0.067 rad)
