CN113959362A - Structured light three-dimensional measurement system calibration method and routing inspection data processing method - Google Patents
- Publication number
- CN113959362A (application CN202111107578.3A)
- Authority
- CN
- China
- Prior art keywords
- structured light
- calibration
- inspection robot
- measurement system
- point cloud
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
- G—PHYSICS
- G01—MEASURING; TESTING
- G01B—MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
- G01B11/00—Measuring arrangements characterised by the use of optical techniques
- G01B11/24—Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
- G01B11/25—Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures by projecting a pattern, e.g. one or more lines, moiré fringes on the object
- G01B11/2545—Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures by projecting a pattern, e.g. one or more lines, moiré fringes on the object with one projection direction and several detection directions, e.g. stereo
- G01B11/2504—Calibration devices
Landscapes
- Engineering & Computer Science (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Length Measuring Devices By Optical Means (AREA)
Abstract
The application belongs to the technical field of track detection, and particularly relates to a structured light three-dimensional measurement system calibration method of an inspection robot, wherein the structured light three-dimensional measurement system comprises two structured light sensors arranged on two sides of the inspection robot; the method comprises the following steps: s10, acquiring initial point cloud images of two calibration blocks arranged side by side and acquired by a structured light three-dimensional measurement system, wherein the calibration blocks comprise a plurality of trapezoidal platforms on the same plane; s20, extracting the three-dimensional corner points of each trapezoidal table from the initial point cloud image as feature points, and taking the three-dimensional corner points at the corresponding positions of the two calibration blocks as feature point pairs of the left and right calibration blocks; and S30, constructing an objective function of the ICP algorithm according to the extracted feature point pairs of the left and right calibration blocks, and performing iterative solution to obtain a transformation relation between the coordinate systems of the two structured light sensors. The method can accurately and efficiently calibrate the structured light sensor, and improve the accuracy and efficiency of structured light measurement.
Description
Technical Field
The application belongs to the technical field of track detection, and particularly relates to a calibration method of a structured light three-dimensional measurement system of an inspection robot.
Background
The inspection robot detects the cross-sections of the steel rails with left and right structured light units, respectively, and then calculates the track gauge. Owing to machining tolerances and manual installation errors, the spatial position relation between the left and right laser camera assemblies is unknown, and whether the left and right laser planes are installed coplanar cannot be judged; the left and right structured light measuring points are therefore often staggered along the direction of travel, which introduces measurement errors.
In patent CN112785654A, the cameras in the left and right laser camera assemblies of a track geometry detection system capture multiple images of a planar calibration plate mounted on a calibration target in different poses, from which the transformation between the camera coordinate systems of the two assemblies is obtained. This method, however, requires images of the planar calibration plates in many poses — translations of the calibration target along the three axes of a motion coordinate system and rotations about those axes — under the condition that the left and right cameras can each capture their complete calibration plate at the same time. In practice this condition, that the left and right structured light units image the plates simultaneously, often cannot be met.
Existing methods therefore cannot calibrate the laser camera assemblies accurately and efficiently, which reduces the accuracy and efficiency of structured light measurement.
Disclosure of Invention
Technical problem to be solved
In view of the above disadvantages and shortcomings of the prior art, the present application provides a calibration method and a data processing method for a structured light three-dimensional measurement system.
(II) technical scheme
In order to achieve the purpose, the technical scheme is as follows:
in a first aspect, an embodiment of the present application provides a calibration method for a structured light three-dimensional measurement system of an inspection robot, where the structured light three-dimensional measurement system includes two structured light sensors disposed on two sides of the inspection robot, and the method includes:
s10, acquiring initial point cloud images of two calibration blocks arranged side by side and acquired by a structured light three-dimensional measurement system, wherein the calibration blocks comprise a plurality of trapezoidal platforms on the same plane;
s20, extracting the three-dimensional corner points of each trapezoidal table from the initial point cloud image as feature points, and taking the three-dimensional corner points at the corresponding positions of the two calibration blocks as feature point pairs of the left and right calibration blocks;
and S30, constructing an objective function of the ICP algorithm according to the extracted feature point pairs of the left and right calibration blocks, and performing iterative solution to obtain a transformation relation between the coordinate systems of the two structured light sensors.
Optionally, before S10, the method further includes:
placing the inspection robot on a calibration platform to ensure the left-right consistency of the walking of the inspection robot;
scanning two calibration blocks which are arranged side by side and fixed on the calibration platform through a left side structured light sensor and a right side structured light sensor in the inspection robot, wherein the two calibration blocks are arranged in the working range of the left side structured light sensor and the right side structured light sensor;
and acquiring point cloud images of the two calibration blocks by the structured light sensors on the left side and the right side.
Optionally, the calibration block consists of 16 trapezoidal tables arranged in an array on a reference plane.
Optionally, the extracting the three-dimensional corner points of each trapezoid table from the initial point cloud image in S20 includes:
s21, extracting plane mathematical models of the side surface, the upper surface and the bottom surface of the trapezoidal table from the initial point cloud image by adopting a preset point cloud image segmentation algorithm;
and S22, solving the three-dimensional corner points of each trapezoidal table from the plane mathematical models of the three faces adjacent to each corner point, to obtain the coordinates of 128 three-dimensional corner points for each calibration block.
Optionally, S30 includes:
S31, constructing a standard STL model of the two calibration blocks arranged side by side in SolidWorks, and generating standard point cloud images of the two calibration blocks;
S32, respectively registering the two initial point cloud images acquired by the two structured light sensors with the standard point cloud image based on the ICP (Iterative Closest Point) algorithm to obtain rotation and translation matrices, including a left calibration block rotation matrix, a left calibration block translation matrix, a right calibration block rotation matrix and a right calibration block translation matrix;
and S33, obtaining a transformation relation between the two groups of structured light coordinate systems according to the relative position relation between the rotation and translation matrix and the two calibration blocks.
Optionally, the objective function is:

E(R, t) = (1/n) · Σ_{i=1..n} ‖ q_i − (R·p_i + t) ‖²

where n is the number of nearest-neighbor point pairs, p_i is a point in the target point cloud P, q_i is the point in the source point cloud Q nearest to p_i, R is the rotation matrix, and t is the translation vector.
Optionally, the transformation relationship between the two structured light sensor coordinate systems includes a rotation matrix and a translation matrix from the camera coordinate system in the left structured light sensor to the camera coordinate system in the right structured light sensor.
In a second aspect, the present application provides a method for processing inspection data of a track line inspection robot, the method including:
obtaining a transformation relation between two structured light sensor coordinate systems by adopting the structured light three-dimensional measurement system calibration method of the inspection robot in any one of the first aspects;
acquiring two point cloud images acquired when the track line inspection robot performs line detection through a structured light three-dimensional measurement system;
adjusting the two point cloud images based on the transformation relationship;
and performing three-dimensional reconstruction based on the adjusted point cloud image.
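The adjustment step above — mapping one sensor's points through the calibrated transformation before reconstruction — can be sketched as follows. This is a minimal NumPy illustration; the function name and the row-vector convention are assumptions, not taken from the patent:

```python
import numpy as np

def align_point_cloud(points, R, t):
    """Map an (N, 3) point cloud from one sensor frame into the other
    using the calibrated rotation R (3x3) and translation t (3,)."""
    points = np.asarray(points, dtype=float)
    # row-vector convention: p' = R @ p + t  becomes  P @ R.T + t
    return points @ R.T + t

# Example: a pure 90-degree rotation about z plus a shift along x.
R = np.array([[0.0, -1.0, 0.0],
              [1.0,  0.0, 0.0],
              [0.0,  0.0, 1.0]])
t = np.array([5.0, 0.0, 0.0])
cloud = np.array([[1.0, 0.0, 0.0]])
print(align_point_cloud(cloud, R, t))  # [[5. 1. 0.]]
```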
In a third aspect, an embodiment of the present application provides an electronic device including a memory, a processor, and a computer program stored on the memory and runnable on the processor, where the computer program, when executed by the processor, implements the steps of the calibration method for the structured light three-dimensional measurement system of the inspection robot according to any one of the first aspect.
In a fourth aspect, an embodiment of the present application provides a computer-readable storage medium, where a computer program is stored on the computer-readable storage medium, and when the computer program is executed by a processor, the steps of the method for calibrating a structured light three-dimensional measurement system of an inspection robot according to any one of the first aspect are implemented.
(III) advantageous effects
The beneficial effect of this application is: the application provides a calibration method of a structured light three-dimensional measurement system of an inspection robot, wherein the structured light three-dimensional measurement system comprises two structured light sensors arranged on two sides of the inspection robot; the method comprises the following steps: s10, acquiring initial point cloud images of two calibration blocks arranged side by side and acquired by a structured light three-dimensional measurement system, wherein the calibration blocks comprise a plurality of trapezoidal platforms on the same plane; s20, extracting the three-dimensional corner points of each trapezoidal table from the initial point cloud image as feature points, and taking the three-dimensional corner points at the corresponding positions of the two calibration blocks as feature point pairs of the left and right calibration blocks; and S30, constructing an objective function of the ICP algorithm according to the extracted feature point pairs of the left and right calibration blocks, and performing iterative solution to obtain a transformation relation between the coordinate systems of the two structured light sensors. The method can accurately and efficiently calibrate the structured light sensor.
Further, the application also provides a routing inspection data processing method of the track line routing inspection robot, which comprises the following steps: obtaining a transformation relation between coordinate systems of the two structured light sensors by adopting the structured light three-dimensional measuring system calibration method of the inspection robot; acquiring two point cloud images acquired when the track line inspection robot performs line detection through a structured light three-dimensional measurement system; adjusting the two point cloud images based on the transformation relationship; and performing three-dimensional reconstruction based on the adjusted point cloud image. By the method, the measurement error caused by the fact that the left and right structural light measurement points are staggered in the walking direction is greatly reduced, and the accuracy and the efficiency of structural light measurement are improved.
Drawings
The application is described with the aid of the following figures:
fig. 1 is a schematic flow chart of a calibration method of a structured light three-dimensional measurement system of an inspection robot in an embodiment of the present application;
FIG. 2 is a block diagram of a calibration block according to an embodiment of the present application;
FIG. 3 is a schematic view of a trapezoidal table and its corner points in an embodiment of the present application;
fig. 4 is a schematic flow chart of an inspection data processing method of the track line inspection robot according to another embodiment of the present application;
fig. 5 is a schematic structural diagram of an electronic device according to another embodiment of the present application.
Detailed Description
For the purpose of better explaining the present invention and to facilitate understanding, the present invention will be described in detail by way of specific embodiments with reference to the accompanying drawings. It is to be understood that the following specific examples are illustrative of the invention only and are not to be construed as limiting the invention. In addition, it should be noted that, in the case of no conflict, the embodiments and features in the embodiments in the present application may be combined with each other; for convenience of description, only portions related to the invention are shown in the drawings.
Example one
Fig. 1 is a schematic flow chart of a calibration method of a structured light three-dimensional measurement system of an inspection robot in an embodiment of the present application. As shown in fig. 1, in the calibration method of the structured light three-dimensional measurement system of the inspection robot in the embodiment, the structured light three-dimensional measurement system includes two structured light sensors disposed on two sides of the inspection robot, and the method includes:
s10, acquiring initial point cloud images of two calibration blocks arranged side by side and acquired by a structured light three-dimensional measurement system, wherein the calibration blocks comprise a plurality of trapezoidal platforms on the same plane;
s20, extracting the three-dimensional corner points of each trapezoidal table from the initial point cloud image as feature points, and taking the three-dimensional corner points at the corresponding positions of the two calibration blocks as feature point pairs of the left and right calibration blocks;
and S30, constructing an objective function of the ICP algorithm according to the extracted feature point pairs of the left and right calibration blocks, and performing iterative solution to obtain a transformation relation between the coordinate systems of the two structured light sensors.
The calibration method of the structured light three-dimensional measurement system of the inspection robot can accurately and efficiently calibrate the structured light sensor, and improve the accuracy and efficiency of structured light measurement.
In order to better understand the present invention, the steps in the present embodiment are explained below.
In S10, the inspection robot may be a railway line inspection robot.
In S10, the calibration block includes a plurality of trapezoidal tables on the same plane, the number of trapezoidal tables being at least 2. Specifically, in this embodiment, the calibration block may consist of 16 trapezoidal tables arranged in an array on a reference plane. Fig. 2 is a schematic structural diagram of a calibration block in an embodiment of the present application; as shown in Fig. 2, 4 × 4 standard trapezoidal tables share a common reference plane, the 16 tables being distributed in an equally spaced array.
It should be noted that, the present embodiment is only an exemplary illustration of the number of the trapezoidal stages, and does not constitute a specific limitation to the number of the trapezoidal stages.
Fig. 3 is a schematic diagram of a trapezoidal table and its corner points in an embodiment of the present application. As shown in Fig. 3, taking the trapezoidal table at the upper left corner of the calibration block as an example, A is the reference plane, B, C, D and E are the four side faces of the trapezoidal table, and F is the upper face, parallel to the reference plane A. Corner a is the unique intersection of faces A, B and E; b of faces A, B and C; c of faces A, D and C; d of faces A, D and E; e of faces E, B and F; f of faces C, B and F; g of faces E, D and F; and h of faces D, C and F.
In this embodiment, the extracting the three-dimensional corner point of each trapezoid from the initial point cloud image in S20 includes:
s21, extracting plane mathematical models of the side surface, the upper surface and the bottom surface of the trapezoidal table from the initial point cloud image by adopting a preset point cloud image segmentation algorithm;
specifically, the preset point cloud image segmentation algorithm may be a random sampling consistency algorithm.
Respectively extracting planes, namely the side surface, the upper surface and the bottom surface of the trapezoidal table from the initial point cloud image by using a random sampling consistency algorithm; the method specifically comprises the following steps:
s211, randomly selecting a sample subset from the samples, and then calculating model parameters for the subset by using a minimum variance estimation algorithm;
the planar model is adapted to the assumed intra-local points, i.e. all unknown parameters can be calculated from the assumed and sufficient intra-local points.
S212, calculating the deviation between all samples and the model, comparing the deviation with a preset threshold, marking the point as a local point when the deviation is smaller than the threshold, and otherwise, rejecting the point;
all other data are tested using the model obtained in S211 and if a point is suitable for the estimated model, it is considered to be an in-office point. If enough points are classified as hypothetical intra-office points, the estimated model is reasonable enough.
S213, re-estimating the model by using all the assumed local interior points, and further obtaining more accurate model parameters.
S214, estimating the error rate of the local interior point and the model to evaluate the model.
S215, repeating S211-S214 until the end condition is met, wherein the obtained model parameters are the optimal plane model; wherein the end condition includes: the model meets the assumed constraint conditions, namely the error rate is less than the expected error rate, and the iteration times reach the preset times.
And respectively extracting the models of the side surface, the upper surface and the bottom surface of the trapezoidal table from the initial point cloud image through a random sampling consistency algorithm.
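Steps S211 to S215 amount to standard RANSAC plane fitting. A compact NumPy sketch follows; it is an illustration only — the default threshold, the iteration count and the SVD-based re-estimation are assumptions, not the patent's exact procedure:

```python
import numpy as np

def fit_plane_ransac(points, n_iters=200, threshold=0.01, rng=None):
    """Fit one plane (unit normal n, offset d, with n.x + d = 0) by random
    sample consensus: sample 3 points, hypothesize a plane, count inliers,
    keep the best hypothesis, then re-estimate from all its inliers."""
    rng = np.random.default_rng(rng)
    pts = np.asarray(points, dtype=float)
    best_inliers = None
    for _ in range(n_iters):
        sample = pts[rng.choice(len(pts), 3, replace=False)]
        n = np.cross(sample[1] - sample[0], sample[2] - sample[0])
        if np.linalg.norm(n) < 1e-12:        # degenerate (collinear) sample
            continue
        n = n / np.linalg.norm(n)
        d = -n @ sample[0]
        dist = np.abs(pts @ n + d)           # point-to-plane distances
        inliers = dist < threshold
        if best_inliers is None or inliers.sum() > best_inliers.sum():
            best_inliers = inliers
    # re-estimation step: least-squares plane via SVD of centered inliers
    inlier_pts = pts[best_inliers]
    centroid = inlier_pts.mean(axis=0)
    n = np.linalg.svd(inlier_pts - centroid)[2][-1]
    return n, -n @ centroid, best_inliers
```

Running it on a synthetic cloud of 100 points on the plane z = 0 plus a few distant outliers recovers a normal of ±(0, 0, 1) while rejecting the outliers.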
And S22, solving the three-dimensional corner points of each trapezoidal table from the plane mathematical models of the three faces adjacent to each corner point, to obtain the coordinates of 128 three-dimensional corner points for each calibration block.
In this embodiment, the three-dimensional corner point of the trapezoidal table is uniquely determined by the common intersection point of the three adjacent surfaces, and a process of solving the unique common intersection point of the three adjacent surfaces is described below.
Assume the three plane equations:

a₁x + b₁y + c₁z + d₁ = 0
a₂x + b₂y + c₂z + d₂ = 0
a₃x + b₃y + c₃z + d₃ = 0

where a₁, a₂, a₃, b₁, b₂, b₃, c₁, c₂, c₃, d₁, d₂ and d₃ are the plane equation coefficients.
Writing these as the matrix equation A·X = −D, with

A = [ a₁ b₁ c₁ ; a₂ b₂ c₂ ; a₃ b₃ c₃ ],  X = [x, y, z]ᵀ,  D = [d₁, d₂, d₃]ᵀ,

the corner point is solved, provided A is invertible (i.e. the three planes meet in a single point), as

X = −A⁻¹·D.

In this way, the 8 corner points of each trapezoidal table are obtained from the intersections of its faces, giving 16 × 8 corner points in total. Meanwhile, since the absolute positions of the 128 corner points of the calibration block are known, 128 sets of corresponding sequence points can be obtained.
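The three-plane intersection above reduces to one 3×3 linear solve per corner. A minimal sketch (the function name is illustrative):

```python
import numpy as np

def corner_from_planes(planes):
    """Solve the unique intersection of three planes a*x + b*y + c*z + d = 0,
    each given as a row (a, b, c, d), via the matrix equation A X = -D."""
    planes = np.asarray(planes, dtype=float)
    A, D = planes[:, :3], planes[:, 3]
    if abs(np.linalg.det(A)) < 1e-12:
        raise ValueError("planes do not meet in a single point")
    return np.linalg.solve(A, -D)

# The corner of the three coordinate planes shifted to (1, 2, 3):
planes = [(1, 0, 0, -1),   # x = 1
          (0, 1, 0, -2),   # y = 2
          (0, 0, 1, -3)]   # z = 3
print(corner_from_planes(planes))  # [1. 2. 3.]
```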
S30 includes:
S31, constructing a standard STL model of the two calibration blocks arranged side by side in SolidWorks, and generating standard point cloud images of the two calibration blocks;
In this embodiment, since the two calibration blocks are arranged side by side, the two standard point clouds — the left standard point cloud and the right standard point cloud — are perfectly parallel in the ideal case and differ only by a translation along the x direction, so that:

P_Lbase = P_Rbase + T_base

where P_Lbase is the left standard point cloud, P_Rbase is the right standard point cloud, and T_base is the translation vector between the two standard point clouds.
S32, registering the two initial point cloud images acquired by the two structured light sensors with the standard point cloud image, respectively, using the ICP (Iterative Closest Point) point cloud matching algorithm, to obtain rotation and translation matrices: a left calibration block rotation matrix, a left calibration block translation matrix, a right calibration block rotation matrix and a right calibration block translation matrix.
The ICP registration algorithm is widely used in many fields, but the original ICP algorithm is computationally expensive, sensitive to the initial transformation, and easily falls into a local optimum. When a surface contains too many points, down-sampling can raise the computation speed, but discarding points also removes registration information and increases the matching error. In this embodiment, both registration speed and registration accuracy are taken into account: the 16 × 8 three-dimensional corner points of the calibration block are used as anchor feature points to accelerate convergence, and each of the 128 corner points is generated as the intersection of its three associated faces, which guarantees both the accuracy of the extracted corners and the accuracy of registration.
The ICP algorithm is explained below.
The final purpose of point cloud registration is to unify two or more sets of point cloud data, expressed in different coordinate systems, into one reference coordinate system through rotation and translation transformations. This process can be described by a rigid transformation matrix H with six unknowns α, β, γ, tx, ty, tz. Solving for six unknown parameters requires six linear equations, i.e. at least 3 pairs of corresponding points between the two point clouds, and these 3 pairs must not be collinear. Since S20 provides 128 pairs of data points in total, the parameters of the rigid transformation matrix can be estimated, and using far more than 3 pairs further improves the estimation accuracy.
The basic principle of the ICP algorithm is: under certain constraints, find the nearest point pairs (p_i, q_i) between the matched target point cloud P and source point cloud Q, and then compute the optimal matching parameters R and t such that the error function is minimized. The error function E(R, t) is:

E(R, t) = (1/n) · Σ_{i=1..n} ‖ q_i − (R·p_i + t) ‖²

where n is the number of nearest-neighbor point pairs, p_i is a point in the target point cloud P, q_i is the point in the source point cloud Q nearest to p_i, R is the rotation matrix, and t is the translation vector.
The steps of the ICP algorithm are:
(1) take a point set p_i ∈ P in the target point cloud P;
(2) find the corresponding point set q_i ∈ Q in the source point cloud Q such that ‖q_i − p_i‖ is minimized;
(3) compute the rotation matrix R and translation vector t that minimize the error function;
(4) apply the rotation matrix R and translation vector t obtained in the previous step to p_i, giving the new point set p_i′ = {p_i′ = R·p_i + t, p_i ∈ P};
(5) compute the average distance d between p_i′ and the corresponding point set q_i;
(6) if d is smaller than a given threshold, or the number of iterations exceeds the preset maximum, stop the iteration; otherwise return to step (2) until the convergence condition is met.
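Steps (1) to (6) can be sketched as a point-to-point ICP loop, with the closed-form SVD (Kabsch) solution for R and t in step (3). This is a generic illustration, not the patent's implementation; the brute-force nearest-neighbor search is only suitable for small clouds such as the 128 corner points:

```python
import numpy as np

def best_rigid_transform(P, Q):
    """Closed-form R, t minimizing sum ||q_i - (R p_i + t)||^2 (Kabsch/SVD)."""
    cp, cq = P.mean(axis=0), Q.mean(axis=0)
    H = (P - cp).T @ (Q - cq)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:          # avoid reflections
        Vt[-1] *= -1
        R = Vt.T @ U.T
    return R, cq - R @ cp

def icp(P, Q, max_iters=50, tol=1e-8):
    """Steps (1)-(6): match nearest points, solve R and t, transform,
    repeat until the mean residual distance stops improving."""
    P = np.asarray(P, float).copy()
    Q = np.asarray(Q, float)
    R_total, t_total = np.eye(3), np.zeros(3)
    prev_d = np.inf
    for _ in range(max_iters):
        # (2) nearest neighbour in Q for every point of P (brute force)
        idx = np.argmin(((P[:, None] - Q[None]) ** 2).sum(-1), axis=1)
        # (3) closed-form R, t for the matched pairs
        R, t = best_rigid_transform(P, Q[idx])
        # (4) transform P and accumulate the total transformation
        P = P @ R.T + t
        R_total, t_total = R @ R_total, R @ t_total + t
        # (5)-(6) mean residual distance and convergence test
        d = np.linalg.norm(P - Q[idx], axis=1).mean()
        if abs(prev_d - d) < tol:
            break
        prev_d = d
    return R_total, t_total, d
```

With well-separated points and a small initial misalignment (as with the calibration-block corners after the centroid-based initial pose), the first Kabsch step already matches the true correspondences and the loop converges in a couple of iterations.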
Each of the two structured light sensors acquires a point cloud image of a calibration block in the world coordinate system, each calibration block having 128 corner points. Registering these against the ideal point cloud, in which the left and right calibration blocks are absolutely parallel, yields the rotation and translation matrices between the real-time pose and the standard pose of each calibration block, namely: the left calibration block rotation matrix R_L, the left calibration block translation matrix T_L, the right calibration block rotation matrix R_R, and the right calibration block translation matrix T_R.
And S33, obtaining a transformation relation between the two groups of structured light coordinate systems according to the relative position relation between the rotation and translation matrix and the two calibration blocks.
In this embodiment, the transformation relationship between the two structured light sensor coordinate systems includes a rotation matrix and a translation matrix from the camera coordinate system in the left structured light sensor to the camera coordinate system in the right structured light sensor.
In this embodiment, since:

P_Rbase = Q_Rbase·R_Rinit + Tpc_Rbase
P_Lbase = Q_Lbase·R_Linit + Tpc_Lbase

where P_Rbase and P_Lbase are the right and left initial point clouds after the rotation-translation transformation toward the standard point cloud, Q_Rbase and Q_Lbase are the right and left initial point clouds, R_Rinit and R_Linit are the initialization rotation matrices formed from the normal vectors of the right and left point clouds, and Tpc_Rbase and Tpc_Lbase are the right and left initialization translation vectors.
Combining the coordinate relation of the standard point clouds, P_Lbase = P_Rbase + T_base, then:

P_Lbase = Q_Lbase·R_Linit + Tpc_Lbase = P_Rbase + T_base = Q_Rbase·R_Rinit + Tpc_Rbase + T_base

namely:

Q_Lbase·R_Linit + Tpc_Lbase = Q_Rbase·R_Rinit + Tpc_Rbase + T_base

The transformation relation between the two sets of structured light coordinate systems can be obtained from the above formula.
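Under one explicit convention — an assumption here, since the patent leaves the matrix shapes and multiplication order implicit — where each registration maps a sensor's cloud to its standard pose as p_std = R·p + T, the left-to-right transformation follows by composing the per-block registrations with the known offset T_base between the standard clouds:

```python
import numpy as np

def compose_left_to_right(R_L, T_L, R_R, T_R, T_base):
    """Compose per-block registrations into one left-to-right transform.
    Assumed convention: p_std_left = R_L p_left + T_L,
    p_std_right = R_R p_right + T_R, and p_std_left = p_std_right + T_base.
    Then p_right = R_R^-1 (R_L p_left + T_L - T_base - T_R)."""
    R_Rinv = R_R.T                      # inverse of a rotation = transpose
    R = R_Rinv @ R_L
    t = R_Rinv @ (T_L - T_base - T_R)
    return R, t
```

The returned pair (R, t) maps any point measured in the left sensor's coordinate system into the right sensor's coordinate system, which is exactly the transformation relation sought in S33.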
Preferably, in S30, S31 further includes before:
the point cloud centroid coordinate calculation formula is as follows:
wherein Pc is the coordinates of the centroid of the point cloud, n is the number of the midpoints of the point cloud, and xi、yi、ziRespectively, the ith point coordinate.
And respectively extracting the centroid of the angular point and the normal vector of the point cloud according to the above formula, and performing translation and rotation to serve as initial registration postures.
P_Rbase = Q_Rbase · R_Rinit + Tpc_Rbase
P_Lbase = Q_Lbase · R_Linit + Tpc_Lbase
wherein P_Rbase and P_Lbase are respectively the right- and left-side point clouds after rotation-translation transformation based on the standard point cloud, Q_Rbase and Q_Lbase are respectively the right- and left-side original point clouds, R_Rinit and R_Linit are respectively the initialization rotation matrices formed from the normal vectors of the right- and left-side point clouds, and Tpc_Rbase and Tpc_Lbase are respectively the right- and left-side initialization translation vectors.
Because the calibration block corner points are unique and their positions are deterministic, and because ICP comparison depends on the initial transformation values, the centroid position of the 128 corner points can supply the initial values of the registration, further strengthening the registration constraint.
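The centroid initialization above can be sketched in a few lines of numpy; a hedged illustration with assumed (n, 3) corner arrays, where the embodiment's normal-vector rotation initialization is omitted and only the translational part is shown:

```python
import numpy as np

def centroid(points):
    """Point cloud centroid: Pc = (1/n) * sum over (x_i, y_i, z_i)."""
    return points.mean(axis=0)

def initial_registration(source, target):
    """Coarse initial pose for ICP: translate the source corner set so
    that its centroid coincides with the target's centroid."""
    t_init = centroid(target) - centroid(source)
    return source + t_init, t_init
```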
In this embodiment, before S10, the method may further include:
placing the inspection robot on a calibration platform to ensure the left-right consistency of the inspection robot's travel and, at the same time, the absolute position between the calibration blocks;
scanning two calibration blocks which are arranged side by side and fixed on a calibration platform through a left side structured light sensor and a right side structured light sensor in the inspection robot, wherein the two calibration blocks are arranged in the working range of the left side structured light sensor and the right side structured light sensor;
and acquiring point cloud images of the two calibration blocks by the structured light sensors on the left side and the right side.
The acquisition process needs a calibration platform and two high-precision calibration blocks, and the relative positions and respective three-dimensional forms of the calibration blocks are known. Because the left and right structured light postures and the distance are always kept constant, a group of point cloud data under the respective reference coordinate systems are respectively acquired by the left and right cameras, and the two groups of point clouds are associated together in pairs. Thus, the calibration problem of the two camera coordinate systems can be translated into a registration problem of the two sets of point cloud data.
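With the corner pairs known, the registration of the two corner sets admits a closed-form least-squares solution. Below is a sketch of the standard SVD-based (Kabsch) solver as one possible realization; the embodiment itself solves an ICP objective iteratively, and numpy plus a row-vector convention are assumed here:

```python
import numpy as np

def rigid_fit(P, Q):
    """Least-squares rigid transform (R, t) minimizing ||Q @ R + t - P||
    over paired (n, 3) corner sets, via the SVD (Kabsch) solution.
    Row-vector convention: a source point q maps to q @ R + t.
    """
    cp, cq = P.mean(axis=0), Q.mean(axis=0)
    H = (Q - cq).T @ (P - cp)              # 3x3 cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(U @ Vt))     # guard against reflections
    R = U @ np.diag([1.0, 1.0, d]) @ Vt
    t = cp - cq @ R
    return R, t
```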
In this embodiment, the left- and right-side structured light sensor assemblies in the line inspection robot capture two groups of standard calibration terraces fixed on a calibration platform machined to high precision, the terraces being arranged within the working range of the assemblies. The intersection points of the three adjacent surfaces of each terrace in the 4×4 array are extracted as corner feature points, 128 in total per block. An ICP objective function is constructed from the extracted feature point pairs of the left and right reference terraces and solved iteratively to high precision, calibrating the transformation relation between the coordinate systems of the left- and right-side structured light sensor assemblies, which share no common field of view, i.e., their absolute positions in the inspection robot coordinate system. This enables accurate and efficient calibration of the structured light sensors when the inspection robot is installed, and resolves the inconsistency between the attitudes of the left and right structured light sensor assemblies of the line inspection robot.
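The corner extraction described above, one three-dimensional corner per intersection of three adjacent fitted planes, reduces to a 3×3 linear system. A minimal sketch, assuming each fitted plane is represented as a pair (n, d) with unit normal n and offset d in the plane equation n · x = d:

```python
import numpy as np

def corner_from_planes(planes):
    """Intersection point of three non-parallel planes, each given as
    (normal, offset) in the equation normal . x = offset.
    Solves the 3x3 linear system N @ x = d for the corner coordinate."""
    N = np.array([n for n, _ in planes], dtype=float)  # stacked normals
    d = np.array([o for _, o in planes], dtype=float)
    return np.linalg.solve(N, d)
```

Running this over the eight corners of each of the 16 terraces yields the 128 corner coordinates per calibration block.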
Example two
In a second aspect of the present application, a method for processing inspection data of a track line inspection robot is provided according to another embodiment. Fig. 4 is a flowchart illustrating the method for processing inspection data of a track line inspection robot according to another embodiment of the present application; as shown in fig. 4, the method includes:
s100, obtaining a transformation relation between coordinate systems of two structured light sensors by adopting the structured light three-dimensional measurement system calibration method of the inspection robot in the first embodiment;
s200, acquiring two point cloud images acquired when the track line inspection robot performs line detection through a structured light three-dimensional measurement system;
s300, adjusting two point cloud images based on the transformation relation;
and S400, performing three-dimensional reconstruction based on the adjusted point cloud image.
The inspection data processing method can calibrate the transformation relation between the left and right structured light coordinate systems and hence obtain the relative attitudes of the two groups of structured light in the coordinate system of the line inspection robot, solving the relative-coordinate-system problem in the absence of a common field of view and improving the accuracy and efficiency of the inspection robot's cross-view structured light measurement.
Example three
A third aspect of the present application provides, by way of a third embodiment, an electronic apparatus, including: a memory, a processor, and a computer program stored on the memory and executable on the processor, wherein the computer program, when executed by the processor, implements the steps of the calibration method of any one of the above embodiments.
Fig. 5 is a schematic structural diagram of an electronic device according to another embodiment of the present application.
The electronic device shown in fig. 5 may include: at least one processor 101, at least one memory 102, at least one network interface 104, and other user interfaces 103. The various components in the electronic device are coupled together by a bus system 105. It is understood that the bus system 105 is used to enable communications among the components. The bus system 105 includes a power bus, a control bus, and a status signal bus in addition to a data bus. For clarity of illustration, however, the various buses are labeled as bus system 105 in fig. 5.
The user interface 103 may include, among other things, a display, a keyboard, or a pointing device (e.g., a mouse, trackball, or touch pad).
It will be appreciated that the memory 102 in this embodiment may be either volatile memory or non-volatile memory, or may include both volatile and non-volatile memory. The non-volatile memory may be a read-only memory (ROM), a programmable ROM (PROM), an erasable programmable ROM (EPROM), an electrically erasable programmable ROM (EEPROM), or a flash memory. The volatile memory may be a random access memory (RAM), which functions as an external cache. By way of example, but not limitation, many forms of RAM are available, such as static RAM (SRAM), dynamic RAM (DRAM), synchronous DRAM (SDRAM), double data rate SDRAM (DDR SDRAM), enhanced SDRAM (ESDRAM), synchronous link DRAM (SLDRAM), and direct Rambus RAM (DRRAM). The memory 102 described herein is intended to comprise, without being limited to, these and any other suitable types of memory.
In some embodiments, memory 102 stores elements, executable units or data structures, or a subset thereof, or an expanded set thereof as follows: an operating system 1021 and application programs 1022.
The operating system 1021 includes various system programs, such as a framework layer, a core library layer, a driver layer, and the like, and is used for implementing various basic services and processing hardware-based tasks. The application programs 1022 include various applications, such as an industrial control device operation management system, for implementing various application services. Programs that implement methods in accordance with embodiments of the invention can be included in the application programs 1022.
In the embodiment of the present invention, the processor 101 is configured to execute the method steps provided in the first aspect by calling a program or an instruction stored in the memory 102, which may be specifically a program or an instruction stored in the application 1022.
The method disclosed by the above embodiment of the present invention can be applied to, or implemented by, the processor 101. The processor 101 may be an integrated circuit chip having signal processing capabilities. In implementation, the steps of the above method may be performed by integrated logic circuits of hardware or by instructions in the form of software in the processor 101. The processor 101 may be a general-purpose processor, a digital signal processor, an application-specific integrated circuit, a field-programmable gate array or other programmable logic device, discrete gate or transistor logic, or discrete hardware components, and may implement or perform the various methods, steps, and logic blocks disclosed in the embodiments of the present invention. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor. The steps of the method disclosed in connection with the embodiments of the present invention may be directly implemented by a hardware decoding processor, or implemented by a combination of hardware and software elements in the decoding processor. The software elements may be located in RAM, flash memory, ROM, PROM or EPROM, registers, or other storage media well known in the art. The storage medium is located in the memory 102, and the processor 101 reads the information in the memory 102 and completes the steps of the method in combination with its hardware.
In addition, in combination with the calibration method for the structured light three-dimensional measurement system of the inspection robot in the above embodiment, an embodiment of the present invention may provide a computer-readable storage medium, where a computer program is stored on the computer-readable storage medium, and when the computer program is executed by a processor, the calibration method for the structured light three-dimensional measurement system of the inspection robot in any one of the above embodiments is implemented.
It should be noted that, in the claims, any reference signs placed between parentheses shall not be construed as limiting the claim. The word "comprising" does not exclude the presence of elements or steps not listed in a claim. The word "a" or "an" preceding an element does not exclude the presence of a plurality of such elements. The invention may be implemented by means of hardware comprising several distinct elements, and by means of a suitably programmed computer. The use of the terms first, second, third, and the like is for convenience only and does not denote any order; these words are to be understood as part of the name of the component.
Furthermore, it should be noted that in the description of the present specification, the description of the term "one embodiment", "some embodiments", "examples", "specific examples" or "some examples", etc., means that a specific feature, structure, material or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the present invention. In this specification, the schematic representations of the terms used above are not necessarily intended to refer to the same embodiment or example. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples. Furthermore, various embodiments or examples and features of different embodiments or examples described in this specification can be combined and combined by one skilled in the art without contradiction.
While preferred embodiments of the present invention have been described, additional variations and modifications in those embodiments may occur to those skilled in the art once they learn of the basic inventive concepts. Therefore, the claims should be construed to include preferred embodiments and all changes and modifications that fall within the scope of the invention.
It will be apparent to those skilled in the art that various modifications and variations can be made in the present invention without departing from the spirit or scope of the invention. Thus, if such modifications and variations of the present invention fall within the scope of the claims of the present invention and their equivalents, the present invention should also include such modifications and variations.
Claims (10)
1. The calibration method of the structured light three-dimensional measurement system of the inspection robot is characterized in that the structured light three-dimensional measurement system comprises two structured light sensors arranged on two sides of the inspection robot, and the calibration method comprises the following steps:
s10, acquiring initial point cloud images of two calibration blocks arranged side by side and acquired by a structured light three-dimensional measurement system, wherein the calibration blocks comprise a plurality of trapezoidal platforms on the same plane;
s20, extracting the three-dimensional corner points of each trapezoidal table from the initial point cloud image as feature points, and taking the three-dimensional corner points at the corresponding positions of the two calibration blocks as feature point pairs of the left and right calibration blocks;
and S30, constructing an objective function of the ICP algorithm according to the extracted feature point pairs of the left and right calibration blocks, and performing iterative solution to obtain a transformation relation between the coordinate systems of the two structured light sensors.
2. The method for calibrating the structured light three-dimensional measurement system of the inspection robot according to claim 1, further comprising, before S10:
placing the inspection robot on a calibration platform to ensure the left-right consistency of the walking of the inspection robot;
scanning two calibration blocks which are arranged side by side and fixed on the calibration platform through a left side structured light sensor and a right side structured light sensor in the inspection robot, wherein the two calibration blocks are arranged in the working range of the left side structured light sensor and the right side structured light sensor;
and acquiring point cloud images of the two calibration blocks by the structured light sensors on the left side and the right side.
3. The inspection robot structured light three-dimensional measurement system calibration method according to claim 2, wherein the calibration block comprises 16 trapezoidal stages arranged in an array on a reference plane.
4. The inspection robot structured light three-dimensional measurement system calibration method according to claim 3, wherein the extracting the three-dimensional corner points of each trapezoid table from the initial point cloud image in S20 includes:
s21, extracting plane mathematical models of the side surface, the upper surface and the bottom surface of the trapezoidal table from the initial point cloud image by adopting a preset point cloud image segmentation algorithm;
and S22, solving the three-dimensional corner points of each trapezoidal table based on the plane data model of three adjacent faces of the three-dimensional corner points to obtain 128 three-dimensional corner point coordinates of each calibration block.
5. The inspection robot structured light three-dimensional measurement system calibration method according to claim 1, wherein S30 includes:
s31, constructing a standard stl model of two calibration blocks arranged side by side through solidWorks, and generating standard point cloud images of the two calibration blocks;
s32, respectively registering the two initial point cloud images acquired by the two structured light sensors with the standard point cloud image based on the ICP (Iterative Closest Point) algorithm to obtain a rotation and translation matrix, wherein the rotation and translation matrix comprises a left calibration block rotation matrix, a left calibration block translation matrix, a right calibration block rotation matrix and a right calibration block translation matrix;
and S33, obtaining a transformation relation between the two groups of structured light coordinate systems according to the relative position relation between the rotation and translation matrix and the two calibration blocks.
6. The method for calibrating the structured light three-dimensional measurement system of the inspection robot according to claim 1, wherein the objective function is:
E(R, t) = (1/n) · Σ_{i=1}^{n} || p_i − (R · q_i + t) ||²
wherein n is the number of nearest-neighbor point pairs, p_i is a point in the target point cloud P, q_i is the point in the source point cloud Q nearest to p_i, R is the rotation matrix, and t is the translation vector.
7. The inspection robot structured light three-dimensional measurement system calibration method according to claim 1, wherein the transformation relationship between the two structured light sensor coordinate systems includes a rotation matrix and a translation matrix from a camera coordinate system in the left structured light sensor to a camera coordinate system in the right structured light sensor.
8. A method for processing routing inspection data of a track line routing inspection robot is characterized by comprising the following steps:
obtaining a transformation relation between two structured light sensor coordinate systems by adopting the structured light three-dimensional measurement system calibration method of the inspection robot according to any one of claims 1 to 7;
acquiring two point cloud images acquired when the track line inspection robot performs line detection through a structured light three-dimensional measurement system;
adjusting the two point cloud images based on the transformation relationship;
and performing three-dimensional reconstruction based on the adjusted point cloud image.
9. An electronic device, comprising: a memory, a processor and a computer program stored on the memory and executable on the processor, the computer program when executed by the processor implementing the steps of the method for calibrating a structured light three-dimensional measurement system of an inspection robot according to any one of claims 1 to 7.
10. A computer-readable storage medium, having a computer program stored thereon, which, when being executed by a processor, performs the steps of the method for calibrating a structured light three-dimensional measurement system of an inspection robot according to any one of claims 1 to 7.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202111107578.3A CN113959362B (en) | 2021-09-22 | 2021-09-22 | Calibration method and inspection data processing method of structured light three-dimensional measurement system |
Publications (2)
Publication Number | Publication Date |
---|---|
CN113959362A true CN113959362A (en) | 2022-01-21 |
CN113959362B CN113959362B (en) | 2023-09-12 |
Family
ID=79462355
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202111107578.3A Active CN113959362B (en) | 2021-09-22 | 2021-09-22 | Calibration method and inspection data processing method of structured light three-dimensional measurement system |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN113959362B (en) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN115035195A (en) * | 2022-08-12 | 2022-09-09 | 歌尔股份有限公司 | Point cloud coordinate extraction method, device, equipment and storage medium |
CN118572559A (en) * | 2024-07-30 | 2024-08-30 | 南方电网储能股份有限公司信息通信分公司 | Unmanned inspection method and unmanned inspection device for power line facilities |
Citations (15)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2008164493A (en) * | 2006-12-28 | 2008-07-17 | Pulstec Industrial Co Ltd | Method for measuring three-dimensional shape, and calibrating body |
CN102364299A (en) * | 2011-08-30 | 2012-02-29 | 刘桂华 | Calibration technology for multiple structured light projected three-dimensional profile measuring heads |
CN105180830A (en) * | 2015-09-28 | 2015-12-23 | 浙江大学 | Automatic three-dimensional point cloud registration method applicable to ToF (Time of Flight) camera and system |
CN105953747A (en) * | 2016-06-07 | 2016-09-21 | 杭州电子科技大学 | Structured light projection full view three-dimensional imaging system and method |
CN107843208A (en) * | 2017-10-27 | 2018-03-27 | 北京矿冶研究总院 | Mine roadway contour sensing method and system |
WO2018103694A1 (en) * | 2016-12-07 | 2018-06-14 | 苏州笛卡测试技术有限公司 | Robotic three-dimensional scanning device and method |
CN108225216A (en) * | 2016-12-14 | 2018-06-29 | 中国科学院深圳先进技术研究院 | Structured-light system scaling method and device, structured-light system and mobile equipment |
WO2018195986A1 (en) * | 2017-04-28 | 2018-11-01 | SZ DJI Technology Co., Ltd. | Calibration of laser sensors |
CN110058237A (en) * | 2019-05-22 | 2019-07-26 | 中南大学 | InSAR point Yun Ronghe and three-dimensional deformation monitoring method towards High-resolution SAR Images |
CN110440692A (en) * | 2019-08-27 | 2019-11-12 | 大连理工大学 | Laser tracker and structured light 3D scanner combined type measure scaling method |
CN111121628A (en) * | 2019-12-31 | 2020-05-08 | 芜湖哈特机器人产业技术研究院有限公司 | Calibration method of three-dimensional scanning system of carriage container based on two-dimensional laser radar |
CN111429490A (en) * | 2020-02-18 | 2020-07-17 | 北京林业大学 | Agricultural and forestry crop three-dimensional point cloud registration method based on calibration ball |
CN112561966A (en) * | 2020-12-22 | 2021-03-26 | 清华大学 | Sparse point cloud multi-target tracking method fusing spatio-temporal information |
CN112785654A (en) * | 2021-01-21 | 2021-05-11 | 中国铁道科学研究院集团有限公司基础设施检测研究所 | Calibration method and device for track geometry detection system |
CN112950562A (en) * | 2021-02-22 | 2021-06-11 | 杭州申昊科技股份有限公司 | Fastener detection algorithm based on line structured light |
Non-Patent Citations (2)
Title |
---|
李云梦: "Three-dimensional detection system based on rotary scanning structured light and its calibration", Journal of Electronic Measurement and Instrumentation * |
邓成呈: "Large-scale point cloud segmentation for three-dimensional map building of service robots", Mechatronics * |
Also Published As
Publication number | Publication date |
---|---|
CN113959362B (en) | 2023-09-12 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US7894661B2 (en) | Calibration apparatus, calibration method, program for calibration, and calibration jig | |
US9928595B2 (en) | Devices, systems, and methods for high-resolution multi-view camera calibration | |
JP2016128810A (en) | Method for calibrating depth camera | |
Chen et al. | Self-recalibration of a colour-encoded light system for automated three-dimensional measurements | |
JP7462769B2 (en) | System and method for characterizing an object pose detection and measurement system - Patents.com | |
CN112815843B (en) | On-line monitoring method for printing deviation of workpiece surface in 3D printing process | |
CN113959362B (en) | Calibration method and inspection data processing method of structured light three-dimensional measurement system | |
CN109801333A (en) | Volume measuring method, device, system and calculating equipment | |
WO2020188799A1 (en) | Camera calibration device, camera calibration method, and non-transitory computer-readable medium having program stored thereon | |
Chen et al. | A self-recalibration method based on scale-invariant registration for structured light measurement systems | |
El-Hakim et al. | Multicamera vision-based approach to flexible feature measurement for inspection and reverse engineering | |
CN115187612A (en) | Plane area measuring method, device and system based on machine vision | |
CN113048912B (en) | Calibration system and method of projector | |
CN114170321A (en) | Camera self-calibration method and system based on distance measurement | |
CN113689397A (en) | Workpiece circular hole feature detection method and workpiece circular hole feature detection device | |
CN110232715B (en) | Method, device and system for self calibration of multi-depth camera | |
CN113012279A (en) | Non-contact three-dimensional imaging measurement method and system and computer readable storage medium | |
CN114782315A (en) | Method, device and equipment for detecting shaft hole assembly pose precision and storage medium | |
CN113790711A (en) | Unmanned aerial vehicle low-altitude flight pose uncontrolled multi-view measurement method and storage medium | |
Li et al. | A method for 3D measurement and reconstruction for active vision | |
Morris et al. | Uncertainty analysis of the optical model attitude and deformation system in the AEDC PWT 16T facility | |
Wang et al. | Pedestrian speed estimation based on direct linear transformation calibration | |
Makris et al. | Vision guided robots. Calibration and motion correction | |
CN111145268A (en) | Video registration method and device | |
Zhang et al. | An efficient method for dynamic calibration and 3D reconstruction using homographic transformation |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||
GR01 | Patent grant |