CN109406525B - Bridge apparent disease detection system and detection method thereof - Google Patents

Info

Publication number: CN109406525B (grant); application number: CN201810971346.4A; earlier publication: CN109406525A
Authority: CN (China)
Prior art keywords: auxiliary line, laser, included angle, parameter, angle parameter
Legal status: Active (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis)
Original language: Chinese (zh)
Inventors: 张冠华, 崔凯华, 王超, 武旭娟, 王秋实, 王佳伟, 韩基刚, 鲁薇薇, 李万德, 吴宪锴, 郭东升, 李文全
Original and current assignee: Liaoning Institute Of Transportation Planning And Design Co., Ltd.
Filing history: application CN201810971346.4A filed by Liaoning Institute Of Transportation Planning And Design Co., Ltd.; published as CN109406525A; granted and published as CN109406525B

Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01N: INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N 21/00: Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N 21/84: Systems specially adapted for particular applications
    • G01N 21/88: Investigating the presence of flaws or contamination
    • G01N 21/8851: Scan or image signal processing specially adapted therefor, e.g. for scan signal adjustment, for detecting different kinds of defects, for compensating for structures, markings, edges
    • G01N 2021/8887: Scan or image signal processing based on image processing techniques


Abstract

The invention discloses a bridge apparent disease detection system and a detection method thereof, and relates to the technical field of bridge apparent disease detection. Its main aim is to make the detection of apparent bridge diseases simpler and more convenient and to improve working efficiency and construction safety. The main technical scheme of the invention is as follows. The system comprises a double-shaft motion mechanism, a driving mechanism and a control mechanism, wherein the double-shaft motion mechanism comprises a horizontal rotation part and a vertical rotation part arranged on the horizontal rotation part. A camera is arranged on the vertical rotation part; the camera comprises a lens, at least four laser probes are arranged in a circumferential array around the periphery of the lens, a laser ranging sensor is arranged on the lens, and the laser emission directions of the four laser probes and of the laser ranging sensor are each parallel to the optical axis of the lens and lie on the same side of it. A data processing device is electrically connected to the double-shaft motion mechanism, the camera and the laser ranging sensor, respectively. The method is mainly used for detecting apparent diseases of bridges.

Description

Bridge apparent disease detection system and detection method thereof
Technical Field
The invention relates to the technical field of bridge apparent disease detection, in particular to a system and a method for detecting bridge apparent diseases.
Background
Detection of apparent bridge diseases plays an important role in bridge inspection. It mainly comprises observation of the apparent disease characteristics of the bridge, including spalling, corner breakage, cracks, water seepage, efflorescence (whiskering) and the like, and the detection result must give the actual size of each bridge disease.
At present, bridge apparent disease detection is generally based on a non-contact method that uses image processing: a rotatable camera photographs the appearance of the bridge while rotating. Perspective distortion occurs in the image information obtained by this rotary shooting. To correct the distorted image information into front view image information suitable for disease detection, in the prior art an operator generally uses a climbing tool to attach a target to the detected surface of the bridge, obtains a perspective distortion inverse transformation matrix from the relative coordinates of the target, corrects the perspective distortion image information with this matrix, and then performs disease detection on the corrected front view information.
Disclosure of Invention
In view of the above, the embodiment of the invention provides a system and a method for detecting apparent diseases of bridges, which mainly aims to enable the operation of detecting the apparent diseases of the bridges to be simpler and more convenient and improve the working efficiency and the construction safety.
In order to achieve the above purpose, the present invention mainly provides the following technical solutions:
in one aspect, an embodiment of the present invention provides a system for detecting apparent bridge diseases, including:
the double-shaft motion mechanism comprises a horizontal rotation part and a vertical rotation part arranged on the horizontal rotation part;
the camera is arranged on the vertical rotating part; the camera comprises a lens, at least four laser probes are arranged in a circumferential array around the periphery of the lens, a laser ranging sensor is arranged on the lens, the laser emission directions of the four laser probes and of the laser ranging sensor are each parallel to the optical axis of the lens and lie on the same side of it, and the camera is used for capturing image information containing the four laser probe projection points; and
the data processing device is electrically connected to the double-shaft motion mechanism, the camera and the laser ranging sensor, respectively, and is used for acquiring and correspondingly processing the image information shot by the camera, the distance information measured by the laser ranging sensor and the motion information of the double-shaft motion mechanism.
On the other hand, the embodiment of the invention also provides a detection method of bridge apparent diseases, which is applied to the detection system and comprises the following steps:
the camera shoots perspective distortion image information with laser points sent by the laser probe;
the data processing device acquires the perspective distortion image information shot by the camera;
the data processing device acquires the relative coordinates of the laser points in the perspective distortion image information and the relative coordinates of the laser projection points where the laser emitted by the laser probes strikes the detected surface;
the data processing device calculates a perspective distortion inverse transformation matrix according to the laser point relative coordinates and the laser projection point relative coordinates;
and the data processing device corrects the perspective distortion image information into front view image information through the perspective distortion inverse transformation matrix, and performs bridge apparent disease detection by utilizing the front view image information.
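The correction step can be sketched numerically as follows (a minimal Python/NumPy illustration; the function name and sample matrix are ours, not the patent's): once the 3×3 perspective distortion inverse transformation matrix is known, distorted pixel coordinates are mapped through it in homogeneous form.

```python
import numpy as np

def rectify_points(M, pts):
    """Map (n, 2) distorted pixel coordinates through the 3x3 perspective
    distortion inverse transformation matrix M, then divide by the
    homogeneous coordinate to obtain front-view coordinates."""
    homog = np.hstack([np.asarray(pts, float), np.ones((len(pts), 1))])
    mapped = homog @ M.T
    return mapped[:, :2] / mapped[:, 2:3]

# With the identity matrix the coordinates are unchanged; a real matrix
# comes out of the calculation step described above.
print(rectify_points(np.eye(3), [(10.0, 20.0)]))
```

Correcting a whole image applies the same mapping to every pixel, in practice via an image-warping routine.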
Further, the data processing device acquiring the relative coordinates of the laser projection points where the laser emitted by the laser probes strikes the detected surface comprises:
establishing a space rectangular coordinate system in which a detected surface is positioned, wherein the detected surface is intersected with an X-axis positive direction, a Y-axis positive direction and a Z-axis positive direction of the space rectangular coordinate system at one point respectively, and the Z-axis positive direction is the optical axis direction of the lens;
Acquiring a first included angle parameter of the detected surface and the Y axis, a second included angle parameter of the detected surface and the X axis and a relative position coordinate of the laser probe;
and calculating the relative coordinates of the laser projection points according to the first included angle parameter, the second included angle parameter and the relative position coordinates of the laser probe.
Further, the obtaining the first included angle parameter between the detected surface and the Y axis includes:
generating a first auxiliary line and a second auxiliary line which extend from the origin of the space rectangular coordinate system to the detected surface respectively and are positioned in the positive direction and the negative direction of the Y axis respectively, and a third auxiliary line which extends from the intersection point of the first auxiliary line and the detected surface to the second auxiliary line, wherein the included angles formed by the first auxiliary line and the second auxiliary line respectively with the Z axis are equal, and the third auxiliary line is parallel to the Y axis;
acquiring a third included angle parameter between the first auxiliary line and the second auxiliary line, and acquiring a first length parameter of the first auxiliary line and a second length parameter of the second auxiliary line by using a laser ranging sensor;
calculating, by using a first preset formula and according to the first length parameter, the second length parameter and the third included angle parameter, a fourth included angle parameter between the third auxiliary line and the line connecting the intersection points of the first and second auxiliary lines with the detected surface, wherein the fourth included angle parameter is the first included angle parameter;
The first preset formula is:

γ = arctan[ (os - ot) / ((os + ot) · tan(θ/2)) ]

where γ is the fourth included angle parameter, os is the first length parameter, ot is the second length parameter, and θ is the third included angle parameter.
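As a numeric sketch of the first preset formula (illustrative Python; the function and variable names are ours, and the relation is rebuilt from the patent's stated geometry of two rays at ±θ/2 from the optical axis):

```python
import math

def fourth_angle(os_len, ot_len, theta):
    """Tilt of the detected surface in the YZ plane: two laser rays at
    +/- theta/2 from the optical axis meet the surface at measured
    distances os_len and ot_len (relation reconstructed from the
    patent's geometry; names are illustrative)."""
    return math.atan((os_len - ot_len) / ((os_len + ot_len) * math.tan(theta / 2)))

# Equal distances mean the surface is square to the optical axis: zero tilt.
print(fourth_angle(2.0, 2.0, math.radians(30)))  # 0.0
```

A quick consistency check: for a surface tilted by γ, the two ray lengths predicted by trigonometry feed back through this function to recover γ.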
Further, the obtaining the second included angle parameter between the detected surface and the X-axis includes:
generating an auxiliary Z axis which has a fifth included angle parameter with the Z axis and is positioned on a coordinate plane YOZ, a fourth auxiliary line which extends from an intersection point of the detected surface and the X axis to the auxiliary Z axis, fifth auxiliary lines and sixth auxiliary lines which respectively extend from an origin of the space rectangular coordinate system to the fourth auxiliary line and are respectively positioned in positive and negative directions of the X axis, and seventh auxiliary lines which extend from an intersection point of the fifth auxiliary line and the fourth auxiliary line to the sixth auxiliary line, wherein included angles formed by the fifth auxiliary line and the sixth auxiliary line and the auxiliary Z axis are equal, and the seventh auxiliary lines are parallel to the X axis;
acquiring a sixth included angle parameter between the fifth auxiliary line and the sixth auxiliary line, and acquiring a fifth length parameter of the fifth auxiliary line and a sixth length parameter of the sixth auxiliary line by using a laser ranging sensor;
calculating, by using a second preset formula and according to the fifth length parameter, the sixth length parameter and the sixth included angle parameter, a seventh included angle parameter between the seventh auxiliary line and the line connecting the intersection points of the fifth and sixth auxiliary lines with the fourth auxiliary line, wherein the seventh included angle parameter is equal to the second included angle parameter;
The second preset formula is:

ω = arctan[ (oh″ - ow) / ((oh″ + ow) · tan(θ′/2)) ]

where ω is the seventh included angle parameter, oh″ is the fifth length parameter, ow is the sixth length parameter, and θ′ is the sixth included angle parameter.
Further, the acquiring a sixth included angle parameter between the fifth auxiliary line and the sixth auxiliary line includes:
generating an eighth auxiliary line and a ninth auxiliary line which extend in turn from a point in the positive direction of the Y axis to the two end points of the seventh auxiliary line, wherein the plane in which the eighth and ninth auxiliary lines lie is perpendicular to the coordinate plane YOZ;
acquiring the fifth included angle parameter, and acquiring an eighth included angle parameter between the eighth auxiliary line and the ninth auxiliary line;
calculating the sixth included angle parameter by using a third preset formula according to the fifth included angle parameter and the eighth included angle parameter;
the third preset formula is:
Figure GDA0004218208850000042
Wherein θ' is a sixth included angle parameter, ε is a fifth included angle parameter, and θ "is an eighth included angle parameter.
Further, the calculating the relative coordinates of the laser projection point according to the first included angle parameter, the second included angle parameter and the relative position coordinates of the laser probe includes:
acquiring a distance parameter between an origin of the space rectangular coordinate system and an intersection point of the detected surface and the Z axis by using the laser ranging sensor;
calculating an equation of the detected surface by utilizing a trigonometric function relation and a space plane equation according to the first included angle parameter, the second included angle parameter and the distance parameter;
calculating, according to the relative position coordinates of the laser probes and by using the equation of the detected surface, the space coordinates on the detected surface corresponding to the projection points;
establishing a plane rectangular coordinate system in the plane containing those coordinate points, taking the coordinate point on the Z axis as the origin, wherein the vector direction from the coordinate point on the Z axis to its adjacent coordinate point is the positive direction of the X axis;
and calculating the plane coordinates on the detected surface corresponding to the relative position coordinates of the laser probe by utilizing a vector operation relation and a trigonometric function relation according to the plane rectangular coordinate system and the space coordinates, wherein the plane coordinates are the relative coordinates of the projection points of the laser probe.
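A compact sketch of the projection-point calculation (Python; the plane form z = d + y·tan(γ) + x·tan(ω) is our simplified reading of the space plane equation, not the patent's exact derivation): each probe fires parallel to the optical axis from its lens offset, so its spot is found by evaluating the plane at that offset.

```python
import math

def surface_plane(gamma, omega, d):
    """Detected surface as z = d + y*tan(gamma) + x*tan(omega), where d is
    the axial distance measured by the ranging sensor and gamma, omega are
    the two tilt angles (assumed parametrisation, names illustrative)."""
    return lambda x, y: d + y * math.tan(gamma) + x * math.tan(omega)

def projection_point(plane, px, py):
    """A probe offset (px, py) from the optical axis fires parallel to it,
    so its laser spot on the surface is (px, py, plane(px, py))."""
    return (px, py, plane(px, py))

plane = surface_plane(math.radians(5), math.radians(-3), 4.0)
print(projection_point(plane, 0.05, -0.05))
```

For a surface perpendicular to the optical axis (both tilts zero) every probe's spot simply sits at depth d.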
Further, the data processing device calculates a perspective distortion inverse transformation matrix according to the laser point relative coordinates and the laser projection point relative coordinates, and the method comprises the following steps:
calculating the perspective distortion inverse transformation matrix by using a fourth preset formula according to the laser point relative coordinates and the laser projection point relative coordinates;
the fourth preset formula is that
Figure GDA0004218208850000051
Wherein (1)>
Figure GDA0004218208850000052
Inverse transform matrix for perspective distortion->
Figure GDA0004218208850000053
Coordinate matrix of laser projection point relative coordinates, +.>
Figure GDA0004218208850000054
And taking a coordinate matrix of relative coordinates of the laser points in the perspective distortion image information for the camera, wherein n is a positive integer greater than zero.
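Since each correspondence between a laser point (u_n, v_n) and its projection point (x_n, y_n) yields two linear constraints, the four pairs determine the eight free entries of the matrix directly; a sketch in Python with NumPy (function and variable names are ours, a standard direct linear transform, not necessarily the patent's exact routine):

```python
import numpy as np

def inverse_perspective_matrix(img_pts, proj_pts):
    """Solve the 3x3 perspective distortion inverse transform from four
    laser-point image coordinates and their projection-point coordinates,
    with the bottom-right matrix entry fixed to 1."""
    A, b = [], []
    for (u, v), (x, y) in zip(img_pts, proj_pts):
        # Two rows of the linear system per point correspondence.
        A.append([u, v, 1, 0, 0, 0, -u * x, -v * x]); b.append(x)
        A.append([0, 0, 0, u, v, 1, -u * y, -v * y]); b.append(y)
    coeffs = np.linalg.solve(np.asarray(A, float), np.asarray(b, float))
    return np.append(coeffs, 1.0).reshape(3, 3)

# Four made-up correspondences: image pixels -> surface coordinates.
img = [(0.0, 0.0), (100.0, 0.0), (100.0, 100.0), (0.0, 100.0)]
proj = [(0.0, 0.0), (1.0, 0.0), (1.0, 1.2), (0.0, 1.0)]
M = inverse_perspective_matrix(img, proj)
```

Mapping the four image points back through M (with homogeneous normalisation) reproduces the projection-point coordinates, which is a quick self-check of the solved matrix.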
Further, the data processing device correcting the perspective distortion image information into front view image information through the perspective distortion inverse transformation matrix and performing bridge apparent disease detection by using the front view image information comprises:
correcting the perspective distortion image information into front view image information through the perspective distortion inverse transformation matrix, and detecting apparent bridge diseases by using the front view image information and the relative coordinates of the laser projection points.
further, the bridge apparent disease detection by using the front view image information and the laser projection point relative coordinates comprises:
calculating a distance parameter between any two coordinate points among the relative coordinates of the laser projection points, and counting the number of pixel units between those two coordinate points;
dividing the distance parameter by the number of pixel units to obtain the size parameter of each pixel unit;
and determining the size information of the bridge apparent disease in the front view image information from the number of pixel units within the range of the bridge apparent disease in the front view image information and the size parameter of each pixel unit.
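The scale calibration above amounts to the following (illustrative Python; all numbers are made up): the known real distance between two laser projection points divided by the pixel count between their spots gives the size of one pixel unit, which then converts any pixel count into a physical size.

```python
import math

def pixel_size(p1, p2, n_pixels):
    """Size of one pixel unit: real distance between two laser projection
    points divided by the number of pixel units between their spots."""
    return math.dist(p1, p2) / n_pixels

def disease_size(n_pixels, unit):
    """Extent of an apparent disease region: its pixel count times the
    per-pixel size."""
    return n_pixels * unit

# Made-up numbers: two spots 0.5 m apart span 500 px, so 1 mm per pixel;
# a crack covering 120 px is then about 0.12 m long.
unit = pixel_size((0.0, 0.0), (0.3, 0.4), 500)
print(disease_size(120, unit))
```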
By means of the technical scheme, the invention has at least the following beneficial effects:
According to the bridge apparent disease detection system provided by the embodiment of the invention, a double-shaft motion mechanism, a camera and a data processing device are provided. The double-shaft motion mechanism can drive the camera to shoot while rotating, so that the camera can capture perspective distortion image information containing the four laser probe projection points. When apparent bridge diseases are detected, the camera can be arranged below the bridge for fixed-point rotary shooting, and the lasers emitted by the four laser probes complete the setting of the calibration points. The data processing device can then obtain the relative coordinates of the laser points in the perspective distortion image information shot by the camera and the relative coordinates of the laser projection points where the laser strikes the detected surface, derive a perspective distortion inverse transformation matrix from these two sets of coordinates, and correct the perspective distortion image information into front view image information by using that matrix, without an operator having to attach targets to the bridge.
According to the bridge apparent disease detection method provided by the embodiment of the invention, the camera in the detection system captures perspective distortion image information containing the laser points emitted by the laser probes. The data processing device acquires the perspective distortion image information shot by the camera, obtains the relative coordinates of the laser points in it and the relative coordinates of the laser projection points where the laser strikes the detected surface, and calculates a perspective distortion inverse transformation matrix from these coordinates. The data processing device in the detection equipment can then use the matrix to correct the perspective distortion image information into front view image information, so that disease detection is performed on the front view image information.
Drawings
Fig. 1 is a schematic structural diagram of a bridge apparent disease detection system according to an embodiment of the present invention at a first view angle;
Fig. 2 is a schematic structural diagram of a bridge apparent disease detection system according to an embodiment of the present invention at a second view angle;
FIG. 3 is a flow chart of a method for detecting apparent bridge diseases, which is provided by the embodiment of the invention;
FIG. 4 is a flowchart of another method for detecting apparent bridge diseases according to an embodiment of the present invention;
FIG. 5 is a model of a space rectangular coordinate system in FIG. 3 or FIG. 4;
FIG. 6 is a rectangular planar coordinate system model of FIG. 5;
FIG. 7 is a triangle model of FIG. 5;
FIG. 8 is another space rectangular coordinate system model of FIG. 3 or FIG. 4;
FIG. 9 is a triangle model of FIG. 8;
FIG. 10 is a block diagram of a bridge apparent disease detection device according to an embodiment of the present invention;
fig. 11 is a block diagram of another bridge apparent disease detection device according to an embodiment of the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, the technical solutions in the preferred embodiments of the present invention will be described in more detail with reference to the accompanying drawings in the preferred embodiments of the present invention. In the drawings, the same or similar reference numerals denote the same or similar elements or elements having the same or similar functions throughout. The described embodiments are some, but not all, embodiments of the invention. The embodiments described below by referring to the drawings are illustrative and intended to explain the present invention and should not be construed as limiting the invention. All other embodiments, which can be made by those skilled in the art based on the embodiments of the invention without making any inventive effort, are intended to be within the scope of the invention. Embodiments of the present invention will be described in detail below with reference to the accompanying drawings.
In the description of the present embodiment, it should be understood that the terms "center", "longitudinal", "lateral", "front", "rear", "left", "right", "vertical", "horizontal", "top", "bottom", "inner", "outer", etc. indicate orientations or positional relationships based on the orientations or positional relationships shown in the drawings, are merely for convenience in describing the present embodiment and simplifying the description, and do not indicate or imply that the device or element in question must have a specific orientation, be configured and operated in a specific orientation, and therefore should not be construed as limiting the scope of protection of the present embodiment.
As described in the background art, bridge apparent disease detection is at present generally based on a non-contact method that uses image processing: a rotatable camera photographs the appearance of the bridge while rotating. Since perspective distortion occurs in the image information obtained by the rotating camera, the image information with perspective distortion must be corrected into front view image information before disease detection. In the prior art an operator generally uses a climbing tool to attach a target to the detected surface of the bridge, obtains a perspective distortion inverse transformation matrix from the relative coordinates of the target, corrects the perspective distortion image information with this matrix, and then performs disease detection on the corrected front view information. The operation is therefore inconvenient, construction safety is lower, and working efficiency is lower.
In order to solve the above problems, as shown in fig. 1 and 2, an embodiment of the present invention provides a detection system for apparent bridge diseases, comprising: a biaxial movement mechanism 1 including a horizontal rotation part 12 and a vertical rotation part 11 provided on the horizontal rotation part 12; a camera 2 provided on the vertical rotation part 11, the camera 2 comprising a lens 21, wherein at least four laser probes 3 are arranged in a circumferential array around the periphery of the lens 21, a laser ranging sensor 4 is arranged on the lens 21, the laser emission directions of the four laser probes 3 and of the laser ranging sensor 4 are each parallel to the optical axis of the lens 21 and lie on the same side of it, and the camera 2 is used for capturing image information containing the four projection points of the laser probes 3; and a data processing device (not shown in the figures) electrically connected to the biaxial movement mechanism 1, the camera 2 and the laser ranging sensor 4 respectively, for acquiring and correspondingly processing the image information taken by the camera 2, the distance information measured by the laser ranging sensor 4 and the movement information of the biaxial movement mechanism 1.
In the detection system, the camera 2 rotates horizontally or vertically with the biaxial movement mechanism 1. The biaxial movement mechanism 1 may comprise a vertical rotation part 11 and a horizontal rotation part 12, and the camera 2 may be arranged on the vertical rotation part 11, so that the camera 2 can rotate by plus or minus 30 degrees in the vertical direction with the rotation of the vertical rotation part 11 and by plus or minus 30 degrees in the horizontal direction with the rotation of the horizontal rotation part 12, thereby realizing rotary shooting. The at least four laser probes 3 are arranged in an array around the lens 21 so that the lasers they emit fall within the view angle of the lens 21; importantly, the laser points emitted by the laser probes 3 should lie as close as possible to the centre of the image taken by the camera 2, which reduces distortion of the laser points. The data processing device can be connected with the camera 2 and the laser ranging sensor 4 in a wired or wireless manner. Specifically, the data processing device may be a computer, used for acquiring the relative coordinates of the laser points in the perspective distortion image information and the relative coordinates of the laser projection points on the detected surface, calculating a perspective distortion inverse transformation matrix from these coordinates, correcting the perspective distortion image information into front view image information with the matrix, and detecting apparent bridge diseases from the front view image information, so that bridge apparent disease detection is simpler and more convenient and working efficiency is effectively improved.
The biaxial movement mechanism 1 may be a biaxial movement holder, and the horizontal rotation part 12 and the vertical rotation part 11 may be driven by mechanical gears or motors; since the biaxial movement holder is a common prior-art device, its structure is not described in detail here. The biaxial movement holder can be electrically connected with the data processing device, so that the data processing device can control the movement of the holder and acquire its rotation angle information.
The embodiment of the invention also provides a detection method of the apparent bridge diseases, which is realized by adopting the detection system, as shown in fig. 3 and combined with fig. 1 and 2, and comprises the following steps:
101. The camera captures perspective distortion image information with the laser points emitted by the laser probes 3.
At least four laser probes 3 are arranged around the periphery of the lens 21. When the camera 2 shoots an image of the detected surface, the laser probes project laser onto it; the lasers emitted by the at least four probes 3 fall within the view angle of the lens 21, and at the same time the laser points lie as close as possible to the centre of the image shot by the camera 2, which reduces distortion of the laser points. The camera 2 can therefore shoot while rotating and capture perspective distortion image information with the laser points emitted by the laser probes 3.
102. The data processing device acquires perspective distortion image information captured by the camera 2.
The camera 2 in the detection system may store the captured image information, and the data processing device may acquire the image information with perspective distortion in the image information stored in the camera 2, so as to prepare for correcting the perspective distortion image information.
103. The data processing device acquires the relative coordinates of the laser points in the perspective distortion image information and the relative coordinates of the laser projection points where the lasers of the laser probes are projected onto the detected surface.
The relative coordinates of the laser points can be directly extracted from the perspective distortion image information by the data processing device according to the coordinates of the pixel points, so that the relative coordinates of the laser points can be used as a known quantity; the relative coordinates of the laser projection points, that is, the relative coordinates of the laser projection points of the laser emitted by the laser probe 3 projected onto the surface to be detected, can be calculated by the data processing device.
104. The data processing device calculates a perspective distortion inverse transformation matrix according to the relative coordinates of the laser points and the relative coordinates of the laser projection points.
The coefficients of the perspective distortion inverse transformation matrix equation can be calculated through the relative coordinates of the laser points and the relative coordinates of the laser projection points, and the coefficients are the perspective distortion inverse transformation matrix.
105. The data processing apparatus corrects the perspective distortion image information into front view image information through the perspective distortion inverse transformation matrix.
106. The data processing device detects apparent bridge diseases by using the front view image information.
After the perspective distortion inverse transformation matrix is calculated, the perspective distortion image information shot in the current shooting posture can be corrected and transformed into front view image information through the matrix, so that the bridge disease information can be identified and measured by utilizing the front view image information.
According to the bridge apparent disease detection method provided by the embodiment of the invention, the camera in the detection system captures perspective distortion image information with the laser points of the laser probes. The data processing device acquires the perspective distortion image information shot by the camera, obtains the relative coordinates of the laser points in that information and the relative coordinates of the laser projection points where the laser probes project onto the detected surface, and calculates a perspective distortion inverse transformation matrix from these two sets of relative coordinates. The data processing device can then correct the perspective distortion image information into front view image information by using the matrix, so that disease detection is carried out on the front view image information.
Further, in order to better explain the method for detecting apparent bridge diseases, as a refinement and extension of the above embodiment, the embodiment of the present invention provides another method for detecting apparent bridge diseases, as shown in fig. 4 in combination with fig. 1 and fig. 2, but not limited thereto, specifically as follows:
201. the camera captures perspective distortion image information with laser points emitted by the laser probe.
At least four laser probes 3 are arranged around the periphery of the lens 21. When the camera 2 shoots an image of the detected surface, the laser probes project laser onto the detected surface, and the lasers emitted by the at least four laser probes 3 are located within the view angle of the lens 21, so that the camera 2 can shoot rotationally and capture perspective distortion image information with the laser points of the laser probes 3.
202. The data processing device acquires the perspective distortion image information.
The camera 2 in the detection system may store the captured image information, and the data processing device may acquire the image information with perspective distortion in the image information stored in the camera 2, so as to prepare for correcting the perspective distortion image information.
203. The data processing device acquires the relative coordinates of the laser points in the perspective distortion image information.
The relative coordinates of the laser points can be directly extracted by the data processing device from the perspective distortion image information according to the coordinates of the pixel points, so that the relative coordinates of the laser points can be used as a known quantity.
204. The data processing device establishes a space rectangular coordinate system where the detected surface is located, and the detected surface is intersected with an X-axis positive direction, a Y-axis positive direction and a Z-axis positive direction of the space rectangular coordinate system at one point respectively.
Referring to fig. 5 in combination with fig. 1, o is the origin of the space rectangular coordinate system. The positive Z-axis direction may be the optical axis direction of the lens 21 in the detection device, the positive Y-axis direction may be the upward direction of the camera 2, and the positive X-axis direction may be the right-hand direction of the camera 2. The plane abc may be a detected surface intersecting the positive X-axis, Y-axis and Z-axis directions at one point each. The emission points of the four laser probes 3 may be located on the XOY plane of the space rectangular coordinate system, that is, the aforementioned four coordinate points o, p, q and r, and the projection points of the lasers of the four laser probes 3 onto the detected surface may be the coordinate points c, e, d and f respectively, that is, the laser points of the four laser probes 3 in the perspective distortion image information.
205. The data processing device acquires a first included angle parameter between the detected surface and the Y axis, a second included angle parameter between the detected surface and the X axis, and the relative position coordinates of the laser probes 3.
Referring to fig. 5 in combination with fig. 1, the included angle between the detected surface abc and the Y axis is β, the first included angle parameter, and the included angle between the detected surface abc and the X axis is α, the second included angle parameter. Since the relative positions of the at least four laser probes 3 on the lens 21 of the detection device are controllable, the relative position coordinates of the laser probes 3 may be known quantities. For example, taking the number of laser probes 3 as four, the four laser probes 3 may be arranged on the periphery of the lens 21 in a rectangle, which may have a length of 13 cm and a width of 9 cm, so that the coordinates of the emission points of the four laser probes 3 in a plane rectangular coordinate system may be o(0, 0), p(p, 0), q(0, q), r(p, q), where o, p, q and r are the position points of the four laser probes 3 and o is the origin of the plane rectangular coordinate system. Furthermore, the relative position coordinates of the four laser probes 3 may be directly input into the data processing device.
In the embodiment of the present invention, the camera 2 in the detection device performs rotational shooting in the horizontal and vertical directions through the biaxial movement mechanism 1, and the Z-axis direction of the aforementioned space rectangular coordinate system is the optical axis direction of the lens 21 of the camera 2. The vertical rotation direction of the vertical rotation portion 11 of the biaxial movement mechanism 1 is consistent with the vertical (pitching) rotation direction of the camera 2 itself under any condition, so acquiring the first included angle parameter in step 205 may specifically include the following steps:
a1, generating a first auxiliary line and a second auxiliary line which extend from the origin of the space rectangular coordinate system to the detected surface respectively and are positioned in the positive direction and the negative direction of the Y axis respectively, and a third auxiliary line which extends from the intersection point of the first auxiliary line and the detected surface to the second auxiliary line.
The first auxiliary line and the second auxiliary line are located in the positive and negative directions of the Y axis respectively and both lie in the coordinate plane YOZ; the included angles they form with the Z axis are equal, and the third auxiliary line is parallel to the Y axis. Referring to fig. 5, os is the first auxiliary line, ot is the second auxiliary line, and sg is the third auxiliary line.
A2, acquiring a third included angle parameter between the first auxiliary line and the second auxiliary line, and acquiring a first length parameter of the first auxiliary line and a second length parameter of the second auxiliary line by using a laser ranging sensor 4.
Referring to fig. 5, the biaxial movement mechanism 1 in the detection system may drive the laser ranging sensor 4 to rotate by a small angle θ along the positive and negative directions of the Y axis with the Z axis as the center; the angle θ is the third included angle parameter, and the data processing device may acquire its value. When the laser ranging sensor 4 rotates toward the positive Y direction with the Z axis as the center, the first length parameter of the first auxiliary line os may be measured by emitting laser to the detected surface abc; similarly, when it rotates toward the negative Y direction, the second length parameter of the second auxiliary line ot may be measured, and the first and second length parameters may be sent to the data processing device. In the embodiment of the invention, the angle θ is a controllable quantity and can be set according to actual engineering requirements; according to actual bridge inspection conditions and the measurement accuracy of the laser ranging sensor 4, θ may be set to 1° when the measuring distance is greater than 5 m and to 2° when it is less than 5 m. Moreover, the letters os may represent the first length parameter of the first auxiliary line, and ot the second length parameter of the second auxiliary line.
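The θ selection rule above can be written as a one-line helper (a trivial sketch; the function name and units are illustrative):

```python
def choose_theta_degrees(measuring_distance_m):
    """Rotation angle theta suggested in the text: 1 degree when the
    measuring distance exceeds 5 m, otherwise 2 degrees."""
    return 1.0 if measuring_distance_m > 5.0 else 2.0
```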
A3. Calculate, according to the first length parameter, the second length parameter and the third included angle parameter, a fourth included angle parameter between the third auxiliary line and the line connecting the intersection points of the first auxiliary line and the second auxiliary line with the detected surface by using a first preset formula, wherein the fourth included angle parameter is the first included angle parameter.
Referring to fig. 5 and fig. 7, since the first auxiliary line os and the second auxiliary line ot form equal included angles with the Z axis and the third auxiliary line sg is parallel to the Y axis, Δosg is an isosceles triangle and the fourth included angle parameter γ is equal to the first included angle parameter β, so the following formulas can be obtained:

sg = 2·os·sin(θ/2)   formula 1

tg = ot − os   formula 2

st² = os² + ot² − 2·os·ot·cos θ   formula 3

tg² = st² + sg² − 2·st·sg·cos γ   formula 4

The first preset formula can be obtained from formulas 1 to 4:

γ = arccos( (os + ot)·sin(θ/2) / √(os² + ot² − 2·os·ot·cos θ) )

wherein γ is the fourth included angle parameter, os is the first length parameter, ot is the second length parameter, and θ is the third included angle parameter.
In summary, when calculating the first included angle parameter β, the data processing device may substitute the first length parameter os, the second length parameter ot and the third included angle parameter θ into the first preset formula to calculate the fourth included angle parameter γ, thereby obtaining the first included angle parameter β.
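A minimal numerical sketch of this calculation, assuming the first preset formula has the form γ = arccos((os + ot)·sin(θ/2)/st) with st given by formula 3 (the original formula image is not legible in this text, so this reconstruction from formulas 2 to 4 is an assumption; function names are illustrative):

```python
import math

def first_angle_parameter(os_len, ot_len, theta):
    """First preset formula: fourth included angle gamma (equal to the
    first included angle beta) from the measured auxiliary-line lengths
    os, ot and the small rotation angle theta (radians)."""
    # formula 3: distance st between the two laser spots on the detected surface
    st = math.sqrt(os_len ** 2 + ot_len ** 2
                   - 2.0 * os_len * ot_len * math.cos(theta))
    # cos(gamma) = (os + ot) * sin(theta / 2) / st; clamp against rounding
    cos_gamma = (os_len + ot_len) * math.sin(theta / 2.0) / st
    return math.acos(max(-1.0, min(1.0, cos_gamma)))
```

For a surface tilted 30° toward the Y axis at about 10 m with θ = 2°, the two measured lengths differ by roughly 2 % and the formula recovers β ≈ 30°.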
In the embodiment of the present invention, referring to fig. 1 and 5, the camera 2 in the detection device performs rotational shooting in the horizontal and vertical directions through the biaxial movement mechanism 1, and the Z-axis direction of the aforementioned space rectangular coordinate system is the optical axis direction of the lens 21 of the camera 2. When the vertical rotation portion 11 of the biaxial movement mechanism 1 drives the lens 21 to rotate vertically by a certain angle, that is, when the lens 21 has a certain pitch angle, the rotation direction of the horizontal rotation portion 12 of the biaxial movement mechanism 1 is in many cases inconsistent with the horizontal rotation direction of the camera 2 itself, that is, with the direction in which the Z axis rotates around the Y axis toward the X axis. The step of obtaining the second included angle parameter between the detected surface and the X axis therefore differs from the step of obtaining the first included angle parameter; specifically, acquiring the second included angle parameter between the detected surface and the X axis in step 205 may include the following steps:
b1, generating an auxiliary Z axis which has a fifth included angle parameter with the Z axis and is positioned on a coordinate plane YOZ, a fourth auxiliary line which extends from an intersection point of the detected surface and the X axis to the auxiliary Z axis, fifth auxiliary lines and sixth auxiliary lines which respectively extend from an origin of the space rectangular coordinate system to the fourth auxiliary line and are respectively positioned in positive and negative directions of the X axis, and seventh auxiliary lines which extend from an intersection point of the fifth auxiliary line and the fourth auxiliary line to the sixth auxiliary line.
The included angles formed by the fifth auxiliary line and the sixth auxiliary line with the auxiliary Z axis are equal, and the seventh auxiliary line is parallel to the X axis. Referring to fig. 8, the Z' axis is the auxiliary Z axis, oh" is the fifth auxiliary line, ow is the sixth auxiliary line, and h'h" is the seventh auxiliary line. Moreover, the included angle ε between the Z' axis and the Z axis is the fifth included angle parameter; it indicates that the camera 2 has rotated vertically by ε together with the vertical rotation portion 11 of the biaxial movement mechanism 1, that is, the current pitch angle of the camera 2 is ε.
And B2, acquiring a sixth included angle parameter between the fifth auxiliary line and the sixth auxiliary line, and acquiring a fifth length parameter of the fifth auxiliary line and a sixth length parameter of the sixth auxiliary line by using a laser ranging sensor 4.
For the embodiment of the present invention, the obtaining, in step B2, the sixth included angle parameter between the fifth auxiliary line and the sixth auxiliary line may specifically include the following steps:
b1. Generate an eighth auxiliary line and a ninth auxiliary line which extend from the positive direction of the Y axis to the two end points of the seventh auxiliary line respectively.
The plane where the eighth auxiliary line and the ninth auxiliary line are located is perpendicular to the coordinate plane YOZ. Referring to fig. 8, o'h" is the eighth auxiliary line and o'h' is the ninth auxiliary line. In addition, a tenth auxiliary line o'h may be generated, intersecting the seventh auxiliary line h'h" and the auxiliary Z axis Z' at a point h; the tenth auxiliary line o'h is the intersection line between the plane where the eighth and ninth auxiliary lines are located and the coordinate plane YOZ, and it is parallel to the Z axis.
b2, acquiring the fifth included angle parameter, and acquiring an eighth included angle parameter between the eighth auxiliary line and the ninth auxiliary line.
Referring to fig. 8, as mentioned above, ε is the pitch angle of the current camera 2, that is, the fifth included angle parameter, and it may be obtained automatically by the data processing device from the rotation angle of the vertical rotation portion 11 of the biaxial movement pan/tilt; that is, the fifth included angle parameter is a controllable quantity. The eighth included angle parameter is the angle θ": the laser ranging sensor 4 may be driven by the biaxial movement mechanism 1 in the detection system to rotate around the Y axis toward the positive and negative directions of the X axis so that the emitted laser coincides with the eighth auxiliary line and then the ninth auxiliary line; the total angle rotated by the laser ranging sensor 4 is thus the eighth included angle parameter θ", and the data processing device may acquire its value.
b3, calculating the sixth included angle parameter by using a third preset formula according to the fifth included angle parameter and the eighth included angle parameter.
Referring to fig. 8, the fifth auxiliary line oh" and the sixth auxiliary line ow form equal included angles with the auxiliary Z axis Z', the seventh auxiliary line h'h" is parallel to the X axis, the tenth auxiliary line o'h is parallel to the Z axis, and h is the midpoint of h'h" on the auxiliary Z axis, so the following formulas can be obtained:

oh" = oh'   formula 5

h'h"/2 = o'h·tan(θ"/2)   formula 6

o'h = oh·cos ε   formula 7

h'h"/2 = oh·tan(θ'/2)   formula 8

Thus:

oh·tan(θ'/2) = o'h·tan(θ"/2)   formula 9

At the same time:

o'h/oh = cos ε   formula 10

Thus:

tan(θ'/2) = cos ε·tan(θ"/2)   formula 11

The third preset formula is derived from formula 11 as:

θ' = 2·arctan(cos ε·tan(θ"/2))

wherein θ' is the sixth included angle parameter, ε is the fifth included angle parameter, and θ" is the eighth included angle parameter.
In summary, when calculating the sixth included angle parameter, the data processing device may substitute the fifth included angle parameter and the eighth included angle parameter into the third preset formula to calculate the sixth included angle parameter.
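A sketch of this step, assuming the third preset formula reduces to tan(θ'/2) = cos ε·tan(θ"/2) (the original formula image is not legible here, so this form is a reconstruction from the triangle relations around it; the function name is illustrative):

```python
import math

def sixth_angle_parameter(epsilon, theta_pp):
    """Third preset formula (reconstructed): sixth included angle theta'
    between the fifth and sixth auxiliary lines, from the pitch angle
    epsilon and the measured rotation angle theta''. Angles in radians."""
    # tan(theta'/2) = cos(epsilon) * tan(theta''/2)
    return 2.0 * math.atan(math.cos(epsilon) * math.tan(theta_pp / 2.0))
```

With ε = 0 the camera is not pitched and θ' = θ", as expected; a nonzero pitch shrinks the effective angle.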
B3. Calculate, according to the fifth length parameter, the sixth length parameter and the sixth included angle parameter, a seventh included angle parameter between the seventh auxiliary line and the line connecting the intersection points of the fifth auxiliary line and the sixth auxiliary line with the fourth auxiliary line by using a second preset formula, wherein the seventh included angle parameter is equal to the second included angle parameter.
Referring to fig. 8 in combination with fig. 9, since the fifth auxiliary line oh" and the sixth auxiliary line ow form equal included angles with the auxiliary Z axis and the seventh auxiliary line h'h" is parallel to the X axis, Δoh"h' is an isosceles triangle and the seventh included angle parameter ω is equal to the second included angle parameter, so the following formulas can be obtained:

h'h" = 2·oh"·sin(θ'/2)   formula 12

wh' = ow − oh"   formula 13

h"w² = oh"² + ow² − 2·oh"·ow·cos θ'   formula 14

wh'² = h"w² + h'h"² − 2·h"w·h'h"·cos ω   formula 15

The second preset formula can be obtained from formulas 12 to 15 as follows:

ω = arccos( (oh" + ow)·sin(θ'/2) / √(oh"² + ow² − 2·oh"·ow·cos θ') )

wherein ω is the seventh included angle parameter, oh" is the fifth length parameter, ow is the sixth length parameter, and θ' is the sixth included angle parameter. Also, as previously described,

θ' = 2·arctan(cos ε·tan(θ"/2))

wherein θ' is the sixth included angle parameter, ε is the fifth included angle parameter, and θ" is the eighth included angle parameter; therefore the sixth included angle parameter θ' calculated by the third preset formula can be substituted into the second preset formula to obtain the seventh included angle parameter ω.
In summary, when calculating the second included angle parameter α, the data processing device may substitute the fifth length parameter oh", the sixth length parameter ow and the sixth included angle parameter θ' into the second preset formula to calculate the seventh included angle parameter ω, thereby obtaining the second included angle parameter α.
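A sketch of the second preset formula under the same assumption as the first one, i.e. ω = arccos((oh" + ow)·sin(θ'/2)/h"w) with h"w given by formula 14 (a reconstruction, since the formula images are illegible; names are illustrative):

```python
import math

def second_angle_parameter(oh_pp, ow, theta_p):
    """Second preset formula: seventh included angle omega (equal to the
    second included angle alpha) from the fifth and sixth length
    parameters oh'', ow and the sixth included angle theta' (radians)."""
    # formula 14: distance h''w between the two spots on the fourth auxiliary line
    hw = math.sqrt(oh_pp ** 2 + ow ** 2
                   - 2.0 * oh_pp * ow * math.cos(theta_p))
    cos_omega = (oh_pp + ow) * math.sin(theta_p / 2.0) / hw
    return math.acos(max(-1.0, min(1.0, cos_omega)))  # clamp against rounding
```

Equal measured lengths (oh" = ow) give ω = 0, i.e. no tilt of the detected surface toward the X axis.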
206. The data processing device calculates the relative coordinates of the laser projection point according to the first included angle parameter, the second included angle parameter and the relative position coordinates of the laser probe 3.
For the embodiment of the present invention, the step 206 may specifically include the following steps:
and C1, acquiring a distance parameter between the origin of the space rectangular coordinate system and the intersection point of the detected surface and the Z axis by using the laser ranging sensor 4.
And C2, calculating an equation of the detected surface by utilizing a trigonometric function relation and a space plane equation according to the first included angle parameter, the second included angle parameter and the distance parameter.
Referring to fig. 5, the data processing device may acquire the distance parameter c, measured by the laser ranging sensor 4, between the origin o of the space rectangular coordinate system and the intersection point of the detected surface abc and the Z axis, and then substitute the distance parameter c, the first included angle parameter β and the second included angle parameter α calculated by the foregoing formulas into the following intercept-form space plane equation:

x/a + y/b + z/c = 1   formula 16

wherein a, b and c are the intercepts of the detected surface abc on the X, Y and Z axes respectively, c can be measured by the laser ranging sensor 4, and from the tangent function it can be known that

a = c/tanα,  b = c/tanβ

Substituting these into formula 16 yields the equation of the detected surface abc as follows:

x·tanα + y·tanβ + z = c   formula 17
And C3, calculating the space coordinates on the detected surface corresponding to the relative coordinates of the laser projection points by using the equation of the detected surface according to the relative position coordinates of the laser probe.
Referring to fig. 5, the emission points of the laser probes 3 are the four points o, p, q and r in the coordinate plane XOY, and the laser beams run parallel to the Z axis; the coordinates of the four points are as follows:

o(0, 0, 0), p(p, 0, 0), q(0, q, 0), r(p, q, 0)   formula 18

wherein p is the distance parameter from point p to the origin o and q is the distance parameter from point q to the origin o, so the space coordinates of the points c, e, d and f on the detected surface abc corresponding to the four points o, p, q and r are obtained by substituting formula 18 into formula 17 as follows:

c(0, 0, c)
e(p, 0, c − p·tanα)
d(0, q, c − q·tanβ)
f(p, q, c − p·tanα − q·tanβ)   formula 19

wherein α is the second included angle parameter and β is the first included angle parameter.
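The substitution of the emission points into formula 17 can be sketched as follows, assuming the laser beams run parallel to the Z axis, which is what substituting the XOY emission coordinates directly into the plane equation implies (function and key names are illustrative):

```python
import math

def projection_space_coords(p, q, c, alpha, beta):
    """Space coordinates of the laser projection points c, e, d, f on the
    detected surface x*tan(alpha) + y*tan(beta) + z = c (formula 17),
    assuming the beams from the emission points o(0,0,0), p(p,0,0),
    q(0,q,0), r(p,q,0) run parallel to the Z axis."""
    ta, tb = math.tan(alpha), math.tan(beta)

    def z(x, y):
        # solve formula 17 for z at the (x, y) of an emission point
        return c - x * ta - y * tb

    return {
        'c': (0.0, 0.0, z(0.0, 0.0)),
        'e': (p, 0.0, z(p, 0.0)),
        'd': (0.0, q, z(0.0, q)),
        'f': (p, q, z(p, q)),
    }
```

Each returned point satisfies the plane equation of the detected surface by construction.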
C4. Establish a plane rectangular coordinate system in which the coordinate points of the space coordinates are located, taking the coordinate point on the Z axis as the origin and the vector direction formed by the coordinate point on the Z axis and a coordinate point adjacent to it as the positive direction of the X axis.
And C5, calculating the plane coordinates on the detected surface corresponding to the relative position coordinates of the laser probe by utilizing a vector operation relation and a trigonometric function relation according to the plane rectangular coordinate system and the space coordinates, wherein the plane coordinates are the relative coordinates of the laser projection points.
In order to obtain the plane coordinates of the four points c, d, e and f on the detected surface abc for solving the perspective distortion inverse transformation matrix, referring to fig. 6 in combination with fig. 5, the point c may be used as the reference point and the vector ce on the plane abc as the positive X-axis direction of the plane rectangular coordinate system formed by the four points c, d, e and f, so that the following formulas can be obtained:

ce = (p, 0, −p·tanα)   formula 20

cd = (0, q, −q·tanβ)   formula 21

cf = (p, q, −p·tanα − q·tanβ)   formula 22

From formulas 20 to 22:

|ce| = p/cosα   formula 23

|cd| = q/cosβ   formula 24

cd·ce = p·q·tanα·tanβ   formula 25

and, projecting cd onto the X-axis direction ce:

x_d = (cd·ce)/|ce| = q·sinα·tanβ   formula 26

y_d = √(|cd|² − x_d²) = (q/cosβ)·√(1 − sin²α·sin²β)   formula 27

thus, projecting cf in the same way:

x_f = (cf·ce)/|ce| = p/cosα + q·sinα·tanβ   formula 28

y_f = √(|cf|² − x_f²) = (q/cosβ)·√(1 − sin²α·sin²β)   formula 29

Referring to the relative positions of the four points c, d, e and f in fig. 6, the plane coordinates of the four points on the detected surface abc can be obtained from formulas 26 to 29:

c(0, 0)
e(p/cosα, 0)
d(q·sinα·tanβ, (q/cosβ)·√(1 − sin²α·sin²β))
f(p/cosα + q·sinα·tanβ, (q/cosβ)·√(1 − sin²α·sin²β))   formula 30
wherein, as mentioned above, p is the distance parameter from the point p in the coordinate plane XOY to the origin o of the space rectangular coordinate system and q is the distance parameter from the point q to the origin o, both known quantities; α is the second included angle parameter and β is the first included angle parameter, both calculated by the corresponding formulas above. The plane coordinates of the four points c, d, e and f on the detected surface abc can therefore be obtained from formula 30, that is, the relative coordinates of the projection points of the four laser probes 3 on the detected surface.
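A sketch of the plane-coordinate step, assuming formula 30 takes the closed form obtained by projecting the vectors cd and cf onto the ce direction (the original formula images are illegible, so these expressions are a reconstruction; names are illustrative):

```python
import math

def projection_plane_coords(p, q, alpha, beta):
    """Plane coordinates of the projection points c, e, d, f in the plane
    rectangular coordinate system on the detected surface, with c as the
    origin and the vector ce as the positive X-axis direction."""
    sa, tb = math.sin(alpha), math.tan(beta)
    xd = q * sa * tb                         # projection of cd onto ce
    yd = (q / math.cos(beta)) * math.sqrt(1.0 - (sa * math.sin(beta)) ** 2)
    xe = p / math.cos(alpha)                 # length |ce|
    return {
        'c': (0.0, 0.0),
        'e': (xe, 0.0),
        'd': (xd, yd),
        'f': (xe + xd, yd),                  # the four points form a parallelogram
    }
```

A quick consistency check is that the 2-D distance from c to d equals the 3-D length |cd| = q/cosβ.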
207. Calculate the perspective distortion inverse transformation matrix by using a fourth preset formula according to the laser point relative coordinates and the laser projection point relative coordinates.

The fourth preset formula may be written in homogeneous coordinates as:

(Xn·wn, Yn·wn, wn)ᵀ = A·(xn, yn, 1)ᵀ

wherein A is the 3×3 perspective distortion inverse transformation matrix, (Xn, Yn) are the laser projection point relative coordinates, (xn, yn) are the relative coordinates of the laser points in the perspective distortion image information captured by the camera, wn is the homogeneous scale factor, and n is a positive integer greater than zero. That is, the data processing device may substitute the obtained laser projection point relative coordinates and laser point relative coordinates of the laser probes 3 into the fourth preset formula to calculate the perspective distortion inverse transformation matrix. If the number of laser probes 3 is four, n = 4.
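Solving for the matrix coefficients can be sketched as below, assuming the fourth preset formula is the standard homogeneous perspective relation with the coefficient a33 normalized to 1 (an assumption, since the matrix image is illegible); with four point pairs this gives eight linear equations in eight unknowns:

```python
import numpy as np

def perspective_inverse_matrix(laser_pts, proj_pts):
    """Solve for the 3x3 perspective distortion inverse transformation
    matrix A (with a33 fixed to 1) from four image laser points and the
    four corresponding laser projection points on the detected surface."""
    M, rhs = [], []
    for (x, y), (X, Y) in zip(laser_pts, proj_pts):
        # X*(a31*x + a32*y + 1) = a11*x + a12*y + a13, and likewise for Y
        M.append([x, y, 1.0, 0.0, 0.0, 0.0, -X * x, -X * y]); rhs.append(X)
        M.append([0.0, 0.0, 0.0, x, y, 1.0, -Y * x, -Y * y]); rhs.append(Y)
    a = np.linalg.solve(np.array(M), np.array(rhs))  # eight equations, eight unknowns
    return np.append(a, 1.0).reshape(3, 3)

def transform(A, pt):
    """Apply the matrix to one image point in homogeneous coordinates."""
    v = A @ np.array([pt[0], pt[1], 1.0])
    return (v[0] / v[2], v[1] / v[2])
```

By construction the solved matrix maps each of the four image laser points exactly onto its projection point; the same transform then corrects every other pixel of the distorted image.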
208. And correcting the perspective distortion image information into front view image information through the perspective distortion inverse transformation matrix.
209. And detecting apparent bridge diseases by utilizing the front view image information and the relative coordinates of the laser projection points.
For the embodiment of the present invention, step 209 may specifically include: calculating a distance parameter between any two coordinate points in the relative coordinates of the laser projection points, and counting the number of pixel units between those two coordinate points; dividing the distance parameter by the number of pixel units to obtain the size parameter of each pixel unit; and determining the size information of the bridge apparent disease in the front view image information according to the number of pixel units within the range of the bridge apparent disease in the front view image information and the size parameter of each pixel unit. That is, the distance parameter between any two coordinate points in the relative coordinates of the laser projection points of the laser probes 3 on the detected surface is calculated first, and the size parameter of each pixel unit is obtained by counting the number of pixel units between the two coordinate points. The accurate size information of the apparent bridge disease can then be obtained by counting the number of pixel units within the disease range in the front view image information and multiplying by the size parameter of each pixel unit, which improves the accuracy of detecting the apparent bridge disease size.
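A minimal sketch of the size calculation described above (function names and the metre units are illustrative; the 13 cm spacing in the note below comes from the rectangular probe layout given earlier):

```python
import math

def pixel_unit_size(proj_a, proj_b, n_pixels):
    """Size parameter of one pixel unit: real distance between two laser
    projection points divided by the pixel units counted between them."""
    return math.dist(proj_a, proj_b) / n_pixels

def defect_size(n_defect_pixels, unit_size):
    """Size information of an apparent disease from its pixel count and
    the size parameter of each pixel unit."""
    return n_defect_pixels * unit_size
```

For example, if the 13 cm spacing between two projection points spans 1300 pixel units in the corrected image, each pixel unit corresponds to 0.1 mm on the detected surface.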
Further, as a specific implementation of fig. 3, an embodiment of the present invention provides a device for detecting apparent bridge diseases, as shown in fig. 10, where the device includes: a first acquisition unit 31, a second acquisition unit 32, a calculation unit 33, a correction unit 34, and a detection unit 35.
The first obtaining unit 31 may be configured to obtain perspective distortion image information obtained by the camera 2, where the first obtaining unit 31 is a main functional module in the present apparatus for obtaining perspective distortion image information obtained by the camera 2.
The second obtaining unit 32 may be configured to obtain the relative coordinates of the laser point in the perspective distortion image information and the relative coordinates of a laser projection point of the laser probe, where the laser projection is projected onto the detected surface, and the second obtaining unit 32 is a core module that obtains the relative coordinates of the laser point in the perspective distortion image information and the relative coordinates of the laser projection point of the laser probe, where the laser projection is projected onto the detected surface, in the present apparatus.
The correcting unit 34 may be configured to correct the perspective distortion image information into front view image information through the perspective distortion inverse transformation matrix, and the correcting unit 34 is a main functional module in the present apparatus that corrects the perspective distortion image information into front view image information through the perspective distortion inverse transformation matrix.
The detection unit 35 may be used for detecting apparent bridge diseases by using the front view image information, and the detection unit 35 is a main functional module for detecting apparent bridge diseases by using the front view image information in the device.
For the embodiment of the present invention, the second acquisition unit 32 includes: a modeling module 321, an acquisition sub-module 322, and a first calculation sub-module 323, as shown in fig. 11.
The modeling module 321 may be configured to establish a space rectangular coordinate system in which a detected surface is located, where the detected surface intersects an X-axis positive direction, a Y-axis positive direction, and a Z-axis positive direction of the space rectangular coordinate system at a point, where the Z-axis positive direction is an optical axis direction of the lens 21.
The obtaining submodule 322 may be configured to obtain a first included angle parameter of the detected surface and the Y axis, a second included angle parameter of the detected surface and the X axis, and a relative position coordinate of the laser probe.
Further, the obtaining submodule 312 may be further specifically configured to generate a first auxiliary line and a second auxiliary line that extend from an origin of the rectangular space coordinate system to the detected surface respectively and are located in a positive direction and a negative direction of the Y axis respectively, and a third auxiliary line that extends from an intersection point of the first auxiliary line and the detected surface to the second auxiliary line, where included angles formed by the first auxiliary line and the second auxiliary line and the Z axis are equal, and the third auxiliary line is parallel to the Y axis; acquiring a third included angle parameter between the first auxiliary line and the second auxiliary line, and acquiring a first length parameter of the first auxiliary line and a second length parameter of the second auxiliary line by using a laser ranging sensor 4; calculating a fourth clamp between the intersection point connecting line of the first auxiliary line, the second auxiliary line and the detected surface and the third auxiliary line by using a first preset formula according to the first length parameter, the second length parameter and the third included angle parameter The fourth included angle parameter is the first included angle parameter; the first preset formula is:
$$\gamma=\arctan\!\left(\frac{os-ot}{os+ot}\cdot\cot\frac{\theta}{2}\right)$$
where γ is the fourth included angle parameter, os is the first length parameter, ot is the second length parameter, and θ is the third included angle parameter.
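A minimal sketch of the first preset formula, assuming the closed form γ = arctan(((os − ot)/(os + ot))·cot(θ/2)) reconstructed from the equal-angle auxiliary-line geometry (the function and argument names are illustrative, not from the patent):

```python
import math

def first_angle(os_len: float, ot_len: float, theta: float) -> float:
    """Fourth included angle parameter (radians) from the first length
    parameter os_len, the second length parameter ot_len, and the third
    included angle parameter theta, assuming
    gamma = arctan(((os - ot) / (os + ot)) * cot(theta / 2))."""
    return math.atan((os_len - ot_len) / ((os_len + ot_len) * math.tan(theta / 2.0)))
```

When the two ranging distances are equal, the detected surface is parallel to the Y axis and γ is zero, which matches the symmetry of the construction.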
In addition, the obtaining submodule 322 may be specifically configured to: generate an auxiliary Z axis that forms a fifth included angle parameter with the Z axis and lies in the coordinate plane YOZ, a fourth auxiliary line that extends from the intersection point of the detected surface and the X axis to the auxiliary Z axis, a fifth auxiliary line and a sixth auxiliary line that extend from the origin of the space rectangular coordinate system to the fourth auxiliary line and lie in the positive and negative directions of the X axis respectively, and a seventh auxiliary line that extends from the intersection point of the fifth auxiliary line and the fourth auxiliary line to the sixth auxiliary line, where the included angles formed by the fifth and sixth auxiliary lines with the auxiliary Z axis are equal and the seventh auxiliary line is parallel to the X axis; acquire a sixth included angle parameter between the fifth auxiliary line and the sixth auxiliary line, and acquire a fifth length parameter of the fifth auxiliary line and a sixth length parameter of the sixth auxiliary line by using the laser ranging sensor 4; and calculate, by using a second preset formula and according to the fifth length parameter, the sixth length parameter, and the sixth included angle parameter, a seventh included angle parameter between the seventh auxiliary line and the line connecting the intersection points of the fifth and sixth auxiliary lines with the fourth auxiliary line, where the seventh included angle parameter is equal to the second included angle parameter. The second preset formula is:
$$\omega=\arctan\!\left(\frac{oh''-ow}{oh''+ow}\cdot\cot\frac{\theta'}{2}\right)$$
where ω is the seventh included angle parameter, oh″ is the fifth length parameter, ow is the sixth length parameter, and θ′ is the sixth included angle parameter. Moreover, the obtaining submodule 322 may be further configured to: generate an eighth auxiliary line and a ninth auxiliary line that extend from the positive direction of the Y axis to the two end points of the seventh auxiliary line respectively, where the plane in which the eighth and ninth auxiliary lines lie is perpendicular to the coordinate plane YOZ; acquire the fifth included angle parameter and an eighth included angle parameter between the eighth auxiliary line and the ninth auxiliary line; and calculate the sixth included angle parameter by using a third preset formula according to the fifth included angle parameter and the eighth included angle parameter. The third preset formula is:
$$\theta'=2\arctan\!\left(\tan\frac{\theta''}{2}\cdot\cos\varepsilon\right)$$
where θ′ is the sixth included angle parameter, ε is the fifth included angle parameter, and θ″ is the eighth included angle parameter.
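A minimal sketch of the second and third preset formulas under the same reconstruction assumptions: the tilt relation tan(θ′/2) = tan(θ″/2)·cos ε for the auxiliary-Z correction, and the same arctan/cot form for ω; all names are illustrative:

```python
import math

def sixth_angle(epsilon: float, theta_pp: float) -> float:
    """Sixth included angle parameter theta' from the fifth (epsilon) and
    eighth (theta'') included angle parameters, assuming the projection
    relation tan(theta'/2) = tan(theta''/2) * cos(epsilon)."""
    return 2.0 * math.atan(math.tan(theta_pp / 2.0) * math.cos(epsilon))

def seventh_angle(oh_len: float, ow_len: float, theta_p: float) -> float:
    """Seventh included angle parameter omega from the fifth and sixth
    length parameters and the sixth included angle parameter theta',
    mirroring the form of the first preset formula."""
    return math.atan((oh_len - ow_len) / ((oh_len + ow_len) * math.tan(theta_p / 2.0)))
```

With no auxiliary-axis tilt (ε = 0) the sixth and eighth included angles coincide, and equal ranging lengths again give a zero seventh angle.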
The first calculation submodule 323 may be configured to calculate the laser projection point relative coordinates according to the first included angle parameter, the second included angle parameter, and the relative position coordinates of the laser probe.
Further, the first calculation submodule 323 may be specifically configured to: obtain, by using the laser ranging sensor 4, a distance parameter between the origin of the space rectangular coordinate system and the intersection point of the detected surface and the Z axis; calculate the equation of the detected surface from the first included angle parameter, the second included angle parameter, and the distance parameter by using trigonometric relations and the space plane equation; calculate, from the relative position coordinates of the laser probe 3 and the equation of the detected surface, the space coordinates on the detected surface corresponding to those relative position coordinates; establish a plane rectangular coordinate system in the plane of those coordinate points, taking the coordinate point on the Z axis as the origin, where the vector direction from the coordinate point on the Z axis to its adjacent coordinate point is the positive direction of the X axis; and calculate, from this plane rectangular coordinate system and the space coordinates by using vector operations and trigonometric relations, the plane coordinates on the detected surface corresponding to the relative position coordinates of the laser probe, where these plane coordinates are the laser projection point relative coordinates.
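The ray-plane step above can be sketched as follows. Modelling the detected surface as x·tan ω + y·tan γ + z = d (a plane through (0, 0, d) carrying the two measured tilt angles) is our assumption; a probe ray parallel to the Z axis then meets it at a directly computable point:

```python
import math

def plane_point(px: float, py: float, gamma: float, omega: float, d: float):
    """Space coordinates where a laser ray emitted from probe offset
    (px, py), parallel to the Z axis, meets the detected surface,
    modelled (as an assumption) as x*tan(omega) + y*tan(gamma) + z = d."""
    z = d - px * math.tan(omega) - py * math.tan(gamma)
    return (px, py, z)
```

For an untilted surface (γ = ω = 0) every probe ray meets the plane at depth d, as expected.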
The calculating unit 33 may be configured to calculate a perspective distortion inverse transformation matrix according to the laser point relative coordinates and the laser projection point relative coordinates; it is the main functional module of the present apparatus for this calculation.
For this embodiment of the present invention, the calculation unit 33 includes a second calculation submodule 331, as shown in fig. 11.
The second calculation submodule 331 may be configured to calculate the perspective distortion inverse transformation matrix according to the laser point relative coordinate and the laser projection point relative coordinate by using a fourth preset formula; wherein the fourth preset formula may be
$$\begin{bmatrix} x'_1 & x'_2 & \cdots & x'_n \\ y'_1 & y'_2 & \cdots & y'_n \\ 1 & 1 & \cdots & 1 \end{bmatrix}=\mathbf{A}\begin{bmatrix} x_1 & x_2 & \cdots & x_n \\ y_1 & y_2 & \cdots & y_n \\ 1 & 1 & \cdots & 1 \end{bmatrix}$$

where $\mathbf{A}$ is the perspective distortion inverse transformation matrix, the left-hand side is the coordinate matrix of the laser projection point relative coordinates, the right-hand side is the coordinate matrix of the laser point relative coordinates taken by the camera in the perspective distortion image information, and n is a positive integer greater than zero.
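One standard way to realise the fourth preset formula is a direct linear transform over the four laser point correspondences. The sketch below (pure Python; the normalisation h33 = 1 and all names are assumptions, not the patent's prescribed procedure) solves the resulting 8×8 linear system:

```python
def solve_linear(a, b):
    """Gaussian elimination with partial pivoting for a small dense system."""
    n = len(a)
    m = [row[:] + [b[i]] for i, row in enumerate(a)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(m[r][col]))
        m[col], m[piv] = m[piv], m[col]
        for r in range(col + 1, n):
            f = m[r][col] / m[col][col]
            for c in range(col, n + 1):
                m[r][c] -= f * m[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (m[r][n] - sum(m[r][c] * x[c] for c in range(r + 1, n))) / m[r][r]
    return x

def inverse_perspective_matrix(src, dst):
    """3x3 matrix A (with A[2][2] fixed to 1) mapping each distorted-image
    laser point (x, y) in `src` to its projection point (u, v) in `dst`,
    solved from four correspondences by the direct linear transform."""
    rows, rhs = [], []
    for (x, y), (u, v) in zip(src, dst):
        rows.append([x, y, 1, 0, 0, 0, -x * u, -y * u]); rhs.append(u)
        rows.append([0, 0, 0, x, y, 1, -x * v, -y * v]); rhs.append(v)
    h = solve_linear(rows, rhs) + [1.0]
    return [h[0:3], h[3:6], h[6:9]]
```

With the four points mapped to themselves the solver returns the identity matrix, and a pure scaling of the points yields the corresponding diagonal matrix.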
For this embodiment of the present invention, the detection unit 35 includes a detection submodule 351, as shown in fig. 11.
The detection submodule 351 can be used for detecting apparent diseases of the bridge by utilizing the front-view image information and the relative coordinates of the laser projection points.
It should be noted that, other corresponding descriptions of each functional module related to the detection device for apparent bridge diseases provided by the embodiment of the present invention may refer to corresponding descriptions of the method shown in fig. 1, and are not repeated herein.
Based on the above method as shown in fig. 1, an embodiment of the present invention correspondingly further provides a computer-readable storage medium on which a computer program is stored; when the program is executed by a processor, the following steps are implemented: obtaining the perspective distortion image information shot by the camera 2; acquiring the relative coordinates of the laser points in the perspective distortion image information and the relative coordinates of the laser projection points where the laser of the laser probe is projected onto the detected surface; calculating a perspective distortion inverse transformation matrix according to the laser point relative coordinates and the laser projection point relative coordinates; and correcting the perspective distortion image information into front view image information through the perspective distortion inverse transformation matrix and detecting apparent bridge diseases by utilizing the front view image information.
Based on the embodiment of the method shown in fig. 1, an embodiment of the present invention further provides a physical structure of a device for detecting apparent bridge diseases, which comprises a processor, a memory, and a computer program stored on the memory and executable on the processor, the processor implementing the following steps: obtaining perspective distortion image information with laser projection points shot by the camera 2; acquiring the relative coordinates of the laser points in the perspective distortion image information and the relative coordinates of the laser projection points where the laser of the laser probe 3 is projected onto the detected surface; calculating a perspective distortion inverse transformation matrix according to the laser point relative coordinates and the laser projection point relative coordinates; and correcting the perspective distortion image information into front view image information through the perspective distortion inverse transformation matrix and detecting apparent bridge diseases by utilizing the front view image information.
According to the technical scheme, the camera in the detection system captures perspective distortion image information containing the laser points emitted by the laser probes. The data processing device acquires this image information, obtains the relative coordinates of the laser points in it and the relative coordinates of the laser projection points where the laser of the laser probe is projected onto the detected surface, and calculates a perspective distortion inverse transformation matrix from these two sets of coordinates. The data processing device can then use this matrix to correct the perspective distortion image information into front view image information, so that disease detection is carried out on the front view image information.
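As a usage sketch of the correction step, mapping one pixel of the distorted image into the front view amounts to a homogeneous matrix-vector product followed by a divide. The nested-list matrix layout with h33 = 1 is an assumption; a full implementation would inverse-map every pixel of the front view image with interpolation:

```python
def correct_point(a, x, y):
    """Map a pixel (x, y) of the perspective-distorted image into the
    front view image using a 3x3 inverse transformation matrix `a`
    (nested lists), including the homogeneous divide."""
    u = a[0][0] * x + a[0][1] * y + a[0][2]
    v = a[1][0] * x + a[1][1] * y + a[1][2]
    w = a[2][0] * x + a[2][1] * y + a[2][2]
    return (u / w, v / w)
```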
In the foregoing embodiments, the descriptions of the embodiments are emphasized, and for parts of one embodiment that are not described in detail, reference may be made to related descriptions of other embodiments.
It will be appreciated that the relevant features of the methods and apparatus described above may reference one another. In addition, "first", "second", and the like in the above embodiments merely distinguish the embodiments and do not imply that any embodiment is better or worse than another.
It will be clear to those skilled in the art that, for convenience and brevity of description, specific working procedures of the above-described systems, apparatuses and units may refer to corresponding procedures in the foregoing method embodiments, which are not repeated herein.
The algorithms and displays presented herein are not inherently related to any particular computer, virtual system, or other apparatus. Various general-purpose systems may also be used with the teachings herein. The structure required to construct such a system is apparent from the description above. Moreover, the present invention is not directed to any particular programming language. It will be appreciated that the teachings of the invention described herein may be implemented in a variety of programming languages, and the above description of specific languages is provided to disclose the enablement and best mode of the present invention.
In the description provided herein, numerous specific details are set forth. However, it is understood that embodiments of the invention may be practiced without these specific details. In some instances, well-known methods, structures and techniques have not been shown in detail in order not to obscure an understanding of this description.
Similarly, it should be appreciated that in the foregoing description of exemplary embodiments of the invention, various features of the invention are sometimes grouped together in a single embodiment, figure, or description thereof for the purpose of streamlining the disclosure and aiding the understanding of one or more of the various inventive aspects. However, the disclosed method should not be construed as reflecting an intention that the claimed invention requires more features than are expressly recited in each claim. Rather, as the following claims reflect, inventive aspects lie in less than all features of a single foregoing disclosed embodiment. Thus, the claims following the detailed description are hereby expressly incorporated into this detailed description, with each claim standing on its own as a separate embodiment of this invention.
Those skilled in the art will appreciate that the modules in the apparatus of the embodiments may be adaptively changed and disposed in one or more apparatuses different from the embodiments. The modules or units or components of the embodiments may be combined into one module or unit or component and, furthermore, they may be divided into a plurality of sub-modules or sub-units or sub-components. Any combination of all features disclosed in this specification (including any accompanying claims, abstract and drawings), and all of the processes or units of any method or apparatus so disclosed, may be used in combination, except insofar as at least some of such features and/or processes or units are mutually exclusive. Each feature disclosed in this specification (including any accompanying claims, abstract and drawings), may be replaced by alternative features serving the same, equivalent or similar purpose, unless expressly stated otherwise.
Furthermore, those skilled in the art will appreciate that while some embodiments described herein include some features but not others included in other embodiments, combinations of features of different embodiments are meant to be within the scope of the invention and form different embodiments. For example, in the following claims, any of the claimed embodiments can be used in any combination.
Various component embodiments of the invention may be implemented in hardware, or in software modules running on one or more processors, or in a combination thereof. Those skilled in the art will appreciate that some or all of the functions of some or all of the components in accordance with embodiments of the present invention may be implemented in practice using a microprocessor or Digital Signal Processor (DSP). The present invention can also be implemented as an apparatus or device program (e.g., a computer program and a computer program product) for performing a portion or all of the methods described herein. Such a program embodying the present invention may be stored on a computer readable medium, or may have the form of one or more signals. Such signals may be downloaded from an internet website, provided on a carrier signal, or provided in any other form.
It should be noted that the above-mentioned embodiments illustrate rather than limit the invention, and that those skilled in the art will be able to design alternative embodiments without departing from the scope of the appended claims. In the claims, any reference signs placed between parentheses shall not be construed as limiting the claim. The word "comprising" does not exclude the presence of elements or steps not listed in a claim. The word "a" or "an" preceding an element does not exclude the presence of a plurality of such elements. The invention may be implemented by means of hardware comprising several distinct elements, and by means of a suitably programmed computer. In the unit claims enumerating several means, several of these means may be embodied by one and the same item of hardware. The use of the words first, second, third, etc. does not denote any order; these words may be interpreted as names.

Claims (8)

1. A bridge apparent disease detection system, characterized by comprising:
a double-shaft motion mechanism comprising a horizontal rotation part and a vertical rotation part arranged on the horizontal rotation part;
a camera arranged on the vertical rotation part, the camera comprising a lens, at least four laser probes arranged circumferentially in an array around the periphery of the lens, and a laser ranging sensor arranged on the lens, wherein the laser emission directions of the four laser probes and of the laser ranging sensor are each parallel to the optical axis direction of the lens and located on the same side, the camera being used for capturing image information with four laser probe projection points; and
a data processing device electrically connected to the double-shaft motion mechanism, the camera, and the laser ranging sensor respectively, for acquiring and correspondingly processing the image information shot by the camera, the distance information measured by the laser ranging sensor, and the motion information of the double-shaft motion mechanism;
the method for detecting the apparent bridge diseases comprises the following steps:
the camera shoots perspective distortion image information with laser points sent by the laser probe;
the data processing device acquires the perspective distortion image information shot by the camera;
the data processing device acquires the relative coordinates of the laser points in the perspective distortion image information and the relative coordinates of the laser projection points where the laser of the laser probe is projected onto the detected surface;
the data processing device calculates a perspective distortion inverse transformation matrix according to the laser point relative coordinates and the laser projection point relative coordinates;
the data processing device corrects the perspective distortion image information into front view image information through the perspective distortion inverse transformation matrix, and performs bridge apparent disease detection by utilizing the front view image information;
wherein the data processing device obtaining the relative coordinates of the laser projection points of the laser probe projected onto the detected surface comprises:
establishing a space rectangular coordinate system in which the detected surface is located, wherein the detected surface intersects the positive X-axis, the positive Y-axis, and the positive Z-axis of the coordinate system each at one point, and the positive Z-axis direction is the optical axis direction of the lens;
acquiring a first included angle parameter of the detected surface and the Y axis, a second included angle parameter of the detected surface and the X axis and a relative position coordinate of the laser probe;
and calculating the relative coordinates of the laser projection points according to the first included angle parameter, the second included angle parameter and the relative position coordinates of the laser probe.
2. The system for detecting apparent bridge diseases according to claim 1, wherein obtaining the first included angle parameter between the detected surface and the Y axis comprises:
generating a first auxiliary line and a second auxiliary line which extend from the origin of the space rectangular coordinate system to the detected surface respectively and are positioned in the positive direction and the negative direction of the Y axis respectively, and a third auxiliary line which extends from the intersection point of the first auxiliary line and the detected surface to the second auxiliary line, wherein the included angles formed by the first auxiliary line and the second auxiliary line respectively with the Z axis are equal, and the third auxiliary line is parallel to the Y axis;
acquiring a third included angle parameter between the first auxiliary line and the second auxiliary line, and acquiring a first length parameter of the first auxiliary line and a second length parameter of the second auxiliary line by using the laser ranging sensor;
calculating, by using a first preset formula and according to the first length parameter, the second length parameter, and the third included angle parameter, a fourth included angle parameter between the third auxiliary line and the line connecting the intersection points of the first and second auxiliary lines with the detected surface, wherein the fourth included angle parameter is the first included angle parameter;
the first preset formula is:
$$\gamma=\arctan\!\left(\frac{os-ot}{os+ot}\cdot\cot\frac{\theta}{2}\right)$$
wherein γ is the fourth included angle parameter, os is the first length parameter, ot is the second length parameter, and θ is the third included angle parameter.
3. The system for detecting apparent bridge diseases according to claim 1, wherein obtaining the second included angle parameter between the detected surface and the X axis comprises:
generating an auxiliary Z axis that forms a fifth included angle parameter with the Z axis and lies in the coordinate plane YOZ, a fourth auxiliary line that extends from the intersection point of the detected surface and the X axis to the auxiliary Z axis, a fifth auxiliary line and a sixth auxiliary line that extend from the origin of the space rectangular coordinate system to the fourth auxiliary line and lie in the positive and negative directions of the X axis respectively, and a seventh auxiliary line that extends from the intersection point of the fifth auxiliary line and the fourth auxiliary line to the sixth auxiliary line, wherein the included angles formed by the fifth and sixth auxiliary lines with the auxiliary Z axis are equal, and the seventh auxiliary line is parallel to the X axis;
acquiring a sixth included angle parameter between the fifth auxiliary line and the sixth auxiliary line, and acquiring a fifth length parameter of the fifth auxiliary line and a sixth length parameter of the sixth auxiliary line by using the laser ranging sensor;
calculating, by using a second preset formula and according to the fifth length parameter, the sixth length parameter, and the sixth included angle parameter, a seventh included angle parameter between the seventh auxiliary line and the line connecting the intersection points of the fifth and sixth auxiliary lines with the fourth auxiliary line, wherein the seventh included angle parameter is equal to the second included angle parameter;
the second preset formula is:
$$\omega=\arctan\!\left(\frac{oh''-ow}{oh''+ow}\cdot\cot\frac{\theta'}{2}\right)$$
wherein ω is the seventh included angle parameter, oh″ is the fifth length parameter, ow is the sixth length parameter, and θ′ is the sixth included angle parameter.
4. The bridge apparent disease detection system according to claim 3, wherein obtaining the sixth included angle parameter between the fifth auxiliary line and the sixth auxiliary line comprises:
generating an eighth auxiliary line and a ninth auxiliary line that extend from the positive direction of the Y axis to the two end points of the seventh auxiliary line respectively, wherein the plane in which the eighth and ninth auxiliary lines lie is perpendicular to the coordinate plane YOZ;
acquiring the fifth included angle parameter, and acquiring an eighth included angle parameter between the eighth auxiliary line and the ninth auxiliary line;
calculating the sixth included angle parameter by using a third preset formula according to the fifth included angle parameter and the eighth included angle parameter;
the third preset formula is:
$$\theta'=2\arctan\!\left(\tan\frac{\theta''}{2}\cdot\cos\varepsilon\right)$$
wherein θ′ is the sixth included angle parameter, ε is the fifth included angle parameter, and θ″ is the eighth included angle parameter.
5. The system for detecting apparent bridge diseases according to claim 1, wherein calculating the relative coordinates of the laser projection points according to the first included angle parameter, the second included angle parameter, and the relative position coordinates of the laser probe comprises:
acquiring a distance parameter between the origin of the space rectangular coordinate system and the intersection point of the detected surface and the Z axis by using the laser ranging sensor;
calculating an equation of the detected surface by utilizing a trigonometric function relation and a space plane equation according to the first included angle parameter, the second included angle parameter and the distance parameter;
calculating, according to the relative position coordinates of the laser probe and by using the equation of the detected surface, the space coordinates on the detected surface corresponding to those relative position coordinates;
establishing a plane rectangular coordinate system in the plane of those coordinate points, taking the coordinate point on the Z axis as the origin, wherein the vector direction from the coordinate point on the Z axis to its adjacent coordinate point is the positive direction of the X axis;
and calculating the plane coordinates on the detected surface corresponding to the relative position coordinates of the laser probe by utilizing a vector operation relation and a trigonometric function relation according to the plane rectangular coordinate system and the space coordinates, wherein the plane coordinates are the relative coordinates of the laser projection points.
6. The system for detecting apparent bridge diseases according to claim 1, wherein the data processing device calculating a perspective distortion inverse transformation matrix according to the laser point relative coordinates and the laser projection point relative coordinates comprises:
calculating the perspective distortion inverse transformation matrix by using a fourth preset formula according to the laser point relative coordinates and the laser projection point relative coordinates;
the fourth preset formula is:

$$\begin{bmatrix} x'_1 & x'_2 & \cdots & x'_n \\ y'_1 & y'_2 & \cdots & y'_n \\ 1 & 1 & \cdots & 1 \end{bmatrix}=\mathbf{A}\begin{bmatrix} x_1 & x_2 & \cdots & x_n \\ y_1 & y_2 & \cdots & y_n \\ 1 & 1 & \cdots & 1 \end{bmatrix}$$

wherein $\mathbf{A}$ is the perspective distortion inverse transformation matrix, the left-hand side is the coordinate matrix of the laser projection point relative coordinates, the right-hand side is the coordinate matrix of the laser point relative coordinates taken by the camera in the perspective distortion image information, and n is a positive integer greater than zero.
7. The bridge apparent disease detection system according to claim 1, wherein the data processing device correcting the perspective distortion image information into front view image information through the perspective distortion inverse transformation matrix and performing bridge apparent disease detection using the front view image information comprises:
correcting the perspective distortion image information into front view image information through the perspective distortion inverse transformation matrix, and detecting apparent bridge diseases by utilizing the front view image information and the relative coordinates of the laser projection points.
8. The bridge apparent disease detection system according to claim 7, wherein performing bridge apparent disease detection using the front view image information and the laser projection point relative coordinates comprises:
calculating a distance parameter between any two coordinate points in the relative coordinates of the laser projection points, and counting the number parameter of pixel units between those two coordinate points;
dividing the distance parameter by the number parameter of the pixel units to obtain the size parameter of each pixel unit;
and determining the size information of the bridge apparent disease in the front view image information according to the number of pixel units within the range of the bridge apparent disease in the front view image information and the size parameter of each pixel unit.
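The pixel-scale computation of claim 8 reduces to a ratio; a minimal sketch (function and argument names are illustrative):

```python
import math

def defect_size(p1, p2, pixel_count, defect_pixels):
    """Size of a disease region: the physical distance between two laser
    projection points divided by the number of pixel units between them
    gives the size of one pixel unit; multiplying by the number of pixel
    units spanned by the disease gives its size."""
    unit = math.dist(p1, p2) / pixel_count
    return unit * defect_pixels
```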
CN201810971346.4A 2018-08-24 2018-08-24 Bridge apparent disease detection system and detection method thereof Active CN109406525B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810971346.4A CN109406525B (en) 2018-08-24 2018-08-24 Bridge apparent disease detection system and detection method thereof

Publications (2)

Publication Number Publication Date
CN109406525A CN109406525A (en) 2019-03-01
CN109406525B true CN109406525B (en) 2023-06-16

Family

ID=65464474

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810971346.4A Active CN109406525B (en) 2018-08-24 2018-08-24 Bridge apparent disease detection system and detection method thereof

Country Status (1)

Country Link
CN (1) CN109406525B (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112329531B (en) * 2020-09-30 2023-04-07 山东大学 Linear array binocular imaging system for pipe gallery apparent disease detection and working method
CN116295020B (en) * 2023-05-22 2023-08-08 山东高速工程检测有限公司 Bridge disease positioning method and device
CN117030585B (en) * 2023-08-08 2024-05-10 重庆阿泰可科技股份有限公司 Accelerated aging test device based on mirror system and mirror calibration method thereof

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1975324A (en) * 2006-12-20 2007-06-06 北京航空航天大学 Double-sensor laser visual measuring system calibrating method
CN105798909A (en) * 2016-04-29 2016-07-27 上海交通大学 Calibration system and method of zero position of robot based on laser and vision
CN106127745A (en) * 2016-06-17 2016-11-16 凌云光技术集团有限责任公司 The combined calibrating method of structure light 3 D visual system and line-scan digital camera and device
CN106645205A (en) * 2017-02-24 2017-05-10 武汉大学 Unmanned aerial vehicle bridge bottom surface crack detection method and system
CN106677037A (en) * 2016-11-21 2017-05-17 同济大学 Portable asphalt pavement disease detection method and device based on machine vision
CN106971408A (en) * 2017-03-24 2017-07-21 大连理工大学 A kind of camera marking method based on space-time conversion thought
CN107063129A (en) * 2017-05-25 2017-08-18 西安知象光电科技有限公司 A kind of array parallel laser projection three-dimensional scan method
CN206563562U (en) * 2016-11-10 2017-10-17 华南理工大学 A kind of three-dimensionalreconstruction device of double-rotating laser
CN108106801A (en) * 2017-11-15 2018-06-01 温州市交通工程试验检测有限公司 Bridge tunnel disease non-contact detection system and detection method

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Liu Guohua. HALCON Digital Image Processing. Xidian University Press, 2018 (1st ed.), pp. 128-131. *
Research on a mobile rapid cantilever-type intelligent video detection system for bridge safety; Xiao Changli; Li Xiaozhong; Ji Meng'en; China Highway, No. 01, pp. 134-135 *

Also Published As

Publication number Publication date
CN109406525A (en) 2019-03-01

Similar Documents

Publication Publication Date Title
CN109444163B (en) System for obtaining perspective distortion inverse transformation matrix
CN113532311A (en) Point cloud splicing method, device, equipment and storage medium
CN104154875B (en) Three-dimensional data acquisition system and acquisition method based on two-axis rotation platform
CN109406525B (en) Bridge apparent disease detection system and detection method thereof
CN111220130B (en) Focusing measurement method and terminal capable of measuring object at any position in space
WO2022111105A1 (en) Intelligent visual 3d information acquisition apparatus with free posture
CN109087355B (en) Monocular camera pose measuring device and method based on iterative updating
CN113409285A (en) Method and system for monitoring three-dimensional deformation of immersed tunnel joint
CN110470320B (en) Calibration method of swinging scanning type line structured light measurement system and terminal equipment
JP7185860B2 (en) Calibration method for a multi-axis movable vision system
CN109215086A (en) Camera extrinsic scaling method, equipment and system
JP3690581B2 (en) Position detection device and method therefor, plane position detection device and method thereof
WO2022078418A1 (en) Intelligent three-dimensional information acquisition apparatus capable of stable rotation
JP2001148025A5 (en)
CN112361962B (en) Multi-pitch-angle intelligent visual 3D information acquisition device
JPWO2018043524A1 (en) Robot system, robot system control apparatus, and robot system control method
CN112257537A (en) Intelligent multi-point three-dimensional information acquisition equipment
CN112254638B (en) Pitch-adjustable intelligent visual 3D information acquisition device
JP4077755B2 (en) Position detection method, device thereof, program thereof, and calibration information generation method
JP3754340B2 (en) Position detection device
CN112815832B (en) Measuring camera coordinate system calculation method based on 3D target
CN112254669B (en) Multi-offset-angle intelligent visual 3D information acquisition device
WO2022078444A1 (en) Program control method for 3d information acquisition
CN112304250B (en) Three-dimensional matching equipment and method between moving objects
CN209387548U (en) System for obtaining perspective distortion inverse transformation matrix

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant