CN116823735B - Weld polishing feature extraction method, weld detection and polishing method - Google Patents
- Publication number
- CN116823735B CN116823735B CN202310645680.1A CN202310645680A CN116823735B CN 116823735 B CN116823735 B CN 116823735B CN 202310645680 A CN202310645680 A CN 202310645680A CN 116823735 B CN116823735 B CN 116823735B
- Authority
- CN
- China
- Prior art keywords
- point cloud
- weld
- cloud data
- data set
- polishing
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/0002—Inspection of images, e.g. flaw detection
- G06T7/0004—Industrial image inspection
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q10/00—Administration; Management
- G06Q10/06—Resources, workflows, human or project management; Enterprise or organisation planning; Enterprise or organisation modelling
- G06Q10/063—Operations research, analysis or management
- G06Q10/0633—Workflow analysis
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q50/00—Information and communication technology [ICT] specially adapted for implementation of business processes of specific business sectors, e.g. utilities or tourism
- G06Q50/04—Manufacturing
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/10—Segmentation; Edge detection
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/80—Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10028—Range image; Depth image; 3D point clouds
-
- Y—GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y02—TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
- Y02P—CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
- Y02P90/00—Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
- Y02P90/30—Computing systems specially adapted for manufacturing
Abstract
The invention relates to a weld polishing feature extraction method and a weld detection and polishing method, comprising the following steps: step one, photographing a workpiece with a 3D camera to acquire its point cloud information; step two, extracting weld polishing features by the weld polishing feature extraction method; step three, generating a weld polishing path from the weld polishing features, and polishing the workpiece with a robot along the polishing path; step four, repeating steps one and two to obtain the residual-height features of the polished weld and performing weld forming detection: if the detection passes, the polishing workflow is finished; if it does not pass, return to step three and continue polishing. The invention realizes efficient and accurate weld polishing feature extraction, greatly improves the accuracy of weld feature extraction while shortening the extraction time, and can be used to realize efficient automatic robot weld polishing and efficient polished-weld forming detection.
Description
Technical Field
The invention relates to the technical field of welding, and in particular to a weld polishing feature extraction method and a weld detection and polishing method.
Background
As manufacturing gradually develops towards automation and intelligence, industrial robots play an important role in more and more fields. Industrial robots can replace manual labor in machining processes such as welding, polishing and casting, substituting for traditional manual production in high-noise, high-pollution environments, greatly improving processing efficiency and reducing labor cost. Adding visual sensing, acoustic sensing and other modalities to the robot body for multi-sensor information fusion greatly improves the intelligence of industrial robots, enabling them to handle a variety of complex working conditions much as a human operator would.
Traditional manual polishing is inefficient and unstable in quality, and can seriously harm workers' physical and mental health, so replacing manual work with automatic polishing is of great significance. At present, multi-axis computer numerical control (CNC) machine tools are still the most widely used equipment in automatic polishing, but they are expensive, poorly extensible and inflexible, and cannot be used for polishing large-scale welds. Compared with multi-axis CNC machine tools, industrial robots offer a wider, more extensible working space at a lower price, and are therefore more promising in the polishing field, with especially obvious advantages for large workpieces. Robotic machining can optimize operation parameters in real time according to a process knowledge model and multi-sensor feedback, realizing active in-process control of the equipment.
The forming of a polished weld generally places requirements on the weld residual height. The traditional forming detection method for polished welds is manual measurement, which is time-consuming and imprecise; a profilometer can also be used to inspect the polished weld surface, but only on workpieces of smaller size. Vision is one of the important sensing modalities in automatic robot polishing: a 3D camera can three-dimensionally reconstruct the polished workpiece and extract the key feature points of the weld, which can then be used to plan the subsequent polishing path, detect the polished weld forming, and judge whether it is qualified. The vision sensor commonly used at present is the line-laser 3D camera, which has high reconstruction precision but low three-dimensional reconstruction efficiency and speed, hindering the efficient implementation of an automatic polishing process.
Disclosure of Invention
Therefore, the technical problem to be solved by the invention is to overcome the defects of the prior art that three-dimensional reconstruction in the weld polishing process is inefficient and that weld features cannot be extracted accurately.
In order to solve the above technical problems, the invention provides a weld polishing feature extraction method comprising the following steps:
S1, acquiring point cloud information of a workpiece to be detected, wherein the workpiece to be detected is colored with a dye penetrant inspection agent and the point cloud information is collected by an infrared structured-light 3D camera;
S2, processing the original point cloud information of the workpiece to be detected to obtain the weld polishing features, comprising:
S21, segmenting out a target weld plate point cloud data set P_B by processing the point cloud data set P with Euclidean clustering;
S22, fitting the weld plate plane with the RANSAC algorithm based on the target weld plate point cloud data set P_B, obtaining the point cloud data set P_S on the weld plate plane;
S23, subtracting the point cloud data set P_S on the weld plate plane from the target weld plate point cloud data set P_B to obtain the weld point cloud data set P_W;
S24, performing Euclidean cluster segmentation on the weld point cloud data set P_W, dividing it into n independent weld bead point cloud data sets P_Wi;
S25, extracting the direction vectors of the points in each weld bead point cloud data set P_Wi;
S26, setting a lower threshold D_1 and an upper threshold D_2 on the weld bead direction vectors, and extracting the weld bead direction t_i of the weld bead point cloud data set P_Wi;
S27, converting the weld bead direction t_i to the X-axis direction to obtain a rotation matrix R_i, and multiplying the weld bead point cloud data set by the rotation matrix R_i to obtain the rotated weld bead point cloud data set;
S28, slicing the rotated weld bead point cloud data set along the X-axis direction to obtain multiple weld bead slice point cloud data sets P_sliceij;
S29, extracting, along the Z-axis direction, the highest point, the slice weld height and the slice weld width of each weld slice point cloud data set P_sliceij, completing the weld feature extraction.
Preferably, before the step S21, the method further includes:
performing voxel downsampling on the point cloud information to obtain the preprocessed point cloud data set P.
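The voxel downsampling above can be sketched in NumPy as follows; this is a minimal illustration (the function name and the per-voxel averaging strategy are assumptions, since the patent does not specify an implementation):

```python
import numpy as np

def voxel_downsample(points: np.ndarray, voxel_size: float) -> np.ndarray:
    """Replace all points falling into the same cubic voxel by their average."""
    idx = np.floor(points / voxel_size).astype(np.int64)      # voxel index per point
    _, inverse = np.unique(idx, axis=0, return_inverse=True)  # group points by voxel
    n_voxels = int(inverse.max()) + 1
    sums = np.zeros((n_voxels, 3))
    counts = np.zeros(n_voxels)
    np.add.at(sums, inverse, points)                          # accumulate per voxel
    np.add.at(counts, inverse, 1)
    return sums / counts[:, None]                             # voxel centroids
```

A larger voxel size yields a sparser preprocessed set P and faster clustering downstream, at the cost of geometric detail.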
Preferably, the step S21 includes:
determining a query point p_i, setting a distance threshold r, and calculating the distance d between any two points in the point cloud data set by the following formula:
d = √( (p_i1 − p_j1)² + (p_i2 − p_j2)² + (p_i3 − p_j3)² ),
wherein p_i and p_j denote two points in the point cloud and p_ik and p_jk denote their k-th coordinate components;
finding the n neighboring points p_j (j = 1, 2, …, n) nearest to the query point p_i with a KD-tree, and calculating the Euclidean distances d_j from these n neighboring points to the query point p_i by the above formula;
comparing each distance d_j with the distance threshold r, adding the points with d_j smaller than r to the class M, and repeating until the number of points in M no longer increases, completing the segmentation.
Preferably, the step S24 includes:
performing Euclidean cluster segmentation on the weld point cloud data set P_W, setting a minimum number κ of points per cluster, and keeping a segmented point cloud data set P_Wi only when its number of points is greater than or equal to κ, otherwise deleting it, wherein i = 1, 2, 3, …, n and n is a natural number greater than 3.
Preferably, the step S25 includes:
S251, recording that the weld bead point cloud data set P_Wi contains m points, and selecting q points around one point P_Wij (j = 1, 2, 3, …, m) of the weld bead point cloud data set P_Wi;
S252, calculating the point cloud centroid C_Wij of the q+1 points by the following formula:
C_Wij = (1/(q+1)) · Σ P, the sum running over the q+1 selected points;
S253, performing decentering, subtracting the centroid C_Wij from each of the q+1 points in turn to obtain the decentered point cloud position matrix deM_ij;
S254, constructing the covariance matrix Cov_ij = deM_ij^T · deM_ij;
S255, performing singular value decomposition of the covariance matrix Cov_ij, the singular vector (u_ij, v_ij, w_ij) corresponding to the smallest singular value being the normal vector of the point P_Wij;
S256, judging whether j equals m; if j is smaller than m, letting j = j + 1 and returning to S251;
S257, when j equals m, obtaining the direction vectors of all points in the weld bead point cloud data set P_Wi.
Preferably, the step S26 includes:
S261, letting the fitted weld plate plane expression be A_1·x + B_1·y + C_1·z = D_1, recording that the weld bead point cloud data set P_Wi contains m points, and calculating the cosine cos Φ_ij of the angle between the direction vector (u_ij, v_ij, w_ij) of a point P_Wij of the weld bead point cloud data set P_Wi and the normal of the weld plate plane by the following formula:
cos Φ_ij = |A_1·u_ij + B_1·v_ij + C_1·w_ij| / ( √(A_1² + B_1² + C_1²) · √(u_ij² + v_ij² + w_ij²) ),
wherein j = 1, 2, 3, …, m;
S262, creating an array D_Wi to store the weld-bead-direction point cloud data of the i-th weld bead; when D_1 ≤ cos Φ_ij ≤ D_2, putting the point P_Wij into the array D_Wi, otherwise continuing to the next step;
S263, judging whether j equals m; if j is smaller than m, letting j = j + 1 and returning to S261;
S264, when j equals m, obtaining the weld-bead-direction point cloud data set D_Wi;
S265, calculating the point cloud centroid C_Wi of the weld-bead-direction point cloud data set D_Wi, its coordinates denoted (a_i, b_i, c_i);
S266, performing decentering, subtracting the centroid C_Wi from each point in D_Wi in turn to obtain the decentered point cloud position matrix deM_i;
S267, performing singular value decomposition of the position matrix deM_i to obtain the matrices U_i, S_i and V_i;
S268, the right singular vector corresponding to the largest singular value being the direction vector of the fitted line, and the point cloud centroid being a point on the fitted line, namely:
(x_i − a_i)/l_i = (y_i − b_i)/m_i = (z_i − c_i)/n_i,
wherein (x_i, y_i, z_i) is a point on the fitted spatial line and (l_i, m_i, n_i) denotes the weld bead direction t_i of the fitted line.
Preferably, the step S27 includes:
normalizing the weld bead direction t_i = (l_i, m_i, n_i) into a unit vector;
denoting the unit vector of the target X-axis direction as e_x = (1, 0, 0);
when t_i is parallel to e_x, the rotation is trivial (the identity quaternion for t_i = e_x, and a 180° rotation about an axis perpendicular to e_x for t_i = −e_x); otherwise taking the quaternion
q_i = (1 + t_i·e_x, t_i × e_x), normalized to unit length,
whose scalar part is 1 + t_i·e_x and whose vector part is the cross product t_i × e_x;
constructing the rotation matrix R_i from the quaternion q_i;
transforming the weld bead point cloud data set P_Wi so that the weld bead direction is parallel to the X axis, namely the rotated point cloud data set P_Wi = R_i · P_Wi^T.
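The quaternion construction of S27 can be sketched as below. This is a standard shortest-arc quaternion and its rotation matrix; the function name and the degenerate-case handling (identity when aligned, 180° about Z when anti-aligned) are illustrative assumptions, since the original formulas were lost in extraction:

```python
import numpy as np

def rotation_to_x_axis(direction: np.ndarray) -> np.ndarray:
    """Rotation matrix R_i sending the unit bead direction t_i onto the +X axis."""
    t = direction / np.linalg.norm(direction)
    ex = np.array([1.0, 0.0, 0.0])
    d = float(np.dot(t, ex))
    if np.isclose(d, 1.0):               # already aligned: identity rotation
        return np.eye(3)
    if np.isclose(d, -1.0):              # opposite: 180 degrees about the Z axis
        w, x, y, z = 0.0, 0.0, 0.0, 1.0
    else:                                # shortest arc: q = (1 + t.ex, t x ex), normalized
        q = np.array([1.0 + d, *np.cross(t, ex)])
        w, x, y, z = q / np.linalg.norm(q)
    # quaternion -> rotation matrix
    return np.array([
        [1 - 2*(y*y + z*z), 2*(x*y - w*z),     2*(x*z + w*y)],
        [2*(x*y + w*z),     1 - 2*(x*x + z*z), 2*(y*z - w*x)],
        [2*(x*z - w*y),     2*(y*z + w*x),     1 - 2*(x*x + y*y)],
    ])
```

Applying the returned matrix to the bead direction yields (1, 0, 0), so after `R_i @ P_Wi.T` the bead runs along the X axis and can be sliced by X coordinate.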
Preferably, the step S28 includes:
S281, setting the spacing d between slice point clouds and the slice width f, selecting a point cloud slab of width f every distance d, the X-coordinate range of the point cloud P_Wi being X_mini to X_maxi, wherein i denotes the i-th weld bead, j denotes the j-th point cloud slice, and j starts from 0;
S282, letting j = j + 1 and judging whether X_mini + j·f + (j−1)·d is greater than X_maxi; when X_mini + j·f + (j−1)·d is less than or equal to X_maxi, selecting the points in the coordinate range X_mini + (j−1)·f + (j−1)·d to X_mini + j·f + (j−1)·d as the slice point cloud data P_sliceij;
when X_mini + j·f + (j−1)·d is greater than X_maxi, ending this step.
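The slicing loop of S281–S282 can be sketched as follows (the function name is illustrative; the slab bounds follow the ranges given in the text):

```python
import numpy as np

def slice_bead(points: np.ndarray, d: float, f: float) -> list[np.ndarray]:
    """Cut a rotated bead cloud into slabs of width f spaced d apart along X."""
    x_min, x_max = points[:, 0].min(), points[:, 0].max()
    slices, j = [], 1
    while x_min + j * f + (j - 1) * d <= x_max:   # stop past X_maxi
        lo = x_min + (j - 1) * (f + d)            # X_mini + (j-1)f + (j-1)d
        hi = lo + f                               # X_mini + jf + (j-1)d
        mask = (points[:, 0] >= lo) & (points[:, 0] <= hi)
        slices.append(points[mask])
        j += 1
    return slices
```

Smaller spacing d gives a denser sampling of cross-sections along the bead, at the cost of more slices to process in S29.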
Preferably, the step S29 includes:
denoting the three-dimensional coordinates of the points in P_sliceij as (x_sliceij, y_sliceij, z_sliceij);
the highest point being H_sliceij = max z_sliceij, the slice weld height being h_sliceij = max z_sliceij − min z_sliceij, the slice weld width being W_sliceij = max y_sliceij − min y_sliceij, and the weld leg positions of the slice weld being max y_sliceij and min y_sliceij.
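The per-slice features of S29 reduce to coordinate extrema in the rotated frame (X along the bead). A minimal sketch, with an assumed dictionary layout:

```python
import numpy as np

def slice_features(slice_pts: np.ndarray) -> dict:
    """Highest point, height, width and leg positions of one weld slice."""
    y, z = slice_pts[:, 1], slice_pts[:, 2]
    return {
        "highest": float(z.max()),                # H = max z (residual-height peak)
        "height": float(z.max() - z.min()),       # h = max z - min z
        "width": float(y.max() - y.min()),        # W = max y - min y
        "legs": (float(y.min()), float(y.max())), # weld leg positions along Y
    }
```

The "highest" value per slice is the residual-height feature used in step four of the detection and polishing method to decide whether the weld passes forming detection.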
The invention also discloses a weld detection and polishing method based on a 3D vision sensor, comprising the following steps:
step one, photographing the workpiece with a 3D camera to acquire its point cloud information;
step two, extracting the weld polishing features by the weld polishing feature extraction method described above;
step three, generating a weld polishing path from the weld polishing features, and polishing the workpiece with a robot along the polishing path;
step four, repeating steps one and two to obtain the residual-height features of the polished weld and performing weld forming detection: if the detection passes, the polishing workflow is finished; if it does not pass, returning to step three to continue polishing.
Compared with the prior art, the technical scheme of the invention has the following advantages:
1. The invention provides a robot weld polishing feature extraction method based on 3D visual sensing, which realizes efficient and accurate weld polishing feature extraction, greatly improves the accuracy of weld feature extraction while shortening the extraction time, and can be used to realize efficient automatic robot weld polishing and efficient polished-weld forming detection.
2. The invention has a high three-dimensional reconstruction speed, can rapidly process the three-dimensional model and extract the key structural features, and can realize efficient automatic robot weld polishing and efficient polished-weld forming detection.
Drawings
FIG. 1 is a flow chart of the weld polishing feature extraction method of the present invention;
FIG. 2 is a flow chart of the weld detection and polishing method based on a 3D vision sensor.
Detailed Description
The present invention will be further described below with reference to the accompanying drawings and specific embodiments, which are not intended to be limiting, so that those skilled in the art can better understand and practice the invention.
Referring to FIG. 1, the invention discloses a weld polishing feature extraction method, comprising the following steps:
S1, acquiring point cloud information of a workpiece to be detected, wherein the workpiece to be detected is colored with a dye penetrant inspection agent and the point cloud information is collected by an infrared structured-light 3D camera.
Specifically, commonly used 3D cameras include line-laser 3D cameras and structured-light 3D cameras. A line-laser 3D camera must move with the robot, capture many line-laser images and complete the three-dimensional reconstruction through a line-laser extraction algorithm, which is inefficient. A structured-light 3D camera can directly acquire the point cloud of the photographed workpiece, but the point cloud quality is poor when photographing reflective workpieces. The invention adopts an infrared structured-light 3D camera together with a metal surface treatment method, which effectively avoids the poor imaging of reflective metal surfaces by the infrared structured-light 3D camera and realizes efficient acquisition of workpiece point cloud information. The specific implementation is to spray DPT-5 dye penetrant inspection agent on the workpiece surface, photograph with the infrared structured-light 3D camera, and set a higher exposure time to obtain high-quality workpiece point cloud information. DPT-5 dye penetrant is an efficient and convenient agent commonly used in non-destructive testing and is easy to clean: the penetrant on the workpiece surface can be removed with a cleaning agent or with water (water pressure ≤ 1.5 kg/cm²).
Before photographs are acquired, the TCP (tool center point) of the robot is calibrated first, together with the internal reference (intrinsic) matrix and the hand-eye transformation matrix of the 3D camera. The calibration point of the robot TCP is chosen as the center point of the EHA floating sanding head. Calibration is completed by changing the six-axis posture of the robot and taking several pictures, yielding the intrinsic matrix of the 3D camera and the hand-eye transformation matrix; the photographs are then acquired.
The following procedure is adopted for photograph acquisition:
① setting the photographing positions and the number of photographs n in advance;
② the robot moving to the next preset position and taking a picture;
③ checking whether the number of photographs taken equals n; if not, returning to step ② to continue photographing;
④ sending the n collected point cloud photographs to the industrial personal computer;
⑤ the industrial personal computer stitching the acquired point cloud photographs to obtain the overall point cloud information of the workpiece;
⑥ obtaining the position coordinates P_R in the robot coordinate system by the following formula:
P_R = T_tool→robot · T_cam→tool · T_img→cam · P_L,
wherein P_L is the coordinate value of the point cloud information in the image coordinate system, T_img→cam is the conversion matrix from the image coordinate system to the camera coordinate system (the camera intrinsics), T_cam→tool is the transformation matrix from the camera coordinate system to the tool coordinate system (the camera hand-eye transformation matrix), T_tool→robot is the transformation matrix from the tool coordinate system to the robot coordinate system, and P_R is the three-dimensional coordinate value of the point cloud information in the robot coordinate system.
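The chain of transforms in step ⑥ can be sketched with homogeneous 4×4 matrices. The T_* argument names below are illustrative placeholders for the calibrated matrices (the patent's own symbols were lost in extraction):

```python
import numpy as np

def to_robot_frame(p_img: np.ndarray,
                   T_img_cam: np.ndarray,
                   T_cam_tool: np.ndarray,
                   T_tool_robot: np.ndarray) -> np.ndarray:
    """P_R = T_tool->robot @ T_cam->tool @ T_img->cam @ P_L in homogeneous form."""
    p_h = np.append(p_img, 1.0)                          # homogeneous point
    p_r = T_tool_robot @ T_cam_tool @ T_img_cam @ p_h    # right-to-left chain
    return p_r[:3]
```

Note the right-to-left order: a point expressed in the image frame is first brought into the camera frame, then the tool frame, then the robot base frame.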
S2, processing the original point cloud information of the workpiece to be detected to obtain the weld polishing features, comprising:
S21, segmenting out the target weld plate point cloud data set P_B by processing the point cloud data set P with Euclidean clustering.
Because the information photographed by the 3D camera contains not only the target weld plate point cloud but also the surrounding scene point cloud, the unnecessary points must be removed in order to extract the point cloud information of the weld plate. The essence of Euclidean clustering is to group points that are close together: for the n points in a point cloud C, the Euclidean distance is taken as the measure of closeness between two points, and the distance between neighboring points serves as the clustering criterion, realizing the cluster segmentation of the point cloud. The specific segmentation process is as follows:
S211, for the point cloud data set P obtained after voxel-downsampling preprocessing, determining a query point p_i and setting a distance threshold r, the distance d between any two points of the point cloud data set being calculated as:
d = √( (p_i1 − p_j1)² + (p_i2 − p_j2)² + (p_i3 − p_j3)² ),
wherein p_i and p_j denote two points in the point cloud and p_ik and p_jk denote their k-th coordinate components;
S212, finding the n neighboring points p_j (j = 1, 2, …, n) nearest to the query point p_i with a KD-tree, and calculating the Euclidean distances d_j from these n neighboring points to the query point p_i by the above formula;
S213, comparing each distance d_j with the distance threshold r, adding the points with d_j smaller than r to the class M, and repeating until the number of points in M no longer increases, completing the segmentation.
After the Euclidean clustering is completed, since the photographing positions were chosen reasonably in the earlier stage, the cluster containing the largest number of points is selected as the overall weld plate point cloud P_B, realizing the extraction of the overall weld plate point cloud information.
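The region-growing loop of S211–S213 can be sketched as below, assuming SciPy's KD-tree as the neighbor structure (the patent names a KD-tree but not a library; the function name is illustrative):

```python
import numpy as np
from scipy.spatial import cKDTree

def euclidean_cluster(points: np.ndarray, r: float) -> list[np.ndarray]:
    """Grow classes M: any point within r of a member joins, until M stops growing."""
    tree = cKDTree(points)
    unvisited = set(range(len(points)))
    clusters = []
    while unvisited:
        seed = unvisited.pop()
        members, frontier = {seed}, [seed]
        while frontier:                                   # grow until no new points
            idx = frontier.pop()
            for nb in tree.query_ball_point(points[idx], r):
                if nb in unvisited:
                    unvisited.discard(nb)
                    members.add(nb)
                    frontier.append(nb)
        clusters.append(points[sorted(members)])
    return clusters
```

Selecting the largest returned cluster then corresponds to taking the weld plate point cloud P_B, as described above.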
S22, fitting the weld plate plane with the RANSAC algorithm based on the target weld plate point cloud data set P_B, obtaining the point cloud data set P_S on the weld plate plane.
The RANSAC (random sample consensus) algorithm is an effective and robust estimation method that can still obtain a fairly ideal fitting result when the point cloud data contain large errors. The RANSAC flow is as follows:
1) from the formula P = 1 − (1 − (1 − ε)^m)^M, where P denotes the probability of obtaining at least one error-free subset, ε denotes the data error rate, m is the minimum amount of data needed to compute the model parameters, and M is the number of basic-subset samplings, computing the minimum sampling number M for given ε, P and m;
2) for each of the M iterations, randomly drawing m points from the point cloud data and computing initial plane model parameters from the plane equation ax + by + cz = d together with the constraint a² + b² + c² = 1;
3) setting a threshold δ_0 and, with the computed initial model parameters, calculating the tolerance d_i = |a·x_i + b·y_i + c·z_i − d| of every point in the data set; if d_i is within the threshold δ_0, classifying the point as an inlier, otherwise as an outlier;
4) repeating steps 2) and 3) M times, counting the number of inliers after each classification, selecting the sample with the largest number of inliers, and fitting over that largest inlier set to obtain the final plane model parameters;
5) computing the point cloud data set P_S of the points of the target weld plate point cloud data set P_B lying in the fitted plane.
After the weld plate plane is fitted by the RANSAC algorithm, all the point cloud data on the weld plate plane are obtained.
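The RANSAC flow above can be sketched as follows; this is a minimal illustration with m = 3 (three points determine a plane), and the function name and return layout are assumptions:

```python
import numpy as np

def ransac_plane(points: np.ndarray, delta0: float, M: int, rng=None):
    """Fit a x + b y + c z = d (with a^2+b^2+c^2 = 1) by sampling 3-point bases."""
    if rng is None:
        rng = np.random.default_rng(0)
    best_inliers, best_model = np.array([], dtype=int), None
    for _ in range(M):
        p1, p2, p3 = points[rng.choice(len(points), 3, replace=False)]
        n = np.cross(p2 - p1, p3 - p1)            # plane normal from the 3 samples
        if np.linalg.norm(n) < 1e-12:
            continue                              # degenerate (collinear) sample
        n = n / np.linalg.norm(n)                 # enforce a^2 + b^2 + c^2 = 1
        d = float(np.dot(n, p1))
        tol = np.abs(points @ n - d)              # d_i = |a x_i + b y_i + c z_i - d|
        inliers = np.flatnonzero(tol <= delta0)   # classify inliers vs outliers
        if len(inliers) > len(best_inliers):
            best_inliers, best_model = inliers, (n, d)
    return best_model, best_inliers
```

The returned inlier index set plays the role of P_S: subtracting it from P_B leaves the weld point cloud P_W of step S23.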
S23, subtracting the point cloud data set P_S on the weld plate plane from the target weld plate point cloud data set P_B to obtain the weld point cloud data set P_W;
S24, performing Euclidean cluster segmentation on the weld point cloud data set P_W, dividing it into n independent weld bead point cloud data sets P_Wi, which includes:
performing Euclidean cluster segmentation on the weld point cloud data set P_W, setting a minimum number κ of points per cluster, and keeping a segmented point cloud data set P_Wi only when its number of points is greater than or equal to κ, otherwise deleting it, wherein i = 1, 2, 3, …, n and n is a natural number greater than 3.
By selecting an appropriate parameter κ, the weld point cloud data set P_W can be divided into n independent weld bead point cloud data sets P_Wi;
S25. Extract the direction vector of each point in the weld bead point cloud data set P_Wi, as follows:
S251. Denote the number of points in the weld bead point cloud data set P_Wi as m, and select the q nearest points around one point P_Wij in P_Wi (j = 1, 2, 3 … m);
S252. Calculate the centroid C_Wij of the q+1 points: C_Wij = (1/(q+1)) · Σ of the q+1 point coordinates;
S253. Perform centroid removal: subtract C_Wij from each of the q+1 points in turn to obtain the de-meaned position matrix deM_ij;
S254. Construct the covariance matrix: Cov_ij = deM_ij^T · deM_ij;
S255. Perform singular value decomposition on the covariance matrix Cov_ij; the singular vector (u_ij, v_ij, w_ij) corresponding to the smallest singular value is the normal vector of the point P_Wij;
S256. Judge whether j equals m; if j is smaller than m, set j = j + 1 and return to S251;
S257. When j equals m, the direction vectors of all points in the weld bead point cloud data set P_Wi have been obtained.
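Steps S251 to S257 amount to PCA-style normal estimation per point. A minimal sketch with numpy follows (brute-force neighbor search for clarity; the function name and neighborhood size q are illustrative assumptions):

```python
import numpy as np

def point_normals(points, q=8):
    """Estimate a normal for every point of an (N, 3) array from its q
    nearest neighbors: de-mean the neighborhood (S253), build the
    covariance matrix deM^T * deM (S254), and take the singular vector
    belonging to the smallest singular value (S255).
    """
    normals = np.empty_like(points, dtype=float)
    for j, p in enumerate(points):
        d = np.linalg.norm(points - p, axis=1)
        idx = np.argsort(d)[: q + 1]       # the point plus q neighbors
        nb = points[idx]
        deM = nb - nb.mean(axis=0)         # centroid removal
        cov = deM.T @ deM                  # 3x3 covariance matrix
        _, _, vt = np.linalg.svd(cov)
        normals[j] = vt[-1]                # smallest singular value
    return normals
```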
S26. Set a lower threshold D_1 and an upper threshold D_2 for the weld bead direction vector and extract the weld bead direction of the weld bead point cloud data set P_Wi, as follows:
S261. Let the fitted weld plate plane be A_1·x + B_1·y + C_1·z = D_1, and denote the number of points in the weld bead point cloud data set P_Wi as m. The cosine cos Φ_ij of the angle between the direction vector of a point P_Wij in P_Wi and the weld plate plane is calculated as
cos Φ_ij = (A_1·u_ij + B_1·v_ij + C_1·w_ij) / (sqrt(A_1² + B_1² + C_1²) · sqrt(u_ij² + v_ij² + w_ij²)),
where j = 1, 2, 3 … m;
S262. Create an array D_Wi to store the weld-bead-direction point cloud data of the i-th weld bead; when D_1 ≤ cos Φ_ij ≤ D_2, put the point P_Wij into the array D_Wi, otherwise continue with the next step;
S263. Judge whether j equals m; if j is smaller than m, set j = j + 1 and return to S261;
S264. When j equals m, the weld-bead-direction point cloud data set D_Wi is obtained;
S265. Calculate the centroid C_Wi of the data set D_Wi and denote its coordinates as (a_i, b_i, c_i);
S266. Perform centroid removal: subtract C_Wi from each point in D_Wi in turn to obtain the de-meaned position matrix deM_i;
S267. Perform singular value decomposition on the position matrix deM_i to obtain the matrices U_i, S_i and V_i;
S268. The singular vector corresponding to the largest singular value is the direction vector of the fitted straight line, and the centroid is a point on that line, namely
(x_i − a_i) / l_i = (y_i − b_i) / m_i = (z_i − c_i) / n_i,
where (x_i, y_i, z_i) is a point on the fitted spatial straight line and (l_i, m_i, n_i) represents the weld bead direction of the fitted line.
S27. Convert the weld bead direction (l_i, m_i, n_i) to the x-axis direction (1, 0, 0), obtain the corresponding rotation matrix R_i, and multiply the weld bead point cloud data set by R_i to obtain the rotated weld bead point cloud data set.
Since the weld bead direction is generally not parallel to the x axis, the conversion of the weld bead direction vector to the x-axis direction proceeds as follows:
Convert the weld bead direction (l_i, m_i, n_i) into a unit vector d_i;
Denote the x-axis unit vector as e = (1, 0, 0);
When d_i = (1, 0, 0), the bead is already aligned and the vector part of the quaternion is q_i = (0, 0, 0) (the identity rotation); otherwise q_i is built from sum(d_i ∘ e) (the dot product) and the cross product d_i × e,
where sum() represents summing over the three components of a vector;
Construct the rotation matrix R_i from the quaternion q_i;
Transform the weld bead point cloud data set P_Wi so that the weld bead direction is parallel to the x axis using the rotation matrix R_i, i.e. the rotated point cloud data set is P_Wi = R_i · P_Wi^T.
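One common way to realize the alignment in S27 is the shortest-arc quaternion between the bead direction and the x axis. The sketch below assumes that construction (the patent's exact quaternion formula is lost to image extraction, so this is a representative choice, not necessarily the patented one):

```python
import numpy as np

def rotation_to_x(direction):
    """Rotation matrix R with R @ d = (1, 0, 0) for a unit direction d,
    built from the shortest-arc quaternion q = (1 + d.e, d x e) after
    normalization. Illustrative sketch only.
    """
    a = np.asarray(direction, float)
    a = a / np.linalg.norm(a)
    b = np.array([1.0, 0.0, 0.0])
    if np.allclose(a, -b):
        # 180-degree case: axis is ambiguous, pick the z axis
        q = np.array([0.0, 0.0, 0.0, 1.0])
    else:
        q = np.concatenate([[1.0 + a @ b], np.cross(a, b)])
        q /= np.linalg.norm(q)
    w, x, y, z = q
    # standard quaternion-to-matrix conversion
    return np.array([
        [1 - 2*(y*y + z*z), 2*(x*y - w*z),     2*(x*z + w*y)],
        [2*(x*y + w*z),     1 - 2*(x*x + z*z), 2*(y*z - w*x)],
        [2*(x*z - w*y),     2*(y*z + w*x),     1 - 2*(x*x + y*y)],
    ])
```

Applying the returned matrix to every point of P_Wi rotates the bead so it runs along the x axis, ready for slicing.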
S28. Slice the rotated weld bead point cloud data set along the x-axis direction to obtain multi-slice weld bead point cloud data P_sliceij, as follows:
S281. Set a slice spacing d and a slice width f: at every interval d, a band of points of width f is selected as one slice. The x range of the point cloud P_Wi is X_mini to X_maxi, where i denotes the i-th weld bead, j denotes the j-th point cloud slice, and j starts from 0;
S282. Let j = j + 1 and judge whether X_mini + j·f + (j−1)·d is greater than X_maxi. When X_mini + j·f + (j−1)·d is less than or equal to X_maxi, select the points in the coordinate range X_mini + (j−1)·f + (j−1)·d to X_mini + j·f + (j−1)·d as the slice point cloud data P_sliceij;
When X_mini + j·f + (j−1)·d is greater than X_maxi, this step ends.
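The slicing loop of S281 and S282 can be sketched directly from the stated bounds (function name and default parameter values are illustrative assumptions):

```python
import numpy as np

def slice_bead(points, d=0.5, f=0.1):
    """Slice an (N, 3) bead point cloud along x: starting at x_min, take
    a band of width f every (f + d), stopping once the next band would
    pass x_max (the termination test of S282).
    """
    x = points[:, 0]
    x_min, x_max = x.min(), x.max()
    slices, j = [], 1
    while x_min + j * f + (j - 1) * d <= x_max:
        lo = x_min + (j - 1) * (f + d)   # X_mini + (j-1)*f + (j-1)*d
        hi = lo + f                      # X_mini + j*f + (j-1)*d
        slices.append(points[(x >= lo) & (x <= hi)])
        j += 1
    return slices
```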
S29. Extract, for each weld bead slice P_sliceij, the highest point along the z-axis direction, the slice weld height and the slice weld width, completing the weld feature extraction, as follows:
Denote the three-dimensional coordinates of P_sliceij as (x_sliceij, y_sliceij, z_sliceij);
The highest point is max z_sliceij, the slice weld height is H_sliceij = max z_sliceij − min z_sliceij, the slice weld width is W_sliceij = max y_sliceij − min y_sliceij, and the weld-toe positions of the slice are max y_sliceij and min y_sliceij.
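The per-slice features of S29 reduce to coordinate extrema. A minimal sketch (the dictionary keys are illustrative names, not from the patent):

```python
import numpy as np

def slice_features(slice_pts):
    """Per-slice features of S29 for an (N, 3) slice array: highest point
    (max z), weld height (max z - min z), weld width (max y - min y),
    and the two extreme y positions taken as the weld-toe locations.
    """
    y, z = slice_pts[:, 1], slice_pts[:, 2]
    return {
        "highest": z.max(),
        "height": z.max() - z.min(),
        "width": y.max() - y.min(),
        "toes": (y.max(), y.min()),
    }
```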
Further, before S21 the method further comprises: performing voxel downsampling on the point cloud information to obtain the preprocessed point cloud data set P.
Specifically, point cloud downsampling resamples the point cloud according to a sampling rule so as to reduce its density while preserving its overall geometric characteristics, thereby reducing the data volume and algorithmic complexity of subsequent processing and improving the speed of later point cloud algorithms. Voxel downsampling is an efficient method that yields uniformly distributed sample points. It proceeds as follows:
1) Obtain the maxima Xmax, Ymax, Zmax and minima Xmin, Ymin, Zmin of the point cloud coordinates along the three axes X, Y, Z, and compute the bounding-box side lengths by subtracting the minima from the maxima: lx = Xmax − Xmin, ly = Ymax − Ymin, lz = Zmax − Zmin.
2) Set the voxel cube side length cell and cut the bounding box into M, N and L parts along the X, Y and Z axes, giving sum = M × N × L voxel cubes, where floor denotes rounding down: M = floor(lx / cell), N = floor(ly / cell), L = floor(lz / cell).
3) Label each voxel cube with an index (x, y, z), so that the voxel cube of a data point (px, py, pz) is obtained as x = floor((px − Xmin) / cell), y = floor((py − Ymin) / cell), z = floor((pz − Zmin) / cell).
4) Calculate the centroid of the points in each voxel cube, keep only that centroid point per voxel, and remove the other points; the centroid is the arithmetic mean of the coordinates of the points in the voxel.
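The four preprocessing steps above can be sketched as follows (an assumed minimal implementation; libraries such as Open3D provide an equivalent `voxel_down_sample` operation):

```python
import numpy as np

def voxel_downsample(points, cell=0.05):
    """Voxel-grid downsampling of an (N, 3) point array: bucket points
    into cubes of side `cell` (steps 1-3) and keep one centroid per
    occupied cube (step 4).
    """
    mins = points.min(axis=0)
    keys = np.floor((points - mins) / cell).astype(np.int64)  # voxel index
    buckets = {}
    for key, p in zip(map(tuple, keys), points):
        buckets.setdefault(key, []).append(p)
    return np.array([np.mean(v, axis=0) for v in buckets.values()])
```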
Referring to FIG. 2, the invention also discloses a weld seam detection and polishing method based on a 3D vision sensor, comprising the following steps:
Step one: capture the point cloud information of the workpiece with a 3D camera;
Step two: extract the weld polishing features using the weld polishing feature extraction method described above;
Step three: generate a weld polishing path from the weld polishing features, and have the robot polish the workpiece along that path.
Specifically, the weld polishing path is generated from the weld feature points extracted in step two; the upper computer sends the positions and postures of the polishing points to the robot in real time, and the robot carries out the polishing work.
Step four: repeat steps one and two to obtain the weld reinforcement-height features, compare the highest point of each weld slice point cloud against the reinforcement-height requirement, and perform weld formation inspection: if the inspection passes, the polishing workflow is finished; if it fails, return to step three and continue polishing.
It will be appreciated by those skilled in the art that embodiments of the present application may be provided as a method, system, or computer program product. Accordingly, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present application may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The present application is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to embodiments of the application. It will be understood that each flow and/or block of the flowchart illustrations and/or block diagrams, and combinations of flows and/or blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
It is apparent that the above embodiments are given by way of illustration only and do not limit the invention. Other variations and modifications will be apparent to those of ordinary skill in the art in light of the foregoing description; it is neither necessary nor possible to enumerate all embodiments here. Obvious variations or modifications derived therefrom remain within the scope of the invention.
Claims (10)
1. A weld polishing feature extraction method, characterized by comprising the following steps:
S1. Acquire point cloud information of a workpiece to be inspected, wherein the workpiece is colored by a dye penetrant inspection agent and the point cloud information is acquired by an infrared structured-light 3D camera;
S2. Process the original point cloud information of the workpiece to obtain the weld polishing features, comprising:
S21. Segment out the target weld plate point cloud data set P_B from the point cloud data set P by Euclidean clustering;
S22. Fit the weld plate plane with the RANSAC algorithm based on the target weld plate point cloud data set P_B to obtain the point cloud data set P_S on the weld plate plane;
S23. Subtract the point cloud data set P_S on the weld plate plane from the target weld plate point cloud data set P_B to obtain the weld seam point cloud data set P_W;
S24. Perform Euclidean cluster segmentation on the weld seam point cloud data set P_W, dividing it into n independent weld bead point cloud data sets P_Wi;
S25. Extract the direction vector of each point in the weld bead point cloud data set P_Wi;
S26. Set a lower threshold D_1 and an upper threshold D_2 for the weld bead direction vector and extract the weld bead direction of the weld bead point cloud data set P_Wi;
S27. Convert the weld bead direction to the x-axis direction, obtain the corresponding rotation matrix R_i, and multiply the weld bead point cloud data set by R_i to obtain the rotated weld bead point cloud data set;
S28. Slice the rotated weld bead point cloud data set along the x-axis direction to obtain multi-slice weld bead point cloud data P_sliceij;
S29. Extract, for each weld bead slice P_sliceij, the highest point along the z-axis direction, the slice weld height and the slice weld width, completing the weld feature extraction.
2. The weld polishing feature extraction method according to claim 1, characterized in that before S21 the method further comprises:
performing voxel downsampling on the point cloud information to obtain the preprocessed point cloud data set P.
3. The weld polishing feature extraction method according to claim 1, characterized in that S21 comprises:
determining a query point p_i, setting a distance threshold r, and calculating the distance d between any two points of the point cloud data set as d = sqrt( Σ_k (p_ik − p_jk)² ),
where p_i and p_j represent two points in the point cloud and p_ik, p_jk denote their k-th coordinates;
finding the n neighboring points p_j nearest to the query point p_i through a KD-tree, where j = 1, 2, …, n, and calculating the Euclidean distances d_j from these n neighboring points to the query point p_i by the above formula;
comparing each distance d_j with the distance threshold r and assigning points with d_j smaller than r to the class M, until the number of points in the class M no longer increases, at which point the segmentation is completed.
4. The weld polishing feature extraction method according to claim 1, characterized in that S24 comprises:
performing Euclidean cluster segmentation on the weld seam point cloud data set P_W and setting a minimum cluster size κ; a cluster P_Wi produced by the segmentation is kept when its number of points is greater than or equal to κ and deleted otherwise, where i = 1, 2, 3 … n and n is a natural number greater than 3.
5. The weld polishing feature extraction method according to claim 1, characterized in that S25 comprises:
S251. Denote the number of points in the weld bead point cloud data set P_Wi as m, and select the q nearest points around one point P_Wij in P_Wi (j = 1, 2, 3 … m);
S252. Calculate the centroid C_Wij of the q+1 points: C_Wij = (1/(q+1)) · Σ of the q+1 point coordinates;
S253. Perform centroid removal: subtract C_Wij from each of the q+1 points in turn to obtain the de-meaned position matrix deM_ij;
S254. Construct the covariance matrix: Cov_ij = deM_ij^T · deM_ij;
S255. Perform singular value decomposition on the covariance matrix Cov_ij; the singular vector (u_ij, v_ij, w_ij) corresponding to the smallest singular value is the normal vector of the point P_Wij;
S256. Judge whether j equals m; if j is smaller than m, set j = j + 1 and return to S251;
S257. When j equals m, the direction vectors of all points in the weld bead point cloud data set P_Wi have been obtained.
6. The weld polishing feature extraction method according to claim 5, characterized in that S26 comprises:
S261. Let the fitted weld plate plane be A_1·x + B_1·y + C_1·z = D_1, and denote the number of points in the weld bead point cloud data set P_Wi as m. The cosine cos Φ_ij of the angle between the direction vector of a point P_Wij in P_Wi and the weld plate plane is calculated as
cos Φ_ij = (A_1·u_ij + B_1·v_ij + C_1·w_ij) / (sqrt(A_1² + B_1² + C_1²) · sqrt(u_ij² + v_ij² + w_ij²)),
where j = 1, 2, 3 … m;
S262. Create an array D_Wi to store the weld-bead-direction point cloud data of the i-th weld bead; when D_1 ≤ cos Φ_ij ≤ D_2, put the point P_Wij into the array D_Wi, otherwise continue with the next step;
S263. Judge whether j equals m; if j is smaller than m, set j = j + 1 and return to S261;
S264. When j equals m, the weld-bead-direction point cloud data set D_Wi is obtained;
S265. Calculate the centroid C_Wi of the data set D_Wi and denote its coordinates as (a_i, b_i, c_i);
S266. Perform centroid removal: subtract C_Wi from each point in D_Wi in turn to obtain the de-meaned position matrix deM_i;
S267. Perform singular value decomposition on the position matrix deM_i to obtain the matrices U_i, S_i and V_i;
S268. The singular vector corresponding to the largest singular value is the direction vector of the fitted straight line, and the centroid is a point on that line, namely
(x_i − a_i) / l_i = (y_i − b_i) / m_i = (z_i − c_i) / n_i,
where (x_i, y_i, z_i) is a point on the fitted spatial straight line and (l_i, m_i, n_i) represents the weld bead direction of the fitted line.
7. The weld polishing feature extraction method according to claim 6, characterized in that S27 comprises:
converting the weld bead direction (l_i, m_i, n_i) into a unit vector d_i;
denoting the x-axis unit vector as e = (1, 0, 0);
when d_i = (1, 0, 0), taking the vector part of the quaternion as q_i = (0, 0, 0) (the identity rotation); otherwise building q_i from sum(d_i ∘ e) (the dot product) and the cross product d_i × e,
where sum() represents summing over the three components of a vector;
constructing the rotation matrix R_i from the quaternion q_i;
transforming the weld bead point cloud data set P_Wi so that the weld bead direction is parallel to the x axis using the rotation matrix R_i, i.e. the rotated point cloud data set is P_Wi = R_i · P_Wi^T.
8. The weld polishing feature extraction method according to claim 1, characterized in that S28 comprises:
S281. Setting a slice spacing d and a slice width f: at every interval d, a band of points of width f is selected as one slice. The x range of the point cloud P_Wi is X_mini to X_maxi, where i denotes the i-th weld bead, j denotes the j-th point cloud slice, and j starts from 0;
S282. Letting j = j + 1 and judging whether X_mini + j·f + (j−1)·d is greater than X_maxi; when X_mini + j·f + (j−1)·d is less than or equal to X_maxi, selecting the points in the coordinate range X_mini + (j−1)·f + (j−1)·d to X_mini + j·f + (j−1)·d as the slice point cloud data P_sliceij;
when X_mini + j·f + (j−1)·d is greater than X_maxi, this step ends.
9. The weld polishing feature extraction method according to claim 1, characterized in that S29 comprises:
denoting the three-dimensional coordinates of P_sliceij as (x_sliceij, y_sliceij, z_sliceij);
the highest point is max z_sliceij, the slice weld height is H_sliceij = max z_sliceij − min z_sliceij, the slice weld width is W_sliceij = max y_sliceij − min y_sliceij, and the weld-toe positions of the slice are max y_sliceij and min y_sliceij.
10. A weld seam detection and polishing method based on a 3D vision sensor, characterized by comprising the following steps:
step one: capturing point cloud information of a workpiece with a 3D camera;
step two: extracting weld polishing features using the weld polishing feature extraction method according to any one of claims 1 to 9;
step three: generating a weld polishing path from the weld polishing features, and polishing the workpiece with a robot along the polishing path;
step four: repeating steps one and two to obtain the reinforcement-height features of the polished weld and performing weld formation inspection: if the inspection passes, the polishing workflow is finished; if it fails, returning to step three to continue polishing.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202310645680.1A CN116823735B (en) | 2023-06-01 | 2023-06-01 | Weld polishing feature extraction method, weld detection and polishing method |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202310645680.1A CN116823735B (en) | 2023-06-01 | 2023-06-01 | Weld polishing feature extraction method, weld detection and polishing method |
Publications (2)
Publication Number | Publication Date |
---|---|
CN116823735A CN116823735A (en) | 2023-09-29 |
CN116823735B true CN116823735B (en) | 2024-11-01 |
Family
ID=88123364
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202310645680.1A Active CN116823735B (en) | 2023-06-01 | 2023-06-01 | Weld polishing feature extraction method, weld detection and polishing method |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN116823735B (en) |
Family Cites Families (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2014106823A2 (en) * | 2013-01-03 | 2014-07-10 | Meta Company | Extramissive spatial imaging digital eye glass apparatuses, methods and systems for virtual or augmediated vision, manipulation, creation, or interaction with objects, materials, or other entities |
CN111737796B (en) * | 2020-06-10 | 2021-02-26 | 南京英尼格玛工业自动化技术有限公司 | Reverse reconstruction method for high-speed rail sleeper beam process hole |
CN114571153B (en) * | 2022-04-07 | 2023-10-10 | 福州大学 | Weld joint identification and robot weld joint tracking method based on 3D point cloud |
CN115965960A (en) * | 2023-01-31 | 2023-04-14 | 河北中电信普智能科技有限公司 | Weld joint identification method based on deep learning and 3D point cloud |
Non-Patent Citations (1)
Title |
---|
Research on a weld seam recognition method based on 3D point cloud data; Tang Guoyin et al.; Journal of Nanjing Institute of Technology (Natural Science Edition); 2023-09-30; Vol. 21, No. 3; full text *
Also Published As
Publication number | Publication date |
---|---|
CN116823735A (en) | 2023-09-29 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN103192397B (en) | Vision robot's off-line programing method and system | |
CN100435055C (en) | Method for planning smooth and non-interference tool route of 5-axis numerical control machining | |
CN103678754B (en) | Information processor and information processing method | |
CN101497279B (en) | Measuring and machining integrated laser three-dimensional marking method and device | |
CN111644935A (en) | Robot three-dimensional scanning measuring device and working method | |
CN114055255B (en) | Large-scale complex component surface polishing path planning method based on real-time point cloud | |
CN113920060A (en) | Autonomous operation method and device for welding robot, electronic device, and storage medium | |
Patil et al. | Extraction of weld seam in 3d point clouds for real time welding using 5 dof robotic arm | |
Tarbox et al. | IVIS: An integrated volumetric inspection system | |
CN115965960A (en) | Weld joint identification method based on deep learning and 3D point cloud | |
Xu et al. | A new welding path planning method based on point cloud and deep learning | |
Fang et al. | A vision-based method for narrow weld trajectory recognition of arc welding robots | |
Pachidis et al. | Vision-based path generation method for a robot-based arc welding system | |
CN116823735B (en) | Weld polishing feature extraction method, weld detection and polishing method | |
Cao et al. | Aircraft pipe gap inspection on raw point cloud from a single view | |
CN117162098B (en) | Autonomous planning system and method for robot gesture in narrow space | |
Borsu et al. | Automated surface deformations detection and marking on automotive body panels | |
Wu et al. | A novel approach for porcupine crab identification and processing based on point cloud segmentation | |
CN118386236A (en) | Teaching-free robot autonomous welding polishing method based on combination of line laser scanning and stereoscopic vision | |
Marchand et al. | Controlled camera motions for scene reconstruction and exploration | |
Yusen et al. | A method of welding path planning of steel mesh based on point cloud for welding robot | |
CN113744245A (en) | Method and system for positioning structural reinforcing rib welding seam in point cloud | |
Hu et al. | A novel method for the localization of convex workpieces in robot workspace using gauss map | |
CN116394235B (en) | Dry ice cleaning track planning system and method for large part robot based on three-dimensional measurement | |
Shivshankar et al. | 3D scanning: A new approach towards model development in advanced manufacturing system |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant |