CN114378830B - Robot wrist joint singular avoidance method and system - Google Patents


Info

Publication number
CN114378830B
CN114378830B · Application CN202210153487.1A
Authority
CN
China
Prior art keywords
look-ahead, singular point, robot, final
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202210153487.1A
Other languages
Chinese (zh)
Other versions
CN114378830A (en)
Inventor
黄彦玮
张鹏
张国平
王光能
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Dazu Robot Co ltd
Original Assignee
Shenzhen Dazu Robot Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Dazu Robot Co ltd
Priority to CN202210153487.1A
Publication of CN114378830A
Application granted
Publication of CN114378830B
Legal status: Active
Anticipated expiration


Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00 Programme-controlled manipulators
    • B25J9/16 Programme controls
    • B25J9/1602 Programme controls characterised by the control system, structure, architecture
    • B25J9/1607 Calculation of inertia, jacobian matrixes and inverses
    • B25J9/1656 Programme controls characterised by programming, planning systems for manipulators
    • B25J9/1664 Programme controls characterised by programming, planning systems for manipulators characterised by motion, path, trajectory planning

Landscapes

  • Engineering & Computer Science (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Automation & Control Theory (AREA)
  • Physics & Mathematics (AREA)
  • Mathematical Physics (AREA)
  • Numerical Control (AREA)
  • Manipulator (AREA)

Abstract

The embodiment of the invention provides a robot wrist joint singular avoidance method and system, which can be applied to scenarios that require a large working space and a trajectory that strictly complies with position and timing requirements, such as welding of large vehicles and dispensing around buildings. The embodiment of the invention can execute position and speed commands accurately while keeping the attitude deviation within a set range, so that the robot passes the wrist singular point smoothly without decelerating, ensuring that the process is completed with high quality. For commercial robots equipped with an inverse kinematics module, the embodiment of the invention can directly correct the problematic discrete trajectory points one by one; the modified trajectory corresponds to the original trajectory point by point, the correction occurs only near the singular point, and the trajectory away from the singular point is unaffected. The embodiment of the invention can therefore be used as an add-on module on all isomorphic commercial robots and has good universality.

Description

Robot wrist joint singular avoidance method and system
Technical Field
The embodiment of the invention relates to the technical field of robot wrist joint control, in particular to a robot wrist joint singular avoidance method and system.
Background
The wrist singular point is one of the common problems of 6R non-redundant collaborative arms. It can occur at any position in Cartesian space, is related only to the joint configuration (it arises whenever joint 5 of the robot approaches zero), and is therefore difficult to exclude by restricting the workspace. When the arm passes near a singular point, the Cartesian-space speed usually has to be reduced to maintain position accuracy, which often fails to satisfy process tasks that demand a constant spatial speed, such as welding or dispensing. To address this problem, a series of methods such as task priority sequencing (Task Priority Sequence) and singularity isolation (Singularity Isolation) have been proposed. Most existing solutions avoid the singular configuration by sacrificing a certain amount of attitude accuracy, which is acceptable for applications with low attitude accuracy requirements. However, defects remain: existing methods often lack a quantitative analysis of the attitude error introduced when crossing the singular point, and provide no clear, controllable quantitative bound on the maximum error incurred while the robot passes it.
Disclosure of Invention
Therefore, the embodiments of the invention provide a robot wrist joint singular avoidance method and system, to solve the technical problem that in the prior art the maximum error of a robot passing through a singular point lacks a clear and controllable quantitative index.
In order to achieve the above object, the embodiment of the present invention provides the following technical solutions:
according to a first aspect of the embodiments of the invention, an embodiment of the application provides a robot wrist joint singular avoidance method, including:
sampling and detecting the joint trajectory command of the robot moving in Cartesian space according to a preset period;
predicting, using the sampling detection result, the motion pose of the robot over the next several periods at the current motion speed;
judging, based on the prediction result, whether a wrist singular point exists ahead of the robot;
if it is detected that a wrist singular point exists ahead of the robot, generating an angular velocity deflection pose;
fusing the angular velocity deflection pose, as a correction amount, with the original target pose to generate a new target pose;
solving the new target pose, generating a new joint trajectory signal and sending it to the lower computer for execution.
Further, the robot wrist joint singular avoidance method provided by the embodiment of the application further includes:
if it is detected that no wrist singular point exists ahead of the robot, solving the original target pose, generating a joint trajectory signal and sending it to the lower computer for execution.
Further, judging, based on the prediction result, whether a wrist singular point exists ahead of the robot includes:
obtaining, based on the prediction result, the fifth joint space angle q5(t) of the robot and the first-order difference Δq5(t) of the fifth joint space angle;
judging whether the fifth joint space angle q5(t) and the first-order difference Δq5(t) satisfy a first preset condition, the first preset condition being defined in terms of a preset first inner-ring threshold average and a preset second inner-ring threshold average;
if the first preset condition is satisfied, judging whether the fifth joint space angle q5(t) satisfies a second preset condition, the second preset condition being defined in terms of a preset outer-ring threshold average;
if the second preset condition is satisfied, a wrist singular point exists ahead of the robot.
Further, generating the angular velocity deflection pose includes the following steps:
calculating, using the sampling detection result, the look-ahead starting position command linear velocity v_T(t_s) and the look-ahead starting position command angular velocity ω_T(t_s) of the tool coordinate system T expressed in base coordinates at the robot look-ahead starting time t_s;
calculating the look-ahead starting position s(t_s) from a preset look-ahead time t_forward and the first position vector p_T(t_s), and predicting the look-ahead end position s(t_e) using the look-ahead starting position command linear velocity v_T(t_s), the first prediction formula of the look-ahead end position s(t_e) being:
s(t_s) = 0
s(t_e) = s(t_s) + t_forward · ||v_T(t_s)||_2
where t_forward represents the look-ahead time;
searching, using the look-ahead starting position s(t_s), the look-ahead end position s(t_e) and a bisection method, for the singular point proximity position s(t_final) at which the robot tool coordinate system T comes closest to the wrist singular point along the travel path, and for the singular point proximity pose, the singular point proximity pose including the singular point proximity position vector p_T(s(t_final)) and the singular point proximity rotation matrix R_T(s(t_final));
calculating the angular velocity deflection axis of the robot tool coordinate system T using the singular point proximity position vector p_T(s(t_final)) and the singular point proximity rotation matrix R_T(s(t_final));
calculating the deflection angle θ(q_final) of the robot tool coordinate system T based on the look-ahead starting position command linear velocity v_T(t_s) and the fifth joint space angle q5(t), according to a seventh calculation formula parameterized by a first preset parameter c1, a second preset parameter c2 and a third preset parameter c3;
calculating the angular velocity deflection pose from the angular velocity deflection axis and the deflection angle θ(q_final) using the Rodrigues formula:
R = I + sin θ · [n]× + (1 − cos θ) · [n]× [n]×
where I is the identity matrix, θ is the deflection angle θ(q_final), n is the angular velocity deflection axis and [n]× is its skew-symmetric matrix.
Further, calculating, using the sampling detection result, the look-ahead starting position command linear velocity v_T(t_s) and the look-ahead starting position command angular velocity ω_T(t_s) of the tool coordinate system T expressed in base coordinates at the robot look-ahead starting time t_s includes:
obtaining, from the sampling detection result, the first position vector p_T(t_s) and the second position vector p_T(t_s-1) of the tool coordinate system T expressed in base coordinates at the look-ahead starting time t_s and at time t_s-1, and the first rotation matrix R_T(t_s) and the second rotation matrix R_T(t_s-1) of the tool coordinate system T expressed in base coordinates at the look-ahead starting time t_s and at time t_s-1;
calculating, using the first position vector p_T(t_s) and the second position vector p_T(t_s-1), the look-ahead starting position command linear velocity v_T(t_s), the first calculation formula being:
v_T(t_s) = (p_T(t_s) - p_T(t_s-1)) / ΔT
where ΔT is the sampling preset period; and
calculating, using the first rotation matrix R_T(t_s) and the second rotation matrix R_T(t_s-1), the look-ahead starting position command angular velocity ω_T(t_s), the second calculation formula being:
ω_T(t_s) = V(log(R_T(t_s) · R_T(t_s-1)′)) / ΔT
where log denotes the matrix logarithm, ′ denotes the transpose, V denotes vectorization of the skew-symmetric angular velocity matrix, and ΔT is the sampling preset period.
Further, searching, using the look-ahead starting position s(t_s), the look-ahead end position s(t_e) and a bisection method, for the singular point proximity position s(t_final) at which the robot tool coordinate system T comes closest to the wrist singular point along the travel path, and for the singular point proximity pose, includes:
predicting the intermediate position s(t_h) and the intermediate position pose of the robot using the look-ahead starting position s(t_s) and the look-ahead end position s(t_e), the second prediction formula of the intermediate position s(t_h) being:
s(t_h) = 0.5 · (s(t_s) + s(t_e));
taking the path from the look-ahead starting position s(t_s) to the intermediate position s(t_h) as a first path and the path from the intermediate position s(t_h) to the look-ahead end position s(t_e) as a second path, performing a bisection operation and executing the second prediction formula cyclically to predict a first intermediate position s(t_h1) and a second intermediate position s(t_h2); wherein the first look-ahead starting position s(t_s1) and the first look-ahead end position s(t_e1) are respectively the look-ahead starting position s(t_s) and the intermediate position s(t_h), and the second look-ahead starting position s(t_s2) and the second look-ahead end position s(t_e2) are respectively the intermediate position s(t_h) and the look-ahead end position s(t_e);
performing inverse kinematics on the position vectors and rotation matrices corresponding to the first look-ahead starting position s(t_s1), the first look-ahead end position s(t_e1) and the first intermediate position s(t_h1) to obtain the corresponding first joint space poses, the first joint space poses including the first starting position joint space angle q(s(t_s1))_5, the first end position joint space angle q(s(t_e1))_5 and the first intermediate position joint space angle q(s(t_h1))_5;
performing inverse kinematics on the position vectors and rotation matrices corresponding to the second look-ahead starting position s(t_s2), the second look-ahead end position s(t_e2) and the second intermediate position s(t_h2) to obtain the corresponding second joint space poses, the second joint space poses including the second starting position joint space angle q(s(t_s2))_5, the second end position joint space angle q(s(t_e2))_5 and the second intermediate position joint space angle q(s(t_h2))_5;
judging whether the first joint space poses and the second joint space poses respectively satisfy a third preset condition, the third preset condition being:
q(s(t_hx))_5 < min(q(s(t_sx))_5, q(s(t_ex))_5)
where x = 1 or 2;
if the first joint space poses satisfy the third preset condition, executing a third prediction formula to predict the corresponding first intermediate position pose, the first intermediate position pose including the first intermediate position vector p_T(s(t_h1)) and the first intermediate position rotation matrix R_T(s(t_h1)); and updating the look-ahead starting position s(t_s), the look-ahead end position s(t_e) and the intermediate position s(t_h) with the first look-ahead starting position s(t_s1), the first look-ahead end position s(t_e1) and the first intermediate position s(t_h1) respectively, and continuing with the next round of the bisection operation;
if the second joint space poses satisfy the third preset condition, executing the third prediction formula to predict the corresponding second intermediate position pose, the second intermediate position pose including the second intermediate position vector p_T(s(t_h2)) and the second intermediate position rotation matrix R_T(s(t_h2)); and updating the look-ahead starting position s(t_s), the look-ahead end position s(t_e) and the intermediate position s(t_h) with the second look-ahead starting position s(t_s2), the second look-ahead end position s(t_e2) and the second intermediate position s(t_h2) respectively, and continuing with the next round of the bisection operation;
after each round of the bisection operation, calculating the difference between the updated look-ahead starting position s(t_s) and look-ahead end position s(t_e);
judging whether the difference is smaller than a preset interval value;
if the difference is smaller than the preset interval value, stopping the bisection loop, and recording the updated intermediate position s(t_h) and the corresponding intermediate position pose as the singular point proximity position s(t_final) and the singular point proximity pose respectively, the singular point proximity pose including the singular point proximity position vector p_T(s(t_final)) and the singular point proximity rotation matrix R_T(s(t_final)).
Further, the intermediate position pose includes the intermediate position vector p_T(s(t_h)) and the intermediate position rotation matrix R_T(s(t_h)); the third prediction formula of the intermediate position pose is:
p_T(s(t_h)) = p_T(s(t_s)) + 0.5 · t_step · v_T(t_s)
p_T(s(t_s)) = p_T(t_s)
R_T(s(t_h)) = exp(0.5 · t_step · ω_T(t_s)) · R_T(s(t_s))
R_T(s(t_s)) = R_T(t_s)
t_step = [s(t_h) - s(t_s)] / ||v_T(t_s)||_2
where t_step is the time required to travel from the look-ahead starting position s(t_s) to the intermediate position s(t_h) at the look-ahead starting position command linear velocity v_T(t_s), and ||v_T(t_s)||_2 denotes the two-norm of the look-ahead starting position command linear velocity v_T(t_s).
Further, calculating the angular velocity deflection axis of the robot tool coordinate system T using the singular point proximity position vector p_T(s(t_final)) and the singular point proximity rotation matrix R_T(s(t_final)) includes:
calculating the singular point proximity joint space angle q_final of the robot using the singular point proximity position vector p_T(s(t_final)) and the singular point proximity rotation matrix R_T(s(t_final)), the third calculation formula of the singular point proximity joint space angle q_final being:
q_final = invkin(p_T(s(t_final)), R_T(s(t_final)))
where invkin denotes the inverse kinematics solution of the robot;
calculating, based on the singular point proximity joint space angle q_final, the first angular velocity partial matrix of the Jacobian matrix of the robot tool coordinate system T;
correcting the first angular velocity partial matrix with the singular point proximity rotation matrix R_T(s(t_final)) to obtain the second angular velocity partial matrix;
taking the values in row 4, column 1 and row 4, column 2 of the second angular velocity partial matrix to obtain the first element j_41 and the second element j_42;
calculating the angular velocity deflection axis of the robot tool coordinate system T from the first element j_41 and the second element j_42 according to a sixth calculation formula, in which [j_42, -j_41]^T denotes the transpose of [j_42, -j_41].
Further, calculating, based on the singular point proximity joint space angle q_final, the first angular velocity partial matrix of the Jacobian matrix of the robot tool coordinate system T includes:
calculating, based on the singular point proximity joint space angle q_final, the third rotation matrices of the joint axis coordinate systems at the singular point proximity position relative to the base coordinate system, according to a fourth calculation formula in which finkin denotes the forward kinematics of the robot and the position vectors of the singular point proximity axis coordinate systems relative to the base coordinate system are obtained together with the rotation matrices;
obtaining, from the third rotation matrices, the coordinates of the Z-axis unit direction vector of each joint n expressed in the base coordinate system;
obtaining, based on these coordinates, the first angular velocity partial matrix of the Jacobian matrix of the robot tool coordinate system T according to a fifth calculation formula.
according to a second aspect of the embodiments of the invention, an embodiment of the application provides a robot wrist joint singular avoidance system, the system including:
a sampling detection module, configured to sample and detect the joint trajectory command of the robot moving in Cartesian space according to a preset period;
a singular point look-ahead detection module, configured to predict, using the sampling detection result, the motion pose of the robot over the next several periods at the current motion speed, and to judge, based on the prediction result, whether a wrist singular point exists ahead of the robot;
a pose correction module, configured to generate an angular velocity deflection pose if it is detected that a wrist singular point exists ahead of the robot;
a pose fusion module, configured to fuse the angular velocity deflection pose, as a correction amount, with the original target pose to generate a new target pose;
an inverse solution module, configured to solve the new target pose, generate a new joint trajectory signal and send it to the lower computer for execution.
Compared with the prior art, the robot wrist joint singular avoidance method and system provided by the embodiments of the application can be applied to scenarios that require a large working space and a trajectory that strictly complies with position and timing requirements, such as welding of large vehicles and dispensing around buildings. The embodiments of the invention can execute position and speed commands accurately while keeping the attitude deviation within a set range, so that the robot passes the wrist singular point smoothly without decelerating, ensuring that the process is completed with high quality. For commercial robots equipped with an inverse kinematics module, the embodiments of the invention can directly correct the problematic discrete trajectory points one by one; the modified trajectory corresponds to the original trajectory point by point, the correction occurs only near the singular point, and the trajectory away from the singular point is unaffected. The embodiments of the invention can therefore be used as an add-on module on all isomorphic commercial robots and have good universality.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings required in the description of the embodiments or the prior art are briefly introduced below. It is apparent to those of ordinary skill in the art that the drawings in the following description are only exemplary and that other drawings can be derived from them without inventive effort.
The structures, proportions, sizes, etc. shown in this specification are intended only for illustration and description and are not intended to limit the scope of the invention, which is defined by the claims; any structural modification, change of proportion or adjustment of size that does not affect the effect achievable by the invention shall fall within the scope of the technical content disclosed herein.
FIG. 1 is a schematic structural diagram of a robot wrist singular avoidance system according to an embodiment of the present invention;
fig. 2 is a schematic flow chart of a robot wrist singular avoidance method according to an embodiment of the present invention;
Fig. 3 is a schematic flow chart of generating the angular velocity deflection pose in the robot wrist joint singular avoidance method according to an embodiment of the present invention.
Detailed Description
Other advantages and benefits of the present invention will become apparent to those skilled in the art from the following detailed description, which describes, by way of illustration, certain specific embodiments but not all embodiments. All other embodiments obtained by those skilled in the art based on the embodiments of the invention without inventive effort fall within the scope of the invention.
The idea of the embodiments of the invention is as follows: while the robot moves in space, the Cartesian-space command trajectory signal of the robot is sampled at a fixed period. After sampling, a look-ahead algorithm computes the motion pose of the robot over the next several periods under the assumption that it keeps moving at the current speed. If a singular robot pose is detected ahead, a correction amount is generated for the current command according to a defined rule, and the correction is fused with the original target pose to generate a new target pose. Finally, the fused information is sent to the original controller for solving, and the new joint trajectory is generated and sent to the lower computer for execution.
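To make this control flow concrete, the following Python sketch shows one control period of the scheme. It only illustrates the data flow; the callables passed in (look-ahead detector, pose corrector, inverse kinematics) and the left-multiplied fusion are placeholders and assumptions, not interfaces defined by the patent.

```python
import numpy as np

def correction_cycle(p_cmd, R_cmd, lookahead_detect, make_deflection, inv_kin):
    """One control period: detect an upcoming wrist singularity, optionally
    deflect the commanded orientation, then inverse-solve the target.
    The three callables stand in for the modules described in this document."""
    if lookahead_detect(p_cmd, R_cmd):
        R_def = make_deflection(p_cmd, R_cmd)  # small corrective rotation
        R_cmd = R_def @ R_cmd                  # assumed left-multiplied fusion
    return inv_kin(p_cmd, R_cmd)               # joint command for the lower computer
```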
As shown in Fig. 1, an embodiment of the application provides a robot wrist joint singular avoidance system, including: a sampling detection module 1, a singular point look-ahead detection module 2, a pose correction module 3, a pose fusion module 4 and an inverse solution module 5.
Specifically, the sampling detection module 1 is configured to sample and detect the joint trajectory command of the robot moving in Cartesian space according to a preset period; the singular point look-ahead detection module 2 is configured to predict, using the sampling detection result, the motion pose of the robot over the next several periods at the current motion speed, and to judge, based on the prediction result, whether a wrist singular point exists ahead of the robot; the pose correction module 3 is configured to generate an angular velocity deflection pose if a wrist singular point exists ahead of the robot; the pose fusion module 4 is configured to fuse the angular velocity deflection pose, as a correction amount, with the original target pose to generate a new target pose; the inverse solution module 5 is configured to solve the new target pose, generate a new joint trajectory signal and send it to the lower computer for execution. The inverse solution module 5 is also configured to solve the original target pose, generate a joint trajectory signal and send it to the lower computer for execution if it is detected that no wrist singular point exists ahead of the robot.
With the robot wrist joint singular avoidance system provided by the embodiment of the application, the pose is corrected online near the singular point, so the robot avoids the singular point at the cost of a bounded pose deviation. Compared with the prior art, the deflection error and the deflection range are fully adjustable, and the generated trajectory is first-order continuous at the velocity level and can be used directly for joint control execution.
Corresponding to the robot wrist joint singular avoidance system disclosed above, an embodiment of the invention also discloses a robot wrist joint singular avoidance method. The robot wrist joint singular avoidance method is described in detail below with reference to the above description of the system.
As shown in Fig. 2, the specific steps of the robot wrist joint singular avoidance method provided by the embodiment of the application are described in detail below.
Step S11: the sampling detection module 1 samples and detects the joint trajectory command of the robot moving in Cartesian space according to a preset period.
In the embodiment of the invention, the sampling detection algorithm runs in a separate thread independent of the main controller thread, and its sampling frequency is the same as that of the main controller.
Step S12: the singular point look-ahead detection module 2 predicts, using the sampling detection result, the motion pose of the robot over the next several periods at the current motion speed.
Step S13: the singular point look-ahead detection module 2 judges, based on the prediction result, whether a wrist singular point exists ahead of the robot.
Steps S12 and S13 constitute the look-ahead detection. Their main purpose is to determine whether the robot, ahead of it in the direction of motion, is tending toward the wrist joint singular point, that is, whether advancing at the current speed will bring it into the wrist singular region. The singular point look-ahead detection module 2 uses two conditions to judge that the robot is entering the wrist singular region.
Specifically, in the embodiment of the invention, judging whether a wrist singular point exists ahead of the robot based on the prediction result includes the following steps: obtaining, based on the prediction result, the fifth joint space angle q5(t) of the robot and the first-order difference Δq5(t) of the fifth joint space angle; judging whether the fifth joint space angle q5(t) and the first-order difference Δq5(t) satisfy a first preset condition, the first preset condition being defined in terms of a preset first inner-ring threshold average and a preset second inner-ring threshold average; if the first preset condition is satisfied, judging whether the fifth joint space angle q5(t) satisfies a second preset condition, the second preset condition being defined in terms of a preset outer-ring threshold average; if the second preset condition is satisfied, a wrist singular point exists ahead of the robot. If the first preset condition or the second preset condition is not satisfied, no singular point exists ahead of the robot.
Step S14: if no wrist singular point exists ahead of the robot, the inverse solution module 5 solves the original target pose, generates a joint trajectory signal and sends it to the lower computer for execution.
Step S15: if it is detected that a wrist singular point exists ahead of the robot, the pose correction module 3 generates an angular velocity deflection pose.
The process of generating the angular velocity deflection pose disclosed in the embodiment of the invention is described in detail below.
Step S21: calculating, using the sampling detection result, the look-ahead starting position command linear velocity v_T(t_s) and the look-ahead starting position command angular velocity ω_T(t_s) of the tool coordinate system T expressed in base coordinates at the robot look-ahead starting time t_s.
Further, step S21 specifically includes: obtaining, from the sampling detection result, the first position vector p_T(t_s) and the second position vector p_T(t_s-1) of the tool coordinate system T expressed in base coordinates at the look-ahead starting time t_s and at time t_s-1, and the first rotation matrix R_T(t_s) and the second rotation matrix R_T(t_s-1) of the tool coordinate system T expressed in base coordinates at the look-ahead starting time t_s and at time t_s-1; calculating, using the first position vector p_T(t_s) and the second position vector p_T(t_s-1), the look-ahead starting position command linear velocity v_T(t_s), the first calculation formula being:
v_T(t_s) = (p_T(t_s) - p_T(t_s-1)) / ΔT
where ΔT is the sampling preset period; and calculating, using the first rotation matrix R_T(t_s) and the second rotation matrix R_T(t_s-1), the look-ahead starting position command angular velocity ω_T(t_s), the second calculation formula being:
ω_T(t_s) = V(log(R_T(t_s) · R_T(t_s-1)′)) / ΔT
where log denotes the matrix logarithm, ′ denotes the transpose, V denotes vectorization of the skew-symmetric angular velocity matrix, and ΔT is the sampling preset period.
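A minimal numerical sketch of step S21 with numpy, assuming each sample provides the tool position vector and rotation matrix in base coordinates. The SO(3) logarithm is written out directly rather than taken from any particular robotics library, and the composition R_T(t_s) · R_T(t_s-1)′ follows the reconstruction above.

```python
import numpy as np

def so3_log(R):
    """Rotation vector (vee of the matrix logarithm) of a rotation matrix.
    Valid away from rotations of exactly pi, which is enough for one sample step."""
    cos_theta = np.clip((np.trace(R) - 1.0) / 2.0, -1.0, 1.0)
    theta = np.arccos(cos_theta)
    if theta < 1e-9:                      # numerically the identity rotation
        return np.zeros(3)
    axis = np.array([R[2, 1] - R[1, 2],
                     R[0, 2] - R[2, 0],
                     R[1, 0] - R[0, 1]]) / (2.0 * np.sin(theta))
    return theta * axis

def lookahead_command_velocities(p_now, p_prev, R_now, R_prev, dT):
    """Command linear and angular velocity of the tool frame in base coordinates."""
    v = (p_now - p_prev) / dT                 # v_T(t_s)
    omega = so3_log(R_now @ R_prev.T) / dT    # omega_T(t_s), assumed composition order
    return v, omega
```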
Step S22: calculating the look-ahead starting position s(t_s) from the preset look-ahead time t_forward and the first position vector p_T(t_s), and predicting the look-ahead end position s(t_e) using the look-ahead starting position command linear velocity v_T(t_s); the first prediction formula of the look-ahead end position s(t_e) is:
s(t_s) = 0
s(t_e) = s(t_s) + t_forward · ||v_T(t_s)||_2
where t_forward represents the look-ahead time.
step S23: using the look-ahead starting position s (t s ) And the look-ahead end position s (t e ) Searching for a singular point proximity position s (T) of the robot tool coordinate system T closest to the odd-dislocation in the travel path using a dichotomy final ) And a singular point proximity pose, the singular point proximity pose comprising: singular point proximity position vector p T (s(t final ) And a singular point adjacent position rotation matrix R T (s(t final ))。
Further, the singular point vicinity s (t final ) And the singular point adjacent position posture specifically comprises the following steps: using the look-ahead starting position s (t s ) And the look-ahead end position s (t e ) Predicting the intermediate position s (t h ) And an intermediate position posture, the intermediate position s (t h ) The second predictive formula of (2) is:
s(t h )=0.5(s(t s )+s(t e ));
wherein the intermediate position gesture comprises: intermediate position vector p T (s(t h ) And a neutral position rotation matrix R T (s(t h ) A third predictive formula for the mid-position pose is:
p T (s(t h ))=p T (s(t s ))+0.5t step v T (t s )
p T (s(t s ))=p T (t s )
R T (s(t h ))=exp(0.5t step ω T (t s ))·R T (s(t s ))
R T (s(t s ))=R T (t s )
t step =[s(t h )-s(t s )]/|v T (t s )| 2
wherein t is step To command the linear velocity v according to the look-ahead starting position T (t s ) From a prospective starting position s (t s ) To an intermediate position s (t) h ) Time required, |v T (t s )| 2 Indicating a look-ahead start position command linear velocity v T (t s ) Is a binary norm of (2); to look from the look-ahead starting position s (t s ) To the intermediate position s (t h ) As a first path to pass from the intermediate position s (t h ) To the end-of-look position s (t e ) As a second path, performing a bipartite operation, circularly executing the second prediction formula, and predicting to obtain a first intermediate position s (t h1 ) A second intermediate position s (t h2 ) The method comprises the steps of carrying out a first treatment on the surface of the Wherein the first look-ahead position s (t s1 ) And a first look-ahead end position s (t e1 ) Respectively the look-ahead starting positions s (t s ) And the intermediate position s (t h ) A second look-ahead position s (t s2 ) And a second look-ahead end position s (t e2 ) Respectively the intermediate positions s (t h ) And the look-ahead end position s (t e ) The method comprises the steps of carrying out a first treatment on the surface of the For the first look-ahead starting position s (t s1 ) Said first look-ahead end position s (t e1 ) Said first intermediate position s (t h1 ) The corresponding position vectors and the rotation matrix are subjected to inverse solution to obtain respective corresponding first joint space postures, wherein the first joint space postures comprise: the first starting position joint space angle q (s (t) s1 )) 5 The first distal joint space angle q (s (t) e1 )) 5 The first neutral position joint space angle q (s (t) h1 )) 5 The method comprises the steps of carrying out a first treatment on the surface of the For the second look-ahead starting position s (t s2 ) Said second look-ahead end position s (t e2 ) Said second intermediate position s (t h2 ) The corresponding position vectors and the rotation matrix are subjected to inverse solution to obtain respective corresponding second joint space postures, wherein the second joint space postures comprise: the second starting position joint space angle q (s (t) s2 )) 5 The second distal joint space angle q (s (t) e2 )) 5 The second intermediate position joint space angle q (s (t) h2 )) 5 The method comprises the steps of carrying out a first treatment on the surface of the Judging whether the first joint space posture and the second joint space posture meet a third preset condition or not respectively, wherein the third preset condition is as follows:
q(s(t hx )) 5 <min(q(s(t sx )) 5 ,q(s(t ex )) 5 )
wherein x=1 or 2; if the first joint spatial pose meets a third preset condition, a third prediction formula is executed to predict a corresponding first intermediate position pose, wherein the first intermediate position pose comprises: first intermediate position vector p T (s(t h1 ) And a first intermediate position rotation matrix R T (s(t h1 ) A) is provided; and uses the first look-ahead position s (t s1 ) Said first look-ahead end position s (t e1 ) Said first intermediate position s (t h1 ) Respectively updating the look-ahead starting positions s (t s ) Said look-ahead end position s (t e ) Said intermediate position s (t h ) Continuously performing the bisection operation of the next round; if the second joint spatial pose meets a third preset condition, a third prediction formula is executed to predict a corresponding second intermediate position pose, wherein the second intermediate position pose comprises: second intermediate position vector p T (s(t h2 ) And a second intermediate position rotation matrix R T (s(t h2 ) A) is provided; and uses the second look-ahead position s (t s2 ) Said second look-ahead end position s (t e2 ) Said second intermediate position s (t h2 ) Respectively updating the look-ahead starting positions s (t s ) Said look-ahead end position s (t e ) Said intermediate position s (t h ) Continuing the next round of twoPerforming sub-operations; after each round of binary operation is performed, the updated look-ahead starting position s (t s ) And the look-ahead end position s (t e ) A difference between them; judging whether the difference value is smaller than a preset interval value or not; if the difference is smaller than a preset interval value, stopping the two-way operation cycle, and updating the intermediate position s (t h ) And the corresponding intermediate position poses are recorded as singular point adjacent positions s (t final ) And a singular point proximity pose, the singular point proximity pose comprising: singular point proximity position vector p T (s(t dinal ) And a singular point adjacent position rotation matrix R T (s(t final ))。
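The sketch below mirrors the bisection search of step S23 in simplified form. The callable q5_of_s stands in for the predict-pose-then-inverse-solve chain and returns |q5| at arc-length position s along the look-ahead segment; the fallback branch for the case where neither half shows an interior dip is an assumption.

```python
def find_singularity_proximity(q5_of_s, s_start, s_end, tol=1e-3, max_iter=60):
    """Bisection-style search (step S23, simplified) for the arc-length position
    at which |q5| is smallest. q5_of_s(s) is a placeholder for predicting the
    pose at s and inverse-solving it, as described above."""
    s_lo, s_hi = s_start, s_end
    for _ in range(max_iter):
        if s_hi - s_lo < tol:              # preset interval value reached
            break
        s_mid = 0.5 * (s_lo + s_hi)
        h1 = 0.5 * (s_lo + s_mid)          # interior point of the first half
        h2 = 0.5 * (s_mid + s_hi)          # interior point of the second half
        if q5_of_s(h1) < min(q5_of_s(s_lo), q5_of_s(s_mid)):
            s_lo, s_hi = s_lo, s_mid       # minimum lies in the first half
        elif q5_of_s(h2) < min(q5_of_s(s_mid), q5_of_s(s_hi)):
            s_lo, s_hi = s_mid, s_hi       # minimum lies in the second half
        else:
            s_lo, s_hi = h1, h2            # assumed fallback: shrink both ends
    return 0.5 * (s_lo + s_hi)

# Example with a synthetic q5 profile whose minimum is at s = 0.4:
# find_singularity_proximity(lambda s: abs(s - 0.4), 0.0, 1.0)  ->  about 0.4
```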
Step S24: calculating the angular velocity deflection axis of the robot tool coordinate system T using the singular point proximity position vector p_T(s(t_final)) and the singular point proximity rotation matrix R_T(s(t_final)).
Further, step S24 specifically includes the following steps: calculating the singular point proximity joint space angle q_final of the robot using the singular point proximity position vector p_T(s(t_final)) and the singular point proximity rotation matrix R_T(s(t_final)), the third calculation formula being:
q_final = invkin(p_T(s(t_final)), R_T(s(t_final)))
where invkin denotes the inverse kinematics solution of the robot.
Based on the singular point proximity joint space angle q_final, the first angular velocity partial matrix of the Jacobian matrix of the robot tool coordinate system T is calculated. Specifically, the third rotation matrices of the joint axis coordinate systems at the singular point proximity position relative to the base coordinate system are calculated from q_final according to a fourth calculation formula, in which finkin denotes the forward kinematics of the robot and the position vectors of the singular point proximity axis coordinate systems relative to the base coordinate system are obtained together with the rotation matrices. From the third rotation matrices, the coordinates of the Z-axis unit direction vector of each joint n expressed in the base coordinate system are obtained, and from these coordinates the first angular velocity partial matrix of the Jacobian matrix of the robot tool coordinate system T is assembled according to a fifth calculation formula.
The first angular velocity partial matrix is then corrected with the singular point proximity rotation matrix R_T(s(t_final)) to obtain the second angular velocity partial matrix. The values in row 4, column 1 and row 4, column 2 of the second angular velocity partial matrix are taken as the first element j_41 and the second element j_42, and the angular velocity deflection axis of the robot tool coordinate system T is calculated from j_41 and j_42 according to a sixth calculation formula, in which [j_42, -j_41]^T denotes the transpose of [j_42, -j_41].
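The sixth calculation formula is not written out above; the sketch below encodes only what the text states, namely that the deflection axis is built from the elements j_41 and j_42 of the corrected angular velocity partial matrix as the vector [j_42, -j_41]. Completing it to a unit 3-vector in the tool XY plane is an assumption.

```python
import numpy as np

def deflection_axis(J_omega_tool):
    """Deflection axis from the second (tool-frame corrected) angular velocity
    partial matrix, indexed as in the text: row 4, columns 1 and 2 (1-based),
    i.e. J[3, 0] and J[3, 1] with 0-based numpy indexing."""
    j41, j42 = J_omega_tool[3, 0], J_omega_tool[3, 1]
    axis = np.array([j42, -j41, 0.0])      # assumed embedding in the tool XY plane
    norm = np.linalg.norm(axis)
    if norm < 1e-12:
        raise ValueError("deflection axis is undefined for this configuration")
    return axis / norm
```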
Step S25: calculating the deflection angle θ(q_final) based on the look-ahead starting position command linear velocity v_T(t_s) and the fifth joint space angle q5(t), according to a seventh calculation formula parameterized by a first preset parameter c1, a second preset parameter c2 and a third preset parameter c3.
The deflection angle θ(q_final) is obtained from the seventh calculation formula; it is proportional to the look-ahead starting position command linear velocity v_T(t_s) and inversely proportional to the current fifth joint space angle q5(t) of the robot. The behavior of the robot around the singular point can be tuned by adjusting c1, c2 and c3.
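The exact expression of the seventh calculation formula is not written out above. The sketch below keeps only the stated relationships, that the deflection angle grows with the commanded tool speed and shrinks as |q5| grows; the functional form, the default values and the exact roles of c1, c2 and c3 are assumptions.

```python
import numpy as np

def deflection_angle(v_ts, q5, c1=0.05, c2=1.0, c3=0.02):
    """Assumed form of the deflection angle theta(q_final): proportional to
    ||v_T(t_s)|| and inversely related to |q5|; c1, c2, c3 tune the behaviour
    of the robot around the singular point."""
    return c1 * np.linalg.norm(v_ts) / (c2 * abs(q5) + c3)
```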
Step S26: calculating the angular velocity deflection pose from the angular velocity deflection axis and the deflection angle θ(q_final) using the Rodrigues formula:
R = I + sin θ · [n]× + (1 − cos θ) · [n]× [n]×
where I is the identity matrix, θ is the deflection angle θ(q_final), n is the angular velocity deflection axis and [n]× is its skew-symmetric matrix.
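For reference, a direct numpy implementation of the Rodrigues formula used in step S26; the function and argument names are ours, the formula itself is the standard one.

```python
import numpy as np

def rodrigues(axis, theta):
    """Rotation matrix about a unit axis by angle theta:
    R = I + sin(theta) * K + (1 - cos(theta)) * K @ K, with K = [axis]x."""
    kx, ky, kz = axis
    K = np.array([[0.0, -kz,  ky],
                  [ kz, 0.0, -kx],
                  [-ky,  kx, 0.0]])
    return np.eye(3) + np.sin(theta) * K + (1.0 - np.cos(theta)) * (K @ K)
```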
In the embodiment of the invention, the pose correction module 3 is an online trajectory corrector whose input depends only on the current command and not on globally pre-planned information, so the embodiment of the invention can also be applied to manual teaching operation in Cartesian space.
Step S16: the pose fusion module 4 fuses the angular velocity deflection pose, as a correction amount, with the original target pose to generate the new target pose.
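The exact fusion formula is not written out above; the sketch below assumes the deflection pose is applied as a left-multiplied (base-frame) correction to the original target orientation.

```python
import numpy as np

def fuse_target_pose(R_deflect, R_target):
    """Assumed fusion of step S16: new target orientation = deflection pose
    applied to the original target orientation (multiplication order assumed)."""
    return R_deflect @ R_target
```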
step S17: the new target gesture is processed by the inverse solution module 5Solving, generating a new joint track signal and sending the new joint track signal to a lower computer for execution.
Compared with the prior art, the robot wrist joint singular avoidance method and system provided by the embodiments of the application can be applied to scenarios that require a large working space and a trajectory that strictly complies with position and timing requirements, such as welding of large vehicles and dispensing around buildings. The embodiments of the invention can execute position and speed commands accurately while keeping the attitude deviation within a set range, so that the robot passes the wrist singular point smoothly without decelerating, ensuring that the process is completed with high quality. For commercial robots equipped with an inverse kinematics module, the embodiments of the invention can directly correct the problematic discrete trajectory points one by one; the modified trajectory corresponds to the original trajectory point by point, the correction occurs only near the singular point, and the trajectory away from the singular point is unaffected. The embodiments of the invention can therefore be used as an add-on module on all isomorphic commercial robots and have good universality.
While the invention has been described in detail in the foregoing general description and specific examples, it will be apparent to those skilled in the art that modifications and improvements can be made thereto. Accordingly, such modifications or improvements may be made without departing from the spirit of the invention and are intended to be within the scope of the invention as claimed.

Claims (9)

1. A robot wrist joint singular avoidance method, the method comprising:
sampling and detecting the joint trajectory command of the robot moving in Cartesian space according to a preset period;
predicting, using the sampling detection result, the motion pose of the robot over the next several periods at the current motion speed;
judging, based on the prediction result, whether a wrist singular point exists ahead of the robot;
if it is detected that a wrist singular point exists ahead of the robot, generating an angular velocity deflection pose;
fusing the angular velocity deflection pose, as a correction amount, with the original target pose to generate a new target pose;
solving the new target pose, generating a new joint trajectory signal and sending it to the lower computer for execution;
wherein generating the angular velocity deflection pose comprises:
calculating, using the sampling detection result, the look-ahead starting position command linear velocity v_T(t_s) and the look-ahead starting position command angular velocity ω_T(t_s) of the tool coordinate system T expressed in base coordinates at the robot look-ahead starting time t_s;
calculating the look-ahead starting position s(t_s) from a preset look-ahead time t_forward and a first position vector p_T(t_s), and predicting the look-ahead end position s(t_e) using the look-ahead starting position command linear velocity v_T(t_s), the first prediction formula of the look-ahead end position s(t_e) being:
s(t_s) = 0
s(t_e) = s(t_s) + t_forward · ||v_T(t_s)||_2
where t_forward represents the look-ahead time;
searching, using the look-ahead starting position s(t_s), the look-ahead end position s(t_e) and a bisection method, for the singular point proximity position s(t_final) at which the tool coordinate system T comes closest to the wrist singular point along the travel path, and for the singular point proximity pose, the singular point proximity pose comprising the singular point proximity position vector p_T(s(t_final)) and the singular point proximity rotation matrix R_T(s(t_final));
calculating the angular velocity deflection axis of the tool coordinate system T using the singular point proximity position vector p_T(s(t_final)) and the singular point proximity rotation matrix R_T(s(t_final));
calculating the deflection angle θ(q_final) based on the look-ahead starting position command linear velocity v_T(t_s) and a fifth joint space angle q5(t), according to a seventh calculation formula parameterized by a first preset parameter c1, a second preset parameter c2 and a third preset parameter c3;
calculating the angular velocity deflection pose from the angular velocity deflection axis and the deflection angle θ(q_final) using the Rodrigues formula:
R = I + sin θ · [n]× + (1 − cos θ) · [n]× [n]×
where I is the identity matrix, θ is the deflection angle θ(q_final), n is the angular velocity deflection axis and [n]× is its skew-symmetric matrix.
2. The robot wrist joint singular avoidance method as claimed in claim 1, the method further comprising:
if it is detected that no wrist singular point exists ahead of the robot, solving the original target pose, generating a joint trajectory signal and sending it to the lower computer for execution.
3. The robot wrist joint singular avoidance method of claim 1, wherein judging, based on the prediction result, whether a wrist singular point exists ahead of the robot comprises:
obtaining, based on the prediction result, the fifth joint space angle q5(t) of the robot and the first-order difference Δq5(t) of the fifth joint space angle;
judging whether the fifth joint space angle q5(t) and the first-order difference Δq5(t) satisfy a first preset condition, the first preset condition being defined in terms of a preset first inner-ring threshold average and a preset second inner-ring threshold average;
if the first preset condition is satisfied, judging whether the fifth joint space angle q5(t) satisfies a second preset condition, the second preset condition being defined in terms of a preset outer-ring threshold average;
if the second preset condition is satisfied, a wrist singular point exists ahead of the robot.
4. The robot wrist joint singular avoidance method of claim 1, wherein calculating, using the sampling detection result, the look-ahead starting position command linear velocity v_T(t_s) and the look-ahead starting position command angular velocity ω_T(t_s) of the tool coordinate system T expressed in base coordinates at the robot look-ahead starting time t_s comprises:
obtaining, from the sampling detection result, the first position vector p_T(t_s) and a second position vector p_T(t_s-1) of the tool coordinate system T expressed in base coordinates at the look-ahead starting time t_s and at time t_s-1, and a first rotation matrix R_T(t_s) and a second rotation matrix R_T(t_s-1) of the tool coordinate system T expressed in base coordinates at the look-ahead starting time t_s and at time t_s-1;
calculating, using the first position vector p_T(t_s) and the second position vector p_T(t_s-1), the look-ahead starting position command linear velocity v_T(t_s), the first calculation formula being:
v_T(t_s) = (p_T(t_s) - p_T(t_s-1)) / ΔT
where ΔT is the sampling preset period; and
calculating, using the first rotation matrix R_T(t_s) and the second rotation matrix R_T(t_s-1), the look-ahead starting position command angular velocity ω_T(t_s), the second calculation formula being:
ω_T(t_s) = V(log(R_T(t_s) · R_T(t_s-1)′)) / ΔT
where log denotes the matrix logarithm, ′ denotes the transpose, V denotes vectorization of the skew-symmetric angular velocity matrix, and ΔT is the sampling preset period.
5. A robot wrist singular avoidance method according to claim 4, characterized in that the look-ahead starting position s (t s ) And the look-ahead end position s (t e ) Searching for a singular point proximity position s (T) of the tool coordinate system T closest to the odd-dislocation in the travel path using a dichotomy final ) And a singular point proximity position pose, comprising:
by using the said Look-ahead starting position s (t) s ) And the look-ahead end position s (t e ) Predicting the intermediate position s (t h ) And an intermediate position posture, the intermediate position s (t h ) The second predictive formula of (2) is:
s(t h )=0.5(s(t s )+s(t e ));
to look from the look-ahead starting position s (t s ) To the intermediate position s (t h ) As a first path to pass from the intermediate position s (t h ) To the end-of-look position s (t e ) As a second path, performing a bipartite operation, circularly executing the second prediction formula, and predicting to obtain a first intermediate position s (t h1 ) A second intermediate position s (t h2 ) The method comprises the steps of carrying out a first treatment on the surface of the Wherein the first look-ahead position s (t s1 ) And a first look-ahead end position s (t e1 ) Respectively the look-ahead starting positions s (t s ) And the intermediate position s (t h ) A second look-ahead position s (t s2 ) And a second look-ahead end position s (t e2 ) Respectively the intermediate positions s (t h ) And the look-ahead end position s (t e );
For the first look-ahead starting position s (t s1 ) Said first look-ahead end position s (t e1 ) Said first intermediate position s (t h1 ) The corresponding position vectors and the rotation matrix are subjected to inverse solution to obtain respective corresponding first joint space postures, wherein the first joint space postures comprise: the first starting position joint space angle q (s (t) s1 )) 5 The first distal joint space angle q (s (t) e1 )) 5 The first neutral position joint space angle q (s (t) h1 )) 5
For the second look-ahead starting position s (t s2 ) Said second look-ahead end position s (t e2 ) Said second intermediate position s (t h2 ) The corresponding position vectors and the rotation matrix are subjected to inverse solution to obtain respective corresponding second joint space postures, wherein the second joint space postures comprise: the second starting position joint space angle q (s (t) s2 )) 5 The second distal joint space angle q (s (t) e2 )) 5 The second intermediate position joint space angle q (s (t) h2 )) 5
determining whether the first joint space posture and the second joint space posture respectively satisfy a third preset condition, where the third preset condition is:

q(s(t_hx))_5 < min( q(s(t_sx))_5, q(s(t_ex))_5 )

wherein x = 1 or 2;
if the first joint space posture satisfies the third preset condition, executing a third prediction formula to predict the corresponding first intermediate position pose, where the first intermediate position pose comprises: the first intermediate position vector p_T(s(t_h1)) and the first intermediate position rotation matrix R_T(s(t_h1)); and using the first look-ahead starting position s(t_s1), the first look-ahead end position s(t_e1) and the first intermediate position s(t_h1) to update the look-ahead starting position s(t_s), the look-ahead end position s(t_e) and the intermediate position s(t_h) respectively, and continuing with the next round of the bisection operation;

if the second joint space posture satisfies the third preset condition, executing the third prediction formula to predict the corresponding second intermediate position pose, where the second intermediate position pose comprises: the second intermediate position vector p_T(s(t_h2)) and the second intermediate position rotation matrix R_T(s(t_h2)); and using the second look-ahead starting position s(t_s2), the second look-ahead end position s(t_e2) and the second intermediate position s(t_h2) to update the look-ahead starting position s(t_s), the look-ahead end position s(t_e) and the intermediate position s(t_h) respectively, and continuing with the next round of the bisection operation;
after each round of the bisection operation, calculating the difference between the updated look-ahead starting position s(t_s) and the updated look-ahead end position s(t_e);

determining whether the difference is smaller than a preset interval value;

if the difference is smaller than the preset interval value, stopping the bisection loop, and recording the updated intermediate position s(t_h) and the corresponding intermediate position pose respectively as the singular point proximity position and the singular point proximity position pose, where the singular point proximity position pose comprises: the singular point proximity position vector p_T(s(t_final)) and the singular point proximity position rotation matrix R_T(s(t_final)).
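As an illustration of the bisection search in claim 5, the following minimal Python sketch narrows the path interval toward the wrist singularity. The callbacks `pose_at` and `ik_joint5` are hypothetical placeholders for the pose prediction and inverse kinematics of the claim, and the tie-breaking when both halves qualify is an assumption.

```python
def find_singularity_proximity(s_start, s_end, pose_at, ik_joint5, tol=1e-3):
    """Bisection search sketch for claim 5.  `pose_at(s)` maps a path
    parameter s to the predicted tool pose (p, R); `ik_joint5(p, R)` returns
    the 5th joint angle of the inverse-kinematics solution for that pose."""
    while (s_end - s_start) > tol:                   # preset interval value
        s_mid = 0.5 * (s_start + s_end)              # second prediction formula
        chosen = None
        # Split into a first path [s_start, s_mid] and a second path [s_mid, s_end]
        for a, b in ((s_start, s_mid), (s_mid, s_end)):
            m = 0.5 * (a + b)
            q5 = [ik_joint5(*pose_at(s)) for s in (a, m, b)]
            # Third preset condition: the midpoint's 5th joint angle is below
            # both endpoints, i.e. this half contains the wrist singularity.
            if q5[1] < min(q5[0], q5[2]):
                chosen = (a, b)
                break
        if chosen is None:                           # neither half qualifies
            break
        s_start, s_end = chosen
    s_final = 0.5 * (s_start + s_end)
    return s_final, pose_at(s_final)
```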
6. The robot wrist singular avoidance method according to claim 5, wherein the intermediate position pose comprises the intermediate position vector p_T(s(t_h)) and the intermediate position rotation matrix R_T(s(t_h)), and the third prediction formula for the intermediate position pose is:

p_T(s(t_h)) = p_T(s(t_s)) + 0.5 t_step v_T(t_s)
p_T(s(t_s)) = p_T(t_s)
R_T(s(t_h)) = exp(0.5 t_step ω_T(t_s)) · R_T(s(t_s))
R_T(s(t_s)) = R_T(t_s)
t_step = [s(t_h) - s(t_s)] / ||v_T(t_s)||_2

where t_step is the time required to travel from the look-ahead starting position s(t_s) to the intermediate position s(t_h) at the look-ahead start position command linear velocity v_T(t_s), and ||v_T(t_s)||_2 denotes the 2-norm of the look-ahead start position command linear velocity v_T(t_s).
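The third prediction formula extrapolates the tool pose under a constant commanded twist: the position advances linearly, and the orientation advances by the exponential map of the angular velocity. A minimal sketch follows; the function name and the non-zero-velocity assumption are mine, not the patent's.

```python
import numpy as np
from scipy.spatial.transform import Rotation

def predict_intermediate_pose(p_s, R_s, v, omega, s_s, s_h):
    """Sketch of claim 6: extrapolate the tool pose from the look-ahead start
    to the intermediate path position under a constant commanded twist
    (v, omega), assuming a non-zero linear velocity."""
    t_step = (s_h - s_s) / np.linalg.norm(v)        # travel time along the path
    p_h = p_s + 0.5 * t_step * v                    # linear position extrapolation
    # exp(0.5 * t_step * omega): rotation-vector exponential map applied to R_s
    R_h = Rotation.from_rotvec(0.5 * t_step * omega).as_matrix() @ R_s
    return p_h, R_h
```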
7. The robot wrist singular avoidance method according to claim 6, wherein using the singular point proximity position vector p_T(s(t_final)) and the singular point proximity position rotation matrix R_T(s(t_final)) to calculate the angular velocity deflection axis of the tool coordinate system T comprises:
using the singular point proximity position vector p_T(s(t_final)) and the singular point proximity position rotation matrix R_T(s(t_final)), calculating the joint space angle q_final at the singular point proximity position of the robot, where the third calculation formula for the joint space angle q_final at the singular point proximity position is:

q_final = invkin( p_T(s(t_final)), R_T(s(t_final)) );

wherein invkin denotes the inverse kinematics of the robot;
based on the singular point adjacent position joint space angle q final Calculating a first angular velocity partial matrix of a jacobian matrix of the robot tool coordinate system T
Rotating the matrix by utilizing the singular point adjacent positionFor the first angular velocity partial matrixCorrecting to obtain a second angular velocity partial matrix +.>The second angular velocity partial matrixThe correction formula of (2) is as follows:
taking the values in row 4, column 1 and row 4, column 2 of the second angular velocity partial matrix to obtain a first element j_41 and a second element j_42;

using the first element j_41 and the second element j_42, calculating the angular velocity deflection axis of the tool coordinate system T according to a sixth calculation formula expressed in terms of [j_42, -j_41]', where [j_42, -j_41]' denotes the transpose of [j_42, -j_41].
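The sixth calculation formula is stated in the extracted claim only through its dependence on [j_42, -j_41]'. The sketch below therefore reads j_41 and j_42 from the corrected Jacobian and, as an assumption, normalizes [j_42, -j_41] to obtain a candidate deflection direction; how this planar direction is embedded as a spatial axis is not specified in the extracted text.

```python
import numpy as np

def deflection_axis_candidate(J_corrected):
    """Claim 7 sketch: read the row-4, column-1 and row-4, column-2 elements
    of the (corrected) Jacobian and form the direction [j42, -j41].  The final
    normalization is an assumption; the patent gives the sixth formula only
    through this transpose relation."""
    j41 = J_corrected[3, 0]   # row 4, column 1 (1-based indexing in the claim)
    j42 = J_corrected[3, 1]   # row 4, column 2
    axis = np.array([j42, -j41])
    norm = np.linalg.norm(axis)
    return axis / norm if norm > 0.0 else axis
```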
8. The robot wrist singular avoidance method according to claim 5, wherein calculating the first angular velocity partial matrix of the Jacobian matrix of the tool coordinate system T based on the joint space angle q_final at the singular point proximity position comprises:
based on the joint space angle q_final at the singular point proximity position, calculating, through the forward kinematics of the robot, the third rotation matrix R_n(q_final) of the axis coordinate system of each joint n at the singular point proximity position with respect to the base coordinate system, where the fourth calculation formula for the third rotation matrix is:

[ R_n(q_final), p_n(q_final) ] = forkin( q_final )

wherein forkin denotes the forward kinematics of the robot, and p_n(q_final) is the position vector of the axis coordinate system of joint n at the singular point proximity position with respect to the base coordinate system;
according to the third rotation matrix R_n(q_final), obtaining the coordinate z_n of the Z-axis unit direction vector of each joint n expressed in the base coordinate system;

based on the coordinates z_n, obtaining the first angular velocity partial matrix of the Jacobian matrix of the tool coordinate system T, where, according to the fifth calculation formula, the n-th column of the first angular velocity partial matrix is the coordinate z_n.
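A minimal sketch of the construction in claim 8 follows: the angular velocity part of the geometric Jacobian of an all-revolute arm is assembled column by column from the joint Z axes expressed in the base frame. The function name and the representation of the forward-kinematics output as a list of rotation matrices are assumptions.

```python
import numpy as np

def angular_velocity_partial_matrix(joint_rotations):
    """Claim 8 sketch: angular velocity partial matrix of the geometric
    Jacobian of an all-revolute arm.  `joint_rotations` is a hypothetical list
    of 3x3 rotation matrices R_n of each joint axis frame with respect to the
    base, as produced by the robot's forward kinematics (forkin)."""
    # z_n: Z-axis unit direction vector of joint n expressed in the base frame
    z_axes = [R[:, 2] for R in joint_rotations]
    # Fifth calculation formula: the n-th column of the matrix is z_n
    return np.column_stack(z_axes)
```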
9. A robot wrist singular avoidance system, the system comprising:

a sampling detection module, configured to sample and detect the joint trajectory commands at a preset period while the robot moves in Cartesian space;

a singular point look-ahead detection module, configured to predict, from the sampling detection result and the current motion speed, the motion postures of the robot over the next several periods, and to determine, based on the prediction result, whether a wrist singular point lies ahead of the robot;

a posture correction module, configured to generate an angular velocity deflection posture if a wrist singular point is detected ahead of the robot;

a posture fusion module, configured to fuse the angular velocity deflection posture, as a correction amount, with the original target posture to generate a new target posture;

an inverse solution module, configured to perform inverse kinematics on the new target posture, generate a new joint trajectory signal, and send it to the lower-level controller for execution;
wherein generating the angular velocity deflection posture comprises:

calculating, from the sampling detection result, the look-ahead start position command linear velocity v_T(t_s) and the look-ahead start position command angular velocity ω_T(t_s) of the tool coordinate system T at the robot look-ahead start instant t_s, expressed in the base coordinate system;
presetting a look-ahead time t_forward, calculating the look-ahead starting position s(t_s) from the first position vector p_T(t_s), and predicting the look-ahead end position s(t_e) using the look-ahead start position command linear velocity v_T(t_s), where the first prediction formula for the look-ahead end position s(t_e) is:

s(t_s) = 0
s(t_e) = s(t_s) + t_forward ||v_T(t_s)||_2

wherein t_forward denotes the look-ahead time;
using the look-ahead starting position s(t_s) and the look-ahead end position s(t_e), searching by bisection for the singular point proximity position s(t_final) of the tool coordinate system T closest to the wrist singular point on the travel path and for the singular point proximity position pose, the singular point proximity position pose comprising: the singular point proximity position vector p_T(s(t_final)) and the singular point proximity position rotation matrix R_T(s(t_final));

using the singular point proximity position vector p_T(s(t_final)) and the singular point proximity position rotation matrix R_T(s(t_final)), calculating the angular velocity deflection axis of the tool coordinate system T;
based on the look-ahead start position command linear velocity v_T(t_s) and the fifth joint space angle q(t)_5, calculating the deflection angle θ(q_final) according to a seventh calculation formula expressed in terms of a first preset parameter c_1, a second preset parameter c_2 and a third preset parameter c_3;
calculating the angular velocity deflection posture from the angular velocity deflection axis and the deflection angle θ(q_final) using the Rodrigues formula:

R = I + sin(θ) [k]_× + (1 - cos θ) [k]_×²

where I is the identity matrix, θ is the deflection angle θ(q_final), k is the angular velocity deflection axis, and [k]_× denotes the skew-symmetric matrix of k.
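To illustrate the posture correction and fusion steps of claim 9, the sketch below builds a deflection rotation from a 3-D axis and angle with the Rodrigues formula and applies it to the original target orientation. The function names and the composition by left multiplication are assumptions, not the patent's prescribed fusion rule.

```python
import numpy as np

def rodrigues(axis, angle):
    """Rodrigues formula: rotation matrix from a unit deflection axis (3-vector)
    and a deflection angle theta(q_final)."""
    k = np.asarray(axis, dtype=float)
    k = k / np.linalg.norm(k)
    K = np.array([[0.0, -k[2], k[1]],
                  [k[2], 0.0, -k[0]],
                  [-k[1], k[0], 0.0]])              # skew-symmetric matrix of k
    return np.eye(3) + np.sin(angle) * K + (1.0 - np.cos(angle)) * (K @ K)

def fuse_target_orientation(R_target, axis, angle):
    """Fuse the angular velocity deflection posture with the original target
    orientation; composing by left multiplication is an assumption."""
    return rodrigues(axis, angle) @ R_target
```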