CN106945036A - Robot motion generation method and device - Google Patents
Robot motion generation method and device
- Publication number
- CN106945036A (application CN201710170277.2A)
- Authority
- CN
- China
- Prior art keywords
- robot
- model
- attitude
- image
- joint
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1656—Programme controls characterised by programming, planning systems for manipulators
- B25J9/1664—Programme controls characterised by programming, planning systems for manipulators characterised by motion, path, trajectory planning
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1656—Programme controls characterised by programming, planning systems for manipulators
- B25J9/1671—Programme controls characterised by programming, planning systems for manipulators characterised by simulation, either to verify existing program or to create and verify new program, CAD/CAM oriented, graphic oriented programming systems
Landscapes
- Engineering & Computer Science (AREA)
- Robotics (AREA)
- Mechanical Engineering (AREA)
- Manipulator (AREA)
Abstract
The present invention relates to a robot motion generation method, specifically including the following steps: obtaining a robot model reading instruction, the reading instruction carrying a model identifier; in response to the reading instruction, displaying the robot model corresponding to the model identifier in a first area, wherein the robot model includes multiple revolute joints; obtaining a first joint parameter set for all of the revolute joints, while the robot model displayed in the first area displays a first attitude corresponding to the first joint parameter; obtaining a second joint parameter set for all of the revolute joints, while the robot model displayed in the first area displays a second attitude corresponding to the second joint parameter; obtaining input time information, wherein the time information is the time the robot model takes to move from the first attitude to the second attitude; and generating a robot motion file from the first joint parameter, the second joint parameter, and the time information. The above method makes the generation of robot motions simpler and more intuitive.
Description
Technical field
The present invention relates to the field of computer application technology, and more particularly to a robot motion generation method and device.
Background art
With the rapid development of robotics, robots have been applied in many fields, for example in industry and in service settings. Making a robot perform a variety of actions is key to a robot providing good service.
The traditional way of controlling a robot's actions is to set the operating parameters of the power unit of each robot joint one by one. When the robot has many joints, the motion design process is cumbersome, and controlling the overall action is difficult and unintuitive.
Summary of the invention
In view of the above problem, it is necessary to provide a robot motion generation method and device that make the generation of robot motions simpler and more intuitive.
A robot motion generation method, the method comprising:
obtaining a robot model reading instruction, the reading instruction carrying a model identifier;
in response to the reading instruction, displaying the robot model corresponding to the model identifier in a first area, wherein the robot model includes multiple revolute joints;
obtaining a first joint parameter set for the revolute joints, and controlling the robot model according to the first joint parameter to display a first attitude corresponding to the first joint parameter;
obtaining a second joint parameter set for the revolute joints, and controlling the robot model according to the second joint parameter to display a second attitude corresponding to the second joint parameter;
obtaining input time information, the time information being used to determine the time the robot model takes to move from the first attitude to the second attitude;
generating a robot motion file from the first joint parameter, the second joint parameter, and the time information.
In one embodiment, the method further includes:
when the robot model displays the first attitude, obtaining a shooting instruction for the robot model in the first attitude, obtaining a first image, and displaying the first image in a second area;
when the robot model displays the second attitude, obtaining a shooting instruction for the robot model in the second attitude, obtaining a second image, and displaying the second image in the second area.
In one embodiment, the method further includes:
obtaining an image drag operation acting on the second area;
exchanging the positions of the images displayed in the second area according to the image drag operation, wherein the position of an image represents the order in which the attitude corresponding to that image occurs in the generated robot motion; the earlier an image's position, the earlier its corresponding attitude occurs.
In one embodiment, after the step of generating the robot motion file from the first joint parameter, the second joint parameter, and the time information, the method further includes:
obtaining a robot connection instruction and searching for connectable robot identities;
judging whether a found robot identity matches the model identifier corresponding to the robot motion file, and if so, connecting to the robot and sending the robot motion file to the robot.
In one embodiment, multiple robot motion files are generated, each corresponding to one action block identifier; the method further includes:
sending a run instruction to the robot, the run instruction including the ordering of the action block identifiers and the loop count of each action block identifier, so that the robot executes the corresponding robot motion files according to the action block ordering information and the loop count information of the action block identifiers.
A robot motion generation device, the device including:
a model reading module, configured to obtain a robot model reading instruction, the reading instruction carrying a model identifier;
a model display module, configured to display, in response to the reading instruction, the robot model corresponding to the model identifier in a first area, wherein the robot model includes multiple revolute joints;
a first joint parameter setting module, configured to obtain a first joint parameter set for the revolute joints and control the robot model according to the first joint parameter to display a first attitude corresponding to the first joint parameter;
a second joint parameter setting module, configured to obtain a second joint parameter set for the revolute joints and control the robot model according to the second joint parameter to display a second attitude corresponding to the second joint parameter;
a time setting module, configured to obtain input time information, the time information being used to determine the time the robot model takes to move from the first attitude to the second attitude;
an action generation module, configured to generate a robot motion file from the first joint parameter, the second joint parameter, and the time information.
In one embodiment, the device further includes:
a first image display module, configured to obtain, when the robot model displays the first attitude, a shooting instruction for the robot model in the first attitude, obtain a first image, and display the first image in a second area;
a second image display module, configured to obtain, when the robot model displays the second attitude, a shooting instruction for the robot model in the second attitude, obtain a second image, and display the second image in the second area.
In one embodiment, the device further includes:
an image order exchange module, configured to obtain an image drag operation acting on the second area and exchange the positions of the images displayed in the second area according to the image drag operation, wherein the position of an image represents the order in which the attitude corresponding to that image occurs in the generated robot motion; the earlier an image's position, the earlier its corresponding attitude occurs.
In one embodiment, the device further includes:
a connection instruction obtaining module, configured to obtain a robot connection instruction and search for connectable robot identities;
a robot connection module, configured to judge whether a found robot identity matches the model identifier corresponding to the robot motion file, and if so, connect to the robot and send the robot motion file to the robot.
In one embodiment, multiple robot motion files are generated, each corresponding to one action block identifier; the device further includes:
a run module, configured to send a run instruction to the robot, the run instruction including the ordering of the action block identifiers and the loop count of each action block identifier, so that the robot executes the corresponding robot motion files according to the action block ordering information and the loop count information of the action block identifiers.
With the above robot motion generation method and device, a robot model is built in advance, the model including multiple revolute joints. The pre-built robot model is read and displayed in the first area of the page. Setting a first joint parameter for each revolute joint defines the first attitude of the robot model, and the robot model in the first area synchronously displays the attitude corresponding to the joint parameters. In the same way, a second joint parameter is set for each revolute joint to define the second attitude of the robot model. Then the time information for running from the first joint parameter of the first attitude to the second joint parameter of the second attitude is set, which generates one action block of the robot. There is no need to configure a rotation parameter and running time for each joint individually; defining only the starting attitude and ending attitude of an action block completes the definition of a robot motion, making the generation of actions simpler and quicker. In addition, the display of the model in the first area, and the way the model follows the joint parameters, make the definition of actions more intuitive and help designers create a wide variety of robot actions.
Brief description of the drawings
Fig. 1 is a diagram of the application environment of the robot motion generation method in one embodiment;
Fig. 2 is a flowchart of the robot motion generation method in one embodiment;
Fig. 3 is a flowchart of the robot motion generation method in another embodiment;
Fig. 4 is a diagram of the page used to implement robot motion generation in one embodiment;
Fig. 5 is a schematic diagram of image dragging in the second area in one embodiment;
Fig. 6 is a flowchart of the terminal connecting to the robot in one embodiment;
Fig. 7 is a diagram of the control page used by the terminal to run robot actions;
Fig. 8 is a structural block diagram of the robot motion generation device in one embodiment;
Fig. 9 is a structural block diagram of the robot motion generation device in another embodiment;
Fig. 10 is a structural block diagram of the robot motion generation device in yet another embodiment;
Fig. 11 is a structural block diagram relating to the terminal connecting to the robot in one embodiment;
Fig. 12 is a structural block diagram relating to the terminal controlling robot actions in one embodiment.
Detailed description of the embodiments
To make the objectives, technical solutions, and advantages of the present invention clearer, the present invention is further described in detail below with reference to the drawings and embodiments. It should be understood that the specific embodiments described here merely illustrate the present invention and do not limit it.
As shown in Fig. 1, in one embodiment an application environment for the robot motion generation method is provided. The application environment includes a terminal 110 and a robot 120. The terminal 110 can transmit files, instructions, and so on to the robot 120 through a short-range communication technique. The terminal 110 may be at least one of a smartphone, a tablet computer, a notebook computer, or a desktop computer, but is not limited to these. The terminal 110 is used to generate robot motion files, and the robot can perform the actions corresponding to the robot motion files generated by the terminal.
As shown in Fig. 2, in one embodiment a robot motion generation method is provided. Taking its application to the terminal 110 in Fig. 1 as an example, the method specifically includes the following steps:
Step S202: Obtain a robot model reading instruction, the reading instruction carrying a model identifier.
Multiple robot models are pre-stored in the terminal, each corresponding to a unique model identifier, for example a biped humanoid robot, a pet robot, and so on.
In one embodiment, a read button on the terminal page can be triggered to issue the reading instruction. In response to the reading instruction, the terminal displays all pre-stored model identifiers in a combo box or pop-up box; a selection operation on a displayed model identifier is triggered, and the terminal imports the robot model corresponding to the selected model identifier via the instruction path.
Step S204: In response to the reading instruction, display the robot model corresponding to the model identifier in the first area, wherein the robot model includes multiple revolute joints.
In response to the obtained robot model reading instruction, the terminal looks up the pre-stored robot model corresponding to the model identifier carried in the reading instruction and displays the robot model in the first area of the home page of the robot motion editing page. Further, the first area is the middle region of the home page of the robot motion editing page.
The robot model includes multiple revolute joints, which allow the robot to vividly perform a variety of actions, such as bending the arm at the arm joint or rotating the arm about its root, and bending, lifting, or rotating the leg through the leg joints.
In the present embodiment, the robot 120 in Fig. 1 is modeled. As can be seen from Fig. 1, each arm of the built robot model includes at least two joints: the joint at the upper end controls the motion of the robot's upper arm, and the joint at the lower end controls the relative motion of the robot's forearm. Each leg of the robot model likewise includes at least two joints, the junction of the leg and the foot includes a joint, and the junction of the robot's head and body includes a joint.
In one embodiment, servos are used as the connecting parts of the joints and can perform the positioning and motion of each joint. Servos have advantages such as relatively simple control signals, high control precision, fast response, and lower power consumption than servo motors.
Step S206: Obtain the first joint parameter set for the revolute joints, and control the robot model according to the first joint parameter to display the first attitude corresponding to the first joint parameter.
In one embodiment, the robot motion editing page is provided with the same number of slider controls as the robot has movable joints. Each slider control controls one robot joint; dragging a slider adjusts the state of the corresponding revolute joint, so that the robot model in the first area makes different actions.
Further, each slider control includes a slider bar and a number box; the position of the slider block on the slider bar and the value in the number box both characterize the state of the corresponding revolute joint. The two are linked: when the slider block is dragged, the value in the corresponding number box changes accordingly. Likewise, when the value in the number box is modified, the slider block moves to the position corresponding to that value.
In one embodiment, a zero position is predefined. Specifically, when the reset button on the home page is clicked, the terminal displays the zero value of each joint. The rotation angle of each joint servo when the robot is in the at-attention posture serves as that joint's zero value; in that posture, the zero value of each joint may be 0. It can be understood that the slider controls adjust the rotation angle of each joint servo relative to the zero position.
The first joint parameter here includes the joint parameters of all of the robot's joints; it is one group of joint parameters.
As the slider controls are dragged or the values in the number boxes are modified, the state of the robot model displayed in the first area changes accordingly; for example, adjusting the slider control of an arm joint correspondingly modifies the state of the robot model's arm. That is, the attitude of the displayed robot model corresponds in real time to the joint parameters that have been set. Thanks to this property, a designer can continuously adjust the first joint parameter while intuitively observing the robot's state, until the robot model displays the desired state.
The first attitude corresponding to the first joint parameter is a fixed posture of the robot, for example the robot's left arm extended straight and its left leg stretched 30° out to the side.
Step S208: Obtain the second joint parameter set for the revolute joints, and control the robot model according to the second joint parameter to display the second attitude corresponding to the second joint parameter.
At least one joint parameter in the second joint parameter differs from the first joint parameter; it follows that the second attitude differs from the first attitude.
Step S210: Obtain input time information, the time information being used to determine the time the robot model takes to move from the first attitude to the second attitude.
The terminal obtains the time information entered by the user: the time taken to move from the first attitude (corresponding to the first joint parameter) to the second attitude (corresponding to the second joint parameter). More specifically, the entered time is the time it takes the servo of each joint to rotate from the angle specified by the first joint parameter to the angle specified by the second joint parameter.
For example, suppose the time entered by the user is 1 s and the robot model includes two joints, A and B. In the first joint parameter, joint A's angle is 20° and joint B's angle is 0°; in the second joint parameter, joint A's angle is 90° and joint B's angle is 90°. The generated robot motion file makes the robot perform the following action: within 1 s, joint A rotates uniformly from 20° to 90° while joint B rotates uniformly from 0° to 90°.
In the present embodiment, defining the first attitude and the second attitude defines two nodes of the action. By setting the time, the robot's action moves steadily and uniformly from one node to the other, so the robot motion is smooth and natural.
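A minimal sketch of this uniform node-to-node motion (plain Python; the pose format and the sample count are assumptions, not the patent's file format):

```python
def interpolate(pose_a, pose_b, seconds, steps=4):
    """Yield (time, pose) samples moving every joint at constant speed
    from pose_a to pose_b over `seconds`."""
    for i in range(steps + 1):
        u = i / steps
        yield u * seconds, {j: pose_a[j] + u * (pose_b[j] - pose_a[j])
                            for j in pose_a}

# The worked example above: in 1 s, joint A turns uniformly from 20 to 90
# degrees while joint B turns uniformly from 0 to 90 degrees.
first = {"A": 20.0, "B": 0.0}     # first joint parameter
second = {"A": 90.0, "B": 90.0}   # second joint parameter
for t, pose in interpolate(first, second, seconds=1.0):
    print(f"t={t:.2f}s  A={pose['A']:5.1f}  B={pose['B']:5.1f}")
```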
Step S212: Generate the robot motion file from the first joint parameter, the second joint parameter, and the time information.
A robot motion file capable of guiding the robot's action can be generated from the first joint parameter, the second joint parameter, and the time information.
In the present embodiment, while defining action nodes, the robot model following the joint parameters can be seen intuitively, making the design of actions more visual. Moreover, robot motions are not generated joint by joint; instead, all joints are treated as a whole, the start and end points of the action are defined, and a single unified time is set, making the definition of actions simpler and quicker.
In one embodiment, more attitudes can be set in the same way as the first and second attitudes; that is, multiple action nodes are set, and each additional action node is given its own time information. For example, a third attitude and a second piece of time information may also be set, the second time information being the time taken to move from the second attitude to the third attitude.
The robot motion generated above is called an action block, and each action block may include multiple action nodes.
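One way to picture an action block with multiple nodes is the structure below (a sketch only; the actual motion file format is not specified in the patent, and the field names and identifier format are assumptions):

```python
from dataclasses import dataclass, field

@dataclass
class ActionNode:
    pose: dict                # servo angle per revolute joint
    seconds_from_prev: float  # time information entered for this node

@dataclass
class ActionBlock:            # one generated robot motion file
    model_id: str             # model identifier the block was generated for
    nodes: list = field(default_factory=list)

block = ActionBlock(model_id="biped-01")                     # assumed identifier
block.nodes.append(ActionNode({"A": 20.0, "B": 0.0}, 0.0))   # first attitude
block.nodes.append(ActionNode({"A": 90.0, "B": 90.0}, 1.0))  # second attitude, 1 s later
block.nodes.append(ActionNode({"A": 45.0, "B": 30.0}, 0.5))  # third attitude, 0.5 s later
```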
As shown in Fig. 3, in one embodiment the robot motion generation method includes the following steps:
Step S302: Obtain a robot model reading instruction, the reading instruction carrying a model identifier.
Step S304: In response to the reading instruction, display the robot model corresponding to the model identifier in the first area, wherein the robot model includes multiple revolute joints.
Step S306: Obtain the first joint parameter set for the revolute joints, and control the robot model according to the first joint parameter to display the first attitude corresponding to the first joint parameter.
Step S308: Obtain a shooting instruction for the robot model in the first attitude, obtain the first image, and display the first image in the second area.
The first image here is associated with the first joint parameter: clicking the first image in the second area displays the first joint parameter in the slider controls. The slider controls can then be adjusted to change the first joint parameter and, in turn, the first attitude of the robot model; when the modification is complete, the modified robot model is shot again to obtain a modified first image.
As shown in Fig. 4, after the robot model located in the middle region (the first area) is shot, the first image is displayed in the second area on the left.
Step S310: Obtain the second joint parameter set for the revolute joints, and control the robot model according to the second joint parameter to display the second attitude corresponding to the second joint parameter.
Step S312: Obtain a shooting instruction for the robot model in the second attitude, obtain the second image, and display the second image in the second area.
Similarly, the joint parameters are adjusted to obtain the second attitude of the robot model; the second attitude of the robot model in the first area is shot, the second image is obtained, and the second image is displayed in the second area. In the second area, the second image is located below the first image; likewise, if there is a third image, it is located below the second image.
Step S314: Obtain input time information, the time information being used to determine the time the robot model takes to move from the first attitude to the second attitude.
Step S316: Generate the robot motion file from the first joint parameter, the second joint parameter, and the time information.
In the present embodiment, images of the first attitude and the second attitude at the action node positions are obtained and displayed in the second area, so the effect of the action can be viewed more intuitively and the action can be modified more conveniently.
In one embodiment, a drag operation can also be performed on the images in the second area to change the order of the action nodes. Specifically, the terminal obtains a drag operation acting on an image in the second area and exchanges the positions of the images according to the drag track of the obtained operation. The position of an image represents the order in which its corresponding attitude occurs in the generated robot motion; the earlier an image's position, the earlier its corresponding attitude occurs.
For example, a drag operation exchanges the positions of the first image corresponding to the first joint parameter and the second image corresponding to the second joint parameter; the generated robot motion then changes from moving from the first attitude (of the first image) to the second attitude (of the second image), to moving from the second attitude to the first attitude; that is, the generated action is changed. Note that exchanging image positions changes the order in which the images occur in the action, but the time information is unchanged. As shown in Fig. 5, drag A exchanges the positions of image 1 and image 2; the action generated after the exchange first displays the action indicated by image 2, then moves in time t1 from the action node of image 2 to the action node of image 1, and then moves in time t2 from the action node of image 1 to the action node of image 3. Drag B works on the same principle as drag A.
In one embodiment, the time from one action node to the next may be set according to a configured music beat. The order of the nodes can then be exchanged without changing the time information, so the adjusted action still corresponds to the configured music beat, allowing the robot to dance lightly in rhythm with the music.
Further, an image in the second area can also be deleted by a drag operation. Specifically, the terminal obtains a drag operation acting on the second area and judges whether the drag operation drags an image to a designated position; if so, the image is deleted. For example, dragging an image displayed in the second area into the trash bin in the second area completes the deletion of the image; a deleted image no longer appears in the generated action.
After the positions of images in the second area are changed or some images are deleted, the terminal generates the robot motion file from the changed images (that is, their corresponding node parameters).
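Continuing the `ActionBlock` sketch above, the reorder and delete operations might look as follows; swapping exchanges only the poses, so each step keeps its time information (and hence any music beat alignment), as described above:

```python
def swap_images(block, i, j):
    """Drag operation: exchange the positions of images i and j.
    Only the pose order changes; each step keeps its time information."""
    block.nodes[i].pose, block.nodes[j].pose = (
        block.nodes[j].pose, block.nodes[i].pose)

def delete_image(block, i):
    """Drag an image to the trash: its node disappears from the action."""
    del block.nodes[i]

swap_images(block, 0, 1)   # the Fig. 5 example: image 1 and image 2 trade places
delete_image(block, 2)     # the deleted attitude no longer occurs
```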
In one embodiment, as shown in Fig. 6, after step S212 (generating the robot motion file from the first joint parameter, the second joint parameter, and the time information), the method further includes the following steps:
Step S402: Obtain a robot connection instruction and search for connectable robot identities.
Step S404: Judge whether a found robot identity matches the model identifier corresponding to the robot motion file; if so, connect to the robot and send the robot motion file to the robot.
In one embodiment, the connection with the robot device can be established through a short-range wireless communication technique such as Wi-Fi, Bluetooth, or ZigBee.
While establishing the connection with the robot device, the terminal may find the connection information broadcast by multiple connectable robots, from which robot identities can be obtained. According to the robot model identifier corresponding to the currently generated robot motion file, the terminal determines the robot identity that matches the model identifier and connects to the corresponding robot. The controller in a robot device can only recognize actions generated from its corresponding model; therefore, when connecting to a robot, it must be ensured that the connected robot corresponds to the robot model.
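A minimal sketch of this matching step (plain Python; the discovery mechanism, addresses, and identifier format are assumptions, with the identities standing in for broadcast connection information received over Wi-Fi or Bluetooth):

```python
def find_matching_robot(discovered, model_id):
    """discovered: robot identity -> address, collected from broadcast
    connection information. Return the address of the robot whose identity
    matches the model the motion file was generated for."""
    for robot_id, address in discovered.items():
        if robot_id == model_id:
            return address
    return None  # no connectable robot matches this model

addr = find_matching_robot(
    {"pet-02": "192.168.4.20", "biped-01": "192.168.4.21"},  # assumed values
    "biped-01")
```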
In one embodiment, information between the terminal and the robot device is sent and received using the Modbus protocol. The Modbus protocol is a general-purpose language applied to electronic controllers. Through this protocol, controllers can communicate with each other and with other devices via a network (such as Ethernet). It has become a de facto industry standard: with it, control devices produced by different vendors can be connected into an industrial network for centralized monitoring. The protocol defines a message structure that controllers can recognize and use, regardless of which network they communicate over. It describes the process by which a controller requests access to another device, how it responds to requests from other devices, and how errors are detected and recorded. It establishes a common format for the layout and content of message fields.
When communicating on the same Modbus network, the protocol requires each controller to know its device address, to recognize messages sent to its address, and to decide what action to take. If a response is required, the controller generates the feedback information and sends it using the Modbus protocol. On other networks, messages containing the Modbus protocol are converted to the frame or packet structure used on that network.
The information transmitted to the robot includes the following packet structure: header text portion + function code + storage address + specific data (the robot motion file). The function code here indicates the type of the transmitted data, which can be a motion editing type, a program editing type, a zero position type, and so on.
Data is transmitted using the register approach.
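A sketch of packing such a frame (using Python's struct module; the header bytes, field widths, and function-code values are illustrative assumptions, since the patent fixes only the order of the fields):

```python
import struct

FUNC_MOTION_EDIT = 0x01    # assumed code: "motion editing" data type
FUNC_PROGRAM_EDIT = 0x02   # assumed code: "program editing" data type
FUNC_ZERO_POSITION = 0x03  # assumed code: "zero position" data type

def build_frame(function_code, register_address, payload):
    """header text + function code + storage address + specific data."""
    header = b"RB"  # assumed 2-byte header text
    return header + struct.pack(">BH", function_code, register_address) + payload

frame = build_frame(FUNC_MOTION_EDIT, 0x0010, b"...motion file bytes...")
```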
In one embodiment, the terminal can generate multiple robot motion files, and executing a robot motion file makes the robot perform the corresponding action. Controlling the robot to execute multiple motion files in succession makes the robot perform a series of actions.
A specific method of controlling the robot's actions includes: establishing a communication connection with the robot and sending the generated robot motion files to the robot, each robot motion file corresponding to one data block. The terminal predefines the action block identifier corresponding to each action block, and the pre-assigned action block identifier is carried when the robot motion file is transmitted.
The terminal sends a run instruction to the connected robot; the run instruction includes the ordering information of the action block identifiers to run and the loop count of each action block. For example, the action block identifiers include action 1, action 2, and action 3, and the action order is: action 1, action 2, action 3, where action 1 loops once, action 2 loops once, and action 3 loops twice. The robot acts accordingly on this instruction.
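A sketch of how the robot side might interpret such a run instruction (plain Python; `apply_pose` stands in for the actual servo commands, and the block contents are placeholders):

```python
import time

def run_sequence(blocks, program):
    """blocks: action block identifier -> list of (seconds, pose) steps.
    program: ordered (block_id, loop_count) pairs from the run instruction."""
    for block_id, loops in program:
        for _ in range(loops):
            for seconds, pose in blocks[block_id]:
                apply_pose(pose)   # stand-in for driving the servos
                time.sleep(seconds)

def apply_pose(pose):
    print("set servos:", pose)     # placeholder output

# The example above: action 1 once, action 2 once, action 3 twice.
run_sequence(
    {"action1": [(1.0, {"A": 90.0})],
     "action2": [(0.5, {"A": 20.0})],
     "action3": [(1.0, {"A": 45.0})]},
    [("action1", 1), ("action2", 1), ("action3", 2)])
```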
In another embodiment, the terminal can generate the above run program data (including the action order and loop counts) in advance and send the generated run program data to the robot; the terminal and the robot agree in advance on an identifier for each run program, such as run program 1 and run program 2. The terminal provides a run interface with multiple run buttons, each corresponding to one run program and displaying the run program identifier, as shown in Fig. 7.
The terminal obtains the user's trigger operation on a run program button and generates a run instruction in response; the run instruction carries the run program identifier associated with the button. The terminal sends the run instruction to the robot through the pre-established connection, and the robot looks up the corresponding run program data by the run program identifier and performs the corresponding actions under the guidance of the run program data.
In one embodiment, as shown in Fig. 8, a robot motion generation device is provided; the device includes:
a model reading module 510, configured to obtain a robot model reading instruction, the reading instruction carrying a model identifier;
a model display module 520, configured to display, in response to the reading instruction, the robot model corresponding to the model identifier in a first area, wherein the robot model includes multiple revolute joints;
a first joint parameter setting module 530, configured to obtain a first joint parameter set for the revolute joints and control the robot model according to the first joint parameter to display a first attitude corresponding to the first joint parameter;
a second joint parameter setting module 540, configured to obtain a second joint parameter set for the revolute joints and control the robot model according to the second joint parameter to display a second attitude corresponding to the second joint parameter;
a time setting module 550, configured to obtain input time information, the time information being used to determine the time the robot model takes to move from the first attitude to the second attitude;
an action generation module 560, configured to generate the robot motion file from the first joint parameter, the second joint parameter, and the time information.
In one embodiment, as shown in Fig. 9, the robot motion generation device further includes:
a first image display module 531, configured to obtain, when the robot model displays the first attitude, a shooting instruction for the robot model in the first attitude, obtain a first image, and display the first image in a second area;
a second image display module 541, configured to obtain, when the robot model displays the second attitude, a shooting instruction for the robot model in the second attitude, obtain a second image, and display the second image in the second area.
In one embodiment, as shown in Fig. 10, the robot motion generation device further includes:
an image order exchange module 542, configured to obtain an image drag operation acting on the second area and exchange the positions of the images displayed in the second area according to the image drag operation, wherein the position of an image represents the order in which the attitude corresponding to that image occurs in the generated robot motion; the earlier an image's position, the earlier its corresponding attitude occurs.
In one embodiment, as shown in Fig. 11, the robot motion generation device further includes:
a connection instruction obtaining module 610, configured to obtain a robot connection instruction and search for connectable robot identities;
a robot connection module 620, configured to judge whether a found robot identity matches the model identifier corresponding to the robot motion file, and if so, connect to the robot and send the robot motion file to the robot.
In one embodiment, as shown in Fig. 12, multiple robot motion files are generated, each corresponding to one action block identifier; the robot motion generation device further includes:
a run module 630, configured to send a run instruction to the robot, the run instruction including the ordering of the action block identifiers and the loop count of each action block identifier, so that the robot executes the corresponding robot motion files according to the action block ordering information and the loop count information of the action block identifiers.
A person of ordinary skill in the art will appreciate that all or part of the flows in the above method embodiments can be implemented by a computer program instructing the relevant hardware. The program can be stored in a computer-readable storage medium; for example, in the embodiments of the present invention, the program can be stored in the storage medium of a computer system and executed by at least one processor of the computer system to realize the flows of the above method embodiments. The storage medium can be a magnetic disk, an optical disc, a read-only memory (ROM), a random access memory (RAM), or the like.
The technical features of the above embodiments can be combined arbitrarily. For brevity, not all possible combinations of the technical features in the above embodiments are described; however, as long as a combination of these technical features contains no contradiction, it should be considered within the scope of this specification.
The above embodiments express only several implementations of the present invention, and their description is relatively specific and detailed, but they should not therefore be construed as limiting the scope of the patent. It should be pointed out that a person of ordinary skill in the art can make various modifications and improvements without departing from the concept of the present invention, and these fall within the scope of protection of the present invention. Therefore, the scope of protection of this patent shall be determined by the appended claims.
Claims (10)
1. A robot motion generation method, the method comprising:
obtaining a robot model reading instruction, the reading instruction carrying a model identifier;
in response to the reading instruction, displaying the robot model corresponding to the model identifier in a first area, wherein the robot model includes multiple revolute joints;
obtaining a first joint parameter set for the revolute joints, and controlling the robot model according to the first joint parameter to display a first attitude corresponding to the first joint parameter;
obtaining a second joint parameter set for the revolute joints, and controlling the robot model according to the second joint parameter to display a second attitude corresponding to the second joint parameter;
obtaining input time information, the time information being used to determine the time the robot model takes to move from the first attitude to the second attitude; and
generating a robot motion file from the first joint parameter, the second joint parameter, and the time information.
2. The method according to claim 1, characterised in that the method further comprises:
when the robot model displays the first attitude, obtaining a shooting instruction for the robot model in the first attitude, obtaining a first image, and displaying the first image in a second area;
when the robot model displays the second attitude, obtaining a shooting instruction for the robot model in the second attitude, obtaining a second image, and displaying the second image in the second area.
3. The method according to claim 2, characterised in that the method further comprises:
obtaining an image drag operation acting on the second area;
exchanging the positions of the images displayed in the second area according to the image drag operation, wherein the position of an image represents the order in which the attitude corresponding to that image occurs in the generated robot motion; the earlier an image's position, the earlier its corresponding attitude occurs.
4. The method according to claim 1, characterised in that after the step of generating the robot motion file from the first joint parameter, the second joint parameter, and the time information, the method further comprises:
obtaining a robot connection instruction and searching for connectable robot identities;
judging whether a found robot identity matches the model identifier corresponding to the robot motion file, and if so, connecting to the robot and sending the robot motion file to the robot.
5. The method according to claim 4, characterised in that multiple robot motion files are generated, each corresponding to one action block identifier; the method further comprises:
sending a run instruction to the robot, the run instruction including the ordering of the action block identifiers and the loop count of each action block identifier, so that the robot executes the corresponding robot motion files according to the action block ordering information and the loop count information of the action block identifiers.
6. A robot motion generation device, characterised in that the device comprises:
a model reading module, configured to obtain a robot model reading instruction, the reading instruction carrying a model identifier;
a model display module, configured to display, in response to the reading instruction, the robot model corresponding to the model identifier in a first area, wherein the robot model includes multiple revolute joints;
a first joint parameter setting module, configured to obtain a first joint parameter set for the revolute joints and control the robot model according to the first joint parameter to display a first attitude corresponding to the first joint parameter;
a second joint parameter setting module, configured to obtain a second joint parameter set for the revolute joints and control the robot model according to the second joint parameter to display a second attitude corresponding to the second joint parameter;
a time setting module, configured to obtain input time information, the time information being used to determine the time the robot model takes to move from the first attitude to the second attitude;
an action generation module, configured to generate a robot motion file from the first joint parameter, the second joint parameter, and the time information.
7. The device according to claim 6, characterised in that the device further comprises:
a first image display module, configured to obtain, when the robot model displays the first attitude, a shooting instruction for the robot model in the first attitude, obtain a first image, and display the first image in a second area;
a second image display module, configured to obtain, when the robot model displays the second attitude, a shooting instruction for the robot model in the second attitude, obtain a second image, and display the second image in the second area.
8. The device according to claim 7, characterised in that the device further comprises:
an image order exchange module, configured to obtain an image drag operation acting on the second area and exchange the positions of the images displayed in the second area according to the image drag operation, wherein the position of an image represents the order in which the attitude corresponding to that image occurs in the generated robot motion; the earlier an image's position, the earlier its corresponding attitude occurs.
9. The device according to claim 6, characterised in that the device further comprises:
a connection instruction obtaining module, configured to obtain a robot connection instruction and search for connectable robot identities;
a robot connection module, configured to judge whether a found robot identity matches the model identifier corresponding to the robot motion file, and if so, connect to the robot and send the robot motion file to the robot.
10. The device according to claim 9, characterised in that multiple robot motion files are generated, each corresponding to one action block identifier; the device further comprises:
a run module, configured to send a run instruction to the robot, the run instruction including the ordering of the action block identifiers and the loop count of each action block identifier, so that the robot executes the corresponding robot motion files according to the action block ordering information and the loop count information of the action block identifiers.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201710170277.2A CN106945036A (en) | 2017-03-21 | 2017-03-21 | Robot motion generation method and device |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201710170277.2A CN106945036A (en) | 2017-03-21 | 2017-03-21 | Robot motion generation method and device |
Publications (1)
Publication Number | Publication Date |
---|---|
CN106945036A true CN106945036A (en) | 2017-07-14 |
Family
ID=59472186
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201710170277.2A Pending CN106945036A (en) | 2017-03-21 | 2017-03-21 | Robot motion generation method and device |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN106945036A (en) |
Cited By (18)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN108356806A (en) * | 2017-12-19 | 2018-08-03 | 北京可以科技有限公司 | Modularization robot control method and system |
CN109828568A (en) * | 2019-02-15 | 2019-05-31 | 武汉理工大学 | Ball gait optimization method is sought to the NAO robot of RoboCup match |
CN110039546A (en) * | 2019-05-13 | 2019-07-23 | 上海鲸鱼机器人科技有限公司 | For controlling the method and robot of robot motion |
CN110561418A (en) * | 2019-08-06 | 2019-12-13 | 珠海格力智能装备有限公司 | Robot control method, device, storage medium, controller and control system |
CN110576433A (en) * | 2018-06-08 | 2019-12-17 | 香港商女娲创造股份有限公司 | robot motion generation method |
CN111195909A (en) * | 2019-12-27 | 2020-05-26 | 深圳市优必选科技股份有限公司 | Steering engine control method and device for robot, terminal and computer storage medium |
CN111352357A (en) * | 2018-12-21 | 2020-06-30 | 深圳市优必选科技有限公司 | Robot control method and device and terminal equipment |
CN111476257A (en) * | 2019-01-24 | 2020-07-31 | 富士通株式会社 | Information processing method and information processing apparatus |
CN111844021A (en) * | 2020-06-17 | 2020-10-30 | 慧灵科技(深圳)有限公司 | Mechanical arm cooperative control method, device, equipment and storage medium |
CN108189029B (en) * | 2017-12-19 | 2020-10-30 | 北京可以科技有限公司 | Control system of modular robot, modular robot system and method for controlling modular robot |
CN108189028B (en) * | 2017-12-19 | 2020-11-03 | 北京可以科技有限公司 | Modular robot control system |
CN108326841B (en) * | 2017-12-19 | 2020-12-18 | 北京可以科技有限公司 | Modular robot and system, control method, construction prompting method and correction method for constructing modular robot |
CN112757273A (en) * | 2020-12-28 | 2021-05-07 | 广州一康医疗设备实业有限公司 | Method, system and device for editing and visualizing track of mechanical arm and storage medium |
CN112975963A (en) * | 2021-02-23 | 2021-06-18 | 广东智源机器人科技有限公司 | Robot action generation method and device and robot |
CN113168341A (en) * | 2020-06-30 | 2021-07-23 | 深圳市大疆创新科技有限公司 | Control method of movable platform, terminal device and storage medium |
CN113492408A (en) * | 2021-08-12 | 2021-10-12 | 北京木甲天枢文化科技有限公司 | Debugging method for drum beating robot |
CN114227699A (en) * | 2022-02-10 | 2022-03-25 | 乐聚(深圳)机器人技术有限公司 | Robot motion adjustment method, robot motion adjustment device, and storage medium |
CN116000911A (en) * | 2021-10-22 | 2023-04-25 | 瑞龙诺赋(上海)医疗科技有限公司 | Mechanical arm control method and device and mechanical arm |
Patent Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN1392824A (en) * | 2000-09-28 | 2003-01-22 | 索尼公司 | Authoring system and method, and storage medium |
EP1435280A2 (en) * | 2002-12-30 | 2004-07-07 | Abb Research Ltd. | A method and a system for programming an industrial robot |
CN1939678A (en) * | 2005-09-28 | 2007-04-04 | 发那科株式会社 | Offline teaching apparatus for robot |
CN102794768A (en) * | 2012-09-10 | 2012-11-28 | 江南现代工业研究院 | Material carrying robot and industrial process control method thereof |
CN103101054A (en) * | 2013-01-17 | 2013-05-15 | 上海交通大学 | Programming and control system of mobile phone to robot |
Cited By (29)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN108189029B (en) * | 2017-12-19 | 2020-10-30 | 北京可以科技有限公司 | Control system of modular robot, modular robot system and method for controlling modular robot |
US11325249B2 (en) | 2017-12-19 | 2022-05-10 | Beijing Keyi Technology Co., Ltd. | Modular robot control method and system |
WO2019120150A1 (en) * | 2017-12-19 | 2019-06-27 | 北京可以科技有限公司 | Modular robot control method and system |
CN108356806A (en) * | 2017-12-19 | 2018-08-03 | 北京可以科技有限公司 | Modularization robot control method and system |
JP2021506604A (en) * | 2017-12-19 | 2021-02-22 | 北京可以科技有限公司Beijing Keyi Technology Co.,Ltd | Modularization robot control method and its system |
JP7053067B2 (en) | 2017-12-19 | 2022-04-12 | 北京可以科技有限公司 | Modulation robot control method and its system |
CN108326841B (en) * | 2017-12-19 | 2020-12-18 | 北京可以科技有限公司 | Modular robot and system, control method, construction prompting method and correction method for constructing modular robot |
CN108189028B (en) * | 2017-12-19 | 2020-11-03 | 北京可以科技有限公司 | Modular robot control system |
CN110576433A (en) * | 2018-06-08 | 2019-12-17 | 香港商女娲创造股份有限公司 | robot motion generation method |
CN110576433B (en) * | 2018-06-08 | 2021-05-18 | 香港商女娲创造股份有限公司 | Robot motion generation method |
CN111352357B (en) * | 2018-12-21 | 2021-09-17 | 深圳市优必选科技有限公司 | Robot control method and device and terminal equipment |
CN111352357A (en) * | 2018-12-21 | 2020-06-30 | 深圳市优必选科技有限公司 | Robot control method and device and terminal equipment |
CN111476257A (en) * | 2019-01-24 | 2020-07-31 | 富士通株式会社 | Information processing method and information processing apparatus |
CN109828568A (en) * | 2019-02-15 | 2019-05-31 | 武汉理工大学 | Ball gait optimization method is sought to the NAO robot of RoboCup match |
CN109828568B (en) * | 2019-02-15 | 2022-04-15 | 武汉理工大学 | NAO robot ball-searching gait optimization method for RoboCup game |
CN110039546A (en) * | 2019-05-13 | 2019-07-23 | 上海鲸鱼机器人科技有限公司 | For controlling the method and robot of robot motion |
CN110561418A (en) * | 2019-08-06 | 2019-12-13 | 珠海格力智能装备有限公司 | Robot control method, device, storage medium, controller and control system |
CN110561418B (en) * | 2019-08-06 | 2022-09-13 | 珠海格力智能装备有限公司 | Robot control method, device, storage medium, controller and control system |
CN111195909A (en) * | 2019-12-27 | 2020-05-26 | 深圳市优必选科技股份有限公司 | Steering engine control method and device for robot, terminal and computer storage medium |
CN111844021A (en) * | 2020-06-17 | 2020-10-30 | 慧灵科技(深圳)有限公司 | Mechanical arm cooperative control method, device, equipment and storage medium |
CN111844021B (en) * | 2020-06-17 | 2021-12-03 | 慧灵科技(深圳)有限公司 | Mechanical arm cooperative control method, device, equipment and storage medium |
CN113168341A (en) * | 2020-06-30 | 2021-07-23 | 深圳市大疆创新科技有限公司 | Control method of movable platform, terminal device and storage medium |
CN112757273A (en) * | 2020-12-28 | 2021-05-07 | 广州一康医疗设备实业有限公司 | Method, system and device for editing and visualizing track of mechanical arm and storage medium |
CN112975963B (en) * | 2021-02-23 | 2022-08-23 | 广东优碧胜科技有限公司 | Robot action generation method and device and robot |
CN112975963A (en) * | 2021-02-23 | 2021-06-18 | 广东智源机器人科技有限公司 | Robot action generation method and device and robot |
CN113492408A (en) * | 2021-08-12 | 2021-10-12 | 北京木甲天枢文化科技有限公司 | Debugging method for drum beating robot |
CN116000911A (en) * | 2021-10-22 | 2023-04-25 | 瑞龙诺赋(上海)医疗科技有限公司 | Mechanical arm control method and device and mechanical arm |
CN114227699A (en) * | 2022-02-10 | 2022-03-25 | 乐聚(深圳)机器人技术有限公司 | Robot motion adjustment method, robot motion adjustment device, and storage medium |
CN114227699B (en) * | 2022-02-10 | 2024-06-11 | 乐聚(深圳)机器人技术有限公司 | Robot motion adjustment method, apparatus, and storage medium |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN106945036A (en) | Robot motion generation method and device | |
CN106985150A (en) | The method and apparatus of control machine human action | |
KR102169918B1 (en) | A method and apparatus for generating facial expression animation of a human face model | |
US6470235B2 (en) | Authoring system and method, and storage medium used therewith | |
US6718231B2 (en) | Authoring system and authoring method, and storage medium | |
US10499004B2 (en) | Method and terminal for reproducing content | |
CN103513992B (en) | A kind of general Edutainment robot application software development platform | |
US8447428B2 (en) | Method for editing movements of a robot | |
CN111203854B (en) | Robot track reproduction method, control device, equipment and readable storage medium | |
CN107122175B (en) | Interface creating method and device | |
CN108334385B (en) | User interface skin management method and device | |
CN107748639A (en) | Curve editing method, device, equipment and storage medium | |
CN108415386A (en) | Augmented reality system and its working method for intelligent workshop | |
EP4155904A1 (en) | Graphical programming method, processor, and terminal | |
Liu et al. | An augmented reality-assisted interaction approach using deep reinforcement learning and cloud-edge orchestration for user-friendly robot teaching | |
Deimel | Reactive interaction through body motion and the phase-state-machine | |
CN115847431B (en) | Method and device for setting waypoints of mechanical arm, electronic equipment and storage medium | |
CN114610677A (en) | Method for determining conversion model and related device | |
CN105955180A (en) | Intelligent manufacturing adaptive dynamic generation robot real-time automatic programming method | |
CN112612463A (en) | Graphical programming control method, system and device | |
Loutfi et al. | Augmented Reality with Mobility Awareness in Mobile Edge Computing over 6G Network: A Survey | |
CN113546423B (en) | Action generation method, device, computer equipment and storage medium | |
CN117170604A (en) | Synchronization method and system of vehicle-mounted terminal | |
CN112965709B (en) | Method, device, equipment and storage medium for generating building block | |
US20230410437A1 (en) | Ar system for providing interactive experiences in smart spaces |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| PB01 | Publication | |
| SE01 | Entry into force of request for substantive examination | |
| RJ01 | Rejection of invention patent application after publication | Application publication date: 20170714 |