CN110216676B - Mechanical arm control method, mechanical arm control device and terminal equipment

Mechanical arm control method, mechanical arm control device and terminal equipment

Info

Publication number
CN110216676B
CN110216676B (application CN201910542115.6A)
Authority
CN
China
Prior art keywords
user
arm
real time
mechanical
mechanical arm
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201910542115.6A
Other languages
Chinese (zh)
Other versions
CN110216676A (en)
Inventor
邓生全
余扬
曾宏
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Iwin Visual Technology Co ltd
Original Assignee
Shenzhen Iwin Visual Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Iwin Visual Technology Co ltd filed Critical Shenzhen Iwin Visual Technology Co ltd
Priority to CN201910542115.6A
Publication of CN110216676A
Application granted
Publication of CN110216676B
Legal status: Active

Classifications

    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B25: HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J: MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00: Programme-controlled manipulators
    • B25J9/16: Programme controls
    • B25J9/1602: Programme controls characterised by the control system, structure, architecture
    • B25J9/1656: Programme controls characterised by programming, planning systems for manipulators
    • B25J9/1664: Programme controls characterised by programming, planning systems for manipulators characterised by motion, path, trajectory planning

Landscapes

  • Engineering & Computer Science (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Automation & Control Theory (AREA)
  • Manipulator (AREA)

Abstract

The invention is applicable to the technical field of control, and provides a mechanical arm control method, a mechanical arm control device and terminal equipment. The mechanical arm control method comprises the following steps: acquiring, in real time through a somatosensory device, first relative position information of a user's arm relative to the user's trunk, and acquiring state information of the user's hand in real time; and controlling a mechanical arm to move in real time according to the first relative position information, and controlling, in real time, a mechanical claw at the end of the mechanical arm to perform the operation corresponding to the state information. The method and system satisfy people's demand for efficient interactive experiences, providing enjoyment or improving efficiency.

Description

Mechanical arm control method, mechanical arm control device and terminal equipment
Technical Field
The invention belongs to the technical field of control, and particularly relates to a mechanical arm control method, a mechanical arm control device and terminal equipment.
Background
The inventors have found that in scenarios such as scientific exhibitions, human-computer interaction competitions, goods transportation and entertainment activities, it is desirable to experience efficient interactive activities to gain enjoyment or improve efficiency.
Disclosure of Invention
In view of this, embodiments of the present invention provide a robot arm control method, a robot arm control device and a terminal device, so as to meet people's demand for efficient interactive experiences, thereby providing enjoyment or improving efficiency.
A first aspect of an embodiment of the present invention provides a robot arm control method, including:
the method comprises the steps that first relative position information of an arm of a user relative to a trunk of the user is obtained in real time through a somatosensory device, and state information of a hand of the user is obtained in real time;
and controlling the mechanical arm to move according to the first relative position information in real time, and controlling a mechanical claw positioned at the end part of the mechanical arm to execute an operation corresponding to the state information in real time.
A second aspect of an embodiment of the present invention provides a robot arm control apparatus, including:
the acquisition module is used for acquiring first relative position information of an arm of a user relative to a trunk of the user in real time through the motion sensing device and acquiring state information of a hand of the user in real time;
and the control module is used for controlling the mechanical arm to move according to the first relative position information in real time and controlling the mechanical claw positioned at the end part of the mechanical arm to execute the operation corresponding to the state information in real time.
A third aspect of the embodiments of the present invention provides a terminal device, including a memory, a processor, and a computer program stored in the memory and executable on the processor, where the processor implements the steps of the method when executing the computer program.
A fourth aspect of embodiments of the present invention provides a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, performs the steps of the method as described above.
Compared with the prior art, the embodiments of the invention have the following beneficial effects. In the embodiment of the invention, first relative position information of a user's arm relative to the user's trunk is acquired in real time through a body sensing device, and state information of the user's hand is acquired in real time; the mechanical arm is controlled in real time to move according to the first relative position information, and the mechanical claw at the end of the mechanical arm is controlled in real time to perform the operation corresponding to the state information. The first relative position information indicates the movement of the user's arm relative to the trunk, and the state information indicates whether the user's hand is open or closed. Because the mechanical arm and the claw at its end follow these signals in real time, the motions of the user's arm and hand receive real-time feedback and interaction; the operation is flexible and simple, the user experience is good, and the operating efficiency is high. The scheme provides users with interactive activities and is highly interesting, practical and easy to use.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present invention, the drawings needed for describing the embodiments or the prior art are briefly introduced below. Obviously, the drawings in the following description are merely some embodiments of the present invention; for those of ordinary skill in the art, other drawings can be obtained from these drawings without creative effort.
Fig. 1 is a schematic flow chart illustrating an implementation of a robot arm control method according to an embodiment of the present invention;
fig. 2 is a schematic flow chart illustrating an implementation of a robot arm control method according to a second embodiment of the present invention;
fig. 3 is a schematic view of a robot arm control device provided in a third embodiment of the present invention;
fig. 4 is a schematic diagram of a terminal device according to a fourth embodiment of the present invention.
Detailed Description
In the following description, for purposes of explanation and not limitation, specific details are set forth, such as particular system structures, techniques, etc. in order to provide a thorough understanding of the embodiments of the invention. It will be apparent, however, to one skilled in the art that the present invention may be practiced in other embodiments that depart from these specific details. In other instances, detailed descriptions of well-known systems, devices, circuits, and methods are omitted so as not to obscure the description of the present invention with unnecessary detail.
It will be understood that the terms "comprises" and/or "comprising," when used in this specification and the appended claims, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
It is also to be understood that the terminology used in the description of the invention herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the invention. As used in the specification of the present invention and the appended claims, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise.
It should be further understood that the term "and/or" as used in this specification and the appended claims refers to and includes any and all possible combinations of one or more of the associated listed items.
As used in this specification and the appended claims, the term "if" may be interpreted contextually as "when", "upon", "in response to a determination" or "in response to a detection". Similarly, the phrase "if it is determined" or "if a [described condition or event] is detected" may be interpreted contextually to mean "upon determining", "in response to determining", "upon detecting [the described condition or event]" or "in response to detecting [the described condition or event]".
In order to explain the technical means of the present invention, the following description will be given by way of specific examples.
Fig. 1 is a schematic flow chart illustrating an implementation of a robot arm control method according to an embodiment of the present invention.
In the embodiment of the present invention, the robot arm control method may be implemented by a terminal device that includes an information processing module. The terminal device may further include one or more body sensing devices and one or more mechanical arms, each mechanical arm having a mechanical claw at its end. The information processing module may include a module or device capable of information processing, such as a Central Processing Unit (CPU). Alternatively, the motion sensing device and/or the mechanical arm may be external devices outside the terminal device; their connection and communication modes with the processor may be set according to the actual application scenario, and are not limited herein. For example, the information processing module may communicate with the mechanical arm and the motion sensing device via a wired network or a wireless network (e.g., Bluetooth, Wi-Fi, or a third generation (3G), fourth generation (4G) or fifth generation (5G) mobile communication network), so that it can acquire data from the motion sensing device and drive the mechanical arm by controlling its motors.
The robot arm control method as shown in fig. 1 may include the steps of:
step S101, acquiring first relative position information of an arm of a user relative to a trunk of the user in real time through a body sensing device, and acquiring state information of a hand of the user in real time.
In the embodiment of the invention, the body sensing device can detect the limb motions of the user, so that the user can interact with a specified device or environment through limb motion. The motion sensing device may detect the user's limb motion by means of infrared, laser, a camera or the like. Illustratively, the motion sensing device may be a third-party device such as a Kinect; of course, it may also be another device, for example one designed by the user.
The user's arms may comprise left and/or right arms and the hands may comprise left and/or right hands. In some embodiments, the hand corresponds to the arm, e.g., where the arm comprises a right arm, the hand comprises a right hand; and when the arms comprise left arms, the hand comprises a left hand. Generally, one arm of the user may correspond to one mechanical arm, and if there are two or more mechanical arms, the left arm and the right arm of the user may be detected simultaneously, or even the arms of multiple users may be detected. In the embodiment of the present invention, the number of the mechanical arms is not limited herein, and accordingly, the number of the detected arms and hands and the number of the users are not limited herein.
In an embodiment of the present invention, the first relative position information may include a difference between coordinates of a specified feature point on the arm and coordinates of the trunk. The designated feature points on the arm can be set according to the actual application scene. For example, the designated feature points may include feature points on joints of the arm, and the joints of the arm may include one or more of a joint between the arm and the shoulder (i.e., a shoulder joint), an elbow joint, and a wrist joint. The state information may indicate that the user's hand is open or fist-making, etc.
Optionally, the arm is a right arm and the hand is a right hand, and/or the arm is a left arm and the hand is a left hand.
Detecting this information about the arm and the hand on the same side of the user's body provides a data basis for the subsequent real-time control of the mechanical arm and of the mechanical claw at its end. In the subsequent steps, the motions of that arm and hand can then be reproduced by the mechanical arm and the claw at its end.
Step S102, controlling the mechanical arm to move in real time according to the first relative position information, and controlling the mechanical claw at the end of the mechanical arm to perform, in real time, the operation corresponding to the state information.
In an embodiment of the present invention, the mechanical arm may move according to a position change of the arm relative to the trunk of the user, in this case, the mechanical arm may simulate the movement of the arm of the user, and a moving distance, a moving speed, and the like of the mechanical arm may correspond to the moving distance and the moving speed of the arm of the user relative to the trunk.
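For illustration only, the following minimal Python sketch shows one way such a proportional correspondence could be realized; the scale factor ARM_SCALE and the function name target_robot_offset are assumptions of this sketch, since the embodiment does not fix a concrete mapping.

```python
import numpy as np

ARM_SCALE = 1.5  # hypothetical ratio between the robot workspace and the user's reach

def target_robot_offset(arm_point: np.ndarray, torso_point: np.ndarray) -> np.ndarray:
    """Map the arm's position relative to the trunk to an end-effector offset."""
    user_offset = arm_point - torso_point   # the "first relative position information"
    return ARM_SCALE * user_offset          # proportional mirroring of the motion

# Example: the wrist is 0.2 m above and 0.4 m in front of the trunk reference point.
wrist = np.array([0.0, 0.2, 0.4])
torso = np.array([0.0, 0.0, 0.0])
print(target_robot_offset(wrist, torso))    # -> [0.  0.3 0.6]
```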
In the embodiment of the invention, the mechanical claw can also simulate the action of the user's hand. For example, when the user makes a fist, the mechanical claw can be controlled to close so as to grab a specified article; when the user's hand opens, the mechanical claw can be controlled to open so as to release the specified article. Of course, there may be various correspondences between the operation of the claw and the state information, and they may be set according to the needs of the actual application scenario.
One specific application of the present invention is illustrated below as a specific example.
In one application scenario, the object the user wants to grasp may be a doll, a toy or another object. In existing claw machines, the user controls the movement of the mechanical arm and the mechanical claw by means of a joystick, buttons and the like, and the opening and closing of the claw are usually preset and cannot be controlled by the user.
One particular example of the present invention may be used to grasp toys such as dolls. Here the mechanical arm simulates the movement of the user's arm, and the mechanical claw simulates the fist-making or open state of the user's hand, so that through body motion alone the user can steer the mechanical arm and the claw at its end to a specific position and grab a doll or similar toy. Unlike existing claw machines, this provides the user with a more flexible, interesting and novel mode of human-machine interaction. The device of this example is simple to assemble and disassemble, convenient to maintain, user-friendly, fast in operation and commercially promising.
Optionally, the state information of the hand includes an indication that the hand is in a fist making state or an open state:
correspondingly, the real-time control of the mechanical claw at the end part of the mechanical arm to execute the operation corresponding to the state information comprises the following steps:
if the state information indicates that the hand is in a fist making state, controlling a mechanical claw positioned at the end part of the mechanical arm to perform closing operation;
and if the state information indicates that the hand is in an opening state, controlling a mechanical claw positioned at the end part of the mechanical arm to execute opening operation.
In the embodiment of the invention, the mechanical claw can be made of metal or other materials. Since the mechanical claw may apply a force when performing a closing operation, the mechanical claw may be made of a metal material to improve the hardness and other properties of the mechanical claw. For example, the motion sensing device may capture an image of the hand of the user through a camera, and determine the state of the hand of the user through image recognition or the like.
Optionally, acquiring, in real time through the body sensing device, the first relative position information of the user's arm relative to the user's trunk includes:
acquiring first space coordinates respectively corresponding to preset joints of the arms of the user in real time through the motion sensing equipment, and acquiring second space coordinates corresponding to the trunk of the user;
calculating the difference value between each first space coordinate and each second space coordinate to obtain the relative space coordinates of the preset joints of the arms relative to the trunk;
correspondingly, the real-time control mechanical arm moves according to the first relative position information, and the method comprises the following steps:
and controlling the movement of each shaft joint of the mechanical arm in real time according to the relative space coordinate, wherein each shaft joint corresponds to a preset joint of the arm.
The preset joints of the user's arm may include one or more of the shoulder joint, the elbow joint and the wrist joint. The number of preset joints may correspond to the number of shaft joints of the mechanical arm and to the way the arm moves; for example, if the mechanical arm has three shaft joints, the three shaft joints may be made to correspond respectively to the shoulder joint, the elbow joint and the wrist joint of the user's arm.
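As an illustration of the computation above, the following sketch derives the relative space coordinates of three preset joints and issues one command per shaft joint; the skeleton values and the motor-command call are hypothetical stand-ins for the somatosensory SDK and the arm driver.

```python
import numpy as np

PRESET_JOINTS = ("shoulder", "elbow", "wrist")  # one per shaft joint of the arm

def relative_joint_coords(first_coords: dict[str, np.ndarray],
                          torso_coord: np.ndarray) -> dict[str, np.ndarray]:
    """Difference between each preset joint's first space coordinate and the trunk's."""
    return {name: first_coords[name] - torso_coord for name in PRESET_JOINTS}

def drive_shaft_joints(relative: dict[str, np.ndarray]) -> None:
    # Each shaft joint of the mechanical arm follows its corresponding preset joint.
    for name, offset in relative.items():
        # move_axis(...) is a hypothetical motor command of the arm driver.
        print(f"move_axis({name!r}, target={offset.round(3).tolist()})")

first = {
    "shoulder": np.array([0.15, 0.40, 1.90]),
    "elbow":    np.array([0.30, 0.15, 1.85]),
    "wrist":    np.array([0.45, 0.10, 1.70]),
}
torso = np.array([0.00, 0.00, 1.90])  # the second space coordinate
drive_shaft_joints(relative_joint_coords(first, torso))
```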
In the embodiment of the invention, the first relative position information indicates the movement of the user's arm relative to the trunk, and the state information indicates whether the user's hand is open or closed. By controlling the mechanical arm in real time to move according to the first relative position information, and controlling the claw at its end in real time to perform the operation corresponding to the state information, the motions of the user's arm and hand receive real-time feedback and interaction through the mechanical arm and its end claw; the operation is flexible and simple, the user experience is good, and the operating efficiency is high. The scheme provides users with interactive activities and is highly interesting, practical and easy to use.
On the basis of the foregoing embodiment, fig. 2 is a schematic flow chart illustrating an implementation process of a robot arm control method according to a second embodiment of the present invention, and as shown in fig. 2, the robot arm control method may include the following steps:
step S201, detecting second relative position information between the user and the motion sensing device.
In this embodiment of the present invention, the second relative position information may include the relative distance and relative orientation between the user and the motion sensing device. For example, the relative positions of the user's left shoulder and right shoulder with respect to the motion sensing device may be detected, so as to determine the included angle between the direction the user's body faces and the central axis of the motion sensing device.
Step S202, if the second relative position information indicates that the distance between the user and the motion sensing device is within a preset distance range, and an included angle between the body orientation of the user and the device orientation of the motion sensing device is within a preset angle range, acquiring first relative position information of the arm of the user relative to the trunk of the user and state information of the hand in real time through the motion sensing device.
In the embodiment of the present invention, the body orientation of the user may refer to the direction the user's body faces. If the included angle between the user's body orientation and the device orientation of the motion sensing device is within a preset angle range, the user may be considered to be facing the motion sensing device. If, in addition, the distance between the user and the motion sensing device is within a preset distance range, the user may be considered to be facing the motion sensing device at a close distance; the human-recognition range of the motion sensing device is thereby limited to a certain region, reducing the influence of other individuals.
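The gating condition can be expressed as a simple predicate, sketched below; the threshold values, the coordinate convention and the shoulder-based orientation estimate are assumptions of this sketch rather than values fixed by the embodiment.

```python
import math
import numpy as np

MIN_DIST, MAX_DIST = 0.8, 3.0   # preset distance range in metres (assumed values)
MAX_ANGLE_DEG = 30.0            # preset angle range (assumed value)

def user_engaged(left_shoulder: np.ndarray, right_shoulder: np.ndarray) -> bool:
    """True if the user is within range and roughly facing the sensor.

    Coordinates are in the sensor frame: x to the right, y up, z away from the
    sensor; with the user facing the sensor, the user's right shoulder has the
    larger x value in this convention (an assumption of the sketch).
    """
    torso_mid = (left_shoulder + right_shoulder) / 2.0
    planar = torso_mid * np.array([1.0, 0.0, 1.0])   # drop the height component
    distance = float(np.linalg.norm(planar))
    # Body orientation: the horizontal normal of the shoulder line.
    shoulder_vec = right_shoulder - left_shoulder
    facing = np.array([shoulder_vec[2], 0.0, -shoulder_vec[0]])
    to_sensor = -planar
    cos_angle = float(np.dot(facing, to_sensor)
                      / (np.linalg.norm(facing) * np.linalg.norm(to_sensor)))
    angle = math.degrees(math.acos(max(-1.0, min(1.0, cos_angle))))
    return MIN_DIST <= distance <= MAX_DIST and angle <= MAX_ANGLE_DEG

# A user standing 1.5 m away, squarely facing the sensor:
print(user_engaged(np.array([-0.2, 1.4, 1.5]), np.array([0.2, 1.4, 1.5])))  # True
```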
Step S203, controlling the mechanical arm to move in real time according to the first relative position information, and controlling the mechanical claw at the end of the mechanical arm to perform, in real time, the operation corresponding to the state information.
Step S203 of this embodiment is the same as or similar to step S102, and reference may be specifically made to the description related to step S102, which is not repeated herein.
In the embodiment of the invention, by detecting the second relative position information between the user and the motion sensing device, it can be judged from this information whether the user is facing, and close to, the motion sensing device. This eliminates the influence of other individuals, such as passersby, improves the detection accuracy of the motion sensing device and reduces unnecessary detection.
It should be understood that, the sequence numbers of the steps in the foregoing embodiments do not imply an execution sequence, and the execution sequence of each process should be determined by its function and inherent logic, and should not constitute any limitation to the implementation process of the embodiments of the present invention.
Fig. 3 is a schematic diagram of a robot arm control device according to a third embodiment of the present invention. For convenience of explanation, only portions related to the embodiments of the present invention are shown.
The robot arm control device 300 includes:
the acquisition module 301 is configured to acquire first relative position information of an arm of a user relative to a trunk of the user in real time through a motion sensing device, and acquire state information of a hand of the user in real time;
and the control module 302 is configured to control the mechanical arm to move according to the first relative position information in real time, and control a mechanical claw located at the end of the mechanical arm to perform an operation corresponding to the state information in real time.
Optionally, the obtaining module 301 specifically includes:
the first detection unit is used for detecting second relative position information between a user and the somatosensory device;
and the second detection unit is used for acquiring first relative position information of the arm of the user relative to the trunk of the user and state information of the hand in real time through the body sensing equipment if the second relative position information indicates that the distance between the user and the body sensing equipment is within a preset distance range, and the body orientation of the user and the equipment orientation of the body sensing equipment form an included angle within a preset angle range.
Optionally, the arm is a right arm and the hand is a right hand, and/or the arm is a left arm and the hand is a left hand.
Optionally, the state information of the hand includes an indication that the hand is in a fist making state or an open state:
correspondingly, the control module 302 specifically includes:
the first control unit is used for controlling the mechanical claws at the end parts of the mechanical arms to execute closing operation if the state information indicates that the hands are in a fist making state;
and the second control unit is used for controlling the mechanical claw positioned at the end part of the mechanical arm to perform opening operation if the state information indicates that the hand is in the opening state.
Optionally, the obtaining module 301 specifically includes:
the first acquisition unit is used for acquiring first space coordinates respectively corresponding to preset joints of the arms of the user in real time through the somatosensory device and acquiring second space coordinates corresponding to the trunk of the user;
the second acquisition unit is used for calculating the difference value between each first space coordinate and each second space coordinate to obtain the relative space coordinates of the preset joints of the arms relative to the trunk;
the control module 302 is specifically configured to:
and controlling the movement of each shaft joint of the mechanical arm in real time according to the relative space coordinate, wherein each shaft joint corresponds to a preset joint of the arm.
In the embodiment of the invention, the first relative position information indicates the movement of the user's arm relative to the trunk, and the state information indicates whether the user's hand is open or closed. By controlling the mechanical arm in real time to move according to the first relative position information, and controlling the claw at its end in real time to perform the operation corresponding to the state information, the motions of the user's arm and hand receive real-time feedback and interaction through the mechanical arm and its end claw; the operation is flexible and simple, the user experience is good, and the operating efficiency is high. The scheme provides users with interactive activities and is highly interesting, practical and easy to use.
Fig. 4 is a schematic diagram of a terminal device according to a fourth embodiment of the present invention. As shown in fig. 4, the terminal device 4 of this embodiment includes: a processor 40, a memory 41 and a computer program 42 stored in the memory 41 and executable on the processor 40. The terminal device may further include a body sensing device and/or a mechanical arm, each coupled to the processor. Alternatively, the motion sensing device and/or the mechanical arm may be external devices outside the terminal device; their connection and communication modes with the processor may be set according to the actual application scenario, and are not limited herein.
The processor 40, when executing the computer program 42, implements the steps in the various robot arm control method embodiments described above, such as steps 101-102 shown in fig. 1. Alternatively, the processor 40, when executing the computer program 42, implements the functions of each module/unit in each device embodiment described above, for example, the functions of the modules 301 to 302 shown in fig. 3.
Illustratively, the computer program 42 may be partitioned into one or more modules/units that are stored in the memory 41 and executed by the processor 40 to implement the present invention. The one or more modules/units may be a series of computer program instruction segments capable of performing specific functions, which are used to describe the execution process of the computer program 42 in the terminal device 4. For example, the computer program 42 may be divided into an acquisition module and a control module, and the specific functions of the modules are as follows:
the acquisition module is used for acquiring first relative position information of an arm of a user relative to a trunk of the user in real time through the motion sensing device and acquiring state information of a hand of the user in real time;
and the control module is used for controlling the mechanical arm to move according to the first relative position information in real time and controlling the mechanical claw positioned at the end part of the mechanical arm to execute the operation corresponding to the state information in real time.
The terminal device 4 may be a desktop computer, a notebook, a palm computer, a cloud server, or other computing devices. The terminal device may include, but is not limited to, a processor 40, a memory 41. Those skilled in the art will appreciate that fig. 4 is merely an example of a terminal device 4 and does not constitute a limitation of terminal device 4 and may include more or fewer components than shown, or some components may be combined, or different components, e.g., the terminal device may also include input-output devices, network access devices, buses, etc.
The Processor 40 may be a Central Processing Unit (CPU), another general-purpose processor, a Digital Signal Processor (DSP), an Application-Specific Integrated Circuit (ASIC), a Field-Programmable Gate Array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, or the like. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor.
The memory 41 may be an internal storage unit of the terminal device 4, such as a hard disk or a memory of the terminal device 4. The memory 41 may also be an external storage device of the terminal device 4, such as a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) Card, a Flash memory Card (Flash Card), and the like, which are provided on the terminal device 4. Further, the memory 41 may also include both an internal storage unit and an external storage device of the terminal device 4. The memory 41 is used for storing the computer program and other programs and data required by the terminal device. The memory 41 may also be used to temporarily store data that has been output or is to be output.
It will be apparent to those skilled in the art that, for convenience and brevity of description, only the above-mentioned division of the functional units and modules is illustrated, and in practical applications, the above-mentioned function distribution may be performed by different functional units and modules according to needs, that is, the internal structure of the apparatus is divided into different functional units or modules to perform all or part of the above-mentioned functions. Each functional unit and module in the embodiments may be integrated in one processing unit, or each unit may exist alone physically, or two or more units are integrated in one unit, and the integrated unit may be implemented in a form of hardware, or in a form of software functional unit. In addition, specific names of the functional units and modules are only for convenience of distinguishing from each other, and are not used for limiting the protection scope of the present application. The specific working processes of the units and modules in the system may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
In the above embodiments, the descriptions of the respective embodiments have respective emphasis, and reference may be made to the related descriptions of other embodiments for parts that are not described or illustrated in a certain embodiment.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present invention.
In the embodiments provided in the present invention, it should be understood that the disclosed apparatus/terminal device and method may be implemented in other ways. For example, the above-described embodiments of the apparatus/terminal device are merely illustrative, and for example, the division of the modules or units is only one logical division, and there may be other divisions when actually implemented, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present invention may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated modules/units, if implemented in the form of software functional units and sold or used as independent products, may be stored in a computer-readable storage medium. Based on this understanding, all or part of the flow of the methods of the embodiments may also be implemented by a computer program, which may be stored in a computer-readable storage medium and which, when executed by a processor, implements the steps of the method embodiments. The computer program comprises computer program code, which may be in the form of source code, object code, an executable file, some intermediate form, or the like. The computer-readable medium may include: any entity or device capable of carrying the computer program code, a recording medium, a USB disk, a removable hard disk, a magnetic disk, an optical disk, a computer memory, a Read-Only Memory (ROM), a Random Access Memory (RAM), an electrical carrier signal, a telecommunications signal, a software distribution medium, and the like. It should be noted that the content contained in the computer-readable medium may be increased or decreased as required by legislation and patent practice in a jurisdiction; for example, in some jurisdictions, computer-readable media do not include electrical carrier signals and telecommunications signals.
The above-mentioned embodiments are only used for illustrating the technical solutions of the present invention, and not for limiting the same; although the present invention has been described in detail with reference to the foregoing embodiments, it will be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; such modifications and substitutions do not substantially depart from the spirit and scope of the embodiments of the present invention, and are intended to be included within the scope of the present invention.

Claims (9)

1. A robot arm control method is characterized by comprising:
detecting second relative position information between the user and the somatosensory device; the second relative position information comprises a relative distance and a relative orientation between the user and the somatosensory device;
if the second relative position information indicates that the distance between the user and the motion sensing equipment is within a preset distance range and an included angle between the body orientation of the user and the equipment orientation of the motion sensing equipment is within a preset angle range, acquiring first relative position information of the arms of the user relative to the trunk of the user in real time through the motion sensing equipment and acquiring state information of the hands of the user in real time; wherein the first relative position information includes a difference between coordinates of a specified feature point on the arm and coordinates of the torso;
and controlling the mechanical arm to move according to the first relative position information in real time, and controlling a mechanical claw positioned at the end part of the mechanical arm to execute an operation corresponding to the state information in real time.
2. The robot arm control method according to claim 1, wherein the arm is a right arm and the hand is a right hand, and/or wherein the arm is a left arm and the hand is a left hand.
3. The robot arm control method according to claim 1 or 2, wherein the state information of the hand includes information indicating that the hand is in a fist-making state or an open state:
correspondingly, the real-time control of the mechanical claw at the end part of the mechanical arm to execute the operation corresponding to the state information comprises the following steps:
if the state information indicates that the hand is in a fist making state, controlling a mechanical claw positioned at the end part of the mechanical arm to perform closing operation;
and if the state information indicates that the hand is in an opening state, controlling a mechanical claw positioned at the end part of the mechanical arm to execute opening operation.
4. The mechanical arm control method of claim 1 or 2, wherein the acquiring, in real time, first relative position information of the arm of the user relative to the torso of the user through the body sensing device comprises:
acquiring first space coordinates respectively corresponding to preset joints of the arms of the user in real time through the motion sensing equipment, and acquiring second space coordinates corresponding to the trunk of the user;
calculating the difference value between each first space coordinate and each second space coordinate to obtain the relative space coordinates of the preset joints of the arms relative to the trunk;
correspondingly, the real-time control mechanical arm moves according to the first relative position information, and the method comprises the following steps:
and controlling the movement of each shaft joint of the mechanical arm in real time according to the relative space coordinate, wherein each shaft joint corresponds to a preset joint of the arm.
5. A robot arm control apparatus, comprising:
an acquisition module, comprising: the first detection unit is used for detecting second relative position information between the user and the somatosensory device; the second detection unit is used for acquiring first relative position information of the arm of the user relative to the trunk of the user in real time through the body sensing equipment and acquiring state information of the hand of the user in real time if the second relative position information indicates that the distance between the user and the body sensing equipment is within a preset distance range and an included angle between the body orientation of the user and the equipment orientation of the body sensing equipment is within a preset angle range; wherein the first relative position information includes a difference between coordinates of a specified feature point on the arm and coordinates of the torso; wherein the second relative position information comprises a relative distance and a relative orientation between a user and the somatosensory device;
and the control module is used for controlling the mechanical arm to move according to the first relative position information in real time and controlling the mechanical claw positioned at the end part of the mechanical arm to execute the operation corresponding to the state information in real time.
6. The robot arm control apparatus of claim 5, wherein the state information of the hand includes an indication that the hand is in a fist-making state or an open state:
correspondingly, the control module specifically comprises:
the first control unit is used for controlling the mechanical claws at the end parts of the mechanical arms to execute closing operation if the state information indicates that the hands are in a fist making state;
and the second control unit is used for controlling the mechanical claw positioned at the end part of the mechanical arm to perform opening operation if the state information indicates that the hand is in the opening state.
7. The robot arm control device of claim 5, wherein the acquisition module specifically comprises:
the first acquisition unit is used for acquiring first space coordinates respectively corresponding to preset joints of the arms of the user in real time through the somatosensory device and acquiring second space coordinates corresponding to the trunk of the user;
the second acquisition unit is used for calculating the difference value between each first space coordinate and each second space coordinate to obtain the relative space coordinates of the preset joints of the arms relative to the trunk;
the control module is specifically configured to:
and controlling the movement of each shaft joint of the mechanical arm in real time according to the relative space coordinate, wherein each shaft joint corresponds to a preset joint of the arm.
8. Terminal device, comprising a robot arm, a processor and a computer program stored in a memory and executable on the processor, the robot arm being coupled with the processor, wherein the processor implements the steps of the robot arm control method according to any of claims 1 to 4 when executing the computer program.
9. A computer-readable storage medium, in which a computer program is stored which, when being executed by a processor, carries out the steps of the robot arm control method according to any one of claims 1 to 4.
CN201910542115.6A 2019-06-21 2019-06-21 Mechanical arm control method, mechanical arm control device and terminal equipment Active CN110216676B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910542115.6A CN110216676B (en) 2019-06-21 2019-06-21 Mechanical arm control method, mechanical arm control device and terminal equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910542115.6A CN110216676B (en) 2019-06-21 2019-06-21 Mechanical arm control method, mechanical arm control device and terminal equipment

Publications (2)

Publication Number Publication Date
CN110216676A CN110216676A (en) 2019-09-10
CN110216676B (en) 2022-04-26

Family

ID=67814113

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910542115.6A Active CN110216676B (en) 2019-06-21 2019-06-21 Mechanical arm control method, mechanical arm control device and terminal equipment

Country Status (1)

Country Link
CN (1) CN110216676B (en)

Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103386683A (en) * 2013-07-31 2013-11-13 哈尔滨工程大学 Kinect-based motion sensing-control method for manipulator
CN104440926A (en) * 2014-12-09 2015-03-25 重庆邮电大学 Mechanical arm somatic sense remote controlling method and mechanical arm somatic sense remote controlling system based on Kinect
CN106272409A (en) * 2016-08-03 2017-01-04 北京航空航天大学 Mechanical arm control method based on gesture identification and system
CN106313072A (en) * 2016-10-12 2017-01-11 南昌大学 Humanoid robot based on leap motion of Kinect
CN106313049A (en) * 2016-10-08 2017-01-11 华中科技大学 Somatosensory control system and control method for apery mechanical arm
CN108127667A (en) * 2018-01-18 2018-06-08 西北工业大学 A kind of mechanical arm body feeling interaction control method based on joint angle increment
CN108453742A (en) * 2018-04-24 2018-08-28 南京理工大学 Robot man-machine interactive system based on Kinect and method
CN108673505A (en) * 2018-05-28 2018-10-19 南昌大学 A kind of mechanical arm tail end precise motion control method
WO2018208761A1 (en) * 2017-05-11 2018-11-15 Misty Robotics, Inc. Infinite robot personalities
CN109814541A (en) * 2017-11-21 2019-05-28 深圳市优必选科技有限公司 Robot control method and system and terminal equipment

Also Published As

Publication number Publication date
CN110216676A (en) 2019-09-10

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant