CN116100565A - Immersive real-time remote operation platform based on exoskeleton robot - Google Patents
Immersive real-time remote operation platform based on exoskeleton robot
- Publication number
- CN116100565A (application CN202310389548.9A)
- Authority
- CN
- China
- Prior art keywords
- operation end
- upper limb
- exoskeleton
- real
- moment
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1615—Programme controls characterised by special kind of manipulator, e.g. planar, scara, gantry, cantilever, space, closed chain, passive/active joints and tendon driven manipulators
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/0006—Exoskeletons, i.e. resembling a human figure
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1679—Programme controls characterised by the tasks executed
- B25J9/1689—Teleoperation
-
- Y—GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y02—TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
- Y02P—CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
- Y02P90/00—Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
- Y02P90/02—Total factory control, e.g. smart factories, flexible manufacturing systems [FMS] or integrated manufacturing systems [IMS]
Landscapes
- Engineering & Computer Science (AREA)
- Robotics (AREA)
- Mechanical Engineering (AREA)
- Health & Medical Sciences (AREA)
- General Health & Medical Sciences (AREA)
- Orthopedic Medicine & Surgery (AREA)
- Manipulator (AREA)
Abstract
The invention provides an immersive real-time remote operation platform based on an exoskeleton robot. The platform comprises an upper limb exoskeleton divided into an operation end and a working end. The upper limb exoskeleton comprises joint motors; the joints of the upper limb exoskeleton are driven by the joint motors, and each joint motor corresponds to one degree of freedom of the human upper limb. The joint motors of the operation end collect the interaction moment of the operator and generate a driving signal, and the joint motors of the working end complete the task according to the driving signal. A sensing and detection module comprises an image sensor and a laser radar: the image sensor acquires visible light images and depth information, and the laser radar acquires three-dimensional point cloud data. A real-time network communication module provides the communication connection between the operation end and the working end. Based on high-bandwidth, low-latency real-time network communication, the invention provides the operator with rich proprioceptive feedback, including vision and pressure touch, and is suitable for interactive, high-precision remote operation.
Description
Technical Field
The invention relates to the technical field of robot control, and in particular to an immersive real-time remote operation platform based on an exoskeleton robot, specifically a real-time remote operation platform that integrates an upper limb exoskeleton, a mobile platform and three-dimensional mapping sensors.
Background
Remote control of robots has long been a major research direction in robotics, since one important purpose of robotics is to replace humans in performing operations in places that are dangerous or difficult to reach, for example in telemedicine, industrial inspection and high-voltage live working.
Wearable exoskeleton technology is a new mode of human-machine coupling that greatly increases the bandwidth of information exchange between the human body and a hardware system. Human-machine information coupling comprises wearable sensing and a sensory feedback system. The wearable robot carries sensors, both physical-signal and physiological-signal, for perceiving the environment, recognizing the state of the person and measuring the state of the robot. Human motion intention recognition is built on such wearable sensing systems; motion intention is diverse, and recognizing it is the key to making the wearable robot intelligent. Ideally, the wearable robot should reliably detect the intention of the user, provide continuous control signals for multiple degrees of freedom simultaneously with smooth, human-like control characteristics, and transfer external environment or human body state information back to the human nervous system (for example vibrotactile feedback, pressure touch, joint torque, electrical stimulation, visual and auditory stimulation), thereby constructing a feedback control loop for human-machine integration. Sensory feedback is critical in real-time complex interactive tasks; without haptic feedback it is almost impossible to perform tasks involving physical interaction with objects, particularly grasping and manipulation.
Patent document CN111345970A discloses a modular seven-degree-of-freedom upper limb exoskeleton comprising a load-bearing mobile platform and an upper limb motion mechanism; it improves comfort through flexible fasteners and gives the human body a degree of immersion through force-sense feedback, but it does not provide VR glasses with a three-dimensional sensor for mapping the environment and providing visual feedback, which greatly limits the amount of proprioceptive information.
Patent document CN109521784A discloses a haptic wearable upper limb exoskeleton control system and method for an unmanned aerial vehicle, which ensures the flight stability and operability of a remotely controlled unmanned aerial vehicle and generates corresponding motor driving signals from the force sensing data of the upper limb exoskeleton. By specifying the relationship between the safe flight level of the unmanned aerial vehicle and the upper limb assistive force output by the motors, the remote operator can perceive the flight safety level of the unmanned aerial vehicle through the strength of the assistive force, reducing the excessive visual dependence of safe operation. The system integrates a sensing module for intention recognition and a driving module for sensory feedback into the exoskeleton hardware, but the feedback is limited to the safety level of the unmanned aerial vehicle, so the sensory feedback information is single-channel and the operator must specifically learn and adapt to it.
Disclosure of Invention
In view of the defects in the prior art, the object of the invention is to provide an immersive real-time remote operation platform based on an exoskeleton robot.
According to the invention, the immersive real-time remote operation platform based on the exoskeleton robot comprises:
an upper limb exoskeleton, which is divided into an operation end and a working end;
the upper limb exoskeleton comprises joint motors, the joints of the upper limb exoskeleton are driven by the joint motors, and each joint motor corresponds to one degree of freedom of the human upper limb; the operation end joint motors collect the interaction moment of the operator and generate a driving signal, and the working end joint motors complete the task according to the driving signal;
the sensing and detecting module comprises an image sensor and a laser radar, wherein the image sensor acquires visible light images and depth information, and the laser radar acquires three-dimensional point cloud data;
and the real-time network communication module is used for the communication connection between the operation end and the working end.
Preferably, the upper limb exoskeleton further comprises a wheeled mobile platform located at the bottom of the upper limb exoskeleton and providing odometry based on a motor encoder and an inertial sensor; the wheeled mobile platform of the operation end generates a motion instruction according to the movement of the operator, and the wheeled mobile platform of the working end receives and executes the motion instruction.
Preferably, the upper limb exoskeleton has 7 active arm degrees of freedom and 1 operating handle degree of freedom, feeds back joint angle information in real time through the encoders of the joint motors, and feeds back joint moment information in real time through the current sensors of the joint motor drivers.
Preferably, the joint motors comprise a shoulder internal/external rotation joint motor, a shoulder adduction/abduction joint motor, a shoulder flexion/extension joint motor, an elbow flexion/extension joint motor, a wrist internal/external rotation joint motor, a wrist ulnar/radial deviation joint motor and a wrist flexion/extension joint motor;
the upper limb exoskeleton further comprises a clamping jaw mounted on the wrist flexion/extension joint motor.
Preferably, the image sensor collects visible light images and depth information around the working end, and a visual dense map of the working end environment is established;
the operation end further comprises VR equipment, the visual dense map is sent to the operation end, and the visual dense map is displayed to an operator through the VR equipment.
Preferably, the VR device adds the three-dimensional rendering effect of the current pose of the upper limb exoskeleton of the operation end to the VR image through the augmented reality technology.
Preferably, the laser radar collects three-dimensional point cloud data around the operation end and the working end, and provides obstacle avoidance signals for the operation end and the working end respectively.
Preferably, the platform further comprises a center cloud and edge clouds, wherein the working end edge cloud processes the sensor data of the image sensor and the laser radar in real time to construct an environment map, and the operation end edge cloud renders the received visual dense map into VR video according to the pose of the VR device;
the operation end, the operation end and the edge cloud are connected through a wireless communication network.
Preferably, the VR device includes a Wi-Fi module, a video decoding processing module, a play operation control module, and a display processing module.
Preferably, the interaction method between the operation end joint motors and the working end joint motors comprises the following steps:
the joint motor comprises an angle sensor and a moment sensor, and the moment interaction method between the operation end upper limb exoskeleton and the working end upper limb exoskeleton is as follows: the working moment fed back by the working end upper limb exoskeleton is subtracted from the human-machine interaction moment acquired at the operation end to calculate the human driving moment; a controller is designed to regulate the human driving moment to zero by changing the joint angles, and the joint angle commands are sent in real time to remotely control the working end upper limb exoskeleton.
Preferably, the relationship between the human-machine interaction moment at the operation end and the interaction moment of the working end environment is:
human-machine interaction moment at the operation end = interaction moment of the working end environment + human driving moment,
the interaction moment of the working end environment is adjusted through a booster gain function, and the booster gain function can either amplify the torque provided by the operator so that a larger torque is output at the working end, or amplify the force feedback from the working end.
Compared with the prior art, the invention has the following beneficial effects:
1. Real-time network communication with high bandwidth and low latency provides the operator with rich proprioceptive feedback, including vision and pressure touch, and meets the requirements of interactive, high-precision remote operation.
2. For the control of the operation end exoskeleton, the problem is treated as a regulator problem in joint space: the human driving torque is computed in real time from the difference between the torque fed back by the working end manipulator and the human-machine interaction torque measured by the operation end servo motors, and the regulator is designed with the goal of driving this human driving torque to zero. This naturally provides force feedback to the operator while the exoskeleton is being controlled, because the operator can feel the holding torque required at the working end as well as resistance torques caused by collisions, interference or friction.
3. The operator needs to provide only a small torque to obtain a larger torque output at the working end; conversely, the force feedback from the working end can be amplified, so that the operator perceives finer torque changes and controls more precisely.
Other advantages of the invention will be set forth in the description of the specific technical features and solutions, from which those skilled in the art will understand the benefits these features and solutions bring.
Drawings
Other features, objects and advantages of the present invention will become more apparent upon reading of the detailed description of non-limiting embodiments, given with reference to the accompanying drawings in which:
FIG. 1 is a schematic diagram of the operation of an exoskeleton robot-based immersive real-time remote operation platform provided by the present invention;
FIG. 2 is a flow chart of a torque feedback method according to the present invention.
Reference numerals illustrate:
Shoulder internal/external rotation joint motor 1
Shoulder adduction/abduction joint motor 2
Shoulder flexion/extension joint motor 3
Elbow flexion/extension joint motor 4
Wrist internal/external rotation joint motor 5
Wrist ulnar/radial deviation joint motor 6
Wrist flexion/extension joint motor 7
Clamping jaw 8
RGBD camera 9
Laser radar 10
Wheeled mobile platform 11
Visual and torque feedback channel 12
Exoskeleton and mobile platform control command channel 13
VR glasses 14
CPU 15
Battery 16
Detailed Description
The present invention will be described in detail with reference to specific examples. The following examples will assist those skilled in the art in further understanding the present invention, but are not intended to limit the invention in any way. It should be noted that variations and modifications could be made by those skilled in the art without departing from the inventive concept. These are all within the scope of the present invention.
The invention provides an immersive real-time remote operation platform based on an exoskeleton robot, which comprises the following components:
The upper limb exoskeleton comprises an operation end and a working end; a unified hardware platform serves as the wearable upper limb exoskeleton at the operation end and as the collaborative manipulator at the working end.
The upper limb exoskeleton comprises joint motors and a wheeled mobile platform 11; the joints of the upper limb exoskeleton are driven by the joint motors, and each joint motor corresponds to one degree of freedom of the human upper limb. The upper limb exoskeleton can be worn on the human upper limb for human-machine coupled motion, or can work independently under remote attitude angle commands.
The upper limb exoskeleton has 7 active arm degrees of freedom and 1 operating handle degree of freedom; joint angle information is fed back in real time through the encoders of the joint motors, and joint moment information is fed back in real time through the current sensors of the joint motor drivers. Specifically, the upper limb exoskeleton is divided into a left arm and a right arm; taking the right arm as an example, it comprises a shoulder internal/external rotation joint motor 1, a shoulder adduction/abduction joint motor 2, a shoulder flexion/extension joint motor 3, an elbow flexion/extension joint motor 4, a wrist internal/external rotation joint motor 5, a wrist ulnar/radial deviation joint motor 6, a wrist flexion/extension joint motor 7 and a clamping jaw 8. The clamping jaw 8 is mounted on the wrist flexion/extension joint motor 7.
The operation end joint motors collect the interaction moment of the operator and generate a driving signal, and the working end joint motors complete the task according to the driving signal.
The wheeled mobile platform 11 is located at the bottom of the upper limb exoskeleton, so the operator can move about freely while wearing the operation end device. The wheeled mobile platform 11 provides odometry based on a motor encoder and an inertial sensor; the operation end wheeled mobile platform 11 generates motion commands according to the operator's movement, the commands are sent through the exoskeleton and mobile platform control command channel 13, and the working end wheeled mobile platform 11 receives and executes them.
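As an illustration of this command path (a minimal sketch only: the UDP transport, the JSON message format, the address and the wheel parameters are assumptions, not taken from the patent), the following Python fragment shows the operation end turning wheel-encoder odometry into a velocity command and the working end platform executing it.

```python
import json
import socket

WORK_END_ADDR = ("192.168.1.50", 9000)   # placeholder address of the working end platform

def diff_drive_odometry(w_left, w_right, wheel_radius=0.08, track_width=0.4):
    """Differential-drive forward kinematics from the two wheel angular velocities
    (wheel_radius and track_width are placeholder values)."""
    v = wheel_radius * (w_left + w_right) / 2.0               # linear velocity [m/s]
    omega = wheel_radius * (w_right - w_left) / track_width   # angular velocity [rad/s]
    return v, omega

def send_motion_command(sock, v, omega):
    """Operation end: relay the operator's estimated motion as a velocity command."""
    sock.sendto(json.dumps({"v": v, "omega": omega}).encode(), WORK_END_ADDR)

def receive_and_execute(sock, drive):
    """Working end: receive one command and apply it; drive(v, omega) stands in
    for the platform's actual velocity interface."""
    data, _ = sock.recvfrom(1024)
    cmd = json.loads(data.decode())
    drive(cmd["v"], cmd["omega"])
```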
The sensing and detection module comprises an image sensor and a laser radar 10: the image sensor collects visible light images and depth information, and the laser radar 10 collects three-dimensional point cloud data. In this embodiment the image sensor is an RGBD camera 9. The laser radar 10 is fixedly mounted on the wheeled mobile platform 11, approximately at waist height on both sides of the human body; the RGBD camera 9 may be fixedly mounted on the wheeled mobile platform 11 (eye-to-hand) or on any joint of the upper limb exoskeleton (eye-in-hand).
The real-time network communication module provides the communication connection between the operation end and the working end; the communication content is divided into sensory feedback information from the working end to the operation end and movement commands from the operation end to the working end.
A power module, battery 16, powers the upper limb exoskeleton, the sensing and detection module and the other devices.
A processing module, CPU 15, performs the transfer and processing of the respective signals.
Among the structural parts, the fasteners of the operation end, which are worn on the human body, are soft and compliant, and the rigid links of the upper limb exoskeleton are telescopically adjustable.
The RGBD camera 9 and the laser radar 10 at the working end serve as data sources for a visual odometry and a lidar odometry, and the odometry estimate is corrected through multi-source data fusion using the V-LOAM algorithm (Ji Zhang, Sanjiv Singh, "Visual-lidar odometry and mapping: low-drift, robust, and fast", IEEE International Conference on Robotics and Automation (ICRA), 2015: 2174-2181). V-LOAM proposes a general framework for combining visual odometry with lidar odometry to improve the performance of real-time motion estimation and point cloud registration. Visual odometry estimates motion by computing the transformation between adjacent images; lidar odometry determines the change in position and orientation by comparing the previous and current lidar scans. Fusing the multi-modal information of the visual odometry and the scan-matching lidar odometry corrects the estimate, improving the accuracy and robustness of the system.
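To make the fusion idea concrete, the sketch below uses the visual-odometry pose increment as the initial guess for lidar scan matching (point-to-point ICP via the Open3D library), so the lidar refines the higher-rate but drift-prone visual estimate. This only illustrates the spirit of V-LOAM, not its actual implementation; the matching radius and the function name are assumptions.

```python
import numpy as np
import open3d as o3d

def fuse_visual_lidar_odometry(prev_scan_xyz, curr_scan_xyz, T_visual_init):
    """Refine a visual-odometry pose increment with lidar scan matching.

    prev_scan_xyz, curr_scan_xyz: (N, 3) numpy arrays of lidar points.
    T_visual_init: 4x4 transform between the two scans estimated by visual odometry,
                   used as the ICP initial guess.
    Returns the lidar-refined 4x4 transform.
    """
    source = o3d.geometry.PointCloud(o3d.utility.Vector3dVector(curr_scan_xyz))
    target = o3d.geometry.PointCloud(o3d.utility.Vector3dVector(prev_scan_xyz))
    result = o3d.pipelines.registration.registration_icp(
        source, target,
        max_correspondence_distance=0.5,   # assumed matching radius [m]
        init=T_visual_init,
        estimation_method=o3d.pipelines.registration.TransformationEstimationPointToPoint(),
    )
    return result.transformation
```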
The image sensor (RGBD camera 9) collects visible light images and depth information around the working end. Using SLAM techniques, RGB-D point clouds are readily computed from the camera intrinsics, and a visual dense map of the working end environment is built with algorithms such as Poisson reconstruction or Surfel reconstruction. The dense map is sent to the operation end through the visual and torque feedback channel 12, where it is displayed to the operator through a VR device. In this embodiment the VR device is a pair of VR glasses 14, which render the environment map of the working end for the operator; VR generates a three-dimensional simulated environment through dynamic environment modelling, real-time three-dimensional image generation, stereoscopic display and real-time interaction, with the aim of helping the operator navigate, avoid obstacles and make physical interaction decisions.
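The first step of this dense mapping, back-projecting an RGB-D frame into a colored point cloud with the pinhole intrinsics, can be sketched as follows; the intrinsic values in the usage comment are placeholders, and the Poisson/Surfel reconstruction itself is not shown.

```python
import numpy as np

def rgbd_to_point_cloud(depth_m, rgb, fx, fy, cx, cy):
    """Back-project a depth image (in meters) into camera-frame 3D points.

    depth_m: (H, W) float array of depths, 0 where invalid.
    rgb:     (H, W, 3) color image aligned with the depth image.
    fx, fy, cx, cy: pinhole intrinsics of the RGB-D camera.
    Returns (N, 3) points and (N, 3) colors for the valid pixels.
    """
    h, w = depth_m.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    x = (u - cx) * depth_m / fx
    y = (v - cy) * depth_m / fy
    valid = depth_m > 0
    points = np.stack([x[valid], y[valid], depth_m[valid]], axis=-1)
    colors = rgb[valid].reshape(-1, 3)
    return points, colors

# Usage with placeholder intrinsics (assumed values, not from the patent):
# points, colors = rgbd_to_point_cloud(depth, color, fx=525.0, fy=525.0, cx=319.5, cy=239.5)
```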
The video terminal of the VR glasses 14 includes a Wi-Fi module and modules for decoding, video frame processing, playback control and display processing, and provides capabilities such as decoding, pose sensing, motion trajectory prediction, real-time model rendering and presentation.
The VR glasses 14 of the operation end superimpose a three-dimensional rendering of the current pose of the operation end upper limb exoskeleton onto the VR image through augmented reality, further providing immersive visual feedback.
The laser radar 10 collects three-dimensional point cloud data around the operation end and the working end, providing obstacle avoidance signals for the operation end and the working end respectively. On the one hand, because the operator directly observes the dense map rendering of the working end through the VR glasses 14 while controlling the exoskeleton, observation of the operation end environment is limited, and the obstacle avoidance signal provided by the laser radar 10 can send a brake signal to the mobile platform 11 when a dangerous situation such as an impending collision is detected. On the other hand, the images and depth acquired by the RGB-D camera 9 at the working end have blind areas; since the effective ranging range and real-time measurement performance of the laser radar 10 exceed those of the RGB-D camera 9, it achieves better obstacle avoidance against dynamic objects that may be present at the working end.
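A minimal form of this obstacle avoidance signal could look like the sketch below; the stop distance, the height band used to ignore the ground, and the hypothetical stop() call are assumptions for illustration only.

```python
import numpy as np

def brake_signal(lidar_xyz, stop_distance=0.5, min_z=0.1, max_z=1.8):
    """Return True if any lidar point lies within stop_distance of the platform.

    lidar_xyz: (N, 3) point cloud in the platform frame (x forward, z up).
    Points outside the [min_z, max_z] height band (e.g. the ground) are ignored.
    """
    band = (lidar_xyz[:, 2] > min_z) & (lidar_xyz[:, 2] < max_z)
    if not np.any(band):
        return False
    horizontal_range = np.linalg.norm(lidar_xyz[band, :2], axis=1)
    return bool(np.min(horizontal_range) < stop_distance)

# Usage sketch: if brake_signal(scan): mobile_platform.stop()   # stop() is hypothetical
```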
The signal transmission architecture of the platform consists of four parts: a center cloud, a 5G network, edge clouds and terminals. The 5G network provides a highly flexible wireless uplink for front-end acquisition, with bandwidth above 100 Mbps and network latency of 5-8 milliseconds, which meets the bandwidth requirement of dense map transmission and eliminates the dizziness of viewing VR video during remote interaction. The edge clouds are closer to the users and therefore support highly real-time sensor data processing and low-latency rendering; they require basic resources such as CPU and memory as well as high-performance heterogeneous accelerator cards such as GPUs. The working end edge cloud processes the sensor data of the RGB-D camera 9 and the laser radar 10 in real time, runs the SLAM algorithm to construct the environment map, and transmits the processed data to the center cloud over the 5G network. The center cloud further processes and stores the data and sends it to the operation end edge cloud. The operation end edge cloud renders the received visual dense map into VR video according to the pose of the VR glasses 14 and sends it to the terminal, where it is presented to the operator through the VR device. The center cloud, the edge clouds and the terminals are all interconnected through the 5G network.
The joint servo motors of the wearable upper limb exoskeleton at the operation end are provided with angle sensing and torque sensing. The upper limb exoskeleton recognizes the motion intention of the operator through torque feedback. Specifically, the method for torque interaction between the operation end upper limb exoskeleton and the working end upper limb exoskeleton is as follows:
As shown in Fig. 2, the operation end exoskeleton is worn on the human upper limb; the human-machine interaction torque is collected by the servo motor torque sensors, and the torque data of the corresponding working end joints are subtracted to obtain the human driving torque. An LQR controller then regulates the human driving torque to zero by changing the joint angles. LQR is short for linear quadratic regulator, an optimal-control method widely used in control engineering; it minimizes a quadratic cost of the state deviation and the control effort by computing an optimal state feedback gain matrix. In the operation end upper limb exoskeleton, the LQR controller takes the servo motor torque feedback and outputs joint angle commands so as to regulate the human driving torque to zero. The operation end exoskeleton sends the joint angle commands in real time to remotely control the working end collaborative manipulator; the servo motor angle controllers at the working end perform position feedback control using the joint angle commands and the motor angle encoder feedback.
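The per-joint sketch below shows how such a regulator could be realized in discrete time with a standard LQR design (SciPy's discrete algebraic Riccati solver). The first-order spring-like coupling model, the stiffness value and the weights are assumptions made only to keep the example self-contained; they are not the patent's parameters.

```python
import numpy as np
from scipy.linalg import solve_discrete_are

# Assumed model: the human-exoskeleton coupling behaves like a spring, so moving
# the joint by dq in the direction the operator pushes reduces the human driving
# torque by roughly k_h * dq.  State x = human driving torque, input u = dq.
k_h = 20.0                                  # assumed coupling stiffness [Nm/rad]
A = np.array([[1.0]])                       # tau_drive[k+1] = tau_drive[k] - k_h*dq[k]
B = np.array([[-k_h]])
Q = np.array([[10.0]])                      # weight on residual driving torque
R = np.array([[0.1]])                       # weight on joint-angle increments

P = solve_discrete_are(A, B, Q, R)
K = np.linalg.solve(R + B.T @ P @ B, B.T @ P @ A)    # optimal state-feedback gain

def joint_angle_command(q_cmd_prev, tau_master, tau_slave_feedback):
    """One control tick for one joint.

    tau_master: torque measured by the operation end joint torque sensor.
    tau_slave_feedback: torque fed back from the corresponding working end joint.
    Returns the updated joint angle command sent to the working end manipulator.
    """
    tau_drive = tau_master - tau_slave_feedback       # human driving torque
    dq = -K[0, 0] * tau_drive                         # u = -K x, regulate to zero
    return q_cmd_prev + dq
```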
The torque measured by the working end servo motor torque sensors actually comprises the static holding torque of the manipulator pose and the interaction torque between the manipulator and the environment. Because the working end pose follows the operation end teaching in real time, the static holding torques of the manipulators at the two ends are approximately equal and cancel after the subtraction; thus, for the quasi-static process of the exoskeleton:
human-machine interaction torque at the operation end = booster gain function(interaction torque of the working end environment) + human driving torque,
where the human-machine interaction torque comprises a term for sensing the force feedback of the working end and an active torque for driving the exoskeleton, and is independent of the holding torque of the manipulator;
the booster gain function enables a human body to provide only a smaller moment to obtain a larger moment output at the working end, and also enables force feedback at the working end to be amplified, so that the human body can feel that finer moment changes are controlled more finely. The human body can adapt to the remote control power-assisted function well through practice as long as the power-assisted gain function is monotonically increased; different boosting gain functions can be designed in different working modes to meet the requirements of application scenes.
The interaction torque of the working end environment is thus adjusted through the booster gain function, which can either amplify the torque provided by the operator so that a larger torque is output at the working end, or amplify the force feedback from the working end.
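One way to read this relation is as a quasi-static torque balance with a monotonically increasing gain g(·). The sketch below is purely illustrative: the piecewise-linear shape of g, the dead band and the constants are assumptions, and different working modes would substitute different gain functions, as stated above.

```python
import math

def booster_gain(tau_env, alpha=0.2, dead_band=0.5):
    """Monotonically increasing gain mapping the working end environment torque
    to the torque component felt at the operation end.  With alpha < 1 the
    operator only needs a small torque to balance a large working end torque
    (power assist); with alpha > 1 the working end force feedback is amplified
    so finer torque changes can be perceived and controlled."""
    if abs(tau_env) <= dead_band:
        return tau_env
    return math.copysign(dead_band + alpha * (abs(tau_env) - dead_band), tau_env)

def operation_end_torque(tau_env, tau_drive):
    # Quasi-static relation from the text:
    # human-machine interaction torque = g(environment torque) + human driving torque
    return booster_gain(tau_env) + tau_drive
```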
In the description of the present application, it should be understood that the terms "upper," "lower," "front," "rear," "left," "right," "vertical," "horizontal," "top," "bottom," "inner," "outer," and the like indicate orientations or positional relationships based on the orientations or positional relationships illustrated in the drawings, merely to facilitate description of the present application and simplify the description, and do not indicate or imply that the devices or elements being referred to must have a specific orientation, be configured and operated in a specific orientation, and are not to be construed as limiting the present application.
The foregoing describes specific embodiments of the present invention. It is to be understood that the invention is not limited to the particular embodiments described above, and that various changes or modifications may be made by those skilled in the art within the scope of the appended claims without affecting the spirit of the invention. The embodiments of the present application and features in the embodiments may be combined with each other arbitrarily without conflict.
Claims (10)
1. An immersive real-time remote operation platform based on an exoskeleton robot, comprising:
an upper limb exoskeleton, which is divided into an operation end and a working end;
the upper limb exoskeleton comprises joint motors, the joints of the upper limb exoskeleton are driven by the joint motors, and each joint motor corresponds to one degree of freedom of the human upper limb; the joint motors of the operation end collect the interaction moment of the operator and generate a driving signal, and the joint motors of the working end complete the task according to the driving signal;
the sensing and detecting module comprises an image sensor and a laser radar (10), wherein the image sensor is used for collecting visible light images and depth information, and the laser radar (10) is used for collecting three-dimensional point cloud data;
and the real-time network communication module is used for the communication connection between the operation end and the working end.
2. The exoskeleton robot-based immersive real-time remote operation platform of claim 1, wherein the upper limb exoskeleton further comprises a wheeled mobile platform (11), the wheeled mobile platform (11) is located at the bottom of the upper limb exoskeleton, the wheeled mobile platform (11) comprises odometry based on a motor encoder and an inertial sensor, the operation end wheeled mobile platform (11) generates a motion command according to the movement of the operator, and the wheeled mobile platform (11) of the working end receives and executes the motion command.
3. The exoskeleton robot-based immersive real-time remote operation platform of claim 1, wherein the upper limb exoskeleton has 7 active arm degrees of freedom and 1 operating handle degree of freedom, joint angle information is fed back in real time through the encoder of the joint motor, and joint moment information is fed back in real time through the current sensor of the joint motor driver.
4. The exoskeleton robot-based immersive real-time teleoperation platform of claim 3, wherein the joint motors comprise a shoulder internal/external rotation joint motor (1), a shoulder adduction/abduction joint motor (2), a shoulder flexion/extension joint motor (3), an elbow flexion/extension joint motor (4), a wrist internal/external rotation joint motor (5), a wrist ulnar/radial deviation joint motor (6) and a wrist flexion/extension joint motor (7);
the upper limb exoskeleton further comprises a clamping jaw (8), and the clamping jaw (8) is mounted on the wrist flexion/extension joint motor (7).
5. The exoskeleton robot-based immersive real-time remote operation platform of claim 1, wherein the image sensor collects visible light images and depth information around the working end, and establishes a visual dense map of the working end environment;
the operation end further comprises VR equipment, the visual dense map is sent to the operation end, and the visual dense map is displayed to an operator through the VR equipment.
6. The exoskeleton robot-based immersive real-time remote operation platform of claim 5, wherein the VR device adds a three-dimensional rendering effect of the current pose of the upper limb exoskeleton of the operation end to the VR image through an augmented reality technique.
7. The exoskeleton robot-based immersive real-time remote operation platform of claim 1, wherein the lidar (10) collects three-dimensional point cloud data around the operation end and the working end, providing obstacle avoidance signals for the operation end and the working end, respectively.
8. The exoskeleton robot-based immersive real-time remote operation platform of claim 1, further comprising a center cloud and an edge cloud, the working end edge cloud processing the sensor data of the image sensor and the laser radar (10) in real time and constructing an environment map, the operation end edge cloud rendering the received visual dense map as VR video in accordance with the pose of the VR device;
the operation end, the working end and the edge cloud are connected through a wireless communication network.
9. The exoskeleton robot-based immersive real-time remote operation platform of claim 1, wherein the joint motor comprises an angle sensor and a moment sensor, and the method for moment interaction between the upper limb exoskeleton of the operation end and the upper limb exoskeleton of the working end comprises: the working moment fed back by the upper limb exoskeleton of the working end is subtracted from the human-machine interaction moment acquired at the operation end to calculate the human driving moment; a controller is designed to regulate the human driving moment to zero by changing the joint angle, and the joint angle command is sent in real time to remotely control the upper limb exoskeleton of the working end.
10. The exoskeleton robot-based immersive real-time remote operation platform of claim 9, wherein the relationship between the human-machine interaction moment at the operation end and the interaction moment of the working end environment is:
human-machine interaction moment at the operation end = interaction moment of the working end environment + human driving moment,
the interaction moment of the working end environment is adjusted through a booster gain function, and the booster gain function can either amplify the torque provided by the operator so that a larger torque is output at the working end, or amplify the force feedback from the working end.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202310389548.9A CN116100565A (en) | 2023-04-13 | 2023-04-13 | Immersive real-time remote operation platform based on exoskeleton robot |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202310389548.9A CN116100565A (en) | 2023-04-13 | 2023-04-13 | Immersive real-time remote operation platform based on exoskeleton robot |
Publications (1)
Publication Number | Publication Date |
---|---|
CN116100565A true CN116100565A (en) | 2023-05-12 |
Family
ID=86258328
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202310389548.9A Pending CN116100565A (en) | 2023-04-13 | 2023-04-13 | Immersive real-time remote operation platform based on exoskeleton robot |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN116100565A (en) |
Patent Citations (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20080009771A1 (en) * | 2006-03-29 | 2008-01-10 | Joel Perry | Exoskeleton |
CN102229146A (en) * | 2011-04-27 | 2011-11-02 | 北京工业大学 | Remote control humanoid robot system based on exoskeleton human posture information acquisition technology |
CN104385266A (en) * | 2014-08-28 | 2015-03-04 | 北京邮电大学 | Seven-degree-of-freedom external skeleton type teleoperation main hand |
CN107284544A (en) * | 2017-07-30 | 2017-10-24 | 福州大学 | A kind of multi-functional General Mobile robot chassis and its application process |
US20190099877A1 (en) * | 2017-09-29 | 2019-04-04 | Airbus Operations Gmbh | Exoskeleton System |
CN110039545A (en) * | 2019-04-30 | 2019-07-23 | 齐鲁工业大学 | A kind of robot remote control system and control method based on wearable device |
CN110181482A (en) * | 2019-05-23 | 2019-08-30 | 北京邮电大学 | A kind of modularization seven freedom upper limb exoskeleton robot |
CN111319026A (en) * | 2020-02-06 | 2020-06-23 | 北京凡川智能机器人科技有限公司 | Immersive human-simulated remote control method for double-arm robot |
WO2022043504A1 (en) * | 2020-08-27 | 2022-03-03 | Extend Robotics | Remote operation of robotic systems |
CN114905478A (en) * | 2021-02-08 | 2022-08-16 | 腾讯科技(深圳)有限公司 | Bilateral teleoperation system and control method |
CN115113517A (en) * | 2021-03-17 | 2022-09-27 | 腾讯科技(深圳)有限公司 | Bilateral force feedback method, device, equipment and medium |
Non-Patent Citations (1)
Title |
---|
Chen Min (陈敏), "人工智能通信理论与算法" (Artificial Intelligence Communication Theory and Algorithms), pages 245-248 *
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN117921638A (en) * | 2024-03-20 | 2024-04-26 | 戴盟(深圳)机器人科技有限公司 | Exoskeleton device, system and teaching method for teleoperation of humanoid robot |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US20200055195A1 (en) | Systems and Methods for Remotely Controlling a Robotic Device | |
JP6567563B2 (en) | Humanoid robot with collision avoidance and orbit return capability | |
CN110825076B (en) | Mobile robot formation navigation semi-autonomous control method based on sight line and force feedback | |
JP4821865B2 (en) | Robot apparatus, control method therefor, and computer program | |
US8873831B2 (en) | Walking robot and simultaneous localization and mapping method thereof | |
CN111055281A (en) | ROS-based autonomous mobile grabbing system and method | |
US20060129278A1 (en) | Legged mobile robot control system | |
CN104503450A (en) | Service robot achieving intelligent obstacle crossing | |
CN113829343B (en) | Real-time multitasking and multi-man-machine interaction system based on environment perception | |
Walas et al. | Messor–verstatile walking robot for serach and rescue missions | |
CN116100565A (en) | Immersive real-time remote operation platform based on exoskeleton robot | |
CN111230888A (en) | RGBD camera-based upper limb exoskeleton robot obstacle avoidance method | |
Zhou et al. | Teleman: Teleoperation for legged robot loco-manipulation using wearable imu-based motion capture | |
TW201540281A (en) | Walking assist system of robot | |
Fernando et al. | Effectiveness of Spatial Coherent Remote Drive Experience with a Telexistence Backhoe for Construction Sites. | |
Takahashi et al. | Development of the assistive mobile robot system: AMOS—to aid in the daily life of the physically handicapped | |
Wang et al. | Intuitive and versatile full-body teleoperation of a humanoid robot | |
Anderson et al. | Coordinated control and range imaging for mobile manipulation | |
Xu et al. | Design of a human-robot interaction system for robot teleoperation based on digital twinning | |
Walęcki et al. | Control system of a service robot's active head exemplified on visual servoing | |
Nagasawa et al. | Development of a walking assistive robot for elderly people in outdoor environments | |
CN115922731B (en) | Control method of robot and robot | |
Xu et al. | Study of reinforcement learning based shared control of walking-aid robot | |
Liu et al. | Bilateral control of teleoperation manipulator based on virtual force aware guidance | |
Li et al. | Image based approach to obstacle avoidance in mobile manipulators |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
RJ01 | Rejection of invention patent application after publication | ||
Application publication date: 20230512 |