Detailed Description
In the following description, for purposes of explanation and not limitation, specific details are set forth, such as particular system structures, techniques, etc. in order to provide a thorough understanding of the embodiments of the invention. It will be apparent, however, to one skilled in the art that the present invention may be practiced in other embodiments that depart from these specific details. In other instances, detailed descriptions of well-known systems, devices, circuits, and methods are omitted so as not to obscure the description of the present invention with unnecessary detail.
In order to explain the technical means of the present invention, the following description will be given by way of specific examples.
Referring to Fig. 1, an embodiment of the present invention provides a master-slave type dexterous hand following method, including:
S101: for each dexterous hand finger on the dexterous hand: acquiring, in real time, a first bending angle characteristic of the dexterous hand finger at the current moment as detected by an angle sensor; and determining gesture information of the dexterous hand finger at the current moment according to the first bending angle characteristic; wherein the angle sensor is arranged at the proximal knuckle of the dexterous hand finger;
S102: for each human finger on the human hand: acquiring, in real time, a second bending angle characteristic of the human finger at the current moment as detected by a rotation angle sensor, a third bending angle characteristic as detected by a first bending sensor, and a fourth bending angle characteristic as detected by a second bending sensor; and determining gesture information of the human finger at the current moment according to the second, third, and fourth bending angle characteristics; wherein the first bending sensor is arranged on the dorsal side of the human finger, the second bending sensor is arranged on the palmar side of the human finger, and the rotation angle sensor is arranged at the metacarpophalangeal (finger-palm) joint of the human finger;
S103: respectively determining the action direction and the action time of each dexterous hand finger at the current moment according to the gesture information of each human finger at the current moment and the gesture information of each dexterous hand finger at the current moment;
S104: respectively driving the corresponding dexterous hand fingers to follow the human fingers according to the action direction and the action time of each dexterous hand finger at the current moment.
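The four steps above can be sketched as a single control cycle. The sensor-reading and drive functions below (`read_hand_sensor`, `read_glove_sensors`, `drive_rod`) and all mappings are hypothetical placeholders for illustration, not interfaces from the embodiment:

```python
# Sketch of one control cycle of S101-S104 over the five fingers.
# All callables are hypothetical stand-ins for the real hardware and
# for the calibrated mappings described later in the text.

def control_cycle(read_hand_sensor, estimate_hand, read_glove_sensors,
                  estimate_human, f5_prime, drive_rod, n_fingers=5):
    for i in range(n_fingers):
        theta_hand = estimate_hand(read_hand_sensor(i))       # S101
        theta_human = estimate_human(*read_glove_sensors(i))  # S102
        # S103: expected total driving time minus actual total driving time
        dt = f5_prime(theta_human) - f5_prime(theta_hand)
        direction = 1 if dt > 0 else (-1 if dt < 0 else 0)    # bend/extend/hold
        drive_rod(i, direction, abs(dt))                      # S104

# Illustrative run with identity mappings and fixed sensor readings.
log = []
control_cycle(
    read_hand_sensor=lambda i: 20.0,
    estimate_hand=lambda a: a,
    read_glove_sensors=lambda i: (30.0,),
    estimate_human=lambda a: a,
    f5_prime=lambda angle: angle,            # angle -> driving time
    drive_rod=lambda i, d, t: log.append((i, d, t)),
)
```

With the human-finger estimate (30.0) ahead of the dexterous-finger estimate (20.0), each finger is commanded to continue bending for the remaining driving time.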
The embodiment of the invention provides a master-slave type dexterous hand following method for controlling a dexterous hand to follow the motion of a human hand.
Referring to Fig. 2, the dexterous hand is provided with five modularized human-simulated three-knuckle dexterous hand fingers of completely identical structure, and the mechanical dimensions and transmission relationships of the dexterous hand fingers are determined according to the physiological structure parameters of the human fingers and the coupled-motion mechanism. A single angle sensor is arranged at the proximal knuckle of each dexterous hand finger and is used for collecting the first bending angle characteristic of the finger during operation of the dexterous hand, from which the gesture information of the dexterous hand finger is obtained.
According to the physiological structure of the human hand, a first bending sensor is arranged on the dorsal side of each human finger, a second bending sensor is arranged on the palmar side of each human finger, and a rotation angle sensor is arranged at the metacarpophalangeal joint; these are used for acquiring the second, third, and fourth bending angle characteristics when the human hand moves.
The control quantities (action time and action direction) required by the dexterous hand are determined according to the obtained gesture information of the dexterous hand fingers and the gesture information of the human fingers, and the dexterous hand is driven to execute the grasping operation. The embodiment of the invention fuses data from multiple different types of sensors, realizes accurate following of the human hand's gestures by the dexterous hand, and enables the dexterous hand to be controlled by the human hand to carry out effective remote grasping operations.
The dexterous hand fingers adopt an under-actuated design, and a micro electric push rod serves as the sole driving source of each finger, driving the finger to move according to the calculated control quantity so as to realize grasping.
In some embodiments, referring to Figs. 3 and 4, the first bending sensor 23, the second bending sensor 24, and the rotation angle sensor 22 are disposed on a data glove 21, and the data glove 21 is worn on the human hand.
The data glove 21 is further provided with an exoskeleton joint 25 and a palm fixing seat 26. The rotation angle sensor 22 is fixedly mounted on the palm fixing seat 26; one end of the exoskeleton joint 25 is fixedly connected with the data glove 21, and the other end is fixedly connected with the rotation angle sensor 22; the palm fixing seat 26 is fixed on the wrist of the human hand by a hook-and-loop fastener. The first bending sensor 23 and the second bending sensor 24 are disposed on the dorsal side and the palmar side of the data glove 21, respectively.
In some embodiments, the gesture information Θi(t) of the ith human finger at time t is calculated from the second, third, and fourth bending angle characteristics of that finger at time t, wherein k1 is the first weight coefficient, k2 is the second weight coefficient, m1 is the third weight coefficient, and m2 is the fourth weight coefficient; f2 is the mapping relation between the second bending angle characteristic and the human finger bending angle, f3 is the mapping relation between the third bending angle characteristic and the human finger bending angle, and f4 is the mapping relation between the fourth bending angle characteristic and the human finger bending angle.
According to the embodiment of the invention, the data of the individual sensors are fused by an average weighting method to obtain the gesture information of the human hand; the calculation result is accurate, and the method is simple and practical.
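The average-weighting fusion might look like the following sketch. The embodiment's exact formula is not reproduced in the text, so the particular combination below (the two bend-sensor estimates fused with m1/m2, then combined with the rotation-angle estimate via k1/k2, each weight pair assumed to sum to 1) is an assumption:

```python
def fuse_gesture(a2, a3, a4, f2, f3, f4, k=(0.5, 0.5), m=(0.5, 0.5)):
    """Assumed weighted fusion of the three per-finger sensor features.
    a2: rotation angle sensor feature; a3/a4: dorsal/palmar bend features.
    f2, f3, f4 map raw features to bending angles (from calibration);
    k and m are the weight coefficient pairs (k1, k2) and (m1, m2)."""
    k1, k2 = k
    m1, m2 = m
    bend = m1 * f3(a3) + m2 * f4(a4)   # fuse the two bending sensors
    return k1 * f2(a2) + k2 * bend     # combine with the rotation sensor

# With identity mappings and agreeing sensors the fused angle equals the
# common value (illustrative check only).
ident = lambda x: x
theta = fuse_gesture(35.0, 35.0, 35.0, ident, ident, ident)
```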
In some embodiments, prior to S102, the master-slave dexterous hand following method may further comprise:
S105: for at least one human finger on the human hand: acquiring the second, third, and fourth bending angle characteristics corresponding to the human finger at each of a plurality of preset positions; acquiring images of the human finger, shot by a camera, corresponding to the human finger at each of the plurality of preset positions, and determining the human finger bending angle corresponding to each of the plurality of preset positions according to those images;
S106: determining the mapping relation between the second bending angle characteristic and the human finger bending angle according to the second bending angle characteristics and the human finger bending angles corresponding to the at least one human finger at the plurality of preset positions;
S107: determining the mapping relation between the third bending angle characteristic and the human finger bending angle according to the third bending angle characteristics and the human finger bending angles corresponding to the human finger at the plurality of preset positions;
S108: determining the mapping relation between the fourth bending angle characteristic and the human finger bending angle according to the fourth bending angle characteristics and the human finger bending angles corresponding to the human finger at the plurality of preset positions.
In the embodiment of the invention, the data glove is worn on the human hand during operation so that the glove stays fitted to the hand, keeping the operation process stable. A high-speed camera is used to calibrate the human finger gesture information against the output values of the sensors on the data glove. The high-speed camera faces the side of the data glove; a human finger is slowly bent to each preset position, and the human finger bending angle is obtained after image processing. The data of the three sensors on the data glove are acquired at the same preset positions, and the mapping relation between each sensor's data and the human finger bending angle is calculated respectively. To improve calibration accuracy, several fingers of the human hand may be calibrated, and a larger number of preset positions may be selected.
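The per-sensor calibration of S105–S108 amounts to fitting a mapping from raw sensor output to the camera-measured bending angle at the preset positions. The embodiment does not state the model form, so the linear least-squares fit below is one plausible choice, and the numbers are synthetic:

```python
import numpy as np

def fit_sensor_mapping(raw_outputs, measured_angles, degree=1):
    """Fit a polynomial mapping f: raw sensor feature -> finger bending
    angle from calibration pairs. degree=1 assumes an approximately
    linear sensor; the embodiment does not specify the model."""
    coeffs = np.polyfit(raw_outputs, measured_angles, degree)
    return np.poly1d(coeffs)

# Synthetic calibration data following angle = 0.5 * raw + 5
# (illustrative only; real data comes from the high-speed camera).
raw = [10.0, 20.0, 30.0, 40.0]
angles = [10.0, 15.0, 20.0, 25.0]
f2 = fit_sensor_mapping(raw, angles)
```

The same fitting routine would be reused for each of the three glove sensors (S106, S107, S108), producing f2, f3, and f4 respectively.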
In some embodiments, the calibration process of the sensors on the data glove may be repeated multiple times, and the first weight coefficient k1, the second weight coefficient k2, the third weight coefficient m1, and the fourth weight coefficient m2 may be determined using a multi-factor analysis of variance (ANOVA) method.
In some embodiments, the gesture information Θi′(t) of the ith dexterous hand finger at time t can be calculated by applying the mapping f1 to the first bending angle characteristic of the ith dexterous hand finger at time t, wherein f1 is the mapping relation between the first bending angle characteristic and the dexterous hand finger bending angle.
In some embodiments, prior to S101, the master-slave dexterous hand following method may further comprise:
S109: for at least one dexterous hand finger on the dexterous hand: acquiring the first bending angle characteristics corresponding to the dexterous hand finger at each of a plurality of preset positions; acquiring images of the dexterous hand finger, shot by a camera, corresponding to the dexterous hand finger at each of the plurality of preset positions, and determining the dexterous hand finger bending angle corresponding to each of the plurality of preset positions according to those images;
S1010: determining the mapping relation between the first bending angle characteristic and the dexterous hand finger bending angle according to the first bending angle characteristics and the dexterous hand finger bending angles corresponding to the dexterous hand finger at the plurality of preset positions.
The dexterous hand is controlled to perform a no-load grasping operation, and a high-speed camera is used to calibrate the dexterous hand finger bending angle against the output value of the angle sensor (the first bending angle characteristic). The high-speed camera faces the side of the human-simulated dexterous hand finger, and the finger is bent to a preset position under the drive of the micro electric push rod; the image shot by the high-speed camera is processed to obtain the dexterous hand finger bending angle; at the same time, the first bending angle characteristic detected by the angle sensor is acquired, and the mapping relation between the first bending angle characteristic and the dexterous hand finger bending angle is calculated.
In some embodiments, S103 may include:
S1031: for each dexterous hand finger on the dexterous hand: determining the difference between the gesture information of the dexterous hand finger at the current moment and the gesture information of the human finger followed by that dexterous hand finger at the current moment, and determining the action direction and the action time of the dexterous hand finger at the current moment according to the difference.
In some embodiments, the action direction and the action time of the ith dexterous hand finger at time t may be calculated by the following formulas:
Δti(t) = Ti2(t) − Ti1(t)
Ti1(t) = f5′(Θi′(t))
Ti2(t) = f5′(Θi(t))
wherein Θi′(t) is the gesture information of the ith dexterous hand finger at time t, and Θi(t) is the gesture information of the ith human finger at time t; f5′ is the mapping relation between the dexterous hand finger bending angle and the driving time; Ti1(t) is the total driving time of the ith dexterous hand finger at time t, and Ti2(t) is the expected total driving time of the ith dexterous hand finger at time t; Δti(t) is the action time of the ith dexterous hand finger at time t. When Δti(t) > 0, the action direction of the dexterous hand finger is continued bending; when Δti(t) < 0, the action direction is reverse extension.
Δti(t) > 0 means that the gesture of the dexterous hand lags behind the gesture of the human hand, and the dexterous hand needs to be controlled to continue bending; Δti(t) = 0 means that the dexterous hand gesture is consistent with the human hand gesture, and the dexterous hand does not need to move; Δti(t) < 0 means that the dexterous hand gesture leads the human hand gesture, and the dexterous hand needs to be controlled to extend in reverse. The sign of Δti(t) represents the action direction, and the absolute value of Δti(t) represents the action time.
After the action direction and the action time of each dexterous hand finger are obtained, five micro electric push rods are used to control the five dexterous hand fingers respectively; the bending angles of the five fingers are controlled by adjusting the action direction and the action time of the micro electric push rods, so that the dexterous hand accurately follows the human hand.
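The control quantities above can be computed directly from the stated formulas. The mapping f5′ is illustrated here as a simple linear angle-to-time relation, which is an assumption for demonstration only:

```python
def action_command(theta_hand, theta_human, f5_prime):
    """Compute action direction and action time per the stated formulas:
    Ti1 = f5'(theta_hand)  -- total driving time so far,
    Ti2 = f5'(theta_human) -- expected total driving time,
    dt  = Ti2 - Ti1. The sign gives the direction, the magnitude the time."""
    t1 = f5_prime(theta_hand)
    t2 = f5_prime(theta_human)
    dt = t2 - t1
    if dt > 0:
        direction = "continue bending"   # dexterous hand lags the human hand
    elif dt < 0:
        direction = "reverse extension"  # dexterous hand leads the human hand
    else:
        direction = "no motion"          # gestures are consistent
    return direction, abs(dt)

# Hypothetical linear mapping: 10 ms of drive per degree of bend.
f5 = lambda angle_deg: 10.0 * angle_deg
cmd = action_command(theta_hand=25.0, theta_human=40.0, f5_prime=f5)
```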
In some embodiments, before S1031, S103 may further include:
S1032: for at least one dexterous hand finger on the dexterous hand: acquiring the driving times corresponding to the dexterous hand finger moving to each of a plurality of preset positions; acquiring images of the dexterous hand finger, shot by a camera, corresponding to each of the plurality of preset positions, and determining the dexterous hand finger bending angle corresponding to each of the plurality of preset positions according to those images;
S1033: determining the mapping relation between the driving time and the dexterous hand finger bending angle according to the driving times and the dexterous hand finger bending angles corresponding to the at least one dexterous hand finger moving to the plurality of preset positions.
The micro electric push rod of the dexterous hand is driven by a constant voltage, and a high-speed camera is used to calibrate the dexterous hand finger bending angle against the driving time of the micro electric push rod. The high-speed camera faces the side of the human-simulated dexterous hand finger; the finger is bent to a preset position under the drive of the micro electric push rod, the image shot by the high-speed camera is processed to obtain the dexterous hand finger bending angle, and the mapping relation between the dexterous hand finger bending angle and the driving time is then calculated.
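The driving-time calibration of S1032–S1033 yields angle/time pairs at the preset positions; between them, piecewise-linear interpolation is one plausible way to realize f5′. The embodiment does not state the interpolation scheme, and the calibration numbers below are illustrative only:

```python
import numpy as np

# Hypothetical calibration pairs: bending angle (deg) reached after each
# total driving time (ms) at constant voltage.
cal_angles = np.array([0.0, 15.0, 35.0, 60.0])
cal_times = np.array([0.0, 100.0, 250.0, 450.0])

def f5_prime(angle):
    """Piecewise-linear mapping from dexterous hand finger bending angle
    to total driving time of the micro electric push rod (assumed model)."""
    return float(np.interp(angle, cal_angles, cal_times))

t = f5_prime(25.0)
```

With these numbers, 25 degrees falls halfway between the 15-degree and 35-degree calibration points, so the interpolated driving time is 175 ms.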
It should be understood that, the sequence numbers of the steps in the foregoing embodiments do not imply an execution sequence, and the execution sequence of each process should be determined by its function and inherent logic, and should not constitute any limitation to the implementation process of the embodiments of the present invention.
Referring to Fig. 5, an embodiment of the present invention further provides a master-slave type dexterous hand following device, including:
a first gesture information determination module 31 for, for each dexterous hand finger on the dexterous hand: acquiring a first bending angle characteristic of the fingers of the dexterous hand at the current moment detected by an angle sensor in real time; determining the gesture information of the fingers of the dexterous hand at the current moment according to the first bending angle characteristics of the fingers of the dexterous hand at the current moment; wherein, the angle sensor is arranged at the proximal knuckle of the finger of the dexterous hand;
a second gesture information determination module 32 for, for each human finger on the human hand: acquiring, in real time, a second bending angle characteristic of the human finger at the current moment as detected by a rotation angle sensor, a third bending angle characteristic as detected by a first bending sensor, and a fourth bending angle characteristic as detected by a second bending sensor; and determining gesture information of the human finger at the current moment according to the second, third, and fourth bending angle characteristics; wherein the first bending sensor is arranged on the dorsal side of the human finger, the second bending sensor is arranged on the palmar side of the human finger, and the rotation angle sensor is arranged at the metacarpophalangeal (finger-palm) joint of the human finger;
the following information determining module 33 is used for respectively determining the action direction and the action time of each finger of the dexterous hand at the current moment according to the gesture information of each finger of the person at the current moment and the gesture information of each finger of the dexterous hand at the current moment;
and the following module 34 is used for respectively driving the corresponding fingers of the dexterous hand to follow the fingers of the human hand according to the action direction and the action time of each finger of the dexterous hand at the current moment.
In some embodiments, the gesture information Θi(t) of the ith human finger at time t is calculated from the second, third, and fourth bending angle characteristics of that finger at time t, wherein k1 is the first weight coefficient, k2 is the second weight coefficient, m1 is the third weight coefficient, and m2 is the fourth weight coefficient; f2 is the mapping relation between the second bending angle characteristic and the human finger bending angle, f3 is the mapping relation between the third bending angle characteristic and the human finger bending angle, and f4 is the mapping relation between the fourth bending angle characteristic and the human finger bending angle.
In some embodiments, the master-slave dexterous hand following device may further comprise:
a first parameter acquisition module 35 for, for at least one human hand finger on a human hand: acquiring a second bending angle characteristic, a third bending angle characteristic and a fourth bending angle characteristic which correspond to the fingers of the person at a plurality of preset positions respectively; acquiring images of the human fingers shot by a camera, which correspond to the human fingers at a plurality of preset positions respectively, and determining human finger bending angles corresponding to the human fingers at the plurality of preset positions respectively according to the images of the human fingers, which correspond to the human fingers at the plurality of preset positions respectively;
the first calibration module 36 is configured to determine a mapping relationship between the second bending angle characteristic and the bending angles of the fingers of the human hand according to the second bending angle characteristic corresponding to the fingers of the human hand at a plurality of preset positions and the bending angles of the fingers of the human hand at a plurality of preset positions;
the second calibration module 37 is configured to determine a mapping relationship between the third bending angle characteristic and the bending angles of the fingers of the human hand according to the third bending angle characteristic corresponding to the fingers of the human hand at a plurality of preset positions and the bending angles of the fingers of the human hand at a plurality of preset positions;
the third calibration module 38 is configured to determine a mapping relationship between the fourth bending angle characteristic and the bending angles of the fingers of the human hand according to the fourth bending angle characteristic corresponding to the fingers of the human hand at multiple preset positions and the bending angles of the fingers of the human hand corresponding to the fingers of the human hand at multiple preset positions.
In some embodiments, the gesture information Θi′(t) of the ith dexterous hand finger at time t can be calculated by applying the mapping f1 to the first bending angle characteristic of the ith dexterous hand finger at time t, wherein f1 is the mapping relation between the first bending angle characteristic and the dexterous hand finger bending angle.
In some embodiments, the master-slave dexterous hand following device may further comprise:
a second parameter acquisition module 39 for, for at least one dexterous hand finger on the dexterous hand: acquiring first bending angle characteristics corresponding to the fingers of the dexterous hand at a plurality of preset positions respectively; acquiring images of the fingers of the dexterous hand shot by a camera corresponding to the fingers at a plurality of preset positions respectively, and determining corresponding bending angles of the fingers of the dexterous hand when the fingers of the dexterous hand are at the plurality of preset positions respectively according to the corresponding images of the fingers of the dexterous hand at the plurality of preset positions respectively;
the fourth calibration module 310 is configured to determine a mapping relationship between the first bending angle characteristic and the bending angle of the finger of the dexterous hand according to the first bending angle characteristic corresponding to the finger of the dexterous hand at a plurality of preset positions and the bending angle of the finger of the dexterous hand corresponding to the finger of the dexterous hand at a plurality of preset positions.
In some embodiments, the following information determining module 33 may include:
a finger following information determination unit 331 for, for each dexterous hand finger on the dexterous hand: and determining the difference between the gesture information of the finger of the dexterous hand at the current moment and the gesture information of the finger of the person followed by the finger of the dexterous hand at the current moment, and determining the action direction and the action time of the finger of the dexterous hand at the current moment according to the difference.
In some embodiments, the action direction and the action time of the ith dexterous hand finger at time t may be calculated by the following formulas:
Δti(t) = Ti2(t) − Ti1(t)
Ti1(t) = f5′(Θi′(t))
Ti2(t) = f5′(Θi(t))
wherein Θi′(t) is the gesture information of the ith dexterous hand finger at time t, and Θi(t) is the gesture information of the ith human finger at time t; f5′ is the mapping relation between the dexterous hand finger bending angle and the driving time; Ti1(t) is the total driving time of the ith dexterous hand finger at time t, and Ti2(t) is the expected total driving time of the ith dexterous hand finger at time t; Δti(t) is the action time of the ith dexterous hand finger at time t. When Δti(t) > 0, the action direction of the dexterous hand finger is continued bending; when Δti(t) < 0, the action direction is reverse extension.
It is obvious to those skilled in the art that, for convenience and simplicity of description, the foregoing division of the functional units and modules is merely used as an example, and in practical applications, the above function distribution may be performed by different functional units and modules as needed, that is, the internal structure of the terminal device is divided into different functional units or modules to perform all or part of the above described functions. Each functional unit and module in the embodiments may be integrated in one processing unit, or each unit may exist alone physically, or two or more units are integrated in one unit, and the integrated unit may be implemented in a form of hardware, or in a form of software functional unit. In addition, specific names of the functional units and modules are only for convenience of distinguishing from each other, and are not used for limiting the protection scope of the present application. The specific working processes of the units and modules in the above-mentioned apparatus may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
Fig. 6 is a schematic block diagram of a terminal device according to an embodiment of the present invention. As shown in Fig. 6, the terminal device 4 of this embodiment includes: one or more processors 40, a memory 41, and a computer program 42 stored in the memory 41 and executable on the processors 40. The processor 40, when executing the computer program 42, implements the steps in the various master-slave dexterous hand following method embodiments described above, such as steps S101 to S104 shown in Fig. 1. Alternatively, the processor 40, when executing the computer program 42, implements the functionality of the modules/units of the master-slave dexterous hand following device embodiment described above, such as the modules 31 to 34 shown in Fig. 5.
Illustratively, the computer program 42 may be divided into one or more modules/units, which are stored in the memory 41 and executed by the processor 40 to accomplish the present application. One or more of the modules/units may be a series of computer program instruction segments capable of performing specific functions, which are used to describe the execution of the computer program 42 in the terminal device 4. For example, the computer program 42 may be divided into a first gesture information determination module 31, a second gesture information determination module 32, a following information determination module 33, and a following module 34.
A first gesture information determination module 31 for, for each dexterous hand finger on the dexterous hand: acquiring a first bending angle characteristic of the fingers of the dexterous hand at the current moment detected by an angle sensor in real time; determining the gesture information of the fingers of the dexterous hand at the current moment according to the first bending angle characteristics of the fingers of the dexterous hand at the current moment; wherein, the angle sensor is arranged at the proximal knuckle of the finger of the dexterous hand;
a second gesture information determination module 32 for, for each human finger on the human hand: acquiring, in real time, a second bending angle characteristic of the human finger at the current moment as detected by a rotation angle sensor, a third bending angle characteristic as detected by a first bending sensor, and a fourth bending angle characteristic as detected by a second bending sensor; and determining gesture information of the human finger at the current moment according to the second, third, and fourth bending angle characteristics; wherein the first bending sensor is arranged on the dorsal side of the human finger, the second bending sensor is arranged on the palmar side of the human finger, and the rotation angle sensor is arranged at the metacarpophalangeal (finger-palm) joint of the human finger;
the following information determining module 33 is used for respectively determining the action direction and the action time of each finger of the dexterous hand at the current moment according to the gesture information of each finger of the person at the current moment and the gesture information of each finger of the dexterous hand at the current moment;
and the following module 34 is used for respectively driving the corresponding fingers of the dexterous hand to follow the fingers of the human hand according to the action direction and the action time of each finger of the dexterous hand at the current moment. Other modules or units are not described in detail herein.
The terminal device 4 may include, but is not limited to, a processor 40 and a memory 41. Those skilled in the art will appreciate that Fig. 6 is only an example of a terminal device and does not constitute a limitation of the terminal device 4, which may include more or fewer components than shown, combine certain components, or use different components; for example, the terminal device 4 may also include input devices, output devices, network access devices, buses, and the like.
The Processor 40 may be a Central Processing Unit (CPU), other general purpose Processor, a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field Programmable Gate Array (FPGA) or other Programmable logic device, discrete Gate or transistor logic device, discrete hardware component, etc. A general purpose processor may be a microprocessor or the processor may be any conventional processor or the like.
The memory 41 may be an internal storage unit of the terminal device, such as a hard disk or memory of the terminal device. The memory 41 may also be an external storage device of the terminal device, such as a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) card, or a flash memory card (Flash Card) provided on the terminal device. Further, the memory 41 may include both an internal storage unit of the terminal device and an external storage device. The memory 41 is used for storing the computer program 42 and other programs and data required by the terminal device, and may also be used to temporarily store data that has been output or is to be output.
In the above embodiments, the description of each embodiment has its own emphasis; for parts not described or illustrated in a certain embodiment, reference may be made to the related descriptions of other embodiments.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
In the embodiments provided in the present application, it should be understood that the disclosed terminal device and method may be implemented in other ways. For example, the above-described terminal device embodiments are merely illustrative; the division of the modules or units is merely a logical function division, and in actual implementation there may be other divisions, for example, multiple units or components may be combined or integrated into another system, or some features may be omitted or not executed. In addition, the shown or discussed mutual coupling, direct coupling, or communication connection may be an indirect coupling or communication connection through some interfaces, devices, or units, and may be in electrical, mechanical, or other form.
Units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated module/unit, if implemented in the form of a software functional unit and sold or used as an independent product, may be stored in a computer-readable storage medium. Based on such understanding, all or part of the flow of the methods in the above embodiments may be implemented by a computer program, which is stored in a computer-readable storage medium and, when executed by a processor, implements the steps of the method embodiments described above. The computer program comprises computer program code, which may be in source code form, object code form, an executable file, some intermediate form, or the like. The computer-readable medium may include: any entity or device capable of carrying the computer program code, a recording medium, a USB flash drive, a removable hard disk, a magnetic disk, an optical disk, a computer memory, a read-only memory (ROM), a random access memory (RAM), an electrical carrier signal, a telecommunications signal, a software distribution medium, and the like. It should be noted that the content contained in the computer-readable medium may be appropriately increased or decreased as required by legislation and patent practice in the jurisdiction; for example, in some jurisdictions, according to legislation and patent practice, the computer-readable medium does not include electrical carrier signals and telecommunications signals.
The above embodiments are only used to illustrate the technical solutions of the present application, not to limit them. Although the present application has been described in detail with reference to the foregoing embodiments, those of ordinary skill in the art should understand that the technical solutions described in the foregoing embodiments may still be modified, or some technical features thereof may be equivalently replaced; such modifications and substitutions do not substantially depart from the spirit and scope of the embodiments of the present application and are intended to be included within the scope of the present application.