CN110187720B - Unmanned aerial vehicle guiding method, device, system, medium and electronic equipment - Google Patents

Unmanned aerial vehicle guiding method, device, system, medium and electronic equipment

Info

Publication number
CN110187720B
CN110187720B (application CN201910476895.9A)
Authority
CN
China
Prior art keywords
unmanned aerial
aerial vehicle
target
image
flight
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201910476895.9A
Other languages
Chinese (zh)
Other versions
CN110187720A (en)
Inventor
吉利
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Rhenium Indium Space Technology Co., Ltd.
Original Assignee
Shenzhen Platinum Stone Space Technology Co., Ltd.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Platinum Stone Space Technology Co., Ltd.
Priority to CN201910476895.9A
Publication of CN110187720A
Application granted
Publication of CN110187720B

Classifications

    • G - PHYSICS
    • G05 - CONTROLLING; REGULATING
    • G05D - SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 - Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/10 - Simultaneous control of position or course in three dimensions
    • G05D1/101 - Simultaneous control of position or course in three dimensions specially adapted for aircraft

Landscapes

  • Engineering & Computer Science (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
  • Image Analysis (AREA)

Abstract

The present disclosure relates to the field of unmanned aerial vehicles, and provides an unmanned aerial vehicle guidance method, an apparatus, a system, a computer-readable medium and an electronic device, wherein the method comprises the following steps: acquiring a target image containing an unmanned aerial vehicle; acquiring the position coordinates of the unmanned aerial vehicle in a target coordinate system according to the target image; and determining a flight instruction according to the position coordinates and the coordinates of the target position, and sending the flight instruction to the unmanned aerial vehicle so that the unmanned aerial vehicle flies to the target position. The present disclosure ensures that the unmanned aerial vehicle remains within the monitored field of view, prevents the unmanned aerial vehicle from being lost when the positioning signal fails, and further ensures that the unmanned aerial vehicle can land safely and be recovered.

Description

Unmanned aerial vehicle guiding method, device, system, medium and electronic equipment
Technical Field
The present disclosure relates to the field of unmanned aerial vehicle technologies, and in particular, to an unmanned aerial vehicle guidance method, an unmanned aerial vehicle guidance device, an unmanned aerial vehicle guidance system, a computer-readable storage medium, and an electronic device.
Background
Unmanned aerial vehicles are becoming increasingly popular in all walks of life. Performing tasks such as photographing, recognition, monitoring, positioning and early warning with unmanned aerial vehicles is more and more widely accepted by users, and unmanned aerial vehicles receive growing attention in professional fields such as industry, agriculture, forestry, electric power, security and surveying.
The flight safety of unmanned aerial vehicles is one of the focal concerns in this field. The environments in which unmanned aerial vehicles perform tasks are increasingly complex, and in many scenes the global positioning signal (GPS, BeiDou, etc.) is often of poor quality or shielded, so normal flight of the unmanned aerial vehicle is difficult to guarantee. If the unmanned aerial vehicle has flown far away or to a high place, it needs to be landed safely and recovered as soon as possible, but the loss of the positioning signal makes it very difficult for the unmanned aerial vehicle to land safely at a designated place.
In view of this, there is a need in the art to develop a new guiding method, device and system for an unmanned aerial vehicle.
It is to be noted that the information disclosed in the background section above is only used to enhance understanding of the background of the present disclosure.
Disclosure of Invention
The present disclosure aims to provide an unmanned aerial vehicle guidance method, an unmanned aerial vehicle guidance device, an unmanned aerial vehicle guidance system, a computer-readable storage medium, and an electronic device, so that, at least to some extent, an unmanned aerial vehicle can remain within visual range even in scenes with a weak global positioning signal and can land safely at a designated place at any time.
Additional features and advantages of the disclosure will be set forth in the detailed description which follows, or in part will be obvious from the description, or may be learned by practice of the disclosure.
According to a first aspect of the present disclosure, there is provided a method for guiding a drone, comprising: acquiring a target image containing an unmanned aerial vehicle; acquiring the position coordinates of the unmanned aerial vehicle in a target coordinate system according to the target image; and determining a flight instruction according to the position coordinates and the coordinates of the target position, and sending the flight instruction to the unmanned aerial vehicle so that the unmanned aerial vehicle flies to the target position.
In an exemplary embodiment of the present disclosure, the target image is generated by a binocular camera conforming to a horizontal epipolar constraint.
In an exemplary embodiment of the present disclosure, the target image includes a reference image and a contrast image; the acquiring the position coordinate of the unmanned aerial vehicle in a target coordinate system according to the target image comprises: processing the reference image and the comparison image through an image recognition model to obtain a first position of the unmanned aerial vehicle in the reference image and a second position of the unmanned aerial vehicle in the comparison image; acquiring a parallax value according to the first position and the second position; and determining the position coordinates according to the offset of the first position relative to the central point of the reference image, the parallax value and the parameters of the binocular camera.
In an exemplary embodiment of the present disclosure, the obtaining a disparity value according to the first position and the second position includes: mapping the first position onto the comparison image to obtain a third position; judging whether a target pixel exists within a preset distance of the third position, wherein the difference between the pixel value of the target pixel and the pixel value of the pixel corresponding to the third position is close to zero; and calculating the difference between the abscissa of the target pixel and the abscissa of the third position to obtain the parallax value.
In an exemplary embodiment of the present disclosure, determining a flight instruction according to the position coordinates and coordinates of a target position, and sending the flight instruction to the drone to make the drone fly to the target position includes: determining the flight distance and the flight direction according to the position coordinates and the coordinates of the target position; forming the flight instruction according to the flight distance, the flight direction and the preset speed, and sending the flight instruction to the unmanned aerial vehicle so that the unmanned aerial vehicle flies to the target position.
In an exemplary embodiment of the present disclosure, the determining a flight distance and a flight direction according to the position coordinates and the coordinates of the target position includes: when the target position is the central point of the reference image, the flight distance is the distance between the position coordinate and the central point, and the flight direction is the direction of the central point pointed by the unmanned aerial vehicle in the image plane.
In an exemplary embodiment of the present disclosure, the determining a flight distance and a flight direction according to the position coordinates and the coordinates of the target position includes: when the target position is a target landing point, the flying distance is a distance between the position coordinate and the target landing point, and the flying direction is a direction in which the unmanned aerial vehicle points to the target landing point.
In an exemplary embodiment of the present disclosure, the method further comprises: acquiring position information of the unmanned aerial vehicle in the target coordinate system in real time through a binocular camera conforming to horizontal epipolar constraint; and when the position information meets the preset condition, sending a landing instruction to the unmanned aerial vehicle to enable the unmanned aerial vehicle to land.
In an exemplary embodiment of the present disclosure, when the location information satisfies a preset condition, sending a landing instruction to the drone to land the drone, includes: calculating the distance between the position information and the target landing point; comparing the distance to a preset distance threshold; and if the distance is smaller than or equal to the preset distance threshold value, sending a landing instruction to the unmanned aerial vehicle so as to enable the unmanned aerial vehicle to land.
According to a second aspect of the present disclosure, there is provided a drone guiding device comprising: the image acquisition module is used for acquiring a target image containing the unmanned aerial vehicle; the coordinate acquisition module is used for acquiring the position coordinates of the unmanned aerial vehicle in a target coordinate system according to the target image; and the command generation module is used for determining a flight command according to the position coordinates and the coordinates of the target position, and sending the flight command to the unmanned aerial vehicle so that the unmanned aerial vehicle flies to the target position.
According to a third aspect of the present disclosure, there is provided a drone guiding system, comprising: a camera for filming a drone to generate a target image containing the drone; the guider is connected with the camera and used for receiving the target image sent by the camera and acquiring the position coordinate of the unmanned aerial vehicle in a target coordinate system according to the target image; and the signal feedback device is connected with the guider and the unmanned aerial vehicle and used for receiving the position coordinate, determining a flight instruction according to the position coordinate and the coordinate of the target position, and sending the flight instruction to the unmanned aerial vehicle so that the unmanned aerial vehicle flies to the target position.
According to a fourth aspect of the present disclosure, there is provided a computer storage medium having a computer program stored thereon, wherein the computer program, when executed by a processor, implements the drone guiding method described in the above embodiments.
According to a fifth aspect of the present disclosure, there is provided an electronic device comprising: one or more processors; storage means for storing one or more programs which, when executed by the one or more processors, cause the one or more processors to implement the drone directing method as described in the embodiments above.
According to the technical scheme, the unmanned aerial vehicle guidance method in the exemplary embodiment of the disclosure has at least the following advantages and positive effects:
The method comprises the steps of obtaining a target image containing the unmanned aerial vehicle; then analyzing the position of the unmanned aerial vehicle in the target image to obtain the position coordinates of the unmanned aerial vehicle in a target coordinate system; and finally determining a flight instruction according to the position coordinates of the unmanned aerial vehicle and the coordinates of the target position, and sending the flight instruction to the unmanned aerial vehicle so as to make the unmanned aerial vehicle fly to the target position. On one hand, the method and the device ensure that the unmanned aerial vehicle is within the monitored field of view, avoiding the situation in which the unmanned aerial vehicle cannot be monitored when the positioning signal fails; on the other hand, the unmanned aerial vehicle can be guided to fly to, or land safely at, a preset place, improving user experience.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present disclosure and, together with the description, serve to explain the principles of the disclosure. It is to be understood that the drawings in the following description are merely exemplary of the disclosure, and that other drawings may be derived from those drawings by one of ordinary skill in the art without the exercise of inventive faculty.
Fig. 1 schematically shows a flow diagram of a drone guiding method according to an embodiment of the present disclosure;
FIG. 2 schematically illustrates a directional schematic of a camera coordinate system according to an embodiment of the disclosure;
fig. 3 schematically shows a flowchart for acquiring the position coordinates of the drone in the target coordinate system according to an embodiment of the present disclosure;
fig. 4 schematically shows a flow chart for obtaining a disparity value according to an embodiment of the present disclosure;
fig. 5 schematically illustrates the coordinate position of the drone in an image taken by a binocular camera according to an embodiment of the present disclosure;
fig. 6 schematically illustrates a flow diagram for guiding a safe landing of a drone according to an embodiment of the present disclosure;
figure 7 schematically illustrates a block diagram of a drone guiding device according to an embodiment of the present disclosure;
figure 8 schematically illustrates a structural schematic of an unmanned aerial vehicle guidance system according to an embodiment of the present disclosure;
FIG. 9 schematically shows a block schematic of an electronic device according to an embodiment of the disclosure;
FIG. 10 schematically shows a program product schematic according to an embodiment of the present disclosure.
Detailed Description
Example embodiments will now be described more fully with reference to the accompanying drawings. Example embodiments may, however, be embodied in many different forms and should not be construed as limited to the examples set forth herein; rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the concept of example embodiments to those skilled in the art. The described features, structures, or characteristics may be combined in any suitable manner in one or more embodiments. In the following description, numerous specific details are provided to give a thorough understanding of embodiments of the disclosure. One skilled in the relevant art will recognize, however, that the subject matter of the present disclosure can be practiced without one or more of the specific details, or with other methods, components, devices, steps, and the like. In other instances, well-known technical solutions have not been shown or described in detail to avoid obscuring aspects of the present disclosure.
The terms "a," "an," "the," and "said" are used in this specification to denote the presence of one or more elements/components/parts/etc.; the terms "comprising" and "having" are intended to be inclusive and mean that there may be additional elements/components/etc. other than the listed elements/components/etc.; the terms "first" and "second," etc. are used merely as labels, and are not limiting on the number of their objects.
Furthermore, the drawings are merely schematic illustrations of the present disclosure and are not necessarily drawn to scale. The same reference numerals in the drawings denote the same or similar parts, and thus their repetitive description will be omitted. Some of the block diagrams shown in the figures are functional entities and do not necessarily correspond to physically or logically separate entities.
The present disclosure first provides an unmanned aerial vehicle guiding method, which may be run on a server, a server cluster, a cloud server, or the like. Fig. 1 shows the unmanned aerial vehicle guiding method which, as shown in Fig. 1, comprises at least the following steps:
step S110: acquiring a target image containing an unmanned aerial vehicle;
step S120: acquiring the position coordinates of the unmanned aerial vehicle in a target coordinate system according to the target image;
step S130: determining a flight instruction according to the position coordinates and the coordinates of the target position, and sending the flight instruction to the unmanned aerial vehicle so that the unmanned aerial vehicle flies to the target position.
According to the unmanned aerial vehicle guiding method, on one hand, the position coordinates of the unmanned aerial vehicle can be determined from the acquired target image, ensuring that the unmanned aerial vehicle is within the monitored field of view; on the other hand, a flight instruction can be determined according to the position coordinates and the coordinates of the target position to guide the unmanned aerial vehicle to fly to the target position, ensuring that the unmanned aerial vehicle can land safely at a designated place even when the positioning signal is lost.
In order to make the technical solution of the present disclosure clearer, each step of the unmanned aerial vehicle guiding method is described in detail below.
In step S110, a target image including the drone is acquired.
In an exemplary embodiment of the present disclosure, in order to acquire a target image including the unmanned aerial vehicle, tracking and monitoring may be performed by a ground monitoring device. The ground monitoring device may be a photographing device disposed on the ground for photographing the unmanned aerial vehicle to acquire the target image; the photographing device may specifically be a camera, a video camera, a smart terminal with a photographing function, and the like. In an embodiment of the present disclosure, the photographing device may be a binocular camera conforming to a horizontal epipolar constraint; that is, for the unmanned aerial vehicle photographed by the binocular camera, the ordinate in the two imaging planes of the camera is the same, and only the abscissa differs.
In the exemplary embodiment of the present disclosure, a wireless communication module is provided in the binocular camera and may be wirelessly connected to a server through a network. After the binocular camera conforming to the horizontal epipolar constraint photographs the unmanned aerial vehicle to generate a target image containing the unmanned aerial vehicle, the target image may be sent to the server; after acquiring the target image, the server performs position coordinate recognition on it and generates a flight instruction.
In step S120, position coordinates of the drone in a target coordinate system are obtained according to the target image.
In an exemplary embodiment of the present disclosure, after the target image is acquired, feature extraction may be performed on the unmanned aerial vehicle in the target image to determine its position coordinates in the target coordinate system. Since the binocular camera is used to photograph the unmanned aerial vehicle, two target images are generated, one corresponding to the left lens and the other to the right lens. For convenience of subsequent data processing, one image may be set as the reference image and the other as the contrast image; as an example, the target image generated by the left lens may be marked as the reference image, and the target image generated by the right lens may be marked as the contrast image. In addition, the target coordinate system is the camera coordinate system corresponding to the binocular camera. Fig. 2 shows a schematic diagram of the directions of the camera coordinate system; as shown in Fig. 2, the X axis of the camera coordinate system is along the horizontal direction, the Y axis is along the vertically upward direction, and the Z axis is along the direction in which the binocular camera points to the unmanned aerial vehicle.
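For illustration only (not part of the original filing), the camera quantities used in the embodiments below can be sketched in Python as follows; the names StereoParams, baseline_m and focal_px are assumptions introduced here, not terms from the patent.

```python
# Illustrative sketch only; `StereoParams`, `baseline_m` and `focal_px` are
# assumed names introduced for clarity, not terms defined by the patent.
from dataclasses import dataclass

@dataclass
class StereoParams:
    baseline_m: float   # distance b between the two lenses, in meters
    focal_px: float     # focal length f of the binocular camera, in pixels

def satisfies_horizontal_epipolar(y_left: float, y_right: float,
                                  tol: float = 1.0) -> bool:
    # Under the horizontal epipolar constraint the same point appears at the
    # same ordinate in both imaging planes; only the abscissa differs.
    return abs(y_left - y_right) <= tol
```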
In an exemplary embodiment of the present disclosure, fig. 3 shows a flowchart for acquiring the position coordinates of the drone in the target coordinate system, and as shown in fig. 3, the method for acquiring the position coordinates of the drone in the target coordinate system at least includes steps S301 to S303, specifically:
in step S301, the reference image and the comparison image are processed by an image recognition model to obtain a first position of the drone in the reference image and a second position of the drone in the comparison image.
In an exemplary embodiment of the present disclosure, in order to determine the specific position of the unmanned aerial vehicle in the reference image and the contrast image, feature extraction may be performed on the reference image and the contrast image respectively through a machine learning model, which may be a neural network model such as a convolutional neural network, R-CNN, Faster R-CNN, and so on. After multi-layer convolution, pooling and full connection are performed on the reference image or the contrast image, the unmanned aerial vehicle can be located and marked with a labeling frame, and the first position of the unmanned aerial vehicle in the reference image and the second position in the contrast image can be determined according to the labeling frame.
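A hedged sketch of step S301 follows; detect_drone is a hypothetical placeholder for any bounding-box detector (for example a Faster R-CNN), not an API defined by the patent or by a specific library, and taking the frame center as the position is one plausible convention.

```python
# Hypothetical sketch of step S301: `detect_drone` stands in for any
# bounding-box detector; the first/second positions are taken to be the
# centers of the labeling frames it returns.
from typing import Callable, Tuple
import numpy as np

Box = Tuple[float, float, float, float]  # (x_min, y_min, x_max, y_max)

def box_center(box: Box) -> Tuple[float, float]:
    x_min, y_min, x_max, y_max = box
    return ((x_min + x_max) / 2.0, (y_min + y_max) / 2.0)

def locate_in_pair(detect_drone: Callable[[np.ndarray], Box],
                   reference_img: np.ndarray,
                   contrast_img: np.ndarray):
    # First position: center of the labeling frame in the reference image;
    # second position: center of the labeling frame in the contrast image.
    first_position = box_center(detect_drone(reference_img))
    second_position = box_center(detect_drone(contrast_img))
    return first_position, second_position
```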
In step S302, a disparity value is obtained according to the first position and the second position.
In the exemplary embodiment of the present disclosure, the parallax value of the unmanned aerial vehicle between the reference image and the contrast image can be obtained by performing binocular vision matching on the labeling frames of the unmanned aerial vehicle in the binocular images. Since the Y coordinates corresponding to the two lenses of the binocular camera in the target coordinate system are the same and only the X coordinates differ, the parallax value is the horizontal coordinate difference between the first position and the second position.
In an exemplary embodiment of the present disclosure, a disparity value may be obtained according to a pixel value, fig. 4 shows a schematic flowchart of obtaining the disparity value, and as shown in fig. 4, the method for calculating the disparity value at least includes steps S401 to S403, specifically:
in step S401, the first position is mapped onto the contrast image to obtain a third position.
In an exemplary embodiment of the present disclosure, the origin of the target coordinate system may be located at the center of the binocular camera. In the mapping, as shown in Fig. 5, the first position (x1, y) may be mapped onto the contrast image with the Y axis as the axis of symmetry; after mapping, the third position on the contrast image is (x1', y').
In step S402, it is determined whether a target pixel exists within a range of a preset distance from a third position, where a difference between a pixel value of the target pixel and a pixel value of a pixel corresponding to the third position is close to zero.
In an exemplary embodiment of the present disclosure, a search window centered on (x1', y') and extending to (x1' + δ, y' + δ) is defined within the image range, and it is judged whether there exists within this window a pixel whose pixel value is close to that of the third position (x1', y'), where δ is not zero. In the embodiment of the present disclosure, the images generated by the binocular camera are RGB images; when comparing pixel values, the differences between the R, G and B values of the two pixels may be calculated respectively and then averaged, and whether the pixel values of the two pixels are close to each other is determined according to the obtained average value.
In step S403, the difference between the abscissa of the target pixel and the abscissa of the third position is calculated to obtain a parallax value.
In the exemplary embodiment of the disclosure, after determining the target pixel whose pixel value is closest to that of the pixel corresponding to the third position, the target pixel may be considered the projection point of the unmanned aerial vehicle on the contrast image. Because the two images generated when the binocular camera photographs the same object differ little, the target pixel and the third position are substantially located on the same abscissa axis, and the parallax value is obtained as the absolute value of the difference between the two abscissas. For example, if the target pixel coordinate in Fig. 5 is (x2, y), the parallax value is obtained as |x2 − x1|.
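A minimal sketch of steps S401 to S403, under stated assumptions: integer pixel coordinates, RGB images, and assumed parameters delta (window radius) and tol (what counts as "close to zero"). Comparing candidate pixels against the reference-image pixel at the first position is one plausible reading of "the pixel corresponding to the third position".

```python
# Sketch of steps S401-S403; `delta` and `tol` are assumed parameters,
# not values fixed by the patent.
import numpy as np

def mean_rgb_difference(pixel_a: np.ndarray, pixel_b: np.ndarray) -> float:
    # Average of the per-channel absolute differences between two RGB pixels.
    return float(np.mean(np.abs(pixel_a.astype(float) - pixel_b.astype(float))))

def find_disparity(reference_img: np.ndarray, contrast_img: np.ndarray,
                   first_pos: tuple, third_pos: tuple,
                   delta: int = 5, tol: float = 3.0):
    """Search within `delta` pixels of the third position on the contrast
    image for the pixel closest in value to the drone pixel, then return
    the parallax value |x_target - x_third| (None if nothing is close)."""
    x1, y1 = first_pos            # first position, on the reference image
    x3, y3 = third_pos            # mapped (third) position, contrast image
    ref_pixel = reference_img[y1, x1]
    h, w = contrast_img.shape[:2]
    best_diff, best_x = None, None
    for dy in range(-delta, delta + 1):
        for dx in range(-delta, delta + 1):
            x, y = x3 + dx, y3 + dy
            if 0 <= x < w and 0 <= y < h:
                diff = mean_rgb_difference(contrast_img[y, x], ref_pixel)
                if best_diff is None or diff < best_diff:
                    best_diff, best_x = diff, x
    if best_diff is not None and best_diff <= tol:   # "close to zero"
        return abs(best_x - x3)
    return None
```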
In step S303, the position coordinates are determined according to the offset amount of the first position with respect to the center point of the reference image, the parallax value, and the parameters of the binocular camera.
In an exemplary embodiment of the present disclosure, the position coordinates of the drone in the target coordinate system may be determined according to the offset of the first position with respect to the center point of the reference image, the parallax value, and the parameters of the binocular camera, and specifically may be calculated according to formula (1):
X = b·x1/d    Y = b·y/d    Z = b·f/d    (1)

where (X, Y, Z) are the position coordinates, b is the distance between the lenses of the binocular camera, d is the parallax value, and f is the focal length of the binocular camera.
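A direct transcription of formula (1) as a sketch, assuming x1 and y are the offsets of the first position from the reference-image center expressed in the same pixel units as f, while b carries the physical length unit of the result:

```python
# Transcription of formula (1): X = b*x1/d, Y = b*y/d, Z = b*f/d.
def triangulate(x1: float, y: float, d: float, b: float, f: float):
    if d == 0:
        raise ValueError("zero parallax value: point at infinity")
    return b * x1 / d, b * y / d, b * f / d   # position coordinates (X, Y, Z)
```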
In step S130, a flight instruction is determined according to the position coordinates and the coordinates of the target position, and the flight instruction is sent to the unmanned aerial vehicle, so that the unmanned aerial vehicle flies to the target position.
In this exemplary embodiment of the disclosure, after the position coordinates of the unmanned aerial vehicle in the target coordinate system have been determined, the flight distance, direction and speed of the unmanned aerial vehicle can be determined according to the position coordinates of the unmanned aerial vehicle and the coordinates of the target position; the flight instruction is formed according to the flight distance, the flight direction and the preset speed, and the flight instruction is sent to the unmanned aerial vehicle so that the unmanned aerial vehicle flies to the target position.
In an exemplary embodiment of the present disclosure, the target position may be the central point of the reference image captured by the binocular camera, or a certain position where the unmanned aerial vehicle can land safely. When the target position is the central point of the reference image, the unmanned aerial vehicle only needs to fly to the central point of the reference image in the XOY plane; that is, the flight of the unmanned aerial vehicle in the target coordinate system may or may not involve displacement along the Z axis, as long as the unmanned aerial vehicle is located at the central point of the reference image in the image captured by the binocular camera. In the embodiment of the present disclosure, when the position of the unmanned aerial vehicle in the target coordinate system is (X, Y, Z), the image center coordinate point may be set to (0, 0, 0); the flight direction in the generated flight instruction may then be (-X, -Y, 0), the flight distance may be √(X² + Y²), and the flight speed may be a preset safe speed Vs. By flying according to the received flight instruction, the unmanned aerial vehicle can reach the central point of the reference image, which ensures that the unmanned aerial vehicle is within the monitored field of view.
Further, when the unmanned aerial vehicle reaches the target position, the server can send a hovering instruction to the unmanned aerial vehicle, ensuring that the unmanned aerial vehicle always remains at the central point of the reference image relative to the binocular camera. When the posture or position of the binocular camera changes, the flight distance, flight direction and flight speed can be recalculated and a new flight instruction sent to the unmanned aerial vehicle, so that the unmanned aerial vehicle continues flying until it reaches the central point of the reference image.
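A minimal sketch of the flight instruction toward the reference-image center described above, assuming the position (X, Y, Z) is already expressed in the target coordinate system; the dictionary layout of the returned command is an assumption made here for illustration, not a format defined by the patent.

```python
# Sketch of a flight instruction toward the image center in the XOY plane;
# the command dictionary format is an illustrative assumption.
import math

def command_to_image_center(X: float, Y: float, Z: float, safe_speed: float):
    distance = math.hypot(X, Y)          # flight distance sqrt(X^2 + Y^2)
    if distance == 0.0:
        return {"action": "hover"}       # already at the central point
    direction = (-X / distance, -Y / distance, 0.0)  # no displacement on Z
    return {"action": "fly", "direction": direction,
            "distance": distance, "speed": safe_speed}
```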
In an exemplary embodiment of the present disclosure, when the target position is a target landing point at which the unmanned aerial vehicle can land safely, the flight instruction may be determined according to the position coordinates of the unmanned aerial vehicle and the position of the target landing point, and the unmanned aerial vehicle guided to fly in the target coordinate system. The target landing point may be a point relatively close to the binocular camera, which facilitates recovery of the unmanned aerial vehicle by the user; for example, the unmanned aerial vehicle may be instructed to fly in the (-X, -Y, -Z) direction at a preset safe speed Vs for a distance of √(X² + Y² + Z²). During the flight of the unmanned aerial vehicle, in order to ensure that the unmanned aerial vehicle always remains within the monitored range, the binocular camera can collect images containing the unmanned aerial vehicle in real time and send them to the server, and the server can determine, according to the distance between the real-time position of the unmanned aerial vehicle and the target landing point, whether to instruct the unmanned aerial vehicle to hover and land. Fig. 6 is a schematic flow chart of guiding the unmanned aerial vehicle to land safely. As shown in Fig. 6, in step S601, the distance between the real-time position information of the unmanned aerial vehicle and the target landing point is calculated; if the real-time position information is marked as (Xs, Ys, Zs), where Xs, Ys and Zs are the distances from the unmanned aerial vehicle to the binocular camera in each direction and are not all zero at the same time, and the target landing point is marked as (0, 0, 0), then the distance between the two is √(Xs² + Ys² + Zs²). In step S602, the distance is compared with a preset distance threshold. In step S603, if the distance is less than or equal to the preset distance threshold, a landing instruction is sent to the unmanned aerial vehicle to make the unmanned aerial vehicle land; assuming the preset distance threshold is S, if √(Xs² + Ys² + Zs²) ≤ S, the unmanned aerial vehicle has entered the range in which it can land safely, and a landing instruction can be sent to the unmanned aerial vehicle to control it to land.
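Steps S601 to S603 reduce to a distance test; a minimal sketch, assuming the real-time position (Xs, Ys, Zs) and the threshold S are given:

```python
# Sketch of steps S601-S603: land once the distance from the real-time
# position (Xs, Ys, Zs) to the landing target point (0, 0, 0) is within
# the preset threshold S.
import math

def landing_decision(xs: float, ys: float, zs: float, s: float) -> str:
    distance = math.sqrt(xs**2 + ys**2 + zs**2)
    return "land" if distance <= s else "continue"
```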
The unmanned aerial vehicle guiding method in the embodiment of the present disclosure can guide the unmanned aerial vehicle to fly to a target position, ensures that the unmanned aerial vehicle is located within the monitored field of view, prevents the unmanned aerial vehicle from being lost when the positioning signal fails, and further ensures that the unmanned aerial vehicle lands safely and is recovered.
The following introduces an apparatus embodiment of the present disclosure, which can be used to execute the above-mentioned guiding method for an unmanned aerial vehicle of the present disclosure. For details not disclosed in the embodiments of the apparatus of the present disclosure, please refer to the embodiments of the unmanned aerial vehicle guidance method of the present disclosure.
Fig. 7 schematically illustrates a block diagram of a drone guiding device according to one embodiment of the present disclosure.
Referring to Fig. 7, a drone guiding device 700 according to one embodiment of the present disclosure includes an image acquisition module 701, a coordinate acquisition module 702, and an instruction generation module 703. Specifically:
an image acquisition module 701, configured to acquire a target image including an unmanned aerial vehicle; a coordinate obtaining module 702, configured to obtain, according to the target image, a position coordinate of the unmanned aerial vehicle in a target coordinate system; the instruction generating module 703 is configured to determine a flight instruction according to the position coordinates and the coordinates of the target position, and send the flight instruction to the unmanned aerial vehicle, so that the unmanned aerial vehicle flies to the target position.
It should be noted that although several modules or units of the apparatus are mentioned in the above detailed description, such a division is not mandatory. Indeed, according to embodiments of the present disclosure, the features and functions of two or more modules or units described above may be embodied in one module or unit; conversely, the features and functions of one module or unit described above may be further divided into and embodied by a plurality of modules or units.
In an exemplary embodiment of the present disclosure, there is also provided a drone guiding system. Fig. 8 shows a schematic structural diagram of the drone guiding system; as shown in Fig. 8, the drone guiding system 800 includes a camera 801, a guider 802 and a signal feedback device 803. Specifically:
A camera 801, configured to photograph the drone to generate a target image containing the drone; a guider 802, connected with the camera, configured to receive the target image sent by the camera and acquire the position coordinates of the drone in a target coordinate system according to the target image; and a signal feedback device 803, connected with the guider and the drone, configured to receive the position coordinates, determine a flight instruction according to the position coordinates and the coordinates of the target position, and send the flight instruction to the drone so that the drone flies to the target position.
In an exemplary embodiment of the present disclosure, the guider 802 may be a dedicated chip or a computer equipped with a high-performance GPU.
In an exemplary embodiment of the present disclosure, an electronic device capable of implementing the above method is also provided.
As will be appreciated by one skilled in the art, aspects of the present invention may be embodied as a system, method or program product. Accordingly, various aspects of the present invention may be embodied in the form of: an entirely hardware embodiment, an entirely software embodiment (including firmware, microcode, etc.), or an embodiment combining hardware and software aspects, which may all generally be referred to herein as a "circuit," "module," or "system."
An electronic device 900 according to this embodiment of the invention is described below with reference to fig. 9. The electronic device 900 shown in fig. 9 is only an example and should not bring any limitations to the function and scope of use of the embodiments of the present invention.
As shown in fig. 9, electronic device 900 is in the form of a general purpose computing device. Components of electronic device 900 may include, but are not limited to: the at least one processing unit 910, the at least one storage unit 920, a bus 930 connecting different system components (including the storage unit 920 and the processing unit 910), and a display unit 940.
The storage unit stores program code executable by the processing unit 910 to cause the processing unit 910 to perform steps according to various exemplary embodiments of the present invention described in the "exemplary methods" section above of this specification. For example, the processing unit 910 may perform step S110 as shown in Fig. 1: acquiring a target image containing the unmanned aerial vehicle; step S120: acquiring the position coordinates of the unmanned aerial vehicle in a target coordinate system according to the target image; and step S130: determining a flight instruction according to the position coordinates and the coordinates of the target position, and sending the flight instruction to the unmanned aerial vehicle so that the unmanned aerial vehicle flies to the target position.
The storage unit 920 may include a readable medium in the form of a volatile storage unit, such as a random access memory unit (RAM) 9201 and/or a cache memory unit 9202, and may further include a read-only memory unit (ROM) 9203.
Storage unit 920 may also include a program/utility 9204 having a set (at least one) of program modules 9205, such program modules 9205 including but not limited to: an operating system, one or more application programs, other program modules, and program data, each of which, or some combination thereof, may comprise an implementation of a network environment.
Bus 930 can be any of several types of bus structures including a memory unit bus or memory unit controller, a peripheral bus, an accelerated graphics port, a processing unit, or a local bus using any of a variety of bus architectures.
The electronic device 900 may also communicate with one or more external devices 1500 (e.g., keyboard, pointing device, bluetooth device, etc.), with one or more devices that enable a user to interact with the electronic device 900, and/or with any devices (e.g., router, modem, etc.) that enable the electronic device 900 to communicate with one or more other computing devices. Such communication may occur via input/output (I/O) interface 950. Also, the electronic device 900 may communicate with one or more networks (e.g., a Local Area Network (LAN), a Wide Area Network (WAN), and/or a public network such as the Internet) via the network adapter 960. As shown, the network adapter 960 communicates with the other modules of the electronic device 900 via the bus 930. It should be appreciated that although not shown, other hardware and/or software modules may be used in conjunction with the electronic device 900, including but not limited to: microcode, device drivers, redundant processing units, external disk drive arrays, RAID systems, tape drives, and data backup storage systems, to name a few.
Through the above description of the embodiments, those skilled in the art will readily understand that the exemplary embodiments described herein may be implemented by software, or by software in combination with necessary hardware. Therefore, the technical solution according to the embodiments of the present disclosure may be embodied in the form of a software product, which may be stored in a non-volatile storage medium (which may be a CD-ROM, a usb disk, a removable hard disk, etc.) or on a network, and includes several instructions to enable a computing device (which may be a personal computer, a server, a terminal device, or a network device, etc.) to execute the method according to the embodiments of the present disclosure.
In an exemplary embodiment of the present disclosure, there is also provided a computer-readable storage medium having stored thereon a program product capable of implementing the above-described method of the present specification. In some possible embodiments, aspects of the invention may also be implemented in the form of a program product comprising program code means for causing a terminal device to carry out the steps according to various exemplary embodiments of the invention described in the above section "exemplary methods" of the present description, when said program product is run on the terminal device.
Referring to fig. 10, a program product 1000 for implementing the above method according to an embodiment of the present invention is described, which may employ a portable compact disc read only memory (CD-ROM) and include program code, and may be run on a terminal device, such as a personal computer. However, the program product of the present invention is not limited in this regard and, in the present document, a readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
The program product may employ any combination of one or more readable media. The readable medium may be a readable signal medium or a readable storage medium. A readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples (a non-exhaustive list) of the readable storage medium include: an electrical connection having one or more wires, a portable disk, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
A computer readable signal medium may include a propagated data signal with readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A readable signal medium may also be any readable medium that is not a readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
Program code embodied on a readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.
Program code for carrying out operations for aspects of the present invention may be written in any combination of one or more programming languages, including an object oriented programming language such as Java, C++ or the like and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computing device, partly on the user's device, as a stand-alone software package, partly on the user's computing device and partly on a remote computing device, or entirely on the remote computing device or server. In the case of a remote computing device, the remote computing device may be connected to the user computing device through any kind of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or may be connected to an external computing device (e.g., through the internet using an internet service provider).
Furthermore, the above-described drawings are only schematic illustrations of processes involved in methods according to exemplary embodiments of the invention, and are not intended to be limiting. It will be readily understood that the processes shown in the above figures are not intended to indicate or limit the chronological order of the processes. In addition, it is also readily understood that these processes may be performed, for example, synchronously or asynchronously in multiple modules.
Other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the disclosure disclosed herein. This application is intended to cover any variations, uses, or adaptations of the disclosure following, in general, the principles of the disclosure and including such departures from the present disclosure as come within known or customary practice within the art to which the disclosure pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the disclosure being indicated by the following claims.
It will be understood that the present disclosure is not limited to the precise arrangements described above and shown in the drawings and that various modifications and changes may be made without departing from the scope thereof. The scope of the present disclosure is to be limited only by the terms of the appended claims.

Claims (12)

1. An unmanned aerial vehicle guidance method is characterized by comprising the following steps:
acquiring a target image containing an unmanned aerial vehicle generated by photographing with a binocular camera conforming to a horizontal epipolar constraint, the target image including a reference image photographed by one lens of the binocular camera and a contrast image photographed by the other lens;
determining a third position corresponding to the first position in the comparison image based on the first position of the unmanned aerial vehicle in the reference image and a target coordinate system corresponding to the binocular camera, determining a target pixel close to the pixel value of the third position in the comparison image, determining a parallax value according to the abscissa of the first position and the abscissa of the target pixel, and acquiring the position coordinate of the unmanned aerial vehicle in the target coordinate system according to the parallax value, the offset of the first position relative to the central point of the reference image and the parameters of the binocular camera;
determining a flight instruction according to the position coordinates and the coordinates of the target position, and sending the flight instruction to the unmanned aerial vehicle so that the unmanned aerial vehicle flies to the target position.
2. The drone guidance method of claim 1, wherein an origin of the target coordinate system is located at a center of the binocular camera;
the determining a third position in the comparison image corresponding to the first position based on the first position of the UAV in the reference image and a target coordinate system corresponding to the binocular camera comprises:
processing the reference image through an image recognition model to obtain the first position;
and mapping the first position by taking the Y axis of the target coordinate system as a symmetry axis to obtain the third position, wherein the X axis of the target coordinate system is along the horizontal direction, the Y axis is along the vertical upward direction, and the Z axis is along the direction of the binocular camera pointing to the unmanned aerial vehicle.
3. The drone guiding method of claim 2, wherein the determining a target pixel in the contrast image that is close to the pixel value of the third location to determine a disparity value from the abscissa of the first location and the abscissa of the target pixel comprises:
judging whether the target pixel exists in a range with a preset distance from the third position, wherein the difference between the pixel value of the target pixel and the pixel value of the pixel corresponding to the third position is close to zero;
and calculating the difference between the abscissa of the target pixel and the abscissa of the first position to obtain the parallax value.
4. The guidance method for the unmanned aerial vehicle according to claim 1, wherein determining a flight instruction according to the position coordinates and coordinates of a target position, and sending the flight instruction to the unmanned aerial vehicle to enable the unmanned aerial vehicle to fly to the target position comprises:
determining the flight distance and the flight direction according to the position coordinates and the coordinates of the target position;
forming the flight instruction according to the flight distance, the flight direction and the preset speed, and sending the flight instruction to the unmanned aerial vehicle so that the unmanned aerial vehicle flies to the target position.
5. The drone guidance method of claim 4, wherein the determining a flight distance and a flight direction from the location coordinates and the coordinates of the target location comprises:
when the target position is the central point of the reference image, the flight distance is the distance between the position coordinate and the central point, and the flight direction is the direction of the central point pointed by the unmanned aerial vehicle in the image plane.
6. The drone guidance method of claim 4, wherein the determining a flight distance and a flight direction from the location coordinates and the coordinates of the target location comprises:
when the target position is a target landing point, the flying distance is a distance between the position coordinate and the target landing point, and the flying direction is a direction in which the unmanned aerial vehicle points to the target landing point.
7. The drone directing method according to claim 6, further comprising:
acquiring position information of the unmanned aerial vehicle in the target coordinate system in real time through a binocular camera conforming to horizontal epipolar constraint;
and when the position information meets a preset condition, sending a landing instruction to the unmanned aerial vehicle so that the unmanned aerial vehicle lands.
8. The guidance method for the unmanned aerial vehicle according to claim 7, wherein when the position information satisfies a preset condition, a landing instruction is sent to the unmanned aerial vehicle to land the unmanned aerial vehicle, and the guidance method comprises:
calculating the distance between the position information and the target landing point;
comparing the distance to a preset distance threshold;
and if the distance is smaller than or equal to the preset distance threshold, sending a landing instruction to the unmanned aerial vehicle so as to enable the unmanned aerial vehicle to land.
9. An unmanned aerial vehicle guiding device, characterized by comprising:
an image acquisition module, configured to acquire a target image containing the unmanned aerial vehicle, the target image being generated by photographing with a binocular camera conforming to the horizontal epipolar constraint and including a reference image photographed by one lens of the binocular camera and a contrast image photographed by the other lens;
a coordinate obtaining module, configured to determine a third position corresponding to the first position in the comparison image based on the first position of the unmanned aerial vehicle in the reference image and a target coordinate system corresponding to the binocular camera, determine a target pixel in the comparison image, where the target pixel is close to a pixel value of the third position, to determine a disparity value according to an abscissa of the first position and an abscissa of the target pixel, and obtain a position coordinate of the unmanned aerial vehicle in the target coordinate system according to the disparity value, an offset of the first position with respect to a center point of the reference image, and a parameter of the binocular camera;
and the command generation module is used for determining a flight command according to the position coordinates and the coordinates of the target position, and sending the flight command to the unmanned aerial vehicle so that the unmanned aerial vehicle flies to the target position.
10. An unmanned aerial vehicle guidance system, comprising:
a binocular camera conforming to a horizontal epipolar constraint for capturing a drone to generate a target image containing the drone, the target image including a reference image captured by one lens of the binocular camera and a contrast image captured by the other lens;
the guider is connected with the camera and used for receiving the reference image and the comparison image sent by the binocular camera, determining a third position corresponding to the first position in the comparison image based on the first position of the unmanned aerial vehicle in the reference image and a target coordinate system corresponding to the binocular camera, determining a target pixel close to the pixel value of the third position in the comparison image, determining a parallax value according to the abscissa of the first position and the abscissa of the target pixel, and acquiring the position coordinate of the unmanned aerial vehicle in the target coordinate system according to the parallax value, the offset of the first position relative to the central point of the reference image and the parameters of the binocular camera;
and the signal feedback device is connected with the guider and the unmanned aerial vehicle and used for receiving the position coordinate, determining a flight instruction according to the position coordinate and the coordinate of the target position, and sending the flight instruction to the unmanned aerial vehicle so that the unmanned aerial vehicle flies to the target position.
11. A computer storage medium having a computer program stored thereon, wherein the computer program, when executed by a processor, implements the drone guidance method of any one of claims 1-8.
12. An electronic device, comprising:
a processor; and
a memory for storing executable instructions of the processor;
wherein the processor is configured to perform the drone directing method of any one of claims 1-8 via execution of the executable instructions.
CN201910476895.9A 2019-06-03 2019-06-03 Unmanned aerial vehicle guiding method, device, system, medium and electronic equipment Active CN110187720B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910476895.9A CN110187720B (en) 2019-06-03 2019-06-03 Unmanned aerial vehicle guiding method, device, system, medium and electronic equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910476895.9A CN110187720B (en) 2019-06-03 2019-06-03 Unmanned aerial vehicle guiding method, device, system, medium and electronic equipment

Publications (2)

Publication Number Publication Date
CN110187720A CN110187720A (en) 2019-08-30
CN110187720B true CN110187720B (en) 2022-09-27

Family

ID=67719798

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910476895.9A Active CN110187720B (en) 2019-06-03 2019-06-03 Unmanned aerial vehicle guiding method, device, system, medium and electronic equipment

Country Status (1)

Country Link
CN (1) CN110187720B (en)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110749324A (en) * 2019-10-28 2020-02-04 深圳市赛为智能股份有限公司 Unmanned aerial vehicle rescue positioning method and device, computer equipment and storage medium
CN111232234A (en) * 2020-02-10 2020-06-05 江苏大学 Method for a real-time spatial positioning system for aircraft
CN114020029B (en) * 2021-11-09 2022-06-10 深圳大漠大智控技术有限公司 Automatic generation method and device for cluster aerial routes, and related components
CN114756037B (en) * 2022-03-18 2023-04-07 广东汇星光电科技有限公司 Unmanned aerial vehicle system based on neural network image recognition, and control method
CN114697575A (en) * 2022-03-25 2022-07-01 珠海市猎科电子有限公司 Pyroelectric infrared hunting camera system allowing an unmanned aerial vehicle to read images, and control method

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106960454A (en) * 2017-03-02 2017-07-18 武汉星巡智能科技有限公司 Depth-of-field obstacle avoidance method, device and unmanned aerial vehicle
CN107329490A (en) * 2017-07-21 2017-11-07 歌尔科技有限公司 Unmanned aerial vehicle obstacle avoidance method and unmanned aerial vehicle
CN107402578A (en) * 2017-06-21 2017-11-28 中国科学院深圳先进技术研究院 Unmanned aerial vehicle panoramic obstacle perception method, device, equipment and storage medium
CN107728633A (en) * 2017-10-23 2018-02-23 广州极飞科技有限公司 Method and device for obtaining object position information, mobile device and control method thereof
CN108140245A (en) * 2017-12-25 2018-06-08 深圳市道通智能航空技术有限公司 Distance measurement method and device, and unmanned aerial vehicle
CN109341537A (en) * 2018-09-27 2019-02-15 北京伟景智能科技有限公司 Dimension measurement method and device based on binocular vision

Family Cites Families (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP5806156B2 (en) * 2012-03-23 2015-11-10 株式会社ジャパンディスプレイ Display device, electronic device
CN103914076B (en) * 2014-03-28 2017-02-15 浙江吉利控股集团有限公司 Cargo transferring system and method based on unmanned aerial vehicle
WO2017035841A1 (en) * 2015-09-06 2017-03-09 深圳市大疆创新科技有限公司 Unmanned aerial vehicle and airborne supply method therefor, and floating platform and control method therefor
GB201608744D0 (en) * 2016-05-18 2016-06-29 Unmanned Systems Ltd Intelligent autonomous unmanned system
CN105955292B (en) * 2016-05-20 2018-01-09 腾讯科技(深圳)有限公司 Method, mobile terminal, aircraft and system for controlling aircraft flight
JP7016796B2 (en) * 2016-07-04 2022-02-07 ソニーセミコンダクタソリューションズ株式会社 Information processing equipment and information processing method
CN107289953A (en) * 2017-08-07 2017-10-24 深圳市华琥技术有限公司 Navigation control method for an unmanned aerial vehicle group
CN107272743A (en) * 2017-08-07 2017-10-20 深圳市华琥技术有限公司 Express delivery method for an unmanned aerial vehicle group
CN108038415B (en) * 2017-11-06 2021-12-28 湖南华诺星空电子技术有限公司 Unmanned aerial vehicle automatic detection and tracking method based on machine vision
CN108388256A (en) * 2017-12-29 2018-08-10 易瓦特科技股份公司 Method and device for controlling an unmanned aerial vehicle in a target area via a ground station
CN109191504A (en) * 2018-08-01 2019-01-11 南京航空航天大学 Unmanned aerial vehicle target tracking method
CN109144096A (en) * 2018-08-15 2019-01-04 东汉太阳能无人机技术有限公司 Control method for unmanned aerial vehicle landing, and unmanned aerial vehicle
CN109270519A (en) * 2018-09-14 2019-01-25 吉林大学 Vehicle-mounted rotary-wing unmanned aerial vehicle recovery guidance system and method based on millimeter-wave radar

Also Published As

Publication number Publication date
CN110187720A (en) 2019-08-30

Similar Documents

Publication Publication Date Title
CN110187720B (en) Unmanned aerial vehicle guiding method, device, system, medium and electronic equipment
US11394950B2 (en) Augmented reality-based remote guidance method and apparatus, terminal, and storage medium
US11797009B2 (en) Unmanned aerial image capture platform
US10776939B2 (en) Obstacle avoidance system based on embedded stereo vision for unmanned aerial vehicles
CN107329490B (en) Unmanned aerial vehicle obstacle avoidance method and unmanned aerial vehicle
CN111344644B (en) Techniques for motion-based automatic image capture
US20200346750A1 (en) Method for generating flight path, control device, and unmanned aerial vehicle
WO2019113966A1 (en) Obstacle avoidance method and device, and unmanned aerial vehicle
US12039874B2 (en) Control method and device for unmanned aerial vehicle, and computer readable storage medium
US20200125100A1 (en) Movable object control method, device and system
CN110633629A (en) Power grid inspection method, device, equipment and storage medium based on artificial intelligence
EP3968266B1 (en) Obstacle three-dimensional position acquisition method and apparatus for roadside computing device
CN108805917A (en) Spatial positioning method, medium, device and computing device
CN112652016A (en) Point cloud prediction model generation method, pose estimation method and device
US20200012756A1 (en) Vision simulation system for simulating operations of a movable platform
US11755042B2 (en) Autonomous orbiting method and device and UAV
CN113228103A (en) Target tracking method, device, unmanned aerial vehicle, system and readable storage medium
CN113272871A (en) Camera calibration method and system
WO2021078003A1 (en) Obstacle avoidance method and device for unmanned vehicle, and unmanned vehicle
CN113031462A (en) Port machine inspection route planning system and method for unmanned aerial vehicle
CN114600162A (en) Scene lock mode for capturing camera images
CN112640419B (en) Following method, movable platform, device and storage medium
CN111784842B (en) Three-dimensional reconstruction method, device, equipment and readable storage medium
WO2022040941A1 (en) Depth calculation method and device, and mobile platform and storage medium
WO2022040940A1 (en) Calibration method and device, movable platform, and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
TR01 Transfer of patent right

Effective date of registration: 20240410

Address after: Room 2703, West Block, Qiushi Building, No. 17 Zizhu Seventh Road, Zhulin Community, Xiangmihu Street, Futian District, Shenzhen City, Guangdong Province, 518000

Patentee after: Shenzhen Rhenium Indium Space Technology Co.,Ltd.

Country or region after: China

Address before: 518048 Room 203, building 4, Jinsha garden, Shazui Road, Shatou street, Futian District, Shenzhen City, Guangdong Province

Patentee before: Shenzhen Platinum Stone Space Technology Co.,Ltd.

Country or region before: China