Disclosure of Invention
The present disclosure aims to provide an unmanned aerial vehicle guidance method, an unmanned aerial vehicle guidance device, an unmanned aerial vehicle guidance system, a computer-readable storage medium, and an electronic device, so that, at least to some extent, an unmanned aerial vehicle can remain within visual range even in scenes where the global positioning signal is weak, and can land safely at a designated place at any time.
Additional features and advantages of the disclosure will be set forth in the detailed description which follows, or in part will be obvious from the description, or may be learned by practice of the disclosure.
According to a first aspect of the present disclosure, there is provided a method for guiding a drone, comprising: acquiring a target image containing an unmanned aerial vehicle; acquiring the position coordinates of the unmanned aerial vehicle in a target coordinate system according to the target image; and determining a flight instruction according to the position coordinates and the coordinates of the target position, and sending the flight instruction to the unmanned aerial vehicle so that the unmanned aerial vehicle flies to the target position.
In an exemplary embodiment of the present disclosure, the target image is generated by a binocular camera conforming to a horizontal epipolar constraint.
In an exemplary embodiment of the present disclosure, the target image includes a reference image and a comparison image; the acquiring the position coordinates of the unmanned aerial vehicle in a target coordinate system according to the target image comprises: processing the reference image and the comparison image through an image recognition model to obtain a first position of the unmanned aerial vehicle in the reference image and a second position of the unmanned aerial vehicle in the comparison image; acquiring a parallax value according to the first position and the second position; and determining the position coordinates according to the offset of the first position relative to the central point of the reference image, the parallax value, and the parameters of the binocular camera.
In an exemplary embodiment of the present disclosure, the obtaining a parallax value according to the first position and the second position includes: mapping the first position onto the comparison image to obtain a third position; determining whether a target pixel exists within a preset distance of the third position, wherein the difference between the pixel value of the target pixel and the pixel value of the pixel corresponding to the third position is close to zero; and calculating the difference between the abscissa of the target pixel and the abscissa of the third position to obtain the parallax value.
In an exemplary embodiment of the present disclosure, determining a flight instruction according to the position coordinates and coordinates of a target position, and sending the flight instruction to the drone to make the drone fly to the target position includes: determining the flight distance and the flight direction according to the position coordinates and the coordinates of the target position; forming the flight instruction according to the flight distance, the flight direction and the preset speed, and sending the flight instruction to the unmanned aerial vehicle so that the unmanned aerial vehicle flies to the target position.
In an exemplary embodiment of the present disclosure, the determining a flight distance and a flight direction according to the position coordinates and the coordinates of the target position includes: when the target position is the central point of the reference image, the flight distance is the distance between the position coordinates and the central point, and the flight direction is the direction from the unmanned aerial vehicle toward the central point in the image plane.
In an exemplary embodiment of the present disclosure, the determining a flight distance and a flight direction according to the position coordinates and the coordinates of the target position includes: when the target position is a target landing point, the flying distance is a distance between the position coordinate and the target landing point, and the flying direction is a direction in which the unmanned aerial vehicle points to the target landing point.
In an exemplary embodiment of the present disclosure, the method further comprises: acquiring position information of the unmanned aerial vehicle in the target coordinate system in real time through a binocular camera conforming to horizontal epipolar constraint; and when the position information meets the preset condition, sending a landing instruction to the unmanned aerial vehicle to enable the unmanned aerial vehicle to land.
In an exemplary embodiment of the present disclosure, when the location information satisfies a preset condition, sending a landing instruction to the drone to land the drone, includes: calculating the distance between the position information and the target landing point; comparing the distance to a preset distance threshold; and if the distance is smaller than or equal to the preset distance threshold value, sending a landing instruction to the unmanned aerial vehicle so as to enable the unmanned aerial vehicle to land.
According to a second aspect of the present disclosure, there is provided a drone guiding device comprising: the image acquisition module is used for acquiring a target image containing the unmanned aerial vehicle; the coordinate acquisition module is used for acquiring the position coordinates of the unmanned aerial vehicle in a target coordinate system according to the target image; and the command generation module is used for determining a flight command according to the position coordinates and the coordinates of the target position, and sending the flight command to the unmanned aerial vehicle so that the unmanned aerial vehicle flies to the target position.
According to a third aspect of the present disclosure, there is provided a drone guiding system, comprising: a camera for filming a drone to generate a target image containing the drone; the guider is connected with the camera and used for receiving the target image sent by the camera and acquiring the position coordinate of the unmanned aerial vehicle in a target coordinate system according to the target image; and the signal feedback device is connected with the guider and the unmanned aerial vehicle and used for receiving the position coordinate, determining a flight instruction according to the position coordinate and the coordinate of the target position, and sending the flight instruction to the unmanned aerial vehicle so that the unmanned aerial vehicle flies to the target position.
According to a fourth aspect of the present disclosure, there is provided a computer storage medium having a computer program stored thereon, wherein the computer program, when executed by a processor, implements the drone guiding method described in the above embodiments.
According to a fifth aspect of the present disclosure, there is provided an electronic device comprising: one or more processors; and storage means for storing one or more programs which, when executed by the one or more processors, cause the one or more processors to implement the drone guiding method as described in the embodiments above.
According to the technical solutions above, the unmanned aerial vehicle guidance method in the exemplary embodiments of the present disclosure has at least the following advantages and positive effects:
the method comprises the steps of obtaining a target image containing the unmanned aerial vehicle; then analyzing the position of the unmanned aerial vehicle in the target image to obtain the position coordinates of the unmanned aerial vehicle in a target coordinate system; and finally determining a flight instruction according to the position coordinates of the unmanned aerial vehicle and the coordinates of the target position, and sending the flight instruction to the unmanned aerial vehicle so that the unmanned aerial vehicle flies to the target position. On one hand, the method ensures that the unmanned aerial vehicle remains within the monitored field of view and avoids the situation in which the unmanned aerial vehicle cannot be monitored when the positioning signal fails; on the other hand, it can guide the unmanned aerial vehicle to fly to, or land safely at, a preset place, improving user experience.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Detailed Description
Example embodiments will now be described more fully with reference to the accompanying drawings. Example embodiments may, however, be embodied in many different forms and should not be construed as limited to the examples set forth herein; rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the concept of example embodiments to those skilled in the art. The described features, structures, or characteristics may be combined in any suitable manner in one or more embodiments. In the following description, numerous specific details are provided to give a thorough understanding of embodiments of the disclosure. One skilled in the relevant art will recognize, however, that the subject matter of the present disclosure can be practiced without one or more of the specific details, or with other methods, components, devices, steps, and the like. In other instances, well-known technical solutions have not been shown or described in detail to avoid obscuring aspects of the present disclosure.
The terms "a," "an," "the," and "said" are used in this specification to denote the presence of one or more elements/components/parts/etc.; the terms "comprising" and "having" are intended to be inclusive and mean that there may be additional elements/components/etc. other than the listed elements/components/etc.; the terms "first" and "second," etc. are used merely as labels, and are not limiting on the number of their objects.
Furthermore, the drawings are merely schematic illustrations of the present disclosure and are not necessarily drawn to scale. The same reference numerals in the drawings denote the same or similar parts, and thus their repetitive description will be omitted. Some of the block diagrams shown in the figures are functional entities and do not necessarily correspond to physically or logically separate entities.
The present disclosure first provides an unmanned aerial vehicle guidance method, which may be run on a server, a server cluster, a cloud server, or the like. Fig. 1 shows a drone guiding method which, as shown in fig. 1, comprises at least the following steps:
step S110: acquiring a target image containing an unmanned aerial vehicle;
step S120: acquiring the position coordinates of the unmanned aerial vehicle in a target coordinate system according to the target image;
step S130: determining a flight instruction according to the position coordinates and the coordinates of the target position, and sending the flight instruction to the unmanned aerial vehicle so that the unmanned aerial vehicle flies to the target position.
According to the unmanned aerial vehicle guiding method, on one hand, the position coordinates of the unmanned aerial vehicle can be determined according to the acquired target image, ensuring that the unmanned aerial vehicle is within the monitored field of view; on the other hand, a flight instruction can be determined according to the position coordinates and the coordinates of the target position to guide the unmanned aerial vehicle to fly to the target position, ensuring that the unmanned aerial vehicle can still land safely at a designated place even when the positioning signal is lost.
In order to make the technical solutions of the present disclosure clearer, each step of the unmanned aerial vehicle guidance method is described in detail next.
In step S110, a target image including the drone is acquired.
In an exemplary embodiment of the present disclosure, in order to acquire a target image including an unmanned aerial vehicle, tracking and monitoring may be performed by a ground monitoring device disposed on the ground. The ground monitoring device may be a photographing device disposed on the ground, which is used for photographing the unmanned aerial vehicle to acquire the target image including the unmanned aerial vehicle. The photographing device may specifically be a camera, a video camera, a smart terminal with a photographing function, and the like. In an embodiment of the present disclosure, the photographing device may be a binocular camera conforming to a horizontal epipolar constraint; that is, when the binocular camera photographs the unmanned aerial vehicle, the ordinate of the unmanned aerial vehicle in the two imaging planes of the camera is the same, and only the abscissa differs.
In the exemplary embodiment of the present disclosure, a wireless communication module is provided in the binocular camera, and the wireless communication module may be wirelessly connected to a server through a network. After the binocular camera conforming to the horizontal epipolar constraint photographs the unmanned aerial vehicle to generate a target image including the unmanned aerial vehicle, the binocular camera may send the target image to the server; after acquiring the target image, the server recognizes the position coordinates from it and generates a flight instruction.
In step S120, position coordinates of the drone in a target coordinate system are obtained according to the target image.
In an exemplary embodiment of the present disclosure, after the target image is acquired, feature extraction may be performed on the drone in the target image to determine the position coordinates of the drone in the target coordinate system. Since the binocular camera photographs the unmanned aerial vehicle, two target images are generated: one corresponding to the left lens and the other to the right lens. For convenience of subsequent data processing, one image can be set as a reference image and the other as a comparison image; as an example, the target image generated by the left lens can be marked as the reference image, and the target image generated by the right lens can be marked as the comparison image. In addition, the target coordinate system is the camera coordinate system corresponding to the binocular camera. Fig. 2 shows a schematic direction diagram of the camera coordinate system; as shown in fig. 2, the X axis of the camera coordinate system is along the horizontal direction, the Y axis is along the vertical upward direction, and the Z axis is along the direction in which the binocular camera points to the drone.
In an exemplary embodiment of the present disclosure, fig. 3 shows a flowchart for acquiring the position coordinates of the drone in the target coordinate system, and as shown in fig. 3, the method for acquiring the position coordinates of the drone in the target coordinate system at least includes steps S301 to S303, specifically:
in step S301, the reference image and the comparison image are processed by an image recognition model to obtain a first position of the drone in the reference image and a second position of the drone in the comparison image.
In an exemplary embodiment of the present disclosure, in order to determine the specific position of the drone in the reference image and the comparison image, feature extraction may be performed on the reference image and the comparison image respectively through a machine learning model, which may be a neural network model such as a convolutional neural network, R-CNN, or Faster R-CNN. After the reference image or the comparison image is processed through multiple convolution, pooling, and fully connected layers, the drone can be located and marked with a labeling frame, and the first position of the drone in the reference image and the second position of the drone in the comparison image can be determined according to the labeling frame.
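By way of a non-limiting illustration of step S301, the following Python sketch locates the drone in both images; `detect_drone` is a hypothetical stand-in for the trained detector (e.g., Faster R-CNN), and the synthetic images exist only for demonstration:

```python
import numpy as np

def detect_drone(image: np.ndarray) -> tuple[float, float]:
    # Hypothetical stand-in for the CNN detector (e.g. Faster R-CNN): here we
    # take the centroid of unusually dark pixels, assuming the drone appears
    # dark against a bright sky. A real system would instead return the center
    # of the labeling frame produced by the trained model.
    threshold = image.mean() - 2 * image.std()
    ys, xs = np.nonzero(image < threshold)
    return float(xs.mean()), float(ys.mean())

# Synthetic grayscale reference/comparison images (H x W), for illustration.
reference_image = np.full((480, 640), 200.0)
comparison_image = np.full((480, 640), 200.0)
reference_image[100:110, 320:330] = 20.0   # "drone" in the left-lens image
comparison_image[100:110, 300:310] = 20.0  # shifted horizontally in the right-lens image

first_position = detect_drone(reference_image)    # first position (reference image)
second_position = detect_drone(comparison_image)  # second position (comparison image)
```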
In step S302, a parallax value is obtained according to the first position and the second position.

In the exemplary embodiment of the present disclosure, the parallax value of the unmanned aerial vehicle between the reference image and the comparison image can be obtained by performing binocular vision matching on the labeling frames of the unmanned aerial vehicle in the binocular images. Since the Y coordinates corresponding to the two lenses of the binocular camera in the target coordinate system are the same and only the X coordinates differ, the parallax value is simply the horizontal coordinate difference between the first position and the second position.

In an exemplary embodiment of the present disclosure, the parallax value may be obtained according to pixel values. Fig. 4 shows a schematic flowchart of obtaining the parallax value; as shown in fig. 4, the method for calculating the parallax value includes at least steps S401 to S403, specifically:
in step S401, the first position is mapped onto the contrast image to obtain a third position.
In an exemplary embodiment of the present disclosure, the origin of the target coordinate system may be located at the center of the binocular camera. In the mapping, as shown in fig. 5, the first position (x1, y) may be mapped onto the comparison image with the Y-axis as the symmetry axis; after the mapping, the third position on the comparison image is (x1′, y′).
In step S402, it is determined whether a target pixel exists within a range of a preset distance from the third position, where the difference between the pixel value of the target pixel and the pixel value of the pixel corresponding to the third position is close to zero.
In an exemplary embodiment of the present disclosure, an image range centered on (x1′, y′) and bounded by (x1′ ± δ, y′ ± δ) is defined, and it is determined whether a pixel exists within this range whose pixel value is close to that of the pixel at the third position (x1′, y′), where δ is not zero. In the embodiment of the present disclosure, the images generated by the binocular camera are RGB images; when comparing pixel values, the differences between the R, G, and B values of the two pixels may be calculated respectively and then averaged, and whether the pixel values of the two pixels are close to each other is determined according to the resulting average.
In step S403, the difference between the abscissa of the target pixel and the abscissa of the third position is calculated to obtain a parallax value.
In the exemplary embodiment of the disclosure, after the target pixel whose pixel value is closest to that of the pixel corresponding to the third position is determined, the target pixel may be considered the projection point of the unmanned aerial vehicle on the comparison image. Because the two images generated when the binocular camera shoots the same object differ only slightly, the target pixel and the third position lie substantially on the same horizontal axis; subtracting the abscissa of the third position from the abscissa of the target pixel and taking the absolute value yields the parallax value. For example, if the target pixel coordinate in fig. 5 is (x2, y), the parallax value is obtained as |x2 − x1′|.
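The matching of steps S401 to S403 can be sketched as follows, under simplifying assumptions: the third position is taken to share the pixel coordinates of the first position, and the horizontal epipolar constraint restricts the search for the target pixel to the same row; the window half-width `delta` and the function names are illustrative:

```python
import numpy as np

def pixel_difference(p: np.ndarray, q: np.ndarray) -> float:
    # Compare two RGB pixels as described above: take the per-channel
    # absolute R, G, B differences and average them.
    return float(np.mean(np.abs(p.astype(float) - q.astype(float))))

def parallax_value(ref_img: np.ndarray, cmp_img: np.ndarray,
                   first_pos: tuple[int, int], delta: int = 16) -> int:
    # Steps S401-S403 in miniature. Assumes the third position shares the
    # pixel coordinates of the first position and that, under the horizontal
    # epipolar constraint, the target pixel lies on the same row within
    # +/- delta columns of the third position.
    x1, y = first_pos
    reference_pixel = ref_img[y, x1]
    lo = max(0, x1 - delta)
    hi = min(cmp_img.shape[1] - 1, x1 + delta)
    # Target pixel: the candidate whose pixel-value difference from the
    # third position is closest to zero.
    x2 = min(range(lo, hi + 1),
             key=lambda x: pixel_difference(cmp_img[y, x], reference_pixel))
    return abs(x2 - x1)  # parallax value
```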
In step S303, the position coordinates are determined according to the offset amount of the first position with respect to the center point of the reference image, the parallax value, and the parameters of the binocular camera.
In an exemplary embodiment of the present disclosure, the position coordinates of the drone in the target coordinate system may be determined according to the offset of the first position with respect to the center point of the reference image, the parallax value, and the parameters of the binocular camera, and specifically may be calculated according to formula (1):
X = b × x1 / d,  Y = b × y / d,  Z = b × f / d    (1)

where (X, Y, Z) are the position coordinates, x1 and y are the offsets of the first position relative to the central point of the reference image, b is the distance (baseline) between the lenses of the binocular camera, d is the parallax value, and f is the focal length of the binocular camera.
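The following minimal Python sketch applies formula (1), assuming x1 and y are already offsets from the image center expressed in the same pixel units as the focal length f:

```python
def triangulate(x1: float, y: float, d: float,
                b: float, f: float) -> tuple[float, float, float]:
    # Formula (1): recover the drone's coordinates in the target (camera)
    # coordinate system from the image offset (x1, y), the parallax value d,
    # the lens baseline b, and the focal length f.
    if d == 0:
        raise ValueError("zero parallax: matching failed or the drone is too far away")
    return b * x1 / d, b * y / d, b * f / d  # (X, Y, Z)
```

For example, with an assumed baseline b = 0.1 m, focal length f = 800 pixels, and parallax d = 20 pixels, the sketch would place the drone roughly 4 m away along the Z axis.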
In step S130, a flight instruction is determined according to the position coordinates and the coordinates of the target position, and the flight instruction is sent to the unmanned aerial vehicle, so that the unmanned aerial vehicle flies to the target position.
In an exemplary embodiment of the present disclosure, after the position coordinates of the unmanned aerial vehicle in the target coordinate system have been determined, the flight distance, flight direction, and flight speed of the unmanned aerial vehicle can be determined according to the position coordinates of the unmanned aerial vehicle and the coordinates of the target position; the flight instruction is formed according to the flight distance, the flight direction, and a preset speed, and is sent to the unmanned aerial vehicle so that the unmanned aerial vehicle flies to the target position.
In an exemplary embodiment of the present disclosure, the target position may be the central point of the reference image captured by the binocular camera, or a position at which the unmanned aerial vehicle can land safely. When the target position is the central point of the reference image, the unmanned aerial vehicle only needs to fly to the central point of the reference image within the XOY plane; that is, the flight of the unmanned aerial vehicle in the target coordinate system may or may not involve displacement along the Z axis, as long as the unmanned aerial vehicle is located at the central point of the reference image in the images captured by the binocular camera. In the embodiment of the present disclosure, when the position of the drone in the target coordinate system is (X, Y, Z), the frame center coordinate point may be set to (0, 0, 0); the flight direction in the generated flight instruction may then be (-X, -Y, 0), the flight distance may be √(X² + Y²), and the flight speed may be a preset safe speed Vs. By flying according to the received flight instruction, the unmanned aerial vehicle can reach the central point of the reference image, which ensures that the drone is in the field of view being monitored.
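As a hedged, non-limiting sketch of this guidance step (the function name and the dictionary layout of the instruction are illustrative assumptions, not the disclosure's actual command format):

```python
import math

def command_to_image_center(position: tuple[float, float, float],
                            safe_speed: float) -> dict:
    # Guide the drone toward the reference-image center point (0, 0, 0) in
    # the XOY plane only; the Z coordinate is left unchanged.
    X, Y, _ = position
    distance = math.hypot(X, Y)  # flight distance within the XOY plane
    if distance == 0.0:
        return {"direction": (0.0, 0.0, 0.0), "distance": 0.0, "speed": 0.0}
    direction = (-X / distance, -Y / distance, 0.0)  # unit vector toward center
    return {"direction": direction, "distance": distance, "speed": safe_speed}
```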
Further, when the unmanned aerial vehicle reaches the target position, the server can send a hovering instruction to the unmanned aerial vehicle, ensuring that the unmanned aerial vehicle always remains at the central point of the reference image relative to the binocular camera. When the posture or the position of the binocular camera changes, the flight distance, the flight direction, and the flight speed can be recalculated, and a new flight instruction sent to the unmanned aerial vehicle, so that the unmanned aerial vehicle continues flying until it reaches the central point of the reference image.
In an exemplary embodiment of the present disclosure, when the target position is a target landing point at which the unmanned aerial vehicle can land safely, the flight instruction may be determined according to the position coordinates of the unmanned aerial vehicle and the position of the target landing point, and the unmanned aerial vehicle is guided to fly in the target coordinate system. The target landing point may be a point relatively close to the binocular camera, which facilitates recovery of the drone by the user; for example, the drone may be instructed to fly in the (-X, -Y, -Z) direction at a preset safe speed Vs. During the flight of the unmanned aerial vehicle, in order to ensure that it always remains within the monitored range, the binocular camera can collect images containing the unmanned aerial vehicle in real time and send them to the server, and the server can determine whether to instruct the unmanned aerial vehicle to hover or land according to the distance between the real-time position of the unmanned aerial vehicle and the target landing point. Fig. 6 is a schematic flow chart illustrating guiding the unmanned aerial vehicle to land safely. As shown in fig. 6, in step S601, the distance between the real-time position information of the unmanned aerial vehicle and the target landing point is calculated. If the real-time position information is denoted (Xs, Ys, Zs), where Xs, Ys, and Zs are preset safe distances from the unmanned aerial vehicle to the binocular camera in the respective directions and are not all zero at the same time, and the target landing point is denoted (0, 0, 0), then the distance between the two is √(Xs² + Ys² + Zs²).
In step S602, the distance is compared with a preset distance threshold. In step S603, if the distance is less than or equal to the preset distance threshold, a landing instruction is sent to the drone to land the drone. Assuming the preset distance threshold is S, if √(Xs² + Ys² + Zs²) ≤ S, the unmanned aerial vehicle has entered the range in which it can land safely, and a landing instruction can be sent to the unmanned aerial vehicle to control the unmanned aerial vehicle to land.
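A minimal sketch of the landing decision of steps S601 to S603 (function and parameter names are illustrative):

```python
import math

def should_land(position: tuple[float, float, float], threshold: float) -> bool:
    # Steps S601-S603: compute the distance between the real-time position
    # (Xs, Ys, Zs) and the target landing point at the origin, and land when
    # it is within the preset distance threshold S.
    Xs, Ys, Zs = position
    distance = math.sqrt(Xs**2 + Ys**2 + Zs**2)
    return distance <= threshold  # True: send the landing instruction
```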
The unmanned aerial vehicle guiding method in the embodiments of the present disclosure can guide the unmanned aerial vehicle to fly to a target position, ensures that the unmanned aerial vehicle remains within the monitored visual field, prevents the unmanned aerial vehicle from being lost when the positioning signal fails, and further ensures that the unmanned aerial vehicle can land safely and be recovered.
The following introduces an apparatus embodiment of the present disclosure, which can be used to execute the above-mentioned guiding method for an unmanned aerial vehicle of the present disclosure. For details not disclosed in the embodiments of the apparatus of the present disclosure, please refer to the embodiments of the unmanned aerial vehicle guidance method of the present disclosure.
Fig. 7 schematically illustrates a block diagram of a drone guiding device according to one embodiment of the present disclosure.
Referring to fig. 7, a drone guiding device 700 according to one embodiment of the present disclosure includes: an image acquisition module 701, a coordinate acquisition module 702, and an instruction generation module 703. Specifically:
an image acquisition module 701, configured to acquire a target image including an unmanned aerial vehicle; a coordinate obtaining module 702, configured to obtain, according to the target image, a position coordinate of the unmanned aerial vehicle in a target coordinate system; the instruction generating module 703 is configured to determine a flight instruction according to the position coordinates and the coordinates of the target position, and send the flight instruction to the unmanned aerial vehicle, so that the unmanned aerial vehicle flies to the target position.
It should be noted that although several modules or units of the apparatus are mentioned in the above detailed description, such a division is not mandatory. Indeed, the features and functionality of two or more modules or units described above may be embodied in one module or unit, according to embodiments of the present disclosure. Conversely, the features and functions of one module or unit described above may be further divided so as to be embodied by a plurality of modules or units.
In an exemplary embodiment of the present disclosure, there is also provided a drone guiding system. Fig. 8 shows a schematic structural diagram of the drone guiding system; as shown in fig. 8, the drone guiding system 800 includes a camera 801, a guider 802, and a signal feedback device 803. Specifically:
a camera 801 for shooting a drone to generate a target image containing the drone; a guider 802, connected to the camera, for receiving the target image sent by the camera and acquiring the position coordinates of the unmanned aerial vehicle in a target coordinate system according to the target image; and a signal feedback device 803, connected to the guider and the unmanned aerial vehicle, for receiving the position coordinates, determining a flight instruction according to the position coordinates and the coordinates of the target position, and sending the flight instruction to the unmanned aerial vehicle so that the unmanned aerial vehicle flies to the target position.
In an exemplary embodiment of the present disclosure, the guider 802 may be a dedicated chip or a computer equipped with a high-performance GPU.
In an exemplary embodiment of the present disclosure, an electronic device capable of implementing the above method is also provided.
As will be appreciated by one skilled in the art, aspects of the present invention may be embodied as a system, method, or program product. Accordingly, various aspects of the present invention may be embodied in the form of: an entirely hardware embodiment, an entirely software embodiment (including firmware, microcode, etc.), or an embodiment combining hardware and software aspects that may all generally be referred to herein as a "circuit," "module," or "system."
An electronic device 900 according to this embodiment of the invention is described below with reference to fig. 9. The electronic device 900 shown in fig. 9 is only an example and should not bring any limitations to the function and scope of use of the embodiments of the present invention.
As shown in fig. 9, electronic device 900 is in the form of a general purpose computing device. Components of electronic device 900 may include, but are not limited to: the at least one processing unit 910, the at least one storage unit 920, a bus 930 connecting different system components (including the storage unit 920 and the processing unit 910), and a display unit 940.
The storage unit stores program code that is executable by the processing unit 910 to cause the processing unit 910 to perform steps according to various exemplary embodiments of the present invention described in the above section "exemplary methods" of the present specification. For example, the processing unit 910 may perform the steps shown in fig. 1: step S110, acquiring a target image containing a drone; step S120, acquiring the position coordinates of the drone in a target coordinate system according to the target image; and step S130, determining a flight instruction according to the position coordinates and the coordinates of the target position, and sending the flight instruction to the drone so that the drone flies to the target position.
The storage unit 920 may include a readable medium in the form of a volatile storage unit, such as a random access memory unit (RAM) 9201 and/or a cache memory unit 9202, and may further include a read-only memory unit (ROM) 9203.
Storage unit 920 may also include a program/utility 9204 having a set (at least one) of program modules 9205, such program modules 9205 including but not limited to: an operating system, one or more application programs, other program modules, and program data, each of which, or some combination thereof, may comprise an implementation of a network environment.
Bus 930 can be any of several types of bus structures including a memory unit bus or memory unit controller, a peripheral bus, an accelerated graphics port, a processing unit, or a local bus using any of a variety of bus architectures.
The electronic device 900 may also communicate with one or more external devices 1500 (e.g., keyboard, pointing device, bluetooth device, etc.), with one or more devices that enable a user to interact with the electronic device 900, and/or with any devices (e.g., router, modem, etc.) that enable the electronic device 900 to communicate with one or more other computing devices. Such communication may occur via input/output (I/O) interface 950. Also, the electronic device 900 may communicate with one or more networks (e.g., a Local Area Network (LAN), a Wide Area Network (WAN), and/or a public network such as the Internet) via the network adapter 960. As shown, the network adapter 960 communicates with the other modules of the electronic device 900 via the bus 930. It should be appreciated that although not shown, other hardware and/or software modules may be used in conjunction with the electronic device 900, including but not limited to: microcode, device drivers, redundant processing units, external disk drive arrays, RAID systems, tape drives, and data backup storage systems, to name a few.
Through the above description of the embodiments, those skilled in the art will readily understand that the exemplary embodiments described herein may be implemented by software, or by software in combination with necessary hardware. Therefore, the technical solution according to the embodiments of the present disclosure may be embodied in the form of a software product, which may be stored in a non-volatile storage medium (which may be a CD-ROM, a usb disk, a removable hard disk, etc.) or on a network, and includes several instructions to enable a computing device (which may be a personal computer, a server, a terminal device, or a network device, etc.) to execute the method according to the embodiments of the present disclosure.
In an exemplary embodiment of the present disclosure, there is also provided a computer-readable storage medium having stored thereon a program product capable of implementing the above-described method of the present specification. In some possible embodiments, aspects of the invention may also be implemented in the form of a program product comprising program code means for causing a terminal device to carry out the steps according to various exemplary embodiments of the invention described in the above section "exemplary methods" of the present description, when said program product is run on the terminal device.
Referring to fig. 10, a program product 1000 for implementing the above method according to an embodiment of the present invention is described, which may employ a portable compact disc read only memory (CD-ROM) and include program code, and may be run on a terminal device, such as a personal computer. However, the program product of the present invention is not limited in this regard and, in the present document, a readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
The program product may employ any combination of one or more readable media. The readable medium may be a readable signal medium or a readable storage medium. A readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples (a non-exhaustive list) of the readable storage medium include: an electrical connection having one or more wires, a portable disk, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
A computer readable signal medium may include a propagated data signal with readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A readable signal medium may also be any readable medium that is not a readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
Program code embodied on a readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.
Program code for carrying out operations for aspects of the present invention may be written in any combination of one or more programming languages, including an object-oriented programming language such as Java, C++, or the like, and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computing device, partly on the user's device, as a stand-alone software package, partly on the user's computing device and partly on a remote computing device, or entirely on the remote computing device or server. In the case of a remote computing device, the remote computing device may be connected to the user computing device through any kind of network, including a local area network (LAN) or a wide area network (WAN), or may be connected to an external computing device (e.g., through the Internet using an Internet service provider).
Furthermore, the above-described drawings are only schematic illustrations of processes involved in methods according to exemplary embodiments of the invention, and are not intended to be limiting. It will be readily understood that the processes shown in the above figures are not intended to indicate or limit the chronological order of the processes. In addition, it is also readily understood that these processes may be performed, for example, synchronously or asynchronously in multiple modules.
Other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the disclosure disclosed herein. This application is intended to cover any variations, uses, or adaptations of the disclosure following, in general, the principles of the disclosure and including such departures from the present disclosure as come within known or customary practice within the art to which the disclosure pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the disclosure being indicated by the following claims.
It will be understood that the present disclosure is not limited to the precise arrangements described above and shown in the drawings and that various modifications and changes may be made without departing from the scope thereof. The scope of the present disclosure is to be limited only by the terms of the appended claims.