WO2019223159A1 - Method and apparatus for controlling live broadcast of unmanned device, computer device, and storage medium - Google Patents
- Publication number: WO2019223159A1
- Application number: PCT/CN2018/102874 (CN2018102874W)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- unmanned device
- state instruction
- unmanned
- instruction
- preset
- Prior art date
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/70—Determining position or orientation of objects or cameras
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/02—Control of position or course in two dimensions
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/18—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
- H04N7/183—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a single remote source
- H04N7/185—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a single remote source from a mobile camera, e.g. for remote control
- Y—GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y02—TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
- Y02D—CLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
- Y02D30/00—Reducing energy consumption in communication networks
- Y02D30/70—Reducing energy consumption in communication networks in wireless communication networks
Definitions
- The present application relates to the field of remote control technology, and in particular to a method, apparatus, computer device, and storage medium for controlling the live broadcast of an unmanned device.
- The purpose of this application is to solve at least one of the above technical defects, and in particular to provide a method, apparatus, computer device, and storage medium capable of freely controlling the live broadcast of an unmanned device, so that a terminal can be controlled freely and an image of the environment in which the terminal is located can be viewed.
- The present application provides a method for controlling the live broadcast of an unmanned device, including: the unmanned device obtains a first state instruction to be executed, where the first state instruction is used to control a movement trajectory of the unmanned device; the unmanned device moves on a preset path according to the movement trajectory and acquires spatial video information in its traveling direction during travel through an image sensor on the unmanned device; and the unmanned device pushes the spatial video information to a server as a stream.
- The present application also discloses an apparatus for controlling the live broadcast of an unmanned device, including: an acquisition module, configured to acquire a first state instruction to be executed, where the first state instruction is used to control a movement trajectory of the unmanned device; a processing module, configured to move the unmanned device on a preset path according to the movement trajectory and to acquire spatial video information in the traveling direction through an image sensor on the unmanned device; and an execution module, configured to push the spatial video information from the unmanned device to a server as a stream.
- This application also discloses a computer device including a memory and a processor. The memory stores computer-readable instructions which, when executed by the processor, cause the processor to perform the following steps of the method for controlling the live broadcast of an unmanned device: the unmanned device obtains a first state instruction to be executed, where the first state instruction is used to control a movement trajectory of the unmanned device; the unmanned device moves on a preset path according to the movement trajectory and acquires spatial video information in its traveling direction during travel through an image sensor on the unmanned device; and the unmanned device pushes the spatial video information to a server as a stream.
- The present application also discloses a non-volatile storage medium storing computer-readable instructions. When the computer-readable instructions are executed by one or more processors, the one or more processors perform the following steps of a method for controlling the live broadcast of an unmanned device: the unmanned device obtains a first state instruction to be executed, where the first state instruction is used to control a movement trajectory of the unmanned device; the unmanned device moves on a preset path according to the movement trajectory and acquires spatial video information in its traveling direction during travel through an image sensor on the unmanned device; and the unmanned device pushes the spatial video information to a server as a stream.
- The method and apparatus for controlling the live broadcast of an unmanned device disclosed in this application make it possible, in an automatic house-viewing scenario, for any user to control the apparatus so that it travels along a preset route and captures the relevant images, realizing remote, autonomous house viewing. Users can operate the apparatus without professional control training; the method of use is simple, the overall control scheme is straightforward, and control is more flexible.
- FIG. 1 is a flowchart of the method for controlling the live broadcast of an unmanned device in this application;
- FIG. 2 is a flowchart of a method by which the unmanned device obtains a distance in this application;
- FIG. 3 is a flowchart of a method by which the unmanned device obtains a distance using a ranging light spot in this application;
- FIG. 4 is a flowchart of a first implementation of a precondition for executing the first state instruction in this application;
- FIG. 5 is a flowchart of a second implementation of a precondition for executing the first state instruction in this application;
- FIG. 6 is a schematic diagram of the modules of the apparatus for controlling the live broadcast of an unmanned device in this application; and
- FIG. 7 is a block diagram of the basic structure of a computer device of the present application.
- the present application provides a method for controlling live broadcast of an unmanned device.
- When the method is used in a remote house-viewing scenario, the method for controlling the live broadcast of an unmanned device involves at least two terminals. One is a remote terminal used by the user to view images and the control status and to send control instructions; the remote terminal may be a computer, a notebook, a mobile phone, or another terminal, and the unmanned device may be a robot, an unmanned aerial vehicle, or an unmanned vehicle such as a remote-control car.
- The other is an unmanned device that is controlled to move and can perform the related shooting actions.
- The two communicate remotely. The unmanned device analyzes the received instructions against its own motion state and either executes or refuses the first state instruction, thereby achieving remotely controlled house viewing; at the same time, the unmanned device has an automatic obstacle-avoidance function, a high degree of intelligence, and a simple control method.
- a method for controlling live broadcast of an unmanned device based on a remote terminal and an unmanned device includes the following steps:
- the unmanned device obtains a first status instruction to be executed, where the first status instruction is used to control a movement track of the unmanned device;
- the first instruction is used to control a movement trajectory of the unmanned device, that is, to control the unmanned device to move in a specified direction.
- a direction input unit is provided on the remote terminal, and the unmanned device can be controlled to move through the direction input unit or by setting related direction parameters.
- The user can send any direction instruction that makes the unmanned device move, so as to control the unmanned device according to the user's wishes.
- the unmanned device moves on a preset path according to the movement trajectory, and acquires spatial video information in the traveling direction of the unmanned device during the traveling process through an image sensor on the unmanned device;
- There are various types of spatial video information, including a ranging light spot image projected in the traveling direction and an image of the surrounding environment at the current position. This spatial video information is obtained through an image sensor on the unmanned device.
- the ranging light spot image projected in the traveling direction is mainly used to monitor the distance between the unmanned device and an obstacle in the traveling direction to avoid a collision.
- the image of the surrounding environment at the current location is for the convenience of the user to understand the environment around the location of the unmanned device.
- The two kinds of spatial video information can be obtained at the same time or in different periods.
- In the simultaneous case, two different camera devices are used to obtain the ranging light spot image and the image of the surrounding environment at the current position, respectively.
- In the time-shared case, a single camera device is usually used: during travel, the ranging light spot image is acquired first to measure the distance between the unmanned device and the obstacle ahead, and only when the unmanned device stops is the environment around its current position photographed to obtain the corresponding image.
- the first instruction includes instruction information for controlling a moving direction.
- The unmanned device moves automatically according to the first state instruction, so a suitable obstacle-avoidance mechanism is required.
- Obtaining the ranging light spot image is, in effect, one way to avoid obstacles.
- This application discloses a way of determining the distance between the unmanned device and the obstacle ahead from the ranging light spot image.
- the unmanned device pushes the spatial video information to a server.
- Since the spatial video information may be a ranging light spot image projected in the traveling direction or an image of the surrounding environment at the current position, once the relevant spatial video information is obtained it is pushed directly to the server as a stream.
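The patent text does not give an implementation, but the three steps S100-S300 can be pictured as a simple control loop. The following Python sketch is illustrative only: the instruction format, the length-prefixed TCP push, the server address, and all class and method names are assumptions, and the motor and camera calls are left as placeholders.

```python
import socket
import time

class UnmannedDeviceLiveBroadcast:
    """Sketch of the S100-S300 loop; all names and formats are assumptions."""

    def __init__(self, server_addr=("203.0.113.10", 9000)):
        self.server_addr = server_addr   # assumed streaming server address
        self.sock = None

    def receive_first_state_instruction(self):
        # S100: assumed instruction format -- a direction vector and a duration.
        return {"direction": (1.0, 0.0), "duration_s": 5.0}

    def move_step(self, direction):
        pass  # drive the motors one control tick in `direction` (hardware-specific)

    def capture_frame(self):
        return b"\xff\xd8..."  # placeholder for one JPEG frame from the image sensor

    def push_frame(self, frame):
        # S300: assumed length-prefixed push over a plain TCP connection.
        if self.sock is None:
            self.sock = socket.create_connection(self.server_addr)
        self.sock.sendall(len(frame).to_bytes(4, "big") + frame)

    def run(self):
        instr = self.receive_first_state_instruction()       # S100
        deadline = time.time() + instr["duration_s"]
        while time.time() < deadline:                         # S200: move on the preset path
            self.move_step(instr["direction"])
            self.push_frame(self.capture_frame())             # spatial video info -> server
            time.sleep(0.1)
```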
- Further, when the unmanned device moves on the preset path according to the movement trajectory, the method further includes:
- An obstacle refers to an object blocking the unmanned device's direction of travel.
- The first parameter value describes a relationship between the unmanned device and the obstacle, such as a distance.
- The first preset condition is a condition on the first parameter value. For example, when the first parameter value is the distance between the unmanned device and the obstacle, a first threshold for distance can be set so that the unmanned device performs the specified action in its direction of travel; when the first parameter value is smaller than the first threshold, the unmanned device is relatively close to the obstacle.
- The first preset condition is then a relationship between the first parameter value and the first threshold, for example that the first parameter value is greater than the first threshold, or smaller than the first threshold, or some other relationship set according to the corresponding application scenario.
- Further, in the present application, the second state instruction includes any one of a stop-travel instruction, a speed-reduction instruction, or a reverse-travel instruction. The above three steps are illustrated below with a specific embodiment.
- For example, when ultrasound is used to obtain the distance between the obstacle and the unmanned device, assume the first threshold is 1 meter and the first preset condition is that the first parameter value is smaller than the first threshold.
- When the first parameter value obtained by ultrasound is 0.8 meters, the first parameter value is less than the first threshold and the first preset condition is satisfied; the unmanned device then executes the second state instruction and immediately stops, reduces its traveling speed, or travels in reverse.
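As a minimal illustration of this embodiment, the threshold test and the choice of second state instruction can be expressed as follows; the function name and the `mode` parameter are hypothetical, and only the 1 meter threshold and the 0.8 meter reading come from the example above.

```python
FIRST_THRESHOLD_M = 1.0   # first threshold from the example: 1 meter

def second_state_instruction(distance_m, mode="stop"):
    """Return the action when the first preset condition (distance < threshold) is met."""
    if distance_m < FIRST_THRESHOLD_M:            # first preset condition satisfied
        return {"stop": "STOP_TRAVEL",
                "slow": "REDUCE_SPEED",
                "reverse": "REVERSE_TRAVEL"}[mode]
    return "CONTINUE"                             # condition not met: keep traveling

print(second_state_instruction(0.8))              # -> STOP_TRAVEL (0.8 m < 1 m)
print(second_state_instruction(1.5))              # -> CONTINUE
```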
- There are many ways to obtain the first parameter value relative to an obstacle; laser ranging, visible-light ranging, or other methods may also be used.
- This application also discloses a method of ranging with a ranging light spot, in which the obtained first parameter value is a parameter value related to the ranging light spot.
- the specific steps of obtaining the first parameter value and executing the second preset condition by using the ranging light spot include:
- This step applies when the spatial video information is a ranging light spot image; the purpose of obtaining a frame picture image of the spatial video information is to measure the distance between the unmanned device and the obstacle.
- For the specific ranging method, refer to step S212.
- Usually, the device that emits the light beam is installed directly on the unmanned device, and the camera that captures the spatial image containing the complete light spot ahead is also mounted on the unmanned device.
- Preferably, the camera and the light-emitting device are placed as close together as possible, arranged side by side, one above the other, or front to back, to reduce the inclination angle of the spatial image containing the complete light spot.
- In this embodiment, the device that emits the light beam is a light spot projector.
- The light spot projector and the camera that captures the image containing the light spot are stacked together to form a ranging device.
- The ranging device always points in the direction of travel: the projector casts the light spot onto the obstacle, and the camera collects images of the area containing the light spot in real time or periodically.
- After an image containing the complete light spot is collected, the full light spot area can be obtained through image-processing techniques. Because the camera is fixed and the area of the whole captured picture is fixed, a fixed algorithm can compute the proportion of the light spot within the whole image area.
- In this embodiment, the shape of the light spot projected by the light spot projector is fixed, but the area of the spot projected onto the obstacle differs with distance: when the unmanned device is close to the obstacle, the spot projected by the projector on the unmanned device has a small area, and when the projector is far from the obstacle, the projected spot area is large. Thus, provided the projection direction is fixed, a function relating the projected spot area to the distance can easily be obtained.
- Correspondingly, with the camera position and shooting angle fixed, the size of the picture taken by the camera is fixed, and existing technology can relate the ratio between the spot size shown in the captured picture and the picture area to the distance between the shooting location and the obstacle; therefore, the distance between the unmanned device and the obstacle can be obtained by determining the proportion of the image area occupied by the light spot region.
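A rough sketch of this ratio-based ranging idea is given below. The brightness threshold used to segment the light spot and the mapping from the ratio S2/S1 to a distance are assumptions (the text only states that such a calibrated relationship exists), and the synthetic frame is for demonstration only.

```python
import numpy as np

def spot_view_ratio(frame_gray, brightness_threshold=240):
    """S2/S1: fraction of the frame occupied by the projected light spot."""
    spot_pixels = np.count_nonzero(frame_gray >= brightness_threshold)   # S2
    total_pixels = frame_gray.size                                        # S1
    return spot_pixels / total_pixels

def estimate_distance(ratio, mapping=lambda r: 0.5 / r if r > 0 else float("inf")):
    """Map S2/S1 to a distance; the inverse mapping used here stands in for the
    calibrated function f(n) mentioned in the text and is purely illustrative."""
    return mapping(ratio)

frame = np.zeros((480, 640), dtype=np.uint8)
frame[200:240, 300:340] = 255                  # synthetic bright spot for demonstration
ratio = spot_view_ratio(frame)
print(ratio, estimate_distance(ratio))
```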
- The distance between the unmanned device and the obstacle may also be measured by emitting visible light; this method works in the same way as the light-spot ranging described above.
- Another option is laser ranging, that is, using a pulse or phase method to calculate the distance between the position of the laser-emitting device and the obstacle.
- The above are just a few of the ranging methods disclosed in this application; the application is not limited to any one of them, and other methods may also be used.
- The first threshold is a critical value of the view ratio of the light spot image within the frame picture image described in step S212. As step S212 shows, the distance between the unmanned device and the obstacle ahead can be determined from the proportion of the spot image in the frame picture.
- When the unmanned device also captures images of the surrounding environment at its current position, being too close to an obstacle not only obstructs turning and movement but may also block part of the field of view, which is bad for shooting the surroundings. An optimal distance between the unmanned device and the obstacle therefore needs to be set, one that allows good pictures of the surroundings while ensuring the unmanned device does not collide with the obstacle.
- From the view ratio of the light spot image in the frame picture, the distance between the unmanned device's current position and the obstacle ahead can be obtained, and the view ratio corresponding to the critical distance between the unmanned device and the obstacle is the first threshold mentioned above. Assuming the total area of the frame picture image is S1 and the area of the spot image within it is S2, the actual distance L between the unmanned device and the obstacle is obtained by mapping the ratio S2/S1 through the corresponding distance function f(n), that is, L = (S2/S1) * f(n).
- the second state instruction includes any one of instruction information of stopping the travel, reducing the travel speed, or traveling in the reverse direction. For example, in one embodiment, when the above-mentioned proportional relationship is greater than the first threshold, a second state instruction for stopping travel is executed to avoid collision between the unmanned device and an obstacle.
- A second threshold may further be set so that the unmanned device moves slowly to the position corresponding to the second threshold and then performs other actions such as stopping or turning. Note that the distance between the unmanned device and the obstacle may be obtained in real time or at a certain frequency.
- In one embodiment, when the ratio exceeds the first threshold, the traveling speed is reduced and the unmanned device adjusts its distance to the obstacle using a smaller movement step and a higher distance-detection frequency, so that it reaches the position corresponding to the second threshold more accurately.
- In another embodiment, the reverse-travel second state instruction is executed so that the unmanned device moves in the opposite direction while S2/S1 continues to be monitored; once S2/S1 is equal to or smaller than the first threshold, the unmanned device stops or performs other actions.
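The stop, slow-down, and reverse behaviour described above can be summarized in a small decision function. This is a hedged sketch: the threshold values, the function name, and the `mode` switch between the two embodiments are all illustrative.

```python
FIRST_THRESHOLD = 0.05    # critical S2/S1 view ratio (illustrative value)
SECOND_THRESHOLD = 0.04   # S2/S1 at the fine-positioning distance (illustrative value)

def decide_action(ratio, mode="reverse"):
    """Return the second state instruction for one S2/S1 reading; `mode` selects
    which embodiment applies once the first threshold is exceeded."""
    if ratio <= FIRST_THRESHOLD:
        return "CONTINUE"            # safe: keep following the first state instruction
    if mode == "reverse":
        return "REVERSE_TRAVEL"      # back away until S2/S1 <= FIRST_THRESHOLD again
    return "REDUCE_SPEED"            # creep toward the position where S2/S1 ~= SECOND_THRESHOLD

for r in (0.02, 0.045, 0.07):
    print(r, decide_action(r), decide_action(r, mode="slow"))
```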
- The distance corresponding to the first threshold may be the safest distance, for example a distance suitable for the unmanned device to turn, or one at which collisions are unlikely; alternatively, it may be the distance best suited for shooting images of the surrounding space.
- The distance value of the first threshold is set accordingly.
- The unmanned device is simultaneously controlled to retreat so that its distance to the obstacle is greater than or equal to this critical distance, reaching the optimal shooting range and allowing the unmanned device to capture the surrounding images more completely, so that users can observe the most comprehensive spatial image at the current distance.
- the second status instruction further includes: sending warning information.
- When S2/S1 is greater than the preset first threshold, the unmanned device is in danger of colliding with the obstacle ahead or is poorly placed to perform its next action, so it can send warning information to the remote terminal or to the relevant server to raise an alarm, reminding users or administrators to check or to take corrective measures.
- The warning information may include triggering an audible alarm, dialing an emergency phone number, sending an emergency email, and so on.
- the method further includes:
- This step applies in particular scenarios, for example an automatic house-viewing system in which an unmanned device is remotely controlled to shoot its surroundings for autonomous house viewing. Because rooms contain many objects, fully automatic walking and shooting by unmanned equipment is prone to collisions, and such equipment usually travels along a designated route. Different customers have different preferences and viewing habits, so customers cannot simply be shown an arbitrary scene of the current environment; the method of controlling the live broadcast of an unmanned device in this application solves this problem. In a room, however, the layout and furnishings mean the unmanned device cannot be moved at will.
- A travel path adapted to the current environment can therefore be formulated in advance, ensuring that the unmanned equipment can move safely and capture a complete picture of the surroundings of the current environment.
- The travel paths suited to different locations are not exactly the same; to adapt to different types of rooms, a travel path can be set for each room.
- When the unmanned device starts working, its real-time position is obtained and matched to the travel path of the corresponding room.
- It is then determined whether the unmanned device is currently on the travel path. If so, it moves in the direction given by the first state instruction; if not, the unmanned device's current location is an illegal, that is, unsafe, area and the first state instruction is not executed. Further, the unmanned device can be controlled to send an alarm signal indicating that the current position is illegal, prompting staff or the remote operator to move the unmanned device out of the illegal area.
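A simplified sketch of this position check follows. Representing the room's travel path as a set of axis-aligned rectangles, the coordinates, the room name, and the alarm hook are all assumptions made for illustration.

```python
# (xmin, ymin, xmax, ymax) rectangles approximating each room's travel path, in meters.
TRAVEL_PATHS = {
    "living_room": [(0.0, 0.0, 5.0, 1.0), (4.0, 0.0, 5.0, 6.0)],
}

def in_travel_path(room, position):
    x, y = position
    return any(xmin <= x <= xmax and ymin <= y <= ymax
               for xmin, ymin, xmax, ymax in TRAVEL_PATHS[room])

def handle_first_state_instruction(room, position, execute, raise_alarm):
    if in_travel_path(room, position):
        execute()                         # move in the commanded direction
    else:
        raise_alarm("illegal position")   # refuse the instruction and alert the staff

handle_first_state_instruction("living_room", (4.5, 3.0),
                               execute=lambda: print("moving"),
                               raise_alarm=lambda msg: print("ALARM:", msg))
```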
- This method is similar to the above method, and it is also necessary to preset a travel path and obtain the current position information of the unmanned device to match the corresponding travel path.
- The difference is that the unmanned device must determine, according to the direction indicated by the first state instruction, whether both its current position and the end point of the upcoming movement fall within the boundary of the travel path. For example, when the first state instruction commands forward movement at 10 cm/s for 10 seconds, the resulting displacement is 100 cm, so before executing the instruction the device predicts whether the position 100 cm ahead lies within the preset travel path. The first state instruction is executed only when the end point of the movement is also within the travel path; if executing it would take the unmanned device off the travel path, execution is prohibited, preventing the unmanned device from running off the path after the first state instruction is executed.
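The end-point prediction in this example (10 cm/s for 10 seconds giving a 100 cm displacement) can be sketched as below; the coordinate convention and the sample corridor-shaped travel path are assumptions.

```python
import math

def predicted_end(position, direction_deg, speed_cm_s, duration_s):
    dist = speed_cm_s * duration_s                       # e.g. 10 cm/s * 10 s = 100 cm
    rad = math.radians(direction_deg)
    return (position[0] + dist * math.cos(rad),
            position[1] + dist * math.sin(rad))

def may_execute(position, direction_deg, speed_cm_s, duration_s, in_travel_path):
    end = predicted_end(position, direction_deg, speed_cm_s, duration_s)
    return in_travel_path(position) and in_travel_path(end)

# Hypothetical corridor-shaped travel path: 0-600 cm along x, 0-100 cm along y.
corridor = lambda p: 0 <= p[0] <= 600 and 0 <= p[1] <= 100
print(may_execute((450, 50), 0, 10, 10, corridor))   # end at x = 550 cm -> True
print(may_execute((550, 50), 0, 10, 10, corridor))   # end at x = 650 cm -> False
```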
- the travel path mentioned in the above two schemes is a preset electronic path, which can be sent to the device performing the movement through data transmission.
- However, the preset travel path is not limited to the electronic path described above; it may also be an electromagnetic track installed in the house.
- A Hall sensor for detecting the electromagnetic track is provided on the moving unmanned device; the Hall sensor detects whether the unmanned device is on the electromagnetic track, and this determines whether the first state instruction is executed.
- Specifically, Hall sensors are provided on both sides of the unmanned device. When the Hall sensor on one side cannot detect the electromagnetic field, or the detected field is too weak, the unmanned device shifts toward the side of the other Hall sensor until the electromagnetic field signal is received again, making house viewing more convenient and intelligent through automatic driving.
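A rough control sketch of this two-sensor track following is shown below; the field threshold, the steering command names, and the sensor readings are assumptions, since the text only describes shifting toward the side whose Hall sensor still detects the electromagnetic field.

```python
FIELD_MIN = 0.2   # assumed minimum usable Hall-sensor reading (arbitrary units)

def steering_correction(left_field, right_field):
    """Shift toward the side whose Hall sensor still detects the electromagnetic track."""
    if left_field < FIELD_MIN and right_field >= FIELD_MIN:
        return "SHIFT_RIGHT"   # left sensor lost the field: shift toward the other sensor
    if right_field < FIELD_MIN and left_field >= FIELD_MIN:
        return "SHIFT_LEFT"
    if left_field < FIELD_MIN and right_field < FIELD_MIN:
        return "STOP"          # track lost on both sides: do not execute the first state instruction
    return "STRAIGHT"

for reading in [(0.5, 0.5), (0.1, 0.6), (0.6, 0.05), (0.0, 0.0)]:
    print(reading, steering_correction(*reading))
```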
- This application also discloses a device for controlling live broadcast of an unmanned device. Please refer to FIG. 6, including:
- the obtaining module 100 is configured to obtain a first status instruction to be executed, where the first status instruction is used to control a movement trajectory of the unmanned device;
- Processing module 200 used for the unmanned device to move on a preset path according to the movement trajectory, and to obtain a spatial video of the unmanned device in the traveling direction through an image sensor on the unmanned device information;
- the execution module 300 is used for the unmanned device to push the spatial video information to a server.
- the first instruction includes instruction information for controlling a moving direction. That is, the unmanned device is controlled to move in a specified direction. Further, a direction input unit is provided on the remote terminal, and the unmanned device can be controlled to move through the direction input unit or by setting related direction parameters. The user can send any direction instruction that allows the unmanned device to move, and controls the unmanned device to move according to the user's wishes.
- There are various types of spatial video information, including a ranging light spot image projected in the traveling direction and an image of the surrounding environment at the current position.
- After the processing module 200 receives the first state instruction, it can obtain the ranging light spot image and the image of the surrounding environment at the current position simultaneously, or obtain the ranging light spot image first and then, under certain conditions, obtain the image of the surroundings of the current location.
- In the simultaneous case, the imaging device used to capture the ranging light spot image and the imaging device used to image the surroundings at the current location are usually two different devices, so that the two capture different, complementary images.
- In the case where the two types of images are acquired separately, a single camera device may be shared.
- The shared camera first captures the ranging light spot image to measure the distance between the current position and the obstacle, ensuring that the unmanned device operates within a safe distance range.
- When the unmanned device has stopped, or the current direction of travel is confirmed to be safe, it then photographs the surroundings of the current location.
- Any of the captured image information can be sent to the server for storage in a streaming manner through the execution module 300, or sent to a remote client for real-time viewing through the server.
- the automatic traveling device further includes:
- a first acquisition module, configured to acquire a first parameter value relative to an obstacle;
- a first processing module, configured to determine whether the first parameter value meets a first preset condition; and
- a first execution module, configured to execute a preset second state instruction when the first parameter value meets the first preset condition.
- When the spatial video information includes a light spot image of a ranging light spot projected in the traveling direction by a light spot projector on the unmanned device, the step of obtaining the first parameter value relative to the obstacle specifically involves:
- the first acquisition module, which acquires a frame picture image of the spatial video information;
- the first processing module, which calculates the view ratio of the light spot image within the frame picture image; and
- the first execution module, which executes the preset second state instruction when the view ratio is greater than the first threshold.
- the above-mentioned first acquisition module, first processing module, and first execution module are mainly used to measure the distance between the unmanned device and the obstacle according to the spot image of the ranging spot in the above-mentioned spatial video information.
- the device for projecting a ranging spot is installed on an unmanned device, and the device for shooting a spot image with a complete spot is also installed on an unmanned device.
- Execution of the second state instruction is controlled through the first distance threshold.
- the second state instruction includes any one of a stop travel instruction, a decrease speed instruction, or a reverse travel instruction. It can be applied to different occasions according to the situation.
- the application occasion and control method are the same as those in the method for controlling live broadcast of an unmanned device.
- the second status instruction further includes: sending warning information.
- When S2/S1 is greater than the preset first threshold, the unmanned device is in danger of colliding with the obstacle ahead or is poorly placed to perform its next action, so it can send warning information to the remote terminal or to the relevant server to raise an alarm, reminding users or administrators to check or to take corrective measures.
- The warning information may include triggering an alarm, dialing an emergency phone number, sending an emergency email, and so on, to improve safety.
- the automatic traveling device further includes:
- a second acquisition module used to acquire a preset travel path and real-time position information
- a second processing module used to determine whether the real-time location information is in the travel path
- the second execution module is configured to prohibit execution of the first state instruction when the real-time location information is not in the travel path.
- If the automatic traveling device were allowed to move arbitrarily at its current position, the layout and terrain of the site might make such movement unsuitable, so a travel path needs to be set according to each site's layout and terrain.
- a travel path can be set for each room.
- the unmanned device starts to work, the real-time position of the unmanned device is obtained, and the travel path of the corresponding room is matched according to the position of the unmanned device.
- If the real-time position is not within the travel path, the current location is an illegal, that is, unsafe, area and the first state instruction is not executed.
- the unmanned device can be controlled to send an alarm signal to remind the current position as an illegal position, and to prompt the staff or remote operator to move the unmanned device to leave the illegal area.
- this application also includes another structure that restricts the unmanned device from executing the first state instruction, including:
- a third acquisition module, configured to acquire a preset travel path and real-time position information;
- a third processing module, configured to judge whether the end position represented by the first state instruction lies within the travel path, according to the boundary distance between the real-time position information and the travel path; and
- a third execution module, configured to prohibit execution of the first state instruction when the end position is not within the travel path.
- This method is similar to the above method, and it is also necessary to preset a travel path and obtain the current position information of the unmanned device to match the corresponding travel path.
- The difference is that the unmanned device must determine, according to the direction indicated by the first state instruction, whether both its current position and the end point of the upcoming movement fall within the boundary of the travel path.
- The first state instruction is executed only when the predicted end point also falls within the travel path. If executing the first state instruction would take the unmanned device out of the travel path, execution of the first state instruction is prohibited, avoiding the risk of the unmanned device running off the path after the instruction is executed.
- The present application also discloses a computer device including a memory and a processor. The memory stores computer-readable instructions which, when executed by the processor, cause the processor to perform the steps of the method for controlling the live broadcast of an unmanned device according to any one of the foregoing embodiments.
- FIG. 7 is a block diagram of the basic structure of a computer device provided in an embodiment of the present application.
- the computer device includes a processor, a non-volatile storage medium, a memory, and a network interface connected through a system bus.
- the non-volatile storage medium of the computer device stores an operating system, a database, and computer-readable instructions.
- the database may store control information sequences.
- When executed by the processor, the computer-readable instructions stored in the non-volatile storage medium can cause the processor to implement a user behavior association prompt method.
- The processor of the computer device provides computing and control capabilities and supports the operation of the entire computer device.
- The memory of the computer device may store computer-readable instructions which, when executed by the processor, cause the processor to perform a user behavior association prompt method.
- The network interface of the computer device is used to connect to and communicate with the terminal.
- FIG. 7 is only a block diagram of part of the structure related to the solution of the present application and does not limit the computer devices to which the solution applies; a specific computer device may include more or fewer parts than shown in the figure, combine certain parts, or arrange the parts differently.
- the computer device receives status information of the prompting behavior sent by the associated client, that is, whether the associated terminal turns on the prompt and whether the user closes the prompt task. By verifying whether the above-mentioned task conditions are met, and then sending a corresponding preset instruction to the associated terminal, so that the associated terminal can perform a corresponding operation according to the preset instruction, thereby achieving effective supervision of the associated terminal.
- the server controls the associated terminal to continue to ring to prevent the problem that the prompt task of the associated terminal is automatically terminated after being executed for a period of time.
- The present application also provides a storage medium storing computer-readable instructions. When the computer-readable instructions are executed by one or more processors, the one or more processors perform the method for controlling the live broadcast of an unmanned device according to any one of the foregoing embodiments.
- the computer program can be stored in a computer-readable storage medium.
- the foregoing storage medium may be a non-volatile storage medium such as a magnetic disk, an optical disc, a read-only memory (ROM), or a random access memory (Random Access Memory, RAM).
Abstract
Disclosed are a method and apparatus for controlling the live broadcast of an unmanned device, a computer device, and a storage medium. The method comprises: the unmanned device obtains a first state instruction to be executed, the first state instruction being used to control the movement trajectory of the unmanned device; the unmanned device moves on a preset path according to the movement trajectory and obtains spatial video information in its traveling direction during travel by means of an image sensor on the device; and the unmanned device sends the spatial video information to a server by streaming. The method and apparatus can be applied to an automatic house-viewing scenario: any user can control the apparatus, have it travel along the predetermined path according to his or her preferences, and capture the related images, so that a house can be viewed freely by remotely operating the device. The user needs no professional training; the control mode of the apparatus is simple, and control is more flexible.
Description
This application claims priority to the Chinese patent application filed with the Chinese Patent Office on May 23, 2018, with application number 201810502117.8 and entitled "Method, Apparatus, Computer Device, and Storage Medium for Controlling Live Broadcast of an Unmanned Device", the entire contents of which are incorporated herein by reference.
The present application relates to the field of remote control technology, and in particular to a method, apparatus, computer device, and storage medium for controlling the live broadcast of an unmanned device.
With the improvement of people's living standards, there is great demand for buying houses, and in large cities many people also rent. Before buying or renting, in order to make sure they can purchase or lease a suitable house, people go to view it in person and decide only after experiencing it themselves.
To make it easier for people to learn about the condition of a house in advance, many housing transaction service providers in the prior art adopt online house viewing: picture information about the relevant houses is first provided on the Internet and, if that information is satisfactory, the supply and demand sides agree on a viewing time and then visit the house accompanied by service personnel provided by an offline service provider, for example a housing agent. This process is costly: it wastes time, labor costs are high, and viewing efficiency is low. In addition, the keys used for unlocking are diverse and there is no unified management, which further increases labor and maintenance costs.
In the prior art, to satisfy customers' desire for dynamic viewing, online video house-viewing technology has appeared, but in this technology a staff member holds a video capture device and moves through the room, sending the data stream to a server over a wireless network, while the client accesses the server to obtain the live data, thereby realizing online house viewing. The prior art also provides an automatic robot house-viewing method in which a robot carries video equipment and the user remotely controls the robot's movement to realize video house viewing. The inventor found that, because the environment inside a house is complex, users without professional training cannot drive unmanned equipment in such an environment, and rash operation can easily damage the robot.
Summary of the Invention
The purpose of this application is to solve at least one of the above technical defects, and in particular to provide a method, apparatus, computer device, and storage medium capable of freely controlling the live broadcast of an unmanned device, so that a terminal can be controlled freely and an image of the environment in which the terminal is located can be viewed.
The present application provides a method for controlling the live broadcast of an unmanned device, including: the unmanned device obtains a first state instruction to be executed, where the first state instruction is used to control a movement trajectory of the unmanned device; the unmanned device moves on a preset path according to the movement trajectory and acquires spatial video information in its traveling direction during travel through an image sensor on the unmanned device; and the unmanned device pushes the spatial video information to a server as a stream.
The present application also discloses an apparatus for controlling the live broadcast of an unmanned device, including: an acquisition module, configured to acquire a first state instruction to be executed, where the first state instruction is used to control a movement trajectory of the unmanned device; a processing module, configured to move the unmanned device on a preset path according to the movement trajectory and to acquire spatial video information in the traveling direction through an image sensor on the unmanned device; and an execution module, configured to push the spatial video information from the unmanned device to a server as a stream.
This application also discloses a computer device including a memory and a processor. The memory stores computer-readable instructions which, when executed by the processor, cause the processor to perform the following steps of the method for controlling the live broadcast of an unmanned device: the unmanned device obtains a first state instruction to be executed, where the first state instruction is used to control a movement trajectory of the unmanned device; the unmanned device moves on a preset path according to the movement trajectory and acquires spatial video information in its traveling direction during travel through an image sensor on the unmanned device; and the unmanned device pushes the spatial video information to a server as a stream.
The present application also discloses a non-volatile storage medium storing computer-readable instructions. When the computer-readable instructions are executed by one or more processors, the one or more processors perform the following steps of a method for controlling the live broadcast of an unmanned device: the unmanned device obtains a first state instruction to be executed, where the first state instruction is used to control a movement trajectory of the unmanned device; the unmanned device moves on a preset path according to the movement trajectory and acquires spatial video information in its traveling direction during travel through an image sensor on the unmanned device; and the unmanned device pushes the spatial video information to a server as a stream.
The method and apparatus for controlling the live broadcast of an unmanned device disclosed in this application make it possible, in an automatic house-viewing scenario, for any user to control the apparatus so that it travels along a preset route according to the user's preferences and captures the relevant images, realizing remote, autonomous house viewing. Users can operate the apparatus without professional control training; the method of use is simple, the overall control scheme is straightforward, and control is more flexible.
FIG. 1 is a flowchart of the method for controlling the live broadcast of an unmanned device in this application;
FIG. 2 is a flowchart of a method by which the unmanned device obtains a distance in this application;
FIG. 3 is a flowchart of a method by which the unmanned device obtains a distance using a ranging light spot in this application;
FIG. 4 is a flowchart of a first implementation of a precondition for executing the first state instruction in this application;
FIG. 5 is a flowchart of a second implementation of a precondition for executing the first state instruction in this application;
FIG. 6 is a schematic diagram of the modules of the apparatus for controlling the live broadcast of an unmanned device in this application; and
FIG. 7 is a block diagram of the basic structure of a computer device of the present application.
To allow anyone to remotely operate a traveling device, perform shooting, and view the captured images in real time, this application provides a method for controlling the live broadcast of an unmanned device. When the method is used in a remote house-viewing scenario, it involves at least two terminals. One is a remote terminal used by the user to view images and the control status and to send control instructions; the remote terminal may be a computer, a notebook, a mobile phone, or another terminal, and the unmanned device may be a robot, an unmanned aerial vehicle, or an unmanned vehicle such as a remote-control car. The other is an unmanned device that is controlled to move and can perform the related shooting actions. The two communicate remotely; the unmanned device analyzes the received instructions against its own motion state and either executes or refuses the first state instruction, thereby achieving remotely controlled house viewing. At the same time, the unmanned device has an automatic obstacle-avoidance function, a high degree of intelligence, and a simple control method.
Specifically, referring to FIG. 1, the method for controlling the live broadcast of an unmanned device based on a remote terminal and an unmanned device includes the following steps:
S100. The unmanned device obtains a first state instruction to be executed, where the first state instruction is used to control a movement trajectory of the unmanned device.
The first instruction is used to control the movement trajectory of the unmanned device, that is, to control the unmanned device to move in a specified direction. Further, a direction input unit is provided on the remote terminal, and the unmanned device can be controlled to move through the direction input unit or by setting related direction parameters. The user can send any direction instruction that makes the unmanned device move, so as to control the unmanned device according to the user's wishes.
S200. The unmanned device moves on a preset path according to the movement trajectory, and acquires spatial video information in its traveling direction during travel through an image sensor on the unmanned device.
There are various types of spatial video information, including a ranging light spot image projected in the traveling direction and an image of the surrounding environment at the current position. This spatial video information is obtained through an image sensor on the unmanned device.
In this application, the ranging light spot image projected in the traveling direction is mainly used to monitor the distance between the unmanned device and an obstacle in the traveling direction so as to avoid a collision, while the image of the surrounding environment at the current position helps the user understand the environment around the unmanned device's location.
The two kinds of spatial video information can be obtained at the same time or in different periods. In the simultaneous case, two different camera devices are used to obtain the ranging light spot image and the image of the surroundings at the current position, respectively. In the time-shared case, a single camera device is usually used: during travel, the ranging light spot image is acquired first to measure the distance between the unmanned device and the obstacle ahead, and only when the unmanned device stops is the environment around its current position photographed to obtain the corresponding image.
In this application, the first instruction includes instruction information for controlling the moving direction. The unmanned device moves automatically according to the first state instruction, so a suitable obstacle-avoidance mechanism is required; obtaining the ranging light spot image is, in effect, one way to avoid obstacles. This application discloses a way of determining the distance between the unmanned device and the obstacle ahead from the ranging light spot image.
S300. The unmanned device pushes the spatial video information to the server as a stream.
Since the spatial video information may be a ranging light spot image projected in the traveling direction or an image of the surrounding environment at the current position, once the relevant spatial video information is obtained it is pushed directly to the server as a stream.
Further, referring to FIG. 2, when the unmanned device moves on the preset path according to the movement trajectory, the method further includes:
S210. Obtain a first parameter value relative to an obstacle.
An obstacle is an object blocking the direction of travel of the unmanned device. The first parameter value describes a relationship between the unmanned device and the obstacle, for example the distance between them.
S220. Determine whether the first parameter value meets a first preset condition.
The first preset condition is a condition on the first parameter value. For example, when the first parameter value is the distance between the unmanned device and the obstacle, a first threshold for that distance can be set so that the device performs the specified action in the direction of travel more reliably; when the first parameter value is smaller than the first threshold, the unmanned device is relatively close to the obstacle. The first preset condition is therefore a relationship between the first parameter value and the first threshold, for example the first parameter value being greater than the first threshold, less than the first threshold, or some other relationship, chosen according to the application scenario.
S230. When the first parameter value meets the first preset condition, execute a preset second state instruction.
Further, in the present application, the second state instruction includes any one of a stop-travel instruction, a reduce-speed instruction, or a reverse-travel instruction. The above three steps are illustrated with a specific embodiment.
For example, when ultrasound is used to obtain the distance between the obstacle and the unmanned device, assume that the first threshold is 1 meter and the first preset condition is that the first parameter value is smaller than the first threshold. When the first parameter value obtained by ultrasound is 0.8 meters, it is smaller than the first threshold and the first preset condition is met, so the unmanned device executes the second state instruction: it immediately stops, reduces its traveling speed, or travels in reverse.
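Purely as an illustrative sketch of steps S210 to S230 (the 1-meter threshold and the choice of a stop instruction are taken from the example above; the type and function names are assumptions, not a prescribed implementation):

```python
from enum import Enum, auto
from typing import Optional

class SecondStateInstruction(Enum):
    STOP = auto()
    SLOW_DOWN = auto()
    REVERSE = auto()

FIRST_THRESHOLD_M = 1.0  # assumed first threshold, from the ultrasonic example

def check_obstacle(distance_m: float) -> Optional[SecondStateInstruction]:
    """S220/S230: if the measured distance violates the first preset condition
    (distance < first threshold), return a second state instruction."""
    if distance_m < FIRST_THRESHOLD_M:
        # Which instruction to issue is scenario-dependent; STOP is used here
        # purely for illustration.
        return SecondStateInstruction.STOP
    return None

# Example: an ultrasonic reading of 0.8 m triggers the second state instruction.
print(check_obstacle(0.8))  # SecondStateInstruction.STOP
```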
In the present application, there are many ways to obtain the first parameter value relative to the obstacle, including laser ranging, visible-light ranging, or other methods. The present application also discloses a ranging method based on a ranging light spot, in which the obtained first parameter value is a parameter value of the ranging light spot.
Referring further to FIG. 3, the specific steps of obtaining the first parameter value with the ranging light spot and executing the preset second state instruction include:
S211. Acquire a frame image of the spatial video information.
Before this step is executed, it should be noted that it applies when the spatial video information is a ranging spot image; the purpose of acquiring a frame image of the spatial video information is to measure the distance between the unmanned device and the obstacle. For the specific ranging method, see step S212.
S212. Calculate the view ratio of the spot image within the frame image.
Normally, the device that emits the light beam is mounted directly on the unmanned device, and the camera that captures the forward spatial image containing the complete light spot is also mounted on the unmanned device. Preferably, the camera and the light-emitting device are placed as close together as possible, side by side, one above the other, or front to back, so as to reduce the tilt angle of the spatial image containing the complete spot. In this embodiment, the device that emits the light beam is a light spot projector; the spot projector and the camera that captures the spot image are stacked to form a ranging device whose measuring direction always points in the direction of travel. The projector casts the light spot onto the obstacle, and the camera captures images of the spot region in real time or at fixed intervals.
After an image containing the complete light spot has been captured, the complete spot area can be obtained through image processing. Because the camera is fixed, the area of the whole captured frame is constant, so the proportion of the frame occupied by the spot can be derived with a fixed algorithm. In this embodiment, the shape of the spot projected by the spot projector is fixed; only the area it projects onto the obstacle varies with distance. It can be understood that when the unmanned device is close to the obstacle the projected spot area is small, and when the spot projector is far from the obstacle the projected spot area is large. Given a fixed projection direction, a function relating the projected spot area to the distance is therefore easily obtained. Correspondingly, with a fixed camera position and shooting angle, the size of the captured picture is constant, and the relationship between the ratio of the spot size to the picture area and the distance between the shooting location and the obstacle can be established with existing techniques. The distance between the unmanned device and the obstacle can thus be obtained by evaluating the proportion of the whole image occupied by the spot region.
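The disclosure does not state how the spot region is segmented from the frame. As a minimal sketch that assumes the projected spot is the brightest region of the frame and uses OpenCV for the image processing (the brightness threshold is an assumption to be calibrated per device):

```python
import cv2
import numpy as np

def spot_view_ratio(frame: np.ndarray, brightness_thresh: int = 240) -> float:
    """Return S2/S1: the fraction of the frame area occupied by the light spot.

    Segmenting the spot as the brightest region is an assumption, not part of
    the original disclosure.
    """
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    _, mask = cv2.threshold(gray, brightness_thresh, 255, cv2.THRESH_BINARY)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return 0.0
    spot_area = max(cv2.contourArea(c) for c in contours)   # S2: spot area
    frame_area = float(gray.shape[0] * gray.shape[1])       # S1: whole frame area
    return spot_area / frame_area
```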
Further, in another embodiment, the distance between the unmanned device and the obstacle may also be measured by emitting visible light, in the same way as the spot ranging described above, or by laser ranging, that is, using the pulse (time-of-flight) method or the phase method to compute the distance between the laser emitter and the obstacle. These are only some of the ranging methods disclosed in this application; the application is not limited to them, and other methods may also be used.
S213. When the view ratio is greater than the first threshold, execute the preset second state instruction.
In this embodiment, the first threshold is a critical value of the view ratio of the spot image in the frame image computed in step S212. As can be seen from step S212, the distance between the unmanned device and the obstacle ahead can be determined from the proportion of the frame occupied by the spot image. When images of the surroundings at the current position are to be captured at the same time, an unmanned device that is too close to an obstacle is not only hindered in turning or moving forward; the obstacle may also block part of the field of view, which is unfavorable for shooting the surroundings. An optimal distance between the unmanned device and the obstacle therefore needs to be set, one that allows the surroundings to be imaged well while ensuring that the unmanned device does not collide with the obstacle.
Through step S212, the view ratio of the spot image in the frame image is obtained, and from it the distance between the current position of the unmanned device and the obstacle ahead; the view ratio corresponding to the critical distance between the unmanned device and the obstacle is the first threshold mentioned above. Assume the total area of the frame image is S1 and the area of the spot image within it is S2; the actual distance L between the unmanned device and the obstacle is then a mapping of the ratio S2/S1 through a distance function f(n), that is, L = (S2/S1) * f(n). In this relationship, f(n) is fixed and the variable is S2/S1, and S2/S1 is inversely related to L: when L is small, S2/S1 is large, and when L is large, S2/S1 is small. Therefore, when S2/S1 exceeds the first threshold, the corresponding distance L is too small, and the preset second state instruction is executed.
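Building on the sketch after step S212, and noting that the form of f(n) is not given in the disclosure, step S213 reduces to comparing the view ratio S2/S1 against the first threshold (the threshold value below is an assumed calibration figure, not a value from the original text):

```python
RATIO_FIRST_THRESHOLD = 0.05  # assumed critical value of S2/S1, set by calibration

def step_s213(frame) -> bool:
    """S212 + S213: compute the view ratio S2/S1 of the current frame and
    return True when the preset second state instruction must be executed."""
    ratio = spot_view_ratio(frame)  # from the sketch after step S212
    # A large ratio means the obstacle is too close (S2/S1 is inverse to L).
    return ratio > RATIO_FIRST_THRESHOLD
```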
In one embodiment, the second state instruction includes any one of stop travel, reduce travel speed, or reverse travel. For example, in one embodiment, when the above ratio exceeds the first threshold, a stop-travel second state instruction is executed to avoid a collision between the unmanned device and the obstacle.
Further, in another embodiment, when the above ratio exceeds the first threshold, the traveling speed is reduced so that the device approaches the obstacle more slowly. In this embodiment a second threshold may further be set so that the unmanned device moves slowly to the position corresponding to the second threshold before stopping, turning, or performing another action. It should be noted that the distance between the unmanned device and the obstacle may be acquired in real time or at a certain frequency. When it is acquired at a certain frequency, the time interval means that by the time the distance to the obstacle ahead is detected it may already be smaller than the preset value. A first threshold can therefore be set: when it is crossed, the traveling speed is reduced and the unmanned device is controlled to adjust its distance to the obstacle with a smaller movement unit and a higher distance-detection frequency, so that it reaches the position corresponding to the second threshold more precisely.
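As a rough sketch of this two-threshold behavior (the thresholds, step sizes, and the read_distance/move_forward/stop callbacks are all assumptions standing in for the device's actual drive interface, which the disclosure does not define):

```python
import time

# Assumed values for illustration only.
FIRST_THRESHOLD_M = 1.0    # start decelerating below this distance
SECOND_THRESHOLD_M = 0.5   # target stand-off distance
COARSE_STEP_M = 0.10
FINE_STEP_M = 0.02

def approach(read_distance, move_forward, stop):
    """Two-threshold approach: coarse steps until the first threshold is
    crossed, then fine steps until the second threshold, then stop."""
    while True:
        d = read_distance()
        if d <= SECOND_THRESHOLD_M:
            stop()
            return
        # Smaller movement unit once the first threshold has been crossed.
        step = FINE_STEP_M if d < FIRST_THRESHOLD_M else COARSE_STEP_M
        move_forward(step)
        time.sleep(0.05)  # poll the distance at a fixed frequency
```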
Further, in another embodiment, when the above ratio exceeds the first threshold, a reverse-travel second state instruction is executed so that the unmanned device moves in the opposite direction while S2/S1 continues to be monitored; only when S2/S1 is equal to or below the first threshold is the unmanned device stopped or made to perform another action. The distance corresponding to the first threshold may be the safest distance, for example a distance suitable for the unmanned device to turn or one at which collisions are unlikely, or it may be the distance best suited to capturing images of the surrounding space: when the unmanned device is too close to an obstacle, the obstruction easily interferes with the capture of spatial images. The first threshold distance is therefore set, and when the distance between the unmanned device and the obstacle falls below it, the unmanned device is controlled to move backwards until that distance is greater than or equal to the critical value, reaching the optimal shooting range so that the device can capture the surroundings more completely and the user can observe the most comprehensive spatial image at the current distance.
Further, the second state instruction also includes sending warning information. When S2/S1 is greater than the preset first threshold, the unmanned device is in danger of colliding with the obstacle ahead or the next action cannot be performed properly, so warning information can be sent to a remote terminal or to the relevant server as an alarm, reminding the user or an administrator to check or to take corrective measures. The warning information may be triggering an alarm device, dialing an emergency call, sending an emergency e-mail, and so on.
In the present application, referring to FIG. 4, before the step of obtaining the spatial video information within the field of view in the direction of travel according to the first state instruction, the method further includes:
S110. Obtain a preset travel path and real-time position information.
This step applies in particular scenarios. For example, in an automated house-viewing system, an unmanned device is controlled remotely to film the environment it is in so that a house can be viewed autonomously. Because a room contains many objects, fully automatic walking and filming by the unmanned device is prone to collisions, and such devices usually follow a fixed route; different customers have different preferences and viewing habits, and a fixed route cannot let a customer freely examine any scene in the current environment. The method of controlling the live broadcast of an unmanned device in this application solves this problem. However, within a room, the layout and fitting-out limit where the unmanned device may move. For example, when the device reaches a balcony whose railing has no glass panels or has gaps, its small size makes it liable to fall through the railing, so it should not travel to the edge of the balcony; a bathroom fitted with a squat toilet may likewise cause the device to fall in; and in a duplex, the stairwell leading to the lower floor may cause the device to slide down the stairs. There are always unsafe places in a room that the unmanned device should not enter, so in this embodiment a travel path adapted to the current environment can further be defined to ensure that the unmanned device moves safely and can still capture complete images of the surroundings.
S120. Determine whether the real-time position information lies within the travel path.
S130. When the real-time position information is not within the travel path, prohibit execution of the first state instruction.
The travel path of the unmanned device is not the same at every location. To suit different room layouts, a travel path can be set for each room. When the unmanned device starts working, its real-time position is obtained and, based on where it is, the travel path of the corresponding room is matched. When a first state instruction is received, the system first determines whether the unmanned device is currently within the travel path; if so, it moves in the direction indicated by the first state instruction; if not, the device is in an illegal, that is, unsafe, area and the first state instruction is not executed. Further, the unmanned device can be controlled to send an alarm signal indicating that the current position is illegal, prompting staff or a remote operator to move the device out of the illegal area.
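One possible reading of steps S110 to S130, sketched with the shapely geometry library and assuming the preset travel path is stored as a polygon of permitted floor area for the matched room (the disclosure does not fix a path representation):

```python
from shapely.geometry import Point, Polygon

# Assumed representation of the preset travel path: a 5 m x 4 m permitted area.
ROOM_PATH = Polygon([(0, 0), (5, 0), (5, 4), (0, 4)])

def may_execute_first_state_instruction(x: float, y: float) -> bool:
    """S120/S130: allow the first state instruction only when the real-time
    position lies within the preset travel path."""
    return ROOM_PATH.contains(Point(x, y))

# Example: a position outside the permitted area blocks the instruction.
print(may_execute_first_state_instruction(6.0, 1.0))  # False
```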
Further, referring to FIG. 5, in the present application another way to restrict the unmanned device from executing the first state instruction is:
S140. Obtain a preset travel path and real-time position information.
S150. Based on the distance between the real-time position and the boundary of the travel path, predict whether the end position represented by the first state instruction lies within the travel path.
S160. When the end position is not within the travel path, prohibit execution of the first state instruction.
This approach is similar to the previous one: a travel path must also be preset, and the current position information of the unmanned device is obtained and matched to the corresponding travel path. The difference is that the unmanned device must, based on the direction indicated by the first state instruction, determine whether its current position and the end position it is about to travel to both fall within the boundary of the travel path. For example, when the first state instruction moves the device forward at 10 cm/s for 10 seconds, the distance covered is 100 cm; before executing the instruction, the device first predicts whether the end position reached after moving 100 cm forward still lies within the preset travel path. Only when the end point of the upcoming movement also falls within the travel path is the first state instruction executed. If executing the first state instruction would take the unmanned device off the travel path, execution is prohibited, so that the device is not put at operational risk after executing the instruction.
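A corresponding sketch of the end-position prediction in steps S150 and S160, again assuming a polygonal travel path and a heading-plus-speed form of the first state instruction (these representations are assumptions for illustration only):

```python
import math
from shapely.geometry import Point, Polygon

ROOM_PATH = Polygon([(0, 0), (5, 0), (5, 4), (0, 4)])  # assumed permitted area

def endpoint_within_path(x: float, y: float, heading_rad: float,
                         speed_m_s: float, duration_s: float) -> bool:
    """S150/S160: predict the end position implied by the first state instruction
    (speed x duration along the commanded heading) and check it against the path."""
    distance = speed_m_s * duration_s
    end = Point(x + distance * math.cos(heading_rad),
                y + distance * math.sin(heading_rad))
    return ROOM_PATH.contains(end)

# Example from the description: 0.10 m/s for 10 s is a 1 m displacement.
print(endpoint_within_path(4.5, 2.0, 0.0, 0.10, 10.0))  # False: would leave the area
```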
Further, the travel path mentioned in the two schemes above is a preset electronic path that can be transmitted to the moving device by data transfer. However, in this application the preset travel path is not limited to such an electronic path; it may also be an electromagnetic track laid out in the house. The moving unmanned device is fitted with Hall sensors for detecting the electromagnetic track, and these sensors are used to detect whether the device is on the track and, in turn, whether the first state instruction should be executed. Specifically, when the unmanned device is a drone, Hall sensors are mounted on both sides of it; when the sensor on one side cannot detect the electromagnetic field, or the detected field is too weak, the unmanned robot shifts towards the opposite side until the electromagnetic field signal is received, making house viewing more convenient and intelligent through this form of automatic driving.
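A minimal sketch of the Hall-sensor correction rule described above (field-strength units, the threshold, and the returned action names are assumptions; the disclosure only states the qualitative behavior):

```python
# Assumed minimum usable field strength reported by a Hall sensor.
FIELD_MIN = 0.2

def track_correction(left_field: float, right_field: float) -> str:
    """Steer back onto the electromagnetic track: drift away from the side
    whose Hall sensor loses the field, as the description suggests."""
    if left_field < FIELD_MIN and right_field < FIELD_MIN:
        return "stop"  # off the track entirely; do not execute the first state instruction
    if left_field < FIELD_MIN:
        return "shift_right"
    if right_field < FIELD_MIN:
        return "shift_left"
    return "keep_course"
```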
The present application also discloses an apparatus for controlling the live broadcast of an unmanned device; referring to FIG. 6, it includes:
Obtaining module 100: configured to obtain a first state instruction to be executed, where the first state instruction is used to control the movement trajectory of the unmanned device;
Processing module 200: configured to have the unmanned device move on a preset path according to the movement trajectory and obtain, through the image sensor on the unmanned device, the spatial video information in the direction of travel while the device is moving;
Execution module 300: configured to have the unmanned device push the spatial video information to the server as a stream.
The first state instruction includes instruction information for controlling the moving direction, that is, the unmanned device is controlled to move in a specified direction. Further, a direction input unit is provided on the remote terminal, and the unmanned device can be controlled to move through this unit or by setting relevant direction parameters. The user can send any direction instruction that makes the unmanned device move, controlling it to move as the user wishes.
There are several kinds of spatial video information, including the ranging spot image projected in the direction of travel and the image of the surroundings at the current position. When the processing module 200 receives the first state instruction, it may obtain the ranging spot image and the image of the surroundings at the same time, or obtain the ranging spot image first and the image of the surroundings only under certain conditions. Simultaneous acquisition usually means that the camera used for the ranging spot image and the camera used for the surroundings are two different devices, so that each captures a different image without interfering with the other. Acquiring the two kinds of images separately corresponds to the case where a single camera is shared. In that case, the ranging spot image is obtained first to measure the distance between the current position and the obstacle and ensure that the unmanned device operates within a safe distance; only when the unmanned device stops, or the current direction of travel is confirmed to be safe, does it capture images of the surroundings of the current position.
Any of the captured image information above can be pushed by the execution module 300 to the server as a stream for storage, or forwarded by the server to a remote client for real-time viewing.
Further, the automatic traveling apparatus also includes:
First obtaining module: configured to obtain a first parameter value relative to an obstacle;
First processing module: configured to determine whether the first parameter value meets a first preset condition;
First execution module: configured to execute a preset second state instruction when the first parameter value meets the first preset condition.
Further, the spatial video information includes a spot image of the ranging light spot projected in the direction of travel by the spot projector on the unmanned device, and the step of obtaining the first parameter value relative to the obstacle specifically includes:
acquiring a frame image of the spatial video information;
calculating the view ratio of the spot image within the frame image.
Combining the above steps, the first obtaining module is configured to acquire a frame image of the spatial video information; the first processing module is configured to calculate the view ratio of the spot image within the frame image; and the first execution module is configured to execute the preset second state instruction when the view ratio is greater than the first threshold.
The first obtaining module, first processing module, and first execution module described above are mainly used to measure the distance between the unmanned device and the obstacle from the spot image of the ranging light spot in the spatial video information.
The device that projects the ranging spot is mounted on the unmanned device, as is the device that captures the image containing the complete spot. The closer the unmanned device is to the obstacle ahead, the larger the proportion of the captured frame occupied by the complete spot. Based on this relationship, a first threshold can be set for the proportion of the frame occupied by the complete spot, which corresponds to a distance threshold between the unmanned device and the obstacle. This distance threshold is used to control when the unmanned device executes the second state instruction.
The second state instruction includes any one of a stop-travel instruction, a reduce-speed instruction, or a reverse-travel instruction, applied in different situations as required. Its applications and control methods are the same as in the method for controlling the live broadcast of an unmanned device described above.
Further, the second state instruction also includes sending warning information. When S2/S1 is greater than the preset first threshold, the unmanned device is in danger of colliding with the obstacle ahead or the next action cannot be performed properly, so warning information can be sent to a remote terminal or the relevant server as an alarm, reminding the user or an administrator to check or take corrective measures. The warning information may be triggering an alarm device, dialing an emergency call, sending an emergency e-mail, and so on, to improve safety.
Further, the automatic traveling apparatus also includes:
Second obtaining module: configured to obtain a preset travel path and real-time position information;
Second processing module: configured to determine whether the real-time position information lies within the travel path;
Second execution module: configured to prohibit execution of the first state instruction when the real-time position information is not within the travel path.
If the automatic traveling apparatus were allowed to move arbitrarily at its current location, the layout and terrain of the site might make such movement unsuitable, so a travel path needs to be set for each different site layout and terrain. For example, to suit different room layouts, a travel path can be set for each room. When the unmanned device starts working, its real-time position is obtained and matched to the travel path of the corresponding room. When a first state instruction is received, the system first determines whether the unmanned device is currently within the travel path; if so, it moves in the direction indicated by the first state instruction; if not, the device is in an illegal, that is, unsafe, area and the first state instruction is not executed. Further, the unmanned device can be controlled to send an alarm signal indicating that the current position is illegal, prompting staff or a remote operator to move the device out of the illegal area.
Further, the present application also includes another structure for restricting the unmanned device from executing the first state instruction, including:
Third obtaining module: configured to obtain a preset travel path and real-time position information;
Third processing module: configured to determine, based on the distance between the real-time position information and the boundary of the travel path, whether the end position represented by the first state instruction lies within the travel path;
Third execution module: configured to prohibit execution of the first state instruction when the end position is not within the travel path.
This approach is similar to the previous one: a travel path must also be preset, and the current position information of the unmanned device is obtained and matched to the corresponding travel path. The difference is that the unmanned device must, based on the direction indicated by the first state instruction, determine whether its current position and the end position it is about to travel to fall within the boundary of the travel path. Only when the end point of the upcoming movement also falls within the travel path is the first state instruction executed; if executing the first state instruction would take the unmanned device off the travel path, execution is prohibited, so that the device is not put at operational risk after executing the instruction.
The present application also discloses a computer device including a memory and a processor. The memory stores computer-readable instructions that, when executed by the processor, cause the processor to perform the method for controlling the live broadcast of an unmanned device according to any of the above.
Please refer to FIG. 7 for a block diagram of the basic structure of the computer device provided in an embodiment of the present application.
The computer device includes a processor, a non-volatile storage medium, a memory, and a network interface connected through a system bus. The non-volatile storage medium of the computer device stores an operating system, a database, and computer-readable instructions; the database may store sequences of control information, and when the computer-readable instructions are executed by the processor, the processor implements a user-behavior association prompting method. The processor of the computer device provides computing and control capabilities and supports the operation of the entire computer device. The memory of the computer device may store computer-readable instructions that, when executed by the processor, cause the processor to perform a user-behavior association prompting method. The network interface of the computer device is used to connect and communicate with a terminal. Those skilled in the art will understand that the structure shown in FIG. 7 is only a block diagram of part of the structure relevant to the solution of this application and does not limit the computer devices to which the solution is applied; a specific computer device may include more or fewer components than shown in the figure, combine certain components, or have a different arrangement of components.
The computer device receives the state information of the prompting behavior sent by the associated client, that is, whether the associated terminal has turned on the prompt and whether the user has closed the prompting task. By verifying whether the above task conditions are met, it sends the corresponding preset instruction to the associated terminal so that the associated terminal can perform the corresponding operation according to that instruction, thereby achieving effective supervision of the associated terminal. At the same time, when the state of the prompt information differs from the preset state instruction, the server controls the associated terminal to keep ringing, preventing the prompting task of the associated terminal from terminating automatically after running for a period of time.
The present application also provides a storage medium storing computer-readable instructions that, when executed by one or more processors, cause the one or more processors to perform the method for controlling the live broadcast of an unmanned device according to any of the above embodiments.
A person of ordinary skill in the art will understand that all or part of the processes in the methods of the above embodiments can be implemented by a computer program instructing the relevant hardware. The computer program can be stored in a computer-readable storage medium, and when executed it may include the processes of the embodiments of the methods described above. The storage medium may be a non-volatile storage medium such as a magnetic disk, an optical disc, or a read-only memory (ROM), or a random access memory (RAM).
Claims (20)
- A method for controlling the live broadcast of an unmanned device, comprising: an unmanned device obtaining a first state instruction to be executed, the first state instruction being used to control a movement trajectory of the unmanned device; the unmanned device moving on a preset path according to the movement trajectory and obtaining, through an image sensor on the unmanned device, spatial video information in the direction of travel during the travel process; and the unmanned device pushing the spatial video information to a server as a stream.
- The method for controlling the live broadcast of an unmanned device according to claim 1, wherein when the unmanned device moves on the preset path according to the movement trajectory, the method further comprises: obtaining a first parameter value relative to an obstacle; determining whether the first parameter value meets a first preset condition; and executing a preset second state instruction when the first parameter value meets the first preset condition.
- The method for controlling the live broadcast of an unmanned device according to claim 2, wherein the spatial video information comprises a spot image of a ranging light spot projected in the direction of travel by a spot projector on the unmanned device, and the step of obtaining a first parameter value relative to an obstacle comprises: acquiring a frame image of the spatial video information; and calculating a view ratio of the spot image within the frame image.
- The method for controlling the live broadcast of an unmanned device according to claim 2 or 3, wherein the second state instruction comprises any one of a stop-travel instruction, a reduce-speed instruction, or a reverse-travel instruction.
- The method for controlling the live broadcast of an unmanned device according to claim 1, wherein before executing the first state instruction, the method further comprises: obtaining a preset travel path and real-time position information; determining whether the real-time position information lies within the travel path; and prohibiting execution of the first state instruction when the real-time position information is not within the travel path.
- The method for controlling the live broadcast of an unmanned device according to claim 1, wherein before executing the first state instruction, the method further comprises: obtaining a preset travel path and real-time position information; determining, based on a boundary distance between the real-time position information and the travel path, whether an end position represented by the first state instruction lies within the travel path; and prohibiting execution of the first state instruction when the end position is not within the travel path.
- An apparatus for controlling the live broadcast of an unmanned device, comprising: an obtaining module configured to obtain a first state instruction to be executed, the first state instruction being used to control a movement trajectory of the unmanned device; a processing module configured to have the unmanned device move on a preset path according to the movement trajectory and obtain, through an image sensor on the unmanned device, spatial video information in the direction of travel during the travel process; and an execution module configured to have the unmanned device push the spatial video information to a server as a stream.
- The apparatus for controlling the live broadcast of an unmanned device according to claim 7, further comprising: a first obtaining module configured to obtain a first parameter value relative to an obstacle; a first processing module configured to determine whether the first parameter value meets a first preset condition; and a first execution module configured to execute a preset second state instruction when the first parameter value meets the first preset condition.
- A computer device comprising a memory and a processor, the memory storing computer-readable instructions that, when executed by the processor, cause the processor to perform the following steps of a method for controlling the live broadcast of an unmanned device: an unmanned device obtaining a first state instruction to be executed, the first state instruction being used to control a movement trajectory of the unmanned device; the unmanned device moving on a preset path according to the movement trajectory and obtaining, through an image sensor on the unmanned device, spatial video information in the direction of travel during the travel process; and the unmanned device pushing the spatial video information to a server as a stream.
- The computer device according to claim 9, wherein when the unmanned device moves on the preset path according to the movement trajectory, the steps further comprise: obtaining a first parameter value relative to an obstacle; determining whether the first parameter value meets a first preset condition; and executing a preset second state instruction when the first parameter value meets the first preset condition.
- The computer device according to claim 10, wherein the spatial video information comprises a spot image of a ranging light spot projected in the direction of travel by a spot projector on the unmanned device, and the step of obtaining a first parameter value relative to an obstacle comprises: acquiring a frame image of the spatial video information; and calculating a view ratio of the spot image within the frame image.
- The computer device according to claim 10 or 11, wherein the second state instruction comprises any one of a stop-travel instruction, a reduce-speed instruction, or a reverse-travel instruction.
- The computer device according to claim 9, wherein before executing the first state instruction, the steps further comprise: obtaining a preset travel path and real-time position information; determining whether the real-time position information lies within the travel path; and prohibiting execution of the first state instruction when the real-time position information is not within the travel path.
- The computer device according to claim 9, wherein before executing the first state instruction, the steps further comprise: obtaining a preset travel path and real-time position information; determining, based on a boundary distance between the real-time position information and the travel path, whether an end position represented by the first state instruction lies within the travel path; and prohibiting execution of the first state instruction when the end position is not within the travel path.
- A non-volatile storage medium storing computer-readable instructions that, when executed by one or more processors, cause the one or more processors to perform the following steps of a method for controlling the live broadcast of an unmanned device: an unmanned device obtaining a first state instruction to be executed, the first state instruction being used to control a movement trajectory of the unmanned device; the unmanned device moving on a preset path according to the movement trajectory and obtaining, through an image sensor on the unmanned device, spatial video information in the direction of travel during the travel process; and the unmanned device pushing the spatial video information to a server as a stream.
- The non-volatile storage medium according to claim 15, wherein when the unmanned device moves on the preset path according to the movement trajectory, the steps further comprise: obtaining a first parameter value relative to an obstacle; determining whether the first parameter value meets a first preset condition; and executing a preset second state instruction when the first parameter value meets the first preset condition.
- The non-volatile storage medium according to claim 16, wherein the spatial video information comprises a spot image of a ranging light spot projected in the direction of travel by a spot projector on the unmanned device, and the step of obtaining a first parameter value relative to an obstacle comprises: acquiring a frame image of the spatial video information; and calculating a view ratio of the spot image within the frame image.
- The non-volatile storage medium according to claim 16 or 17, wherein the second state instruction comprises any one of a stop-travel instruction, a reduce-speed instruction, or a reverse-travel instruction.
- The non-volatile storage medium according to claim 15, wherein before executing the first state instruction, the steps further comprise: obtaining a preset travel path and real-time position information; determining whether the real-time position information lies within the travel path; and prohibiting execution of the first state instruction when the real-time position information is not within the travel path.
- The non-volatile storage medium according to claim 15, wherein before executing the first state instruction, the steps further comprise: obtaining a preset travel path and real-time position information; determining, based on a boundary distance between the real-time position information and the travel path, whether an end position represented by the first state instruction lies within the travel path; and prohibiting execution of the first state instruction when the end position is not within the travel path.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201810502117.8A CN108805928B (en) | 2018-05-23 | 2018-05-23 | Method and device for controlling live broadcast of unmanned equipment, computer equipment and storage medium |
CN201810502117.8 | 2018-05-23 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2019223159A1 true WO2019223159A1 (en) | 2019-11-28 |
Family
ID=64092902
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/CN2018/102874 WO2019223159A1 (en) | 2018-05-23 | 2018-08-29 | Method and apparatus for controlling live broadcast of unmanned device, computer device, and storage medium |
Country Status (2)
Country | Link |
---|---|
CN (1) | CN108805928B (en) |
WO (1) | WO2019223159A1 (en) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113848986A (en) * | 2021-11-03 | 2021-12-28 | 广州港集团有限公司 | Unmanned aerial vehicle safety inspection method and system |
CN114494848A (en) * | 2021-12-21 | 2022-05-13 | 重庆特斯联智慧科技股份有限公司 | Robot sight distance path determining method and device |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN115599126A (en) * | 2022-12-15 | 2023-01-13 | 深之蓝海洋科技股份有限公司(Cn) | Automatic collision-prevention wireless remote control unmanned submersible and automatic collision-prevention method |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20150032260A1 (en) * | 2013-07-29 | 2015-01-29 | Samsung Electronics Co., Ltd. | Auto-cleaning system, cleaning robot and method of controlling the cleaning robot |
US20150148959A1 (en) * | 2012-06-08 | 2015-05-28 | Lg Electronics Inc. | Robot cleaner, controlling method of the same, and robot cleaning system |
CN105222760A (en) * | 2015-10-22 | 2016-01-06 | 一飞智控(天津)科技有限公司 | The autonomous obstacle detection system of a kind of unmanned plane based on binocular vision and method |
CN107515606A (en) * | 2017-07-20 | 2017-12-26 | 北京格灵深瞳信息技术有限公司 | Robot implementation method, control method and robot, electronic equipment |
CN107807639A (en) * | 2017-10-20 | 2018-03-16 | 上海犀木信息技术有限公司 | Room mobile robot is seen in a kind of mobile platform, mobile platform system and interior |
Family Cites Families (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN107343153A (en) * | 2017-08-31 | 2017-11-10 | 王修晖 | A kind of image pickup method of unmanned machine, device and unmanned plane |
- 2018-05-23 CN CN201810502117.8A patent/CN108805928B/en active Active
- 2018-08-29 WO PCT/CN2018/102874 patent/WO2019223159A1/en active Application Filing
Patent Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20150148959A1 (en) * | 2012-06-08 | 2015-05-28 | Lg Electronics Inc. | Robot cleaner, controlling method of the same, and robot cleaning system |
US20150032260A1 (en) * | 2013-07-29 | 2015-01-29 | Samsung Electronics Co., Ltd. | Auto-cleaning system, cleaning robot and method of controlling the cleaning robot |
CN105222760A (en) * | 2015-10-22 | 2016-01-06 | 一飞智控(天津)科技有限公司 | The autonomous obstacle detection system of a kind of unmanned plane based on binocular vision and method |
CN107515606A (en) * | 2017-07-20 | 2017-12-26 | 北京格灵深瞳信息技术有限公司 | Robot implementation method, control method and robot, electronic equipment |
CN107807639A (en) * | 2017-10-20 | 2018-03-16 | 上海犀木信息技术有限公司 | Room mobile robot is seen in a kind of mobile platform, mobile platform system and interior |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113848986A (en) * | 2021-11-03 | 2021-12-28 | 广州港集团有限公司 | Unmanned aerial vehicle safety inspection method and system |
CN114494848A (en) * | 2021-12-21 | 2022-05-13 | 重庆特斯联智慧科技股份有限公司 | Robot sight distance path determining method and device |
CN114494848B (en) * | 2021-12-21 | 2024-04-16 | 重庆特斯联智慧科技股份有限公司 | Method and device for determining vision path of robot |
Also Published As
Publication number | Publication date |
---|---|
CN108805928A (en) | 2018-11-13 |
CN108805928B (en) | 2023-04-18 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US12066292B2 (en) | Emergency drone guidance device | |
US11501628B2 (en) | Integrative security system and method | |
EP2571660B1 (en) | Mobile human interface robot | |
US11288936B1 (en) | Emergency incident detection, response, and mitigation using autonomous drones | |
US10665071B2 (en) | System and method for deadzone detection in surveillance camera network | |
WO2019223159A1 (en) | Method and apparatus for controlling live broadcast of unmanned device, computer device, and storage medium | |
US10728505B2 (en) | Monitoring system | |
US20050071046A1 (en) | Surveillance system and surveillance robot | |
US20150073646A1 (en) | Mobile Human Interface Robot | |
KR101543542B1 (en) | Intelligent surveillance system and method of monitoring using the same | |
WO2011146259A2 (en) | Mobile human interface robot | |
WO2020110400A1 (en) | Unmanned aerial vehicle, control method, and program | |
KR101959366B1 (en) | Mutual recognition method between UAV and wireless device | |
GB2509814A (en) | Method of Operating a Mobile Robot | |
US11004317B2 (en) | Moving devices and controlling methods, remote controlling systems and computer products thereof | |
JP2019208197A (en) | Monitoring device, monitoring program, storage medium, and monitoring method | |
WO2017166548A1 (en) | Method and device for controlling stopping of balance car | |
KR20180038871A (en) | Robot for airport and method thereof | |
KR102439548B1 (en) | Power plant monitoring system and method | |
JP2018036847A (en) | Information processing system | |
KR102474706B1 (en) | The Apparatus For Surveillancing | |
KR20190118486A (en) | Mutual recognition method between UAV and wireless device | |
JP2006164120A (en) | Autonomous travel monitor and program |
Legal Events
Date | Code | Title | Description
---|---|---|---
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 18919492; Country of ref document: EP; Kind code of ref document: A1
| NENP | Non-entry into the national phase | Ref country code: DE
| 122 | Ep: pct application non-entry in european phase | Ref document number: 18919492; Country of ref document: EP; Kind code of ref document: A1