US20170235018A1 - Systems and methods for taking, processing, retrieving, and displaying images from unmanned aerial vehicles - Google Patents
- Publication number
- US20170235018A1 (application Ser. No. 15/401,999)
- Authority
- US
- United States
- Prior art keywords
- image
- uav
- images
- aerial images
- geographic location
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
-
- G01V99/005—
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/10—Simultaneous control of position or course in three dimensions
- G05D1/101—Simultaneous control of position or course in three dimensions specially adapted for aircraft
- G05D1/106—Change initiated in response to external conditions, e.g. avoidance of elevated terrain or of no-fly zones
- G05D1/1064—Change initiated in response to external conditions, e.g. avoidance of elevated terrain or of no-fly zones specially adapted for avoiding collisions with other aircraft
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B64—AIRCRAFT; AVIATION; COSMONAUTICS
- B64C—AEROPLANES; HELICOPTERS
- B64C39/00—Aircraft not otherwise provided for
- B64C39/02—Aircraft not otherwise provided for characterised by special use
- B64C39/024—Aircraft not otherwise provided for characterised by special use of the remote controlled vehicle type, i.e. RPV
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B64—AIRCRAFT; AVIATION; COSMONAUTICS
- B64D—EQUIPMENT FOR FITTING IN OR TO AIRCRAFT; FLIGHT SUITS; PARACHUTES; ARRANGEMENT OR MOUNTING OF POWER PLANTS OR PROPULSION TRANSMISSIONS IN AIRCRAFT
- B64D17/00—Parachutes
- B64D17/80—Parachutes in association with aircraft, e.g. for braking thereof
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B64—AIRCRAFT; AVIATION; COSMONAUTICS
- B64D—EQUIPMENT FOR FITTING IN OR TO AIRCRAFT; FLIGHT SUITS; PARACHUTES; ARRANGEMENT OR MOUNTING OF POWER PLANTS OR PROPULSION TRANSMISSIONS IN AIRCRAFT
- B64D45/00—Aircraft indicators or protectors not otherwise provided for
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B64—AIRCRAFT; AVIATION; COSMONAUTICS
- B64D—EQUIPMENT FOR FITTING IN OR TO AIRCRAFT; FLIGHT SUITS; PARACHUTES; ARRANGEMENT OR MOUNTING OF POWER PLANTS OR PROPULSION TRANSMISSIONS IN AIRCRAFT
- B64D47/00—Equipment not otherwise provided for
- B64D47/08—Arrangements of cameras
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B64—AIRCRAFT; AVIATION; COSMONAUTICS
- B64U—UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
- B64U30/00—Means for producing lift; Empennages; Arrangements thereof
- B64U30/20—Rotors; Rotor supports
- B64U30/26—Ducted or shrouded rotors
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01V—GEOPHYSICS; GRAVITATIONAL MEASUREMENTS; DETECTING MASSES OR OBJECTS; TAGS
- G01V20/00—Geomodelling in general
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/0094—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots involving pointing a payload, e.g. camera, weapon, sensor, towards a fixed or moving target
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/10—Simultaneous control of position or course in three dimensions
- G05D1/101—Simultaneous control of position or course in three dimensions specially adapted for aircraft
- G05D1/106—Change initiated in response to external conditions, e.g. avoidance of elevated terrain or of no-fly zones
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/60—Intended control result
- G05D1/606—Compensating for or utilising external environmental conditions, e.g. wind or water currents
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/60—Intended control result
- G05D1/656—Interaction with payloads or external entities
- G05D1/689—Pointing payloads towards fixed or moving targets
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/60—Intended control result
- G05D1/69—Coordinated control of the position or course of two or more vehicles
- G05D1/693—Coordinated control of the position or course of two or more vehicles for avoiding collisions between vehicles
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F16/00—Information retrieval; Database structures therefor; File system structures therefor
- G06F16/50—Information retrieval; Database structures therefor; File system structures therefor of still image data
- G06F16/58—Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually
- G06F16/5866—Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually using information manually generated, e.g. tags, keywords, comments, manually generated location and time information
-
- G06F17/30268—
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B64—AIRCRAFT; AVIATION; COSMONAUTICS
- B64D—EQUIPMENT FOR FITTING IN OR TO AIRCRAFT; FLIGHT SUITS; PARACHUTES; ARRANGEMENT OR MOUNTING OF POWER PLANTS OR PROPULSION TRANSMISSIONS IN AIRCRAFT
- B64D2201/00—Airbags mounted in aircraft for any use
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B64—AIRCRAFT; AVIATION; COSMONAUTICS
- B64U—UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
- B64U10/00—Type of UAV
- B64U10/10—Rotorcrafts
- B64U10/13—Flying platforms
- B64U10/14—Flying platforms with four distinct rotor axes, e.g. quadcopters
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B64—AIRCRAFT; AVIATION; COSMONAUTICS
- B64U—UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
- B64U10/00—Type of UAV
- B64U10/25—Fixed-wing aircraft
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B64—AIRCRAFT; AVIATION; COSMONAUTICS
- B64U—UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
- B64U10/00—Type of UAV
- B64U10/50—Glider-type UAVs, e.g. with parachute, parasail or kite
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B64—AIRCRAFT; AVIATION; COSMONAUTICS
- B64U—UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
- B64U10/00—Type of UAV
- B64U10/60—Tethered aircraft
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B64—AIRCRAFT; AVIATION; COSMONAUTICS
- B64U—UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
- B64U20/00—Constructional aspects of UAVs
- B64U20/80—Arrangement of on-board electronics, e.g. avionics systems or wiring
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B64—AIRCRAFT; AVIATION; COSMONAUTICS
- B64U—UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
- B64U2101/00—UAVs specially adapted for particular uses or applications
- B64U2101/30—UAVs specially adapted for particular uses or applications for imaging, photography or videography
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B64—AIRCRAFT; AVIATION; COSMONAUTICS
- B64U—UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
- B64U2201/00—UAVs characterised by their flight controls
- B64U2201/20—Remote controls
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
- G06F3/04817—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance using icons
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10032—Satellite or aerial image; Remote sensing
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/70—Determining position or orientation of objects or cameras
Definitions
- Unmanned aerial systems typically include unmanned aerial vehicles (UAV) that do not carry a human operator, but instead operate partially or completely autonomously and/or are remotely piloted.
- UAS: unmanned aerial systems
- Unmanned aerial vehicles may be used to capture images from one or more onboard image capture device and/or capture sensor data from one or more onboard sensor.
- the images or sensor data may have embedded metadata.
- metadata from the time the images or sensor data were taken may be available separately from the unmanned aerial system or from an outside source.
- the format and content type of the images, sensor data, and metadata vary widely depending on the type of unmanned aerial vehicle and/or unmanned aerial system.
- the form of transmission of the images, sensor data, and metadata also varies widely from system to system.
- FIG. 1 is a block diagram of an exemplary embodiment of an unmanned aerial system in accordance with the present disclosure.
- FIG. 2 is a perspective view of an exemplary embodiment of an unmanned aerial system in accordance with the present disclosure.
- FIG. 3 is an illustration of another exemplary embodiment of an unmanned aerial system in accordance with the present disclosure.
- FIG. 4 is an illustration of yet another exemplary embodiment of an unmanned aerial system in accordance with the present disclosure.
- FIG. 5 is a block diagram of an exemplary embodiment of an unmanned aerial vehicle in accordance with the present disclosure.
- FIG. 6 is a block diagram of an exemplary embodiment of integrated components of an unmanned aerial vehicle in accordance with the present disclosure.
- FIG. 7 is a block diagram of another exemplary embodiment of integrated components of an unmanned aerial vehicle in accordance with the present disclosure.
- FIG. 8 is a block diagram of yet another exemplary embodiment of integrated components of an unmanned aerial vehicle in accordance with the present disclosure.
- FIG. 9 is an illustration of an exemplary embodiment of multiple unmanned aerial vehicles in accordance with the present disclosure.
- FIG. 10 is an illustration of an exemplary embodiment of yet another unmanned aerial vehicle in accordance with the present disclosure.
- FIG. 11 is an illustration of the unmanned aerial vehicle of FIG. 10 in which air bladders are deployed.
- FIG. 12 is a block diagram of an exemplary embodiment of a bladder system in accordance with the present disclosure.
- FIG. 13 is an illustration of exemplary embodiments of unmanned aerial vehicles having closed loop sensors in accordance with the present disclosure.
- FIG. 14 is a block diagram of an exemplary embodiment of a controller of an unmanned aerial vehicle in accordance with the present disclosure.
- FIG. 15 is a block diagram of an exemplary embodiment of a power system of an unmanned aerial vehicle in accordance with the present disclosure.
- FIG. 16 is an illustration of exemplary embodiments of unmanned aerial vehicles having propeller guards in accordance with the present disclosure.
- FIG. 17 is a block diagram of an exemplary embodiment of an avionics system of an unmanned aerial vehicle in accordance with the present disclosure.
- FIG. 18 is an illustration of exemplary embodiments of unmanned aerial vehicles in accordance with the present disclosure.
- FIG. 19 is a block diagram of an exemplary embodiment of a remote station of an exemplary unmanned aerial system in accordance with the present disclosure.
- FIG. 20 is a top plan view of an exemplary geographic location.
- FIG. 21 is an illustration of an exemplary embodiment of boundary marking of the exemplary geographic location of FIG. 20 .
- FIG. 22 is an illustration of another exemplary embodiment of boundary marking of the exemplary geographic location of FIG. 20 .
- FIG. 23 is an illustration of an exemplary embodiment of an unmanned aerial vehicle in use in the exemplary geographic location of FIG. 20 .
- FIG. 24 is a front view of the unmanned aerial vehicle in use of FIG. 23 .
- FIG. 25 is a block diagram of an exemplary embodiment of an image location system in accordance with the present disclosure.
- FIG. 26 is an exemplary embodiment of an overview image in accordance with the present disclosure.
- FIG. 27 is an exemplary embodiment of an overview image having thumbnail images in accordance with the present disclosure.
- FIG. 28 is an exemplary embodiment of an overview image having icons in accordance with the present disclosure.
- FIG. 29 is an exemplary embodiment of an image in accordance with the present disclosure.
- FIG. 30 is an exemplary embodiment of a user search area on an image in accordance with the present disclosure.
- FIG. 31 is an exemplary embodiment of a returned image based on the search area of FIG. 30 in accordance with the present disclosure.
- FIG. 32 is an exemplary embodiment of another returned image based on the search area of FIG. 30 in accordance with the present disclosure.
- FIG. 33 is an exemplary embodiment of another user search area on an image in accordance with the present disclosure.
- FIG. 34 is an exemplary embodiment of another user search area on a three dimensional model in accordance with the present disclosure.
- FIG. 35 is an exemplary embodiment of another user search area on a three dimensional model in accordance with the present disclosure.
- FIG. 36 is an exemplary embodiment of a user search point on a three dimensional model in accordance with the present disclosure.
- FIG. 37 is an exemplary embodiment of a user selected wall on a three dimensional model in accordance with the present disclosure.
- FIG. 38 is an exemplary embodiment of a user search area on a two dimensional model in accordance with the present disclosure.
- the terms “comprises,” “comprising,” “includes,” “including,” “has,” “having” or any other variation thereof, are intended to cover a non-exclusive inclusion.
- a process, method, article, or apparatus that comprises a list of elements is not necessarily limited to only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus.
- “or” refers to an inclusive “or” and not to an exclusive “or”. For example, a condition A or B is satisfied by any one of the following: A is true (or present) and B is false (or not present), A is false (or not present) and B is true (or present), and both A and B are true (or present).
- qualifiers like “substantially,” “about,” “approximately,” and combinations and variations thereof, are intended to include not only the exact amount or value that they qualify, but also some slight deviations therefrom, which may be due to manufacturing tolerances, measurement error, wear and tear, stresses exerted on various parts, and combinations thereof, for example.
- any reference to “one embodiment” or “an embodiment” means that a particular element, feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment.
- the appearances of the phrase “in one embodiment” in various places in the specification are not necessarily all referring to the same embodiment.
- a method for capturing aerial images comprises: determining, with at least one of a controller of an unmanned aerial vehicle and a processor of a remote station, a flight plan of the unmanned aerial vehicle, the flight plan configured such that the unmanned aerial vehicle and fields of view of an image capture device of the unmanned aerial vehicle are restricted to an area within boundaries of a geographic location identified by coordinates of the geographic location; executing, with the unmanned aerial vehicle, the flight plan; and capturing, with the image capture device, one or more aerial images solely within the boundaries of the geographic location while executing the flight plan.
- executing the flight plan is carried out automatically by the controller of the unmanned aerial vehicle.
- executing the flight plan is at least partially carried out by an operator utilizing a human-machine interface module of the remote station, and further comprising: receiving, by the remote station from a communications system of the unmanned aerial vehicle, one or more first non-transitory signal indicative of position of the unmanned aerial vehicle; and transmitting, from the remote station to the communications system of the unmanned aerial vehicle, one or more second non-transitory signal indicative of instructions for navigation of the unmanned aerial vehicle to maintain the unmanned aerial vehicle within the boundaries.
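The boundary restriction described in the method above can be sketched in code. This is a hypothetical illustration, not the patent's implementation: it validates that every waypoint of a planned flight lies inside the polygon of geographic boundary coordinates, using the even-odd (ray casting) point-in-polygon rule. All function and variable names are invented for the example.

```python
# Hypothetical sketch: validating that every waypoint of a planned flight
# stays inside the polygon of geographic boundary coordinates, as the
# claimed flight plan requires. Names are illustrative, not the patent's.

def point_in_polygon(lon, lat, polygon):
    """Return True if (lon, lat) lies inside `polygon`, a list of
    (lon, lat) vertices, using the even-odd (ray casting) rule."""
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        # Does this edge straddle the horizontal ray from the point?
        if (y1 > lat) != (y2 > lat):
            x_cross = x1 + (lat - y1) * (x2 - x1) / (y2 - y1)
            if lon < x_cross:
                inside = not inside
    return inside

def flight_plan_within_boundaries(waypoints, boundary):
    """Reject any plan with a waypoint outside the geographic boundary."""
    return all(point_in_polygon(lon, lat, boundary)
               for lon, lat, _alt in waypoints)

boundary = [(-97.52, 35.47), (-97.51, 35.47), (-97.51, 35.48), (-97.52, 35.48)]
plan = [(-97.515, 35.475, 30.0), (-97.512, 35.478, 30.0)]
print(flight_plan_within_boundaries(plan, boundary))  # True
```

A production system would also need to constrain the camera's field of view, not just the vehicle position, as the claim requires.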
- a method comprises receiving aerial images captured by one or more unmanned aerial vehicle; receiving metadata associated with the aerial images captured by the one or more unmanned aerial vehicle; geo-referencing the aerial images based at least in part on a geographic location of a surface to determine geographic coordinates of pixels of the aerial images; receiving a geographic location from a user; retrieving one or more of the aerial images associated with the geographic location based on the determined geographic coordinates of the pixels; and displaying to the user one or more overview image depicting the geographic location and overlaid with one or more icons indicative of and associated with the retrieved aerial images associated with the geographic location.
- the metadata includes orientation, attitude, and bearing of one or more image capture device that captured the one or more aerial image, and wherein geo-referencing the aerial images based at least in part on a geographic location of a surface to determine geographic coordinates is geo-referencing the aerial images based at least in part on a geographic location of a surface and using the orientation, attitude, and bearing of the image capture device to determine the geographic coordinates of objects depicted in the one or more aerial image.
- the method further comprises receiving a selection from the user of one of the icons; and displaying the retrieved aerial image associated with the icon.
- the geographic location from the user may be in a form of three or more geographic points forming a polygon.
- the method may further comprise creating a three dimensional polygon based on the polygon and a predetermined height dimension; wherein retrieving one or more of the aerial images associated with the geographic location based on the determined geographic coordinates further comprises retrieving one or more of the aerial images associated with the geographic location based on the three dimensional polygon.
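One way to picture the retrieval step in the method above: once each aerial image has been geo-referenced so that its ground footprint is known in longitude/latitude, retrieving images for a user-supplied location reduces to a containment test. The sketch below is illustrative only (not the patent's algorithm); a simple axis-aligned bounding box stands in for the full per-pixel geo-reference, and all names are assumptions.

```python
# Illustrative sketch: querying a catalog of geo-referenced aerial images
# by a user-supplied geographic point. Bounding boxes approximate the
# image footprints for simplicity.

from dataclasses import dataclass

@dataclass
class GeoreferencedImage:
    image_id: str
    # (min_lon, min_lat, max_lon, max_lat) of the geo-referenced footprint
    footprint: tuple

def images_for_location(images, lon, lat):
    """Return the ids of images whose footprint contains (lon, lat)."""
    return [img.image_id
            for img in images
            if img.footprint[0] <= lon <= img.footprint[2]
            and img.footprint[1] <= lat <= img.footprint[3]]

catalog = [
    GeoreferencedImage("img_001", (-97.52, 35.47, -97.50, 35.49)),
    GeoreferencedImage("img_002", (-97.60, 35.40, -97.58, 35.42)),
]
print(images_for_location(catalog, -97.51, 35.48))  # ['img_001']
```

The returned ids would then drive the overview-image icons the method describes: one icon per retrieved image at its footprint location.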
- a method comprises receiving aerial images captured by one or more image capture device on one or more unmanned aerial vehicle, the aerial images depicting only objects above the ground; receiving metadata associated with the one or more image capture device at the time the aerial images were captured, the metadata including latitude and longitude of the one or more image capture device and one or more of altitude, orientation, attitude, and bearing of the one or more image capture device; receiving information indicative of a location of a region of interest; and geolocating one or more of the aerial images, thereby associating one or more of the geolocated aerial images with the region of interest. Geolocating the one or more aerial images may be based at least in part on correlating the information indicative of the location of the region of interest and the metadata associated with the one or more image capture device at the time the aerial images were captured.
- the metadata associated with the one or more image capture device may further include one or more of sensor size of the one or more image capture device, focal length of the one or more image capture device, pixel pitch of the one or more image capture device, and distortion parameters of the one or more image capture device.
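A hedged sketch of the correlation step described above: project the camera's look direction to the ground (flat-earth approximation) from its latitude, longitude, altitude, and bearing, and associate the image with the region of interest if the projected ground point falls within a radius of it. The projection model and every name here are illustrative assumptions, not the patent's method.

```python
# Hedged sketch: correlate camera metadata with a region of interest by
# projecting the optical axis to the ground (flat-earth approximation).

import math

EARTH_M_PER_DEG = 111_320.0  # rough metres per degree of latitude

def ground_point(lon, lat, alt_m, bearing_deg, pitch_deg):
    """Project the optical axis to the ground for a camera pitched
    `pitch_deg` below the horizon, heading `bearing_deg` (0 = north)."""
    ground_range = alt_m / math.tan(math.radians(pitch_deg))
    d_north = ground_range * math.cos(math.radians(bearing_deg))
    d_east = ground_range * math.sin(math.radians(bearing_deg))
    return (lon + d_east / (EARTH_M_PER_DEG * math.cos(math.radians(lat))),
            lat + d_north / EARTH_M_PER_DEG)

def near_roi(pt, roi, radius_m=50.0):
    """True if ground point `pt` lies within `radius_m` of the ROI."""
    dx = (pt[0] - roi[0]) * EARTH_M_PER_DEG * math.cos(math.radians(roi[1]))
    dy = (pt[1] - roi[1]) * EARTH_M_PER_DEG
    return math.hypot(dx, dy) <= radius_m

camera = dict(lon=-97.515, lat=35.475, alt_m=40.0, bearing_deg=90.0, pitch_deg=45.0)
pt = ground_point(camera["lon"], camera["lat"], camera["alt_m"],
                  camera["bearing_deg"], camera["pitch_deg"])
print(near_roi(pt, roi=(-97.5146, 35.475)))  # True
```

The additional intrinsics mentioned (sensor size, focal length, pixel pitch, distortion) would refine this single-axis projection into a full per-pixel footprint.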
- a method comprises receiving aerial images captured by one or more unmanned aerial vehicle and time data indicative of a time the aerial images were captured; receiving metadata captured by the one or more unmanned aerial vehicle including time data indicative of when the metadata was captured; associating the metadata with the aerial images based at least in part on matching the time data of the metadata with the time data of the aerial images; geo-referencing the aerial images based on a geographic location of a surface to determine geographic coordinates of pixels for ground locations and objects depicted in the aerial images; receiving a geographic location from a user; retrieving one or more of the aerial images associated with the geographic location based on the determined geographic coordinates; and displaying to the user one or more overview image depicting the geographic location and overlaid with one or more icons indicative of and associated with the retrieved aerial images associated with the geographic location.
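The time-matching step above can be sketched as follows. This is a minimal, assumed-names illustration of associating separately logged metadata records with images by capture timestamp: each image is paired with the metadata record whose timestamp is closest, within a tolerance.

```python
# Minimal sketch (assumed names): pair each image with the metadata
# record whose timestamp is nearest, within a tolerance in seconds.

def associate_by_time(images, metadata, tolerance_s=1.0):
    """images/metadata: lists of dicts with a 't' key (seconds).
    Returns {image_id: metadata record or None}."""
    out = {}
    for img in images:
        best = min(metadata, key=lambda m: abs(m["t"] - img["t"]), default=None)
        if best is not None and abs(best["t"] - img["t"]) <= tolerance_s:
            out[img["id"]] = best
        else:
            out[img["id"]] = None
    return out

imgs = [{"id": "a", "t": 100.2}, {"id": "b", "t": 205.0}]
meta = [{"t": 100.0, "lat": 35.47}, {"t": 200.0, "lat": 35.48}]
print(associate_by_time(imgs, meta))
# {'a': {'t': 100.0, 'lat': 35.47}, 'b': None}
```

Real UAV logs would additionally need clock synchronization between the camera and the flight controller before this matching is meaningful.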
- a method comprises receiving non-standardized metadata captured by an unmanned aerial vehicle and associated with one or more image captured by the unmanned aerial vehicle; transforming the non-standardized metadata into a standardized format; and storing the transformed metadata in a first database associated with the one or more image stored in a second database.
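The transformation step above might look like the following sketch. The alias table and canonical schema are invented for the example; real UAV vendors each emit their own field names, and the patent does not specify a particular mapping.

```python
# Illustrative only: map vendor-specific metadata keys onto a
# standardized schema before storing the metadata alongside the images.

CANONICAL_KEYS = {
    "lat": {"lat", "latitude", "gps_lat"},
    "lon": {"lon", "lng", "longitude", "gps_lon"},
    "alt_m": {"alt", "altitude", "alt_m"},
    "bearing_deg": {"bearing", "heading", "yaw"},
}

def standardize(raw):
    """Map whatever keys the vehicle emitted onto the canonical schema."""
    std = {}
    for canonical, aliases in CANONICAL_KEYS.items():
        for key, value in raw.items():
            if key.lower() in aliases:
                std[canonical] = float(value)
                break
    return std

print(standardize({"GPS_Lat": "35.475", "lng": -97.515, "heading": 182}))
# {'lat': 35.475, 'lon': -97.515, 'bearing_deg': 182.0}
```

The standardized records could then be written to the first database keyed to the image ids held in the second, as the method describes.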
- a method comprises determining, with at least one of a controller of an unmanned aerial vehicle and a processor of a remote station, a flight plan of the unmanned aerial vehicle, the flight plan configured such that the unmanned aerial vehicle and fields of view of an image capture device of the unmanned aerial vehicle are restricted to an area within boundaries of a geographic location identified by coordinates of the geographic location; executing, with the unmanned aerial vehicle, the flight plan; and capturing, with the image capture device, one or more aerial images solely within the boundaries of the geographic location and restricted to fields of view within the boundaries while executing the flight plan. Executing the flight plan may be carried out automatically by the controller of the unmanned aerial vehicle.
- Executing the flight plan may be at least partially carried out by an operator utilizing a human-machine interface module of the remote station, and further comprise receiving, by the remote station, one or more first non-transitory signal indicative of position of the unmanned aerial vehicle; and transmitting, from the remote station to a communications system of the unmanned aerial vehicle, one or more second non-transitory signal indicative of instructions for navigation of the unmanned aerial vehicle to maintain the unmanned aerial vehicle within the boundaries.
- a method comprises receiving aerial images captured by one or more unmanned aerial vehicle; receiving metadata associated with the aerial images captured by the one or more unmanned aerial vehicle; geo-referencing the aerial images based at least in part on a geographic location of a surface to determine geographic coordinates of pixels of the aerial images; receiving a geographic location from a user based on selection by the user of one or more pixels of a first one of the aerial images, the geographic location being above the ground; and retrieving one or more second ones of the aerial images associated with the geographic location from the user based on the determined geographic coordinates of the pixels.
- the geographic location from the user may be in a form of three or more geographic points based on selection by the user of three or more pixels forming a polygon of the first one of the aerial images.
- the first one of the aerial images may include a depiction of a structure and the geographic location from the user in the form of three or more geographic points forming a polygon may be located on the structure in the first one of the aerial images.
- the first one of the aerial images may include a depiction of a structure and the geographic location from the user may be in a form of one or more elements of the structure chosen by the user in the first one of the aerial images.
- the one or more elements of the structure may be chosen from the group consisting of a wall, a roof plane, a roof, a floor, a door, an intersection, a cross-section, and a window.
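A hedged sketch of the "three dimensional polygon" retrieval described in the methods above: the user's 2D polygon is extruded by a height dimension into a prism, and an above-ground geographic point (for example, one selected on a wall) matches if it lies inside the footprint and at or below the height. The containment test and all names are illustrative assumptions.

```python
# Hedged sketch: extrude a 2D footprint polygon into a prism by a height
# dimension and test whether an above-ground point falls inside it.

def point_in_polygon(x, y, polygon):
    """Even-odd (ray casting) containment test for a 2D polygon."""
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        if (y1 > y) != (y2 > y):
            if x < x1 + (y - y1) * (x2 - x1) / (y2 - y1):
                inside = not inside
    return inside

def point_in_prism(lon, lat, alt_m, footprint, height_m):
    """True if an above-ground point lies inside the extruded polygon."""
    return 0.0 <= alt_m <= height_m and point_in_polygon(lon, lat, footprint)

wall_footprint = [(-97.516, 35.474), (-97.514, 35.474),
                  (-97.514, 35.476), (-97.516, 35.476)]
print(point_in_prism(-97.515, 35.475, 6.0, wall_footprint, height_m=10.0))   # True
print(point_in_prism(-97.515, 35.475, 15.0, wall_footprint, height_m=10.0))  # False
```

With image pixels geo-referenced to 3D coordinates, the same test selects which images depict the user-chosen wall, roof plane, or window.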
- FIG. 1 is a block diagram of an exemplary embodiment of an unmanned aerial system (UAS) 10 in accordance with the present disclosure.
- the UAS 10 may comprise one or more Unmanned Aerial Vehicle (UAV) 12 .
- the UAS 10 may further comprise one or more remote station 14 .
- one or more remote operator 16 may interact with the remote station 14 .
- the remote station 14 may serve a range of functions, from simply receiving data from the UAV 12 , up to and including completely controlling all functions of the UAV 12 .
- the UAS 10 may comprise a plurality of UAVs 12 and/or a plurality of remote stations 14 , working in pairs, separately, or together in any combination, for example, as shown in FIG. 2 .
- the UAS 10 may comprise two or more UAVs 12 working in tandem and/or independently.
- the UAS 10 may comprise a transponder system (not shown) configured for transmitting signals to other aircraft, the signals comprising information regarding the UAS 10 and/or location of the UAS 10 or UAV 12 .
- the transponder system may be located partially or completely in one or both of the UAV 12 and the remote station 14 .
- the UAS 10 may further comprise a case 17 .
- the case 17 may be used to store and transfer the UAV 12 and/or the remote station 14 . Additionally, or alternately, the case 17 may be part of the remote station 14 . Additionally, or alternately, the case 17 may be used as part of pre-flight check(s). For example, the case 17 may be used to weigh down the UAV 12 for a power-up check and provide targets for a collision detection diagnostic phase.
- the case 17 may also contain actuators (not shown) to move the UAV 12 on the various axes to test how the UAS 10 reacts to changes in attitude.
- the UAV 12 is secured to the case 17 such that the UAV 12 may be moved to allow roll, pitch, and yaw.
- the UAV 12 may be connected to the case 17 via one or more gimbal 21 , a nested gimbal, and/or a gimbal lock.
- a gimbal lock restricts one degree of freedom in a multi-dimensional, multi-gimbal mechanism having “n” gimbals and thus “n” degrees of freedom.
- the gimbal lock restricts the axes of “n- 1 ” gimbals. For example, in a three-gimbal system, two of the three gimbals are driven into a parallel configuration, “locking” the system into rotation in a degenerate two-dimensional space.
- multiple servos and/or motors may rotate the UAV 12 across each degree of freedom (roll, pitch, and yaw) in a series of tests to verify that the correct power is provided to the correct component of the UAV 12 to compensate for the motion, thereby testing flight-worthiness of the UAV 12 .
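The gimbal-lock behavior described above can be demonstrated numerically: with a roll-pitch-yaw (ZYX) rotation sequence, at 90 degrees of pitch the roll and yaw axes become degenerate, so only their difference affects the attitude. A minimal pure-Python sketch; the rotation convention is assumed for illustration:

```python
import math

def rx(a):  # rotation about x (roll)
    c, s = math.cos(a), math.sin(a)
    return [[1, 0, 0], [0, c, -s], [0, s, c]]

def ry(a):  # rotation about y (pitch)
    c, s = math.cos(a), math.sin(a)
    return [[c, 0, s], [0, 1, 0], [-s, 0, c]]

def rz(a):  # rotation about z (yaw)
    c, s = math.cos(a), math.sin(a)
    return [[c, -s, 0], [s, c, 0], [0, 0, 1]]

def matmul(a, b):
    return [[sum(a[i][k] * b[k][j] for k in range(3))
             for j in range(3)] for i in range(3)]

def attitude(roll, pitch, yaw):
    """Composite rotation R = Rz(yaw) @ Ry(pitch) @ Rx(roll)."""
    return matmul(rz(yaw), matmul(ry(pitch), rx(roll)))

def close(a, b, tol=1e-9):
    return all(abs(a[i][j] - b[i][j]) < tol
               for i in range(3) for j in range(3))
```

At pitch = 90°, `attitude(roll, pi/2, yaw)` is unchanged when the same offset is added to both roll and yaw: one degree of freedom has been lost, exactly the "locking" described above.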
- the UAS 10 may further comprise a tether 18 for tethering the UAV 12 to a base 19 .
- the remote station 14 and/or the case 17 may act as the base 19 .
- power may be provided through the tether 18 using step up/step down transformers (not shown).
- the UAS 10 may employ software-based distance and/or altitude limits to limit and/or control the use of the UAV 12 to a control range.
- the operator 16 may set a maximum distance limit in the UAV 12 and/or the remote station 14 so the UAV 12 will not go beyond the control range.
- the operator 16 may set a maximum above ground limit in the UAV 12 and/or remote station 14 so the UAV 12 will not go above a set altitude, for example 400 feet above the ground.
- the maximum altitude limit is set based on Federal Aviation Administration rules.
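The software-based distance and altitude limits described above amount to a simple in-flight check. A minimal sketch, assuming the remote station sits at the origin and using hypothetical limit values (the 400-foot ceiling follows the FAA figure mentioned above):

```python
import math

MAX_DISTANCE_FT = 1000.0   # hypothetical operator-set control range
MAX_ALTITUDE_FT = 400.0    # e.g. maximum height above ground per FAA rules

def within_limits(x_ft, y_ft, altitude_ft):
    """True if the UAV is inside both the horizontal control range
    (measured from the remote station at the origin) and the
    above-ground altitude ceiling."""
    distance = math.hypot(x_ft, y_ft)
    return distance <= MAX_DISTANCE_FT and altitude_ft <= MAX_ALTITUDE_FT
```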
- the remote station 14 and/or the UAV 12 may be programmed with data indicative of a particular type of air space and to restrict the use of the UAV 12 to that particular type of air space.
- the particular type of air space could be “class G” air space to substantially prevent the UAV 12 from interfering with aircraft in another type of air space (such as other air space classes).
- the UAV 12 may automatically return to a predetermined home location and/or to the remote station 14 if there is a failure. For example, if components of the UAV 12 fail or if the signal from the remote station 14 fails, the UAV 12 may automatically return to the home location and/or to the remote station 14 .
- the UAV 12 may comprise an airframe 20 , a controller 22 , a communications system 24 , a power system 26 , a propulsion system 28 , and an avionics system 30 .
- the UAV 12 may comprise a navigation system 32 , or the navigation system 32 may be partially or completely in the remote station 14 .
- the UAV 12 may comprise one or more Electronic Speed Control (ESC) 34 .
- the UAV 12 may comprise one or more power bus 36 .
- the UAV 12 may comprise one or more speed reduction device 150 (for example, as shown in FIG. 18 ).
- the UAV 12 may comprise one or more actuator 38 .
- the UAV 12 may carry a payload 40 .
- components of the UAV 12 may be sized and specified to safely carry the weight of the desired payload 40 , to withstand expected wind forces, and to minimize the weight of the UAV 12 .
- because the kinetic energy of the UAV 12 increases with its weight, a UAV 12 with a reduced weight has less kinetic energy than a heavier UAV 12 , and therefore causes less damage in the event of a crash of the UAV 12 .
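The weight/energy relationship above is just KE = ½mv²: at a given speed, halving the airframe mass halves the crash energy. As a trivial sketch:

```python
def kinetic_energy_joules(mass_kg, speed_m_s):
    """KE = 1/2 * m * v^2; lighter airframes carry less crash energy."""
    return 0.5 * mass_kg * speed_m_s ** 2
```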
- the UAV 12 may comprise one or more image capture device 42 and/or may carry one or more image capture device 42 as part of the payload 40 .
- image capture devices 42 include cameras (capable of detecting visible and non-visible ranges of light), infrared sensors, radar, and sonar.
- the image capture device 42 may capture images 44 .
- the UAV 12 may transmit the images 44 to the remote station 14 and/or to a remote system (not shown), and/or store the images 44 , and/or process (partially or fully) the images 44 onboard the UAV 12 .
- Nonexclusive examples of processing the images 44 may include partially or completely georeferencing one or more images 44 , geolocating one or more images 44 , reviewing one or more images 44 for abnormalities, performing quality control of one or more images 44 , tie-pointing (manual/automatic) to relate adjacent images 44 , bundle adjustments, 3D point cloud generation from 2D images 44 , mosaic generation from the images 44 , and/or color-balancing one or more images 44 .
- components of the UAV 12 may be tightly integrated to reduce size and weight of the UAV 12 .
- the controller 22 , the communications system 24 , the ESCs 34 , the power bus 36 , and/or components of the power system 26 may be integrated into one or more printed circuit board (PCB) 50 or a hybrid PCB and integrated circuit. Wires may be substituted with PCB traces, thus reducing or eliminating the number of wires and connectors required.
- all or some of the payload 40 may be integrated on one or more shared PCB 50 , as shown in FIG. 7 , for example, with the power bus 36 and/or controller 22 .
- components of the power system 26 may be mounted directly to the PCB 50 , along with other components, such as the power bus 36 and/or the controller 22 . Additionally, or alternately, wires may be used to connect the power system 26 to the Electronic Speed Controls 34 .
- any type of aircraft airframe may be used as the basis of the airframe 20 of the UAV 12 .
- types of UAVs 12 having different airframes 20 include a fixed-wing UAV 12 a having a front or rear propeller, a fixed-wing UAV 12 b having multiple wing propellers, a helicopter type UAV 12 c , a multi-rotor UAV 12 d , a tilt-rotor UAV 12 e , a jet-type UAV 12 f , and a blimp-type UAV 12 g .
- the airframe 20 of the UAV 12 g may have a blimp-like design in which the airframe 20 encloses lighter-than-air gas.
- the airframe 20 of the UAV 12 may have one or more control surfaces 60 such as elevators, rudders, flaps, slats, and/or ailerons.
- the control surfaces 60 may have one or more servomechanism (not shown).
- the airframe 20 of the UAV 12 may have attachments to carry the payload 40 and/or the payload 40 may be integrated into the airframe 20 of the UAV 12 .
- the PCB 50 may also form a section of the airframe 20 .
- the airframe 20 may be configured to absorb energy, such as energy generated in a collision.
- the airframe 20 may include padding 70 that meets OSHA 1910.135b and the cited ANSI requirements for head protection.
- the padding 70 may substantially cover one or more exterior surfaces of the airframe 20 .
- the padding 70 may be formed of foam or other appropriate padding material.
- the airframe 20 is completely or partially composed of foam or other appropriate padding material.
- the airframe 20 may include a bladder system 72 having air bladders 74 .
- the air bladders 74 may substantially cover the airframe 20 .
- the air bladders 74 may weigh less than padding 70.
- the air bladders 74 may have an un-inflated state ( FIG. 10 ) and an inflated state ( FIG. 11 ). In the inflated state, the air bladders 74 may encompass all or part of an exterior of the UAV 12 to protect the UAV 12 from impact with other objects, as well as to protect other objects from impact with the UAV 12 .
- the air bladders 74 may be automatically and/or manually (remotely) switched to the inflated state if the UAV 12 is out of control.
- the controller 22 may monitor the power system 26 , the propulsion system 28 , the avionics system 30 , and/or the navigation system 32 .
- the controller 22 may signal the air bladder system 72 to switch the air bladders 74 to the inflated state from the uninflated state.
- the air bladders 74 may be automatically triggered to the inflated state when power is lost to one or more of the systems in the UAV 12 .
- the bladder system 72 may comprise one or more air bladders 74 , a bladder system control 76 , and one or more containers 78 containing compressed gas 79 .
- the air bladders 74 may be inflated with the compressed gas 79 from the containers 78 by the bladder system control 76 .
- the air bladders 74 may be automatically and/or manually (remotely) switched to the inflated state if the UAV 12 is out of control, via the bladder system control 76 .
- the bladder system control 76 may monitor the power system 26 , the propulsion system 28 , the avionics system 30 , and/or the navigation system 32 . If the bladder system control 76 determines the UAV 12 is outside of predetermined parameters for one or more of the systems, the bladder system control 76 may signal the air bladder system 72 to switch the air bladders 74 to the inflated state from the uninflated state.
- the airframe 20 may include both padding 70 and air bladders 74 .
- sections of, or all of, the airframe 20 may be designed to break apart or compress on impact to help absorb the energy of a collision. This might include spring loading, gas loading, compressible materials, or weak points in the airframe 20 that are meant to break and/or collapse during a collision.
- the UAV 12 may comprise a closed loop sensor 80 surrounding at least a portion of the airframe 20 .
- the closed loop sensor 80 comprises an electrical circuit 82 surrounding at least a portion of the airframe 20 .
- the closed loop sensor 80 works to signal the controller 22 if there is a break in the electrical circuit 82 .
- the controller 22 and/or the remote station 14 may shut down the power system 26 and/or emit a warning to the remote operator 16 and anyone in the vicinity.
- the warning may be in any form, non-exclusive examples of which are an audible and/or visual warning.
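The closed-loop sensor logic above reduces to a simple rule: an open perimeter circuit means the airframe 20 is compromised, so cut power and warn. A minimal sketch (the function name and return shape are illustrative, not from the disclosure):

```python
def check_airframe_loop(circuit_closed, power_on):
    """Watchdog for the perimeter closed-loop sensor.

    Returns (power_on, warning): a break in the circuit cuts power to the
    power system and raises a warning for the operator and bystanders."""
    if not circuit_closed:
        return False, "airframe integrity warning"
    return power_on, None
```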
- the controller 22 may control the functions of, and/or receive data from, the communications system 24 , the power system 26 , the propulsion system 28 , the avionics system 30 , the navigation system 32 , and/or the ESC(s) 34 .
- the controller 22 may use data from the avionics system 30 or elsewhere (for example, an airspeed sensor, one or more down facing camera, GPS speed, etc.) to detect and limit the speed of the UAV 12 .
- the controller 22 may contain a maximum speed setting and/or altitude setting for the UAV 12 .
- the controller 22 may include one or more computer processor 90 and/or field-programmable gate array (FPGA) 92 , one or more drive 94 , one or more user input device 96 , and one or more non-transitory memory 98 . In one embodiment, the controller 22 may have an image capture module 100 .
- the computer processors 90 and/or FPGAs 92 may be programmed or hardwired to control the UAV 12 and/or to interpret and carry out commands from the remote station 14 to control the UAV 12 .
- the controller 22 may be configurable to perform specific in-flight functions.
- the controller 22 may receive flight control instructions from the remote station 14 (or elsewhere), control relevant flight control mechanisms (such as through the power system 26 , propulsion system 28 , navigation system 32 , and/or avionics system 30 ), and/or provide feedback information (e.g., telemetry information) to the remote station 14 and/or other device(s).
- the drives 94 and their associated computer storage media such as removable storage media (e.g., CD-ROM, DVD-ROM) and non-removable storage media (e.g., a hard drive disk), may provide storage of computer readable instructions, data structures, program modules and other data.
- the drives 94 may store and include an operating system, application programs, program modules, and one or more database storing various data, nonexclusive examples of which include image data, position data, flight control instructions data, flight path data, past flight data, sensor data, and navigation data.
- the controller 22 further may include one or more user input device 96 , through which a user may enter commands and data.
- input devices 96 may include an electronic digitizer, a microphone, a keyboard, and a pointing device such as a mouse device, trackball device or touch pad device.
- Other input devices 96 may include a joystick device, game pad device, satellite dish, scanner device, or the like.
- the controller 22 may stream data (live or delayed feed) utilizing the communications system 24 to the remote station 14 , or other site(s) or vehicle(s), and/or may store data in the one or more non-transitory memory 98 .
- the controller 22 may transmit real-time video or data to the remote station and/or to points worldwide.
- the UAV 12 may have Internet connectivity (for example, through an Inmarsat satellite) and may transmit data directly over the Internet.
- the image capture module 100 may transmit captured images 44 to the remote station 14 or other device through the communication system 24 , store the captured images 44 in the memory 98 , and/or process the captured images 44 .
- Non-exclusive examples of processing of captured images are described in U.S. Pat. No. 8,477,190, issued Jul. 2, 2013, titled “Real-Time Moving Platform Management System;” U.S. Pat. No. 8,385,672, issued Feb. 26, 2013, titled “System for Detecting Image Abnormalities;” U.S. Pat. No. 7,424,133, issued Sep. 9, 2008, titled “Method and Apparatus for Capturing, Geolocating and Measuring Oblique Images;” and U.S. Patent Publication US20150221079A1, published Aug. 6, 2015, titled “Augmented Three Dimensional Point Collection of Vertical Structures;” all of which are hereby incorporated by reference in their entirety herein.
- the image capture module 100 and/or the remote station 14 may also be used to adjust operational parameters, such as resolution, of the image capture device 42 .
- the image capture module 100 and/or the remote station 14 may transmit one or more signal to the image capture device 42 indicating a change to operational parameters.
- the memory 98 of the controller 22 may comprise, but is not limited to, RAM, ROM, EEPROM, flash memory or other memory technology, CD-ROM, digital versatile disks (DVD) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other medium which may be used to store the desired information and which may be accessed by the controller or the remote station or other remote processor. Any such computer storage media may be part of the controller 22 and/or the remote station 14 .
- the controller 22 may automatically mitigate unexpected flight characteristics.
- unexpected flight characteristics include a ground effect (that is, increased lift-force and decreased aerodynamic drag that wings or rotors generate when they are close to a fixed surface), translational lift (i.e. a transitional state present after a helicopter has moved from hover to forward flight), and vortex ring state (i.e. settling with power, in which a rotary UAV 12 settles into its own downwash causing loss of lift).
- the controller 22 may monitor the power system 26 , propulsion system 28 , navigation system 32 , and/or avionics system 30 , to detect unexpected flight characteristics.
- the controller 22 may implement counter measures to the detected unexpected flight characteristics, such as sending one or more signals to the power system 26 , propulsion system 28 , navigation system 32 , and/or avionics system 30 , to control the flight of the UAV 12 .
- the UAV 12 may comprise one or more Electronic Speed Control (ESC) 34 .
- the ESC(s) 34 may control the operation of the control surfaces 60 of the airframe 20 and/or may control the propulsion system 28 , either in conjunction with or instead of the controller 22 .
- the controller 22 and/or the ESC(s) 34 may control the operation of the actuator 38 to actuate the propulsion system 28 (for example, the rotor blade) and/or the control surfaces 60 of the UAV 12 .
- the ESC 34 may be electrically connected to the controller 22 and the actuator 38 .
- the controller 22 may provide control signals for the ESC 34 , which in turn provides actuator signals to the electrically connected actuator 38 so as to actuate the corresponding component of the propulsion system 28 (such as the rotor) or control surface 60 on the airframe 20 .
- feedback signals can also be provided by the actuator 38 and/or the ESC 34 to the controller 22 .
- the number of ESCs 34 is equal to the number of actuators 38 (such as actuators 38 controlling rotors) of the UAV 12 .
- a 4-rotor UAV 12 d may have four actuators 38 and four ESCs 34 .
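For a 4-rotor UAV 12 d with one ESC 34 per actuator 38 , the controller's throttle/roll/pitch/yaw demands are typically mixed into four per-rotor commands. A sketch under an assumed "X" motor layout and sign convention, neither of which is specified in the disclosure:

```python
def quad_mix(throttle, roll, pitch, yaw):
    """Mix controller demands into four ESC commands in [0, 1].

    Assumed layout: m1 front-left, m2 front-right, m3 rear-right,
    m4 rear-left; signs follow a common (illustrative) convention."""
    m1 = throttle + roll + pitch - yaw
    m2 = throttle - roll + pitch + yaw
    m3 = throttle - roll - pitch - yaw
    m4 = throttle + roll - pitch + yaw
    # Clamp each command to the valid ESC input range.
    return [max(0.0, min(1.0, m)) for m in (m1, m2, m3, m4)]
```

For example, a pure roll demand raises the two left rotors and lowers the two right rotors symmetrically.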
- the number of ESCs 34 may be different (more or less) than the number of actuators 38 .
- the ESC 34 may control the speed of revolution of the power system 26 , such as a motor/generator or an engine.
- the ESCs 34 may be optional. In some embodiments, instead of, or in addition to, the ESCs 34 , other types of actuator controllers can be provided to control the operation of the actuators 38 , and/or the controller 22 may directly control the control surfaces 60 and/or the propulsion system 28 .
- the communications system 24 of the UAV 12 may communicate with an external system, such as the remote station 14 , or other UAVs 12 , UASs 10 , aircraft, or other vehicles (including ground vehicles or satellites). As depicted in FIG. 5 , the communications system 24 may have one or more receiver 110 and one or more transmitter 112 . The communications system 24 may have one or more antenna 114 and one or more attenuator 116 for the antenna(s) 114 . The attenuator 116 may reduce the strength of a signal from or to the antenna 114 . The attenuator 116 may be used for range testing between the UAV 12 and the remote station 14 .
- An interlock may be used to prevent the UAV 12 from taking off with the attenuator 116 in place.
- the interlock is a device that makes the state of two mechanisms mutually dependent.
- a sensor is configured to detect that the attenuator 116 is in place. If the attenuator 116 is in place, the UAV 12 is prevented from flying (or flying beyond a predetermined distance) to prevent the UAV 12 from flying beyond the range of the controller 22 with the attenuator 116 attached to the UAV 12 .
- the attenuator 116 may also be affixed to the case 17 such that, when the UAV 12 is removed from the case 17 , the attenuator 116 is effectively removed.
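The attenuator interlock described above can be expressed as a simple rule: while the attenuator 116 is detected, the UAV 12 is grounded (or held to a short test range). A minimal sketch with a hypothetical control-range value:

```python
def allowed_range_ft(attenuator_in_place, normal_range_ft=1000.0):
    """Interlock sketch: with the attenuator attached, the allowed flight
    range is zero (grounded); otherwise the full control range applies.
    The 1000 ft default is an illustrative assumption."""
    return 0.0 if attenuator_in_place else normal_range_ft
```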
- the antenna 114 may transmit/receive one or more signal to/from the communications system 24 to communicate with the remote station 14 and/or other UAV 12 , aircraft, and/or vehicles.
- Non-exclusive examples of communications systems are described in U.S. Pat. No. 8,477,190, issued Jul. 2, 2013, titled “Real-Time Moving Platform Management System,” which is hereby incorporated by reference in its entirety.
- the power system 26 may comprise one or more power generation and/or storage devices.
- the power system 26 may comprise one or more motor 118 or engine (not shown).
- the engine may be a piston engine or a jet.
- the power system 26 may comprise one or more generator and/or a solar power system (not shown) for generating power to supply to the motor 118 or other components of the UAV 12 .
- the power system 26 may comprise one or more fuel cell (not shown) for generating electrical energy to supply to the motor 118 of the power system 26 .
- the motor 118 of the power system 26 may have a light-weight housing 120 made of plastic, or other low-weight material.
- a motor with a plastic housing may be used (e.g. Emax PM2212 920 KV, Plastic Brushless Motor).
- the housing 120 of the motor 118 may be integrated with the airframe 20 and/or a part of the airframe 20 .
- the housing 120 may be molded or printed into the airframe 20 , such that fewer or no fasteners (such as screws) are needed to secure the motor(s) 118 to the airframe 20 , thus eliminating the potential failure of the fasteners.
- the power system 26 may also comprise one or more battery 122 sized to provide power for the desired task set for the UAV 12 . Capacity of the battery 122 may be sized for the task with a margin for error. Typically, for small UAVs 12 , the battery 122 may make up a significant portion of the weight of the UAV 12 . In one embodiment, multiple batteries 122 may be used in conjunction with a base station, such as the remote station 14 , such that the UAV 12 can fly back to the remote station 14 and switch out the battery 122 . In one embodiment, the battery 122 may be charged from the remote station 14 . The battery 122 may be automatically exchanged for another battery at the remote station 14 .
- the one or more battery 122 may directly plug into a socket of the PCB 50 so there are no wires between the battery 122 and the power bus 36 , thus eliminating the added weight of wiring between the battery 122 and the power bus 36 .
- the controller 22 and/or the remote station 14 may monitor voltage of the battery 122 to help determine the remaining capacity of the battery 122 .
- Total battery power output may be monitored (both volts and amps) to determine the total power drain from the battery 122 .
- Batteries 122 may have a built-in check (not shown) so the operator 16 can easily check the power level of the battery 122 .
- the built-in check may be a push-button with visual or audible indicators of the level of power of the battery 122 .
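Estimating remaining capacity from battery voltage, as described above, is commonly done by interpolating along a per-cell discharge curve. The table values below are illustrative only; a real pack needs a measured curve plus load (amps) compensation, which is why the disclosure monitors both volts and amps:

```python
# Rough per-cell LiPo voltage -> remaining-capacity table.
# These breakpoints are illustrative assumptions, not measured data.
CURVE = [(3.3, 0.0), (3.7, 0.2), (3.85, 0.5), (4.0, 0.8), (4.2, 1.0)]

def remaining_fraction(cell_voltage):
    """Piecewise-linear interpolation of remaining capacity (0.0 to 1.0)."""
    if cell_voltage <= CURVE[0][0]:
        return 0.0
    if cell_voltage >= CURVE[-1][0]:
        return 1.0
    for (v0, f0), (v1, f1) in zip(CURVE, CURVE[1:]):
        if v0 <= cell_voltage <= v1:
            return f0 + (f1 - f0) * (cell_voltage - v0) / (v1 - v0)
```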
- the controller 22 and/or the remote station 14 may shut down the power system 26 or components of the power system 26 , such as the one or more motors 118 ( FIG. 15 ), in the event of a malfunction.
- the controller 22 and/or remote station 14 may shut down the power system 26 when an impact is detected, such as by an accelerometer; or when there is a disruption in the closed loop sensor 80 surrounding the airframe 20 indicating the airframe 20 has been compromised.
- the propulsion system 28 may comprise one or more propulsion device 130 , including a combination of different types of propulsion devices 130 .
- the one or more propulsion device 130 of the UAV 12 f may be a jet engine.
- the one or more propulsion device 130 may comprise one or more rotor 132 .
- the term “rotor” as used herein refers to a hub with a number of rotating air foils or blades.
- the rotor 132 may be orientated vertically (such as to provide propulsion), horizontally (such as to provide lift), or may be angularly adjustable (such as a tilt rotor).
- the one or more rotor 132 may be comprised of a material that yields when subjected to force, such as in the event of a strike of the rotor 132 against another object. For example, if the rotor 132 strikes an object, the rotor 132 may deflect, bend, or break to absorb the force of the strike.
- the propulsion system 28 may further comprise a propeller guard 134 .
- the propeller guard 134 may be connected to and supported by the airframe 20 .
- the propeller guard 134 may surround the rotor(s) 132 with a shroud or a cowling.
- the propeller guard 134 may cover exposed areas of the rotor(s) 132 .
- the propeller guard 134 may have openings no larger than one-half inch. The dimensions of the openings may comply with Occupational Safety and Health Administration regulation 1910 .
- removing the propeller guard 134 may interrupt electricity to the propulsion system 28 .
- a circuit (not shown) of the power system 26 is interrupted so that the power system 26 is nonoperational, and the rotor 132 is therefore no longer provided power.
- the propeller guard 134 may include a continuous loop conductor (e.g., conductive ink) (not shown) that substantially covers the outline of the propeller guard 134 , such that, in the event that the propeller guard 134 is broken, the conductive path is also broken.
- the UAS (such as the controller 22 and/or the remote station 14 ) may shut down the power system 26 and/or emit an audible and visual warning to the operator 16 and anyone in the vicinity.
- the controller 22 and/or the remote station 14 may shut down the propulsion system 28 or components of the propulsion system 28 , such as the rotors 132 , in the event of a malfunction.
- the controller 22 and/or remote station 14 may shut down the propulsion system 28 when an impact is detected, such as by the accelerometer; or when there is a disruption in the closed loop sensor 80 surrounding the airframe 20 indicating the airframe 20 has been compromised.
- the UAV 12 may comprise an avionics system 30 .
- the avionics system 30 may include mechanical and electronic flight control mechanisms such as motor(s), servo(s), fuel control switches, etc. (not shown) associated with various flight operations of the UAV 12 .
- the avionics system 30 may comprise one or more processor (not shown).
- the avionics system 30 may comprise one or more actuators 38 .
- the avionics system 30 may comprise one or more sensor 140 i . . . 140 i+n .
- the sensors 140 i . . . 140 i+n may be onboard the UAV 12 but outside of the avionics system.
- Nonexclusive examples of sensors 140 i . . . 140 i+n include a roll sensor, a pitch sensor, a yaw sensor, an altitude sensor (such as an altimeter), a directional sensor, and a velocity sensor.
- the avionics system 30 may comprise an inertial measurement unit (IMU) for measuring the velocity, orientation, and/or gravitational forces of the UAV 12 .
- the IMU may include one or more accelerometers and/or gyroscopes.
- the sensors 140 i . . . 140 i+n may further comprise an airspeed sensor for determining the relative speed between the UAV 12 and the body of air through which it is travelling.
- the sensors 140 i . . . 140 i+n may comprise a pitot sensor comprising both static and dynamic pressure sensors.
- the sensors 140 i . . . 140 i+n may comprise one or more altitude sensor, which provides a signal indicative of the altitude of the UAV 12 above sea level and/or above ground.
- the altitude sensor may comprise a GPS receiver, a magnetometer, a barometric altimeter, etc. Signals from the sensor(s) 140 i . . . 140 i+n may be sent via a power bus (not shown) to the avionics system 30 and/or the navigation system 32 .
- in some embodiments, the GPS receiver and magnetometer may be located close to the other electrical components. However, the operation of magnetometers may be affected by interference from other electrical components and/or the GPS receiver. To reduce the risk of interference, in one embodiment, the ESCs 34 of the motors 118 may be mounted away from the magnetometer. Additionally, or alternately, the ESCs 34 , magnetometer, and/or GPS receiver may be shielded.
- the sensors 140 i . . . 140 i+n may comprise one or more collision detection sensor.
- collision detection sensors include an ultra-sonic device, a radar device, a laser device, a sonar device, an imaging device, and a transponder/receiver device.
- the one or more collision detection sensor may be utilized by the avionics system 30 to determine position of and avoid collisions with other aircraft, the ground, other structures, trees, and/or other obstacles.
- the avionics system 30 may comprise a Traffic Alert and Collision System (TACS) utilizing the collision detection sensor to warn of aircraft within the vicinity of the UAV 12 .
- Such systems are well known by persons having skill in the art, for example, as described in “The Traffic Alert and Collision Avoidance System,” Kuchar and Drumm, Lincoln Laboratory Journal, Volume 16, Number 2, 2007, which is hereby incorporated by reference in its entirety herein.
- the controller 22 of the UAV 12 and/or the remote station 14 may utilize information from the TACS to change flight paths to avoid collisions with other aircraft.
- the avionics system 30 may comprise a Terrain Awareness and Warning System (TAWS) utilizing one or more sensor 140 i . . . 140 i+n , such as the one or more collision detection sensor.
- the TAWS may signal the controller 22 and/or the remote station 14 when the sensor 140 detects terrain or structures within a predetermined distance of the UAV 12 , when the UAV 12 goes outside predetermined flight parameters, and/or when the UAV 12 leaves a predetermined flight path or flight area.
- the navigation system 32 may be located within the UAV 12 . Additionally, or alternately, part or all of the navigation system 32 may be located in the remote station 14 .
- the navigation system 32 may plan and/or deploy the flight path of the UAV 12 , may determine/receive location coordinates, may determine/receive way points, may determine/receive real world position information, may generate and transmit signals to appropriate components to control the flight of the UAV 12 , and so on.
- the avionics system 30 and/or the navigation system 32 may monitor the lateral location (latitude and longitude) of the UAV 12 (for example, using a GPS receiver), and/or monitor the altitude of the UAV 12 using the signals from the sensors 140 i . . . 140 i+n , and/or may receive information from the remote station 14 .
- the controller 22 utilizes information from the avionics system 30 in conjunction with the navigation system 32 to fly the UAV 12 from one location to another.
- the controller 22 may utilize the information to control the control surfaces 60 of the airframe 20 of the UAV 12 (for example, elevators, ailerons, rudders, flaps, and/or slats).
- the avionics system 30 and/or the navigation system 32 may include a memory (not shown) on which location of controlled airspace is stored, or may communicate with an external device, such as an air traffic control station (not shown) or the remote station 14 to receive transmitted data indicating the location of controlled airspace.
- the avionics system 30 and/or the navigation system 32 may provide signals to the ESC 34 and/or the controller 22 to be used to control the speed of rotation of the rotor 132 or the output of the motor 118 or engine.
- the avionics system 30 and/or the navigation system 32 may estimate the current velocity, orientation and/or position of the UAV 12 based on data obtained from the sensors 140 , such as visual sensors (e.g., cameras), IMU, GPS receiver and/or other sensors, perform path planning, provide data to the controller 22 (and/or control signals to the actuators 38 ) to implement navigational control, and the like.
- the UAV 12 may comprise one or more secondary sensors 140 a i . . . 140 a i+n ( FIG. 17 ) and secondary controller 22 a ( FIG. 5 ) that may be implemented to detect a fault in the primary sensors 140 and/or controller 22 and/or to replace functions of failed sensors 140 i . . . 140 i+n or a failed controller 22 .
- the secondary sensors 140 a i . . . 140 a i+n may include redundant sensors comprising one or more accelerometer, gyro, magnetometer, GPS, etc.
- the secondary sensors 140 a i . . . 140 a i+n may comprise redundant attitude sensors and control systems.
- the secondary sensors 140 a i . . . 140 a i+n and controller 22 a may be electrically isolated from the primary controller 22 and/or sensors 140 i . . . 140 i+n , via opto-isolators and/or magnetic relays, so a catastrophic failure of the primary controller 22 and/or sensors 140 i . . . 140 i+n does not cascade to the secondary sensors 140 a i . . . 140 a i+n and controller 22 a . If the secondary sensors 140 a i . . . 140 a i+n and controller 22 a detect a failure in the primary sensors 140 i . . . 140 i+n and/or controller 22 , the secondary controller 22 a may shut off a relay that connects the primary sensors 140 i . . . 140 i+n and controller 22 to the power system 26 , such as the battery 122 .
- a protocol in the controller 22 may decide if it is appropriate for the UAV 12 to attempt to land or shut down immediately.
- the sensors 140 i . . . 140 i+n of the UAV 12 comprise one or more geo-localization sensor.
- Non-exclusive examples of a geo-localization sensor include a Global Positioning System (GPS), a Global Navigation Satellite System, a hyperbolic radio navigation system (e.g. LORAN), a motion capture system (e.g. as manufactured by Vicon), a detector of lines and/or optical points of reference on the ground/structures, and an altimeter.
- a form of localization, for example utilizing the geo-localization sensor, may be used to keep the UAV 12 within a specific operation area (referred to as an “operation box” 211) for the current inspection task.
- the coordinates of the boundary of the allowed operation area (which may be referred to as a “box” or “fence”) may be stored on the UAS 10 .
- the box 211 may be predetermined.
- the box 211 may be determined using parcel boundaries, building outlines, cadastre data, and/or other sources of data (e.g. in any appropriate coordinates, such as latitude and longitude) and altitude.
- the box 211 may be determined on-site by the operator 16 prior to take off of the UAV 12 .
- On-site establishment of the box 211, i.e. “boxing” or “fencing”, may be done using a handheld device (for example, a smartphone or tablet having GPS capabilities) to obtain the coordinates of the box corners.
- the operator 16 may walk to the corners of the box 211 and record/set a point for the corner of the box 211 .
- the operator 16 may place the points of the box 211 on a map displayed on the handheld device.
- the operator 16 may choose or define a radius from a point as the box 211, or a boundary around a point as the box 211.
- the operator 16 may define attributes of one or more point (for example, the location or title of the point, such as “southwest corner”).
- the operator 16 may define outlines of structures and/or trees within the box 211. These vertices and/or boundaries may define the outside boundary of a given property, the location of tall obstructions such as trees, and/or the outline of a structure that is to be captured.
- the box coordinates and/or outlines of structures or obstructions may then be relayed from the operator 16 (and/or the handheld device) to the UAS 10 and/or the UAV 12 (for example, through Wi-Fi/Bluetooth, or manual download).
- the handheld device may be the remote station 14 .
- the box 211 may be defined by geographical coordinates and/or altitude values that define a geometric shape (e.g. polygon, circle, square, etc.) on and/or above the earth.
- the box 211 may have a maximum altitude or z value.
- the box 211 may be a 3D polygon having a height.
- the box 211 may be a 2D geometric shape on the ground that extends upwards either to a maximum z height or up to a maximum altitude.
- the maximum z height or maximum altitude may be based at least in part on government regulations.
- the controller 22 may provide instructions to the UAV 12 such that the UAV 12 stays inside the box 211 and does not fly over adjacent or other properties.
- the controller 22 and/or the remote station 14 may take appropriate action if the remote station 14 and/or the controller 22 detects that the UAV 12 is leaving the operation box 211 .
- navigation coordinates may be provided to steer the UAV 12 away from the boundary so that it does not leave the operation box 211, and/or the UAV 12 may be directed to land or to fly to the remote station 14.
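The containment check implied by the lines above can be sketched as follows. This is a minimal illustration, not the patent's method: the box is assumed to be a 2D polygon of latitude/longitude vertices plus a maximum altitude, and the function names, box representation, and action strings are all invented for the example.

```python
# Illustrative sketch of an "operation box" containment check.
# The polygon is a list of (lat, lon) vertices; names and the
# corrective-action strings are assumptions, not from the source.

def point_in_polygon(lat, lon, polygon):
    """Ray-casting test: is (lat, lon) inside the closed polygon?"""
    inside = False
    n = len(polygon)
    for i in range(n):
        lat1, lon1 = polygon[i]
        lat2, lon2 = polygon[(i + 1) % n]
        # Count crossings of a ray cast in the +lon direction.
        if (lat1 > lat) != (lat2 > lat):
            cross_lon = lon1 + (lat - lat1) * (lon2 - lon1) / (lat2 - lat1)
            if lon < cross_lon:
                inside = not inside
    return inside

def check_operation_box(lat, lon, alt_agl, polygon, max_alt_agl):
    """Return a corrective action, or None if the UAV is inside the box."""
    if not point_in_polygon(lat, lon, polygon):
        return "return-to-box"   # steer back inside the boundary 210
    if alt_agl > max_alt_agl:
        return "descend"         # enforce the maximum z height
    return None
```

A controller loop might call `check_operation_box` on each position update and translate the returned action into navigation coordinates or a landing command.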
- the data about the box 211 and/or structures/vegetation in the box 211 are integrated into a flight plan for the UAV 12 .
- the data may also be used by the controller 22 to help ensure the UAV 12 does not collide with structures, trees, etc.
- the data may also be used by the controller 22 to maintain a specific distance from a structure being captured with images/video, so that the images and video will have a consistent sample distance (for example, millimeters per pixel).
- each pixel of the image 44 taken by the camera 42 of the UAV 12 may represent 1 mm on a structure in the image.
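The sample-distance relationship described above follows from simple camera geometry: the distance covered by one pixel on the structure scales with standoff distance and pixel pitch, and inversely with focal length. A small sketch, with illustrative numbers (a 20 mm lens with a 2 µm pixel pitch at a 10 m standoff yields 1 mm per pixel); none of these values come from the patent.

```python
# Sample distance (mm on the structure per image pixel) from camera
# geometry. All parameter values used here are illustrative.

def sample_distance_mm_per_px(distance_m, focal_length_mm, pixel_pitch_um):
    """Millimeters on the structure covered by one image pixel."""
    pixel_pitch_mm = pixel_pitch_um / 1000.0
    return (distance_m * 1000.0) * pixel_pitch_mm / focal_length_mm

def standoff_for_gsd_m(target_gsd_mm, focal_length_mm, pixel_pitch_um):
    """Camera-to-structure distance needed for a target sample distance."""
    pixel_pitch_mm = pixel_pitch_um / 1000.0
    return target_gsd_mm * focal_length_mm / pixel_pitch_mm / 1000.0
```

The second function inverts the first, giving the standoff distance the flight plan would need to hold for a desired millimeters-per-pixel value.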
- the UAV 12 may comprise a speed reduction device 150 .
- speed reduction devices 150 include air brakes 152 and parachutes 154 .
- the speed reduction device 150 may be deployed automatically or manually when the controller 22 and/or the remote station 14 or operator 16 detects a malfunction in the UAV 12 and/or that the UAV 12 is out of control.
- the speed reduction device 150 creates drag to limit the airspeed of the UAV 12 and therefore reduce kinetic energy.
- the speed reduction device 150 may decrease the velocity of the UAV 12 by acting as a sail.
- the air brakes 152 may be part of the airframe 20 in the form of hard or pliable panels/sails. In one embodiment, the air brakes 152 may be in the form of gas inflated bladders (not shown).
- the rotor 132 may be utilized as the speed reduction device 150 .
- the remote station 14 may comprise components that interface with the unmanned aerial vehicle 12 and/or the remote operator 16 and/or that process data to/from the UAV 12 .
- the remote station 14 may comprise a human-machine interface module 160 , one or more processor(s) 162 (hereinafter “the processor”), one or more drive(s) 164 (hereinafter “the drive”), and a remote station communications system 166 .
- the remote station 14 may comprise one or more antenna(s) 168 (hereinafter “the antenna”).
- the antenna 168 may transmit/receive one or more signal to/from the remote station communications system 166 to communicate with one or more UAV 12 , aircraft, and/or vehicles.
- the processor 162 may comprise one or more of a smartphone, a tablet personal computer, a personal computer processor, and/or other personal computing device.
- the remote station 14 may receive/download onboard data from the UAV 12 , for example, through the remote station communications system 166 .
- the onboard data may include images 44 and/or metadata, such as metadata about the images 44 , about/from the image capture device 42 , and/or about/from the sensors 140 .
- the remote station 14 may upload commands from the remote station 14 to the UAV 12 , for example, through the remote station communications system 166 , in order to control functions of the UAV 12 and/or the payload 40 of the UAV 12 .
- the remote station 14 may transmit commands and/or data to the UAV 12 .
- the remote station 14 may control the UAV 12 in real time in all three physical dimensions.
- the UAV 12 may operate autonomously or with varying degrees of guidance from the remote station 14 and/or the remote operator 16 .
- the remote station 14 may provide the remote operator 16 with real time data concerning the UAV 12 and/or data transmitted from the UAV 12 through the human-machine interface module 160 .
- the remote station 14 may provide the operator 16 with flight information necessary to control the flight of the UAV 12 .
- flight information may include cockpit-type control data such as data from the sensors 140 and/or indications of roll, pitch, and yaw angle, navigational view of attitude data, current position of the UAV 12 with coordinates and/or visually, failure of components/systems within the UAV 12 , and so on.
- the human-machine interface module 160 may be configured for the operator 16 to receive data and to input data and/or commands.
- the human-machine interface module 160 may comprise a display displaying a view transmitted from the UAV 12 similar to a view that an onboard pilot would have.
- the human-machine interface module 160 may include a control panel for remotely piloting the UAV 12 .
- the human-machine interface module 160 may comprise a graphical user interface.
- the human-machine interface module 160 may comprise user input devices through which the operator 16 may enter commands and data.
- Non-exclusive examples of input devices may include an electronic digitizer, a microphone, a keyboard, and a pointing device such as a mouse device, trackball device or touch pad device.
- Other input devices may include a joystick device, game pad device, satellite dish, scanner device, heads-up device, a vision system, a data bus interface, and so on.
- the remote station 14 may translate commands from the operator 16 to the UAV 12 to control the flight control surfaces 60 and speed of the UAV 12 .
- the remote station 14 may translate simplistic inputs from the operator 16 into specific, detailed, precision-controlled flight control of the UAV 12 .
- the operator's 16 movement of a joystick may be translated by the processor 162 into commands and transmitted via the remote station communications system 166 and the communications system 24 of the UAV 12 to the controller 22 of the UAV 12 to adjust the flight control surfaces 60 of the UAV 12 to affect roll, pitch, and yaw.
- the remote station 14 may comprise one or more attenuator 170 on the antenna 168 for range testing.
- An interlock (not shown) may be used to prevent the UAV 12 from taking off with the attenuator 170 in place on the antenna 168 of the remote station 14 .
- the attenuator 170 may be used for range testing between the UAV 12 and the remote station 14 .
- the interlock is a device that makes the state of two mechanisms mutually dependent.
- a sensor is configured to detect that the attenuator 116 is in place.
- the UAV 12 is prevented from flying (or from flying beyond a predetermined distance) so that the UAV 12 does not fly beyond the range of the controller 22 while the attenuator 116 is attached to the UAV 12.
- the attenuator 116 may also be affixed to the case 17 such that when the UAV 12 is removed from the case 17 the attenuator 116 is effectively removed.
- the drive 164 and associated computer storage media such as removable storage media (e.g., CD-ROM, DVD-ROM) and non-removable storage media (e.g., a hard drive disk), may provide storage of computer readable instructions, data structures, program modules and other data.
- the drive 164 may include an operating system, application programs, program modules, and one or more database.
- the remote station 14 may be implemented as a portion of a small-form factor portable (or mobile) electronic device such as a portable computing device, a mobile computing device, an application specific device, or a hybrid device that include any of the above functions.
- the remote station 14 may be implemented as a personal computer including both laptop computer and non-laptop computer configurations.
- the remote station 14 may be implemented as a networked system or as part of a specialized server.
- the remote station 14 may comprise an automatic dependent surveillance-broadcast (ADS-B) device (not shown) such that when a conventional aircraft or another UAV 12 enters the area, the operator 16 may be notified and may land the UAV 12.
- ADS-B device may be configured with ADS-B “Out” and/or ADS-B “In”.
- ADS-B “Out” periodically broadcasts information about the UAV 12 , such as identification, current position, altitude, and velocity, through a transmitter, such as the communications system 166 of the remote station 14 (and/or the communications system 24 of the UAV 12 ).
- ADS-B “Out” may provide air traffic controllers and other aircraft with real-time position information.
- ADS-B “In” allows the reception by the UAS 10 of ADS-B data, such as direct communication from nearby aircraft of their identification, current position, altitude, and/or velocity.
- the ADS-B device is located in either or both the remote station 14 and/or the UAV 12 .
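The information content of the ADS-B "Out" broadcast described above can be modeled as a small state vector. Note the hedge: real ADS-B uses a compact binary encoding (Mode S extended squitter on 1090 MHz, or 978 MHz UAT), not JSON; this sketch only illustrates the kind of data (identification, position, altitude, velocity) that is periodically transmitted, and every name here is invented.

```python
import json
from dataclasses import dataclass, asdict

# Illustrative state vector of the kind ADS-B "Out" broadcasts.
# Real ADS-B uses a binary over-the-air encoding; JSON is used here
# purely to keep the example readable.

@dataclass
class AdsbStateVector:
    icao_id: str        # aircraft/UAV identification
    latitude: float
    longitude: float
    altitude_m: float
    velocity_mps: float

def encode_for_broadcast(state: AdsbStateVector) -> bytes:
    """Serialize the state vector for periodic transmission."""
    return json.dumps(asdict(state)).encode("utf-8")
```

An ADS-B "In" receiver on the UAS 10 would decode the same fields from nearby traffic and raise an alert for the operator 16.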
- the remote station communications system 166 may comprise a transmitter and a receiver.
- the remote station 14 may comprise a wireless datalink subsystem.
- the wireless datalink subsystem may be configured for remote communication with the UAV 12 .
- the remote station 14 may further comprise a mobile power system, such as one or more battery (not shown).
- the communications system 24 of the UAV 12 and the communications system 166 of the remote station 14 are configured to form a connection between the UAV 12 and the remote station 14 using radio frequency protocols that may or may not meet the requirements of a Wi-Fi network.
- the communications system 24 of the UAV 12 and the communications system 166 of the remote station 14 may utilize a cellular network for communication between the UAV 12 and the remote station 14 and/or communication between the UAS 10 and other vehicles and/or systems.
- the UAV 12 and/or remote station 14 may have cellular radios via which data may be communicated.
- a Verizon MiFi 4G LTE Global USB Modem is an example of such a device.
- the UAV 12 may connect to the cellular network using the modem and send telemetry, images, photos, etc.
- the UAV 12 may also receive commands/instructions on where to go next, flight plans, and/or what to photograph/video.
- the controller 22 in conjunction with the communications system 24 of the UAV 12 and/or the communications system 166 of the remote station 14 may operate in a networked environment using logical connections to one or more processors, such as a remote processor connected to a network interface.
- the remote processor may be the processor 162 of the remote station 14 , or located all or in part separately from the remote station 14 .
- the remote processor may be a personal computer, a server, a router, a network PC, a peer device or other common network node, and can include any or all of the elements described above relative to the controller.
- Networking environments are commonplace in offices, enterprise-wide area networks (WAN), local area networks (LAN), intranets and world-wide networks such as the Internet.
- source and destination machines need not be coupled together by a network(s) or any other means, but instead, data may be migrated via any media capable of being written by the source platform and read by the destination platform or platforms.
- the controller 22 may be coupled to the LAN through the network interface or an adapter.
- the network(s) may comprise any topology employing servers, clients, switches, routers, modems, Internet service providers (ISPs), and any appropriate communication media (e.g., wired or wireless communications).
- a system may have a static or dynamic network topology.
- the network(s) may include a secure network such as an enterprise network (e.g., a LAN, WAN, or WLAN), an unsecure network such as a wireless open network (e.g., IEEE 802.11 wireless networks), or a world-wide network (e.g., the Internet).
- the network(s) may also comprise a plurality of distinct networks that are adapted to operate together.
- the network(s) are adapted to provide communication between nodes.
- the network(s) may include wireless media such as acoustic, RF, infrared and other wireless media.
- a network communication link may be one nonexclusive example of a communication medium.
- Communication media may typically be embodied by computer readable instructions, data structures, program modules, or other data in a modulated data signal, such as a carrier wave or other transport mechanism, and may include any information delivery media.
- a “modulated data signal” may be a signal that has one or more of its characteristics set or changed in such a manner as to encode information in the signal.
- communication media may include wired media such as a wired network or direct-wired connection, and wireless media such as acoustic, radio frequency (RF), microwave, infrared (IR) and other wireless media.
- the term computer readable media as used herein may include both storage media and communication media.
- pre-flight diagnostic testing is employed.
- the UAS 10 is programmed such that, when the UAS 10 is initially powered on, the UAV 12 will not operate without first performing a series of diagnostics to verify that all systems are properly calibrated and operating correctly.
- the remote station 14 may prompt the operator 16 (via the human-machine interface module 160 , for example, with a text display or an audible voice) to rotate the UAV 12 on different axes.
- the values from each sensor pair (that is, the primary sensor 140 and the secondary sensor 140 a ) may be compared to verify that they match within a predetermined margin of error.
- the operator 16 may be prompted by the controller 22 and/or the processor 162 to anchor the UAV 12 to a weight or test rig that is heavy or anchored well enough so the UAV 12 may power up the propulsion system 28 to full power and remain secure on the ground.
- the preflight test may verify one or more of the following parameters: integrity of the propulsion system 28 at full power (for example, integrity of the rotor 132 ), RPM output, power consumption (for example, by each motor 118 ), performance of the electronic speed controls (ESCs) 34 , yaw torque, health/output power of the battery 122 under load, thrust output from the propulsion system 28 (for example, each motor/propeller), integrity of the airframe 20 , etc.
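The sensor-pair verification described above (primary sensor 140 versus secondary sensor 140 a agreeing within a margin of error) can be sketched as a simple comparison over named channels. The channel names and margins below are illustrative assumptions, not values from the patent.

```python
# Sketch of the pre-flight sensor-pair check: each primary/secondary
# sensor pair must agree within a per-channel margin of error before
# flight is allowed. Channel names and margins are illustrative.

def sensors_agree(primary, secondary, margins):
    """Compare paired readings; return the channels that are out of tolerance."""
    failed = []
    for channel, margin in margins.items():
        if abs(primary[channel] - secondary[channel]) > margin:
            failed.append(channel)
    return failed
```

A pre-flight routine could refuse to arm the propulsion system 28 whenever the returned list is non-empty, prompting the operator 16 to recalibrate.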
- collision detection diagnostics are employed. As part of a collision detection test, the operator 16 may be prompted to place an object in front of the collision detection sensor(s) to verify that all collision detection systems are working correctly.
- the collision detection diagnostics utilize a pre-flight test rig/jig (not shown) so the predetermined distance for collision detection may be checked with precision.
- the diagnostic tests may also involve placing one or more attenuator 116 and/or attenuator 170 on the antenna(s) 114 of the UAV 12 or the antenna(s) 168 of the remote station 14 for range testing.
- An interlock may be used to prevent the UAV 12 from taking off when the attenuator(s) 116 , 170 are in place.
- Diagnostic tests also may be used to check environmental conditions and disallow use of the UAV 12 when it is too windy or the temperature is too hot or too cold. This is particularly important for the battery 122 which may have significantly lower power output in cold temperatures.
- in-flight diagnostic testing is employed.
- a series of algorithms may be used to detect faults and suspend flight operations if required. For example, if the controller 22 adjusts power output to a particular motor 118 and does not “see” the intended change in attitude as a result, the controller 22 may assume there is a malfunction or the vehicle is “stuck” and power down all motors 118 .
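The in-flight fault check described above (a commanded power change that produces no attitude response implies a malfunction or a "stuck" vehicle) can be sketched as a threshold test. The function name and the rate threshold are assumptions for illustration only.

```python
# Sketch of the in-flight "stuck vehicle" check: after commanding a
# power change to a motor, the controller expects a corresponding
# attitude-rate change; if none is observed, it assumes a fault.
# The threshold value is an illustrative assumption.

def detect_stuck_vehicle(commanded_delta, observed_rate_change,
                         min_expected_rate=0.05):
    """Return True (fault) if a commanded change produced no attitude response."""
    if abs(commanded_delta) > 0.0 and abs(observed_rate_change) < min_expected_rate:
        return True
    return False
```

On a True result the controller 22 would suspend flight operations, for example by powering down all motors 118 as the passage describes.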
- a property 200 of interest is identified, such as with location information 202 .
- the location information 202 may be in the form of any coordinate system or location information system, including, but not limited to, a street address, a plat location, and/or latitude and longitude coordinates.
- a general location of the property may be provided, and the UAS 10 may then be provided with, and/or determine, specific boundaries 210 of an inspection site of the property 200 —that is, of the operation box 211 .
- the operator 16 may identify the outer boundaries 210 of the inspection site.
- the operator 16 may identify two or more points 212 on the outer boundaries 210 of the operation box 211 , as illustrated in FIG. 21 .
- the UAS 10 may then determine the outer boundaries 210 of the operation box 211 based on the identified points 212 , as shown in FIG. 22 .
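One simple way the UAS 10 could derive outer boundaries 210 from operator-identified points 212 is the convex hull of the recorded corners. The patent does not prescribe an algorithm; Andrew's monotone chain is shown here purely as an illustrative choice, operating on planar (x, y) coordinates.

```python
# Illustrative derivation of boundary vertices from recorded corner
# points via a convex hull (Andrew's monotone chain). The patent does
# not specify this algorithm; it is one reasonable option.

def convex_hull(points):
    """Return hull vertices in counter-clockwise order."""
    pts = sorted(set(points))
    if len(pts) <= 2:
        return pts

    def cross(o, a, b):
        # z-component of (a - o) x (b - o); > 0 means a left turn.
        return (a[0]-o[0])*(b[1]-o[1]) - (a[1]-o[1])*(b[0]-o[0])

    lower, upper = [], []
    for p in pts:
        while len(lower) >= 2 and cross(lower[-2], lower[-1], p) <= 0:
            lower.pop()
        lower.append(p)
    for p in reversed(pts):
        while len(upper) >= 2 and cross(upper[-2], upper[-1], p) <= 0:
            upper.pop()
        upper.append(p)
    # Drop the duplicated endpoints where the two chains meet.
    return lower[:-1] + upper[:-1]
```

Points recorded inside the walked perimeter (for example, a stray reading) fall out of the hull automatically, leaving only the outer boundary vertices.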
- Although FIG. 21 illustrates the outer boundaries 210 of the operation box 211 as square shaped, the outer boundaries 210 may have the shape of any polygon or polygons.
- the outer boundaries 210 may have a three dimensional shape including, for example, a polygon having a height, or other structure.
- the navigation system 32 of the UAS 10 may utilize the outer boundaries 210 to guide the UAV 12 to remain within the outer boundaries 210 of the operation box 211 . Further, the UAV 12 may be directed to only capture images 44 of objects/locations within the boundaries 210 , thus protecting privacy interests of surrounding properties.
- the navigation system 32 may utilize the coordinates of the boundaries 210 of the operation box 211 to determine a flight plan for the UAV 12 that remains within the operation box 211 .
- the navigation system 32 may provide the UAV 12 geographical coordinates and/or altitude values that define a geometric shape (e.g. a polygon, a circle, a square, etc.) on and/or above the earth for the operation box 211 .
- the navigation system 32 may provide the UAV 12 a maximum altitude or z value.
- the geometric shape may be a 3D polygon, having a 2D geometric shape on the ground that extends upwards, either to a maximum z height or up to a maximum altitude (such as a maximum altitude allowed by government regulations).
- the controller 22 maintains the UAV 12 inside the 3D polygon such that the UAV 12 does not fly over adjacent and/or other properties.
- parcel data, building outlines, and other sources of data may be used to define the geometric shape.
- the navigation system 32 may ensure that the camera 42 carried by the UAV 12 does not capture data and/or images 44 on any neighboring structures, cars, and/or individuals, etc.
- the 3D polygon information and data from attitude sensors on the camera 42 and/or the UAV 12 carrying the camera 42 can be used to ensure that the camera 42 does not capture data and/or images 44 on any neighboring structures, cars, and/or individuals, etc.
- 3D data about a structure may be used to ensure the camera 42 is oriented in such a way so that only the structure is in the frame of the image 44 when taking the image 44 or video and that neighboring structures, individuals, and/or vehicles are not in the background of the image 44 .
- 3D data about a neighboring structure may be used to ensure the camera 42 is oriented in such a way so that the neighboring structure is not captured in the image 44 .
- the navigation system 32 may also determine the flight plan that keeps the image capture device 42 of the UAV 12 orientated such that the field of view (designated with arrows from the UAV 12 in different positions within the operation box in FIGS. 23 and 24 ) of the image capture device 42 is solely within the boundaries 210 of the operation box 211 , while capturing desired images 44 of the property 200 of interest.
- the controller 22 of the UAV 12 and/or the remote station 14 may compare the position of the UAV 12 , based on data from sensors 140 , such as the GPS and/or the altimeter, with the coordinates of the boundaries 210 of the operation box 211 . If the distance between the position of the UAV 12 and the boundaries 210 is less than or above a predetermined amount, the UAV 12 may be directed to adjust position and/or orientation to maintain the position of the UAV 12 within the boundaries 210 .
- the orientation and position of the UAV 12 may be adjusted such that the field of view of the image capture device 42 is solely within the boundaries 210 of the operation box 211 to respect the privacy of neighbors adjacent to the boundaries 210 .
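The distance-to-boundary comparison described above can be sketched by measuring the UAV's distance to the nearest boundary segment in a local planar frame and flagging when it falls below a configured margin. The planar approximation, units, and margin value are illustrative assumptions.

```python
import math

# Sketch of the boundary-distance check: compute the UAV's distance to
# the nearest boundary segment (local planar coordinates, meters) and
# flag when it falls below a margin. Threshold values are illustrative.

def dist_point_to_segment(p, a, b):
    """Shortest distance from point p to segment a-b."""
    ax, ay = a; bx, by = b; px, py = p
    dx, dy = bx - ax, by - ay
    if dx == 0 and dy == 0:
        return math.hypot(px - ax, py - ay)
    # Clamp the projection of p onto the segment to [0, 1].
    t = max(0.0, min(1.0, ((px - ax) * dx + (py - ay) * dy) / (dx*dx + dy*dy)))
    return math.hypot(px - (ax + t * dx), py - (ay + t * dy))

def too_close_to_boundary(position, polygon, margin_m):
    """True if the UAV should adjust position/orientation away from the fence."""
    n = len(polygon)
    nearest = min(dist_point_to_segment(position, polygon[i], polygon[(i + 1) % n])
                  for i in range(n))
    return nearest < margin_m
```

When the check trips, the controller 22 and/or remote station 14 could command a position or camera-orientation adjustment so the field of view stays within the boundaries 210.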
- the UAV 12 may be orientated and positioned such that the image capture device 42 has a field of view that encompasses an object or structure within the boundaries 210 .
- the UAS 10 and the image capture device 42 may be utilized in a method to capture aerial images 44 of a structure while avoiding capturing images of neighboring properties.
- the UAS 10 may be utilized to determine one or more ground location and/or one or more surface location.
- the UAS 10 may be positioned on the ground/surface location.
- a location reading from a GPS onboard the UAS 10 may be taken with the UAS 10 on the ground/surface location.
- the location reading may include the latitude, the longitude, and the altitude above sea level of the UAS 10 .
- the altitude above sea level from the GPS may be designated as a ground/surface elevation point for the latitude/longitude location of the UAS 10 .
- another GPS reading for the UAS 10 may be taken, including the latitude, the longitude, and the altitude above sea level of the UAS 10 .
- the height of the UAS 10 above the ground/surface may be calculated by subtracting the ground/surface elevation point from the altitude above sea level of the UAS 10 in the air.
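The height-above-ground computation described in the preceding lines is a single subtraction; a sketch with illustrative field names (the GPS fix is assumed to be a dict with an `altitude_msl_m` key, which is an invention of this example):

```python
# Sketch of the height-above-ground computation: record a ground
# elevation point from GPS before takeoff, then subtract it from the
# in-flight GPS altitude (both above sea level). Field names are
# illustrative assumptions.

def record_ground_elevation(gps_fix):
    """Take the pre-takeoff GPS altitude (MSL) as the surface elevation."""
    return gps_fix["altitude_msl_m"]

def height_above_ground(gps_fix, ground_elevation_m):
    """Height of the UAS above the recorded ground/surface point."""
    return gps_fix["altitude_msl_m"] - ground_elevation_m
```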
- the controller 22 and/or the image capture device 42 , the one or more sensors 140 , and/or the image capture module 100 may capture metadata associated with one or more of the images 44 .
- metadata include information about and/or from the UAV 12 , the one or more sensors 140 , and/or the image capture device 42 .
- Metadata about the image capture device 42 may comprise such data as the attitude of the UAV 12 , the attitude of the image capture device 42 , and/or the focal length of the image capture device 42 , sensor size of the image capture device 42 , pixel pitch of the image capture device 42 , and/or distortion parameters of the image capture device 42 .
- the metadata may include information from the avionics system 30 and/or the navigation system 32 such as orientation and/or position of the UAV 12 based on data obtained from the sensors 140 , such as the visual sensors (e.g., cameras), IMU, GPS receiver and/or other sensors 140 .
- the metadata may include data from a GPS and/or data associated with the GPS such as GPS signal strength, number and information regarding available satellites, and so on.
- the metadata may include data from an IMU and/or data associated with the IMU, such as information about pitch, roll, yaw, acceleration vectors in x, y, z orientations, and acceleration vectors about an x-axis, about a y-axis, and about a z-axis.
- the metadata may be from and/or about other sensors of the UAV 12 , non-exclusive examples of which include proximity sensors, LiDAR, methane gas sensors, carbon dioxide sensors, heat sensors, multi-spectral sensors (for example, four-band image sensors capable of detecting and/or recording red, green, blue and near infrared), and hyper-spectral sensors (for example, image sensors capable of detecting and/or recording a larger number of spectrum, including 16 or 32 band image—which may include red, green, blue and near infrared and additional spectrum).
- the metadata may include one or more of the following: whether the image 44 or associated image 44 was captured from the UAV 12 , the particular type of the UAV 12 (such as, but not limited to, make, model, and/or an identification number of the UAV 12 ), whether the image 44 was captured from the ground, whether the image 44 was captured from a moving ground vehicle, whether the image 44 was captured from a manned aircraft, whether the image 44 was captured from some other source, and what type of image capture device 42 was used to capture the image 44 .
- the metadata may be embedded in the image 44 .
- the metadata and the image 44 may be stored together in a single image file.
- the image 44 may be part of an image file having an image header.
- the metadata may be embedded in the image header, such as in the header of a jpeg formatted file.
- the jpeg header may be organized in a predetermined format such that the metadata is stored in a consistent manner in the jpeg header. For example, the position of the metadata in the header and/or the format of the title of the metadata in the header may be predetermined for consistency.
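Embedding metadata in a JPEG header as described above can be sketched by inserting a custom application segment directly after the SOI marker. Production systems would normally use standard EXIF/XMP fields in APP1; writing JSON into an APPn segment, and the choice of APP9, are illustrative simplifications, not the patent's format.

```python
import json
import struct

# Sketch of embedding metadata in a JPEG header: insert a custom
# APP-segment (here APP9, 0xFFE9) carrying JSON right after the SOI
# marker (0xFFD8). Real pipelines would typically use EXIF/XMP instead.

def embed_metadata_jpeg(jpeg_bytes: bytes, metadata: dict) -> bytes:
    assert jpeg_bytes[:2] == b"\xff\xd8", "not a JPEG (missing SOI marker)"
    payload = json.dumps(metadata, sort_keys=True).encode("utf-8")
    # The segment length field counts itself (2 bytes) plus the payload,
    # but not the 2-byte marker; it is big-endian per the JPEG standard.
    segment = b"\xff\xe9" + struct.pack(">H", len(payload) + 2) + payload
    return jpeg_bytes[:2] + segment + jpeg_bytes[2:]
```

Storing the metadata at a predetermined position in the header, as the passage notes, lets downstream processing find it consistently.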
- the remote station 14 transforms the image file into a standard format for processing.
- the metadata and the image 44 may be stored in a removable non-transitory memory storage device, such as a memory card.
- the memory card may be removed from the UAS 10 to download the images 44 and the metadata.
- the images 44 and/or the metadata may be transmitted from the UAS 10 to the remote station 14 .
- the images 44 and/or the metadata may be transmitted wirelessly and/or through a physical connection, such as wires.
- the images 44 and/or the metadata may be processed by the processor 162 of the remote station 14 .
- the images 44 and/or the metadata may first be downloaded wirelessly from the UAS 10 to the remote station 14 . Then the images 44 and/or the metadata may be transmitted through a physical connection to a computer processor device where the images 44 and/or the metadata may be extracted and/or processed.
- the images 44 and/or the metadata may be transmitted to a smartphone, a tablet personal computer, a personal computer processor, and/or other personal computing device.
- the UAS 10 may have an application program interface (API).
- the metadata is captured by the image capture device 42 at the time the image 44 is captured.
- the image capture device 42 captures none of, or less than all of, the metadata.
- some or all of the metadata may be captured by the controller 22 , the avionics system 30 , the navigation system 32 , and/or the sensors 140 of the UAS 10 .
- the metadata from the time an individual image 44 is taken is matched with that individual image 44 .
- the controller 22 transmits one or more signal to the image capture device 42 instructing the image capture device 42 to capture an image 44 .
- the controller 22 may record the metadata.
- the metadata may be combined with the image 44 by the controller 22 , or may be combined with the image 44 after the image 44 and the metadata are transmitted from the UAV 12 to the remote station 14 .
- where the metadata contains time data and the images 44 contain time data, the metadata may be matched to the images 44 by matching the metadata time data to the image time data.
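The time-based matching described above can be sketched as a nearest-timestamp pairing: each image is associated with the metadata record closest to its capture time. The record structure and field names below are illustrative assumptions.

```python
# Sketch of matching metadata records to images by time: pair each
# image with the metadata record whose timestamp is nearest to the
# image capture time. Field names are illustrative.

def match_metadata_to_images(images, metadata_records):
    """Return {image_id: metadata record nearest in time}."""
    matched = {}
    for img in images:
        matched[img["id"]] = min(
            metadata_records,
            key=lambda rec: abs(rec["time"] - img["time"]),
        )
    return matched
```

A real pipeline might also reject matches whose time gap exceeds a tolerance, so an image captured during a telemetry dropout is not paired with stale sensor data.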
- the metadata may be combined with the images 44 in the header of the image file, such as a jpeg header for a jpeg image file.
- Metadata may not be necessary in all analysis scenarios, for example, when visual data from an image is sufficient. However, other creation and/or analysis scenarios may benefit from and/or require metadata, for example, the creation of a three-dimensional model.
- one or more of the images 44 may be geolocated and/or georeferenced.
- Geolocating the image 44 comprises associating the image 44 with a location or structure in a location.
- One example use for geolocation of the image 44 is for images 44 depicting objects above the ground without depicting the ground, without ground location information, or without access to surface location information for the objects depicted.
- an image may depict a chimney on a roof without depicting the ground location.
- Metadata can be used to associate the image 44 with a particular location or structure.
- metadata can be used that is associated with the one or more image capture device 42 at the time the aerial images 44 were captured, such as latitude and longitude of the one or more image capture device 42 and/or one or more of altitude, orientation, attitude, and bearing of the one or more image capture device 42 .
- the metadata can be correlated to the location or structure of interest thereby associating the image 44 with the location or structure of interest.
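The correlation step described above can be sketched as a nearest-structure lookup: compare the camera's recorded latitude/longitude against known structure locations and associate the image with the closest one within a threshold. The haversine distance, structure records, and 100 m default are illustrative choices, not from the patent.

```python
import math

# Sketch of geolocating an image from capture-device metadata:
# associate the image with the nearest known structure location
# within a distance threshold. Names and the radius are illustrative.

EARTH_RADIUS_M = 6371000.0

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two lat/lon points."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi/2)**2 + math.cos(p1)*math.cos(p2)*math.sin(dlmb/2)**2
    return 2 * EARTH_RADIUS_M * math.asin(math.sqrt(a))

def geolocate_image(camera_lat, camera_lon, structures, max_dist_m=100.0):
    """Return the id of the nearest known structure, or None if none is close."""
    best_id, best_d = None, max_dist_m
    for s in structures:
        d = haversine_m(camera_lat, camera_lon, s["lat"], s["lon"])
        if d < best_d:
            best_id, best_d = s["id"], d
    return best_id
```

Altitude, orientation, and bearing metadata could further constrain the match, for example by projecting the camera's line of sight rather than using its position alone.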
- Georeferencing the images 44 may comprise processing the images 44 to determine and assign geographic location information for the pixels of the images 44 .
- the images 44 may be processed as described in U.S. Pat. No. 7,424,133, issued Sep. 9, 2008, titled “Method and Apparatus for Capturing, Geolocating and Measuring Oblique Images;” and/or U.S. Patent Publication US20150221079A1, published Aug. 6, 2015, titled “Augmented Three Dimensional Point Collection of Vertical Structures;” both of which are hereby incorporated by reference in their entirety herein.
- the geographic location information may include geographic coordinates for the ground as well as structures and objects located above the ground in the image 44 .
- the geographic location information for the pixels of the image 44 may be a part of the metadata associated with the image 44 .
- Georeferencing the images 44 may be based at least in part on one or more known ground points and/or surface points.
- known ground points and/or surface points include digital elevation models (DEMs), point clouds, three-dimensional models, individually plotted/mapped points, and tessellated ground planes.
- the images 44 may be georeferenced based at least in part on searching for and locating one or more surface model or point cloud having locations within a predetermined proximity of the location of the UAV 12 and/or in the direction of orientation of the UAV 12 . In one embodiment, the images 44 may be georeferenced based at least in part on searching for and locating one or more ground point or ground plane having ground locations within a predetermined proximity of the UAV 12 .
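As a simplified illustration of assigning a ground coordinate to a camera view, the sketch below intersects a camera ray with a flat ground plane. A production georeferencing system, as described above, would instead intersect the ray with a DEM, point cloud, three-dimensional model, or tessellated ground plane; the flat-terrain assumption and the local metric coordinate frame are simplifications for the sketch:

```python
def ray_ground_intersection(cam_pos, ray_dir, ground_z=0.0):
    """Intersect a camera ray with a flat ground plane z = ground_z.
    cam_pos and ray_dir are (x, y, z) tuples in an assumed local
    metric frame; a real system would march the ray against a
    terrain or surface model rather than a plane."""
    x, y, z = cam_pos
    dx, dy, dz = ray_dir
    if dz >= 0:
        return None  # ray does not descend toward the ground
    t = (ground_z - z) / dz  # parametric distance to the plane
    return (x + t * dx, y + t * dy, ground_z)
```

A ray pointed at or above the horizon never reaches the plane, which mirrors the case discussed below of images that do not depict the ground.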
- the image location system 250 may comprise a metadata database 252 and an image warehouse database 254 stored in one or more non-transitory computer memory 256 .
- the image location system 250 may further comprise one or more processor 258 and one or more user interface 260 .
- the metadata is stored in the metadata database 252 and the images 44 are stored in the image warehouse database 254 .
- Metadata and images 44 may be received from multiple UASs 10 and/or multiple UAVs 12 of various types.
- the metadata may initially be received in varying formats depending on the type of UAV 12 transmitting the metadata.
- the metadata may be transformed into a standardized format.
- the metadata may be stored with a standardized format.
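A metadata-standardization step of the kind described might look like the following sketch. The vendor names and key tables are hypothetical stand-ins, not the formats of any actual UAV manufacturer:

```python
def standardize_metadata(raw, vendor):
    """Map vendor-specific metadata keys onto one standard schema.
    KEY_MAPS entries are hypothetical examples of per-vendor formats."""
    KEY_MAPS = {
        "vendor_a": {"Lat": "latitude", "Lon": "longitude", "Alt": "altitude_m"},
        "vendor_b": {"gps_lat": "latitude", "gps_lon": "longitude",
                     "height": "altitude_m"},
    }
    key_map = KEY_MAPS[vendor]
    out = {std: raw[src] for src, std in key_map.items() if src in raw}
    out["source_vendor"] = vendor  # keep provenance for later auditing
    return out
```

Storing the transformed records in one schema is what lets a single metadata database serve queries across images received from multiple types of UAV.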
- the metadata may be stored in the metadata database 252 and associated with the image 44 and/or the image file.
- the metadata may include a file path of the associated image 44 and/or the image file.
- one or more of the metadata, the metadata database 252 , the images 44 , and the image warehouse database 254 may be stored in one or more remote locations, such as cloud storage.
- the metadata database 252 and/or the image warehouse database 254 may be spatial databases. That is, the metadata database 252 and/or the image warehouse database 254 may be structured with spatial (locational) connections such that spatial conclusions and results can be reached.
- the metadata database 252 and/or the image warehouse database 254 may be able to search for, find, and return to a user data based on an input location. For example, a user may request images 44 within one mile of a location and the metadata database 252 and/or the image warehouse database 254 may return such information. In another example, the user may request images 44 within a polygon drawn on an overview image. In another example, the user may request images 44 based on other location information.
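The "images within one mile" query above can be sketched as a great-circle distance filter. This is an illustration only; a spatial database would answer the same query with a spatial index rather than the linear scan shown, and the record layout is assumed:

```python
from math import radians, sin, cos, asin, sqrt

EARTH_RADIUS_MI = 3958.8  # mean Earth radius in miles

def haversine_miles(lat1, lon1, lat2, lon2):
    """Great-circle distance in miles between two lat/lon points."""
    p1, p2 = radians(lat1), radians(lat2)
    dphi, dlmb = radians(lat2 - lat1), radians(lon2 - lon1)
    a = sin(dphi / 2) ** 2 + cos(p1) * cos(p2) * sin(dlmb / 2) ** 2
    return 2 * EARTH_RADIUS_MI * asin(sqrt(a))

def images_within(records, lat, lon, radius_mi=1.0):
    """Filter metadata records (assumed dicts with 'lat'/'lon' keys)
    to those within radius_mi of the query point."""
    return [r for r in records
            if haversine_miles(lat, lon, r["lat"], r["lon"]) <= radius_mi]
```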
- a user may utilize the image location system 250 to locate image(s) 44 and/or metadata for a particular geographic area or structure.
- the user may search for images 44 of a structure and/or geographic location by inputting geographic coordinates through the user interface 260 .
- the user may search for images 44 by choosing one or more points, facets, components, or areas of a structure in an image, floorplan, 2D model, or 3D model, as shown in FIGS. 30-38 .
- the user may search for images 44 of the structure and/or geographic location by inputting a polygon 268 (such as a 2D or 3D polygon) of geographic coordinates through the user interface 260 .
- the user may input geographic points, and the processor 258 of the image location system 250 may form the polygon 268 of geographic coordinates from the inputted points.
- the polygon 268 may be located by the user on a structure in an image 44 .
- the polygon 268 may be an area or facet of a structure in an image 44 .
- the image location system 250 may utilize the polygon 268 in conjunction with the metadata associated with the image 44 and/or a two-dimensional outline and/or a three-dimensional model of the structure in the image 44 to identify the portion of the structure selected by the user.
- the image location system 250 may allow the user to further specify a particular area of the structure.
- the image location system 250 may search the metadata database 252 for geographic information in the metadata matching, or approximate to, the geographic coordinates entered by the user.
- the image location system 250 may then display images 44 associated with the metadata matching the geographic coordinates.
- the displayed images 44 contain pixels having matching geographic coordinates.
- the image location system 250 may search the metadata database 252 for points on the ground that match, or are approximate to, the geographic coordinates entered by the user. In one embodiment, the image location system 250 may search the metadata database 252 for points on the ground that are intersected by or enclosed within the polygon 268 .
- the image location system 250 may search the metadata database 252 for points above the ground that match, or are approximate to, the geographic coordinates entered by the user. In one embodiment, the image location system 250 may search the metadata database 252 for points above the ground that are intersected by or enclosed within the polygon 268 . Points above the ground may be geographic location points on structures or vegetation above the ground.
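The "intersected by or enclosed within the polygon 268" test above reduces, for a two-dimensional polygon, to a point-in-polygon check. The sketch below uses the standard even-odd ray-casting rule; it stands in for the indexed spatial query a metadata database would actually perform:

```python
def point_in_polygon(pt, poly):
    """Even-odd ray-casting test: is 2D point pt inside polygon poly?
    poly is a list of (x, y) vertices in order; coordinates could be
    projected lat/lon or local metric values."""
    x, y = pt
    inside = False
    n = len(poly)
    for i in range(n):
        x1, y1 = poly[i]
        x2, y2 = poly[(i + 1) % n]
        # does a horizontal ray from pt cross edge (x1,y1)-(x2,y2)?
        if (y1 > y) != (y2 > y):
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x_cross > x:
                inside = not inside
    return inside
```

Points above the ground add a third coordinate, but the same planar test applies once the polygon is extruded or the point is projected onto the polygon's plane.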
- the image location system 250 may return images of the structure. In one embodiment, the image location system 250 may return images that depict the particular area of the structure chosen by the user.
- the images 44 may depict structures and/or vegetation without depicting the ground.
- images 44 taken by an image capture device 42 with a perspective pointed toward the horizon, or at an angle upwards from the horizon, may not depict the ground.
- the image location system 250 may search the metadata for recorded locations of the image capture device 42 in which the image capture device 42 location matches, intersects, or is enclosed in, the inputted coordinates and/or polygon 268 .
- the image location system 250 may calculate and/or store data indicative of points on, in, and/or on the outline of, one or more structures and/or vegetation depicted in the images 44 , the attitude of the image capture device 42 , and the bearing of the image capture device 42 (i.e. the direction the image capture device 42 was pointing when the image 44 was captured).
- the data can be stored in the metadata database 252 .
- the image location system 250 may determine the geographic coordinates (X, Y, and Z) where the view of the image capture device 42 intersects the one or more structure and/or vegetation.
- the image location system 250 may utilize the intersection geographic coordinates as a geographic marker for the image 44 .
- the image location system 250 may match the inputted geographic coordinates to the intersection geographic coordinates to locate an image 44 depicting a geographic location having geographic coordinates matching or within a predetermined distance relative to inputted geographic coordinates and/or polygon 268 .
- a user may search for images 44 with the image location system 250 by inputting a geo-code. For example, the user may enter a street address and receive a property parcel's geometry, that is, a property parcel polygon of the property line of a land parcel or building. The user may use the received property parcel polygon as polygon 268 to input into the image location system 250 to request any images 44 for that polygon, that is, any images 44 that intersect the polygon 268 or that are associated with the property within the polygon.
- the user may search for images 44 with the image location system 250 by selecting the polygon 268 that was formed by the operator of the UAV 12 when establishing boundaries 210 of the operation box 211 when one or more of the images 44 were originally captured by the image capture device 42 of the UAV 12 .
- the metadata includes a street address.
- the street address may be acquired by an operator of the UAS 10 .
- the street address may be associated with the images 44 captured by the UAS 10 while the UAS 10 is operated to capture images 44 at the street address.
- the image location system 250 may process one or more of the images 44 before a user utilizes the image location system 250 .
- the image location system 250 may create one or more 3D model based on the images 44 and the metadata, calculate one or more virtual nadir camera view, and then create an ortho-mosaic based on the 3D model and virtual nadir camera views.
- the image location system 250 may process one or more of the images 44 and/or the metadata and create one or more three-dimensional point clouds and/or one or more three-dimensional models based at least in part on the images 44 and/or the metadata.
- the metadata may be used to produce more accurate results to existing or new models and/or images 44 .
- the image location system 250 may process one or more of the images 44 by ortho-rectifying the images and stitching the images 44 together using tie points to create an ortho-mosaic.
- the ortho-mosaic may be divided into tiles (for example, tiles 256×256 pixels in size).
- the image location system 250 may display one or more tiles to the user, such as when the user views the ortho-mosaic in a web-based browser.
- the tiles may be in a standardized format for use in multiple types of web-based browsers.
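One widely used standardized tiling format of the kind described is the Web-Mercator "slippy map" scheme of 256×256 tiles, where each zoom level z divides the world into 2^z × 2^z tiles. Assuming that scheme (the disclosure does not name a specific one), the tile containing a latitude/longitude can be computed as:

```python
from math import cos, floor, log, pi, radians, tan

def latlon_to_tile(lat, lon, zoom):
    """Web-Mercator tile (x, y) containing a lat/lon at a zoom level,
    per the standard 256x256 slippy-map tiling convention."""
    n = 2 ** zoom  # tiles per axis at this zoom
    x = floor((lon + 180.0) / 360.0 * n)
    lat_r = radians(lat)
    y = floor((1.0 - log(tan(lat_r) + 1.0 / cos(lat_r)) / pi) / 2.0 * n)
    return x, y
```

A web-based browser requests only the tiles whose indices fall inside the current viewport, which is what makes serving large ortho-mosaics practical.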
- the image location system 250 may provide multiple images 44 from different perspectives to the user.
- the image location system 250 may initially provide an overview image 270 , such as a top-down (nadir) view, of an entire area/property.
- the image location system 250 may display an overlay of the polygon 268 on the overview image 270 .
- the image location system 250 may display image tiles or “thumbnail” images 272 (that is, preview images smaller in initial size than the overview image 270 ) of additional images 44 of the property from different perspectives, different distances, and/or different areas of the property for the user to choose to display.
- the thumbnail images 272 may be displayed outside of the overview image 270 , such as on one side of the overview image 270 , as shown in FIG. 27 .
- the overview image 270 and/or other images may have icons 274 on and/or beside the overview image 270 to show where the additional images 44 (for example, those represented by the thumbnail images 272 ) were taken and which direction the image capture device 42 was facing when the additional images 44 were taken.
- the user may select the icon 274 and the image location system 250 may highlight the thumbnail image 272 associated with the icon 274 . In one embodiment, the user may select the thumbnail image 272 and the image location system 250 may highlight the portion of the overview image 270 where the image associated with the thumbnail image 272 was captured. In one embodiment, the user may select the icon 274 and the thumbnail image 272 may be displayed. In one embodiment, the user may select the icon 274 and the additional image 44 may be displayed in full.
- the overview image 270 provides the user with an overall perspective of where additional images 44 are available.
- the additional images 44 may depict less than the total area of the property.
- the image capture device 42 may capture a particular image 44 of a four foot by four foot section of a roof of a structure. If a user views this particular image 44 of the roof section, the user may have difficulty knowing the location of the roof section in relation to the entire structure and/or the property.
- the image location system 250 may provide links to the overall overview image 270 and/or an ortho-mosaic to help orientate the user.
- the image location system 250 may give a reference as to where the image capture device 42 was located and oriented when the particular image 44 was captured such that the location of damage to the roof may be ascertained.
- the image location system 250 may display the icon 274 on the overview image 270 to indicate the location of the image capture device 42 and/or the orientation of the image capture device 42 (that is, the direction, the bearing, of the viewpoint of the image capture device) at the time the image 44 was captured.
- the icon 274 may include a pie shape indicative of the direction the image 44 was taken (that is, which way the image capture device 42 was facing, and the angle of view the image capture device 42 had, when the image capture device 42 captured the image 44 ).
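Such a pie-shaped icon can be parameterized by the camera's bearing and angle of view. The sketch below computes the wedge's start and end compass angles; the function name and the choice to center the wedge on the bearing are assumptions for illustration:

```python
def icon_wedge(bearing_deg, fov_deg):
    """Start/end compass angles (degrees, clockwise from north) of a
    pie-shaped icon showing which way the camera faced; bearing_deg is
    the view center and fov_deg its angular width."""
    half = fov_deg / 2.0
    # modulo keeps angles in [0, 360) even when the wedge crosses north
    return ((bearing_deg - half) % 360.0, (bearing_deg + half) % 360.0)
```

A rendering layer would then draw the wedge at the camera's recorded position on the overview image.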
- the images 44 may be labeled as to what type of image 44 and/or how the image 44 was captured.
- the image 44 may be labeled as being captured by the UAV 12 .
- the thumbnail image 272 and/or the icon 274 may be labeled to indicate the type of image 44 and/or how the image 44 was captured.
- the images 44 , the icons 274 , and/or the thumbnail images 272 displayed on and/or adjacent to the overview image 270 may be labeled with one or more of the following metadata: whether the image 44 or associated image 44 was captured from the UAV 12 , the particular type of the UAV 12 (such as, make, model, and/or an identification number of the UAV 12 ), whether the image 44 was captured from the ground, whether the image 44 was captured from a moving ground vehicle, whether the image 44 was captured from a manned aircraft, whether the image 44 was captured from some other source, what type of image capture device 42 was used to capture the image 44 , or the like.
- the particular type of the UAV 12 such as, make, model, and/or an identification number of the UAV 12
- the user may select one or more points in the overview image 270 and the image location system 250 may display one or more additional image 44 to show a visual depiction related to the one or more selected points.
- the user may select the polygon 268 and the image location system 250 may display all of the additional images 44 available that are encompassed by the polygon 268 .
- a user may search for images 44 by selecting an area, a facet 276 , a point 278 , a component 280 , and/or an intersection of a structure in a first image 44 or in a 2D model 282 or 3D model 284 .
- the user may click on an area or facet of the structure, or draw a shape, such as polygon 268 on an area of the structure.
- the image location system 250 may detect when the user selects an area or facet 276 of the structure, such as by utilizing two-dimensional outlines and/or three-dimensional models of the structures that are associated with geographic locations on the earth and metadata from the images 44 .
- the image location system 250 may allow the user to further specify a particular area of a structure of interest after a first selection by the user. If the user draws a circle or polygon 268 (or even single-clicks to specify a point 278 ), the image location system 250 may further allow the user to specify a particular area, component, and/or element of that structure in which the user is interested.
- Non-exclusive examples of area, component 280 , and/or elements of structures that may be specified include one or more wall, roof plane, roof, floor, door, window, intersection, and cross-section, or portion or combination thereof.
- the image location system 250 may return images 44 to the user that are not just in geographic proximity to a structure, but that include the area of interest in three-dimensional coordinates above the ground and on the structure.
- the user may select an area, such as polygon 268 , on a wall of interest on a structure in an image 44 .
- the image location system 250 can determine that the user selected a section of wall on the structure in the image 44 and not just a point on the ground.
- the image location system 250 may search the metadata database 252 and/or the image warehouse database 254 for images 44 taken in that locality to discover images 44 that point to that region of the structure, such as images 44 a and 44 b shown in FIGS. 31 and 32 .
- images 44 a and 44 b may have been taken by the image capture device 42 of the UAV 12 that depict the user selected location in polygon 268 , such as by the image capture device 42 when the UAV 12 was in a first location 286 and/or in a second location 288 , as shown in FIG. 30 .
- the user may simply click on a side or element of the structure in a first image 44 or in a 2D model 282 (as shown in FIG. 38 ) or 3D model 284 (as shown in FIG. 37 ) and be quickly presented with thumbnails of the images 44 , or the images 44 themselves, that include that side or element of the structure.
- search results returned to the user may include ground shots taken by an adjuster, street-view imagery, drone imagery, selfie-stick shots, manned aerial imagery, 3D models, etc.
Description
- The present patent application claims priority to the provisional patent application identified by U.S. Ser. No. 62/276,539, filed on Jan. 8, 2016, entitled “Unmanned Aerial Systems and Methods of Using and Controlling Flight of Same,” and to the provisional patent application identified by U.S. Ser. No. 62/413,483, filed on Oct. 27, 2016, entitled “Systems and Methods for Processing Images from Unmanned Aerial Vehicles,” the entire contents of all of which are hereby incorporated herein by reference.
- Unmanned aerial systems (UAS) typically include unmanned aerial vehicles (UAV) that do not carry a human operator, but instead operate partially or completely autonomously and/or are remotely piloted.
- Unmanned aerial vehicles may be used to capture images from one or more onboard image capture device and/or capture sensor data from one or more onboard sensor. In some unmanned aerial systems, the images or sensor data may have embedded metadata. In other unmanned aerial systems, metadata from the time the images or sensor data were taken may be available separately from the unmanned aerial system or from an outside source. However, the format and content type of the images, sensor data, and metadata vary widely depending on the type of unmanned aerial vehicle and/or unmanned aerial system. The form of transmission of the images, sensor data, and metadata also varies widely from system to system.
- Therefore, methods and systems are needed to address processing images and accompanying data sourced from diverse unmanned aerial vehicles. Additionally, there is a need for systems and methods to retrieve aerial image and/or sensor data based on the metadata.
FIG. 1 is a block diagram of an exemplary embodiment of an unmanned aerial system in accordance with the present disclosure. -
FIG. 2 is a perspective view of an exemplary embodiment of an unmanned aerial system in accordance with the present disclosure. -
FIG. 3 is an illustration of another exemplary embodiment of an unmanned aerial system in accordance with the present disclosure. -
FIG. 4 is an illustration of yet another exemplary embodiment of an unmanned aerial system in accordance with the present disclosure. -
FIG. 5 is a block diagram of an exemplary embodiment of an unmanned aerial vehicle in accordance with the present disclosure. -
FIG. 6 is a block diagram of an exemplary embodiment of integrated components of an unmanned aerial vehicle in accordance with the present disclosure. -
FIG. 7 is a block diagram of another exemplary embodiment of integrated components of an unmanned aerial vehicle in accordance with the present disclosure. -
FIG. 8 is a block diagram of yet another exemplary embodiment of integrated components of an unmanned aerial vehicle in accordance with the present disclosure. -
FIG. 9 is an illustration of an exemplary embodiment of multiple unmanned aerial vehicles in accordance with the present disclosure. -
FIG. 10 is an illustration of an exemplary embodiment of yet another unmanned aerial vehicle in accordance with the present disclosure. -
FIG. 11 is an illustration of the unmanned aerial vehicle of FIG. 10 in which air bladders are deployed. -
FIG. 12 is a block diagram of an exemplary embodiment of a bladder system in accordance with the present disclosure. -
FIG. 13 is an illustration of exemplary embodiments of unmanned aerial vehicles having closed loop sensors in accordance with the present disclosure. -
FIG. 14 is a block diagram of an exemplary embodiment of a controller of an unmanned aerial vehicle in accordance with the present disclosure. -
FIG. 15 is a block diagram of an exemplary embodiment of a power system of an unmanned aerial vehicle in accordance with the present disclosure. -
FIG. 16 is an illustration of exemplary embodiments of unmanned aerial vehicles having propeller guards in accordance with the present disclosure. -
FIG. 17 is a block diagram of an exemplary embodiment of an avionics system of an unmanned aerial vehicle in accordance with the present disclosure. -
FIG. 18 is an illustration of exemplary embodiments of unmanned aerial vehicles in accordance with the present disclosure. -
FIG. 19 is a block diagram of an exemplary embodiment of a remote station of an exemplary unmanned aerial system in accordance with the present disclosure. -
FIG. 20 is a top plan view of an exemplary geographic location. -
FIG. 21 is an illustration of an exemplary embodiment of boundary marking of the exemplary geographic location of FIG. 20 . -
FIG. 22 is an illustration of another exemplary embodiment of boundary marking of the exemplary geographic location of FIG. 20 . -
FIG. 23 is an illustration of an exemplary embodiment of an unmanned aerial vehicle in use in the exemplary geographic location of FIG. 20 . -
FIG. 24 is a front view of the unmanned aerial vehicle in use of FIG. 23 . -
FIG. 25 is a block diagram of an exemplary embodiment of an image location system in accordance with the present disclosure. -
FIG. 26 is an exemplary embodiment of an overview image in accordance with the present disclosure. -
FIG. 27 is an exemplary embodiment of an overview image having thumbnail images in accordance with the present disclosure. -
FIG. 28 is an exemplary embodiment of an overview image having icons in accordance with the present disclosure. -
FIG. 29 is an exemplary embodiment of an image in accordance with the present disclosure. -
FIG. 30 is an exemplary embodiment of a user search area on an image in accordance with the present disclosure. -
FIG. 31 is an exemplary embodiment of a returned image based on the search area of FIG. 30 in accordance with the present disclosure. -
FIG. 32 is an exemplary embodiment of another returned image based on the search area of FIG. 30 in accordance with the present disclosure. -
FIG. 33 is an exemplary embodiment of another user search area on an image in accordance with the present disclosure. -
FIG. 34 is an exemplary embodiment of another user search area on a three dimensional model in accordance with the present disclosure. -
FIG. 35 is an exemplary embodiment of another user search area on a three dimensional model in accordance with the present disclosure. -
FIG. 36 is an exemplary embodiment of a user search point on a three dimensional model in accordance with the present disclosure. -
FIG. 37 is an exemplary embodiment of a user selected wall on a three dimensional model in accordance with the present disclosure. -
FIG. 38 is an exemplary embodiment of a user search area on a two dimensional model in accordance with the present disclosure.
- As used herein, the terms “comprises,” “comprising,” “includes,” “including,” “has,” “having” or any other variation thereof, are intended to cover a non-exclusive inclusion. For example, a process, method, article, or apparatus that comprises a list of elements is not necessarily limited to only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Further, unless expressly stated to the contrary, “or” refers to an inclusive “or” and not to an exclusive “or”. For example, a condition A or B is satisfied by any one of the following: A is true (or present) and B is false (or not present), A is false (or not present) and B is true (or present), and both A and B are true (or present).
- In addition, the terms “a” or “an” are employed to describe elements and components of the embodiments herein. This is done merely for convenience and to give a general sense of the inventive concept. This description should be read to include one or more, and the singular also includes the plural unless it is obvious that it is meant otherwise.
- Further, use of the term “plurality” is meant to convey “more than one” unless expressly stated to the contrary.
- As used herein, qualifiers like “substantially,” “about,” “approximately,” and combinations and variations thereof, are intended to include not only the exact amount or value that they qualify, but also some slight deviations therefrom, which may be due to manufacturing tolerances, measurement error, wear and tear, stresses exerted on various parts, and combinations thereof, for example.
- The use of the term “at least one” or “one or more” will be understood to include one as well as any quantity more than one. In addition, the use of the phrase “at least one of X, Y, and Z” will be understood to include X alone, Y alone, and Z alone, as well as any combination of X, Y, and Z.
- The use of ordinal number terminology (i.e., “first”, “second”, “third”, “fourth”, etc.) is solely for the purpose of differentiating between two or more items and, unless explicitly stated otherwise, is not meant to imply any sequence or order or importance to one item over another or any order of addition.
- Finally, as used herein any reference to “one embodiment” or “an embodiment” means that a particular element, feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment. The appearances of the phrase “in one embodiment” in various places in the specification are not necessarily all referring to the same embodiment.
- In one embodiment in accordance with the present disclosure, a method for capturing aerial images comprises: determining, with at least one of a controller of an unmanned aerial vehicle and a processor of a remote station, a flight plan of the unmanned aerial vehicle, the flight plan configured such that the unmanned aerial vehicle and fields of view of an image capture device of the unmanned aerial vehicle are restricted to an area within boundaries of a geographic location identified by coordinates of the geographic location; and executing, with the unmanned aerial vehicle, the flight plan; and capturing, with the image capture device, one or more aerial images solely within the boundaries of the geographic location while executing the flight plan.
- In one embodiment, executing the flight plan is carried out automatically by the controller of the unmanned aerial vehicle.
- In one embodiment, executing the flight plan is at least partially carried out by an operator utilizing the human-machine interface module of the remote station, and further comprising: receiving, by the remote station from the communications system, one or more first non-transitory signal indicative of position of the unmanned aerial vehicle; and transmitting, from the remote station to the communications system of the unmanned aerial vehicle, one or more second non-transitory signal indicative of instructions for navigation of the unmanned aerial vehicle to maintain the unmanned aerial vehicle within the boundaries.
- In one embodiment, a method comprises receiving aerial images captured by one or more unmanned aerial vehicle; receiving metadata associated with the aerial images captured by the one or more unmanned aerial vehicle; geo-referencing the aerial images based at least in part on a geographic location of a surface to determine geographic coordinates of pixels of the aerial images; receiving a geographic location from a user; retrieving one or more of the aerial images associated with the geographic location based on the determined geographic coordinates of the pixels; and displaying to the user one or more overview image depicting the geographic location and overlaid with one or more icons indicative of and associated with the retrieved aerial images associated with the geographic location.
- In one embodiment, the metadata includes orientation, attitude, and bearing of one or more image capture device that captured the one or more aerial image, and wherein geo-referencing the aerial images based at least in part on a geographic location of a surface to determine geographic coordinates is geo-referencing the aerial images based at least in part on a geographic location of a surface and using the orientation, attitude, and bearing of the image capture device to determine the geographic coordinates of objects depicted in the one or more aerial image.
- In one embodiment, the method further comprises receiving a selection from the user of one of the icons; and displaying the retrieved aerial image associated with the icon. The geographic location from the user may be in a form of three or more geographic points forming a polygon. The method may further comprise creating a three dimensional polygon based on the polygon and a predetermined height dimension; wherein retrieving one or more of the aerial images associated with the geographic location based on the determined geographic coordinates further comprises retrieving one or more of the aerial images associated with the geographic location based on the three dimensional polygon.
- In one embodiment, a method comprises receiving aerial images captured by one or more image capture device on one or more unmanned aerial vehicle, the aerial images depicting only objects above the ground; receiving metadata associated with the one or more image capture device at the time the aerial images were captured, the metadata including latitude and longitude of the one or more image capture device and one or more of altitude, orientation, attitude, and bearing of the one or more image capture device; receiving information indicative of a location of a region of interest; and geolocating one or more of the aerial images, thereby associating one or more of the geolocated aerial images with the region of interest. Geolocating the one or more aerial images may be based at least in part on correlating the information indicative of the location of the region of interest and the metadata associated with the one or more image capture device at the time the aerial images were captured.
- In one embodiment, the metadata associated with the one or more image capture device may further include one or more of sensor size of the one or more image capture device, focal length of the one or more image capture device; pixel pitch of the one or more image capture device, and distortion parameters of the one or more image capture device.
- In one embodiment, a method comprises receiving aerial images captured by one or more unmanned aerial vehicle and time data indicative of a time the aerial images were captured; receiving metadata captured by the one or more unmanned aerial vehicle including time data indicative of when the metadata was captured; associating the metadata with the aerial images based at least in part on matching the time data of the metadata with the time data of the aerial images; geo-referencing the aerial images based on a geographic location of a surface to determine geographic coordinates of pixels for ground locations and objects depicted in the aerial images; receiving a geographic location from a user; retrieving one or more of the aerial images associated with the geographic location based on the determined geographic coordinates; and displaying to the user one or more overview image depicting the geographic location and overlaid with one or more icons indicative of and associated with the retrieved aerial images associated with the geographic location.
- In one embodiment, a method comprises receiving non-standardized metadata captured by an unmanned aerial vehicle and associated with one or more image captured by the unmanned aerial vehicle; transforming the non-standardized metadata into a standardized format; and storing the transformed metadata in a first database associated with the one or more image stored in a second database.
- In one embodiment, a method comprises determining, with at least one of a controller of an unmanned aerial vehicle and a processor of a remote station, a flight plan of the unmanned aerial vehicle, the flight plan configured such that the unmanned aerial vehicle and fields of view of an image capture device of the unmanned aerial vehicle are restricted to an area within boundaries of a geographic location identified by coordinates of the geographic location; executing, with the unmanned aerial vehicle, the flight plan; and capturing, with the image capture device, one or more aerial images solely within the boundaries of the geographic location and restricted to fields of view within the boundaries while executing the flight plan. Executing the flight plan may be carried out automatically by the controller of the unmanned aerial vehicle. Executing the flight plan may be at least partially carried out by an operator utilizing a human-machine interface module of the remote station, and further comprise receiving, by the remote station, one or more first non-transitory signal indicative of position of the unmanned aerial vehicle; and transmitting, from the remote station to a communications system of the unmanned aerial vehicle, one or more second non-transitory signal indicative of instructions for navigation of the unmanned aerial vehicle to maintain the unmanned aerial vehicle within the boundaries.
- In one embodiment, a method comprises receiving aerial images captured by one or more unmanned aerial vehicle; receiving metadata associated with the aerial images captured by the one or more unmanned aerial vehicle; geo-referencing the aerial images based at least in part on a geographic location of a surface to determine geographic coordinates of pixels of the aerial images; receiving a geographic location from a user based on selection by the user of one or more pixels of a first one of the aerial images, the geographic location being above the ground; and retrieving one or more of second ones of the aerial images associated with the geographic location from the user based on the determined geographic coordinates of the pixels.
- In one embodiment, the geographic location from the user may be in a form of three or more geographic points based on selection by the user of three or more pixels forming a polygon of the first one of the aerial images.
- In one embodiment, the first one of the aerial images may include a depiction of a structure, and the geographic location from the user in the form of three or more geographic points forming a polygon may be located on the structure in the first one of the aerial images.
- In one embodiment, the first one of the aerial images may include a depiction of a structure, and the geographic location from the user may be in a form of one or more elements of the structure chosen by the user in the first one of the aerial images.
- In one embodiment, the one or more elements of the structure may be chosen from the group consisting of a wall, a roof plane, a roof, a floor, a door, an intersection, a cross-section, and a window.
- Referring now to the drawings,
FIG. 1 is a block diagram of an exemplary embodiment of an unmanned aerial system (UAS) 10 in accordance with the present disclosure. The UAS 10 may comprise one or more Unmanned Aerial Vehicle (UAV) 12. In some embodiments the UAS 10 may further comprise one or more remote station 14. In one embodiment, one or more remote operator 16 may interact with the remote station 14. The remote station 14 may serve a range of functions, from simply receiving data from the UAV 12, up to and including completely controlling all functions of the UAV 12. Further, it will be understood that the UAS 10 may comprise a plurality of UAVs 12 and/or a plurality of remote stations 14, working in pairs, separately, or working together in any combination, for example, as shown in FIG. 2. The UAS 10 may comprise two or more UAVs 12 working in tandem and/or independently. - In one embodiment, the
UAS 10 may comprise a transponder system (not shown) configured for transmitting signals to other aircraft, the signals comprising information regarding the UAS 10 and/or location of the UAS 10 or UAV 12. The transponder system may be located partially or completely in one or both of the UAV 12 and the remote station 14. - In one embodiment, as illustrated in
FIG. 3, the UAS 10 may further comprise a case 17. The case 17 may be used to store and transfer the UAV 12 and/or the remote station 14. Additionally, or alternately, the case 17 may be part of the remote station 14. Additionally, or alternately, the case 17 may be used as part of pre-flight check(s). For example, the case 17 may be used to weigh down the UAV 12 for a power-up check and provide targets for a collision detection diagnostic phase. - The
case 17 may also contain actuators (not shown) to move the UAV 12 on the various axes to test how the UAS 10 reacts to changes in attitude. As one non-exclusive example, the UAV 12 is secured to the case 17 such that the UAV 12 may be moved to allow roll, pitch, and yaw. The UAV 12 may be connected to the case 17 via one or more gimbal 21, a nested gimbal, and/or a gimbal lock. A gimbal lock restricts one degree of freedom in a multi-dimensional, multi-gimbal mechanism having “n” gimbals and thus “n” degrees of freedom. The gimbal lock restricts the axes of “n-1” gimbals. For example, in a three-gimbal system, two of the three gimbals are driven into a parallel configuration, “locking” the system into rotation in a degenerate two-dimensional space. - In one embodiment, multiple servos and/or motors may rotate the
UAV 12 across each degree of freedom (roll, pitch, and yaw) in a series of tests to verify that the correct power is provided to the correct component of the UAV 12 to compensate for the motion, thereby testing flight-worthiness of the UAV 12. - As illustrated in
FIG. 4, in one embodiment, the UAS 10 may further comprise a tether 18 for tethering the UAV 12 to a base 19. In one embodiment, the remote station 14 and/or the case 17 may act as the base 19. In one embodiment, power may be provided through the tether 18 using step up/step down transformers (not shown). - In one embodiment, the
UAS 10 may employ software-based distance and/or altitude limits to limit and/or control the use of the UAV 12 to a control range. For example, the operator 16 may set a maximum distance limit in the UAV 12 and/or the remote station 14 so the UAV 12 will not go beyond the control range. Additionally, or alternately, the operator 16 may set a maximum above-ground limit in the UAV 12 and/or remote station 14 so the UAV 12 will not go above a set altitude, for example, 400 feet above the ground. In one embodiment, the maximum altitude limit is set based on Federal Aviation Administration rules. For example, the remote station 14 and/or the UAV 12 may be programmed with data indicative of a particular type of airspace and to restrict the use of the UAV 12 to that particular type of airspace. For example, the particular type of airspace could be “class G” airspace to substantially prevent the UAV 12 from interfering with aircraft in another type of airspace (such as other airspace classes). - In one embodiment, the
UAV 12 may automatically return to a predetermined home location and/or to the remote station 14 if there is a failure. For example, if components of the UAV 12 fail or if the signal from the remote station 14 fails, the UAV 12 may automatically return to the home location and/or to the remote station 14. - UAV 12:
- As shown in
FIGS. 2 and 5, the UAV 12 may comprise an airframe 20, a controller 22, a communications system 24, a power system 26, a propulsion system 28, and an avionics system 30. In some embodiments, the UAV 12 may comprise a navigation system 32, or the navigation system 32 may be partially or completely in the remote station 14. In some embodiments, the UAV 12 may comprise one or more Electronic Speed Control (ESC) 34. In some embodiments, the UAV 12 may comprise one or more power bus 36. In one embodiment, the UAV 12 may comprise one or more speed reduction device 150 (for example, as shown in FIG. 18). In some embodiments, the UAV 12 may comprise one or more actuator 38. - The
UAV 12 may carry a payload 40. In one embodiment, components of the UAV 12 are sized and specified to safely carry the weight of the desired payload 40 and to meet specifications to withstand wind forces and to reduce the weight of the UAV 12. Additionally, since the weight of the UAV 12 is related to the kinetic energy of the UAV 12, a UAV 12 with a reduced weight has less kinetic energy than a heavier UAV 12, and therefore minimizes damage in the event of a crash of the UAV 12. - The
UAV 12 may comprise one or more image capture device 42 and/or may carry one or more image capture device 42 as part of the payload 40. Nonexclusive examples of image capture devices 42 include cameras (capable of detecting visible and non-visible ranges of light), infrared sensors, radar, and sonar. The image capture device 42 may capture images 44. The UAV 12 may transmit the images 44 to the remote station 14 and/or to a remote system (not shown), and/or store the images 44, and/or process (partially or fully) the images 44 onboard the UAV 12. Nonexclusive examples of processing the images 44 may include partially or completely georeferencing one or more images 44, geolocating one or more images 44, reviewing one or more images 44 for abnormalities, performing quality control of one or more images 44, tie-pointing (manual/automatic) to relate adjacent images 44, bundle adjustments, 3D point cloud generation from 2D images 44, mosaic generation from the images 44, and/or color-balancing one or more images 44. - In one embodiment, components of the
UAV 12 may be tightly integrated to reduce size and weight of the UAV 12. For example, as illustrated in FIG. 6, the controller 22, the communications system 24, the ESCs 34, the power bus 36, and/or components of the power system 26 (e.g. motors) may be integrated into one or more printed circuit board (PCB) 50 or a hybrid PCB and integrated circuit. Wires may be substituted with PCB traces, thus reducing or eliminating the number of wires and connectors required. - Additionally, or alternately, all or some of the
payload 40, for example, the image capture device 42, may be integrated on one or more shared PCB 50, as shown in FIG. 7, for example, with the power bus 36 and/or controller 22. - In one embodiment, as shown in
FIG. 8, components of the power system 26 may be mounted directly to the PCB 50, along with other components, such as the power bus 36 and/or the controller 22. Additionally, or alternately, wires may be used to connect the power system 26 to the Electronic Speed Controls 34. - UAV Airframe 20:
- Returning to
FIG. 2, it will be understood that any type of aircraft airframe may be used as the basis of the airframe 20 of the UAV 12. Non-exclusive examples of types of UAVs 12 having different airframes 20 include a fixed-wing UAV 12a having a front or rear propeller, a fixed-wing UAV 12b having multiple wing propellers, a helicopter type UAV 12c, a multi-rotor UAV 12d, a tilt-rotor UAV 12e, a jet-type UAV 12f, and a blimp-type UAV 12g. In one embodiment, the airframe 20 of the UAV 12g may have a blimp-like design in which the airframe 20 encloses lighter-than-air gas. - The
airframe 20 of the UAV 12 may have one or more control surfaces 60 such as elevators, rudders, flaps, slats, and/or ailerons. The control surfaces 60 may have one or more servomechanism (not shown). - The
airframe 20 of the UAV 12 may have attachments to carry the payload 40 and/or the payload 40 may be integrated into the airframe 20 of the UAV 12. - In one embodiment, the
PCB 50 may also form a section of the airframe 20. - The
airframe 20 may be configured to absorb energy, such as energy generated in a collision. In one embodiment, as illustrated in FIG. 9, the airframe 20 may include padding 70 that meets OSHA 1910.135(b) and the cited ANSI requirements for head protection. The padding 70 may substantially cover one or more exterior surfaces of the airframe 20. The padding 70 may be formed of foam or other appropriate padding material. In one embodiment, the airframe 20 is completely or partially composed of foam or other appropriate padding material. - In one embodiment, as illustrated in
FIGS. 10-12, the airframe 20 may include a bladder system 72 having air bladders 74. The air bladders 74 may substantially cover the airframe 20. The air bladders 74 may weigh less than padding 70. - In one embodiment, the
air bladders 74 may have an un-inflated state (FIG. 10) and an inflated state (FIG. 11). In the inflated state, the air bladders 74 may encompass all or part of an exterior of the UAV 12 to protect the UAV 12 from impact with other objects, as well as to protect other objects from impact with the UAV 12. The air bladders 74 may be automatically and/or manually (remotely) switched to the inflated state if the UAV 12 is out of control. In one embodiment, the controller 22 may monitor the power system 26, the propulsion system 28, the avionics system 30, and/or the navigation system 32. If the controller 22 determines the UAV 12 is outside of predetermined parameters for one or more of the systems, the controller 22 may signal the air bladder system 72 to switch the air bladders 74 to the inflated state from the uninflated state. In one embodiment, the air bladders 74 may be automatically triggered to the inflated state when power is lost to one or more of the systems in the UAV 12. - In one embodiment, as shown in
FIG. 12, the bladder system 72 may comprise one or more air bladders 74, a bladder system control 76, and one or more containers 78 containing compressed gas 79. The air bladders 74 may be inflated with the compressed gas 79 from the containers 78 by the bladder system control 76. The air bladders 74 may be automatically and/or manually (remotely) switched to the inflated state if the UAV 12 is out of control, via the bladder system control 76. In one embodiment, the bladder system control 76 may monitor the power system 26, the propulsion system 28, the avionics system 30, and/or the navigation system 32. If the bladder system control 76 determines the UAV 12 is outside of predetermined parameters for one or more of the systems, the bladder system control 76 may signal the air bladder system 72 to switch the air bladders 74 to the inflated state from the uninflated state. - In one embodiment, the
airframe 20 may include both padding 70 and air bladders 74. - In one embodiment, sections of, or all of, the
airframe 20 may be designed to break apart or compress on impact to help absorb the energy of a collision. This might include spring loading, gas loading, compressible materials, or weak points in the airframe 20 that are meant to break and/or collapse during a collision. - As illustrated in
FIG. 13, in one embodiment, the UAV 12 may comprise a closed loop sensor 80 surrounding at least a portion of the airframe 20. The closed loop sensor 80 comprises an electrical circuit 82 surrounding at least a portion of the airframe 20. The closed loop sensor 80 works to signal the controller 22 if there is a break in the electrical circuit 82. For example, if the airframe 20 is damaged or is in a collision, there is a disruption in the electrical circuit 82 of the closed loop sensor 80, and the controller 22 and/or the remote station 14 receive a signal indicating the airframe 20 has been compromised. Then the controller 22 and/or the remote station 14 may shut down the power system 26 and/or emit a warning to the remote operator 16 and anyone in the vicinity. The warning may be in any form, non-exclusive examples of which are an audible and/or visual warning. - UAV Controller 22:
- Turning now to
FIG. 14, a block diagram of an exemplary controller 22 is shown. The controller 22 may control the functions of, and/or receive data from, the communications system 24, the power system 26, the propulsion system 28, the avionics system 30, the navigation system 32, and/or the ESC(s) 34. In one embodiment, the controller 22 may use data from the avionics system 30 or elsewhere (for example, an airspeed sensor, one or more down-facing camera, GPS speed, etc.) to detect and limit the speed of the UAV 12. In one embodiment, the controller 22 may contain a maximum speed setting and/or altitude setting for the UAV 12. - In one embodiment, the
controller 22 may include one or more computer processor 90 and/or field-programmable gate array (FPGA) 92, one or more drive 94, one or more user input device 96, and one or more non-transitory memory 98. In one embodiment, the controller 22 may have an image capture module 100. - The
computer processors 90 and/or FPGAs 92 may be programmed or hardwired to control the UAV 12 and/or to interpret and carry out commands from the remote station 14 to control the UAV 12. In one embodiment, the controller 22 may be configurable to perform specific in-flight functions. The controller 22 may receive flight control instructions from the remote station 14 (or elsewhere), control relevant flight control mechanisms (such as through the power system 26, propulsion system 28, navigation system 32, and/or avionics system 30), and/or provide feedback information (e.g., telemetry information) to the remote station 14 and/or other device(s). - The
drives 94 and their associated computer storage media, such as removable storage media (e.g., CD-ROM, DVD-ROM) and non-removable storage media (e.g., a hard drive disk), may provide storage of computer readable instructions, data structures, program modules and other data. The drives 94 may store and include an operating system, application programs, program modules, and one or more database storing various data, nonexclusive examples of which include image data, position data, flight control instructions data, flight path data, past flight data, sensor data, and navigation data. - The
controller 22 further may include one or more user input device 96, through which a user may enter commands and data. Non-exclusive examples of input devices 96 may include an electronic digitizer, a microphone, a keyboard, and a pointing device such as a mouse device, trackball device or touch pad device. Other input devices 96 may include a joystick device, game pad device, satellite dish, scanner device, or the like. - In one embodiment, the
controller 22 may stream data (live or delayed feed) utilizing the communications system 24 to the remote station 14, or other site(s) or vehicle(s), and/or may store data in the one or more non-transitory memory 98. In data streaming applications, the controller 22 may transmit real-time video or data to the remote station 14 and/or to points worldwide. The UAV 12 may have Internet connectivity (for example, through an Inmarsat satellite) and may transmit data directly over the Internet. - In some embodiments, the
image capture module 100 may transmit captured images 44 to the remote station 14 or other device through the communication system 24, store the captured images 44 in the memory 98, and/or process the captured images 44. Non-exclusive examples of processing of captured images are described in U.S. Pat. No. 8,477,190, issued Jul. 2, 2013, titled “Real-Time Moving Platform Management System;” U.S. Pat. No. 8,385,672, issued Feb. 26, 2013, titled “System for Detecting Image Abnormalities;” U.S. Pat. No. 7,424,133, issued Sep. 9, 2008, titled “Method and Apparatus for Capturing, Geolocating and Measuring Oblique Images;” and U.S. Patent Publication US20150221079A1, published Aug. 6, 2015, titled “Augmented Three Dimensional Point Collection of Vertical Structures;” all of which are hereby incorporated by reference in their entirety herein. - The
image capture module 100 and/or the remote station 14 may also be used to adjust operational parameters, such as resolution, of the image capture device 42. For example, the image capture module 100 and/or the remote station 14 may transmit one or more signal to the image capture device 42 indicating a change to operational parameters. - The
memory 98 of the controller 22 may comprise, but is not limited to, RAM, ROM, EEPROM, flash memory or other memory technology, CD-ROM, digital versatile disks (DVD) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other medium which may be used to store the desired information and which may be accessed by the controller or the remote station or other remote processor. Any such computer storage media may be part of the controller 22 and/or the remote station 14. - In one embodiment, the
controller 22 may automatically mitigate unexpected flight characteristics. Nonexclusive examples of unexpected flight characteristics include a ground effect (that is, increased lift force and decreased aerodynamic drag that wings or rotors generate when they are close to a fixed surface), translational lift (i.e., a transitional state present after a helicopter has moved from hover to forward flight), and vortex ring state (i.e., settling with power, in which a rotary UAV 12 settles into its own downwash, causing loss of lift). The controller 22 may monitor the power system 26, propulsion system 28, navigation system 32, and/or avionics system 30 to detect unexpected flight characteristics. After detection, the controller 22 may implement countermeasures to the detected unexpected flight characteristics, such as sending one or more non-transitory signals to the power system 26, propulsion system 28, navigation system 32, and/or avionics system 30, to control the flight of the UAV 12. - UAV
Electronic Speed Controls 34 and Actuators 38: - As previously described and shown in
FIG. 5, in some embodiments, the UAV 12 may comprise one or more Electronic Speed Control (ESC) 34. In some embodiments, the ESC(s) 34 may control the operation of the control surfaces 60 of the airframe 20 and/or may control the propulsion system 28, either in conjunction with or instead of the controller 22. - In some embodiments, in which the
UAV 12 may comprise one or more actuator 38, the controller 22 and/or the ESC(s) 34 may control the operation of the actuator 38 to actuate the propulsion system 28 (for example, the rotor blade) and/or the control surfaces 60 of the UAV 12. In some embodiments, the ESC 34 may be electrically connected to the controller 22 and the actuator 38. The controller 22 may provide control signals for the ESC 34, which in turn provides actuator signals to the electrically connected actuator 38 so as to actuate the corresponding component of the propulsion system 28 (such as the rotor) or control surface 60 on the airframe 20. In some embodiments, feedback signals can also be provided by the actuator 38 and/or the ESC 34 to the controller 22. - In one embodiment, the number of
ESCs 34 is equal to the number of actuators 38 (such as actuators 38 controlling rotors) of the UAV 12. For example, a 4-rotor UAV 12d may have four actuators 38 and four ESCs 34. In an alternative embodiment, the number of ESCs 34 may be different (more or less) than the number of actuators 38. - In some embodiments, the
ESC 34 may control the speed of revolution of the power system 26, such as a motor/generator or an engine. - In some embodiments, the
ESCs 34 may be optional. In some embodiments, instead of, or in addition to, the ESCs 34, other types of actuator controllers can be provided to control the operation of the actuators 38, and/or the controller 22 may directly control the control surfaces 60 and/or the propulsion system 28. - UAV Communications System 24:
- The
communications system 24 of the UAV 12 may communicate with an external system, such as the remote station 14, or other UAVs 12, UASs 10, aircraft, or other vehicles (including ground vehicles or satellites). As depicted in FIG. 5, the communications system 24 may have one or more receiver 110 and one or more transmitter 112. The communications system 24 may have one or more antenna 114 and one or more attenuator 116 for the antenna(s) 114. The attenuator 116 may reduce the strength of a signal from or to the antenna 114. The attenuator 116 may be used for range testing between the UAV 12 and the remote station 14. - An interlock (not shown) may be used to prevent the
UAV 12 from taking off with the attenuator 116 in place. The interlock is a device that makes the state of two mechanisms mutually dependent. In one example of an interlock, a sensor is configured to detect that the attenuator 116 is in place. If the attenuator 116 is in place, the UAV 12 is prevented from flying (or flying beyond a predetermined distance) to prevent the UAV 12 from flying beyond the range of the controller 22 with the attenuator 116 attached to the UAV 12. The attenuator 116 may also be affixed to the case 17 such that, when the UAV 12 is removed from the case 17, the attenuator 116 is effectively removed. - The
antenna 114 may transmit/receive one or more signal to/from thecommunications system 24 to communicate with theremote station 14 and/orother UAV 12, aircraft, and/or vehicles. - Non-exclusive examples of communications systems are described in U.S. Pat. No. 8,477,190, issued Jul. 2, 2013, titled “Real-Time Moving Platform Management System,” which is hereby incorporated by reference in its entirety.
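- The attenuator interlock described above reduces to a simple guard condition. A minimal sketch follows; the function name and the range-test flag are illustrative assumptions, not part of the disclosure:

```python
# Hypothetical attenuator interlock: flight is blocked while the
# range-test attenuator is detected in place on the antenna.

def takeoff_permitted(attenuator_detected, range_test_mode=False):
    """Allow takeoff only when the attenuator is absent; range testing
    is a ground activity, so it never permits takeoff either."""
    if range_test_mode:
        return False
    return not attenuator_detected
```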
- UAV Power System 26:
- The
power system 26, as depicted in FIG. 5, may comprise one or more power generation and/or storage devices. In one embodiment, as illustrated in FIG. 15, the power system 26 may comprise one or more motor 118 or engine (not shown). For example, the engine may be a piston engine or a jet. In one embodiment, the power system 26 may comprise one or more generator and/or a solar power system (not shown) for generating power to supply to the motor 118 or other components of the UAV 12. In one embodiment, the power system 26 may comprise one or more fuel cell (not shown) for generating electrical energy to supply to the motor 118 of the power system 26. - In one embodiment, the
motor 118 of the power system 26 may have a light-weight housing 120 made of plastic, or other low-weight material. For example, a motor with a plastic housing may be used (e.g. Emax PM2212 920 KV, Plastic Brushless Motor). - The
motor 118 may be integrated with theairframe 20 and/or a part of theairframe 20. For example, the housing 120 may be molded or printed into theairframe 20, such that fewer or no fasteners (such as screws) are needed to secure the motor(s) 118 to theairframe 20, thus eliminating the potential failure of the fasteners. - In one embodiment, as shown in
FIG. 15, the power system 26 may also comprise one or more battery 122 sized to provide power for the desired task set for the UAV 12. Capacity of the battery 122 may be sized for the task with a margin for error. Typically, for small UAVs 12, the battery 122 may make up a significant portion of the weight of the UAV 12. In one embodiment, multiple batteries 122 may be used in conjunction with a base station, such as the remote station 14, such that the UAV 12 can fly back to the remote station 14 and switch out the battery 122. In one embodiment, the battery 122 may be charged from the remote station 14. The battery 122 may be automatically exchanged for another battery at the remote station 14. - The one or
more battery 122 may directly plug into a socket of the PCB 50 so there are no wires between the battery 122 and the power bus 36, thus eliminating the added weight of wiring between the battery 122 and the power bus 36. - In one embodiment, the
controller 22 and/or the remote station 14 may monitor voltage of the battery 122 to help determine the remaining capacity of the battery 122. Total battery power output may be monitored (both volts and amps) to determine the total power drain from the battery 122. Batteries 122 may have a built-in check (not shown) so the operator 16 can easily check the power level of the battery 122. The built-in check may be a push-button with visual or audible indicators of the level of power of the battery 122. - The
controller 22 and/or the remote station 14 may shut down the power system 26 or components of the power system 26, such as the one or more motors 118 (FIG. 15), in the event of a malfunction. For example, the controller 22 and/or remote station 14 may shut down the power system 26 when an impact is detected, such as by an accelerometer; or when there is a disruption in the closed loop sensor 80 surrounding the airframe 20 indicating the airframe 20 has been compromised. - UAV Propulsion System 28:
- As shown in
FIG. 2, the propulsion system 28 may comprise one or more propulsion device 130, including a combination of different types of propulsion devices 130. - In one embodiment, the one or
more propulsion device 130 of the UAV 12f may be a jet engine. - In one embodiment, the one or
more propulsion device 130 may comprise one or more rotor 132. The term “rotor” as used herein refers to a hub with a number of rotating air foils or blades. The rotor 132 may be orientated vertically (such as to provide propulsion), horizontally (such as to provide lift), or may be angularly adjustable (such as a tilt rotor). In one embodiment, the one or more rotor 132 may be comprised of a material that yields when subjected to force, such as in the event of a strike of the rotor 132 against another object. For example, if the rotor 132 strikes an object, the rotor 132 may deflect, bend, or break to absorb the force of the strike. - As shown in
FIG. 16, in one embodiment, the propulsion system 28 may further comprise a propeller guard 134. The propeller guard 134 may be connected to and supported by the airframe 20. The propeller guard 134 may surround the rotor(s) 132 with a shroud or a cowling. The propeller guard 134 may cover exposed areas of the rotor(s) 132. In one embodiment, the propeller guard 134 may have openings no larger than one-half inch. The dimensions of the openings may comply with the Occupational Safety and Health Administration regulation 1910.212(a)(5), which states in part, “The use of concentric rings with spacing between them not exceeding a one-half inch are acceptable, provided that sufficient radial spokes and firm mountings are used to make the guard rigid enough to prevent it from being pushed into the fan blade during normal use.” - In one embodiment, removing the
propeller guard 134 may interrupt electricity to thepropulsion system 28. In one embodiment, when thepropeller guard 134 is removed, a circuit (not shown) of thepower system 26 is interrupted so that thepower system 26 is nonoperational, and therotor 132 is therefore no longer provided power. Thepropeller guard 134 may include a continuous loop conductor (e.g., conductive ink) (not shown) that substantially covers the outline of thepropeller guard 134, such that, in the event that thepropeller guard 134 is broken, the conductive path is also broken. When thecontroller 22 of theUAV 12 detects a break in thepropeller guard 134, the UAS (such as thecontroller 22 and/or the remote station 14) may shut down thepower system 26 and/or emit an audible and visual warning to theoperator 16 and anyone in the vicinity. - The
controller 22 and/or theremote station 14 may shut down thepropulsion system 28 or components of thepropulsion system 28, such as therotors 132, in the event of a malfunction. For example, thecontroller 22 and/orremote station 14 may shut down thepropulsion system 28 when an impact is detected, such as by the accelerometer; or when there is a disruption in theclosed loop sensor 80 surrounding theairframe 20 indicating theairframe 20 has been compromised. -
UAV Avionics System 30 and Navigation System 32: - As shown in
FIG. 5 , in one embodiment, theUAV 12 may comprise anavionics system 30. In one embodiment, theavionics system 30 may include mechanical and electronic flight control mechanisms such as motor(s), servo(s), fuel control switches, etc. (not shown) associated with various flight operations of theUAV 12. In one embodiment, theavionics system 30 may comprise one or more processor (not shown). In one embodiment, theavionics system 30 may comprise one ormore actuators 38. - In one embodiment, illustrated in
FIG. 17 , theavionics system 30 may comprise one ormore sensor 140 i . . . 140 i+n. Of course, it will be understood that one or more of thesensors 140 i . . . 140 i+n may be onboard theUAV 12 but outside of the avionics system. Nonexclusive examples ofsensors 140 i . . . 140 i+n, include a roll sensor, a pitch sensor, a yaw sensor, an altitude sensor (such as an altimeter), a directional sensor, and a velocity sensor. In one embodiment, theavionics system 30 may comprise an inertial measurement unit (IMU) for measuring the velocity, orientation, and/or gravitational forces of theUAV 12. The IMU may include one or more accelerometers and/or gyroscopes. - The
sensors 140 i . . . 140 i+n may further comprise an airspeed sensor for determining the relative speed between the UAV 12 and the body of air through which it is travelling. In one embodiment, the sensors 140 i . . . 140 i+n may comprise a pitot sensor comprising both static and dynamic pressure sensors. - The
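For illustration only, the airspeed computation implied by paired static and dynamic pressure readings may be sketched as below. This is an assumption of incompressible flow with a fixed sea-level air density; the function name and numbers are hypothetical and not part of this disclosure.

```python
import math

def indicated_airspeed(total_pressure_pa: float, static_pressure_pa: float,
                       air_density_kg_m3: float = 1.225) -> float:
    """Estimate airspeed (m/s) from a pitot sensor using Bernoulli's
    equation for incompressible flow: v = sqrt(2 * q / rho), where the
    dynamic pressure q is the total (pitot) pressure minus the static
    pressure."""
    dynamic_pressure = max(total_pressure_pa - static_pressure_pa, 0.0)
    return math.sqrt(2.0 * dynamic_pressure / air_density_kg_m3)

# A dynamic pressure of 61.25 Pa at sea-level density corresponds to ~10 m/s.
```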
sensors 140 i . . . 140 i+n may comprise one or more altitude sensor, which provides a signal indicative of the altitude of theUAV 12 above sea level and/or above ground. For example, the altitude sensor may comprise a GPS receiver, a magnetometer, a barometric altimeter, etc. Signals from the sensor(s) 140 i . . . 140 i+n may be sent via a power bus (not shown) to theavionics system 30 and/or thenavigation system 32. - In previous systems that utilized GPS receivers and/or magnetometers, the GPS receiver and magnetometer were located close to the other electrical components. However, the operation of magnetometers may be affected by interference from other electrical components and/or the GPS receiver. To reduce risk of interference, in one embodiment, the
ESCs 34 of themotors 118 may be mounted away from the magnetometer to prevent interference. Additionally, or alternately, theESCs 34, magnetometer, and/or GPS receiver may be shielded. - In one embodiment, the
sensors 140 i . . . 140 i+n may comprise one or more collision detection sensor. Non-exclusive examples of collision detection sensors include an ultra-sonic device, a radar device, a laser device, a sonar device, an imaging device, and a transponder/receiver device. In one embodiment, the one or more collision detection sensor may be utilized by theavionics system 30 to determine position of and avoid collisions with other aircraft, the ground, other structures, trees, and/or other obstacles. - In one embodiment in which the collision detection sensor is a transponder/receiver device, the
avionics system 30 may comprise a Traffic Alert and Collision Avoidance System (TCAS) utilizing the collision detection sensor to warn of aircraft within the vicinity of the UAV 12. Such systems are well known by persons having skill in the art, for example, as described in “The Traffic Alert and Collision Avoidance System,” Kuchar and Drumm, Lincoln Laboratory Journal, Volume 16, Number 2, 2007, which is hereby incorporated by reference in its entirety herein. The controller 22 of the UAV 12 and/or the remote station 14 may utilize information from the TCAS to change flight paths to avoid collisions with other aircraft. - In one embodiment, the
avionics system 30 may comprise a Terrain Awareness and Warning System (TAWS) utilizing one ormore sensor 140 i . . . 140 i+n, such as the one or more collision detection sensor. The TAWS may signal thecontroller 22 and/or theremote station 14 when thesensor 140 detects terrain or structures within a predetermined distance of theUAV 12, when theUAV 12 goes outside predetermined flight parameters, and/or when theUAV 12 leaves a predetermined flight path or flight area. - In one embodiment, the
navigation system 32 may be located within theUAV 12. Additionally, or alternately, part or all of thenavigation system 32 may be located in theremote station 14. Thenavigation system 32 may plan and/or deploy the flight path of theUAV 12, may determine/receive location coordinates, may determine/receive way points, may determine/receive real world position information, may generate and transmit signals to appropriate components to control the flight of theUAV 12, and so on. - The
avionics system 30 and/or thenavigation system 32 may monitor the lateral location (latitude and longitude) of the UAV 12 (for example, using a GPS receiver), and/or monitor the altitude of theUAV 12 using the signals from thesensors 140 i . . . 140 i+n, and/or may receive information from theremote station 14. - In one embodiment, the
controller 22 utilizes information from theavionics system 30 in conjunction with thenavigation system 32 to fly theUAV 12 from one location to another. For example, thecontroller 22 may utilize the information to control the control surfaces 60 of theairframe 20 of the UAV 12 (for example, elevators, ailerons, rudders, flaps, and/or slats). - The
avionics system 30 and/or thenavigation system 32 may include a memory (not shown) on which location of controlled airspace is stored, or may communicate with an external device, such as an air traffic control station (not shown) or theremote station 14 to receive transmitted data indicating the location of controlled airspace. Theavionics system 30 and/or thenavigation system 32 may provide signals to theESC 34 and/or thecontroller 22 to be used to control the speed of rotation of therotor 132 or the output of themotor 118 or engine. - The
avionics system 30 and/or thenavigation system 32 may estimate the current velocity, orientation and/or position of theUAV 12 based on data obtained from thesensors 140, such as visual sensors (e.g., cameras), IMU, GPS receiver and/or other sensors, perform path planning, provide data to the controller 22 (and/or control signals to the actuators 38) to implement navigational control, and the like. - In one embodiment, the
UAV 12 may comprise one or moresecondary sensors 140 a i . . . 140 a i+n (FIG. 17 ) andsecondary controller 22 a (FIG. 5 ) that may be implemented to detect a fault in theprimary sensors 140 and/orcontroller 22 and/or to replace functions of failedsensors 140 i . . . 140 i+n or a failedcontroller 22. Thesecondary sensors 140 a i . . . 140 a i+n may include redundant sensors comprising one or more accelerometer, gyro, magnetometer, GPS, etc. In one embodiment, thesecondary sensors 140 a i . . . 140 a i+n may comprise redundant attitude sensors and control systems. - The
secondary sensors 140 a i . . . 140 a i+n and controller 22 a may be electrically isolated from the primary controller 22 and/or sensors 140 i . . . 140 i+n, via opto-isolators and/or magnetic relays, so that a catastrophic failure of the primary controller 22 and/or sensors 140 i . . . 140 i+n does not cascade to the secondary sensors 140 a i . . . 140 a i+n and controller 22 a. If the secondary sensors 140 a i . . . 140 a i+n and controller 22 a detect a failure in the primary sensors 140 i . . . 140 i+n or controller 22, the secondary controller 22 a may shut off a relay that connects the primary sensors 140 i . . . 140 i+n and controller 22 to the power system 26, such as the battery 122. When a fault is detected, a protocol in the controller 22 may decide if it is appropriate for the UAV 12 to attempt to land or shut down immediately. - In one embodiment, the
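The fault-isolation behavior described above can be sketched as follows. This is a hypothetical illustration; the class, the altitude threshold, and the land-versus-shutdown rule are assumptions, not part of this disclosure.

```python
class FailoverMonitor:
    """Sketch of the secondary controller's watchdog: on a detected
    primary fault, open the relay powering the primary stack and pick a
    response per a simple protocol."""

    def __init__(self) -> None:
        self.primary_relay_closed = True  # primary sensors/controller powered

    def on_fault(self, altitude_m: float) -> str:
        # Isolate the failed primary sensors/controller from the battery.
        self.primary_relay_closed = False
        # Protocol: attempt a controlled landing when high enough for the
        # redundant stack to fly the vehicle; otherwise shut down at once.
        return "land" if altitude_m > 2.0 else "shutdown"
```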
sensors 140 i . . . 140 i+n of the UAV 12 comprise one or more geo-localization sensor. Non-exclusive examples of a geo-localization sensor include a Global Positioning System (GPS), a Global Navigation Satellite System, a hyperbolic radio navigation system (e.g., LORAN), a motion capture system (such as manufactured by Vicon), a detector of lines and/or optical points of reference on the ground/structures, and an altimeter. - A form of localization, for example, utilizing the geo-localization sensor, may be used to keep the
UAV 12 within a specific operation area (referred to as an “operation box” 211) for the current inspection task. The coordinates of the boundary of the allowed operation area (which may be referred to as a “box” or “fence”) may be stored on theUAS 10. - The
box 211 may be predetermined. For example, the box 211 may be determined using parcel boundaries, building outlines, cadastre data, and/or other sources of data (e.g., in any appropriate coordinates, such as latitude and longitude) and altitude. The box 211 may be determined on-site by the operator 16 prior to takeoff of the UAV 12. On-site establishment of the box 211 (i.e., “boxing” or “fencing”) may be done using a handheld device (for example, a smartphone or tablet having GPS capabilities) to obtain the box corner coordinates. For example, the operator 16 may walk to the corners of the box 211 and record/set a point for each corner of the box 211. In one example, the operator 16 may place the points of the box 211 on a map displayed on the handheld device. In one embodiment, the operator 16 may choose or define a radius from a point as the box 211 or a boundary around a point as the box 211. In one embodiment, the operator 16 may define attributes of one or more point (for example, the location or title of the point, such as “southwest corner”). In one embodiment, the operator 16 may define outlines of structures and/or trees within the box 211. These vertices and/or boundaries may define the outside boundary of a given property, the location of tall obstructions such as trees, and/or the outline of a structure that is to be captured. - The box coordinates and/or outlines of structures or obstructions may then be relayed from the operator 16 (and/or the handheld device) to the
UAS 10 and/or the UAV 12 (for example, through Wi-Fi/Bluetooth, or manual download). In one embodiment, the handheld device may be theremote station 14. - The
box 211 may be defined by geographical coordinates and/or altitude values that define a geometric shape (e.g., polygon, circle, square, etc.) on and/or above the earth. In one embodiment, the box 211 may have a maximum altitude or z value. The box 211 may be a 3D polygon having a height. The box 211 may be a 2D geometric shape on the ground that extends upwards either to a maximum z height or up to a maximum altitude. The maximum z height or maximum altitude may be based at least in part on government regulations. - The
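One way to sketch the containment test for such a box (a 2D boundary polygon extruded up to a maximum altitude) is a standard ray-casting point-in-polygon check. The code below is an illustrative assumption, not the disclosed implementation; it treats coordinates as planar, which is adequate only for small areas.

```python
def point_in_polygon(lat: float, lon: float, polygon) -> bool:
    """Ray-casting test: True if (lat, lon) lies inside the closed polygon
    given as a list of (lat, lon) vertices."""
    inside = False
    n = len(polygon)
    for i in range(n):
        lat1, lon1 = polygon[i]
        lat2, lon2 = polygon[(i + 1) % n]
        if (lon1 > lon) != (lon2 > lon):  # edge crosses the test longitude
            crossing_lat = lat1 + (lon - lon1) * (lat2 - lat1) / (lon2 - lon1)
            if lat < crossing_lat:
                inside = not inside
    return inside

def inside_operation_box(lat: float, lon: float, altitude_m: float,
                         polygon, max_altitude_m: float) -> bool:
    """A position is allowed when it is over the boxed ground area and at
    or below the box's altitude ceiling."""
    return altitude_m <= max_altitude_m and point_in_polygon(lat, lon, polygon)
```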
controller 22 may provide instructions to theUAV 12 such that theUAV 12 stays inside thebox 211 and does not fly over adjacent or other properties. Thecontroller 22 and/or theremote station 14 may take appropriate action if theremote station 14 and/or thecontroller 22 detects that theUAV 12 is leaving theoperation box 211. For example, navigation coordinates may be provided to direct theUAV 12 away from leaving theoperation box 211, and/or theUAV 12 may be directed to land or to fly to theremote station 14. In one embodiment, the data about thebox 211 and/or structures/vegetation in thebox 211 are integrated into a flight plan for theUAV 12. - The data may also be used by the
controller 22 to help ensure theUAV 12 doesn't collide with structures, trees, etc. The data may also be used by thecontroller 22 to maintain a specific distance from a structure being captured with images/video, so that the images and video will be a consistent sample distance (millimeters per pixel for example). For example, each pixel of theimage 44 taken by thecamera 42 of theUAV 12 may represent 1 mm on a structure in the image. - UAV Speed Reduction Device 150:
- In one embodiment, as illustrated in
FIG. 18, the UAV 12 may comprise a speed reduction device 150. Non-exclusive examples of speed reduction devices 150 include air brakes 152 and parachutes 154. The speed reduction device 150 may be deployed automatically or manually when the controller 22 and/or the remote station 14 or operator 16 detects a malfunction in the UAV 12 and/or that the UAV 12 is out of control. Typically, the speed reduction device 150 creates drag to limit the airspeed of the UAV 12 and thereby reduce its kinetic energy. In some windy conditions, however, the speed reduction device 150 may increase the velocity of the UAV 12 by acting as a sail. - The
air brakes 152 may be part of theairframe 20 in the form of hard or pliable panels/sails. In one embodiment, theair brakes 152 may be in the form of gas inflated bladders (not shown). - In propeller driven
UAVs 12, therotor 132 may be utilized as thespeed reduction device 150. - UAS Remote Station 14:
- As illustrated in
FIG. 19 , in one embodiment, theremote station 14 may comprise components that interface with the unmannedaerial vehicle 12 and/or theremote operator 16 and/or that process data to/from theUAV 12. Theremote station 14 may comprise a human-machine interface module 160, one or more processor(s) 162 (hereinafter “the processor”), one or more drive(s) 164 (hereinafter “the drive”), and a remotestation communications system 166. In one embodiment, theremote station 14 may comprise one or more antenna(s) 168 (hereinafter “the antenna”). Theantenna 168 may transmit/receive one or more signal to/from the remotestation communications system 166 to communicate with one ormore UAV 12, aircraft, and/or vehicles. - In one embodiment, the
processor 162 may comprise one or more of a smartphone, a tablet personal computer, a personal computer processor, and/or other personal computing device. - The
remote station 14 may receive/download onboard data from theUAV 12, for example, through the remotestation communications system 166. In one embodiment, the onboard data may includeimages 44 and/or metadata, such as metadata about theimages 44, about/from theimage capture device 42, and/or about/from thesensors 140. - The
remote station 14 may upload commands from theremote station 14 to theUAV 12, for example, through the remotestation communications system 166, in order to control functions of theUAV 12 and/or thepayload 40 of theUAV 12. Theremote station 14 may transmit commands and/or data to theUAV 12. In some embodiments, theremote station 14 may control theUAV 12 in real time in all three physical dimensions. However, in some embodiments theUAV 12 may operate autonomously or with varying degrees of guidance from theremote station 14 and/or theremote operator 16. - In one embodiment, the
remote station 14 may provide theremote operator 16 with real time data concerning theUAV 12 and/or data transmitted from theUAV 12 through the human-machine interface module 160. For example, theremote station 14 may provide theoperator 16 with flight information necessary to control the flight of theUAV 12. For example, flight information may include cockpit-type control data such as data from thesensors 140 and/or indications of roll, pitch, and yaw angle, navigational view of attitude data, current position of theUAV 12 with coordinates and/or visually, failure of components/systems within theUAV 12, and so on. - The human-
machine interface module 160 may be configured for theoperator 16 to receive data and to input data and/or commands. In one embodiment, the human-machine interface module 160 may comprise a display displaying a view transmitted from theUAV 12 similar to a view that an onboard pilot would have. The human-machine interface module 160 may include a control panel for remotely piloting theUAV 12. The human-machine interface module 160 may comprise a graphical user interface. The human-machine interface module 160 may comprise user input devices through which theoperator 16 may enter commands and data. Non-exclusive examples of input devices may include an electronic digitizer, a microphone, a keyboard, and a pointing device such as a mouse device, trackball device or touch pad device. Other input devices may include a joystick device, game pad device, satellite dish, scanner device, heads-up device, a vision system, a data bus interface, and so on. - The
remote station 14 may translate commands from theoperator 16 to theUAV 12 to control theflight control surfaces 60 and speed of theUAV 12. In one embodiment, theremote station 14 may translate simplistic inputs from theoperator 16 into specific, detailed, precision-controlled flight control of theUAV 12. For example, the operator's 16 movement of a joystick may be translated by theprocessor 162 into commands and transmitted via the remotestation communications system 166 and thecommunications system 24 of theUAV 12 to thecontroller 22 of theUAV 12 to adjust theflight control surfaces 60 of theUAV 12 to affect roll, pitch, and yaw. - In one embodiment, the
remote station 14 may comprise one or more attenuator 170 on the antenna 168 for range testing. An interlock (not shown) may be used to prevent the UAV 12 from taking off with the attenuator 170 in place on the antenna 168 of the remote station 14. The attenuator 170 may be used for range testing between the UAV 12 and the remote station 14. An interlock is a device that makes the state of two mechanisms mutually dependent. In one example of an interlock, a sensor is configured to detect that the attenuator 116 is in place on the UAV 12. If the attenuator 116 is in place, the UAV 12 is prevented from flying (or from flying beyond a predetermined distance) so that the UAV 12 does not fly beyond the range of the controller 22 with the attenuator 116 attached. The attenuator 116 may also be affixed to the case 17 such that when the UAV 12 is removed from the case 17 the attenuator 116 is effectively removed. - In one embodiment, the
drive 164 and associated computer storage media such as removable storage media (e.g., CD-ROM, DVD-ROM) and non-removable storage media (e.g., a hard drive disk), may provide storage of computer readable instructions, data structures, program modules and other data. Thedrive 164 may include an operating system, application programs, program modules, and one or more database. - In one embodiment, the
remote station 14 may be implemented as a portion of a small-form factor portable (or mobile) electronic device such as a portable computing device, a mobile computing device, an application specific device, or a hybrid device that include any of the above functions. Theremote station 14 may be implemented as a personal computer including both laptop computer and non-laptop computer configurations. Moreover, theremote station 14 may be implemented as a networked system or as part of a specialized server. - In one embodiment, the
remote station 14 may comprise an automatic dependence surveillance-broadcast (ADS-B) device (not shown) such that when a conventional aircraft or anotherUAV 12 enters the area theoperator 16 may be notified and land theUAV 12. Optionally, thecontroller 22 of theUAV 12 may automatically land theUAV 12 when notified. The ADS-B device may be configured with ADS-B “Out” and/or ADS-B “In”. ADS-B “Out” periodically broadcasts information about theUAV 12, such as identification, current position, altitude, and velocity, through a transmitter, such as thecommunications system 166 of the remote station 14 (and/or thecommunications system 24 of the UAV 12). ADS-B “Out” may provide air traffic controllers and other aircraft with real-time position information. ADS-B “In” allows the reception by theUAS 10 of ADS-B data, such as direct communication from nearby aircraft of their identification, current position, altitude, and/or velocity. In one embodiment, the ADS-B device is located in either or both theremote station 14 and/or theUAV 12. - In one embodiment, the remote
station communications system 166 may comprise a transmitter and a receiver. - In one embodiment, the
remote station 14 may comprise a wireless datalink subsystem. The wireless datalink subsystem may be configured for remote communication with theUAV 12. - In one embodiment, the
remote station 14 may further comprise a mobile power system, such as one or more battery (not shown). -
UAV 12 andRemote Station 14 Communication - In one embodiment, the
communications system 24 of theUAV 12 and thecommunications system 166 of theremote station 14 are configured to form a connection between theUAV 12 and theremote station 14 using radio frequency protocols that may or may not meet the requirements of a Wi-Fi network. - In one embodiment, the
communications system 24 of theUAV 12 and thecommunications system 166 of theremote station 14 may utilize a cellular network for communication between theUAV 12 and theremote station 14 and/or communication between theUAS 10 and other vehicles and/or systems. In one non-exclusive example, theUAV 12 and/orremote station 14 may have cellular radios via which data may be communicated. A Verizon MiFi 4G LTE Global USB Modem is an example of such a device. TheUAV 12 may connect to the cellular network using the modem and send telemetry, images, photos, etc. TheUAV 12 may also receive commands/instructions on where to go next, flight plans, and/or what to photograph/video. - In one embodiment, the
controller 22 in conjunction with the communications system 24 of the UAV 12 and/or the communications system 166 of the remote station 14 may operate in a networked environment using logical connections to one or more processors, such as a remote processor connected to a network interface. The remote processor may be the processor 162 of the remote station 14, or may be located all or in part separately from the remote station 14. The remote processor may be a personal computer, a server, a router, a network PC, a peer device, or other common network node, and can include any or all of the elements described above relative to the controller 22. Networking environments are commonplace in offices, enterprise-wide area networks (WAN), local area networks (LAN), intranets, and world-wide networks such as the Internet. It should be noted, however, that source and destination machines need not be coupled together by a network(s) or any other means; instead, data may be migrated via any media capable of being written by the source platform and read by the destination platform or platforms. When used in a LAN or WLAN networking environment, the controller 22 may be coupled to the LAN through the network interface or an adapter. - The network(s) may comprise any topology employing servers, clients, switches, routers, modems, Internet service providers (ISPs), and any appropriate communication media (e.g., wired or wireless communications). A system according to some embodiments may have a static or dynamic network topology. The network(s) may include a secure network such as an enterprise network (e.g., a LAN, WAN, or WLAN), an unsecure network such as a wireless open network (e.g., IEEE 802.11 wireless networks), or a world-wide network (e.g., the Internet). The network(s) may also comprise a plurality of distinct networks that are adapted to operate together. The network(s) are adapted to provide communication between nodes.
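As a minimal sketch of telemetry exchange over such a connection, the example below frames each sample as a newline-delimited JSON record. The record format is hypothetical and not part of this disclosure; addressing, retries, and security are omitted.

```python
import json
import socket

def send_telemetry(sock: socket.socket, telemetry: dict) -> None:
    """Serialize one telemetry sample as a newline-delimited JSON record
    and write it over an established connection."""
    sock.sendall((json.dumps(telemetry) + "\n").encode("utf-8"))

def read_telemetry(record: bytes) -> dict:
    """Decode one received newline-delimited record back into a dict."""
    return json.loads(record.decode("utf-8"))
```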
By way of example, and not limitation, the network(s) may include wireless media such as acoustic, RF, infrared and other wireless media.
- A network communication link may be one nonexclusive example of a communication media. Communication media may typically be embodied by computer readable instructions, data structures, program modules, or other data in a modulated data signal, such as a carrier wave or other transport mechanism, and may include any information delivery media. A “modulated data signal” may be a signal that has one or more of its characteristics set or changed in such a manner as to encode information in the signal. By way of example, and not limitation, communication media may include wired media such as a wired network or direct-wired connection, and wireless media such as acoustic, radio frequency (RF), microwave, infrared (IR) and other wireless media. The term computer readable media as used herein may include both storage media and communication media.
- Diagnostic Testing
- In some embodiments, pre-flight diagnostic testing is employed. In one embodiment, the
UAS 10 is programmed such that, when the UAS 10 is initially powered on, the UAV 12 will not operate without first performing a series of diagnostics to verify that all systems are properly calibrated and operating correctly. In embodiments having redundant sensors 140 a (accelerometer, gyro, magnetometer, GPS, etc.), the remote station 14 may prompt the operator 16 (via the human-machine interface module 160, for example, with a text display or an audible voice) to rotate the UAV 12 on different axes. The values from each sensor pair (that is, the primary sensor 140 and the secondary sensor 140 a) may be compared to verify that they match within a predetermined margin of error. - In one embodiment, as part of a pre-flight diagnostic routine the
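The pairwise comparison can be sketched as follows. This is illustrative only; the sensor names, the assumption of time-aligned samples, and the margin of error are hypothetical.

```python
def sensors_agree(primary_samples, secondary_samples, max_error: float) -> bool:
    """Compare time-aligned samples from a primary sensor and its redundant
    counterpart; the pair passes if every sample differs by no more than
    the allowed margin of error."""
    return all(abs(p - s) <= max_error
               for p, s in zip(primary_samples, secondary_samples))

def preflight_check(sensor_pairs: dict, max_error: float = 0.05) -> list:
    """Run the comparison over every redundant pair and return the names
    of the pairs that failed (an empty list means the check passed)."""
    return [name for name, (primary, secondary) in sensor_pairs.items()
            if not sensors_agree(primary, secondary, max_error)]
```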
operator 16 may be prompted by thecontroller 22 and/or theprocessor 162 to anchor theUAV 12 to a weight or test rig that is heavy or anchored well enough so theUAV 12 may power up thepropulsion system 28 to full power and remain secure on the ground. The preflight test may verify one or more of the following parameters: integrity of thepropulsion system 28 at full power (for example, integrity of the rotor 132), RPM output, power consumption (for example, by each motor 118), performance of the electronic speed controls (ESCs) 34, yaw torque, health/output power of thebattery 122 under load, thrust output from the propulsion system 28 (for example, each motor/propeller), integrity of theairframe 20, etc. - In one embodiment, collision detection diagnostics are employed. As part of a collision detection test, the
operator 16 may be prompted to place an object in front of the collision detection sensor(s) to verify that all collision detection systems are working correctly. In one embodiment, the collision detection diagnostics utilize a pre-flight test rig/jig (not shown) so the predetermined distance for collision detection may be checked with precision. - In one embodiment, the diagnostic tests may also involve placing one or more attenuator 116 and/or
attenuator 170 on the antenna(s) 114 of theUAV 12 or the antenna(s) 168 of theremote station 14 for range testing. An interlock may be used to prevent theUAV 12 from taking off when the attenuator(s) 116, 170 are in place. - Diagnostic tests also may be used to check environmental conditions and disallow use of the
UAV 12 when it is too windy or the temperature is too hot or too cold. This is particularly important for the battery 122, which may have significantly lower power output in cold temperatures. - In one embodiment, in-flight diagnostic testing is employed. During flight operations, a series of algorithms may be used to detect faults and suspend flight operations if required. For example, if the
controller 22 adjusts power output to a particular motor 118 and does not “see” the intended change in attitude as a result, the controller 22 may assume there is a malfunction or that the vehicle is “stuck” and power down all motors 118. - Referring now to
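A minimal sketch of that in-flight check follows; the thresholds and signal names are hypothetical, not part of this disclosure.

```python
def detect_stuck(command_delta: float, attitude_rate: float,
                 min_response: float = 0.1) -> bool:
    """Return True (meaning: power down all motors) when a non-zero change
    in commanded motor power produced no measurable attitude response."""
    commanded = abs(command_delta) > 0.0
    responded = abs(attitude_rate) >= min_response
    return commanded and not responded
```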
FIGS. 20-24 , an example of one embodiment of theUAS 10 in use in accordance with the present disclosure will be described. As illustrated inFIG. 20 , aproperty 200 of interest is identified, such as withlocation information 202. Thelocation information 202 may be in the form of any coordinate system or location information system, including, but not limited to, a street address, a plat location, and/or latitude and longitude coordinates. - In one embodiment, a general location of the property may be provided, and the
UAS 10 may then be provided with, and/or determine,specific boundaries 210 of an inspection site of theproperty 200—that is, of theoperation box 211. In one embodiment, theoperator 16 may identify theouter boundaries 210 of the inspection site. In one embodiment, theoperator 16 may identify two ormore points 212 on theouter boundaries 210 of theoperation box 211, as illustrated inFIG. 21 . TheUAS 10 may then determine theouter boundaries 210 of theoperation box 211 based on the identifiedpoints 212, as shown inFIG. 22 . ThoughFIG. 21 illustrates theouter boundaries 210 of theoperation box 211 as square shaped, theouter boundaries 210 may have the shape of any polygon or polygons. Theouter boundaries 210 may have a three dimensional shape including, for example, a polygon having a height, or other structure. - As illustrated in
FIGS. 23 and 24 , thenavigation system 32 of theUAS 10 may utilize theouter boundaries 210 to guide theUAV 12 to remain within theouter boundaries 210 of theoperation box 211. Further, theUAV 12 may be directed to only captureimages 44 of objects/locations within theboundaries 210, thus protecting privacy interests of surrounding properties. - In one embodiment, the
navigation system 32 may utilize the coordinates of theboundaries 210 of theoperation box 211 to determine a flight plan for theUAV 12 that remains within theoperation box 211. Thenavigation system 32 may provide theUAV 12 geographical coordinates and/or altitude values that define a geometric shape (e.g. a polygon, a circle, a square, etc.) on and/or above the earth for theoperation box 211. Thenavigation system 32 may provide theUAV 12 a maximum altitude or z value. The geometric shape may be a 3D polygon, having a 2D geometric shape on the ground that extends upwards, either to a maximum z height or up to a maximum altitude (such as a maximum altitude allowed by government regulations). In one embodiment, thecontroller 22 maintains theUAV 12 inside the 3D polygon such that theUAV 12 does not fly over adjacent and/or other properties. In one embodiment, parcel data, building outlines, and other sources of data may be used to define the geometric shape. - In one embodiment, in addition to, or alternatively to, ensuring that the
UAV 12 does not leave theoperation box 211, thenavigation system 32 may ensure that thecamera 42 carried by theUAV 12 does not capture data and/orimages 44 on any neighboring structures, cars, and/or individuals, etc. In one embodiment, the 3D polygon information and data from attitude sensors on thecamera 42 and/or theUAV 12 carrying thecamera 42, can be used to ensure thatcamera 42 does not capture data and/orimages 44 on any neighboring structures, cars, and/or individuals, etc. In one embodiment, 3D data about a structure may be used to ensure thecamera 42 is oriented in such a way so that only the structure is in the frame of theimage 44 when taking theimage 44 or video and that neighboring structures, individuals, and/or vehicles are not in the background of theimage 44. In one embodiment, 3D data about a neighboring structure may be used to ensure thecamera 42 is oriented in such a way so that the neighboring structure is not captured in theimage 44. - Further, if the
UAV 12 is utilized to capture images 44 with the image capture device 42, the navigation system 32 may also determine the flight plan that keeps the image capture device 42 of the UAV 12 oriented such that the field of view (designated with arrows from the UAV 12 in different positions within the operation box in FIGS. 23 and 24) of the image capture device 42 is solely within the boundaries 210 of the operation box 211, while capturing desired images 44 of the property 200 of interest. - In one embodiment, at one or more instants in time, the
controller 22 of the UAV 12 and/or the remote station 14 may compare the position of the UAV 12, based on data from sensors 140 such as the GPS and/or the altimeter, with the coordinates of the boundaries 210 of the operation box 211. If the distance between the position of the UAV 12 and the boundaries 210 is less than a predetermined amount, the UAV 12 may be directed to adjust its position and/or orientation to maintain the position of the UAV 12 within the boundaries 210. If the UAV 12 is utilized to capture images 44 with the image capture device 42, the orientation and position of the UAV 12, and thus the image capture device 42, may be adjusted such that the field of view of the image capture device 42 is solely within the boundaries 210 of the operation box 211 to respect the privacy of neighbors adjacent to the boundaries 210. - In one embodiment, the
UAV 12 may be oriented and positioned such that the image capture device 42 has a field of view that encompasses an object or structure within the boundaries 210. - In one embodiment, the
UAS 10 and the image capture device 42 may be utilized in a method to capture aerial images 44 of a structure while avoiding capturing images of neighboring properties. - In one embodiment, the
UAS 10 may be utilized to determine one or more ground locations and/or one or more surface locations. In one embodiment, the UAS 10 may be positioned on the ground/surface location. A location reading from a GPS onboard the UAS 10 may be taken with the UAS 10 on the ground/surface location. The location reading may include the latitude, the longitude, and the altitude above sea level of the UAS 10. The altitude above sea level from the GPS may be designated as a ground/surface elevation point for the latitude/longitude location of the UAS 10. Once the UAS 10 is launched into the air, another GPS reading for the UAS 10 may be taken, including the latitude, the longitude, and the altitude above sea level of the UAS 10. The height of the UAS 10 above the ground/surface may be calculated by subtracting the ground/surface elevation point from the altitude above sea level of the UAS 10 in the air. - Metadata
- In one embodiment, the
controller 22 and/or the image capture device 42, the one or more sensors 140, and/or the image capture module 100 may capture metadata associated with one or more of the images 44. Nonexclusive examples of metadata include information about and/or from the UAV 12, the one or more sensors 140, and/or the image capture device 42. - Metadata about the
image capture device 42 may comprise such data as the attitude of the UAV 12, the attitude of the image capture device 42, the focal length of the image capture device 42, the sensor size of the image capture device 42, the pixel pitch of the image capture device 42, and/or the distortion parameters of the image capture device 42. - The metadata may include information from the
avionics system 30 and/or the navigation system 32, such as the orientation and/or position of the UAV 12 based on data obtained from the sensors 140, such as the visual sensors (e.g., cameras), the IMU, the GPS receiver, and/or other sensors 140. - The metadata may include data from a GPS and/or data associated with the GPS, such as GPS signal strength, the number of and information regarding available satellites, and so on. The metadata may include data from an IMU and/or data associated with the IMU, such as information about pitch, roll, yaw, acceleration vectors in the x, y, and z orientations, and acceleration vectors about an x-axis, about a y-axis, and about a z-axis.
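As an illustrative sketch only, the per-image metadata described above (GPS position, IMU attitude, and camera parameters) might be gathered into a single record at capture time. The class and field names below are hypothetical and are not taken from this disclosure:

```python
from dataclasses import dataclass

@dataclass
class ImageMetadata:
    """Hypothetical per-image metadata record; field names are illustrative only."""
    timestamp: float            # capture time, seconds since epoch
    latitude: float             # GPS position of the image capture device
    longitude: float
    altitude_m: float           # altitude above sea level from the GPS
    pitch_deg: float = 0.0      # IMU attitude at the moment of capture
    roll_deg: float = 0.0
    yaw_deg: float = 0.0
    focal_length_mm: float = 0.0
    sensor_width_mm: float = 0.0
    pixel_pitch_um: float = 0.0
    gps_signal_strength: int = 0
    satellites: int = 0         # number of available satellites

# Example record captured alongside one image:
record = ImageMetadata(
    timestamp=1483948800.0,
    latitude=36.06, longitude=-95.86, altitude_m=250.0,
    pitch_deg=-30.0, yaw_deg=180.0,
    focal_length_mm=20.0, sensor_width_mm=13.2, pixel_pitch_um=2.4,
    gps_signal_strength=45, satellites=11,
)
```

A record like this could then be serialized into an image file header or stored in a separate metadata store keyed to the image.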
- In one embodiment, the metadata may be from and/or about other sensors of the
UAV 12, non-exclusive examples of which include proximity sensors, LiDAR, methane gas sensors, carbon dioxide sensors, heat sensors, multi-spectral sensors (for example, four-band image sensors capable of detecting and/or recording red, green, blue, and near infrared), and hyper-spectral sensors (for example, image sensors capable of detecting and/or recording a larger number of spectral bands, such as 16-band or 32-band images, which may include red, green, blue, and near infrared plus additional bands). - In one embodiment, the metadata may include one or more of the following: whether the
image 44 or associated image 44 was captured from the UAV 12, the particular type of the UAV 12 (such as, but not limited to, the make, model, and/or an identification number of the UAV 12), whether the image 44 was captured from the ground, whether the image 44 was captured from a moving ground vehicle, whether the image 44 was captured from a manned aircraft, whether the image 44 was captured from some other source, and what type of image capture device 42 was used to capture the image 44. - In one embodiment, the metadata may be embedded in the
image 44. In one embodiment, the metadata and the image 44 may be stored together in a single image file. In one embodiment, the image 44 may be part of an image file having an image header. The metadata may be embedded in the image header, such as in the header of a jpeg-formatted file. In one embodiment, the jpeg header may be organized in a predetermined format such that the metadata is stored in a consistent manner in the jpeg header. For example, the position of the metadata in the header and/or the format of the title of the metadata in the header may be predetermined for consistency. - In one embodiment, the
remote station 14 transforms the image file into a standard format for processing. - In one embodiment, the metadata and the
image 44 may be stored in a removable non-transitory memory storage device, such as a memory card. The memory card may be removed from the UAS 10 to download the images 44 and the metadata. - In one embodiment, the
images 44 and/or the metadata may be transmitted from the UAS 10 to the remote station 14. The images 44 and/or the metadata may be transmitted wirelessly and/or through a physical connection, such as wires. In one embodiment, the images 44 and/or the metadata may be processed by the processor 162 of the remote station 14. - In one embodiment, the
images 44 and/or the metadata may first be downloaded wirelessly from the UAS 10 to the remote station 14. Then the images 44 and/or the metadata may be transmitted through a physical connection to a computer processor device where the images 44 and/or the metadata may be extracted and/or processed. For example, the images 44 and/or the metadata may be transmitted to a smartphone, a tablet personal computer, a personal computer processor, and/or other personal computing device. - In one embodiment, the
UAS 10 may have an application program interface (API). - In one embodiment, the metadata is captured by the
image capture device 42 at the time the image 44 is captured. - In one embodiment, the
image capture device 42 captures none of, or less than all of, the metadata. In such a case, some or all of the metadata may be captured by the controller 22, the avionics system 30, the navigation system 32, and/or the sensors 140 of the UAS 10. The metadata from the time an individual image 44 is taken is then matched with that individual image 44. - In one embodiment, the
controller 22 transmits one or more signals to the image capture device 42 instructing the image capture device 42 to capture an image 44. At the same time the image 44 is captured, the controller 22 may record the metadata. The metadata may be combined with the image 44 by the controller 22, or may be combined with the image 44 after the image 44 and the metadata are transmitted from the UAV 12 to the remote station 14. - In one embodiment, the metadata contains time data and the
images 44 contain time data, and the metadata may be matched to the images 44 by matching the metadata time data to the image time data. - The metadata may be combined with the
images 44 in the header of the image file, such as a jpeg header for a jpeg image file. - Metadata may not be necessary in all analysis scenarios, for example, when visual data from an image is sufficient. However, other creation and/or analysis tasks may benefit from and/or require metadata, for example, the creation of a three-dimensional model.
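The timestamp-based matching described above can be sketched as follows. This is a minimal illustration under the assumption that each image and each metadata record carries a numeric capture time; the function and field names are illustrative, not the disclosed implementation:

```python
def match_metadata_to_images(images, metadata_records):
    """Pair each image with the metadata record whose timestamp is closest.

    `images` and `metadata_records` are lists of dicts that each carry a
    numeric 'time' key (hypothetical field name). Returns (image, metadata)
    pairs, one per image.
    """
    pairs = []
    for image in images:
        # Nearest-in-time metadata record is assumed to belong to this image.
        closest = min(metadata_records, key=lambda m: abs(m["time"] - image["time"]))
        pairs.append((image, closest))
    return pairs

# Toy example: two images and three telemetry records.
images = [{"name": "img_001.jpg", "time": 10.02},
          {"name": "img_002.jpg", "time": 12.51}]
meta = [{"time": 10.0, "lat": 36.06},
        {"time": 12.5, "lat": 36.07},
        {"time": 15.0, "lat": 36.08}]
matched = match_metadata_to_images(images, meta)
# matched[0] pairs img_001.jpg with the record captured at t = 10.0
```

In practice a tolerance could be added so that an image with no sufficiently close telemetry record is flagged rather than silently paired.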
- In one embodiment, one or more of the
images 44 may be geolocated and/or georeferenced. - Geolocating the
image 44 comprises associating the image 44 with a location or structure in a location. One example of a use for geolocation of the image 44 is for images 44 depicting objects above the ground without depicting the ground, without ground location information, or without access to surface location information for the objects depicted. For example, an image may depict a chimney on a roof without depicting the ground location. Metadata can be used to associate the image 44 with a particular location or structure. For example, metadata can be used that is associated with the one or more image capture device 42 at the time the aerial images 44 were captured, such as the latitude and longitude of the one or more image capture device 42 and/or one or more of the altitude, orientation, attitude, and bearing of the one or more image capture device 42. The metadata can be correlated to the location or structure of interest, thereby associating the image 44 with the location or structure of interest. - Georeferencing the
images 44 may comprise processing the images 44 to determine and assign geographic location information to the pixels of the images 44. For example, the images 44 may be processed as described in U.S. Pat. No. 7,424,133, issued Sep. 9, 2008, titled "Method and Apparatus for Capturing, Geolocating and Measuring Oblique Images," and/or U.S. Patent Publication US20150221079A1, published Aug. 6, 2015, titled "Augmented Three Dimensional Point Collection of Vertical Structures," both of which are hereby incorporated by reference in their entirety herein. - The geographic location information may include geographic coordinates for the ground as well as structures and objects located above the ground in the
image 44. The geographic location information for the pixels of the image 44 may be a part of the metadata associated with the image 44. - Georeferencing the
images 44 may be based at least in part on one or more known ground points and/or surface points. Nonexclusive examples of known ground points and/or surface points include digital elevation models (DEMs), point clouds, three-dimensional models, individually plotted/mapped points, and tessellated ground planes. - In one embodiment, the
images 44 may be georeferenced based at least in part on searching for and locating one or more surface model or point cloud having locations within a predetermined proximity of the location of the UAV 12 and/or in the direction of orientation of the UAV 12. In one embodiment, the images 44 may be georeferenced based at least in part on searching for and locating one or more ground point or ground plane having ground locations within a predetermined proximity of the UAV 12. - An
image location system 250 constructed in accordance with the current disclosure is illustrated in FIG. 25. The image location system 250 may comprise a metadata database 252 and an image warehouse database 254 stored in one or more non-transitory computer memory 256. The image location system 250 may further comprise one or more processor 258 and one or more user interface 260. - In one embodiment, the metadata is stored in the
metadata database 252 and the images 44 are stored in the image warehouse database 254. Metadata and images 44 may be received from multiple UASs 10 and/or multiple UAVs 12 of various types. The metadata may initially be received in varying formats depending on the type of UAV 12 transmitting the metadata. The metadata may be transformed into a standardized format and stored in that standardized format. - The metadata may be stored in the
metadata database 252 and associated with the image 44 and/or the image file. The metadata may include a file path of the associated image 44 and/or the image file. - In one embodiment, one or more of the metadata, the
metadata database 252, the images 44, and the image warehouse database 254 may be stored in one or more remote locations, such as cloud storage. - In one embodiment, the
metadata database 252 and/or the image warehouse database 254 may be spatial databases. That is, the metadata database 252 and/or the image warehouse database 254 may be structured with spatial (locational) connections such that spatial conclusions and results can be reached. The metadata database 252 and/or the image warehouse database 254 may be able to search for, find, and return data to a user based on an input location. For example, a user may request images 44 within one mile of a location, and the metadata database 252 and/or the image warehouse database 254 may return such information. In another example, the user may request images 44 within a polygon drawn on an overview image. In another example, the user may request images 44 based on other location information. - A user may utilize the
image location system 250 to locate image(s) 44 and/or metadata for a particular geographic area or structure. In one embodiment, the user may search for images 44 of a structure and/or geographic location by inputting geographic coordinates through the user interface 260. In one embodiment, the user may search for images 44 by choosing one or more points, facets, components, or areas of a structure in an image, floorplan, 2D model, or 3D model, as shown in FIGS. 30-38. In one embodiment, the user may search for images 44 of the structure and/or geographic location by inputting a polygon 268 (such as a 2D or 3D polygon) of geographic coordinates through the user interface 260. In one embodiment, the user may input geographic points, and the processor 258 of the image location system 250 may form the polygon 268 of geographic coordinates from the inputted points. - In one embodiment, as shown in
FIGS. 30-34, the polygon 268 may be located by the user on a structure in an image 44. In one embodiment, the polygon 268 may be an area or facet of a structure in an image 44. In one embodiment, the image location system 250 may utilize the polygon 268 in conjunction with the metadata associated with the image 44 and/or a two-dimensional outline and/or a three-dimensional model of the structure in the image 44 to identify the portion of the structure selected by the user. In one embodiment, the image location system 250 may allow the user to further specify a particular area of the structure. - The
image location system 250 may search the metadata database 252 for geographic information in the metadata matching, or approximate to, the geographic coordinates entered by the user. The image location system 250 may then display images 44 associated with the metadata matching the geographic coordinates. The displayed images 44 contain pixels having matching geographic coordinates. - In one embodiment, the
image location system 250 may search the metadata database 252 for points on the ground that match, or are approximate to, the geographic coordinates entered by the user. In one embodiment, the image location system 250 may search the metadata database 252 for points on the ground that are intersected by or enclosed within the polygon 268. - In one embodiment, the
image location system 250 may search the metadata database 252 for points above the ground that match, or are approximate to, the geographic coordinates entered by the user. In one embodiment, the image location system 250 may search the metadata database 252 for points above the ground that are intersected by or enclosed within the polygon 268. Points above the ground may be geographic location points on structures or vegetation above the ground. - In one embodiment, the
image location system 250 may return images of the structure. In one embodiment, the image location system 250 may return images that depict the particular area of the structure chosen by the user. - The
images 44 may depict structures and/or vegetation without depicting the ground. For example, images 44 taken by an image capture device 42 with a perspective pointed toward the horizon, or at an angle upwards from the horizon, may not depict the ground. In such a case, the image location system 250 may search the metadata for recorded locations of the image capture device 42 in which the image capture device 42 location matches, intersects, or is enclosed in, the inputted coordinates and/or polygon 268. - In one embodiment, the
image location system 250 may calculate and/or store data indicative of points on, in, and/or on the outline of one or more structures and/or vegetation depicted in the images 44, the attitude of the image capture device 42, and the bearing of the image capture device 42 (i.e., the direction the image capture device 42 was pointing when the image 44 was captured). The data can be stored in the metadata database 252. Utilizing the data, the image location system 250 may determine the geographic coordinates (X, Y, and Z) where the view of the image capture device 42 intersects the one or more structure and/or vegetation. The image location system 250 may utilize the intersection geographic coordinates as a geographic marker for the image 44. The image location system 250 may match the inputted geographic coordinates to the intersection geographic coordinates to locate an image 44 depicting a geographic location having geographic coordinates matching, or within a predetermined distance of, the inputted geographic coordinates and/or polygon 268. - In one embodiment, a user may search for
images 44 with the image location system 250 by inputting a geo-code. For example, the user may enter a street address and receive a property parcel's geometry, that is, a property parcel polygon of the property line of a land parcel or building. The user may use the received property parcel polygon as polygon 268 to input into the image location system 250 to request any images 44 for that polygon, that is, any images 44 that intersect the polygon 268 or that are associated with the property within the polygon. - In one embodiment, the user may search for
images 44 with the image location system 250 by selecting the polygon 268 that was formed by the operator of the UAV 12 when establishing the boundaries 210 of the operation box 211 when one or more of the images 44 were originally captured by the image capture device 42 of the UAV 12. - In one embodiment, the metadata includes a street address. The street address may be acquired by an operator of the
UAS 10. The street address may be associated with the images 44 captured by the UAS 10 while the UAS 10 is operated to capture images 44 at the street address. - In one embodiment, the
image location system 250 may process one or more of the images 44 before a user utilizes the image location system 250. In one embodiment, the image location system 250 may create one or more 3D model based on the images 44 and the metadata, calculate one or more virtual nadir camera view, and then create an ortho-mosaic based on the 3D model and virtual nadir camera views. - In one embodiment, the
image location system 250 may process one or more of the images 44 and/or the metadata and create one or more three-dimensional point clouds and/or one or more three-dimensional models based at least in part on the images 44 and/or the metadata. In one embodiment, the metadata may be used to produce more accurate results for existing or new models and/or images 44. - In one embodiment, the
image location system 250 may process one or more of the images 44 by ortho-rectifying the images and stitching the images 44 together using tie points to create an ortho-mosaic. - In one embodiment, the ortho-mosaic may be divided into tiles (for example,
tiles 256×256 pixels in size). The image location system 250 may display one or more tiles to the user, such as when the user views the ortho-mosaic in a web-based browser. The tiles may be in a standardized format for use in multiple types of web-based browsers. - Referring now to
FIGS. 26-29, in one embodiment, the image location system 250 may provide multiple images 44 from different perspectives to the user. For example, the image location system 250 may initially provide an overview image 270, such as a top-down (nadir) view, of an entire area/property. In one embodiment, the image location system 250 may display an overlay of the polygon 268 on the overview image 270. - In one embodiment, the image location system may display image tiles or "thumbnail" images 272 (that is, preview images smaller in initial size than the overview image 270) of
additional images 44 of the property from different perspectives, different distances, and/or different areas of the property for the user to choose to display. For example, the thumbnail images 272 may be displayed outside of the overview image 270, such as on one side of the overview image 270, as shown in FIG. 27. - As illustrated in
FIGS. 27-29, in one embodiment, the overview image 270 and/or other images (see FIG. 29) may have icons 274 on and/or beside the overview image 270 to show where the additional images 44 (for example, those represented by the thumbnail images 272) were taken and which direction the image capture device 42 was facing when the additional images 44 were taken. - In one embodiment, the user may select the
icon 274 and the image location system 250 may highlight the thumbnail image 272 associated with the icon 274. In one embodiment, the user may select the thumbnail image 272 and the image location system 250 may highlight the portion of the overview image 270 where the image associated with the thumbnail image 272 was captured. In one embodiment, the user may select the icon 274 and the thumbnail image 272 may be displayed. In one embodiment, the user may select the icon 274 and the additional image 44 may be displayed in full. - The
overview image 270 provides the user with an overall perspective of where additional images 44 are available. The additional images 44 may depict less than the total area of the property. For example, the image capture device 42 may capture a particular image 44 of a four-foot-by-four-foot section of a roof of a structure. If a user views this particular image 44 of the roof section, the user may have difficulty knowing the location of the roof section in relation to the entire structure and/or the property. The image location system 250 may provide links to the overall overview image 270 and/or an ortho-mosaic to help orient the user. For example, if the particular image 44 of the roof section is to be used for insurance claims, the image location system 250 may give a reference as to where the image capture device 42 was located and oriented when the particular image 44 was captured such that the location of damage to the roof may be ascertained. - In one embodiment, the
image location system 250 may display the icon 274 on the overview image 270 to indicate the location of the image capture device 42 and/or the orientation of the image capture device 42 (that is, the direction, or bearing, of the viewpoint of the image capture device) at the time the image 44 was captured. In one embodiment, the icon 274 may include a pie shape indicative of the direction the image 44 was taken (that is, which way the image capture device 42 was facing and the angle of view the image capture device 42 had when the image capture device 42 captured the image 44). - In one embodiment, the
images 44 may be labeled as to what type of image 44 and/or how the image 44 was captured. For example, the image 44 may be labeled as being captured by the UAV 12. In one embodiment, the thumbnail image 272 and/or the icon 274 may be labeled to indicate the type of image 44 and/or how the image 44 was captured. - In one embodiment, the
images 44, the icons 274, and/or the thumbnail images 272 displayed on and/or adjacent to the overview image 270 may be labeled with one or more of the following metadata: whether the image 44 or associated image 44 was captured from the UAV 12, the particular type of the UAV 12 (such as the make, model, and/or an identification number of the UAV 12), whether the image 44 was captured from the ground, whether the image 44 was captured from a moving ground vehicle, whether the image 44 was captured from a manned aircraft, whether the image 44 was captured from some other source, what type of image capture device 42 was used to capture the image 44, or the like. - In one embodiment, the user may select one or more points in the
overview image 270 and the image location system 250 may display one or more additional image 44 to show a visual depiction related to the one or more selected points. In one embodiment, the user may select the polygon 268 and the image location system 250 may display all of the additional images 44 available that are encompassed by the polygon 268. - In one embodiment in use, as illustrated in
FIGS. 30-38, a user may search for images 44 by selecting an area, a facet 276, a point 278, a component 280, and/or an intersection of a structure in a first image 44 or in a 2D model 282 or 3D model 284. For example, the user may click on an area or facet of the structure, or draw a shape, such as polygon 268, on an area of the structure. - The
image location system 250 may detect when the user selects an area or facet 276 of the structure, such as by utilizing two-dimensional outlines and/or three-dimensional models of the structures that are associated with geographic locations on the earth and metadata from the images 44. The image location system 250 may allow the user to further specify a particular area of a structure of interest after a first selection by the user. If the user draws a circle or polygon 268 (or even single-clicks to specify a point 278), the image location system 250 may further allow the user to specify a particular area, component, and/or element of that structure in which the user is interested. Non-exclusive examples of areas, components 280, and/or elements of structures that may be specified include one or more wall, roof plane, roof, floor, door, window, intersection, and cross-section, or portion or combination thereof. The image location system 250 may return images 44 to the user that are not just in geographic proximity to a structure, but that include the area of interest in three-dimensional coordinates above the ground and on the structure. - For example, as shown in
FIG. 30, the user may select an area, such as polygon 268, on a wall of interest on a structure in an image 44. Using the geo-reference information of the image 44 and/or information indicative of the structure's footprint and geographic location, the image location system 250 can determine that the user selected a section of wall on the structure in the image 44 and not just a point on the ground. The image location system 250 may search the metadata database 252 and/or the image warehouse database 254 for images 44 taken in that locality to discover images 44 that point to that region of the structure, such as the images shown in FIGS. 31 and 32. For example, those images may be images captured by the image capture device 42 of the UAV 12 that depict the user-selected location in polygon 268, such as by the image capture device 42 when the UAV 12 was in a first location 286 and/or in a second location 288, as shown in FIG. 30. - In one embodiment, the user may simply click on a side or element of the structure in a
first image 44 or in a 2D model 282 (as shown in FIG. 38) or 3D model 284 (as shown in FIG. 37) and be quickly presented with thumbnails of the images 44, or the images 44 themselves, that include that side or element of the structure. In one embodiment, search results presented to the user may include ground shots by an adjuster, street-view imagery, drone imagery, selfie-stick shots, manned aerial imagery, 3D models, etc. - While several embodiments of the inventive concepts have been described for purposes of this disclosure, it will be understood that numerous changes may be made which will readily suggest themselves to those skilled in the art and which are accomplished within the spirit of the inventive concepts disclosed and as defined in the appended claims.
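As a rough sketch of the polygon-based retrieval described above (all names and the data layout are hypothetical; the disclosure does not mandate any particular implementation), a standard ray-casting point-in-polygon test can select the images whose geographic marker falls inside a query polygon such as polygon 268:

```python
def point_in_polygon(x, y, polygon):
    """Ray-casting test: is point (x, y) inside `polygon`, a list of (x, y) vertices?"""
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        if (y1 > y) != (y2 > y):
            # x-coordinate where this edge crosses the horizontal ray through y
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside

def find_images_in_polygon(image_markers, polygon):
    """Return image IDs whose (lon, lat) geographic marker lies inside the polygon."""
    return [img_id for img_id, (lon, lat) in image_markers.items()
            if point_in_polygon(lon, lat, polygon)]

# Toy example: a rectangular query polygon and two image markers.
box = [(-95.87, 36.05), (-95.85, 36.05), (-95.85, 36.07), (-95.87, 36.07)]
markers = {"roof_section.jpg": (-95.86, 36.06), "far_away.jpg": (-95.80, 36.10)}
# find_images_in_polygon(markers, box) → ["roof_section.jpg"]
```

A production spatial database would index these markers (e.g., with an R-tree) rather than scan them linearly, but the containment test it answers is the same.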
- Additionally, it will be understood that components or systems described in certain embodiments may be used in combination with components or systems in other embodiments disclosed herein. Further, it will be understood that other components required for the
UAS 10 to be operational are well known in the art such that a person having ordinary skill in the art would readily know how to select and use those components according to the intended use of the UAS 10.
Claims (18)
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US15/401,999 US20170235018A1 (en) | 2016-01-08 | 2017-01-09 | Systems and methods for taking, processing, retrieving, and displaying images from unmanned aerial vehicles |
US17/495,988 US12079013B2 (en) | 2016-01-08 | 2021-10-07 | Systems and methods for taking, processing, retrieving, and displaying images from unmanned aerial vehicles |
Applications Claiming Priority (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US201662276539P | 2016-01-08 | 2016-01-08 | |
US201662413483P | 2016-10-27 | 2016-10-27 | |
US15/401,999 US20170235018A1 (en) | 2016-01-08 | 2017-01-09 | Systems and methods for taking, processing, retrieving, and displaying images from unmanned aerial vehicles |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US17/495,988 Division US12079013B2 (en) | 2016-01-08 | 2021-10-07 | Systems and methods for taking, processing, retrieving, and displaying images from unmanned aerial vehicles |
Publications (1)
Publication Number | Publication Date |
---|---|
US20170235018A1 true US20170235018A1 (en) | 2017-08-17 |
Family
ID=59274448
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/401,999 Abandoned US20170235018A1 (en) | 2016-01-08 | 2017-01-09 | Systems and methods for taking, processing, retrieving, and displaying images from unmanned aerial vehicles |
US17/495,988 Active US12079013B2 (en) | 2016-01-08 | 2021-10-07 | Systems and methods for taking, processing, retrieving, and displaying images from unmanned aerial vehicles |
Family Applications After (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US17/495,988 Active US12079013B2 (en) | 2016-01-08 | 2021-10-07 | Systems and methods for taking, processing, retrieving, and displaying images from unmanned aerial vehicles |
Country Status (6)
Country | Link |
---|---|
US (2) | US20170235018A1 (en) |
EP (1) | EP3391164B1 (en) |
AU (2) | AU2017206097B2 (en) |
CA (1) | CA3001023A1 (en) |
MX (1) | MX2018007935A (en) |
WO (1) | WO2017120571A1 (en) |
Cited By (55)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20180025649A1 (en) * | 2016-02-08 | 2018-01-25 | Unmanned Innovation Inc. | Unmanned aerial vehicle privacy controls |
US20180082135A1 (en) * | 2016-09-22 | 2018-03-22 | Apple Inc. | Vehicle Video System |
US20180211406A1 (en) * | 2017-01-23 | 2018-07-26 | Shanghai Hang Seng Electronic Technology Co., Ltd | Image processing method and device for unmanned aerial vehicle |
US10189580B2 (en) * | 2017-06-16 | 2019-01-29 | Aerobo | Image stabilization and pointing control mechanization for aircraft imaging systems |
US20190041219A1 (en) * | 2017-08-02 | 2019-02-07 | X Development Llc | Systems and Methods for Navigation Path Determination for Unmanned Vehicles |
US20190112049A1 (en) * | 2017-10-17 | 2019-04-18 | Top Flight Technologies, Inc. | Portable launch system |
US10267949B2 (en) * | 2014-10-17 | 2019-04-23 | Sony Corporation | Information processing apparatus and information processing method |
US10336453B2 (en) * | 2016-01-14 | 2019-07-02 | Elwha Llc | System and method for payload management for unmanned aircraft |
US10378895B2 (en) | 2014-08-29 | 2019-08-13 | Spookfish Innovations Pty Ltd | Aerial survey image capture system |
US10435154B1 (en) * | 2018-07-26 | 2019-10-08 | RSQ-Systems SPRL | Tethered drone system with surveillance data management |
JP2019182268A (en) * | 2018-04-12 | 2019-10-24 | 株式会社荏原製作所 | Wired drone system |
US20190364507A1 (en) * | 2018-05-25 | 2019-11-28 | At&T Intellectual Property I, L.P. | Interfering device identification |
WO2019246283A1 (en) * | 2018-06-19 | 2019-12-26 | Seekops Inc. | Localization analytics algorithms and methods |
WO2019246280A1 (en) * | 2018-06-19 | 2019-12-26 | Seekops Inc. | Emissions estimate model algorithms and methods |
US20200001979A1 (en) * | 2018-07-02 | 2020-01-02 | Bell Helicopter Textron Inc. | Method and apparatus for proximity control between rotating and non-rotating aircraft components |
US20200001988A1 (en) * | 2018-07-02 | 2020-01-02 | Bell Helicopter Textron Inc. | Method and apparatus for proximity control between rotating and non-rotating aircraft components |
CN110837839A (en) * | 2019-11-04 | 2020-02-25 | 嘉兴职业技术学院 | High-precision unmanned aerial vehicle orthoimage manufacturing and data acquisition method |
US10589423B2 (en) * | 2018-06-18 | 2020-03-17 | Shambhu Nath Roy | Robot vision super visor for hybrid homing, positioning and workspace UFO detection enabling industrial robot use for consumer applications |
US20200103922A1 (en) * | 2016-12-13 | 2020-04-02 | Autonomous Control Systems Laboratory Ltd. | Unmanned Aircraft, Device for Controlling Unmanned Aircraft, Method for Controlling Unmanned Aircraft, and Device for Detecting Failure of Unmanned Aircraft |
US10636314B2 (en) | 2018-01-03 | 2020-04-28 | Qualcomm Incorporated | Adjusting flight parameters of an aerial robotic vehicle based on presence of propeller guard(s) |
WO2020086499A1 (en) | 2018-10-22 | 2020-04-30 | Seekops Inc. | A uav-borne, high-bandwidth, lightweight point sensor for quantifying greenhouse gases in atmospheric strata |
US10642284B1 (en) * | 2018-06-08 | 2020-05-05 | Amazon Technologies, Inc. | Location determination using ground structures |
US10696396B2 (en) | 2018-03-05 | 2020-06-30 | Rsq-Systems Us Llc | Stability systems for tethered unmanned aerial vehicles |
US10717435B2 (en) | 2018-01-03 | 2020-07-21 | Qualcomm Incorporated | Adjustable object avoidance proximity threshold based on classification of detected objects |
US10719705B2 (en) | 2018-01-03 | 2020-07-21 | Qualcomm Incorporated | Adjustable object avoidance proximity threshold based on predictability of the environment |
US10720070B2 (en) | 2018-01-03 | 2020-07-21 | Qualcomm Incorporated | Adjustable object avoidance proximity threshold of a robotic vehicle based on presence of detected payload(s) |
WO2020147085A1 (en) * | 2019-01-17 | 2020-07-23 | 深圳市大疆创新科技有限公司 | Photographing control method and movable platform |
US10737783B2 (en) | 2018-01-16 | 2020-08-11 | RSQ-Systems SPRL | Control systems for unmanned aerial vehicles |
KR102147830B1 (en) * | 2019-12-16 | 2020-08-26 | (주)프리뉴 | Intergrated control system and method of unmanned aerial vehicle using al based image processing |
US10773800B2 (en) | 2018-07-26 | 2020-09-15 | RSQ-Systems SPRL | Vehicle-based deployment of a tethered surveillance drone |
US10803759B2 (en) | 2018-01-03 | 2020-10-13 | Qualcomm Incorporated | Adjustable object avoidance proximity threshold based on presence of propeller guard(s) |
CN112033389A (en) * | 2020-08-10 | 2020-12-04 | 山东科技大学 | Deformation settlement monitoring method under gully terrain condition |
US20200409357A1 (en) | 2016-04-24 | 2020-12-31 | Flytrex Aviation Ltd. | System and method for dynamically arming a failsafe on a delivery drone |
US10895968B2 (en) * | 2016-09-08 | 2021-01-19 | DJI Research LLC | Graphical user interface customization in a movable object environment |
CN112313599A (en) * | 2019-10-31 | 2021-02-02 | 深圳市大疆创新科技有限公司 | Control method, control device and storage medium |
US20210224413A1 (en) * | 2017-11-13 | 2021-07-22 | Yoppworks Inc. | Vehicle enterprise fleet management system and method |
US11226619B2 (en) * | 2016-04-24 | 2022-01-18 | Flytrex Aviation Ltd. | Dynamically arming a safety mechanism on a delivery drone |
US11242143B2 (en) | 2016-06-13 | 2022-02-08 | Skydio, Inc. | Unmanned aerial vehicle beyond visual line of sight control |
JPWO2022070375A1 (en) * | 2020-09-30 | 2022-04-07 | ||
WO2022070371A1 (en) * | 2020-09-30 | 2022-04-07 | 日本電信電話株式会社 | Propeller guard, flight vehicle, and repulsive member |
US20220185499A1 (en) * | 2020-12-10 | 2022-06-16 | Wing Aviation Llc | Systems and Methods for Autonomous Airworthiness Pre-Flight Checks for UAVs |
US11453512B2 (en) * | 2017-02-08 | 2022-09-27 | Airbus Helicopters | System and a method for assisting landing an aircraft, and a corresponding aircraft |
US11494977B2 (en) * | 2020-02-28 | 2022-11-08 | Maxar Intelligence Inc. | Automated process for building material detection in remotely sensed imagery |
US11614430B2 (en) | 2019-12-19 | 2023-03-28 | Seekops Inc. | Concurrent in-situ measurement of wind speed and trace gases on mobile platforms for localization and qualification of emissions |
US11691721B1 (en) * | 2022-04-29 | 2023-07-04 | Beta Air, Llc | System for propeller parking control for an electric aircraft and a method for its use |
US11748866B2 (en) | 2020-07-17 | 2023-09-05 | Seekops Inc. | Systems and methods of automated detection of gas plumes using optical imaging |
US11776221B2 (en) | 2018-10-09 | 2023-10-03 | Corelogic Solutions, Llc | Augmented reality application for interacting with building models |
US11988598B2 (en) | 2019-12-31 | 2024-05-21 | Seekops Inc. | Optical cell cleaner |
US11994464B2 (en) | 2019-04-05 | 2024-05-28 | Seekops Inc. | Analog signal processing for a lightweight and compact laser-based trace gas sensor |
US12007764B2 (en) | 2016-04-24 | 2024-06-11 | Flytrex Aviation Ltd. | System and method for aerial traffic management of unmanned aerial vehicles |
US12015386B2 (en) | 2020-03-25 | 2024-06-18 | Seekops Inc. | Logarithmic demodulator for laser wavelength-modulation spectroscopy |
US12044666B2 (en) | 2018-07-30 | 2024-07-23 | Seekops Inc. | Ultra-lightweight, handheld gas leak detection device |
US12055485B2 (en) | 2020-02-05 | 2024-08-06 | Seekops Inc. | Multispecies measurement platform using absorption spectroscopy for measurement of co-emitted trace gases |
US12100117B2 (en) | 2021-03-15 | 2024-09-24 | International Business Machines Corporation | Image stitching for high-resolution scans |
US12130204B2 (en) | 2019-08-05 | 2024-10-29 | Seekops Inc. | Rapidly deployable UAS system for autonomous inspection operations using a combined payload |
Families Citing this family (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CA3065503A1 (en) | 2017-05-31 | 2018-12-06 | Geomni, Inc. | System and method for mission planning and flight automation for unmanned aircraft |
IL267356B2 (en) * | 2019-06-12 | 2024-05-01 | Israel Aerospace Ind Ltd | Three dimensional aircraft autonomous navigation under constraints |
CA3139820A1 (en) * | 2019-06-20 | 2020-12-24 | Gentex Corporation | System and method for automated modular illumination and deployment |
US11945609B1 (en) * | 2023-08-16 | 2024-04-02 | Falcon Exodynamics, Inc. | System and method for identifying and distinguishing spacecraft appendages from the spacecraft body |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20040105090A1 (en) * | 2002-11-08 | 2004-06-03 | Schultz Stephen L. | Method and apparatus for capturing, geolocating and measuring oblique images |
US20140233863A1 (en) * | 2013-02-19 | 2014-08-21 | Digitalglobe, Inc. | Crowdsourced search and locate platform |
Family Cites Families (235)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US2273876A (en) | 1940-02-12 | 1942-02-24 | Frederick W Lutz | Apparatus for indicating tilt of cameras |
US3153784A (en) | 1959-12-24 | 1964-10-20 | Us Industries Inc | Photo radar ground contour mapping system |
US5345086A (en) | 1962-11-28 | 1994-09-06 | Eaton Corporation | Automatic map compilation system |
US3621326A (en) | 1968-09-30 | 1971-11-16 | Itek Corp | Transformation system |
US3594556A (en) | 1969-01-08 | 1971-07-20 | Us Navy | Optical sight with electronic image stabilization |
US3661061A (en) | 1969-05-05 | 1972-05-09 | Atomic Energy Commission | Picture position finder |
US3614410A (en) | 1969-06-12 | 1971-10-19 | Knight V Bailey | Image rectifier |
US3716669A (en) | 1971-05-14 | 1973-02-13 | Japan Eng Dev Co | Mapping rectifier for generating polarstereographic maps from satellite scan signals |
US3725563A (en) | 1971-12-23 | 1973-04-03 | Singer Co | Method of perspective transformation in scanned raster visual display |
US3864513A (en) | 1972-09-11 | 1975-02-04 | Grumman Aerospace Corp | Computerized polarimetric terrain mapping system |
US4015080A (en) | 1973-04-30 | 1977-03-29 | Elliott Brothers (London) Limited | Display devices |
JPS5223975Y2 (en) | 1973-05-29 | 1977-05-31 | ||
US3877799A (en) | 1974-02-06 | 1975-04-15 | United Kingdom Government | Method of recording the first frame in a time index system |
DE2510044A1 (en) | 1975-03-07 | 1976-09-16 | Siemens Ag | ARRANGEMENT FOR RECORDING CHARACTERS USING MOSAIC PENCILS |
US4707698A (en) | 1976-03-04 | 1987-11-17 | Constant James N | Coordinate measurement and radar device using image scanner |
US4240108A (en) | 1977-10-03 | 1980-12-16 | Grumman Aerospace Corporation | Vehicle controlled raster display system |
JPS5637416Y2 (en) | 1977-10-14 | 1981-09-02 | ||
IT1095061B (en) | 1978-05-19 | 1985-08-10 | Conte Raffaele | EQUIPMENT FOR MAGNETIC REGISTRATION OF CASUAL EVENTS RELATED TO MOBILE VEHICLES |
US4396942A (en) | 1979-04-19 | 1983-08-02 | Jackson Gates | Video surveys |
FR2461305B1 (en) | 1979-07-06 | 1985-12-06 | Thomson Csf | MAP INDICATOR SYSTEM MORE PARTICULARLY FOR AIR NAVIGATION |
DE2939681A1 (en) | 1979-09-29 | 1981-04-30 | Agfa-Gevaert Ag, 5090 Leverkusen | METHOD AND DEVICE FOR MONITORING THE QUALITY IN THE PRODUCTION OF PHOTOGRAPHIC IMAGES |
DE2940871C2 (en) | 1979-10-09 | 1983-11-10 | Messerschmitt-Bölkow-Blohm GmbH, 8012 Ottobrunn | Photogrammetric method for aircraft and spacecraft for digital terrain display |
US4387056A (en) | 1981-04-16 | 1983-06-07 | E. I. Du Pont De Nemours And Company | Process for separating zero-valent nickel species from divalent nickel species |
US4382678A (en) | 1981-06-29 | 1983-05-10 | The United States Of America As Represented By The Secretary Of The Army | Measuring of feature for photo interpretation |
US4463380A (en) | 1981-09-25 | 1984-07-31 | Vought Corporation | Image processing system |
US4495500A (en) | 1982-01-26 | 1985-01-22 | Sri International | Topographic data gathering method |
US4490742A (en) | 1982-04-23 | 1984-12-25 | Vcs, Incorporated | Encoding apparatus for a closed circuit television system |
US4586138A (en) | 1982-07-29 | 1986-04-29 | The United States Of America As Represented By The United States Department Of Energy | Route profile analysis system and method |
US4491399A (en) | 1982-09-27 | 1985-01-01 | Coherent Communications, Inc. | Method and apparatus for recording a digital signal on motion picture film |
US4527055A (en) | 1982-11-15 | 1985-07-02 | Honeywell Inc. | Apparatus for selectively viewing either of two scenes of interest |
FR2536851B1 (en) | 1982-11-30 | 1985-06-14 | Aerospatiale | RECOGNITION SYSTEM COMPRISING AN AIR VEHICLE TURNING AROUND ITS LONGITUDINAL AXIS |
US4489322A (en) | 1983-01-27 | 1984-12-18 | The United States Of America As Represented By The Secretary Of The Air Force | Radar calibration using direct measurement equipment and oblique photometry |
US4635136A (en) | 1984-02-06 | 1987-01-06 | Rochester Institute Of Technology | Method and apparatus for storing a massive inventory of labeled images |
US4814711A (en) | 1984-04-05 | 1989-03-21 | Deseret Research, Inc. | Survey system and method for real time collection and processing of geophysicals data using signals from a global positioning satellite network |
US4686474A (en) | 1984-04-05 | 1987-08-11 | Deseret Research, Inc. | Survey system for collection and real time processing of geophysical data |
US4673988A (en) | 1985-04-22 | 1987-06-16 | E.I. Du Pont De Nemours And Company | Electronic mosaic imaging process |
US4653136A (en) | 1985-06-21 | 1987-03-31 | Denison James W | Wiper for rear view mirror |
EP0211623A3 (en) | 1985-08-01 | 1988-09-21 | British Aerospace Public Limited Company | Identification of ground targets in airborne surveillance radar returns |
US4953227A (en) | 1986-01-31 | 1990-08-28 | Canon Kabushiki Kaisha | Image mosaic-processing method and apparatus |
US4653316A (en) | 1986-03-14 | 1987-03-31 | Kabushiki Kaisha Komatsu Seisakusho | Apparatus mounted on vehicles for detecting road surface conditions |
US4688092A (en) | 1986-05-06 | 1987-08-18 | Ford Aerospace & Communications Corporation | Satellite camera image navigation |
US4956872A (en) | 1986-10-31 | 1990-09-11 | Canon Kabushiki Kaisha | Image processing apparatus capable of random mosaic and/or oil-painting-like processing |
JPS63202182A (en) | 1987-02-18 | 1988-08-22 | Olympus Optical Co Ltd | Tilted dot pattern forming method |
US4814896A (en) | 1987-03-06 | 1989-03-21 | Heitzman Edward F | Real time video data acquisition systems |
US5164825A (en) | 1987-03-30 | 1992-11-17 | Canon Kabushiki Kaisha | Image processing method and apparatus for mosaic or similar processing therefor |
US4807024A (en) | 1987-06-08 | 1989-02-21 | The University Of South Carolina | Three-dimensional display methods and apparatus |
US4899296A (en) | 1987-11-13 | 1990-02-06 | Khattak Anwar S | Pavement distress survey system |
IL85731A (en) | 1988-03-14 | 1995-05-26 | B T A Automatic Piloting Syste | Apparatus and method for controlling aircraft, particularly remotely-controlled aircraft |
US4843463A (en) | 1988-05-23 | 1989-06-27 | Michetti Joseph A | Land vehicle mounted audio-visual trip recorder |
GB8826550D0 (en) | 1988-11-14 | 1989-05-17 | Smiths Industries Plc | Image processing apparatus and methods |
US4906198A (en) | 1988-12-12 | 1990-03-06 | International Business Machines Corporation | Circuit board assembly and contact pin for use therein |
JP2765022B2 (en) | 1989-03-24 | 1998-06-11 | キヤノン販売株式会社 | 3D image forming device |
US5617224A (en) | 1989-05-08 | 1997-04-01 | Canon Kabushiki Kaisha | Image processing apparatus having mosaic processing feature that decreases image resolution without changing image size or the number of pixels |
US5086314A (en) | 1990-05-21 | 1992-02-04 | Nikon Corporation | Exposure control apparatus for camera |
JPH0316377A (en) | 1989-06-14 | 1991-01-24 | Kokusai Denshin Denwa Co Ltd <Kdd> | Method and apparatus for reducing binary picture |
US5166789A (en) | 1989-08-25 | 1992-11-24 | Space Island Products & Services, Inc. | Geographical surveying using cameras in combination with flight computers to obtain images with overlaid geographical coordinates |
FR2655448B1 (en) | 1989-12-04 | 1992-03-13 | Vigilant Ltd | CONTROL SYSTEM FOR A TELEGUIDED AIRCRAFT |
JP3147358B2 (en) | 1990-02-23 | 2001-03-19 | ミノルタ株式会社 | Camera that can record location data |
US5335072A (en) | 1990-05-30 | 1994-08-02 | Minolta Camera Kabushiki Kaisha | Photographic system capable of storing information on photographed image data |
EP0464263A3 (en) | 1990-06-27 | 1992-06-10 | Siemens Aktiengesellschaft | Device for obstacle detection for pilots of low flying aircrafts |
US5191174A (en) | 1990-08-01 | 1993-03-02 | International Business Machines Corporation | High density circuit board and method of making same |
US5200793A (en) | 1990-10-24 | 1993-04-06 | Kaman Aerospace Corporation | Range finding array camera |
US5155597A (en) | 1990-11-28 | 1992-10-13 | Recon/Optical, Inc. | Electro-optical imaging array with motion compensation |
JPH04250436A (en) | 1991-01-11 | 1992-09-07 | Pioneer Electron Corp | Image pickup device |
US5265173A (en) | 1991-03-20 | 1993-11-23 | Hughes Aircraft Company | Rectilinear object image matcher |
US5369443A (en) | 1991-04-12 | 1994-11-29 | Abekas Video Systems, Inc. | Digital video effects generator |
CA2066280C (en) | 1991-04-16 | 1997-12-09 | Masaru Hiramatsu | Image pickup system with a image pickup device for control |
US5555018A (en) | 1991-04-25 | 1996-09-10 | Von Braun; Heiko S. | Large-scale mapping of parameters of multi-dimensional structures in natural environments |
US5231435A (en) | 1991-07-12 | 1993-07-27 | Blakely Bruce W | Aerial camera mounting apparatus |
EP0530391B1 (en) | 1991-09-05 | 1996-12-11 | Nec Corporation | Image pickup system capable of producing correct image signals of an object zone |
US5677515A (en) | 1991-10-18 | 1997-10-14 | Trw Inc. | Shielded multilayer printed wiring board, high frequency, high isolation |
US5402170A (en) | 1991-12-11 | 1995-03-28 | Eastman Kodak Company | Hand-manipulated electronic camera tethered to a personal computer |
US5247356A (en) | 1992-02-14 | 1993-09-21 | Ciampa John A | Method and apparatus for mapping and measuring land |
US5270756A (en) | 1992-02-18 | 1993-12-14 | Hughes Training, Inc. | Method and apparatus for generating high resolution vidicon camera images |
US5251037A (en) | 1992-02-18 | 1993-10-05 | Hughes Training, Inc. | Method and apparatus for generating high resolution CCD camera images |
US5372337A (en) | 1992-05-01 | 1994-12-13 | Kress; Robert W. | Unmanned aerial aircraft having a single engine with dual jet exhausts |
US5277380A (en) | 1992-06-22 | 1994-01-11 | United Technologies Corporation | Toroidal fuselage structure for unmanned aerial vehicles having ducted, coaxial, counter-rotating rotors |
US5506644A (en) | 1992-08-18 | 1996-04-09 | Olympus Optical Co., Ltd. | Camera |
US5481479A (en) | 1992-12-10 | 1996-01-02 | Loral Fairchild Corp. | Nonlinear scanning to optimize sector scan electro-optic reconnaissance system performance |
US5342999A (en) | 1992-12-21 | 1994-08-30 | Motorola, Inc. | Apparatus for adapting semiconductor die pads and method therefor |
US5414462A (en) | 1993-02-11 | 1995-05-09 | Veatch; John W. | Method and apparatus for generating a comprehensive survey map |
US5508736A (en) | 1993-05-14 | 1996-04-16 | Cooper; Roger D. | Video signal processing apparatus for producing a composite signal for simultaneous display of data and video information |
US5467271A (en) | 1993-12-17 | 1995-11-14 | Trw, Inc. | Mapping and analysis system for precision farming applications |
DE69532126T2 (en) | 1994-05-19 | 2004-07-22 | Geospan Corp., Plymouth | METHOD FOR COLLECTING AND PROCESSING VISUAL AND SPATIAL POSITION INFORMATION |
RU2153700C2 (en) | 1995-04-17 | 2000-07-27 | Спейс Системз/Лорал, Инк. | Orientation and image shaping control system (design versions) |
US5604534A (en) | 1995-05-24 | 1997-02-18 | Omni Solutions International, Ltd. | Direct digital airborne panoramic camera system and method |
US5581258A (en) | 1995-06-07 | 1996-12-03 | The United States Of America As Represented By The Secretary Of The Navy | Portable antenna controller |
US5668593A (en) | 1995-06-07 | 1997-09-16 | Recon/Optical, Inc. | Method and camera system for step frame reconnaissance with motion compensation |
US5963664A (en) | 1995-06-22 | 1999-10-05 | Sarnoff Corporation | Method and system for image combination using a parallax-based technique |
US5904724A (en) | 1996-01-19 | 1999-05-18 | Margolin; Jed | Method and apparatus for remotely piloting an aircraft |
US5835133A (en) | 1996-01-23 | 1998-11-10 | Silicon Graphics, Inc. | Optical system for single camera stereo video |
US5894323A (en) | 1996-03-22 | 1999-04-13 | Tasc, Inc. | Airborne imaging system using global positioning system (GPS) and inertial measurement unit (IMU) data |
US5844602A (en) | 1996-05-07 | 1998-12-01 | Recon/Optical, Inc. | Electro-optical imaging array and camera system with pitch rate image motion compensation which can be used in an airplane in a dive bomb maneuver |
US5798786A (en) | 1996-05-07 | 1998-08-25 | Recon/Optical, Inc. | Electro-optical imaging detector array for a moving vehicle which includes two axis image motion compensation and transfers pixels in row directions and column directions |
US5841574A (en) | 1996-06-28 | 1998-11-24 | Recon/Optical, Inc. | Multi-spectral decentered catadioptric optical system |
WO1998020301A1 (en) | 1996-11-05 | 1998-05-14 | Lockheed Martin Corporation | Electro-optical reconnaissance system with forward motion compensation |
US6108032A (en) | 1996-11-05 | 2000-08-22 | Lockheed Martin Fairchild Systems | System and method for image motion compensation of a CCD image sensor |
RU2127075C1 (en) | 1996-12-11 | 1999-03-10 | Корженевский Александр Владимирович | Method for producing tomographic image of body and electrical-impedance tomographic scanner |
US6222583B1 (en) | 1997-03-27 | 2001-04-24 | Nippon Telegraph And Telephone Corporation | Device and system for labeling sight images |
US6597818B2 (en) | 1997-05-09 | 2003-07-22 | Sarnoff Corporation | Method and apparatus for performing geo-spatial registration of imagery |
US6097854A (en) | 1997-08-01 | 2000-08-01 | Microsoft Corporation | Image mosaic construction system and apparatus with patch-based alignment, global block adjustment and pair-wise motion-based local warping |
US6157747A (en) | 1997-08-01 | 2000-12-05 | Microsoft Corporation | 3-dimensional image rotation method and apparatus for producing image mosaics |
AU9783798A (en) | 1997-10-06 | 1999-04-27 | John A. Ciampa | Digital-image mapping |
WO1999024936A1 (en) | 1997-11-10 | 1999-05-20 | Gentech Corporation | System and method for generating super-resolution-enhanced mosaic images |
US5852753A (en) | 1997-11-10 | 1998-12-22 | Lo; Allen Kwok Wah | Dual-lens camera with shutters for taking dual or single images |
US6037945A (en) | 1997-12-16 | 2000-03-14 | Xactware, Inc. | Graphical method for modeling and estimating construction costs |
US6094215A (en) | 1998-01-06 | 2000-07-25 | Intel Corporation | Method of determining relative camera orientation position to create 3-D visual images |
US6130705A (en) | 1998-07-10 | 2000-10-10 | Recon/Optical, Inc. | Autonomous electro-optical framing camera system with constant ground resolution, unmanned airborne vehicle therefor, and methods of use |
JP4245699B2 (en) | 1998-09-16 | 2009-03-25 | オリンパス株式会社 | Imaging device |
US6434265B1 (en) | 1998-09-25 | 2002-08-13 | Apple Computer, Inc. | Aligning rectilinear images in 3D through projective registration and calibration |
DE19857667A1 (en) | 1998-12-15 | 2000-08-17 | Aerowest Photogrammetrie H Ben | Process for creating a three-dimensional object description |
US6167300A (en) | 1999-03-08 | 2000-12-26 | Tci Incorporated | Electric mammograph |
DE19922341C2 (en) | 1999-05-14 | 2002-08-29 | Zsp Geodaetische Sys Gmbh | Method and arrangement for determining the spatial coordinates of at least one object point |
AUPQ056099A0 (en) | 1999-05-25 | 1999-06-17 | Silverbrook Research Pty Ltd | A method and apparatus (pprint01) |
TW483287B (en) | 1999-06-21 | 2002-04-11 | Semiconductor Energy Lab | EL display device, driving method thereof, and electronic equipment provided with the EL display device |
US6639596B1 (en) | 1999-09-20 | 2003-10-28 | Microsoft Corporation | Stereo reconstruction from multiperspective panoramas |
US7233691B2 (en) | 1999-12-29 | 2007-06-19 | Geospan Corporation | Any aspect passive volumetric image processing method |
US6826539B2 (en) | 1999-12-31 | 2004-11-30 | Xactware, Inc. | Virtual structure data repository and directory |
US6829584B2 (en) | 1999-12-31 | 2004-12-07 | Xactware, Inc. | Virtual home data repository and directory |
US6810383B1 (en) | 2000-01-21 | 2004-10-26 | Xactware, Inc. | Automated task management and evaluation |
WO2001058129A2 (en) | 2000-02-03 | 2001-08-09 | Alst Technical Excellence Center | Image resolution improvement using a color mosaic sensor |
CA2400975C (en) | 2000-03-16 | 2008-11-04 | The Johns Hopkins University | Light detection and ranging (lidar) mapping system |
IL151951A0 (en) | 2000-03-29 | 2003-04-10 | Astrovision International Inc | Direct broadcast imaging satellite system, apparatus and method for providing real-time, continuous monitoring of earth from geostationary earth orbit and related services |
US7184072B1 (en) | 2000-06-15 | 2007-02-27 | Power View Company, L.L.C. | Airborne inventory and inspection system and apparatus |
US6834128B1 (en) | 2000-06-16 | 2004-12-21 | Hewlett-Packard Development Company, L.P. | Image mosaicing system and method adapted to mass-market hand-held digital cameras |
US6484101B1 (en) | 2000-08-16 | 2002-11-19 | Imagelinks, Inc. | 3-dimensional interactive image modeling system |
US7313289B2 (en) | 2000-08-30 | 2007-12-25 | Ricoh Company, Ltd. | Image processing method and apparatus and computer-readable storage medium using improved distortion correction |
US6421610B1 (en) | 2000-09-15 | 2002-07-16 | Ernest A. Carroll | Method of preparing and disseminating digitized geospatial data |
US6959120B1 (en) | 2000-10-27 | 2005-10-25 | Microsoft Corporation | Rebinning methods and arrangements for use in compressing image-based rendering (IBR) data |
EP1384046B1 (en) | 2001-05-04 | 2018-10-03 | Vexcel Imaging GmbH | Digital camera for and method of obtaining overlapping images |
US7046401B2 (en) | 2001-06-01 | 2006-05-16 | Hewlett-Packard Development Company, L.P. | Camera-based document scanning system using multiple-pass mosaicking |
US7509241B2 (en) | 2001-07-06 | 2009-03-24 | Sarnoff Corporation | Method and apparatus for automatically generating a site model |
US20030043824A1 (en) | 2001-08-31 | 2003-03-06 | Remboski Donald J. | Vehicle active network and device |
US6847865B2 (en) | 2001-09-27 | 2005-01-25 | Ernest A. Carroll | Miniature, unmanned aircraft with onboard stabilization and automated ground control of flight path |
US6747686B1 (en) | 2001-10-05 | 2004-06-08 | Recon/Optical, Inc. | High aspect stereoscopic mode camera and method |
US7262790B2 (en) | 2002-01-09 | 2007-08-28 | Charles Adams Bakewell | Mobile enforcement platform with aimable violation identification and documentation system for multiple traffic violation types across all lanes in moving traffic, generating composite display images and data to support citation generation, homeland security, and monitoring |
TW550521B (en) | 2002-02-07 | 2003-09-01 | Univ Nat Central | Method for re-building 3D model of house in a semi-automatic manner using edge segments of buildings |
US6894809B2 (en) | 2002-03-01 | 2005-05-17 | Orasee Corp. | Multiple angle display produced from remote optical sensing devices |
JP4184703B2 (en) | 2002-04-24 | 2008-11-19 | 大日本印刷株式会社 | Image correction method and system |
US7725258B2 (en) | 2002-09-20 | 2010-05-25 | M7 Visual Intelligence, L.P. | Vehicle based data collection and processing system and imaging sensor system and methods thereof |
EP1540937A4 (en) | 2002-09-20 | 2008-11-12 | M7 Visual Intelligence Lp | Vehicle based data collection and processing system |
EP1696204B1 (en) | 2002-11-08 | 2015-01-28 | Pictometry International Corp. | Method for capturing, geolocating and measuring oblique images |
US6742741B1 (en) | 2003-02-24 | 2004-06-01 | The Boeing Company | Unmanned air vehicle and method of flying an unmanned air vehicle |
SE0300871D0 (en) | 2003-03-27 | 2003-03-27 | Saab Ab | Waypoint navigation |
US7343232B2 (en) | 2003-06-20 | 2008-03-11 | Geneva Aerospace | Vehicle control system including related methods and components |
US7018050B2 (en) | 2003-09-08 | 2006-03-28 | Hewlett-Packard Development Company, L.P. | System and method for correcting luminance non-uniformity of obliquely projected images |
JP2005151536A (en) | 2003-10-23 | 2005-06-09 | Nippon Dempa Kogyo Co Ltd | Crystal oscillator |
US7130741B2 (en) | 2003-10-23 | 2006-10-31 | International Business Machines Corporation | Navigating a UAV with a remote control device |
US7916940B2 (en) | 2004-01-31 | 2011-03-29 | Hewlett-Packard Development Company | Processing of mosaic digital images |
JP2007525770A (en) | 2004-02-27 | 2007-09-06 | インターグラフ ソフトウェアー テクノロジーズ カンパニー | Technology to form a single image from multiple overlapping images |
US20060028550A1 (en) | 2004-08-06 | 2006-02-09 | Palmer Robert G Jr | Surveillance system and method |
WO2006121457A2 (en) | 2004-08-18 | 2006-11-16 | Sarnoff Corporation | Method and apparatus for performing three-dimensional computer modeling |
US8078396B2 (en) | 2004-08-31 | 2011-12-13 | Meadow William D | Methods for and apparatus for generating a continuum of three dimensional image data |
CA2484422A1 (en) | 2004-10-08 | 2006-04-08 | Fugro Airborne Surveys | Unmanned airborne vehicle for geophysical surveying |
US7680053B1 (en) | 2004-10-29 | 2010-03-16 | Marvell International Ltd. | Inter-device flow control |
US7348895B2 (en) | 2004-11-03 | 2008-03-25 | Lagassey Paul J | Advanced automobile accident detection, data recordation and reporting system |
US7142984B2 (en) | 2005-02-08 | 2006-11-28 | Harris Corporation | Method and apparatus for enhancing a digital elevation model (DEM) for topographical modeling |
CN101164025A (en) | 2005-04-01 | 2008-04-16 | 雅马哈发动机株式会社 | Control method, control device, and unmanned helicopter |
US7466244B2 (en) | 2005-04-21 | 2008-12-16 | Microsoft Corporation | Virtual earth rooftop overlay and bounding |
US7554539B2 (en) | 2005-07-27 | 2009-06-30 | Balfour Technologies Llc | System for viewing a collection of oblique imagery in a three or four dimensional virtual scene |
US7844499B2 (en) | 2005-12-23 | 2010-11-30 | Sharp Electronics Corporation | Integrated solar agent business model |
US7778491B2 (en) | 2006-04-10 | 2010-08-17 | Microsoft Corporation | Oblique image stitching |
US20070244608A1 (en) | 2006-04-13 | 2007-10-18 | Honeywell International Inc. | Ground control station for UAV |
US7922115B2 (en) | 2006-04-21 | 2011-04-12 | Colgren Richard D | Modular unmanned air-vehicle |
EP2036043A2 (en) | 2006-06-26 | 2009-03-18 | Lockheed Martin Corporation | Method and system for providing a perspective view image by intelligent fusion of a plurality of sensor data |
US7873238B2 (en) | 2006-08-30 | 2011-01-18 | Pictometry International Corporation | Mosaic oblique images and methods of making and using same |
US20100121574A1 (en) | 2006-09-05 | 2010-05-13 | Honeywell International Inc. | Method for collision avoidance of unmanned aerial vehicle with other aircraft |
DE102007030781A1 (en) | 2006-10-11 | 2008-04-17 | Gta Geoinformatik Gmbh | Method for texturing virtual three-dimensional objects |
IL179344A (en) | 2006-11-16 | 2014-02-27 | Rafael Advanced Defense Sys | Method for tracking a moving platform |
US20100250022A1 (en) | 2006-12-29 | 2010-09-30 | Air Recon, Inc. | Useful unmanned aerial vehicle |
TWI361095B (en) | 2007-03-23 | 2012-04-01 | Yu Tuan Lee | Remote-controlled motion apparatus with acceleration self-sense and remote control apparatus therefor |
TWI324080B (en) | 2007-03-23 | 2010-05-01 | Yu Tuan Lee | Remote-controlled motion apparatus with sensing terrestrial magnetism and remote control apparatus therefor |
US7832267B2 (en) | 2007-04-25 | 2010-11-16 | Ecometriks, Llc | Method for determining temporal solar irradiance values |
EP2153245A1 (en) | 2007-05-04 | 2010-02-17 | Teledyne Australia Pty Ltd. | Collision avoidance system and method |
US8346578B1 (en) | 2007-06-13 | 2013-01-01 | United Services Automobile Association | Systems and methods for using unmanned aerial vehicles |
WO2009025928A2 (en) | 2007-06-19 | 2009-02-26 | Ch2M Hill, Inc. | Systems and methods for solar mapping, determining a usable area for solar energy production and/or providing solar information |
US9026272B2 (en) | 2007-12-14 | 2015-05-05 | The Boeing Company | Methods for autonomous tracking and surveillance |
US8417061B2 (en) | 2008-02-01 | 2013-04-09 | Sungevity Inc. | Methods and systems for provisioning energy systems |
US8275194B2 (en) | 2008-02-15 | 2012-09-25 | Microsoft Corporation | Site modeling using image data fusion |
US8131406B2 (en) | 2008-04-09 | 2012-03-06 | Lycoming Engines, A Division Of Avco Corporation | Piston engine aircraft automated pre-flight testing |
US8538151B2 (en) | 2008-04-23 | 2013-09-17 | Pasco Corporation | Building roof outline recognizing device, building roof outline recognizing method, and building roof outline recognizing program |
WO2009131542A1 (en) | 2008-04-23 | 2009-10-29 | Drone Technology Pte Ltd | Module for data acquisition and control in a sensor/control network |
JP5134469B2 (en) | 2008-08-21 | 2013-01-30 | 三菱重工業株式会社 | Drone system and its operation method |
US20100079267A1 (en) | 2008-09-29 | 2010-04-01 | Tsun-Huang Lin | Automobile Anti-Collision Early-Warning Device |
US7969346B2 (en) | 2008-10-07 | 2011-06-28 | Honeywell International Inc. | Transponder-based beacon transmitter for see and avoid of unmanned aerial vehicles |
US8422825B1 (en) | 2008-11-05 | 2013-04-16 | Hover Inc. | Method and system for geometry extraction, 3D visualization and analysis using arbitrary oblique imagery |
US9437044B2 (en) | 2008-11-05 | 2016-09-06 | Hover Inc. | Method and system for displaying and navigating building facades in a three-dimensional mapping system |
US8242623B2 (en) | 2008-11-13 | 2012-08-14 | Honeywell International Inc. | Structural ring interconnect printed circuit board assembly for a ducted fan unmanned aerial vehicle |
US20100286859A1 (en) | 2008-11-18 | 2010-11-11 | Honeywell International Inc. | Methods for generating a flight plan for an unmanned aerial vehicle based on a predicted camera path |
US20100215212A1 (en) | 2009-02-26 | 2010-08-26 | Honeywell International Inc. | System and Method for the Inspection of Structures |
US8401222B2 (en) | 2009-05-22 | 2013-03-19 | Pictometry International Corp. | System and process for roof measurement using aerial imagery |
KR101262968B1 (en) | 2009-09-02 | 2013-05-09 | 부산대학교 산학협력단 | Unmanned Aerial System Including Unmanned Aerial Vehicle Having Spherical Loading Portion And Unmanned Ground Vehicle Therefor |
AU2010324768A1 (en) | 2009-11-25 | 2012-06-14 | Aerovironment, Inc. | Automatic configuration control of a device |
US9036861B2 (en) | 2010-04-22 | 2015-05-19 | The University Of North Carolina At Charlotte | Method and system for remotely inspecting bridges and other structures |
US8965598B2 (en) | 2010-09-30 | 2015-02-24 | Empire Technology Development Llc | Automatic flight control for UAV based solid modeling |
US20120143482A1 (en) | 2010-12-02 | 2012-06-07 | Honeywell International Inc. | Electronically file and fly unmanned aerial vehicle |
CA2845094A1 (en) | 2011-08-16 | 2013-04-18 | Unmanned Innovation Inc. | Modular flight management system incorporating an autopilot |
ES2672881T3 (en) * | 2011-11-29 | 2018-06-18 | Pictometry International Corp. | System of automatic detection of footprint of oblique imagery structure |
TW201328344A (en) | 2011-12-27 | 2013-07-01 | Hon Hai Prec Ind Co Ltd | System and method for controlling a unmanned aerial vehicle to capture images of a target location |
US10515414B2 (en) | 2012-02-03 | 2019-12-24 | Eagle View Technologies, Inc. | Systems and methods for performing a risk management assessment of a property |
US9183538B2 (en) | 2012-03-19 | 2015-11-10 | Pictometry International Corp. | Method and system for quick square roof reporting |
US9170106B2 (en) | 2012-04-19 | 2015-10-27 | Raytheon Corporation | Shock-resistant device and method |
US9501760B2 (en) | 2012-04-24 | 2016-11-22 | Michael Paul Stanley | Media echoing and social networking device and method |
US20140018979A1 (en) * | 2012-07-13 | 2014-01-16 | Honeywell International Inc. | Autonomous airspace flight planning and virtual airspace containment system |
US20140132635A1 (en) | 2012-11-09 | 2014-05-15 | Ali Murdoch | Systems and methods for roof area estimation |
DE202013012541U1 (en) | 2012-11-15 | 2017-06-27 | SZ DJI Technology Co., Ltd. | Unmanned aerial vehicle with multiple rotors |
US8874283B1 (en) | 2012-12-04 | 2014-10-28 | United Dynamics Advanced Technologies Corporation | Drone for inspection of enclosed space and method thereof |
US20140316614A1 (en) | 2012-12-17 | 2014-10-23 | David L. Newman | Drone for collecting images and system for categorizing image data |
US9162753B1 (en) | 2012-12-31 | 2015-10-20 | Southern Electrical Equipment Company, Inc. | Unmanned aerial vehicle for monitoring infrastructure assets |
US9075415B2 (en) * | 2013-03-11 | 2015-07-07 | Airphrame, Inc. | Unmanned aerial vehicle and methods for controlling same |
US8931144B2 (en) | 2013-03-14 | 2015-01-13 | State Farm Mutual Automobile Insurance Company | Tethering system and method for remote device |
US9330504B2 (en) | 2013-04-30 | 2016-05-03 | Hover Inc. | 3D building model construction tools |
US8991758B2 (en) | 2013-05-13 | 2015-03-31 | Precisionhawk Inc. | Unmanned aerial vehicle |
US9798928B2 (en) * | 2013-07-17 | 2017-10-24 | James L Carr | System for collecting and processing aerial imagery with enhanced 3D and NIR imaging capability |
AU2014295972B2 (en) | 2013-08-02 | 2018-10-11 | Xactware Solutions, Inc. | System and method for detecting features in aerial images using disparity mapping and segmentation techniques |
WO2015102731A2 (en) | 2013-10-18 | 2015-07-09 | Aerovironment, Inc. | Privacy shield for unmanned aerial systems |
WO2015108588A2 (en) | 2013-10-21 | 2015-07-23 | Kespry, Inc. | Systems and methods for unmanned aerial vehicle landing |
EP3060479A4 (en) | 2013-10-21 | 2017-05-17 | Kespry Inc. | System and methods for execution of recovery actions on an unmanned aerial vehicle |
WO2015123348A1 (en) | 2014-02-11 | 2015-08-20 | Xactware Solutions, Inc. | System and method for generating computerized floor plans |
US20150254738A1 (en) * | 2014-03-05 | 2015-09-10 | TerrAvion, LLC | Systems and methods for aerial imaging and analysis |
US9273981B1 (en) | 2014-05-12 | 2016-03-01 | Unmanned Innovation, Inc. | Distributed unmanned aerial vehicle architecture |
US9256994B2 (en) | 2014-05-12 | 2016-02-09 | Unmanned Innovation, Inc. | Unmanned aerial vehicle authorization and geofence envelope determination |
US11768508B2 (en) | 2015-02-13 | 2023-09-26 | Skydio, Inc. | Unmanned aerial vehicle sensor activation and correlation system |
WO2016130994A1 (en) | 2015-02-13 | 2016-08-18 | Unmanned Innovation, Inc. | Unmanned aerial vehicle remote flight planning system |
US9953540B2 (en) * | 2015-06-16 | 2018-04-24 | Here Global B.V. | Air space maps |
US9508263B1 (en) * | 2015-10-20 | 2016-11-29 | Skycatch, Inc. | Generating a mission plan for capturing aerial images with an unmanned aerial vehicle |
US9513635B1 (en) | 2015-12-30 | 2016-12-06 | Unmanned Innovation, Inc. | Unmanned aerial vehicle inspection system |
US9740200B2 (en) | 2015-12-30 | 2017-08-22 | Unmanned Innovation, Inc. | Unmanned aerial vehicle inspection system |
US9609288B1 (en) | 2015-12-31 | 2017-03-28 | Unmanned Innovation, Inc. | Unmanned aerial vehicle rooftop inspection system |
WO2017116860A1 (en) | 2015-12-31 | 2017-07-06 | Unmanned Innovation, Inc. | Unmanned aerial vehicle rooftop inspection system |
CA3012049A1 (en) * | 2016-01-20 | 2017-07-27 | Ez3D, Llc | System and method for structural inspection and construction estimation using an unmanned aerial vehicle |
US20180025649A1 (en) | 2016-02-08 | 2018-01-25 | Unmanned Innovation Inc. | Unmanned aerial vehicle privacy controls |
US10671648B2 (en) | 2016-02-22 | 2020-06-02 | Eagle View Technologies, Inc. | Integrated centralized property database systems and methods |
US9592912B1 (en) | 2016-03-08 | 2017-03-14 | Unmanned Innovation, Inc. | Ground control point assignment and determination system |
US9658619B1 (en) | 2016-03-31 | 2017-05-23 | Unmanned Innovation, Inc. | Unmanned aerial vehicle modular command priority determination and filtering system |
2017
- 2017-01-09 WO PCT/US2017/012696 patent/WO2017120571A1/en active Application Filing
- 2017-01-09 AU AU2017206097A patent/AU2017206097B2/en not_active Ceased
- 2017-01-09 MX MX2018007935A patent/MX2018007935A/en unknown
- 2017-01-09 US US15/401,999 patent/US20170235018A1/en not_active Abandoned
- 2017-01-09 EP EP17736502.0A patent/EP3391164B1/en active Active
- 2017-01-09 CA CA3001023A patent/CA3001023A1/en active Pending
2021
- 2021-10-06 AU AU2021245126A patent/AU2021245126B2/en active Active
- 2021-10-07 US US17/495,988 patent/US12079013B2/en active Active
Patent Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20040105090A1 (en) * | 2002-11-08 | 2004-06-03 | Schultz Stephen L. | Method and apparatus for capturing, geolocating and measuring oblique images |
US20140233863A1 (en) * | 2013-02-19 | 2014-08-21 | Digitalglobe, Inc. | Crowdsourced search and locate platform |
Cited By (82)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11175135B2 (en) | 2014-08-29 | 2021-11-16 | Spookfish Innovations Pty Ltd | Aerial survey image capture systems and methods |
US10612923B2 (en) | 2014-08-29 | 2020-04-07 | Spookfish Innovations Pty Ltd | Aerial survey image capture system |
US10378895B2 (en) | 2014-08-29 | 2019-08-13 | Spookfish Innovations Pty Ltd | Aerial survey image capture system |
US10267949B2 (en) * | 2014-10-17 | 2019-04-23 | Sony Corporation | Information processing apparatus and information processing method |
US10336453B2 (en) * | 2016-01-14 | 2019-07-02 | Elwha Llc | System and method for payload management for unmanned aircraft |
US10762795B2 (en) | 2016-02-08 | 2020-09-01 | Skydio, Inc. | Unmanned aerial vehicle privacy controls |
US11189180B2 (en) | 2016-02-08 | 2021-11-30 | Skydio, Inc. | Unmanned aerial vehicle visual line of sight control |
US11361665B2 (en) | 2016-02-08 | 2022-06-14 | Skydio, Inc. | Unmanned aerial vehicle privacy controls |
US20180025649A1 (en) * | 2016-02-08 | 2018-01-25 | Unmanned Innovation Inc. | Unmanned aerial vehicle privacy controls |
US11854413B2 (en) | 2016-02-08 | 2023-12-26 | Skydio, Inc | Unmanned aerial vehicle visual line of sight control |
US11762384B2 (en) | 2016-04-24 | 2023-09-19 | Flytrex Aviation Ltd. | System and method for dynamically arming a failsafe on a delivery drone |
US20200409357A1 (en) | 2016-04-24 | 2020-12-31 | Flytrex Aviation Ltd. | System and method for dynamically arming a failsafe on a delivery drone |
US12007764B2 (en) | 2016-04-24 | 2024-06-11 | Flytrex Aviation Ltd. | System and method for aerial traffic management of unmanned aerial vehicles |
US12001204B2 (en) | 2016-04-24 | 2024-06-04 | Flytrex Aviation Ltd. | System and method for dynamically arming a failsafe on a delivery drone |
US11226619B2 (en) * | 2016-04-24 | 2022-01-18 | Flytrex Aviation Ltd. | Dynamically arming a safety mechanism on a delivery drone |
US11897607B2 (en) | 2016-06-13 | 2024-02-13 | Skydio, Inc. | Unmanned aerial vehicle beyond visual line of sight control |
US11242143B2 (en) | 2016-06-13 | 2022-02-08 | Skydio, Inc. | Unmanned aerial vehicle beyond visual line of sight control |
US10895968B2 (en) * | 2016-09-08 | 2021-01-19 | DJI Research LLC | Graphical user interface customization in a movable object environment |
US20180082135A1 (en) * | 2016-09-22 | 2018-03-22 | Apple Inc. | Vehicle Video System |
US11756307B2 (en) | 2016-09-22 | 2023-09-12 | Apple Inc. | Vehicle video system |
US11341752B2 (en) * | 2016-09-22 | 2022-05-24 | Apple Inc. | Vehicle video system |
US10810443B2 (en) * | 2016-09-22 | 2020-10-20 | Apple Inc. | Vehicle video system |
US11640178B2 (en) * | 2016-12-13 | 2023-05-02 | Acsl Ltd. | Unmanned aircraft, device for controlling unmanned aircraft, method for controlling unmanned aircraft, and device for detecting failure of unmanned aircraft |
US20200103922A1 (en) * | 2016-12-13 | 2020-04-02 | Autonomous Control Systems Laboratory Ltd. | Unmanned Aircraft, Device for Controlling Unmanned Aircraft, Method for Controlling Unmanned Aircraft, and Device for Detecting Failure of Unmanned Aircraft |
US20180211406A1 (en) * | 2017-01-23 | 2018-07-26 | Shanghai Hang Seng Electronic Technology Co., Ltd | Image processing method and device for unmanned aerial vehicle |
US11453512B2 (en) * | 2017-02-08 | 2022-09-27 | Airbus Helicopters | System and a method for assisting landing an aircraft, and a corresponding aircraft |
US10189580B2 (en) * | 2017-06-16 | 2019-01-29 | Aerobo | Image stabilization and pointing control mechanization for aircraft imaging systems |
US20190041219A1 (en) * | 2017-08-02 | 2019-02-07 | X Development Llc | Systems and Methods for Navigation Path Determination for Unmanned Vehicles |
US10393528B2 (en) * | 2017-08-02 | 2019-08-27 | Wing Aviation Llc | Systems and methods for navigation path determination for unmanned vehicles |
US20190112049A1 (en) * | 2017-10-17 | 2019-04-18 | Top Flight Technologies, Inc. | Portable launch system |
US20210224413A1 (en) * | 2017-11-13 | 2021-07-22 | Yoppworks Inc. | Vehicle enterprise fleet management system and method |
US10636314B2 (en) | 2018-01-03 | 2020-04-28 | Qualcomm Incorporated | Adjusting flight parameters of an aerial robotic vehicle based on presence of propeller guard(s) |
US10717435B2 (en) | 2018-01-03 | 2020-07-21 | Qualcomm Incorporated | Adjustable object avoidance proximity threshold based on classification of detected objects |
US10719705B2 (en) | 2018-01-03 | 2020-07-21 | Qualcomm Incorporated | Adjustable object avoidance proximity threshold based on predictability of the environment |
US10720070B2 (en) | 2018-01-03 | 2020-07-21 | Qualcomm Incorporated | Adjustable object avoidance proximity threshold of a robotic vehicle based on presence of detected payload(s) |
US10803759B2 (en) | 2018-01-03 | 2020-10-13 | Qualcomm Incorporated | Adjustable object avoidance proximity threshold based on presence of propeller guard(s) |
US10737783B2 (en) | 2018-01-16 | 2020-08-11 | RSQ-Systems SPRL | Control systems for unmanned aerial vehicles |
US10696396B2 (en) | 2018-03-05 | 2020-06-30 | Rsq-Systems Us Llc | Stability systems for tethered unmanned aerial vehicles |
JP7048397B2 (en) | 2018-04-12 | 2022-04-05 | 株式会社荏原製作所 | Wired drone system |
JP2019182268A (en) * | 2018-04-12 | 2019-10-24 | 株式会社荏原製作所 | Wired drone system |
US10772043B2 (en) * | 2018-05-25 | 2020-09-08 | At&T Intellectual Property I, L.P. | Interfering device identification |
US11425658B2 (en) | 2018-05-25 | 2022-08-23 | At&T Intellectual Property I, L.P. | Interfering device identification |
US20190364507A1 (en) * | 2018-05-25 | 2019-11-28 | At&T Intellectual Property I, L.P. | Interfering device identification |
US10642284B1 (en) * | 2018-06-08 | 2020-05-05 | Amazon Technologies, Inc. | Location determination using ground structures |
US10589423B2 (en) * | 2018-06-18 | 2020-03-17 | Shambhu Nath Roy | Robot vision super visor for hybrid homing, positioning and workspace UFO detection enabling industrial robot use for consumer applications |
WO2019246280A1 (en) * | 2018-06-19 | 2019-12-26 | Seekops Inc. | Emissions estimate model algorithms and methods |
WO2019246283A1 (en) * | 2018-06-19 | 2019-12-26 | Seekops Inc. | Localization analytics algorithms and methods |
US20200001988A1 (en) * | 2018-07-02 | 2020-01-02 | Bell Helicopter Textron Inc. | Method and apparatus for proximity control between rotating and non-rotating aircraft components |
US10569866B2 (en) * | 2018-07-02 | 2020-02-25 | Bell Helicopter Textron Inc. | Method and apparatus for proximity control between rotating and non-rotating aircraft components |
US20200001979A1 (en) * | 2018-07-02 | 2020-01-02 | Bell Helicopter Textron Inc. | Method and apparatus for proximity control between rotating and non-rotating aircraft components |
US10583916B2 (en) * | 2018-07-02 | 2020-03-10 | Bell Helicopter Textron Inc. | Method and apparatus for proximity control between rotating and non-rotating aircraft components |
US10773800B2 (en) | 2018-07-26 | 2020-09-15 | RSQ-Systems SPRL | Vehicle-based deployment of a tethered surveillance drone |
US10435154B1 (en) * | 2018-07-26 | 2019-10-08 | RSQ-Systems SPRL | Tethered drone system with surveillance data management |
US12044666B2 (en) | 2018-07-30 | 2024-07-23 | Seekops Inc. | Ultra-lightweight, handheld gas leak detection device |
US12014433B1 (en) * | 2018-10-09 | 2024-06-18 | Corelogic Solutions, Llc | Generation and display of interactive 3D real estate models |
US11776221B2 (en) | 2018-10-09 | 2023-10-03 | Corelogic Solutions, Llc | Augmented reality application for interacting with building models |
EP3870951A4 (en) * | 2018-10-22 | 2022-07-27 | SeekOps Inc. | A uav-borne, high-bandwidth, lightweight point sensor for quantifying greenhouse gases in atmospheric strata |
WO2020086499A1 (en) | 2018-10-22 | 2020-04-30 | Seekops Inc. | A uav-borne, high-bandwidth, lightweight point sensor for quantifying greenhouse gases in atmospheric strata |
WO2020147085A1 (en) * | 2019-01-17 | 2020-07-23 | 深圳市大疆创新科技有限公司 | Photographing control method and movable platform |
US11994464B2 (en) | 2019-04-05 | 2024-05-28 | Seekops Inc. | Analog signal processing for a lightweight and compact laser-based trace gas sensor |
US12130204B2 (en) | 2019-08-05 | 2024-10-29 | Seekops Inc. | Rapidly deployable UAS system for autonomous inspection operations using a combined payload |
CN112313599A (en) * | 2019-10-31 | 2021-02-02 | 深圳市大疆创新科技有限公司 | Control method, control device and storage medium |
CN110837839A (en) * | 2019-11-04 | 2020-02-25 | 嘉兴职业技术学院 | High-precision unmanned aerial vehicle orthoimage manufacturing and data acquisition method |
KR102147830B1 (en) * | 2019-12-16 | 2020-08-26 | (주)프리뉴 | Integrated control system and method of unmanned aerial vehicle using AI based image processing |
US11614430B2 (en) | 2019-12-19 | 2023-03-28 | Seekops Inc. | Concurrent in-situ measurement of wind speed and trace gases on mobile platforms for localization and qualification of emissions |
US11988598B2 (en) | 2019-12-31 | 2024-05-21 | Seekops Inc. | Optical cell cleaner |
US12055485B2 (en) | 2020-02-05 | 2024-08-06 | Seekops Inc. | Multispecies measurement platform using absorption spectroscopy for measurement of co-emitted trace gases |
US11494977B2 (en) * | 2020-02-28 | 2022-11-08 | Maxar Intelligence Inc. | Automated process for building material detection in remotely sensed imagery |
US12015386B2 (en) | 2020-03-25 | 2024-06-18 | Seekops Inc. | Logarithmic demodulator for laser wavelength-modulation spectroscopy |
US11748866B2 (en) | 2020-07-17 | 2023-09-05 | Seekops Inc. | Systems and methods of automated detection of gas plumes using optical imaging |
CN112033389A (en) * | 2020-08-10 | 2020-12-04 | 山东科技大学 | Deformation settlement monitoring method under gully terrain condition |
US20230373661A1 (en) * | 2020-09-30 | 2023-11-23 | Nippon Telegraph And Telephone Corporation | Propeller guard, flying body, and resilient member |
WO2022070375A1 (en) * | 2020-09-30 | 2022-04-07 | 日本電信電話株式会社 | Propeller guard, flight vehicle, and repulsion mechanism |
JPWO2022070371A1 (en) * | 2020-09-30 | 2022-04-07 | ||
WO2022070371A1 (en) * | 2020-09-30 | 2022-04-07 | 日本電信電話株式会社 | Propeller guard, flight vehicle, and repulsive member |
JP7518426B2 (en) | 2020-09-30 | 2024-07-18 | 日本電信電話株式会社 | Propeller guard, flying object, and rebound member |
JPWO2022070375A1 (en) * | 2020-09-30 | 2022-04-07 | ||
US20230331382A1 (en) * | 2020-09-30 | 2023-10-19 | Nippon Telegraph And Telephone Corporation | Propeller guard, flying body, and resilient mechanism |
US11912432B2 (en) * | 2020-12-10 | 2024-02-27 | Wing Aviation Llc | Systems and methods for autonomous airworthiness pre-flight checks for UAVs |
US20220185499A1 (en) * | 2020-12-10 | 2022-06-16 | Wing Aviation Llc | Systems and Methods for Autonomous Airworthiness Pre-Flight Checks for UAVs |
US12100117B2 (en) | 2021-03-15 | 2024-09-24 | International Business Machines Corporation | Image stitching for high-resolution scans |
US11691721B1 (en) * | 2022-04-29 | 2023-07-04 | Beta Air, Llc | System for propeller parking control for an electric aircraft and a method for its use |
Also Published As
Publication number | Publication date |
---|---|
AU2017206097B2 (en) | 2021-07-08 |
MX2018007935A (en) | 2018-08-09 |
EP3391164A1 (en) | 2018-10-24 |
AU2021245126B2 (en) | 2023-11-09 |
US12079013B2 (en) | 2024-09-03 |
EP3391164A4 (en) | 2019-08-28 |
AU2017206097A1 (en) | 2018-04-26 |
CA3001023A1 (en) | 2017-07-13 |
WO2017120571A1 (en) | 2017-07-13 |
EP3391164B1 (en) | 2021-10-20 |
AU2021245126A1 (en) | 2021-11-04 |
US20220026929A1 (en) | 2022-01-27 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US12079013B2 (en) | Systems and methods for taking, processing, retrieving, and displaying images from unmanned aerial vehicles | |
US12116979B2 (en) | Unmanned aerial vehicle wind turbine inspection systems and methods | |
AU2022291653B2 (en) | A backup navigation system for unmanned aerial vehicles | |
US11854413B2 (en) | Unmanned aerial vehicle visual line of sight control | |
US11017679B2 (en) | Unmanned aerial vehicle visual point cloud navigation | |
Chen et al. | State of technology review of civilian UAVs | |
US20220357753A1 (en) | Drop-off location planning for delivery vehicle | |
EP3243749B1 (en) | Unmanned aerial vehicle (uav) having vertical takeoff and landing (vtol) capability | |
US20200301015A1 (en) | Systems and methods for localization | |
JP2020098567A (en) | Adaptive detection/avoidance system | |
WO2017147142A1 (en) | Unmanned aerial vehicle visual line of sight control | |
US20210011472A1 (en) | System, device and method for time limited communication for remotely controlled vehicles | |
US12148316B2 (en) | Unmanned aerial vehicle visual point cloud navigation | |
US20220343779A1 (en) | System, device and method for time limited communication for remotely controlled vehicles |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: HPS INVESTMENT PARTNERS, LLC, NEW YORK
Free format text: SECOND LIEN PATENT SECURITY AGREEMENT;ASSIGNOR:PICTOMETRY INTERNATIONAL CORP.;REEL/FRAME:046823/0755
Effective date: 20180814
|
AS | Assignment |
Owner name: MORGAN STANLEY SENIOR FUNDING, INC., AS COLLATERAL AGENT, NEW YORK
Free format text: FIRST LIEN PATENT SECURITY AGREEMENT;ASSIGNOR:PICTOMETRY INTERNATIONAL CORP.;REEL/FRAME:046919/0065
Effective date: 20180814
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: NON FINAL ACTION MAILED |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: NON FINAL ACTION MAILED |
|
AS | Assignment |
Owner name: PICTOMETRY INTERNATIONAL CORP., NEW YORK
Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:EAGLE VIEW TECHNOLOGIES, INC.;REEL/FRAME:050371/0167
Effective date: 20170919
Owner name: EAGLE VIEW TECHNOLOGIES, INC., WASHINGTON
Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:PARCHMENT, ANTONY;REEL/FRAME:050371/0102
Effective date: 20150803
Owner name: PICTOMETRY INTERNATIONAL CORP., NEW YORK
Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:FOSTER, MARK A.;GIUFFRIDA, FRANK;SIGNING DATES FROM 20170214 TO 20190215;REEL/FRAME:050371/0222
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: FINAL REJECTION MAILED |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: NON FINAL ACTION MAILED |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: FINAL REJECTION MAILED |
|
STCV | Information on status: appeal procedure |
Free format text: NOTICE OF APPEAL FILED |
|
STCB | Information on status: application discontinuation |
Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |