US9576349B2 - Techniques for atmospheric and solar correction of aerial images - Google Patents
Techniques for atmospheric and solar correction of aerial images
- Publication number
- US9576349B2 (application US12/973,689)
- Authority
- US
- United States
- Prior art keywords
- image
- aerial
- atmospheric
- solar
- images
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active, expires
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T5/00—Image enhancement or restoration
- G06T5/20—Image enhancement or restoration using local operators
-
- G06T5/008—
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C15/00—Surveying instruments or accessories not provided for in groups G01C1/00 - G01C13/00
- G01C15/002—Active optical surveying means
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T5/00—Image enhancement or restoration
- G06T5/50—Image enhancement or restoration using two or more images, e.g. averaging or subtraction
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T5/00—Image enhancement or restoration
- G06T5/70—Denoising; Smoothing
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T5/00—Image enhancement or restoration
- G06T5/90—Dynamic range modification of images or parts thereof
- G06T5/94—Dynamic range modification of images or parts thereof based on local image properties, e.g. for local contrast enhancement
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/60—Analysis of geometric attributes
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/698—Control of cameras or camera modules for achieving an enlarged field of view, e.g. panoramic image capture
-
- H04N5/23238—
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C11/00—Photogrammetry or videogrammetry, e.g. stereogrammetry; Photographic surveying
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C11/00—Photogrammetry or videogrammetry, e.g. stereogrammetry; Photographic surveying
- G01C11/02—Picture taking arrangements specially adapted for photogrammetry or photographic surveying, e.g. controlling overlapping of pictures
-
- G06K9/40—
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10016—Video; Image sequence
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10024—Color image
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10032—Satellite or aerial image; Remote sensing
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20076—Probabilistic image processing
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T3/00—Geometric image transformations in the plane of the image
- G06T3/40—Scaling of whole images or parts thereof, e.g. expanding or contracting
- G06T3/4038—Image mosaicing, e.g. composing plane images from plane sub-images
Definitions
- Aerial imaging refers to capturing images of a surface of a planet from an elevated position, such as an aircraft or satellite. Aerial imaging is used in cartography, such as for providing photogrammetric surveys, which are often a basis for topographic maps. Depending on a given elevation for a camera, aerial imaging may capture undesired visual elements obscuring surface objects in an image. Removing undesired visual elements may be difficult, however, particularly when multiple aerial images are stitched together to form a composite aerial image, such as an “ortho mosaic” or aerial map. It is with respect to these and other considerations that the present improvements have been needed.
- Embodiments are generally directed to advanced image processing techniques for aerial images. Some embodiments are particularly directed to advanced image processing techniques specifically designed to correct atmospheric and solar influences in aerial images. As elevation for a camera increases, the camera may acquire an aerial image with an increasing number of atmospheric and solar influences, such as light scattering and absorption due to aerosols in the atmosphere. The atmospheric and solar influences potentially obscure surface objects in an aerial image acquired by the aerial camera, among other problems.
- an atmospheric and solar component is arranged for execution by a logic device and operative to correct atmospheric and solar artifacts in an aerial image.
- the atmospheric and solar component may comprise, among other elements, an image information component operative to generate an image record for each aerial image of a group of aerial images, the image record comprising statistical information and image context information for each aerial image.
- the atmospheric and solar component may further comprise a filter generation component operative to generate an atmospheric filter and a solar filter from the statistical information and the image context information stored in the image records.
- the atmospheric and solar component may still further comprise an image correction component operative to correct atmospheric and solar artifacts from the aerial image using the respective atmospheric filter and solar filter.
- the atmospheric and solar component may provide enhanced aerial images providing a greater degree of visual acuity and accurate reproduction of surface objects in the aerial images.
- Other embodiments are described and claimed.
- FIG. 1 is a schematic diagram showing aspects of a multi-resolution digital large format camera with multiple detector arrays provided in one embodiment presented herein;
- FIG. 2 is a schematic diagram showing the footprint of a primary camera system overlaid with the footprints of four secondary camera systems in a large format digital camera presented in one embodiment disclosed herein;
- FIG. 3 is a perspective diagram showing a perspective view of the footprint of a primary camera system and the footprints of four secondary camera systems in a large format digital camera presented in one embodiment disclosed herein;
- FIG. 4A is a schematic diagram showing a top-down view that illustrates the overlap between the footprint of a sequence of consecutive images taken with a primary camera system and the footprint of a sequence of consecutive images taken with four secondary camera systems in a large format digital camera presented in one embodiment disclosed herein;
- FIG. 4B is a schematic diagram showing a perspective view that illustrates the overlap between the footprint of a sequence of consecutive images taken along a flight line with a primary camera system and the footprint of a sequence of consecutive images taken with four secondary camera systems in a large format digital camera presented in one embodiment disclosed herein;
- FIG. 5 is a flow diagram showing one illustrative process presented herein for the airborne optical registration of large areas using a multi-resolution digital large format camera with multiple detector arrays provided in one embodiment presented herein;
- FIG. 6 is a block diagram showing an aerial imaging system for correcting aerial images acquired by a multi-resolution digital large format camera with multiple detector arrays provided in one embodiment presented herein;
- FIG. 7 is a block diagram showing an atmospheric and solar component for an aerial imaging system for correcting aerial images acquired by a multi-resolution digital large format camera with multiple detector arrays provided in one embodiment presented herein;
- FIG. 8 is a flow diagram showing one illustrative process presented herein for correcting atmospheric influences in aerial images acquired by a multi-resolution digital large format camera with multiple detector arrays provided in one embodiment presented herein;
- FIG. 9 is a flow diagram showing one illustrative process presented herein for correcting solar influences in aerial images acquired by a multi-resolution digital large format camera with multiple detector arrays provided in one embodiment presented herein;
- FIG. 10 is a block diagram of a computing architecture suitable for implementing an atmospheric and solar component for an aerial imaging system for correcting aerial images acquired by a multi-resolution digital large format camera with multiple detector arrays provided in one embodiment presented herein.
- Embodiments are generally directed to advanced image processing techniques for aerial images. Some embodiments are particularly directed to advanced image processing techniques specifically designed to correct for atmospheric and solar influences of aerial images acquired using, for example, a large format aerial camera.
- an atmosphere for a planet is a hostile environment for aerial image acquisition and processing.
- a large format digital camera may acquire aerial images having undesired visual elements obscuring surface objects in the aerial images.
- the aerial images may have a number of atmospheric and solar influences, such as light scattering and absorption due to aerosols in the atmosphere.
- Light scattering may include, for example, Rayleigh scattering and Mie scattering.
- Aerosols may include gaseous, liquid and solid particles.
- atmospheric and solar influences introduce a range-dependent haze (e.g., brightness changes) and discoloration (e.g., blue-tint) on each pixel in an aerial image.
- This range-dependent haze and discoloration should be corrected and separated from the actual scene content on the ground or surface of the earth. Further, in larger image acquisitions, such as those spanning multiple days, a significant difference in solar irradiance in two or more flight lines may be introduced. This brightness difference makes subsequent correction, such as a bidirectional reflectance distribution function (BRDF) correction, more difficult or expensive to perform. As a result, there is a significant need to correct for atmospheric and solar influences obscuring surface objects in an aerial image acquired by a large format digital camera.
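The correction implied above is linear: each observed pixel value can be modeled as the true scene radiance attenuated by a multiplicative transmission term plus an additive path-radiance (haze) term. The sketch below inverts such a model per channel; the function name and the gain/offset values are illustrative assumptions, not taken from the patent.

```python
import numpy as np

def correct_linear_atmosphere(observed, gain, offset):
    """Invert a linear atmospheric model: observed = gain * scene + offset.

    `gain` captures the multiplicative effect (transmission loss) and
    `offset` the additive effect (path radiance, e.g. blue-tinted haze).
    Values are assumed normalized to [0, 1].
    """
    corrected = (observed - offset) / gain
    return np.clip(corrected, 0.0, 1.0)

# Example: a hazy, blue-tinted pixel (RGB, normalized radiance).
observed = np.array([0.55, 0.60, 0.72])
gain = np.array([0.80, 0.78, 0.70])    # per-channel transmission (made up)
offset = np.array([0.10, 0.14, 0.30])  # per-channel path radiance (made up)
scene = correct_linear_atmosphere(observed, gain, offset)
```

Because the additive offset is largest in the blue channel in this example, inverting the model removes the blue tint along with the overall haze brightening.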
- BRDF bidirectional reflectance distribution function
- advanced image processing techniques attempt to model and derive a robust atmosphere linear mask that is optimal for each pixel in each aerial image, while remaining insensitive to (or independent of) scene content on the ground.
- advanced image processing techniques attempt to derive a robust solar irradiance model and a camera model that can adjust brightness for each image to a preset mean brightness through scaling techniques.
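Adjusting each image to a preset mean brightness through scaling, as described, can be sketched as a simple multiplicative normalization. This is a hypothetical illustration; the target value and array shapes are made up, not from the patent.

```python
import numpy as np

def scale_to_mean(image, target_mean=0.45):
    """Scale an image so its mean brightness matches a preset target.

    A multiplicative adjustment preserves relative contrast within the
    image, unlike an additive shift. Values are assumed in [0, 1].
    """
    current = image.mean()
    if current == 0:
        return image.copy()
    return np.clip(image * (target_mean / current), 0.0, 1.0)

# Two flight lines acquired under different solar irradiance.
bright_line = np.full((4, 4), 0.6)
dark_line = np.full((4, 4), 0.3)
a = scale_to_mean(bright_line)
b = scale_to_mean(dark_line)
```

After scaling, both flight lines share the same mean brightness, which makes subsequent corrections such as BRDF adjustment easier to apply consistently.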
- Advanced image processing techniques as described herein may provide significant advantages over conventional image processing techniques. For example, previous solutions such as dark object subtraction (DOS) techniques derive only an offset for each channel of one aerial image and only model an additive effect of the atmosphere.
- DOS dark object subtraction
- the advanced image processing techniques derive a linear model for each sub-region of an aerial image, and model both an additive effect and a multiplicative effect of the atmosphere.
- the advanced image processing techniques also consider adjacent images along a flight line, and therefore provide greater accuracy and less sensitivity to scene content. In other words, the advanced image processing techniques are locally optimal and globally consistent. Further, the advanced image processing techniques alleviate large brightness differences due to solar irradiance differences from larger image acquisitions spanning lengthy time intervals, such as across several hours or even multiple days.
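To illustrate what "locally optimal and globally consistent" can mean in practice, the sketch below blends each image's own haze-offset estimate with those of its neighbors along the flight line, damping estimates skewed by unusual scene content. This is a hypothetical simplification, not the patented algorithm.

```python
def smooth_along_flight_line(offsets, weight=0.5):
    """Blend each image's estimated haze offset with its flight-line
    neighbors, reducing sensitivity to any single image's scene content.

    `weight` is the fraction kept from the image's own estimate; the
    remainder comes from the average of its adjacent neighbors.
    """
    smoothed = []
    for i, own in enumerate(offsets):
        neighbors = [offsets[j] for j in (i - 1, i + 1)
                     if 0 <= j < len(offsets)]
        if not neighbors:
            smoothed.append(own)
            continue
        neighbor_mean = sum(neighbors) / len(neighbors)
        smoothed.append(weight * own + (1 - weight) * neighbor_mean)
    return smoothed

# One outlier estimate (0.40) among consistent neighbors (~0.10).
raw = [0.10, 0.40, 0.10, 0.12]
adjusted = smooth_along_flight_line(raw)
```

The outlier at index 1 is pulled toward its neighbors, while the consistent estimates change only slightly.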
- a first section provides a detailed description and examples for a large format aerial camera suitable for acquiring and outputting aerial images as illustrated in FIGS. 1-5 .
- a second section provides a detailed description and examples for an aerial imaging system implementing advanced image processing techniques arranged to process aerial images acquired by a large format aerial camera as illustrated in FIGS. 6-10 .
- the aerial imaging system may include an atmospheric and solar component arranged to correct for atmospheric and solar influences in aerial images.
- the corrected aerial images may be used for a number of use scenarios typical for digital aerial images, such as generating an ortho mosaic (aerial map) that is largely free of atmosphere and solar influences, among other use scenarios.
- a large format aerial camera may be used to acquire, capture or record aerial images.
- the aerial images may comprise still images (e.g., pictures) or moving images (e.g., video).
- the large format aerial camera may be suitable for use in airborne optical registration of large surface areas of the earth, such as entire countries, continents, or even the entire world.
- aerial images may be acquired by a large format aerial camera implemented as a multi-resolution large format digital camera having multiple optical systems and detector arrays, such as a MICROSOFT® ULTRACAM-G aerial camera, made by Microsoft Corporation, Redmond, Wash.
- a multi-resolution large format digital camera may be capable of producing aerial images at different photographic scales.
- the multi-resolution large format digital camera can produce panchromatic images having a wide-angle geometry that are suitable for use in a photogrammetric workflow that includes image-based georeferencing and digital surface modeling.
- the multi-resolution large format digital camera can also concurrently produce multiple color images having a narrow-angle geometry suitable for use in a photogrammetric workflow that includes “ortho image” production.
- An ortho image is an image that shows ground objects in an orthographic projection. Because a single flight utilizing the multi-resolution large format digital camera can produce both wide-angle and narrow-angle images, the cost of mapping a large area can be reduced as compared to previous solutions.
- the multi-resolution large format digital camera may include a primary camera system and two or more secondary camera systems.
- the primary camera system is configured for collecting panchromatic image data and the secondary camera systems are configured for collecting color image data.
- Each of the secondary camera systems has an optical system that has a longer focal length than the optical system of the primary camera system.
- the primary camera system and the secondary camera systems may be mounted within a common housing suitable for installation and use within an aircraft.
- the primary camera system has an electro optical detector array capable of capturing the panchromatic image data.
- Each of the secondary camera systems has an electro optical detector array capable of capturing the color image data.
- the resolution of the electro optical detector in each of the secondary camera systems is greater than the resolution of the electro optical detector in the primary camera system.
- the radiometric resolution of the secondary camera systems may be greater than the radiometric resolution of the primary camera system.
- the primary camera system and the secondary camera systems may be configured such that the large format digital camera can produce images at two different image scales offering two different footprints. Images produced by the primary camera system have a larger footprint and are larger in size than those produced by the secondary camera systems and offer information for performing image-based georeferencing by means of photogrammetric triangulation. Images produced by the secondary camera systems have a smaller footprint and are smaller in size than those produced by the primary camera system and offer a high-resolution narrow angle color image. The color images produced by the secondary camera systems may be utilized as a source data set for high-resolution ortho image production. The footprint of the images generated by the secondary camera systems may be configured to overlap the footprint of the primary camera system in a direction perpendicular to a flight path.
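The two image scales follow from the two focal lengths through the standard photogrammetric ground-sample-distance relation, GSD = pixel pitch × altitude / focal length: at the same altitude, the longer-focal-length secondary cameras resolve a smaller ground distance per pixel. A quick sketch with illustrative numbers (none of these values come from the patent):

```python
def ground_sample_distance(pixel_pitch_m, altitude_m, focal_length_m):
    """Ground distance covered by one detector pixel: grows with altitude
    and shrinks with focal length (GSD = pixel_pitch * altitude / focal)."""
    return pixel_pitch_m * altitude_m / focal_length_m

# Hypothetical values: 6-micron pixels at 3000 m flying height.
primary_gsd = ground_sample_distance(6e-6, 3000.0, 0.10)    # 100 mm lens
secondary_gsd = ground_sample_distance(6e-6, 3000.0, 0.30)  # 300 mm lens
```

With these assumed numbers the primary camera yields an 18 cm GSD over a wide footprint, while the secondary cameras yield a 6 cm GSD over a narrower one, matching the wide-angle/narrow-angle division of labor described above.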
- the multi-resolution large format digital camera may be configured to generate a sequence of consecutive images along a flight line.
- the multi-resolution large format digital camera may be further configured such that the primary camera system produces a sequence of consecutive panchromatic images that overlap one another.
- the secondary camera systems may be configured to produce a sequence of consecutive color images that overlap one another and the images produced by the primary camera system. The overlap between consecutive panchromatic images may be greater than the overlap between consecutive color images.
- FIG. 1 is a schematic diagram showing aspects of a large format digital camera 100 having multiple optical systems 106 A- 106 B and detector arrays 110 A- 110 B provided in one embodiment presented herein.
- the large format digital camera 100 includes a primary camera system 104 A and two or more secondary camera systems 104 B- 104 N.
- Although FIG. 1 illustrates two secondary camera systems 104 B- 104 N, it should be appreciated that other embodiments might include additional secondary camera systems.
- the large format digital camera 100 includes four secondary camera systems 104 B- 104 N.
- the primary camera system 104 A includes an optical system 106 A that has a focal length 108 A.
- Each of the secondary camera systems 104 B- 104 N has an optical system 106 B that has a focal length 108 B that is longer than the focal length 108 A of the optical system 106 A.
- the secondary camera systems 104 B- 104 N are configured to produce images having a narrower field of view than images produced by the primary camera system 104 A. Images produced by the primary camera system 104 A have a wider field of view than images produced by the secondary camera systems 104 B- 104 N.
- the optical systems 106 A- 106 B may include other conventional optical elements to produce a suitable image at the desired focal length.
- the primary camera system 104 A is configured with an electro optical detector array 110 A capable of capturing panchromatic image data 112 .
- A panchromatic image sensor, such as the electro optical detector array 110 A, is sensitive to all or most of the visible spectrum.
- each of the secondary camera systems 104 B- 104 N is configured with an electro optical detector array 110 B capable of capturing color image data 116 .
- the secondary camera systems 104 B- 104 N might be equipped with a suitable charge coupled device (“CCD”) array configured for capturing the color image data 116 A- 116 N, respectively.
- the camera system presented herein is a frame camera (also referred to as a framing camera), as opposed to a camera that utilizes push-broom sensing.
- the detector arrays 110 A- 110 B comprise arrays of individual electro-optical detectors, e.g., semiconductor devices that output an electric signal, the magnitude of which is dependent on the intensity of light energy incident on such electro-optical detector. Therefore, the signal from each electro-optical detector in the arrays 110 A- 110 B is indicative of light energy intensity from a pixel area of the portion of the object or terrain being photographed, and the signals from all of the individual electro-optical detectors in the arrays 110 A- 110 B are indicative of light energy intensity from all of the pixel areas of the portion of the object or terrain being photographed.
- the signals from the electro-optical detectors in each of the detector arrays 110 A- 110 B, together, are indicative of the pattern of light energy from the portion of the object being photographed, so a sub-image of the portion of the object can be produced from such signals.
- the signals are amplified, digitized, processed, and stored, as is well known to those of ordinary skill in the art.
- the electro-optical detector arrays 110 A- 110 B are connected electrically by suitable conductors to a control circuit (not shown), which includes at least a microprocessor, input/output circuitry, memory, and a power supply for driving the electro-optical detector arrays 110 A- 110 B, retrieving image data from of the arrays 110 A- 110 B, and storing the image data.
- Other data processing functions, for example combining images and/or performing image display functions, may be accomplished within the large format digital camera 100 or by external data processing equipment.
- the resolution of the electro optical detector arrays 110 B in the secondary camera systems 104 B- 104 N is greater than the resolution of the electro optical detector array 110 A in the primary camera system 104 A.
- the large format digital camera 100 can produce a panchromatic image file 114 from the primary camera system 104 A using a wide-angle geometry that is suitable for use in a photogrammetric workflow that includes image-based georeferencing and digital surface modeling.
- the large format digital camera 100 can also simultaneously produce multiple higher-resolution color image files from the secondary camera systems 104 B- 104 N using a narrow-angle geometry suitable for use in a photogrammetric workflow that includes ortho image production.
- the primary camera system 104 A and the secondary camera systems 104 B- 104 N might be mounted within a common housing 102 .
- a front glass plate 120 might be mounted within the housing 102 to protect the optical systems 106 A- 106 B.
- the primary camera system 104 A and the secondary camera systems 104 B- 104 N are mounted in separate housings (not shown). In both cases, the primary camera system 104 A, the secondary camera systems 104 B- 104 N, and the housing 102 are configured for mounting and use within an aircraft.
- FIG. 2 is a schematic diagram showing the footprint 202 of the primary camera system 104 A overlaid with footprints 204 A- 204 D of the secondary camera systems 104 B- 104 N in the large format digital camera 100 in one embodiment disclosed herein.
- the large format digital camera 100 includes four secondary camera systems 104 B- 104 N configured with the footprints 204 A- 204 D illustrated in FIG. 2 , respectively.
- the primary camera system 104 A and the secondary camera systems 104 B- 104 N are configured in one embodiment such that the large format digital camera 100 can produce overlapping images at two different image scales offering two different footprints 202 and 204 A- 204 D.
- two primary camera systems 104 A and four secondary camera systems 104 B- 104 N are utilized.
- images produced by the primary camera system 104 A have a larger footprint 202 and are larger in size than those produced by the secondary camera systems 104 B- 104 N. Images produced by the secondary camera systems 104 B- 104 N have smaller footprints 204 A- 204 D and are smaller in size than those produced by the primary camera system 104 A and offer a higher resolution narrow angle color image.
- the four secondary camera systems 104 B- 104 N may be configured such that the footprints 204 A- 204 D of the secondary camera systems 104 B- 104 N cover the footprint 202 of the primary camera system 104 A in a direction perpendicular to a flight line 400 .
- the footprints 204 A- 204 D of the four secondary camera systems 104 B- 104 N cover a “stripe” of the footprint 202 of the primary camera system 104 A in a direction perpendicular to the flight line 400 .
- a portion of the images produced by the primary camera system 104 A can be enhanced by the images produced by the secondary camera systems 104 B- 104 N.
- FIG. 3 provides a perspective view of the footprint 202 of the primary camera system 104 A and the footprints 204 A- 204 D of the four secondary camera systems 104 B- 104 N when an image is taken from a common point 302 by the primary camera system 104 A and the four secondary camera systems 104 B- 104 N.
- FIG. 4A shows a top-down view that illustrates the overlap between the footprints 202 A- 202 D of a sequence of consecutive images taken with the primary camera system 104 A and the footprint 204 A- 204 D of a sequence of consecutive images taken with four secondary camera systems 104 B- 104 N in the large format digital camera 100 in one embodiment disclosed herein.
- the large format digital camera 100 may be mounted and configured for use within an aircraft (not shown). When the aircraft is flown according to a well-defined flight line 400 , the large format digital camera 100 may be configured to capture a sequence of images along the flight line 400 .
- FIG. 4A illustrates the footprints 202 A- 202 D of a sequence of images taken using the primary camera system 104 A and the footprints 204 A- 204 D of a sequence of images taken using the four secondary camera systems 104 B- 104 N along the flight line 400.
- the large format camera 100 may be further configured such that the primary camera system 104 A produces a sequence of consecutive panchromatic images that have footprints 202 A- 202 D wherein consecutive sequential images overlap one another.
- the secondary camera systems 104 B- 104 N may similarly be configured to produce a sequence of consecutive color images that have footprints 204 A- 204 D wherein consecutive sequential images overlap one another and also overlap the images produced by the primary camera system 104 A.
- the overlap between the footprints of consecutive panchromatic images may be greater than the overlap between the footprints of consecutive color images.
- FIG. 4B is a perspective diagram illustrating the overlap between the footprints 202 A- 202 D of a sequence of consecutive images taken on several flight lines 400 with the primary camera system 104 A and the footprints 204 A- 204 D of a sequence of consecutive images taken with four secondary camera systems 104 B- 104 N in the large format digital camera 100 in one embodiment disclosed herein. If, as illustrated in FIG. 4B , images are produced by the primary camera system 104 A and the secondary camera systems 104 B- 104 N along multiple well-defined flight lines by means of aerial photogrammetric image acquisition, the footprints 202 of the primary camera system 104 A overlap one another in the sequence of exposures along the flight lines.
- the footprints 204 A- 204 D of the secondary camera systems 104 B- 104 N also overlap with the footprints 202 A- 202 D of the primary camera system 104 A and the footprints 204 A- 204 D of the four secondary camera systems 104 B- 104 N.
- flight line 400 images are therefore produced in such a way that the sequence of images produced by the primary camera system 104 A and the images produced by the secondary camera systems 104 B- 104 N create continuous image strips of overlapping images.
- the flight lines may be defined in such a way that the large format digital camera 100 captures images covering an entire project area.
- image acquisition by the secondary camera systems 104 B- 104 N may be triggered substantially simultaneously with image acquisition by the primary camera system 104 A and, accordingly, images from the secondary camera systems 104 B- 104 N may be acquired at the same position and with the same camera attitude as images from the primary camera system 104 A.
- the trigger for the secondary camera systems 104 B- 104 N may be independent from the primary camera system 104 A, e.g., may be at a higher rate than images captured by the primary camera system. Either embodiment, as well as any combination thereof, is contemplated to be within the scope of embodiments presented herein.
- the images produced by the secondary camera systems 104 B- 104 N may be registered to the images produced by the primary camera system 104 A using the same trigger event. Additionally, images produced by the secondary camera systems 104 B- 104 N may be calibrated to images of the primary camera system 104 A through the use of a precisely surveyed and well-structured object (known as a “calibration object”).
- the images of the secondary camera systems 104 B- 104 N may also be stitched to the images of the primary camera system 104 A using traditional methods. Additionally, the images generated by the primary camera system 104 A can be used to reconstruct the three dimensional form of an object (for instance, the buildings of a city by means of a digital surface model) and the images of the secondary camera systems 104 B- 104 N, with a higher geometric resolution, may be used to extract high resolution photo texture which can then be used for the production of ortho image maps.
- FIG. 5 provides additional details regarding the embodiments presented herein for a large format digital camera 100 having multiple optical systems and detector arrays.
- FIG. 5 is a flow diagram showing a routine 500 that illustrates one process presented herein for the airborne optical registration of large areas using the large format digital camera 100 described above.
- the routine 500 begins at operation 502 , where the large format digital camera 100 is calibrated.
- the large format digital camera 100 may be calibrated using a calibration object such that the footprint of images produced by the secondary camera systems 104 B- 104 N overlap the footprint of images produced by the primary camera system 104 A in the manner discussed above.
- the large format digital camera 100 may be installed in an aircraft and utilized to capture ground images as the aircraft is flown along a well-defined flight line. Such images may be captured and stored in an appropriate digital storage device integrated with or external to the large format digital camera 100 .
- the routine 500 proceeds to operation 504 where panchromatic image files 114 are received from the primary camera system 104 A.
- the routine then proceeds to operation 506 , where the color image files 118 A- 118 N are received from the secondary camera systems 104 B- 104 N.
- the routine 500 proceeds to operation 508 , where the image files 114 from the primary camera system 104 A are co-registered with the image files 118 A- 118 N from the secondary camera systems 104 B- 104 N.
- the routine 500 proceeds to operation 510 , where the image files 114 from the primary camera system 104 A are utilized in a photogrammetric workflow that includes image-based georeferencing and digital surface modeling. From operation 510 , the routine 500 proceeds to operation 512 , where the image files 118 A- 118 N from the secondary camera systems 104 B- 104 N are utilized for ortho image production. The routine 500 proceeds from operation 512 to operation 514 , where it ends.
- FIG. 6 is a block diagram showing an aerial imaging system 600 .
- the aerial imaging system 600 may be used for processing and correcting aerial images 602 - a acquired by an aerial digital camera, such as the large format digital camera 110 , for example. It may be appreciated that other aerial digital cameras may be used as well, and the embodiments are not limited to the exemplary large format digital camera 110 .
- a and “b” and “c” and similar designators as used herein are intended to be variables representing any positive integer.
- for example, if an implementation sets a value of a = 5, then a complete set of aerial images 602 - a may include aerial images 602 - 1 , 602 - 2 , 602 - 3 , 602 - 4 and 602 - 5 .
- the embodiments are not limited in this context.
- the aerial imaging system 600 may comprise a computer-implemented system having one or more components, such as an atmospheric and solar component 610 , for example.
- the terms "system" and "component" are intended to refer to a computer-related entity, comprising either hardware, a combination of hardware and software, software, or software in execution.
- a component can be implemented as a process running on a processor, a processor, a hard disk drive, multiple storage drives of optical and/or magnetic storage medium, an object, an executable, a thread of execution, a program, and/or a computer.
- by way of illustration, both an application running on a server and the server itself can be a component.
- One or more components can reside within a process and/or thread of execution, and a component can be localized on one computer and/or distributed between two or more computers as desired for a given implementation. The embodiments are not limited in this context.
- the aerial imaging system 600 may be implemented as part of an electronic device.
- an electronic device may include without limitation a digital camera, an aerial digital camera, a large format digital camera, a mobile device, a personal digital assistant, a mobile computing device, a smart phone, a cellular telephone, a handset, a one-way pager, a two-way pager, a messaging device, a computer, a personal computer (PC), a desktop computer, a laptop computer, a notebook computer, a handheld computer, a server, a server array or server farm, a web server, a network server, an Internet server, a work station, a mini-computer, a main frame computer, a supercomputer, a network appliance, a web appliance, a distributed computing system, multiprocessor systems, processor-based systems, consumer electronics, programmable consumer electronics, television, digital television, set top box, wireless access point, base station, subscriber station, mobile subscriber center, radio network controller, router, hub, gateway, bridge
- the aerial imaging system 600 may comprise the large format digital camera 110 , the atmospheric and solar component 610 , and a digital display 620 .
- although the aerial imaging system 600 as shown in FIG. 6 has a limited number of elements in a certain topology, it may be appreciated that the aerial imaging system 600 may include more or fewer elements in alternate topologies as desired for a given implementation.
- the aerial imaging system 600 may further comprise other elements typically found in an aerial imaging system or an electronic device, such as computing components, communications components, power supplies, input devices, output devices, and so forth. The embodiments are not limited in this context.
- the large format digital camera 110 may acquire and output a series of aerial images 602 - a , such as aerial images as previously described with reference to FIGS. 1-5 .
- the large format digital camera 110 may be part of a camera platform used to elevate the camera 110 above a surface of the earth.
- the camera platform may comprise an airplane flying a flight mission over a defined flight line, such as flight line 400 , to acquire aerial images 602 - a of a particular surface area of the earth for airborne optical registration.
- a defined flight line may comprise a typical north-south flight line pattern for an aerial photograph acquisition. Such an acquisition produces overlapping aerial images 602 - a , which are subsequently processed into a seamless ortho mosaic or aerial map.
- the atmospheric and solar component 610 may implement various image processing techniques to correct atmospheric and solar influences from the aerial images 602 - a taken by the large format digital camera 110 .
- the atmospheric and solar component 610 derives a robust atmosphere mask which is applied to each pixel of an aerial image 602 - a while remaining insensitive to (or independent of) scene content on the ground.
- the atmospheric and solar component 610 models undesired atmosphere effects with a linear relationship that is related to two physical phenomena due to light scattering and absorption in the atmosphere.
- a first physical phenomenon is additive path radiance caused by scattering which is modeled by an intercept term.
- a second physical phenomenon is a multiplicative attenuation factor caused by scattering and absorption which is modeled by a gain term.
- the atmospheric and solar component 610 uses the linear relationship as a good approximation of a sub-region of an aerial image 602 - a , where a distance to the large format digital camera 110 can be considered a constant and therefore atmosphere effects are the same.
- the atmospheric and solar component 610 models the atmosphere in a single aerial image 602 - a by dividing the aerial image 602 - a into a series of grids. After dividing an aerial image 602 - a into a grid, however, a number of samples in each grid becomes smaller relative to the entire aerial image 602 - a , and the atmosphere influence becomes harder to model given fewer samples. Depending on scene content, statistics derived from each grid might be insufficient to derive a robust atmosphere mask.
- the atmospheric and solar component 610 solves this problem by aggregating statistics of grids across groups of aerial images 602 - a , such as aerial images 602 - a that have been captured close in time and/or on a same flight line. For example, the atmospheric and solar component 610 groups a series of aerial images 602 - a taken along a same flight line. Pixels in one grid have approximately the same distance to the large format digital camera 110 , and therefore undergo the same or similar atmospheric effects. Samples taken from a series of aerial images 602 - a , rather than using a single grid or a single aerial image 602 - a , may be used to derive an atmosphere mask insensitive to scene content. As a result, the atmospheric and solar component 610 can correct for atmosphere effects on each pixel of an aerial image 602 - a by modeling and separating atmospheric effects from scene content.
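The grid pooling described above can be sketched as follows. This is a minimal single-band illustration; the function names and the list-of-lists image representation are assumptions for the sketch, not structures from the patent:

```python
def grid_cell(image, rows, cols, r, c):
    """Pixels of one grid cell of a single-band image (list of row lists)."""
    h, w = len(image), len(image[0])
    return [image[y][x]
            for y in range(r * h // rows, (r + 1) * h // rows)
            for x in range(c * w // cols, (c + 1) * w // cols)]

def pooled_samples(images, rows, cols, r, c):
    """Pool pixels of the same grid cell across a group of images taken on
    the same flight line; pixels in one cell share roughly the same distance
    to the camera, so they undergo the same atmospheric effects."""
    samples = []
    for image in images:
        samples.extend(grid_cell(image, rows, cols, r, c))
    return samples
```

Statistics computed over `pooled_samples` rather than over a single grid of a single image are what make the derived atmosphere mask insensitive to scene content.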
- the atmospheric and solar component 610 derives a robust solar irradiance model and a camera model that can adjust brightness of each aerial image 602 - a to a preset mean brightness.
- the atmospheric and solar component 610 models solar irradiance by using a sun elevation angle with a multiplicative factor.
- the atmospheric and solar component 610 models light received by the large format digital camera 110 using a multiplicative factor comprising an exposure time and an aperture size.
- the two multiplicative factors are then combined into a single factor used to scale each aerial image 602 - a to a common mean brightness. After scaling operations, brightness levels for each aerial image 602 - a are more uniform.
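A minimal sketch of this brightness normalization, assuming mean image brightness is proportional to the combined solar/camera factor (the function names and the group-averaging step are illustrative assumptions, not the patent's formulas):

```python
import math

def combined_factor(sun_elev_deg, exposure_s, aperture_area):
    """Solar model (sun elevation) times camera model (exposure, aperture)."""
    solar = math.sin(math.radians(sun_elev_deg))  # irradiance ~ sin(elevation)
    camera = exposure_s * aperture_area           # light collected by the camera
    return solar * camera

def brightness_gains(observed_means, factors, preset_mean):
    """Per-image gains scaling each image to the preset mean brightness.

    Levelling each observed mean by its combined factor removes sun/exposure
    differences; one global scale then maps the group's common level to
    preset_mean."""
    leveled = [m / f for m, f in zip(observed_means, factors)]
    base = sum(leveled) / len(leveled)            # common brightness level
    return [preset_mean / (base * f) for f in factors]
```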
- the atmospheric and solar component 610 may receive as input one or more aerial images 602 - a from the large format digital camera 110 , perform advanced image processing operations on the aerial images 602 - a , and output corrected aerial images 604 - b .
- the atmospheric and solar component 610 may adjust color and brightness both globally and locally to correct for atmospheric and solar influences in aerial images 602 - a to form corrected aerial images 604 - b .
- the corrected aerial images 604 - b may be used to generate a seamless and color-balanced ortho mosaic production.
- the corrected aerial images 604 - b can be further refined using subsequent image processing techniques, such as BRDF correction and seamline design techniques, for example.
- the aerial imaging system 600 may present the corrected aerial images 604 - b on the display 620 .
- the display 620 may comprise any electronic display for presentation of visual, tactile or auditory information. Examples of the display 620 may include without limitation a cathode ray tube (CRT), bistable display, electronic paper, nixie tube, vector display, a flat panel display, a vacuum fluorescent display, a light-emitting diode (LED) display, electroluminescent (ELD) display, a plasma display panel (PDP), a liquid crystal display (LCD), a thin-film transistor (TFT) display, an organic light-emitting diode (OLED) display, a surface-conduction electron-emitter display (SED), a laser television, carbon nanotubes, nanocrystal displays, a head-mounted display, and any other display consistent with the described embodiments.
- the display 620 may be implemented as a touchscreen display.
- a touchscreen display is an electronic visual display that can detect the presence and location of a touch within the display area. The touch may be from a finger, hand, stylus, light pen, and so forth. The embodiments are not limited in this context.
- FIG. 7 is a more detailed block diagram of the atmospheric and solar component 610 of the aerial imaging system 600 .
- although the atmospheric and solar component 610 as shown in FIG. 7 has a limited number of elements in a certain topology, it may be appreciated that the atmospheric and solar component 610 may include more or fewer elements in alternate topologies as desired for a given implementation.
- the atmospheric and solar component 610 may include, among other elements, an image information component 710 , a filter generation component 720 and an image correction component 730 .
- the components 710 , 720 and 730 may be communicatively coupled via various types of communications media.
- the components 710 , 720 and 730 may coordinate operations between each other. The coordination may involve the uni-directional or bi-directional exchange of information.
- the components 710 , 720 and 730 may communicate information in the form of signals communicated over the communications media.
- the information can be implemented as signals allocated to various signal lines. In such allocations, each message is a signal. Further embodiments, however, may alternatively employ data messages. Such data messages may be sent across various connections. Exemplary connections include parallel interfaces, serial interfaces, and bus interfaces.
- the image information component 710 may be generally arranged to generate image records 742 - c associated with aerial images 602 - a .
- An image record 742 - c may be associated with a single aerial image 602 - a , or in some cases, multiple aerial images 602 - a .
- the image information component 710 may store the image records 742 - c in a datastore 740 .
- the datastore 740 and stored image records 742 - c are accessible by the filter generation component 720 and the image correction component 730 during different image processing phases.
- the image information component 710 may retrieve image context information 712 associated with one or more aerial images 602 - a .
- the image context information 712 may comprise, among other types of information, image information, camera information and camera platform information.
- the image context information 712 may be retrieved from the datastore 740 , an aerial image 602 - a , a file or package containing an aerial image 602 - a , or a remote datastore for another device (e.g., the large format digital camera 110 or an airplane computer).
- the image information component 710 may store the image context information 712 in an image record 742 - c associated with a given aerial image 602 - a.
- the image context information 712 may comprise image information associated with one or more aerial images 602 - a acquired by the large format digital camera 110 .
- image information may include without limitation an image identifier, an image date, an image time, an image location, an image tag, an image label, or other image metadata.
- the image context information 712 may comprise camera information associated with the large format digital camera 110 used to acquire one or more aerial images 602 - a .
- camera information may include without limitation a camera identifier, a camera exposure time (e.g., shutter speed), a camera aperture size, a camera location (e.g., latitude, longitude and altitude coordinates), and other camera metadata.
- the image context information 712 may comprise camera platform information associated with a platform for the large format digital camera 110 used to acquire a group of aerial images 602 - a .
- Examples of camera platform information may include without limitation a camera platform identifier, a flight mission identifier, a flight line identifier, a camera platform location (e.g., latitude, longitude and altitude coordinates), and other camera platform metadata.
- the image information component 710 may retrieve or generate statistical information 714 associated with one or more aerial images 602 - a .
- the image information component 710 may receive one or more aerial images 602 - a , and generate statistical information 714 from information contained within the aerial images 602 - a , such as pixel values.
- the image information component 710 may store the statistical information 714 in a same image record 742 - c used to store image context information 712 associated with a given aerial image 602 - a or set of aerial images 602 - a .
- the image information component 710 may store statistical information 714 in a different image record 742 - c , and link the different image records using a globally unique identifier (GUID).
- the filter generation component 720 may generate an atmospheric filter 722 and a solar filter 724 from statistical information 714 and image context information 712 stored in image records 742 - c .
- each atmospheric filter 722 and solar filter 724 may comprise one or more correction masks derived from the image context information 712 and statistical information 714 .
- Each atmospheric filter 722 and solar filter 724 may be associated with one or more aerial images 602 - a .
- the filter generation component 720 may retrieve information from the image records 742 - c , process the retrieved information to form the atmospheric filter 722 and the solar filter 724 , and store processed information for the filters 722 , 724 in corresponding filter image records 744 - d in the datastore 740 .
- the image correction component 730 may be generally arranged to correct atmospheric and solar artifacts from a target aerial image 732 using a respective atmospheric filter 722 and solar filter 724 .
- a target aerial image 732 may comprise an aerial image that is currently being focused on for a particular set of image processing operations, such as an aerial image for which a set of filters 722 , 724 are being designed, or an aerial image that is currently being corrected.
- the image correction component 730 may retrieve an appropriate atmospheric filter 722 and solar filter 724 from the image records 742 - c and/or the filter image records 744 - d for a given target aerial image 732 , perform correction operations on the target aerial image 732 , and output a corrected target aerial image 734 .
- the corrected target aerial image 734 may be written to the datastore 740 , or transported in real-time to another device.
- Operations for the aerial imaging system 600 and the atmospheric and solar component 610 may be further described with reference to logic flows 800 , 900 . It may be appreciated that the representative logic flows 800 , 900 do not necessarily have to be executed in the order presented, or in any particular order, unless otherwise indicated. Moreover, various activities described with respect to the logic flows 800 , 900 can be executed in serial or parallel fashion.
- the logic flows 800 , 900 may be implemented using one or more hardware elements and/or software elements of the described embodiments or alternative elements as desired for a given set of design and performance constraints.
- the logic flows 800 , 900 may be implemented as logic (e.g., computer program instructions) for execution by a logic device (e.g., a general-purpose or specific-purpose computer).
- FIG. 8 illustrates one embodiment of a logic flow 800 .
- the logic flow 800 may be representative of some or all of the operations executed by one or more embodiments described herein, such as the atmospheric and solar component 610 of the aerial imaging system 600 , for example.
- the logic flow 800 may be implemented by the atmospheric and solar component 610 to correct for atmospheric influences in an aerial image 602 - a.
- the logic flow 800 may receive multiple aerial images at block 802 .
- the atmospheric and solar component 610 may receive multiple aerial images 602 - a from the large format digital camera 110 .
- the aerial images 602 - a may be temporally sequential aerial images 602 - a .
- the embodiments, however, are not limited to this example.
- the logic flow 800 may generate statistical information for each aerial image at block 804 .
- the image information component 710 may generate statistical information 714 for each aerial image 602 - a .
- the atmospheric and solar component 610 may generate statistical information 714 comprising a shadow percentile value and a standard deviation value for each grid within an aerial image 602 - a , among other types of statistical information 714 .
- the image information component 710 may calculate a shadow percentile value for each grid. For example, the image information component 710 may calculate a shadow percentile (e.g., 0.06 percentile) per channel (e.g., R, G, and B) from all pixels in a grid. In some cases, some pixels from a grid may be excluded. For example, if a water mask (e.g., geo-locations of lakes, ocean, rivers, etc.) is available, pixels of water in an image should be excluded from the statistics calculation. If saturated pixel values are known, those pixels should be excluded as well.
- the image information component 710 may calculate a standard deviation value for each grid. For example, the image information component 710 may calculate a standard deviation per channel (e.g., R, G, and B) from all pixels in a grid. In some cases, some pixels from a grid may be excluded, such as those associated with a water mask or saturated pixels.
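The two per-grid statistics above can be sketched together for one channel of one grid. The percentile convention (0.06 interpreted as the 0.06th percentile) and the flat-list pixel layout are assumptions for illustration:

```python
def grid_statistics(pixels, water_mask, percentile=0.06, saturation=255):
    """Shadow percentile and standard deviation for one channel of one grid.

    pixels: DN values in the grid; water_mask: parallel flags, True where a
    pixel is water and must be excluded; saturated pixels are excluded too."""
    valid = sorted(p for p, is_water in zip(pixels, water_mask)
                   if not is_water and p < saturation)
    # index of the (very low) shadow percentile in the sorted valid pixels
    k = min(int(percentile / 100 * len(valid)), len(valid) - 1)
    shadow = valid[k]
    mean = sum(valid) / len(valid)
    std = (sum((p - mean) ** 2 for p in valid) / len(valid)) ** 0.5
    return shadow, std
```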
- the image information component 710 may store the shadow percentile values and the standard deviation values in an image record 742 - c for the aerial image 602 - a .
- the image information component 710 may organize and store a shadow percentile value and a standard deviation value for each grid of an aerial image 602 - a to an associated image record 742 - c .
- the image record 742 - c may implement any known data schema. In one embodiment, for example, the image record 742 - c may be implemented as a comma-separated values (CSV) file.
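One possible shape for such a CSV image record is sketched below; the column layout and helper name are assumptions for illustration, not the patent's schema:

```python
import csv

def write_image_record(path, guid, grid_stats):
    """Append per-grid statistics for one aerial image as CSV rows:
    GUID, grid row, grid column, shadow percentile, standard deviation."""
    with open(path, 'a', newline='') as f:
        writer = csv.writer(f)
        for (row, col), (shadow, std) in sorted(grid_stats.items()):
            writer.writerow([guid, row, col, shadow, std])
```

Indexing each row by the image's GUID is what allows the filter generation and image correction phases to retrieve the record later.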
- Each aerial image 602 - a may have a separate associated image record 742 - c , or a separate record in a compounded image record 742 - c .
- the image record 742 - c of a given aerial image 602 - a can be retrieved for subsequent image processing operations.
- Each image record 742 - c , or record within an image record 742 - c may be indexed by a globally unique identifier (GUID) to facilitate retrieval operations.
- GUID globally unique identifier
- the logic flow 800 may retrieve image context information for each aerial image at block 806 .
- the image information component 710 may retrieve image context information 712 related to the aerial image 602 - a from a datastore or datasource, and write the related image context information 712 to the same image record 742 - c.
- the logic flow 800 may create an atmospheric filter from the statistical information and the image context information at block 808 .
- the filter generation component 720 may create an atmospheric filter 722 from the statistical information 714 and the image context information 712 stored in one or more image records 742 - c.
- the filter generation component 720 may selectively group aerial images 602 - a to form a set of aerial images 726 - e using associated image context information 712 and an ordering algorithm.
- the filter generation component 720 may use an ordering algorithm to produce a set of aerial images 726 - e sharing a nested order comprising a flight mission identifier, a flight line identifier, a camera exposure time, and a camera aperture size.
- Such an ordering algorithm is designed to produce a set of aerial images 726 - e having similar, if not identical, illumination conditions (e.g., from the sun and atmosphere), and exposure settings, since they are continuous in time.
- the filter generation component 720 leverages this time-continuity concept to infer illumination conditions so that atmosphere correction information can be robustly estimated using a larger set of samples generated across the aerial images 726 - e.
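The nested ordering can be sketched with a sort key followed by grouping; the record fields (`mission`, `line`, `exposure`, `aperture`, `time`) are assumed names standing in for the flight mission identifier, flight line identifier, camera exposure time, camera aperture size, and acquisition time:

```python
from itertools import groupby

def group_images(records):
    """Group image records by the nested order (mission, flight line,
    exposure time, aperture size), each group sorted by acquisition time."""
    nested = lambda r: (r['mission'], r['line'], r['exposure'], r['aperture'])
    # sort by the nested key first so groupby sees each group contiguously,
    # with time as the final tie-breaker inside each group
    ordered = sorted(records, key=lambda r: nested(r) + (r['time'],))
    return [list(group) for _, group in groupby(ordered, key=nested)]
```

Each resulting group is a set of aerial images 726 - e with near-identical illumination and exposure conditions, continuous in time.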
- the filter generation component 720 may identify an image window of a subset of aerial images 728 - f from a set of aerial images 726 - e from the stream of received aerial images 602 - a . Once grouped, the filter generation component 720 may sort a set of aerial images 726 - e by time in ascending order. Alternatively, the filter generation component 720 can sort the aerial images 726 - e by image number, provided the aerial images 726 - e are numbered with natural numbers in ascending time order in a consistent manner. Once sorted, the filter generation component 720 may identify or select a subset of aerial images 728 - f from the set of aerial images 726 - e to form an image window.
- the subset of aerial images 728 - f may comprise a number of (k) previous aerial images before a target aerial image 732 .
- the subset of aerial images 728 - f may comprise k/2 aerial images before the target aerial image 732 , and k/2 aerial images after the target aerial image 732 . In either case, if the number of aerial images found is less than k, they can be used but this condition should be marked as an edge condition for quality control purposes.
- a resulting image window comprises a subset of aerial images 728 - f having k+1 images, including a target aerial image 732 as the current (or, in a non-real-time system, center) aerial image.
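The two window-selection variants can be sketched as one function; the name and the boolean edge flag are illustrative assumptions:

```python
def image_window(images, target, k, centered=False):
    """Select a window of up to k+1 images including the target at index
    `target`.

    centered=False: the k images before the target (real-time case);
    centered=True: k/2 images before and k/2 after (non-real-time case)."""
    if centered:
        lo, hi = target - k // 2, target + k // 2 + 1
    else:
        lo, hi = target - k, target + 1
    window = images[max(lo, 0):min(hi, len(images))]
    # fewer than k neighbours found: usable, but flag for quality control
    edge = len(window) < k + 1
    return window, edge
```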
- the target aerial image 732 may comprise an aerial image within an image window for which a set of filters 722 , 724 are being made.
- the filter generation component 720 may generate a grid atmospheric haze additive term value for each grid of a target aerial image 732 using shadow percentile values from a subset of aerial images 728 - f in an image window. For each image window, the filter generation component 720 may search and retrieve all image records 742 - c associated with the subset of aerial images 728 - f in the image window. For each channel, and for each grid, the filter generation component 720 retrieves all the shadow percentile values from the retrieved image records 742 - c , and organizes them into a vector. The filter generation component 720 sorts them in ascending order, and takes the p th percentile. The p th percentile value is a grid atmospheric haze additive term value for a given grid. These operations are repeated for each grid of the target aerial image 732 . The grid atmospheric haze additive term values collectively form a filtered additive haze mask (per-channel) for the target aerial image 732 of the image window.
- the filter generation component 720 may generate a grid atmospheric correction gain term value for each grid of a target aerial image 732 using standard deviation values from a subset of aerial images 728 - f in an image window. For each image window, the filter generation component 720 may search and retrieve all image records 742 - c associated with the subset of aerial images 728 - f in the image window. For each channel, and for each grid, the filter generation component 720 retrieves all the standard deviation values from the retrieved image records 742 - c , and averages the standard deviation values to form a grid atmospheric correction gain term value. These operations are repeated for each grid in the target aerial image 732 .
- the grid atmospheric correction gain term values collectively form one filtered atmospheric attenuation correction mask per channel (e.g., R, G, B, Infrared, Panchromatic, etc.), for the target aerial image 732 of the image window.
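Both per-grid terms can be sketched for one channel; the dict layout of the retrieved statistics and the percentile parameter `p` are assumptions for the sketch:

```python
def haze_and_gain(window_stats, p=0.5):
    """Per-grid haze additive term (p-th percentile of the window's shadow
    percentiles) and gain term (mean of the window's standard deviations),
    for one channel.

    window_stats: one dict per image in the window, mapping a grid index
    to a (shadow_percentile, std_dev) pair."""
    haze, gain = {}, {}
    for g in window_stats[0]:
        # sort shadow percentiles across the window, take the p-th percentile
        shadows = sorted(stats[g][0] for stats in window_stats)
        haze[g] = shadows[int(p * (len(shadows) - 1))]
        # average the standard deviations across the window
        stds = [stats[g][1] for stats in window_stats]
        gain[g] = sum(stds) / len(stds)
    return haze, gain
```

Run over every grid of the target aerial image, the `haze` values form the filtered additive haze mask and the `gain` values the attenuation correction mask for that channel.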
- the filter generation component 720 may generate grid atmospheric haze additive term values and grid atmospheric correction gain term values for all aerial images within a set of aerial images 726 - e using similar operations as described above.
- the filter generation component 720 may store grid atmospheric haze additive term values and grid atmospheric correction gain term values for each target aerial image 732 in a filter image record 744 - d .
- a filter image record 744 - d or a record in a filter image record 744 - d may be indexed by a GUID for an aerial image 726 - e so it may be retrieved for a given aerial image 726 - e during image correction operations.
- the logic flow 800 may correct atmosphere artifacts of an aerial image using the atmospheric filter at block 810 .
- the image correction component 730 may correct atmosphere artifacts of a target aerial image 732 using an atmospheric filter 722 designed for the target aerial image 732 by the filter generation component 720 .
- an atmospheric filter 722 may comprise grid atmospheric haze additive term values and grid atmospheric correction gain term values stored in a filter image record 744 - d associated with the target aerial image 732 .
- the image correction component 730 may correct atmosphere additive haze and multiplicative attenuation for a target aerial image 732 using its atmospheric filter 722 .
- in accordance with Equation (3), L(band,x,y)=g(band,x,y)*(L′(band,x,y)−haze(band,x,y)), where L′ is the measured digital number (DN) value of a given band at pixel location (x,y), L is the atmosphere-corrected value, haze(band,x,y) is the grid atmospheric haze additive term value for a given band at pixel location (x,y), and g(band,x,y) is the grid atmospheric correction gain term value.
- the image correction component 730 may calculate a pixel atmospheric haze additive term value for a pixel of a target aerial image 732 using grid atmospheric haze additive term values stored in a filter image record 744 - d associated with the target aerial image 732 . This may be accomplished in one of two ways. First, the image correction component 730 may perform a bilinear or bicubic interpolation on the filtered additive haze mask of the given aerial image 726 - e . The bilinear or bicubic interpolation may use the four neighboring grids that surround a given pixel location as the four interpolation corners.
- second, when the grid configuration is one-dimensional, a natural cubic spline curve can be fitted to the one-dimensional grids (along the longer-dimension pixel coordinate, e.g., the x-coordinate if the number of rows is one and the y-coordinate if the number of columns is one).
- the natural cubic spline curve can be represented as a look-up table (LUT) indexed by pixel location. In this case, for example, each pixel location with the same x-coordinate value (when the number of rows is one) will have the same value. The haze additive term value at a given pixel location can then be looked up from the LUT.
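The four-corner interpolation case can be sketched with a hypothetical helper, where fx and fy are the pixel's fractional offsets between the surrounding grid centers (not a function from the patent):

```python
def bilinear_at(corners, fx, fy):
    """Bilinear interpolation among the four grid values surrounding a
    pixel: corners is ((top_left, top_right), (bottom_left, bottom_right)),
    and fx, fy in [0, 1] are fractional offsets from the top-left corner."""
    (v00, v10), (v01, v11) = corners
    top = v00 * (1 - fx) + v10 * fx
    bottom = v01 * (1 - fx) + v11 * fx
    return top * (1 - fy) + bottom * fy
```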
- the image correction component 730 may calculate a pixel atmospheric correction gain term value for a pixel of a target aerial image 732 using grid atmospheric correction gain term values stored in a filter image record 744 - d associated with the target aerial image 732 .
- the image correction component 730 may perform a bilinear or bicubic interpolation on each filtered atmospheric correction mask for each channel (e.g., R, G, or B) of the target aerial image 732 to retrieve interpolated values for the masks.
- the natural cubic spline curve fitting as previously described can be used if the grid configuration is one-dimensional.
- the image correction component 730 may correct a pixel of a target aerial image 732 using a pixel atmospheric haze additive term value calculated for the pixel.
- the image correction component 730 may correct a given pixel in accordance with Equation (3) as previously described.
- the image correction component 730 may correct a pixel of a target aerial image 732 using a pixel atmospheric correction gain term value calculated for the pixel.
- the image correction component 730 may correct a given pixel in accordance with Equation (3) as previously described.
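Applied over a whole band, the Equation (3) correction reduces to an element-wise subtract-then-scale. A minimal sketch, assuming the haze and gain term values have already been interpolated into per-pixel arrays:

```python
def correct_band(pixels, haze, gain):
    """Equation (3): L = g * (L' - haze), applied element-wise to one band
    using per-pixel haze additive and gain term values."""
    return [[g * (dn - h) for dn, h, g in zip(row_p, row_h, row_g)]
            for row_p, row_h, row_g in zip(pixels, haze, gain)]
```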
- the image correction component 730 may output a corrected target aerial image 734 .
- the image correction component 730 selects a next target aerial image 732 , and performs correction operations similar to those described above to form a next corrected target aerial image 734 .
- the above correction operations may be performed for each of the aerial images 726 - e.
- FIG. 9 illustrates one embodiment of a logic flow 900 .
- the logic flow 900 may be representative of some or all of the operations executed by one or more embodiments described herein, such as the atmospheric and solar component 610 of the aerial imaging system 600 , for example.
- the logic flow 900 may be implemented by the atmospheric and solar component 610 to correct for solar influences in an aerial image 602 - a.
- the logic flow 900 may receive multiple aerial images at block 902 .
- the atmospheric and solar component 610 may receive multiple aerial images 602 - a from the large format digital camera 110 .
- the aerial images 602 - a may be temporally sequential aerial images 602 - a .
- the embodiments, however, are not limited to this example.
- the logic flow 900 may receive image context information for each aerial image at block 904 .
- the image information component 710 may receive image context information 712 for each aerial image 602 - a .
- the image information component 710 may store image context information 712 for an aerial image 602 - a in a same or different image record 742 - c used to store the statistical information 714 derived for the aerial image 602 - a as described in the logic flow 800 .
- the logic flow 900 may create a solar filter from the image context information at block 906 .
- the filter generation component 720 may create a solar filter 724 for a target aerial image 732 from the image context information 712 .
- the filter generation component 720 may calculate a sun elevation angle value for each aerial image 726 - e .
- the image information component 710 may retrieve image information for a target aerial image 732 from associated image context information 712 , including an image date, an image time, and an image location.
- the image information component 710 may calculate a sun elevation angle (sunZenith) for each aerial image in a set of aerial images 726 - e.
- the filter generation component 720 may calculate an average sun elevation angle value (meanSunZenithDeg) from the multiple sun elevation angle values computed for a set of aerial images 726 - e.
- the filter generation component 720 may calculate a solar flux factor value for a target aerial image 732 from an average sun elevation angle value for a set of aerial images 726 - e .
- fluxFactor=cos(targetSunZenith)/cos(sunZenith) Equation (6) where targetSunZenith and sunZenith are in radians, and meanSunZenithDeg is in degrees.
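Equations (5) and (6) together amount to a degrees-to-radians conversion followed by a ratio of cosines; a small sketch (function and parameter names are illustrative):

```python
import math

def flux_factor(sun_zenith, mean_sun_zenith_deg):
    """Equations (5) and (6): convert the mean sun zenith angle from degrees
    to radians (targetSunZenith), then form the cosine ratio that normalizes
    each image's solar flux toward the set-wide mean."""
    target_sun_zenith = math.pi * mean_sun_zenith_deg / 180.0
    return math.cos(target_sun_zenith) / math.cos(sun_zenith)
```

An image whose sun zenith angle equals the set-wide mean gets a factor of 1.0, i.e., no change.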
- the filter generation component 720 may calculate an average exposure time value and average aperture value for a set of aerial images 726 - e .
- the filter generation component 720 may retrieve camera exposure time (exposureTime) and camera aperture size (aperture) for each aerial image 726 - e using associated image context information 712 , and calculate an average camera exposure time (meanExposureTime) and an average camera aperture size (meanAperture) of all the aerial images 726 - e.
- the filter generation component 720 may calculate an exposure normalization factor value for a target aerial image 732 from an average exposure time value and an average aperture value for a set of aerial images 726 - e .
- the filter generation component 720 may calculate a total normalization factor value for a target aerial image 732 from a solar flux factor value and exposure normalization factor value for the target aerial image 732 .
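Equations (7) and (8) can be sketched as two small helpers; this is an illustrative reading of the patent's formulas, treating aperture as an f-number that enters squared:

```python
def exposure_normalization(exposure_time, aperture,
                           mean_exposure_time, mean_aperture):
    """Equation (7): normalize for camera settings; exposure time enters
    linearly and aperture enters squared."""
    return (mean_exposure_time / exposure_time) * (aperture / mean_aperture) ** 2

def total_normalization(flux_factor_value, op_correction):
    """Equation (8): the total per-image gain is the product of the solar
    flux factor and the exposure normalization factor."""
    return flux_factor_value * op_correction
```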
- the logic flow 900 may correct solar artifacts of an aerial image using the solar filter at block 908 .
- the image correction component 730 may correct solar artifacts of a target aerial image 732 before or after atmospheric correction operations, as described with reference to the logic flow 800 of FIG. 8 .
- the atmospheric and solar component 610 may be arranged to perform further image processing operations on a corrected target aerial image 734 .
- the atmospheric and solar component 610 may be further arranged to perform “hotspot removal” operations, a radiometric process that removes hotspots caused by micro shadows, which produce uneven illumination of aerial images with respect to the camera ray direction and the sun direction.
- the atmospheric and solar component 610 may be arranged to perform ortho rectification operations, a geometric process that corrects an aerial image so that its scale is uniform like a map, without distortion. This is typically performed with a photogrammetric library using a digital elevation model (DEM) and bundle adjustment results.
- the atmospheric and solar component 610 may be arranged to perform ortho mosaic operations, which is a segmentation process to put together multiple ortho rectified images into one seamless mosaic using a max-flow-min-cut theory, for example.
- the atmospheric and solar component 610 may be arranged to perform globally-aware-locally-adaptive (GALA) tone mapping to convert a larger aerial ortho from 16-bit camera range to 8-bit camera range.
- FIG. 10 illustrates an embodiment of an exemplary computing architecture 1000 suitable for implementing various embodiments as previously described.
- the computing architecture 1000 includes various common computing elements, such as one or more processors, co-processors, memory units, chipsets, controllers, peripherals, interfaces, oscillators, timing devices, video cards, audio cards, multimedia input/output (I/O) components, and so forth.
- the computing architecture 1000 comprises a processing unit 1004 , a system memory 1006 and a system bus 1008 .
- the processing unit 1004 can be any of various commercially available processors. Dual microprocessors and other multi-processor architectures may also be employed as the processing unit 1004 .
- the system bus 1008 provides an interface for system components including, but not limited to, the system memory 1006 to the processing unit 1004 .
- the system bus 1008 can be any of several types of bus structure that may further interconnect to a memory bus (with or without a memory controller), a peripheral bus, and a local bus using any of a variety of commercially available bus architectures.
- the system memory 1006 may include various types of memory units, such as read-only memory (ROM), random-access memory (RAM), dynamic RAM (DRAM), Double-Data-Rate DRAM (DDRAM), synchronous DRAM (SDRAM), static RAM (SRAM), programmable ROM (PROM), erasable programmable ROM (EPROM), electrically erasable programmable ROM (EEPROM), flash memory, polymer memory such as ferroelectric polymer memory, ovonic memory, phase change or ferroelectric memory, silicon-oxide-nitride-oxide-silicon (SONOS) memory, magnetic or optical cards, or any other type of media suitable for storing information.
- the system memory 1006 can include non-volatile memory 1010 and/or volatile memory 1012 .
- a basic input/output system (BIOS) can be stored in the non-volatile memory 1010 .
- the computer 1002 may include various types of computer-readable storage media, including an internal hard disk drive (HDD) 1014 , a magnetic floppy disk drive (FDD) 1016 to read from or write to a removable magnetic disk 1018 , and an optical disk drive 1020 to read from or write to a removable optical disk 1022 (e.g., a CD-ROM or DVD).
- the HDD 1014 , FDD 1016 and optical disk drive 1020 can be connected to the system bus 1008 by a HDD interface 1024 , an FDD interface 1026 and an optical drive interface 1028 , respectively.
- the HDD interface 1024 for external drive implementations can include at least one or both of Universal Serial Bus (USB) and IEEE 1394 interface technologies.
- the drives and associated computer-readable media provide volatile and/or nonvolatile storage of data, data structures, computer-executable instructions, and so forth.
- a number of program modules can be stored in the drives and memory units 1010 , 1012 , including an operating system 1030 , one or more application programs 1032 , other program modules 1034 , and program data 1036 .
- the one or more application programs 1032 , other program modules 1034 , and program data 1036 can include, for example, the aerial imaging system 600 .
- a user can enter commands and information into the computer 1002 through one or more wire/wireless input devices, for example, a keyboard 1038 and a pointing device, such as a mouse 1040 .
- Other input devices may include a microphone, an infrared (IR) remote control, a joystick, a game pad, a stylus pen, touch screen, or the like.
- These and other input devices are often connected to the processing unit 1004 through an input device interface 1042 that is coupled to the system bus 1008 , but can be connected by other interfaces such as a parallel port, IEEE 1394 serial port, a game port, a USB port, an IR interface, and so forth.
- a monitor 1044 or other type of display device is also connected to the system bus 1008 via an interface, such as a video adaptor 1046 .
- a computer typically includes other peripheral output devices, such as speakers, printers, and so forth.
- the computer 1002 may operate in a networked environment using logical connections via wire and/or wireless communications to one or more remote computers, such as a remote computer 1048 .
- the remote computer 1048 can be a workstation, a server computer, a router, a personal computer, portable computer, microprocessor-based entertainment appliance, a peer device or other common network node, and typically includes many or all of the elements described relative to the computer 1002 , although, for purposes of brevity, only a memory/storage device 1050 is illustrated.
- the logical connections depicted include wire/wireless connectivity to a local area network (LAN) 1052 and/or larger networks, for example, a wide area network (WAN) 1054 .
- LAN and WAN networking environments are commonplace in offices and companies, and facilitate enterprise-wide computer networks, such as intranets, all of which may connect to a global communications network, for example, the Internet.
- when used in a LAN networking environment, the computer 1002 is connected to the LAN 1052 through a wire and/or wireless communication network interface or adaptor 1056 .
- the adaptor 1056 can facilitate wire and/or wireless communications to the LAN 1052 , which may also include a wireless access point disposed thereon for communicating with the wireless functionality of the adaptor 1056 .
- the computer 1002 can include a modem 1058 , or be connected to a communications server on the WAN 1054 , or have other means for establishing communications over the WAN 1054 , such as by way of the Internet.
- the modem 1058 , which can be internal or external and a wire and/or wireless device, connects to the system bus 1008 via the input device interface 1042 .
- program modules depicted relative to the computer 1002 can be stored in the remote memory/storage device 1050 . It will be appreciated that the network connections shown are exemplary and other means of establishing a communications link between the computers can be used.
- the computer 1002 is operable to communicate with wire and wireless devices or entities using the IEEE 802 family of standards, such as wireless devices operatively disposed in wireless communication (e.g., IEEE 802.11 over-the-air modulation techniques) with, for example, a printer, scanner, desktop and/or portable computer, personal digital assistant (PDA), communications satellite, any piece of equipment or location associated with a wirelessly detectable tag (e.g., a kiosk, news stand, restroom), and telephone.
- the communication can be a predefined structure as with a conventional network or simply an ad hoc communication between at least two devices.
- Wi-Fi networks use radio technologies called IEEE 802.11x (a, b, g, etc.) to provide secure, reliable, fast wireless connectivity.
- a Wi-Fi network can be used to connect computers to each other, to the Internet, and to wire networks (which use IEEE 802.3-related media and functions).
- Various embodiments may be implemented using hardware elements, software elements, or a combination of both.
- hardware elements may include devices, components, processors, microprocessors, circuits, circuit elements (e.g., transistors, resistors, capacitors, inductors, and so forth), integrated circuits, application specific integrated circuits (ASIC), programmable logic devices (PLD), digital signal processors (DSP), field programmable gate array (FPGA), memory units, logic gates, registers, semiconductor device, chips, microchips, chip sets, and so forth.
- Examples of software elements may include software components, programs, applications, computer programs, application programs, system programs, machine programs, operating system software, middleware, firmware, software modules, routines, subroutines, functions, methods, procedures, software interfaces, application program interfaces (API), instruction sets, computing code, computer code, code segments, computer code segments, words, values, symbols, or any combination thereof. Determining whether an embodiment is implemented using hardware elements and/or software elements may vary in accordance with any number of factors, such as desired computational rate, power levels, heat tolerances, processing cycle budget, input data rates, output data rates, memory resources, data bus speeds and other design or performance constraints, as desired for a given implementation.
- An article of manufacture may comprise a storage medium to store logic.
- Examples of a storage medium may include one or more types of computer-readable storage media capable of storing electronic data, including volatile memory or non-volatile memory, removable or non-removable memory, erasable or non-erasable memory, writeable or re-writeable memory, and so forth.
- Examples of the logic may include various software elements, such as software components, programs, applications, computer programs, application programs, system programs, machine programs, operating system software, middleware, firmware, software modules, routines, subroutines, functions, methods, procedures, software interfaces, application program interfaces (API), instruction sets, computing code, computer code, code segments, computer code segments, words, values, symbols, or any combination thereof.
- an article of manufacture may store executable computer program instructions that, when executed by a computer, cause the computer to perform methods and/or operations in accordance with the described embodiments.
- the executable computer program instructions may include any suitable type of code, such as source code, compiled code, interpreted code, executable code, static code, dynamic code, and the like.
- the executable computer program instructions may be implemented according to a predefined computer language, manner or syntax, for instructing a computer to perform a certain function.
- the instructions may be implemented using any suitable high-level, low-level, object-oriented, visual, compiled and/or interpreted programming language.
- Some embodiments may be described using the terms “coupled” and “connected” along with their derivatives. These terms are not necessarily intended as synonyms for each other. For example, some embodiments may be described using the terms “connected” and/or “coupled” to indicate that two or more elements are in direct physical or electrical contact with each other. The term “coupled,” however, may also mean that two or more elements are not in direct contact with each other, but yet still co-operate or interact with each other.
Description
Grid_center_x(column)=(image_width/m)*column+(image_width/2/m) Equation (1)
Grid_center_y(row)=(image_height/n)*row+(image_height/2/n) Equation (2)
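Equations (1) and (2) place grid centers on a uniform m-by-n lattice; as a sketch (function name is illustrative):

```python
def grid_center(image_width, image_height, m, n, column, row):
    """Equations (1) and (2): center pixel coordinates of grid
    (column, row) when the image is divided into m columns and n rows."""
    cx = (image_width / m) * column + image_width / (2 * m)
    cy = (image_height / n) * row + image_height / (2 * n)
    return cx, cy
```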
L(band,x,y)=g(band,x,y)*(L′(band,x,y)−haze(band,x,y)) Equation (3)
where L′ is the measured digital number (DN) value of a given band at pixel location (x,y), L is the atmosphere-corrected value, haze(band,x,y) is the grid atmospheric haze additive term value for a given band at pixel location (x,y), and g(band,x,y) is the grid atmospheric correction gain term value.
g(band,x,y)=std_dev_ref(x,y)/std_dev(band,x,y) Equation (4)
where std_dev_ref(x,y) is the reference value chosen from one of the three bands (e.g., green is chosen because human eyes are more sensitive to green).
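Equation (4) is a simple per-grid, per-band ratio; an illustrative sketch:

```python
def gain_term(std_dev_ref, std_dev_band):
    """Equation (4): the gain is the reference band's standard deviation
    (e.g., green) divided by the band's own standard deviation, boosting
    bands whose contrast was attenuated more by the atmosphere."""
    return std_dev_ref / std_dev_band
```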
targetSunZenith=M_PI*meanSunZenithDeg/180.0 Equation (5)
fluxFactor=cos(targetSunZenith)/cos(sunZenith) Equation (6)
where targetSunZenith and sunZenith are in radians, and meanSunZenithDeg is in degrees.
opCorrection=(meanExposureTime/exposureTime)*(aperture/meanAperture)*(aperture/meanAperture) Equation (7)
g_norm(image)=fluxFactor(image)*opCorrection(image) Equation (8)
L_final(band,x,y)=L(band,x,y)*g_norm(x,y) Equation (9)
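Equation (9) applies the per-image total normalization gain to every atmosphere-corrected pixel value; a minimal sketch:

```python
def apply_solar_normalization(pixels, g_norm):
    """Equation (9): scale each atmosphere-corrected value L by the image's
    total normalization gain g_norm to obtain L_final."""
    return [[value * g_norm for value in row] for row in pixels]
```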
Claims (20)
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US12/973,689 US9576349B2 (en) | 2010-12-20 | 2010-12-20 | Techniques for atmospheric and solar correction of aerial images |
CN201110429145.XA CN102567967B (en) | 2010-12-20 | 2011-12-20 | For the air of spatial image and the technology of solar correction |
US15/437,140 US11017503B2 (en) | 2010-12-20 | 2017-02-20 | Techniques for atmospheric and solar correction of aerial images |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US12/973,689 US9576349B2 (en) | 2010-12-20 | 2010-12-20 | Techniques for atmospheric and solar correction of aerial images |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/437,140 Continuation US11017503B2 (en) | 2010-12-20 | 2017-02-20 | Techniques for atmospheric and solar correction of aerial images |
Publications (2)
Publication Number | Publication Date |
---|---|
US20120154584A1 (en) | 2012-06-21 |
US9576349B2 (en) | 2017-02-21 |
Family
ID=46233887
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US12/973,689 Active 2033-05-04 US9576349B2 (en) | 2010-12-20 | 2010-12-20 | Techniques for atmospheric and solar correction of aerial images |
US15/437,140 Active 2031-05-29 US11017503B2 (en) | 2010-12-20 | 2017-02-20 | Techniques for atmospheric and solar correction of aerial images |
Family Applications After (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/437,140 Active 2031-05-29 US11017503B2 (en) | 2010-12-20 | 2017-02-20 | Techniques for atmospheric and solar correction of aerial images |
Country Status (2)
Country | Link |
---|---|
US (2) | US9576349B2 (en) |
CN (1) | CN102567967B (en) |
Citations (13)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20020096622A1 (en) * | 2001-01-23 | 2002-07-25 | Steven Adler-Golden | Methods for atmospheric correction of solar-wavelength Hyperspectral imagery over land |
US6484099B1 (en) * | 1999-07-16 | 2002-11-19 | Deutsches Zentrum Fur Luft -Und Raumfahrt E.V. | Process for correcting atmospheric influences in multispectral optical remote sensing data |
US6757445B1 (en) * | 2000-10-04 | 2004-06-29 | Pixxures, Inc. | Method and apparatus for producing digital orthophotos using sparse stereo configurations and external models |
US20050265631A1 (en) * | 2002-09-19 | 2005-12-01 | Mai Tuy V | System and method for mosaicing digital ortho-images |
US20080100803A1 (en) | 1999-12-03 | 2008-05-01 | Manfred Dick | Method for determining vision defects and for collecting data for correcting vision defects of the eye by interaction of a patient with an examiner and apparatus therefor |
US20080123990A1 (en) | 2006-03-23 | 2008-05-29 | Industry-Academic Cooperation Foundation, Yonsei University | Method and apparatus of correcting geometry of an image |
US20090232349A1 (en) * | 2008-01-08 | 2009-09-17 | Robert Moses | High Volume Earth Observation Image Processing |
US7653218B1 (en) | 2006-05-02 | 2010-01-26 | Orbimage Si Opco, Inc. | Semi-automatic extraction of linear features from image data |
US20100092045A1 (en) | 2008-10-15 | 2010-04-15 | The Boeing Company | System and method for airport mapping database automatic change detection |
CN101908210A (en) | 2010-08-13 | 2010-12-08 | 北京工业大学 | Method and system for color image defogging treatment |
US20110025919A1 (en) * | 2009-07-31 | 2011-02-03 | Vorontsov Mikhail A | Automated Video Data Fusion Method |
US20110188775A1 (en) * | 2010-02-01 | 2011-08-04 | Microsoft Corporation | Single Image Haze Removal Using Dark Channel Priors |
US9036861B2 (en) * | 2010-04-22 | 2015-05-19 | The University Of North Carolina At Charlotte | Method and system for remotely inspecting bridges and other structures |
Family Cites Families (33)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US3030061A (en) | 1961-02-20 | 1962-04-17 | Economy Forms Corp | Adjustable brace connector unit |
US5841911A (en) * | 1995-06-06 | 1998-11-24 | Ben Gurion, University Of The Negev | Method for the restoration of images disturbed by the atmosphere |
US6268093B1 (en) * | 1999-10-13 | 2001-07-31 | Applied Materials, Inc. | Method for reticle inspection using aerial imaging |
US6834122B2 (en) * | 2000-01-22 | 2004-12-21 | Kairos Scientific, Inc. | Visualization and processing of multidimensional data using prefiltering and sorting criteria |
US7019777B2 (en) * | 2000-04-21 | 2006-03-28 | Flight Landata, Inc. | Multispectral imaging system with spatial resolution enhancement |
US7593835B2 (en) * | 2001-04-20 | 2009-09-22 | Spectral Sciences, Inc. | Reformulated atmospheric band model method for modeling atmospheric propagation at arbitrarily fine spectral resolution and expanded capabilities. |
US7072502B2 (en) * | 2001-06-07 | 2006-07-04 | Applied Materials, Inc. | Alternating phase-shift mask inspection method and apparatus |
US20050175253A1 (en) * | 2002-01-22 | 2005-08-11 | National University Of Singapore | Method for producing cloud free and cloud-shadow free images |
US7725258B2 (en) * | 2002-09-20 | 2010-05-25 | M7 Visual Intelligence, L.P. | Vehicle based data collection and processing system and imaging sensor system and methods thereof |
US6909815B2 (en) * | 2003-01-31 | 2005-06-21 | Spectral Sciences, Inc. | Method for performing automated in-scene based atmospheric compensation for multi-and hyperspectral imaging sensors in the solar reflective spectral region |
WO2005054799A2 (en) * | 2003-11-26 | 2005-06-16 | Florida Environmental Research Institute, Inc. | Spectral imaging system |
US7697759B2 (en) * | 2004-05-11 | 2010-04-13 | The United States Of America As Represented By The Administrator Of The National Aeronautics And Space Administration | Split-remerge method for eliminating processing window artifacts in recursive hierarchical segmentation |
US20060126959A1 (en) * | 2004-12-13 | 2006-06-15 | Digitalglobe, Inc. | Method and apparatus for enhancing a digital image |
US7564017B2 (en) * | 2005-06-03 | 2009-07-21 | Brion Technologies, Inc. | System and method for characterizing aerial image quality in a lithography system |
EP1977393A4 (en) * | 2006-01-18 | 2013-05-08 | Technion Res & Dev Foundation | System and method for dehazing |
CN101501703B (en) * | 2006-02-01 | 2012-07-04 | Applied Materials Israel, Ltd. | Method and system for evaluating a variation in a parameter of a pattern |
US7965902B1 (en) * | 2006-05-19 | 2011-06-21 | Google Inc. | Large-scale image processing using mass parallelization techniques |
WO2010004677A1 (en) * | 2008-07-08 | 2010-01-14 | Panasonic Corporation | Image processing method, image processing device, image processing program, image synthesis method, and image synthesis device |
US8073279B2 (en) * | 2008-07-08 | 2011-12-06 | Harris Corporation | Automated atmospheric characterization of remotely sensed multi-spectral imagery |
US8396324B2 (en) * | 2008-08-18 | 2013-03-12 | Samsung Techwin Co., Ltd. | Image processing method and apparatus for correcting distortion caused by air particles as in fog |
US8117010B2 (en) * | 2008-12-05 | 2012-02-14 | Honeywell International Inc. | Spectral signal detection system |
US20100177095A1 (en) * | 2009-01-14 | 2010-07-15 | Harris Corporation | Geospatial modeling system for reducing shadows and other obscuration artifacts and related methods |
US8447129B2 (en) * | 2009-03-18 | 2013-05-21 | Florida Institute Of Technology | High-speed diversity-based imaging method for parallel atmospheric turbulence compensation |
US8350933B2 (en) * | 2009-04-08 | 2013-01-08 | Yissum Research Development Company Of The Hebrew University Of Jerusalem, Ltd. | Method, apparatus and computer program product for single image de-hazing |
JP5428618B2 (en) * | 2009-07-29 | 2014-02-26 | ソニー株式会社 | Image processing apparatus, imaging apparatus, image processing method, and program |
US9097792B2 (en) * | 2009-08-12 | 2015-08-04 | The Johns Hopkins University | System and method for atmospheric correction of information |
US8588551B2 (en) * | 2010-03-01 | 2013-11-19 | Microsoft Corp. | Multi-image sharpening and denoising using lucky imaging |
US9576349B2 (en) | 2010-12-20 | 2017-02-21 | Microsoft Technology Licensing, Llc | Techniques for atmospheric and solar correction of aerial images |
US8666190B1 (en) * | 2011-02-01 | 2014-03-04 | Google Inc. | Local black points in aerial imagery |
US8897543B1 (en) * | 2012-05-18 | 2014-11-25 | Google Inc. | Bundle adjustment based on image capture intervals |
US8755628B2 (en) * | 2012-09-10 | 2014-06-17 | Google Inc. | Image de-hazing by solving transmission value |
US9396528B2 (en) * | 2013-03-15 | 2016-07-19 | Digitalglobe, Inc. | Atmospheric compensation in satellite imagery |
US9449244B2 (en) * | 2013-12-11 | 2016-09-20 | Her Majesty The Queen In Right Of Canada, As Represented By The Minister Of National Defense | Methods for in-scene atmospheric compensation by endmember matching |
2010
- 2010-12-20 US US12/973,689 patent/US9576349B2/en Active
2011
- 2011-12-20 CN CN201110429145.XA patent/CN102567967B/en Active
2017
- 2017-02-20 US US15/437,140 patent/US11017503B2/en Active
Patent Citations (13)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6484099B1 (en) * | 1999-07-16 | 2002-11-19 | Deutsches Zentrum Fur Luft -Und Raumfahrt E.V. | Process for correcting atmospheric influences in multispectral optical remote sensing data |
US20080100803A1 (en) | 1999-12-03 | 2008-05-01 | Manfred Dick | Method for determining vision defects and for collecting data for correcting vision defects of the eye by interaction of a patient with an examiner and apparatus therefor |
US6757445B1 (en) * | 2000-10-04 | 2004-06-29 | Pixxures, Inc. | Method and apparatus for producing digital orthophotos using sparse stereo configurations and external models |
US20020096622A1 (en) * | 2001-01-23 | 2002-07-25 | Steven Adler-Golden | Methods for atmospheric correction of solar-wavelength Hyperspectral imagery over land |
US20050265631A1 (en) * | 2002-09-19 | 2005-12-01 | Mai Tuy V | System and method for mosaicing digital ortho-images |
US20080123990A1 (en) | 2006-03-23 | 2008-05-29 | Industry-Academic Cooperation Foundation, Yonsei University | Method and apparatus of correcting geometry of an image |
US7653218B1 (en) | 2006-05-02 | 2010-01-26 | Orbimage Si Opco, Inc. | Semi-automatic extraction of linear features from image data |
US20090232349A1 (en) * | 2008-01-08 | 2009-09-17 | Robert Moses | High Volume Earth Observation Image Processing |
US20100092045A1 (en) | 2008-10-15 | 2010-04-15 | The Boeing Company | System and method for airport mapping database automatic change detection |
US20110025919A1 (en) * | 2009-07-31 | 2011-02-03 | Vorontsov Mikhail A | Automated Video Data Fusion Method |
US20110188775A1 (en) * | 2010-02-01 | 2011-08-04 | Microsoft Corporation | Single Image Haze Removal Using Dark Channel Priors |
US9036861B2 (en) * | 2010-04-22 | 2015-05-19 | The University Of North Carolina At Charlotte | Method and system for remotely inspecting bridges and other structures |
CN101908210A (en) | 2010-08-13 | 2010-12-08 | 北京工业大学 | Method and system for color image defogging treatment |
Non-Patent Citations (5)
Title |
---|
Atmosphere and shading compensation of satellite images, Published: Apr. 11, 2008, https://www.imagico.de/pov/earth-atmosphere.html. |
Conferences/Web Events: Microsoft's Vexcel Imaging GmbH Reports Sales of 30 UltraCam Systems for Its 2010 Fiscal Year, Published: Aug. 19, 2010, https://www.microsoft.com/ultracam/en-us/Aug1030Ultracams.aspx. |
First Office Action and Search Report received for China Patent Application No. 201110429145.X, mailed Dec. 17, 2013, 9 pages (without English translation). |
Haest, Birgen, et al., "Radiometric Calibration of Digital Photogrammetric Camera Image Data", ASPRS 2009 Annual Conference, Mar. 2009, 13 pages. |
Second Office Action and Search Report received for China Patent Application No. 201110429145.X, mailed Aug. 1, 2014, 10 pages, including 3 pages of English translation. |
Cited By (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11017503B2 (en) | 2010-12-20 | 2021-05-25 | Microsoft Technology Licensing, LLC | Techniques for atmospheric and solar correction of aerial images |
US10109224B1 (en) * | 2015-06-26 | 2018-10-23 | Jayant Ratti | Methods and devices for using aerial vehicles to create graphic displays and reconfigurable 3D structures |
US10600162B2 (en) * | 2016-12-29 | 2020-03-24 | Konica Minolta Laboratory U.S.A., Inc. | Method and system to compensate for bidirectional reflectance distribution function (BRDF) |
CN110769577A (en) * | 2019-10-18 | 2020-02-07 | OPPO (Chongqing) Intelligent Technology Co., Ltd. | Atmosphere lamp control method and device |
CN110769577B (en) * | 2019-10-18 | 2022-02-25 | OPPO (Chongqing) Intelligent Technology Co., Ltd. | Atmosphere lamp control method and device |
Also Published As
Publication number | Publication date |
---|---|
CN102567967B (en) | 2015-10-28 |
US20170161878A1 (en) | 2017-06-08 |
US11017503B2 (en) | 2021-05-25 |
CN102567967A (en) | 2012-07-11 |
US20120154584A1 (en) | 2012-06-21 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US11017503B2 (en) | Techniques for atmospheric and solar correction of aerial images |
US10127685B2 (en) | Profile matching of buildings and urban structures | |
US10356317B2 (en) | Wide-scale terrestrial light-field imaging of the sky | |
US20170031056A1 (en) | Solar Energy Forecasting | |
US10013785B2 (en) | Methods and systems for object based geometric fitting | |
US8665316B2 (en) | Multi-resolution digital large format camera with multiple detector arrays | |
US10902660B2 (en) | Determining and presenting solar flux information | |
JP5283214B2 (en) | Fixed point observation apparatus and fixed point observation method | |
Wulder et al. | Digital high spatial resolution aerial imagery to support forest health monitoring: the mountain pine beetle context | |
US10356343B2 (en) | Methods and system for geometric distortion correction for space-based rolling-shutter framing sensors | |
CN111612901A (en) | Feature extraction and generation method for geographic information images |
CN111598777A (en) | Sky cloud image processing method, computer device and readable storage medium | |
Arietta | Estimation of forest canopy structure and understory light using spherical panorama images from smartphone photography | |
CN104580920A (en) | Imaging processing method and user terminal | |
Liu et al. | High-spatial-resolution nighttime light dataset acquisition based on volunteered passenger aircraft remote sensing | |
CN109785439A (en) | Face sketch image generation method and related product |
CN109377476A (en) | The dynamic threshold acquisition methods and device of remote sensing image cloud detection characteristic parameter | |
Deng et al. | Automatic true orthophoto generation based on three-dimensional building model using multiview urban aerial images | |
CN112785678B (en) | Sunlight analysis method and system based on three-dimensional simulation | |
Fedorov et al. | Snow phenomena modeling through online public media | |
Chijioke | Satellite remote sensing technology in spatial modeling process: technique and procedures | |
WO2020061186A1 (en) | An apparatus, methodologies and software applications for determining a level of direct sunlight | |
Wakter et al. | A novel shade analysis technique for solar photovoltaic systems | |
Schulz et al. | Automatic cloud top height determination in mountainous areas using a cost-effective time-lapse camera system | |
Roupioz et al. | Quantifying the impact of cloud cover on ground radiation flux measurements using hemispherical images |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: MICROSOFT CORPORATION, WASHINGTON Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:OMER, IDO;LIU, YUXIANG;SCHICKLER, WOLFGANG;AND OTHERS;SIGNING DATES FROM 20101217 TO 20101218;REEL/FRAME:025531/0791 |
|
AS | Assignment |
Owner name: MICROSOFT CORPORATION, WASHINGTON Free format text: CORRECTIVE ASSIGNMENT TO CORRECT THE FIRST NAME OF INVENTOR LEDNER WAS RECORDED AS ROBER ON THE PAGE 1 OF THE NOTICE OF RECORDATION PREVIOUSLY RECORDED ON REEL 025531 FRAME 0791. ASSIGNOR(S) HEREBY CONFIRMS THE FIRST NAME OF INVENTOR LEDNER IS ROBERT AS IN ROBERT LEDNER;ASSIGNORS:OMER, IDO;LIU, YUXIANG;SCHICKLER, WOLFGANG;AND OTHERS;SIGNING DATES FROM 20101210 TO 20101218;REEL/FRAME:027346/0973 |
|
AS | Assignment |
Owner name: MICROSOFT CORPORATION, WASHINGTON Free format text: CORRECTIVE ASSIGNMENT TO CORRECT THE FIRST NAME OF INVENTOR LEDNER WAS RECORDED AS ROBER ON THE PAGE 1 OF THE NOTICE OF RECORDATION PREVIOUSLY RECORDED ON REEL 025531 FRAME 0791. ASSIGNOR(S) HEREBY CONFIRMS THE FIRST NAME OF INVENTOR LEDNER IS ROBERT AS IN ROBERT LEDNER SIGNED ON 12/17/2010;ASSIGNORS:OMER, IDO;LIU, YUXIANG;SCHICKLER, WOLFGANG;AND OTHERS;SIGNING DATES FROM 20101217 TO 20101218;REEL/FRAME:027409/0676 |
|
AS | Assignment |
Owner name: MICROSOFT TECHNOLOGY LICENSING, LLC, WASHINGTON Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:MICROSOFT CORPORATION;REEL/FRAME:034544/0001 Effective date: 20141014 |
|
STCF | Information on status: patent grant |
Free format text: PATENTED CASE |
|
MAFP | Maintenance fee payment |
Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1551); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY Year of fee payment: 4 |
|
FEPP | Fee payment procedure |
Free format text: MAINTENANCE FEE REMINDER MAILED (ORIGINAL EVENT CODE: REM.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY |