US9576349B2 - Techniques for atmospheric and solar correction of aerial images - Google Patents

Techniques for atmospheric and solar correction of aerial images

Info

Publication number
US9576349B2
US9576349B2
Authority
US
United States
Prior art keywords
image
aerial
atmospheric
solar
images
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active, expires
Application number
US12/973,689
Other versions
US20120154584A1 (en)
Inventor
Ido Omer
Yuxiang Liu
Wolfgang Schickler
Robert Ledner
Leon Rosenshein
David Simons
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Microsoft Technology Licensing LLC
Original Assignee
Microsoft Technology Licensing LLC
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Assigned to MICROSOFT CORPORATION reassignment MICROSOFT CORPORATION ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: OMER, IDO, LEDNER, ROBER, LIU, YUXIANG, ROSENSHEIN, LEON, SCHICKLER, WOLFGANG, SIMONS, DAVID
Priority to US12/973,689
Application filed by Microsoft Technology Licensing LLC filed Critical Microsoft Technology Licensing LLC
Assigned to MICROSOFT CORPORATION reassignment MICROSOFT CORPORATION CORRECTIVE ASSIGNMENT TO CORRECT THE FIRST NAME OF INVENTOR LEDNER WAS RECORDED AS ROBER ON THE PAGE 1 OF THE NOTICE OF RECORDATION PREVIOUSLY RECORDED ON REEL 025531 FRAME 0791. ASSIGNOR(S) HEREBY CONFIRMS THE FIRST NAME OF INVENTOR LEDNER IS ROBERT AS IN ROBERT LEDNER. Assignors: OMER, IDO, LIU, YUXIANG, ROSENSHEIN, LEON, SCHICKLER, WOLFGANG, SIMONS, DAVID, LEDNER, ROBERT
Assigned to MICROSOFT CORPORATION reassignment MICROSOFT CORPORATION CORRECTIVE ASSIGNMENT TO CORRECT THE FIRST NAME OF INVENTOR LEDNER WAS RECORDED AS ROBER ON THE PAGE 1 OF THE NOTICE OF RECORDATION PREVIOUSLY RECORDED ON REEL 025531 FRAME 0791. ASSIGNOR(S) HEREBY CONFIRMS THE FIRST NAME OF INVENTOR LEDNER IS ROBERT AS IN ROBERT LEDNER SIGNED ON 12/17/2010. Assignors: OMER, IDO, LEDNER, ROBERT, LIU, YUXIANG, ROSENSHEIN, LEON, SCHICKLER, WOLFGANG, SIMONS, DAVID
Priority to CN201110429145.XA
Publication of US20120154584A1
Assigned to MICROSOFT TECHNOLOGY LICENSING, LLC reassignment MICROSOFT TECHNOLOGY LICENSING, LLC ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: MICROSOFT CORPORATION
Priority to US15/437,140
Publication of US9576349B2
Application granted
Legal status: Active
Expiration date adjusted

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00Image enhancement or restoration
    • G06T5/20Image enhancement or restoration using local operators
    • G06T5/008
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C15/00Surveying instruments or accessories not provided for in groups G01C1/00 - G01C13/00
    • G01C15/002Active optical surveying means
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00Image enhancement or restoration
    • G06T5/50Image enhancement or restoration using two or more images, e.g. averaging or subtraction
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00Image enhancement or restoration
    • G06T5/70Denoising; Smoothing
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00Image enhancement or restoration
    • G06T5/90Dynamic range modification of images or parts thereof
    • G06T5/94Dynamic range modification of images or parts thereof based on local image properties, e.g. for local contrast enhancement
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/60Analysis of geometric attributes
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/698Control of cameras or camera modules for achieving an enlarged field of view, e.g. panoramic image capture
    • H04N5/23238
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C11/00Photogrammetry or videogrammetry, e.g. stereogrammetry; Photographic surveying
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C11/00Photogrammetry or videogrammetry, e.g. stereogrammetry; Photographic surveying
    • G01C11/02Picture taking arrangements specially adapted for photogrammetry or photographic surveying, e.g. controlling overlapping of pictures
    • G06K9/40
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10016Video; Image sequence
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10024Color image
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10032Satellite or aerial image; Remote sensing
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/20Special algorithmic details
    • G06T2207/20076Probabilistic image processing
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T3/00Geometric image transformations in the plane of the image
    • G06T3/40Scaling of whole images or parts thereof, e.g. expanding or contracting
    • G06T3/4038Image mosaicing, e.g. composing plane images from plane sub-images

Definitions

  • Aerial imaging refers to capturing images of a surface of a planet from an elevated position, such as an aircraft or satellite. Aerial imaging is used in cartography, such as for providing photogrammetric surveys, which are often a basis for topographic maps. Depending on a given elevation for a camera, aerial imaging may capture undesired visual elements obscuring surface objects in an image. Removing undesired visual elements may be difficult, however, particularly when multiple aerial images are stitched together to form a composite aerial image, such as an “ortho mosaic” or aerial map. It is with respect to these and other considerations that the present improvements have been needed.
  • Embodiments are generally directed to advanced image processing techniques for aerial images. Some embodiments are particularly directed to advanced image processing techniques specifically designed to correct atmospheric and solar influences in aerial images. As elevation for a camera increases, the camera may acquire an aerial image with an increasing number of atmospheric and solar influences, such as light scattering and absorption due to aerosols in the atmosphere. The atmospheric and solar influences potentially obscure surface objects in an aerial image acquired by the aerial camera, among other problems.
  • an atmospheric and solar component is arranged for execution by a logic device and operative to correct solar and atmospheric artifacts from an aerial image.
  • the atmospheric and solar component may comprise, among other elements, an image information component operative to generate an image record for each aerial image of a group of aerial images, the image record comprising statistical information and image context information for each aerial image.
  • the atmospheric and solar component may further comprise a filter generation component operative to generate an atmospheric filter and a solar filter from the statistical information and the image context information stored in the image records.
  • the atmospheric and solar component may still further comprise an image correction component operative to correct atmospheric and solar artifacts from the aerial image using the respective atmospheric filter and solar filter.
  • the atmospheric and solar component may provide enhanced aerial images with a greater degree of visual acuity and more accurate reproduction of surface objects.
  • Other embodiments are described and claimed.
  • FIG. 1 is a schematic diagram showing aspects of a multi-resolution digital large format camera with multiple detector arrays provided in one embodiment presented herein;
  • FIG. 2 is a schematic diagram showing the footprint of a primary camera system overlaid with the footprints of four secondary camera systems in a large format digital camera presented in one embodiment disclosed herein;
  • FIG. 3 is a perspective diagram showing a perspective view of the footprint of a primary camera system and the footprints of four secondary camera systems in a large format digital camera presented in one embodiment disclosed herein;
  • FIG. 4A is a schematic diagram showing a top-down view that illustrates the overlap between the footprint of a sequence of consecutive images taken with a primary camera system and the footprint of a sequence of consecutive images taken with four secondary camera systems in a large format digital camera presented in one embodiment disclosed herein;
  • FIG. 4B is a schematic diagram showing a perspective view that illustrates the overlap between the footprint of a sequence of consecutive images taken along a flight line with a primary camera system and the footprint of a sequence of consecutive images taken with four secondary camera systems in a large format digital camera presented in one embodiment disclosed herein;
  • FIG. 5 is a flow diagram showing one illustrative process presented herein for the airborne optical registration of large areas using a multi-resolution digital large format camera with multiple detector arrays provided in one embodiment presented herein;
  • FIG. 6 is a block diagram showing an aerial imaging system for correcting aerial images acquired by a multi-resolution digital large format camera with multiple detector arrays provided in one embodiment presented herein;
  • FIG. 7 is a block diagram showing an atmospheric and solar component for an aerial imaging system for correcting aerial images acquired by a multi-resolution digital large format camera with multiple detector arrays provided in one embodiment presented herein;
  • FIG. 8 is a flow diagram showing one illustrative process presented herein for correcting atmospheric influences in aerial images acquired by a multi-resolution digital large format camera with multiple detector arrays provided in one embodiment presented herein;
  • FIG. 9 is a flow diagram showing one illustrative process presented herein for correcting solar influences in aerial images acquired by a multi-resolution digital large format camera with multiple detector arrays provided in one embodiment presented herein;
  • FIG. 10 is a block diagram of a computing architecture suitable for implementing an atmospheric and solar component for an aerial imaging system for correcting aerial images acquired by a multi-resolution digital large format camera with multiple detector arrays provided in one embodiment presented herein.
  • Embodiments are generally directed to advanced image processing techniques for aerial images. Some embodiments are particularly directed to advanced image processing techniques specifically designed to correct for atmospheric and solar influences of aerial images acquired using, for example, a large format aerial camera.
  • an atmosphere for a planet is a hostile environment for aerial image acquisition and processing.
  • a large format digital camera may acquire aerial images having undesired visual elements obscuring surface objects in the aerial images.
  • the aerial images may have a number of atmospheric and solar influences, such as light scattering and absorption due to aerosols in the atmosphere.
  • Light scattering may include, for example, Rayleigh scattering and Mie scattering.
  • Aerosols may include gaseous, liquid and solid particles.
  • atmospheric and solar influences introduce a range-dependent haze (e.g., brightness changes) and discoloration (e.g., blue-tint) on each pixel in an aerial image.
  • This range-dependent haze and discoloration should be corrected and separated from the actual scene content on the ground or surface of the earth. Further, in larger image acquisitions, such as those spanning multiple days, a significant difference in solar irradiance in two or more flight lines may be introduced. This brightness difference makes subsequent correction, such as a bidirectional reflectance distribution function (BRDF) correction, more difficult or expensive to perform. As a result, there is a significant need to correct for atmospheric and solar influences obscuring surface objects in an aerial image acquired by a large format digital camera.
  • advanced image processing techniques attempt to model and derive a robust atmosphere linear mask that is optimal for each pixel in each aerial image, while remaining insensitive to (or independent of) scene content on the ground.
  • advanced image processing techniques attempt to derive a robust solar irradiance model and a camera model that can adjust brightness for each image to a preset mean brightness through scaling techniques.
  • Advanced image processing techniques as described herein may provide significant advantages over conventional image processing techniques. For example, previous solutions such as dark object subtraction (DOS) techniques derive only an offset for each channel of one aerial image and only model an additive effect of the atmosphere.
  • the advanced image processing techniques derive a linear model for each sub-region of an aerial image, and model both an additive effect and a multiplicative effect of the atmosphere.
  • the advanced image processing techniques also consider adjacent images along a flight line, and therefore provide greater accuracy and less sensitivity to scene content. In other words, the advanced image processing techniques are locally optimal and globally consistent. Further, the advanced image processing techniques alleviate large brightness differences due to solar irradiance differences from larger image acquisitions spanning lengthy time intervals, such as across several hours or even multiple days.
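  • To make the contrast concrete, the following sketch applies both corrections to a hypothetical image; it is an illustration only, and the array name, dark-object offsets, and per-grid gain and intercept values are invented for the example rather than taken from this description.

        import numpy as np

        # Hypothetical 8-bit RGB aerial image (H x W x 3).
        image = np.random.randint(0, 256, (1024, 1024, 3)).astype(np.float64)

        # Dark object subtraction (DOS): one additive offset per channel,
        # estimated from the darkest pixels; it models only additive haze.
        dos_offsets = image.reshape(-1, 3).min(axis=0)
        dos_corrected = np.clip(image - dos_offsets, 0.0, 255.0)

        # Linear atmosphere model for one sub-region: a gain (multiplicative
        # attenuation) and an intercept (additive path radiance). The values
        # below are placeholders for statistics pooled across adjacent images.
        gain, intercept = 0.85, 12.0
        linear_corrected = np.clip((image - intercept) / gain, 0.0, 255.0)

  • The DOS correction can only shift the histogram, whereas the linear model also restores contrast lost to attenuation, which is the multiplicative effect described above.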
  • a first section provides a detailed description and examples for a large format aerial camera suitable for acquiring and outputting aerial images as illustrated in FIGS. 1-5 .
  • a second section provides a detailed description and examples for an aerial imaging system implementing advanced image processing techniques arranged to process aerial images acquired by a large format aerial camera as illustrated in FIGS. 6-10 .
  • the aerial imaging system may include an atmospheric and solar component arranged to correct for atmospheric and solar influences in aerial images.
  • the corrected aerial images may be used for a number of use scenarios typical for digital aerial images, such as generating an ortho mosaic (aerial map) that is largely free of atmosphere and solar influences, among other use scenarios.
  • a large format aerial camera may be used to acquire, capture or record aerial images.
  • the aerial images may comprise still images (e.g., pictures) or moving images (e.g., video).
  • the large format aerial camera may be suitable for use in airborne optical registration of large surface areas of the earth, such as entire countries, continents, or even the entire world.
  • aerial images may be acquired by a large format aerial camera implemented as a multi-resolution large format digital camera having multiple optical systems and detector arrays, such as a MICROSOFT® ULTRACAM-G aerial camera, made by Microsoft Corporation, Redmond, Wash.
  • a multi-resolution large format digital camera may be capable of producing aerial images at different photographic scales.
  • the multi-resolution large format digital camera can produce panchromatic images having a wide-angle geometry that are suitable for use in a photogrammetric workflow that includes image-based georeferencing and digital surface modeling.
  • the multi-resolution large format digital camera can also concurrently produce multiple color images having a narrow-angle geometry suitable for use in a photogrammetric workflow that includes “ortho image” production.
  • An ortho image is an image that shows ground objects in an orthographic projection. Because a single flight utilizing the multi-resolution large format digital camera can produce both wide-angle and narrow-angle images, the cost of mapping a large area can be reduced as compared to previous solutions.
  • the multi-resolution large format digital camera may include a primary camera system and two or more secondary camera systems.
  • the primary camera system is configured for collecting panchromatic image data and the secondary camera systems are configured for collecting color image data.
  • Each of the secondary camera systems has an optical system that has a longer focal length than the optical system of the primary camera system.
  • the primary camera system and the secondary camera systems may be mounted within a common housing suitable for installation and use within an aircraft.
  • the primary camera system has an electro optical detector array capable of capturing the panchromatic image data.
  • Each of the secondary camera systems has an electro optical detector array capable of capturing the color image data.
  • the resolution of the electro optical detector in each of the secondary camera systems is greater than the resolution of the electro optical detector in the primary camera system.
  • the radiometric resolution of the secondary camera systems may be greater than the radiometric resolution of the primary camera system.
  • the primary camera system and the secondary camera systems may be configured such that the large format digital camera can produce images at two different image scales offering two different footprints. Images produced by the primary camera system have a larger footprint and are larger in size than those produced by the secondary camera systems and offer information for performing image-based georeferencing by means of photogrammetric triangulation. Images produced by the secondary camera systems have a smaller footprint and are smaller in size than those produced by the primary camera system and offer a high-resolution narrow angle color image. The color images produced by the secondary camera systems may be utilized as a source data set for high-resolution ortho image production. The footprint of the images generated by the secondary camera systems may be configured to overlap the footprint of the primary camera system in a direction perpendicular to a flight path.
  • the multi-resolution large format digital camera may be configured to generate a sequence of consecutive images along a flight line.
  • the multi-resolution large format digital camera may be further configured such that the primary camera system produces a sequence of consecutive panchromatic images that overlap one another.
  • the secondary camera systems may be configured to produce a sequence of consecutive color images that overlap one another and the images produced by the primary camera system. The overlap between consecutive panchromatic images may be greater than the overlap between consecutive color images.
  • FIG. 1 is a schematic diagram showing aspects of a large format digital camera 100 having multiple optical systems 106 A- 106 B and detector arrays 110 A- 110 B provided in one embodiment presented herein.
  • the large format digital camera 100 includes a primary camera system 104 A and two or more secondary camera systems 104 B- 104 N.
  • While FIG. 1 illustrates two secondary camera systems 104 B- 104 N, it should be appreciated that other embodiments might include additional secondary camera systems.
  • the large format digital camera 100 includes four secondary camera systems 104 B- 104 N.
  • the primary camera system 104 A includes an optical system 106 A that has a focal length 108 A.
  • Each of the secondary camera systems 104 B- 104 N has an optical system 106 B that has a focal length 108 B that is longer than the focal length 108 A of the optical system 106 A.
  • the secondary camera systems 104 B- 104 N are configured to produce images having a narrower field of view than images produced by the primary camera system 104 A. Images produced by the primary camera system 104 A have a wider field of view than images produced by the secondary camera systems 104 B- 104 N.
  • the optical systems 106 A- 106 B may include other conventional optical elements to produce a suitable image at the desired focal length.
  • the primary camera system 104 A is configured with an electro optical detector array 110 A capable of capturing panchromatic image data 112 .
  • a panchromatic image sensor, such as the electro optical detector array 110 A, is sensitive to all or most of the visible spectrum.
  • each of the secondary camera systems 104 B- 104 N is configured with an electro optical detector array 110 B capable of capturing color image data 116 .
  • the secondary camera systems 104 B- 104 N might be equipped with a suitable charge coupled device (“CCD”) array configured for capturing the color image data 116 A- 116 N, respectively.
  • the camera system presented herein is a frame camera (also referred to as a framing camera), as opposed to a camera that utilizes push-broom sensing.
  • the detector arrays 110 A- 110 B comprise arrays of individual electro-optical detectors, e.g., semiconductor devices that output an electric signal, the magnitude of which is dependent on the intensity of light energy incident on such electro-optical detector. Therefore, the signal from each electro-optical detector in the arrays 110 A- 110 B is indicative of light energy intensity from a pixel area of the portion of the object or terrain being photographed, and the signals from all of the individual electro-optical detectors in the arrays 110 A- 110 B are indicative of light energy intensity from all of the pixel areas of the portion of the object or terrain being photographed.
  • the signals from the electro-optical detectors in each of the detector arrays 110 A- 110 B, together, are indicative of the pattern of light energy from the portion of the object being photographed, so a sub-image of the portion of the object can be produced from such signals.
  • the signals are amplified, digitized, processed, and stored, as is well known to those of ordinary skill in the art.
  • the electro-optical detector arrays 110 A- 110 B are connected electrically by suitable conductors to a control circuit (not shown), which includes at least a microprocessor, input/output circuitry, memory, and a power supply for driving the electro-optical detector arrays 110 A- 110 B, retrieving image data from the arrays 110 A- 110 B, and storing the image data.
  • Other data processing functions, for example combining images and/or performing image display functions, may be accomplished within the large format digital camera 100 or by other external data processing equipment.
  • the resolution of the electro optical detector arrays 110 B in the secondary camera systems 104 B- 104 N is greater than the resolution of the electro optical detector array 110 A in the primary camera system 104 A.
  • the large format digital camera 100 can produce a panchromatic image file 114 from the primary camera system 104 A using a wide-angle geometry that is suitable for use in a photogrammetric workflow that includes image-based georeferencing and digital surface modeling.
  • the large format digital camera 100 can also simultaneously produce multiple higher-resolution color image files from the secondary camera systems 104 B- 104 N using a narrow-angle geometry suitable for use in a photogrammetric workflow that includes ortho image production.
  • the primary camera system 104 A and the secondary camera systems 104 B- 104 N might be mounted within a common housing 102 .
  • a front glass plate 120 might be mounted within the housing 102 to protect the optical systems 106 A- 106 B.
  • the primary camera system 104 A and the secondary camera systems 104 B- 104 N are mounted in separate housings (not shown). In both cases, the primary camera system 104 A, the secondary camera systems 104 B- 104 N, and the housing 102 are configured for mounting and use within an aircraft.
  • FIG. 2 is a schematic diagram showing the footprint 202 of the primary camera system 104 A overlaid with footprints 204 A- 204 D of the secondary camera systems 104 B- 104 N in the large format digital camera 100 in one embodiment disclosed herein.
  • the large format digital camera 100 includes four secondary camera systems 104 B- 104 N configured with the footprints 204 A- 204 D illustrated in FIG. 2 , respectively.
  • the primary camera system 104 A and the secondary camera systems 104 B- 104 N are configured in one embodiment such that the large format digital camera 100 can produce overlapping images at two different image scales offering two different footprints 202 and 204 A- 204 D.
  • two primary camera systems 104 A and four secondary camera systems 104 B- 104 N are utilized.
  • images produced by the primary camera system 104 A have a larger footprint 202 and are larger in size than those produced by the secondary camera systems 104 B- 104 N. Images produced by the secondary camera systems 104 B- 104 N have smaller footprints 204 A- 204 D and are smaller in size than those produced by the primary camera system 104 A and offer a higher resolution narrow angle color image.
  • the four secondary camera systems 104 B- 104 N may be configured such that the footprints 204 A- 204 D of the secondary camera systems 104 B- 104 N cover the footprint 202 of the primary camera system 104 A in a direction perpendicular to a flight line 400 .
  • the footprints 204 A- 204 D of the four secondary camera systems 104 B- 104 N cover a “stripe” of the footprint 202 of the primary camera system 104 A in a direction perpendicular to the flight line 400 .
  • a portion of the images produced by the primary camera system 104 A can be enhanced by the images produced by the secondary camera systems 104 B- 104 N.
  • FIG. 3 provides a perspective view of the footprint 202 of the primary camera system 104 A and the footprints 204 A- 204 D of the four secondary camera systems 104 B- 104 N when an image is taken from a common point 302 by the primary camera system 104 A and the four secondary camera systems 104 B- 104 N.
  • FIG. 4A shows a top-down view that illustrates the overlap between the footprints 202 A- 202 D of a sequence of consecutive images taken with the primary camera system 104 A and the footprint 204 A- 204 D of a sequence of consecutive images taken with four secondary camera systems 104 B- 104 N in the large format digital camera 100 in one embodiment disclosed herein.
  • the large format digital camera 100 may be mounted and configured for use within an aircraft (not shown). When the aircraft is flown according to a well-defined flight line 400 , the large format digital camera 100 may be configured to capture a sequence of images along the flight line 400 .
  • FIG. 4A illustrates the footprints 202 A- 202 D of a sequence of images taken using the primary camera system 104 A and the footprints 204 A- 204 D of a sequence of images taken using the four secondary camera systems 104 B- 104 N along the flight line 400.
  • the large format camera 100 may be further configured such that the primary camera system 104 A produces a sequence of consecutive panchromatic images that have footprints 202 A- 202 D wherein consecutive sequential images overlap one another.
  • the secondary camera systems 104 B- 104 N may similarly be configured to produce a sequence of consecutive color images that have footprints 204 A- 204 D wherein consecutive sequential images overlap one another and also overlap the images produced by the primary camera system 104 A.
  • the overlap between the footprints of consecutive panchromatic images may be greater than the overlap between the footprints of consecutive color images.
  • FIG. 4B is a perspective diagram illustrating the overlap between the footprints 202 A- 202 D of a sequence of consecutive images taken on several flight lines 400 with the primary camera system 104 A and the footprints 204 A- 204 D of a sequence of consecutive images taken with four secondary camera systems 104 B- 104 N in the large format digital camera 100 in one embodiment disclosed herein. If, as illustrated in FIG. 4B , images are produced by the primary camera system 104 A and the secondary camera systems 104 B- 104 N along multiple well-defined flight lines by means of aerial photogrammetric image acquisition, the footprints 202 of the primary camera system 104 A overlap one another in the sequence of exposures along the flight lines.
  • the footprints 204 A- 204 D of the secondary camera systems 104 B- 104 N also overlap one another and the footprints 202 A- 202 D of the primary camera system 104 A.
  • images along the flight lines 400 are therefore produced in such a way that the sequence of images produced by the primary camera system 104 A and the images produced by the secondary camera systems 104 B- 104 N create continuous image strips of overlapping images.
  • the flight lines may be defined in such a way that the large format digital camera 100 captures images covering an entire project area.
  • image acquisition by the secondary camera systems 104 B- 104 N may be triggered substantially simultaneously with image acquisition by the primary camera system 104 A and, accordingly, images from the secondary camera systems 104 B- 104 N may be acquired at the same position and with the same camera attitude as images from the primary camera system 104 A.
  • the trigger for the secondary camera systems 104 B- 104 N may be independent from the primary camera system 104 A, e.g., may be at a higher rate than images captured by the primary camera system. Either embodiment, as well as any combination thereof, is contemplated to be within the scope of embodiments presented herein.
  • the images produced by the secondary camera systems 104 B- 104 N may be registered to the images produced by the primary camera system 104 A using the same trigger event. Additionally, images produced by the secondary camera systems 104 B- 104 N may be calibrated to images of the primary camera system 104 A through the use of a precisely surveyed and well-structured object (known as a “calibration object”).
  • the images of the secondary camera systems 104 B- 104 N may also be stitched to the images of the primary camera system 104 A using traditional methods. Additionally, the images generated by the primary camera system 104 A can be used to reconstruct the three dimensional form of an object (for instance, the buildings of a city by means of a digital surface model) and the images of the secondary camera system 104 B, with a higher geometric resolution, may be used to extract high resolution photo texture which can then be used for the production of ortho image maps.
  • FIG. 5 provides additional details regarding the embodiments presented herein for a large format digital camera 100 having multiple optical systems and detector arrays.
  • FIG. 5 is a flow diagram showing a routine 500 that illustrates one process presented herein for the airborne optical registration of large areas using the large format digital camera 100 described above.
  • the routine 500 begins at operation 502 , where the large format digital camera 100 is calibrated.
  • the large format digital camera 100 may be calibrated using a calibration object such that the footprint of images produced by the secondary camera systems 104 B- 104 N overlap the footprint of images produced by the primary camera system 104 A in the manner discussed above.
  • the large format digital camera 100 may be installed in an aircraft and utilized to capture ground images as the aircraft is flown along a well-defined flight line. Such images may be captured and stored in an appropriate digital storage device integrated with or external to the large format digital camera 100 .
  • the routine 500 proceeds to operation 504 where panchromatic image files 114 are received from the primary camera system 104 A.
  • the routine then proceeds to operation 506 , where the color image files 118 A- 118 N are received from the secondary camera systems 104 B- 104 N.
  • the routine 500 proceeds to operation 508 , where the image files 114 from the primary camera system 104 A are co-registered with the image files 118 A- 118 N from the secondary camera systems 104 B- 104 N.
  • the routine 500 proceeds to operation 510 , where the image files 114 from the primary camera system 104 A are utilized in a photogrammetric workflow that includes image-based georeferencing and digital surface modeling. From operation 510 , the routine 500 proceeds to operation 512 , where the image files 118 A- 118 N from the secondary camera systems 104 B- 104 N are utilized for ortho image production. The routine 500 proceeds from operation 512 to operation 514 , where it ends.
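  • Read as code, the routine 500 is a linear pipeline. The sketch below mirrors operations 502-514 as Python stubs; every function name is a hypothetical placeholder for the processing stage named in the flow diagram, not an interface defined by this description.

        from typing import Iterable, List

        # Hypothetical stage stubs standing in for the real processing steps.
        def calibrate(camera) -> None: ...                                      # operation 502
        def co_register(pan: List, color: List) -> List: return pan + color     # operation 508
        def georeference_and_model(pan: List) -> dict: return {"dsm": pan}      # operation 510
        def produce_ortho_images(color: List, dsm: dict) -> List: return color  # operation 512

        def routine_500(camera, pan_images: Iterable, color_images: Iterable):
            """Airborne optical registration flow of FIG. 5."""
            calibrate(camera)                          # 502: calibrate with a calibration object
            pan = list(pan_images)                     # 504: receive panchromatic image files 114
            color = list(color_images)                 # 506: receive color image files 118A-118N
            registered = co_register(pan, color)       # 508: co-register primary and secondary images
            dsm = georeference_and_model(pan)          # 510: georeferencing and surface modeling
            orthos = produce_ortho_images(color, dsm)  # 512: ortho image production
            return registered, dsm, orthos             # 514: end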
  • FIG. 6 is a block diagram showing an aerial imaging system 600 .
  • the aerial imaging system 600 may be used for processing and correcting aerial images 602 - a acquired by an aerial digital camera, such as the large format digital camera 110 , for example. It may be appreciated that other aerial digital cameras may be used as well, and the embodiments are not limited to the exemplary large format digital camera 110 .
  • a and “b” and “c” and similar designators as used herein are intended to be variables representing any positive integer.
  • a complete set of aerial image 602 - a may include aerial images 602 - 1 , 602 - 2 , 602 - 3 , 602 - 4 and 602 - 5 .
  • the embodiments are not limited in this context.
  • the aerial imaging system 600 may comprise a computer-implemented system having one or more components, such as an atmospheric and solar component 610 , for example.
  • system and “component” are intended to refer to a computer-related entity, comprising either hardware, a combination of hardware and software, software, or software in execution.
  • a component can be implemented as a process running on a processor, a processor, a hard disk drive, multiple storage drives of optical and/or magnetic storage medium, an object, an executable, a thread of execution, a program, and/or a computer.
  • an application running on a server and the server can be a component.
  • One or more components can reside within a process and/or thread of execution, and a component can be localized on one computer and/or distributed between two or more computers as desired for a given implementation. The embodiments are not limited in this context.
  • the aerial imaging system 600 may be implemented as part of an electronic device.
  • an electronic device may include without limitation a digital camera, an aerial digital camera, a large format digital camera, a mobile device, a personal digital assistant, a mobile computing device, a smart phone, a cellular telephone, a handset, a one-way pager, a two-way pager, a messaging device, a computer, a personal computer (PC), a desktop computer, a laptop computer, a notebook computer, a handheld computer, a server, a server array or server farm, a web server, a network server, an Internet server, a work station, a mini-computer, a main frame computer, a supercomputer, a network appliance, a web appliance, a distributed computing system, multiprocessor systems, processor-based systems, consumer electronics, programmable consumer electronics, television, digital television, set top box, wireless access point, base station, subscriber station, mobile subscriber center, radio network controller, router, hub, gateway, bridge, and so forth
  • the aerial imaging system 600 may comprise the large format digital camera 110 , the atmospheric and solar component 610 , and a digital display 620 .
  • although the aerial imaging system 600 as shown in FIG. 6 has a limited number of elements in a certain topology, it may be appreciated that the aerial imaging system 600 may include more or fewer elements in alternate topologies as desired for a given implementation.
  • the aerial imaging system 600 may further comprise other elements typically found in an aerial imaging system or an electronic device, such as computing components, communications components, power supplies, input devices, output devices, and so forth. The embodiments are not limited in this context.
  • the large format digital camera 110 may acquire and output a series of aerial images 602 - a , such as aerial images as previously described with reference to FIGS. 1-5 .
  • the large format digital camera 110 may be part of a camera platform used to elevate the camera 110 above a surface of the earth.
  • the camera platform may comprise an airplane flying a flight mission over a defined flight line, such as flight line 400 , to acquire aerial images 602 - a of a particular surface area of the earth for airborne optical registration.
  • a defined flight line may comprise a typical north-south flight line pattern for an aerial photograph acquisition. Such an acquisition produces overlapping aerial images 602 - a , which are subsequently processed into a seamless ortho mosaic or aerial map.
  • the atmospheric and solar component 610 may implement various image processing techniques to correct atmospheric and solar influences from the aerial images 602 - a taken by the large format digital camera 110 .
  • the atmospheric and solar component 610 derives a robust atmosphere mask which is applied to each pixel of an aerial image 602 - a while remaining insensitive to (or independent of) scene content on the ground.
  • the atmospheric and solar component 610 models undesired atmosphere effects with a linear relationship that is related to two physical phenomena due to light scattering and absorption in the atmosphere.
  • a first physical phenomenon is additive path radiance caused by scattering which is modeled by an intercept term.
  • a second physical phenomenon is a multiplicative attenuation factor caused by scattering and absorption which is modeled by a gain term.
  • the atmospheric and solar component 610 uses the linear relationship as a good approximation of a sub-region of an aerial image 602 - a , where a distance to the large format digital camera 110 can be considered a constant and therefore atmosphere effects are the same.
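  • As a sketch of how such a per-grid linear model might be applied, consider the following; the grid size and the gain and intercept values are hypothetical, and in the described system they would be estimated from statistics pooled across adjacent aerial images rather than fixed constants.

        import numpy as np

        # Hypothetical single-channel aerial image divided into a 4 x 4 grid.
        image = np.random.rand(512, 512) * 255.0
        gains = np.full((4, 4), 0.9)        # multiplicative attenuation per grid cell
        intercepts = np.full((4, 4), 10.0)  # additive path radiance per grid cell

        corrected = np.empty_like(image)
        gh, gw = image.shape[0] // 4, image.shape[1] // 4
        for i in range(4):
            for j in range(4):
                block = image[i * gh:(i + 1) * gh, j * gw:(j + 1) * gw]
                # Invert observed = gain * scene + intercept for this sub-region,
                # where the distance to the camera is treated as constant.
                corrected[i * gh:(i + 1) * gh, j * gw:(j + 1) * gw] = (
                    (block - intercepts[i, j]) / gains[i, j])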
  • the atmospheric and solar component 610 models the atmosphere in a single aerial image 602 - a by dividing the aerial image 602 - a into a series of grids. After dividing an aerial image 602 - a into a grid, however, a number of samples in each grid becomes smaller relative to the entire aerial image 602 - a , and the atmosphere influence becomes harder to model given fewer samples. Depending on scene content, statistics derived from each grid might be insufficient to derive a robust atmosphere mask.
  • the atmospheric and solar component 610 solves this problem by aggregating statistics of grids across groups of aerial images 602 - a , such as aerial images 602 - a that have been captured close in time and/or on a same flight line. For example, the atmospheric and solar component 610 groups a series of aerial images 602 - a taken along a same flight line. Pixels in one grid have approximately the same distance to the large format digital camera 110 , and therefore undergo the same or similar atmospheric effects. Samples taken from a series of aerial images 602 - a , rather than using a single grid or a single aerial image 602 - a , may be used to derive an atmosphere mask insensitive to scene content. As a result, the atmospheric and solar component 610 can correct for atmosphere effects on each pixel of an aerial image 602 - a by modeling and separating atmospheric effects from scene content.
  • the atmospheric and solar component 610 derives a robust solar irradiance model and a camera model that can adjust brightness of each aerial image 602 - a to a preset mean brightness.
  • the atmospheric and solar component 610 models solar irradiance by using a sun elevation angle with a multiplicative factor.
  • the atmospheric and solar component 610 models light received by the large format digital camera 110 using a multiplicative factor comprising an exposure time and an aperture size.
  • the two multiplicative factors are then multiplied and combined to one factor to scale each aerial image 602 - a to a common mean brightness. After scaling operations, brightness levels for each aerial image 602 - a are more uniform.
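  • A sketch of this two-factor scaling follows; the sine dependence on sun elevation, the aperture-area term, and the reference constant are simplifying assumptions made for the example, not formulas given in this description.

        import math
        import numpy as np

        # Hypothetical calibration constant mapping the combined solar and camera
        # factor to an expected mean brightness; a real system would estimate it.
        REFERENCE_LEVEL = 4.0e5

        def brightness_scale(image: np.ndarray, sun_elevation_deg: float,
                             exposure_s: float, f_number: float,
                             target_mean: float = 118.0) -> np.ndarray:
            """Scale one aerial image toward a preset mean brightness (sketch)."""
            solar = math.sin(math.radians(sun_elevation_deg))  # solar irradiance factor
            camera = exposure_s / (f_number ** 2)              # exposure time x aperture area
            predicted_mean = REFERENCE_LEVEL * solar * camera  # model-based, content-independent
            return image * (target_mean / predicted_mean)

        # Hypothetical usage: images from different flight lines become comparable.
        img = np.random.rand(256, 256) * 200.0
        balanced = brightness_scale(img, sun_elevation_deg=35.0, exposure_s=1/500, f_number=5.6)

  • Because the scale factor in this sketch comes from the models rather than from the observed pixel values, genuine scene brightness differences are preserved while illumination and exposure differences are leveled.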
  • the atmospheric and solar component 610 may receive as input one or more aerial images 602 - a from the large format digital camera 110 , perform advanced image processing operations on the aerial images 602 - a , and output corrected aerial images 604 - b .
  • the atmospheric and solar component 610 may adjust color and brightness both globally and locally to correct for atmospheric and solar influences in aerial images 602 - a to form corrected aerial images 604 - b .
  • the corrected aerial images 604 - b may be used to generate a seamless and color-balanced ortho mosaic production.
  • the corrected aerial images 604 - b can be further refined using subsequent image processing techniques, such as BRDF correction and seamline design techniques, for example.
  • the aerial imaging system 600 may present the corrected aerial images 604 - b on the display 620 .
  • the display 620 may comprise any electronic display for presentation of visual, tactile or auditory information. Examples for the display 620 may include without limitation a cathode ray tube (CRT), bistable display, electronic paper, nixie tube, vector display, a flat panel display, a vacuum fluorescent display, a light-emitting diode (LED) display, electroluminescent (ELD) display, a plasma display panel (PDP), a liquid crystal display (LCD), a thin-film transistor (TFT) display, an organic light-emitting diode (OLED) display, a surface-conduction electron-emitter display (SED), a laser television, carbon nanotubes, nanocrystal displays, a head-mounted display, and any other displays consistent with the described embodiments.
  • the display 620 may be implemented as a touchscreen display.
  • a touchscreen display is an electronic visual display that can detect the presence and location of a touch within the display area. The touch may be from a finger, hand, stylus, light pen, and so forth. The embodiments are not limited in this context.
  • FIG. 7 is a more detailed block diagram of the atmospheric and solar component 610 of the aerial imaging system 600 .
  • although the atmospheric and solar component 610 as shown in FIG. 7 has a limited number of elements in a certain topology, it may be appreciated that the atmospheric and solar component 610 may include more or fewer elements in alternate topologies as desired for a given implementation.
  • the atmospheric and solar component 610 may include, among other elements, an image information component 710 , a filter generation component 720 and an image correction component 730 .
  • the components 710 , 720 and 730 may be communicatively coupled via various types of communications media.
  • the components 710 , 720 and 730 may coordinate operations between each other. The coordination may involve the uni-directional or bi-directional exchange of information.
  • the components 710 , 720 and 730 may communicate information in the form of signals communicated over the communications media.
  • the information can be implemented as signals allocated to various signal lines. In such allocations, each message is a signal. Further embodiments, however, may alternatively employ data messages. Such data messages may be sent across various connections. Exemplary connections include parallel interfaces, serial interfaces, and bus interfaces.
  • the image information component 710 may be generally arranged to generate image records 742 - c associated with aerial images 602 - a .
  • An image record 742 - c may be associated with a single aerial image 602 - a , or in some cases, multiple aerial images 602 - a .
  • the image information component 710 may store the image records 742 - c in a datastore 740 .
  • the datastore 740 and stored image records 742 - c are accessible by the filter generation component 720 and the image correction component 730 during different image processing phases.
  • the image information component 710 may retrieve image context information 712 associated with one or more aerial images 602 - a .
  • the image context information 712 may comprise, among other types of information, image information, camera information and camera platform information.
  • the image context information 712 may be retrieved from the datastore 740 , an aerial image 602 - a , a file or package containing an aerial image 602 - a , or a remote datastore for another device (e.g., the large format digital camera 110 or an airplane computer).
  • the image information component 710 may store the image context information 712 in an image record 742 - c associated with a given aerial image 602 - a.
  • the image context information 712 may comprise image information associated with one or more aerial images 602 - a acquired by the large format digital camera 110 .
  • image information may include without limitation an image identifier, an image date, an image time, an image location, an image tag, an image label, or other image metadata.
  • the image context information 712 may comprise camera information associated with the large format digital camera 110 used to acquire one or more aerial images 602 - a .
  • camera information may include without limitation a camera identifier, a camera exposure time (e.g., shutter speed), a camera aperture size, a camera location (e.g., latitude, longitude and altitude coordinates), and other camera metadata.
  • the image context information 712 may comprise camera platform information associated with a platform for the large format digital camera 110 used to acquire a group of aerial images 602 - a .
  • Examples of camera platform information may include without limitation a camera platform identifier, a flight mission identifier, a flight line identifier, a camera platform location (e.g., latitude, longitude and altitude coordinates), and other camera platform metadata.
  • the image information component 710 may retrieve or generate statistical information 714 associated with one or more aerial images 602 - a .
  • the image information component 710 may receive one or more aerial images 602 - a , and generate statistical information 714 from information contained within the aerial images 602 - a , such as pixel values.
  • the image information component 710 may store the statistical information 714 in a same image record 742 - c used to store image context information 712 associated with a given aerial image 602 - a or set of aerial images 602 - a .
  • the image information component 710 may store statistical information 714 in a different image record 742 - c , and link the different image records using a globally unique identifier (GUID).
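  • As an illustration of what such a record might hold, here is a minimal sketch; the field names and layout are hypothetical, inferred from the image context information and statistical information described above.

        from dataclasses import dataclass, field
        from typing import Dict, Tuple
        import uuid

        @dataclass
        class ImageRecord:
            """One image record 742-c: context plus per-grid statistics (hypothetical schema)."""
            guid: str = field(default_factory=lambda: str(uuid.uuid4()))  # links related records
            image_id: str = ""
            flight_mission_id: str = ""
            flight_line_id: str = ""
            exposure_s: float = 0.0
            f_number: float = 0.0
            # (grid row, grid col) -> channel -> (shadow percentile, standard deviation)
            grid_stats: Dict[Tuple[int, int], Dict[str, Tuple[float, float]]] = field(
                default_factory=dict)

        record = ImageRecord(image_id="602-1", flight_mission_id="M42", flight_line_id="400")
        record.grid_stats[(0, 0)] = {"R": (8.5, 22.1), "G": (9.0, 20.4), "B": (14.2, 19.8)}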
  • the filter generation component 720 may generate an atmospheric filter 722 and a solar filter 724 from statistical information 714 and image context information 712 stored in image records 742 - c .
  • each atmospheric filter 722 and solar filter 724 may comprise one or more correction masks derived from the image context information 712 and statistical information 714 .
  • Each atmospheric filter 722 and solar filter 724 may be associated with one or more aerial images 602 - a .
  • the filter generation component 720 may retrieve information from the image records 742 - c , process the retrieved information to form the atmospheric filter 722 and the solar filter 724 , and store processed information for the filters 722 , 724 in corresponding filter image records 744 - d in the datastore 740 .
  • the image correction component 730 may be generally arranged to correct atmospheric and solar artifacts from a target aerial image 732 using a respective atmospheric filter 722 and solar filter 724 .
  • a target aerial image 732 may comprise an aerial image that is currently being focused on for a particular set of image processing operations, such as an aerial image for which a set of filters 722 , 724 are being designed, or an aerial image that is currently being corrected.
  • the image correction component 730 may retrieve an appropriate atmospheric filter 722 and solar filter 724 from the image records 742 - c and/or the filter image records 744 - d for a given target aerial image 732 , perform correction operations on the target aerial image 732 , and output a corrected target aerial image 734 .
  • the corrected target aerial image 734 may be written to the datastore 740 , or transported in real-time to another device.
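  • Putting the three components together, the correction step reduces to a lookup and two filter applications; the dict-based datastore and the lambda filters below are hypothetical stand-ins for the image records, filter image records, and filters described above.

        import numpy as np

        # Hypothetical in-memory datastore 740, keyed by a record GUID.
        images = {"g1": np.random.rand(64, 64) * 255.0}
        atmo_filters = {"g1": lambda img: (img - 12.0) / 0.85}  # placeholder gain/intercept inversion
        solar_filters = {"g1": lambda img: img * 1.08}          # placeholder brightness scale

        def correct_target_image(guid: str) -> np.ndarray:
            """Apply the atmospheric filter 722, then the solar filter 724, to one image."""
            corrected = atmo_filters[guid](images[guid])  # remove haze and attenuation first
            corrected = solar_filters[guid](corrected)    # then level the brightness
            return np.clip(corrected, 0.0, 255.0)         # corrected target aerial image 734

        corrected_734 = correct_target_image("g1")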
  • Operations for the aerial imaging system 600 and the atmospheric and solar component 610 may be further described with reference to logic flows 800 , 900 . It may be appreciated that the representative logic flows 800 , 900 do not necessarily have to be executed in the order presented, or in any particular order, unless otherwise indicated. Moreover, various activities described with respect to the logic flows 800 , 900 can be executed in serial or parallel fashion.
  • the logic flows 800 , 900 may be implemented using one or more hardware elements and/or software elements of the described embodiments or alternative elements as desired for a given set of design and performance constraints.
  • the logic flows 800 , 900 may be implemented as logic (e.g., computer program instructions) for execution by a logic device (e.g., a general-purpose or specific-purpose computer).
  • FIG. 8 illustrates one embodiment of a logic flow 800 .
  • the logic flow 800 may be representative of some or all of the operations executed by one or more embodiments described herein, such as the atmospheric and solar component 610 of the aerial imaging system 600 , for example.
  • the logic flow 800 may be implemented by the atmospheric and solar component 610 to correct for atmospheric influences in an aerial image 602 - a.
  • the logic flow 800 may receive multiple aerial images at block 802 .
  • the atmospheric and solar component 610 may receive multiple aerial images 602 - a from the large format digital camera 110 .
  • the aerial images 602 - a may be temporally sequential aerial images 602 - a .
  • the embodiments, however, are not limited to this example.
  • the logic flow 800 may generate statistical information for each aerial image at block 804 .
  • the image information component 710 may generate statistical information 714 for each aerial image 602 - a .
  • the atmospheric and solar component 610 may generate statistical information 714 comprising a shadow percentile value and a standard deviation value for each grid within an aerial image 602 - a , among other types of statistical information 714 .
  • the image information component 710 may calculate a shadow percentile value for each grid. For example, the image information component 710 may calculate a shadow percentile (e.g., 0.06 percentile) per channel (e.g., R, G, and B) from all pixels in a grid. In some cases, some pixels from a grid may be excluded. For example, if a water mask (e.g., geo-locations of lakes, ocean, rivers, etc.) is available, pixels of water in an image should be excluded from the statistics calculation. If saturated pixel values are known, those pixels should be excluded as well.
  • the image information component 710 may calculate a standard deviation value for each grid. For example, the image information component 710 may calculate a standard deviation per channel (e.g., R, G, and B) from all pixels in a grid. In some cases, some pixels from a grid may be excluded, such as those associated with a water mask or saturated pixels.
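  • A sketch of these two per-grid statistics with the exclusions applied; the sample layout and mask handling are hypothetical, and only the per-channel 0.06 shadow percentile is carried over from the example above.

        import numpy as np

        def grid_statistics(grid_pixels: np.ndarray, water_mask: np.ndarray,
                            saturation: float = 255.0, p: float = 0.06) -> dict:
            """Per-channel shadow percentile and standard deviation for one grid.

            grid_pixels: (N, 3) RGB samples from the grid; water_mask: (N,) bool,
            True where a sample falls on water.
            """
            stats = {}
            for c, name in enumerate("RGB"):
                channel = grid_pixels[:, c]
                keep = ~water_mask & (channel < saturation)  # drop water and saturated pixels
                stats[name] = (float(np.percentile(channel[keep], p)),  # shadow percentile
                               float(channel[keep].std()))              # standard deviation
            return stats

        pixels = np.random.rand(500, 3) * 255.0
        mask = np.zeros(500, dtype=bool)
        mask[:50] = True  # pretend the first 50 samples are water
        print(grid_statistics(pixels, mask))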
  • the image information component 710 may store the shadow percentile values and the standard deviation values in an image record 742 - c for the aerial image 602 - a .
  • the image information component 710 may organize and store a shadow percentile value and a standard deviation value for each grid of an aerial image 602 - a to an associated image record 742 - c .
  • the image record 742 - c may implement any known data schema. In one embodiment, for example, the image record 742 - c may be implemented as a comma-separated values (CSV) file.
  • Each aerial image 602 - a may have a separate associated image record 742 - c , or a separate record in a compounded image record 742 - c .
  • the image record 742 - c of a given aerial image 602 - a can be retrieved for subsequent image processing operations.
  • Each image record 742 - c , or record within an image record 742 - c may be indexed by a globally unique identifier (GUID) to facilitate retrieval operations.
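  • A minimal sketch of such a GUID-indexed CSV image record is shown below; the column layout is an assumption, since the embodiments only require that a record be retrievable by its GUID:

import csv
import uuid

def append_image_record(path, shadow, stddev, guid=None):
    """Append one image record: a GUID followed by the flattened
    per-grid shadow percentile and standard deviation values."""
    guid = guid or str(uuid.uuid4())
    with open(path, "a", newline="") as f:
        csv.writer(f).writerow([guid] + list(shadow.ravel()) +
                               list(stddev.ravel()))
    return guid

def lookup_image_record(path, guid):
    """Retrieve the record for one aerial image from a compounded
    image record file by scanning for its GUID."""
    with open(path, newline="") as f:
        for row in csv.reader(f):
            if row[0] == guid:
                return [float(v) for v in row[1:]]
    return None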
  • the logic flow 800 may retrieve image context information for each aerial image at block 806 .
  • the image information component 710 may retrieve image context information 712 related to the aerial image 602 - a from a datastore or datasource, and write the related image context information 712 to the same image record 742 - c.
  • the logic flow 800 may create an atmospheric filter from the statistical information and the image context information at block 808 .
  • the filter generation component 720 may create an atmospheric filter 722 from the statistical information 714 and the image context information 712 stored in one or more image records 742 - c.
  • the filter generation component 720 may selectively group aerial images 602 - a to form a set of aerial images 726 - e using associated image context information 712 and an ordering algorithm.
  • the filter generation component 720 may use an ordering algorithm to produce a set of aerial images 726 - e sharing a nested order comprising a flight mission identifier, a flight line identifier, a camera exposure time, and a camera aperture size.
  • Such an ordering algorithm is designed to produce a set of aerial images 726 - e having similar, if not identical, illumination conditions (e.g., from the sun and atmosphere) and exposure settings, since the images are continuous in time.
  • the filter generation component 720 leverages this time-continuity concept to infer illumination conditions so that atmosphere correction information can be robustly estimated using a larger set of samples generated across the aerial images 726 - e.
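  • The nested ordering can be expressed as a compound sort key, as in the following sketch; the record field names are assumptions standing in for the image context information 712:

from itertools import groupby

def group_by_illumination(records):
    """Group image context records so that each resulting set shares a
    flight mission, flight line, exposure time, and aperture -- a proxy
    for near-identical illumination and exposure along a time-continuous
    strip of images."""
    key = lambda r: (r["mission_id"], r["line_id"],
                     r["exposure_time"], r["aperture"])
    return [list(g) for _, g in groupby(sorted(records, key=key), key=key)]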
  • the filter generation component 720 may identify an image window of a subset of aerial images 728 - f from a set of aerial images 726 - e from the stream of received aerial images 602 - a . Once grouped, the filter generation component 720 may sort a set of aerial images 726 - e by time in ascending order. Alternatively, the filter generation component 720 can sort the aerial images 726 - e by image number if the aerial images 726 - e are numbered with natural numbers that ascend consistently in time. Once sorted, the filter generation component 720 may identify or select a subset of aerial images 728 - f from the set of aerial images 726 - e to form an image window.
  • the subset of aerial images 728 - f may comprise a number of (k) previous aerial images before a target aerial image 732 .
  • the subset of aerial images 728 - f may comprise k/2 aerial images before the target aerial image 732 , and k/2 aerial images after the target aerial image 732 . In either case, if the number of aerial images found is less than k, they can still be used, but this condition should be marked as an edge condition for quality control purposes.
  • a resulting image window comprises a subset of aerial images 728 - f having k+1 images, including a target aerial image 732 as the current aerial image (or the center aerial image in a non-real-time system).
  • the target aerial image 732 may comprise an aerial image within an image window for which a set of filters 722 , 724 are being made.
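  • Window selection might look like the following sketch; the trailing window corresponds to a real-time system and the centered window to an offline one, per the two options above:

def image_window(images, target_index, k, centered=False):
    """Return up to k neighbors plus the target image itself, from a list
    already sorted ascending in time (or by image number).

    A short window is still returned, but flagged as an edge condition
    for quality control purposes."""
    if centered:
        lo = max(0, target_index - k // 2)
        hi = min(len(images), target_index + k // 2 + 1)
    else:
        lo = max(0, target_index - k)
        hi = target_index + 1
    window = images[lo:hi]
    edge_condition = len(window) < k + 1
    return window, edge_condition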
  • the filter generation component 720 may generate a grid atmospheric haze additive term value for each grid of a target aerial image 732 using shadow percentile values from a subset of aerial images 728 - f in an image window. For each image window, the filter generation component 720 may search and retrieve all image records 742 - c associated with the subset of aerial images 728 - f in the image window. For each channel, and for each grid, the filter generation component 720 retrieves all the shadow percentile values from the retrieved image records 742 - c , and organizes them into a vector. The filter generation component 720 sorts them in ascending order, and takes the p-th percentile. The p-th percentile value is a grid atmospheric haze additive term value for a given grid. These operations are repeated for each grid of the target aerial image 732 . The grid atmospheric haze additive term values collectively form a filtered additive haze mask (per channel) for the target aerial image 732 of the image window.
  • the filter generation component 720 may generate a grid atmospheric correction gain term value for each grid of a target aerial image 732 using standard deviation values from a subset of aerial images 728 - f in an image window. For each image window, the filter generation component 720 may search and retrieve all image records 742 - c associated with the subset of aerial images 728 - f in the image window. For each channel, and for each grid, the filter generation component 720 retrieves all the standard deviation values from the retrieved image records 742 - c , and averages the standard deviation values to form a grid atmospheric correction gain term value. These operations are repeated for each grid in the target aerial image 732 .
  • the grid atmospheric correction gain term values collectively form one filtered atmospheric attenuation correction mask per channel (e.g., R, G, B, Infrared, Panchromatic, etc.), for the target aerial image 732 of the image window.
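  • Stacking the per-grid statistics retrieved from the window's image records reduces both masks to one array operation each, as in this sketch (the p value is a tunable assumption):

import numpy as np

def window_masks(shadow_stack, stddev_stack, p=50.0):
    """Build the filtered masks for the target aerial image of one window.

    shadow_stack, stddev_stack: (n_images, grid_rows, grid_cols, channels)
    arrays gathered from the image records of the window.
    Returns (haze_mask, gain_mask), each (grid_rows, grid_cols, channels).
    """
    # The p-th percentile of the shadow percentile values across the window
    # gives the grid atmospheric haze additive term per grid and channel.
    haze_mask = np.percentile(shadow_stack, p, axis=0)
    # Averaging the standard deviation values across the window gives the
    # grid atmospheric correction gain term per grid and channel.
    gain_mask = stddev_stack.mean(axis=0)
    return haze_mask, gain_mask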
  • the filter generation component 720 may generate grid atmospheric haze additive term values and grid atmospheric correction gain term values for all aerial images within a set of aerial images 726 - e using similar operations as described above.
  • the filter generation component 720 may store grid atmospheric haze additive term values and grid atmospheric correction gain term values for each target aerial image 732 in a filter image record 744 - d .
  • a filter image record 744 - d or a record in a filter image record 744 - d may be indexed by a GUID for an aerial image 726 - e so it may be retrieved for a given aerial image 726 - e during image correction operations.
  • the logic flow 800 may correct atmosphere artifacts of an aerial image using the atmospheric filter at block 810 .
  • the image correction component 730 may correct atmosphere artifacts of a target aerial image 732 using an atmospheric filter 722 designed for the target aerial image 732 by the filter generation component 720 .
  • an atmospheric filter 722 may comprise grid atmospheric haze additive term values and grid atmospheric correction gain term values stored in a filter image record 744 - d associated with the target aerial image 732 .
  • the image correction component 730 may correct atmosphere additive haze and multiplicative attenuation for a target aerial image 732 using its atmospheric filter 722 , in accordance with Equation (3):
  • L(band,x,y) = (L′(band,x,y) − haze(band,x,y)) * g(band,x,y)   Equation (3)
  • where L′ is the measured digital number (DN) value of a given band at pixel location (x,y),
  • L is the atmosphere-corrected value,
  • haze(band,x,y) is the grid atmospheric haze additive term value for a given band at pixel location (x,y), and
  • g(band,x,y) is the grid atmospheric correction gain term value.
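  • Per pixel, Equation (3) is a subtract-then-scale operation; a vectorized sketch, assuming the per-pixel haze and gain values have already been interpolated from the grid masks as described next:

import numpy as np

def apply_equation_3(measured, haze, gain):
    """Subtract the additive haze term, then scale by the atmospheric
    correction gain term, per band and pixel (Equation (3)).

    measured, haze, gain: (H, W, C) arrays; returns corrected values L."""
    return (measured.astype(np.float64) - haze) * gain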
  • the image correction component 730 may calculate a pixel atmospheric haze additive term value for a pixel of a target aerial image 732 using grid atmospheric haze additive term values stored in a filter image record 744 - d associated with the target aerial image 732 . This may be accomplished in one of two ways. First, the image correction component 730 may perform a bilinear or bicubic interpolation on the filtered additive haze mask of the given aerial image 726 - e . The bilinear or bicubic interpolation may use the four neighboring grids that surround a given pixel location as the four interpolation corners.
  • Second, when the grid configuration is one-dimensional, a natural cubic spline curve can be fitted to the grids along the longer-dimension pixel coordinate (e.g., the x-coordinate if the number of rows is one, and the y-coordinate if the number of columns is one).
  • the natural cubic spline curve can be represented as a look-up table (LUT) indexed by pixel location. In this case, for example, each pixel location with the same x-coordinate value (when the number of rows is one) will have the same value. Then the haze additive term value at a given pixel location can be looked up from the LUT.
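  • A minimal bilinear upsampling of one channel's grid mask to full image resolution is sketched below; treating grid-cell centers as the sample points is an assumption, and a bicubic or spline-LUT variant could be substituted:

import numpy as np

def upsample_mask(mask, height, width):
    """Bilinearly interpolate a (grid_rows, grid_cols) mask for one channel
    up to (height, width), treating grid-cell centers as sample points.
    np.interp holds the edge value, so border pixels beyond the outermost
    grid centers reuse the nearest grid value."""
    gr, gc = mask.shape
    ys = (np.arange(gr) + 0.5) * (height / gr)   # grid-center y coordinates
    xs = (np.arange(gc) + 0.5) * (width / gc)    # grid-center x coordinates
    py = np.arange(height) + 0.5                 # pixel-center y coordinates
    px = np.arange(width) + 0.5                  # pixel-center x coordinates
    # Separable linear interpolation: first along x within each grid row...
    rows = np.array([np.interp(px, xs, mask[r, :]) for r in range(gr)])
    # ...then along y between the interpolated rows.
    out = np.empty((height, width))
    for j in range(width):
        out[:, j] = np.interp(py, ys, rows[:, j])
    return out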
  • the image correction component 730 may calculate a pixel atmospheric correction gain term value for a pixel of a target aerial image 732 using grid atmospheric correction gain term values stored in a filter image record 744 - d associated with the target aerial image 732 .
  • the image correction component 730 may perform a bilinear or bicubic interpolation on each filtered atmospheric correction mask for each channel (e.g., R, G, or B) of the target aerial image 732 to retrieve interpolated values for the masks.
  • the natural cubic spline curve fitting as previously described can be used if the grid configuration is one-dimensional.
  • the image correction component 730 may correct a pixel of a target aerial image 732 using a pixel atmospheric haze additive term value calculated for the pixel.
  • the image correction component 730 may correct a given pixel in accordance with Equation (3) as previously described.
  • the image correction component 730 may correct a pixel of a target aerial image 732 using a pixel atmospheric correction gain term value calculated for the pixel.
  • the image correction component 730 may correct a given pixel in accordance with Equation (3) as previously described.
  • the image correction component 730 may output a corrected target aerial image 734 .
  • the image correction component 730 selects a next target aerial image 732 , and performs correction operations similar to those described above to form a next corrected target aerial image 734 .
  • the above correction operations may be performed for each of the aerial images 726 - e.
  • FIG. 9 illustrates one embodiment of a logic flow 900 .
  • the logic flow 900 may be representative of some or all of the operations executed by one or more embodiments described herein, such as the atmospheric and solar component 610 of the aerial imaging system 600 , for example.
  • the logic flow 900 may be implemented by the atmospheric and solar component 610 to correct for solar influences in an aerial image 602 - a.
  • the logic flow 900 may receive multiple aerial images at block 902 .
  • the atmospheric and solar component 610 may receive multiple aerial images 602 - a from the large format digital camera 110 .
  • the aerial images 602 - a may be temporally sequential aerial images 602 - a .
  • the embodiments, however, are not limited to this example.
  • the logic flow 900 may receive image context information for each aerial image at block 904 .
  • the image information component 710 may receive image context information 712 for each aerial image 602 - a .
  • the image information component 710 may store image context information 712 for an aerial image 602 - a in a same or different image record 742 - c used to store the statistical information 714 derived for the aerial image 602 - a as described in the logic flow 800 .
  • the logic flow 900 may create a solar filter from the image context information at block 906 .
  • the filter generation component 720 may create a solar filter 724 for a target aerial image 732 from the image context information 712 .
  • the filter generation component 720 may calculate a sun elevation angle value for each aerial image 726 - e .
  • the image information component 710 may retrieve image information for a target aerial image 732 from associated image context information 712 , including an image date, an image time, and an image location.
  • the image information component 710 may calculate a sun elevation angle (sunZenith) for each aerial image in a set of aerial images 726 - e.
  • the filter generation component 720 may calculate an average sun elevation angle value (meanSunZenithDeg) from the multiple sun elevation angle values computed for a set of aerial images 726 - e.
  • the filter generation component 720 may calculate a solar flux factor value for a target aerial image 732 from an average sun elevation angle value for a set of aerial images 726 - e .
  • fluxFactor = cos(targetSunZenith)/cos(sunZenith)   Equation (6) where targetSunZenith and sunZenith are in radians, and targetSunZenith may be obtained by converting the average sun elevation angle meanSunZenithDeg, which is in degrees, to radians.
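  • A sketch of Equation (6), assuming the per-image sun zenith angle has already been computed in radians from the image date, time, and location, and that targetSunZenith is meanSunZenithDeg converted to radians:

import math

def flux_factor(sun_zenith_rad, mean_sun_zenith_deg):
    """Equation (6): ratio of the solar flux at the set's average sun
    position to the flux at this image's own sun position."""
    target_sun_zenith = math.radians(mean_sun_zenith_deg)
    return math.cos(target_sun_zenith) / math.cos(sun_zenith_rad)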
  • the filter generation component 720 may calculate an average exposure time value and average aperture value for a set of aerial images 726 - e .
  • the filter generation component 720 may retrieve a camera exposure time (exposureTime) and a camera aperture size (aperture) for each aerial image 726 - e using associated image context information 712 , and calculate an average camera exposure time (meanExposureTime) and an average camera aperture size (meanAperture) over all the aerial images 726 - e.
  • the filter generation component 720 may calculate an exposure normalization factor value for a target aerial image 732 from an average exposure time value and an average aperture value for a set of aerial images 726 - e .
  • the filter generation component 720 may calculate a total normalization factor value for a target aerial image 732 from a solar flux factor value and exposure normalization factor value for the target aerial image 732 .
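  • The exposure normalization formula itself is not reproduced above; one plausible form, assuming the recorded aperture is an f-number so that sensor exposure scales as exposureTime/aperture², is sketched here together with the total normalization factor:

def exposure_normalization_factor(exposure_time, aperture,
                                  mean_exposure_time, mean_aperture):
    """Scale an image as if it had been captured with the set's average
    exposure settings (assumes aperture is an f-number, so exposure is
    proportional to exposure_time / aperture**2)."""
    actual = exposure_time / aperture ** 2
    reference = mean_exposure_time / mean_aperture ** 2
    return reference / actual

def total_normalization_factor(flux_factor_value, exposure_norm):
    """Combine the solar flux factor and the exposure normalization factor
    into one multiplicative correction for the target aerial image."""
    return flux_factor_value * exposure_norm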
  • the logic flow 900 may correct solar artifacts of an aerial image using the solar filter at block 908 .
  • the image correction component 730 may correct solar artifacts of a target aerial image 732 before or after atmospheric correction operations, as described with reference to the logic flow 800 of FIG. 8 .
  • the atmospheric and solar component 610 may be arranged to perform further image processing operations on a corrected target aerial image 734 .
  • the atmospheric and solar component 610 may be further arranged to perform “hotspot removal” operations, a radiometric process to remove hotspots caused by micro shadows that produce uneven illumination of aerial images with respect to the camera ray direction and the sun direction.
  • the atmospheric and solar component 610 may be arranged to perform ortho rectification operations, a geometric process to correct an aerial image so that the scale is uniform, like a map, without distortion. This is typically performed with a photogrammetric library using a digital elevation model (DEM) and bundle adjustment results.
  • the atmospheric and solar component 610 may be arranged to perform ortho mosaic operations, a segmentation process to stitch multiple ortho rectified images into one seamless mosaic using max-flow/min-cut theory, for example.
  • the atmospheric and solar component 610 may be arranged to perform globally-aware-locally-adaptive (GALA) tone mapping to convert a large aerial ortho image from a 16-bit camera range to an 8-bit range.
  • FIG. 10 illustrates an embodiment of an exemplary computing architecture 1000 suitable for implementing various embodiments as previously described.
  • the computing architecture 1000 includes various common computing elements, such as one or more processors, co-processors, memory units, chipsets, controllers, peripherals, interfaces, oscillators, timing devices, video cards, audio cards, multimedia input/output (I/O) components, and so forth.
  • the computing architecture 1000 comprises a processing unit 1004 , a system memory 1006 and a system bus 1008 .
  • the processing unit 1004 can be any of various commercially available processors. Dual microprocessors and other multi-processor architectures may also be employed as the processing unit 1004 .
  • the system bus 1008 provides an interface for system components including, but not limited to, the system memory 1006 to the processing unit 1004 .
  • the system bus 1008 can be any of several types of bus structure that may further interconnect to a memory bus (with or without a memory controller), a peripheral bus, and a local bus using any of a variety of commercially available bus architectures.
  • the system memory 1006 may include various types of memory units, such as read-only memory (ROM), random-access memory (RAM), dynamic RAM (DRAM), Double-Data-Rate DRAM (DDRAM), synchronous DRAM (SDRAM), static RAM (SRAM), programmable ROM (PROM), erasable programmable ROM (EPROM), electrically erasable programmable ROM (EEPROM), flash memory, polymer memory such as ferroelectric polymer memory, ovonic memory, phase change or ferroelectric memory, silicon-oxide-nitride-oxide-silicon (SONOS) memory, magnetic or optical cards, or any other type of media suitable for storing information.
  • the system memory 1006 can include non-volatile memory 1010 and/or volatile memory 1012 .
  • a basic input/output system (BIOS) can be stored in the non-volatile memory 1010 .
  • the computer 1002 may include various types of computer-readable storage media, including an internal hard disk drive (HDD) 1014 , a magnetic floppy disk drive (FDD) 1016 to read from or write to a removable magnetic disk 1018 , and an optical disk drive 1020 to read from or write to a removable optical disk 1022 (e.g., a CD-ROM or DVD).
  • the HDD 1014 , FDD 1016 and optical disk drive 1020 can be connected to the system bus 1008 by a HDD interface 1024 , an FDD interface 1026 and an optical drive interface 1028 , respectively.
  • the HDD interface 1024 for external drive implementations can include at least one or both of Universal Serial Bus (USB) and IEEE 1394 interface technologies.
  • the drives and associated computer-readable media provide volatile and/or nonvolatile storage of data, data structures, computer-executable instructions, and so forth.
  • a number of program modules can be stored in the drives and memory units 1010 , 1012 , including an operating system 1030 , one or more application programs 1032 , other program modules 1034 , and program data 1036 .
  • the one or more application programs 1032 , other program modules 1034 , and program data 1036 can include, for example, the aerial imaging system 600 .
  • a user can enter commands and information into the computer 1002 through one or more wire/wireless input devices, for example, a keyboard 1038 and a pointing device, such as a mouse 1040 .
  • Other input devices may include a microphone, an infrared (IR) remote control, a joystick, a game pad, a stylus pen, touch screen, or the like.
  • These and other input devices are often connected to the processing unit 1004 through an input device interface 1042 that is coupled to the system bus 1008 , but can be connected by other interfaces such as a parallel port, IEEE 1394 serial port, a game port, a USB port, an IR interface, and so forth.
  • a monitor 1044 or other type of display device is also connected to the system bus 1008 via an interface, such as a video adaptor 1046 .
  • a computer typically includes other peripheral output devices, such as speakers, printers, and so forth.
  • the computer 1002 may operate in a networked environment using logical connections via wire and/or wireless communications to one or more remote computers, such as a remote computer 1048 .
  • the remote computer 1048 can be a workstation, a server computer, a router, a personal computer, portable computer, microprocessor-based entertainment appliance, a peer device or other common network node, and typically includes many or all of the elements described relative to the computer 1002 , although, for purposes of brevity, only a memory/storage device 1050 is illustrated.
  • the logical connections depicted include wire/wireless connectivity to a local area network (LAN) 1052 and/or larger networks, for example, a wide area network (WAN) 1054 .
  • LAN and WAN networking environments are commonplace in offices and companies, and facilitate enterprise-wide computer networks, such as intranets, all of which may connect to a global communications network, for example, the Internet.
  • When used in a LAN networking environment, the computer 1002 is connected to the LAN 1052 through a wire and/or wireless communication network interface or adaptor 1056 .
  • the adaptor 1056 can facilitate wire and/or wireless communications to the LAN 1052 , which may also include a wireless access point disposed thereon for communicating with the wireless functionality of the adaptor 1056 .
  • When used in a WAN networking environment, the computer 1002 can include a modem 1058 , be connected to a communications server on the WAN 1054 , or have other means for establishing communications over the WAN 1054 , such as by way of the Internet.
  • the modem 1058 , which can be internal or external and a wire and/or wireless device, connects to the system bus 1008 via the input device interface 1042 .
  • program modules depicted relative to the computer 1002 can be stored in the remote memory/storage device 1050 . It will be appreciated that the network connections shown are exemplary and other means of establishing a communications link between the computers can be used.
  • the computer 1002 is operable to communicate with wire and wireless devices or entities using the IEEE 802 family of standards, such as wireless devices operatively disposed in wireless communication (e.g., IEEE 802.11 over-the-air modulation techniques) with, for example, a printer, scanner, desktop and/or portable computer, personal digital assistant (PDA), communications satellite, any piece of equipment or location associated with a wirelessly detectable tag (e.g., a kiosk, news stand, restroom), and telephone.
  • the communication can be a predefined structure as with a conventional network or simply an ad hoc communication between at least two devices.
  • Wi-Fi networks use radio technologies called IEEE 802.11x (a, b, g, etc.) to provide secure, reliable, fast wireless connectivity.
  • a Wi-Fi network can be used to connect computers to each other, to the Internet, and to wire networks (which use IEEE 802.3-related media and functions).
  • Various embodiments may be implemented using hardware elements, software elements, or a combination of both.
  • hardware elements may include devices, components, processors, microprocessors, circuits, circuit elements (e.g., transistors, resistors, capacitors, inductors, and so forth), integrated circuits, application specific integrated circuits (ASIC), programmable logic devices (PLD), digital signal processors (DSP), field programmable gate array (FPGA), memory units, logic gates, registers, semiconductor device, chips, microchips, chip sets, and so forth.
  • Examples of software elements may include software components, programs, applications, computer programs, application programs, system programs, machine programs, operating system software, middleware, firmware, software modules, routines, subroutines, functions, methods, procedures, software interfaces, application program interfaces (API), instruction sets, computing code, computer code, code segments, computer code segments, words, values, symbols, or any combination thereof. Determining whether an embodiment is implemented using hardware elements and/or software elements may vary in accordance with any number of factors, such as desired computational rate, power levels, heat tolerances, processing cycle budget, input data rates, output data rates, memory resources, data bus speeds and other design or performance constraints, as desired for a given implementation.
  • An article of manufacture may comprise a storage medium to store logic.
  • Examples of a storage medium may include one or more types of computer-readable storage media capable of storing electronic data, including volatile memory or non-volatile memory, removable or non-removable memory, erasable or non-erasable memory, writeable or re-writeable memory, and so forth.
  • Examples of the logic may include various software elements, such as software components, programs, applications, computer programs, application programs, system programs, machine programs, operating system software, middleware, firmware, software modules, routines, subroutines, functions, methods, procedures, software interfaces, application program interfaces (API), instruction sets, computing code, computer code, code segments, computer code segments, words, values, symbols, or any combination thereof.
  • API application program interfaces
  • an article of manufacture may store executable computer program instructions that, when executed by a computer, cause the computer to perform methods and/or operations in accordance with the described embodiments.
  • the executable computer program instructions may include any suitable type of code, such as source code, compiled code, interpreted code, executable code, static code, dynamic code, and the like.
  • the executable computer program instructions may be implemented according to a predefined computer language, manner or syntax, for instructing a computer to perform a certain function.
  • the instructions may be implemented using any suitable high-level, low-level, object-oriented, visual, compiled and/or interpreted programming language.
  • Some embodiments may be described using the expressions “coupled” and “connected,” along with their derivatives. These terms are not necessarily intended as synonyms for each other. For example, some embodiments may be described using the terms “connected” and/or “coupled” to indicate that two or more elements are in direct physical or electrical contact with each other. The term “coupled,” however, may also mean that two or more elements are not in direct contact with each other, but yet still co-operate or interact with each other.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Geometry (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Studio Devices (AREA)
  • Image Processing (AREA)

Abstract

Techniques for atmospheric and solar correction of aerial images are described. An apparatus may comprise an atmospheric and solar component arranged for execution by a logic device and operative to correct solar and atmosphere artifacts from an aerial image. The atmospheric and solar component may comprise an image information component operative to generate an image record for each aerial image of a group of aerial images, the image record comprising statistical information and image context information for each aerial image, a filter generation component operative to generate an atmospheric filter and a solar filter from the statistical information and the image context information stored in the image records, and an image correction component operative to correct atmospheric and solar artifacts from the aerial image using the respective atmospheric filter and solar filter. Other embodiments are described and claimed.

Description

RELATED APPLICATIONS
This application is related to commonly owned U.S. patent application Ser. No. 12/624,412 titled “Multi-Resolution Digital Large Format Camera With Multiple Detector Arrays” filed on Nov. 24, 2009, issued as U.S. Pat. No. 8,665,316, which is hereby incorporated by reference in its entirety.
BACKGROUND
Aerial imaging refers to capturing images of a surface of a planet from an elevated position, such as an aircraft or satellite. Aerial imaging is used in cartography, such as for providing photogrammetric surveys, which are often a basis for topographic maps. Depending on a given elevation for a camera, aerial imaging may capture undesired visual elements obscuring surface objects in an image. Removing undesired visual elements may be difficult, however, particularly when multiple aerial images are stitched together to form a composite aerial image, such as an “ortho mosaic” or aerial map. It is with respect to these and other considerations that the present improvements have been needed.
SUMMARY
This Summary is provided to introduce a selection of concepts in a simplified form that are further described below in the Detailed Description. This Summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended as an aid in determining the scope of the claimed subject matter.
Embodiments are generally directed to advanced image processing techniques for aerial images. Some embodiments are particularly directed to advanced image processing techniques specifically designed to correct atmospheric and solar influences in aerial images. As elevation for a camera increases, the camera may acquire an aerial image with an increasing number of atmospheric and solar influences, such as light scattering and absorption due to aerosols in the atmosphere. The atmospheric and solar influences potentially obscure surface objects in an aerial image acquired by the aerial camera, among other problems.
Embodiments implement various atmospheric and solar correction techniques to correct atmospheric and solar influences in aerial images. In one embodiment, for example, an atmospheric and solar component is arranged for execution by a logic device and operative to correct solar and atmosphere artifacts from an aerial image. The atmospheric and solar component may comprise, among other elements, an image information component operative to generate an image record for each aerial image of a group of aerial images, the image record comprising statistical information and image context information for each aerial image. The atmospheric and solar component may further comprise a filter generation component operative to generate an atmospheric filter and a solar filter from the statistical information and the image context information stored in the image records. The atmospheric and solar component may still further comprise an image correction component operative to correct atmospheric and solar artifacts from the aerial image using the respective atmospheric filter and solar filter. In this manner, the atmospheric and solar component may provide enhanced aerial images having a greater degree of visual acuity and more accurate reproduction of surface objects. Other embodiments are described and claimed.
These and other features and advantages will be apparent from a reading of the following detailed description and a review of the associated drawings. It is to be understood that both the foregoing general description and the following detailed description are explanatory only and are not restrictive of aspects as claimed.
BRIEF DESCRIPTION OF THE DRAWINGS
FIG. 1 is a schematic diagram showing aspects of a multi-resolution digital large format camera with multiple detector arrays provided in one embodiment presented herein;
FIG. 2 is a schematic diagram showing the footprint of a primary camera system overlaid with the footprints of four secondary camera systems in a large format digital camera presented in one embodiment disclosed herein;
FIG. 3 is a perspective diagram showing a perspective view of the footprint of a primary camera system and the footprints of four secondary camera systems in a large format digital camera presented in one embodiment disclosed herein;
FIG. 4A is a schematic diagram showing a top-down view that illustrates the overlap between the footprint of a sequence of consecutive images taken with a primary camera system and the footprint of a sequence of consecutive images taken with four secondary camera systems in a large format digital camera presented one embodiment disclosed herein;
FIG. 4B is a schematic diagram showing a perspective view that illustrates the overlap between the footprint of a sequence of consecutive images taken along a flight line with a primary camera system and the footprint of a sequence of consecutive images taken with four secondary camera systems in a large format digital camera presented one embodiment disclosed herein;
FIG. 5 is a flow diagram showing one illustrative process presented herein for the airborne optical registration of large areas using a multi-resolution digital large format camera with multiple detector arrays provided in one embodiment presented herein;
FIG. 6 is a block diagram showing an aerial imaging system for correcting aerial images acquired by a multi-resolution digital large format camera with multiple detector arrays provided in one embodiment presented herein;
FIG. 7 is a block diagram showing an atmospheric and solar component for an aerial imaging system for correcting aerial images acquired by a multi-resolution digital large format camera with multiple detector arrays provided in one embodiment presented herein;
FIG. 8 is a flow diagram showing one illustrative process presented herein for correcting atmospheric influences in aerial images acquired by a multi-resolution digital large format camera with multiple detector arrays provided in one embodiment presented herein;
FIG. 9 is a flow diagram showing one illustrative process presented herein for correcting solar influences in aerial images acquired by a multi-resolution digital large format camera with multiple detector arrays provided in one embodiment presented herein; and
FIG. 10 is a block diagram of a computing architecture suitable for implementing an atmospheric and solar component for an aerial imaging system for correcting aerial images acquired by a multi-resolution digital large format camera with multiple detector arrays provided in one embodiment presented herein.
DETAILED DESCRIPTION
Embodiments are generally directed to advanced image processing techniques for aerial images. Some embodiments are particularly directed to advanced image processing techniques specifically designed to correct for atmospheric and solar influences of aerial images acquired using, for example, a large format aerial camera.
In general, an atmosphere for a planet is a hostile environment for aerial image acquisition and processing. At higher elevations, a large format digital camera may acquire aerial images having undesired visual elements obscuring surface objects in the aerial images. For instance, the aerial images may have a number of atmospheric and solar influences, such as light scattering and absorption due to aerosols in the atmosphere. Light scattering may include, for example, Rayleigh scattering and Mie scattering. Aerosols may include gaseous, liquid and solid particles. At a normal flight altitude of 18,500 feet, for example, atmospheric and solar influences introduce a range-dependent haze (e.g., brightness changes) and discoloration (e.g., blue-tint) on each pixel in an aerial image. This range-dependent haze and discoloration should be corrected and separated from the actual scene content on the ground or surface of the earth. Further, in larger image acquisitions, such as those spanning multiple days, a significant difference in solar irradiance in two or more flight lines may be introduced. This brightness difference makes subsequent correction, such as a bidirectional reflectance distribution function (BRDF) correction, more difficult or expensive to perform. As a result, there is a significant need to correct for atmospheric and solar influences obscuring surface objects in an aerial image acquired by a large format digital camera.
To correct for any undesired atmospheric influence, advanced image processing techniques attempt to model and derive a robust atmosphere linear mask that is optimal for each pixel in each aerial image, while remaining insensitive to (or independent of) scene content on the ground. To correct for any undesired solar influence, advanced image processing techniques attempt to derive a robust solar irradiance model and a camera model that can adjust brightness for each image to a preset mean brightness through scaling techniques.
Advanced image processing techniques as described herein may provide significant advantages over conventional image processing techniques. For example, previous solutions such as dark object subtraction (DOS) techniques derive only an offset for each channel of one aerial image and only model an additive effect of the atmosphere. The advanced image processing techniques derive a linear model for each sub-region of an aerial image, and model both an additive effect and a multiplicative effect of the atmosphere. The advanced image processing techniques also consider adjacent images along a flight line, and therefore provide greater accuracy and less sensitivity to scene content. In other words, the advanced image processing techniques are locally optimal and globally consistent. Further, the advanced image processing techniques alleviate large brightness differences due to solar irradiance differences from larger image acquisitions spanning lengthy time intervals, such as across several hours or even multiple days.
The following detailed description will comprise two major sections. A first section provides a detailed description and examples for a large format aerial camera suitable for acquiring and outputting aerial images as illustrated in FIGS. 1-5. A second section provides a detailed description and examples for an aerial imaging system implementing advanced image processing techniques arranged to process aerial images acquired by a large format aerial camera as illustrated in FIGS. 6-10. The aerial imaging system may include an atmospheric and solar component arranged to correct for atmospheric and solar influences in aerial images. The corrected aerial images may be used for a number of use scenarios typical for digital aerial images, such as generating an ortho mosaic (aerial map) that is largely free of atmosphere and solar influences, among other use scenarios.
Large Format Aerial Camera
In various embodiments, a large format aerial camera may be used to acquire, capture or record aerial images. The aerial images may comprise still images (e.g., pictures) or moving images (e.g., video). In some embodiments, the large format aerial camera may be suitable for use in airborne optical registration of large surface areas of the earth, such as entire countries, continents, or even the entire world.
In one embodiment, for example, aerial images may be acquired by a large format aerial camera implemented as a multi-resolution large format digital camera having multiple optical systems and detector arrays, such as a MICROSOFT® ULTRACAM-G aerial camera, made by Microsoft Corporation, Redmond, Wash. Although some embodiments may be described with reference to aerial images acquired by a multi-resolution large format digital camera having multiple optical systems and detector arrays, it may be appreciated that the aerial imaging system and advanced atmospheric and solar component may correct aerial images acquired by any aerial digital camera. The embodiments are not limited in this context.
A multi-resolution large format digital camera may be capable of producing aerial images at different photographic scales. The multi-resolution large format digital camera can produce panchromatic images having a wide-angle geometry that are suitable for use in a photogrammetric workflow that includes image-based georeferencing and digital surface modeling. The multi-resolution large format digital camera can also concurrently produce multiple color images having a narrow-angle geometry suitable for use in a photogrammetric workflow that includes “ortho image” production. An ortho image is an image that shows ground objects in an orthographic projection. Because a single flight utilizing the multi-resolution large format digital camera can produce both wide-angle and narrow-angle images, the cost of mapping a large area can be reduced as compared to previous solutions.
The multi-resolution large format digital camera may include a primary camera system and two or more secondary camera systems. The primary camera system is configured for collecting panchromatic image data and the secondary camera systems are configured for collecting color image data. Each of the secondary camera systems has an optical system that has a longer focal length than the optical system of the primary camera system. The primary camera system and the secondary camera systems may be mounted within a common housing suitable for installation and use within an aircraft.
The primary camera system has an electro optical detector array capable of capturing the panchromatic image data. Each of the secondary camera systems has an electro optical detector array capable of capturing the color image data. The resolution of the electro optical detector in each of the secondary camera systems is greater than the resolution of the electro optical detector in the primary camera system. In some cases, the radiometric resolution of the secondary camera systems may be greater than the radiometric resolution of the primary camera system.
The primary camera system and the secondary camera systems may be configured such that the large format digital camera can produce images at two different image scales offering two different footprints. Images produced by the primary camera system have a larger footprint and are larger in size than those produced by the secondary camera systems and offer information for performing image-based georeferencing by means of photogrammetric triangulation. Images produced by the secondary camera systems have a smaller footprint and are smaller in size than those produced by the primary camera system and offer a high-resolution narrow angle color image. The color images produced by the secondary camera systems may be utilized as a source data set for high-resolution ortho image production. The footprint of the images generated by the secondary camera systems may be configured to overlap the footprint of the primary camera system in a direction perpendicular to a flight path.
The multi-resolution large format digital camera may be configured to generate a sequence of consecutive images along a flight line. The multi-resolution large format digital camera may be further configured such that the primary camera system produces a sequence of consecutive panchromatic images that overlap one another. The secondary camera systems may be configured to produce a sequence of consecutive color images that overlap one another and the images produced by the primary camera system. The overlap between consecutive panchromatic images may be greater than the overlap between consecutive color images.
FIG. 1 is a schematic diagram showing aspects of a large format digital camera 100 having multiple optical systems 106A-106B and detector arrays 110A-110B provided in one embodiment presented herein. As shown in FIG. 1, the large format digital camera 100 includes a primary camera system 104A and two or more secondary camera systems 104B-104N. Although FIG. 1 illustrates two secondary camera systems 104B-104N, it should be appreciated that other embodiments might include additional secondary camera systems 104B-104N. For instance, in an embodiment described below, the large format digital camera 100 includes four secondary camera systems 104B-104N.
According to one embodiment, the primary camera system 104A includes an optical system 106A that has a focal length 108A. Each of the secondary camera systems 104B-104N has an optical system 106B that has a focal length 108B that is longer than the focal length 108A of the optical system 106A. In this manner, the secondary camera systems 104B-104N are configured to produce images having a narrower field of view than images produced by the primary camera system 104A. Images produced by the primary camera system 104A have a wider field of view than images produced by the secondary camera systems 104B-104N. The optical systems 106A-106B may include other conventional optical elements to produce a suitable image at the desired focal length.
According to one implementation, the primary camera system 104A is configured with an electro optical detector array 110A capable of capturing panchromatic image data 112. As known in the art, a panchromatic image sensor, such as the electro optical detector array 110A, is sensitive to all or most of the entire visible spectrum. According to embodiments, each of the secondary camera systems 104B-104N is configured with an electro optical detector array 110B capable of capturing color image data 116. For instance, the secondary camera systems 104B-104N might be equipped with a suitable charge coupled device (“CCD”) array configured for capturing the color image data 116A-116N, respectively. According to embodiments, the camera system presented herein is a frame camera (also referred to as a framing camera), as opposed to a camera that utilizes push-broom sensing.
It should be appreciated that the detector arrays 110A-110B comprise arrays of individual electro-optical detectors, e.g., semiconductor devices that output an electric signal, the magnitude of which is dependent on the intensity of light energy incident on such electro-optical detector. Therefore, the signal from each electro-optical detector in the arrays 110A-110B is indicative of light energy intensity from a pixel area of the portion of the object or terrain being photographed, and the signals from all of the individual electro-optical detectors in the arrays 110A-110B are indicative of light energy intensity from all of the pixel areas of the portion of the object or terrain being photographed. Consequently, the signals from the electro-optical detectors in each of the detector arrays 110A-110B, together, are indicative of the pattern of light energy from the portion of the object being photographed, so a sub-image of the portion of the object can be produced from such signals. First, however, the signals are amplified, digitized, processed, and stored, as is well known to those of ordinary skill in the art.
The electro-optical detector arrays 110A-110B are connected electrically by suitable conductors to a control circuit (not shown), which includes at least a microprocessor, input/output circuitry, memory, and a power supply for driving the electro-optical detector arrays 110A-110B, retrieving image data from the arrays 110A-110B, and storing the image data. Other data processing functions, for example combining images and/or performing image display functions, may be accomplished within the large format digital camera 100 or by other external data processing equipment.
According to implementations, the resolution of the electro optical detector arrays 110B in the secondary camera systems 104B-104N is greater than the resolution of the electro optical detector array 110A in the primary camera system 104A. In this manner, the large format digital camera 100 can produce a panchromatic image file 114 from the primary camera system 104A using a wide-angle geometry that is suitable for use in a photogrammetric workflow that includes image-based georeferencing and digital surface modeling. The large format digital camera 100 can also simultaneously produce multiple higher-resolution color image files from the secondary camera systems 104B-104N using a narrow-angle geometry suitable for use in a photogrammetric workflow that includes ortho image production.
As described briefly above, the primary camera system 104A and the secondary camera systems 104B-104N might be mounted within a common housing 102. In this embodiment, a front glass plate 120 might be mounted within the housing 102 to protect the optical systems 106A-106B. In alternate implementations, the primary camera system 104A and the secondary camera systems 104B-104N are mounted in separate housings (not shown). In both cases, the primary camera system 104A, the secondary camera systems 104B-104N, and the housing 102 are configured for mounting and use within an aircraft.
FIG. 2 is a schematic diagram showing the footprint 202 of the primary camera system 104A overlaid with footprints 204A-204D of the secondary camera systems 104B-104N in the large format digital camera 100 according to one embodiment disclosed herein. In this embodiment, the large format digital camera 100 includes four secondary camera systems 104B-104N configured with the footprints 204A-204D illustrated in FIG. 2, respectively. As shown in FIG. 2, the primary camera system 104A and the secondary camera systems 104B-104N are configured in one embodiment such that the large format digital camera 100 can produce overlapping images at two different image scales offering two different footprints 202 and 204A-204D. According to other embodiments, two primary camera systems 104A and four secondary camera systems 104B-104N are utilized.
According to one embodiment, images produced by the primary camera system 104A have a larger footprint 202 and are larger in size than those produced by the secondary camera systems 104B-104N. Images produced by the secondary camera systems 104B-104N have smaller footprints 204A-204D and are smaller in size than those produced by the primary camera system 104A and offer a higher-resolution narrow angle color image.
As also illustrated in FIG. 2, the four secondary camera systems 104B-104N may be configured such that the footprints 204A-204D of the secondary camera systems 104B-104N cover the footprint 202 of the primary camera system 104A in a direction perpendicular to a flight line 400. In this way, the footprints 204A-204D of the four secondary camera systems 104B-104N cover a “stripe” of the footprint 202 of the primary camera system 104A in a direction perpendicular to the flight line 400. By overlapping the footprints 202 and 204A-204D in the manner shown in FIG. 2, a portion of the images produced by the primary camera system 104A can be enhanced by the images produced by the secondary camera systems 104B-104N.
FIG. 3 provides a perspective view of the footprint 202 of the primary camera system 104A and the footprints 204A-204D of the four secondary camera systems 104B-104N when an image is taken from a common point 302 by the primary camera system 104A and the four secondary camera systems 104B-104N.
FIG. 4A shows a top-down view that illustrates the overlap between the footprints 202A-202D of a sequence of consecutive images taken with the primary camera system 104A and the footprints 204A-204D of a sequence of consecutive images taken with four secondary camera systems 104B-104N in the large format digital camera 100 in one embodiment disclosed herein. As discussed briefly above, the large format digital camera 100 may be mounted and configured for use within an aircraft (not shown). When the aircraft is flown according to a well-defined flight line 400, the large format digital camera 100 may be configured to capture a sequence of images along the flight line 400. FIG. 4A illustrates the footprints 202A-202D of a sequence of images taken using the primary camera system 104A and the footprints 204A-204D of a sequence of images taken by the four secondary camera systems 104B-104N along the flight line 400.
As illustrated in FIG. 4A, the large format camera 100 may be further configured such that the primary camera system 104A produces a sequence of consecutive panchromatic images that have footprints 202A-202D wherein consecutive sequential images overlap one another. The secondary camera systems 104B-104N may similarly be configured to produce a sequence of consecutive color images that have footprints 204A-204D wherein consecutive sequential images overlap one another and also overlap the images produced by the primary camera system 104A. The overlap between the footprints of consecutive panchromatic images may be greater than the overlap between the footprints of consecutive color images.
FIG. 4B is a perspective diagram illustrating the overlap between the footprints 202A-202D of a sequence of consecutive images taken on several flight lines 400 with the primary camera system 104A and the footprints 204A-204D of a sequence of consecutive images taken with four secondary camera systems 104B-104N in the large format digital camera 100 in one embodiment disclosed herein. If, as illustrated in FIG. 4B, images are produced by the primary camera system 104A and the secondary camera systems 104B-104N along multiple well-defined flight lines by means of aerial photogrammetric image acquisition, the footprints 202 of the primary camera system 104A overlap one another in the sequence of exposures along the flight lines. The footprints 204A-204D of the secondary camera systems 104B-104N also overlap with the footprints 202A-202D of the primary camera system 104A and the footprints 204A-204D of the four secondary camera systems 104B-104N.
Along the flight line 400 images are therefore produced in such a way that the sequence of images produced by the primary camera system 104A and the images produced by the secondary camera systems 104B-104N create continuous image strips of overlapping images. The flight lines may be defined in such a way that the large format digital camera 100 captures images covering an entire project area.
According to various embodiments, image acquisition by the secondary camera systems 104B-104N may be triggered substantially simultaneously with image acquisition by the primary camera system 104A and, accordingly, images from the secondary camera systems 104B-104N may be acquired at the same position and with the same camera attitude as images from the primary camera system 104A. Alternatively, the trigger for the secondary camera systems 104B-104N may be independent from the primary camera system 104A, e.g., may be at a higher rate than images captured by the primary camera system. Either embodiment, as well as any combination thereof, is contemplated to be within the scope of embodiments presented herein.
When the primary camera system 104A and the secondary camera systems 104B-104N are triggered at the same time, the images produced by the secondary camera systems 104B-104N may be registered to the images produced by the primary camera system 104A using the same trigger event. Additionally, images produced by the secondary camera systems 104B-104N may be calibrated to images of the primary camera system 104A through the use of a precisely surveyed and well-structured object (known as a “calibration object”).
The images of the secondary camera systems 104B-104N may also be stitched to the images of the primary camera system 104A using traditional methods. Additionally, the images generated by the primary camera system 104A can be used to reconstruct the three-dimensional form of an object (for instance, the buildings of a city by means of a digital surface model), and the images of the secondary camera systems 104B-104N, with a higher geometric resolution, may be used to extract high-resolution photo texture which can then be used for the production of ortho image maps.
FIG. 5 provides additional details regarding the embodiments presented herein for a large format digital camera 100 having multiple optical systems and detector arrays. In particular, FIG. 5 is a flow diagram showing a routine 500 that illustrates one process presented herein for the airborne optical registration of large areas using the large format digital camera 100 described above.
The routine 500 begins at operation 502, where the large format digital camera 100 is calibrated. As discussed above, the large format digital camera 100 may be calibrated using a calibration object such that the footprint of images produced by the secondary camera systems 104B-104N overlap the footprint of images produced by the primary camera system 104A in the manner discussed above. As also discussed above, the large format digital camera 100 may be installed in an aircraft and utilized to capture ground images as the aircraft is flown along a well-defined flight line. Such images may be captured and stored in an appropriate digital storage device integrated with or external to the large format digital camera 100.
From operation 502, the routine 500 proceeds to operation 504 where panchromatic image files 114 are received from the primary camera system 104A. The routine then proceeds to operation 506, where the color image files 118A-118N are received from the secondary camera systems 104B-104N. Once the image files have been received from all of the camera systems 104A-104N, the routine 500 proceeds to operation 508, where the image files 114 from the primary camera system 104A are co-registered with the image files 118A-118N from the secondary camera systems 104B-104N.
From operation 508, the routine 500 proceeds to operation 510, where the image files 114 from the primary camera system 104A are utilized in a photogrammetric workflow that includes image-based georeferencing and digital surface modeling. From operation 510, the routine 500 proceeds to operation 512, where the image files 118A-118N from the secondary camera systems 104B-104N are utilized for ortho image production. The routine 500 proceeds from operation 512 to operation 514, where it ends.
Aerial Imaging System
FIG. 6 is a block diagram showing an aerial imaging system 600. The aerial imaging system 600 may be used for processing and correcting aerial images 602-a acquired by an aerial digital camera, such as the large format digital camera 110, for example. It may be appreciated that other aerial digital cameras may be used as well, and the embodiments are not limited to the exemplary large format digital camera 110.
It is worthy to note that "a" and "b" and "c" and similar designators as used herein are intended to be variables representing any positive integer. Thus, for example, if an implementation sets a value for a=5, then a complete set of aerial images 602-a may include aerial images 602-1, 602-2, 602-3, 602-4 and 602-5. The embodiments are not limited in this context.
The aerial imaging system 600 may comprise a computer-implemented system having one or more components, such as an atmospheric and solar component 610, for example. As used herein the terms “system” and “component” are intended to refer to a computer-related entity, comprising either hardware, a combination of hardware and software, software, or software in execution. For example, a component can be implemented as a process running on a processor, a processor, a hard disk drive, multiple storage drives of optical and/or magnetic storage medium, an object, an executable, a thread of execution, a program, and/or a computer. By way of illustration, both an application running on a server and the server can be a component. One or more components can reside within a process and/or thread of execution, and a component can be localized on one computer and/or distributed between two or more computers as desired for a given implementation. The embodiments are not limited in this context.
In the illustrated embodiment shown in FIG. 6, the aerial imaging system 600 may be implemented as part of an electronic device. Examples of an electronic device may include without limitation a digital camera, an aerial digital camera, a large format digital camera, a mobile device, a personal digital assistant, a mobile computing device, a smart phone, a cellular telephone, a handset, a one-way pager, a two-way pager, a messaging device, a computer, a personal computer (PC), a desktop computer, a laptop computer, a notebook computer, a handheld computer, a server, a server array or server farm, a web server, a network server, an Internet server, a work station, a mini-computer, a main frame computer, a supercomputer, a network appliance, a web appliance, a distributed computing system, multiprocessor systems, processor-based systems, consumer electronics, programmable consumer electronics, television, digital television, set top box, wireless access point, base station, subscriber station, mobile subscriber center, radio network controller, router, hub, gateway, bridge, switch, machine, or combination thereof.
In the illustrated embodiment shown in FIG. 6, the aerial imaging system 600 may comprise the large format digital camera 110, the atmospheric and solar component 610, and a digital display 620. Although the aerial imaging system 600 as shown in FIG. 6 has a limited number of elements in a certain topology, it may be appreciated that the aerial imaging system 600 may include more or fewer elements in alternate topologies as desired for a given implementation. For instance, the aerial imaging system 600 may further comprise other elements typically found in an aerial imaging system or an electronic device, such as computing components, communications components, power supplies, input devices, output devices, and so forth. The embodiments are not limited in this context.
The large format digital camera 110 may acquire and output a series of aerial images 602-a, such as aerial images as previously described with reference to FIGS. 1-5. In one embodiment, the large format digital camera 110 may be part of a camera platform used to elevate the camera 110 above a surface of the earth. In one embodiment, the camera platform may comprise an airplane flying a flight mission over a defined flight line, such as flight line 400, to acquire aerial images 602-a of a particular surface area of the earth for airborne optical registration. Alternatively, a defined flight line may comprise a typical north-south flight line pattern for an aerial photograph acquisition. Such an acquisition produces overlapping aerial images 602-a, which are subsequently processed into a seamless ortho mosaic or aerial map.
The atmospheric and solar component 610 may implement various image processing techniques to correct atmospheric and solar influences from the aerial images 602-a taken by the large format digital camera 110. To correct for undesired atmospheric influences, the atmospheric and solar component 610 derives a robust atmosphere mask which is applied to each pixel of an aerial image 602-a while remaining insensitive to (or independent of) scene content on the ground. The atmospheric and solar component 610 models undesired atmosphere effects with a linear relationship that captures two physical phenomena arising from light scattering and absorption in the atmosphere. The first physical phenomenon is additive path radiance caused by scattering, which is modeled by an intercept term. The second physical phenomenon is multiplicative attenuation caused by scattering and absorption, which is modeled by a gain term. The atmospheric and solar component 610 uses the linear relationship as a good approximation within a sub-region of an aerial image 602-a, where the distance to the large format digital camera 110 can be considered constant and atmosphere effects are therefore the same. To model the linear relationship accurately for each pixel in an aerial image 602-a, the atmospheric and solar component 610 models the atmosphere in a single aerial image 602-a by dividing the aerial image 602-a into a series of grids. After dividing an aerial image 602-a into grids, however, the number of samples in each grid becomes small relative to the entire aerial image 602-a, and the atmospheric influence becomes harder to model from fewer samples. Depending on scene content, statistics derived from each grid might be insufficient to derive a robust atmosphere mask.
The atmospheric and solar component 610 solves this problem by aggregating statistics of grids across groups of aerial images 602-a, such as aerial images 602-a that have been captured close in time and/or on a same flight line. For example, the atmospheric and solar component 610 groups a series of aerial images 602-a taken along a same flight line. Pixels in one grid have approximately the same distance to the large format digital camera 110, and therefore undergo the same or similar atmospheric effects. Samples taken from a series of aerial images 602-a, rather than using a single grid or a single aerial image 602-a, may be used to derive an atmosphere mask insensitive to scene content. As a result, the atmospheric and solar component 610 can correct for atmosphere effects on each pixel of an aerial image 602-a by modeling and separating atmospheric effects from scene content.
To correct for undesired solar influences, which have a global effect on each aerial image 602-a in an aerial photography acquisition, the atmospheric and solar component 610 derives a robust solar irradiance model and a camera model that can adjust the brightness of each aerial image 602-a to a preset mean brightness. The atmospheric and solar component 610 models solar irradiance by using a sun elevation angle with a multiplicative factor. The atmospheric and solar component 610 models light received by the large format digital camera 110 using a multiplicative factor comprising an exposure time and an aperture size. The two multiplicative factors are then combined into a single factor used to scale each aerial image 602-a to a common mean brightness. After scaling operations, brightness levels across the aerial images 602-a are more uniform.
As shown in FIG. 6, the atmospheric and solar component 610 may receive as input one or more aerial images 602-a from the large format digital camera 110, perform advanced image processing operations on the aerial images 602-a, and output corrected aerial images 604-b. For instance, the atmospheric and solar component 610 may adjust color and brightness both globally and locally to correct for atmospheric and solar influences in aerial images 602-a to form corrected aerial images 604-b. The corrected aerial images 604-b may be used to generate a seamless and color-balanced ortho mosaic. In some cases, the corrected aerial images 604-b can be further refined using subsequent image processing techniques, such as BRDF correction and seamline design techniques, for example.
The aerial imaging system 600 may present the corrected aerial images 604-b on the display 620. The display 620 may comprise any electronic display for presentation of visual, tactile or auditory information. Examples of the display 620 may include without limitation a cathode ray tube (CRT), bistable display, electronic paper, nixie tube, vector display, a flat panel display, a vacuum fluorescent display, a light-emitting diode (LED) display, an electroluminescent display (ELD), a plasma display panel (PDP), a liquid crystal display (LCD), a thin-film transistor (TFT) display, an organic light-emitting diode (OLED) display, a surface-conduction electron-emitter display (SED), a laser television, carbon nanotubes, nanocrystal displays, a head-mounted display, and any other display consistent with the described embodiments. In one embodiment, the display 620 may be implemented as a touchscreen display. A touchscreen display is an electronic visual display that can detect the presence and location of a touch within the display area. The touch may be from a finger, hand, stylus, light pen, and so forth. The embodiments are not limited in this context.
FIG. 7 is a more detailed block diagram of the atmospheric and solar component 610 of the aerial imaging system 600. Although the atmospheric and solar component 610 as shown in FIG. 7 has a limited number of elements in a certain topology, it may be appreciated that the atmospheric and solar component 610 may include more or fewer elements in alternate topologies as desired for a given implementation.
The atmospheric and solar component 610 may include, among other elements, an image information component 710, a filter generation component 720 and an image correction component 730. The components 710, 720 and 730 may be communicatively coupled via various types of communications media. The components 710, 720 and 730 may coordinate operations between each other. The coordination may involve the uni-directional or bi-directional exchange of information. For instance, the components 710, 720 and 730 may communicate information in the form of signals communicated over the communications media. The information can be implemented as signals allocated to various signal lines. In such allocations, each message is a signal. Further embodiments, however, may alternatively employ data messages. Such data messages may be sent across various connections. Exemplary connections include parallel interfaces, serial interfaces, and bus interfaces.
The image information component 710 may be generally arranged to generate image records 742-c associated with aerial images 602-a. An image record 742-c may be associated with a single aerial image 602-a, or in some cases, multiple aerial images 602-a. The image information component 710 may store the image records 742-c in a datastore 740. The datastore 740 and stored image records 742-c are accessible by the filter generation component 720 and the image correction component 730 during different image processing phases.
The image information component 710 may retrieve image context information 712 associated with one or more aerial images 602-a. The image context information 712 may comprise, among other types of information, image information, camera information and camera platform information. The image context information 712 may be retrieved from the datastore 740, an aerial image 602-a, a file or package containing an aerial image 602-a, or a remote datastore for another device (e.g., the large format digital camera 110 or an airplane computer). The image information component 710 may store the image context information 712 in an image record 742-c associated with a given aerial image 602-a.
The image context information 712 may comprise image information associated with one or more aerial images 602-a acquired by the large format digital camera 110. Examples of image information may include without limitation an image identifier, an image date, an image time, an image location, an image tag, an image label, or other image metadata.
The image context information 712 may comprise camera information associated with the large format digital camera 110 used to acquire one or more aerial images 602-a. Examples of camera information may include without limitation a camera identifier, a camera exposure time (e.g., shutter speed), a camera aperture size, a camera location (e.g., latitude, longitude and altitude coordinates), and other camera metadata.
The image context information 712 may comprise camera platform information associated with a platform for the large format digital camera 110 used to acquire a group of aerial images 602-a. Examples of camera platform information may include without limitation a camera platform identifier, a flight mission identifier, a flight line identifier, a camera platform location (e.g., latitude, longitude and altitude coordinates), and other camera platform metadata.
Along with image context information 712, the image information component 710 may retrieve or generate statistical information 714 associated with one or more aerial images 602-a. The image information component 710 may receive one or more aerial images 602-a, and generate statistical information 714 from information contained within the aerial images 602-a, such as pixel values. The image information component 710 may store the statistical information 714 in a same image record 742-c used to store image context information 712 associated with a given aerial image 602-a or set of aerial images 602-a. Alternatively, the image information component 710 may store statistical information 714 in a different image record 742-c, and link the different image records using a globally unique identifier (GUID).
The filter generation component 720 may generate an atmospheric filter 722 and a solar filter 724 from statistical information 714 and image context information 712 stored in image records 742-c. In one embodiment, for example, each atmospheric filter 722 and solar filter 724 may comprise one or more correction masks derived from the image context information 712 and statistical information 714. Each atmospheric filter 722 and solar filter 724 may be associated with one or more aerial images 602-a. The filter generation component 720 may retrieve information from the image records 742-c, process the retrieved information to form the atmospheric filter 722 and the solar filter 724, and store processed information for the filters 722, 724 in corresponding filter image records 744-d in the datastore 740.
The image correction component 730 may be generally arranged to correct atmospheric and solar artifacts from a target aerial image 732 using a respective atmospheric filter 722 and solar filter 724. A target aerial image 732 may comprise an aerial image that is currently being focused on for a particular set of image processing operations, such as an aerial image for which a set of filters 722, 724 are being designed, or an aerial image that is currently being corrected. The image correction component 730 may retrieve an appropriate atmospheric filter 722 and solar filter 724 from the image records 742-c and/or the filter image records 744-d for a given target aerial image 732, perform correction operations on the target aerial image 732, and output a corrected target aerial image 734. The corrected target aerial image 734 may be written to the datastore 740, or transported in real-time to another device.
Operations for the aerial imaging system 600 and the atmospheric and solar component 610 may be further described with reference to logic flows 800, 900. It may be appreciated that the representative logic flows 800, 900 do not necessarily have to be executed in the order presented, or in any particular order, unless otherwise indicated. Moreover, various activities described with respect to the logic flows 800, 900 can be executed in serial or parallel fashion. The logic flows 800, 900 may be implemented using one or more hardware elements and/or software elements of the described embodiments or alternative elements as desired for a given set of design and performance constraints. For example, the logic flows 800, 900 may be implemented as logic (e.g., computer program instructions) for execution by a logic device (e.g., a general-purpose or specific-purpose computer).
FIG. 8 illustrates one embodiment of a logic flow 800. The logic flow 800 may be representative of some or all of the operations executed by one or more embodiments described herein, such as the atmospheric and solar component 610 of the aerial imaging system 600, for example. In particular, the logic flow 800 may be implemented by the atmospheric and solar component 610 to correct for atmospheric influences in an aerial image 602-a.
In the illustrated embodiment shown in FIG. 8, the logic flow 800 may receive multiple aerial images at block 802. For example, the atmospheric and solar component 610 may receive multiple aerial images 602-a from the large format digital camera 110. In one embodiment, the aerial images 602-a may be temporally sequential aerial images 602-a. The embodiments, however, are not limited to this example.
The logic flow 800 may generate statistical information for each aerial image at block 804. For example, the image information component 710 may generate statistical information 714 for each aerial image 602-a. In one embodiment, the atmospheric and solar component 610 may generate statistical information 714 comprising a shadow percentile value and a standard deviation value for each grid within an aerial image 602-a, among other types of statistical information 714.
To generate statistical information 714, the image information component 710 may divide an aerial image 602-a into multiple grids. For each aerial image 602-a, the image information component 710 may divide an aerial image 602-a into an m-by-n grid, with each grid having the same dimensions, such as width and height. Each grid is centered at locations calculated using Equation (1) and Equation (2), as follows:
Grid_center_x(column)=(image_width/m)*column+(image_width/2/m)   Equation (1)
Grid_center_y(row)=(image_height/n)*row+(image_height/2/n)   Equation (2)
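For illustration, a minimal sketch of Equations (1) and (2) in Python, assuming zero-based row and column indices; the function name and example values are hypothetical, not taken from the patent:

```python
# Minimal sketch of Equations (1) and (2): the center of each cell of an
# m-by-n grid laid over an image. Zero-based column/row indices are assumed;
# all names and values here are illustrative.

def grid_center(image_width, image_height, m, n, column, row):
    grid_center_x = (image_width / m) * column + image_width / (2 * m)  # Equation (1)
    grid_center_y = (image_height / n) * row + image_height / (2 * n)   # Equation (2)
    return grid_center_x, grid_center_y

# Example: center of the top-left cell of a 4-by-4 grid on a 1000x800 image.
print(grid_center(1000, 800, 4, 4, column=0, row=0))  # (125.0, 100.0)
```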
The image information component 710 may calculate a shadow percentile value for each grid. For example, the image information component 710 may calculate a shadow percentile (e.g., 0.06 percentile) per channel (e.g., R, G, and B) from all pixels in a grid. In some cases, some pixels from a grid may be excluded. For example, if a water mask (e.g., geo-locations of lakes, ocean, rivers, etc.) is available, pixels of water in an image should be excluded from the statistics calculation. If saturated pixel values are known, those pixels should be excluded as well.
The image information component 710 may calculate a standard deviation value for each grid. For example, the image information component 710 may calculate a standard deviation per channel (e.g., R, G, and B) from all pixels in a grid. In some cases, some pixels from a grid may be excluded, such as those associated with a water mask or saturated pixels.
The image information component 710 may store the shadow percentile values and the standard deviation values in an image record 742-c for the aerial image 602-a. The image information component 710 may organize and store a shadow percentile value and a standard deviation value for each grid of an aerial image 602-a in an associated image record 742-c. The image record 742-c may implement any known data schema. In one embodiment, for example, the image record 742-c may be implemented as a comma-separated values (CSV) file. Each aerial image 602-a may have a separate associated image record 742-c, or a separate record in a compounded image record 742-c. The image record 742-c of a given aerial image 602-a can be retrieved for subsequent image processing operations. Each image record 742-c, or record within an image record 742-c, may be indexed by a globally unique identifier (GUID) to facilitate retrieval operations.
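The per-grid statistics described above might be computed along the following lines. This is a sketch only, assuming an (H, W, 3) image array and an optional boolean exclusion mask covering water and saturated pixels; all names are illustrative:

```python
import numpy as np

# Sketch of per-grid statistics: the 0.06 shadow percentile and standard
# deviation per channel for each cell of an m-by-n grid, excluding masked
# pixels (e.g., water, saturation). Empty cells are not handled here.

def grid_statistics(image, m, n, exclude_mask=None, shadow_pct=0.06):
    h, w, channels = image.shape
    stats = {}  # (row, col, channel) -> (shadow_percentile, std_dev)
    for row in range(n):
        for col in range(m):
            y0, y1 = row * h // n, (row + 1) * h // n
            x0, x1 = col * w // m, (col + 1) * w // m
            for c in range(channels):
                cell = image[y0:y1, x0:x1, c]
                if exclude_mask is not None:
                    cell = cell[~exclude_mask[y0:y1, x0:x1]]
                stats[(row, col, c)] = (np.percentile(cell, shadow_pct),
                                        cell.std())
    return stats
```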
The logic flow 800 may retrieve image context information for each aerial image at block 806. For example, once the image information component 710 generates statistical information 714 for an aerial image 602-a, and stores the statistical information 714 to an image record 742-c, the image information component 710 may retrieve image context information 712 related to the aerial image 602-a from a datastore or datasource, and write the related image context information 712 to the same image record 742-c.
The logic flow 800 may create an atmospheric filter from the statistical information and the image context information at block 808. For example, the filter generation component 720 may create an atmospheric filter 722 from the statistical information 714 and the image context information 712 stored in one or more image records 742-c.
The filter generation component 720 may selectively group aerial images 602-a to form a set of aerial images 726-e using associated image context information 712 and an ordering algorithm. In one embodiment, for example, the filter generation component 720 may use an ordering algorithm to produce a set of aerial images 726-e sharing a nested order comprising a flight mission identifier, a flight line identifier, a camera exposure time, and a camera aperture size. Such an ordering algorithm is designed to produce a set of aerial images 726-e having similar, if not identical, illumination conditions (e.g., from the sun and atmosphere), and exposure settings, since they are continuous in time. The filter generation component 720 leverages this time-continuity concept to infer illumination conditions so that atmosphere correction information can be robustly estimated using a larger set of samples generated across the aerial images 726-e.
The filter generation component 720 may identify an image window of a subset of aerial images 728-f from a set of aerial images 726-e from the stream of received aerial images 602-a. Once grouped, the filter generation component 720 may sort a set of aerial images 726-e by time in ascending order. Alternatively, the filter generation component 720 can sort the aerial images 726-e by image number if the aerial images 726-e are numbered with natural numbers ascending in time in a consistent manner. Once sorted, the filter generation component 720 may identify or select a subset of aerial images 728-f from the set of aerial images 726-e to form an image window. For a real-time system, the subset of aerial images 728-f may comprise a number of (k) previous aerial images before a target aerial image 732. For a non-real-time system, the subset of aerial images 728-f may comprise k/2 aerial images before the target aerial image 732, and k/2 aerial images after the target aerial image 732. In either case, if the number of aerial images found is less than k, they can still be used, but this condition should be marked as an edge condition for quality control purposes. Once identified, a resulting image window comprises a subset of aerial images 728-f having k+1 images, including a target aerial image 732 as the current (or, in a non-real-time system, center) aerial image. The target aerial image 732 may comprise an aerial image within an image window for which a set of filters 722, 724 are being made.
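A sketch of the window selection just described, assuming the images are already sorted ascending in time; the function name and the edge-condition flag are illustrative:

```python
# Sketch of image-window selection around a target aerial image: the k
# previous images for a real-time system, or k/2 on each side otherwise.
# `images` is assumed sorted ascending in time; names are illustrative.

def select_window(images, target_index, k, real_time=True):
    if real_time:
        start, stop = max(0, target_index - k), target_index + 1
    else:
        half = k // 2
        start = max(0, target_index - half)
        stop = min(len(images), target_index + half + 1)
    window = images[start:stop]
    edge_condition = len(window) < k + 1  # mark short windows for quality control
    return window, edge_condition
```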
The filter generation component 720 may generate a grid atmospheric haze additive term value for each grid of a target aerial image 732 using shadow percentile values from a subset of aerial images 728-f in an image window. For each image window, the filter generation component 720 may search and retrieve all image records 742-c associated with the subset of aerial images 728-f in the image window. For each channel, and for each grid, the filter generation component 720 retrieves all the shadow percentile values from the retrieved image records 742-c, and organizes them into a vector. The filter generation component 720 sorts them in ascending order, and takes the pth percentile. The pth percentile value is a grid atmospheric haze additive term value for a given grid. These operations are repeated for each grid of the target aerial image 732. The grid atmospheric haze additive term values collectively form a filtered additive haze mask (per-channel) for the target aerial image 732 of the image window.
The filter generation component 720 may generate a grid atmospheric correction gain term value for each grid of a target aerial image 732 using standard deviation values from a subset of aerial images 728-f in an image window. For each image window, the filter generation component 720 may search and retrieve all image records 742-c associated with the subset of aerial images 728-f in the image window. For each channel, and for each grid, the filter generation component 720 retrieves all the standard deviation values from the retrieved image records 742-c, and averages the standard deviation values to form a grid atmospheric correction gain term value. These operations are repeated for each grid in the target aerial image 732. The grid atmospheric correction gain term values collectively form one filtered atmospheric attenuation correction mask per channel (e.g., R, G, B, Infrared, Panchromatic, etc.), for the target aerial image 732 of the image window.
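Both aggregations might look like the following sketch, which assumes the per-grid statistics of a window's images are stacked into arrays of shape (num_images, rows, cols, channels); the percentile p and all names are illustrative:

```python
import numpy as np

# Sketch of the two per-grid aggregations across an image window: the pth
# percentile of the shadow values forms the additive haze mask, and the mean
# of the standard deviations forms the per-grid value later converted to a
# gain via Equation (4). Inputs: (num_images, rows, cols, channels) arrays.

def atmospheric_masks(shadow_values, std_values, p=50):
    haze_mask = np.percentile(shadow_values, p, axis=0)  # additive haze term
    std_mask = std_values.mean(axis=0)                   # averaged std dev
    return haze_mask, std_mask
```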
The filter generation component 720 may generate grid atmospheric haze additive term values and grid atmospheric correction gain term values for all aerial images within a set of aerial images 726-e using similar operations as described above. The filter generation component 720 may store grid atmospheric haze additive term values and grid atmospheric correction gain term values for each target aerial image 732 in a filter image record 744-d. As with the image records 742-c, a filter image record 744-d or a record in a filter image record 744-d may be indexed by a GUID for an aerial image 726-e so it may be retrieved for a given aerial image 726-e during image correction operations.
The logic flow 800 may correct atmosphere artifacts of an aerial image using the atmospheric filter at block 810. For example, the image correction component 730 may correct atmosphere artifacts of a target aerial image 732 using an atmospheric filter 722 designed for the target aerial image 732 by the filter generation component 720. For a given target aerial image 732, an atmospheric filter 722 may comprise grid atmospheric haze additive term values and grid atmospheric correction gain term values stored in a filter image record 744-d associated with the target aerial image 732. The image correction component 730 may correct atmosphere additive haze and multiplicative attenuation for a target aerial image 732 using its atmospheric filter 722.
In one embodiment, the atmosphere correction is a linear adjustment in accordance with Equation (3) as follows:
L(band,x,y)=g(band,x,y)*(L′(band,x,y)−haze(band,x,y))   Equation (3)
where L′ is the measured digital number (DN) value of a given band at pixel location (x,y), L is the atmosphere-corrected value, haze(band,x,y) is the grid atmospheric haze additive term value for a given band at pixel location (x,y), and g(band,x,y) is the grid atmospheric correction gain term value.
On a pixel basis, the image correction component 730 may calculate a pixel atmospheric haze additive term value for a pixel of a target aerial image 732 using grid atmospheric haze additive term values stored in a filter image record 744-d associated with the target aerial image 732. This may be accomplished in two ways. First, the image correction component 730 may perform a bilinear or bicubic interpolation on the filtered additive haze mask of the given aerial image 726-e. The bilinear or bicubic interpolation may use the four neighboring grids that surround a given pixel location as the four interpolation corners. Second, if the number of rows (r) or the number of columns (c) of the grids is one, a natural cubic spline curve can be fitted to the one-dimensional grids (along the longer-dimension pixel coordinate, e.g., the x-coordinate if the number of rows is one and the y-coordinate if the number of columns is one). The natural cubic spline curve can be represented as a look-up table (LUT) at each pixel location. In this case, for example, each pixel location with a same y-coordinate value (when the number of rows is one) will have the same value. The haze additive term value at a given pixel location can then be looked up from the LUT.
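The first option might be sketched as below for the bilinear case, assuming mask values sit at the grid centers of Equations (1) and (2) and a grid of at least 2-by-2 cells; names are illustrative:

```python
import numpy as np

# Sketch of bilinear interpolation of a per-grid mask (values at grid-cell
# centers) to an arbitrary pixel location (x, y). Assumes at least a 2x2
# grid; edge pixels clamp to the nearest cell centers.

def interpolate_mask(mask, image_width, image_height, x, y):
    n_rows, n_cols = mask.shape
    # Fractional grid coordinates of the pixel relative to cell centers.
    gx = x / (image_width / n_cols) - 0.5
    gy = y / (image_height / n_rows) - 0.5
    c0 = int(np.clip(np.floor(gx), 0, n_cols - 2))
    r0 = int(np.clip(np.floor(gy), 0, n_rows - 2))
    tx = float(np.clip(gx - c0, 0.0, 1.0))
    ty = float(np.clip(gy - r0, 0.0, 1.0))
    top = (1 - tx) * mask[r0, c0] + tx * mask[r0, c0 + 1]
    bottom = (1 - tx) * mask[r0 + 1, c0] + tx * mask[r0 + 1, c0 + 1]
    return (1 - ty) * top + ty * bottom
```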
On a pixel basis, the image correction component 730 may calculate a pixel atmospheric correction gain term value for a pixel of a target aerial image 732 using grid atmospheric correction gain term values stored in a filter image record 744-d associated with the target aerial image 732. The image correction component 730 may perform a bilinear or bicubic interpolation on each filtered atmospheric correction mask for each channel (e.g., R, G, or B) of the target aerial image 732 to retrieve interpolated values for the masks. Alternatively, the natural cubic spline curve fitting as previously described can be used if the grid configuration is one-dimensional. The image correction component 730 may calculate per-channel gain correction using Equation (4) as follows:
g(band,x,y)=std_dev_ref(x,y)/std_dev(band,x,y)   Equation (4)
where std_dev_ref(x,y) is the reference value chosen from one of the three bands (e.g., green is chosen because human eyes are more sensitive to green).
The image correction component 730 may correct a pixel of a target aerial image 732 using a pixel atmospheric haze additive term value calculated for the pixel. The image correction component 730 may correct a given pixel in accordance with Equation (3) as previously described.
The image correction component 730 may correct a pixel of a target aerial image 732 using a pixel atmospheric correction gain term value calculated for the pixel. The image correction component 730 may correct a given pixel in accordance with Equation (3) as previously described.
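Putting Equations (3) and (4) together, a per-image correction might be sketched as follows, assuming the grid masks have already been interpolated to full-resolution per-pixel maps (for example with a helper like interpolate_mask above) and that green (band index 1) is the reference band; names are illustrative:

```python
import numpy as np

# Sketch of atmosphere correction per Equations (3) and (4). haze_maps and
# std_maps are full-resolution (H, W, channels) arrays interpolated from the
# grid masks; the reference band for the gain is green, as in the text.

def correct_atmosphere(image, haze_maps, std_maps, ref_band=1):
    corrected = np.empty_like(image, dtype=np.float64)
    for band in range(image.shape[2]):
        gain = std_maps[..., ref_band] / std_maps[..., band]        # Equation (4)
        corrected[..., band] = gain * (image[..., band].astype(np.float64)
                                       - haze_maps[..., band])      # Equation (3)
    return corrected
```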
Once all pixels for a target aerial image 732 are corrected, the image correction component 730 may output a corrected target aerial image 734. The image correction component 730 then selects a next target aerial image 732, and performs correction operations similar to those described above to form a next corrected target aerial image 734. The above correction operations may be performed for each of the aerial images 726-e.
FIG. 9 illustrates one embodiment of a logic flow 900. The logic flow 900 may be representative of some or all of the operations executed by one or more embodiments described herein, such as the atmospheric and solar component 610 of the aerial imaging system 600, for example. In particular, the logic flow 900 may be implemented by the atmospheric and solar component 610 to correct for solar influences in an aerial image 602-a.
In the illustrated embodiment shown in FIG. 9, the logic flow 900 may receive multiple aerial images at block 902. For example, the atmospheric and solar component 610 may receive multiple aerial images 602-a from the large format digital camera 110. In one embodiment, the aerial images 602-a may be temporally sequential aerial images 602-a. The embodiments, however, are not limited to this example.
The logic flow 900 may receive image context information for each aerial image at block 904. For example, the image information component 710 may receive image context information 712 for each aerial image 602-a. The image information component 710 may store image context information 712 for an aerial image 602-a in a same or different image record 742-c used to store the statistical information 714 derived for the aerial image 602-a as described in the logic flow 800.
The logic flow 900 may create a solar filter from the image context information at block 906. For example, the filter generation component 720 may create a solar filter 724 for a target aerial image 732 from the image context information 712.
The filter generation component 720 may calculate a sun elevation angle value for each aerial image 726-e. In one embodiment, the image information component 710 may retrieve image information for a target aerial image 732 from associated image context information 712, including an image date, an image time, and an image location. The image information component 710 may calculate a sun elevation angle (sunZenith) for each aerial image in a set of aerial images 726-e.
The filter generation component 720 may calculate an average sun elevation angle value (meanSunZenithDeg) from the multiple sun elevation angle values computed for a set of aerial images 726-e.
The filter generation component 720 may calculate a solar flux factor value for a target aerial image 732 from an average sun elevation angle value for a set of aerial images 726-e. The filter generation component 720 may calculate a solar flux factor value in accordance with Equation (5) and Equation (6) as follows:
targetSunZenith=M_PI*meanSunZenithDeg/180.0   Equation (5)
fluxFactor=cos(targetSunZenith)/cos(sunZenith)   Equation (6)
where targetSunZenith and sunZenith are in radians, and meanSunZenithDeg is in degrees.
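A sketch of Equations (5) and (6) follows; the sun zenith angles are assumed to come from an external solar-position routine driven by the image date, time, and location, and the names are illustrative:

```python
import math

# Sketch of the solar flux factor per Equations (5) and (6).
# mean_sun_zenith_deg is in degrees; sun_zenith_rad is in radians.

def solar_flux_factor(sun_zenith_rad, mean_sun_zenith_deg):
    target_sun_zenith = math.pi * mean_sun_zenith_deg / 180.0      # Equation (5)
    return math.cos(target_sun_zenith) / math.cos(sun_zenith_rad)  # Equation (6)
```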
The filter generation component 720 may calculate an average exposure time value and average aperture value for a set of aerial images 726-e. The filter generation component 720 may retrieve camera exposure time (exposureTime) and camera aperture size (aperture) for each aerial image 726-e using associated image context information 712, and calculate an average camera exposure time (meanExposureTime) and an average camera aperture size (meanAperture) of all the aerial images 726-e.
The filter generation component 720 may calculate an exposure normalization factor value for a target aerial image 732 from an average exposure time value and an average aperture value for a set of aerial images 726-e. The filter generation component 720 may calculate an exposure normalization factor value in accordance with Equation (7) as follows:
opCorrection=(meanExposureTime/exposureTime)*(aperture/meanAperture)*(aperture/meanAperture)   Equation (7)
The filter generation component 720 may calculate a total normalization factor value for a target aerial image 732 from a solar flux factor value and exposure normalization factor value for the target aerial image 732. The filter generation component 720 may calculate a total normalization factor value in accordance with Equation (8) as follows:
g_norm(image)=fluxFactor(image)*opCorrection(image)   Equation (8)
The logic flow 900 may correct solar artifacts of an aerial image using the solar filter at block 908. For example, the image correction component 730 may correct solar artifacts of a target aerial image 732 using a solar filter 724 in accordance with Equation (9) as follows:
L_final(band,x,y)=L(band,x,y)*g_norm(image)   Equation (9)
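Equations (7) through (9) can be combined into one normalization step. A sketch under the assumption that L is an atmosphere-corrected image array and the averages come from the image context information; names are illustrative:

```python
# Sketch of Equations (7)-(9): exposure normalization, total normalization
# factor, and its application to an atmosphere-corrected image L (a numpy
# array or scalar). Inputs come from image context information.

def solar_correction(L, exposure_time, aperture,
                     mean_exposure_time, mean_aperture, flux_factor):
    op_correction = ((mean_exposure_time / exposure_time)
                     * (aperture / mean_aperture) ** 2)   # Equation (7)
    g_norm = flux_factor * op_correction                  # Equation (8)
    return L * g_norm                                     # Equation (9)
```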
The image correction component 730 may correct solar artifacts of a target aerial image 732 before or after atmospheric correction operations, as described with reference to the logic flow 800 of FIG. 8.
In addition to image correction, the atmospheric and solar component 610 may be arranged to perform further image processing operations on a corrected target aerial image 734. For instance, the atmospheric and solar component 610 may be further arranged to perform "hotspot removal" operations, a radiometric process that removes hotspots caused by micro shadows, which produce uneven illumination of aerial images with respect to the camera ray direction and the sun direction. In another example, the atmospheric and solar component 610 may be arranged to perform ortho rectification operations, a geometric process that corrects an aerial image so that its scale is uniform like a map, without distortion. This is typically performed using a photogrammetric library with a digital elevation model (DEM) and bundle adjustment results. In yet another example, the atmospheric and solar component 610 may be arranged to perform ortho mosaic operations, a segmentation process that combines multiple ortho rectified images into one seamless mosaic using, for example, max-flow/min-cut theory. In still another example, the atmospheric and solar component 610 may be arranged to perform globally-aware-locally-adaptive (GALA) tone mapping to convert a large aerial ortho image from a 16-bit camera range to an 8-bit camera range.
FIG. 10 illustrates an embodiment of an exemplary computing architecture 1000 suitable for implementing various embodiments as previously described. The computing architecture 1000 includes various common computing elements, such as one or more processors, co-processors, memory units, chipsets, controllers, peripherals, interfaces, oscillators, timing devices, video cards, audio cards, multimedia input/output (I/O) components, and so forth. The embodiments, however, are not limited to implementation by the computing architecture 1000.
As shown in FIG. 10, the computing architecture 1000 comprises a computer 1002 having a processing unit 1004, a system memory 1006 and a system bus 1008. The processing unit 1004 can be any of various commercially available processors. Dual microprocessors and other multi-processor architectures may also be employed as the processing unit 1004. The system bus 1008 provides an interface for system components including, but not limited to, the system memory 1006 to the processing unit 1004. The system bus 1008 can be any of several types of bus structure that may further interconnect to a memory bus (with or without a memory controller), a peripheral bus, and a local bus using any of a variety of commercially available bus architectures.
The system memory 1006 may include various types of memory units, such as read-only memory (ROM), random-access memory (RAM), dynamic RAM (DRAM), Double-Data-Rate DRAM (DDRAM), synchronous DRAM (SDRAM), static RAM (SRAM), programmable ROM (PROM), erasable programmable ROM (EPROM), electrically erasable programmable ROM (EEPROM), flash memory, polymer memory such as ferroelectric polymer memory, ovonic memory, phase change or ferroelectric memory, silicon-oxide-nitride-oxide-silicon (SONOS) memory, magnetic or optical cards, or any other type of media suitable for storing information. In the illustrated embodiment shown in FIG. 10, the system memory 1006 can include non-volatile memory 1010 and/or volatile memory 1012. A basic input/output system (BIOS) can be stored in the non-volatile memory 1010.
The computer 1002 may include various types of computer-readable storage media, including an internal hard disk drive (HDD) 1014, a magnetic floppy disk drive (FDD) 1016 to read from or write to a removable magnetic disk 1018, and an optical disk drive 1020 to read from or write to a removable optical disk 1022 (e.g., a CD-ROM or DVD). The HDD 1014, FDD 1016 and optical disk drive 1020 can be connected to the system bus 1008 by a HDD interface 1024, an FDD interface 1026 and an optical drive interface 1028, respectively. The HDD interface 1024 for external drive implementations can include at least one or both of Universal Serial Bus (USB) and IEEE 1394 interface technologies.
The drives and associated computer-readable media provide volatile and/or nonvolatile storage of data, data structures, computer-executable instructions, and so forth. For example, a number of program modules can be stored in the drives and memory units 1010, 1012, including an operating system 1030, one or more application programs 1032, other program modules 1034, and program data 1036. The one or more application programs 1032, other program modules 1034, and program data 1036 can include, for example, the aerial imaging system 600.
A user can enter commands and information into the computer 1002 through one or more wire/wireless input devices, for example, a keyboard 1038 and a pointing device, such as a mouse 1040. Other input devices may include a microphone, an infrared (IR) remote control, a joystick, a game pad, a stylus pen, touch screen, or the like. These and other input devices are often connected to the processing unit 1004 through an input device interface 1042 that is coupled to the system bus 1008, but can be connected by other interfaces such as a parallel port, IEEE 1394 serial port, a game port, a USB port, an IR interface, and so forth.
A monitor 1044 or other type of display device is also connected to the system bus 1008 via an interface, such as a video adaptor 1046. In addition to the monitor 1044, a computer typically includes other peripheral output devices, such as speakers, printers, and so forth.
The computer 1002 may operate in a networked environment using logical connections via wire and/or wireless communications to one or more remote computers, such as a remote computer 1048. The remote computer 1048 can be a workstation, a server computer, a router, a personal computer, portable computer, microprocessor-based entertainment appliance, a peer device or other common network node, and typically includes many or all of the elements described relative to the computer 1002, although, for purposes of brevity, only a memory/storage device 1050 is illustrated. The logical connections depicted include wire/wireless connectivity to a local area network (LAN) 1052 and/or larger networks, for example, a wide area network (WAN) 1054. Such LAN and WAN networking environments are commonplace in offices and companies, and facilitate enterprise-wide computer networks, such as intranets, all of which may connect to a global communications network, for example, the Internet.
When used in a LAN networking environment, the computer 1002 is connected to the LAN 1052 through a wire and/or wireless communication network interface or adaptor 1056. The adaptor 1056 can facilitate wire and/or wireless communications to the LAN 1052, which may also include a wireless access point disposed thereon for communicating with the wireless functionality of the adaptor 1056.
When used in a WAN networking environment, the computer 1002 can include a modem 1058, or is connected to a communications server on the WAN 1054, or has other means for establishing communications over the WAN 1054, such as by way of the Internet. The modem 1058, which can be internal or external and a wire and/or wireless device, connects to the system bus 1008 via the input device interface 1042. In a networked environment, program modules depicted relative to the computer 1002, or portions thereof, can be stored in the remote memory/storage device 1050. It will be appreciated that the network connections shown are exemplary and other means of establishing a communications link between the computers can be used.
The computer 1002 is operable to communicate with wire and wireless devices or entities using the IEEE 802 family of standards, such as wireless devices operatively disposed in wireless communication (e.g., IEEE 802.11 over-the-air modulation techniques) with, for example, a printer, scanner, desktop and/or portable computer, personal digital assistant (PDA), communications satellite, any piece of equipment or location associated with a wirelessly detectable tag (e.g., a kiosk, news stand, restroom), and telephone. This includes at least Wi-Fi (or Wireless Fidelity), WiMax, and Bluetooth™ wireless technologies. Thus, the communication can be a predefined structure as with a conventional network or simply an ad hoc communication between at least two devices. Wi-Fi networks use radio technologies called IEEE 802.11x (a, b, g, etc.) to provide secure, reliable, fast wireless connectivity. A Wi-Fi network can be used to connect computers to each other, to the Internet, and to wire networks (which use IEEE 802.3-related media and functions).
Various embodiments may be implemented using hardware elements, software elements, or a combination of both. Examples of hardware elements may include devices, components, processors, microprocessors, circuits, circuit elements (e.g., transistors, resistors, capacitors, inductors, and so forth), integrated circuits, application specific integrated circuits (ASIC), programmable logic devices (PLD), digital signal processors (DSP), field programmable gate array (FPGA), memory units, logic gates, registers, semiconductor device, chips, microchips, chip sets, and so forth. Examples of software elements may include software components, programs, applications, computer programs, application programs, system programs, machine programs, operating system software, middleware, firmware, software modules, routines, subroutines, functions, methods, procedures, software interfaces, application program interfaces (API), instruction sets, computing code, computer code, code segments, computer code segments, words, values, symbols, or any combination thereof. Determining whether an embodiment is implemented using hardware elements and/or software elements may vary in accordance with any number of factors, such as desired computational rate, power levels, heat tolerances, processing cycle budget, input data rates, output data rates, memory resources, data bus speeds and other design or performance constraints, as desired for a given implementation.
Some embodiments may comprise an article of manufacture. An article of manufacture may comprise a storage medium to store logic. Examples of a storage medium may include one or more types of computer-readable storage media capable of storing electronic data, including volatile memory or non-volatile memory, removable or non-removable memory, erasable or non-erasable memory, writeable or re-writeable memory, and so forth. Examples of the logic may include various software elements, such as software components, programs, applications, computer programs, application programs, system programs, machine programs, operating system software, middleware, firmware, software modules, routines, subroutines, functions, methods, procedures, software interfaces, application program interfaces (API), instruction sets, computing code, computer code, code segments, computer code segments, words, values, symbols, or any combination thereof. In one embodiment, for example, an article of manufacture may store executable computer program instructions that, when executed by a computer, cause the computer to perform methods and/or operations in accordance with the described embodiments. The executable computer program instructions may include any suitable type of code, such as source code, compiled code, interpreted code, executable code, static code, dynamic code, and the like. The executable computer program instructions may be implemented according to a predefined computer language, manner or syntax, for instructing a computer to perform a certain function. The instructions may be implemented using any suitable high-level, low-level, object-oriented, visual, compiled and/or interpreted programming language.
Some embodiments may be described using the expression “one embodiment” or “an embodiment” along with their derivatives. These terms mean that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment. The appearances of the phrase “in one embodiment” in various places in the specification are not necessarily all referring to the same embodiment.
Some embodiments may be described using the expression “coupled” and “connected” along with their derivatives. These terms are not necessarily intended as synonyms for each other. For example, some embodiments may be described using the terms “connected” and/or “coupled” to indicate that two or more elements are in direct physical or electrical contact with each other. The term “coupled,” however, may also mean that two or more elements are not in direct contact with each other, but yet still co-operate or interact with each other.
It is emphasized that the Abstract of the Disclosure is provided to comply with 37 C.F.R. Section 1.72(b), requiring an abstract that will allow the reader to quickly ascertain the nature of the technical disclosure. It is submitted with the understanding that it will not be used to interpret or limit the scope or meaning of the claims. In addition, in the foregoing Detailed Description, it can be seen that various features are grouped together in a single embodiment for the purpose of streamlining the disclosure. This method of disclosure is not to be interpreted as reflecting an intention that the claimed embodiments require more features than are expressly recited in each claim. Rather, as the following claims reflect, inventive subject matter lies in less than all features of a single disclosed embodiment. Thus the following claims are hereby incorporated into the Detailed Description, with each claim standing on its own as a separate embodiment. In the appended claims, the terms “including” and “in which” are used as the plain-English equivalents of the respective terms “comprising” and “wherein,” respectively. Moreover, the terms “first,” “second,” “third,” and so forth, are used merely as labels, and are not intended to impose numerical requirements on their objects.
Although the subject matter has been described in language specific to structural features and/or methodological acts, it is to be understood that the subject matter defined in the appended claims is not necessarily limited to the specific features or acts described above. Rather, the specific features and acts described above are disclosed as example forms of implementing the claims.

Claims (20)

The invention claimed is:
1. A computer-implemented method, comprising:
receiving multiple aerial images;
grouping a set of aerial images based on image context information for each aerial image;
dividing each aerial image of the set of aerial images into grids;
generating statistical information for each aerial image, including aggregating statistics of the grids;
creating an atmospheric filter insensitive to scene content from the statistical information and the image context information, wherein the atmospheric filter filters atmospheric effects from light scattering and atmospheric absorption due to aerosols; and
correcting atmosphere artifacts of an aerial image using the atmospheric filter.
2. The computer-implemented method of claim 1, comprising:
dividing an aerial image into multiple grids;
calculating a shadow percentile value for each grid;
calculating a standard deviation value for each grid;
storing the shadow percentile values and the standard deviation values in an image record for the aerial image; and
storing image context information associated with the aerial image in the image record.
3. The computer-implemented method of claim 1, comprising:
identifying an image window of a subset of aerial images from the group;
generating a grid atmospheric haze additive term value for each grid of a target aerial image using shadow percentile values from aerial images in the image window;
generating a grid atmospheric correction gain term value for each grid of the target aerial image using standard deviation values from aerial images in the image window; and
storing the grid atmospheric haze additive term values and grid atmospheric correction gain term values for the target aerial image in a filter image record.
4. The computer-implemented method of claim 1, comprising correcting atmosphere additive haze and multiplicative attenuation for a target aerial image using grid atmospheric haze additive term values and grid atmospheric correction gain term values stored in a filter image record associated with the target aerial image.
5. The computer-implemented method of claim 1, comprising calculating a pixel atmospheric haze additive term value for a pixel of a target aerial image using grid atmospheric haze additive term values stored in a filter image record associated with the target aerial image.
6. The computer-implemented method of claim 1, comprising calculating a pixel atmospheric correction gain term value for a pixel of a target aerial image using grid atmospheric correction gain term values stored in a filter image record associated with the target aerial image.
7. The computer-implemented method of claim 1, comprising correcting a pixel of a target aerial image using a pixel atmospheric haze additive term value calculated for the pixel.
8. The computer-implemented method of claim 1, comprising correcting a pixel of a target aerial image using a pixel atmospheric correction gain term value calculated for the pixel.
9. A system comprising at least one processor in communication with computer-readable storage media, the computer readable storage media storing instructions, that, when executed cause the at least one processor to:
receive multiple aerial images spanning multiple days along at least two flight lines and having a difference in solar irradiance;
receive image context information for each aerial image, the image context information to comprise camera location information for that aerial image;
create a solar filter from the image context information;
correct solar artifacts of an aerial image using the solar filter; and
scale at least one aerial image to a mean brightness level using the solar filter.
10. The system of claim 9, further comprising instructions that when executed enable the system to calculate a sun elevation angle value for each aerial image.
11. The system of claim 9, further comprising instructions that when executed enable the system to calculate an average sun elevation angle value from multiple sun elevation angle values for a group of aerial images.
12. The system of claim 9, further comprising instructions that when executed enable the system to calculate a solar flux factor value for a target aerial image from an average sun elevation angle value for a group of aerial images.
13. The system of claim 9, further comprising instructions that when executed enable the system to calculate an average exposure time value and average aperture value for a group of aerial images.
14. The system of claim 9, further comprising instructions that when executed enable the system to calculate an exposure normalization factor value for a target aerial image from an average exposure time value and an average aperture value for a group of aerial images.
15. The system of claim 9, further comprising instructions that when executed enable the system to calculate a total normalization factor value for a target aerial image from a solar flux factor value and exposure normalization factor value for the target aerial image.
16. An apparatus, comprising:
a logic device; and
a plurality of components executing on the logic device, the plurality of components comprising:
an atmospheric and solar component to correct solar and atmosphere artifacts from an aerial image, the atmospheric and solar component comprising:
an image information component to generate an image record for each aerial image of a group of aerial images, wherein the image information component models atmospheric effects from light scattering and atmospheric absorption due to aerosols, and further wherein the image record comprising statistical information and image context information for each aerial image, the statistical information to model atmospheric effects from light scattering and atmospheric absorption and solar irradiance;
a filter generation component to generate an atmospheric filter and a solar filter from the statistical information and the image context information stored in the image records; and
an image correction component to correct atmospheric and solar artifacts from the aerial image using the respective atmospheric filter and solar filter.
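Claim 16's apparatus decomposes into an image information component, a filter generation component, and an image correction component. A skeletal sketch of that decomposition follows; the types, fields, and placeholder values are illustrative only, standing in for the patent's actual estimation steps.

```python
from dataclasses import dataclass, field

@dataclass
class ImageRecord:
    # Statistical information modeling scattering, absorption and irradiance,
    # plus image context (camera location, timestamps, exposure settings).
    statistics: dict = field(default_factory=dict)
    context: dict = field(default_factory=dict)

class ImageInformationComponent:
    def build_record(self, image, context) -> ImageRecord:
        # 'image' is assumed to be a numpy-like array.
        stats = {"mean": float(image.mean()), "min": float(image.min())}
        return ImageRecord(statistics=stats, context=context)

class FilterGenerationComponent:
    def make_filters(self, records):
        # Derive an atmospheric filter and a solar filter from the records;
        # ellipses mark estimation logic the patent does not spell out here.
        atmospheric = {"haze_grid": ..., "gain_grid": ...}
        solar = {"target_mean": ...}
        return atmospheric, solar

class ImageCorrectionComponent:
    def correct(self, image, atmospheric, solar):
        ...  # apply the filters, e.g. as in the earlier sketches
        return image
```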
17. The apparatus of claim 16, the image information component to retrieve image context information associated with the group of aerial images, the image context information comprising at least one of image information, camera information or camera platform information.
18. The apparatus of claim 16, the image correction component to correct brightness differences due to solar irradiance differences between at least two aerial images in the group of aerial images.
19. The apparatus of claim 16, comprising a large format aerial camera used to acquire the group of aerial images, the large format aerial camera comprising a multi-resolution large format aerial camera with multiple detector arrays.
20. The apparatus of claim 16, the image correction component to correct atmospheric additive haze and multiplicative attenuation for a target aerial image using grid atmospheric haze additive term values and grid atmospheric correction gain term values stored in a filter image record associated with the target aerial image.
US12/973,689 2010-12-20 2010-12-20 Techniques for atmospheric and solar correction of aerial images Active 2033-05-04 US9576349B2 (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
US12/973,689 US9576349B2 (en) 2010-12-20 2010-12-20 Techniques for atmospheric and solar correction of aerial images
CN201110429145.XA CN102567967B (en) 2010-12-20 2011-12-20 Techniques for atmospheric and solar correction of aerial images
US15/437,140 US11017503B2 (en) 2010-12-20 2017-02-20 Techniques for atmospheric and solar correction of aerial images

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
US12/973,689 US9576349B2 (en) 2010-12-20 2010-12-20 Techniques for atmospheric and solar correction of aerial images

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US15/437,140 Continuation US11017503B2 (en) 2010-12-20 2017-02-20 Techniques for atmospheric and solar correction of aerial images

Publications (2)

Publication Number Publication Date
US20120154584A1 US20120154584A1 (en) 2012-06-21
US9576349B2 (en) 2017-02-21

Family

ID=46233887

Family Applications (2)

Application Number Title Priority Date Filing Date
US12/973,689 Active 2033-05-04 US9576349B2 (en) 2010-12-20 2010-12-20 Techniques for atmospheric and solar correction of aerial images
US15/437,140 Active 2031-05-29 US11017503B2 (en) 2010-12-20 2017-02-20 Techniques for atmospheric and solar correction of aerial images

Family Applications After (1)

Application Number Title Priority Date Filing Date
US15/437,140 Active 2031-05-29 US11017503B2 (en) 2010-12-20 2017-02-20 Techniques for atmospheric and solar correction of aerial images

Country Status (2)

Country Link
US (2) US9576349B2 (en)
CN (1) CN102567967B (en)

Families Citing this family (29)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10168153B2 (en) 2010-12-23 2019-01-01 Trimble Inc. Enhanced position measurement systems and methods
EP2702375A2 (en) * 2011-04-25 2014-03-05 Skybox Imaging, Inc. Systems and methods for overhead imaging and video
US8970691B2 (en) 2011-08-26 2015-03-03 Microsoft Technology Licensing, Llc Removal of rayleigh scattering from images
US8923567B2 (en) * 2011-12-19 2014-12-30 General Electric Company Apparatus and method for predicting solar irradiance variation
US9235763B2 (en) * 2012-11-26 2016-01-12 Trimble Navigation Limited Integrated aerial photogrammetry surveys
US8958603B2 (en) * 2013-01-25 2015-02-17 Regents Of The University Of Minnesota Automated mapping of land cover using sequences of aerial imagery
US9042674B2 (en) * 2013-03-15 2015-05-26 Digitalglobe, Inc. Automated geospatial image mosaic generation
CN103391413B (en) * 2013-07-03 2016-05-18 中国科学院光电技术研究所 Aerial survey image recording device and method
US10230925B2 (en) 2014-06-13 2019-03-12 Urthecast Corp. Systems and methods for processing and providing terrestrial and/or space-based earth observation video
CN105376474B (en) * 2014-09-01 2018-09-28 光宝电子(广州)有限公司 Image collecting device and its Atomatic focusing method
US10091418B2 (en) * 2014-10-24 2018-10-02 Bounce Imaging, Inc. Imaging systems and methods
US9824290B2 (en) * 2015-02-10 2017-11-21 nearmap australia pty ltd. Corridor capture
WO2016153914A1 (en) 2015-03-25 2016-09-29 King Abdulaziz City Of Science And Technology Apparatus and methods for synthetic aperture radar with digital beamforming
WO2017044168A2 (en) 2015-06-16 2017-03-16 King Abdulaziz City Of Science And Technology Efficient planar phased array antenna assembly
WO2017091747A1 (en) 2015-11-25 2017-06-01 Urthecast Corp. Synthetic aperture radar imaging apparatus and methods
WO2017169946A1 (en) * 2016-03-30 2017-10-05 日本電気株式会社 Information processing device, information processing system, information processing method and program storage medium
EP3440428B1 (en) 2016-04-08 2022-06-01 Orbital Insight, Inc. Remote determination of quantity stored in containers in geographical region
US10217236B2 (en) 2016-04-08 2019-02-26 Orbital Insight, Inc. Remote determination of containers in geographical region
JP6818463B2 (en) * 2016-08-08 2021-01-20 キヤノン株式会社 Image processing equipment, image processing methods and programs
EP3631504B8 (en) 2017-05-23 2023-08-16 Spacealpha Insights Corp. Synthetic aperture radar imaging apparatus and methods
EP3646054A4 (en) 2017-05-23 2020-10-28 King Abdulaziz City for Science and Technology Synthetic aperture radar imaging apparatus and methods for moving targets
US10586349B2 (en) 2017-08-24 2020-03-10 Trimble Inc. Excavator bucket positioning via mobile device
WO2019226194A2 (en) 2017-11-22 2019-11-28 Urthecast Corp. Synthetic aperture radar apparatus and methods
US10943360B1 (en) 2019-10-24 2021-03-09 Trimble Inc. Photogrammetric machine measure up
US11087532B2 (en) * 2019-11-05 2021-08-10 Raytheon Company Ortho-image mosaic production system
US11769224B2 (en) 2021-04-08 2023-09-26 Raytheon Company Mitigating transitions in mosaic images
CN113096043B (en) * 2021-04-09 2023-02-17 杭州睿胜软件有限公司 Image processing method and device, electronic device and storage medium
JP2023011331A (en) * 2021-07-12 2023-01-24 株式会社Jvcケンウッド Data generation device, display device, data generation method, display method, and program
CN115830146B (en) * 2023-02-10 2023-05-09 武汉玄景科技有限公司 On-orbit relative radiation calibration and correction method for aerospace optical remote sensing camera

Family Cites Families (33)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US3030061A (en) 1961-02-20 1962-04-17 Economy Forms Corp Adjustable brace connector unit
US5841911A (en) * 1995-06-06 1998-11-24 Ben Gurion, University Of The Negev Method for the restoration of images disturbed by the atmosphere
US6268093B1 (en) * 1999-10-13 2001-07-31 Applied Materials, Inc. Method for reticle inspection using aerial imaging
US6834122B2 (en) * 2000-01-22 2004-12-21 Kairos Scientific, Inc. Visualization and processing of multidimensional data using prefiltering and sorting criteria
US7019777B2 (en) * 2000-04-21 2006-03-28 Flight Landata, Inc. Multispectral imaging system with spatial resolution enhancement
US7593835B2 (en) * 2001-04-20 2009-09-22 Spectral Sciences, Inc. Reformulated atmospheric band model method for modeling atmospheric propagation at arbitrarily fine spectral resolution and expanded capabilities.
US7072502B2 (en) * 2001-06-07 2006-07-04 Applied Materials, Inc. Alternating phase-shift mask inspection method and apparatus
US20050175253A1 (en) * 2002-01-22 2005-08-11 National University Of Singapore Method for producing cloud free and cloud-shadow free images
US7725258B2 (en) * 2002-09-20 2010-05-25 M7 Visual Intelligence, L.P. Vehicle based data collection and processing system and imaging sensor system and methods thereof
US6909815B2 (en) * 2003-01-31 2005-06-21 Spectral Sciences, Inc. Method for performing automated in-scene based atmospheric compensation for multi-and hyperspectral imaging sensors in the solar reflective spectral region
WO2005054799A2 (en) * 2003-11-26 2005-06-16 Florida Environmental Research Institute, Inc. Spectral imaging system
US7697759B2 (en) * 2004-05-11 2010-04-13 The United States Of America As Represented By The Administrator Of The National Aeronautics And Space Administration Split-remerge method for eliminating processing window artifacts in recursive hierarchical segmentation
US20060126959A1 (en) * 2004-12-13 2006-06-15 Digitalglobe, Inc. Method and apparatus for enhancing a digital image
US7564017B2 (en) * 2005-06-03 2009-07-21 Brion Technologies, Inc. System and method for characterizing aerial image quality in a lithography system
EP1977393A4 (en) * 2006-01-18 2013-05-08 Technion Res & Dev Foundation System and method for dehazing
CN101501703B (en) * 2006-02-01 2012-07-04 以色列商·应用材料以色列公司 Method and system for evaluating a variation in a parameter of a pattern
US7965902B1 (en) * 2006-05-19 2011-06-21 Google Inc. Large-scale image processing using mass parallelization techniques
WO2010004677A1 (en) * 2008-07-08 2010-01-14 パナソニック株式会社 Image processing method, image processing device, image processing program, image synthesis method, and image synthesis device
US8073279B2 (en) * 2008-07-08 2011-12-06 Harris Corporation Automated atmospheric characterization of remotely sensed multi-spectral imagery
US8396324B2 (en) * 2008-08-18 2013-03-12 Samsung Techwin Co., Ltd. Image processing method and apparatus for correcting distortion caused by air particles as in fog
US8117010B2 (en) * 2008-12-05 2012-02-14 Honeywell International Inc. Spectral signal detection system
US20100177095A1 (en) * 2009-01-14 2010-07-15 Harris Corporation Geospatial modeling system for reducing shadows and other obscuration artifacts and related methods
US8447129B2 (en) * 2009-03-18 2013-05-21 Florida Institute Of Technology High-speed diversity-based imaging method for parallel atmospheric turbulence compensation
US8350933B2 (en) * 2009-04-08 2013-01-08 Yissum Research Development Company Of The Hebrew University Of Jerusalem, Ltd. Method, apparatus and computer program product for single image de-hazing
JP5428618B2 (en) * 2009-07-29 2014-02-26 ソニー株式会社 Image processing apparatus, imaging apparatus, image processing method, and program
US9097792B2 (en) * 2009-08-12 2015-08-04 The Johns Hopkins University System and method for atmospheric correction of information
US8588551B2 (en) * 2010-03-01 2013-11-19 Microsoft Corp. Multi-image sharpening and denoising using lucky imaging
US9576349B2 (en) 2010-12-20 2017-02-21 Microsoft Technology Licensing, Llc Techniques for atmospheric and solar correction of aerial images
US8666190B1 (en) * 2011-02-01 2014-03-04 Google Inc. Local black points in aerial imagery
US8897543B1 (en) * 2012-05-18 2014-11-25 Google Inc. Bundle adjustment based on image capture intervals
US8755628B2 (en) * 2012-09-10 2014-06-17 Google Inc. Image de-hazing by solving transmission value
US9396528B2 (en) * 2013-03-15 2016-07-19 Digitalglobe, Inc. Atmospheric compensation in satellite imagery
US9449244B2 (en) * 2013-12-11 2016-09-20 Her Majesty The Queen In Right Of Canada, As Represented By The Minister Of National Defense Methods for in-scene atmospheric compensation by endmember matching

Patent Citations (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6484099B1 (en) * 1999-07-16 2002-11-19 Deutsches Zentrum Fur Luft -Und Raumfahrt E.V. Process for correcting atmospheric influences in multispectral optical remote sensing data
US20080100803A1 (en) 1999-12-03 2008-05-01 Manfred Dick Method for determining vision defects and for collecting data for correcting vision defects of the eye by interaction of a patient with an examiner and apparatus therefor
US6757445B1 (en) * 2000-10-04 2004-06-29 Pixxures, Inc. Method and apparatus for producing digital orthophotos using sparse stereo configurations and external models
US20020096622A1 (en) * 2001-01-23 2002-07-25 Steven Adler-Golden Methods for atmospheric correction of solar-wavelength Hyperspectral imagery over land
US20050265631A1 (en) * 2002-09-19 2005-12-01 Mai Tuy V System and method for mosaicing digital ortho-images
US20080123990A1 (en) 2006-03-23 2008-05-29 Industry-Academic Cooperation Foundation, Yonsei University Method and apparatus of correcting geometry of an image
US7653218B1 (en) 2006-05-02 2010-01-26 Orbimage Si Opco, Inc. Semi-automatic extraction of linear features from image data
US20090232349A1 (en) * 2008-01-08 2009-09-17 Robert Moses High Volume Earth Observation Image Processing
US20100092045A1 (en) 2008-10-15 2010-04-15 The Boeing Company System and method for airport mapping database automatic change detection
US20110025919A1 (en) * 2009-07-31 2011-02-03 Vorontsov Mikhail A Automated Video Data Fusion Method
US20110188775A1 (en) * 2010-02-01 2011-08-04 Microsoft Corporation Single Image Haze Removal Using Dark Channel Priors
US9036861B2 (en) * 2010-04-22 2015-05-19 The University Of North Carolina At Charlotte Method and system for remotely inspecting bridges and other structures
CN101908210A (en) 2010-08-13 2010-12-08 北京工业大学 Method and system for color image defogging treatment

Non-Patent Citations (5)

* Cited by examiner, † Cited by third party
Title
"Atmosphere and shading compensation of satellite images," published Apr. 11, 2008, https://www.imagico.de/pov/earth-atmosphere.html.
"Microsoft's Vexcel Imaging GmbH Reports Sales of 30 UltraCam Systems for Its 2010 Fiscal Year," Conferences-Web Events, published Aug. 19, 2010, https://www.microsoft.com/ultracam/en-us/Aug1030Ultracams.aspx.
First Office Action and Search Report received for China Patent Application No. 201110429145.X, mailed Dec. 17, 2013, 9 pages (without English translation).
Haest, Birgen, et al., "Radiometric Calibration of Digital Photogrammetric Camera Image Data," ASPRS 2009 Annual Conference, Mar. 2009, 13 pages.
Second Office Action and Search Report received for China Patent Application No. 201110429145.X, mailed Aug. 1, 2014, 10 pages, including 3 pages of English translation.

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11017503B2 (en) 2010-12-20 2021-05-25 Microsoft Technology Licensing, LLC Techniques for atmospheric and solar correction of aerial images
US10109224B1 (en) * 2015-06-26 2018-10-23 Jayant Ratti Methods and devices for using aerial vehicles to create graphic displays and reconfigurable 3D structures
US10600162B2 (en) * 2016-12-29 2020-03-24 Konica Minolta Laboratory U.S.A., Inc. Method and system to compensate for bidirectional reflectance distribution function (BRDF)
CN110769577A (en) * 2019-10-18 2020-02-07 Oppo(重庆)智能科技有限公司 Atmosphere lamp control method and device
CN110769577B (en) * 2019-10-18 2022-02-25 Oppo(重庆)智能科技有限公司 Atmosphere lamp control method and device

Also Published As

Publication number Publication date
CN102567967B (en) 2015-10-28
US20170161878A1 (en) 2017-06-08
US11017503B2 (en) 2021-05-25
CN102567967A (en) 2012-07-11
US20120154584A1 (en) 2012-06-21

Similar Documents

Publication Publication Date Title
US11017503B2 (en) Techniques for atmospheric and solar correction of aerial images
US10127685B2 (en) Profile matching of buildings and urban structures
US10356317B2 (en) Wide-scale terrestrial light-field imaging of the sky
US20170031056A1 (en) Solar Energy Forecasting
US10013785B2 (en) Methods and systems for object based geometric fitting
US8665316B2 (en) Multi-resolution digital large format camera with multiple detector arrays
US10902660B2 (en) Determining and presenting solar flux information
JP5283214B2 (en) Fixed point observation apparatus and fixed point observation method
Wulder et al. Digital high spatial resolution aerial imagery to support forest health monitoring: the mountain pine beetle context
US10356343B2 (en) Methods and system for geometric distortion correction for space-based rolling-shutter framing sensors
CN111612901A (en) Extraction feature and generation method of geographic information image
CN111598777A (en) Sky cloud image processing method, computer device and readable storage medium
Arietta Estimation of forest canopy structure and understory light using spherical panorama images from smartphone photography
CN104580920A (en) Imaging processing method and user terminal
Liu et al. High-spatial-resolution nighttime light dataset acquisition based on volunteered passenger aircraft remote sensing
CN109785439A (en) Human face sketch image generating method and Related product
CN109377476A (en) The dynamic threshold acquisition methods and device of remote sensing image cloud detection characteristic parameter
Deng et al. Automatic true orthophoto generation based on three-dimensional building model using multiview urban aerial images
CN112785678B (en) Sunlight analysis method and system based on three-dimensional simulation
Fedorov et al. Snow phenomena modeling through online public media
Chijioke Satellite remote sensing technology in spatial modeling process: technique and procedures
WO2020061186A1 (en) An apparatus, methodologies and software applications for determining a level of direct sunlight
Wakter et al. A novel shade analysis technique for solar photovoltaic systems
Schulz et al. Automatic cloud top height determination in mountainous areas using a cost-effective time-lapse camera system
Roupioz et al. Quantifying the impact of cloud cover on ground radiation flux measurements using hemispherical images

Legal Events

Date Code Title Description
AS Assignment

Owner name: MICROSOFT CORPORATION, WASHINGTON

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:OMER, IDO;LIU, YUXIANG;SCHICKLER, WOLFGANG;AND OTHERS;SIGNING DATES FROM 20101217 TO 20101218;REEL/FRAME:025531/0791

AS Assignment

Owner name: MICROSOFT CORPORATION, WASHINGTON

Free format text: CORRECTIVE ASSIGNMENT TO CORRECT THE FIRST NAME OF INVENTOR LEDNER WAS RECORDED AS ROBER ON THE PAGE 1 OF THE NOTICE OF RECORDATION PREVIOUSLY RECORDED ON REEL 025531 FRAME 0791. ASSIGNOR(S) HEREBY CONFIRMS THE FIRST NAME OF INVENTOR LEDNER IS ROBERT AS IN ROBERT LEDNER;ASSIGNORS:OMER, IDO;LIU, YUXIANG;SCHICKLER, WOLFGANG;AND OTHERS;SIGNING DATES FROM 20101210 TO 20101218;REEL/FRAME:027346/0973

AS Assignment

Owner name: MICROSOFT CORPORATION, WASHINGTON

Free format text: CORRECTIVE ASSIGNMENT TO CORRECT THE FIRST NAME OF INVENTOR LEDNER WAS RECORDED AS ROBER ON THE PAGE 1 OF THE NOTICE OF RECORDATION PREVIOUSLY RECORDED ON REEL 025531 FRAME 0791. ASSIGNOR(S) HEREBY CONFIRMS THE FIRST NAME OF INVENTOR LEDNER IS ROBERT AS IN ROBERT LEDNER SIGNED ON 12/17/2010;ASSIGNORS:OMER, IDO;LIU, YUXIANG;SCHICKLER, WOLFGANG;AND OTHERS;SIGNING DATES FROM 20101217 TO 20101218;REEL/FRAME:027409/0676

AS Assignment

Owner name: MICROSOFT TECHNOLOGY LICENSING, LLC, WASHINGTON

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:MICROSOFT CORPORATION;REEL/FRAME:034544/0001

Effective date: 20141014

STCF Information on status: patent grant

Free format text: PATENTED CASE

MAFP Maintenance fee payment

Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1551); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

Year of fee payment: 4

FEPP Fee payment procedure

Free format text: MAINTENANCE FEE REMINDER MAILED (ORIGINAL EVENT CODE: REM.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY