CN112381882A - Unmanned aerial vehicle image automatic correction method carrying hyperspectral equipment - Google Patents

Unmanned aerial vehicle image automatic correction method carrying hyperspectral equipment

Info

Publication number
CN112381882A
Authority
CN
China
Prior art keywords
unmanned aerial vehicle
image
radiance
value
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202011213921.8A
Other languages
Chinese (zh)
Inventor
康苒 (Kang Ran)
徐天河 (Xu Tianhe)
龙霞 (Long Xia)
刘博威 (Liu Bowei)
周珊羽 (Zhou Shanyu)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shandong University
Original Assignee
Shandong University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shandong University filed Critical Shandong University
Priority to CN202011213921.8A priority Critical patent/CN112381882A/en
Publication of CN112381882A publication Critical patent/CN112381882A/en
Pending legal-status Critical Current


Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00: Image analysis
    • G06T 7/80: Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • G06T 2207/00: Indexing scheme for image analysis or image enhancement
    • G06T 2207/10: Image acquisition modality
    • G06T 2207/10032: Satellite or aerial image; Remote sensing

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Image Processing (AREA)

Abstract

The invention relates to an automatic correction method for images from an unmanned aerial vehicle carrying hyperspectral equipment, which comprises the following steps: S1, extracting the exposure time from the meta-information file of the unmanned aerial vehicle hyperspectral image data; S2, extracting the calibration coefficient from the calibration file of the unmanned aerial vehicle hyperspectral image; S3, calculating the radiance value of each pixel of the whole image; S4, taking the midpoint between the maximum and minimum radiance values, treating the part above the midpoint as the whiteboard area, and taking the median radiance of the whiteboard area as the whiteboard radiance; S5, obtaining the surface reflectance of each pixel; S6, evaluating the raw data and applying a first-derivative transform to the raw attitude and positioning data; finally, interpolating across the jump-point data. The method effectively improves earth observation accuracy, quickly achieves pixel-by-pixel automatic preprocessing of unmanned aerial vehicle hyperspectral images, and overcomes the low correction accuracy and slow speed of existing methods.

Description

Unmanned aerial vehicle image automatic correction method carrying hyperspectral equipment
Technical Field
The invention relates to the hyperspectral field of unmanned aerial vehicles, in particular to an unmanned aerial vehicle image automatic correction method carrying hyperspectral equipment.
Background
At present, an unmanned aerial vehicle carrying a camera is a convenient and fast way to acquire ground-object information over small areas, and hyperspectral imaging offers high spectral and spatial resolution, unified acquisition of image and spectrum, and a flexible operating mode, so that latent ground-object information can be acquired more accurately and quickly. Combining unmanned aerial vehicles with hyperspectral imaging has created a new technology in the remote sensing field, but it brings new challenges such as large data volumes and the need for automatic information processing.
Compared with relatively stable spaceborne platforms, an unmanned aerial vehicle at low altitude is more easily affected by atmospheric turbulence, and the resulting sensor instability can cause serious geometric distortion in the data. Severe gimbal jitter can lead to data acquisition anomalies that make subsequent orthorectification very difficult. In addition, a lower-accuracy GNSS/IMU module can introduce serious geometric problems, producing drift, jumps, gaps, and errors in the data.
Existing orthorectification methods for unmanned aerial vehicle hyperspectral cameras require manual input of geometric and atmospheric parameters, including elevation, viewing azimuth, pixel longitude and latitude, and various compensation parameters. Such correction workflows usually cannot read the relevant information automatically, apply only to a single data acquisition, generalise poorly, involve cumbersome parameter entry and calculation, and have low processing efficiency.
Disclosure of Invention
The invention aims to provide an unmanned aerial vehicle image automatic correction method carrying hyperspectral equipment, and aims to solve the problems in the background technology.
In order to achieve the purpose, the invention provides the following technical scheme:
the unmanned aerial vehicle image automatic correction method carrying the hyperspectral equipment is characterized by comprising the following steps:
s1, extracting exposure time from a meta-information file of hyperspectral image data of the unmanned aerial vehicle;
s2, extracting a calibration coefficient from a calibration file of the hyperspectral image of the unmanned aerial vehicle;
s3, calculating the radiance value of each pixel of the whole image according to the extracted exposure time, the extracted calibration coefficient, the acquired hyperspectral image metadata of the unmanned aerial vehicle and the acquired dark current data and a radiometric calibration formula;
s4, taking a median value between the maximum radiance value and the minimum radiance value, wherein the part which is larger than the median value is a whiteboard area, eliminating abnormal values, and taking the median value of the radiance values of the whiteboard area as the radiance value of the whiteboard;
s5, reading the calculated image radiance value and whiteboard radiance value according to a flat field correction formula in atmospheric correction, and performing atmospheric correction to obtain the earth surface reflectivity of each pixel;
s6, reading the imu-gps file, the setting file and the DEM file of each image, evaluating the original data, and performing least square difference calculation; then, performing first derivative transformation on the original attitude positioning data, performing gross error rejection processing based on a 3 sigma principle, and screening out the position of a jump point, namely accurate positioning/attitude data; finally, interpolation is performed based on the jump point data.
As a further scheme of the invention: the radiometric calibration formula is as follows:
ρ_radiance = (DN_data - DN_darkcurrent) × CFF_sensorconfig / t_e

wherein: ρ_radiance is the spectral radiance value obtained by the sensor, in mW/(cm²·sr·μm);

DN_data is the DN value corresponding to the pixel on the collected unmanned aerial vehicle hyperspectral image;

DN_darkcurrent is the DN value corresponding to the pixel on the collected dark-current image;

CFF_sensorconfig is the calibration coefficient of the hyperspectral camera carried on the unmanned aerial vehicle;

t_e is the exposure time set for the collected unmanned aerial vehicle hyperspectral image.
As a still further scheme of the invention: the flat field correction formula is as follows:
ρ_reflectance = ρ(T)_radiance / ρ(W)_radiance

wherein: ρ(T)_radiance is the spectral radiance value of a ground-object pixel of the unmanned aerial vehicle hyperspectral image, in mW/(cm²·sr·μm);

ρ(W)_radiance is the spectral radiance value of a whiteboard pixel of the unmanned aerial vehicle hyperspectral image, in mW/(cm²·sr·μm).
As a still further scheme of the invention: the 3 sigma principle is as follows:
the probability of the numerical distribution in (μ - σ, μ + σ) is 0.6826;
the probability of the numerical distribution in (μ - 2σ, μ + 2σ) is 0.9544;
the probability of the numerical distribution in (μ - 3σ, μ + 3σ) is 0.9974;
wherein: σ denotes the standard deviation, μ denotes the mean, and x = μ is the axis of symmetry of the distribution curve.
Compared with the prior art, the invention has the following beneficial effects: the method is novel in design; it realises automatic radiometric calibration, atmospheric correction and orthorectification of hyperspectral data from any unmanned aerial vehicle, greatly reduces repetitive work, improves generality, effectively improves earth observation accuracy, quickly achieves pixel-by-pixel automatic preprocessing of unmanned aerial vehicle hyperspectral images, and overcomes the low correction accuracy and slow speed of existing methods.
Drawings
Fig. 1 is a flowchart of an unmanned aerial vehicle image automatic correction method for carrying hyperspectral equipment.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
In addition, when an element of the present invention is described as being "fixed to" or "disposed on" another element, it can be directly on the other element or intervening elements may be present. When an element is referred to as being "connected" to another element, it can be directly connected to the other element or intervening elements may also be present. The terms "vertical", "horizontal", "left", "right" and the like as used herein are for illustrative purposes only and do not represent the only embodiments.
Referring to fig. 1, in an embodiment of the present invention, an unmanned aerial vehicle image automatic correction method for a hyperspectral device is characterized in that the unmanned aerial vehicle image automatic correction method for the hyperspectral device includes the following steps:
s1, extracting exposure time from a meta-information file 'setting. txt' of hyperspectral image data of the unmanned aerial vehicle;
s2, extracting a calibration coefficient from a calibration file 'radioCal.raw' of the hyperspectral image of the unmanned aerial vehicle;
s3, calculating the radiance value of each pixel of the whole image according to the extracted exposure time, the extracted calibration coefficient, the acquired hyperspectral image metadata of the unmanned aerial vehicle and the acquired dark current data and a radiometric calibration formula;
the radiometric calibration formula is as follows:
ρ_radiance = (DN_data - DN_darkcurrent) × CFF_sensorconfig / t_e

wherein: ρ_radiance is the spectral radiance value obtained by the sensor, in mW/(cm²·sr·μm);

DN_data is the DN value corresponding to the pixel on the collected unmanned aerial vehicle hyperspectral image;

DN_darkcurrent is the DN value corresponding to the pixel on the collected dark-current image;

CFF_sensorconfig is the calibration coefficient of the hyperspectral camera carried on the unmanned aerial vehicle;

t_e is the exposure time set for the collected unmanned aerial vehicle hyperspectral image;
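The calibration formula above amounts to a per-pixel array operation. The sketch below is illustrative only, not the patent's implementation; the function and variable names are assumptions:

```python
import numpy as np

def radiometric_calibration(dn_data, dn_dark, cff, exposure_ms):
    """Per-pixel rho = (DN_data - DN_darkcurrent) * CFF / t_e,
    giving spectral radiance in mW/(cm^2*sr*um)."""
    return (dn_data.astype(np.float64)
            - dn_dark.astype(np.float64)) * cff / exposure_ms

# Toy 2x2 single-band frame with a constant dark-current level.
dn = np.array([[120, 340], [560, 80]])
dark = np.full((2, 2), 20)
radiance = radiometric_calibration(dn, dark, cff=0.5, exposure_ms=10.0)
```

If cff is a per-band vector rather than a scalar, the same expression applies band by band through NumPy broadcasting.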
s4, taking a median value between the maximum radiance value and the minimum radiance value, wherein the part which is larger than the median value is a whiteboard area, eliminating abnormal values, and taking the median value of the radiance values of the whiteboard area as the radiance value of the whiteboard;
s5, reading the calculated image radiance value and whiteboard radiance value according to a flat field correction formula in atmospheric correction, and performing atmospheric correction to obtain the earth surface reflectivity of each pixel;
the flat field correction formula is as follows:
ρ_reflectance = ρ(T)_radiance / ρ(W)_radiance

wherein: ρ(T)_radiance is the spectral radiance value of a ground-object pixel of the unmanned aerial vehicle hyperspectral image, in mW/(cm²·sr·μm);

ρ(W)_radiance is the spectral radiance value of a whiteboard pixel of the unmanned aerial vehicle hyperspectral image, in mW/(cm²·sr·μm);
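Steps S4 and S5 above (whiteboard detection by thresholding at the midpoint between the extreme radiance values, then flat-field division) can be sketched as follows; the outlier test and all names are illustrative assumptions:

```python
import numpy as np

def flat_field_reflectance(radiance):
    """Whiteboard detection (S4) + flat-field division (S5) on one band."""
    mid = (radiance.max() + radiance.min()) / 2.0   # midpoint threshold
    white = radiance[radiance > mid]                # candidate whiteboard pixels
    mu, sigma = white.mean(), white.std()
    white = white[np.abs(white - mu) <= 3.0 * sigma + 1e-12]  # drop outliers
    rho_w = np.median(white)                        # whiteboard radiance rho(W)
    return radiance / rho_w                         # reflectance rho(T)/rho(W)

# Synthetic band: three bright whiteboard pixels among darker ground objects.
rad = np.array([10.0, 12.0, 11.0, 95.0, 100.0, 105.0])
refl = flat_field_reflectance(rad)
```

Whiteboard pixels end up with reflectance near 1, ground objects well below it.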
S6, reading the imu-gps file, the setting file and the DEM file of each image, evaluating the raw data, and performing a least-squares calculation; then applying a first-derivative transform to the raw attitude and positioning data, performing gross-error rejection based on the 3σ principle, and screening out the jump-point positions so that accurate positioning/attitude data remain; finally, interpolating across the jump-point data;
the 3 sigma principle is as follows:
the probability of the numerical distribution in (μ - σ, μ + σ) is 0.6826;
the probability of the numerical distribution in (μ - 2σ, μ + 2σ) is 0.9544;
the probability of the numerical distribution in (μ - 3σ, μ + 3σ) is 0.9974;
wherein: σ denotes the standard deviation, μ denotes the mean, and x = μ is the axis of symmetry of the distribution curve.
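The S6 screening described above can be sketched as a first-difference transform, a 3σ gross-error test, and interpolation over the rejected samples. The series below is synthetic and the function name is an assumption:

```python
import numpy as np

def repair_jumps(series):
    """Flag jump points in a positioning series with a first-difference
    3-sigma test, then re-fill them by linear interpolation (S6 sketch)."""
    d = np.diff(series, prepend=series[0])      # first-derivative transform
    mu, sigma = d.mean(), d.std()
    bad = np.abs(d - mu) > 3.0 * sigma          # gross-error (3-sigma) test
    repaired = series.astype(np.float64).copy()
    repaired[bad] = np.interp(np.flatnonzero(bad),
                              np.flatnonzero(~bad), series[~bad])
    return repaired, bad

t = np.arange(20, dtype=np.float64)             # smooth ramp of positions ...
t[10] = 500.0                                   # ... with one GPS-style jump
fixed, flagged = repair_jumps(t)
```

Note that a single spike makes two consecutive differences large, so samples 10 and 11 are both flagged; interpolation from the retained neighbours restores both.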
In the embodiment of the invention, the preprocessing method provided by the invention can realise automatic radiometric calibration, atmospheric correction and orthorectification of hyperspectral data from any unmanned aerial vehicle, greatly reduce repetitive work, improve generality, effectively improve earth observation accuracy, quickly achieve pixel-by-pixel automatic preprocessing of unmanned aerial vehicle hyperspectral images, and overcome the low correction accuracy and slow speed of existing methods.
In the embodiment of the present invention, it should be noted that the exposure time is extracted from the field 'exposure (ms)', and the calibration coefficient is extracted from the field 'Sensor calibration'.
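Assuming 'setting.txt' uses a simple key = value layout (the layout is a guess; only the field name 'exposure (ms)' comes from the description), the extraction in S1 might look like:

```python
def read_exposure(text):
    """Pull the 'exposure (ms)' value out of key = value lines."""
    for line in text.splitlines():
        if line.strip().lower().startswith("exposure (ms)"):
            return float(line.split("=", 1)[1].strip())
    raise KeyError("exposure (ms) field not found")

# Hypothetical setting.txt content; only the field name is from the patent.
sample = "camera = hyperspec-A\nexposure (ms) = 12.5\nframes = 200\n"
t_e = read_exposure(sample)
```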
In the embodiment of the invention, because the difference between the radiance of the whiteboard and that of other ground objects on the image is large in the 500-700 nm range, preferably, the midpoint between the maximum and minimum radiance values is taken, and the part larger than the midpoint is the whiteboard area. After abnormal values are eliminated, the median radiance of the whiteboard area is taken as the radiance value of the whiteboard.
In the embodiment of the present invention, the least squares method (also called the method of least squares) is a mathematical optimization technique widely applied in data-processing fields such as error estimation, uncertainty analysis, system identification and prediction. By minimising the sum of squared errors, it finds the best functional match to the data, so that unknown quantities can be determined simply and conveniently while the sum of squared errors between the fitted and actual data is minimal. Since least squares is a common calculation method, it is not described further here.
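As a minimal illustration of least squares on synthetic data (not the patent's actual evaluation step), fitting a line by minimising the sum of squared residuals:

```python
import numpy as np

# Synthetic samples lying exactly on y = 2x + 1.
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = 2.0 * x + 1.0
A = np.vstack([x, np.ones_like(x)]).T        # design matrix [x, 1]
coef, residuals, rank, sv = np.linalg.lstsq(A, y, rcond=None)
slope, intercept = coef                      # minimises sum of squared errors
```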
In the embodiment of the present invention, it should be further noted that the first derivative of a function at a point describes the rate of change of the function in the vicinity of that point; in essence, the derivative is a local linear approximation of the function obtained through the concept of a limit. When the argument of a function f takes an increment h at a point x₀, the limit, as h approaches 0, of the ratio of the increment of the function value to h, if it exists, is the derivative of f at x₀.
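The definition above can be checked numerically: the difference quotient approaches the derivative as h shrinks (toy example with f(x) = x², not from the patent):

```python
def diff_quotient(f, x0, h):
    """(f(x0 + h) - f(x0)) / h, the ratio whose limit defines f'(x0)."""
    return (f(x0 + h) - f(x0)) / h

f = lambda x: x ** 2          # f'(x) = 2x, so f'(3) = 6
approx = diff_quotient(f, 3.0, 1e-6)
```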
It will be evident to those skilled in the art that the invention is not limited to the details of the foregoing illustrative embodiments, and that the present invention may be embodied in other specific forms without departing from the spirit or essential attributes thereof. The present embodiments are therefore to be considered in all respects as illustrative and not restrictive, the scope of the invention being indicated by the appended claims rather than by the foregoing description, and all changes which come within the meaning and range of equivalency of the claims are therefore intended to be embraced therein. Any reference sign in a claim should not be construed as limiting the claim concerned.
Furthermore, it should be understood that although this specification is described in terms of embodiments, not every embodiment contains only one independent technical solution. This manner of description is merely for clarity; those skilled in the art should take the specification as a whole, and the technical solutions in the embodiments may be combined as appropriate to form other implementations understandable to those skilled in the art.

Claims (4)

1. The unmanned aerial vehicle image automatic correction method carrying the hyperspectral equipment is characterized by comprising the following steps:
s1, extracting exposure time from a meta-information file of hyperspectral image data of the unmanned aerial vehicle;
s2, extracting a calibration coefficient from a calibration file of the hyperspectral image of the unmanned aerial vehicle;
s3, calculating the radiance value of each pixel of the whole image according to the extracted exposure time, the extracted calibration coefficient, the acquired hyperspectral image metadata of the unmanned aerial vehicle and the acquired dark current data and a radiometric calibration formula;
s4, taking a median value between the maximum radiance value and the minimum radiance value, wherein the part which is larger than the median value is a whiteboard area, eliminating abnormal values, and taking the median value of the radiance values of the whiteboard area as the radiance value of the whiteboard;
s5, reading the calculated image radiance value and whiteboard radiance value according to a flat field correction formula in atmospheric correction, and performing atmospheric correction to obtain the earth surface reflectivity of each pixel;
s6, reading the imu-gps file, the setting file and the DEM file of each image, evaluating the original data, and performing least square difference calculation; then, performing first derivative transformation on the original attitude positioning data, performing gross error rejection processing based on a 3 sigma principle, and screening out the position of a jump point, namely accurate positioning/attitude data; finally, interpolation is performed based on the jump point data.
2. The unmanned aerial vehicle image automatic correction method carrying hyperspectral equipment according to claim 1, characterized in that the radiometric calibration formula is:

ρ_radiance = (DN_data - DN_darkcurrent) × CFF_sensorconfig / t_e

wherein: ρ_radiance is the spectral radiance value obtained by the sensor, in mW/(cm²·sr·μm);

DN_data is the DN value corresponding to the pixel on the collected unmanned aerial vehicle hyperspectral image;

DN_darkcurrent is the DN value corresponding to the pixel on the collected dark-current image;

CFF_sensorconfig is the calibration coefficient of the hyperspectral camera carried on the unmanned aerial vehicle;

t_e is the exposure time set for the collected unmanned aerial vehicle hyperspectral image.
3. The unmanned aerial vehicle image automatic correction method carrying hyperspectral equipment according to claim 1, characterized in that the flat field correction formula is:

ρ_reflectance = ρ(T)_radiance / ρ(W)_radiance

wherein: ρ(T)_radiance is the spectral radiance value of a ground-object pixel of the unmanned aerial vehicle hyperspectral image, in mW/(cm²·sr·μm);

ρ(W)_radiance is the spectral radiance value of a whiteboard pixel of the unmanned aerial vehicle hyperspectral image, in mW/(cm²·sr·μm).
4. The unmanned aerial vehicle image automatic correction method carrying hyperspectral equipment according to claim 1, wherein the 3σ principle is as follows:

the probability of the numerical distribution in (μ - σ, μ + σ) is 0.6826;

the probability of the numerical distribution in (μ - 2σ, μ + 2σ) is 0.9544;

the probability of the numerical distribution in (μ - 3σ, μ + 3σ) is 0.9974;

wherein: σ denotes the standard deviation, μ denotes the mean, and x = μ is the axis of symmetry of the distribution curve.
CN202011213921.8A 2020-11-04 2020-11-04 Unmanned aerial vehicle image automatic correction method carrying hyperspectral equipment Pending CN112381882A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011213921.8A CN112381882A (en) 2020-11-04 2020-11-04 Unmanned aerial vehicle image automatic correction method carrying hyperspectral equipment

Publications (1)

Publication Number Publication Date
CN112381882A true CN112381882A (en) 2021-02-19

Family

ID=74577989

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011213921.8A Pending CN112381882A (en) 2020-11-04 2020-11-04 Unmanned aerial vehicle image automatic correction method carrying hyperspectral equipment

Country Status (1)

Country Link
CN (1) CN112381882A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113643388A (en) * 2021-10-14 2021-11-12 深圳市海谱纳米光学科技有限公司 Black frame calibration and correction method and system for hyperspectral image
CN115615938A (en) * 2022-12-14 2023-01-17 天津中科谱光信息技术有限公司 Water quality analysis method and device based on reflection spectrum and electronic equipment

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102446347A (en) * 2010-10-09 2012-05-09 株式会社理光 White balance method and device for image
CN103353616A (en) * 2013-07-05 2013-10-16 吉林大学 Method used for fast recognition of oil gas micro leakage and based on hyperspectral remote sensing data
CN103383348A (en) * 2013-05-28 2013-11-06 吉林大学 Method for extracting altered mineral at vegetation-covered areas by hyperspectral remote sensing
CN105387846A (en) * 2015-10-26 2016-03-09 中国农业大学 Normal incidence correcting method and system for satellite images

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
周珊羽 (Zhou Shanyu): "Inversion of crop chlorophyll content based on multi-angle observations of a UAV hyperspectral system", China Master's Theses Full-text Database, Agricultural Science and Technology Series *
黄微 (Huang Wei) et al.: "Spatial distribution of light fishing vessels in the sea area northwest of Luzon Island based on remote sensing and its relationship with ocean eddies", Acta Oceanologica Sinica *


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
WD01 Invention patent application deemed withdrawn after publication

Application publication date: 20210219