CN116027349A - Coral reef substrate classification method based on laser radar and side-scan sonar data fusion
- Publication number: CN116027349A (application CN202211503490.8A)
- Authority: CN (China)
- Prior art keywords: laser radar, image, scan sonar, laser, coral reef
- Legal status: Pending
Abstract
The invention discloses a coral reef substrate classification method based on laser radar and side-scan sonar data fusion, which mainly comprises the following steps: (1) by analyzing the structure of a photon counting laser radar scanning system, constructing an accurate coordinate calculation model for unmanned airborne laser radar underwater sounding points; (2) constructing a geocoded side-scan sonar image through fine processing steps such as sea bottom line tracking and radiation distortion joint correction; (3) fusing the backscattering intensities of the laser radar and the side-scan sonar to construct a submarine topography image; (4) improving the standard wavelet BP neural network; (5) on the basis of the fused image, extracting coral reef type samples, selecting characteristic parameters with high self-aggregation degree and resolution for sample training, and classifying coral reef substrates with the sample training results. Through these steps, coral reef substrate classification based on laser radar and side-scan sonar data fusion can be realized, providing technical support for identifying coral reef types.
Description
Technical Field
The invention relates to a coral reef substrate classification method based on laser radar and side-scan sonar data fusion, and in particular to a method for calculating the coordinates of underwater sounding points of a photon counting laser radar based on underwater ray tracing, and to a technology for fusing laser radar and side-scan sonar data.
Background
Coral reefs are among the most biodiverse marine ecosystems and have the highest primary productivity, playing an important role in the health and sustainable development of human society and the marine ecological environment, so investigation, monitoring and protection of coral reefs are urgent and important tasks at present. Compared with underwater manual photography or investigation, unmanned airborne laser radar detection of coral reefs is highly efficient, accurate and wide-covering, but absorption of the laser by the water body limits the measurable depth, generally to between a few meters and tens of meters. Shipborne side-scan sonar has long detection range and high scanning density, but its positioning accuracy is not high. By exploiting their complementary advantages, a coral reef substrate image of large water depth, high accuracy and high density can be constructed, and on the basis of this substrate image an improved wavelet BP neural network is used to classify the various coral reef substrates.
The United States was the first country in the world to carry out research on airborne laser sounding systems. In 1968, Hickman and Hogg of Syracuse University built the world's first laser seawater measurement system, verified the feasibility of laser water depth measurement for the first time, and initially established the theoretical basis of ocean laser detection technology [1]. Thereafter, the United States Navy successfully developed an airborne pulse laser system (PLADS) and tested it in 1971 [2]. The United States National Aeronautics and Space Administration (NASA) successfully developed an airborne laser bathymeter (ALB) [3]; experiments carried out from 1971 to 1974 using a 50 Hz YAG laser showed that it could measure depths to about 10 m when the disc transparency of the water was 5 m. At the end of the 1970s, NASA developed an airborne oceanographic lidar (AOL) device with scanning and high-speed data recording capabilities [4], and mapped submarine topography at water depths of less than 10 m using a 400 Hz, low peak power (2 kW) helium-neon laser.
In the mid-1980s, the United States Army Corps of Engineers (USACE) started the development project of the airborne laser scanning sounding system SHOALS [5]; the project was eventually completed with the support of the Optech company of Canada. The SHOALS system was initially applied to navigation route environmental measurement and soon developed into a coastal region mapping system. SHOALS has become one of the main means of offshore sounding today, and Optech has partnerships with the United States Naval Meteorology and Oceanography Command and the Naval Oceanographic Office; these relationships have given the SHOALS system a large volume of tasks such as chart making and rapid environmental assessment for military training. Through more than thirty years of effort, Optech successively developed the SHOALS 200 (1993), SHOALS 400 (1998), SHOALS 1000 (2003) and SHOALS 3000 (2006) series for water depth measurement [6]. Table 1 compares the parameters of several shallow sea mapping lidar systems.
Table 1 Parameter comparison of several shallow sea mapping lidar systems

Parameter | SHOALS 3000T | Hawk Eye II | LADS MK II |
Measuring frequency | 3 kHz | 4 kHz | 900 Hz |
Flying height | 300~400 m | 250~500 m | 366~671 m |
Water depth accuracy | IHO Order 1 | IHO Order 1 | IHO Order 1 |
Horizontal accuracy | IHO Order 1 | IHO Order 1 | 5 m CEP 95% |
Minimum detection depth | 0.2 m | 0.3 m | 0.5 m |
Maximum detection depth | 50 m | 3 times disc transparency | 70 m |
Scan width | maximum 0.75 times flying height | - | 100~350 m |
In addition, the airborne bathymetric laser scanning systems developed by the RIEGL company are also highly competitive; the parameters of the two waterway combined-survey laser scanning systems developed by RIEGL are compared in Table 2. The amphibious laser scanning systems developed by RIEGL comprise the light and compact BathyCopter and the high-efficiency VQ-880-G [7]. The BathyCopter is of small mass but can only perform single-point scanning, so its mapping efficiency is low; the VQ-880-G adopts a linear detection system and therefore cannot meet the signal-to-noise ratio requirement of low-reflectivity substrates, and the system is heavy and power-hungry, so it cannot be applied on an unmanned aerial vehicle platform.
Table 2 parameter comparison of RIEGL amphibious laser scanning systems
In traditional mapping lidar detection systems, whether waveform digital sampling, pulse width measurement or multi-pulse measurement is used, the essence is the detection of echo waveforms. Such detection systems cannot make full use of the photon energy in the echo pulse, so the requirements on laser single-pulse energy and system optical aperture are high. To overcome the low efficiency of the linear detection system, a photon counting detection system with single-photon sensitivity was introduced into the field of mapping lidar.
Lidar in the photon counting regime was first successfully used for mapping the ground in the Instrument Incubator Program (IIP) on a NASA P-3 aircraft. The NASA Goddard Space Flight Center first conducted research in this area. The first-generation airborne verification system, called Micro Altimeter, used a 532 nm laser with a repetition frequency of 10 kHz and a pulse energy of 2 μJ; a 2×2-element photomultiplier tube (PMT) working in photon counting mode served as the echo detector, transmission and reception shared a 20 cm aperture off-axis telescope, and conical ground scanning imaging was realized with a single optical wedge in front of the telescope [8].
On the basis of the Micro Altimeter, researchers at the Goddard Space Flight Center and the Sigma company developed the second-generation verification system, the Imaging Photon-counting Altimeter (IPA) [9]. The laser wavelength remained 532 nm, the repetition frequency was increased to 22 kHz and the pulse energy to 6.4 μJ; a PMT operating in photon counting mode was still used as the echo detector, but the element count was increased to 10×10. The system adopts a double optical wedge scanner to provide one-dimensional and two-dimensional scanning for different platform speeds, realizing wide-swath imaging in a single fly-over.
Owing to its extremely high sensitivity, a photon counting detection system can penetrate a certain water depth and acquire shallow-water terrain. On this basis, researchers at the University of Florida developed a photon counting lidar prototype (CATS) [10] for measuring coastal zone areas, and successfully measured terrain at water depths below 5 m.
The world's first satellite to carry a laser altimeter was ICESat-1, launched by NASA in 2003, whose primary payload was the Geoscience Laser Altimeter System (GLAS); its successor, ICESat-2, carries the Advanced Topographic Laser Altimeter System (ATLAS) [11]. This payload realizes the multi-beam push-broom capability that ICESat-1 lacked, and because a high repetition frequency (10 kHz) photon counting detection system is adopted, the required laser energy is greatly reduced: the total pulse energy before beam splitting is only 400 μJ, yet a measuring accuracy of about 10 cm and a horizontal resolution of about 70 cm are achieved.
Research on laser radar sounding technology in China started in the 1980s, with related technical research and system development. At present, the newly developed Mapper5000 system has completed multiple flight tests in sea areas near certain islands in the South China Sea, obtaining three-dimensional topographic data of the South China Sea islands with a maximum measured depth of 51.00 m, a shallowest depth of 0.25 m and a sounding accuracy of 0.23 m, laying a good technical foundation for the development of China's spaceborne ocean exploration lidar [12].
In terms of laser radar sounding algorithms, scholars at home and abroad have also done much work. Guenther found through statistical analysis that the water surface echo intensity in the received blue-green channel waveform may deviate considerably owing to environmental factors, and pointed out that the detected water surface echo may sometimes be mixed with the water body backscatter, or may simply be the water body backscatter waveform; he called this the "water surface uncertainty" problem and considered the water surface position determined from the blue-green waveform alone to be inaccurate [13]. Allouis proposed an algorithm flow that improves shallow-water depth extraction accuracy by jointly using near-infrared and blue-green lasers: first, amplitude correction and time offset correction are applied to the red and blue-green bands; then the amplitude of the near-infrared band is adjusted to correspond to the green band; the adjusted near-infrared signal is subtracted from the green band signal to obtain the water bottom signal; finally, the distance corresponding to the peak difference between the near-infrared signal and the water bottom signal is taken as the shallow water depth [14]. Wong and Antoniou proposed automatically calculating the water depth by decomposing the waveform into a surface echo and a bottom echo using an exponentially modified Gaussian function (Exponentially Modified Gaussian, EMG), which is said to estimate the water depth accurately even when the two echoes almost completely coincide [15]. Cheng et al. hold that water body backscattering shifts the peak positions of the water surface and water bottom echoes, and therefore proposed decomposing the waveform with the EMG into three parts: water surface echo, water body backscatter and water bottom echo [16]. Liu et al. developed a refraction correction method that simultaneously considers sea surface waves and the beam incidence angle; the model uses sea surface wave theory and Snell's law to determine the propagation distance of photons in water and corrects the position through geometric relationships. Verification at the selected research site shows that the proposed refraction correction method corrects water depth errors accurately and effectively [17].
In terms of side-scan sonar data processing, foreign research institutions and scholars have long conducted systematic studies. Research on the preprocessing of side-scan sonar data was begun by the marine geology program of the U.S. Geological Survey as early as 1979; its core member Chavez systematically described the links of side-scan sonar data processing in 2002 [18]. He divides the processing of side-scan sonar strip data into two parts, preprocessing and geocoding: preprocessing mainly comprises geometric correction and radiation correction of the waterfall image, while geocoding mainly comprises homing calculation and gap filling of the waterfall image. Owing to commercial confidentiality, Chavez only analyzed the causes of each link and the necessity of processing theoretically, without giving specific algorithms. Blondel published a monograph in 2009 serving as a handbook for side-scan sonar use [19]; it focuses on the application of side-scan sonar to target identification and feature interpretation, and only its fourth chapter systematically introduces the links of side-scan sonar data processing. Chang discussed gray level equalization and gap filling of side-scan sonar images in 2010, adopting a statistical method to treat transverse and longitudinal gray imbalance separately and an interpolation method combined with the sector footprint theory to fill gaps in the geocoded image, improving sonar image quality in tests [20]; however, the test object selected by Chang was only a very short section of side-scan sonar image in which the course hardly changed, and the longitudinal gray level variation of a sonar image is influenced not only by the height of the towed fish above the seabed but also by its motion state, so longitudinal gray imbalance is difficult to eliminate by a statistical method alone. Daniel in 1998 took the shadow of the target as the feature, constructed topological nets of target shadows on adjacent strip sonar images, and shifted the relative positions of the two nets to maximize their correlation, thereby registering adjacent strip images [21]; this method only helps with overall positional deviation, and accurate registration of strip images is difficult to achieve. The SIFT (Scale Invariant Feature Transform) algorithm extracts image features from a multi-scale perspective and is robust to image rotation, translation, scaling and brightness changes; Vandrish preliminarily realized the matching of sonar images with a common-view target using this algorithm in 2011 for underwater vehicle navigation [22], but when the sonar image is badly distorted it is difficult to extract correct SIFT feature point pairs. Chailloux in 2011 selected the correlation coefficient and mutual information as similarity measurement parameters [23,24]; a series of clearly correlated blocks can be extracted from adjacent strip images through these two parameters, and overall rigid transformation parameters and local elastic transformation parameters can be obtained from the correlated blocks, realizing mosaicking and splicing of adjacent strip images.
The above image registration methods have certain limitations: noise and distortion make the extraction of feature point pairs or correlated blocks difficult, the distortion type and magnitude of side-scan sonar images differ along the track direction, and local deviations between adjacent strip images are difficult to eliminate through rigid transformation.
Research on side-scan sonar data processing in China started later but has progressed relatively quickly. In waterfall image processing, Li Jun, Teng Huizhong et al. (2002) described in detail the decoding and conversion of side-scan sonar data in the Qmips and XTF formats [25]; Deng Xueqing et al. (2002) summarized the basic model of side-scan sonar image geocoding for Qmips format data [26]; Gao Junguo, Li Zenglin et al. (2003) analyzed the causes of displacement, deformation and noise in sonar images and their impact on target interpretation [27], conclusions that help correct interpretation of targets; Yang Fanlin et al. (2004) conducted related studies on the preprocessing of side-scan sonar images from the viewpoint of data fusion, focused on the influence of local gray scale mutation on gray scale equalization, proposed a method of detecting mutation regions by wavelet transform to correct the gain coefficients of local regions, and obtained good results on simulated images [28]; Guo Haitao, Sun Dajun et al. (2002) applied the attribute histogram to fuzzy enhancement of sonar images, significantly improving their visual effect [29]; Teng Huizhong, Yan Xiaoming et al. (2004) summarized common sonar image enhancement methods, using a statistical method to achieve distance-wise gray equalization, high-pass filtering to eliminate speckle and streak noise, low-pass filtering to eliminate gray differences caused by nonlinear attenuation of sound waves at the seafloor, and block histogram normalization to equalize the mosaic image [30].
In terms of coral reef substrate classification, scholars at home and abroad have done a great deal of systematic research. Purkis (2005) trained samples on IKONOS imagery using in-situ spectral measurements for identifying coral reef biological geomorphic sub-zones, and carried out remote sensing identification of dense living coral, dense dead coral, sparse living coral, seaweed, shallow-water seagrass, deep-water seagrass, rock, sand and so on, finally obtaining an overall accuracy of 69% [31]. Phinn et al. (2012), following the hierarchical classification system of NCCOS and based on the OBIA method provided by Definiens Developer 7.0 (eCognition) software, identified the geomorphic zones and geomorphic micro-units of coral reefs such as Heron (Australia), Ngederrak (Palau) and Fiji from 2.4 m resolution QuickBird images [32]. The results show that the overall accuracy of the method can reach about 85% for geomorphic micro-units, better than pixel-based supervised classification. Roelfsema et al. (2013) used similar methods and classification systems to conduct geomorphic zoning and geomorphic micro-unit mapping experiments on the reefs of Fiji's Kadavu and Kubulau and the Solomon Islands' Roviana with 2.4 m resolution QuickBird and 4 m resolution IKONOS images; the results show that increasing the number of geomorphic zoning and micro-unit classes does not greatly affect overall accuracy, and geomorphic micro-unit mapping can achieve 70%-90% overall classification accuracy [33]. Jakeman simulated the probability distribution of seabed side-scan sonar images with the K distribution and obtained distribution parameters to classify seabed substrates, but the K distribution method relies on determining cluster centers: if the cluster center position is inaccurate, classification accuracy suffers and the classification boundaries are hard to divide [34]. Pace and Dyer proposed classifying the seabed with feature quantities based on the gray level co-occurrence matrix of the image, but because few image features are used, the classification accuracy is not high [35]. Luciano Fonseca proposed classifying the seabed substrate of the Stanton coastal research area by combining image mosaicking with seabed echo reflection angle analysis, using a high-resolution mosaicking technique to improve the resolution of the reflection angle analysis, but failed to segment the image simultaneously in texture space and angular response space [36]. Jin Li compared 14 machine learning methods, such as inverse distance weighting and random forest, to classify the seafloor sediment of a research area on the southwest margin of Australia; the combination of the two methods is robust, but there is no consistent model when the observed data change, the search window size is not determined from multiple levels when searching neighboring values, and reasonable range values related to the sampling number are not determined [37]. Tom K classified subsea sediment using artificial neural networks and non-parametric methods [38].
Chakraborty B used a hybrid neural network to classify the seabed substrate [39]; Jianming et al. (2014) proposed a 3-level classification system of South China Sea coral reef geomorphology and geomorphic micro-units based on the formation process and structural characteristics of South China Sea coral reef geomorphology [40], and constructed a coral reef geomorphic micro-unit extraction model suitable for WorldView-2 images based on the OBIA method; practical application proves that the overall classification accuracy of the model can reach 85.75%. Zhouxi et al. (2015), referring to the NCCOS hierarchical classification system, proposed for domestic CBERS-02B images a geomorphic zone and geomorphic micro-unit extraction framework with strong noise immunity, driven jointly by model and data under dual-scale transformation [41]; applied to an atoll test area, it obtained an overall classification accuracy of 88.89%, better than the data-driven ML and SVM methods. Tang Qiuhua classified subsea substrates using a Learning Vector Quantization (LVQ) neural network combined with a genetic algorithm [42]. In addition, Huang Z pointed out that existing methods of classifying the seabed substrate with a BP neural network suffer from slow network convergence, easy trapping in local minima, an unstable learning rate and poor gradient characteristics of the network correction function [43].
In summary, although unmanned airborne lidar and shipborne side-scan sonar each have certain advantages for coral reef detection, the present technology still has the following drawbacks:
(1) The unmanned aerial vehicle lidar is easily affected by wind, which makes the attitude sensor shake excessively and the attitude error large, ultimately degrading the mapping accuracy of the lidar;
(2) Limited by its measurement mechanism, the lidar cannot effectively detect coral reefs at great depth;
(3) Because the shipborne side-scan sonar towed fish is softly connected to the support points on the ship, this flexible connection degrades the accuracy of towed fish position calculation, which still needs to be improved;
(4) Mixed coral reef substrates cannot be effectively distinguished, and an effective mixed-pixel decomposition method needs to be found.
Disclosure of Invention
Aiming at the low efficiency and accuracy of existing side-scan sonar detection of coral reef substrates, first, the structure of the photon counting lidar scanning system is analyzed, the geometric relationship between the incidence angle and azimuth angle of the laser reflected onto the water surface and the normal vector of the mirror is established, and an accurate coordinate calculation model for unmanned airborne lidar underwater sounding points is constructed through an underwater in-layer constant light speed ray tracing algorithm; second, a geocoded side-scan sonar image is constructed through fine processing steps such as sea bottom line tracking, radiation distortion joint correction, slant range correction, towed fish position calculation and echo sampling point coordinate calculation; third, the backscattering intensities of the lidar and the side-scan sonar are normalized with the Z-score and converted into gray values, the two data sets are fused by a near corresponding-point rule, and a fused image is constructed by geocoding and image resampling; then, on the basis of the standard wavelet BP neural network, the training rate and fitting accuracy of the wavelet BP neural network are improved through improvements to the network learning rate, momentum factor and initial values; finally, on the basis of the high-precision, high-resolution fused image, samples of various coral reef types are extracted at the positions of the fused image corresponding to the geographic coordinates of grab bucket samples or underwater camera photographs, statistical characteristic parameters with high self-aggregation degree and resolution are selected for sample training, and the sample training results are used to classify the coral reef substrate over the fused image.
In order to achieve the above purpose, the coral reef substrate classification method based on laser radar and side scan sonar data fusion provided by the invention comprises the following steps:
(1) By analyzing the structure of the photon counting laser radar scanning system, a laser scanning reference coordinate system and its transition coordinate system are established; the geometric relationship between the incidence angle and azimuth angle of the laser reflected onto the water surface and the normal vector of the mirror is then established to obtain the three-dimensional coordinates of the laser's water surface incidence point; and an accurate coordinate calculation model for unmanned airborne laser radar underwater sounding points is constructed by building an underwater light velocity profile and an underwater in-layer constant light speed ray tracing algorithm;
(2) A geocoded side-scan sonar image is constructed through fine processing steps such as sea bottom line tracking, radiation distortion joint correction, slant range correction, towed fish position calculation and echo sampling point coordinate calculation;
(3) Using the Z-score, the backscattering intensities of the laser radar and the side-scan sonar are normalized and converted into gray values; the two kinds of data are fused through a near corresponding-point rule, and a fused image is then constructed through geocoding and image resampling;
(4) On the basis of the standard wavelet BP neural network, the training rate and fitting accuracy of the wavelet BP neural network are improved through improvements to the network learning rate, momentum factor and initial values;
(5) On the basis of the high-precision, high-resolution fused image, samples of various coral reef types are extracted at the positions of the fused image corresponding to the geographic coordinates of grab bucket samples or underwater camera photographs; statistical characteristic parameters with high self-aggregation degree and resolution are then selected for sample training, and the sample training results are used to classify the coral reef substrate over the fused image.
In one embodiment of the invention, the geometric relationship between the incidence angle and azimuth angle of the reflected laser at the water surface and the normal vector of the mirror is established, and an accurate coordinate calculation model for unmanned airborne laser radar underwater sounding points is constructed through an underwater in-layer constant light speed ray tracing algorithm, mainly comprising the following steps:
(1) Analyzing the structure of a photon counting laser radar scanning system;
(2) Establishing a laser scanning reference coordinate system and a transition coordinate system thereof;
(3) Establishing a relation model of an incidence angle of the water surface of the reflected light and a normal vector of the reflecting mirror under a laser scanning reference coordinate system;
(4) Calculating a three-dimensional coordinate value of a laser water surface incident point;
(5) Constructing an underwater light velocity profile;
(6) Establishing a constant light speed ray tracing model in the underwater layer;
(7) And constructing a coordinate accurate calculation model of the underwater sounding point in a laser scanning reference coordinate system.
In one embodiment of the invention, a geocoded side-scan sonar image is constructed through fine processing steps such as sea bottom line tracking, radiation distortion joint correction, slant range correction, towed fish position calculation and echo sampling point coordinate calculation, mainly comprising the following steps (a sketch of the slant range correction follows the list):
(1) Sea bottom line tracking;
(2) Radiation distortion joint correction;
(3) Slant range correction;
(4) Calculating the position of the towed fish;
(5) Calculating coordinates of the side scan sonar echo sampling points;
(6) A side-scan sonar image with geocoding is constructed.
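As an illustration of step (3), the following is a minimal sketch (Python; the function name, the flat-seafloor assumption and the uniform output grid are illustrative assumptions, not the patent's implementation) that converts one ping of a waterfall image from slant range to ground range using the towed fish height obtained from sea bottom line tracking:

```python
import numpy as np

def slant_range_correction(ping, fish_height, max_range):
    """Convert one side-scan ping from slant range to ground range.

    ping        : 1-D echo intensities; sample i lies at slant range
                  r_i = (i + 1) / len(ping) * max_range
    fish_height : towed-fish altitude above the seabed (m), from bottom tracking
    max_range   : slant range of the last sample (m), must exceed fish_height
    """
    n = ping.size
    slant = (np.arange(n) + 1) / n * max_range
    valid = slant > fish_height          # earlier samples are water-column echoes
    # Flat-seafloor assumption: ground range from the Pythagorean relation
    ground = np.sqrt(slant[valid] ** 2 - fish_height ** 2)
    # Resample onto a uniform ground-range grid so pixels are equally spaced
    grid = np.linspace(0.0, np.sqrt(max_range ** 2 - fish_height ** 2), n)
    return grid, np.interp(grid, ground, ping[valid])
```

Echoes arriving before the first bottom return belong to the water column and are discarded; interpolating the rest onto an evenly spaced grid removes the near-range compression typical of raw waterfall images.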
In one embodiment of the invention, the backscattering intensities of the laser radar and the side-scan sonar are normalized with the Z-score and converted into gray values, the two kinds of data are fused through a near corresponding-point rule, and a fused image is constructed through geocoding and image resampling, mainly comprising the following steps (see the sketch after this list):
(1) Normalizing the backscattering intensities of the laser radar and the side-scan sonar with the Z-score;
(2) Converting the two normalized data sets into gray values;
(3) Establishing the near corresponding-point rule;
(4) Fusing the two kinds of data using the near corresponding-point rule;
(5) Re-geocoding the fused data;
(6) Resampling an image;
(7) And constructing a laser radar and side-scan sonar fusion image.
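A minimal sketch of steps (1)-(4) above, assuming both data sets are given as planar point coordinates with backscatter intensities; the 3σ clipping, the search radius and the equal-weight blend are illustrative choices rather than values prescribed by the invention:

```python
import numpy as np
from scipy.spatial import cKDTree

def zscore_to_gray(intensity):
    """Z-score normalization of backscatter, then linear stretch to 8-bit gray."""
    z = (intensity - intensity.mean()) / intensity.std()
    z = np.clip(z, -3.0, 3.0)             # clip tails so outliers do not dominate
    return (z + 3.0) / 6.0 * 255.0

def fuse_near_points(xy_lidar, gray_lidar, xy_sonar, gray_sonar,
                     radius=0.5, w_lidar=0.5):
    """Fuse lidar and side-scan gray values at near corresponding points.

    For every lidar point, the nearest sonar point within `radius` (m) is
    treated as the same seabed location and the two gray values are blended;
    lidar points without a sonar neighbour keep their own gray value.
    """
    tree = cKDTree(xy_sonar)
    dist, idx = tree.query(xy_lidar, distance_upper_bound=radius)
    fused = gray_lidar.astype(float)
    hit = np.isfinite(dist)               # query marks misses with inf
    fused[hit] = w_lidar * gray_lidar[hit] + (1.0 - w_lidar) * gray_sonar[idx[hit]]
    return fused
```

The fused point set is then gridded (geocoded) and resampled into a raster image, as in steps (5)-(7).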
In one embodiment of the invention, on the basis of the standard wavelet BP neural network, the training rate and fitting accuracy of the wavelet BP neural network are improved by improving the network learning rate, the momentum factor and the initial values, mainly comprising the following steps (an illustrative update rule follows the list):
(1) Establishing the basic framework of the wavelet BP neural network, namely 1 input layer, 1 hidden layer and 1 output layer, with the hidden layer adopting a wavelet function as its transfer function;
(2) Improvement of the network learning rate;
(3) Improvement of momentum factor;
(4) Constructing an optimal initial weight model, taking into account the relationship between the initial network weights, the neuron transfer function and the learning samples.
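The formulas behind these improvements are not fixed at this point in the disclosure; the sketch below shows one common realization of the learning rate improvement of step (2) and the momentum factor of step (3), with a Morlet wavelet as the hidden layer transfer function of step (1). The growth and decay factors are illustrative assumptions:

```python
import numpy as np

def morlet(t):
    """Morlet wavelet, a common hidden-layer transfer function in wavelet BP nets."""
    return np.cos(1.75 * t) * np.exp(-t ** 2 / 2.0)

def update_weights(w, grad, prev_dw, lr, err, prev_err,
                   mu=0.9, up=1.05, down=0.7):
    """One weight update with an adaptive learning rate and a momentum term.

    Assumed (illustrative) rules:
      - training error decreased -> grow the learning rate slightly (factor `up`)
      - training error increased -> shrink it (factor `down`) and cancel momentum
    """
    if err < prev_err:
        lr *= up
        mu_eff = mu
    else:
        lr *= down
        mu_eff = 0.0                     # suppress momentum after an uphill step
    dw = -lr * grad + mu_eff * prev_dw   # gradient step plus momentum
    return w + dw, dw, lr
```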
in one embodiment of the invention, image samples of various coral reef types are extracted on the basis of the fusion image, characteristic values of the image samples are calculated in a statistics mode, and the fusion image is classified by utilizing an improved wavelet BP neural network, and the method mainly comprises the following steps:
(1) On the basis of the fusion image, extracting image samples of various coral reefs according to actual shooting coordinates of a grab bucket or a camera;
(2) Selecting statistical characteristic parameters with high self-aggregation degree and resolution (such as standard deviation, entropy, third-order moment, kurtosis and skewness) as image sample features;
(3) Selecting 70% of the samples for training and 30% for checking, and performing image sample training and checking;
(4) And classifying the coral reef substrate by using the sample training result.
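A minimal sketch of steps (2) and (3): computing the named statistical characteristic parameters for one gray-level image sample and performing the 70%/30% split. The histogram bin count and random seed are illustrative assumptions:

```python
import numpy as np
from scipy import stats

def sample_features(patch):
    """Statistical features of one gray-level image sample (see step (2))."""
    g = patch.astype(float).ravel()
    hist, _ = np.histogram(g, bins=256, range=(0, 256))
    p = hist[hist > 0] / g.size
    return np.array([
        g.std(),                       # standard deviation
        -np.sum(p * np.log2(p)),       # entropy of the gray-level histogram
        stats.moment(g, moment=3),     # third-order central moment
        stats.kurtosis(g),             # kurtosis
        stats.skew(g),                 # skewness
    ])

def split_samples(features, labels, train_frac=0.7, seed=0):
    """Random 70%/30% split into training and checking sets (see step (3))."""
    rng = np.random.default_rng(seed)
    order = rng.permutation(len(labels))
    cut = int(train_frac * len(labels))
    tr, te = order[:cut], order[cut:]
    return features[tr], labels[tr], features[te], labels[te]
```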
In one embodiment of the present invention, the in-layer constant light speed ray tracing algorithm is as follows:
1) The laser beam experiences a water column consisting of N layers, with light propagating within each layer at a constant speed; according to Snell's law:

sinθ_1/v_1 = sinθ_2/v_2 = … = sinθ_N/v_N (10)

2) Let the thickness of water column layer i be Δz_i (Δz_i = z_{i+1} − z_i); then the horizontal displacement y_i and propagation time t_i of the beam in layer i are:

y_i = Δz_i·tanθ_i (11)

t_i = Δz_i/(v_i·cosθ_i) (12)

According to equations (11) and (12), the horizontal distance and propagation time of the light beam through the entire water column are respectively:

Y = Σ_{i=1}^{N} Δz_i·tanθ_i (13)

T = Σ_{i=1}^{N} Δz_i/(v_i·cosθ_i) (14)

Assuming that the beam does not experience the full water column but disappears at depth Z_r within layer r, let the horizontal displacement of the beam in that layer be Δy_r and the vertical displacement Δz_r. With the total time of the beam in the water column t_all and the time elapsed in this layer t_r, the optical path ΔS_r experienced by the beam in that layer is:

ΔS_r = v_r·t_r, t_r = t_all − Σ_{i<r} t_i (15)

Δz_r = ΔS_r·cosθ_r (16)

Δy_r = ΔS_r·sinθ_r (17)

Therefore, the total horizontal displacement and vertical displacement of the light beam in the water body are respectively:

Y_total = Σ_{i<r} Δz_i·tanθ_i + Δy_r (18)

Z_total = Σ_{i<r} Δz_i + Δz_r (19)
the technical problems to be solved by the invention mainly comprise the following aspects:
(1) Analyzing the structure of a single-photon laser radar scanning system, and establishing a geometric relationship model of laser reflected light and a reflector normal vector;
(2) Constructing a coordinate calculation model of a laser radar water surface incidence point in a laser scanning reference coordinate system;
(3) Providing a constant light speed ray tracing model in the underwater layer;
(4) Establishing a coordinate calculation model of an underwater laser sounding point in a laser scanning reference coordinate system;
(5) Constructing a high-resolution side-scan sonar image;
(6) Carrying out fusion processing on the laser radar and the side scan sonar data;
(7) Improving a standard wavelet BP neural network;
(8) And classifying coral reef substrates of the fused images.
Through the technical scheme, the invention has the beneficial effects that:
(1) By studying the photon counting laser radar scanning structure, a geometric relationship model between the reflected laser and the mirror normal vector is constructed, so the coordinates of the laser's water surface incidence point can be accurately calculated;
(2) An in-layer constant light speed ray tracing model based on the light velocity profile is proposed, and a coordinate calculation model of the underwater laser sounding point in the laser scanning reference coordinate system is established, which can greatly improve the measurement accuracy of underwater sounding points;
(3) The Z-score is used to fuse laser radar and side-scan sonar data of different mechanisms;
(4) By improving the traditional wavelet BP neural network, the accuracy of coral reef substrate classification is improved.
Drawings
In order to more clearly illustrate the embodiments of the invention or the technical solutions in the prior art, the drawings that are required in the embodiments or the description of the prior art will be briefly described, it being obvious that the drawings in the following description are only some embodiments of the invention, and that other drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
FIG. 1 is a schematic diagram of a photon counting lidar elliptical scanning system of the present invention;
FIG. 2 is two Cartesian coordinate systems of the mirror normal direction vector of the present invention;
FIG. 3 is a geometric angle of the reflected light of the present invention in the sensor coordinate system;
FIG. 4 is a schematic illustration of the geometry of the outgoing laser calculated from normal variation in accordance with the present invention;
FIG. 5 is a schematic illustration of the laser sea surface point of incidence of the present invention;
FIG. 6 is a schematic diagram of intra-layer constant speed ray tracing of the present invention;
FIG. 7 is a schematic view of the underwater optical path of the lidar of the present invention;
FIG. 8 is a schematic view of seafloor tracking of the present invention;
FIG. 9 is an amplitude thresholding method of the seafloor tracking of the present invention;
FIG. 10 is a window slope method of seafloor tracking of the present invention;
FIG. 11 is the effect of the height of a towed fish on beam angle of the present invention;
FIG. 12 is a graph of beam angle versus towed fish height and travel distance for the present invention;
FIG. 13 is a positional relationship of echo, sea bottom line, towed fish according to the present invention;
FIG. 14 is an example of side-scan sonar waterfall image slant range correction according to the present invention;
FIG. 15 is a schematic view of the towed fish coordinate estimation of the present invention;
FIG. 16 is a schematic illustration of echo position calculation according to the present invention;
FIG. 17 is a schematic diagram of a waterfall image geocoding of the present invention;
FIG. 18 is an image resampling scan fill method of the invention;
FIG. 19 is a modified wavelet BP neural network of the present invention;
FIG. 20 is a flow chart of a fusion image coral reef matrix classification of the present invention;
FIG. 21 is an overall flow chart of a coral reef substrate classification method based on laser radar and side scan sonar data fusion in accordance with the present invention.
Detailed Description
The invention is further described with reference to the following detailed drawings in order to make the technical means, the creation characteristics, the achievement of the purpose and the effect of the implementation of the invention easy to understand.
First, the present invention relates to the following technical terms:
photon counting laser radar
The photon counting laser radar, also called single photon laser radar, is a laser radar with high sensitivity and high time resolution. It adopts a photodetector capable of detecting weak echo signals at the single-photon level as the photoelectric conversion device and, combined with high-precision Time Correlated Single Photon Counting (TCSPC), can complete high-precision detection of weak signals, making it suitable for scenes with limited echo intensity such as long-distance and low-reflectivity targets [44].
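A minimal sketch of how TCSPC turns sparse single-photon detections into a range estimate: arrival times measured from each laser pulse emission are accumulated over many pulses and histogrammed, and the peak bin gives the round-trip time. The bin width and interface are illustrative assumptions:

```python
import numpy as np

C = 299792458.0                  # speed of light in vacuum (m/s)

def tcspc_range(arrival_times_ns, bin_ns=0.5):
    """Estimate target range from single-photon arrival times (TCSPC principle).

    Individual detections are contaminated by solar background and dark counts,
    but histogramming arrival times over many laser pulses concentrates signal
    photons in one bin; the centre of that bin gives the round-trip time.
    """
    t = np.asarray(arrival_times_ns, dtype=float)
    edges = np.arange(t.min(), t.max() + 2 * bin_ns, bin_ns)
    hist, edges = np.histogram(t, bins=edges)
    k = np.argmax(hist)
    t_peak = 0.5 * (edges[k] + edges[k + 1])   # centre of the peak bin (ns)
    return 0.5 * C * t_peak * 1e-9             # one-way distance in metres
```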
Inertial navigation system
An inertial navigation system (INS, hereinafter inertial navigation) is an autonomous navigation system that depends on no external information and radiates no energy outward [45]. Its working environment includes not only the air and the ground but also underwater. The basic working principle of inertial navigation is Newton's laws of mechanics: by measuring the acceleration of the carrier in an inertial reference frame, integrating it over time and transforming it into the navigation coordinate system, information such as speed, yaw angle and position in the navigation coordinate system can be obtained.
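A minimal sketch of this principle for a planar case, with full 3-D attitude, gravity compensation and Earth rotation omitted: body-frame accelerations are rotated into the navigation frame with the yaw angle and integrated twice:

```python
import numpy as np

def dead_reckon(acc_body, yaw, dt, v0=(0.0, 0.0), p0=(0.0, 0.0)):
    """Minimal planar strapdown integration (illustrative only).

    acc_body : (N, 2) accelerations in the carrier frame (m/s^2)
    yaw      : (N,) carrier yaw angles (rad) rotating body into nav frame
    dt       : sampling interval (s)
    """
    v, p = np.array(v0, float), np.array(p0, float)
    track = []
    for a_b, psi in zip(acc_body, yaw):
        c, s = np.cos(psi), np.sin(psi)
        a_nav = np.array([c * a_b[0] - s * a_b[1],   # rotate into navigation frame
                          s * a_b[0] + c * a_b[1]])
        v = v + a_nav * dt                           # integrate once: velocity
        p = p + v * dt                               # integrate twice: position
        track.append(p.copy())
    return np.array(track)
```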
Ray tracing
Ray tracing is a method of tracing the underwater laser sounding point on the basis of the underwater light velocity profile and calculating the coordinates of that point (projection point) in the laser radar scanning reference coordinate system [46]. On the premise that photons move at a constant speed within each water column layer, the incidence and refraction angles of the light at each water column layer interface are calculated according to Snell's law, and the travel time and horizontal displacement of the light in each layer are calculated in turn until the light disappears at an interface or somewhere within a layer.
Side-scan sonar
The side-scan sonar system comprises several parts, including a workstation, a display unit, a winch, a towing cable, a towed fish and a GPS receiver; auxiliary equipment such as a pressure sensor, a compass and an attitude sensor is needed to provide the relevant motion parameters, and if the real-time towed fish position must be obtained accurately, an underwater towed fish positioning system is also required [47].
The coral reef substrate classification method based on laser radar and side scan sonar data fusion mainly comprises the following steps, see fig. 21:
(1) By analyzing the structure of the photon counting laser radar scanning system, a laser scanning reference coordinate system and its transition coordinate system are established; the geometric relationship between the incidence angle and azimuth angle of the laser reflected onto the water surface and the normal vector of the mirror is then established to obtain the three-dimensional coordinates of the laser's water surface incidence point; and an accurate coordinate calculation model for unmanned airborne laser radar underwater sounding points is constructed by building an underwater light velocity profile and an underwater in-layer constant light speed ray tracing algorithm;
(2) A geocoded side-scan sonar image is constructed through fine processing steps such as sea bottom line tracking, radiation distortion joint correction, slant range correction, towed fish position calculation and echo sampling point coordinate calculation;
(3) Using the Z-score, the backscattering intensities of the laser radar and the side-scan sonar are normalized and converted into gray values; the two kinds of data are fused through a near corresponding-point rule, and a fused image is then constructed through geocoding and image resampling;
(4) On the basis of the standard wavelet BP neural network, the training rate and fitting accuracy of the wavelet BP neural network are improved through improvements to the network learning rate, momentum factor and initial values;
(5) On the basis of the high-precision, high-resolution fused image, samples of various coral reef types are extracted at the positions of the fused image corresponding to the geographic coordinates of grab bucket samples or underwater camera photographs; statistical characteristic parameters with high self-aggregation degree and resolution are then selected for sample training, and the sample training results are used to classify the coral reef substrate over the fused image.
Referring to fig. 1 to 20, specific embodiments of the present invention will now be described in detail as follows:
(1) General technical scheme
First, by analyzing the structure of the photon counting laser radar scanning system, the geometric relationship between the incidence angle and azimuth angle of the laser reflected onto the water surface and the normal vector of the mirror is established, and an accurate coordinate calculation model for unmanned airborne laser radar underwater sounding points is constructed through the underwater in-layer constant light speed ray tracing algorithm; second, a geocoded side-scan sonar image is constructed through fine processing steps such as sea bottom line tracking, radiation distortion joint correction, slant range correction, towed fish position calculation and echo sampling point coordinate calculation; third, the backscattering intensities of the laser radar and the side-scan sonar are fused using the Z-score; finally, on the basis of the standard wavelet BP neural network, the training rate and fitting accuracy of the wavelet BP neural network are improved through improvements to the network learning rate, momentum factor and initial values, and the coral reef substrate of the fused image is classified with the improved wavelet BP neural network and the extracted substrate samples.
(2) Marine laser radar data fine processing
1) Elliptical scanning system structure of photon counting laser radar
The marine laser radar described herein is a conventional elliptical scanning structure (shown in FIG. 1), using a prism rotatable about a rotation axis as the mirror to control the direction of the emitted laser beam, which is reflected by the prism and directed to the sea surface. The angle between the normal direction of the prism and the rotation axis is 7.5°; as the prism surface rotates around the rotation axis, the laser traces a track on the sea surface at an incidence angle of approximately 15°. Because the incidence angle is not exactly equal to 15° throughout a scan revolution (it depends on the normal direction), the sea surface laser spot trajectory produced by a hovering aircraft is approximately elliptical, so this scanning structure is called an elliptical scanning structure.
2) Laser radar scanning reference coordinate system
The laser radar scanning reference coordinate system is defined with the center point of the mirror as the coordinate origin O; the X_s axis points in the negative direction of the outgoing laser, the Y_s axis points in the flight direction, and the Z_s axis, vertically upward, forms a right-handed coordinate system with the X_s and Y_s axes. The incident laser and the motor shaft lie in the same plane (the X_s Z_s plane), the laser is incident horizontally (along the negative X_s axis), and its mirror incidence point is the mirror center. For ease of understanding, as shown in FIG. 2, the original X_s Y_s Z_s coordinate system is rotated 45° anticlockwise about the Y_s axis to obtain a new coordinate system X_s′Y_s′Z_s′, in which the Z_s′ axis coincides with the motor rotation axis. Denote the angle between the projection of the mirror normal on the X_s Z_s plane and the Z_s axis as ψ_x, and the angle between its projection on the Y_s Z_s plane and the Z_s axis as ψ_y.
As shown in FIG. 3, the angles between the projections of the reflected ray on the X_s Z_s and Y_s Z_s planes and the Z_s axis are φ_x and φ_y respectively, and its spatial nadir angle is φ. Because the laser is incident along the negative X_s axis, the angle ψ_y between the projection of the normal on the Y_s Z_s plane and the Z_s axis equals the angle φ_y between the projection of the reflected laser on the Y_s Z_s plane and the Z_s axis (the incident ray, the mirror normal and the reflected ray are coplanar, and the incident ray is perpendicular to the Y_s Z_s plane; by the theorem that a plane containing a perpendicular to another plane is orthogonal to it, the two planes are orthogonal). Therefore, in the Y_s Z_s plane the rotation angle of the normal is synchronized with that of the reflected ray (when the normal rotates by θ, the reflected ray also rotates by θ), whereas in the X_s Z_s plane, when the mirror (i.e., the normal) rotates by an angle θ, the reflected ray rotates by 2θ. When the normal angle changes, ψ_x also changes, from which φ_x is easily solved, and the nadir angle φ and azimuth angle can then be calculated; the angular variation of the normal is therefore critical.
3) Direction vector of mirror normal
In FIG. 2, the mirror normal in the $X_s'Y_s'Z_s'$ coordinate system has the normal vector $(F_{x'}, F_{y'}, F_{z'})$. With the motor rotation angle denoted θ and the normal inclined 7.5° from the rotation axis, it can be written as:

$$(F_{x'},\ F_{y'},\ F_{z'}) = (\sin 7.5°\cos\theta,\ \sin 7.5°\sin\theta,\ \cos 7.5°) \tag{1}$$
Rotating the coordinate axes clockwise by 45° about $Y_s$ then gives the mirror normal vector $(F_x, F_y, F_z)$ in the $X_sY_sZ_s$ coordinate system:

$$F_x = F_{x'}\cos 45° + F_{z'}\sin 45°,\quad F_y = F_{y'},\quad F_z = -F_{x'}\sin 45° + F_{z'}\cos 45° \tag{2}$$
4) The relative angle of the reflected light in the laser scanning reference frame
$$\phi_x(\theta) = 2\arctan\left(F_x / |F_z|\right) - 90° \tag{3}$$

$$\phi_y(\theta) = \arctan\left(F_y / |F_z|\right) \tag{4}$$
The nadir angle $\phi$ and azimuth angle $\varphi$ then follow from the geometric relationship of FIG. 3:

$$\phi = \arctan\sqrt{\tan^2\phi_x + \tan^2\phi_y} \tag{5}$$

$$\varphi = \arctan\left(\tan\phi_y / \tan\phi_x\right) \tag{6}$$
5) Coordinates of water surface light spot footprint under laser radar scanning reference coordinate system
As shown in FIG. 5, if the sea surface is a plane, the laser sea-surface incidence point is $P_1$, the laser center is denoted S, the in-air slant range of the laser beam is $L_1$ with azimuth angle $\varphi$, and the measured height of the reflector center above the water is H. The position coordinates of the incidence point $P_1$ are then:
$$x_s = H\tan\phi_x \tag{7}$$

$$y_s = H\tan\phi_y \tag{8}$$

$$z_s = -H \tag{9}$$
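To make the scan geometry concrete, the following is a minimal Python sketch of equations (1)–(9) as reconstructed above; the function name and the sampling loop are illustrative, not part of the method.

```python
import numpy as np

def surface_footprint(theta, H, prism_angle_deg=7.5):
    """Water-surface laser footprint in the scanning reference frame.

    theta: motor rotation angle (rad); H: reflector height above water (m).
    A sketch of equations (1)-(9); names are illustrative.
    """
    g = np.deg2rad(prism_angle_deg)
    # (1) mirror normal in the rotated X's Y's Z's frame
    F = np.array([np.sin(g) * np.cos(theta),
                  np.sin(g) * np.sin(theta),
                  np.cos(g)])
    # (2) rotate clockwise by 45 deg about Y_s back to the X_s Y_s Z_s frame
    c = np.cos(np.pi / 4)
    Fx = F[0] * c + F[2] * c
    Fy = F[1]
    Fz = -F[0] * c + F[2] * c
    # (3), (4) projected angles of the reflected ray
    phi_x = 2 * np.arctan(Fx / abs(Fz)) - np.pi / 2
    phi_y = np.arctan(Fy / abs(Fz))
    # (5), (6) nadir and azimuth angles
    nadir = np.arctan(np.hypot(np.tan(phi_x), np.tan(phi_y)))
    azimuth = np.arctan2(np.tan(phi_y), np.tan(phi_x))
    # (7)-(9) footprint coordinates on a planar sea surface
    return np.array([H * np.tan(phi_x), H * np.tan(phi_y), -H]), nadir, azimuth

# Example: one scan revolution sampled at 1 deg steps, reflector 300 m up
pts = [surface_footprint(t, 300.0)[0] for t in np.deg2rad(np.arange(360))]
```

At θ = 0 the sketch reproduces the stated 15° incidence angle, which is a useful consistency check on the reconstructed equations.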
(3) Coordinate calculation of underwater light spot footprint under laser radar scanning reference coordinate system
1) Ray tracing algorithm based on water body layer normal light speed assumption
Because the temperature, salinity, and density of each water mass differ in the vertical direction, the speed at which light travels through each water mass also differs, and rays are refracted at the interfaces between water masses. Therefore, after the depth and light-speed sequence in the vertical direction has been acquired with an ocean light velocity profiler, it must be used to trace the ray accurately in order to obtain high-precision underwater spot footprint coordinates.
Assume the laser beam traverses a water column consisting of N layers and that light propagates within each layer at a constant speed (fig. 6). According to Snell's law:

$$\frac{\sin\theta_1}{v_1} = \frac{\sin\theta_2}{v_2} = \cdots = \frac{\sin\theta_N}{v_N} = p \tag{10}$$
as shown in FIG. 6, the thickness of the water column layer is set to be Deltaz i (Δz i =z i+1 -z i ) Then the horizontal displacement y of the beam in layer i i And propagation time t i The method comprises the following steps:
According to equations (11) and (12), the horizontal distance and propagation time of the light beam through the entire water column are respectively:

$$y = \sum_{i=1}^{N} \Delta z_i \tan\theta_i \tag{13}$$

$$t = \sum_{i=1}^{N} \frac{\Delta z_i}{v_i \cos\theta_i} \tag{14}$$
assuming that the beam does not experience the full water column layer, but is at Z r Where it disappears, the horizontal displacement of the beam at the layer is deltay r The vertical displacement is Deltaz r . The light beam has a time t in all water column layers all At the layer throughThe calendar time is t r The optical path DeltaS that the light beam experiences at that layer r The method comprises the following steps:
$$\Delta z_r = \Delta S_r \cos\theta_r \tag{16}$$

$$\Delta y_r = \Delta S_r \sin\theta_r \tag{17}$$
Therefore, the total horizontal displacement and vertical displacement of the light beam in the water body are respectively:

$$y' = \sum_{i=1}^{r-1} \Delta z_i \tan\theta_i + \Delta y_r \tag{18}$$

$$z' = \sum_{i=1}^{r-1} \Delta z_i + \Delta z_r \tag{19}$$
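A minimal Python sketch of the in-layer constant-light-speed ray tracing of equations (10)–(19); the profile arrays, the one-way travel time input, and the variable names are assumptions for illustration.

```python
import numpy as np

def trace_ray(theta0, t_total, dz, v):
    """Trace a refracted ray through a layered water column.

    theta0:  refraction angle at the surface (rad)
    t_total: one-way travel time in the water (s)
    dz, v:   per-layer thickness (m) and light speed (m/s) arrays
    Returns (horizontal displacement y', vertical displacement z').
    Sketch of equations (10)-(19); names are illustrative.
    """
    p = np.sin(theta0) / v[0]            # Snell constant, eq. (10)
    y = z = t_used = 0.0
    for dzi, vi in zip(dz, v):
        theta = np.arcsin(p * vi)        # refraction angle in this layer
        ti = dzi / (vi * np.cos(theta))  # eq. (12)
        if t_used + ti >= t_total:       # beam vanishes inside this layer
            tr = t_total - t_used        # elapsed time in the final layer
            dS = vi * tr                 # eq. (15)
            z += dS * np.cos(theta)      # eq. (16)
            y += dS * np.sin(theta)      # eq. (17)
            return y, z
        y += dzi * np.tan(theta)         # eq. (11)
        z += dzi
        t_used += ti
    return y, z                          # full column traversed

# Example: 4-layer profile, surface refraction angle 13 deg
y_w, z_w = trace_ray(np.deg2rad(13), 1.2e-7,
                     dz=np.array([2.0, 3.0, 5.0, 10.0]),
                     v=np.array([2.25e8, 2.24e8, 2.23e8, 2.22e8]))
```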
2) Coordinates of underwater laser footprint in laser scanning reference coordinate system
As shown in fig. 7, the laser is incident at point $P_1$ on the water surface with incidence angle $\phi$, refracts at angle $\theta_0$, passes in turn through the water layers at $P_2$ and $P_3$, and finally reaches point $P_4$, where the light energy vanishes. $OP_4'$ is the distance $L_s$ from the projection of the underwater laser radar sounding point onto the water surface to the coordinate origin; since refraction keeps the ray in the vertical plane of incidence, its value is:

$$L_s = \sqrt{x_s^2 + y_s^2} + y' \tag{20}$$
The $x_w$ and $y_w$ coordinate values of the laser radar underwater sounding point then follow from $L_s$ and the azimuth angle $\varphi$:

$$x_w = L_s\cos\varphi \tag{21}$$

$$y_w = L_s\sin\varphi \tag{22}$$
$$z_w = -H_a - H_w = -H_a - z' \tag{23}$$

where $H_a$ is the height of the reflector center above the water surface and $H_w = z'$ is the vertical displacement of the beam in the water.
3) Coordinates of laser radar underwater sounding point under WGS84 space rectangular coordinate system
The coordinates of the laser radar underwater sounding point in the WGS84 space rectangular coordinate system are then obtained from the GPS antenna coordinates $(X_{GPS}, Y_{GPS}, Z_{GPS})$, the attitude, and the installation offsets through a lever-arm and boresight model of the form:

$$\begin{bmatrix} X_{s\text{-}wgs84}\\ Y_{s\text{-}wgs84}\\ Z_{s\text{-}wgs84} \end{bmatrix} = \begin{bmatrix} X_{GPS}\\ Y_{GPS}\\ Z_{GPS} \end{bmatrix} + R(\mathrm{yaw},\mathrm{pitch},\mathrm{roll})\left( R_b\begin{bmatrix} x_w\\ y_w\\ z_w \end{bmatrix} + \Delta \right) \tag{24}$$

$$\Delta = \Delta_{laser\text{-}IMU} + \Delta_{GPS\text{-}IMU} \tag{25}$$

In equations (24) and (25), $(X_{s\text{-}wgs84}, Y_{s\text{-}wgs84}, Z_{s\text{-}wgs84})$ are the coordinates of the laser radar underwater sounding point in the WGS84 space rectangular coordinate system; $(X_{GPS}, Y_{GPS}, Z_{GPS})$ are the coordinates of the GPS antenna center on the hovercraft in the same system; $R(\mathrm{yaw},\mathrm{pitch},\mathrm{roll})$ is the rotation matrix from the body coordinate system to the local navigation coordinate system; $\Delta$ comprises two parts, the eccentricity between the center of the laser scanning reference frame and the center of the IMU body frame, and the eccentricity between the GPS antenna center and the center of the IMU body frame; $R_b$ is the rotation for the placement (boresight) offset angles of the laser scanning reference frame relative to the IMU body frame.
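A short numpy sketch of the georeferencing step of equation (24); the Z-Y-X rotation convention and the offset vectors are assumptions that would have to match the actual IMU and mounting conventions.

```python
import numpy as np

def rot_zyx(yaw, pitch, roll):
    """Body-to-navigation rotation matrix (Z-Y-X convention assumed)."""
    cy, sy = np.cos(yaw), np.sin(yaw)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cr, sr = np.cos(roll), np.sin(roll)
    Rz = np.array([[cy, -sy, 0], [sy, cy, 0], [0, 0, 1]])
    Ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])
    Rx = np.array([[1, 0, 0], [0, cr, -sr], [0, sr, cr]])
    return Rz @ Ry @ Rx

def georeference(p_scan, gps, ypr, boresight, lever_arm):
    """Map a sounding point from the scanning frame to WGS84 (eq. 24 sketch)."""
    R_nav = rot_zyx(*ypr)
    R_b = rot_zyx(*boresight)            # boresight misalignment rotation
    return np.asarray(gps) + R_nav @ (R_b @ np.asarray(p_scan) + np.asarray(lever_arm))
```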
(4) Accurate processing of side scan sonar data
1) Sea bottom line tracking
When there is little suspended matter in the water body, a distinct boundary between the water column and the seabed echo image can be observed in the waterfall image: the so-called seabed line, a strong echo line consisting of the first seabed echo of each row. The process of determining the first seabed echo position of each row is commonly referred to as seabed tracking, as shown in FIG. 8. Typically, the seabed lines on the left and right sides are symmetric about the transmitting line, the first seabed echo comes from directly below the towed fish, and the transverse distance between the seabed line and the transmitting line corresponds to the height of the towed fish above the seabed. In FIG. 8, $N_0$ is the position of the transmitting line and $N_b$ the position of the seabed line; with the lateral size of a single sonar image pixel denoted $\Delta d$, the height of the towed fish above the seabed can be expressed as:
$$H_{fish} = |N_b - N_0| \times \Delta d \tag{26}$$
The seabed line is the starting line for the lateral gain of the side scan sonar and is also the reference line for slant range correction; correctly extracting it is the basis of all subsequent waterfall image processing. The common seabed tracking methods can be summarized as the amplitude threshold method, the window slope method, and the manual intervention method.
(1) Amplitude thresholding
The hydrophone of the side scan sonar receives only weak noise before the first seabed echo returns; when the first seabed echo arrives, the received signal exhibits a step change. Thus, by setting a proper threshold T (fig. 9) and searching the echo sequence in order of reception time, the first echo with intensity $I_n > T$ is taken as the first echo from the seabed surface, and its distance to the towed fish is the height of the towed fish above the seabed.
The method is simple and fast, but when the threshold is chosen poorly it is easily disturbed by strong echoes from objects in the water, so the threshold must be adjusted continually according to the actual conditions during data processing.
(2) Window slope method
In general, the seabed surface varies gently, and the heights from the towed fish to the seabed over several neighboring pings can be considered equal or similar. By setting a window length d, the mean value $b_j$ of the echoes within each window is computed; the slope of the column-mean curve is then obtained by finite differences, as in equation (27), and the position of maximum slope on the curve is the position of the seabed line (shown in fig. 10).

$$k_j = \frac{b_{j+1} - b_j}{d} \tag{27}$$
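The two automatic seabed tracking methods reduce to a few lines each; in the sketch below, the threshold value and window length are illustrative parameters only.

```python
import numpy as np

def seabed_amplitude(ping, T):
    """Amplitude threshold method: index of first sample with intensity > T."""
    idx = int(np.argmax(ping > T))
    return idx if ping[idx] > T else None

def seabed_window_slope(ping, d):
    """Window slope method: position of maximum slope of the windowed mean."""
    n_win = len(ping) // d
    b = ping[:n_win * d].reshape(n_win, d).mean(axis=1)  # window means b_j
    k = np.diff(b) / d                                   # eq. (27) slopes
    return (int(np.argmax(k)) + 1) * d                   # sample index of seabed line

# Example ping: weak noise followed by a strong seabed return
rng = np.random.default_rng(0)
ping = np.concatenate([rng.normal(0.05, 0.02, 200), rng.normal(0.8, 0.1, 300)])
print(seabed_amplitude(ping, T=0.4), seabed_window_slope(ping, d=10))
```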
(3) Manual intervention method
In general, both of the above methods can extract the seabed line from the sonar image accurately. However, the marine environment is complex: when large amounts of suspended matter are present in the water body, their strong echoes mix with the seabed echoes and the true position of the seabed line is hard to distinguish. To better remove the influence of the water column, the seabed line in such complex areas can be selected manually, following the trend of the seabed line.
In addition, window median filtering can be adopted to eliminate the influence of random noise on the extracted seabed line, and a symmetrical method is adopted to improve the extraction stability of the seabed line.
2) Joint correction of radiation distortion
Radiation distortion has three main components: 1) differences in artificial gain; 2) gray-level intensity variation near the seabed in the image, related to the beam angle, i.e., the influence of the beam pattern, which is especially pronounced in side scan images; 3) gray-level intensity variation at the far range of the image, related to distance, to absorption and spreading loss during propagation, and to over-gain. Radiation distortion correction aims to obtain high-quality images with uniform gray-level variation in which targets can be detected and identified accurately.
(1) Distance dependent correction of radiation distortion
Taking the starboard side as an example, assume the number of data samples in ping n is $N_s$ and the gray-value sequence is $s(n, i)$, $i = 1, 2, \ldots, N_s - 1$. Let the seabed line position obtained by seabed tracking be $b(n)$, and let $N_{min}$ be the width of the image area corresponding to the maximum towed fish height in the strip, calculated according to equation (28):
$$N_{min} = \min\left(N_s - b(n)\right),\quad n = 1 \ldots N_p \tag{28}$$
The correction factor δ for each across-track position within the ping is calculated according to equation (29), as the along-track mean of the gray values at the same offset from the seabed line:

$$\delta(i) = \frac{1}{N_P}\sum_{n=1}^{N_P} s\bigl(n,\, b(n) + i\bigr),\quad i = 0, 1, \ldots, N_{min} - 1 \tag{29}$$
where $N_P$ is the number of pings used for along-track smoothing. $N_P$ should be large enough to remove the influence of local variations, yet small enough that the window does not span two different seabed substrate areas. The sequence δ(i) obtained from equation (29) is not smooth enough, so it is smoothed with the following moving-average method:

$$\bar{\delta}(i) = \frac{1}{q(i) - p(i) + 1}\sum_{j=p(i)}^{q(i)} \delta(j) \tag{30}$$
In equation (30), $p(i) = \max(0,\, i - l)$; $q(i) = \min(N_{min} - 1,\, i + l)$; $l = N_{min}/50$. The correction coefficient is then calculated according to equation (31), normalizing the smoothed profile by its mean:

$$C(i) = \frac{\bar{\delta}_{mean}}{\bar{\delta}(i)} \tag{31}$$
In summary, the correction formula for the ping echo data is obtained:

$$S_C(n, i) = C(i)\, s(n, i) \tag{32}$$
The correction effect is closely tied to the accuracy of seabed tracking. The method removes the strong gray-level variation at the far range of the image, effectively improves its visual quality, and makes the boundaries between classes in the processed image clearer.
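A compact numpy sketch of the distance-dependent correction, following the forms of equations (28)–(32) reconstructed above; `waterfall` is a ping-by-sample gray matrix and `b` holds the tracked seabed line indices.

```python
import numpy as np

def distance_correction(waterfall, b):
    """Distance-dependent radiation correction (sketch of eqs. 28-32)."""
    n_ping, n_s = waterfall.shape
    n_min = int(np.min(n_s - b))                       # eq. (28)
    # eq. (29): along-track mean gray value at each offset from the seabed line
    delta = np.array([np.mean([waterfall[n, b[n] + i] for n in range(n_ping)])
                      for i in range(n_min)])
    # eq. (30): moving average with half-window l = n_min / 50
    l = max(1, n_min // 50)
    smooth = np.array([delta[max(0, i - l):min(n_min, i + l + 1)].mean()
                       for i in range(n_min)])
    C = smooth.mean() / smooth                         # eq. (31)
    corrected = waterfall.astype(float).copy()
    for n in range(n_ping):                            # eq. (32), applied from
        corrected[n, b[n]:b[n] + n_min] *= C           # the seabed line outward
    return corrected
```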
(2) Correction of radiation distortion associated with beam patterns
The gray-level intensity variation near the seabed in a side scan sonar image is related to the beam angle, and can be split into the influence of the towed fish height and that of the beam angle. FIG. 11 shows the relationship between the transducer height above the seabed and the beam position on the seabed, which determines the beam pattern of each ping echo and the position of each echo in the ping scan line.
FIG. 12 shows the beam angle as a function of propagation distance and towed fish height. The beam angle changes rapidly within a certain distance of the point on the seabed directly below the towed fish; once the propagation distance exceeds a certain multiple of the towed fish height, the influence of propagation distance on the beam angle weakens gradually, while the influence of the beam angle grows with increasing towed fish height.
Taking the seabed point and a point at a certain short distance from it as reference points, the echo intensities within the ping are recalculated by means of model (33). To prevent the influence of substrate changes, several pings are averaged along the track direction.
Here Δ is the distance from the reference point to the seabed point, which can be taken as 1/10 of the working range; l is the ping number; + and − denote port and starboard, respectively; b(n) is the position of the seabed point. The gray-level variation within the ping follows equation (34):
correction coefficients for points on the Ping section can then be calculated by equation (35):
Finally, the strong echo near the seabed directly below the towed fish in the side scan sonar image is corrected based on equation (36):
The combination of artificial gain removal, the distance-dependent correction, and the beam-pattern correction constitutes the joint radiation distortion correction method.
3) Correction of skew
Because echoes are recorded in slant range, the side scan sonar image is geometrically distorted in the across-track direction, and the water column causes targets below the towed fish to be displaced to the two sides; therefore, after transverse equalization, slant range correction must be applied to the image. Constrained by the measurement mechanism, the side scan sonar transducer cannot distinguish the direction of each echo; only the slant distance from the towed fish to each echo is available. Slant range correction is therefore possible only under the following assumptions:
(1) The first seafloor echo is from directly below the transducer;
(2) The surface of the sea bottom is smooth and approximately flat, and the vertical distance from the target to the towed fish is equal to the height from the towed fish to the sea bottom;
(3) Ignoring the change in sound velocity, sound waves are considered to propagate straight in seawater.
Based on the above assumptions, the flat (horizontal) distance of each echo can be calculated from the triangular relationship among the towed fish, the seabed, and the echo. FIG. 13 shows one ping of a waterfall image; N is the total number of samples of each side, $n_h$ is the slant pixel width from the transmitting line to the first seabed echo, and $n_i$ is the slant pixel width of the current echo. The flat pixel width $n_i'$ of the current echo from the first seabed echo is:

$$n_i' = \sqrt{n_i^2 - n_h^2} \tag{37}$$
As shown in fig. 14, the corrected side scan sonar image removes the water column area and also reduces the effect of the slant range geometric distortion on target shapes.
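A minimal sketch of the slant range correction of equation (37), resampling one channel of a ping onto flat-range pixels; the use of linear interpolation is an assumption.

```python
import numpy as np

def slant_range_correct(ping, n_h):
    """Map one side of a ping from slant range to flat (ground) range.

    ping: echo samples of one channel, index = slant pixel distance
    n_h:  slant pixel distance of the first seabed echo (towfish height)
    Sketch of eq. (37); linear interpolation is an assumption.
    """
    n = len(ping)
    flat = np.arange(n, dtype=float)            # target flat-range pixels
    slant = np.sqrt(flat**2 + n_h**2)           # inverse of eq. (37)
    valid = slant < n
    out = np.zeros(n)
    out[valid] = np.interp(slant[valid], np.arange(n), ping)
    return out

# Example: towfish 50 pixels above the seabed
corrected = slant_range_correct(np.random.rand(1000), n_h=50)
```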
4) Position estimation of towed fish
The towed fish position is generally approximated from the length of the tow cable. The cable is strong yet flexible, and during towing it greatly attenuates the influence of hull attitude changes on the towed fish. Therefore, when the vessel sails straight at constant speed, the towed fish can be considered subject only to forward traction, and the horizontal distance from the towed fish to the tow point is calculated from the triangle geometry in FIG. 15.
Because of the self-weight of the tow cable, the hypotenuse of the triangle is taken as 0.9 times the cable length L. The horizontal distance from the towed fish to the tow point is then calculated according to equation (38), where h is the height of the tow point above the water surface and $f_d$ is the depth of the towed fish below the surface, a value obtainable from a pressure sensor. Finally, assuming the heading of the towed fish is consistent with that of the hull, the geographic coordinates of the towed fish are obtained from the tow point position (equation (39)).

$$D = \sqrt{(0.9L)^2 - (h + f_d)^2} \tag{38}$$

$$\begin{bmatrix} x \\ y \end{bmatrix}_{fish} = \begin{bmatrix} x \\ y \end{bmatrix}_{GRF\_T} - D\begin{bmatrix} \sin A \\ \cos A \end{bmatrix} \tag{39}$$
In the above, $[x, y]^T_{VFS\_TOW}$ are the coordinates of the tow point in the hull coordinate system, $[x, y]^T_{GRF\_T}$ are the coordinates of the tow point in the geographic coordinate system, and A is the azimuth of the survey vessel.
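A short sketch of the layback computation of equations (38) and (39); the 0.9 cable factor comes from the text, while the heading convention (azimuth clockwise from north, east as x) is an assumption.

```python
import numpy as np

def towfish_position(tow_point_xy, L, h, f_d, A):
    """Estimate towfish geographic position from cable layback (eqs. 38-39).

    tow_point_xy: tow point in geographic coordinates (east, north)
    L: cable length; h: tow point height above water; f_d: towfish depth
    A: vessel azimuth in rad, clockwise from north (assumed convention)
    """
    D = np.sqrt((0.9 * L) ** 2 - (h + f_d) ** 2)   # eq. (38)
    x, y = tow_point_xy
    return x - D * np.sin(A), y - D * np.cos(A)    # eq. (39): behind the vessel

print(towfish_position((500000.0, 2400000.0), L=120.0, h=3.0, f_d=25.0,
                       A=np.deg2rad(45)))
```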
5) Echo sampling point coordinate calculation
After the waterfall image has been processed, the pixel at the center of each image row theoretically corresponds to the position directly below the towed fish. The geographic coordinates of each row's middle pixel can therefore be determined from the towed fish position; each pixel row is perpendicular to the current heading of the towed fish, and the width from each pixel to the center is a flat distance. From these relationships, the position of every echo in geographic coordinates can be calculated.
As shown in fig. 16, in the plane rectangular coordinate system, let the geographic coordinates of the projection point directly below the towed fish at the time of ping measurement be $P_0(X_0, Y_0)$, the single-side swath of the side scan sonar R, the number of samples per channel N, and the course azimuth α. Since each ping is perpendicular to the course, the azimuth of port-channel echoes is θ = α − π/2 and that of starboard-channel echoes is θ = α + π/2. Let $P_i$ be the i-th echo of a channel; its coordinates $(X_i, Y_i)$ are:

$$X_i = X_0 + \frac{i}{N} R \sin\theta, \qquad Y_i = Y_0 + \frac{i}{N} R \cos\theta \tag{40}$$
If the influence of the motion attitude on the towed fish is considered, a corresponding correction term must be added, as in equation (41).
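A sketch of the echo geocoding of equation (40), without the attitude correction of equation (41); the north-referenced azimuth convention is an assumption.

```python
import numpy as np

def echo_positions(p0, R, N, alpha, side):
    """Geographic coordinates of the N echoes of one channel (eq. 40 sketch).

    p0: (X0, Y0) below the towfish; R: single-side swath; alpha: course (rad)
    side: 'port' or 'starboard'
    """
    theta = alpha - np.pi / 2 if side == 'port' else alpha + np.pi / 2
    i = np.arange(1, N + 1)
    d = i / N * R                                  # flat distance of echo i
    return p0[0] + d * np.sin(theta), p0[1] + d * np.cos(theta)

xs, ys = echo_positions((500000.0, 2400000.0), R=75.0, N=1024,
                        alpha=np.deg2rad(45), side='starboard')
```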
(5) Laser radar and side-scan sonar data fusion processing
1) Z-score data normalization
Since the backscatter intensities of the laser radar and the side scan sonar are governed by different emission mechanisms, the ranges of the backscatter intensities they acquire differ. Their backscatter intensities can therefore be normalized with the Z-score and then converted to gray values.
The Z-score is a common statistical means of mapping parameters with different variation intervals onto a common interval. For any scattering intensity sequence B with mean μ and standard deviation σ, the Z-score of any scattering intensity value b in B is:

$$Z = \frac{b - \mu}{\sigma} \tag{42}$$
2) Conversion of Z-score to gray value
Before the seabed topography is mapped, the Z-scores are converted to gray values; assuming 8-bit quantization, the conversion takes the form:

$$I = \frac{Z - Z_{min}}{Z_{max} - Z_{min}} \times 255 \tag{43}$$
where Z is the Z-score of the backscatter intensity, $Z_{min}$ is the smallest and $Z_{max}$ the largest Z-score in the sequence, and I is the gray value after linear quantization.
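The two normalization steps combine into a few lines of numpy; the 8-bit gray range is the assumption noted above.

```python
import numpy as np

def intensity_to_gray(b):
    """Z-score normalize a backscatter sequence and quantize to 8-bit gray."""
    z = (b - b.mean()) / b.std()                      # eq. (42)
    g = (z - z.min()) / (z.max() - z.min()) * 255     # eq. (43)
    return g.astype(np.uint8)

lidar_gray = intensity_to_gray(np.random.gamma(2.0, 1.0, 10000))
sss_gray = intensity_to_gray(np.random.gamma(3.0, 2.0, 10000))
```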
3) Combination of laser radar and side scan sonar point clouds at near homonymy points
After the laser radar sounding point data and the side scan sonar sounding point data have been obtained, and since the coordinates of the sounding points have been restored to the WGS84 space rectangular coordinate system, the sounding points form a full-coverage strip survey. When two sounding points of different types satisfy the condition of equation (44), they are considered near homonymy points; their coordinates are reassigned as in equation (45), and likewise their gray values as in equation (46).
4) Fusion image geocoding
After the scattering intensity data have been linearly quantized, the pixel position of each sampling point in the image must be calculated. Assuming the pixel resolution is res, the position of a sampling point in the image is:

$$X_i = \left\lfloor \frac{x_i - x_{min}}{res} \right\rfloor, \qquad Y_i = \left\lfloor \frac{y_i - y_{min}}{res} \right\rfloor \tag{47}$$
where $(X_i, Y_i)$ is the pixel position of the i-th sampling point in the image, $(x_i, y_i)$ are the geographic coordinates of the i-th sampling point, and $(x_{min}, y_{min})$ are the minimum geographic coordinates over all sampling points.
5) Image resampling
Image resampling mainly addresses the gaps caused by the inconsistent along-track and across-track sampling rates of the side scan sonar echoes (as shown in fig. 17). For the imaging characteristics of the laser radar and side scan sonar images, an image resampling method based on scan filling is proposed.
The basic principle of the scan filling method is as follows: for any closed region, horizontal scan lines sweep each pixel row of the region from top to bottom; the intersection points of each scan line with the boundary are computed and sorted along the horizontal axis; the sorted intersections are taken out in pairs as left and right boundary points; all pixels between each pair are marked as fill points; and when the whole region has been scanned, the filling of the region is complete.
As shown in fig. 18, $A_1$ and $A_2$ are two adjacent echoes on the same scan line, and $B_3$ and $B_4$ are two adjacent points on the neighboring scan line; $A_1$, $A_2$, $B_3$, $B_4$ enclose a closed connected region. All pixel points within the region can be identified with the scan filling method, and the value of each pixel point can be obtained by inverse distance weighting, as in the following formula:

$$g(P) = \frac{\sum_{k} g_k / d_k}{\sum_{k} 1 / d_k}$$

where $g_k$ are the gray values of the enclosing points and $d_k$ their distances to pixel P.
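A sketch of filling one enclosed quadrilateral by inverse distance weighting; a simple convex point-in-polygon test over the bounding box stands in for the full scan-line bookkeeping.

```python
import numpy as np

def inside_convex(poly, p):
    """True if point p lies inside the convex polygon (vertices in order)."""
    d = np.roll(poly, -1, axis=0) - poly              # edge vectors
    r = np.asarray(p) - poly                          # vertex-to-point vectors
    cross = d[:, 0] * r[:, 1] - d[:, 1] * r[:, 0]
    return bool(np.all(cross >= 0) or np.all(cross <= 0))

def idw_fill(image, corners, values):
    """Fill pixels inside the quad `corners` by inverse distance weighting.

    corners: 4x2 array of pixel (x, y) vertices in order (A1, A2, B4, B3)
    values:  gray values at those vertices
    """
    x0, y0 = np.floor(corners.min(axis=0)).astype(int)
    x1, y1 = np.ceil(corners.max(axis=0)).astype(int)
    for y in range(y0, y1 + 1):                       # horizontal scan lines
        for x in range(x0, x1 + 1):
            if inside_convex(corners, (x, y)):
                d = np.maximum(np.hypot(*(corners - (x, y)).T), 1e-9)
                image[y, x] = np.sum(values / d) / np.sum(1.0 / d)
    return image

img = np.zeros((60, 60))
quad = np.array([[10.0, 10.0], [40.0, 12.0], [42.0, 30.0], [12.0, 28.0]])
idw_fill(img, quad, values=np.array([100.0, 120.0, 90.0, 110.0]))
```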
(6) Improved wavelet BP neural network
The standard wavelet BP neural network is built on the standard BP neural network and comprises an input layer, a hidden layer, and an output layer. Unlike the standard BP neural network, the hidden layer uses a wavelet or scale function in place of the Sigmoid function; in general the hidden layer excitation function is the Morlet (Gaussian-windowed) wavelet. The three-layer BP wavelet neural network structure [48] is shown in FIG. 19.
Wavelet transformation analyzes a signal using a family of oscillatory functions of different frequencies as window functions; its basic principle is to fit a signal through constructions of special wavelet basis functions [49]. The wavelet function offers good local time–frequency analysis and, via dilation and translation operations, multi-resolution analysis of non-stationary signals. Let the mother wavelet function be $\psi(x) \in L^2(R)$, where $L^2(R)$ denotes the space of square-integrable functions on the real line R. The daughter wavelet function [50] is then defined as:

$$\psi_{a,b}(x) = |a|^{-1/2}\,\psi\!\left(\frac{x - b}{a}\right) \tag{49}$$
the excitation functions adopted in the hidden layer and the output layer are Morlet Gaussian function and Sigmoid function respectively, as shown in the formulas (50) and (51),
The BP wavelet neural network model [50] can then be expressed as:

$$y_k^p = f\!\left(\sum_{j=1}^{J} w_{kj}\,\psi\!\left(\frac{\sum_{i=1}^{I} w_{ji} x_i^p - b1_j - b_j}{a_j}\right) - b2_k\right) \tag{52}$$
In equation (52), $x_i^p$ is the input value at the i-th node for the p-th sample of the input layer, $w_{kj}$ and $w_{ji}$ are the connection weights between the output and hidden layers and between the hidden and input layers, $b1_j$ is the threshold of the j-th hidden node, $b2_k$ is the threshold of the k-th output node, and $a_j$ and $b_j$ are the scale factor and shift factor of the j-th hidden node.
The objective function for network training adopts the energy function [51]:

$$E = \frac{1}{2}\sum_{p}\sum_{k}\left(d_k^p - y_k^p\right)^2 \tag{53}$$
In equation (53), $d_k^p$ is the k-th expected output value for the p-th pattern at the output layer, and $y_k^p$ is the corresponding actual network output value.
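To make equations (49)–(53) concrete, here is a minimal numpy forward pass under the model form reconstructed in equation (52); the exact placement of the threshold inside the wavelet argument is an assumption.

```python
import numpy as np

def morlet(x):
    """Morlet wavelet, eq. (50)."""
    return np.cos(1.75 * x) * np.exp(-x**2 / 2)

def sigmoid(x):
    """Sigmoid, eq. (51)."""
    return 1.0 / (1.0 + np.exp(-x))

def forward(X, w_ji, b1, a, b, w_kj, b2):
    """Forward pass of the three-layer wavelet BP network (eq. 52 sketch).

    X: (P, I) samples; w_ji: (J, I); w_kj: (K, J); a, b, b1: (J,); b2: (K,)
    """
    net_h = X @ w_ji.T - b1                 # hidden net input
    h = morlet((net_h - b) / a)             # dilated/shifted wavelet activation
    return sigmoid(h @ w_kj.T - b2)         # output layer

def energy(d, y):
    """Training objective, eq. (53)."""
    return 0.5 * np.sum((d - y) ** 2)

rng = np.random.default_rng(1)
I, J, K, P = 8, 6, 3, 32                    # layer sizes and sample count
X = rng.normal(size=(P, I))
y = forward(X, rng.normal(size=(J, I)), rng.normal(size=J),
            np.ones(J), np.zeros(J), rng.normal(size=(K, J)), rng.normal(size=K))
print(energy(rng.random((P, K)), y))
```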
In the standard wavelet BP neural network, the fixed learning rate and the random initialization of the connection weights and thresholds cause slow convergence and jitter at steep parts of the error surface, and thus need improvement. The learning rate setting is particularly important for the whole learning process: if it is too large, the network oscillates and becomes unstable; if too small, convergence is slow and training time long. Therefore, during training the learning rate must be adjusted adaptively to better track the network error surface. According to the local error surface, when the network error is decreasing toward the convergence accuracy, the correction direction is right and the step, hence the learning rate, should grow; conversely the step and the learning rate should shrink. Usually the learning rate is increased or decreased at a fixed rate, i.e., as long as E(k) is greater (or less) than E(k−1) the rate is decreased (or increased), without comparing how much they differ; such a fixed adjustment can harm network stability and slow convergence. It is therefore necessary to construct a learning-rate function that tracks the variation of E(k−1)/E(k) and updates the learning rate continuously with it:
where, in the learning-rate update, $1 \le \rho \le 2$, $0 < \alpha < 1$, $m = E(k-1)/E(k)$, and $z = 1 + \frac{m-1}{m}\,e^{-(m-1)}\,\alpha$; the update formula for the momentum factor λ is:
Whether network learning converges, and how fast, is closely related to the initialization of the wavelet BP neural network parameters; in general, random numbers are not guaranteed to give good initial weights. Obtaining optimal initial weights requires considering the relationship between the initial weight values, the neuron transfer function, and the learning samples. Let the number of hidden nodes of the three-layer wavelet BP network be J and the number of input nodes be I. First, $w_{ji}$ is initialized as follows:
(1) Generate uniformly distributed random numbers on the interval [−1, 1] as the initial values of $w_{ji}$, denoted $w_{ji0}$;
(3) Multiply by a factor C, which is related to the transfer function, the number of input-layer neurons I, and the number of hidden-layer neurons J; C can be taken as 2:
$$w_{ji2} = C \cdot J^{1/I} \cdot w_{ji1},\quad j = 1, 2, \ldots, J \tag{57}$$
(4) Finally, considering the relationship with the learning samples, let the maximum and minimum input sample values of the i-th input-layer neuron be $x_{imax}$ and $x_{imin}$, respectively; then:
After the weights between the hidden layer and the input layer have been initialized, the thresholds $b1_j$ between them are initialized as follows:
(1) Generate uniformly distributed random numbers on the interval [−1, 1] as $b1_j$, denoted $b1_{j0}$;
(2) Multiply by the factor C, which is related to the transfer function, the number of input-layer neurons I, and the number of hidden-layer neurons J; C takes the same value as in equation (57):
$$b1_{j1} = C \cdot J^{1/I} \cdot b1_{j0},\quad j = 1, 2, \ldots, J \tag{59}$$
(3) Finally, considering the learning samples together with $w_{ji}$, we have:
For the scale factor $a_j$ and shift factor $b_j$ of the wavelet function: from wavelet theory, the concentration region of the dilated and shifted wavelet in the time domain is $[b + a\theta - a\beta,\; b + a\theta + a\beta]$. For the wavelet transform to cover all input vectors, the following must hold when the transform parameters are initialized:
from formula (61):
For the initialization of the connection weights and thresholds between the hidden layer and the output layer: since the output layer generally uses linear neurons, the initial connection weights and thresholds of these two layers can be taken as uniformly distributed random numbers generated on the interval [−1, 1].
In the standard wavelet BP neural network, the number of hidden-layer neurons is chosen rather arbitrarily, yet this choice is closely related to the network convergence speed and the achievable error precision. If the number is too small, the learning capacity of the network is limited and it cannot store all the regularities contained in the training samples; if too large, the time cost of learning grows, irregular content in the samples is likely to be memorized, and generalization deteriorates. The number of hidden neurons therefore merits discussion; the following empirical formula can be consulted [52]:

$$J = \sqrt{n + m} + \alpha \tag{63}$$
In equation (63), J is the number of hidden-layer neurons, n the number of input-layer neurons, m the number of output-layer neurons, and α a constant between 1 and 10.
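A sketch of the weight and threshold initialization of steps (1)–(4) and the hidden-layer sizing of equation (63); since the intermediate step numbered (2) is not preserved here, the intermediate value $w_{ji1}$ is taken equal to $w_{ji0}$ as an assumption.

```python
import numpy as np

def hidden_size(n_in, n_out, alpha=4):
    """Empirical hidden-layer size, eq. (63)."""
    return int(np.sqrt(n_in + n_out) + alpha)

def init_weights(I, J, C=2, rng=np.random.default_rng(0)):
    """Initialize w_ji and b1_j per the listed steps and eqs. (57), (59).

    The intermediate normalization step is not available, so
    w_ji1 = w_ji0 is assumed.
    """
    w0 = rng.uniform(-1, 1, size=(J, I))        # step (1): uniform on [-1, 1]
    w2 = C * J ** (1.0 / I) * w0                # eq. (57), with w_ji1 = w_ji0
    b0 = rng.uniform(-1, 1, size=J)             # threshold step (1)
    b1 = C * J ** (1.0 / I) * b0                # eq. (59)
    return w2, b1

J = hidden_size(n_in=8, n_out=4)
w, b = init_weights(I=8, J=J)
```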
(7) Fused image substrate classification
Coral reef substrate classification is carried out on the fusion image of the laser radar and the side scan sonar. First, the gray value of every pixel within the survey area, together with its geographic coordinates, must be calculated; then a high-precision, high-resolution fusion image is generated through a series of preprocessing steps such as noise removal, resampling, and gray-level equalization. Next, samples of the various coral reef types are extracted at the corresponding positions of the fusion image according to the geographic coordinates of grab-bucket or underwater-camera observations, and sample training is performed on statistical feature parameters with high self-aggregation and discrimination. Finally, the coral reef substrate is classified using the sample training result. The specific classification flow is shown in fig. 20.
The substrate classifier is the improved three-layer wavelet BP neural network, comprising an input layer, a hidden layer, and an output layer: the number of input-layer units is determined according to the samples, and the number of hidden-layer neurons according to the empirical formula (63). 70% of the samples are selected for training and 30% serve as a check; the result on the training samples is taken as the internal-fit evaluation, and the result on the check samples as the external-fit evaluation.
The training precision of the coral reef substrate samples can be evaluated in the manner of remote sensing image classification accuracy, i.e., the proportion of samples correctly identified. The confusion matrix is defined as:

$$M = \begin{bmatrix} m_{11} & m_{12} & \cdots & m_{1n} \\ m_{21} & m_{22} & \cdots & m_{2n} \\ \vdots & \vdots & & \vdots \\ m_{n1} & m_{n2} & \cdots & m_{nn} \end{bmatrix} \tag{64}$$
In equation (64), $m_{ij}$ is the number of samples whose true type is class i that were classified into class j, and n is the number of classes. Each column of the confusion matrix represents the field reference information, its entries being the numbers of field samples assigned to the corresponding class in the training result; each row represents the classification information of a sample type, its entries being the numbers of samples of that true type falling into each class. The larger the values on the main diagonal of the confusion matrix, the higher the accuracy of the classification result, and vice versa.
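A small sketch of the confusion matrix of equation (64) and the overall accuracy read off its main diagonal; the class labels are illustrative.

```python
import numpy as np

def confusion_matrix(true_cls, pred_cls, n):
    """m[i, j] = number of samples of true class i classified as class j."""
    m = np.zeros((n, n), dtype=int)
    for t, p in zip(true_cls, pred_cls):
        m[t, p] += 1
    return m

true_cls = np.array([0, 0, 1, 1, 2, 2, 2, 1])   # e.g. coral / sand / rock
pred_cls = np.array([0, 1, 1, 1, 2, 2, 1, 1])
M = confusion_matrix(true_cls, pred_cls, n=3)
overall_accuracy = np.trace(M) / M.sum()         # proportion on the diagonal
print(M, overall_accuracy)
```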
Thus, the following technical problems are solved in the invention:
(1) Analyzing the structure of a photon counting laser radar scanning system, and establishing a geometric relation model of laser reflected light and a normal vector of a reflector;
(2) Constructing a coordinate calculation model of a laser radar water surface incidence point in a laser scanning reference coordinate system;
(3) Providing a constant light speed ray tracing model in the underwater layer;
(4) Establishing a coordinate calculation model of the underwater laser radar footprint in a laser scanning reference coordinate system;
(5) Restoring the coordinates of the underwater sounding points to the WGS84 space rectangular coordinate system.
In addition, the invention has the technical characteristics that:
(1) Analyzing the structure of a single-photon laser radar scanning system, and establishing a geometric relation model of the incidence angle and azimuth angle of laser reflected light and the normal vector of a reflector;
(2) The constant light speed ray tracking model in the underwater layer is provided, the underwater laser footprint is accurately tracked, and the coordinate precision of the underwater laser sounding point is improved.
Through the description, the beneficial effects of the invention are as follows:
(1) By researching the photon counting laser radar scanning structure, a geometric relation model between the laser reflected light and the reflector normal vector is constructed, and the coordinates of the water-surface laser incidence point can be accurately calculated;
(2) The in-layer constant light speed ray tracking model based on the light speed profile is provided, a coordinate calculation model of the underwater laser sounding point in a laser scanning reference coordinate system is established, and the measurement accuracy of the underwater sounding point can be improved to a large extent.
(3) Using the Z-score to fuse laser radar and side scan sonar data of different mechanisms;
(4) By improving the traditional wavelet BP neural network, the precision of coral reef substrate classification is improved.
Claims (7)
1. The coral reef substrate classification method based on laser radar and side scan sonar data fusion is characterized by comprising the following steps:
(1) By analyzing the structure of the photon counting laser radar scanning system, a laser scanning reference coordinate system and its transition coordinate system are established; the geometric relationship between the incidence angle and azimuth angle of the laser reflected at the water surface and the normal vector of the reflecting mirror is then established, the three-dimensional coordinates of the laser water-surface incidence point are obtained, and an accurate coordinate calculation model for unmanned airborne laser radar underwater sounding points is constructed from an underwater light velocity profile and a constant-light-speed-within-layer ray tracing algorithm;
(2) The side scan sonar image with the geocode is constructed through the fine processing steps of sea bottom line tracking, radiation distortion joint correction, oblique distance correction, towed fish position calculation, echo sampling point coordinate calculation and the like;
(3) Using the Z-score, the backscattering intensities of the laser radar and the side scan sonar are normalized and converted into gray values; the two kinds of data are fused through a near homonymy point rule, and a fusion image is then constructed through geocoding and image resampling;
(4) Based on a standard wavelet BP neural network, the training rate and fitting precision of the wavelet BP neural network are improved through improvement of the network learning rate, the momentum factor and the initial value;
(5) Based on the high-precision, high-resolution fusion image, samples of the various coral reef types are extracted at the corresponding positions of the fusion image according to the geographic coordinates of grab-bucket or underwater-camera observations; sample training is then performed on statistical feature parameters with high self-aggregation and discrimination, and the fusion image is classified into coral reef substrates using the sample training result.
2. The coral reef substrate classification method based on laser radar and side scan sonar data fusion of claim 1, wherein the step (1) includes the steps of:
1) Analyzing the structure of a photon counting laser radar scanning system;
2) Establishing a laser scanning reference coordinate system and a transition coordinate system thereof;
3) Establishing a relation model of an incidence angle of the water surface of the reflected light and a normal vector of the reflecting mirror under a laser scanning reference coordinate system;
4) Calculating a three-dimensional coordinate value of a laser water surface incident point;
5) Constructing an underwater light velocity profile;
6) Establishing a constant light speed ray tracing model in the underwater layer;
7) And constructing a coordinate accurate calculation model of the underwater sounding point in a laser scanning reference coordinate system.
3. The coral reef substrate classification method based on laser radar and side scan sonar data fusion of claim 1, wherein the step (2) includes the steps of:
1) Sea bottom line tracking;
2) Radiation distortion joint correction;
3) Correcting the inclined distance;
4) Calculating the position of the towed fish;
5) Calculating coordinates of the side scan sonar echo sampling points;
6) A side-scan sonar image with geocoding is constructed.
4. The coral reef substrate classification method based on laser radar and side scan sonar data fusion of claim 1, wherein the step (3) includes the steps of:
1) Using the Z-score to normalize the backscattering intensities of the laser radar and the side scan sonar;
2) Converting the normalized data of the two into gray values;
3) Establishing a rule of near homonymy points;
4) Fusing the two data by using a rule of a near homonymy point;
5) Re-geocoding the fused data;
6) Resampling an image;
7) And constructing a laser radar and side-scan sonar fusion image.
5. The coral reef substrate classification method based on laser radar and side scan sonar data fusion of claim 1, wherein the step (4) includes the steps of:
1) Establishing a basic framework of a wavelet BP neural network, namely 1 input layer, 1 hidden layer and 1 output layer, wherein the hidden layers adopt wavelet functions as transfer functions;
2) Improvement of the network learning rate;
3) Improvement of momentum factor;
4) And constructing an optimal weight initial value model by considering the relation between the initial value of the network weight and the neuron transfer function and the learning sample.
6. The coral reef substrate classification method based on laser radar and side scan sonar data fusion of claim 1, wherein the step (5) includes the steps of:
1) On the basis of the fusion image, extracting image samples of various coral reefs according to actual shooting coordinates of a grab bucket or a camera;
2) Selecting statistical characteristic parameters with higher self-aggregation degree and resolution (such as standard deviation, entropy, third-order moment, kurtosis, skewness and the like) as image sample characteristics;
3) Respectively selecting 70% and 30% of samples as training and checking, and performing image sample training and checking;
4) And classifying the coral reef substrate by using the sample training result.
7. The coral reef substrate classification method based on laser radar and side scan sonar data fusion of claim 1, wherein the underwater in-layer constant speed ray tracing algorithm is as follows:
1) The laser beam traverses a water column consisting of N layers, and light propagates within each layer at a constant speed; according to Snell's law:

$$\frac{\sin\theta_1}{v_1} = \frac{\sin\theta_2}{v_2} = \cdots = \frac{\sin\theta_N}{v_N} = p \tag{10}$$
2) Let the thickness of layer i of the water column be $\Delta z_i$ ($\Delta z_i = z_{i+1} - z_i$); the horizontal displacement $y_i$ of the beam in layer i and the propagation time $t_i$ are:

$$y_i = \Delta z_i \tan\theta_i \tag{11}$$

$$t_i = \frac{\Delta z_i}{v_i \cos\theta_i} \tag{12}$$
According to equations (11) and (12), the horizontal distance and propagation time of the light beam through the entire water column are respectively:

$$y = \sum_{i=1}^{N} \Delta z_i \tan\theta_i \tag{13}$$

$$t = \sum_{i=1}^{N} \frac{\Delta z_i}{v_i \cos\theta_i} \tag{14}$$
Suppose the beam does not traverse the full water column but vanishes at depth $Z_r$ within layer r; its horizontal displacement in that layer is $\Delta y_r$ and its vertical displacement $\Delta z_r$. If the travel time through all complete layers is $t_{all}$ and the time elapsed in this layer is $t_r$, the optical path $\Delta S_r$ traversed in that layer is:

$$\Delta S_r = v_r\, t_r \tag{15}$$
$$\Delta z_r = \Delta S_r \cos\theta_r \tag{16}$$

$$\Delta y_r = \Delta S_r \sin\theta_r \tag{17}$$
Therefore, the total horizontal displacement and vertical displacement of the light beam in the water body are respectively:

$$y' = \sum_{i=1}^{r-1} \Delta z_i \tan\theta_i + \Delta y_r \tag{18}$$

$$z' = \sum_{i=1}^{r-1} \Delta z_i + \Delta z_r \tag{19}$$
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202211503490.8A CN116027349A (en) | 2022-11-28 | 2022-11-28 | Coral reef substrate classification method based on laser radar and side scan sonar data fusion |
Publications (1)
Publication Number | Publication Date |
---|---|
CN116027349A true CN116027349A (en) | 2023-04-28 |
Family
ID=86074893
Country Status (1)
Country | Link |
---|---|
CN (1) | CN116027349A (en) |
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN116609759A (en) * | 2023-07-21 | 2023-08-18 | 自然资源部第一海洋研究所 | Method and system for enhancing and identifying airborne laser sounding seabed weak echo |
CN116609759B (en) * | 2023-07-21 | 2023-10-31 | 自然资源部第一海洋研究所 | Method and system for enhancing and identifying airborne laser sounding seabed weak echo |
CN117111070A (en) * | 2023-10-19 | 2023-11-24 | 广东海洋大学 | Underwater target positioning method and device based on sonar and laser |
CN117111070B (en) * | 2023-10-19 | 2023-12-26 | 广东海洋大学 | Underwater target positioning method and device based on sonar and laser |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | |
SE01 | Entry into force of request for substantive examination | |