CN115597498B - Unmanned aerial vehicle positioning and speed estimation method - Google Patents

Info

Publication number
CN115597498B
CN115597498B (application CN202211593074.1A)
Authority
CN
China
Prior art keywords
aerial vehicle
unmanned aerial
image
camera
point
Prior art date
Legal status
Active
Application number
CN202211593074.1A
Other languages
Chinese (zh)
Other versions
CN115597498A (en)
Inventor
莫双齐
王根
Current Assignee
Chengdu Bobei Technology Co ltd
Original Assignee
Chengdu Bobei Technology Co ltd
Priority date
Filing date
Publication date
Application filed by Chengdu Bobei Technology Co ltd filed Critical Chengdu Bobei Technology Co ltd
Priority to CN202211593074.1A
Publication of CN115597498A
Application granted
Publication of CN115597498B

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01BMEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00Measuring arrangements characterised by the use of optical techniques
    • G01B11/002Measuring arrangements characterised by the use of optical techniques for measuring two or more coordinates
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01BMEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00Measuring arrangements characterised by the use of optical techniques
    • G01B11/02Measuring arrangements characterised by the use of optical techniques for measuring length, width or thickness
    • G01B11/022Measuring arrangements characterised by the use of optical techniques for measuring length, width or thickness by means of tv-camera scanning
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01PMEASURING LINEAR OR ANGULAR SPEED, ACCELERATION, DECELERATION, OR SHOCK; INDICATING PRESENCE, ABSENCE, OR DIRECTION, OF MOVEMENT
    • G01P3/00Measuring linear or angular speed; Measuring differences of linear or angular speeds
    • G01P3/64Devices characterised by the determination of the time taken to traverse a fixed distance
    • G01P3/68Devices characterised by the determination of the time taken to traverse a fixed distance using optical means, i.e. using infrared, visible, or ultraviolet light
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02TCLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO TRANSPORTATION
    • Y02T10/00Road transport of goods or passengers
    • Y02T10/10Internal combustion engine [ICE] based vehicles
    • Y02T10/40Engine management systems

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

The invention discloses an unmanned aerial vehicle positioning and speed estimation method. The unmanned aerial vehicle carries a camera and an intersection measurement system; the intersection measurement system comprises two area array CCDs of identical performance installed symmetrically on the two sides of the camera, the two CCDs taking the camera's main optical axis as the axis of symmetry. The invention combines the optical flow technique with automatic camera zooming, and can realize autonomous positioning of the unmanned aerial vehicle and measurement of its horizontal and vertical displacement without GPS. The method adopts image edge features obtained by wavelet transformation of the area array CCD images together with the CCD intersection ranging principle, and uses the good multi-scale resolution capability of the wavelet transform and the internal and external scalar characteristics of the edge image to better determine the target point within an area target.

Description

Unmanned aerial vehicle positioning and speed estimation method
Technical Field
The invention belongs to the technical field of unmanned aerial vehicle positioning, and particularly relates to an unmanned aerial vehicle positioning and speed estimation method.
Background
Unmanned aerial vehicles usually rely mainly on GPS/inertial integrated navigation; in environments such as jungles, among urban buildings, and indoors, GPS signals are interfered with or blocked, making positioning of the unmanned aerial vehicle difficult or unstable.
Disclosure of Invention
The invention provides an unmanned aerial vehicle positioning and speed estimation method, and aims to solve the above problems.
The invention is realized as follows: an unmanned aerial vehicle positioning and speed estimation method, comprising an unmanned aerial vehicle on which a camera and an intersection measurement system are mounted, the intersection measurement system comprising two area array CCDs of identical performance installed symmetrically on the two sides of the camera, the two area array CCDs taking the camera's main optical axis as the axis of symmetry; the method specifically comprises the following steps:
s1: randomly selecting a reference object for flying of the unmanned aerial vehicle, and acquiring two frames of images acquired by a camera of the unmanned aerial vehicle at different moments in the flying process of the unmanned aerial vehicle, wherein the two frames of images both comprise the reference object and have the same pixel gray;
s2: calculating the displacement of pixel points in the two frames of images by using an optical flow method to obtain an optical flow vector, and further calculating the horizontal displacement of the unmanned aerial vehicle;
s3: selecting the intersection point of the main optical axis of the camera and the ground as a target point;
s4: at the first moment, images on the area array CCDs on two sides are obtained, image edge features are extracted according to time-frequency localization features and multi-scale analysis of wavelet transformation, the position of a target point on the images is determined, and the actual height between the unmanned aerial vehicle and the target point is obtained through calculation;
s5: at the second moment, after the camera is automatically focused, images on the area array CCDs on two sides are obtained, according to time-frequency localization characteristics of wavelet transformation and multi-scale analysis, image edge characteristics are extracted, the position of a target point on the images is determined, and the actual height between the unmanned aerial vehicle and the target point is obtained through calculation;
s6: calculating the actual heights obtained twice to obtain the vertical displacement of the unmanned aerial vehicle;
s7: calculating the flying speed of the unmanned aerial vehicle according to the two relative position offsets of the unmanned aerial vehicle in unit time; according to the positioning data of the unmanned aerial vehicle at the previous moment, and the horizontal displacement and the vertical displacement of the unmanned aerial vehicle, the positioning data of the unmanned aerial vehicle at the current moment is calculated.
Further, in step S2, since the gray levels of the pixels of the two frames of images are the same, the following holds:
I(x, y, t) = I(x + dx, y + dy, t + dt)
wherein I(x, y, t) is the gray level of the pixel that moves to position (x + dx, y + dy) in the second frame after time dt; expanding both sides as a Taylor series and eliminating the common terms gives the following equation:
f_x u + f_y v + f_t = 0
wherein:
f_x = ∂I/∂x, f_y = ∂I/∂y, f_t = ∂I/∂t
u = dx/dt, v = dy/dt
The above is the optical flow equation, where f_x and f_y denote the image gradients and f_t denotes the temporal gradient;
because all points in the reference object have the same motion, the 9 points of a 3×3 neighborhood are taken to have the same motion, yielding 9 optical flow equations; a least-squares fit then gives (u, v) as:
[u]   [ Σ f_x(i)²        Σ f_x(i)·f_y(i) ]⁻¹ [ −Σ f_x(i)·f_t(i) ]
[v] = [ Σ f_x(i)·f_y(i)  Σ f_y(i)²       ]   [ −Σ f_y(i)·f_t(i) ]
where the sums run over the 9 neighborhood points i.
further, in step S4, one side of the CCD lens node O is used 1 As the origin of coordinates, connecting line O of CCD lens nodes at two sides 1 O 2 Establishing a coordinate system for the x axis, wherein the node distance of the two CCD lenses is d, the inclination angles of the two optical axes are alpha and beta, and the imaging distances of the target point A on the two CCD surfaces through the lenses are h respectively 1 、h 2 I.e. image height; the corresponding sign rule is: taking the y axis as a starting point, clockwise rotation is positive, and anticlockwise rotation is negative; image height h 1 、h 2 The left side is negative, the right side is positive, the focal lengths of the two CCD lenses are the same and are f, and the coordinates (x, y) of the target point A are as follows:
Figure 335730DEST_PATH_IMAGE009
Figure 473320DEST_PATH_IMAGE011
the target point is the intersection point h of the main optical axis of the camera and the ground 1 =h 2 = h, i.e. the target point y coordinate, i.e. the camera shooting distance:
Figure 27929DEST_PATH_IMAGE013
further, in step S4, extracting image edge features according to the best time-frequency localization features and multi-scale analysis capability of the wavelet transform, specifically including:
let the real function of the two-dimensional signal output by CCD be f (x, z), and take the Gaussian function
Figure 301784DEST_PATH_IMAGE014
As a smoothing function, its first derivative is:
Figure 41070DEST_PATH_IMAGE016
in the dimension 2 j Under the conditions, the wavelet transform in the x-direction and the y-direction can be expressed as:
Figure 151109DEST_PATH_IMAGE017
in the formula:
Figure 493097DEST_PATH_IMAGE018
;
order to
Figure 637771DEST_PATH_IMAGE020
Since the modulus maximum point of the wavelet transform corresponds to the edge of the image, the threshold value is uniformly taken as m 0 If is greater or greater>
Figure 547958DEST_PATH_IMAGE021
Namely, the binary image is used as an edge point, and a binary image with an edge part of 1 and the rest of 0 is obtained; determining the central area and the edge characteristics of the central point image in the image, comparing the two sides of the CCD image to obtain the image height, and calculating the actual height between the unmanned aerial vehicle and the target point.
Further, in step S4, removing the false targets in the binarized image comprises:
removing the false targets in the binary image using area as the internal scalar parameter and the radius vector about the centroid of a closed curve as the external scalar parameter, wherein the centroid coordinates (x1, z1) of the closed curve are:
x1 = (1/k)·Σ x(i), z1 = (1/k)·Σ z(i)
where k is the number of sampling points of the closed curve and x(i), z(i) are the coordinates of the i-th sampling point;
the radius vector of the i-th sampling point is:
R_i = [ (x(i) − x1)² + (z(i) − z1)² ]^{1/2}, i = 1, 2, …, k
compared with the prior art, the invention has the beneficial effects that: the invention discloses a method for positioning and estimating the speed of an unmanned aerial vehicle, which combines an optical flow technology and an automatic camera zooming technology, and can realize the autonomous positioning of the unmanned aerial vehicle and the measurement of the displacement in the horizontal direction and the vertical direction under the condition of no GPS; the method comprises the steps of adopting image edge characteristics obtained by wavelet transformation of an area array CCD image and a CCD intersection distance measurement principle, and utilizing good multi-scale resolution capability of the wavelet transformation and internal and external standard quantity characteristics of the edge image to better determine a target point in an area target; the camera shooting distance can be accurately determined by utilizing the characteristics of the internal scalar and the external scalar of the edge image after wavelet transformation, and the automatic focusing of the camera can be realized by means of a feedback circuit.
Drawings
FIG. 1 is a schematic diagram of the CCD intersection distance measurement of the present invention;
FIG. 2 is a diagram showing the relative mounting positions of the CCD and the camera according to the present invention;
FIG. 3 is a block diagram of the system design of the present invention;
fig. 4 is a two-dimensional code layout of the experimental example of the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, the present invention is further described in detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit the invention.
In the description of the present invention, it is to be understood that the terms "length", "width", "upper", "lower", "front", "rear", "left", "right", "vertical", "horizontal", "top", "bottom", "inner", "outer", etc., indicate orientations or positional relationships based on those shown in the drawings, and are merely for convenience of description and simplicity of description, but do not indicate or imply that the device or element being referred to must have a particular orientation, be constructed in a particular orientation, and be operated, and thus, are not to be construed as limiting the present invention. Further, in the description of the present invention, "a plurality" means two or more unless specifically defined otherwise.
Examples
Referring to figs. 1-3: in optical flow theory, if the gray level of the pixels between two frames of images collected by the camera is unchanged and the two adjacent frames exhibit relative pixel motion, the following relationship holds:
I(x, y, t) = I(x + dx, y + dy, t + dt)
wherein I(x, y, t) is the gray level of the pixel that moves to position (x + dx, y + dy) in the second frame after time dt; expanding both sides as a Taylor series and eliminating the common terms gives the following equation:
f_x u + f_y v + f_t = 0
wherein:
f_x = ∂I/∂x, f_y = ∂I/∂y, f_t = ∂I/∂t; u = dx/dt, v = dy/dt
The above is the optical flow equation, where f_x and f_y denote the image gradients and f_t the temporal gradient. This single equation cannot yield (u, v), since one equation cannot determine two unknowns; to solve this problem, the classical Lucas-Kanade method can be used.
The core of the Lucas-Kanade method is the assumption that all points in the neighborhood of the target point have similar motion. Taking the 9 points of a 3×3 neighborhood to have the same motion yields 9 optical flow equations, which are solved by a least-squares fit, finally giving (u, v) as:
[u]   [ Σ f_x(i)²        Σ f_x(i)·f_y(i) ]⁻¹ [ −Σ f_x(i)·f_t(i) ]
[v] = [ Σ f_x(i)·f_y(i)  Σ f_y(i)²       ]   [ −Σ f_y(i)·f_t(i) ]
the method for calculating the moving speed of the pixel points by the optical flow method only needs to track some points in the image, the optical flow vector can be calculated by adopting the method, the attitude control of the unmanned aerial vehicle can be further optimized according to the obtained optical flow vector, and more accurate control is realized.
FIG. 1 is a schematic diagram of the area array CCD measurement with the CCDs installed on the two sides of the camera; the two CCDs have completely identical performance parameters and take the camera's optical axis as the axis of symmetry. A coordinate system is established with one CCD lens node O1 as the origin of coordinates and the line O1O2 joining the two lens nodes as the x-axis; the node distance between the two CCD lenses is d, the tilt angles of the two optical axes are α and β, and the image distances of target point A on the two CCD surfaces through the lenses are h1 and h2 respectively, i.e. the image heights.
The corresponding sign convention is: rotation from the y-axis is positive clockwise and negative anticlockwise; image heights h1 and h2 are negative on the left and positive on the right. The focal lengths of the two CCD lenses are the same, denoted f, and the coordinates (x, y) of target point A are:
x = d·tan(α + arctan(h1/f)) / [ tan(α + arctan(h1/f)) − tan(β + arctan(h2/f)) ]
y = d / [ tan(α + arctan(h1/f)) − tan(β + arctan(h2/f)) ]
the conditions of the intersection system in optimizing the structural layout are as follows:
Figure 519063DEST_PATH_IMAGE029
and to ensure distance measurementThe system is convenient to calculate when the distance is actually measured, and the intersection point of the main optical axis of the camera and the ground, namely h, is selected as a target point S 1 =h 2 = h, under this condition, the y coordinate of the target point, i.e., the camera photographing distance:
Figure 616332DEST_PATH_IMAGE013
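For illustration, the shooting distance can be evaluated directly from this formula; a minimal sketch in which the function name and the use of radians are assumptions:

```python
import math

def shooting_distance(d: float, f: float, h: float, alpha: float) -> float:
    """Camera shooting distance y from the intersection-ranging formula:
    node distance d, focal length f, image height h, tilt angle alpha (rad)."""
    num = -d * (h * math.sin(alpha) + f * math.cos(alpha)) ** 2
    den = (2.0 * h * f * math.cos(2.0 * alpha)
           + (h ** 2 - f ** 2) * math.sin(2.0 * alpha))
    return num / den
```

Evaluating this distance at the two moments of steps S4 and S5 yields the two actual heights whose difference is the vertical displacement.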
from the above analysis, the key to measuring camera height with a convergence system is how to determine the image of a point on the CCD plane located on the camera's principal optical axis in the target.
Image edge features are extracted according to the optimal time-frequency localization characteristics and multi-scale analysis capability of the wavelet transform, and the position of the target point is judged on the basis of these edge features.
Let the real function of the two-dimensional signal output by the CCD be f(x, z), and take the Gaussian function
θ(x, z) = (1/2π)·exp(−(x² + z²)/2)
as the smoothing function, whose first derivatives are:
ψ^x(x, z) = ∂θ(x, z)/∂x, ψ^z(x, z) = ∂θ(x, z)/∂z
Then at scale 2^j the wavelet transforms in the x and z directions can be expressed as:
W^x_{2^j}f(x, z) = f ∗ ψ^x_{2^j}(x, z), W^z_{2^j}f(x, z) = f ∗ ψ^z_{2^j}(x, z)
in the formula:
ψ^x_{2^j}(x, z) = (1/2^{2j})·ψ^x(x/2^j, z/2^j), ψ^z_{2^j}(x, z) = (1/2^{2j})·ψ^z(x/2^j, z/2^j)
Let
M_{2^j}f(x, z) = [ |W^x_{2^j}f(x, z)|² + |W^z_{2^j}f(x, z)|² ]^{1/2}
Theoretically, the modulus maximum points of the wavelet transform correspond to the edges of the image; to simplify the operation, a uniform threshold m0 is selected, and any point with
M_{2^j}f(x, z) > m0
is taken as an edge point, which does not affect the inherent edge characteristics of the image.
In order to improve the calculation speed, edge detection is carried out from the x direction and the z direction to obtain a binary image with an edge part of 1 and the rest of 0.
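A minimal sketch of this edge-map construction, using SciPy Gaussian-derivative filtering as a stand-in for the wavelet convolution at scale 2^j; the mapping of the scale to the filter's sigma and the threshold value m0 are assumptions:

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def wavelet_edge_map(img: np.ndarray, j: int, m0: float) -> np.ndarray:
    """Binary edge map: modulus of the smoothed-gradient wavelet transform
    at scale 2**j, thresholded uniformly at m0 (edge part 1, rest 0)."""
    f = img.astype(np.float64)
    sigma = 2.0 ** j
    wx = gaussian_filter(f, sigma=sigma, order=(0, 1))  # W^x: d/dx (columns)
    wz = gaussian_filter(f, sigma=sigma, order=(1, 0))  # W^z: d/dz (rows)
    modulus = np.hypot(wx, wz)                          # M_{2^j} f(x, z)
    return (modulus > m0).astype(np.uint8)
```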
To reduce the positioning error, the relative installation positions of the area array CCD and the camera are shown in fig. 2.
Here, the output of CCD1 is used only as the reference image for determining the image point of the main optical axis; the central area and edge features of the central image point defined by CCD1 are compared with the CCD2 image to determine the position of the image point in CCD2, i.e. the image height, and the height of the target point is then calculated automatically from the coordinate formula, for example as in the matching sketch below.
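One plausible way to realize this comparison is to match the central patch of the CCD1 edge map against the CCD2 edge map by cross-correlation; in the sketch below the patch size, the use of correlate2d and the pixel-offset reading of the image height are assumptions:

```python
import numpy as np
from scipy.signal import correlate2d

def image_height_pixels(edge1: np.ndarray, edge2: np.ndarray,
                        patch: int = 15) -> float:
    """Locate the central (principal-axis) patch of the CCD1 edge map inside
    the CCD2 edge map; return the horizontal offset of the best match from
    the CCD2 centre, i.e. the image height in pixels (multiply by the pixel
    pitch to obtain h)."""
    r0, c0 = edge1.shape[0] // 2, edge1.shape[1] // 2
    half = patch // 2
    tpl = edge1[r0 - half:r0 + half + 1, c0 - half:c0 + half + 1].astype(float)
    score = correlate2d(edge2.astype(float), tpl, mode="same")
    _, c = np.unravel_index(np.argmax(score), score.shape)
    return float(c - edge2.shape[1] // 2)
```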
Two-dimensional image shape description techniques can be divided into two categories, scalar and spatial-domain. Scalar techniques describe the shape of an object with a single scalar parameter or a group of scalar parameters, and are further divided into internal scalar and external scalar techniques; spatial-domain techniques describe the structural and relational properties of the object's shape.
After the two-dimensional image is wavelet-transformed and binarized, area, used as a simple internal scalar parameter, removes most of the unrelated targets in the image; however, pseudo targets with the same area as the true target may remain, and these are well removed by using the radius vector about the centroid of the closed curve as an external scalar parameter. The centroid coordinates (x1, z1) of the closed curve are given by the formula
x1 = (1/k)·Σ x(i), z1 = (1/k)·Σ z(i)
in which k is the number of sampling points of the closed curve and x(i), z(i) are the coordinates of the i-th sampling point.
Then the radius vector of the i-th sampling point is
R_i = [ (x(i) − x1)² + (z(i) − z1)² ]^{1/2}, i = 1, 2, …, k
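The two-stage screening described above (area as the internal scalar, the radius-vector signature as the external scalar) might be sketched as follows; the tolerance values and the way the two signatures are compared are illustrative assumptions:

```python
import numpy as np

def centroid_and_radius_vectors(pts: np.ndarray):
    """pts: (k, 2) array of closed-curve sample points (x(i), z(i)).
    Returns the centroid (x1, z1) and the radius vector R_i of every point."""
    x1, z1 = pts.mean(axis=0)                     # centroid formula above
    R = np.hypot(pts[:, 0] - x1, pts[:, 1] - z1)  # R_i
    return (x1, z1), R

def is_pseudo_target(pts, area, ref_area, ref_R,
                     area_tol=0.10, r_tol=0.15) -> bool:
    """Reject a candidate whose area differs from the reference (internal
    scalar) or whose radius-vector signature differs (external scalar)."""
    if abs(area - ref_area) > area_tol * ref_area:
        return True                               # removed by area alone
    _, R = centroid_and_radius_vectors(np.asarray(pts, dtype=float))
    m = min(len(R), len(ref_R))
    R_s = np.sort(R)[:m]                          # crude signature alignment
    ref_s = np.sort(np.asarray(ref_R, dtype=float))[:m]
    return bool(np.mean(np.abs(R_s - ref_s)) > r_tol * np.mean(ref_s))
```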
According to the invention, the displacement of the unmanned aerial vehicle in the horizontal direction can be measured by optical flow positioning, from which the speed is estimated. When the height of the unmanned aerial vehicle changes, i.e. there is a velocity component in the vertical direction, the target is imaged on the two area array CCDs through their respective objectives; the images undergo fast A/D conversion and are stored in a large-capacity memory. Once the computer detects that both images are fully stored, it automatically reads the data from memory in the order CCD1 then CCD2, completes the image edge binarization, target point determination and distance calculation, and feeds the result to a digital-to-analog converter that drives the lens adjustment motor. The adjustment process is fed back to the computer; when the lens reaches the proper position, the computer output is interrupted and the motor stops.
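The measurement-and-focus loop just described might look as follows in outline; all four callables are assumed stand-ins for the hardware interfaces (CCD read-out, ranging computation, motor drive), not the patented circuit:

```python
def autofocus_loop(read_ccd_pair, compute_distance, drive_motor,
                   tol: float = 0.01) -> float:
    """Range the target from the two CCD images, drive the lens motor from
    the result, and stop once the fed-back focus error is small enough."""
    while True:
        img1, img2 = read_ccd_pair()         # CCD1 then CCD2, after A/D + memory
        y = compute_distance(img1, img2)     # edge binarization + ranging
        error = drive_motor(y)               # D/A output; returns fed-back error
        if abs(error) < tol:                 # lens at the proper position
            return y
```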
Test example 1
The test was carried out on a p450-series unmanned aerial vehicle using an optical flow positioning sensor. One face of a protective net (about 5 m long) was selected as the reference object for the optical flow positioning test, a two-dimensional code was pasted every 1 m as the image recognized by the optical flow technique for positioning, and the position of the first two-dimensional code was set to 0 m. The unmanned aerial vehicle flew at fixed points past each two-dimensional code in turn (the distance between the vehicle and the protective net was kept unchanged during the test); the test data are shown in Table 1.
TABLE 1
[table image not reproduced in this text]
Test example 2
Both area array CCDs of the measurement system use TC215 full-frame image sensors produced by Texas Instruments, and the lens is a double-Gauss photographic objective with a focal length of 18-200 mm. At the test site, photographs were taken of the ground from every other floor of a 15 m building (each floor 3 m high); the test data are shown in Table 2.
TABLE 2
[table image not reproduced in this text]
Test example 3
The two area array CCDs use TC215 full-frame image sensors produced by Texas Instruments, the lens is a double-Gauss photographic objective with a focal length of 18-200 mm, and a p450-series unmanned aerial vehicle is used for the test. The test site is a 6 m × 6 m area enclosed by a protective net, 3.5 m high; two-dimensional codes are selected as the identification objects, and the unmanned aerial vehicle passes over each two-dimensional code in turn at different heights while its position and speed are estimated. The change of the camera's focal length is recorded; the two-dimensional code array is laid on the ground as shown in fig. 4, with the ground height recorded as 0. The test data are shown in Table 3, where coordinates are expressed as (x, y).
TABLE 3
[table image not reproduced in this text]
According to the unmanned aerial vehicle positioning and speed estimation method disclosed by the invention, without relying on positioning equipment such as GPS, the unmanned aerial vehicle can be positioned autonomously and its displacement in the horizontal and vertical directions measured using optical flow positioning and the area array CCD technique; in most cases the data deviation is not higher than 5%, so stable positioning of the unmanned aerial vehicle is achieved.
The present invention is not limited to the above preferred embodiments, and any modifications, equivalent substitutions and improvements made within the spirit and principle of the present invention should be included in the protection scope of the present invention.

Claims (4)

1. An unmanned aerial vehicle positioning and speed estimation method, comprising an unmanned aerial vehicle on which a camera and an intersection measurement system are mounted, the intersection measurement system comprising two area array CCDs of identical performance installed symmetrically on the two sides of the camera, the two area array CCDs taking the camera's main optical axis as the axis of symmetry, characterized in that the method specifically comprises the following steps:
S1: randomly selecting a reference object for the flight of the unmanned aerial vehicle, and acquiring two frames of images captured by the camera of the unmanned aerial vehicle at different moments during flight, wherein both frames contain the reference object and the gray levels of their pixels are the same;
S2: calculating the displacement of pixel points between the two frames of images by the optical flow method to obtain an optical flow vector, and further calculating the horizontal displacement of the unmanned aerial vehicle;
S3: selecting the intersection point of the camera's main optical axis with the ground as the target point;
S4: at the first moment, acquiring the images on the area array CCDs on both sides, extracting image edge features according to the time-frequency localization characteristics and multi-scale analysis of the wavelet transform, determining the position of the target point on the images, and calculating the actual height between the unmanned aerial vehicle and the target point;
S5: at the second moment, after the camera has automatically focused, acquiring the images on the area array CCDs on both sides, extracting image edge features according to the time-frequency localization characteristics and multi-scale analysis of the wavelet transform, determining the position of the target point on the images, and calculating the actual height between the unmanned aerial vehicle and the target point;
S6: differencing the two actual heights to obtain the vertical displacement of the unmanned aerial vehicle;
S7: calculating the flight speed of the unmanned aerial vehicle from its relative position offsets per unit time, and calculating the positioning data of the unmanned aerial vehicle at the current moment from the positioning data at the previous moment together with the horizontal displacement and the vertical displacement;
in step S4, a coordinate system is established with one CCD lens node O1 as the origin of coordinates and the line O1O2 joining the two lens nodes as the x-axis; the node distance between the two CCD lenses is d, the tilt angles of the two optical axes are α and β, and the image distances of target point A on the two CCD surfaces through the lenses are h1 and h2 respectively, i.e. the image heights; the corresponding sign convention is: rotation from the y-axis is positive clockwise and negative anticlockwise; image heights h1 and h2 are negative on the left and positive on the right; the focal lengths of the two CCD lenses are the same, denoted f, and the coordinates (x, y) of target point A are:
x = d·tan(α + arctan(h1/f)) / [ tan(α + arctan(h1/f)) − tan(β + arctan(h2/f)) ]
y = d / [ tan(α + arctan(h1/f)) − tan(β + arctan(h2/f)) ]
the target point is the intersection of the camera's main optical axis with the ground, so h1 = h2 = h, and the y coordinate of the target point, i.e. the camera shooting distance, is:
y = −d(h·sinα + f·cosα)² / [ 2hf·cos2α + (h² − f²)·sin2α ].
2. The method of claim 1, wherein in step S2, since the gray levels of the pixels of the two frames of images are the same, the following holds:
I(x, y, t) = I(x + dx, y + dy, t + dt)
wherein I(x, y, t) is the gray level of the pixel that moves to position (x + dx, y + dy) in the second frame after time dt; expanding both sides as a Taylor series and eliminating the common terms gives the following equation:
f_x u + f_y v + f_t = 0
wherein:
f_x = ∂I/∂x, f_y = ∂I/∂y, f_t = ∂I/∂t; u = dx/dt, v = dy/dt
the above is the optical flow equation, where f_x and f_y denote the image gradients and f_t denotes the temporal gradient;
because all points in the reference object have the same motion, the 9 points of a 3×3 neighborhood are taken to have the same motion, yielding 9 optical flow equations; a least-squares fit then gives (u, v) as:
[u]   [ Σ f_x(i)²        Σ f_x(i)·f_y(i) ]⁻¹ [ −Σ f_x(i)·f_t(i) ]
[v] = [ Σ f_x(i)·f_y(i)  Σ f_y(i)²       ]   [ −Σ f_y(i)·f_t(i) ]
where the sums run over the 9 neighborhood points i.
3. The unmanned aerial vehicle positioning and speed estimation method according to claim 1, wherein in step S4, extracting the image edge features according to the best time-frequency localization feature and multi-scale analysis capability of the wavelet transform specifically comprises:
let the real function of the two-dimensional signal output by the CCD be f(x, z), and take the Gaussian function
θ(x, z) = (1/2π)·exp(−(x² + z²)/2)
as the smoothing function, whose first derivatives are:
ψ^x(x, z) = ∂θ(x, z)/∂x
ψ^z(x, z) = ∂θ(x, z)/∂z
at scale 2^j, the wavelet transforms in the x and z directions can be expressed as:
W^x_{2^j}f(x, z) = f ∗ ψ^x_{2^j}(x, z)
W^z_{2^j}f(x, z) = f ∗ ψ^z_{2^j}(x, z)
in the formula:
ψ^x_{2^j}(x, z) = (1/2^{2j})·ψ^x(x/2^j, z/2^j), ψ^z_{2^j}(x, z) = (1/2^{2j})·ψ^z(x/2^j, z/2^j)
let
M_{2^j}f(x, z) = [ |W^x_{2^j}f(x, z)|² + |W^z_{2^j}f(x, z)|² ]^{1/2}
since the modulus maximum points of the wavelet transform correspond to the edges of the image, the threshold is uniformly taken as m0; if M_{2^j}f(x, z) > m0, the point is taken as an edge point, and a binary image with the edge part 1 and the rest 0 is obtained; the central area and edge features of the central image point are determined in the image, the two CCD images are compared to obtain the image height, and the actual height between the unmanned aerial vehicle and the target point is calculated.
4. The unmanned aerial vehicle positioning and speed estimation method according to claim 3, wherein in step S4, removing the false targets in the binarized image specifically comprises:
removing the false targets in the binary image using area as the internal scalar parameter and the radius vector about the centroid of a closed curve as the external scalar parameter, wherein the centroid coordinates (x1, z1) of the closed curve are:
x1 = (1/k)·Σ x(i), z1 = (1/k)·Σ z(i)
where k is the number of sampling points of the closed curve and x(i), z(i) are the coordinates of the i-th sampling point;
the radius vector of the i-th sampling point is:
R_i = [ (x(i) − x1)² + (z(i) − z1)² ]^{1/2}, i = 1, 2, …, k.
CN202211593074.1A 2022-12-13 2022-12-13 Unmanned aerial vehicle positioning and speed estimation method Active CN115597498B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211593074.1A CN115597498B (en) 2022-12-13 2022-12-13 Unmanned aerial vehicle positioning and speed estimation method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202211593074.1A CN115597498B (en) 2022-12-13 2022-12-13 Unmanned aerial vehicle positioning and speed estimation method

Publications (2)

Publication Number Publication Date
CN115597498A CN115597498A (en) 2023-01-13
CN115597498B true CN115597498B (en) 2023-03-31

Family

ID=84854006

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211593074.1A Active CN115597498B (en) 2022-12-13 2022-12-13 Unmanned aerial vehicle positioning and speed estimation method

Country Status (1)

Country Link
CN (1) CN115597498B (en)

Family Cites Families (18)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
FR2981148B1 (en) * 2011-10-10 2013-11-08 Fb Technology MEASURING EQUIPMENT FOR CONTROLLING AN APPROACH TRACK INDICATOR FOR AIRCRAFT LANDING, AND CORRESPONDING CONTROL DEVICE.
CN103365297B (en) * 2013-06-29 2016-03-09 天津大学 Based on four rotor wing unmanned aerial vehicle flight control methods of light stream
CN106989744A (en) * 2017-02-24 2017-07-28 中山大学 A kind of rotor wing unmanned aerial vehicle autonomic positioning method for merging onboard multi-sensor
CN106708081B (en) * 2017-03-17 2019-06-04 北京思比科微电子技术股份有限公司 More rotor unmanned aircraft control systems
US11064184B2 (en) * 2017-08-25 2021-07-13 Aurora Flight Sciences Corporation Aerial vehicle imaging and targeting system
CN207560137U (en) * 2017-08-31 2018-06-29 天津航天中为数据系统科技有限公司 Binocular camera oblique photograph device based on unmanned aerial vehicle platform
CN108335316B (en) * 2018-01-12 2021-08-17 大连大学 Steady optical flow calculation method based on wavelet
CN109163928A (en) * 2018-08-27 2019-01-08 河海大学常州校区 A kind of UAV Intelligent water intake system based on binocular vision
CN208813493U (en) * 2018-09-25 2019-05-03 成都铂贝科技有限公司 A kind of orientable unmanned plane in indoor and outdoor
CN110081982B (en) * 2019-03-11 2021-01-15 中林信达(北京)科技信息有限责任公司 Unmanned aerial vehicle target positioning method based on double-spectrum photoelectric search
CN110007690A (en) * 2019-05-08 2019-07-12 北京天龙智控科技有限公司 A kind of unmanned plane cruising inspection system and method
CN112461204B (en) * 2019-08-19 2022-08-16 中国科学院长春光学精密机械与物理研究所 Method for satellite to dynamic flying target multi-view imaging combined calculation of navigation height
CN111024066B (en) * 2019-12-10 2023-08-01 中国航空无线电电子研究所 Unmanned aerial vehicle vision-inertia fusion indoor positioning method
JP2024038527A (en) * 2021-01-12 2024-03-21 ソニーセミコンダクタソリューションズ株式会社 Signal processing device, signal processing method, and signal processing system
CN216246319U (en) * 2021-06-29 2022-04-08 广州南洋理工职业学院 Unmanned aerial vehicle's light stream positioning system
CN114459467B (en) * 2021-12-30 2024-05-03 北京理工大学 VI-SLAM-based target positioning method in unknown rescue environment
CN114690802A (en) * 2022-04-02 2022-07-01 深圳慧源创新科技有限公司 Unmanned aerial vehicle binocular light stream obstacle avoidance method and device, unmanned aerial vehicle and storage medium
CN115371673A (en) * 2022-07-14 2022-11-22 北京理工大学 Binocular camera target positioning method based on Bundle Adjustment in unknown environment

Also Published As

Publication number Publication date
CN115597498A (en) 2023-01-13


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant