CN106558038B - Water antenna (water-sky line) detection method and device - Google Patents


Info

Publication number
CN106558038B
CN106558038B
Authority
CN
China
Prior art keywords
image
preset
camera
coordinate system
dimensional coordinate
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201510595935.3A
Other languages
Chinese (zh)
Other versions
CN106558038A (en)
Inventor
胡庭波
吴涛
安向京
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
National University of Defense Technology
Original Assignee
National University of Defense Technology
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by National University of Defense Technology filed Critical National University of Defense Technology
Priority to CN201510595935.3A priority Critical patent/CN106558038B/en
Publication of CN106558038A publication Critical patent/CN106558038A/en
Application granted granted Critical
Publication of CN106558038B publication Critical patent/CN106558038B/en

Landscapes

  • Length Measuring Devices By Optical Means (AREA)
  • Image Processing (AREA)

Abstract

Embodiments of the invention disclose a water antenna (water-sky line) detection method and device. Based on images of a target water area captured at the same moment by a first camera and a second camera, the spatial position of the horizontal plane of the target water area in a preset three-dimensional coordinate system is obtained; the projection line of that horizontal plane in the image is then computed from the spatial position of the horizontal plane, and the projection line is determined as the water antenna of the target water area. The disclosed detection method and device can detect the water antenna accurately even when the surface color and texture of the target water area change or the illumination changes. Moreover, when no real water antenna appears in the captured image, or the water antenna is occluded, its position in the image can still be detected correctly, which effectively improves the adaptability of the detection method and detection device.

Description

Water antenna detection method and device
Technical Field
The invention relates to the technical field of image processing, in particular to a water antenna detection method and device.
Background
The water antenna is the boundary line between the water and the sky in an image of a water area captured by a camera or video camera, and it is important for guiding the safe navigation of ships. For example, once the water antenna has been located in a water-surface image, obstacles on the water surface can be detected within the image, and the three-dimensional coordinates of an obstacle in a spatial coordinate system can be calculated from the positions of the water antenna and of the obstacle in the image, so that the ship can be warned to take avoiding action in advance.
At present, existing water antenna detection is mainly based on a monocular image and proceeds as follows: edge extraction is performed on the water-surface image collected by a camera, using the pixel features of each pixel point, to extract the edge lines of objects present in the image; straight-line fitting is then applied to the extracted edge lines with a fitting algorithm such as least squares to obtain at least one fitted straight line; finally, one of the fitted straight lines is selected to serve as the water antenna.
However, in practical applications this existing method cannot detect the water antenna accurately and effectively when a ship sails in a narrow water area and the camera's field of view is limited, when the distant water antenna is occluded by an obstacle, or when the ship is far from land and no real water antenna may exist in the water-surface image collected by the camera. In addition, when there is fog on the water, detection fails because the contrast of the image at the water-sky boundary is too low, and the water antenna likewise cannot be detected accurately when the illumination or the color of the water surface changes greatly.
Disclosure of Invention
The embodiments of the invention provide a water antenna detection method and a water antenna detection device, and aim to solve the problem that the existing water antenna detection method cannot detect a water antenna accurately and reliably.
In order to solve the technical problem, the embodiment of the invention discloses the following technical scheme:
a water antenna detection method, the method comprising:
respectively acquiring images obtained by shooting a target water area at the same time by a first camera and a second camera which are separated by a preset distance, wherein the image shot by the first camera is a first image, and the image shot by the second camera is a second image;
carrying out stereo matching on pixel points in the first image and the second image to obtain a parallax value of each pixel point in the first image;
determining the three-dimensional coordinate of each pixel point in the first image in the preset three-dimensional coordinate system according to the image coordinate of each pixel point in the first image and the parallax value of each pixel point in the first image;
determining the spatial position of the horizontal plane of the target water area in the preset three-dimensional coordinate system by using the three-dimensional coordinates of a plurality of pixel points in the first image in the preset three-dimensional coordinate system;
and according to the spatial position of the horizontal plane in the preset three-dimensional coordinate system, obtaining a projection line of the horizontal plane in the first image, and determining the projection line as a water antenna of the target water area in the first image.
Optionally, the performing stereo matching on the pixel points in the first image and the second image to obtain the disparity value of each pixel point in the first image includes:
for each pixel point in the first image, determining the row in which the pixel point is located, searching the same row of the second image for the pixel point with the greatest similarity to it, and taking that pixel point as the corresponding pixel point in the second image;
and calculating the difference between the column number of the pixel point in the first image and the column number of its corresponding pixel point in the second image as the parallax value of the pixel point.
Optionally, the method further comprises:
calibrating the first camera and the second camera before images obtained by shooting of the first camera and the second camera are obtained for the first time;
respectively obtaining image coordinates of the optical center of the first camera projected in the first image and the preset distance between the first camera and the second camera according to a calibration result;
and establishing an optical center coordinate system of the first camera by using the image coordinate of the optical center of the first camera projected in the first image and the preset distance between the first camera and the second camera, and using the optical center coordinate system as the preset three-dimensional coordinate system.
Optionally, the determining, according to the image coordinate of each pixel in the first image and the parallax value of each pixel in the first image, the three-dimensional coordinate of each pixel in the first image in the preset three-dimensional coordinate system includes:
aiming at each pixel point in the first image, calculating the three-dimensional coordinate (x, y, z) of the pixel point in a preset three-dimensional coordinate system by adopting the following formula:

x = (u1 - u0) · b / d
y = (v1 - v0) · b / d
z = f · b / d

wherein (u1, v1) is the image coordinate of the pixel point in the first image, (u0, v0) is the image coordinate of the projection of the optical center of the first camera in the first image, b is the preset distance between the first camera and the second camera, f is the preset focal length of the first camera, and d is the parallax value of the pixel point relative to its corresponding pixel point in the second image.
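The triangulation relation above can be sketched in a few lines of Python; the function name and the numerical values below are illustrative only and not part of the embodiment.

```python
def pixel_to_3d(u1, v1, u0, v0, b, f, d):
    """Triangulate a pixel of the first image into the optical-center
    coordinate system, given its disparity d (in pixels), the baseline b,
    the focal length f (in pixels), and the principal point (u0, v0)."""
    if d <= 0:
        raise ValueError("disparity must be positive for a finite point")
    x = (u1 - u0) * b / d   # offset along image rows, scaled by depth
    y = (v1 - v0) * b / d   # offset along image columns, scaled by depth
    z = f * b / d           # depth: focal length times baseline over disparity
    return (x, y, z)

# Example: baseline 0.5 m, focal length 800 px, principal point (240, 320)
print(pixel_to_3d(250, 330, 240, 320, 0.5, 800.0, 5.0))  # (1.0, 1.0, 80.0)
```

Note that the depth z grows as the disparity shrinks, which is why distant water-surface points near the horizon have disparities close to zero.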
Optionally, the determining, by using three-dimensional coordinates of a plurality of pixel points in the first image in a preset three-dimensional coordinate system, a spatial position of a horizontal plane of the target water area in the preset three-dimensional coordinate system includes:
selecting a plurality of pixel points in the first image as sampling points, wherein all the sampling points form a sample set;
monitoring the number of reference values obtained so far;
comparing the number of reference values with a preset number of reference values;
when the number of reference values is smaller than the preset number of reference values, randomly selecting a preset sampling-point number of sampling points from the sample set, and calculating the spatial position, in the preset three-dimensional coordinate system, of a target plane containing the selected sampling points by using the three-dimensional coordinates of the selected sampling points in the preset three-dimensional coordinate system;
calculating the distance between each sampling point in the sample set and the target plane, calculating the median of all the distances, taking the median as one reference value, and increasing the number of reference values by 1 as the number of reference values obtained at the next monitoring;
and when the number of reference values is equal to the preset number of reference values, selecting the reference value with the minimum value among them, and taking the spatial position, in the preset three-dimensional coordinate system, of the target plane corresponding to that minimum reference value as the spatial position of the horizontal plane in the preset three-dimensional coordinate system.
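The sampling loop described above amounts to a least-median-of-squares plane fit. A minimal pure-Python sketch follows, under two assumptions the claim leaves open: the preset sampling-point number is taken as 3 (a plane is determined by three non-collinear points) and the preset number of reference values is the trial count; all function names are hypothetical.

```python
import random

def fit_plane(p1, p2, p3):
    """Plane a*x + b*y + c*z + e = 0 through three points, with unit normal."""
    (x1, y1, z1), (x2, y2, z2), (x3, y3, z3) = p1, p2, p3
    ux, uy, uz = x2 - x1, y2 - y1, z2 - z1
    vx, vy, vz = x3 - x1, y3 - y1, z3 - z1
    a, b, c = uy * vz - uz * vy, uz * vx - ux * vz, ux * vy - uy * vx  # cross product
    n = (a * a + b * b + c * c) ** 0.5       # zero if the points are collinear
    a, b, c = a / n, b / n, c / n
    e = -(a * x1 + b * y1 + c * z1)
    return a, b, c, e

def lmeds_plane(samples, n_trials=100, rng=random.Random(0)):
    """Keep the candidate plane whose median point-to-plane distance over
    the whole sample set (the 'reference value') is smallest."""
    best_plane, best_median = None, float("inf")
    for _ in range(n_trials):                # preset number of reference values
        p1, p2, p3 = rng.sample(samples, 3)  # random minimal subset
        try:
            a, b, c, e = fit_plane(p1, p2, p3)
        except ZeroDivisionError:            # collinear sample: skip this trial
            continue
        dists = sorted(abs(a * x + b * y + c * z + e) for x, y, z in samples)
        median = dists[len(dists) // 2]      # one reference value
        if median < best_median:
            best_median, best_plane = median, (a, b, c, e)
    return best_plane, best_median
```

Using the median rather than the mean is what makes the fit tolerate sampling points that do not lie on the water surface (waves, boats, reflections), as long as they are a minority of the sample set.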
Optionally, the method further comprises:
searching the sample set for target sampling points whose distance from the spatial position of the horizontal plane in the preset three-dimensional coordinate system is smaller than a preset threshold value;
and removing the sampling points other than the target sampling points from the sample set, and acquiring the spatial position of the horizontal plane in the preset three-dimensional coordinate system again by using the sample set that now comprises only the target sampling points.
Optionally, the obtaining a projection line of the horizontal plane in the first image according to the spatial position of the horizontal plane in the preset three-dimensional coordinate system includes:
acquiring the spatial position of the optical center of the first camera in the preset three-dimensional coordinate system;
calculating the spatial position of an optical center plane which contains the optical center of the first camera and is parallel to the horizontal plane in the preset three-dimensional coordinate system by utilizing the spatial position of the optical center of the first camera in the preset three-dimensional coordinate system and the spatial position of the horizontal plane in the preset three-dimensional coordinate system;
calculating to obtain a projection line of the optical center plane in the first image by using the spatial position of the optical center plane in the preset three-dimensional coordinate system and the image coordinate of the projection of the optical center of the first camera in the first image, and determining the projection line of the optical center plane in the first image as the projection line of the horizontal plane in the first image.
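Under the pinhole projection implied by the triangulation formula (a point (x, y, z) projects to u = u0 + f·x/z, v = v0 + f·y/z), a plane through the optical center with normal (a, b, c) projects to the image line a·(u - u0) + b·(v - v0) + c·f = 0, because a pixel (u, v) sees along the ray with direction (u - u0, v - v0, f). A hypothetical sketch of this projection-line computation (names and values illustrative):

```python
def horizon_line(a, b, c, u0, v0, f):
    """Coefficients (A, B, C) of the image line A*u + B*v + C = 0 onto which
    the plane a*x + b*y + c*z = 0 (through the optical center, parallel to
    the water plane) projects: pixel (u, v) lies on the line when its
    viewing ray (u - u0, v - v0, f) lies in that plane."""
    return (a, b, c * f - a * u0 - b * v0)

def row_at(line, v):
    """Row coordinate u of the horizon at column v (requires A != 0)."""
    A, B, C = line
    return -(B * v + C) / A

# Level camera: vertical normal along the row axis gives a horizontal line
# through the principal point's row.
print(horizon_line(1.0, 0.0, 0.0, 240.0, 320.0, 800.0))  # (1.0, 0.0, -240.0)
```

A pitched camera tilts the normal toward the optical axis (c != 0), which shifts the horizon row away from the principal point, as expected.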
Optionally, the method further comprises:
after the first image and the second image are acquired, noise reduction processing is performed on the first image and the second image.
A water antenna detection apparatus, the apparatus comprising:
the image acquisition unit is used for respectively acquiring images obtained by shooting a target water area at the same moment by a first camera and a second camera separated by a preset distance, wherein the image shot by the first camera is a first image and the image shot by the second camera is a second image;
the parallax value acquisition unit is used for performing stereo matching on the pixel points in the first image and the second image to obtain the parallax value of each pixel point in the first image;
a pixel coordinate determining unit, configured to determine, according to an image coordinate of each pixel in the first image and a parallax value of each pixel in the first image, a three-dimensional coordinate of each pixel in the first image in the preset three-dimensional coordinate system;
the horizontal plane determining unit is used for determining the spatial position of the horizontal plane of the target water area in a preset three-dimensional coordinate system by utilizing the three-dimensional coordinates of a plurality of pixel points in the first image in the preset three-dimensional coordinate system;
and the water antenna determining unit is used for obtaining a projection line of the horizontal plane in the first image according to the spatial position of the horizontal plane in the preset three-dimensional coordinate system, and determining the projection line as the water antenna of the target water area in the first image.
Optionally, the disparity value obtaining unit includes:
a corresponding pixel point searching unit, configured to determine, for each pixel point in the first image, the row in which the pixel point is located, and to search the same row of the second image for the pixel point with the greatest similarity to it, as the corresponding pixel point in the second image;
and a parallax value calculating unit, configured to calculate the difference between the column number of the pixel point in the first image and the column number of its corresponding pixel point in the second image as the parallax value of the pixel point.
Optionally, the apparatus further comprises:
the camera calibration unit is used for calibrating the first camera and the second camera before images obtained by shooting of the first camera and the second camera are obtained for the first time;
the acquisition unit is used for respectively acquiring image coordinates of the optical center of the first camera projected in the first image and the preset distance between the first camera and the second camera according to a calibration result;
a preset three-dimensional coordinate system establishing unit, configured to establish an optical center coordinate system of the first camera by using image coordinates of the optical center of the first camera projected in the first image and the preset distance between the first camera and the second camera, and use the optical center coordinate system as the preset three-dimensional coordinate system.
Optionally, the pixel coordinate determining unit includes:
a pixel coordinate calculation unit, configured to calculate, for each pixel point in the first image, the three-dimensional coordinate (x, y, z) of the pixel point in a preset three-dimensional coordinate system by using the following formula:

x = (u1 - u0) · b / d
y = (v1 - v0) · b / d
z = f · b / d

wherein (u1, v1) is the image coordinate of the pixel point in the first image, (u0, v0) is the image coordinate of the projection of the optical center of the first camera in the first image, b is the preset distance between the first camera and the second camera, f is the preset focal length of the first camera, and d is the parallax value of the pixel point relative to its corresponding pixel point in the second image.
Optionally, the level determining unit includes:
the sample set forming unit is used for selecting a plurality of pixel points in the first image as sampling points, all the sampling points forming a sample set;
a reference value quantity monitoring unit for monitoring the quantity of the reference values;
a reference value number comparison unit for comparing the number of reference values with a preset reference value number;
the target plane calculation unit is used for randomly selecting a preset sampling point number of the sampling points in the sample set when the number of the reference values is smaller than the preset reference value number, and calculating the spatial position of a target plane including the preset sampling point number of the sampling points in the preset three-dimensional coordinate system by utilizing the three-dimensional coordinates of the selected preset sampling point number of the sampling points in the preset three-dimensional coordinate system;
a reference value obtaining unit, configured to calculate the distance between each sampling point in the sample set and the target plane, calculate the median of all the distances, use the median as one reference value, and increase the number of reference values by 1 as the number of reference values obtained at the next monitoring;
and the horizontal plane determining subunit is configured to, when the number of the reference values is equal to a preset reference value number, select a reference value with a minimum value from the preset reference value number of the reference values, and use a spatial position of the target plane in the preset three-dimensional coordinate system, which corresponds to the reference value with the minimum value, as a spatial position of the horizontal plane in the preset three-dimensional coordinate system.
Optionally, the apparatus further comprises:
the target sampling point searching unit is used for searching a target sampling point in the sample set, and the distance between the target sampling point and the spatial position of the horizontal plane in the preset three-dimensional coordinate system is smaller than a preset threshold value;
and the horizontal plane reacquiring unit is used for removing the sampling points except the target sampling point in the sample set and reacquiring the spatial position of the horizontal plane in the preset three-dimensional coordinate system by using the sample set only comprising the target sampling point.
Optionally, the water antenna determination unit includes:
an optical center position acquiring unit, configured to acquire a spatial position of an optical center of the first camera in the preset three-dimensional coordinate system;
an optical center plane calculating unit, configured to calculate, by using a spatial position of an optical center of the first camera in the preset three-dimensional coordinate system and a spatial position of the horizontal plane in the preset three-dimensional coordinate system, a spatial position of an optical center plane that includes the optical center of the first camera and is parallel to the horizontal plane in the preset three-dimensional coordinate system;
and the projection line acquisition unit is used for calculating a projection line of the optical center plane in the first image by utilizing the spatial position of the optical center plane in the preset three-dimensional coordinate system and the image coordinate of the projection of the optical center of the first camera in the first image, and determining the projection line of the optical center plane in the first image as the projection line of the horizontal plane in the first image.
Optionally, the apparatus further comprises:
and the noise reduction unit is used for performing noise reduction processing on the first image and the second image after the first image and the second image are acquired.
According to the technical solutions above, the water antenna detection method and device provided by the embodiments of the invention obtain the spatial position of the horizontal plane of the target water area in the preset three-dimensional coordinate system from images of the target water area captured at the same moment by the first camera and the second camera, then use that spatial position to obtain the projection line of the horizontal plane in the image, and determine the projection line as the water antenna of the target water area.
In the embodiments of the disclosure, because the detection is based on binocular images, it does not rely on the pixel features of individual pixel points, so the water antenna can still be detected accurately and effectively when the surface color and texture of the target water area change or the illumination changes. In addition, since the position of the water antenna in the image is obtained by determining the spatial position of the horizontal plane in the preset three-dimensional coordinate system, it is sufficient that the horizontal plane of the target water area appears in the captured image; the position of the water antenna can therefore be detected accurately even when no real water antenna exists in the captured image or the water antenna is occluded, which effectively improves the adaptability of the detection method and detection device.
In addition, in one embodiment of the disclosure, sampling points too far from the horizontal plane are removed from the sample set, and the spatial position of the horizontal plane in the preset three-dimensional coordinate system is recalculated using the remaining target sampling points; this enhances the robustness of the detection method and further improves the accuracy of the finally determined water antenna.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art will be briefly described below, and it is obvious for those skilled in the art that other drawings can be obtained according to the drawings without inventive exercise.
Fig. 1 is a schematic flow chart of a water antenna detection method according to an embodiment of the present invention;
fig. 2 is a schematic flowchart of step S102 in fig. 1 according to an embodiment of the present invention;
fig. 3 is a schematic flow chart of another water antenna detection method according to an embodiment of the present invention;
fig. 4 is a schematic flowchart of step S103 in fig. 1 according to an embodiment of the present invention;
fig. 5 is a schematic flowchart of step S104 in fig. 1 according to an embodiment of the present invention;
fig. 6 is a schematic flowchart of step S105 in fig. 1 according to an embodiment of the present invention;
FIG. 7 is a schematic diagram of determining a projection line of a horizontal plane in a first image according to an embodiment of the present invention;
fig. 8 is a schematic diagram of a water antenna detection result according to an embodiment of the present invention;
FIG. 9 is a schematic diagram of another water antenna detection result according to an embodiment of the present invention;
fig. 10 is a schematic structural diagram of a water antenna detection apparatus according to an embodiment of the present invention.
Detailed Description
In order to make those skilled in the art better understand the technical solution of the present invention, the technical solution in the embodiment of the present invention will be clearly and completely described below with reference to the drawings in the embodiment of the present invention, and it is obvious that the described embodiment is only a part of the embodiment of the present invention, and not all embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
Fig. 1 is a schematic flow chart of a water antenna detection method according to an embodiment of the present disclosure, where the method is implemented on the premise that two cameras simultaneously capture images of a target water area, and the two cameras are respectively referred to as a first camera and a second camera. As shown in fig. 1, the method includes the following steps.
In step S101, images of a target water area captured by a first camera and a second camera separated by a preset distance at the same time are obtained, respectively, where the image captured by the first camera is a first image, and the image captured by the second camera is a second image.
The first camera and the second camera are set to capture images of the target water area at the same moment. The target water area is a water area whose horizontal extent is small enough that the shooting range of the cameras can contain it, and each camera captures an image of the target water area once every preset time period. Taking the first camera as an example, after it captures one image it waits, for example, 5 seconds before capturing the next.
The first camera and the second camera are separated by a predetermined distance, which may be set according to the optimal installation positions of the two cameras; generally, the predetermined distance is set to 0.2 m to 0.8 m. Further, because the first camera and the second camera image the same target water area, their imaging ranges overlap, and many of the same objects appear in both the first image captured by the first camera and the second image captured by the second camera; for example, the same ship, or the same house on the bank, appears in both images.
The images of the target water area captured by the first camera and the second camera at the same moment may be acquired as follows: the images captured by the two cameras at the current moment are obtained; after 5 seconds the two cameras capture the target water area again simultaneously, and the images captured at the same moment are obtained again. All subsequent steps are performed on each pair of images captured at the same moment, so that the water antenna in the images is detected continuously from each first image and second image obtained.
In an embodiment of the present disclosure, after the first image and the second image are acquired, the first image and the second image are subjected to noise reduction processing, where the noise reduction processing may be performed by using an existing noise reduction technology, such as median filtering or gaussian smoothing filtering.
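Median filtering, one of the noise-reduction options mentioned above, can be sketched in pure Python. The 3 × 3 window below, and copying border pixels unchanged, are simplifications chosen here for illustration, not something the embodiment prescribes.

```python
def median_filter3(img):
    """3x3 median filter on a grayscale image given as a list of rows."""
    h, w = len(img), len(img[0])
    out = [row[:] for row in img]            # border pixels keep original values
    for i in range(1, h - 1):
        for j in range(1, w - 1):
            window = sorted(img[i + di][j + dj]
                            for di in (-1, 0, 1) for dj in (-1, 0, 1))
            out[i][j] = window[4]            # median of the 9 window values
    return out

# A single bright noise pixel in a flat region is suppressed:
img = [[10] * 5 for _ in range(5)]
img[2][2] = 255
print(median_filter3(img)[2][2])  # 10
```

Suppressing such impulse noise before stereo matching helps because an isolated outlier gray value would otherwise distort the window-based similarity scores used in step S102.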
In step S102, stereo matching is performed on the pixel points in the first image and the second image to obtain a disparity value of each pixel point in the first image.
All pixel points in the first image are stereo-matched against the pixel points in the second image using an existing stereo matching algorithm, which yields the parallax value of each pixel point in either one of the two images. In the embodiments of the present disclosure the first image is taken as the example, so in this step the parallax value of each pixel point in the first image is obtained after stereo matching.
In an embodiment of the present disclosure, as shown in fig. 2, the manner of performing stereo matching on the first image and the second image to obtain the disparity value of each pixel point in the first image may include the following steps.
In step S201, for each pixel point in the first image, a row in which the pixel point is located is determined, and a pixel point with the maximum similarity to the pixel point is searched in the same row of the second image, and is used as a corresponding pixel point corresponding to the pixel point in the second image.
Each pixel point in the first image has an image coordinate in the first image, similarly, each pixel point in the second image also has an image coordinate in the second image, and if the abscissa of a certain pixel point in the first image is the same as the abscissa of another pixel point in the second image, it is indicated that the two pixel points are in the same row.
Taking a pixel point in the first image as an example, determining the row where the pixel point is located according to the abscissa of the pixel point, and searching the pixel point with the maximum similarity to the pixel point in the same row in the second image to be used as the corresponding pixel point corresponding to the pixel point in the second image. And according to the mode, aiming at each pixel point in the first image, respectively finding the pixel point with the maximum similarity in the second image as a corresponding pixel point.
In one embodiment of the present disclosure, the similarity between pixels can be obtained by:
The MNCC (Moravec normalized cross-correlation) metric is used to evaluate the similarity between pixel points. The similarity between a pixel point A in the first image and a pixel point B in the second image is calculated with the following formula:

C_MNCC(p, d) = 2 · Σ_{(u,v)∈Wp} (I_l(u,v) − Ī_l) · (I_r(u,v−d) − Ī_r) / Σ_{(u,v)∈Wp} [ (I_l(u,v) − Ī_l)² + (I_r(u,v−d) − Ī_r)² ]

wherein C_MNCC(p, d) is the similarity value between the pixel points; d is the difference between the ordinate of pixel point A in the first image and the ordinate of pixel point B in the second image, that is, the parallax value of pixel point A relative to pixel point B; Wp is a neighborhood of pixel point A in the first image, for example the region of 7 × 7 pixel points centered on A; I_l(u, v) and I_r(u, v) are the gray value of pixel point (u, v) in the first image and in the second image respectively; and Ī_l and Ī_r are the mean gray values of all pixel points in the neighborhood of A in the first image and in the neighborhood of B in the second image respectively, the neighborhood of B being set in the same way as the neighborhood of A.
In this embodiment, the coordinates of a pixel point in an image are (u, v), where u is the row number (the abscissa of the pixel point) and v is the column number (the ordinate of the pixel point). Using the formula above, starting from the pixel point with coordinates (1, 1) in the second image, the similarity between each pixel point with abscissa 1 in the second image, that is, each pixel point of the first row of the second image, and the pixel point with coordinates (1, 1) in the first image is calculated in turn. The pixel point with the greatest similarity among the pixel points with abscissa 1 in the second image is then taken as the corresponding pixel point of the pixel point (1, 1) of the first image. The corresponding pixel points of the remaining pixel points in the first image are then found in the second image in the same manner.
In step S202, the difference between the column number of a pixel point in the first image and the column number of its corresponding pixel point in the second image is calculated and taken as the parallax value of that pixel point.
That is, the difference between the ordinate of the pixel point in the first image and the ordinate of its corresponding pixel point in the second image is calculated, and the calculated difference is taken as the parallax value of the pixel point. For example, if the ordinate of a certain pixel point in the first image is 15 and the ordinate of its corresponding pixel point in the second image is 10, the parallax value of that pixel point is 5. The parallax value of each pixel point in the first image is obtained in this manner.
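As an illustration of steps S201 and S202, the row-wise search can be sketched as follows (a minimal Python sketch, not part of the disclosure: the function names, the window radius win=3, which yields the 7 × 7 neighborhood mentioned above, and the search range max_disp are assumptions; rectified images in which the corresponding pixel point lies in the same row are assumed):

```python
import numpy as np

def ncc(patch_a, patch_b):
    # Zero-mean normalized cross-correlation of two equal-sized patches.
    a = patch_a - patch_a.mean()
    b = patch_b - patch_b.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum())
    return (a * b).sum() / denom if denom > 0 else 0.0

def disparity_for_pixel(img_l, img_r, row, col, win=3, max_disp=32):
    # Search along the same row of the second image for the patch most
    # similar to the window around (row, col) in the first image, and
    # return the coordinate difference as the parallax value.
    h, w = img_l.shape
    r0, r1 = row - win, row + win + 1
    c0, c1 = col - win, col + win + 1
    if r0 < 0 or c0 < 0 or r1 > h or c1 > w:
        return None                      # window would leave the image
    ref = img_l[r0:r1, c0:c1].astype(float)
    best_d, best_score = 0, -np.inf
    for d in range(min(max_disp, c0) + 1):
        cand = img_r[r0:r1, c0 - d:c1 - d].astype(float)
        score = ncc(ref, cand)
        if score > best_score:
            best_score, best_d = score, d
    return best_d
```

In practice this per-pixel search is run for every pixel of the first image to produce a dense parallax map.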
In step S103, a three-dimensional coordinate of each pixel point in the first image in a preset three-dimensional coordinate system is determined according to the image coordinate of each pixel point in the first image and the parallax value of each pixel point in the first image.
After the parallax value of each pixel point in the first image is obtained according to the step S102, the three-dimensional coordinates of each pixel point in the first image in the preset three-dimensional coordinate system are determined according to the image coordinates of each pixel point in the first image and the parallax value of each pixel point in the first image in the following manner.
In one embodiment of the present disclosure, as shown in fig. 3, the preset three-dimensional coordinate system may be obtained by:
in step S301, before images captured by the first camera and the second camera are acquired for the first time, the first camera and the second camera are calibrated.
The first camera and the second camera are calibrated by using an existing camera calibration technique. The calibrated content may include the internal parameters of each camera, the external parameters, the distance between the two cameras, and the like. The internal parameters of a camera include its focal length, its distortion parameters, the image coordinates at which its optical center projects into the image captured by the camera, and so on; the external parameters include the rotation matrix, the translation vector, and the like between the image coordinate system of the camera and the space coordinate system.
In step S302, image coordinates of the optical center of the first camera projected in the first image and a preset distance between the first camera and the second camera are obtained according to the calibration result.
After the above step S301 is completed, the calibration parameters of the two cameras are obtained, so that the image coordinates of the optical center of the first camera projected in the first image and the preset distance between the first camera and the second camera can be obtained.
In step S303, an optical center coordinate system of the first camera is established as a predetermined three-dimensional coordinate system by using image coordinates of the optical center of the first camera projected in the first image and a predetermined distance between the first camera and the second camera.
In the embodiment of the disclosure, the predetermined three-dimensional coordinate system may be defined as the optical center coordinate system of the first camera. As shown in fig. 7, this coordinate system uses the optical center of the first camera as the coordinate origin, the direction perpendicular to the shooting direction of the first camera and parallel to the horizontal plane as the X-axis, the direction perpendicular to the horizontal plane as the Y-axis, and the shooting direction of the first camera as the Z-axis. Any point in the optical center coordinate system of the first camera satisfies the following formula:

x = b(u1 − u0)/d, y = b(v1 − v0)/d, z = b·f/d

wherein (u1, v1) is the image coordinate of the pixel point in the first image, (u0, v0) is the image coordinate of the projection of the optical center of the first camera in the first image, b is the preset distance between the first camera and the second camera, f is the preset focal length of the first camera, and d is the parallax value of the pixel point relative to its corresponding pixel point in the second image.
Therefore, according to the above formula, the three-dimensional coordinates of each pixel point in the first image in the preset three-dimensional coordinate system can be determined according to the image coordinates of each pixel point in the first image and the parallax value of each pixel point in the first image.
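Under these definitions, the triangulation step can be sketched as follows (an illustrative sketch; the function name and the convention that f is expressed in pixels are assumptions):

```python
def pixel_to_3d(u1, v1, d, u0, v0, b, f):
    # Triangulate an image point into the optical-center coordinate
    # system of the first camera.  (u1, v1): image coordinates of the
    # pixel point; d: its parallax value; (u0, v0): projection of the
    # optical center in the first image; b: preset distance between the
    # cameras; f: focal length expressed in pixels.
    if d <= 0:
        raise ValueError("parallax value must be positive")
    z = b * f / d                # depth along the shooting direction
    x = b * (u1 - u0) / d
    y = b * (v1 - v0) / d
    return x, y, z
```

Note that a larger parallax value corresponds to a smaller depth z, as expected for a nearer point.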
In step S104, the spatial position of the horizontal plane of the target water area in the preset three-dimensional coordinate system is determined by using the three-dimensional coordinates of the plurality of pixels in the first image in the preset three-dimensional coordinate system.
The equation for the plane can be expressed by a general plane equation formula, as follows:
ax+by+cz+d=0
where (x, y, z, 1)' expresses the homogeneous coordinates of a point, and a, b, c, d are the coefficients of the plane equation.
If a certain point P = (x, y, z, 1)' belongs to a certain plane Φ = (a, b, c, d)', then it must satisfy the plane equation; and if n three-dimensional points belong to this plane at the same time, they satisfy the equation set:

(xi, yi, zi, 1)·Φ = 0, i = 1, …, n

Let A be the n × 4 matrix whose i-th row is (xi, yi, zi, 1); then the above equation set can be expressed as AΦ = 0.
The result of the plane fitting can be obtained by solving the above equation set, and many existing methods are available for this, including the least squares method, the RANSAC algorithm (RANdom SAmple Consensus), and the least median of squares method.
The distance between a certain point P = (x, y, z, 1)' in space and the plane Φ = (a, b, c, d)' can be calculated by the following formula:

d(P, Φ) = |ax + by + cz + d| / √(a² + b² + c²)

The first idea of plane fitting is to minimize the following:

Σ_{i=1..n} d²(Pi, Φ)
that is, the distances between a plurality of points in the space and the plane are minimized, for example, the main content included in the first image is the target water area, and therefore, the number of pixels representing the horizontal plane in the first image is large, and if there is a certain plane, the sum of the distances between the position of the object represented by each pixel in the first image and the plane can be minimized, and the plane is highly likely to be the horizontal plane.
Therefore, based on the above idea, as shown in fig. 4, the spatial position of the horizontal plane of the target water area in the preset three-dimensional coordinate system is determined by the following steps.
In step S401, a plurality of pixel points are selected from the first image as sampling points, and all the sampling points form a sample set.
For example, one pixel point is selected in each region of 5 × 5 pixel points of the first image, or one pixel point is selected every 5 pixel points in the first image. Whichever selection mode is used, the main purpose is to select from the first image a subset of pixel points that still reflects the objects contained in the first image, so that only the selected pixel points are used in the subsequent calculation; using all the pixel points of the first image for the subsequent calculation would seriously reduce the calculation rate.
In an embodiment of the present disclosure, the pixel points in the first image are screened before the sampling points are selected: the similarity between each pixel point in the first image and its corresponding pixel point in the second image is obtained, and the pixel points whose similarity is lower than a preset threshold, namely the pixel points with a poor stereo-matching result against the second image, are filtered out. The sampling points are then selected from the remaining pixel points of the first image.
And taking the pixel points selected according to the mode as sampling points, and forming a sample set by all the sampling points.
In step S402, the number of reference values is monitored.
The reference value reflects the distances between all the sampling points in the sample set and a fitted plane; it is used to judge whether the result of the fitted plane is appropriate, that is, whether the fitted plane can represent the horizontal plane. The reference value is explained in detail in the subsequent steps.
In step S403, the number of reference values is compared with a preset reference value number.
The preset reference value number may be a preset number, or may be calculated by a preset calculation formula, which will be described in detail in the following steps.
When the number of the reference values is smaller than the preset number of reference values, continuing to execute step S404; if the number of reference values is equal to the preset number of reference values, the process proceeds to step S406.
When the number of the reference values is smaller than the number of the preset reference values, in step S404, a number of sampling points corresponding to the number of the preset sampling points are randomly selected from the sample set, and the spatial position of the target plane including the number of the sampling points corresponding to the number of the preset sampling points in the preset three-dimensional coordinate system is calculated by using the three-dimensional coordinates of the selected number of the sampling points in the preset three-dimensional coordinate system.
When sampling points are randomly selected from the sample set, the preset number of sampling points is smaller than the total number of sampling points in the sample set. In the embodiment of the present disclosure, the preset number of sampling points is set to 3, that is, 3 sampling points are randomly selected from the sample set. The three-dimensional coordinates of the 3 selected sampling points in the preset three-dimensional coordinate system have been obtained in step S103; following the plane-fitting method described above, the spatial position, in the preset three-dimensional coordinate system, of the target plane containing the 3 sampling points is calculated by the least squares method, that is, a plane equation ax + by + cz + d = 0 of the plane in the preset three-dimensional coordinate system is fitted.
In step S405, the distance between each sampling point in the sample set and the target plane is calculated, the median of all the distances is calculated, the median is used as a reference value, and 1 is added to the number of the reference values to be used as the number of reference values obtained by the next monitoring.
The three-dimensional coordinates of each sampling point in the sample set in the preset three-dimensional coordinate system are acquired, and the distance between each sampling point and the fitted target plane is calculated by the distance formula given above. After the distances from all the sampling points in the sample set to the target plane are obtained, their median Mj is calculated as:

Mj = median over i of d²(Pi, Φj)

wherein Pi is the i-th sampling point in the sample set and Φj is the target plane obtained in the j-th fitting.
the calculated median value is taken as a reference value, and at the same time, the number of reference values is obtained and increased by 1. For example, according to the foregoing method, results of 10 fitting planes have been obtained, and for each fitting plane, a median value is calculated as a reference value, and therefore, the number of reference values is 10. After the fitting plane is calculated this time, a new reference value is obtained, 1 is added to the number of the reference values, the number of the reference values is changed to 11, and 11 is used as the number of the reference values obtained by the next monitoring. And repeating the steps until the number of the reference values is equal to the preset number of the reference values.
When the number of the reference values is equal to the number of the preset reference values, in step S406, the reference value with the smallest value is selected from the number of the preset reference values, and the spatial position of the target plane in the preset three-dimensional coordinate system corresponding to the reference value with the smallest value is used as the spatial position of the horizontal plane in the preset three-dimensional coordinate system.
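Steps S401 to S406 can be sketched as a least-median-of-squares loop (an illustrative sketch, not the patented implementation: the function names, the number of reference values m=50 and the random seed are assumptions; the exact plane through 3 points is used, which coincides with the least-squares fit when only 3 points are sampled):

```python
import numpy as np

def plane_from_3_points(p1, p2, p3):
    # Exact plane (a, b, c, d)' through three non-collinear points.
    n = np.cross(p2 - p1, p3 - p1)
    norm = np.linalg.norm(n)
    if norm < 1e-12:
        return None                       # degenerate sample, skip it
    n = n / norm
    return np.append(n, -np.dot(n, p1))

def fit_horizontal_plane(samples, m=50, seed=0):
    # m times: pick 3 sampling points, fit a target plane, take the
    # median point-plane distance as the reference value; keep the
    # target plane whose reference value is smallest.
    samples = np.asarray(samples, dtype=float)
    rng = np.random.default_rng(seed)
    best_phi, best_ref = None, np.inf
    for _ in range(m):                    # m = preset number of reference values
        p1, p2, p3 = samples[rng.choice(len(samples), 3, replace=False)]
        phi = plane_from_3_points(p1, p2, p3)
        if phi is None:
            continue
        ref = np.median(np.abs(samples @ phi[:3] + phi[3]))
        if ref < best_ref:
            best_ref, best_phi = ref, phi
    return best_phi, best_ref
```

Because the median ignores the largest half of the residuals, mismatched sampling points far from the water surface do not pull the winning plane away from it.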
The number of preset reference values may be calculated by the following formula:
P = 1 − [1 − (1 − ε)^p]^m
wherein ε represents the proportion of incorrectly matched sampling points in the sample set, namely the proportion of pixel points in the sample set that are incorrectly matched to their corresponding pixel points in the second image; ε is a preset numerical value. p is the preset number of sampling points, which is 3 in the embodiment of the present disclosure; m is the preset number of reference values. The P calculated by the above formula represents the probability of obtaining a correct solution after randomly sampling m times and obtaining m reference values. P is generally set to 0.9, that is, after sampling the sample set m times, a target plane representing the horizontal plane is obtained with 90% probability.
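Solving the above formula for m gives the smallest preset number of reference values that guarantees success probability P (an illustrative sketch; the function name and the default values are assumptions):

```python
import math

def preset_reference_value_number(P=0.9, eps=0.2, p=3):
    # Smallest m with P <= 1 - [1 - (1 - eps)**p]**m, i.e. the number of
    # random samples needed so that, with probability at least P, at
    # least one sample contains only correctly matched points.
    w = (1.0 - eps) ** p          # probability one sample is all inliers
    return math.ceil(math.log(1.0 - P) / math.log(1.0 - w))
```

For example, with ε = 0.2 and p = 3 only a handful of samples is needed, while a higher outlier proportion drives m up sharply.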
Selecting a reference value with the minimum value from the preset reference value number of reference values obtained in the above steps, and using the spatial position of the target plane corresponding to the reference value with the minimum value in the preset three-dimensional coordinate system as the spatial position of the horizontal plane in the preset three-dimensional coordinate system, that is, using the plane equation of the target plane in the preset three-dimensional coordinate system adopted when the reference value with the minimum value is obtained by calculation as the plane equation of the horizontal plane in the preset three-dimensional coordinate system.
When the number of reference values equals the preset number of reference values, the spatial position of the horizontal plane in the preset three-dimensional coordinate system is calculated from those reference values. Once the spatial position of the horizontal plane of the target water area has been determined, there is no need to obtain further reference values; the subsequent step of obtaining the water antenna is performed instead, so the number of reference values never exceeds the preset number of reference values.
The shooting ranges of the first camera and the second camera generally overlap only partially rather than completely, so the first image and the second image may contain different objects; for example, a house near the bank may appear in the first image but not in the second image. A pixel point that is dissimilar to every pixel point in the second image may therefore exist in the first image; yet the matching step assigns every pixel point in the first image a corresponding pixel point in the second image, so the correspondence between some pixel points in the first image and their corresponding pixel points in the second image may be wrong.
Therefore, in order to improve the accuracy of the spatial position of the horizontal plane in the preset three-dimensional coordinate system, in one embodiment of the disclosure the sampling points whose matches are clearly incorrect are first removed from the sample set by the random sampling method above, and plane fitting is then performed on the remaining sampling points by the least squares method.
After acquiring the spatial position of the horizontal plane of the target water area in the preset three-dimensional coordinate system, as shown in fig. 5, the spatial position of the horizontal plane is further corrected by the following steps.
In step S501, a target sampling point in the sample set is searched, and a distance between the target sampling point and a spatial position of the horizontal plane in the preset three-dimensional coordinate system is smaller than a preset threshold.
The target sampling point is a sampling point with correct corresponding relation with the corresponding pixel point in the second image in the sample set, and can be obtained by the following method.
First, a threshold is calculated using the formula

σ = 1.4826 · (1 + 5/(n − p)) · √MJ

wherein n is the total number of sampling points in the sample set, MJ is the reference value adopted when the spatial position of the horizontal plane of the target water area in the preset three-dimensional coordinate system was obtained in the previous step, namely the median with the smallest value, and p is the preset number of sampling points.
Then, based on the threshold, the target sampling points in the sample set are determined according to the following formula:

Wi = 1 if ri ≤ (2.5σ)², and Wi = 0 otherwise

wherein

ri = d²(Pi, Φ)

Pi = (x, y, z, 1)' is the coordinate of the i-th sampling point in the preset three-dimensional coordinate system, Φ = (a, b, c, d)' is the spatial position of the horizontal plane of the target water area in the preset three-dimensional coordinate system obtained in the previous step, namely its plane equation, and d²(Pi, Φ) represents the square of the distance between the sampling point and the horizontal plane. When the Wi calculated for a certain sampling point according to the above formula is 1, that sampling point is determined to be a target sampling point.
In step S502, the sampling points in the sample set except the target sampling point are removed, and the spatial position of the horizontal plane in the preset three-dimensional coordinate system is obtained again by using the least square method using the sample set including only the target sampling point.
The sampling points other than the target sampling points are removed from the sample set, and the spatial position of the horizontal plane of the target water area in the preset three-dimensional coordinate system is acquired again by applying the least squares method to the remaining target sampling points in the sample set.
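Steps S501 and S502 can be sketched as follows (an illustrative sketch; the function name is an assumption, MJ is read as the median of the squared distances, and the plane coefficients are assumed to carry a unit normal so that distances need no division):

```python
import numpy as np

def refine_with_target_points(samples, phi, med_sq, p=3):
    # Robust scale sigma from the winning median, then keep only the
    # target sampling points, i.e. those whose squared distance r_i to
    # the fitted horizontal plane satisfies r_i <= (2.5*sigma)**2.
    samples = np.asarray(samples, dtype=float)
    n = len(samples)
    sigma = 1.4826 * (1.0 + 5.0 / (n - p)) * np.sqrt(med_sq)
    r = (samples @ phi[:3] + phi[3]) ** 2     # squared distances (unit normal)
    return samples[r <= (2.5 * sigma) ** 2]
```

The retained points can then be passed to an ordinary least-squares plane fit to re-acquire the horizontal plane.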
In step S105, a projection line of the horizontal plane in the first image is obtained according to the spatial position of the horizontal plane in the preset three-dimensional coordinate system, and the projection line is determined as the water antenna of the target water area in the first image.
After the spatial position of the horizontal plane in the preset three-dimensional coordinate system is obtained, the projection line of the horizontal plane in the first image is determined. The projection line is the boundary line between the horizontal plane and the sky in the image, and this boundary line is determined as the water antenna of the target water area in the first image.
The method for obtaining the projection line of the horizontal plane in the first image according to the spatial position of the horizontal plane in the preset three-dimensional coordinate system, as shown in fig. 6, may include the following steps.
In step S601, a spatial position of the optical center of the first camera in a predetermined three-dimensional coordinate system is obtained, in the embodiment of the present disclosure, the predetermined three-dimensional coordinate system is the optical center coordinate system of the first camera, and an origin of the optical center coordinate system is the optical center of the first camera.
In step S602, a spatial position of an optical center plane including the optical center of the first camera and parallel to the horizontal plane in the preset three-dimensional coordinate system is calculated using a spatial position of the optical center of the first camera in the preset three-dimensional coordinate system, that is, a spatial position of the optical center of the first camera in the optical center coordinate system, and a spatial position of the horizontal plane in the preset three-dimensional coordinate system.
As shown in FIG. 7, O is the optical center of the camera, and the coordinate system OXYZ is the optical center coordinate system of the first camera, with the optical center of the first camera as origin. φ is the horizontal plane, φf is the focal plane, and the optical center plane OAB is the plane that passes through the optical center O and is parallel to the plane φ. The water antenna of the plane φ is the projection of φ at infinity onto the focal plane, and planes parallel to each other project onto the focal plane in the same straight line; therefore the projection lines of the optical center plane OAB and of the plane φ on the image coincide, the projection line of the optical center plane OAB on the focal plane is the projection line of the plane φ in the first image, and this projection line is determined as the water antenna of the target water area in the first image.
Assuming that the equation of the plane φ is ax + by + cz + d = 0, the equation of the optical center plane OAB passing through the optical center O is ax + by + cz = 0.
In step S603, a projection line of the optical center plane in the first image is calculated by using the spatial position of the optical center plane in the preset three-dimensional coordinate system and the image coordinates of the optical center of the first camera projected in the first image, and the projection line of the optical center plane in the first image is determined as the projection line of the horizontal plane in the first image.
The focal plane φf is the plane z = f, where f is the focal length of the first camera. Substituting z = f into the equation of the optical center plane gives the equation of the projection line AB of the optical center plane OAB on the focal plane:
ax+by+cf=0
the conversion formula between the optical center coordinate system of the first camera and the image coordinate system of the first camera is as follows:
in the formula (u)0,v0) Is a projection of the optical center of the first camera in the first image. The projection equation of the straight line AB in the image coordinate system of the first camera can thus be found as:
a(u-u0)+b(v-v0)+cF=0
wherein F = f/dp is the focal length of the first camera expressed in pixels, and dp is the physical size of one pixel point of the first image in the optical center coordinate system of the first camera.
And determining the projection line of the optical center plane OAB in the first image as the projection line of the horizontal plane in the first image, and further obtaining an equation of the water antenna in an image coordinate system of the first camera. In the manner in the above-described embodiment of the present disclosure, the detected water antenna in the first image is as shown in fig. 8 and 9.
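The resulting water antenna equation can be evaluated per image column as follows (an illustrative sketch; the function name is an assumption and F is the focal length expressed in pixels):

```python
def water_antenna_row(u, phi, u0, v0, F):
    # Ordinate of the water antenna at abscissa u, solved from the
    # projection-line equation a*(u - u0) + b*(v - v0) + c*F = 0.
    a, b, c, _ = phi              # d drops out: parallel planes share the line
    if abs(b) < 1e-12:
        raise ValueError("projection line is parallel to the v axis")
    return v0 - (a * (u - u0) + c * F) / b
```

For a level camera above the water (plane normal along the Y-axis, a = c = 0), the line reduces to v = v0, i.e. the water antenna passes through the principal point row, as expected.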
Since the manner of detecting the water antenna in the first image is also applicable to detecting the water antenna in the second image, details are not described in the embodiment of the present disclosure.
Fig. 10 is a schematic structural diagram of a water antenna detection apparatus provided in an embodiment of the present disclosure, and as shown in fig. 10, the apparatus includes:
the image acquiring unit 11 is configured to acquire images obtained by shooting a target water area at the same time by a first camera and a second camera which are separated by a preset distance, respectively, where the image shot by the first camera is a first image;
the parallax value obtaining unit 12 is configured to perform stereo matching on the pixel points in the first image and the second image to obtain a parallax value of each pixel point in the first image;
a pixel coordinate determining unit 13, configured to determine a three-dimensional coordinate of each pixel in the first image in a preset three-dimensional coordinate system according to an image coordinate of each pixel in the first image and a parallax value of each pixel in the first image;
the horizontal plane determining unit 14 is configured to determine a spatial position of a horizontal plane of the target water area in a preset three-dimensional coordinate system by using three-dimensional coordinates of a plurality of pixel points in the first image in the preset three-dimensional coordinate system;
and the water antenna determining unit 15 is configured to obtain a projection line of the horizontal plane in the first image according to the spatial position of the horizontal plane in the preset three-dimensional coordinate system, and determine the projection line as the water antenna of the target water area in the first image.
In an embodiment of the present disclosure, the disparity value acquiring unit 12 in the foregoing embodiment includes:
the corresponding pixel point searching unit is used for determining a row where the pixel point is located aiming at each pixel point in the first image, searching the pixel point with the maximum similarity to the pixel point in the same row of the second image, and taking the pixel point as the corresponding pixel point corresponding to the pixel point in the second image;
and the parallax value calculating unit is used for calculating the difference between the column number of a pixel point in the first image and the column number of its corresponding pixel point in the second image, and taking the difference as the parallax value of the pixel point.
In another embodiment of the present disclosure, the apparatus in the foregoing embodiment further comprises:
the camera calibration unit is used for calibrating the first camera and the second camera before images obtained by shooting of the first camera and the second camera are acquired for the first time;
the acquisition unit is used for respectively acquiring image coordinates of the optical center of the first camera projected in the first image and a preset distance between the first camera and the second camera according to the calibration result;
and the preset three-dimensional coordinate system establishing unit is used for establishing the optical center coordinate system of the first camera by utilizing the image coordinate of the projection of the optical center of the first camera in the first image and the preset distance between the first camera and the second camera, and taking the optical center coordinate system as the preset three-dimensional coordinate system.
In another embodiment of the present disclosure, the pixel coordinate determining unit 13 in the foregoing embodiment includes:
the pixel point coordinate calculation unit is used for calculating the three-dimensional coordinates of the pixel points in a preset three-dimensional coordinate system by adopting the following formula aiming at each pixel point in the first image:
x = b(u1 − u0)/d, y = b(v1 − v0)/d, z = b·f/d

wherein (u1, v1) is the image coordinate of the pixel point in the first image, (u0, v0) is the image coordinate of the projection of the optical center of the first camera in the first image, b is the preset distance between the first camera and the second camera, f is the preset focal length of the first camera, and d is the parallax value of the pixel point relative to its corresponding pixel point in the second image.
In another embodiment of the present disclosure, the level determining unit 14 in the foregoing embodiment includes:
the sampling set forming unit is used for selecting a plurality of pixel points from the first image as sampling points, and all the sampling points form a sampling set;
a reference value quantity monitoring unit for monitoring the quantity of the reference values;
a reference value number comparing unit for comparing the number of reference values with a preset reference value number;
the target plane calculation unit is used for randomly selecting a preset sampling point number of sampling points in the sample set when the number of the reference values is smaller than the preset reference value number, and calculating the spatial position of a target plane including the preset sampling point number of the sampling points in a preset three-dimensional coordinate system by utilizing the three-dimensional coordinates of the selected preset sampling point number of the sampling points in the preset three-dimensional coordinate system;
the reference value acquisition unit is used for calculating the distance between each sampling point in the sample set and the target plane, calculating the median of all the distances, taking the median as a reference value, and increasing 1 on the number of the reference values as the number of the reference values obtained by the next monitoring;
and the horizontal plane determining subunit is used for selecting the reference value with the minimum value from the reference values with the number of the preset reference values when the number of the reference values is equal to the number of the preset reference values, and taking the spatial position of the target plane corresponding to the reference value with the minimum value in the preset three-dimensional coordinate system as the spatial position of the horizontal plane in the preset three-dimensional coordinate system.
In another embodiment of the present disclosure, the apparatus in the foregoing embodiment further comprises:
the target sampling point searching unit is used for searching a target sampling point in the sample set, and the distance between the target sampling point and the spatial position of the horizontal plane in the preset three-dimensional coordinate system is smaller than a preset threshold value;
and the horizontal plane reacquiring unit is used for removing the sampling points except the target sampling points in the sample set and reacquiring the spatial position of the horizontal plane in the preset three-dimensional coordinate system by using the sample set only comprising the target sampling points.
In another embodiment of the present disclosure, the water antenna determining unit 15 in the foregoing embodiment includes:
the optical center position acquisition unit is used for acquiring the spatial position of the optical center of the first camera in a preset three-dimensional coordinate system;
the optical center plane calculation unit is used for calculating the spatial position of an optical center plane which contains the optical center of the first camera and is parallel to the horizontal plane in the preset three-dimensional coordinate system by utilizing the spatial position of the optical center of the first camera in the preset three-dimensional coordinate system and the spatial position of the horizontal plane in the preset three-dimensional coordinate system;
and the projection line acquisition unit is used for calculating the projection line of the optical center plane in the first image by utilizing the spatial position of the optical center plane in the preset three-dimensional coordinate system and the image coordinate of the optical center of the first camera projected in the first image, and determining the projection line of the optical center plane in the first image as the projection line of the horizontal plane in the first image.
In another embodiment of the present disclosure, the apparatus in the foregoing embodiment further comprises:
and the noise reduction unit is used for performing noise reduction processing on the first image and the second image after the first image and the second image are acquired.
It is noted that, in this document, relational terms such as "first" and "second," and the like, may be used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. Also, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising an … …" does not exclude the presence of other identical elements in a process, method, article, or apparatus that comprises the element.
The foregoing are merely exemplary embodiments of the present invention, which enable those skilled in the art to understand or practice the present invention. Various modifications to these embodiments will be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other embodiments without departing from the spirit or scope of the invention. Thus, the present invention is not intended to be limited to the embodiments shown herein but is to be accorded the widest scope consistent with the principles and novel features disclosed herein.

Claims (16)

1. A water-sky line detection method, the method comprising:
respectively acquiring images of a target water area captured at the same moment by a first camera and a second camera separated by a preset distance, wherein the image captured by the first camera is a first image, the image captured by the second camera is a second image, and the shooting range of the first camera overlaps with that of the second camera;
carrying out stereo matching on pixel points in the first image and the second image to obtain a parallax value of each pixel point in the first image;
determining the three-dimensional coordinate of each pixel point in the first image in a preset three-dimensional coordinate system according to the image coordinate of each pixel point in the first image and the parallax value of each pixel point in the first image;
determining the spatial position of the horizontal plane of the target water area in the preset three-dimensional coordinate system by using the three-dimensional coordinates of a plurality of pixel points in the first image in the preset three-dimensional coordinate system;
and according to the spatial position of the horizontal plane in the preset three-dimensional coordinate system, obtaining a projection line of the horizontal plane in the first image, and determining the projection line as the water-sky line of the target water area in the first image.
2. The method according to claim 1, wherein the performing stereo matching on the pixel points in the first image and the second image to obtain the disparity value of each pixel point in the first image comprises:
for each pixel point in the first image, determining the row in which the pixel point is located, searching the same row of the second image for the pixel point with the greatest similarity to the pixel point, and taking that pixel point as the corresponding pixel point in the second image;
and calculating the difference between the column number of the pixel point in the first image and the column number of the corresponding pixel point in the second image as the parallax value of the pixel point.
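The row-wise search of claim 2 can be sketched as follows. This is an illustrative sketch, not the patentee's implementation: it uses a sum-of-absolute-differences (SAD) window as the similarity measure, and the window size `win` and search range `max_d` are assumed parameters not specified in the claim.

```python
import numpy as np

def disparity_for_pixel(left, right, r, c, win=3, max_d=64):
    """Disparity of pixel (r, c) of the first (left) image, found by searching
    the same row of the second (right) image for the window with the highest
    similarity (here: lowest sum of absolute differences)."""
    h, w = left.shape
    half = win // 2
    if r < half or r >= h - half or c < half or c >= w - half:
        return 0  # border pixels: no full window available
    patch = left[r - half:r + half + 1, c - half:c + half + 1].astype(np.int32)
    best_d, best_cost = 0, np.inf
    # the corresponding pixel sits at column c - d in the second image
    for d in range(0, min(max_d, c - half) + 1):
        cand = right[r - half:r + half + 1,
                     c - d - half:c - d + half + 1].astype(np.int32)
        cost = np.abs(patch - cand).sum()
        if cost < best_cost:
            best_cost, best_d = cost, d
    return best_d  # column difference = parallax value of the pixel
```

The parallax value is the column difference between the pixel and its match, as in the claim; a real system would compute this densely over the image and typically add sub-pixel refinement.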
3. The method of claim 1, further comprising:
calibrating the first camera and the second camera before images captured by the first camera and the second camera are acquired for the first time;
respectively obtaining image coordinates of the optical center of the first camera projected in the first image and the preset distance between the first camera and the second camera according to a calibration result;
and establishing an optical center coordinate system of the first camera by using the image coordinate of the optical center of the first camera projected in the first image and the preset distance between the first camera and the second camera, and using the optical center coordinate system as the preset three-dimensional coordinate system.
4. The method according to claim 3, wherein determining the three-dimensional coordinates of each pixel point in the first image in the preset three-dimensional coordinate system according to the image coordinates of each pixel point in the first image and the disparity value of each pixel point in the first image comprises:
for each pixel point in the first image, calculating the three-dimensional coordinate of the pixel point in the preset three-dimensional coordinate system by adopting the following formula:

x = b(u1 − u0)/d,  y = b(v1 − v0)/d,  z = b·f/d

wherein (u1, v1) are the image coordinates of the pixel point in the first image, (u0, v0) are the image coordinates of the projection of the optical center of the first camera in the first image, b is the preset distance between the first camera and the second camera, f is the preset focal length of the first camera, and d is the parallax value of the pixel point relative to the corresponding pixel point in the second image.
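The back-projection of claim 4 is the standard rectified-stereo triangulation. A sketch, under the assumption that the optical-center coordinate system of claim 3 has its x and y axes aligned with the image axes and z along the optical axis:

```python
def pixel_to_3d(u1, v1, u0, v0, b, f, d):
    """Back-project an image point to the first camera's optical-center frame.

    u1, v1 : image coordinates of the pixel point in the first image
    u0, v0 : image coordinates of the first camera's optical-center projection
    b      : preset distance (baseline) between the two cameras
    f      : preset focal length of the first camera, in pixels
    d      : parallax value of the pixel point (must be positive)
    """
    if d <= 0:
        raise ValueError("parallax value must be positive")
    z = b * f / d          # depth shrinks as parallax grows
    x = b * (u1 - u0) / d  # equivalently (u1 - u0) * z / f
    y = b * (v1 - v0) / d
    return x, y, z
```

For example, with a 0.2 m baseline, a 700-pixel focal length and a parallax of 7 pixels, the depth is 0.2 × 700 / 7 = 20 m.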
5. The method of claim 1, wherein the determining the spatial position of the horizontal plane of the target water area in the predetermined three-dimensional coordinate system by using the three-dimensional coordinates of the plurality of pixel points in the first image in the predetermined three-dimensional coordinate system comprises:
selecting a plurality of pixel points in the first image as sampling points, wherein all the sampling points form a sample set;
monitoring the number of reference values;
comparing the number of reference values with a preset number of reference values;
when the number of reference values is smaller than the preset number of reference values, randomly selecting a preset number of sampling points from the sample set, and calculating, by using the three-dimensional coordinates of the selected sampling points in the preset three-dimensional coordinate system, the spatial position of a target plane containing the selected sampling points in the preset three-dimensional coordinate system;
calculating the distance between each sampling point in the sample set and the target plane, taking the median of all the distances as one reference value, and increasing the number of reference values by 1 to serve as the number of reference values obtained at the next monitoring;
and when the number of reference values is equal to the preset number of reference values, selecting the reference value with the minimum value from among the reference values, and taking the spatial position, in the preset three-dimensional coordinate system, of the target plane corresponding to that minimum reference value as the spatial position of the horizontal plane in the preset three-dimensional coordinate system.
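The iteration of claim 5 amounts to a least-median-of-squares plane fit: each pass fits a candidate plane to a random minimal sample, scores it by the median point-to-plane distance (the claim's "reference value"), and the candidate with the smallest median wins. A sketch under assumed parameter values — an iteration count of 50 and a minimal sample of 3 points, neither of which is fixed by the claim:

```python
import numpy as np

def fit_horizontal_plane(points, n_iters=50, sample_size=3, rng=None):
    """points: (N, 3) array of sampling-point coordinates. Returns (n, d)
    with n . p + d = 0 for the plane whose median distance is smallest."""
    rng = np.random.default_rng(rng)
    pts = np.asarray(points, dtype=float)
    best, best_median = None, np.inf
    for _ in range(n_iters):              # the preset number of reference values
        idx = rng.choice(len(pts), size=sample_size, replace=False)
        p0, p1, p2 = pts[idx]
        n = np.cross(p1 - p0, p2 - p0)    # plane through the 3 sampled points
        norm = np.linalg.norm(n)
        if norm < 1e-12:                  # degenerate (collinear) sample
            continue
        n /= norm
        d = -n.dot(p0)
        dist = np.abs(pts.dot(n) + d)     # point-to-plane distances
        med = np.median(dist)             # the claim's "reference value"
        if med < best_median:
            best_median, best = med, (n, d)
    return best
```

The median score makes the estimate robust to up to roughly half the sampling points being non-water (waves, obstacles, sky), which is why the method tolerates images where no clean horizon texture is visible.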
6. The method of claim 5, further comprising:
searching a target sampling point in the sample set, wherein the distance between the target sampling point and the spatial position of the horizontal plane in the preset three-dimensional coordinate system is smaller than a preset threshold value;
and removing sampling points except the target sampling points in the sample set, and acquiring the spatial position of the horizontal plane in the preset three-dimensional coordinate system again by using the sample set only comprising the target sampling points.
7. The method according to claim 3, wherein the obtaining a projection line of the horizontal plane in the first image according to the spatial position of the horizontal plane in the preset three-dimensional coordinate system comprises:
acquiring the spatial position of the optical center of the first camera in the preset three-dimensional coordinate system;
calculating the spatial position of an optical center plane which contains the optical center of the first camera and is parallel to the horizontal plane in the preset three-dimensional coordinate system by utilizing the spatial position of the optical center of the first camera in the preset three-dimensional coordinate system and the spatial position of the horizontal plane in the preset three-dimensional coordinate system;
calculating a projection line of the optical center plane in the first image by using the spatial position of the optical center plane in the preset three-dimensional coordinate system and the image coordinates of the projection of the optical center of the first camera in the first image, and determining the projection line of the optical center plane in the first image as the projection line of the horizontal plane in the first image.
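For claim 7, the projection of a plane through the optical center is the vanishing line of all directions lying in that plane. Assuming the pinhole model and the axis convention of the earlier triangulation sketch (x, y along the image axes, z along the optical axis), a direction (X, Y, Z) images at (u0 + f·X/Z, v0 + f·Y/Z), so a plane through the optical center with unit normal (nx, ny, nz) projects to the line nx·(u − u0) + ny·(v − v0) + nz·f = 0:

```python
def horizon_line(nx, ny, nz, u0, v0, f):
    """Image line a*u + b*v + c = 0 onto which the optical-center plane with
    unit normal (nx, ny, nz) projects; this is the water-sky line estimate."""
    a = nx
    b = ny
    c = nz * f - nx * u0 - ny * v0
    return a, b, c
```

With a level camera whose vertical image axis is aligned with the plane normal (normal (0, 1, 0)), the line reduces to v = v0, a horizontal line through the principal point, as expected; camera roll and pitch tilt and shift the line accordingly.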
8. The method of claim 1, further comprising:
after the first image and the second image are acquired, noise reduction processing is performed on the first image and the second image.
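Claim 8 leaves the noise-reduction method open. One common, assumed choice for stereo preprocessing is a small separable Gaussian-like smoothing, sketched here with plain NumPy:

```python
import numpy as np

def denoise(img, k=(1, 2, 1)):
    """Separable smoothing with an assumed small binomial kernel; the claim
    does not fix a particular noise-reduction method."""
    k = np.asarray(k, dtype=float)
    k /= k.sum()
    # filter rows, then columns (separable 2-D convolution)
    out = np.apply_along_axis(lambda r: np.convolve(r, k, mode="same"), 1,
                              img.astype(float))
    out = np.apply_along_axis(lambda col: np.convolve(col, k, mode="same"), 0, out)
    return out
```

Smoothing both images identically before stereo matching suppresses sensor noise that would otherwise corrupt the window-similarity scores.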
9. A water-sky line detection apparatus, the apparatus comprising:
an image acquisition unit, configured to respectively acquire images of a target water area captured at the same moment by a first camera and a second camera separated by a preset distance, wherein the image captured by the first camera is a first image, the image captured by the second camera is a second image, and the shooting range of the first camera overlaps with that of the second camera;
the parallax value acquisition unit is used for performing stereo matching on the pixel points in the first image and the second image to obtain the parallax value of each pixel point in the first image;
the pixel point coordinate determination unit is used for determining the three-dimensional coordinate of each pixel point in the first image in a preset three-dimensional coordinate system according to the image coordinate of each pixel point in the first image and the parallax value of each pixel point in the first image;
the horizontal plane determining unit is used for determining the spatial position of the horizontal plane of the target water area in the preset three-dimensional coordinate system by utilizing the three-dimensional coordinates of the plurality of pixel points in the first image in the preset three-dimensional coordinate system;
and the water-sky line determining unit is used for obtaining a projection line of the horizontal plane in the first image according to the spatial position of the horizontal plane in the preset three-dimensional coordinate system, and determining the projection line as the water-sky line of the target water area in the first image.
10. The apparatus according to claim 9, wherein the disparity value obtaining unit includes:
a corresponding pixel point searching unit, configured to determine, for each pixel point in the first image, a row in which the pixel point is located, and search, in the same row of the second image, for a pixel point with the largest similarity to the pixel point, as a corresponding pixel point corresponding to the pixel point in the second image;
and the parallax value calculating unit is used for calculating the difference between the column number of the pixel point in the first image and the column number of the corresponding pixel point in the second image as the parallax value of the pixel point.
11. The apparatus of claim 10, further comprising:
the camera calibration unit is used for calibrating the first camera and the second camera before images obtained by shooting of the first camera and the second camera are obtained for the first time;
the acquisition unit is used for respectively acquiring image coordinates of the optical center of the first camera projected in the first image and the preset distance between the first camera and the second camera according to a calibration result;
a preset three-dimensional coordinate system establishing unit, configured to establish an optical center coordinate system of the first camera by using image coordinates of the optical center of the first camera projected in the first image and the preset distance between the first camera and the second camera, and use the optical center coordinate system as the preset three-dimensional coordinate system.
12. The apparatus of claim 11, wherein the pixel coordinate determination unit comprises:
a pixel coordinate calculation unit, configured to calculate, for each pixel point in the first image, the three-dimensional coordinate of the pixel point in the preset three-dimensional coordinate system by adopting the following formula:

x = b(u1 − u0)/d,  y = b(v1 − v0)/d,  z = b·f/d

wherein (u1, v1) are the image coordinates of the pixel point in the first image, (u0, v0) are the image coordinates of the projection of the optical center of the first camera in the first image, b is the preset distance between the first camera and the second camera, f is the preset focal length of the first camera, and d is the parallax value of the pixel point relative to the corresponding pixel point in the second image.
13. The apparatus of claim 9, wherein the level determining unit comprises:
the sample set forming unit is used for selecting a plurality of pixel points in the first image as sampling points, all the sampling points forming a sample set;
a reference value quantity monitoring unit for monitoring the quantity of the reference values;
a reference value number comparison unit for comparing the number of reference values with a preset reference value number;
the target plane calculation unit is used for randomly selecting a preset number of sampling points from the sample set when the number of reference values is smaller than the preset number of reference values, and calculating, by using the three-dimensional coordinates of the selected sampling points in the preset three-dimensional coordinate system, the spatial position of a target plane containing the selected sampling points in the preset three-dimensional coordinate system;
a reference value obtaining unit, configured to calculate the distance between each sampling point in the sample set and the target plane, take the median of all the distances as one reference value, and increase the number of reference values by 1 to serve as the number of reference values obtained at the next monitoring;
and the horizontal plane determining subunit is configured to, when the number of the reference values is equal to a preset reference value number, select a reference value with a minimum value from the preset reference value number of the reference values, and use a spatial position of the target plane in the preset three-dimensional coordinate system, which corresponds to the reference value with the minimum value, as a spatial position of the horizontal plane in the preset three-dimensional coordinate system.
14. The apparatus of claim 13, further comprising:
the target sampling point searching unit is used for searching a target sampling point in the sample set, and the distance between the target sampling point and the spatial position of the horizontal plane in the preset three-dimensional coordinate system is smaller than a preset threshold value;
and the horizontal plane reacquiring unit is used for removing the sampling points except the target sampling point in the sample set and reacquiring the spatial position of the horizontal plane in the preset three-dimensional coordinate system by using the sample set only comprising the target sampling point.
15. The apparatus of claim 11, wherein the water-sky line determination unit comprises:
an optical center position acquiring unit, configured to acquire a spatial position of an optical center of the first camera in the preset three-dimensional coordinate system;
an optical center plane calculating unit, configured to calculate, by using a spatial position of an optical center of the first camera in the preset three-dimensional coordinate system and a spatial position of the horizontal plane in the preset three-dimensional coordinate system, a spatial position of an optical center plane that includes the optical center of the first camera and is parallel to the horizontal plane in the preset three-dimensional coordinate system;
and the projection line acquisition unit is used for calculating a projection line of the optical center plane in the first image by utilizing the spatial position of the optical center plane in the preset three-dimensional coordinate system and the image coordinate of the projection of the optical center of the first camera in the first image, and determining the projection line of the optical center plane in the first image as the projection line of the horizontal plane in the first image.
16. The apparatus of claim 9, further comprising:
and the noise reduction unit is used for performing noise reduction processing on the first image and the second image after the first image and the second image are acquired.
CN201510595935.3A 2015-09-18 2015-09-18 A kind of detection of sea-level and device Active CN106558038B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201510595935.3A CN106558038B (en) 2015-09-18 2015-09-18 A kind of detection of sea-level and device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201510595935.3A CN106558038B (en) 2015-09-18 2015-09-18 A kind of detection of sea-level and device

Publications (2)

Publication Number Publication Date
CN106558038A CN106558038A (en) 2017-04-05
CN106558038B (en) 2019-07-02

Family

ID=58414233

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201510595935.3A Active CN106558038B (en) 2015-09-18 2015-09-18 A kind of detection of sea-level and device

Country Status (1)

Country Link
CN (1) CN106558038B (en)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109936704A (en) * 2017-12-18 2019-06-25 姜鹏飞 A kind of image data transparent effect processing method and processing device
CN109961455B (en) * 2017-12-22 2022-03-04 杭州萤石软件有限公司 Target detection method and device
CN112017238B (en) * 2019-05-30 2024-07-19 北京初速度科技有限公司 Method and device for determining spatial position information of linear object
CN112639881A (en) * 2020-01-21 2021-04-09 深圳市大疆创新科技有限公司 Distance measuring method, movable platform, device and storage medium
WO2021174539A1 (en) * 2020-03-06 2021-09-10 深圳市大疆创新科技有限公司 Object detection method, mobile platform, device and storage medium

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101604383A (en) * 2009-07-24 2009-12-16 哈尔滨工业大学 A kind of method for detecting targets at sea based on infrared image
CN104778695A (en) * 2015-04-10 2015-07-15 哈尔滨工程大学 Water sky line detection method based on gradient saliency

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR100886611B1 (en) * 2007-08-14 2009-03-05 한국전자통신연구원 Method and apparatus for detecting line segment by incremental pixel extension in an image


Non-Patent Citations (5)

* Cited by examiner, † Cited by third party
Title
Rapid Water-Sky-Line Detecting Algorithm in Marine Celestial Navigation; Chonghui Li et al.; Proceedings of the 3rd China Satellite Navigation Conference; 2012-12-31; full text
A water-sky line detection method based on wavelet multi-scale analysis; Pei Lili et al.; Journal of Shenyang University of Technology; 2003-04-30; Vol. 25, No. 2; full text
Research on surface target detection and tracking for unmanned surface vehicles based on optical vision; Zeng Wenjing; China Doctoral Dissertations Full-text Database, Engineering Science and Technology II; 2014-04-15; No. 4; full text
Research on digital image processing algorithms based on water-sky line detection; Zhao Ningxia et al.; Journal of East China Shipbuilding Institute (Natural Science Edition); 2005-05-31; Vol. 19, No. 1; full text
An electronic image stabilization algorithm for shipborne camera systems; Zhao Hongying et al.; Optical Technique; 2003-05-30; Vol. 29, No. 5; full text

Also Published As

Publication number Publication date
CN106558038A (en) 2017-04-05

Similar Documents

Publication Publication Date Title
CN107077743B (en) System and method for dynamic calibration of an array camera
EP3252715B1 (en) Two-camera relative position calculation system, device and apparatus
JP6663040B2 (en) Depth information acquisition method and apparatus, and image acquisition device
CN106558038B (en) A kind of detection of sea-level and device
CN108470356B (en) Target object rapid ranging method based on binocular vision
CN106960454B (en) Depth of field obstacle avoidance method and equipment and unmanned aerial vehicle
WO2018068678A1 (en) Method and device for determining external parameter of stereoscopic camera
US8538198B2 (en) Method and apparatus for determining misalignment
CN109816708B (en) Building texture extraction method based on oblique aerial image
CN105069804B (en) Threedimensional model scan rebuilding method based on smart mobile phone
JP5672112B2 (en) Stereo image calibration method, stereo image calibration apparatus, and computer program for stereo image calibration
CN104685513A (en) Feature based high resolution motion estimation from low resolution images captured using an array source
CN108345821B (en) Face tracking method and device
CN107980138A (en) A kind of false-alarm obstacle detection method and device
US10529081B2 (en) Depth image processing method and depth image processing system
CN110243390B (en) Pose determination method and device and odometer
CN116029996A (en) Stereo matching method and device and electronic equipment
US20080226159A1 (en) Method and System For Calculating Depth Information of Object in Image
CN111402345A (en) Model generation method and device based on multi-view panoramic image
CN116957987A (en) Multi-eye polar line correction method, device, computer equipment and storage medium
CN105335959B (en) Imaging device quick focusing method and its equipment
CN110800020B (en) Image information acquisition method, image processing equipment and computer storage medium
KR20160024419A (en) System and Method for identifying stereo-scopic camera in Depth-Image-Based Rendering
KR101705330B1 (en) Keypoints Selection method to Find the Viewing Angle of Objects in a Stereo Camera Image
CN109902695B (en) Line feature correction and purification method for image pair linear feature matching

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant