CN108241150B - Method for detecting and tracking moving object in three-dimensional sonar point cloud environment - Google Patents

Method for detecting and tracking moving object in three-dimensional sonar point cloud environment

Info

Publication number
CN108241150B
CN108241150B (application CN201611216488.7A)
Authority
CN
China
Prior art keywords
moving object
frame
data
sonar
candidates
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201611216488.7A
Other languages
Chinese (zh)
Other versions
CN108241150A (en)
Inventor
邓小明
杨硕
袁野
郑文勇
王宏安
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Institute of Software of CAS
Original Assignee
Institute of Software of CAS
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Institute of Software of CAS filed Critical Institute of Software of CAS
Priority to CN201611216488.7A priority Critical patent/CN108241150B/en
Publication of CN108241150A publication Critical patent/CN108241150A/en
Application granted granted Critical
Publication of CN108241150B publication Critical patent/CN108241150B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S15/00Systems using the reflection or reradiation of acoustic waves, e.g. sonar systems
    • G01S15/66Sonar tracking systems
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/52Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S15/00
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02ATECHNOLOGIES FOR ADAPTATION TO CLIMATE CHANGE
    • Y02A90/00Technologies having an indirect contribution to adaptation to climate change
    • Y02A90/30Assessment of water resources

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Measurement Of Velocity Or Position Using Acoustic Or Ultrasonic Waves (AREA)

Abstract

The invention provides a method for detecting and tracking a moving object in a three-dimensional sonar point cloud environment, comprising the following steps: 1) obtain raw sonar data D_0 from a three-dimensional sonar device and filter it to obtain filtered data D_1; 2) perform connected-region analysis on the filtered data D_1 to divide it into several connected regions, each connected region serving as a candidate C; 3) extract features of the candidate C; 4) apply steps 1) to 3) to each frame of several consecutive frames of raw sonar data to obtain a candidate set; 5) detect and track the moving object within the candidate set to obtain the motion trajectory of the moving object. When multiple object trajectories cross, the obtained motion trajectories are analyzed and corrected with a graph-based method. The method performs detection and tracking according to the spatio-temporal information of the data, can detect and track moving objects rapidly and in real time, and remains highly robust when multiple objects cross each other.

Description

Method for detecting and tracking moving object in three-dimensional sonar point cloud environment
Technical Field
The invention belongs to the field of image processing and computer vision, and particularly relates to a method for detecting and tracking a moving object in a three-dimensional sonar point cloud environment.
Background
With the recent rise in terrorist activity, large surface facilities such as harbors and offshore platforms frequently face terrorist threats because of their significant economic and strategic value. Intelligent monitoring of the underwater environment is therefore urgently needed; providing early warning of underwater emergencies and preventing terrorists from moving underwater are core tasks of such a system. Detecting and tracking underwater moving objects and recognizing their behavior from trajectories form the basis of an underwater detection and protection system: they ensure the safety of the underwater environment around surface facilities, allow objects in the underwater environment to be classified rapidly, raise early warnings in emergencies, and enable fast tracking and localization.
Many mature methods exist for object detection and tracking under optical imaging conditions, but the problem remains almost unexplored in underwater environments. Because sonar suffers from low imaging resolution, severe reverberation noise, and poorly defined edges, sonar-based detection and tracking of underwater moving objects is very difficult.
Disclosure of Invention
The invention aims to provide a method for detecting and tracking a moving object in a three-dimensional sonar point cloud environment that improves detection and tracking accuracy while ensuring robustness and practicality.
In order to achieve the above purpose, the invention adopts the following technical scheme:
a method for detecting and tracking a moving object in a three-dimensional sonar point cloud environment comprises the following steps:
1) Obtain raw sonar data D_0 from a three-dimensional sonar device and filter it to obtain filtered data D_1;
2) Perform connected-region analysis on the filtered data D_1 to divide it into several connected regions, each connected region serving as a candidate C;
3) Extract features of the candidate C;
4) Apply steps 1) to 3) to each frame of several consecutive frames of raw sonar data to obtain a candidate set;
5) Detect and track the moving object within the candidate set to obtain the motion trajectory of the moving object.
Further, in step 1) the raw sonar data contain the three-dimensional position of each point and the intensity of its sonar reflection, and are stored as a two-dimensional matrix of dimension (N_points, R^4), where N_points is the number of points acquired by the sonar and R^4 denotes the four components (x, y, z, q): (x, y, z) is the three-dimensional position of each point and q is the reflection intensity of that point.
Further, in step 1) the filtering deletes points with low sonar reflection intensity and retains the top X percent of points with the highest reflection intensity: all points are sorted by reflection intensity in descending order and the intensity of the point at the top-X-percent boundary is taken as the filtering threshold, where X typically ranges from 40 to 60.
Further, the connected-region analysis in step 2) proceeds as follows:
2-1) compute the Euclidean distance between every pair of points, d(i, j) = sqrt((x_i - x_j)^2 + (y_i - y_j)^2 + (z_i - z_j)^2);
2-2) if the distance d(i, j) between two points is less than T_CC, the two points are connected, where T_CC is the minimum connection distance threshold, chosen by jointly considering the statistics of the actual data and the practical results;
2-3) all mutually connected points form one connected region.
Further, the features of the candidate C in step 3) comprise its position L, average intensity Q, and number of points N; the position is the centroid of all points of the region, and the average intensity is the mean of the reflection intensities of all points.
Further, the method for detecting a moving object in step 5) comprises the following steps:
5-1) at frame t, for each candidate, search back over the preceding K frames, where K is the length of the time window and is typically set to the sonar frame rate;
5-2) perform a nearest-neighbor search for each candidate; whenever a nearest-neighbor candidate is found, continue the search recursively towards earlier frames until frame t-K is reached, yielding a motion trajectory Tr; stop if no nearest-neighbor candidate exists;
5-3) compare the Euclidean distance between the positions of the candidates at the two ends of the trajectory Tr, namely C^t and C^(t-K); if this displacement is greater than T_min, the trajectory Tr is the trajectory of a moving object, otherwise the trajectory Tr is discarded, where T_min is a preset threshold whose choice depends on the statistics of the actual data and on the value of K chosen in step 5-1).
Still further, the nearest-neighbor search in step 5-2) proceeds as follows: compute the distance d(C_i^(t-1), C_j^t) between each candidate C_i^(t-1) of frame t-1 and the candidate C_j^t of frame t, and take the minimum d_min = min_{i=1..n} d(C_i^(t-1), C_j^t), where n is the number of candidates of frame t-1; if d_min < DT_max, the two are nearest-neighbor candidates, where DT_max is a preset threshold chosen according to the practical results.
Further, the distance measure considers not only the Euclidean distance between the two candidates but also the difference between their average intensities, i.e. d(C_i^(t-1), C_j^t) = d_E(C_i^(t-1), C_j^t) + ΔQ(C_i^(t-1), C_j^t), where d_E is the Euclidean distance described in step 2-1), ΔQ is the difference between the average intensities of the two candidates, Q_i^(t-1) and Q_j^t are the average intensities of C_i^(t-1) and C_j^t respectively, and ||·||_2 denotes the 2-norm, used here as a normalization term.
Further, the tracking of the moving object in step 5) comprises the following steps:
5-4) at frame t, if a connected region C^t lies at the end of a motion trajectory Tr, apply the nearest-neighbor search described in 5-2) to all candidates of frame t+1 and append the found nearest neighbor C^(t+1) of C^t to the end of the trajectory Tr;
5-5) apply the moving-object detection method above to all candidates of frame t+1 that were not added to any trajectory, so as to detect new moving objects.
Further, when multiple object trajectories cross, the obtained motion trajectories are analyzed and corrected with a graph-based method: the candidates of all trajectories at frame t-1 and the candidates of all trajectories at frame t form a bipartite graph G, the pairwise similarity of the candidates is used as the weight of the bipartite-graph edges, and the bipartite matching problem is solved by integer optimization, expressed mathematically as

max Σ_i Σ_j c_ij f_ij
s.t. Σ_j f_ij + p_i = 1, i = 1, …, m
     Σ_i f_ij + q_j = 1, j = 1, …, n
     f_ij, p_i, q_j ∈ {0, 1}

where f_ij is a binary variable indicating whether tr_i(t-1) is connected to tr_j(t), c_ij is the similarity measure of the two nodes, and p_i and q_j are slack variables that allow some nodes to be treated as outliers that do not take part in the matching.
The invention has the following beneficial effects: it provides a method for detecting and tracking a moving object in a three-dimensional sonar point cloud environment, a new approach to moving-object detection and tracking based on active sonar. Detection and tracking are carried out according to the spatio-temporal information of the data, moving objects can be detected and tracked rapidly and in real time in complex environments, and the method remains highly robust when multiple objects cross each other. In underwater environments in particular, the invention provides a key technology for underwater-equipment protection systems and can be used to analyze, detect, and track underwater threats such as frogmen and underwater robots.
Drawings
Fig. 1 is a flow chart of the steps of the method of the present invention.
Fig. 2 is a diagram illustrating the connected-region analysis of the method of the present invention.
Fig. 3 is an exemplary diagram of moving object tracking in accordance with the method of the present invention.
Fig. 4 is a diagram showing an example of the graph-based analysis and correction of the present invention.
Fig. 5 is a diagram showing a specific effect example of an embodiment of the method of the present invention.
Detailed Description
In order to make the above features and advantages of the present invention more comprehensible, embodiments accompanied with figures are described in detail below.
The invention provides a method for detecting and tracking a moving object in a three-dimensional sonar point cloud environment, which comprises the following steps:
step 1: raw sonar data D is obtained from three-dimensional sonar equipment 0 The three-dimensional position of the point and the reflection intensity of the point to the sonar are included; wherein the original sonar data format is in dimension (N points ,R 4 ) Is a two-dimensional matrix of N points Is the number of points acquired by sonar, R 4 Representing the four element components (x, y, z, q), (x, y, z) is the three-dimensional position of each point, q is the reflected intensity of that point on the sonar.
Step 2: for the original sonar data D 0 Filtering, deleting point data with small sonar reflection intensity to reduce search amount, and obtaining filtered data D 1 . The filtering means deleting point data with small sonar reflection intensity, and retaining point data of the front X percent with large reflection intensity, namely sorting all points according to the reflection intensity from large to small, taking the intensity value of the point at the front X percent as a filtering critical value, wherein the value range of X is generally 40 to 60.
Step 3: first to D 1 Carrying out communication area analysis, dividing a plurality of communication areas, wherein each communication area is used as a candidate C, and the communication areas from the point cloud to the point cloud are shown in fig. 2, (a) is the original point data, and (b) is the communication area obtained through analysis; the communication area analysis method comprises the following steps:
1) Compute the Euclidean distance between every pair of points, d(i, j) = sqrt((x_i - x_j)^2 + (y_i - y_j)^2 + (z_i - z_j)^2);
2) If the distance d(i, j) between two points is less than T_CC, the two points are connected, otherwise they are not, where T_CC is the minimum connection distance threshold, chosen by jointly considering the statistics of the actual data and the practical results;
3) All mutually connected points form one connected region; a code sketch of this grouping is given below.
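A minimal sketch of this grouping, assuming a union-find over all point pairs whose distance is below T_CC; the helper names and the plain O(N^2) pairwise loop are illustrative assumptions (a KD-tree radius search would serve equally well for large point clouds).

    import numpy as np

    def connected_regions(points: np.ndarray, t_cc: float) -> list:
        """Group points (shape (N, 4), columns x, y, z, q) into connected regions.

        Two points are connected when the Euclidean distance of their (x, y, z)
        coordinates is below t_cc; each returned array is one candidate region C.
        """
        n = len(points)
        parent = list(range(n))

        def find(i):
            # Union-find root lookup with path compression.
            while parent[i] != i:
                parent[i] = parent[parent[i]]
                i = parent[i]
            return i

        xyz = points[:, :3]
        for i in range(n):
            # Distances from point i to all later points (O(N^2) overall).
            d = np.linalg.norm(xyz[i + 1:] - xyz[i], axis=1)
            for j in np.nonzero(d < t_cc)[0] + i + 1:
                parent[find(int(j))] = find(i)  # merge the two components
        labels = np.array([find(i) for i in range(n)])
        return [points[labels == r] for r in np.unique(labels)]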
Secondly, extract the features of C: its position L, average intensity Q, and size (number of points) N. The position of a connected region is the centroid of all its points, L = (1/N) Σ_{i=1..N} (x_i, y_i, z_i), where (x_i, y_i, z_i) is the three-dimensional position of a point as described in Step 1; the average intensity of a connected region is the mean of the reflection intensities of all its points, Q = (1/N) Σ_{i=1..N} q_i, where q_i is the reflection intensity of a point as described in Step 1.
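The features of a candidate can then be computed directly from the points of its region. This sketch mirrors the centroid, mean-intensity, and point-count definitions above; the Candidate container is an assumed helper, not part of the original method.

    from dataclasses import dataclass
    import numpy as np

    @dataclass
    class Candidate:
        position: np.ndarray   # centroid L of the region, shape (3,)
        intensity: float       # average reflection intensity Q
        size: int              # number of points N

    def extract_features(region: np.ndarray) -> Candidate:
        """region: points of one connected region, shape (N, 4), columns (x, y, z, q)."""
        return Candidate(
            position=region[:, :3].mean(axis=0),
            intensity=float(region[:, 3].mean()),
            size=len(region),
        )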
Step 4: for continuous multi-frame original sonar data D m ,D m+1 ,…,D n The processing of step 1) to step 3) is performed on each frame data of (a) to (b) to obtain a candidate set { C } m } s ,{C m+1 } s ,…,{C n } s The method comprises the steps of carrying out a first treatment on the surface of the Detection and tracking of the moving object are performed in the candidate item set, so as to obtain a motion track of the moving object, and an example diagram of the tracking of the moving object is shown in fig. 3, wherein t-1 and t represent time points. The moving object detection method includes the steps of:
1) At frame t, for each candidate C^t, search back over the preceding K frames, where K is the length of the time window and is typically set to the sonar frame rate;
2) Perform a nearest-neighbor search for each candidate; if a nearest-neighbor candidate is found, continue the search recursively towards earlier frames until frame t-K is reached, yielding a motion trajectory Tr; if no nearest-neighbor candidate is found, stop;
3) Compare the Euclidean distance between the positions of the candidates at the two ends of the trajectory Tr, namely C^t and C^(t-K); if this displacement is greater than the threshold T_min, the trajectory Tr is the trajectory of a moving object, otherwise the trajectory Tr is discarded, where T_min is a preset threshold whose choice depends on the statistics of the actual data and on the value of K chosen in 1) above.
The nearest-neighbor search in 2) proceeds as follows:
2-1) Compute the distance d(C_i^(t-1), C_j^t) between each candidate C_i^(t-1) of frame t-1 and the candidate C_j^t of frame t, and take the minimum d_min = min_{i=1..n} d(C_i^(t-1), C_j^t), where n is the number of candidates of frame t-1; if d_min < DT_max, the two are nearest-neighbor candidates, where DT_max is a preset threshold chosen according to the practical results.
2-2) The distance measure used in 2-1) is computed as d(C_i^(t-1), C_j^t) = d_E(C_i^(t-1), C_j^t) + ΔQ(C_i^(t-1), C_j^t), where d_E is the Euclidean distance described in Step 3, ΔQ is the difference between the average intensities of the two candidates, Q_i^(t-1) and Q_j^t are the average intensities of C_i^(t-1) and C_j^t respectively, and ||·||_2 denotes the 2-norm, used here as a normalization term; a sketch of this nearest-neighbor step is given below.
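A sketch of this nearest-neighbor step, reusing the assumed Candidate container from the feature-extraction sketch above. The sum of centroid distance and intensity difference follows the measure of 2-2); the function names and the unweighted combination of the two terms are illustrative assumptions.

    import numpy as np

    def candidate_distance(a: Candidate, b: Candidate) -> float:
        # Distance of 2-2): Euclidean centroid distance plus average-intensity difference.
        d_e = float(np.linalg.norm(a.position - b.position))
        return d_e + abs(a.intensity - b.intensity)

    def nearest_neighbor(cand: Candidate, other_frame: list, dt_max: float):
        # Return the closest candidate of the other frame, or None if farther than DT_max.
        if not other_frame:
            return None
        best = min(other_frame, key=lambda c: candidate_distance(cand, c))
        return best if candidate_distance(cand, best) < dt_max else None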
The moving-object tracking method comprises the following steps:
1) At frame t, if a connected region C^t lies at the end of a motion trajectory Tr, apply the nearest-neighbor search above to all candidates of frame t+1; if the nearest neighbor of C^t is C^(t+1), append C^(t+1) to the end of the trajectory Tr;
2) Apply the moving-object detection method above to all candidates of frame t+1 that were not added to any trajectory, so as to detect new moving objects; a combined sketch of the detection and tracking steps is given below.
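The following sketch ties the two procedures together: it builds a trajectory by recursive backward nearest-neighbor search (detection steps 1) to 3)) and extends an existing trajectory by one frame (tracking step 1)). The frame container, the threshold argument names, and the list-of-candidates representation are assumptions made for illustration.

    import numpy as np

    def detect_track(frames: list, t: int, cand: Candidate,
                     k: int, dt_max: float, t_min: float):
        """Search back over the preceding k frames starting from `cand` at frame t.

        frames[i] is the candidate list of frame i. Returns the trajectory
        (oldest candidate first) if the displacement between its two ends
        exceeds T_min, i.e. it belongs to a moving object, otherwise None.
        """
        track = [cand]
        current = cand
        for frame_idx in range(t - 1, max(t - k - 1, -1), -1):
            nn = nearest_neighbor(current, frames[frame_idx], dt_max)
            if nn is None:
                break  # no nearest-neighbor candidate: stop the recursion
            track.append(nn)
            current = nn
        track.reverse()
        displacement = float(np.linalg.norm(track[-1].position - track[0].position))
        return track if displacement > t_min else None

    def extend_track(track: list, next_frame: list, dt_max: float) -> bool:
        # Tracking step 1): append the nearest neighbor of the track end, if one exists.
        nn = nearest_neighbor(track[-1], next_frame, dt_max)
        if nn is None:
            return False
        track.append(nn)
        return True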
The method for detecting and tracking a moving object provided by the invention further comprises the following step: when multiple object trajectories cross, the obtained motion trajectories are analyzed and corrected with a graph-based method to resolve the crossing and mis-connection of the trajectories; an example of this graph-based analysis and correction is shown in Fig. 4. Trajectory correction applies to the trajectories produced by the tracking method of Step 4: when several object trajectories cross, the incorrectly connected trajectories of different objects are corrected. Concretely, this is cast as a matching problem between tr_i(t-1) and tr_j(t), where tr_i(t-1) denotes the candidate of the i-th trajectory at frame t-1, m denotes the number of trajectories at frame t-1, tr_j(t) denotes the candidate of the j-th trajectory at frame t, and n denotes the number of trajectories at frame t. The candidates of all trajectories at frame t-1 and the candidates of all trajectories at frame t form a bipartite graph G, and the pairwise similarity of the candidates is used as the weight of the bipartite-graph edges. The bipartite matching problem is solved by integer optimization, expressed mathematically as

max Σ_i Σ_j c_ij f_ij
s.t. Σ_j f_ij + p_i = 1, i = 1, …, m
     Σ_i f_ij + q_j = 1, j = 1, …, n
     f_ij, p_i, q_j ∈ {0, 1}

where f_ij is a binary variable indicating whether tr_i(t-1) is connected to tr_j(t), c_ij is the similarity measure of the two nodes, and p_i and q_j are slack variables that allow some nodes to be treated as outliers that do not take part in the matching.
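One common way to solve a matching problem with outlier slack of this kind is the Hungarian algorithm on a cost matrix padded with dummy nodes. The sketch below takes that route via scipy.optimize.linear_sum_assignment instead of a general integer-programming solver; this substitution, the conversion of similarities c_ij into costs, and the single outlier_cost parameter are assumptions made for illustration, not the formulation stated above.

    import numpy as np
    from scipy.optimize import linear_sum_assignment

    def match_track_ends(cost: np.ndarray, outlier_cost: float):
        """Match m trajectory ends at frame t-1 to n trajectory candidates at frame t.

        cost[i, j] is the dissimilarity of the pair (e.g. the negated similarity c_ij).
        The matrix is padded with dummy rows and columns so that a real pair is kept
        only when it is cheaper than sending both of its ends to dummies; the dummy
        assignments play the role of the slack variables p_i and q_j above.
        Returns the list of matched (i, j) index pairs.
        """
        m, n = cost.shape
        size = m + n
        padded = np.full((size, size), float(outlier_cost))
        padded[:m, :n] = cost
        padded[m:, n:] = 0.0  # dummy-to-dummy pairings are free
        rows, cols = linear_sum_assignment(padded)
        return [(int(i), int(j)) for i, j in zip(rows, cols) if i < m and j < n]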
With the method provided by the invention, moving objects can be detected and tracked rapidly and in real time in complex underwater environments. The effect of one embodiment of the invention is shown in Fig. 5: (a) is a three-dimensional rendering of the original data, from which it can be seen that the point cloud of the moving object cannot be effectively distinguished in the raw data; (b) shows the motion trajectory obtained over a period of time together with the object point cloud at one instant, illustrating a frogman moving from right to left.
The above embodiments are only for illustrating the technical solution of the present invention and not for limiting it, and those skilled in the art may modify or substitute the technical solution of the present invention without departing from the spirit and scope of the present invention, and the protection scope of the present invention shall be defined by the claims.

Claims (8)

1. A method for detecting and tracking a moving object in a three-dimensional sonar point cloud environment comprises the following steps:
1) obtaining raw sonar data D_0 from a three-dimensional sonar device and filtering it to obtain filtered data D_1;
2) performing connected-region analysis on the filtered data D_1 to divide it into several connected regions, each connected region serving as a candidate C;
3) extracting features of the candidate C;
4) applying steps 1) to 3) to each frame of several consecutive frames of raw sonar data to obtain a candidate set;
5) detecting and tracking the moving object within the candidate set to obtain the motion trajectory of the moving object;
the method for detecting the moving object in step 5) comprising the following steps:
5-1) at frame t, for each candidate, searching back over the preceding K frames;
5-2) performing a nearest-neighbor search for each candidate; whenever a nearest-neighbor candidate is found, continuing the search recursively towards earlier frames until frame t-K is reached, yielding a motion trajectory Tr; stopping if no nearest-neighbor candidate exists;
5-3) comparing the Euclidean distance between the positions of the candidates at the two ends of the trajectory Tr, namely C^t and C^(t-K); if this displacement is greater than T_min, the trajectory Tr is the trajectory of a moving object, otherwise the trajectory Tr is discarded, where T_min is a preset threshold;
wherein the nearest-neighbor search in step 5-2) is: computing the distance d(C_i^(t-1), C_j^t) between each candidate C_i^(t-1) of frame t-1 and the candidate C_j^t of frame t, and taking the minimum d_min = min_{i=1..n} d(C_i^(t-1), C_j^t), where n is the number of candidates of frame t-1; if d_min < DT_max, the two are nearest-neighbor candidates, where DT_max is a preset threshold.
2. The method of claim 1, wherein the raw sonar data in step 1) contain the three-dimensional position of each point and the intensity of its sonar reflection, and are stored as a two-dimensional matrix of dimension (N_points, R^4), where N_points is the number of points acquired by the sonar and R^4 denotes the four components (x, y, z, q): (x, y, z) is the three-dimensional position of each point and q is the reflection intensity of that point.
3. The method of claim 1, wherein the filtering in step 1) deletes points with low sonar reflection intensity and retains the top X percent of points with the highest reflection intensity.
4. The method of claim 1, wherein the connected-region analysis in step 2) comprises the following steps:
2-1) computing the Euclidean distance between every pair of points, d(i, j) = sqrt((x_i - x_j)^2 + (y_i - y_j)^2 + (z_i - z_j)^2);
2-2) if the distance d(i, j) between two points is less than T_CC, the two points are connected, where T_CC is the minimum connection distance threshold;
2-3) all mutually connected points form one connected region.
5. The method of claim 1, wherein the features of the candidate C in step 3) include its position L, average intensity Q, and number of points N; the position is the centroid of all points of the region, and the average intensity is the mean of the reflection intensities of all points.
6. The method of claim 1, wherein the distance measure considers not only the Euclidean distance between the two candidates but also the difference between their average intensities, i.e. d(C_i^(t-1), C_j^t) = d_E(C_i^(t-1), C_j^t) + ΔQ(C_i^(t-1), C_j^t), where d_E is the Euclidean distance described in step 2-1), ΔQ is the difference between the average intensities of the two candidates, Q_i^(t-1) and Q_j^t are the average intensities of C_i^(t-1) and C_j^t respectively, and ||·||_2 denotes the 2-norm, used here as a normalization term.
7. The method of claim 1, wherein the tracking of the moving object in step 5) comprises the following steps:
5-4) at frame t, if a connected region C^t lies at the end of a motion trajectory Tr, applying the nearest-neighbor search of step 5-2) to all candidates of frame t+1 and appending the found nearest neighbor C^(t+1) of C^t to the end of the trajectory Tr;
5-5) applying the moving-object detection method to all candidates of frame t+1 that were not added to any trajectory, so as to detect new moving objects.
8. The method of claim 1, wherein when multiple object trajectories cross, the obtained motion trajectories are analyzed and corrected with a graph-based method: the candidates of all trajectories at frame t-1 and the candidates of all trajectories at frame t form a bipartite graph G, the pairwise similarity of the candidates is used as the weight of the bipartite-graph edges, and the bipartite matching problem is solved by integer optimization, expressed mathematically as

max Σ_i Σ_j c_ij f_ij
s.t. Σ_j f_ij + p_i = 1, i = 1, …, m
     Σ_i f_ij + q_j = 1, j = 1, …, n
     f_ij, p_i, q_j ∈ {0, 1}

where f_ij is a binary variable indicating whether tr_i(t-1) is connected to tr_j(t), c_ij is the similarity measure of the two nodes, and p_i and q_j are slack variables that allow some nodes to be treated as outliers that do not take part in the matching.
CN201611216488.7A 2016-12-26 2016-12-26 Method for detecting and tracking moving object in three-dimensional sonar point cloud environment Active CN108241150B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201611216488.7A CN108241150B (en) 2016-12-26 2016-12-26 Method for detecting and tracking moving object in three-dimensional sonar point cloud environment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201611216488.7A CN108241150B (en) 2016-12-26 2016-12-26 Method for detecting and tracking moving object in three-dimensional sonar point cloud environment

Publications (2)

Publication Number Publication Date
CN108241150A CN108241150A (en) 2018-07-03
CN108241150B true CN108241150B (en) 2023-05-30

Family

ID=62704852

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201611216488.7A Active CN108241150B (en) 2016-12-26 2016-12-26 Method for detecting and tracking moving object in three-dimensional sonar point cloud environment

Country Status (1)

Country Link
CN (1) CN108241150B (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109308457B (en) * 2018-08-31 2022-03-29 华南理工大学 Multi-particle three-dimensional tracking method under high concentration
CN109409792B (en) * 2018-09-25 2020-02-04 深圳蓝胖子机器人有限公司 Object tracking detection method and system based on point cloud
CN111123274B (en) * 2019-12-27 2021-12-28 苏州联视泰电子信息技术有限公司 Target detection method of underwater sonar imaging system
CN113702979B (en) * 2021-07-16 2024-03-15 中国船舶重工集团公司第七一五研究所 Cross-region target tracking track segment space-time splicing method

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6449215B1 (en) * 2001-10-09 2002-09-10 The United States Of America As Represented By The Secretary Of The Navy Three-dimensional imaging system for sonar system
US9305364B2 (en) * 2013-02-19 2016-04-05 Caterpillar Inc. Motion estimation systems and methods
CN103197308B (en) * 2013-03-15 2015-04-01 浙江大学 Three-dimensional sonar visualization processing method based on multi-beam phased array sonar system
CN104751119A (en) * 2015-02-11 2015-07-01 中国科学院大学 Rapid detecting and tracking method for pedestrians based on information fusion
CN105182350B (en) * 2015-09-26 2017-10-31 哈尔滨工程大学 A kind of multibeam sonar object detection method of application signature tracking
CN105785349B (en) * 2016-05-09 2017-12-26 浙江大学 A kind of noise remove method of phased array three-dimensional acoustics image pickup sonar

Also Published As

Publication number Publication date
CN108241150A (en) 2018-07-03

Similar Documents

Publication Publication Date Title
CN108447080B (en) Target tracking method, system and storage medium based on hierarchical data association and convolutional neural network
Tzannes et al. Detecting small moving objects using temporal hypothesis testing
CN108241150B (en) Method for detecting and tracking moving object in three-dimensional sonar point cloud environment
CN110738690A (en) unmanned aerial vehicle video middle vehicle speed correction method based on multi-target tracking framework
CN110782483A (en) Multi-view multi-target tracking method and system based on distributed camera network
CN106910205A (en) A kind of multi-object tracking method based on the coupling of stochastic finite collection wave filter
Soleimanitaleb et al. Single object tracking: A survey of methods, datasets, and evaluation metrics
CN110555868A (en) method for detecting small moving target under complex ground background
Al-Shakarji et al. Robust multi-object tracking with semantic color correlation
Cancela et al. Unsupervised trajectory modelling using temporal information via minimal paths
Han et al. A method based on multi-convolution layers joint and generative adversarial networks for vehicle detection
Makino et al. Moving-object detection method for moving cameras by merging background subtraction and optical flow methods
CN118196715A (en) Multi-target tracking method based on pseudo depth estimation and online track classification
Najafzadeh et al. Object tracking using Kalman filter with adaptive sampled histogram
Kumar et al. Saliency based shape extraction of objects in unconstrained underwater environment
Wang et al. Low-slow-small target tracking using relocalization module
Singh et al. A greedy data association technique for multiple object tracking
KR20150081797A (en) Apparatus and method for tracking object
CN103093481A (en) Moving object detection method under static background based on watershed segmentation
Altundogan et al. Multiple object tracking with dynamic fuzzy cognitive maps using deep learning
Franchi et al. Tracking hundreds of people in densely crowded scenes with particle filtering supervising deep convolutional neural networks
Panda et al. Blending of Learning-based Tracking and Object Detection for Monocular Camera-based Target Following
Satya Sujith et al. Optimal support vector machine and hybrid tracking model for behaviour recognition in highly dense crowd videos
Ghosh et al. Detecting closely spaced and occluded pedestrians using specialized deep models for counting
Qin et al. Target tracking method based on interference detection

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant