Disclosure of Invention
In view of the above drawbacks of the prior art, an object of the present invention is to provide a method/system, a computer-readable storage medium, and a device for processing a pedestrian trajectory, which are used to solve the problems in the prior art that a pedestrian movement trajectory cannot be stably and accurately determined and the tracking accuracy of a passenger flow counting system is not high in the case that an actual application scenario is complicated and changeable.
To achieve the above and other related objects, an aspect of the present invention provides a pedestrian trajectory line processing method, including: acquiring image data to be processed; detecting the image data to be processed, extracting a pedestrian detection frame from the image data to be processed, and generating scattered trajectory lines by using pedestrian trajectory information contained in the pedestrian detection frame; block-tracking the pedestrian detection frame to generate tracking trajectory lines, the scattered trajectory lines being connected by the tracking trajectory lines to form initial pedestrian trajectory lines;
and refining the initial pedestrian trajectory line to obtain a real pedestrian trajectory line in the image data to be processed.
In an embodiment of the present invention, in the process of acquiring the image data to be processed, the pedestrian trajectory processing method includes:
acquiring image data from an image acquisition device within a preset acquisition time; and calculating a foreground image of the image data to acquire image data to be processed comprising the image data and the foreground image.
In an embodiment of the present invention, the step of detecting the image data to be processed and extracting the pedestrian detection frame from the image data to be processed includes: performing pedestrian detection based on histogram of oriented gradients (HOG) features on all image data in the image data to be processed so as to detect the pedestrian detection frames of the image data; and screening the pedestrian detection frames by using the foreground proportion of the foreground image so as to extract the pedestrian detection frames whose foreground proportion is greater than a predetermined foreground proportion threshold.
In an embodiment of the present invention, the step of performing pedestrian detection based on histogram of oriented gradients (HOG) features on all image data in the image data to be processed includes: carrying out color space normalization processing on the image data to form preprocessed image data, the color space normalization processing comprising image graying and Gamma correction of the image data; acquiring the gradient of the preprocessed image data and its gradient direction; dividing the preprocessed image data into a plurality of image units and counting a gradient histogram for each image unit; combining a plurality of image units into image blocks and connecting the feature vectors of all the image units in each image block in series to obtain the HOG feature of the image block; combining all the HOG features in the image data to form a feature vector that represents the image data; and classifying the feature vector of the image data with a support vector machine (SVM) classifier to detect the pedestrian detection frame of the image data, the pedestrian detection frame comprising the size and the confidence of the currently detected pedestrian detection frame and/or the current detection moment within the predetermined acquisition time.
In an embodiment of the present invention, the step of generating the scattered trajectory lines by using the pedestrian trajectory information included in the pedestrian detection frame includes: connecting the pedestrian detection frames that conform to a connection rule together by using the pedestrian trajectory information contained in the pedestrian detection frames to form scattered trajectory lines; the scattered trajectory lines contain pedestrian trajectory line information within the predetermined acquisition time and include attribute information of each scattered trajectory line at the current detection moment, the attribute information comprising the position coordinates of the i-th scattered trajectory line at the current detection moment, the confidence of the i-th scattered trajectory line, and the size of the currently detected pedestrian detection frame; wherein the connection rule includes: the time difference between the pedestrian detection frame at the current moment and the pedestrian detection frame at the previous moment is within three frames; the change of the movement direction of the pedestrian detection frame between the two moments does not exceed a predetermined angle; and the confidence of the pedestrian detection frame at each moment is not lower than a confidence threshold.
In an embodiment of the present invention, the step of block-tracking the pedestrian detection frame to generate the tracking trajectory line includes:
performing image segmentation on the pedestrian detection frame at the current moment according to the head, the body, the left arm, the right arm and the legs to form a head-shoulder block, a body block, a left-arm block, a right-arm block and a leg block, wherein the pedestrian detection frame at the current moment is the one at the tail end of a scattered trajectory line; tracking the head-shoulder block in a first tracking mode and tracking the body block, the left-arm block, the right-arm block and the leg block respectively in a second tracking mode, so as to acquire the positions of the update blocks of the head-shoulder block, the body block, the left-arm block, the right-arm block and the leg block in the pedestrian detection frame at the next moment; calculating the center of the pedestrian detection frame at the next moment according to the positions of the update blocks of the head-shoulder block, the body block, the left-arm block, the right-arm block and the leg block in the pedestrian detection frame at the next moment and the relative displacements of the head-shoulder block, the body block, the left-arm block, the right-arm block and the leg block from the center of the pedestrian detection frame at the current moment; calculating the offset of each update block from the center of the pedestrian detection frame at the next moment; if the offsets of the head-shoulder block, the body block, the left-arm block, the right-arm block and the leg block from the center of the pedestrian detection frame at the current moment and the offsets of each update block from the center of the pedestrian detection frame at the next moment respectively satisfy a predetermined offset determination condition, correcting each update block to be the initial position of the next block tracking; executing the above steps cyclically, connecting the updated pedestrian detection frames after the updating of the pedestrian detection frame is finished to form a tracking trajectory line, and searching among the updated pedestrian detection frames for the head end of another scattered trajectory line matching the tail end of the tracking trajectory line; if such a head end is found, connecting the tracking trajectory line with that scattered trajectory line and continuing the block tracking from the tail end of that scattered trajectory line; otherwise, continuing to update the pedestrian detection frame.
In an embodiment of the present invention, the calculation formula for calculating the center of the pedestrian detection frame at the next moment is: C = Σ(y_i + d_i) × w_i; wherein C represents the center of the pedestrian detection frame at the next moment; i represents the serial numbers of the head-shoulder block, the body block, the left-arm block, the right-arm block and the leg block; y_i represents the positions of the update blocks of the head-shoulder block, the body block, the left-arm block, the right-arm block and the leg block in the pedestrian detection frame at the next moment; d_i represents the relative displacements of the head-shoulder block, the body block, the left-arm block, the right-arm block and the leg block from the center of the pedestrian detection frame at the current moment; and w_i represents the weights corresponding to the head-shoulder block, the body block, the left-arm block, the right-arm block and the leg block.
In an embodiment of the present invention, the predetermined offset determination condition relates z_i and d_i, wherein z_i represents the offset of the update block of the head-shoulder block, the body block, the left-arm block, the right-arm block or the leg block from the center of the pedestrian detection frame at the next moment, and d_i represents the relative displacement of the head-shoulder block, the body block, the left-arm block, the right-arm block or the leg block from the center of the pedestrian detection frame at the current moment.
In an embodiment of the invention, the pedestrian trajectory processing method further includes marking the state of the scattered trajectory lines as a completed state when the scattered trajectory lines are connected pairwise by the generated tracking trajectory lines or are connected with the tracking trajectory lines only, and otherwise marking the state as an uncompleted state.
In an embodiment of the present invention, the step of refining the pedestrian trajectory line to obtain a real pedestrian trajectory line in the image data to be processed includes: smoothing the initial pedestrian trajectory line; cutting off the initial pedestrian trajectory line after the smoothing treatment; and according to a preset reconnection judgment condition, reconnecting the cut initial pedestrian trajectory line to form a real pedestrian trajectory line.
In an embodiment of the invention, the step of smoothing the initial pedestrian trajectory line includes: and calculating a spline curve by taking the coordinate points on the initial pedestrian trajectory line as control points, and enabling the spline curve to replace the initial pedestrian trajectory line.
In an embodiment of the invention, the step of cutting the smoothed initial pedestrian trajectory line includes: setting a sliding window, and judging whether the length of the spline curve is greater than a predetermined window length; if not, not cutting the spline curve; if yes, executing the next step; intercepting a part of the spline curve with the predetermined window length, acquiring the average movement direction of the first half of the intercepted spline curve and the average movement direction of the second half, and, if the included angle between the two average movement directions is greater than a predetermined cut-off included angle, setting the point where the two halves meet as a cut-off point to form cut-off lines; and sliding the sliding window backwards by the predetermined window length, and returning to judge whether the remaining length of the spline curve is greater than the predetermined window length.
In an embodiment of the present invention, the step of reconnecting the cut initial pedestrian trajectory line according to a predetermined reconnection determination condition to form a real pedestrian trajectory line includes: setting the cutting line to be in an initial state;
searching for two cutting lines in the initial state, judging whether the cutting lines in the initial state satisfy the predetermined reconnection determination condition, and if so, connecting the cutting lines pairwise to form a real pedestrian trajectory line; if not, continuing the search.
In an embodiment of the present invention, the predetermined reconnection determination condition includes: a time determination condition: the tail-end time of the cutting line searched first is prior to the head-end time of the cutting line searched later; a movement direction included angle determination condition: the included angle between the movement direction of the cutting line searched first and the movement direction of the cutting line searched later is smaller than a predetermined movement direction included angle; if the time determination condition and the movement direction included angle determination condition are satisfied, the cutting line searched later is determined to be a candidate connecting line; and a distance determination condition: searching among the candidate connecting lines for the cutting line for which the distance between the tail end of the cutting line searched first and the head end of the candidate connecting line is smaller than a predetermined distance threshold.
In another aspect, the present invention provides a pedestrian trajectory line processing system, including: the acquisition module is used for acquiring image data to be processed; the detection module is used for detecting the image data to be processed, extracting a pedestrian detection frame from the image data to be processed, and generating scattered track lines by using pedestrian track information contained in the pedestrian detection frame; a track line initial forming module, which is used for tracking the pedestrian detection frame in blocks to generate tracking track lines, and connecting the scattered track lines through the tracking track lines to form initial pedestrian track lines; and the refining module is used for performing refining processing on the initial pedestrian trajectory line so as to obtain a real pedestrian trajectory line in the image data to be processed.
Yet another aspect of the present invention provides a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, implements the pedestrian trajectory line processing method.
A final aspect of the invention provides an apparatus comprising: a processor and a memory; the memory is used for storing a computer program, and the processor is used for executing the computer program stored in the memory, so as to cause the apparatus to execute the pedestrian trajectory line processing method.
As described above, the pedestrian trajectory line processing method/system, the computer-readable storage medium, and the device of the present invention have the following advantageous effects:
the pedestrian trajectory processing method/system, the computer-readable storage medium and the equipment select to process the pedestrian movement trajectory in a time period at one time, can effectively improve the stability of the pedestrian movement trajectory, keep high tracking precision for scenes in which a large number of pedestrians simultaneously appear, and obtain the high-quality pedestrian movement trajectory.
Detailed Description
The embodiments of the present invention are described below with reference to specific embodiments, and other advantages and effects of the present invention will be easily understood by those skilled in the art from the disclosure of the present specification. The invention is capable of other and different embodiments and of being practiced or of being carried out in various ways, and its several details are capable of modification in various respects, all without departing from the spirit and scope of the present invention. It is to be noted that the features in the following embodiments and examples may be combined with each other without conflict.
It should be noted that the drawings provided in the following embodiments are only for illustrating the basic idea of the present invention, and the components related to the present invention are only shown in the drawings rather than drawn according to the number, shape and size of the components in actual implementation, and the type, quantity and proportion of the components in actual implementation may be changed freely, and the layout of the components may be more complicated.
Example one
The embodiment provides a pedestrian trajectory processing method, which comprises the following steps:
acquiring image data to be processed;
detecting the image data to be processed, extracting a pedestrian detection frame from the image data to be processed, and generating scattered trajectory lines by using pedestrian trajectory information contained in the pedestrian detection frame;
block-tracking the pedestrian detection frame to generate tracking trajectory lines, the scattered trajectory lines being connected by the tracking trajectory lines to form initial pedestrian trajectory lines;
and refining the initial pedestrian trajectory line to obtain a real pedestrian trajectory line in the image data to be processed.
The pedestrian trajectory line processing method provided by the present embodiment will be described in detail below with reference to the drawings. Referring to fig. 1A, a flow chart of a pedestrian trajectory line processing method in an embodiment is shown. As shown in fig. 1A, the pedestrian trajectory line processing method includes the following steps:
S11, acquiring the image data to be processed. In the present embodiment, the image data to be processed is obtained by decoding a video stream acquired by an image data acquisition apparatus (e.g., a video camera).
Specifically, this step includes acquiring image data I_T from the image acquisition device within a predetermined acquisition time T, and calculating the foreground image FG_T of the image data I_T, so as to obtain the image data to be processed D = {I_T, FG_T}, which includes the image data and the foreground image. In the present embodiment, the foreground image FG_T of the image data I_T is calculated using a single Gaussian model. Single-Gaussian background modeling is a commonly used background modeling algorithm, in which an independent single-Gaussian distribution model is established for the color distribution of each pixel in the image.
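For illustration only, the following is a minimal NumPy sketch of the per-pixel single-Gaussian background modeling described above; the learning rate, the deviation threshold and the initial variance are illustrative assumptions rather than values from the disclosure.

```python
import numpy as np

class SingleGaussianBackground:
    """Per-pixel single-Gaussian background model: each pixel keeps a
    running mean and variance; pixels far from the mean are foreground."""

    def __init__(self, first_frame, alpha=0.01, k=2.5):
        f = first_frame.astype(np.float32)
        self.mean = f.copy()
        self.var = np.full_like(f, 15.0 ** 2)  # initial variance (assumed)
        self.alpha = alpha                     # learning rate (assumed)
        self.k = k                             # threshold in std deviations (assumed)

    def apply(self, frame):
        f = frame.astype(np.float32)
        diff = f - self.mean
        # a pixel is foreground if it deviates more than k standard deviations
        fg = (diff ** 2) > (self.k ** 2) * self.var
        # update mean/variance only for background pixels
        bg = ~fg
        self.mean[bg] += self.alpha * diff[bg]
        self.var[bg] = (1 - self.alpha) * self.var[bg] + self.alpha * diff[bg] ** 2
        return fg.astype(np.uint8) * 255       # binary foreground mask FG_T
```

A per-frame call such as `fg_t = model.apply(gray_frame)` then yields the foreground image paired with the frame in D = {I_T, FG_T}.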
S12, detecting the image data to be processed, extracting a pedestrian detection frame from the image data to be processed, and generating scattered track lines by using pedestrian track information contained in the pedestrian detection frame.
Please refer to fig. 1B, which shows a schematic flow chart of S12. As shown in fig. 1B, the S12 includes the following steps:
and S121, performing pedestrian detection based on histogram of oriented gradient features (HOG features) on all image data in the image data to be processed to detect a pedestrian detection frame of the image data.
Specifically, the S121 includes the following:
firstly, carrying out color space normalization processing on the image data to form preprocessed image data; the color space normalization process includes image graying and gamma correction of the image data.
In the present embodiment, image graying refers to converting RGB components into a grayscale image for a color image, and the conversion formula is:
Gray = 0.3R + 0.59G + 0.11B    Formula (1)
Gamma correction is a process of increasing or decreasing the overall brightness of the image when the illumination of the image is not uniform. In practice, Gamma normalization can be performed in two different ways: the square-root method or the logarithmic method. For example, with the square-root method, the formula is as follows (where γ = 0.5):
Y(x, y) = I(x, y)^γ    Formula (2)
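A minimal sketch of these two preprocessing steps, Formula (1) followed by square-root Gamma correction per Formula (2); scaling the gray image to [0, 1] before correction is an assumption, not something the text specifies.

```python
import numpy as np

def normalize_color_space(rgb, gamma=0.5):
    """Color space normalization: graying (Formula 1), then Gamma
    correction by the square-root method (Formula 2, gamma = 0.5)."""
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    gray = 0.3 * r + 0.59 * g + 0.11 * b      # Gray = 0.3R + 0.59G + 0.11B
    gray = gray / 255.0                        # scale to [0, 1] (assumed)
    corrected = np.power(gray, gamma)          # Y(x, y) = I(x, y)^gamma
    return (corrected * 255.0).astype(np.uint8)
```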
Second, the gradient of the preprocessed image data and its gradient direction are obtained. In this embodiment, the calculation is performed in the horizontal and vertical directions respectively, and the gradient operators are: horizontal direction: [-1, 0, 1]; vertical direction: [-1, 0, 1]^T.
G_x(x, y) = I(x+1, y) - I(x-1, y)    Formula (3)
G_y(x, y) = I(x, y+1) - I(x, y-1)    Formula (4)
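A minimal NumPy sketch of Formulas (3) and (4); the magnitude and orientation expressions follow the standard HOG definitions, which the text does not reproduce here.

```python
import numpy as np

def image_gradients(gray):
    """Gradients with the [-1, 0, 1] operator (Formulas 3 and 4),
    plus the standard HOG magnitude and unsigned orientation."""
    g = gray.astype(np.float32)
    gx = np.zeros_like(g)
    gy = np.zeros_like(g)
    gx[:, 1:-1] = g[:, 2:] - g[:, :-2]   # G_x(x, y) = I(x+1, y) - I(x-1, y)
    gy[1:-1, :] = g[2:, :] - g[:-2, :]   # G_y(x, y) = I(x, y+1) - I(x, y-1)
    magnitude = np.hypot(gx, gy)
    orientation = np.rad2deg(np.arctan2(gy, gx)) % 180  # unsigned direction
    return magnitude, orientation
```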
Thirdly, dividing the preprocessed image data into a plurality of image units (cells), and counting a gradient histogram of each image unit.
Fourthly, a plurality of image units are combined into image blocks (blocks), and the feature vectors of all the image units in each image block are connected in series to obtain the directional gradient histogram feature (HOG feature) of the image block.
And fifthly, combining all the directional gradient histogram features in the image data to form a feature vector of the image data, wherein the feature vector is used for representing the image data.
Sixthly, the feature vector of the image data is classified by using a support vector machine (SVM) classifier to detect the pedestrian detection frame P_i of the image data. The pedestrian detection frame contains pedestrian trajectory information, i.e., the size of the currently detected pedestrian detection frame, its confidence, and/or the current detection moment t within the predetermined acquisition time T, where the size refers to the pedestrian detection frame size at the current detection moment t and the confidence refers to the confidence of the pedestrian detection frame at the current detection moment t.
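For illustration, OpenCV's built-in HOG descriptor and its default people-detector SVM can stand in for the HOG + SVM pipeline described above; the disclosure does not specify OpenCV, and the scanning parameters below are assumptions.

```python
import cv2

def detect_pedestrians(frame):
    """HOG + linear-SVM pedestrian detection, returning (x, y, w, h, score)
    tuples that play the role of the pedestrian detection frames P_i."""
    hog = cv2.HOGDescriptor()
    hog.setSVMDetector(cv2.HOGDescriptor_getDefaultPeopleDetector())
    boxes, weights = hog.detectMultiScale(frame, winStride=(8, 8),
                                          padding=(8, 8), scale=1.05)
    if len(boxes) == 0:
        return []
    results = []
    for (x, y, w, h), score in zip(boxes, weights.ravel()):
        results.append((int(x), int(y), int(w), int(h), float(score)))
    return results
```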
S122, screening the pedestrian detection frames by using the foreground image FG_T to extract the pedestrian detection frames whose foreground proportion is greater than a predetermined foreground proportion threshold.
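A minimal sketch of this foreground-proportion screening; the threshold value 0.4 is an illustrative placeholder, since the text only refers to a predetermined foreground proportion threshold.

```python
import numpy as np

def screen_by_foreground(boxes, fg_mask, th_fg=0.4):
    """Keep only detection frames whose foreground proportion inside the
    box exceeds th_fg (threshold value assumed for illustration)."""
    kept = []
    for (x, y, w, h, conf) in boxes:
        roi = fg_mask[y:y + h, x:x + w]
        if roi.size == 0:
            continue
        proportion = np.count_nonzero(roi) / roi.size
        if proportion > th_fg:
            kept.append((x, y, w, h, conf))
    return kept
```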
S123, connecting the pedestrian detection frames that conform to the connection rule together by using the pedestrian trajectory information contained in the pedestrian detection frames to form scattered trajectory lines. The scattered trajectory lines contain pedestrian trajectory line information within the predetermined acquisition time and include: K scattered trajectory lines and the attribute information of each scattered trajectory line at the current detection moment, the attribute information including the position coordinates of the i-th scattered trajectory line at the current detection moment, the confidence of the i-th scattered trajectory line, and the size of the currently detected pedestrian detection frame.
Wherein the connection rule includes:
ensuring that the time difference between the pedestrian detection frame at the current moment and the pedestrian detection frame at the previous moment is within three frames;
the change of the front and back movement direction of the pedestrian detection frame does not exceed a predetermined angle (in the present embodiment, the predetermined angle is 60 degrees);
the confidence of the pedestrian detection frame at each time is not lower than the confidence threshold (in this embodiment, the confidence threshold is 0.9). The connection rule in particular records each zeroLocation of head and tail of stray trajectory
Head coordinates for the ith scattered trajectory line,
is the end coordinate of the ith scattered trajectory line.
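A minimal sketch of the connection rule, using the thresholds stated in this embodiment (three frames, 60 degrees, confidence 0.9); the dictionary layout of a detection/trajectory record is an assumption made for illustration.

```python
import math

def can_connect(prev, curr, max_frame_gap=3, max_angle_deg=60.0, conf_th=0.9):
    """Connection rule sketch: prev and curr are dicts with keys
    't' (frame index), 'center' (x, y), 'direction' (vector of the
    trajectory so far) and 'conf'. The dict layout is assumed."""
    if curr['t'] - prev['t'] > max_frame_gap:              # within three frames
        return False
    if curr['conf'] < conf_th or prev['conf'] < conf_th:   # confidence threshold
        return False
    dx = curr['center'][0] - prev['center'][0]
    dy = curr['center'][1] - prev['center'][1]
    new_dir = math.atan2(dy, dx)
    old_dir = math.atan2(prev['direction'][1], prev['direction'][0])
    diff = abs(math.degrees(new_dir - old_dir)) % 360
    diff = min(diff, 360 - diff)                           # direction change
    return diff <= max_angle_deg
```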
S13, block-tracking the pedestrian detection frame to generate tracking trajectory lines, and connecting the scattered trajectory lines through the tracking trajectory lines to form initial pedestrian trajectory lines. Please refer to fig. 1C, which shows a schematic flow chart of S13. As shown in fig. 1C, the S13 specifically includes the following steps:
and S131, performing image segmentation on the pedestrian detection frame at the current moment according to the head, the body, the left arm, the right arm and the legs to form a head-shoulder segment, a left arm segment, a right arm segment, a body segment and a leg segment. Wherein, the pedestrian detection frame at the current moment has the tail end of a scattered track line. Please refer to fig. 2, which is a block diagram of the pedestrian detection frame. As shown in fig. 2, the pedestrian detection frame R is divided into a head-shoulder section R1, a left-arm section R2, a right-arm section R3, a body-part section R4, a leg section R5, and a center C of the pedestrian detection frame.
S132, the head-shoulder block R1 is tracked in a first tracking manner, and the left-arm block R2, the right-arm block R3, the body block R4 and the leg block R5 are each tracked in a second tracking manner, so as to obtain the positions Y = {y_i} of the update blocks of the head-shoulder block, the body block, the left-arm block, the right-arm block and the leg block in the pedestrian detection frame at the next moment, where i denotes the serial numbers of the head-shoulder block R1, the left-arm block R2, the right-arm block R3, the body block R4 and the leg block R5. In this embodiment, the first tracking manner uses the KCF tracking algorithm, and the second tracking manner uses the particle filter tracking algorithm. This is a compromise between accuracy and computation speed: the KCF tracking algorithm is superior to the particle filter tracking algorithm in tracking accuracy, and since the appearance information of the head-shoulder block is rich and not easily occluded, the KCF tracking algorithm is used to track the head-shoulder block. In the present embodiment, the relative displacements d_i of the head-shoulder block R1, the left-arm block R2, the right-arm block R3, the body block R4 and the leg block R5 from the center of the pedestrian detection frame R are also recorded.
S133, calculating the center of the pedestrian detection frame at the next moment according to the positions Y = {y_i} of the update blocks of the head-shoulder block, the body block, the left-arm block, the right-arm block and the leg block in the pedestrian detection frame at the next moment, and the relative displacements d_i of the head-shoulder block, the body block, the left-arm block, the right-arm block and the leg block from the center of the pedestrian detection frame at the current moment. In this embodiment, the calculation formula for the center of the pedestrian detection frame at the next moment is:
C = Σ(y_i + d_i) × w_i    Formula (7)
wherein C represents the center of the pedestrian detection frame at the next moment; i represents the serial numbers of the head-shoulder block, the body block, the left-arm block, the right-arm block and the leg block; y_i represents the positions of the update blocks of the head-shoulder block, the body block, the left-arm block, the right-arm block and the leg block in the pedestrian detection frame at the next moment; d_i represents the relative displacements of the head-shoulder block, the body block, the left-arm block, the right-arm block and the leg block from the center of the pedestrian detection frame at the current moment; and w_i represents the weights corresponding to the head-shoulder block, the body block, the left-arm block, the right-arm block and the leg block. The weight of the head-shoulder block is twice the weight of each of the other blocks, and Σw_i = 1.
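A minimal sketch of Formula (7); the weight vector shown satisfies the stated constraints (head-shoulder weight twice the others, weights summing to 1) but is otherwise illustrative.

```python
def frame_center_next(y, d, w):
    """Formula (7): C = sum_i (y_i + d_i) * w_i, where y and d are lists
    of (x, y) update-block positions / relative displacements and w are
    the per-block weights."""
    cx = sum((yi[0] + di[0]) * wi for yi, di, wi in zip(y, d, w))
    cy = sum((yi[1] + di[1]) * wi for yi, di, wi in zip(y, d, w))
    return cx, cy

# Weights satisfying the stated constraint: head-shoulder = 2/6, others = 1/6 each.
weights = [2 / 6, 1 / 6, 1 / 6, 1 / 6, 1 / 6]
```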
S134, calculating the offset of each update block from the center of the pedestrian detection frame at the next moment. As shown in FIG. 2, z_1 to z_5 represent the offsets of the update blocks of the head-shoulder block, the body block, the left-arm block, the right-arm block and the leg block from the center of the pedestrian detection frame at the next moment.
S135, if the offsets of the head-shoulder block, the body block, the left-arm block, the right-arm block and the leg block from the center of the pedestrian detection frame at the current moment and the offsets of each update block from the center of the pedestrian detection frame at the next moment respectively satisfy the predetermined offset determination condition, correcting each update block to be the initial position of the next block tracking. In this embodiment, the predetermined offset determination condition relates z_i and d_i, wherein z_i represents the offset of the update block of the head-shoulder block, the body block, the left-arm block, the right-arm block or the leg block from the center of the pedestrian detection frame at the next moment, and d_i represents the relative displacement of the head-shoulder block, the body block, the left-arm block, the right-arm block or the leg block from the center of the pedestrian detection frame at the current moment.
S136, executing the above steps cyclically; after the updating of the pedestrian detection frame is finished, connecting the updated pedestrian detection frames to form a tracking trajectory line, and searching among the updated pedestrian detection frames for the head end of another scattered trajectory line matching the tail end of the tracking trajectory line; if such a head end is found, connecting the tracking trajectory line with that scattered trajectory line and continuing the block tracking from the tail end of that scattered trajectory line; otherwise, continuing to update the pedestrian detection frame. In the process of connecting the scattered trajectory lines, the state of a scattered trajectory line is marked as a completed state when it is connected pairwise through the generated tracking trajectory lines or is connected with a tracking trajectory line only; when all scattered trajectory lines are marked as completed, the trajectory line connection stage is finished and the initial pedestrian trajectory lines are formed; otherwise, the state is marked as uncompleted. Please refer to fig. 3, which shows a schematic connection diagram of the scattered trajectory lines. As shown in fig. 3, h1 and h2 represent the pedestrians corresponding to the pedestrian detection frames f1 and f2. The solid lines s1, s2 and s3 represent the scattered trajectory lines generated in S12, and the dashed lines j1, j2 and j3 represent the tracking trajectory lines generated in S13. The head end and the tail end of each dashed line connect to the ends of the solid lines; for example, q12 and q31 correspond to the head end and the tail end of the tracking trajectory line j1, so the dashed line j1 connects the scattered trajectory lines s1 and s3, and block tracking then continues from the tail end q32 of the solid line s3 to produce the tracking trajectory line j3. If no further scattered trajectory line can be connected, the connection process ends; for example, the tracking trajectory lines j2 and j3 end at e2 and e1, respectively.
FIG. 4 is a schematic diagram of an initial pedestrian trajectory without refinement. As shown in fig. 4, S1 and S2 are initial pedestrian trajectory lines that require refinement processing.
And S14, refining the initial pedestrian trajectory line to obtain a real pedestrian trajectory line in the image data to be processed. Referring to fig. 1D, which is a schematic flow chart of S14, as shown in fig. 1D, the S14 specifically includes the following steps:
and S141, smoothing the initial pedestrian trajectory line.
Specifically, a B-spline curve is calculated by taking the coordinate points on the initial pedestrian trajectory line as control points, and the spline curve replaces the initial pedestrian trajectory line. If the number of control points is m, the order is set to n = ceil(m × 0.8) in order to obtain a low-order spline curve. Replacing the initial pedestrian trajectory line with the spline curve achieves the purpose of smoothing the initial pedestrian trajectory line.
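A minimal sketch of the spline smoothing step. SciPy's splprep limits the spline degree to 5, so a cubic spline is used here as a stand-in for the order rule n = ceil(m × 0.8) described above; the smoothing factor is an assumption.

```python
import numpy as np
from scipy.interpolate import splprep, splev

def smooth_trajectory(points):
    """Replace a trajectory polyline with a B-spline evaluated at the same
    number of points; cubic degree is used in place of the order rule."""
    pts = np.asarray(points, dtype=float)
    m = len(pts)
    if m < 4:
        return pts                       # too few control points to smooth
    k = min(3, m - 1)                    # cubic where possible
    tck, _ = splprep([pts[:, 0], pts[:, 1]], s=float(m), k=k)
    u = np.linspace(0.0, 1.0, m)
    xs, ys = splev(u, tck)
    return np.stack([xs, ys], axis=1)
```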
And S142, cutting the initial pedestrian trajectory line after the smoothing treatment.
Specifically, the step of cutting the smoothed initial pedestrian trajectory line includes:
setting a sliding window, and judging whether the length of the spline curve is greater than the preset window length w; if not, not cutting off the spline curve; if yes, executing the next step;
A part of the spline curve with the predetermined window length w is intercepted, and the average movement direction of the first half of the intercepted spline curve (the front w/2, i.e., w1) and the average movement direction of the second half (the rear w/2, i.e., w2) are obtained; if the included angle between the two average movement directions is greater than a predetermined cut-off included angle (in the present embodiment, the predetermined cut-off included angle is 60 degrees), the point where the two halves meet (the point w0) is set as a cut-off point, so that cut-off lines are formed, as shown in fig. 5 with the trajectory line cut. In the present embodiment, the average movement direction is obtained as follows: for two consecutive points p1(x1, y1) and p2(x2, y2), the movement direction is represented by the vector V12 = (x2 - x1, y2 - y1), and the average movement direction is the average of these vectors.
And sliding the sliding window backwards by a preset window length w, and returning to judge whether the length of the spline curve after the sliding of the preset window length is greater than the preset window length.
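A minimal sketch of this sliding-window cut-off step; the window length of 20 points and the representation of the curve as an ordered point array are illustrative assumptions.

```python
import math
import numpy as np

def cut_points(curve, w=20, cut_angle_deg=60.0):
    """Within each window of length w (in points), compare the average
    movement direction of the first half with that of the second half and
    mark the midpoint w0 as a cut-off point when the included angle
    exceeds cut_angle_deg."""
    pts = np.asarray(curve, dtype=float)
    cuts = []
    start = 0
    while len(pts) - start > w:
        window = pts[start:start + w]
        v1 = np.mean(np.diff(window[: w // 2], axis=0), axis=0)  # first half
        v2 = np.mean(np.diff(window[w // 2:], axis=0), axis=0)   # second half
        cos = np.dot(v1, v2) / (np.linalg.norm(v1) * np.linalg.norm(v2) + 1e-9)
        angle = math.degrees(math.acos(np.clip(cos, -1.0, 1.0)))
        if angle > cut_angle_deg:
            cuts.append(start + w // 2)                          # the point w0
        start += w                                               # slide one window
    return cuts
```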
In the present embodiment, if one of the initial pedestrian trajectory lines is cut, each resulting cut-off line has a state that is either an initial state or a hold state.
And S143, according to preset reconnection judgment conditions, reconnecting the cut initial pedestrian trajectory line to form a real pedestrian trajectory line. In reconnection, reconnection judgment is performed on all the cut trajectory lines, and the cut lines meeting the preset reconnection judgment condition are connected pairwise. This step completes the optimization processing of the staggered pedestrian trajectory line, reducing the probability of the generation of the wrong trajectory line. Meanwhile, the integrity of the pedestrian motion track is ensured by effectively connecting the unconnected track lines.
Specifically, the cutting lines are set to an initial state; in this embodiment, the initial state refers to an unconnected state.
Searching two cutting lines S1 and S2 in an initial state, judging whether the cutting lines in the initial state meet the preset reconnection judgment condition, if so, connecting the cutting lines in pairs to form a real pedestrian trajectory line; if not, continuing searching.
The preset reconnection determination condition includes:
time determination conditions: the tail end time of the cutting line searched first is prior to the head end time of the cutting line searched later;
movement direction included angle determination condition: the included angle between the movement direction of the cutting line searched first and the movement direction of the cutting line searched later is smaller than a predetermined movement direction included angle th_theta;
if the time judgment condition and the motion direction included angle judgment condition are met, judging the cut-off lines searched later as candidate connecting lines;
distance determination conditions: and searching for the cutting-off line of which the distance between the tail end of the cutting-off line searched first and the head end of the candidate connecting line is smaller than a preset distance threshold th _ dist in the candidate connecting lines.
Specifically, the process of determining whether the cutting line in the initial state satisfies the preset reconnection determination condition is:
1) Search for a cutting line S1 in the initial state; if such a line exists, execute step 2); if not, the reconnection process ends. The aim is to match the cutting lines pairwise such that the tail-end time of one cutting line is prior to the head-end time of the other, and the distance between the tail end of the former and the head end of the latter is smaller than the predetermined distance threshold (i.e., the minimum distance threshold th_dist). In this embodiment, all cutting lines are traversed, and if the state of a cutting line is the initial state, a cutting line S1 has been successfully found.
2) Searching another cutting line S2 in an initial state, wherein the tail end time of the cutting line S1 searched first is prior to the head end time of the cutting line S2 searched later; if yes, entering the next step 3), and if not, executing the step 4).
3) Calculate the movement direction at the tail end of the cutting line S1 and the movement direction at the head end of the cutting line S2, and judge whether they satisfy the movement direction included angle determination condition; if so, add the cutting line S2 to the candidate cutting lines of the cutting line S1, record the distance d between the tail end of S1 and the head end of S2, and return to step 2).
4) If the cutting line S1 has candidate cutting lines, search among them for the candidate cutting line S2 whose distance d between the tail end of S1 and the head end of S2 is the smallest and below the distance threshold th_dist. If such a candidate cutting line S2 exists, connect the two cutting lines to obtain a new trajectory line S1', set the state of the new trajectory line S1' to the initial state, and thus connect the two cutting lines to form a real pedestrian trajectory line, as shown in fig. 6, which shows the refined pedestrian trajectory lines S1' and S2'; if no such candidate exists, the cutting line S1 is set to the hold state, and the procedure returns to step 1).
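A minimal sketch of the pairwise reconnection procedure of steps 1) to 4); the threshold values th_theta and th_dist and the dictionary layout of a cutting line are illustrative assumptions, since the text names the thresholds but gives no numbers.

```python
import math

def try_reconnect(cut_lines, th_theta=30.0, th_dist=50.0):
    """Each cutting line is a dict with 'pts' (ordered (x, y) points),
    't_head', 't_tail' (times) and 'state'. The layout is assumed."""
    def direction(p, q):
        return math.atan2(q[1] - p[1], q[0] - p[0])

    for s1 in cut_lines:
        if s1['state'] != 'initial':
            continue
        candidates = []
        for s2 in cut_lines:
            if s2 is s1 or s2['state'] != 'initial':
                continue
            if s1['t_tail'] >= s2['t_head']:            # time condition
                continue
            a1 = direction(s1['pts'][-2], s1['pts'][-1])
            a2 = direction(s2['pts'][0], s2['pts'][1])
            diff = abs(math.degrees(a1 - a2)) % 360
            if min(diff, 360 - diff) >= th_theta:       # included-angle condition
                continue
            dist = math.dist(s1['pts'][-1], s2['pts'][0])
            candidates.append((dist, s2))
        if candidates:
            dist, s2 = min(candidates, key=lambda c: c[0])
            if dist < th_dist:                          # distance condition
                s1['pts'] = s1['pts'] + s2['pts']       # connect the two lines
                s2['state'] = 'connected'
                continue
        s1['state'] = 'hold'                            # no match found
    return cut_lines
```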
The present embodiment also provides a computer-readable storage medium on which a computer program is stored, which when executed by a processor, implements the above-described pedestrian trajectory line processing method. Those of ordinary skill in the art will understand that: all or part of the steps for implementing the above method embodiments may be performed by hardware associated with a computer program. The aforementioned computer program may be stored in a computer readable storage medium. When executed, the program performs steps comprising the method embodiments described above; and the aforementioned storage medium includes: various media that can store program codes, such as ROM, RAM, magnetic or optical disks.
The pedestrian trajectory processing method and the computer-readable storage medium in the embodiment select to process the pedestrian motion trajectory in a time period at one time, so that the stability of the pedestrian motion trajectory can be effectively improved, the high tracking accuracy of scenes where a large number of pedestrians simultaneously appear is maintained, and the high-quality pedestrian motion trajectory is obtained.
Example two
The present embodiment provides a pedestrian trajectory line processing system, including:
the acquisition module is used for acquiring image data to be processed;
the detection module is used for detecting the image data to be processed, extracting a pedestrian detection frame from the image data to be processed, and generating scattered track lines by using pedestrian track information contained in the pedestrian detection frame;
a track line initial forming module, which is used for tracking the pedestrian detection frame in blocks to generate tracking track lines, and connecting the scattered track lines through the tracking track lines to form initial pedestrian track lines;
and the refining module is used for performing refining processing on the initial pedestrian trajectory line so as to obtain a real pedestrian trajectory line in the image data to be processed.
The pedestrian trajectory line processing system provided by the present embodiment will be described in detail below with reference to the drawings. It should be noted that the division of the modules of the above apparatus is only a logical division, and the actual implementation may be wholly or partially integrated into one physical entity, or may be physically separated. And these modules can be realized in the form of software called by processing element; or may be implemented entirely in hardware; and part of the modules can be realized in the form of calling software by the processing element, and part of the modules can be realized in the form of hardware. For example, the x module may be a processing element that is set up separately, or may be implemented by being integrated in a chip of the apparatus, or may be stored in a memory of the apparatus in the form of program code, and the function of the x module may be called and executed by a processing element of the apparatus. Other modules are implemented similarly. In addition, all or part of the modules can be integrated together or can be independently realized. The processing element described herein may be an integrated circuit having signal processing capabilities. In implementation, each step of the above method or each module above may be implemented by an integrated logic circuit of hardware in a processor element or an instruction in the form of software.
For example, the above modules may be one or more integrated circuits configured to implement the above methods, such as one or more application specific integrated circuits (ASICs), one or more digital signal processors (DSPs), or one or more field programmable gate arrays (FPGAs). For another example, when one of the above modules is implemented in the form of program code scheduled by a processing element, the processing element may be a general-purpose processor, such as a central processing unit (CPU) or another processor capable of calling program code. For another example, these modules may be integrated together and implemented in the form of a system-on-a-chip (SoC).
Referring to fig. 7, a schematic structural diagram of a pedestrian trajectory processing system in one embodiment is shown. As shown in fig. 7, the pedestrian trajectory line processing system 7 includes: an acquisition module 71, a detection module 72, a trajectory line initial forming module 73 and a refinement module 74.
The acquisition module 71 acquires the image data to be processed. In the present embodiment, the image data to be processed is obtained by decoding a video stream acquired by an image data acquisition apparatus (e.g., a video camera).
Specifically, the acquisition module 71 acquires the image data I_T from the image acquisition device within the predetermined acquisition time T and calculates the foreground image FG_T of the image data I_T, so as to obtain the image data to be processed D = {I_T, FG_T}, which includes the image data and the foreground image.
The detection module 72 coupled to the acquisition module 71 detects the image data to be processed, extracts a pedestrian detection frame from the image data to be processed, and generates a scattered trajectory line by using pedestrian trajectory information included in the pedestrian detection frame.
Specifically, the detection module 72 is configured to perform pedestrian detection based on histogram of oriented gradients (HOG) features on all image data in the image data to be processed to detect the pedestrian detection frames of the image data; to screen the pedestrian detection frames by using the foreground image FG_T to extract the pedestrian detection frames whose foreground proportion is greater than a predetermined foreground proportion threshold; and to generate scattered trajectory lines by using the pedestrian trajectory information contained in the pedestrian detection frames, i.e., to connect the pedestrian detection frames that conform to the connection rule together to form scattered trajectory lines. The scattered trajectory lines contain pedestrian trajectory line information within the predetermined acquisition time and include: K scattered trajectory lines and the attribute information of each scattered trajectory line at the current detection moment, the attribute information including the position coordinates of the i-th scattered trajectory line at the current detection moment, the confidence of the i-th scattered trajectory line, and the size of the currently detected pedestrian detection frame.
Wherein the connection rule includes:
ensuring that the time difference between the pedestrian detection frame at the current moment and the pedestrian detection frame at the previous moment is within three frames;
the change of the front and back movement direction of the pedestrian detection frame does not exceed a predetermined angle (in the present embodiment, the predetermined angle is 60 degrees);
The confidence of the pedestrian detection frame at each moment is not lower than the confidence threshold (in this embodiment, the confidence threshold is 0.9). The connection rule in particular records the position of the head and the tail of each scattered trajectory line, i.e., the head coordinate of the i-th scattered trajectory line and the tail coordinate of the i-th scattered trajectory line.
The trajectory line initial forming module 73, coupled to the acquisition module 71 and the detection module 72, is configured to block-track the pedestrian detection frame to generate tracking trajectory lines and to connect the scattered trajectory lines through the tracking trajectory lines to form initial pedestrian trajectory lines.
Specifically, the trajectory line initial forming module 73 is configured to perform image segmentation on the pedestrian detection frame at the current moment according to the head, the body, the left arm, the right arm and the legs, forming a head-shoulder block, a left-arm block, a right-arm block, a body block and a leg block, wherein the pedestrian detection frame at the current moment is the one at the tail end of a scattered trajectory line. The head-shoulder block R1 is tracked in a first tracking manner, and the left-arm block R2, the right-arm block R3, the body block R4 and the leg block R5 are each tracked in a second tracking manner, so as to obtain the positions Y = {y_i} of the update blocks of the head-shoulder block, the body block, the left-arm block, the right-arm block and the leg block in the pedestrian detection frame at the next moment, where i denotes the serial numbers of the head-shoulder block, the left-arm block, the right-arm block, the body block and the leg block. In this embodiment, the first tracking manner uses the KCF tracking algorithm, and the second tracking manner uses the particle filter tracking algorithm. This is a compromise between accuracy and computation speed: the KCF tracking algorithm is superior to the particle filter tracking algorithm in tracking accuracy, and since the appearance information of the head-shoulder block is rich and not easily occluded, the KCF tracking algorithm is used to track the head-shoulder block. In this embodiment, the relative displacements d_i of the head-shoulder block, the left-arm block, the right-arm block, the body block and the leg block from the center of the pedestrian detection frame are also recorded. The center of the pedestrian detection frame at the next moment is calculated according to the positions Y = {y_i} of the update blocks of the head-shoulder block, the body block, the left-arm block, the right-arm block and the leg block in the pedestrian detection frame at the next moment and the relative displacements d_i of these blocks from the center of the pedestrian detection frame at the current moment. The offset of each update block from the center of the pedestrian detection frame at the next moment is then calculated. If the offsets of the head-shoulder block, the body block, the left-arm block, the right-arm block and the leg block from the center of the pedestrian detection frame at the current moment and the offsets of each update block from the center of the pedestrian detection frame at the next moment respectively satisfy the predetermined offset determination condition, each update block is corrected to be the initial position of the next block tracking. The above functions are executed cyclically; after the updating of the pedestrian detection frame is finished, the updated pedestrian detection frames are connected to form a tracking trajectory line, and the head end of another scattered trajectory line matching the tail end of the tracking trajectory line is searched for among the updated pedestrian detection frames; if such a head end is found, the tracking trajectory line is connected with that scattered trajectory line, and block tracking continues from the tail end of that scattered trajectory line; otherwise, the pedestrian detection frame continues to be updated.
In the process of connecting scattered track lines, marking the states of the scattered track lines as finished states when all the scattered track lines are connected pairwise through the generated tracking track lines or are only connected with the tracking track lines, and finishing the connection stage of the track lines when all the scattered track lines are marked as finished states to form initial pedestrian track lines; otherwise, it is marked as incomplete.
A refinement module 74 coupled to the trajectory line initial forming module 73 is configured to refine the initial pedestrian trajectory line to obtain a real pedestrian trajectory line in the image data to be processed.
The refinement module 74 is used to smooth the initial pedestrian trajectory line. Specifically, a B-spline curve is calculated by taking the coordinate points on the initial pedestrian trajectory line as control points, and the spline curve replaces the initial pedestrian trajectory line. If the number of control points is m, the order is set to n = ceil(m × 0.8) in order to obtain a low-order spline curve. Replacing the initial pedestrian trajectory line with the spline curve achieves the purpose of smoothing the initial pedestrian trajectory line.
The refinement module 74 is also configured to cut off the smoothed initial pedestrian trajectory line. Specifically, the refinement module 74 sets a sliding window and judges whether the length of the spline curve is greater than a predetermined window length w; if not, the spline curve is not cut; if yes, a part of the spline curve with the predetermined window length w is intercepted, the average movement direction of the first half of the intercepted spline curve (the front w/2, i.e., w1) and the average movement direction of the second half (the rear w/2, i.e., w2) are obtained, and if the included angle between the two average movement directions is greater than a predetermined cut-off included angle (in this embodiment, the predetermined cut-off included angle is 60 degrees), the point where the two halves meet (the point w0) is set as a cut-off point to form cut-off lines; the sliding window is then slid backwards by the predetermined window length w, and the judgment of whether the remaining spline curve length is greater than the predetermined window length is repeated. In the present embodiment, if one of the initial pedestrian trajectory lines is cut, each resulting cut-off line has a state that is either an initial state or a hold state.
The refining module 74 is further configured to reconnect the cut-off initial pedestrian trajectory line according to a preset reconnection determination condition to form a real pedestrian trajectory line. In reconnection, reconnection judgment is performed on all the cut trajectory lines, and the cut lines meeting the preset reconnection judgment condition are connected pairwise. This step completes the optimization processing of the staggered pedestrian trajectory line, reducing the probability of the generation of the wrong trajectory line. Meanwhile, the integrity of the pedestrian motion track is ensured by effectively connecting the unconnected track lines.
Specifically, the refinement module 74 searches for two cutting lines S1 and S2 in the initial state and judges whether the cutting lines in the initial state satisfy the predetermined reconnection determination condition; if so, the cutting lines are connected pairwise to form a real pedestrian trajectory line; if not, the search continues.
The preset reconnection determination condition includes:
time determination conditions: the tail end time of the cutting line searched first is prior to the head end time of the cutting line searched later;
movement direction included angle determination condition: the included angle between the movement direction of the cutting line searched first and the movement direction of the cutting line searched later is smaller than a predetermined movement direction included angle th_theta;
if the time judgment condition and the motion direction included angle judgment condition are met, judging the cut-off lines searched later as candidate connecting lines;
distance determination conditions: and searching for the cutting-off line of which the distance between the tail end of the cutting-off line searched first and the head end of the candidate connecting line is smaller than a preset distance threshold th _ dist in the candidate connecting lines.
Example three
The present embodiment provides an apparatus, comprising: a processor, a memory, a transceiver, a communication interface and a system bus; the memory is used for storing a computer program, the communication interface is used for communicating with other devices, and the processor and the transceiver are used for running the computer program, so that the apparatus executes the steps of the above-described pedestrian trajectory line processing method.
The above-mentioned system bus may be a Peripheral Component Interconnect (PCI) bus, an Extended Industry Standard Architecture (EISA) bus, or the like. The system bus may be divided into an address bus, a data bus, a control bus, and the like. For ease of illustration, only one thick line is shown, but this does not mean that there is only one bus or one type of bus. The communication interface is used for realizing communication between the database access device and other equipment (such as a client, a read-write library and a read-only library). The memory may include a Random Access Memory (RAM), and may further include a non-volatile memory (non-volatile memory), such as at least one disk memory.
The processor may be a general-purpose processor, including a central processing unit (CPU), a network processor (NP), and the like; it may also be a digital signal processor (DSP), an application specific integrated circuit (ASIC), a field programmable gate array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, or discrete hardware components.
In summary, the pedestrian trajectory processing method/system, the computer-readable storage medium and the device of the present invention select to process the pedestrian motion trajectory within a time period at a time, which can effectively improve the stability of the pedestrian motion trajectory, maintain high tracking accuracy for scenes where a large number of pedestrians are present at the same time, and obtain a high-quality pedestrian motion trajectory. Therefore, the invention effectively overcomes various defects in the prior art and has high industrial utilization value.
The foregoing embodiments are merely illustrative of the principles and utilities of the present invention and are not intended to limit the invention. Any person skilled in the art can modify or change the above-mentioned embodiments without departing from the spirit and scope of the present invention. Accordingly, it is intended that all equivalent modifications or changes which can be made by those skilled in the art without departing from the spirit and technical spirit of the present invention be covered by the claims of the present invention.