CN107679492A - Behavior discrimination analysis method using a feature capture function - Google Patents

Behavior discrimination analysis method using a feature capture function

Info

Publication number
CN107679492A
CN107679492A (application CN201710911806.XA)
Authority
CN
China
Prior art keywords
image
human body
face
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201710911806.XA
Other languages
Chinese (zh)
Other versions
CN107679492B (en)
Inventor
杨晓凡
刘玉蓉
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shandong Xuxing Network Technology Co ltd
Original Assignee
Chongqing City Intellectual Property Road Science And Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Chongqing City Intellectual Property Road Science And Technology Co Ltd
Priority to CN201710911806.XA
Publication of CN107679492A
Application granted
Publication of CN107679492B
Expired - Fee Related
Anticipated expiration


Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 - Scenes; Scene-specific elements
    • G06V20/50 - Context or environment of the image
    • G06V20/52 - Surveillance or monitoring of activities, e.g. for recognising suspicious objects
    • G06V20/53 - Recognition of crowd images, e.g. recognition of crowd congestion
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 - Pattern recognition
    • G06F18/20 - Analysing
    • G06F18/22 - Matching criteria, e.g. proximity measures
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 - Pattern recognition
    • G06F18/20 - Analysing
    • G06F18/24 - Classification techniques
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 - Arrangements for image or video recognition or understanding
    • G06V10/40 - Extraction of image or video features
    • G06V10/50 - Extraction of image or video features by performing operations within image blocks; by using histograms, e.g. histogram of oriented gradients [HoG]; by summing image-intensity values; Projection analysis
    • G06V10/507 - Summing image-intensity values; Histogram projection analysis
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 - Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 - Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16 - Human faces, e.g. facial parts, sketches or expressions
    • G06V40/174 - Facial expression recognition
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 - Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/20 - Movements or behaviour, e.g. gesture recognition

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • Multimedia (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • General Engineering & Computer Science (AREA)
  • General Health & Medical Sciences (AREA)
  • Evolutionary Biology (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Artificial Intelligence (AREA)
  • Health & Medical Sciences (AREA)
  • Evolutionary Computation (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Human Computer Interaction (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Psychiatry (AREA)
  • Social Psychology (AREA)
  • Image Analysis (AREA)

Abstract

The present invention proposes a behavior discrimination analysis method using a feature capture function, comprising the following steps: S1, after judging the human-body feature image and the face feature image according to the facial-expression feature attribute values, matching and collecting the persons who leave the crowded region, and distinguishing by a classifier the regions that the corresponding dense crowds have reached or the corresponding nodes they have left, so as to push the result to a terminal.

Description

Behavior discrimination analysis method using a feature capture function
Technical field
The present invention relates to the field of big-data analysis, and more particularly to a behavior discrimination analysis method using a feature capture function.
Background technology
In today's society people move about frequently, and crowded areas such as shopping malls, railway stations and airports have large numbers of video surveillance devices. These devices, however, are used only for simple image acquisition in the crowded areas; the images are not subsequently classified or discriminated. Because crowd flows in social life are complex, the persons and places of crowded regions need to be planned rationally, with corresponding management and configuration, so that the catering, connecting transport, and entrances and exits of a crowded region can be configured reasonably. Yet after a large amount of image feature information has been obtained, the initial state and the result state of the crowded region cannot be correlation-matched against reference samples. This is the technical problem that those skilled in the art urgently need to solve.
Summary of the invention
The present invention aims to solve at least the technical problems existing in the prior art, and in particular innovatively proposes a behavior discrimination analysis method using a feature capture function.
To achieve the above purpose, the invention provides a behavior discrimination analysis method using a feature capture function, comprising the following steps:
S1: after judging the human-body feature image and the face feature image according to the facial-expression feature attribute values, match and collect the persons who leave the crowded region, and distinguish by a classifier the regions that the corresponding dense crowds have reached or the corresponding nodes they have left, so as to push the result to a terminal.
In the described behavior discrimination analysis method using a feature capture function, preferably, S1 comprises:
S1-1: perform classification judgment on the image features, and subject the image data of the different facial-expression feature sets C to model judgment; extract the histogram of the effective human-body feature images, construct the texture information, and obtain the value of each attribute in the facial-expression feature set:
Smile attribute value $C_{smile} = \sum_j j\cdot\delta_{x_j}\cdot\delta_{y_j}$, where $\delta_{x_j}$ and $\delta_{y_j}$ are the X-axis and Y-axis smile feature factors, respectively;
Open-mouth attribute value $C_{openmouth} = \sum_j j\cdot\tau_{x_j}\cdot\tau_{y_j}$, where $\tau_{x_j}$ and $\tau_{y_j}$ are the X-axis and Y-axis open-mouth feature factors, respectively;
Head-down attribute value $C_{downhead} = \sum_j j\cdot\beta_{x_j}\cdot\beta_{y_j}$, where $\beta_{x_j}$ and $\beta_{y_j}$ are the X-axis and Y-axis head-down feature factors, respectively;
Head-up attribute value $C_{uphead} = \sum_j j\cdot\varepsilon_{x_j}\cdot\varepsilon_{y_j}$, where $\varepsilon_{x_j}$ and $\varepsilon_{y_j}$ are the X-axis and Y-axis head-up feature factors, respectively;
Crying attribute value $C_{cry}$, computed in the same form from the X-axis and Y-axis crying feature factors (the factor symbols are not recoverable from the source);
Side-face attribute value $C_{halfface} = \sum_j j\cdot\mu_{x_j}\cdot\mu_{y_j}$, where $\mu_{x_j}$ and $\mu_{y_j}$ are the X-axis and Y-axis side-face feature factors, respectively;
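The attribute values above share one computational pattern: a sum over feature indices of index-weighted products of X-axis and Y-axis factors. A minimal sketch, assuming a 1-based index weight and list-based factor storage (both assumptions; the patent fixes neither, and all names are illustrative):

```python
# Sketch of the S1-1 attribute values, which share the form
# C = sum over j of j * (X-axis factor)_j * (Y-axis factor)_j.

def attribute_value(x_factors, y_factors):
    """C = sum_j j * x_j * y_j, with j starting at 1 (an assumption)."""
    return sum(j * dx * dy
               for j, (dx, dy) in enumerate(zip(x_factors, y_factors), start=1))

def expression_attributes(factor_pairs):
    """factor_pairs: {name: (x_factors, y_factors)} -> {name: attribute value}.
    One entry per expression in the feature set C (smile, open-mouth, ...)."""
    return {name: attribute_value(xs, ys)
            for name, (xs, ys) in factor_pairs.items()}
```

With this layout, one call per expression yields the attribute vector used for the model judgment in S1-1.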
S1-2: divide the image data of the entire crowded region to form the key-frame sequence pairs $(M_1,M_2),(M_2,M_3),\ldots,(M_{n-1},M_n)$; locate the boundary of any hand-held article in the human-body feature image, starting from the head portion of the initial frame of the video image; locate the entry/exit boundary of each human-body feature image, search from the tail of the video image for the corresponding position of the crowded region in which the human-body feature image appears, and judge the position where the human-body feature image appears, the dwell time, and whether the person is shopping or holding an article;
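The key-frame pairing in S1-2 can be sketched as follows; the function name and list representation are illustrative only:

```python
# Sketch of S1-2's key-frame pairing: consecutive pairs
# (M1,M2), (M2,M3), ..., (Mn-1,Mn).

def frame_pairs(frames):
    """Return the ordered list of consecutive key-frame pairs."""
    return list(zip(frames, frames[1:]))
```

Each pair is then compared in S1-3 to measure frame-to-frame change.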
S1-3: compare and capture the key-frame sequence pairs, and judge the change degree of the human-body feature image and the face feature image between the preceding and following video frames:
$$F_{i,j} = \sum_{i,j=0}^{n} E_{i,j}\cdot\left(K_i - \left(\sum_i z_i + \sum_{i+1} d_i\right)\right) + \frac{C}{\left|E_{i,j}L_n + E_{i,j}M_n\right|}\cdot\omega_{i,j} + \sum_{s,t} S_{s,t}$$
where $|E_{i,j}L_n + E_{i,j}M_n|$ is the similarity between the query feature $L_n$ to be matched and the key-frame image $M_n$; $E$ denotes the number of matched images in the crowded region; $S$ denotes the interference set that affects the human-body feature image and the face feature image; $s$ and $t$ are distinct positive integers whose minimum value is 1 and whose maximum value is the number of human-body feature images and face feature images matched in the image-feature map; $\omega_{i,j}$ is the weight of the total number of correlation matches of the facial-expression feature set $C$; $K_i$ is the penalty factor for erroneous matching of human-body feature images in the crowded region; and $z$ and $d$ denote the collection set of a human-body feature image and the collection set of its next key frame, respectively;
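A rough sketch of evaluating the S1-3 change degree under the reconstructed formula; the dictionary-based data layout, the pre-summed z and d terms, and all names are assumptions rather than anything the patent specifies:

```python
# Sketch of the S1-3 change degree:
# F = sum_ij E_ij*(K_i - (sum z + sum d)) + C/|sim_ij|*omega_ij + sum S_st.

def change_degree(E, K, z_sum, d_sum, C_const, sim, omega, interference):
    """E, sim, omega: dicts keyed by (i, j); K: dict keyed by i.
    z_sum/d_sum: summed collection-set terms; interference: S_st terms."""
    total = 0.0
    for (i, j), e in E.items():
        total += e * (K[i] - (z_sum + d_sum))                # penalty-adjusted match term
        total += C_const / abs(sim[(i, j)]) * omega[(i, j)]  # similarity-weighted term
    return total + sum(interference)                         # add interference set
```

The similarity values sim[(i, j)] stand in for $|E_{i,j}L_n + E_{i,j}M_n|$ in the formula.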
Match the change degree against the crowded-region position of the corresponding image acquisition module to obtain the positive-correlation condition function of crowded-region position and change degree:
$$N_{i,j} = \begin{cases} f\!\left(\dfrac{Y(x,y)}{\eta_i}\right) + f\!\left(\dfrac{Z(x,y)}{\sigma_j}\right), & i,j = 0 \\[1ex] \dfrac{(i \otimes r_{x,y})\,(j \otimes r_{x,y})}{(M_1,M_2)\cdot(M_2,M_3)\cdot\ldots\cdot(M_{n-1},M_n)}, & i,j = 1 \\[1ex] \sum_{i,j} C\,(i \otimes j)^2, & i,j > 1 \end{cases}$$
where $Y(x,y)$ and $Z(x,y)$ respectively denote the missing interaction relationship between the human-body feature image and the face feature image at coordinate point $(x,y)$; $\eta_i$ and $\sigma_j$ respectively denote the human-body feature image judgment threshold and the face feature image judgment threshold, both positive numbers in the open interval $(0,1)$; and $r_{x,y}$ denotes the similarity judgment factor for the human-body feature image and the face feature image at coordinate $(x,y)$;
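Only the $i,j=0$ branch of $N_{i,j}$ is concrete enough to sketch. The patent does not define the mapping $f$, so tanh is used purely as an illustrative stand-in, and all names are assumptions:

```python
import math

# Sketch of the i = j = 0 branch of the positive-correlation condition:
# N = f(Y(x,y)/eta_i) + f(Z(x,y)/sigma_j), with f unspecified in the source.

def correlation_condition(Y_xy, Z_xy, eta_i, sigma_j, f=math.tanh):
    """Threshold-scaled interaction of the body term Y and face term Z."""
    return f(Y_xy / eta_i) + f(Z_xy / sigma_j)
```

Since $\eta_i, \sigma_j \in (0,1)$, dividing by them amplifies the interaction terms before $f$ is applied.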
S1-4: according to the defined association relationship between each individual's human-body feature image and face feature image, rank the query correlation and the data correlation to produce non-dominated individual sets of different correlation grades, ordered from low to high sequence-number grade according to the number of non-dominated individuals in each human-body feature image and face feature image grade. If no correlation image matching any feature of the human-body feature image and the face feature image is found at the exit of each crowded region, go to step S1-1; if the corresponding crowded-region position yields a correlation image and is feature-labelled at the corresponding position, go to step S1-5;
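The low-to-high grade ordering of S1-4 can be sketched as a simple sort; the (grade, item) representation is an assumption, since the patent only fixes the ordering direction:

```python
# Sketch of S1-4's traversal order: non-dominated individual sets visited
# from the lowest correlation grade to the highest.

def order_by_grade(candidates):
    """candidates: iterable of (grade, item) -> items in ascending grade order."""
    return [item for _, item in sorted(candidates, key=lambda t: t[0])]
```

A stable sort keeps individuals within the same grade in their original relative order.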
S1-5: set up a crowded-region log, extract the attribute information of the crowded region according to user demand, and perform similarity computation: compute the query similarity using human-body feature image similarity and using face feature image similarity, until the log similarity and the query similarity converge; balance, by the matching weight $\alpha$, the default human-body feature image and face feature image correlation against the user-defined correlation, giving the weighing result value
$$D[i,j] = \max F_{i,j}\cdot(1-\alpha)\cdot P(i,j) + \alpha\cdot P(i,j,r_{x,y}) + \min F_{i,j}$$
where $\max F_{i,j}$ and $\min F_{i,j}$ are the maximum and minimum of the change degree of the human-body feature image and the face feature image; $P(i,j)$ is the initial judgment decision value of the crowded region, and $P(i,j,r_{x,y})$ is the result judgment decision value; $r_{x,y}$ denotes the similarity judgment factor for the human-body feature image and the face feature image at coordinate $(x,y)$. The initial judgment decision value is the initial judgment of the crowded region made from historical feature-image data, and the result judgment decision value is the decision value obtained after optimization through the judgments of S1-1 to S1-5.
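Read literally, the weighing result value $D[i,j]$ blends the initial and result decision values under the matching weight $\alpha$. Whether $\max F_{i,j}$ scales only the first term is an interpretation of the flattened formula, and the names are illustrative:

```python
# Sketch of the S1-5 weighing result value, read literally from
# D[i,j] = maxF*(1-alpha)*P(i,j) + alpha*P(i,j,r) + minF.

def weighing_result(max_F, min_F, P_initial, P_result, alpha):
    """Blend initial and result decision values with matching weight alpha."""
    return max_F * (1 - alpha) * P_initial + alpha * P_result + min_F
```

At alpha = 1 the value depends only on the result decision value (plus the minimum change degree), matching the intent that alpha shifts weight from history to the optimized judgment.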
In summary, by adopting the above technical solution, the beneficial effects of the invention are as follows:
After acquiring images, the invention classifies the persons entering and leaving the crowded region according to their facial information and their differences in body shape and clothing, so that the corresponding auxiliary facilities of the crowded region can be improved. Classification by the classifier model consumes few system resources and saves time overhead, and correlation-matching the initial state and the result state of the crowded region provides a rational allocation plan for the crowded region, which is conducive to crowd guidance and personnel redeployment.
Additional aspects and advantages of the invention will be set forth in part in the following description, will in part become apparent from it, or will be learned through practice of the invention.
Brief description of the drawings
The above and additional aspects and advantages of the invention will become apparent and readily understood from the following description of the embodiments in conjunction with the accompanying drawings, in which:
Fig. 1 is a general schematic diagram of the invention.
Detailed description of the embodiments
Embodiments of the invention are described in detail below; examples of the embodiments are shown in the drawings, in which identical or similar reference numbers throughout denote identical or similar elements, or elements with identical or similar functions. The embodiments described below with reference to the drawings are exemplary, serve only to explain the invention, and are not to be construed as limiting it.
In the description of the invention, it should be understood that orientation or position terms such as "longitudinal", "transverse", "upper", "lower", "front", "rear", "left", "right", "vertical", "horizontal", "top", "bottom", "inner" and "outer" indicate orientations or positions based on the drawings; they serve only to ease and simplify the description, and do not indicate or imply that the device or element referred to must have a specific orientation or be configured and operated in a specific orientation. They are therefore not to be construed as limiting the invention.
In the description of the invention, unless otherwise specified and limited, it should be noted that the terms "installed", "connected" and "coupled" are to be understood broadly: the connection may, for example, be mechanical or electrical, or internal to two elements; it may be direct, or indirect through an intermediary. For those of ordinary skill in the art, the specific meaning of the above terms can be understood according to the specific situation.
As shown in Fig. 1, the invention provides a behavior discrimination analysis method using a feature capture function, comprising the following steps:
S1: after judging the human-body feature image and the face feature image according to the facial-expression feature attribute values, match and collect the persons who leave the crowded region, and distinguish by a classifier the regions that the corresponding dense crowds have reached or the corresponding nodes they have left, so as to push the result to a terminal.
In the described behavior discrimination analysis method using a feature capture function, preferably, S1 comprises:
S1-1: perform classification judgment on the image features, and subject the image data of the different facial-expression feature sets C to model judgment; extract the histogram of the effective human-body feature images, construct the texture information, and obtain the value of each attribute in the facial-expression feature set:
Smile attribute value $C_{smile} = \sum_j j\cdot\delta_{x_j}\cdot\delta_{y_j}$, where $\delta_{x_j}$ and $\delta_{y_j}$ are the X-axis and Y-axis smile feature factors, respectively;
Open-mouth attribute value $C_{openmouth} = \sum_j j\cdot\tau_{x_j}\cdot\tau_{y_j}$, where $\tau_{x_j}$ and $\tau_{y_j}$ are the X-axis and Y-axis open-mouth feature factors, respectively;
Head-down attribute value $C_{downhead} = \sum_j j\cdot\beta_{x_j}\cdot\beta_{y_j}$, where $\beta_{x_j}$ and $\beta_{y_j}$ are the X-axis and Y-axis head-down feature factors, respectively;
Head-up attribute value $C_{uphead} = \sum_j j\cdot\varepsilon_{x_j}\cdot\varepsilon_{y_j}$, where $\varepsilon_{x_j}$ and $\varepsilon_{y_j}$ are the X-axis and Y-axis head-up feature factors, respectively;
Crying attribute value $C_{cry}$, computed in the same form from the X-axis and Y-axis crying feature factors (the factor symbols are not recoverable from the source);
Side-face attribute value $C_{halfface} = \sum_j j\cdot\mu_{x_j}\cdot\mu_{y_j}$, where $\mu_{x_j}$ and $\mu_{y_j}$ are the X-axis and Y-axis side-face feature factors, respectively;
S1-2: divide the image data of the entire crowded region to form the key-frame sequence pairs $(M_1,M_2),(M_2,M_3),\ldots,(M_{n-1},M_n)$; locate the boundary of any hand-held article in the human-body feature image, starting from the head portion of the initial frame of the video image; locate the entry/exit boundary of each human-body feature image, search from the tail of the video image for the corresponding position of the crowded region in which the human-body feature image appears, and judge the position where the human-body feature image appears, the dwell time, and whether the person is shopping or holding an article;
S1-3: compare and capture the key-frame sequence pairs, and judge the change degree of the human-body feature image and the face feature image between the preceding and following video frames:
$$F_{i,j} = \sum_{i,j=0}^{n} E_{i,j}\cdot\left(K_i - \left(\sum_i z_i + \sum_{i+1} d_i\right)\right) + \frac{C}{\left|E_{i,j}L_n + E_{i,j}M_n\right|}\cdot\omega_{i,j} + \sum_{s,t} S_{s,t}$$
where $|E_{i,j}L_n + E_{i,j}M_n|$ is the similarity between the query feature $L_n$ to be matched and the key-frame image $M_n$; $E$ denotes the number of matched images in the crowded region; $S$ denotes the interference set that affects the human-body feature image and the face feature image; $s$ and $t$ are distinct positive integers whose minimum value is 1 and whose maximum value is the number of human-body feature images and face feature images matched in the image-feature map; $\omega_{i,j}$ is the weight of the total number of correlation matches of the facial-expression feature set $C$; $K_i$ is the penalty factor for erroneous matching of human-body feature images in the crowded region; and $z$ and $d$ denote the collection set of a human-body feature image and the collection set of its next key frame, respectively;
Match the change degree against the crowded-region position of the corresponding image acquisition module to obtain the positive-correlation condition function of crowded-region position and change degree:
$$N_{i,j} = \begin{cases} f\!\left(\dfrac{Y(x,y)}{\eta_i}\right) + f\!\left(\dfrac{Z(x,y)}{\sigma_j}\right), & i,j = 0 \\[1ex] \dfrac{(i \otimes r_{x,y})\,(j \otimes r_{x,y})}{(M_1,M_2)\cdot(M_2,M_3)\cdot\ldots\cdot(M_{n-1},M_n)}, & i,j = 1 \\[1ex] \sum_{i,j} C\,(i \otimes j)^2, & i,j > 1 \end{cases}$$
where $Y(x,y)$ and $Z(x,y)$ respectively denote the missing interaction relationship between the human-body feature image and the face feature image at coordinate point $(x,y)$; $\eta_i$ and $\sigma_j$ respectively denote the human-body feature image judgment threshold and the face feature image judgment threshold, both positive numbers in the open interval $(0,1)$; and $r_{x,y}$ denotes the similarity judgment factor for the human-body feature image and the face feature image at coordinate $(x,y)$;
S1-4: according to the defined association relationship between each individual's human-body feature image and face feature image, rank the query correlation and the data correlation to produce non-dominated individual sets of different correlation grades, ordered from low to high sequence-number grade according to the number of non-dominated individuals in each human-body feature image and face feature image grade. If no correlation image matching any feature of the human-body feature image and the face feature image is found at the exit of each crowded region, go to step S1-1; if the corresponding crowded-region position yields a correlation image and is feature-labelled at the corresponding position, go to step S1-5;
S1-5: set up a crowded-region log, extract the attribute information of the crowded region according to user demand, and perform similarity computation: compute the query similarity using human-body feature image similarity and using face feature image similarity, until the log similarity and the query similarity converge; balance, by the matching weight $\alpha$, the default human-body feature image and face feature image correlation against the user-defined correlation, giving the weighing result value
$$D[i,j] = \max F_{i,j}\cdot(1-\alpha)\cdot P(i,j) + \alpha\cdot P(i,j,r_{x,y}) + \min F_{i,j}$$
where $\max F_{i,j}$ and $\min F_{i,j}$ are the maximum and minimum of the change degree of the human-body feature image and the face feature image; $P(i,j)$ is the initial judgment decision value of the crowded region, and $P(i,j,r_{x,y})$ is the result judgment decision value; $r_{x,y}$ denotes the similarity judgment factor for the human-body feature image and the face feature image at coordinate $(x,y)$. The initial judgment decision value is the initial judgment of the crowded region made from historical feature-image data, and the result judgment decision value is the decision value obtained after optimization through the judgments of S1-1 to S1-5.
In the description of this specification, reference to the terms "one embodiment", "some embodiments", "an example", "a specific example" or "some examples" means that a specific feature, structure, material or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the invention. In this specification, schematic references to these terms do not necessarily refer to the same embodiment or example. Moreover, the specific features, structures, materials or characteristics described may be combined in any suitable manner in one or more embodiments or examples.
Although embodiments of the invention have been shown and described, those skilled in the art will understand that various changes, modifications, substitutions and variations may be made to these embodiments without departing from the principle and purpose of the invention; the scope of the invention is defined by the claims and their equivalents.

Claims (2)

1. A behavior discrimination analysis method using a feature capture function, characterized by comprising the following steps:
S1: after judging the human-body feature image and the face feature image according to the facial-expression feature attribute values, match and collect the persons who leave the crowded region, and distinguish by a classifier the regions that the corresponding dense crowds have reached or the corresponding nodes they have left, so as to push the result to a terminal.
2. The behavior discrimination analysis method using a feature capture function according to claim 1, characterized in that S1 comprises:
S1-1: perform classification judgment on the image features, and subject the image data of the different facial-expression feature sets C to model judgment; extract the histogram of the effective human-body feature images, construct the texture information, and obtain the value of each attribute in the facial-expression feature set:
Smile attribute value $C_{smile} = \sum_j j\cdot\delta_{x_j}\cdot\delta_{y_j}$, where $\delta_{x_j}$ and $\delta_{y_j}$ are the X-axis and Y-axis smile feature factors, respectively;
Open-mouth attribute value $C_{openmouth} = \sum_j j\cdot\tau_{x_j}\cdot\tau_{y_j}$, where $\tau_{x_j}$ and $\tau_{y_j}$ are the X-axis and Y-axis open-mouth feature factors, respectively;
Head-down attribute value $C_{downhead} = \sum_j j\cdot\beta_{x_j}\cdot\beta_{y_j}$, where $\beta_{x_j}$ and $\beta_{y_j}$ are the X-axis and Y-axis head-down feature factors, respectively;
Head-up attribute value $C_{uphead} = \sum_j j\cdot\varepsilon_{x_j}\cdot\varepsilon_{y_j}$, where $\varepsilon_{x_j}$ and $\varepsilon_{y_j}$ are the X-axis and Y-axis head-up feature factors, respectively;
Crying attribute value $C_{cry}$, computed in the same form from the X-axis and Y-axis crying feature factors (the factor symbols are not recoverable from the source);
Side-face attribute value $C_{halfface} = \sum_j j\cdot\mu_{x_j}\cdot\mu_{y_j}$, where $\mu_{x_j}$ and $\mu_{y_j}$ are the X-axis and Y-axis side-face feature factors, respectively;
S1-2: divide the image data of the entire crowded region to form the key-frame sequence pairs $(M_1,M_2),(M_2,M_3),\ldots,(M_{n-1},M_n)$; locate the boundary of any hand-held article in the human-body feature image, starting from the head portion of the initial frame of the video image; locate the entry/exit boundary of each human-body feature image, search from the tail of the video image for the corresponding position of the crowded region in which the human-body feature image appears, and judge the position where the human-body feature image appears, the dwell time, and whether the person is shopping or holding an article;
S1-3: compare and capture the key-frame sequence pairs, and judge the change degree of the human-body feature image and the face feature image between the preceding and following video frames:
$$F_{i,j} = \sum_{i,j=0}^{n} E_{i,j}\cdot\left(K_i - \left(\sum_i z_i + \sum_{i+1} d_i\right)\right) + \frac{C}{\left|E_{i,j}L_n + E_{i,j}M_n\right|}\cdot\omega_{i,j} + \sum_{s,t} S_{s,t}$$
where $|E_{i,j}L_n + E_{i,j}M_n|$ is the similarity between the query feature $L_n$ to be matched and the key-frame image $M_n$; $E$ denotes the number of matched images in the crowded region; $S$ denotes the interference set that affects the human-body feature image and the face feature image; $s$ and $t$ are distinct positive integers whose minimum value is 1 and whose maximum value is the number of human-body feature images and face feature images matched in the image-feature map; $\omega_{i,j}$ is the weight of the total number of correlation matches of the facial-expression feature set $C$; $K_i$ is the penalty factor for erroneous matching of human-body feature images in the crowded region; and $z$ and $d$ denote the collection set of a human-body feature image and the collection set of its next key frame, respectively;
The change degree is subjected to information matches with the crowded regional location residing for corresponding image capture module, obtains the stream of people Close quarters position and the positive correlation conditional function of change degree
<mrow> <msub> <mi>N</mi> <mrow> <mi>i</mi> <mo>,</mo> <mi>j</mi> </mrow> </msub> <mo>=</mo> <mfenced open = "{" close = ""> <mtable> <mtr> <mtd> <mrow> <mi>f</mi> <mrow> <mo>(</mo> <mfrac> <mrow> <mi>Y</mi> <mrow> <mo>(</mo> <mi>x</mi> <mo>,</mo> <mi>y</mi> <mo>)</mo> </mrow> </mrow> <msub> <mi>&amp;eta;</mi> <mi>i</mi> </msub> </mfrac> <mo>)</mo> </mrow> <mo>+</mo> <mi>f</mi> <mrow> <mo>(</mo> <mfrac> <mrow> <mi>Z</mi> <mrow> <mo>(</mo> <mi>x</mi> <mo>,</mo> <mi>y</mi> <mo>)</mo> </mrow> </mrow> <msub> <mi>&amp;sigma;</mi> <mi>j</mi> </msub> </mfrac> <mo>)</mo> </mrow> </mrow> </mtd> <mtd> <mrow> <mi>i</mi> <mo>,</mo> <mi>j</mi> <mo>=</mo> <mn>0</mn> </mrow> </mtd> </mtr> <mtr> <mtd> <mfrac> <mrow> <mo>(</mo> <mi>i</mi> <mo>&amp;CircleTimes;</mo> <msub> <mi>r</mi> <mrow> <mi>x</mi> <mo>,</mo> <mi>y</mi> </mrow> </msub> <mo>)</mo> <mo>(</mo> <mi>j</mi> <mo>&amp;CircleTimes;</mo> <msub> <mi>r</mi> <mrow> <mi>x</mi> <mo>,</mo> <mi>y</mi> </mrow> </msub> <mo>)</mo> </mrow> <mrow> <mo>(</mo> <msub> <mi>M</mi> <mn>1</mn> </msub> <mo>,</mo> <msub> <mi>M</mi> <mn>2</mn> </msub> <mo>)</mo> <mo>&amp;CenterDot;</mo> <mo>(</mo> <msub> <mi>M</mi> <mn>2</mn> </msub> <mo>,</mo> <msub> <mi>M</mi> <mn>3</mn> </msub> <mo>)</mo> <mo>&amp;CenterDot;</mo> <mn>...</mn> <mo>&amp;CenterDot;</mo> <mo>(</mo> <msub> <mi>M</mi> <mrow> <mi>n</mi> <mo>-</mo> <mn>1</mn> </mrow> </msub> <mo>,</mo> <msub> <mi>M</mi> <mi>n</mi> </msub> <mo>)</mo> </mrow> </mfrac> </mtd> <mtd> <mrow> <mi>i</mi> <mo>,</mo> <mi>j</mi> <mo>=</mo> <mn>1</mn> </mrow> </mtd> </mtr> <mtr> <mtd> <mrow> <munder> <mo>&amp;Sigma;</mo> <mrow> <mi>i</mi> <mo>,</mo> <mi>j</mi> </mrow> </munder> <mi>C</mi> <msup> <mrow> <mo>(</mo> <mi>i</mi> <mo>&amp;CircleTimes;</mo> <mi>j</mi> <mo>)</mo> </mrow> <mn>2</mn> </msup> </mrow> </mtd> <mtd> <mrow> <mi>i</mi> <mo>,</mo> <mi>j</mi> <mo>&gt;</mo> <mn>1</mn> </mrow> </mtd> </mtr> </mtable> </mfenced> </mrow>
where Y(x,y) and Z(x,y) respectively denote the missing interaction relationship between the human-body feature image and the face feature image at coordinate point (x,y); η_i and σ_j respectively denote the human-body feature image judgment threshold and the face feature image judgment threshold, both positive numbers in the open interval (0,1); and r_{x,y} denotes the similarity judgment factor for the human-body feature image and the face feature image at coordinate (x,y);
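The piecewise function N_{i,j} above leaves f, the ⊗ operator, and the pair terms (M_k, M_{k+1}) undefined. A minimal sketch under stated stand-in assumptions (f as tanh, ⊗ as plain multiplication, each pair as the scalar product M_k·M_{k+1}) is:

```python
import math

def n_ij(i, j, Y_xy, Z_xy, eta_i, sigma_j, r_xy, M, C=1.0, f=math.tanh):
    """Piecewise positive-correlation function N_{i,j} from the text above.

    The source does not define f, the circled-times operator, or the pair
    terms (M_k, M_{k+1}); here f defaults to tanh, 'i (x) r' is taken as
    plain multiplication, and (M_k, M_{k+1}) as the product M_k * M_{k+1}
    -- all illustrative assumptions, not the patent's definitions.
    """
    if i == 0 and j == 0:
        # thresholded interaction terms for the base case
        return f(Y_xy / eta_i) + f(Z_xy / sigma_j)
    if i == 1 and j == 1:
        # product of similarity-weighted indices over the chain of M pairs
        denom = 1.0
        for a, b in zip(M, M[1:]):
            denom *= a * b
        return (i * r_xy) * (j * r_xy) / denom
    # i, j > 1: weighted sum of squared index interactions
    return C * (i * j) ** 2
```

The thresholds eta_i and sigma_j must be positive (the text restricts them to the open interval (0,1)), so the divisions in the first branch are always defined.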
S1-4, according to the association relationship defined between each individual's human-body feature image and face feature image, rank the query relevance and the data relevance by that association relationship to produce non-dominated individual sets of different relevance grades, ordering grade numbers from low to high relevance according to the number of non-dominated individuals in each human-body feature image and face feature image grade. If the exit of a crowded region cannot be matched, by human-body feature image or face feature image, to a feature image of any relevance grade, execute step S1-1; if the corresponding crowded-region location yields a relevant image and it is signed at the corresponding position, execute step S1-5;
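Step S1-4 groups individuals into non-dominated sets graded by relevance. The patent does not spell the procedure out; a minimal sketch, assuming a standard Pareto-front sort over the two objectives (query relevance, data relevance), is:

```python
def non_dominated_grades(individuals):
    """Group (query_relevance, data_relevance) pairs into non-dominated fronts.

    An individual b dominates a when b is >= a in both objectives and
    strictly greater in at least one.  Grade 0 is the fully non-dominated
    set, grade 1 the next front, and so on -- a generic Pareto-front sort,
    offered as one plausible reading of step S1-4.
    """
    remaining = list(individuals)
    grades = []
    while remaining:
        # keep every individual that no other remaining individual dominates
        front = [a for a in remaining
                 if not any(b[0] >= a[0] and b[1] >= a[1] and
                            (b[0] > a[0] or b[1] > a[1])
                            for b in remaining)]
        grades.append(front)
        remaining = [x for x in remaining if x not in front]
    return grades
```

Each returned grade is a candidate set for the exit-matching test: if no grade contains a match, control returns to step S1-1.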
S1-5, set up a log for the crowded region, extract the attribute information of the crowded region according to the user request, and perform similarity calculation: compute the query similarity from the human-body feature image similarity and from the face feature image similarity, iterating until the log similarity and the query similarity converge. A matching weight α balances the default human-body feature image relevance against the face feature image relevance, and the user-defined relevance weighing result value is
D[i,j] = max F_{i,j}·(1-α)·P(i,j) + α·P(i,j,r_{x,y}) + min F_{i,j}
where max F_{i,j} is the maximum of the change degree of the human-body feature image and the face feature image, min F_{i,j} is the minimum of that change degree, P(i,j) is the initial judgment decision value of the crowded region, P(i,j,r_{x,y}) is the result judgment decision value of the crowded region, and r_{x,y} denotes the similarity judgment factor for the human-body feature image and the face feature image at coordinate (x,y). The initial judgment decision value is the preliminary judgment of the crowded region made from historical feature image data; the result judgment decision value is the decision value obtained after optimization through the judgments of steps S1-1 to S1-5.
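The weighing result D[i,j] of step S1-5 can be sketched directly. The grouping of the max F·(1-α)·P term is ambiguous in the source and is taken literally here; the function name and argument names are illustrative, not from the patent:

```python
def decision_value(max_f, min_f, p_init, p_result, alpha):
    """Relevance weighing result D[i,j] of step S1-5 (one plausible reading).

    alpha in [0, 1] is the matching weight balancing the default human-body
    feature image relevance (via the initial decision value p_init) against
    the face feature image relevance (via the result decision value p_result).
    """
    if not 0.0 <= alpha <= 1.0:
        raise ValueError("matching weight alpha must lie in [0, 1]")
    # D = max F * (1-alpha) * P(i,j) + alpha * P(i,j,r) + min F
    return max_f * (1.0 - alpha) * p_init + alpha * p_result + min_f
```

At α = 0 the result depends only on the history-based initial decision; at α = 1 only on the optimized result decision, matching the two decision values defined above.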
CN201710911806.XA 2017-09-29 2017-09-29 Behavior discriminant analysis method is carried out by using feature crawl function Expired - Fee Related CN107679492B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201710911806.XA CN107679492B (en) 2017-09-29 2017-09-29 Behavior discriminant analysis method is carried out by using feature crawl function

Publications (2)

Publication Number Publication Date
CN107679492A true CN107679492A (en) 2018-02-09
CN107679492B CN107679492B (en) 2018-10-16

Family

ID=61139461

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710911806.XA Expired - Fee Related CN107679492B (en) 2017-09-29 2017-09-29 Behavior discriminant analysis method is carried out by using feature crawl function

Country Status (1)

Country Link
CN (1) CN107679492B (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101325690A (en) * 2007-06-12 2008-12-17 上海正电科技发展有限公司 Method and system for detecting human flow analysis and crowd accumulation process of monitoring video flow
CN101980245A (en) * 2010-10-11 2011-02-23 北京航空航天大学 Adaptive template matching-based passenger flow statistical method
CN103593646A (en) * 2013-10-16 2014-02-19 中国计量学院 Dense crowd abnormal behavior detection method based on micro-behavior analysis
CN104298969A (en) * 2014-09-25 2015-01-21 电子科技大学 Crowd scale statistical method based on color and HAAR feature fusion
CN106127173A (en) * 2016-06-30 2016-11-16 北京小白世纪网络科技有限公司 A kind of human body attribute recognition approach based on degree of depth study
CN106384078A (en) * 2016-08-31 2017-02-08 重庆云库房物联科技有限公司 Infrared array based people stream behavior analysis system and method

Also Published As

Publication number Publication date
CN107679492B (en) 2018-10-16

Similar Documents

Publication Publication Date Title
CN107644218A (en) The method of work of crowded region behavioural analysis judgement is realized based on image collecting function
CN109711281B (en) Pedestrian re-recognition and feature recognition fusion method based on deep learning
CN107527068B (en) Vehicle type identification method based on CNN and domain adaptive learning
CN103136504B (en) Face identification method and device
Wu et al. Faithful multimodal explanation for visual question answering
CN111401270A (en) Human motion posture recognition and evaluation method and system
CN111222471B (en) Zero sample training and related classification method based on self-supervision domain perception network
KR100586881B1 (en) Device for providing sound effect accrding to image and method thereof
CN105139070B (en) fatigue driving evaluation method based on artificial neural network and evidence theory
Osareh et al. Comparative exudate classification using support vector machines and neural networks
CN100464332C (en) Picture inquiry method and system
CN107273796A (en) A kind of fast face recognition and searching method based on face characteristic
CN109389074A (en) A kind of expression recognition method extracted based on human face characteristic point
CN108664874A (en) Underground coal flow rate testing methods based on image recognition
Muthukumar Color-theoretic experiments to understand unequal gender classification accuracy from face images
CN108609019A (en) A kind of electric vehicle automatic Pilot method based on artificial intelligence platform
CN113674037A (en) Data acquisition and recommendation method based on shopping behaviors
CN106709528A (en) Method and device of vehicle reidentification based on multiple objective function deep learning
CN103268485A (en) Sparse-regularization-based face recognition method capable of realizing multiband face image information fusion
CN113920491A (en) Fatigue detection system, method, medium and detection device based on facial skeleton model
CN109934255A (en) A kind of Model Fusion method for delivering object Classification and Identification suitable for beverage bottle recycling machine
CN117290730A (en) Optimization method of individual emotion recognition model
CN111860117A (en) Human behavior recognition method based on deep learning
CN107563359A (en) Recognition of face temperature analysis generation method is carried out for dense population
Wiliem et al. Discovering discriminative cell attributes for hep-2 specimen image classification

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right

Effective date of registration: 20180903

Address after: 276000 101, 21 building, international industrial goods sourcing center, Shengzhuang street, Luozhuang District, Linyi, Shandong.

Applicant after: SHANDONG XUXING NETWORK TECHNOLOGY Co.,Ltd.

Address before: 402160 27-6 6 Xinglong Avenue, Yongchuan District, Chongqing, 27-6.

Applicant before: CHONGQING ZHIQUAN ZHILU TECHNOLOGY CO.,LTD.

GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20181016