CN111325276A - Image classification method and device, electronic equipment and computer-readable storage medium - Google Patents
Image classification method and device, electronic equipment and computer-readable storage medium
- Publication number
- CN111325276A (application CN202010112567.3A)
- Authority
- CN
- China
- Prior art keywords
- reference image
- similarity
- classified
- image
- image feature
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Links
- 238000000034 method Methods 0.000 title claims abstract description 83
- 238000003860 storage Methods 0.000 title claims abstract description 15
- 239000011159 matrix material Substances 0.000 claims description 73
- 238000005259 measurement Methods 0.000 claims description 20
- 230000015654 memory Effects 0.000 claims description 19
- 238000000605 extraction Methods 0.000 claims description 16
- 238000004590 computer program Methods 0.000 claims description 13
- 238000012545 processing Methods 0.000 claims description 9
- 238000004891 communication Methods 0.000 claims description 7
- 238000010606 normalization Methods 0.000 claims description 3
- 238000010586 diagram Methods 0.000 description 11
- 230000000694 effects Effects 0.000 description 10
- 238000013527 convolutional neural network Methods 0.000 description 6
- 241000282472 Canis lupus familiaris Species 0.000 description 4
- 238000004364 calculation method Methods 0.000 description 4
- 238000007635 classification algorithm Methods 0.000 description 4
- 238000001514 detection method Methods 0.000 description 4
- 239000000284 extract Substances 0.000 description 4
- 230000006870 function Effects 0.000 description 4
- 238000012706 support-vector machine Methods 0.000 description 4
- 238000013145 classification model Methods 0.000 description 3
- 238000011160 research Methods 0.000 description 3
- 230000003287 optical effect Effects 0.000 description 2
- 238000005457 optimization Methods 0.000 description 2
- 244000141359 Malus pumila Species 0.000 description 1
- 235000021016 apples Nutrition 0.000 description 1
- 230000005540 biological transmission Effects 0.000 description 1
- 238000004422 calculation algorithm Methods 0.000 description 1
- 238000005516 engineering process Methods 0.000 description 1
- 238000010801 machine learning Methods 0.000 description 1
- 238000004519 manufacturing process Methods 0.000 description 1
- 238000005295 random walk Methods 0.000 description 1
- 238000012549 training Methods 0.000 description 1
- 230000009466 transformation Effects 0.000 description 1
- 238000000844 transformation Methods 0.000 description 1
- 230000000007 visual effect Effects 0.000 description 1
Images
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
- G06F18/20—Analysing
- G06F18/22—Matching criteria, e.g. proximity measures
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
- G06F18/20—Analysing
- G06F18/24—Classification techniques
- G06F18/241—Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches
- G06F18/2411—Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches based on the proximity to a decision surface, e.g. support vector machines
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N20/00—Machine learning
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/04—Architecture, e.g. interconnection topology
- G06N3/045—Combinations of networks
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/08—Learning methods
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/40—Extraction of image or video features
- G06V10/46—Descriptors for shape, contour or point-related descriptors, e.g. scale invariant feature transform [SIFT] or bags of words [BoW]; Salient regional features
- G06V10/462—Salient features, e.g. scale invariant feature transforms [SIFT]
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/40—Extraction of image or video features
- G06V10/50—Extraction of image or video features by performing operations within image blocks; by using histograms, e.g. histogram of oriented gradients [HoG]; by summing image-intensity values; Projection analysis
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Physics & Mathematics (AREA)
- Data Mining & Analysis (AREA)
- General Physics & Mathematics (AREA)
- Evolutionary Computation (AREA)
- Artificial Intelligence (AREA)
- General Engineering & Computer Science (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Software Systems (AREA)
- Life Sciences & Earth Sciences (AREA)
- Computing Systems (AREA)
- Mathematical Physics (AREA)
- Health & Medical Sciences (AREA)
- Biomedical Technology (AREA)
- Biophysics (AREA)
- Computational Linguistics (AREA)
- General Health & Medical Sciences (AREA)
- Molecular Biology (AREA)
- Evolutionary Biology (AREA)
- Bioinformatics & Cheminformatics (AREA)
- Bioinformatics & Computational Biology (AREA)
- Multimedia (AREA)
- Medical Informatics (AREA)
- Information Retrieval, Db Structures And Fs Structures Therefor (AREA)
- Image Analysis (AREA)
Abstract
The embodiment of the application discloses an image classification method, which comprises the following steps: extracting the image features to be classified of the image to be classified, and determining the initial similarity between the image features to be classified and each reference image feature in a reference image feature library; acquiring an association relation between every two reference image features in the reference image feature library; determining the target similarity between the image features to be classified and each reference image feature according to the image features to be classified, the initial similarity and the association relation between every two reference image features in the reference image feature library; and selecting, from the reference image feature library, a target reference image feature whose target similarity with the image features to be classified meets a first condition, and determining the category corresponding to the target reference image feature as the category of the image to be classified. The embodiment of the application also discloses an image classification device, electronic equipment and a computer-readable storage medium.
Description
Technical Field
The present application relates to the field of image processing, and in particular, to an image classification method and apparatus, an electronic device, and a computer-readable storage medium.
Background
Fine-grained image classification aims to sub-classify a coarse-grained large category, such as dogs of different breeds, into finer sub-categories. Compared with general image classification, fine-grained image classification requires finer granularity: the differences between sub-categories are subtle, and different sub-categories can often be distinguished only by small local differences.
At present, fine-grained image classification methods introduce target area identification to improve the classification effect; however, such classification depends on locating the target area, and the localization precision of the target area determines the fine-grained classification effect, which easily leads to inaccurate classification.
Disclosure of Invention
The embodiment of the application provides an image classification method and device, electronic equipment and a computer readable storage medium, so as to improve the accuracy of image classification.
The embodiment of the application provides an image classification method, which comprises the following steps:
extracting the image features to be classified of the image to be classified, and determining the initial similarity between the image features to be classified and each reference image feature in a reference image feature library;
acquiring an incidence relation between every two reference image features in the reference image feature library;
determining the target similarity between the image features to be classified and each reference image feature according to the image features to be classified, the initial similarity and the association relation between every two reference image features;
and selecting target reference image features of which the target similarity with the image features to be classified meets a first condition from the reference image feature library, and determining the categories corresponding to the target reference image features as the categories of the images to be classified.
An embodiment of the present application further provides an image classification apparatus, including: the system comprises a feature extraction unit, an initial similarity determination unit, an acquisition unit, a target similarity determination unit and a category determination unit; wherein,
the feature extraction unit is configured to extract the features of the image to be classified;
the initial similarity determining unit is configured to determine an initial similarity between the image feature to be classified and each reference image feature in a reference image feature library;
the acquiring unit is configured to acquire an association relation between every two reference image features in the reference image feature library;
the target similarity determining unit is configured to determine a target similarity between the image features to be classified and each reference image feature according to the image features to be classified, the initial similarity and the association relationship between each two reference image features;
the category determining unit is configured to select a target reference image feature, of which the target similarity with the image feature to be classified meets a first condition, from the reference image feature library, and determine a category corresponding to the target reference image feature as the category of the image to be classified.
The embodiment of the application also provides an electronic device, which comprises a processor, a memory and a communication bus;
the communication bus is configured to realize connection communication between the processor and the memory;
the processor is configured to execute a program of the image classification method stored in the memory to implement the steps of any one of the image classification methods described above.
The embodiment of the application also provides a computer readable storage medium, on which a computer program is stored, and the computer program is executed by a processor to implement the steps of the image classification method.
According to the image classification method and device, the electronic equipment and the computer-readable storage medium, the image features to be classified of the image to be classified are extracted, and the initial similarity between the image features to be classified and each reference image feature in the reference image feature library is determined; an association relation between every two reference image features in the reference image feature library is acquired; the target similarity between the image features to be classified and each reference image feature is determined according to the image features to be classified, the initial similarity and the association relation between every two reference image features; and a target reference image feature whose target similarity with the image features to be classified meets a first condition is selected from the reference image feature library, and the category corresponding to the target reference image feature is determined as the category of the image to be classified. In this way, reference image features with high similarity to the image to be classified can be selected from the reference image feature library, and the category corresponding to the selected reference image features is used as the category of the image to be classified. The problem that inaccurate positioning of the target area causes inaccurate classification can be avoided, and the effect of fine-grained image classification is further improved.
Drawings
Fig. 1 is a schematic flowchart 1 of an image classification method according to an embodiment of the present disclosure;
fig. 2 is a schematic flowchart 2 of an image classification method according to an embodiment of the present disclosure;
fig. 3 is a schematic flowchart 3 of an image classification method according to an embodiment of the present application;
fig. 4(a) is a schematic flowchart 4 of an image classification method according to an embodiment of the present application;
fig. 4(b) is a schematic view of a scene architecture of an image classification method according to an embodiment of the present application;
fig. 5(a) is a schematic flowchart of an extended query method provided in an embodiment of the present application;
fig. 5(b) is a schematic view of a scenario architecture of an extended query method according to an embodiment of the present application;
fig. 6 is a schematic structural diagram of an image classification apparatus according to an embodiment of the present disclosure;
fig. 7 is a schematic diagram illustrating a hardware structure of an electronic device according to an embodiment of the present disclosure.
Detailed Description
So that the manner in which the features and elements of the present embodiments can be understood in detail, a more particular description of the embodiments, briefly summarized above, may be had by reference to embodiments, some of which are illustrated in the appended drawings.
Fine-grained image classification has extensive research requirements and application scenarios both in the industry and academia. The research topic related to the method mainly comprises the identification of different types of birds, dogs, flowers, vehicles, airplanes and the like. In real life, the identification of different sub-categories also has huge application requirements. For example, in ecological conservation, effective identification of different types of organisms is an important prerequisite for ecological research. If the fine-grained image recognition with low cost can be realized by means of computer vision technology, the method has great significance for both academic and industrial fields.
Fine-grained image classification is a branch of image classification in which the image categories all belong to the same large class; the direct differences between sub-categories are relatively small, but the background and appearance of images from different sub-categories vary widely, so that distinguishing sub-categories remains challenging.
Current fine-grained image classification methods can be roughly divided into the following branches: methods based on fine-tuning an existing classification network, methods based on fine-grained feature learning, methods combining target block detection with classification, and methods based on a visual attention mechanism. Methods based on fine-tuning an existing classification network usually use an existing classification network (such as MobileNet, Xception, etc.) to perform preliminary training on an image data set to obtain a trained classification model, and then continue fine-tuning on a fine-grained data set so that the classification model becomes better suited to distinguishing sub-categories. Methods based on fine-grained feature learning combine the information acquired by two networks, making them suitable for fine-grained image classification: one network acquires the position information of a target object in an image, and the other network extracts abstract features of the target object. Fine-grained classification methods combining target block detection with classification follow the idea of target detection: the position of a target object is detected in an image, then the positions of discriminative areas within the target object are detected, and the discriminative target areas are classified at a fine granularity by a classification algorithm, which can be a traditional Support Vector Machine (SVM) classifier or a general classification network. Finally, compared with general classification algorithms, fine-grained classification methods based on an attention mechanism add an attention mechanism so that the model focuses more on the information expressed by the target area.
Therefore, the related technical solutions mainly improve the fine-grained classification effect by combining a general classification model with target area localization, but the localization precision of the target area determines the final fine-grained classification effect; moreover, general classification without a target area detection module is not accurate enough for fine-grained classification.
In order to solve the problems in the related art, an embodiment of the present application provides an image classification method, where an execution subject of the image classification method may be the image classification apparatus provided in the embodiment of the present application, or an electronic device integrated with the image classification apparatus, where the image classification apparatus may be implemented in a hardware or software manner. The electronic device may be a smart phone, a tablet computer, a personal computer, a server, an industrial computer, or the like.
Example one
Referring to fig. 1, fig. 1 is a schematic flowchart of an image classification method according to an embodiment of the present application. As shown in fig. 1, the image classification method includes the following steps:

Step 110, extracting the image features to be classified of the image to be classified, and determining the initial similarity between the image features to be classified and each reference image feature in a reference image feature library.
The image to be classified according to the embodiment of the present application may be any image input by a user, or may be any image transmitted to an image classification device by another device. The source of the image to be classified is not limited in the embodiments of the present application.
Further, after the image to be classified is acquired, the image classification device processes the image to be classified, extracts the image features of the image to be classified, and obtains the image features of the image to be classified. Here, the image classification apparatus may extract Scale-invariant feature transform (SIFT) information of an image to be classified and Histogram of Oriented Gradient (HOG) information of the image to be classified to obtain a feature of the image to be classified, and may extract a feature of the image to be classified by using an SVM or obtain a feature of the image to be classified based on a Convolutional Neural Network (CNN); the method for extracting the features of the image to be classified is not limited in the embodiment of the present application.
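As a non-limiting illustration of the CNN-based option mentioned above, the following sketch extracts a feature vector from the penultimate layer of a pretrained network; the specific backbone, preprocessing steps and library calls are illustrative assumptions and not part of the claimed method.

```python
# Sketch of CNN-based feature extraction (one possible choice among the
# SIFT / HOG / SVM / CNN options mentioned above); assumed: torchvision ResNet-50.
import torch
import torchvision.models as models
import torchvision.transforms as T
from PIL import Image

# Penultimate-layer activations of a pretrained ResNet-50 serve as the feature vector.
backbone = models.resnet50(weights=models.ResNet50_Weights.DEFAULT)
backbone.fc = torch.nn.Identity()          # drop the classification head
backbone.eval()

preprocess = T.Compose([
    T.Resize(256), T.CenterCrop(224), T.ToTensor(),
    T.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]),
])

def extract_feature(path: str) -> torch.Tensor:
    """Return a 2048-d feature vector for one image."""
    img = preprocess(Image.open(path).convert("RGB")).unsqueeze(0)
    with torch.no_grad():
        return backbone(img).squeeze(0)    # shape: (2048,)
```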
The reference image feature library according to the embodiment of the present application may be an image feature library having a plurality of sub-categories, which is constructed in advance by the image classification device. The reference image feature library comprises a plurality of reference image features, and each reference image feature has a corresponding class label.
Further, after the image classification device obtains the features of the image to be classified, the image classification device can determine the target similarity between the features of the image to be classified and the features of the reference image, further select the features of the reference image with the target similarity meeting a certain condition from the feature library of the reference image, and finally determine the category of the image to be classified based on the category corresponding to the selected features of the reference image.
In order to accurately obtain the target similarity between the image features to be classified and the reference image features, the embodiment of the application can configure the initial similarity for the image features to be classified and each reference image feature, and then optimize and adjust the initial similarity to obtain the optimal target similarity. Thus, the accuracy of image classification can be improved.
And step 120, acquiring the association relationship between every two reference image features in the reference image feature library.
Specifically, the image classification device may obtain an association relation between each reference image feature in the reference image feature library and each remaining reference image feature in the reference image feature library, that is, obtain an association relation between every two reference image features in the reference image feature library.
Here, the association relationship between each two reference image features mentioned in the embodiments of the present application may be characterized by a quantized numerical value. The larger the value, the higher the correlation between the two reference image features.
Specifically, the image classification device may obtain the association relation between every two reference image features by calculating a Euclidean distance, a Hamming distance, or a cosine similarity between every two reference image features in the reference image feature library.
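The following is a minimal sketch of computing the pairwise association relation for the reference image feature library; cosine similarity is assumed here, although a Euclidean or Hamming distance could equally be used as noted above.

```python
# Sketch: pairwise association (assumed cosine similarity) between all
# reference image features; zeroing the diagonal is an implementation assumption.
import numpy as np

def pairwise_association(X: np.ndarray) -> np.ndarray:
    """X: (N, d) reference image features. Returns an (N, N) matrix whose
    entry (h, k) quantifies the association between x_h and x_k; larger
    values mean a stronger association."""
    Xn = X / np.linalg.norm(X, axis=1, keepdims=True)   # L2-normalize rows
    A = Xn @ Xn.T                                       # cosine similarity
    np.fill_diagonal(A, 0.0)                            # no self-association
    return A
```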
It should be noted that step 120 may be executed after step 110, before step 110, or simultaneously with step 110; the execution order of step 120 and step 110 is not limited in the embodiment of the present application.
And step 130, determining the target similarity between the image features to be classified and each reference image feature according to the image features to be classified, the initial similarity and the incidence relation between every two reference image features in the reference image feature library.
In the embodiment provided by the application, the image classification device may perform optimization adjustment on the initial similarity based on the image feature to be classified and the association relationship between the two reference image features in the reference image feature library to obtain the target similarity between the image feature to be classified and each reference image feature in the reference image feature library.
Step 140, selecting, from the reference image feature library, target reference image features whose target similarity with the image features to be classified meets a first condition, and determining the categories corresponding to the target reference image features as the categories of the images to be classified.

Here, the image classification device selects one or more reference image features satisfying the first condition as target reference image features from the reference image feature library. It is understood that the image classification device can select, according to the similarity, reference image features similar to the image feature to be classified. Further, the image classification device may obtain the category corresponding to the target reference image feature, and use that category as the category of the image to be classified.
Specifically, when there is one target reference image feature, the category corresponding to that target reference image feature is used as the category of the image to be classified; when there are multiple target reference image features, the category of the image to be classified may be the category of any one of the target reference image features.
In the embodiment provided by the present application, a small number of reference images of unknown categories may be added outside the original reference image feature library, and the categories of these unknown-category reference images can be determined according to the methods in steps 110 to 140. Therefore, the embodiments provided by the present application do not place high requirements on the number of reference image features in the reference image feature library.
Therefore, the image classification method provided by the embodiment of the application can select the reference image features with higher similarity with the image to be classified from the reference image feature library, and take the category corresponding to the selected reference image features as the category of the image to be classified. The problem that the target area is inaccurately positioned to cause inaccurate classification can be avoided, and the effect of classifying fine-grained images is further improved on the basis of classification.
Example two
Based on the foregoing embodiment, in the image classification method provided in the embodiment of the present application, before step 110, the image classification apparatus may further perform the following steps:
step 101, acquiring a plurality of reference images; wherein the reference image comprises images of a plurality of different sub-categories;
and 102, extracting the reference image characteristics of each reference image to obtain a reference image characteristic library.
It is understood that the image classification device may acquire a plurality of reference images in advance, process the plurality of reference images, and extract the reference image feature of each reference image to construct the reference image feature library. Here, the reference images may be selected images covering a plurality of sub-categories, where the images of the plurality of sub-categories may cover the fine-grained sub-categories under all coarse-grained categories; for example, the images may include different breeds of birds, different breeds of dogs, different varieties of apples, etc.
In the embodiments provided in the present application, the image classification apparatus may employ different types of feature extraction methods to extract the reference image features. Here, the feature extraction method may be a conventional feature extraction method, for example, extracting SIFT information and HOG information in an image to obtain an image feature, an SVM feature extraction method, or a CNN-based feature extraction method.
Here, the CNN-based feature extraction method can dynamically adjust CNN network model parameters through machine learning, and thus has a better inter-class distinction degree.
It should be noted that each reference image feature in the reference image feature library has a category label to which it belongs. In addition, in the embodiment provided by the application, the process of constructing the reference image feature library is generated off-line, that is, the reference image feature library is constructed in advance, so that the reference image feature library is directly used in the stage of determining the category of the image to be classified, and the image features of the reference image do not need to be extracted, so that the time cost can be reduced, and the image classification efficiency can be improved.
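A possible offline construction of the reference image feature library, storing one feature vector and one category label per reference image, might look like the following sketch; the extract_feature helper and the array layout are illustrative assumptions.

```python
# Sketch of the offline reference-library build; extract_feature is an assumed
# helper returning a fixed-length feature vector per image path.
import numpy as np

def build_reference_library(image_paths, labels, extract_feature):
    """image_paths: list of N reference image files covering several sub-categories.
    labels: list of N category labels, one per reference image.
    Returns (X, labels) where X is the (N, d) reference image feature library."""
    X = np.stack([np.asarray(extract_feature(p)) for p in image_paths])
    return X, np.asarray(labels)
```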
EXAMPLE III
Based on the foregoing embodiments, the image classification method provided in the embodiments of the present application may determine the initial similarity between the image feature to be classified and each reference image feature in the reference image feature library in different manners.
In one possible implementation, determining the initial similarity between the image feature to be classified and each reference image feature in the reference image feature library can be implemented by the following steps:
step 110a, calculating an incidence relation between the image features to be classified and each reference image feature in a reference image feature library;
and step 110b, determining the association relationship between the image features to be classified and each reference image feature as the initial similarity between the image features to be classified and each reference image feature.
Specifically, the image classification device may obtain the association relationship between the image feature to be classified and each reference image feature by calculating an euclidean distance, a hamming distance, or a cosine similarity between the image feature to be classified and the reference image feature.
Here, the association relationship between the image feature to be classified and each reference image feature obtained by calculation is used as the initial similarity, so that the iteration times of determining the target similarity based on the initial similarity can be reduced, and the image classification speed is improved.
In another possible implementation manner, determining an initial similarity between the image feature to be classified and each reference image feature in the reference image feature library can be further implemented by the following steps:
step 110c, generating an N-dimensional random vector; n is the total number of the reference image features in the reference image feature library;
and step 110d, determining elements in the N-dimensional random vector as the initial similarity between the image to be classified and each reference image in the reference image set.
That is to say, the image classification device can set the initial similarity between the image features to be classified and each reference image feature in the reference image feature library as an arbitrary vector, so that the calculation amount in the image processing process can be reduced, and the calculation complexity can be reduced.
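Both initialization options described above (the computed association relation, or an arbitrary random vector) can be sketched as follows; the helper name and the fixed random seed are assumptions made only for illustration.

```python
# Sketch: two ways to initialize the similarity vector f0 described above.
import numpy as np

def initial_similarity(y, X, use_association=True, seed=0):
    """y: (d,) feature of the image to be classified; X: (N, d) reference features.
    Returns f0, the N-dimensional initial similarity vector."""
    if use_association:
        # Option 1: association (cosine similarity) between y and every reference feature.
        yn = y / np.linalg.norm(y)
        Xn = X / np.linalg.norm(X, axis=1, keepdims=True)
        return Xn @ yn
    # Option 2: an arbitrary N-dimensional random vector.
    return np.random.default_rng(seed).random(X.shape[0])
```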
Example four
Based on the foregoing embodiments, in the image classification method provided in the embodiments of the present application, step 130 may include steps 1301-1303. Referring to fig. 2, fig. 2 is a flowchart illustrating an image classification method according to an embodiment of the present application, specifically, step 130 includes the following steps:
Step 1301, constructing an adjacency matrix of the reference image feature library according to the association relation between every two reference image features in the reference image feature library.

In the embodiment provided by the application, the image classification device can construct an undirected graph of the reference image feature library; here, the vertices of the undirected graph are the reference image features in the reference image feature library, and the edges of the undirected graph are the association relations between every two reference image features in the reference image feature library. The adjacency matrix of the reference image feature library is constructed from the edges of the undirected graph, and the adjacency matrix is a symmetric matrix.
Specifically, the reference image feature library X comprises N reference image features (x_1, x_2, …, x_N). The similarity a_hk between every two elements of X (e.g. x_h and x_k) is calculated, where h and k are integers greater than zero and less than or equal to N; the image classification device can then obtain the adjacency matrix A = (a_hk)_{N×N} from the similarity a_hk between every two reference image features. Thus A is a symmetric matrix, and also a positive definite matrix.
Step 1302, normalizing the adjacency matrix to obtain a metric matrix.

In the embodiment provided by the application, in order to process the data conveniently, the adjacency matrix may be normalized to obtain the metric matrix. Each element in the metric matrix can characterize the degree of correlation between every two reference image features in the reference image feature library.
In a possible implementation, the image classification apparatus may obtain a degree matrix of the undirected graph based on the adjacency matrix of the undirected graph constructed in step 1301, and obtain the metric matrix according to the degree matrix and the adjacency matrix.
Specifically, the elements of each column (or each row) of the adjacency matrix A are summed to obtain N numbers, which are placed on the diagonal of a matrix whose other elements are all zero, thereby forming an N-order diagonal matrix denoted as the degree matrix D.
Further, the metric matrix S can be obtained according to equation (1):

S = D^(-1/2) × A × D^(-1/2)    (1);

where A is the adjacency matrix of the reference image feature library constructed in step 1301, and D is the degree matrix of A. Thus, the normalized association relation between every two reference image features in the reference image feature library can be obtained through equation (1); the target similarity between the image features to be classified and the reference image features is then determined based on the metric matrix, which eliminates the influence of data magnitude on the similarity calculation and thereby improves the accuracy of subsequent image classification.
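A minimal sketch of equation (1), assuming the adjacency matrix A has already been computed, is given below; the small constant guarding against zero degrees is an implementation assumption.

```python
# Sketch of equation (1): S = D^(-1/2) · A · D^(-1/2).
import numpy as np

def metric_matrix(A: np.ndarray) -> np.ndarray:
    """A: (N, N) symmetric adjacency matrix of the reference feature library."""
    degrees = A.sum(axis=1)                        # row sums -> diagonal degree matrix D
    d_inv_sqrt = 1.0 / np.sqrt(np.maximum(degrees, 1e-12))
    D_inv_sqrt = np.diag(d_inv_sqrt)
    return D_inv_sqrt @ A @ D_inv_sqrt             # normalized metric matrix S
```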
And step 1303, determining the target similarity between the image to be classified and each reference image feature in the reference image feature library according to the image features to be classified, the initial similarity and the measurement matrix.
Specifically, the image classification device may gradually optimize and adjust the initial similarity based on the image feature to be classified and the metric matrix, so as to obtain the target similarity. The target similarity can accurately reflect the final similarity of the image to be classified and each reference image feature in the reference image feature library.
In the image classification method provided in the present application, step 1303 may include steps 1303a to 1303c, please refer to fig. 3, where fig. 3 is a schematic flow chart 3 of the image classification method provided in the embodiment of the present application, and specifically, step 1303 may specifically include the following steps:
in the embodiments provided by the present application, the vector y is used to represent the image features to be classified. Using vector f0Representing the initial similarity between the image to be classified and each reference image feature in the reference image feature library; in particular, the amount of the solvent to be used,whereinAnd representing the initial similarity between the image feature to be classified and the jth reference image feature in the reference image feature library. j is an integer of 1 or more and N or less.
Specifically, the image classification device may obtain the ith similarity according to formula (2);
f_i = α × S × f_(i-1) + (1 - α) × y    (2);
where α is a number greater than 0 and less than 1 representing a jump probability, and f_(i-1) is the (i-1)-th similarity. Step 1303a may be understood as the image classification apparatus performing a random walk in the undirected graph constructed in step 1301: with probability α the walk jumps to a neighboring vertex of the adjacency matrix, and with probability 1 - α it jumps back to the image feature to be classified.
Further, when i = T, the similarity can be expressed by formula (3):

f_T = (αS)^T × f_0 + (1 - α) × Σ_{t=0}^{T-1} (αS)^t × y    (3);

where T is any number of iterations. Thus, as can be seen from formula (3), the image classification apparatus can iteratively calculate the similarity between the image to be classified and each reference image feature based on the image feature y to be classified, the initial similarity f_0 and the metric matrix S.

Step 1303b, if the i-th similarity meets the convergence condition, determining the i-th similarity as the target similarity; the convergence condition is used to characterize that the difference between two adjacent similarities is smaller than a preset threshold.
And step 1303c, if the ith similarity does not meet the convergence condition, calculating to obtain the (i + 1) th similarity between the image to be classified and each reference image in the reference image set based on the image features to be classified, the ith similarity and the measurement matrix until the (i + N) th similarity meets the convergence condition, and determining the (i + N) th similarity as the target similarity.
Based on the above step 1303a, after each similarity is obtained through iterative computation, it is necessary to determine whether the currently calculated similarity satisfies the convergence condition, that is, whether the similarity between the image feature to be classified and each reference image feature tends to be stable. If the current similarity meets the convergence condition, the iteration stops, and the currently calculated similarity is taken as the target similarity. If the current similarity does not meet the convergence condition, the next similarity between the image to be classified and each reference image in the reference image set is calculated based on the current similarity, until the convergence condition is met.
In the embodiments provided by the present application, the image classification apparatus may set different convergence conditions to determine whether the similarity between the image feature to be classified and each reference image feature tends to be stable.
In a possible implementation manner, the image classification device may determine the difference between the current similarity and the previously calculated similarity; if the difference is smaller than a preset threshold, the similarity between the image features to be classified and each reference image feature tends to be stable, and the currently calculated similarity is therefore taken as the target similarity.
In another possible implementation manner, the image classification device may further judge whether the currently calculated similarity f* satisfies formula (4), so as to determine whether the currently calculated similarity has converged:
f* = (1 - α) × (I - αS)^(-1) × y    (4);
wherein I is an identity matrix.
Equation (4) can be demonstrated by equation (5) below:

lim_{T→∞} (αS)^T = 0,  lim_{T→∞} Σ_{t=0}^{T-1} (αS)^t = (I - αS)^(-1)    (5);
the parameters in formula (5) have the same meanings as those in the above description, and are not described herein again.
Here, substituting equation (5) into equation (3) proves the convergence equation (4) for the currently calculated similarity.
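The iteration of formula (2) and the closed form of formula (4) can be sketched as follows; here the vector y is taken to be the initial query-to-reference similarity vector (so that its dimension matches S), and α, the tolerance and the iteration cap are illustrative choices rather than values prescribed by this application.

```python
# Sketch: iterative update f_i = α·S·f_(i-1) + (1-α)·y with a convergence check,
# plus the closed form f* = (1-α)(I - αS)^(-1) y of equation (4).
import numpy as np

def target_similarity(S, y, f0=None, alpha=0.9, tol=1e-6, max_iter=1000):
    """S: (N, N) metric matrix; y: (N,) query vector (association between the
    image to be classified and each reference feature); f0 defaults to y."""
    f = y.copy() if f0 is None else f0.copy()
    for _ in range(max_iter):
        f_next = alpha * (S @ f) + (1.0 - alpha) * y
        if np.abs(f_next - f).max() < tol:      # adjacent similarities are stable
            return f_next
        f = f_next
    return f

def target_similarity_closed_form(S, y, alpha=0.9):
    """Equivalent closed form per equation (4)."""
    N = S.shape[0]
    return (1.0 - alpha) * np.linalg.solve(np.eye(N) - alpha * S, y)
```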
Based on the implementation of the above steps, it can be seen that the image classification method provided in the embodiment of the present application may configure initial similarity for the image features to be classified and each reference image feature, and further perform optimization adjustment on the initial similarity to obtain the optimal target similarity. Thus, the accuracy of image classification can be improved. In addition, the image classification method provided by the embodiment of the application does not need to construct a target area, is simple in algorithm model, and reduces the calculated amount in the image classification process to a certain extent.
EXAMPLE five
Based on the foregoing embodiments, the image classification method provided in the embodiments of the present application may obtain the target reference image feature by setting different first conditions.
In one possible implementation, the first condition includes being greater than a similarity threshold;
then, in step 140, a target reference image whose target similarity to the image to be classified satisfies a first condition is selected from the reference image feature library, specifically:
and selecting the reference image features with the target similarity greater than the similarity threshold value with the image features to be classified from the reference image feature library as the target reference image features.
For example, the similarity threshold may define a numerical range (0.98, 1); as long as the target similarity between the image to be classified and a certain reference image feature falls within this range, the reference image feature is considered to satisfy the first condition and is taken as a target reference image feature.
In another possible implementation manner, the first condition includes that the target similarity is a maximum value of target similarities corresponding to the reference image features;
correspondingly, step 140 selects a target reference image from the reference image feature library, the target similarity of which to the image features to be classified satisfies the first condition, and includes:
and selecting the reference image feature with the maximum target similarity with the image to be classified from the reference image feature library as the target reference image feature.
It can be understood that the image classification device may select, according to the target similarity, the reference image features with the maximum target similarity from the reference image feature library as the target reference image features. For example, the image classification device may sort the target similarities in descending order and select the first reference image feature as the target reference image feature; similarly, the image classification device may also select the first M reference image features as target reference image features, where M is an integer greater than 1.
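Both variants of the first condition (a similarity threshold, or the M largest target similarities) can be sketched as follows; the threshold value and M are illustrative.

```python
# Sketch: selecting target reference features and assigning the category.
import numpy as np

def classify(target_sim, labels, threshold=None, top_m=1):
    """target_sim: (N,) target similarities; labels: (N,) category per reference
    feature. Returns the predicted category of the image to be classified."""
    if threshold is not None:
        candidates = np.flatnonzero(target_sim > threshold)   # e.g. threshold = 0.98
        if candidates.size == 0:
            candidates = np.array([int(np.argmax(target_sim))])
    else:
        candidates = np.argsort(target_sim)[::-1][:top_m]     # M largest similarities
    # Any selected target reference feature's category may serve as the result;
    # here the most similar one is used.
    return labels[candidates[0]]
```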
Therefore, the image classification method provided by the embodiment of the application can select the reference image features with higher similarity with the image to be classified from the reference image feature library, and take the category corresponding to the selected reference image features as the category of the image to be classified. The problem that the target area is inaccurately positioned to cause inaccurate classification can be avoided, and the effect of classifying fine-grained images is further improved on the basis of classification.
EXAMPLE six
Based on the above embodiment, the embodiment provided by the application can be understood as applying an extended query method to a fine-grained image classification task, searching one or more reference images most similar to an image to be classified from a reference image database through the extended query method, and indirectly determining the category of the image to be classified according to the most similar reference images. Specifically, referring to the schematic flow chart of the image classification method shown in fig. 4(a) and the schematic view of the scene architecture of the image classification method shown in fig. 4(b), the image classification method based on the extended query includes the following steps:
Step a, acquiring a reference image set K, where K = (k_1, k_2, …, k_N).
As shown in fig. 4(b), images of a plurality of different sub-categories may be included in the reference image set K; such as images of different kinds of dogs.
In the embodiments provided in the present application, the reference image set refers to images stored in the storage device in advance. The image acquisition device may retrieve the set of reference images from the storage device prior to performing the image classification.
Step b, extracting the reference image feature x_i of each reference image k_i, where i is an integer greater than or equal to 1 and less than or equal to N.
Here, the image classification apparatus may acquire the reference image feature x_i of each reference image based on a CNN or a conventional feature extraction method.
Step c, constructing a reference image feature library X = (x_1, x_2, …, x_N) based on each reference image feature x_i.
And d, acquiring an image to be classified.
Here, as shown in fig. 4(b), the image to be classified may be an image captured by an image capturing device (e.g., a camera).
And e, extracting the image characteristics y to be classified of the image to be classified.
Here, the processor of the image classification device performs feature extraction on the image to be classified to obtain the feature y of the image to be classified.
And f, according to the image features y to be classified, carrying out extended query on the reference image feature library X to obtain target reference image features.
And g, determining the category of the image features to be classified based on the target reference image features.
Next, the extended query method in step f is described in detail, please refer to the flowchart of the extended query method shown in fig. 5(a) and the architecture diagram of the extended query scenario shown in fig. 5 (b).
And f1, calculating the association relation between every two elements in the reference image feature library X to obtain an adjacency matrix A.
Specifically, the image classification device calculates the similarity a_hk between every two elements of the reference image feature library X = (x_1, x_2, …, x_N) (e.g. x_h and x_k), where h and k are integers greater than zero and less than or equal to N; the image classification device can then obtain the adjacency matrix A = (a_hk)_{N×N} from the similarity a_hk between every two reference image features.
And f2, normalizing the adjacency matrix A to obtain a measurement matrix S.
Specifically, the process of normalizing adjacency matrix a is as follows: and constructing an undirected graph of the reference image feature library, wherein A is an adjacency matrix between nodes of the graph. Adding elements of each column or each row in the adjacent matrix A to obtain N numbers, putting the N numbers on diagonal lines of the matrix, and setting other elements of the matrix to be zero, thereby forming an N-order diagonal matrix which is marked as a degree matrix D.
Further, the metric matrix S = D^(-1/2) × A × D^(-1/2) is obtained. S can be understood as an affinity matrix of the undirected graph, i.e. a metric matrix obtained by a normalizing transformation of the adjacency matrix elements.
Step f3, initializing the initial similarity f_0 between the image feature y to be classified and each reference image feature in the reference image feature library X.
Specifically, the initial similarity may be the calculated association relation between the image features to be classified and each reference image feature, or may be an arbitrary vector.
And f4, calculating the ith similarity between the image to be classified and each reference image in the reference image set according to the characteristics of the image to be classified, the initial similarity and the measurement matrix.
Specifically, the ith similarity is obtained by the following formula:
f_i = α × S × f_(i-1) + (1 - α) × y;
It can be understood that a random walk is performed in the constructed undirected graph: with probability α the walk jumps to an adjacent vertex of the adjacency matrix, and with probability 1 - α it jumps back to the image feature y to be classified.
Step f5, judging whether the i-th similarity meets the convergence condition f* = (1 - α) × (I - αS)^(-1) × y;
If the convergence condition is not satisfied, i is set to i + 1, and the process returns to step f4.
If the convergence condition is satisfied, go to step f6.
And f6, taking the current ith similarity as the target similarity.
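A self-contained, end-to-end sketch of the extended-query classification described in steps a–g and f1–f6 is given below; cosine similarity, α and the convergence tolerance are assumptions made only for this illustration, and the toy data are hypothetical.

```python
# End-to-end sketch of extended-query classification (steps f1–f6 plus step g).
import numpy as np

def classify_by_extended_query(y_feat, X, labels, alpha=0.9, tol=1e-6, max_iter=1000):
    """y_feat: (d,) feature of the image to be classified.
    X: (N, d) reference image feature library; labels: (N,) category per row of X."""
    Xn = X / np.linalg.norm(X, axis=1, keepdims=True)
    # f1: adjacency matrix A from pairwise cosine similarity.
    A = Xn @ Xn.T
    np.fill_diagonal(A, 0.0)
    # f2: metric matrix S = D^(-1/2) A D^(-1/2).
    d_inv_sqrt = 1.0 / np.sqrt(np.maximum(A.sum(axis=1), 1e-12))
    S = d_inv_sqrt[:, None] * A * d_inv_sqrt[None, :]
    # f3: initial similarity between y and every reference feature.
    y = Xn @ (y_feat / np.linalg.norm(y_feat))
    f = y.copy()
    # f4–f6: iterate f_i = α S f_(i-1) + (1-α) y until convergence.
    for _ in range(max_iter):
        f_next = alpha * (S @ f) + (1.0 - alpha) * y
        if np.abs(f_next - f).max() < tol:
            f = f_next
            break
        f = f_next
    # step g: category of the most similar target reference feature.
    return labels[int(np.argmax(f))]

# Usage example with hypothetical toy data (3 reference features, 2 categories):
X = np.array([[1.0, 0.0], [0.9, 0.1], [0.0, 1.0]])
labels = np.array(["husky", "husky", "poodle"])
print(classify_by_extended_query(np.array([0.95, 0.05]), X, labels))  # -> "husky"
```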
Therefore, the image classification method provided by the embodiment of the application can select the reference image features with higher similarity with the image to be classified from the reference image feature library, and take the category corresponding to the selected reference image features as the category of the image to be classified. The problem that the target area is inaccurately positioned to cause inaccurate classification can be avoided, and the effect of classifying fine-grained images is further improved on the basis of classification.
EXAMPLE seven
Based on the foregoing embodiments, an embodiment of the present application provides an image classification apparatus, as shown in fig. 6, the image classification apparatus includes:
a feature extraction unit 61, an initial similarity determination unit 62, an acquisition unit 63, a target similarity determination unit 64, and a category determination unit 65; wherein,
a feature extraction unit 61 configured to extract an image feature to be classified of an image to be classified;
an initial similarity determining unit 62 configured to determine an initial similarity between the image feature to be classified and each reference image feature in the reference image feature library;
an obtaining unit 63 configured to obtain an association relationship between every two reference image features in the reference image feature library;
a target similarity determining unit 64 configured to determine a target similarity between the image features to be classified and each reference image feature according to the image features to be classified, the initial similarity and an association relationship between every two reference image features in the reference image feature library;
the category determining unit 65 is configured to select a target reference image feature, of which the target similarity with the image feature to be classified satisfies a first condition, from the reference image feature library, and determine a category corresponding to the target reference image feature as the category of the image to be classified.
In the embodiment provided by the present application, the initial similarity determining unit 62 is specifically configured to calculate an association relationship between the image feature to be classified and each reference image feature in the reference image feature library; and determining the association relationship between the image features to be classified and each reference image feature as the initial similarity between the image features to be classified and each reference image feature.
In the embodiment provided in the present application, the initial similarity determining unit 62 is specifically configured to generate an N-dimensional random vector; n is the total number of the reference image features in the reference image feature library; and determining elements in the N-dimensional random vector as the initial similarity between the image to be classified and each reference image in a reference image set.
In the embodiment provided by the present application, the target similarity determining unit 64 is configured to construct an adjacency matrix of the reference image feature library according to the association relationship between every two reference image features in the reference image feature library; carrying out normalization processing on the adjacency matrix to obtain a measurement matrix; the measurement matrix is used for characterizing the correlation degree between every two reference image features in the reference image feature library; and determining the target similarity between the image to be classified and each reference image feature in the reference image feature library according to the image features to be classified, the initial similarity and the measurement matrix.
In the embodiment provided by the present application, the target similarity determining unit 64 is specifically configured to calculate, according to the features of the image to be classified, the initial similarity and the metric matrix, an ith similarity between the image to be classified and each reference image in the reference image set; wherein i is an integer greater than or equal to 1;
if the ith similarity meets the convergence condition, determining the ith similarity as the target similarity; the convergence condition is used for representing that the difference value of two adjacent similarity degrees is smaller than a preset threshold value;
if the ith similarity does not meet the convergence condition, calculating to obtain the (i + 1) th similarity between the image to be classified and each reference image in the reference image set based on the image features to be classified, the ith similarity and the measurement matrix until the (i + N) th similarity meets the convergence condition, and determining the (i + N) th similarity as the target similarity.
In embodiments provided herein, the first condition comprises being greater than a similarity threshold;
correspondingly, the category determining unit 65 is configured to select, from the reference image feature library, a reference image feature having a target similarity greater than a similarity threshold with the image feature to be classified as the target reference image feature.
In an embodiment provided by the present application, the first condition includes that the target similarity is a maximum value of target similarities corresponding to the reference image features;
correspondingly, the category determining unit 65 is configured to select, as the target reference image feature, a reference image feature having a maximum target similarity with the image to be classified from the reference image feature library.
In the embodiment provided by the present application, the feature extraction unit 61 is further configured to acquire a plurality of reference images; wherein the reference image comprises images of a plurality of different sub-categories; and extracting the reference image characteristics of each reference image to obtain the reference image characteristic library.
Therefore, the image classification method provided by the embodiment of the application can select the reference image features with higher similarity with the image to be classified from the reference image feature library, and take the category corresponding to the selected reference image features as the category of the image to be classified. The problem that the target area is inaccurately positioned to cause inaccurate classification can be avoided, and the effect of classifying fine-grained images is further improved on the basis of classification.
Example eight
Based on the implementation of each unit in the image classification apparatus, in order to implement the image classification method provided in the embodiment of the present application, an embodiment of the present application further provides an electronic device, as shown in fig. 7, where the electronic device 70 includes: a processor 71 and a memory 72 configured to store a computer program capable of running on the processor,
wherein the processor 71 is configured to perform the method steps in the preceding embodiments when running the computer program.
In practice, of course, the various components of the electronic device 70 are coupled together by a bus system 73, as shown in FIG. 7. It will be appreciated that the bus system 73 is used to enable communications among the components. The bus system 73 includes a power bus, a control bus, and a status signal bus in addition to the data bus. For clarity of illustration, however, the various buses are labeled as bus system 73 in fig. 7.
In an exemplary embodiment, the present application further provides a computer readable storage medium, such as a memory 72, comprising a computer program, which is executable by a processor 71 of an electronic device 70 to perform the steps of the foregoing method. The computer-readable storage medium may be a magnetic random access Memory (FRAM), a Read Only Memory (ROM), a Programmable Read-Only Memory (PROM), an erasable Programmable Read-Only Memory (EPROM), an electrically erasable Programmable Read-Only Memory (EEPROM), a Flash Memory (Flash Memory), a magnetic surface Memory, an optical disk, or a Compact Disc Read-Only Memory (CD-ROM), among other memories.
As will be appreciated by one skilled in the art, embodiments of the present application may be provided as a method, system, or computer program product. Accordingly, the present application may take the form of a hardware embodiment, a software embodiment, or an embodiment combining software and hardware aspects. Furthermore, the present application may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, optical storage, and the like) having computer-usable program code embodied therein.
The present application is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the application. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
The above description is only a preferred embodiment of the present application, and is not intended to limit the scope of the present application.
Claims (18)
1. A method of image classification, the method comprising:
extracting an image feature to be classified from an image to be classified, and determining an initial similarity between the image feature to be classified and each reference image feature in a reference image feature library;
acquiring an association relation between every two reference image features in the reference image feature library;
determining a target similarity between the image feature to be classified and each reference image feature according to the image feature to be classified, the initial similarity and the association relation between every two reference image features;
and selecting, from the reference image feature library, a target reference image feature whose target similarity with the image feature to be classified meets a first condition, and determining the category corresponding to the target reference image feature as the category of the image to be classified.
2. The method of claim 1, wherein the determining an initial similarity between the image feature to be classified and each reference image feature in a reference image feature library comprises:
calculating an association relation between the image feature to be classified and each reference image feature in the reference image feature library;
and determining the association relation between the image feature to be classified and each reference image feature as the initial similarity between the image feature to be classified and each reference image feature.
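One plausible reading of claim 2, sketched below, is that the association relation used as the initial similarity is a cosine similarity between L2-normalized features; the specific measure and the function name `initial_similarity` are assumptions of this sketch, since the claim does not name a particular similarity measure.

```python
# Illustrative sketch of claim 2: initial similarity as cosine similarity (an assumption).
import numpy as np

def initial_similarity(query_feat: np.ndarray, ref_feats: np.ndarray) -> np.ndarray:
    q = query_feat / max(np.linalg.norm(query_feat), 1e-12)
    R = ref_feats / np.maximum(np.linalg.norm(ref_feats, axis=1, keepdims=True), 1e-12)
    return R @ q          # shape (N,): one initial similarity per reference image feature
```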
3. The method of claim 1, wherein the determining an initial similarity between the image feature to be classified and each reference image feature in a reference image feature library comprises:
generating an N-dimensional random vector; wherein N is the total number of the reference image features in the reference image feature library;
and determining the elements of the N-dimensional random vector as the initial similarities between the image to be classified and the respective reference images in a reference image set.
4. The method according to any one of claims 1 to 3, wherein the determining the target similarity between the image to be classified and each reference image feature in the reference image feature library according to the image feature to be classified, the initial similarity and the association relation between every two reference image features comprises:
constructing an adjacency matrix of the reference image feature library according to the association relation between every two reference image features in the reference image feature library;
carrying out normalization processing on the adjacency matrix to obtain a metric matrix; wherein the metric matrix is used for characterizing the degree of correlation between every two reference image features in the reference image feature library;
and determining the target similarity between the image to be classified and each reference image feature in the reference image feature library according to the image feature to be classified, the initial similarity and the metric matrix.
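A minimal sketch of the matrix construction in claim 4 follows; cosine adjacency and symmetric normalization D^(-1/2) A D^(-1/2) are illustrative assumptions, since the claim only requires that some adjacency matrix be normalized into a metric matrix.

```python
# Illustrative sketch of claim 4: adjacency matrix from pairwise association relations,
# normalized into a metric matrix.
import numpy as np

def metric_matrix(ref_feats: np.ndarray) -> np.ndarray:
    R = ref_feats / np.maximum(np.linalg.norm(ref_feats, axis=1, keepdims=True), 1e-12)
    A = np.clip(R @ R.T, 0.0, None)        # adjacency: pairwise association relations
    np.fill_diagonal(A, 0.0)               # no self-loops
    d = np.maximum(A.sum(axis=1), 1e-12)
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d))
    return D_inv_sqrt @ A @ D_inv_sqrt     # normalized metric matrix
```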
5. The method of claim 4, wherein the determining a target similarity between the image to be classified and each reference image feature in a reference image feature library according to the image feature to be classified, the initial similarity and the metric matrix comprises:
calculating an ith similarity between the image to be classified and each reference image in the reference image set according to the image feature to be classified, the initial similarity and the metric matrix; wherein i is an integer greater than or equal to 1;
if the ith similarity meets a convergence condition, determining the ith similarity as the target similarity; wherein the convergence condition is used for representing that the difference between two successive similarities is smaller than a preset threshold;
if the ith similarity does not meet the convergence condition, calculating an (i+1)th similarity between the image to be classified and each reference image in the reference image set based on the image feature to be classified, the ith similarity and the metric matrix, until an (i+N)th similarity meets the convergence condition, and determining the (i+N)th similarity as the target similarity.
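The iteration in claim 5 can be instantiated, for example, with a manifold-ranking-style update; the rule f_{i+1} = alpha * S @ f_i + (1 - alpha) * y below and the alpha, tol and max_iter parameters are assumptions of this sketch, as the claim fixes only the convergence test (two successive similarities differing by less than a preset threshold).

```python
# Illustrative sketch of claim 5: refine the similarity vector with the metric matrix S
# until two successive iterates differ by less than a preset threshold.
import numpy as np

def target_similarity(y: np.ndarray, S: np.ndarray, alpha=0.5, tol=1e-6, max_iter=1000):
    f = y.copy()                                  # i = 1: start from the initial similarity
    for _ in range(max_iter):
        f_next = alpha * (S @ f) + (1.0 - alpha) * y
        if np.abs(f_next - f).max() < tol:        # convergence condition
            return f_next                         # target similarity
        f = f_next
    return f                                      # fall back after max_iter iterations
```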
6. The method of claim 1, wherein the first condition comprises that the target similarity is greater than a similarity threshold;
the selecting, from the reference image feature library, a target reference image feature whose target similarity with the image feature to be classified meets the first condition comprises:
selecting, from the reference image feature library, the reference image features whose target similarity is greater than the similarity threshold as the target reference image features.
7. The method according to claim 1, wherein the first condition comprises that the target similarity is the maximum value among the target similarities corresponding to the reference image features;
the selecting, from the reference image feature library, a target reference image feature whose target similarity with the image feature to be classified meets the first condition comprises:
selecting, from the reference image feature library, the reference image feature having the maximum target similarity with the image to be classified as the target reference image feature.
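Putting the pieces together, a hedged end-to-end sketch of claims 1 and 7 might look as follows; it reuses the illustrative helpers `initial_similarity`, `metric_matrix` and `target_similarity` sketched after claims 2, 4 and 5 above, and `ref_labels` (the sub-category of each reference feature) is a hypothetical caller-supplied input.

```python
# End-to-end sketch (illustrative): classify an image by the category of the reference
# feature with the maximum target similarity, as required by the first condition of claim 7.
import numpy as np

def classify(query_feat, ref_feats, ref_labels):
    y = initial_similarity(query_feat, ref_feats)   # claim 2 sketch
    S = metric_matrix(ref_feats)                    # claim 4 sketch
    f = target_similarity(y, S)                     # claim 5 sketch
    return ref_labels[int(np.argmax(f))]            # maximum target similarity wins
```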
8. The method according to claim 1, wherein before extracting the image features to be classified of the image to be classified and determining the initial similarity between the image features to be classified and each reference image feature in the reference image feature library, the method further comprises:
acquiring a plurality of reference images; wherein the reference images comprise images of a plurality of different sub-categories;
and extracting the reference image features of each reference image to obtain the reference image feature library.
9. An image classification apparatus, comprising: the system comprises a feature extraction unit, an initial similarity determination unit, an acquisition unit, a target similarity determination unit and a category determination unit; wherein,
the feature extraction unit is configured to extract the features of the image to be classified;
the initial similarity determining unit is configured to determine an initial similarity between the image feature to be classified and each reference image feature in a reference image feature library;
the acquiring unit is configured to acquire an association relation between every two reference image features in the reference image feature library;
the target similarity determining unit is configured to determine a target similarity between the image feature to be classified and each reference image feature according to the image feature to be classified, the initial similarity and the association relation between every two reference image features;
the category determining unit is configured to select a target reference image feature, of which the target similarity with the image feature to be classified meets a first condition, from the reference image feature library, and determine a category corresponding to the target reference image feature as the category of the image to be classified.
10. The image classification device according to claim 9, wherein the initial similarity determination unit is specifically configured to calculate an association relation between the image feature to be classified and each reference image feature in the reference image feature library; and determine the association relation between the image feature to be classified and each reference image feature as the initial similarity between the image feature to be classified and each reference image feature.
11. The image classification apparatus according to claim 9, wherein the initial similarity determination unit is specifically configured to generate an N-dimensional random vector; n is the total number of the reference image features in the reference image feature library; and determining elements in the N-dimensional random vector as the initial similarity between the image to be classified and each reference image in a reference image set.
12. The image classification apparatus according to any one of claims 9 to 11,
the target similarity determining unit is configured to construct an adjacency matrix of the reference image feature library according to the association relation between every two reference image features in the reference image feature library; carry out normalization processing on the adjacency matrix to obtain a metric matrix, the metric matrix being used for characterizing the degree of correlation between every two reference image features in the reference image feature library; and determine the target similarity between the image to be classified and each reference image feature in the reference image feature library according to the image feature to be classified, the initial similarity and the metric matrix.
13. The image classification apparatus according to claim 12,
the target similarity determining unit is specifically configured to calculate an ith similarity between the image to be classified and each reference image in the reference image set according to the image feature to be classified, the initial similarity and the metric matrix; wherein i is an integer greater than or equal to 1;
if the ith similarity meets a convergence condition, determine the ith similarity as the target similarity; wherein the convergence condition is used for representing that the difference between two successive similarities is smaller than a preset threshold;
if the ith similarity does not meet the convergence condition, calculate an (i+1)th similarity between the image to be classified and each reference image in the reference image set based on the image feature to be classified, the ith similarity and the metric matrix, until an (i+N)th similarity meets the convergence condition, and determine the (i+N)th similarity as the target similarity.
14. The image classification device according to claim 9, wherein the first condition includes that the target similarity is greater than a similarity threshold;
the category determination unit is configured to select, from the reference image feature library, a reference image feature whose target similarity with the image feature to be classified is greater than the similarity threshold as the target reference image feature.
15. The image classification device according to claim 9, wherein the first condition includes that the target similarity is the maximum value among the target similarities corresponding to the reference image features;
the category determination unit is configured to select, from the reference image feature library, the reference image feature having the maximum target similarity with the image to be classified as the target reference image feature.
16. The image classification apparatus according to claim 9,
the feature extraction unit is further configured to acquire a plurality of reference images; wherein the reference images comprise images of a plurality of different sub-categories; and to extract the reference image features of each reference image to obtain the reference image feature library.
17. An electronic device comprising a processor, a memory, and a communication bus;
the communication bus is configured to realize connection communication between the processor and the memory;
the processor is configured to execute a program of the image classification method stored in the memory to implement the steps of the image classification method according to any one of claims 1 to 8.
18. A computer-readable storage medium, on which a computer program is stored, the computer program being executed by a processor to carry out the steps of the image classification method according to any one of claims 1 to 8.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202010112567.3A CN111325276A (en) | 2020-02-24 | 2020-02-24 | Image classification method and device, electronic equipment and computer-readable storage medium |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202010112567.3A CN111325276A (en) | 2020-02-24 | 2020-02-24 | Image classification method and device, electronic equipment and computer-readable storage medium |
Publications (1)
Publication Number | Publication Date |
---|---|
CN111325276A true CN111325276A (en) | 2020-06-23 |
Family
ID=71172863
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202010112567.3A Pending CN111325276A (en) | 2020-02-24 | 2020-02-24 | Image classification method and device, electronic equipment and computer-readable storage medium |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN111325276A (en) |
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN112116028A (en) * | 2020-09-29 | 2020-12-22 | 联想(北京)有限公司 | Model decision interpretation implementation method and device and computer equipment |
CN112307934A (en) * | 2020-10-27 | 2021-02-02 | 深圳市商汤科技有限公司 | Image detection method, and training method, device, equipment and medium of related model |
CN112668635A (en) * | 2020-12-25 | 2021-04-16 | 浙江大华技术股份有限公司 | Image archiving method, device, equipment and computer storage medium |
CN117251715A (en) * | 2023-11-17 | 2023-12-19 | 华芯程(杭州)科技有限公司 | Layout measurement area screening method and device, electronic equipment and storage medium |
Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2013087711A2 (en) * | 2011-12-13 | 2013-06-20 | Ats Group (Ip Holdings) Limited | Method and system for sensor classification |
CN108304882A (en) * | 2018-02-07 | 2018-07-20 | 腾讯科技(深圳)有限公司 | A kind of image classification method, device and server, user terminal, storage medium |
CN108614894A (en) * | 2018-05-10 | 2018-10-02 | 西南交通大学 | A kind of face recognition database's constructive method based on maximum spanning tree |
US20180373925A1 (en) * | 2017-06-22 | 2018-12-27 | Koninklijke Philips N.V. | Subject identification systems and methods |
CN109325518A (en) * | 2018-08-20 | 2019-02-12 | Oppo广东移动通信有限公司 | Classification method, device, electronic equipment and the computer readable storage medium of image |
CN110276406A (en) * | 2019-06-26 | 2019-09-24 | 腾讯科技(深圳)有限公司 | Expression classification method, apparatus, computer equipment and storage medium |
CN110781957A (en) * | 2019-10-24 | 2020-02-11 | 深圳市商汤科技有限公司 | Image processing method and device, electronic equipment and storage medium |
2020-02-24: Application CN202010112567.3A filed in China; published as CN111325276A; legal status: Pending.
Patent Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2013087711A2 (en) * | 2011-12-13 | 2013-06-20 | Ats Group (Ip Holdings) Limited | Method and system for sensor classification |
US20180373925A1 (en) * | 2017-06-22 | 2018-12-27 | Koninklijke Philips N.V. | Subject identification systems and methods |
CN108304882A (en) * | 2018-02-07 | 2018-07-20 | 腾讯科技(深圳)有限公司 | A kind of image classification method, device and server, user terminal, storage medium |
CN108614894A (en) * | 2018-05-10 | 2018-10-02 | 西南交通大学 | A kind of face recognition database's constructive method based on maximum spanning tree |
CN109325518A (en) * | 2018-08-20 | 2019-02-12 | Oppo广东移动通信有限公司 | Classification method, device, electronic equipment and the computer readable storage medium of image |
CN110276406A (en) * | 2019-06-26 | 2019-09-24 | 腾讯科技(深圳)有限公司 | Expression classification method, apparatus, computer equipment and storage medium |
CN110781957A (en) * | 2019-10-24 | 2020-02-11 | 深圳市商汤科技有限公司 | Image processing method and device, electronic equipment and storage medium |
Non-Patent Citations (1)
Title |
---|
李青彦 (Li Qingyan) et al.: "Image classification algorithm based on constructing a spatial pyramid metric matrix" *
Cited By (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN112116028A (en) * | 2020-09-29 | 2020-12-22 | 联想(北京)有限公司 | Model decision interpretation implementation method and device and computer equipment |
CN112116028B (en) * | 2020-09-29 | 2024-04-26 | 联想(北京)有限公司 | Model decision interpretation realization method and device and computer equipment |
CN112307934A (en) * | 2020-10-27 | 2021-02-02 | 深圳市商汤科技有限公司 | Image detection method, and training method, device, equipment and medium of related model |
CN112307934B (en) * | 2020-10-27 | 2021-11-09 | 深圳市商汤科技有限公司 | Image detection method, and training method, device, equipment and medium of related model |
TWI754515B (en) * | 2020-10-27 | 2022-02-01 | 大陸商深圳市商湯科技有限公司 | Image detection and related model training method, equipment and computer readable storage medium |
WO2022088411A1 (en) * | 2020-10-27 | 2022-05-05 | 深圳市商汤科技有限公司 | Image detection method and apparatus, related model training method and apparatus, and device, medium and program |
CN112668635A (en) * | 2020-12-25 | 2021-04-16 | 浙江大华技术股份有限公司 | Image archiving method, device, equipment and computer storage medium |
CN117251715A (en) * | 2023-11-17 | 2023-12-19 | 华芯程(杭州)科技有限公司 | Layout measurement area screening method and device, electronic equipment and storage medium |
CN117251715B (en) * | 2023-11-17 | 2024-03-19 | 华芯程(杭州)科技有限公司 | Layout measurement area screening method and device, electronic equipment and storage medium |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN111325276A (en) | Image classification method and device, electronic equipment and computer-readable storage medium | |
CN110175615B (en) | Model training method, domain-adaptive visual position identification method and device | |
Lee et al. | Place recognition using straight lines for vision-based SLAM | |
CN111340097B (en) | Image fine granularity classification method, device, storage medium and equipment | |
CN114358205B (en) | Model training method, model training device, terminal device and storage medium | |
Wu et al. | Improving pedestrian detection with selective gradient self-similarity feature | |
CN114092873B (en) | Long-term cross-camera target association method and system based on appearance and morphological decoupling | |
WO2015146113A1 (en) | Identification dictionary learning system, identification dictionary learning method, and recording medium | |
CN112560787A (en) | Pedestrian re-identification matching boundary threshold setting method and device and related components | |
Gorokhovatskyi et al. | Application a Committee of Kohonen Neural Networks to Training of Image Classifier Based on Description of Descriptors Set | |
CN113255828B (en) | Feature retrieval method, device, equipment and computer storage medium | |
Gao et al. | An improved XGBoost based on weighted column subsampling for object classification | |
CN111753583A (en) | Identification method and device | |
CN117893839B (en) | Multi-label classification method and system based on graph attention mechanism | |
Jiang et al. | Weakly-supervised vehicle detection and classification by convolutional neural network | |
CN115984671A (en) | Model online updating method and device, electronic equipment and readable storage medium | |
Campos et al. | Global localization with non-quantized local image features | |
Farfan-Escobedo et al. | Towards accurate building recognition using convolutional neural networks | |
Nie et al. | Using an improved SIFT algorithm and fuzzy closed-loop control strategy for object recognition in cluttered scenes | |
Sanin et al. | K-tangent spaces on Riemannian manifolds for improved pedestrian detection | |
CN112766423B (en) | Training method and device for face recognition model, computer equipment and storage medium | |
Patsei et al. | Multi-class object classification model based on error-correcting output codes | |
CN112906724B (en) | Image processing device, method, medium and system | |
Weng et al. | Random VLAD based deep hashing for efficient image retrieval | |
CN114842251A (en) | Training method and device of image classification model, image processing method and device and computing equipment |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||