CN113743523B - Visual multi-feature guided construction waste fine classification method - Google Patents

Visual multi-feature guided construction waste fine classification method

Info

Publication number
CN113743523B
CN113743523B (application CN202111071050.5A)
Authority
CN
China
Prior art keywords
classification
color
visual
image
matrix
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202111071050.5A
Other languages
Chinese (zh)
Other versions
CN113743523A (en)
Inventor
宋琳
赵慧轩
马宗方
宋琪
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Xian University of Architecture and Technology
Original Assignee
Xian University of Architecture and Technology
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Xian University of Architecture and Technology filed Critical Xian University of Architecture and Technology
Priority to CN202111071050.5A priority Critical patent/CN113743523B/en
Publication of CN113743523A publication Critical patent/CN113743523A/en
Application granted granted Critical
Publication of CN113743523B publication Critical patent/CN113743523B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 - Pattern recognition
    • G06F18/20 - Analysing
    • G06F18/24 - Classification techniques
    • G06F18/241 - Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 - Pattern recognition
    • G06F18/20 - Analysing
    • G06F18/23 - Clustering techniques
    • G06F18/232 - Non-hierarchical techniques
    • G06F18/2321 - Non-hierarchical techniques using statistics or function optimisation, e.g. modelling of probability density functions
    • G06F18/23213 - Non-hierarchical techniques using statistics or function optimisation, e.g. modelling of probability density functions with fixed number of clusters, e.g. K-means clustering

Landscapes

  • Engineering & Computer Science (AREA)
  • Data Mining & Analysis (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Artificial Intelligence (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Evolutionary Biology (AREA)
  • Evolutionary Computation (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Probability & Statistics with Applications (AREA)
  • Image Analysis (AREA)

Abstract

A visual multi-feature guided construction waste fine classification method comprises the following steps: step 1) collecting samples with a visual sensor and preprocessing them; step 2) extracting the color features of a sample from the global information of the image and encoding them; step 3) extracting the texture features of the sample from the local information of the image and encoding them; step 4) constructing the color-texture statistical features of the building material; step 5) inputting the color features and the color-texture statistical features into classifiers respectively, training classification models, and obtaining two evidence matrices from different feature spaces; step 6) designing an effective decision model from the evidence matrices and outputting a decision matrix; step 7) determining the label of the material to be sorted from the most probable category in the decision matrix. The method can effectively build the classification model, determine the class label of the material, and achieve accurate identification of the target.

Description

Visual multi-feature guided construction waste fine classification method
Technical Field
The invention belongs to the technical field of pattern recognition and machine vision, and particularly relates to a visual multi-feature guided construction waste fine classification method.
Background
Most existing recycling equipment relies on procedures such as iron removal, crushing, screening and magnetic separation, which are inefficient and operationally complex. Meanwhile, the reliance on manual sorting increases the cost of resource recycling, and the harsh working environment endangers workers' health. Establishing efficient and rapid sorting equipment has therefore become an important way to increase the utilization rate of construction waste.
As machine vision technology has matured over the last decade, pattern recognition theory has been applied across industries by virtue of its unique advantages. In the field of construction waste identification, some researchers have proposed separating bricks and stones of similar density using color features, and others have proposed classifying building materials by features such as volume and weight. However, these methods address only two-class problems, scale poorly, and cannot meet the industrial need to sort many kinds of materials.
Disclosure of Invention
In order to overcome the defects of the prior art, the invention aims to provide a visual multi-feature guided construction waste fine classification method which, based on pattern recognition and machine vision theory, extracts information from construction materials and constructs new salient features, thereby achieving accurate classification of multiple materials.
In order to achieve the above purpose, the technical scheme adopted by the invention is as follows:
a visual multi-feature guided construction waste fine classification method comprises the following steps of;
step one, acquiring a construction waste target sample image through a visual sensor, and binarizing the construction waste target sample image;
step two, extracting the color features of the sample from the global information of the image and encoding them to form the color feature vector;
step three, extracting the texture features of the sample from the local information of the image and encoding them to form the texture feature vector;
step four, constructing the color-texture statistical feature of the building material from the color and texture feature vectors;
step five, inputting the color features and the color-texture statistical features into classifiers respectively and training classification models; a test image passed through the classification models yields two evidence matrices;
step six, designing a decision model according to the evidence matrices of step five and outputting a decision matrix;
step seven, determining the label of the material to be sorted from the highest-probability category in the decision matrix.
The method for extracting and encoding the color features of the sample in the second step comprises the following steps:
Firstly, the RGB image acquired by the visual sensor is converted to the HSV color space, where H denotes hue, S saturation and V brightness, and the H-channel information is extracted to represent the color of the object according to the standard hue conversion:
H = 0 if max = min;
H = 60° × (G - B)/(max - min) mod 360° if max = R;
H = 60° × (B - R)/(max - min) + 120° if max = G;
H = 60° × (R - G)/(max - min) + 240° if max = B;
wherein max and min satisfy max = max(R, G, B) and min = min(R, G, B).
Then, histogram statistics are computed on the H-channel information to obtain the color feature vector.
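As an illustrative sketch of this step (assuming Python with OpenCV and NumPy; the bin count and the histogram normalization are choices made here, not specified above), the color feature can be computed as follows:

```python
import cv2
import numpy as np

def extract_color_feature(bgr_image, bins=36):
    """Convert the image to HSV, keep the H channel and return its normalized histogram."""
    hsv = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2HSV)  # OpenCV stores 8-bit hue in [0, 179]
    h_channel = hsv[:, :, 0]
    hist, _ = np.histogram(h_channel, bins=bins, range=(0, 180))
    hist = hist.astype(np.float64)
    return hist / (hist.sum() + 1e-12)  # normalize so images of different sizes are comparable

# Example: f_color = extract_color_feature(cv2.imread("sample.jpg"))
```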
The method for extracting and encoding the texture features of the sample image in the third step comprises the following steps:
step 3.1, extracting local texture features:
firstly, the binary image from step one is divided into sub-regions of 32 × 32 pixels, the cell size is set to 16 × 16 pixels and the block size to 2 × 2 cells; for each pixel (x, y) in a cell, the gray values N(x, y+1), N(x, y-1), N(x+1, y) and N(x-1, y) of its four-neighborhood are extracted;
then, the gradient magnitude and gradient direction of each pixel are calculated as
m_x = N(x+1, y) - N(x-1, y), m_y = N(x, y+1) - N(x, y-1),
m(x, y) = sqrt(m_x^2 + m_y^2), θ(x, y) = arctan(m_y / m_x),
where m_y and m_x are the vertical gradient and the horizontal gradient of pixel (x, y), respectively;
finally, the oriented-gradient histograms of the cells within a block are concatenated, so the distribution of unsigned gradient directions of each sub-region is represented by a 1 × 36-dimensional vector.
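A minimal sketch of step 3.1 under the geometry stated above (32×32 sub-regions, 16×16 cells, 2×2 cells per block) is given below; the choice of 9 unsigned orientation bins is inferred from 36 / (2×2), and weighting each orientation vote by the gradient magnitude, as in standard HOG, is an assumption made here:

```python
import numpy as np

def block_descriptor(subregion, cell=16, bins=9):
    """Compute the 1x36 unsigned-gradient descriptor of one 32x32 sub-region."""
    sub = subregion.astype(np.float64)
    m_x = np.zeros_like(sub)
    m_y = np.zeros_like(sub)
    # central differences from the four-neighborhood; border pixels are left at zero
    m_x[:, 1:-1] = sub[:, 2:] - sub[:, :-2]   # horizontal gradient N(x+1,y) - N(x-1,y)
    m_y[1:-1, :] = sub[2:, :] - sub[:-2, :]   # vertical gradient N(x,y+1) - N(x,y-1)
    magnitude = np.sqrt(m_x ** 2 + m_y ** 2)
    angle = np.degrees(np.arctan2(m_y, m_x)) % 180.0  # unsigned direction in [0, 180)
    histograms = []
    for i in range(0, sub.shape[0], cell):
        for j in range(0, sub.shape[1], cell):
            mag = magnitude[i:i + cell, j:j + cell].ravel()
            ang = angle[i:i + cell, j:j + cell].ravel()
            hist, _ = np.histogram(ang, bins=bins, range=(0, 180), weights=mag)
            histograms.append(hist)
    return np.concatenate(histograms)  # 2x2 cells x 9 bins = 36 dimensions
```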
Step 3.2, constructing visual characteristic words:
the local texture features of each class of image are extracted according to step 3.1; the feature vectors of each class are then clustered into K clusters by the K-means clustering algorithm, and each cluster center is regarded as a visual word;
Step 3.3, texture feature coding:
all the visual words from step 3.2 form the visual word bag; for each vector calculated in step 3.1, the nearest visual word is found according to the nearest-neighbor principle, and the texture features are encoded over the word bag to obtain the texture feature vector.
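Steps 3.2 and 3.3 can be sketched as follows, assuming scikit-learn's KMeans for clustering and a normalized word-frequency histogram as the texture code; the number of visual words K is an illustrative choice:

```python
import numpy as np
from sklearn.cluster import KMeans

def build_word_bag(descriptor_list, k=50, seed=0):
    """Cluster all 1x36 block descriptors into K visual words (the cluster centers)."""
    all_descriptors = np.vstack(descriptor_list)
    kmeans = KMeans(n_clusters=k, random_state=seed, n_init=10).fit(all_descriptors)
    return kmeans.cluster_centers_

def encode_texture(descriptors, word_bag):
    """Assign each descriptor to its nearest visual word and return the word histogram."""
    distances = np.linalg.norm(descriptors[:, None, :] - word_bag[None, :, :], axis=2)
    nearest = distances.argmin(axis=1)
    hist = np.bincount(nearest, minlength=len(word_bag)).astype(np.float64)
    return hist / (hist.sum() + 1e-12)
```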
The fourth step is specifically as follows: the color-texture statistical feature of the building material is constructed as
F_ct = [n_1·F_c^T, n_2·F_t^T]^T,
where F_c and F_t are the color feature and the texture feature respectively, n_1 and n_2 are their weights, [·]^T is the matrix transpose, and n_1 = n_2 = 1.
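Read this way, the fusion is a weighted concatenation; a short sketch (the weights n1 = n2 = 1 follow the description above, the function name is ours) is:

```python
import numpy as np

def fuse_features(f_color, f_texture, n1=1.0, n2=1.0):
    """Concatenate the weighted color and texture features into the color-texture statistic."""
    return np.concatenate([n1 * f_color, n2 * f_texture])
```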
In the sixth step, a classification decision model is designed from the classification probability matrices output by the classifiers in the fifth step, and the decision matrix is obtained as follows:
by color features alone, building materials can only be divided into a color-salient class (red brick and wood block) and an 'others' class (foam, hard plastic and concrete), so the color-based evidence matrix is two-dimensional; a distribution matrix is therefore introduced to redistribute the classification probabilities of the color-based evidence over the fine classes, and a linear fusion decision model fuses the redistributed color probabilities with the classification probabilities obtained from the color-texture statistical feature;
wherein ω1 and ω2 are the weights of the two classification probability matrices and represent the degree of trust placed in the classification results from the two feature spaces; the fused classification probability matrix is used to judge the final class of the material, and the construction waste class with the maximum probability is the final result;
when ω1 ∈ (0, 1), the method is called the visual multi-feature guided construction waste classification algorithm based on a joint decision model (VMF-J) and takes ω1 = 0.5 as the default value; when ω1 = 0, the final classification result depends entirely on the classification probabilities of the color-texture statistical feature, and the algorithm is then called the visual multi-feature guided construction waste fine classification algorithm (VMF); when ω1 = 1, i.e. the target is classified with color features only, the distribution matrix follows the equal-allocation principle, so an accurate class label cannot be obtained.
The final category of the material to be sorted is determined by the highest-probability class in the decision matrix.
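The exact fusion formula is not reproduced above, so the following sketch is only one plausible reading of the description: a 2×5 distribution matrix spreads each coarse color-class probability equally over its member fine classes (the equal-allocation principle), and the result is linearly combined with the color-texture probabilities using ω1 and ω2 = 1 - ω1. The class order and the ω2 = 1 - ω1 assumption are choices made here:

```python
import numpy as np

# fine classes: [red brick, wood block, foam, hard plastic, concrete]
# coarse color classes: [color-salient, others]
A = np.array([
    [1 / 2, 1 / 2, 0, 0, 0],      # color-salient -> red brick, wood block
    [0, 0, 1 / 3, 1 / 3, 1 / 3],  # others -> foam, hard plastic, concrete
])

def fuse_decisions(p_color, p_ct, omega1=0.5):
    """Fuse the 2-class color evidence with the 5-class color-texture evidence."""
    redistributed = p_color @ A                         # (n_samples, 2) -> (n_samples, 5)
    fused = omega1 * redistributed + (1.0 - omega1) * p_ct
    return fused.argmax(axis=1), fused                  # highest probability gives the label
```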
The invention has the beneficial effects that:
The invention accurately classifies multiple materials with machine vision technology, providing a solution for diversified market demands; by extracting features from both local and global perspectives of the target, a new color-texture statistical feature is constructed, giving a more comprehensive description of the material information; an efficient classification decision model effectively improves the recognition accuracy of the materials; and deep analysis and intelligent extraction of the salient features of the target speed up material recognition, enabling automatic and intelligent sorting of building materials and reducing their recycling cost.
Drawings
FIG. 1 is a schematic diagram of a portion of a sample set.
FIG. 2 is a flow chart of constructing the color-texture statistical feature.
FIG. 3 is a graph showing the effect of the ω1 value on classification accuracy.
FIG. 4 is a flow chart of the present invention.
Detailed Description
The present invention will be described in further detail with reference to examples.
As shown in fig. 1-4: the invention discloses a visual multi-feature guided construction waste fine classification method, which specifically comprises the following steps in combination with fig. 2:
Step one, samples of five materials that account for a large share of construction waste in China are collected with a visual sensor, as shown in FIG. 1, and the collected data are preprocessed. The images of each class are then divided into five folds; four folds are chosen as the training set and the remaining images serve as the test set.
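A minimal sketch of this 4:1 split (assuming the image paths of each class are listed in a dictionary; the helper name and the random seed are illustrative) is:

```python
import random

def split_per_class(images_by_class, train_folds=4, total_folds=5, seed=0):
    """Reserve 4/5 of each class for training and the remaining 1/5 for testing."""
    rng = random.Random(seed)
    train, test = [], []
    for label, paths in images_by_class.items():
        shuffled = list(paths)
        rng.shuffle(shuffled)
        cut = len(shuffled) * train_folds // total_folds
        train += [(p, label) for p in shuffled[:cut]]
        test += [(p, label) for p in shuffled[cut:]]
    return train, test
```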
The specific method for extracting the color characteristics of the sample and coding based on the global information of the image comprises the following steps:
First, the image is converted from the RGB color space to the HSV color space and the H-channel information is extracted. Then, histogram statistics are computed on the H channel to obtain the color feature vector.
The specific method for extracting texture features of the sample and coding based on the local information of the image comprises the following steps:
Step 3.1, extracting image texture features. First, the RGB image is binarized. The binary image is then divided into sub-regions of 32 × 32 pixels. The cell size is set to 16 × 16 pixels and the block size to 2 × 2 cells, and the gray values N(x, y+1), N(x, y-1), N(x+1, y) and N(x-1, y) in the four-neighborhood of each pixel (x, y) are extracted. Then, the gradient magnitude and gradient direction of each pixel are calculated by the formulas given above, where m_y and m_x are the vertical gradient and the horizontal gradient of pixel (x, y), respectively. Finally, the oriented-gradient histograms of the cells within a block are concatenated, and the distribution of unsigned gradient directions of each sub-region is represented by a 1 × 36-dimensional vector.
Step 3.2, constructing the visual word bag. The local texture features of each class of image are extracted according to step 3.1. The feature vectors of each class are then clustered into K clusters by the K-means clustering algorithm, and each cluster center is regarded as a visual word. Finally, the visual words of all classes constitute the visual word bag.
Step 3.3, texture feature coding. For each vector calculated in step 3.1, the nearest visual word in the word bag is found according to the nearest-neighbor principle, and the texture features are encoded to form the texture feature vector.
Step four, the color feature and the texture feature are fused to construct the color-texture statistical feature of the building material. The specific method is as follows:
for any image in the sample space S, the color feature F_c and the texture feature F_t are fused to form the new color-texture statistical feature
F_ct = [n_1·F_c^T, n_2·F_t^T]^T,
where n_1 and n_2 are the weights of the color feature and the texture feature of the sample image respectively, [·]^T is the matrix transpose, and n_1 = n_2 = 1.
Step five, the color feature and the color-texture statistical feature are input into classifiers respectively, classification models are trained, and two classification probability matrices from the two different feature spaces are obtained.
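A minimal sketch of this step with scikit-learn support vector machines as the classifiers (the experiments below also use RF and KNN; probability outputs and default hyper-parameters are illustrative choices) is:

```python
from sklearn.svm import SVC

def train_two_models(X_color, y_coarse, X_ct, y_fine):
    """Train one classifier on color features (two coarse classes) and one on
    color-texture features (five fine classes); both output class probabilities."""
    clf_color = SVC(probability=True).fit(X_color, y_coarse)
    clf_ct = SVC(probability=True).fit(X_ct, y_fine)
    return clf_color, clf_ct

def evidence_matrices(clf_color, clf_ct, X_color_test, X_ct_test):
    """Return the two evidence (classification probability) matrices for the test set."""
    return clf_color.predict_proba(X_color_test), clf_ct.predict_proba(X_ct_test)
```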
Step six, according to the matrices obtained in step five, a linear fusion decision model is introduced to fuse the two classification probability matrices into the decision matrix. The specific method is as follows:
Step 6.1, designing the classification decision model. Since color features alone can only divide building materials into a color-salient class (red brick and wood block) and an 'others' class (foam, hard plastic and concrete), the color-based classification probability matrix is two-dimensional. To improve classification accuracy, a distribution matrix is introduced to redistribute the color-based classification probabilities over the fine classes, and a linear fusion decision model fuses them with the classification probabilities from the color-texture statistical feature using weights ω1 and ω2.
Here ω1 and ω2 represent the confidence of the decision model in the evidence from the two feature spaces; the fused classification probability matrix is used to judge the final class of the material, and the construction waste class with the maximum probability is the final result. When ω1 = 0, the final classification result depends entirely on the classification probabilities of the color-texture statistical feature, and the algorithm is then referred to as the visual multi-feature guided construction waste fine classification algorithm (VMF). When ω1 = 1, the target is classified by the color feature alone; since the distribution matrix follows the equal-allocation principle, an accurate class label cannot be obtained.
Step 6.2, parameter tuning of the classification decision model. The effect of ω1 on classification accuracy is shown in FIG. 3. As ω1 increases, the recognition rate shows two trends: (1) it first increases and then decreases, as in FIG. 3(3); (2) it stays constant and then decreases, as in FIG. 3(1)-(2). When ω1 ∈ (0, 1), the algorithm is called the visual multi-feature guided construction waste classification algorithm based on a joint decision model (VMF-J). Experiments show that the classification accuracy of the model is best when ω1 = 0.5, so the invention takes ω1 = 0.5 as the default value.
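The tuning experiment can be sketched as follows, reusing the hypothetical fuse_decisions helper from the decision-model sketch above; the grid of candidate ω1 values is illustrative, and y_true must be integer labels in the same class order as that helper:

```python
import numpy as np

def tune_omega1(p_color, p_ct, y_true, grid=np.linspace(0.0, 1.0, 21)):
    """Evaluate classification accuracy for each candidate omega1 and return the best value."""
    accuracies = []
    for w in grid:
        labels, _ = fuse_decisions(p_color, p_ct, omega1=w)
        accuracies.append(float((labels == np.asarray(y_true)).mean()))
    best = int(np.argmax(accuracies))
    return grid[best], accuracies
```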
Step seven, the final category of the material to be sorted is determined by the highest-probability class in the decision matrix.
To verify the effectiveness of the visual multi-feature guided construction waste fine classification algorithm, the classification performance of the VMF and VMF-J algorithms was tested with three common classifiers (SVM, RF, KNN); the experimental results are shown in Tables 1, 2 and 3. Further, the VMF and VMF-J algorithms were compared with Lab, RGB and HSV threshold segmentation algorithms; the results are shown in Table 4.
Table 1 Confusion matrix for construction waste recognition with the SVM classifier
Table 2 Confusion matrix for construction waste recognition with the KNN classifier
Table 3 Confusion matrix for construction waste recognition with the RF classifier
Table 4 Comparison of the recognition performance of the algorithms

Claims (5)

1. A visual multi-feature guided construction waste fine classification method, characterized by comprising the following steps:
step one, acquiring a construction waste target sample image through a visual sensor, and binarizing the construction waste target sample image;
step two, extracting the color features of the sample from the global information of the image and encoding them to form the color feature vector;
step three, extracting the texture features of the sample from the local information of the image and encoding them to form the texture feature vector;
step four, constructing the color-texture statistical feature of the building material from the color and texture feature vectors;
step five, inputting the color features and the color-texture statistical features into classifiers respectively and training classification models; a test image passed through the classification models yields two evidence matrices;
step six, designing a decision model according to the evidence matrices of step five and outputting a decision matrix;
step seven, determining the label of the material to be sorted from the highest-probability category in the decision matrix.
2. The method for finely classifying construction waste guided by visual multi-feature according to claim 1, wherein the method for extracting and encoding the color features of the sample in the second step comprises the following steps:
Firstly, the RGB image acquired by the visual sensor is converted to the HSV color space, where H denotes hue, S saturation and V brightness, and the H-channel information is extracted to represent the color of the object according to the standard hue conversion:
H = 0 if max = min;
H = 60° × (G - B)/(max - min) mod 360° if max = R;
H = 60° × (B - R)/(max - min) + 120° if max = G;
H = 60° × (R - G)/(max - min) + 240° if max = B;
wherein max and min satisfy max = max(R, G, B) and min = min(R, G, B);
then, histogram statistics are performed on the H-channel information to obtain the color feature vector.
3. The method for finely classifying construction waste guided by visual multi-feature according to claim 1, wherein the method for extracting and encoding texture features of the sample image in the third step comprises the following steps:
step 3.1, extracting local texture features:
firstly, the binary image from the first step is divided into sub-regions of 32 × 32 pixels, the cell size is set to 16 × 16 pixels and the block size to 2 × 2 cells; for each pixel (x, y) in a cell, the gray values N(x, y+1), N(x, y-1), N(x+1, y) and N(x-1, y) of its four-neighborhood are extracted;
then, the gradient magnitude and gradient direction of each pixel are calculated as
m_x = N(x+1, y) - N(x-1, y), m_y = N(x, y+1) - N(x, y-1),
m(x, y) = sqrt(m_x^2 + m_y^2), θ(x, y) = arctan(m_y / m_x),
where m_y and m_x are the vertical gradient and the horizontal gradient of pixel (x, y), respectively;
finally, the oriented-gradient histograms of the cells within a block are concatenated, so the distribution of unsigned gradient directions of each sub-region is represented by a 1 × 36-dimensional vector;
step 3.2, constructing visual characteristic words:
the local texture features of each class of image are extracted according to step 3.1; the feature vectors of each class are then clustered into K clusters by the K-means clustering algorithm, and each cluster center is regarded as a visual word;
Step 3.3, texture feature coding:
all the visual words from step 3.2 form the visual word bag; for each vector calculated in step 3.1, the nearest visual word is found according to the nearest-neighbor principle, and the texture features are encoded over the word bag to obtain the texture feature vector.
4. The method for finely classifying construction waste guided by visual multi-feature according to claim 1, wherein the fourth step is specifically: the color-texture statistical feature of the building material is constructed as
F_ct = [n_1·F_c^T, n_2·F_t^T]^T,
where n_1 and n_2 are the weights of the color feature F_c and the texture feature F_t respectively, [·]^T is the matrix transpose, and n_1 = n_2 = 1.
5. The method for finely classifying construction waste guided by visual multi-feature according to claim 1, wherein in the sixth step, a classification decision model is designed from the classification probability matrices output by the classifiers in the fifth step and the decision matrix is obtained as follows:
building materials are divided by color features into a color-salient class and an 'others' class, so the color-based evidence matrix is two-dimensional; a distribution matrix is introduced to redistribute the classification probabilities of the color-based evidence over the fine classes, and a linear fusion decision model fuses the redistributed color probabilities with the classification probabilities obtained from the color-texture statistical feature, with weights ω1 and ω2;
wherein ω1 and ω2 represent the degree of trust placed in the classification results from the two feature spaces; the fused classification probability matrix is used to judge the final class of the material, and the construction waste class with the maximum probability is the final result;
when ω1 ∈ (0, 1), the method is called the visual multi-feature guided construction waste joint decision classification algorithm and takes ω1 = 0.5 as the default value; when ω1 = 0, the final classification result depends entirely on the classification probabilities of the color-texture statistical feature, and the method is then called the visual multi-feature guided construction waste fine classification algorithm; when ω1 = 1, i.e. the target is classified with color features only, the distribution matrix follows the equal-allocation principle, so an accurate class label cannot be obtained.
CN202111071050.5A 2021-09-13 2021-09-13 Visual multi-feature guided construction waste fine classification method Active CN113743523B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111071050.5A CN113743523B (en) 2021-09-13 2021-09-13 Visual multi-feature guided construction waste fine classification method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202111071050.5A CN113743523B (en) 2021-09-13 2021-09-13 Visual multi-feature guided construction waste fine classification method

Publications (2)

Publication Number Publication Date
CN113743523A CN113743523A (en) 2021-12-03
CN113743523B true CN113743523B (en) 2024-05-14

Family

ID=78738418

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111071050.5A Active CN113743523B (en) 2021-09-13 2021-09-13 Visual multi-feature guided construction waste fine classification method

Country Status (1)

Country Link
CN (1) CN113743523B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115410050B (en) * 2022-11-02 2023-02-03 杭州华得森生物技术有限公司 Tumor cell detection equipment based on machine vision and method thereof
CN116051912B (en) * 2023-03-30 2023-06-16 深圳市衡骏环保科技有限公司 Intelligent identification and classification method for decoration garbage

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102622607A (en) * 2012-02-24 2012-08-01 河海大学 Remote sensing image classification method based on multi-feature fusion
CN106778810A (en) * 2016-11-23 2017-05-31 北京联合大学 Original image layer fusion method and system based on RGB feature Yu depth characteristic
CN107886095A (en) * 2016-09-29 2018-04-06 河南农业大学 A kind of classifying identification method merged based on machine vision and olfactory characteristic
CN110880019A (en) * 2019-10-30 2020-03-13 北京中科研究院 Method for adaptively training target domain classification model through unsupervised domain
CN111104943A (en) * 2019-12-17 2020-05-05 西安电子科技大学 Color image region-of-interest extraction method based on decision-level fusion
CN111401485A (en) * 2020-06-04 2020-07-10 深圳新视智科技术有限公司 Practical texture classification method
CN112488050A (en) * 2020-12-16 2021-03-12 安徽大学 Color and texture combined aerial image scene classification method and system

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6751354B2 (en) * 1999-03-11 2004-06-15 Fuji Xerox Co., Ltd Methods and apparatuses for video segmentation, classification, and retrieval using image class statistical models
US10062008B2 (en) * 2013-06-13 2018-08-28 Sicpa Holding Sa Image based object classification

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102622607A (en) * 2012-02-24 2012-08-01 河海大学 Remote sensing image classification method based on multi-feature fusion
CN107886095A (en) * 2016-09-29 2018-04-06 河南农业大学 A kind of classifying identification method merged based on machine vision and olfactory characteristic
CN106778810A (en) * 2016-11-23 2017-05-31 北京联合大学 Original image layer fusion method and system based on RGB feature Yu depth characteristic
CN110880019A (en) * 2019-10-30 2020-03-13 北京中科研究院 Method for adaptively training target domain classification model through unsupervised domain
CN111104943A (en) * 2019-12-17 2020-05-05 西安电子科技大学 Color image region-of-interest extraction method based on decision-level fusion
CN111401485A (en) * 2020-06-04 2020-07-10 深圳新视智科技术有限公司 Practical texture classification method
CN112488050A (en) * 2020-12-16 2021-03-12 安徽大学 Color and texture combined aerial image scene classification method and system

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Scene classification combining feature-level and decision-level fusion; He Gang; Huo Hong; Fang Tao; Journal of Computer Applications; 2016-05-31 (No. 05); pp. 1262-1266 *
Discussion on waste analysis and recognition based on computer vision; Wan Meifen; Computer Knowledge and Technology; 2020-08-31; Vol. 16 (No. 24); pp. 189-190 *

Also Published As

Publication number Publication date
CN113743523A (en) 2021-12-03

Similar Documents

Publication Publication Date Title
CN111127499B (en) Safety inspection image cutter detection and segmentation method based on semantic contour information
CN104573685B (en) A kind of natural scene Method for text detection based on linear structure extraction
JP5567448B2 (en) Image area dividing apparatus, image area dividing method, and image area dividing program
CN112241762B (en) Fine-grained identification method for pest and disease damage image classification
CN113743523B (en) Building rubbish fine classification method guided by visual multi-feature
CN105069466A (en) Pedestrian clothing color identification method based on digital image processing
CN101777125B (en) Method for supervising and classifying complex category of high-resolution remote sensing image
CN106919910B (en) Traffic sign identification method based on HOG-CTH combined features
CN104850854A (en) Talc ore product sorting processing method and talc ore product sorting system
CN108664969B (en) Road sign recognition method based on conditional random field
CN105069816B (en) A kind of method and system of inlet and outlet people flow rate statistical
CN106960176A (en) A kind of pedestrian's gender identification method based on transfinite learning machine and color characteristic fusion
CN105138975B (en) A kind of area of skin color of human body dividing method based on degree of depth conviction network
CN108647703A (en) A kind of type judgement method of the classification image library based on conspicuousness
CN108154158A (en) A kind of building image partition method applied towards augmented reality
CN104834891A (en) Method and system for filtering Chinese character image type spam
JP5464739B2 (en) Image area dividing apparatus, image area dividing method, and image area dividing program
Song et al. A new method of construction waste classification based on two-level fusion
CN112258525A (en) Image abundance statistics and population recognition algorithm based on bird high frame frequency sequence
CN105354547A (en) Pedestrian detection method in combination of texture and color features
CN115272778A (en) Recyclable garbage classification method and system based on RPA and computer vision
CN115393748A (en) Method for detecting infringement trademark based on Logo recognition
CN109829511B (en) Texture classification-based method for detecting cloud layer area in downward-looking infrared image
CN114580569A (en) Construction waste material visual identification method based on feature code fusion
Bairwa et al. Classification of Fruits Based on Shape, Color and Texture using Image Processing Techniques

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant