CN113506297B - Printing data identification method based on big data processing - Google Patents

Printing data identification method based on big data processing Download PDF

Info

Publication number
CN113506297B
CN113506297B CN202111063256.3A CN202111063256A CN113506297B CN 113506297 B CN113506297 B CN 113506297B CN 202111063256 A CN202111063256 A CN 202111063256A CN 113506297 B CN113506297 B CN 113506297B
Authority
CN
China
Prior art keywords
connected domain
image
label
printing
domain
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202111063256.3A
Other languages
Chinese (zh)
Other versions
CN113506297A (en
Inventor
葛峰
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nantong Tiancheng Packaging Co ltd
Original Assignee
Nantong Tiancheng Packaging Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nantong Tiancheng Packaging Co ltd filed Critical Nantong Tiancheng Packaging Co ltd
Priority to CN202111063256.3A priority Critical patent/CN113506297B/en
Publication of CN113506297A publication Critical patent/CN113506297A/en
Application granted granted Critical
Publication of CN113506297B publication Critical patent/CN113506297B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Images

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/0002Inspection of images, e.g. flaw detection
    • G06T7/0004Industrial image inspection
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/04Architecture, e.g. interconnection topology
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/08Learning methods
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00Image enhancement or restoration
    • G06T5/90Dynamic range modification of images or parts thereof
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/10Segmentation; Edge detection
    • G06T7/12Edge-based segmentation
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/10Segmentation; Edge detection
    • G06T7/13Edge detection
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/20Special algorithmic details
    • G06T2207/20081Training; Learning
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/20Special algorithmic details
    • G06T2207/20084Artificial neural networks [ANN]
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/20Special algorithmic details
    • G06T2207/20172Image enhancement details
    • G06T2207/20192Edge enhancement; Edge preservation
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30108Industrial image inspection
    • G06T2207/30144Printing quality

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • General Health & Medical Sciences (AREA)
  • Artificial Intelligence (AREA)
  • Computational Linguistics (AREA)
  • Data Mining & Analysis (AREA)
  • Evolutionary Computation (AREA)
  • Biomedical Technology (AREA)
  • Molecular Biology (AREA)
  • Computing Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • Biophysics (AREA)
  • Mathematical Physics (AREA)
  • Software Systems (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Health & Medical Sciences (AREA)
  • Quality & Reliability (AREA)
  • Image Analysis (AREA)
  • Image Processing (AREA)

Abstract

The invention discloses a printing data identification method based on big data processing, which comprises the following steps: segmenting the acquired RGB image by using a semantic segmentation technology to obtain a presswork image; step two: processing the standard image and the printing image to obtain respective image descriptions; step three: and comparing the image descriptions of the standard image and the printing image to judge the abnormal printing condition. Compared with the prior art, the invention has the beneficial effects that: according to the method, the edges in the pattern are extracted through edge detection, and then the defect detection is carried out by utilizing the change difference of the description of each connected domain, so that the interference of illumination is avoided, the reliability of the result is improved, and the abnormal condition is judged by using the description of the connected domain of the image content instead of the response value of the corresponding pixel.

Description

Printing data identification method based on big data processing
Technical Field
The invention relates to the field of big data processing, in particular to a printing data identification method based on big data processing.
Background
The existing method for detecting the defects of the printed matter is not only a subjective visual inspection method, but also is often to compare and analyze the data of the printed matter to be detected with standard template data, for example, a beam of light is projected on the printed matter in a colorimetric detection method, a tristimulus value of color is obtained through an instrument and converted into a numerical value which can be compared, and then the numerical value is compared with the value of a sample, and the place where the abnormality occurs is the position where the defect exists; and the shot image is compared with the standard image based on the difference of the image, but the methods are easily interfered by factors such as illumination and the like, and finally, the result is inaccurate. Because the standard template is only an electronic version file and is not interfered by the environment in many times, the image of the printed product is really acquired by a camera, and the intensity, the direction and the like of a light source in a real scene are random, the printed product is easily influenced by illumination when being directly compared, namely, the place where the difference is difficult to determine is a real printing defect or the difference of illumination.
Disclosure of Invention
The invention aims to solve the defects in the prior art, and provides a printing data identification method based on big data processing.
In order to achieve the purpose, the invention adopts the following technical scheme:
a printing data identification method based on big data processing comprises the following steps: segmenting the acquired RGB image by using a semantic segmentation technology to obtain a presswork image; step two: processing the standard image and the printing image to obtain respective image descriptions; step three: and comparing the image descriptions of the standard image and the printing image to judge the abnormal printing condition.
Further, the second step is specifically: carrying out edge detection on the printed matter image to obtain a corresponding edge image; extracting a closed connected domain in the edge image; obtaining a description thereof by connected domain analysis; the connected component description is combined to obtain the description of the printed product image.
Further, performing edge detection on the printed image, and obtaining a corresponding edge image specifically includes: inputting a printing image, carrying out gray processing on the image, and carrying out edge detection on the image by using a Canny operator to obtain a gradient edge, namely the edge of the pattern in the printing area.
Further, extracting a closed connected domain in the edge image specifically includes: analyzing the connected domain of the edge of the pattern in the printing area by using a seed filling method to obtain the connected domain with different labels
Figure DEST_PATH_IMAGE002
And obtaining the value of the maximum tag number
Figure DEST_PATH_IMAGE004
I.e. the total number.
Further, the descriptions thereof obtained by connected domain analysis are specifically: setting initial parameters, printing each connected domain of image
Figure DEST_PATH_IMAGE002A
(Co-occurrence of
Figure DEST_PATH_IMAGE004A
Number) of pixels, N =0, and L = 0. Limit coordinates are:
Figure DEST_PATH_IMAGE006
because the pixels of different images are different in size, the area parameter of the connected domain is represented by the ratio of the number of the pixels of the connected domain to the total number of the pixels of the whole image, that is, the area parameter S of the connected domain:
Figure DEST_PATH_IMAGE008
wherein N is the number of pixels of the connected domain,
Figure DEST_PATH_IMAGE010
the number of pixels of the whole image;
traversing pixel points of the image line by line to obtain: the pixel points of each row in the image have corresponding connected domain labels, and the form is as follows:
Figure DEST_PATH_IMAGE012
(ii) a Wherein 0 is background pixel, namely non-connected domain pixel, the number outside 0 is label number of corresponding connected domain, process the label value, obtain the hierarchical information of connected domain under this row, because a connected domain is a closed area, so pass from left to right on a row of pixel of the picture, its label number will appear at least twice, first for starting to enter this connected domain, second for leaving this connected domain, there is connected domain of nested structure, must be that the big connected domain includes the small connected domain, so the nested number of layers of connected domain will not change once determining, if it is determined that the number of layers of connected domain will not change once
Figure DEST_PATH_IMAGE014
Indicates that the nesting layer number of the corresponding connected domain is not determined when
Figure DEST_PATH_IMAGE016
In the process, the value of L is not required to be changed, a temporary variable C =0 is set, traversal is performed from left to right, the first non-0 number is recorded, the corresponding number in the sequence is 1, and the corresponding C is set1, this is due to the number of nested layers of connected domains labeled 1
Figure DEST_PATH_IMAGE014A
Therefore, the value of the nesting layer number L of the corresponding connected component is updated to C, which indicates that the maximum nesting layer number of the connected component is 1, and the recorded non-0 tag sequence is
Figure DEST_PATH_IMAGE018
(ii) a A second non-0 digit, the corresponding digit in the sequence being 3, the digit being recorded if no digit 3 is present in the recorded sequence of labels, the resulting sequence of labels being
Figure DEST_PATH_IMAGE020
At this time order
Figure DEST_PATH_IMAGE022
Indicating entry into a further nested connected region, the largest nesting level of connected regions due to tag number 3
Figure DEST_PATH_IMAGE014AA
Update
Figure DEST_PATH_IMAGE024
The maximum nesting layer number of the connected domain with the label number of 3 is 2; the third non-0 digit is 2, and the digit 2 is not recorded in the recorded non-0 label sequence, and the non-0 digit label sequence is the label sequence at this moment
Figure DEST_PATH_IMAGE026
Number
2 of nested layers of connected domains due to tag number 2
Figure DEST_PATH_IMAGE014AAA
Let us order
Figure DEST_PATH_IMAGE028
The maximum nesting layer number of the connected domain with the label number of 2 is 3; continuing the traversal, the fourth traversal has a non-0 digit of 2 due to the previously recorded sequence of digits
Figure DEST_PATH_IMAGE026A
There is already 2 in (1), which means that the traversal of the connected component labeled 2 has ended and is no longer recorded into the label sequence. Subtracting 1 from C, wherein C =2 at this time, namely the traversed pixel points are located in a connected domain with a nesting level of 2 at this time, and so on, when a new connected domain is encountered, namely a label number which does not exist in the recorded non-0 digital label sequence, adding 1 to C to indicate that the connected domain enters a nesting area of a deeper layer; every time a connected domain is left, C is reduced by 1, and the nested region of the previous layer is returned; in addition, when adding 1 to C each time, it needs to judge whether the nesting level L corresponding to the label number is 0 or not, until the pixels in the row are traversed, and the number of pixels corresponding to the label is increased each time one pixel point with the label is traversed
Figure DEST_PATH_IMAGE030
(ii) a Comparing the coordinates of the labeled pixels
Figure DEST_PATH_IMAGE032
And the existing
Figure DEST_PATH_IMAGE034
If, if
Figure DEST_PATH_IMAGE036
Then, then
Figure DEST_PATH_IMAGE038
Otherwise
Figure DEST_PATH_IMAGE040
Keeping the same; if it is
Figure DEST_PATH_IMAGE042
Then, then
Figure DEST_PATH_IMAGE044
Otherwise
Figure DEST_PATH_IMAGE046
Keeping the same; to pair
Figure DEST_PATH_IMAGE048
The same process is carried out; after traversing, obtaining the coordinates of the central point
Figure DEST_PATH_IMAGE050
Wherein
Figure DEST_PATH_IMAGE052
(ii) a To obtain
Figure DEST_PATH_IMAGE054
And the respective number of nesting layers L; calculating the integral characteristic value of each connected domain
Figure DEST_PATH_IMAGE056
And expressing the distance from the coordinates of the center point of the connected domain to the origin and the area of the connected domain:
Figure DEST_PATH_IMAGE058
get each connected domain
Figure DEST_PATH_IMAGE002AA
Description of the invention
Figure DEST_PATH_IMAGE060
Further, the description of the print image obtained by combining the descriptions of the connected domain is specifically as follows: the description of the whole image is represented as
Figure DEST_PATH_IMAGE062
;
Similarly, the above operations are repeated for the standard image to obtain the image description data of the standard image
Figure DEST_PATH_IMAGE064
Wherein
Figure DEST_PATH_IMAGE066
Is a connected component in the standard image,
Figure DEST_PATH_IMAGE068
is the number of connected domains.
Further, the third step is specifically: traversing the connected domain of the standard image and the printing image according to the layer number
Figure DEST_PATH_IMAGE070
Grouping and respectively calculating the number corresponding to each hierarchy; comparing the corresponding number of each level of the standard image and the printing image, wherein the number is consistent to be a normal condition, and the levels with inconsistent number need to further detect the connected domain in the level: setting corresponding connected domain flags in standard images
Figure DEST_PATH_IMAGE072
Integral feature difference
Figure DEST_PATH_IMAGE074
Corresponding connected domain marks in printed images
Figure DEST_PATH_IMAGE076
Searching the corresponding relation among all connected domains of the hierarchy: searching the integral characteristic value of a certain connected domain of the printing image and each connected domain in the standard image
Figure DEST_PATH_IMAGE056A
The difference between:
Figure DEST_PATH_IMAGE078
the difference being minimal, i.e.
Figure DEST_PATH_IMAGE080
Then, the two related connected domains are the label numbers corresponding to the printing images
Figure DEST_PATH_IMAGE082
Corresponding connected domain of
Figure DEST_PATH_IMAGE084
Tag number corresponding to standard image
Figure DEST_PATH_IMAGE086
The corresponding connected domain of (a);
if the standard image connected domain mark
Figure DEST_PATH_IMAGE072A
At this time, the connected component in the standard image is stored as the connected component without the corresponding connected component
Figure DEST_PATH_IMAGE088
If the standard image connected domain mark
Figure DEST_PATH_IMAGE090
Figure DEST_PATH_IMAGE092
Indicating that the connected component in the standard image currently has a corresponding connected component in the printed image
Figure DEST_PATH_IMAGE094
Comparing the connected component of the standard image
Figure DEST_PATH_IMAGE096
And corresponding connected domain
Figure DEST_PATH_IMAGE084A
Corresponding difference value
Figure DEST_PATH_IMAGE098
The size of (2):
if it is
Figure DEST_PATH_IMAGE100
Update
Figure DEST_PATH_IMAGE102
Value of (1), order
Figure DEST_PATH_IMAGE104
If it is
Figure DEST_PATH_IMAGE106
Obtaining corresponding connected domains in the printed image
Figure DEST_PATH_IMAGE094A
The area difference with the standard image connected domain; obtaining corresponding connected domains
Figure DEST_PATH_IMAGE084AA
The area difference with the standard image connected domain, if corresponding to the connected domain
Figure DEST_PATH_IMAGE084AAA
If the corresponding area difference is smaller, updating
Figure DEST_PATH_IMAGE102A
A value of (d); otherwise, it is not updated
Figure DEST_PATH_IMAGE102AA
A value of (d);
if it is
Figure DEST_PATH_IMAGE108
Then not update
Figure DEST_PATH_IMAGE102AAA
A value of (d); after traversing is finished, corresponding connected domain marks of connected domains in printed images
Figure DEST_PATH_IMAGE110
There are two cases:
Figure DEST_PATH_IMAGE112
;
Figure DEST_PATH_IMAGE076A
indicates the connected domainThe standard image has no corresponding connected domain, namely a defect area, the defect exists at the position of the pattern inner and pattern outer blank areas, the former has more nesting layers and does not influence the whole content of the printed product, so the influence degree of the defect is smaller, and the latter has larger degree, so the whole influence condition of the defect in the area is larger
Figure DEST_PATH_IMAGE114
Comprises the following steps:
Figure DEST_PATH_IMAGE116
Figure DEST_PATH_IMAGE118
comparing the area difference between the corresponding connected domains in the standard image, wherein the error is normal within 3 percent of the total area of the standard image due to the statistical error, and the condition that the pattern is lack of printing or is over printing is considered to exist if the error exceeds 3 percent, so that the overall influence condition of the defects of the regions is as follows
Figure DEST_PATH_IMAGE120
Expressed as follows:
Figure DEST_PATH_IMAGE122
as described above, the defects of the entire printed image
Figure DEST_PATH_IMAGE124
Comprises the following steps:
Figure DEST_PATH_IMAGE126
compared with the prior art, the invention has the beneficial effects that: according to the method, the edges in the pattern are extracted through edge detection, and then the defect detection is carried out by utilizing the change difference of the description of each connected domain, so that the interference of illumination is avoided, the reliability of the result is improved, and the abnormal condition is judged by using the description of the connected domain of the image content instead of the response value of the corresponding pixel.
Drawings
FIG. 1 is a system flow diagram;
FIG. 2 is a diagram of a connected domain tag format;
FIG. 3 is a nesting diagram;
Detailed Description
The technical solution in the embodiments of the present invention will be clearly and completely described below with reference to the accompanying drawings in the embodiments of the present invention.
See fig. 1. The invention mainly aims to detect the defects of the printed matter; mainly aims at printed matters such as posters with less connected domains.
The method comprises the following steps: extracting a printing image in the acquired RGB image by using a semantic segmentation technology; first, DNN is used to identify the printed matter in the captured image, i.e., the captured image has complex conditions such as background, and the printed matter is to be judged. Processing an image to be detected by a semantic segmentation technology: inputting an image of printing paper acquired by a camera, and performing semantic segmentation on the image by using a DNN (digital hierarchy network), wherein the network structure is an Encoder-Decoder structure, and a data set is various types of printing paper images; labels fall into two categories, printed products and backgrounds. The method is pixel-level classification, that is, all pixels in an image need to be labeled with corresponding labels. A pixel belonging to the printing paper, whose value is denoted by 1, a pixel belonging to the background, whose value is denoted by 0; the loss function used by the network is a cross entropy loss function. And after the connected domain of the printed matter is obtained, the following operations are carried out: and taking the result obtained by semantic segmentation as a mask, and extracting a corresponding presswork image from the RGB image. And rotating the image according to the included angle between the long axis and the short axis of the mask. And obtaining the corrected printed product image. The partial mask segmentation, rotation, is conventional.
And the first step is finished, and the printed matter image can be separated from the acquired RGB image.
Step two: the standard image and the printing image are subjected to connected domain analysis to obtain respective image descriptions, and the change of the illumination is easy to bring the change of pixel values in the image, so that the invention needs not to use gray values to obtain the image descriptions in order to avoid the influence of the illumination. The process for obtaining the printed matter description comprises the following steps: and carrying out edge detection on the printed matter image to obtain a corresponding edge image. And extracting a closed connected domain in the edge image. Its description is obtained by connected domain analysis. The connected component description is combined to obtain the description of the printed product image.
Carrying out edge detection on the printed matter image to obtain a corresponding edge image: inputting a printing image, carrying out edge detection on the image by using a Canny operator after carrying out graying processing on the image to obtain a gradient edge, namely the edge of the pattern in the printing area;
extracting a closed connected domain in the edge image: performing connected domain analysis on the result obtained in the last step by using Seed Filling method to obtain connected domains with different labels (labels)
Figure DEST_PATH_IMAGE002AAA
And obtaining the value of the maximum tag number
Figure DEST_PATH_IMAGE004AA
(i.e., total number), connected domain tag format is shown in FIG. 2. The principle is as follows: https:// blog.csdn.net/liangchunjiang/article/details/79431339.
Its description is obtained by connected domain analysis: setting initial parameters, first, printing each connected domain of the image
Figure DEST_PATH_IMAGE002AAAA
(Co-occurrence of
Figure DEST_PATH_IMAGE004AAA
Number) of pixels, N =0, and L = 0. Limit coordinates are:
Figure DEST_PATH_IMAGE006A
(ii) a Because the pixel sizes of different images are different, the area parameter of the connected domain uses the pixel number of the connected domain and the total image of the whole imageThe ratio of the number of elements is expressed, namely the connected domain area parameter S:
Figure DEST_PATH_IMAGE008A
wherein N is the number of pixels of the connected domain,
Figure DEST_PATH_IMAGE010A
is the number of pixels of the whole image.
Traversing pixel points of the image line by line to obtain: the pixel points of each row in the image have corresponding connected domain labels, and the form is as follows:
Figure DEST_PATH_IMAGE012A
where 0 is the background (non-connected domain) pixel and the numbers outside of 0 are the tag numbers corresponding to the connected domains. And processing the label value to obtain the hierarchy information of the downlink connected domain. Since a connected component is a closed region, its label number appears at least twice from left to right across a row of pixels in the image, the first time to begin entering the connected component and the second time to leave the connected component. Furthermore, the presence of nested-structured connected domains, necessarily the large connected domain containing the small connected domain, does not change the number of nested layers of connected domains once determined, if any
Figure DEST_PATH_IMAGE014AAAA
Indicates that the nesting layer number of the corresponding connected domain is not determined when
Figure DEST_PATH_IMAGE016A
There is no need to change the value of L. Firstly, setting a temporary variable C =0, traversing from left to right, recording a first non-0 number, wherein the corresponding number in the sequence is 1, and the corresponding C is set to 1, at this time, because the nested layer number of the connected domain with the label number of 1 is
Figure DEST_PATH_IMAGE014_5A
So corresponding connected domainThe value of the nesting layer number L is updated to C, which indicates that the maximum nesting layer number of the connected domain is 1, and the recorded non-0 label sequence is
Figure DEST_PATH_IMAGE018A
(ii) a A second non-0 digit, the corresponding digit in the sequence being 3, the digit being recorded if no digit 3 is present in the recorded sequence of labels, the resulting sequence of labels being
Figure DEST_PATH_IMAGE020A
At this time order
Figure DEST_PATH_IMAGE022A
Indicating entry into a further nested connected region, the largest nesting level of connected regions due to tag number 3
Figure DEST_PATH_IMAGE014_6A
Update
Figure DEST_PATH_IMAGE024A
The maximum nesting layer number of the connected domain with the label number of 3 is 2; similarly, the third non-0 digit is 2, and the digit 2 is not recorded in the recorded non-0 tag sequence, and the non-0 digit tag sequence is the tag sequence at this time
Figure DEST_PATH_IMAGE026AA
Number 2 of nested layers of connected domains due to tag number 2
Figure DEST_PATH_IMAGE014_7A
Let us order
Figure DEST_PATH_IMAGE028A
The maximum nesting layer number of the connected domain with the label number of 2 is 3; continuing the traversal, the fourth traversal has a non-0 digit of 2 due to the previously recorded sequence of digits
Figure DEST_PATH_IMAGE026AAA
Has 2 already existed in the sequence, and means that the traversal of the connected component labeled 2 has ended and is no longer recorded in the sequence of labels. And subtracting 1 from C, wherein C =2, namely the traversed pixel points are located in the connected domain with the nesting level of 2. By analogy, when a new connected domain is encountered, namely a label number which does not exist in the recorded non-0 digital label sequence, adding 1 to C to represent that the connected domain enters a nesting area of a deeper layer; every time a connected domain is left, C is reduced by 1, and the nested region of the previous layer is returned; in addition, each time the add-1 operation is performed on C, it is necessary to determine whether the nesting level L corresponding to the tag number is 0 until the row of pixels is traversed. The whole flow is shown in fig. 3 below.
Each time a pixel point with a label is traversed, the number of pixels corresponding to the label is increased
Figure DEST_PATH_IMAGE030A
(ii) a Comparing the pixel point coordinate with label with the existing one
Figure DEST_PATH_IMAGE034A
A comparison is made, such as: the coordinates of the pixel points of the label traversed by a certain point are
Figure DEST_PATH_IMAGE032A
If, if
Figure DEST_PATH_IMAGE036A
Then, then
Figure DEST_PATH_IMAGE038A
Otherwise
Figure DEST_PATH_IMAGE040A
Keeping the same; if it is
Figure DEST_PATH_IMAGE042A
Then, then
Figure DEST_PATH_IMAGE044A
Otherwise
Figure DEST_PATH_IMAGE046A
Keeping the same; to pair
Figure DEST_PATH_IMAGE048A
The same process is carried out; after traversing, obtaining the coordinates of the central point
Figure DEST_PATH_IMAGE050A
Wherein
Figure DEST_PATH_IMAGE052A
(ii) a To obtain
Figure DEST_PATH_IMAGE054A
And the respective number of nesting layers L; calculating the integral characteristic value of each connected domain
Figure DEST_PATH_IMAGE056AA
And expressing the distance from the coordinates of the center point of the connected domain to the origin and the area of the connected domain:
Figure DEST_PATH_IMAGE058A
get each connected domain
Figure DEST_PATH_IMAGE002_5A
Description of the invention
Figure DEST_PATH_IMAGE060A
;
Combining the connected domain descriptions to obtain the description of the printed product image: the description of the entire image appears as
Figure DEST_PATH_IMAGE062A
;
Similarly, the above operations are repeated for the standard image to obtain the image description data of the standard image
Figure DEST_PATH_IMAGE064A
Wherein
Figure DEST_PATH_IMAGE066A
As a link in a standard imageThe number of the through domains is greater than the number of the through domains,
Figure DEST_PATH_IMAGE068A
the number of connected domains;
by this time, the second step is completed,
step three: comparing the image descriptions of the standard image and the printing image, and judging abnormal conditions; traversing the connected domain of the standard image and the printing image according to the layer number
Figure DEST_PATH_IMAGE070A
Grouping and respectively calculating the number corresponding to each hierarchy;
comparing the corresponding number of each level of the standard image and the printing image, wherein the number is consistent to be a normal condition, and the levels with inconsistent number need to further detect the connected domain in the level: setting corresponding connected domain flags in standard images
Figure DEST_PATH_IMAGE072AA
Integral feature difference
Figure DEST_PATH_IMAGE074A
Corresponding connected domain marks in printed images
Figure DEST_PATH_IMAGE076AA
Searching the corresponding relation among all connected domains of the hierarchy: searching the integral characteristic value of a certain connected domain of the printing image and each connected domain in the standard image
Figure DEST_PATH_IMAGE056AAA
The difference between:
Figure DEST_PATH_IMAGE078A
in principle, the difference is minimal, i.e.
Figure DEST_PATH_IMAGE080A
Then, the two related connected domains are the printing images (corresponding to the label numbers)
Figure 335018DEST_PATH_IMAGE082
) With standard image (corresponding tag number)
Figure DEST_PATH_IMAGE086A
) Corresponding connected domain of
Figure DEST_PATH_IMAGE084AAAA
(ii) a If the standard image connected domain mark
Figure DEST_PATH_IMAGE072AAA
That is, the connected component in the standard image at this time has no corresponding connected component
Figure DEST_PATH_IMAGE094AA
Update
Figure DEST_PATH_IMAGE102AAAA
The value of (c):
Figure 875328DEST_PATH_IMAGE088
if the standard image connected domain mark
Figure DEST_PATH_IMAGE090A
Figure DEST_PATH_IMAGE092A
Indicating that the connected component in the standard image currently has a corresponding connected component in the printed image
Figure DEST_PATH_IMAGE094AAA
Comparing the connected component of the standard image
Figure DEST_PATH_IMAGE096A
And corresponding connected domain
Figure DEST_PATH_IMAGE084_5A
Corresponding difference value
Figure DEST_PATH_IMAGE098A
The size of (2):
if it is
Figure DEST_PATH_IMAGE100A
Update
Figure DEST_PATH_IMAGE102_5A
Value of (1), order
Figure DEST_PATH_IMAGE104A
If it is
Figure DEST_PATH_IMAGE106A
Obtaining corresponding connected domains in the printed image
Figure DEST_PATH_IMAGE094AAAA
The area difference with the standard image connected domain; obtaining corresponding connected domains
Figure DEST_PATH_IMAGE084_6A
The area difference of the connected domain with the standard image; if corresponding connected domain
Figure DEST_PATH_IMAGE084_7A
If the corresponding area difference is smaller, updating
Figure DEST_PATH_IMAGE102_6A
A value of (d); otherwise, it is not updated
Figure DEST_PATH_IMAGE102_7A
A value of (d);
if it is
Figure 916051DEST_PATH_IMAGE108
Then not update
Figure DEST_PATH_IMAGE102_8A
The value of (c).
After traversing is finished, corresponding connected domain marks of connected domains in printed images
Figure DEST_PATH_IMAGE110A
There are two cases:
Figure DEST_PATH_IMAGE112A
;
Figure DEST_PATH_IMAGE076AAA
the connected domain does not exist in the corresponding connected domain in the standard image, namely a defect area, the position of the defect is a blank area inside the pattern and outside the pattern, the former has more nesting layers and does not influence the whole content of the printed product, so the influence degree of the defect is smaller, and the latter has larger degree, so the whole influence condition of the defect in the area is larger
Figure DEST_PATH_IMAGE114A
Comprises the following steps:
Figure DEST_PATH_IMAGE116A
Figure DEST_PATH_IMAGE118A
comparing the area difference between the corresponding connected domains in the standard image, wherein the error is normal within 3 percent of the total area of the standard image due to the statistical error, and the condition that the pattern is lack of printing or is over printing is considered to exist if the error exceeds 3 percent, so that the overall influence condition of the defects of the regions is as follows
Figure DEST_PATH_IMAGE120A
Expressed as follows:
Figure DEST_PATH_IMAGE122A
as described above, the defects of the entire printed image
Figure DEST_PATH_IMAGE124A
Comprises the following steps:
Figure DEST_PATH_IMAGE126A
the above description is only for the preferred embodiment of the present invention, but the scope of the present invention is not limited thereto, and any person skilled in the art should be considered to be within the technical scope of the present invention, and the technical solutions and the inventive concepts thereof according to the present invention should be equivalent or changed within the scope of the present invention.

Claims (5)

1. A printing data identification method based on big data processing comprises the following steps: segmenting the acquired RGB image by using a semantic segmentation technology to obtain a presswork image; step two: processing the standard image and the printing image to obtain respective image descriptions; the method comprises the following steps: carrying out edge detection on the printed matter image to obtain a corresponding edge image; extracting a closed connected domain in the edge image; setting initial parameters, printing each connected domain of image
Figure 241659DEST_PATH_IMAGE002
Respectively setting corresponding initial values, wherein the initial values comprise the number N =0 of pixels and the number L =0 of layers; limit coordinates are:
Figure 774272DEST_PATH_IMAGE004
because the pixels of different images are different in size, the area parameter of the connected domain is represented by the ratio of the number of the pixels of the connected domain to the total number of the pixels of the whole image, that is, the area parameter S of the connected domain:
Figure 176434DEST_PATH_IMAGE006
wherein N is the number of pixels of the connected domain,
Figure 110892DEST_PATH_IMAGE008
is the image of the whole imageThe number of elements;
traversing pixel points of the image line by line to obtain: the pixel points of each row in the image have corresponding connected domain label sequences, and the form is as follows:
Figure 307518DEST_PATH_IMAGE010
(ii) a Wherein 0 is background pixel, namely non-connected domain pixel, the number outside 0 is label number of corresponding connected domain, process the label value, obtain the hierarchical information of connected domain under this row, because a connected domain is a closed area, so pass from left to right on a row of pixel of the picture, its label number will appear at least twice, first for starting to enter this connected domain, second for leaving this connected domain, there is connected domain of nested structure, must be that the big connected domain includes the small connected domain, so the nested number of layers of connected domain will not change once determining, if it is determined that the number of layers of connected domain will not change once
Figure 327427DEST_PATH_IMAGE012
Indicates that the nesting layer number of the corresponding connected domain is not determined when
Figure 64439DEST_PATH_IMAGE014
In the process, the value of L does not need to be changed, a temporary variable C =0 is set, traversal is performed from left to right, the first non-0 digit is recorded, the corresponding digit in the tag sequence of the connected domain is 1, and the corresponding C is set to be 1, so that the nested layer number of the connected domain with the tag number of 1 is reduced, and the nested layer number of the connected domain with the tag number of 1 is reduced
Figure 791086DEST_PATH_IMAGE012
Therefore, the value of the nesting layer number L of the corresponding connected component is updated to C, which indicates that the maximum nesting layer number of the connected component is 1, and the recorded non-0 tag sequence is
Figure 486510DEST_PATH_IMAGE016
(ii) a A second non-0 digit, the corresponding digit in the above-mentioned connected domain tag sequence being 3, the digit 3 not being present in the recorded tag sequence, recording the sequenceNumber, the resulting tag sequence is
Figure 931397DEST_PATH_IMAGE018
At this time order
Figure 472100DEST_PATH_IMAGE020
Indicating entry into a further nested connected region, the largest nesting level of connected regions due to tag number 3
Figure 115571DEST_PATH_IMAGE012
Update
Figure 654000DEST_PATH_IMAGE022
The maximum nesting layer number of the connected domain with the label number of 3 is 2; the third non-0 digit is 2, and the digit 2 is not recorded in the recorded non-0 label sequence, and the non-0 digit label sequence is the label sequence at this moment
Figure 648501DEST_PATH_IMAGE024
Number 2 of nested layers of connected domains due to tag number 2
Figure DEST_PATH_IMAGE025
Let us order
Figure DEST_PATH_IMAGE027
The maximum nesting layer number of the connected domain with the label number of 2 is 3; continuing the traversal, the fourth traversal has a non-0 digit of 2 due to the previously recorded sequence of digits
Figure 196157DEST_PATH_IMAGE024
2 already exists in the sequence, which means that the traversal of the connected domain with the label of 2 is finished and is not recorded into the label sequence; subtracting 1 from C, wherein C =2, namely the traversed pixel points are located in a connected domain with a nesting level of 2, and so on, when a new connected domain is encountered, namely a label number which does not exist in the recorded non-0 digital label sequence, adding 1 to C to indicate that the embedded domain enters a deeper layerIn the jacket region; every time a connected domain is left, C is reduced by 1, and the nested region of the previous layer is returned; in addition, when adding 1 to C each time, it needs to judge whether the nesting level L corresponding to the label number is 0 or not, until the pixels in the row are traversed, and the number of pixels corresponding to the label is increased each time one pixel point with the label is traversed
Figure DEST_PATH_IMAGE029
(ii) a Comparing the pixel point coordinate with the label with the maximum value and the minimum value of the existing horizontal and vertical coordinates:
Figure 162976DEST_PATH_IMAGE030
Figure DEST_PATH_IMAGE031
Figure 137885DEST_PATH_IMAGE032
Figure DEST_PATH_IMAGE033
performing a comparison comprising: the coordinates of the pixel points of the label traversed by a certain point are
Figure DEST_PATH_IMAGE035
If, if
Figure DEST_PATH_IMAGE037
Then, then
Figure DEST_PATH_IMAGE039
Otherwise
Figure DEST_PATH_IMAGE041
Keeping the same; if it is
Figure DEST_PATH_IMAGE043
Then, then
Figure DEST_PATH_IMAGE045
Otherwise
Figure DEST_PATH_IMAGE047
Keeping the same; to pair
Figure DEST_PATH_IMAGE049
The same process is carried out; after traversing, obtaining the coordinates of the central point
Figure DEST_PATH_IMAGE051
Wherein
Figure DEST_PATH_IMAGE053
(ii) a To obtain
Figure DEST_PATH_IMAGE055
And the respective number of nesting layers L; calculating the integral characteristic value of each connected domain
Figure DEST_PATH_IMAGE057
And expressing the distance from the coordinates of the center point of the connected domain to the origin and the area of the connected domain:
Figure DEST_PATH_IMAGE059
get each connected domain
Figure 910362DEST_PATH_IMAGE002
Description of the invention
Figure DEST_PATH_IMAGE061
Step three: and comparing the image descriptions of the standard image and the printing image to judge the abnormal printing condition.
2. The big data processing-based print data identification method according to claim 1, wherein performing edge detection on the print image to obtain a corresponding edge image specifically comprises: inputting a printing image, carrying out gray processing on the image, and carrying out edge detection on the image by using a Canny operator to obtain a gradient edge, namely the edge of the pattern in the printing area.
3. The big data processing-based printing data identification method according to claim 1, wherein extracting the closed connected components in the edge image specifically comprises: analyzing the connected domain of the edge of the pattern in the printing area by using a seed filling method to obtain the connected domain with different labels
Figure 58446DEST_PATH_IMAGE002
And obtaining the value of the maximum tag number
Figure DEST_PATH_IMAGE063
I.e. the total number.
4. The big data processing-based print data identification method according to claim 1, wherein the description of the print image obtained by the combination of the descriptions of the connected component is specifically: the description of the whole image is represented as
Figure DEST_PATH_IMAGE065
;
Similarly, the above operations are repeated for the standard image to obtain the image description data of the standard image
Figure DEST_PATH_IMAGE067
Wherein
Figure DEST_PATH_IMAGE069
Is a connected component in the standard image,
Figure DEST_PATH_IMAGE071
is the number of connected domains.
5. The big data processing-based printing data identification method according to claim 1, wherein the third step is specifically: traversing the connected domains of the standard image and the printed image, grouping the connected domains according to the layer number L, and respectively calculating the number corresponding to each level; comparing the corresponding number of each level of the standard image and the printing image, wherein the number is consistent to be a normal condition, and the levels with inconsistent number need to further detect the connected domain in the level: setting corresponding connected domain flags in standard images
Figure DEST_PATH_IMAGE073
Integral feature difference
Figure DEST_PATH_IMAGE075
Corresponding connected domain marks in printed images
Figure DEST_PATH_IMAGE077
Searching the corresponding relation among all connected domains of the hierarchy: searching the integral characteristic value of a certain connected domain of the printing image and each connected domain in the standard image
Figure 20717DEST_PATH_IMAGE057
The difference between:
Figure DEST_PATH_IMAGE079
the difference being minimal, i.e.
Figure DEST_PATH_IMAGE081
Then, the two related connected domains are the label numbers corresponding to the printing images
Figure 697686DEST_PATH_IMAGE082
Corresponding connected domain of
Figure 135621DEST_PATH_IMAGE084
And standard drawingImage corresponding label number
Figure 759500DEST_PATH_IMAGE086
The corresponding connected domain of (a);
if the standard image connected domain mark
Figure 497649DEST_PATH_IMAGE073
That is, the connected component in the standard image at this time has no corresponding connected component
Figure 814361DEST_PATH_IMAGE088
Update
Figure 739592DEST_PATH_IMAGE090
The value of (c):
Figure DEST_PATH_IMAGE091
if the standard image connected domain mark
Figure DEST_PATH_IMAGE093
Figure DEST_PATH_IMAGE095
Indicating that the connected component in the standard image currently has a corresponding connected component in the printed image
Figure 901583DEST_PATH_IMAGE088
Comparing the connected component of the standard image
Figure DEST_PATH_IMAGE097
And corresponding connected domain
Figure 228659DEST_PATH_IMAGE084
Corresponding difference value
Figure DEST_PATH_IMAGE099
The size of (2):
if it is
Figure DEST_PATH_IMAGE101
Update
Figure 185114DEST_PATH_IMAGE102
Value of (1), order
Figure 597640DEST_PATH_IMAGE104
Wherein,
Figure DEST_PATH_IMAGE105
for corresponding connected domain
Figure 94481DEST_PATH_IMAGE084
Is marked with a connected domain of (c),
Figure 541643DEST_PATH_IMAGE106
in order to be the number of the label,
Figure DEST_PATH_IMAGE107
for corresponding connected domain
Figure 931648DEST_PATH_IMAGE088
A connected domain flag of (c);
if it is
Figure DEST_PATH_IMAGE109
Obtaining corresponding connected domains in the printed image
Figure 300313DEST_PATH_IMAGE088
The area difference with the standard image connected domain; obtaining corresponding connected domains
Figure 132002DEST_PATH_IMAGE084
The area difference with the standard image connected domain; if corresponding connected domain
Figure 433671DEST_PATH_IMAGE084
Corresponding area difference relative to corresponding connected domain
Figure 997507DEST_PATH_IMAGE088
If the difference in area is small, the updating is performed
Figure 650205DEST_PATH_IMAGE090
A value of (d); otherwise, it is not updated
Figure 223269DEST_PATH_IMAGE102
A value of (d);
if it is
Figure 379444DEST_PATH_IMAGE110
Then not update
Figure 176499DEST_PATH_IMAGE102
A value of (d);
after traversing is finished, corresponding connected domain marks of connected domains in printed images
Figure 988597DEST_PATH_IMAGE112
There are two cases:
Figure 427669DEST_PATH_IMAGE114
;
Figure 641612DEST_PATH_IMAGE077
the connected domain does not exist in the standard image, namely a defect area, the position of the defect is a pattern inner part and a pattern outer part blank area, and the overall influence condition of the defect of the area
Figure 609568DEST_PATH_IMAGE116
Comprises the following steps:
Figure 971279DEST_PATH_IMAGE118
Figure 886146DEST_PATH_IMAGE120
comparing the area difference between the corresponding connected domains in the standard image, wherein the error is normal within 3 percent of the total area of the standard image due to the statistical error, and the condition that the pattern is lack of printing or is over printing is considered to exist if the error exceeds 3 percent, so that the overall influence condition of the defects of the regions is as follows
Figure 16913DEST_PATH_IMAGE122
Expressed as follows:
Figure DEST_PATH_IMAGE123
defects in the entirety of the printed image
Figure DEST_PATH_IMAGE125
Comprises the following steps:
Figure DEST_PATH_IMAGE127
CN202111063256.3A 2021-09-10 2021-09-10 Printing data identification method based on big data processing Active CN113506297B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111063256.3A CN113506297B (en) 2021-09-10 2021-09-10 Printing data identification method based on big data processing

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202111063256.3A CN113506297B (en) 2021-09-10 2021-09-10 Printing data identification method based on big data processing

Publications (2)

Publication Number Publication Date
CN113506297A CN113506297A (en) 2021-10-15
CN113506297B true CN113506297B (en) 2021-12-03

Family

ID=78017145

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111063256.3A Active CN113506297B (en) 2021-09-10 2021-09-10 Printing data identification method based on big data processing

Country Status (1)

Country Link
CN (1) CN113506297B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN118134908B (en) * 2024-04-30 2024-07-12 陕西博越腾达科技有限责任公司 Printing monitoring image analysis method for 3D printing

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20030035653A1 (en) * 2001-08-20 2003-02-20 Lyon Richard F. Storage and processing service network for unrendered image data
CN109308700A (en) * 2017-07-27 2019-02-05 南京敏光视觉智能科技有限公司 A kind of visual identity defect inspection method based on printed matter character
CN111242896A (en) * 2019-12-31 2020-06-05 电子科技大学 Color printing label defect detection and quality rating method

Also Published As

Publication number Publication date
CN113506297A (en) 2021-10-15

Similar Documents

Publication Publication Date Title
CN113344857B (en) Defect detection network training method, defect detection method and storage medium
CN111275697B (en) Battery silk-screen quality detection method based on ORB feature matching and LK optical flow method
CN109472271B (en) Printed circuit board image contour extraction method and device
JP2000137804A (en) Method and system for abnormality detection of digital image and storage medium for same
CN111242896A (en) Color printing label defect detection and quality rating method
CN113034488B (en) Visual inspection method for ink-jet printed matter
CN114202543B (en) Method, device, equipment and medium for detecting dirt defects of PCB (printed circuit board)
CN113034492B (en) Printing quality defect detection method and storage medium
CN114387269B (en) Fiber yarn defect detection method based on laser
CN111861990A (en) Method, system and storage medium for detecting bad appearance of product
CN110569774B (en) Automatic line graph image digitalization method based on image processing and pattern recognition
US11580758B2 (en) Method for processing image, electronic device, and storage medium
CN113506297B (en) Printing data identification method based on big data processing
CN110533660B (en) Method for detecting silk-screen defect of electronic product shell
CN112861861A (en) Method and device for identifying nixie tube text and electronic equipment
CN113392819B (en) Batch academic image automatic segmentation and labeling device and method
CN112884741B (en) Printing apparent defect detection method based on image similarity comparison
CN114187247A (en) Ampoule bottle printing character defect detection method based on image registration
CN117115171B (en) Slight bright point defect detection method applied to subway LCD display screen
CN116091503B (en) Method, device, equipment and medium for discriminating panel foreign matter defects
CN111798429B (en) Visual inspection method for defects of printed matter
CN116664817A (en) Power device state change detection method based on image difference
CN115546141A (en) Small sample Mini LED defect detection method and system based on multi-dimensional measurement
CN113850756A (en) Label defect detection method based on template comparison
JP2004094427A (en) Slip image processor and program for realizing the same device

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant