CN115439426A - Width information determination method and device, and electronic device - Google Patents

Width information determination method and device, and electronic device

Info

Publication number
CN115439426A
Authority
CN
China
Prior art keywords
edge
pixel point
pixel
gluing
information
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202211024084.3A
Other languages
Chinese (zh)
Inventor
耿凯
魏书琪
哈谦
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
BOE Technology Group Co Ltd
Beijing BOE Technology Development Co Ltd
Original Assignee
BOE Technology Group Co Ltd
Beijing BOE Technology Development Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by BOE Technology Group Co Ltd, Beijing BOE Technology Development Co Ltd filed Critical BOE Technology Group Co Ltd
Priority to CN202211024084.3A
Publication of CN115439426A
Legal status: Pending

Classifications

    • G – PHYSICS
    • G06 – COMPUTING; CALCULATING OR COUNTING
    • G06T – IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 – Image analysis
    • G06T7/0002 – Inspection of images, e.g. flaw detection
    • G06T7/0004 – Industrial image inspection
    • G06T7/10 – Segmentation; Edge detection
    • G06T7/13 – Edge detection
    • G06T7/60 – Analysis of geometric attributes
    • G06T2207/00 – Indexing scheme for image analysis or image enhancement
    • G06T2207/10 – Image acquisition modality
    • G06T2207/10004 – Still image; Photographic image
    • G06T2207/20 – Special algorithmic details
    • G06T2207/20081 – Training; Learning
    • G06T2207/20084 – Artificial neural networks [ANN]

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Quality & Reliability (AREA)
  • Geometry (AREA)
  • Image Analysis (AREA)

Abstract

The embodiment of the invention provides a width information determination method and device, and an electronic device, applied to the technical field of image processing. The method comprises the following steps: acquiring a gluing image containing a gluing area, wherein the gluing area is the area of a gluing part that has already been glued; performing edge recognition on the gluing area in the gluing image to obtain a first edge and a second edge of the gluing area; and determining width information between the first edge and the second edge, as the width information of the gluing area, based on the distance information between each pixel point in the first edge and the second edge and its corresponding matching pixel point, wherein the matching pixel point corresponding to each pixel point is the pixel point that belongs to the other edge and is at the minimum distance from that pixel point. With this scheme, the width information of the gluing can be measured accurately.

Description

Width information determination method and device, and electronic device
Technical Field
The present invention relates to the field of image processing technologies, and in particular, to a method and an apparatus for determining width information, and an electronic device.
Background
In order to detect the quality of the gluing process, the width of a gluing area of a gluing part needs to be measured to determine the width information of the gluing area, and then the quality of the gluing process is evaluated based on the measured width information.
At present, measurement of the gluing width mainly relies on manual means; that is, the width information of the gluing is measured manually. Specifically, after a gluing image is obtained, an operator selects measurement points on the two edges of the gluing in the gluing image according to experience, and a computing device then calculates the straight-line distance between the selected measurement points as the width information of the gluing.
However, measurement points selected based on experience are often inaccurate, so the width information determined by manual means often has large errors and is not accurate.
Disclosure of Invention
The embodiment of the invention aims to provide a width information determination method and device, and an electronic device, so as to accurately measure the width information of the gluing. The specific technical scheme is as follows:
in a first aspect, an embodiment of the present invention provides a method for determining width information, where the method includes:
acquiring a gluing image containing a gluing area, wherein the gluing area is a glued area in a gluing part;
performing edge recognition on the gluing area in the gluing image to obtain a first edge and a second edge of the gluing area;
determining width information between the first edge and the second edge as width information of the gluing area based on distance information between each pixel point and the corresponding matched pixel point in the first edge and the second edge;
wherein the matching pixel point corresponding to each pixel point is: the pixel point that belongs to a different edge from that pixel point and is at the minimum distance from it.
Optionally, the determining, based on information of distances between each pixel point and the corresponding matching pixel point in the first edge and the second edge, width information between the first edge and the second edge as width information of the gluing area includes:
performing edge correction on the first edge and the second edge based on edge end points of the first edge and the second edge;
and determining width information between the first edge and the second edge as width information of the gluing area based on distance information between each pixel point and the corresponding matching pixel point in the first edge after edge correction and the second edge after edge correction.
Optionally, the edge endpoint includes: first and second endpoints of the first edge, and third and fourth endpoints of the second edge;
the edge modification of the first edge and the second edge based on the edge end points of the first edge and the second edge includes:
performing edge correction on the first edge based on the third end point and the fourth end point;
performing edge correction on the second edge based on the first endpoint and the second endpoint.
Optionally, the performing, based on the third end point and the fourth end point, edge correction on the first edge includes:
determining a pixel point with the minimum distance from the third end point from all pixel points contained in the first edge as a first candidate pixel point, and determining a pixel point with the minimum distance from the fourth end point as a second candidate pixel point;
determining pixel points which do not belong to the range of the first candidate pixel point and the second candidate pixel point in the pixel points included in the first edge as redundant pixel points in the first edge;
removing redundant pixel points in all pixel points contained in the first edge;
performing edge correction on the second edge based on the first endpoint and the second endpoint, including:
determining a pixel point with the minimum distance from the first end point from all pixel points contained in the second edge as a third candidate pixel point, and determining a pixel point with the minimum distance from the second end point as a fourth candidate pixel point;
determining pixel points which do not belong to the range of the third candidate pixel point and the fourth candidate pixel point in the pixel points included in the second edge as redundant pixel points in the second edge;
and eliminating redundant pixel points in all the pixel points contained in the second edge.
Optionally, the determining, based on distance information between each pixel point and a corresponding matching pixel point in the first edge after the edge correction and the second edge after the edge correction, width information between the first edge and the second edge as width information of the glue coating area includes:
and determining width information between the first edge and the second edge as width information of the gluing area based on distance information between each pixel point after the redundant pixel points are removed and the corresponding matched pixel points in the first edge and the second edge.
Optionally, before determining the width information between the first edge and the second edge based on the distance information between each pixel point and the corresponding matching pixel point in the first edge and the second edge, and using the width information as the width information of the gluing area, the method further includes:
determining distance information between each pixel point in the first edge and each pixel point in the second edge;
and aiming at each pixel point in the first edge and the second edge, determining minimum distance information from the distance information between the pixel point and each corresponding pixel point to be screened as the distance information between the pixel point and the corresponding matching pixel point, wherein the pixel point to be screened corresponding to each pixel point is a pixel point belonging to different edges with the pixel point.
Optionally, the determining, for each pixel point in the first edge and the second edge, minimum distance information from distance information between the pixel point and each corresponding pixel point to be screened, as distance information between the pixel point and a corresponding matching pixel point, includes:
constructing a distance information matrix d based on the distance information between each pixel point in the first edge and each pixel point in the second edge:
d =
[ d_11  d_12  ...  d_1m ]
[ d_21  d_22  ...  d_2m ]
[ ...   ...   ...  ...  ]
[ d_n1  d_n2  ...  d_nm ]
wherein n is the number of pixel points included in the first edge, m is the number of pixel points included in the second edge, and each element d_uv in the distance information matrix d represents the distance information between the u-th pixel point in the first edge and the v-th pixel point in the second edge;
determining the minimum distance information in each row in the distance information matrix d as the distance information between the pixel points of the row and the corresponding matched pixel points;
and determining the minimum distance information in each column in the distance information matrix d as the distance information between the pixel point of the column and the corresponding matching pixel point.
Optionally, the determining, based on information of distances between each pixel point and the corresponding matching pixel point in the first edge and the second edge, width information between the first edge and the second edge as width information of the gluing area includes:
determining minimum distance information from the distance information between each pixel point and the corresponding matched pixel point in the first edge and the second edge, and taking the distance represented by the determined distance information as the minimum width of the gluing area; and/or,
determining the maximum distance information from the distance information between each pixel point and the corresponding matched pixel point in the first edge and the second edge, and taking the distance represented by the determined distance information as the maximum width of the gluing area;
and taking the minimum width and/or the maximum width as the width information of the gluing area.
Optionally, before the performing edge recognition on the gluing area in the gluing image to obtain a first edge and a second edge of the gluing area, the method further includes:
and identifying a gluing area of the gluing image to determine the gluing area in the gluing image.
In a second aspect, an embodiment of the present invention provides a width information determining apparatus, where the apparatus includes:
the image acquisition module is used for acquiring a gluing image containing a gluing area, wherein the gluing area is a glued area in the gluing part;
the edge recognition module is used for carrying out edge recognition on the gluing area in the gluing image to obtain a first edge and a second edge of the gluing area;
an information determining module, configured to determine, based on distance information between each pixel point and the corresponding matching pixel point in the first edge and the second edge, width information between the first edge and the second edge as the width information of the gluing area; wherein the matching pixel point corresponding to each pixel point is: the pixel point that belongs to a different edge from that pixel point and is at the minimum distance from it.
In a third aspect, an embodiment of the present invention provides an electronic device, including a processor, a communication interface, a memory, and a communication bus, where the processor and the communication interface complete communication between the memory and the processor through the communication bus;
a memory for storing a computer program;
a processor for implementing the method steps of any of the first aspect when executing a program stored in the memory.
In a fourth aspect, an embodiment of the present invention provides a computer-readable storage medium, in which a computer program is stored, and the computer program, when executed by a processor, implements the method steps of any one of the first aspect.
The embodiment of the invention has the following beneficial effects:
the method, the device and the electronic equipment for determining the width information can acquire the gluing image including the gluing area, further perform edge recognition on the gluing area in the gluing image to obtain the first edge and the second edge of the gluing area, and determine the width information between the first edge and the second edge as the width information of the gluing area based on the distance information between each pixel point and the corresponding matched pixel point in the first edge and the second edge. After the gluing image is obtained, the first edge and the second edge of the gluing area are identified, and then the width information between the first edge and the second edge is determined by utilizing the distance information between each pixel point and the matching pixel point in the first edge and the second edge, so that the distance identification at the pixel level can be realized, and the accuracy of the determined width information is higher.
Furthermore, the embodiment of the invention can automatically realize the measurement of the width information after the gluing image is obtained, has higher efficiency compared with a manual measurement mode, and can reduce the cost of the measurement of the width information because the operator does not need to select the measurement point.
Of course, not all of the advantages described above need to be achieved at the same time in the practice of any one product or method of the invention.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art are briefly introduced below. It is obvious that the drawings in the following description are only some embodiments of the present invention, and for those skilled in the art, other drawings can be obtained from these drawings without creative effort.
Fig. 1 is a flowchart of a width information determination method according to an embodiment of the present invention;
FIG. 2 (a) is a schematic diagram of a glue image provided by an embodiment of the invention;
FIG. 2 (b) is another schematic diagram of a glue image provided by an embodiment of the invention;
FIG. 3 (a) is a schematic diagram of edge identification according to an embodiment of the present invention;
FIG. 3 (b) is another schematic diagram of edge identification provided by an embodiment of the present invention;
FIG. 4 is a schematic diagram of an edge in a glue coated image according to an embodiment of the present invention;
FIG. 5 is a flow chart of another method for determining width information according to an embodiment of the present invention;
fig. 6 is a schematic structural diagram of a width information determining apparatus according to an embodiment of the present invention;
fig. 7 is a schematic structural diagram of an electronic device according to an embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived from the embodiments given herein by one of ordinary skill in the art, are within the scope of the invention.
In order to accurately measure the width information of the glue coating, the embodiment of the invention provides a width information determining method, a device and electronic equipment.
It should be noted that, in a specific application, the embodiment of the present invention may be applied to various electronic devices, for example, a personal computer, a server, a mobile phone, and other devices with data processing capability. Moreover, the method for determining width information provided by the embodiment of the present invention may be implemented by software, hardware, or a combination of software and hardware.
The method for determining width information provided by the embodiment of the invention can comprise the following steps:
acquiring a gluing image containing a gluing area, wherein the gluing area is a glued area in a gluing part;
performing edge recognition on a gluing area in the gluing image to obtain a first edge and a second edge of the gluing area;
determining width information between the first edge and the second edge as width information of the gluing area based on distance information between each pixel point and the corresponding matched pixel point in the first edge and the second edge; wherein, the matching pixel point corresponding to each pixel point is: and the pixel point which belongs to different edges and has the minimum distance with the pixel point.
In the above scheme of the embodiment of the invention, after the glue coating image is obtained, the first edge and the second edge of the glue coating area can be identified, and then the width information between the first edge and the second edge is determined by using the distance information between each pixel point and the matching pixel point in the first edge and the second edge, so that the distance identification at the pixel level can be realized, and the accuracy of the determined width information is higher. Furthermore, because the embodiment of the invention can automatically realize the measurement of the width information after the gluing image is obtained, compared with a manual measurement mode, the efficiency is higher, and because the selection of a measurement point by an operator is not needed, the cost of the measurement of the width information can be reduced.
The following describes the width information determining method provided by the embodiment of the present invention in detail with reference to the drawings.
As shown in fig. 1, an embodiment of the present invention provides a method for determining width information, including steps S101 to S103, where:
s101, obtaining a gluing image containing a gluing area, wherein the gluing area is a glued area in a gluing part;
the glue coating part may be a part involved in the screen panel production process, for example, a component constituting the screen panel, such as a backlight module, a polarizer, a glass substrate, and the like. In the screen panel production process, a glue coating process is required to be applied to part or all of the constituent elements of the screen panel to bond them to other elements, and in the embodiment of the present invention, the area in which glue has been coated in the glue coating element is referred to as a glue coating area.
It should be emphasized that, in the production process of other devices, a gluing process and measurement of the gluing width information may also be involved, and the width information determination method provided in the embodiment of the present invention is likewise applicable to them; the screen panel mentioned in the embodiment of the present invention is merely an example and is not to be construed as a limitation of the present invention.
The gluing image may be acquired from the gluing part after the gluing process has been applied to it. In different scenarios, because the sizes of and requirements on the gluing parts involved differ, the gluing image may cover the whole gluing area of the gluing part or only a local gluing area of the part. Illustratively, fig. 2 (a) is a schematic view of a gluing image covering the whole gluing area of a gluing part, and fig. 2 (b) is another schematic view of a gluing image covering only a local gluing area of a gluing part; the gray areas in fig. 2 (a) and fig. 2 (b) are the gluing areas. Compared with a gluing image of the whole gluing area, a gluing image of a local gluing area can show more detail, so in scenarios with higher precision requirements the width information is mostly determined from gluing images of local gluing areas.
In this step, the gluing image may be acquired in a way suited to the actual application scenario or requirement. For example, in one implementation, the camera that acquires the gluing image may be independent of the execution body of the embodiment of the present invention; in this case, the gluing image may be read from the camera, or the gluing image acquired by the camera may be obtained in advance by another electronic device and then transmitted to the execution body by means of a task, an instruction, or the like. The execution body may then receive the width information determination task or instruction and read the gluing image carried in it, or read the gluing image from a storage location specified by it; either is possible. In another implementation, the execution body of the embodiment of the present invention may itself include the camera that acquires the gluing image; in this case, this step may be controlling the camera to acquire the gluing image and obtaining the acquired gluing image.
S102, performing edge recognition on a gluing area in the gluing image to obtain a first edge and a second edge of the gluing area;
in order to determine the width information of the glue-coated area, after obtaining the glue-coated image, edge recognition may be performed on the glue-coated area in the glue-coated image. In this step, the edge of the glue area can be identified in various ways. For example, in one implementation, the edge detection algorithm may be used to process the glue-coated image to obtain the first edge and the second edge of the glue-coated region in the glue-coated image, and the edge detection algorithm used may include an algorithm that performs edge recognition by using a first-order operator and a second-order operator, such as a cross differential operator. In another implementation manner, a neural network model may also be used, specifically, a neural network model for edge recognition may be trained in advance, and then the pre-trained neural network model is used to process the glue-coated image to obtain the first edge and the second edge of the glue-coated region in the glue-coated image. Because the neural network model can be trained by using the images acquired in the gluing process scene, compared with an edge detection algorithm, the neural network model can be more suitable for edge recognition in the gluing process scene, so that the accuracy of the edge recognition can be improved.
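As a minimal sketch of the edge-detection-algorithm option mentioned above (not the prescribed implementation of this embodiment), the following Python code uses OpenCV's Canny detector and keeps the two longest detected contours as the first and second edge; the file name, the thresholds, and the "two longest contours" heuristic are illustrative assumptions.

    # Illustrative sketch only: obtain two candidate edge pixel sets of a gluing
    # area with a generic edge detector. Thresholds, file name, and the
    # "two longest contours" heuristic are assumptions, not the patented method.
    import cv2

    image = cv2.imread("gluing_image.png", cv2.IMREAD_GRAYSCALE)  # hypothetical path
    blurred = cv2.GaussianBlur(image, (5, 5), 0)                  # suppress sensor noise
    edges = cv2.Canny(blurred, 50, 150)                           # binary edge map

    # Extract connected edge curves; keep the two longest as the first and second edge.
    contours, _ = cv2.findContours(edges, cv2.RETR_LIST, cv2.CHAIN_APPROX_NONE)
    contours = sorted(contours, key=len, reverse=True)
    edge1 = contours[0].reshape(-1, 2)   # (x, y) pixel points of the first edge
    edge2 = contours[1].reshape(-1, 2)   # (x, y) pixel points of the second edge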
In order to more accurately perform edge recognition on the gluing area, in one implementation, before performing edge recognition on the gluing area in the gluing image to obtain the first edge and the second edge of the gluing area, the gluing image may be further preprocessed, for example, the gluing area is recognized on the gluing image to determine the gluing area in the gluing image.
In the embodiment of the invention, the gluing area identification can be carried out on the gluing image in various modes, for example, the gluing image is divided into areas, and then the gluing area is identified from the divided areas by using the texture characteristics in different areas. Or the pre-trained neural network model for identifying the gluing area can be used for processing the gluing image and determining the gluing area from the gluing image.
After the gluing area is identified from the gluing image, the step of performing edge recognition on the gluing area in the gluing image to obtain the first edge and the second edge of the gluing area can then be executed.
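As a hedged sketch of the preprocessing option above (this embodiment equally allows texture-based segmentation or a trained neural network), the following assumes the glue region can be separated from the background by a simple global threshold; the Otsu threshold and the morphological cleanup are illustrative assumptions.

    # Illustrative preprocessing sketch: isolate a candidate gluing area with a
    # global (Otsu) threshold plus morphological cleanup. A real system may use
    # texture features or a segmentation network instead.
    import cv2
    import numpy as np

    gray = cv2.imread("gluing_image.png", cv2.IMREAD_GRAYSCALE)  # hypothetical path
    _, mask = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)

    kernel = np.ones((5, 5), np.uint8)
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, kernel)   # remove speckle
    mask = cv2.morphologyEx(mask, cv2.MORPH_CLOSE, kernel)  # fill small holes
    # 'mask' now marks the candidate gluing area; edge recognition can be
    # restricted to this region.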
Illustratively, as shown in fig. 3 (a), a schematic diagram of edge recognition provided by the embodiment of the present invention, the first edge and the second edge in the glue image in fig. 3 (a) are obtained by performing edge recognition on the glue area in the glue image in fig. 2 (a). As shown in fig. 3 (b), another schematic diagram of edge recognition is provided according to an embodiment of the present invention, in which the first edge and the second edge in the glue image in fig. 3 (b) are obtained by performing edge recognition on the glue area in the glue image in fig. 2 (b).
S103, determining width information between the first edge and the second edge as width information of the gluing area based on distance information between each pixel point and the corresponding matching pixel point in the first edge and the second edge;
After the first edge and the second edge are determined, the width information between the first edge and the second edge may be determined based on the distance information between each pixel point and the corresponding matching pixel point in the first edge and the second edge, and the determined width information is then used as the width information of the gluing area. The distance information is the pixel distance between two pixel points, or the square of the pixel distance; of course, other information that can reflect the distance may also be used.
The matching pixel points corresponding to each pixel point are as follows: and the pixel point which belongs to different edges and has the minimum distance with the pixel point. In short, for each pixel point in the first edge, the corresponding matching pixel point is necessarily one pixel point in the second edge, and the specific pixel point is the pixel point with the minimum distance from the pixel point in the first edge among the pixel points included in the second edge. Exemplarily, the first edge includes a pixel 1 and a pixel 2, and the second edge includes a pixel 3 and a pixel 4, wherein a distance between the pixel 1 and the pixel 3 is smaller than a distance between the pixel 1 and the pixel 4, and a distance between the pixel 2 and the pixel 3 is greater than a distance between the pixel 2 and the pixel 4, so that the matching pixel of the pixel 1 is the pixel 3, the matching pixel of the pixel 2 is the pixel 4, and similarly, the matching pixel of the pixel 3 is the pixel 1, and the matching pixel of the pixel 4 is the pixel 2.
It should be noted that, in general, a pixel point in the first edge or the second edge and its matching pixel point match each other, as with pixel point 1 and pixel point 3, and pixel point 2 and pixel point 4, above. In special cases, however, the relation is not symmetric: the matching pixel point of a pixel point's matching pixel point may not be that pixel point itself. For example, the matching pixel point of pixel point 5 in the first edge may be pixel point 6 in the second edge, while the matching pixel point of pixel point 6 may not be pixel point 5 but, for example, pixel point 7 in the first edge.
Because the matching pixel point of each pixel point is the pixel point which belongs to different edges and has the minimum distance with the pixel point, the distance information between each pixel point and the matching pixel point can represent the width of the gluing area at the position of the pixel point. Therefore, after the first edge and the second edge are determined, the width information between the first edge and the second edge can be determined based on the distance information between each pixel point and the corresponding matched pixel point, and the width information is used as the width information of the gluing area.
In the embodiment of the present invention, there are various ways to determine the width information between the first edge and the second edge. For example, in one implementation, a target pixel point may be selected from the pixel points included in the first edge and the second edge, and the distance information between the target pixel point and its matching pixel point is used as the width information between the first edge and the second edge. Alternatively, the mean of the distances represented by the distance information between each pixel point and its matching pixel point may be used as the width information between the first edge and the second edge.
In order to enable the determined width information to reflect the quality of the gluing process more accurately and comprehensively, in the embodiment of the present invention, the width information of the gluing area may be the maximum width and/or the minimum width between the first edge and the second edge.
In this case, the determination may be made in the following manner, including:
determining minimum distance information from the distance information between each pixel point and the corresponding matched pixel point in the first edge and the second edge, and taking the distance represented by the determined distance information as the minimum width of the gluing area; and/or,
determining the maximum distance information from the distance information between each pixel point and the corresponding matching pixel point in the first edge and the second edge, and taking the distance represented by the determined distance information as the maximum width of the gluing area;
and taking the minimum width and/or the maximum width as the width information of the gluing area.
Illustratively, suppose the first edge comprises pixel point a, pixel point b and pixel point c, and the second edge comprises pixel point d, pixel point e and pixel point f, where pixel point a and pixel point d are matching pixel points with distance information "distance ad", pixel point b and pixel point e are matching pixel points with distance information "distance be", pixel point c and pixel point f are matching pixel points with distance information "distance cf", and distance ad < distance cf < distance be. The minimum distance information is then determined to be distance ad, i.e. the minimum width of the gluing area is distance ad, and the maximum distance information is determined to be distance be, i.e. the maximum width of the gluing area is distance be. In this case, the determined width information of the gluing area is: minimum width: distance ad, and/or maximum width: distance be.
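A minimal sketch of the definition above, assuming each edge is given as an array of (x, y) pixel coordinates; the brute-force nearest-neighbour search simply mirrors the matching-pixel-point definition and is not a prescribed implementation. The toy coordinates at the end stand in for pixel points a to f.

    # Illustrative sketch: for every pixel point on either edge, compute the
    # distance to its matching pixel point (nearest pixel point on the other
    # edge), then take the minimum and maximum as the width information.
    import numpy as np

    def matching_distances(edge_a: np.ndarray, edge_b: np.ndarray) -> np.ndarray:
        """For each point of edge_a, distance to the nearest point of edge_b."""
        diff = edge_a[:, None, :] - edge_b[None, :, :]   # pairwise differences
        sq = (diff ** 2).sum(axis=-1)                    # squared distances, shape (n, m)
        return np.sqrt(sq.min(axis=1))

    def width_info(edge1: np.ndarray, edge2: np.ndarray):
        d = np.concatenate([matching_distances(edge1, edge2),
                            matching_distances(edge2, edge1)])
        return float(d.min()), float(d.max())   # (minimum width, maximum width)

    # toy stand-ins for pixel points a, b, c (first edge) and d, e, f (second edge)
    edge1 = np.array([[0, 0], [0, 10], [0, 20]], dtype=float)
    edge2 = np.array([[3, 0], [6, 10], [4, 20]], dtype=float)
    print(width_info(edge1, edge2))   # -> (3.0, 6.0): distance ad and distance be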
In the above scheme of the embodiment of the invention, after the glue-coated image is obtained, the first edge and the second edge of the glue-coated area can be identified, and then the width information between the first edge and the second edge is determined by using the distance information between each pixel point and the matched pixel point in the first edge and the second edge, so that the distance identification at the pixel level can be realized, and the accuracy of the determined width information is higher. Furthermore, the embodiment of the invention can automatically realize the measurement of the width information after the gluing image is obtained, has higher efficiency compared with a manual measurement mode, and can reduce the cost of the measurement of the width information because the operator does not need to select the measurement point.
In order to further improve the accuracy of the obtained width information, in an implementation manner of step S103 in the embodiment of the present invention, the first edge and the second edge may be edge-corrected based on edge end points of the first edge and the second edge, and then the width information between the first edge and the second edge is determined based on distance information between each pixel point and the corresponding matching pixel point in the first edge after the edge correction and the second edge after the edge correction, and is used as the width information of the gluing area.
The edge end points of the first edge and the second edge may be the starting points or ending points of the first edge and the second edge, for example, the intersection points of the first edge and the second edge with the image border in the gluing image shown in fig. 3 (b). Edge correction of the first edge and the second edge may include smoothing the edges, breakpoint optimization, and the like. Optionally, when the gluing image only covers a local gluing area, the first edge and the second edge often have end points on the image border, and for pixel points near such an end point, the distance information between the pixel point and the corresponding matching pixel point is often large. For example, fig. 4 is a schematic diagram of edges in a gluing image according to an embodiment of the present invention, which includes a first edge AB and a second edge CD. Due to the limitation of the sampling range of the gluing image, for the pixel point C in the second edge, the corresponding matching pixel point is pixel point A, so the distance between this matching pair is the distance AC, which is significantly too large; pixel point C is therefore a redundant pixel point.
Because the pixel points in the corrected first edge and the corrected second edge are more accurate, the width information between the first edge and the second edge can be determined by utilizing the distance information between each pixel point and the corresponding matched pixel point in the first edge after the edge correction and the second edge after the edge correction, and the accuracy of the width information determination can be improved.
In one implementation manner, the edge endpoint referred to in this embodiment of the present invention may include: a first endpoint and a second endpoint of the first edge, and a third endpoint and a fourth endpoint of the second edge. For example, for fig. 4, the first endpoint is a, the second endpoint is B, the third endpoint is C, and the fourth endpoint is D.
In this case, the first edge may be edge-corrected based on the third end point and the fourth end point. Specifically, from the pixel points contained in the first edge, the pixel point with the minimum distance from the third end point is determined as a first candidate pixel point, and the pixel point with the minimum distance from the fourth end point is determined as a second candidate pixel point; then, among the pixel points contained in the first edge, the pixel points that do not lie in the range between the first candidate pixel point and the second candidate pixel point are determined as redundant pixel points of the first edge; finally, the redundant pixel points among the pixel points contained in the first edge are eliminated.
Further, the second edge may be edge-corrected based on the first end point and the second end point. Specifically, from the pixel points contained in the second edge, the pixel point with the minimum distance from the first end point is determined as a third candidate pixel point, and the pixel point with the minimum distance from the second end point is determined as a fourth candidate pixel point; then, among the pixel points contained in the second edge, the pixel points that do not lie in the range between the third candidate pixel point and the fourth candidate pixel point are determined as redundant pixel points of the second edge, and these redundant pixel points are eliminated.
Taking the example shown in fig. 4: for the first edge, the matching pixel point of the first end point A is pixel point E, the matching pixel point of the second end point B is pixel point F, the matching pixel point of the third end point C is pixel point A, and the matching pixel point of the fourth end point D is pixel point B. The first edge therefore contains no redundant pixel points and is the same before and after correction. For the second edge, the corrected second edge only contains the pixel points in the segment EF; the pixel points in the segments CE and FD are redundant pixel points and are eliminated.
In an implementation manner, the correction can also be carried out in terms of pixel point sets. Specifically, let the pixel point set of the first edge be p_1 and the pixel point set of the second edge be p_2, with:
p_1 = [p_10, p_11, ..., p_1(n-2), p_1(n-1)]
p_2 = [p_20, p_21, ..., p_2(m-2), p_2(m-1)]
wherein n is the number of pixel points contained in the first edge, and m is the number of pixel points contained in the second edge.
Let the first end point be p_10, the second end point be p_1(n-1), the third end point be p_20 and the fourth end point be p_2(m-1). The distance information between the first end point p_10 and every pixel point of the set p_2 is calculated; for convenience of calculation the distance information may be the square of the distance, i.e.
dist²(p_10, p_2i), where i = 0, 1, ..., m-2, m-1,
and its minimum value is determined:
min over i of dist²(p_10, p_2i) (where i = 0, 1, ..., m-2, m-1),
recording the index position i of the corresponding coordinate point in the pixel point set p_2. In the same way, the squared distances from the end point p_1(n-1) to all points of the set p_2, from the end point p_20 to all points of the set p_1, and from the end point p_2(m-1) to all points of the set p_1 are calculated, and their minimum values are determined:
min over j of dist²(p_1(n-1), p_2j) (where j = 0, 1, ..., m-2, m-1)
min over k of dist²(p_20, p_1k) (where k = 0, 1, ..., n-2, n-1)
min over l of dist²(p_2(m-1), p_1l) (where l = 0, 1, ..., n-2, n-1)
Correspondingly, the index positions j, k and l of the coordinate points in the pixel point set p_1 and the pixel point set p_2 are recorded.
Because the correspondence between the end points of the pixel point sets and the edge end points of the gluing in the image is uncertain, i and j need to be compared. Specifically:
if i < j, all pixel points of the set p_2 whose index positions lie from i to j are taken as valid pixel points, namely
p_2i, p_2(i+1), ..., p_2(j-1), p_2j,
and the remaining pixel points of the set p_2 are redundant pixel points;
if j < i, all pixel points of the set p_2 whose index positions lie from j to i are taken as valid pixel points, namely
p_2j, p_2(j+1), ..., p_2(i-1), p_2i,
and the remaining pixel points of the set p_2 are redundant pixel points.
Similarly, k and l are compared to obtain the pixel point set of p_1 after the redundant pixel points are removed:
p_1k, p_1(k+1), ..., p_1(l-1), p_1l, or p_1l, p_1(l+1), ..., p_1(k-1), p_1k.
Because the outcomes of the comparisons of i with j and of k with l do not affect the width calculation that follows, for clarity of description the pixel point set of the corrected first edge and the pixel point set of the corrected second edge, with the redundant pixel points removed, are respectively recorded as:
p_1' = [p_1k, p_1(k+1), ..., p_1(l-1), p_1l]
p_2' = [p_2i, p_2(i+1), ..., p_2(j-1), p_2j]
After the redundant pixel points in the first edge and the second edge are removed, the width information between the first edge and the second edge is determined, as the width information of the gluing area, based on the distance information between each remaining pixel point in the first edge and the second edge and its corresponding matching pixel point, i.e. using the pixel points in p_1' and p_2' above. The specific implementation process of determining the width information is the same as or similar to that of step S103, and is not repeated here in the embodiment of the present invention.
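A compact sketch of the correction just described, assuming each edge is an ordered array of (x, y) pixel coordinates; the function and variable names are illustrative and not taken from the patent.

    # Illustrative sketch of the end-point-based edge correction: for each edge,
    # find the points nearest to the other edge's two end points and keep only
    # the index range between them.
    import numpy as np

    def nearest_index(point: np.ndarray, edge: np.ndarray) -> int:
        """Index of the pixel point of 'edge' with minimum squared distance to 'point'."""
        sq = ((edge - point) ** 2).sum(axis=1)
        return int(np.argmin(sq))

    def correct_edge(edge: np.ndarray, other_edge: np.ndarray) -> np.ndarray:
        """Remove redundant pixel points of 'edge' using the end points of 'other_edge'."""
        a = nearest_index(other_edge[0], edge)    # nearest to the other edge's first end point
        b = nearest_index(other_edge[-1], edge)   # nearest to the other edge's second end point
        lo, hi = min(a, b), max(a, b)             # the order of a and b is uncertain
        return edge[lo:hi + 1]                    # keep only the valid pixel points

    # p1, p2: ordered pixel point sets of the first and second edge
    # p1_corrected = correct_edge(p1, p2); p2_corrected = correct_edge(p2, p1)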
According to the technical scheme of the embodiment of the invention, after the gluing image is obtained, the accuracy of width information determination can be improved, the efficiency is higher, and the cost of width information measurement can be reduced. Meanwhile, before the width information is determined, the accuracy of the width information can be further improved by correcting the first edge and the second edge.
In an embodiment, before performing step S103, distance information between each pixel point of the first edge and the second edge and the corresponding matching pixel point may also be determined. It should be emphasized that, in this embodiment, the first edge and the second edge to be referred to may be the corrected first edge and the corrected second edge obtained by the aforementioned edge correction, or may be the first edge and the second edge without the edge correction.
Optionally, before step S103 is executed, distance information between each pixel point in the first edge and each pixel point in the second edge may be determined, and then for each pixel point in the first edge and the second edge, the minimum distance information is determined from the distance information between the pixel point and each corresponding pixel point to be screened, and the minimum distance information is used as the distance information between the pixel point and the corresponding matching pixel point, where the pixel point to be screened corresponding to each pixel point is a pixel point belonging to a different edge from the pixel point.
In one implementation, the distance information matrix d may be constructed based on distance information between each pixel point in the first edge and each pixel point in the second edge:
d =
[ d_11  d_12  ...  d_1m ]
[ d_21  d_22  ...  d_2m ]
[ ...   ...   ...  ...  ]
[ d_n1  d_n2  ...  d_nm ]
wherein n is the number of pixel points contained in the first edge, m is the number of pixel points contained in the second edge, and each element d_uv in the distance information matrix d represents the distance information between the u-th pixel point in the first edge and the v-th pixel point in the second edge, with 1 ≤ u ≤ n and 1 ≤ v ≤ m.
And then determining the minimum distance information in each row in the distance information matrix d as the distance information between the pixel point of the row and the corresponding matching pixel point, and determining the minimum distance information in each column in the distance information matrix d as the distance information between the pixel point of the column and the corresponding matching pixel point.
For the convenience of subsequent calculation, after the minimum distance information in each row of the distance information matrix d is determined, a first distance information matrix d_row may be constructed:
d_row = [d_1row, d_2row, ..., d_(n-1)row, d_nrow]
wherein each element d_urow of the first distance information matrix d_row represents the minimum distance information in the u-th row of the distance information matrix d.
After the minimum distance information in each column of the distance information matrix d is determined, a second distance information matrix d_col is constructed:
d_col = [d_1col, d_2col, ..., d_(m-1)col, d_mcol]
wherein each element d_vcol of the second distance information matrix d_col represents the minimum distance information in the v-th column of the distance information matrix d.
In this case, determining the width information between the first edge and the second edge, as the width information of the gluing area, based on the distance information between each pixel point and the corresponding matching pixel point in the first edge and the second edge may include: determining the minimum distance information and the maximum distance information in the first distance information matrix d_row and in the second distance information matrix d_col, recorded as d_rowmin, d_rowmax and d_colmin, d_colmax respectively; calculating the minimum of d_rowmin and d_colmin, recorded as d_min; calculating the maximum of d_rowmax and d_colmax, recorded as d_max; and finally taking d_min and d_max as the width information of the gluing area.
It should be emphasized that the distance information may be the pixel distance between two pixel points, or the square of that pixel distance. If the distance information is the square of the pixel distance, then after the minimum distance information and the maximum distance information are determined, the distances they represent can be used as the minimum and/or maximum width of the gluing area.
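The following is a small sketch of the matrix-based formulation above, assuming SciPy is available; cdist builds the n x m distance matrix d, and the row and column minima give the matching distances from which d_min and d_max are taken.

    # Illustrative sketch of the distance-matrix formulation: build d (n x m),
    # take row minima (d_row) and column minima (d_col), then derive the
    # minimum and maximum width. Edge arrays hold (x, y) pixel coordinates.
    import numpy as np
    from scipy.spatial.distance import cdist

    def width_from_matrix(p1: np.ndarray, p2: np.ndarray):
        d = cdist(p1, p2)             # d[u, v] = distance between p1[u] and p2[v]
        d_row = d.min(axis=1)         # matching distance of each pixel point of the first edge
        d_col = d.min(axis=0)         # matching distance of each pixel point of the second edge
        d_min = min(d_row.min(), d_col.min())   # minimum width of the gluing area
        d_max = max(d_row.max(), d_col.max())   # maximum width of the gluing area
        return d_min, d_max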
In the above scheme of the embodiment of the invention, after the gluing image is obtained, the accuracy of width information determination can be improved, the efficiency is higher, and the cost of width information measurement can be reduced. Meanwhile, before the width information is determined, the distance information between each pixel point in the first edge and the second edge and the corresponding matching pixel point is determined, so that a realization basis can be provided for improving the accuracy.
As shown in fig. 5, an embodiment of the present invention further provides a method for determining width information, which may include the following steps:
s501, acquiring an original image; the step can obtain a gluing image, also called as an original image;
s502, preprocessing an image; the step can be used for preprocessing the gluing image, such as identifying a gluing area;
s503, extracting edges; in this step, edge extraction may be performed on the glued area in the glued image, so as to obtain a pixel point set corresponding to two edges (a first edge and a second edge), and the method is set as follows:
p 1 =[p 11 ,p 12 ,...,p 1(n-2) ,p 1(n-1) ]
p 2 =[p 21 ,p 22 ,...,p 2(m-2) ,p 2(m-1) ]
set pixel point set p 1 And p 2 The horizontal and vertical coordinates in (1) are respectively set as:
p 1r .x、p 1r .y、p 2s .x、p 2s y, wherein r =0,1, \8230;, n-2, n-1, s =0,1, \8230;, m-2, m-1
Then the pixel point set p 1 Pixel-to-pixel set p in (1) 2 Distance between inner pixel pointsThe squares of the distance and distance are set as:
Figure BDA0003813416540000141
Figure BDA0003813416540000142
s504, removing redundant pixel points; after the edge extraction is performed, redundant pixel points can be eliminated from the first edge and the second edge, and optionally, a pixel point set p is set 1 And p 2 The end points of (1), i.e. the end points of the two edges, are respectively p 10 、p 1(n-1) And p 20 、p 2(m-1) Calculating the endpoint p 10 To a set of pixel points p 2 The square of the distance of all points, i.e.
Figure BDA0003813416540000143
And calculating the minimum value:
Figure BDA0003813416540000144
(wherein i =0,1, \8230;, m-2,m-1)
Simultaneously recording corresponding pixel point set p 2 The index position i of the coordinate point in (1) is calculated by the same method as the endpoint p 1(n-1) To a set of pixel points p 2 Distance squared of all points, end point p 20 To a set of pixel points p 1 Distance squared of all points, end point p 2(m-1) Set of pixels p 1 The square of the distances of all points, the minimum of which is calculated:
Figure BDA0003813416540000145
(wherein j =0,1, \8230;, m-2,m-1)
Figure BDA0003813416540000151
(where k =0,1, \8230;, n-2,n-1)
Figure BDA0003813416540000152
(wherein l =0,1, \8230;, n-2,n-1)
Simultaneously recording corresponding pixel point set p 1 And p 2 The index position j, k, l where the coordinate point in (1) is located. The corresponding relation between the end point of the pixel point set and the edge end point of the image gluing is uncertain, so that i and j need to be compared, and if i is less than j, the pixel point set p is subjected to comparison 2 All points with index positions i to j, i.e.
p 2i ,p 2(i+1) ,…,p 2(j-1) ,p 2j
The other points are redundant points; if j is less than i, then the pixel point set p 2 All points with index positions j to i, i.e.
p 2j ,p 2(j+1) ,…,p 2(i-1) ,p 2i
The remaining points are redundant points. Similarly, k and l are compared to obtain a pixel point set p 1 The set of pixel points after the redundant points are removed is as follows:
p 1k ,p 1(k+1) ,…,p 1(l-1) ,p 1l or p 1l ,p 1(l+1) ,…,p 1(k-1) ,p 1k
The comparison results of i and j and k and l do not affect the following width calculation, so for convenience of introducing the algorithm, the compared result, that is, the pixel point set with the redundant points removed is respectively recorded as one of the comparison results, that is, the comparison results are recorded as the comparison results
p 1 ′=[p 1k ,p 1(k+1) ,…,p 1(l-1) ,p 1l ]
p 2 ′=[p 2i ,p 2(i+1) ,…,p 2(j-1) ,p 2j ]
S505, calculating width information;
After the redundant points are removed, the width calculation can be performed. Optionally, the squared distance from every point of p_1' to every point of p_2' can be calculated, giving the matrix d², in which the element in the h-th row and f-th column is the squared distance between the h-th point of p_1' and the f-th point of p_2'.
Thus, in the matrix d², the data of the h-th row represent the squared distances from the h-th point of p_1' to all points of p_2', and the data of the f-th column represent the squared distances from the f-th point of p_2' to all points of p_1'.
The minimum value of each row is calculated; physically, it represents the distance from a point of p_1' to the edge corresponding to p_2'. The calculation results are recorded as the vector d²_row.
The minimum value of each column is calculated; physically, it represents the distance from a point of p_2' to the edge corresponding to p_1'. The calculation results are recorded as the vector d²_col.
The minimum values of d²_row and of d²_col are calculated and recorded as d²_rowmin and d²_colmin respectively, and the minimum of these two is recorded as d²_min. The maximum values of d²_row and of d²_col are calculated and recorded as d²_rowmax and d²_colmax respectively, and the maximum of these two is recorded as d²_max.
The minimum distance and the maximum distance of the gluing area are recorded as d_min and d_max respectively; then:
d_min = sqrt(d²_min)
d_max = sqrt(d²_max)
d_min and d_max are then taken as the width information of the gluing area.
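A hedged end-to-end sketch of steps S503 to S505 above (edge pixel point sets, redundant-point removal, then minimum and maximum width); the arrays are assumed to hold (x, y) pixel coordinates ordered along each edge, and the function names are illustrative.

    # Illustrative sketch of S503-S505: squared-distance matrix, end-point-based
    # removal of redundant points, then minimum/maximum width of the gluing area.
    import numpy as np

    def sq_dists(a: np.ndarray, b: np.ndarray) -> np.ndarray:
        """Matrix of squared distances between point sets a (n x 2) and b (m x 2)."""
        diff = a[:, None, :] - b[None, :, :]
        return (diff ** 2).sum(axis=-1)

    def remove_redundant(p1: np.ndarray, p2: np.ndarray):
        """S504: keep only the index range between the points nearest to the other edge's end points."""
        i = int(np.argmin(sq_dists(p1[:1], p2)))    # nearest point of p2 to p1's first end point
        j = int(np.argmin(sq_dists(p1[-1:], p2)))   # nearest point of p2 to p1's last end point
        k = int(np.argmin(sq_dists(p2[:1], p1)))    # nearest point of p1 to p2's first end point
        l = int(np.argmin(sq_dists(p2[-1:], p1)))   # nearest point of p1 to p2's last end point
        p2c = p2[min(i, j):max(i, j) + 1]
        p1c = p1[min(k, l):max(k, l) + 1]
        return p1c, p2c

    def glue_width(p1: np.ndarray, p2: np.ndarray):
        """S505: minimum and maximum width between the corrected edges."""
        p1c, p2c = remove_redundant(p1, p2)
        d2 = sq_dists(p1c, p2c)
        d2_row, d2_col = d2.min(axis=1), d2.min(axis=0)
        d2_min = min(d2_row.min(), d2_col.min())
        d2_max = max(d2_row.max(), d2_col.max())
        return float(np.sqrt(d2_min)), float(np.sqrt(d2_max))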
In the above scheme of the embodiment of the invention, after the glue-coated image is obtained, the first edge and the second edge of the glue-coated area can be identified, and then the width information between the first edge and the second edge is determined by using the distance information between each pixel point and the matched pixel point in the first edge and the second edge, so that the distance identification at the pixel level can be realized, and the accuracy of the determined width information is higher. Furthermore, because the embodiment of the invention can automatically realize the measurement of the width information after the gluing image is obtained, compared with a manual measurement mode, the efficiency is higher, and because the selection of a measurement point by an operator is not needed, the cost of the measurement of the width information can be reduced.
Corresponding to the width information determining method provided in the foregoing embodiment of the present invention, as shown in fig. 6, an embodiment of the present invention further provides a width information determining apparatus, where the apparatus includes:
the image acquisition module 601 is configured to acquire a gluing image including a gluing area, where the gluing area is an area of a gluing part that has been glued;
an edge identification module 602, configured to perform edge identification on the gluing area in the gluing image to obtain a first edge and a second edge of the gluing area;
an information determining module 603, configured to determine, based on distance information between each pixel point and the corresponding matching pixel point in the first edge and the second edge, width information between the first edge and the second edge as the width information of the gluing area; wherein the matching pixel point corresponding to each pixel point is: the pixel point that belongs to a different edge from that pixel point and is at the minimum distance from it.
Optionally, the information determining module is specifically configured to perform edge correction on the first edge and the second edge based on edge end points of the first edge and the second edge; and determining width information between the first edge and the second edge as width information of the gluing area based on distance information between each pixel point and the corresponding matching pixel point in the first edge after edge correction and the second edge after edge correction.
Optionally, the edge endpoint includes: a first endpoint and a second endpoint of the first edge, and a third endpoint and a fourth endpoint of the second edge;
the information determining module is specifically configured to perform edge correction on the first edge based on the third endpoint and the fourth endpoint; performing edge correction on the second edge based on the first endpoint and the second endpoint.
Optionally, the information determining module includes:
the first correction submodule is used for determining a pixel point with the minimum distance from the third end point from all pixel points contained in the first edge to be used as a first alternative pixel point, and determining a pixel point with the minimum distance from the fourth end point to be used as a second alternative pixel point; determining pixel points which do not belong to the range of the first alternative pixel point and the second alternative pixel point in all pixel points included in the first edge as redundant pixel points in the first edge; eliminating redundant pixel points in all pixel points contained in the first edge;
the second correction submodule is used for determining a pixel point with the minimum distance from the first end point from all pixel points contained in the second edge as a third alternative pixel point, and determining a pixel point with the minimum distance from the second end point as a fourth alternative pixel point; determining pixel points which do not belong to the range of the third candidate pixel point and the fourth candidate pixel point in the pixel points included in the second edge as redundant pixel points in the second edge; and eliminating redundant pixel points in all pixel points contained in the second edge.
Optionally, the information determining module is specifically configured to determine, based on the distance information between each pixel point remaining in the first edge and the second edge after the redundant pixel points are removed and the corresponding matching pixel point, width information between the first edge and the second edge as the width information of the gluing area.
Optionally, the information determining module is further configured to, before the width information between the first edge and the second edge is determined as the width information of the gluing area, determine distance information between each pixel point in the first edge and each pixel point in the second edge; and, for each pixel point in the first edge and the second edge, determine minimum distance information from the distance information between the pixel point and each corresponding pixel point to be screened, as the distance information between the pixel point and the corresponding matching pixel point, where the pixel points to be screened corresponding to each pixel point are the pixel points belonging to a different edge from the pixel point.
Optionally, the information determining module is specifically configured to construct a distance information matrix d based on distance information between each pixel point in the first edge and each pixel point in the second edge:
d = \begin{bmatrix} d_{11} & d_{12} & \cdots & d_{1m} \\ d_{21} & d_{22} & \cdots & d_{2m} \\ \vdots & \vdots & \ddots & \vdots \\ d_{n1} & d_{n2} & \cdots & d_{nm} \end{bmatrix}
where n is the number of pixel points included in the first edge, m is the number of pixel points included in the second edge, and each element d_{uv} in the distance information matrix d represents the distance information between the u-th pixel point in the first edge and the v-th pixel point in the second edge;
determining the minimum distance information in each row in the distance information matrix d as the distance information between the pixel points of the row and the corresponding matched pixel points; and determining the minimum distance information in each column in the distance information matrix d as the distance information between the pixel point of the column and the corresponding matched pixel point.
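As a minimal sketch of this step, and not the patented implementation, the distance information matrix d and the per-row and per-column minima can be computed as follows; the function name matched_distances and the use of NumPy and Euclidean distance are assumptions made for illustration.

```python
import numpy as np

def matched_distances(edge1, edge2):
    """Build the n x m distance matrix d and take row/column minima.

    edge1: (n, 2) pixel coordinates of the first edge.
    edge2: (m, 2) pixel coordinates of the second edge.
    Returns the matched distance for each pixel of the first edge
    (row minima) and for each pixel of the second edge (column minima).
    """
    # d[u, v] = Euclidean distance between the u-th pixel of the first edge
    # and the v-th pixel of the second edge.
    diff = edge1[:, None, :] - edge2[None, :, :]
    d = np.linalg.norm(diff, axis=2)

    row_min = d.min(axis=1)  # distance to the matching pixel, per first-edge pixel
    col_min = d.min(axis=0)  # distance to the matching pixel, per second-edge pixel
    return row_min, col_min
```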
Optionally, the information determining module is specifically configured to determine minimum distance information from distance information between each pixel point and the corresponding matching pixel point in the first edge and the second edge, and use a distance represented by the determined distance information as the minimum width of the gluing area; and/or determining the maximum distance information from the distance information between each pixel point and the corresponding matched pixel point in the first edge and the second edge, and taking the distance represented by the determined distance information as the maximum width of the gluing area; and taking the minimum width and/or the maximum width as the width information of the gluing area.
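Continuing the same illustrative sketch, the minimum and/or maximum width of the gluing area would then be the extrema of the matched distances of both edges; the helper below (with the assumed name width_info) simply takes the row and column minima produced by the previous sketch as its inputs.

```python
import numpy as np

def width_info(row_min, col_min):
    """Minimum and maximum width from the matched distances of both edges."""
    all_matched = np.concatenate([row_min, col_min])
    return float(all_matched.min()), float(all_matched.max())
```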
Optionally, the edge identification module is further configured to perform gluing area identification on the gluing image before performing edge identification on the gluing area in the gluing image to obtain a first edge and a second edge of the gluing area, so as to determine the gluing area in the gluing image.
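For context only, one conventional way to obtain the gluing area and an edge contour from the gluing image is a threshold-plus-contour pipeline; this is an assumption for illustration (the embodiment may instead use, for example, a trained segmentation model), and the OpenCV calls shown are standard library functions rather than anything prescribed by this disclosure.

```python
import cv2

def find_glue_region(image_bgr):
    """Illustrative stand-in for gluing-area identification and edge recognition.

    Returns the contour pixels of the largest segmented region; splitting
    the contour into the first and second edges is application-specific.
    """
    gray = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY)
    # Assume the glue strip can be separated from the background by
    # Otsu thresholding; a real system might use a learned model instead.
    _, mask = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    # OpenCV 4.x signature: returns (contours, hierarchy).
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_NONE)
    glue = max(contours, key=cv2.contourArea)  # largest blob as the gluing area
    return glue.reshape(-1, 2)
```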
In the above scheme of the embodiment of the invention, after the gluing image is obtained, the first edge and the second edge of the gluing area can be identified, and the width information between the first edge and the second edge is then determined using the distance information between each pixel point and its matching pixel point in the first edge and the second edge. Distance identification is thus performed at the pixel level, so the determined width information is more accurate. Furthermore, the embodiment of the invention measures the width information automatically once the gluing image is obtained, which is more efficient than manual measurement, and because an operator does not need to select measurement points, the cost of measuring the width information can be reduced.
An embodiment of the present invention further provides an electronic device, as shown in fig. 7, including a processor 701, a communication interface 702, a memory 703 and a communication bus 704, where the processor 701, the communication interface 702, and the memory 703 communicate with each other through the communication bus 704,
a memory 703 for storing a computer program;
the processor 701 is configured to implement the following steps when executing the program stored in the memory 703:
acquiring a gluing image containing a gluing area, wherein the gluing area is a glued area in a gluing part;
performing edge recognition on the gluing area in the gluing image to obtain a first edge and a second edge of the gluing area;
determining width information between the first edge and the second edge as width information of the gluing area based on distance information between each pixel point and the corresponding matched pixel point in the first edge and the second edge;
wherein the matching pixel point corresponding to each pixel point is the pixel point that belongs to a different edge from the pixel point and has the minimum distance to it.
The communication bus mentioned in the electronic device may be a Peripheral Component Interconnect (PCI) bus, an Extended Industry Standard Architecture (EISA) bus, or the like. The communication bus may be divided into an address bus, a data bus, a control bus, etc. For ease of illustration, only one thick line is shown, but this does not mean that there is only one bus or one type of bus.
The communication interface is used for communication between the electronic equipment and other equipment.
The Memory may include a Random Access Memory (RAM) or a Non-Volatile Memory (NVM), such as at least one disk memory. Optionally, the memory may also be at least one memory device located remotely from the processor.
The Processor may be a general-purpose processor, including a Central Processing Unit (CPU), a Network Processor (NP), and the like; it may also be a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field Programmable Gate Array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, or discrete hardware components.
In yet another embodiment of the present invention, a computer-readable storage medium is further provided, in which a computer program is stored, and the computer program, when executed by a processor, implements the steps of any of the above-mentioned width information determining methods.
In yet another embodiment, a computer program product containing instructions is provided, which when run on a computer causes the computer to perform any of the width information determination methods of the above embodiments.
In the above embodiments, the implementation may be wholly or partially realized by software, hardware, firmware, or any combination thereof. When implemented in software, it may be realized in whole or in part in the form of a computer program product. The computer program product includes one or more computer instructions. When the computer instructions are loaded and executed on a computer, the processes or functions described in accordance with the embodiments of the invention are produced in whole or in part. The computer may be a general purpose computer, a special purpose computer, a computer network, or another programmable device. The computer instructions may be stored in a computer readable storage medium or transmitted from one computer readable storage medium to another, for example, from one website, computer, server, or data center to another website, computer, server, or data center via a wired (e.g., coaxial cable, optical fiber, Digital Subscriber Line (DSL)) or wireless (e.g., infrared, radio, microwave) connection. The computer-readable storage medium can be any available medium that can be accessed by a computer, or a data storage device such as a server or data center that integrates one or more available media. The available medium may be a magnetic medium (e.g., floppy disk, hard disk, magnetic tape), an optical medium (e.g., DVD), or a semiconductor medium (e.g., Solid State Disk (SSD)), among others.
It is noted that, herein, relational terms such as first and second are used solely to distinguish one entity or action from another entity or action, without necessarily requiring or implying any actual such relationship or order between such entities or actions. Also, the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a ..." does not exclude the presence of other identical elements in the process, method, article, or apparatus that comprises the element.
All the embodiments in the present specification are described in a related manner, and the same and similar parts among the embodiments may be referred to each other, and each embodiment focuses on differences from other embodiments. In particular, for the apparatus, the electronic device, the computer-readable storage medium, and the computer program product embodiments, since they are substantially similar to the method embodiments, the description is relatively simple, and for the relevant points, reference may be made to the partial description of the method embodiments.
The above description is only for the preferred embodiment of the present invention, and is not intended to limit the scope of the present invention. Any modification, equivalent replacement, or improvement made within the spirit and principle of the present invention shall fall within the protection scope of the present invention.

Claims (12)

1. A method for determining width information, the method comprising:
acquiring a gluing image containing a gluing area, wherein the gluing area is a glued area in a gluing part;
performing edge recognition on the gluing area in the gluing image to obtain a first edge and a second edge of the gluing area;
determining width information between the first edge and the second edge as width information of the gluing area based on distance information between each pixel point and the corresponding matched pixel point in the first edge and the second edge;
wherein the matching pixel point corresponding to each pixel point is: the pixel point which belongs to a different edge from the pixel point and has the minimum distance to the pixel point.
2. The method according to claim 1, wherein the determining width information between the first edge and the second edge as the width information of the glue spreading area based on distance information between each pixel point and a corresponding matching pixel point in the first edge and the second edge comprises:
performing edge correction on the first edge and the second edge based on edge end points of the first edge and the second edge;
and determining width information between the first edge and the second edge as width information of the gluing area based on distance information between each pixel point and the corresponding matching pixel point in the first edge after the edge correction and the second edge after the edge correction.
3. The method of claim 2, wherein the edge termination point comprises: a first endpoint and a second endpoint of the first edge, and a third endpoint and a fourth endpoint of the second edge;
the edge modification of the first edge and the second edge based on the edge end points of the first edge and the second edge includes:
performing edge correction on the first edge based on the third endpoint and the fourth endpoint;
and performing edge correction on the second edge based on the first end point and the second end point.
4. The method of claim 3, wherein the edge-modifying the first edge based on the third endpoint and the fourth endpoint comprises:
determining a pixel point with the minimum distance from the third end point from all pixel points contained in the first edge as a first candidate pixel point, and determining a pixel point with the minimum distance from the fourth end point as a second candidate pixel point;
determining pixel points which do not belong to the range of the first candidate pixel point and the second candidate pixel point in all pixel points included in the first edge as redundant pixel points in the first edge;
eliminating redundant pixel points in all pixel points contained in the first edge;
performing edge correction on the second edge based on the first endpoint and the second endpoint, including:
determining a pixel point with the minimum distance from the first end point from all pixel points contained in the second edge as a third candidate pixel point, and determining a pixel point with the minimum distance from the second end point as a fourth candidate pixel point;
determining pixel points which do not belong to the range of the third candidate pixel point and the fourth candidate pixel point in the pixel points included in the second edge as redundant pixel points in the second edge;
and eliminating redundant pixel points in all pixel points contained in the second edge.
5. The method according to any one of claims 2 to 4, wherein the determining width information between the first edge and the second edge as the width information of the gluing area based on distance information between each pixel point and the corresponding matching pixel point in the first edge after the edge correction and the second edge after the edge correction comprises:
and determining width information between the first edge and the second edge as width information of the gluing area based on distance information between each pixel point after the redundant pixel points are removed and the corresponding matched pixel points in the first edge and the second edge.
6. The method according to any one of claims 1 to 4, wherein before determining the width information between the first edge and the second edge as the width information of the glue spreading area based on the distance information between each pixel point and the corresponding matching pixel point in the first edge and the second edge, the method further comprises:
determining distance information between each pixel point in the first edge and each pixel point in the second edge;
and aiming at each pixel point in the first edge and the second edge, determining minimum distance information from the distance information between the pixel point and each corresponding pixel point to be screened as the distance information between the pixel point and the corresponding matched pixel point, wherein the pixel point to be screened corresponding to each pixel point is the pixel point belonging to different edges with the pixel point.
7. The method according to claim 6, wherein the determining, for each pixel point in the first edge and the second edge, minimum distance information from distance information between the pixel point and each corresponding pixel point to be filtered, as distance information between the pixel point and a corresponding matching pixel point, includes:
constructing a distance information matrix d based on the distance information between each pixel point in the first edge and each pixel point in the second edge:
d = \begin{bmatrix} d_{11} & d_{12} & \cdots & d_{1m} \\ d_{21} & d_{22} & \cdots & d_{2m} \\ \vdots & \vdots & \ddots & \vdots \\ d_{n1} & d_{n2} & \cdots & d_{nm} \end{bmatrix}
wherein n is the number of pixel points included in the first edge, m is the number of pixel points included in the second edge, and each element d_{uv} in the distance information matrix d represents the distance information between the u-th pixel point in the first edge and the v-th pixel point in the second edge;
determining the minimum distance information in each row in the distance information matrix d as the distance information between the pixel points of the row and the corresponding matched pixel points;
and determining the minimum distance information in each column in the distance information matrix d as the distance information between the pixel point of the column and the corresponding matched pixel point.
8. The method according to any one of claims 1 to 7, wherein the determining width information between the first edge and the second edge as the width information of the glue area based on distance information between each pixel point and the corresponding matching pixel point in the first edge and the second edge comprises:
determining minimum distance information from the distance information between each pixel point and the corresponding matched pixel point in the first edge and the second edge, and taking the distance represented by the determined distance information as the minimum width of the gluing area; and/or,
determining the maximum distance information from the distance information between each pixel point and the corresponding matching pixel point in the first edge and the second edge, and taking the distance represented by the determined distance information as the maximum width of the gluing area;
and taking the minimum width and/or the maximum width as the width information of the gluing area.
9. The method according to any one of claims 1 to 7, wherein before the performing edge recognition on the glue area in the glue image to obtain a first edge and a second edge of the glue area, the method further comprises:
and identifying a gluing area of the gluing image to determine the gluing area in the gluing image.
10. A width information determination apparatus, characterized in that the apparatus comprises:
the image acquisition module is used for acquiring a gluing image containing a gluing area, wherein the gluing area is a glued area in the gluing part;
the edge recognition module is used for carrying out edge recognition on the gluing area in the gluing image to obtain a first edge and a second edge of the gluing area;
an information determining module, configured to determine, based on distance information between each pixel point and a corresponding matching pixel point in the first edge and the second edge, width information between the first edge and the second edge as the width information of the gluing area; wherein the matching pixel point corresponding to each pixel point is: the pixel point which belongs to a different edge from the pixel point and has the minimum distance to the pixel point.
11. An electronic device, characterized by comprising a processor, a communication interface, a memory and a communication bus, wherein the processor, the communication interface and the memory communicate with each other through the communication bus;
a memory for storing a computer program;
a processor for implementing the method steps of any of claims 1-9 when executing a program stored in the memory.
12. A computer-readable storage medium, characterized in that a computer program is stored in the computer-readable storage medium, which computer program, when being executed by a processor, carries out the method steps of any one of the claims 1-9.
CN202211024084.3A 2022-08-24 2022-08-24 Width information determination method and device and electronic equipment Pending CN115439426A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211024084.3A CN115439426A (en) 2022-08-24 2022-08-24 Width information determination method and device and electronic equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202211024084.3A CN115439426A (en) 2022-08-24 2022-08-24 Width information determination method and device and electronic equipment

Publications (1)

Publication Number Publication Date
CN115439426A true CN115439426A (en) 2022-12-06

Family

ID=84244595

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211024084.3A Pending CN115439426A (en) 2022-08-24 2022-08-24 Width information determination method and device and electronic equipment

Country Status (1)

Country Link
CN (1) CN115439426A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116168041A (en) * 2023-04-26 2023-05-26 湖南隆深氢能科技有限公司 Real-time detection method and system applied to laminating device

Similar Documents

Publication Publication Date Title
US11080839B2 (en) System and method for training a damage identification model
CN108961184B (en) Method, device and equipment for correcting depth image
WO2022151658A1 (en) Defect detection method and apparatus, and computer device and computer-readable storage medium
CN111461113B (en) Large-angle license plate detection method based on deformed plane object detection network
CN109165657A (en) A kind of image feature detection method and device based on improvement SIFT
CN111027412A (en) Human body key point identification method and device and electronic equipment
US20190138840A1 (en) Automatic ruler detection
CN107341824B (en) Comprehensive evaluation index generation method for image registration
CN110969100A (en) Human body key point identification method and device and electronic equipment
CN110909664A (en) Human body key point identification method and device and electronic equipment
CN115439426A (en) Width information determination method and device and electronic equipment
CN111126268A (en) Key point detection model training method and device, electronic equipment and storage medium
CN114972268A (en) Defect image generation method and device, electronic equipment and storage medium
JP6390248B2 (en) Information processing apparatus, blur condition calculation method, and program
CN116977783A (en) Training method, device, equipment and medium of target detection model
CN109902695A (en) One kind is towards as to the correction of linear feature matched line feature and method of purification
CN112699886B (en) Character recognition method and device and electronic equipment
CN110874600B (en) Ion beam sputtering deposition film pit and particle discrimination method based on machine learning
CN117274132A (en) Multi-scale self-encoder generation method, electronic device and storage medium
CN110288576B (en) Light strip center extraction method, terminal device and storage medium
TWI807854B (en) Method for detecting defects, computer device and storage medium
CN114283191A (en) Corner position positioning method and device, electronic equipment and readable medium
CN109242823B (en) Reference image selection method and device for positioning calculation and automatic driving system
CN109146840B (en) Method and device for evaluating data geometric positioning precision based on Gaussian probability statistics
CN114359678A (en) Data labeling method and device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination