CN110070499A - Image processing method, device and computer readable storage medium - Google Patents

Image processing method, device and computer readable storage medium

Info

Publication number
CN110070499A
CN110070499A
Authority
CN
China
Prior art keywords
image
brightness
edge
luminance
initial
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201910193207.8A
Other languages
Chinese (zh)
Inventor
周景锦
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing ByteDance Network Technology Co Ltd
Original Assignee
Beijing ByteDance Network Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing ByteDance Network Technology Co Ltd filed Critical Beijing ByteDance Network Technology Co Ltd
Priority to CN201910193207.8A priority Critical patent/CN110070499A/en
Publication of CN110070499A publication Critical patent/CN110070499A/en
Pending legal-status Critical Current

Links

Classifications

    • G PHYSICS
        • G06 COMPUTING; CALCULATING OR COUNTING
            • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
                • G06T5/00 Image enhancement or restoration
                    • G06T5/20 Image enhancement or restoration using local operators
                        • G06T5/30 Erosion or dilatation, e.g. thinning
                    • G06T5/40 Image enhancement or restoration using histogram techniques
                    • G06T5/70 Denoising; Smoothing
                    • G06T5/77 Retouching; Inpainting; Scratch removal
                • G06T7/00 Image analysis
                    • G06T7/10 Segmentation; Edge detection
                        • G06T7/13 Edge detection
                    • G06T7/90 Determination of colour characteristics

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Image Processing (AREA)

Abstract

The disclosure provides an image processing method, an image processing apparatus, an image processing hardware device, and a computer readable storage medium. The image processing method includes: obtaining a luminance image according to the luminance component corresponding to an original image; performing edge detection processing on the luminance image to obtain an edge image; obtaining a tone image of the luminance image according to a predetermined reference image; and generating a pencil drawing style image corresponding to the original image according to the edge image and the tone image. By obtaining a luminance image from the luminance component corresponding to the original image, performing edge detection on it to obtain an edge image, obtaining a tone image of the luminance image from a predetermined reference image, and then generating a pencil drawing style image from the edge image and the tone image, the embodiments of the disclosure solve the technical problem that pencil-drawing-style filters in the prior art cannot achieve both good effects and good performance.

Description

Image processing method, apparatus and computer-readable storage medium
Technical Field
The present disclosure relates to the field of image processing technologies, and in particular, to an image processing method and apparatus, and a computer-readable storage medium.
Background
With the development of computer technology, intelligent terminals are used for an ever wider range of applications: for example, listening to music, playing games, chatting online and taking photographs. As for photography, the cameras of intelligent terminals have reached more than ten million pixels, with high definition and a photographing effect comparable to that of a professional camera.
At present, when photographing with an intelligent terminal, a user can not only obtain the conventional photographing effects of the camera software built in at the factory, but can also obtain additional effects by downloading an application program (APP for short) from the network, for example various filter effects. A typical example is a filter that converts the image into a pencil drawing style.
However, the pencil-drawing-style filters of the prior art cannot achieve both good effects and good performance: either they can only apply pencil drawing styling to a static picture, or they can apply it to a live camera feed but at a low processing speed, so the effect is poor. A pencil-drawing-style filter can also be built with a deep learning image algorithm, but this requires a powerful GPU as support, and even on a back-end server equipped with a high-performance GPU a real-time processing frame rate is difficult to achieve. In addition, deep-learning-based algorithms require a large amount of training data to be prepared in advance, which consumes much time and labor.
Disclosure of Invention
The present disclosure provides an image processing method to at least partially solve the technical problem that the pencil-drawing-style filters of the prior art cannot achieve both good effects and good performance. An image processing apparatus, an image processing hardware device, a computer readable storage medium and an image processing terminal are also provided.
In order to achieve the above object, according to one aspect of the present disclosure, the following technical solutions are provided:
an image processing method comprising:
obtaining a brightness image of an original image according to a brightness component corresponding to the original image;
carrying out edge detection processing on the brightness image to obtain an edge image of the brightness image;
acquiring a tone image of the luminance image according to a predetermined reference image; and
generating a pencil drawing style image corresponding to the original image according to the edge image and the tone image.
Further, the obtaining a luminance image of the original image according to the luminance component corresponding to the original image includes:
carrying out color space transformation processing on the original image, and extracting a brightness component from the transformed image to obtain an initial brightness image;
and obtaining the brightness image of the original image according to the initial brightness image.
Further, the obtaining a luminance image of the original image according to the initial luminance image includes:
and denoising the initial brightness image to obtain a brightness image of the original image.
Further, the performing an edge detection process on the luminance image to obtain an edge image of the luminance image includes:
carrying out edge detection on the brightness image to obtain an initial edge image;
and obtaining the edge image according to the initial edge image.
Further, the performing an edge detection process on the luminance image to obtain an edge image of the luminance image includes:
performing plane convolution on the first preset matrix template and the brightness image to obtain a first brightness difference value of each pixel point, and performing plane convolution on the second preset matrix template and the brightness image to obtain a second brightness difference value of each pixel point;
and determining a corresponding pixel value according to the first brightness difference value and the second brightness difference value aiming at each pixel point to obtain an edge image of the brightness image.
Further, the determining, for each pixel point, a corresponding pixel value according to the first luminance difference value and the second luminance difference value to obtain an edge image of the luminance image includes:
and calculating the root mean square of the first brightness difference value and the second brightness difference value aiming at each pixel point, and taking the root mean square as a corresponding pixel value to obtain an edge image of the brightness image.
Further, the obtaining the edge image according to the initial edge image includes:
performing at least one expansion treatment on the initial edge image to obtain an expanded image;
and obtaining the edge image according to the expanded image.
Further, the obtaining the edge image according to the expanded image includes:
performing an inversion operation on the pixel value of each pixel point in the expanded image;
and obtaining the edge image according to the pixel value after the inversion.
Further, the obtaining an edge image according to the inverted pixel value includes:
by the formula R(x, y) = (1.0 − Val(x, y))^α, calculating the α-th power of the inverted value of each pixel point, where Val(x, y) is the inverted pixel value, R(x, y) is the resulting pixel value, and α is an adjustable parameter;
the R(x, y) values of all pixel points constitute the edge image.
Further, the acquiring a tone image of the luminance image according to a predetermined reference image includes:
performing brightness histogram matching on the brightness image and the reference image to obtain a brightness mapping table;
and carrying out color mapping on the brightness image according to the brightness mapping table to obtain the tone image.
Further, the performing luminance histogram matching on the luminance image and the reference image to obtain a luminance mapping table includes:
acquiring a brightness histogram of the reference image;
adjusting the brightness distribution of the brightness image to enable the approximation degree of the histogram of the brightness image and the brightness histogram of the reference image to be larger than a preset value;
and determining the brightness mapping table according to the reference image and the adjusted brightness image.
Further, the generating a pencil drawing style image corresponding to the original image according to the edge image and the tone image includes:
and multiplying the edge image and the tone image to obtain a black and white pencil drawing style image corresponding to the original image.
Further, the generating a pencil drawing style image corresponding to the original image according to the edge image and the tone image includes:
obtaining a brightness component according to the tone image, and obtaining a color component according to the original image;
generating an initial colored pencil drawing style image according to the brightness component and the color component; and
converting the initial colored pencil drawing style image into the RGB color space to obtain a colored pencil drawing style image corresponding to the original image.
In order to achieve the above object, according to still another aspect of the present disclosure, the following technical solutions are also provided:
an image processing apparatus comprising:
the brightness separation module is used for obtaining a brightness image of the original image according to the brightness component corresponding to the original image;
the image processing module is used for carrying out edge detection processing on the brightness image to obtain an edge image of the brightness image and acquiring a tone image of the brightness image according to a preset reference image; and
and the pencil drawing generation module is used for generating a pencil drawing style image corresponding to the original image according to the edge image and the tone image.
Further, the luminance separating module includes:
the brightness separation unit is used for carrying out color space transformation processing on the original image and extracting a brightness component from the transformed image to obtain an initial brightness image;
and the brightness image generation unit is used for obtaining a brightness image of the original image according to the initial brightness image.
Further, the luminance image generating unit is specifically configured to: and denoising the initial brightness image to obtain a brightness image of the original image.
Further, the image processing module includes:
the edge detection unit is used for carrying out edge detection on the brightness image to obtain an initial edge image;
and the edge image generating unit is used for obtaining the edge image according to the initial edge image.
Further, the image processing module is specifically configured to: performing plane convolution on the first preset matrix template and the brightness image to obtain a first brightness difference value of each pixel point, and performing plane convolution on the second preset matrix template and the brightness image to obtain a second brightness difference value of each pixel point; and determining a corresponding pixel value according to the first brightness difference value and the second brightness difference value aiming at each pixel point to obtain an edge image of the brightness image.
Further, the image processing module is specifically configured to: and calculating the root mean square of the first brightness difference value and the second brightness difference value aiming at each pixel point, and taking the root mean square as a corresponding pixel value to obtain an edge image of the brightness image.
Further, the edge image generating unit is specifically configured to: performing at least one expansion treatment on the initial edge image to obtain an expanded image; and obtaining the edge image according to the expanded image.
Further, the edge image generating unit is specifically configured to: performing an inversion operation on the pixel value of each pixel point in the expanded image; and obtaining the edge image according to the pixel value after the inversion.
Further, the edge image generating unit is specifically configured to: calculate, by the formula R(x, y) = (1.0 − Val(x, y))^α, the α-th power of the inverted value of each pixel point, where Val(x, y) is the inverted pixel value, R(x, y) is the resulting pixel value, and α is an adjustable parameter; the R(x, y) values of all pixel points constitute the edge image.
Further, the image processing module includes:
the histogram matching unit is used for performing brightness histogram matching on the brightness image and the reference image to obtain a brightness mapping table;
and the tone image generation module is used for carrying out color mapping on the brightness image according to the brightness mapping table to obtain the tone image.
Further, the histogram matching unit is specifically configured to: acquiring a brightness histogram of the reference image; adjusting the brightness distribution of the brightness image to enable the approximation degree of the histogram of the brightness image and the brightness histogram of the reference image to be larger than a preset value; and determining the brightness mapping table according to the reference image and the adjusted brightness image.
Further, the pencil drawing generation module is specifically configured to: and multiplying the edge image and the tone image to obtain a black and white pencil drawing style image corresponding to the original image.
Further, the pencil drawing generation module is specifically configured to: obtain a brightness component according to the tone image, and obtain a color component according to the original image; generate an initial colored pencil drawing style image according to the brightness component and the color component; and convert the initial colored pencil drawing style image into the RGB color space to obtain a colored pencil drawing style image corresponding to the original image.
In order to achieve the above object, according to still another aspect of the present disclosure, the following technical solutions are also provided:
an electronic device, comprising:
a memory for storing non-transitory computer readable instructions; and
a processor for executing the computer readable instructions, such that when executing them the processor implements the steps of any of the image processing method solutions above.
In order to achieve the above object, according to still another aspect of the present disclosure, the following technical solutions are also provided:
a computer readable storage medium storing non-transitory computer readable instructions which, when executed by a computer, cause the computer to perform the steps of any of the image processing method aspects described above.
In order to achieve the above object, according to still another aspect of the present disclosure, the following technical solutions are also provided:
an image processing terminal comprises any one of the image processing devices.
According to the embodiments of the present disclosure, a luminance image of the original image is obtained from the luminance component corresponding to the original image; edge detection processing is performed on the luminance image to obtain its edge image; a tone image of the luminance image is obtained from a predetermined reference image; and a pencil drawing style image corresponding to the original image is generated from the edge image and the tone image, thereby solving the technical problem that prior-art pencil-drawing-style filters cannot achieve both good effects and good performance.
The foregoing is a summary of the present disclosure, and for the purposes of promoting a clear understanding of the technical means of the present disclosure, the present disclosure may be embodied in other specific forms without departing from the spirit or essential attributes thereof.
Drawings
FIG. 1 is a schematic flow diagram of an image processing method according to one embodiment of the present disclosure;
FIG. 2 is a schematic diagram of an apparatus for image processing according to one embodiment of the present disclosure;
fig. 3 is a schematic structural diagram of an electronic device according to an embodiment of the present disclosure.
Detailed Description
The embodiments of the present disclosure are described below with specific examples, and other advantages and effects of the present disclosure will be readily apparent to those skilled in the art from the disclosure in the specification. It is to be understood that the described embodiments are merely illustrative of some, and not restrictive, of the embodiments of the disclosure. The disclosure may be embodied or carried out in various other specific embodiments, and various modifications and changes may be made in the details within the description without departing from the spirit of the disclosure. It is to be noted that the features in the following embodiments and examples may be combined with each other without conflict. All other embodiments, which can be derived by a person skilled in the art from the embodiments disclosed herein without making any creative effort, shall fall within the protection scope of the present disclosure.
It is noted that various aspects of the embodiments are described below within the scope of the appended claims. It should be apparent that the aspects described herein may be embodied in a wide variety of forms and that any specific structure and/or function described herein is merely illustrative. Based on the disclosure, one skilled in the art should appreciate that one aspect described herein may be implemented independently of any other aspects and that two or more of these aspects may be combined in various ways. For example, an apparatus may be implemented and/or a method practiced using any number of the aspects set forth herein. Additionally, such an apparatus may be implemented and/or such a method may be practiced using other structure and/or functionality in addition to one or more of the aspects set forth herein.
It should be noted that the drawings provided in the following embodiments are only for illustrating the basic idea of the present disclosure, and the drawings only show the components related to the present disclosure rather than the number, shape and size of the components in actual implementation, and the type, amount and ratio of the components in actual implementation may be changed arbitrarily, and the layout of the components may be more complicated.
In addition, in the following description, specific details are provided to facilitate a thorough understanding of the examples. However, it will be understood by those skilled in the art that the aspects may be practiced without these specific details.
In order to solve the technical problem that prior-art pencil-drawing-style filters cannot achieve both good effects and good performance, the embodiment of the disclosure provides an image processing method. As shown in fig. 1, the image processing method mainly includes steps S1 to S4 as follows. Wherein:
step S1: and obtaining the brightness image of the original image according to the brightness component corresponding to the original image.
Wherein the original image may be a photographed photograph.
The luminance image is an image composed of luminance components.
Specifically, if the original image is an RGB image, it needs to be subjected to color space transformation to obtain corresponding luminance information, and then obtain a corresponding luminance image.
Step S2: carrying out edge detection processing on the brightness image to obtain an edge image of the brightness image;
step S3: and acquiring a tone image of the brightness image according to a preset reference image.
Step S4: and generating a pencil drawing style image corresponding to the original image according to the edge image and the tone image.
According to the embodiments of the present disclosure, a luminance image is obtained from the luminance component corresponding to the original image; edge detection processing is performed on the luminance image to obtain its edge image; a tone image of the luminance image is obtained from the predetermined reference image; and a pencil drawing style image is generated from the edge image and the tone image, thereby solving the technical problem that prior-art pencil-drawing-style filters cannot achieve both good effects and good performance.
In an optional embodiment, step S1 specifically includes:
step S11: and carrying out color space transformation processing on the original image, and extracting a brightness component from the transformed image to obtain an initial brightness image.
Specifically, if the original image is a Red Green Blue (RGB) image, it needs to be color space converted to obtain a corresponding luminance component, for example, the original image is converted from an RGB color space to a YCbCr color space, where the conversion formula is as follows:
Y = 0.257*R + 0.504*G + 0.098*B + 16
Cb = -0.148*R - 0.291*G + 0.439*B + 128
Cr = 0.439*R - 0.368*G - 0.071*B + 128
where Y is the luminance component and Cb and Cr are the blue-difference and red-difference chroma components, respectively; Y is then the extracted luminance component.
Or converting the original image from an RGB color space to an LAB color space, where L is a luminance component, A, B is a color component, a represents a range from magenta to green, and B represents a range from yellow to blue, then L is the extracted luminance component.
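As an illustrative sketch of the RGB-to-YCbCr conversion above (the function name rgb_to_ycbcr is ours, not the patent's; the coefficients are the standard BT.601 studio-range values quoted above):

```python
import numpy as np

def rgb_to_ycbcr(img):
    """Convert an RGB image (uint8, H x W x 3) to YCbCr using the
    BT.601 studio-range formulas: the Y channel is the luminance."""
    r = img[..., 0].astype(np.float64)
    g = img[..., 1].astype(np.float64)
    b = img[..., 2].astype(np.float64)
    y  =  0.257 * r + 0.504 * g + 0.098 * b + 16
    cb = -0.148 * r - 0.291 * g + 0.439 * b + 128
    cr =  0.439 * r - 0.368 * g - 0.071 * b + 128
    return np.stack([y, cb, cr], axis=-1)

# The initial luminance image is simply the Y channel:
luma = rgb_to_ycbcr(np.full((2, 2, 3), 255, dtype=np.uint8))[..., 0]
```

For a pure white pixel this gives Y = 235.045 (the studio-range white level 235, up to rounding) and neutral chroma Cb = Cr = 128, which is a quick sanity check on the coefficients.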
Step S12: and obtaining the brightness image of the original image according to the initial brightness image.
Further, step S12 includes:
directly taking the initial brightness image as the brightness image of the original image to carry out subsequent processing; or denoising the initial brightness image to obtain the brightness image of the original image.
Specifically, a median filter may be used to denoise the initial luminance image and remove salt-and-pepper noise. For performance, a median filter of 3 × 3 size may be used in this embodiment; specifically, the median is taken over the 8-neighborhood of each pixel point.
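A 3 × 3 median filter of this kind can be sketched in NumPy as follows (the helper name is ours):

```python
import numpy as np

def median_filter_3x3(img):
    """3x3 median filter: each output pixel is the median of its 3x3
    neighbourhood (edges replicated), which removes salt-and-pepper noise."""
    padded = np.pad(img, 1, mode='edge')
    # Collect the 9 shifted views of the image and take the per-pixel median.
    stack = np.stack([padded[i:i + img.shape[0], j:j + img.shape[1]]
                      for i in range(3) for j in range(3)])
    return np.median(stack, axis=0)
```

A single impulse (one bright pixel on a dark background) is completely removed, which is exactly the salt-and-pepper behaviour the embodiment relies on.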
In an alternative embodiment, step S2 includes:
step S21: and carrying out edge detection on the brightness image to obtain an initial edge image.
Step S22: and obtaining the edge image according to the initial edge image.
Further, step S2 includes:
performing plane convolution on the first preset matrix template and the brightness image to obtain a first brightness difference value of each pixel point, and performing plane convolution on the second preset matrix template and the brightness image to obtain a second brightness difference value of each pixel point; and aiming at each pixel point, determining a corresponding pixel value according to the first brightness difference value and the second brightness difference value to obtain an edge image of the brightness image.
In this document, in order to distinguish different preset matrix templates, the first-appearing preset matrix template is referred to as a first preset matrix template, and the subsequent-appearing preset matrix template is referred to as a second preset matrix template.
And, in order to distinguish different luminance difference values, a luminance difference value occurring first is referred to as a first luminance difference value and a luminance difference value occurring later is referred to as a second luminance difference value herein.
Specifically, Sobel filtering may be used for edge detection. Sobel filtering uses two matrix templates, a horizontal one and a vertical one; the first preset matrix template may be the horizontal template and the second preset matrix template the vertical template. The templates may be of size 3 × 3, and performing a planar convolution of each template with the luminance image yields approximate values of the luminance difference in the horizontal and vertical directions, i.e. the first luminance difference value and the second luminance difference value. The horizontal and vertical templates that may be used are the standard Sobel kernels, [−1 0 +1; −2 0 +2; −1 0 +1] and [−1 −2 −1; 0 0 0; +1 +2 +1].
further, the determining, for each pixel point, a corresponding pixel value according to the first luminance difference value and the second luminance difference value to obtain an edge image of the luminance image includes:
and calculating the root mean square of the first brightness difference value and the second brightness difference value aiming at each pixel point, and taking the root mean square as a corresponding pixel value to obtain an edge image of the brightness image.
Specifically, denoting the first and second luminance difference values of a pixel point by G1 and G2, the pixel value of each pixel point is calculated by the formula R = √(G1² + G2²).
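Putting the two template convolutions and the magnitude formula together, a sketch using the standard Sobel kernels (assumed here, since the patent text does not reproduce its own templates; all function names are ours):

```python
import numpy as np

# Standard 3x3 Sobel kernels: horizontal-gradient and vertical-gradient templates.
SOBEL_X = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], dtype=np.float64)
SOBEL_Y = np.array([[-1, -2, -1], [0, 0, 0], [1, 2, 1]], dtype=np.float64)

def convolve3x3(img, kernel):
    """Planar 'convolution' (cross-correlation) of a 3x3 kernel with the
    image, edges replicated; sign flips do not affect the magnitude."""
    padded = np.pad(img, 1, mode='edge')
    out = np.zeros(img.shape, dtype=np.float64)
    for i in range(3):
        for j in range(3):
            out += kernel[i, j] * padded[i:i + img.shape[0], j:j + img.shape[1]]
    return out

def sobel_edges(luma):
    """Edge image: per-pixel magnitude sqrt(G1^2 + G2^2) of the two
    luminance-difference maps, as described in the text."""
    g1 = convolve3x3(luma, SOBEL_X)
    g2 = convolve3x3(luma, SOBEL_Y)
    return np.sqrt(g1 ** 2 + g2 ** 2)
```

On a flat region both difference values are zero, while a vertical step edge produces a strong horizontal difference, so only edge pixels get large values in the resulting edge image.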
Further, step S22 includes:
performing at least one expansion treatment on the initial edge image to obtain an expanded image; and obtaining the edge image according to the expanded image.
Specifically, dilation is an image-morphology algorithm, used in this embodiment to make the image edges thicker and therefore more obvious. Concretely, the maximum may be taken over the 8-neighborhood of each pixel point of the initial edge image, using the formula f(x, y) = max{Val(i, j) : x−1 ≤ i ≤ x+1, y−1 ≤ j ≤ y+1}, where Val(i, j) is the pixel value at position (i, j). This may be repeated as many times as required; the more iterations, the thicker and more obvious the edges.
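The repeated 3 × 3 maximum can be sketched as follows (the helper name and the iterations parameter are ours):

```python
import numpy as np

def dilate(edge_img, iterations=1):
    """Greyscale dilation: replace each pixel with the maximum over its
    3x3 neighbourhood (the pixel and its 8 neighbours, edges replicated).
    Repeating the loop thickens bright edges further."""
    out = edge_img.astype(np.float64)
    for _ in range(iterations):
        padded = np.pad(out, 1, mode='edge')
        out = np.max(np.stack([padded[i:i + out.shape[0], j:j + out.shape[1]]
                               for i in range(3) for j in range(3)]), axis=0)
    return out
```

A single bright pixel grows into a 3 × 3 block after one pass and a 5 × 5 block after two, which is the "thicker edge per iteration" behaviour described above.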
Further, the obtaining the edge image according to the expanded image includes:
performing inversion operation on the pixel value of each pixel point in the expanded image; and obtaining the edge image according to the pixel value after the inversion.
Further, the obtaining the edge image according to the inverted pixel value includes:
by the formula R(x, y) = (1.0 − Val(x, y))^α, calculating the α-th power of the inverted value of each pixel point, where Val(x, y) is the inverted pixel value, R(x, y) is the resulting pixel value, and α is an adjustable parameter; the R(x, y) values of all pixel points constitute the edge image.
The larger α is, the darker the image edges.
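A minimal sketch of the inversion and α-power step, assuming pixel values normalised to [0, 1] (the function name is ours):

```python
import numpy as np

def invert_and_gamma(dilated, alpha=2.0):
    """Invert each normalised pixel value and raise it to the power alpha:
    R = (1.0 - Val) ** alpha.  Strong (bright) edge responses map to dark
    strokes; increasing the adjustable parameter alpha darkens them further."""
    return (1.0 - dilated) ** alpha
```

For example, a mid-strength response of 0.5 becomes 0.25 with α = 2 but 0.125 with α = 3, showing how α controls stroke darkness.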
In an alternative embodiment, step S3 includes:
step S31: and performing brightness histogram matching on the brightness image and the reference image to obtain the brightness mapping table.
Step S32: and carrying out color mapping on the brightness image according to the brightness mapping table to obtain the tone image.
Further, step S31 includes:
acquiring a brightness histogram of the reference image; adjusting the brightness distribution of the brightness image to enable the approximation degree of the histogram of the brightness image and the brightness histogram of the reference image to be larger than a preset value; and determining the brightness mapping table according to the reference image and the adjusted brightness image.
Specifically, this embodiment uses histogram matching so that the histogram of the luminance image becomes close to the histogram of the reference image; that is, the luminance distribution of the luminance image is adjusted to coincide with, or approximate, that of the reference image. For example, in the same coordinate system, if the ratio of the overlapping area of the two histograms to the total area of the reference image's luminance histogram exceeds a predetermined ratio (e.g. 95%), the degree of approximation is considered greater than the preset value. The histogram of the reference image is an adjustable variable: using an overall brighter or darker image as the reference makes the resulting luminance image correspondingly brighter or darker.
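The luminance mapping table can be sketched with classic CDF-based histogram matching, a standard technique consistent with (though not necessarily identical to) the matching described here; all names are ours:

```python
import numpy as np

def luminance_mapping_table(luma, reference, bins=256):
    """Build a 256-entry mapping table that sends each source luminance
    level to the reference level with the nearest cumulative frequency,
    so the mapped histogram approximates the reference histogram."""
    src_hist, _ = np.histogram(luma, bins=bins, range=(0, bins))
    ref_hist, _ = np.histogram(reference, bins=bins, range=(0, bins))
    src_cdf = np.cumsum(src_hist) / luma.size
    ref_cdf = np.cumsum(ref_hist) / reference.size
    # For each source level, pick the reference level whose CDF first reaches it.
    return np.searchsorted(ref_cdf, src_cdf).clip(0, bins - 1)

def apply_mapping(luma, table):
    """Color-map the luminance image through the mapping table."""
    return table[luma.astype(np.intp)]
```

Matching a dark image against a bright reference lifts its luminance levels into the reference's range, which is the "brighter reference gives a brighter result" behaviour noted above.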
In an alternative embodiment, step S4 includes:
and multiplying the edge image and the tone image to obtain a black and white pencil drawing style image corresponding to the original image.
In an alternative embodiment, step S4 includes:
step S41: a luminance component is obtained from the tone image, and a color component is obtained from the original image.
Step S42: an initial colored pencil drawing style image is generated based on the intensity component and the color component.
Step S43: converting the initial color pencil drawing style image into an RGB color space to obtain a color pencil drawing style image corresponding to the original image.
Specifically, if the original image is an RGB image, it is converted into the YCbCr color space; the corresponding color components are Cb and Cr, and the luminance component determined in the above step is taken as Y. The Y, Cb, and Cr components together compose the YCbCr image, that is, the initial color pencil drawing style image. The YCbCr image is then converted into the RGB color space to obtain the color pencil drawing style image, using the following conversion formulas:
R=1.164*(Y-16)+1.596*(Cr-128)
G=1.164*(Y-16)-0.392*(Cb-128)-0.813*(Cr-128)
B=1.164*(Y-16)+2.017*(Cb-128)
If the original image was instead converted from the RGB color space into the LAB color space, the LAB image can likewise be composed by taking the luminance component determined in the above steps as the L component and A, B as the color components; the color pencil drawing style image is then obtained by converting the LAB image back into the RGB color space.
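The conversion formulas above translate directly into code. This sketch assumes the planes are float arrays holding "studio swing" YCbCr values (Y in [16, 235], Cb/Cr in [16, 240]), matching the offsets in the formulas:

```python
import numpy as np

def ycbcr_to_rgb(y, cb, cr):
    """Apply the conversion formulas from the text to Y/Cb/Cr planes."""
    r = 1.164 * (y - 16) + 1.596 * (cr - 128)
    g = 1.164 * (y - 16) - 0.392 * (cb - 128) - 0.813 * (cr - 128)
    b = 1.164 * (y - 16) + 2.017 * (cb - 128)
    # Clamp to the displayable 8-bit range before casting.
    return tuple(np.clip(c, 0, 255).astype(np.uint8) for c in (r, g, b))
```

Black (Y = 16) maps to RGB (0, 0, 0), and neutral mid-gray values stay neutral since the Cb/Cr terms vanish at 128.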
It will be appreciated by those skilled in the art that obvious modifications (e.g., combinations of the enumerated modes) or equivalents may be made to the above-described embodiments.
Although the steps of the image processing method embodiment above are described in the stated order, it should be clear to those skilled in the art that the steps of the embodiments of the present disclosure need not be performed in that order; they may also be performed in other orders, such as reversed, in parallel, or interleaved. Moreover, those skilled in the art may add further steps on the basis of the above, and such obvious modifications or equivalents also fall within the protection scope of the present disclosure and are not described again here.
For convenience of description, only the parts related to the embodiments of the present disclosure are shown; for specific technical details not disclosed herein, please refer to the method embodiments of the present disclosure.
In order to solve the technical problem in the prior art that a pencil-drawing style filter cannot achieve both good visual effect and good performance, an embodiment of the present disclosure provides an image processing apparatus. The apparatus may perform the steps of the above image processing method embodiments. As shown in fig. 2, the apparatus mainly includes: a brightness separation module 21, an image processing module 22, and a pencil drawing generation module 23; wherein,
the brightness separation module 21 is configured to obtain a brightness image of the original image according to a brightness component corresponding to the original image;
the image processing module 22 is configured to perform edge detection processing on the luminance image to obtain an edge image of the luminance image, and acquire a tone image of the luminance image according to a predetermined reference image; and
the pencil drawing generation module 23 is configured to generate a pencil drawing style image corresponding to the original image according to the edge image and the tone image.
Further, the luminance separating module 21 includes: a luminance separation unit 211 and a luminance image generation unit 212; wherein,
the luminance separation unit 211 is configured to perform color space transformation on the original image, and extract a luminance component from the transformed image to obtain an initial luminance image;
the luminance image generating unit 212 is configured to obtain a luminance image of the original image according to the initial luminance image.
Further, the luminance image generating unit 212 is specifically configured to denoise the initial luminance image to obtain the luminance image of the original image.
Further, the image processing module 22 includes: an edge detection unit 221 and an edge image generation unit 222; wherein,
the edge detection unit 221 is configured to perform edge detection on the luminance image to obtain an initial edge image;
the edge image generating unit 222 is configured to obtain the edge image according to the initial edge image.
Further, the image processing module 22 is specifically configured to: performing plane convolution on the first preset matrix template and the brightness image to obtain a first brightness difference value of each pixel point, and performing plane convolution on the second preset matrix template and the brightness image to obtain a second brightness difference value of each pixel point; and determining a corresponding pixel value according to the first brightness difference value and the second brightness difference value aiming at each pixel point to obtain an edge image of the brightness image.
Further, the image processing module 22 is specifically configured to: for each pixel point, calculate the root mean square of the first luminance difference value and the second luminance difference value, and take the root mean square as the corresponding pixel value to obtain the edge image of the luminance image.
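The two-template plane convolution and root-mean-square combination described above can be sketched as follows; the disclosure does not fix the matrix templates, so the Sobel kernels below are an assumption for illustration only:

```python
import numpy as np

# The text does not specify the two preset matrix templates; Sobel
# kernels, a common choice for horizontal/vertical luminance
# differences, are assumed here.
KX = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], dtype=float)
KY = KX.T

def plane_convolve(img, kernel):
    """3x3 plane convolution with edge-replicated borders.
    (Implemented as correlation; the sign flip relative to true
    convolution does not matter once the differences are squared.)"""
    h, w = img.shape
    p = np.pad(img.astype(float), 1, mode="edge")
    out = np.zeros((h, w))
    for i in range(3):
        for j in range(3):
            out += kernel[i, j] * p[i:i + h, j:j + w]
    return out

def edge_image(luma):
    """Per-pixel root mean square of the first and second
    luminance difference values, as described in the text."""
    gx = plane_convolve(luma, KX)  # first luminance difference
    gy = plane_convolve(luma, KY)  # second luminance difference
    return np.sqrt((gx ** 2 + gy ** 2) / 2.0)
```

A flat region yields zero response, while a brightness step produces a strong edge value along the boundary.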
Further, the edge image generating unit 222 is specifically configured to: performing at least one expansion treatment on the initial edge image to obtain an expanded image; and obtaining the edge image according to the expanded image.
Further, the edge image generating unit 222 is specifically configured to: performing an inversion operation on the pixel value of each pixel point in the expanded image; and obtaining the edge image according to the pixel value after the inversion.
Further, the edge image generating unit 222 is specifically configured to: by the formula Rx,y=(1.0-Valx,y)αComputing the α power of the pixel value after each pixel point is inverted, wherein Valx,yFor the inverted pixel value, Rx,yα is an adjustable parameter for calculating the pixel value after α times, and R of all the pixel pointsx,yThe pixel values constitute the edge image.
Further, the image processing module 22 includes: a histogram matching unit 223 and a tone image generation module 224; wherein,
the histogram matching unit 223 is configured to perform luminance histogram matching on the luminance image and the reference image to obtain a luminance mapping table;
the tone image generation module 224 is configured to perform color mapping on the luminance image according to the luminance mapping table to obtain the tone image.
Further, the histogram matching unit 223 is specifically configured to: acquire a brightness histogram of the reference image; adjust the brightness distribution of the brightness image so that the degree of approximation between the histogram of the brightness image and the brightness histogram of the reference image is larger than a preset value; and determine the brightness mapping table according to the reference image and the adjusted brightness image.
Further, the pencil drawing generation module 23 is specifically configured to multiply the edge image by the tone image to obtain a black-and-white pencil drawing style image corresponding to the original image.
Further, the pencil drawing generation module 23 is specifically configured to: obtaining a brightness component according to the tone image, and obtaining a color component according to the original image; generating an initial color pencil drawing style image according to the brightness component and the color component; and converting the initial colored pencil drawing style image into an RGB color space to obtain a colored pencil drawing style image corresponding to the initial image.
For detailed descriptions of the working principle, the technical effect of the implementation, and the like of the embodiment of the image processing apparatus, reference may be made to the description of the embodiment of the image processing method, and further description is omitted here.
Referring now to FIG. 3, shown is a schematic diagram of an electronic device suitable for use in implementing embodiments of the present disclosure. The electronic devices in the embodiments of the present disclosure may include, but are not limited to, mobile terminals such as mobile phones, notebook computers, digital broadcast receivers, PDAs (personal digital assistants), PADs (tablet computers), PMPs (portable multimedia players), in-vehicle terminals (e.g., car navigation terminals), and the like, and fixed terminals such as digital TVs, desktop computers, and the like. The electronic device shown in fig. 3 is only an example, and should not bring any limitation to the functions and the scope of use of the embodiments of the present disclosure.
As shown in fig. 3, the electronic device may include a processing device (e.g., a central processing unit, a graphics processor, etc.) 301 that may perform various appropriate actions and processes according to a program stored in a Read Only Memory (ROM)302 or a program loaded from a storage device 308 into a Random Access Memory (RAM) 303. In the RAM 303, various programs and data necessary for the operation of the electronic apparatus are also stored. The processing device 301, the ROM 302, and the RAM 303 are connected to each other via a bus 304. An input/output (I/O) interface 305 is also connected to bus 304.
Generally, the following devices may be connected to the I/O interface 305: input devices 306 including, for example, a touch screen, touch pad, keyboard, mouse, image sensor, microphone, accelerometer, gyroscope, etc.; an output device 307 including, for example, a Liquid Crystal Display (LCD), a speaker, a vibrator, and the like; storage devices 308 including, for example, magnetic tape, hard disk, etc.; and a communication device 309. The communication means 309 may allow the electronic device to communicate wirelessly or by wire with other devices to exchange data. While fig. 3 illustrates an electronic device having various means, it is to be understood that not all illustrated means are required to be implemented or provided. More or fewer devices may alternatively be implemented or provided.
In particular, according to an embodiment of the present disclosure, the processes described above with reference to the flowcharts may be implemented as computer software programs. For example, embodiments of the present disclosure include a computer program product comprising a computer program embodied on a computer readable medium, the computer program comprising program code for performing the method illustrated in the flow chart. In such an embodiment, the computer program may be downloaded and installed from a network through the communication means 309, or installed from the storage means 308, or installed from the ROM 302. The computer program, when executed by the processing device 301, performs the above-described functions defined in the methods of the embodiments of the present disclosure.
It should be noted that the computer readable medium in the present disclosure can be a computer readable signal medium or a computer readable storage medium or any combination of the two. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples of the computer readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the present disclosure, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. In contrast, in the present disclosure, a computer readable signal medium may comprise a propagated data signal with computer readable program code embodied therein, either in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: electrical wires, optical cables, RF (radio frequency), etc., or any suitable combination of the foregoing.
The computer readable medium may be embodied in the electronic device; or may exist separately without being assembled into the electronic device.
The computer readable medium carries one or more programs which, when executed by the electronic device, cause the electronic device to: obtaining a brightness image of an original image according to a brightness component corresponding to the original image; carrying out edge detection processing on the brightness image to obtain an edge image of the brightness image; acquiring a tone image of the luminance image according to a predetermined reference image; and generating a pencil drawing style image corresponding to the original image from the edge image and the tone image.
Computer program code for carrying out operations for aspects of the present disclosure may be written in any combination of one or more programming languages, including an object-oriented programming language such as Java, Smalltalk, or C++, and conventional procedural programming languages such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on a remote computer or server. In the case of a remote computer, the remote computer may be connected to the user's computer through any type of network, including a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet service provider).
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The units described in the embodiments of the present disclosure may be implemented by software or hardware. Where the name of a unit does not in some cases constitute a limitation of the unit itself, for example, the first retrieving unit may also be described as a "unit for retrieving at least two internet protocol addresses".
The foregoing description is merely a description of the preferred embodiments of the present disclosure and of the principles of the technology employed. Those skilled in the art will appreciate that the scope of the disclosure is not limited to technical solutions formed by the particular combination of the above features, and also covers other technical solutions formed by any combination of the above features or their equivalents without departing from the concept of the disclosure, for example, technical solutions formed by replacing the above features with (but not limited to) features having similar functions disclosed in this disclosure.

Claims (16)

1. An image processing method, comprising:
obtaining a brightness image of an original image according to a brightness component corresponding to the original image;
carrying out edge detection processing on the brightness image to obtain an edge image of the brightness image;
acquiring a tone image of the luminance image according to a predetermined reference image; and
generating a pencil drawing style image corresponding to the original image according to the edge image and the tone image.
2. The method according to claim 1, wherein obtaining the luminance image of the original image according to the luminance component corresponding to the original image comprises:
carrying out color space transformation processing on the original image, and extracting a brightness component from the transformed image to obtain an initial brightness image;
and obtaining the brightness image of the original image according to the initial brightness image.
3. The method of claim 2, wherein deriving the luminance image of the original image from the initial luminance image comprises:
and denoising the initial brightness image to obtain a brightness image of the original image.
4. The method according to claim 1, wherein the performing the edge detection process on the luminance image to obtain an edge image of the luminance image comprises:
carrying out edge detection on the brightness image to obtain an initial edge image;
and obtaining the edge image according to the initial edge image.
5. The method according to claim 1, wherein the performing the edge detection process on the luminance image to obtain an edge image of the luminance image comprises:
performing plane convolution on the first preset matrix template and the brightness image to obtain a first brightness difference value of each pixel point, and performing plane convolution on the second preset matrix template and the brightness image to obtain a second brightness difference value of each pixel point;
and determining a corresponding pixel value according to the first brightness difference value and the second brightness difference value aiming at each pixel point to obtain an edge image of the brightness image.
6. The method of claim 5, wherein the determining, for each pixel point, a corresponding pixel value according to the first luminance difference value and the second luminance difference value to obtain an edge image of the luminance image comprises:
and calculating the root mean square of the first brightness difference value and the second brightness difference value aiming at each pixel point, and taking the root mean square as a corresponding pixel value to obtain an edge image of the brightness image.
7. The method of claim 4, wherein the deriving the edge image from the initial edge image comprises:
performing dilation processing at least once on the initial edge image to obtain a dilated image;
and obtaining the edge image according to the dilated image.
8. The method of claim 7, wherein the deriving the edge image from the dilated image comprises:
performing an inversion operation on the pixel value of each pixel point in the dilated image;
and obtaining the edge image according to the pixel value after the inversion.
9. The method of claim 8, wherein deriving the edge image from the inverted pixel values comprises:
calculating, by the formula R_{x,y} = (1.0 - Val_{x,y})^α, the α-th power of the inverted pixel value of each pixel point, wherein Val_{x,y} is the inverted pixel value, R_{x,y} is the calculated pixel value, and α is an adjustable parameter;
composing the edge image from the R_{x,y} pixel values of all pixel points.
10. The method according to claim 1, wherein said obtaining a tone image of said luminance image from a predetermined reference image comprises:
performing brightness histogram matching on the brightness image and the reference image to obtain a brightness mapping table;
and carrying out color mapping on the brightness image according to the brightness mapping table to obtain the tone image.
11. The method of claim 10, wherein said performing luminance histogram matching on the luminance image and the reference image to obtain a luminance mapping table comprises:
acquiring a brightness histogram of the reference image;
adjusting the brightness distribution of the brightness image so that the degree of approximation between the histogram of the brightness image and the brightness histogram of the reference image is larger than a preset value;
and determining the brightness mapping table according to the reference image and the adjusted brightness image.
12. The method according to any one of claims 1-11, wherein generating a pencil-drawing style image corresponding to the original image from the edge image and the tone image comprises:
and multiplying the edge image and the tone image to obtain a black and white pencil drawing style image corresponding to the original image.
13. The method according to any one of claims 1-11, wherein generating a pencil-drawing style image corresponding to the original image from the edge image and the tone image comprises:
obtaining a brightness component according to the tone image, and obtaining a color component according to the original image;
generating an initial color pencil drawing style image according to the brightness component and the color component; and
converting the initial color pencil drawing style image into an RGB color space to obtain a color pencil drawing style image corresponding to the original image.
14. An image processing apparatus characterized by comprising:
the brightness separation module is used for obtaining a brightness image of the original image according to the brightness component corresponding to the original image;
the image processing module is used for carrying out edge detection processing on the brightness image to obtain an edge image of the brightness image and acquiring a tone image of the brightness image according to a preset reference image; and
and the pencil drawing generation module is used for generating a pencil drawing style image corresponding to the original image according to the edge image and the tone image.
15. An electronic device, comprising:
a memory for storing non-transitory computer readable instructions; and
a processor for executing the computer readable instructions such that the processor when executing performs the image processing method according to any of claims 1-13.
16. A computer-readable storage medium storing non-transitory computer-readable instructions that, when executed by a computer, cause the computer to perform the image processing method of any one of claims 1-13.
CN201910193207.8A 2019-03-14 2019-03-14 Image processing method, device and computer readable storage medium Pending CN110070499A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910193207.8A CN110070499A (en) 2019-03-14 2019-03-14 Image processing method, device and computer readable storage medium

Publications (1)

Publication Number Publication Date
CN110070499A true CN110070499A (en) 2019-07-30

Family

ID=67365267

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910193207.8A Pending CN110070499A (en) 2019-03-14 2019-03-14 Image processing method, device and computer readable storage medium

Country Status (1)

Country Link
CN (1) CN110070499A (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104915975A (en) * 2015-06-03 2015-09-16 厦门美图之家科技有限公司 Image processing method and system for simulating crayon colored drawing
CN104915976A (en) * 2015-06-03 2015-09-16 厦门美图之家科技有限公司 Image processing method and system for simulating pencil sketch
CN105374007A (en) * 2015-12-02 2016-03-02 华侨大学 Generation method and generation device of pencil drawing fusing skeleton strokes and textural features
CN105528765A (en) * 2015-12-02 2016-04-27 小米科技有限责任公司 Method and device for processing image
CN108682040A (en) * 2018-05-21 2018-10-19 努比亚技术有限公司 A kind of sketch image generation method, terminal and computer readable storage medium
CN109300099A (en) * 2018-08-29 2019-02-01 努比亚技术有限公司 A kind of image processing method, mobile terminal and computer readable storage medium

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
He Dongjian: "Digital Image Processing", Xidian University Press, 28 February 2015 *
Liu Guohua: "HALCON Digital Image Processing", Xidian University Press, 31 May 2018 *

Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110636331A (en) * 2019-09-26 2019-12-31 北京百度网讯科技有限公司 Method and apparatus for processing video
CN110956590A (en) * 2019-11-04 2020-04-03 中山市奥珀金属制品有限公司 Denoising device and method for iris image and storage medium
CN110956590B (en) * 2019-11-04 2023-11-17 张杰辉 Iris image denoising device, method and storage medium
CN111754440A (en) * 2020-06-29 2020-10-09 苏州科达科技股份有限公司 License plate image enhancement method, system, equipment and storage medium
CN112241941A (en) * 2020-10-20 2021-01-19 北京字跳网络技术有限公司 Method, device, equipment and computer readable medium for acquiring image
CN112241941B (en) * 2020-10-20 2024-03-22 北京字跳网络技术有限公司 Method, apparatus, device and computer readable medium for acquiring image
CN112669227A (en) * 2020-12-16 2021-04-16 Tcl华星光电技术有限公司 Icon edge processing method and device and computer readable storage medium
CN112669227B (en) * 2020-12-16 2023-10-17 Tcl华星光电技术有限公司 Icon edge processing method, icon edge processing device and computer readable storage medium
WO2022199583A1 (en) * 2021-03-26 2022-09-29 影石创新科技股份有限公司 Image processing method and apparatus, computer device, and storage medium

Similar Documents

Publication Publication Date Title
CN110070499A (en) Image processing method, device and computer readable storage medium
US20230401682A1 (en) Styled image generation method, model training method, apparatus, device, and medium
WO2020024483A1 (en) Method and apparatus for processing image
US20220319077A1 (en) Image-text fusion method and apparatus, and electronic device
US11409794B2 (en) Image deformation control method and device and hardware device
CN110069974B (en) Highlight image processing method and device and electronic equipment
CN107204034B (en) A kind of image processing method and terminal
JP2022505118A (en) Image processing method, equipment, hardware equipment
CN113658065B (en) Image noise reduction method and device, computer readable medium and electronic equipment
CN110070495B (en) Image processing method and device and electronic equipment
WO2022132032A1 (en) Portrait image processing method and device
CN113066020B (en) Image processing method and device, computer readable medium and electronic equipment
CN113902636A (en) Image deblurring method and device, computer readable medium and electronic equipment
CN113610720A (en) Video denoising method and device, computer readable medium and electronic device
CN112967193A (en) Image calibration method and device, computer readable medium and electronic equipment
CN106548117A (en) A kind of face image processing process and device
US20240095886A1 (en) Image processing method, image generating method, apparatus, device, and medium
CN112819691B (en) Image processing method, device, equipment and readable storage medium
CN110070482B (en) Image processing method, apparatus and computer readable storage medium
CN110069641B (en) Image processing method and device and electronic equipment
CN111292247A (en) Image processing method and device
US11651529B2 (en) Image processing method, apparatus, electronic device and computer readable storage medium
CN113240599B (en) Image toning method and device, computer readable storage medium and electronic equipment
WO2022227996A1 (en) Image processing method and apparatus, electronic device, and readable storage medium
CN112967194B (en) Target image generation method and device, computer readable medium and electronic equipment

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
CB02 Change of applicant information
CB02 Change of applicant information

Address after: 100041 B-0035, 2 floor, 3 building, 30 Shixing street, Shijingshan District, Beijing.

Applicant after: Tiktok vision (Beijing) Co.,Ltd.

Address before: 100041 B-0035, 2 floor, 3 building, 30 Shixing street, Shijingshan District, Beijing.

Applicant before: BEIJING BYTEDANCE NETWORK TECHNOLOGY Co.,Ltd.

Address after: 100041 B-0035, 2 floor, 3 building, 30 Shixing street, Shijingshan District, Beijing.

Applicant after: Douyin Vision Co.,Ltd.

Address before: 100041 B-0035, 2 floor, 3 building, 30 Shixing street, Shijingshan District, Beijing.

Applicant before: Tiktok vision (Beijing) Co.,Ltd.