CN107705336A - A pathological image staining component adjustment method - Google Patents
- Publication number: CN107705336A (application CN201710947445.4A)
- Authority: CN (China)
- Legal status: Granted
Classifications
- G06T7/90 — Image analysis; determination of colour characteristics
- G06T5/90 — Image enhancement or restoration; dynamic range modification of images or parts thereof
- G06T2207/10056 — Image acquisition modality; microscopic image
- G06T2207/30004 — Biomedical image processing
- G06T2207/30096 — Tumor; Lesion
Abstract
The invention discloses a pathological image staining component adjustment method. The brightness and saturation of a collected pathological section are first corrected in hue-saturation-value (HSV) space; the staining components of the corrected section are then separated, adjusted individually, and re-synthesized. The method realizes an algorithm for adjusting the content of a single staining component, achieves independent adjustment of the different staining agents in a pathological image, effectively completes the task of stain separation, and assists the diagnosis of pathologists.
Description
Technical Field
The invention relates to the field of digital image processing, and in particular to a pathological image staining component adjustment method.
Background
A digital pathological image is a high-resolution digital image obtained by scanning pathological sections with a fully automatic microscope or an optical magnification system, and it is widely used in clinical pathological diagnosis. The color of a pathological section comes from staining agents, and the most common staining method is hematoxylin-eosin (H-E) staining. In the staining process, however, human factors such as manual operation and differences in the ratios of the staining agents cause differences in the staining quality of pathological sections; meanwhile, differences in the illumination environment during slice scanning give the acquired digital pathological images large differences in brightness and saturation. These differences hinder the pathologist's diagnosis and affect its accuracy, so color correction of the digital pathological image is required.
With the continuous development of digital image processing technology, methods for color enhancement and correction of natural-scene images have matured and are widely applied in vision and multimedia systems, biomedicine, industrial engineering, aerospace and other fields; they include histogram equalization algorithms, Retinex algorithms, and enhancement methods based on color-space transformation. The general flow of these algorithms is shown in fig. 1.
Histogram equalization is an adaptive enhancement method: the output is the optimum the algorithm computes, and the doctor cannot tune the result. Its tunable counterpart is histogram specification, which improves image quality by specifying the shape, value range, etc. of the image histogram. In contrast, the Retinex algorithm and enhancement algorithms based on color-space transformation are parameter-tunable enhancement methods. These three methods allow the color of a pathological image to be adjusted manually. However, the color of a pathological image is produced by two or more staining agents, and the colors corresponding to the staining agents have specific meanings; for example, the hematoxylin stain dyes chromatin and ribosomes in the cytoplasm purple-blue, while the eosin stain dyes components in the cytoplasm and the extracellular matrix red. The physician wants to adjust each staining component independently, increasing or decreasing the intensity of one stain without affecting the others. The above methods are designed for natural images stored in red, green and blue (RGB) channels; applied to a pathological image, adjusting a single channel inevitably affects every staining component, so the task of stain adjustment cannot be completed well according to the doctors' requirements.
Disclosure of Invention
The invention aims to provide a pathological image staining component adjustment method to solve the problem that existing image processing technology cannot separate the different staining agent components in a pathological image, and to realize independent adjustment of the different staining agents in the pathological image.
In order to realize the purpose of the invention, the following technical scheme is adopted:
a pathological image staining component adjustment method comprises the following steps:
(1) collecting pathological sections: collecting the pathological section into a computer, and expressing the pathological section by using an RGB channel, wherein the coordinates of a pixel point are marked as (x, y);
(2) correcting the saturation and brightness of the pathological section, comprising the following three steps:
a. transforming the pathological section from the RGB channel to an HSV channel;
b. defining the pixels with the lowest 5% of saturation (S-channel) values of the pathological section as background-area pixels; calculating the mean value of these pixels to estimate the saturation of the background area, denoted $S_{back}$; meanwhile, taking the mean value of the background-area pixels in the brightness channel V as the brightness value of the background area, denoted $V_{back}$; then, with the background area transformed into white as the target, linearly stretching the saturation and brightness of the whole pathological section while keeping the hue unchanged;
c. inversely transforming the pathological image enhanced in step b back to the RGB channels to finish the correction of the saturation and brightness of the pathological image;
(3) separating staining components: in the RGB channels obtained after step 2, determining the mapping relation between the optical density $O_c(x,y)$ of channel c (c = r, g, b) and the staining intensity $A_s(x,y)$ of staining agent s, and using this mapping to complete the stain separation of the image through a color deconvolution algorithm; the related formula is:

$$A_s'(x,y) = A_0 \times 10^{-A_s(x,y)} \quad (1)$$

wherein $A_0$ is the maximum value of the staining intensity of the staining agent, $A_0 = 1$;
(4) staining component adjustment: after obtaining the staining intensity $A_s'(x,y)$ of each staining agent, adjusting it as diagnosis requires; let the adjustment rate of staining agent s be $p_s$, where $p_s > 0$; the adjusted staining intensity is calculated as:

$$\bar A_s'(x,y) = 1 - [1 - A_s'(x,y)] \times p_s \quad (2)$$

wherein $p_s > 1$ strengthens the staining intensity of component s and $p_s < 1$ weakens it;
(5) staining component synthesis: after the staining components are adjusted in step (4), fusing the staining data and inverse-transforming the data back to the RGB channels.
In the method for adjusting the staining component of the pathological image as described above, preferably, the R, G, B three-channel values of the pixel coordinate (x, y) in step 1 are represented as:

$$I(x,y) = [I_r(x,y), I_g(x,y), I_b(x,y)] \quad (3)$$

wherein $I_r(x,y)$, $I_g(x,y)$, $I_b(x,y)$ represent the values of the red, green and blue color channels respectively, and $I_c(x,y) \in [0,1]$, c = r, g, b.
In the method for adjusting the staining components of the pathological image as described above, preferably, the channel transformation in step a uses the standard RGB-to-HSV conversion of formula (4), wherein H(x, y), S(x, y) and V(x, y) respectively represent the hue, saturation and brightness of the pixel point (x, y).
In the method for adjusting the staining components of pathological images as described above, preferably, in step b, linearly stretching the saturation and brightness of the entire pathological section means driving the background saturation of the processed section toward 0 and its background brightness toward 1 (a white background), with $S''(x,y)$ and $V''(x,y)$ denoting the enhancement result of point (x, y);
in step c, the inverse transformation is the standard HSV-to-RGB conversion, where $\lfloor H(x,y)/60 \rfloor$ denotes the integer part of $H(x,y)/60$ and $I''(x,y)$ denotes the value of point (x, y) in the pathology image after the saturation and brightness correction. In the pathological image staining component adjustment method as described above, preferably, the optical density $O_c(x,y)$ of channel c (c = r, g, b) is calculated as:

$$O_c(x,y) = -\log_{10}\frac{I_c''(x,y)}{I_{0,c}} \quad (7)$$

wherein $I_{0,c}$ is the single-channel maximum, $I_{0,c} = 1$.
In the pathological image staining component adjustment method as described above, preferably, in step (3), when a single staining agent is used the optical density $O_c(x,y)$ is proportional to the staining degree A of that agent; when several staining agents are used, the optical density equals the sum of the optical densities of the staining agents in channel c. When the staining is hematoxylin-eosin-diaminobenzidine (hematoxylin denoted H, eosin denoted E, diaminobenzidine denoted DAB), the transformation between the optical density $O_c(x,y)$ and the staining intensity $A_s(x,y)$ is:

$$O_c(x,y) = \sum_{s} \alpha_s^c \, A_s(x,y), \quad s = H, E, DAB \quad (8)$$

wherein $\alpha_s^c$ represents the absorbance of stain s on channel c; it is a constant for the stain s and channel c and can be obtained by a single-stain staining test. For H-E-DAB staining, the absorbance matrix M of the channels collects these constants for the three stains H, E and DAB.
Let

$$O = [O_r(x,y), O_g(x,y), O_b(x,y)]^T, \qquad A = [A_H(x,y), A_E(x,y), A_{DAB}(x,y)]^T.$$

Equation (8) is abbreviated as:

$$O = M \cdot A \quad (9)$$
Let $D = M^{-1}$. The staining intensity of each staining agent obtained from formula (9) is:

$$A = D \cdot O \quad (10)$$

wherein D is called the color deconvolution matrix and represents the mapping from optical density to the staining intensity of the staining agents; when the staining is H-E-DAB, the deconvolution matrix is the inverse of the H-E-DAB absorbance matrix. The vector $A = [A_H(x,y), A_E(x,y), A_{DAB}(x,y)]^T$, i.e. the decomposed staining intensity, is a linear transformation of the optical density O, so A is still in optical-density space; it is inverse-transformed to linear space (formula (1)), completing the stain separation of the image.
In the method for adjusting the staining components of the pathological image as described above, preferably, in step 5, when the staining is hematoxylin-eosin-diaminobenzidine staining, the staining data of the three stains are fused and inverse-transformed back to the RGB channels; the calculated image is the adjusted result.
The invention provides a method for adjusting the staining components of a digital pathological image. It first corrects the brightness and saturation of the pathological image in hue-saturation-value (HSV) space, and then uses a color deconvolution algorithm to adjust the content of a single staining component, thereby achieving independent adjustment of different staining agents in the pathological image, effectively completing the task of stain separation, and assisting the pathologist's diagnosis.
Drawings
Fig. 1 is a flow chart of digital image enhancement in the prior art.
FIG. 2 is a flow chart of the method of the present invention.
FIG. 3 is a graph showing the effect of four pathological sections in the preferred embodiment 1 of the present invention.
Fig. 4 is an original image in the preferred embodiment 2 of the present invention.
Fig. 5 is a diagram illustrating the adjusted effect of the original image of fig. 4 in the preferred embodiment 2 of the present invention.
Detailed Description
In the prior art, general pathological image adjustment is performed in the RGB or HSV channels, while the staining information of every staining agent is distributed across all of those channels simultaneously; a general method can only adjust a single channel of the RGB or HSV space, which raises two problems: 1) independently adjusting one channel of RGB or HSV space affects every staining component at once; 2) adjusting one staining component would require simultaneously adjusting all three channels of RGB or HSV space in proportion. Therefore, a common pathological image adjustment method can hardly adjust a single staining component.
In the invention, the pathological image is transformed to the stain space by utilizing the color deconvolution algorithm, namely, each stain after transformation is controlled by a single channel, so that the adjustment of a single stain component can be realized by adjusting the numerical value of a certain channel after transformation, and other stain components are not influenced. The present invention will be described in detail with reference to the following embodiments with reference to the attached drawings.
Example 1
The specific flow of the pathological image staining component adjustment method is shown in fig. 2; it comprises the following steps:
1. Collecting pathological sections
The pathological image is input into the computer by a pathological section scanning device and represented in RGB color space, in which the R, G, B three-channel values at pixel coordinate (x, y) are:

$$I(x,y) = [I_r(x,y), I_g(x,y), I_b(x,y)] \quad (3)$$

wherein $I_r(x,y)$, $I_g(x,y)$, $I_b(x,y)$ represent the values of the red, green and blue color channels respectively, and $I_c(x,y) \in [0,1]$, c = r, g, b.
2. Saturation and brightness correction
Pathological image scanning can yield images with poor saturation and brightness because of poor illumination conditions and other causes, so brightness and saturation correction of the pathological image is required. The background area of a pathological section (the area not covered by tissue) contains no content, and in general the imaging effect of the whole section is best when the background is pure white. On this basis, the saturation and brightness of the whole section are corrected. The specific method comprises the following three steps:
a. The image I(x, y) is transformed from RGB space to HSV space using the standard RGB-to-HSV conversion of formula (4), where H(x, y), S(x, y) and V(x, y) represent the hue, saturation and brightness of point (x, y), respectively.
b. The saturation of the background area is always the lowest in the whole slice. Therefore, the pixels with the lowest 5% of saturation (S-channel) values are defined as background-area pixels; their mean value is taken to estimate the saturation of the background area, denoted $S_{back}$, and their mean value in the brightness channel V is taken as the brightness value of the background area, denoted $V_{back}$. Finally, the saturation and brightness of the whole slice are linearly stretched with the background area converted to white as the target, i.e. ensuring as far as possible that the processed background saturation approaches 0 and the processed background brightness approaches 1, while keeping the hue constant; $S''(x,y)$ and $V''(x,y)$ denote the enhancement result for point (x, y).
c. The enhanced result is inverse-transformed to RGB space to finish the saturation and brightness correction of the pathological image; the conversion is the standard HSV-to-RGB transformation, where $\lfloor H(x,y)/60 \rfloor$ denotes the integer part of $H(x,y)/60$ and $I''(x,y)$ denotes the value of point (x, y) in the pathology image after the saturation and brightness correction.
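The three-step correction above can be sketched in Python (a minimal stdlib-only illustration, not the patent's reference implementation; the exact linear-stretch formula did not survive extraction, so a plausible stretch that maps $S_{back}$ to 0 and $V_{back}$ to 1 is assumed, and the `colorsys` conversions stand in for formula (4) and its inverse):

```python
import colorsys

def correct_saturation_brightness(img):
    """Correct an RGB image (list of rows of (r, g, b) floats in [0, 1])
    so that its low-saturation background becomes white.

    The stretch below is an assumed form of the patent's lost formula (5):
    it sends S_back -> 0 and V_back -> 1 while keeping the hue unchanged.
    """
    # Step a: RGB -> HSV for every pixel.
    hsv = [[colorsys.rgb_to_hsv(*px) for px in row] for row in img]
    flat = [px for row in hsv for px in row]

    # Step b: the 5% least-saturated pixels are taken as the background.
    by_sat = sorted(flat, key=lambda p: p[1])
    n_back = max(1, len(by_sat) * 5 // 100)
    back = by_sat[:n_back]
    s_back = sum(p[1] for p in back) / n_back
    v_back = sum(p[2] for p in back) / n_back

    # Linear stretch toward a white background, hue untouched.
    out = []
    for row in hsv:
        out_row = []
        for h, s, v in row:
            s2 = max(0.0, (s - s_back) / (1.0 - s_back)) if s_back < 1.0 else s
            v2 = min(1.0, v / v_back) if v_back > 0.0 else v
            # Step c: HSV -> RGB.
            out_row.append(colorsys.hsv_to_rgb(h, s2, v2))
        out.append(out_row)
    return out
```

The 5% quantile follows step b of the method; on a real whole-slide image the same logic would run on arrays rather than nested lists.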
The method described in step 2 is used to process 4 pathological sections, and the result is shown in fig. 3, where (a), (b), (c), and (d) are four digital pathological sections with poor imaging conditions, the left half of each section is the original image, and the right half is the enhancement effect after the method of the present invention is used. As can be seen from the figure, the method of the invention can effectively adjust the saturation and the brightness of the pathological image.
3. Separation of staining components
In the RGB channels, the optical density of channel c (c = r, g, b) is calculated as:

$$O_c(x,y) = -\log_{10}\frac{I_c''(x,y)}{I_{0,c}} \quad (7)$$

wherein $I_{0,c}$ is the single-channel maximum; in this method $I_{0,c} = 1$. When a single staining agent is used, the optical density $O_c(x,y)$ is proportional to the staining degree A of that agent; when several staining agents are used, the optical density equals the sum of the optical densities of the agents in channel c. For example, for H-E-DAB staining, the transformation between the optical density $O_c(x,y)$ (c = r, g, b) and the staining intensity $A_s(x,y)$ (s = H, E, DAB) is:

$$O_c(x,y) = \sum_{s} \alpha_s^c \, A_s(x,y) \quad (8)$$

wherein $\alpha_s^c$ represents the absorbance of stain s on channel c; it is a constant for stain s and channel c and can be obtained via a single-stain staining test; for H-E-DAB staining, the absorbance matrix of the channels for the three stains H, E and DAB is formed from these constants.
Let $O = [O_r(x,y), O_g(x,y), O_b(x,y)]^T$ and $A = [A_H(x,y), A_E(x,y), A_{DAB}(x,y)]^T$; equation (8) can be abbreviated as:

$$O = M \cdot A \quad (9)$$

Let $D = M^{-1}$; the staining intensity of each staining agent obtained from formula (9) is:

$$A = D \cdot O \quad (10)$$

wherein D is called the color deconvolution matrix and represents the mapping from optical density to the staining intensity of the staining agents; taking H-E-DAB staining as an example, the deconvolution matrix is the inverse of the H-E-DAB absorbance matrix (formula (11)).
vector A ═ AH(x,y),AE(x,y),ADAB(x,y)]TThat is, since A is a linear transformation of optical density O, A is still in the optical density space, and the inverse transformation is performed to the linear space, thereby completing the imageThe dyeing separation of (1) involves the formula:
A0the maximum value of coloring intensity of the coloring agent is the value range ([0,1 ] of the corresponding RGB channel in the method]) Taking A0=1。
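The separation of formulas (7)-(10) can be sketched as follows. The patent's own absorbance matrix values did not survive extraction, so the widely used Ruifrok-Johnston H-E-DAB values are substituted here as an assumption; the function names are illustrative:

```python
import math

# Absorbance matrix M (rows: r, g, b channels; columns: H, E, DAB stains).
# The patent's own values did not survive extraction; the Ruifrok-Johnston
# H-E-DAB values are substituted here as an assumption.
M = [
    [0.650, 0.072, 0.268],
    [0.704, 0.990, 0.570],
    [0.286, 0.105, 0.776],
]

def optical_density(rgb, i0=1.0, eps=1e-6):
    """Formula (7): O_c = -log10(I_c / I_0), the Beer-Lambert optical density."""
    return [-math.log10(max(c, eps) / i0) for c in rgb]

def invert3(m):
    """Inverse of a 3x3 matrix via the adjugate; D = invert3(M) is the
    color deconvolution matrix of formula (10)."""
    a, b, c = m[0]; d, e, f = m[1]; g, h, i = m[2]
    det = a * (e * i - f * h) - b * (d * i - f * g) + c * (d * h - e * g)
    return [[(e * i - f * h) / det, (c * h - b * i) / det, (b * f - c * e) / det],
            [(f * g - d * i) / det, (a * i - c * g) / det, (c * d - a * f) / det],
            [(d * h - e * g) / det, (b * g - a * h) / det, (a * e - b * d) / det]]

def separate_stains(rgb):
    """Formula (10): A = D . O for one pixel; A is still in OD space."""
    o = optical_density(rgb)
    d = invert3(M)
    return [sum(d[s][c] * o[c] for c in range(3)) for s in range(3)]
```

Because formulas (7)-(10) are exact inverses of each other, a pixel synthesized from a single stain deconvolves back to that stain alone, which makes the round trip easy to verify.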
4. Staining component adjustment
After obtaining the staining intensity $A_s'(x,y)$ of each staining agent, the physician can adjust it as needed for diagnosis. Let the adjustment rate of staining agent s be $p_s$, where $p_s > 0$; the adjusted staining intensity is calculated as:

$$\bar A_s'(x,y) = 1 - [1 - A_s'(x,y)] \times p_s \quad (2)$$

wherein $p_s > 1$ strengthens the staining intensity of component s and $p_s < 1$ weakens it.
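Formula (2) itself is a one-liner; a minimal sketch (the function name is illustrative, not the patent's):

```python
def adjust_stain(a_lin, p):
    """Formula (2): A-bar'_s = 1 - (1 - A'_s) * p_s.

    a_lin is the linear-space intensity A'_s from formula (1), where 1.0
    means unstained; p > 1 strengthens the stain (pushes the transmitted
    value down), p < 1 weakens it, and p == 1 leaves it unchanged.
    """
    return 1.0 - (1.0 - a_lin) * p
```

For example, with $A_s' = 0.8$ and $p_s = 2$, the stain's "depth" $1 - A_s'$ doubles from 0.2 to 0.4, giving an adjusted value of 0.6.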
5. Synthesis of staining components
After the staining components are adjusted, the staining data need to be fused and inverse-transformed back to the RGB channels. Taking H-E-DAB staining as an example, the adjusted staining intensities of the three stains are recombined into the RGB channels; the calculated image is the adjusted result.
Fig. 5 shows the effect of H-E stained digital pathology images treated with the inventive stain adjustment method. H-E staining is the most commonly used staining method: H stands for the hematoxylin stain, which dyes the nucleus blue-purple, and E stands for the eosin stain, which dyes the substrate pink. A pathological image with H-E staining can be regarded as an H-E-DAB staining image without the DAB stain, so it can be treated with the deconvolution matrix of formula (11) by simply setting $p_{DAB} = 1$ in formula (2) and adjusting only $p_H$ and $p_E$. The adjustment results for different values of $p_H$ and $p_E$ are shown in fig. 5. As the results show, with the method of the present invention the two staining components of the pathological image are adjusted independently.
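The synthesis step can be sketched as follows, assuming (as the text implies, but the lost fusion formula does not confirm) that fusion simply inverts the separation pipeline: adjusted linear intensities back to optical density, summed per channel through the absorbance matrix, then mapped to RGB; the Ruifrok-Johnston matrix is again an assumed stand-in for the patent's own values:

```python
import math

# Assumed Ruifrok-Johnston H-E-DAB absorbance matrix (rows: r, g, b channels;
# columns: H, E, DAB stains); the patent's own values did not survive extraction.
M = [
    [0.650, 0.072, 0.268],
    [0.704, 0.990, 0.570],
    [0.286, 0.105, 0.776],
]

def synthesize(a_adj, i0=1.0, eps=1e-6):
    """Fuse adjusted per-stain intensities back into one RGB pixel.

    a_adj: [A'_H, A'_E, A'_DAB] after formula (2), each in (0, 1].
    The fusion here is an assumed inverse of the separation pipeline:
    invert formula (1) back to optical density, sum per channel as in
    formula (8), then map to linear space with 10**(-O_c).
    """
    od = [-math.log10(max(a, eps)) for a in a_adj]   # invert formula (1)
    rgb = []
    for c in range(3):
        o_c = sum(M[c][s] * od[s] for s in range(3))  # formula (8)
        rgb.append(i0 * 10 ** (-o_c))
    return rgb
```

An unstained pixel (all intensities 1.0) maps back to white, and strengthening one stain darkens the channels in proportion to that stain's absorbance row.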
The method can be divided into two parts. The first part is the saturation and brightness adjustment of the pathological image in HSV space (step 2). The second part is the adjustment of the staining components based on color deconvolution (steps 3, 4 and 5). The first part is an adaptive image adjustment method: it needs no human intervention, performs color correction on the whole pathological image, and is more targeted than other color correction methods. In the second part, the physician manually sets the parameter $p_s$ according to diagnostic needs and thereby adjusts the staining intensity of each staining agent. Conventional image adjustment methods operate in RGB space, where adjusting brightness and saturation makes it hard for the user to change the staining density of a single staining agent. The present method uses a color deconvolution algorithm to transform the pathological image into the staining space, adjusts the staining intensity of each staining agent there, and finally inverse-transforms the image back to RGB space, achieving adjustment of a single staining agent.
Claims (6)
1. A pathological image staining component adjusting method is characterized by comprising the following steps:
(1) collecting pathological sections: collecting the pathological section into a computer, and expressing the pathological section by using an RGB channel, wherein the coordinates of a pixel point are marked as (x, y);
(2) correcting the saturation and brightness of the pathological section, comprising the following three steps:
a. transforming the pathological section from the RGB channel to an HSV channel;
b. defining the pixels with the lowest 5% of saturation (S-channel) values of the pathological section as background-area pixels; calculating the mean value of the background-area pixels to estimate the saturation of the background area, denoted $S_{back}$; meanwhile, taking the mean value of the background-area pixels in the brightness channel V as the brightness value of the background area, denoted $V_{back}$; then, with the background area transformed into white as the target, linearly stretching the saturation and brightness of the whole pathological section while keeping the hue unchanged;
c. inversely transforming the pathological image enhanced in step b back to the RGB channels to finish the correction of the saturation and brightness of the pathological image;
(3) separating staining components: in the RGB channels obtained after step 2, determining the mapping relation between the optical density $O_c(x,y)$ of channel c (c = r, g, b) and the staining intensity $A_s(x,y)$ of staining agent s, and using this mapping to complete the stain separation of the image through a color deconvolution algorithm; it involves the formula:
$$A_s'(x,y) = A_0 \times 10^{-A_s(x,y)} \quad (1)$$
wherein $A_0$ is the maximum value of the staining intensity of the staining agent, $A_0 = 1$;
(4) staining component adjustment: after obtaining the staining intensity $A_s'(x,y)$ of each staining agent, adjusting it as diagnosis requires; let the adjustment rate of staining agent s be $p_s$, where $p_s > 0$; the adjusted staining intensity is calculated as:
$$\bar A_s'(x,y) = 1 - [1 - A_s'(x,y)] \times p_s \quad (2)$$
wherein $p_s > 1$ strengthens the staining intensity of component s and $p_s < 1$ weakens it;
(5) staining component synthesis: after the staining components are adjusted in step 4, fusing the staining data and inverse-transforming the data back to the RGB channels.
2. The pathological image staining component adjustment method of claim 1, wherein the R, G, B three-channel values of the pixel coordinate (x, y) in step (1) are represented as:

$$I(x,y) = [I_r(x,y), I_g(x,y), I_b(x,y)] \quad (3)$$

wherein $I_r(x,y)$, $I_g(x,y)$, $I_b(x,y)$ represent the values of the red, green and blue color channels respectively, and $I_c(x,y) \in [0,1]$, c = r, g, b.
3. The pathological image staining component adjustment method according to claim 1, wherein the channel transformation in step a uses the following formula:
$$
\begin{aligned}
V_{\max} &= \max\left[I_r(x,y),\, I_g(x,y),\, I_b(x,y)\right] \\
V_{\min} &= \min\left[I_r(x,y),\, I_g(x,y),\, I_b(x,y)\right] \\
H(x,y) &= \begin{cases}
0, & \text{if } V_{\max} = V_{\min} \\[4pt]
60 \times \dfrac{I_g(x,y) - I_b(x,y)}{V_{\max} - V_{\min}}, & \text{if } V_{\max} = I_r(x,y) \text{ and } I_g(x,y) \ge I_b(x,y) \\[4pt]
60 \times \dfrac{I_g(x,y) - I_b(x,y)}{V_{\max} - V_{\min}} + 360, & \text{if } V_{\max} = I_r(x,y) \text{ and } I_g(x,y) < I_b(x,y) \\[4pt]
60 \times \dfrac{I_b(x,y) - I_r(x,y)}{V_{\max} - V_{\min}} + 120, & \text{if } V_{\max} = I_g(x,y) \\[4pt]
60 \times \dfrac{I_r(x,y) - I_g(x,y)}{V_{\max} - V_{\min}} + 240, & \text{if } V_{\max} = I_b(x,y)
\end{cases}
\end{aligned}
\quad (4)
$$
$$
V(x,y)=\tfrac{1}{2}\big(V_{\max}+V_{\min}\big)
$$
wherein H(x, y), S(x, y) and V(x, y) respectively represent the hue, saturation and brightness of the pixel point (x, y);
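As an illustrative sketch (not part of the claims), the hue computation of formula (4) and the brightness V = (V_max + V_min)/2 can be written as below. Channel values are assumed to be normalized to [0, 1]; the saturation formula is not reproduced in this excerpt, so only H and V are computed here.

```python
def rgb_to_hv(r, g, b):
    """Hue H per formula (4) and brightness V = (Vmax + Vmin) / 2.

    r, g, b: channel values of one pixel, normalized to [0, 1]."""
    vmax = max(r, g, b)
    vmin = min(r, g, b)
    v = 0.5 * (vmax + vmin)
    if vmax == vmin:                     # achromatic pixel
        h = 0.0
    elif vmax == r and g >= b:
        h = 60.0 * (g - b) / (vmax - vmin)
    elif vmax == r:                      # g < b, hue wraps into (300, 360)
        h = 60.0 * (g - b) / (vmax - vmin) + 360.0
    elif vmax == g:
        h = 60.0 * (b - r) / (vmax - vmin) + 120.0
    else:                                # vmax == b
        h = 60.0 * (r - g) / (vmax - vmin) + 240.0
    return h, v
```

For example, a pure green pixel (0, 1, 0) maps to hue 120 and brightness 0.5.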
in the step b, linearly stretching the saturation and brightness of the whole pathological section refers to stretching them with respect to the background saturation $S_{back}$ and the background brightness $V_{back}$ of the processed pathological section; the related calculation formula is as follows:
$$
\begin{aligned}
\bar{H}(x,y) &= H(x,y) \\
\bar{V}(x,y) &=
\begin{cases}
1, & V(x,y) > V_{back} \\
V(x,y)/V_{back}, & V(x,y) \le V_{back}
\end{cases} \\
\bar{S}(x,y) &=
\begin{cases}
0, & S(x,y) < S_{back} \\
\big[S(x,y)-S_{back}\big]\big/\big(1-S_{back}\big), & S(x,y) \ge S_{back}
\end{cases}
\end{aligned}
\tag{5}
$$
wherein $\bar{H}(x,y)$, $\bar{S}(x,y)$ and $\bar{V}(x,y)$ represent the enhancement result of the point (x, y);
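A minimal sketch of the background-referenced linear stretch of formula (5), assuming `s_back` and `v_back` are the background saturation and brightness measured from the slide:

```python
import numpy as np

def stretch_sv(S, V, s_back, v_back):
    """Background-referenced linear stretch per formula (5).

    S, V: saturation/brightness arrays in [0, 1];
    s_back, v_back: background saturation/brightness of the slide."""
    S = np.asarray(S, dtype=float)
    V = np.asarray(V, dtype=float)
    # Brightness: anything brighter than the background saturates to 1;
    # the rest is scaled so the background maps to full brightness.
    V_bar = np.where(V > v_back, 1.0, V / v_back)
    # Saturation: anything below the background level is zeroed;
    # the remainder is stretched to the full [0, 1] range.
    S_bar = np.where(S < s_back, 0.0, (S - s_back) / (1.0 - s_back))
    return S_bar, V_bar
```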
in the step c, the transformation formula involved is as follows:
wherein $\lfloor\cdot\rfloor$ represents taking the integer part of its argument, and $\bar{I}_c(x,y)$ represents the value of the point (x, y) in the pathological image after the saturation and brightness correction.
4. The pathological image staining component adjusting method according to claim 1, wherein in the step (3), the optical density $O_c(x,y)$ of the channel c (c = r, g, b) is calculated by the following formula:
$$
O_c(x,y) = -\log_{10}\!\big(\bar{I}_c(x,y)/I_{0,c}\big)
\tag{7}
$$
wherein $I_{0,c}$ is the single-channel maximum value, and $I_{0,c} = 1$.
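Formula (7) amounts to a per-channel negative base-10 logarithm. A sketch, assuming intensities are normalized so that $I_{0,c}=1$, with a small epsilon (an added safeguard, not in the claim) to avoid log(0):

```python
import numpy as np

def optical_density(I_c, I0_c=1.0, eps=1e-6):
    """Optical density per formula (7): O_c = -log10(I_c / I0_c).

    I_c: corrected channel intensity in (0, I0_c];
    eps: floor applied to I_c so a zero pixel does not produce -inf."""
    I_c = np.maximum(np.asarray(I_c, dtype=float), eps)
    return -np.log10(I_c / I0_c)
```

A fully transmitting (white) pixel has zero optical density; a pixel transmitting 1% of the light has optical density 2.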
5. The pathological image staining component adjusting method according to claim 1, wherein in the step (3), when the staining agent is a single staining agent, the optical density $O_c(x,y)$ is proportional to the pigmentation degree A of the stain; when a plurality of staining agents are used, the optical density is equal to the sum of the optical densities of the individual stains. When the staining agent is hematoxylin-eosin-diaminobenzidine, hematoxylin is recorded as H, eosin as E and diaminobenzidine as DAB, and the transformation relationship between the optical density $O_c(x,y)$ (c = r, g, b) and the coloring intensity $A_s(x,y)$ of the coloring agents is as follows:
$$
\begin{bmatrix} O_r(x,y) \\ O_g(x,y) \\ O_b(x,y) \end{bmatrix}
=
\begin{pmatrix}
m_r^{H} & m_r^{E} & m_r^{DAB} \\
m_g^{H} & m_g^{E} & m_g^{DAB} \\
m_b^{H} & m_b^{E} & m_b^{DAB}
\end{pmatrix}
\begin{bmatrix} A_H(x,y) \\ A_E(x,y) \\ A_{DAB}(x,y) \end{bmatrix}
\tag{8}
$$
wherein s = H, E, DAB; $m_c^{s}$ represents the absorbance of the stain s on the channel c and is a constant for the stain s and the channel c, obtained via a single-stain staining test. When the staining agent is H-E-DAB staining, the absorbance matrix of the channels for the three stains H, E and DAB is
$$
M =
\begin{pmatrix}
m_r^{H} & m_r^{E} & m_r^{DAB} \\
m_g^{H} & m_g^{E} & m_g^{DAB} \\
m_b^{H} & m_b^{E} & m_b^{DAB}
\end{pmatrix}
=
\begin{pmatrix}
0.65 & 0.70 & 0.29 \\
0.07 & 0.99 & 0.11 \\
0.27 & 0.57 & 0.78
\end{pmatrix},
$$
Let
$$
O = \big[O_r(x,y),\, O_g(x,y),\, O_b(x,y)\big]^{T}, \qquad
A = \big[A_H(x,y),\, A_E(x,y),\, A_{DAB}(x,y)\big]^{T}.
$$
Equation (8) is abbreviated as:
$$
O = M \cdot A
\tag{9}
$$
Let $D = M^{-1}$. The coloring intensity of each coloring agent is then obtained from equation (9) as:
$$
A = D \cdot O
\tag{10}
$$
wherein D is called the color deconvolution matrix and represents the mapping relation from the optical density to the coloring intensity of the coloring agents; when the staining agent is H-E-DAB staining, the deconvolution matrix is:
$$
D =
\begin{pmatrix}
1.88 & -0.07 & -0.60 \\
-1.02 & 1.13 & -0.48 \\
-0.55 & -0.13 & 1.57
\end{pmatrix}
\tag{11}
$$
The vector $A = [A_H(x,y), A_E(x,y), A_{DAB}(x,y)]^{T}$, that is, the decomposed staining intensity, is a linear transformation of the optical density O, so A still lies in the optical density space; an inverse transformation back to the linear space completes the staining separation of the image.
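The deconvolution step of formulas (9)-(10) can be sketched by numerically inverting M. Note that, rounding aside, a direct numerical inverse can differ from the printed matrix (11) in row/column convention, so this sketch derives D from M rather than hard-coding (11):

```python
import numpy as np

# Absorbance matrix M for H-E-DAB staining, columns = stains H, E, DAB (formula (8))
M = np.array([[0.65, 0.70, 0.29],
              [0.07, 0.99, 0.11],
              [0.27, 0.57, 0.78]])

# Color deconvolution matrix D = M^{-1} (formula (10))
D = np.linalg.inv(M)

def deconvolve(O_rgb):
    """Split an optical-density vector O = [O_r, O_g, O_b]^T into
    staining intensities A = [A_H, A_E, A_DAB]^T via A = D @ O."""
    return D @ np.asarray(O_rgb, dtype=float)
```

Round-tripping a known intensity vector through M and back recovers it, which is a quick sanity check of the inversion.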
6. The pathological image staining component adjusting method according to claim 1, wherein in the step (5), when the staining agent is hematoxylin-eosin-diaminobenzidine staining, the staining data are fused by using the following calculation formula:
$$
\begin{aligned}
\bar{A}_s(x,y) &= -\log\!\big(\bar{A}'_s(x,y)/A_0\big), \quad s = H, E, DAB \\[4pt]
\begin{bmatrix} \bar{O}_r(x,y) \\ \bar{O}_g(x,y) \\ \bar{O}_b(x,y) \end{bmatrix}
&= M
\begin{bmatrix} \bar{A}_H(x,y) \\ \bar{A}_E(x,y) \\ \bar{A}_{DAB}(x,y) \end{bmatrix} \\[4pt]
\bar{\bar{I}}_c(x,y) &= I_{0,c}\times 10^{-\bar{O}_c(x,y)}, \quad c = r, g, b
\end{aligned}
\tag{12}
$$
The $\bar{\bar{I}}_c(x,y)$ thus calculated is the adjustment result.
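A sketch of the fusion of formula (12), assuming $A_0 = 1$, $I_{0,c} = 1$, and that the unspecified log in (12) is base 10 (consistent with formula (7) and the final $10^{-\bar{O}_c}$ step):

```python
import numpy as np

# Assumed absorbance matrix M for H-E-DAB staining (formula (8))
M = np.array([[0.65, 0.70, 0.29],
              [0.07, 0.99, 0.11],
              [0.27, 0.57, 0.78]])

def fuse(A_adj, I0=1.0, A0=1.0):
    """Fuse adjusted staining data back to RGB per formula (12).

    A_adj: adjusted stain values [A'_H, A'_E, A'_DAB] in (0, A0]."""
    # Map adjusted stain values into optical-density space
    # (log base assumed to be 10, matching formula (7)).
    A_bar = -np.log10(np.asarray(A_adj, dtype=float) / A0)
    # Mix the stain optical densities into channel optical densities.
    O_bar = M @ A_bar
    # Invert the log to recover channel intensities.
    return I0 * 10.0 ** (-O_bar)
```

With no stain present (all adjusted values equal to 1), the output is a white pixel; lowering any stain value darkens all three channels according to that stain's absorbance column.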
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201710246205 | 2017-04-15 | ||
CN2017102462051 | 2017-04-15 |
Publications (2)
Publication Number | Publication Date |
---|---|
CN107705336A true CN107705336A (en) | 2018-02-16 |
CN107705336B CN107705336B (en) | 2021-08-06 |
Family
ID=61184206
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201710947445.4A Active CN107705336B (en) | 2017-04-15 | 2017-10-12 | Pathological image dyeing component adjusting method |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN107705336B (en) |
Cited By (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN109544529A (en) * | 2018-11-19 | 2019-03-29 | 南京信息工程大学 | Pathological image data enhancement methods towards deep learning model training and study |
CN109785943A (en) * | 2018-12-21 | 2019-05-21 | 程俊美 | A kind of monitoring of pathology and diagnostic message processing system and method |
CN110223305A (en) * | 2019-06-12 | 2019-09-10 | 志诺维思(北京)基因科技有限公司 | Cell segmentation method, apparatus and readable storage medium storing program for executing |
CN110298905A (en) * | 2019-07-02 | 2019-10-01 | 麦克奥迪(厦门)医疗诊断系统有限公司 | It is a kind of to be sliced the method and apparatus for generating digital slices based on biological sample |
CN110706237A (en) * | 2019-09-06 | 2020-01-17 | 上海衡道医学病理诊断中心有限公司 | Diaminobenzidine separation and evaluation method based on YCbCr color space |
CN111145176A (en) * | 2019-04-15 | 2020-05-12 | 青岛大学 | Method and system for automatically identifying lymph node staining pathological image of gastric cancer based on deep neural network |
WO2020107156A1 (en) * | 2018-11-26 | 2020-06-04 | 深圳先进技术研究院 | Automated classification method and device for breast medical ultrasound images |
CN111539883A (en) * | 2020-04-20 | 2020-08-14 | 福建帝视信息科技有限公司 | Digital pathological image H & E dyeing restoration method based on strong reversible countermeasure network |
CN113469939A (en) * | 2021-05-26 | 2021-10-01 | 透彻影像(北京)科技有限公司 | HER-2 immunohistochemical automatic interpretation system based on characteristic curve |
CN116754548A (en) * | 2022-07-12 | 2023-09-15 | 黑龙江省农业科学院食品加工研究所 | Determination method for peel retention degree of processed rice |
Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102460156A (en) * | 2009-06-03 | 2012-05-16 | 日本电气株式会社 | Pathologic tissue image analyzing apparatus, pathologic tissue image analyzing method, and pathologic tissue image analyzing program |
CN103635176A (en) * | 2011-03-21 | 2014-03-12 | 卡拉莱特有限公司 | Systems for custom coloration |
WO2014065950A1 (en) * | 2012-10-26 | 2014-05-01 | Google Inc. | Video chat encoding pipeline |
KR101428923B1 (en) * | 2013-04-23 | 2014-08-08 | 충북대학교 산학협력단 | System and Method for Automatic Extraction of Component Packaging Regions in PCB |
CN104700375A (en) * | 2015-03-27 | 2015-06-10 | 麦克奥迪(厦门)医疗诊断系统有限公司 | Method for improving pathology image visual effect based on main component analyzing |
CN105488836A (en) * | 2015-11-16 | 2016-04-13 | 武汉海达数云技术有限公司 | Circular colored tape point cloud rendering method based on elevation distribution characteristics |
CN105893649A (en) * | 2015-03-23 | 2016-08-24 | 温州大学 | Optimal model based interactive image recolorating method |
2017-10-12: Application CN201710947445.4A filed; granted as CN107705336B (status: Active)
Also Published As
Publication number | Publication date |
---|---|
CN107705336B (en) | 2021-08-06 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN107705336B (en) | Pathological image dyeing component adjusting method | |
Zhou et al. | Color retinal image enhancement based on luminosity and contrast adjustment | |
EP3201826B1 (en) | Method and data-processing device for computer-assisted hair colouring guidance | |
CN109829930A (en) | Face image processing process, device, computer equipment and readable storage medium storing program for executing | |
CN101901474B (en) | Change the method for at least one in the density of image and contrast | |
CN105654437A (en) | Enhancement method for low-illumination image | |
CN108377373A (en) | A kind of color rendition device and method pixel-based | |
CN108172278B (en) | HE staining pathological image color normalization method | |
CN106127709A (en) | A kind of low-luminance color eye fundus image determination methods and Enhancement Method | |
CN105118076B (en) | Based on over-segmentation and the local image colorization method with global coherency | |
JP6793281B2 (en) | Color gamut mapping method and color gamut mapping device | |
CN110009574B (en) | Method for reversely generating high dynamic range image from low dynamic range image | |
JP2008243059A (en) | Image processing device and method | |
Huang et al. | Deep unsupervised endoscopic image enhancement based on multi-image fusion | |
Pierre et al. | Luminance-hue specification in the RGB space | |
CN102419867A (en) | Image retouching method | |
CN101790101A (en) | Method and device for adjusting image saturation | |
CN117689762B (en) | Endoscopic image staining method and system | |
CN107610186B (en) | Image processing method and device | |
Eldahshan et al. | Segmentation framework on digital microscope images for acute lymphoblastic leukemia diagnosis based on HSV Color Space | |
CN112581390B (en) | Image color enhancement method, device, equipment and readable storage medium | |
CN113808057A (en) | Endoscope image enhancement method based on unsupervised learning | |
CN116630198A (en) | Multi-scale fusion underwater image enhancement method combining self-adaptive gamma correction | |
CN111047517B (en) | Skin color adjusting method and device and readable storage medium | |
CN112995586A (en) | Image capturing method using color conversion and medical image capturing system |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | |
SE01 | Entry into force of request for substantive examination | |
GR01 | Patent grant | |