CN108965835B - Image processing method, image processing device and terminal equipment

Image processing method, image processing device and terminal equipment

Info

Publication number
CN108965835B
Authority
CN
China
Prior art keywords
color temperature
image
images
camera
acquired
Prior art date
Legal status
Expired - Fee Related
Application number
CN201810965516.8A
Other languages
Chinese (zh)
Other versions
CN108965835A (en)
Inventor
毕强 (Bi Qiang)
Current Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Original Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority date
Filing date
Publication date
Application filed by Guangdong Oppo Mobile Telecommunications Corp Ltd filed Critical Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority to CN201810965516.8A priority Critical patent/CN108965835B/en
Publication of CN108965835A publication Critical patent/CN108965835A/en
Application granted granted Critical
Publication of CN108965835B publication Critical patent/CN108965835B/en

Classifications

    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 - Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/10 - Cameras or camera modules comprising electronic image sensors; Control thereof for generating image signals from different wavelengths
    • H04N23/13 - Cameras or camera modules comprising electronic image sensors; Control thereof for generating image signals from different wavelengths with multiple sensors
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 - Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/80 - Camera processing pipelines; Components thereof
    • H04N23/84 - Camera processing pipelines; Components thereof for processing colour signals
    • H04N23/88 - Camera processing pipelines; Components thereof for processing colour signals for colour balance, e.g. white-balance circuits or colour temperature control

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Processing Of Color Television Signals (AREA)
  • Image Processing (AREA)

Abstract

The application provides an image processing method, an image processing device, a terminal device and a computer-readable storage medium. The method comprises the following steps: acquiring a preset number of frames continuously captured by each camera, where the frames continuously captured by each camera include the image that camera is currently capturing; estimating the color temperature of the current environment from the acquired images; and performing white balance adjustment on a target image according to the color temperature, the target image being one or more of the images currently captured by the cameras. With this technical scheme, when a user has just moved from one color temperature environment into another, the color temperature of the current environment can be estimated more accurately.

Description

Image processing method, image processing device and terminal equipment
Technical Field
The present application belongs to the field of image processing technologies, and in particular, to an image processing method, an image processing apparatus, a terminal device, and a computer-readable storage medium.
Background
After the terminal device starts a camera application, the color temperature of the current environment must first be estimated in order to perform white balance adjustment on the image currently captured by the camera (i.e., to make objects in the captured image show their normal colors and avoid color casts). To estimate this color temperature more accurately, a conventional estimation method uses multiple frames (for example, 60 or 100 frames) captured by the main camera (i.e., the camera whose images are shown on the display screen).
However, when the user moves from one color temperature environment into another, the terminal device has captured only a few images in the new environment at first. For example, if the main camera has so far captured only 10 frames in the current environment and the color temperature is estimated from its last 60 frames, then only 10 of those frames belong to the current environment while 50 belong to the previous one. The color temperature of the current environment therefore cannot be estimated accurately, and white balance adjustment fails for the images captured just after the user enters the new environment.
Disclosure of Invention
In view of the above, the present application provides an image processing method, an image processing apparatus, a terminal device and a computer-readable storage medium, with which the color temperature of the current environment can be estimated more accurately when a user has just moved from one color temperature environment into another.
A first aspect of the present application provides an image processing method, which is applied to a terminal device, where the terminal device includes a plurality of cameras, and the image processing method includes:
acquiring images of preset frame numbers continuously acquired by each camera, wherein the images of the preset frame numbers continuously acquired by each camera comprise the images currently acquired by the camera;
estimating the color temperature of the current environment according to the acquired images;
and carrying out white balance adjustment on a target image according to the color temperature, wherein the target image is one or more images in the images currently acquired by each camera.
A second aspect of the present application provides an image processing apparatus, which is applied to a terminal device, where the terminal device includes a plurality of cameras, and the image processing apparatus includes:
the image acquisition module is used for acquiring images of preset frame numbers continuously acquired by each camera, wherein the images of the preset frame numbers continuously acquired by each camera comprise the images currently acquired by the camera;
the color temperature estimation module is used for estimating the color temperature of the current environment according to the acquired images;
and the white balance adjusting module is used for carrying out white balance adjustment on a target image according to the color temperature, wherein the target image is one or more images in the images currently acquired by each camera.
A third aspect of the present application provides a terminal device comprising a memory, a processor and a computer program stored in the memory and executable on the processor, wherein the processor implements the steps of the method according to the first aspect when executing the computer program.
A fourth aspect of the present application provides a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, performs the steps of the method of the first aspect as described above.
A fifth aspect of the present application provides a computer program product comprising a computer program which, when executed by one or more processors, performs the steps of the method of the first aspect as described above.
From the above, the present application provides an image processing method applied to a terminal device that includes a plurality of cameras. First, a preset number of continuously captured frames is acquired from each camera, where the frames from each camera include the image that camera is currently capturing. For example, if the terminal device includes three cameras (a first, a second and a third camera), the preset number of frames is acquired from each of them, and the frames from each camera include that camera's current image. Second, the color temperature of the current environment is estimated from the acquired images. Finally, white balance adjustment is performed on a target image according to that color temperature, the target image being one or more of the images currently captured by the cameras. Thus, when the user has just entered another color temperature environment, each camera has captured only a few images of the current environment; but because this application estimates the color temperature from the images of all cameras, the proportion of current-environment images among those used is higher than with the conventional color temperature estimation method. For example, if each camera has captured 10 frames in the current environment and the color temperature is estimated from 60 frames collected by 3 cameras, then current-environment images make up 30/60 = 1/2 of the frames used, versus only 10/60 = 1/6 for the conventional single-camera method. With this technical scheme, the color temperature of the current environment can therefore be estimated more accurately when a user has just moved from one color temperature environment into another.
Drawings
In order to illustrate the technical solutions in the embodiments of the present application more clearly, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings described below show only some embodiments of the present application, and those skilled in the art can obtain other drawings from them without inventive effort.
Fig. 1 is a schematic flow chart illustrating an implementation of an image processing method according to an embodiment of the present application;
fig. 2 is a schematic diagram illustrating that a plurality of cameras respectively and continuously acquire images with preset frame numbers according to an embodiment of the present application;
fig. 3 is a schematic flow chart illustrating an implementation of another image processing method according to the second embodiment of the present application;
fig. 4 is a schematic diagram of determining whether a white area is provided in the second embodiment of the present application;
fig. 5 is a schematic flow chart of an implementation of determining whether a white area exists in a selected image according to a second embodiment of the present application;
fig. 6 is a schematic diagram of another determination of whether the white area is provided in the second embodiment of the present application;
fig. 7 is a schematic structural diagram of an image processing apparatus according to a third embodiment of the present application;
fig. 8 is a schematic structural diagram of a terminal device according to a fourth embodiment of the present application.
Detailed Description
In the following description, for purposes of explanation and not limitation, specific details are set forth, such as particular system structures, techniques, etc. in order to provide a thorough understanding of the embodiments of the present application. It will be apparent, however, to one skilled in the art that the present application may be practiced in other embodiments that depart from these specific details. In other instances, detailed descriptions of well-known systems, devices, circuits, and methods are omitted so as not to obscure the description of the present application with unnecessary detail.
The image processing method provided by the embodiment of the application can be applied to terminal devices, and the terminal devices include, but are not limited to: smart phones, tablet computers, learning machines, intelligent wearable devices, and the like.
It will be understood that the terms "comprises" and/or "comprising," when used in this specification and the appended claims, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
It is also to be understood that the terminology used in the description of the present application herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the application. As used in the specification of the present application and the appended claims, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise.
It should be further understood that the term "and/or" as used in this specification and the appended claims refers to and includes any and all possible combinations of one or more of the associated listed items.
As used in this specification and the appended claims, the term "if" may be interpreted contextually as "when", "upon", "in response to a determination" or "in response to a detection". Similarly, the phrase "if it is determined" or "if a [described condition or event] is detected" may be interpreted contextually to mean "upon determining", "in response to determining", "upon detecting [the described condition or event]" or "in response to detecting [the described condition or event]".
In particular implementations, the terminal devices described in embodiments of the present application include, but are not limited to, other portable devices such as mobile phones, laptop computers, or tablet computers having touch sensitive surfaces (e.g., touch screen displays and/or touch pads). It should also be understood that in some embodiments, the devices described above are not portable communication devices, but rather are desktop computers having touch-sensitive surfaces (e.g., touch screen displays and/or touch pads).
In the discussion that follows, a terminal device that includes a display and a touch-sensitive surface is described. However, it should be understood that the terminal device may include one or more other physical user interface devices such as a physical keyboard, mouse, and/or joystick.
Various applications that may be executed on the terminal device may use at least one common physical user interface device, such as a touch-sensitive surface. One or more functions of the touch-sensitive surface and corresponding information displayed on the terminal can be adjusted and/or changed between applications and/or within respective applications. In this way, a common physical architecture (e.g., touch-sensitive surface) of the terminal can support various applications with user interfaces that are intuitive and transparent to the user.
In addition, in the description of the present application, the terms "first", "second", and the like are used only for distinguishing the description, and are not intended to indicate or imply relative importance.
In order to explain the technical solution of the present application, the following description will be given by way of specific examples.
Example one
Referring to fig. 1, an image processing method provided in an embodiment of the present application is applied to a terminal device including a plurality of cameras, and includes:
in step S101, acquiring images of preset frames continuously acquired by each camera, where the images of the preset frames continuously acquired by each camera include an image currently acquired by the camera;
in the embodiment of the application, after it is detected that a user starts a camera application (i.e., an application with a camera function) in the terminal device, the terminal device starts at least two cameras at the same time, and then acquires images of preset frames continuously acquired by each started camera. In the embodiment of the application, in order to ensure that a terminal device can start a plurality of cameras simultaneously, a camera application program may be developed based on a camera2.0 architecture, so that the camera application program may support a plurality of cameras to work simultaneously, camera2.0 is a camera development program based on an Android operating system, the camera application program may support a plurality of cameras to work simultaneously, and each frame image acquired by each camera may be processed, a conventional camera development program is based on a camera1.0 architecture, a camera application program designed based on the camera1.0 architecture may only support one camera to work at the same time, and processing of data may not reach control of a frame level, and may only reach a stream level. Because the technical scheme provided by the application requires a plurality of cameras to work simultaneously, the camera application program can be developed based on the camera2.0 architecture.
In the embodiment of the present application, the preset number of frames may be fixed, for example at 60 frames; alternatively, it may vary with the external environment. If the current environment is detected to be relatively stable (for example, the user's geographic location is not changing), a smaller value may be used; if the current environment is detected to be unstable (for example, the user is moving at a relatively high speed), a larger value may be used.
In the embodiment of the application, to make the subsequent estimation of the current environment's color temperature more accurate, cameras with large differences in viewing angle may be started simultaneously, for example the front camera and the rear camera.
In step S102, estimating a color temperature of a current environment from the acquired images;
in the embodiment of the application, the ambient color temperature corresponding to each frame of image in each image can be obtained first, and then the weighted average is performed on each ambient color temperature to obtain the color temperature of the current environment; alternatively, the maximum value and the minimum value of each environmental color temperature may be eliminated, and the average value of the remaining environmental color temperatures may be calculated.
A method for performing weighted averaging on the color temperatures of the respective environments to obtain the color temperature of the current environment is discussed below with reference to fig. 2:
in this embodiment of the present application, each ambient color temperature may be weighted and averaged according to a color temperature calculation formula (1), so as to obtain a color temperature of a current environment, where the color temperature calculation formula (1) is:
wherein T is the color temperature of the current environment, M is the number of the cameras of the terminal equipment, K is the preset frame number,respectively the environmental color temperature corresponding to each image currently collected by the M cameras,respectively the environmental color temperature corresponding to the previous frame image of each image currently collected by the M cameras,respectively the environmental color temperature corresponding to the previous K frames of images of the images currently collected by the M cameras,for each of the weight values, the weight value,
as shown in the figure2, the terminal device includes 3 cameras, i.e., M-3, camera1, camera2, and camera 3, each of which continuously captures 20 frames of images, i.e., K-20,andthe ambient color temperature corresponding to each image currently collected by the 3 cameras,andis the environmental color temperature corresponding to the previous frame image of each image currently collected by the 3 cameras,andthe color temperature of the environment corresponding to the first 20 frames of images of each image currently collected by the 3 cameras. When the color temperature of the current environment is calculated, for each image collected by each camera, the longer the time from the current frame, the smaller the correlation with the current environment, therefore, the color temperature of the current environment can be calculatedAndthe corresponding weight is selected as a larger value, and thenAndthe corresponding weight value is selected to be a smaller numerical value, thereby more accurately estimatingThe color temperature of the current environment.
In step S103, performing white balance adjustment on a target image according to the color temperature, where the target image is one or more images in images currently acquired by each camera;
in this embodiment, the terminal device may store the correspondence information of the "color temperature-pixel correction value" in advance, and then correct the pixel value of each pixel point in the target image according to the color temperature value of the current environment estimated in step S102 and the correspondence information of the "color temperature-pixel correction value" stored in advance. The target image may be an image currently displayed on a display screen of the terminal device.
The first embodiment of the present application provides an image processing method. When a user has just entered another color temperature environment, the images captured by several cameras are used to estimate the color temperature of the current environment, so among the images used, the proportion belonging to the current environment is higher than with the conventional color temperature estimation method. Therefore, with the technical scheme provided by the present application, the color temperature of the current environment can be estimated more accurately when the user has just moved from one color temperature environment into another.
Example two
Referring to fig. 3, the image processing method according to the second embodiment of the present application is applied to a terminal device including a plurality of cameras, and includes:
in step S301, images of preset frames continuously acquired by each camera are obtained, where the images of the preset frames continuously acquired by each camera include an image currently acquired by the camera;
in the second embodiment of the present application, the step S301 is the same as the step S101 in the first embodiment, and specific reference may be made to the description of the first embodiment, which is not repeated herein.
In step S302, an image is selected from the acquired images, and the selected image is divided into a plurality of regions;
in the embodiment of the present application, after the images respectively acquired by the cameras are acquired in step S301, an image may be arbitrarily selected from the acquired images, and then the selected image is divided into a plurality of regions, which may be a plurality of rectangular regions; or may be a plurality of regions of other shapes, which is not limited in this application. As shown in fig. 4, it is assumed that the terminal device includes 3 cameras, and after 20 frames of images continuously acquired by each camera are acquired, a current frame 401 acquired by the camera2 is selected from the acquired 60 frames of images, and the image 401 is divided into 6 × 6 rectangular regions.
In step S303, determining whether a white area exists in the selected image according to the corresponding relationship table, if so, performing step S304, otherwise, performing step S305;
in the technical solution provided in the second embodiment of the present application, the terminal device stores a correspondence table in advance, and the correspondence table records correspondence information between each color temperature and a pixel value of a white pixel at each color temperature. At different color temperatures, the colors of the white pixels are different, that is, the white pixels have different pixel values, for example, the white pixels may turn blue at a high color temperature and turn yellow at a low color temperature, the pixel values of the white pixels at different color temperatures may be recorded in advance, and the correspondence table recorded with "color temperature — pixel value of white pixel" may be stored in the memory before the terminal device leaves the factory, as shown in fig. 4, which is a schematic diagram of the correspondence table 402 provided in the embodiment of the present application.
Specifically, fig. 5 can be utilized to determine whether a white area exists in the selected image:
in step S501, calculating an average value of pixel values corresponding to each region in the selected image, where the average value of pixel values corresponding to each region is an average value of pixel values of all pixel points in the region;
as shown in fig. 4, an average value of R values, an average value of G values, and an average value of B values of all pixel points in the first region 4011 of the image 401 are calculated, so as to obtain an average value of pixel values of the region 4011, that is, an average value of pixel values of the region 4011 is obtainedAndall the regions of the image 401 are traversed to obtain the average value of the pixel values corresponding to each region.
In step S502, according to the average value of the pixel values corresponding to each region, obtaining the storage pixel value corresponding to each region, where the storage pixel value corresponding to each region is the pixel value stored in the correspondence table and closest to the average value of the pixel values corresponding to the region;
as shown in fig. 4, in the correspondence table 402, the average value of the pixel values of the area 4011 is searched for Andthe closest pixel value, the found pixel value is determined to be the stored pixel value of the region 4011. Specifically, the area 4011 can be calculatedAndthe distances between each pixel value in the table 402, R1/G1/B1, R2/G2/B2 and R3/G3/B3, respectively, will be fromAndthe pixel value with the smallest distance is determined as the stored pixel value of the region 4011, for example, if R1, G1, and B1 and andis the smallest, the stored pixel values of the region 4011 are R1, G1, and B1. All regions in image 401 are traversed resulting in a stored pixel value for each region. In addition, if there are a plurality of pixel values closest to the average value of the pixel values of the area in the correspondence table, any one of the pixel values in the correspondence table may be selected as the stored pixel value of the area.
In step S503, determining a distance value corresponding to each region, where the distance value corresponding to each region is a distance between the average value of the pixel values corresponding to the region and the corresponding stored pixel value;
as shown in fig. 4, if the stored pixel values of the region 4011 obtained in step S502 are R1, G1, and B1, the average value of the pixel values of the region 4011 is calculatedAndthe distances to R1, G1, and B1 are determined as the distance values for this region 4011, and all regions of the image 401 are traversed to obtain the distance value for each region.
In step S504, determining an area with a distance value less than a preset distance as a white area, and determining an area with a distance value greater than or equal to the preset distance as a non-white area;
if the distance between the average value of the pixel values of a certain area and the corresponding stored pixel value is too large, the probability that the area is a white area is small, so that the area can be determined as a non-white area, all areas of the selected image are traversed, and whether the white area exists in the selected image is determined.
In addition, in the embodiment of the present application, the method for determining whether a white region exists in the selected image according to the correspondence table is not limited to steps S501 to S504; other methods may also be used. As shown in fig. 6, a color histogram of each region in the selected image may be calculated, the preset color histograms corresponding to the pixel values at each color temperature in the correspondence table may be obtained, and the similarity between each region's color histogram and each preset color histogram may then be calculated; a region whose histogram is sufficiently similar to one of the preset histograms may be determined to be a white region.
In step S304, determining an ambient color temperature corresponding to each white region according to the correspondence table, and determining an ambient color temperature corresponding to the selected image according to the ambient color temperature corresponding to each white region;
in this embodiment of the application, if it is determined whether the white area exists in the selected image according to steps S501 to S504, when the white area exists in the selected image, the color temperature corresponding to the stored pixel value corresponding to each white area may be searched in the correspondence table, and each searched color temperature is determined as the ambient color temperature of the corresponding white area. The environmental color temperatures corresponding to the white areas can be weighted and averaged to determine the environmental color temperature corresponding to the selected image; or the maximum value and the minimum value in the environment color temperatures corresponding to the white areas can be removed, and the average value of the environment color temperatures corresponding to the remaining white areas is determined as the environment color temperature corresponding to the selected image.
In step S305, determining an ambient color temperature corresponding to the selected image according to the location of the terminal device, the current weather condition, and the current time;
if the white area does not exist in the selected image, the ambient color temperature corresponding to the selected image can be determined according to the location of the terminal device, the current weather condition and the current time. For example, if the terminal device is outdoors, the weather is cloudy, and the current time is 4 pm, it may be considered that the color temperature of the current environment is higher.
In the embodiment of the application, the corresponding relation information of each different position, each different weather condition, each different time and the ambient color temperature can be stored in the terminal device in advance, so that the color temperature of the current environment can be determined according to the corresponding relation information stored in advance.
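A minimal sketch of such pre-stored correspondence information follows; the keys and temperature values are invented purely for illustration.

```python
# Hypothetical pre-stored (location, weather, time of day) -> color temperature
# entries; a real device would carry a much denser, tuned table.
FALLBACK_COLOR_TEMPS = {
    ("outdoor", "sunny", "noon"): 5500,
    ("outdoor", "cloudy", "afternoon"): 6500,  # cf. the cloudy 4 pm example above
    ("indoor", "tungsten", "evening"): 3000,
}

def fallback_color_temp(location, weather, time_of_day, default=5000):
    """Step S305 fallback used when the selected image contains no white region."""
    return FALLBACK_COLOR_TEMPS.get((location, weather, time_of_day), default)
```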
In step S306, it is determined whether all the acquired images have been traversed, if yes, step S308 is executed, otherwise, step S307 is executed;
in the embodiment of the present application, the ambient color temperature corresponding to each image acquired in step S301 needs to be determined, and therefore, it needs to be determined whether all the acquired images are traversed, and if all the acquired images are not traversed, another image continues to be selected, and the ambient color temperature of the selected another image continues to be determined.
In step S307, selecting one image from the remaining images, dividing the selected image into a plurality of regions, and returning to perform step S303;
in this embodiment, if it is determined in step S306 that all the acquired images have not been traversed, another image is selected from the acquired remaining images, the image is divided into a plurality of regions, the process returns to step S303, and the ambient color temperature corresponding to the image selected in step S307 is continuously determined.
In step S308, performing weighted average on the color temperatures of the respective environments to obtain the color temperature of the current environment;
in step S309, performing white balance adjustment on a target image according to the color temperature, where the target image is one or more images in images currently acquired by each camera;
in the second embodiment of the present application, the steps S308 to S309 are all described in the first embodiment, which can be specifically referred to in the first embodiment, and are not described herein again.
The second embodiment of the present application provides a specific method for determining the ambient color temperature of each acquired image. When the user has just entered another color temperature environment, the color temperature of the current environment is estimated from images captured by several cameras, so among the images used, the proportion belonging to the current environment is higher than with the conventional color temperature estimation method; the color temperature of the current environment can therefore be estimated more accurately when the user has just moved from one color temperature environment into another.
It should be understood that, the sequence numbers of the steps in the foregoing embodiments do not imply an execution sequence, and the execution sequence of each process should be determined by its function and inherent logic, and should not constitute any limitation to the implementation process of the embodiments of the present application.
EXAMPLE III
In a third embodiment of the present application, an image processing apparatus is provided, in which only a part related to the present application is shown for convenience of description, and as shown in fig. 7, an image processing apparatus 700 includes:
an image obtaining module 701, configured to obtain images of preset frame numbers continuously acquired by each camera, where the images of the preset frame numbers continuously acquired by each camera include an image currently acquired by the camera;
a color temperature estimation module 702, configured to estimate a color temperature of a current environment according to each acquired image;
and a white balance adjusting module 703, configured to perform white balance adjustment on a target image according to the color temperature, where the target image is one or more images in the images currently acquired by each camera.
Optionally, the color temperature estimation module 702 includes:
the color temperature acquisition unit is used for acquiring the environmental color temperature corresponding to each frame of image in each image;
and the weighted average unit is used for carrying out weighted average on the color temperatures of all environments to obtain the color temperature of the current environment.
Optionally, the weighted average unit is specifically configured to:
carrying out weighted average on each environmental color temperature according to a color temperature calculation formula to obtain the color temperature of the current environment, wherein the color temperature calculation formula is as follows:
$$T=\sum_{m=1}^{M}\sum_{k=0}^{K-1} w_{m,k}\,T_{m,k},\qquad \sum_{m=1}^{M}\sum_{k=0}^{K-1} w_{m,k}=1$$

where $T$ is the color temperature of the current environment, $M$ is the number of cameras of the terminal device, $K$ is the preset number of frames, $T_{1,0},\dots,T_{M,0}$ are the ambient color temperatures corresponding to the images currently captured by the $M$ cameras, $T_{1,1},\dots,T_{M,1}$ are the ambient color temperatures corresponding to the previous frame captured by each camera, $T_{1,K-1},\dots,T_{M,K-1}$ are the ambient color temperatures corresponding to the oldest of the $K$ frames captured by each camera, and $w_{m,k}$ is the weight assigned to each ambient color temperature.
optionally, the terminal device stores a preset correspondence table, where the correspondence table records correspondence information between color temperatures and pixel values of white pixels at the color temperatures, where each color temperature corresponds to one pixel value, and accordingly, the color temperature obtaining unit includes:
the area dividing subunit is used for selecting one image from the acquired images and dividing the selected image into a plurality of areas;
a white area determining subunit, configured to determine whether each area in the selected image is a white area according to the correspondence table;
a first color temperature determining subunit, configured to determine, if one or more areas in the selected image are white areas, an ambient color temperature corresponding to each white area according to the correspondence table, and determine, according to the ambient color temperature corresponding to each white area, an ambient color temperature corresponding to the selected image;
a second color temperature determining subunit, configured to determine, if each region in the selected image is not a white region, an ambient color temperature corresponding to the selected image according to the location of the terminal device, the current weather condition, and the current time;
and the traversing subunit is used for traversing each acquired image to obtain the environmental color temperature corresponding to each image.
Optionally, the white area determining subunit includes:
the pixel averaging subunit is used for calculating the average pixel value corresponding to each region in the selected image, wherein the average pixel value corresponding to a region is the average of the pixel values of all pixels in the region;
the stored pixel subunit is used for obtaining the stored pixel value corresponding to each region according to the average pixel value corresponding to the region, wherein the stored pixel value corresponding to a region is the pixel value stored in the correspondence table that is closest to the region's average pixel value;
the distance value determining subunit is used for determining the distance value corresponding to each region, wherein the distance value corresponding to a region is the distance between the region's average pixel value and its stored pixel value;
and the white determination subunit is used for determining a region whose distance value is less than a preset distance to be a white region, and determining a region whose distance value is greater than or equal to the preset distance to be a non-white region,
accordingly, the first color temperature determining subunit is specifically configured to:
if one or more areas in the selected image are white areas, searching the color temperature corresponding to the storage pixel value corresponding to each white area in the corresponding relation table, determining each searched color temperature as the environmental color temperature of the corresponding white area, and determining the environmental color temperature corresponding to the selected image according to the environmental color temperature corresponding to each white area.
Optionally, the white balance adjusting module 703 includes:
a target image determining unit, configured to determine an image currently displayed on a display screen of the terminal device as a target image;
and the target image adjusting unit is used for carrying out white balance adjustment on the target image according to the color temperature.
It should be noted that, for the information interaction, execution process, and other contents between the above-mentioned devices/units, the specific functions and technical effects thereof are based on the same concept as those of the embodiment of the method of the present application, and specific reference may be made to the part of the embodiment of the method, which is not described herein again.
Example four
Fig. 8 is a schematic diagram of a terminal device according to a fourth embodiment of the present application. As shown in fig. 8, the terminal device 8 of this embodiment includes: a processor 80, a memory 81, and a computer program 82 stored in the memory 81 and operable on the processor 80. The processor 80 implements the steps of the various method embodiments described above, such as steps S101 to S103 shown in fig. 1, when executing the computer program 82. Alternatively, the processor 80 implements the functions of the modules/units in the device embodiments, for example, the functions of the modules 701 to 703 shown in fig. 7, when executing the computer program 82.
Illustratively, the computer program 82 may be divided into one or more modules/units, which are stored in the memory 81 and executed by the processor 80 to complete the present application. The one or more modules/units may be a series of computer program instruction segments capable of performing specific functions, which are used to describe the execution process of the computer program 82 in the terminal device 8. For example, the computer program 82 can be divided into an image acquisition module, a color temperature estimation module and a white balance adjustment module, and the functions of the modules are as follows:
acquiring images of preset frame numbers continuously acquired by each camera, wherein the images of the preset frame numbers continuously acquired by each camera comprise the images currently acquired by the camera;
estimating the color temperature of the current environment according to the acquired images;
and carrying out white balance adjustment on a target image according to the color temperature, wherein the target image is one or more images in the images currently acquired by each camera.
The terminal device may include, but is not limited to, a processor 80 and a memory 81. Those skilled in the art will appreciate that fig. 8 is merely an example of a terminal device 8 and does not constitute a limitation of terminal device 8 and may include more or fewer components than shown, or some components may be combined, or different components, e.g., the terminal device may also include input-output devices, network access devices, buses, etc.
The Processor 80 may be a Central Processing Unit (CPU), other general purpose Processor, a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field Programmable Gate Array (FPGA) or other Programmable logic device, discrete Gate or transistor logic, discrete hardware components, etc. A general purpose processor may be a microprocessor or the processor may be any conventional processor or the like.
The memory 81 may be an internal storage unit of the terminal device 8, such as a hard disk or memory of the terminal device 8. The memory 81 may also be an external storage device of the terminal device 8, such as a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) card or a flash card provided on the terminal device 8. Further, the memory 81 may include both an internal storage unit and an external storage device of the terminal device 8. The memory 81 is used for storing the computer program and other programs and data required by the terminal device, and may also be used to temporarily store data that has been or is to be output.
It will be apparent to those skilled in the art that, for convenience and brevity of description, only the above-mentioned division of the functional units and modules is illustrated, and in practical applications, the above-mentioned functions may be distributed as different functional units and modules according to needs, that is, the internal structure of the apparatus may be divided into different functional units or modules to implement all or part of the above-mentioned functions. Each functional unit and module in the embodiments may be integrated in one processing unit, or each unit may exist alone physically, or two or more units are integrated in one unit, and the integrated unit may be implemented in a form of hardware, or in a form of software functional unit. In addition, specific names of the functional units and modules are only for convenience of distinguishing from each other, and are not used for limiting the protection scope of the present application. The specific working processes of the units and modules in the system may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
In the above embodiments, the descriptions of the respective embodiments have respective emphasis, and reference may be made to the related descriptions of other embodiments for parts that are not described or illustrated in a certain embodiment.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus/terminal device and method may be implemented in other ways. For example, the above-described embodiments of the apparatus/terminal device are merely illustrative, and for example, the division of the above modules or units is only one logical function division, and there may be other division manners in actual implementation, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated modules/units described above, if implemented in the form of software functional units and sold or used as separate products, may be stored in a computer readable storage medium. Based on such understanding, all or part of the flow in the method of the embodiments described above may be implemented by a computer program, which may be stored in a computer readable storage medium and used by a processor to implement the steps of the embodiments of the methods described above. The computer program includes computer program code, and the computer program code may be in a source code form, an object code form, an executable file or some intermediate form. The computer readable medium may include: any entity or device capable of carrying the above-mentioned computer program code, recording medium, usb disk, removable hard disk, magnetic disk, optical disk, computer Memory, Read-Only Memory (ROM), Random Access Memory (RAM), electrical carrier wave signal, telecommunication signal, software distribution medium, etc. It should be noted that the computer readable medium described above may include content that is subject to appropriate increase or decrease as required by legislation and patent practice in jurisdictions, for example, in some jurisdictions, computer readable media that does not include electrical carrier signals and telecommunications signals in accordance with legislation and patent practice.
The above embodiments are only used for illustrating the technical solutions of the present application, and not for limiting the same; although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; such modifications and substitutions do not substantially depart from the spirit and scope of the embodiments of the present application and are intended to be included within the scope of the present application.

Claims (7)

1. An image processing method is applied to a terminal device, the terminal device comprises a plurality of cameras, and the image processing method comprises the following steps:
acquiring images of preset frame numbers continuously acquired by each camera, wherein the images of the preset frame numbers continuously acquired by each camera comprise the images currently acquired by the camera;
estimating the color temperature of the current environment according to the acquired images;
carrying out white balance adjustment on a target image according to the color temperature, wherein the target image is one or more images in the images currently acquired by each camera;
the estimating of the color temperature of the current environment according to the acquired images comprises:
acquiring the ambient color temperature corresponding to each frame of image in each image;
carrying out weighted average on the color temperatures of all environments to obtain the color temperature of the current environment;
the weighted average of the color temperatures of the environments to obtain the color temperature of the current environment includes:
carrying out weighted average on each environmental color temperature according to a color temperature calculation formula to obtain the color temperature of the current environment, wherein the color temperature calculation formula is as follows:
$$T=\sum_{m=1}^{M}\sum_{k=0}^{K-1} w_{m,k}\,T_{m,k},\qquad \sum_{m=1}^{M}\sum_{k=0}^{K-1} w_{m,k}=1$$

wherein T is the color temperature of the current environment, M is the number of cameras of the terminal device, K is the preset number of frames, $T_{1,0},\dots,T_{M,0}$ are the ambient color temperatures corresponding to the images currently captured by the M cameras, $T_{1,1},\dots,T_{M,1}$ are the ambient color temperatures corresponding to the previous frame captured by each camera, $T_{1,K-1},\dots,T_{M,K-1}$ are the ambient color temperatures corresponding to the oldest of the K frames captured by each camera, and $w_{m,k}$ is the weight assigned to each ambient color temperature.
2. the image processing method according to claim 1, wherein a preset correspondence table is stored in the terminal device, and the correspondence table records correspondence information between each color temperature and a pixel value of a white pixel at each color temperature, wherein each color temperature corresponds to one pixel value;
correspondingly, the acquiring the ambient color temperature corresponding to each frame of image in each image includes:
selecting an image from the acquired images, and dividing the selected image into a plurality of areas;
determining whether each area in the selected image is a white area or not according to the corresponding relation table;
if one or more areas in the selected image are white areas, determining the environmental color temperature corresponding to each white area according to the corresponding relation table, and determining the environmental color temperature corresponding to the selected image according to the environmental color temperature corresponding to each white area;
if all the areas in the selected image are not white areas, determining the environmental color temperature corresponding to the selected image according to the location of the terminal equipment, the current weather condition and the current time;
and traversing each acquired image to obtain the ambient color temperature corresponding to each image.
3. The image processing method according to claim 2, wherein the determining whether each region in the selected image is a white region according to the correspondence table comprises:
calculating the average value of the pixel values corresponding to each area in the selected image, wherein the average value of the pixel values corresponding to each area is the average value of the pixel values of all pixel points in the area;
obtaining a storage pixel value corresponding to each region according to the pixel value average value corresponding to each region, wherein the storage pixel value corresponding to each region is the pixel value which is stored in the corresponding relation table and is closest to the pixel value average value corresponding to the region;
determining a distance value corresponding to each region, wherein the distance value corresponding to each region is the distance between the average value of the pixel values corresponding to the region and the corresponding storage pixel value;
determining an area having a distance value less than a preset distance as a white area, and determining an area having a distance value greater than or equal to the preset distance as a non-white area,
correspondingly, the determining the environmental color temperature corresponding to each white area according to the corresponding relation table includes:
and searching the color temperature corresponding to the storage pixel value corresponding to each white area in the corresponding relation table, and determining each searched color temperature as the environmental color temperature of the corresponding white area.
4. The image processing method according to any one of claims 1 to 3, wherein the white balance adjustment is performed on a target image according to the color temperature, the target image being one or more of images currently acquired by each camera, and the method includes:
determining an image currently displayed on a display screen of the terminal equipment as a target image;
and carrying out white balance adjustment on the target image according to the color temperature.
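Claim 4 leaves the adjustment itself open. A common choice, shown here as an assumption rather than as the patented method, is a von Kries style channel-gain correction that maps the white point at the estimated color temperature back to neutral.

```python
import numpy as np

def white_balance(image, ref_white):
    """Scale R, G and B so that `ref_white` (the stored white pixel value
    at the estimated color temperature) is mapped to neutral gray."""
    ref_white = np.asarray(ref_white, dtype=float)
    gains = ref_white.max() / ref_white        # per-channel gains
    return np.clip(image * gains, 0, 255).astype(np.uint8)

# Correct the currently displayed frame for a 4000 K estimate, using the
# illustrative table entry from the claim 3 sketch above.
frame = np.random.randint(0, 256, (480, 640, 3)).astype(float)
balanced = white_balance(frame, (255, 228, 206))
```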
5. An image processing apparatus applied to a terminal device including a plurality of cameras, the image processing apparatus comprising:
an image acquisition module, configured to acquire a preset number of frames of images continuously acquired by each camera, wherein the frames continuously acquired by each camera include the image currently acquired by that camera;
a color temperature estimation module, configured to estimate the color temperature of the current environment according to the acquired images;
a white balance adjustment module, configured to perform white balance adjustment on a target image according to the color temperature, wherein the target image is one or more of the images currently acquired by the cameras;
the color temperature estimation module includes:
a color temperature acquisition unit, configured to acquire the ambient color temperature corresponding to each frame of the acquired images;
a weighted average unit, configured to perform a weighted average of the ambient color temperatures to obtain the color temperature of the current environment;
wherein the weighted average unit is specifically configured to:
perform a weighted average of the ambient color temperatures according to a color temperature calculation formula to obtain the color temperature of the current environment, the color temperature calculation formula being:
T = Σ_{m=1}^{M} Σ_{k=1}^{K} w_{mk} · T_{mk}

wherein T is the color temperature of the current environment, M is the number of cameras of the terminal device, K is the preset frame number, T_{11}, T_{21}, …, T_{M1} are respectively the ambient color temperatures corresponding to the images currently acquired by the M cameras, T_{12}, T_{22}, …, T_{M2} are respectively the ambient color temperatures corresponding to the frames immediately preceding the currently acquired images, and so on through T_{1K}, T_{2K}, …, T_{MK}, the ambient color temperatures corresponding to the earliest of the K frames acquired by each camera, and w_{11}, …, w_{MK} are the respective weight values, the sum of all weight values being 1.
6. A terminal device comprising a memory, a processor and a computer program stored in the memory and executable on the processor, characterized in that the processor implements the steps of the method according to any one of claims 1 to 4 when executing the computer program.
7. A computer-readable storage medium storing a computer program which, when executed by a processor, carries out the steps of the method according to any one of claims 1 to 4.
CN201810965516.8A 2018-08-23 2018-08-23 Image processing method, image processing device and terminal equipment Expired - Fee Related CN108965835B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810965516.8A CN108965835B (en) 2018-08-23 2018-08-23 Image processing method, image processing device and terminal equipment

Publications (2)

Publication Number Publication Date
CN108965835A CN108965835A (en) 2018-12-07
CN108965835B (en) 2019-12-27

Family

ID=64473637

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810965516.8A Expired - Fee Related CN108965835B (en) 2018-08-23 2018-08-23 Image processing method, image processing device and terminal equipment

Country Status (1)

Country Link
CN (1) CN108965835B (en)

Families Citing this family (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110930455B (en) * 2019-11-29 2023-12-29 深圳市优必选科技股份有限公司 Positioning method, positioning device, terminal equipment and storage medium
CN111551265B (en) * 2020-04-03 2021-05-14 深圳市爱图仕影像器材有限公司 Color temperature measuring method and color temperature measuring device
CN113542711B (en) * 2020-04-14 2024-08-27 青岛海信移动通信技术有限公司 Image display method and terminal
CN113885823A (en) * 2020-07-02 2022-01-04 中国联合网络通信集团有限公司 Image color value adjusting method, device, equipment, system and storage medium
CN111800568B (en) * 2020-08-06 2021-11-05 珠海格力电器股份有限公司 Light supplement method and device
CN112087611B (en) * 2020-09-07 2022-10-21 Oppo广东移动通信有限公司 Electronic equipment and display screen adjusting method thereof
CN113676663B (en) * 2021-08-13 2023-07-18 驭新智行科技(宁波)有限公司 Camera white balance adjustment method and device, storage medium and terminal equipment
CN114554170B (en) * 2022-03-08 2024-06-11 三星半导体(中国)研究开发有限公司 Method for multi-sensor white balance synchronization and electronic device using same
CN117995137B (en) * 2024-04-07 2024-08-02 荣耀终端有限公司 Method for adjusting color temperature of display screen, electronic equipment and related medium

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101283604A (en) * 2005-08-30 2008-10-08 诺基亚公司 Image processing device with automatic white balance
CN103051804A (en) * 2012-12-28 2013-04-17 广东欧珀移动通信有限公司 Intelligent photo taking method and system of mobile terminal
CN104320642A (en) * 2014-10-11 2015-01-28 广东欧珀移动通信有限公司 Picture processing method and device
CN106713887A (en) * 2017-01-03 2017-05-24 捷开通讯(深圳)有限公司 Mobile terminal, and white balance adjustment method
CN107371007A (en) * 2017-07-25 2017-11-21 广东欧珀移动通信有限公司 White balancing treatment method, device and terminal
CN107911682A (en) * 2017-11-28 2018-04-13 广东欧珀移动通信有限公司 Image white balancing treatment method, device, storage medium and electronic equipment
CN107959851A (en) * 2017-12-25 2018-04-24 广东欧珀移动通信有限公司 Colour temperature detection method and device, computer-readable recording medium and computer equipment

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6494181B2 * 2014-05-30 2019-04-03 Canon Inc. Imaging device, control method thereof, and control program

Similar Documents

Publication Publication Date Title
CN108965835B (en) Image processing method, image processing device and terminal equipment
CN110113534B (en) Image processing method, image processing device and mobile terminal
CN109064390B (en) Image processing method, image processing device and mobile terminal
CN108737739B (en) Preview picture acquisition method, preview picture acquisition device and electronic equipment
CN105447864B (en) Processing method, device and the terminal of image
CN111381224B (en) Laser data calibration method and device and mobile terminal
CN109215037B (en) Target image segmentation method and device and terminal equipment
CN103312971A (en) Image processing device, image processing method and computer-readable medium
CN110717452B (en) Image recognition method, device, terminal and computer readable storage medium
WO2022156167A1 (en) Image processing method and apparatus, and electronic device, computer-readable storage medium, computer program and program product
CN113676713B (en) Image processing method, device, equipment and medium
CN108805838B (en) Image processing method, mobile terminal and computer readable storage medium
CN112102164A (en) Image processing method, device, terminal and storage medium
US20220360707A1 (en) Photographing method, photographing device, storage medium and electronic device
CN111667504A (en) Face tracking method, device and equipment
CN112188097B (en) Photographing method, photographing apparatus, terminal device, and computer-readable storage medium
WO2018076172A1 (en) Image display method and terminal
CN110738185B (en) Form object identification method, form object identification device and storage medium
CN111654637A (en) Focusing method, focusing device and terminal equipment
CN108932703B (en) Picture processing method, picture processing device and terminal equipment
CN105678696B (en) A kind of information processing method and electronic equipment
CN111861965B (en) Image backlight detection method, image backlight detection device and terminal equipment
CN108763491B (en) Picture processing method and device and terminal equipment
CN109444905B (en) Dynamic object detection method and device based on laser and terminal equipment
CN109089040B (en) Image processing method, image processing device and terminal equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee
Granted publication date: 20191227