CN111476848B - Video stream simulation method and device - Google Patents

Video stream simulation method and device

Info

Publication number
CN111476848B
CN111476848B (application number CN202010243473.XA)
Authority
CN
China
Prior art keywords
light intensity
video stream
correction
gamma correction
pixel
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202010243473.XA
Other languages
Chinese (zh)
Other versions
CN111476848A (en)
Inventor
吕书鹏
Current Assignee
Beijing Jingwei Hirain Tech Co Ltd
Original Assignee
Beijing Jingwei Hirain Tech Co Ltd
Priority date
Filing date
Publication date
Application filed by Beijing Jingwei Hirain Tech Co Ltd filed Critical Beijing Jingwei Hirain Tech Co Ltd
Priority to CN202010243473.XA priority Critical patent/CN111476848B/en
Publication of CN111476848A publication Critical patent/CN111476848A/en
Application granted granted Critical
Publication of CN111476848B publication Critical patent/CN111476848B/en

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 - Image analysis
    • G06T7/90 - Determination of colour characteristics
    • G - PHYSICS
    • G05 - CONTROLLING; REGULATING
    • G05B - CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B23/00 - Testing or monitoring of control systems or parts thereof
    • G05B23/02 - Electric testing or monitoring
    • G05B23/0205 - Electric testing or monitoring by means of a monitoring system capable of detecting and responding to faults
    • G05B23/0208 - Electric testing or monitoring by means of a monitoring system capable of detecting and responding to faults characterized by the configuration of the monitoring system
    • G05B23/0213 - Modular or universal configuration of the monitoring system, e.g. monitoring system having modules that may be combined to build monitoring program; monitoring system that can be applied to legacy systems; adaptable monitoring system; using different communication protocols
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 - Image analysis
    • G06T7/80 - Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N9/00 - Details of colour television systems
    • H04N9/64 - Circuits for processing colour signals
    • H04N9/68 - Circuits for processing colour signals for controlling the amplitude of colour signals, e.g. automatic chroma control circuits
    • H04N9/69 - Circuits for processing colour signals for controlling the amplitude of colour signals, e.g. automatic chroma control circuits for modifying the colour signals by gamma correction

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Theoretical Computer Science (AREA)
  • Automation & Control Theory (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Processing Of Color Television Signals (AREA)
  • Picture Signal Circuits (AREA)
  • Studio Devices (AREA)

Abstract

The invention provides a video stream simulation method and device, applied to the technical field of simulation. After an ideal video stream output by video simulation software is obtained, inverse gamma correction is applied to the ideal video stream according to the gamma correction formula of a target camera, yielding pre-correction pixels that correspond to the target camera and have not undergone gamma correction. The light intensity value of each pre-correction pixel after being filtered by its corresponding filter is then calculated, yielding original pixels represented by the filtered light intensity values, and finally the original video stream is generated from these original pixels. Through inverse gamma correction, the method undoes the changes that gamma correction made to each pixel of the ideal video stream; by calculating each pixel's light intensity value after filtering, it simulates how the filters in a physical camera act on light to obtain the original pixels and, finally, the original video stream. This realizes simulation of the raw video stream output by a camera without an ISP chip and provides an effective guarantee for controller testing.

Description

Video stream simulation method and device
Technical Field
The invention belongs to the technical field of simulation, and particularly relates to a video stream simulation method and device.
Background
As an important environment perception sensor, the camera is widely used in ADAS (Advanced Driver Assistance System) and AD (Autonomous Driving) systems. Given its important role, during laboratory testing of ADAS and AD controllers, an ideal video stream obtained through simulation software is often provided to the controller under test to evaluate its operating performance.
The most widely used cameras today work as follows: incoming light first passes through a Bayer filter arranged in the camera, which passes only specific wavelengths, i.e. light of specific colors; this light strikes a CMOS photosensitive array placed behind the Bayer filter, the array outputs an induced current to a CMOS chip, and the CMOS chip outputs the raw video stream. The raw video stream is then processed by an ISP (Image Signal Processor) to obtain the video stream delivered to the controller.
The ideal video stream output by existing simulation software can simulate the video stream output by such a camera well. However, as the resolution and frame rate of automotive-grade cameras increase, more and more cameras separate the ISP from the CMOS chip to reduce communication bandwidth, retaining only the CMOS chip in the camera; for such cameras, existing simulation software cannot provide the raw, pre-ISP video stream that the controller under test actually receives.
Disclosure of Invention
In view of this, an object of the present invention is to provide a video stream simulation method and apparatus that obtain an original (raw) video stream by inversely transforming an ideal video stream, providing an effective guarantee for controller testing. The specific scheme is as follows:
in a first aspect, the present invention provides a video stream simulation method, including:
acquiring an ideal video stream output by video simulation software; the simulation parameters of the ideal video stream are pre-configured in simulation software according to the parameters of the target camera;
according to a gamma correction formula corresponding to the target camera, performing reverse gamma correction on the ideal video stream to obtain pixels before gamma correction, which correspond to the target camera and are not subjected to gamma correction;
calculating the light intensity value of each pixel before correction after being filtered by a corresponding filter in the target camera to obtain an original pixel represented by the filtered light intensity value;
and generating an original video stream according to the original pixels.
Optionally, the performing inverse gamma correction on the ideal video stream according to a gamma correction formula corresponding to the target camera to obtain a pre-correction pixel corresponding to the target camera and not subjected to gamma correction includes:
performing inverse operation on a gamma correction formula corresponding to the target camera to obtain an inverse gamma correction formula;
and reversely restoring the ideal video stream according to the reverse gamma correction formula to obtain pixels before gamma correction, which correspond to the target camera and are not subjected to gamma correction.
Optionally, the calculating a light intensity value of each pixel before correction after being filtered by a corresponding filter in the target camera to obtain an original pixel represented by the filtered light intensity value includes:
respectively calculating the wavelength and the light intensity value of each pixel before correction;
determining a target light intensity conversion rate corresponding to the wavelength of each pre-corrected pixel according to the filter corresponding to the pre-corrected pixel;
and aiming at each pixel before correction, taking the product of the light intensity value of the pixel before correction and the corresponding target light intensity conversion rate as the light intensity value after filtering by a filter, and obtaining the original pixel represented by the filtered light intensity value.
Optionally, the separately calculating the wavelength and the light intensity value of each pixel before correction includes:
respectively carrying out color space conversion on each pixel before correction to obtain YUV parameters of each pixel before correction;
for each pixel before correction, calculating the wavelength of the pixel before correction according to the chroma and saturation in the YUV parameters; and,
and calculating to obtain the light intensity value of the pixel before correction according to the brightness in the YUV parameters.
Optionally, the determining a target light intensity conversion rate corresponding to the wavelength of the pixel before correction includes:
acquiring a photoelectric conversion characteristic curve of the filter, wherein the photoelectric conversion characteristic curve records the corresponding relation between the wavelength of light and the light intensity conversion rate;
and determining a target light intensity conversion rate corresponding to the wavelength of the pixel before correction according to the photoelectric conversion characteristic curve.
Optionally, the process of determining a filter corresponding to the pre-correction pixel includes:
acquiring the position coordinates of the pixels before correction;
and according to a preset mapping relation, determining a filter corresponding to the position coordinate in a plurality of filters included in the target camera to obtain a filter corresponding to the pixel before correction, wherein the preset mapping relation records the corresponding relation between the position coordinate of each pixel and each filter in the target camera.
In a second aspect, the present invention provides a video stream simulation apparatus, including:
the first acquisition unit is used for acquiring an ideal video stream output by the video simulation software; the simulation parameters of the ideal video stream are pre-configured in simulation software according to the parameters of the target camera;
the inverse correction unit is used for performing inverse gamma correction on the ideal video stream according to a gamma correction formula corresponding to the target camera to obtain pixels before gamma correction, corresponding to the target camera, and not performing gamma correction;
the calculation unit is used for calculating the light intensity value of each pixel before correction after being filtered by the corresponding filter in the target camera to obtain an original pixel represented by the filtered light intensity value;
and the generating unit is used for generating an original video stream according to the original pixels.
Optionally, the inverse correction unit is configured to perform inverse gamma correction on the ideal video stream according to a gamma correction formula corresponding to a target camera, and when obtaining a pre-correction pixel corresponding to the target camera and not subjected to gamma correction, the inverse correction unit specifically includes:
performing inverse operation on a gamma correction formula corresponding to the target camera to obtain an inverse gamma correction formula;
and reversely restoring the ideal video stream according to the reverse gamma correction formula to obtain pixels before gamma correction, which correspond to the target camera and are not subjected to gamma correction.
Optionally, the calculating unit is configured to calculate a light intensity value of each pixel before correction after being filtered by a corresponding filter in the target camera, and when an original pixel represented by the filtered light intensity value is obtained, the calculating unit specifically includes:
respectively calculating the wavelength and the light intensity value of each pixel before correction;
determining a target light intensity conversion rate corresponding to a wavelength of each of the pre-correction pixels for a filter corresponding to the pre-correction pixel;
and aiming at each pixel before correction, taking the product of the light intensity value of the pixel before correction and the corresponding target light intensity conversion rate as the light intensity value after filtering by a filter, and obtaining the original pixel represented by the filtered light intensity value.
Optionally, when the calculating unit is configured to calculate the wavelength and the light intensity value of each pixel before correction, the calculating unit specifically includes:
respectively carrying out color space conversion on each pixel before correction to obtain YUV parameters of each pixel before correction;
for each pixel before correction, calculating the wavelength of the pixel before correction according to the chroma and saturation in the YUV parameters; and,
and calculating to obtain the light intensity value of the pixel before correction according to the brightness in the YUV parameters.
After the ideal video stream output by the video simulation software is obtained, inverse gamma correction is applied to it according to the gamma correction formula of the target camera, yielding pre-correction pixels that correspond to the target camera and have not undergone gamma correction. The light intensity value of each pre-correction pixel after being filtered by its corresponding filter is then calculated to obtain original pixels represented by the filtered light intensity values, and finally the original video stream is generated from these original pixels. The simulation method provided by the invention undoes, through inverse gamma correction, the changes that gamma correction made to each pixel of the ideal video stream, obtaining the pixels as they were before gamma correction; it then simulates how the filters in a physical camera act on light by calculating each pixel's filtered light intensity value to obtain the original pixels, and combines all original pixels into the original video stream. This realizes simulation of the raw video stream output by a camera without an ISP chip, fills a gap in existing camera video stream simulation, and provides an effective guarantee for controller testing.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art will be briefly introduced below, and it is obvious that the drawings in the following description are some embodiments of the present invention, and for those skilled in the art, other drawings can be obtained according to these drawings without creative efforts.
Fig. 1 is a flowchart of a video stream simulation method according to an embodiment of the present invention;
fig. 2 is a photoelectric conversion characteristic curve of the camera filter;
fig. 3 is a block diagram of a video stream simulation apparatus according to an embodiment of the present invention;
FIG. 4 is a block diagram of another video stream simulation apparatus according to an embodiment of the present invention;
fig. 5 is a block diagram of a server according to an embodiment of the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present invention clearer, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are some, but not all, embodiments of the present invention. All other embodiments, which can be obtained by a person skilled in the art without making any creative effort based on the embodiments in the present invention, belong to the protection scope of the present invention.
As described above, in practical applications, cameras built on CMOS photosensitive arrays fall into two main types: the first type integrates an ISP; in the second type, the ISP is separated from the CMOS chip, and only the CMOS chip and the filters remain in the camera. For the second type, the ISP is placed at the video signal receiving end, reducing communication bandwidth and lowering data transmission requirements.
In physical structure, the second type of camera differs from the first only in that no ISP is provided; the remaining components, such as the filters and the CMOS chip, are identical. On this basis, most existing scene simulation software can simulate the video stream output by the first type of camera well, producing an ideal video stream, and is applied in ADAS/AD controller simulation. Therefore, to simulate the raw video stream output by the second type of camera, the most obvious idea is to follow the implementation of the scene simulation software for the first type and simply delete the part corresponding to the ISP.
However, the inventor found through research that although the ideal video stream output by existing scene simulation software simulates the video content output by the first type of camera well, the simulation process is not implemented strictly according to the working process of a physical camera and cannot be cleanly partitioned; that is, no functional module corresponding to the physical camera's ISP can be clearly separated out. Simulating the raw video stream forward on the basis of existing simulation software is therefore rather difficult.
The inventor therefore took the reverse approach: starting from the ideal video stream output by existing scene simulation software, the raw video stream output by the second type of camera is simulated by undoing the changes that the ISP's gamma correction and color correction applied to it.
Based on the above, embodiments of the present invention provide a video stream simulation method. The method may be applied to an electronic device with data processing capability, such as a notebook computer or a PC (personal computer); in some cases it may also be implemented by a server on the network side. Optionally, referring to fig. 1, which is a flowchart of a video stream simulation method provided by an embodiment of the present invention, the method may include:
and S100, acquiring an ideal video stream output by the video simulation software.
As described above, the video stream simulation method provided by the embodiment of the present invention builds on the ideal video stream output by existing simulation software. Therefore, the ideal video stream output by the video simulation software is obtained first; the simulation parameters of this ideal video stream are configured in the simulation software in advance according to the parameters of the target camera.
It should be noted that gamma correction in the ISP requires the video stream to be in RGB format, while the ideal video stream output by existing simulation software may come in multiple formats. Therefore, before the inverse gamma correction of S110 can be performed, the ideal video stream must first be converted to RGB format. This format conversion can be implemented with existing methods; the invention does not limit the specific conversion process.
And S110, performing inverse gamma correction on the ideal video stream according to a gamma correction formula corresponding to the target camera to obtain pixels before gamma correction, which correspond to the target camera and are not subjected to gamma correction.
Understandably, for the target camera described in the embodiment of the present invention, that is, the camera whose output video stream is to be simulated, the relevant parameters are all explicitly known, such as the positions of the filters in the target camera, the position coordinates of the pixels in each frame of image, and the gamma correction formula corresponding to the target camera.
Therefore, when performing inverse gamma correction on the ideal video stream, the gamma correction formula corresponding to the target camera is first obtained, and its inverse is taken to obtain the inverse gamma correction formula. Inverting the gamma correction formula can be done with standard mathematical inversion techniques; the invention does not limit the specific method.
After the inverse gamma correction formula is obtained, each pixel of the ideal video stream can be reversely restored according to the inverse gamma correction formula, the influence of the gamma correction on each pixel is restored, and finally the pixel before gamma correction corresponding to the target camera and not subjected to gamma correction is obtained.
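As an illustration, the inverse restoration can be sketched as follows, assuming the target camera uses a simple power-law encoding V_out = V_in^(1/γ); real camera formulas (e.g. piecewise ones) would need their own exact inverse. The function name and the γ value are illustrative, not from the patent:

```python
import numpy as np

def inverse_gamma(frame_8bit, gamma=2.2):
    """Undo power-law gamma encoding: recover linear pre-correction
    values in [0, 1] from an 8-bit gamma-encoded RGB frame."""
    normalized = np.asarray(frame_8bit, dtype=np.float64) / 255.0
    return normalized ** gamma  # exact inverse of V_in ** (1 / gamma)

# One-pixel "ideal" frame: mid-gray red channel, full-scale blue channel
frame = np.array([[[128, 64, 255]]], dtype=np.uint8)
linear = inverse_gamma(frame)
```

Applying the true inverse of the camera's own formula, rather than an approximation, is what makes the later filter simulation operate on physically meaningful linear intensities.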
And S120, calculating the light intensity value of each pixel before correction after being filtered by the corresponding filter in the target camera to obtain the original pixel represented by the filtered light intensity value.
In practical applications, the ISP can perform color space correction on the original video stream in addition to the gamma correction mentioned above, so that after obtaining the pixels before correction, it is further required to calculate the light intensity value of each pixel before correction after being filtered by the corresponding filter in the target camera, and further obtain the original pixels represented by the filtered light intensity value.
Optionally, as described above, gamma correction operates on pixels represented by RGB parameters, so even after inverse gamma correction the resulting pre-correction pixels are still represented by RGB parameters. It is therefore necessary to perform color space conversion on each obtained pre-correction pixel to obtain its YUV parameters.
Specifically, for any pre-correction pixel, the conversion from RGB parameters to YUV parameters can be realized by the following formulas:
Y=0.30R+0.59G+0.11B
U=0.493(B-Y)
V=0.877(R-Y)
as is well known, in a pixel expressed by a YUV parameter, Y represents the chromaticity of the pixel, U represents the luminance of the pixel, and V represents the saturation of the pixel. After the YUV parameters of each pixel before correction are obtained, the wavelength of each pixel before correction can be calculated according to the chroma and saturation in the YUV parameters, and correspondingly, the light intensity value of each pixel before correction can be calculated according to the brightness value in the YUV parameters.
Further, when the CMOS chip is in operation, each CMOS sensor in the photosensitive array outputs the light intensity filtered by its corresponding filter, and for a given filter, light of different wavelengths passes through it with different light intensity conversion rates. Optionally, referring to fig. 2, which shows the photoelectric conversion characteristic curves of a Bayer filter commonly used in cameras: the Bayer filter divides light entering the camera into three color components, each corresponding to one photoelectric conversion characteristic curve, and on any curve the light intensity conversion rate can be read off from the wavelength of the light.
Meanwhile, for a given target camera, the position coordinate of each pixel output by the camera is known, and correspondingly, the position of each CMOS sensor in the array is also known.
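The position-coordinate-to-filter mapping described here (the "preset mapping relation" of the claims) can be sketched for a hypothetical RGGB Bayer layout; the actual arrangement is a known parameter of the target camera:

```python
# Hypothetical RGGB layout: the filter pattern repeats every 2x2 pixels.
BAYER_RGGB = {(0, 0): 'R', (0, 1): 'G', (1, 0): 'G', (1, 1): 'B'}

def filter_for_pixel(row, col):
    """Return the color filter covering the pixel at (row, col)."""
    return BAYER_RGGB[(row % 2, col % 2)]
```

For a periodic pattern like this, the modular lookup replaces storing a full per-pixel mapping table.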
The photoelectric conversion characteristic curve of the filter corresponding to each pre-correction pixel is acquired, and the light intensity conversion rate corresponding to the wavelength of each pre-correction pixel is determined from that curve, yielding the target light intensity conversion rate for each pre-correction pixel.
Finally, for each pre-correction pixel expressed by wavelength and light intensity value, the light intensity value is multiplied by the corresponding target light intensity conversion rate; the product is the light intensity value after filtering by the corresponding filter, finally yielding the original pixel expressed by the filtered light intensity value.
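The curve lookup and the intensity product can be sketched with linear interpolation over sampled curve points; the sample values below are illustrative, while the real curves come from the camera's datasheet (fig. 2):

```python
import numpy as np

# Illustrative samples of a red filter's photoelectric conversion
# characteristic curve: (wavelength in nm, light intensity conversion rate)
RED_CURVE_WL = np.array([450.0, 550.0, 600.0, 650.0, 700.0])
RED_CURVE_RATE = np.array([0.02, 0.10, 0.60, 0.90, 0.70])

def filtered_intensity(wavelength_nm, intensity,
                       curve_wl=RED_CURVE_WL, curve_rate=RED_CURVE_RATE):
    """Original-pixel intensity = pre-correction intensity multiplied by
    the target conversion rate read off the filter's curve."""
    rate = np.interp(wavelength_nm, curve_wl, curve_rate)  # linear lookup
    return intensity * rate
```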
And S130, generating an original video stream according to the original pixels.
Each pixel in the ideal video stream is traversed and restored through the above steps to obtain all original pixels, and the original video stream is then generated from these original pixels. Generating the original video stream from the original pixels can follow existing video stream generation methods; the invention does not limit this.
In summary, the simulation method provided by the invention undoes, through inverse gamma correction, the changes that gamma correction made to each pixel of the ideal video stream, obtaining the pixels as they were before gamma correction; it then simulates how the filters in a physical camera act on light by calculating each pixel's light intensity value after filtering, obtaining the original pixels; finally it combines all original pixels into the original video stream. This realizes simulation of the raw video stream output by a camera without an ISP chip, fills a gap in existing camera video stream simulation, and provides an effective guarantee for controller testing.
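Taken together, steps S100-S130 can be sketched end-to-end under stated simplifications: power-law gamma, an RGGB Bayer layout, and each filter passing exactly its own channel's linear component (conversion rate 1) in place of the full curve lookup. All names are illustrative, not from the patent:

```python
import numpy as np

def simulate_raw_frame(ideal_rgb, gamma=2.2):
    """H x W x 3 8-bit ideal frame -> H x W raw mosaic of linear
    light intensities, one filtered value per CMOS sensor."""
    # Step 1: inverse gamma correction (assumed power law)
    linear = (np.asarray(ideal_rgb, dtype=np.float64) / 255.0) ** gamma
    h, w, _ = linear.shape
    rows, cols = np.mgrid[0:h, 0:w]
    # Steps 2-3: per-position filter selection (RGGB) and filtering;
    # here each filter keeps only its own channel's linear component.
    bayer = np.array([[0, 1], [1, 2]])  # channel index per 2x2 position
    channel = bayer[rows % 2, cols % 2]
    return linear[rows, cols, channel]

raw = simulate_raw_frame(np.full((2, 2, 3), 255, dtype=np.uint8))
```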
The following briefly introduces an application of the video stream simulation method provided by the embodiment of the present invention in combination with an actual test scenario.
Firstly, before testing a controller to be tested, a camera connected with the controller to be tested in an actual product needs to be parameterized, namely a virtual camera is configured in scene simulation software according to parameters of a real camera.
Then, the gamma correction formula parameters of the ISP matched with the real camera and the photoelectric conversion characteristic curves of the filters in the camera are determined and prepared in advance. In the execution of the video stream simulation method provided by the above embodiments, these are all treated as known parameters.
Finally, specific application modes of the video stream simulation method provided by the embodiment of the invention are selected according to different test scenes of the controller to be tested.
Specifically, if an HIL (Hardware-in-the-Loop) scene test is performed on the controller under test, that is, when the camera's video output is simulated by a simulation board, the software corresponding to the video stream simulation method provided by the embodiment of the present invention needs to be loaded onto the simulation board and used as a post-processing stage before the video simulation board outputs its signal.
If an MIL (Model-in-the-Loop) scene test is performed on the controller under test, the software corresponding to the video stream simulation method provided by the embodiment of the present invention is used as a plug-in to the camera model of existing scene simulation software: the ideal video stream output by the scene simulation software is obtained directly, and after the processing of the foregoing steps, the raw video stream unprocessed by the ISP is obtained, enabling the test of the controller under test.
The video stream simulation apparatus provided by the embodiment of the present invention is introduced below. The apparatus described below can be regarded as the functional module architecture that an electronic device needs in order to implement the video stream simulation method provided by the embodiment of the present invention; the description below and the description above may be cross-referenced.
Optionally, referring to fig. 3, fig. 3 is a block diagram of a video stream simulation apparatus according to an embodiment of the present invention, where the apparatus may include:
a first obtaining unit 10, configured to obtain an ideal video stream output by video simulation software, wherein the simulation parameters of the ideal video stream are pre-configured in the simulation software according to the parameters of a target camera;
an inverse correction unit 20, configured to perform inverse gamma correction on the ideal video stream according to a gamma correction formula corresponding to the target camera, so as to obtain pre-correction pixels that correspond to the target camera and have not been gamma-corrected;
a calculating unit 30, configured to calculate the light intensity value of each pre-correction pixel after it is filtered by the corresponding filter in the target camera, so as to obtain original pixels represented by the filtered light intensity values;
a generating unit 40, configured to generate an original video stream from the original pixels.
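The cooperation of the four units above can be sketched as a simple pipeline. The function and parameter names below are illustrative, not from the patent; `inverse_gamma` and `filter_response` stand in for the per-pixel operations of units 20 and 30.

```python
# Hypothetical sketch of the four-unit pipeline (names are illustrative).

def simulate_raw_stream(ideal_frames, inverse_gamma, filter_response):
    """Turn an ideal (post-ISP-looking) video stream back into a raw stream."""
    raw_frames = []
    for frame in ideal_frames:                      # first obtaining unit 10
        pre = [inverse_gamma(p) for p in frame]     # inverse correction unit 20
        raw = [filter_response(p) for p in pre]     # calculating unit 30
        raw_frames.append(raw)                      # generating unit 40
    return raw_frames
```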
Optionally, the inverse correction unit 20, when performing inverse gamma correction on the ideal video stream according to the gamma correction formula corresponding to the target camera to obtain the pre-correction pixels that correspond to the target camera and have not been gamma-corrected, is specifically configured to:
perform an inverse operation on the gamma correction formula corresponding to the target camera to obtain an inverse gamma correction formula; and
inversely restore the ideal video stream according to the inverse gamma correction formula to obtain the pre-correction pixels that correspond to the target camera and have not been gamma-corrected.
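The patent does not reproduce the gamma correction formula itself; a common form is V_out = V_in^(1/γ), whose inverse operation is simply V^γ. The sketch below assumes that form and an illustrative exponent of 2.2.

```python
GAMMA = 2.2  # illustrative value; the real exponent comes from the target camera

def gamma_correct(v, gamma=GAMMA):
    """Forward gamma correction of a value normalized to [0, 1]."""
    return v ** (1.0 / gamma)

def inverse_gamma_correct(v, gamma=GAMMA):
    """Inverse operation on the formula above: recovers the pre-correction value."""
    return v ** gamma
```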
Optionally, the calculating unit 30, when calculating the light intensity value of each pre-correction pixel after filtering by the corresponding filter in the target camera to obtain the original pixels represented by the filtered light intensity values, is specifically configured to:
calculate the wavelength and the light intensity value of each pre-correction pixel;
determine, according to the filter corresponding to each pre-correction pixel, the target light intensity conversion rate corresponding to the wavelength of that pixel; and
for each pre-correction pixel, take the product of its light intensity value and the corresponding target light intensity conversion rate as the light intensity value after filtering, thereby obtaining the original pixel represented by the filtered light intensity value.
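The product step described above is straightforward; a minimal sketch, assuming each pre-correction pixel has already been reduced to a (wavelength, intensity) pair and `rate_for` is a caller-supplied lookup for the filter's light intensity conversion rate (both names are illustrative):

```python
def apply_filter(pre_pixels, rate_for):
    """pre_pixels: iterable of (wavelength_nm, intensity) pairs.
    rate_for: maps a wavelength to the filter's light intensity conversion rate.
    Returns the filtered light intensity values (the 'original pixels')."""
    return [intensity * rate_for(wavelength) for wavelength, intensity in pre_pixels]
```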
Optionally, the calculating unit 30, when calculating the wavelength and the light intensity value of each pre-correction pixel, is specifically configured to:
perform color space conversion on each pre-correction pixel to obtain its YUV parameters;
calculate the wavelength of each pre-correction pixel according to the chroma and saturation in the YUV parameters; and
calculate the light intensity value of each pre-correction pixel according to the luminance in the YUV parameters.
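A sketch of this step under stated assumptions: the RGB-to-YUV conversion uses the standard BT.601 coefficients, luminance Y stands in for the light intensity value, and the hue-angle-to-wavelength map is a crude linear placeholder (the patent does not specify the actual mapping).

```python
import math

def rgb_to_yuv(r, g, b):
    # BT.601 conversion; inputs normalized to [0, 1]
    y = 0.299 * r + 0.587 * g + 0.114 * b
    u = 0.492 * (b - y)
    v = 0.877 * (r - y)
    return y, u, v

def wavelength_and_intensity(r, g, b):
    y, u, v = rgb_to_yuv(r, g, b)
    hue = math.degrees(math.atan2(v, u)) % 360          # hue angle in the U-V plane
    # Crude linear hue -> wavelength map over the visible band (illustrative only)
    wavelength_nm = 620.0 - (hue / 360.0) * (620.0 - 450.0)
    return wavelength_nm, y                             # Y stands in for intensity
```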
Optionally, the calculating unit 30, when determining the target light intensity conversion rate corresponding to the wavelength of each pre-correction pixel, is specifically configured to:
acquire the photoelectric conversion characteristic curve of the filter, wherein the photoelectric conversion characteristic curve records the correspondence between the wavelength of light and the light intensity conversion rate; and
determine, from the photoelectric conversion characteristic curve, the target light intensity conversion rate corresponding to the wavelength of the pre-correction pixel.
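In practice the characteristic curve is typically available only as sampled points, so the lookup can be realized by interpolating between samples; a minimal sketch with hypothetical curve data:

```python
def make_rate_lookup(curve_points):
    """curve_points: (wavelength_nm, conversion_rate) samples of the filter's
    photoelectric conversion characteristic curve, sorted by wavelength."""
    def rate_for(wavelength):
        # Clamp to the ends of the sampled curve
        if wavelength <= curve_points[0][0]:
            return curve_points[0][1]
        if wavelength >= curve_points[-1][0]:
            return curve_points[-1][1]
        for (w0, r0), (w1, r1) in zip(curve_points, curve_points[1:]):
            if w0 <= wavelength <= w1:
                t = (wavelength - w0) / (w1 - w0)
                return r0 + t * (r1 - r0)   # linear interpolation between samples
    return rate_for
```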
Optionally, referring to fig. 4, fig. 4 is a block diagram of a structure of another video stream simulation apparatus provided in an embodiment of the present invention, and on the basis of the embodiment shown in fig. 3, the apparatus further includes:
a second acquisition unit 50, configured to acquire the position coordinates of the pre-correction pixels;
a determining unit 60, configured to determine, according to a preset mapping relationship, the filter corresponding to each position coordinate among the plurality of filters included in the target camera, so as to obtain the filter corresponding to each pre-correction pixel, wherein the preset mapping relationship records the correspondence between the position coordinates of each pixel and each filter in the target camera.
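For a typical color camera the preset mapping relationship reduces to the sensor's color filter array layout. A sketch assuming an RGGB Bayer pattern (one common arrangement; the actual layout is camera-specific and not given in the patent):

```python
def bayer_filter_at(row, col):
    """Map a pixel's position coordinate to its color filter under an
    RGGB Bayer layout (illustrative; real cameras may differ)."""
    if row % 2 == 0:
        return 'R' if col % 2 == 0 else 'G'
    return 'G' if col % 2 == 0 else 'B'
```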
Optionally, referring to fig. 5, fig. 5 is a block diagram of a server according to an embodiment of the present invention. As shown in fig. 5, the server may include: at least one processor 100, at least one communication interface 200, at least one memory 300, and at least one communication bus 400;
in the embodiment of the present invention, there is at least one of each of the processor 100, the communication interface 200, the memory 300, and the communication bus 400, and the processor 100, the communication interface 200, and the memory 300 communicate with each other through the communication bus 400; clearly, the communication connections among the processor 100, the communication interface 200, the memory 300, and the communication bus 400 shown in fig. 5 are merely optional;
optionally, the communication interface 200 may be an interface of a communication module, such as an interface of a GSM module;
The processor 100 may be a central processing unit (CPU), an application-specific integrated circuit (ASIC), or one or more integrated circuits configured to implement an embodiment of the present invention.
The memory 300, which stores application programs, may include high-speed RAM and may also include non-volatile memory, such as at least one magnetic disk memory.
The processor 100 is specifically configured to execute an application program in the memory to implement any of the embodiments of the video stream simulation method described above.
The embodiments in this description are described in a progressive manner; each embodiment focuses on its differences from the other embodiments, and for the same or similar parts the embodiments may refer to one another. Since the apparatus disclosed in an embodiment corresponds to the method disclosed in an embodiment, its description is brief; for relevant details, refer to the description of the method.
Those of skill would further appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, computer software, or combinations of both, and that the various illustrative components and steps have been described above generally in terms of their functionality in order to clearly illustrate this interchangeability of hardware and software. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present invention.
The steps of a method or algorithm described in connection with the embodiments disclosed herein may be embodied directly in hardware, in a software module executed by a processor, or in a combination of the two. A software module may reside in Random Access Memory (RAM), memory, read-only memory (ROM), electrically programmable ROM, electrically erasable programmable ROM, registers, hard disk, a removable disk, a CD-ROM, or any other form of storage medium known in the art.
The previous description of the disclosed embodiments is provided to enable any person skilled in the art to make or use the present invention. Various modifications to these embodiments will be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other embodiments without departing from the spirit or scope of the invention. Thus, the present invention is not intended to be limited to the embodiments shown herein but is to be accorded the widest scope consistent with the principles and novel features disclosed herein.

Claims (10)

1. A method for video stream simulation, comprising:
acquiring an ideal video stream output by video simulation software; the simulation parameters of the ideal video stream are pre-configured in simulation software according to the parameters of the target camera;
performing inverse gamma correction on the ideal video stream according to a gamma correction formula corresponding to the target camera, to obtain pre-correction pixels that correspond to the target camera and have not been subjected to gamma correction;
calculating the light intensity value of each pixel before correction after being filtered by a corresponding filter in the target camera to obtain an original pixel represented by the filtered light intensity value;
and generating an original video stream according to the original pixels.
2. The method of claim 1, wherein performing inverse gamma correction on the ideal video stream according to a gamma correction formula corresponding to the target camera to obtain pre-corrected pixels corresponding to the target camera without gamma correction comprises:
performing inverse operation on the gamma correction formula corresponding to the target camera to obtain an inverse gamma correction formula;
and reversely reducing the ideal video stream according to the reverse gamma correction formula to obtain pixels before correction, which correspond to the target camera and are not subjected to gamma correction.
3. The method of claim 1, wherein the calculating the light intensity value of each pixel before correction after being filtered by a corresponding filter in the target camera to obtain an original pixel represented by the filtered light intensity value comprises:
respectively calculating the wavelength and the light intensity value of each pixel before correction;
determining a target light intensity conversion rate corresponding to a wavelength of each of the pre-correction pixels, based on a filter corresponding to each of the pre-correction pixels;
and aiming at each pixel before correction, taking the product of the light intensity value of the pixel before correction and the corresponding target light intensity conversion rate as the light intensity value after filtering by a filter, and obtaining the original pixel represented by the filtered light intensity value.
4. The method of claim 3, wherein said separately calculating the wavelength and intensity values of each of said pre-corrected pixels comprises:
respectively carrying out color space conversion on each pixel before correction to obtain a YUV parameter of each pixel before correction;
for each pixel before correction, calculating the wavelength of the pixel before correction according to the chroma and saturation in the YUV parameters; and
and calculating to obtain the light intensity value of the pixel before correction according to the brightness in the YUV parameters.
5. The method of claim 3, wherein said determining a target light intensity conversion ratio corresponding to a wavelength of each of said pre-corrected pixels comprises:
acquiring a photoelectric conversion characteristic curve of the filter, wherein the photoelectric conversion characteristic curve records the corresponding relation between the wavelength of light and the light intensity conversion rate;
and determining a target light intensity conversion rate corresponding to the wavelength of each pixel before correction according to the photoelectric conversion characteristic curve.
6. The method according to any of claims 3-5, wherein the process of determining the filter corresponding to each of the pre-correction pixels comprises:
acquiring the position coordinates of each pixel before correction;
and determining a filter corresponding to the position coordinate in a plurality of filters included in the target camera according to a preset mapping relation to obtain the filter corresponding to each pixel before correction, wherein the preset mapping relation records the corresponding relation between the position coordinate of each pixel and each filter in the target camera.
7. A video stream simulation apparatus, comprising:
the first acquisition unit is used for acquiring an ideal video stream output by the video simulation software; the simulation parameters of the ideal video stream are pre-configured in simulation software according to the parameters of the target camera;
the inverse correction unit is used for performing inverse gamma correction on the ideal video stream according to a gamma correction formula corresponding to the target camera to obtain pixels before gamma correction, corresponding to the target camera, and not performing gamma correction;
the calculation unit is used for calculating the light intensity value of each pixel before correction after being filtered by the corresponding filter in the target camera to obtain an original pixel represented by the filtered light intensity value;
and the generating unit is used for generating an original video stream according to the original pixels.
8. The video stream simulation apparatus according to claim 7, wherein the inverse correction unit is configured to perform inverse gamma correction on the ideal video stream according to the gamma correction formula corresponding to the target camera, so as to obtain the pre-correction pixels corresponding to the target camera without performing gamma correction, and specifically includes:
performing inverse operation on the gamma correction formula corresponding to the target camera to obtain an inverse gamma correction formula;
and reversely reducing the ideal video stream according to the reverse gamma correction formula to obtain pixels before correction, which correspond to the target camera and are not subjected to gamma correction.
9. The video stream simulation apparatus according to claim 7, wherein the calculating unit is configured to calculate the light intensity value of each pixel before correction after being filtered by the corresponding filter in the target camera, and when obtaining the original pixel represented by the filtered light intensity value, specifically includes:
respectively calculating the wavelength and the light intensity value of each pixel before correction;
determining a target light intensity conversion rate corresponding to a wavelength of each of the pre-correction pixels, based on a filter corresponding to each of the pre-correction pixels;
and aiming at each pixel before correction, taking the product of the light intensity value of the pixel before correction and the corresponding target light intensity conversion rate as the light intensity value filtered by a filter to obtain the original pixel represented by the filtered light intensity value.
10. The video stream simulation apparatus according to claim 9, wherein the calculating unit, when calculating the wavelength and the light intensity value of each pixel before correction, specifically comprises:
respectively carrying out color space conversion on each pixel before correction to obtain YUV parameters of each pixel before correction;
for each pixel before correction, calculating the wavelength of the pixel before correction according to the chroma and saturation in the YUV parameters; and
and calculating to obtain the light intensity value of the pixel before correction according to the brightness in the YUV parameters.
CN202010243473.XA 2020-03-31 2020-03-31 Video stream simulation method and device Active CN111476848B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010243473.XA CN111476848B (en) 2020-03-31 2020-03-31 Video stream simulation method and device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010243473.XA CN111476848B (en) 2020-03-31 2020-03-31 Video stream simulation method and device

Publications (2)

Publication Number Publication Date
CN111476848A CN111476848A (en) 2020-07-31
CN111476848B true CN111476848B (en) 2023-04-18

Family

ID=71749317

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010243473.XA Active CN111476848B (en) 2020-03-31 2020-03-31 Video stream simulation method and device

Country Status (1)

Country Link
CN (1) CN111476848B (en)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101175221A (en) * 2006-10-30 2008-05-07 华为技术有限公司 Method and apparatus for emending gamma characteristic of video display equipment in video communication
JP2012028937A (en) * 2010-07-21 2012-02-09 Nippon Hoso Kyokai <Nhk> Video signal correction apparatus and video signal correction program
CN103313079A (en) * 2012-03-15 2013-09-18 索尼公司 Display device, image processing device, image processing method, and computer program
CN104853171A (en) * 2014-02-17 2015-08-19 索尼公司 Image processing apparatus, method for image processing, and program
CN106205474A (en) * 2014-11-11 2016-12-07 三星显示有限公司 Data handling equipment and the display apparatus with data handling equipment
CN106210883A (en) * 2016-08-11 2016-12-07 浙江大华技术股份有限公司 A kind of method of Video Rendering, equipment
CN108182672A (en) * 2014-05-28 2018-06-19 皇家飞利浦有限公司 Method and apparatus for the method and apparatus encoded to HDR image and for using such coded image

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2003319412A (en) * 2002-04-19 2003-11-07 Matsushita Electric Ind Co Ltd Image processing back-up system, image processor, and image display device


Also Published As

Publication number Publication date
CN111476848A (en) 2020-07-31

Similar Documents

Publication Publication Date Title
Karaimer et al. A software platform for manipulating the camera imaging pipeline
US10645268B2 (en) Image processing method and apparatus of terminal, and terminal
EP2523160A1 (en) Image processing device, image processing method, and program
CN109729332B (en) Automatic white balance correction method and system
CN108337496B (en) White balance processing method, processing device, processing equipment and storage medium
JP4313370B2 (en) Method for reducing electronic image aliasing
CN112073703B (en) Method and device for adjusting color correction matrix, terminal equipment and medium
CN105049718A (en) Image processing method and terminal
CN111784603A (en) RAW domain image denoising method, computer device and computer readable storage medium
CN111161188B (en) Method for reducing image color noise, computer device and readable storage medium
CN108230407B (en) Image processing method and device
WO2024027287A1 (en) Image processing system and method, and computer-readable medium and electronic device
CN113298192A (en) Fusion method and device of infrared light image and visible light image and storage medium
WO2017154293A1 (en) Image processing apparatus, imaging apparatus, image processing method, and program
CN111476848B (en) Video stream simulation method and device
CN109754374A (en) A kind of method and device removing brightness of image noise
CN117274060B (en) Unsupervised end-to-end demosaicing method and system
US7512264B2 (en) Image processing
CN111917986A (en) Image processing method, medium thereof, and electronic device
CN103313066B (en) Interpolation method and device
CN105991937A (en) Virtual exposure method and device based on Bayer format image
CN110969675B (en) Method for simulating blurring of different-shape diaphragms of camera
CN117408872B (en) Color image data conversion method, device, equipment and storage medium
CN110089103A (en) A kind of demosaicing methods and device
KR20210107955A (en) Color stain analyzing method and electronic device using the method

Legal Events

Date Code Title Description
SE01 Entry into force of request for substantive examination
CB02 Change of applicant information

Address after: 4 / F, building 1, No.14 Jiuxianqiao Road, Chaoyang District, Beijing 100020

Applicant after: Beijing Jingwei Hirain Technologies Co.,Inc.

Address before: 8 / F, block B, No. 11, Anxiang Beili, Chaoyang District, Beijing 100101

Applicant before: Beijing Jingwei HiRain Technologies Co.,Ltd.

GR01 Patent grant