IE20060558U1 - Image blurring - Google Patents

Image blurring

Info

Publication number
IE20060558U1
IE20060558U1 (IE2006/0558A)
Authority
IE
Ireland
Prior art keywords
image
pixel
flash
pixels
saturated
Prior art date
Application number
IE2006/0558A
Other versions
IES84403Y1 (en)
Inventor
Drimbarean Alexandru
Original Assignee
Fotonation Vision Limited
Filing date
Publication date
Application filed by Fotonation Vision Limited
Publication of IE20060558U1
Publication of IES84403Y1

Classifications

    • G06T7/0081
    • G06T7/0097

Abstract

A method of blurring an image comprises acquiring a first image of a scene taken at a first exposure level; acquiring a second image of nominally the same scene taken at a second exposure level, wherein at least one region of the second image includes pixels having saturated intensity values; deriving for at least some of the saturated pixels, values extrapolated from the first image; blurring at least a portion of a third image including pixels having the extrapolated values; and re-scaling the blurred portion of the third image.

Description

Image Blurring

The present invention relates to image blurring, and in particular, to a system and method for creating blur in an image to reduce the depth of field of the image.
In digital cameras the depth of field (DOF) is typically much greater than for conventional cameras because the image sensor is somewhat smaller than a 35mm film negative. This means that portrait images, in particular, will tend to have the background in sharp focus, which is often not desirable as a photographer may wish to emphasize a person's face and de-emphasize the background of the picture.
This problem has typically been corrected by careful photography combined with careful use of camera settings.
Alternatively, portrait images are often blurred semi-manually by professional photographers using desktop computer image processing software after an image has been captured. It will be appreciated that this requires manual intervention and is often time-consuming.
Nonetheless, such conventional blurring software applies various techniques using convolution kernels to create the blurring effects, as illustrated in Figures 1 and 2.
Generically, convolution can be expressed according to the equation below: B = I* g where B is the blurred image, I is the original image and g is the convolution kernel.
Traditional convolution blur is applied on a pixel-by-pixel basis. So, for a particular pixel with coordinates (x,y), the convolution with a kernel of size (M x N) can be written as:

B(x,y) = Σ_{i=0}^{M-1} Σ_{j=0}^{N-1} I(x-i, y-j) · g(i,j)

The size and shape of the kernel influence the blurring result. The size determines the strength of the blur and therefore the perceived depth of the object. The shape determines the visual aspect of the blur and is related to what is called the "circle of confusion".
A circular kernel of diameter D has the following analytical form:

g(x,y) = 4/(πD²) if x² + y² ≤ (D/2)², and 0 otherwise

and a geometrical shape of a cylinder or "pillbox", as is illustrated in Figure 1.
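A minimal sketch of such a circular "pillbox" kernel and the pixel-by-pixel convolution it is used in (assuming numpy; this is an illustration of the general technique, not the patent's implementation):

```python
import numpy as np

def pillbox_kernel(diameter):
    """Circular ("pillbox") kernel: constant inside a disc of the given
    diameter, zero outside, normalised so its values sum to 1."""
    r = diameter / 2.0
    n = int(np.ceil(diameter))
    if n % 2 == 0:
        n += 1  # odd size so the kernel has a well-defined centre
    y, x = np.mgrid[:n, :n] - n // 2
    k = (x * x + y * y <= r * r).astype(float)
    return k / k.sum()

def convolve2d(image, kernel):
    """Simple 2-D convolution with zero padding at the borders; since the
    pillbox kernel is symmetric, correlation and convolution coincide."""
    kh, kw = kernel.shape
    ph, pw = kh // 2, kw // 2
    padded = np.pad(image, ((ph, ph), (pw, pw)))
    out = np.zeros_like(image, dtype=float)
    for i in range(kh):
        for j in range(kw):
            out += kernel[i, j] * padded[i:i + image.shape[0],
                                         j:j + image.shape[1]]
    return out

k = pillbox_kernel(5)
```

Because the kernel sums to 1, convolving a constant image leaves it unchanged, matching the property that a lens neither amplifies nor reduces the total light.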
Referring now to Figure 2, the effects of a convolution kernel on a row of pixels within a flash image of a scene can be seen. The most intense (brightest) areas of the original image undergo saturation clipping to the maximum of the dynamic intensity range (255 for an 8 bit/pixel image) as depicted by dashed lines 20. Due to convolution, a resulting blurred image lacks the contrast and sharpness present in the scene, therefore creating a less appealing visual effect. As such, conventional blurring techniques do not achieve realistic results and do not resemble an image with a shallow DOF as desired.
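The saturation clipping described above can be shown on a toy row of scene intensities (assuming numpy; the values are made up for illustration):

```python
import numpy as np

# Hypothetical scene radiance along one row; some values exceed the
# 8 bit/pixel dynamic range of the sensor.
scene_row = np.array([40.0, 120.0, 260.0, 400.0, 310.0, 90.0])

# Saturation clipping: everything above 255 collapses to 255, so the
# relative brightness of the clipped pixels is lost.
clipped = np.minimum(scene_row, 255.0)
```

After clipping, the three bright pixels are indistinguishable at 255, which is exactly the information the method below recovers from the non-flash preview.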
Thus, it is an object of the present invention to provide means for obtaining a more realistic blur similar to the low depth-of-field blur generated by a conventional camera.
According to the present invention there is provided a method as claimed in claim 1.
The present invention has the advantages of mitigating low contrast and sharpness associated with prior art blurring techniques. Furthermore, the present invention emulates lens effects of circle of confusion, as is described in more detail below.
Embodiments of the invention will now be described, by way of example, with reference to the accompanying drawings, in which:

Figure 1 depicts a conventional circular convolution kernel;

Figure 2 depicts conventional image blurring using a circular convolution kernel;

Figure 3 illustrates a flow diagram of the preferred embodiment of the present invention;

Figure 4 depicts a graphical representation of a portion of a non-flash image, a flash image and a high dynamic range (HDR) image derived therefrom according to the preferred embodiment of the present invention; and

Figure 5 depicts a graphical representation of a portion of a blurred image of the HDR image of Figure 4 and a re-scaled blurred image derived therefrom according to the preferred embodiment of the present invention.
As illustrated in Figure 2, saturation clipping leads to a loss of high dynamic range (HDR) information about the image. In the preferred embodiment of the present invention, this information is recovered to provide a more realistic blurring of an image of a scene.
Referring now to Figure 3, prior to capturing 110 a final flash image of a scene, a low-resolution non-flash image nominally of the same scene (hereinafter referred to as a preview) is also captured 100. However, it will be appreciated that alternatively the preview may comprise any one or more of a series of captured non-flash low-resolution images.
Alternatively, the non-flash image of the scene can be captured after the final flash version and as such is a post-view image, but for ease of reference, we will use the term "preview" in relation to the preferred embodiment. It will also be seen that the present invention can be applied even where the final image 110 is captured without a flash. In such cases, the final image 110 is taken at an exposure level different from (usually higher than) the preview image 100 and so includes more pixels with saturated values than the preview image.
In the preferred embodiment, the preview is utilised for various tasks such as auto-focus, white balance or to estimate exposure parameters. Preferably, the preview is stored in a memory of the camera for later processing techniques. Alternatively the preview may be stored in any suitable memory device.
A portion of a row of pixels of the preview image is graphically represented in Figure 4. As the preview image is a non-flash image, saturation clipping generally does not occur and as such no or relatively little information about the scene is lost.
A corresponding portion of a row of pixels of the flash full resolution image is also graphically represented in Figure 4. This flash image is subjected to saturation clipping due to the dynamic intensity range limit and therefore valuable information about the scene is eliminated from the flash image.
Referring back to Figure 3, the preview and flash images are brought to the same resolution 120. In the preferred embodiment, the resolution of the images is matched by downsampling the flash image to the resolution of the preview image. Alternatively, the resolution of the images may be matched by any combination of upsampling the preview image and downsampling the flash image. However, it will be appreciated that any suitable means of bringing the images to the same resolution may be employed.
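One simple way to bring the flash image down to preview resolution is block averaging (a sketch assuming numpy and an integer scale factor; the patent leaves the resampling method open):

```python
import numpy as np

def downsample(image, factor):
    """Reduce a full-resolution image by averaging non-overlapping
    factor x factor blocks. Any resampling scheme could be used; block
    averaging is just one illustrative choice."""
    h, w = image.shape
    h, w = h - h % factor, w - w % factor  # trim to a multiple of factor
    return (image[:h, :w]
            .reshape(h // factor, factor, w // factor, factor)
            .mean(axis=(1, 3)))
```

In practice a camera pipeline might instead upsample the preview, or use a higher-quality resampling filter; only matching resolutions matters for the following steps.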
The preview and final images are then aligned in step 130, using image registration techniques, to compensate for any slight movement in the scene or camera between taking these images. Alignment may be performed globally across the entire images or locally using various conventional techniques as described in our co-pending Irish Patent Application No. S2005/0822 filed December 8, 2005, and will not be further described herein.
Utilising intensity information derived from the preview image, a high dynamic range (HDR) image is constructed 140 from the flash image. The HDR image incorporates an estimate of the information (bright areas) eliminated from the flash image by saturation clipping.
In the preferred embodiment, this is achieved by determining an intensity ratio between two or more neighbouring pixels in the preview image, one of which will be clipped in the flash image, together with the intensity value of the non-saturated pixels in the flash image. It will however be appreciated that the intensity ratio for each saturated pixel may be determined with respect to one or more non-neighbouring comparison pixels. Using this ratio information, the intensity of each of the clipped pixels of the flash image is extrapolated in proportion to the intensity ratio derived from the corresponding preview image pixel(s).
For example, the ratio of the intensity of a first pixel of a preview image to the intensity of a neighbouring pixel of the preview image is determined. In the case where the first pixel's corresponding pixel in the flash image has been saturation clipped, the intensity of the clipped pixel is increased in accordance with the ratio information in order to restore the pixel to its original intensity ratio with respect to its neighbouring or comparison pixels. This process is carried out for all saturated pixels to produce an HDR image.
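The ratio-based extrapolation described above can be sketched for a single row of pixels (assuming numpy; a real implementation would handle runs of saturated pixels, colour channels and 2-D neighbourhoods, which this toy version does not):

```python
import numpy as np

SAT = 255.0  # saturation level for an 8 bit/pixel image

def extrapolate_saturated(flash_row, preview_row):
    """For each saturated flash pixel, estimate its true intensity from
    the ratio between the corresponding preview pixel and an unsaturated
    neighbouring preview pixel."""
    hdr = flash_row.astype(float).copy()
    for x in np.flatnonzero(flash_row >= SAT):
        # find a neighbour that is not clipped in the flash image
        for n in (x - 1, x + 1):
            if 0 <= n < len(flash_row) and flash_row[n] < SAT \
                    and preview_row[n] > 0:
                ratio = preview_row[x] / preview_row[n]  # preview ratio
                hdr[x] = flash_row[n] * ratio  # extrapolate past the clip
                break
    return hdr
```

For instance, if a preview pixel is twice as bright as its neighbour, the corresponding clipped flash pixel is restored to twice its unclipped neighbour's flash intensity, which may exceed 255.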
In this way, the HDR image resembles a flash image, which was not subjected to saturation clipping and a portion of a row of pixels of the HDR image corresponding to the pixels of the original and preview images is depicted graphically in Figure 4.
While the embodiment above has been described in terms of providing a separate HDR image from the images 100, 110, it will be seen that the invention could equally be implemented by adjusting the values of the flash image 110 and using this adjusted image in the steps described below.
In the preferred embodiment, as disclosed in Irish Patent Application No. S2005/0822 filed December 8, 2005, the HDR image may undergo a digital segmentation process 135 to determine foreground and/or background within at least one portion of the image. In one implementation, the HDR image is compared to the preview non-flash image 100 of nominally the same scene. Overall light distribution will vary between the two images, because one image or subset of images will be illuminated only with available ambient light while another will be illuminated with direct flash light, thereby enabling the HDR image to be separated into foreground and background. As an alternative to using the HDR image, the flash image 110 could be compared with the preview image 100 to perform foreground/background segmentation, which could in turn be applied for use in processing the HDR image; or alternatively a flash and a non-flash preview image could be used to perform foreground/background segmentation, again for use in processing the HDR image.
Alternatively, foreground and background regions of the HDR image may be separated 135 by the method disclosed in US Priority Application No. 60/773,714. In this embodiment, one flash or non-flash image of the scene is taken with the foreground more in focus than the background and can be converted to a HDR image, as explained above. The HDR image is then stored in DCT-coded format. A second, out-of-focus image of nominally the same scene is taken 133, and stored in DCT-coded format. The two DCT-coded images are compared and regions of the HDR image are assigned as foreground or background according to whether the sum of selected high order DCT coefficients decreases or increases for the equivalent regions of the second image.
In the preferred embodiment, as depicted in Figure 5, regions of the HDR image labelled as background from the steps described above are then blurred 150 with a circular kernel that resembles the PSF (point spread function) of a lens of the camera to emulate the real effect of optical blur. A circular shaped kernel is employed because it approximates the real lens aperture effect. Also, since the lens does not amplify or reduce the amount of light passing through, the convolution kernel is derived such that the sum of all its values equals 1, i.e. Σ_{i,j} g(i,j) = 1. However it will be appreciated that any suitably shaped kernel may be utilised.
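Applying the blur only to background-labelled pixels can be sketched as a masked convolution (assuming numpy, a boolean background mask, and a kernel already normalised to sum to 1; an illustrative sketch, not the patent's implementation):

```python
import numpy as np

def blur_background(image, background_mask, kernel):
    """Blur only the pixels labelled as background, leaving the
    foreground sharp. Edge padding avoids darkening at the borders."""
    kh, kw = kernel.shape
    ph, pw = kh // 2, kw // 2
    padded = np.pad(image.astype(float), ((ph, ph), (pw, pw)), mode="edge")
    blurred = np.zeros_like(image, dtype=float)
    for i in range(kh):
        for j in range(kw):
            blurred += kernel[i, j] * padded[i:i + image.shape[0],
                                             j:j + image.shape[1]]
    # keep the original (sharp) pixels wherever the mask says foreground
    return np.where(background_mask, blurred, image)
```

A more faithful variant would blur the whole background layer and composite over the foreground with a soft mask, but the masked selection above shows the essential idea.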
The range of the blurred image produced in step 150 is then scaled back 160 to the range of the full resolution image, as shown in Figure 5, to produce a realistically blurred image 170 similar to the low depth-of-field blur generated by a conventional camera.
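The final re-scaling step can be sketched as a linear mapping of the blurred HDR intensities back into the 8-bit range (assuming numpy; linear scaling is one plausible choice, the patent does not fix the mapping):

```python
import numpy as np

def rescale_to_8bit(blurred_hdr):
    """Scale blurred HDR intensities back into 0..255. Values already in
    range pass through; otherwise the row is compressed linearly so the
    brightest pixel lands on 255."""
    top = float(blurred_hdr.max())
    if top <= 255.0:
        return blurred_hdr.astype(np.uint8)
    return np.clip(blurred_hdr * (255.0 / top), 0, 255).astype(np.uint8)
```

Because the blur was applied before the compression back to 255, the bright regions bleed realistically into their surroundings instead of forming flat clipped discs.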
It will be seen that many variations of the above embodiments are possible. For example, image processing software described in Figure 3 can be implemented completely in a camera or as part of an external processing device such as a desktop computer which is provided with the images 100, 110, 133.
The present invention is not limited to the embodiments described herein, which may be amended or modified without departing from the scope of the present invention.

Claims (5)

Claims:
1. A method for creating blur in an image, said method comprising: acquiring a first image of a scene taken at a first exposure level; acquiring a second image of nominally the same scene taken at a second exposure level, wherein at least one region of said second image includes pixels having saturated intensity values; deriving for at least some of said saturated pixels, values extrapolated from said first image; blurring at least a portion of a third image including pixels having said extrapolated values; and re-scaling said blurred portion of said third image.
2. The method of claim 1 wherein the first image is taken without a flash and wherein the second image is taken with a flash exposure level.
3. The method of claim 1 wherein deriving an extrapolated value for a saturated pixel comprises: calculating at least one ratio of an intensity of a pixel in said first image to an intensity of at least one non-saturated pixel; and providing said extrapolated value for a selected pixel in accordance with said intensity ratio.
4. The method of claim 2 wherein said non-saturated pixel comprises a pixel in said first image which corresponds to a pixel in said second image which has not undergone saturation clipping.
5. The method of claim 1 further comprising: determining one or more portions of said third image which correspond to foreground of the image and portions which correspond to background of the image; and blurring said background portions.
IE2006/0558A 2006-07-27 Image blurring IES84403Y1 (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
US (United States of America) 14/02/2006

Publications (2)

Publication Number Publication Date
IE20060558U1 (en) 2006-11-01
IES84403Y1 (en) 2006-11-01
