US7885527B2 - Focusing apparatus and method - Google Patents
Focusing apparatus and method
- Publication number
- US7885527B2 (application US12/228,191)
- Authority
- US
- United States
- Prior art keywords
- region
- focus
- pixels
- image
- user
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active, expires
Classifications
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B7/00—Mountings, adjusting means, or light-tight connections, for optical elements
- G02B7/28—Systems for automatic generation of focusing signals
- G02B7/36—Systems for automatic generation of focusing signals using image sharpness techniques, e.g. image processing techniques for generating autofocus signals
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B7/00—Mountings, adjusting means, or light-tight connections, for optical elements
- G02B7/28—Systems for automatic generation of focusing signals
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03B—APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
- G03B13/00—Viewfinders; Focusing aids for cameras; Means for focusing for cameras; Autofocus systems for cameras
- G03B13/32—Means for focusing
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B7/00—Mountings, adjusting means, or light-tight connections, for optical elements
- G02B7/02—Mountings, adjusting means, or light-tight connections, for optical elements for lenses
- G02B7/04—Mountings, adjusting means, or light-tight connections, for optical elements for lenses with mechanism for focusing or varying magnification
- G02B7/08—Mountings, adjusting means, or light-tight connections, for optical elements for lenses with mechanism for focusing or varying magnification adapted to co-operate with a remote control mechanism
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03B—APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
- G03B13/00—Viewfinders; Focusing aids for cameras; Means for focusing for cameras; Autofocus systems for cameras
- G03B13/32—Means for focusing
- G03B13/34—Power focusing
- G03B13/36—Autofocus systems
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/61—Control of cameras or camera modules based on recognised objects
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/62—Control of parameters via user interfaces
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/63—Control of cameras or camera modules by using electronic viewfinders
- H04N23/631—Graphical user interfaces [GUI] specially adapted for controlling image capture or setting capture parameters
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/67—Focus control based on electronic image sensor signals
- H04N23/675—Focus control based on electronic image sensor signals comprising setting of focusing regions
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N3/00—Scanning details of television systems; Combination thereof with generation of supply voltages
- H04N3/10—Scanning details of television systems; Combination thereof with generation of supply voltages by means not exclusively optical-mechanical
- H04N3/14—Scanning details of television systems; Combination thereof with generation of supply voltages by means not exclusively optical-mechanical by means of electrically scanned solid-state devices
- H04N3/15—Scanning details of television systems; Combination thereof with generation of supply voltages by means not exclusively optical-mechanical by means of electrically scanned solid-state devices for picture signal generation
- H04N3/155—Control of the image-sensor operation, e.g. image processing within the image-sensor
- H04N3/1562—Control of the image-sensor operation, e.g. image processing within the image-sensor for selective scanning, e.g. windowing, zooming
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/80—Camera processing pipelines; Components thereof
- H04N23/84—Camera processing pipelines; Components thereof for processing colour signals
- H04N23/843—Demosaicing, e.g. interpolating colour pixel values
Definitions
- the present invention relates to a digital image processing apparatus and a method of operating the same, and more particularly, to a focusing apparatus and method which enables an image sensor to read only pixels within a preset region of an image, calculates a focus value using the read pixels, and applies the calculated focus value to the entire image.
- digital image processing apparatuses convert an electrical image signal of an object into a digital signal using a sensor such as a charge coupled device (CCD) or a complementary metal oxide semiconductor (CMOS), and store the digital signal using a compression/recovery unit or output the digital image using an output unit.
- Such digital image processing apparatuses perform autofocusing to correctly focus an object.
- the autofocusing is performed by calculating a focus value using an edge, which is output after an image signal is processed by a sensor such as a CCD or a CMOS, for every picture, determining a moving direction and a moving distance of a focus lens on the basis of the calculated focus value, and moving the focus lens in the determined moving direction by the determined moving distance.
- the sensor, such as a CCD or a CMOS, reads all pixels of the object at a rate of 30 frames per second (FPS). For example, when one frame comprises 256 pixels, the time taken to read one pixel at a rate of 30 FPS is 0.117 ms. Accordingly, the time taken to read all 256 pixels is approximately 30 ms.
- pixels of some lines are read and pixels of other lines are skipped in order to reduce the time taken to read the pixels. For example, when pixels of a first line are read and pixels of a third line are skipped so that only 64 of the 256 pixels are read, the time taken to read the 64 pixels is approximately 7.488 ms (133 FPS).
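- As an illustration of the readout-time arithmetic above, the following sketch uses the 0.117 ms per-pixel figure and the 256-pixel example frame quoted in the description; the code itself is not part of the claimed apparatus.

```python
# Illustrative arithmetic only: the per-pixel time and frame size are the
# example values quoted in the description above.
PER_PIXEL_READ_MS = 0.117   # approximate time to read one pixel

def readout_time_ms(num_pixels: int) -> float:
    """Approximate sensor readout time for a given number of pixels."""
    return num_pixels * PER_PIXEL_READ_MS

full_frame_pixels = 256      # example frame from the description
subset_pixels = 64           # pixels read when lines are skipped

print(readout_time_ms(full_frame_pixels))  # ~29.95 ms, i.e. about 30 FPS
print(readout_time_ms(subset_pixels))      # ~7.49 ms, i.e. about 133 FPS
```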
- the present invention provides a focusing apparatus and method that enables an image sensor to read only pixels within a preset region of an image, calculates a focus value using the read pixels, and applies the calculated focus value to the entire image.
- a focusing apparatus comprising: a region setting unit setting an arbitrary focus region of an image; a set region pixel reading unit reading only pixels within the set focus region; a focus value calculating unit calculating a maximum focus value according to a moving distance of a focus lens for achieving an in-focus state with respect to the pixels within the focus region read by the set region pixel reading unit; and a control unit applying the calculated maximum focus value to the entire image.
- the focus region may be set by a user.
- the focus region may be set to a central region comprising predetermined pixel blocks.
- the focus region may be set to a face recognizing region.
- the focusing apparatus may further comprise a face recognizing unit detecting the number and positions of faces from the image, when the focus region is set to the face recognizing region.
- a focusing apparatus comprising: a region setting unit setting an arbitrary focus region of an image; a set region pixel reading unit reading only pixels within an adjacent pixel region comprising the set focus region; a focus value calculating unit calculating a maximum focus value according to a moving distance of a focus lens for achieving an in-focus state with respect to the pixels within the adjacent pixel region comprising the set focus region read by the set region pixel reading unit; and a control unit applying the calculated maximum focus value to the entire image.
- the focus region may be set by a user.
- the focus region may be set to a central region comprising predetermined pixel blocks.
- the focus region may be set to a face recognizing region.
- the focusing apparatus may further comprise a face recognizing unit detecting the number and positions of faces from the image, when the focus region is set to the face recognizing region.
- a focusing method comprising: setting an arbitrary focus region of an image; reading only pixels within the set focus region; calculating a maximum focus value according to a moving distance of a focus lens for achieving an in-focus state with respect to the pixels within the read focus region; applying the calculated maximum focus value to the entire image.
- the setting of the arbitrary focus region may comprise: the user setting the focus region; setting the focus region to a central region comprising predetermined pixel blocks; or setting the focus region to a face recognizing region.
- the focusing method may further comprise detecting the number and positions of faces from the image, when the focus region is set to the face recognizing region.
- a focusing method comprising: setting an arbitrary focus region from an image; reading only pixels within an adjacent pixel region comprising the set focus region; calculating a maximum focus value according to a moving distance of a focus lens for achieving an in-focus state with respect to the pixels within the read adjacent pixel region comprising the set focus region; and applying the calculated maximum focus value to the entire image.
- the setting of the arbitrary focus region may comprise the user setting the focus region, setting the focus region to a central region comprising predetermined pixel blocks, or setting the focus region to a face recognizing region.
- the focusing method may further comprise detecting the number and positions of faces from the image, when the focus region is set to the face recognizing region.
- FIG. 1 is a perspective view illustrating the front and the top of a conventional digital image processing apparatus
- FIG. 2 is a rear view illustrating the back of the conventional digital image processing apparatus of FIG. 1 ;
- FIG. 3 is a block diagram of a focusing apparatus according to an embodiment of the present invention.
- FIG. 4 portions (a) through (c), which are referred to hereinafter as FIGS. 4A through 4C , illustrate focus regions set by a digital signal processing unit of the focusing apparatus of FIG. 3 , according to an embodiment of the present invention
- FIG. 5 portions (a) through (c), which are referred to hereinafter as FIGS. 5A through 5C , illustrate focus regions set by the digital signal processing unit of the focusing apparatus of FIG. 3 , according to another embodiment of the present invention
- FIG. 6 is a graph illustrating a relationship between a focus value and a moving distance of a focus lens
- FIG. 7 is a flowchart illustrating a focusing method according to an embodiment of the present invention.
- FIG. 8 is a flowchart illustrating a focusing method according to another embodiment of the present invention.
- FIG. 1 is a perspective view illustrating the front and the top of a conventional digital image processing apparatus.
- the digital image processing apparatus includes a shutter-release button 11 , a power button 13 , a flash unit 15 , an auxiliary light unit 17 , and a lens unit 19 .
- the shutter-release button 11 is pressed to expose a charge coupled device (CCD) to light for a predetermined period of time, and works together with an iris (not shown) to properly expose an object to light and record an image of the object on the CCD.
- the shutter-release button 11 is pressed by a user to generate first and second image photographing signals. If the shutter-release button 11 is depressed half-way, the digital image processing apparatus performs focusing and adjusts the amount of light. When correct focus is achieved, a focus indication may be provided to the user, for example a green light or icon may be illuminated on a display unit 23 in FIG. 2 . Once the shutter-release button 11 has been depressed half-way, correct focus has been obtained, and the amount of light has been adjusted, the shutter-release button 11 is completely depressed to take the shot.
- the power button 13 is pressed to supply power to the digital image processing apparatus and operate it.
- the flash unit 15 provides a momentary light when an image is photographed in the dark. Flash modes include an auto flash mode, a fill flash mode, a no flash mode, a red-eye reduction mode, and a slow synchronization mode.
- the auxiliary light unit 17 supplies light to an object (e.g., when the object lacks illumination or is photographed at night) so that the digital image processing apparatus can perform autofocusing quickly and accurately.
- the lens unit 19 receives light reflected from an object so that the apparatus can process an image.
- FIG. 2 is a rear view illustrating the back of the conventional digital image processing unit of FIG. 1 .
- the back of the digital image processing apparatus includes a wide angle-zoom button 21 w , a telephoto-zoom button 21 t , a display unit 23 , and input buttons B 1 through B 14 .
- when the wide angle-zoom button 21 w or the telephoto-zoom button 21 t is pressed, an angle of view is widened or narrowed. In particular, the wide angle-zoom button 21 w or the telephoto-zoom button 21 t is pressed to change the size of a selected area: when the wide angle-zoom button 21 w is pressed, the selected area is reduced, and when the telephoto-zoom button 21 t is pressed, the selected area is expanded.
- the input buttons B 1 through B 14 are vertically and horizontally arranged adjacent to the display unit 23 .
- Each of the input buttons B 1 through B 14 vertically and horizontally arranged adjacent to the display unit 23 includes a touch sensor (not shown) or a contact switch (not shown).
- a touch sensor does not require as firm a touch as a contact switch.
- if a touch sensor is included in each of the input buttons B 1 through B 14 , an arbitrary item, e.g., color or brightness, among main menu items, or a sub-menu icon included in a main menu icon, may be selected by the user moving his or her fingertip over or on the horizontally aligned buttons B 1 through B 7 or the vertically aligned buttons B 8 through B 14 in various directions.
- if a contact switch is included in each of the buttons B 1 through B 14 , the main-menu icon and the sub-menu icon may be directly selected to activate corresponding functions.
- FIG. 3 is a block diagram of a focusing apparatus according to an embodiment of the present invention.
- the focusing apparatus according to the current embodiment of the present invention includes a display unit 23 , a user input unit 31 , an image pickup unit 33 , an image processing unit 35 , a storage unit 37 , and a digital signal processing unit 39 .
- the user input unit 31 includes a shutter-release button 11 which is pressed to expose a CCD to light for a predetermined period of time, a power button 13 supplying power, a wide angle-zoom button 21 w and a telephoto-zoom button 21 t widening or narrowing an angle of view, and input buttons B 1 through B 14 vertically and horizontally arranged adjacent to the display unit 23 to input characters and each including a touch sensor or a contact switch.
- the image pickup unit 33 includes a zoom lens 33 - 1 , a focus lens 33 - 2 , a focus lens driving unit 33 - 3 , an image sensor 33 - 4 , an analog-to-digital converter (ADC) 33 - 5 , a shutter (not shown), and an iris (not shown).
- the shutter and the iris work together to adjust the amount of light received by the image sensor 33 - 4 .
- the zoom lens 33 - 1 and the focus lens 33 - 2 receive light from an external light source and process an image.
- the iris adjusts its size according to the amount of incident light. The size of the iris is controlled by the digital signal processing unit 39 .
- each of the zoom lens 33 - 1 and the focus lens 33 - 2 is aligned with the optical center of a light receiving surface of the image sensor 33 - 4 .
- the focus lens 33 - 2 is movable linearly along the optical axis.
- the focus lens 33 - 2 is moved to focus an image on the light receiving surface of the image sensor 33 - 4 .
- the focus lens 33 - 2 is moved by the focus lens driving unit 33 - 3 under the control of the digital signal processing unit 39 .
- the image sensor 33 - 4 collects the light input through the zoom lens 33 - 1 and the focus lens 33 - 2 , and outputs an image corresponding to the collected light in response to a vertical sync signal.
- the image sensor 33 - 4 , which converts light reflected by an object into an electrical signal, serves to capture an image.
- the image sensor 33 - 4 needs a color filter.
- a color filter array (CFA, not shown) is generally used.
- a CFA is an array of color filters regularly placed over the image sensor 33 - 4 , in which each filter transmits light of only one color to the pixel beneath it.
- the CFA may have various arrangements.
- the ADC 33 - 5 converts an analog image signal which is output from the image sensor 33 - 4 into a digital image signal.
- the image processing unit 35 processes digital raw data from the ADC 33 - 5 into processed data that can be displayed or stored.
- the image processing unit 35 removes a black level caused by a dark current generated in the CFA and the image sensor 33 - 4 which are sensitive to a temperature change.
- the image processing unit 35 performs gamma correction that encodes information based on non-linear human visual response.
- the image processing unit 35 also performs CFA interpolation, which interpolates the missing colors of the gamma-corrected Bayer pattern, comprised of RGRG lines and GBGB lines, into full RGB lines to complete an RGB signal.
- the image processing unit 35 converts the RGB signal into a YUV signal, performs edge compensation, which filters the Y signal using a high-pass filter to obtain a clear image, performs color correction, which corrects the color values of the U and V signals using standard color coordinates, and removes noise from the Y, U, and V signals.
- the image processing unit 35 compresses the noise-reduced Y, U, and V signals to generate a joint photographic experts group (JPEG) file.
- the generated JPEG file is displayed on the display unit 23 , and is stored in the storage unit 37 .
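- For orientation, the processing chain of the image processing unit 35 described above can be outlined as a sequence of stages. The sketch below is a greatly simplified stand-in (crude nearest-neighbour demosaicing, luma-only colour conversion, and no edge compensation, colour correction, noise removal, or JPEG encoding); it is not the actual implementation of the image processing unit 35.

```python
import numpy as np

def subtract_black_level(raw, black=64):
    # Remove the dark-current offset before further processing.
    return np.clip(raw.astype(np.int32) - black, 0, None)

def apply_gamma(data, gamma=1 / 2.2, white=1023):
    # Encode with a non-linear curve approximating human visual response.
    return ((data / white) ** gamma * 255).astype(np.uint8)

def demosaic_nearest(bayer):
    # Crude CFA interpolation: copy each colour of a 2x2 RGGB cell across the cell.
    h, w = bayer.shape
    rgb = np.zeros((h, w, 3), dtype=np.uint8)
    rgb[..., 0] = bayer[0::2, 0::2].repeat(2, axis=0).repeat(2, axis=1)[:h, :w]  # R
    rgb[..., 1] = bayer[0::2, 1::2].repeat(2, axis=0).repeat(2, axis=1)[:h, :w]  # G
    rgb[..., 2] = bayer[1::2, 1::2].repeat(2, axis=0).repeat(2, axis=1)[:h, :w]  # B
    return rgb

def rgb_to_y(rgb):
    # Luma only, for brevity; a full pipeline would also produce U and V.
    return 0.299 * rgb[..., 0] + 0.587 * rgb[..., 1] + 0.114 * rgb[..., 2]

raw = np.random.randint(0, 1024, size=(8, 8))   # toy 10-bit Bayer frame
y = rgb_to_y(demosaic_nearest(apply_gamma(subtract_black_level(raw))))
```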
- the operations of the image processing unit 35 may be performed under the control of or in cooperation with the digital signal processing unit 39 .
- the digital signal processing unit 39 does not control the image sensor 33 - 4 to read all pixels of an image, but rather it controls the image sensor 33 - 4 to read only pixels within a set focus region such that a focus value is calculated using the read pixels and is applied to the entire image, thereby reducing a time taken for the image sensor 33 - 4 to read pixels and increasing a focusing speed.
- the digital signal processing unit 39 includes a region setting unit 39 - 1 , a set region pixel reading unit 39 - 2 , a focus value calculating unit 39 - 3 , and a control unit 39 - 4 .
- Digital signal processing units 39 according to two example embodiments of the present invention will now be explained.
- the digital signal processing unit 39 reads only pixels within a preset focus region, calculates a maximum focus value from the read pixels, and then performs focusing.
- the digital signal processing unit 39 reads only pixels within an “adjacent” pixel region that includes the preset focus region of the first embodiment, calculates a maximum focus value from the read pixels, and performs focusing.
- the first embodiment, that is, the digital signal processing unit 39 which reads only pixels within a preset focus region, calculates a maximum focus value from the read pixels, and then performs focusing, will now be explained.
- the region setting unit 39 - 1 receives a region set signal from the user input unit 31 , and sets an arbitrary region for calculating a focus value.
- the control unit 39 - 4 stores coordinate information of the focus region which is set by the region setting unit 39 - 1 .
- a focus region may be set in various ways.
- the focus region may be set in one of three ways: by a user; to a central region of an image; and to a face recognizing region.
- the control unit 39 - 4 may provide a menu to a user (i.e., displaying a menu on the display unit 23 ) for setting the focus region.
- the user may select a menu item or sub-menu, e.g., a user direct setting, a central region setting, or a face recognizing region setting, in the menu.
- the focus region may be set in other ways.
- FIGS. 4A through 4C illustrate various example focus regions (shown by cross-hatched pixels) that may be set by the digital signal processing unit 39 of the focusing apparatus of FIG. 3 according to an embodiment of the present invention.
- the user may set a focus region of an image after watching a live view of the image or a viewfinder view of the image.
- the user directly sets a focus region 401 comprised of 4 pixel blocks using the buttons B 1 through B 14 ( FIG. 2 ).
- although the focus region 401 of FIG. 4A is comprised of 4 pixel blocks arranged in a square configuration, the present invention is not limited thereto.
- the focus region 401 may be rectangular, circular or defined by other polylinear or curvilinear shapes known in the art.
- the directly set focus region 401 may be moved to a desired position using the buttons B 1 through B 14 .
- the user may select a size, shape, and location for the focus region 401 , thereby customizing the region 401 as desired.
- the user could alternatively select a different portion of the horse's body such as a leg, foot, torso, tail, etc. or even a different object such as the tree (or portion thereof) shown behind the horse.
- a focus region 403 is set to a central region of an image.
- a focus region 405 is set to a face recognizing region of an image.
- the digital signal processing unit 39 further includes a face recognizing unit (not shown) to detect face information regarding the number and positions of faces.
- the face recognizing unit sets the face recognizing region using detected face information.
- a feature-based face detection method of the face recognizing unit is used to locate obvious features of the face, such as the eyes, nose, lips, material, or skin color. Since skin color is less sensitive than the other features to variations in the movement, rotation, and size of the face, it is often used.
- a template-based face detection method of the face recognizing unit defines several standard patterns for a face, stores the patterns, and compares an image with one of the stored patterns in a search window.
- a support vector machine (SVM) based face detection method of the face recognizing unit is presently the most widely used.
- the SVM-based face detection method sub-samples different regions from an image, discriminates a face from a non-face portion of the image using a detector, and then finds a face from the image. Since the face detection methods of the face recognizing unit are already well known, detailed explanations thereof are omitted for brevity.
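- As a rough illustration of the sliding-window idea mentioned above (sub-sampling regions of an image and classifying each as face or non-face), consider the sketch below; the `classify_window` function is a hypothetical placeholder for a trained SVM detector, not part of the patent, and a practical detector would also scan multiple window scales.

```python
import numpy as np

def classify_window(window: np.ndarray) -> bool:
    # Hypothetical stand-in for a trained SVM face/non-face classifier.
    return window.mean() > 128   # placeholder rule, not a real detector

def detect_faces(gray: np.ndarray, win: int = 24, step: int = 8):
    # Slide a fixed-size window over the image and keep the windows that the
    # classifier labels as faces, returning (x, y, width, height) boxes.
    faces = []
    h, w = gray.shape
    for y in range(0, h - win + 1, step):
        for x in range(0, w - win + 1, step):
            if classify_window(gray[y:y + win, x:x + win]):
                faces.append((x, y, win, win))
    return faces
```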
- the set region pixel reading unit 39 - 2 reads only pixels within the focus region that is set by the region setting unit 39 - 1 .
- the control unit 39 - 4 fetches the stored coordinate information of the set focus region, and transmits the fetched coordinate information to the image sensor 33 - 4 .
- the image sensor 33 - 4 transmits the pixels (or pixel information) within the set focus region to the set region pixel reading unit 39 - 2 .
- the set region pixel reading unit 39 - 2 reads only pixels within the focus region 401 directly set by the user.
- the set region pixel reading unit 39 - 2 reads only pixels within the focus region 403 set to the central region.
- the set region pixel reading unit 39 - 2 reads only pixels within the focus region 405 set to the face recognizing region.
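- A minimal sketch of the region-limited readout described above, assuming the focus region reaches the sensor as pixel coordinates (x, y, width, height); array slicing stands in for the sensor transferring only the requested pixels, and real sensors expose windowed readout through their own register interfaces, which are not specified here.

```python
import numpy as np

def read_focus_region(frame: np.ndarray, coords):
    # coords is (x, y, width, height); in the apparatus this information is
    # fetched by the control unit 39-4 and sent to the image sensor 33-4,
    # which then transfers only these pixels.
    x, y, w, h = coords
    return frame[y:y + h, x:x + w]

# Example: a 480x640 frame with a 64x64 focus region near the centre.
frame = np.random.randint(0, 256, size=(480, 640), dtype=np.uint8)
roi = read_focus_region(frame, (288, 208, 64, 64))
print(roi.shape)   # (64, 64): far fewer pixels than the full frame
```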
- the focus value calculating unit 39 - 3 calculates, under the control of the control unit 39 - 4 , a maximum focus value according to a moving distance of the focus lens 33 - 2 for achieving an in-focus state with respect to the pixels read by the set region pixel reading unit 39 - 2 .
- FIG. 6 is a graph for explaining a maximum focus value according to a moving distance of the focus lens 33 - 2 for achieving an in-focus state, which is calculated by the focus value calculating unit 39 - 3 from the pixels within the set focus region 401 , 403 , or 405 . If correct focus is not achieved within the focus region 401 , 403 , or 405 , a low focus value A is obtained. In this case, the moving direction of the focus lens 33 - 2 is determined at a point B to become a direction C.
- the focus lens 33 - 2 When the focus lens 33 - 2 is moved in the direction C and passes by a point E where a maximum focus value is obtained, the focus lens 33 - 2 is moved back in a direction D toward the point E, and then is fixed to the point E, thereby finding the maximum focus value.
- a time taken to find a maximum focus value is a sum of a time taken to read the pixels within the focus region (i.e., one of 401 , 403 , and 405 ) by the set region pixel reading unit 39 - 2 and a moving time of the focus lens 33 - 2 .
- the focus value calculating unit 39 - 3 continuously exchanges data with the control unit 39 - 4 , and the control unit 39 - 4 receives a signal output from the focus value calculating unit 39 - 3 and controls the focus lens driving unit 33 - 3 to find a maximum focus value.
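- The search illustrated in FIG. 6 is essentially a hill climb over lens positions using a contrast (edge) measure as the focus value. The sketch below follows that interpretation; the gradient-magnitude focus measure, the step-wise sweep, and the `read_roi_at` callback (which would move the lens and return only the focus-region pixels) are illustrative assumptions, not the patent's specified implementation.

```python
import numpy as np

def focus_value(roi: np.ndarray) -> float:
    # Contrast-based focus measure: mean squared gradient of the region.
    # Sharper images have stronger edges and therefore a higher value.
    gy, gx = np.gradient(roi.astype(np.float64))
    return float(np.mean(gx * gx + gy * gy))

def find_max_focus(read_roi_at, positions):
    # Sweep the focus lens over `positions`, reading only the focus-region
    # pixels at each stop, and return the position with the peak focus value.
    # Once the value has clearly fallen past the peak (point E in FIG. 6),
    # the sweep stops and the peak position is kept.
    best_pos, best_val = None, -1.0
    for pos in positions:
        val = focus_value(read_roi_at(pos))
        if val > best_val:
            best_pos, best_val = pos, val
        elif val < 0.9 * best_val:   # well past the peak: stop searching
            break
    return best_pos, best_val
```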
- the control unit 39 - 4 performs focusing by applying the calculated maximum focus value to the entire image. If the shutter-release button 11 is completely depressed, the control unit 39 - 4 captures an image adjusted with the maximum focus value and stores the adjusted image in the storage unit 37 .
- since the set region pixel reading unit 39 - 2 according to an embodiment of the present invention reads only pixels of the set focus region 401 , 403 , or 405 and finds a maximum focus value, the time taken to read pixels can be significantly reduced compared with conventional image sensors that read substantially all pixels. Accordingly, the focusing speed can be increased compared with a conventional digital signal processing unit.
- the second embodiment, that is, the digital signal processing unit 39 which reads only pixels within an "adjacent" pixel region including the preset focus region of the first embodiment, calculates a maximum focus value from the read pixels, and performs focusing, will now be explained.
- the region setting unit 39 - 1 receives a region set signal from the user input unit 31 , and sets an arbitrary focus region for calculating a focus value.
- FIGS. 5A through 5C illustrate various example focus regions (shown by cross-hatched pixels) that may be set by the digital signal processing unit 39 of the focusing apparatus of FIG. 3 , according to the second embodiment of the present invention.
- the control unit 39 - 4 may store coordinate information of only the focus region directly set by the user, and may set an adjacent pixel region 501 that includes that focus region as the final focus region.
- alternatively, the control unit 39 - 4 may store coordinate information of both the focus region directly set by the user and the adjacent pixel region 501 including it, and may set the adjacent pixel region 501 as the final focus region.
- the adjacent pixel region 501 , which is established by the second embodiment once the user sets the focus region 401 , is comprised of 16 pixel blocks. That is, the pixel region 501 is defined by a central region (i.e., the region 401 of 4 pixel blocks) that is set by the user and a supplemental or augmenting region (i.e., a generally square, ring-shaped area of 12 pixel blocks that surrounds the region 401 ).
- an adjacent pixel region 503 including the central region may be set as a final focus region.
- the control unit 39 - 4 may store coordinate information of only the central region, and the adjacent pixel region 503 including the central region may be set as the final focus region.
- alternatively, the control unit 39 - 4 may store coordinate information of both the central region and the adjacent pixel region 503 including it, and the adjacent pixel region 503 may be set as the final focus region.
- the adjacent pixel region 503 including the central region in FIG. 5B is comprised of 16 pixel blocks.
- the image processing unit 35 or one or more units of the digital signal processing unit 39 may augment or otherwise change the focus region 403 to define the adjacent pixel region 503 , which is the four-block region 403 surrounded by a square ring of 12 pixel blocks.
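- In block coordinates, the adjacent pixel region is simply the set focus region grown by one block on every side (4 blocks becoming a 4x4 arrangement of 16 blocks in FIGS. 5A and 5B). The sketch below illustrates that expansion, assuming the region is described as (x, y, width, height) in blocks; the one-block margin is the example from the figures, not a fixed requirement.

```python
def adjacent_region(x, y, w, h, margin=1, grid_w=None, grid_h=None):
    # Grow a focus region given in block coordinates by `margin` blocks on
    # every side, clamped to the block grid if its size is known.
    x0 = max(x - margin, 0)
    y0 = max(y - margin, 0)
    x1 = x + w + margin if grid_w is None else min(x + w + margin, grid_w)
    y1 = y + h + margin if grid_h is None else min(y + h + margin, grid_h)
    return x0, y0, x1 - x0, y1 - y0

# A 2x2-block focus region grows into a 4x4-block (16-block) final region.
print(adjacent_region(5, 3, 2, 2))   # (4, 2, 4, 4)
```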
- an adjacent pixel region 505 including a face recognizing region may be set as a final focus region.
- the digital signal processing unit 39 further includes a face recognizing unit (not shown) in order to detect the number and positions of faces.
- the control unit 39 - 4 may store coordinate information of only the face recognizing region, and the adjacent pixel region 505 including the face recognizing region may be set as the final focus region.
- alternatively, the control unit 39 - 4 may store coordinate information of both the face recognizing region and the adjacent pixel region 505 including it, and the adjacent pixel region 505 may be set as the final focus region.
- the set region pixel reading unit 39 - 2 reads only pixels within the adjacent pixel region including the focus region set by the region setting unit 39 - 1 .
- the control unit 39 - 4 fetches stored coordinate information of the focus region, and transmits the coordinate information to the image sensor 33 - 4 .
- the image sensor 33 - 4 transmits only the pixels within the adjacent pixel region including the set focus region corresponding to the coordinate information to the set region pixel reading unit 39 - 2 .
- the set region pixel reading unit 39 - 2 reads only pixels within the adjacent pixel region 501 including the focus region directly set by the user.
- the set region pixel reading unit 39 - 2 reads only pixels within the adjacent pixel region 503 including the focus region set to the central region.
- the set region pixel reading unit 39 - 2 reads only pixels within the adjacent pixel region 505 including the focus region set to the face recognizing region.
- the focus value calculating unit 39 - 3 calculates a maximum focus value according to a moving distance of the focus lens 33 - 2 for achieving an in-focus state with respect to the pixels read by the set region pixel reading unit 39 - 2 , under the control of the control unit 39 - 4 . Since the operations of the focus value calculating unit 39 - 3 and the control unit 39 - 4 are the same as described previously, an explanation thereof will not be repeated for brevity.
- although setting the focus region 501 , 503 , or 505 gives the digital signal processing unit 39 more pixels to process than setting the focus region 401 , 403 , or 405 of FIGS. 4A through 4C , the number of pixels to be read is still much smaller than that read by a conventional digital signal processing unit, thereby improving the focusing speed.
- the focusing methods may be performed in the focusing apparatus of FIG. 3 .
- a main photographing method using the present focusing methods may be performed in the digital signal processing unit 39 in conjunction with peripheral components in the digital image processing apparatus.
- FIG. 7 is a flowchart illustrating a focusing method according to an embodiment of the present invention.
- the digital signal processing unit 39 receives a focus region select signal from a user, and sets a focus region of an image using the focus region select signal.
- the digital signal processing unit 39 stores coordinate information of the set focus region.
- the digital signal processing unit 39 may provide a menu for setting a focus region to the user, and the user may select a desired sub menu in the menu.
- the menu may include three menu items or sub menus, that is, a focus region may be selected in three ways.
- the focus region may be directly set by the user, set to a central region of an image, or set to a face recognizing region of the image. Since the setting of the focus region has already been described in detail with reference to FIGS. 4A through 4C , an explanation thereof will not be repeated for brevity.
- the shutter-release button 11 is depressed half-way by a user to initiate photographing, and the digital signal processing unit 39 receives a signal indicating that the shutter-release button 11 is depressed half-way.
- the digital signal processing unit 39 fetches coordinate information of the set focus region.
- the digital signal processing unit 39 reads only pixels within the set focus region. To this end, once the shutter-release button 11 is depressed half-way, the digital signal processing unit 39 fetches stored coordinate information of the set focus region, and transmits the coordinate information of the set focus region to the image sensor 33 - 4 . The image sensor 33 - 4 transmits only the pixels or pixel information of the pixels within the set focus region to the digital signal processing unit 39 .
- the digital signal processing unit 39 calculates a maximum focus value according to a moving distance of the focus lens 33 - 2 for achieving an in-focus state with respect to the pixels within the set focus region. Since the calculating of the maximum focus value from the pixels within the set focus region has already been explained in detail, an explanation will not be repeated for brevity.
- the digital signal processing unit 39 performs focusing by applying the calculated maximum focus value to the entire image.
- the shutter-release button 11 is completely depressed by a user to capture a focused image, and the digital signal processing unit 39 receives a signal indicating that the shutter-release button 11 is completely depressed.
- the digital signal processing unit 39 captures an image adjusted with the maximum focus value and stores the adjusted image in the storage unit 37 .
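- Putting the steps of FIG. 7 together, the half-press/full-press sequence can be outlined as below. This is a structural sketch only: it reuses the read_focus_region() and find_max_focus() helpers sketched earlier, and the sensor and storage methods (capture_at, move_focus_lens, capture, save) are hypothetical placeholders rather than actual camera interfaces.

```python
def on_half_press(sensor, lens_positions, focus_coords):
    # Half-press of the shutter-release button 11: fetch the stored focus-region
    # coordinates, read only those pixels at each lens position, find the peak
    # focus value, and apply the result by moving the focus lens.
    def read_roi_at(pos):
        frame = sensor.capture_at(pos)             # hypothetical sensor call
        return read_focus_region(frame, focus_coords)
    best_pos, _ = find_max_focus(read_roi_at, lens_positions)
    sensor.move_focus_lens(best_pos)               # hypothetical sensor call
    return best_pos

def on_full_press(sensor, storage):
    # Full press: capture the image adjusted with the maximum focus value
    # and store it (storage unit 37 in FIG. 3).
    image = sensor.capture()                       # hypothetical sensor call
    storage.save(image)                            # hypothetical storage call
    return image
```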
- FIG. 8 is a flowchart illustrating a focusing method according to another embodiment of the present invention.
- the digital signal processing unit 39 receives a focus region select signal from a user and sets a focus region using the focus region select signal.
- the digital signal processing unit 39 stores coordinate information of the set focus region.
- unlike in the method of FIG. 7 , the region setting unit 39 - 1 in the method of FIG. 8 sets an adjacent pixel region including the preset focus region as the final focus region. Since the setting of the final focus region has already been explained in detail with reference to FIGS. 5A through 5C , an explanation thereof will not be repeated for brevity.
- the shutter-release button 11 is depressed half-way by a user to initiate photographing, and the digital signal processing unit 39 receives a signal indicating that the shutter-release button 11 is depressed half-way.
- the digital signal processing unit 39 reads only pixels within the adjacent pixel region that includes the set focus region. To this end, once the shutter-release button 11 is depressed half-way, the digital signal processing unit 39 fetches stored coordinate information, and transmits the coordinate information to the image sensor 33 - 4 . The image sensor 33 - 4 transmits only the pixels within the adjacent pixel region including the set focus region corresponding to the coordinate information to the digital signal processing unit 39 .
- the digital signal processing unit 39 calculates a maximum focus value according to a moving distance of the focus lens 33 - 2 for achieving an in-focus state with respect to the pixels within the adjacent pixel region 501 , 503 , or 505 including the set focus region. Since the calculating of the maximum focus value from the pixels within the adjacent pixel region 501 , 503 , or 505 including the set focus region has already been explained, an explanation thereof will not be repeated for brevity.
- the digital signal processing unit 39 performs focusing by applying the calculated maximum focus value to the entire image.
- the shutter-release button 11 is completely depressed by a user to capture a focused image and the digital signal processing unit 39 receives a signal that the shutter-release button 11 is completely depressed.
- the digital signal processing unit 39 captures an image adjusted with the maximum focus value and stores the adjusted image in the storage unit 37 .
- a focus value is calculated by reading only pixels within the preset region, and the calculated focus value is applied to an entire image, thereby increasing the focusing speed.
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- General Physics & Mathematics (AREA)
- Optics & Photonics (AREA)
- Human Computer Interaction (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Studio Devices (AREA)
- Automatic Focus Adjustment (AREA)
Abstract
Description
Claims (18)
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR10-2007-0119296 | 2007-11-21 | ||
KR1020070119296A KR20090052677A (en) | 2007-11-21 | 2007-11-21 | Appratus and method for adjusting focus |
Publications (2)
Publication Number | Publication Date |
---|---|
US20090129767A1 US20090129767A1 (en) | 2009-05-21 |
US7885527B2 true US7885527B2 (en) | 2011-02-08 |
Family
ID=39672434
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US12/228,191 Active 2029-03-11 US7885527B2 (en) | 2007-11-21 | 2008-08-08 | Focusing apparatus and method |
Country Status (4)
Country | Link |
---|---|
US (1) | US7885527B2 (en) |
KR (1) | KR20090052677A (en) |
CN (1) | CN101441388B (en) |
GB (1) | GB2454956B (en) |
Families Citing this family (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8711215B2 (en) * | 2010-08-06 | 2014-04-29 | Panasonic Corporation | Imaging device and imaging method |
KR101817650B1 (en) * | 2010-09-08 | 2018-01-11 | 삼성전자주식회사 | Focusing Appratus |
FR2977964B1 (en) * | 2011-07-13 | 2013-08-23 | Commissariat Energie Atomique | METHOD FOR ACQUIRING A ROTATION ANGLE AND COORDINATES OF A ROTATION CENTER |
KR101905408B1 (en) * | 2012-01-17 | 2018-10-08 | 엘지이노텍 주식회사 | Camera module, auto focus method and auto focus calibration method |
WO2013141007A1 (en) * | 2012-03-21 | 2013-09-26 | 富士フイルム株式会社 | Image capture device |
US9191566B2 (en) * | 2012-03-30 | 2015-11-17 | Samsung Electronics Co., Ltd. | Image pickup apparatus, method for image pickup and computer-readable recording medium |
CN104243825B (en) * | 2014-09-22 | 2017-11-14 | 广东欧珀移动通信有限公司 | A kind of mobile terminal Atomatic focusing method and system |
CN104767934B (en) * | 2015-03-19 | 2019-02-05 | 联想(北京)有限公司 | Information processing method and electronic equipment |
CN108337425B (en) * | 2017-01-19 | 2020-06-19 | 宏碁股份有限公司 | Image capturing device and focusing method thereof |
Citations (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP0263510A2 (en) | 1986-10-08 | 1988-04-13 | Canon Kabushiki Kaisha | Automatic focusing device |
WO2000072584A1 (en) | 1999-05-21 | 2000-11-30 | Foveon, Inc. | Targetable autofocus system |
US20040066563A1 (en) | 2002-10-04 | 2004-04-08 | Voss James S. | Electronic imaging device focusing |
WO2008003348A1 (en) | 2006-07-07 | 2008-01-10 | Sony Ericsson Mobile Communications Ab | Active autofocus window |
US20080158409A1 (en) * | 2006-12-28 | 2008-07-03 | Samsung Techwin Co., Ltd. | Photographing apparatus and method |
US20080201637A1 (en) * | 2006-11-06 | 2008-08-21 | Sony Corporation | Image pickup apparatus, method for controlling display of image pickup apparatus, and computer program for executing method for controlling display of image pickup apparatus |
US20080219654A1 (en) * | 2007-03-09 | 2008-09-11 | Border John N | Camera using multiple lenses and image sensors to provide improved focusing capability |
US20080226278A1 (en) * | 2007-03-15 | 2008-09-18 | Nvidia Corporation | Auto_focus technique in an image capture device |
Family Cites Families (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP3848230B2 (en) * | 2002-09-12 | 2006-11-22 | キヤノン株式会社 | AUTOFOCUS DEVICE, IMAGING DEVICE, AUTOFOCUS METHOD, PROGRAM, AND STORAGE MEDIUM |
US8194173B2 (en) * | 2004-07-16 | 2012-06-05 | Nikon Corporation | Auto-focusing electronic camera that focuses on a characterized portion of an object |
EP1684503B1 (en) * | 2005-01-25 | 2016-01-13 | Canon Kabushiki Kaisha | Camera and autofocus control method therefor |
- 2007-11-21 KR KR1020070119296A patent/KR20090052677A/en not_active Application Discontinuation
- 2008-06-17 GB GB0811103.1A patent/GB2454956B/en not_active Expired - Fee Related
- 2008-07-11 CN CN2008101303434A patent/CN101441388B/en not_active Expired - Fee Related
- 2008-08-08 US US12/228,191 patent/US7885527B2/en active Active
Patent Citations (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP0263510A2 (en) | 1986-10-08 | 1988-04-13 | Canon Kabushiki Kaisha | Automatic focusing device |
WO2000072584A1 (en) | 1999-05-21 | 2000-11-30 | Foveon, Inc. | Targetable autofocus system |
US20040066563A1 (en) | 2002-10-04 | 2004-04-08 | Voss James S. | Electronic imaging device focusing |
WO2008003348A1 (en) | 2006-07-07 | 2008-01-10 | Sony Ericsson Mobile Communications Ab | Active autofocus window |
US20080201637A1 (en) * | 2006-11-06 | 2008-08-21 | Sony Corporation | Image pickup apparatus, method for controlling display of image pickup apparatus, and computer program for executing method for controlling display of image pickup apparatus |
US20080158409A1 (en) * | 2006-12-28 | 2008-07-03 | Samsung Techwin Co., Ltd. | Photographing apparatus and method |
US20080219654A1 (en) * | 2007-03-09 | 2008-09-11 | Border John N | Camera using multiple lenses and image sensors to provide improved focusing capability |
US20080226278A1 (en) * | 2007-03-15 | 2008-09-18 | Nvidia Corporation | Auto_focus technique in an image capture device |
Non-Patent Citations (1)
Title |
---|
Vinall, Sally; Search Report from UK Intellectual Property Office; Oct. 7, 2008. * |
Also Published As
Publication number | Publication date |
---|---|
GB2454956A (en) | 2009-05-27 |
GB0811103D0 (en) | 2008-07-23 |
GB2454956B (en) | 2012-11-07 |
KR20090052677A (en) | 2009-05-26 |
US20090129767A1 (en) | 2009-05-21 |
CN101441388A (en) | 2009-05-27 |
CN101441388B (en) | 2012-09-05 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US7885527B2 (en) | Focusing apparatus and method | |
US8786760B2 (en) | Digital photographing apparatus and method using face recognition function | |
US8395694B2 (en) | Apparatus and method for blurring image background in digital image processing device | |
US8890993B2 (en) | Imaging device and AF control method | |
US7881601B2 (en) | Electronic camera | |
JP4290100B2 (en) | Imaging apparatus and control method thereof | |
US8194177B2 (en) | Digital image processing apparatus and method to photograph an image with subject eyes open | |
JP5331128B2 (en) | Imaging device | |
JP4509081B2 (en) | Digital camera and digital camera program | |
KR20090113076A (en) | Apparatus and method for braketing capture in digital image processing device | |
KR20090085261A (en) | Apparatus and method for digital picturing image | |
KR20100013701A (en) | Apparatus and mthod for self timer image capture by checking the number of person | |
US8446510B2 (en) | Method and apparatus for improving face image in digital image processor | |
US10412321B2 (en) | Imaging apparatus and image synthesis method | |
US10419686B2 (en) | Image pickup apparatus and method for controlling the display of through image data based on shutter state | |
US20190222754A1 (en) | Imaging apparatus, imaging method, and storage medium | |
KR101467872B1 (en) | Digital photographing apparatus, method for controlling the same, and recording medium storing program to implement the method | |
KR20090125602A (en) | Digital photographing apparatus, method for controlling the same, and recording medium storing program to implement the method | |
KR101396335B1 (en) | Apparatus and method for detecting face | |
KR20090104429A (en) | Apparatus and method for providing image capture compostion using face detection in digital image processing device | |
KR101446781B1 (en) | Electronic device and controlling method | |
KR20090125600A (en) | Digital photographing apparatus, method for controlling the same, and recording medium storing program to implement the method | |
KR20090129745A (en) | Digital photographing apparatus, method for controlling the same, and recording medium storing program to implement the method | |
KR20090112397A (en) | Apparatus and method for digital picturing image | |
KR20090112394A (en) | Apparatus and method for digital picturing image |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: SAMSUNG TECHWIN CO., LTD., KOREA, REPUBLIC OF Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:SON, HYUK-SOO;KIM, HONG-JU;REEL/FRAME:021643/0007 Effective date: 20080808 |
|
AS | Assignment |
Owner name: SAMSUNG DIGITAL IMAGING CO., LTD., KOREA, REPUBLIC Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:SAMSUNG TECHWIN CO., LTD.;REEL/FRAME:022951/0956 Effective date: 20090619 Owner name: SAMSUNG DIGITAL IMAGING CO., LTD.,KOREA, REPUBLIC Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:SAMSUNG TECHWIN CO., LTD.;REEL/FRAME:022951/0956 Effective date: 20090619 |
|
STCF | Information on status: patent grant |
Free format text: PATENTED CASE |
|
AS | Assignment |
Owner name: SAMSUNG ELECTRONICS CO., LTD., KOREA, REPUBLIC OF Free format text: MERGER;ASSIGNOR:SAMSUNG DIGITAL IMAGING CO., LTD.;REEL/FRAME:026128/0759 Effective date: 20100402 |
|
FEPP | Fee payment procedure |
Free format text: PAYOR NUMBER ASSIGNED (ORIGINAL EVENT CODE: ASPN); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY |
|
FPAY | Fee payment |
Year of fee payment: 4 |
|
MAFP | Maintenance fee payment |
Free format text: PAYMENT OF MAINTENANCE FEE, 8TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1552) Year of fee payment: 8 |
|
MAFP | Maintenance fee payment |
Free format text: PAYMENT OF MAINTENANCE FEE, 12TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1553); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY Year of fee payment: 12 |