US20090153743A1 - Image processing device, image display system, image processing method and program therefor - Google Patents
Image processing device, image display system, image processing method and program therefor
- Publication number
- US20090153743A1 (application US12/316,837)
- Authority
- US
- United States
- Prior art keywords
- image data
- unit
- display device
- motion vector
- frame
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
- H04N 5/21 — Circuitry for suppressing or minimising disturbance, e.g. moiré or halo (picture signal circuitry for video frequency region)
- H04N 5/142 — Edging; Contouring
- H04N 5/144 — Movement detection
- H04N 5/145 — Movement estimation
- G09G 3/3648 — Control of matrices with row and column drivers using an active matrix (liquid crystal displays)
- G09G 2320/0252 — Improving the response speed
- G09G 2320/0261 — Improving the quality of display appearance in the context of movement of objects on the screen or movement of the observer relative to the screen
- G09G 2320/0271 — Adjustment of the gradation levels within the range of the gradation scale, e.g. by redistribution or clipping
- G09G 2320/106 — Determination of movement vectors or equivalent parameters within the image
- G09G 2340/16 — Determination of a pixel data signal depending on the signal applied in the previous frame
Definitions
- The present invention contains subject matter related to Japanese Patent Application JP 2007-326342 filed in the Japan Patent Office on Dec. 18, 2007, the entire contents of which are incorporated herein by reference.
- the present invention relates to an image processing device which processes externally input image data and outputs it to a hold-type display device, an image display system including the processing device, an image processing method and a program therefor.
- A hold-type display device, such as an LCD, keeps displaying all pixels constituting an image from the time an instruction to display a predetermined one of a plurality of frames or fields (hereinafter referred to as a "frame") constituting a moving image is issued until an instruction to display the next frame is issued.
- One issue is the occurrence of motion blur in a moving object, such as blurring at the leading edge, tailing at the trailing edge, and delay in the perceived position, caused by the Eye-Trace Integration effect (an afterglow characteristic of the human retina when following a moving picture).
- In an LCD, this motion blur easily occurs due to the slow response speed of the liquid crystal.
- An overdrive technique is known for restraining the motion blur by improving the response characteristic of the LCD.
- In the overdrive technique, for example, a voltage greater than the target voltage corresponding to a specified brightness value is applied in the frame where an input signal variation first occurs, so as to accelerate the brightness transition.
- As a result, the response speed of the liquid crystal can be increased in the half-tone region, attaining a restraining effect on the motion blur.
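The overdrive idea above can be illustrated with a small sketch. The following Python snippet is not taken from the patent: the boost gain, the 0-255 tone range and the function name are assumptions, used only to show how clipping at the applicable voltage range limits the technique.

```python
import numpy as np

def overdrive(prev_frame, target_frame, boost=0.5, v_min=0, v_max=255):
    """Minimal overdrive sketch: drive past the target tone for one frame.

    prev_frame, target_frame: 2-D arrays of tone values (0-255).
    boost: hypothetical overdrive gain applied to the tone step.
    The drive value is clipped to the panel's applicable range, which is
    exactly the limitation noted below: near black (0) or white (255)
    there is no headroom left for the boost.
    """
    step = target_frame.astype(np.float64) - prev_frame.astype(np.float64)
    drive = target_frame + boost * step          # overshoot in the step direction
    return np.clip(drive, v_min, v_max)          # headroom vanishes near the limits

# A mid-tone step (64 -> 128) gets boosted to 160, while a near-white
# step (192 -> 255) clips at 255 and gains nothing.
prev = np.array([[64, 192]])
tgt  = np.array([[128, 255]])
print(overdrive(prev, tgt))   # [[160. 255.]]
```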
- A technique has also been proposed for restraining the motion blur more effectively by changing the waveform of the applied voltage in accordance with a motion vector in each frame, on top of the overdrive technique (see, for example, JP-A No. 2005-43864).
- An issue with the overdrive technique is that a voltage high enough to accelerate the response of the liquid crystal may not be applicable, because of the limit of the applicable voltage range of the liquid crystal.
- The motion blur restraining effect may therefore not be sufficiently attained, for example, when the target voltage for a black display or a white display is near the limit of the voltage range (i.e. for tone variations in a high tone range or a low tone range).
- Further, if the frame rate is increased, the driving frequency of the display driver which drives the display device increases as well. This may result in issues of insufficient charging, an increase in the number of IC or connector terminals, an increase in substrate area, heat generation, an increase in EMI (Electromagnetic Interference), and an increase in cost.
- The present invention has been made in consideration of the above issues. It is desirable to restrain an increase in cost, reduce the Eye-Trace Integration effect, improve the response characteristic of image display across all tone variations, and restrain the motion blur, in an image processing device which processes externally input image data and outputs it to a hold-type display device, an image display system including the processing device, an image processing method and a program therefor.
- According to an embodiment of the present invention, there is provided an image processing device which processes externally input image data and outputs display image data to a hold-type display device, including: a motion vector detecting unit which detects a motion vector of the input image data; a response time information storage unit which stores response time information, representing the time from when a driving voltage is applied to the display device until the display device displays an image with a tone corresponding to the driving voltage, in association with a tone variation value; a compensation processing unit which compensates a pixel value in the image data for each pixel, in a frame which is one frame ahead of a frame to be displayed by the display device, based on the image data, the motion vector and the response time information; and an output unit which outputs the image data compensated by the compensation processing unit to the display device.
- the image processing device may further include an edge detecting unit which detects an edge from the input image data, based on the motion vector.
- the compensation processing unit may determine whether to perform a compensation process for the pixel value, in accordance with a detection result of the edge detecting unit.
- the compensation processing unit may decide whether to perform the compensation process in accordance with an edge direction of an edge part detected by the edge detecting unit.
- the compensation processing unit may decide to perform the compensation process, when it is determined that the edge part detected by the edge detecting unit is in a rise area from a low tone to a high tone based on the edge direction, and may decide not to perform the compensation process, when it is determined that the edge part is in a decay area from a high tone to a low tone based on the edge direction.
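A minimal sketch of this rise/decay decision, assuming a 1-D tone signal; the helper name and the sign-based edge test are illustrative, since the patent leaves the detection method to the edge detecting unit.

```python
import numpy as np

def edges_to_compensate(signal, motion_direction=+1):
    """Flag edges in a rise area (low tone -> high tone) for compensation.

    A decay edge (high -> low) is left uncompensated, mirroring the
    decision rule above: the liquid-crystal decay is fast enough already.
    motion_direction: +1 if the image moves toward higher indices.
    Element i of the result refers to the edge between pixels i and i+1.
    """
    diff = np.diff(signal.astype(np.int64))
    is_edge = diff != 0
    is_rise = (diff * motion_direction) > 0
    return is_edge & is_rise

sig = np.array([0, 0, 255, 255, 0, 0])
print(edges_to_compensate(sig))
# [False  True False False False]: only the 0->255 rise is flagged,
# not the 255->0 decay.
```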
- the compensation processing unit may include: a compensation range setting unit which sets a compensation range for compensating a pixel value in the image data based on the motion vector; a filter setting unit which sets a characteristic of a filter for compensating the pixel value in the image data so as to display an image with a tone corresponding to a tone set based on the image data when the display device displays the frame to be displayed, based on the image data, the motion vector and the response time information; and a filter processing unit which compensates a pixel value of the pixel within the compensation range by filtering the image data with a filter having the characteristic set by the filter setting unit, in the frame that is one frame ahead of the frame to be displayed by the display device.
- the image processing device having the compensation processing unit may further include an edge detecting unit which detects an edge from the input image data based on the motion vector.
- the compensation processing unit may further include a selecting unit which selects either one of the image data whose pixel value has been compensated by the filter processing unit and the image data whose pixel value has not been compensated by the filter processing unit, in accordance with a detection result of the edge detecting unit.
- the selecting unit may select either one of image data whose pixel value has been compensated and image data whose pixel value has not been compensated, in accordance with an edge direction of an edge part detected by the edge detecting unit.
- the selecting unit may select the image data whose pixel value has been compensated, when it is determined that the edge part detected by the edge detecting unit is in a rise area from a low tone to a high tone, and may select the image data whose pixel value has not been compensated, when it is determined that the edge part is in a decay area from a high tone to a low tone, based on the edge direction.
- the filter setting unit may change a number of taps of the filter in accordance with a motion vector value detected by the motion vector detecting unit.
- the filter may, for example, be a moving average filter.
- The compensation processing unit may further include an outside replacement unit which performs outside replacement on the input image data, using the maximum value and minimum value of the tone of the image data, and the filter processing unit may filter, with the filter, the image data processed by the outside replacement unit.
- The compensation processing unit may include: an interpolated image generating unit which generates interpolated image data corresponding to an interpolated image to be inserted between two continuous frames, based on the image data and the motion vector; a display timing information generating unit which generates display timing information representing a timing at which the interpolated image is to be displayed after a predetermined period of time, based on the response time information; and an image synthesizing unit which synthesizes the generated display timing information with the input image data.
- According to another embodiment, there is provided an image display system which includes an image processing device, processing externally input image data, and a hold-type display device, displaying the image data processed by and input from the image processing device, wherein: the image processing device includes a motion vector detecting unit which detects a motion vector of the input image data, a response time information storage unit which stores response time information, representing the time from when a driving voltage is applied to the display device until the display device displays an image with a tone corresponding to the driving voltage, in association with a tone variation value, a compensation processing unit which compensates a pixel value in the image data for each pixel, in a frame that is one frame ahead of a frame to be displayed by the display device, based on the input image data, the motion vector and the response time information, and an output unit which outputs the image data compensated by the compensation processing unit to the display device; and the display device includes an image display unit which displays an image corresponding to the image data input from the image processing device, and a display controlling unit which controls the image display unit.
- According to another embodiment, there is provided an image processing method for processing externally input image data and generating image data to be output to a hold-type display device, the method including the steps of: detecting a motion vector of the input image data; extracting response time information from a response time information storage unit which stores the response time information, representing the time from when a driving voltage is applied to the display device until the display device displays an image with a tone corresponding to the driving voltage, in association with a tone variation value; compensating a pixel value of the image data for each pixel, in a frame that is one frame ahead of a frame to be displayed by the display device, based on the input image data, the motion vector and the response time information; and outputting the compensated image data to the display device.
- According to another embodiment, there is provided a program for causing a computer to function as an image processing device which processes externally input image data and outputs it to a display device performing hold-type driving, including: a motion vector detecting function which detects a motion vector of the input image data; a response time storage function which stores response time information, representing the time from when a driving voltage is applied to the display device until the display device displays an image with a tone corresponding to the driving voltage, in association with a tone variation value; a compensation processing function which compensates a pixel value in the image data for each pixel, in a frame that is one frame ahead of a frame to be displayed by the display device, based on the input image data, the motion vector and the response time information; and an outputting function which outputs the compensated image data to the display device.
- According to the image processing device, image display system, image processing method and program of the present invention, in a so-called hold-type display device it is possible to restrain the motion blur, such as blurring at the leading edge, tailing at the trailing edge, and delay in the perceived position of a moving object caused by the Eye-Trace Integration effect, thus improving the quality of a moving image.
- Moreover, the motion blur can be sufficiently restrained for tone variations in regions other than the half-tone region, which may not be improved using the overdrive technique.
- In a display device using display elements with a slow response speed, the difference in response times due to tone variation is greater, resulting in a greater motion blur restraining effect.
- FIG. 1 is an explanatory diagram showing an example of a response waveform of a liquid crystal, when a pulse signal is input to a general VA mode liquid crystal;
- FIG. 2 is an explanatory diagram for explaining an example of the relationship between Eye-Trace Integration Effect and a motion blur in a hold-type display device;
- FIG. 3 is an explanatory diagram for explaining an example of the relationship between Eye-Trace Integration Effect and a motion blur in a hold-type display device
- FIG. 4 is an explanatory diagram for explaining an example of the relationship between Eye-Trace Integration Effect and a motion blur in a hold-type display device
- FIG. 5 is an explanatory diagram for explaining an example of the relationship between Eye-Trace Integration Effect and a motion blur in a hold-type display device
- FIG. 6 is an explanatory diagram schematically showing an example of an image processing method in an image processing device according to the present invention.
- FIG. 7A is an explanatory diagram showing an example of an operation waveform when a step waveform is input to a hold-type display device
- FIG. 7B is an explanatory diagram showing an example of an operation waveform when a step waveform is input to a hold-type display device
- FIG. 7C is an explanatory diagram showing an example of an operation waveform when a step waveform is input to a hold-type display device
- FIG. 7D is an explanatory diagram showing an example of an operation waveform when a step waveform is input to a hold-type display device
- FIG. 8A is an explanatory diagram showing an example of an input signal that is input to an image processing device according to the present invention.
- FIG. 8B is an explanatory diagram showing an example of an output signal that is output from an image processing device according to the present invention.
- FIG. 8C is an explanatory diagram showing an example of an output signal that is output from an image processing device according to the present invention.
- FIG. 9 is an explanatory diagram showing a variation in the spatial direction of the intensity of light accumulated on the retina of a user who has seen a hold-type display device displaying an image based on an output signal that is output from an image processing device according to the present invention
- FIG. 10 is a block diagram showing a functional configuration of an image processing device according to an embodiment of the present invention.
- FIG. 11 is a block diagram showing a functional configuration of a display device according to the embodiment.
- FIG. 12 is a block diagram showing a functional configuration of a compensation processing unit according to the embodiment.
- FIG. 13 is an explanatory diagram for explaining functions of a high frequency detecting unit according to the embodiment.
- FIG. 14 is an explanatory diagram showing an example of setting filter characteristics in accordance with a filter setting unit according to the embodiment.
- FIG. 15 is an explanatory diagram showing an example of setting filter characteristics in accordance with the filter setting unit according to the embodiment.
- FIG. 16 is a block diagram showing a hardware configuration of the image processing device according to the embodiment.
- FIG. 17 is a flowchart showing the processing flow of an image processing method according to the embodiment.
- FIG. 18 is a flowchart showing a concrete example of a compensation method according to the embodiment.
- An image processing device will be described below as a device for improving the motion blur in a hold-type display device, such as a liquid crystal display device or the like.
- In such a display device, an object in motion may exhibit motion blur, such as blurring at the leading edge, tailing at the trailing edge, and a delay in the perceived position.
- The cause of the above has conventionally been considered to be a delay in the response time of a display element, such as a liquid crystal or the like.
- An overdrive technique has been utilized as a means for improving the motion blur in the hold-type display device. This overdrive technique makes it possible to accelerate the response of the display element, such as a liquid crystal or the like.
- However, the delay in the response time of the display element is not the only cause of the motion blur in the hold-type display device.
- Another major cause is the Eye-Trace Integration Effect, that is, the afterglow characteristic of the human retina when tracking a moving image.
- Accordingly, the motion blur in the hold-type display device has not been sufficiently restrained using only a general overdrive technique, which considers only the delay in the response time of the display element, such as a liquid crystal or the like.
- Conversely, the motion blur in the hold-type display device can be sufficiently restrained when the overdrive technique is utilized with consideration of not only the response time of the liquid crystal but also the Eye-Trace Integration Effect.
- In addition, the overdrive technique achieves its effect of accelerating the response time of the display element for tone variations in the half-tone region.
- A sufficiently high voltage may not be applicable to the display element when the target voltage for a white display or a black display is near the limit of the applicable voltage range.
- In such a case, the response may not be completed within one frame using only the overdrive technique.
- FIG. 1 is an explanatory diagram showing an example of a response waveform of a liquid crystal, when a pulse signal is input to the general VA mode liquid crystal.
- the vertical axis represents the tone of the liquid crystal, while the horizontal axis represents the time.
- A solid line represents the response waveform L of the liquid crystal generated when a pulse signal P of one frame period, represented by a broken line, is input to the general VA mode liquid crystal.
- At the rise, the liquid crystal responds along a VT curve, causing a delay between signal input and the response to the signal.
- At the decay, the liquid crystal does not respond along the VT curve, and thus causes little delay.
- As shown in a region U enclosed by a broken-line circle in FIG. 1, it is evident that a long delay occurs in the response time at the rise from a low tone (e.g. level 0). It is also evident that, at the rise, a large difference in response times occurs between different tone differences at the time of inputting a signal.
- The present inventors have further examined the relationship between the Eye-Trace Integration Effect and the motion blur in the hold-type display device. They have found that the motion blur can effectively be restrained in the hold-type display device by controlling the application of a driving voltage in accordance with the response time of the display element, such as a liquid crystal, based on the difference in response times between tones. As a result, they have arrived at the present invention.
- FIG. 2 to FIG. 5 are explanatory diagrams each for explaining an example of the relationship between the Eye-Trace Integration Effect and the motion blur in the hold-type display device.
- a liquid crystal display device will be described as a hold-type display device by way of example.
- a predetermined one of a plurality of pixels included in a frame or field (hereinafter simply referred to as a “frame” for easy explanation) corresponds to each of the display elements (liquid crystals in this embodiment) included in the display screen of the liquid crystal display device.
- For simplicity, it is assumed that the Eye-Trace Integration corresponds to exactly one frame.
- It is also assumed that the brightness change at the edge of the image (edge part) is vertical.
- Whether the improvement of the motion blur in the hold-type display device has reached a target quality can be determined by whether an effect equal to or greater than the Eye-Trace Integration Effect of an LCD driven at 120 Hz (a double-speed operation of a 60 Hz driving system) is obtained. Criteria for the target quality include: the steepness of the perceptual boundary (leading edge and trailing edge) in the Eye-Trace Integration; and the delay of the half-value point (half of the maximum brightness) of the attained brightness.
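These two criteria can be evaluated numerically on an Eye-Trace Integration profile. The sketch below assumes a 10%-90% width as the measure of steepness; that is a common convention, not one stated in the patent.

```python
import numpy as np

def target_quality_metrics(retina_profile, pixels_per_frame):
    """Evaluate the two criteria above on an Eye-Trace Integration result.

    retina_profile: 1-D accumulated light intensity per retina position
    (cf. the lower illustrations of FIG. 2-5), rising from dark to bright.
    Returns (edge_width, half_value_delay): the width of the perceptual
    boundary in pixels (10%-90%, an assumed definition of steepness)
    and the delay of the half-value point in frames.
    """
    peak = retina_profile.max()
    lo = np.argmax(retina_profile >= 0.1 * peak)
    hi = np.argmax(retina_profile >= 0.9 * peak)
    half = np.argmax(retina_profile >= 0.5 * peak)
    return hi - lo, half / float(pixels_per_frame)

# A profile blurred over 8 pixels (cf. FIG. 3) scores worse than one
# blurred over 4 pixels (cf. FIG. 2, the ideal hold-type device).
blur8 = np.clip(np.arange(16) / 8.0, 0, 1)
print(target_quality_metrics(blur8, pixels_per_frame=4))   # (7, 1.0)
```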
- FIG. 2 to FIG. 5 show an example of a case wherein an image of stepwise changes moves by 4 pixels per frame from left to right on the display screen of the liquid crystal display device.
- the upper illustrations of FIG. 2 to FIG. 5 show a waveform of an input image signal input to the liquid crystal display device.
- the middle illustrations of FIG. 2 to FIG. 5 show a time transition of an output level (brightness) of the liquid crystal, when the image based on the input image signal of the upper illustrations is displayed on the liquid crystal display device.
- the lower illustrations thereof show the intensity of light (i.e. the Eye-Trace Integration Effect) introduced to the retina of a user's eye, when the user (a person) sees an image displayed on the liquid crystal display device.
- the intensity of light i.e. the Eye-Trace Integration Effect
- the position in a horizontal direction represents a position (in the spatial direction) of each of pixels included in each frame.
- The position in the vertical downward direction represents the time transition.
- one liquid crystal corresponds to one pixel
- the intensity of gray tone represents an output level of each liquid crystal
- reference symbols "0F" and "1F" identify the number of each frame.
- In the lower illustrations, the position in the horizontal direction represents the position (in the spatial direction) on the retina of the user's eye at the point of time tb in the middle illustration.
- The position in the vertical upward direction represents the intensity of light introduced to the retina of the user's eye. That is, areas S1, S2, S3 and S4 correspond to integration results of the intensity of light at the positions on the retina of the user's eye, and are the results of the Eye-Trace Integration.
- Oblique arrows toward the lower right represent the movement of the user's eye.
- A predetermined level of light, output from the liquid crystal at the position through which each oblique arrow passes, enters the user's retina at each moment between a time ta and the time tb.
- The light entering at each moment is sequentially accumulated on the user's retina.
- The intensity of the accumulated light (the integrated value of the level of the entered light) is perceived at the point of time tb.
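The accumulation along the oblique arrows can be written down directly: integrate the panel output along the eye's trajectory over one frame. The following sketch makes the same assumptions as the figures (constant eye velocity, one display element per pixel); nearest-neighbour sampling is an added simplification.

```python
import numpy as np

def eye_trace_integration(output, velocity):
    """Integrate panel output along the tracking eye's trajectory.

    output: 2-D array, output[t, x] = brightness of pixel x at time step t,
            sampled over one frame period.
    velocity: pixels moved per frame (the eye tracks at this speed).
    Returns the light accumulated at each retina position, i.e. the
    areas S1..S4 of FIG. 2-5.
    """
    n_t, n_x = output.shape
    retina = np.zeros(n_x)
    for t in range(n_t):
        # The eye position advances linearly within the frame; sample the
        # pixel currently under each retina position (nearest neighbour).
        shift = int(round(velocity * t / n_t))
        retina += np.roll(output[t], -shift)
    return retina / n_t

# Example: a static white block held for one frame while the eye moves
# 4 pixels; the accumulated edge is smeared over about 4 retina positions.
frame = np.zeros((8, 12)); frame[:, :6] = 255.0
print(eye_trace_integration(frame, velocity=4).round())
```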
- FIG. 2 shows the relationship between the Eye-Trace Integration Effect and the motion blur, when an input image signal having the waveform shown in the upper illustration (an input image signal corresponding to frame 1F of the illustration) is input at the time tb to an ideal hold-type display device whose display element (e.g. a liquid crystal) has a response time of 0.
- In this case, the response time for a step input is 0.
- the output level of the liquid crystal instantaneously reaches the brightness (target brightness) corresponding to the input image signal, thus realizing a quick response of the liquid crystal.
- However, the Eye-Trace Integration Effect occurs even in the ideal hold-type device, resulting in a motion blur of four pixels, corresponding to the movement speed of the input image of stepwise changes.
- FIG. 3 shows the relationship between the Eye-Trace Integration Effect and the motion blur, when an input image signal having the waveform shown in the upper illustration (an input image signal corresponding to frame 1F of the illustration) is input at the time tb to a general liquid crystal display device (LCD).
- A general LCD has a slow response speed for a step input, with a response time of about one frame until the target brightness is reached.
- Moreover, the LCD performs hold-type driving, thus causing the Eye-Trace Integration Effect.
- The Eye-Trace Integration Effect is added to the delay based on the response speed of the liquid crystal. This results in a motion blur of eight pixels, twice the movement speed of the input image of stepwise changes.
- FIG. 4 shows the relationship between the Eye-Trace Integration Effect and the motion blur, when an input image signal (an input image signal corresponding to frame 1F in the illustration) having the waveform shown in the upper illustration is input at the time tb to an LCD which performs a double-speed operation (doubling the motion picture display frequency). That is, this LCD displays an image interpolated based on a motion vector in one of two sub-fields into which one frame is divided.
- FIG. 5 shows an example of the relationship between the Eye-Trace Integration Effect and the motion blur, when an input image signal having the waveform shown in the upper illustration (an input image signal corresponding to frame 1F in the illustration) is input at the time tb to the image processing device according to the present invention, and the processed signal is displayed on the hold-type display device.
- response time information is stored in association with the brightness change.
- This response time information represents the time from when a driving voltage for displaying an image with a target brightness is applied to the hold-type display device until the display device displays an image with the brightness corresponding to this driving voltage.
- The image processing device compensates the brightness value of each pixel included in the frame to be displayed, in the frame (0F in this embodiment) ahead of the frame (1F in this embodiment) to be displayed, i.e. at the time ta, based on the response time information and a motion vector of the input image. This compensation is performed such that each pixel reaches the target brightness in the frame (1F) to be displayed.
- Specifically, the image processing device adjusts the voltage applied to the liquid crystal corresponding to each pixel at the point of 0F, and adjusts the output level of the liquid crystal for each pixel (see the stair-like part of the output level of the liquid crystal at the point of 0F). As a result, each pixel in the frame (1F) to be displayed reaches the target brightness.
- When the processed image is displayed on the hold-type display device by the image processing device according to the present invention, a greater motion blur restraining effect is attained than that of the LCD performing the double-speed operation.
- In the LCD performing the double-speed operation, an interpolated image is synthesized with the input image, thereby dividing the frame into a plurality of sub-fields so as to increase the frame rate and reduce the hold time, and thus restrain the motion blur.
- In the present invention, by contrast, the interpolation is performed in the spatial direction rather than the time direction, based on a motion vector, and the interpolation result is converted from a spatial variation into a time variation based on the response time information, thereby obtaining the effect of a pseudo increase in frame rate.
- Thus, in the hold-type display device, the motion picture response characteristic is improved, and the motion blur can be restrained.
- FIG. 6 is an explanatory diagram schematically showing an example of the image processing method in the image processing device according to the present invention.
- First, the image processing device 100 compares the input image data corresponding to the input frame to be displayed with the image data corresponding to the frame one frame ahead of the frame to be displayed, stored in a memory 5-1 of the image processing device 100, so as to detect a motion vector of the input image (S11).
- The detected motion vector is used in the next step (S13) for generating an interpolated image.
- The detected motion vector is also used in the following compensation process and overdrive process, and may be stored in the memory 5-1 as needed.
- Next, the image processing device generates an interpolated image to be inserted between the frame to be displayed and the frame one frame ahead thereof, based on the motion vector detected in step S11 (S13).
- As a result, the motion picture display frequency is doubled (from 60 Hz to 120 Hz in a general LCD).
- The generated interpolated image is used in the next compensation step (S15).
- The generated interpolated image may also be stored in the memory 5-1.
- This interpolated image generating step (S13) is not an indispensable step in this embodiment. Even if the motion picture display frequency (frame rate) is not increased, a motion blur restraining effect can sufficiently be attained in the hold-type display device by performing the compensation step (S15), as described below.
- Next, the image processing device generates compensation information for displaying the interpolated image generated in step S13 after a predetermined period of time, so that an image with the target brightness is displayed in the frame to be displayed, based on the motion vector detected in step S11 and the response time information stored in a lookup table (LUT) 5-2.
- The image processing device synthesizes this compensation information with the input image data, so as to generate compensated image data whose pixel values have been compensated (S15).
- The generated compensated image data is used in the next overdrive process (S17). This compensation process step (S15) is performed in the frame ahead of the frame to be displayed.
- If step S13 is not performed (i.e. no interpolated image is generated), the image processing device directly obtains the compensated pixel values for displaying the image of the target brightness in the frame to be displayed, without using an interpolated image, in step S15, based on the motion vector detected in step S11 and the response time information stored in the lookup table (LUT) 5-2. After that, the image processing device generates the compensated image data based on the obtained compensated pixel values.
- Finally, the image processing device performs an overdrive process on the compensated image data, using the input image data stored in the memory 5-1 and the compensated image data generated in step S15 (S17). As a result, display image data to be displayed on the hold-type display device is generated.
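Putting FIG. 6 together, the following skeleton shows how steps S11-S17 chain. Every function body is a placeholder standing in for the corresponding unit described above (motion vector detection, interpolation, LUT-based compensation, overdrive), not the patent's actual implementation.

```python
import numpy as np

def detect_motion_vector(prev, curr):                        # S11
    """Placeholder: real systems use block matching or an MPEG decoder."""
    return np.array([4, 0])                                  # e.g. 4 dot/v to the right

def generate_interpolated_image(prev, curr, mv):             # S13 (optional)
    """Placeholder midpoint blend standing in for motion-compensated interpolation."""
    return (prev.astype(np.float64) + curr) / 2.0

def compensate(curr, mv, response_lut):                      # S15
    """Placeholder: would pre-adjust pixel values one frame ahead, using the
    response-time LUT 5-2 so each pixel reaches its target tone on time."""
    return curr.astype(np.float64)

def overdrive(prev, compensated):                            # S17
    """Simple overdrive: overshoot in the direction of the tone change."""
    return np.clip(compensated + 0.5 * (compensated - prev), 0, 255)

def process_frame(memory, curr, response_lut):
    """One pass of FIG. 6; the memory dict plays the role of memory 5-1."""
    prev = memory["frame"]
    mv = detect_motion_vector(prev, curr)
    _interp = generate_interpolated_image(prev, curr, mv)    # consumed inside S15
    comp = compensate(curr, mv, response_lut)
    out = overdrive(prev, comp)
    memory["frame"] = curr                                   # update memory 5-1
    return out
```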
- FIG. 7A to FIG. 7D are explanatory diagrams each showing an example of an operation waveform when a step waveform is input to the hold-type display device.
- the vertical direction indicates the brightness of each pixel included in a frame
- the horizontal direction indicates the position of each pixel (spatial direction) included in a frame.
- the areas partitioned by broken lines are referred to as units each including a plurality of pixels (four pixels in this embodiment).
- FIG. 7A shows a waveform of a step signal input to a general LCD.
- the input step signal has an edge part on the right end of an N-th unit. Note that the height of this edge is the target brightness in the frame to be displayed.
- FIG. 7B shows an operation waveform when a step signal is input to an LCD that adopts an overdrive system.
- a voltage greater than a target voltage for displaying the image of the target brightness on the display device is applied, for example, in the frame wherein an input variation first occurs, so as to accelerate a brightness transition.
- the brightness is greater than the target brightness.
- the N-th unit has an even brightness that is greater than the target brightness, as a whole (each of the pixels included in the N-th unit has an equal brightness).
- FIG. 7C shows an operation waveform when a step signal is input to an LCD adopting a system for adjusting the applied voltage based on a motion vector when performing an overdrive operation, as described in Patent Document 1 (JP-A No. 2005-43864).
- the motion vector of an input image is detected when applying a voltage greater than a target voltage, and a voltage to be applied for each pixel is adjusted based on the detected motion vector.
- the motion blur restraining effect can be improved in the hold-type display device, as compared with a general overdrive system.
- FIG. 7D shows an example of an operation waveform when a step signal is input to the image processing device according to the image processing method of the present invention.
- the brightness value of each pixel constituting the frame to be displayed is compensated in a frame that is one frame ahead of the frame to be displayed based on the response time information and the motion vector of the input image. This compensation is so performed that a target brightness is attained in each pixel in the frame to be displayed.
- FIG. 7D shows the operation waveform when an overdrive system is adopted with consideration of the motion vector.
- the overdrive system may be adopted only as needed, and is not necessarily adopted.
- FIG. 8A is an explanatory diagram showing an example of the input signal input to the image processing device according to the present invention.
- FIG. 8B and FIG. 8C are exemplary diagrams each showing an example of the output signal output from the image processing device according to the present invention.
- FIG. 9 is an exemplary diagram showing a variation in the spatial direction of the intensity of light accumulated on the retina of a user who has seen the hold-type display device displaying an image based on the output signal output from the image processing device according to the present invention.
- the position in the horizontal direction shows the position of each pixel (in the spatial direction) constituting the frame, while the position in the vertical direction shows the brightness level output from the display device.
- the areas partitioned by broken lines represent pixels constituting the frames.
- The signal of the step waveform having edge parts shown in FIG. 8A is input to the image processing device. As described above, this step signal moves from left to right in the illustration at a speed of 4 dot/v. Before this step signal is input, a black display is given on the display device, and the display shifts to a white display upon input of this step signal.
- As shown in FIG. 8B, a voltage is applied in advance to the rise part so that the brightness level gradually decreases, in accordance with the response characteristics of the liquid crystal, in order to attain a smooth rise of the hold-type display element (liquid crystal or the like) (compensation process).
- This process is very important particularly in the rise from a black display.
- the range in which a voltage is applied in advance is determined based on a motion vector value.
- a voltage is applied in advance in the pixel range of 4 dots corresponding to the motion vector value (4 dot/v).
- a voltage value to be applied may be set for each pixel.
- a voltage may be applied so that the brightness level gradually decreases in a stair-like manner, or may be applied so that the brightness level gradually decreases in a straight line rather than in a stair-like manner.
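A sketch of the pre-applied, stair-like waveform of FIG. 8B: over the pixel range given by the motion vector (4 dots here), the drive level steps down from the white level ahead of the rising edge. The linear stair and the 0-255 levels are illustrative assumptions.

```python
import numpy as np

def precompensate_rise(frame_line, edge_x, mv_dots, lo=0, hi=255):
    """Pre-apply a descending stair ahead of a rising edge.

    frame_line: 1-D line of the frame one frame ahead of display.
    edge_x: index of the rising edge; mv_dots: motion vector (dots/frame).
    The mv_dots pixels in front of the edge are driven in advance with
    levels stepping down from hi toward lo, so the slow rise of the
    liquid crystal can complete by the time the edge arrives there.
    """
    out = frame_line.astype(np.float64).copy()
    for k in range(1, mv_dots + 1):
        x = edge_x + k
        if 0 <= x < out.size:
            out[x] = hi - (hi - lo) * k / (mv_dots + 1)  # stair-like descent
    return out

line = np.zeros(12); line[:4] = 255          # white block moving right at 4 dot/v
print(precompensate_rise(line, edge_x=3, mv_dots=4))
# stair levels 204, 153, 102, 51 ahead of the edge
```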
- FIG. 8C shows an operation waveform when the overdrive technique disclosed in Patent Document 1 is adopted for compensated image data in the image processing device according to the present invention.
- a cone-shaped signal is output, upon application of the overdrive technique.
- As a result, the applied voltage will be greater than the voltage value applied in advance for the compensation process.
- As shown in FIG. 8C, the brightness level is entirely greater than in the case of FIG. 8B (the case of only the compensation process of the present invention).
- Although the delay in the display is a little shorter in this case, the delay still occurs, and thus the motion blur restraining effect is not sufficiently attained.
- As shown in FIG. 9, the brightness level of the light accumulated on the user's retina reaches the brightness level of the input step signal, as shown by the curved solid line.
- Moreover, the variation in the brightness level is steep rather than gentle.
- Thus, the Eye-Trace Integration Effect can be sufficiently restrained, and a great motion blur restraining effect is realized in the hold-type display device.
- FIG. 10 is a block diagram showing the functional configuration of the image processing device 100 constituting the image display system 10 according to this embodiment.
- FIG. 11 is a block diagram showing a functional configuration of a display device 200 included in the image display system 10 according to this embodiment.
- the image display system 10 includes the image processing device 100 and the hold-type display device 200 .
- the image processing device 100 processes externally input image data so as to output image data to be displayed.
- The display device 200 actually displays an image, based on the display image data input from the image processing device 100.
- The "system" here refers to an entity including a plurality of logically aggregated devices (functions), regardless of whether each of the plurality of devices (functions) is included in the same casing.
- Accordingly, the image processing device 100 and the display device 200 constituting the image display system 10 may be incorporated together so as to be handled as one unit, or the display device 200 may be handled as a single casing. The functional configuration of each of the image processing device 100 and the display device 200 constituting this image display system 10 will now be described in detail.
- the image processing device 100 includes an input image data storage unit 110 , a motion vector detecting unit 120 , a response time information storage unit 130 , a compensation processing unit 140 and an output unit 160 .
- The input image data storage unit 110 stores input image data that is externally input to the image processing device 100, in association with each of a plurality of continuous frames. More specifically, for example, when input image data for displaying an image in a frame to be displayed is input to the image processing device 100, the data is stored in the input image data storage unit 110. When input image data for displaying an image in the next frame to be displayed is input to the image processing device 100, the input image data of the preceding frame remains stored, and is used by the motion vector detecting unit 120 for detecting the motion vector.
- The input image data stored in the input image data storage unit 110 may be deleted sequentially, oldest data first, as needed, for example.
- the motion vector detecting unit 120 extracts, for example, input image data in a frame that is one frame ahead of the frame to be displayed from the input image data storage unit 110 .
- The motion vector detecting unit 120 compares the input image data in the frame to be displayed with the input image data in the frame one frame ahead thereof, identifies an object moving in the displayed image, and detects a motion vector of the input image data in the frame to be displayed based on the movement direction and distance of this object.
- the motion vector detecting unit 120 may be one constituent element of the image processing device 100 , or may be one constituent element of an external unit of the image processing device 100 , such as an MPEG decoder, an IP converter, or the like. In the latter case, the motion vector of the input image data is detected separately by the external unit of the image processing device 100 , and is input to the image processing device 100 .
- The response time information storage unit 130 stores the time from when a driving voltage is applied to the display device 200 until the display device 200 displays an image with a tone corresponding to the driving voltage (i.e. response time information representing the response time of the hold-type display device), in association with the tone variation value of the display device 200.
- the response time information may be stored in the response time information storage unit 130 in the form of, for example, a lookup table (LUT), in a manner that the tone variation value and the response time of the display element are stored in association with each other.
- Alternatively, a function indicating the relationship between the tone variation value and the response time of the display element may be obtained in advance, and this function may be stored in the response time information storage unit 130.
- In this case, the input image data in the frame to be displayed is compared with the input image data in the frame ahead of the frame to be displayed, so as to obtain the tone variation value for each pixel.
- The obtained tone variation value is then converted into response time information using the function stored in the response time information storage unit 130.
- Such a function can be realized with hardware, such as a RAM, ROM, or the like.
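One possible shape for the response time information, matching the LUT form mentioned above; the bin count and the table values below are invented for illustration, since real values would be measured for the particular panel.

```python
import numpy as np

# Hypothetical LUT: response time (in frames) indexed by (start tone, end tone),
# coarsely quantized into 8 tone bins.
N_BINS = 8
response_lut = np.full((N_BINS, N_BINS), 0.5)
response_lut[0, :] = 1.5            # rises from black are slowest (region U, FIG. 1)
np.fill_diagonal(response_lut, 0.0) # no transition, no delay

def response_time(start_tone, end_tone):
    """Look up the response time for a tone transition (tones 0-255)."""
    i = min(start_tone * N_BINS // 256, N_BINS - 1)
    j = min(end_tone * N_BINS // 256, N_BINS - 1)
    return response_lut[i, j]

print(response_time(0, 255))   # 1.5 frames: slow rise from black
print(response_time(128, 64))  # 0.5 frames: decay is comparatively fast
```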
- the compensation processing unit 140 compensates the pixel value of the input image data, for each pixel included in one frame, in the frame that is one frame ahead of the frame to be displayed, based on the input image data extracted from the input image data storage unit 110 , the motion vector detected by the motion vector detecting unit 120 , and the response time information extracted from the response time information storage unit 130 . As a result of this compensation, the image data to be displayed is generated, and the generated image data is output to the output unit 160 .
- the compensation processing unit 140 may include an interpolated image generating unit (not illustrated), a display-timing information generating unit (not illustrated) and an image synthesizing unit (not illustrated).
- The interpolated image generating unit generates an interpolated image to be inserted between input frames, based on the input image data and the motion vector.
- The display-timing information generating unit generates display-timing information representing a timing at which the interpolated image is displayed after a predetermined period of time, based on the response time information.
- The image synthesizing unit synthesizes the generated display-timing information with the input image data.
- the interpolated image generating unit generates an interpolated image in a spatial direction rather than a time direction, based on the motion vector.
- The display-timing information generating unit can convert the interpolated image into display-timing information based on the difference between the response times of the display elements in accordance with the display tone variation, thereby converting from the spatial direction to the time direction.
- Accordingly, the same effect as when an interpolated image in the time direction is generated (i.e. the effect of a pseudo increase in frame rate) can be attained, using an interpolated image in the spatial direction, which can easily be generated based on the motion vector.
- Alternatively, the pixel value may be compensated directly using a spatial filter, such as a moving average filter or the like, without generating the interpolated image.
- The output unit 160 receives the display image data input from the compensation processing unit 140, and outputs it to the display device 200.
- FIG. 12 is a block diagram showing the functional configuration of the compensation processing unit 140 according to this embodiment.
- the compensation processing unit 140 includes a compensation range setting unit 141 , a maximum/minimum value detecting unit 142 , an edge detecting unit 143 , a high frequency detecting unit 144 , an outside replacement unit 145 , a filter setting unit 146 , a filter processing unit 147 , a gain adjusting unit 148 , a selecting unit 149 and a synthesizing unit 150 .
- the compensation range setting unit 141 sets a compensation range for compensating the pixel value in the input image data, based on a motion vector input from the motion vector detecting unit 120 . Specifically, the compensation range setting unit 141 detects an area where there is a movement in the input image data (a part corresponding to a moving object), and sets pixels in the area of the movement as a compensation range. The unit transmits information regarding the set compensation range and information regarding the input motion vector to the maximum/minimum value detecting unit 142 , the edge detecting unit 143 , the high frequency detecting unit 144 and the filter setting unit 146 .
- the maximum/minimum value detecting unit 142 detects the maximum and minimum values of the input image data (input signal) within the compensation range, based on the information regarding the compensation range transmitted from the compensation range setting unit 141 .
- the information regarding the maximum and minimum values of the detected input signal is transmitted to the edge detecting unit 143 and the outside replacement unit 145 .
- the edge detecting unit 143 detects an edge part(s) in the input image data (input signal), based on the information regarding the compensation range transmitted from the compensation range setting unit 141 , the information regarding the input motion vector and the information regarding the maximum/minimum values of the input signal transmitted from the maximum/minimum value detecting unit 142 .
- This edge detecting unit 143 detects not only the position of the edge (edge part), but also the edge direction of the edge part (whether the variation direction is from a low tone to a high tone, or from a high tone to a low tone). From this edge direction, it can be determined whether the response of the display element is at the rise or the decay. Information regarding the detected edge part and edge direction is transmitted to the selecting unit 149.
- the high frequency detecting unit 144 detects a high-frequency signal having spatial frequency of the input image data within the compensation range, based on the information regarding the compensation range transmitted from the compensation range setting unit 141 .
- Here, a high-frequency signal indicates a signal whose half wavelength (1/2 wavelength) falls within a range narrower than the compensation range, as shown in FIG. 13. That is, the unit detects a signal whose wavelength is shorter than twice the compensation range as a high-frequency signal. This is because, in the case of a high-frequency signal, both a rise area and a decay area exist in the compensation range, interfering with adequate processing.
- the detected high-frequency signal is output to the gain adjusting unit 148 , and is used for the gain adjusting after the process performed by the filter processing unit 147 .
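The half-wavelength criterion of FIG. 13 amounts to checking whether both edge directions occur inside the compensation range. A sketch under that reading:

```python
import numpy as np

def has_high_frequency(window):
    """True if both a rise and a decay occur inside the compensation range.

    A signal whose half wavelength fits inside the compensation range
    (wavelength < 2x the range) has both edge directions present, so
    compensation is suppressed (via the gain adjusting unit) rather
    than applied blindly.
    """
    diff = np.diff(window.astype(np.int64))
    return (diff > 0).any() and (diff < 0).any()

print(has_high_frequency(np.array([0, 255, 255, 255])))   # False: rise only
print(has_high_frequency(np.array([0, 255, 0, 255])))     # True: rise and decay
```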
- the outside replacement unit 145 performs outside replacement for the input image data (input signal) using its maximum and minimum values, based on the information regarding the maximum and minimum values of the input signal transmitted from the maximum/minimum value detecting unit 142 .
- the input image data (input signal) after replaced is transmitted to the filter processing unit 147 .
- The filter setting unit 146 sets a characteristic(s) of the spatial filter for compensating the pixel value in the input image data, in such a manner that an image with the tone set based on the input image data is displayed when the display device 200 displays the frame to be displayed. This setting is done based on the input image data, the information regarding the compensation range and the motion vector transmitted from the compensation range setting unit 141, and the response time information extracted from the response time information storage unit 130.
- The filter characteristics are applied only to the pixels within the compensation range.
- The spatial filter of this embodiment may be a moving average filter, such as a low-pass filter (LPF) or the like.
- The filter characteristics according to this embodiment include, for example, the area to be filtered and the number of taps of the filter. Such filter characteristics can be realized by appropriately setting the filter coefficients of the filter matrix. Information regarding the set filter characteristics is transmitted to the filter processing unit 147.
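For reference, an n-tap moving average corresponds to uniform filter coefficients summing to one; the following sketch (illustrative only, not the concrete filter matrix of this embodiment) shows the low-pass effect on a step signal:

```python
import numpy as np

def moving_average_kernel(taps: int) -> np.ndarray:
    """Uniform filter coefficients that realize an n-tap moving average."""
    return np.full(taps, 1.0 / taps)

def apply_filter(signal: np.ndarray, taps: int) -> np.ndarray:
    # note: np.convolve zero-pads at the array ends
    return np.convolve(signal.astype(float),
                       moving_average_kernel(taps), mode="same")

step = np.array([0, 0, 0, 0, 255, 255, 255, 255])
print(apply_filter(step, taps=3).round(1))
# the vertical edge becomes a short ramp (a low-pass effect)
```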
- FIG. 14 and FIG. 15 are explanatory diagrams each showing an example of setting the filter characteristics by the filter setting unit 146 according to this embodiment.
- FIG. 14 shows an example of setting different filter characteristics between the rise and the decay of the display element (liquid crystal or the like). In this example, the filter is applied only to the rise area of the edge.
- FIG. 14 shows, as input signals, an example of four kinds of step signals that move from left to right in the illustration. The signals differ from each other in maximum value (maximum brightness), minimum value (minimum brightness), and edge height (difference between the maximum and minimum values). The values "255" and "0" indicate the brightness values of each pixel.
- As described above, the filter may be applied only to the rise area of the edge. The filter setting unit 146 acquires information regarding the edge direction detected by the edge detecting unit 143, and determines from the direction of the tone variation in the edge part whether the edge is at the rise area or the decay area. The setting unit can then set filter characteristics that are applied only when the edge is determined to be at the rise area.
- FIG. 15 shows an example of setting the number of taps of the spatial filter in accordance with the motion vector value of the input image data. In this example, the number of taps of the filter is changed in proportion to the motion vector value.
- In FIG. 15, four kinds of step signals that move from left to right with different movement values (motion vector values) are given as input signals. From left to right in the illustration, the given step signals are those of: a still image (movement value 0 dot/v); a movement value of 2 dot/v; a movement value of 4 dot/v; and a movement value of 6 dot/v. The values "255" and "0" in FIG. 15 indicate the brightness values of each pixel.
- The filter setting unit 146 sets filter characteristics including a number of taps equal to the motion vector value (number of pixels) of the input image data (e.g. the number of taps is "2" if the movement value is 2 dot/v). Accordingly, the greater the motion vector value of the input image signal (the higher the movement speed), the greater the number of taps of the filter, and the more precisely the compensation process of the pixel value can be performed. Thus, according to the image processing device 100 of this embodiment, the greater the motion vector value of the input image data, the more effectively the motion blur can be restrained in the hold-type display device 200.
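A sketch of this tap-count rule, using the mapping from the example above (the helper function is hypothetical):

```python
import numpy as np

def filtered_edge(step: np.ndarray, movement_dots_per_frame: int) -> np.ndarray:
    """Spread a moving edge over as many pixels as it moves per frame."""
    taps = max(abs(int(movement_dots_per_frame)), 1)  # e.g. 2 dot/v -> 2 taps
    kernel = np.full(taps, 1.0 / taps)
    return np.convolve(step.astype(float), kernel, mode="same")

step = np.array([0, 0, 0, 0, 0, 0, 255, 255, 255, 255, 255, 255])
print(filtered_edge(step, 0).round(1))  # still image: left unchanged
print(filtered_edge(step, 4).round(1))  # 4 dot/v: edge spread over ~4 pixels
```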
- The filter processing unit 147 filters the outside-replaced input image data transmitted from the outside replacement unit 145, using a filter having the filter characteristics set by the filter setting unit 146, in the frame that is one frame ahead of the frame to be displayed by the display device 200. By so doing, the pixel values of the pixels in the compensation range are compensated. The input image data whose pixel values have been compensated is transmitted to the gain adjusting unit 148.
- Although the filter processing unit 147 of this embodiment filters the outside-replaced input image data, the unit need not necessarily do so, and may filter the input image data itself.
- The gain adjusting unit 148 performs gain adjustment on the compensated input image data transmitted from the filter processing unit 147, based on the high-band signal transmitted from the high frequency detecting unit 144. The input image data after gain adjustment is transmitted to the selecting unit 149.
- The selecting unit 149 accepts as inputs: information regarding the edge part and the edge direction transmitted from the edge detecting unit 143; the input image data whose pixel value has been compensated, transmitted from the filter processing unit 147; and the input image data whose pixel value has not been compensated, extracted from the input image data storage unit 110.
- The selecting unit 149 selects either the input image data whose pixel value has been compensated by the filter processing unit 147 or the input image data whose pixel value has not been compensated, in accordance with the input information regarding the edge part and the edge direction.
- More particularly, when it is determined from the edge direction that the edge part is at the rise area from a low tone to a high tone, the selecting unit 149 selects the input image data whose pixel value has been compensated (i.e. the filtered data). When it is determined from the edge direction that the edge part is at the decay area from a high tone to a low tone, the selecting unit 149 selects the input image data whose pixel value has not been compensated. By performing such processing, only the rise area is filtered, as explained with FIG. 14.
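Expressed as a sketch, assuming a per-pixel mask that marks rise-edge pixels (an illustrative formulation, not the concrete selection circuit):

```python
import numpy as np

def select_output(rise_mask: np.ndarray, filtered: np.ndarray,
                  original: np.ndarray) -> np.ndarray:
    """Use the compensated data only where the edge direction is a rise."""
    return np.where(rise_mask, filtered, original)

orig = np.array([0.0, 0.0, 255.0, 255.0])
filt = np.array([0.0, 85.0, 170.0, 255.0])
mask = np.array([True, True, True, True])  # rise edge -> take filtered data
print(select_output(mask, filt, orig))
```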
- In this embodiment, the selecting unit 149 is provided at the post stage of the filter processing unit 147. The selecting unit 149 accepts both the input image data which has been filtered by the filter processing unit 147 and the externally input image data itself, and selects either the filter-processed input image data or the externally input image data.
- However, the system is not limited to this. For example, before the filter processing unit 147 performs the filter process, the selecting unit 149 may determine in advance whether to perform the filter process; only when the selecting unit 149 determines that the filter process is to be performed (for example, when it is determined that the edge part is at the rise area) is the filter process performed.
- When the filter-processed input image data is input from the selecting unit 149, the synthesizing unit 150 synthesizes the externally input image data itself (data which has not been filter processed) with the filter-processed input image data, and outputs the result to the output unit 160. When the filter-processed input image data is not input from the selecting unit 149, the synthesizing unit 150 outputs the externally input image data itself, which has not been filter processed, to the output unit 160.
- The display device 200 is a hold-type display device, and includes an image display unit 210, a source driver 220, a gate driver 230 and a display controlling unit 240.
- The image display unit 210 displays an image corresponding to the display image data input from the image processing device 100.
- The image display unit 210 is, for example, a dot matrix type display in an m × n matrix arrangement.
- Specific examples of the image display unit 210 include an active matrix type OLED (Organic Light Emitting Diode) display using a-Si (amorphous silicon) TFTs, an LCD, and the like.
- The source driver 220 and the gate driver 230 are driving units for driving the image display unit 210 in the m × n matrix arrangement.
- The source driver 220 supplies a data line 221 with a data signal, while the gate driver 230 supplies a scanning line 231 with a select signal (address signal).
- The display controlling unit 240 controls driving of the image display unit 210 (driving of the source driver 220 and the gate driver 230), based on the display image data input from the image processing device 100. More specifically, the display controlling unit 240 outputs, at an appropriate timing, a control signal to be supplied to each driver (the source driver 220 and the gate driver 230), based on the display image data (video signal) obtained from the image processing device 100.
- Each of the above-described constituent elements may be formed using a widely used member or circuit, or may be formed with hardware specialized for the function of the constituent element. Each function of the constituent elements may also be executed by a CPU or the like. The applicable configuration may be changed appropriately in accordance with the technical level at the time this embodiment is implemented.
- FIG. 16 is a block diagram showing the hardware configuration of the image processing device according to this embodiment.
- The image processing device 100 mainly includes a CPU (Central Processing Unit) 901, a ROM (Read Only Memory) 903, a RAM (Random Access Memory) 905, a host bus 907, a bridge 909, an external bus 911, an interface 913, an input device 915, an output device 917, a storage device 919, a drive 921, a connection port 923, and a communication device 925.
- The CPU 901 functions as an arithmetic device and a control device, and controls all or part of the operations of the image processing device 100 in accordance with various programs stored in the ROM 903, the RAM 905, the storage device 919 or a removable recording medium 927.
- The ROM 903 stores programs and arithmetic parameters used by the CPU 901.
- The RAM 905 temporarily stores programs used in the execution of the CPU 901, as well as parameters that change appropriately during that execution. These are connected to each other through the host bus 907, which includes an internal bus such as a CPU bus.
- The host bus 907 is connected to the external bus 911, such as a PCI (Peripheral Component Interconnect/Interface) bus, through the bridge 909.
- The input device 915 is an operation unit operated by users, such as a mouse, a keyboard, a touch panel, a button, a switch or a lever.
- The input device 915 may also be a remote control unit (a so-called remote control) using infrared rays or other radio waves, or an external connection unit 929, such as a cell phone or a PDA, supporting the operations of the image processing device 100.
- The input device 915 includes, for example, an input control circuit which generates an input signal based on the information input by a user with the above-described operation unit, and outputs the signal to the CPU 901.
- By operating this input device 915, the user of the image processing device 100 can input various data to the image processing device 100 and instruct it to perform processing operations.
- The output device 917 includes a device that can visually or aurally inform the user of acquired information.
- The output device 917 may, for example, be a display device, such as a CRT display device, a liquid crystal display device, a plasma display device, an EL display device or a lamp; an audio output device, such as a speaker or headphones; a printer device; a cell phone; a facsimile; or the like.
- The display device displays various information, such as image data, in text or image form. The audio output device converts audio data into sound and outputs it.
- The storage device 919 is a device for data storage, configured as an example of a storage unit of the image processing device 100 according to this embodiment.
- The storage device 919 includes a magnetic storage device, such as an HDD (Hard Disk Drive), a semiconductor storage device, an optical storage device, or a magneto-optical storage device.
- This storage device 919 stores programs to be executed by the CPU 901, various data, and externally acquired image signal data.
- The drive 921 is a reader/writer for a storage medium, and is either incorporated in the image processing device 100 or externally attached thereto.
- The drive 921 reads information recorded on an inserted removable recording medium 927, such as a magnetic disk, an optical disk, a magneto-optical disk or a semiconductor memory, and outputs the information to the RAM 905.
- The drive 921 can also write records to the inserted removable recording medium 927, such as the magnetic disk, optical disk, magneto-optical disk or semiconductor memory.
- The removable recording medium 927 may, for example, be a DVD medium, an HD-DVD medium, a Blu-ray medium, a Compact Flash (CF) (registered trademark) card, a Memory Stick, or an SD memory card (Secure Digital memory card).
- The removable recording medium 927 may also be an IC card (Integrated Circuit card) with a contactless IC chip installed thereon, or an electronic unit.
- The connection port 923 is a port for directly connecting a unit with the image processing device 100, such as a USB (Universal Serial Bus) port, an IEEE1394 port (such as an i.Link or the like), a SCSI (Small Computer System Interface) port, an RS-232C port, an optical audio terminal or the like.
- The communication device 925 is a communication interface including, for example, a communication device for connecting to a communication network 10.
- The communication device 925 is, for example, a communication card for a wired or wireless LAN (Local Area Network), Bluetooth or WUSB (Wireless USB), a router for optical communication, a router for ADSL (Asymmetric Digital Subscriber Line), or a modem for various kinds of communication.
- This communication device 925 can transmit and receive image signals and the like to and from the Internet or other communication units.
- The communication network 10 connected to the communication device 925 includes a network connected through a cable or wirelessly, and may be the Internet, a home LAN, infrared communication or satellite communication.
- With this configuration, the image processing device 100 can acquire information regarding an input image signal from various information sources, such as another external connection unit 929 connected to the connection port 923 or to the communication network 10, and can transmit the image signal to the display device 200.
- The hardware configuration of the display device 200 according to this embodiment is substantially the same as that of the image processing device 100, and thus will not be explained here.
- FIG. 17 is a flowchart showing the processing flow of the image processing method according to this embodiment.
- The image processing method processes input image data externally input to the image processing device 100, thereby generating display image data to be output to the hold-type display device 200.
- When input image data is externally input to the image processing device 100, the input image data is stored in the input image data storage unit 110 (S101), and is also input to the motion vector detecting unit 120.
- When input image data in a frame to be displayed is input to the motion vector detecting unit 120, the motion vector detecting unit 120 extracts, for example, the input image data in the frame that is one frame ahead of the frame to be displayed, from the input image data storage unit 110.
- The motion vector detecting unit 120 compares the input image data in the frame to be displayed with the input image data in the frame that is one frame ahead thereof, identifies an object moving in the displayed image, and detects a motion vector of the input image data in the frame to be displayed based on the object's movement direction and movement distance (S103).
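The embodiment does not fix a particular motion vector detection algorithm; as one conventional possibility, exhaustive block matching between the two frames can be sketched as follows (block size, search range and the sign convention of the vector are arbitrary illustrative choices):

```python
import numpy as np

def estimate_motion(prev: np.ndarray, curr: np.ndarray,
                    block: int = 8, search: int = 7) -> np.ndarray:
    """Per-block (dx, dy) minimizing the sum of absolute differences."""
    h, w = curr.shape
    vectors = np.zeros((h // block, w // block, 2), dtype=int)
    for by in range(h // block):
        for bx in range(w // block):
            y, x = by * block, bx * block
            target = curr[y:y + block, x:x + block].astype(int)
            best_sad, best_v = None, (0, 0)
            for dy in range(-search, search + 1):
                for dx in range(-search, search + 1):
                    yy, xx = y + dy, x + dx
                    if yy < 0 or xx < 0 or yy + block > h or xx + block > w:
                        continue
                    cand = prev[yy:yy + block, xx:xx + block].astype(int)
                    sad = np.abs(target - cand).sum()
                    if best_sad is None or sad < best_sad:
                        best_sad, best_v = sad, (dx, dy)
            vectors[by, bx] = best_v
    return vectors
```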
- The detected motion vector is transmitted to the compensation processing unit 140 or the like.
- Next, the compensation processing unit 140 extracts, from the response time information storage unit 130, response time information corresponding to the tone variation value of each pixel in the frame to be displayed (S105).
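Conceptually, the response time information storage unit behaves like a lookup table keyed by the tone variation value; the sketch below uses invented placeholder values purely for illustration:

```python
RESPONSE_TIME_MS = {64: 4.0, 128: 8.0, 255: 16.0}  # illustrative values only

def response_time(tone_variation: int) -> float:
    """Return the stored response time for the nearest tone-variation key."""
    key = min(RESPONSE_TIME_MS, key=lambda k: abs(k - tone_variation))
    return RESPONSE_TIME_MS[key]

print(response_time(100))  # -> 8.0 (nearest stored variation is 128)
```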
- The compensation processing unit 140 then performs a compensation process for compensating the pixel value in the input image data, for each pixel included in the frame that is one frame ahead of the frame to be displayed, based on the externally input image data, the motion vector input from the motion vector detecting unit 120 and the response time information extracted from the response time information storage unit 130 (S107).
- Display image data is thereby generated, and the compensation processing unit 140 outputs the generated display image data to the output unit 160 (S109).
- When the display image data is input from the compensation processing unit 140, the output unit 160 outputs the input display image data to the display device 200 (S111).
- FIG. 18 is a flowchart showing a specific example of the compensation process according to this embodiment.
- The compensation range setting unit 141 sets a compensation range for compensating the pixel value in the input image data, based on the motion vector input from the motion vector detecting unit 120 (S203). Specifically, the compensation range setting unit 141 detects an area with movement in the input image data (a part corresponding to a moving object), and sets the pixels in that area as the compensation range. Further, the compensation range setting unit 141 transmits information regarding the set compensation range and the input motion vector to the maximum/minimum value detecting unit 142, the edge detecting unit 143, the high frequency detecting unit 144, the filter setting unit 146 and the like.
- Next, the maximum/minimum value detecting unit 142 detects the maximum and minimum values of the input image data (input signal) in the compensation range, based on the information regarding the compensation range transmitted from the compensation range setting unit 141 (S205). Further, the maximum/minimum value detecting unit 142 transmits information regarding the detected maximum and minimum values of the input signal to the edge detecting unit 143 and the outside replacement unit 145.
- The edge detecting unit 143 detects an edge part in the input image data (input signal), based on the information regarding the compensation range transmitted from the compensation range setting unit 141, the information regarding the input motion vector, and the information regarding the maximum/minimum values of the input signal transmitted from the maximum/minimum value detecting unit 142 (S207). At this time, the edge detecting unit 143 detects not only the position of the edge (edge part), but also the edge direction in the edge part (whether it is a variation from a low tone to a high tone, or a variation from a high tone to a low tone). Further, the edge detecting unit 143 transmits information regarding the detected edge part and edge direction to the selecting unit 149.
- The high frequency detecting unit 144 detects a high-frequency signal in the spatial frequency of the input image data in the compensation range, based on the information regarding the compensation range transmitted from the compensation range setting unit 141 (S209).
- As described above, "high frequency" represents a signal whose half wavelength (1/2 wavelength) falls within a range narrower than the compensation range. That is, the detecting unit detects, as a high-band signal, a signal whose wavelength is shorter than twice the compensation range. This is because, for such a signal, both a rise area and a decay area exist within the compensation range, which interferes with performing an adequate process.
- The high frequency detecting unit 144 outputs the detected high-band signal to the gain adjusting unit 148, where it is used for gain adjustment after the process performed by the filter processing unit 147.
- Next, the outside replacement unit 145 performs outside replacement on the input image data (input signal) using its maximum and minimum values, based on the information regarding the maximum and minimum values of the input signal transmitted from the maximum/minimum value detecting unit 142 (S211). Further, the outside replacement unit 145 transmits the replaced input image data (input signal) to the filter processing unit 147.
- Next, the filter setting unit 146 extracts response time information corresponding to the tone variation value of each pixel in the frame to be displayed (S213).
- The filter setting unit 146 then sets a characteristic of the spatial filter for compensating the pixel value in the input image data so that an image having the tone set based on the input image data is displayed when the display device 200 displays the frame to be displayed, based on the input image data, the information regarding the compensation range, the motion vector and the response time information (S215).
- As described above, the spatial filter in this embodiment may be a moving average filter, such as a low-pass filter (LPF) or the like.
- The characteristics of the filter in this embodiment may include the area to be filtered and the number of taps of the filter. Such filter characteristics can be realized by appropriately setting the filter coefficient(s) of the filter matrix. Further, the filter setting unit 146 transmits information regarding the set filter characteristics to the filter processing unit 147.
- Next, the filter processing unit 147 performs a filter process for compensating the pixel value of each pixel positioned in the compensation range, by applying a filter having the filter characteristic(s) set by the filter setting unit 146 to the outside-replaced input image data transmitted from the outside replacement unit 145, in the frame that is one frame ahead of the frame to be displayed by the display device 200 (S217). Further, the filter processing unit 147 transmits the input image data whose pixel values have been compensated to the gain adjusting unit 148.
- The filter processing unit 147 according to this embodiment applies the filter to the outside-replaced input image data. However, the filter need not necessarily be applied to the outside-replaced data, and may be applied to the input image data itself.
- Next, the gain adjusting unit 148 performs gain adjustment on the compensated input image data transmitted from the filter processing unit 147, based on the high-band signal transmitted from the high frequency detecting unit 144 (S219). Further, the gain adjusting unit 148 transmits the input image data after gain adjustment to the selecting unit 149.
- The selecting unit 149 selects either the input image data whose pixel value has been compensated by the filter processing unit 147 or the input image data whose pixel value has not been compensated, in accordance with the input information regarding the edge part and edge direction. In a specific process, the selecting unit 149 determines, based on the edge direction, whether the edge part is at the rise area from a low tone to a high tone or at the decay area from a high tone to a low tone (S221).
- When it is determined in step S221 that the edge part is at the rise area, the selecting unit 149 selects the input image data whose pixel value has been compensated (S223), and outputs the compensated (filter-processed) input image data (S225).
- When it is determined in step S221 that the edge part of the input image data is at the decay area, the selecting unit 149 selects the input image data whose pixel value has not been compensated (S227).
- When the filter-processed input image data is input, the synthesizing unit 150 synthesizes the externally input image data itself (not filter processed) with the filter-processed input image data (S229), and outputs the result to the output unit 160 (S231).
- When the filter-processed input image data is not input, the synthesizing unit 150 outputs the externally input image data itself, which has not been filter processed, to the output unit 160 (S233).
- In the above process, the selection by the selecting unit 149 is performed after the filter process by the filter processing unit 147, and the selecting unit 149 selects either the filter-processed input image data or the externally input image data.
- However, the timing of this process is not limited to the above. For example, before the filter processing unit 147 performs the filter process, the selecting unit 149 may determine in advance whether to perform the filter process; only when the selecting unit 149 determines that the filter process is to be performed (for example, when it is determined that the edge part is at the rise area) is the filter process performed.
Abstract
An image processing device according to the present invention includes a motion vector detecting unit, a response time information storage unit, a compensation processing unit and an output unit. The motion vector detecting unit detects a motion vector of image data. The response time information storage unit stores response time information, representing the time from when a driving voltage is applied to a display device until an image with the corresponding tone is displayed, in association with a tone variation value. The compensation processing unit compensates a pixel value in the image data for each pixel in a frame that is one frame ahead of a frame to be displayed, based on the image data, the motion vector and the response time information. The output unit outputs the image data compensated by the compensation processing unit to the display device.
Description
- The present invention contains subject matter related to Japanese Patent Application JP 2007-326342 filed in the Japan Patent Office on Dec. 18, 2007, the entire contents of which are incorporated herein by reference.
- 1. Field of the Invention
- The present invention relates to an image processing device which processes externally input image data and outputs it to a hold-type display device, an image display system including the processing device, an image processing method and a program therefor.
- 2. Description of the Related Art
- In recent years, flat panel displays such as LCDs (Liquid Crystal Displays) have become widespread, replacing CRTs (Cathode Ray Tubes).
- In motion display, unlike an impulse-type display device such as a CRT, a hold-type display device such as an LCD keeps displaying all pixels constituting an image for a period of time from when an instruction to display a predetermined one of a plurality of frames or fields (hereinafter referred to as a "frame") constituting a motion image is issued until an instruction to display the next frame is issued. Thus, in the hold-type display device, an issue is the occurrence of motion blur in a moving object, such as blurring at the leading edge, tailing at the trailing edge, and delay in the perceived position, caused by the Eye-Trace Integration effect (an afterglow characteristic on the human retina when the eye follows a motion picture). In the LCD in particular, this motion blur is considered to occur easily due to the delay in the response speed of the liquid crystal.
- To address this issue, an overdrive technique is provided for restraining the motion blur by improving the response characteristic of the LCD. To improve the response characteristic for a step input in the LCD, the overdrive technique applies, in the frame where an input signal variation first occurs, a voltage greater than the target voltage corresponding to the specified brightness value, so as to accelerate the brightness transition. With this overdrive technique, the response speed of the liquid crystal can be increased in the half-tone region, attaining an effect of restraining the motion blur. A technique has also been proposed for restraining the motion blur more effectively by changing the waveform of the applied voltage in accordance with a motion vector in each frame, using the overdrive technique (see, for example, JP-A No. 2005-43864).
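Schematically, overdrive boosts the drive value in proportion to the tone change and then clips it to the displayable range; the gain in the following sketch is an invented illustration, not a value from the cited document:

```python
def overdrive(prev_tone: float, target_tone: float,
              gain: float = 1.5, vmax: float = 255.0) -> float:
    """Drive past the target in proportion to the tone change, then clip."""
    boosted = target_tone + (gain - 1.0) * (target_tone - prev_tone)
    return min(max(boosted, 0.0), vmax)

print(overdrive(64, 128))  # 160.0: a halftone step gets a usable boost
print(overdrive(64, 250))  # 255.0: near white, the boost is clipped away
```

The second call illustrates the saturation limitation discussed next: near the ends of the voltage range there is no headroom left for the boost.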
- However, an issue with the overdrive technique is that a voltage sufficiently high to accelerate the response speed of the liquid crystal may not be applicable, because of the limit of the applicable voltage range of the liquid crystal. As a result, the motion blur restraining effect may not sufficiently be attained, for example, when the target voltage for a black display or a white display is near the limit of the voltage range (in the case of a tone variation in a high tone range or a low tone range).
- In liquid crystal display driving in a VA mode, different characteristics appear between the rise and the decay of the liquid crystal. The variation of the molecular alignment takes much time when rising from level "0" (e.g. black). Thus, an issue is that a specified brightness transition may not be attained within one frame, given the response speed of the liquid crystal, using only the overdrive technique.
- Recently, a double-speed operation technique has been developed for displaying image data on the LCD, in which a frame to be displayed is time-divided in order to reduce the Eye-Trace Integration effect, and an interpolated image between frames is obtained based on a motion vector of the input image in order to increase the motion picture display frequency using a plurality of sub-fields.
- However, if the display frequency is increased, the driving frequency of the display driver which drives the display device increases as well. This may result in issues of insufficient charging, an increase in the number of IC or connector terminals, an increase in substrate area, heat generation, an increase in EMI (Electro Magnetic Interference), and an increase in cost.
- The present invention has been made in consideration of the above issues. It is desirable to restrain an increase in cost, to reduce an Eye-Trace Integration effect, to improve a response characteristic of image display in all tone variations, and to restrain the motion blur, in an image processing device which processes externally input image data and outputs it to a hold-type display device, an image display system including the processing device, an image processing method and a program therefor.
- According to an embodiment of the present invention, there is provided an image processing device which processes externally input image data and outputs the display image data to a hold-type display device, including: a motion vector detecting unit which detects a motion vector of the input image data; a response time information storage unit which stores response time information representing a time since a driving voltage is applied to the display device until the display device displays an image with a tone corresponding to the driving voltage, in association with a tone variation value; a compensation processing unit which compensates a pixel value in the image data for each pixel, in a frame which is one frame ahead of a frame to be displayed by the display device, based on the image data, the motion vector and the response time information; and an output unit which outputs the image data after compensated by the compensation processing unit to the display device.
- The image processing device may further include an edge detecting unit which detects an edge from the input image data, based on the motion vector.
- The compensation processing unit may determine whether to perform a compensation process for the pixel value, in accordance with a detection result of the edge detecting unit.
- The compensation processing unit may decide whether to perform the compensation process in accordance with an edge direction of an edge part detected by the edge detecting unit.
- Further, at this time, the compensation processing unit may decide to perform the compensation process, when it is determined that the edge part detected by the edge detecting unit is in a rise area from a low tone to a high tone based on the edge direction, and may decide not to perform the compensation process, when it is determined that the edge part is in a decay area from a high tone to a low tone based on the edge direction.
- The compensation processing unit may include: a compensation range setting unit which sets a compensation range for compensating a pixel value in the image data based on the motion vector; a filter setting unit which sets a characteristic of a filter for compensating the pixel value in the image data so as to display an image with a tone corresponding to a tone set based on the image data when the display device displays the frame to be displayed, based on the image data, the motion vector and the response time information; and a filter processing unit which compensates a pixel value of the pixel within the compensation range by filtering the image data with a filter having the characteristic set by the filter setting unit, in the frame that is one frame ahead of the frame to be displayed by the display device.
- The image processing device having the compensation processing unit may further include an edge detecting unit which detects an edge from the input image data based on the motion vector.
- At this time, the compensation processing unit may further include a selecting unit which selects either one of the image data whose pixel value has been compensated by the filter processing unit and the image data whose pixel value has not been compensated by the filter processing unit, in accordance with a detection result of the edge detecting unit.
- The selecting unit may select either one of image data whose pixel value has been compensated and image data whose pixel value has not been compensated, in accordance with an edge direction of an edge part detected by the edge detecting unit.
- Further, at this time, the selecting unit may select the image data whose pixel value has been compensated, when it is determined that the edge part detected by the edge detecting unit is in a rise area from a low tone to a high tone, and may select the image data whose pixel value has not been compensated, when it is determined that the edge part is in a decay area from a high tone to a low tone, based on the edge direction.
- The filter setting unit may change a number of taps of the filter in accordance with a motion vector value detected by the motion vector detecting unit.
- The filter may, for example, be a moving average filter.
- The compensation processing unit may further include an outside replacement unit which performs outside replacement on the input image data, using a maximum value and a minimum value of a tone of the image data, and the filter processing unit may filter, with the filter, the image data processed by the outside replacement unit.
- On the other hand, the compensation processing unit may include an interpolated image generating unit which generates interpolated image data corresponding to an interpolated image to be inserted between two continuous frames based on the image data and the motion vector, a display timing information generating unit which generates display timing information representing a timing to display the interpolated image after a predetermined period of time based on the response time information, and an image synthesizing unit which synthesizes the generated display timing information with the input image data.
- According to another embodiment of the present invention, there is provided an image display system which includes an image processing device, processing externally input image data, and a hold-type display device, displaying the image data processed by the image processing device and input from the image processing device, wherein: the image processing device includes a motion vector detecting unit which detects a motion vector of the input image data, a response time information storage unit which stores response time information representing a time since a driving voltage is applied to the display device until the display device displays an image with a tone corresponding to the driving voltage, in association with a tone variation value, a compensation processing unit which compensates a pixel value in the image data for each pixel, in a frame that is one frame ahead of a frame to be displayed by the display device, based on the input image data, the motion vector and the response time information, and an output unit which outputs the image data after compensated by the compensation processing unit to the display device; and the display device includes an image display unit which displays an image corresponding to the image data input from the image processing device, and a display controlling unit which controls driving of the image display unit based on the image data input by the image processing device.
- According to still another embodiment of the present invention, there is provided an image processing method for processing externally input image data and generating image data to be output to a hold-type display device, the method including the steps of: detecting a motion vector of the input image data; extracting response time information from a response time information storage unit which stores the response time information representing a time since a driving voltage is applied to the display device until the display device displays an image with a tone corresponding to the driving voltage, in association with a tone variation value; compensating a pixel value of the image data for each pixel in a frame that is one frame ahead of a frame to be displayed by the display device, based on the input image data, the motion vector and the response time information; and outputting the image data after compensated to the display device.
- According to still further embodiment of the present invention, there is provided a program for controlling a computer to function as an image processing device which processes externally input image data and outputs it to a display device performing hold-type driving, the program including: a motion vector detecting function which detects a motion vector of the input image data; a response time storage function for storing response time information representing a time since a driving voltage is applied to the display device until the display device displays an image with a tone corresponding to the driving voltage, in association with a tone variation value; a compensation processing function for compensating a pixel value in the image data for each pixel, in a frame that is one frame ahead of a frame to be displayed by the display device, based on the input image data, the motion vector and the response time information; and an outputting function for outputting the image data after compensated by the compensation processing unit to the display device.
- As described above, according to the image processing device, image display system, image processing method and program of the present invention, in a so-called hold-type display device, it is possible to restrain the motion blur in a moving object caused by the Eye-Trace Integration effect, such as blurring at the leading edge, tailing at the trailing edge, and delay in the perceived position, thus improving the quality of a moving image. The double-speed operation requires a change in the display device itself; according to the present invention, however, there is no need to make any change in the display device, thus not resulting in an increase in the cost of the display device. Further, the motion blur can sufficiently be restrained for tone variations in regions other than the half-tone region, which may not be improved using the overdrive technique. Particularly, in a display device using a display element with a slow response speed, the greater the difference of the response times due to the tone variation, the greater the restraining effect on the motion blur.
- According to the embodiments of the present invention, it is possible to restrain an increase in cost, to reduce the Eye-Trace Integration effect, to improve a response characteristic of image display in all tone variations, and to restrain the motion blur, in an image processing device which processes externally input image data and outputs it to a hold-type display device, an image display system including the processing device, an image processing method and a program therefor.
- FIG. 1 is an explanatory diagram showing an example of a response waveform of a liquid crystal, when a pulse signal is input to a general VA mode liquid crystal;
- FIG. 2 is an explanatory diagram for explaining an example of the relationship between the Eye-Trace Integration Effect and a motion blur in a hold-type display device;
- FIG. 3 is an explanatory diagram for explaining an example of the relationship between the Eye-Trace Integration Effect and a motion blur in a hold-type display device;
- FIG. 4 is an explanatory diagram for explaining an example of the relationship between the Eye-Trace Integration Effect and a motion blur in a hold-type display device;
- FIG. 5 is an explanatory diagram for explaining an example of the relationship between the Eye-Trace Integration Effect and a motion blur in a hold-type display device;
- FIG. 6 is an explanatory diagram schematically showing an example of an image processing method in an image processing device according to the present invention;
- FIG. 7A is an explanatory diagram showing an example of an operation waveform when a step waveform is input to a hold-type display device;
- FIG. 7B is an explanatory diagram showing an example of an operation waveform when a step waveform is input to a hold-type display device;
- FIG. 7C is an explanatory diagram showing an example of an operation waveform when a step waveform is input to a hold-type display device;
- FIG. 7D is an explanatory diagram showing an example of an operation waveform when a step waveform is input to a hold-type display device;
- FIG. 8A is an explanatory diagram showing an example of an input signal that is input to an image processing device according to the present invention;
- FIG. 8B is an explanatory diagram showing an example of an output signal that is output from an image processing device according to the present invention;
- FIG. 8C is an explanatory diagram showing an example of an output signal that is output from an image processing device according to the present invention;
- FIG. 9 is an explanatory diagram showing a variation in the spatial direction of the intensity of light accumulated on the retina of a user who has seen a hold-type display device displaying an image based on an output signal that is output from an image processing device according to the present invention;
- FIG. 10 is a block diagram showing a functional configuration of an image processing device according to an embodiment of the present invention;
- FIG. 11 is a block diagram showing a functional configuration of a display device according to the embodiment;
- FIG. 12 is a block diagram showing a functional configuration of a compensation processing unit according to the embodiment;
- FIG. 13 is an explanatory diagram for explaining functions of a high frequency detecting unit according to the embodiment;
- FIG. 14 is an explanatory diagram showing an example of setting filter characteristics by a filter setting unit according to the embodiment;
- FIG. 15 is an explanatory diagram showing an example of setting filter characteristics by the filter setting unit according to the embodiment;
- FIG. 16 is a block diagram showing a hardware configuration of the image processing device according to the embodiment;
- FIG. 17 is a flowchart showing the processing flow of an image processing method according to the embodiment; and
- FIG. 18 is a flowchart showing a concrete example of a compensation method according to the embodiment.
- Hereinafter, preferred embodiments of the present invention will be described in detail with reference to the appended drawings. Note that, in this specification and the appended drawings, structural elements that have substantially the same function and structure are denoted with the same reference numerals, and repeated explanation of these structural elements is omitted.
- Before explaining the preferred embodiments of the present invention, the present inventors first explain why they arrived at the image processing device according to the present invention as a device for improving the motion blur in a hold-type display device, such as a liquid crystal display device.
- As described above, in the hold-type display device, an object in motion may result in motion blur, such as blurring at the leading edge, tailing at the trailing edge, and delay in the perceived position. In the related art, this has been considered a result of the delay in the response time of a display element, such as a liquid crystal. Thus, an overdrive technique has been utilized as a device for improving the motion blur in the hold-type display device, since it enables the response time of the display element, such as a liquid crystal, to be accelerated.
- However, the delay in the response time of the display element, such as a liquid crystal, is not the only cause of the motion blur in the hold-type display device. Another major cause is the Eye-Trace Integration Effect, that is, the afterglow characteristic on the human retina when the eye traces a motion image. Thus, the motion blur in the hold-type display device cannot sufficiently be restrained using only a general overdrive technique, which considers only the delay in the response time of the display element, such as a liquid crystal.
- According to an image processing device disclosed in Patent Document 1, filed beforehand by the present applicant, the motion blur in the hold-type display device can sufficiently be restrained by considering not only the response time of the liquid crystal but also the Eye-Trace Integration Effect when utilizing the overdrive technique.
- However, the overdrive technique achieves the effect of accelerating the response time of the display element only for tone variations in the halftone region. With the technique, a sufficiently high voltage may not be applicable to the display element when a target voltage for a white display or a black display is near the limit of the applicable voltage range, and it is then difficult to sufficiently achieve the effect of accelerating the response time of the display element.
- In a liquid crystal display device using a VA mode driving technique, the variation of the molecular alignment takes much time when rising from level "0" (e.g. black). Thus, the response may not fall within one frame using only the overdrive technique.
- Explanations will now be made to the response characteristics of the liquid crystal in the case where a pulse signal is input to a general VA mode liquid crystal, by way of example, with reference to FIG. 1. FIG. 1 is an explanatory diagram showing an example of a response waveform of a liquid crystal when a pulse signal is input to the general VA mode liquid crystal. In FIG. 1, the vertical axis represents the tone of the liquid crystal, while the horizontal axis represents time. Further, in FIG. 1, the solid line represents the response waveform L of the liquid crystal which is generated when a pulse signal P of one frame period, represented by a broken line, is input to the general VA mode liquid crystal.
FIG. 1 , in the VA mode liquid crystal, the response characteristics differ between the rise and the decay. In the rise, the liquid crystal responds along a VT curve, thus causing a delay time since a signal input until a response to the signal. On the other hand, in the decay, the liquid crystal does not respond along a VT curve, thus not causing a much delay time. Particularly, as shown in a region U enclosed by a circle of a broken line inFIG. 1 , it is obvious that a long delay occurs in the response time in the rise from a low tone (e.g. a level 0). It is obvious also that a large difference occurs in the response times, between tone differences at the time of inputting a signal, in the rise. - The present inventors have further examined the relationship between the Eye-Trace Integration Effect and the motion blur in the hold-type display device. They have found that the motion blur can effectively be restrained in the hold-type display device by controlling application of a driving voltage in accordance with the response time of the display element, such as a liquid crystal, based on the difference of the response times between the tones. As a result, they have reached to complete the present invention.
- Explanations will now be made to the relationship between the Eye-Trace Integration Effect examined by the present inventors and the motion blur in the hold-type display device, with reference to
FIG. 2 toFIG. 5 .FIG. 2 toFIG. 5 are explanatory diagrams each for explaining an example of the relationship between the Eye-Trace Integration Effect and the motion blur in the hold-type display device. - In the following explanations, a liquid crystal display device will be described as a hold-type display device by way of example. A predetermined one of a plurality of pixels included in a frame or field (hereinafter simply referred to as a “frame” for easy explanation) corresponds to each of the display elements (liquid crystals in this embodiment) included in the display screen of the liquid crystal display device.
- It is assumed that an image to be handled has a background of a solid color, and that the image of stepwise changes moves at a constant speed. On this assumption, if the Eye-Trace Integration is traced, the brightness of this trace is a periodic function. Thus, the Eye-Trace Integration simply corresponds to one frame. For easy calculation, in this embodiment, the brightness change of the edge of the image (edge part) is vertical.
- Whether the improvement of the motion blur in the hold-type display device has reached a target quality can be determined in accordance with whether the same or greater effect can be obtained as the Eye-Trace Integration Effect in an LCD which drives at 120 Hz as a result of double-speed operation of a 60 Hz driving system. Determination matters of the target quality include: a steepness of the perceptual boundary (leading edge and trailing edge) in the Eye-Trace Integration; and a delay in the half value (half value of the maximum brightness) point of the attained brightness.
-
FIG. 2 toFIG. 5 show an example of a case wherein an image of stepwise changes moves by 4 pixels per frame from left to right on the display screen of the liquid crystal display device. The upper illustrations ofFIG. 2 toFIG. 5 show a waveform of an input image signal input to the liquid crystal display device. The middle illustrations ofFIG. 2 toFIG. 5 show a time transition of an output level (brightness) of the liquid crystal, when the image based on the input image signal of the upper illustrations is displayed on the liquid crystal display device. The lower illustrations thereof show the intensity of light (i.e. the Eye-Trace Integration Effect) introduced to the retina of a user's eye, when the user (a person) sees an image displayed on the liquid crystal display device. - In the middle illustrations of
FIG. 2 toFIG. 5 , the position in a horizontal direction represents a position (in the spatial direction) of each of pixels included in each frame. In the same illustrations, the replacement in a vertical and downward direction represents the time transition. In the middle illustrations ofFIG. 2 toFIG. 5 , one liquid crystal corresponds to one pixel, the intensity of gray tone represents an output level of each liquid crystal, and reference symbols “0F” and “1F” identify the number of each frame. - Further, in the lower illustrations of
FIG. 2 toFIG. 5 , the position in a horizontal direction represents a position (in the spatial direction) of the retina of a user's eye at a point of time tb in the middle illustration. In the same illustrations, the position in a vertical upward direction represents the intensity of light introduced to the retina of a user's eye. That is, areas S1, S2, S3 and S4 correspond to integration results of the intensity of light in the positions of the retina of the user's eye, and are results of the Eye-Trace Integration. In more particular, in the middle illustrations ofFIG. 2 toFIG. 5 , oblique arrows toward the lower right position represent the movement of the user's eye. A predetermined level of light, output from the liquid crystal in a position through which each of the oblique arrows pass, enters the user's retina at each moment between a time ta and the time tb. As a result, the light entered at each moment is sequentially accumulated on the user's retina. Thus, the intensity of the accumulated light (the integrated value of the level of the entered light) is introduced at a point of the time tb. - Explanations will now be made to the relationship between the Eye-Trace Integration Effect examined by the present inventors and the motion blur in the hold-type display device, based on the illustrations of
FIG. 2 toFIG. 5 . -
- FIG. 2 shows the relationship between the Eye-Trace Integration Effect and the motion blur when an input image signal having the waveform shown in the upper illustration (an input image signal corresponding to the frame 1F of the illustration) is input at the time tb to a display device using an ideal hold-type display element (e.g. a liquid crystal) whose response time is 0.
FIG. 2 , in the display device using an ideal hold-type device, the response time for a step input is 0. The output level of the liquid crystal instantaneously reaches the brightness (target brightness) corresponding to the input image signal, thus realizing a quick response of the liquid crystal. However, the Eye-Trace Integration Effect occurs even in the ideal hold-device, and thus resulting in the motion blur by four pixels corresponding to the movement speed of the input image of stepwise changes. -
- FIG. 3 shows the relationship between the Eye-Trace Integration Effect and the motion blur when an input image signal having the waveform shown in the upper illustration (an input image signal corresponding to the frame 1F of the illustration) is input at the time tb to a general liquid crystal display device (LCD).
FIG. 3 , a general LCD has a slow response speed for a step input, has a response time corresponding to about one frame until reaching the target brightness. The LCD performs hold-type driving, thus causing the Eye-Trace Integration Effect. When a step input is applied to the general LCD, the Eye-Trace Integration Effect is added to the response time based on the response speed of the liquid crystal. This results in a motion blur of eight pixels that is twice the movement speed of the input image of stepwise changes. -
- FIG. 4 shows the relationship between the Eye-Trace Integration Effect and the motion blur when an input image signal having the waveform shown in the upper illustration (an input image signal corresponding to the frame 1F in the illustration) is input at the time tb to an LCD which performs a double-speed operation (doubling the motion picture display frequency). That is, this LCD displays an image interpolated based on a motion vector in each of the two sub-fields into which one frame is divided.
FIG. 4, there is no difference in the response speed of the liquid crystal between an LCD performing the double-speed operation and a general LCD. In the LCD performing the double-speed operation, however, one frame is divided into two sub-fields, and the interpolated image is displayed in each of the sub-fields. The hold time for one input image signal is therefore halved, reducing the Eye-Trace Integration Effect. As a result, the motion blur is reduced to about five pixels as a whole. As described above, whether an improvement of the motion blur in a hold-type display device has reached a target quality can be judged by whether the motion blur is equal to or lower than the five-pixel motion blur of the LCD performing the double-speed operation. -
FIG. 5 shows an example of the relationship between the Eye-Trace Integration Effect and the motion blur when an input image signal having the waveform shown in the upper illustration (an input image signal corresponding to the frame 1F in the illustration) is input at the time tb to the image processing device according to the present invention, and the result is displayed on the hold-type display device. - In the image processing device according to the present invention, response time information is stored in association with the brightness change. This response time information represents the time from the application of a driving voltage for displaying an image with a target brightness to the hold-type display device until the display device displays an image with the brightness corresponding to that driving voltage. The image processing device compensates the brightness value of each pixel included in the frame to be displayed, pixel by pixel, in the frame (0F in this embodiment) ahead of the frame (1F in this embodiment) to be displayed, i.e. at the time ta, based on the response time information and a motion vector of the input image. This compensation is performed so that each pixel reaches the target brightness in the frame (1F) to be displayed. In the example of
FIG. 5, for the pixels displayed first in the frame (1F) to be displayed (the right four pixels), the image processing device adjusts the voltage applied to the liquid crystal corresponding to each pixel at the point of 0F, and adjusts the output level of the liquid crystal for each pixel (see the stair-like part of the output level of the liquid crystal at the point of 0F). As a result, each pixel in the frame (1F) to be displayed reaches the target brightness. - Accordingly, in the frame (0F) ahead of the frame (1F) to be displayed, an optimum voltage is applied in advance to the liquid crystal corresponding to each pixel (i.e. the pixel value is compensated), in consideration of the response time of the liquid crystal until each pixel included in the frame to be displayed reaches the target brightness; the Eye-Trace Integration Effect is thereby remarkably reduced. As a result, as shown in
FIG. 5, the motion blur is reduced to about two pixels as a whole, a clearly greater motion blur restraining effect than that of the LCD performing the double-speed operation. In the present invention, the pixel value is compensated for each pixel, so the technique scales to displays with high pixel counts, such as high-resolution displays. Further, as with VA-mode liquid crystals, the greater the difference in response times across tone variations and the higher the movement speed of the moving object (the motion vector value), the greater the motion blur restraining effect of the compensation process. - Accordingly, when the processed image is displayed on the hold-type display device by the image processing device according to the present invention, a greater motion blur restraining effect is attained than with the LCD performing the double-speed operation. In the LCD performing the double-speed operation, the interpolated image is synthesized with the input image, dividing each frame into a plurality of sub-fields so as to increase the frame rate and reduce the hold time, thereby restraining the motion blur. In the image processing device according to the present invention, the interpolation is instead performed in the spatial direction rather than the time direction, based on a motion vector, and the interpolation result is converted from a spatial variation to a time variation based on the response time information, thereby obtaining the effect of a pseudo frame-rate increase. As a result, in the hold-type display device, the motion picture response characteristic is improved and the motion blur can be restrained.
- Explanations will now be made to an overview of an example of an image processing method in the image processing device according to the present invention, with reference to
FIG. 6. FIG. 6 is an explanatory diagram schematically showing an example of the image processing method in the image processing device according to the present invention. - As shown in
FIG. 6, if input image data is input to an image processing device 100, the image processing device 100 compares the input image data corresponding to the frame to be displayed with the image data corresponding to the frame one frame ahead of the frame to be displayed, stored in a memory 5-1 of the image processing device 100, so as to detect a motion vector of the input image (S11). The detected motion vector is used in the next step (S13) for generating an interpolated image. The detected motion vector is also used in the following compensation process and overdrive process, and may be stored in the memory 5-1 as needed. - The image processing device generates an interpolated image to be inserted between the frame to be displayed and the frame one frame ahead thereof, based on the motion vector detected in step S11 (S13). Upon generation of this interpolated image, the motion picture display frequency is doubled (from 60 Hz to 120 Hz in a general LCD). The generated interpolated image is used in the next compensation step (S15), and may be stored in the memory 5-1. This interpolated image generating step (S13) is not an indispensable step in this embodiment: even if the motion picture display frequency (frame rate) is not increased, a motion blur restraining effect can sufficiently be attained in the hold-type display device by performing the compensation step (S15), as described later. A sketch of steps S11 and S13 is given below.
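- The following is a hedged sketch, not the patent's circuit, of the motion vector detection (S11) and interpolated image generation (S13) just described. Block matching on a single central block, the 8-pixel search range, and the half-vector shift for the interpolated sub-field are all illustrative assumptions.

```python
import numpy as np

# Hedged sketch of S11: block matching between the previous frame and the
# frame to be displayed, minimizing the sum of absolute differences (SAD).
def detect_motion_vector(prev, curr, block=8, search=8):
    h, w = curr.shape
    y0, x0 = (h - block) // 2, (w - block) // 2      # one central block for brevity
    ref = curr[y0:y0 + block, x0:x0 + block].astype(np.int64)
    best_sad, best_mv = None, (0, 0)
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            y, x = y0 + dy, x0 + dx
            if 0 <= y <= h - block and 0 <= x <= w - block:
                cand = prev[y:y + block, x:x + block].astype(np.int64)
                sad = int(np.abs(ref - cand).sum())
                if best_sad is None or sad < best_sad:
                    best_sad, best_mv = sad, (-dy, -dx)  # displacement prev -> curr
    return best_mv

# Hedged sketch of S13: an interpolated sub-field obtained by shifting the
# previous frame by half the detected motion vector (60 Hz -> 120 Hz).
# np.roll wraps around at the borders, which is acceptable only for a sketch.
def interpolate_half(prev, mv):
    dy, dx = int(round(mv[0] / 2)), int(round(mv[1] / 2))
    return np.roll(np.roll(prev, dy, axis=0), dx, axis=1)
```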
- The image processing device then generates compensation information for displaying the interpolated image generated in step S13 after a predetermined period of time, so that an image with the target brightness is displayed in the frame to be displayed, based on the motion vector detected in step S11 and the response time information stored in a lookup table (LUT) 5-2. The image processing device synthesizes this compensation information with the input image data, so as to generate compensated image data whose pixel values have been compensated (S15). The compensated image data is used in the next overdrive process (S17). This compensation process step (S15) is performed in the frame ahead of the frame to be displayed. If step S13 is not performed (i.e. if no interpolated image is generated), the image processing device directly obtains, in step S15 and without using an interpolated image, the compensated pixel values for displaying the image of the target brightness in the frame to be displayed, based on the motion vector detected in step S11 and the response time information stored in the lookup table (LUT) 5-2, and then generates the compensated image data from the obtained pixel values. A sketch of this direct form of S15 follows.
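- As a hedged illustration of the direct form of S15 (no interpolated image), the sketch below drives slow pixels part of the way toward their targets one frame early. The response_time() model is a toy stand-in for the LUT 5-2, and comparing the two frames directly stands in for the motion-vector-guided selection of the pixels to compensate.

```python
# Hedged sketch of S15 without an interpolated image: pixels whose 1F target
# differs from their 0F value, and whose response is slower than one frame,
# begin their transition in 0F so they reach the target brightness in 1F.
def response_time(start_tone, target_tone):
    return 1.5 if target_tone > start_tone else 0.8   # frames; toy values

def precompensate(prev_frame, next_frame):
    out = []
    for start, target in zip(prev_frame, next_frame):
        t = response_time(start, target)
        if start != target and t > 1.0:
            head_start = (t - 1.0) / t        # fraction of the swing done early
            out.append(int(round(start + (target - start) * head_start)))
        else:
            out.append(start)                 # fast enough to finish within 1F
    return out

prev = [0, 0, 0, 0, 255, 255]
nxt  = [0, 0, 255, 255, 255, 255]    # the edge advances by the motion vector
print(precompensate(prev, nxt))      # -> [0, 0, 85, 85, 255, 255]
```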
- The image processing device performs an overdrive process on the compensated image data, using the input image data stored in the memory 5-1 and the compensated image data generated in step S15 (S17). As a result, the display image data to be displayed on the hold-type display device is generated. A sketch of a generic overdrive step follows.
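- For orientation, the following is a minimal sketch of a generic overdrive step such as S17: the drive level is pushed past the target in proportion to the requested transition and clipped to the panel's limited drive range. The 0.5 gain is an arbitrary illustrative constant; a real panel would use a measured two-dimensional table rather than this rule.

```python
# Hedged sketch of S17: overdrive pushes the drive level beyond the target
# so the liquid crystal reaches the target brightness within one frame.
def overdrive(prev_level, target_level, gain=0.5):
    boosted = target_level + gain * (target_level - prev_level)
    # The applicable drive range is limited, which is why overdrive alone
    # cannot fully accelerate transitions near black (0) or white (255).
    return max(0, min(255, int(round(boosted))))

print(overdrive(0, 150))    # 225: driven past the target to speed the rise
print(overdrive(0, 255))    # 255: no headroom left near the white limit
```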
- Explanations will now be made to an operation waveform when a step waveform is input to the hold-type display device with reference to
FIG. 7A to FIG. 7D. FIG. 7A to FIG. 7D are explanatory diagrams each showing an example of an operation waveform when a step waveform is input to the hold-type display device. In FIG. 7A to FIG. 7D, the vertical direction indicates the brightness of each pixel included in a frame, while the horizontal direction indicates the position (in the spatial direction) of each pixel included in a frame. In FIG. 7A to FIG. 7D, the areas partitioned by broken lines are referred to as units, each including a plurality of pixels (four pixels in this embodiment). -
FIG. 7A shows the waveform of a step signal input to a general LCD. As shown in FIG. 7A, the input step signal has an edge part at the right end of an N-th unit. Note that the height of this edge is the target brightness in the frame to be displayed. -
FIG. 7B shows an operation waveform when a step signal is input to an LCD that adopts an overdrive system. As shown in FIG. 7B, according to the overdrive system, a voltage greater than the target voltage for displaying the image of the target brightness on the display device is applied, for example in the frame where the input variation first occurs, so as to accelerate the brightness transition. Hence, in the N-th unit, the brightness is greater than the target brightness. Note, however, that a general overdrive system does not detect the movement of a moving object in the frame (i.e. the motion vector), and the voltage is applied evenly, without regard to the motion vector. Thus, the N-th unit as a whole has an even brightness that is greater than the target brightness (each of the pixels included in the N-th unit has an equal brightness). -
FIG. 7C shows an operation waveform when a step signal is input to an LCD adopting a system that adjusts the applied voltage based on a motion vector when performing an overdrive operation, as described in patent document 1. As shown in FIG. 7C, according to this system, the motion vector of the input image is detected when applying a voltage greater than the target voltage, and the voltage applied to each pixel is adjusted based on the detected motion vector. As a result, the motion blur restraining effect in the hold-type display device can be improved as compared with a general overdrive system. - However, as described above, because there is a certain limit on the range of the voltage that can be applied to the liquid crystal, an issue occurs. For example, when the target voltage for a black display or a white display is near the limit of the voltage range (i.e. in the case of a tone variation in a high tone range or a low tone range), a voltage high enough to accelerate the response speed of the liquid crystal may not be applicable, and the motion blur restraining effect may not sufficiently be attained. In consideration of this, according to the present invention, the compensation process described in step S15 of
FIG. 6 is performed. -
FIG. 7D shows an example of an operation waveform when a step signal is input to the image processing device according to the image processing method of the present invention. As shown in FIG. 7D, according to the system of the present invention, the brightness value of each pixel constituting the frame to be displayed is compensated, in the frame one frame ahead of the frame to be displayed, based on the response time information and the motion vector of the input image. This compensation is performed so that the target brightness is attained in each pixel of the frame to be displayed. As a result, the brightness does not drop vertically from a high value to a low value at the edge part of the step signal; rather, an operation waveform is attained in which the brightness gradually decreases in a stair-like manner, in accordance with the response speed of the liquid crystal. In addition to the image processing method of the present invention, FIG. 7D shows the operation waveform when an overdrive system is adopted in consideration of the motion vector. In the present invention, however, the overdrive system may be adopted only as needed, and is not necessarily adopted. - Subsequently, explanations will now be made to the operation of the compensation process in the image processing device according to the present invention, pointing out the waveforms of an input signal input to the image processing device and an output signal output from the image processing device, with reference to
FIG. 8A to FIG. 8C and FIG. 9. FIG. 8A is an explanatory diagram showing an example of the input signal input to the image processing device according to the present invention. FIG. 8B and FIG. 8C are exemplary diagrams each showing an example of the output signal output from the image processing device according to the present invention. FIG. 9 is an exemplary diagram showing the variation in the spatial direction of the intensity of light accumulated on the retina of a user who has seen the hold-type display device displaying an image based on the output signal output from the image processing device according to the present invention. - In
FIG. 8A to FIG. 8C, the position in the horizontal direction shows the position (in the spatial direction) of each pixel constituting the frame, while the position in the vertical direction shows the brightness level output from the display device. In FIG. 8A to FIG. 8C, the areas partitioned by broken lines represent the pixels constituting the frames. In the following explanations, it is assumed that the input signal input to the image processing device has a step waveform, and that the input image based on this step-waveform signal has a motion vector of 4 dot/v. - The signal of the step waveform having edge parts shown in
FIG. 8A is input to the image processing device. As described above, this step signal moves from left to right in the illustration at a speed of 4 dot/v. Before this step signal is input, a black display is given on the display device, and the display shifts to a white display upon input of this step signal. - As shown in
FIG. 8B, in the image processing device according to the present invention, in response to this input step signal, a voltage is applied in advance at the rise part so that the brightness level decreases gradually, in accordance with the response characteristics of the liquid crystal, particularly in order to attain a smooth rise of the hold-type element (liquid crystal or the like) (the compensation process). This process is particularly important at the rise from a black display. At this time, the range over which a voltage is applied in advance is determined based on the motion vector value. In this embodiment, for example, a voltage is applied in advance over the pixel range of 4 dots corresponding to the motion vector value (4 dot/v). When a voltage is applied in advance, the voltage value to be applied may be set for each pixel. For example, as shown in FIG. 8B, a voltage may be applied so that the brightness level gradually decreases in a stair-like manner, or so that it gradually decreases in a straight line rather than in a stair-like manner. To realize a smooth rise, it is preferable that the brightness level decrease in a straight line. -
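As a hedged illustration of this pre-application, the sketch below builds the per-pixel levels written over the 4-dot range ahead of a rising edge. The function name and the equal spacing of the steps are assumptions, not the patent's specification.

```python
# Hedged sketch: brightness levels applied in advance over the |mv| pixels
# ahead of a rising edge (4 dots for a 4 dot/v motion vector). Each entry is
# one pixel's pre-applied level, which renders as a stair; a finer
# subdivision of the levels would approximate the straight-line variant.
def pre_ramp(low, high, mv_px):
    return [int(low + (high - low) * k / (mv_px + 1)) for k in range(1, mv_px + 1)]

print(pre_ramp(0, 255, 4))   # -> [51, 102, 153, 204]
```
-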
FIG. 8C shows an operation waveform when the overdrive technique disclosed in Patent Document 1 is applied to the compensated image data in the image processing device according to the present invention. In this case, as shown in FIG. 8C, a cone-shaped signal is output upon application of the overdrive technique. As a voltage greater than the target voltage is applied by the overdrive technique, that voltage exceeds the voltage value applied in advance for the compensation process. Thus, the brightness level is higher overall than in the case of FIG. 8B (the case of the compensation process of the present invention alone). - The variation in the spatial direction of the intensity of light accumulated on the user's retina, when the display operations described with reference to
FIG. 8A to FIG. 8C are performed, is as shown in FIG. 9. That is, when neither the overdrive technique nor the compensation process of the present invention is performed, the brightness level of the light accumulated on the user's retina does not reach the brightness level of the input step signal, as shown by the curved chain double-dashed line. Thus, a long delay occurs in the display, and a motion blur occurs in the hold-type display device. When only the overdrive technique is performed, a small difference remains between the brightness level of the input step signal and the brightness level of the light accumulated on the user's retina, as shown by the curved broken line. Though the delay in the display is somewhat shorter, a delay still occurs, and the motion blur restraining effect is not sufficiently attained. When both the overdrive technique and the compensation process of the present invention are performed, the brightness level of the light accumulated on the user's retina reaches the brightness level of the input step signal, as shown by the curved solid line, and the brightness level varies gently rather than steeply. As a result, the Eye-Trace Integration Effect can sufficiently be restrained, and a great motion blur restraining effect is realized in the hold-type display device. - Now, explanations will specifically be made to the functional configuration of an
image display system 10 according to an embodiment of the present invention, as a system realizing the above-described functions. FIG. 10 is a block diagram showing the functional configuration of the image processing device 100 constituting the image display system 10 according to this embodiment. FIG. 11 is a block diagram showing the functional configuration of a display device 200 included in the image display system 10 according to this embodiment. - As shown in
FIG. 10 and FIG. 11, the image display system 10 according to this embodiment includes the image processing device 100 and the hold-type display device 200. The image processing device 100 processes externally input image data so as to output display image data. The display device 200 actually displays an image, based on the display image data input from the image processing device 100. Here, the "system" indicates an item including a plurality of logically aggregated devices (functions), and does not express whether each of the plurality of devices (functions) is included in the same casing. Thus, for example, the image processing device 100 and the display device 200 constituting the image display system 10 may be incorporated together so as to be handled as one unit, like a TV receiver, or the display device 200 may be handled as a single casing. Explanations will now specifically be made to the functional configuration of each of the image processing device 100 and the display device 200 constituting this image display system 10. - As shown in
FIG. 10, the image processing device 100 according to this embodiment includes an input image data storage unit 110, a motion vector detecting unit 120, a response time information storage unit 130, a compensation processing unit 140 and an output unit 160. - The input image
data storage unit 110 stores input image data that is externally input to the image processing device 100, in association with each of a plurality of continuous frames. More specifically, for example, when input image data for displaying an image in a frame to be displayed is input to the image processing device 100, the data is stored in the input image data storage unit 110. When input image data for displaying an image in the next frame to be displayed is input to the image processing device 100, the input image data of the frame ahead thereof remains stored, and that data is used for detecting the motion vector by the motion vector detecting unit 120. The input image data stored in the input image data storage unit 110 may, for example, be deleted sequentially from the temporally oldest data as needed. - When the input image data in a frame to be displayed is input, the motion
vector detecting unit 120 extracts, for example, the input image data of the frame one frame ahead of the frame to be displayed from the input image data storage unit 110. The motion vector detecting unit 120 compares the input image data of the frame to be displayed with the input image data of the frame one frame ahead thereof, identifies an object moving in the displayed image, and detects a motion vector of the input image data of the frame to be displayed from the movement direction of this object and its distance. As in this embodiment, the motion vector detecting unit 120 may be one constituent element of the image processing device 100, or it may be one constituent element of a unit external to the image processing device 100, such as an MPEG decoder, an IP converter, or the like. In the latter case, the motion vector of the input image data is detected separately by the external unit and input to the image processing device 100. - The response time
information storage unit 130 stores the time from when a driving voltage is applied to the display device 200 until the display device 200 displays an image with the tone corresponding to that driving voltage (i.e. response time information representing a response time of the hold-type display device), in association with the tone variation value of the display device 200. The response time information may be stored in the response time information storage unit 130 in the form of, for example, a lookup table (LUT), in which the tone variation value and the response time of the display element are stored in association with each other. According to another form of storing the response time information in the response time information storage unit 130, a function indicating the relationship between the tone variation value and the response time of the display element is obtained in advance, and this function is stored in the response time information storage unit 130. In this case, the input image data of the frame to be displayed is compared with the input image data of the frame ahead of it, so as to obtain the tone variation value of each pixel, and the obtained tone variation is converted into response time information using the stored function. Such a unit can be realized with hardware, such as a RAM, a ROM, or the like.
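Both storage forms can be sketched compactly; the sample entries and the fitted linear function below are toy assumptions, not measured panel characteristics.

```python
# Hedged sketch of the two storage forms for response time information.
# Form 1: a LUT keyed by the tone variation (start, target); toy values
# in frames, not panel data.
response_lut = {(0, 64): 1.4, (0, 128): 1.2, (0, 255): 1.5, (255, 0): 0.8}

def response_time_lut(start, target):
    return response_lut.get((start, target), 1.0)     # default: one frame

# Form 2: a function fitted in advance to the tone-variation/response-time
# relationship; the linear model and coefficients are purely illustrative.
def response_time_fn(start, target):
    variation = target - start
    return 1.0 + 0.002 * abs(variation) if variation > 0 else 0.8
```
- The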
compensation processing unit 140 compensates the pixel value of the input image data, for each pixel included in one frame, in the frame one frame ahead of the frame to be displayed, based on the input image data extracted from the input image data storage unit 110, the motion vector detected by the motion vector detecting unit 120, and the response time information extracted from the response time information storage unit 130. As a result of this compensation, the display image data is generated, and the generated image data is output to the output unit 160. - The
compensation processing unit 140 may include an interpolated image generating unit (not illustrated), a display-timing information generating unit (not illustrated) and an image synthesizing unit (not illustrated). The interpolated image generating unit generates an interpolated image to be inserted between input frames, based on the input image data and the motion vector. The display-timing information generating unit generates display-timing information representing the timing at which the interpolated image is displayed after a predetermined period of time, based on the response time information. The image synthesizing unit synthesizes the generated display-timing information with the input image data. In this configuration, the interpolated image generating unit generates an interpolated image in the spatial direction rather than the time direction, based on the motion vector. The display-timing information generating unit can change the interpolated image into display-timing information based on the difference between the response times of the display elements in accordance with the display-tone variation, thereby converting from the spatial direction to the time direction. Thus, by synthesizing the display-timing information with the input image data, the same effect as when an interpolated image in the time direction is generated can be attained (i.e. the effect of a pseudo frame-rate increase), using the interpolated image of the spatial direction that can easily be generated based on the motion vector. - Alternatively, the pixel value may be compensated directly using a spatial filter, such as a moving average filter, without generating the interpolated image. The functional configuration of this latter configuration will be described later.
- The output unit 160 accepts the display image data input from the compensation processing unit 140, and outputs it to the display device 200. - The functional configuration of the above-described
compensation processing unit 140 will now be described more specifically with reference to FIG. 12. FIG. 12 is a block diagram showing the functional configuration of the compensation processing unit 140 according to this embodiment. - As shown in
FIG. 12, the compensation processing unit 140 includes a compensation range setting unit 141, a maximum/minimum value detecting unit 142, an edge detecting unit 143, a high frequency detecting unit 144, an outside replacement unit 145, a filter setting unit 146, a filter processing unit 147, a gain adjusting unit 148, a selecting unit 149 and a synthesizing unit 150. - The compensation
range setting unit 141 sets a compensation range for compensating the pixel values of the input image data, based on the motion vector input from the motion vector detecting unit 120. Specifically, the compensation range setting unit 141 detects an area of the input image data where there is movement (a part corresponding to a moving object), and sets the pixels in that area as the compensation range. The unit transmits the information regarding the set compensation range and the information regarding the input motion vector to the maximum/minimum value detecting unit 142, the edge detecting unit 143, the high frequency detecting unit 144 and the filter setting unit 146. - The maximum/minimum
value detecting unit 142 detects the maximum and minimum values of the input image data (input signal) within the compensation range, based on the information regarding the compensation range transmitted from the compensation range setting unit 141. The information regarding the detected maximum and minimum values of the input signal is transmitted to the edge detecting unit 143 and the outside replacement unit 145. - The
edge detecting unit 143 detects an edge part (or parts) in the input image data (input signal), based on the information regarding the compensation range transmitted from the compensation range setting unit 141, the information regarding the input motion vector, and the information regarding the maximum/minimum values of the input signal transmitted from the maximum/minimum value detecting unit 142. This edge detecting unit 143 detects not only the position of the edge (the edge part), but also the edge direction of the edge part (whether it is a variation from a low tone to a high tone, or a variation from a high tone to a low tone). From this edge direction, it can be determined whether the response of the display element is at a rise or a decay. Information regarding the detected edge part and edge direction is transmitted to the selecting unit 149. - The high
frequency detecting unit 144 detects a high-frequency signal in the spatial frequency of the input image data within the compensation range, based on the information regarding the compensation range transmitted from the compensation range setting unit 141. Here, the high frequency indicates a signal whose half wavelength (½ wavelength) fits in a range narrower than the compensation range, as shown in FIG. 13; that is, the unit detects a signal whose wavelength is shorter than twice the compensation range as a high-frequency signal. This is because, in the case of a high-frequency signal, both a rise area and a decay area exist within the compensation range, interfering with the performance of an adequate process. The detected high-frequency signal is output to the gain adjusting unit 148, and is used for the gain adjustment after the process performed by the filter processing unit 147.
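A hedged sketch of this wavelength test is given below; estimating the dominant half wavelength from the spacing of zero crossings about the mean is an illustrative method, not the patent's circuit.

```python
import numpy as np

# Hedged sketch: flag the signal as high frequency when its half wavelength,
# estimated from adjacent zero crossings of the mean-removed signal, is
# narrower than the compensation range (wavelength < 2 * compensation range).
def is_high_frequency(signal, comp_range_px):
    s = np.asarray(signal, dtype=float)
    s -= s.mean()
    sign_bits = np.signbit(s).astype(np.int8)
    crossings = np.where(np.diff(sign_bits) != 0)[0]
    if len(crossings) < 2:
        return False                      # no oscillation inside the window
    half_wavelength = float(np.min(np.diff(crossings)))
    return half_wavelength < comp_range_px
```
- The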
outside replacement unit 145 performs outside replacement on the input image data (input signal) using its maximum and minimum values, based on the information regarding the maximum and minimum values of the input signal transmitted from the maximum/minimum value detecting unit 142. The replaced input image data (input signal) is transmitted to the filter processing unit 147.
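One plausible reading of this operation, assumed here for illustration only, is a clamp of the signal to the detected extrema before filtering:

```python
# Hedged sketch, assuming "outside replacement" clips the input signal to
# the maximum and minimum values detected in the compensation range; this
# clamp reading is an editorial assumption, not spelled out in the text.
def outside_replace(signal, lo, hi):
    return [min(max(v, lo), hi) for v in signal]

print(outside_replace([-5, 0, 130, 260], 0, 255))   # -> [0, 0, 130, 255]
```
- The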
filter setting unit 146 sets the characteristics of the spatial filter for compensating the pixel values of the input image data so that an image with the tone set based on the input image data is displayed when the display device 200 displays the frame to be displayed. This setting is done based on the input image data, the information regarding the compensation range and the motion vector transmitted from the compensation range setting unit 141, and the response time information extracted from the response time information storage unit 130. The filter characteristics are applied only to the pixels within the compensation range. The spatial filter of this embodiment may be a moving average filter, such as a low-pass filter (LPF). The filter characteristics according to this embodiment include, for example, the area filtered by the filter and the number of taps of the filter. Such filter characteristics can be realized by appropriately setting the filter coefficients of the filter matrix. Information regarding the set filter characteristics is transmitted to the filter processing unit 147. - Explanations will now be made to examples of setting the filter characteristics with reference to
FIG. 14 and FIG. 15. FIG. 14 and FIG. 15 are explanatory diagrams each showing an example of the setting of the filter characteristics by the filter setting unit 146 according to this embodiment. -
FIG. 14 shows an example of setting different filter characteristics for the rise and the decay of the display element (liquid crystal or the like). In this example, the filter is used only for the rise area of the edge. FIG. 14 shows, as input signals, four kinds of step signals that move from left to right in the illustration. The signals have different maximum values (maximum brightness), different minimum values (minimum brightness), and different edge heights (differences between the maximum and minimum values). In FIG. 14, the values "255" and "0" indicate the brightness values of each pixel. - As shown in
FIG. 14, though different compensation values are given to the pixels in accordance with the tone variations (the differences between the maximum and minimum brightness values), the filter may be used only for the rise area of the edge. Specifically, though not illustrated in FIG. 14, the filter setting unit 146 acquires, for example, the information regarding the edge direction detected by the edge detecting unit 143, and determines from the direction of the tone variation in the edge part whether the edge is in the rise area or the decay area. The setting unit can then set the filter characteristics so that they apply only when the edge is determined to be in the rise area. -
FIG. 15 shows an example of setting the number of taps of the spatial filter in accordance with the motion vector value of the input image data. In this example, the number of taps of the filter is changed in proportion to the motion vector value. In FIG. 15, four kinds of step signals move from left to right in the illustration by different movement values (motion vector values), as input signals. From left to right in the illustration, the step signals are those of: a still image (movement value 0 dot/v); a movement value of 2 dot/v; a movement value of 4 dot/v; and a movement value of 6 dot/v. The values "255" and "0" in FIG. 15 indicate the brightness values of each pixel. - In the example of
FIG. 15, the filter setting unit 146 sets filter characteristics that include a number of taps equal to the motion vector value (number of pixels) of the input image data (e.g. the number of taps is "2" if the movement value is 2 dot/v). Accordingly, the greater the motion vector value of the input image signal (the higher the movement speed), the greater the number of taps of the filter, and the more precisely the compensation process of the pixel values can be performed. Thus, according to the image processing device 100 of this embodiment, the greater the motion vector value of the input image data, the more effectively the motion blur can be restrained in the hold-type display device 200.
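The tap rule and the moving-average filtering that follows can be sketched as below; tying the tap count directly to the motion vector magnitude follows the example above, while the uniform coefficients are the usual moving-average choice, not a quoted filter matrix.

```python
import numpy as np

# Hedged sketch: tap count taken from the motion vector value, uniform
# moving-average (low-pass) coefficients, applied only inside the
# compensation range of a one-dimensional brightness signal.
def set_taps(mv_px):
    return max(1, abs(int(mv_px)))            # e.g. 2 taps for 2 dot/v

def filter_compensation_range(signal, comp_start, comp_end, mv_px):
    taps = set_taps(mv_px)
    kernel = np.ones(taps) / taps             # uniform filter coefficients
    filtered = np.convolve(signal, kernel, mode='same')
    out = np.array(signal, dtype=float)
    out[comp_start:comp_end] = filtered[comp_start:comp_end]
    return out
```
- The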
filter processing unit 147 filters the outside-replaced input image data transmitted from the outside replacement unit 145, using a filter having the filter characteristics set by the filter setting unit 146, in the frame one frame ahead of the frame to be displayed by the display device 200. By doing so, the pixel values of the pixels in the compensation range are compensated. The input image data whose pixel values have been compensated is transmitted to the gain adjusting unit 148. Though the filter processing unit 147 of this embodiment filters the outside-replaced input image data, it need not necessarily filter the outside-replaced data, and may filter the input image data itself. - To avoid an error in the high frequency, the
gain adjusting unit 148 performs gain adjustment on the compensated input image data transmitted from the filter processing unit 147, based on the high-band signal transmitted from the high frequency detecting unit 144. The gain-adjusted input image data is transmitted to the selecting unit 149. - As a result of the detection by the
edge detecting unit 143, the selecting unit 149 accepts as inputs: the information regarding the edge part and the edge direction transmitted from the edge detecting unit 143; the input image data whose pixel values have been compensated, transmitted from the filter processing unit 147; and the input image data whose pixel values have not been compensated, extracted from the input image data storage unit 110. The selecting unit 149 selects either the input image data whose pixel values have been compensated by the filter processing unit 147 or the input image data whose pixel values have not been compensated, in accordance with the input information regarding the edge part and the edge direction. When the selecting unit 149 selects the input image data whose pixel values have been compensated (i.e. filtering is performed), it outputs that data to the synthesizing unit 150. More particularly, when it is determined from the edge direction that the edge part is in the rise area from a low tone to a high tone, the selecting unit 149 selects the input image data whose pixel values have been compensated; when it is determined from the edge direction that the edge part is in the decay area from a high tone to a low tone, the selecting unit 149 selects the input image data whose pixel values have not been compensated. By such processing, only the rise area is filtered, as explained for FIG. 14.
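A hedged sketch of this selection rule follows; the string encoding of the edge direction is an assumption for illustration, not the patent's signal format.

```python
# Hedged sketch: choose the filtered data only for rising edges
# (low tone -> high tone); keep the unfiltered data for decaying edges.
def select_output(edge_direction, filtered, unfiltered):
    # edge_direction: 'rise' (low -> high) or 'decay' (high -> low);
    # this encoding is illustrative only.
    return filtered if edge_direction == 'rise' else unfiltered
```
- In this embodiment, the selecting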
unit 149 is provided in the stage after the filter processing unit 147. The selecting unit 149 accepts both the input image data that has been filtered by the filter processing unit 147 and the externally input image data itself, and uses a system of selecting one of the filter-processed input image data and the externally input image data. However, the system is not limited to this. For example, the selecting unit 149 may determine in advance, before the filter processing unit 147 performs the filter process, whether to perform the filter process; the filter process may then be performed only when the selecting unit 149 determines that it is to be performed (for example, when the edge part is determined to be in the rise area). - When the filter-processed input image data is input from the selecting
unit 149, the synthesizing unit 150 synthesizes the externally input image data itself (the data that has not been filter-processed) with the filter-processed input image data, and outputs the result to the output unit 160. When no filter-processed input image data is input from the selecting unit 149, the synthesizing unit 150 outputs the externally input image data itself, which has not been filter-processed, to the output unit 160. - The functional configuration of the
image processing device 100 has specifically been described above. The configuration of the display device 200 will now be explained with reference to FIG. 11. As shown in FIG. 11, the display device 200 is a hold-type display device, and includes an image display unit 210, a source driver 220, a gate driver 230 and a display controlling unit 240. - The
image display unit 210 displays an image corresponding to the display image data input from the image processing device 100. The image display unit 210 is, for example, a dot matrix type display in an m×n matrix arrangement. Specific examples of the image display unit 210 are an active matrix type OLED (Organic Light Emitting Diode) display using a-Si (amorphous silicon) TFTs, an LCD, and the like. - The
source driver 220 and the gate driver 230 are driving units for driving the image display unit 210 in the m×n matrix arrangement. The source driver 220 supplies a data line 221 with a data signal, while the gate driver 230 supplies a scanning line 231 with a select signal (address signal). - The
display controlling unit 240 controls the driving of the image display unit 210 (the driving of the source driver 220 and the gate driver 230), based on the display image data input from the image processing device 100. More specifically, the display controlling unit 240 outputs a control signal to be supplied to each driver (the source driver 220 and the gate driver 230) at an appropriate timing, based on the display image data (video signal) obtained from the image processing device 100. - The explanations above have described examples of the functions of the
image processing device 100 and the display device 200 according to this embodiment. Each of the above-described constituent elements may be formed using a widely used member or circuit, or may be formed with hardware specialized for the function of the constituent element. Each function of the constituent elements may also be executed by a CPU or the like. Thus, the applicable configuration may be changed appropriately in accordance with the technical level at the time this embodiment is implemented. - Explanations will now be made to a hardware configuration of the
image processing device 100 according to this embodiment with reference to FIG. 16. FIG. 16 is a block diagram showing the hardware configuration of the image processing device according to this embodiment. - The
image processing device 100 mainly includes a CPU (Central Processing Unit) 901, a ROM (Read Only Memory) 903, a RAM (Random Access Memory) 905, a host bus 907, a bridge 909, an external bus 911, an interface 913, an input device 915, an output device 917, a storage device 919, a drive 921, a connection port 923, and a communication device 925. - The
CPU 901 functions as an arithmetic device and a control device, and controls all or part of the operations of the image processing device 100 in accordance with various programs stored in the ROM 903, the RAM 905, the storage device 919 or a removable recording medium 927. The ROM 903 stores programs and arithmetic parameters used by the CPU 901. The RAM 905 temporarily stores programs used in the execution of the CPU 901 and parameters that change appropriately during that execution. These are connected with each other through the host bus 907, including an internal bus such as a CPU bus. - The
host bus 907 is connected to the external bus 911, such as a PCI (Peripheral Component Interconnect/Interface) bus, through the bridge 909. - The
input device 915 is an operational unit operated by users, such as a mouse, a keyboard, a touch panel, a button, a switch or a lever. The input device 915 may be a remote control unit (a so-called remote control) using infrared rays or other electric waves, or an external connection unit 929, such as a cell phone or a PDA, corresponding to the operations of the image processing device 100. Further, the input device 915 includes an input control circuit which generates an input signal based on the information input by a user with the above-described operation unit and outputs it to the CPU 901. By operating this input device 915, the user of the image processing device 100 can input various data into the image processing device 100 and instruct it to perform processing operations. - The
output device 917 includes a device that can visually or aurally inform the user of acquired information. The device 917 may, for example, be a display device such as a CRT display device, a liquid crystal display device, a plasma display device, an EL display device or a lamp, an audio output device such as a speaker or headphones, a printer device, a cell phone, a facsimile, or the like. Specifically, the display device displays various information, such as image data, in text or image form, while the audio output device converts audio data into a voice and outputs it. - The
storage device 919 is a device for data storage, configured as an example of a storage unit of the image processing device 100 according to this embodiment. The storage device 919 includes a magnetic storage device, such as an HDD (Hard Disk Drive), a semiconductor storage device, an optical storage device, or a magneto-optical storage device. This storage device 919 stores programs to be executed by the CPU 901, various data, and externally acquired image signal data. - The
drive 921 is a reader/writer for a storage medium, and is incorporated in or externally attached to the image signal processing device. The drive 921 reads information recorded on a removable recording medium 927, such as an inserted magnetic disk, optical disk, magneto-optical disk or semiconductor memory, and outputs the information to the RAM 905. The drive 921 can also write records to the removable recording medium 927, such as the inserted magnetic disk, optical disk, magneto-optical disk or semiconductor memory. The removable recording medium 927 may, for example, be a DVD medium, an HD-DVD medium, a Blu-ray medium, a CompactFlash (CF) (registered trademark), a memory stick, or an SD memory card (Secure Digital memory card). The removable recording medium 927 may also be an IC card (Integrated Circuit card) having a contactless IC chip installed thereon, or an electronic unit. - The
connection port 923 is a port for directly connecting a unit to the image processing device 100, such as a USB (Universal Serial Bus) port, an IEEE 1394 port (such as i.Link), a SCSI (Small Computer System Interface) port, an RS-232C port, an optical audio terminal, or the like. Upon connection of the external connection unit 929 to this connection port 923, the image processing device 100 directly acquires image signal data from the external connection unit 929, and provides the external connection unit 929 with image signal data. - The communication device 925 is a communication interface including a communication device or the like for connecting to a
communication network 10. The communication device 925 is, for example, a communication card for a wired or wireless LAN (Local Area Network), Bluetooth or WUSB (Wireless USB), a router for optical communication, a router for ADSL (Asymmetric Digital Subscriber Line), or a modem for various kinds of communication. This communication device 925 can transmit and receive image signals and the like to and from the Internet or other communication units. The communication network 10 connected to the communication device 925 includes networks connected through cables or wirelessly, and may include the Internet, a home LAN, infrared communication or satellite communication. - According to the above-described configuration, the
image processing device 100 can acquire information regarding an input image signal from various information sources, such as an external connection unit 929 connected to the connection port 923 or the communication network 10, and can transmit the image signal to the display device 200. - The hardware configuration of the
display device 200 according to this embodiment is substantially the same as that of the image processing device 100, and thus will not be explained here. - The above has described an example of the hardware configuration capable of realizing the functions of the
image processing device 100 and the display device 200 according to this embodiment. Each of the above-described constituent elements may include a widely used member, or may include hardware specialized for the function of the constituent element. Thus, the applicable configuration may be changed appropriately in accordance with the technical level at the time this embodiment is implemented. - The configuration of the
image processing device 100 and the display device 200 according to this embodiment has specifically been described above. Now, explanations will be made to an image processing method according to this embodiment, using the image processing device 100 having such a configuration, with reference to FIG. 17. FIG. 17 is a flowchart showing the processing flow of the image processing method according to this embodiment. - The image processing method according to this embodiment processes input image data externally input to the
image processing device 100, thereby generating the display image data output to the hold-type display device 200. - Specifically, as shown in
FIG. 17, when the input image data is externally input to the image processing device 100, it is stored in the input image data storage unit 110 (S101) and also input to the motion vector detecting unit 120. - When input image data in a frame to be displayed is input to the motion
vector detecting unit 120, the motion vector detecting unit 120 extracts, for example, the input image data of the frame one frame ahead of the frame to be displayed from the input image data storage unit 110. The motion vector detecting unit 120 compares the input image data of the frame to be displayed with the input image data of the frame one frame ahead thereof, identifies an object moving in the display image, and detects a motion vector of the input image data of the frame to be displayed based on the object's movement direction and its distance (S103). The detected motion vector is transmitted to the compensation processing unit 140 and the like. - When the input image data in the frame to be displayed is externally input, the
compensation processing unit 140 extracts the response time information corresponding to the tone variation value of each pixel in the frame to be displayed from the response time information storage unit 130 (S105). The compensation processing unit 140 then performs a compensation process for compensating the pixel values of the input image data, for each pixel included in the frame one frame ahead of the frame to be displayed, based on the externally input image data, the motion vector input from the motion vector detecting unit 120 and the response time information extracted from the response time information storage unit 130 (S107). As a result of this compensation process, the display image data is generated, and the compensation processing unit 140 outputs the generated display image data to the output unit 160 (S109). - When the display image data is input from the
compensation processing unit 140, the output unit 160 outputs the input display image data to the display device 200 (S111). - Explanations will now be made to a specific example of the compensation process step according to this embodiment (S107) with reference to
FIG. 18. FIG. 18 is a flowchart showing a specific example of the compensation process according to this embodiment. - As shown in
FIG. 18, when input image data is externally input to the compensation processing unit 140 (S201), the compensation range setting unit 141 sets a compensation range for compensating the pixel values of the input image data, based on the motion vector input from the motion vector detecting unit 120 (S203). Specifically, the compensation range setting unit 141 detects an area with movement in the input image data (a part corresponding to a moving object), and sets the pixels in that area as the compensation range. Further, the compensation range setting unit 141 transmits the information regarding the set compensation range and the information regarding the input motion vector to the maximum/minimum value detecting unit 142, the edge detecting unit 143, the high frequency detecting unit 144, the filter setting unit 146 and the like. - The maximum/minimum
value detecting unit 142 detects the maximum and minimum values of the input image data (input signal) in the compensation range, based on the information regarding the compensation range transmitted from the compensation range setting unit 141 (S205). Further, the maximum/minimum value detecting unit 142 transmits the information regarding the detected maximum and minimum values of the input signal to the edge detecting unit 143 and the outside replacement unit 145. - The
edge detecting unit 143 detects an edge area in the input image data (input signal), based on the information regarding the compensation range transmitted from the compensation range setting unit 141, the information regarding the input motion vector and the information regarding the maximum/minimum values of the input signal transmitted from the maximum/minimum value detecting unit 142 (S207). At this time, the edge detecting unit 143 detects not only the position of the edge (the edge part), but also the edge direction in the edge part (whether it is a variation from a low tone to a high tone, or a variation from a high tone to a low tone). Further, the edge detecting unit 143 transmits the information regarding the detected edge part and edge direction to the selecting unit 149. - The high
frequency detecting unit 144 detects a high-frequency signal in the spatial frequency of the input image data in the compensation range, based on the information regarding the compensation range transmitted from the compensation range setting unit 141 (S209). Here, the high frequency represents a signal whose half wavelength (½ wavelength) fits in a range narrower than the compensation range; that is, the detecting unit detects a signal whose wavelength is shorter than twice the compensation range as a high-band signal. This is because, in the case of a high-band signal, both a rise area and a decay area exist in the compensation range, interfering with the performance of an adequate process. The high frequency detecting unit 144 outputs the detected high-band signal to the gain adjusting unit 148, where it is used for the gain adjustment after the process performed by the filter processing unit 147. - The
outside replacement unit 145 performs outside replacement on the input image data (input signal) using its maximum and minimum values, based on the information regarding the maximum and minimum values of the input signal transmitted from the maximum/minimum value detecting unit 142 (S211). Further, the outside replacement unit 145 transmits the replaced input image data (input signal) to the filter processing unit 147. - When the input image data in the frame to be displayed is externally input, and when the information regarding the compensation range from the compensation
range setting unit 141 and the motion vector are transmitted, the filter setting unit 146 extracts the response time information corresponding to the tone variation value of each pixel in the frame to be displayed (S213). - The
filter setting unit 146 sets the characteristics of the spatial filter for compensating the pixel values of the input image data so that the image having the tone set based on the input image data is displayed when the display device 200 displays the frame to be displayed, based on the input image data, the information regarding the compensation range, the motion vector and the response time information (S215). The spatial filter in this embodiment may be a moving average filter, such as a low-pass filter (LPF). The filter characteristics of this embodiment may include the area filtered by the filter and the number of taps of the filter; such characteristics can be realized by appropriately setting the filter coefficients of the filter matrix. Further, the filter setting unit 146 transmits the information regarding the set filter characteristics to the filter processing unit 147. - The
filter processing unit 147 performs a filter process for compensating the pixel value of each pixel positioned in the compensation range, by applying a filter having the filter characteristics set by the filter setting unit 146 to the outside-replaced input image data transmitted from the outside replacement unit 145, in the frame one frame ahead of the frame to be displayed by the display device 200 (S217). Further, the filter processing unit 147 transmits the input image data whose pixel values have been compensated to the gain adjusting unit 148. The filter processing unit 147 according to this embodiment applies the filter to the outside-replaced input image data; however, the filter need not necessarily be applied to the outside-replaced data, and may be applied to the input image data itself. - In order to avoid an error in the high frequency, the
gain adjusting unit 148 performs gain adjustment on the compensated input image data transmitted from the filter processing unit 147, based on the high-band signal transmitted from the high frequency detecting unit 144 (S219). Further, the gain adjusting unit 148 transmits the gain-adjusted input image data to the selecting unit 149. - Based on the detection result of the
edge detecting unit 143, upon input of the input image data whose pixel values have been compensated, transmitted from the filter processing unit 147, and the input image data whose pixel values have not been compensated, extracted from the input image data storage unit 110, the selecting unit 149 selects either the input image data whose pixel values have been compensated by the filter processing unit 147 or the input image data whose pixel values have not been compensated, in accordance with the information regarding the input edge part and edge direction. In the specific process, the selecting unit 149 determines, based on the edge direction, whether the edge part is in the rise area from a low tone to a high tone or in the decay area from a high tone to a low tone (S221). - As a result of this determination, when it is determined that the edge part of the input image data is in the rise area, the selecting
unit 149 selects the input image data whose pixel values have been compensated (S223), and outputs the selected (filter-processed) input image data (S225). - As a result of the determination of step S221, when it is determined that the edge part of the input image data is in the decay area, the selecting
- As a result of the determination in step S221, when it is determined that the edge part of the input image data is in the decay area, the selecting unit 149 selects the input image data whose pixel values have not been compensated (S227).
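The branch at S221 through S227 reduces to a simple per-edge choice; a minimal sketch, assuming the edge direction has already been classified as rising or decaying (the boolean flag is hypothetical):

```python
def select_output(original_row, compensated_row, edge_is_rise):
    """Per S221/S223/S227: use the filter-compensated data only for
    edges rising from a low tone to a high tone; keep the unmodified
    input for decaying (high-to-low) edges."""
    return compensated_row if edge_is_rise else original_row
```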
- Finally, when the filter-processed input image data is input from the selecting unit 149, the synthesizing unit 150 synthesizes the externally input image data itself (not filter processed) with the filter-processed input image data (S229), and outputs the result to the output unit 160 (S231). When no filter-processed input image data is input, the synthesizing unit 150 outputs the externally input image data itself, not filter processed, to the output unit 160 (S233).
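The synthesis at S229 can be read as a masked composite; a minimal sketch, assuming a boolean mask marking the pixels the filter process actually touched (the mask formulation is an assumption, not described in these terms in the specification):

```python
import numpy as np

def synthesize(externally_input, filter_processed, mask):
    """Take filter-processed pixels where compensation was applied
    (mask True) and pass the externally input image data through
    unchanged everywhere else (S229/S233)."""
    return np.where(mask, filter_processed, externally_input)
```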
- In this embodiment, the selection process by the selecting unit 149 is performed after the filter process by the filter processing unit 147, and the selecting unit 149 selects either the filter-processed input image data or the externally input image data. However, the timing of this process is not limited to the above. For example, before the filter processing unit 147 performs the filter process, the selecting unit 149 may determine in advance whether to perform the filter process; in that case, the filter process may be performed only when the selecting unit 149 determines that it is to be performed (for example, when it determines that the edge part is in the rise area).
- It should be understood by those skilled in the art that various modifications, combinations, sub-combinations and alterations may occur depending on design requirements and other factors insofar as they are within the scope of the appended claims or the equivalents thereof.
Claims (16)
1. An image processing device which processes externally input image data and outputs the processed image data to a hold-type display device, comprising:
a motion vector detecting unit which detects a motion vector of the input image data;
a response time information storage unit which stores response time information representing the time from when a driving voltage is applied to the display device until the display device displays an image with a tone corresponding to the driving voltage, in association with a tone variation value;
a compensation processing unit which compensates the pixel value of each pixel in the image data, in a frame which is one frame ahead of a frame to be displayed by the display device, based on the input image data, the motion vector and the response time information; and
an output unit which outputs the image data compensated by the compensation processing unit to the display device.
2. The image processing device according to claim 1 , further comprising an edge detecting unit which detects an edge from the input image data, based on the motion vector.
3. The image processing device according to claim 2 , wherein
the compensation processing unit determines whether to perform a compensation process for the pixel value, in accordance with a detection result of the edge detecting unit.
4. The image processing device according to claim 3 , wherein
the compensation processing unit decides whether to perform the compensation process in accordance with an edge direction of an edge part detected by the edge detecting unit.
5. The image processing device according to claim 4 , wherein
the compensation processing unit decides to perform the compensation process, when it is determined that the edge part detected by the edge detecting unit is in a rise area from a low tone to a high tone based on the edge direction, and decides not to perform the compensation process, when it is determined that the edge part is in a decay area from a high tone to a low tone based on the edge direction.
6. The image processing device according to claim 1 , wherein the compensation processing unit includes:
a compensation range setting unit which sets a compensation range for compensating a pixel value in the image data based on the motion vector;
a filter setting unit which sets a characteristic of a filter for compensating the pixel value in the image data, based on the image data, the motion vector and the response time information, so that an image with the tone set based on the image data is displayed when the display device displays the frame to be displayed; and
a filter processing unit which compensates a pixel value of the pixel within the compensation range by filtering the image data with a filter having the characteristic set by the filter setting unit, in the frame that is one frame ahead of the frame to be displayed by the display device.
7. The image processing device according to claim 6 , further comprising an edge detecting unit which detects an edge from the input image data based on the motion vector.
8. The image processing device according to claim 7 , wherein the compensation processing unit further includes a selecting unit which selects either one of the image data whose pixel value has been compensated by the filter processing unit and the image data whose pixel value has not been compensated by the filter processing unit, in accordance with a detection result of the edge detecting unit.
9. The image processing device according to claim 8 , wherein
the selecting unit selects either one of image data whose pixel value has been compensated and image data whose pixel value has not been compensated, in accordance with an edge direction of an edge part detected by the edge detecting unit.
10. The image processing device according to claim 9 , wherein
the selecting unit selects the image data whose pixel value has been compensated, when it is determined that the edge part detected by the edge detecting unit is in a rise area from a low tone to a high tone, and selects the image data whose pixel value has not been compensated, when it is determined that the edge part is in a decay area from a high tone to a low tone, based on the edge direction.
11. The image processing device according to claim 6 , wherein
the filter setting unit changes the number of taps of the filter in accordance with a motion vector value detected by the motion vector detecting unit.
12. The image processing device according to claim 6 , wherein
the filter is a moving average filter.
13. The image processing device according to claim 12 , wherein
the compensation processing unit further includes an outside replacement unit which replaces the outside of the input image data using a maximum value and a minimum value of a tone of the image data, and
the filter processing unit filters, with the filter, the image data processed by the outside replacement unit.
14. An image display system comprising:
an image processing device which processes externally input image data; and
a hold-type display device which displays the image data processed by and input from the image processing device; wherein:
the image processing device includes
a motion vector detecting unit which detects a motion vector of the input image data,
a response time information storage unit which stores response time information representing the time from when a driving voltage is applied to the display device until the display device displays an image with a tone corresponding to the driving voltage, in association with a tone variation value,
a compensation processing unit which compensates the pixel value of each pixel in the image data, in a frame that is one frame ahead of a frame to be displayed by the display device, based on the input image data, the motion vector and the response time information, and
an output unit which outputs the image data compensated by the compensation processing unit to the display device; and
the display device includes:
an image display unit which displays an image corresponding to the image data input from the image processing device, and
a display controlling unit which controls driving of the image display unit based on the image data input from the image processing device.
15. An image processing method for processing externally input image data and generating image data to be output to a hold-type display device, the method comprising the steps of:
detecting a motion vector of the input image data;
extracting response time information from a response time information storage unit which stores the response time information representing the time from when a driving voltage is applied to the display device until the display device displays an image with a tone corresponding to the driving voltage, in association with a tone variation value;
compensating the pixel value of each pixel in the image data, in a frame that is one frame ahead of a frame to be displayed by the display device, based on the input image data, the motion vector and the response time information; and
outputting the compensated image data to the display device.
16. A program for controlling a computer to function as an image processing device which processes externally input image data and outputs it to a display device performing hold-type driving, the program comprising:
a motion vector detecting function which detects a motion vector of the input image data;
a response time storage function for storing response time information representing the time from when a driving voltage is applied to the display device until the display device displays an image with a tone corresponding to the driving voltage, in association with a tone variation value;
a compensation processing function for compensating the pixel value of each pixel in the image data, in a frame that is one frame ahead of a frame to be displayed by the display device, based on the input image data, the motion vector and the response time information; and
an outputting function for outputting the image data compensated by the compensation processing function to the display device.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JPP2007-326342 | 2007-12-18 | ||
JP2007326342 | 2007-12-18 |
Publications (1)
Publication Number | Publication Date |
---|---|
US20090153743A1 true US20090153743A1 (en) | 2009-06-18 |
Family
ID=40752718
Family Applications (3)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US12/316,837 Abandoned US20090153743A1 (en) | 2007-12-18 | 2008-12-17 | Image processing device, image display system, image processing method and program therefor |
US12/653,187 Expired - Fee Related US8411104B2 (en) | 2007-12-18 | 2009-12-09 | Image processing device and image display system |
US12/653,477 Expired - Fee Related US8452119B2 (en) | 2007-12-18 | 2009-12-15 | Image processing device and image display system |
Family Applications After (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US12/653,187 Expired - Fee Related US8411104B2 (en) | 2007-12-18 | 2009-12-09 | Image processing device and image display system |
US12/653,477 Expired - Fee Related US8452119B2 (en) | 2007-12-18 | 2009-12-15 | Image processing device and image display system |
Country Status (3)
Country | Link |
---|---|
US (3) | US20090153743A1 (en) |
JP (2) | JP5024634B2 (en) |
CN (1) | CN101510401B (en) |
Families Citing this family (34)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP4455649B2 (en) * | 2005-11-07 | 2010-04-21 | シャープ株式会社 | Image display method and image display apparatus |
US8648784B2 (en) * | 2006-01-03 | 2014-02-11 | Mstar Semiconductor, Inc. | Device and method for overdriving a liquid crystal display |
US8908100B2 (en) * | 2007-12-28 | 2014-12-09 | Entropic Communications, Inc. | Arrangement and approach for motion-based image data processing |
US8310592B2 (en) * | 2008-10-10 | 2012-11-13 | Panasonic Corporation | Signal processing apparatus, signal processing method, and program for signal processing |
JP5473373B2 (en) * | 2009-04-01 | 2014-04-16 | キヤノン株式会社 | Image processing apparatus and image processing method |
JP5324391B2 (en) * | 2009-10-22 | 2013-10-23 | キヤノン株式会社 | Image processing apparatus and control method thereof |
JP5370214B2 (en) | 2010-02-25 | 2013-12-18 | セイコーエプソン株式会社 | Video processing circuit, video processing method, liquid crystal display device, and electronic apparatus |
JP5381807B2 (en) * | 2010-02-25 | 2014-01-08 | セイコーエプソン株式会社 | VIDEO PROCESSING CIRCUIT, ITS PROCESSING METHOD, LIQUID CRYSTAL DISPLAY DEVICE, AND ELECTRONIC DEVICE |
JP5381804B2 (en) * | 2010-02-25 | 2014-01-08 | セイコーエプソン株式会社 | Video processing circuit, video processing method, liquid crystal display device, and electronic apparatus |
KR20110131897A (en) * | 2010-06-01 | 2011-12-07 | 삼성전자주식회사 | Method of processing data and display apparatus performing the method |
JP5558934B2 (en) * | 2010-06-28 | 2014-07-23 | キヤノン株式会社 | Image processing apparatus and image processing method |
JP5804837B2 (en) * | 2010-11-22 | 2015-11-04 | キヤノン株式会社 | Image display apparatus and control method thereof |
JP5601173B2 (en) * | 2010-11-26 | 2014-10-08 | セイコーエプソン株式会社 | Video processing method, video processing circuit, liquid crystal display device, and electronic apparatus |
JP5763933B2 (en) * | 2011-02-10 | 2015-08-12 | キヤノン株式会社 | Image processing apparatus and image processing method |
JP5682454B2 (en) * | 2011-05-30 | 2015-03-11 | 株式会社Jvcケンウッド | Video processing apparatus and interpolation frame generation method |
US8953907B2 (en) | 2011-06-15 | 2015-02-10 | Marvell World Trade Ltd. | Modified bicubic interpolation |
US9286845B2 (en) * | 2011-07-15 | 2016-03-15 | Sharp Kabushiki Kaisha | Liquid crystal display device and method of driving the same |
JP2013137418A (en) * | 2011-12-28 | 2013-07-11 | Panasonic Liquid Crystal Display Co Ltd | Liquid crystal display device |
JP2013219462A (en) * | 2012-04-05 | 2013-10-24 | Sharp Corp | Image processing device, image display device, image processing method, computer program, and recording medium |
US20140166991A1 (en) * | 2012-12-17 | 2014-06-19 | Dmitri E. Nikonov | Transparent light-emitting display |
JP5510580B2 (en) * | 2013-03-15 | 2014-06-04 | セイコーエプソン株式会社 | Signal processing device, signal processing method, liquid crystal display device, and electronic apparatus |
US20140333669A1 (en) * | 2013-05-08 | 2014-11-13 | Nvidia Corporation | System, method, and computer program product for implementing smooth user interface animation using motion blur |
JP6398162B2 (en) * | 2013-09-25 | 2018-10-03 | セイコーエプソン株式会社 | Image processing circuit, electro-optical device and electronic apparatus |
JP6288818B2 (en) * | 2013-11-11 | 2018-03-07 | 株式会社Joled | Signal generation apparatus, signal generation program, signal generation method, and image display apparatus |
KR102211592B1 (en) * | 2014-03-19 | 2021-02-04 | 삼성전자주식회사 | Electronic device for processing image and method thereof |
TWI514369B (en) * | 2014-05-29 | 2015-12-21 | Au Optronics Corp | Signal conversion method for display image |
US9811882B2 (en) * | 2014-09-30 | 2017-11-07 | Electronics And Telecommunications Research Institute | Method and apparatus for processing super resolution image using adaptive preprocessing filtering and/or postprocessing filtering |
CN104317085B (en) * | 2014-11-13 | 2017-01-25 | 京东方科技集团股份有限公司 | Data voltage compensation method, data voltage compensation device and display device |
JP7143316B2 (en) * | 2017-03-23 | 2022-09-28 | シグニファイ ホールディング ビー ヴィ | Lighting system and method |
CN111418000B (en) * | 2017-12-07 | 2023-09-19 | 株式会社半导体能源研究所 | Display device and working method thereof |
KR102500625B1 (en) * | 2018-03-27 | 2023-02-17 | 삼성디스플레이 주식회사 | Image processing device, display device having the same, and image processing method of the same |
JP7303877B2 (en) | 2018-11-16 | 2023-07-05 | テレフレックス メディカル インコーポレイテッド | surgical clip |
CN111210790B (en) * | 2020-04-20 | 2020-07-24 | 南京熊猫电子制造有限公司 | Liquid crystal display device for improving moving image display quality |
CN114913821B (en) * | 2022-05-31 | 2024-03-22 | 合肥京东方显示技术有限公司 | Display module, control method thereof and display device |
Family Cites Families (17)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5369446A (en) * | 1992-04-30 | 1994-11-29 | Thomson Consumer Electronics, Inc. | Video signal processor employing edge replacement, preshoots and overshoots for transient enhancement |
JP3769463B2 (en) * | 2000-07-06 | 2006-04-26 | 株式会社日立製作所 | Display device, image reproducing device including display device, and driving method thereof |
WO2002099753A1 (en) * | 2001-06-05 | 2002-12-12 | Sony Corporation | Image processor |
KR100532099B1 (en) * | 2002-12-26 | 2005-11-29 | 삼성전자주식회사 | Apparatus and method for converting frame rate |
ATE441283T1 (en) * | 2003-12-01 | 2009-09-15 | Koninkl Philips Electronics Nv | MOTION COMPENSATED INVERSE FILTERING WITH BANDPASS FILTERS FOR MOTION SMURRY REDUCTION |
US7630576B2 (en) * | 2004-02-19 | 2009-12-08 | Sony Corporation | Signal processing apparatus and method, and command-sequence data structure |
EP1589763A2 (en) * | 2004-04-20 | 2005-10-26 | Sony Corporation | Image processing apparatus, method and program |
KR100691324B1 (en) * | 2005-07-22 | 2007-03-12 | 삼성전자주식회사 | Liquid crystal display apparatus |
KR101182298B1 (en) * | 2005-09-12 | 2012-09-20 | 엘지디스플레이 주식회사 | Apparatus and method for driving liquid crystal display device |
KR100731048B1 (en) * | 2005-10-20 | 2007-06-22 | 엘지.필립스 엘시디 주식회사 | Apparatus and method for driving liquid crystal display device |
KR100769195B1 (en) * | 2006-02-09 | 2007-10-23 | 엘지.필립스 엘시디 주식회사 | Apparatus and method for driving liquid crystal display device |
KR100769196B1 (en) * | 2006-03-20 | 2007-10-23 | 엘지.필립스 엘시디 주식회사 | Apparatus and method for driving liquid crystal device |
JP4172495B2 (en) * | 2006-05-09 | 2008-10-29 | ソニー株式会社 | Image display device, signal processing device, image processing method, and computer program |
CN101072343A (en) * | 2006-05-12 | 2007-11-14 | 松下电器产业株式会社 | Image processing device, method and integrated circuit |
JP4530002B2 (en) * | 2007-07-06 | 2010-08-25 | セイコーエプソン株式会社 | Hold-type image display device |
US20090153743A1 (en) * | 2007-12-18 | 2009-06-18 | Sony Corporation | Image processing device, image display system, image processing method and program therefor |
US9407890B2 (en) * | 2008-01-14 | 2016-08-02 | Broadcom Corporation | Method and system for sharpening the luma and the chroma signals |
2008
- 2008-12-17 US US12/316,837 patent/US20090153743A1/en not_active Abandoned
- 2008-12-18 CN CN200810191096.9A patent/CN101510401B/en not_active Expired - Fee Related
- 2008-12-18 JP JP2008322299A patent/JP5024634B2/en not_active Expired - Fee Related
- 2008-12-18 JP JP2008322300A patent/JP5176936B2/en not_active Expired - Fee Related

2009
- 2009-12-09 US US12/653,187 patent/US8411104B2/en not_active Expired - Fee Related
- 2009-12-15 US US12/653,477 patent/US8452119B2/en not_active Expired - Fee Related
Patent Citations (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5689305A (en) * | 1994-05-24 | 1997-11-18 | Kabushiki Kaisha Toshiba | System for deinterlacing digitally compressed video and method |
US6025879A (en) * | 1996-08-29 | 2000-02-15 | Kokusai Denshin Denwa Kabushiki Kaisha | System for moving object detection in moving picture |
US6828540B2 (en) * | 2000-07-06 | 2004-12-07 | California Institute Of Technology | Image sensor system operating with small amplitude scanning |
US7054367B2 (en) * | 2001-12-31 | 2006-05-30 | Emc Corporation | Edge detection based on variable-length codes of block coded video |
US20040189565A1 (en) * | 2003-03-27 | 2004-09-30 | Jun Someya | Image data processing method, and image data processing circuit |
US20050030302A1 (en) * | 2003-07-04 | 2005-02-10 | Toru Nishi | Video processing apparatus, video processing method, and computer program |
US20070132683A1 (en) * | 2005-12-08 | 2007-06-14 | Lg Philips Lcd Co., Ltd. | Apparatus and method for driving liquid crystal display device |
US8279341B1 (en) * | 2007-02-26 | 2012-10-02 | MotionDSP, Inc. | Enhancing the resolution and quality of sequential digital images |
Cited By (24)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20100098349A1 (en) * | 2007-12-18 | 2010-04-22 | Sony Corporation | Image processing device and image display system |
US8452119B2 (en) * | 2007-12-18 | 2013-05-28 | Sony Corporation | Image processing device and image display system |
US20110292068A1 (en) * | 2009-03-13 | 2011-12-01 | Sharp Kabushiki Kaisha | Image display method and image display apparatus |
US9001195B2 (en) | 2009-09-30 | 2015-04-07 | Sony Corporation | Image display device, image display viewing system and image display method |
EP2306743A3 (en) * | 2009-09-30 | 2014-03-19 | Sony Corporation | Image display device, image display viewing system and image display method |
US20110074938A1 (en) * | 2009-09-30 | 2011-03-31 | Sony Corporation | Image display device, image display viewing system and image display method |
US20110157209A1 (en) * | 2009-12-28 | 2011-06-30 | Canon Kabushiki Kaisha | Image processing apparatus and image processing method |
US8704843B2 (en) * | 2009-12-28 | 2014-04-22 | Canon Kabushiki Kaisha | Image processing apparatus and image processing method |
US8686978B2 (en) | 2010-01-15 | 2014-04-01 | Seiko Epson Corporation | Video processing circuit, video processing method, liquid crystal display apparatus and electronic device |
US20110176071A1 (en) * | 2010-01-15 | 2011-07-21 | Seiko Epson Corporation | Video processing circuit, video processing method, liquid crystal display apparatus and electronic device |
US20110304709A1 (en) * | 2010-06-09 | 2011-12-15 | Yoshio Umeda | Video display apparatus and video viewing system |
US9411182B2 (en) | 2012-06-18 | 2016-08-09 | Seiko Epson Corporation | Signal processing device, signal processing method, liquid crystal device, and electronic apparatus |
JP2014240937A (en) * | 2013-06-12 | 2014-12-25 | シャープ株式会社 | Display device |
US9761180B2 (en) * | 2013-08-09 | 2017-09-12 | Seiko Epson Corporation | Integrated circuit, display device, electronic apparatus, and display control method |
US20150077442A1 (en) * | 2013-08-09 | 2015-03-19 | Seiko Epson Corporation | Integrated circuit, display device, electronic apparatus, and display control method |
US20160293085A1 (en) * | 2015-04-02 | 2016-10-06 | Apple Inc. | Electronic Device With Image Processor to Reduce Color Motion Blur |
US10283031B2 (en) * | 2015-04-02 | 2019-05-07 | Apple Inc. | Electronic device with image processor to reduce color motion blur |
CN106297653A (en) * | 2016-10-28 | 2017-01-04 | 重庆工商职业学院 | A kind of LED screen pixel brightness correcting method based on image procossing and system thereof |
US10311808B1 (en) | 2017-04-24 | 2019-06-04 | Facebook Technologies, Llc | Display latency calibration for liquid crystal display |
US10553164B1 (en) | 2017-04-24 | 2020-02-04 | Facebook Technologies, Llc | Display latency calibration for liquid crystal display |
US10140955B1 (en) * | 2017-04-28 | 2018-11-27 | Facebook Technologies, Llc | Display latency calibration for organic light emitting diode (OLED) display |
US10276130B1 (en) | 2017-04-28 | 2019-04-30 | Facebook Technologies, Llc | Display latency calibration for organic light emitting diode (OLED) display |
US10339897B1 (en) * | 2017-04-28 | 2019-07-02 | Facebook Technologies, Llc | Display latency calibration for organic light emitting diode (OLED) display |
US10991324B2 (en) * | 2019-02-18 | 2021-04-27 | Beijing Boe Display Technology Co., Ltd. | Overdrive method and device, controller, display apparatus, and storage medium |
Also Published As
Publication number | Publication date |
---|---|
CN101510401A (en) | 2009-08-19 |
JP5176936B2 (en) | 2013-04-03 |
JP2009169411A (en) | 2009-07-30 |
CN101510401B (en) | 2013-06-19 |
US20100156772A1 (en) | 2010-06-24 |
US8411104B2 (en) | 2013-04-02 |
US8452119B2 (en) | 2013-05-28 |
JP2009169412A (en) | 2009-07-30 |
JP5024634B2 (en) | 2012-09-12 |
US20100098349A1 (en) | 2010-04-22 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US20090153743A1 (en) | Image processing device, image display system, image processing method and program therefor | |
US7705816B2 (en) | Generating corrected gray-scale data to improve display quality | |
TWI326443B (en) | Dynamic gamma correction circuit, method thereof and plane display device | |
US8217880B2 (en) | Method for driving liquid crystal display apparatus | |
TWI408634B (en) | Dynamically selecting either frame rate conversion (frc) or pixel overdrive in an lcd panel based display | |
US8576925B2 (en) | Image processing apparatus and image processing method, and program | |
US8933917B2 (en) | Timing controller, display apparatus including the same, and method of driving the same | |
US20080239158A1 (en) | Adaptive gamma voltage switching method and device using the same | |
US7209106B2 (en) | High-quality-image liquid crystal display device and the driving method thereof | |
US8447131B2 (en) | Image processing apparatus and image processing method | |
KR100935404B1 (en) | Display device | |
CN101751894B (en) | Image processing device and image display system | |
US8614717B2 (en) | Device and method for selecting image processing function | |
US20120327140A1 (en) | Liquid crystal display for reducing motion blur | |
CN101751893B (en) | Image processing device and image display system | |
KR20110080846A (en) | Display driving method and apparatus using the same | |
CN106531043A (en) | Display device | |
TW201248609A (en) | A display control device and method thereof for reducing the amount of image zooming | |
KR100926306B1 (en) | Liquid crystal display and apparatus and method for driving thereof | |
JPH0646357A (en) | Liquid crystal panel driving device | |
CN107665678B (en) | Liquid crystal display and driving method thereof | |
KR101552886B1 (en) | Apparatus and method of improving definition of moving picture | |
KR20070081656A (en) | Display apparatus and method for controlling thereof | |
WO2019056602A1 (en) | Liquid crystal display and drive method therefor |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment | Owner name: SONY CORPORATION, JAPAN. Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:ARASHIMA, KENJI;REEL/FRAME:022048/0590. Effective date: 20081027 |
STCB | Information on status: application discontinuation | Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |