WO2015193953A1 - Image display device and optical device - Google Patents
Image display device and optical device
- Publication number
- WO2015193953A1 (PCT/JP2014/065929)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- light
- image
- user
- unit
- optical
- Prior art date
Classifications
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/02—Viewing or reading apparatus
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T1/00—General purpose image data processing
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/64—Constructional details of receivers, e.g. cabinets or dust covers
Definitions
- the present invention relates to an image display apparatus and an optical device.
- JP 2008-241822 (Patent Document 1) is known as background art in this technical field.
- Patent Document 1 uses a visible-light-reflecting, infrared-light-transmitting mirror 2-a or a visible-light-transmitting, infrared-light-reflecting mirror 2-b to minimize the loss of light, and provides a means of combining the two functions of image display and iris imaging in one optical system.
- an image display device such as a head mounted display (Head Mounted Display, hereinafter abbreviated as HMD) that a user wears on the head is known.
- the user can acquire various types of information by visually recognizing content such as moving images and still images displayed by the HMD.
- the HMD can hold or display not only public information disclosed to everyone but also confidential information disclosed only to specific individuals. Therefore, it is desirable that the HMD have a function of identifying the user and present an image only to an authorized user who has the authority to use it.
- Patent Document 1 discloses an image display apparatus that includes a visible light source, an infrared light source, and an imaging device, and that has an image display function of displaying an image with the visible light source and an imaging function of capturing, with the imaging device, infrared light that is emitted from the infrared light source and returned from the retina or iris, for biometric authentication processing using the eyes.
- Patent Document 1 has a problem in that the infrared light source for capturing a retinal image or the like is arranged separately from the optical system for image display, so the optical system becomes large.
- the present invention has been made to solve this problem, with the goal of making an image display device that has an imaging function for imaging an authentication target for biometric authentication processing using the eyes smaller than before.
- the present invention includes an optical processing unit that performs an image display process of displaying an image using light output from a light source unit and an imaging process of imaging an authentication target for biometric authentication processing using the eyes, wherein the optical processing unit captures the authentication target by receiving reflected light of the displayed image reflected by the user's eyes.
- an image display device having an imaging function for imaging an authentication target for performing biometric authentication processing using eyes can be made smaller than before.
- FIG. 1 is a block diagram illustrating a functional configuration of a video display device according to an embodiment of the invention.
- FIG. 23 is a block diagram illustrating a functional configuration of the video display device 10 according to the embodiment of the invention.
- the video display apparatus 10 includes an optical unit 200, a control unit 201, an image processing unit 202, an information storage unit 203, a communication processing unit 204, and a communication input / output unit 205.
- the video display device 10 will be described as an example.
- the video display device 10 may display not only video, that is, moving images, but also still images.
- the control unit 201 plays a role of controlling each unit included in the video display device 10 and gives a command to each unit included in the video display device 10.
- the information storage unit 203 stores a video that the video display device 10 presents to the user, an image necessary for biometric authentication using eyes such as a retina image or an iris image, and the like.
- biometric authentication using an eye such as a retina image or iris image is referred to as “biometric authentication”.
- the image processing unit 202 acquires a video, an image necessary for biometric authentication, and the like from the information storage unit 203 in accordance with an instruction from the control unit 201, and outputs them to the optical unit 200.
- the image processing unit 202 also acquires a captured image generated by the optical unit 200, and performs determination on the acquired captured image with a determination unit (not illustrated) in the image processing unit 202.
- the image processing unit 202 outputs the determination result to the control unit 201.
- the optical unit 200 presents the video or image received from the image processing unit 202 to the user in accordance with an instruction from the control unit 201. In addition, the optical unit 200 performs imaging in accordance with a command from the control unit 201.
- FIG. 24 is a diagram illustrating an operation mode of the video display system according to the embodiment of the present invention.
- the video display device 10 can communicate with an information processing device 212 different from the video display device 10. Communication processing with the information processing device 212 is performed by the communication processing unit 204 in accordance with an instruction from the control unit 201. Communication with the information processing apparatus 212 is performed via the communication input / output unit 205 and the Internet 211.
- the optical unit 200, the control unit 201, the image processing unit 202, the information storage unit 203, and the communication processing unit 204 described in the above embodiment are realized by a combination of software and hardware in the video display device 10.
- a hardware configuration of the video display apparatus 10 that realizes each component according to the present embodiment will be described with reference to FIG.
- FIG. 27 is a block diagram illustrating a hardware configuration of the video display apparatus 10 according to the present embodiment.
- the video display apparatus 10 according to the present embodiment has the same configuration as a general server, a PC (Personal Computer), or the like. That is, the video display device 10 according to the present embodiment includes a CPU (Central Processing Unit) 11, a RAM (Random Access Memory) 20, a ROM (Read Only Memory) 30, an HDD (Hard Disk Drive) 40, and an I/F 50, which are connected via a bus 80. A display unit 60 and an operation unit 70 are connected to the I/F 50.
- the CPU 11 is a calculation means and controls the entire operation of the video display device 10.
- the RAM 20 is a volatile storage medium capable of reading and writing information at high speed, and is used as a work area when the CPU 11 processes information.
- the ROM 30 is a read-only nonvolatile storage medium and stores a program such as firmware.
- the HDD 40 is a non-volatile storage medium that can read and write information, and stores an OS (Operating System), various control programs, application programs, and the like. In addition to the HDD, a semiconductor storage device such as an SSD (Solid State Drive) may be used.
- the I / F 50 connects and controls the bus 80 and various hardware and networks.
- the display unit 60 is a visual user interface for the user to check the state of the video display device 10.
- the operation unit 70 is a user interface for the user to input information to the video display device 10 such as various hard buttons and a touch panel.
- a program stored in a storage medium such as the ROM 30, the HDD 40, or an optical disk (not shown) is read into the RAM 20, and the software control unit is configured by the CPU 11 performing calculations according to those programs.
- a functional block that realizes the function of the video display apparatus 10 according to the present embodiment is configured by a combination of the software control unit configured as described above and hardware.
- FIG. 1 is a diagram illustrating the configuration of an optical unit 200 according to this embodiment.
- the optical unit 200 according to the present embodiment includes a light source unit 1, an illumination optical unit 2, an optical processing unit 3 that combines a video generation unit (image generation unit) and a retinal imaging unit, and a projection optical unit 4 including an element that changes the traveling direction of light. Further, as shown in FIG. 1, the light emitted from the projection optical unit 4 reaches the user's eyeball 6, and the light emitted from the user's eyeball 6 enters the projection optical unit 4.
- the light emitted from the light source unit 1 reaches the user's retina 7 through the illumination optical unit 2, the optical processing unit 3, the projection optical unit 4, and the user's crystalline lens 8.
- the light reaching the retina 7 is scattered by the retina 7, and a part of the scattered light reaches the optical processing unit 3 through the crystalline lens 8 and the projection optical unit 4.
- the light source unit 1 emits visible light for presenting an image to the user.
- the light source unit 1 includes a light source that emits white light or a plurality of light sources that emit light of different colors.
- the illumination optical unit 2 adjusts the illuminance distribution of the light emitted from the light source unit 1 so that the image generation unit in the optical processing unit 3 is illuminated with substantially the same illuminance without excess or deficiency.
- the illumination optical unit 2 also plays a role of combining light emitted from the plurality of light sources that constitute the light source unit 1.
- a case where the light source unit 1 includes a plurality of light sources will be described below.
- FIG. 2 is a diagram illustrating the configuration of the light source unit 1 and the illumination optical unit 2 according to this embodiment.
- the light source unit 1 includes, for example, a light source 1A, a light source 1B, and a light source 1C.
- the light sources 1A to 1C each emit light of one of red, green, and blue. As long as the three light sources together cover the three colors, any light source may emit any of the colors.
- Each of the light sources 1A to 1C may be mounted in an independent package, or two or more light sources may be integrated and mounted in one package.
- the illumination optical unit 2 includes, for example, a light tunnel 21A and a lens 21B.
- the light emitted from the light source unit 1 enters the light tunnel 21A of the illumination optical unit 2. Since the light incident on the light tunnel 21A is reflected a plurality of times by the inner wall of the light tunnel 21A, the illuminance distribution of the light emitted from the light tunnel 21A becomes substantially uniform. The light emitted from the light tunnel 21A passes through the lens 21B.
- the lens 21B collects the divergent light emitted from the light tunnel 21A. Accordingly, the illumination optical unit 2 has a function of combining the light emitted from the light source unit 1 and illuminating the image generation unit in the optical processing unit 3 with a substantially uniform illuminance distribution.
- FIG. 3 is a diagram illustrating another configuration of the illumination optical unit 2.
- the light emitted from the light source unit 1 enters the diffusion plate 22A, diffuses, and exits the diffusion plate 22A. Accordingly, the illumination optical unit 2 has a function of combining the light emitted from the light source unit 1 and illuminating the image generation unit in the optical processing unit 3 with a substantially uniform illuminance distribution.
- FIG. 4 is a configuration diagram of another embodiment of the light source unit 1 and the illumination optical unit 2 of the present embodiment.
- the light source unit 1 includes, for example, a light source 1A that emits red light, a light source 1B that emits green light, and a light source 1C that emits blue light.
- the illumination optical unit 2 includes lenses 23A, 23B, and 23C, a dichroic mirror 23D that transmits red light and reflects green light, a relay lens 23H, a dichroic mirror 23E that transmits red light and green light and reflects blue light, a microlens array 23F, and a lens 23G.
- the red light emitted from the light source 1A is transmitted through the lens 23A.
- the lens 23A serves to make the divergent light emitted from the light source 1A substantially parallel light.
- the light transmitted through the lens 23A travels toward the dichroic mirror 23D and enters the dichroic mirror 23D at an incident angle of approximately 45 degrees.
- the green light emitted from the light source 1B is transmitted through the lens 23B.
- the lens 23B plays a role of making divergent light emitted from the light source 1B substantially parallel light.
- the light transmitted through the lens 23B travels toward the dichroic mirror 23D and enters the dichroic mirror 23D at an incident angle of approximately 45 degrees.
- the light source 1A, the light source 1B, and the dichroic mirror 23D are arranged so that the red light emitted from the light source 1A and the green light emitted from the light source 1B are substantially orthogonal to each other, and so that the red light transmitted through the dichroic mirror 23D and the green light reflected by the dichroic mirror 23D travel in the same direction along substantially the same optical axis.
- the light that has passed through the dichroic mirror 23D passes through the relay lens 23H, travels toward the dichroic mirror 23E, and enters the dichroic mirror 23E at an incident angle of approximately 45 degrees.
- the relay lens 23H plays a role of correcting a difference in light spread caused by a difference between an optical path length from the light sources 1A and 1B to the dichroic mirror 23E and an optical path length from the light source 1C to the dichroic mirror 23E.
- Blue light emitted from the light source 1C is transmitted through the lens 23C.
- the lens 23C serves to make the divergent light emitted from the light source 1C substantially parallel light.
- the light transmitted through the lens 23C travels toward the dichroic mirror 23E and enters the dichroic mirror 23E at an incident angle of approximately 45 degrees.
- the light source 1C and the dichroic mirror 23E are arranged so that the light that has passed through the dichroic mirror 23D and the blue light emitted from the light source 1C are substantially orthogonal to each other, and so that the light transmitted through the dichroic mirror 23E and the blue light reflected by the dichroic mirror 23E travel in the same direction along substantially the same optical axis.
- although the case where the light source 1A, the light source 1B, and the light source 1C emit red light, green light, and blue light, respectively, has been described, any assignment of colors to the light sources may be used. In that case, the reflection / transmission characteristics of the dichroic mirror 23D and the dichroic mirror 23E need to be changed appropriately so that the light emitted from the dichroic mirror 23E travels in the same direction along substantially the same optical axis.
- the light emitted from the dichroic mirror 23E is transmitted through the microlens array 23F and the lens 23G.
- the microlens array 23F and the lens 23G are designed and arranged so that the microlens array 23F is imaged on the optical processing unit 3 and the size of the formed image is approximately equal to the size of the image generation unit in the optical processing unit 3. Accordingly, the illumination optical unit 2 has a function of combining the light emitted from the light source unit 1 and illuminating the image generation unit in the optical processing unit 3 with a substantially uniform illuminance distribution.
- the optical processing unit 3 includes a video generation unit having a function of performing video generation processing for generating a video by intensity-modulating incident light, and a retinal imaging unit having a function of performing imaging processing of capturing a retina.
- as the optical processing unit 3, for example, an integrated liquid crystal display element / imaging element 31 including a transmissive monochrome liquid crystal display element and an imaging element can be used.
- the configuration of the optical processing unit 3 described below is one of the main points of the present embodiment.
- the generation of a video by the optical processing unit 3 means that a video is displayed using the light output from the light source unit 1. Therefore, the video generation unit having a function of performing video generation processing can also be said to be a video display unit having a function of performing video display processing. In other words, the optical processing unit 3 includes a video display unit having a function of performing video display processing. In the following embodiments, this processing will be described as video generation processing.
- FIG. 5 is an example of a schematic diagram of a cross section of the integrated liquid crystal display element / imaging element 31.
- the integrated liquid crystal display element / imaging element 31 includes a polarizing plate 3A, a pixel electrode 3B, a photodiode 3C, a liquid crystal layer 3D, a counter electrode 3E, and a polarizing plate 3F.
- the photodiode 3C is configured to receive at least one of red light, green light, and blue light traveling in the direction from the polarizing plate 3F toward the photodiode 3C, and not to receive light traveling in the direction from the polarizing plate 3A toward the photodiode 3C. Therefore, the photodiode 3C plays no role in the image display function of the optical unit 200.
- the integrated liquid crystal display element / imaging element 31 operates as a liquid crystal panel with respect to light traveling in the direction from the polarizing plate 3A to the polarizing plate 3F.
- the light emitted from the light source unit 1 and passed through the illumination optical unit 2 enters the polarizing plate 3A in the direction from the polarizing plate 3A to the polarizing plate 3F. Since the polarizing plate 3A transmits light having a specific direction of polarization, the transmitted light of the polarizing plate 3A has a specific direction of polarization.
- the light transmitted through the polarizing plate 3A enters the liquid crystal layer 3D.
- the liquid crystal layer 3D is sandwiched between the pixel electrode 3B and the counter electrode 3E, and changes the orientation of the liquid crystal molecules according to the voltage applied between the pixel electrode 3B and the counter electrode 3E.
- the light incident on the liquid crystal layer 3D is rotated in polarization according to the orientation of the liquid crystal molecules in the liquid crystal layer 3D, and exits the liquid crystal layer 3D.
- Since the polarizing plate 3F transmits only a specific polarization component, the integrated liquid crystal display element / imaging element 31 can apply intensity modulation to the light incident on it from the light source unit 1.
- the optical unit 200 can generate an image by driving the light source unit 1 and the integrated liquid crystal display element / imaging element 31 in synchronization.
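The intensity modulation described above can be sketched with a standard idealization that the patent does not spell out: treating the liquid crystal cell as a polarization rotator between the polarizing plates 3A and 3F, the transmitted intensity follows Malus's law.

```latex
% Light leaving polarizing plate 3A is linearly polarized. The liquid
% crystal layer 3D rotates its polarization by an angle \theta that
% depends on the voltage between pixel electrode 3B and counter
% electrode 3E. Polarizing plate 3F, oriented at angle \theta_F,
% then transmits (Malus's law):
I_{\mathrm{out}} = I_{\mathrm{in}} \cos^{2}(\theta - \theta_F)
% Varying the pixel voltage varies \theta and hence the pixel brightness.
```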
- the frequency for updating the full color image is defined as the refresh rate, for example, 60 Hz.
- a red image, a green image, and a blue image are displayed in order within 1/60 seconds. Since switching between images of different colors is faster than the temporal resolution of the user's eyes, the user cannot distinguish between images of different colors and perceives them as full-color images.
- the optical unit 200 displays a red image.
- the red light source is turned on, and the green light source and the blue light source are turned off.
- the RGB value is obtained for each pixel of the image to be displayed, and the voltage between the pixel electrode 3B and the counter electrode 3E corresponding to the pixel is set according to the value of R.
- the transmitted light of the integrated liquid crystal display element / imaging element 31 becomes an image composed only of the red component of the image to be displayed.
- the optical unit 200 displays a green image.
- the green light source is turned on, and the red light source and the blue light source are turned off.
- the RGB value is obtained for each pixel of the image to be displayed, and the voltage between the pixel electrode 3B and the counter electrode 3E corresponding to the pixel is set according to the G value.
- the transmitted light of the integrated liquid crystal display element / image pickup element 31 becomes an image composed only of the green component of the image to be displayed.
- the optical unit 200 displays a blue image.
- the blue light source is turned on, and the red light source and the green light source are turned off.
- the RGB value is obtained for each pixel of the image to be displayed, and the voltage between the pixel electrode 3B and the counter electrode 3E corresponding to the pixel is set according to the value of B.
- the transmitted light of the integrated liquid crystal display element / imaging element 31 becomes an image composed only of the blue component of the image to be displayed.
- Display of one full-color image is completed by displaying the red, green, and blue images.
- the red, green, and blue images may be displayed according to the image to be displayed.
- display is performed in the order of red, green, and blue images, but the order is not limited thereto. For example, it is possible to display in a different order such as red, blue, and green.
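The field-sequential drive described in the steps above (one light source on per sub-frame, pixel voltages set from the matching color channel, three sub-frames per refresh period) can be sketched as follows. The function and variable names are illustrative assumptions, not identifiers from the patent:

```python
# Hypothetical sketch of field-sequential color display: within one refresh
# period (e.g. 1/60 s), the red, green, and blue component images are shown
# in sequence, each with only the matching light source turned on.

REFRESH_RATE_HZ = 60  # one full-color frame per 1/60 s

def display_full_color_frame(frame_rgb, set_light_sources, set_pixel_voltages):
    """frame_rgb: dict mapping 'R'/'G'/'B' to a 2-D list of channel values."""
    for color in ("R", "G", "B"):  # other orders, e.g. R, B, G, also work
        # Turn on only the light source matching this sub-frame's color.
        set_light_sources({c: (c == color) for c in ("R", "G", "B")})
        # Set the voltage between pixel electrode 3B and counter electrode 3E
        # of each pixel according to this color channel of the image.
        set_pixel_voltages(frame_rgb[color])

# Minimal stubs to exercise the loop:
log = []
display_full_color_frame(
    {"R": [[255]], "G": [[128]], "B": [[0]]},
    set_light_sources=lambda states: log.append(("sources", states)),
    set_pixel_voltages=lambda channel: log.append(("voltages", channel)),
)
```

Because the three sub-frames complete within one refresh period, faster than the eye's temporal resolution, the user perceives a single full-color image.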
- the light source unit 1 includes a plurality of light sources, and the optical processing unit 3 is described as an example including a transmission type monochrome liquid crystal display element and an image sensor.
- the light source unit 1 may include a white light source, and the optical processing unit 3 may include a transmissive color liquid crystal display element and an image sensor.
- FIG. 21 is an example of a schematic cross-sectional diagram of an integrated liquid crystal display element / imaging element 35 including a transmissive color liquid crystal display element and an imaging element.
- the integrated liquid crystal display element / imaging element 35 includes a polarizing plate 3A, a pixel electrode 3B, a photodiode 3C, a liquid crystal layer 3D, a counter electrode 3E, a polarizing plate 3F, color filters 3L, 3M, and 3N, and a light shielding film 3P.
- the color filter 3L transmits only red light
- the color filter 3M transmits only green light
- the color filter 3N transmits only blue light.
- the optical unit 200 can generate an image by turning on the white light source that constitutes the light source unit 1 and setting, according to the RGB values of each pixel of the image to be presented, the voltage between the pixel electrode 3B and the counter electrode 3E corresponding to each of the R, G, and B sub-pixels of that pixel.
- the light emitted from the optical processing unit 3 enters the projection optical unit 4.
- the projection optical unit 4 plays the role of generating a virtual image perceived by the user from the video generated by the optical processing unit 3, and also serves as a light direction changing unit that changes the traveling direction of light so that the light incident on the projection optical unit 4 from the optical processing unit 3 enters the user's eyeball.
- FIG. 6A is a diagram illustrating the configuration of the projection optical unit 4 according to this embodiment.
- the projection optical unit 4 includes, for example, a convex lens 41A and a plane mirror 41B.
- the convex lens 41A is arranged such that the distance a (> 0) between the optical processing unit 3 and the convex lens 41A is shorter than the focal length f (> 0) of the convex lens 41A.
- the image formed by the convex lens 41A from the video generated by the optical processing unit 3 is a virtual image, and the size of the virtual image is f / (f - a) times that of the video generated by the optical processing unit 3. Since f / (f - a) > 1, the projection optical unit 4 can generate a virtual image larger than the video generated by the optical processing unit 3.
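The stated magnification follows from the thin-lens equation, a standard optics result sketched here (with b denoting the image distance, a symbol not used in the patent):

```latex
\frac{1}{a} + \frac{1}{b} = \frac{1}{f}
\;\Rightarrow\;
b = \frac{af}{a - f} < 0 \quad (\text{virtual image, since } 0 < a < f),
\qquad
m = -\frac{b}{a} = \frac{f}{f - a} > 1 .
```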
- the flat mirror 41B changes the traveling direction of the light emitted from the convex lens 41A.
- the light reflected from the plane mirror 41B enters the user's eyeball 6.
- FIG. 7 is a diagram showing the plane mirror 41B and the user's eyeball 6 from the viewpoint on the right side of the user.
- the height h of the flat mirror 41B is smaller than the diameter of the user's pupil (for example, 2 mm), which determines the range over which an external object such as the object 9 can be perceived visually. Therefore, light rays emitted from the object 9, located on the extension of the line through the center of the crystalline lens 8 and the center of the plane mirror 41B, pass above and below the plane mirror 41B like the light rays 91A and 91B and can reach the user's retina 7. Thereby, the user can visually perceive the object 9, and see-through capability is ensured.
- the projection optical unit 4 may include a plurality of convex lenses or concave lenses and a plane mirror. By using a plurality of convex lenses or concave lenses, it is possible to reduce aberration caused by the lenses.
- FIG. 6B is another configuration diagram of the projection optical unit 4 according to the present embodiment.
- the projection optical unit 4 may include at least one lens and a triangular prism 41C having an incident surface, a reflecting surface, and an emitting surface.
- the incident surface, reflecting surface, and emitting surface are all flat.
- the height of the triangular prism 41C is smaller than the diameter of the user's pupil (for example, 2 mm). Thereby, see-through property can be secured.
- the triangular prism 41C may have another shape such as a cube prism.
- FIG. 6C is another configuration diagram of the projection optical unit 4 according to the present embodiment.
- the projection optical unit 4 may be configured by a curved mirror 41D.
- the curved mirror 41D plays both the role of generating a virtual image perceived by the user from the video generated by the optical processing unit 3 and the role of changing the traveling direction of light so that the light incident on the projection optical unit 4 from the optical processing unit 3 enters the user's eyeball.
- the height of the curved mirror 41D is smaller than the diameter of the user's pupil (for example, 2 mm). Thereby, see-through property can be secured.
- the projection optical unit 4 may be configured by a prism having an incident surface, a reflection surface, and an output surface. At least one of the incident surface, reflecting surface, and emitting surface is a curved surface.
- the prism plays both the role of generating a virtual image perceived by the user from the video generated by the optical processing unit 3 and the role of changing the traveling direction of light so that the light incident on the projection optical unit 4 from the optical processing unit 3 enters the user's eyeball.
- the height of the prism is smaller than the diameter of the user's pupil (for example, 2 mm). Thereby, see-through property can be secured.
- FIG. 6D is another configuration diagram of the projection optical unit 4 according to the present embodiment.
- the projection optical unit 4 may also be configured using, for example, a planar beam splitter 41E having a reflectance of 20% or more, as shown in FIG. 6D.
- with the planar beam splitter 41E, even if the height of the planar beam splitter 41E is larger than the diameter of the user's pupil, part of the light rays emitted from an object on its extension line is transmitted and can reach the user's retina 7. Even with such a configuration, the user can visually perceive the object, and see-through capability is ensured.
- instead of the planar beam splitter 41E, it is also possible to use a prism having a reflectance of 20% or more on its reflecting surface.
- the configuration of the projection optical unit 4 according to the present embodiment is not limited to the above forms; it may combine components of the above forms, or take another form such as a single lens or a prism in which at least one of the incident surface, the reflecting surface, and the emitting surface is a curved surface.
- the light emitted from the projection optical unit 4 enters the user's eyeball 6 and then reaches the retina 7.
- the user perceives the light reaching the retina 7.
- the user perceives the video as if the virtual image exists in front of the eyes.
- a part of the light reaching the user's retina 7 from the light source unit 1 is scattered by the retina 7.
- Part of the scattered light enters the optical processing unit 3 through the crystalline lens 8 and the projection optical unit 4.
- the light incident on the integrated liquid crystal display element / imaging element 31 passes through the polarizing plate 3F, the counter electrode 3E, and the liquid crystal layer 3D, and enters the photodiode 3C.
- the light incident on the photodiode 3C is detected by the photodiode 3C because it travels in the direction from the polarizing plate 3F to the photodiode 3C.
- as described above, the optical processing unit 3 displays an image using the light output from the light source unit 1, and by receiving the light of the displayed image reflected by the user's eyes, images the retina, which is the authentication target for biometric authentication processing using the eyes.
- since the optical path is shared between the light for image (video) display and the light for retinal imaging, and the light source unit 1 serves as both the light source for image display and the light source for retinal imaging in the optical unit 200, it is possible to reduce the size of an image display apparatus having an imaging function for imaging an authentication target for biometric authentication processing using the eyes.
- The projection optical unit 4 includes an element that changes the traveling direction of light so that light incident on the projection optical unit 4 from the optical processing unit 3 enters the user's eyeball; a video display device equipped with such an element can thus be provided.
- However, the configuration in which the projection optical unit 4 includes an element that changes the traveling direction of light is not essential; the projection optical unit 4 may instead be configured so that light incident from the optical processing unit 3 enters the user's eyeball directly.
- the case where the pixel electrode 3B and the photodiode 3C are integrally arranged as shown in FIG. 5 has been described as an example.
- such a configuration is not essential, and the pixel electrode 3B and the photodiode 3C may be disposed apart from each other as long as the photodiode 3C is within the focal depth so that an image of the retina can be captured.
- FIG. 8 is a diagram illustrating a retina authentication preparation image and a retina authentication image.
- FIG. 22 is a flowchart illustrating retinal authentication processing by the image processing unit 202.
- When the image processing unit 202 receives an instruction to start retinal authentication from the control unit 201, it acquires the retina authentication preparation image 71A shown in FIG. 8 from the information storage unit 203 and outputs the preparation image 71A to the optical unit 200 (S2201).
- the optical unit 200 presents the retina authentication preparation image 71A acquired from the image processing unit 202 to the user.
- the retina authentication preparation image 71A is a predetermined image that gives an instruction to the user so that the user intentionally looks at a specific part of the video, and is stored in the information storage unit 203 in advance.
- The retina authentication preparation image 71A includes a mark 71B that serves as a guide for the user's gaze position and an instruction sentence 71C such as "Look here."; that is, it includes line-of-sight guidance information for guiding the user's line of sight to a predetermined position.
- The reason such a retina authentication preparation image 71A is displayed is so that the optical processing unit 3 can acquire a clear captured image of the user's retina 7.
- When the user intentionally views the displayed video, the crystalline lens 8 adjusts so that the displayed video is formed on the retina 7. Therefore, the retina 7 and the optical processing unit 3 need to be in the relationship of image plane and object plane. For this reason, the image processing unit 202 displays the retina authentication preparation image 71A and guides the user to view the video intentionally.
- The image processing unit 202 that has output the retina authentication preparation image 71A to the optical unit 200 waits until a predetermined time elapses after the output (S2202 / YES).
- the predetermined time is sufficient time for the user to read the displayed instruction sentence 71C and move his / her line of sight to the mark 71B, for example, several seconds.
- the image processing unit 202 acquires the retina authentication image 72 shown in FIG. 8 from the information storage unit 203 and outputs it to the optical unit 200 (S2203).
- the optical unit 200 presents the retinal authentication image 72 acquired from the image processing unit 202 to the user.
- the retina authentication image 72 is, for example, an image whose entire surface has the same color or a uniform intensity distribution on the retina, and is stored in advance in the information storage unit 203. Specifically, for example, as shown in FIG. 8, the entire surface is the same color image (indicated by dot hatching in FIG. 8). However, this is an example, and the retina authentication image 72 may be entirely white, or may be another color that is optimal for retina authentication.
- the image processing unit 202 that has output the retina authentication image to the optical unit 200 acquires a captured image of the user's retina 7 captured by the optical processing unit 3 (S2204).
- Since the retina authentication preparation image 71A has been output to the optical unit 200 and presented to the user, the user is intentionally looking at a specific part of the video presented by the optical unit 200. Therefore, the retina 7 and the retinal imaging unit in the optical processing unit 3 are in the relationship of image plane and object plane, and the captured image is an image of the retina.
- FIG. 9 is an example of a retinal captured image acquired by the image processing unit 202. As shown in FIG. 9, a blood vessel pattern 73A on the retina is captured as a retinal captured image. The optical unit 200 outputs such a retinal captured image to the image processing unit 202.
- FIG. 10 is another example of the retinal captured image acquired by the image processing unit 202.
- The blood vessel pattern on the retina differs from individual to individual, and the captured retinal image of another individual shows, for example, a blood vessel pattern 73B different from the blood vessel pattern 73A, as shown in FIG. 10.
- the image processing unit 202 determines whether or not the characteristics of the blood vessel pattern 73A of the acquired retinal captured image match the characteristics of the blood vessel pattern of the normal user stored in advance in the information storage unit 203 (S2205).
- When the characteristics match (S2205 / YES), the retina authentication determination unit (not shown) in the image processing unit 202 determines that the user is a regular user and that authentication has succeeded, and outputs the determination result to the control unit 201 (S2206).
- the image processing unit 202 acquires an authentication success presentation image for presenting the authentication success from the information storage unit 203 and outputs the authentication success presentation image to the optical unit 200.
- the optical unit 200 presents the authentication success presentation image acquired from the image processing unit 202 to the user.
- The image processing unit 202 that has output the authentication success determination result to the control unit 201 then, in accordance with a command from the control unit 201 that received the determination result, acquires browsing-limited information, which is information that becomes viewable upon successful authentication, from the information storage unit 203 and outputs it to the optical unit 200 (S2207).
- the optical unit 200 presents the browsing-limited information acquired from the image processing unit 202 to the user.
- In the above, the case where the image processing unit 202 acquires the browsing-limited information from the information storage unit 203 has been described as an example. However, the image processing unit 202 may instead acquire the browsing-limited information from the information processing device 212 illustrated in FIG.
- In that case, the communication processing unit 204 acquires the browsing-limited information from the information processing device 212 in accordance with a command from the control unit 201 that received the authentication success determination result, and outputs it to the control unit 201. The control unit 201 outputs the browsing-limited information acquired from the communication processing unit 204 to the image processing unit 202.
- On the other hand, when the characteristics do not match (S2205 / NO), the image processing unit 202 determines whether retinal authentication has been performed a predetermined number of times (S2208). When the number of authentications is less than the predetermined number (S2208 / NO), the image processing unit 202 outputs the retina authentication preparation image 71A to the optical unit 200 again (S2201) and performs authentication again.
- The captured retinal image may differ depending on the user's line of sight at the time of imaging, and depending on the acquired image, it may be determined that the captured blood vessel pattern does not match the characteristics of the blood vessel pattern stored in the information storage unit 203 even though the user is a regular user. By re-authenticating while the number of authentications is less than the predetermined number, the opportunities for retinal authentication to succeed for a regular user can be increased.
- When the number of authentications has reached the predetermined number (S2208 / YES), the retina authentication determination unit of the image processing unit 202 determines that the user is not a regular user and that authentication has failed, and outputs the determination result to the control unit 201 (S2209). Further, the image processing unit 202 acquires an authentication failure presentation image for presenting the authentication failure from the information storage unit 203 and outputs it to the optical unit 200. The optical unit 200 presents the authentication failure presentation image acquired from the image processing unit 202 to the user.
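- The retry loop of steps S2201 to S2209 described above can be sketched as follows. This is a minimal illustration only: the callback interface, the wait time, and the value of the "predetermined number" of attempts are assumptions, not part of the embodiment.

```python
import time

MAX_ATTEMPTS = 3        # assumed value of the "predetermined number" (S2208)
PREP_WAIT_SECONDS = 0   # in practice a few seconds, enough to read 71C and fixate on 71B

def authenticate_retina(present, capture, match):
    """One retinal-authentication session (S2201-S2209).

    present(name) -- show a stored image (71A, 72, result images) to the user
    capture()     -- return the retinal image captured by the optical unit
    match(img)    -- True if the vessel-pattern features match the regular user
    """
    for _ in range(MAX_ATTEMPTS):
        present("prep_image_71A")      # S2201: guide the user's line of sight
        time.sleep(PREP_WAIT_SECONDS)  # S2202: wait for the gaze to settle
        present("auth_image_72")       # S2203: uniform-intensity image
        if match(capture()):           # S2204-S2205: capture and compare
            present("success_image")   # S2206: authentication succeeded
            return True
    present("failure_image")           # S2209: predetermined count exhausted
    return False
```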
- Note that it is desirable that the RGB values of, for example, 80% or more of the pixels of the retina authentication preparation image 71A be substantially the same as the RGB values of the corresponding pixels of the retina authentication image 72.
- the burden on the user's eyes when switching between the retina authentication preparation image 71A and the retina authentication image 72 can be reduced, and at the same time, light / dark adaptation can be suppressed.
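- The similarity condition above, that for example 80% or more of the pixels keep substantially the same RGB values across the two images, can be checked as in the following sketch. The per-channel tolerance and the plain-list image representation are assumptions for illustration.

```python
def images_are_compatible(img_a, img_b, tol=8, min_fraction=0.8):
    """Return True if at least `min_fraction` of pixels have substantially
    the same RGB values in both images (within `tol` per channel).
    Images are same-shaped lists of rows of (R, G, B) tuples."""
    total = same = 0
    for row_a, row_b in zip(img_a, img_b):
        for (r1, g1, b1), (r2, g2, b2) in zip(row_a, row_b):
            total += 1
            if abs(r1 - r2) <= tol and abs(g1 - g2) <= tol and abs(b1 - b2) <= tol:
                same += 1
    return same / total >= min_fraction
```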
- In the above, the case where the retinal authentication determination unit included in the image processing unit 202 performs the retinal authentication process based on the characteristics of the user's blood vessel pattern acquired from the information storage unit 203 has been described. However, this is only an example, and the video display device 10 may perform the retina authentication process by acquiring the characteristics of the user's blood vessel pattern stored in the information processing device 212.
- Alternatively, the video display device 10 may transmit the characteristics of the captured blood vessel pattern of the user to the information processing device 212, and the information processing device 212 may perform the retinal authentication process by comparing them with the stored characteristics of the blood vessel pattern of the user and transmit the authentication result to the video display device 10.
- the video display device 10 can realize other functions using the captured image of the retina in addition to the retina authentication processing by appropriately capturing the retina.
- For example, a video viewing determination unit (not shown) in the image processing unit 202 determines, using the retinal image captured by the optical unit 200, whether the user is viewing the video presented by the optical unit 200, and the brightness of the video presented by the optical unit 200 can be controlled according to the determination result that the control unit 201 receives from the image processing unit 202.
- FIG. 11 is a diagram illustrating the relationship between the sharpness of the retina captured image and the brightness of the display image.
- When the user is viewing the video, the integrated liquid crystal display element / imaging element 31 and the retina 7 are in the relationship of image plane and object plane. Therefore, as shown in FIG. 11, the blood vessel pattern of the retinal image captured by the optical unit 200 is the clear blood vessel pattern 73A.
- On the other hand, when the user is not viewing the video, the integrated liquid crystal display element / imaging element 31 and the retina 7 are not in the relationship of image plane and object plane, so the blood vessel pattern of the retinal image captured by the optical unit 200 is the blurred blood vessel pattern 74 (indicated by a dotted line in FIG. 11).
- The image processing unit 202 quantifies the sharpness of the blood vessel pattern of the retinal image captured by the optical unit 200; if the blood vessel pattern is clearer than a certain threshold, the video viewing determination unit determines that the user is viewing the video and outputs the determination result to the control unit 201.
- Otherwise, the video viewing determination unit of the image processing unit 202 determines that the user is not viewing the video and outputs the determination result to the control unit 201.
- The control unit 201 issues a command to the optical unit 200 to present the bright video 75 when the user is viewing the video, and the dark video 76 when the user is not viewing the video.
- the optical unit 200 changes the brightness of the video presented by the optical unit 200 in accordance with the command received from the control unit 201. Thereby, when the user is not watching the video, it is possible to reduce the disturbance of the user's vision due to the video.
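- As one hypothetical way to quantify the sharpness of the blood vessel pattern and derive the binary brightness command, a Laplacian-based focus measure can be used. The concrete metric, threshold, and brightness values are assumptions, since the embodiment does not specify them.

```python
def vessel_sharpness(img):
    """Mean absolute Laplacian response of a 2-D grayscale image
    (a common focus measure; sharper vessel edges give larger values)."""
    h, w = len(img), len(img[0])
    acc = n = 0
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            lap = (img[y - 1][x] + img[y + 1][x]
                   + img[y][x - 1] + img[y][x + 1] - 4 * img[y][x])
            acc += abs(lap)
            n += 1
    return acc / n

def choose_brightness(sharpness, threshold, bright=1.0, dark=0.2):
    """Binary control: bright video 75 while viewing, dark video 76 otherwise."""
    return bright if sharpness > threshold else dark
```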
- In the above, the case where the video viewing determination unit determines whether the user is viewing the video based on the clearness of the blood vessel pattern of the retinal image captured by the optical unit 200 has been described as an example.
- the video viewing determination unit can also determine whether the user is watching the video based on the movement of the blood vessel pattern.
- FIG. 12 is a diagram illustrating another relationship between the amount of movement of the retina captured image and the brightness of the display image.
- When the user is looking at the background above, below, left, or right of the video and is not looking at the video, the blood vessel pattern of the retinal image captured by the optical unit 200 becomes a blood vessel pattern 77 that is shifted vertically or horizontally.
- The video viewing determination unit first acquires, from the information storage unit 203, the blood vessel pattern (or its features) observed when the user is viewing the video, which is stored in advance as a reference blood vessel pattern. When the blood vessel pattern of the retinal image captured by the optical unit 200, or its features, has not moved beyond a certain threshold compared with the acquired reference blood vessel pattern or its features, the video viewing determination unit determines that the user is viewing the video and outputs the determination result to the control unit 201.
- On the other hand, when the blood vessel pattern or its features has moved beyond the threshold, the video viewing determination unit determines that the user is not viewing the video and outputs the determination result to the control unit 201. In this way, the video viewing determination unit can determine whether the user is viewing the video based on the movement amount of the blood vessel pattern.
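- The movement-based determination can be sketched as follows, using the centroid of the vessel pixels as an assumed stand-in for "the blood vessel pattern or its features".

```python
def pattern_centroid(img):
    """Centroid (x, y) of the vessel pixels in a binary 2-D image."""
    xs = ys = n = 0
    for y, row in enumerate(img):
        for x, v in enumerate(row):
            if v:
                xs += x
                ys += y
                n += 1
    return (xs / n, ys / n)

def is_viewing(reference_img, captured_img, threshold=2.0):
    """The user is judged to be watching the video when the vessel pattern
    has not moved beyond `threshold` pixels from the reference pattern."""
    rx, ry = pattern_centroid(reference_img)
    cx, cy = pattern_centroid(captured_img)
    return ((cx - rx) ** 2 + (cy - ry) ** 2) ** 0.5 <= threshold
```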
- The optical unit 200 may change the brightness of the displayed video not only in a binary manner but also in multiple steps according to the sharpness of the retina captured image or the movement amount of the blood vessel pattern and its features.
- Further, the video display device 10 can obtain the user's gaze point on the video based on the movement amount of the blood vessel pattern of the retinal image captured by the optical unit 200, and can thereby also serve as a pointing device.
- For example, the image processing unit 202 calculates the movement amount of a mouse pointer from the movement amount of the blood vessel pattern of the captured retinal image or its features, and moves the mouse pointer on the video. Further, when a predetermined condition is detected, the image processing unit 202 determines that the user has clicked the mouse button. Thereby, the role of a pointing device can be implemented without adding a new element.
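- The pointing-device role can be sketched as below, under two explicit assumptions: the pointer moves proportionally to the vessel-pattern movement, and a click is triggered by a dwell (the actual click condition is not specified in this description).

```python
def update_pointer(pointer, prev_centroid, curr_centroid, gain=10.0):
    """Move the mouse pointer proportionally to the vessel-pattern movement."""
    dx = (curr_centroid[0] - prev_centroid[0]) * gain
    dy = (curr_centroid[1] - prev_centroid[1]) * gain
    return (pointer[0] + dx, pointer[1] + dy)

def dwell_click(history, radius=1.5, frames=30):
    """Hypothetical click trigger: report a click when the last `frames`
    pointer positions all stay within `radius` of their mean (a dwell)."""
    if len(history) < frames:
        return False
    recent = history[-frames:]
    mx = sum(p[0] for p in recent) / frames
    my = sum(p[1] for p in recent) / frames
    return all(((x - mx) ** 2 + (y - my) ** 2) ** 0.5 <= radius
               for x, y in recent)
```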
- Further, the video display device 10 can also realize an automatic focus adjustment function for when the user views the video, based on the clearness of the blood vessel pattern of the retinal image captured by the optical unit 200.
- In that case, part or all of at least one of the optical processing unit 3 and the projection optical unit 4 includes a movable unit.
- the image processing unit 202 quantifies the sharpness of the blood vessel pattern in the retinal image captured by the optical unit 200 and outputs the result to the control unit 201.
- Based on the quantified sharpness, the control unit 201 instructs the optical unit 200 to drive the movable unit. Thereby, an automatic focus adjustment function can be realized.
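- The automatic focus adjustment can be sketched as a search over candidate positions of the movable unit for the position that maximizes the quantified sharpness; the exhaustive scan strategy is an assumption.

```python
def autofocus(measure_sharpness, move, positions):
    """Drive the movable unit through `positions` and settle at the one
    that yields the sharpest vessel pattern."""
    best_pos, best_sharp = None, float("-inf")
    for p in positions:
        move(p)                  # command the movable unit (via control unit 201)
        s = measure_sharpness()  # quantified by the image processing unit 202
        if s > best_sharp:
            best_pos, best_sharp = p, s
    move(best_pos)               # return to the best focus position
    return best_pos
```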
- Next, a second embodiment of the present invention will be described. The second embodiment differs from the first embodiment in that the light source unit 1 includes a light source that emits visible light and a light source that emits infrared light, and the retina is imaged by detecting the infrared light.
- FIG. 13 is a diagram illustrating the configuration of the light source unit 1 according to the second embodiment of the invention.
- the light source unit 1 according to the second embodiment of the present invention includes a light source 1A, a light source 1B, a light source 1C, and a light source 1D.
- The light sources 1A to 1D each emit one of red light, green light, blue light, and infrared light. Any light source may emit any of these lights, as long as the four kinds of light are each emitted by one of the four light sources.
- An example of a schematic diagram of the optical processing unit 3 of the present embodiment is shown in FIG. 5, the same as in the first embodiment of the present invention.
- the photodiode 3C is configured to receive infrared light traveling in the direction of the photodiode 3C from the polarizing plate 3F and not to receive light traveling in the direction of the photodiode 3C from the polarizing plate 3A.
- the light source in the optical unit 200 lights up as follows when the refresh rate is set to 60 Hz, for example. During 1/60 seconds, the red light source, the green light source, the blue light source, and the infrared light source are turned on in order and exclusively. The light from the red light source, the green light source, and the blue light source is used to present an image to the user, and the light from the infrared light source is used to image the retina.
- In the above description, the red light source, the green light source, the blue light source, and the infrared light source are turned on in order; however, when the retina is not being imaged, the infrared light source need not be turned on, and only the red, green, and blue light sources may be turned on in order. Thereby, the brightness of the video can be improved.
- Conversely, when the retina is being imaged, the red, green, and blue light sources need not be turned on, and the infrared light source may be lit constantly. Thereby, the time-averaged intensity of the infrared light can be increased. Alternatively, the infrared light source may be kept lit constantly while the red, green, and blue light sources are turned on in order. Thereby, the brightness of the video can be improved.
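- The exclusive lighting schedule at the 60 Hz refresh rate, with the infrared slot skipped when the retina is not being imaged, can be sketched as follows; the equal division of the frame time among the lit sources is an assumption.

```python
FRAME_SECONDS = 1 / 60  # one frame at the 60 Hz refresh rate

def frame_schedule(imaging_retina):
    """Return (source, duration) slots for one frame.

    When imaging, red, green, blue, and infrared are lit in order and
    exclusively; when not imaging, the infrared slot is skipped and its
    time goes to the visible sources, improving video brightness."""
    sources = ["red", "green", "blue"] + (["infrared"] if imaging_retina else [])
    slot = FRAME_SECONDS / len(sources)
    return [(s, slot) for s in sources]
```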
- Next, the function of the optical unit 200 for imaging the retina will be described, taking as an example the case of performing retina authentication with a retina captured image.
- a part of the infrared light reaching the user's retina 7 from the light source unit 1 is scattered by the retina 7.
- Part of the scattered light passes through the crystalline lens 8 and the projection optical unit 4 and enters the photodiode 3C in the integrated liquid crystal display element / imaging element 31. Since the light incident on the photodiode 3C is infrared light traveling in the direction from the polarizing plate 3F to the photodiode 3C, the photodiode 3C detects this infrared light.
- When the optical unit 200 presents the retina authentication preparation image 71A shown in FIG. 8, the optical unit 200 does not image the retina, so only the visible light sources are turned on and the infrared light source need not be turned on.
- When the optical unit 200 images the retina 7 of the user, the infrared light source is turned on. At that time, the red light source, the green light source, and the blue light source need not be turned on.
- the retinal imaging unit in the optical processing unit 3 can acquire a captured image of the retina by receiving the return light from the retina of the infrared light emitted from the infrared light source.
- When the optical unit 200 images the retina 7 of the user, not only the infrared light source but also the visible light sources may be turned on. Since the light incident on the optical processing unit 3 from the infrared light source undergoes intensity modulation by the optical processing unit 3 in the same manner as the light incident from the other, visible light sources, the image processing unit 202 may perform a process for correcting this intensity modulation on the captured retinal image, based on the information of the displayed video.
- Also in this embodiment, as in the first embodiment, the video display device 10 can realize other functions using the retina captured image besides retina authentication.
- Embodiment 3 of the present invention will be described below. In the third embodiment, an optical processing unit 32 having a configuration different from that of the first and second embodiments is used. According to this embodiment, the choices of elements can be expanded.
- FIG. 14 is a diagram illustrating the configuration of the optical processing unit 32 according to the third embodiment of the present invention.
- The optical processing unit 32 according to the third embodiment of the present invention includes a polarizing plate 32A, a polarization beam splitter 32B, a liquid crystal display element 32C, a wave plate 32D, and an image sensor 32E.
- the light emitted from the light source unit 1 and passed through the illumination optical unit 2 enters the polarizing plate 32A. Since the polarizing plate 32A transmits light having polarized light in a specific direction, the transmitted light of the polarizing plate 32A has polarized light A in a specific direction. The light transmitted through the polarizing plate 32A having the polarization A enters the polarization beam splitter 32B.
- the reflection / transmission surface 32F of the polarization beam splitter reflects the light having the polarization A and transmits the light having the polarization B orthogonal to the polarization A. For this reason, the transmitted light of the polarizing plate 32A having the polarization A incident on the polarization beam splitter 32B is reflected by the reflection / transmission surface 32F and enters the liquid crystal display element 32C.
- the liquid crystal display element 32C modulates the intensity of the light incident on the liquid crystal display element 32C based on a video signal presented by the optical unit 200.
- The light subjected to the intensity modulation is emitted from the liquid crystal display element 32C in the direction opposite to the incident direction, after its polarization is changed from the polarization A to the polarization B. That is, the liquid crystal display element 32C outputs the light of the image displayed by the optical unit 200.
- The light emitted from the liquid crystal display element 32C enters the polarization beam splitter 32B; since this light has the polarization B, it passes through the reflection / transmission surface 32F of the polarization beam splitter 32B and enters the wave plate 32D. The light transmitted through the wave plate 32D reaches the user's retina 7 through the projection optical unit 4 and the user's crystalline lens 8.
- Part of the light that reaches the user's retina 7 is scattered by the retina 7. Part of the scattered light enters the wave plate 32D through the user's crystalline lens 8 and the projection optical unit 4, and exits the wave plate 32D.
- the light emitted from the wave plate 32D is incident on the polarization beam splitter 32B.
- the wave plate 32D is configured such that the polarized light of the light emitted from the wave plate 32D and incident on the polarization beam splitter 32B becomes the polarization A.
- the light emitted from the wave plate 32D and incident on the polarization beam splitter 32B is reflected by the reflection / transmission surface 32F and incident on the image sensor 32E. That is, the wave plate 32D changes the polarization of the light from the liquid crystal display element 32C to the user's retina 7 and the light from the user's retina 7 to the image sensor 32E.
- the polarization beam splitter 32B separates the optical path from the liquid crystal display element 32C to the user's retina 7 and the optical path from the user's retina 7 to the imaging element 32E. With such a configuration, the optical unit 200 can obtain a captured image of the retina from light received by the image sensor 32E.
- According to this embodiment, an optimum element can be selected independently for each of the liquid crystal display element 32C and the imaging element 32E, so the choice of elements can be expanded.
- The light source unit 1 may be configured with a light source that emits visible light, as in the first embodiment, or may be configured with light sources that emit visible light and infrared light, as in the second embodiment. When the light source unit 1 is composed of a light source that emits visible light, the imaging element 32E receives visible light. When the light source unit 1 is composed of light sources that emit visible light and infrared light, the imaging element 32E receives infrared light.
- the fourth embodiment is different from the first to third embodiments in that an optical processing unit 33 including a fiber scanning element 33B is used. According to the present embodiment, the video display device 10 with a retinal imaging function using the fiber scanning element 33B can be realized.
- the light source unit 1 includes a light source 1A that emits red light, a light source 1B that emits green light, and a light source 1C that emits blue light.
- each of the light source 1A, the light source 1B, and the light source 1C is a laser light source, and emits a light beam having the polarization A.
- the illumination optical unit 2 according to the present embodiment combines the light emitted from the light source and adjusts the illuminance distribution of the light so that the combined light is input to the optical processing unit as efficiently as possible.
- FIG. 15 is a diagram illustrating the configuration of an optical processing unit 33 according to the fourth embodiment of the present invention.
- The optical processing unit 33 according to the fourth embodiment of the present invention includes a fiber polarization beam splitter 33A, a fiber scanning element 33B, a wave plate 33C, and a light intensity measuring element 33D.
- FIG. 16 is a diagram illustrating an example of a fiber polarization beam splitter 33A.
- The fiber polarization beam splitter 33A has three optical input / output terminals, a terminal 33F, a terminal 33G, and a terminal 33H. When light having the polarization A is input to the terminal 33F, or when light having the polarization B orthogonal to the polarization A is input to the terminal 33G, the light is emitted from the terminal 33H.
- the light emitted from the light source unit 1 enters the terminal 33F of the polarization beam splitter 33A. Since the light incident on the terminal 33F has the polarization A, it is emitted from the terminal 33H. The light emitted from the terminal 33H enters the fiber scanning element 33B.
- FIG. 17 is a diagram illustrating an example of the fiber scanning element 33B.
- The light emitted from the terminal 33H enters the terminal 33M shown in FIG. 17.
- the fiber scanning element 33B has a movable part (not shown), and scans the fiber terminal 33N in two axial directions substantially orthogonal to the traveling direction of light in the fiber, as indicated by an arrow 33L in FIG.
- the light incident on the fiber scanning element 33B is emitted from the terminal 33N.
- the optical unit 200 can generate an image at the position of the virtual screen 33Q by driving the light source unit 1 and the fiber scanning element 33B in synchronization.
- Next, driving the light source unit 1 and the fiber scanning element 33B in synchronization will be described, with the refresh rate set to 60 Hz, for example.
- The fiber scanning element 33B scans the fiber terminal 33N so that the emitted light 33P of the fiber scanning element 33B irradiates all the pixels of the image generated at the position of the virtual screen 33Q within 1/60 seconds.
- the optical unit 200 changes the intensity of light emitted by the light source 1A, the light source 1B, and the light source 1C according to the color of the pixel irradiated with the emitted light 33P in the image presented by the optical unit 200.
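- The synchronization of the light sources 1A to 1C with the fiber scan can be sketched as follows; a raster visiting order is assumed for illustration, since the actual trajectory of the scan along arrow 33L is not specified here.

```python
def scan_frame(image, set_rgb):
    """As terminal 33N visits each pixel position on the virtual screen 33Q,
    set the emitted intensities of light sources 1A (red), 1B (green), and
    1C (blue) to that pixel's colour. `image` is rows of (R, G, B) tuples."""
    for row in image:          # assumed raster order, row by row
        for (r, g, b) in row:
            set_rgb(r, g, b)   # modulate the three laser sources in sync
```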
- The light emitted from the fiber scanning element 33B passes through the wave plate 33C. The wave plate 33C changes the polarization of the light; details of its function will be described later.
- The light transmitted through the wave plate 33C reaches the user's retina 7 through the projection optical unit 4 and the user's crystalline lens 8. At this time, the user perceives the virtual image, generated by the projection optical unit 4 from the image formed by the fiber scanning element 33B at the position of the virtual screen 33Q, as existing in front of the eyes.
- The optical unit 200 can acquire an image of the retina as follows. Part of the light reaching the user's retina 7 is scattered by the retina 7. Part of the scattered light enters the wave plate 33C through the user's crystalline lens 8 and the projection optical unit 4, and exits the wave plate 33C.
- The wave plate 33C is configured such that the polarization of the light that has passed through the user's crystalline lens 8 and exits the wave plate 33C becomes the polarization B. The light emitted from the wave plate 33C reaches the position of the virtual screen 33Q.
- At this time, the user's retina 7 and the virtual screen 33Q are in the relationship of object plane and image plane, and an image of the retina 7 is formed on the virtual screen 33Q.
- the light that has reached the position of the virtual screen 33Q is incident on the terminal 33N of the fiber scanning element 33B shown in FIG.
- The light incident on the fiber scanning element 33B from the terminal 33N exits from the terminal 33M and enters the terminal 33H of the fiber polarization beam splitter 33A shown in FIG. 16. Since this light has the polarization B, it is emitted from the terminal 33G and enters the light intensity measuring element 33D.
- the optical unit 200 can acquire an image of the retina by forming an image from the light intensity measured by the light intensity measuring element 33D in synchronization with the scanning of the terminal 33N.
- As described above, the fiber scanning element 33B outputs the image light, the wave plate 33C changes the polarization of the light traveling from the fiber scanning element 33B to the user's retina 7 and of the light traveling from the user's retina 7 to the light intensity measuring element 33D, and the fiber polarization beam splitter 33A separates the optical path from the fiber scanning element 33B to the user's retina 7 from the optical path from the user's retina 7 to the light intensity measuring element 33D.
- a video display device with a retinal imaging function using the fiber scanning element 33B can be realized.
- Next, a fifth embodiment of the present invention will be described.
- the fifth embodiment is different from the first to fourth embodiments in that an iris is imaged.
- The fifth embodiment of the present invention will be described with reference to FIGS. 18 and 19. The same components as those in the above embodiments are denoted by the same reference symbols, and redundant description is omitted.
- FIG. 18 is a diagram illustrating the configuration of the optical unit 200 of the video display apparatus 10 according to the fifth embodiment of the invention.
- The optical unit 200 according to this embodiment includes a light source unit 1, an illumination optical unit 2, an optical processing unit 34 that combines an image generation unit and an iris imaging unit, and a projection optical unit 4.
- At least one of the optical processing unit 34 and the projection optical unit 4 includes an adjustment unit that is a mechanism for adjusting the projection relationship so that an iris image is formed by the iris imaging unit.
- the optical unit 200 acquires a captured image of the user's iris 101.
- the light source unit 1 may be composed of a light source that emits visible light, as in the first embodiment. It may also be composed of a light source that emits infrared light.
- the projection optical unit 4 includes a lens 41A and a planar beam splitter 41E having a reflectance of 20% or more.
- FIG. 19 is a configuration diagram of the optical processing unit 34 according to the fifth embodiment of the present invention.
- the optical processing unit 34 includes a polarizing plate 32A, a polarizing beam splitter 32B, a liquid crystal display element 32C, a wave plate 34D, an imaging element 32E, and a lens 34G.
- the light emitted from the optical processing unit 34 reaches the user's eyeball 6 via the projection optical unit 4. Part of the light reaching the user's eyeball 6 passes through the user's crystalline lens 8, and another part of the light reaching the user's eyeball 6 reaches the user's iris 101.
- the light that reaches the user's crystalline lens 8 passes through the user's crystalline lens 8 and reaches the user's retina 7.
- the user perceives the light reaching the user's retina 7.
- a part of the light reaching the user's iris 101 is scattered by the user's iris 101 and enters the wave plate 34D of the optical processing unit 34 through the projection optical unit 4.
- the light emitted from the wave plate 34D enters the polarization beam splitter 32B.
- the wave plate 34D is configured such that the polarization of the light it emits is polarization A.
- the light that exits the wave plate 34D and enters the polarization beam splitter 32B is reflected by the reflection/transmission surface 32F.
- the light reflected by the reflection / transmission surface 32F passes through the lens 34G and enters the image sensor 32E.
- the lens 34G serves as an adjustment unit that adjusts the projection relationship so that the user's iris 101 and the image sensor 32E lie on the object plane and the image plane, respectively.
- the image sensor 32E can therefore capture a clear image of the user's iris 101.
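The object-plane/image-plane condition enforced by the lens 34G can be illustrated with the thin-lens equation 1/f = 1/d_o + 1/d_i, where d_o is the distance to the iris and d_i the distance to the sensor. The numeric values below are illustrative assumptions, not values from the patent.

```python
# Sketch: thin-lens check that an object plane (the iris) and an image
# plane (the image sensor) are conjugate: 1/f = 1/d_o + 1/d_i.
# All numeric values are illustrative, not taken from the patent.

def image_distance(f, d_o):
    """Distance at which a thin lens of focal length f images an object at d_o."""
    return 1.0 / (1.0 / f - 1.0 / d_o)

def is_conjugate(f, d_o, d_i, tol=1e-9):
    """True if a sensor at d_i lies on the image plane for an object at d_o."""
    return abs(image_distance(f, d_o) - d_i) < tol

# e.g. iris 60 mm from the lens, focal length 20 mm -> image plane at 30 mm
d_i = image_distance(f=20.0, d_o=60.0)
```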
- FIG. 25 is a diagram illustrating the configuration of the projection optical unit 4 according to the fifth embodiment of the invention.
- the projection optical unit 4 may include a lens 41A and a curved beam splitter 41F having a reflectance of 20% or more.
- FIG. 26 shows a reflection surface of the curved beam splitter 41F.
- the light that has reached the central portion 41G of the curved beam splitter 41F from the optical processing unit 34 reaches the user's crystalline lens 8.
- the light that has reached the outer peripheral portion 41H of the curved beam splitter 41F from the optical processing unit 34 reaches the user's iris 101.
- the outer peripheral portion 41H of the curved beam splitter 41F has a non-zero curvature so as to focus the light arriving there from the optical processing unit 34 onto the user's iris 101. This improves the illumination efficiency of the user's iris 101.
- the central portion 41G of the reflecting surface of the curved beam splitter 41F plays a role of reflecting the image generated by the optical processing unit 34.
- An iris authentication determination unit (not shown) in the image processing unit 202 determines whether or not the user is a regular user.
- the image processing unit 202 acquires the iris image captured by the optical unit 200 and determines whether the features of its iris pattern match the features of the regular user's iris pattern acquired from the information storage unit 203.
- If they match, the iris authentication determination unit determines that the user is a regular user and that authentication has succeeded. If they do not match, it determines that the user is not a regular user and that authentication has failed. As in the first embodiment, iris authentication may be attempted multiple times.
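The match-then-retry logic above can be sketched as follows. Representing iris features as binary codes compared by normalized Hamming distance is a common approach in iris recognition; it is used here purely as an illustration, since the patent does not specify the feature representation or the threshold.

```python
# Sketch: iris authentication with a bounded number of attempts.
# Features are modeled as bit strings compared by normalized Hamming
# distance; the representation and threshold are illustrative assumptions.

def hamming(a, b):
    """Fraction of differing bits between two equal-length bit strings."""
    assert len(a) == len(b)
    return sum(x != y for x, y in zip(a, b)) / len(a)

def authenticate(captured_codes, enrolled_code, threshold=0.32):
    """Return True if any captured code matches the enrolled code."""
    for code in captured_codes:  # one code per imaging attempt
        if hamming(code, enrolled_code) <= threshold:
            return True  # regular user: authentication succeeded
    return False  # authentication failed after all attempts

enrolled = "1011001110100101"
ok = authenticate(["1011001110100111", "0000000000000000"], enrolled)
```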
- FIG. 20 is a diagram illustrating an iris authentication preparation image. As shown in FIG. 20, using a plurality of iris authentication preparation images 171A and 172A provided with marks 171B and 172B that serve as references for the user's eye position at different positions, iris authentication is performed for each iris authentication preparation image. May be performed. Thereby, the precision of iris authentication can be improved.
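The per-position results described above must be combined into one decision. Requiring, for example, a strict majority of the per-image authentications to succeed is one possible rule; the combination rule is an assumption for illustration, as the patent only states that accuracy can be improved.

```python
# Sketch: combine per-preparation-image authentication results.
# Requiring a strict majority of positions to pass is an illustrative
# choice; the patent does not fix the combination rule.

def combined_auth(per_image_results):
    """per_image_results: one boolean per iris authentication preparation
    image (each image carries its own eye-position mark)."""
    passed = sum(1 for r in per_image_results if r)
    return passed * 2 > len(per_image_results)  # strict majority

decision = combined_auth([True, True, False])
```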
- In the above description, the case where the optical unit 200 includes the lens 34G as the means of adjusting the projection relationship so that an iris image is formed by the iris imaging unit in the optical processing unit 34 has been described as an example.
- Alternatively, part or all of at least one of the optical processing unit 34 and the projection optical unit 4 may include a movable unit.
- For example, the projection optical unit 4 includes a movable unit that moves the convex lens 41A of the projection optical unit 4 shown in FIG. 6A, and the optical unit 200 operates the movable unit to adjust the projection relationship between the iris 101 and the image sensor 32E. That is, the convex lens 41A of the projection optical unit 4 functions as an adjustment unit that adjusts the projection relationship so that the user's iris 101 and the image sensor 32E lie on the object plane and the image plane, respectively.
- the image display device 10 with an iris imaging function provided with a movable part operates the movable part as follows.
- First, the image processing unit 202 acquires from the information storage unit 203 a predetermined position of the movable unit at which the video generation unit in the optical processing unit 34 and the user's retina 7 lie on the object plane and the image plane, and instructs the optical unit 200 to move the movable unit to the acquired position.
- the optical unit 200 operates the movable unit according to the command.
- When imaging the user's iris 101, the image processing unit 202 likewise issues a command, and the optical unit 200 operates the movable unit according to this command. The optical unit 200 can thereby present an image to the user and clearly image the user's iris 101.
- the image processing unit 202 instructs the optical unit 200 to move the movable unit to the acquired position based on the predetermined position of the movable unit acquired from the information storage unit 203.
- the present invention is not limited to this.
- Alternatively, the image processing unit 202 may quantify the sharpness of the iris pattern in the captured image acquired from the optical unit 200 and instruct the optical unit 200 to move the movable unit in the direction that increases the sharpness.
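The sharpness-driven adjustment just described can be sketched as follows: score each candidate position of the movable unit with a sharpness metric of the captured image and keep the best one. The metric (variance of adjacent-pixel differences, a common focus proxy) and the toy capture model are illustrative assumptions, not the patent's method.

```python
# Sketch: move a movable unit to the position that maximizes a sharpness
# score of the captured iris pattern. The metric (gradient variance) and
# the capture model are illustrative assumptions.

def sharpness(image):
    """Variance of horizontal pixel differences: higher means sharper."""
    diffs = [row[i + 1] - row[i] for row in image for i in range(len(row) - 1)]
    mean = sum(diffs) / len(diffs)
    return sum((d - mean) ** 2 for d in diffs) / len(diffs)

def focus(capture, positions):
    """Try each movable-unit position and keep the sharpest one."""
    return max(positions, key=lambda p: sharpness(capture(p)))

# Toy capture: position 2 yields the highest-contrast (sharpest) image.
def capture(pos):
    contrast = 10 - abs(pos - 2) * 4  # peak contrast at pos == 2
    return [[(i % 2) * contrast for i in range(8)] for _ in range(4)]

best = focus(capture, positions=[0, 1, 2, 3, 4])
```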
- Other movable units may likewise be operated so that the image generation unit in the optical processing unit 34 and the user's retina 7 lie on the object plane and the image plane.
- As described above, the optical processing unit 3 displays an image using the light output from the light source unit 1, and the light of the displayed image reflected by the user's eyes is used to capture the authentication target for biometric authentication processing using the eyes. Since the displayed image and the captured image share an optical path, and the light source for image display and the light source for iris imaging can be shared as the light source unit 1 of the optical unit 200, an image display apparatus having an imaging function for capturing an authentication target for eye-based biometric authentication can be made smaller than before.
- The projection optical unit 4 includes an element that changes the traveling direction of light so that light entering the projection optical unit 4 from the optical processing unit 3 enters the user's eyeball. A wearable video display device can thereby be provided.
- Note that an element that deflects the traveling direction of light is not essential; the projection optical unit 4 may instead pass the light entering from the optical processing unit 3 straight into the user's eyeball.
- In the above description, the configuration in which the optical processing unit 3 or the projection optical unit 4 includes an adjustment unit, such as the lens 34G or the convex lens 41A, so that the user's iris 101 can be imaged clearly has been described as an example.
- However, this configuration is not essential, and the user's iris 101 can be imaged even without an adjustment unit.
Abstract
In the present invention, an image display device having an imaging function that captures an image for performing biometric authentication processing using the eyes is made more compact than conventional devices. The invention is characterized by comprising a light source unit that outputs light and an optical processing unit that performs an image display process of displaying an image using the light output from the light source unit and an imaging process of capturing an authentication target for eye-based biometric authentication, the optical processing unit capturing the authentication target by receiving the light produced when the light of the displayed image is reflected by the user's eye.
Description
The present invention relates to an image display device and an optical device.
JP 2008-241822 A (Patent Document 1) exists as background art in this technical field. This publication states: "By using the visible-light-reflecting, infrared-light-transmitting mirror 2-a or the visible-light-transmitting, infrared-light-reflecting mirror 2-b, a means is provided for assembling the two functions of image display and iris imaging on a single optical system while minimizing the loss of light."
Conventionally, image display devices such as head mounted displays (hereinafter abbreviated as HMD) that a user wears on his or her head are known. By visually recognizing images such as video and still images displayed by the HMD, the user can acquire various types of information.
An HMD may hold or display not only non-confidential information disclosed to the public but also confidential information disclosed only to specific individuals. It is therefore desirable that the HMD have a function of identifying the user and present images only to authorized users who have permission to use it.
To solve this problem, for example, Patent Document 1 discloses an image display device that includes a visible light source, an infrared light source, and an image sensor, and that has an image display function of displaying an image with the visible light source and, for performing biometric authentication processing using the eyes, an imaging function of capturing with the image sensor the light returning from the retina or iris out of the infrared light emitted from the infrared light source.
However, in the device disclosed in Patent Document 1, the infrared light source for capturing a retinal image or the like is arranged separately from the optical system for image display, so the optical system becomes bulky.
The present invention has been made to solve such a problem, and aims to make an image display device having an imaging function for capturing an authentication target for eye-based biometric authentication smaller than before.
To solve the above problem, the present invention includes an optical processing unit that performs an image display process of displaying an image using the light output from a light source unit and an imaging process of capturing an authentication target for biometric authentication processing using the eyes, wherein the optical processing unit captures the authentication target by receiving the reflected light produced when the light of the displayed image is reflected by the user's eyes.
According to the present invention, an image display device having an imaging function for capturing an authentication target for eye-based biometric authentication can be made smaller than before.
Hereinafter, embodiments of the present invention will be described in detail with reference to the drawings. The following description is intended to illustrate one embodiment of the present invention and does not limit its scope. Accordingly, those skilled in the art can adopt embodiments in which any or all of these elements are replaced with equivalents, and such embodiments are also included within the scope of the present invention.
[Example 1]
Example 1 of the present invention will be described below. FIG. 23 is a block diagram illustrating the functional configuration of a video display device 10 according to an embodiment of the present invention. The video display device 10 includes an optical unit 200, a control unit 201, an image processing unit 202, an information storage unit 203, a communication processing unit 204, and a communication input/output unit 205. In the following, the video display device 10 is described as an example, but it may be an image display device 10 that displays not only video, i.e. moving images, but also still images.
The control unit 201 controls each unit included in the video display device 10 and issues commands to each of them. The information storage unit 203 stores the video that the video display device 10 presents to the user, as well as images necessary for eye-based biometric authentication, such as retina images or iris images. Hereinafter, "biometric authentication using the eyes, such as with a retina image or iris image" is simply called "biometric authentication".
The image processing unit 202 acquires video, images necessary for biometric authentication, and the like from the information storage unit 203 in accordance with commands from the control unit 201, and outputs them to the optical unit 200. The image processing unit 202 also acquires captured images generated by the optical unit 200, and a determination unit (not shown) within the image processing unit 202 makes determinations based on the acquired captured images. The image processing unit 202 outputs the determination result to the control unit 201.
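The flow just described (fetch content from storage, output it to the optics, judge the captured image, report the result) can be sketched as a minimal control loop. All class and method names below are hypothetical illustrations, not identifiers from the patent.

```python
# Sketch of the image processing unit's role: fetch content from storage,
# hand it to the optical unit, judge the optical unit's captured image,
# and return the result. All names here are hypothetical illustrations.

class ImageProcessingUnit:
    def __init__(self, storage, optical_unit):
        self.storage = storage            # stands in for information storage unit 203
        self.optical_unit = optical_unit  # stands in for optical unit 200

    def present(self, key):
        """Fetch an image/video by key and output it to the optical unit."""
        self.optical_unit.display(self.storage[key])

    def judge(self, matcher):
        """Judge the captured image; the return value stands in for the
        determination result reported to the control unit 201."""
        captured = self.optical_unit.capture()
        return matcher(captured)

class FakeOptics:
    def __init__(self):
        self.shown = None
    def display(self, content):
        self.shown = content
    def capture(self):
        return "iris-pattern"

optics = FakeOptics()
ipu = ImageProcessingUnit({"greeting": "hello-image"}, optics)
ipu.present("greeting")
result = ipu.judge(lambda img: img == "iris-pattern")
```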
The optical unit 200 presents the video or image received from the image processing unit 202 to the user in accordance with commands from the control unit 201. The optical unit 200 also performs imaging in accordance with commands from the control unit 201.
FIG. 24 is a diagram illustrating an operational form of the video display system according to an embodiment of the present invention. As shown in FIG. 24, the video display device 10 can communicate with an information processing device 212 separate from the video display device 10. Communication with the information processing device 212 is performed by the communication processing unit 204 in accordance with commands from the control unit 201, via the communication input/output unit 205 and the Internet 211.
The optical unit 200, control unit 201, image processing unit 202, information storage unit 203, and communication processing unit 204 described above are realized in the video display device 10 by a combination of software and hardware. The hardware configuration of the video display device 10 that realizes each of these components is described below with reference to FIG. 27.
FIG. 27 is a block diagram illustrating the hardware configuration of the video display device 10 according to this embodiment. As shown in FIG. 27, the video display device 10 has a configuration similar to that of a general server or PC (Personal Computer). That is, in the video display device 10 according to this embodiment, a CPU (Central Processing Unit) 11, RAM (Random Access Memory) 20, ROM (Read Only Memory) 30, HDD (Hard Disk Drive) 40, and I/F 50 are connected via a bus 80. A display unit 60 and an operation unit 70 are connected to the I/F 50.
The CPU 11 is a computing means that controls the overall operation of the video display device 10. The RAM 20 is a volatile storage medium capable of high-speed reading and writing of information, and is used as a work area when the CPU 11 processes information. The ROM 30 is a read-only non-volatile storage medium that stores programs such as firmware. The HDD 40 is a non-volatile storage medium capable of reading and writing information, and stores the OS (Operating System), various control programs, application programs, and the like. A semiconductor storage device such as an SSD (Solid State Drive) may be used instead of the HDD.
The I/F 50 connects and controls the bus 80, various hardware, networks, and the like. The display unit 60 is a visual user interface by which the user checks the state of the video display device 10. The operation unit 70 is a user interface, such as various hard buttons or a touch panel, by which the user inputs information to the video display device 10.
In this hardware configuration, programs stored in a storage medium such as the ROM 30, the HDD 40, or an optical disk (not shown) are read into the RAM 20, and the CPU 11 performs computations according to those programs, thereby forming a software control unit. The combination of the software control unit configured in this way and the hardware constitutes the functional blocks that realize the functions of the video display device 10 according to this embodiment.
Next, the configuration of the optical unit 200 according to this embodiment will be described. FIG. 1 is a diagram illustrating the configuration of the optical unit 200 according to this embodiment. As shown in FIG. 1, the optical unit 200 includes a light source unit 1, an illumination optical unit 2, an optical processing unit 3 that serves as both a video generation unit (image generation unit) and a retinal imaging unit, and a projection optical unit 4 including an element that changes the traveling direction of light. As shown in FIG. 1, light emitted from the projection optical unit 4 reaches the user's eyeball 6, and light emitted from the user's eyeball 6 enters the projection optical unit 4.
Light emitted from the light source unit 1 reaches the user's retina 7 via the illumination optical unit 2, the optical processing unit 3, the projection optical unit 4, and the user's crystalline lens 8. The light reaching the retina 7 is scattered by the retina 7, and part of the scattered light reaches the optical processing unit 3 via the crystalline lens 8 and the projection optical unit 4.
First, the video display function of the optical unit 200 will be described. The light source unit 1 emits visible light for presenting video to the user. The light source unit 1 is composed of a light source that emits white light, or of a plurality of light sources that emit light of different colors.
The illumination optical unit 2 adjusts the illuminance distribution of the light emitted from the light source unit 1 so that the video generation unit in the optical processing unit 3 is illuminated with substantially uniform illuminance, without excess or deficiency. When the light source unit 1 includes a plurality of light sources, the illumination optical unit 2 also combines the light emitted from those light sources. In the following, the case where the light source unit 1 includes a plurality of light sources is described as an example.
FIG. 2 is a diagram illustrating the configuration of the light source unit 1 and the illumination optical unit 2 according to this embodiment. The light source unit 1 includes, for example, a light source 1A, a light source 1B, and a light source 1C. Each of the light sources 1A to 1C emits one of red, green, or blue light. As long as the three colors of light are emitted by the three light sources without shortage, any light source may emit any of the colors. The light sources 1A to 1C may each be mounted in an independent package, or two or more light sources may be integrated in one package.
The illumination optical unit 2 includes, for example, a light tunnel 21A and a lens 21B. Light emitted from the light source unit 1 enters the light tunnel 21A of the illumination optical unit 2. Because the light entering the light tunnel 21A is reflected multiple times by the inner walls of the light tunnel 21A, the illuminance distribution of the light exiting the light tunnel 21A becomes substantially uniform. The light exiting the light tunnel 21A passes through the lens 21B.
The lens 21B collects the divergent light exiting the light tunnel 21A. The illumination optical unit 2 thus has the function of combining the light emitted from the light source unit 1 and illuminating the video generation unit in the optical processing unit 3 with a substantially uniform illuminance distribution.
Instead of collecting the divergent light from the light tunnel 21A with the single lens 21B, a plurality of lenses may be used. Using a plurality of lenses makes it possible to illuminate the video generation unit in the optical processing unit 3 with a more uniform illuminance distribution than when a single lens is used.
FIG. 3 is a diagram illustrating another configuration of the illumination optical unit 2. Light emitted from the light source unit 1 enters a diffusion plate 22A, is diffused, and exits the diffusion plate 22A. The illumination optical unit 2 thus has the function of combining the light emitted from the light source unit 1 and illuminating the video generation unit in the optical processing unit 3 with a substantially uniform illuminance distribution.
FIG. 4 is a configuration diagram of another form of the light source unit 1 and the illumination optical unit 2 of this embodiment. The light source unit 1 includes, for example, a light source 1A that emits red light, a light source 1B that emits green light, and a light source 1C that emits blue light.
The illumination optical unit 2 includes, for example, lenses 23A, 23B, and 23C; a dichroic mirror 23D that transmits red light and reflects green light; a relay lens 23H; a dichroic mirror 23E that transmits red and green light and reflects blue light; a microlens array 23F; and a lens 23G.
The red light emitted from the light source 1A passes through the lens 23A. The lens 23A makes the divergent light emitted from the light source 1A substantially parallel. The light that has passed through the lens 23A travels toward the dichroic mirror 23D and strikes it at an incident angle of approximately 45 degrees.
Similarly, the green light emitted from the light source 1B passes through the lens 23B. The lens 23B makes the divergent light emitted from the light source 1B substantially parallel. The light that has passed through the lens 23B travels toward the dichroic mirror 23D and strikes it at an incident angle of approximately 45 degrees.
The light source 1A, the light source 1B, and the dichroic mirror 23D are arranged so that the red light emitted from the light source 1A and the green light emitted from the light source 1B are substantially orthogonal, and so that the red light transmitted through the dichroic mirror 23D and the green light reflected by it travel in the same direction along substantially the same optical axis.
The light that has passed through the dichroic mirror 23D passes through the relay lens 23H, travels toward the dichroic mirror 23E, and strikes it at an incident angle of approximately 45 degrees. The relay lens 23H corrects the difference in beam spread caused by the difference between the optical path length from the light sources 1A and 1B to the dichroic mirror 23E and the optical path length from the light source 1C to the dichroic mirror 23E.
The blue light emitted from the light source 1C passes through the lens 23C. The lens 23C makes the divergent light emitted from the light source 1C substantially parallel. The light that has passed through the lens 23C travels toward the dichroic mirror 23E and strikes it at an incident angle of approximately 45 degrees.
The light source 1C and the dichroic mirror 23E are arranged so that the light that has passed through the dichroic mirror 23D and the blue light emitted from the light source 1C are substantially orthogonal, and so that the former light transmitted through the dichroic mirror 23E and the blue light reflected by it travel in the same direction along substantially the same optical axis.
Although the case where the light sources 1A, 1B, and 1C emit red, green, and blue light respectively has been described as an example, any combination of emitted colors may be used. However, when the combination is changed, the reflection/transmission characteristics of the dichroic mirrors 23D and 23E must be changed appropriately so that the light exiting the dichroic mirror 23E travels in the same direction along substantially the same optical axis.
The light exiting the dichroic mirror 23E passes through the microlens array 23F and the lens 23G. The microlens array 23F and the lens 23G are designed and arranged so that each entrance cell of the microlens array 23F, regarded as an object, is imaged onto the optical processing unit 3, and so that the size of the formed image is approximately equal to the size of the image generation unit in the optical processing unit 3. The illumination optical unit 2 thus combines the light emitted by the light source unit 1 and illuminates the image generation unit in the optical processing unit 3 with a substantially uniform illuminance distribution.
The optical processing unit 3 is an optical device having an image generation unit that performs image generation processing, generating an image by intensity-modulating the incident light, and a retinal imaging unit that performs imaging processing, capturing an image of the retina. As the optical processing unit 3, for example, an integrated liquid crystal display element/imaging element 31 including a transmissive monochrome liquid crystal display element and an imaging element can be used. The configuration of the optical processing unit 3 described below is one of the main points of the present embodiment.
Generation of a video by the optical processing unit 3 according to the present embodiment means displaying a video using the light output from the light source unit 1. Accordingly, the video generation unit, which performs video generation processing, can also be regarded as a video display generation unit that performs video display processing. The same applies when the target is not only a video (moving image) but an image including a still image; in that case, the optical processing unit 3 includes an image display unit that performs image display processing. In the following embodiments, the processing is described as video generation processing.
FIG. 5 is an example of a schematic cross-sectional view of the integrated liquid crystal display element/imaging element 31. The integrated liquid crystal display element/imaging element 31 includes a polarizing plate 3A, a pixel electrode 3B, a photodiode 3C, a liquid crystal layer 3D, a counter electrode 3E, and a polarizing plate 3F.
The photodiode 3C is configured to receive at least one of the red, green, and blue light traveling from the polarizing plate 3F toward the photodiode 3C, and not to receive light traveling from the polarizing plate 3A toward the photodiode 3C. The photodiode 3C therefore plays no role in the video display function of the optical unit 200.
The integrated liquid crystal display element/imaging element 31 operates as a liquid crystal panel for light traveling from the polarizing plate 3A toward the polarizing plate 3F. First, the light emitted by the light source unit 1 and passed through the illumination optical unit 2 enters the polarizing plate 3A in the direction from the polarizing plate 3A toward the polarizing plate 3F. Since the polarizing plate 3A transmits only light polarized in a specific direction, the light transmitted through the polarizing plate 3A is polarized in that direction.
Next, the light transmitted through the polarizing plate 3A enters the liquid crystal layer 3D. The liquid crystal layer 3D is sandwiched between the pixel electrode 3B and the counter electrode 3E, and the orientation of its liquid crystal molecules changes according to the voltage applied between the pixel electrode 3B and the counter electrode 3E. The polarization of the light entering the liquid crystal layer 3D is rotated according to the orientation of the liquid crystal molecules before the light exits the liquid crystal layer 3D.
The light that exits the liquid crystal layer 3D then enters the polarizing plate 3F, which transmits the component polarized along its polarization axis. By controlling the electric signal applied between the pixel electrode 3B and the counter electrode 3E, the integrated liquid crystal display element/imaging element 31 can intensity-modulate the light incident on it from the light source unit 1.
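The intensity modulation by the polarizer / liquid-crystal / polarizer stack can be illustrated with an idealized model. The patent does not give a transmission formula; the sketch below assumes the liquid crystal layer 3D acts as a pure polarization rotator, so that with crossed polarizing plates 3A and 3F the transmitted fraction follows Malus's law.

```python
import math

def transmitted_fraction(rotation_deg, crossed=True):
    """Idealized transmission of the 3A / 3D / 3F stack.

    Assumes the liquid crystal layer 3D purely rotates the incoming
    polarization by `rotation_deg` (a modeling assumption, not stated in
    the patent). For crossed polarizers the transmitted intensity fraction
    is sin^2(rotation); for parallel polarizers it is cos^2(rotation).
    """
    theta = math.radians(rotation_deg)
    return math.sin(theta) ** 2 if crossed else math.cos(theta) ** 2

# With crossed polarizers: 0 degrees of rotation gives a dark pixel,
# 90 degrees a fully bright pixel, and intermediate rotations (set by the
# voltage between the pixel electrode 3B and the counter electrode 3E)
# give the intermediate gray levels used for intensity modulation.
```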
The optical unit 200 can generate a video by driving the light source unit 1 and the integrated liquid crystal display element/imaging element 31 in synchronization. An example of such driving is described below.
The frequency at which the full-color image is updated is defined as the refresh rate, for example 60 Hz. In the present embodiment, a red image, a green image, and a blue image are displayed in sequence within 1/60 second. Because the switching between images of different colors is faster than the temporal resolution of the user's eyes, the user cannot distinguish the individual color images and perceives them as a single full-color image.
First, the optical unit 200 displays the red image. Among the light sources constituting the light source unit 1, the red light source is turned on and the green and blue light sources are turned off. An RGB value is obtained for each pixel of the image to be displayed, and the voltage between the pixel electrode 3B and the counter electrode 3E corresponding to each pixel is set according to the R value. The light transmitted through the integrated liquid crystal display element/imaging element 31 thus forms an image consisting only of the red component of the image to be displayed.
Next, the optical unit 200 displays the green image. Among the light sources constituting the light source unit 1, the green light source is turned on and the red and blue light sources are turned off. An RGB value is obtained for each pixel of the image to be displayed, and the voltage between the pixel electrode 3B and the counter electrode 3E corresponding to each pixel is set according to the G value. The light transmitted through the integrated liquid crystal display element/imaging element 31 thus forms an image consisting only of the green component of the image to be displayed.
Next, the optical unit 200 displays the blue image. Among the light sources constituting the light source unit 1, the blue light source is turned on and the red and green light sources are turned off. An RGB value is obtained for each pixel of the image to be displayed, and the voltage between the pixel electrode 3B and the counter electrode 3E corresponding to each pixel is set according to the B value. The light transmitted through the integrated liquid crystal display element/imaging element 31 thus forms an image consisting only of the blue component of the image to be displayed.
Displaying the red, green, and blue images completes the display of one full-color image. For the next full-color image, the red, green, and blue images are displayed according to the image to be shown. Although the above description displays the images in the order red, green, blue, the order is not limited to this; they may also be displayed in a different order, such as red, blue, green.
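The field-sequential driving described above can be sketched as follows. The hardware interfaces (`set_light_source`, `set_pixel_voltages`) are hypothetical placeholders standing in for the light source unit 1 and the electrodes 3B/3E; the patent specifies the sequence, not these APIs.

```python
# Sketch of the field-sequential color driving described above (60 Hz refresh).
# The hardware interfaces here are hypothetical placeholders.

REFRESH_RATE_HZ = 60  # one full-color frame per 1/60 s

def split_into_color_fields(rgb_image):
    """Split each pixel's (R, G, B) value into three single-color fields."""
    fields = {"red": [], "green": [], "blue": []}
    for row in rgb_image:
        fields["red"].append([r for r, g, b in row])
        fields["green"].append([g for r, g, b in row])
        fields["blue"].append([b for r, g, b in row])
    return fields

def display_frame(rgb_image, set_light_source, set_pixel_voltages):
    """Show one full-color frame as three sequential single-color fields.

    For each field, only the matching light source is turned on, and the
    voltage on each pixel electrode 3B is set from that color component.
    """
    fields = split_into_color_fields(rgb_image)
    for color in ("red", "green", "blue"):  # order may differ, e.g. R, B, G
        set_light_source(on=color)         # the other two sources stay off
        set_pixel_voltages(fields[color])  # intensity-modulate per pixel
```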
In the above embodiment, the light source unit 1 includes a plurality of light sources, and the optical processing unit 3 includes a transmissive monochrome liquid crystal display element and an imaging element. However, this is only an example; the light source unit 1 may instead include a white light source, and the optical processing unit 3 may include a transmissive color liquid crystal display element and an imaging element.
FIG. 21 is an example of a schematic cross-sectional view of an integrated liquid crystal display element/imaging element 35 including a transmissive color liquid crystal display element and an imaging element. The integrated liquid crystal display element/imaging element 35 includes a polarizing plate 3A, a pixel electrode 3B, a photodiode 3C, a liquid crystal layer 3D, a counter electrode 3E, a polarizing plate 3F, color filters 3L, 3M, and 3N, and a light shielding film 3P.
The color filter 3L transmits only red light, the color filter 3M only green light, and the color filter 3N only blue light. The optical unit 200 can generate a video by turning on the white light source constituting the light source unit 1 and, according to the RGB value of each pixel of the image to be presented, setting the voltage between the pixel electrode 3B and the counter electrode 3E corresponding to each of the pixel's R, G, and B subpixels.
The light exiting the optical processing unit 3 enters the projection optical unit 4. The projection optical unit 4 serves both to generate, from the video generated by the optical processing unit 3, the virtual image perceived by the user, and as a light direction changing unit that redirects the light entering it from the optical processing unit 3 so that the light enters the user's eyeball.
FIG. 6(a) illustrates an example configuration of the projection optical unit 4 according to the present embodiment. The projection optical unit 4 includes, for example, a convex lens 41A and a plane mirror 41B. The convex lens 41A is arranged so that the distance a (> 0) between the optical processing unit 3 and the convex lens 41A is shorter than the focal length f (> 0) of the convex lens 41A.
In this case, the image of the video generated by the optical processing unit 3 formed by the convex lens 41A is a virtual image, and its size is f/(f-a) times that of the video generated by the optical processing unit 3. Since f/(f-a) > 1, the projection optical unit 4 can generate a virtual image larger than the video generated by the optical processing unit 3.
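The f/(f-a) relation follows from the thin-lens model with the object inside the focal length. A small numeric check, using illustrative values of f and a that are assumptions (the patent gives no specific values):

```python
# Virtual-image magnification of the convex lens 41A when the object
# distance a is shorter than the focal length f (thin-lens model).
# The numeric values below are illustrative assumptions only.

def virtual_image_magnification(f_mm, a_mm):
    """Return f / (f - a); valid only for 0 < a < f (virtual, enlarged image)."""
    if not 0 < a_mm < f_mm:
        raise ValueError("requires 0 < a < f for a virtual image")
    return f_mm / (f_mm - a_mm)

# Example: with f = 20 mm and a = 15 mm the magnification is 4x, so the
# perceived virtual image is four times the size of the video generated
# by the optical processing unit 3.
```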
The plane mirror 41B changes the traveling direction of the light that exits the convex lens 41A. The light reflected by the plane mirror 41B enters the user's eyeball 6.
FIG. 7 shows the plane mirror 41B and the user's eyeball 6 as seen from the user's right side. As shown in FIG. 7, the height h of the plane mirror 41B is smaller than the diameter of the user's pupil (for example, 2 mm), the pupil defining the range over which an object such as the object 9 can be perceived visually. Light rays emitted from the object 9, which lies on the extension of the line connecting the center of the crystalline lens 8 and the center of the plane mirror 41B, therefore pass above and below the plane mirror 41B, like the rays 91A and 91B, and can reach the user's retina 7. The user can thus perceive the object 9 visually, ensuring the see-through property.
As another form of the projection optical unit 4, the projection optical unit 4 may include a plurality of convex or concave lenses and a plane mirror. Using a plurality of convex or concave lenses reduces the aberrations introduced by the lenses.
FIG. 6(b) shows another configuration of the projection optical unit 4 according to the present embodiment. As shown in FIG. 6(b), the projection optical unit 4 may include at least one lens and a triangular prism 41C having an entrance surface, a reflecting surface, and an exit surface, all of which are flat. The height of the triangular prism 41C is smaller than the diameter of the user's pupil (for example, 2 mm), which ensures the see-through property. The prism 41C may also have another shape, such as a cube prism.
FIG. 6(c) shows another configuration of the projection optical unit 4 according to the present embodiment. As shown in FIG. 6(c), the projection optical unit 4 may consist of a curved mirror 41D. The curved mirror 41D serves both to generate, from the video generated by the optical processing unit 3, the virtual image perceived by the user, and to redirect the light entering the projection optical unit 4 from the optical processing unit 3 so that it enters the user's eyeball. The height of the curved mirror 41D is smaller than the diameter of the user's pupil (for example, 2 mm), which ensures the see-through property.
As yet another form of the projection optical unit 4, the projection optical unit 4 may consist of a prism having an entrance surface, a reflecting surface, and an exit surface, at least one of which is curved. The prism serves both to generate, from the video generated by the optical processing unit 3, the virtual image perceived by the user, and to redirect the light entering the projection optical unit 4 from the optical processing unit 3 so that it enters the user's eyeball. The height of the prism is smaller than the diameter of the user's pupil (for example, 2 mm), which ensures the see-through property.
FIG. 6(d) shows another configuration of the projection optical unit 4 according to the present embodiment. As another means of ensuring the see-through property, a planar beam splitter 41E having a reflectance of 20% or more may be used, as shown in FIG. 6(d). With the planar beam splitter 41E, even if its height is larger than the diameter of the user's pupil, part of the light emitted from an object on the extension line of the planar beam splitter 41E is transmitted and can reach the user's retina 7. Even with this configuration, the user can perceive objects visually, ensuring the see-through property.
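The beam splitter trades display brightness against scene brightness. The sketch below assumes an idealized lossless splitter (reflected plus transmitted fractions sum to one); this simple energy-budget model is an assumption for illustration, not taken from the patent.

```python
# See-through trade-off for the planar beam splitter 41E, assuming an
# idealized lossless splitter: the fraction of display light reflected
# toward the eye plus the fraction of outside-scene light transmitted
# through the splitter sums to 1.

def beam_splitter_split(reflectance):
    """Return (display_fraction, scene_fraction) for a lossless splitter."""
    if not 0.0 <= reflectance <= 1.0:
        raise ValueError("reflectance must be in [0, 1]")
    return reflectance, 1.0 - reflectance

# With the 20% reflectance mentioned in the text, 20% of the display light
# reaches the eye while 80% of the light from the outside scene is still
# transmitted, preserving the see-through property.
```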
Alternatively, a prism whose reflecting surface has a reflectance of 20% or more may be used to ensure the see-through property. With such a prism, even if the prism's height is larger than the diameter of the user's pupil, part of the light emitted from an object on the extension line of the prism is transmitted and can reach the user's retina 7. Even with this configuration, the user can perceive the object 9 visually, ensuring the see-through property.
The configuration of the projection optical unit 4 according to the present embodiment is, of course, not limited to the forms described above; forms combining the components of the above forms are also possible. For example, the projection optical unit 4 may include a single lens and a prism in which at least one of the entrance, reflecting, and exit surfaces is curved, among other forms.
In FIG. 1, the light exiting the projection optical unit 4 enters the user's eyeball 6 and then reaches the retina 7. The user perceives the light reaching the retina 7, and at this time perceives the video as a virtual image present in front of the eyes.
Next, acquisition of the captured image used for biometric authentication is described. The retina imaging function of the optical unit 200 is described below, taking as an example retina authentication using a captured retinal image.
In the video display function described above, part of the light that reaches the user's retina 7 from the light source unit 1 is scattered by the retina 7. Part of the scattered light passes through the crystalline lens 8 and the projection optical unit 4 and enters the optical processing unit 3.
When the integrated liquid crystal display element/imaging element 31 described above with reference to FIG. 5 is used as the optical processing unit 3, the light entering the integrated liquid crystal display element/imaging element 31 passes through the polarizing plate 3F, the counter electrode 3E, and the liquid crystal layer 3D, and reaches the photodiode 3C. Since this light travels from the polarizing plate 3F toward the photodiode 3C, it is detected by the photodiode 3C.
As described above, in the video display device 10 according to the present embodiment, the optical processing unit 3 displays an image using the light output from the light source unit 1 and, by receiving the light of the displayed image reflected by the user's eye, captures an image of the retina, the authentication target for eye-based biometric authentication. With this configuration, the optical path is shared between the light for video (image) display and the light for retinal imaging, and the light source unit 1 serves as a common light source for both video (image) display and retinal imaging in the optical unit 200; an image display device with an imaging function for capturing the authentication target of eye-based biometric authentication can therefore be made smaller than before.
Furthermore, because the projection optical unit 4 includes an element that changes the traveling direction of the light so that the light entering it from the optical processing unit 3 enters the user's eyeball, a see-through video display device with a retinal imaging function can be provided. However, such an element is not essential; the projection optical unit 4 may instead pass the light from the optical processing unit 3 straight into the user's eyeball.
In the integrated liquid crystal display element/imaging element 31 according to the above embodiment, the case where the pixel electrode 3B and the photodiode 3C are arranged as one body, as shown in FIG. 5, was described as an example. With this configuration, there is no need to work out a suitable combination according to the performance of the pixel electrode 3B and the photodiode 3C, and the integrated liquid crystal display element/imaging element 31 can easily be constructed using a pre-combined pixel electrode 3B and photodiode 3C. However, this configuration is not essential; the pixel electrode 3B and the photodiode 3C may be arranged apart from each other, provided the photodiode 3C remains within the depth of focus so that it can capture an image of the retina.
Next, retina authentication processing by the image processing unit 202 is described with reference to FIGS. 8 and 22. FIG. 8 illustrates the retina authentication preparation image and the retina authentication image. FIG. 22 is a flowchart illustrating the retina authentication processing performed by the image processing unit 202.
As shown in FIG. 22, upon receiving a retina authentication start command from the control unit 201, the image processing unit 202 acquires the retina authentication preparation image 71A shown in FIG. 8 from the information storage unit 203 and outputs it to the optical unit 200 (S2201). The optical unit 200 presents the retina authentication preparation image 71A acquired from the image processing unit 202 to the user.
The retina authentication preparation image 71A is a predetermined image, stored in advance in the information storage unit 203, that instructs the user to look intentionally at a specific part of the displayed video. Specifically, as shown in FIG. 8, the retina authentication preparation image 71A contains gaze guidance information for guiding the user's line of sight to a predetermined position, such as a mark 71B indicating where the user should look and an instruction 71C such as "Look here."
The retina authentication preparation image 71A is displayed so that the optical processing unit 3 can acquire a sharp captured image of the user's retina 7. To acquire such an image, the user must look intentionally at the displayed video so that the thickness of the crystalline lens 8 adjusts until the displayed video is focused on the retina 7, putting the retina 7 and the optical processing unit 3 in an image-plane/object-plane relationship. The image processing unit 202 therefore displays the retina authentication preparation image 71A to lead the user to look intentionally at the video.
Having output the retina authentication preparation image 71A to the optical unit 200, the image processing unit 202 waits (S2202/NO) until a predetermined time has elapsed since the output (S2202/YES). The predetermined time is long enough for the user to read the displayed instruction 71C and move the line of sight to the mark 71B, for example several seconds.
When the predetermined time has elapsed (S2202/YES), the image processing unit 202 acquires the retina authentication image 72 shown in FIG. 8 from the information storage unit 203 and outputs it to the optical unit 200 (S2203). The optical unit 200 presents the retina authentication image 72 acquired from the image processing unit 202 to the user.
The retina authentication image 72 is, for example, an image that is a single color over its entire surface or that produces a uniform intensity distribution on the retina, and is stored in advance in the information storage unit 203. Specifically, as shown in FIG. 8, it is an image of a single color over its entire surface (indicated by dot hatching in FIG. 8). However, this is only an example; the retina authentication image 72 may be entirely white, or another color best suited to retina authentication.
Having output the retina authentication image to the optical unit 200, the image processing unit 202 acquires the captured image of the user's retina 7 taken by the optical processing unit 3 (S2204). Because the retina authentication preparation image 71A was output to the optical unit 200 and presented to the user in S2201, the user is intentionally looking at a specific part of the video presented by the optical unit 200. The retina 7 and the retinal imaging unit in the optical processing unit 3 are therefore in an image-plane/object-plane relationship, and the captured image is an image of the retina.
FIG. 9 is an example of a captured retinal image acquired by the image processing unit 202. As shown in FIG. 9, the blood vessel pattern 73A on the retina is captured in the retinal image. The optical unit 200 outputs such a captured retinal image to the image processing unit 202.
FIG. 10 is another example of a captured retinal image acquired by the image processing unit 202. The blood vessel pattern on the retina differs from person to person; a captured retinal image of a different individual shows, for example, a blood vessel pattern 73B different from the blood vessel pattern 73A, as shown in FIG. 10.
The image processing unit 202 determines whether the features of the blood vessel pattern 73A in the acquired retinal image match the features of the authorized user's blood vessel pattern stored in advance in the information storage unit 203 (S2205). If the features of the blood vessel pattern 73A captured by the optical unit 200 match those of the authorized user's blood vessel pattern acquired from the information storage unit 203 (S2205/YES), a retina authentication determination unit (not shown) in the image processing unit 202 determines that the user is the authorized user and that authentication has succeeded, and outputs the determination result to the control unit 201 (S2206).
The image processing unit 202 also acquires an authentication-success presentation image from the information storage unit 203 and outputs it to the optical unit 200. The optical unit 200 presents the authentication-success presentation image acquired from the image processing unit 202 to the user.
Having output the authentication-success determination result to the control unit 201, the image processing unit 202, following a command from the control unit 201, acquires from the information storage unit 203 browsing-restricted information, that is, information that becomes viewable upon successful authentication, and outputs it to the optical unit 200 (S2207). The optical unit 200 presents the browsing-restricted information acquired from the image processing unit 202 to the user.
In the embodiment above, the image processing unit 202 acquires the browsing-restricted information from the information storage unit 203. Alternatively, the image processing unit 202 may acquire the browsing-restricted information from the information processing apparatus 212 shown in FIG. 24. In that case, the communication processing unit 204, following a command from the control unit 201 upon receipt of the authentication-success determination result, acquires the browsing-restricted information from the information processing apparatus 212 and outputs it to the control unit 201. The control unit 201 outputs the browsing-restricted image acquired from the communication processing unit 204 to the image processing unit 202.
On the other hand, when the features of the blood vessel pattern captured by the optical unit 200 do not match the features of the authorized user's blood vessel pattern acquired from the information storage unit 203 (S2205/NO), the image processing unit 202 determines whether authentication has been attempted a predetermined number of times since retina authentication started (S2208). If the number of attempts is less than the predetermined number (S2208/NO), the image processing unit 202 outputs the retina authentication preparation image 71A to the optical unit 200 again (S2201) and retries authentication.
The captured retinal image can vary with the user's line of sight at the time of imaging; depending on the acquired image, the captured blood vessel pattern may be judged not to match the features stored in the information storage unit 203 even though the user is the authorized user. By retrying authentication while the attempt count is below the predetermined number, as described above, the user is given more opportunities to align the line of sight so that retina authentication is performed correctly.
On the other hand, when the predetermined number of attempts has been reached (S2208/YES), the retina authentication determination unit of the image processing unit 202 determines that the user is not the authorized user and that authentication has failed, and outputs that determination result to the control unit 201 (S2209). The image processing unit 202 also acquires an authentication-failure presentation image from the information storage unit 203 and transmits it to the optical unit 200. The optical unit 200 presents the authentication-failure presentation image acquired from the image processing unit 202 to the user.
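For illustration only, the retry-limited authentication flow of steps S2201 through S2209 can be sketched as follows. This is a minimal Python sketch and not part of the disclosure: the names `authenticate`, `match_features`, `capture_retina`, and `MAX_ATTEMPTS` (standing in for the "predetermined number" of S2208) are assumptions, and the blood-vessel feature comparison is reduced to a toy equality check.

```python
MAX_ATTEMPTS = 3  # illustrative value for the "predetermined number" of S2208


def match_features(captured, stored):
    """Toy feature comparison: exact equality stands in for the real matcher
    that compares blood-vessel-pattern features (S2205)."""
    return captured == stored


def authenticate(capture_retina, stored_features):
    """Return True on success (S2206), False once MAX_ATTEMPTS have
    failed (S2208/YES -> S2209)."""
    for _attempt in range(MAX_ATTEMPTS):
        # S2201-S2204: present the preparation/authentication images,
        # then capture the retina.
        captured = capture_retina()
        if match_features(captured, stored_features):
            return True   # S2205/YES
        # S2205/NO -> S2208/NO: retry with a fresh capture
    return False          # attempt limit reached
```

As a usage sketch, a capture function that returns the correct pattern on the second try succeeds, while one that never matches fails after the attempt limit.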
It is desirable that the RGB values of, for example, 80% or more of the pixels of the retina authentication preparation image 71A be substantially the same as the RGB values of the corresponding pixels of the retina authentication image 72. This reduces the burden on the user's eyes when switching between the retina authentication preparation image 71A and the retina authentication image 72, and at the same time suppresses light/dark adaptation.
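As a hypothetical illustration of the 80% guideline above, the fraction of pixels whose RGB values are substantially the same could be computed as follows. In this Python sketch, `similar_pixel_fraction`, `meets_guideline`, and the per-channel tolerance `tol` are invented names (the disclosure does not define "substantially the same" numerically), and images are modeled as flat lists of (R, G, B) tuples.

```python
def similar_pixel_fraction(img_a, img_b, tol=8):
    """Fraction of co-located pixels whose RGB values differ by at most
    `tol` per channel ('substantially the same', by assumption)."""
    assert len(img_a) == len(img_b)
    same = sum(
        1
        for (ra, ga, ba), (rb, gb, bb) in zip(img_a, img_b)
        if abs(ra - rb) <= tol and abs(ga - gb) <= tol and abs(ba - bb) <= tol
    )
    return same / len(img_a)


def meets_guideline(prep_img, auth_img, threshold=0.8):
    """True when at least `threshold` of the pixels are substantially equal."""
    return similar_pixel_fraction(prep_img, auth_img) >= threshold
```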
In the embodiment above, the retina authentication determination unit (not shown) included in the image processing unit 202 performs the retina authentication processing based on the features of the user's blood vessel pattern acquired from the information storage unit 203. This is only one example, however; the video display device 10 may instead acquire the features of the user's blood vessel pattern stored in the information processing apparatus 212 and perform the retina authentication processing with them.
Alternatively, the video display device 10 may transmit the features of the captured blood vessel pattern of the user to the information processing apparatus 212; the information processing apparatus 212 may then perform the retina authentication processing by comparing the stored features of the user's blood vessel pattern with the features received from the video display device 10, and transmit the authentication result to the video display device 10.
By capturing the retina as appropriate, the video display device 10 according to the present embodiment can also realize functions other than the retina authentication processing that use the captured retinal image. For example, a video viewing determination unit (not shown) in the image processing unit 202 can use the retinal image captured by the optical unit 200 to determine whether the user is looking at the video presented by the optical unit 200, and the control unit 201 can control the brightness of the video presented by the optical unit 200 according to the determination result received from the image processing unit 202.
FIG. 11 illustrates the relationship between the sharpness of the retinal captured image and the brightness of the displayed image. When the user is looking at the video, the integrated liquid crystal display element/imaging element 31 and the retina 7 stand in an image-plane/object-plane relationship, so the blood vessel pattern in the retinal image captured by the optical unit 200 is the sharp blood vessel pattern 73A, as shown in FIG. 11.
On the other hand, when the user is not looking at the video but is focusing farther or nearer than it, the integrated liquid crystal display element/imaging element 31 and the retina 7 are not in an image-plane/object-plane relationship, so the blood vessel pattern in the retinal image captured by the optical unit 200 is the blurred blood vessel pattern 74 (shown by dotted lines in FIG. 11).
Accordingly, the image processing unit 202 quantifies the sharpness of the blood vessel pattern in the retinal image captured by the optical unit 200; when the sharpness exceeds a certain threshold, the video viewing determination unit determines that the user is looking at the video and outputs that determination result to the control unit 201.
Conversely, when the blood vessel pattern is not sharp, the video viewing determination unit determines that the user is not looking at the video and outputs that determination result to the control unit 201.
According to the determination result received from the image processing unit 202, the control unit 201 commands the optical unit 200 to present the bright image 75 when the user is looking at the video and the dark image 76 when the user is not. The optical unit 200 changes the brightness of the presented video according to the command received from the control unit 201. This reduces the degree to which the video obstructs the user's vision when the user is not looking at it.
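A minimal sketch of this sharpness-based brightness control might look as follows. This is illustrative Python, not the disclosed implementation: the gradient-based `sharpness` measure (a stand-in for any sharpness metric), the threshold, and the brightness values are all assumptions, and the retinal image is modeled as a 2-D list of pixel intensities.

```python
def sharpness(image):
    """Quantify sharpness as the mean absolute horizontal gradient.
    A sharp blood-vessel pattern (73A) has strong edges; a blurred
    pattern (74) has weak ones."""
    total, count = 0, 0
    for row in image:
        for x0, x1 in zip(row, row[1:]):
            total += abs(x1 - x0)
            count += 1
    return total / count


def display_brightness(retina_image, threshold, bright=1.0, dim=0.2):
    """Bright image (75) when the pattern is sharp enough (user watching),
    dark image (76) otherwise."""
    return bright if sharpness(retina_image) > threshold else dim
```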
The video viewing determination unit described above determines whether the user is looking at the video based on the sharpness of the blood vessel pattern in the retinal image captured by the optical unit 200. The video viewing determination unit can also make this determination based on the movement of the blood vessel pattern.
FIG. 12 illustrates another relationship, between the amount of movement of the retinal captured image and the brightness of the displayed image. For example, as shown in FIG. 12, when the user is looking at the background above, below, or beside the video rather than at the video itself, the blood vessel pattern in the retinal image captured by the optical unit 200 is the blood vessel pattern 77, shifted vertically or horizontally relative to the blood vessel pattern 73A captured when the user is looking at the video.
Accordingly, the video viewing determination unit first acquires the blood vessel pattern observed when the user is looking at the video, or its features, from the information storage unit 203, where it is stored in advance as a reference blood vessel pattern. When the blood vessel pattern in the retinal captured image taken by the optical unit 200, or its features, has not moved beyond a certain threshold relative to the acquired reference blood vessel pattern or its features, the video viewing determination unit determines that the user is looking at the video and outputs that determination result to the control unit 201.
Conversely, when the blood vessel pattern in the retinal captured image, or its features, has moved beyond the threshold relative to the acquired reference, the video viewing determination unit determines that the user is not looking at the video and outputs that determination result to the control unit 201. In this way, the video viewing determination unit can determine whether the user is looking at the video based on the amount of movement of the blood vessel pattern.
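The movement-based determination could be sketched as follows. This hypothetical Python reduces the blood-vessel patterns to matched feature points; `pattern_shift`, `is_watching`, and the threshold value are illustrative assumptions rather than the disclosed matching method.

```python
def pattern_shift(reference, observed):
    """Mean displacement (in pixels) between corresponding feature points
    of the reference and observed blood-vessel patterns."""
    dists = [
        ((x1 - x0) ** 2 + (y1 - y0) ** 2) ** 0.5
        for (x0, y0), (x1, y1) in zip(reference, observed)
    ]
    return sum(dists) / len(dists)


def is_watching(reference, observed, threshold=5.0):
    """True while the pattern has not moved beyond the threshold
    (user still looking at the video)."""
    return pattern_shift(reference, observed) <= threshold
```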
Depending on the sharpness of the retinal captured image, or on the amount of movement of the blood vessel pattern or its features, the optical unit 200 may change the brightness of the displayed video not only between two values but also over two or more levels.
The video display device 10 according to the present embodiment can also obtain the user's gaze point on the video from the amount of movement of the blood vessel pattern in the retinal image captured by the optical unit 200, and can thereby implement the role of a pointing device. For example, the image processing unit 202 calculates the amount of mouse-pointer movement from the amount of movement of the blood vessel pattern in the captured retinal image, or of its features, and moves the mouse pointer on the video.
Further, when retinal imaging is interrupted by the user's blinking, for example twice within one second or continuously for one second or more, the image processing unit 202 determines that the user has clicked the mouse. The role of a pointing device can thus be implemented without adding any new element.
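A sketch of the blink-based click detection described above, in hypothetical Python: imaging interruptions are modeled as (start, end) time intervals in seconds, and the function name and window parameters are illustrative assumptions.

```python
def is_click(blink_events, window=1.0, long_blink=1.0):
    """Classify a click from retinal-imaging interruptions.
    blink_events: list of (start_time, end_time) intervals in seconds.
    A click is registered when two interruptions begin within `window`
    seconds of each other, or when a single interruption lasts
    `long_blink` seconds or more."""
    # Case 1: one long interruption (e.g. eyes closed >= 1 s)
    if any(end - start >= long_blink for start, end in blink_events):
        return True
    # Case 2: two interruptions starting within the window (double blink)
    starts = sorted(start for start, _ in blink_events)
    return any(b - a <= window for a, b in zip(starts, starts[1:]))
```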
Based on the sharpness of the blood vessel pattern in the retinal image captured by the optical unit 200, the video display device 10 according to the present embodiment can also realize an automatic focus adjustment function for when the user views the video. In this case, part or all of at least one of the optical processing unit 3 and the projection optical unit 4 includes a movable portion.
The image processing unit 202 quantifies the sharpness of the blood vessel pattern in the retinal image captured by the optical unit 200 and outputs the result to the control unit 201. When the blood vessel pattern is unclear, the control unit 201 commands the optical unit 200 to drive the movable portion. An automatic focus adjustment function can thereby be realized.
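The focus-adjustment loop might be sketched as a simple scan over candidate positions of the movable portion. This hypothetical Python is a sketch under assumptions: the scan strategy, the callback interfaces `measure_sharpness` and `move_element`, and the discrete position set are illustrative, not the disclosed control scheme.

```python
def autofocus(measure_sharpness, move_element, positions):
    """Scan the movable element over candidate positions and settle on
    the position giving the sharpest blood-vessel pattern."""
    best_pos, best_sharp = None, float("-inf")
    for pos in positions:
        move_element(pos)           # drive the movable portion
        s = measure_sharpness()     # quantify pattern sharpness here
        if s > best_sharp:
            best_pos, best_sharp = pos, s
    move_element(best_pos)          # return to the best focus position
    return best_pos
```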
[Example 2]
Example 2 of the present invention will be described below. Example 2 differs from Example 1 in that the light source unit 1 includes a light source that emits visible light and a light source that emits infrared light, and the optical processing unit 3 images the retina by detecting the return light, from the retina, of the infrared light emitted by the light source unit 1. Imaging the retina with infrared light makes it possible to acquire a sharper retinal image than imaging it with visible light.
The second embodiment of the present invention is described below with reference to FIGS. 5 and 13. Components having the same configuration or function as in Example 1 are given the same reference numerals, and their detailed description is omitted; detailed description of procedures identical to those of Example 1 is likewise omitted.
The function of the optical unit 200 for displaying video is described first. FIG. 13 illustrates the configuration of the light source unit 1 according to the second embodiment of the present invention. As shown in FIG. 13, the light source unit 1 according to the second embodiment includes a light source 1A, a light source 1B, a light source 1C, and a light source 1D. Each of the light sources 1A to 1D emits one of red light, green light, blue light, and infrared light. Any of the light sources may emit any of these lights, as long as the four lights are emitted by the four light sources without omission.
A schematic example of the optical processing unit 3 of this example is shown in FIG. 5, the same as in Example 1 of the present invention. However, the photodiode 3C is configured to receive infrared light traveling from the polarizing plate 3F toward the photodiode 3C and not to receive light traveling from the polarizing plate 3A toward the photodiode 3C.
The light sources in the optical unit 200 according to the second embodiment of the present invention light up as follows when the refresh rate is, for example, 60 Hz. Within each 1/60 second, the red, green, blue, and infrared light sources light up in turn and exclusively. The light of the red, green, and blue light sources is used to present the video to the user, and the light of the infrared light source is used to image the retina.
Here, instead of the red, green, blue, and infrared light sources lighting up in turn within each 1/60 second, the infrared light source may remain unlit while the retina is not being imaged, with only the red, green, and blue light sources lighting up in turn. This can increase the brightness of the video.
Likewise, instead of the four light sources lighting up in turn within each 1/60 second, the red, green, and blue light sources may remain unlit while the retina is being imaged, with the infrared light source lit continuously. This can increase the time-averaged intensity of the infrared light.
Furthermore, instead of the four light sources lighting up in turn within each 1/60 second, the infrared light source may be lit continuously while the red, green, and blue light sources light up in turn. This can increase both the brightness of the video and the time-averaged intensity of the infrared light.
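The four lighting schemes described above can be summarized in a small sketch. This hypothetical Python merely tabulates the alternatives; the mode names and the slot notation (tuples of sources lit in each time slot of a 1/60-second refresh period) are illustrative assumptions.

```python
def frame_sequence(mode):
    """Return the source slots within one 1/60 s refresh period.
    Modes mirror the four alternatives described above:
      'display+capture'      R, G, B, IR lit in turn, exclusively
      'display_only'         IR omitted while not imaging (brighter video)
      'capture_only'         IR lit continuously while imaging (higher
                             time-averaged IR intensity)
      'display+constant_ir'  IR lit continuously alongside R, G, B in turn
    """
    sequences = {
        "display+capture": [("R",), ("G",), ("B",), ("IR",)],
        "display_only": [("R",), ("G",), ("B",)],
        "capture_only": [("IR",)],
        "display+constant_ir": [("R", "IR"), ("G", "IR"), ("B", "IR")],
    }
    return sequences[mode]
```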
Next, the retina-imaging function of the optical unit 200 is described, taking retina authentication from a retinal captured image as an example. In the video display function described above, part of the infrared light that reaches the user's retina 7 from the light source unit 1 is scattered by the retina 7. Part of the scattered light passes through the crystalline lens 8 and the projection optical unit 4 and enters the photodiode 3C in the integrated liquid crystal display element/imaging element 31. Since the light entering the photodiode 3C is infrared light traveling from the polarizing plate 3F toward the photodiode 3C, the photodiode 3C detects this infrared light.
Next, the light sources during retina authentication are described. When the optical unit 200 presents the retina authentication preparation image 71A shown in FIG. 8, the optical unit 200 does not image the retina, so only the visible light sources are lit and the infrared light source need not be lit.
When the optical unit 200 images the user's retina 7, the infrared light source is lit. At that time, the red, green, and blue light sources need not be lit. The retinal imaging unit in the optical processing unit 3 can acquire a captured image of the retina by receiving the return light from the retina of the infrared light emitted by the infrared light source.
When the optical unit 200 images the user's retina 7, the visible light sources may also be lit in addition to the infrared light source. Since the light that enters the optical processing unit 3 from the infrared light source undergoes intensity modulation by the optical processing unit 3, just like the light entering from the visible light sources, the image processing unit 202 preferably applies processing to the captured retinal image, based on the information of the displayed image, to correct for this intensity modulation.
As in Example 1, the video display device 10 in this example can realize functions other than retina authentication that use the captured retinal image. Moreover, according to this example, imaging the retina with infrared light makes it possible to acquire a sharper retinal captured image than imaging it with visible light.
[Example 3]
Example 3 of the present invention will be described below. In Example 3, an optical processing unit 32 having a configuration different from those of Examples 1 and 2 is used. According to this example, the range of usable elements can be broadened.
The third embodiment of the present invention is described below with reference to FIG. 14. Components having the same configuration or function as in Examples 1 and 2 are given the same reference numerals, and their detailed description is omitted; detailed description of procedures identical to those of Examples 1 and 2 is likewise omitted.
FIG. 14 illustrates the configuration of the optical processing unit 32 according to the third embodiment of the present invention. As shown in FIG. 14, the optical processing unit 32 according to the third embodiment includes a polarizing plate 32A, a polarizing beam splitter 32B, a liquid crystal display element 32C, a wave plate 32D, and an imaging element 32E.
Light emitted from the light source unit 1 and having passed through the illumination optical unit 2 enters the polarizing plate 32A. Since the polarizing plate 32A transmits light polarized in a specific direction, the light it transmits has polarization A in that direction. This transmitted light having polarization A then enters the polarizing beam splitter 32B.
The reflecting/transmitting surface 32F of the polarizing beam splitter reflects light having polarization A and transmits light having polarization B, orthogonal to polarization A. Therefore, the polarization-A light from the polarizing plate 32A that enters the polarizing beam splitter 32B is reflected by the reflecting/transmitting surface 32F and enters the liquid crystal display element 32C.
The liquid crystal display element 32C applies intensity modulation to the incident light based on the signal of the video presented by the optical unit 200. The intensity-modulated light, its polarization changed from A to B, exits the liquid crystal display element 32C in the direction opposite to incidence. In other words, the liquid crystal display element 32C outputs the light of the video (image) displayed by the optical unit 200.
The light emitted from the liquid crystal display element 32C enters the polarizing beam splitter 32B, but because it has polarization B, it passes through the reflecting/transmitting surface 32F and enters the wave plate 32D. The light transmitted through the wave plate 32D reaches the user's retina 7 via the projection optical unit 4 and the user's crystalline lens 8.
Part of the light that reaches the user's retina 7 is scattered by the retina 7. Part of the scattered light passes through the user's crystalline lens 8 and the projection optical unit 4, enters the wave plate 32D, and exits it. The light exiting the wave plate 32D enters the polarizing beam splitter 32B. The wave plate 32D is configured so that the light exiting it and entering the polarizing beam splitter 32B has polarization A.
As a result, the light exiting the wave plate 32D and entering the polarizing beam splitter 32B is reflected by the reflecting/transmitting surface 32F and enters the imaging element 32E. That is, the wave plate 32D changes the polarization between the light traveling from the liquid crystal display element 32C to the user's retina 7 and the light traveling from the user's retina 7 to the imaging element 32E, and the polarizing beam splitter 32B separates the optical path from the liquid crystal display element 32C to the user's retina 7 from the optical path from the user's retina 7 to the imaging element 32E. With this configuration, the optical unit 200 can obtain a captured image of the retina from the light received by the imaging element 32E.
According to this example, optimum elements can be selected independently for the liquid crystal display element 32C and the imaging element 32E, broadening the range of usable elements.
The light source unit 1 according to this example may consist of light sources that emit visible light, as in Example 1, or of light sources that emit visible light or infrared light, as in Example 2. When the light source unit 1 consists of light sources that emit visible light, the imaging element 32E receives visible light; when it consists of light sources that emit visible light or infrared light, the imaging element 32E receives infrared light.
[Example 4]
Next, Example 4 of the present invention will be described. Example 4 differs from Examples 1 to 3 in that it uses an optical processing unit 33 equipped with a fiber scanning element 33B. According to this example, a video display device 10 with a retinal imaging function using the fiber scanning element 33B can be realized.
以下では、本発明の第4の実施形態について図15乃至図17を参照して説明する。なお、実施例1乃至実施例3と同一の構成や機能を有するものには同一の符号を付してその詳細な説明を省略するものとする。また、実施例1乃至実施例3と同一の手続きに関しても、詳細な説明を省略するものとする。
Hereinafter, the fourth embodiment of the present invention will be described with reference to FIGS. 15 to 17. Components having the same configurations and functions as those in the first to third embodiments are denoted by the same reference numerals, and detailed descriptions thereof are omitted. Detailed descriptions of procedures identical to those in the first to third embodiments are likewise omitted.
本発明の第4の実施形態に係る光源部1は、赤色の光を放出する光源1Aと、緑色の光を放出する光源1Bと、青色の光を放出する光源1Cと、を備えている。ここで、光源1A、光源1B、及び光源1Cは、いずれもレーザー光源であり、偏光Aを有する光線を放出する。本実施例に係る照明光学部2は、光源が放出した光を合波し、合波された光が光学処理部に可能な限り高効率で入力するよう、光の照度分布を調整する。
The light source unit 1 according to the fourth embodiment of the present invention includes a light source 1A that emits red light, a light source 1B that emits green light, and a light source 1C that emits blue light. The light sources 1A, 1B, and 1C are all laser light sources and emit light beams having polarization A. The illumination optical unit 2 according to the present embodiment combines the light emitted from these light sources and adjusts the illuminance distribution so that the combined light enters the optical processing unit as efficiently as possible.
図15は、本発明の第4の実施形態に係る光学処理部33の構成を例示する図である。図15に示すように、本発明の第4の実施形態に係る光学処理部33は、ファイバー偏光ビームスプリッター33Aと、ファイバー走査素子33Bと、波長板33Cと、光強度計測素子33Dと、を備えている。
FIG. 15 is a diagram illustrating the configuration of the optical processing unit 33 according to the fourth embodiment of the present invention. As shown in FIG. 15, the optical processing unit 33 according to the fourth embodiment includes a fiber polarization beam splitter 33A, a fiber scanning element 33B, a wave plate 33C, and a light intensity measurement element 33D.
図16は、ファイバー偏光ビームスプリッター33Aの一例を示す図である。図16に示すように、ファイバー偏光ビームスプリッター33Aは、端子33F・端子33G・端子33Hの光入出力端子を有しており、偏光Aの光を端子33Fに入力した場合及び偏光Aに直交する偏光Bの光を端子33Gに入力した場合には、光は端子33Hから出射する。
FIG. 16 is a diagram illustrating an example of the fiber polarization beam splitter 33A. As shown in FIG. 16, the fiber polarization beam splitter 33A has three optical input/output terminals, 33F, 33G, and 33H. When light of polarization A is input to the terminal 33F, or when light of polarization B, which is orthogonal to polarization A, is input to the terminal 33G, the light is emitted from the terminal 33H.
反対に、偏光Aの光を端子33Hに入力した場合には、光は端子33Fから出射し、偏光Bの光を端子33Hに入力した場合には、光は端子33Gから出射する。なお、光がファイバー偏光ビームスプリッター33Aを通過する際には、偏光は変化しない。
Conversely, when light of polarization A is input to the terminal 33H, the light is emitted from the terminal 33F, and when light of polarization B is input to the terminal 33H, the light is emitted from the terminal 33G. The polarization does not change as light passes through the fiber polarization beam splitter 33A.
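The port-routing behavior described above can be restated as a small sketch (purely illustrative; the names `ROUTING` and `route` are not from the patent, the table only re-expresses the rules in the text):

```python
# Hypothetical sketch of the routing rules of fiber polarization beam
# splitter 33A as stated in the text. Light keeps its polarization while
# passing through; only the exit terminal depends on the input terminal
# and the polarization state.
ROUTING = {
    ("33F", "A"): "33H",  # polarization-A light into 33F exits from 33H
    ("33G", "B"): "33H",  # polarization-B light into 33G exits from 33H
    ("33H", "A"): "33F",  # polarization-A light into 33H exits from 33F
    ("33H", "B"): "33G",  # polarization-B light into 33H exits from 33G
}

def route(terminal: str, polarization: str) -> str:
    """Return the exit terminal; the polarization itself is unchanged."""
    return ROUTING[(terminal, polarization)]
```

In the display path, light of polarization A entering 33F thus reaches the fiber scanning element via 33H, while returning light of polarization B entering 33H is diverted from 33G toward the light intensity measurement element.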
光源部1を出射した光は、偏光ビームスプリッター33Aの端子33Fに入射する。端子33Fに入射した光は、偏光Aを有しているので、端子33Hから出射する。端子33Hから出射した光は、ファイバー走査素子33Bに入射する。
The light emitted from the light source unit 1 enters the terminal 33F of the polarization beam splitter 33A. Since the light incident on the terminal 33F has the polarization A, it is emitted from the terminal 33H. The light emitted from the terminal 33H enters the fiber scanning element 33B.
図17は、ファイバー走査素子33Bの一例を示す図である。端子33Hから出射した光は、図17に示した端子33Mに入射する。ファイバー走査素子33Bは図示しない可動部を有しており、図17において矢印33Lで示すように、ファイバー内の光の進行方向に略直交する2軸方向に、ファイバーの端子33Nを走査する。ファイバー走査素子33Bに入射した光は、端子33Nから出射する。
FIG. 17 is a diagram illustrating an example of the fiber scanning element 33B. The light emitted from the terminal 33H enters the terminal 33M shown in FIG. The fiber scanning element 33B has a movable part (not shown), and scans the fiber terminal 33N in two axial directions substantially orthogonal to the traveling direction of light in the fiber, as indicated by an arrow 33L in FIG. The light incident on the fiber scanning element 33B is emitted from the terminal 33N.
光学部200は、光源部1とファイバー走査素子33Bとを同期して駆動することにより、仮想的なスクリーン33Qの位置に映像を生成することができる。以下では、このような駆動の一例について述べる。
The optical unit 200 can generate an image at the position of the virtual screen 33Q by driving the light source unit 1 and the fiber scanning element 33B in synchronization. Hereinafter, an example of such driving will be described.
リフレッシュレートを、例えば60Hzとする。1/60秒の間に、仮想的なスクリーン33Qの位置に生成される映像のすべての画素に向けてファイバー走査素子33Bの出射光33Pが照射するように、ファイバー走査素子33Bはファイバーの端子33Nを走査する。
Suppose the refresh rate is, for example, 60 Hz. The fiber scanning element 33B then scans the fiber terminal 33N so that, within 1/60 second, the emitted light 33P of the fiber scanning element 33B irradiates every pixel of the image generated at the position of the virtual screen 33Q.
光学部200は、光学部200が提示する映像のうち出射光33Pが照射している画素における色に応じて、光源1A・光源1B・光源1Cが放出する光の強度を変更する。
The optical unit 200 changes the intensities of the light emitted by the light sources 1A, 1B, and 1C according to the color of the pixel that the emitted light 33P is irradiating in the image presented by the optical unit 200.
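The synchronized drive described above can be sketched as follows (a hedged illustration: the patent fixes only the 60 Hz refresh rate, so the resolution, the raster scan order, and the names `scan_positions`, `drive_frame`, and `set_source_intensity` are assumptions introduced here):

```python
REFRESH_HZ = 60                          # refresh rate given in the text
WIDTH, HEIGHT = 640, 480                 # assumed virtual-screen resolution

frame_period = 1.0 / REFRESH_HZ          # 1/60 s to visit every pixel once
dwell = frame_period / (WIDTH * HEIGHT)  # time the spot spends per pixel

def scan_positions(width=WIDTH, height=HEIGHT):
    """Yield (x, y) pixel positions in raster order; the movable part of
    fiber scanning element 33B sweeps terminal 33N through all of them
    within one frame period."""
    for y in range(height):
        for x in range(width):
            yield x, y

def drive_frame(frame_rgb, set_source_intensity):
    """As the scan passes each pixel, set the intensities of light sources
    1A (red), 1B (green), and 1C (blue) to that pixel's color.
    set_source_intensity is a stand-in for the real source driver."""
    height, width = len(frame_rgb), len(frame_rgb[0])
    for x, y in scan_positions(width, height):
        r, g, b = frame_rgb[y][x]
        set_source_intensity("1A", r)
        set_source_intensity("1B", g)
        set_source_intensity("1C", b)
```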
ファイバー走査素子33Bを出射した光は、波長板33Cを透過する。ここで波長板33Cは、光の偏光を変更するが、波長板33Cの機能の詳細については後述する。波長板33Cを透過した光は、投影光学部4と、利用者の水晶体8と、を経て、利用者の網膜7に到達する。このとき利用者は、ファイバー走査素子33Bが仮想的なスクリーン33Qの位置に生成した映像の、投影光学部4による虚像を、眼前に存在するものとして知覚する。
The light emitted from the fiber scanning element 33B passes through the wave plate 33C. The wave plate 33C changes the polarization of the light; the details of its function are described later. The light transmitted through the wave plate 33C reaches the user's retina 7 via the projection optical unit 4 and the user's crystalline lens 8. The user then perceives the virtual image, formed by the projection optical unit 4, of the image that the fiber scanning element 33B generated at the position of the virtual screen 33Q as if it existed in front of the eyes.
本実施形態に係る光学部200は、以下のようにして網膜の像を取得することができる。利用者の網膜7に到達した光の一部は、網膜7で散乱する。散乱光の一部は、利用者の水晶体8と投影光学部4を経て、波長板33Cに入射し、波長板33Cを出射する。波長板33Cは、利用者の水晶体8を経て波長板33Cを出射した光の偏光が偏光Bとなるように構成される。波長板33Cを出射した光は、仮想的なスクリーン33Qの位置に到達する。
The optical unit 200 according to the present embodiment can acquire an image of the retina as follows. Part of the light that reaches the user's retina 7 is scattered by the retina 7. Part of the scattered light passes through the user's crystalline lens 8 and the projection optical unit 4, enters the wave plate 33C, and exits it. The wave plate 33C is configured so that the light that has passed through the user's crystalline lens 8 and exited the wave plate 33C has polarization B. The light emitted from the wave plate 33C reaches the position of the virtual screen 33Q.
光学部200が提示した映像を利用者が見ている場合、利用者の網膜7と仮想的なスクリーン33Qは物面と像面の関係にあるから、網膜7から端子33Nに到達した光は、仮想的なスクリーン33Q上で網膜7の像として結像する。
When the user is viewing the video presented by the optical unit 200, the user's retina 7 and the virtual screen 33Q are in an object-plane/image-plane relationship, so the light that travels from the retina 7 toward the terminal 33N forms an image of the retina 7 on the virtual screen 33Q.
仮想的なスクリーン33Qの位置に到達した光は、図17に示したファイバー走査素子33Bの端子33Nに入射する。端子33Nからファイバー走査素子33Bに入射した光は、端子33Mから出射し、図16に示したファイバー偏光ビームスプリッター33Aの端子33Hに入射する。
The light that has reached the position of the virtual screen 33Q is incident on the terminal 33N of the fiber scanning element 33B shown in FIG. The light incident on the fiber scanning element 33B from the terminal 33N exits from the terminal 33M and enters the terminal 33H of the fiber polarization beam splitter 33A shown in FIG.
端子33Hに入射した光は、偏光Bを有するため、端子33Gから出射され、図15に示した光強度計測素子33Dに入射する。光強度計測素子33Dは、入射した光の強度を計測する。光強度計測素子33Dにより得られた光の強度は、図17に示したファイバーの端子33Nの走査に同期して処理することで、仮想的なスクリーン33Q上における網膜7の像の各画素の強度を意味する。従って、光学部200は、端子33Nの走査に同期して、光強度計測素子33Dが計測した光強度から画像を形成することにより、網膜の像を取得することができる。
Since the light incident on the terminal 33H has polarization B, it is emitted from the terminal 33G and enters the light intensity measurement element 33D shown in FIG. 15. The light intensity measurement element 33D measures the intensity of the incident light. When processed in synchronization with the scanning of the fiber terminal 33N shown in FIG. 17, the intensity obtained by the light intensity measurement element 33D corresponds to the intensity of one pixel of the image of the retina 7 on the virtual screen 33Q. Accordingly, the optical unit 200 can acquire an image of the retina by forming an image from the light intensities measured by the light intensity measurement element 33D in synchronization with the scanning of the terminal 33N.
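The image-formation step just described, pairing each intensity sample from element 33D with the scan position of terminal 33N at the moment of measurement, can be sketched as follows (the sample format and the function name are assumptions, not from the patent):

```python
def reconstruct_retina_image(samples, width, height):
    """samples: iterable of (x, y, intensity) tuples taken in
    synchronization with the scan of terminal 33N. Returns a row-major
    2D list holding the retina image as seen on virtual screen 33Q."""
    image = [[0.0] * width for _ in range(height)]
    for x, y, intensity in samples:
        image[y][x] = intensity  # scan position maps directly to pixel position
    return image
```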
すなわち、ファイバー走査素子33Bは、画像の光を出力し、波長板33Cは、ファイバー走査素子33Bから利用者の網膜7に至る光と利用者の網膜7から光強度計測素子33Dに至る光との偏光を変更し、ファイバー偏光ビームスプリッター33Aは、ファイバー走査素子33Bから利用者の網膜7に至る光路と利用者の網膜7から光強度計測素子33Dに至る光路とを分離する。本実施例によれば、ファイバー走査素子33Bを利用した網膜撮像機能付の映像表示装置を実現できる。
That is, the fiber scanning element 33B outputs the light of the image, the wave plate 33C changes the polarization of the light traveling from the fiber scanning element 33B to the user's retina 7 and of the light traveling from the user's retina 7 to the light intensity measurement element 33D, and the fiber polarization beam splitter 33A separates the optical path from the fiber scanning element 33B to the user's retina 7 from the optical path from the user's retina 7 to the light intensity measurement element 33D. According to the present embodiment, a video display device with a retinal imaging function using the fiber scanning element 33B can be realized.
[実施例5]
次に、本発明の実施例5を説明する。実施例5は、虹彩の撮像を行う点が、実施例1乃至実施例4と異なる。以下では、本発明の第5の実施形態について図18乃至20を参照して説明する。なお、実施例1及び実施例4と同一の構成や機能を有するものには同一の符号を付してその詳細な説明を省略するものとする。また、実施例1及び実施例4と同一の手続きに関しても、詳細な説明を省略するものとする。
[Example 5]
Next, a fifth embodiment of the present invention will be described. The fifth embodiment differs from the first to fourth embodiments in that an iris is imaged. Hereinafter, the fifth embodiment of the present invention will be described with reference to FIGS. 18 to 20. Components having the same configurations and functions as those in the first and fourth embodiments are denoted by the same reference numerals, and detailed descriptions thereof are omitted. Detailed descriptions of procedures identical to those in the first and fourth embodiments are likewise omitted.
図18は、本発明の第5の実施形態に係る映像表示装置10の光学部200の構成を例示する図である。図18に示すように、本実施形態に係る光学部200は、光源部1と、照明光学部2と、映像生成部と虹彩撮像部とを兼ね備えた光学処理部34と、投影光学部4と、を備えている。また、光学処理部34と投影光学部4のうち少なくとも一つは、虹彩の像が虹彩撮像部で形成されるように投影関係を調整する機構である調整部を備える。
FIG. 18 is a diagram illustrating the configuration of the optical unit 200 of the video display device 10 according to the fifth embodiment of the invention. As shown in FIG. 18, the optical unit 200 according to this embodiment includes a light source unit 1, an illumination optical unit 2, an optical processing unit 34 that serves both as an image generation unit and as an iris imaging unit, and a projection optical unit 4. At least one of the optical processing unit 34 and the projection optical unit 4 includes an adjustment unit, a mechanism that adjusts the projection relationship so that an image of the iris is formed on the iris imaging unit.
また、図18に示すように、投影光学部4から出射した光が利用者の眼球6へ到達し、利用者の眼球6から出射した光が投影光学部4へ入射する。その際、本実施例に係る光学部200は、利用者の虹彩101の撮像画像を取得する。
Further, as shown in FIG. 18, the light emitted from the projection optical unit 4 reaches the user's eyeball 6, and the light emitted from the user's eyeball 6 enters the projection optical unit 4. At that time, the optical unit 200 according to the present embodiment acquires a captured image of the user's iris 101.
本実施例に係る光源部1は、実施例1のように、可視光を放出する光源から構成されていてもよい。また、実施例2のように、可視光又は赤外光を放出する光源から構成されていてもよい。光源部1が放出した光は、照明光学部2を経て、光学処理部34に入射する。
The light source unit 1 according to the present embodiment may be composed of a light source that emits visible light, as in the first embodiment, or of a light source that emits visible light or infrared light, as in the second embodiment. The light emitted from the light source unit 1 enters the optical processing unit 34 through the illumination optical unit 2.
投影光学部4は、例えば図4(d)のように、レンズ41Aと、反射率が20%以上の平面ビームスプリッター41Eと、を備える。
For example, as shown in FIG. 4D, the projection optical unit 4 includes a lens 41A and a planar beam splitter 41E having a reflectance of 20% or more.
図19は、本発明の第5の実施形態に係る光学処理部34の一構成図である。図19に示すように、光学処理部34は、偏光板32Aと、偏光ビームスプリッター32Bと、液晶表示素子32Cと、波長板34Dと、撮像素子32Eと、レンズ34Gと、を備えている。
FIG. 19 is a configuration diagram of the optical processing unit 34 according to the fifth embodiment of the present invention. As shown in FIG. 19, the optical processing unit 34 includes a polarizing plate 32A, a polarizing beam splitter 32B, a liquid crystal display element 32C, a wave plate 34D, an imaging element 32E, and a lens 34G.
光学処理部34を出射した光は、投影光学部4を経て、利用者の眼球6に到達する。利用者の眼球6に到達した光の一部は、利用者の水晶体8を透過し、利用者の眼球6に到達した光の別の一部は、利用者の虹彩101に到達する。
The light emitted from the optical processing unit 34 reaches the user's eyeball 6 via the projection optical unit 4. Part of the light reaching the user's eyeball 6 passes through the user's crystalline lens 8, and another part of the light reaching the user's eyeball 6 reaches the user's iris 101.
利用者の水晶体8に到達した光は、利用者の水晶体8を透過し、利用者の網膜7に到達する。利用者は、利用者の網膜7に到達した光を知覚する。利用者の虹彩101に到達した光の一部は、利用者の虹彩101で散乱して、投影光学部4を経て、光学処理部34の波長板34Dに入射する。
The light that reaches the user's crystalline lens 8 passes through the user's crystalline lens 8 and reaches the user's retina 7. The user perceives the light reaching the user's retina 7. A part of the light reaching the user's iris 101 is scattered by the user's iris 101 and enters the wave plate 34D of the optical processing unit 34 through the projection optical unit 4.
波長板34Dを出射した光は、偏光ビームスプリッター32Bに入射する。ここで、波長板34Dは、出射する光の偏光が偏光Aとなるように構成する。これより、波長板32Dを出射して偏光ビームスプリッター32Bに入射する光は、反射透過面32Fで反射される。反射透過面32Fで反射された光は、レンズ34Gを透過して、撮像素子32Eに入射する。
The light emitted from the wave plate 34D enters the polarization beam splitter 32B. Here, the wave plate 34D is configured so that the light it emits has polarization A. Accordingly, the light that exits the wave plate 34D and enters the polarization beam splitter 32B is reflected by the reflection/transmission surface 32F. The light reflected by the reflection/transmission surface 32F passes through the lens 34G and enters the image sensor 32E.
利用者が光学部200により提示された映像を見ている場合、利用者の網膜7と撮像素子32Eとが物面と像面の関係となっているが、本実施例においては利用者の虹彩101を鮮明に撮像する必要がある。そこで、レンズ34Gは、利用者の虹彩101と撮像素子32Eとが物面と像面の関係となるように、投影関係を調整する調整部の役割を果たす。これにより、撮像素子32Eは、利用者の虹彩101を鮮明に撮像できる。
When the user is viewing the video presented by the optical unit 200, the user's retina 7 and the image sensor 32E are in an object-plane/image-plane relationship; in this embodiment, however, the user's iris 101 must be imaged sharply. Therefore, the lens 34G serves as an adjustment unit that adjusts the projection relationship so that the user's iris 101 and the image sensor 32E come into an object-plane/image-plane relationship. This allows the image sensor 32E to capture a sharp image of the user's iris 101.
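The object-plane/image-plane relationship that lens 34G establishes can be illustrated with the thin-lens equation 1/f = 1/d_o + 1/d_i (an idealization introduced here only for explanation; the actual optics combine several elements and the patent does not reduce them to a single thin lens):

```python
def image_distance(focal_length, object_distance):
    """Thin-lens model: distance behind the lens at which an object at
    object_distance is imaged sharply. Units are arbitrary but must match."""
    if object_distance == focal_length:
        raise ValueError("object at the focal point images at infinity")
    return 1.0 / (1.0 / focal_length - 1.0 / object_distance)
```

For example, with f = 10 and an object at distance 20, the conjugate image plane lies at 20; bringing the sensor-side conjugate onto the image sensor is what the adjustment unit does when switching from the retina to the iris as the object plane.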
なお、上記実施形態において、投影光学部4は、図4(d)に示した構成を例として説明したがこれは一例であり、別の形態であってもよい。図25は、本発明の第5の実施形態に係る投影光学部4の構成を例示する図である。図25に示すように、投影光学部4は、レンズ41Aと、反射率が20%以上である曲面ビームスプリッター41Fと、を備える構成であってもよい。
In the above embodiment, the projection optical unit 4 has been described using the configuration shown in FIG. 4(d), but this is only one example, and other forms are possible. FIG. 25 is a diagram illustrating the configuration of the projection optical unit 4 according to the fifth embodiment of the invention. As shown in FIG. 25, the projection optical unit 4 may include a lens 41A and a curved beam splitter 41F having a reflectance of 20% or more.
図26は、曲面ビームスプリッター41Fの反射面を示す。光学処理部34から曲面ビームスプリッター41Fの中央部41Gに到達した光は、利用者の水晶体8に到達する。光学処理部34から曲面ビームスプリッター41Fの外周部41Hに到達した光は、利用者の虹彩101に到達する。曲面ビームスプリッター41Fの外周部41Hは、光学処理部34から曲面ビームスプリッター41Fの外周部41Hに到達した光を、利用者の虹彩101上に集めるため、0でない曲率を有している。これにより、利用者の虹彩101の照明効率を向上できる。曲面ビームスプリッター41Fの反射面の中央部41Gは、光学処理部34が生成した映像を反射する役割を果たす。
FIG. 26 shows the reflecting surface of the curved beam splitter 41F. Light that travels from the optical processing unit 34 to the central portion 41G of the curved beam splitter 41F reaches the user's crystalline lens 8, while light that travels from the optical processing unit 34 to the outer peripheral portion 41H reaches the user's iris 101. The outer peripheral portion 41H has a non-zero curvature so as to concentrate the light arriving from the optical processing unit 34 onto the user's iris 101, which improves the illumination efficiency of the iris 101. The central portion 41G of the reflecting surface reflects the image generated by the optical processing unit 34.
次に、画像処理部202による虹彩認証の手続きを説明する。利用者が正規利用者であるか否かは、画像処理部202内の図示しない虹彩認証判定部が判定する。画像処理部202は、光学部200が撮像した虹彩画像を光学部200から取得し、虹彩パターンの特徴が、情報記憶部203から取得した正規利用者の虹彩パターンの特徴と一致するかを判定する。
Next, the iris authentication procedure performed by the image processing unit 202 will be described. An iris authentication determination unit (not shown) in the image processing unit 202 determines whether the user is a registered user. The image processing unit 202 acquires the iris image captured by the optical unit 200 and determines whether the features of the iris pattern match the features of the registered user's iris pattern acquired from the information storage unit 203.
光学部200が撮像した虹彩パターンの特徴が、情報記憶部203から取得した正規利用者の虹彩パターンの特徴と一致する場合、虹彩認証判定部は利用者が正規利用者であり認証成功と判定し、一致しない場合、虹彩認証判定部は利用者が正規利用者ではなく認証失敗と判定する。なお、虹彩認証は、実施例1と同様、複数回試みられてもよい。
When the features of the iris pattern captured by the optical unit 200 match the features of the registered user's iris pattern acquired from the information storage unit 203, the iris authentication determination unit determines that the user is a registered user and that authentication has succeeded; when they do not match, it determines that the user is not a registered user and that authentication has failed. As in the first embodiment, iris authentication may be attempted multiple times.
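The patent does not specify how iris-pattern features are compared. A common approach, shown here purely as an illustrative assumption, scores the normalized Hamming distance between binary iris codes against a threshold (the threshold value and the names below are not from the patent):

```python
MATCH_THRESHOLD = 0.32  # assumed value; tuned in practice to the desired error rates

def is_registered_user(captured_code, enrolled_code, threshold=MATCH_THRESHOLD):
    """Both arguments are equal-length sequences of 0/1 bits extracted from
    the captured and the enrolled iris patterns, respectively. Returns True
    when the fraction of differing bits is small enough to count as a match."""
    if len(captured_code) != len(enrolled_code):
        raise ValueError("iris codes must have equal length")
    distance = sum(a != b for a, b in zip(captured_code, enrolled_code))
    return distance / len(captured_code) <= threshold
```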
図20は、虹彩認証準備用画像を例示する図である。図20に示すように、異なる位置に利用者の目線位置の目安となる印171B及び172Bを設けた複数の虹彩認証準備用画像171A及び172Aを利用して、虹彩認証準備用画像毎に虹彩認証を行っても良い。これにより、虹彩認証の精度を高めることができる。
FIG. 20 is a diagram illustrating iris authentication preparation images. As shown in FIG. 20, iris authentication may be performed for each of a plurality of iris authentication preparation images 171A and 172A, which carry marks 171B and 172B at different positions to guide the user's gaze. This improves the accuracy of iris authentication.
なお、本実施例においては、虹彩の像が光学処理部34内の虹彩撮像部で形成されるように投影関係を調整するために、光学部200がレンズ34Gを備える場合を例として説明した。その他、光学処理部34及び投影光学部4のうち少なくとも一つの一部又は全部が可動部を備えることも可能である。例えば、投影光学部4が、図6(a)に示した投影光学部4の凸レンズ41Aを可動させる可動部を備え、光学部200が可動部を動作させて虹彩101と撮像素子32Eとの投影関係を調整する。すなわち、投影光学部4に含まれる凸レンズ41Aが、利用者の虹彩101と撮像素子32Eとが物面と像面の関係となるように、投影関係を調整する調整部として機能する。
In this embodiment, the case where the optical unit 200 includes the lens 34G to adjust the projection relationship so that an image of the iris is formed on the iris imaging unit in the optical processing unit 34 has been described as an example. Alternatively, part or all of at least one of the optical processing unit 34 and the projection optical unit 4 may include a movable part. For example, the projection optical unit 4 may include a movable part that moves the convex lens 41A of the projection optical unit 4 shown in FIG. 6(a), and the optical unit 200 operates the movable part to adjust the projection relationship between the iris 101 and the image sensor 32E. That is, the convex lens 41A included in the projection optical unit 4 functions as an adjustment unit that adjusts the projection relationship so that the user's iris 101 and the image sensor 32E come into an object-plane/image-plane relationship.
可動部を備えた虹彩撮像機能付の映像表示装置10は、以下のように可動部を動作させる。光学部200が利用者に映像を提示する場合、画像処理部202は、光学処理部34内の映像生成部と利用者の網膜7とが物面と像面の関係となるような可動部の所定の位置を情報記憶部203から取得し、光学部200に対して可動部を取得した位置まで移動させるように指令を行う。
The video display device 10 with an iris imaging function and a movable part operates the movable part as follows. When the optical unit 200 presents video to the user, the image processing unit 202 acquires from the information storage unit 203 the predetermined position of the movable part at which the image generation unit in the optical processing unit 34 and the user's retina 7 are in an object-plane/image-plane relationship, and instructs the optical unit 200 to move the movable part to the acquired position.
光学部200は、指令に従い可動部を動作させる。光学部200が利用者の虹彩101を撮像する場合、利用者の虹彩101と光学処理部34内の虹彩撮像部とが物面と像面の関係となるような可動部の所定の位置を情報記憶部203から取得し、光学部200に対して可動部を取得した位置まで移動させるように指令を行う。光学部200は、この指令に従い可動部を動作させる。これにより、光学部200は利用者に映像を提示できるとともに、利用者の虹彩101を鮮明に撮像することができる。
The optical unit 200 operates the movable part according to the instruction. When the optical unit 200 images the user's iris 101, the image processing unit 202 acquires from the information storage unit 203 the predetermined position of the movable part at which the user's iris 101 and the iris imaging unit in the optical processing unit 34 are in an object-plane/image-plane relationship, and instructs the optical unit 200 to move the movable part to the acquired position. The optical unit 200 operates the movable part according to this instruction. The optical unit 200 can thereby present video to the user and also capture a sharp image of the user's iris 101.
上記の可動部制御の説明では、画像処理部202は情報記憶部203から取得した可動部の所定の位置を基に、光学部200に対して可動部を取得した位置まで移動させるように指令を行ったが、本発明はこれに限られない。例えば、光学部200が利用者の虹彩101を撮像する場合、画像処理部202は、光学部200から取得した撮像画像の虹彩パターンの鮮明さを定量化し、鮮明度が上がるように、光学部200に対して可動部を移動させるように指令を行ってもよい。また、可動部の移動量に応じて、光学処理部34内の映像生成部と利用者の網膜7とが物面と像面の関係となるように、他の可動部を動作させてもよい。
In the above description of movable-part control, the image processing unit 202 instructs the optical unit 200 to move the movable part to a predetermined position acquired from the information storage unit 203, but the present invention is not limited to this. For example, when the optical unit 200 images the user's iris 101, the image processing unit 202 may quantify the sharpness of the iris pattern in the captured image acquired from the optical unit 200 and instruct the optical unit 200 to move the movable part so that the sharpness increases. Further, according to the amount of movement of that movable part, another movable part may be operated so that the image generation unit in the optical processing unit 34 and the user's retina 7 remain in an object-plane/image-plane relationship.
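The sharpness-feedback control just described can be sketched as a simple hill climb (everything here is an illustrative assumption: the sharpness metric, mean absolute difference between neighboring pixels, and the names `sharpness`, `focus`, and `capture_at` do not appear in the patent):

```python
def sharpness(image):
    """Quantify sharpness of a 2D intensity image as the mean absolute
    difference between horizontally neighboring pixels (higher = sharper)."""
    diffs = [abs(row[i + 1] - row[i]) for row in image for i in range(len(row) - 1)]
    return sum(diffs) / len(diffs)

def focus(capture_at, position, step=1, max_iters=50):
    """Step the movable lens in whichever direction raises the measured
    sharpness; stop when neither direction improves. capture_at(position)
    stands in for capturing an iris image with the lens at that position."""
    best = sharpness(capture_at(position))
    for _ in range(max_iters):
        for delta in (step, -step):
            candidate = sharpness(capture_at(position + delta))
            if candidate > best:
                best, position = candidate, position + delta
                break
        else:
            return position  # neither direction improves: in focus
    return position
```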
以上説明したように、本実施形態に係る映像表示装置10において、光学処理部3は、光源部1から出力された光を用いて画像を表示し、表示された画像の光が利用者の目によって反射された反射光を受光することにより、目を用いた生体認証処理を行うための認証対象である虹彩を撮像するこのような構成により、映像(画像)表示用の光と虹彩撮像用の光とで光路が共通化され、光学部200の映像表示用の光源と虹彩撮像用の光源とを光源部1として共通化することができるので、目を用いた生体認証処理を行うための認証対象を撮像する撮像機能を有する画像表示装置を従来よりも小型化することができる。
As described above, in the video display device 10 according to the present embodiment, the optical processing unit 3 displays an image using the light output from the light source unit 1 and, by receiving the light of the displayed image reflected by the user's eye, images the iris, the authentication target for biometric authentication processing using the eye. With this configuration, the light for video (image) display and the light for iris imaging share a common optical path, and the light source for video display and the light source for iris imaging of the optical unit 200 can be shared as the single light source unit 1. An image display device having an imaging function for imaging an authentication target for eye-based biometric authentication can therefore be made smaller than before.
また、光学処理部3から投影光学部4に入射した光が利用者の眼球に入射するように光の進行方向を変更する素子を投影光学部4が備えることで、シースルー性を有する虹彩撮像機能付の映像表示装置を提供することができる。しかしながら、投影光学部4が光の進行方向を偏光する素子を備える構成は必須ではなく、投影光学部4は、光学処理部3から入射した光が利用者の眼球にそのまま入射する構成であってもよい。
In addition, by providing the projection optical unit 4 with an element that changes the traveling direction of light so that light entering the projection optical unit 4 from the optical processing unit 3 is directed into the user's eyeball, a see-through video display device with an iris imaging function can be provided. However, a configuration in which the projection optical unit 4 includes an element that changes the traveling direction of light is not essential; the projection optical unit 4 may pass the light incident from the optical processing unit 3 straight into the user's eyeball.
また、上記実施形態においては、光学処理部3又は投影光学部4がレンズ34Gや凸レンズ41A等の調整部を含み、利用者の虹彩101を鮮明に撮像することができる場合を例として説明した。しかしながら、この構成は必須ではなく、調整部がない場合であっても、利用者の虹彩101を撮像することができる。
In the above embodiment, the case where the optical processing unit 3 or the projection optical unit 4 includes an adjustment unit such as the lens 34G or the convex lens 41A, allowing the user's iris 101 to be imaged sharply, has been described as an example. However, this configuration is not essential, and the user's iris 101 can be imaged even without an adjustment unit.
1 光源部
1A~1D 光源
2 照明光学部
3 光学処理部
3A 偏光板
3B 画素電極
3C フォトダイオード
3D 液晶層
3E 対向電極
3F 偏光板
3L~N カラーフィルター
3P 遮光膜
4 投影光学部
6 眼球
7 網膜
8 水晶体
9 物体
10 映像表示装置
21A ライトトンネル
21B レンズ
22A 拡散板
23A~C レンズ
23D、23E ダイクロイックミラー
23F マイクロレンズアレイ
23G レンズ
23H リレーレンズ
31 一体型液晶表示素子兼撮像素子
32 光学処理部
32A 偏光板
32B 偏光ビームスプリッター
32C 液晶表示素子
32D 波長板
32E 撮像素子
32F 反射透過面
33 光学処理部
33A ファイバー偏光ビームスプリッター
33B ファイバー走査素子
33C 波長板
33D 光強度計測素子
34 光学処理部
34D 波長板
34G レンズ
35 一体型液晶表示素子兼撮像素子
41A 凸レンズ
41B 平面ミラー
41C プリズム
41D 曲面ミラー
41E 平面ビームスプリッター
100 光学部
101 虹彩
200 光学部
201 制御部
202 画像処理部
203 情報記憶部
204 通信処理部
205 通信入出力部 DESCRIPTION OF SYMBOLS: 1 Light source unit; 1A-1D Light sources; 2 Illumination optical unit; 3 Optical processing unit; 3A Polarizing plate; 3B Pixel electrode; 3C Photodiode; 3D Liquid crystal layer; 3E Counter electrode; 3F Polarizing plate; 3L-N Color filters; 3P Light-shielding film; 4 Projection optical unit; 6 Eyeball; 7 Retina; 8 Crystalline lens; 9 Object; 10 Video display device; 21A Light tunnel; 21B Lens; 22A Diffuser plate; 23A-C Lenses; 23D, 23E Dichroic mirrors; 23F Microlens array; 23G Lens; 23H Relay lens; 31 Integrated liquid crystal display element and image sensor; 32 Optical processing unit; 32A Polarizing plate; 32B Polarization beam splitter; 32C Liquid crystal display element; 32D Wave plate; 32E Image sensor; 32F Reflection/transmission surface; 33 Optical processing unit; 33A Fiber polarization beam splitter; 33B Fiber scanning element; 33C Wave plate; 33D Light intensity measurement element; 34 Optical processing unit; 34D Wave plate; 34G Lens; 35 Integrated liquid crystal display element and image sensor; 41A Convex lens; 41B Plane mirror; 41C Prism; 41D Curved mirror; 41E Plane beam splitter; 100 Optical unit; 101 Iris; 200 Optical unit; 201 Control unit; 202 Image processing unit; 203 Information storage unit; 204 Communication processing unit; 205 Communication input/output unit
Claims (14)
- 光を出力する光源部と、
前記光源部から出力された光を用いて画像を表示する画像表示処理及び目を用いた生体認証処理を行うための認証対象を撮像する撮像処理を行う光学処理部と
を含み、
前記光学処理部は、表示された前記画像の光が利用者の目によって反射された反射光を受光することにより前記認証対象を撮像する
ことを特徴とする画像表示装置。
A light source unit that outputs light;
an optical processing unit that performs image display processing for displaying an image using the light output from the light source unit, and imaging processing for imaging an authentication target for performing biometric authentication processing using an eye,
The optical processing unit images the authentication target by receiving reflected light in which the light of the displayed image is reflected by a user's eyes.
- 前記光学処理部は、画素電極と撮像素子とを含み、
前記画素電極は、前記画像を生成し、
前記撮像素子は、前記画像の光が利用者の目によって反射された反射光をのみを受光し、
前記撮像素子は、前記反射光による目の像の焦点深度内に配置される
ことを特徴とする請求項1に記載の画像表示装置。
The optical processing unit includes a pixel electrode and an image sensor,
The pixel electrode generates the image;
The image sensor receives only reflected light in which the light of the image is reflected by the user's eyes,
The image display apparatus according to claim 1, wherein the imaging element is disposed within the focal depth of the eye image formed by the reflected light.
- 前記光学処理部は、液晶表示素子と偏光ビームスプリッターと波長板と撮像素子とを含み、
前記液晶表示素子は、前記画像の光を出力し、
前記波長板は、前記液晶表示素子から利用者の目に至る光と前記利用者の目から前記撮像素子に至る光との偏光を変更し、
前記偏光ビームスプリッターは、前記液晶表示素子から前記利用者の目に至る光路と前記利用者の目から前記撮像素子に至る光路とを分離する
ことを特徴とする請求項1に記載の画像表示装置。
The optical processing unit includes a liquid crystal display element, a polarization beam splitter, a wave plate, and an imaging element,
The liquid crystal display element outputs the light of the image,
The wave plate changes the polarization of the light from the liquid crystal display element to the user's eyes and the light from the user's eyes to the imaging element,
The image display apparatus according to claim 1, wherein the polarization beam splitter separates an optical path from the liquid crystal display element to the eyes of the user and an optical path from the eyes of the user to the imaging element.
- 前記光学処理部は、前記認証対象を撮像するための予め定められた画像を表示する
ことを特徴とする請求項1に記載の画像表示装置。
The image display apparatus according to claim 1, wherein the optical processing unit displays a predetermined image for imaging the authentication target.
- 前記予め定められた画像は、全面が同一の色である
ことを特徴とする請求項4に記載の画像表示装置。
The image display apparatus according to claim 4, wherein the predetermined image has the same color on the entire surface.
- 前記予め定められた画像は、利用者の視線を予め定められた位置に誘導するための視線誘導情報を含み、
前記光学処理部は、前記視線誘導情報に従って前記予め定められた画像を知覚した利用者の目によって反射された反射光を受光することにより前記認証対象を撮像する
ことを特徴とする請求項4に記載の画像表示装置。
The predetermined image includes line-of-sight guidance information for guiding a user's line of sight to a predetermined position;
The image display apparatus according to claim 4, wherein the optical processing unit images the authentication target by receiving reflected light reflected by the eyes of a user who has perceived the predetermined image according to the line-of-sight guidance information.
- 前記光学処理部から受光する前記画像の光が前記利用者の目に対して入射されるように光の進行方向を変更する光方向変更部を含み、
前記光方向変更部は、前記利用者が対象物を視覚により知覚可能な範囲よりも小さい
ことを特徴とする請求項1に記載の画像表示装置。
A light direction changing unit that changes a traveling direction of the light so that light of the image received from the optical processing unit is incident on the eyes of the user;
The image display device according to claim 1, wherein the light direction changing unit is smaller than a range in which the user can perceive an object visually.
- 前記光学処理部から受光する前記画像の光が前記利用者の目に対して入射されるように光の進行方向を変更する光方向変更部を含み、
前記光方向変更部は、光の反射率が予め定められた値以上である
ことを特徴とする請求項1に記載の画像表示装置。
A light direction changing unit that changes a traveling direction of the light so that light of the image received from the optical processing unit is incident on the eyes of the user;
The image display device according to claim 1, wherein the light direction changing unit has a light reflectance equal to or greater than a predetermined value.
- 前記光学処理部は、ファイバー走査素子とファイバー偏光ビームスプリッターと波長板と受光した光の強度を計測する光強度計測素子とを含み、
前記ファイバー走査素子は、前記画像の光を出力し、
前記波長板は、前記ファイバー走査素子から利用者の目に至る光と前記利用者の目から前記光強度計測素子に至る光との偏光を変更し、
前記ファイバー偏光ビームスプリッターは、前記ファイバー走査素子から前記利用者の目に至る光路と前記利用者の目から前記光強度計測素子に至る光路とを分離する
ことを特徴とする請求項1に記載の画像表示装置。 The optical processing unit includes a fiber scanning element, a fiber polarization beam splitter, a wave plate, and a light intensity measuring element for measuring the intensity of received light,
The fiber scanning element outputs the light of the image,
The wave plate changes the polarization of the light from the fiber scanning element to the user's eyes and the light from the user's eyes to the light intensity measuring element,
The optical fiber polarization beam splitter separates an optical path from the fiber scanning element to the user's eye and an optical path from the user's eye to the light intensity measurement element. Image display device. - 前記画像の光が入射した利用者の目の網膜と前記光学処理部との物面と像面の関係を、前記利用者の目の虹彩と前記光学処理部との物面と像面の関係になるよう調整する調整部を含み、
前記光学処理部は、前記利用者の目の虹彩を前記認証対象として撮像する
ことを特徴とする請求項1に記載の画像表示装置。 The relationship between the object surface and the image surface of the user's eye retina and the optical processing unit on which the light of the image is incident, and the relationship between the object surface and the image surface of the user's eye iris and the optical processing unit Including an adjustment unit that adjusts to
The image display apparatus according to claim 1, wherein the optical processing unit images an iris of the user's eye as the authentication target. - 前記光学処理部は、液晶表示素子と偏光ビームスプリッターと波長板と撮像素子とを含み、
前記液晶表示素子は、前記画像の光を出力し、
前記波長板は、前記液晶表示素子から利用者の目に至る光と前記利用者の目から前記撮像素子に至る光との偏光を変更し、
前記偏光ビームスプリッターは、前記液晶表示素子から前記利用者の目に至る光路と前記利用者の目から前記撮像素子に至る光路とを分離し、
前記調整部は、前記偏光ビームスプリッターと前記撮像素子との間に配置されるレンズである
ことを特徴とする請求項10に記載の画像表示装置。 The optical processing unit includes a liquid crystal display element, a polarization beam splitter, a wave plate, and an imaging element,
The liquid crystal display element outputs the light of the image,
The wave plate changes the polarization of the light from the liquid crystal display element to the user's eyes and the light from the user's eyes to the imaging element,
The polarizing beam splitter separates an optical path from the liquid crystal display element to the user's eyes and an optical path from the user's eyes to the imaging element,
The image display apparatus according to claim 10, wherein the adjustment unit is a lens disposed between the polarization beam splitter and the imaging element. - 前記画像の光を利用者の目の網膜に投影する投影光学部を含み、
前記調整部は、前記投影光学部に含まれるレンズである
ことを特徴とする請求項10に記載の画像表示装置。 A projection optical unit that projects the light of the image onto the retina of the user's eye;
The image display apparatus according to claim 10, wherein the adjustment unit is a lens included in the projection optical unit. - 光源部から出力された光を用いて画像を表示する画像表示部と、
目を用いた生体認証処理を行うための認証対象を撮像する撮像部と
を含み、
前記撮像部は、表示された前記画像の光が利用者の目によって反射された反射光を受光することにより前記認証対象を撮像する
ことを特徴とする光学デバイス。 An image display unit that displays an image using light output from the light source unit;
An imaging unit that images an authentication target for performing biometric authentication processing using eyes, and
The optical device picks up the authentication target by receiving reflected light in which the light of the displayed image is reflected by a user's eyes. - 光を出力する光源部と、
前記光源部から出力された光に基を用いて画像を表示する画像表示部と、
目を用いた生体認証処理を行うための認証対象を撮像する撮像部と、
を含み、
前記撮像部は、表示された前記画像の光が利用者の目によって反射された反射光を受光することにより前記認証対象を撮像する
ことを特徴とする画像表示装置。 A light source unit that outputs light;
An image display unit that displays an image based on light output from the light source unit;
An imaging unit for imaging an authentication target for performing biometric authentication processing using eyes;
Including
The image pickup device picks up the authentication target by receiving reflected light in which the light of the displayed image is reflected by a user's eyes.
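The independent claims above describe the same authentication flow in device-agnostic terms: display an image, let the user's eye reflect it back, and run biometric matching on the captured reflection. Purely as an illustrative sketch (the patent claims optics, not software, and every function and parameter name below is hypothetical), the flow might be modeled as:

```python
# Hypothetical model of the claimed flow: display a predetermined image,
# capture its reflection from the user's eye, and compare a toy binary
# code of the captured frame against an enrolled template. These device
# interfaces are stand-ins, not the patent's actual components.

def capture_reflection(display_frame, eye_reflectance):
    """Model the eye reflecting the displayed image back to the imager."""
    return [pixel * eye_reflectance for pixel in display_frame]

def iris_code(frame, threshold=0.5):
    """Binarize a captured frame into a toy 'iris code'."""
    return [1 if p >= threshold else 0 for p in frame]

def hamming_distance(code_a, code_b):
    """Count differing bits between two codes."""
    return sum(a != b for a, b in zip(code_a, code_b))

def authenticate(display_frame, eye_reflectance, enrolled_code, max_distance=1):
    """Display -> capture reflection -> encode -> match against template."""
    captured = capture_reflection(display_frame, eye_reflectance)
    return hamming_distance(iris_code(captured), enrolled_code) <= max_distance

# A single-color frame, as in the dependent claim on uniform images,
# gives even illumination for the reflection capture.
uniform_frame = [0.9] * 8
enrolled = iris_code(capture_reflection(uniform_frame, 0.8))
print(authenticate(uniform_frame, 0.8, enrolled))  # True: reflection matches
print(authenticate(uniform_frame, 0.1, enrolled))  # False: too little reflection
```

The point of the sketch is only that the display doubles as the illumination source, so enrollment and verification share one optical path, which is what lets the claimed device reuse the image-display optics for imaging.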
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/JP2014/065929 WO2015193953A1 (en) | 2014-06-16 | 2014-06-16 | Image display device and optical device |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2015193953A1 true WO2015193953A1 (en) | 2015-12-23 |
Family
ID=54934986
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2014/065929 WO2015193953A1 (en) | 2014-06-16 | 2014-06-16 | Image display device and optical device |
Country Status (1)
Country | Link |
---|---|
WO (1) | WO2015193953A1 (en) |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2005037930A (en) * | 2003-06-27 | 2005-02-10 | Semiconductor Energy Lab Co Ltd | Display device and electronic equipment |
JP2007048113A (en) * | 2005-08-11 | 2007-02-22 | Casio Comput Co Ltd | Image reading apparatus and its image reading method |
JP2008241822A (en) * | 2007-03-26 | 2008-10-09 | Mitsubishi Electric Corp | Image display device |
JP2011069978A (en) * | 2009-09-25 | 2011-04-07 | Brother Industries Ltd | Retina scanning type image display device |
- 2014-06-16: WO PCT/JP2014/065929 patent/WO2015193953A1/en (active, Application Filing)
Cited By (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2017163386A1 (en) * | 2016-03-24 | 2017-09-28 | 株式会社日立製作所 | Optical scanning device, imaging device, and tof type analyzer |
CN107615132A (en) * | 2016-03-24 | 2018-01-19 | 株式会社日立制作所 | Light scanning apparatus, device for image and TOF type analysis devices |
JPWO2017163386A1 (en) * | 2016-03-24 | 2018-03-29 | 株式会社日立製作所 | Optical scanning device, video device, and TOF type analyzer |
US10413187B2 (en) | 2016-03-24 | 2019-09-17 | Hitachi, Ltd. | Optical scanning device, imaging device, and TOF type analyzer |
CN107615132B (en) * | 2016-03-24 | 2020-03-31 | 株式会社日立制作所 | Optical scanning device, imaging device, and TOF-type analysis device |
JP2022160457A (en) * | 2016-06-21 | 2022-10-19 | 株式会社Nttドコモ | Illuminator for wearable display |
WO2018008098A1 (en) * | 2016-07-06 | 2018-01-11 | 株式会社日立製作所 | Information display terminal |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN108882845B (en) | Eye tracker based on retinal imaging via light-guide optical elements | |
US10682055B1 (en) | Fluorescent imaging on a head-mountable device | |
AU2013217496B2 (en) | Image generation systems and image generation methods | |
US9092671B2 (en) | Visual line detection device and visual line detection method | |
US8014571B2 (en) | Multimodal ocular biometric system | |
US8398242B2 (en) | Display apparatus | |
JP5195537B2 (en) | Head mounted display | |
JP6048819B2 (en) | Display device, display method, integrated circuit, program | |
JP2020515895A (en) | Operable fovea display | |
JP2008241822A (en) | Image display device | |
JPH10319342A (en) | Eye ball projection type video display device | |
US20180017858A1 (en) | Stereoscopic reproduction system using transparency | |
WO2015193953A1 (en) | Image display device and optical device | |
US20170261750A1 (en) | Co-Aligned Retinal Imaging And Display System | |
US20180246566A1 (en) | Eye tracking | |
JP3785539B2 (en) | Wide viewing area retinal projection display system | |
JP2006195084A (en) | Display apparatus | |
JP7163230B2 (en) | Line-of-sight detection device, line-of-sight detection method, and display device | |
JP2010134051A (en) | Image display apparatus | |
JP7338476B2 (en) | Video projection device, video projection method, video display light output control method | |
JP2001242417A (en) | Pupil position detector and image display device using the same | |
JP2021062162A (en) | Scanning type ocular fundus imaging apparatus | |
JPH11347016A (en) | Imaging device | |
JP2021022851A (en) | Head-up display apparatus | |
WO2021210225A1 (en) | Electronic device |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 14895224 Country of ref document: EP Kind code of ref document: A1 |
NENP | Non-entry into the national phase |
Ref country code: DE |
NENP | Non-entry into the national phase |
Ref country code: JP |
122 | Ep: pct application non-entry in european phase |
Ref document number: 14895224 Country of ref document: EP Kind code of ref document: A1 |