WO2014106330A1 - Contact lens for measuring eyeball focus - Google Patents

Contact lens for measuring eyeball focus

Info

Publication number
WO2014106330A1
Authority
WO
WIPO (PCT)
Prior art keywords
eye
focal distance
contact lens
infrared light
image
Prior art date
Application number
PCT/CN2013/070062
Other languages
French (fr)
Inventor
Zhen Xiao
Original Assignee
Empire Technology Development Llc
Priority date
Filing date
Publication date
Application filed by Empire Technology Development Llc filed Critical Empire Technology Development Llc
Priority to PCT/CN2013/070062 priority Critical patent/WO2014106330A1/en
Publication of WO2014106330A1 publication Critical patent/WO2014106330A1/en

Classifications

    • G PHYSICS
    • G02 OPTICS
    • G02C SPECTACLES; SUNGLASSES OR GOGGLES INSOFAR AS THEY HAVE THE SAME FEATURES AS SPECTACLES; CONTACT LENSES
    • G02C7/00 Optical parts
    • G02C7/02 Lenses; Lens systems; Methods of designing lenses
    • G02C7/04 Contact lenses for the eyes

Definitions

  • Example apparatus may include a wearable contact lens, an IR emitter disposed on the contact lens to emit IR light, an IR sensor disposed on the contact lens to detect reflections of the IR light, and a communication device disposed on the contact lens communicatively coupled to the IR sensor and configured to provide focal distance data associated with the eye.
  • the present disclosure describes various illustrative methods for detecting eye focal distance from a contact lens. Such methods may include emitting IR light from an emitter disposed on a wearable contact lens, detecting reflections at an IR sensor on the contact lens, and providing focal distance data associated with the eye from a communication device on the contact lens communicatively coupled to the IR sensor.
  • the present disclosure describes various illustrative machine- readable instructions for detecting eye focal distance from a contact lens.
  • Such machine-readable instructions may include emitting IR light from an emitter disposed on a wearable contact lens, detecting reflections at an IR sensor on the contact lens, and providing focal distance data associated with the eye from a communication device on the contact lens communicatively coupled to the IR sensor.
  • the foregoing summary may be illustrative only and may not be intended to be in any way limiting.
  • further aspects, embodiments, and features will become apparent by reference to the drawings and the following detailed description.
  • Fig. 1 illustrates a system showing a cross section of an eye and a light source shining into the eye to project a set of Purkinje-Sanson reflections;
  • Fig. 2 illustrates cross sections of an eye including a lens and an example contact lens configured to measure eyeball focus of the eye;
  • Fig. 3 illustrates an example contact lens configured to measure eyeball focus;
  • Fig. 4 illustrates a cutaway view of an example contact lens including a groove;
  • Fig. 5 illustrates a flow diagram of an example method for measuring eyeball focus;
  • Fig. 6 illustrates a flow diagram of an example method for calibrating a determined focal distance with the perception of the individual wearing the contact lens;
  • Fig. 7 illustrates a flow diagram of an example method for calibrating a known object distance to a measured focal distance;
  • Fig. 8 illustrates a flow diagram of an example method for associating a particular focal distance to an adjusted object distance;
  • Fig. 9 illustrates an example computer program product; and
  • Fig. 10 illustrates an example computing device, all arranged in accordance with at least some embodiments of the present disclosure.
  • This disclosure is drawn, inter alia, to apparatus, methods and computer-readable media related to measuring eyeball focus using a contact lens including an emitter, a receiver, a communication device, and/or a control circuit.
  • a sensor may be disposed on a contact lens and may be configured to detect focal distance data associated with an eye to determine the focal distance of the eye, as will be described in greater detail throughout.
  • a light source disposed on the contact lens may shine into the eye, reflect off one or more surfaces of the eye, and the reflections may project one or more Purkinje-Sanson images at particular locations of the eye.
  • the locations of the Purkinje-Sanson images may be detected at the sensor disposed on the contact lens.
  • the locations of the detected Purkinje-Sanson images may be associated with the physiological state of a natural lens within the eye as part of a process called accommodation, as will be described in greater detail throughout.
  • the locations of the detected Purkinje-Sanson images may be processed to determine an accommodation state of the eye, which may in turn be processed to determine a focal distance of the eye.
  • the optical power of an eye for a given accommodation state may be particular to an individual; accordingly, one or more calibration processes may associate detected focal distance data with a determination of the focal distance of the particular eye wearing the contact lens, as will be described in greater detail throughout.
  • calibration may incorporate the use of an Alternate Reality (AR) image device that may provide one or more AR images at selected focal distances.
  • focal distance data of the eye may be determined when the eye is focused on an AR image.
  • a perceived distance of the AR image may be manipulated to coincide with a known physical distance.
  • Fig. 1 illustrates a system 100 showing a cross section of an eye 101 and a light source 110 shining into the eye to project a set of Purkinje-Sanson reflections, arranged in accordance with at least some embodiments of the present disclosure.
  • light source 110 may produce light 111 that travels toward a cornea 104 of eye 101 and may continue to a lens 102.
  • a portion of light 111 may reflect off an outer surface of cornea 104.
  • a focal distance of an eye may be determined based at least in part on the locations of one or more of the Purkinje-Sanson images PS1 112, PS2 114, PS3 116, and/or PS4 118.
  • images PS1 112 and PS2 114 may be reflected from the outer and inner surfaces of cornea 104, and therefore may not be directly influenced by lens 102, so that the locations of Purkinje-Sanson images PS1 112 and PS2 114 may not be best suited to provide information related to the state of lens 102.
  • PS2 114 is depicted without a corresponding label and candle-image in part because PS2 114 may have less consequence to the subject matter disclosed herein.
  • the locations of Purkinje-Sanson images PS1 112, PS2 114, PS3 116, and/or PS4 118 may correspond to the location of light source 110 and/or the accommodation of lens 102.
  • image PS1 112 and/or image PS2 114 may appear relatively close to light source 110.
  • image PS3 116 and image PS4 118 may appear relatively far from light source 110, and in some examples image PS3 116 and image PS4 118 may appear on the substantially opposite side of cornea 104.
  • the relative locations of images PS1 112, PS2 114, PS3 116, and PS4 118 may be increasingly more distant from light source 110.
  • the relative distance of images PS1 112, PS2 114, PS3 116, and/or PS4 118 from light source 110 may be used to distinguish some or all of the Purkinje-Sanson images PS1 112, PS2 114, PS3 116, and PS4 118.
  • if PS3 116 is generally relatively closer to light source 110 than PS4 118, then given two detected images thought to be PS3 116 and PS4 118, the relative distances of the images from the light source may be determined and the closer image may be labeled PS3 116 and the further image may be labeled PS4 118.
  • Purkinje-Sanson images PS1 112, PS2 114, PS3 116, and/or PS4 118 may be detected as upright images or inverted images.
  • light source 110 may depict a candle with a flame on top.
  • the depiction of the candle may be for illustrative purposes to demonstrate the orientation of Purkinje-Sanson images PS1 112, PS3 116, and/or PS4 118.
  • image PS1 112 and image PS3 116 may appear to have a similar orientation as light source 110, depicted here as upright, with the candle flame appearing on top.
  • image PS4 118 may appear substantially inverted, depicted here with the candle flame appearing on the bottom.
  • the orientation of an image may be used to distinguish some or all of the Purkinje-Sanson images PS1 112, PS2 114, PS3 116, and PS4 118. For example, given two detected images thought to be PS3 116 and PS4 118, the image appearing substantially upright may be labeled PS3 116 and the image appearing substantially inverted may be labeled PS4 118.
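The two labeling cues just described (image orientation, then relative distance from the light source) can be combined into a small routine. This is a minimal illustrative sketch only; the detection format and the field names `dist` and `upright` are assumptions, not part of the disclosure.

```python
# Hypothetical sketch: label two detected reflections as PS3 and PS4.
# Each detection is a dict: {'dist': distance from the light source
# (arbitrary units), 'upright': orientation flag}.

def label_purkinje_pair(img_a, img_b):
    """Return (ps3, ps4) for two detected images.

    PS3 is expected upright and closer to the light source;
    PS4 is expected inverted and farther away.
    """
    # Primary cue: orientation (PS3 upright, PS4 inverted).
    if img_a['upright'] != img_b['upright']:
        return (img_a, img_b) if img_a['upright'] else (img_b, img_a)
    # Fallback cue: relative distance from the light source.
    nearer, farther = sorted((img_a, img_b), key=lambda d: d['dist'])
    return nearer, farther

ps3, ps4 = label_purkinje_pair({'dist': 9.0, 'upright': False},
                               {'dist': 4.0, 'upright': True})
# ps3 is the upright, nearer image; ps4 the inverted, farther one.
```

A real sensor would of course produce noisier data; the point is only that either cue alone, or both together, suffices to order the pair.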
  • lens 102 may change focus by changing the optical power of lens 102 via the accommodation process natural to the human eye.
  • muscles controlling lens 102 may tense and/or relax causing deformation of lens 102, thereby changing the optical power and focus of lens 102.
  • the outer surface of lens 102 and/or inner surface of lens 102 may change in curvature and lens 102 may change in thickness; thus image PS3 116 and/or image PS4 118 may correspondingly change location(s).
  • the locations of PS3 116 and PS4 118 may provide an indication of the current accommodation state of eye 101, which in turn may be processed to determine a current focal distance of eye 101.
  • Fig. 2 illustrates cross sections of eye 101 including lens 102 and an example contact lens 201 configured to measure eyeball focus of eye 101, arranged in accordance with at least some embodiments of the present disclosure.
  • Fig. 2 may depict a cross section of eye 101 including lens 102. Eye 101, lens 102, PS3 116, and/or PS4 118 are as discussed with respect to Fig. 1 and elsewhere herein.
  • In some examples, an emitter 210, a sensor 220, and/or a communication device 230 may be disposed on contact lens 201.
  • contact lens 201 may include emitter 210 and sensor 220 to determine the focus of lens 102 in eye 101.
  • lens 102 may correspondingly change shape, so that image PS3 116 and image PS4 118 may change locations.
  • emitter 210 may be used to project images PS3 116 and/or PS4 118, and sensor 220 may be used to detect the locations of images PS3 116 and/or PS4 118. The detected locations of images PS3 116 and PS4 118 may then be used to determine an accommodation of lens 102, which in turn may be used to determine a focal distance of eye 101.
  • emitter 210 may be configured to emit light to create the PS3 116 and/or PS4 118 images.
  • emitter 210 may include a device for power/control 212, an LED 214, and/or a microlens 216.
  • LED 214 may be activated by power/control 212 to emit light; the emitted light may pass through microlens 216 toward eye 101, creating images PS3 116 and PS4 118.
  • emitter 210 may be configured to emit any appropriate wavelength(s) of light, including, but not limited to, IR light outside of the visible spectrum of a typical human eye.
  • the wearer of contact lens 201 may not be aware of and/or disturbed by the emissions of emitter 210.
  • sensor 220 may be configured to detect the PS3 116 and/or PS4 118 images.
  • sensor 220 may include a CCD/CMOS 222 and a microlens array 224.
  • Microlens array 224 may be configured to focus the PS3 116 and/or PS4 118 images onto CCD/CMOS 222.
  • CCD/CMOS 222 and/or microlens array 224 may be arranged in any appropriate shape, including a linear distribution according to the direction of warp.
  • CCD/CMOS 222 may describe a line so that received images may be associated with a linear displacement on the line to provide location data as an offset coordinate.
  • microlens array 224 may be linearly calibrated to CCD/CMOS 222 to associate one or more individual microlenses of microlens array 224 with corresponding one or more sensitive elements of CCD/CMOS 222.
  • CCD/CMOS 222 may be any appropriate type of image sensor, including Charge Coupled Device (CCD) and/or Complementary Metal Oxide Semiconductor (CMOS), or the like.
  • CCD/CMOS 222 may incorporate one or more filters to attenuate signals not originating from emitter 210, for example one or more IR filters configured to pass primarily IR light.
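Since CCD/CMOS 222 may describe a line, locating the reflections can reduce to finding the two strongest local maxima on a filtered 1-D readout and reporting their offset coordinates. A minimal sketch; the simulated readout values are hypothetical and stand in for post-filter pixel intensities.

```python
def find_two_peaks(line):
    """Return the offset coordinates (indices) of the two strongest
    local maxima on a 1-D sensor readout, in ascending order."""
    peaks = [i for i in range(1, len(line) - 1)
             if line[i] > line[i - 1] and line[i] >= line[i + 1]]
    peaks.sort(key=lambda i: line[i], reverse=True)
    return sorted(peaks[:2])

# Simulated IR-filtered readout with reflections at offsets 3 and 9.
readout = [0, 1, 2, 9, 2, 1, 0, 1, 3, 8, 3, 0]
p1, p2 = find_two_peaks(readout)   # p1 = 3, p2 = 9
separation = p2 - p1               # candidate PS3-PS4 separation
```

The separation between the two offsets is then the quantity that, per the surrounding text, may indicate the accommodation of lens 102.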
  • sensor 220 may be configured to detect focal distance data associated with eye 101.
  • focal distance data may be determined based at least in part on a distance separating images PS3 116 and PS4 118, which may indicate an accommodation of lens 102, which in turn may indicate a focal distance of eye 101.
  • Focal distance data may be any appropriate focal distance data, including, for example, an indication of the focal distance and/or indications of the locations of images PS3 116 and/or PS4 118.
  • focal distance data may include substantially unprocessed data from sensor 220, which may be processed to determine at least the locations of images PS3 116 and/or PS4 118.
  • the focal distance of eye 101 may be determined based at least in part on focal distance data.
  • the determined focal distance of eye 101 may be used, for example, to determine where a wearer of contact lens 201 may be looking, what the wearer may be looking at, and/or in cooperation with any appropriate augmented reality (AR) display system to display information at the focal distance of the person wearing contact lens 201.
  • communication device 230 may be used to communicate focal distance data associated with eye 101.
  • communication device 230 may be located with sensor 220, although communication device 230 may be positioned at any appropriate location. In some examples, communication device 230 may be communicatively coupled to sensor 220 and/or may be communicatively coupled to CCD/CMOS 222.
  • Communication device 230 may be any appropriate communication device, for example one or more antennas, wired connectivity, or the like. Communication device 230 may use any appropriate protocol, such as Bluetooth, WiFi, cellular, Ethernet, or the like.
  • contact lens 201 may measure eyeball focus based at least in part on detecting the locations of image PS3 116 and/or image PS4 118.
  • a distance separating PS3 116 and PS4 118 may generally indicate an accommodation of lens 102. That is, as lens 102 changes focus, the accommodation of lens 102 may change so that the locations of and/or distance separating images PS3 116 and/or PS4 118 may also change correspondingly.
  • the determination of locations of and/or distances separating images PS3 116 and PS4 118 may require distinguishing between the locations of image PS3 116 and image PS4 118.
  • image PS3 116 may be located closer to the center of contact lens 201 and image PS4 118 may be located further from the center of contact lens 201.
  • other distinguishing characteristics may be used to distinguish the PS3 116 and/or PS4 118 images.
  • Images PS3 116 and/or PS4 118 may be distinguished, in some examples, by configuring LED 214 to emit two or more distinct wavelengths of IR light in a particular orientation and/or pattern so that CCD/CMOS 222 may determine whether an image appears in a substantially upright orientation, such as may be observed at image PS3 116, or instead whether an image appears in a substantially inverted orientation, such as may be observed at image PS4 118.
  • LED 214 may comprise one or more LED devices.
  • sensor 220 may be capable of distinguishing two or more wavelengths of IR light.
  • individual photosensitive sites of sensor 220 may be configured to improve sensitivity to one or more particular wavelengths.
  • individual photosensitive sites of sensor 220 may include filters to attenuate sensitivity to undesired wavelengths.
  • filtered photosensitive sites of sensor 220 may be patterned, for example ABABAB, wherein A indicates sensitivity to a first wavelength and B indicates sensitivity to a second wavelength. Any suitable pattern of photosensitive sites may be employed.
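An ABABAB-patterned readout like the one described can be separated into per-wavelength channels by simple de-interleaving. A sketch, assuming A-sites occupy even indices and B-sites odd indices; the sample values are illustrative only.

```python
def demux_abab(samples):
    """Split an ABABAB-patterned linear readout into two channels:
    A-sites (even indices, first wavelength) and
    B-sites (odd indices, second wavelength)."""
    return samples[0::2], samples[1::2]

# Simulated interleaved readout: strong signal at wavelength A,
# weak signal at wavelength B.
a_channel, b_channel = demux_abab([10, 1, 12, 2, 11, 3])
# a_channel → [10, 12, 11]; b_channel → [1, 2, 3]
```

Each channel can then be searched for peaks independently, so an image projected at the first wavelength is not confused with one projected at the second.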
  • emitter 210 may be arranged substantially opposite to sensor 220. In some examples, emitter 210 may be arranged on a first sector of the contact lens and sensor 220 may be arranged on a second sector of the contact lens, substantially opposite to the first sector. In some examples, emitter 210 may be arranged substantially linearly according to the direction of warp.
  • power/control 212 may be configured to power and/or control LED 214. In some examples, depending in part on available power or any other suitable factors, power/control 212 may drive LED 214 continuously and/or for a long duration. In some examples, power/control 212 may pulse LED 214 in any appropriate duty cycle or cycles. Power/control 212 may be configured to provide steady illumination, to conserve power, and/or to adapt output according to environmental needs. In some examples, such as in the presence of bright and/or dark ambient light, power/control 212 may increase and/or decrease the intensity of LED 214 to provide brighter images PS3 116 and/or PS4 118, as appropriate.
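The ambient-adaptive intensity idea might be sketched as a clamped linear scaling. The range endpoints `lo` and `hi` are illustrative assumptions, not values from the disclosure.

```python
def led_intensity(ambient, lo=0.2, hi=1.0):
    """Map ambient brightness (0.0 = dark .. 1.0 = bright) to an LED
    drive level: brighter surroundings call for brighter
    Purkinje-Sanson reflections to keep them detectable."""
    clamped = min(max(ambient, 0.0), 1.0)
    return lo + (hi - lo) * clamped

# In darkness the LED idles at the low level; in bright light it
# is driven at full level.
```

A pulsed (duty-cycled) scheme could wrap the same function, scaling on-time rather than drive current, to conserve harvested power.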
  • emitter 210, sensor 220, and/or communication device 230 may be disposed on an inside surface of contact lens 201, on an outside surface of contact lens 201, substantially through contact lens 201, and/or substantially enclosed within contact lens 201.
  • components included in contact lens 201 may operate at the speed of modern electronics.
  • emitter 210 may emit light, sensor 220 may detect light, and communication device 230 may communicate in real time.
  • Fig. 3 illustrates contact lens 201 configured to measure eyeball focus, arranged in accordance with at least some embodiments of the present disclosure.
  • Contact lens 201 may include emitter 210, sensor 220, power supply 310, and/or leads 330.
  • Contact lens 201, emitter 210, and/or sensor 220 are as discussed with respect to Fig. 2 and elsewhere herein.
  • Power may be used for any appropriate purpose. In some embodiments, emitter 210 may use power to emit light at LED 214, and sensor 220 may use power to operate CCD/CMOS 222 and/or communication device 230.
  • a power supply 310 may be configured as a substantially circular and/or semicircular circuit, circumferentially located toward the extents of contact lens 201. In some examples, power supply 310 may not be visible to the wearer of contact lens 201. In some examples, power supply 310 may be visible at the extents of vision and/or at one or more leads 330 coupling power supply 310 to emitter 210 and/or sensor 220. In some examples, power supply 310 may include substantially transparent materials.
  • Power supply 310 may be configured to harvest power to operate electronic components of contact lens 201, for example emitter 210 and/or sensor 220. Power supply 310 may also use more conventional techniques, such as battery power, either alone or in combination with power harvesting; a battery may provide uninterrupted and/or conditioned power.
  • power supply 310 may be operable to harvest power from any suitable sources, for example solar power, kinetic energy, thermoelectric power, rectifying antenna, or the like.
  • power supply 310 may include a radio frequency (RF) antenna configured to harvest RF energy.
  • power supply 310 may incorporate an antenna or may itself be used as an antenna, as appropriate.
  • Fig. 4 illustrates a cutaway view of an example contact lens 401 including a groove 410, arranged in accordance with at least some embodiments of the present disclosure.
  • Fig. 3, previously discussed above, describes contact lens 201 arranged to include emitter 210 and sensor 220 on substantially opposite sectors of contact lens 201.
  • Fig. 4 depicts contact lens 401 having an annular groove, groove 410, wherein groove 410 may be arranged to emit light and receive reflections as described herein and throughout.
  • the light emitted from groove 410 may function as described above, particularly with respect to Fig. 1, so that the emitted light may project reflections associated with an accommodation state of a natural lens of an eye wearing contact lens 401, and the received reflections may be processed to determine the focal distance of the eye.
  • Groove 410 may be an annular groove formed on the contact lens and may permit determination of the focal distance of the eye wearing contact lens 401.
  • an IR emitter including an infrared laser diode may be configured to provide IR light to groove 410.
  • the IR laser diode may use very low power.
  • Groove 410 may act as a semi-reflex lens, and may be configured to emit a portion of light and/or conduct a portion of light.
  • a first portion of the provided IR light may emit from groove 410 and may reflect off one or more surfaces of an eye wearing contact lens 401.
  • the reflected IR light may be received at groove 410 to provide a first reflected signal.
  • a second portion of the provided IR light may conduct a total reflection along groove 410 to provide a second reflected signal.
  • An IR sensor coupled to groove 410 may be configured to detect the first and second reflected signal.
  • the first and second reflected signals may produce interference, which may be detected at the IR sensor.
  • the first and second reflected signals, including interference between the signals, may comprise focal distance data, which may in turn be processed to determine a focal distance of the eye based in part on the interference between the first and second reflected signals.
  • the interference between the first and second reflected signals may depend at least in part on the distance traveled by the first, emitted portion of light as reflected by the lens of the eye wearing contact lens 401.
  • the reflected light may travel a shorter or longer distance, thereby correspondingly affecting the interference detected by the sensor.
  • measuring the interference may permit determination of the focal distance of the eye wearing contact lens 401.
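The dependence of detected intensity on path difference follows the standard two-beam interference relation I = I1 + I2 + 2·sqrt(I1·I2)·cos(2π·ΔL/λ). A sketch of that relation; the 850 nm wavelength is an assumed IR value, not one specified by the disclosure.

```python
import math

def interference_intensity(path_diff_nm, wavelength_nm=850.0,
                           i1=1.0, i2=1.0):
    """Intensity when the reflected signal (path difference
    path_diff_nm relative to the conducted reference signal) is
    superposed with that reference. Standard two-beam interference."""
    phase = 2.0 * math.pi * path_diff_nm / wavelength_nm
    return i1 + i2 + 2.0 * math.sqrt(i1 * i2) * math.cos(phase)

# Zero path difference: constructive interference (maximum).
# Half-wavelength path difference: destructive interference (minimum).
```

As the lens accommodates and the reflection path lengthens or shortens, the detected intensity sweeps through these fringes, which is what makes the interference usable as focal distance data.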
  • groove 410 may be formed on either a top or a bottom surface of contact lens 401 as appropriate.
  • groove 410 may behave similarly to an optical fiber, acting, for example, as a waveguide for at least a portion of light transmitted via groove 410.
  • groove cross-sectional shape may be any suitable shape, including semi-circular, rectangular, diamond, or the like.
  • emitted light may originate from groove 410 and/or any suitable surface of contact lens 401.
  • Fig. 5 illustrates a flow diagram of an example method 500 for measuring eyeball focus, arranged in accordance with at least some embodiments of the present disclosure.
  • method 500 may be performed by any suitable device, devices, or systems such as those discussed herein. More specifically, method 500 may be performed by a contact lens to measure eyeball focus.
  • Method 500 sets forth various functional blocks or actions that may be described as processing steps, functional operations, events and/or acts, etc., which may be performed by hardware, software, and/or firmware. Numerous alternatives to the functional blocks shown in Fig. 5 may be practiced in various implementations. For example, intervening actions not shown and/or additional actions not shown may be employed and/or some of the actions shown may be eliminated, without departing from the scope of claimed subject matter.
  • Method 500 may include one or more of functional operations as indicated by one or more of blocks 502, 504, 506, and/or 508. The process of method 500 may begin at block 502.
  • an emitter disposed on the contact lens may emit IR light.
  • any suitable type or types of light may be emitted by any suitable technique or techniques.
  • IR light may be emitted as may be further discussed with respect to Figs. 1-4 and elsewhere herein, particularly with respect to emitter 210 and LED 214.
  • the emitted light may comprise wavelengths falling outside of human vision, for example IR light.
  • a sensor on the contact lens may detect reflections of the IR light.
  • the sensor may operate as may be further discussed with respect to Figs. 2-4 and elsewhere herein, particularly with respect to sensor 220 and/or CCD/CMOS 222.
  • the detected IR light may be associated with one or more locations on the sensor and may correspond to locations of images PS3 116 and/or PS4 118. Accordingly, in some examples, the locations of the detected IR light may be used to determine a focal distance of an eye wearing the contact lens.
  • detected data may be provided in a variety of formats to allow the determination of the focal distance, including formats such as unprocessed focal distance data, identified locations of images PS3 116 and/or PS4 118, an indication of focal distance, and/or any suitable focal distance data.
  • Process of method 500 may continue from block 504 to block 506.
  • focal distance data associated with the eye wearing the contact lens may be provided via a communication device.
  • the communication device may operate as may be further discussed with respect to Figs. 2-4 and elsewhere herein, particularly with respect to communication device 230.
  • Focal distance data may be provided by communication techniques including radio communication, wired communication, optical communication, and/or any suitable communication technique or techniques.
  • Process of method 500 may optionally continue from block 506 to block 508 or stop after block 506.
  • focal distance data may be processed at a control circuit.
  • block 508 may be optional. Processing the focal distance data may include determining a focal distance based in part on one or more detected reflections of IR light.
  • the control circuit may determine the focal distance based at least in part on the distance separating images PS3 and PS4, for example as may be detected at block 504.
  • the focal distance data may include the focal distance itself, in which case determining the focal distance may not require further processing and/or interpretation of the focal distance data.
  • control circuit may be disposed on the contact lens. In some examples, the control circuit may be located somewhere other than the contact lens. In some examples, the control circuit may be included in a mobile electronic device, a PDA, a computer, a cloud-based computing device, or the like. In general, the control circuit may be configured to process the focal distance data.
  • process 500 including blocks 502, 504, 506, and/or 508 may occur in real time.
  • processing focal distance data may include calibrating the determined focal distance to the perception of the contact lens wearer.
  • calibration may be optional.
  • calibration may occur as infrequently as once for a wearer of the contact lens, in some examples upon the wearer's first use of the contact lens.
  • calibration may be repeated, for example repeated to calibrate different focal distances or for example repeated at different times.
  • Process of method 500 may stop after block 508. In some embodiments,
  • process of method 500 may be repeated, beginning again at block 502, or the like. In some examples, process of method 500 may repeat without performing optional block 508, and may continue from block 506 to block 502, or the like.
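The flow of blocks 502 through 508, including the optional processing step, can be sketched as a single pass with the hardware stubbed out. Every function name here (`emit_ir`, `read_sensor`, `transmit`, `process`) is a hypothetical stand-in for hardware and firmware hooks, not an interface from the disclosure.

```python
def measure_once(emit_ir, read_sensor, transmit, process=None):
    """One pass of method 500: emit IR light (block 502), detect
    reflections (block 504), provide focal distance data (block 506),
    and optionally process it (block 508)."""
    emit_ir()                                   # block 502
    data = read_sensor()                        # block 504
    transmit(data)                              # block 506
    return process(data) if process is not None else None

# Demonstration with stand-in hooks.
log = []
result = measure_once(
    emit_ir=lambda: log.append('emit'),
    read_sensor=lambda: {'ps3': 3, 'ps4': 9},
    transmit=log.append,
    process=lambda d: d['ps4'] - d['ps3'],  # separation as crude stand-in
)
```

Repeating the method, with or without block 508, is then just calling `measure_once` in a loop.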
  • Fig. 6 illustrates a flow diagram of an example method 600 for calibrating a determined focal distance with the perception of the individual wearing a contact lens, arranged in accordance with at least some embodiments of the present disclosure.
  • method 600 may be performed by any suitable device, devices, or systems such as those discussed herein.
  • method 600 may process focal distance data that may be provided via, for example, the communication device of Fig. 5 at block 506, though method 600 is not limited to this purpose.
  • method 600 may be similar to, encompass, and/or represent a portion of Fig. 5 at block 508, "Process focal distance data at a control circuit".
  • Method 600 sets forth various functional blocks or actions that may be described as processing steps, functional operations, events and/or acts, etc., which may be performed by hardware, software, and/or firmware. Numerous alternatives to the functional blocks shown in Fig. 6 may be practiced in various implementations. For example, intervening actions not shown and/or additional actions not shown may be employed and/or some of the actions shown may be eliminated, without departing from the scope of claimed subject matter.
  • Method 600 may include one or more of functional operations as indicated by one or more of blocks 602, and/or 604. The process of method 600 may begin at block 602.
  • a focal distance associated with an eye of the wearer of the contact lens may be determined at a control circuit. As described herein, particularly with respect to Fig. 5 and block 508, the focal distance may be determined based in part on processing focal distance data. Process of method 600 may continue from block 602 to block 604.
  • an association between accommodation of the eye and the focal distance may be calibrated.
  • human eyes may be unique and the diopter of an eye may differ from that of another eye.
  • two different people exhibiting similar measured accommodation of the eye, for example when a distance between the PS3 116 and PS4 118 images may be similar, may perceive focus at different focal distances.
  • different people focused on the same object at the same distance may exhibit different measured accommodations of the eye. These differences may be measured, such as at block 602, and calibrated for the individual wearing the contact lens.
  • calibration may include a best fit curve to associate the accommodation of the eye at one or more distances with one or more focal distances perceived at those distances.
  • the calibrated distances may include positions such as 25 cm, 33 cm, 1 m, 2 m, 5 m, etc., which correspond to diopters of 4D, 3D, 1D, 0.5D, and 0.2D, respectively, which, when associated with corresponding detected accommodations of the eye, may provide data to develop a best fit to form a corresponding relationship between a measured indication of accommodation and a corresponding perceived focal distance of the subject.
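A best fit of this kind might be sketched as a least-squares line mapping a measured indication of accommodation (e.g. PS3-PS4 separation) to diopters, with focal distance recovered as the reciprocal. The distances and diopters below come from the text; the separation values are hypothetical calibration measurements, and the linear model is an illustrative choice.

```python
def fit_linear(xs, ys):
    """Least-squares fit ys ~ a*xs + b; returns (a, b)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    a = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    return a, my - a * mx

# Calibration distances from the text and their diopters (D = 1/m).
distances_m = [0.25, 0.33, 1.0, 2.0, 5.0]
diopters = [1.0 / d for d in distances_m]     # 4, ~3, 1, 0.5, 0.2

# Hypothetical PS3-PS4 separations (sensor units) measured while the
# wearer focused at each distance.
separations = [2.0, 2.5, 3.5, 3.75, 3.9]

a, b = fit_linear(separations, diopters)

def focal_distance_m(separation):
    """Map a measured separation to a perceived focal distance (m)."""
    d = a * separation + b
    return 1.0 / d if d > 1e-6 else float('inf')
```

With these made-up measurements the fitted slope is negative (larger separation, weaker accommodation, farther focus); a real device would substitute its own calibration pairs and possibly a higher-order fit.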
  • Calibration may be done when appropriate, for example when the contact lens wearer wears the contact lens for the first time, occasionally, and/or periodically.
  • the AR image device may include any appropriate AR image device.
  • the AR image device may be disposed on the contact lens.
  • the AR image device may include a head mounted display.
  • the AR image device may include one or more fixed display units, including, for example, projectors and/or displays.
  • Process of method 600 may end at block 604. In some examples, process of method 600 may be repeated, beginning again at block 602.
  • Fig. 7 illustrates a flow diagram of an example method 700 for calibrating a known object distance to a measured focal distance, arranged in accordance with at least some embodiments of the present disclosure.
  • method 700 may be performed by any suitable device, devices, or systems such as those discussed herein.
  • Method 700 sets forth various functional blocks or actions that may be described as processing steps, functional operations, events and/or acts, etc., which may be performed by hardware, software, and/or firmware. Numerous alternatives to the functional blocks shown in Fig. 7 may be practiced in various implementations. For example, intervening actions not shown and/or additional actions not shown may be employed and/or some of the actions shown may be eliminated, without departing from the scope of claimed subject matter.
  • Method 700 may include one or more of functional operations as indicated by one or more of blocks 702, 704, and/or 706. The process of method 700 may begin at block 702.
  • an AR image focused at a known focal distance may be provided.
  • the AR image may be provided by an alternate reality imaging device or any suitable alternative.
  • the alternate reality imaging device may project, overlay, and/or display computer generated data onto a view of the surroundings, or the like, which may permit viewing data without requiring a user to look away from a scene.
  • the provided AR image may have the appearance of a target, or the like.
  • the provided AR image may be displayed so as to be perceived at a known distance.
  • an AR imaging device may provide an image to be perceived at any suitable distance, including at infinity. Process of method 700 may continue from block 702 to block 704.
  • an accommodation of an eye wearing a contact lens may be determined when focused on the AR image.
  • the accommodation of the eye may correspond to a distance between images PS3 and PS4.
  • the contact lens may determine the accommodation of the eye as discussed herein, particularly with respect to Figs. 1 -5 and elsewhere herein.
  • contact lens 102 may emit IR light, detect reflections of the IR light, and/or provide focal distance data that may be processed to determine the accommodation of the eye.
  • a user may focus on the target and the contact lens may determine the accommodation of the eye while the user is focused on the target.
  • Process of method 700 may continue from block 704 to block 706.
  • the accommodation of the eye when focused on the target may be associated with the known focal distance of the provided AR image.
  • the association may be stored for later use and/or processed immediately.
  • the AR image device may include any appropriate AR image device.
  • process of method 700 may be repeated as desired at different focal distances to perform best fit analysis over a range of focal distances.
  • process of method 700 may repeat at block 702.
  • Process of method 700 may stop after block 706.
  • Fig. 8 illustrates a flow diagram of an example method 800 for associating a particular focal distance to an adjusted object distance, arranged in accordance with at least some embodiments of the present disclosure.
  • the associated data may permit the detection of a current focal distance of an eye to allow an AR image device to display information to be perceived at the same focal distance.
  • Method 800 sets forth various functional blocks or actions that may be described as processing steps, functional operations, events and/or acts, etc., which may be performed by hardware, software, and/or firmware. Numerous alternatives to the functional blocks shown in Fig. 8 may be practiced in various implementations. For example, intervening actions not shown and/or additional actions not shown may be employed and/or some of the actions shown may be eliminated, without departing from the scope of claimed subject matter.
  • Method 800 may include one or more of functional operations as indicated by one or more of blocks 802, 804, 806, and/or 808. The process of method 800 may begin at block 802.
  • a user may focus the eye at a first focal distance and an accommodation of the eye may be determined when focused at the first focal distance.
  • the first distance may be any suitable distance.
  • the user may focus on an object in a viewport, for example a wall or the like.
  • determining the accommodation of the eye may proceed as may be further discussed with respect to Fig. 7 and elsewhere herein, particularly with respect to block 704.
  • in general, the user may not substantially change their gaze and/or change focal distance for the duration of method 800, although in some examples block 802 may be performed at any time prior to block 808. Process of method 800 may continue from block 802 to block 804.
  • an AR image may be provided. As described herein, and with respect to Fig. 7 above, particularly block 702, the AR image may be provided at any suitable focal distance. In some examples, the provided AR image may not initially coincide with the first focal distance. Process of method 800 may continue from block 804 to block 806.
  • a user interface may be provided so that the user may adjust the focal distance of the displayed AR image. In general, the user will adjust the provided AR image until the AR image coincides with the first focal distance.
  • Process of method 800 may continue from block 806 to block 808.
  • "Associate the accommodation of the eye to the adjusted focal distance of the AR image” the accommodation of the eye, as determined at block 802, may be associated to the adjusted focal distance of the AR image. In some examples, this association may be used as described herein to calibrate an AR imaging device with the perception of an individual person. In some examples, as described herein, this association may be repeated as necessary at different focal distances to perform best fit analysis over a range of focal distances.
  • the AR image device may include any appropriate AR image device.
  • process of method 800 may be repeated as desired at different focal distances to perform best fit analysis over a range of focal distances.
  • process of method 800 may repeat at block 802.
  • process of method 800 may stop after block 808.
  • Fig. 9 illustrates an example computer program product 900, arranged in accordance with at least some embodiments of the present disclosure.
  • Computer program product 900 may include a machine-readable non-transitory medium having stored therein instructions that, when executed, may operatively enable a computing device to measure eyeball focus and/or calibrate measured focus data according to the processes and methods discussed herein.
  • Computer program product 900 may include a signal bearing medium 902.
  • Signal bearing medium 902 may include one or more machine-readable instructions 904, which, when executed by one or more processors, may operatively enable a computing device to provide the functionality described herein. In various examples, some or all of the machine-readable instructions may be used by the devices discussed herein.
  • signal bearing medium 902 may encompass a computer-readable medium 906, such as, but not limited to, a hard disk drive, a Compact Disc (CD), a Digital Versatile Disk (DVD), a digital tape, memory, etc.
  • signal bearing medium 902 may encompass a recordable medium 908, such as, but not limited to, memory, read/write (R/W) CDs, R/W DVDs, etc.
  • signal bearing medium 902 may encompass a communications medium 910, such as, but not limited to, a digital and/or an analog communication medium (e.g., a fiber optic cable, a waveguide, a wired communication link, a wireless communication link, etc.).
  • signal bearing medium 902 may encompass a machine readable non-transitory medium.
  • Fig. 10 is a block diagram illustrating an example computing device 1000, arranged in accordance with at least some embodiments of the present disclosure.
  • computing device 1000 may be configured to measure eyeball focus and/or calibrate measured focus data as discussed herein.
  • computing device 1000 may include one or more processors 1010 and system memory 1020.
  • a memory bus 1030 can be used for communicating between the processor 1010 and the system memory 1020.
  • processor 1010 may be of any type including but not limited to a microprocessor (μP), a microcontroller (μC), a digital signal processor (DSP), or any combination thereof.
  • Processor 1010 can include one or more levels of caching, such as a level one cache 1011 and a level two cache 1012, a processor core 1013, and registers 1014.
  • the processor core 1013 can include an arithmetic logic unit (ALU), a floating point unit (FPU), a digital signal processing core (DSP Core), or any combination thereof.
  • a memory controller 1015 can also be used with the processor 1010, or in some implementations the memory controller 1015 can be an internal part of the processor 1010.
  • system memory 1020 may be of any type including but not limited to volatile memory (such as RAM), non-volatile memory (such as ROM, flash memory, etc.) or any combination thereof.
  • System memory 1020 may include an operating system 1021, one or more applications 1022, and program data 1024.
  • Application 1022 may include eyeball focus measuring application 1023 that can be arranged to perform the functions, actions, and/or operations as described herein including the functional blocks, actions, and/or operations described herein.
  • Program Data 1024 may include eyeball focus measuring data 1025 for use with eyeball focus measuring application 1023.
  • application 1022 may be arranged to operate with program data 1024 on an operating system 1021. This described basic configuration is illustrated in Fig. 10 by those components within dashed line 1001.
  • Computing device 1000 may have additional features or functionality, and additional interfaces to facilitate communications between the basic configuration 1001 and any required devices and interfaces.
  • a bus/interface controller 1040 may be used to facilitate communications between the basic configuration 1001 and one or more data storage devices 1050 via a storage interface bus 1041.
  • the data storage devices 1050 may be removable storage devices 1051, non-removable storage devices 1052, or a combination thereof.
  • removable storage and non-removable storage devices include magnetic disk devices such as flexible disk drives and hard-disk drives (HDD), optical disk drives such as compact disk (CD) drives or digital versatile disk (DVD) drives, solid state drives (SSD), and tape drives to name a few.
  • Example computer storage media may include volatile and nonvolatile, removable and non-removable media implemented in any method or technology for storage of information, such as computer readable instructions, data structures, program modules, or other data.
  • System memory 1020, removable storage 1051 and non-removable storage 1052 are all examples of computer storage media.
  • Computer storage media includes, but is not limited to, RAM, ROM, EEPROM, flash memory or other memory technology, CD-ROM, digital versatile disks (DVD) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other medium which may be used to store the desired information and which may be accessed by computing device 1000. Any such computer storage media may be part of device 1000.
  • Computing device 1000 may also include an interface bus 1042 for facilitating communication from various interface devices (e.g., output interfaces, peripheral interfaces, and communication interfaces) to the basic configuration 1001 via the bus/interface controller 1040.
  • Example output interfaces 1060 may include a graphics processing unit 1061 and an audio processing unit 1062, which may be configured to communicate to various external devices such as a display or speakers via one or more A/V ports 1063.
  • Example peripheral interfaces 1070 may include a serial interface controller 1071 or a parallel interface controller 1072, which may be configured to communicate with external devices such as input devices (e.g., keyboard, mouse, pen, voice input device, touch input device, etc.) or other peripheral devices (e.g., printer, scanner, etc.) via one or more I/O ports 1073.
  • An example communication interface 1080 includes a network controller 1081, which may be arranged to facilitate communications with one or more other computing devices 1083 over a network communication via one or more communication ports 1082.
  • a communication connection is one example of communication media.
  • Communication media may typically be embodied by computer readable instructions, data structures, program modules, or other data in a modulated data signal, such as a carrier wave or other transport mechanism, and may include any information delivery media.
  • a "modulated data signal" may be a signal that has one or more of its characteristics set or changed in such a manner as to encode information in the signal.
  • communication media may include wired media such as a wired network or direct-wired connection, and wireless media such as acoustic, radio frequency (RF), infrared (IR) and other wireless media.
  • the term computer readable media as used herein may include both storage media and communication media.
  • Computing device 1000 may be implemented as a portion of a small-form factor portable (or mobile) electronic device such as a cell phone, a mobile phone, a tablet device, a laptop computer, a personal data assistant (PDA), a personal media player device, a wireless web-watch device, a personal headset device, an application specific device, or a hybrid device that includes any of the above functions.
  • Computing device 1000 may also be implemented as a personal computer including both laptop computer and non-laptop computer configurations.
  • computing device 1000 may be implemented as part of a wireless base station or other wireless system or device.
  • to the extent that such block diagrams, flowcharts, and/or examples contain one or more functions and/or operations, it will be understood by those within the art that each function and/or operation within such block diagrams, flowcharts, or examples can be implemented, individually and/or collectively, by a wide range of hardware, software, firmware, or virtually any combination thereof, including Application Specific Integrated Circuits (ASICs), Field Programmable Gate Arrays (FPGAs), and digital signal processors (DSPs).
  • examples of a signal bearing medium include, but are not limited to, the following: a recordable type medium such as a flexible disk, a hard disk drive (HDD), a Compact Disc (CD), a Digital Versatile Disk (DVD), a digital tape, a computer memory, etc.; and a transmission type medium such as a digital and/or an analog communication medium (e.g., a fiber optic cable, a waveguide, a wired communication link, a wireless communication link, etc.).
  • any two components herein combined to achieve a particular functionality can be seen as “associated with” each other such that the desired functionality is achieved, irrespective of architectures or intermedial components.
  • any two components so associated can also be viewed as being “operably connected”, or “operably coupled”, to each other to achieve the desired functionality, and any two components capable of being so associated can also be viewed as being “operably couplable”, to each other to achieve the desired functionality.
  • examples of operably couplable include but are not limited to physically mateable and/or physically interacting components, and/or wirelessly interactable and/or wirelessly interacting components, and/or logically interacting and/or logically interactable components.

Landscapes

  • Health & Medical Sciences (AREA)
  • Ophthalmology & Optometry (AREA)
  • Physics & Mathematics (AREA)
  • General Health & Medical Sciences (AREA)
  • General Physics & Mathematics (AREA)
  • Optics & Photonics (AREA)
  • Eye Examination Apparatus (AREA)

Abstract

A contact lens (201) for measuring eyeball focus is disclosed. The contact lens (201) comprises: a contact lens (201) configured to be worn on an eye (101); an infrared (IR) emitter (210) disposed on the contact lens (201), the IR emitter (210) configured to emit an infrared light; an IR sensor (220) disposed on the contact lens (201), the IR sensor (220) configured to detect one or more reflections of the infrared light; and a communication device (230) disposed on the contact lens (201), the communication device (230) communicatively coupled to the IR sensor (220) and configured to provide focal distance data associated with the eye (101).

Description

CONTACT LENS FOR MEASURING EYEBALL FOCUS
BACKGROUND
[0001] Unless otherwise indicated herein, the approaches described in this section are not prior art to the claims in this application and are not admitted to be prior art by inclusion in this section.
[0002] Traditional augmented reality (AR) systems may not be able to provide virtual information at the perceived focal distance of the user, in part because traditional systems may not have access to calibrated real-time eye focal distance information. Traditional techniques for determining eye focal distance may be unwieldy, inaccurate, and/or refresh slowly.
SUMMARY
[0003] The present disclosure describes various illustrative apparatus for detecting eye focal distance from a contact lens. Example apparatus may include a wearable contact lens, an IR emitter disposed on the contact lens to emit IR light, an IR sensor disposed on the contact lens to detect reflections of the IR light, and a communication device disposed on the contact lens communicatively coupled to the IR sensor and configured to provide focal distance data associated with the eye.
[0004] The present disclosure describes various illustrative methods for detecting eye focal distance from a contact lens. Such methods may include emitting IR light from an emitter disposed on a wearable contact lens, detecting reflections at an IR sensor on the contact lens, and providing focal distance data associated with the eye from a communication device on the contact lens communicatively coupled to the IR sensor.
[0005] The present disclosure describes various illustrative machine-readable instructions for detecting eye focal distance from a contact lens. Such machine-readable instructions may include emitting IR light from an emitter disposed on a wearable contact lens, detecting reflections at an IR sensor on the contact lens, and providing focal distance data associated with the eye from a communication device on the contact lens communicatively coupled to the IR sensor.
[0006] The foregoing summary may be illustrative only and may not be intended to be in any way limiting. In addition to the illustrative aspects, embodiments, and features described above, further aspects, embodiments, and features will become apparent by reference to the drawings and the following detailed description.
BRIEF DESCRIPTION OF THE DRAWINGS
[0007] Subject matter is particularly pointed out and distinctly claimed in the concluding portion of the specification. The foregoing and other features of the present disclosure will become more fully apparent from the following description and appended claims, taken in conjunction with the accompanying drawings. Understanding that these drawings depict only several embodiments in accordance with the disclosure and are, therefore, not to be considered limiting of its scope, the disclosure will be described with additional specificity and detail through use of the accompanying drawings.
[0008] In the drawings:
Fig. 1 illustrates a system showing a cross section of an eye and a light source shining into the eye to project a set of Purkinje-Sanson reflections;
Fig. 2 illustrates cross sections of an eye including a lens and an example contact lens configured to measure eyeball focus of the eye;
Fig. 3 illustrates an example contact lens configured to measure eyeball focus;
Fig. 4 illustrates a cutaway view of an example contact lens including a groove;
Fig. 5 illustrates a flow diagram of an example method for measuring eyeball focus;
Fig. 6 illustrates a flow diagram of an example method for calibrating a determined focal distance with the perception of the individual wearing the contact lens;
Fig. 7 illustrates a flow diagram of an example method for calibrating a known object distance to a measured focal distance;
Fig. 8 illustrates a flow diagram of an example method for associating a particular focal distance to an adjusted object distance;
Fig. 9 illustrates an example computer program product; and
Fig. 10 illustrates an example computing device, all arranged in accordance with at least some embodiments of the present disclosure.
DETAILED DESCRIPTION
[0009] Subject matter is particularly pointed out and distinctly claimed in the concluding portion of the specification. The foregoing and other features of the present disclosure will become more fully apparent from the following description and appended claims, taken in conjunction with the accompanying drawings. Understanding that these drawings depict only several embodiments in accordance with the disclosure and are, therefore, not to be considered limiting of its scope, the disclosure will be described with additional specificity and detail through use of the accompanying drawings.
[0010] The following description sets forth various examples along with specific details to provide a thorough understanding of claimed subject matter. It will be understood by those skilled in the art, however, that claimed subject matter may be practiced without some or more of the specific details disclosed herein. Further, in some circumstances, well-known methods, procedures, systems, components and/or circuits have not been described in detail in order to avoid unnecessarily obscuring claimed subject matter.
[0011] In the following detailed description, reference is made to the accompanying drawings, which form a part hereof. In the drawings, similar symbols typically identify similar components, unless context dictates otherwise. The illustrative embodiments described in the detailed description, drawings, and claims are not meant to be limiting. Other embodiments may be utilized, and other changes may be made, without departing from the spirit or scope of the subject matter presented here. It will be readily understood that the aspects of the present disclosure, as generally described herein, and illustrated in the Figures, can be arranged, substituted, combined, and designed in a wide variety of different configurations, all of which are explicitly contemplated and make part of this disclosure.
[0012] This disclosure is drawn, inter alia, to apparatus, methods and computer-readable media related to measuring eyeball focus using a contact lens including an emitter, a receiver, a communication device, and/or a control circuit.
[0013] In general, a sensor may be disposed on a contact lens and may be configured to detect focal distance data associated with an eye to determine the focal distance of the eye, as will be described in greater detail throughout. A light source disposed on the contact lens may shine into the eye, reflect off one or more surfaces of the eye, and the reflections may project one or more Purkinje-Sanson images at particular locations of the eye. The locations of the Purkinje-Sanson images may be detected at the sensor disposed on the contact lens. The locations of the detected Purkinje-Sanson images may be associated with the physiological state of a natural lens within the eye as part of a process called accommodation, as will be described in greater detail throughout. Accordingly, the locations of the detected Purkinje-Sanson images may be processed to determine an accommodation state of the eye, which may in turn be processed to determine a focal distance of the eye.
[0014] In general, the optical power of an eye for a given accommodation state may be particular to an individual; accordingly, one or more calibration processes may associate detected focal distance data with a determination of the focal distance of the particular eye wearing the contact lens, as will be described in greater detail throughout. In general, calibration may incorporate the use of an Alternate Reality (AR) image device that may provide one or more AR images at selected focal distances. In some examples of calibration, focal distance data of the eye may be determined when focused on an AR image. In some examples of calibration, a perceived distance of the AR image may be manipulated to coincide with a known physical distance.
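The calibration described here pairs a measured accommodation reading with a known focal distance and develops a best fit over many such pairs. The sketch below uses the example distances given elsewhere in the disclosure (25cm, 33cm, 1m, 2m, 5m) and their optical powers; the accommodation readings are invented placeholders, since the disclosure does not give numeric sensor values:

```python
def fit_line(xs, ys):
    """Least-squares best-fit line y = a*x + b."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    a = sxy / sxx
    return a, my - a * mx

# Calibration targets at 25 cm, 33 cm, 1 m, 2 m, and 5 m, expressed as
# optical powers in diopters (D = 1 / distance in meters).
distances_m = [0.25, 0.33, 1.0, 2.0, 5.0]
diopters = [1.0 / d for d in distances_m]  # ~4D, 3D, 1D, 0.5D, 0.2D

# Hypothetical accommodation readings (e.g., PS3-PS4 separations in
# sensor units) taken while the wearer fixates an AR image at each distance.
readings = [11.8, 10.1, 6.9, 6.0, 5.5]

a, b = fit_line(readings, diopters)

def perceived_focal_distance_m(reading):
    """Convert a raw accommodation reading to a focal distance in meters."""
    power = a * reading + b  # diopters
    return 1.0 / power if power > 0 else float("inf")
```

A straight line is only one choice of "best fit curve"; a real implementation might fit a higher-order polynomial or spline over more calibration points.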
[0015] Fig. 1 illustrates a system 100 showing a cross section of an eye 101 and a light source 110 shining a light 111 into eye 101 to project a set of Purkinje-Sanson reflections, arranged in accordance with at least some embodiments of the present disclosure. As shown, light source 110 may produce light 111 that travels toward a cornea 104 of eye 101 and may continue to a lens 102 of eye 101. A portion of light 111 may reflect off an outer surface of cornea 104 to create a Purkinje-Sanson Image 1 (PS1) 112. Another portion of light 111 may reflect off an inner surface of cornea 104 to create a Purkinje-Sanson Image 2 (PS2) 114. Another portion of light 111 may reflect off an outer surface of lens 102 to create a Purkinje-Sanson Image 3 (PS3) 116. Another portion of light 111 may reflect off an inner surface of lens 102 to create a Purkinje-Sanson Image 4 (PS4) 118. As described herein, a focal distance of an eye may be determined based at least in part on the locations of one or more of the Purkinje-Sanson Images PS1 112, PS2 114, PS3 116, and/or PS4 118.
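As a reading aid (not part of the disclosure), the four reflections and the surface each originates from can be captured in a small lookup table:

```python
# Purkinje-Sanson images and the surface each reflects from (per Fig. 1).
# Only PS3 and PS4 originate at the crystalline lens, so only their
# locations shift as the lens accommodates.
PURKINJE_IMAGES = {
    "PS1": {"surface": "outer surface of cornea", "lens_dependent": False},
    "PS2": {"surface": "inner surface of cornea", "lens_dependent": False},
    "PS3": {"surface": "outer surface of lens", "lens_dependent": True},
    "PS4": {"surface": "inner surface of lens", "lens_dependent": True},
}

# The images useful for gauging accommodation:
useful = [name for name, p in PURKINJE_IMAGES.items() if p["lens_dependent"]]
print(useful)  # -> ['PS3', 'PS4']
```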
[0016] In general, as described above, images PS1 112 and PS2 114 may be reflected from the outer and inner surfaces of cornea 104, and therefore may not be directly influenced by lens 102, so that the locations of Purkinje-Sanson images PS1 112 and PS2 114 may not be best suited to provide information related to the state of lens 102. In particular, PS2 114 is depicted without a corresponding label and candle-image in part because PS2 114 may have less consequence to the subject matter disclosed herein.
[0017] The locations of Purkinje-Sanson images PS1 112, PS2 114, PS3 116, and/or PS4 118 may correspond to the location of light source 110 and/or the accommodation of lens 102. For example, image PS1 112 and/or image PS2 114 may appear relatively close to light source 110. Additionally, for example, image PS3 116 and image PS4 118 may appear relatively far from light source 110, and in some examples image PS3 116 and image PS4 118 may appear on the substantially opposite side of cornea 104. Furthermore, the relative locations of images PS1 112, PS2 114, PS3 116, and PS4 118 may be increasingly more distant from light source 110. Accordingly, the relative distance of images PS1 112, PS2 114, PS3 116, and/or PS4 118 from light source 110 may be used to distinguish some or all of the Purkinje-Sanson images PS1 112, PS2 114, PS3 116, and PS4 118. For example, because PS3 116 is generally relatively closer to light source 110 than PS4 118, then given two detected images thought to be PS3 116 and PS4 118, the relative distances of the images from the light source may be determined and the closer image may be labeled PS3 116 and the further image may be labeled PS4 118.
[0018] Purkinje-Sanson images PS1 112, PS2 114, PS3 116, and/or PS4 118 may be detected as upright images or inverted images. For example, light source 110 may depict a candle with a flame on top. The depiction of the candle may be for illustrative purposes to demonstrate the orientation of Purkinje-Sanson images PS1 112, PS3 116, and/or PS4 118. As shown, image PS1 112 and image PS3 116 may appear to have a similar orientation as light source 110, depicted here as upright, with the candle flame appearing on top. As shown, image PS4 118 may appear substantially inverted, depicted here with the candle flame appearing on the bottom. The orientation of an image may be used to distinguish some or all of the Purkinje-Sanson images PS1 112, PS2 114, PS3 116, and PS4 118. For example, given two detected images thought to be PS3 116 and PS4 118, the image appearing substantially upright may be labeled PS3 116 and the image appearing substantially inverted may be labeled PS4 118.
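The two distinguishing cues described above, relative distance from the light source and upright versus inverted orientation, can be sketched as a small classifier; the field names are hypothetical detector outputs, not part of the disclosure:

```python
def label_lens_images(img_a, img_b):
    """Label two detected lens reflections as PS3 and PS4.

    Each image is a dict with a 'distance' from the light source and an
    'upright' flag. Per the text, PS3 lies closer to the source and
    appears upright, while PS4 lies farther away and appears inverted.
    """
    # Primary cue: relative distance from the light source.
    near, far = sorted((img_a, img_b), key=lambda im: im["distance"])
    # Secondary cue: orientations should agree with the distance ordering.
    cues_agree = near["upright"] and not far["upright"]
    return {"PS3": near, "PS4": far, "cues_agree": cues_agree}

labels = label_lens_images(
    {"distance": 2.1, "upright": True},   # closer and upright   -> PS3
    {"distance": 3.4, "upright": False},  # farther and inverted -> PS4
)
```

When the two cues disagree, a real device might re-measure rather than trust either label.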
[0019] As described herein, lens 102 may change focus by changing the optical power of lens 102 via the accommodation process natural to the human eye. During accommodation, muscles controlling lens 102 may tense and/or relax causing deformation of lens 102, thereby changing the optical power and focus of lens 102. As lens 102 changes shape, the outer surface of lens 102 and/or the inner surface of lens 102 may change in curvature and lens 102 may change in thickness; thus the locations of image PS3 116 and/or image PS4 118 may correspondingly change. In some examples, the locations of PS3 116 and PS4 118 may provide an indication of the current accommodation state of eye 101, which in turn may be processed to determine a current focal distance of eye 101.
[0020] Fig. 2 illustrates cross sections of eye 101 including lens 102 and an example contact lens 201 configured to measure eyeball focus of eye 101, arranged in accordance with at least some embodiments of the present disclosure. Fig. 2 may depict a cross section of eye 101 including lens 102. Eye 101, lens 102, PS3 116, and/or PS4 118 are further described as discussed with respect to Fig. 1 and elsewhere herein.
[0021] In some examples, an emitter 210, a sensor 220, and/or a communication device 230 may be disposed on contact lens 201.
[0022] In some examples, contact lens 201 may include emitter 210 and sensor 220 to determine the focus of lens 102 in eye 101. As described above, as eye 101 changes focal distance, lens 102 may correspondingly change shape, so that image PS3 116 and image PS4 118 may change locations. As described above, emitter 210 may be used to project images PS3 116 and/or PS4 118, and sensor 220 may be used to detect the locations of images PS3 116 and/or PS4 118. The detected locations of images PS3 116 and PS4 118 may then be used to determine an accommodation of lens 102, which in turn may be used to determine a focal distance of eye 101.
[0023] As described above, emitter 210 may be configured to emit light to create PS3 116 and/or PS4 118 images. In some examples, emitter 210 may include a device for power/control 212, an LED 214, and/or a microlens 216. In some examples, LED 214 may be activated by power/control 212 to emit light; the emitted light may pass through microlens 216 toward eye 101, creating images PS3 116 and PS4 118. In some examples, emitter 210 may be configured to emit any appropriate wavelength(s) of light, including, but not limited to, IR light outside of the visible spectrum of a typical human eye. In some examples, the wearer of contact lens 201 may not be aware of and/or disturbed by the emissions of emitter 210.
[0024] As described above, sensor 220 may be configured to detect PS3 116 and/or PS4 118 images. In some examples, sensor 220 may include a CCD/CMOS 222 and a microlens array 224. Microlens array 224 may be configured to focus PS3 116 and/or PS4 118 images onto CCD/CMOS 222. CCD/CMOS 222 and/or microlens array 224 may be arranged in any appropriate shape, including a linear distribution according to the direction of warp. In some examples, CCD/CMOS 222 may describe a line so that received images may be associated with a linear displacement on the line to provide location data as an offset coordinate. In some examples, microlens array 224 may be linearly calibrated to CCD/CMOS 222 to associate one or more individual microlenses of microlens array 224 with corresponding one or more sensitive elements of CCD/CMOS 222.
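Reading location data as an offset coordinate from a linear CCD/CMOS line, as described above, can be sketched as a peak search over a 1-D intensity profile; the threshold and sample values are illustrative assumptions, not values from the disclosure:

```python
def peak_offsets(intensity, threshold=0.5):
    """Return offset coordinates (element indices) of local maxima on a
    1-D sensor line, as a sketch of reading image positions from a
    linear CCD/CMOS array. `threshold` is a fraction of the global
    maximum below which a peak is ignored."""
    cutoff = max(intensity) * threshold
    offsets = []
    for i in range(1, len(intensity) - 1):
        if intensity[i] >= cutoff and intensity[i - 1] < intensity[i] >= intensity[i + 1]:
            offsets.append(i)
    return offsets

# Two reflections landing at elements 3 and 9 of a 12-element line:
line = [0, 1, 2, 9, 2, 1, 0, 1, 3, 7, 3, 0]
print(peak_offsets(line))  # -> [3, 9]
```

The two returned offsets could then be labeled PS3 and PS4 using the distance and orientation cues described elsewhere herein.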
[0025] CCD/CMOS 222 may be any appropriate type of image sensor, including Charge Coupled Device (CCD) and/or Complementary Metal Oxide Semiconductor (CMOS), or the like. In some examples, CCD/CMOS 222 may incorporate one or more filters to attenuate signals not originating from emitter 210, for example one or more IR filters configured to pass primarily IR light.
[0026] As described above, sensor 220 may be configured to detect focal distance data associated with eye 101. In general, focal distance data may be determined based at least in part on a distance separating images PS3 116 and PS4 118, which may indicate an accommodation of lens 102, which in turn may indicate a focal distance of eye 101. Focal distance data may be any appropriate focal distance data, including, for example, an indication of the focal distance and/or indications of the locations of images PS3 116 and/or PS4 118. In some examples, focal distance data may include substantially unprocessed data from sensor 220 which may be processed to determine at least the locations of images PS3 116 and/or PS4 118.
[0027] The focal distance of eye 101 may be determined based at least in part on focal distance data. The determined focal distance of eye 101 may be used, for example, to determine where a wearer of contact lens 201 may be looking, to determine what the wearer may be looking at, and/or, in cooperation with any appropriate augmented reality (AR) display system, to display information at the focal distance of the person wearing contact lens 201.
[0028] In some examples, communication device 230 may be used to communicate focal distance data associated with eye 101. In some examples, communication device 230 may be located with sensor 220, although
communication device 230 may be positioned at any appropriate location. In some examples, communication device 230 may be communicatively coupled to sensor 220 and/or may be communicatively coupled to CCD/CMOS 222.
Communication device 230 may be any appropriate communication device, for example one or more antennas, wired connectivity, or the like. Communication device 230 may use any appropriate protocol, such as Bluetooth, WiFi, cellular, Ethernet, or the like.
[0029] As described herein, contact lens 201 may measure eyeball focus based at least in part on detecting the locations of image PS3 116 and/or image PS4 118. A distance separating PS3 116 and PS4 118 may generally indicate an accommodation of lens 102. That is, as lens 102 changes focus, the accommodation of lens 102 may change so that the locations of and/or distance separating images PS3 116 and/or PS4 118 may also change correspondingly. In some examples, the determination of locations of and/or distances separating images PS3 116 and PS4 118 may require distinguishing between the locations of image PS3 116 and image PS4 118. In some examples, image PS3 116 may be located closer to the center of contact lens 201 and image PS4 118 may conversely be located further from the center of contact lens 201. However, other distinguishing characteristics may be used to distinguish PS3 116 and/or PS4 118 images.
[0030] Images PS3 116 and/or PS4 118 may be distinguished, in some examples, by configuring LED 214 to emit two or more distinct wavelengths of IR light in a particular orientation and/or pattern so that CCD/CMOS 222 may determine whether an image appears in a substantially upright orientation, such as may be observed at image PS3 116, or instead whether an image appears in a substantially inverted orientation, such as may be observed at image PS4 118. In some examples, LED 214 may comprise one or more LED devices. In some examples, sensor 220 may be capable of distinguishing two or more wavelengths of IR light. In some examples, individual photosensitive sites of sensor 220 may be configured to improve sensitivity to one or more particular wavelengths. In some examples, individual photosensitive sites of sensor 220 may include filters to attenuate sensitivity to undesired wavelengths. In some examples, filtered photosensitive sites of sensor 220 may be patterned, for example ABABAB, wherein A indicates sensitivity to a first wavelength and B indicates sensitivity to a second wavelength. Any suitable pattern of photosensitive sites may be employed.
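A sketch of how an ABAB filter pattern might be used to classify image orientation follows. The even/odd site layout, the single-peak criterion, and the assumption that wavelength A leads wavelength B in an upright image are illustrative choices, not the disclosed circuit.

```python
def image_orientation(samples):
    """Classify a reflected image as upright or inverted from a linear
    sensor whose photosensitive sites are filtered in an ABAB... pattern.

    samples: per-site intensity readings along the sensor line. Sites at
    even indices pass wavelength A, odd indices wavelength B (hypothetical
    layout). Assuming the emitter projects wavelength A ahead of B along
    the emission direction, an upright image shows the A peak at or before
    the B peak; an inverted image reverses them.
    """
    a_sites = samples[0::2]  # readings from A-filtered sites
    b_sites = samples[1::2]  # readings from B-filtered sites
    a_peak = max(range(len(a_sites)), key=lambda i: a_sites[i])
    b_peak = max(range(len(b_sites)), key=lambda i: b_sites[i])
    return "upright" if a_peak <= b_peak else "inverted"
```

For instance, a readout whose A-channel peak precedes its B-channel peak would be classified as upright under these assumptions.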
[0031] In some examples, emitter 210 may be arranged substantially opposite to sensor 220. In some examples, emitter 210 may be arranged on a first sector of the contact lens and sensor 220 may be arranged on a second sector of the contact lens, substantially opposite to the first sector. In some examples, emitter 210 may be arranged substantially linearly according to the direction of warp.
[0032] In some examples, power/control 212 may be configured to power and/or control LED 214. In some examples, depending in part on available power (or any other suitable factors), power/control 212 may drive LED 214 continuously and/or for a long duration. In some examples, power/control 212 may pulse LED 214 in any appropriate duty cycle or cycles. Power/control 212 may be configured to provide steady illumination, to conserve power, and/or to adapt output according to environmental needs. In some examples, such as in the presence of bright and/or dark ambient light, power/control 212 may increase and/or decrease the intensity of LED 214 to provide brighter PS3 116 and/or PS4 118 images, as appropriate.
[0033] In some examples, emitter 210, sensor 220, and/or communication device 230 may be disposed on an inside surface of contact lens 201, on an outside surface of contact lens 201, substantially through contact lens 201, and/or substantially enclosed within contact lens 201.
[0034] In general, components included in contact lens 201 may operate at the speed of modern electronics. In some examples, emitter 210 may emit light, sensor 220 may detect light, and/or communication device 230 may communicate in real time.
[0035] Fig. 3 illustrates contact lens 201 configured to measure eyeball focus, arranged in accordance with at least some embodiments of the present disclosure. Contact lens 201 may include emitter 210, sensor 220, power supply 310, and/or leads 330. Contact lens 201, emitter 210, and/or sensor 220 are further described with respect to Fig. 2 and elsewhere herein.
[0036] Power may be used for any appropriate purpose. In some examples, emitter 210 may use power to emit light at LED 214. In some examples, sensor 220 may use power to operate CCD/CMOS 222 and/or communication device 230.

[0037] A power supply 310 may be configured as a substantially circular and/or semicircular circuit, circumferentially located toward the extents of contact lens 201. In some examples, power supply 310 may not be visible to the wearer of contact lens 201. In some examples, power supply 310 may be visible at the extents of vision and/or at one or more leads 330 coupling power supply 310 to emitter 210 and/or sensor 220. In some examples, power supply 310 may include substantially transparent materials.
[0038] Power supply 310 may be configured to harvest power to operate electronic components of contact lens 201, for example emitter 210 and/or sensor 220. Power supply 310 may also use more conventional techniques, such as battery power, either alone or in combination with power harvesting, which may provide uninterrupted and/or conditioned power.
[0039] In some examples, power supply 310 may be operable to harvest power from any suitable sources, for example solar power, kinetic energy, thermoelectric power, rectifying antenna, or the like. In some examples, power supply 310 may include a radio frequency (RF) antenna configured to harvest RF energy.
[0040] In some examples, power supply 310 may incorporate an antenna or may be used as an antenna as appropriate. In some examples,
communication device 230 may be configured to use power supply 310 as an antenna. In some examples, power supply 310 may include communication device 230.

[0041] Fig. 4 illustrates a cutaway view of an example contact lens 401 including a groove 410, arranged in accordance with at least some embodiments of the present disclosure. Fig. 3, discussed above, describes contact lens 201 arranged to include emitter 210 and sensor 220 on substantially opposite sectors of contact lens 201. In contrast, Fig. 4 depicts contact lens 401 having an annular groove, groove 410, wherein groove 410 may be arranged to emit light and receive reflections as described herein and throughout. In general, the light emitted from groove 410 may function as described above, particularly with respect to Fig. 1, so that the emitted light may project reflections associated with an accommodation state of a natural lens of an eye wearing contact lens 401, and the received reflections may be processed to determine the
accommodation state of the natural lens. Groove 410 may be an annular groove formed on the contact lens and may permit determination of the focal distance of the eye wearing contact lens 401.
[0042] In some examples, an IR emitter including an infrared laser diode may be configured to provide IR light to groove 410. In some examples, the IR laser diode may use very low power. Groove 410 may act as a semi-reflex lens, and may be configured to emit a portion of light and/or conduct a portion of light. A first portion of the provided IR light may be emitted from groove 410 and may reflect off one or more surfaces of an eye wearing contact lens 401. The reflected IR light may be received at groove 410 to provide a first reflected signal. A second portion of the provided IR light may undergo total internal reflection along groove 410 to provide a second reflected signal. An IR sensor coupled to groove 410 may be configured to detect the first and second reflected signals. In some examples, the first and second reflected signals may produce interference, which may be detected at the IR sensor. In some examples, the first and second reflected signals, including interference between the signals, may comprise focal distance data, which may in turn be processed to determine a focal distance of the eye based in part on the interference between the first and second reflected signals.
[0043] In some examples, the interference between the first and second reflected signals may depend at least in part on the distance traveled by the first emitted portion of light as reflected by the lens of the eye wearing contact lens 401. As the lens changes accommodation (i.e., changes focal distance), the reflected light may travel a shorter or longer distance, thereby correspondingly affecting the interference detected by the sensor. In some examples, measuring the interference may permit determination of the focal distance of the eye wearing contact lens 401.
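The dependence of the detected signal on the optical path difference can be illustrated with the standard two-beam interference formula. The 850 nm wavelength and unit signal intensities below are hypothetical; the disclosure does not specify them.

```python
import math

def interference_intensity(path_diff_nm, wavelength_nm=850.0, i1=1.0, i2=1.0):
    """Detected intensity of two coherent IR signals (the eye-reflected
    portion and the groove-conducted portion) as a function of their
    optical path difference.

    path_diff_nm: optical path difference between the two signals, in nm.
    wavelength_nm, i1, i2: hypothetical wavelength and signal intensities.
    Uses I = I1 + I2 + 2*sqrt(I1*I2)*cos(2*pi*dL/lambda).
    """
    phase = 2.0 * math.pi * path_diff_nm / wavelength_nm
    return i1 + i2 + 2.0 * math.sqrt(i1 * i2) * math.cos(phase)
```

As accommodation changes the reflected path length, the detected intensity sweeps through interference fringes (from fully constructive at zero path difference to fully destructive at half a wavelength); tracking the fringes indicates how far the path, and hence the accommodation, has changed.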
[0044] In some examples, groove 410 may be formed on either a top or a bottom surface of contact lens 401 as appropriate. In some examples, groove 410 may behave similarly to an optical fiber, acting, for example, as a waveguide for at least a portion of light transmitted via groove 410. In some examples, the groove cross-sectional shape may be any suitable shape, including semi-circular, rectangular, diamond, or the like. In some examples, emitted light may originate from groove 410 and/or any suitable surface of contact lens 401.

[0045] Fig. 5 illustrates a flow diagram of an example method 500 for measuring eyeball focus, arranged in accordance with at least some embodiments of the present disclosure. In general, method 500 may be performed by any suitable device, devices, or systems such as those discussed herein. More specifically, method 500 may be performed by a contact lens to measure eyeball focus.
[0046] Method 500 sets forth various functional blocks or actions that may be described as processing steps, functional operations, events and/or acts, etc., which may be performed by hardware, software, and/or firmware. Numerous alternatives to the functional blocks shown in Fig. 5 may be practiced in various implementations. For example, intervening actions not shown and/or additional actions not shown may be employed and/or some of the actions shown may be eliminated, without departing from the scope of claimed subject matter. Method 500 may include one or more of functional operations as indicated by one or more of blocks 502, 504, 506, and/or 508. The process of method 500 may begin at block 502.
[0047] At block 502, "Emit IR light from an emitter disposed on a contact lens", an emitter disposed on the contact lens may emit IR light. In general, any suitable type or types of light may be emitted by any suitable technique or techniques. In some examples, IR light may be emitted as may be further discussed with respect to Figs. 1-4 and elsewhere herein, particularly with respect to emitter 210 and LED 214. In some examples, the emitted light may comprise wavelengths outside the range of human vision, for example IR light. In some examples, the emitted IR light may be configured to create images PS3 116 and/or PS4 118. Process of method 500 may continue from block 502 to block 504.
[0048] At block 504, "Detect reflections of the IR light at a sensor on the contact lens", a sensor on the contact lens may detect reflections of the IR light. In some examples, the sensor may operate as may be further discussed with respect to Figs. 2-4 and elsewhere herein, particularly with respect to sensor 220 and/or CCD/CMOS 222. In some examples, the detected IR light may be associated with one or more locations on the sensor and may correspond to locations of images PS3 116 and/or PS4 118. Accordingly, in some examples, the locations of the detected IR light may be used to determine a focal distance of an eye wearing the contact lens. In some examples, detected data may be provided in a variety of formats to allow the determination of the focal distance, including formats such as unprocessed focal distance data, identified locations of images PS3 116 and/or PS4 118, an indication of focal distance, and/or any suitable focal distance data. Process of method 500 may continue from block 504 to block 506.
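Reducing a linear sensor readout to spot locations, as block 504 implies, might be sketched as follows: contiguous runs of bright samples are collapsed to intensity-weighted centroids. The threshold scheme and sensor-element units are hypothetical assumptions, not part of the disclosure.

```python
def peak_locations(samples, threshold=0.5):
    """Locate reflection spots on a linear CCD/CMOS readout.

    samples: per-element intensity readings along the sensor line.
    threshold: hypothetical fraction of the maximum reading; elements at
        or above threshold * max(samples) are treated as part of a spot.
    Returns the intensity-weighted centroid of each contiguous bright
    run, in sensor-element units.
    """
    cutoff = threshold * max(samples)
    peaks, run = [], []
    for i, v in enumerate(samples):
        if v >= cutoff:
            run.append((i, v))      # extend the current bright run
        elif run:
            total = sum(v for _, v in run)
            peaks.append(sum(i * v for i, v in run) / total)
            run = []
    if run:  # flush a run that ends at the last element
        total = sum(v for _, v in run)
        peaks.append(sum(i * v for i, v in run) / total)
    return peaks
```

The separation between the two returned centroids would then stand in for the PS3/PS4 separation used elsewhere in this description.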
[0049] At block 506, "Provide focal distance data associated with the eye from a communication device", focal distance data associated with the eye wearing the contact lens may be provided via a communication device. In some examples, the communication device may operate as may be further discussed with respect to Figs. 2-4 and elsewhere herein, particularly with respect to communication device 230. Focal distance data may be provided by communication techniques including radio communication, wired communication, optical communication, and/or any suitable communication technique or techniques. Process of method 500 may optionally continue from block 506 to block 508 or stop after block 506.
[0050] At block 508, "Process focal distance data at a control circuit", focal distance data may be processed at a control circuit. In general, block 508 may be optional. Processing the focal distance data may include determining a focal distance based in part on one or more detected reflections of IR light. In some examples, the control circuit may determine the focal distance based at least in part on the distance separating images PS3 and PS4, for example as may be detected at block 504. In some examples, the focal distance data may already include the focal distance, and accordingly determining the focal distance may not require further processing and/or interpretation of focal distance data.
[0051] In some examples, the control circuit may be disposed on the contact lens. In some examples, the control circuit may be located somewhere other than the contact lens. In some examples, the control circuit may be included in a mobile electronic device, a PDA, a computer, a cloud-based computing device, or the like. In general, the control circuit may be configured to process the focal distance data.
[0052] In some examples, process 500 including blocks 502, 504, 506, and/or 508 may occur in real time.
[0053] In some examples, processing focal distance data may include calibrating the determined focal distance to the perception of the contact lens wearer. In general, calibration may be optional. In some examples, calibration may occur as infrequently as once for a wearer of the contact lens, in some examples upon the wearer's first use of the contact lens. In some examples, calibration may be repeated, for example repeated to calibrate different focal distances or for example repeated at different times.
[0054] Process of method 500 may stop after block 508. In some
examples, process of method 500 may be repeated, beginning again at block 502, or the like. In some examples, process of method 500 may repeat without performing optional block 508, and may continue from block 506 to block 502, or the like.
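The flow of blocks 502 through 508, including the repeat path back to block 502, can be summarized as a simple loop. The callables below are hypothetical stand-ins for the emitter, sensor, communication device, and control circuit described above; the sketch mirrors only the control flow of method 500, not any disclosed implementation.

```python
def method_500_loop(emit, detect, transmit, process=None, cycles=3):
    """Sketch of method 500 as a loop: emit IR light (block 502), detect
    reflections (block 504), provide focal distance data (block 506),
    and optionally process it at a control circuit (block 508).

    emit, detect, transmit, process: hypothetical callables standing in
        for the hardware of Figs. 1-4; process may be None (block 508
        is optional).
    cycles: number of repetitions of the method.
    """
    results = []
    for _ in range(cycles):
        emit()                       # block 502
        data = detect()              # block 504
        transmit(data)               # block 506
        if process is not None:      # block 508 (optional)
            results.append(process(data))
    return results
```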
[0055] Fig. 6 illustrates a flow diagram of an example method 600 for calibrating a determined focal distance with the perception of the individual wearing a contact lens, arranged in accordance with at least some embodiments of the present disclosure. In general, method 600 may be performed by any suitable device, devices, or systems such as those discussed herein. In some examples, method 600 may process focal distance data that may be provided via, for example, the communication device of Fig. 5 at block 506, though method 600 is not limited to this purpose. In some examples, method 600 may be similar to, encompass, and/or represent a portion of Fig. 5 at block 508, "Process focal distance data at a control circuit". In some examples, as a result of this and/or other calibration processes described herein, the associated data may permit an AR image device to display information to be perceived at the contact lens wearer's current focal distance.

[0056] Method 600 sets forth various functional blocks or actions that may be described as processing steps, functional operations, events and/or acts, etc., which may be performed by hardware, software, and/or firmware. Numerous alternatives to the functional blocks shown in Fig. 6 may be practiced in various implementations. For example, intervening actions not shown and/or additional actions not shown may be employed and/or some of the actions shown may be eliminated, without departing from the scope of claimed subject matter. Method 600 may include one or more of functional operations as indicated by one or more of blocks 602 and/or 604. The process of method 600 may begin at block 602.
[0057] At block 602, "Determine a focal distance at a control circuit", a focal distance associated with an eye of the wearer of the contact lens may be determined at a control circuit. As described herein, particularly with respect to Fig. 5 and block 508, the focal distance may be determined based in part on processing focal distance data. Process of method 600 may continue from block 602 to block 604.
[0058] At block 604, "Calibrate the association between accommodation of the eye and the focal distance", an association between accommodation of the eye and the focal distance may be calibrated. In general, human eyes may be unique, and the diopter of one eye may differ from that of another eye. In some examples, two different people exhibiting similar measured accommodation of the eye, for example when a distance between PS3 116 and PS4 118 images may be similar, may perceive focus at different focal distances. Likewise, in some examples, different people focused on the same object at the same distance may exhibit different measured accommodations of the eye. These differences may be measured, such as at block 602, and calibrated for the individual wearing the contact lens.
[0059] In some examples, it may not be practical to provide a generalized association between the accommodation of the contact lens wearer's eye and the focal distance perceived by the person. In some examples, calibration may include a best fit curve to associate the accommodation of the eye at one or more distances with one or more focal distances perceived at those distances. In some examples, the calibrated distances may include distances such as 25 cm, 33 cm, 1 m, 2 m, 5 m, etc., which correspond to diopters of 4D, 3D, 1D, 0.5D, and 0.2D, respectively. When these distances are associated with corresponding detected accommodations of the eye, the resulting data may be used to develop a best fit relationship between a measured indication of accommodation and the perceived focal distance of the subject.
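The best fit described here can be sketched as an ordinary least-squares line relating a measured accommodation indication (here, a hypothetical PS3/PS4 separation in micrometers) to perceived diopters, where diopters are the reciprocal of distance in meters. The separation measurements and the linear model are illustrative assumptions; the disclosure leaves the fitting method open.

```python
def fit_calibration(separations_um, distances_m):
    """Least-squares linear fit mapping a measured PS3/PS4 separation to
    perceived accommodation in diopters (1 / distance in meters).

    separations_um: measured separations at each calibration distance
        (hypothetical values and units).
    distances_m: the known calibration distances, in meters.
    Returns (slope, intercept) of the fitted line in diopters.
    """
    diopters = [1.0 / d for d in distances_m]
    n = len(separations_um)
    mean_x = sum(separations_um) / n
    mean_y = sum(diopters) / n
    sxx = sum((x - mean_x) ** 2 for x in separations_um)
    sxy = sum((x - mean_x) * (y - mean_y)
              for x, y in zip(separations_um, diopters))
    slope = sxy / sxx
    return slope, mean_y - slope * mean_x
```

With the distances listed in paragraph [0059] (0.25 m, 0.33 m, 1 m, 2 m, 5 m), the diopter targets come out as approximately 4D, 3D, 1D, 0.5D, and 0.2D, and the fitted line can then translate any later separation measurement into a perceived focal distance for this wearer.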
[0060] Calibration may be done when appropriate, for example when the contact lens wearer wears the contact lens for the first time, occasionally, and/or periodically.
[0061] The AR image device may include any appropriate AR image device. In some examples, the AR image device may be disposed on the contact lens. In some examples, the AR image device may include a head mounted display. In some examples, the AR image device may include one or more fixed display units, including, for example, projectors and/or displays.

[0062] Process of method 600 may end at block 604. In some examples, process of method 600 may be repeated, beginning again at block 602.
[0063] Fig. 7 illustrates a flow diagram of an example method 700 for calibrating a known object distance to a measured focal distance, arranged in accordance with at least some embodiments of the present disclosure. In general, method 700 may be performed by any suitable device, devices, or systems such as those discussed herein.
[0064] Method 700 sets forth various functional blocks or actions that may be described as processing steps, functional operations, events and/or acts, etc., which may be performed by hardware, software, and/or firmware. Numerous alternatives to the functional blocks shown in Fig. 7 may be practiced in various implementations. For example, intervening actions not shown and/or additional actions not shown may be employed and/or some of the actions shown may be eliminated, without departing from the scope of claimed subject matter. Method 700 may include one or more of functional operations as indicated by one or more of blocks 702, 704, and/or 706. The process of method 700 may begin at block 702.
[0065] At block 702, "Provide AR image focused at a known focal distance", an AR image focused at a known focal distance may be provided. In some examples, the AR image may be provided by an augmented reality imaging device or any suitable alternative. In some examples, the augmented reality imaging device may project, overlay, and/or display computer generated data onto a view of the surroundings, or the like, which may permit viewing data without requiring a user to look away from a scene.
[0066] In some examples, the provided AR image may have the appearance of a target, or the like. In some examples, the provided AR image may be displayed so as to be perceived at a known distance. In some examples, an AR imaging device may provide an image to be perceived at any suitable distance, including at infinity. Process of method 700 may continue from block 702 to block 704.
[0067] At block 704, "Determine the accommodation of the eye focused on the AR image", an accommodation of an eye wearing a contact lens may be determined when focused on the AR image. In general, the accommodation of the eye may correspond to a distance between images PS3 and PS4. In some examples, the contact lens may determine the accommodation of the eye as discussed herein, particularly with respect to Figs. 1-5 and elsewhere herein. For example, contact lens 201 may emit IR light, detect reflections of the IR light, and/or provide focal distance data that may be processed to determine the accommodation of the eye. A user may focus on the target and the accommodation of the eye may be determined. Process of method 700 may continue from block 704 to block 706.
[0068] At block 706, "Associate the accommodation of the eye to the focal distance", the accommodation of the eye when focused on the target may be associated with the known focal distance of the provided AR image. In some examples, the association may be stored for later use and/or processed immediately.
[0069] As discussed herein, the AR image device may include any appropriate AR image device.
[0070] In some examples, process of method 700 may be repeated as desired at different focal distances to perform best fit analysis over a range of focal distances. In some examples, process of method 700 may repeat at block 702. Process of method 700 may stop after block 706.
[0071] Fig. 8 illustrates a flow diagram of an example method 800 for associating a particular focal distance to an adjusted object distance, arranged in accordance with at least some embodiments of the present disclosure. In some examples, the associated data may permit the detection of a current focal distance of an eye to allow an AR image device to display information to be perceived at the same focal distance.
[0072] Method 800 sets forth various functional blocks or actions that may be described as processing steps, functional operations, events and/or acts, etc., which may be performed by hardware, software, and/or firmware. Numerous alternatives to the functional blocks shown in Fig. 8 may be practiced in various implementations. For example, intervening actions not shown and/or additional actions not shown may be employed and/or some of the actions shown may be eliminated, without departing from the scope of claimed subject matter. Method 800 may include one or more of functional operations as indicated by one or more of blocks 802, 804, 806, and/or 808. The process of method 800 may begin at block 802.
[0073] At block 802, "Determine the accommodation of the eye at a first focal distance", a user may focus the eye at a first focal distance and an accommodation of the eye may be determined when focused at the first focal distance. In some examples, the first focal distance may be any suitable distance. In some examples, the user may focus on an object in a viewport, for example a wall or the like. In some examples, determining the accommodation of the eye may proceed as may be further discussed with respect to Fig. 7 and elsewhere herein, particularly with respect to block 704. In some examples, the user may not substantially change their gaze and/or focal distance for the duration of method 800; in some examples, block 802 may be performed at any time prior to block 808. Process of method 800 may continue from block 802 to block 804.
[0074] At block 804, "Provide AR image", an AR image may be provided. As described herein, and with respect to Fig. 7 above, particularly block 702, the AR image may be provided at any suitable focal distance. In some examples, the provided AR image may not initially coincide with the first focal distance. Process of method 800 may continue from block 804 to block 806.
[0075] At block 806, "Provide a user interface to adjust the displayed focal distance of the AR image", a user interface may be provided so that the user may adjust the focal distance of the displayed AR image. In general, the user may adjust the provided AR image until the AR image coincides with the first focal distance. Process of method 800 may continue from block 806 to block 808.

[0076] At block 808, "Associate the accommodation of the eye to the adjusted focal distance of the AR image", the accommodation of the eye, as determined at block 802, may be associated with the adjusted focal distance of the AR image. In some examples, this association may be used as described herein to calibrate an AR imaging device with the perception of an individual person. In some examples, as described herein, this association may be repeated as necessary at different focal distances to perform best fit analysis over a range of focal distances.
[0077] As discussed herein, the AR image device may include any appropriate AR image device.
[0078] In some examples, process of method 800 may be repeated as desired at different focal distances to perform best fit analysis over a range of focal distances. In some examples, process of method 800 may repeat at block 802. In some examples, process of method 800 may stop after block 808.
[0079] Fig. 9 illustrates an example computer program product 900, arranged in accordance with at least some embodiments of the present disclosure. Computer program product 900 may include a machine-readable non-transitory medium having stored therein instructions that, when executed, may operatively enable a computing device to measure eyeball focus and/or calibrate measured focus data according to the processes and methods discussed herein. Computer program product 900 may include a signal bearing medium 902.
Signal bearing medium 902 may include one or more machine-readable instructions 904, which, when executed by one or more processors, may operatively enable a computing device to provide the functionality described herein. In various examples, some or all of the machine-readable instructions may be used by the devices discussed herein.
[0080] In some implementations, signal bearing medium 902 may encompass a computer-readable medium 906, such as, but not limited to, a hard disk drive, a Compact Disc (CD), a Digital Versatile Disk (DVD), a digital tape, memory, etc. In some implementations, signal bearing medium 902 may encompass a recordable medium 908, such as, but not limited to, memory, read/write (R/W) CDs, R/W DVDs, etc. In some implementations, signal bearing medium 902 may encompass a communications medium 910, such as, but not limited to, a digital and/or an analog communication medium (e.g., a fiber optic cable, a waveguide, a wired communication link, a wireless communication link, etc.). In some examples, signal bearing medium 902 may encompass a machine readable non-transitory medium.
[0081] Fig. 10 is a block diagram illustrating an example computing device 1000, arranged in accordance with at least some embodiments of the present disclosure. In various examples, computing device 1000 may be configured to measure eyeball focus and/or calibrate measured focus data as discussed herein. In one example basic configuration 1001, computing device 1000 may include one or more processors 1010 and system memory 1020. A memory bus 1030 can be used for communicating between the processor 1010 and the system memory 1020.

[0082] Depending on the desired configuration, processor 1010 may be of any type including but not limited to a microprocessor (μP), a microcontroller (μC), a digital signal processor (DSP), or any combination thereof. Processor 1010 can include one or more levels of caching, such as a level one cache 1011 and a level two cache 1012, a processor core 1013, and registers 1014. The processor core 1013 can include an arithmetic logic unit (ALU), a floating point unit (FPU), a digital signal processing core (DSP Core), or any combination thereof. A memory controller 1015 can also be used with the processor 1010, or in some
implementations the memory controller 1015 can be an internal part of the processor 1010.
[0083] Depending on the desired configuration, the system memory 1020 may be of any type including but not limited to volatile memory (such as RAM), non-volatile memory (such as ROM, flash memory, etc.) or any combination thereof. System memory 1020 may include an operating system 1021, one or more applications 1022, and program data 1024. Application 1022 may include eyeball focus measuring application 1023 that can be arranged to perform the functions, actions, and/or operations as described herein including the functional blocks, actions, and/or operations described herein. Program data 1024 may include eyeball focus measuring data 1025 for use with eyeball focus measuring application 1023. In some example embodiments, application 1022 may be arranged to operate with program data 1024 on an operating system 1021. This described basic configuration is illustrated in Fig. 10 by those components within dashed line 1001.

[0084] Computing device 1000 may have additional features or
functionality, and additional interfaces to facilitate communications between the basic configuration 1001 and any required devices and interfaces. For example, a bus/interface controller 1040 may be used to facilitate communications between the basic configuration 1001 and one or more data storage devices 1050 via a storage interface bus 1041. The data storage devices 1050 may be removable storage devices 1051, non-removable storage devices 1052, or a combination thereof. Examples of removable storage and non-removable storage devices include magnetic disk devices such as flexible disk drives and hard-disk drives (HDD), optical disk drives such as compact disk (CD) drives or digital versatile disk (DVD) drives, solid state drives (SSD), and tape drives, to name a few. Example computer storage media may include volatile and nonvolatile, removable and non-removable media implemented in any method or technology for storage of information, such as computer readable instructions, data structures, program modules, or other data.
[0085] System memory 1020, removable storage 1051, and non-removable storage 1052 are all examples of computer storage media. Computer storage media includes, but is not limited to, RAM, ROM, EEPROM, flash memory or other memory technology, CD-ROM, digital versatile disks (DVD) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other medium which may be used to store the desired information and which may be accessed by computing device 1000. Any such computer storage media may be part of device 1000.

[0086] Computing device 1000 may also include an interface bus 1042 for facilitating communication from various interface devices (e.g., output interfaces, peripheral interfaces, and communication interfaces) to the basic configuration 1001 via the bus/interface controller 1040. Example output interfaces 1060 may include a graphics processing unit 1061 and an audio processing unit 1062, which may be configured to communicate to various external devices such as a display or speakers via one or more A/V ports 1063. Example peripheral interfaces 1070 may include a serial interface controller 1071 or a parallel interface controller 1072, which may be configured to communicate with external devices such as input devices (e.g., keyboard, mouse, pen, voice input device, touch input device, etc.) or other peripheral devices (e.g., printer, scanner, etc.) via one or more I/O ports 1073. An example communication interface 1080 includes a network controller 1081, which may be arranged to facilitate communications with one or more other computing devices 1083 over a network communication via one or more communication ports 1082. A communication connection is one example of communication media.
Communication media may typically be embodied by computer readable instructions, data structures, program modules, or other data in a modulated data signal, such as a carrier wave or other transport mechanism, and may include any information delivery media. A "modulated data signal" may be a signal that has one or more of its characteristics set or changed in such a manner as to encode information in the signal. By way of example, and not limitation, communication media may include wired media such as a wired network or direct-wired connection, and wireless media such as acoustic, radio frequency (RF), infrared (IR) and other wireless media. The term computer readable media as used herein may include both storage media and communication media.
[0087] Computing device 1000 may be implemented as a portion of a small-form factor portable (or mobile) electronic device such as a cell phone, a mobile phone, a tablet device, a laptop computer, a personal data assistant (PDA), a personal media player device, a wireless web-watch device, a personal headset device, an application specific device, or a hybrid device that includes any of the above functions. Computing device 1000 may also be implemented as a personal computer including both laptop computer and non-laptop computer configurations. In addition, computing device 1000 may be implemented as part of a wireless base station or other wireless system or device.
[0088] Some portions of the foregoing detailed description are presented in terms of algorithms or symbolic representations of operations on data bits or binary digital signals stored within a computing system memory, such as a computer memory. These algorithmic descriptions or representations are examples of techniques used by those of ordinary skill in the data processing arts to convey the substance of their work to others skilled in the art. An algorithm is here, and generally, considered to be a self-consistent sequence of operations or similar processing leading to a desired result. In this context, operations or processing involve physical manipulation of physical quantities. Typically, although not necessarily, such quantities may take the form of electrical or magnetic signals capable of being stored, transferred, combined, compared or otherwise manipulated. It has proven convenient at times, principally for reasons of common usage, to refer to such signals as bits, data, values, elements, symbols, characters, terms, numbers, numerals or the like. It should be understood, however, that all of these and similar terms are to be associated with appropriate physical quantities and are merely convenient labels. Unless specifically stated otherwise, as apparent from the following discussion, it is appreciated that throughout this specification discussions utilizing terms such as "processing," "computing," "calculating," "determining" or the like refer to actions or processes of a computing device that manipulates or transforms data represented as physical electronic or magnetic quantities within memories, registers, or other information storage devices, transmission devices, or display devices of the computing device.
[0089] The foregoing detailed description has set forth various
embodiments of the devices and/or processes via the use of block diagrams, flowcharts, and/or examples. Insofar as such block diagrams, flowcharts, and/or examples contain one or more functions and/or operations, it will be understood by those within the art that each function and/or operation within such block diagrams, flowcharts, or examples can be implemented, individually and/or collectively, by a wide range of hardware, software, firmware, or virtually any combination thereof. In some embodiments, several portions of the subject matter described herein may be implemented via Application Specific Integrated Circuits (ASICs), Field Programmable Gate Arrays (FPGAs), digital signal processors (DSPs), or other integrated formats. However, those skilled in the art will recognize that some aspects of the embodiments disclosed herein, in whole or in part, can be equivalently implemented in integrated circuits, as one or more computer programs running on one or more computers (e.g., as one or more programs running on one or more computer systems), as one or more programs running on one or more processors (e.g., as one or more programs running on one or more microprocessors), as firmware, or as virtually any combination thereof, and that designing the circuitry and/or writing the code for the software and/or firmware would be well within the skill of one skilled in the art in light of this disclosure. In addition, those skilled in the art will appreciate that the mechanisms of the subject matter described herein are capable of being distributed as a program product in a variety of forms, and that an illustrative embodiment of the subject matter described herein applies regardless of the particular type of signal bearing medium used to actually carry out the distribution.
Examples of a signal bearing medium include, but are not limited to, the following: a recordable type medium such as a flexible disk, a hard disk drive (HDD), a Compact Disc (CD), a Digital Versatile Disk (DVD), a digital tape, a computer memory, etc.; and a transmission type medium such as a digital and/or an analog communication medium (e.g., a fiber optic cable, a waveguide, a wired
communication link, a wireless communication link, etc.).
[0090] The herein described subject matter sometimes illustrates different components contained within, or connected with, different other components. It is to be understood that such depicted architectures are merely examples and that in fact many other architectures can be implemented which achieve the same functionality. In a conceptual sense, any arrangement of components to achieve the same functionality is effectively "associated" such that the desired
functionality is achieved. Hence, any two components herein combined to achieve a particular functionality can be seen as "associated with" each other such that the desired functionality is achieved, irrespective of architectures or intermedial components. Likewise, any two components so associated can also be viewed as being "operably connected", or "operably coupled", to each other to achieve the desired functionality, and any two components capable of being so associated can also be viewed as being "operably couplable", to each other to achieve the desired functionality. Specific examples of operably couplable include but are not limited to physically mateable and/or physically interacting components and/or wirelessly interactable and/or wirelessly interacting
components and/or logically interacting and/or logically interactable components.
[0091] With respect to the use of substantially any plural and/or singular terms herein, those having skill in the art can translate from the plural to the singular and/or from the singular to the plural as is appropriate to the context and/or application. The various singular/plural permutations may be expressly set forth herein for sake of clarity.
[0092] It will be understood by those within the art that, in general, terms used herein, and especially in the appended claims (e.g., bodies of the appended claims) are generally intended as "open" terms (e.g., the term "including" should be interpreted as "including but not limited to," the term "having" should be interpreted as "having at least," the term "includes" should be interpreted as "includes but is not limited to," etc.). It will be further understood by those within the art that if a specific number of an introduced claim recitation is intended, such an intent will be explicitly recited in the claim, and in the absence of such recitation no such intent is present. For example, as an aid to understanding, the following appended claims may contain usage of the introductory phrases "at least one" and "one or more" to introduce claim recitations. However, the use of such phrases should not be construed to imply that the introduction of a claim recitation by the indefinite articles "a" or "an" limits any particular claim containing such introduced claim recitation to claimed subject matter containing only one such recitation, even when the same claim includes the introductory phrases "one or more" or "at least one" and indefinite articles such as "a" or "an" (e.g., "a" and/or "an" should typically be interpreted to mean "at least one" or "one or more"); the same holds true for the use of definite articles used to introduce claim recitations. In addition, even if a specific number of an introduced claim recitation is explicitly recited, those skilled in the art will recognize that such recitation should typically be interpreted to mean at least the recited number (e.g., the bare recitation of "two recitations," without other modifiers, typically means at least two recitations, or two or more recitations). Furthermore, in those instances where a convention analogous to "at least one of A, B, and C, etc." 
is used, in general such a construction is intended in the sense one having skill in the art would understand the convention (e.g., "a system having at least one of A, B, and C" would include but not be limited to systems that have A alone, B alone, C alone, A and B together, A and C together, B and C together, and/or A, B, and C together, etc.). In those instances where a convention analogous to "at least one of A, B, or C, etc." is used, in general such a construction is intended in the sense one having skill in the art would understand the convention (e.g., "a system having at least one of A, B, or C" would include but not be limited to systems that have A alone, B alone, C alone, A and B together, A and C together, B and C together, and/or A, B, and C together, etc.). It will be further understood by those within the art that virtually any disjunctive word and/or phrase
presenting two or more alternative terms, whether in the description, claims, or drawings, should be understood to contemplate the possibilities of including one of the terms, either of the terms, or both terms. For example, the phrase "A or B" will be understood to include the possibilities of "A" or "B" or "A and B."
[0093] While certain example techniques have been described and shown herein using various methods and systems, it should be understood by those skilled in the art that various other modifications may be made, and equivalents may be substituted, without departing from claimed subject matter. Additionally, many modifications may be made to adapt a particular situation to the teachings of claimed subject matter without departing from the central concept described herein. Therefore, it is intended that claimed subject matter not be limited to the particular examples disclosed, but that such claimed subject matter also may include all implementations falling within the scope of the appended claims, and equivalents thereof.

Claims

WHAT IS CLAIMED:
1. An apparatus, comprising:
a contact lens configured to be worn on an eye;
an infrared (IR) emitter disposed on the contact lens, the IR emitter configured to emit an infrared light;
an IR sensor disposed on the contact lens, the IR sensor configured to detect one or more reflections of the infrared light; and
a communication device disposed on the contact lens, the communication device communicatively coupled to the IR sensor and configured to provide focal distance data associated with the eye.
2. The apparatus of claim 1, wherein:
the IR emitter is arranged on a first sector of the contact lens; and
the IR sensor is arranged on a second sector of the contact lens, substantially opposite to the first sector, wherein the one or more detected reflections of infrared light include a third Purkinje-Sanson image (PS3) and a fourth Purkinje-Sanson image (PS4).
3. The apparatus of claim 2, further comprising:
a control circuit communicatively coupled to the communication device and configured to determine a focal distance of the eye.
4. The apparatus of claim 3, wherein:
the control circuit is disposed on the contact lens.
5. The apparatus of claim 3, wherein:
the focal distance of the eye is determined based at least in part on a distance between the locations of the detected PS3 and PS4.
6. The apparatus of claim 3, wherein:
the PS3 includes infrared light reflected from an outer surface of a natural lens of the eye;
the PS4 includes infrared light reflected from an inner surface of the natural lens; and
a distance between the PS3 and PS4 indicates an accommodation of the eye associated with the focal distance of the eye.
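Claims 5 and 6 tie the separation between the detected PS3 and PS4 to the eye's accommodation and hence to its focal distance, but they do not recite a formula. The sketch below is one illustrative reading, not the claimed method: a calibrated lookup table maps image separation in sensor pixels to accommodation in diopters, and focal distance is taken as the reciprocal of accommodation. The function names and calibration values are assumptions, not disclosed in the application.

```python
# Illustrative only: the calibration pairs are assumed values, not figures
# from the application. Separation shrinks as the crystalline lens
# accommodates, so smaller PS3-PS4 spacing maps to higher diopters here.

def _interp(x, table):
    """Piecewise-linear interpolation over a table sorted by x, clamped at the ends."""
    if x <= table[0][0]:
        return table[0][1]
    if x >= table[-1][0]:
        return table[-1][1]
    for (x0, y0), (x1, y1) in zip(table, table[1:]):
        if x0 <= x <= x1:
            return y0 + (y1 - y0) * (x - x0) / (x1 - x0)

def focal_distance_m(ps3_px, ps4_px, calibration):
    """Estimate focal distance (m) from the PS3/PS4 image positions (pixels).

    `calibration` is a sorted list of (separation_px, accommodation_diopters)
    pairs obtained beforehand (e.g. by the AR procedure of claims 18-19).
    Focal distance is taken as the reciprocal of accommodation.
    """
    separation = abs(ps4_px - ps3_px)
    diopters = _interp(separation, calibration)
    return float("inf") if diopters <= 0 else 1.0 / diopters

# Hypothetical table: 40 px apart ~ 4 D (0.25 m), 80 px apart ~ 0.5 D (2 m).
CAL = [(40.0, 4.0), (60.0, 2.0), (80.0, 0.5)]
```

Under these assumed values, a 50 px separation interpolates to 3 diopters, i.e. a focal distance of roughly 0.33 m.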
7. The apparatus of claim 2, wherein:
the IR sensor includes at least one of a CMOS or a CCD.
8. The apparatus of claim 2, wherein:
the IR sensor includes a filter configured to attenuate wavelengths of light other than IR.
9. The apparatus of claim 2, wherein:
the IR sensor includes a linear array of photoreceptors and a
corresponding linear array of microlenses to form images of the emitted infrared light.
10. The apparatus of claim 3, wherein:
the emitted infrared light includes two or more distinct wavelengths of infrared light arranged in a first orientation;
the IR sensor includes a linear array of microlenses and a corresponding linear array of photoreceptors to form images of the emitted infrared light;
the control circuit is configured to determine the location of the PS3 based at least in part on using the linear array of photoreceptors to detect the location of a first reflected image in a first orientation; and
the control circuit is configured to determine the location of the PS4 based at least in part on using the linear array of photoreceptors to detect the location of a second reflected image in a second orientation substantially mirrored from the first orientation.
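Claim 10 distinguishes the two reflections by orientation: the emitted pattern uses two or more distinct IR wavelengths in a known order, and the PS4 image, which the claim detects in a substantially mirrored orientation, presents that order reversed on the linear photoreceptor array. A minimal classification sketch follows; the wavelengths are illustrative stand-ins, since the claim requires only "two or more distinct wavelengths" in a first orientation.

```python
# Hypothetical wavelength pattern (nm); the application does not specify
# the wavelengths, only that they are distinct and ordered.
EMITTED_ORDER_NM = [850, 905, 940]

def classify_reflection(detected_order_nm):
    """Label a reflected image by comparing its wavelength order, as read
    off the linear photoreceptor array, against the emitted orientation."""
    if detected_order_nm == EMITTED_ORDER_NM:
        return "PS3"        # same orientation as the emitted pattern
    if detected_order_nm == EMITTED_ORDER_NM[::-1]:
        return "PS4"        # substantially mirrored orientation
    return "unknown"        # e.g. a stray corneal reflection
```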
11. The apparatus of claim 2, further comprising:
a power supply disposed on the contact lens, the power supply including a radio frequency (RF) antenna configured to harvest RF energy.
12. The apparatus of claim 1, further comprising:
a control circuit communicatively coupled to the communication device;
an annular groove formed on the contact lens; and
the IR emitter including an infrared laser diode configured to provide infrared light to the annular groove, wherein:
a first portion of the provided infrared light is configured to emit from the annular groove to provide a first reflected signal;
a second portion of the provided infrared light is configured to conduct a total reflection along the annular groove to provide a second reflected signal;
the IR sensor is coupled to the annular groove and configured to detect the first and second reflected signal; and
the control circuit is configured to determine a focal distance of the eye based in part on the interference between the first and second reflected signal.
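Claim 12 determines focal distance from the interference between a first signal (emitted from the annular groove and reflected by the eye) and a second signal (totally reflected along the groove). The application does not recite the interference relation; the sketch below shows only the standard two-beam interference formula a control circuit could use to relate the combined intensity to a path-length difference. The function name and parameters are illustrative assumptions.

```python
import math

def interference_intensity(i1, i2, path_diff_m, wavelength_m):
    """Two-beam interference: I = I1 + I2 + 2*sqrt(I1*I2)*cos(2*pi*d/lambda).

    `path_diff_m` is the optical path difference between the first reflected
    signal (emitted from the groove, returned by the eye) and the second
    (totally reflected along the groove); it varies with accommodation.
    """
    phase = 2.0 * math.pi * path_diff_m / wavelength_m
    return i1 + i2 + 2.0 * math.sqrt(i1 * i2) * math.cos(phase)
```

Equal-intensity beams with zero path difference combine to four times the single-beam intensity, while a half-wavelength difference cancels them, so sweeping the measured intensity against this relation recovers the path difference modulo one wavelength.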
13. A method comprising:
emitting infrared (IR) light from an IR emitter disposed on a contact lens, the contact lens configured to be worn on an eye;
detecting one or more reflections of the infrared light at an IR sensor disposed on the contact lens; and
providing focal distance data associated with the eye from a communication device disposed on the contact lens communicatively coupled to the IR sensor.
14. The method of claim 13, wherein:
the IR emitter is arranged on a first sector of the contact lens; and
the IR sensor is arranged on a second sector of the contact lens, substantially opposite to the first sector, wherein the one or more detected reflections of the infrared light include a third Purkinje-Sanson image (PS3) and a fourth Purkinje-Sanson image (PS4).
15. The method of claim 14, further comprising:
determining a focal distance of the eye at a control circuit communicatively coupled to the communication device.
16. The method of claim 15, wherein:
the focal distance of the eye is determined based at least in part on the locations of the PS3 and the PS4.
17. The method of claim 15, wherein:
the PS3 is the infrared light reflected from an outer surface of a natural lens of the eye;
the PS4 is the infrared light reflected from the inner surface of the natural lens; and
the distance between the PS3 and PS4 indicates an accommodation of the eye and is associated with the focal distance of the eye.
18. The method of claim 17, further comprising:
calibrating the association between the accommodation of the eye and the focal distance of the eye by:
providing an augmented reality (AR) image focused at a first known focal distance;
determining a first accommodation of the eye focused on the AR image at the first known focal distance;
associating the first accommodation of the eye to the first known focal distance.
19. The method of claim 17, further comprising:
calibrating the association between the accommodation of the eye and the focal distance of the eye by:
providing an augmented reality (AR) image focused at a first focal distance;
providing a user interface to adjust the focal distance of the AR image to appear focused at a second focal distance;
determining a first accommodation of the eye focused on the AR image at the second focal distance;
associating the first accommodation of the eye to the second focal distance.
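Claims 18 and 19 recite a calibration procedure: present an AR image at a known (or user-adjusted) focal distance, measure the eye's accommodation while it fixates the image, and associate the pair. A sketch under assumed names follows; `measure_accommodation` is a hypothetical stand-in for the sensing pipeline (e.g. reading the PS3-PS4 separation), which the claims leave unspecified.

```python
# Sketch of the claim 18/19 calibration loop. `measure_accommodation` is a
# hypothetical callable that samples the eye's accommodation while the
# wearer fixates an AR stimulus focused at the given distance.

def calibrate(known_distances_m, measure_accommodation):
    """Associate a measured accommodation with each known focal distance.

    Returns (accommodation, focal_distance_m) pairs sorted by accommodation,
    suitable for later interpolation when converting sensor readings back
    to focal distances.
    """
    pairs = []
    for distance_m in known_distances_m:
        # Present the AR image focused at distance_m, then sample the sensor.
        accommodation = measure_accommodation(distance_m)
        pairs.append((accommodation, distance_m))
    return sorted(pairs)
```

For example, driving the loop with stimuli at 2 m, 1 m, and 0.5 m yields a monotonic table that the control circuit can invert at run time.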
20. The method of claim 15, further comprising:
determining, at the control circuit, the location of the PS3 based at least in part on using the linear array of photoreceptors to detect the location of a first reflected image in a first orientation; and
determining, at the control circuit, the location of the PS4 based at least in part on using the linear array of photoreceptors to detect the location of a second reflected image in a second orientation substantially mirrored from the first orientation, wherein:
the emitted infrared light includes two or more distinct wavelengths of infrared light arranged in a first orientation; and
the IR sensor includes a linear array of microlenses and a corresponding linear array of photoreceptors to form images of the emitted infrared light.
21. The method of claim 13, wherein an annular groove is formed on the contact lens and the IR emitter includes an infrared laser diode, the method further comprising:
providing infrared light from the infrared laser diode to the annular groove;
emitting a first portion of the provided infrared light from the annular groove to provide a first reflected signal;
conducting a total reflection of a second portion of the provided infrared light along the annular groove to provide a second reflected signal;
detecting the first and second reflected signal at the IR sensor coupled to the annular groove; and
determining a focal distance of the eye at a control circuit communicatively coupled to the communication device based in part on the interference between the first and second reflected signal.
22. An article comprising:
a machine readable non-transitory medium having stored therein a plurality of instructions that, when executed, cause a machine to:
emit infrared (IR) light from an IR emitter disposed on a contact lens, the contact lens configured to be worn on an eye;
detect one or more reflections of the infrared light at an IR sensor disposed on the contact lens; and
provide focal distance data associated with the eye from a communication device disposed on the contact lens communicatively coupled to the IR sensor.
23. The article of claim 22, wherein:
the IR emitter is arranged on a first sector of the contact lens; and
the IR sensor is arranged on a second sector of the contact lens, substantially opposite to the first sector, wherein the detected reflection of the infrared light includes a third Purkinje-Sanson image (PS3) and a fourth Purkinje-Sanson image (PS4).
24. The article of claim 23, wherein the plurality of instructions, when executed, further cause the machine to:
determine a focal distance of the eye at a control circuit communicatively coupled to the communication device;
provide an augmented reality (AR) image focused at a first focal distance;
provide a user interface to adjust the focal distance of the AR image to appear focused at a second focal distance;
determine a first accommodation of the eye focused on the AR image at the second focal distance; and
associate the first accommodation of the eye to the second focal distance, wherein:
the focal distance of the eye is determined based at least in part on the locations of the PS3 and the PS4;
the PS3 is the infrared light reflected from an outer surface of a natural lens of the eye;
the PS4 is the infrared light reflected from the inner surface of the natural lens; and
the distance between the PS3 and PS4 indicates an accommodation of the eye and is associated with the focal distance of the eye.

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/CN2013/070062 WO2014106330A1 (en) 2013-01-05 2013-01-05 Contact lens for measuring eyeball focus

Publications (1)

Publication Number Publication Date
WO2014106330A1 true WO2014106330A1 (en) 2014-07-10

Family

ID=51062134

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2013/070062 WO2014106330A1 (en) 2013-01-05 2013-01-05 Contact lens for measuring eyeball focus

Country Status (1)

Country Link
WO (1) WO2014106330A1 (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5535743A (en) * 1992-12-19 1996-07-16 Boehringer Mannheim Gmbh Device for the in vivo determination of an optical property of the aqueous humour of the eye
US20030202155A1 (en) * 2002-04-25 2003-10-30 Andre Berube Multi-focal contact lens
US20090204207A1 (en) * 2007-02-23 2009-08-13 Pixeloptics, Inc. Advanced Electro-Active Optic Device
WO2011067391A1 (en) * 2009-12-04 2011-06-09 Varioptic Electronically controlled focusing ophthalmic device

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
ROSALES, PATRICIA ET AL.: "Crystalline lens radii of curvature from Purkinje and Scheimpflug imaging", JOURNAL OF VISION, vol. 6, 19 September 2006 (2006-09-19), pages 1057 - 1067 *

Cited By (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20190353894A1 (en) * 2013-01-24 2019-11-21 Yuchen Zhou Method of utilizing defocus in virtual reality and augmented reality
US11006102B2 (en) * 2013-01-24 2021-05-11 Yuchen Zhou Method of utilizing defocus in virtual reality and augmented reality
US9854437B1 (en) 2014-06-13 2017-12-26 Verily Life Sciences Llc Apparatus, system and method for exchanging encrypted communications with an eye-mountable device
US9880401B2 (en) 2014-06-13 2018-01-30 Verily Life Sciences Llc Method, device and system for accessing an eye-mountable device with a user interface
US9678361B2 (en) 2014-06-13 2017-06-13 Verily Life Sciences Llc Power delivery for accommodation by an eye-mountable device
US10353463B2 (en) * 2016-03-16 2019-07-16 RaayonNova LLC Smart contact lens with eye driven control system and method
WO2019033334A1 (en) * 2017-08-17 2019-02-21 Xinova, LLC Contact lenses with bifocal characteristics
US11793674B2 (en) 2017-08-17 2023-10-24 Lutronic Vision Inc. Contact lenses with bifocal characteristics
US20190179165A1 (en) * 2017-12-12 2019-06-13 RaayonNova, LLC Smart Contact Lens with Embedded Display and Image Focusing System
US11333902B2 (en) * 2017-12-12 2022-05-17 RaayonNova LLC Smart contact lens with embedded display and image focusing system
CN112400134A (en) * 2018-07-13 2021-02-23 德遁公司 Advanced optical design for eye-mounted imaging systems
CN112400134B (en) * 2018-07-13 2022-05-06 德遁公司 Advanced optical design for eye-mounted imaging systems
CN114779471A (en) * 2022-03-24 2022-07-22 闽都创新实验室 Eye machine interface based on nanometer pixel array and working method thereof
CN114779471B (en) * 2022-03-24 2024-05-28 闽都创新实验室 Eye machine interface based on nano pixel array and working method thereof

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application
Ref document number: 13870087; Country of ref document: EP; Kind code of ref document: A1
NENP Non-entry into the national phase
Ref country code: DE
122 Ep: pct application non-entry in european phase
Ref document number: 13870087; Country of ref document: EP; Kind code of ref document: A1