US7431457B2 - Methods and systems for tracking a torsional orientation and position of an eye - Google Patents
- Publication number
- US7431457B2 (application US11/775,840, US77584007A)
- Authority
- US
- United States
- Prior art keywords
- eye
- image
- iris
- laser
- pupil
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Expired - Lifetime
Links
- 238000000034 method Methods 0.000 title abstract description 63
- 230000033001 locomotion Effects 0.000 claims abstract description 17
- 210000001747 pupil Anatomy 0.000 claims description 76
- 238000005259 measurement Methods 0.000 claims description 60
- 238000002679 ablation Methods 0.000 claims description 50
- 238000003384 imaging method Methods 0.000 claims description 30
- 238000011282 treatment Methods 0.000 claims description 22
- 239000003550 marker Substances 0.000 claims description 21
- 238000005286 illumination Methods 0.000 claims description 13
- 238000013519 translation Methods 0.000 claims description 13
- 238000002430 laser surgery Methods 0.000 claims description 10
- 238000012937 correction Methods 0.000 claims description 8
- 208000016339 iris pattern Diseases 0.000 claims description 8
- 238000004364 calculation method Methods 0.000 claims description 7
- 230000008859 change Effects 0.000 claims description 7
- 238000004891 communication Methods 0.000 claims description 4
- 230000004044 response Effects 0.000 claims description 4
- 238000004590 computer program Methods 0.000 claims description 2
- 210000004087 cornea Anatomy 0.000 abstract description 15
- 210000001508 eye Anatomy 0.000 description 211
- 210000000554 iris Anatomy 0.000 description 105
- 230000003287 optical effect Effects 0.000 description 49
- 238000004422 calculation algorithm Methods 0.000 description 38
- 210000001519 tissue Anatomy 0.000 description 34
- 238000001356 surgical procedure Methods 0.000 description 20
- 210000001525 retina Anatomy 0.000 description 12
- 238000013532 laser treatment Methods 0.000 description 11
- 230000014616 translation Effects 0.000 description 10
- 238000006073 displacement reaction Methods 0.000 description 9
- 230000004075 alteration Effects 0.000 description 8
- 230000000694 effects Effects 0.000 description 8
- 230000006870 function Effects 0.000 description 8
- 238000000608 laser ablation Methods 0.000 description 7
- 238000012360 testing method Methods 0.000 description 7
- 230000009466 transformation Effects 0.000 description 7
- 238000004458 analytical method Methods 0.000 description 6
- 239000011159 matrix material Substances 0.000 description 6
- 230000004438 eyesight Effects 0.000 description 5
- 201000009310 astigmatism Diseases 0.000 description 4
- 230000008901 benefit Effects 0.000 description 4
- 238000009826 distribution Methods 0.000 description 4
- 208000014733 refractive error Diseases 0.000 description 4
- 238000000844 transformation Methods 0.000 description 4
- 230000004424 eye movement Effects 0.000 description 3
- 210000000720 eyelash Anatomy 0.000 description 3
- 210000000744 eyelid Anatomy 0.000 description 3
- 230000008569 process Effects 0.000 description 3
- 230000004304 visual acuity Effects 0.000 description 3
- 206010020675 Hypermetropia Diseases 0.000 description 2
- 230000003044 adaptive effect Effects 0.000 description 2
- 238000013459 approach Methods 0.000 description 2
- 238000001514 detection method Methods 0.000 description 2
- 238000001914 filtration Methods 0.000 description 2
- 230000004305 hyperopia Effects 0.000 description 2
- 201000006318 hyperopia Diseases 0.000 description 2
- 230000001788 irregular Effects 0.000 description 2
- 230000007246 mechanism Effects 0.000 description 2
- 238000012986 modification Methods 0.000 description 2
- 230000004048 modification Effects 0.000 description 2
- 208000001491 myopia Diseases 0.000 description 2
- 230000004379 myopia Effects 0.000 description 2
- 238000003860 storage Methods 0.000 description 2
- 230000001225 therapeutic effect Effects 0.000 description 2
- 238000011179 visual inspection Methods 0.000 description 2
- MARDFMMXBWIRTK-UHFFFAOYSA-N [F].[Ar] Chemical compound [F].[Ar] MARDFMMXBWIRTK-UHFFFAOYSA-N 0.000 description 1
- 238000011298 ablation treatment Methods 0.000 description 1
- 230000009286 beneficial effect Effects 0.000 description 1
- 230000005540 biological transmission Effects 0.000 description 1
- 238000007796 conventional method Methods 0.000 description 1
- 230000006378 damage Effects 0.000 description 1
- 230000003247 decreasing effect Effects 0.000 description 1
- 238000002405 diagnostic procedure Methods 0.000 description 1
- 230000002708 enhancing effect Effects 0.000 description 1
- 210000003560 epithelium corneal Anatomy 0.000 description 1
- 238000011156 evaluation Methods 0.000 description 1
- 238000002474 experimental method Methods 0.000 description 1
- 239000012634 fragment Substances 0.000 description 1
- 210000003128 head Anatomy 0.000 description 1
- 238000010438 heat treatment Methods 0.000 description 1
- 238000010191 image analysis Methods 0.000 description 1
- 239000007943 implant Substances 0.000 description 1
- 230000006872 improvement Effects 0.000 description 1
- 238000011065 in-situ storage Methods 0.000 description 1
- 210000003041 ligament Anatomy 0.000 description 1
- 230000003278 mimic effect Effects 0.000 description 1
- 230000006855 networking Effects 0.000 description 1
- 238000003909 pattern recognition Methods 0.000 description 1
- 230000002093 peripheral effect Effects 0.000 description 1
- 238000006303 photolysis reaction Methods 0.000 description 1
- 238000012545 processing Methods 0.000 description 1
- 230000005855 radiation Effects 0.000 description 1
- 230000000284 resting effect Effects 0.000 description 1
- 230000035945 sensitivity Effects 0.000 description 1
- 238000007493 shaping process Methods 0.000 description 1
- 238000010008 shearing Methods 0.000 description 1
- 239000007787 solid Substances 0.000 description 1
- 238000001228 spectrum Methods 0.000 description 1
- 239000000758 substrate Substances 0.000 description 1
- 230000002123 temporal effect Effects 0.000 description 1
- 230000003685 thermal hair damage Effects 0.000 description 1
- 210000001585 trabecular meshwork Anatomy 0.000 description 1
- 238000012546 transfer Methods 0.000 description 1
- 238000011277 treatment modality Methods 0.000 description 1
- 230000000007 visual effect Effects 0.000 description 1
Images
Classifications
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61F—FILTERS IMPLANTABLE INTO BLOOD VESSELS; PROSTHESES; DEVICES PROVIDING PATENCY TO, OR PREVENTING COLLAPSING OF, TUBULAR STRUCTURES OF THE BODY, e.g. STENTS; ORTHOPAEDIC, NURSING OR CONTRACEPTIVE DEVICES; FOMENTATION; TREATMENT OR PROTECTION OF EYES OR EARS; BANDAGES, DRESSINGS OR ABSORBENT PADS; FIRST-AID KITS
- A61F9/00—Methods or devices for treatment of the eyes; Devices for putting-in contact lenses; Devices to correct squinting; Apparatus to guide the blind; Protective devices for the eyes, carried on the body or in the hand
- A61F9/007—Methods or devices for eye surgery
- A61F9/008—Methods or devices for eye surgery using laser
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B3/00—Apparatus for testing the eyes; Instruments for examining the eyes
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B3/00—Apparatus for testing the eyes; Instruments for examining the eyes
- A61B3/0016—Operational features thereof
- A61B3/0025—Operational features thereof characterised by electronic signal processing, e.g. eye models
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B3/00—Apparatus for testing the eyes; Instruments for examining the eyes
- A61B3/10—Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions
- A61B3/1015—Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions for wavefront analysis
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B3/00—Apparatus for testing the eyes; Instruments for examining the eyes
- A61B3/10—Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions
- A61B3/113—Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions for determining or recording eye movement
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B3/00—Apparatus for testing the eyes; Instruments for examining the eyes
- A61B3/10—Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions
- A61B3/14—Arrangements specially adapted for eye photography
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61F—FILTERS IMPLANTABLE INTO BLOOD VESSELS; PROSTHESES; DEVICES PROVIDING PATENCY TO, OR PREVENTING COLLAPSING OF, TUBULAR STRUCTURES OF THE BODY, e.g. STENTS; ORTHOPAEDIC, NURSING OR CONTRACEPTIVE DEVICES; FOMENTATION; TREATMENT OR PROTECTION OF EYES OR EARS; BANDAGES, DRESSINGS OR ABSORBENT PADS; FIRST-AID KITS
- A61F9/00—Methods or devices for treatment of the eyes; Devices for putting-in contact lenses; Devices to correct squinting; Apparatus to guide the blind; Protective devices for the eyes, carried on the body or in the hand
- A61F9/007—Methods or devices for eye surgery
- A61F9/008—Methods or devices for eye surgery using laser
- A61F9/00802—Methods or devices for eye surgery using laser for photoablation
- A61F9/00804—Refractive treatments
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61F—FILTERS IMPLANTABLE INTO BLOOD VESSELS; PROSTHESES; DEVICES PROVIDING PATENCY TO, OR PREVENTING COLLAPSING OF, TUBULAR STRUCTURES OF THE BODY, e.g. STENTS; ORTHOPAEDIC, NURSING OR CONTRACEPTIVE DEVICES; FOMENTATION; TREATMENT OR PROTECTION OF EYES OR EARS; BANDAGES, DRESSINGS OR ABSORBENT PADS; FIRST-AID KITS
- A61F9/00—Methods or devices for treatment of the eyes; Devices for putting-in contact lenses; Devices to correct squinting; Apparatus to guide the blind; Protective devices for the eyes, carried on the body or in the hand
- A61F9/007—Methods or devices for eye surgery
- A61F9/008—Methods or devices for eye surgery using laser
- A61F9/00802—Methods or devices for eye surgery using laser for photoablation
- A61F9/00804—Refractive treatments
- A61F9/00806—Correction of higher orders
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61F—FILTERS IMPLANTABLE INTO BLOOD VESSELS; PROSTHESES; DEVICES PROVIDING PATENCY TO, OR PREVENTING COLLAPSING OF, TUBULAR STRUCTURES OF THE BODY, e.g. STENTS; ORTHOPAEDIC, NURSING OR CONTRACEPTIVE DEVICES; FOMENTATION; TREATMENT OR PROTECTION OF EYES OR EARS; BANDAGES, DRESSINGS OR ABSORBENT PADS; FIRST-AID KITS
- A61F9/00—Methods or devices for treatment of the eyes; Devices for putting-in contact lenses; Devices to correct squinting; Apparatus to guide the blind; Protective devices for the eyes, carried on the body or in the hand
- A61F9/007—Methods or devices for eye surgery
- A61F9/008—Methods or devices for eye surgery using laser
- A61F9/00825—Methods or devices for eye surgery using laser for photodisruption
- A61F9/00827—Refractive correction, e.g. lenticle
- A61F9/00829—Correction of higher orders
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/013—Eye tracking input arrangements
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/0002—Inspection of images, e.g. flaw detection
- G06T7/0012—Biomedical image inspection
- G06T7/0014—Biomedical image inspection using an image reference approach
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/30—Determination of transform parameters for the alignment of images, i.e. image registration
- G06T7/33—Determination of transform parameters for the alignment of images, i.e. image registration using feature-based methods
- G06T7/337—Determination of transform parameters for the alignment of images, i.e. image registration using feature-based methods involving reference images or patches
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61F—FILTERS IMPLANTABLE INTO BLOOD VESSELS; PROSTHESES; DEVICES PROVIDING PATENCY TO, OR PREVENTING COLLAPSING OF, TUBULAR STRUCTURES OF THE BODY, e.g. STENTS; ORTHOPAEDIC, NURSING OR CONTRACEPTIVE DEVICES; FOMENTATION; TREATMENT OR PROTECTION OF EYES OR EARS; BANDAGES, DRESSINGS OR ABSORBENT PADS; FIRST-AID KITS
- A61F9/00—Methods or devices for treatment of the eyes; Devices for putting-in contact lenses; Devices to correct squinting; Apparatus to guide the blind; Protective devices for the eyes, carried on the body or in the hand
- A61F9/007—Methods or devices for eye surgery
- A61F9/008—Methods or devices for eye surgery using laser
- A61F2009/00844—Feedback systems
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61F—FILTERS IMPLANTABLE INTO BLOOD VESSELS; PROSTHESES; DEVICES PROVIDING PATENCY TO, OR PREVENTING COLLAPSING OF, TUBULAR STRUCTURES OF THE BODY, e.g. STENTS; ORTHOPAEDIC, NURSING OR CONTRACEPTIVE DEVICES; FOMENTATION; TREATMENT OR PROTECTION OF EYES OR EARS; BANDAGES, DRESSINGS OR ABSORBENT PADS; FIRST-AID KITS
- A61F9/00—Methods or devices for treatment of the eyes; Devices for putting-in contact lenses; Devices to correct squinting; Apparatus to guide the blind; Protective devices for the eyes, carried on the body or in the hand
- A61F9/007—Methods or devices for eye surgery
- A61F9/008—Methods or devices for eye surgery using laser
- A61F2009/00844—Feedback systems
- A61F2009/00846—Eyetracking
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61F—FILTERS IMPLANTABLE INTO BLOOD VESSELS; PROSTHESES; DEVICES PROVIDING PATENCY TO, OR PREVENTING COLLAPSING OF, TUBULAR STRUCTURES OF THE BODY, e.g. STENTS; ORTHOPAEDIC, NURSING OR CONTRACEPTIVE DEVICES; FOMENTATION; TREATMENT OR PROTECTION OF EYES OR EARS; BANDAGES, DRESSINGS OR ABSORBENT PADS; FIRST-AID KITS
- A61F9/00—Methods or devices for treatment of the eyes; Devices for putting-in contact lenses; Devices to correct squinting; Apparatus to guide the blind; Protective devices for the eyes, carried on the body or in the hand
- A61F9/007—Methods or devices for eye surgery
- A61F9/008—Methods or devices for eye surgery using laser
- A61F2009/00844—Feedback systems
- A61F2009/00848—Feedback systems based on wavefront
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61F—FILTERS IMPLANTABLE INTO BLOOD VESSELS; PROSTHESES; DEVICES PROVIDING PATENCY TO, OR PREVENTING COLLAPSING OF, TUBULAR STRUCTURES OF THE BODY, e.g. STENTS; ORTHOPAEDIC, NURSING OR CONTRACEPTIVE DEVICES; FOMENTATION; TREATMENT OR PROTECTION OF EYES OR EARS; BANDAGES, DRESSINGS OR ABSORBENT PADS; FIRST-AID KITS
- A61F9/00—Methods or devices for treatment of the eyes; Devices for putting-in contact lenses; Devices to correct squinting; Apparatus to guide the blind; Protective devices for the eyes, carried on the body or in the hand
- A61F9/007—Methods or devices for eye surgery
- A61F9/008—Methods or devices for eye surgery using laser
- A61F2009/00861—Methods or devices for eye surgery using laser adapted for treatment at a particular location
- A61F2009/00872—Cornea
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61F—FILTERS IMPLANTABLE INTO BLOOD VESSELS; PROSTHESES; DEVICES PROVIDING PATENCY TO, OR PREVENTING COLLAPSING OF, TUBULAR STRUCTURES OF THE BODY, e.g. STENTS; ORTHOPAEDIC, NURSING OR CONTRACEPTIVE DEVICES; FOMENTATION; TREATMENT OR PROTECTION OF EYES OR EARS; BANDAGES, DRESSINGS OR ABSORBENT PADS; FIRST-AID KITS
- A61F9/00—Methods or devices for treatment of the eyes; Devices for putting-in contact lenses; Devices to correct squinting; Apparatus to guide the blind; Protective devices for the eyes, carried on the body or in the hand
- A61F9/007—Methods or devices for eye surgery
- A61F9/008—Methods or devices for eye surgery using laser
- A61F2009/00878—Planning
- A61F2009/0088—Planning based on wavefront
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61F—FILTERS IMPLANTABLE INTO BLOOD VESSELS; PROSTHESES; DEVICES PROVIDING PATENCY TO, OR PREVENTING COLLAPSING OF, TUBULAR STRUCTURES OF THE BODY, e.g. STENTS; ORTHOPAEDIC, NURSING OR CONTRACEPTIVE DEVICES; FOMENTATION; TREATMENT OR PROTECTION OF EYES OR EARS; BANDAGES, DRESSINGS OR ABSORBENT PADS; FIRST-AID KITS
- A61F9/00—Methods or devices for treatment of the eyes; Devices for putting-in contact lenses; Devices to correct squinting; Apparatus to guide the blind; Protective devices for the eyes, carried on the body or in the hand
- A61F9/007—Methods or devices for eye surgery
- A61F9/008—Methods or devices for eye surgery using laser
- A61F2009/00878—Planning
- A61F2009/00882—Planning based on topography
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61F—FILTERS IMPLANTABLE INTO BLOOD VESSELS; PROSTHESES; DEVICES PROVIDING PATENCY TO, OR PREVENTING COLLAPSING OF, TUBULAR STRUCTURES OF THE BODY, e.g. STENTS; ORTHOPAEDIC, NURSING OR CONTRACEPTIVE DEVICES; FOMENTATION; TREATMENT OR PROTECTION OF EYES OR EARS; BANDAGES, DRESSINGS OR ABSORBENT PADS; FIRST-AID KITS
- A61F9/00—Methods or devices for treatment of the eyes; Devices for putting-in contact lenses; Devices to correct squinting; Apparatus to guide the blind; Protective devices for the eyes, carried on the body or in the hand
- A61F9/007—Methods or devices for eye surgery
- A61F9/008—Methods or devices for eye surgery using laser
- A61F2009/00897—Scanning mechanisms or algorithms
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61F—FILTERS IMPLANTABLE INTO BLOOD VESSELS; PROSTHESES; DEVICES PROVIDING PATENCY TO, OR PREVENTING COLLAPSING OF, TUBULAR STRUCTURES OF THE BODY, e.g. STENTS; ORTHOPAEDIC, NURSING OR CONTRACEPTIVE DEVICES; FOMENTATION; TREATMENT OR PROTECTION OF EYES OR EARS; BANDAGES, DRESSINGS OR ABSORBENT PADS; FIRST-AID KITS
- A61F9/00—Methods or devices for treatment of the eyes; Devices for putting-in contact lenses; Devices to correct squinting; Apparatus to guide the blind; Protective devices for the eyes, carried on the body or in the hand
- A61F9/007—Methods or devices for eye surgery
- A61F9/008—Methods or devices for eye surgery using laser
- A61F9/00825—Methods or devices for eye surgery using laser for photodisruption
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30004—Biomedical image processing
- G06T2207/30041—Eye; Retina; Ophthalmic
Definitions
- the present invention relates generally to laser eye surgery methods and systems. More specifically, the present invention relates to registering a first image of a patient's eye with a second image of the patient's eye and to tracking a position and a torsional orientation of the patient's eye during laser eye surgery so as to register a customized ablation profile with the patient's eye.
- Known laser eye procedures generally employ an ultraviolet or infrared laser to remove a microscopic layer of stromal tissue from the cornea of the eye to alter the refractive characteristics of the eye.
- the laser removes a selected shape of the corneal tissue, often to correct refractive errors of the eye.
- Ultraviolet laser ablation results in photo-decomposition of the corneal tissue, but generally does not cause significant thermal damage to adjacent and underlying tissues of the eye.
- the irradiated molecules are broken into smaller volatile fragments photochemically, directly breaking the intermolecular bonds.
- Laser ablation procedures can remove the targeted stroma of the cornea to change the cornea's contour for varying purposes, such as for correcting myopia, hyperopia, astigmatism, and the like.
- Control over the distribution of ablation energy across the cornea may be provided by a variety of systems and methods, including the use of ablatable masks, fixed and moveable apertures, controlled scanning systems, eye movement tracking mechanisms, and the like.
- the laser beam often comprises a series of discrete pulses of laser light energy, with the total shape and amount of tissue removed being determined by the shape, size, location, and/or number of a pattern of laser energy pulses impinging on the cornea.
- a variety of algorithms may be used to calculate the pattern of laser pulses used to reshape the cornea so as to correct a refractive error of the eye.
- Known systems make use of a variety of forms of lasers and/or laser energy to effect the correction, including infrared lasers, ultraviolet lasers, femtosecond lasers, wavelength multiplied solid-state lasers, and the like.
- Alternative vision correction techniques make use of radial incisions in the cornea, intraocular lenses, removable corneal support structures, thermal shaping, and the like.
- Known corneal correction treatment methods have generally been successful in correcting standard vision errors, such as myopia, hyperopia, astigmatism, and the like. However, as with all successes, still further improvements would be desirable.
- wavefront measurement systems are now available to measure the refractive characteristics of a particular patient's eye. By customizing an ablation pattern based on wavefront measurements, it may be possible to correct minor refractive errors so as to reliably and repeatably provide visual acuities greater than 20/20. Alternatively, it may be desirable to correct aberrations of the eye that reduce visual acuity to less than 20/20. Unfortunately, these measurement systems are not immune from measurement error.
- the calculation of the ablation profile, the transfer of information from the measurement system to the ablation system, and the operation of the ablation system all provide opportunities for the introduction of errors, so that the actual visual acuities provided by real-world wavefront-based correction systems may not be as good as might be theoretically possible.
- One difficulty with the use of wavefront measurements is aligning the customized laser ablation pattern with the patient's eye.
- To achieve such alignment, the wavefront measurement and the eye should share a common coordinate system. For example, when the wavefront measurement is taken, the patient will generally be in a seated position. However, when the laser eye surgery is being performed, the patient will generally be in a supine position, which may not position the patient's eye in the same position or torsional orientation as the eye when the wavefront measurement was taken.
- the present invention provides methods and systems which can improve laser eye surgery.
- the methods and software of the present invention can register a first image of the patient's eye with a second image of the patient's eye. In some embodiments, the methods can determine a torsional offset θ0 between the eye in the first image and the eye in the second image.
- a method comprises selecting at least one marker on the iris of the eye in the first image.
- a corresponding marker is located on the iris in the second image.
- the first image of the eye and the second image of the eye are registered by substantially matching a common reference point in the first and second images and by matching the marker on the iris in the first image with the corresponding marker on the iris in the second image.
- a laser treatment can be centered and torsionally aligned with the second image of the eye.
- the second image of the eye can be obtained while the patient's eye is aligned with a laser beam that is to deliver the laser treatment.
- the common reference point is a pupil center. In other embodiments, the common reference point can be determined through a function of a pupil center and an iris center.
- the first image of the eye is obtained during the measurement of a wavefront (which reflects the lower and higher order optical aberrations in the optical system of the patient's eye) and the second image of the eye is obtained when the patient is positioned in the optical axis of the therapeutic laser.
- the patient's eye in the first image can be registered with the patient's eye when it is positioned in an optical axis of the therapeutic laser so that the laser treatment is delivered in a torsionally correct orientation.
- the present invention can track the torsional movement of the eye over time, θ(t). Tracking of the torsional orientation of the patient's eye allows a computer processor to adjust a delivery of the customized ablation treatment to account for the changes in the position and orientation of the patient's eye.
- the present invention provides for torsional tracking of the eye.
- a tracking algorithm can establish the exact amount of rotation of the eye with respect to the wavefront image taken during the wavefront measurement. This torsional rotation of the eye can be compensated for by making corresponding adjustment of the laser beam delivery.
- a reference point (such as a pupil center) is located in a first image of the eye. At least one marker is identified in the first image of the eye. The reference point is also located in a second image of the eye. A corresponding marker is identified in the second image of the eye. A cyclotorsional rotation of the eye is estimated between the first image and the second image by comparing the orientation of the at least one marker relative to the pupil center in the first image and the second image.
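- As an illustrative sketch of the comparison just described, the cyclotorsional rotation can be estimated by comparing the polar angle of each marker about the reference point in the two images and averaging the differences. The function and variable names below are assumptions made for illustration, not the patent's implementation.

```python
import numpy as np

def estimate_cyclotorsion(ref_center, ref_markers, cur_center, cur_markers):
    """Estimate the cyclotorsional rotation (degrees) between two eye images.

    ref_center, cur_center: (x, y) reference point (e.g., the pupil center)
    in the first and second images.
    ref_markers, cur_markers: corresponding (x, y) iris marker positions.
    """
    deltas = []
    for (rx, ry), (cx, cy) in zip(ref_markers, cur_markers):
        a_ref = np.arctan2(ry - ref_center[1], rx - ref_center[0])
        a_cur = np.arctan2(cy - cur_center[1], cx - cur_center[0])
        d = np.degrees(a_cur - a_ref)
        d = (d + 180.0) % 360.0 - 180.0   # wrap to [-180, 180)
        deltas.append(d)
    # average over markers; a median would be more robust to a mismatched marker
    return float(np.mean(deltas))
```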
- the present invention provides a method of performing laser eye surgery.
- the method comprises measuring a wavefront measurement of the patient's eye.
- An image of the patient's eye is obtained during the measuring of the wavefront measurement.
- a laser treatment of the patient's eye is generated based on the wavefront measurement.
- the position of the patient's eye is registered with the image of the patient's eye obtained during the measuring of the wavefront measurement so that the customized laser treatment can be accurately delivered to the patient's eye.
- the laser treatment is delivered to the patient's eye while the torsional orientation of the patient's eye is monitored.
- the delivery of the laser treatment is adjusted based on the monitored torsional orientation of the patient's eye.
- the present invention provides a laser surgery system.
- the laser surgery system provides a computer processor configured to receive a first image of an eye and at least one of a wavefront measurement and an ablation pattern for the eye.
- An eye tracker can be coupled to the computer processor to track a position of the eye under an optical axis of a laser beam.
- a torsional tracker is coupled to the computer processor to track a torsional orientation of the eye.
- the computer processor can be configured to adjust a delivery of the ablation pattern based on a change of position and/or torsional orientation of the eye.
- the present invention provides a laser surgery system comprising a system for registering a first image of an eye with a second image of an eye.
- the system includes a computer processor that is configured to receive a first image of an eye.
- An imaging device can be coupled to the computer processor.
- the imaging device can obtain a second image of the eye.
- the computer processor can be configured to locate a reference point, such as a pupil center, in the first and second image of the eye and locate at least one marker in the first image and find a corresponding marker in the second image.
- the computer processor can register the first and second image by substantially matching the reference points (e.g., pupil centers) and markers of the first and second image.
- FIG. 1 schematically illustrates a simplified system of the present invention
- FIG. 2 schematically illustrates one laser surgery system of the present invention
- FIG. 3 illustrates one exemplary wavefront measurement device of the present invention
- FIG. 3A illustrates an alternative wavefront measurement device of the present invention
- FIG. 4 schematically illustrates an exemplary system of the present invention
- FIG. 5 schematically illustrates a method of registering a first image with a second image
- FIG. 6A illustrates a reference image of an eye
- FIG. 6B illustrates a rotated image that corresponds to the reference image of FIG. 6A ;
- FIGS. 6C and 6D illustrate a center of a pupil and center of an iris
- FIG. 6E illustrates the inner and outer radii of a range of iris radii
- FIG. 7A illustrates an unwrapped iris that is segmented into 24 sectors, with each sector having a numbered marker
- FIG. 7B illustrates a corresponding unwrapped iris in which the markers are torsionally rotated from their original positions
- FIG. 7C illustrates two iris images and texture blocks when the iris ring is not unwrapped
- FIG. 7D illustrates two iris images and texture blocks when the iris ring is unwrapped
- FIG. 8A illustrates an unwrapped iris
- FIG. 8B illustrates an unwrapped iris with LED reflections
- FIG. 9 is a graph that illustrates an angular rotation of the 24 markers.
- FIG. 10 is a simplified method of tracking a torsional rotation of a patient's eye
- FIG. 11 is a frame image of a patient's eye and two markers on the iris that are used for tracking a torsional rotation of the patient's eye;
- FIG. 12 illustrates six reference blocks/markers of the patient's iris that are used to track the torsional rotation of the patient's eye
- FIG. 13 illustrates the relative positions of the reference markers relative to the center of the patient's pupil
- FIG. 14 illustrates torsional angle estimates for an eye having a dark-colored iris
- FIG. 15 illustrates torsional angle estimates for an eye having a light-colored iris
- FIGS. 16A and 16B are charts summarizing results for a data set processed by one alignment algorithm of the present invention.
- FIG. 17A is an image of an eye that has too much shadow to discern markers
- FIG. 17B is a chart illustrating an eye having an RMS that is above 1;
- FIG. 18A is an original frame image of an eye
- FIG. 18B is a final frame in which the image of the eye is rotated
- FIG. 19A is a reference frame
- FIG. 19B is a zeroth frame having two pixel blocks marked for tracking
- FIG. 20 is a chart of a pupil position over time
- FIG. 21 is a chart of the pupil radius from frame 0 to frame 500;
- FIG. 22 is a chart that illustrates errors per frame/block
- FIG. 23 is a chart that illustrates a measured torsional angle of the eye
- FIG. 24 depicts the tracking results for a 30-frame sequence starting with the 345th frame
- FIG. 25 is a chart that shows the torsional data extracted from the slower acquired sequence
- FIGS. 26A and 26B show alignment results using a sine-method between the wavefront measurement position of the iris and the first image of the video sequence
- FIGS. 27A and 27B show measurements of the torsional eye movements with respect to the reference image
- FIG. 28 shows a difference between two torsional angle estimates
- FIG. 29A illustrates two torsion estimates
- FIG. 29B illustrates the error between the two estimates of FIG. 29A .
- the present invention is particularly useful for enhancing the accuracy and efficacy of laser eye surgical procedures such as photorefractive keratectomy (PRK), phototherapeutic keratectomy (PTK), laser in situ keratomileusis (LASIK), and the like.
- the efficacy of the laser eye surgical procedures can be enhanced by tracking the torsional orientation of the patient's eye so that a laser ablation pattern is more accurately aligned with the real-time orientation of the patient's eye.
- FIG. 1 schematically illustrates a simplified system of one embodiment of the present invention.
- the illustrated system of the present invention can include a laser system 15 coupled to a wavefront measurement device 10 that measures aberrations and other optical characteristics of an entire optical tissue system.
- the data from such a wavefront measurement device may be used to generate an optical surface from an array of optical gradients.
- the optical surface need not precisely match an actual tissue surface, as the gradients will show the effects of aberrations which are actually located throughout the ocular tissue system. Nonetheless, corrections imposed on an optical tissue surface so as to correct the aberrations derived from the gradients should correct the optical tissue system.
- an optical tissue surface may encompass a theoretical tissue surface (derived, for example, from wavefront sensor data), an actual tissue surface, and/or a tissue surface formed for purposes of treatment (for example, by incising corneal tissues so as to allow a flap of the corneal epithelium to be displaced and expose the underlying stroma during a LASIK procedure).
- Laser eye surgery system 15 includes a laser 12 that produces a laser beam 14 .
- Laser 12 is optically coupled to laser delivery optics 16 , which directs laser beam 14 to an eye of patient P.
- a delivery optics support structure (not shown here for clarity) extends from a frame 18 supporting laser 12 .
- a microscope 20 is mounted on the delivery optics support structure, the microscope often being used to image a cornea of eye E.
- Laser 12 generally comprises an excimer laser, typically comprising an argon-fluorine laser producing pulses of laser light having a wavelength of approximately 193 nm.
- Laser 12 will preferably be designed to provide a feedback stabilized fluence at the patient's eye, delivered via delivery optics 16 .
- the present invention may also be useful with alternative sources of ultraviolet or infrared radiation, particularly those adapted to controllably ablate the corneal tissue without causing significant damage to adjacent and/or underlying tissues of the eye.
- sources include, but are not limited to, solid state lasers and other devices which can generate energy in the ultraviolet wavelength between about 185 and 205 nm and/or those which utilize frequency-multiplying techniques.
- an excimer laser is the illustrative source of an ablating beam, other lasers may be used in the present invention.
- Laser 12 and delivery optics 16 will generally direct laser beam 14 to the eye of patient P under the direction of a computer processor 22 .
- Processor 22 will generally selectively adjust laser beam 14 to expose portions of the cornea to the pulses of laser energy so as to effect a predetermined sculpting of the cornea and alter the refractive characteristics of the eye.
- both laser 12 and the laser delivery optical system 16 will be under computer control of processor 22 to effect the desired laser sculpting process so as to deliver the customized ablation profile, with the processor ideally altering the ablation procedure in response to inputs from the optical feedback system.
- the feedback will preferably be input into processor 22 from an automated image analysis system, or may be manually input into the processor by a system operator using an input device in response to a visual inspection of analysis images provided by the optical feedback system.
- Processor 22 will often continue and/or terminate a sculpting treatment in response to the feedback, and may optionally also modify the planned sculpting based at least in part on the feedback.
- Laser beam 14 may be adjusted to produce the desired sculpting using a variety of alternative mechanisms.
- the laser beam 14 may be selectively limited using one or more variable apertures.
- An exemplary variable aperture system having a variable iris and a variable width slit is described in U.S. Pat. No. 5,713,892, the full disclosure of which is incorporated herein by reference.
- the laser beam may also be tailored by varying the size and offset of the laser spot from an axis of the eye, as described in U.S. Pat. No. 5,683,379, and as also described in co-pending U.S. patent application Ser. Nos. 08/968,380, filed Nov. 12, 1997; and 09/274,999 filed Mar. 22, 1999, the full disclosures of which are incorporated herein by reference.
- Still further alternatives are possible, including scanning of the laser beam over the surface of the eye and controlling the number of pulses and/or dwell time at each location, as described, for example, by U.S. Pat. Nos. 4,665,913 (the full disclosure of which is incorporated herein by reference) and as demonstrated by other scanning laser systems such as the LSX laser by LaserSight, LadarVision by Alcon/Autonomous, and the 217C by Technolas; using masks in the optical path of laser beam 14 which ablate to vary the profile of the beam incident on the cornea, as described in U.S. patent application Ser. No. 08/468,898, filed Jun.
- Additional components and subsystems may be included with laser system 15 , as should be understood by those of skill in the art.
- spatial and/or temporal integrators may be included to control the distribution of energy within the laser beam, as described in U.S. Pat. No. 5,646,791, the disclosure of which is incorporated herein by reference.
- An ablation effluent evacuator/filter and other ancillary components of the laser surgery system that are not necessary to an understanding of the present invention need not be described in detail.
- laser system 15 will generally include a computer system or programmable processor 22 .
- Processor 22 may comprise (or interface with) a conventional PC system including the standard user interface devices such as a keyboard, a display monitor, and the like.
- Processor 22 will typically include an input device such as a magnetic or optical disk drive, a CD drive, an internet connection, or the like.
- Such input devices will often be used to download a computer executable code from a computer network or a tangible storage media 29 embodying steps or programming instructions for any of the methods of the present invention.
- Tangible storage media 29 includes, but is not limited to a CD-R, a CD-RW, DVD, a floppy disk, an optical disk, a data tape, a non-volatile memory, or the like, and the processor 22 will include the memory boards and other standard components of modern computer systems for storing and executing this code.
- Wavefront measurement device 10 typically includes a wavefront measurement assembly 11 and an imaging assembly 13 .
- Wavefront measurement assembly 11 can be used to measure and obtain a wavefront elevation surface of at least one of the patient's eyes and imaging assembly 13 can obtain still or moving images of the patient's eye during the wavefront measurement.
- imaging assembly 13 is a CCD camera that can obtain a still image of the patient's eye.
- the image(s) obtained by imaging assembly 13 can thereafter be used to register the wavefront measurement and/or a customized ablation pattern (based on the wavefront measurement) with the patient's eye during the laser surgical procedure.
- the wavefront measurement assembly 11 and imaging assembly 13 can be coupled to or integral with a computer system 17 that can generate and store the wavefront measurements and images of the patient's eye. Thereafter, the patient's wavefront data can be stored on a computer readable medium, such as a CD-R, CD-RW, DVD-R, floppy disk, optical disk, a hard drive, or other computer readable medium.
- the computer system of the wavefront measurement device can generate and save an ablation profile based on the wavefront data.
- the wavefront data and/or the customized ablation profile can be loaded into a laser surgical system 15 through reading of the computer readable medium or through delivery into a memory of surgical system 15 over a local or wide-area network (LAN or WAN).
- Laser eye surgery system 15 can include a computer controller system 22 that is in communication with an imaging assembly 20 and a laser assembly 12 .
- Computer system 22 can have software stored in a memory and hardware that can be used to control the delivery of the ablative energy to the patient's eye, the tracking of the position (translations in the x, y, and z directions and torsional rotations) of the patient's eye relative to an optical axis of laser beam 14 , and the like.
- computer system 22 can be programmed to calculate a customized ablation profile based on the wavefront data, register the image(s) taken with imaging assembly 11 with the image(s) taken by imaging assembly 20 , and measure the torsional offset, ⁇ 0 , between the patient's eye in the two images. Additionally, computer system 22 can be programmed to measure, in real-time, the movement (x(t), y(t), z(t), and rotational orientation ⁇ (t)) of the patient's eye relative to the optical axis of the laser beam so as to allow the computer system to modify the delivery of the customized ablation profile based on the real-time position of the patient's eye.
- Referring to FIG. 3 , one embodiment of a wavefront measurement device of the present invention is schematically illustrated.
- the illustrated wavefront measurement device 10 is merely an example of one wavefront measurement device that can be used with the embodiments of the present invention and other conventional or proprietary wavefront measurement devices can be used.
- wavefront measurement device 10 includes an imaging assembly 13 that can image the patient's eye E during the wavefront measurement.
- Wavefront measurement assembly 13 includes an image source 32 which projects a source image through optical tissues 34 of eye E so as to form an image 44 upon a surface of retina R.
- the image from retina R is transmitted by the optical system of the eye (specifically, optical tissues 34 ) and imaged onto a wavefront sensor 36 by system optics 38 .
- the imaging assembly 11 can be in communication with a computer system 22 to deliver the image(s) of the patient's eye to a memory in the computer.
- Wavefront sensor 36 can also communicate signals to computer 17 for determination of a corneal ablation treatment program.
- Computer 17 may be the same computer which is used to direct operation of the laser surgery system 15 , or at least some or all of the computer components of the wavefront measurement device 10 and laser surgery system may be separate. Data from wavefront sensor 36 may be transmitted to laser system computer 22 via tangible media 29 , via an I/O port, or via a networking connection such as an intranet, the Internet, or the like.
- Wavefront sensor 36 generally comprises a lenslet array 38 and an image sensor 40 .
- the lenslet array separates the transmitted image into an array of beamlets 42 , and (in combination with other optical components of the system) images the separated beamlets on the surface of sensor 40 .
- Sensor 40 typically comprises a charge-coupled device (CCD), and senses the characteristics of these individual beamlets, which can be used to determine the characteristics of an associated region of optical tissues 34 .
- Where image 44 comprises a point or small spot of light, a location of the transmitted spot as imaged by a beamlet can directly indicate a local gradient of the associated region of optical tissue.
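- The relation between spot displacement and local gradient can be written out directly: to first order, the local wavefront slope over a lenslet equals the shift of that lenslet's spot divided by the lenslet focal length. The sketch below assumes illustrative focal-length and pixel-pitch parameters; they are not values taken from the patent.

```python
import numpy as np

def local_wavefront_slopes(spot_xy, ref_xy, lenslet_focal_mm, pixel_pitch_mm):
    """Convert Hartmann-Shack spot displacements into local wavefront slopes.

    spot_xy, ref_xy: (N, 2) arrays of measured and reference spot centroids,
    in pixels, one row per lenslet (beamlet).
    Returns an (N, 2) array of (dW/dx, dW/dy) slopes (small-angle, in radians).
    """
    displacement_mm = (np.asarray(spot_xy) - np.asarray(ref_xy)) * pixel_pitch_mm
    # small-angle approximation: slope = centroid shift / lenslet focal length
    return displacement_mm / lenslet_focal_mm
```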
- Eye E generally defines an anterior orientation ANT and a posterior orientation POS.
- Image source 32 generally projects an image in a posterior orientation through optical tissues 34 onto retina R.
- Optical tissues 34 again transmit image 44 from the retina anteriorly toward wavefront sensor 36 .
- Image 44 actually formed on retina R may be distorted by any imperfections in the eye's optical system when the image source is originally transmitted by optical tissues 34 .
- image source projection optics 46 may be configured or adapted to decrease any distortion of image 44 .
- image source optics may decrease lower order optical errors by compensating for spherical and/or cylindrical errors of optical tissues 34 . Higher order optical errors of the optical tissues may also be compensated through the use of an adaptive optic element, such as a deformable mirror.
- Use of an image source 32 selected to define a point or small spot at image 44 upon retina R may facilitate the analysis of the data provided by wavefront sensor 36 . Distortion of image 44 may be limited by transmitting a source image through a central region 48 of optical tissues 34 which is smaller than a pupil 50 , as the central portion of the pupil may be less prone to optical errors than the peripheral portion. Regardless of the particular image source structure, it will generally be beneficial to have a well-defined and accurately formed image 44 on retina R.
- a series of wavefront sensor data readings may be taken.
- a time series of wavefront data readings may help to provide a more accurate overall determination of the ocular tissue aberrations.
- a plurality of temporally separated wavefront sensor measurements can avoid relying on a single snapshot of the optical characteristics as the basis for a refractive correcting procedure.
- Still further alternatives are also available, including taking wavefront sensor data of the eye with the eye in differing configurations, positions, and/or orientations.
- a patient will often help maintain alignment of the eye with wavefront device 13 by focusing on a fixation target, as described in U.S. Pat. No. 6,004,313, the full disclosure of which is incorporated herein by reference.
- By varying a focal position of the fixation target as described in that reference, optical characteristics of the eye may be determined while the eye accommodates or adapts to image a field of view at a varying distance.
- Further alternatives include rotating the eye by providing alternative and/or moving fixation targets within wavefront device 11 .
- the location of the optical axis of the eye may be verified by reference to the data provided from an imaging assembly or pupil camera 13 that images the eye concurrently during the wavefront measurements.
- a pupil camera 13 images pupil 50 and/or the iris so as to allow subsequent determination of a position and torsional orientation of the pupil and/or iris for registration of the wavefront sensor data relative to the optical tissues, as will also be described hereinbelow.
- An alternative embodiment of a wavefront sensor system is illustrated in FIG. 3A .
- the major components of the system of FIG. 3A are similar to those of FIG. 3 .
- FIG. 3A includes an adaptive optical element 52 in the form of a deformable mirror.
- the source image is reflected from deformable mirror 52 during transmission to retina R, and the deformable mirror is also along the optical path used to form the transmitted image between retina R and imaging sensor 40 .
- Deformable mirror 52 can be controllably deformed to limit distortion of the image formed on the retina, and may enhance the accuracy of the wavefront data.
- the structure and use of the system of FIG. 3A are more fully described in U.S. Pat. No. 6,095,651, the full disclosure of which is incorporated herein by reference.
- a wavefront system for measuring the eye and ablations comprises elements of a VISX WaveScan™, available from VISX, Inc. of Santa Clara, Calif.
- a preferred embodiment includes a WaveScan with a deformable mirror as described above.
- An alternate embodiment of a wavefront measuring device is described in U.S. Pat. No. 6,271,915, the full disclosure of which is incorporated herein by reference.
- a treatment program map may be calculated from the wavefront elevation map so as to remove the regular (spherical and/or cylindrical) and irregular errors of the optical tissues.
- a table of ablation pulse locations, sizes, shapes, and/or numbers can be developed.
- An exemplary method and system for preparing such an ablation table is described in co-pending U.S. patent application Ser. No. 09/805,737 filed on Mar. 13, 2001 and entitled “Generating Scanning Spot Locations for Laser Eye Surgery,” the full disclosure of which is incorporated herein by reference.
- The ablation table may optionally be optimized by sorting the individual pulses so as to avoid localized heating, minimize irregular ablations if the treatment program is interrupted, and the like.
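- One simple heuristic consistent with the goal of avoiding localized heating is a greedy reordering that keeps consecutive pulses spatially far apart. The sketch below is illustrative only; it is not the sorting actually used by the system described here.

```python
import numpy as np

def spread_pulse_order(locations):
    """Greedy reordering of planned pulse centers so that consecutive
    pulses land far from one another.

    locations: (N, 2) array of (x, y) pulse centers from the ablation table.
    Returns a list of indices giving the new firing order.
    """
    locations = np.asarray(locations, dtype=float)
    remaining = list(range(len(locations)))
    order = [remaining.pop(0)]            # start from the first planned pulse
    while remaining:
        last = locations[order[-1]]
        dists = [np.linalg.norm(locations[i] - last) for i in remaining]
        order.append(remaining.pop(int(np.argmax(dists))))
    return order
```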
- a corneal ablation pattern may be calculated by processor 17 or 22 (or by another separate processor) for ablating the eye with laser ablation system 15 so as to correct the optical errors of the eye.
- Such calculations will often be based on both the measured optical properties of the eye and on the characteristics of the corneal tissue targeted for ablation (such as the ablation rate, the refractive index, the propensity of the tissue to form “central islands” or decreased central ablation depths within a uniform energy beam, and the like).
- the results of the calculation will often comprise an ablation pattern in the form of an ablation table listing ablation locations, numbers of pulses, ablation sizes, and/or ablation shapes to effect the desired refractive correction.
- alternative treatment plans may be prepared, such as corneal ring implant sizes, or the like.
- Wavefront measurement assembly 13 can use wavefront sensors 36 , such as Hartmann-Shack sensors, for obtaining a wavefront elevation surface 54 of the patient's eye.
- Wavefront elevation surface 54 can be run through a treatment algorithm 58 to generate a treatment table or ablation profile 60 that is customized to correspond to the patient's wavefront elevation surface 54 .
- ablation profile 60 can be calculated by a processor of wavefront device 10 , laser system 15 , or by a separate processor and stored in a memory of computer 17 , 22 .
- imaging assembly 11 can concurrently obtain an image 56 of the patient's eye, e.g., pupil and iris.
- the image of the patient's eye 56 can be analyzed by an algorithm 62 that locates the center of the pupil and/or iris, calculates the radius of the pupil and/or iris, and locates markers 64 in the patient's iris for subsequent registration and tracking.
- In order to register the ablation profile 60 and the patient's eye during the laser treatment, the ablation pattern and the patient's eye should share a common coordinate system. Thus, ablation profile 60 should be positionally and torsionally aligned with the patient's eye when the patient's eye is positioned in the path of the laser beam. Additionally, the translational and torsional orientation of the patient's eye should be tracked during the surgical procedure to ensure an accurate delivery of the ablation profile.
- pupil camera 20 is a video device that can obtain streaming video of the patient's eye.
- One frame 66 of the streaming video, typically the first frame, can be analyzed by the computer processor to locate the pupil center, iris center, and/or markers 64 that were originally located in the reference image 56 .
- a torsional offset θ0 between reference image 56 and video frame image 66 of the patient's eye is calculated.
- the computer can track the translational position (x(t), y(t), and z(t)) of the patient's eye E with a high speed eye tracker (HSET) 68 and the torsional orientation (θ(t)) of the eye with a torsional tracker 70 . Because the position of the center of the pupil is tracked with the HSET 68 , the torsional tracker 70 generally has to estimate the position of the markers 64 with respect to the pupil center.
- the computer can correct the delivery of the customized ablation pattern by adjusting the patient's customized treatment table 60 , incorporating the translation and torsional measurements into the table.
- the treatment table can be adjusted such that, at time t, if the overall rotation angle of the eye is θ(t) and the next pulse of the laser is to be delivered at location (x, y) on the cornea, the new location for delivery of the pulse can be defined by:
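- The expression itself is not reproduced in this text. As a minimal sketch, assuming the torsional correction is a planar rotation by θ(t) about the tracked pupil center combined with the tracked pupil translation, the adjusted pulse location could be computed as follows (the names and the decomposition into rotation plus translation are assumptions of this sketch):

```python
import math

def adjust_pulse_location(x, y, theta_deg, pupil_dx=0.0, pupil_dy=0.0):
    """Rotate a planned pulse location (x, y) by the measured torsional angle
    and shift it by the tracked pupil translation.

    (x, y) are taken relative to the tracked pupil center; theta_deg is the
    torsional angle theta(t); pupil_dx, pupil_dy are the tracked translation.
    """
    t = math.radians(theta_deg)
    x_new = x * math.cos(t) - y * math.sin(t) + pupil_dx
    y_new = x * math.sin(t) + y * math.cos(t) + pupil_dy
    return x_new, y_new
```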
- torsional tracker 70 can use the markers 64 identified above or other high-contrast iris patches; if the patient's iris contains too little texture, the surgeon has the option of drawing artificial landmarks 72 on the eye for tracking.
- the algorithm it is possible for the algorithm to decide if artificial markers are required.
- the translational position and torsional orientation of the patient's eye can be tracked and analyzed by a computer processor in real-time so that the x(t), y(t), z(t), and θ(t) information 74 can be used to adjust the customized treatment table 60 , allowing laser 12 to deliver the appropriate ablation pattern 76 to the patient's eye.
- a first step of the present invention entails registering a reference image of the eye taken during the calculation of the wavefront elevation map with a second image of the eye taken just prior to the delivery of the ablation energy.
- FIGS. 5 to 9 illustrate aspects of one embodiment of a method of the present invention.
- FIG. 5 schematically illustrates the data flow through an alignment algorithm that can torsionally register a reference image with a second image of the eye to determine the torsional displacement between the two images of the eye.
- An initial step in the method is to obtain the first, reference image.
- the images were 768×576 pixels and had 256 gray levels.
- the image contains the pupil and the iris. In some images, part of the iris may be occluded by one or both of the eyelids or cropped by the camera's field of view.
- the present invention can use a variety of imaging devices to produce different images and can be illuminated under various types of illumination.
- the smallest distance between the edge of the pupil and obstructing elements, such as eyelids, eyelashes, strong shadows, or highlights, should be sufficiently large to leave a portion of the iris completely exposed for the entire 360-degree range.
- the largest possible portion of the iris is in sharp focus so as to expose its texture.
- a pupil finding algorithm can be used to locate the pupil, calculate the radius of the pupil and find the center of the pupil (Step 82 ).
- the pupil is located by thresholding the image: a pixel-value histogram is analyzed and the threshold is chosen at the position of the first "dip" in the histogram occurring after at least 2000 pixels fall below the cutoff threshold. All pixels below the threshold are labeled with "1" and pixels above the threshold are labeled with "0". Pixels labeled with "1" will generally correspond to the pupil, eyelashes, and possibly other regions of the image. It should be appreciated, however, that the number of pixels employed will be related to the area of the pupil and will vary with applications of the invention.
- the two distinguishing features of the pupil region, compared to other non-pupil regions, are its large size and central location.
- Accordingly, regions intersecting a 5-pixel-wide inner frame of the image can be discarded and the largest remaining region can be selected as the pupil.
- the selected pupil region can be filled to remove any holes created by reflections, or the like.
- the remaining region of the image may also be analyzed for convexity. If the ratio of the area of the region to the area of its convex hull is less than 0.97, a circle completion procedure can be applied to the convex points on the region's boundary.
- One way of performing such an analysis is through the Matlab function “imfeature(. . . , ‘ConvexHull’)”.
- a radius and center of the pupil can be estimated by a standard weighted least-square estimation procedure. If the convexity quotient is above 0.97, the radius and centroid can be obtained using conventional methods, such as Matlab's “imfeature(. . . , ‘Centroid’, ‘EquivDiameter’)” function.
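- One common way to realize the weighted least-squares step is an algebraic circle fit; the patent does not specify the exact estimator, so the following Python sketch (a Kasa-style fit) is only an assumed implementation:

```python
import numpy as np

def weighted_circle_fit(x, y, w=None):
    """Algebraic (Kasa-style) weighted least-squares circle fit.
    Solves x^2 + y^2 + D*x + E*y + F = 0 in the weighted least-squares sense."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    w = np.ones_like(x) if w is None else np.asarray(w, float)
    A = np.column_stack([x, y, np.ones_like(x)])
    b = -(x**2 + y**2)
    sw = np.sqrt(w)
    D, E, F = np.linalg.lstsq(A * sw[:, None], b * sw, rcond=None)[0]
    cx, cy = -D / 2.0, -E / 2.0
    radius = np.sqrt(cx**2 + cy**2 - F)
    return (cx, cy), radius
```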
- an iris finding algorithm can be used to locate the iris, calculate the radius of the iris, and/or locate the iris center. Since the images of the eye from imaging assembly 11 and camera 20 both contain the pupil and iris, in some embodiments it may be more accurate to register the images by calculating the center of the pupil and the center of the iris and expressing the position of the pupil center with respect to the center of the iris.
- the center of the iris may be described as a center of a circle corresponding to the outer boundary of the iris. The position of the center of the iris can be used to calculate a pupil offset from the iris center.
- X_P^WS are the coordinates of the center of the pupil in image 56 ( FIG. 4 ).
- C = −X_I^WS + X_P^WS − X_P^LASER + X_I^LASER
- FIGS. 6C and 6D schematically illustrate simplified images of the eye taken with image assembly 11 and camera 20 , respectively that can be analyzed to find the pupil center and iris center.
- Marker 200 marks the iris center in both images
- marker 204 corresponds to the pupil center in image 56
- marker 206 corresponds to the pupil center in the laser image 66 .
- the pupil has changed in size (as shown by the gray outline) and the center of the pupil has moved relative to the center of the iris 200 .
- the wavefront measurement and corresponding ablation pattern can be centered over the center position C calculated by the above equation.
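- A small Python sketch of the centering equation above; the function name is illustrative, and the inputs are the pupil and iris centers measured in the wavefront (WS) and laser images:

```python
import numpy as np

def ablation_center(x_p_ws, x_i_ws, x_p_laser, x_i_laser):
    """C = -X_I^WS + X_P^WS - X_P^LASER + X_I^LASER, i.e. the pupil-to-iris
    offset in the wavefront image minus the same offset in the laser image."""
    x_p_ws, x_i_ws = np.asarray(x_p_ws, float), np.asarray(x_i_ws, float)
    x_p_laser, x_i_laser = np.asarray(x_p_laser, float), np.asarray(x_i_laser, float)
    return -x_i_ws + x_p_ws - x_p_laser + x_i_laser
```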
- One method for detection of both iris and the pupil in the image I(x,y) is to minimize the following integral over all possible values of iris radius and center:
- the search can be simplified by noting that the pupil center has already been found (as described above), that the iris radius has a limited range of possible values, and that the iris center is usually not very far from the pupil center.
- the limited range of iris radius values occurring in nature allows the search to be restricted to a ring centered at the pupil center, with inner and outer radii chosen such that the iris edge should always be located somewhere within that ring.
- the numerical search range can be between approximately 10.5 mm and 14 mm. In other embodiments, the range may be larger or smaller, if desired. See Burns et al., IOVS, July 2002.
- circles 208 , 210 illustrate a potential range for the iris radius.
- the values of the radial derivative that exceed a certain threshold can be passed to the weighted least square estimator for the best circle fit through the set of points, as is described herein.
- the initial weights of the points are proportional to their intensity. After enough iterations (e.g., two iterations) are performed to reach a stable solution, the algorithm converges to the answer represented by the red circle.
- the iris finding algorithm shows tolerance to other edges detected by the derivative operator, but corresponding to other structures in the image (e.g., LASIK flap). If desired, to reduce the computation time, the original images can be smoothed with a Gaussian kernel and sub-sampled by a factor of four prior to a derivative computation.
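- A simplified Python sketch of collecting candidate iris-boundary points from the radial derivative inside the search ring; taking the per-ray maximum and the numeric threshold are illustrative simplifications, and the resulting points and weights could then be passed to a weighted circle fit such as the sketch above:

```python
import numpy as np

def iris_edge_points(image, pupil_center, r_inner, r_outer,
                     n_angles=360, deriv_threshold=5.0):
    """Sample intensity along rays from the pupil center, take the radial
    derivative inside the [r_inner, r_outer] ring, and keep strong edges
    as candidate iris-boundary points (weights ~ derivative magnitude)."""
    cx, cy = pupil_center
    radii = np.arange(r_inner, r_outer)
    angles = np.linspace(0, 2 * np.pi, n_angles, endpoint=False)
    pts, wts = [], []
    for a in angles:
        # nearest-neighbour sampling along the ray (bilinear would be smoother)
        xs = np.clip((cx + radii * np.cos(a)).round().astype(int), 0, image.shape[1] - 1)
        ys = np.clip((cy + radii * np.sin(a)).round().astype(int), 0, image.shape[0] - 1)
        profile = image[ys, xs].astype(float)
        deriv = np.abs(np.diff(profile))          # radial derivative along the ray
        k = int(np.argmax(deriv))
        if deriv[k] > deriv_threshold:
            pts.append((xs[k], ys[k]))
            wts.append(deriv[k])                  # weight ~ edge strength
    return np.array(pts), np.array(wts)           # feed to weighted_circle_fit()
```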
- the boundary of the iris can be localized with sub-pixel accuracy, but it might be slightly displaced from its true location if the shadows in the image soften the boundary edge.
- the errors are fairly well balanced in all directions from the center, so that the final result is very close to the actual center.
- the image scale for both the second image (e.g., laser image) and the first image (e.g., wavefront image) is estimated to be 52.3 pixels per millimeter, which is 19.1 μm per pixel.
- An error of one pixel in the boundary estimation on one side of the iris would result in about a 10 μm error in the estimate of the iris center.
- the errors of a few pixels in the iris boundary would still be within the acceptable accuracy for the ablation centering.
- a width of the iris ring can be extracted from the images.
- the iris can be treated as an elastic sheet stretched between pupil and the outer rim of the iris.
- the width of the iris band can be set to 76 pixels for images of dark-colored eyes, and 104 pixels for the light-colored eyes. It should be appreciated, however, that other width estimations can be used.
- the radius of the iris in the reference images of FIGS. 6A and 6B was estimated to be 320 pixels and assumed to be roughly constant for all people.
- the iris ring can then be unwrapped and divided into a fixed number of sectors, by converting the Cartesian iris coordinates into polar coordinates, centered at the pupil. (Step 86 ).
- Applicant has found that unwrapping and scaling the iris ring allows better matching of texture blocks between different images of the eye by means of pure translation. For example, as shown in FIGS. 7C and 7D , if the iris ring is not unwrapped, the software may have trouble matching texture blocks that have rotated ( FIG. 7C ), whereas if the iris ring is unwrapped, the texture blocks have the same relative shape ( FIG. 7D ).
- the iris ring can be sampled at one-pixel steps in the radial direction for the reference image.
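- A minimal Python sketch of unwrapping the iris ring into pupil-centered polar coordinates; the angular sampling density (n_angles) and nearest-neighbour interpolation are assumptions, while the one-pixel radial steps, the band width, and the twenty four sectors follow the text:

```python
import numpy as np

def unwrap_iris(image, pupil_center, pupil_radius, band_width=76, n_angles=1440):
    """Unwrap the iris ring into a (band_width x n_angles) rectangle:
    rows step outward from the pupil margin in one-pixel steps, columns
    sweep 0..360 degrees around the pupil center."""
    cx, cy = pupil_center
    radii = pupil_radius + np.arange(band_width)           # one-pixel radial steps
    angles = np.linspace(0, 2 * np.pi, n_angles, endpoint=False)
    rr, aa = np.meshgrid(radii, angles, indexing="ij")
    xs = np.clip((cx + rr * np.cos(aa)).round().astype(int), 0, image.shape[1] - 1)
    ys = np.clip((cy + rr * np.sin(aa)).round().astype(int), 0, image.shape[0] - 1)
    band = image[ys, xs]
    sectors = np.array_split(band, 24, axis=1)             # 24 sectors of 15 degrees
    return band, sectors
```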
- the dynamic range of pixel values in the iris may be adjusted to remove outliers due to reflections from the illumination LED lights.
- the pixel value histogram can be thresholded so that all the pixels with values above the threshold are assigned the value of the threshold.
- some band-pass filtering may be applied to the iris bands prior to region selection to remove lighting variation artifacts.
- the iris region is segmented into twenty four sectors of fifteen degrees. It should be appreciated, however, that in other embodiments, the iris region can be segmented into more than twenty four sectors or less than twenty four sectors.
- the markers in the reference image can be stored and later located in the second image of the eye so as to estimate the torsional displacement of the eye between the two images.
- One embodiment of a method of locating the markers is described more fully in Groen, E., “Chapter 1 on Video-oculography,” PhD Thesis, University of Utrecht (1997), the complete disclosure of which is incorporated herein by reference.
- the markers should be sufficiently distinct and have high contrast. There are several possible ways to select such points.
- a square mask of size M×M (for example, 21×21 for dark-colored eyes and 31×31 for light-colored eyes) is defined.
- the mask can be scanned over each of the twenty four sectors, and for each pixel in each sector a value is computed from the region inside the mask centered at that pixel.
- the value assigned to the pixel is determined as the sum of amplitudes of all spatial frequencies present in the region.
- the sum of the amplitudes can be computed by a Fourier transform of the region. If desired, the central 5×5 portion of the Fourier spectrum can be nulled to remove a DC component.
- the maximum value can then be located in each sector, such that the boundary of its corresponding mask is at least 5 pixels away from the iris image boundary in order to avoid getting close to the pupil margin and other boundary artifacts, such as the eyelid and eyelashes.
- the “winning” positions and the corresponding blocks are stored for later comparison.
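- A Python sketch of the Fourier-based texture measure and per-sector marker selection described above; the brute-force scan and the use of the sector edges (rather than the full iris-band boundary) for the 5-pixel margin are simplifications:

```python
import numpy as np

def texture_value(block):
    """Sum of spatial-frequency amplitudes of a block, with the central
    5x5 of the (centered) Fourier spectrum nulled to drop the DC term."""
    spectrum = np.fft.fftshift(np.fft.fft2(block.astype(float)))
    c0, c1 = spectrum.shape[0] // 2, spectrum.shape[1] // 2
    spectrum[c0 - 2:c0 + 3, c1 - 2:c1 + 3] = 0.0
    return np.abs(spectrum).sum()

def best_marker_in_sector(sector, mask_size=21, margin=5):
    """Slide an M x M mask over a sector of the unwrapped iris band and
    return the (row, col) whose surrounding block has the largest texture
    value, keeping the mask at least `margin` pixels from the band edges."""
    half = mask_size // 2
    best, best_pos = -1.0, None
    for r in range(half + margin, sector.shape[0] - half - margin):
        for c in range(half + margin, sector.shape[1] - half - margin):
            v = texture_value(sector[r - half:r + half + 1, c - half:c + half + 1])
            if v > best:
                best, best_pos = v, (r, c)
    return best_pos, best
```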
- let λ1, λ2 be the eigenvalues of the matrix Z, with λ2 being the smaller one; λ2 is then the texture strength of the block.
- the second image of the eye can also be obtained.
- the second image is obtained with a laser surgical system's microscope camera prior to delivering the ablative energy to the patient.
- the laser camera has a resolution of 680×460 pixels using 256 grayscale levels.
- the magnification of the laser camera relative to the reference CCD camera was estimated to be 0.885.
- the eye can be illuminated by a set of infrared LED lights having a wavelength of 880 nm. It should be appreciated, however, that many other imaging devices can be used to obtain different image types, including images that do not require a magnification, images of different resolution, and images that are illuminated by other light wavelengths.
- the sectors in the second image are located and the salient regions that correspond to the salient regions in the reference image are located. (Step 94 ; FIG. 7B ). For each sector in the second image, a best matching region is located.
- the search is constrained to the matching sector and the two adjacent sectors in the second image, thus limiting possible matches to within 15 degrees, which is a reasonable biological limit for ocular cyclo-rotation. It should be appreciated, however, that in other embodiments the limit on possible matches may be larger or smaller than 15 degrees.
- the match between the marker in the reference image and the marker in the second image is evaluated as the sum of absolute errors (after both blocks are made to have zero mean value) for each corresponding region centered at a given pixel.
- as shown in FIGS. 8A and 8B , due to the presence of LED reflections on the iris, some portions of the iris may lose their texture in the second image. In some embodiments, these areas 95 can be detected by histogram analysis similar to the pupil detection and can be excluded from matching. The points with the smallest error can then be selected as the matching markers for each marker in the reference image.
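- A Python sketch of the zero-mean sum-of-absolute-errors matching on the unwrapped bands, with the search limited to roughly ±15 degrees worth of columns and invalid (e.g., reflection) columns skipped; restricting the search to purely angular shifts is a simplification of the sector-based search described above:

```python
import numpy as np

def zero_mean_sad(block_a, block_b):
    """Sum of absolute errors after both blocks are made zero-mean."""
    a = block_a.astype(float) - block_a.mean()
    b = block_b.astype(float) - block_b.mean()
    return np.abs(a - b).sum()

def match_marker(ref_block, second_band, ref_col, max_shift_cols, invalid_mask=None):
    """Slide the reference block over the unwrapped second image within
    +/- max_shift_cols columns (~15 degrees) of its reference position,
    skipping columns flagged invalid (e.g. LED reflections).
    Assumes the band is at least as tall as the reference block."""
    h, w = ref_block.shape
    best_err, best_col = np.inf, None
    for col in range(max(0, ref_col - max_shift_cols),
                     min(second_band.shape[1] - w, ref_col + max_shift_cols) + 1):
        if invalid_mask is not None and invalid_mask[:, col:col + w].any():
            continue
        err = zero_mean_sad(ref_block, second_band[:h, col:col + w])
        if err < best_err:
            best_err, best_col = err, col
    return best_col, best_err
```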
- a dot product of the mean-subtracted reference and the second image patches can be calculated, where:
- an angular displacement for each marker is calculated to estimate a total torsional angle of the eye between the first, reference image and the second image. (Step 96 ; FIG. 9 ).
- the displacement of each marker would be identical and equal to the torsional angle.
- the center of the pupil may not be estimated correctly. This introduces a sinusoidal distribution of displacement angles around the true torsional angle. The amplitude of the sinusoid is usually quite small.
- the actual shape of the pupil is often elliptical rather than round. This can introduce a sinusoidal distortion with twice as many cycles per revolution as the pupil-center distortion (i.e., half its period), due to the way the landmarks are measured with respect to the circular pupil. Indeed, points further away from the pupil center will be spaced closer to each other after the iris is unwrapped, and points closer to the pupil center will end up being spaced more widely.
- Application of the functions to the torsional angle data can thereafter provide an estimate for the torsional angle θ0 between the reference image and the second image.
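- A Python sketch of the sinusoidal fit F = TA + A·sin(θ) + B·cos(θ) applied to the per-marker angular displacements; returning the residual RMS alongside the torsion estimate mirrors the quality-of-fit check mentioned later (cf. FIG. 17B), though the exact rejection rule here is an assumption:

```python
import numpy as np

def fit_torsion(marker_angles_deg, marker_displacements_deg):
    """Least-squares fit of d(theta) = TA + A*sin(theta) + B*cos(theta):
    TA is the torsion estimate; the sinusoid absorbs pupil-center and
    pupil-ellipticity artifacts discussed in the text."""
    th = np.radians(np.asarray(marker_angles_deg, float))
    d = np.asarray(marker_displacements_deg, float)
    M = np.column_stack([np.ones_like(th), np.sin(th), np.cos(th)])
    (ta, a, b), *_ = np.linalg.lstsq(M, d, rcond=None)
    residual_rms = np.sqrt(np.mean((M @ np.array([ta, a, b]) - d) ** 2))
    return ta, residual_rms   # a large RMS can flag a bad fit for rejection
```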
- the initial torsional angle θ0 computed by the alignment algorithm (between the iris image 56 taken with pupil camera 13 and the initial video frame 66 from imaging device 20 ) can be added to every subsequent frame for tracking of the torsional orientation of the patient's eye.
- the high speed eye tracker (HSET) of the laser surgical system can be used to keep track of the translation of the pupil in the x, y, and z directions. Having the position of the pupil readily available requires only that the torsional tracker estimate the positions of the iris landmarks with respect to the center of the pupil.
- the iris can undergo rigid translations (e.g., movement in the x, y, and z directions), rotations, as well as some non-rigid affine transformations of scaling and shearing. While the torsional angle is not affected by the non-rigid transformations, it is preferable that the non-rigid transformations be taken into account in order to ensure accurate feature matching from frame to frame.
- Such an approach is described in computer vision literature such as Lucas, B. D. and Kanade, T., “An Iterative Image Registration Technique with an Application to Stereo Vision,” IJCAI (1981); Shi, J. and Tomasi, C., “Good Features to Track,” IEEE Conference on Computer Vision and Pattern Recognition (1994); and Hager, G. D. and Toyama, K., “X-Vision: A Portable Substrate for Real-Time Vision Applications,” Computer Vision and Image Understanding (1996).
- FIG. 10 schematically illustrates a simplified method of tracking the torsional rotation of the patient's eye during the surgical procedure.
- the pupil and iris are located in both the first frame and n th frame of the video stream.
- Reference points can be located in the first frame and the corresponding reference points can be located in the n th frame of the video stream. (Step 102 ).
- The angular offset between the reference points in the two images can then be calculated to estimate the torsional rotation of the eye. (Step 104 ).
- The steps can be repeated for each of the frames of the video stream until the ablation procedure is completed. (Step 105 ).
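- A skeleton of this per-frame loop in Python; find_pupil and locate_blocks are placeholder callables standing in for the pupil finder and block matcher described elsewhere in the text:

```python
import numpy as np

def block_angles(blocks_xy, pupil_xy):
    """Angular positions (degrees) of tracked blocks relative to the pupil center."""
    return np.array([np.degrees(np.arctan2(y - pupil_xy[1], x - pupil_xy[0]))
                     for x, y in blocks_xy])

def track_torsion(frames, find_pupil, locate_blocks, theta_0):
    """Skeleton of the per-frame loop (Steps 102-105): locate the pupil,
    locate the reference blocks relative to it, and report the total
    torsional angle theta_0 + theta(t) for each frame."""
    pupil_ref = find_pupil(frames[0])
    blocks_ref = locate_blocks(frames[0], pupil_ref)       # e.g. 8 and 2 o'clock blocks
    angles_ref = block_angles(blocks_ref, pupil_ref)
    for frame in frames[1:]:
        pupil_n = find_pupil(frame)
        blocks_n = locate_blocks(frame, pupil_n)            # search within ~15 degrees
        diffs = block_angles(blocks_n, pupil_n) - angles_ref
        diffs = (diffs + 180.0) % 360.0 - 180.0              # wrap to (-180, 180]
        theta_t = diffs.mean()                               # theta_n = mean_i(theta'_i - theta_i)
        yield theta_0 + theta_t                              # theta_total(t) = theta_0 + theta(t)
```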
- FIG. 11 is an example of a first frame 106 from the video stream of the eye taken prior to the laser ablation.
- a pupil 108 has been located (as noted by the circular outline 110 around the circumference of the pupil), and two reference loci or points 112 , 114 are selected for torsional tracking.
- reference points 112 , 114 are a subset of the points chosen for registration (described above).
- the points 112 , 114 can be chosen automatically by the software of the present invention based on their texture strength and positioning relative to the pupil (e.g., 8 o'clock position and 2 o'clock position). In alternative embodiments, however, it may be possible to independently select points 112 , 114 separate from the original markers using the same technique described above, or to manually select or draw the reference points 112 , 114 on the patient's iris.
- the process of selecting points for tracking can be automatic or surgeon-assisted.
- the automatic process can select one point on the right of the pupil and one on the left based on which reference block in the corresponding neighborhood has the best block-match score and is also included in the estimate of the alignment angle, i.e., is not an outlier. If the texture of the iris has very low contrast or does not have distinctive components, it may be necessary to introduce artificial landmarks. Such landmarks can be drawn on the eye by the surgeon, so that the algorithm tracks their spatial displacements instead of the displacements of the patches of iris texture.
- One exemplary selection algorithm selects a subset of blocks that are not outliers. From this subset, blocks are removed that are in the positional domain of possible reflections. These positions are known due to the specific placement of LEDs on the laser. The texture of the remaining blocks from the laser image may be quantified by the second largest eigenvalue λ2. Two blocks, roughly on opposite sides of the pupil, are chosen such that they have the largest λ2 in the group. In one embodiment, the “left block” is selected from the valid blocks centered around the 8-o'clock position, and the “right block” is selected among the valid blocks centered at the 2-o'clock position. The coordinates of the centers of these blocks can be used to initialize tracking.
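- A Python sketch of this block selection; it assumes Z is the usual gradient structure tensor from the cited “Good Features to Track” approach, and the clock-hour bookkeeping (positions_clock, is_outlier, near_reflection) is purely illustrative:

```python
import numpy as np

def texture_strength(block):
    """Smaller eigenvalue (lambda_2) of the gradient structure tensor Z;
    a large lambda_2 means strong texture in both directions."""
    gy, gx = np.gradient(block.astype(float))
    Z = np.array([[np.sum(gx * gx), np.sum(gx * gy)],
                  [np.sum(gx * gy), np.sum(gy * gy)]])
    return np.linalg.eigvalsh(Z)[0]      # eigvalsh returns ascending eigenvalues

def pick_tracking_blocks(blocks, positions_clock, is_outlier, near_reflection):
    """Choose one block near 8 o'clock and one near 2 o'clock with the
    largest lambda_2, skipping outliers and known LED-reflection areas."""
    def best(target_hour):
        candidates = [i for i, h in enumerate(positions_clock)
                      if abs(h - target_hour) <= 1            # 'centered around' that hour
                      and not is_outlier[i] and not near_reflection[i]]
        return max(candidates, key=lambda i: texture_strength(blocks[i]), default=None)
    return best(8), best(2)
```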
- Once the blocks/loci 112 , 114 have been selected in the first frame, for each consecutive frame of the video feed the blocks are located within a region of the iris that has the same position with respect to the pupil of the eye.
- the region is generally limited to approximately 15 degrees, since the eye will generally not rotate more than that, and in the short time between consecutive frames of the video stream the torsional rotation will likely be much less than 15 degrees.
- the range of analysis can be limited to a smaller or larger range, if desired.
- the spatially corresponding regions of the first frame and the n th frame can be compared for affine displacement, giving preference to rigid transformations. In one embodiment, only horizontal and vertical displacements are reported by the tracking algorithm.
- FIG. 12 illustrates six images of selected blocks 112 , 114 .
- Images 116 , 118 are images of blocks 112 , 114 in reference image 66 .
- Blocks 120 , 122 are the corresponding blocks from the new, real-time frame.
- Block images 124 , 126 are the best transformed block from the first frame that match the target block. From the change in the positional coordinates of the blocks 112 , 114 , a torsional angle between the first frame and the second frame can be computed. ( FIG. 13 ).
- B i is the coordinate of the i th block in the reference frame
- X is the pupil center coordinate in the reference frame
- X n is the pupil center coordinate in the n th frame
- the expected pupil-centered coordinates of the blocks in both frames are: B_i → B_i − X
- one part of the described embodiment of the tracking algorithm is to estimate the motion parameters of a given block or marker. If I is the block in the original frame and J is the spatially corresponding block in a subsequent frame, let x be the pixel coordinates in these blocks. To estimate an affine transformation matrix A and translation vector D, the following equation can be minimized:
- Matrix A can be decomposed into a rotation component and a scale/shear component as follows:
- a linear system for computing rigid motion parameters is:
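- The minimized expression and the linear system are not reproduced in this excerpt; as a sketch of the decomposition mentioned above, the following Python snippet splits an estimated 2×2 affine matrix A into a rotation and a symmetric scale/shear factor using a standard polar decomposition (the patent's own parameterization may differ):

```python
import numpy as np

def decompose_affine(A):
    """Split a 2x2 affine matrix into a pure rotation R and a symmetric
    scale/shear part S (polar decomposition, A = R @ S), and return the
    rotation angle in degrees."""
    U, _, Vt = np.linalg.svd(A)
    R = U @ Vt
    if np.linalg.det(R) < 0:            # keep R a proper rotation (no reflection)
        U[:, -1] *= -1
        R = U @ Vt
    S = R.T @ A                         # symmetric scale/shear part
    angle = np.degrees(np.arctan2(R[1, 0], R[0, 0]))
    return R, S, angle
```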
- the magnification factor of the laser's camera was adjusted to match that of the imaging device of the wavefront measurement system, thus eliminating scaling issues. Also, as the effective resolution of the laser camera increased due to the larger magnification factor, more details became visible on light-colored irises.
- Sixteen eyes (from six people) were photographed with the CCD of the VISX WaveScan™ camera while subjects were in the sitting position, and with the laser's camera while subjects were lying down in the surgical chair. The torsional angle was estimated between the two photographs of the same eye of the same subject.
- FIG. 14 is a torsional angle estimate for two different dark-colored iris eyes.
- FIG. 15 is a torsional angle estimate for two different light-colored iris eyes.
- the line fit criterion is not explicitly evaluated, since it can be thought of as a sinusoidal fit of zero amplitude. This is simply a result of having 3 parameters in the sinusoidal fit (mean, amplitude and phase) versus one parameter for the line (mean). Therefore, any line fit quality would be worse than the sinusoidal estimates, even if it captures the nature of the data. As mentioned earlier, the line fit estimate of the torsion angle is usually close to the value reported by a sinusoidal or possibly a double sinusoidal fit.
- FIGS. 16A and 16B summarize the results for the data set processed by the algorithm.
- a majority of the iris should be visible so that the minimum width of the iris ring is more than 80 pixels.
- the focus of the camera should be adjusted so that most of the iris is in focus providing the highest possible texture resolution of the iris ring. Several images can be taken to ensure good quality.
- images with strong shadows and reflections on the iris should be rejected in order to avoid strong false markers.
- images should be saved into a file of type BMP or TIF.
- image names should contain unique name of the subject, left or right indicator for the eye and the ID of the device from which they come (e.g., laser image or wavefront image).
- the illumination when obtaining the wavefront image should be the same when obtaining the image with the laser camera.
- Applicants have found that dark-colored eyes have richer texture under infrared illumination and light-colored eyes have richer texture under visible light.
- the striated trabecular meshwork of elastic pectinate ligament creates a predominant texture under visible light.
- under infrared illumination, deeper, more slowly modulated stromal features dominate the iris pattern. See, for example, Daugman, J., “High confidence visual recognition of persons by a test of statistical independence,” IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 15(11), pp. 1148-1161 (1993).
- Image quality may also be degraded by LED reflections. Because illumination is required, it may be unavoidable to have several LED reflections on the iris; these features can be handled by the algorithm as described above. Strong shadows, however, can greatly degrade the image quality. As shown in FIG. 17A , the shadow makes it impossible to discern any texture on the right side of the iris. As a result, as shown in FIG. 17B , the alignment data obtained from the image in FIG. 17A was rejected due to the large RMS factor (i.e., above 1). Therefore, the alignment algorithm of the present invention can have an internal quality-of-fit check that automatically rejects bad data.
- a first step was to mark the expected position of the LASIK flap as an invalid region, preventing the algorithm from selecting reference blocks in that area of the iris.
- a second step was to apply band-pass filtering to the unwrapped iris images (a filtering sketch follows after this list).
- the convolution kernel was set to be the difference of 2-D Gaussian distributions with standard deviations equal to 3 and 12 pixels.
- a third step was the introduction of bi-directional alignment, in which the blocks were selected and matched both from the wavefront device to the laser and from the laser to the wavefront device. This essentially doubled the number of data points used for sinusoidal fitting.
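- As a sketch of the difference-of-Gaussians filtering in the second step above (standard deviations of 3 and 12 pixels per the text; the SciPy-based implementation is an assumption):

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def bandpass_iris(unwrapped_band, sigma_small=3.0, sigma_large=12.0):
    """Difference-of-Gaussians band-pass applied to the unwrapped iris band,
    suppressing slow lighting variation while keeping mid-frequency texture."""
    band = unwrapped_band.astype(float)
    return gaussian_filter(band, sigma_small) - gaussian_filter(band, sigma_large)
```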
- the algorithm was run through several tests.
- the first set of results using the methods and software of the present invention to track the torsional movement of the patient's eye involved artificial rotation of an image of a video frame from the laser surgical system's camera 20 .
- the image was rotated by 1 degree counter-clockwise for each subsequent frame.
- a total of 15 rotated frames were analyzed by the torsional tracking algorithm.
- the original frame and final frame are illustrated in FIGS. 18A and 18B , respectively.
- the torsional tracking algorithm was accurate for every frame to a precision of within 0.2 degrees of the actual value.
- the second set of results comes from a 500-frame sequence capturing 25 seconds of real video of an eye.
- Several variables were tracked during the video processing: pupil center position, pupil radius, torsional angle, and error estimates for the two blocks tracked for each frame. The sequence was also visually inspected to verify the block match and the overall eye torsion.
- the zeroth frame ( FIG. 19A ) was used as a reference, with two 31×31 pixel blocks marked for tracking. The last frame shows the same blocks at the appropriate locations. ( FIG. 19B ).
- FIGS. 20-23 show the data extracted from the video sequence.
- FIG. 20 shows the pupil position over time.
- FIG. 21 shows the change of the pupil radius from frame 0 to 500.
- FIG. 22 illustrates errors per frame/block.
- FIG. 23 shows the torsional angle of the markers (relative to the first frame of the video).
- FIG. 24 depicts the tracking results for the 30-frame sequence starting with the 345 th frame.
- the data shows that the algorithm jumped to the correct position and correctly tracked the blocks throughout the video sequence to within ¼ degree precision compared to the original torsional data. Skipping video frames is often required to give the torsional alignment algorithm time to establish the rotational angle between the reference image and the second image (e.g., the first frame of the video sequence).
- FIG. 25 shows the torsional data extracted from the slower acquired sequence. Such data still matches the measurement extracted from the normal frame rate sequence illustrated in FIG. 23 .
- the torsional tracking algorithm was engaged using the two images (snapshot and wavefront image) as a reference.
- the measurement of the torsional eye movements with respect to the reference image is depicted in FIGS. 27A and 27B .
- the estimated torsional angle referenced to the first image of the video sequence ( FIG. 27A ) closely resembled the one referenced to the snapshot ( FIG. 27B ), with the exception of a constant offset of about 0.5 degrees counterclockwise.
- θ_first video image ≈ 1.04 · θ_snapshot + 0.80, where the alignment angle has a sign notation of clockwise being positive.
- the difference between the two estimates is shown in FIG. 28 .
- the mean shows the difference in the total alignment angle and its value is less than 1 degree, which is the specified tolerance for this one exemplary embodiment. It should be appreciated however, that other embodiments may have a tolerance that is more than 1 degree or less than 1 degree.
- FIGS. 29A and 29B show two different torsional angle estimates that include the alignment with the wavefront measurement image.
- the reference frames for the two estimates were 0.41 degrees clockwise 134 ( FIG. 29A ) and 1.17 degrees clockwise 136 ( FIG. 29A ).
- the errors between the estimates are shown in FIG. 29B as a function of the frame number. As in previous tests, the errors do not exceed 1 degree for any frame.
Landscapes
- Health & Medical Sciences (AREA)
- Engineering & Computer Science (AREA)
- Life Sciences & Earth Sciences (AREA)
- Ophthalmology & Optometry (AREA)
- Physics & Mathematics (AREA)
- General Health & Medical Sciences (AREA)
- Veterinary Medicine (AREA)
- Heart & Thoracic Surgery (AREA)
- Surgery (AREA)
- Animal Behavior & Ethology (AREA)
- Public Health (AREA)
- Biomedical Technology (AREA)
- Medical Informatics (AREA)
- Biophysics (AREA)
- Molecular Biology (AREA)
- Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
- Vascular Medicine (AREA)
- Optics & Photonics (AREA)
- Theoretical Computer Science (AREA)
- General Physics & Mathematics (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Human Computer Interaction (AREA)
- General Engineering & Computer Science (AREA)
- Signal Processing (AREA)
- Radiology & Medical Imaging (AREA)
- Quality & Reliability (AREA)
- Laser Surgery Devices (AREA)
- Eye Examination Apparatus (AREA)
- Image Processing (AREA)
- Length Measuring Devices By Optical Means (AREA)
Abstract
Description
C = −X_I^WS + X_P^WS − X_P^LASER + X_I^LASER
in which the higher the “L”, the better the match between the markers.
F1=TA1
F2=TA2+A1*sin(θ)+B1*cos(θ)
where TAs are the estimates of the true torsional angle and θ is the angular coordinate of the markers. Application of the functions to the torsional angle data can thereafter provide an estimate for the torsional angle θ0 between the reference image and the second image.
θtotal(t)=θ0+θ(t)
where θ(t) is the measured torsional angle between the eye in the initial frame of the video stream and the eye in the nth frame at time t.
I_n(Ax + d) = I_0(x)
where A = 1 + D, D is a deformation matrix, and d is the translation of the feature window. Such an approach is described in computer vision literature such as Lucas, B. D. and Kanade, T., “An Iterative Image Registration Technique with an Application to Stereo Vision,” IJCAI (1981); Shi, J. and Tomasi, C., “Good Features to Track,” IEEE Conference on Computer Vision and Pattern Recognition (1994); and Hager, G. D. and Toyama, K., “X-Vision: A Portable Substrate for Real-Time Vision Applications,” Computer Vision and Image Understanding (1996), the complete disclosures of which are incorporated herein by reference. Parameters of deformation and translation are determined by a Newton-Raphson minimization procedure, which can produce accurate results.
B_i^n = B_i − X + X_n.
B_i → B_i − X
B′_i = B_i − D_i
θ_n = mean_i(θ′_i − θ_i)
where θ′_i is the angular position of the block in the n th frame and θ_i is the angular position of the block in the reference (first) frame.
where w(x) is an optional weighting function. Because the equations above are approximations, iterative Newton-Raphson minimization can be used to solve the system.
Experimental Registration Results:
θtotal(t) = Tracking[reference image, video](t) + Alignment[Wavefront reference image]
Claims (12)
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US11/775,840 US7431457B2 (en) | 2002-05-30 | 2007-07-10 | Methods and systems for tracking a torsional orientation and position of an eye |
US12/210,933 US8740385B2 (en) | 2002-05-30 | 2008-09-15 | Methods and systems for tracking a torsional orientation and position of an eye |
US14/258,854 US9596983B2 (en) | 2002-05-30 | 2014-04-22 | Methods and systems for tracking a torsional orientation and position of an eye |
US15/430,087 US10251783B2 (en) | 2002-05-30 | 2017-02-10 | Methods and systems for tracking a torsional orientation and position of an eye |
Applications Claiming Priority (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US38465302P | 2002-05-30 | 2002-05-30 | |
US10/300,714 US7044602B2 (en) | 2002-05-30 | 2002-11-19 | Methods and systems for tracking a torsional orientation and position of an eye |
US11/277,743 US7261415B2 (en) | 2002-05-30 | 2006-03-28 | Methods and systems for tracking a torsional orientation and position of an eye |
US11/775,840 US7431457B2 (en) | 2002-05-30 | 2007-07-10 | Methods and systems for tracking a torsional orientation and position of an eye |
Related Parent Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US11/277,743 Division US7261415B2 (en) | 2002-05-30 | 2006-03-28 | Methods and systems for tracking a torsional orientation and position of an eye |
US11/277,743 Continuation US7261415B2 (en) | 2002-05-30 | 2006-03-28 | Methods and systems for tracking a torsional orientation and position of an eye |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US12/210,933 Continuation US8740385B2 (en) | 2002-05-30 | 2008-09-15 | Methods and systems for tracking a torsional orientation and position of an eye |
Publications (2)
Publication Number | Publication Date |
---|---|
US20080009840A1 US20080009840A1 (en) | 2008-01-10 |
US7431457B2 true US7431457B2 (en) | 2008-10-07 |
Family
ID=29712073
Family Applications (6)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US10/300,714 Expired - Lifetime US7044602B2 (en) | 2000-04-07 | 2002-11-19 | Methods and systems for tracking a torsional orientation and position of an eye |
US11/277,743 Expired - Lifetime US7261415B2 (en) | 2002-05-30 | 2006-03-28 | Methods and systems for tracking a torsional orientation and position of an eye |
US11/775,840 Expired - Lifetime US7431457B2 (en) | 2002-05-30 | 2007-07-10 | Methods and systems for tracking a torsional orientation and position of an eye |
US12/210,933 Active 2026-11-14 US8740385B2 (en) | 2002-05-30 | 2008-09-15 | Methods and systems for tracking a torsional orientation and position of an eye |
US14/258,854 Expired - Lifetime US9596983B2 (en) | 2002-05-30 | 2014-04-22 | Methods and systems for tracking a torsional orientation and position of an eye |
US15/430,087 Expired - Lifetime US10251783B2 (en) | 2002-05-30 | 2017-02-10 | Methods and systems for tracking a torsional orientation and position of an eye |
Family Applications Before (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US10/300,714 Expired - Lifetime US7044602B2 (en) | 2000-04-07 | 2002-11-19 | Methods and systems for tracking a torsional orientation and position of an eye |
US11/277,743 Expired - Lifetime US7261415B2 (en) | 2002-05-30 | 2006-03-28 | Methods and systems for tracking a torsional orientation and position of an eye |
Family Applications After (3)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US12/210,933 Active 2026-11-14 US8740385B2 (en) | 2002-05-30 | 2008-09-15 | Methods and systems for tracking a torsional orientation and position of an eye |
US14/258,854 Expired - Lifetime US9596983B2 (en) | 2002-05-30 | 2014-04-22 | Methods and systems for tracking a torsional orientation and position of an eye |
US15/430,087 Expired - Lifetime US10251783B2 (en) | 2002-05-30 | 2017-02-10 | Methods and systems for tracking a torsional orientation and position of an eye |
Country Status (8)
Country | Link |
---|---|
US (6) | US7044602B2 (en) |
EP (1) | EP1516156B1 (en) |
JP (1) | JP4256342B2 (en) |
CN (1) | CN100442006C (en) |
AU (1) | AU2002346438A1 (en) |
CA (1) | CA2487411C (en) |
MX (1) | MXPA04011893A (en) |
WO (1) | WO2003102498A1 (en) |
Cited By (22)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20050131398A1 (en) * | 2003-11-10 | 2005-06-16 | Visx, Inc. | Methods and devices for testing torsional alignment between a diagnostic device and a laser refractive system |
US20070225693A1 (en) * | 2006-03-10 | 2007-09-27 | Dirk Muehlhoff | Treatment and diagnostic systems for the eye |
US20070242854A1 (en) * | 2006-04-03 | 2007-10-18 | University College Cardiff Consultants Limited | Method of and apparatus for detecting degradation of visual performance |
US20090012505A1 (en) * | 2002-05-30 | 2009-01-08 | Amo Manufacturing Usa, Llc | Methods and Systems for Tracking a Torsional Orientation and Position of an Eye |
US20100202669A1 (en) * | 2007-09-24 | 2010-08-12 | University Of Notre Dame Du Lac | Iris recognition using consistency information |
DE102009030466A1 (en) * | 2009-06-23 | 2011-01-05 | Carl Zeiss Meditec Ag | Method and device for aligning location-related eye data |
US20110161160A1 (en) * | 2009-12-30 | 2011-06-30 | Clear Channel Management Services, Inc. | System and method for monitoring audience in response to signage |
US20130060241A1 (en) * | 2010-04-27 | 2013-03-07 | Daniel S. Haddad | Dynamic real time active pupil centroid compensation |
WO2014149625A1 (en) | 2013-03-15 | 2014-09-25 | Amo Development Llc | Systems and methods for providing anatomical flap centration for an ophthalmic laser treatment system |
US20140320808A1 (en) * | 2013-03-15 | 2014-10-30 | Neuro Kinetics, Inc. | Method and apparatus for system synchronization in video oculography based neuro-otologic testing and evaluation |
US9082002B2 (en) | 2011-01-13 | 2015-07-14 | Panasonic Intellectual Property Corporation Of America | Detection device and detection method |
US9373123B2 (en) | 2009-12-30 | 2016-06-21 | Iheartmedia Management Services, Inc. | Wearable advertising ratings methods and systems |
US20160346128A1 (en) * | 2008-04-01 | 2016-12-01 | Amo Development, Llc | System and method of iris-pupil contrast enhancement |
US9552660B2 (en) | 2012-08-30 | 2017-01-24 | Truevision Systems, Inc. | Imaging system and methods displaying a fused multidimensional reconstructed image |
US20180008460A1 (en) * | 2016-07-06 | 2018-01-11 | Amo Wavefront Sciences, Llc | Retinal imaging for reference during laser eye surgery |
US10117721B2 (en) | 2008-10-10 | 2018-11-06 | Truevision Systems, Inc. | Real-time surgical reference guides and methods for surgical applications |
US10299880B2 (en) | 2017-04-24 | 2019-05-28 | Truevision Systems, Inc. | Stereoscopic visualization camera and platform |
US10398598B2 (en) | 2008-04-04 | 2019-09-03 | Truevision Systems, Inc. | Apparatus and methods for performing enhanced visually directed procedures under low ambient light conditions |
US10917543B2 (en) | 2017-04-24 | 2021-02-09 | Alcon Inc. | Stereoscopic visualization camera and integrated robotics platform |
US11039901B2 (en) | 2009-02-20 | 2021-06-22 | Alcon, Inc. | Real-time surgical reference indicium apparatus and methods for intraocular lens implantation |
US11051884B2 (en) | 2008-10-10 | 2021-07-06 | Alcon, Inc. | Real-time surgical reference indicium apparatus and methods for surgical applications |
US11083537B2 (en) | 2017-04-24 | 2021-08-10 | Alcon Inc. | Stereoscopic camera with fluorescence visualization |
Families Citing this family (198)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2000052516A2 (en) | 1999-03-01 | 2000-09-08 | Boston Innovative Optics, Inc. | System and method for increasing the depth of focus of the human eye |
US7431455B2 (en) * | 2005-03-22 | 2008-10-07 | Amo Manufacturing Usa, Llc | Pupilometer for pupil center drift and pupil size measurements at differing viewing distances |
JP4086667B2 (en) * | 2003-01-15 | 2008-05-14 | 株式会社ニデック | Cornea surgery device |
US6910770B2 (en) | 2003-02-10 | 2005-06-28 | Visx, Incorporated | Eye refractor with active mirror wavefront sensor |
US7556378B1 (en) | 2003-04-10 | 2009-07-07 | Tsontcho Ianchulev | Intraoperative estimation of intraocular lens power |
US7628810B2 (en) | 2003-05-28 | 2009-12-08 | Acufocus, Inc. | Mask configured to maintain nutrient transport without producing visible diffraction patterns |
US7458683B2 (en) * | 2003-06-16 | 2008-12-02 | Amo Manufacturing Usa, Llc | Methods and devices for registering optical measurement datasets of an optical system |
US20050046794A1 (en) * | 2003-06-17 | 2005-03-03 | Silvestrini Thomas A. | Method and apparatus for aligning a mask with the visual axis of an eye |
WO2005008590A1 (en) * | 2003-07-17 | 2005-01-27 | Matsushita Electric Industrial Co.,Ltd. | Iris code generation method, individual authentication method, iris code entry device, individual authentication device, and individual certification program |
US7338164B2 (en) * | 2003-07-31 | 2008-03-04 | Visx, Incorporated | Systems and methods for eye aberration and image sensor orientation |
US7349583B2 (en) * | 2003-09-05 | 2008-03-25 | The Regents Of The University Of California | Global motion estimation image coding and processing |
US7481536B2 (en) * | 2004-02-19 | 2009-01-27 | Amo Manufacturing Usa, Llc | Methods and systems for differentiating left and right eye images |
US7331671B2 (en) * | 2004-03-29 | 2008-02-19 | Delphi Technologies, Inc. | Eye tracking method based on correlation and detected eye movement |
WO2005099639A1 (en) * | 2004-04-09 | 2005-10-27 | Steinert Roger F | Laser system for vision correction |
WO2005102200A2 (en) * | 2004-04-20 | 2005-11-03 | Wavetec Vision Systems, Inc. | Integrated surgical microscope and wavefront sensor |
EP1602321A1 (en) * | 2004-06-02 | 2005-12-07 | SensoMotoric Instruments GmbH | Method and apparatus for image-based eye tracking for retinal diagnostic or surgery device |
JP4609838B2 (en) * | 2004-08-10 | 2011-01-12 | 株式会社ニデック | Cornea surgery device |
US7809171B2 (en) * | 2005-01-10 | 2010-10-05 | Battelle Memorial Institute | Facial feature evaluation based on eye location |
US7207983B2 (en) * | 2005-04-29 | 2007-04-24 | University Of Florida Research Foundation, Inc. | System and method for real-time feedback of ablation rate during laser refractive surgery |
US7292318B2 (en) * | 2005-05-13 | 2007-11-06 | The Boeing Company | System and method for generating thrust at remote objects |
JP4591211B2 (en) * | 2005-05-31 | 2010-12-01 | 富士ゼロックス株式会社 | Image processing apparatus, image processing method, medium, code reading apparatus, and program |
US7261412B2 (en) | 2005-06-30 | 2007-08-28 | Visx, Incorporated | Presbyopia correction through negative high-order spherical aberration |
US20070027438A1 (en) * | 2005-07-26 | 2007-02-01 | Frieder Loesel | System and method for compensating a corneal dissection |
DE102005046130A1 (en) * | 2005-09-27 | 2007-03-29 | Bausch & Lomb Inc. | Excimer laser-eye surgical system, has eye tracing device sending instruction signal to laser device via bidirectional bus to fire shot, when preset position data is same as initial position data of scanning device for shot |
WO2007071401A1 (en) * | 2005-12-21 | 2007-06-28 | Novartis Ag | Adaptive optic ophthalmic design system |
US7934833B2 (en) | 2005-12-22 | 2011-05-03 | Alcon Refractivehorizons, Inc. | Image alignment system for use in laser ablation treatment of the cornea |
US8100530B2 (en) * | 2006-01-20 | 2012-01-24 | Clarity Medical Systems, Inc. | Optimizing vision correction procedures |
US9101292B2 (en) | 2006-01-20 | 2015-08-11 | Clarity Medical Systems, Inc. | Apparatus and method for operating a real time large dipoter range sequential wavefront sensor |
US11090190B2 (en) | 2013-10-15 | 2021-08-17 | Lensar, Inc. | Iris registration method and system |
US8777413B2 (en) | 2006-01-20 | 2014-07-15 | Clarity Medical Systems, Inc. | Ophthalmic wavefront sensor operating in parallel sampling and lock-in detection mode |
US8820929B2 (en) * | 2006-01-20 | 2014-09-02 | Clarity Medical Systems, Inc. | Real-time measurement/display/record/playback of wavefront data for use in vision correction procedures |
US8356900B2 (en) | 2006-01-20 | 2013-01-22 | Clarity Medical Systems, Inc. | Large diopter range real time sequential wavefront sensor |
US9107608B2 (en) | 2006-01-20 | 2015-08-18 | Clarity Medical Systems, Inc. | Apparatus and method for operating a real time large diopter range sequential wavefront sensor |
US9248047B2 (en) * | 2006-01-23 | 2016-02-02 | Ziemer Holding Ag | System for protecting tissue in the treatment of eyes |
EP1810646A1 (en) * | 2006-01-23 | 2007-07-25 | SIE AG, Surgical Instrument Engineering | Apparatus for protecting tissue during eye surgery |
US8182471B2 (en) * | 2006-03-17 | 2012-05-22 | Amo Manufacturing Usa, Llc. | Intrastromal refractive correction systems and methods |
US8226236B2 (en) * | 2006-05-18 | 2012-07-24 | University Of Rochester | Method and apparatus for imaging in an eye |
US7452077B2 (en) * | 2006-08-29 | 2008-11-18 | Carl Zeiss Meditec, Inc. | Image adjustment derived from optical imaging measurement data |
US7620147B2 (en) * | 2006-12-13 | 2009-11-17 | Oraya Therapeutics, Inc. | Orthovoltage radiotherapy |
US7535991B2 (en) | 2006-10-16 | 2009-05-19 | Oraya Therapeutics, Inc. | Portable orthovoltage radiotherapy |
ES2380673T3 (en) * | 2006-11-08 | 2012-05-17 | Schwind Eye-Tech-Solutions Gmbh & Co. Kg | Corneal ablation control system of an eye by means of a laser |
JP5028073B2 (en) * | 2006-11-29 | 2012-09-19 | 株式会社ニデック | Cornea surgery device |
CN101196389B (en) * | 2006-12-05 | 2011-01-05 | 鸿富锦精密工业(深圳)有限公司 | Image measuring system and method |
US20080140345A1 (en) * | 2006-12-07 | 2008-06-12 | International Business Machines Corporation | Statistical summarization of event data |
WO2008083015A2 (en) * | 2006-12-31 | 2008-07-10 | Novartis Ag | Method and system for determining power profile for an eye |
US9427357B2 (en) | 2007-02-21 | 2016-08-30 | Amo Development, Llc | Preformed lens systems and methods |
AU2008254747B2 (en) | 2007-05-17 | 2013-10-24 | Amo Development, Llc | Customized laser epithelial ablation systems and methods |
JP5292725B2 (en) * | 2007-05-24 | 2013-09-18 | 株式会社島津製作所 | Motion tracker device |
US8363783B2 (en) * | 2007-06-04 | 2013-01-29 | Oraya Therapeutics, Inc. | Method and device for ocular alignment and coupling of ocular structures |
US8920406B2 (en) * | 2008-01-11 | 2014-12-30 | Oraya Therapeutics, Inc. | Device and assembly for positioning and stabilizing an eye |
US8414123B2 (en) * | 2007-08-13 | 2013-04-09 | Novartis Ag | Toric lenses alignment using pre-operative images |
US20090060286A1 (en) * | 2007-09-04 | 2009-03-05 | General Electric Company | Identification system and method utilizing iris imaging |
JP5623907B2 (en) * | 2007-09-05 | 2014-11-12 | アルコン レンゼックス, インコーポレーテッド | Laser-induced protective shield in laser surgery |
WO2009033107A2 (en) * | 2007-09-06 | 2009-03-12 | Lensx Lasers, Inc. | Photodisruptive treatment of crystalline lens |
US9456925B2 (en) * | 2007-09-06 | 2016-10-04 | Alcon Lensx, Inc. | Photodisruptive laser treatment of the crystalline lens |
DE112008002448B4 (en) * | 2007-09-10 | 2013-03-21 | Alcon Lensx, Inc. | Effective laser photodisruptive surgery in a gravitational field |
JP2010538770A (en) * | 2007-09-18 | 2010-12-16 | アルコン レンゼックス, インコーポレーテッド | Method and apparatus for integrated cataract surgery |
US20090137991A1 (en) * | 2007-09-18 | 2009-05-28 | Kurtz Ronald M | Methods and Apparatus for Laser Treatment of the Crystalline Lens |
US7933380B2 (en) * | 2007-09-28 | 2011-04-26 | Varian Medical Systems International Ag | Radiation systems and methods using deformable image registration |
US10398599B2 (en) * | 2007-10-05 | 2019-09-03 | Topcon Medical Laser Systems Inc. | Semi-automated ophthalmic photocoagulation method and apparatus |
US7594729B2 (en) | 2007-10-31 | 2009-09-29 | Wf Systems, Llc | Wavefront sensor |
ES2390315T3 (en) * | 2007-11-02 | 2012-11-08 | Alcon Lensx, Inc. | Apparatus for improved postoperative ocular optical performance |
JP5072553B2 (en) * | 2007-11-29 | 2012-11-14 | 浜松ホトニクス株式会社 | Eye movement measurement device |
DE102007055922A1 (en) * | 2007-12-21 | 2009-06-25 | Carl Zeiss Surgical Gmbh | Method for determining properties and / or the position of characteristic ocular components |
US8662667B2 (en) | 2007-12-21 | 2014-03-04 | Carl Zeiss Meditec Ag | Ophthalmologic visualization system |
DE102007055924B4 (en) * | 2007-12-21 | 2009-11-26 | Carl Zeiss Surgical Gmbh | Method for determining characteristic properties and / or the position of characteristic ocular components |
DE102007055919B4 (en) * | 2007-12-21 | 2023-08-10 | Carl Zeiss Meditec Ag | Eye viewing system and method therefor |
US7801271B2 (en) | 2007-12-23 | 2010-09-21 | Oraya Therapeutics, Inc. | Methods and devices for orthovoltage ocular radiotherapy and treatment planning |
CN101951990A (en) | 2007-12-23 | 2011-01-19 | Oraya治疗公司 | Methods and devices for detecting, controlling, and predicting radiation delivery |
ES2537205T3 (en) * | 2008-01-09 | 2015-06-03 | Alcon Lensx, Inc. | Photodisruptor laser tissue fragmentation |
US8073288B2 (en) * | 2008-01-16 | 2011-12-06 | International Business Machines Corporation | Rendering a mask using coarse mask representation |
CA2731810C (en) | 2008-04-01 | 2017-07-04 | Amo Development, Llc | Ophthalmic laser system with high resolution imaging and kit therefor |
WO2009124306A1 (en) * | 2008-04-04 | 2009-10-08 | Amo Wavefront Sciences, Llc | Registering multiple ophthalmic datasets |
JP5264257B2 (en) * | 2008-04-09 | 2013-08-14 | キヤノン株式会社 | Wavefront measuring method and wavefront measuring apparatus using the same |
KR100955686B1 (en) * | 2008-04-14 | 2010-05-03 | 국립암센터 | Method of tracing an eyeball in an eyeball tumor treatment |
WO2009129222A2 (en) * | 2008-04-14 | 2009-10-22 | The Johns Hopking University | Systems and methods for testing vestibular and oculomotor function |
US8622951B2 (en) * | 2008-06-09 | 2014-01-07 | Abbott Medical Optics Inc. | Controlling a phacoemulsification system based on real-time analysis of image data |
EP2337523B1 (en) * | 2008-06-27 | 2017-08-16 | AMO Development, LLC | System for modifying a refractive profile using a corneal tissue inlay |
WO2010011785A1 (en) * | 2008-07-23 | 2010-01-28 | Indiana University Research & Technology Corporation | System and method for a non-cooperative iris image acquisition system |
DE102008034490B4 (en) * | 2008-07-24 | 2018-12-20 | Carl Zeiss Meditec Ag | Eye surgery system and method for preparing and performing eye surgery |
EP2184005B1 (en) | 2008-10-22 | 2011-05-18 | SensoMotoric Instruments Gesellschaft für innovative Sensorik mbH | Method and apparatus for image processing for computer-aided eye surgery |
WO2010054268A2 (en) | 2008-11-06 | 2010-05-14 | Wavetec Vision Systems, Inc. | Optical angular measurement system for ophthalmic applications and method for positioning of a toric intraocular lens with increased accuracy |
TWI392858B (en) * | 2008-12-12 | 2013-04-11 | Inst Information Industry | Pupil position acquisition system, method and computer program products |
US8663208B2 (en) | 2009-02-09 | 2014-03-04 | Amo Development, Llc | System and method for intrastromal refractive correction |
DE102009010467A1 (en) | 2009-02-26 | 2010-09-09 | Carl Zeiss Vision Gmbh | Method and device for determining the position of the ocular pivot point |
US8577095B2 (en) * | 2009-03-19 | 2013-11-05 | Indiana University Research & Technology Corp. | System and method for non-cooperative iris recognition |
US8900221B2 (en) * | 2009-04-01 | 2014-12-02 | Wavelight Gmbh | Apparatus for treating an eye with laser radiation |
DE102009030464B4 (en) * | 2009-06-23 | 2022-04-14 | Carl Zeiss Meditec Ag | Laser device and method, in particular operating method for a laser device, for creating irradiation control data for a pulsed laser |
DE102009030504B4 (en) * | 2009-06-24 | 2024-06-06 | Carl Zeiss Meditec Ag | Eye surgery microscopy system |
US8308298B2 (en) * | 2009-06-24 | 2012-11-13 | Carl Zeiss Meditec Ag | Microscopy system for eye surgery |
US8876290B2 (en) | 2009-07-06 | 2014-11-04 | Wavetec Vision Systems, Inc. | Objective quality metric for ocular wavefront measurements |
JP5837489B2 (en) | 2009-07-14 | 2015-12-24 | ウェーブテック・ビジョン・システムズ・インコーポレイテッドWavetec Vision Systems, Inc. | Ophthalmic equipment |
ES2653970T3 (en) | 2009-07-14 | 2018-02-09 | Wavetec Vision Systems, Inc. | Determination of the effective position of the lens of an intraocular lens using aphakic refractive power |
KR101304014B1 (en) | 2009-08-13 | 2013-09-04 | 아큐포커스, 인크. | Corneal inlay with nutrient transport structures |
US10004593B2 (en) | 2009-08-13 | 2018-06-26 | Acufocus, Inc. | Intraocular lens with elastic mask |
CA2770735C (en) | 2009-08-13 | 2017-07-18 | Acufocus, Inc. | Masked intraocular implants and lenses |
US20110224657A1 (en) * | 2009-09-18 | 2011-09-15 | Amo Development, Llc | Registration of Corneal Flap With Ophthalmic Measurement and/or Treatment Data for Lasik and Other Procedures |
US8784443B2 (en) | 2009-10-20 | 2014-07-22 | Truevision Systems, Inc. | Real-time surgical reference indicium apparatus and methods for astigmatism correction |
USD656526S1 (en) | 2009-11-10 | 2012-03-27 | Acufocus, Inc. | Ocular mask |
US9504376B2 (en) | 2009-12-22 | 2016-11-29 | Amo Wavefront Sciences, Llc | Optical diagnosis using measurement sequence |
EP3138475B1 (en) * | 2010-01-22 | 2023-10-25 | AMO Development, LLC | Apparatus for automated placement of scanned laser capsulorhexis incisions |
CN102917635B (en) * | 2010-02-15 | 2015-05-20 | 视乐有限公司 | Method for determining deviations between coordinate systems of various technical systems |
US20110213342A1 (en) * | 2010-02-26 | 2011-09-01 | Ashok Burton Tripathi | Real-time Virtual Indicium Apparatus and Methods for Guiding an Implant into an Eye |
DE102010012616A1 (en) | 2010-03-20 | 2011-09-22 | Carl Zeiss Meditec Ag | Ophthalmic laser treatment device and method of operation for such |
JP5297415B2 (en) * | 2010-04-30 | 2013-09-25 | キヤノン株式会社 | Ophthalmic device and ophthalmic method |
JP5743425B2 (en) * | 2010-04-30 | 2015-07-01 | キヤノン株式会社 | Ophthalmic apparatus and method for controlling ophthalmic apparatus |
NL2006804A (en) | 2010-06-24 | 2011-12-28 | Asml Netherlands Bv | Measurement system, method and lithographic apparatus. |
DE102010032193A1 (en) * | 2010-07-24 | 2012-01-26 | Chronos Vision Gmbh | Method and device for determining eye torsion |
WO2012032630A1 (en) | 2010-09-09 | 2012-03-15 | トヨタ自動車株式会社 | Gear |
US9532708B2 (en) * | 2010-09-17 | 2017-01-03 | Alcon Lensx, Inc. | Electronically controlled fixation light for ophthalmic imaging systems |
WO2012040196A1 (en) * | 2010-09-20 | 2012-03-29 | Amo Development Llc | System and methods for mitigating changes in pupil size during laser refractive surgery to maintain ablation centration |
US10028862B2 (en) | 2012-12-06 | 2018-07-24 | Amo Development, Llc | Compensation systems and methods for flap induced aberrations |
EP2457497B1 (en) | 2010-11-26 | 2023-08-16 | Alcon Inc. | Apparatus for multi-level eye registration |
US8831416B2 (en) * | 2010-12-22 | 2014-09-09 | Michael Braithwaite | System and method for illuminating and identifying a person |
US8254768B2 (en) * | 2010-12-22 | 2012-08-28 | Michael Braithwaite | System and method for illuminating and imaging the iris of a person |
FI20115057L (en) | 2011-01-21 | 2012-07-22 | Wallac Oy | Method and device for cutting out one or more sample areas from a sample carrier |
EP2499963A1 (en) * | 2011-03-18 | 2012-09-19 | SensoMotoric Instruments Gesellschaft für innovative Sensorik mbH | Method and apparatus for gaze point mapping |
WO2012135073A2 (en) | 2011-03-25 | 2012-10-04 | Board Of Trustees Of Michigan State University | Adaptive laser system for ophthalmic use |
WO2012178054A1 (en) * | 2011-06-23 | 2012-12-27 | Amo Development, Llc | Ophthalmic range finding |
US10779989B2 (en) | 2011-07-04 | 2020-09-22 | Alcon Inc. | Device and method for a laser-assisted eye-surgery treatment system |
DE102011082901A1 (en) * | 2011-09-16 | 2013-03-21 | Carl Zeiss Meditec Ag | Determining the azimuthal orientation of a patient's eye |
ES2911679T3 (en) * | 2011-10-22 | 2022-05-20 | Alcon Inc | Apparatus for monitoring one or more surgical parameters of the eye |
JP6046160B2 (en) | 2011-12-02 | 2016-12-14 | アキュフォーカス・インコーポレーテッド | Ophthalmic mask with selective spectral transmission |
MX2014010282A (en) * | 2012-02-28 | 2015-03-03 | Digitalvision Llc | A vision testing system. |
TWI471808B (en) * | 2012-07-20 | 2015-02-01 | Pixart Imaging Inc | Pupil detection device |
US9854159B2 (en) * | 2012-07-20 | 2017-12-26 | Pixart Imaging Inc. | Image system with eye protection |
US9498117B2 (en) | 2012-07-20 | 2016-11-22 | Amo Development, Llc | Systems and methods for treatment deconvolution using dual scale kernels |
US9072462B2 (en) | 2012-09-27 | 2015-07-07 | Wavetec Vision Systems, Inc. | Geometric optical power measurement device |
WO2014055690A1 (en) | 2012-10-02 | 2014-04-10 | Amo Development, Llc. | Systems and methods for treatment target deconvolution |
TW201416908A (en) * | 2012-10-23 | 2014-05-01 | Pixart Imaging Inc | Pupil tracking device |
US10702209B2 (en) * | 2012-10-24 | 2020-07-07 | Amo Development, Llc | Graphical user interface for laser eye surgery system |
US10314746B2 (en) * | 2012-11-02 | 2019-06-11 | Optimedica Corporation | Laser eye surgery system calibration |
EP2749204B1 (en) | 2012-12-28 | 2016-03-16 | Canon Kabushiki Kaisha | Image processing apparatus and image processing method |
US9204962B2 (en) | 2013-03-13 | 2015-12-08 | Acufocus, Inc. | In situ adjustable optical mask |
US10092393B2 (en) | 2013-03-14 | 2018-10-09 | Allotex, Inc. | Corneal implant systems and methods |
US9427922B2 (en) | 2013-03-14 | 2016-08-30 | Acufocus, Inc. | Process for manufacturing an intraocular lens with an embedded mask |
US9265419B2 (en) * | 2013-03-15 | 2016-02-23 | Abbott Medical Optics Inc. | Systems and methods for measuring position and boundary of lens capsule and implanted intraocular lens in eye imaging |
US20140327755A1 (en) * | 2013-05-06 | 2014-11-06 | Delta ID Inc. | Apparatus and method for positioning an iris for iris image capture |
FR3006776B1 (en) * | 2013-06-07 | 2016-10-21 | Essilor Int | METHOD FOR DETERMINING AT LEAST ONE VALUE OF A CUSTOMIZATION PARAMETER OF A VISUAL COMPENSATION EQUIPMENT |
US10117576B2 (en) * | 2013-07-19 | 2018-11-06 | The General Hospital Corporation | System, method and computer accessible medium for determining eye motion by imaging retina and providing feedback for acquisition of signals from the retina |
WO2015042120A1 (en) | 2013-09-18 | 2015-03-26 | Richard Awdeh | Surgical navigation system and method |
US9936866B2 (en) | 2013-09-24 | 2018-04-10 | Novartis Ag | Adjusting laser treatment in response to changes in the eye |
DE102014201746A1 (en) | 2014-01-31 | 2015-08-06 | Carl Zeiss Ag | Method and device for measuring the position of an eye |
US10572008B2 (en) | 2014-02-21 | 2020-02-25 | Tobii Ab | Apparatus and method for robust eye/gaze tracking |
GB2523356A (en) * | 2014-02-21 | 2015-08-26 | Tobii Technology Ab | Apparatus and method for robust eye/gaze tracking |
DE102014102425B4 (en) * | 2014-02-25 | 2018-06-28 | Carl Zeiss Meditec Ag | Microscope system and microscopy method using digital markers |
EP3117258B1 (en) | 2014-03-13 | 2019-01-02 | Richard Awdeh | A microscope insert |
WO2015166550A1 (en) * | 2014-04-30 | 2015-11-05 | 株式会社クリュートメディカルシステムズ | Ophthalmologic observation system |
US10420608B2 (en) * | 2014-05-20 | 2019-09-24 | Verily Life Sciences Llc | System for laser ablation surgery |
CN104184991A (en) * | 2014-06-27 | 2014-12-03 | 海信集团有限公司 | Monitoring system |
WO2016081493A1 (en) | 2014-11-19 | 2016-05-26 | Acufocus, Inc. | Fracturable mask for treating presbyopia |
CA2963020C (en) * | 2014-11-20 | 2019-09-10 | Novartis Ag | An apparatus for laser processing an eye |
US10089525B1 (en) * | 2014-12-31 | 2018-10-02 | Morphotrust Usa, Llc | Differentiating left and right eye images |
WO2016182514A1 (en) * | 2015-05-12 | 2016-11-17 | Agency For Science, Technology And Research | A system and method for displaying a video image |
US10286208B2 (en) * | 2015-05-20 | 2019-05-14 | Cardiac Pacemakers, Inc. | Fully integrated lead stabilizer for medical electrical leads and methods of attachment |
US10449090B2 (en) | 2015-07-31 | 2019-10-22 | Allotex, Inc. | Corneal implant systems and methods |
US10016130B2 (en) * | 2015-09-04 | 2018-07-10 | University Of Massachusetts | Eye tracker system and methods for detecting eye parameters |
ES2972581T3 (en) | 2015-10-05 | 2024-06-13 | Acufocus Inc | Intraocular lens molding methods |
US10445606B2 (en) * | 2015-10-08 | 2019-10-15 | Microsoft Technology Licensing, Llc | Iris recognition |
US11464625B2 (en) | 2015-11-24 | 2022-10-11 | Acufocus, Inc. | Toric small aperture intraocular lens with extended depth of focus |
PL3439593T3 (en) * | 2016-04-06 | 2021-12-27 | Keranova | Optical focussing system for a human or animal tissue cutting device |
AU2016405856B2 (en) * | 2016-05-02 | 2021-10-07 | Alcon Inc. | Overlay imaging for registration of a patient eye for laser surgery |
EP3243429A1 (en) * | 2016-05-13 | 2017-11-15 | Erasmus University Medical Center Rotterdam | Method for measuring a subject's eye movement and scleral contact lens |
CA3033355A1 (en) | 2016-08-10 | 2018-02-15 | Amo Development, Llc | Epithelial ablation systems and methods |
US11633143B2 (en) * | 2016-08-18 | 2023-04-25 | Northwestern University | Systems and methods for assessment of ocular cyclotorsion |
WO2018049230A1 (en) | 2016-09-08 | 2018-03-15 | Amo Development, Llc | Systems and methods for obtaining iris registration and pupil centration for laser surgery |
SE541262C2 (en) * | 2016-11-15 | 2019-05-21 | Heads Stockholm Ab | Method and device for eye metric acquisition |
CN110291369A (en) * | 2016-12-13 | 2019-09-27 | Magic Leap, Inc. | Augmented and virtual reality glasses, systems and methods for transmitting polarized light and determining glucose levels
CN107133619A (en) * | 2017-05-31 | 2017-09-05 | 执鼎医疗科技(杭州)有限公司 | Eyeball position adaptive positioning method and device
WO2019012881A1 (en) * | 2017-07-12 | 2019-01-17 | Sony Corporation | Image processing device, ophthalmic observation apparatus, and ophthalmic observation system
US11311188B2 (en) * | 2017-07-13 | 2022-04-26 | Micro Medical Devices, Inc. | Visual and mental testing using virtual reality hardware |
EP3453317B1 (en) * | 2017-09-08 | 2021-07-14 | Tobii AB | Pupil radius compensation |
US10867252B2 (en) | 2017-09-08 | 2020-12-15 | Tobii Ab | Continuous calibration based on pupil characteristics |
TWI630507B (en) * | 2017-09-25 | 2018-07-21 | Compal Electronics, Inc. | Gaze detection, identification and control method
EP3691517B1 (en) * | 2017-10-06 | 2021-10-27 | Alcon Inc. | Tracking movement of an eye within a tracking range |
CA3073009A1 (en) * | 2017-10-17 | 2019-04-25 | Alcon Inc. | Customized ophthalmic surgical profiles |
ES2715524A1 (en) * | 2017-12-04 | 2019-06-04 | Ruiz Pedro Grimaldos | Excimer laser specific for therapeutic and cosmetic iridoplasties (Machine-translation by Google Translate, not legally binding)
US10586311B2 (en) * | 2018-03-14 | 2020-03-10 | Adobe Inc. | Patch validity test |
US10706509B2 (en) | 2018-03-14 | 2020-07-07 | Adobe Inc. | Interactive system for automatically synthesizing a content-aware fill |
WO2019217471A1 (en) | 2018-05-09 | 2019-11-14 | Acufocus, Inc. | Intraocular implant with removable optic |
US11033185B2 (en) | 2018-06-08 | 2021-06-15 | Alcon Inc. | System and method for automatic torsion correction in diagnostic ophthalmic measurements |
EP3826528A4 (en) | 2018-07-25 | 2022-07-27 | Natus Medical Incorporated | Real-time removal of IR LED reflections from an image
CA3101096A1 (en) | 2018-11-02 | 2020-05-07 | Amo Development, Llc | Iris registration method for ophthalmic laser surgical procedures |
SE1851597A1 (en) * | 2018-12-17 | 2020-06-02 | Tobii Ab | Gaze tracking via tracing of light paths |
CN110200584B (en) * | 2019-07-03 | 2022-04-29 | 南京博视医疗科技有限公司 | Target tracking control system and method based on fundus imaging technology |
US10989528B2 (en) * | 2019-08-27 | 2021-04-27 | Raytheon Company | High speed beam component-resolved profile and position sensitive detector |
JP7120548B2 (en) * | 2019-09-19 | 2022-08-17 | Nippon Telegraph and Telephone Corporation | Sensing direction estimation device, sensing direction estimation method, and program
WO2021059103A1 (en) * | 2019-09-27 | 2021-04-01 | Alcon Inc. | Instant eye gaze calibration systems and methods |
CN112043235B (en) * | 2020-07-13 | 2024-02-06 | Tianjin Eye Hospital | Portable eyeball static rotation measuring instrument and method for measuring eyeball rotation angle by utilizing same
DE102020123728A1 (en) | 2020-09-11 | 2022-03-17 | Carl Zeiss Meditec Ag | Method for determining a parameter of an eye and device for identifying an angular position of an eye |
WO2022091429A1 (en) * | 2020-10-27 | 2022-05-05 | Kabushiki Kaisha Topcon | Ophthalmologic observation device
CN112561787B (en) * | 2020-12-22 | 2024-03-22 | Vivo Mobile Communication Co., Ltd. | Image processing method, device, electronic equipment and storage medium
US12118825B2 (en) * | 2021-05-03 | 2024-10-15 | NeuraLight Ltd. | Obtaining high-resolution oculometric parameters |
US20230103129A1 (en) * | 2021-09-27 | 2023-03-30 | ResMed Pty Ltd | Machine learning to determine facial measurements via captured images |
US11375891B1 (en) * | 2021-12-17 | 2022-07-05 | The Trustees Of Indiana University | Methods, systems, and devices for vision testing |
EP4197428A1 (en) * | 2021-12-20 | 2023-06-21 | Ziemer Ophthalmic Systems AG | Ophthalmological treatment device for determining a rotation angle of an eye
EP4226845A1 (en) * | 2022-02-11 | 2023-08-16 | Ziemer Ophthalmic Systems AG | Eye image quality analysis |
DE102022203850A1 (en) * | 2022-04-20 | 2023-10-26 | Robert Bosch Gesellschaft mit beschränkter Haftung | Device and method for determining a pupil position |
Citations (92)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4365874A (en) | 1980-09-11 | 1982-12-28 | Milburn Wanda O | Oculotorsionometer |
US4540254A (en) | 1982-10-26 | 1985-09-10 | Humphrey Instruments, Inc. | Keratometer having peripheral light entrance and exit paths |
US4641349A (en) | 1985-02-20 | 1987-02-03 | Leonard Flom | Iris recognition system |
US4665913A (en) | 1983-11-17 | 1987-05-19 | Lri L.P. | Method for ophthalmological surgery |
US4761071A (en) | 1984-11-06 | 1988-08-02 | Baron William S | Apparatus and method for determining corneal and scleral topography |
US4815839A (en) | 1987-08-03 | 1989-03-28 | Waldorf Ronald A | Infrared/video electronystagmographic apparatus |
US4848340A (en) | 1988-02-10 | 1989-07-18 | Intelligent Surgical Lasers | Eyetracker and method of use |
US4995716A (en) | 1989-03-09 | 1991-02-26 | Par Technology Corporation | Method and apparatus for obtaining the topography of an object |
US5036347A (en) | 1988-08-31 | 1991-07-30 | Canon Kabushiki Kaisha | Visual line detecting device and camera having the same |
US5062702A (en) | 1990-03-16 | 1991-11-05 | Intelligent Surgical Lasers, Inc. | Device for mapping corneal topography |
US5070883A (en) | 1988-12-16 | 1991-12-10 | Konan Camera Research Institute Inc. | Eye movement analyzing device utilizing pupil center-of-gravity data |
US5098426A (en) | 1989-02-06 | 1992-03-24 | Phoenix Laser Systems, Inc. | Method and apparatus for precision laser surgery |
US5159361A (en) | 1989-03-09 | 1992-10-27 | Par Technology Corporation | Method and apparatus for obtaining the topography of an object |
US5196873A (en) | 1990-05-08 | 1993-03-23 | Nihon Kohden Corporation | Eye movement analysis system |
US5214455A (en) | 1991-04-01 | 1993-05-25 | General Electric Company | Objective eye alignment measurement method and system |
US5231674A (en) | 1989-06-09 | 1993-07-27 | Lc Technologies, Inc. | Eye tracking method and apparatus |
US5291560A (en) * | 1991-07-15 | 1994-03-01 | Iri Scan Incorporated | Biometric personal identification system based on iris analysis |
US5293871A (en) | 1993-05-05 | 1994-03-15 | Cornell Research Foundation Inc. | System for ultrasonically determining corneal layer thicknesses and shape |
US5341180A (en) | 1991-06-29 | 1994-08-23 | Nidek Co., Ltd. | Ophthalmic photographing apparatus |
US5347331A (en) | 1992-06-30 | 1994-09-13 | Nidek Co., Ltd. | Ophthalmic apparatus for photographing the anterior part of the eye with a reproducible photographing position |
US5398684A (en) | 1988-12-23 | 1995-03-21 | Hardy; Tyrone L. | Method and apparatus for video presentation from scanner imaging sources |
US5406342A (en) | 1992-01-15 | 1995-04-11 | Euclid Medical Instruments | System for determining the topography of a curved surface |
US5491524A (en) | 1994-10-05 | 1996-02-13 | Carl Zeiss, Inc. | Optical coherence tomography corneal mapping apparatus |
US5512965A (en) | 1993-06-24 | 1996-04-30 | Orbtek, Inc. | Ophthalmic instrument and method of making ophthalmic determinations using Scheimpflug corrections |
US5550937A (en) | 1992-11-23 | 1996-08-27 | Harris Corporation | Mechanism for registering digital images obtained from multiple sensors having diverse image collection geometries |
US5549597A (en) | 1993-05-07 | 1996-08-27 | Visx Incorporated | In situ astigmatism axis alignment |
US5572596A (en) | 1994-09-02 | 1996-11-05 | David Sarnoff Research Center, Inc. | Automated, non-invasive iris recognition system and method |
US5581637A (en) | 1994-12-09 | 1996-12-03 | Xerox Corporation | System for registering component image tiles in a camera-based scanner device transcribing scene images |
US5614967A (en) * | 1995-03-30 | 1997-03-25 | Nihon Kohden Corporation | Eye movement analysis system |
US5620436A (en) | 1994-09-22 | 1997-04-15 | Chiron Technolas Gmbh Ophthalmologische Systeme | Method and apparatus for providing precise location of points on the eye |
US5632742A (en) | 1994-04-25 | 1997-05-27 | Autonomous Technologies Corp. | Eye movement sensing method and system |
US5640221A (en) | 1995-05-15 | 1997-06-17 | Nihon Kohden Corporation | Binocular optical image-pickup equipment and binocular image-pickup system |
US5646791A (en) | 1995-01-04 | 1997-07-08 | Visx Incorporated | Method and apparatus for temporal and spatial beam integration |
US5649032A (en) | 1994-11-14 | 1997-07-15 | David Sarnoff Research Center, Inc. | System for automatically aligning images to form a mosaic image |
US5683379A (en) | 1992-10-01 | 1997-11-04 | Chiron Technolas Gmbh Ophthalmologische Systeme | Apparatus for modifying the surface of the eye through large beam laser polishing and method of controlling the apparatus |
US5713892A (en) | 1991-08-16 | 1998-02-03 | Visx, Inc. | Method and apparatus for combined cylindrical and spherical eye corrections |
US5740803A (en) | 1997-03-07 | 1998-04-21 | Autonomous Technologies Corporation | Locating the center of the entrance pupil of an eye after pupil dilation |
US5757462A (en) | 1996-05-31 | 1998-05-26 | Nidek Company, Ltd. | Ophthalmic apparatus for photographing a section of an anterior part of an eye |
US5774591A (en) | 1995-12-15 | 1998-06-30 | Xerox Corporation | Apparatus and method for recognizing facial expressions and facial gestures in a sequence of images |
US5777719A (en) | 1996-12-23 | 1998-07-07 | University Of Rochester | Method and apparatus for improving vision and the resolution of retinal images |
US5790235A (en) | 1997-03-26 | 1998-08-04 | Carl Zeiss, Inc. | Method and apparatus to measure pupil size and position |
US5843070A (en) | 1996-05-13 | 1998-12-01 | Partech, Inc. | Simulating corneal laser surgery |
US5850486A (en) | 1996-04-29 | 1998-12-15 | The Mclean Hospital Corporation | Registration of image data |
US5859686A (en) | 1997-05-19 | 1999-01-12 | Northrop Grumman Corporation | Eye finding and tracking system |
US5865832A (en) | 1992-02-27 | 1999-02-02 | Visx, Incorporated | System for detecting, measuring and compensating for lateral movements of a target |
US5891132A (en) | 1996-05-30 | 1999-04-06 | Chiron Technolas Gmbh Ophthalmologische Systeme | Distributed excimer laser surgery system
US5926251A (en) | 1997-08-12 | 1999-07-20 | Mitsubishi Denki Kabushiki Kaisha | Eye image tracking apparatus |
US5951475A (en) | 1997-09-25 | 1999-09-14 | International Business Machines Corporation | Methods and apparatus for registering CT-scan data to multiple fluoroscopic images |
US5963300A (en) | 1998-02-17 | 1999-10-05 | Amt Technologies, Corp. | Ocular biometer |
US5974165A (en) | 1993-11-30 | 1999-10-26 | Arch Development Corporation | Automated method and system for the alignment and correlation of images from two different modalities |
US5980513A (en) | 1994-04-25 | 1999-11-09 | Autonomous Technologies Corp. | Laser beam delivery and eye tracking system |
US5982555A (en) | 1998-01-20 | 1999-11-09 | University Of Washington | Virtual retinal display with eye tracking |
US6004313A (en) | 1998-06-26 | 1999-12-21 | Visx, Inc. | Patient fixation system and method for laser eye surgery |
US6099522A (en) | 1989-02-06 | 2000-08-08 | Visx Inc. | Automated laser workstation for high precision surgical and industrial interventions |
US6104828A (en) | 1994-03-24 | 2000-08-15 | Kabushiki Kaisha Topcon | Ophthalmologic image processor |
US6116738A (en) | 1997-01-06 | 2000-09-12 | Vismed, Inc. | Corneal topographer with central and peripheral measurement capability |
US6129722A (en) | 1999-03-10 | 2000-10-10 | Ruiz; Luis Antonio | Interactive corrective eye surgery system with topography and laser system interface |
US6159202A (en) | 1995-09-29 | 2000-12-12 | Nidek Co., Ltd. | Corneal surgery apparatus
US6203539B1 (en) | 1993-05-07 | 2001-03-20 | Visx, Incorporated | Method and system for laser treatment of refractive errors using offset imaging |
US6217596B1 (en) | 1999-09-01 | 2001-04-17 | Samir G. Farah | Corneal surface and pupillary cardinal axes marker |
US6234631B1 (en) | 2000-03-09 | 2001-05-22 | Lasersight Technologies, Inc. | Combination advanced corneal topography/wave front aberration measurement |
US6245059B1 (en) | 1999-04-07 | 2001-06-12 | Visx, Incorporated | Offset ablation profiles for treatment of irregular astigmatism
US6257722B1 (en) | 1999-05-31 | 2001-07-10 | Nidek Co., Ltd. | Ophthalmic apparatus |
US6266453B1 (en) | 1999-07-26 | 2001-07-24 | Computerized Medical Systems, Inc. | Automated image fusion/alignment system and method |
US6267756B1 (en) | 1986-03-08 | 2001-07-31 | G. Rodenstock Instrumente Gmbh | Apparatus for the observation and the treatment of the eye using a laser |
US6271915B1 (en) | 1996-11-25 | 2001-08-07 | Autonomous Technologies Corporation | Objective measurement and correction of optical systems using wavefront analysis |
US6280436B1 (en) | 1999-08-10 | 2001-08-28 | Memphis Eye & Cataract Associates Ambulatory Surgery Center | Eye tracking and positioning system for a refractive laser system |
US6285780B1 (en) | 1997-03-28 | 2001-09-04 | Oki Electric Industry Co., Ltd. | Apparatus for identifying individual animals and image processing method |
US6296358B1 (en) | 2000-07-14 | 2001-10-02 | Visual Pathways, Inc. | Ocular fundus auto imager |
US6305802B1 (en) | 1999-08-11 | 2001-10-23 | Johnson & Johnson Vision Products, Inc. | System and method of integrating corneal topographic data and ocular wavefront data with primary ametropia measurements to create a soft contact lens design |
US6314197B1 (en) | 1997-08-22 | 2001-11-06 | International Business Machines Corporation | Determining an alignment estimation between two (fingerprint) images |
US20020013573A1 (en) | 1995-10-27 | 2002-01-31 | William B. Telfair | Apparatus and method for tracking and compensating for eye movements |
US6347549B1 (en) | 1998-07-09 | 2002-02-19 | Ryan International Corporation | Enhancement of storm location from a single moving platform |
US6351573B1 (en) | 1994-01-28 | 2002-02-26 | Schneider Medical Technologies, Inc. | Imaging device and method |
US20020047992A1 (en) | 2000-01-27 | 2002-04-25 | Zyoptics, Inc. | Method and apparatus for measuring optical aberrations of the human eye |
US6396069B1 (en) | 1999-06-25 | 2002-05-28 | Macpherson David C. | Topographer for real time ablation feedback having synthetic wavelength generators |
US6394999B1 (en) | 2000-03-13 | 2002-05-28 | Memphis Eye & Cataract Associates Ambulatory Surgery Center | Laser eye surgery system using wavefront sensor analysis to control digital micromirror device (DMD) mirror patterns |
US6496594B1 (en) | 1998-10-22 | 2002-12-17 | Francine J. Prokoski | Method and apparatus for aligning and comparing images of the face and body from different imagers |
US6612698B2 (en) * | 2001-01-25 | 2003-09-02 | Explorer Inc. | Pupil measurement apparatus, refraction correction apparatus, and pupil measurement method |
US6634752B2 (en) | 2002-03-11 | 2003-10-21 | Alcon, Inc. | Dual-path optical system for measurement of ocular aberrations and corneal topometry and associated methods |
US6634750B2 (en) | 2001-03-15 | 2003-10-21 | Wavefront Sciences, Inc. | Tomographic wavefront analysis system and method of mapping an optical system
US20030223037A1 (en) | 2002-05-30 | 2003-12-04 | Visx, Incorporated | Methods and systems for tracking a torsional orientation and position of an eye |
US6673062B2 (en) | 2000-03-14 | 2004-01-06 | Visx, Inc. | Generating scanning spot locations for laser eye surgery |
US20040012760A1 (en) | 2000-10-18 | 2004-01-22 | Toshifumi Mihashi | Eye characteristics measuring device |
US20040019346A1 (en) | 2002-06-13 | 2004-01-29 | Visx, Incorporated | Corneal topography-based target warping |
US6702806B2 (en) | 2000-04-19 | 2004-03-09 | Alcon, Inc. | Eye registration and astigmatism alignment control systems and method |
US20040070730A1 (en) | 2001-02-09 | 2004-04-15 | Toshifumi Mihashi | Eye characteristic measuring device |
US6728424B1 (en) | 2000-09-15 | 2004-04-27 | Koninklijke Philips Electronics, N.V. | Imaging registration system and method using likelihood maximization |
US20040116910A1 (en) | 2002-12-12 | 2004-06-17 | Bausch & Lomb Incorporated | System and method for evaluating a secondary LASIK treatment |
US6929638B2 (en) | 2000-04-19 | 2005-08-16 | Alcon Refractivehorizons, Inc. | Eye registration and astigmatism alignment control systems and method |
US7146983B1 (en) | 1999-10-21 | 2006-12-12 | Kristian Hohla | Iris recognition and tracking for optical treatment |
USRE39882E1 (en) | 1997-11-11 | 2007-10-16 | Kabushiki Kaisha Topcon | Ophthalmologic characteristic measuring apparatus |
Family Cites Families (28)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4932968A (en) * | 1987-07-07 | 1990-06-12 | Caldwell Delmar R | Intraocular prostheses |
WO1992001417A1 (en) | 1990-07-19 | 1992-02-06 | Horwitz Larry S | Vision measurement and correction |
AU3781193A (en) | 1992-02-27 | 1993-09-13 | Phoenix Laser Systems, Inc. | Automated laser workstation for high precision surgical and industrial interventions |
US6090100A (en) | 1992-10-01 | 2000-07-18 | Chiron Technolas Gmbh Ophthalmologische Systeme | Excimer laser system for correction of vision with reduced thermal effects |
IL108672A (en) | 1993-02-19 | 1997-07-13 | Phoenix Laser Systems | System for detecting, measuring and compensating for lateral movements of a target |
CA2187373C (en) | 1994-04-08 | 2001-06-05 | Kristian Hohla | Method and apparatus for providing precise location of points on the eye |
FR2719690B1 (en) * | 1994-05-04 | 1996-07-19 | Lille Ii Universite | Device and method for simulating an examination or a surgical operation performed on a simulated organ. |
JP3655022B2 (en) | 1995-09-29 | 2005-06-02 | Nidek Co., Ltd. | Ophthalmic surgery equipment
US5782822A (en) | 1995-10-27 | 1998-07-21 | Ir Vision, Inc. | Method and apparatus for removing corneal tissue with infrared laser radiation |
EP1032809B1 (en) | 1997-11-21 | 2007-01-10 | Alcon Inc. | Objective measurement and correction of optical systems using wavefront analysis |
US7303281B2 (en) | 1998-10-07 | 2007-12-04 | Tracey Technologies, Llc | Method and device for determining refractive components and visual function of the eye for vision correction |
AUPP697398A0 (en) | 1998-11-06 | 1998-12-03 | Lions Eye Institute Of Western Australia Incorporated, The | Eye tracker for refractive surgery |
DE19950792A1 (en) | 1999-10-21 | 2001-04-26 | Technolas Gmbh | Ophthalmic wavefront aberration diagnostic tool, has camera that aids in focusing aerial image from lenslet array, on wavefront sensor |
DE10014479A1 (en) | 2000-03-23 | 2001-10-04 | Technolas Gmbh | Image alignment in ophthalmic refractive surgery systems for refractive treatment of eye involves aligning two images of eye having spatial relationship, for performing refractive treatment |
EP1221890B1 (en) | 1999-10-21 | 2009-06-03 | Technolas GmbH Ophthalmologische Systeme | System for customized corneal profiling |
US6758563B2 (en) * | 1999-12-30 | 2004-07-06 | Nokia Corporation | Eye-gaze tracking |
US6610049B2 (en) | 2000-03-04 | 2003-08-26 | Katana Technologies Gmbh | Customized laser ablation of corneas with solid state lasers |
DK1210003T3 (en) | 2000-05-08 | 2004-12-06 | Alcon Inc | Objective measurement and correction of optical systems using wavefront analysis |
US6460997B1 (en) | 2000-05-08 | 2002-10-08 | Alcon Universal Ltd. | Apparatus and method for objective measurements of optical systems using wavefront analysis |
JP2003532484A (en) | 2000-05-09 | 2003-11-05 | Memphis Eye & Cataract Associates Ambulatory Surgery Center (d.b.a. MECA Laser and Surgery Center) | Method and apparatus for controlling a high resolution high speed digital micromirror device for laser refractive ophthalmic surgery
DE10022995C2 (en) | 2000-05-11 | 2003-11-27 | Wavelight Laser Technologie Ag | Device for photorefractive corneal surgery |
US6607527B1 (en) * | 2000-10-17 | 2003-08-19 | Luis Antonio Ruiz | Method and apparatus for precision laser surgery |
WO2002076355A2 (en) * | 2001-03-27 | 2002-10-03 | Wavelight Laser Technologie Ag | Method for treatment and diagnosis of eye tissues |
CN100502777C (en) | 2001-04-27 | 2009-06-24 | Bausch & Lomb Incorporated | Iris pattern recognition and alignment
US20030208189A1 (en) * | 2001-10-19 | 2003-11-06 | Payman Gholam A. | Integrated system for correction of vision of the human eye |
US6666857B2 (en) * | 2002-01-29 | 2003-12-23 | Robert F. Smith | Integrated wavefront-directed topography-controlled photoablation |
US20050107775A1 (en) * | 2002-03-04 | 2005-05-19 | The Cleveland Clinic Foundation | Method and apparatus for controlling ablation in refractive surgery |
US7458683B2 (en) * | 2003-06-16 | 2008-12-02 | Amo Manufacturing Usa, Llc | Methods and devices for registering optical measurement datasets of an optical system |
2002
- 2002-11-19 AU AU2002346438A patent/AU2002346438A1/en not_active Abandoned
- 2002-11-19 US US10/300,714 patent/US7044602B2/en not_active Expired - Lifetime
- 2002-11-19 MX MXPA04011893A patent/MXPA04011893A/en active IP Right Grant
- 2002-11-19 CA CA2487411A patent/CA2487411C/en not_active Expired - Fee Related
- 2002-11-19 JP JP2004509342A patent/JP4256342B2/en not_active Expired - Fee Related
- 2002-11-19 CN CNB028293150A patent/CN100442006C/en not_active Expired - Fee Related
- 2002-11-19 EP EP02784501.5A patent/EP1516156B1/en not_active Expired - Lifetime
- 2002-11-19 WO PCT/US2002/037051 patent/WO2003102498A1/en active Search and Examination
2006
- 2006-03-28 US US11/277,743 patent/US7261415B2/en not_active Expired - Lifetime
2007
- 2007-07-10 US US11/775,840 patent/US7431457B2/en not_active Expired - Lifetime
2008
- 2008-09-15 US US12/210,933 patent/US8740385B2/en active Active
2014
- 2014-04-22 US US14/258,854 patent/US9596983B2/en not_active Expired - Lifetime
2017
- 2017-02-10 US US15/430,087 patent/US10251783B2/en not_active Expired - Lifetime
Patent Citations (102)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4365874A (en) | 1980-09-11 | 1982-12-28 | Milburn Wanda O | Oculotorsionometer |
US4540254A (en) | 1982-10-26 | 1985-09-10 | Humphrey Instruments, Inc. | Keratometer having peripheral light entrance and exit paths |
US4665913A (en) | 1983-11-17 | 1987-05-19 | Lri L.P. | Method for ophthalmological surgery |
US4761071A (en) | 1984-11-06 | 1988-08-02 | Baron William S | Apparatus and method for determining corneal and scleral topography |
US4641349A (en) | 1985-02-20 | 1987-02-03 | Leonard Flom | Iris recognition system |
US6267756B1 (en) | 1986-03-08 | 2001-07-31 | G. Rodenstock Instrumente Gmbh | Apparatus for the observation and the treatment of the eye using a laser |
US4815839A (en) | 1987-08-03 | 1989-03-28 | Waldorf Ronald A | Infrared/video electronystagmographic apparatus |
US4848340A (en) | 1988-02-10 | 1989-07-18 | Intelligent Surgical Lasers | Eyetracker and method of use |
US5036347A (en) | 1988-08-31 | 1991-07-30 | Canon Kabushiki Kaisha | Visual line detecting device and camera having the same |
US5070883A (en) | 1988-12-16 | 1991-12-10 | Konan Camera Research Institute Inc. | Eye movement analyzing device utilizing pupil center-of-gravity data |
US5398684A (en) | 1988-12-23 | 1995-03-21 | Hardy; Tyrone L. | Method and apparatus for video presentation from scanner imaging sources |
US5098426A (en) | 1989-02-06 | 1992-03-24 | Phoenix Laser Systems, Inc. | Method and apparatus for precision laser surgery |
US6099522A (en) | 1989-02-06 | 2000-08-08 | Visx Inc. | Automated laser workstation for high precision surgical and industrial interventions |
US5159361A (en) | 1989-03-09 | 1992-10-27 | Par Technology Corporation | Method and apparatus for obtaining the topography of an object |
US4995716A (en) | 1989-03-09 | 1991-02-26 | Par Technology Corporation | Method and apparatus for obtaining the topography of an object |
US5231674A (en) | 1989-06-09 | 1993-07-27 | Lc Technologies, Inc. | Eye tracking method and apparatus |
US5062702A (en) | 1990-03-16 | 1991-11-05 | Intelligent Surgical Lasers, Inc. | Device for mapping corneal topography |
US5196873A (en) | 1990-05-08 | 1993-03-23 | Nihon Kohden Corporation | Eye movement analysis system |
US5214455A (en) | 1991-04-01 | 1993-05-25 | General Electric Company | Objective eye alignment measurement method and system |
US5341180A (en) | 1991-06-29 | 1994-08-23 | Nidek Co., Ltd. | Ophthalmic photographing apparatus |
US5291560A (en) * | 1991-07-15 | 1994-03-01 | Iri Scan Incorporated | Biometric personal identification system based on iris analysis |
US5713892A (en) | 1991-08-16 | 1998-02-03 | Visx, Inc. | Method and apparatus for combined cylindrical and spherical eye corrections |
US5406342A (en) | 1992-01-15 | 1995-04-11 | Euclid Medical Instruments | System for determining the topography of a curved surface |
US5865832A (en) | 1992-02-27 | 1999-02-02 | Visx, Incorporated | System for detecting, measuring and compensating for lateral movements of a target |
US5347331A (en) | 1992-06-30 | 1994-09-13 | Nidek Co., Ltd. | Ophthalmic apparatus for photographing the anterior part of the eye with a reproducible photographing position |
US5683379A (en) | 1992-10-01 | 1997-11-04 | Chiron Technolas Gmbh Ophthalmologische Systeme | Apparatus for modifying the surface of the eye through large beam laser polishing and method of controlling the apparatus |
US5550937A (en) | 1992-11-23 | 1996-08-27 | Harris Corporation | Mechanism for registering digital images obtained from multiple sensors having diverse image collection geometries |
US5293871A (en) | 1993-05-05 | 1994-03-15 | Cornell Research Foundation Inc. | System for ultrasonically determining corneal layer thicknesses and shape |
US5549597A (en) | 1993-05-07 | 1996-08-27 | Visx Incorporated | In situ astigmatism axis alignment |
US6203539B1 (en) | 1993-05-07 | 2001-03-20 | Visx, Incorporated | Method and system for laser treatment of refractive errors using offset imaging |
US5512966A (en) | 1993-06-24 | 1996-04-30 | Orbtek, Inc. | Ophthalmic pachymeter and method of making ophthalmic determinations |
US5512965A (en) | 1993-06-24 | 1996-04-30 | Orbtek, Inc. | Ophthalmic instrument and method of making ophthalmic determinations using Scheimpflug corrections |
US5974165A (en) | 1993-11-30 | 1999-10-26 | Arch Development Corporation | Automated method and system for the alignment and correlation of images from two different modalities |
US6351573B1 (en) | 1994-01-28 | 2002-02-26 | Schneider Medical Technologies, Inc. | Imaging device and method |
US6104828A (en) | 1994-03-24 | 2000-08-15 | Kabushiki Kaisha Topcon | Ophthalmologic image processor |
US5980513A (en) | 1994-04-25 | 1999-11-09 | Autonomous Technologies Corp. | Laser beam delivery and eye tracking system |
US5632742A (en) | 1994-04-25 | 1997-05-27 | Autonomous Technologies Corp. | Eye movement sensing method and system |
US5572596A (en) | 1994-09-02 | 1996-11-05 | David Sarnoff Research Center, Inc. | Automated, non-invasive iris recognition system and method |
US5751836A (en) | 1994-09-02 | 1998-05-12 | David Sarnoff Research Center Inc. | Automated, non-invasive iris recognition system and method |
US5620436A (en) | 1994-09-22 | 1997-04-15 | Chiron Technolas Gmbh Ophthalmologische Systeme | Method and apparatus for providing precise location of points on the eye |
US5491524A (en) | 1994-10-05 | 1996-02-13 | Carl Zeiss, Inc. | Optical coherence tomography corneal mapping apparatus |
US5649032A (en) | 1994-11-14 | 1997-07-15 | David Sarnoff Research Center, Inc. | System for automatically aligning images to form a mosaic image |
US6393163B1 (en) | 1994-11-14 | 2002-05-21 | Sarnoff Corporation | Mosaic based image processing system |
US5581637A (en) | 1994-12-09 | 1996-12-03 | Xerox Corporation | System for registering component image tiles in a camera-based scanner device transcribing scene images |
US5646791A (en) | 1995-01-04 | 1997-07-08 | Visx Incorporated | Method and apparatus for temporal and spatial beam integration |
US5614967A (en) * | 1995-03-30 | 1997-03-25 | Nihon Kohden Corporation | Eye movement analysis system |
US5640221A (en) | 1995-05-15 | 1997-06-17 | Nihon Kohden Corporation | Binocular optical image-pickup equipment and binocular image-pickup system |
US6159202A (en) | 1995-09-29 | 2000-12-12 | Nidek Co., Ltd. | Corneal surgery apparatus
US20020013573A1 (en) | 1995-10-27 | 2002-01-31 | William B. Telfair | Apparatus and method for tracking and compensating for eye movements |
US5774591A (en) | 1995-12-15 | 1998-06-30 | Xerox Corporation | Apparatus and method for recognizing facial expressions and facial gestures in a sequence of images |
US5850486A (en) | 1996-04-29 | 1998-12-15 | The Mclean Hospital Corporation | Registration of image data |
US5843070A (en) | 1996-05-13 | 1998-12-01 | Partech, Inc. | Simulating corneal laser surgery |
US5891132A (en) | 1996-05-30 | 1999-04-06 | Chiron Technolas Gmbh Ophthalmologische Systeme | Distributed excimer laser surgery system
US5757462A (en) | 1996-05-31 | 1998-05-26 | Nidek Company, Ltd. | Ophthalmic apparatus for photographing a section of an anterior part of an eye |
US6271915B1 (en) | 1996-11-25 | 2001-08-07 | Autonomous Technologies Corporation | Objective measurement and correction of optical systems using wavefront analysis |
US6095651A (en) | 1996-12-23 | 2000-08-01 | University Of Rochester | Method and apparatus for improving vision and the resolution of retinal images |
US5777719A (en) | 1996-12-23 | 1998-07-07 | University Of Rochester | Method and apparatus for improving vision and the resolution of retinal images |
US6116738A (en) | 1997-01-06 | 2000-09-12 | Vismed, Inc. | Corneal topographer with central and peripheral measurement capability |
US5740803A (en) | 1997-03-07 | 1998-04-21 | Autonomous Technologies Corporation | Locating the center of the entrance pupil of an eye after pupil dilation |
US5790235A (en) | 1997-03-26 | 1998-08-04 | Carl Zeiss, Inc. | Method and apparatus to measure pupil size and position |
US6285780B1 (en) | 1997-03-28 | 2001-09-04 | Oki Electric Industry Co., Ltd. | Apparatus for identifying individual animals and image processing method |
US5859686A (en) | 1997-05-19 | 1999-01-12 | Northrop Grumman Corporation | Eye finding and tracking system |
US5926251A (en) | 1997-08-12 | 1999-07-20 | Mitsubishi Denki Kabushiki Kaisha | Eye image tracking apparatus |
US6314197B1 (en) | 1997-08-22 | 2001-11-06 | International Business Machines Corporation | Determining an alignment estimation between two (fingerprint) images |
US5951475A (en) | 1997-09-25 | 1999-09-14 | International Business Machines Corporation | Methods and apparatus for registering CT-scan data to multiple fluoroscopic images |
USRE39882E1 (en) | 1997-11-11 | 2007-10-16 | Kabushiki Kaisha Topcon | Ophthalmologic characteristic measuring apparatus |
US5982555A (en) | 1998-01-20 | 1999-11-09 | University Of Washington | Virtual retinal display with eye tracking |
US5963300A (en) | 1998-02-17 | 1999-10-05 | Amt Technologies, Corp. | Ocular biometer |
US6004313A (en) | 1998-06-26 | 1999-12-21 | Visx, Inc. | Patient fixation system and method for laser eye surgery |
US6347549B1 (en) | 1998-07-09 | 2002-02-19 | Ryan International Corporation | Enhancement of storm location from a single moving platform |
US6496594B1 (en) | 1998-10-22 | 2002-12-17 | Francine J. Prokoski | Method and apparatus for aligning and comparing images of the face and body from different imagers |
US6129722A (en) | 1999-03-10 | 2000-10-10 | Ruiz; Luis Antonio | Interactive corrective eye surgery system with topography and laser system interface |
US6245059B1 (en) | 1999-04-07 | 2001-06-12 | Visx, Incorporated | Offset ablation profiles for treatment of irregular astigmatism
US6257722B1 (en) | 1999-05-31 | 2001-07-10 | Nidek Co., Ltd. | Ophthalmic apparatus |
US6396069B1 (en) | 1999-06-25 | 2002-05-28 | Macpherson David C. | Topographer for real time ablation feedback having synthetic wavelength generators |
US6266453B1 (en) | 1999-07-26 | 2001-07-24 | Computerized Medical Systems, Inc. | Automated image fusion/alignment system and method |
US6280436B1 (en) | 1999-08-10 | 2001-08-28 | Memphis Eye & Cataract Associates Ambulatory Surgery Center | Eye tracking and positioning system for a refractive laser system |
US6305802B1 (en) | 1999-08-11 | 2001-10-23 | Johnson & Johnson Vision Products, Inc. | System and method of integrating corneal topographic data and ocular wavefront data with primary ametropia measurements to create a soft contact lens design |
US6217596B1 (en) | 1999-09-01 | 2001-04-17 | Samir G. Farah | Corneal surface and pupillary cardinal axes marker |
US7146983B1 (en) | 1999-10-21 | 2006-12-12 | Kristian Hohla | Iris recognition and tracking for optical treatment |
US20020047992A1 (en) | 2000-01-27 | 2002-04-25 | Zyoptics, Inc. | Method and apparatus for measuring optical aberrations of the human eye |
US6234631B1 (en) | 2000-03-09 | 2001-05-22 | Lasersight Technologies, Inc. | Combination advanced corneal topography/wave front aberration measurement |
US6394999B1 (en) | 2000-03-13 | 2002-05-28 | Memphis Eye & Cataract Associates Ambulatory Surgery Center | Laser eye surgery system using wavefront sensor analysis to control digital micromirror device (DMD) mirror patterns |
US6413251B1 (en) | 2000-03-13 | 2002-07-02 | Memphis Eye & Cataract Associates Ambulatory Surgery Center | Method and system for controlling a digital mircomirror device for laser refractive eye surgery |
US6500171B1 (en) | 2000-03-13 | 2002-12-31 | Memphis Eye & Cataract Associates Ambulatory Surgery Center | System for generating ablation profiles for laser refractive eye surgery |
US6508812B1 (en) | 2000-03-13 | 2003-01-21 | Memphis Eye & Cataract Associates Ambulatory Surgery Center | Control system for high resolution high speed digital micromirror device for laser refractive eye surgery |
US6673062B2 (en) | 2000-03-14 | 2004-01-06 | Visx, Inc. | Generating scanning spot locations for laser eye surgery |
US6929638B2 (en) | 2000-04-19 | 2005-08-16 | Alcon Refractivehorizons, Inc. | Eye registration and astigmatism alignment control systems and method |
US6866661B2 (en) | 2000-04-19 | 2005-03-15 | Alcon Refractivehorizons, Inc. | Eye registration and astigmatism alignment control systems and method |
US20040143245A1 (en) | 2000-04-19 | 2004-07-22 | Alcon Refractivehorizons, Inc. | Eye registration and astigmatism alignment control systems and method |
US6702806B2 (en) | 2000-04-19 | 2004-03-09 | Alcon, Inc. | Eye registration and astigmatism alignment control systems and method |
US6296358B1 (en) | 2000-07-14 | 2001-10-02 | Visual Pathways, Inc. | Ocular fundus auto imager |
US6728424B1 (en) | 2000-09-15 | 2004-04-27 | Koninklijke Philips Electronics, N.V. | Imaging registration system and method using likelihood maximization |
US20040012760A1 (en) | 2000-10-18 | 2004-01-22 | Toshifumi Mihashi | Eye characteristics measuring device |
US7309126B2 (en) | 2000-10-18 | 2007-12-18 | Kabushiki Kaisha Topcon | Eye characteristics measuring device |
US6612698B2 (en) * | 2001-01-25 | 2003-09-02 | Explorer Inc. | Pupil measurement apparatus, refraction correction apparatus, and pupil measurement method |
US20040070730A1 (en) | 2001-02-09 | 2004-04-15 | Toshifumi Mihashi | Eye characteristic measuring device |
US6634750B2 (en) | 2001-03-15 | 2003-10-21 | Wavefront Sciences, Inc. | Tomographic wavefront analysis system and method of mapping an optical system
US6634752B2 (en) | 2002-03-11 | 2003-10-21 | Alcon, Inc. | Dual-path optical system for measurement of ocular aberrations and corneal topometry and associated methods |
US20030223037A1 (en) | 2002-05-30 | 2003-12-04 | Visx, Incorporated | Methods and systems for tracking a torsional orientation and position of an eye |
US20040019346A1 (en) | 2002-06-13 | 2004-01-29 | Visx, Incorporated | Corneal topography-based target warping |
US20040116910A1 (en) | 2002-12-12 | 2004-06-17 | Bausch & Lomb Incorporated | System and method for evaluating a secondary LASIK treatment |
Non-Patent Citations (39)
Title |
---|
Autonomous Technologies Corporation, Tracker-Assisted Photorefractive Keratectomy System (T-PRK(R)) Operation Manual, Sep. 2, 1997, 8 pages total. |
Bara et al., "Positioning Tolerances for Phase Plates Compensating Aberrations of the Human Eye," Applied Optics, 39:3413-3420 (Jul. 1, 2000). |
Bos et al., "Ocular Torsion Quantification with Video Images," IEEE Transactions on Biomedical Engineering, vol. 41, No. 4, Apr. 1994, pp. 351-357.
Chaudhuri et al., "Optimum Circular Fit to Weighted Data in Multi-Dimensional Space," Pattern Recognition Letters, 14:1-6 (Jan. 1993). |
Chiron Technolas GmbH, Keracor Excimer Laser System, User Manual Version 1.0, Aug. 1996, 61 pages total.
Daugman, "High Confidence Visual Recognition of Persons by a Test of Statistical Independence," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 15, No. 11, Nov. 1993. |
Daugman, "Wavelet Demodulation Codes, Statistical Independence, and Pattern Recognition, Institute of Mathematics and Its Applications," Proc, 2nd, IMA-IP pp. 244-260 (2000). |
Fitzgibbon et al., "Direct Least Square Fitting of Elipses," IEEE Transactions on Pattern Analysis and Machine Intelligence, 21(5):476-480 (May 1999). |
Groen et al., "Determination of Ocular Torsion by Means of Automatic Pattern Recognition," IEEE Trans Biomed. Eng., May 1996, vol. 43, No. 5, pp. 471-479. |
Groen et al., "Video-Oculography" Chapter 1. PhD Thesis, The Dutch Experiment Support Center [online], 1997 [retrieved on Jul. 20, 2005]. Retrieved from the Internet: <URL: https://www.desc.med.vu.nl/Publications/Thesis/Groen/Groen<SUB>-</SUB>Chapter 1.htm>. |
Guirao et al., "Corneal wave aberration from videokeratography: accuracy and limitations of the procedure," Opt. Soc. Am., 17(6):955-965 (Jun. 2000). |
Guirao et al., "Effect of rotation and translation on the expected benefit of an ideal method to correct the eye's higher-order aberrations," J. Opt. Soc. Am., 18(5):1003-1015 (May 2001). |
Hatamian et al., "Design considerations for a real-time ocular counter-roll instrument," IEEE Trans. Biomed. Eng., 30(5):278-288 (May 1983). |
Iskander et al., "An Alternative Polynomial Representation of the Wavefront Error Function," Invest. Ophthalmol. Vis. Sci., 2002; 43:E-Abstract 1898. |
Iskander et al., "Modeling of corneal surfaces with radial polynomials," IEEE Trans. Biomed. Eng., 49(4):320-328 (Apr. 2002). |
Koch, Refractive Surgical Problem, edited by Thomas Kohnen, MD, J. Cataract Refract Surg, vol. 24, No. 7, Jul. 1998, pp. 876-881, <<https://www.ascrs.org/publications/jcr/jcrsindex.html>>. |
Kremer, "How to Keep Lasik on Axis," Review of Ophthamology, Mar. 1999, <<https://www.revophth.com/1999/march<SUB>-</SUB>articles/rpc9q&a.html>>, 1 page only. |
Leventon, "A Registration, Tracking and Visualization System for Image Guided Surgery," MasterThesis, MIT 1997, 123 pages total. |
Liang et al., "Aberrations and Retinal Image Quality of the Normal Human Eye," J. of the Opt. Soc. of Amer., vol. 4, No. 11, Nov. 1997, pp. 2873-2883. |
Liang et al., "Objective Measurement of Wave Aberrations with the Use of a Hartmann-Shack Wave-front Sensor", J. Opt. Soc. of America., vol. 11, No. 7, Jul. 1994, pp. 1-9. |
Markham et al., "Eye Torsion in Space and during Static Tilt Pre- and Post-Spaceflight," Proceedings of the 6th European Symposium on Life Sciences Research in Space, Trondheim, Norway 1996 ESA SP-390 (Oct. 1996), p. 89. |
Mulligan, "Image Processing for Improved Eye-Tracking Accuracy," Behav. Res. Methods, Instr. & Computers, 29:54-65 (1997). |
Ott et al., "The Stability of Human Eye Orientation During Visual Fixation," Neurosci. Lett., 142(2):183-186 (1992). |
Roddier et al., "Wavefront Reconstruction Using Iterative Fourier Transforms," Applied Optics, 30(11):1325-1327 (Apr. 10, 1991). |
Schwiegerling et al., "Using Corneal Height maps and Polynomial Decomposition to Determine Corneal Aberrations," Optometry and Vision Science, 74(11):906-916 (Nov. 1997). |
Sensomotoric Instruments GmbH, "Opposition Against European Patent No. 1 221 922 B1", filed Jun. 27, 2007, 41 pages total. |
Sensomotoric Instruments GmbH, "VOG for Windows User Manual," version 3.08, Nov. 1996, 280 pages total. |
Shi et al., "Good Features to Track," IEEE Conference on Computer Vision and Pattern Recognition (CVPR94), Seattle, Jun. 1994. |
Stevens, "Astigmatic Excimer Laser Treatment: Theoretical Effects of Axis Misallgnment," Eur. J. Implant Ref. Surg, vol. 6, Dec. 1994. |
Suzuki et al., "Using a Reference Point and Videokeratography for Inoperative Identification of Astigmatism Axis", J. Cataract Refract. Surg., vol. 23, No. 10, Dec. 1997, pp. 1491-1495. |
Suzuki et al., Refractive Surgical Problem, edited by Thomas Kohnen, MD, J. Cataract Refract Surg, vol. 24, No. 7, Jul. 1998, pp. 876-881, <<https://www.ascrs.org/publications/jcrs/jcrsindex,html>>. |
Swami et al., "Rotational Malposition During Laser In Situ Keratomileusis," Am. J. Ophthalmol., 133(4):561-562 (Apr. 2002). |
Taylor et al., "Determining the Accuracy of an Eye Tracking System for Laser Refractive Surgery," J. Refrac. Surg., 16:S643-S646 (2000). |
Tjon-Fo-Sang et al., "Cyclotorsion: A Possible Cause of Residual Astigmatism in Refractive Surgery," J. Cataract Refract. Surg., 28(4):599-602 (2002). |
Uozato et al., "Centering Surgical Procedures," American J. Ophthal., vol. 103, Mar. 1987, pp. 264-275. |
Van Rijn et al., "Instability of Ocular Torsion During Fixation: Cyclovergence is More Stable Than Cycloversion," Vision Res., 34(8):1077-1087 (1994). |
Visx Incorporated, "Opposition Against European Patent No. 1 221 922 B1", filed Jun. 26, 2007, 28 pages total.
Yamanobe et al., "Eye Movement Analysis System Using Computerized Image Recognition," Arch Otolaryngol Head Neck Surg., vol. 116, No. 3, Mar. 1990, pp. 338-341.
Yang et al., "Pupil Location Under Mesopic, Photopic, and Pharmacologically Dilated Conditions," Investigative Ophthalmology & Visual Science, vol. 43, No. 7, Jul. 2002. |
Cited By (38)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9596983B2 (en) | 2002-05-30 | 2017-03-21 | Amo Manufacturing Usa, Llc | Methods and systems for tracking a torsional orientation and position of an eye |
US20090012505A1 (en) * | 2002-05-30 | 2009-01-08 | Amo Manufacturing Usa, Llc | Methods and Systems for Tracking a Torsional Orientation and Position of an Eye |
US10251783B2 (en) * | 2002-05-30 | 2019-04-09 | Amo Manufacturing Usa, Llc | Methods and systems for tracking a torsional orientation and position of an eye |
US20170151089A1 (en) * | 2002-05-30 | 2017-06-01 | Amo Manufacturing Usa, Llc | Methods and systems for tracking a torsional orientation and position of an eye |
US8740385B2 (en) | 2002-05-30 | 2014-06-03 | Amo Manufacturing Usa, Llc | Methods and systems for tracking a torsional orientation and position of an eye |
US20050131398A1 (en) * | 2003-11-10 | 2005-06-16 | Visx, Inc. | Methods and devices for testing torsional alignment between a diagnostic device and a laser refractive system |
US20070225693A1 (en) * | 2006-03-10 | 2007-09-27 | Dirk Muehlhoff | Treatment and diagnostic systems for the eye |
US20070242854A1 (en) * | 2006-04-03 | 2007-10-18 | University College Cardiff Consultants Limited | Method of and apparatus for detecting degradation of visual performance |
US7773769B2 (en) * | 2006-04-03 | 2010-08-10 | University College Cardiff Consultants Limited | Method of and apparatus for detecting degradation of visual performance |
US20100202669A1 (en) * | 2007-09-24 | 2010-08-12 | University Of Notre Dame Du Lac | Iris recognition using consistency information |
US10687980B2 (en) * | 2008-04-01 | 2020-06-23 | Amo Development, Llc | System and method of iris-pupil contrast enhancement |
US20160346128A1 (en) * | 2008-04-01 | 2016-12-01 | Amo Development, Llc | System and method of iris-pupil contrast enhancement |
US10695223B2 (en) * | 2008-04-01 | 2020-06-30 | Amo Development, Llc | System and method of iris-pupil contrast enhancement |
US20160346127A1 (en) * | 2008-04-01 | 2016-12-01 | Amo Development, Llc | System and method of iris-pupil contrast enhancement |
US10398598B2 (en) | 2008-04-04 | 2019-09-03 | Truevision Systems, Inc. | Apparatus and methods for performing enhanced visually directed procedures under low ambient light conditions |
US10117721B2 (en) | 2008-10-10 | 2018-11-06 | Truevision Systems, Inc. | Real-time surgical reference guides and methods for surgical applications |
US11051884B2 (en) | 2008-10-10 | 2021-07-06 | Alcon, Inc. | Real-time surgical reference indicium apparatus and methods for surgical applications |
US11039901B2 (en) | 2009-02-20 | 2021-06-22 | Alcon, Inc. | Real-time surgical reference indicium apparatus and methods for intraocular lens implantation |
DE102009030466A1 (en) * | 2009-06-23 | 2011-01-05 | Carl Zeiss Meditec Ag | Method and device for aligning location-related eye data |
US9373123B2 (en) | 2009-12-30 | 2016-06-21 | Iheartmedia Management Services, Inc. | Wearable advertising ratings methods and systems |
US20110161160A1 (en) * | 2009-12-30 | 2011-06-30 | Clear Channel Management Services, Inc. | System and method for monitoring audience in response to signage |
US9047256B2 (en) | 2009-12-30 | 2015-06-02 | Iheartmedia Management Services, Inc. | System and method for monitoring audience in response to signage |
US20130060241A1 (en) * | 2010-04-27 | 2013-03-07 | Daniel S. Haddad | Dynamic real time active pupil centroid compensation |
US9082002B2 (en) | 2011-01-13 | 2015-07-14 | Panasonic Intellectual Property Corporation Of America | Detection device and detection method |
US9552660B2 (en) | 2012-08-30 | 2017-01-24 | Truevision Systems, Inc. | Imaging system and methods displaying a fused multidimensional reconstructed image |
US10019819B2 (en) | 2012-08-30 | 2018-07-10 | Truevision Systems, Inc. | Imaging system and methods displaying a fused multidimensional reconstructed image |
US10740933B2 (en) | 2012-08-30 | 2020-08-11 | Alcon Inc. | Imaging system and methods displaying a fused multidimensional reconstructed image |
WO2014149625A1 (en) | 2013-03-15 | 2014-09-25 | Amo Development Llc | Systems and methods for providing anatomical flap centration for an ophthalmic laser treatment system |
EP4088695A2 (en) | 2013-03-15 | 2022-11-16 | AMO Development LLC | Systems for providing anatomical flap centration for an ophthalmic laser treatment system |
US20140320808A1 (en) * | 2013-03-15 | 2014-10-30 | Neuro Kinetics, Inc. | Method and apparatus for system synchronization in video oculography based neuro-otologic testing and evaluation |
US9247870B2 (en) * | 2013-03-15 | 2016-02-02 | Neuro Kinetics, Inc. | Method and apparatus for system synchronization in video oculography based neuro-otologic testing and evaluation |
US20180008460A1 (en) * | 2016-07-06 | 2018-01-11 | Amo Wavefront Sciences, Llc | Retinal imaging for reference during laser eye surgery |
US10842673B2 (en) * | 2016-07-06 | 2020-11-24 | Amo Development, Llc | Retinal imaging for reference during laser eye surgery |
AU2017292847B2 (en) * | 2016-07-06 | 2022-05-19 | Amo Development, Llc | Retinal imaging for reference during laser eye surgery |
US10917543B2 (en) | 2017-04-24 | 2021-02-09 | Alcon Inc. | Stereoscopic visualization camera and integrated robotics platform |
US11058513B2 (en) | 2017-04-24 | 2021-07-13 | Alcon, Inc. | Stereoscopic visualization camera and platform |
US11083537B2 (en) | 2017-04-24 | 2021-08-10 | Alcon Inc. | Stereoscopic camera with fluorescence visualization |
US10299880B2 (en) | 2017-04-24 | 2019-05-28 | Truevision Systems, Inc. | Stereoscopic visualization camera and platform |
Also Published As
Publication number | Publication date |
---|---|
US20080009840A1 (en) | 2008-01-10 |
US7261415B2 (en) | 2007-08-28 |
EP1516156A4 (en) | 2008-09-10 |
US20170151089A1 (en) | 2017-06-01 |
JP2005528600A (en) | 2005-09-22 |
US7044602B2 (en) | 2006-05-16 |
US8740385B2 (en) | 2014-06-03 |
AU2002346438A1 (en) | 2003-12-19 |
US10251783B2 (en) | 2019-04-09 |
WO2003102498A1 (en) | 2003-12-11 |
EP1516156B1 (en) | 2019-10-23 |
US9596983B2 (en) | 2017-03-21 |
EP1516156A1 (en) | 2005-03-23 |
JP4256342B2 (en) | 2009-04-22 |
CA2487411A1 (en) | 2003-12-11 |
US20090012505A1 (en) | 2009-01-08 |
US20140316390A1 (en) | 2014-10-23 |
US20060161141A1 (en) | 2006-07-20 |
MXPA04011893A (en) | 2005-03-31 |
CA2487411C (en) | 2011-06-14 |
US20030223037A1 (en) | 2003-12-04 |
CN100442006C (en) | 2008-12-10 |
CN1650148A (en) | 2005-08-03 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US10251783B2 (en) | Methods and systems for tracking a torsional orientation and position of an eye | |
US7458683B2 (en) | Methods and devices for registering optical measurement datasets of an optical system | |
CA2774536C (en) | Registration of corneal flap with ophthalmic measurement and/or treatment data for lasik and other procedures | |
EP1221922B1 (en) | Iris recognition and tracking for optical treatment | |
US6929638B2 (en) | Eye registration and astigmatism alignment control systems and method | |
EP3509546B1 (en) | Systems for obtaining iris registration and pupil centration for laser surgery | |
JP2007268285A (en) | Iris pattern recognition and alignment |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: AMO MANUFACTURING USA, LLC, CALIFORNIA Free format text: CHANGE OF NAME;ASSIGNOR:VISX, INCORPORATED;REEL/FRAME:020308/0064 Effective date: 20071231 |
|
STCF | Information on status: patent grant |
Free format text: PATENTED CASE |
|
FPAY | Fee payment |
Year of fee payment: 4 |
|
FPAY | Fee payment |
Year of fee payment: 8 |
|
MAFP | Maintenance fee payment |
Free format text: PAYMENT OF MAINTENANCE FEE, 12TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1553); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY Year of fee payment: 12 |