WO2023091660A1 - Vision screening systems and methods

Info

Publication number
WO2023091660A1
Authority
WO
WIPO (PCT)
Prior art keywords
vision screening
patient
display
radiation
screening device
Application number
PCT/US2022/050390
Other languages
French (fr)
Other versions
WO2023091660A9 (en)
Inventor
Edward Charles BREMER
Yaolong Lou
Original Assignee
Welch Allyn, Inc.
Priority claimed from US17/531,320 external-priority patent/US20220076417A1/en
Application filed by Welch Allyn, Inc. filed Critical Welch Allyn, Inc.
Priority to EP22896520.8A priority Critical patent/EP4432895A1/en
Publication of WO2023091660A1 publication Critical patent/WO2023091660A1/en
Publication of WO2023091660A9 publication Critical patent/WO2023091660A9/en

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B3/00 Apparatus for testing the eyes; Instruments for examining the eyes
    • A61B3/10 Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions
    • A61B3/11 Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions for measuring interpupillary distance or diameter of pupils
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B3/00 Apparatus for testing the eyes; Instruments for examining the eyes
    • A61B3/10 Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions
    • A61B3/103 Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions for determining refraction, e.g. refractometers, skiascopes

Definitions

  • This application is directed to medical equipment, and in particular, to devices, systems, and methods associated with determining refractive error, spherical error, and/or other parameters.
  • Visual screening in individuals typically includes one or more tests to determine various deficiencies associated with the patient’s eyes.
  • vision tests may include, for example, refractive error tests, convergence tests, accommodation tests, visual acuity tests, color vision screening tests, and the like. While one or more of the above tests may be related, each test has a respective purpose.
  • During a refractive error test, the person is typically positioned within a measurement range associated with a vision screening device. Once the person is properly positioned, the screening device can be used to direct light onto the person’s retinas. Sensors on the device may then collect corresponding light that is reflected by the retinas, and the device may determine a refractive error for each eye based on characteristics of the reflected light.
  • While various vision screening devices exist, such devices are typically cumbersome and complicated to use.
  • existing vision screening devices commonly utilize equipment such as range finders, optics for centering light sources for image capture, and other specialized components that increase the cost, size, and operational complexity of vision screening systems.
  • current methods utilize manual alignment, manual determinations, range-finding components, and/or large rooms/facilities while testing patients.
  • additional systems may utilize complicated and specialized equipment to complete patient tests.
  • existing vision screening devices utilize calibration curves for refractive error tests that introduce inaccuracies and errors in recommendations provided to medical practitioners and patients.
  • a system can include a light source operable to generate an eccentric source of visible or near infrared (NIR) radiation.
  • the eccentric source of radiation can be comprised of a plurality of point radiation sources that are disposed radially surrounding a radiation sensor. Additionally, the plurality of point sources can be predominantly disposed in a two-dimensional plane surrounding the radiation sensor.
  • the system can include an anterior surface that is comprised of the radiation sensor and the eccentric source of radiation and a posterior surface that is comprised of an interactive display.
  • Refractive error, as determined by the system of the present disclosure, may be represented by three parameters: sphere, cylinder, and axis.
  • an example method of the present disclosure includes a refractive error test that collects a set of images from a patient over a timeframe. Additionally, the method includes monitoring, via a radiation sensor, a retina of a patient and periodically emitting, via a radiation source, near infrared (NIR) radiation for a duration of the timeframe.
  • the duration for emission of the NIR radiation can be determined based at least in part on an image capture rate of the radiation sensor. Accordingly, the radiation sensor can capture a plurality of images, wherein a CPU may select, based at least on the duration that the radiation source emitted NIR radiation, a set of images for the refractive error test from the plurality of images and analyze the set of images to determine the refractive error of the eye.
  • the duration of the NIR radiation emission can be configured such that the set of images are selected based at least on the set of images depicting the retina of the patient being fully illuminated by the NIR radiation during image capture.
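  • By way of a non-limiting illustration (not part of the original disclosure), the Python sketch below shows one way such frame selection could be implemented: frames are kept only if their entire capture period falls inside a NIR emission burst, so each selected image depicts a fully illuminated retina. The frame rate, timestamps, and helper names are assumptions made for this example.

```python
from dataclasses import dataclass
from typing import List, Tuple

FRAME_RATE_HZ = 30.0                  # assumed image-capture rate of the radiation sensor
FRAME_PERIOD_S = 1.0 / FRAME_RATE_HZ  # duration of one frame's capture period

@dataclass
class Frame:
    start_s: float   # exposure start time, seconds
    image: object    # pixel data (placeholder)

def select_fully_illuminated(frames: List[Frame],
                             nir_bursts: List[Tuple[float, float]]) -> List[Frame]:
    """Keep only frames whose full capture period lies inside a NIR burst,
    i.e., frames in which the retina was illuminated for the entire exposure."""
    selected = []
    for f in frames:
        start, end = f.start_s, f.start_s + FRAME_PERIOD_S
        if any(b_start <= start and end <= b_end for b_start, b_end in nir_bursts):
            selected.append(f)
    return selected

# Example: bursts slightly longer than one frame period yield one usable frame each.
frames = [Frame(start_s=i * FRAME_PERIOD_S, image=None) for i in range(10)]
bursts = [(0.030, 0.075), (0.160, 0.205)]
print(len(select_fully_illuminated(frames, bursts)))   # -> 2
```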
  • an example device of the present disclosure includes an eccentric radiation source configured to generate one or more beams of near infrared (NIR) radiation, an optics component configured to receive the one or more beams of NIR radiation and to direct the one or more beams of NIR onto a retina of a patient, and a radiation sensor configured to collect reflected NIR radiation from the retina and to provide information indicative of the reflected NIR to a processor.
  • the processor can be configured to identify a pupil of the patient based at least in part on the information, cause an image capture device of the system to capture a plurality of images of the pupil, determine a plurality of pupil positions based at least in part on the plurality of images, and determine a refractive error of the patient based at least on the plurality of pupil positions.
  • the eccentric radiation source can be comprised of a plurality of radiation point sources configured in meridians (e.g., lines of radiation point sources extending from the radiation sensor at rotational offsets) and eccentricities (e.g., a group of radiation point sources arranged around the radiation sensor at a radial distance). Accordingly, the processor can capture images once a focused state is achieved and generate a refractive error from the plurality of images.
  • an example system of the present disclosure includes a processor, operably connected to an eccentric radiation source and a radiation sensor.
  • the processor can cause the eccentric radiation source to generate radiation that illuminates a pupil and a retina of a patient and cause the radiation sensor to collect reflected radiation from the pupil and the retina of the patient.
  • the processor can cause the radiation sensor to capture a series of images based at least on a determination that the reflected radiation satisfies a focus state threshold.
  • the processor can determine one or more parameters associated with a detected pupil that enable the normalization of the series of images.
  • the processor can utilize neural networks to analyze an intensity profile and identify the refractive errors of the patient’s eyes.
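  • The following Python sketch is a simplified, hypothetical rendering of that sequence (focus check, normalization of the pupil image, extraction of an intensity profile along a meridian, and hand-off to a trained model); the focus metric, threshold value, and function names are illustrative assumptions, and the neural network itself is represented only by a placeholder callable.

```python
import numpy as np

FOCUS_THRESHOLD = 50.0   # placeholder; a real device would calibrate this value

def focus_metric(gray: np.ndarray) -> float:
    """Simple sharpness proxy: variance of the gradient magnitude."""
    gy, gx = np.gradient(gray.astype(float))
    return float(np.var(np.hypot(gx, gy)))

def normalize_pupil_crop(gray: np.ndarray, center, radius_px: int) -> np.ndarray:
    """Crop the detected pupil and scale intensities to [0, 1] so images taken
    under slightly different conditions are comparable."""
    cy, cx = center
    crop = gray[cy - radius_px:cy + radius_px, cx - radius_px:cx + radius_px].astype(float)
    return (crop - crop.min()) / (np.ptp(crop) + 1e-9)

def intensity_profile(pupil: np.ndarray, angle_deg: float) -> np.ndarray:
    """Sample pixel intensities along one meridian through the pupil centre."""
    h, w = pupil.shape
    r = min(h, w) // 2
    t = np.deg2rad(angle_deg)
    steps = np.linspace(-r + 1, r - 1, 2 * r - 1)
    ys = (h // 2 + steps * np.sin(t)).astype(int)
    xs = (w // 2 + steps * np.cos(t)).astype(int)
    return pupil[ys, xs]

def screen_frame(gray, center, radius_px, model):
    """Run one frame through the sketch pipeline; `model` stands in for a
    trained neural network mapping a profile to refractive-error estimates."""
    if focus_metric(gray) < FOCUS_THRESHOLD:
        return None                                   # wait for a focused state
    pupil = normalize_pupil_crop(gray, center, radius_px)
    return model(intensity_profile(pupil, angle_deg=0.0))
```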
  • FIG. 1 illustrates an example system of the present disclosure.
  • components of the example system shown in FIG. 1 may be used to perform one or more tests associated with vision screening.
  • FIG. 2A illustrates another example system of the present disclosure.
  • FIG. 2B illustrates an example vision screening device of the present disclosure. Similar to FIG. 1, in some implementations, components of the example systems and devices shown in FIGS. 2A and 2B may be used to perform one or more tests associated with vision screening.
  • FIGS. 2C, 2D, and 2E illustrate further details of components of the vision screening device shown in FIG. 2B.
  • FIG. 3 depicts a block diagram of another example system for implementing the techniques described herein.
  • FIG. 4 provides a first flow diagram illustrating an example method of the present disclosure.
  • FIG. 5 provides a second flow diagram illustrating an example method of the present disclosure.
  • FIG. 6 provides a third flow diagram illustrating an example method of the present disclosure.
  • FIG. 7 provides a fourth flow diagram illustrating an example method of the present disclosure.
  • FIG. 8 provides a pictorial flow diagram of an example workflow for performing one or more vision screening tests using the example systems and devices of the present disclosure.
  • FIG. 9 provides a fifth flow diagram illustrating an example method of the present disclosure.
  • the present disclosure is directed to, in part, a vision screening device, system and corresponding methods.
  • a vision screening system may be configured to perform one or more vision screening tests on a patient and to output the results of the vision screening test(s) to a user of the device, such as a physician or a physician’s assistant.
  • the vision screening system may generate one or more beams of radiation, via one or more radiation sources, and may be configured to direct such beams at the retinas of the patient.
  • the system may collect corresponding light that is reflected by the retinas and may determine a refractive error for one or both eyes of the patient based at least in part on characteristics of the collected light.
  • the system may generate one or more images from the light that is reflected from the retinas of the patient over a period of time, and process the one or more images such that the refractive error is determined for the patient. Additionally, the system may present various visual stimuli to the patient, and perform measurements of ocular response of the patient’s eye(s) in response to the stimuli. As such, in any of the examples described herein, the results of the various vision tests performed using the system may include one or more measurements obtained by the vision screening device included in the system. In addition, the system may generate a recommendation and/or diagnosis associated with the patient for display to the user of the vision screening device.
  • the system may evaluate the measurements determined by the system to provide a recommendation to the user regarding the vision of the patient (e.g., whether the patient passed the test, requires additional screening, etc.).
  • the system described herein may provide automated diagnosis recommendations in order to assist the physician or other user of the vision screening device.
  • a system may include a vision screening device housing configured to contain the vision screening system.
  • the various tests performed using a vision screening device may have respective distance requirements or other parameters that must be met in order to ensure accurate testing results. For instance, during a refractive error test performed using the vision screening device, it may be recommended that the patient be spaced from the vision screening device by a distance of approximately three feet or approximately one meter. It should be noted that due to the compact and/or mobile design of the vision screening device (e.g., the vision screening device may be a handheld device, a mobile device similar to a tablet or a smartphone, etc.), the vision screening device may be configured to account for variation in the distance.
  • an example vision screening device of the present disclosure may be configured to utilize a focusing algorithm to account for variations in the distance between the patient and the device, provide instructions to the patient and/or the physician to adjust the distance, or otherwise achieve proper spacing for the vision screening test(s).
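  • As a rough illustration of how measurements might be compensated for small spacing errors (one possible approach, not the disclosed algorithm), the sketch below checks a measured distance against the nominal one-meter working distance and returns a first-order scale factor; the tolerance value is an assumption.

```python
NOMINAL_DISTANCE_M = 1.0   # recommended spacing: approximately three feet / one meter
TOLERANCE_M = 0.10         # assumed acceptable deviation, for illustration only

def distance_correction(measured_distance_m: float):
    """Return whether the spacing is acceptable and a first-order scale factor
    that maps pixel measurements back to the nominal working distance."""
    within_tolerance = abs(measured_distance_m - NOMINAL_DISTANCE_M) <= TOLERANCE_M
    # Apparent feature size varies roughly inversely with distance, so rescaling
    # by measured/nominal compensates for small positioning errors.
    scale = measured_distance_m / NOMINAL_DISTANCE_M
    return within_tolerance, scale

print(distance_correction(1.08))   # -> (True, 1.08)
```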
  • the vision screening device housing may be removably connected to a user device via a mount, a user device case, or other apparatus for securing the vision screening device housing and the user device.
  • a vision screening device base may include a stem that is vertically, rotationally, and/or otherwise moveably connected to the vision screening device housing.
  • the vision screening device may be removably connected to the stem of the vision screening device base, wherein the user device may be removably connected to the base via the vision screening device housing.
  • the user device may be removably connected to the stem, and the vision screening device housing may be removably connected to the user device.
  • FIG. 1 illustrates an example system 100 for vision screening according to some implementations.
  • a vision screening system may be utilized to perform vision screening test(s) for a patient 102.
  • a user 104 may utilize a vision screening device 106 and/or other components of the system 100 to administer a vision screening test on a patient 102 to determine the vision health of the patient 102.
  • the vision screening device 106 may be configured to perform a refractive error test via a sensor 108 and an eccentric radiation source 110. Additionally, the vision screening device 106 may be configured to emit radiation in the visible band and/or the NIR band via the eccentric radiation source 110 and capture reflected visible and/or NIR radiation 112c via the sensor 108. Further, the eccentric radiation source 110 may emit a plurality of radiation beams, including radiation beams 112a and 112b. The radiation beams in the visible band may be configured to provide visual stimuli to the patient’s eye(s) for one or more of the vision screening tests. It should be understood that FIG. 1 depicts the system 100 including a single vision screening device 106 by way of example.
  • the system 100 may include any number of local or remote vision screening devices substantially similar to the vision screening device 106, configured to operate independently and/or in combination, and configured to communicate via the network 116.
  • the system 100 may include one or more databases 118 and one or more vision screening systems 120 comprised of one or more processors 122, one or more network interfaces 124, and/or patient screening components 126.
  • the patient screening components 126 may include one or more programs, modules, engines, instructions, algorithms, and/or other patient screening components that are executable by the processor(s) 122.
  • the vision screening device 106 and/or vision screening system 120 may be configured to perform refractive error testing on the patient 102.
  • refractive error testing may include emitting radiation beams 112a and 112b, such as visible and/or NIR light, configured to illuminate the eyes of the patient 102.
  • the vision screening device 106 may detect the pupils and/or lenses of the eyes of the patient 102, acquire images and/or video data of the pupils/lenses via the sensor 108, and may transmit the vision screening data, via the network 116, to the vision screening system 120 for analysis.
  • the vision screening device 106 may perform the analysis locally.
  • the sensor 108 may include optics components (not shown) that include one or more lenses, windows, prisms, filters, mirrors, and/or any other devices configured to collect and direct the reflected beam 112c of visible and/or NIR radiation generated by the eccentric radiation source 110.
  • the optics components may comprise a collimating lens, a convergent lens, a divergent lens, and/or any other substantially transparent lens or series of lenses configured to assist in directing the reflected beam(s) 112c to impinge on the sensor 108.
  • a memory associated with the vision screening device 106 and/or one or more of the patient screening components 126 may be configured to store and/or access data associated with the patient 102.
  • the patient 102 may provide data (referred to herein as “patient data”) upon initiating a vision screening test.
  • For example, when the vision screening device 106 and/or the vision screening system 120 initiates a vision screening test, the patient 102 may provide, or the user 104 may request, patient data including demographic information, physical characteristics, preferences, and similar information regarding the patient 102.
  • the patient 102 may provide demographic information such as name, age, ethnicity, gender, and the like.
  • the patient 102 may also provide physical characteristic information such as height of the patient 102.
  • the user 104 may request the patient data while the screening is in progress, or before the screening has begun.
  • the user 104 may be provided with predetermined categories associated with the patient 102, such as predetermined age ranges (e.g., six to twelve months, one to five years old, etc.), and may request the patient data in order to select the appropriate category associated with the patient 102.
  • the user 104 may provide a free form input associated with the patient data.
  • patient data may be accessed, e.g., from the database(s) 118, and may include previous screening history, patient preferences, demographic information, other medical data, and the like.
  • the vision screening device 106 may be configured to generate image and/or video data associated with the patient 102 at the onset of the vision screening test.
  • the vision screening device 106 may include one or more digital cameras, motion sensors, proximity sensors, or other image capture devices configured to collect images and/or video of the patient 102, and one or more processors of the vision screening device 106 may analyze the collected images and/or video to determine, for example, the height of the patient 102, the distance of the patient 102 from the screening device, and/or any of the patient data described above.
  • the vision screening device 106 may be configured to transmit the images, video, and/or any other collected information to the vision screening system 120, via the network 116, for analysis.
  • the vision screening system 120 may store such information in the patient screening components 126 and/or in an external database 118.
  • the database 118 may comprise memory or computer-readable media substantially similar to and/or the same as the computer-readable media associated with the patient screening components 126.
  • the database 118 may be accessible by the vision screening system 120, and/or by the vision screening device 106, via the network 116.
  • the database 118 may be configured to store patient data in association with a patient ID (e.g., a name, social security number, an alphanumeric code, etc.) or other unique patient identifier.
  • the patient screening components 126 may access or receive patient data stored in association with the patient ID.
  • the optics components of the vision screening device 106 may include the sensor 108 and the eccentric radiation source 110.
  • the eccentric radiation source 110 may comprise a plurality of light emitting diodes (LEDs) or other light sources capable of producing visible and/or NIR radiation beams 112a and 112b.
  • the eccentric radiation source 110 may comprise a collimating lens, a convergent lens, a divergent lens, and/or any other substantially transparent lens or series of lenses configured to assist in directing such beams 112a and 112b to illuminate the retinas of the patient 102.
  • the eccentric radiation source 110 can utilize undirected light sources and/or dispersion filters associated with the light sources to illuminate the patient 102 and the retinas of the patient 102.
  • the sensor 108 can be configured to determine a distance 132 between a patient position 130 and a vision screening device position 128.
  • the vision screening device 106 can be configured to maintain the distance 132 (approximately 3 feet or approximately 1 meter) between the patient 102 and the sensor 108.
  • the vision screening device 106 may be configured to determine the patient position 130 relative to the vision screening device position 128 and provide the user 104 instructions, via the user interface 114, that cause the user 104 to modify the vision screening device position 128 or the patient 102 to modify the patient position 130 such that the distance 132 is properly maintained.
  • the vision screening device 106 may be configured to provide the user 104 instructions, via the user interface 114, that cause the user 104 to rotate the vision screening device 106 around a vertical axis or that cause the patient 102 to rotate such that the patient 102 is facing an anterior face of the vision screening device 106 comprised of the sensor 108 and the eccentric radiation source 110.
  • the vision screening device 106 may be configured to provide the user 104 instructions, via the user interface 114, that cause the user 104 to modify the vision screening device position by increasing an elevation or a height of the vision screening device, at the vision screening device position 128, relative to the patient 102.
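  • A minimal sketch of how such operator guidance could be generated from the measured geometry is shown below; the thresholds, parameter names, and message wording are illustrative assumptions rather than values from the disclosure.

```python
def alignment_guidance(distance_m: float, height_offset_m: float,
                       facing_offset_deg: float, target_m: float = 1.0) -> list:
    """Produce operator instructions from the measured patient/device geometry."""
    steps = []
    if abs(distance_m - target_m) > 0.10:
        direction = "closer to" if distance_m > target_m else "farther from"
        steps.append(f"Move the device {direction} the patient "
                     f"({abs(distance_m - target_m):.2f} m).")
    if height_offset_m > 0.05:
        steps.append("Raise the device so the sensor is level with the patient's eyes.")
    elif height_offset_m < -0.05:
        steps.append("Lower the device so the sensor is level with the patient's eyes.")
    if abs(facing_offset_deg) > 10:
        steps.append("Rotate the device so its anterior face points at the patient.")
    return steps or ["Alignment OK - begin screening."]

for line in alignment_guidance(distance_m=1.22, height_offset_m=-0.08, facing_offset_deg=3):
    print(line)
```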
  • the user interface 114 may be disposed on a posterior face of the vision screening device 106 that substantially faces the user 104 during operation of the vision screening device 106.
  • the user interface 114 may include a graphical user interface configured to display information to the user 104 and/or receive input from the user 104 during a vision test.
  • the user interface 114 may be configured to receive input from the user 104 regarding the patient 102, such as any of the patient information described herein.
  • the user interface 114 may be configured to display information regarding the vision screening device 106 (e.g., a current setting or operating mode of the device, etc.), the distance of the patient 102 from the vision screening device 106, the quality of the environment and/or the focus of the vision screening device 106, the progress of the screening, options for transmitting data from the vision screening device 106 to the vision screening system 120, one or more measurements and/or values generated during the vision screening, etc.
  • the user interface 114 may comprise, for example, a liquid crystal display (LCD) or an active-matrix organic light-emitting diode (AMOLED) display.
  • the user interface 114 may also be touch-sensitive to receive input from the user 104.
  • the network 116 may be any type of wireless network or other communication network known in the art.
  • Examples of the network 116 include the Internet, an intranet, a wide area network (WAN), a local area network (LAN), a virtual private network (VPN), cellular network connections, and connections made using protocols such as 802.11a, b, g, n, and/or ac.
  • the vision screening device 106 can include a microprocessor or a control unit substantially similar to one or more components of the vision screening system 120 described above.
  • the vision screening device 106 may comprise one or more processors 122 and/or other hardware and/or software components configured to operably control the sensor 108, the eccentric radiation sources 110, the user interface 114, and other components of vision screening device 106.
  • vision screening device 106 may include a single processing unit (e.g., a single processor) or a number of processing units (e.g., multiple processors), and can include single or multiple computing units and/or multiple processing cores.
  • the processor(s) 122 of the vision screening device 106 can be implemented as one or more microprocessors, microcomputers, microcontrollers, digital signal processors, central processing units, state machines, logic circuitries, and/or any devices that manipulate signals based on operational instructions.
  • the processor(s) 122 of the vision screening device 106 can be one or more hardware processors and/or logic circuits of any suitable type specifically programmed or configured to execute the algorithms, operations, and methods described herein.
  • the processor(s) of the vision screening device 106 can be configured to fetch and execute computer-readable instructions stored in the patient screening components 126, which can program the processor(s) of the vision screening device 106 to perform the functions described herein. Additionally or alternatively, the processor(s) of the vision screening device 106 can be configured to fetch and execute computer-readable instructions stored in computer-readable media and/or other memory of/local to the vision screening device 106.
  • the processor(s) 122 can be implemented as one or more microprocessors, microcomputers, microcontrollers, digital signal processors, central processing units, state machines, logic circuitries, and/or any devices that manipulate signals based on operational instructions.
  • the processor(s) 122 can be one or more hardware processors and/or logic circuits of any suitable type specifically programmed or configured to execute the algorithms and processes described herein.
  • the processor(s) 122 can be configured to send, receive, and transmit communications via the network interface 124. Additionally, the processor(s) 122 can be configured to fetch and execute computer-readable instructions stored in the computer-readable media of the patient screening components 126, which can program the processor(s) 122 to perform the functions described herein.
  • the network interface(s) 124 may enable wired and/or wireless communications between the components and/or devices shown in system 100 and/or with one or more other remote systems, as well as other networked devices.
  • the network interface(s) 124 may include a personal area network component to enable communications over one or more short-range wireless communication channels.
  • at least some of the network interface(s) 124 may include a wide area network component to enable communication over a wide area network.
  • Such network interface(s) 124 may enable, for example, communication between the vision screening system 120 and the vision screening device 106 and/or other components of the system 100, via the network 116.
  • the patient screening components 126 may include volatile and nonvolatile memory and/or removable and non-removable media implemented in any type of technology for storage of information, such as computer-readable instructions, data structures, program modules, or other data.
  • Memory can include, but is not limited to, RAM, ROM, EEPROM, flash memory or other memory technology, optical storage, solid state storage, magnetic tape, magnetic disk storage, RAID storage systems, storage arrays, network attached storage, storage area networks, cloud storage, or any other medium that can be used to store the desired information and that can be accessed by a computing device.
  • the patient screening components 126 can include various types of computer-readable storage media and/or can be a tangible non-transitory media to the extent that when mentioned, non-transitory computer-readable media exclude media such as energy, carrier signals, electromagnetic waves, and signals per se.
  • the patient screening components 126 can include any number of functional components that are executable by the processor(s) 122.
  • these functional components comprise instructions or programs that are executable by the processor(s) 122 and that, when executed, specifically configure the one or more processor(s) 122 to perform the actions associated with one or more vision tests.
  • the patient screening components 126 may include computer-readable media configured to store a measurement data component.
  • the measurement data component may be configured to receive, access, and/or analyze testing data collected and/or detected by the vision screening device 106 during one or more vision screening procedures.
  • the measurement data component may be configured to receive, via the network 116, image data and/or video data generated by the vision screening device 106 during a vision screening test.
  • the measurement data component may analyze the image data and/or video data to determine one or more measurements associated with the patient 102, such as a gaze of the patient 102 throughout the screening, a location of the pupils of the patient 102 at points in time of viewing the graphical representation, a diameter of the pupils, an accommodation of the lens, motion information associated with the eyes of the patient 102, and the like.
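  • For example, a pupil diameter measured in pixels can be converted to physical units once the patient distance and camera geometry are known. The sketch below uses a pinhole-camera approximation; the field of view and resolution are placeholder values, not device specifications.

```python
import math

def pupil_diameter_mm(diameter_px: float, distance_m: float,
                      horizontal_fov_deg: float, image_width_px: int) -> float:
    """Convert a pupil diameter in pixels to millimetres using the camera's
    horizontal field of view and the patient distance (pinhole approximation)."""
    scene_width_m = 2.0 * distance_m * math.tan(math.radians(horizontal_fov_deg) / 2.0)
    mm_per_px = (scene_width_m / image_width_px) * 1000.0
    return diameter_px * mm_per_px

# Example: a 30-pixel pupil imaged at 1 m with an assumed 10-degree, 1280-px-wide camera.
print(round(pupil_diameter_mm(30, 1.0, 10.0, 1280), 2))   # ~4.1 mm
```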
  • the patient screening components 126 may also include computer-readable media configured to store a threshold data component.
  • the threshold data component may be configured to receive, access, and/or analyze threshold data associated with standard vision testing results.
  • a threshold data component may be configured to access or receive data from one or more additional databases (e.g., the database 118, a third-party database, etc.) storing testing data, measurements, and/or a range of values indicating various thresholds or ranges within which testing values should lie.
  • Such thresholds or ranges may be associated with patients having normal vision health with similar testing conditions.
  • standard testing data may be accessed or received by the threshold data component and may be utilized for comparison against the measurement data stored by the measurement data component described above.
  • the threshold data associated with the toddler testing category may include standard pupil measurements, and/or a threshold range of values which the testing values should not exceed or fall below (e.g., a standard value range) for toddlers when each graphical representation is displayed.
  • an example threshold data component may be configured to store information associated with the amplitude of accommodation and age (e.g., Donders’ table).
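  • A minimal sketch of an age-keyed threshold lookup and pass/refer decision is shown below; the age bands and dioptre values are illustrative placeholders in the spirit of such tables, not clinical thresholds from the disclosure.

```python
# Illustrative minimum amplitude-of-accommodation values (dioptres) by age band.
MIN_ACCOMMODATION_D = {
    (0, 12): 12.0,
    (13, 19): 10.0,
    (20, 39): 6.0,
    (40, 59): 2.5,
    (60, 120): 0.5,
}

def accommodation_screen(age_years: int, measured_amplitude_d: float) -> str:
    """Compare a measured amplitude of accommodation against the age-band minimum."""
    for (low, high), minimum in MIN_ACCOMMODATION_D.items():
        if low <= age_years <= high:
            return "pass" if measured_amplitude_d >= minimum else "refer for full exam"
    return "unknown age category"

print(accommodation_screen(8, 11.0))    # below the illustrative minimum -> refer
```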
  • FIG. 2A illustrates an additional example system 200 of the present disclosure.
  • the system 200 may include one or more of the same components included in the system 100.
  • the system 200 can include different components that provide similar functions to the components included in the system 100.
  • the system 200 may be utilized to determine a refraction error associated with one or more eyes of a patient 202.
  • a user 204 can cause the system 200 to activate a radiation source 206 such that beams of radiation 208a and 208b are emitted towards the patient 202.
  • a reflected beam of radiation 208c can be captured by a radiation sensor lens 210 and directed onto a radiation sensor 212.
  • the system 200 can be controlled via a user device 216 and a user interface 218.
  • the user device 216 may include a controller 220 configured to operate the radiation source 206, the radiation sensor lens 210, and the radiation sensor 212 during the refractive error test.
  • the radiation source 206 may be comprised of a plurality of radiation point sources that emit visible or NIR electromagnetic waves when powered and/or provided an activation signal (e.g., LEDs, Organic Light Emitting Diodes (OLEDs), light bulbs, etc.).
  • the radiation source 206 may be configured such that the plurality of radiation point sources is arranged in one or more eccentricities and/or one or more meridians.
  • the radiation source 206 of FIG. 2A depicts eight radiation point sources that are configured in four eccentricities.
  • the radiation source 206 of FIG. 2A depicts the eight radiation point sources in a single meridian that extends vertically from the radiation sensor lens 210.
  • the one or more eccentricities may describe a radial distance of one or more radiation point sources from the radiation sensor lens 210.
  • the one or more meridians may describe a rotational orientation of one or more additional radiation point sources relative to a vertical axis of the system 200.
  • the radiation source 206 can include a plurality of the radiation point sources arranged in three meridians that extend radially (e.g., a first meridian that extends vertically through 0 degrees and 180 degrees, a second that extends through 60 degrees and 240 degrees, and a third that extends through 120 degrees and 300 degrees).
  • the radiation source 206 may be configured to activate one or more subsets of the plurality of radiation point sources during a refractive error test. For instance, six radiation point sources arranged in three meridians can be sequentially activated for an inner eccentricity, a middle eccentricity, and an outer eccentricity. It should be noted that while the above example describes the various eccentricities being activated as a whole, the system 200 may activate subsets of individual meridians and/or individual eccentricities. Accordingly, the radiation source 206 may be configured to provide one or more illumination patterns during the refractive error test, as sketched below.
  • Additionally, while the example system of FIG. 2A depicts a particular arrangement of radiation point sources, the system 200 may be configured to emit any number of radiation beams as an illumination pattern for the refractive error test.
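  • To make the meridian/eccentricity arrangement concrete, the sketch below lays out point sources along three meridians at several hypothetical radial distances and activates each eccentricity in turn; the distances, counts, and ordering are assumptions for illustration only.

```python
import math
from itertools import product

MERIDIANS_DEG = [0, 60, 120]             # each meridian also spans the opposite angle (+180)
ECCENTRICITIES_MM = [6.0, 12.0, 18.0]    # hypothetical radial distances from the sensor lens

def led_positions():
    """Return (x_mm, y_mm) centres of the point sources for every meridian/eccentricity."""
    positions = {}
    for angle, radius in product(MERIDIANS_DEG, ECCENTRICITIES_MM):
        for theta in (angle, angle + 180):                  # both halves of the meridian
            x = radius * math.cos(math.radians(theta))
            y = radius * math.sin(math.radians(theta))
            positions[(theta % 360, radius)] = (round(x, 2), round(y, 2))
    return positions

def activation_sequence():
    """Activate each eccentricity as a whole (all six sources of its three meridians),
    inner to outer, as one possible illumination pattern."""
    for radius in ECCENTRICITIES_MM:
        subset = []
        for angle in MERIDIANS_DEG:
            subset += [(angle, radius), ((angle + 180) % 360, radius)]
        yield subset

for subset in activation_sequence():
    print("activate point sources:", subset)
```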
  • the system 200 may include the radiation sensor lens 210 to direct the reflected radiation beam 208c to the radiation sensor 212.
  • the radiation sensor lens 210 may be substantially similar or the same as the optical components discussed with respect to FIG. 1.
  • the radiation sensor lens 210 may be configured to adjust the aperture width and the focal length of the lens such that the radiation sensor 212 captures a high-quality image, sequence of images, and/or video of the one or more retinas of the patient 202 under illumination by the radiation source 206 (e.g., a clear image, an image without blurred features, etc.).
  • the system 200 may be configured such that the emission of visible and/or NIR radiation beams 208a and 208b by the radiation source 206 is triggered based at least on a framerate and/or a capture rate associated with the radiation sensor 212.
  • the radiation sensor 212 may be configured to capture images and/or video at a rate of thirty frames per second.
  • the radiation source 206 may be configured to emit visible and/or NIR radiation in bursts, flashes, packets, periods, etc. that are configured to be partially or fully synchronized with the capture rate of the radiation sensor 212.
  • the system 200 may be operated such that some or all of the images captured by the radiation sensor 212 are illuminated for the duration of a frame capture period (e.g., for images captured at thirty frames per second, the frame capture period would be a thirtieth of a second).
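  • One simple way to derive such a schedule (an illustrative sketch, not the disclosed timing logic) is to start each NIR burst just before a frame boundary and end it just after that frame completes, so the selected frame's entire exposure is illuminated; the guard margin and burst spacing below are assumed values.

```python
FRAME_RATE_HZ = 30.0
FRAME_PERIOD_S = 1.0 / FRAME_RATE_HZ      # ~33.3 ms per captured frame

def burst_schedule(num_bursts: int, frames_between_bursts: int = 3,
                   guard_s: float = 0.002):
    """Return (on, off) times so each burst spans one full frame capture period."""
    schedule = []
    for i in range(num_bursts):
        frame_start = (1 + i * frames_between_bursts) * FRAME_PERIOD_S
        schedule.append((frame_start - guard_s, frame_start + FRAME_PERIOD_S + guard_s))
    return schedule

for on_s, off_s in burst_schedule(3):
    print(f"NIR on at {on_s * 1000:6.1f} ms, off at {off_s * 1000:6.1f} ms")
```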
  • the emission of visible radiation may be controlled to display color stimuli such as color dot patterns to the patient 202, and the NIR radiation beams 208a and 208b may be activated for measuring refractive error and/or gaze angle of the eye(s) of the patient 202 in synchronization with the presentation of the color stimuli during the performance of a color vision screening test.
  • the system 200 may be configured such that the radiation source 206, the radiation sensor lens 210, and the radiation sensor 212 can be secured by one or more radiation source supports 214 and/or a system housing 224.
  • the radiation source support 214 can be one or more substrates that may be removably or non- removably attached to the radiation source 206, the radiation sensor lens 210, and/or the radiation sensor 212.
  • a first radiation source support 214a may be configured to provide structural support to the plurality of radiation point sources of the radiation source 206.
  • the radiation sensor lens 210 and/or the radiation sensor 212 may be radially encompassed in at least a two-dimensional plane by the first radiation source support 214a.
  • While the first radiation source support 214a is depicted by FIG. 2A as being attached to a posterior surface of the individual radiation point sources, the individual radiation point sources may be partially embedded in the first radiation source support 214a at any point along the primary axis (e.g., the axis perpendicular to the anterior, light-emitting surface and the posterior surface of the radiation point source).
  • the radiation sensor 212 of the system 200 may be configured to receive and/or access light, image, and/or video data associated with a patient 202 being evaluated during a refractive error test.
  • the radiation sensor 212 may be configured to capture, or generate, image and/or video data during the vision test.
  • image/video data may be transmitted, via the communication interface(s) 222, to the user device 216 for processing and analysis.
  • the radiation sensor 212 includes, for example, a complementary metal-oxide semiconductor (CMOS) sensor array, also known as an active pixel sensor (APS), or a charge-coupled device (CCD) sensor.
  • the radiation sensor lens 210 is supported by the system 200 and positioned in front of the radiation sensor 212.
  • the radiation sensor 212 has a plurality of rows of pixels and a plurality of columns of pixels.
  • the radiation sensor 212 may include approximately 1280 by 1024 pixels, approximately 640 by 480 pixels, approximately 1500 by 1152 pixels, approximately 2048 by 1536 pixels, and/or approximately 2560 by 1920 pixels.
  • the radiation sensor 212 may be capable of capturing approximately 25 frames per second (fps), approximately 30 fps, approximately 35 fps, approximately 40 fps, approximately 50 fps, approximately 75 fps, approximately 100 fps, approximately 150 fps, approximately 200 fps, approximately 225 fps, and/or approximately 250 fps. Note that the above pixel values and frames per second are exemplary, and other values may be greater or less than the examples described herein.
  • the radiation sensor 212 may include photodiodes having a light-receiving surface and have substantially uniform length and width. During exposure, the photodiodes convert the incident light to a charge.
  • the radiation sensor 212 may be operated as a global shutter. For example, substantially all of the photodiodes may be exposed simultaneously and for substantially identical lengths of time.
  • the radiation sensor 212 may be used with a rolling shutter mechanism, in which exposures move as a wave from one side of an image to the other. Other mechanisms are possible to operate the radiation sensor 212 in yet other examples.
  • the radiation sensor 212 may also be configured to capture digital images. The digital images can be captured in various formats, such as JPEG, BITMAP, TIFF, etc.
  • the system 200 may be additionally configured such that the second radiation source support 214b is attached to the radiation sensor 212 and the first radiation source support 214a. Similar to the first radiation source support 214a, the second radiation source support 214b may be configured to provide structural support to the radiation sensor lens 210, the radiation sensor 212, and the first radiation source support 214a. Additionally, the radiation sensor 212 may be partially embedded and radially encompassed in a two-dimensional plane at any position along the primary axis of the radiation sensor 212. Further, the second radiation source support 214b can provide structural support to a communication interface 222.
  • In the example system of FIG. 2A, the system 200 may be further configured such that the system housing 224 may provide structural support for the individual radiation point sources of the radiation source 206, the radiation sensor lens 210, the first radiation source support 214a, the second radiation source support 214b, the user device 216, and/or other components of the system 200.
  • the system housing 224 can be an enclosed or partially enclosed structure comprised of at least an anterior surface penetrated or perforated such that the radiation source 206 may emit the radiation beams 208a and 208b while the radiation sensor lens 210 may collect the radiation beam 208c.
  • a posterior surface of the system housing 224 may comprise a socket, a bracket, a port, or other connector that provides a connection interface for the user device 216.
  • the posterior surface of the system housing 224 may include a user interface 218 embedded in or attached to the system housing 224 and integrated with the system 200.
  • the system 200 may include the first radiation source support 214a and the second radiation source support 214b, wherein the radiation source supports may be integrated circuit boards, printed circuit assemblies (PCAs), printed circuit boards (PCBs), or other circuit boards configured to provide support and signaling to the radiation source 206, the radiation sensor lens 210, the radiation sensor 212, the communication interface 222, and other components of the system 200.
  • the radiation source supports may comprise processors, microprocessors, microcontrollers, memory, computer-readable media, drivers for the individual radiation point sources, and other support components for the system 200. Further, the radiation source supports can provide communication with the user device 216 and/or the user interface 218 via circuitry and/or connections routed through the system housing 224 or the communication interface 222.
  • the communication interface 222 may be configured to provide data connections and network communications with the user device 216 and/or the communication network 116 described with respect to FIG. 1.
  • the communication interface 222 may be configured to connect to external databases (e.g., the database 118) to receive, access, and/or send screening data using wireless connections.
  • Wireless connections can include cellular network connections and connections made using protocols such as 802.11a, b, g, and/or ac.
  • a wireless connection can be accomplished directly between the vision screening device 106 and an external display using one or more wired or wireless protocols, such as Bluetooth, Wi-Fi Direct, radio-frequency identification (RFID), infrared signals, and/or Zigbee.
  • the communication of data to an external database 118 or an external system can enable report printing or further assessment of the visual test data of the patient 202.
  • collected data and related test results may be wirelessly transmitted and stored in a remote database accessible by authorized medical professionals.
  • FIG. 2B illustrates an embodiment of a vision screening device 230 according to some implementations.
  • the example vision screening device 230 may include one or more of the same components included in the device 106 of system 100. In some additional examples, the vision screening device 230 can include different components that provide similar functions to the device 106 included in the system 100.
  • the vision screening device 230 may include two tablet-like components, a first component 232 including a first housing 233 and a second component 234 including a second housing 235, which may include one or more processors associated therewith (not shown).
  • a first front surface 236 of the first component 232 may face a direction 238A towards a patient (such as the patient 202), and a second front surface 240 of the second component 234 may face a direction 238B, opposite the direction 238A, towards a user of the vision screening device 230 (such as the user 204).
  • In the partially open state 230A, a first back surface 242A of the first component 232, formed by the back of the first housing 233, faces a second back surface 242B of the second component 234, formed by the back of the second housing 235, as shown.
  • the first housing 233 and the second housing 235 may be rotatably coupled via a hinged attachment 244 and/or be otherwise pivotable, so that the first component 232 and the second component 234 pivot about a central longitudinal axis 246.
  • In the completely open position (not shown), the first component 232 may be rotated along the counter-clockwise direction 248A, and the second component 234 may be rotated along the clockwise direction 248B, towards each other, until the back surfaces 242A, 242B are adjacent and flush against each other.
  • Conversely, the first component 232 may be rotated along the clockwise direction 248B and the second component 234 may be rotated along the counter-clockwise direction 248A until the front surfaces 236, 240 are proximate (e.g., flush against and facing each other) and the back surfaces 242A, 242B form the exterior surfaces of the vision screening device 230, reaching a closed position 230B.
  • the first housing 233 and the second housing 235 are moveable between the closed position 230B in which the first front surface 236 is disposed adjacent to and facing the second front surface 240, and a fully open position in which the first front surface 236 is disposed substantially parallel to and approximately 180 degrees opposite the second front surface 240, moving through intermediate partially open positions, such as the partially open position 230A.
  • the vision screening device 230 While the vision screening device 230 is being used to administer vision screening test(s), the vision screening device 230 may be at the fully open position described above. In some examples, the vision screening device 230 may instead be at a partially open position, where the first component 232 is substantially perpendicular to the direction 238A facing the patient, and the second component 234 forms an obtuse angle with the direction 238B to provide a comfortable reading angle for a user of the vision screening device 230.
  • the first component 232 may comprise an imaging assembly 250 and a first display 252 disposed within the first housing 233 forming the substantially planar first front surface 236.
  • the imaging assembly 250 may include one or more illumination and image capture modules 254.
  • While the imaging assembly 250 is shown as being disposed at a top end of the first display 252, in other examples the imaging assembly 250 may be disposed at other locations along a periphery of the first display 252. Additionally, the imaging assembly 250 may be substantially planar, and its front surface 255 may be co-planar with the first front surface 236 of the first component 232.
  • the imaging assembly 250 may comprise one or more units distributed at different locations along the periphery of the first display 252, each unit including one or more of the modules 254.
  • a first unit 250A of the imaging assembly 250 may be disposed on the right side of the first display 252, and a second unit 250B of the imaging assembly 250 may be disposed on the left side of the first display 252.
  • Each unit 250A, 250B of the imaging assembly 250 may include one or more modules 254.
  • the first unit and the second unit of the imaging assembly 250 may be targeted at different eyes of the patient, e.g., the left eye and the right eye.
  • the first display 252 may be used to present visual stimuli 256 to the patient during the vision screening test(s).
  • the visual stimuli 256 may include a video to attract the gaze of the patient’s eyes to the display screen, a visual acuity target (e.g., a Snellen chart) during a visual acuity screening test, different brightness levels for an accommodation screening test, color patterns for a color vision screening test, and the like.
  • the visual stimulus 256 shown on the first display 252 is a Snellen chart for determining visual acuity of the patient.
  • the patient being evaluated may be asked to read the characters where each successive line has characters smaller than the characters in the line above.
  • the second component 234 may include a second display 258 forming the substantially planar second front surface 240, disposed within the second housing 235.
  • the second display 258 may be used to display information to the user that is relevant to the vision screening test(s) being administered.
  • the second display 258 may present one or more visual elements conveying the relevant information.
  • the visual elements may include a live video feed 260 of the patient being evaluated so that the user can monitor the patient’s actions and responses.
  • the visual elements may also include the patient’s information 262 which may include demographic information, prior screening results, and/or medical history and the like.
  • the visual elements may include a replication 264 of the visual stimuli 256 being presented to the patient and/or an image 266 of the eyes of the patient captured by a module of the imaging assembly 250.
  • the visual elements may include measurements and/or results 268 related to the vision screening test being administered, and/or a recommendation 270 which may be system-generated in some examples.
  • the second display 258 may also display an icon panel 272 including icons 274 that provide easy navigation and access to other tasks or screens to the user.
  • the icon panel 272 may include several icons for performing tasks associated with administering vision screening test(s). Some examples of icons that may be displayed include an icon for selecting a vision screening test from a listing of vision screening tests supported by the vision screening device 230, an icon for accessing a list of patients who may be scheduled for evaluation, an icon for adding screening test results to a summary of patient evaluation, an icon to initiate a re-do of a previous step or a previous screening test, and/or an icon to save or print an evaluation report, and the like.
  • the icon panel 272 may also include a “home” icon to allow the user to access a “home” screen, which may be set up to include one or more of the visual elements described above.
  • the second display 258 may be touch-sensitive to receive input from the user of the vision screening device 230 using a stylus or finger.
  • the vision screening device 230 may include a keyboard or other data entry mechanism (e.g., voice command recognition) to allow the user to input data related to the vision screening test(s).
  • the vision screening device 230 may also be configured to connect wirelessly with external computing devices, peripherals, cloud services, and/or external databases, as described in further detail with reference to FIG. 3.
  • FIG. 2B depicts an exemplary vision screening device 230 that includes components for administering one or more vision screening test(s) to a patient.
  • the vision screening device 230 is intended to perform an entire vision screening, which may include multiple different vision screening tests.
  • the vision screening device 230 has the additional feature of being lightweight enough to be hand-held, e.g., having a weight equivalent to that of the two tablet-like components 232, 234, allowing for portability and ease of use.
  • the imaging assembly 250 of the vision screening device 230 provides the radiation sources and image capture modules needed for the one or more vision screening test(s) in a compact and substantially planar arrangement, enabling the lightweight and portable form factor of the vision screening device 230.
  • the imaging assembly 250 and its modules 254 are described in detail with reference to FIG. 2C below.
  • FIG. 2C illustrates an imaging assembly 250 of the vision screening device 230, including an exemplary configuration of modules 254.
  • the imaging assembly 250 is envisioned to be configurable to include modules 254 needed for performing a variety of vision screening tests. Though a particular set of modules 280-288 is shown, there may be fewer or additional modules. For example, based on the vision screening tests and features enabled in the vision screening device 230, fewer or additional modules may be included in the imaging assembly 250.
  • the imaging assembly 250 may include a housing 276 in which the modules 254 are disposed, supported by a base 277 that may be operably connected to the processors of the vision screening device 230 to provide the functionality described herein.
  • the housing 276 and the modules 254 disposed therein may be covered with a front cover 278, on an opposite side of the base 277.
  • the front cover 278 may be transparent in areas which are disposed over optical components of the modules 254 underneath the front cover 278, e.g., cameras, illumination sources, and the like, to allow radiation to pass through without change.
  • FIG. 2D illustrates a view of the imaging assembly 250 where the front cover 278 is removed, exposing the modules 254.
  • the modules 254 may include one or more of a near-infrared (NIR) radiation source 280, a NIR camera 282 for capturing NIR radiation reflected from eyes of the patient, a visible light camera 284, a range sensor 286, and/or a visible light source 288.
  • the camera modules 282, 284 may also include video capture capabilities.
  • the modules 254 may each include a connector 287 that provides physical and electrical connections to the base 277 of the housing 276.
  • the modules 280, 282, 284, 286, 288 may each be modular components that can be added in a series, or combined in other ways, to form the imaging assembly 250.
  • the type of modules, the number of modules, or the order and manner in which they are assembled is not intended to be construed as a limitation, and any number of modules can be combined in any manner to form the imaging assembly 250.
  • the NIR radiation source module 280 which may include an arrangement of near infrared light emitting diodes (NIR LEDs), may be placed in front of the NIR camera 282 module so that the NIR LEDs are disposed around an optical axis of the NIR camera 282.
  • each unit of the imaging assembly 250 may include a housing in which one or more modules e.g., modules 280-288, are disposed.
  • the first unit 250A of the imaging assembly 250 may include a first near-infrared (NIR) radiation source 280 and a first NIR camera 282, and the second unit 250B of the imaging assembly 250 may include a second near-infrared (NIR) radiation source 280 and a second NIR camera 282.
  • the first unit 250A and the second unit 250B of the imaging assembly 250 may include a visible light source 288 and/or a visible light camera 284.
  • a third unit (not shown) of the imaging assembly 250 may be disposed behind the first display 252 and within the first housing 233.
  • the first display 252 may be a transparent screen, such as a transparent organic light-emitting diode (OLED) screen, allowing radiation to travel through the display and to or from the modules 254 of the third unit without change.
  • the third unit may include a NIR camera 282, so that the NIR camera 282 is aimed in the direction 238A, directly at the eyes of a patient viewing content on the first display 252.
  • the third unit may include a visible light camera 284 and/or other modules 254.
  • the NIR radiation source 280 may comprise a NIR light-emitting diode (LED) illumination module which may incorporate one or more NIR LEDs arranged in a pattern that may be configured to provide patterns of illumination needed during various vision screening tests e.g., a refractive error test as described below.
  • the NIR camera module 282 configured to capture NIR radiation reflected from the eyes of the patient may comprise a high-resolution, auto-focus camera with custom optics for imaging eyes in clinical applications. The NIR camera module 282 will be described in further detail with reference to FIG. 2E.
  • the visible light camera 284 may be equipped with auto-focus zoom lens(es) that can provide telephoto as well as wide-angle image capture capability. For example, at the start of a vision screening session, the camera 284 may operate in a wide-angle capture mode to enable the vision screening device 230 to locate the patient in the examination room. Thereafter, the camera 284 may operate in a telephoto zoom mode to capture close-up views of the patient’s face or eyes. The camera 284 may also capture video of the patient while the vision screening tests are being administered.
  • the imaging assembly 250 may include a range sensor module 286 for sensing depth information.
  • the depth information of the patient relative to the vision screening device 230 corresponds to the distance of the patient from the vision screening device 230. Therefore, the depth detected by the range sensor module 286 may be used by the user of the vision screening device 230 to verify that the patient is located at an ideal distance for the vision screening test being administered.
  • the detected depth or distance to the patient may also be used to adjust visual stimuli being presented to the patient (e.g., the size of characters in a visual acuity test may be smaller if the patient is closer to the vision screening device 230).
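  • By way of a non-limiting illustration, the sketch below shows how a displayed character size could be scaled from the detected distance so that an optotype subtends a fixed visual angle; the 5 arcmin angle, the pixel pitch, and the function name are illustrative assumptions and are not taken from this disclosure.

```python
import math

def optotype_height_px(distance_m: float,
                       arcmin: float = 5.0,
                       pixels_per_mm: float = 4.0) -> int:
    """Return a letter height in pixels so the optotype subtends a fixed
    visual angle (default 5 arcmin, the conventional 20/20 letter) at the
    measured patient distance. Pixel pitch is an assumed display property."""
    height_mm = 2.0 * distance_m * 1000.0 * math.tan(math.radians(arcmin / 60.0) / 2.0)
    return max(1, round(height_mm * pixels_per_mm))

# A patient standing closer is shown a smaller letter, as described above.
print(optotype_height_px(1.0))   # ~1 m away
print(optotype_height_px(3.0))   # ~3 m away -> roughly three times larger
```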
  • the range sensor module 286 may comprise 3D sensors such as LIDAR sensors which use laser scanning, or ToF (time-of-flight) sensors which use infrared light to determine depth information.
  • the range sensor module 286 may generate a three-dimensional point cloud of a portion of the examination room, including the patient. The point cloud may be generated while the patient is illuminated by NIR radiation emitted by the NIR radiation source 280, and may be used to measure distance to the patient.
  • a three-dimensional point cloud of the patient’s head may be generated using the range sensor module 286. The three-dimensional point cloud of the patient’s head may be used to more accurately localize the patient’s eyes and locate pupils of the eyes.
  • the imaging assembly 250 may include a visible light source 288, which is more commonly referred to as a flash.
  • the visible light source 288 may illuminate the patient while the visible light camera 284 captures images or video of the patient.
  • the timing of activation of different modules may be synchronized (e.g., the visible light source 288 may be activated during the capture of an image by the visible light camera 284).
  • the modules 254 of the imaging assembly 250 may be controlled by a processor of the vision screening device 230 as described with reference to FIG. 3.
  • FIG. 2E illustrates an exploded view of the NIR camera module 282.
  • the NIR camera module 282 may include a high-resolution lens 290 with a narrow field of view suitable for imaging eyes in a vision screening setting.
  • the lens 290 may incorporate folded prism slim lens technology which allows for telephoto zoom while maintaining a low height profile.
  • the optical system used in folded prism lenses bends and focuses light while it is reflected back and forth inside thin optical prisms, reducing the thickness of the lens and allowing for a substantially low-height form factor.
  • some vision screening tests, and in particular a refractive error screening test, require images of pupils and/or lenses of the eyes of the patient.
  • the high-resolution zoom capability of the camera 282 enables the capture of close-up images of the eyes of the patient from which the pupils and/or lenses of the eyes can be localized.
  • the lens may incorporate an aspherical multi-lens design which allows focusing over a range of 3-22 feet, which is typically the distance between the patient and the vision screening device 230 in a vision screening setting.
  • the lens 290 may be controlled by an actuator 292 incorporating high-performance miniature actuator technology which enables auto-focus and optical image stabilization features of the camera 282.
  • the multi-lens design of the lens 290 may incorporate a stack of lenses along an optical axis 291, each of the lenses of the stack being capable of off-axis lateral motion.
  • the lens 290 may include a prism rotatable about an axis perpendicular to the optical axis 291.
  • the actuator 292 which may be micro electro-mechanical systems(MEMS)-based, may control the motion of the lenses of the stack of lenses as well as the prism, providing optical image stabilization to compensate for small movements arising from a user handholding the vision screening device 230 while administering the vision screening test(s).
  • the optical image stabilization feature of the lens 290 may enable sharper image capture using the NIR camera 282 in a hand-held vision screening device 230, eliminating the need for using a rigid support system for the vision screening device 230, such as a tripod.
  • the NIR camera module 282 also includes a NIR sensor 294 and a controller board 296, such as a radio frequency printed circuit board (RF-PCB), fitted into a housing 298 of the module 282.
  • the housing 298 may have a base 298 and one or more walls 299 extending substantially perpendicular to the base.
  • the housing may also include a transparent cover opposite the base 298 and supported by the one or more walls 299 to form an internal space.
  • the internal space may be partially enclosed by the walls 299, cover, and the base 298.
  • one or more components of the module 282 may be disposed at least partly within the internal space.
  • the cover and the base 298 may be substantially planar.
  • the cover forms a portion of the cover 278 of the imaging assembly 250 that is disposed substantially coplanar with the first display 252 of the vision screening device 230.
  • one or more of the components of the module 282 may at least partially protrude from (e.g., extend through one or more orifices formed by the cover) the surface of the cover 278.
  • the visible light camera 284 may include components substantially similar to those described above with reference to the NIR camera 282.
  • FIG. 3 illustrates another example system 300 of the present disclosure.
  • the system 300 may include one or more components similar to or the same as the components included in the systems 100 and 200.
  • the system 300 can interact with the components described with respect to the systems 100 and 200 to execute one or more methods according to this disclosure.
  • the system 300 may include an LED PCA 302 comprised of an LED driver 304, a microcontroller unit (MCU) 306, power management systems 308, and an information interface 310.
  • a user device 312 can be included in the system 300, which may be the vision screening device 230.
  • the user device 312 comprises a computer processing unit (CPU) 314, a communication interface 316, a user interface 318, memory 320, power management systems 322, an information interface 324, and a display 326. Additionally, the user device 312 may include display 328 for displaying content to a patient 202 whose vision is being screened using user device 312. Further, components of the LED PCA 302 can be configured to operate the radiation emitting systems and radiation capturing systems described with respect to FIGS. 1 and 2A-C (e.g., the imaging assembly 250).
  • the LED driver 304 may operate and control the radiation source 206 or the NIR radiation source 280.
  • the LED driver 304 may provide one or more commands, received via the MCU 306, for selectively activating and deactivating individual radiation point sources of the vision screening system.
  • the LED driver 304 may operate to activate the individual radiation point sources independent of the meridian and eccentricity associations established between the radiation point sources.
  • the LED driver 304 may be configured to receive and distribute power from the power management system 308 to the radiation source 206. Further, the LED driver 304 may communicate with the CPU 314 and receive one or more indications providing instruction for radiation source activation and activation patterns.
  • the MCU 306 may be configured to perform or partially perform methods described by FIGs. 4-7 or the workflow of FIG. 8. Additionally or alternatively, the MCU 306 may operate to control the radiation sensor 212, or the NIR camera 282, to capture an image, a plurality of images, and/or a video of one or more retinas associated with the patient 202. The MCU 306 may communicate with the CPU 314 via the information interface 310 to receive indications of routines and/or algorithms to be performed during the vision screening test(s) and to transmit the captured image(s) and video captured during the routines and/or algorithms.
  • in some additional examples, the MCU 306 may be configured to perform high-speed video frame capture while the LED driver 304 causes the radiation source 206 to illuminate the patient 202.
  • the MCU 306 may receive an indication for a series of light pattern images to be captured within a timeframe associated with a vision screening test, like a refraction test.
  • the CPU 314 can synchronize the radiation source 206, or the NIR radiation source 280, flashing the patient 202 and illuminating the retinas of the patient 202 such that the radiation sensor 212, or the NIR camera 282, may capture images and/or video for the series of light pattern images of the refraction test.
  • the synchronization of the LED driver 304 with the MCU 306 by the CPU 314 enables the series of light pattern images to be captured despite the latency between the LED driver 304, the MCU 306, and the CPU 314.
  • the CPU 314 of the system 200 may comprise one or more controllers, processors, and/or other hardware and/or software components configured to operably control the LED driver 304, the MCU 306, communication interface 316, the user interface 318, the display 326, and/or other components of the system 200.
  • the CPU 314 shown in FIG. 3 may include a single processing unit (e.g., a single processor) or a number of processing units (e.g., multiple processors), and can include single or multiple computing units or multiple processing cores.
  • CPU 314 shown in FIG. 3 can be implemented as one or more microprocessors, microcomputers, microcontrollers, digital signal processors, central processing units, state machines, logic circuitries, and/or any devices that manipulate signals based on operational instructions.
  • CPU 314 shown in FIG. 3 can be one or more hardware processors and/or logic circuits of any suitable type specifically programmed or configured to execute the algorithms, operations, and methods described herein.
  • the CPU 314 shown in FIG. 3 can be configured to fetch and execute computer- readable instructions stored in memory 320, which can program the CPU 314 to perform the functions described herein. Additionally or alternatively, the CPU 314 shown in FIG. 3 can be configured to fetch and execute computer-readable instructions stored in patient screening components 126 of the vision screening system 120 (FIG. 1).
  • the CPU 314 shown in FIG. 3 may be configured to receive various information, signals, and/or other inputs from one or more of the user interfaces 318, the radiation sensor 212 or the NIR camera module 282, the display 326, and/or other components of the system 200.
  • the user interface 318 may receive such inputs from the user 204, and one or more such inputs may comprise a command or a request for the system 200 to generate, display, provide, and/or otherwise output one or more images, beams of radiation, dynamic stimulus, or other output included in a refractive error examination or other vision test.
  • the CPU 314 may be operable to cause the radiation source 206, the radiation source 280, and/or the visible light source 288 to generate, display, provide, and/or otherwise output, beams of radiation, dynamic stimulus, or other output included in a refractive error examination or other vision test.
  • the CPU may cause one or more images to be output at the display 326 (or the display 258) facing the user 204, and/or the display 328 (or the display 252) facing the patient 202.
  • the CPU 314 can be configured to synchronize capture of high-speed images or frames with periodic illumination of the patient 202 via the radiation source 206 or the radiation source 280. It should be noted that the synchronization of high-speed image and/or frame capture with the illumination of the patient 202 may be utilized to compensate for communication delays, between the CPU 314 and the LED driver 304 and between the CPU 314 and the MCU 306, caused by the information interfaces 310 and 324.
  • the CPU 314 may communicate with the MCU 306 via a first communication method (e.g., through USB video class (UVC) communications, where the images and/or frames observed by the radiation sensor 212, or captured by the NIR camera 282, and the MCU 306 are transmitted to the CPU for recording). Via this first communication method, commands, images observed by the radiation sensor 212 or captured by the NIR camera 282, and other information can be exchanged with relatively low latency between transmission of information by either the CPU 314 or the MCU 306 and receipt of the transmission by the other.
  • the CPU 314 may communicate with the LED driver 304 via a second communication method (e.g., through Bluetooth communications), wherein the second communication method exchanges information with relatively high latency between transmission and receipt of information.
  • the LED driver 304 and the MCU 306 may be unable to exchange communications. Accordingly, the high latency between the CPU 314 and the LED driver 304 may introduce a delay between the CPU 314 transmitting an illuminate command to the LED driver 304 and the completion of the illuminate command for each image of the series of light pattern images, such that a complete series of light pattern images is not captured during the timeframe provided for the refraction test.
  • synchronization of the LED driver 304 can enable the radiation source 206 or the radiation source 280 to illuminate the patient 202 while an image and/or a frame of the series of light pattern images is captured.
  • the CPU 314 can ensure that at least one frame or image captured during the flash will be fully illuminated. Accordingly, the CPU 314's modification of the flash duration can enable the series of light pattern images captured during the timeframe to include a series of images that may be utilized for the refraction test.
  • each activation of the radiation source 206 or the radiation source 280 includes at least a wholly illuminated frame (e.g., the radiation source 206 or the radiation source 280 does not deactivate during the frame capture) captured by the radiation sensor 212 and may include one or more partially illuminated frames. Additionally, any frames from the series of images and/or frames that overlap with a deactivation of the radiation source 206 or the radiation source 280 (e.g., flash ends while the frame is captured, causing the frame to be partially illuminated) are discarded.
  • the MCU 306 can synchronize activation of the eccentric radiation source 206 or the radiation source 280, or the NIR camera 282, by the LED driver 304 with the capture of the series of images and/or frames by the radiation sensor 212. Additionally, the MCU 306 can receive a command from the CPU 314 that triggers the synchronization of eccentric radiation source activation and image capture such that a series of images may be captured. Further, the synchronization of the eccentric radiation source activation and the image capture may be achieved by utilizing the communication pathways between the MCU 306 and the CPU 314 and between the MCU 306 and the LED driver 304 to issue commands.
  • the MCU 306 can utilize unused information bandwidth to forward commands, via the information interface 310, to the LED driver 304 (e.g., the MCU 306 may utilize a VSYNC pin to transmit a VSYNC signal to the LED driver 304 and bypass the utilization of the Bluetooth connection to trigger the activation of the eccentric radiation source 206 or the radiation source 280). Accordingly, the MCU 306 forwards commands for radiation source activation to the LED driver 304, thereby causing the LED driver 304 to activate in synchronicity with the rate of frame capture at the radiation sensor 212 or the NIR camera 282 and bypassing the latency between the LED driver 304 and the CPU 314.
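  • The following is a minimal, hypothetical sketch of this frame-synchronized triggering approach; every driver object and method shown (camera, led_driver, vsync_pin) is a placeholder for illustration only and is not an interface defined by this disclosure.

```python
# Sketch: strobe the LED pattern on each frame's VSYNC instead of waiting for a
# high-latency (e.g., Bluetooth) command round-trip from the host CPU.
# All driver objects and methods below are hypothetical placeholders.

def capture_light_pattern_series(camera, led_driver, vsync_pin, patterns):
    frames = []
    for pattern in patterns:
        led_driver.set_pattern(pattern)      # pre-load which point sources to light
        vsync_pin.wait_for_edge()            # MCU reacts to the sensor's VSYNC directly
        led_driver.strobe()                  # activate LEDs in lockstep with the exposure
        frames.append(camera.read_frame())   # frame is captured while fully illuminated
    return frames
```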
  • the CPU 314 may encode a “start” image and/or frame for the series of light pattern images received from the MCU 306.
  • the components of the LED PCA 302 are unaware of a start frame or image for the series of light pattern images.
  • CPU 314 may be configured to determine the start image associated with the LED driver 304 receiving a command to initiate one or more flashes via the radiation source 206 or the radiation source 280 and the CPU 314 collecting the series of light pattern images. Accordingly, based at least on the flash duration, the CPU 314 may determine an illumination pattern for one or more images that indicates the start image for the series of light pattern images requested by the refraction error test.
  • the illumination pattern can be a high light intensity image followed by a low light intensity image followed by an additional high light intensity image.
  • the illumination pattern can be two high light intensity images followed by a low light intensity image followed by an additional high light intensity image.
  • the CPU 314 may identify the start image for the series of light pattern images recorded by the CPU 314 and encode the series of light pattern images with an identifier of the start image.
  • encoding the start frame data into the series of light pattern images may comprise creating the illumination pattern, identified based at least on the flash duration and the time to capture a single image, to encode the high illumination, low illumination, high illumination pattern, or other illumination pattern, into a sequence of images that precede the start image.
  • an image processing algorithm may be configured to identify the start image of the refraction test for the series of light pattern images due to the embedded illumination pattern.
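  • A minimal sketch of how such an image processing algorithm could locate the start image from the embedded high/low/high brightness marker is shown below; the brightness thresholds and function name are illustrative assumptions.

```python
import numpy as np

def find_start_index(frames, hi=180, lo=60):
    """Locate the first frame after an encoded high/low/high mean-brightness
    marker in a recorded sequence. Threshold values are illustrative, not
    taken from the disclosure."""
    means = [float(np.mean(f)) for f in frames]
    for i in range(len(means) - 3):
        if means[i] > hi and means[i + 1] < lo and means[i + 2] > hi:
            return i + 3          # first image of the light-pattern series
    return None                   # marker not found
```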
  • the vision screening test being administered may be a color vision screening test.
  • the MCU 306 may synchronize the activation of the radiation source 206, or the radiation source 280, with the presentation of color stimuli to the patient 202.
  • the color stimuli may be displayed on the display 328, or the display 252, facing the patient, and the radiation source 206, or the radiation source 280, may be activated to determine the refractive error of the eye(s) of the patient 202 responsive to the color stimulus being displayed.
  • the radiation sensor 212 or the NIR camera 282 may capture images of the eye(s) illuminated by the NIR radiation emitted by the radiation source 206, or the radiation source 280, and the captured images may be processed by the CPU 314 to determine differences in the refractive error in response to changes in the color stimulus being presented on the display 328 or the display 252.
  • the power management systems 308 and 322 may comprise any removable, rechargeable, and/or other power source known in the art and configured to store electrical power.
  • the power management systems 308 and 322 may comprise one or more rechargeable batteries configured to selectively provide electrical current to the one or more components of the system 200 during use.
  • the power management systems 308 and 322 may comprise one or more sealed lead acid batteries, lithium ion batteries, nickel cadmium batteries, nickel-metal hydride batteries, or other types of batteries configured to provide sufficient power to the LED driver 304, the MCU 306, the radiation source 206 or the radiation source 280, the radiation sensor lens 210, 290, the radiation sensor 212, 282, the information interfaces 310 and 324, the CPU 314, the communication interface 316, the user interface 318, and/or other components of the described systems.
  • the communication interface(s) 316 of the system 200 shown in FIG. 3 may enable wired and/or wireless communications between the vision screening device 300 and one or more external databases 118, a communications network 116, and/or one or more components of the vision screening system 120 (FIG. 1), as well as with one or more other remote systems and/or other networked devices.
  • the communication interface(s) 316 may include a personal area network component to enable communications over one or more short-range wireless communication channels.
  • the communication interface(s) 316 may include a wide area network component to enable communication over a wide area network. In any of the examples described herein, the communication interface(s) 316 may enable communication between the system 200 and external resources via the network 116 (FIG. 1).
  • the memory 320 shown in FIG. 3 may be similar to the patient screening components 126 described above with respect to the vision screening system 120 (FIG. 1), or memory associated with the tablet-like components 232 and 234.
  • the memory 320 may include volatile and nonvolatile memory and/or removable and nonremovable media implemented in any type of technology for storage of information, such as computer-readable instructions, data structures, program modules, or other data.
  • Such memory 320 can include, but is not limited to, RAM, ROM, EEPROM, flash memory or other memory technology, optical storage, solid state storage, magnetic tape, magnetic disk storage, RAID storage systems, storage arrays, network attached storage, storage area networks, cloud storage, or any other medium that can be used to store the desired information and that can be accessed by a computing device.
  • the memory 320 can be a type of computer-readable storage media and/or can be a tangible non-transitory media to the extent that when mentioned, non-transitory computer-readable media exclude media such as energy, carrier signals, electromagnetic waves, and signals per se.
  • the memory 320 can be used to store any number of functional components that are executable and the images to be processed by the CPU(s) 314.
  • these functional components comprise instructions or programs that are executable by the CPU(s) 314 and that, when executed, specifically configure the one or more CPU (s) 314 to perform the actions described herein and associated with one or more vision screening tests.
  • Other functional components stored in the memory 320 may include, among other things, a graphical representation data component, a measurement data component, a threshold data component, a notification component, a sensor data component, a range finder data component, a microphone data component, a light source control component, a machine learning component, and/or any other functional component associated with the operation of the system 200.
  • FIG. 4 provides a flow diagram illustrating an example method 400 for vision testing, as described herein.
  • the method 400 is illustrated as collections of blocks in a logical flow graph, which represents a sequence of operations that can be implemented in hardware, software, or a combination thereof.
  • the blocks represent computer-executable instructions stored on one or more computer-readable storage media that, when executed by CPU(s), perform the recited operations.
  • computer-executable instructions include routines, programs, objects, components, data structures, and the like that perform particular functions or implement particular abstract data types.
  • the order in which the operations are described is not intended to be construed as a limitation, and any number of the described blocks can be combined in any order and/or in parallel to implement the method 400.
  • one or more blocks of the method 400 can be omitted entirely.
  • the CPU of a vision screening device 102 may determine whether a field of view observed by a radiation sensor is in a focused state.
  • the focused state indicates that a series of images may be captured for a refraction test.
  • the focused state can be identified by the satisfaction of one or more image state thresholds.
  • the one or more image state thresholds may be associated with a clarity of an image (e.g., a lack of blurred features), a stability of the image, and/or other features associated with the subject of the image (e.g., eyes of a patient receiving a refraction test) observed within the field of view of the radiation sensor.
  • in examples where the CPU of a vision screening device 102 determines that the image satisfies the image state thresholds and is in the focused state, the CPU can cause the radiation sensor to capture a series of images (e.g., in some embodiments the series of images can include 25 frames) having a selection of LED illuminations of the eyes/retinas of the patient.
  • the CPU may cause the radiation sensor to capture the series of images where individual images of the series of images are captured in association with individual LED illuminations of the selection of LED illuminations. The determination of the focused state for the image and the capture of the series of images is discussed in greater detail by FIG. 5.
  • the CPU of a vision screening device 102 may determine the relative position of individual images and/or frames of the series of images.
  • the radiation sensor captures the series of images in rapid succession and minimizes the positional drift between individual images.
  • the positional drift, caused by shifts in radiation sensor location and movement of the patient, relocates the pupils within the region of interest between images captured by the radiation sensor.
  • the CPU causes the radiation sensor to rapidly capture the series of images such that a position of the pupil in a preceding image can be utilized to identify potential locations of pupils in an image of the series of images as the previous location of pupils can identify regions of interest for analysis.
  • the determination of pupil position based on a previous pupil position associated with a previous image is discussed in greater detail by FIG. 6.
  • the CPU of a vision screening device 102 may determine gaze calculations that identify deviations of the pupil centers for both eyes. For instance, the CPU may enhance the contrast for the series of images such that pupil edge detection may be enhanced. Additionally, the CPU may detect the pupil edge and further enable pupil fitting to determine pupil center coordinates and pupil radius for the image. Further, the CPU may operate to determine glint position from the series of images indicated by positional coordinates associated with individual eyes captured by the series of images.
  • the system 200 may remove the glint from the image, determine a gaze deviation based at least on the positional coordinates of the glint, and apply a Gaussian LPF (e.g., a Gaussian filter) to smooth the radiation intensity signal such that further signal processing can be performed.
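  • The sketch below illustrates one possible way to implement the per-eye steps just described (contrast enhancement, pupil edge detection and circle fitting, glint localization, glint removal, and Gaussian low-pass filtering) using common OpenCV operators; the specific operators, parameter values, and function name are illustrative choices and not necessarily the algorithms used by the disclosed device.

```python
import cv2
import numpy as np

def fit_pupil_and_gaze(roi_gray):
    """Sketch of the per-eye steps described above for an 8-bit grayscale
    region of interest: boost contrast, fit a circle to the pupil boundary,
    locate the glint, remove it, and apply a Gaussian low-pass filter."""
    enhanced = cv2.equalizeHist(roi_gray)                       # contrast enhancement
    # HoughCircles performs internal edge detection and fits the pupil circle.
    circles = cv2.HoughCircles(enhanced, cv2.HOUGH_GRADIENT, dp=1.5, minDist=50,
                               param1=150, param2=20, minRadius=10, maxRadius=80)
    if circles is None:
        return None
    cx, cy, radius = circles[0][0]                              # pupil center and radius

    _, glint_val, _, glint_pos = cv2.minMaxLoc(roi_gray)        # glint = brightest pixel
    gaze_dx, gaze_dy = glint_pos[0] - cx, glint_pos[1] - cy     # glint deviation from center

    mask = np.zeros_like(roi_gray)
    cv2.circle(mask, glint_pos, 3, 255, -1)
    deglinted = cv2.inpaint(roi_gray, mask, 3, cv2.INPAINT_TELEA)  # remove the glint
    smoothed = cv2.GaussianBlur(deglinted, (5, 5), 0)           # Gaussian LPF before profiling
    return (cx, cy, radius), (gaze_dx, gaze_dy), smoothed
```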
  • the CPU may determine a refractive error for each eye of the patient.
  • the CPU can determine a spherical error (SE) slope based at least on one or more radiation intensity profiles determined from the series of images captured by the radiation sensor. For example, the CPU may extract a profile box that encompasses a range of detected radiation intensities for an image. Additionally, the CPU can determine the SE slope from a linear function fitted to the radiation intensity profile along a primary axis of the extracted profile box. It should be noted that the determination of SE slope and refractive error is discussed in greater detail by FIG. 7.
  • the CPU may generate a report based at least on the refractive error calculation for the patient. For instance, the CPU can, after determining the SE slope and the refractive error for individual images of the series of images, generate a report that lists the refractive error of Sphere (Ds), Cylinder (De), and Axis (Ax) of the Cylinder as well as gaze deviation, pupil size, pupil distance, and other pupil information for both eyes. Additionally, the CPU may include a referral for further treatment, for additional vision tests, a prescription for the individual eyes, and/or additional actions to be taken by the user 204 in the generated report.
  • the CPU can present the generated report to the user 204 and/or the patient 202 via the user interface 218, cause a communication interface to transmit the report to an external database 118, and/or cause a communication interface to transmit the report to an external vision screening system 120 for further analysis or action.
  • FIG. 5 provides a flow diagram illustrating an example method 500 for vision testing, as described herein.
  • the method 500 is illustrated as collections of blocks in a logical flow graph, which represents a sequence of operations that can be implemented in hardware, software, or a combination thereof.
  • the blocks represent computer-executable instructions stored on one or more computer-readable storage media that, when executed by CPU(s), perform the recited operations.
  • computer-executable instructions include routines, programs, objects, components, data structures, and the like that perform particular functions or implement particular abstract data types.
  • the order in which the operations are described is not intended to be construed as a limitation, and any number of the described blocks can be combined in any order and/or in parallel to implement the method 500. In some embodiments, one or more blocks of the method 500 can be omitted entirely.
  • the CPU can cause a radiation sensor to capture a series of images for a vision test.
  • the CPU can determine at least a first set of radiation point sources, associated with the eccentric radiation source 206, to be activated for a first image.
  • the CPU can determine a set of radiation point sources to be activated for each image of the series of images.
  • the CPU can determine multiple sets of radiation point sources to be activated for one or more images of the series of images.
  • the eccentric radiation source 206 can be configured such that a plurality of radiation point sources is positioned at meridians indicating rotational positioning and eccentricities indicating the radial positioning of the radiation point sources.
  • the eccentric radiation source 206 can include one or more meridians and one or more eccentricities where the radiation point sources may be positioned.
  • the individual sets of radiation point sources for the capture of the one or more images for the series of images may be comprised of radiation point sources selected from any combination of meridians and/or eccentricities. However, in some examples, it may be common for the sets of radiation point sources to be comprised of the radiation point sources from a single eccentricity or a single meridian.
  • the CPU may detect one or more pupils on an image and/or from image data associated with a face of a patient 202.
  • CPU can execute facial recognition algorithms for the system 200, the user device 216, and/or the vision screening system 120 (FIG. 1). Additionally or alternatively, a user 204 of the system 200 can indicate that the image includes a face.
  • the CPU can identify the pupils as a first region of interest and a second region of interest based at least in part on the reflected radiation collected by the radiation sensor 212. Further, the CPU may detect and define the boundaries of the first region of interest and the second region of interest such that the pupils of the patient 202 are substantially or wholly encompassed by the regions of interest.
  • the radiation sensor may capture an image and the CPU may transmit the image, via a network, to vision screening system 120. Additionally, the vision screening system 120 can identify the pupils as the first region of interest and the second region of interest based at least on the reflected radiation collected by the radiation sensor 212. Further, the vision screening system 120 may detect and define the boundaries of the first region of interest and the second region of interest within the image such that the pupils of the patient 202 are substantially or wholly encompassed by the regions of interest.
  • the eccentric radiation source 206 may contribute to the inability of standard methods of pupil detection to identify the pupils within the first region of interest and the second region of interest of the series of images.
  • the eccentric radiation source 206 may prevent a fully circular pupil from being detected by standard methods and may cause additional anomalies in pupil detection.
  • standard methods attempting to match the eccentrically illuminated pupil to calibration curves may fail to detect one or more pupils for the patient 202 due to the anomalous reflected light associated with the eccentric radiation source 206.
  • the CPU may operate a neural network algorithm trained to detect pupils illuminated by the eccentric radiation source 206.
  • the neural network can be trained by utilizing a manually graded test data set and/or data sets comprised of previously analyzed images associated with feedback indications that provide data of successful identification of pupils and failed identification of pupils by the neural network. Additionally or alternatively, the neural network may be trained to identify eccentrically illuminated pupils independent of the system 200 and then deployed to perform pupil identification in association with the system 200. Further, the neural network may store images that have been analyzed by the neural network for manual grading and future training data sets in the database 118, in the patient screening components 126, or in association with memory 320.
  • the CPU may determine the pupil position within the region of interest, the pupil diameters, and the inner pupil distance.
  • the CPU can analyze the first region of interest to determine a first horizontal position and a first vertical position of a first pupil of the patient 202.
  • the CPU can analyze the second region of interest to determine a second horizontal position and a second vertical position of a second pupil of the patient 202.
  • the first horizontal position, the second horizontal position, the first vertical position, and the second vertical position may identify positions for the first pupil and the second pupil in a two-dimension grid, a three-dimension mesh, or other coordinate system that allows the position of the pupils to be tracked.
  • the system 200 can determine a first pupil diameter, a second pupil diameter, and the inner pupil distance for the patient 202. Further, the CPU can determine the first pupil diameter, the second pupil diameter, and the inner pupil distance based at least on the coordinate system, an estimated distance between the radiation sensor 212 and the patient 202 (e.g., the system 200 is to be placed approximately 1 meter or approximately 3 feet from the patient 202), and/or other identification of an approximate size of the pupil features.
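  • As a simple illustration of converting on-sensor measurements to physical units, a pinhole-camera sketch is shown below; the focal length value and the assumption of a fixed working distance of approximately 1 meter are illustrative and not taken from this disclosure.

```python
def pixels_to_mm(length_px: float, distance_mm: float = 1000.0,
                 focal_length_px: float = 1400.0) -> float:
    """Convert an on-sensor length (e.g., pupil diameter or inner pupil
    distance in pixels) to millimetres using a pinhole camera model.
    The focal length in pixels is an illustrative assumption."""
    return length_px * distance_mm / focal_length_px

# e.g., a pupil spanning 7 px at ~1 m with the assumed optics is about 5 mm wide
print(round(pixels_to_mm(7.0), 1))
```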
  • the CPU may determine a glint position for the first pupil and/or the second pupil.
  • the glint position may be determined based at least on an intensity of reflected light that exceeds an intensity threshold and/or is a position associated with the highest intensity of reflected light within an image.
  • the CPU may determine intensity values for individual positions and/or pixels of the image and/or the image data captured by the radiation sensor. Additionally, the CPU may compare intensity values associated with the image and determine an absolute maximum intensity value and/or one or more local maximum intensity values within the first region of interest and/or the second region of interest.
  • the CPU may identify the position and/or pixels associated with the absolute maximum intensity value for the first region of interest and/or the second region of interest as the glint position for the first pupil and/or the second pupil. Alternatively, the CPU may identify a region within the first region of interest and/or the second region of interest associated with an intensity of reflected light that exceeds the intensity threshold and determine that the region is associated with the glint position.
  • at block 510, the CPU may determine a glint intensity and an image contrast for the first pupil and/or the second pupil. For example, the glint intensity can indicate a luminous intensity value for the captured radiation at the glint position determined in block 508.
  • the glint intensity can indicate data values for radiant energy, radiant energy density, radiant exposure, and/or other values indicating an amount of radiation captured by the radiation sensor for the glint position.
  • the image contrast can indicate a differential between the intensity values of a portion of the image covering the whole region of interest surrounding the pupil and the intensity values associated with the surrounding positions and/or pixels. Accordingly, the CPU can identify how intense the radiation captured by the radiation sensor 212 is for the position of the portion for contrast and utilize that information to identify a differential between the radiation collected at the position of the portion for contrast and radiation collected at positions surrounding the portion for contrast.
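  • A minimal sketch of computing the glint intensity and the contrast differential described above is shown below; the width of the surrounding band and the function name are illustrative choices, assuming a grayscale NumPy image.

```python
import numpy as np

def glint_and_contrast(image, roi):
    """Sketch: glint intensity = peak value inside the region of interest;
    image contrast = difference between the ROI's mean intensity and the mean
    of a band of pixels surrounding it. roi = (x, y, w, h)."""
    x, y, w, h = roi
    patch = image[y:y + h, x:x + w].astype(np.float32)
    glint_intensity = float(patch.max())

    pad = 10  # width of the surrounding band, an illustrative choice
    y0, y1 = max(0, y - pad), min(image.shape[0], y + h + pad)
    x0, x1 = max(0, x - pad), min(image.shape[1], x + w + pad)
    surround = image[y0:y1, x0:x1].astype(np.float32)
    contrast = float(patch.mean() - (surround.sum() - patch.sum()) /
                     max(1, surround.size - patch.size))
    return glint_intensity, contrast
```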
  • the CPU may utilize the glint position, the glint intensity, and the image contrast for the first pupil and/or the second pupil to identify focusing logic adjustments for utilization in determining whether an image observed by the radiation sensor 212 is in a focused state based on one or more image state thresholds.
  • the CPU may utilize, individually or in combination, the glint intensity and the image contrast to determine a focusing quality for the image.
  • a high glint intensity and/or a high image contrast may indicate that the image has a high focusing quality.
  • a low glint intensity and/or a low image contrast may indicate that the image has a low focusing quality.
  • the CPU may determine whether the glint intensity and/or the image contrast exceed a glint intensity threshold and/or an image contrast threshold to identify whether the image is associated with a focusing quality sufficient for further analysis. Alternatively, the CPU may determine whether the glint intensity and/or the image contrast satisfies an image contrast threshold that indicates, when satisfied, that the focusing logic of the radiation sensor should be adjusted and an additional image captured.
  • the example method can cause the system 200 to determine whether an image observed by the radiation sensor 212 is in a focused state and whether a plurality of images can be captured.
  • a first image can be captured by the radiation sensor 212 while the patient 202 is illuminated by the radiations source 206 as described by block 502.
  • the system 200 may determine whether one or more image state thresholds are satisfied by the image.
  • the one or more image state thresholds may include: 1) determine whether a pupil diameter (e.g., the first pupil diameter and/or the second pupil diameter) is greater than 3mm and less than 10mm; 2) determine whether an inner pupil distance is greater than 30mm and less than 100mm; 3) determine that a glint intensity is greater than 220; and 4) determine that image contrast is greater than 20. Further, the system can determine a set of image state thresholds to be satisfied for the image to be in a focused state and approved by the system 200 (e.g., the system can determine that all four image state thresholds are to be satisfied for the image to be in a focused state).
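  • The four example thresholds listed above can be expressed compactly as a focused-state check, as in the following illustrative sketch (the function name is an assumption for illustration).

```python
def in_focused_state(pupil_diameter_mm, inner_pupil_distance_mm,
                     glint_intensity, image_contrast):
    """Apply the four example image state thresholds listed above; in this
    sketch all four must be satisfied for the image to be treated as focused."""
    return (3.0 < pupil_diameter_mm < 10.0 and
            30.0 < inner_pupil_distance_mm < 100.0 and
            glint_intensity > 220 and
            image_contrast > 20)
```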
  • the system 200 can capture the series of images. It should be noted that in some examples, the system 200 can iteratively determine whether each image of the series of images satisfies the set of image state thresholds to maintain the focused state for the series of images. Additionally or alternatively, in some additional examples, the system 200 may determine whether a subset of the series of images passes the set of image state thresholds. Further, if a minimum number of consecutive images (e.g., the subset of the series of images) satisfy the set of image state thresholds and can be determined to be in a focused state, the system 200 can capture the remaining images for the series of images without further scrutiny.
  • the example method can include a determination that the image captured by system 200 does not satisfy one or more of the image state thresholds.
  • the system 200 can execute corrective procedures to modify the distance between the system 200 and the patient 202 such that updated image variables (e.g., update pupil diameter, pupil distance, glint intensity, and contrast measurements based on a modified distance between the system 200 and the patient 202) are generated that satisfy the set of image state thresholds.
  • the system 200 can indicate that the patient 202 is too far from the system.
  • the system 200 can indicate that the patient 202 is too close to the system. Further, if the pupil diameter and/or the pupil distance consecutively satisfy the related image state thresholds a minimum number of times, the system 200 can indicate that the patient 202 is in a good position and/or capture the series of images.
  • the system can determine whether the image or the series of images is in a focused state. In examples where the system 200 determines that the image(s) are in a focused state, the plurality of images can be captured at block 514. In examples where the system 200 determines that the images(s) are not in a focused state, the image(s) can be discarded and/or disregarded and the system can return to block 504.
  • FIG. 6 provides a flow diagram illustrating an example method 600 for vision testing, as described herein.
  • the method 600 is illustrated as collections of blocks in a logical flow graph, which represents a sequence of operations that can be implemented in hardware, software, or a combination thereof.
  • the blocks represent computer-executable instructions stored on one or more computer-readable storage media that, when executed by CPU(s), perform the recited operations.
  • computer-executable instructions include routines, programs, objects, components, data structures, and the like that perform particular functions or implement particular abstract data types.
  • the order in which the operations are described is not intended to be construed as a limitation, and any number of the described blocks can be combined in any order and/or in parallel to implement the method 600. In some embodiments, one or more blocks of the method 600 can be omitted entirely.
  • the CPU can detect, for a first image of a series of images received from block 514, one or more pupils within the first image of the patient 202. Additionally, the CPU may determine a first region of interest associated with a first pupil and/or a second region of interest associated with a second pupil. Further, the CPU may determine first pupil center position for the first image of the series of images.
  • the CPU may detect, based at least on the pupil center position for the first image, one or more additional regions of interest for an additional image.
  • the one or more additional regions of interest may be determined based at least on the pupil center position of the first image based at least on a high rate of image capture (e.g., 25 frames can be captured in less than 2.5 seconds) limiting the pupil center drift between frames.
  • in regions of interest for a successive image in the series of images (e.g., the additional image), the difficulty in identifying eccentrically illuminated pupils may be compensated for by the low pupil center drift between frames.
  • the CPU may be configured to generate a low difficulty image for pupil detection via activation of high success radiation point source sets of the eccentric radiation sources 206 for the first image. Additionally, once the one or more pupils are identified for the first image and the pupil center position determined, the CPU can utilize the low pupil center position drift to restrict the one or more additional regions of interest for the additional image and reduce the threshold for pupil detection by the neural network and/or the system 200.
  • the example method may cause the CPU to identify, based at least on the one or more additional regions of interest, an additional pupil center position for the additional image.
  • limited pupil center drift and the additional regions of interest identified based at least on the pupil center location identified for the preceding image enable the identification of the additional pupil center under low eccentric illumination scenarios, asymmetric illumination scenarios, and other scenarios that may be difficult for standard methods to analyze.
  • the additional pupil center position can be utilized to determine one or more further regions of interest for a further image that succeeds the additional image. Accordingly, and at block 608, the utilization of a preceding pupil center position for a preceding image can enable the identification of a successive pupil center position for a successive image by the neural network.
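  • A minimal sketch of deriving a restricted region of interest for the next frame from the preceding pupil center is shown below; the margin value and function name are illustrative assumptions.

```python
def roi_from_previous_center(prev_center, pupil_radius_px, margin_px=20):
    """Build a restricted region of interest for the next frame around the
    pupil center found in the preceding frame; the drift between frames is
    small because the series is captured in rapid succession."""
    cx, cy = prev_center
    half = int(pupil_radius_px + margin_px)
    return (int(cx) - half, int(cy) - half, 2 * half, 2 * half)  # (x, y, w, h)
```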
  • FIG. 7 provides a flow diagram illustrating an example method 700 for vision testing, as described herein.
  • the method 700 is illustrated as collections of blocks in a logical flow graph, which represents a sequence of operations that can be implemented in hardware, software, or a combination thereof.
  • the blocks represent computer-executable instructions stored on one or more computer-readable storage media that, when executed by CPU(s), perform the recited operations.
  • computer-executable instructions include routines, programs, objects, components, data structures, and the like that perform particular functions or implement particular abstract data types.
  • the order in which the operations are described is not intended to be construed as a limitation, and any number of the described blocks can be combined in any order and/or in parallel to implement the method 700.
  • the CPU may generate normalized images from individual images of a series of images. For example, the CPU may utilize the individual images to generate 0 degree rotation corrected, 60 degree rotation corrected, and 120 degree rotation corrected images to normalize the series of images against rotational effects caused by radiation collected from radiation point sources on a 0 degree meridian, a 60 degree meridian and a 120 degree meridian of the eccentric radiation source 206.
  • the CPU may determine a primary axis for individual images captured by the radiation sensor, wherein the primary axis may be indicated by the largest range of reflected light intensity detected by the radiation sensor 212.
  • the CPU may determine the primary axis for the individual images captured by the radiation sensor, wherein the primary axis may be determined based on the activation of individual radiation point sources of the eccentric radiation source 206 such that the primary axis is aligned with a meridian of the eccentric radiation source 206. Accordingly, while the eccentric radiation source 206 has been described as having a 0 degree meridian, a 60 degree meridian, and a 120 degree meridian, the CPU may operate to normalize the axis of any meridians based at least on the primary axis associated with the individual images.
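  • The sketch below illustrates one way to rotate an image about the pupil center so that the active meridian lies along a common primary axis; the rotation convention (sign of the angle) may need to be adjusted to match the coordinate system actually used, and the function name is an assumption.

```python
import cv2

def normalize_rotation(image, pupil_center, meridian_deg):
    """Rotate the image about the pupil center so the meridian of the active
    radiation point sources (e.g., 0, 60, or 120 degrees) lies along a common
    horizontal primary axis."""
    h, w = image.shape[:2]
    rot = cv2.getRotationMatrix2D(pupil_center, meridian_deg, 1.0)
    return cv2.warpAffine(image, rot, (w, h), flags=cv2.INTER_LINEAR)
```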
  • the CPU may extract a radiation intensity profile from each individual image of the series of normalized images based at least on a pupil diameter associated with the individual image.
  • the CPU can extract a profile of the radiation intensity detected by the radiation sensor 212 for an image that captures the range of radiation intensity reflected by the one or more retinas, via the one or more pupils.
  • the CPU can determine a primary axis associated with the image, wherein the primary axis may be utilized to identify the radiation intensity profile to be extracted from the image.
  • the radiation intensity profile may be a series of radiation intensity values obtained from a normalized image along the primary axis at the pupil center. The radiation intensity values may be point values identified along the primary axis or an average of radiation intensity values at positions along the primary axis.
  • the CPU may identify a profile box aligned to and centered on the primary axis. Additionally or alternatively, the CPU may determine that the profile box includes a radiation intensity maximum and a radiation intensity minimum along the primary axis.
  • the CPU may determine that the profile box is comprised of a first dimension and a second dimension, wherein the first dimension may indicate a first number of radiation point values, or pixels associated with radiation point values, that are substantially parallel to the primary axis, while the second dimension may indicate a second number of radiation point values or pixels that are substantially perpendicular to the primary axis (e.g., the first dimension indicates that the profile box is comprised of 18 pixels along the primary axis and the second dimension indicates that the profile box is comprised of 11 pixels across the primary axis).
  • the radiation intensity values for the radiation intensity profile may be determined for each position along the first dimension by averaging the radiation point values along the second dimension (e.g., for the first position along the primary axis, the CPU may average the 11 radiation point values in the second dimension at the first position to determine a first radiation point value for the radiation profile).
  • the profile of the radiation intensity can be extracted based at least on the pupil diameter determined for the individual image.
  • the profile extracted from the individual image can be an 18 x 11 pixel box, depicting a range of radiation intensity for the individual image, where the pupil diameter is less than 4mm.
  • the profile extracted from the individual image can be a 25 x 11 pixel box where the pupil diameter exceeds 4mm.
  • the radiation intensity profile for the primary axis of the normalized image may be generated from the radiation point values of the profile box.
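  • A minimal sketch of the profile-box extraction described above is shown below, assuming the image has already been rotation-normalized so the primary axis is horizontal; boundary handling is omitted for brevity and the function name is an assumption.

```python
import numpy as np

def extract_intensity_profile(normalized_image, pupil_center, pupil_diameter_mm):
    """Extract the radiation intensity profile along the (horizontal) primary
    axis: an 18x11 pixel box for pupils under 4 mm, 25x11 otherwise, averaged
    across the 11-pixel dimension to give one value per position on the axis."""
    length = 18 if pupil_diameter_mm < 4.0 else 25
    cx, cy = int(round(pupil_center[0])), int(round(pupil_center[1]))
    half_len, half_wid = length // 2, 11 // 2
    box = normalized_image[cy - half_wid:cy + half_wid + 1,
                           cx - half_len:cx - half_len + length].astype(np.float32)
    return box.mean(axis=0)   # 1 x 18 (or 1 x 25) intensity profile
```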
  • the example method can determine slopes for each radiation source eccentricity.
  • the CPU can process the profile box extracted from each individual image of the series of images such that a SE slope is determined for each eccentricity utilized by the radiation source 206 to produce the series of images.
  • the CPU may process the radiation intensity profile box and determine a linear function fitted to the data results to produce a slope associated with the eccentricity of the radiation point sources active when the image was captured.
  • the radiation intensity within the profile box may be averaged to generate, for example, a 1x18 or 1x24 pixel curve representing the radiation intensity profile for the normalized image.
  • each image of the series of images may be associated with a radiation point source that is further associated with a meridian and an eccentricity.
  • the linearized slope of the radiation intensity profile may be correlated with a refractive error based at least in part on the meridian (e.g., the 0 degree meridian, the 60 degree meridian, and/or the 120 degree meridian) and the eccentricity (e.g., the first eccentricity, the second eccentricity, the third eccentricity, and/or the fourth eccentricity at different radial distances from the radiation sensor) associated with the radiation point source that is active when each image is captured. Accordingly, for refractive errors along a meridian, different central profile slopes may correspond to different eccentricities.
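  • A simple illustration of obtaining the SE slope from an extracted intensity profile by fitting a straight line is shown below; the function name is an assumption for illustration.

```python
import numpy as np

def se_slope(intensity_profile):
    """Fit a straight line to the intensity profile along the primary axis and
    return its slope, used as the SE slope for the eccentricity and meridian
    that were active when the image was captured."""
    x = np.arange(len(intensity_profile), dtype=np.float32)
    slope, _intercept = np.polyfit(x, intensity_profile, 1)
    return float(slope)

# Slopes are collected per meridian (e.g., 0, 60, 120 degrees) and per
# eccentricity, then correlated with refractive error as described below.
```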
  • the CPU may call neural networks associated with the meridians of the radiation source 206 to generate refractive errors from the profile slopes.
  • the CPU may utilize the neural networks to correlate the linear function and/or a linearized slope (e.g., an SE slope) of the radiation intensity profile with a refractive error for the primary axis of the normalized image, the meridian of the image, and/or the set of radiation point sources that are active when image is captured.
  • the CPU may generate a Sphere (Ds), Cylinder (De), and Axis (Ax) of the Cylinder from the SE slope values at the three meridians.
  • the CPU may generate the three parameters (e.g., Ds, De, and Ax) for the right eye and the left eye associated with the patient 202.
  • the refractive error for a meridian can be determined by the neural network based at least on correlations between different SE slopes and the active radiation point sources during image capture.
  • the neural networks may be configured to correlate the SE slope of an image with the refractive error for the patient 202 by utilizing variations in radiation intensity caused by radiation point sources at different eccentricities along one meridian.
  • the neural networks may be trained by evaluated SE slopes and known refractive errors at the meridians from previously diagnosed images of patient eyes.
  • the pupil size of an eye may be utilized as an input to further correlate SE slopes with refractive errors for the neural networks.
  • the CPU may call one or more neural networks with the same structure to determine the refractive errors at the meridians (e.g., the 0 degree meridian, the 60 degree meridian, and/or the 120 degree meridian) of the radiation source 206.
  • the CPU may call one unified neural network with a more complicated structure (e.g., more neurons and more weights) to determine the overall eye refractive error, Ds, De, and Ax.
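  • Purely as an illustration of such a unified network, the sketch below maps SE slopes at the three meridians (across several eccentricities) plus pupil size to Ds, De, and Ax; the layer sizes, input dimensionality, and framework are assumptions and are not part of this disclosure.

```python
import torch
import torch.nn as nn

class RefractiveErrorNet(nn.Module):
    """Illustrative unified network: SE slopes at three meridians across four
    eccentricities (12 values) plus pupil size -> Ds, De, Ax. All layer sizes
    are assumptions, not taken from the disclosure."""
    def __init__(self, n_slopes: int = 12):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(n_slopes + 1, 64), nn.ReLU(),
            nn.Linear(64, 32), nn.ReLU(),
            nn.Linear(32, 3),            # Ds, De, Ax
        )

    def forward(self, slopes: torch.Tensor, pupil_size: torch.Tensor) -> torch.Tensor:
        # slopes: (batch, 12), pupil_size: (batch, 1)
        return self.net(torch.cat([slopes, pupil_size], dim=-1))
```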
  • the radiation intensity profiles may be utilized by the neural network(s) to determine Ds, De, and Ax utilizing calculations discussed by U.S. Pat. App. Pub. No. 2017/0027440 Al, the entire disclosure of which, except for any definitions, disclaimers, disavowals, and inconsistencies, is incorporated herein by reference.
  • the described systems can utilize an eccentric radiation source to simplify the vision screening system such that excess complexity and operational difficulties can be eliminated from the system.
  • the described systems can be configured to utilize novel logic and algorithms for focusing collected radiation for image capture, capturing a series of images, and calculation of refractive error for a patient.
  • the system can synchronize activation of a radiation source and an image capture rate of a radiation sensor such that a series of images is captured within a timeframe that can minimize positional drift and reduce calculation loads for the processor.
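Purely as a sketch under assumed hardware interfaces (the `set_source` and `grab_frame` callables below are hypothetical stand-ins, not part of the disclosed device), per-frame synchronization of point-source activation with the sensor's capture rate could look like this:

```python
import time

FRAME_PERIOD_S = 1.0 / 30.0   # assumed sensor capture rate of 30 frames per second

def capture_synchronized_series(point_sources, set_source, grab_frame):
    """Drive one point source per frame so each captured image corresponds to a
    single known meridian/eccentricity.

    set_source(source, on) and grab_frame() are hypothetical callables wrapping
    the LED driver and radiation sensor of whatever hardware is actually used.
    """
    series = []
    for source in point_sources:
        set_source(source, on=True)          # activate exactly one point source
        time.sleep(FRAME_PERIOD_S * 0.25)    # brief settle time before exposure
        frame = grab_frame()                 # capture while the source is lit
        set_source(source, on=False)
        series.append((source, frame))
    return series
```

Keeping the whole series within a few frame periods is what limits positional drift between images, since the patient has little time to move between exposures.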
  • the system can be configured to focus the radiation captured by the radiation sensor, normalize the images produced by the radiation sensor, and identify key parameters that enable the utilization of neural networks to determine refractive errors for the eyes of the patient and generate recommendations for future actions.
  • the vision screening device, system and methods may be used to perform different types of vision screening tests and analysis.
  • the vision screening tests may include ocular alignment testing, ocular reflexes testing, estimation of pupil size, pupil distance, and eye gaze deviation, identification of vision abnormalities such as myopia, hyperopia, astigmatism, anisometropia, strabismus, anisocoria and the like.
  • FIG. 8 is a pictorial flow diagram illustrating an example workflow 800 for administering one or more of the vision screening tests above.
  • the vision screening tests may be administered using the vision screening device 230, for example.
  • the vision screening device 230 may include a first display 252 facing a patient, and a second display 258 facing a user of the vision screening device 230.
  • An illustration of content displayed on the first display 252 at each step of the workflow 800 is shown in column 802, while the content displayed on the second display 258 for the corresponding step of the workflow 800 is shown in column 804.
  • the steps of the workflow 800 are illustrated as collections of blocks in a logical flow graph, which represents a sequence of operations that can be implemented in hardware, software, or a combination thereof.
  • the blocks represent computer-executable instructions stored on one or more computer-readable storage media that, when executed by processor(s) (e.g., the CPU 314 or the processor(s) of the vision screening device 230), perform the recited steps.
  • the workflow 800 may be implemented as an app running on the processor 314, and the app may provide step-by-step guidance to the user in administering the vision screening test(s).
  • the app may provide informational material to the patient educating the patient on the vision screening test being administered.
  • At step 806, the processor(s) begins a vision screening session to be administered to a patient.
  • step 806 may be performed in response to an input from a user of the vision screening device 230 selecting a vision screening test to be administered e.g., by using a touch interface of the second component 234.
  • vision screening tests supported by the vision screening device 230 may include photorefraction screening tests, visual acuity screening tests, dynamic eye tracking tests, accommodation tests, color vision screening tests, and the like.
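For illustration only, such a set of supported tests might be represented in software as a simple enumeration used to key the workflow's stimuli and measurement routines (the names are illustrative, not taken from the application):

```python
from enum import Enum, auto

class ScreeningTest(Enum):
    """Illustrative identifiers for the screening tests a device might support."""
    PHOTOREFRACTION = auto()
    VISUAL_ACUITY = auto()
    DYNAMIC_EYE_TRACKING = auto()
    ACCOMMODATION = auto()
    COLOR_VISION = auto()

# The user's touch-interface selection could then be mapped to one of these values.
selected_test = ScreeningTest.PHOTOREFRACTION
```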
  • the processor(s) may generate content 808, 810 and cause the first display 252 to present welcome screen 808 to the patient, and cause the second display 258 to present content 810 to the user, which may include one or more of the visual elements described with reference to FIG. 2B.
  • the content 810 may comprise a user interface including the patient’s information 262, the icon panel 272 and the like.
  • At step 812, the vision screening device 230 may acquire image(s) of the eye(s) of the patient.
  • the processor(s) may cause the NIR radiation source 280 or the visible light source 288 to emit radiation illuminating the patient during a period of time.
  • the vision screening device 230 may present an attention-attracting video 814 on the first display 252 to attract attention of the patient and hold the patient’s gaze towards the vision screening device 230.
  • the video 814 may include dynamic light patterns and/or sounds such as bird sounds.
  • the processor(s) may cause the NIR camera 282 or the visible light camera 284 to capture image(s) and/or video 816 of the patient’s eye(s).
  • the processor(s) may also determine a position of pupils and/or lenses of the eye(s) based on the captured image(s) and/or video 816, as described with reference to FIGS. 4-6.
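As a generic, hedged sketch of pupil localization (the application does not disclose this particular approach, and eccentric NIR illumination may produce a different pupil appearance), a simple threshold-and-contour pass with OpenCV can yield candidate pupil centers from an 8-bit grayscale frame:

```python
import cv2
import numpy as np

def find_pupil_centers(nir_frame: np.ndarray, max_pupils: int = 2):
    """Return up to max_pupils (x, y, radius) candidates from an 8-bit grayscale frame.

    Assumes the pupils appear as bright, roughly circular regions; the actual
    device's detection method may differ entirely.
    """
    blurred = cv2.GaussianBlur(nir_frame, (9, 9), 0)
    _, mask = cv2.threshold(blurred, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)

    candidates = []
    # Keep the largest blobs and approximate each with its enclosing circle.
    for contour in sorted(contours, key=cv2.contourArea, reverse=True)[:max_pupils]:
        (x, y), radius = cv2.minEnclosingCircle(contour)
        candidates.append((int(x), int(y), int(radius)))
    return candidates
```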
  • the vision screening device 230 may present the captured image(s) and/or video 816 of the eye(s) to the user via the second display 258.
  • the processor(s) may determine a location of the patient relative to the vision screening device 230 e.g., based on an output of the range sensor 286 or the determined location of the pupils and/or lenses of the eye(s), and provide an indication to the user regarding the correctness of location of the patient.
  • each vision screening test may have a recommended distance between the vision screening device 230 and the patient and/or a recommended position of the patient relative to the vision screening device 230.
  • the processor(s) may provide instructions to the user, via the second display 258, indicating that the patient and/or the vision screening device 230 needs to be moved, along with adjustments required to bring the patient into a correct position for administering the vision screening test.
  • the user may be instructed to adjust the position of the vision screening device 230 and/or the patient and the processor(s) may re-run the step 812.
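A minimal sketch of such a position check, assuming a single range reading in meters and the roughly one-meter working distance mentioned elsewhere in this disclosure (the tolerance value is an assumption), is shown below:

```python
def check_patient_distance(measured_m: float,
                           recommended_m: float = 1.0,
                           tolerance_m: float = 0.05) -> str:
    """Return a user-facing instruction based on a range-sensor reading.

    The 1 m recommendation and 5 cm tolerance are illustrative values; each
    screening test may define its own recommended distance and band.
    """
    if abs(measured_m - recommended_m) <= tolerance_m:
        return "Distance OK - begin the test."
    if measured_m > recommended_m:
        return f"Move the device {measured_m - recommended_m:.2f} m closer to the patient."
    return f"Move the device {recommended_m - measured_m:.2f} m away from the patient."

print(check_patient_distance(1.32))
```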
  • the vision screening device 230 may use eye tracking techniques on a video feed of the patient e.g., video captured by the visible light camera 284, to produce a focused and centered image of the eye(s) on the second display 258.
  • At step 818, the vision screening device 230 conducts the first vision screening test selected by the user.
  • One or more visual stimuli corresponding to each vision screening test supported by the vision screening device 230 may be stored in memory, e.g., the memory 320.
  • the processor 314 may access the one or more visual stimuli from the memory and present the visual stimuli 820 on the first display 252 to the patient.
  • the visual stimuli may be associated with timing information e.g., indicative of duration of display of each visual stimulus. In some examples, the duration of display of each visual stimulus may be controlled by the user of the vision screening device 230.
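One possible, purely illustrative way to encode stimuli together with their display durations (the `show` and `wait_for_user` callables are hypothetical wrappers around the patient-facing display and the operator interface, not the application's API) is:

```python
import time
from dataclasses import dataclass
from typing import Optional

@dataclass
class VisualStimulus:
    """One stimulus to present on the patient-facing display."""
    asset_name: str               # e.g., key of an image or video stored in memory
    duration_s: Optional[float]   # None means "advance only on operator input"

acuity_sequence = [
    VisualStimulus("attention_video", 5.0),
    VisualStimulus("acuity_line_20_40", None),
    VisualStimulus("acuity_line_20_30", None),
]

def run_sequence(stimuli, show, wait_for_user):
    """Present each stimulus for its fixed duration or until the operator advances."""
    for stimulus in stimuli:
        show(stimulus.asset_name)
        if stimulus.duration_s is None:
            wait_for_user()                # operator controls when to advance
        else:
            time.sleep(stimulus.duration_s)
```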
  • the processor 314 may present information 822 relevant to the vision screening test being administered to the second display 258.
  • the information 822 may be indicative of measurements determined by the vision screening device 230 during the presentation of the visual stimuli to the patient.
  • the vision screening device 230 may detect pupils, retinas, and/or lenses of the eye(s) of the patient, and acquire data comprising images and/or video data of the pupils/retinas/lenses, and the like.
  • the processor(s) may analyze the data to determine one or more measurements associated with the patient, such as an accommodation of a lens of the eyes of the patient, motion information associated with the eyes of the patient, the refractive error of the eye(s) of the patient, gaze angle of the eye(s) of the patient, and the like.
  • the information 822 may also include testing parameters, current status of the test being administered, and/or the patient’s medical history.
  • Step 818 may be interactive, e.g., requiring input from the patient in response to the visual stimuli being presented.
  • the input may be in the form of verbal communication or other indication from the patient to the user of the vision screening device 230 and/or directly to the vision screening device 230 e.g., using a touch interface of the first component 232.
  • the visual stimuli 820 presented to the patient, on the first display 252 may be a randomized set of characters 820.
  • the visual stimuli 820 may also be included in the information 822 presented to the user via the second display 258, so that the user may determine if the patient is able to read the characters correctly.
  • the information 822 may vary.
  • the information 822 may comprise a close-up image and/or video feed of the eyes of the patient along with measurements of pupil size, diameter and the like determined by the processor(s) in response to visual stimuli comprising different levels of brightness.
  • the user may be presented with a close-up video feed of the eyes of the patient along with eye tracking output determined by the processor(s) indicating movement of the eye while a visual stimulus comprising a moving target is being presented to the patient.
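As a sketch of how the brightness-response measurements described above could be tabulated (the measurement callable is assumed to exist elsewhere and is hypothetical here):

```python
def pupillary_response(brightness_levels, measure_pupil_diameter_mm):
    """Collect pupil diameter at each brightness level of the stimulus.

    measure_pupil_diameter_mm is a hypothetical callable that presents the
    stimulus at the given brightness and returns the measured diameter in mm.
    """
    readings = {level: measure_pupil_diameter_mm(level) for level in brightness_levels}
    # Overall constriction range across the tested brightness levels.
    constriction_mm = max(readings.values()) - min(readings.values())
    return readings, constriction_mm
```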
  • results of the first vision screening test are determined.
  • the results 828 of the first screening test may be presented to the user via the second display 258.
  • the results may be based on measurements made by the vision screening device 230 during step 818.
  • the refractive error of each eye may be measured using the methods described above with reference to FIGS. 4-7.
  • the refractive errors may be compared to standardized thresholds and ranges to determine abnormalities in vision such as hypermetropia, anisometropia, amblyopia, strabismus and the like.
  • a type of color vision abnormality may be displayed at the end of a color vision screening test.
  • the user may add the results 828 to the patient’s information, or generate and/or print a report for the patient or for a clinician.
  • the user may re-do the test e.g., if the results returned are invalid, by reverting to the step 812.
  • the processor(s) receive input from a user of the vision screening device 230 indicating an end of the vision screening session, or the selection of another vision screening test to administer. If another vision screening test is selected, the workflow may revert to step 812.
  • the workflow described above illustrates the use of the vision screening device 230 to conveniently administer a series of vision screening tests, using the displays 252, 258 of the vision screening device 230 to provide the relevant content to the patient and to the user of the vision screening device 230.
  • the steps of the vision screening session described in the workflow 800 may be administered using the vision screening device 230.
  • the vision screening device 230 may include a first display 252 facing a patient, enabling display of content 808, 814, 820, 826 to the patient during the vision screening session.
  • the vision screening device 230 may include a second display 258 facing a user, enabling display of content 810, 816, 822, 828 to the user.
  • the imaging assembly 250 of the vision screening device 230 includes illumination and capture components needed for acquiring image(s) of the eye(s) at step 812, and conducting one or more vision screening tests at step 818 e.g., NIR source 280, NIR camera 282, visible light source 288, a visible light camera 284, and the like.
  • FIG. 9 provides a flow diagram illustrating an example method 900 for administering a vision screening test, as described herein.
  • the method 900 is illustrated as collections of blocks in a logical flow graph, which represents a sequence of operations that can be implemented in hardware, software, or a combination thereof.
  • the blocks represent computer-executable instructions stored on one or more computer-readable storage media that, when executed by CPU(s), perform the recited operations.
  • computer-executable instructions include routines, programs, objects, components, data structures, and the like that perform particular functions or implement particular abstract data types.
  • At operation 902, the vision screening device 230 and/or one or more processors associated therewith initiates a vision screening test to be administered to a patient.
  • operation 902 may be performed in response to an input from a user of the vision screening device 230 selecting a vision screening test to be administered e.g., by using a touch interface of the second component 234 to select a vision screening test from options provided in a test selection icon of the icon panel 272.
  • the content 808 may be displayed on a first display, and content 810 may be displayed on a second display to initiate the vision screening test.
  • the vision screening device 230 displays a visual stimulus to the patient.
  • the visual stimulus displayed may be based on the specific vision screening test being administered.
  • the visual stimuli may include light patterns that dynamically get brighter or dimmer and/or move around on the display screen facing the patient, e.g., the first display 252.
  • the visual stimuli may include text or other characters with a gradation in sizes e.g., a Snellen chart.
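Purely as background arithmetic, and not a description of the application's chart, optotype heights for such a gradation are commonly derived from the 5-arcminute standard letter scaled by the logMAR step and the viewing distance:

```python
import math

ARCMIN_PER_OPTOTYPE = 5.0   # a standard optotype subtends 5 arcminutes at the 20/20 line

def letter_height_mm(logmar: float, distance_m: float = 1.0) -> float:
    """Physical letter height for a given logMAR line at a given viewing distance."""
    angle_rad = math.radians(ARCMIN_PER_OPTOTYPE / 60.0) * (10 ** logmar)
    return math.tan(angle_rad) * distance_m * 1000.0

# A geometric progression of sizes for chart lines from 20/100 down to 20/20,
# assuming a 1 m patient-to-display distance.
for logmar in (0.7, 0.5, 0.3, 0.1, 0.0):
    print(f"logMAR {logmar:+.1f}: {letter_height_mm(logmar):.2f} mm")
```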
  • the visual stimuli may include color dot patterns e.g., an Ishihara plate.
  • At operation 906, the vision screening device 230 causes a radiation source to emit radiation to illuminate the eye(s) of the patient.
  • the radiation source may be the NIR radiation source 280 or the visible light source 288, based on the vision screening test selected.
  • the NIR radiation source 280 may generate a pattern of eccentric radiation to illuminate the eye(s) of the patient.
  • At operation 908, the vision screening device 230 causes a camera to capture radiation reflected by the eye(s) of the patient.
  • the type of camera may be based on the type of illumination at operation 906.
  • the NIR camera 282 of the vision screening device 230 may be used.
  • the visible light camera 284 may be used to capture the reflected radiation.
  • the cameras 282, 284 may be configured to record multiple images and/or video data of the patient’s eyes during the vision screening test.
  • the vision screening device 230 and/or one or more processors associated therewith determines, based on the captured radiation, one or more measurements associated with the eye(s).
  • the measurement(s) may be determined by analyzing the images and/or video data obtained by the camera(s) at operation 908.
  • the measurements may include, but are not limited to, measurements associated with refractive error of the eye(s), a gaze of the patient, a location of the pupils, a diameter of the pupils, a diameter of the lens, motion data, etc.
  • the image/video of the eye(s) captured at operation 908 may be used to determine a measurement of gaze angle or gaze direction of the eye(s) of the patient, which may be tracked over time to determine a pattern of gaze angles or gaze directions during the vision screening test.
  • a measurement of refractive error of the eye(s) may be determined based on the reflected NIR radiation image(s) captured by the NIR camera 282.
  • the vision screening device 230 determines, based on the measurement(s), a result of the vision screening test.
  • the vision screening device 230 and/or one or more processors associated therewith may compare the measurement(s) with one or more standard values.
  • the system may be configured to receive, access, and/or determine criteria based on known standard values, indicating values of known standard behavior for patients being screened.
  • the system may then compare the determined measurement(s) with the standard values e.g., thresholds or ranges, to determine if the patient is exhibiting normal vision behavior.
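A minimal sketch of this comparison step, assuming the standard values are available as simple per-measurement ranges (the range values shown are placeholders, not clinical thresholds), follows:

```python
# Placeholder normal ranges keyed by measurement name; real values would come
# from a standards database (e.g., database 118) or learned models.
NORMAL_RANGES = {
    "refractive_error_diopters": (-1.0, 1.0),
    "pupil_diameter_mm": (2.0, 8.0),
}

def screen_measurements(measurements: dict) -> dict:
    """Flag each measurement as within or outside its standard range."""
    results = {}
    for name, value in measurements.items():
        low, high = NORMAL_RANGES.get(name, (float("-inf"), float("inf")))
        results[name] = "pass" if low <= value <= high else "refer"
    return results

print(screen_measurements({"refractive_error_diopters": -2.5, "pupil_diameter_mm": 4.0}))
```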
  • the standard values may be received and/or accessed from a database e.g., database 118 and/or may be determined utilizing historical screening data and/or one or more machine learning techniques.
  • the vision screening device 230 may display the result of the vision screening test via a second display e.g., the second display 258, facing the user of the vision screening device 230.
  • the result may include testing parameters, current status of the test, measurement(s) determined during the test, a diagnosis determined based on the measurement(s), and/or a recommendation associated with the diagnosis. For example, if the measurement(s) satisfy the criteria based on standard values at operation 912, the vision screening device 230 may generate a recommendation indicating that the patient has passed the vision screening test. If the measurements do not satisfy the criteria, the system may generate a recommendation including an indication that the patient has failed the screening, an indication of a diagnosis of a type of vision deficiency exhibited by the patient, and/or a recommendation for additional screening.
  • a vision screening device comprising: a first housing; a second housing rotatably coupled to the first housing; a first display disposed within the first housing, the first display forming a substantially planar first front surface of the vision screening device; a second display disposed within the second housing, the second display forming a substantially planar second front surface of the vision screening device, wherein in a closed configuration of the vision screening device, the first front surface is disposed adjacent and facing the second front surface; an imaging assembly disposed within the first housing, the imaging assembly comprising: a near infra-red (NIR) radiation source, and a NIR camera configured to capture NIR radiation reflected by an eye of a patient; a processor operably connected to the first display, the second display, and the imaging assembly; and memory storing instructions that, when executed by the processor, cause the processor to: cause the first display to present a visual stimulus to the patient during a period of time; cause the NIR radiation source to illuminate the eye of the patient during the period of time;
  • Clause 3 The vision screening device of any of the clauses above, either alone or in combination, wherein the imaging assembly further includes a front cover disposed over the NIR radiation source and the NIR camera, wherein the front cover extends substantially coplanar with the first front surface of the vision screening device.
  • Clause 5 The vision screening device of any of the clauses above, either alone or in combination, wherein the measurement includes at least one of a pupil position, a pupil diameter, and an inner pupil distance, and the instructions further cause the processor to: detect, based on the captured radiation, a pupil of the eye; and determine, based on the detection, the measurement.
  • Clause 7 The vision screening device of any of the clauses above, either alone or in combination, wherein the period of time is a first period of time, and the instructions further cause the processor to: cause the visible light source to illuminate the patient during a second period of time; cause the visible light camera to capture an image of a face of the patient during the second period of time; determine, based on the image, a location of the eye; generate an augmented image illustrating the eye, the augmented image including a visual identifier identifying a pupil of the eye; and cause the second display to display the augmented image of the eye.
  • Clause 8 The vision screening device of any of the clauses above, either alone or in combination, wherein the imaging assembly further comprises a 3D range sensor at least partly disposed within the third housing, and the instructions further cause the processor to: cause the 3D range sensor to generate a 3D point cloud of the patient; and determine, based at least in part on the 3D point cloud, a distance of the patient from the vision screening device.
  • Clause 9 The vision screening device of any of the clauses above, either alone or in combination, further comprising: a first back surface disposed opposite the first front surface and formed by the first housing; and a second back surface disposed opposite the second front surface and formed by the second housing, wherein the first housing and the second housing are moveable to achieve an open position of the vision screening device in which the first back surface is disposed adjacent to and facing the second back surface.
  • Clause 10 The vision screening device of any of the clauses above, either alone or in combination, wherein the instructions further cause the processor to: cause the second display to display one or more vision screening test options; and receive, via a touch interface of the second display, input indicative of a selection of a vision screening test of the one or more vision screening test options.
  • Clause 11 The vision screening device of any of the clauses above, either alone or in combination, wherein the vision screening test options include one or more of a photorefraction screening test, a visual acuity screening test, and a color vision screening test.
  • a method comprising: initiating a vision screening test to be administered using a vision screening device, the vision screening device comprising: a first display disposed within a first housing; a second display disposed within a second housing rotatably coupled to the first housing; an imaging assembly disposed within the first housing, the imaging assembly comprising a radiation source and a camera; causing the first display to present a visual stimulus to a patient during a period of time; causing the radiation source to emit radiation during the period of time; causing the camera to capture a portion of the radiation emitted by the radiation source, wherein the portion of the radiation captured by the camera is reflected by an eye of the patient during the period of time; determining, based on the portion of the radiation, a parameter of the eye; determining, based on the parameter, a result of the vision screening test; and causing the second display to display information indicative of the result.
  • Clause 13 The method of clause 12, wherein the radiation source is a near-infrared (NIR) radiation source, the camera is a NIR camera, and the parameter is indicative of a refractive error of the eye.
  • Clause 14 The method of any of the clauses above, either alone or in combination, wherein the visual stimulus comprises one or more of: an attention-attracting video, a visual acuity screening chart, a color vision screening pattern, a moving light target, and a light display at varying levels of brightness.
  • Clause 15 The method of any of the clauses above, either alone or in combination, further comprising: capturing video frames of the eye of the patient during the period of time; tracking, based on the video frames, a gaze direction of the eye of the patient responsive to the visual stimulus; and determining, based on the tracked gaze direction, the parameter.
  • the vision screening device includes a range sensor, the method further comprising: causing the range sensor to capture a three-dimensional point cloud of an environment in which the patient is located; determining, based on the three-dimensional point cloud, a location of an eye of the patient; determining that the location of the eye does not satisfy a recommended location of the vision screening test; and causing the second display to display an indication that the location of the eye does not satisfy the recommended location of the vision screening test.
  • a vision screening device comprising: a first display; a second display moveable relative to the first display; an imaging assembly fixed relative to, and moveable with, the first display, the imaging assembly comprising: a near infra-red (NIR) radiation source, a NIR camera configured to capture NIR radiation reflected by an eye of a patient, a visible light source, and a visible light camera configured to capture an image or video of the patient; a processor operably connected to the first display, the second display, and the imaging assembly; and memory storing instructions that, when executed by the processor, cause the processor to: cause the first display to display a visual stimulus associated with a vision screening test; cause the NIR radiation source to illuminate the eye of the patient; cause the NIR camera to capture radiation reflected by the eye and responsive to the visual stimulus; determine, based on the captured radiation, a measurement associated with the eye; determine, based on the measurement, a result of the vision screening test; and cause the second display to present information indicative of the result.
  • Clause 18 The vision screening device of clause 17, and/or of any of the other clauses above, either alone or in combination, wherein the instructions further cause the processor to: cause visible light to illuminate the patient during a period of time; cause the visible light camera to capture a video of the eye of the patient; determine, based on the video, a location of a pupil of the eye; and display, on the second display, the video of the eye including an indication of the pupil of the eye.
  • Clause 19 The vision screening device of any of the clauses above, either alone or in combination, wherein the result is indicative of at least one of: passing the vision screening test, failing the vision screening test, or requiring additional vision screening.
  • Clause 20 The vision screening device of any of the clauses above, either alone or in combination, wherein the imaging assembly is proximate a perimeter of the first display and substantially co-planar with the first display.

Abstract

A system includes a system housing, an eccentric radiation source, and a radiation sensor. The radiation produced by the eccentric radiation source can be collected by the radiation sensor to generate images of retinas for a patient. The system also includes a vision screening device connected with the eccentric radiation source and the radiation sensor via the system housing that can control and synchronize actions for the eccentric radiation source and the radiation sensor. The vision screening device further analyzes the images generated by the radiation sensor via neural network algorithms to determine spherical error slopes, refractive errors, and recommendations for the patient.

Description

VISION SCREENING SYSTEMS AND METHODS
CROSS REFERENCE TO RELATED APPLICATIONS
[0001] The present International Application claims priority to, and the benefit of, U.S. Patent Application No. 17/531,320, filed November 19, 2021, which is a continuation-in-part of, and claims priority to, U.S. Patent Application No. 17/344,838, filed June 10, 2021, which is a nonprovisional of, and claims priority to, U.S. Provisional Application No. 63/041,550, filed June 19, 2020, the entire contents of which are incorporated herein by reference.
TECHNICAL FIELD
[0002] This application is directed to medical equipment, and in particular, to devices, systems, and methods associated with determining refractive error, spherical error, and/or other parameters.
BACKGROUND
[0003] Visual screening in individuals typically includes one or more tests to determine various deficiencies associated with the patient’s eyes. Such vision tests may include, for example, refractive error tests, convergence tests, accommodation tests, visual acuity tests, color vision screening tests, and the like. While one or more of the above tests may be related, each test has a respective purpose. For instance, in a refractive error test, the person is typically positioned within a measurement range associated with a vision screening device. Once the person is properly positioned, the screening device can be used to direct light onto the person’s retinas. Sensors on the device may then collect corresponding light that is reflected by the retinas, and the device may determine a refractive error for each eye based on characteristics of the reflected light. [0004] However, while various vision screening devices exist, such devices are typically cumbersome and complicated to use. For example, existing vision screening devices commonly utilize equipment such as range finders, optics for center light sources for image capture, and other specialized components that increase the cost, size, and operational complexity of vision screening systems. In addition, current methods utilize manual alignment, manual determinations, range-finding components, and/or large rooms/facilities while testing patients. Further, additional systems may utilize complicated and specialized equipment to complete patient tests. Additionally, existing vision screening devices utilize calibration curves for refractive error tests that introduce inaccuracies and errors in recommendations provided to medical practitioners and patients.
[0005] The various examples of the present disclosure are directed toward overcoming one or more of the deficiencies noted above.
SUMMARY
[0006] In an example of the present disclosure, a system can include a light source operable to generate an eccentric source of visible or near infrared (NIR) radiation. The eccentric source of radiation can be comprised of a plurality of radiation point sources that are disposed radially surrounding a radiation sensor. Additionally, the plurality of point sources can be predominantly disposed in a two-dimensional plane surrounding the radiation sensor. Further, the system can include an anterior surface that is comprised of the radiation sensor and the eccentric source of radiation and a posterior surface that is comprised of an interactive display. Refractive error, as determined by the system of the present disclosure, may be represented by three parameters: sphere, cylinder and axis. The parameters are a description of the imperfectness of the optics of the eye, mainly due to lens and eyeball shape. The reflection of the eccentric light by the retina may be used to measure the refractive error of the eye. The system can also include a controller operable to cause the interactive display to output an image included in a visual acuity examination based at least in part on generation of the visible beam.
[0007] For instance, an example method of the present disclosure includes a refractive error test that collects a set of images from a patient over a timeframe. Additionally, the method includes monitoring, via a radiation sensor, a retina of a patient and periodically emitting, via a radiation source, near infrared (NIR) radiation for a duration of the timeframe. The duration for emission of the NIR radiation can be determined based at least in part on an image capture rate of the radiation sensor. Accordingly, the radiation sensor can capture a plurality of images, wherein a CPU may select, based at least on the duration that the radiation source emitted NIR radiation, a set of images for the refractive error test from the plurality of images and analyze the set of images to determine the refractive error of the eye. The duration of the NIR radiation emission can be configured such that the set of images is selected based at least on the set of images depicting the retina of the patient being fully illuminated by the NIR radiation during image capture.
[0008] Further, an example device of the present disclosure includes an eccentric radiation source configured to generate one or more beams of near infrared (NIR) radiation, an optics component configured to receive the one or more beams of NIR radiation and to direct the one or more beams of NIR radiation onto a retina of a patient, and a radiation sensor configured to collect reflected NIR radiation from the retina and to provide information indicative of the reflected NIR radiation to a processor. Additionally, the processor can be configured to identify a pupil of the patient based at least in part on the information, cause an image capture device of the system to capture a plurality of images of the pupil, determine a plurality of pupil positions based at least in part on the plurality of images, and determine a refractive error of the patient based at least on the plurality of pupil positions. Further, the eccentric radiation source can be comprised of a plurality of radiation point sources configured in meridians (e.g., lines of radiation point sources extending from the radiation sensor at rotational offsets) and eccentricities (e.g., a group of radiation point sources arranged around the radiation sensor at a radial distance). Accordingly, the processor can capture images once a focused state is achieved and generate a refractive error from the plurality of images.
[0009] Moreover, an example system of the present disclosure includes a processor, operably connected to an eccentric radiation source and a radiation sensor. In particular, the processor can cause the eccentric radiation source to generate radiation that illuminates a pupil and a retina of a patient and cause the radiation sensor to collect reflected radiation from the pupil and the retina of the patient. Additionally, the processor can cause the radiation sensor to capture a series of images based at least on a determination that the reflected radiation satisfies a focus state threshold. Further, the processor can determine one or more parameters associated with a detected pupil that enable the normalization of the series of images. Accordingly, the processor can utilize neural networks to analyze an intensity profile and identify the refractive errors of the patient eyes.
BRIEF DESCRIPTION OF THE DRAWINGS
[0010] Features of the present disclosure, its nature, and various advantages, may be more apparent upon consideration of the following detailed description, taken in conjunction with the accompanying drawings.
[0011] FIG. 1 illustrates an example system of the present disclosure. In some implementations, components of the example system shown in FIG. 1 may be used to perform one or more tests associated with vision screening.
[0012] FIG. 2A illustrates another example system of the present disclosure.
[0013] FIG. 2B illustrates an example vision screening device of the present disclosure. Similar to FIG. 1, in some implementations, components of the example systems and devices shown in FIGS. 2A and 2B may be used to perform one or more tests associated with vision screening.
[0014] FIGS. 2C, 2D, and 2E illustrate further details of components of the vision screening device shown in FIG. 2B.
[0015] FIG. 3 depicts a block diagram of another example system for implementing the techniques described herein.
[0016] FIG. 4 provides a first flow diagram illustrating an example method of the present disclosure.
[0017] FIG. 5 provides a second flow diagram illustrating an example method of the present disclosure.
[0018] FIG. 6 provides a third flow diagram illustrating an example method of the present disclosure.
[0019] FIG. 7 provides a fourth flow diagram illustrating an example method of the present disclosure.
[0020] FIG. 8 provides a pictorial flow diagram of an example workflow for performing one or more vision screening tests using the example systems and devices of the present disclosure.
[0021] FIG. 9 provides a fifth flow diagram illustrating an example method of the present disclosure.
[0022] In the figures, the left-most digit(s) of a reference number identifies the figure in which the reference number first appears. The use of the same reference numbers in different figures indicates similar or identical items or features. The drawings are not to scale.
DETAILED DESCRIPTION
[0023] The present disclosure is directed to, in part, a vision screening device, system and corresponding methods. Such an example vision screening system may be configured to perform one or more vision screening tests on a patient and to output the results of the vision screening test(s) to a user of the device, such as a physician or a physician’s assistant. For example, the vision screening system may generate one or more beams of radiation, via one or more radiation sources, and may be configured to direct such beams at the retinas of the patient. The system may collect corresponding light that is reflected by the retinas and may determine a refractive error for one or both eyes of the patient based at least in part on characteristics of the collected light. Moreover, the system may generate one or more images from the light that is reflected from the retinas of the patient over a period of time, and process the one or more images such that the refractive error is determined for the patient. Additionally, the system may present various visual stimuli to the patient, and perform measurements of ocular response of the patient’s eye(s) in response to the stimuli. As such, in any of the examples described herein, the results of the various vision tests performed using the system may include one or more measurements obtained by the vision screening device included in the system. In addition, the system may generate a recommendation and/or diagnosis associated with the patient for display to the user of the vision screening device. For example, by utilizing standard testing data and/or machine learning techniques, the system may evaluate the measurements determined by the system to provide a recommendation to the user regarding the vision of the patient (e.g., whether the patient passed the test, requires additional screening, etc.). As such, the system described herein may provide automated diagnosis recommendations in order to assist the physician or other user of the vision screening device.
[0024] Additionally, in any of the examples herein, a system may include a vision screening device housing configured to contain the vision screening system. In any of the examples described herein, the various tests performed using a vision screening device may have respective distance requirement or other parameters that must be met in order to ensure accurate testing results. For instance, during a refractive error test performed using the vision screening device, it may be recommended that the patient be spaced from the vision screening device by a distance of approximately three feet or approximately one meter. It should be noted that due to the compact and/or mobile design of the vision screening device (e.g., the vision screening device may be a handheld device, a mobile device similar to a tablet or a smartphone, etc.), the vision screening device may be configured to account for variation in the distance. Accordingly, an example vision screening device of the present disclosure may be configured to utilize a focusing algorithm to account for variations in the distance between the patient and the device, provide instructions to the patient and/or the physician to adjust the distance, or otherwise achieve proper spacing for the vision screening test(s).
[0025] In some examples, the vision screening device housing may be removably connected to a user device via a mount, a user device case, or other apparatus for securing the vision screening device housing and the user device. In other examples, a vision screening device base may include a stem that is vertically, rotationally, and/or otherwise moveably connected to the vision screening device housing. In some examples, the vision screening device may be removably connected to the stem of the vision screening device base, wherein the user device may be removably connected to the base via the vision screening device housing. In other examples, the user device may be removably connected to the stem, and the vision screening device housing may be removably connected to the user device.
[0026] Additional details pertaining to the above-mentioned techniques are described below with reference to FIGS. 1-9. It is to be appreciated that while these figures describe example systems and devices that may utilize the claimed methods, the methods, processes, functions, operations, and/or techniques described herein may apply equally to other devices, systems, and the like.
[0027] FIG. 1 illustrates an example system 100 for vision screening according to some implementations. As illustrated in FIG. 1, a vision screening system may be utilized to perform vision screening test(s) for a patient 102. For instance, a user 104 may utilize a vision screening device 106 and/or other components of the system 100 to administer a vision screening test on a patient 102 to determine the vision health of the patient 102. As described herein, the vision screening device 106 may be configured to perform a refractive error test via a sensor 108 and an eccentric radiation source 110. Additionally, the vision screening device 106 may be configured to emit radiation in the visible band and/or the NIR band via the eccentric radiation source 110 and capture reflected visible and/or NIR radiation 112c via the sensor 108. Further, the eccentric radiation source 110 may emit a plurality of radiation beams, including radiation beams 112a and 112b. The radiation beams in the visible band may be configured to provide visual stimuli to the patient’s eye(s) for one or more of the vision screening tests. It should be understood that, while FIG. 1 depicts the system 100 including a single vision screening device 106, in some additional examples, the system 100 may include any number of local or remote vision screening devices substantially similar to the vision screening device 106, configured to operate independently and/or in combination, and configured to communicate via the network 116. In some further examples, the system 100 may include one or more databases 118 and one or more vision screening systems 120 comprised of one or more processors 122, one or more network interfaces 124, and/or patient screening components 126. The patient screening components 126 may include one or more programs, modules, engines, instructions, algorithms, and/or other patient screening components that are executable by the processor(s) 122.
[0028] As described herein, the vision screening device 106 and/or vision screening system 120 may be configured to perform refractive error testing on the patient 102. For example, refractive error testing may include emitting radiation beams 112a and 112b, such as a visible and/or NIR light, configured to illuminate the eyes of the patient 102. In response, the vision screening device 106 may detect the pupils and/or lenses of the eyes of the patient 102, acquire images and/or video data of the pupils/lenses via the sensor 108, and may transmit the vision screening data, via the network 116, to the vision screening system 120 for analysis. Alternatively, or in addition, the vision screening device 106 may perform the analysis locally. It should be noted that the sensor 108 may include optics components (not shown) that include one or more lenses, windows, prisms, filters, mirrors, and/or any other devices configured to collect and direct the reflected beam 112c of visible and/or NIR radiation generated by the eccentric radiation source 110. In some further examples, the optics components may comprise a collimating lens, a convergent lens, a divergent lens, and/or any other substantially transparent lens or series of lenses configured to assist in directing the reflected beam(s) 112c to impinge the sensor 108.
[0029] In examples, a memory associated with the vision screening device 106 and/or one or more of the patient screening components 126 may be configured to store and/or access data associated with the patient 102. For example, the patient 102 may provide data (referred to herein as “patient data”) upon initiating a vision screening test. For instance, when the vision screening device 106 and/or vision screening system 120 initiates a vision screening test, the patient 102 may provide, or the user 104 may request, patient data including demographic information, physical characteristics, preferences, and similar information regarding the patient 102. For example, the patient 102 may provide demographic information such as name, age, ethnicity, gender, and the like. The patient 102 may also provide physical characteristic information such as height of the patient 102. In such examples, the user 104 may request the patient data while the screening is in progress, or before the screening has begun. In some examples, the user 104 may be provided with predetermined categories associated with the patient 102, such as predetermined age ranges (e.g., six to twelve months, one to five years old, etc.), and may request the patient data in order to select the appropriate category associated with the patient 102. In other examples, the user 104 may provide a free form input associated with the patient data. In still further examples, patient data may be accessed, e.g., from the database(s) 118, and may include previous screening history, patient preferences, demographic information, other medical data, and the like .
[0030] The vision screening device 106 may be configured to generate image and/or video data associated with the patient 102 at the onset of the vision screening test. For example, the vision screening device 106 may include one or more digital cameras, motion sensors, proximity sensors, or other image capture devices configured to collect images and/or video of the patient 102, and one or more processors of the vision screening device 106 may analyze the collected images and/or video to determine, for example, the height of the patient 102, the distance of the patient 102 from the screening device, and/or any of the patient data described above.
[0031] Alternatively, or in addition, the vision screening device 106 may be configured to transmit the images, video, and/or any other collected information to the vision screening system 120, via the network 116, for analysis. In any such examples, the vision screening system 120 may store such information in the patient screening components 126 and/or in an external database 118. For example, the database 118 may comprise memory or computer- readable media substantially similar to and/or the same as the computer-readable media associated with the patient screening components 126. The database 118 may be accessible by the vision screening system 120, and/or by the vision screening device 106, via the network 116. In any such examples, the database 118 may be configured to store patient data in association with a patient ID (e.g., a name, social security number, an alphanumeric code, etc.) or other unique patient identifier. When the patient 102 and/or the user 104 enters the patient ID, the patient screening components 126 may access or receive patient data stored in association with the patient ID.
[0032] The optics components of the vision screening device 106 may include the sensor 108 and the eccentric radiation source 110. For instance, the eccentric radiation source 110 may comprise a plurality of light emitting diodes (LEDs) or other light sources capable of producing visible and/or NIR radiation beams 112a and 112b. For example, the eccentric radiation source 110 may comprise a collimating lens, a convergent lens, a divergent lens, and/or any other substantially transparent lens or series of lenses configured to assist in directing such beams 112a and 112b to illuminate the retinas of the patient 102. Additionally or alternatively, the eccentric light source 110 can utilize undirected light sources and/or dispersion filters associated with the light sources to illuminate the patient 102 and the retinas of the patient 102.
[0033] For example, the sensor 108 can be configured to determine a distance 132 between a patient position 130 and a vision screening device position 128. In such examples, the vision screening device 106 can be configured to maintain the distance 132 (approximately 3 feet or approximately 1 meter) between the patient 102 and the sensor 108. Additionally, the vision screening device 106 may be configured to determine the patient position 130 relative to the vision screening device position 128 and provide the user 104 instructions, via the user interface 114, that cause the user 104 to modify the vision screening device position 128 or the patient 102 to modify the patient position 130 such that the distance 132 is properly maintained. Further, the vision screening device 106 may be configured to provide the user 104 instructions, via the user interface 114, that cause the user 104 to rotate the vision screening device 106 around a vertical axis or that causes the patient 102 to rotate such that the patient 102 is facing an anterior face of the vision screening device 106 comprised of the sensor 108 and the eccentric radiation source 110. Similarly, the vision screening device 106 may be configured to provide the user 104 instructions, via the user interface 114, that cause the user 104 to modify the vision screening device position by increasing an elevation or a height of the vision screening device, at the vision screening device position 128, relative to the patient 102.
[0034] The user interface 114 may be disposed on a posterior face of the vision screening device 106 that substantially faces the user 104 during operation of the vision screening device 106. The user interface 114 may include a graphical user interface configured to display information to the user 104 and/or receive input from the user 104 during a vision test. For example, the user interface 114 may be configured to receive input from the user 104 regarding the patient 102, such as any of the patient information described herein. Further, the user interface 114 may be configured to display information regarding the vision screening device 106 (e.g., a current setting or operating mode of the device, etc.), the distance of the patient 102 from the vision screening device 106, the quality of the environment and/or the focus of the vision screening device 106, the progress of the screening, options for transmitting data from the vision screening device 106 to the vision screening system 120, one or more measurements and/or values generated during the vision screening, etc. The user interface 114 may comprise, for example, a liquid crystal display (LCD) or active matrix organic light emitting display (AMOLED). The user interface 114 may also be touch-sensitive to receive input from the user 104.
[0035] As used herein, the network 116 is typically any type of wireless network or other communication network known in the art. Examples of network 116 include the Internet, an intranet, a wide area network (WAN), a local area network (LAN), and a virtual private network (VPN), cellular network connections and connections made using protocols such as 802.11a, b, g, n and/or ac.
[0036] In some examples, the vision screening device 106 can include a microprocessor or a control unit substantially similar to one or more components of the vision screening system 120 described above. For example, the vision screening device 106 may comprise one or more processors 122 and/or other hardware and/or software components configured to operably control the sensor 108, the eccentric radiation sources 110, the user interface 114, and other components of vision screening device 106. For instance, vision screening device 106 may include a single processing unit (e.g., a single processor) or a number of processing units (e.g., multiple processors), and can include single or multiple computing units and/or multiple processing cores. The processor(s) 122 of the vision screening device 106 can be implemented as one or more microprocessors, microcomputers, microcontrollers, digital signal processors, central processing units, state machines, logic circuitries, and/or any devices that manipulate signals based on operational instructions. For example, the processor(s) 122 of the vision screening device 106 can be one or more hardware processors and/or logic circuits of any suitable type specifically programmed or configured to execute the algorithms, operations, and methods described herein. The processor(s) of the vision screening device 106 can be configured to fetch and execute computer-readable instructions stored in the patient screening components 126, which can program the processor(s) of the vision screening device 106 to perform the functions described herein. Additionally or alternatively, the processor(s) of the vision screening device 106 can be configured to fetch and execute computer-readable instructions stored in computer-readable media and/or other memory of/local to the vision screening device 106.
[0037] As described herein, a processor, such as processor(s) 122, can be a single processing unit or a number of processing units, and can include single or multiple computing units or multiple processing cores. The processor(s) 122 can be implemented as one or more microprocessors, microcomputers, microcontrollers, digital signal processors, central processing units, state machines, logic circuitries, and/or any devices that manipulate signals based on operational instructions. For example, the processor(s) 122 can be one or more hardware processors and/or logic circuits of any suitable type specifically programmed or configured to execute the algorithms and processes described herein. The processor(s) 122 can be configured to send, receive, and transmit communications via the network interface 124. Additionally, the processor(s) 122 can be configured to fetch and execute computer-readable instructions stored in the computer-readable media of the patient screening components 126, which can program the processor(s) 122 to perform the functions described herein.
[0038] The network interface(s) 124 may enable wired and/or wireless communications between the components and/or devices shown in system 100 and/or with one or more other remote systems, as well as other networked devices. For instance, at least some of the network interface(s) 124 may include a personal area network component to enable communications over one or more short-range wireless communication channels. Furthermore, at least some of the network interface(s) 124 may include a wide area network component to enable communication over a wide area network. Such network interface(s) 124 may enable, for example, communication between the vision screening system 120 and the vision screening device 106 and/or other components of the system 100, via the network 116.
[0039] The patient screening components 126 may include volatile and nonvolatile memory and/or removable and non-removable media implemented in any type of technology for storage of information, such as computer-readable instructions, data structures, program modules, or other data. Memory can include, but is not limited to, RAM, ROM, EEPROM, flash memory or other memory technology, optical storage, solid state storage, magnetic tape, magnetic disk storage, RAID storage systems, storage arrays, network attached storage, storage area networks, cloud storage, or any other medium that can be used to store the desired information and that can be accessed by a computing device. The patient screening components 126 can include various types of computer-readable storage media and/or can be a tangible non-transitory media to the extent that when mentioned, non-transitory computer-readable media exclude media such as energy, carrier signals, electromagnetic waves, and signals per se.
[0040] The patient screening components 126 can include any number of functional components that are executable by the processor(s) 122. In many implementations, these functional components comprise instructions or programs that are executable by the processor(s) 122 and that, when executed, specifically configure the one or more processor(s) 122 to perform the actions associated with one or more vision tests.
[0041] Although not illustrated in FIG. 1, in some examples the patient screening components 126 may include computer-readable media configured to store a measurement data component. In such examples, the measurement data component may be configured to receive, access, and/or analyze testing data collected and/or detected by the vision screening device 106 during one or more vision screening procedures. For example, the measurement data component may be configured to receive, via the network 116, image data and/or video data generated by the vision screening device 106 during a vision screening test. The measurement data component may analyze the image data and/or video data to determine one or more measurements associated with the patient 102, such as a gaze of the patient 102 throughout the screening, a location of the pupils of the patient 102 at points in time of viewing the graphical representation, a diameter of the pupils, an accommodation of the lens, motion information associated with the eyes of the patient 102, and the like.
[0042] Further, although not illustrated in FIG. 1, the patient screening components 126 may also include computer-readable media configured to store a threshold data component. The threshold data component may be configured to receive, access, and/or analyze threshold data associated with standard vision testing results. For example, in such embodiments, a threshold data component may be configured to access or receive data from one or more additional databases (e.g., the database 118, a third-party database, etc.) storing testing data, measurements, and/or a range of values indicating various thresholds or ranges within which testing values should lie. Such thresholds or ranges may be associated with patients having normal vision health under similar testing conditions. For example, for each testing category, standard testing data may be accessed or received by the threshold data component and may be utilized for comparison against the measurement data stored by the measurement data component described above. For instance, the threshold data associated with the toddler testing category may include standard pupil measurements, and/or a threshold range of values which the testing values should not exceed or fall below (e.g., a standard value range) for toddlers when each graphical representation is displayed. For example, when testing for accommodation in the patient 102, an example threshold data component may be configured to store information associated with the amplitude of accommodation and age (e.g., Donders’ table).
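Purely by way of a non-limiting illustration, the comparison performed by such a threshold data component might resemble the following Python sketch; the testing categories, measurement names, and threshold values shown are hypothetical placeholders and are not taken from the present disclosure.

from dataclasses import dataclass

@dataclass
class ThresholdRange:
    low: float   # lower bound of the standard value range
    high: float  # upper bound of the standard value range

    def contains(self, value: float) -> bool:
        return self.low <= value <= self.high

# Hypothetical threshold data keyed by testing category (placeholder values).
THRESHOLDS = {
    "toddler": {"pupil_diameter_mm": ThresholdRange(3.0, 6.0)},
    "adult": {"pupil_diameter_mm": ThresholdRange(2.0, 5.0)},
}

def flag_out_of_range(category, measurements):
    """Compare measured values against the standard range for the category."""
    flags = {}
    for name, value in measurements.items():
        rng = THRESHOLDS.get(category, {}).get(name)
        if rng is not None and not rng.contains(value):
            flags[name] = (value, (rng.low, rng.high))
    return flags

# Example: a toddler measurement falling outside the standard range is flagged.
print(flag_out_of_range("toddler", {"pupil_diameter_mm": 6.8}))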
[0043] FIG. 2A illustrates an additional example system 200 of the present disclosure. In some examples, the system 200 may include one or more of the same components included in the system 100. In some additional examples, the system 200 can include different components that provide similar functions to the components included in the system 100. As shown in FIG. 2A, the system 200 may be utilized to determine a refraction error associated with one or more eyes of a patient 202. In particular, a user 204 can cause the system 200 to activate a radiation source 206 such that beams of radiation 208a and 208b are emitted towards the patient 202. Additionally, a reflected beam of radiation 208c can be captured by a radiation sensor lens 210 and directed onto a radiation sensor 212. Further, the system 200 can be controlled via a user device 216 and a user interface 218. In some examples, the user device 216 may include a controller 220 configured to operate the radiation source 206, the radiation sensor lens 210, and the radiation sensor 212 during the refractive error test.
[0044] In the example system 200 of FIG. 2A, the radiation source 206 may comprise a plurality of radiation point sources that emit visible or NIR electromagnetic waves when powered and/or provided an activation signal (e.g., LEDs, Organic Light Emitting Diodes (OLEDs), light bulbs, etc.). The radiation source 206 may be configured such that the plurality of radiation point sources is arranged in one or more eccentricities and/or one or more meridians. For instance, the radiation source 206 of FIG. 2A depicts eight radiation point sources that are configured in four eccentricities. Additionally, the radiation source 206 of FIG. 2A depicts the eight radiation point sources in a single meridian that extends vertically from the radiation sensor lens 210. It should be noted that the one or more eccentricities may describe a radial distance of one or more radiation point sources from the radiation sensor lens 210. Similarly, the one or more meridians may describe a rotational orientation of one or more additional radiation point sources relative to a vertical axis of the system 200. In some additional examples, the radiation source 206 can include a plurality of the radiation point sources arranged in three meridians that extend radially (e.g., a first meridian that extends vertically through 0 degrees and 180 degrees, a second that extends through 60 degrees and 240 degrees, and a third that extends through 120 degrees and 300 degrees).
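As a non-limiting sketch of the geometry described above, the following Python snippet computes point-source positions from eccentricity radii and meridian angles; the specific radii and angles are assumptions for illustration only.

import math

def point_source_positions(eccentricities_mm, meridian_angles_deg):
    """Return (x, y) offsets of radiation point sources relative to the
    radiation sensor lens, one source per (eccentricity, meridian) pair."""
    positions = []
    for radius in eccentricities_mm:          # radial distance from the lens
        for angle in meridian_angles_deg:     # rotational orientation
            theta = math.radians(angle)
            positions.append((radius * math.cos(theta), radius * math.sin(theta)))
    return positions

# Three meridians passing through 0/180, 60/240, and 120/300 degrees,
# sampled at three hypothetical eccentricities.
meridians = [0, 60, 120, 180, 240, 300]
print(point_source_positions([10.0, 20.0, 30.0], meridians))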
[0045] In the example system 200 of FIG. 2A, the radiation source 206 may be configured to activate one or more subsets of the plurality of radiation point sources during a refractive error test. For instance, six radiation point sources arranged in three meridians can be sequentially activated for an inner eccentricity, a middle eccentricity, and an outer eccentricity. It should be noted that while the above example describes the various eccentricities being activated as a whole, the system 200 may activate subsets of individual meridians and/or individual eccentricities. Accordingly, the radiation source 206 may be configured to provide one or more illumination patterns during the refractive error test. Additionally, while the example system of FIG. 2A depicts only radiation beams 208a and 208b being emitted by the radiation source 206, the system 200 may be configured to emit any number of radiation beams as an illumination pattern for the refractive error test. [0046] In the example system of FIG. 2A, the system 200 may include the radiation sensor lens 210 to direct the reflected radiation beam 208c to the radiation sensor 212. In some examples, the radiation sensor lens 210 may be substantially similar to or the same as the optical components discussed with respect to FIG. 1. In some additional examples, the radiation sensor lens 210 may be configured to adjust the aperture width and the focal length of the lens such that the radiation sensor 212 captures a high-quality image, sequence of images, and/or video of the one or more retinas of the patient 202 under illumination by the radiation source 206 (e.g., a clear image, an image without blurred features, etc.).
[0047] In the example system of FIG. 2A, the system 200 may be configured such that the emission of visible and/or NIR radiation beams 208a and 208b by the radiation source 206 is triggered based at least on a framerate and/or a capture rate associated with the radiation sensor 212. For instance, the radiation sensor 212 may be configured to capture images and/or video at a rate of thirty frames per second. Additionally, the radiation source 206 may be configured to emit visible and/or NIR radiation in bursts, flashes, packets, periods, etc. that are configured to be partially or fully synchronized with the capture rate of the radiation sensor 212. Accordingly, the system 200 may be operated such that some or all of the images captured by the radiation sensor 212 are illuminated for the duration of a frame capture period (e.g., for images captured at thirty frames per second, the frame capture period would be a thirtieth of a second). In another example, the emission of visible radiation may be controlled to display color stimuli such as color dot patterns to the patient 202, and the NIR radiation beams 208a and 208b may be activated for measuring refractive error and/or gaze angle of the eye(s) of the patient 202 in synchronization with the presentation of the color stimuli during the performance of a color vision screening test.
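A minimal, non-limiting timing sketch of this synchronization is shown below; the 30 fps frame rate matches the example above, while the margin value is an assumption added for illustration.

FRAME_RATE_FPS = 30.0
FRAME_PERIOD_S = 1.0 / FRAME_RATE_FPS  # approximately a thirtieth of a second

def flash_window(frame_index, margin_s=0.002):
    """Return (start, end) times for a flash spanning the given frame's capture
    period, with a small margin so the frame is illuminated for its full duration."""
    start = frame_index * FRAME_PERIOD_S - margin_s
    end = (frame_index + 1) * FRAME_PERIOD_S + margin_s
    return start, end

# Example: the flash covering the sixth captured frame.
print(flash_window(5))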
[0048] In the example system of FIG. 2A, the system 200 may be configured such that the radiation source 206, the radiation sensor lens 210, and the radiation sensor 212 can be secured by one or more radiation source supports 214 and/or a system housing 224. For instance, the radiation source support 214 can be one or more substrates that may be removably or non-removably attached to the radiation source 206, the radiation sensor lens 210, and/or the radiation sensor 212. A first radiation source support 214a may be configured to provide structural support to the plurality of radiation point sources of the radiation source 206. Additionally, the radiation sensor lens 210 and/or the radiation sensor 212 may be radially encompassed in at least a two-dimensional plane by the first radiation source support 214a. It should be noted that while the first radiation source support 214a is depicted by FIG. 2A as being attached to a posterior surface of the individual radiation point sources, the individual radiation point sources may be partially embedded in the first radiation source support 214a at any point along the primary axis (e.g., the axis perpendicular to the anterior, light-emitting surface and the posterior surface of the radiation point source).
[0049] In the example system of FIG. 2A, the radiation sensor 212 of the system 200 may be configured to receive and/or access light, image, and/or video data associated with a patient 202 being evaluated during a refractive error test. In particular, the radiation sensor 212 may be configured to capture, or generate, image and/or video data during the vision test. For example, as described herein, image/video data may be transmitted, via the communication interface(s) 222, to the user device 216 for processing and analysis. In some additional examples, the radiation sensor 212 includes, for example, a complementary metal-oxide semiconductor (CMOS) sensor array, also known as an active pixel sensor (APS), or a charge-coupled device (CCD) sensor. In some examples, the radiation sensor lens 210 is supported by the system 200 and positioned in front of the radiation sensor 212. In still further examples, the radiation sensor 212 has a plurality of rows of pixels and a plurality of columns of pixels. For example, the radiation sensor 212 may include approximately 1280 by 1024 pixels, approximately 640 by 480 pixels, approximately 1500 by 1152 pixels, approximately 2048 by 1536 pixels, and/or approximately 2560 by 1920 pixels. The radiation sensor 212 may be capable of capturing approximately 25 frames per second (fps), approximately 30 fps, approximately 35 fps, approximately 40 fps, approximately 50 fps, approximately 75 fps, approximately 100 fps, approximately 150 fps, approximately 200 fps, approximately 225 fps, and/or approximately 250 fps. Note that the above pixel values and frames per second are exemplary, and other values may be greater or less than the examples described herein.
[0050] In the example system of FIG. 2A, the radiation sensor 212 may include photodiodes having a light-receiving surface and have substantially uniform length and width. During exposure, the photodiodes convert the incident light to a charge. The radiation sensor 212 may be operated as a global shutter. For example, substantially all of the photodiodes may be exposed simultaneously and for substantially identical lengths of time. Alternatively, the radiation sensor 212 may be used with a rolling shutter mechanism, in which exposures move as a wave from one side of an image to the other. Other mechanisms are possible to operate the radiation sensor 212 in yet other examples. The radiation sensor 212 may also be configured to capture digital images. The digital images can be captured in various formats, such as JPEG, BITMAP, TIFF, etc.
[0051] In the example system of FIG. 2A, the system 200 may be additionally configured such that the second radiation source support 214b is attached to the radiation sensor 212 and the first radiation source support 214a. Similar to the first radiation source support 214a, the second radiation source support 214b may be configured to provide structural support to the radiation sensor lens 210, the radiation sensor 212, and the first radiation source support 214a. Additionally, the radiation sensor 212 may be partially embedded and radially encompassed in a two-dimensional plane at any position along the primary axis of the radiation sensor 212. Further, the second radiation source support 214b can provide structural support to a communication interface 222. [0052] In the example system of FIG. 2A, the system 200 may be further configured such that the system housing 224 may provide structural support for the individual radiation point sources of the radiation source 206, the radiation sensor lens 210, the first radiation source support 214a, the second radiation source support 214b, the user device 216, and/or other components of the system 200. For instance, the system housing 224 can be an enclosed or partially enclosed structure comprised of at least an anterior surface penetrated or perforated such that the radiation source 206 may emit the radiation beams 208a and 208b while the radiation sensor lens 210 may collect the radiation beam 208c. Additionally, a posterior surface of the system housing 224 may comprise a socket, a bracket, a port, or other connector that provides a connection interface for the user device 216. Alternatively, the posterior surface of the system housing 224 may include a user interface 218 embedded in or attached to the system housing 224 and integrated with the system 200.
[0053] In the example system of FIG. 2A, the system 200 may include the first radiation source support 214a and the second radiation source support 214b, wherein the radiation source supports may be integrated circuit boards, printed circuit assemblies (PCAs), printed circuit boards (PCBs), or other circuit boards configured to provide support and signaling to the radiation source 206, the radiation sensor lens 210, the radiation sensor 212, the communication interface 222, and other components of the system 200. The radiation source supports may comprise processors, microprocessors, microcontrollers, memory, computer-readable media, drivers for the individual radiation point sources, and other support components for the system 200. Further, the radiation source supports can provide communication with the user device 216 and/or the user interface 218 via circuitry and/or connections routed through the system housing 224 or the communication interface 222.
[0054] In the example system of FIG. 2A, the communication interface 222 may be configured to provide data connections and network communications with the user device 216 and/or the communication network 116 described with respect to FIG. 1. For instance, the communication interface 222 may be configured to connect to external databases (e.g., the database 118) to receive, access, and/or send screening data using wireless connections. Wireless connections can include cellular network connections and connections made using protocols such as 802.11a, b, g, and/or ac. In other examples, a wireless connection can be accomplished directly between the vision screening device 106 and an external display using one or more wired or wireless protocols, such as Bluetooth, Wi-Fi Direct, radio-frequency identification (RFID), infrared signals, and/or Zigbee. Other configurations are possible. The communication of data to an external database 118 or an external system can enable report printing or further assessment of the visual test data of the patient 202. For example, collected data and related test results may be wirelessly transmitted and stored in a remote database accessible by authorized medical professionals.
[0055] FIG. 2B illustrates an embodiment of a vision screening device 230 according to some implementations. The example vision screening device 230 may include one or more of the same components included in the device 106 of system 100. In some additional examples, the vision screening device 230 can include different components that provide similar functions to the device 106 included in the system 100.
[0056] The vision screening device 230 may include two tablet-like components, a first component 232 including a first housing 233 and a second component 234 including a second housing 235, which may include one or more processors associated therewith (not shown). In a partially open position 230A of the vision screening device 230 shown, a first front surface 236 of the first component 232 may face a direction 238A towards a patient (such as the patient 202), and a second front surface 240 of the second component 234 may face a direction 238B, opposite the direction 238A, towards a user of the vision screening device 230 (such as the user 204). In the partially open state 230A, the first back surface 242A of the first component 232, formed by the back of the first housing 233, faces the second back surface 242B of the second component 234, formed by the back of the second housing 235, as shown. The first housing 233 and the second housing 235 may be rotatably coupled via a hinged attachment 244 and/or be otherwise pivotable, so that the first component 232 and the second component 234 pivot about a central longitudinal axis 246. To reach the completely open position (not shown), the first component 232 may be rotated along the counter-clockwise direction 248A, and the second component 234 may be rotated along the clockwise direction 248B towards each other until the back surfaces 242A, 242B are adjacent and flush against each other. In addition, the first component 232 may be rotated along the clockwise direction 248B and the second component 234 may be rotated along the counter-clockwise direction 248A until the front surfaces 236, 240 are proximate (e.g., flush against and facing each other), and the back surfaces 242A, 242B form the exterior surfaces of the vision screening device 230, reaching a closed position 230B. As described above, the first housing 233 and the second housing 235 are moveable between the closed position 230B, in which the first front surface 236 is disposed adjacent to and facing the second front surface 240, and a fully open position, in which the first front surface 236 is disposed substantially parallel to and approximately 180 degrees opposite the second front surface 240, moving through intermediate partially open positions, such as the partially open position 230A.
[0057] While the vision screening device 230 is being used to administer vision screening test(s), the vision screening device 230 may be at the fully open position described above. In some examples, the vision screening device 230 may instead be at a partially open position, where the first component 232 is substantially perpendicular to the direction 238A facing the patient, and the second component 234 forms an obtuse angle with the direction 238B to provide a comfortable reading angle for a user of the vision screening device 230. [0058] In the embodiment shown in FIG. 2B, the first component 232 may comprise an imaging assembly 250 and a first display 252 disposed within the first housing 233 forming the substantially planar first front surface 236. The imaging assembly 250 may include one or more illumination and image capture modules 254. Additional details of the illumination and image capture modules 254 included in the imaging assembly 250 will be discussed with reference to FIG. 2C. Though the imaging assembly 250 is shown as being disposed at a top end of the first display 252, in other examples, the imaging assembly 250 may be disposed at other locations along a periphery of the first display 252. Additionally, the imaging assembly 250 may be substantially planar and its front surface 255 may be co-planar with the first front surface 236 of the first component 232.
[0059] In some examples, the imaging assembly 250 may comprise one or more units distributed at different locations along the periphery of the first display 252, each unit including one or more of the modules 254. For example, as will be discussed further with reference to FIG. 2C, a first unit 250A of the imaging assembly 250 may be disposed on the right side of the first display 252, and a second unit 250B of the imaging assembly 250 may be disposed on the left side of the first display 252. Each unit 250A, 250B of the imaging assembly 250 may include one or more modules 254. The first unit and the second unit of the imaging assembly 250 may be targeted at different eyes of the patient, e.g., the left eye and the right eye.
[0060] The first display 252 may be used to present visual stimuli 256 to the patient during the vision screening test(s). For example, the visual stimuli 256 may include a video to attract the gaze of the patient’s eyes to the display screen, a visual acuity target (e.g., a Snellen chart) during a visual acuity screening test, different brightness levels for an accommodation screening test, color patterns for a color vision screening test, and the like. In the example shown in FIG. 2B, the visual stimulus 256 shown on the first display 252 is a Snellen chart for determining visual acuity of the patient. The patient being evaluated may be asked to read the characters where each successive line has characters smaller than the characters in the line above.
[0061] As also shown in FIG. 2B, the second component 234 may include a second display 258 forming the substantially planar second front surface 240, disposed within the second housing 235. The second display 258 may be used to display information to the user that is relevant to the vision screening test(s) being administered. The second display 258 may present one or more visual elements conveying the relevant information. For example, the visual elements may include a live video feed 260 of the patient being evaluated so that the user can monitor the patient’s actions and responses. The visual elements may also include the patient’s information 262, which may include demographic information, prior screening results, and/or medical history and the like. In some examples, the visual elements may include a replication 264 of the visual stimuli 256 being presented to the patient and/or an image 266 of the eyes of the patient captured by a module of the imaging assembly 250. In addition, the visual elements may include measurements and/or results 268 related to the vision screening test being administered, and/or a recommendation 270 which may be system-generated in some examples.
[0062] The second display 258 may also display an icon panel 272 including icons 274 that provide the user with easy navigation and access to other tasks or screens. For example, the icon panel 272 may include several icons for performing tasks associated with administering vision screening test(s). Some examples of icons that may be displayed include an icon for selecting a vision screening test from a listing of vision screening tests supported by the vision screening device 230, an icon for accessing a list of patients who may be scheduled for evaluation, an icon for adding screening test results to a summary of patient evaluation, an icon to initiate a re-do of a previous step or a previous screening test, and/or an icon to save or print an evaluation report, and the like. The icon panel 272 may also include a “home” icon to allow the user to access a “home” screen, which may be set up to include one or more of the visual elements
260-270 and icons 274 described above. There may also be set-up icons available to an administrator of the vision screening device 230 for setting up content e.g., images/video to be shown to the patient, and timing of display of the content on each of the displays 252 and 258 for each of the vision screening tests supported by the vision screening device 230.
[0063] In some examples, the second display 258 may be touch-sensitive to receive input from the user of the vision screening device 230 using a stylus or finger. In other examples, the vision screening device 230 may include a keyboard or other data entry mechanism (e.g., voice command recognition) to allow the user to input data related to the vision screening test(s). The vision screening device 230 may also be configured to connect wirelessly with external computing devices, peripherals, cloud services, and/or external databases, as described in further detail with reference to FIG. 3.
[0064] As discussed herein, FIG. 2B depicts an exemplary vision screening device 230 that includes components for administering one or more vision screening test(s) to a patient. The vision screening device 230 is intended to perform an entire vision screening, which may include multiple, different vision screening tests. The vision screening device 230, as shown, has the additional feature of being lightweight enough to be hand-held, e.g., weighing the equivalent of two tablet-like components 232, 234, allowing for portability and ease-of-use. The imaging assembly 250 of the vision screening device 230 provides the radiation sources and image capture modules needed for the one or more vision screening test(s) in a compact and substantially planar arrangement, enabling the lightweight and portable form factor of the vision screening device 230. The imaging assembly 250 and its modules 254 are described in detail with reference to FIG. 2C below.
[0065] FIG. 2C illustrates an imaging assembly 250 of the vision screening device 230, including an exemplary configuration of modules 254. The imaging assembly 250 is envisioned to be configurable to include modules 254 needed for performing a variety of vision screening tests. Though a particular set of modules 280-288 is shown, there may be fewer or additional modules. For example, based on the vision screening tests and features enabled in the vision screening device 230, fewer or additional modules may be included in the imaging assembly 250. The imaging assembly 250 may include a housing 276 in which the modules 254 are disposed, supported by a base 277 that may be operably connected to the processors of the vision screening device 230 to provide the functionality described herein. The housing 276 and the modules 254 disposed therein may be covered with a front cover 278, on an opposite side of the base 277. The front cover 278 may be transparent in areas which are disposed over optical components of the modules 254 underneath the front cover 278 e.g., cameras, illumination sources, and the like, to allow radiation to pass through without change.
[0066] FIG. 2D illustrates a view of the imaging assembly 250 where the front cover 278 is removed, exposing the modules 254. In examples, the modules 254 may include one or more of a near-infrared (NIR) radiation source 280, a NIR camera 282 for capturing NIR radiation reflected from eyes of the patient, a visible light camera 284, a range sensor 286, and/or a visible light source 288. The camera modules 282, 284 may also include video capture capabilities. For example, the modules 254 may each include a connector 287 that provides physical and electrical connections to the base 277 of the housing 276. The modules 280, 282, 284, 286, 288 may each be modular components that can be added in a series, or combined in other ways, to form the imaging assembly 250. The type of modules, the number of modules, or the order and manner in which they are assembled is not intended to be construed as a limitation, and any number of modules can be combined in any manner to form the imaging assembly 250. For example, in some instances, the NIR radiation source module 280, which may include an arrangement of near infrared light emitting diodes (NIR LEDs), may be placed in front of the NIR camera 282 module so that the NIR LEDs are disposed around an optical axis of the NIR camera 282.
[0067] In instances where the imaging assembly 250 comprises multiple units distributed along the periphery of the first display 252 e.g., units 250A, 250B, each unit of the imaging assembly 250 may include a housing in which one or more modules e.g., modules 280-288, are disposed. For example, the first unit 250A of the imaging assembly 250 may include a first near-infrared (NIR) radiation source 280 and a first NIR camera 282, and the second unit 250B of the imaging assembly 250 may include a second near-infrared (NIR) radiation source 280 and a second NIR camera 282. In some examples, alternatively or additionally, the first unit 250A and the second unit 250B of the imaging assembly 250 may include a visible light source 288 and/or a visible light camera 284.
[0068] In another example, a third unit (not shown) of the imaging assembly 250, comprising one or more modules 254, may be disposed behind the first display 252 and within the first housing 233. In such an example, the first display 252 may be a transparent screen, such as a transparent organic light-emitting diode (OLED) screen, allowing radiation to travel through the display and to or from the modules 254 of the third unit without change. For example, the third unit may include a NIR camera 282, so that the NIR camera 282 is aimed in the direction 238A, directly at the eyes of a patient viewing content on the first display 252. In this arrangement, radiation reflected from the patient’s eye(s) impinges directly on the NIR camera 282, e.g., during a photorefraction screening test. In other examples, the third unit may include a visible light camera 284 and/or other modules 254.
[0069] In examples, the NIR radiation source 280 may comprise a NIR light-emitting diode (LED) illumination module which may incorporate one or more NIR LEDs arranged in a pattern that may be configured to provide patterns of illumination needed during various vision screening tests, e.g., a refractive error test as described below. The NIR camera module 282 configured to capture NIR radiation reflected from the eyes of the patient may comprise a high-resolution, auto-focus camera with custom optics for imaging eyes in clinical applications. The NIR camera module 282 will be described in further detail with reference to FIG. 2E.
[0070] In examples, the visible light camera 284 may be equipped with auto-focus zoom lens(es) that can provide telephoto as well as wide-angle image capture capability. For example, at the start of a vision screening session, the camera 284 may operate in a wide-angle capture mode to enable the vision screening device 230 to locate the patient in the examination room. Thereafter, the camera 284 may operate in a telephoto zoom mode to capture close-up views of the patient’s face or eyes. The camera 284 may also capture video of the patient while the vision screening tests are being administered.
[0071] As described, the patient is typically located 3 to 22 feet from the vision screening device 230 during screening. In some examples, the imaging assembly 250 may include a range sensor module 286 for sensing depth information. The depth information of the patient relative to the vision screening device 230 corresponds to the distance of the patient from the vision screening device 230. Therefore, the depth detected by the range sensor module 286 may be used by the user of the vision screening device 230 to verify that the patient is located at an ideal distance for the vision screening test being administered. The detected depth or distance to the patient may also be used to adjust visual stimuli being presented to the patient, e.g., the size of characters in a visual acuity test presented to the patient may be smaller if the patient is closer to the vision screening device 230. The range sensor module 286 may comprise 3D sensors such as LIDAR sensors which use laser scanning, or ToF (time-of-flight) sensors which use infrared light to determine depth information. In some examples, the range sensor module 286 may generate a three-dimensional point cloud of a portion of the examination room, including the patient. The point cloud may be generated while the patient is illuminated by NIR radiation emitted by the NIR radiation source 280, and may be used to measure distance to the patient. In some examples, a three-dimensional point cloud of the patient’s head may be generated using the range sensor module 286. The three-dimensional point cloud of the patient’s head may be used to more accurately localize the patient’s eyes and locate pupils of the eyes.
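By way of a non-limiting example of distance-compensated stimuli, the following sketch scales an optotype so that it subtends a constant visual angle at the measured distance; it assumes the common convention that a 20/20 optotype subtends five arcminutes, and the display pixel pitch used is a placeholder.

import math

ARCMIN_PER_20_20_LETTER = 5.0  # conventional visual angle for a 20/20 optotype

def optotype_height_px(distance_m, acuity_fraction=1.0, px_per_mm=8.0):
    """Letter height in pixels for the measured patient distance.
    acuity_fraction is 1.0 for the 20/20 line, 2.0 for 20/40, and so on."""
    angle_rad = math.radians(ARCMIN_PER_20_20_LETTER * acuity_fraction / 60.0)
    height_mm = 2.0 * distance_m * 1000.0 * math.tan(angle_rad / 2.0)
    return round(height_mm * px_per_mm)

# A patient at 3 m needs a smaller 20/20 letter than a patient at 6 m.
print(optotype_height_px(3.0), optotype_height_px(6.0))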
[0072] Additionally, the imaging assembly 250 may include a visible light source 288, which is more commonly referred to as a flash. The visible light source 288 may illuminate the patient while the visible light camera 284 captures images or video of the patient. The timing of activation of different modules may be synchronized e.g., the visible light source 288 may be activated during the capture of an image by the visible light camera 284. The modules 254 of the imaging assembly 250 may be controlled by a processor of the vision screening device 230 as described with reference to FIG. 3.
[0073] FIG. 2E illustrates an exploded view of the NIR camera module 282. For example, as shown, the NIR camera module 282 may include a high-resolution lens 290 with a narrow field of view suitable for imaging eyes in a vision screening setting. The lens 290 may incorporate folded prism slim lens technology which allows for telephoto zoom while maintaining a low height profile. The optical system used in folded prism lenses bends and focuses light as it is reflected back and forth inside thin optical prisms, reducing the thickness of the lens and allowing for a substantially low-height form factor. As discussed later with reference to FIGS. 4-7, some vision screening tests, and in particular a refractive error screening test, require images of pupils and/or lenses of the eyes of the patient. The high-resolution zoom capability of the camera 282 enables the capture of close-up images of the eyes of the patient from which the pupils and/or lenses of the eyes can be localized.
[0074] In some examples, the lens may incorporate an aspherical multi-lens design which allows focusing in the 3-22 foot range that is typically used in a vision screening setting as the distance between the patient and the vision screening device 230. The lens 290 may be controlled by an actuator 292 incorporating high-performance miniature actuator technology which enables auto-focus and optical image stabilization features of the camera 282. For example, the multi-lens design of the lens 290 may incorporate a stack of lenses along an optical axis 291, each of the lenses of the stack being capable of off-axis lateral motion. In addition, the lens 290 may include a prism rotatable about an axis perpendicular to the optical axis 291. The actuator 292, which may be micro-electro-mechanical systems (MEMS)-based, may control the motion of the lenses of the stack of lenses as well as the prism, providing optical image stabilization to compensate for small movements arising from a user hand-holding the vision screening device 230 while administering the vision screening test(s). The optical image stabilization feature of the lens 290 may enable sharper image capture using the NIR camera 282 in a hand-held vision screening device 230, eliminating the need for using a rigid support system for the vision screening device 230, such as a tripod.
[0075] The NIR camera module 282 also includes a NIR sensor 294 and a controller board 296, such as a radio frequency printed circuit board (RF-PCB), fitted into a housing 298 of the module 282. In some examples, the housing 298 may have a base 298 and one or more walls 299 extending substantially perpendicular to the base. In additional examples (not shown), the housing may also include a transparent cover opposite the base 298 and supported by the one or more walls 299 to form an internal space. The internal space may be partially enclosed by the walls 299, the cover, and the base 298. In such examples, one or more components of the module 282 may be disposed at least partly within the internal space. The cover and the base 298 may be substantially planar. In some examples, the cover forms a portion of the cover 278 of the imaging assembly 250 that is disposed substantially coplanar with the first display 252 of the vision screening device 230. In some other examples, one or more of the components of the module 282 may at least partially protrude from (e.g., extend through one or more orifices formed by the cover) the surface of the cover 278. In some examples, the visible light camera 284 may include components substantially similar to those described above with reference to the NIR camera 282.
[0076] FIG. 3 illustrates another example system 300 of the present disclosure. As can be seen in FIG. 3, the system 300 may include one or more of the same or similar components as those included in the systems 100 and 200. In some examples of FIG. 3, the system 300 can interact with the components described with respect to the systems 100 and 200 to execute one or more methods according to this disclosure. For example, as shown in FIG. 3, the system 300 may include an LED PCA 302 comprised of an LED driver 304, a microcontroller unit (MCU) 306, power management systems 308, and an information interface 310. Additionally, a user device 312 can be included in the system 300, which may be the vision screening device 230. The user device 312 comprises a central processing unit (CPU) 314, a communication interface 316, a user interface 318, memory 320, power management systems 322, an information interface 324, and a display 326. Additionally, the user device 312 may include a display 328 for displaying content to a patient 202 whose vision is being screened using the user device 312. Further, components of the LED PCA 302 can be configured to operate the radiation emitting systems and radiation capturing systems described with respect to FIGS. 1 and 2A-2C, e.g., the imaging assembly 250.
[0077] In some examples of FIG. 3, the LED driver 304 may operate and control the radiation source 206 or the NIR radiation source 280. For instance, the LED driver 304 may provide one or more commands, received via the MCU 306, for selectively activating and deactivating individual radiation point sources of the vision screening system. It should be noted that the LED driver 304 may operate to activate the individual radiation point sources independent of the meridian and eccentricity associations established between the radiation point sources. Additionally, the LED driver 304 may be configured to receive and distribute power from the power management system 308 to the radiation source 206. Further, the LED driver 304 may communicate with the CPU 314 and receive one or more indications providing instruction for radiation source activation and activation patterns.
[0078] In some examples of FIG. 3, the MCU 306 may be configured to perform or partially perform the methods described by FIGS. 4-7 or the workflow of FIG. 8. Additionally or alternatively, the MCU 306 may operate to control the radiation sensor 212, or the NIR camera 282, to capture an image, a plurality of images, and/or a video of one or more retinas associated with the patient 202. The MCU 306 may communicate with the CPU 314 via the information interface 310 to receive indications of routines and/or algorithms to be performed during the vision screening test(s) and to transmit the image(s) and video captured during the routines and/or algorithms. In some additional examples of FIG. 3, the MCU 306 may be configured to perform high-speed video frame capture while the LED driver 304 causes the radiation source 206 to illuminate the patient 202. In particular, the MCU 306 may receive an indication for a series of light pattern images to be captured within a timeframe associated with a vision screening test, like a refraction test. Accordingly, the CPU 314 can synchronize the radiation source 206, or the NIR radiation source 280, flashing the patient 202 and illuminating the retinas of the patient 202 such that the radiation sensor 212, or the NIR camera 282, may capture the images and/or video for the series of light pattern images of the refraction test. Additionally, due to the timeframe provided for completion of the refraction test, the synchronization of the LED driver 304 with the MCU 306 by the CPU 314 enables the series of light pattern images to be captured despite the latency between the LED driver 304, the MCU 306, and the CPU 314.
[0079] In the example shown in FIG. 3, the CPU 314 of the system 200 may comprise one or more controllers, processors, and/or other hardware and/or software components configured to operably control the LED driver 304, the MCU 306, the communication interface 316, the user interface 318, the display 326, and/or other components of the system 200. For instance, the CPU 314 shown in FIG. 3 may include a single processing unit (e.g., a single processor) or a number of processing units (e.g., multiple processors), and can include single or multiple computing units or multiple processing cores. The CPU 314 shown in FIG. 3 can be implemented as one or more microprocessors, microcomputers, microcontrollers, digital signal processors, central processing units, state machines, logic circuitries, and/or any devices that manipulate signals based on operational instructions. For example, the CPU 314 shown in FIG. 3 can be one or more hardware processors and/or logic circuits of any suitable type specifically programmed or configured to execute the algorithms, operations, and methods described herein. The CPU 314 shown in FIG. 3 can be configured to fetch and execute computer-readable instructions stored in memory 320, which can program the CPU 314 to perform the functions described herein. Additionally or alternatively, the CPU 314 shown in FIG. 3 can be configured to fetch and execute computer-readable instructions stored in the patient screening components 126 of the vision screening system 120 (FIG. 1).
[0080] In any of the examples described herein, the CPU 314 shown in FIG. 3 may be configured to receive various information, signals, and/or other inputs from one or more of the user interfaces 318, the radiation sensor 212 or the NIR camera module 282, the display 326, and/or other components of the system 200. In some examples, the user interface 318 may receive such inputs from the user 204, and one or more such inputs may comprise a command or a request for the system 200 to generate, display, provide, and/or otherwise output one or more images, beams of radiation, dynamic stimulus, or other output included in a refractive error examination or other vision test. For example, the CPU 314 shown in FIG. 3 may be operable to cause the radiation source 206, the radiation source 280, and/or the visible light source 288 to generate, display, provide, and/or otherwise output, beams of radiation, dynamic stimulus, or other output included in a refractive error examination or other vision test. In addition, the CPU may cause one or more images to be output at the display 326 (or the display 258) facing the user 204, and/or the display 328 (or the display 252) facing the patient 202.
[0081] In some examples, the CPU 314 can be configured to synchronize capture of high-speed images or frames with periodic illumination of the patient 202 via the radiation source 206 or the radiation source 280. It should be noted that the synchronization of high-speed image and/or frame capture with the illumination of the patient 202 may be utilized to compensate for communication delays, between the CPU 314 and the LED driver 304 and between the CPU 314 and the MCU 306, caused by the information interfaces 310 and 324. For instance, where the CPU 314 communicates with the MCU 306 via a first communication method (e.g., through USB video class (UVC) communications where the images and/or frames observed by the radiation sensor 212, or captured by the NIR camera 282, are transmitted by the MCU 306 to the CPU 314 for recording), commands, images observed by the radiation sensor 212 or captured by the NIR camera 282, and other information can be exchanged with relatively low latency between transmission of information by either the CPU 314 or the MCU 306 and the receipt of the transmission by either the MCU 306 or the CPU 314, respectively. However, the CPU 314 may communicate with the LED driver 304 via a second communication method (e.g., through Bluetooth communications), wherein the second communication method exchanges information with relatively high latency between transmission and receipt of information. Further, in some additional examples, the LED driver 304 and the MCU 306 may be unable to exchange communications. Accordingly, the high latency between the CPU 314 and the LED driver 304 may introduce a delay between the CPU 314 transmitting an illuminate command to the LED driver 304 and the completion of the illuminate command for each image of the series of light pattern images, such that a complete series of light pattern images is not captured during the timeframe provided for the refraction test. [0082] In a first example, synchronization of the LED driver 304 can enable the radiation source 206 or the radiation source 280 to illuminate the patient 202 while an image and/or a frame of the series of light pattern images is captured. By modifying a flash duration to sufficiently exceed the time for capture of a single image and/or frame via the radiation sensor 212 or the NIR camera 282 (e.g., establishing the flash duration to be at least double the time for capturing a frame), the CPU 314 can ensure that at least one frame or image captured during the flash will be fully illuminated. Accordingly, the CPU 314's modification of the flash duration can enable the series of light pattern images captured during the timeframe to include a series of images that may be utilized for the refraction test. It should be noted that each activation of the radiation source 206 or the radiation source 280 includes at least one wholly illuminated frame (e.g., the radiation source 206 or the radiation source 280 does not deactivate during the frame capture) captured by the radiation sensor 212 and may include one or more partially illuminated frames. Additionally, any frames from the series of images that overlap with a deactivation of the radiation source 206 or the radiation source 280 (e.g., the flash ends while the frame is captured, causing the frame to be partially illuminated) are discarded.
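The first example above can be illustrated with the following non-limiting sketch: when the flash lasts at least two frame periods, at least one captured frame falls entirely within the flash regardless of the latency-induced offset, and partially illuminated frames are discarded. The frame rate, flash start time, and frame count are assumptions for illustration.

FRAME_PERIOD_S = 1.0 / 30.0              # assumed frame capture period
FLASH_DURATION_S = 2.0 * FRAME_PERIOD_S  # flash at least double the frame period

def fully_illuminated_frames(flash_start_s, num_frames=25):
    """Indices of frames whose entire exposure lies within the flash window."""
    flash_end_s = flash_start_s + FLASH_DURATION_S
    keep = []
    for i in range(num_frames):
        frame_start = i * FRAME_PERIOD_S
        frame_end = frame_start + FRAME_PERIOD_S
        if frame_start >= flash_start_s and frame_end <= flash_end_s:
            keep.append(i)  # partially illuminated frames are not appended
    return keep

# Even with an arbitrary latency-induced offset, at least one frame survives.
print(fully_illuminated_frames(flash_start_s=0.0121))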
[0083] In a second example, the MCU 306 can synchronize activation of the eccentric radiation source 206 or the radiation source 280 by the LED driver 304 with the capture of the series of images and/or frames by the radiation sensor 212 or the NIR camera 282. Additionally, the MCU 306 can receive a command from the CPU 314 that triggers the synchronization of eccentric radiation source activation and image capture such that a series of images may be captured. Further, the synchronization of the eccentric radiation source activation and the image capture may be achieved by utilizing the communication pathways between the MCU 306 and the CPU 314 and between the MCU 306 and the LED driver 304 to issue commands. For example, where the information interface between the radiation sensor 212 or the NIR camera 282 and the MCU 306 permits, the MCU 306 can utilize unused information bandwidth to forward commands, via the information interface 310, to the LED driver 304 (e.g., the MCU 306 may utilize a VSYNC pin to transmit a VSYNC signal to the LED driver 304 and bypass the utilization of the Bluetooth connection to trigger the activation of the eccentric radiation source 206 or the radiation source 280). Accordingly, the MCU 306 forwards commands for radiation source activation to the LED driver 304, thereby causing the LED driver 304 to activate in synchronicity with the rate of frame capture at the radiation sensor 212 or the NIR camera 282 and bypassing the latency between the LED driver 304 and the CPU 314.
[0084] Continuing from the first example and the second example, the CPU 314 may encode a “start” image and/or frame for the series of light pattern images received from the MCU 306. In particular, the components of the LED PCA 302 are unaware of a start frame or image for the series of light pattern images. Additionally, the CPU 314 may be configured to determine the start image associated with the LED driver 304 receiving a command to initiate one or more flashes via the radiation source 206 or the radiation source 280 and the CPU 314 collecting the series of light pattern images. Accordingly, based at least on the flash duration, the CPU 314 may determine an illumination pattern for one or more images that indicates the start image for the series of light pattern images requested by the refraction error test. For example, where the flash duration is determined to be twice the time utilized to capture a single image, the illumination pattern can be a high light intensity image followed by a low light intensity image followed by an additional high light intensity image. In an additional example, where the flash duration is determined to be three times the time utilized to capture a single image, the illumination pattern can be two high light intensity images followed by a low light intensity image followed by an additional high light intensity image. In either of the above examples, the CPU 314 may identify the start image for the series of light pattern images recorded by the CPU 314 and encode the series of light pattern images with an identifier of the start image. It should be noted that encoding the start frame data into the series of light pattern images may comprise creating the illumination pattern identified based at least on the flash duration and the time to capture a single image to encode the high illumination, low illumination, high illumination pattern, or other illumination pattern, into a sequence of images that precede the start image. Accordingly, an image processing algorithm may be configured to identify the start image of the refraction test for the series of light pattern images due to the embedded illumination pattern.
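A non-limiting sketch of how an image processing algorithm might recover the embedded start marker is shown below; the intensity threshold and the example per-frame intensities are hypothetical.

def find_start_index(frame_means, high_threshold=120.0):
    """Return the index of the first image following a high-low-high
    illumination marker embedded in the sequence of mean frame intensities."""
    for i in range(len(frame_means) - 3):
        a, b, c = frame_means[i], frame_means[i + 1], frame_means[i + 2]
        if a >= high_threshold and b < high_threshold and c >= high_threshold:
            return i + 3  # the start image immediately follows the marker
    return None

# Hypothetical mean pixel intensities per captured frame.
frame_means = [20, 22, 180, 35, 190, 185, 60, 175]
print(find_start_index(frame_means))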
[0085] In a third example, the vision screening test being administered may be a color vision screening test. In this example, the MCU 306 may synchronize the activation of the radiation source 206, or the radiation source 280, with the presentation of color stimuli to the patient 202. For example, the color stimuli may be displayed on the display 328, or the display 252, facing the patient, and the radiation source 206, or the radiation source 280, may be activated to determine the refractive error of the eye(s) of the patient 202 responsive to the color stimulus being displayed. The radiation sensor 212 or the NIR camera 282 may capture images of the eye(s) illuminated by the NIR radiation emitted by the radiation source 206, or the radiation source 280, and the captured images may be processed by the CPU 314 to determine differences in the refractive error in response to changes in the color stimulus being presented on the display 328 or the display 252.
[0086] Further, the power management systems 308 and 322 may comprise any removable, rechargeable, and/or other power source known in the art and configured to store electrical power. The power management systems 308 and 322 may comprise one or more rechargeable batteries configured to selectively provide electrical current to the one or more components of the system 200 during use. For instance, the power management systems 308 and 322 may comprise one or more sealed lead acid batteries, lithium ion batteries, nickel cadmium batteries, nickel-metal hydride batteries, or other types of batteries configured to provide sufficient power to the LED driver 304, the MCU 306, the radiation source 206 or the radiation source 280, the radiation sensor lens 210, 290, the radiation sensor 212, 282, the information interfaces 310 and 324, the CPU 314, the communication interface 316, the user interface 318, and/or other components of the described systems.
[0087] The communication interface(s) 316 of the system 200 shown in FIG. 3 may enable wired and/or wireless communications between the vision screening device 300 and one or more external databases 118, a communications network 116, and/or one or more components of the vision screening system 120 (FIG. 1), as well as with one or more other remote systems and/or other networked devices. For instance, the communication interface(s) 316 may include a personal area network component to enable communications over one or more short-range wireless communication channels. Furthermore, the communication interface(s) 316 may include a wide area network component to enable communication over a wide area network. In any of the examples described herein, the communication interface(s) 316 may enable communication between the system 200 and external resources via the network 116 (FIG. 1).
[0088] In some respects, the memory 320 shown in FIG. 3 may be similar to the patient screening components 126 described above with respect to the vision screening system 120 (FIG. 1), or memory associated with the tablet-like components 232 and 234. For example, the memory 320 may include volatile and nonvolatile memory and/or removable and nonremovable media implemented in any type of technology for storage of information, such as computer-readable instructions, data structures, program modules, or other data. Such memory 320 can include, but is not limited to, RAM, ROM, EEPROM, flash memory or other memory technology, optical storage, solid state storage, magnetic tape, magnetic disk storage, RAID storage systems, storage arrays, network attached storage, storage area networks, cloud storage, or any other medium that can be used to store the desired information and that can be accessed by a computing device. The memory 320 can be a type of computer-readable storage media and/or can be a tangible non-transitory media to the extent that when mentioned, non-transitory computer-readable media exclude media such as energy, carrier signals, electromagnetic waves, and signals per se.
[0089] The memory 320 can be used to store any number of functional components that are executable and the images to be processed by the CPU(s) 314. In many implementations, these functional components comprise instructions or programs that are executable by the CPU(s) 314 and that, when executed, specifically configure the one or more CPU (s) 314 to perform the actions described herein and associated with one or more vision screening tests.
[0090] Other functional components stored in the memory 320 may include, among other things, a graphical representation data component, a measurement data component, a threshold data component, a notification component, a sensor data component, a range finder data component, a microphone data component, a light source control component, a machine learning component, and/or any other functional component associated with the operation of the system 200.
[0091] FIG. 4 provides a flow diagram illustrating an example method 400 for vision testing, as described herein. The method 400 is illustrated as collections of blocks in a logical flow graph, which represents a sequence of operations that can be implemented in hardware, software, or a combination thereof. In the context of software, the blocks represent computer-executable instructions stored on one or more computer-readable storage media that, when executed by CPU(s), perform the recited operations. Generally, computer-executable instructions include routines, programs, objects, components, data structures, and the like that perform particular functions or implement particular abstract data types. The order in which the operations are described is not intended to be construed as a limitation, and any number of the described blocks can be combined in any order and/or in parallel to implement the method
400. In some embodiments, one or more blocks of the method 400 can be omitted entirely.
[0092] At block 402, the CPU of a vision screening device 102 may determine whether a field of view observed by a radiation sensor is in a focused state. In some examples, the focused state indicates that a series of images may be captured for a refraction test. Additionally, the focused state can be identified by the satisfaction of one or more image state thresholds. The one or more image state thresholds may be associated with a clarity of an image (e.g., a lack of blurred features), a stability of the image, and/or other features associated with the subject of the image (e.g., eyes of a patient receiving a refraction test) observed within the field of view of the radiation sensor. Accordingly, where the CPU of a vision screening device 102 determines that the image satisfies the image state thresholds and is in the focused state, the CPU can cause the radiation sensor to capture a series of images (e.g., in some embodiments the series of images can include 25 frames) having a selection of LED illuminations of the eyes/retinas of the patient. Further, the CPU may cause the radiation sensor to capture the series of images where individual images of the series of images are captured in association with individual LED illuminations of the selection of LED illuminations. The determination of the focused state for the image and the capture of the series of images is discussed in greater detail by FIG. 5.
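A minimal, non-limiting sketch of such an image state check is given below; it substitutes a common blur metric (variance of a Laplacian) for the clarity threshold and a frame-to-frame difference for the stability threshold, and both threshold values are placeholders rather than values from this disclosure.

import numpy as np
from scipy.ndimage import laplace

SHARPNESS_THRESHOLD = 150.0  # assumed minimum Laplacian variance (clarity)
STABILITY_THRESHOLD = 4.0    # assumed maximum mean frame-to-frame difference

def in_focused_state(frame, previous_frame):
    """Return True when the observed field of view satisfies both thresholds."""
    frame = frame.astype(float)
    previous_frame = previous_frame.astype(float)
    sharpness = laplace(frame).var()
    stability = np.abs(frame - previous_frame).mean()
    return sharpness >= SHARPNESS_THRESHOLD and stability <= STABILITY_THRESHOLD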
[0093] At block 404, the CPU of a vision screening device 102 may determine the relative position of individual images and/or frames of the series of images. In some examples, the radiation sensor captures the series of images in rapid succession, which minimizes the positional drift between individual images. The positional drift, caused by shifts in radiation sensor location and movement of the patient, shifts the location of the pupils within the region of interest between images captured by the radiation sensor. Accordingly, the CPU causes the radiation sensor to rapidly capture the series of images such that a position of the pupil in a preceding image can be utilized to identify potential locations of pupils in a subsequent image of the series of images, as the previous location of the pupils can identify regions of interest for analysis. The determination of pupil position based on a previous pupil position associated with a previous image is discussed in greater detail by FIG. 6.
[0094] At block 406, the CPU of a vision screening device 102 may perform gaze calculations that identify deviations of the pupil centers for both eyes. For instance, the CPU may enhance the contrast of the series of images such that pupil edge detection is improved. Additionally, the CPU may detect the pupil edge and further perform pupil fitting to determine pupil center coordinates and a pupil radius for the image. Further, the CPU may operate to determine glint position from the series of images, indicated by positional coordinates associated with individual eyes captured by the series of images. Accordingly, the system 200 may remove the glint from the image, determine a gaze deviation based at least on the positional coordinates of the glint, and apply a Gaussian low-pass filter (LPF) (e.g., a Gaussian filter) to smooth the radiation intensity signal such that further signal processing can be performed.
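As a non-limiting sketch, the smoothing operation may resemble the following, in which a one-dimensional radiation intensity signal is filtered with a Gaussian low-pass filter; the sigma value is an assumed example rather than a prescribed parameter.

    # Illustrative sketch only: smooths a 1-D radiation intensity signal with
    # a Gaussian low-pass filter before further signal processing. The sigma
    # value is an assumption chosen for illustration.
    import numpy as np
    from scipy.ndimage import gaussian_filter1d

    def smooth_intensity(profile, sigma=2.0):
        profile = np.asarray(profile, dtype=float)
        return gaussian_filter1d(profile, sigma)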
[0095] At block 408, the CPU may determine a refractive error for each eye of the patient. In some examples, the CPU can determine a spherical error (SE) slope based at least on one or more radiation intensity profiles determined from the series of images captured by the radiation sensor. For example, the CPU may extract a profile box that encompasses a range of detected radiation intensities for an image. Additionally, the CPU can determine the SE slope from a linear function fitted to the radiation intensity profile along a primary axis of the extracted profile box. It should be noted that the determination of SE slope and refractive error is discussed in greater detail by FIG. 7.
[0096] At block 410, the CPU may generate a report based at least on the refractive error calculation for the patient. For instance, the CPU can, after determining the SE slope and the refractive error for individual images of the series of images, generate a report that lists the refractive error of Sphere (Ds), Cylinder (De), and Axis (Ax) of the Cylinder as well as gaze deviation, pupil size, pupil distance, and other pupil information for both eyes. Additionally, the CPU may include, in the generated report, a referral for further treatment, additional vision tests, a prescription for the individual eyes, and/or additional actions to be taken by the user 204. Accordingly, the CPU can present the generated report to the user 204 and/or the patient 202 via the user interface 218, cause a communication interface to transmit the report to an external database 118, and/or cause a communication interface to transmit the report to an external vision screening system 120 for further analysis or action.
[0097] FIG. 5 provides a flow diagram illustrating an example method 500 for vision testing, as described herein. The method 500 is illustrated as collections of blocks in a logical flow graph, which represents a sequence of operations that can be implemented in hardware, software, or a combination thereof. In the context of software, the blocks represent computer-executable instructions stored on one or more computer-readable storage media that, when executed by CPU(s), perform the recited operations. Generally, computer-executable instructions include routines, programs, objects, components, data structures, and the like that perform particular functions or implement particular abstract data types. The order in which the operations are described is not intended to be construed as a limitation, and any number of the described blocks can be combined in any order and/or in parallel to implement the method 500. In some embodiments, one or more blocks of the method 500 can be omitted entirely.
[0098] At block 502, the CPU can cause a radiation sensor to capture a series of images for a vision test. In the example method, the CPU can determine at least a first set of radiation point sources, associated with the eccentric radiation source 206, to be activated for a first image. In some examples of the method, the CPU can determine a set of radiation point sources to be activated for each image of the series of images. In some additional examples, the CPU can determine multiple sets of radiation point sources to be activated for one or more images of the series of images. As noted above with respect to FIG. 2A, the eccentric radiation source 206 can be configured such that a plurality of radiation point sources is positioned at meridians indicating rotational positioning and eccentricities indicating the radial positioning of the radiation point sources. The eccentric radiation source 206 can include one or more meridians and one or more eccentricities where the radiation point sources may be positioned. The individual sets of radiation point sources for the capture of the one or more images of the series of images may be comprised of radiation point sources selected from any combination of meridians and/or eccentricities. However, in some examples, it may be common for the sets of radiation point sources to be comprised of the radiation point sources from a single eccentricity or a single meridian.
[0099] At block 504, the CPU may detect one or more pupils on an image and/or from image data associated with a face of a patient 202. In the example method, the CPU can execute facial recognition algorithms for the system 200, the user device 216, and/or the vision screening system 120 (FIG. 1). Additionally or alternatively, a user 204 of the system 200 can indicate that the image includes a face. In some examples, the CPU can identify the pupils as a first region of interest and a second region of interest based at least in part on the reflected radiation collected by the radiation sensor 212. Further, the CPU may detect and define the boundaries of the first region of interest and the second region of interest such that the pupils of the patient 202 are substantially or wholly encompassed by the regions of interest. In some additional examples of block 504, the radiation sensor may capture an image and the CPU may transmit the image, via a network, to the vision screening system 120. Additionally, the vision screening system 120 can identify the pupils as the first region of interest and the second region of interest based at least on the reflected radiation collected by the radiation sensor 212. Further, the vision screening system 120 may detect and define the boundaries of the first region of interest and the second region of interest within the image such that the pupils of the patient 202 are substantially or wholly encompassed by the regions of interest.
[00100] In some examples of block 504, the eccentric radiation source 206 may contribute to the inability of standard methods of pupil detection to identify the pupils within the first region of interest and the second region of interest of the series of images. In particular, the eccentric radiation source 206 may prevent a fully circular pupil from being detected by standard methods and may cause additional anomalies in pupil detection. Additionally, standard methods attempting to match the eccentrically illuminated pupil to calibration curves may fail to detect one or more pupils for the patient 202 due to the anomalous reflected light associated with the eccentric radiation source 206. Accordingly, in the example method, the CPU may operate a neural network algorithm trained to detect pupils illuminated by the eccentric radiation source 206. In some example methods, the neural network can be trained by utilizing manually graded test data sets and/or data sets comprised of previously analyzed images associated with feedback indications that provide data of successful identification of pupils and failed identification of pupils by the neural network. Additionally or alternatively, the neural network may be trained to identify eccentrically illuminated pupils independent of the system 200 and then deployed to perform pupil identification in association with the system 200. Further, the neural network may store images that have been analyzed by the neural network for manual grading and future training data sets in the database 118, in the patient screening components 126, or in association with memory 320.
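A non-limiting, hypothetical sketch of such a pupil-detection network is shown below. The layer sizes, the 64x64 region-of-interest crop, and the three-value output (center coordinates and radius) are assumptions chosen for illustration; any network trained on eccentrically illuminated pupils could be substituted.

    # Illustrative sketch only: a minimal convolutional network mapping a
    # grayscale region-of-interest crop to pupil-center coordinates and a
    # radius. Layer sizes and the 64x64 crop are illustrative assumptions.
    import torch
    import torch.nn as nn

    class PupilNet(nn.Module):
        def __init__(self):
            super().__init__()
            self.features = nn.Sequential(
                nn.Conv2d(1, 8, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
                nn.Conv2d(8, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            )
            self.head = nn.Linear(16 * 16 * 16, 3)  # (center_x, center_y, radius)

        def forward(self, roi):  # roi: (batch, 1, 64, 64) crop of a region of interest
            return self.head(self.features(roi).flatten(1))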
[00101] At block 506, the CPU may determine the pupil position within the region of interest, the pupil diameters, and the inner pupil distance. In the example method, the CPU can analyze the first region of interest to determine a first horizontal position and a first vertical position of a first pupil of the patient 202. Similarly, the CPU can analyze the second region of interest to determine a second horizontal position and a second vertical position of a second pupil of the patient 202. It should be noted that the first horizontal position, the second horizontal position, the first vertical position, and the second vertical position may identify positions for the first pupil and the second pupil in a two-dimensional grid, a three-dimensional mesh, or other coordinate system that allows the position of the pupils to be tracked. Additionally, the system 200 can determine a first pupil diameter, a second pupil diameter, and the inner pupil distance for the patient 202. Further, the first pupil diameter, the second pupil diameter, and the inner pupil distance can be determined based at least on the coordinate system, an estimated distance between the radiation sensor 212 and the patient 202 (e.g., the system 200 is to be placed approximately 1 meter or approximately 3 feet from the patient 202), and/or other identification of an approximate size of the pupil features.
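As a non-limiting sketch, the conversion from pixel-space pupil measurements to the pupil diameters and the inner pupil distance may resemble the following; the pixels-per-millimeter scale factor is a hypothetical calibration value assumed for the nominal working distance of approximately 1 meter.

    # Illustrative sketch only: converts pupil measurements from pixel
    # coordinates to millimetres. The px_per_mm scale is a hypothetical
    # calibration value tied to the assumed working distance.
    import math

    def pupil_metrics(center_1, center_2, diameter_1_px, diameter_2_px, px_per_mm=10.0):
        # center_1, center_2: (x, y) pupil centers in pixels
        inner_pupil_distance_mm = math.dist(center_1, center_2) / px_per_mm
        return {
            "pupil_1_diameter_mm": diameter_1_px / px_per_mm,
            "pupil_2_diameter_mm": diameter_2_px / px_per_mm,
            "inner_pupil_distance_mm": inner_pupil_distance_mm,
        }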
[00102] At block 508, the CPU may determine a glint position for the first pupil and/or the second pupil. For example, the glint position may be determined based at least on an intensity of reflected light that exceeds an intensity threshold and/or is a position associated with the highest intensity of reflected light within an image. The CPU may determine intensity values for individual positions and/or pixels of the image and/or the image data captured by the radiation sensor. Additionally, the CPU may compare intensity values associated with the image and determine an absolute maximum intensity value and/or one or more local maximum intensity values within the first region of interest and/or the second region of interest. Further, the CPU may identify the position and/or pixels associated with the absolute maximum intensity value for the first region of interest and/or the second region of interest as the glint position for the first pupil and/or the second pupil. Alternatively, the CPU may identify a region within the first region of interest and/or the second region of interest associated with an intensity of reflected light that exceeds the intensity threshold and determine that the region is associated with the glint position.
[00103] At block 510, the CPU may determine a glint intensity and an image contrast for the first pupil and/or the second pupil. For example, the glint intensity can indicate a luminous intensity value for the captured radiation at the glint position determined in block 508. The glint intensity can indicate data values for radiant energy, radiant energy density, radiant exposure, and/or other values indicating an amount of radiation captured by the radiation sensor for the glint position. Additionally, the image contrast can indicate a differential between the intensity values within the region of interest surrounding the pupil and the intensity values associated with the surrounding positions and/or pixels. Accordingly, the CPU can identify how intense the radiation captured by the radiation sensor 212 is within the region of interest and utilize that information to identify a differential between the radiation collected within the region of interest and the radiation collected at surrounding positions.
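A non-limiting sketch of these two determinations is provided below; the definition of image contrast as the difference between the region-of-interest mean and the surrounding mean is an assumed simplification of the contrast measure described above.

    # Illustrative sketch only: locates the glint as the brightest pixel in a
    # region of interest and reports a simple contrast figure. The contrast
    # definition (ROI mean minus surround mean) is an assumed simplification.
    import numpy as np

    def glint_and_contrast(image, roi_slices, surround_slices):
        roi = image[roi_slices]
        row, col = np.unravel_index(np.argmax(roi), roi.shape)
        glint_intensity = float(roi[row, col])
        image_contrast = float(roi.mean() - image[surround_slices].mean())
        return (row, col), glint_intensity, image_contrast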
[00104] In some examples of block 510, the CPU may utilize the glint position, the glint intensity, and the image contrast for the first pupil and/or the second pupil to identify focusing logic adjustments for utilization in determining whether an image observed by the radiation sensor 212 is in a focused state based on one or more image state thresholds. For example, the CPU may utilize, individually or in combination, the glint intensity and the image contrast to determine a focusing quality for the image. In particular, a high glint intensity and/or a high image contrast may indicate that the image has a high focusing quality. Similarly, a low glint intensity and/or a low image contrast may indicate that the image has a low focusing quality. Additionally, the CPU may determine whether the glint intensity and/or the image contrast exceed a glint intensity threshold and/or an image contrast threshold to identify whether the image is associated with a focusing quality sufficient for further analysis. Alternatively, the CPU may determine whether the glint intensity and/or the image contrast satisfies an image contrast threshold that indicates, when satisfied, that the focusing logic of the radiation sensor should be adjusted and an additional image captured.
[00105] At block 512, the example method can cause the system 200 to determine whether an image observed by the radiation sensor 212 is in a focused state and whether a plurality of images can be captured. In the example method, a first image can be captured by the radiation sensor 212 while the patient 202 is illuminated by the radiation source 206 as described by block 502. After the variables from blocks 504-510 are determined by the system 200, the system 200 may determine whether one or more image state thresholds are satisfied by the image. In some examples, the one or more image state thresholds may include: 1) determine whether a pupil diameter (e.g., the first pupil diameter and/or the second pupil diameter) is greater than 3mm and less than 10mm; 2) determine whether an inner pupil distance is greater than 30mm and less than 100mm; 3) determine that a glint intensity is greater than 220; and 4) determine that image contrast is greater than 20. Further, the system can determine a set of image state thresholds to be satisfied for the image to be in a focused state and approved by the system 200 (e.g., the system can determine that all four image state thresholds are to be satisfied for the image to be in a focused state). In response to a determination that the image is in a focused state and satisfies the set of image state thresholds, the system 200 can capture the series of images. It should be noted that in some examples, the system 200 can iteratively determine whether each image of the series of images satisfies the set of image state thresholds to maintain the focused state for the series of images. Additionally or alternatively, in some additional examples, the system 200 may determine whether a subset of the series of images passes the set of image state thresholds. Further, if a minimum number of consecutive images (e.g., the subset of the series of images) satisfy the set of image state thresholds and can be determined to be in a focused state, the system 200 can capture the remaining images for the series of images without further scrutiny.
[00106] At block 512, the example method can include a determination that the image captured by the system 200 does not satisfy one or more of the image state thresholds. In some examples, the system 200 can execute corrective procedures to modify the distance between the system 200 and the patient 202 such that updated image variables (e.g., updated pupil diameter, pupil distance, glint intensity, and contrast measurements based on a modified distance between the system 200 and the patient 202) are generated that satisfy the set of image state thresholds. In particular, where the pupil diameter and/or the pupil distance is determined to be below a lower threshold, the system 200 can indicate that the patient 202 is too far from the system. Additionally, where the pupil diameter and/or the pupil distance is determined to be above an upper threshold, the system 200 can indicate that the patient 202 is too close to the system. Further, if the pupil diameter and/or the pupil distance consecutively satisfy the related image state thresholds a minimum number of times, the system 200 can indicate that the patient 202 is in a good position and/or capture the series of images.
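A non-limiting sketch of the four example image state thresholds enumerated above is provided below; the numeric limits simply mirror the examples in this paragraph and are not limiting.

    # Illustrative sketch only: evaluates the four example image state
    # thresholds described above to decide whether an image is in a
    # focused state. The numeric limits mirror the examples given herein.
    def in_focused_state(pupil_diameter_mm, inner_pupil_distance_mm,
                         glint_intensity, image_contrast):
        return (3.0 < pupil_diameter_mm < 10.0 and
                30.0 < inner_pupil_distance_mm < 100.0 and
                glint_intensity > 220 and
                image_contrast > 20)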
[00107] Accordingly, at block 512 of the example method, the system can determine whether the image or the series of images is in a focused state. In examples where the system 200 determines that the image(s) are in a focused state, the plurality of images can be captured at block 514. In examples where the system 200 determines that the image(s) are not in a focused state, the image(s) can be discarded and/or disregarded and the system can return to block 504.
[00108] FIG. 6 provides a flow diagram illustrating an example method 600 for vision testing, as described herein. The method 600 is illustrated as collections of blocks in a logical flow graph, which represents a sequence of operations that can be implemented in hardware, software, or a combination thereof. In the context of software, the blocks represent computer-executable instructions stored on one or more computer-readable storage media that, when executed by CPU(s), perform the recited operations. Generally, computer-executable instructions include routines, programs, objects, components, data structures, and the like that perform particular functions or implement particular abstract data types. The order in which the operations are described is not intended to be construed as a limitation, and any number of the described blocks can be combined in any order and/or in parallel to implement the method 600. In some embodiments, one or more blocks of the method 600 can be omitted entirely.
[00109] At block 602, the CPU can detect, for a first image of a series of images received from block 514, one or more pupils within the first image of the patient 202. Additionally, the CPU may determine a first region of interest associated with a first pupil and/or a second region of interest associated with a second pupil. Further, the CPU may determine a first pupil center position for the first image of the series of images.
[00110] At block 604, the CPU may detect, based at least on the pupil center position for the first image, one or more additional regions of interest for an additional image. In some examples, the one or more additional regions of interest may be determined based at least on the pupil center position of the first image because a high rate of image capture (e.g., 25 frames can be captured in less than 2.5 seconds) limits the pupil center drift between frames. Accordingly, regions of interest for a successive image in the series of images (e.g., the additional image) may be determined based at least on the pupil center for the preceding image in the series of images (e.g., the first image). In some additional examples, the difficulty in identifying eccentrically illuminated pupils may be compensated for by the low pupil center drift between frames. In particular, the CPU may be configured to generate a low difficulty image for pupil detection via activation of high success radiation point source sets of the eccentric radiation sources 206 for the first image. Additionally, once the one or more pupils are identified for the first image and the pupil center position determined, the CPU can utilize the low pupil center position drift to restrict the one or more additional regions of interest for the additional image and reduce the threshold for pupil detection by the neural network and/or the system 200.
[00111] At block 606, the example method may cause the CPU to identify, based at least on the one or more additional regions of interest, an additional pupil center position for the additional image. As noted above, limited pupil center drift and the additional regions of interest identified based at least on the pupil center location identified for the preceding image (e.g., the first image) enable the identification of the additional pupil center under low eccentric illumination scenarios, asymmetric illumination scenarios, and other scenarios that may be difficult for standard methods to analyze. Further, the additional pupil center position can be utilized to determine one or more further regions of interest for a further image that succeeds the additional image. Accordingly, at block 608, the utilization of a preceding pupil center position for a preceding image can enable the identification of a successive pupil center position for a successive image by the neural network.
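A non-limiting sketch of this region-of-interest propagation is provided below; the 40-pixel half-width of the search window is an assumption chosen for illustration.

    # Illustrative sketch only: restricts the pupil search in a successive
    # frame to a window centered on the pupil position found in the preceding
    # frame, exploiting the low pupil-center drift between rapidly captured
    # frames. The 40-pixel half-width is an illustrative assumption.
    def next_region_of_interest(prev_center, image_shape, half_width=40):
        x, y = prev_center
        height, width = image_shape[:2]
        x0, x1 = max(0, x - half_width), min(width, x + half_width)
        y0, y1 = max(0, y - half_width), min(height, y + half_width)
        return (slice(y0, y1), slice(x0, x1))  # usable to index the next frame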
[00112] FIG. 7 provides a flow diagram illustrating an example method 700 for vision testing, as described herein. The method 700 is illustrated as collections of blocks in a logical flow graph, which represents a sequence of operations that can be implemented in hardware, software, or a combination thereof. In the context of software, the blocks represent computer-executable instructions stored on one or more computer-readable storage media that, when executed by CPU(s), perform the recited operations. Generally, computer-executable instructions include routines, programs, objects, components, data structures, and the like that perform particular functions or implement particular abstract data types. The order in which the operations are described is not intended to be construed as a limitation, and any number of the described blocks can be combined in any order and/or in parallel to implement the method 700. In some embodiments, one or more blocks of the method 700 can be omitted entirely.
[00113] At block 702, the CPU may generate normalized images from individual images of a series of images. For example, the CPU may utilize the individual images to generate 0 degree rotation corrected, 60 degree rotation corrected, and 120 degree rotation corrected images to normalize the series of images against rotational effects caused by radiation collected from radiation point sources on a 0 degree meridian, a 60 degree meridian, and a 120 degree meridian of the eccentric radiation source 206. In particular, the CPU may determine a primary axis for individual images captured by the radiation sensor, wherein the primary axis may be indicated by the largest range of reflected light intensity detected by the radiation sensor 212. Additionally or alternatively, the CPU may determine the primary axis for the individual images captured by the radiation sensor, wherein the primary axis may be determined based on the activation of individual radiation point sources of the eccentric radiation source 206 such that the primary axis is aligned with a meridian of the eccentric radiation source 206. Accordingly, while the eccentric radiation source 206 has been described as having a 0 degree meridian, a 60 degree meridian, and a 120 degree meridian, the CPU may operate to normalize the axis of any meridians based at least on the primary axis associated with the individual images.
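A non-limiting sketch of the rotation correction is shown below; scipy.ndimage.rotate is merely one of several interpolating rotations that could be used, and the interpolation order is an assumption.

    # Illustrative sketch only: rotates an image so that the meridian of the
    # active radiation point sources (0, 60, or 120 degrees) aligns with a
    # common horizontal primary axis prior to profile extraction.
    from scipy.ndimage import rotate

    def normalize_rotation(image, meridian_degrees):
        return rotate(image, angle=-meridian_degrees, reshape=False, order=1)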
[00114] At block 704, the CPU may extract a radiation intensity profile from each individual image of the series of normalized images based at least on a pupil diameter associated with the individual image. In particular, the CPU can extract a profile of the radiation intensity detected by the radiation sensor 212 for an image that captures the range of radiation intensity reflected by the one or more retinas, via the one or more pupils. As noted above, the CPU can determine a primary axis associated with the image, wherein the primary axis may be utilized to identify the radiation intensity profile to be extracted from the image. Additionally, the radiation intensity profile may be a series of radiation intensity values obtained from a normalized image along the primary axis at the pupil center. The radiation intensity values may be point values identified along the primary axis or an average of radiation intensity values at positions along the primary axis.
[00115] For example, at block 704, the CPU may identify a profile box aligned to and centered on the primary axis. Additionally or alternatively, the CPU may determine that the profile box includes a radiation intensity maximum and a radiation intensity minimum along the primary axis. The CPU may determine that the profile box is comprised of a first dimension and a second dimension, wherein the first dimension may indicate a first number of radiation point values or pixels associated with radiation point values that are substantially parallel to the primary axis while the second dimension may indicate a second number of radiation point values or pixels that are substantially perpendicular to the primary axis (e.g., the first dimension indicates that the profile box is comprised of 18 pixels along the primary axis and the second dimension indicates that the profile box is comprised of 11 pixels across the primary axis). The radiation intensity values for the radiation intensity profile may be determined for each position along the first dimension by averaging the radiation point values along the second dimension (e.g., for the first position along the primary axis, the CPU may average the 11 radiation point values in the second dimension at the first position to determine a first radiation point value for the radiation profile). Additionally, the profile of the radiation intensity can be extracted based at least on the pupil diameter determined for the individual image. For example, the profile extracted from the individual image can be an 18 x 11 pixel box, depicting a range of radiation intensity for the individual image, where the pupil diameter is less than 4mm. In some additional examples, the profile extracted from the individual image can be a 25 x 11 pixel box where the pupil diameter exceeds 4mm. Accordingly, the radiation intensity profile for the primary axis of the normalized image may be generated from the radiation point values of the profile box.
[00116] At block 706, the example method can determine slopes for each radiation source eccentricity. In some examples, the CPU can process the profile box extracted from each individual image of the series of images such that a SE slope is determined for each eccentricity utilized by the radiation source 206 to produce the series of images. For example, the CPU may process the radiation intensity profile box and determine a linear function fitted to the data results to produce a slope associated with the eccentricity of the radiation point sources active when the image was captured. As noted above, the radiation intensity within the profile box may be averaged to generate, for example, a 1x18 or 1x24 pixel curve representing the radiation intensity profile for the normalized image. In some additional examples, each image of the series of images may be associated with a radiation point source that is further associated with a meridian and an eccentricity. Additionally, the linearized slope of the radiation intensity profile may be correlated with a refractive error based at least in part on the meridian (e.g., the 0 degree meridian, the 60 degree meridian, and/or the 120 degree meridian) and the eccentricity (e.g., the first eccentricity, the second eccentricity, the third eccentricity, and/or the fourth eccentricity at different radial distances from the radiation sensor) associated with the radiation point source that is active when each image is captured. Accordingly, for refractive errors along a meridian, different central profile slopes may correspond to different eccentricities.
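A non-limiting sketch of the profile-box extraction, averaging, and slope fitting is provided below; the 18- and 25-pixel box lengths and the 4mm pupil-diameter cutoff follow the examples above, while the centering arithmetic is an assumption for illustration.

    # Illustrative sketch only: averages a profile box across its 11-pixel
    # second dimension and fits a line to the resulting intensity profile to
    # obtain a spherical error (SE) slope along the primary axis.
    import numpy as np

    def se_slope(normalized_image, center_row, center_col, pupil_diameter_mm):
        length = 18 if pupil_diameter_mm < 4.0 else 25   # box length along primary axis
        half_len, half_wid = length // 2, 11 // 2
        box = normalized_image[center_row - half_wid: center_row + half_wid + 1,
                               center_col - half_len: center_col - half_len + length]
        profile = box.mean(axis=0)                # average across the second dimension
        slope, _ = np.polyfit(np.arange(profile.size), profile, 1)
        return slope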
[00117] At block 708, the CPU may call neural networks associated with the meridians of the radiation source 206 to generate refractive errors from the profile slopes. In particular, the CPU may utilize the neural networks to correlate the linear function and/or a linearized slope (e.g., an SE slope) of the radiation intensity profile with a refractive error for the primary axis of the normalized image, the meridian of the image, and/or the set of radiation point sources that are active when the image is captured. For example, the CPU may generate a Sphere (Ds), Cylinder (De), and Axis (Ax) of the Cylinder from the SE slope values at the three meridians. Additionally, the CPU may generate the three parameters (e.g., Ds, De, and Ax) for the right eye and the left eye associated with the patient 202. For some additional examples, and as noted above, the refractive error for a meridian can be determined by the neural network based at least on correlations between different SE slopes and the active radiation point sources during image capture. The neural networks may be configured to correlate the SE slope of an image with the refractive error for the patient 202 by utilizing variations in radiation intensity caused by radiation point sources at different eccentricities along one meridian. As noted above, the neural networks may be trained by evaluated SE slopes and known refractive errors at the meridians from previously diagnosed images of patient eyes. Additionally, the pupil size of an eye may be utilized as an input to further correlate SE slopes with refractive errors for the neural networks. Further, the CPU may call one or more neural networks with the same structure to determine the refractive errors at the meridians (e.g., the 0 degree meridian, the 60 degree meridian, and/or the 120 degree meridian) of the radiation source 206. Alternatively, the CPU may call one unified neural network with a more complicated structure (e.g., more neurons and more weights) to determine the overall eye refractive error, Ds, De and Ax. The radiation intensity profiles may be utilized by the neural network(s) to determine Ds, De, and Ax utilizing calculations discussed by U.S. Pat. App. Pub. No. 2017/0027440 Al, the entire disclosure of which, except for any definitions, disclaimers, disavowals, and inconsistencies, is incorporated herein by reference.
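A non-limiting, hypothetical sketch of one such per-meridian network is shown below. Three instances (one per meridian) would feed the Ds, De, and Ax computation; the layer sizes, the four-eccentricity input, and the use of pupil size as an additional input feature are assumptions for illustration.

    # Illustrative sketch only: a minimal per-meridian network mapping the SE
    # slopes measured at several eccentricities, plus pupil size, to a
    # refractive error for that meridian. The architecture is an assumption.
    import torch
    import torch.nn as nn

    class MeridianRefractionNet(nn.Module):
        def __init__(self, num_eccentricities=4):
            super().__init__()
            self.net = nn.Sequential(
                nn.Linear(num_eccentricities + 1, 16), nn.ReLU(),  # slopes + pupil size
                nn.Linear(16, 1),                                  # refractive error (diopters)
            )

        def forward(self, slopes, pupil_size_mm):
            x = torch.cat([slopes, pupil_size_mm.unsqueeze(-1)], dim=-1)
            return self.net(x)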
[00118] In general, the described systems can utilize an eccentric radiation source to simplify the vision screening system such that excess complexity and operational difficulties can be eliminated from the system. Additionally, the described systems can be configured to utilize novel logic and algorithms for focusing collected radiation for image capture, capturing a series of images, and calculating refractive error for a patient. For example, the system can synchronize activation of a radiation source and an image capture rate of a radiation sensor such that a series of images is captured within a timeframe that can minimize positional drift and reduce calculation loads for the processor. Further, the system can be configured to focus the radiation captured by the radiation sensor, normalize the images produced by the radiation sensor, and identify key parameters that enable the utilization of neural networks to determine refractive errors for the eyes of the patient and generate recommendations for future actions.
[00119] The foregoing is merely illustrative of the principles of this disclosure. Though the refractive error test is used as an example, the vision screening device, system and methods may be used to perform different types of vision screening tests and analysis. For example, the vision screening tests may include ocular alignment testing, ocular reflexes testing, estimation of pupil size, pupil distance, and eye gaze deviation, identification of vision abnormalities such as myopia, hyperopia, astigmatism, anisometropia, strabismus, anisocoria and the like.
[00120] FIG. 8 is a pictorial flow diagram illustrating an example workflow 800 for administering one or more of the vision screening tests above. The vision screening tests may be administered using the vision screening device 230, for example. As described with reference to FIG. 2B, the vision screening device 230 may include a first display 252 facing a patient, and a second display 258 facing a user of the vision screening device 230. An illustration of content displayed on the first display 252 at each step of the workflow 800 is shown in column 802, while the content displayed on the second display 258 for the corresponding step of the workflow 800 is shown in column 804. The steps of the workflow 800 are illustrated as collections of blocks in a logical flow graph, which represents a sequence of operations that can be implemented in hardware, software, or a combination thereof. In the context of software, the blocks represent computer-executable instructions stored on one or more computer-readable storage media that, when executed by processor(s), e.g., the CPU 314 or the processor(s) of the vision screening device 230, perform the recited steps. In some examples, the workflow 800 may be implemented as an app running on the processor 314, and the app may provide step-by-step guidance to the user in administering the vision screening test(s). In addition, the app may provide informational material to the patient educating the patient on the vision screening test being administered.
[00121] At step 806, the processor(s) begins a vision screening session to be administered to a patient. In some examples, step 806 may be performed in response to an input from a user of the vision screening device 230 selecting a vision screening test to be administered e.g., by using a touch interface of the second component 234. For example, vision screening tests supported by the vision screening device 230 may include photorefraction screening tests, visual acuity screening tests, dynamic eye tracking tests, accommodation tests, color vision screening tests, and the like. The processor(s) may generate content 808, 810 and cause the first display 252 to present the welcome screen 808 to the patient, and cause the second display 258 to present content 810 to the user, which may include one or more of the visual elements described with reference to FIG. 2B. For example, the content 810 may comprise a user interface including the patient’s information 262, the icon panel 272 and the like.
[00122] At step 812, the vision screening device 230 may acquire image(s) of the eye(s) of the patient. For example, the processor(s) may cause the NIR radiation source 280 or the visible light source 288 to emit radiation illuminating the patient during a period of time. The vision screening device 230 may present an attention-attracting video 814 on the first display 252 to attract attention of the patient and hold the patient’s gaze towards the vision screening device 230. For example, the video 814 may include dynamic light patterns and/or sounds such as bird sounds. During the period of time the patient is illuminated, the processor(s) may cause the NIR camera 282 or the visible light camera 284 to capture image(s) and/or video 816 of the patient’s eye(s). The processor(s) may also determine a position of pupils and/or lenses of the eye(s) based on the captured image(s) and/or video 816, as described with reference to FIGS. 4-6. The vision screening device 230 may present the captured image(s) and/or video 816 of the eye(s) to the user via the second display 258. In addition, the processor(s) may determine a location of the patient relative to the vision screening device 230 e.g., based on an output of the range sensor 286 or the determined location of the pupils and/or lenses of the eye(s), and provide an indication to the user regarding the correctness of location of the patient. For example, each vision screening test may have a recommended distance between the vision screening device 230 and the patient and/or a recommended position of the patient relative to the vision screening device 230. In the instance that the patient’s location and/or position does not satisfy the recommendation of the vision screening test being administered, the processor(s) may provide instructions to the user, via the second display 258, indicating that the patient and/or the vision screening device 230 needs to be moved, along with adjustments required to bring the patient into a correct position for administering the vision screening test. For example, the user may be instructed to adjust the position of the vision screening device 230 and/or the patient and the processor(s) may re-run the step 812. In some examples, the vision screening device 230 may use eye tracking techniques on a video feed of the patient e.g., video captured by the visible light camera 284, to produce a focused and centered image of the eye(s) on the second display 258.
[00123] At step 818, the vision screening device 230 conducts the first vision screening test selected by the user. One or more visual stimuli corresponding to each vision screening test supported by the vision screening device 230 may be stored in memory e.g., the memory 320. The processor 314 may access the one or more visual stimuli from the memory and present the visual stimuli 820 on the first display 252 to the patient. The visual stimuli may be associated with timing information e.g., indicative of duration of display of each visual stimulus. In some examples, the duration of display of each visual stimulus may be controlled by the user of the vision screening device 230. During the presentation of the visual stimuli to the patient, the processor 314 may present, on the second display 258, information 822 relevant to the vision screening test being administered. The information 822 may be indicative of measurements determined by the vision screening device 230 during the presentation of the visual stimuli to the patient. For example, the vision screening device 230 may detect pupils, retinas, and/or lenses of the eye(s) of the patient, and acquire data comprising images and/or video data of the pupils/retinas/lenses, and the like. The processor(s) may analyze the data to determine one or more measurements associated with the patient, such as an accommodation of a lens of the eyes of the patient, motion information associated with the eyes of the patient, the refractive error of the eye(s) of the patient, gaze angle of the eye(s) of the patient 106, and the like. The information 822 may also include testing parameters, current status of the test being administered, and/or the patient’s medical history. In some examples, step 818 may be interactive e.g., requiring input from the patient in response to the visual stimuli being presented. The input may be in the form of verbal communication or other indication from the patient to the user of the vision screening device 230 and/or directly to the vision screening device 230 e.g., using a touch interface of the first component 232.
[00124] In the example shown, a visual acuity test is being administered. In this example, the visual stimuli 820 presented to the patient, on the first display 252, may be a randomized set of characters 820. The visual stimuli 820 may also be included in the information 822 presented to the user via the second display 258, so that the user may determine if the patient is able to read the characters correctly. In other vision screening tests, the information 822 may vary. For example, in an accommodation screening test, the information 822 may comprise a close-up image and/or video feed of the eyes of the patient along with measurements of pupil size, diameter and the like determined by the processor(s) in response to visual stimuli comprising different levels of brightness. Similarly, in a screening test for amblyopia, the user may be presented with a close-up video feed of the eyes of the patient along with eye tracking output determined by the processor(s) indicating movement of the eye while a visual stimulus comprising a moving target is being presented to the patient.
[00125] At step 824, results of the first vision screening test are determined. The content
826 displayed on the first display 252 to the patient may indicate the end of the vision screening test, which may comprise a return to the initial welcome screen 808. The results of the first screening test 828 may be presented to the user via the second display 258. The results may be based on measurements made by the vision screening device 230 during step 818. For example, the refractive error of each eye may be measured using the methods described above with reference to FIGS. 4-7. The refractive errors may be compared to standardized thresholds and ranges to determine abnormalities in vision such as hypermetropia, anisometropia, amblyopia, strabismus and the like. In another example, a type of color vision abnormality may be displayed at the end of a color vision screening test. The user may add the results 828 to the patient’s information, or generate and/or print a report for the patient or for a clinician. In some examples, the user may re-do the test e.g., if the results returned are invalid, by reverting to the step 812.
[00126] At step 830, the processor(s) receive input from a user of the vision screening device 230 indicating an end of the vision screening session, or the selection of another vision screening test to administer. If another vision screening test is selected, the workflow may revert to step 812. The workflow described above illustrates the use of the vision screening device 230 to conveniently administer a series of vision screening tests, using the displays 252, 258 of the vision screening device 230 to provide the relevant content to the patient and to the user of the vision screening device 230.
[00127] The steps of the vision screening session described in the workflow 800 may be administered using the vision screening device 230. As described with reference to FIG. 2B, the vision screening device 230 may include a first display 252 facing a patient, enabling display of content 808, 814, 820, 826 to the patient during the vision screening session. As also described with reference to FIG. 2B, the vision screening device 230 may include a second display 258 facing a user of the vision screening device 230, enabling display of content 810, 816, 822, 828 to the user. In addition, as described with reference to FIGS. 2C-E, the imaging assembly 250 of the vision screening device 230 includes illumination and capture components needed for acquiring image(s) of the eye(s) at step 812, and conducting one or more vision screening tests at step 818 e.g., NIR source 280, NIR camera 282, visible light source 288, a visible light camera 284, and the like.
[00128] FIG. 9 provides a flow diagram illustrating an example method 900 for administering a vision screening test, as described herein. The method 900 is illustrated as collections of blocks in a logical flow graph, which represents a sequence of operations that can be implemented in hardware, software, or a combination thereof. In the context of software, the blocks represent computer-executable instructions stored on one or more computer-readable storage media that, when executed by CPU(s), perform the recited operations. Generally, computer-executable instructions include routines, programs, objects, components, data structures, and the like that perform particular functions or implement particular abstract data types. The order in which the operations are described is not intended to be construed as a limitation, and any number of the described blocks can be combined in any order and/or in parallel to implement the method 900. In some embodiments, one or more blocks of the method 900 can be omitted entirely.
[00129] At operation 902, the vision screening device 230 and/or one or more processors associated therewith initiates a vision screening test to be administered to a patient. For example, operation 902 may be performed in response to an input from a user of the vision screening device 230 selecting a vision screening test to be administered e.g., by using a touch interface of the second component 234 to select a vision screening test from options provided in a test selection icon of the icon panel 272. The content 808 may be displayed on a first display, and content 810 may be displayed on a second display to initiate the vision screening test.
[00130] At operation 904, the vision screening device 230 displays a visual stimulus to the patient. The visual stimulus displayed may be based on the specific vision screening test being administered. For example, for an accommodation and pupil control screening test, the visual stimuli may include light patterns that dynamically get brighter or dimmer and/or move around on the display screen facing the patient, e.g., the first display 252. In another example, for a visual acuity screening test, the visual stimuli may include text or other characters with a gradation in sizes e.g., a Snellen chart. In yet another example, for a color vision screening test, the visual stimuli may include color dot patterns e.g., an Ishihara plate.
[00131] At operation 906, the vision screening device 230 causes a radiation source to emit radiation to illuminate the eye(s) of the patient. The radiation source may be the NIR radiation source 280 or the visible light source 288, based on the vision screening test selected. For example, for a photo refraction-based screening test, the NIR radiation source 280 may generate a pattern of eccentric radiation to illuminate the eye(s) of the patient.
[00132] At operation 908, the vision screening device 230 causes a camera to capture radiation reflected by the eye(s) of the patient. The type of camera may be based on the type of illumination at operation 906. For example, when the eye(s) are illuminated by NIR radiation, the NIR camera 282 of the vision screening device 230 may be used. In the instance of the eye(s) being illuminated by visible light, the visible light camera 284 may be used to capture the reflected radiation. The cameras 282, 284 may be configured to record multiple images and/or video data of the patient’s eyes during the vision screening test.
[00133] At operation 910, the vision screening device 230 and/or one or more processors associated therewith determines, based on the captured radiation, one or more measurements associated with the eye(s). The measurement(s) may be determined by analyzing the images and/or video data obtained by the camera(s) at operation 908. The measurements may include, but not be limited to measurements associated with refractive error of the eye(s), a gaze of the patient, a location of the pupils, a diameter of the pupils, a diameter of the lens, motion data, etc. For example, the image/video of the eye(s) captured at operation 908 may be used to determine a measurement of gaze angle or gaze direction of the eye(s) of the patient, which may be tracked over time to determine a pattern of gaze angles or gaze directions during the vision screening test. In another example, a measurement of refractive error of the eye(s) may be determined based on reflected NIR radiation image captured by the NIR camera 282.
[00134] At operation 912, the vision screening device 230 determines, based on the measurement(s), a result of the vision screening test. The vision screening device 230 and/or one or more processors associated therewith may compare the measurement(s) with one or more standard values. For example, the system may be configured to receive, access, and/or determine criteria based on known standard values, indicating values of known standard behavior for patients being screened. The system may then compare the determined measurement(s) with the standard values e.g., thresholds or ranges, to determine if the patient is exhibiting normal vision behavior. The standard values may be received and/or accessed from a database e.g., database 118 and/or may be determined utilizing historical screening data and/or one or more machine learning techniques.
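As a non-limiting sketch, the comparison of a determined measurement with a standard range may resemble the following; the example range is a hypothetical placeholder rather than a clinical criterion taken from this disclosure.

    # Illustrative sketch only: compares a measured value against a standard
    # range to decide whether the patient passes or should be referred. The
    # example range is a hypothetical placeholder, not a clinical criterion.
    def screen_measurement(value, standard_range=(-1.5, 1.5)):
        low, high = standard_range
        return "pass" if low <= value <= high else "refer"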
[00135] At operation 914, the vision screening device 230 may display the result of the vision screening test via a second display e.g., the second display 258, facing the user of the vision screening device 230. The result may include testing parameters, current status of the test, measurement(s) determined during the test, a diagnosis determined based on the measurement(s), and/or a recommendation associated with the diagnosis. For example, if the measurement(s) satisfy the criteria based on standard values at operation 912, the vision screening device 230 may generate a recommendation indicating that the patient has passed the vision screening test. If the measurements do not satisfy the criteria, the system may generate a recommendation including an indication that the patient has failed the screening, an indication of a diagnosis of a type of vision deficiency exhibited by the patient, and/or a recommendation for additional screening.
Example Clauses
[00136] Examples of the present disclosure are embodied by one or more of the following clauses:
[00137] Clause 1: A vision screening device, comprising: a first housing; a second housing rotatably coupled to the first housing; a first display disposed within the first housing, the first display forming a substantially planar first front surface of the vision screening device; a second display disposed within the second housing, the second display forming a substantially planar second front surface of the vision screening device, wherein in a closed configuration of the vision screening device, the first front surface is disposed adjacent and facing the second front surface; an imaging assembly disposed within the first housing, the imaging assembly comprising: a near infra-red (NIR) radiation source, and a NIR camera configured to capture NIR radiation reflected by an eye of a patient; a processor operably connected to the first display, the second display, and the imaging assembly; and memory storing instructions that, when executed by the processor, cause the processor to: cause the first display to present a visual stimulus to the patient during a period of time; cause the NIR radiation source to illuminate the eye of the patient during the period of time; cause the NIR camera to capture radiation reflected by the eye and responsive to the visual stimulus; determine, based on the captured radiation, a measurement associated with the eye; and cause the second display to present information indicative of the measurement.
[00138] Clause 2: The vision screening device of clause 1, wherein the imaging assembly includes a third housing, and wherein the NIR radiation source and the NIR camera are at least partly disposed within the third housing.
[00139] Clause 3: The vision screening device of any of the clauses above, either alone or in combination, wherein the imaging assembly further includes a front cover disposed over the NIR radiation source and the NIR camera, wherein the front cover extends substantially coplanar with the first front surface of the vision screening device.
[00140] Clause 4: The vision screening device of any of the clauses above, either alone or in combination, wherein the imaging assembly further includes a base, wherein the NIR radiation source and the NIR camera are connected to the base, and wherein the base is operably connected to the processor of the vision screening device.
[00141] Clause 5: The vision screening device of any of the clauses above, either alone or in combination, wherein the measurement includes at least one of a pupil position, a pupil diameter, and an inner pupil distance, and the instructions further cause the processor to: detect, based on the captured radiation, a pupil of the eye; and determine, based on the detection, the measurement.
[00142] Clause 6: The vision screening device of any of the clauses above, either alone or in combination, wherein the imaging assembly further comprises a visible light source and a visible light camera, and wherein the visible light source and the visible light camera are at least partly disposed within the third housing.
[00143] Clause 7: The vision screening device of any of the clauses above, either alone or in combination, wherein the period of time is a first period of time, and the instructions further cause the processor to: cause the visible light source to illuminate the patient during a second period of time; cause the visible light camera to capture an image of a face of the patient during the second period of time; determine, based on the image, a location of the eye; generate an augmented image illustrating the eye, the augmented image including a visual identifier identifying a pupil of the eye; and cause the second display to display the augmented image of the eye.
[00144] Clause 8: The vision screening device of any of the clauses above, either alone or in combination, wherein the imaging assembly further comprises a 3D range sensor at least partly disposed within the third housing, and the instructions further cause the processor to: cause the 3D range sensor to generate a 3D point cloud of the patient; and determine, based at least in part on the 3D point cloud, a distance of the patient from the vision screening device.
[00145] Clause 9: The vision screening device of any of the clauses above, either alone or in combination, further comprising: a first back surface disposed opposite the first front surface and formed by the first housing; and a second back surface disposed opposite the second front surface and formed by the second housing, wherein the first housing and the second housing are moveable to achieve an open position of the vision screening device in which the first back surface is disposed adjacent to and facing the second back surface.
[00146] Clause 10: The vision screening device of any of the clauses above, either alone or in combination, wherein the instructions further cause the processor to: cause the second display to display one or more vision screening test options; and receive, via a touch interface of the second display, input indicative of a selection of a vision screening test of the one or more vision screening test options.
[00147] Clause 11: The vision screening device of any of the clauses above, either alone or in combination, wherein the vision screening test options include one or more of a photorefraction screening test, a visual acuity screening test, and a color vision screening test.
[00148] Clause 12: A method, comprising: initiating a vision screening test to be administered using a vision screening device, the vision screening device comprising: a first display disposed within a first housing; a second display disposed within a second housing rotatably coupled to the first housing; an imaging assembly disposed within the first housing, the imaging assembly comprising a radiation source and a camera; causing the first display to present a visual stimulus to a patient during a period of time; causing the radiation source to emit radiation during the period of time; causing the camera to capture a portion of the radiation emitted by the radiation source, wherein the portion of the radiation captured by the camera is reflected by an eye of the patient during the period of time; determining, based on the portion of the radiation, a parameter of the eye; determining, based on the parameter, a result of the vision screening test; and causing the second display to display information indicative of the result.
[00149] Clause 13: The method of clause 12, wherein the radiation source is a near-infrared (NIR) radiation source, the camera is a NIR camera, and the parameter is indicative of a refractive error of the eye.
[00150] Clause 14: The method of any of the clauses above, either alone or in combination, wherein the visual stimulus comprises one or more of: an attention-attracting video, a visual acuity screening chart, a color vision screening pattern, a moving light target, and a light display at varying levels of brightness.
[00151] Clause 15: The method of any of the clauses above, either alone or in combination, further comprising: capturing video frames of the eye of the patient during the period of time; tracking, based on the video frames, a gaze direction of the eye of the patient responsive to the visual stimulus; and determining, based on the tracked gaze direction, the parameter.
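Gaze tracking of the kind recited in Clause 15 is commonly based on the offset between the pupil center and the corneal reflection (glint) of the illuminator. The sketch below assumes per-frame pupil and glint centers are already available from upstream detection; the gain constant mapping pixel offset to degrees is a calibration value and is shown only as an assumed placeholder.

```python
import numpy as np

PIXELS_PER_DEGREE = 12.0  # assumed calibration gain; device-specific in practice

def gaze_directions(pupil_centers, glint_centers):
    """Estimate horizontal/vertical gaze angles (degrees) per video frame.

    pupil_centers, glint_centers: (N, 2) arrays of (x, y) pixel coordinates
    for N frames. For small angles the pupil-minus-glint vector is roughly
    proportional to the gaze rotation.
    """
    offsets = np.asarray(pupil_centers, float) - np.asarray(glint_centers, float)
    return offsets / PIXELS_PER_DEGREE  # (N, 2) gaze angles in degrees

def gaze_stability(angles_deg):
    """A simple fixation metric: spread of gaze angle over the period of time."""
    return float(np.std(np.linalg.norm(angles_deg, axis=1)))
```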
[00152] Clause 16: The method of any of the clauses above, either alone or in combination, wherein the vision screening device includes a range sensor, the method further comprising: causing the range sensor to capture a three-dimensional point cloud of an environment in which the patient is located; determining, based on the three-dimensional point cloud, a location of an eye of the patient; determining that the location of the eye does not satisfy a recommended location of the vision screening test; and causing the second display to display an indication that the location of the eye does not satisfy the recommended location of the vision screening test.
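The positioning check of Clause 16 may reduce to a simple range comparison once the eye location has been recovered from the point cloud. In the sketch below, the recommended working distance and the tolerance are assumed example values chosen for illustration, not requirements of the disclosure.

```python
RECOMMENDED_DISTANCE_M = 1.0   # assumed working distance for the screening test
TOLERANCE_M = 0.1              # assumed acceptable deviation

def check_patient_position(eye_xyz_m):
    """Return (ok, message) for presentation on the operator-facing display.

    eye_xyz_m: (x, y, z) location of the eye in metres, in the device frame,
    with z the distance from the imaging assembly.
    """
    distance = eye_xyz_m[2]
    error = distance - RECOMMENDED_DISTANCE_M
    if abs(error) <= TOLERANCE_M:
        return True, "Patient is at the recommended distance."
    direction = "closer to" if error > 0 else "farther from"
    return False, (f"Move the patient {abs(error):.2f} m {direction} the device "
                   f"(recommended {RECOMMENDED_DISTANCE_M:.1f} m).")
```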
[00153] Clause 17: A vision screening device, comprising: a first display; a second display moveable relative to the first display; an imaging assembly fixed relative to, and moveable with, the first display, the imaging assembly comprising: a near infra-red (NIR) radiation source, a NIR camera configured to capture NIR radiation reflected by an eye of a patient, a visible light source, and a visible light camera configured to capture an image or video of the patient; a processor operably connected to the first display, the second display, and the imaging assembly; and memory storing instructions that, when executed by the processor, cause the processor to: cause the first display to display a visual stimulus associated with a vision screening test; cause the NIR radiation source to illuminate the eye of the patient; cause the NIR camera to capture radiation reflected by the eye and responsive to the visual stimulus; determine, based on the captured radiation, a measurement associated with the eye; determine, based on the measurement, a result of the vision screening test; and cause the second display to present information indicative of the result.
[00154] Clause 18: The vision screening device of clause 17, and/or of any of the other clauses above, either alone or in combination, wherein the instructions further cause the processor to: cause visible light to illuminate the patient during a period of time; cause the visible light camera to capture a video of the eye of the patient; determine, based on the video, a location of a pupil of the eye; and display, on the second display, the video of the eye including an indication of the pupil of the eye.
[00155] Clause 19: The vision screening device of any of the clauses above, either alone or in combination, wherein the result is indicative of at least one of: passing the vision screening test, failing the vision screening test, or requiring additional vision screening.

[00156] Clause 20: The vision screening device of any of the clauses above, either alone or in combination, wherein the imaging assembly is proximate a perimeter of the first display and substantially co-planar with the first display.
[00157] The foregoing is merely illustrative of the principles of this disclosure, and various modifications can be made by those skilled in the art without departing from the scope of this disclosure. The above-described examples are presented for purposes of illustration and not of limitation. The present disclosure can also take many forms other than those explicitly described herein. Accordingly, it is emphasized that this disclosure is not limited to the explicitly disclosed methods, systems, and apparatuses, but is intended to include variations to and modifications thereof, which are within the spirit of the following claims.
[00158] As a further example, variations of apparatus or process limitations (e.g., dimensions, configurations, components, process step order, etc.) can be made to further optimize the provided structures, devices and methods, as shown and described herein. In any event, the structures and devices, as well as the associated methods, described herein have many applications. Therefore, the disclosed subject matter should not be limited to any single example described herein, but rather should be construed in breadth and scope in accordance with the appended claims.

Claims

What is claimed is:
1. A vision screening device, comprising: a first housing; a second housing rotatably coupled to the first housing; a first display disposed within the first housing, the first display forming a substantially planar first front surface of the vision screening device; a second display disposed within the second housing, the second display forming a substantially planar second front surface of the vision screening device, wherein in a closed configuration of the vision screening device, the first front surface is disposed adjacent and facing the second front surface; an imaging assembly disposed within the first housing, the imaging assembly comprising: a near infra-red (NIR) radiation source, and a NIR camera configured to capture NIR radiation reflected by an eye of a patient; a processor operably connected to the first display, the second display, and the imaging assembly; and memory storing instructions that, when executed by the processor, cause the processor to: cause the first display to present a visual stimulus to the patient during a period of time; cause the NIR radiation source to illuminate the eye of the patient during the period of time; cause the NIR camera to capture radiation reflected by the eye and responsive to the visual stimulus; determine, based on the captured radiation, a measurement associated with the eye; and cause the second display to present information indicative of the measurement.
2. The vision screening device of claim 1, wherein the imaging assembly includes a third housing, and wherein the NIR radiation source and the NIR camera are at least partly disposed within the third housing.
3. The vision screening device of claim 2, wherein the imaging assembly further includes a front cover disposed over the NIR radiation source and the NIR camera, wherein the front cover extends substantially coplanar with the first front surface of the vision screening device.
4. The vision screening device of claim 1, wherein the imaging assembly further includes a base, wherein the NIR radiation source and the NIR camera are connected to the base, and wherein the base is operably connected to the processor of the vision screening device.
5. The vision screening device of claim 1, wherein the measurement includes at least one of a pupil position, a pupil diameter, and an inner pupil distance, and the instructions further cause the processor to: detect, based on the captured radiation, a pupil of the eye; and determine, based on the detection, the measurement.
6. The vision screening device of claim 2, wherein the imaging assembly further comprises a visible light source and a visible light camera, and wherein the visible light source and the visible light camera are at least partly disposed within the third housing.
7. The vision screening device of claim 6, wherein the period of time is a first period of time, and the instructions further cause the processor to: cause the visible light source to illuminate the patient during a second period of time; cause the visible light camera to capture an image of a face of the patient during the second period of time; determine, based on the image, a location of the eye; generate an augmented image illustrating the eye, the augmented image including a visual identifier identifying a pupil of the eye; and cause the second display to display the augmented image of the eye.
8. The vision screening device of claim 2, wherein the imaging assembly further comprises a 3D range sensor at least partly disposed within the third housing, and the instructions further cause the processor to: cause the 3D range sensor to generate a 3D point cloud of the patient; and determine, based at least in part on the 3D point cloud, a distance of the patient from the vision screening device.
9. The vision screening device of claim 1, further comprising: a first back surface disposed opposite the first front surface and formed by the first housing; and a second back surface disposed opposite the second front surface and formed by the second housing, wherein the first housing and the second housing are moveable to achieve an open position of the vision screening device in which the first back surface is disposed adjacent to and facing the second back surface.
10. The vision screening device of claim 1, wherein the instructions further cause the processor to: cause the second display to display one or more vision screening test options; and receive, via a touch interface of the second display, input indicative of a selection of a vision screening test of the one or more vision screening test options.
11. The vision screening device of claim 10, wherein the vision screening test options include one or more of a photo-refraction screening test, a visual acuity screening test, and a color vision screening test.
12. A method, comprising: initiating a vision screening test to be administered using a vision screening device, the vision screening device comprising: a first display disposed within a first housing; a second display disposed within a second housing rotatably coupled to the first housing; and an imaging assembly disposed within the first housing, the imaging assembly comprising a radiation source and a camera; causing the first display to present a visual stimulus to a patient during a period of time; causing the radiation source to emit radiation during the period of time; causing the camera to capture a portion of the radiation emitted by the radiation source, wherein the portion of the radiation captured by the camera is reflected by an eye of the patient during the period of time; determining, based on the portion of the radiation, a parameter of the eye; determining, based on the parameter, a result of the vision screening test; and causing the second display to display information indicative of the result.
13. The method of claim 12, wherein the radiation source is a near-infrared (NIR) radiation source, the camera is a NIR camera, and the parameter is indicative of a refractive error of the eye.
14. The method of claim 12, wherein the visual stimulus comprises one or more of: an attention-attracting video, a visual acuity screening chart, a color vision screening pattern, a moving light target, and a light display at varying levels of brightness.
15. The method of claim 12, further comprising: capturing video frames of the eye of the patient during the period of time; tracking, based on the video frames, a gaze direction of the eye of the patient responsive to the visual stimulus; and determining, based on the tracked gaze direction, the parameter.
16. The method of claim 12, wherein the vision screening device includes a range sensor, the method further comprising: causing the range sensor to capture a three-dimensional point cloud of an environment in which the patient is located; determining, based on the three-dimensional point cloud, a location of an eye of the patient; determining that the location of the eye does not satisfy a recommended location of the vision screening test; and causing the second display to display an indication that the location of the eye does not satisfy the recommended location of the vision screening test.
17. A vision screening device, comprising: a first display; a second display moveable relative to the first display; an imaging assembly fixed relative to, and moveable with, the first display, the imaging assembly comprising: a near infra-red (NIR) radiation source, a NIR camera configured to capture NIR radiation reflected by an eye of a patient, a visible light source, and a visible light camera configured to capture an image or video of the patient; a processor operably connected to the first display, the second display, and the imaging assembly; and memory storing instructions that, when executed by the processor, cause the processor to: cause the first display to display a visual stimulus associated with a vision screening test; cause the NIR radiation source to illuminate the eye of the patient; cause the NIR camera to capture radiation reflected by the eye and responsive to the visual stimulus; determine, based on the captured radiation, a measurement associated with the eye; determine, based on the measurement, a result of the vision screening test; and cause the second display to present information indicative of the result.
18. The vision screening device of claim 17, wherein the instructions further cause the processor to: cause visible light to illuminate the patient during a period of time; cause the visible light camera to capture a video of the eye of the patient; determine, based on the video, a location of a pupil of the eye; and display, on the second display, the video of the eye including an indication of the pupil of the eye.
19. The vision screening device of claim 17, wherein the result is indicative of at least one of: passing the vision screening test, failing the vision screening test, or requiring additional vision screening.
20. The vision screening device of claim 17, wherein the imaging assembly is proximate a perimeter of the first display and substantially co-planar with the first display.
PCT/US2022/050390 2021-11-19 2022-11-18 Vision screening systems and methods WO2023091660A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
EP22896520.8A EP4432895A1 (en) 2021-11-19 2022-11-18 Vision screening systems and methods

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US17/531,320 US20220076417A1 (en) 2020-06-19 2021-11-19 Vision screening systems and methods
US17/531,320 2021-11-19

Publications (2)

Publication Number Publication Date
WO2023091660A1 true WO2023091660A1 (en) 2023-05-25
WO2023091660A9 WO2023091660A9 (en) 2024-06-20

Family

ID=86397746

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2022/050390 WO2023091660A1 (en) 2021-11-19 2022-11-18 Vision screening systems and methods

Country Status (2)

Country Link
EP (1) EP4432895A1 (en)
WO (1) WO2023091660A1 (en)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20070052924A1 (en) * 2005-09-02 2007-03-08 Nidek Co., Ltd. Optometer
US20070171372A1 (en) * 2005-12-16 2007-07-26 Nonavision, Inc. Adjustable device for vision testing and therapy
US20160262617A1 (en) * 2015-03-09 2016-09-15 Lawrence J. Gerrans Robotic Ophthalmology
US9517008B1 (en) * 2014-11-06 2016-12-13 Bertec Corporation System and method for testing the vision of a subject
US20200225706A1 (en) * 2019-01-10 2020-07-16 Samsung Electronics Co., Ltd. Method for providing image using foldable display and electronic device for supporting the same
US20210330185A1 (en) * 2020-04-24 2021-10-28 RemmedVR Sp. z o.o. System and methods for use in vision assessment to determine refractive errors and neurodegenerative disorders by ocular biomarking features
US20210345872A1 (en) * 2020-05-06 2021-11-11 Welch Allyn, Inc. Vision screening systems and methods

Also Published As

Publication number Publication date
EP4432895A1 (en) 2024-09-25
WO2023091660A9 (en) 2024-06-20

Similar Documents

Publication Publication Date Title
US10416725B2 (en) Wearable device having a display, lens, illuminator, and image sensor
CN101730500B (en) Apparatus and method for objective perimetry visual field test
US20180333092A1 (en) Portable ocular response testing device and methods of use
US10799115B2 (en) Through focus retinal image capturing
US11406257B2 (en) Vision screening device and methods
US20220076417A1 (en) Vision screening systems and methods
US11246482B2 (en) Visual acuity examination
CN110420008A (en) For determining component, computer program, system and the external member of correcting lens
US20210390692A1 (en) Detecting and tracking macular degeneration
EP4238486A1 (en) Vision screening device including color imaging
US20230218163A1 (en) Method to monitor accommodation state during visual testing
US20220039649A1 (en) Method and device for determining a refraction feature of an eye of a subject
US20220230749A1 (en) Systems and methods for ophthalmic digital diagnostics via telemedicine
US12029480B2 (en) Vision screening systems and methods
WO2023091660A9 (en) Vision screening systems and methods
EP3925519B1 (en) Vision screening systems and methods
US20210386287A1 (en) Determining refraction using eccentricity in a vision screening system
US20230404397A1 (en) Vision screening device including oversampling sensor
CN109674442A (en) A kind of self-service vision drop system and device containing built-in light source
US20230181032A1 (en) Measurements of keratometry and axial length
US20230107167A1 (en) Vision screening device, vision impairment screening system, improved vision screening process, method of assessment, and method of use
US20230047557A1 (en) Determining color vision ability using a vision screening device
JP2019103079A (en) Iris imaging apparatus and iris analysis system using the same

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22896520

Country of ref document: EP

Kind code of ref document: A1

WWE Wipo information: entry into national phase

Ref document number: 2022896520

Country of ref document: EP

NENP Non-entry into the national phase

Ref country code: DE

ENP Entry into the national phase

Ref document number: 2022896520

Country of ref document: EP

Effective date: 20240619