WO2023193038A1 - Interventional protocol in respect of human-operated machine based on processing of blepharometric data and/or other physiological parameters - Google Patents


Info

Publication number
WO2023193038A1
Authority
WO
WIPO (PCT)
Prior art keywords
measure
impairment
countermeasures
vehicle
physiological state
Application number
PCT/AU2023/000002
Other languages
French (fr)
Inventor
Scott Coles
Trefor Morgan
Original Assignee
Sdip Holdings Pty Ltd
Priority claimed from AU2022900922A0
Application filed by Sdip Holdings Pty Ltd
Publication of WO2023193038A1

Classifications

    • A61B5/1103 Detecting eye twinkling
    • A61B5/0077 Devices for viewing the surface of the body, e.g. camera, magnifying lens
    • A61B5/11 Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
    • A61B5/168 Evaluating attention deficit, hyperactivity
    • A61B5/18 Devices for psychotechnics; testing reaction times; evaluating the psychological state, for vehicle drivers or machine operators
    • A61B5/6893 Sensors mounted on external non-worn devices: cars
    • B60K28/066 Safety devices responsive to incapacity of driver, actuating a signalling device
    • B60W30/09 Taking automatic action to avoid collision, e.g. braking and steering
    • B60W40/08 Estimation of driving parameters related to drivers or passengers
    • B60W50/08 Interaction between the driver and the control system
    • B60W50/14 Means for informing the driver, warning the driver or prompting a driver intervention
    • G06V20/597 Recognising the driver's state or behaviour, e.g. attention or drowsiness
    • G06V40/161 Detection, localisation and normalisation of human faces
    • G06V40/193 Preprocessing and feature extraction of eye characteristics
    • A61B3/113 Objective instruments for determining or recording eye movement
    • A61B5/0059 Measuring for diagnostic purposes using light, e.g. diagnosis by transillumination, diascopy, fluorescence
    • A61B5/1171 Identification of persons based on the shapes or appearances of their bodies or parts thereof
    • A61B5/4082 Diagnosing or monitoring movement diseases, e.g. Parkinson, Huntington or Tourette
    • A61B5/4088 Diagnosing or monitoring cognitive diseases, e.g. Alzheimer, prion diseases or dementia
    • A61B5/4094 Diagnosing or monitoring seizure diseases, e.g. epilepsy
    • A61B5/4836 Diagnosis combined with treatment in closed-loop systems or methods
    • A61M2210/0612 Anatomical parts of the body: eyes
    • B60W2040/0818 Inactivity or incapacity of driver
    • B60W2040/0827 Inactivity or incapacity of driver due to sleepiness
    • B60W2040/0836 Inactivity or incapacity of driver due to alcohol
    • B60W2040/0845 Inactivity or incapacity of driver due to drugs
    • B60W2040/0872 Driver physiology
    • B60W2050/143 Alarm means
    • B60W2420/403 Image sensing, e.g. optical camera
    • B60W2540/22 Psychological state; stress level or workload
    • B60W2540/229 Attention level, e.g. attentive to driving, reading or sleeping
    • B60W2540/26 Incapacity
    • B60W30/12 Lane keeping
    • B60W30/14 Adaptive cruise control
    • B60W50/16 Tactile feedback to the driver, e.g. vibration or force feedback on the steering wheel or the accelerator pedal
    • G06V40/172 Classification, e.g. identification

Definitions

  • the present invention relates, in various embodiments, to technology configured to implement a cascading interventional protocol in respect of human-operated machine based on processing of blepharometric data and/or other physiological parameters.
  • some embodiments provide software and integrated systems which influence the operation of a vehicle, such as an automobile, based on monitoring of a human operator. While some embodiments will be described herein with particular reference to that application, it will be appreciated that the invention is not limited to such a field of use, and is applicable in broader contexts.
  • it is known to implement drowsiness detection technology in human-operated vehicles. For example, such technology performs monitoring functions thereby to predict the presence of potential drowsiness in respect of an operator.
  • This may include technologies such as blepharometric analysis (which involves processing time-series data representative of eyelid amplitude), blink rate/duration analysis (which, for example, considers the regularity and/or length of blinks), and facial image processing methods (for example AI systems which are configured to predict whether a facial image represents a drowsy person or an alert person).
  • any one of the terms comprising, comprised of or which comprises is an open term that means including at least the elements/features that follow, but not excluding others.
  • the term comprising, when used in the claims should not be interpreted as being limitative to the means or elements or steps listed thereafter.
  • the scope of the expression a device comprising A and B should not be limited to devices consisting only of elements A and B.
  • Any one of the terms including or which includes or that includes as used herein is also an open term that also means including at least the elements/features that follow the term, but not excluding others. Thus, including is synonymous with and means comprising.
  • exemplary is used in the sense of providing examples, as opposed to indicating quality. That is, an “exemplary embodiment” is an embodiment provided as an example, as opposed to necessarily being an embodiment of exemplary quality.
  • FIG. 1 illustrates an in-vehicle blepharometric data collection/monitoring system according to one embodiment.
  • FIG. 2 illustrates an example graph of drowsiness score against relative risk in the context of vehicle operation, using JDS technology.
  • FIG. 3 illustrates an example cascading interventional protocol.
  • a human subject’s involuntary blinks and eyelid movements are influenced by a range of factors, including the subject’s behavioural state and brain function. For example, this has been used in the past for detection of drowsiness. More broadly, analysis of data derived from eye and eyelid movements can be performed thereby to identify data artefacts, patterns and the like, and these are reflective of the subject’s behavioural state, brain function and the like.
  • the technology described herein is focussed on collection and analysis of “blepharometric data”, with the term “blepharon” describing a human eyelid.
  • the term “blepharometric data” is used to define data that describes eyelid movement as a function of time.
  • eyelid position may be recorded as an amplitude.
  • Eyelid movements are commonly categorised as “blinks” or “partial blinks”.
  • the term “blepharometric data” is used to distinguish technology described herein from other technologies which detect the presence of blinks for various purposes.
  • the technology herein is focussed on analysing eyelid movement as a function of time, typically measured as an amplitude. This data may be used to infer the presence of what would traditionally be termed “blinks”, however it is attributes of “events” and other parameters identifiable in eyelid movements which are of primary interest to technologies described herein. Events and other parameters which are identified from the processing of blepharometric data are referred to as “blepharometric artefacts”.
  • such artefacts are identifiable by application of various processing algorithms to a data set that describes eyelid position as a function of time (i.e. blepharometric data).
  • the artefacts may include:
  • Blink total duration (BTD), which is preferably measured as a time between commencement of closure movement which exceeds a defined threshold and completion of subsequent opening movement.
  • Amplitude-to-velocity ratios (AVRs).
  • the determination of blepharometric artefacts may include any one or more of:
  • Blink initiation and blink completion may be determined based on a determined “inter-blink” eyelid amplitude range, with movement outside that amplitude range being categorised as a blink.
  • a “blink” is in some embodiments defined as the pairing of positive and negative events that are within relative amplitude limits and relative position limits. There may be multiple events within a given blink, when an eyelid is outside of an “inter-blink” eyelid amplitude range.
  • velocity estimation measurements for eye closure motion and/or eye re-opening motion are also made, which may be used for the purposes of determining amplitude-to-velocity ratios (a simplified processing sketch is provided below).
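  • By way of non-limiting illustration only, the following Python sketch shows one possible way of deriving such artefacts (blink total duration and amplitude-to-velocity ratios) from a time series of eyelid amplitude samples. The threshold, sampling rate and data structure are assumptions made for this example and do not represent the specific algorithms of any embodiment.

```python
# Illustrative sketch only: derive blink total duration (BTD) and
# amplitude-to-velocity ratios (AVRs) from eyelid amplitude over time.
# Threshold, sampling rate and units are assumed for illustration.
from dataclasses import dataclass
from typing import List

@dataclass
class BlinkEvent:
    start_s: float           # commencement of closure movement
    end_s: float             # completion of subsequent re-opening movement
    total_duration_s: float  # blink total duration (BTD)
    closing_avr: float       # amplitude-to-velocity ratio (closure phase)
    reopening_avr: float     # amplitude-to-velocity ratio (re-opening phase)

def extract_blinks(amplitude: List[float], fs: float = 60.0,
                   inter_blink_threshold: float = 0.5) -> List[BlinkEvent]:
    """amplitude: eyelid openness per frame (1.0 = fully open, 0.0 = closed);
    fs: sampling rate in frames per second (assumed)."""
    dt = 1.0 / fs
    events, start = [], None
    for i, a in enumerate(amplitude):
        if start is None and a < inter_blink_threshold:
            start = i                            # movement left the inter-blink range
        elif start is not None and a >= inter_blink_threshold:
            seg = amplitude[max(start - 1, 0):i + 1]
            velocities = [(seg[j + 1] - seg[j]) / dt for j in range(len(seg) - 1)]
            peak_close_v = max((-v for v in velocities if v < 0), default=1e-9)
            peak_open_v = max((v for v in velocities if v > 0), default=1e-9)
            amp = max(seg) - min(seg)
            events.append(BlinkEvent(start_s=start * dt, end_s=i * dt,
                                     total_duration_s=(i - start) * dt,
                                     closing_avr=amp / peak_close_v,
                                     reopening_avr=amp / peak_open_v))
            start = None
    return events
```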
  • Known eyelid movement monitoring systems focus on point-in-time subject analysis.
  • commonly such technology is used as a means for assessing subject alertness/drowsiness at a specific moment, in some instances benchmarked against known data for a demographically relevant population.
  • an example is the Johns Drowsiness Scale (JDS).
  • This provides a graduated scale of drowsiness, which can be broken up into distinct tiers which each provide a respective objective level of drowsiness.
  • Technology described herein provides for controlling one or more electronic systems provided by a machine that is controlled by an operator.
  • a current physiological state of the operator is determined relative to an objective scale, for example an objective measure of impairment (such as drowsiness).
  • the scale has at least three distinct condition levels.
  • one or more operational countermeasures associated with that measure are identified. These “countermeasures” are each operational controls which can be applied to respective ones of the electronic systems.
  • a signal is provided thereby to trigger implementation of the identified one or more operational countermeasures (optionally subject to failsafe logic and the like).
  • the examples below focus primarily on a vehicle, such as an automobile, in which case the electronic systems include one or more of: (i) driver assistance systems such as cruise control and lane maintenance; (ii) entertainment systems; (iii) electronic windows; (iv) internal lighting; (v) climate control systems; and various other electronic systems in modern vehicles.
  • the technology may be applied in a wide range of other settings, for example to manage risks associated with a condition such as drowsiness, or even to encourage a condition such as drowsiness (for example via an in-vehicle entertainment system utilised by a non-operator, for example in a commercial airline).
  • the technology may be applied in settings including heavy machinery, factory equipment, smartphones and other computing devices, entertainment systems, and others.
  • Examples below are also directed predominantly to the physiological condition of drowsiness. However, it should be appreciated that this is a non-limiting example, with other embodiments being additionally/alternatively applied to other conditions (for example other conditions of impairment). These may include other forms of cognitive impairment, such as those resulting from degenerative medical ailments, injury, drug/alcohol usage, and the like. Other physiological conditions may include adverse neurological and/or cardiovascular events.
  • the condition of the operator is assessed relative to the objective scale, and countermeasures applied as a result.
  • the scale has at least three distinct levels (and preferably more), this allows for a cascading increase (or decrease) in countermeasures based on point-in-time considerations.
  • countermeasures are applied in a cascading (and preferably additive) manner as the observed level of drowsiness increases.
  • This may include modifying the way in which the vehicle is operated (for example increasing the degree of required operator input as drowsiness increases, hence rendering operation more challenging than it otherwise would be), adjusting a level of feedback/alert provided to the operator, and/or tailoring the operator’s environment (for example noise, haptic feedback, airflow, temperature, lighting, and so on). A simplified sketch of such a cascading scheme is shown below.
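  • As a hedged illustration of this cascading (and additive) structure, the sketch below maps condition levels to sets of countermeasure identifiers and signals them for implementation. The level boundaries and countermeasure names are placeholders invented for the example, not the protocol of any particular embodiment.

```python
# Illustrative sketch: cascading, additive countermeasures keyed to an
# objective impairment scale with at least three distinct condition levels.
# Level boundaries and countermeasure names are assumed for illustration.

LEVEL_COUNTERMEASURES = {
    0: [],                                               # "normal" operation
    1: ["enable_cruise_control", "enable_lane_keeping"],
    2: ["enhance_lane_feedback", "raise_fcw_ldw_sensitivity"],
    3: ["open_window_slightly", "lower_cabin_temperature"],
    4: ["disable_cruise_control", "increase_steering_resistance"],
}

def condition_level(drowsiness_score: float) -> int:
    """Map a continuous score (e.g. 0-10) onto discrete condition levels."""
    thresholds = [2.5, 4.5, 6.5, 8.0]                    # assumed boundaries
    return sum(drowsiness_score >= t for t in thresholds)

def active_countermeasures(level: int, additive: bool = True) -> list:
    """In the additive case, lower-level countermeasures remain in force."""
    levels = range(level + 1) if additive else [level]
    return [c for lvl in levels for c in LEVEL_COUNTERMEASURES.get(lvl, [])]

def intervention_step(drowsiness_score: float, signal) -> None:
    """signal: callable that triggers one countermeasure, subject to any
    fail-safe logic implemented downstream (not modelled here)."""
    for countermeasure in active_countermeasures(condition_level(drowsiness_score)):
        signal(countermeasure)

# Example: intervention_step(5.1, signal=print) emits the level 0-2 countermeasures.
```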
  • FIG. 1 illustrates an example Vehicle Management System (VMS), in the form of an in-vehicle blepharometric monitoring system.
  • in some embodiments, an example VMS determines an objective measure of drowsiness based on technology other than blepharometric analysis, for example blink monitoring, facial image classification and/or other AI-based approaches.
  • Other embodiments may include cloud-hosted processing components, for example thereby to facilitate aspects of drowsiness level determination.
  • Further embodiments use technology other than image capture thereby to obtain physiological information regarding a vehicle operator, for example using wearable technologies (such as infrared reflectance oculography spectacles, and other wearable physiological monitors).
  • the system of FIG. 1 includes an image capture device 120.
  • This may include substantially any form of appropriately sized digital camera, preferably a digital camera with a frame rate of over 60 frames per second. Higher frame rate cameras are preferred, given that with enhanced frame rate comes an ability to obtain higher resolution data for eyelid movement.
  • Device 120 is positioned to capture a facial region of a subject.
  • Device 120 is in one embodiment installed in a region of a vehicle in the form of an automobile, for example on or adjacent the dashboard, windscreen, or visor, such that it is configured to capture a facial region of a driver.
  • device 120 is positioned on or adjacent the dashboard, windscreen, or visor, such that it is configured to capture a facial region of a front seat passenger.
  • device 120 is positioned in a region such as the rear of a seat such that it is configured to capture a facial region of a back-seat passenger. In some embodiments a combination of these is provided, thereby to enable blepharometric data monitoring for both a driver and one or more passengers.
  • although the system of FIG. 1 (and other systems) is described by reference to a vehicle in the form of an automobile, it will be appreciated that a system as described is also optionally implemented in other forms of vehicles, including mass-transport vehicles such as passenger airplanes, buses/coaches, and trains.
  • An in-vehicle image processing system 110 is configured to receive image data from image capture device 120 (or multiple devices 120), and process that data thereby to generate blepharometric data.
  • a control module 111 is configured to control device 120, operation of image data processing, and management of generated data. This includes controlling operation of image data processing algorithms, which are configured to:
  • Identify, in the eye region(s), presence and movement of an eyelid. For example, in a preferred embodiment this is achieved by way of recording an eyelid position relative to a defined “open” position against time. This allows generation of blepharometric data in the form of eyelid position (amplitude) over time. It will be appreciated that such data provides for identification of events (for example blink events) and velocity (for example as a first derivative of position against time).
  • a facial recognition algorithm is used to enable identification of: (i) a central position on an upper eyelid on a detected face; and (ii) at least two fixed points on the detected face.
  • the two fixed points on the detected face are used to enable scaling of measurements of movement of the central position of the upper eyelid, thereby to account for changes in relative distance between the user and the camera. That is, a distance between the two fixed points is used as a means to determine position of the face relative to the camera, including position by reference to distance from the camera (as the user moves away, the distance between the fixed points decreases). A simplified sketch of this normalisation is provided below.
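  • A minimal sketch of this scaling step is shown below; the landmark coordinates, calibration reference and function name are assumptions made for the purposes of the example.

```python
# Illustrative sketch: normalising eyelid displacement by the distance
# between two fixed facial landmarks, so that movement of the face toward
# or away from the camera does not masquerade as eyelid movement.
import math

def eyelid_amplitude(upper_lid_y: float, open_reference_y: float,
                     fixed_point_a: tuple, fixed_point_b: tuple,
                     reference_landmark_distance: float) -> float:
    """All coordinates are in image pixels.
    reference_landmark_distance: landmark separation recorded at calibration,
    i.e. when the "open" eyelid position was captured (assumed)."""
    current_distance = math.dist(fixed_point_a, fixed_point_b)
    # >1 when the face is further from the camera than at calibration
    scale = reference_landmark_distance / current_distance
    # displacement of the upper-eyelid point from its calibrated "open"
    # position, rescaled to the calibration distance
    return (upper_lid_y - open_reference_y) * scale
```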
  • Algorithms 112 optionally operate to extract additional artefacts from blepharometric data, for example amplitude-velocity ratios, blink total durations, inter-event durations, and the like. It will be appreciated, however, that extraction of such artefacts may occur in downstream processing.
  • a blepharometric data output module 113 is configured to output blepharometric data generated by algorithms 112 as user blepharometric data 152.
  • Memory system 150 optionally includes user identification data 151 for one or more users.
  • the identification data may include login credentials (for example a user ID and/or password) which are inputted via an input device.
  • the identification data may be biometric, for example using facial recognition as discussed above or an alternate biometric input (such as a fingerprint scanner). In some embodiments this leverages an existing biometric identification system of the vehicle. This is optionally used to adjust an objective drowsiness scale based on individual and/or demographic attributes of the operator, as sketched below.
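  • One simple way such an adjustment could be structured is sketched below; the threshold names, default values and profile fields are illustrative assumptions only.

```python
# Illustrative sketch: adjusting objective drowsiness thresholds using
# attributes of an identified operator. Values and field names are assumed.
from typing import Optional

DEFAULT_THRESHOLDS = {"caution": 4.0, "critical": 7.0}   # assumed 0-10 scale

def personalised_thresholds(user_profile: Optional[dict]) -> dict:
    """user_profile may carry an individual baseline (e.g. learned over prior
    drives) and/or demographic adjustments; with no profile, defaults apply."""
    thresholds = dict(DEFAULT_THRESHOLDS)
    if user_profile:
        offset = user_profile.get("baseline_offset", 0.0)
        thresholds = {name: value + offset for name, value in thresholds.items()}
    return thresholds
```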
  • Analysis modules 130 are configured to perform analysis of user blepharometric data 152. This includes assessing a physiological state of the vehicle operator, in the present example being drowsiness, and providing an objective measure as output. For example, that objective measure could be on a scale of X to Y (for instance 1 to 10). The Johns Drowsiness Scale may be used.
  • a control system 140 includes logic modules 141 which control overall operation of system 100. This includes execution of logical rules thereby to determine communications to be provided in response to outputs from analysis modules 130. For example, this may include identifying a set of countermeasures to be applied based on a level of drowsiness that is determined by modules 130.
  • a countermeasure implementation system 160 is configured to process instructions from control system 140 thereby to implement countermeasures in the vehicle, by providing control instructions to individual vehicle systems (e.g. driver assistance, climate control, window operation, suspension, and so on). In some embodiments the control system provides a stream of instructions to system 161 thereby to implement a series of operational modifications based on a predefined schedule (e.g. opening and closing of windows, temperature variation, triggering of a manoeuvring alteration, and so on); a simplified scheduling sketch is provided below.
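  • The following sketch illustrates, under stated assumptions, such a scheduled stream of instructions; the subsystem names, commands and timings are placeholders rather than a disclosed schedule.

```python
# Illustrative sketch: streaming a predefined schedule of operational
# modifications to vehicle subsystems. Subsystem names, commands and
# timings are placeholders, not the protocol of any particular embodiment.
import time

EXAMPLE_SCHEDULE = [
    (0.0,  "windows",         "open_slightly"),
    (30.0, "climate_control", "reduce_temperature"),
    (60.0, "windows",         "close"),
]

def run_schedule(schedule, send, sleep=time.sleep):
    """send(subsystem, command) forwards one control instruction to the
    countermeasure implementation system; sleep is injectable for testing."""
    elapsed = 0.0
    for at_seconds, subsystem, command in schedule:
        sleep(max(0.0, at_seconds - elapsed))
        elapsed = at_seconds
        send(subsystem, command)

# Example: run_schedule(EXAMPLE_SCHEDULE, send=lambda s, c: print(s, c),
#                       sleep=lambda _: None)
```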
  • the logic modules preferably provide data representative of countermeasure implementation protocols for some or all of the following condition levels:
  • a condition level associated with a set of operational countermeasures which simplify vehicle operation.
  • those countermeasures include automated cruise control and/or lane maintenance.
  • a condition level associated with a set of operational countermeasures which provide increased operator stimulation include any one or more of: enhancing feedback about lane keeping accuracy; applying acoustic and haptic feedback about lane departure; amplifying kinesthetic feedback of the adaptive cruise control; and/or introduction of increased sensitivity of forward collision warning (FCW) and lane departure warning (LDW) systems.
  • a condition level associated with a set of operational countermeasures which provide modification of operation conditions include any one or more of: controlling windows; controlling a climate control system; controlling an entertainment system; and/or increasing sensitivity of FCW and LDW systems.
  • a condition level associated with a set of operational countermeasures which render vehicle operation relatively more challenging compared to standard operation include one or more of the following: disabling cruise control; increasing steering resistance; reducing spring suspension; amplifying motor sound; and introduction of controlled disturbances.
  • the technology uses human monitoring and assessment to objectively quantify deterioration in cognitive brain function due to drowsiness.
  • This technology analyses biomarkers extracted from a machine operator (e.g. a vehicle driver) and outputs a drowsiness risk level on a scale, for example a scale of 0 to 10.
  • An example scale is illustrated in FIG. 2, which shows an example graph of drowsiness score against relative risk in the context of vehicle operation, using JDS technology.
  • a reliable and objective drowsiness measure is determined using a composite metric extracted from eyelid movements (specifically from processing timeseries data representative of eyelid position as a function of time, i.e. blepharometric analysis). This can be measured using data collected via existing in-car monitoring cameras that are part of a Driver Monitoring System (DMS), for example using images containing a driver’s face.
  • Example technology for achieving such objectives is disclosed in various international publications by SDIP Holdings Pty Ltd, which are herein incorporated by cross reference.
  • Objective drowsiness measurement enables a multi-stage approach to the application of cascading countermeasures in a vehicle, enabling the driver to prolong a duration of safe driving.
  • this is configured as follows: where the objective measure of drowsiness is in an “Alert” range, no special intervention is required from the system. That is, the vehicle operates as specified in a “normal” state.
  • intervention measures are set for a “preservation” phase. This allows the driver to focus on the main driving tasks, and countermeasures are focussed on tools configured to “make driving easy”, for example by enabling automatic cruise control and lane maintenance, and reducing distractions by warning systems.
  • intervention measures are set for a “driver stimulation” phase.
  • the system aims to stimulate the driver to counter further drowsiness by, for example, enhancing feedback about lane keeping accuracy, applying acoustic and haptic feedback about lane departure, and amplifying kinesthetic feedback of the adaptive cruise control (stronger acceleration/deceleration).
  • This optionally includes the introduction of increased sensitivity of FCW and LDW systems.
  • intervention measures are set for a “Modification of driving conditions” phase.
  • the system aims to introduce sensory stimulation to the driver by modifying ambient conditions, for instance, introducing fresh air (ventilation or opening window slightly), increasing radio/entertainment system volume, and/or reducing cabin temperature. This is preferably accompanied by a further increase in sensitivity of the FCW and LDW systems.
  • intervention measures are set for a “high stimulation” phase.
  • the system aims to increase driver engagement by, in effect, “making driving difficult”. This includes further engaging the driver in the driving activities by disabling automatic cruise control, increasing steering torque (resistance), reducing spring suspension, amplifying motor sound, and/or introducing controlled disturbances. This is accompanied by a further increase in sensitivity of the FCW and LDW systems.
  • intervention measures are set for a “protection” phase.
  • the system tries to compensate for the driver’s potential impairment by advising the driver to stop, lowering the criteria for all warnings to a minimum, and increasing the amplitude and frequency of the audible warnings.
  • An example of such a layered/cascading countermeasure approach according to one embodiment is provided in FIG. 3.
  • the objective scale is from 0 (very alert) to 10 (very drowsy/asleep).
  • the layered countermeasure approach above relies on an objective and reliable method for the quantification of driver drowsiness, for example with sufficient resolution at the early drowsiness onset stage and ability for segregation into multiple distinct phases. It will be appreciated that the above is an example only, and that modifications may be implemented.
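  • To make the layered structure concrete, the sketch below maps an objective 0-10 drowsiness score onto the phases described above. The boundary values are assumptions for illustration; the embodiment’s actual mapping is that shown in FIG. 3.

```python
# Illustrative sketch: mapping an objective 0-10 drowsiness score onto the
# cascading phases discussed above. Boundary values are assumed; the
# embodiment's own mapping is that illustrated in FIG. 3.

PHASES = [
    (2.5,  "alert / normal operation"),
    (4.0,  "preservation"),
    (5.5,  "driver stimulation"),
    (7.0,  "modification of driving conditions"),
    (8.5,  "high stimulation"),
    (10.1, "protection"),
]

def phase_for_score(score: float) -> str:
    for upper_bound, phase in PHASES:
        if score < upper_bound:
            return phase
    return PHASES[-1][1]
```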
  • a protocol is implemented thereby to apply countermeasures in respect of biomarkers other than drowsiness.
  • blepharometric techniques may be applied in respect of other conditions such as seizures (current and/or predicted), degenerative conditions such as Alzheimer’s, and the like.
  • Research is currently being directed to the use of blepharometric analysis and AI to identify biomarkers for other degradations to neurological brain function in conditions such as epilepsy (due to onset of seizures), ADHD, Parkinson’s, Alzheimer’s, traumatic brain injuries, concussion and motor neuron diseases, with the aim of flagging potential early signs to the driver (and also passengers) and/or providing a recommendation to seek medical advice.
  • the technology disclosed herein is readily applied in such scenarios.
  • an impairment monitoring algorithm and associated intervention control system is applied to process driver monitoring data in a motor vehicle.
  • the driver monitoring data may yield eyelid amplitude data (e.g. from image processing), and this may be used to derive an objective measure of impairment (for example an objective measure of drowsiness).
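  • The sketch below shows, purely by way of example, how per-blink artefacts might be reduced to a single objective measure on a 0-10 scale. The weights are invented for illustration; they are not the Johns Drowsiness Scale or any other published composite metric.

```python
# Illustrative sketch: reducing per-blink artefacts to a single objective
# impairment measure on a 0-10 scale. The weighting is an invented example,
# not the Johns Drowsiness Scale or any disclosed composite metric.

def impairment_measure(blinks) -> float:
    """blinks: iterable of BlinkEvent-like objects (see the earlier sketch)."""
    blinks = list(blinks)
    if not blinks:
        return 0.0
    mean_duration = sum(b.total_duration_s for b in blinks) / len(blinks)
    mean_avr = sum(b.reopening_avr for b in blinks) / len(blinks)
    # longer blinks and sluggish re-opening both push the score upward
    score = 20.0 * mean_duration + 2.0 * mean_avr
    return max(0.0, min(10.0, score))
```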
  • Each example embodiment includes a method for controlling one or more electronic systems provided by a machine that is controlled by an operator, the method including: accessing data representative of a current measure of a physiological state of impairment of the operator, wherein the measure of the physiological state of impairment is defined relative to a scale having at least three distinct impairment condition levels; based on the condition level of the current measure of the physiological state of impairment, identifying one or more operational countermeasures associated with that measure. These countermeasures are implemented by an intervention control system.
  • for the present purposes these are referred to as a Default Impairment Level, First Impairment Level, Second Impairment Level, nth Impairment Level, etc., to define an increasing set of impairment levels (although there may be additional intermediate levels). It will be appreciated that these are arbitrary labels, and that the labelling and number of defined impairment levels is to an extent a design choice.
  • at the Default Impairment Level, the vehicle operates in a “standard” mode where the operator has control over driver assistance systems (e.g. cruise control, lane control, etc.), internal environment control systems (e.g. climate control, lighting control, volume control), and physical control systems (e.g. windows, acceleration and braking).
  • upon detection of a transition from the Default Impairment Level to a First Impairment Level, the intervention control system applies a first set of predefined settings to a first set of one or more of the driver assistance systems, thereby overriding current user-controlled settings.
  • upon detection of a transition from the First Impairment Level to a Second Impairment Level, the intervention control system applies a second set of predefined settings to a second set of one or more of the driver assistance systems, thereby overriding current user-controlled settings. This is preferably additive to the first set of predefined settings in the sense that there is additional reduction in driver assistance.
  • the second set of predefined settings may: reduce driver assistance to one or more driver assistance systems in the first set (and also in the second set); and/or affect driver assistance systems in the second set (but not in the first set). That is, there may be increased intervention into already-affected systems and/or intervention into additional driver assistance systems.
  • upon detection of a transition from an nth Impairment Level to an (n+1)th Impairment Level, the intervention control system applies a first set of predefined settings to a first set of one or more of the driver assistance systems, thereby overriding current user-controlled settings.
  • upon detection of a transition from the (n+1)th Impairment Level to an (n+2)th Impairment Level, the intervention control system applies a second set of predefined settings to a second set of one or more of the driver assistance systems, thereby overriding current user-controlled settings.
  • This is preferably additive to the first set of predefined settings in the sense that there is additional reduction in driver assistance.
  • the second set of predefined settings may: reduce driver assistance to one or more driver assistance systems in the first set (and also in the second set); and/or affect driver assistance systems in the second set (but not in the first set). That is, there may be increased intervention into already-affected systems and/or intervention into additional driver assistance systems.
  • upon detection of a transition from an nth Impairment Level to an (n+1)th Impairment Level, the intervention control system applies a first set of predefined settings to a first set of one or more of the driver assistance systems, thereby overriding current user-controlled settings.
  • upon detection of a transition from the (n+1)th Impairment Level to an (n+2)th Impairment Level, the intervention control system applies a second set of predefined settings to a second set of one or more vehicle systems, which include either or both of internal environment control systems and physical control systems (each optionally in combination with driver assistance systems).
  • upon detection of a transition from an nth Impairment Level to an (n+1)th Impairment Level, the intervention control system applies a first set of predefined settings to a first set of one or more vehicle systems, including any one or more of the driver assistance systems, environment control systems, and/or physical control systems, thereby overriding current user-controlled settings.
  • upon detection of a transition from the (n+1)th Impairment Level to an (n+2)th Impairment Level, the intervention control system applies a second set of predefined settings to a second set of one or more vehicle systems, which include the driver assistance systems, environment control systems, and/or physical control systems.
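  • The transition-driven, additive behaviour described above might be sketched as follows; the impairment levels, settings and system names are assumptions made for the example.

```python
# Illustrative sketch: an intervention control system that reacts to
# transitions between impairment levels by applying predefined settings,
# additively, over the operator's own settings. All settings are placeholders.

PREDEFINED_SETTINGS = {
    1: {"cruise_control": "auto", "lane_keeping": "auto"},
    2: {"fcw_sensitivity": "high", "ldw_sensitivity": "high"},
    3: {"cruise_control": "off", "steering_resistance": "increased"},
}

class InterventionController:
    def __init__(self, apply_setting):
        """apply_setting(name, value) pushes one override to a vehicle system."""
        self._apply = apply_setting
        self._level = 0

    def on_impairment_level(self, new_level: int) -> None:
        if new_level <= self._level:        # decreases handled elsewhere
            self._level = new_level
            return
        # apply each intermediate level's settings so the effect is additive
        for level in range(self._level + 1, new_level + 1):
            for name, value in PREDEFINED_SETTINGS.get(level, {}).items():
                self._apply(name, value)
        self._level = new_level
```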
  • intervention measures applied to driver assistance systems may include, for instance, one or more of the following:
  • intervention measures applied to environment control systems may include, for instance, one or more of the following:
  • Dynamically varying speaker volume (e.g. cyclically).
  • intervention measures applied to physical control systems may include, for instance, one or more of the following:
  • “Coupled”, when used in the claims, should not be interpreted as being limited to direct connections only.
  • the terms “coupled” and “connected,” along with their derivatives, may be used. It should be understood that these terms are not intended as synonyms for each other.
  • the scope of the expression a device A coupled to a device B should not be limited to devices or systems wherein an output of device A is directly connected to an input of device B. It means that there exists a path between an output of A and an input of B which may be a path including other devices or means.
  • Coupled may mean that two or more elements are either in direct physical or electrical contact, or that two or more elements are not in direct contact with each other but yet still co-operate or interact with each other.


Abstract

The present invention relates, in various embodiments, to technology configured to implement a cascading interventional protocol in respect of human-operated machine based on processing of blepharometric data and/or other physiological parameters. For example, some embodiments provide software and integrated systems which influence the operation of a vehicle, such as an automobile, based on monitoring of a human operator. While some embodiments will be described herein with particular reference to that application, it will be appreciated that the invention is not limited to such a field of use, and is applicable in broader contexts.

Description

INTERVENTIONAL PROTOCOL IN RESPECT OF HUMAN-OPERATED MACHINE BASED ON PROCESSING OF BLEPHAROMETRIC DATA AND/OR OTHER PHYSIOLOGICAL PARAMETERS
FIELD OF THE INVENTION
[0001] The present invention relates, in various embodiments, to technology configured to implement a cascading interventional protocol in respect of human-operated machine based on processing of blepharometric data and/or other physiological parameters. For example, some embodiments provide software and integrated systems which influence the operation of a vehicle, such as an automobile, based on monitoring of a human operator. While some embodiments will be described herein with particular reference to that application, it will be appreciated that the invention is not limited to such a field of use, and is applicable in broader contexts.
BACKGROUND
[0002] Any discussion of the background art throughout the specification should in no way be considered as an admission that such art is widely known or forms part of common general knowledge in the field.
[0003] It is known to implement drowsiness detection technology in human-operated vehicles. For example, such technology performs monitoring functions thereby to predict presence of potential drowsiness in respect of an operator. This may include technologies such as blepharometric analysis (which involves processing time-series data representative of eyelid amplitude), blink rate/duration analysis (which, for example, considers the regularity and/or length of blinks), and facial image processing methods (for example AI systems which are configured to predict whether a facial image represents a drowsy person or an alert person).
[0004] Technology has been adapted to enable a vehicle’s systems to intervene if a driver’s drowsiness level exceeds a certain threshold. For example, when a driver becomes drowsy according to a predetermined definition, the forward collision warning and lane departure warning are set to maximum sensitivity for the remainder of a journey.
[0005] The automotive industry has been trying to address driver fatigue, drowsiness or impairment over the past two decades. However, the performance of previously developed algorithms designed to detect this condition has not been at a standard that could completely resolve this issue. Existing drowsiness detection algorithms often use attributes that are not specifically related to drowsiness, such as steering wheel inputs or the identification of yawning or facial expressions.
SUMMARY OF THE INVENTION
[0006] It is an object of the present invention to overcome or ameliorate at least one of the disadvantages of the prior art, or to provide a useful alternative.
[0007] Various example embodiments are disclosed in the sections entitled “Detailed Description” and “Claims”, along with the accompanying drawings.
[0008] Reference throughout this specification to “one embodiment”, “some embodiments” or “an embodiment” means that a particular feature, structure or characteristic described in connection with the embodiment is included in at least one embodiment of the present invention. Thus, appearances of the phrases “in one embodiment”, “in some embodiments” or “in an embodiment” in various places throughout this specification are not necessarily all referring to the same embodiment, but may. Furthermore, the particular features, structures or characteristics may be combined in any suitable manner, as would be apparent to one of ordinary skill in the art from this disclosure, in one or more embodiments.
[0009] As used herein, unless otherwise specified the use of the ordinal adjectives "first", "second", "third", etc., to describe a common object, merely indicate that different instances of like objects are being referred to, and are not intended to imply that the objects so described must be in a given sequence, either temporally, spatially, in ranking, or in any other manner.
[0010] In the claims below and the description herein, any one of the terms comprising, comprised of or which comprises is an open term that means including at least the elements/features that follow, but not excluding others. Thus, the term comprising, when used in the claims, should not be interpreted as being limitative to the means or elements or steps listed thereafter. For example, the scope of the expression a device comprising A and B should not be limited to devices consisting only of elements A and B. Any one of the terms including or which includes or that includes as used herein is also an open term that also means including at least the elements/features that follow the term, but not excluding others. Thus, including is synonymous with and means comprising.
[0011] As used herein, the term “exemplary” is used in the sense of providing examples, as opposed to indicating quality. That is, an “exemplary embodiment” is an embodiment provided as an example, as opposed to necessarily being an embodiment of exemplary quality.
BRIEF DESCRIPTION OF THE DRAWINGS
[0012] Embodiments of the invention will now be described, by way of example only, with reference to the accompanying drawings in which:
[0013] FIG. 1 illustrates an in-vehicle blepharometric data collection/monitoring system according to one embodiment.
[0014] FIG. 2 illustrates an example graph of drowsiness score against relative risk in the context of vehicle operation, using JDS technology.
[0015] FIG. 3 illustrates an example cascading interventional protocol.
DETAILED DESCRIPTION
[0016] The present invention relates, in various embodiments, to technology configured to implement a cascading interventional protocol in respect of human-operated machine based on processing of blepharometric data and/or other physiological parameters. For example, some embodiments provide software and integrated systems which influence the operation of a vehicle, such as an automobile, based on monitoring of a human operator. While some embodiments will be described herein with particular reference to that application, it will be appreciated that the invention is not limited to such a field of use, and is applicable in broader contexts.
Overview and Context
[0017] A human subject’s involuntary blinks and eyelid movements are influenced by a range of factors, including the subject’s behavioural state and brain function. For example, this has been used in the past for detection of drowsiness. More broadly, analysis of data derived from eye and eyelid movements can be performed thereby to identify data artefacts, patterns and the like, and these are reflective of the subject’s behavioural state, brain function and the like.
[0018] The technology described herein is focussed on collection and analysis of “blepharometric data”, with the term “blepharon” describing a human eyelid. The term “blepharometric data” is used to define data that describes eyelid movement as a function of time. For example, eyelid position may be recorded as an amplitude. Eyelid movements are commonly categorised as “blinks” or “partial blinks”. The term “blepharometric data” is used to distinguish technology described herein from other technologies which detect the presence of blinks for various purposes. The technology herein is focussed on analysing eyelid movement as a function of time, typically measured as an amplitude. This data may be used to infer the presence of what would traditionally be termed “blinks”; however, it is attributes of “events” and other parameters identifiable in eyelid movements which are of primary interest to technologies described herein. Events and other parameters which are identified from the processing of blepharometric data are referred to as “blepharometric artefacts”, with such artefacts being identifiable by application of various processing algorithms to a data set that describes eyelid position as a function of time (i.e. blepharometric data). For example, the artefacts may include:
• Blink total duration (BTD), which is preferably measured as a time between commencement of closure movement which exceeds a defined threshold and completion of subsequent opening movement.
• Blink rates.
• Amplitude to velocity ratios (AVRs).
• Negative Inter-Event-Duration (IED) (discussed in detail further below).
• Positive IED.
• Negative AVR (i.e. during closure)
• Positive AVR (i.e. during re-opening)
• AVR Product (negative AVR * positive AVR)
• AVR ratio (negative AVR divided by positive AVR)
• BECD (blink eye closure duration).
• Negative DOQ (duration of ocular quiescence)
• Positive DOQ
• Relative Amplitude
• Relative Position
• Maximum Amplitude
• Maximum Velocity
• Negative zero crossing index (ZCI).
• Positive ZCI
• Blink start position
• Blink end position
• Blink start time
• Blink end time
• Trends and changes in any of the above artefacts over a defined period.
• Statistical measures derived for temporal or other distinct blocks containing a plurality of detected blink events.
[0019] The determination of blepharometric artefacts may include any one or more of the following (a minimal illustrative sketch of such processing is provided after this list):
• Determination of a time period from blink initiation to blink completion (also referred to as a blink duration or blink length). Blink initiation and blink completion may be determined based on a determined “inter-blink” eyelid amplitude range, with movement outside that amplitude range being categorised as a blink.
• Determination of a time period between blinks, optionally measured between blink initiation times for consecutive blinks.
• Analysis of “events”, including relative timing of events, with an “event” being defined as any positive or negative deflection that is greater than a given velocity threshold for a given duration. In this regard, a “blink” is in some embodiments defined as the pairing of positive and negative events that are within relative amplitude limits and relative position limits. There may be multiple events within a given blink, when an eyelid is outside of an “inter-blink” eyelid amplitude range.
• a time period for eye closure motion;
• a time period during which the eye is closed;
• a time period for eye re-opening motion;
• velocity measurements (including velocity estimation measurements) for eye closure motion and/or eye re-opening motion, which may be used for the purposes of determining amplitude-to-velocity ratios.
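By way of illustration only, the following sketch (in Python; the sampling rate, thresholds, event-pairing rule and function names are assumptions made for the example rather than a prescribed implementation) indicates one possible way of detecting velocity-threshold “events” in an eyelid-amplitude time series and deriving example artefacts, such as blink total duration and negative/positive amplitude-to-velocity ratios, from a paired closure and re-opening:

```python
import numpy as np

def detect_events(amplitude, fs, velocity_threshold=0.5, min_duration_s=0.02):
    """Detect candidate eyelid "events": deflections whose velocity exceeds a
    threshold for at least a minimum duration (threshold values are illustrative)."""
    velocity = np.gradient(amplitude, 1.0 / fs)  # first derivative of eyelid position
    active = np.abs(velocity) > velocity_threshold
    events, start = [], None
    for i, flag in enumerate(active):
        if flag and start is None:
            start = i
        elif not flag and start is not None:
            if (i - start) / fs >= min_duration_s:
                events.append((start, i, 1 if velocity[start] > 0 else -1))
            start = None
    return velocity, events

def blink_artefacts(amplitude, fs):
    """Pair each closing (negative) event with the following re-opening (positive)
    event and derive example artefacts: blink total duration and AVRs."""
    amplitude = np.asarray(amplitude, dtype=float)
    velocity, events = detect_events(amplitude, fs)
    artefacts = []
    for (s1, e1, sign1), (s2, e2, sign2) in zip(events, events[1:]):
        if sign1 < 0 and sign2 > 0:  # closure event followed by re-opening event
            closure_amp = abs(amplitude[e1] - amplitude[s1])
            reopen_amp = abs(amplitude[e2] - amplitude[s2])
            neg_avr = closure_amp / max(np.abs(velocity[s1:e1]).max(), 1e-9)
            pos_avr = reopen_amp / max(np.abs(velocity[s2:e2]).max(), 1e-9)
            artefacts.append({
                "blink_total_duration_s": (e2 - s1) / fs,
                "negative_avr": neg_avr,
                "positive_avr": pos_avr,
                "avr_product": neg_avr * pos_avr,
            })
    return artefacts
```

In practice the event definitions, pairing rules and artefact set would follow the definitions given above; the sketch only indicates the general shape of such processing.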
[0020] Known eyelid movement monitoring systems (also referred to herein as blepharometric data monitoring systems) focus on point-in-time subject analysis. For example, commonly such technology is used as a means for assessing subject alertness/drowsiness at a specific moment, in some instances benchmarked against known data for a demographically relevant population. For example, the Johns Drowsiness Scale (JDS) is one such measure. This provides a graduated scale of drowsiness, which can be broken up into distinct tiers which each provide a respective objective level of drowsiness.
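As a purely illustrative sketch of how such a graduated scale might be broken into distinct tiers (Python; the cut-points and labels below are invented for the example and are not the published JDS tier boundaries; the labels simply mirror the ranges discussed later in this specification), a continuous 0 to 10 score could be discretised as follows:

```python
# Illustrative only: cut-points and labels are assumptions made for this example.
DROWSINESS_TIERS = [
    (2.5, "alert"),
    (4.5, "early_drowsiness_onset"),
    (5.5, "early_stage_drowsiness"),
    (6.5, "drowsiness"),
    (8.0, "late_stage_drowsiness"),
    (9.0, "falling_asleep"),
    (10.0, "sleep"),
]

def drowsiness_tier(score: float) -> str:
    """Map a continuous drowsiness score (0 to 10) to a discrete tier label."""
    for upper_bound, label in DROWSINESS_TIERS:
        if score <= upper_bound:
            return label
    return DROWSINESS_TIERS[-1][1]  # clamp anything above the scale to the top tier
```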
Cascading Interventional Protocol Technology
[0021] Technology described herein provides for controlling one or more electronic systems provided by a machine that is controlled by an operator. In overview, a current physiological state of the operator is determined relative to an objective scale, for example an objective measure of impairment (such as drowsiness). The scale has at least three distinct condition levels. Based on the condition level of the current measure of the physiological state, one or more operational countermeasures associated with that measure are identified. These “countermeasures” are each operational controls which can be applied to respective ones of the electronic systems. A signal is provided thereby to trigger implementation of the identified one or more operational countermeasures (optionally subject to failsafe logic and the like).
[0022] The examples below focus primarily on a vehicle, such as an automobile, in which case the electronic systems include one or more of: (i) driver assistance systems such as cruise control and lane maintenance; (ii) entertainment systems; (iii) electronic windows; (iv) internal lighting; (v) climate control systems; and various other electronic systems in modern vehicles. However, the technology may be applied in a wide range of other settings, for example to manage risks associated with a condition such as drowsiness, or even to encourage a condition such as drowsiness (for example via an in-vehicle entertainment system utilised by a non-operator, for example on a commercial airliner). The technology may be applied in settings including heavy machinery, factory equipment, smartphones and other computing devices, entertainment systems, and others.
[0023] Examples below are also directed predominately to the physiological condition of drowsiness. However, it should be appreciated that this is a non-limiting example, with other embodiments being additionally/alternatively applied to other conditions (for example other conditions of impairment). These may include other forms of cognitive impairment, such as those resulting from degenerative medical ailments, injury, drug/alcohol usage, and the like. Other physiological conditions may include adverse neurological and/or cardiovascular events.
[0024] In overview, the condition of the operator is assessed relative to the objective scale, and countermeasures applied as a result. Given that the scale has at least three distinct levels (and preferably more), this allows for a cascading increase (or decrease) in countermeasures based on point-in-time considerations. For example, in the context of managing drowsiness of a vehicle operator, countermeasures are applied in a cascading (and preferably additive) manner as the observed level of drowsiness increases. This may include modifying the way in which the vehicle is operated (for example increasing the degree of required operator input as drowsiness increases, hence rendering operation more challenging than it otherwise would be); adjusting a level of feedback/alert provided to the operator; and/or tailoring the operator’s environment (for example noise, haptic feedback, airflow, temperature, lighting, and so on).
Example Vehicle Management System (VMS)
[0025] FIG. 1 illustrates an example VMS, in the form of an in-vehicle blepharometric monitoring system. This is provided as an example only, and alternate systems may be used in further embodiments. For example, in further embodiments an example VMS determines an objective measure of drowsiness based on technology other than blepharometric analysis, for example blink monitoring, facial image classification and/or other AI-based approaches. Other embodiments may include cloud-hosted processing components, for example thereby to facilitate aspects of drowsiness level determination. Further embodiments use technology other than image capture thereby to obtain physiological information regarding a vehicle operator, for example using wearable technologies (such as infrared reflectance oculography spectacles, and other wearable physiological monitors).
[0026] The system of FIG. 1 includes an image capture device 120. This may include substantially any form of appropriately sized digital camera, preferably a digital camera with a frame rate of over 60 frames per second. Higher frame rate cameras are preferred, given that with enhanced frame rate comes an ability to obtain higher resolution data for eyelid movement.
[0027] Device 120 is positioned to capture a facial region of a subject. Device 120 is in one embodiment installed in a region of a vehicle in the form of an automobile, for example on or adjacent the dashboard, windscreen, or visor, such that it is configured to capture a facial region of a driver. In another embodiment device 120 is positioned on or adjacent the dashboard, windscreen, or visor, such that it is configured to capture a facial region of a front seat passenger. In another embodiment device 120 is positioned in a region such as the rear of a seat such that it is configured to capture a facial region of a back-seat passenger. In some embodiments a combination of these is provided, thereby to enable blepharometric data monitoring for both a driver and one or more passengers.
[0028] Although the system of FIG. 1 (and other systems) is described by reference to a vehicle in the form of an automobile, it will be appreciated that a system as described is also optionally implemented in other forms of vehicles, including mass-transport vehicles such as passenger airplanes, busses/coaches, and trains.
[0029] An in-vehicle image processing system 110 is configured to receive image data from image capture device 120 (or multiple devices 120), and process that data thereby to generate blepharometric data. A control module 111 is configured to control device 120, operation of image data processing, and management of generated data. This includes controlling operation of image data processing algorithms, which are configured to:
(i) Identify that a human face is detected.
(ii) In embodiments where subject identification is achieved via facial recognition algorithms (which is not present in some embodiments, for example embodiments that identify a subject via alternate means), perform a facial recognition process thereby to identify the subject. This may include identifying a known subject based on an existing subject record defined in user identification data 151 stored in a memory system 150, or identifying an unknown subject and creating a new subject record in user identification data 151 stored in memory system 150.
(iii) In a detected human face, identify an eye region. In some embodiments the algorithms are configured to track one eye region only; in other embodiments both eye regions are tracked thereby to improve data collection.
(iv) Identify, in the eye region(s), presence and movement of an eyelid. For example, in a preferred embodiment this is achieved by way of recording an eyelid position relative to a defined “open” position against time. This allows generation of blepharometric data in the form of eyelid position (amplitude) over time. It will be appreciated that such data provides for identification of events (for example blink events) and velocity (for example as a first derivative of position against time). In a preferred embodiment, a facial recognition algorithm is used to enable identification of: (i) a central position on an upper eyelid on a detected face; and (ii) at least two fixed points on the detected face. The two fixed points on the detected face are used to enable scaling of measurements of movement of the central position of the upper eyelid thereby to account for changes in relative distance between the user and the camera. That is, a distance between the two fixed points is used as a means to determine position of the face relative to the camera, including position by reference to distance from the camera (as the user moves away, the distance between the fixed points decreases). A minimal sketch of this scaling approach is provided below.
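The following sketch illustrates this scaling (Python; the landmark coordinates, calibration distance and function names are assumptions made for the example, not a prescribed implementation):

```python
import math

def landmark_distance(p1, p2):
    """Euclidean distance (in pixels) between two fixed facial landmarks."""
    return math.hypot(p1[0] - p2[0], p1[1] - p2[1])

def scaled_eyelid_amplitude(eyelid_y_px, open_position_y_px, fixed_a, fixed_b,
                            calibration_distance_px):
    """Scale raw eyelid displacement by the ratio of the calibration-time landmark
    distance to the current landmark distance, so that movement of the subject
    toward or away from the camera is not mistaken for eyelid movement."""
    current_distance = landmark_distance(fixed_a, fixed_b)
    scale = calibration_distance_px / max(current_distance, 1e-6)
    raw_amplitude = open_position_y_px - eyelid_y_px  # displacement from the defined "open" position
    return raw_amplitude * scale

# Example: the face has moved further from the camera, so the landmark distance has
# shrunk from 100 px to 80 px and the raw eyelid displacement is scaled back up by 1.25.
amp = scaled_eyelid_amplitude(eyelid_y_px=212, open_position_y_px=200,
                              fixed_a=(310, 240), fixed_b=(390, 240),
                              calibration_distance_px=100.0)
```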
[0030] Algorithms 112 optionally operate to extract additional artefacts from blepharometric data, for example amplitude-velocity ratios, blink total durations, inter-event durations, and the like. It will be appreciated, however, that extraction of such artefacts may occur in downstream processing.
[0031] A blepharometric data output module 113 is configured to output blepharometric data generated by algorithms 112 in user blepharometric data 152.
[0032] Memory system 150 optionally includes user identification data 151 for one or more users. The identification data may include login credentials (for example a user ID and/or password) which are inputted via an input device. Alternately, the identification data may be biometric, for example using facial recognition as discussed above or an alternate biometric input (such as a fingerprint scanner). In some embodiments this leverages an existing biometric identification system of the vehicle. This is optionally used to adjust an objective drowsiness scale based on individual and/or demographic attributes of the operator.
[0033] Analysis modules 130 are configured to perform analysis of user blepharometric data 152. This includes assessing a physiological state of the vehicle operator, in the present example being drowsiness, and providing an objective measure as output. For example, that objective measure could be on a scale of X to Y (for instance 1 to 10). The Johns Drowsiness Scale may be used.
[0034] A control system 140 includes logic modules 141 which control overall operation of system 100. This includes execution of logical rules thereby to determine communications to be provided in response to outputs from analysis modules 130. For example, this may include identifying a set of countermeasures to be applied based on a level of drowsiness that is determined by modules 130. A countermeasure implementation system 160 is configured to process instructions from control system 140 thereby to implement countermeasures in the vehicle, by providing control instructions to individual vehicle systems (e.g. driver assistance, climate control, window operation, suspension, and so on). In some embodiments the control system provides a stream of instructions to system 160 thereby to implement a series of operational modifications based on a predefined schedule (e.g. opening and closing of windows, temperature variation, triggering of a manoeuvring alteration, and so on).
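The stream of scheduled instructions described in the preceding paragraph can be pictured with the following sketch (Python; the subsystem names, instruction format and timing values are assumptions for illustration only, not a defined interface):

```python
import itertools

def cyclical_schedule(system, parameter, values, period_s):
    """Yield an endless stream of (delay_s, system, parameter, value) instructions
    that cycle a setting, e.g. cabin temperature or window position.
    All names and timings here are illustrative, not a prescribed format."""
    for value in itertools.cycle(values):
        yield (period_s, system, parameter, value)

# Example: vary cabin temperature between 19 and 23 degrees every 120 seconds; the
# countermeasure implementation system would forward each instruction in turn.
schedule = cyclical_schedule("climate_control", "set_temperature_c", [19, 23], 120)
first_three = [next(schedule) for _ in range(3)]
# -> [(120, 'climate_control', 'set_temperature_c', 19),
#     (120, 'climate_control', 'set_temperature_c', 23),
#     (120, 'climate_control', 'set_temperature_c', 19)]
```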
[0035] The logic modules preferably provide data representative of countermeasure implementation protocols for some or all of the following condition levels (an illustrative mapping of condition levels to countermeasure sets is sketched after this list):
(i) A condition level associated with a standard set of operational countermeasures for conventional vehicle operation.
(ii) A condition level associated with a set of operational countermeasures which simplify vehicle operation. For example, those countermeasures include automated cruise control and/or lane maintenance.
(iii) A condition level associated with a set of operational countermeasures which provide increased operator stimulation. For example, those countermeasures include any one or more of: enhancing feedback about lane keeping accuracy; applying acoustic and haptic feedback about lane departure; amplifying kinesthetic feedback of the adaptive cruise control; and/or introduction of increased sensitivity of forward collision warning (FCW) and lane departure warning (LDW) systems.
(iv) A condition level associated with a set of operational countermeasures which provide modification of operation conditions. For example, those countermeasures include any one or more of: controlling windows; controlling a climate control system; controlling an entertainment system; and/or increasing sensitivity of FCW and LDW systems.
(v) A condition level associated with a set of operational countermeasures which render vehicle operation relatively more challenging compared to standard operation. For example those countermeasures include one or more of the following: disabling cruise control; increasing steering resistance; reducing spring suspension; amplifying motor sound; and introduction of controlled disturbances.
(vi) A condition level associated with a set of operational countermeasures which increase amplitude and/or frequency of audible warnings.
(vii) A condition level associated with a set of operational countermeasures which limit vehicle manoeuvrability and/or stop the vehicle under controlled conditions.
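One simple way of representing such countermeasure implementation protocols is as a lookup table keyed by condition level, as in the sketch below (Python; the level labels, countermeasure identifiers and additive cascade shown are illustrative assumptions rather than a prescribed format):

```python
# Illustrative mapping from condition level to a set of countermeasure identifiers.
COUNTERMEASURE_PROTOCOLS = {
    "level_1_standard":   set(),                                           # (i) conventional operation
    "level_2_simplify":   {"auto_cruise_control", "lane_maintenance"},     # (ii)
    "level_3_stimulate":  {"lane_keeping_feedback", "acoustic_haptic_ldw",
                           "kinesthetic_acc_feedback", "raise_fcw_ldw_sensitivity"},  # (iii)
    "level_4_modify_env": {"adjust_windows", "adjust_climate_control",
                           "adjust_entertainment", "raise_fcw_ldw_sensitivity"},      # (iv)
    "level_5_challenge":  {"disable_cruise_control", "increase_steering_resistance",
                           "reduce_spring_suspension", "amplify_motor_sound",
                           "controlled_disturbances"},                                # (v)
    "level_6_warn":       {"increase_warning_amplitude", "increase_warning_frequency"},  # (vi)
    "level_7_protect":    {"limit_manoeuvrability", "controlled_stop"},                  # (vii)
}

def countermeasures_for(level: str, cascade: bool = True) -> set:
    """Return countermeasures for a level; when cascade=True the sets are applied
    additively, i.e. all lower-level countermeasures remain active."""
    order = list(COUNTERMEASURE_PROTOCOLS)
    if not cascade:
        return set(COUNTERMEASURE_PROTOCOLS[level])
    active = set()
    for key in order[: order.index(level) + 1]:
        active |= COUNTERMEASURE_PROTOCOLS[key]
    return active
```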
[0036] In some embodiments, the technology uses human monitoring and assessment to objectively quantify deterioration in cognitive brain function due to drowsiness. This technology analyses biomarkers extracted from a machine operator (e.g. a vehicle driver) and outputs a drowsiness risk level on a scale, for example a scale of 0 to 10. An example scale is illustrated in FIG. 2, which shows an example graph of drowsiness score against relative risk in the context of vehicle operation, using JDS technology.
[0037] In example embodiments, a reliable and objective drowsiness measure is determined using a composite metric extracted from eyelid movements (specifically from processing time-series data representative of eyelid position as a function of time, i.e. blepharometric analysis). This can be measured using data collected via existing in-car monitoring cameras that are part of a Driver Monitoring System (DMS), for example using images containing a driver’s face. Example technology for achieving such objectives is disclosed in various international publications by SDIP Holdings Pty Ltd, which are herein incorporated by cross reference.
[0038] Objective drowsiness measurement enables a multi-stage approach to the application of cascading countermeasures in a vehicle, enabling the driver to prolong a duration of safe driving. In an example embodiment, this is configured as follows:
• Where the objective measure of drowsiness is in an “Alert” range: no special intervention is required from the system. That is, the vehicle operates as specified in a “normal” state.
• Where the objective measure of drowsiness is in an “Early drowsiness onset” range: intervention measures are set for a “preservation” phase. This allows the driver to focus on the main driving tasks, and countermeasures are focussed on tools configured to “make driving easy”, for example by enabling automatic cruise control and lane maintenance, and reducing distractions by warning systems.
• Where the objective measure of drowsiness is in an “Early-stage drowsiness” range: intervention measures are set for a “driver stimulation” phase. In this phase, the system aims to stimulate the driver to counter further drowsiness by, for example, enhancing feedback about lane keeping accuracy, applying acoustic and haptic feedback about lane departure, and amplifying kinesthetic feedback of the adaptive cruise control (stronger acceleration/deceleration). This optionally includes the introduction of increased sensitivity of FCW and LDW systems.
• Where the objective measure of drowsiness is in a “Drowsiness” range: intervention measures are set for a “modification of driving conditions” phase. In this phase, the system aims to introduce sensory stimulation to the driver by modifying ambient conditions, for instance, introducing fresh air (ventilation or opening a window slightly), increasing radio/entertainment system volume, and/or reducing cabin temperature. This is preferably accompanied by a further increase in sensitivity of the FCW and LDW systems.
• Where the objective measure of drowsiness is in a “Late-stage drowsiness” range: intervention measures are set for a “high stimulation” phase. In this phase, the system aims to increase driver engagement by, in effect, “making driving difficult”. This includes further engaging the driver in the driving activities by disabling automatic cruise control, increasing steering torque (resistance), reducing spring suspension, amplifying motor sound, and/or introducing controlled disturbances. This is accompanied by a further increase in sensitivity of the FCW and LDW systems.
• Where the objective measure of drowsiness is in a “Falling asleep” range: intervention measures are set for a “protection” phase. In this phase, the system tries to compensate for the driver’s potential impairment by advising the driver to stop and lowering to minimum the criteria for all warnings, increasing the amplitude and frequency of the audible warnings.
• Where the objective measure of drowsiness is in a “Sleep” range: intervention measures are set for an “unresponsive” phase. This includes introduction of minimal risk manoeuvres and stopping the vehicle safely.
[0039] An example of such a layered/cascading countermeasure approach according to one embodiment is provided in FIG. 3. In this example, the objective scale is from 0 (very alert) to 10 (very drowsy/asleep).
[0040] The layered countermeasure approach above relies on an objective and reliable method for the quantification of driver drowsiness, for example with sufficient resolution at the early drowsiness onset stage and ability for segregation into multiple distinct phases. It will be appreciated that the above is an example only, and that modifications may be implemented.
[0041] In further embodiments, a protocol is implemented thereby to apply countermeasures in respect of biomarkers other than drowsiness. For example, blepharometric techniques may be applied in respect of other conditions such as seizures (current and/or predicted), degenerative conditions such as Alzheimer’s, and the like. Research is currently being directed to the use of blepharometric analysis and AI to identify biomarkers for other degradations to neurological brain function in conditions such as epilepsy (due to onset of seizures), ADHD, Parkinson’s, Alzheimer’s, traumatic brain injuries, concussion and motor neuron diseases, with the aim of flagging potential early signs to the driver (and also passengers) and/or providing a recommendation to seek medical advice. The technology disclosed herein is readily applied in such scenarios.
Example Embodiments
[0042] Several example embodiments are described below which make use of cascading intervention as described herein. For these examples, an impairment monitoring algorithm and associated intervention control system is applied to process driver monitoring data in a motor vehicle. For example, the driver monitoring data may yield eyelid amplitude data (e.g. from image processing), and this may be used to derive an objective measure of impairment (for example an objective measure of drowsiness).
[0043] Each example embodiment includes a method for controlling one or more electronic systems provided by a machine that is controlled by an operator, the method including: accessing data representative of a current measure of a physiological state of impairment of the operator, wherein the measure of the physiological state of impairment is defined relative to a scale having at least three distinct impairment condition levels; based on the condition level of the current measure of the physiological state of impairment, identifying one or more operational countermeasures associated with that measure. These countermeasures are implemented by an intervention control system.
[0044] For the present purposes these are referred to as Default Impairment Level, First Impairment Level, Second Impairment Level, nth Impairment Level, etc. to define an increasing set of impairment levels (although there may be additional intermediate levels). It will be appreciated that these are arbitrary labels, and that the labelling and number of defined impairment levels is to an extent a design choice.
[0045] Initially, for the Default Impairment Level, the vehicle operates in a “standard” mode where the operator has control over driver assistance systems (e.g. cruise control, lane control, etc.), internal environment control systems (e.g. climate control, lighting control, volume control), and physical control systems (e.g. windows, acceleration and braking). Upon transition between impairment levels in an increasing level direction, the intervention control system applies cascading (preferably additive) intervention measures.
[0046] In a first example, upon detection of a transition from the Default Impairment Level to a First Impairment Level, the intervention control system applies a first set of predefined settings to a first set of one or more of the driver assistance systems, thereby overriding current user-controlled settings. Upon detection of a transition from the First Impairment Level to a Second Impairment Level, the intervention control system applies a second set of predefined settings to a second set of one or more of the driver assistance systems, thereby overriding current user-controlled settings. This is preferably additive to the first set of predefined settings in the sense that there is additional reduction in driver assistance. For example, the second set of predefined settings may: reduce driver assistance to one or more driver assistance systems in the first set (and also in the second set); and/or affect driver assistance systems in the second set (but not in the first set). That is, there may be increased intervention into already-affected systems and/or intervention into additional driver assistance systems.
[0047] In a second example, upon detection of a transition from an nth Impairment Level to an n+1th Impairment Level, the intervention control system applies a first set of predefined settings to a first set of one or more of the driver assistance systems, thereby overriding current user-controlled settings. Upon detection of a transition from the n+1th Impairment Level to an n+2th Impairment Level, the intervention control system applies a second set of predefined settings to a second set of one or more of the driver assistance systems, thereby overriding current user-controlled settings. This is preferably additive to the first set of predefined settings in the sense that there is additional reduction in driver assistance. For example, the second set of predefined settings may: reduce driver assistance to one or more driver assistance systems in the first set (and also in the second set); and/or affect driver assistance systems in the second set (but not in the first set). That is, there may be increased intervention into already-affected systems and/or intervention into additional driver assistance systems.
[0048] In a third example, upon detection of a transition from an nth Impairment Level to an n+1th Impairment Level, the intervention control system applies a first set of predefined settings to a first set of one or more of the driver assistance systems, thereby overriding current user-controlled settings. Upon detection of a transition from the n+1th Impairment Level to an n+2th Impairment Level, the intervention control system applies a second set of predefined settings to a second set of one or more vehicle systems, which include either or both of internal environment control systems and physical control systems (each optionally in combination with driver assistance systems).
[0049] In a fourth example, upon detection of a transition from an nth Impairment Level to an n+1th Impairment Level, the intervention control system applies a first set of predefined settings to a first set of one or more vehicle systems, including any one or more of the driver assistance systems, environment control systems, and/or physical control systems, thereby overriding current user-controlled settings. Upon detection of a transition from the n+1th Impairment Level to an n+2th Impairment Level, the intervention control system applies a second set of predefined settings to a second set of one or more vehicle systems, which include the driver assistance systems, environment control systems, and/or physical control systems.
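The first to fourth examples can be summarised in a short sketch of an intervention control system that reacts to impairment-level transitions by overlaying predefined, additive setting overrides onto the affected vehicle systems (Python; the class, method and setting names are invented for the illustration, not a prescribed interface):

```python
class InterventionControlSystem:
    """Applies predefined, additive setting overrides as the detected
    impairment level increases (all names and settings are illustrative)."""

    def __init__(self, predefined_settings_by_level):
        # e.g. {1: {"cruise_control": {"enabled": True}},
        #       2: {"cruise_control": {"enabled": False}, "fcw": {"sensitivity": "high"}}}
        self.predefined = predefined_settings_by_level
        self.current_level = 0          # Default Impairment Level
        self.active_overrides = {}      # vehicle system -> settings currently forced

    def on_level_change(self, new_level):
        if new_level > self.current_level:
            # Cascading/additive: apply every predefined set up to the new level,
            # later sets overriding or extending earlier ones where they overlap.
            for level in range(self.current_level + 1, new_level + 1):
                for system, settings in self.predefined.get(level, {}).items():
                    self.active_overrides.setdefault(system, {}).update(settings)
        self.current_level = new_level
        # The overrides would then be forwarded to the countermeasure
        # implementation system for application to the individual vehicle systems.
        return self.active_overrides
```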
[0050] In a fifth example, there is an additional control level at which driver assistance systems are not available or only available in a limited capacity, whilst an initial/baseline impairment level is determined. In a further example vehicle operation is restricted/limited until an initial/baseline impairment level is determined.
[0051] In the examples above, intervention measures applied to driver assistance systems may include, for instance, one or more of the following:
• Deactivation of cruise control functionalities.
• Deactivation of lane control functionalities.
• Increasing sensitivity of lane monitoring functionalities.
• Increasing sensitivity of forward collision detection functionalities.
• Deactivation of other driver assistance functionalities.
• Delivering additional notifications via vehicle notification systems.
• Increasing sensitivity of other driver assistance functionalities.
[0052] In the examples above, intervention measures applied to environment control systems may include, for instance, one or more of the following:
• Changing internal vehicle temperature.
• Dynamically varying internal vehicle temperature (e.g. cyclical).
• Changing internal vehicle lighting.
• Dynamically varying internal vehicle lighting (e.g. cyclical).
• Changing air conditioning fan speed.
• Dynamically varying air conditioning fan speed (e.g. cyclical).
• Changing speaker volume.
• Dynamically varying speaker volume (e.g. cyclical).
[0053] In the examples above, intervention measures applied to physical control systems may include, for instance, one or more of the following (a consolidated illustrative registry of the measures in paragraphs [0051] to [0053] is sketched after this list):
• Changing position of vehicle windows.
• Dynamically varying position of vehicle windows (e.g. cyclical).
• Adjusting steering damping.
• Adjusting suspension settings.
• Adjusting braking sensitivity.
• Limiting vehicle maximum speed.
• Delivering haptic feedback through steering wheel, seats, etc.
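For illustration only, the measures listed in paragraphs [0051] to [0053] could be represented as a simple registry grouped by target vehicle subsystem (Python; the identifiers and parameter values are invented for the example), which a countermeasure implementation system could consult when translating selected measures into concrete control instructions:

```python
# Illustrative registry of intervention measures, grouped by vehicle subsystem.
INTERVENTION_MEASURES = {
    "driver_assistance": {
        "deactivate_cruise_control": {},
        "increase_fcw_sensitivity": {"level": "high"},
        "increase_ldw_sensitivity": {"level": "high"},
    },
    "environment_control": {
        "set_cabin_temperature": {"target_c": 19},
        "cycle_fan_speed": {"values": [2, 4], "period_s": 90},
        "set_speaker_volume": {"target": 12},
    },
    "physical_control": {
        "open_window": {"position_pct": 20},
        "adjust_steering_damping": {"delta": 1},
        "limit_max_speed": {"kph": 90},
    },
}

def control_instructions(measure_ids):
    """Translate measure identifiers into (subsystem, measure, parameters) tuples."""
    out = []
    for subsystem, measures in INTERVENTION_MEASURES.items():
        for measure_id, params in measures.items():
            if measure_id in measure_ids:
                out.append((subsystem, measure_id, params))
    return out
```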
[0054] The above are examples only; other variations and/or combinations may also be implemented.
Conclusions and Interpretation
[0055] It will be appreciated that the above disclosure provides improved methods for modifying the operation of user-operated equipment based on monitoring of the operator, for example thereby to reduce risks associated with drowsiness and/or cognitive impairment.
[0056] It should be appreciated that in the above description of exemplary embodiments of the invention, various features of the invention are sometimes grouped together in a single embodiment, FIG., or description thereof for the purpose of streamlining the disclosure and aiding in the understanding of one or more of the various inventive aspects. This method of disclosure, however, is not to be interpreted as reflecting an intention that the claimed invention requires more features than are expressly recited in each claim. Rather, as the following claims reflect, inventive aspects lie in less than all features of a single foregoing disclosed embodiment. Thus, the claims following the Detailed Description are hereby expressly incorporated into this Detailed Description, with each claim standing on its own as a separate embodiment of this invention.
[0057] Furthermore, while some embodiments described herein include some but not other features included in other embodiments, combinations of features of different embodiments are meant to be within the scope of the invention, and form different embodiments, as would be understood by those skilled in the art. For example, in the following claims, any of the claimed embodiments can be used in any combination.
[0058] Furthermore, some of the embodiments are described herein as a method or combination of elements of a method that can be implemented by a processor of a computer system or by other means of carrying out the function. Thus, a processor with the necessary instructions for carrying out such a method or element of a method forms a means for carrying out the method or element of a method. Furthermore, an element described herein of an apparatus embodiment is an example of a means for carrying out the function performed by the element for the purpose of carrying out the invention.
[0059] In the description provided herein, numerous specific details are set forth. However, it is understood that embodiments of the invention may be practiced without these specific details. In other instances, well-known methods, structures and techniques have not been shown in detail in order not to obscure an understanding of this description.
[0060] Similarly, it is to be noticed that the term coupled, when used in the claims, should not be interpreted as being limited to direct connections only. The terms "coupled" and "connected," along with their derivatives, may be used. It should be understood that these terms are not intended as synonyms for each other. Thus, the scope of the expression a device A coupled to a device B should not be limited to devices or systems wherein an output of device A is directly connected to an input of device B. It means that there exists a path between an output of A and an input of B which may be a path including other devices or means. "Coupled" may mean that two or more elements are either in direct physical or electrical contact, or that two or more elements are not in direct contact with each other but yet still co-operate or interact with each other.
[0061] Thus, while there has been described what are believed to be the preferred embodiments of the invention, those skilled in the art will recognize that other and further modifications may be made thereto without departing from the spirit of the invention, and it is intended to claim all such changes and modifications as falling within the scope of the invention. For example, any formulas given above are merely representative of procedures that may be used. Functionality may be added or deleted from the block diagrams and operations may be interchanged among functional blocks. Steps may be added or deleted to methods described within the scope of the present invention.

Claims

1. A method for controlling one or more electronic systems provided by a machine that is controlled by an operator, the method including: accessing data representative of a current measure of a physiological state of the operator, wherein the measure of the physiological state is defined relative to a scale having at least three distinct condition levels; based on the condition level of the current measure of the physiological state, identifying one or more operational countermeasures associated with that measure; and providing a signal thereby to trigger implementation of the identified one or more operational countermeasures.
2. A method according to claim 1 wherein the machine is a vehicle.
3. A method according to claim 1 or claim 2 wherein the current measure of a physiological state includes a measure of impairment.
4. A method according to claim 3 wherein the measure of impairment includes a measure of drowsiness.
5. A method according to claim 3 or claim 4 wherein the condition levels include three or more of the following:
(i) a condition level associated with a standard set of operational countermeasures for conventional vehicle operation;
(ii) a condition level associated with a set of operational countermeasures which simplify vehicle operation;
(iii) a condition level associated with a set of operational countermeasures which provide increased operator stimulation;
(iv) a condition level associated with a set of operational countermeasures which provide modification of operation conditions;
(v) a condition level associated with a set of operational countermeasures which render vehicle operation relatively more challenging compared to standard operation;
(vi) a condition level associated with a set of operational countermeasures which increase amplitude and/or frequency of audible warnings; and
(vii) a condition level associated with a set of operational countermeasures which limit vehicle manoeuvrability and/or stop the vehicle under controlled conditions.
6. A method according to claim 5 wherein the condition levels include a condition level associated with a set of operational countermeasures which simplify vehicle operation, wherein those countermeasures include automated cruise control and/or lane maintenance.
7. A method according to claim 5 or claim 6 wherein the condition levels include a condition level associated with a set of operational countermeasures which provide increased operator stimulation, wherein those countermeasures include any one or more of: enhancing feedback about lane keeping accuracy; applying acoustic and haptic feedback about lane departure; amplifying kinesthetic feedback of the adaptive cruise control; and/or introduction of increased sensitivity of FCW and LDW systems.
8. A method according to any one of claim 5 to claim 6 wherein the condition levels include a condition level associated with a set of operational countermeasures which provide modification of operation conditions, wherein those countermeasures include any one or more of: controlling windows; controlling a climate control system; controlling an entertainment system; and/or increasing sensitivity of FCW and LDW systems.
9. A method according to any one of claim 5 to claim 8 wherein the condition levels include a condition level associated with a set of operational countermeasures which render vehicle operation relatively more challenging compared to standard operation, wherein those countermeasures include one or more of the following: disabling cruise control; increasing steering resistance; reducing spring suspension; amplifying motor sound; and introduction of controlled disturbances.
10. A method according to any preceding claim wherein the data representative of a current measure of a physiological state of the operator includes data derived from processing of facial images.
11. A method according to claim 10 wherein the facial images are processed using an AI classifier thereby to determine an objective measure of impairment.
12. A method according to claim 10 wherein the facial images are processed thereby to identify characteristics of eyelid movement, thereby to determine an objective measure of impairment.
13. A method according to claim 10 wherein the facial images are processed thereby to identify eyelid position as a function of time, thereby to determine an objective measure of impairment.
14. A method according to claim 10 wherein the eyelid position as a function of time is processed thereby to extract a set of blepharometric artefacts, and those artefacts are processed thereby to determine an objective measure of impairment.
15. A method according to any one of claims 11 to claim 14 wherein the objective measure of impairment is an objective measure of drowsiness.
16. A method according to claim 1 wherein the machine is any one or more of: a vehicle; an in-flight entertainment system; a smartphone; a computing device; or an item of heavy machinery.
17. A method according to any preceding claim wherein the current measure of a physiological state includes a measure of impairment.
18. A method according to any preceding claim wherein the current measure of a physiological state includes a measure of drowsiness.
19. A method according to any preceding claim wherein the current measure of a physiological state includes detection of an adverse neurological event.
20. A method according to any preceding claim wherein the current measure of a physiological state includes detection of an adverse cardiovascular event.
21. A method according to any preceding claim wherein the current measure of a physiological state includes a measure relating to any one or more of the following: a degenerative cognitive impairment; a drug or alcohol induced state of cognitive impairment; and an injury induced state of cognitive impairment.
22. A method according to any preceding claim including a step of determining one or more demographic attributes of the operator, wherein the scale is influenced by the one or more demographic attributes of the operator.
23. A method for controlling one or more electronic systems provided by a machine that is controlled by an operator, the method including: accessing data representative of a current measure of a physiological state of impairment of the operator, wherein the measure of the physiological state of impairment is defined relative to a scale having at least three distinct condition levels; based on the condition level of the current measure of the physiological state of impairment, identifying one or more operational countermeasures associated with that measure; and providing a signal thereby to trigger implementation of the identified one or more operational countermeasures, such that operation of the vehicle becomes more challenging for the operator as the level of impairment increases.
24. A device configured to perform a method according to any one of claims 1-23.
25. A vehicle configured to perform a method according to any one of claims 1-23.
PCT/AU2023/000002 2022-04-08 2023-04-08 Interventional protocol in respect of human-operated machine based on processing of blepharometric data and/or other physiological parameters WO2023193038A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
AU2022900922 2022-04-08
AU2022900922A AU2022900922A0 (en) 2022-04-08 Interventional protocol in respect of human-operated machine based on processing of blepharometric data and/or other physiological parameters

Publications (1)

Publication Number Publication Date
WO2023193038A1 true WO2023193038A1 (en) 2023-10-12

Family

ID=88243614

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/AU2023/000002 WO2023193038A1 (en) 2022-04-08 2023-04-08 Interventional protocol in respect of human-operated machine based on processing of blepharometric data and/or other physiological parameters

Country Status (1)

Country Link
WO (1) WO2023193038A1 (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20080218359A1 (en) * 2007-03-08 2008-09-11 Denso Corporation Drowsiness determination apparatus, program, and method
US20150109131A1 (en) * 2013-10-15 2015-04-23 Volvo Car Corporation Vehicle driver assist arrangement
US20160001781A1 (en) * 2013-03-15 2016-01-07 Honda Motor Co., Ltd. System and method for responding to driver state

Similar Documents

Publication Publication Date Title
US10343693B1 (en) System and method for monitoring and reducing vehicle operator impairment
Sommer et al. Evaluation of PERCLOS based current fatigue monitoring technologies
Kaplan et al. Driver behavior analysis for safe driving: A survey
US9908530B1 (en) Advanced vehicle operator intelligence system
US9165326B1 (en) System and method to adjust insurance rate based on real-time data about potential vehicle operator impairment
Kang Various approaches for driver and driving behavior monitoring: A review
Friedrichs et al. Camera-based drowsiness reference for driver state classification under real driving conditions
JP6087088B2 (en) System and method for improving vehicle operator performance estimates
CA2649731C (en) An unobtrusive driver drowsiness detection method
Hossain et al. IOT based real-time drowsy driving detection system for the prevention of road accidents
Friedrichs et al. Drowsiness monitoring by steering and lane data based features under real driving conditions
WO2020152678A1 (en) Detection of cognitive state of a driver
Gonçalves et al. Driver state monitoring systems–transferable knowledge manual driving to HAD
CN109716411B (en) Method and apparatus to monitor activity level of driver
KR20190061726A (en) System and method for adjusting drivingcontrol of vehicle
Melnicuk et al. Towards hybrid driver state monitoring: Review, future perspectives and the role of consumer electronics
Poursadeghiyan et al. Determination the levels of subjective and observer rating of drowsiness and their associations with facial dynamic changes
Kumar et al. Detecting distraction in drivers using electroencephalogram (EEG) signals
US10945651B2 (en) Arousal level determination device
WO2023193038A1 (en) Interventional protocol in respect of human-operated machine based on processing of blepharometric data and/or other physiological parameters
Sontakke Efficient driver fatigue detection and alerting system
Poliak et al. Analysis and research plan of commercial truck drivers’ potentially dangerous driving behaviors caused by the changes of Regulation (EC) No. 561/2006
Thummar et al. A real time driver fatigue system based on eye gaze detection
Golz et al. Microsleep detection in electrophysiological signals
Dababneh et al. Driver vigilance level detection systems: A literature survey

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 23783983

Country of ref document: EP

Kind code of ref document: A1