US20210298614A1 - Methods of determining ventilatory threshold - Google Patents
- Publication number
- US20210298614A1 (U.S. application Ser. No. 17/347,293)
- Authority
- US
- United States
- Prior art keywords
- subject
- heart rate
- sensor
- monitoring device
- time
- Prior art date
- Legal status: Pending (status is an assumption by Google, not a legal conclusion)
Classifications
- A61B5/02055—Simultaneously evaluating both cardiovascular condition and temperature
- A61B5/0022—Monitoring a patient using a global network, e.g. telephone networks, internet
- A61B5/024—Detecting, measuring or recording pulse rate or heart rate
- A61B5/0816—Measuring devices for examining respiratory frequency
- A61B5/486—Bio-feedback
- A61B5/6817—Ear canal
- A61B5/6831—Straps, bands or harnesses
- A61B5/7246—Details of waveform analysis using correlation, e.g. template matching or determination of similarity
- A61B5/7415—Sound rendering of measured values, e.g. by pitch or volume variation
- G16H40/67—ICT specially adapted for the management or operation of medical equipment or devices, for remote operation
- A61B2503/10—Athletes
- A61B5/1118—Determining activity level
- A61B5/352—Detecting R peaks, e.g. for synchronising diagnostic apparatus; Estimating R-R interval
Definitions
- the present invention relates generally to monitoring devices and, more particularly, to monitoring devices for measuring physiological information.
- Wearable devices capable of monitoring physiological information, such as heart rate, are increasingly being used. These devices come in various form factors, including devices configured to be worn at the ear or at other locations of the body.
- Physiological information obtained from a subject can be used to generate various types of health and fitness assessments of the subject.
- blood flow information can be measured during daily activities of a subject, and this information can be used to generate assessments such as maximum oxygen consumption (VO2 max), total energy expenditure (TEE), etc.
- Embodiments of the present invention can facilitate identifying musical audio that can improve a person's exercise training, and can facilitate identifying the best biometric parameters for optimizing the person's exercise training.
- embodiments of the present invention can be used to study a person's biometric correlations with music while exercising: to learn how music tempo relates to a controllable biometric parameter, and then how to directly control that biometric as a means of indirectly controlling another biometric.
- embodiments of the present invention can be used to help a person learn how to minimize heart rate (HR) for a given workload and thus improve endurance during exercise (i.e., running, cycling, swimming, etc). Alternately, the person can learn how to maximize HR for a given workload and thus increase energy expenditure during exercise.
- a method of controlling a biometric parameter of a subject engaged in an activity includes sensing the biometric parameter via a monitoring device worn by the subject, determining frequency characteristics of the biometric parameter, and presenting to the subject musical audio having a tempo correlated to the frequency characteristics of the biometric parameter.
- the biometric parameter is breathing rate, and musical audio having a tempo correlated to frequency characteristics of the breathing rate is presented to the subject.
- the biometric parameter is heart rate, and musical audio having a tempo correlated to frequency characteristics of the heart rate is presented to the subject.
- the tempo of musical audio presented to the subject can be changed in order to cause a change in the biometric parameter. For example, if the biometric parameter is breathing rate, the tempo of the musical audio can be changed to cause a change in the breathing rate of the subject. If the biometric parameter is heart rate, the tempo of the musical audio can be changed to cause a change in the heart rate of the subject.
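A minimal sketch of such a tempo mapping and tempo adjustment, in Python. The four-beats-per-biometric-cycle ratio and the percentage nudge are illustrative assumptions, not formulas prescribed by the disclosure:

```python
def select_tempo_bpm(biometric_rate_per_min: float, beats_per_cycle: float = 4.0) -> float:
    """Map a sensed rate (e.g. breaths/min or beats/min) to a music tempo in BPM.

    Hypothetical mapping: `beats_per_cycle` musical beats per biometric cycle.
    """
    return biometric_rate_per_min * beats_per_cycle


def nudge_tempo_bpm(current_tempo_bpm: float, change_pct: float) -> float:
    """Shift the tempo by a percentage to induce a corresponding change
    in the entrained biometric parameter (breathing rate or heart rate)."""
    return current_tempo_bpm * (1.0 + change_pct / 100.0)
```

For example, a breathing rate of 30 breaths/min would map to 120 BPM music, and a +5% tempo nudge would raise that to 126 BPM to encourage faster breathing.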
- the monitoring device is configured to be positioned at or within an ear of the subject, and in other embodiments, the monitoring device is configured to be secured to an appendage of the subject or at a different location of the body of the subject. In some embodiments, the monitoring device is integrated within or otherwise associated with clothing worn by the subject.
- a method of controlling a biometric parameter of a subject engaged in an activity includes sensing the biometric parameter via a monitoring device worn by the subject (e.g., a device positioned at or within an ear of the subject, secured to an appendage of the subject, integrated within or otherwise associated with clothing worn by the subject, etc.) as musical audio is presented to the subject.
- Characteristics of the musical audio are analyzed in context with frequency characteristics of the biometric parameter.
- One or more correlations between the musical audio characteristics and the frequency characteristics of the biometric parameter are identified, and then additional musical audio is selected for subsequent presentation to the subject based on the one or more correlations.
- Selecting additional musical audio may include selecting musical audio having a tempo correlated to the frequency characteristics of the biometric parameter.
- if the biometric parameter is breathing rate, additional musical audio having a tempo correlated to frequency characteristics of the breathing rate is selected and presented.
- if the biometric parameter is heart rate, additional musical audio having a tempo correlated to frequency characteristics of the heart rate is selected and presented.
- the tempo of the additional musical audio presented to the subject can be changed in order to cause a change in the biometric parameter.
- selecting additional musical audio for presentation to the subject based on the one or more correlations includes selecting a playlist of additional musical audio.
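One way the correlation and selection steps might be sketched. The Pearson statistic, the hypothetical track library keyed by tempo, and the 4:1 beats-per-cycle ratio are illustrative choices not stated in the disclosure:

```python
def pearson(xs, ys):
    """Pearson correlation between two equal-length series, e.g. the tempo of
    presented tracks versus the biometric rate observed during each track."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5


def pick_correlated_tracks(library_bpm, biometric_rate_per_min,
                           beats_per_cycle=4.0, tol_bpm=5.0):
    """Return track names whose tempo lies near the tempo implied by the
    biometric rate. `library_bpm` maps track name -> tempo (BPM)."""
    target = biometric_rate_per_min * beats_per_cycle
    return sorted(t for t, bpm in library_bpm.items() if abs(bpm - target) <= tol_bpm)
```

A playlist could then be assembled from the tracks returned by `pick_correlated_tracks`, ordered by how strongly past sessions correlated with the desired biometric response.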
- a method of presenting musical audio to a subject engaged in an activity includes sensing physiological information from the subject via a monitoring device worn by the subject (e.g., a device positioned at or within an ear of the subject, secured to an appendage of the subject, integrated within or otherwise associated with clothing worn by the subject, etc.), analyzing the physiological information to identify a natural body frequency of the subject, and then presenting musical audio to the subject that is in resonance with the natural body frequency and/or in resonance with a harmonic of the natural body frequency.
- the physiological information includes breathing rate, RRi (R-R interval) and/or heart rate.
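Identifying a natural body frequency could, for example, amount to locating the dominant spectral peak of a sensed signal and choosing a tempo at that frequency or one of its harmonics. A naive discrete-Fourier sketch (a real implementation would use an FFT with windowing; the helper names are illustrative):

```python
import math


def dominant_frequency(samples, fs_hz):
    """Return the frequency (Hz) of the largest DFT bin, excluding DC.

    Naive O(n^2) DFT over a uniformly sampled signal.
    """
    n = len(samples)
    mean = sum(samples) / n
    best_k, best_power = 1, 0.0
    for k in range(1, n // 2):
        re = sum((s - mean) * math.cos(2 * math.pi * k * i / n) for i, s in enumerate(samples))
        im = sum((s - mean) * math.sin(2 * math.pi * k * i / n) for i, s in enumerate(samples))
        power = re * re + im * im
        if power > best_power:
            best_k, best_power = k, power
    return best_k * fs_hz / n


def resonant_tempo_bpm(f_hz: float, harmonic: int = 1) -> float:
    """Tempo (BPM) in resonance with the body frequency or one of its harmonics."""
    return f_hz * 60.0 * harmonic
```

For a breathing signal with a 0.25 Hz fundamental, the 8th harmonic gives a practical 120 BPM music tempo.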
- a method of modulating heart rate of a subject engaged in an activity includes sensing a breathing rate of the subject via a monitoring device worn by the subject (e.g., a device positioned at or within an ear of the subject, secured to an appendage of the subject, integrated within or otherwise associated with clothing worn by the subject, etc.), and then presenting to the subject musical audio having a tempo selected to change the breathing rate by an amount sufficient to cause a change in the heart rate by a desired amount.
- presenting to the subject musical audio having a tempo selected to change the breathing rate by an amount sufficient to cause a change in the heart rate may include presenting musical audio having a tempo selected to increase the breathing rate by an amount sufficient to cause an increase in the heart rate.
- presenting to the subject musical audio having a tempo selected to change the breathing rate by an amount sufficient to cause a change in the heart rate may include presenting musical audio having a tempo selected to decrease the breathing rate by an amount sufficient to cause a decrease in the heart rate.
- a method of determining a ventilatory threshold of a subject engaged in an activity includes sensing heart rate information, breathing rate information, and motion information from the subject via a monitoring device worn by the subject (e.g., a device positioned at or within an ear of the subject, secured to an appendage of the subject, integrated within or otherwise associated with clothing worn by the subject, etc.).
- the heart rate and breathing rate information is analyzed to identify one or more points in time where the subject's heart rate increased at a steady subject workload. Ventilatory threshold is then identified as occurring at a point in time where a rapid increase in breathing rate lagged a rapid increase in heart rate.
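The lag pattern described above might be detected as follows. The percentage threshold, window length, and use of cadence as the steady-workload proxy are illustrative assumptions; the disclosure describes the pattern, not specific numbers:

```python
def estimate_vt_index(hr, br, cadence, rise_pct=10.0, window=5, cadence_tol=2.0):
    """Return an index where breathing rate surges shortly after a heart-rate
    surge while cadence (a workload proxy) stays steady, or None.

    `hr`, `br`, and `cadence` are equally sampled series of the same length.
    """
    def surge(series, i):
        base = series[i - window]
        return base > 0 and (series[i] - base) / base * 100.0 >= rise_pct

    for i in range(window, len(br)):
        recent = cadence[i - window:i + 1]
        steady = max(recent) - min(recent) <= cadence_tol
        # breathing-rate surge that lags an earlier heart-rate surge
        if steady and surge(br, i) and any(surge(hr, j) for j in range(window, i)):
            return i
    return None
```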
- a method of determining body temperature of a subject engaged in an activity includes sensing heart rate information and motion information from the subject via a monitoring device worn by the subject (e.g., a device positioned at or within an ear of the subject, or secured to an appendage of the subject, etc.).
- the heart rate information is analyzed to identify one or more points in time where heart rate changed at a steady subject workload.
- the body temperature of the subject is then determined at each of the one or more points in time based on an amount of change in heart rate at each respective point in time.
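A minimal sketch of mapping heart-rate drift at steady workload to a temperature estimate. The linear ~10 bpm rise per degree Celsius is a commonly cited approximation and an assumption here; the disclosure does not give a coefficient:

```python
def estimate_core_temp_c(hr_now: float, hr_baseline: float,
                         baseline_temp_c: float = 37.0,
                         bpm_per_deg_c: float = 10.0) -> float:
    """Estimate core body temperature from heart-rate drift at a steady
    workload, assuming a linear bpm-per-degree relationship."""
    return baseline_temp_c + (hr_now - hr_baseline) / bpm_per_deg_c
```

Under these assumptions, a 10 bpm drift above the steady-workload baseline would suggest roughly a one-degree rise in core temperature.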
- a method of controlling a physical activity parameter of a subject engaged in an activity includes sensing the physical activity parameter and a periodic biometric parameter via at least one monitoring device worn by the subject. Frequency characteristics of the activity parameter and biometric parameter are determined and audio feedback is provided to the subject to encourage the subject to maintain a frequency of the physical activity parameter such that the physical activity parameter and biometric parameter share a common fundamental frequency.
- the physical activity parameter includes an exercise cadence and the biometric parameter includes heart rate and/or breathing rate.
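Checking whether cadence and a biometric rate share a common fundamental frequency can reduce to testing whether their ratio is close to a small integer. The tolerance below is an illustrative choice:

```python
def shares_fundamental(f_activity_hz: float, f_biometric_hz: float,
                       tol: float = 0.05) -> bool:
    """True if the two rates are locked to a common fundamental, i.e. the
    ratio of the faster to the slower is near an integer."""
    hi, lo = max(f_activity_hz, f_biometric_hz), min(f_activity_hz, f_biometric_hz)
    ratio = hi / lo
    return abs(ratio - round(ratio)) <= tol
```

For example, a running cadence of 160 steps/min alongside a heart rate of 160 bpm (ratio 1:1) would pass this check, and audio feedback could prompt the subject to hold that cadence.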
- a method of generating a physiological assessment of a subject includes guiding the subject into a state of controlled breathing, sensing physical activity information via at least one monitoring device worn by the subject, sensing biometric information via the at least one monitoring device, processing the physical activity information and biometric information to generate a physiological assessment of the subject, and providing feedback to the subject related to the physiological assessment.
- the at least one monitoring device may include a PPG sensor, an ECG sensor, an auscultatory sensor, a piezoelectric sensor, a ballistogram sensor, or a bioimpedance sensor.
- a physiological assessment may include subject health status, subject physical fitness, subject physical stress status, and/or subject mental stress status.
- Guiding the subject into a state of controlled breathing may include providing the subject with audible and/or visual instructions.
- audible and/or visual instructions may be provided by a human being.
- audible and/or visual instructions may be provided via an electronic device, such as a cell phone, television, computer, etc.
- Providing feedback to the subject related to the physiological assessment may include providing the subject with audible and/or visual feedback.
- sensing biometric information includes monitoring R-R interval in an electrocardiogram or photoplethysmogram of the subject.
- sensing physical activity information includes sensing subject distance traveled, subject speed, subject acceleration, subject cadence, subject pace, or subject gait.
- FIG. 1 illustrates an audio earbud capable of sensing physiological information from a person wearing the earbud.
- FIG. 2 illustrates a wrist band positioned around a wrist of a person, which may also be worn at another limb or digit, and that includes a sensor module configured to sense physiological information from the person.
- FIG. 3 illustrates an audio earbud capable of sensing physiological information from a person wearing the earbud.
- FIG. 4A is a perspective view of the wrist band of FIG. 2 .
- FIG. 4B is a cross-sectional view of the wrist band of FIG. 4A illustrating a sensor module on an inside surface of the wrist band.
- FIG. 5 is a block diagram of a system for enhancing the biometric performance of a person via musical audio, according to some embodiments of the present invention.
- FIG. 6 is a block diagram of a system for enhancing the biometric performance of a person via musical audio using both acute and chronic feedback, according to some embodiments of the present invention.
- FIG. 7 is a block diagram of a system for enhancing the biometric performance of a person via musical audio using both acute and chronic feedback and wherein the sensor data includes PPG data, inertial data and audio data, according to some embodiments of the present invention.
- FIGS. 8-10 are flowcharts of operations for enhancing the biometric performance of a person via musical audio, according to some embodiments of the present invention.
- FIG. 11 is a plot illustrating a method of determining optimal breathing rate and music tempo for minimum or maximum cardiac exertion at a steady cadence, according to some embodiments of the present invention.
- FIG. 12 is a plot illustrating a method of determining optimal cadence and music tempo for minimum or maximum cardiac exertion at a steady speed, according to some embodiments of the present invention.
- FIG. 13 is a plot of breathing rate and heart rate data collected over a period of time for a person, from which the ventilatory threshold (VT) can be estimated, according to some embodiments of the present invention.
- FIG. 14 is a plot of breathing rate over time for a person, illustrating the manipulation of VT via musical audio, according to some embodiments of the present invention.
- FIGS. 15-17 are flowcharts of operations for enhancing the biometric performance of a person via audio, according to some embodiments of the present invention.
- FIGS. 18-19 are flowcharts of operations for guiding a subject to control breathing such that a physiological assessment may be generated for the subject based on biometric sensor data collected during the guided controlled breathing, according to some embodiments of the present invention.
- FIG. 20 is a flowchart of operations for determining if a subject is in a state of controlled breathing or uncontrolled breathing, according to some embodiments of the present invention.
- FIG. 21 is a plot of RRi vs. time collected by a user wearing a PPG sensor module having an inertial sensor, according to some embodiments of the present invention.
- FIG. 22A is a Poincaré plot showing successive RR-intervals from FIG. 21 plotted against each other.
- FIG. 22B illustrates ellipses that fit upon data points for controlled and uncontrolled breathing in FIG. 22A .
- FIG. 23 illustrates a physiological assessment presentation, according to some embodiments of the present invention.
- FIG. 24 is a table of physiological assessment parameters and potential assessments, according to some embodiments of the present invention.
- the terms “comprise”, “comprising”, “comprises”, “include”, “including”, “includes”, “have”, “has”, “having”, or variants thereof are open-ended, and include one or more stated features, integers, elements, steps, components or functions but do not preclude the presence or addition of one or more other features, integers, elements, steps, components, functions or groups thereof.
- the common abbreviation “e.g.”, which derives from the Latin phrase “exempli gratia,” may be used to introduce or specify a general example or examples of a previously mentioned item, and is not intended to be limiting of such item.
- the common abbreviation “i.e.”, which derives from the Latin phrase “id est,” may be used to specify a particular item from a more general recitation.
- phrases such as “between X and Y” and “between about X and Y” should be interpreted to include X and Y.
- phrases such as “between about X and Y” mean “between about X and about Y.”
- phrases such as “from about X to Y” mean “from about X to about Y.”
- spatially relative terms such as “under”, “below”, “lower”, “over”, “upper” and the like, may be used herein for ease of description to describe one element or feature's relationship to another element(s) or feature(s) as illustrated in the figures. It will be understood that the spatially relative terms are intended to encompass different orientations of the device in use or operation in addition to the orientation depicted in the figures. For example, if a device in the figures is inverted, elements described as “under” or “beneath” other elements or features would then be oriented “over” the other elements or features. Thus, the exemplary term “under” can encompass both an orientation of over and under.
- the device may be otherwise oriented (rotated 90 degrees or at other orientations) and the spatially relative descriptors used herein interpreted accordingly.
- the terms “upwardly”, “downwardly”, “vertical”, “horizontal” and the like are used herein for the purpose of explanation only unless specifically indicated otherwise.
- although the terms “first” and “second” are used herein to describe various features or elements, these features or elements should not be limited by these terms. These terms are only used to distinguish one feature or element from another feature or element. Thus, a first feature or element discussed below could be termed a second feature or element, and similarly, a second feature or element discussed below could be termed a first feature or element without departing from the teachings of the present invention.
- sensor refers to a sensor element or group of sensor elements that may be utilized to sense information, such as information (e.g., physiological information, body motion, etc.) from the body of a subject and/or environmental information in a vicinity of the subject.
- a sensor/sensing element/sensor module may comprise one or more of the following: a detector element, an emitter element, a processing element, optics, mechanical support, supporting circuitry, and the like. Both a single sensor element and a collection of sensor elements may be considered a sensor, a sensing element, or a sensor module.
- optical emitter may include a single optical emitter and/or a plurality of separate optical emitters that are associated with each other.
- optical detector may include a single optical detector and/or a plurality of separate optical detectors that are associated with each other.
- wearable sensor module refers to a sensor module configured to be worn on or near the body of a subject.
- monitoring device and “biometric monitoring device”, as used herein, are interchangeable and include any type of device, article, or clothing that may be worn by and/or attached to a subject and that includes at least one sensor/sensing element/sensor module.
- Exemplary monitoring devices may be embodied in an earpiece, a headpiece, a finger clip, a digit (finger or toe) piece, a limb band (such as an arm band or leg band), an ankle band, a wrist band, a nose piece, a sensor patch, eyewear (such as glasses or shades), apparel (such as a shirt, hat, underwear, etc.), a mouthpiece or tooth piece, contact lenses, or the like.
- monitoring refers to the act of measuring, quantifying, qualifying, estimating, sensing, calculating, interpolating, extrapolating, inferring, deducing, or any combination of these actions. More generally, “monitoring” refers to a way of getting information via one or more sensing elements.
- blood health monitoring includes monitoring blood gas levels, blood hydration, and metabolite/electrolyte levels.
- headset is intended to include any type of device or earpiece that may be attached to or near the ear (or ears) of a user and may have various configurations, without limitation.
- Headsets incorporating sensor modules, as described herein may include mono headsets (a device having only one earbud, one earpiece, etc.) and stereo headsets (a device having two earbuds, two earpieces, etc.), true wireless headsets (having two wireless earpieces), earbuds, hearing aids, ear jewelry, face masks, headbands, glasses or eyewear, and the like.
- the term “headset” may include broadly headset elements that are not located on the head but are associated with the headset.
- the wearable medallion would be considered part of the headset as a whole.
- the term “headset” may refer to the headphone-mobile device combination.
- the terms “headset” and “earphone”, as used herein, are interchangeable.
- physiological refers to matter or energy of or from the body of a creature (e.g., humans, animals, etc.). In embodiments of the present invention, the term “physiological” is intended to be used broadly, covering both physical and psychological matter and energy of or from the body of a creature.
- body refers to the body of a subject (human or animal) that may wear a monitoring device, according to embodiments of the present invention.
- a localized signal processor may comprise one or more signal processors or processing methods localized to a general location, such as to a wearable device.
- wearable devices may comprise an earpiece, a headpiece, a finger clip, a digit (finger or toe) piece, a limb band (such as an arm band or leg band), an ankle band, a wrist band, a nose piece, a sensor patch, eyewear (such as glasses or shades), apparel (such as a shirt, hat, underwear, etc.), a mouthpiece or tooth piece, contact lenses, or the like.
- Examples of a distributed processor comprise “the cloud”, the internet, a remote database, a remote processor computer, a plurality of remote processors or computers in communication with each other, or the like, or processing methods distributed amongst one or more of these elements.
- a distributed processor may include delocalized elements, whereas a localized processor may work independently of a distributed processing system.
- microprocessors, microcontrollers, ASICs (application specific integrated circuits), analog processing circuitry, or digital signal processors are a few non-limiting examples of physical signal processors that may be found in wearable devices.
- remote does not necessarily mean that a remote device is a wireless device or that it is a long distance away from a device in communication therewith. Rather, the term “remote” is intended to reference a device or system that is distinct from another device or system or that is not substantially reliant on another device or system for core functionality. For example, a computer wired to a wearable device may be considered a remote device, as the two devices are distinct and/or not substantially reliant on each other for core functionality. However, any wireless device (such as a portable device, for example) or system (such as a remote database for example) is considered remote to any other wireless device or system.
- music and “musical audio”, as used herein, are interchangeable and refer to vocal and/or instrumental sounds that can be played or otherwise presented to a person, for example, via speakers, earbuds, etc.
- tempo refers to the speed of the rhythm at which a composition of music is played. Conventionally, tempo is measured according to beats per minute. For example, according to some conventions, a very fast tempo, prestissimo, has between 200 and 208 beats per minute, presto has 168 to 200 beats per minute, allegro has between 120 and 168 beats per minute, moderato has 108 to 120 beats per minute, andante has 76 to 108 beats per minute, adagio has 66 to 76 beats per minute, larghetto has 60 to 66 beats per minute, and largo has 40 to 60 beats per minute.
- cadence refers to the frequency or repetition rate of an activity.
- one's primary cadence may be their running or walking cadence (their footstep rate).
- one's primary cadence may be their weightlifting cadence (their “rep rate” or repetition rate).
- exercise cadence refers to the primary cadence of a particular exercise. It should be noted that some exercises may be characterized by actions that involve more than one cadence—for example, during swimming, one may have an arm-stroke cadence which is different than the cadence of their leg motion. Thus, in some cases, more than one exercise cadence may be required to accurately assess one's workload.
- workload refers to the amount of work required to perform a task, for example, the amount of work required by a person to perform a task (e.g., running, swimming, exercising, etc.). Workload is typically measured in terms of power (watts) or total energy (joules or calories burned). However, workload may be estimated by measuring or estimating a cadence, speed, or distance traveled by someone and applying various assumptions to correlate distance or exercise cadence with work performed. For example, a constant workload may be associated with running at a constant speed on a flat surface (altitude changes must be factored in, since running up a hill is more work than running down a hill at the same speed).
- Other indirect measurements such as measurements of heart rate, respiration rate, etc. can be used to estimate workload.
- Combinational models of biometric parameters (heart rate, respiration rate, blood pressure, and the like) and physical activity parameters (distance traveled, speed, acceleration, and the like) may also be applied towards indirectly measuring (estimating) workload.
- using heart rate to estimate workload can be deceiving, as heart rate may increase with internal body changes (such as temperature changes, digestion, exhaustion, and the like) independently of true physical workload.
- Examples of direct measurements of workload may include treadmill distance monitoring, cadence monitoring, power metering, video recording, or the like.
- Cadence monitoring may be particularly useful for a person wearing a wearable device, as cadence may be measured by an accelerometer.
- an assessment of workload may be approximated by measuring a user's cadence.
- an assessment of workload may be generated by measuring the user cadence of a physical activity by multiplying the user cadence by a scaler value “m” and adding a constant “b” in the form of a linear equation, where the “m” value is an experimentally derived slope relating a user's cadence to energy expenditure and the “b” value is related to an estimate of REE (resting energy expenditure).
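The linear cadence-to-workload model described above can be sketched as follows; the slope “m” and intercept “b” values here are placeholder assumptions for illustration, not values taken from this disclosure:

```python
def estimate_workload(cadence_spm, m=0.05, b=1.2):
    """Estimate energy expenditure rate (e.g., kcal/min) from a measured
    user cadence (steps per minute) via the linear model m * cadence + b.

    The slope m (relating cadence to energy expenditure) and intercept b
    (an estimate of resting energy expenditure) are hypothetical values;
    in practice they would be derived experimentally for a given user."""
    return m * cadence_spm + b

# A running cadence of 160 steps/min under the assumed coefficients:
workload = estimate_workload(160.0)
```

In practice the coefficients would be fit per user or per population from reference measurements (e.g., gas exchange analysis).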
- energy expenditure and “total calories burned”, as used herein, are interchangeable. It should be understood that energy expenditure and workload are not necessarily equal, as energy can be expended at rest due to homeostasis or fidgeting, which is not considered as a workload. As with workload, energy expenditure may be indirectly measured (estimated) via physiological models, i.e., by applying models for heart rate vs. energy expenditure or accelerometry measurements vs. energy expenditure, or combination models of heart rate and accelerometry. However, it should be noted that using heart rate to estimate energy expenditure also may be deceiving, as heart rate may reach a physical limit (maximum possible heart rate) during intense exercise even when energy expenditure is increasing. Examples of direct measurements of energy expenditure may include gas exchange analysis, doubly-labeled water assays, metabolic chamber monitoring, or the like.
- heart rate and “pulse rate”, as used herein, are interchangeable.
- RRi refers to “R-R interval” in the electrocardiogram or photoplethysmogram of a person.
- RRi may also be applied in a similar manner.
- HRV refers to “heart rate variability” or “R-R variability”, which is a statistical representation of a group of consecutive R-R intervals or N-N intervals (beat-to-beat intervals between consecutive heart beats).
- the types of statistics performed to generate an HRV value can be quite numerous and broad. In general, a variety of different time-domain and/or frequency domain statistics on heart beat intervals can be described as different HRV values.
- 2- or 5-minutes worth of R-R intervals may be processed to determine the mean and standard deviation (SDNN), which is a representation of HRV.
- the higher the SDNN for a group of R-R intervals collected from a person the more relaxed, physically fit, or healthy that person may be.
- N-N intervals may be collected via photoplethysmograms (PPG), electrocardiograms (ECG), or the like.
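A minimal sketch of the SDNN computation described above, assuming a window of beat-to-beat intervals in milliseconds has already been extracted:

```python
from statistics import mean, stdev

def mean_and_sdnn(nn_intervals_ms):
    """Return the mean N-N interval and SDNN (the sample standard
    deviation of the N-N intervals), a common time-domain HRV metric."""
    return mean(nn_intervals_ms), stdev(nn_intervals_ms)

# e.g., a short window of beat-to-beat intervals (ms):
m, sd = mean_and_sdnn([812, 790, 845, 860, 798, 830, 805])
```

As noted above, 2 to 5 minutes of intervals are typically processed at a time; the seven-beat window here is only for illustration.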
- natural body frequency refers to a resonant frequency of the body or other characteristic frequency of the body where some sort of resonance may occur.
- resonance between breathing rate and heart rate may be a natural body frequency.
- Resonance can refer to the case where heart rate and respiration rate share a fundamental frequency (they are harmonics of the same fundamental frequency) or it can refer to the case where the heart rate variability (HRV) is at a maximum.
- a resonance between someone's running or walking cadence (or more broadly their exercise cadence) and heart rate (or other periodic vital sign) may be a natural body frequency.
- the mechanical structure of the body in terms of springs, damping, and the like
- the periodic neurological processing of the human body, or homeostasis may be characterized by a natural frequency.
- fundamental frequency refers to the lowest frequency of a periodic waveform.
- breathing rate and heart rate may resonate with each other when they share a fundamental frequency, such as when heart rate is 6 ⁇ that of the breathing rate.
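Such a shared-fundamental condition can be checked numerically. The sketch below is illustrative only (the tolerance value is an assumption), not a method specified by this disclosure:

```python
def shares_fundamental(heart_rate_bpm, breathing_rate_bpm, tol=0.05):
    """Return True when heart rate is approximately an integer multiple
    (harmonic) of breathing rate, i.e., the two rates share a
    fundamental frequency."""
    ratio = heart_rate_bpm / breathing_rate_bpm
    return round(ratio) >= 1 and abs(ratio - round(ratio)) <= tol

resonant = shares_fundamental(84, 14)  # 84 BPM heart rate = 6 x 14 breaths/min
```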
- biometric parameters and activity parameters may be described herein by using the name of the parameter (such as “heart rate”, VO 2 max, and the like).
- these names may refer to instantaneous values, averaged values, or some other processing of the associated parameter(s).
- a breathing rate of 14 BPM (breaths per minute) may refer to an instantaneous measurement or an averaged measurement (for example, an average breathing rate of 14 BPM as averaged over 5 minutes).
- peripheral biometric parameter refers to a biometric parameter that is derived from a periodic process in the body of a subject such that it is characterized by a rate or frequency, such as heart rate, breathing rate, homeostasis, RRi, neurological functioning, sleep cycles, and the like.
- the ear is an ideal location for wearable health and environmental monitors.
- the ear is a relatively immobile platform that does not obstruct a person's movement or vision.
- Monitoring devices located at an ear have, for example, access to the inner-ear canal and tympanic membrane (for measuring core body temperature), muscle tissue (for monitoring muscle tension), the pinna, earlobe, and elsewhere (for monitoring blood gas levels), the region behind the ear (for measuring skin temperature and galvanic skin response), and the internal carotid artery (for measuring cardiopulmonary functioning), etc.
- the ear is also at or near the point of exposure to: environmental breathable toxicants of interest (volatile organic compounds, pollution, etc.); noise pollution experienced by the ear; and lighting conditions for the eye.
- because the ear canal is naturally designed for transmitting acoustical energy, the ear provides a good location for monitoring internal sounds, such as heartbeat, breathing rate, and mouth motion.
- Accurate sensing of photoplethysmograms and heart rate from the ear has been demonstrated in regions between the concha and anti-tragus locations of the outer ear, and elsewhere at the ear.
- Optical coupling into the blood vessels of the ear may vary between individuals.
- the term “coupling” refers to the interaction or communication between excitation energy (such as light) entering a region and the region itself.
- one form of optical coupling may be the interaction between excitation light generated from within an optical sensor of an earbud (or other device positioned at or within an ear) and the blood vessels of the ear.
- this interaction may involve excitation light entering the ear region and scattering from a blood vessel in the ear such that the temporal change in intensity of scattered light is proportional to a temporal change in blood flow within the blood vessel.
- optical coupling may be the interaction between excitation light generated by an optical emitter within an earbud and a light-guiding region of the earbud.
- an earbud with integrated light-guiding capabilities wherein light can be guided to multiple and/or select regions along the earbud, can assure that each individual wearing the earbud will generate an optical signal related to blood flow through the blood vessels.
- Optical coupling of light to a particular ear region of one person may not yield photoplethysmographic signals for each person. Therefore, coupling light to multiple regions may assure that at least one blood-vessel-rich region will be interrogated for each person wearing an earbud. Coupling multiple regions of the ear to light may also be accomplished by diffusing light from a light source within an earbud.
- FIGS. 1 and 3 illustrate audio earbud 20 capable of sensing physiological information and configured to be positioned within an ear of a subject, according to some embodiments of the present invention.
- the illustrated apparatus 20 of FIG. 3 includes an earpiece body or housing 22 , a sensor module 24 , a stabilizer 25 (optional), and a sound port 26 .
- the sensor module 24 When positioned within the ear of a subject, the sensor module 24 has a region 24 a configured to contact a selected area of the ear.
- the illustrated sensor region 24 a may be contoured (i.e., is “form-fitted”) to matingly engage a portion of the ear between the anti-tragus and acoustic meatus, and the stabilizer is configured to engage the anti-helix.
- monitoring devices in accordance with embodiments of the present invention can have sensor modules with one or more regions configured to engage various portions of the ear.
- Various types of devices configured to be worn at or near the ear may be utilized in conjunction with embodiments of the present invention.
- FIGS. 2 and 4A-4B illustrate a monitoring apparatus 30 in the form of a sensor band 32 configured to be secured to an appendage (e.g., an arm, wrist, hand, finger, toe, leg, foot, neck, etc.) of a subject.
- the band 32 includes a sensor module 34 on or extending from the inside surface 32 a of the band 32 .
- the sensor module 34 is configured to detect and/or measure physiological information from the subject and includes a sensor region 34 a that may be contoured to contact the skin of a subject wearing the apparatus 30 .
- the sensor region 34 a may comprise a photoplethysmography (PPG), bioimpedance sensor, ballistogram sensor, auscultatory sensor, thermal sensor, or the like.
- the sensor modules 24 , 34 for the illustrated monitoring devices 20 , 30 of FIGS. 1, 2, 3 and 4A-4B are configured to detect and/or measure physiological information from a subject wearing the monitoring devices 20 , 30 .
- the sensor modules 24 , 34 may be configured to detect and/or measure one or more environmental conditions in a vicinity of the subject wearing the monitoring devices 20 , 30 .
- a sensor module 24 , 34 utilized in accordance with embodiments of the present invention may be an optical sensor module that includes at least one optical emitter and at least one optical detector arranged in a PPG sensor configuration (reflection-mode and/or transmission-mode).
- Exemplary optical emitters include, but are not limited to light-emitting diodes (LEDs), laser diodes (LDs), compact incandescent bulbs, organic LEDs (OLEDs), micro-plasma emitters, IR blackbody sources, or the like.
- Exemplary optical detectors include, but are not limited to, photodiodes, photodetectors, solar cells, CCD (charge-coupled device) cameras, photomultipliers, avalanche photodiodes, CMOS-imaging circuits, or the like.
- a sensor module may include various types of sensors including and/or in addition to optical sensors.
- a sensor module may include one or more inertial sensors (e.g., an accelerometer, optical sensor, blocked-channel sensor, piezoelectric sensor, vibration sensor, photoreflector sensor, pressure sensor, etc.) for detecting changes in motion, one or more thermal sensors (e.g., a thermopile, thermistor, resistor, etc.) for measuring temperature of a part of the body, one or more electrical sensors for measuring changes in electrical conduction, one or more skin perspiration or humidity sensors, and/or one or more acoustical sensors.
- the system 100 includes at least one processor 40 that is coupled to the sensor(s) of a sensor module 24 , 34 and that is configured to receive and analyze signals produced by the sensor(s).
- the at least one processor 40 utilizes one or more algorithms 50 for enhancing the biometric performance of a person and/or modulating one or more biometric parameters via musical audio, as will be described below.
- FIG. 6 illustrates a system 100 for enhancing the biometric performance of a person via musical audio using both acute and chronic feedback, according to some embodiments of the present invention.
- the system 100 includes one or more monitoring devices 20 , 30 configured to measure physiological information from the person wearing the devices 20 , 30 , such as heart rate, breathing rate, etc.
- One or more local processors 40 are configured to receive data from the sensors associated with the monitoring devices 20 , 30 and to provide real-time (also referred to as “acute”) feedback to the person based upon both personalized and generalized physiological models.
- the system 100 includes one or more remote processors 40 ′ that are configured to process stored data from the devices 20 , 30 and provide long-term (also referred to as “chronic”) feedback to the person.
- a feedback loop is provided to update algorithms for personalized processing, based on long-term trends observed over time.
- the remote processor 40 ′ (such as a cloud processor) may process sets of acquired sensor data to determine that someone is at risk of a cardiac condition (such as arrhythmia, atrial fibrillation, a heart attack, stroke, and the like).
- feedback may be sent to at least one local processor 40 to update processing resources and to focus those processing resources on monitoring for the cardiac condition of interest.
- the sampling frequency or polling of a sensor may be increased or an unpowered or sleeping sensor may be turned on or awakened.
- FIG. 7 illustrates a specific embodiment of the system 100 of FIG. 6 , namely a system 200 for enhancing the biometric performance of a person via musical audio using both acute and chronic feedback and wherein the sensor data includes PPG data, inertial data and audio data, according to some embodiments of the present invention.
- the system 200 includes one or more monitoring devices 20 , 30 configured to measure PPG data, inertial data, and audio data.
- the audio data is from musical audio played to the person wearing monitoring device 20 (e.g., an earbud).
- One or more local processors 40 are configured to receive the PPG data, inertial data and audio data from the sensors associated with the monitoring devices and provide real-time (acute) feedback to the person based upon both personalized and generalized physiological models.
- the system 200 includes one or more remote processors 40 ′ that are configured to process stored data from the devices 20 , 30 and provide long-term (chronic) feedback to the person.
- a method of controlling a biometric parameter of a person includes sensing vital sign data, such as, in this particular example, heart rate and/or breathing rate, via one or more monitoring devices 20 , 30 (Block 300 ) and determining frequency characteristics of the heart rate and/or breathing rate (Block 302 ).
- Music audio is then selected and presented to the person (e.g., via an audio earbud 20 ) based on the frequency characteristics of the person's heart rate and/or breathing rate (Block 304 ).
- the person's breathing rate and/or heart rate can be modulated.
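The music-selection step above might be sketched as choosing a tempo that is an integer harmonic of the measured breathing rate and lies within a target tempo window; the window bounds here are assumptions for illustration, not values from this disclosure:

```python
def tempo_for_breathing_rate(breathing_rate_bpm, lo=100, hi=140):
    """Return the lowest music tempo (beats per minute) that is an
    integer harmonic of the measured breathing rate and falls within
    the assumed target window [lo, hi]; None if no harmonic fits."""
    harmonics = [n * breathing_rate_bpm for n in range(1, 40)]
    in_window = [t for t in harmonics if lo <= t <= hi]
    return in_window[0] if in_window else None

# Breathing at 14 breaths/min: harmonics 112, 126, 140 all qualify.
tempo = tempo_for_breathing_rate(14)
```

Presenting music at such a tempo is one way the breathing rate and the music could be brought into resonance, per the description above.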
- characteristics of a photoplethysmogram other than heartbeat or respiration rate frequency, also may be employed in embodiments of the present invention.
- the amplitude, ramp rate, decay rate, shape, etc. of a PPG waveform may be characterized by a processor and then music may be presented to the user based on at least one of these characteristics. For example, if a processor determines that a person has a sharp rise-time (or fall-time) to his/her PPG waveform, the music may be modified such that every up-beat (or down beat) is accentuated or sped-up in time in order to resonate or correlate with that of the PPG waveform.
- the tempo or amplitude of the music may be modified with the ramp-rate or decay-rate of a heart rate or breathing rate of the user.
- the music may be modified such that the acoustical waveforms have a sawtooth characteristic.
- a processor determines a person's heart rate and breathing rate are characterized by distinct frequencies or frequency bands, then the processor may modify the music such that harmonics of each frequency or frequency band are introduced into the playlist or are incorporated or alternated in a selected song.
- PPG information may be processed into a variety of biometrics other than heart rate and breathing rate, such as blood pressure, blood hydration level, blood analyte (blood oxygen, CO 2 , CO, glucose, etc.) level, R-R interval (RRi), heart rate variability (HRV) information, hemodynamic information, cardiac output, aerobic capacity (VO 2 max), VO 2 , metabolic rate, health status information, breathing volume (inhalation and exhalation volume) and the like.
- biometrics may also comprise frequency characteristics that can be mapped to musical frequencies, and some embodiments of the present invention, such as those described with respect to FIG. 8 and FIG. 9 , may be applied using these biometrics instead of, or in addition to, heart rate and breathing rate.
- a method of controlling a biometric parameter of a person includes sensing vital sign data, such as heart rate and breathing rate, via one or more monitoring devices 20 , 30 , as musical audio is played to the person, for example via an audio earbud 20 (Block 310 ).
- One or more characteristics of the music, such as tempo, pitch, frequency, rhythm, articulation, order, cadence, instrumentation characteristics, tone, voice characteristics, and the like, are identified (Block 312 ).
- the vital sign data from the person is then analyzed in context with the music characteristics to identify any correlations (Block 314 ) and these correlations are stored (Block 316 ).
- a music playlist may then be adjusted to play musical audio to the person based on one or more stored correlations between music and vital signs data (Block 318 ).
- a processor may determine that high voice pitch (or treble-rich music) is correlated with higher user heart rate, and the processor may then adjust the playlist to play high-voice-pitch music to increase heart rate of the user or to play low-voice-pitch music (or bass-rich music) to lower heart rate of the user.
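The correlation step (Block 314 ) could be sketched as a simple Pearson correlation between a per-song music characteristic (e.g., tempo or voice pitch) and the user's mean heart rate while that song played; the data values below are hypothetical:

```python
def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# hypothetical per-song tempos (BPM) and mean heart rates (BPM):
r = pearson_r([90, 110, 130, 150], [118, 124, 131, 139])
# a strongly positive r would suggest faster music correlates with higher HR
```

A stored correlation like this is what the playlist adjustment (Block 318 ) would then act on.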
- a method of enhancing the biometric performance of a person via audio includes analyzing vital sign data (e.g., heart rate, breathing rate, etc.) from a person to identify natural body frequency (Block 320 ).
- Music audio is then played to the person, for example via an audio earbud 20 , in resonance with or in a harmonic of, the natural body frequency of the person (Block 322 ).
- both heart rate and respiration rate are determined by a processor, and the audio is selected to manipulate the respiration rate (as described below with respect to FIG. 11 ) such that the respiration rate and heart rate are always in resonance with each other (i.e., they share a fundamental frequency or the heart rate variability is highest).
- Another method of enhancing the biometric performance of a person via musical audio may include analyzing vital sign data (e.g., heart rate, breathing rate, etc.) from a person as well as analyzing physical activity data (user cadence, speed, pace, gait, etc.) from the person and notifying the person when the cadence and vital sign are characterized by the same fundamental frequency. Additionally, the music playlist may be adjusted to this common frequency (or pitch) or the songs may be stretched or compressed in time to match this common frequency or to match with at least a harmonic of the fundamental frequency.
- FIG. 11 is a plot 400 of a response metric 404 and controlled metric 402 over time that illustrates a method of determining optimal breathing rate and music tempo for minimum cardiac exertion at a steady cadence, according to some embodiments of the present invention.
- the controlled metric 402 is respiration rate and the response metric 404 is heart rate.
- a person's respiration rate may subconsciously track with a harmonic of the tempo of a given musical audio heard by the person.
- the person's respiration rate may be controlled by the music, and heart rate may respond to the respiration rate.
- an optimal controlled metric associated with an optimal response metric, for a given workload (or more generally, “activity state”), can be learned.
- the ideal breathing rate 402 (and associated musical audio tempo 406 ) for minimal cardiac exertion/workload (i.e., minimum heart rate (HR) at a given workload) is identified as that of time period C.
- the ideal breathing rate 402 (and associated music tempo 406 ) for maximal energy expenditure/workload is identified as that of time period E, which is associated with the maximum cardiac exertion/energy expenditure (i.e., maximum HR at a given workload).
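The selection described above amounts to picking, from (tempo, mean heart rate) pairs observed over steady-workload time periods, the tempo with the lowest heart rate (minimal exertion) and the one with the highest (maximal energy expenditure); the period values below are hypothetical:

```python
def optimal_tempos(periods):
    """Given (music_tempo, mean_heart_rate) pairs for successive
    steady-workload time periods, return
    (tempo_for_min_heart_rate, tempo_for_max_heart_rate)."""
    min_period = min(periods, key=lambda p: p[1])
    max_period = max(periods, key=lambda p: p[1])
    return min_period[0], max_period[0]

# hypothetical time periods A-E:
min_t, max_t = optimal_tempos([(100, 142), (110, 138), (120, 135),
                               (130, 141), (140, 149)])
```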
- FIG. 12 is a plot 500 of a response metric 504 and controlled metric 502 over time that illustrates a method of determining optimal cadence and music tempo for minimum cardiac exertion at a steady speed, according to some embodiments of the present invention.
- the controlled metric 502 is cadence, such as step rate, cycling cadence, exercise cadence, etc.
- the response metric 504 is heart rate (HR).
- a person's cadence 502 may subconsciously track with a harmonic of the tempo of a given musical audio heard by the person. In such case, the person's cadence is controlled by the music, and heart rate responds to this cadence.
- an optimal controlled metric associated with an optimal response metric, for a given activity state, such as workload can be learned.
- a person listens to musical audio while exercising at a steady speed.
- the musical audio changes tempo 506 while the user is exercising, and the person's cadence rate 502 naturally locks-in to a harmonic of the musical audio tempo 506 .
- the ideal cadence rate 502 , and associated musical audio tempo, for minimal cardiac exertion is identified as that associated with the minimum HR 504 during a steady cadence as shown in time period C.
- the ideal cadence rate 502 , and associated musical audio tempo 506 for maximal energy expenditure at the same running speed is identified as that associated with the maximum cardiac exertion (maximum HR 504 ), as shown in time period E.
- body metrics that may be modulated by musical audio and monitored to optimize the use of music with respect to exercise include, but are not limited to, body temperature, blood pressure, cardiac output, RRi, and ventilatory threshold (V T ).
- Body temperature can be measured via a body temperature sensor, or estimated via monitoring heart rate (HR) and/or breathing rate (BR) at a constant workload.
- Using musical audio to manipulate BR to reduce HR for a given speed can be used to push out V T in time (pushing it out to a higher heart rate), thereby pushing out the transition between aerobic and anaerobic exercise. This is illustrated in FIGS. 13 and 14 and described below.
- RRi may be measured by identifying heart beat peaks or frequencies in a PPG waveform (using time-domain or frequency domain analysis) and then reporting time periods between each heart beat.
- the user's time-between-heart-beats may lock-in to the time-between-music-beats such that one's RRi may be controlled by the music beat.
- FIG. 13 is a plot 600 of BR data 602 and HR data 604 collected over a period of time for a person and from which ventilatory threshold V T can be estimated, according to some embodiments of the present invention.
- Data was collected from a person wearing a monitoring device having PPG-based sensor technology during a VO 2 max test.
- the person was wearing an audio earbud 20 , but could have also been wearing a wrist band 30 (or armband, legband, ring, patch, etc.) containing a sensor module 24 , 24 ′.
- the monitoring device 20 sensed photoplethysmograms (via a PPG sensor) and motion (via a motion sensor) and processed the signals (via a processor) to attenuate motion noise and generate metrics for HR, BR, distance, speed, pace, cadence, VO 2 (oxygen volume consumption), blood pressure, RRi (R-R interval), and other biometrics.
- ventilatory threshold is the point at which ventilation begins increasing at a faster rate than VO 2 . From the time-dependent HR and BR data, one can estimate the ventilatory threshold (V T ) as the point at which BR increases rapidly (such as when the slope of BR vs. time increases). Greater confidence in the V T estimation can be derived by also noting the presence of a HR inflection (Ta) a few minutes ahead of V T (Tb).
- a person's workload may also be factored into an algorithm for determining V T .
- Where a wearable device 20 , 30 includes a motion sensor, such as an accelerometer, an inflection in the motion sensor signal (such as in its intensity or cadence, for example) may trigger the algorithm to begin looking for V T or may be factored into an algorithm for estimating V T .
- V T will only be observed at a workload high enough to sufficiently tax aerobic capacity, and thus knowledge of a person's workload and aerobic capacity (such as VO 2 max) may be used to determine regions in time wherein the person may experience V T .
- EE may increase linearly with increasing BR after V T is reached.
- FIG. 14 is a plot of BR 702 (without bio-tuned music), 704 (with bio-tuned music) over time for a person and illustrating the manipulation of V T via musical audio, according some embodiments of the present invention. Specifically, FIG. 14 illustrates how “bio-tuning musical audio” can push-out V T . For example, once an optimal musical tempo is identified for minimizing energy expenditure (EE), VO 2 , or HR for a given workload, the bio-tuned musical audio can be used to shift-out V T during a maximal exercise, such as for a VO 2 max test or for intense training. Thus, the person may be better able to endure greater workloads without reaching exhaustion as quickly.
- Referring to FIG. 15 , a method of determining V T in a person wearing a monitoring device 20 , 30 having a PPG sensor and motion sensor, according to some embodiments of the present invention, is illustrated.
- HR, BR and activity data (motion data) is sensed via a monitoring device 20 , 30 and stored (Block 800 ).
- Data plots of HR vs. time and BR vs. time are analyzed to identify points of rapid increase in HR at a steady workload (e.g., steady speed and/or cadence) (Block 802 ).
- V T is then identified as a point of rapid increase in BR that lags a point of rapid increase in HR (Block 804 ).
- an alternate method of determining the time that V T is reached may be to analyze at least a few seconds of data to see when the slope of BR vs. HR changes substantially.
- the BR does not change substantially with increasing HR until V T is reached, at which point the change in BR vs. HR increases substantially (i.e., the slope of BR vs. HR increases substantially).
- In this context, “substantially” may mean a change in slope of 5% or higher over a period of sixty (60) or more seconds.
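A minimal sketch of this slope-change criterion, assuming one BR/HR sample per second and using the illustrative 60-second window and 5% threshold (function and variable names are hypothetical):

```python
def slope(xs, ys):
    """Least-squares slope of ys vs. xs."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = sum((x - mx) ** 2 for x in xs)
    return num / den if den else 0.0

def detect_vt(t, hr, br, window_s=60, rel_change=0.05):
    """t, hr, br: equal-length lists sampled once per second.
    Returns the time at which the BR-vs.-HR slope grows by at least
    rel_change relative to the previous window, or None."""
    prev = None
    for start in range(0, len(t) - window_s + 1, window_s):
        s = slope(hr[start:start + window_s], br[start:start + window_s])
        if prev is not None and prev > 0 and (s - prev) / prev >= rel_change:
            return t[start]
        prev = s
    return None
```

A production implementation would also gate this on a steady workload, as the surrounding text requires.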
- Referring to FIG. 16 , another method of determining V T in a person wearing a monitoring device 20 , 30 having a PPG sensor and motion sensor, according to some embodiments of the present invention, is illustrated.
- HR and/or BR data and activity data (motion data) is sensed via a monitoring device 20 , 30 and stored (Block 810 ).
- Data plots of HR vs. time and/or BR vs. time are analyzed to identify points of rapid increase in HR and/or BR at a steady workload (e.g., steady speed and/or cadence) (Block 812 ).
- Activity level of the person is analyzed (Block 814 ) and a determination is made if V T is viable at one or more time periods where the person has a steady workload (Block 816 ).
- physiological models (such as theoretical or experiential models) may be used to determine the workloads at which V T can be reached.
- Such models may incorporate static characteristics about a person, such as age, height, and gender, as well as quasi-static characteristics, such as weight and cardiac efficiency.
- at least one such model can be used to determine whether V T is viable.
- the model may also include information about heart rate, such that V T is viable at only certain workloads and certain heart rates.
- If V T is determined to be viable, V T is identified as a point of rapid increase in HR or BR during the viable time period (Block 818 ). If not, V T is deemed not viable in the time period (Block 820 ).
- the method of FIG. 16 can be a “looped” process such that it is continually analyzed in time to mark periods of V T being viable or not viable.
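As a toy sketch of such a viability gate: V T is treated as observable only when the estimated workload taxes a sufficient fraction of aerobic capacity and, per the model discussion above, heart rate is high enough. Both thresholds and all names here are illustrative assumptions, not values from the disclosure.

```python
def vt_viable(workload_frac_vo2max, hr_bpm, min_frac=0.7, min_hr_bpm=140):
    """Return True when V_T could plausibly be observed: the current
    workload exceeds an assumed fraction of VO2max and HR is above an
    assumed floor."""
    return workload_frac_vo2max >= min_frac and hr_bpm >= min_hr_bpm
```

In the looped process described above, this gate would simply mark each analyzed time period as viable or not viable.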
- HR, BR and activity data is sensed via a monitoring device 20 , 30 and stored (Block 830 ).
- Data plots of HR vs. time, HRV vs. time, and/or BR vs. time are analyzed to identify points of rapid increase in HR, HRV, and/or BR at a steady workload (e.g., steady speed and/or cadence) (Block 832 ).
- Increases in body temperature are identified as significant changes in HR, HRV and/or BR at the steady workload (Block 834 ).
- HR and BR may significantly increase with increasing body temperature and HRV may have spectral (frequency-domain) components with spectral coefficients that either increase or decrease with body temperature. Since HR can be measured using data from a PPG sensor, the body temperature of the person can then be estimated using a calibration factor or mathematical relationship between the measured HR and body temperature for a given workload (Block 836 ).
- Body temperature may also be estimated using a calibration factor or mathematical relationship between BR and temperature for a given workload or a mathematical relationship between HRV and body temperature for a given workload.
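As a toy illustration of such a calibration: the linear form, the baseline values, and the per-user coefficient below are all assumptions for illustration, not values from the disclosure.

```python
def estimate_body_temp(hr_bpm, hr_baseline=60.0, temp_baseline_c=37.0,
                       deg_c_per_bpm=0.02):
    """Estimate body temperature at a fixed workload from the rise in HR
    above a personal baseline, using a hypothetical linear calibration."""
    return temp_baseline_c + deg_c_per_bpm * (hr_bpm - hr_baseline)
```

Analogous calibrations could map BR or HRV spectral coefficients to temperature, as described above.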
- it may be beneficial to estimate workload using data from the inertial sensor(s) in the wearable device worn by the user.
- a variety of methods for estimating workload using wearable inertial sensors are well known to those skilled in the art.
- the present invention may be used to help guide a user to controlled breathing, such that a physiological assessment may be generated for the user based on biometric sensor data collected during the guided controlled breathing, as shown in FIG. 18 .
- a user wearing a biometric sensor such as the wearable sensors described in FIGS. 1-5 , or a user utilizing a sensor system, such as that presented in FIGS. 6-7 , may be audibly, and perhaps also visually, guided into a state of controlled breathing (Block 900 ).
- an animated character may be presented on a view-screen to demonstrate inhaling and exhaling, such that the user adapts to the breathing rate of the character on the view screen.
- the user may be presented with audible instructions for inhaling and exhaling, such as being presented with breathing sounds for inhaling and exhaling or with music that follows inhalation and exhalation, at the frequency of the targeted breathing rate or breathing volume.
- the audio and visual feedback may be provided by the wearable device itself or another part of the system of FIG. 6 or FIG. 7 , such as a phone, computer, or the like.
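The guided-breathing cue timing described above can be sketched as a simple schedule generator; the even inhale/exhale split and all names are illustrative assumptions.

```python
def cue_schedule(rate_bpm, duration_s):
    """Generate (time, cue) pairs for audible/visual breathing guidance at
    rate_bpm breaths per minute, splitting each cycle evenly between an
    inhale cue and an exhale cue."""
    cycle = 60.0 / rate_bpm
    cues, t = [], 0.0
    while t < duration_s:
        cues.append((t, "inhale"))
        cues.append((t + cycle / 2, "exhale"))
        t += cycle
    return cues
```

For example, at 6 breaths per minute each cycle lasts 10 seconds, so cues fire at 0 s (inhale), 5 s (exhale), 10 s (inhale), and so on.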
- a processor may process biometric sensor data to generate a physiological assessment for the user (Block 902 ).
- physiological assessments are presented in FIGS. 22-24 .
- Such physiological assessments may comprise an assessment of one's: health status, physical fitness, stress status (physical and/or mental), a biometric (such as blood pressure, cardiac functioning, blood oxygenation, or the like), or the like.
- the assessment(s) may then be presented audibly, and perhaps also visually, to the user (Block 904 ).
- a key benefit of generating a physiological assessment during controlled breathing is that recurring measurements of the physiological assessment may be more useful for long-term trending, as one's metabolic activity may be more normalized for each measurement.
- generating a daily data point for a physiological assessment, such as blood pressure, during the same time of day over the course of several months can be useful for monitoring significant deviations in one's average blood pressure to decide if a medical intervention is warranted.
- one assumption is that measuring at the same time of day may help reduce artifacts associated with one's physical activity or metabolic activity, as one's physical activity and metabolic activity may be generally consistent at the same time of day.
- one's activity level and metabolic level may not always be the same during the same time of day, and thus guiding a user to controlled breathing may help normalize the user's physical activity and/or metabolic activity during regular measurements of blood pressure, such that long-term trending of blood pressure is more consistent, yielding more accurate determinations as to whether or not a medical intervention is warranted in response to the detection of significant changes in blood pressure compared with average blood pressure readings.
- the method of FIG. 18 may be incorporated into a game having a goal.
- generating an important physiological assessment can also be entertaining.
- a gaming character may be able to unlock and utilize certain skills, weapons, or powers once a state of controlled breathing or relaxation is detected. Once the gaming goals are achieved, a physiological assessment may be presented to the user.
- the method of FIG. 18 may be utilized to generate a physiological assessment of one's stress sensitivity.
- a physiological assessment may be generated before and after one has reached controlled breathing.
- the processor may then compare the before and after readings of this physiological assessment to determine if these readings are substantially different for uncontrolled vs. controlled breathing.
- the determination of a substantial difference may yield an assessment regarding one's stress sensitivity for presentation to the user, and this assessment may have therapeutic implications.
- If one's blood pressure readings are determined to be satisfactory (i.e., “normal”) for controlled breathing and poor (i.e., “too low” or “too high”) for uncontrolled breathing, one's stress sensitivity may be determined to be “high”.
- a high stress sensitivity may then be presented to the user (or the user's trainer, physician, caretaker, or the like), implying that stress-reduction therapy may be effective in helping the user maintain a satisfactory blood pressure.
- one's stress sensitivity assessment may be determined to be “low”, such that stress-reduction therapy may not be particularly useful for the user.
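As a toy sketch of this before/after comparison: the systolic blood-pressure range bounds and function name are hypothetical, and this is illustrative logic only, not clinical guidance.

```python
def stress_sensitivity(bp_uncontrolled, bp_controlled, lo=90, hi=130):
    """Classify stress sensitivity as 'high' when a reading (e.g., systolic
    blood pressure) is in range during controlled breathing but out of
    range during uncontrolled breathing; otherwise 'low'."""
    ok = lambda bp: lo <= bp <= hi
    if ok(bp_controlled) and not ok(bp_uncontrolled):
        return "high"
    return "low"
```

A reading that is poor in both states would, under this sketch, yield "low" sensitivity, suggesting stress reduction alone may not normalize the metric.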
- Some individuals may be more likely to benefit physiologically from controlled breathing and similar stress-reduction methodologies than others.
- non-pharmacological therapies may be more desirable for improving health than drugs, which may be associated with undesired side-effects.
- A key benefit of this invention is that enabling a physiological assessment before and after controlled breathing provides a measurable way of determining whether stress-reduction methodologies would be useful for improving an individual's health.
- FIG. 19 illustrates a method of generating assessments by guided controlled breathing while monitoring RRi and physical activity via a wearable sensor capable of measuring both RRi information and physical activity information.
- the user may be wearing a wearable sensor device (e.g., monitoring device 20 , 30 ) comprising a PPG sensor or another sensor for measuring RRi (i.e., an ECG sensor, an auscultatory sensor, a piezoelectric sensor, a ballistogram sensor, a bioimpedance sensor, or the like) and physical activity information, wherein the device is in communication with a view-screen and/or audio device such that the user may be guided towards controlled breathing visually and/or audibly, as described above (Block 910 ).
- this collected sensor information (Block 912 ) may be processed into RRi information and/or physical activity information.
- the RRi information may then be processed by a processor to determine if the user is in a state of controlled breathing or a state of uncontrolled breathing (Block 914 ).
- a processor may process the data to generate a physiological assessment for the user, as described above (Block 916 ).
- the assessment(s) may then be presented to the user, or someone monitoring the user, visually and/or audibly (Block 918 ).
- FIGS. 22A-22B and FIG. 23 present particular examples of providing a health assessment with diagnostic value to an end user based on the method of FIG. 19 .
- Referring to FIG. 20 , a specific method of processing RRi and physical activity information to determine whether the user is in a state of controlled breathing or uncontrolled breathing is presented. As described earlier, this method can be used within the assessment generation method presented in FIG. 19 .
- a physiological waveform (PPG, ECG, bioimpedance, auscultatory, or the like) is processed to generate RRi information for the user.
- a physical activity waveform is processed to generate physical activity information (cadence, speed, acceleration, position, exercise intensity, motion intensity, motion or rest duration, and the like) for the user (Block 922 ).
- the RRi information is then processed to identify peaks in the RRi-vs.-time (Block 924 ), and the physical activity information is processed to identify an activity state of the user (Block 926 ).
- the RRi peak information and activity state information can then be processed to determine if the user is (or is not) in a state of controlled breathing (Block 928 ).
- the RRi peaks during controlled breathing may have a period and frequency within a range that is characteristic of controlled breathing, due to respiratory sinus arrhythmia. Knowledge about the activity state of the user may be applied toward ascertaining whether the person is at a state of relative “rest” (low activity) suitable for controlled breathing.
- This information may be processed together (Block 928 ) to assess whether a subject is truly in a state of controlled breathing.
- FIG. 21 is a plot 1000 of RRi vs. time collected by a user wearing a PPG sensor module at the wrist, wherein the PPG sensor module further comprises an inertial sensor (in this case an accelerometer). The user started the test with uncontrolled breathing, and at ⁇ 360 seconds the user was instructed to start controlled breathing at 6 breaths a minute (0.1 Hz).
- the RRi-vs.-time information was generated in “real-time” by processing the raw PPG signal to determine the peak-to-peak (or valley-to-valley) points in time, in a pulse-picking fashion, as RRi-vs.-time may develop a pronounced periodic characteristic during controlled breathing.
- the time between pulses was processed as an RR-interval (RRi) and thus a stream of successive RR-intervals was processed in time and smoothed, using a smoothing algorithm (in this case a moving average filter) to help remove unwanted noise artifacts (such as motion-noise, environmental-noise, and electrical-noise), generating a smoothed RRi waveform.
- the derivative of the smoothed RRi waveform was then calculated to generate an RRi-derivative waveform, and positive-to-negative zero-crossings of the derivative waveform were recorded to indicate peaks in the RRi waveform in time.
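The smoothing-plus-derivative peak-picking chain just described can be sketched as follows; the moving-average window size and the synthetic test signal are illustrative assumptions.

```python
def moving_average(x, w=5):
    """Centered moving average with truncated windows at the edges."""
    half = w // 2
    return [sum(x[max(0, i - half):i + half + 1]) /
            len(x[max(0, i - half):i + half + 1]) for i in range(len(x))]

def rri_peaks(t, rri, w=5):
    """t: sample times (s); rri: RR intervals (s) sampled at those times.
    Smooths the RRi stream, differentiates it, and reports the times of
    positive-to-negative zero-crossings of the derivative (RRi peaks)."""
    sm = moving_average(rri, w)
    d = [sm[i + 1] - sm[i] for i in range(len(sm) - 1)]  # discrete derivative
    peaks = []
    for i in range(1, len(d)):
        if d[i - 1] > 0 and d[i] <= 0:  # derivative crosses zero downward
            peaks.append(t[i])
    return peaks
```

During controlled breathing at 0.1 Hz, such peaks should recur roughly every 10 seconds, which is what the downstream detection algorithm checks.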
- For controlled breathing at 6 breaths per minute, the periodic time between RRi peaks should be close to 10 seconds; for uncontrolled breathing, the time between peaks is likely to be much smaller and/or aperiodic (i.e., not periodic).
- the time-between-peaks information was fed to a controlled breathing detection algorithm, where the signal output of the detection algorithm was incremented when the time between peaks was between 7 and 13 seconds and decremented otherwise. There was also a maximum value by which the detection signal output was capped and a minimum value by which the detection signal output was floored.
- the algorithm may have additional intelligence to change the “time between peaks” depending on the guided breathing rate. For example, if the guided controlled breathing is selected at 4 breaths per minute, then the signal output of the detection algorithm may be incremented when the time between peaks is between ⁇ 14 and ⁇ 17 seconds (as there are 15 seconds for each full breath in such case). Additionally, the breathing rate may be autonomously detected via a breathing rate detection algorithm (such as that described and referenced earlier) and then the “time between peaks” may be autonomously adjusted according to the detected breathing rate.
- the controlled breathing detection algorithm output for this dataset is presented in FIG. 21 , showing maximum values in the time periods between ⁇ 380 and ⁇ 680 seconds.
- the algorithm of FIG. 20 was tuned to report controlled breathing (to report a controlled breathing flag) when a predefined percentage of this maximum value was reached, in this case 70% of maximum value, reporting the onset of controlled breathing at 375 seconds.
- the report of controlled breathing was continued (the flag was kept high) unless more than 20 seconds passed with the detection signal output below a predefined percentage of the maximum value (in this case 75% of maximum).
- controlled breathing was reported from ⁇ 375 seconds to ⁇ 680 seconds, as shown in FIG. 21 .
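The detection logic described above (an increment/decrement counter with a cap and floor, a flag raised at a percentage of the cap, and a dropout hold) can be sketched as follows. For simplicity this sketch uses a single 70% threshold for both onset and hold, the peak-spacing window and constants mirror the 6-breaths-per-minute example, and all names are hypothetical.

```python
def breathing_flags(peak_times, lo=7.0, hi=13.0, cap=10, onset_frac=0.7,
                    dropout_s=20.0):
    """peak_times: sorted times (s) of detected RRi peaks.
    Returns a list of (time, flag) pairs, one per peak after the first,
    where flag indicates detected controlled breathing."""
    out, score, flag, below_since = [], 0, False, None
    for prev, cur in zip(peak_times, peak_times[1:]):
        gap = cur - prev
        # Increment when the peak spacing is in the expected window,
        # decrement otherwise; cap and floor the score.
        score = min(cap, score + 1) if lo <= gap <= hi else max(0, score - 1)
        if score >= onset_frac * cap:
            flag, below_since = True, None
        elif flag:
            if below_since is None:
                below_since = cur          # start of below-threshold run
            if cur - below_since > dropout_s:
                flag = False               # hold expired; drop the flag
        out.append((cur, flag))
    return out
```

For a guided rate other than 6 breaths per minute, `lo` and `hi` would be adjusted around the expected breath period, as the text describes.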
- the accelerometer readings from the PPG sensor module were monitored to determine if the activity level of the person was too high to trust the RRi readings. In such case, a flag would be generated to enable a processor to discount erroneous RRi data. In this particular dataset, the flag was always “0”, as there was no substantial physical activity detected.
- the identification of low activity was used to determine that the subject was truly in a state of rest suitable for enabling the subject to enter a state of controlled breathing.
- the determination that a subject is in a state of controlled breathing, leveraging biometric and activity sensing during the duration of the breathing session, may comprise the combination of: a) determining that the user's RRi-vs.-time plot is periodic in a manner that is consistent with controlled breathing (as described earlier), and b) determining that the person's activity state is at relative rest by sensing relatively low levels of motion (such as low accelerometry counts) and/or sensing that the user is at a seated or supine position (such as via body position sensing).
- the RRi data shown in FIG. 21 can be further processed to generate a physiological assessment for the user.
- Numerous types of physiological assessments may be generated by processing RRi+physical activity information collected over a period of time, such as cardiovascular assessments, cardiac assessments, stress assessments, or the like. More specific examples of such assessments may comprise the identification of: arrhythmia, atrial fibrillation, fatigue, VO 2 max, lactic threshold, or the like.
- Specific examples of generating a physiological assessment based on RRi data are presented in FIGS. 22A, 22B and 23 .
- FIG. 22A shows a Poincaré plot 1100 (a type of recurrence plot) comprising successive RR-intervals of FIG. 21 plotted against each other.
- Ellipses can be fit upon the data points for controlled and uncontrolled breathing 1102 , 1104 , as shown in FIG. 22A and emphasized in the break-out diagram of FIG. 22B .
- the ellipses may be defined by characteristic standard deviations, SD1 and SD2, along the minor and major axes of the ellipses for controlled breathing (“CB”) and uncontrolled breathing (“UCB”).
- SD2 may be more closely related with long-term variability in the RR-intervals and may reflect primarily sympathetic activity of the autonomic nervous system (the magnitude of SD2 is inversely related to sympathetic activity level), whereas SD1 may be more closely related with the short-term variability in the RR-intervals and reflects primarily parasympathetic activity of the autonomic nervous system (the magnitude of SD1 is directly related to parasympathetic activity level).
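The SD1/SD2 descriptors above follow the conventional Poincaré formulation and can be computed directly from successive RR-interval pairs. The sketch below uses the standard formulas, not code from the disclosure.

```python
import math

def poincare_sd(rri):
    """rri: list of RR intervals (s). Returns (SD1, SD2), the standard
    deviations along the minor and major axes of the Poincare ellipse,
    computed from successive RR-interval pairs (RR_n, RR_n+1)."""
    x, y = rri[:-1], rri[1:]

    def pstd(v):  # population standard deviation
        m = sum(v) / len(v)
        return math.sqrt(sum((e - m) ** 2 for e in v) / len(v))

    sd1 = pstd([(a - b) / math.sqrt(2) for a, b in zip(x, y)])  # short-term
    sd2 = pstd([(a + b) / math.sqrt(2) for a, b in zip(x, y)])  # long-term
    return sd1, sd2
```

Computing these separately for the controlled and uncontrolled segments of FIG. 21 yields the SD1 CB /SD1 UCB and SD2 CB /SD2 UCB ratios discussed below.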
- the physiological assessment generated for the user may be that the user was originally in a higher stressed state (during uncontrolled breathing) that was corrected or at least ameliorated by a session of controlled breathing. More generally, physiological assessments may be generated for the user by processing the controlled breathing statistical parameters in comparison to the uncontrolled breathing statistical parameters, thereby generating physiological assessment parameters that may be processed via algorithms to generate physiological assessments.
- a key benefit of factoring both controlled and uncontrolled breathing statistical parameters into the generation of physiological assessments is that the assessments may then be less dependent on external, non-health-related variables that may also affect SD1 and SD2. For example, a user who is fatigued from excessive exercise may have a lower SD1 and SD2 on the 3rd day than the 1st day due to a fatigue- or recovery-induced drop in HRV following excessive exercise on the 2nd day. However, in this case, the SD1 CB /SD1 UCB or SD2 CB /SD2 UCB ratio may not change as long as the health condition has not changed.
- the ratio SD1 CB /SD1 UCB or SD2 CB /SD2 UCB may be effectively normalized such that the influence of day-to-day, non-health-related, variability on physiological assessments can be lessened. In this manner, true health conditions can be exposed as the user moves from uncontrolled to controlled breathing.
- Some non-limiting examples of such physiological assessment parameters and potential assessments are summarized in the table 1300 of FIG. 24 .
- the term “average”, as used in the table 1300 , generally refers to an average value for a group of users having demographics similar to those of the user under test.
- Referring to FIG. 23 , the visual display 1200 may summarize the user's Poincare plot, common shapes of Poincare plots (referred to as “Poincare shapes”), and a description of what the user's Poincare shape means.
- the identification of the user's Poincare shape on the plot 1200 may be achieved by implementing a pattern recognition algorithm, a statistical analysis algorithm, or the like as processed by a processor, such as a processor 40 , 40 ′ in the system of FIG. 6 or FIG. 7 .
- a comet shape 1202 may be indicative of good health
- a torpedo shape 1204 may be indicative of poor cardiovascular health or a particular disease condition
- a fan shape 1206 may be indicative of atrial fibrillation, arrhythmia, or another cardiac issue.
- the user's shape may be unknown (represented in FIG. 23 by 1208 ) to the identification algorithm, and in such case the system of FIG. 6 or FIG. 7 may initiate a search through a plurality of different users' data to find similar plots via a correlational algorithm running on a processor.
- commonalities between meta data and/or associated health data of the users may be identified by a processor to help diagnose the particular user of interest.
- For example, a notice may be given (i.e., visually and/or audibly) to the user of interest, or to someone monitoring the user of interest, that the user may be at risk of being diabetic.
- Example embodiments are described herein with reference to block diagrams and flowchart illustrations. It is understood that a block of the block diagrams and flowchart illustrations, and combinations of blocks in the block diagrams and flowchart illustrations, can be implemented by computer program instructions that are performed by one or more computer circuits.
- These computer program instructions may be provided to a processor circuit of a general purpose computer circuit, special purpose computer circuit, and/or other programmable data processing circuit to produce a machine, such that the instructions, which execute via the processor of the computer and/or other programmable data processing apparatus, transform and control transistors, values stored in memory locations, and other hardware components within such circuitry to implement the functions/acts specified in the block diagrams and flowchart block or blocks, and thereby create means (functionality) and/or structure for implementing the functions/acts specified in the block diagrams and flowchart blocks.
- These computer program instructions may also be stored in a tangible computer-readable medium that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable medium produce an article of manufacture including instructions which implement the functions/acts specified in the block diagrams and flowchart blocks.
- a tangible, non-transitory computer-readable medium may include an electronic, magnetic, optical, electromagnetic, or semiconductor data storage system, apparatus, or device. More specific examples of the computer-readable medium would include the following: a portable computer diskette, a random access memory (RAM) circuit, a read-only memory (ROM) circuit, an erasable programmable read-only memory (EPROM or Flash memory) circuit, a portable compact disc read-only memory (CD-ROM), and a portable digital video disc read-only memory (DVD/Blu-ray).
- the computer program instructions may also be loaded onto a computer and/or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer and/or other programmable apparatus to produce a computer-implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions/acts specified in the block diagrams and flowchart blocks. Accordingly, embodiments of the present invention may be embodied in hardware and/or in software (including firmware, resident software, micro-code, etc.) that runs on a processor such as a digital signal processor, which may collectively be referred to as “circuitry,” “a module” or variants thereof.
Description
- This application is a continuation application of pending U.S. patent application Ser. No. 15/744,642, filed Jan. 12, 2018, which is a 35 U.S.C. § 371 national stage application of PCT Application No. PCT/US2016/041842, filed Jul. 12, 2016, which itself claims the benefit of and priority to U.S. Provisional Patent Application No. 62/192,683 filed Jul. 15, 2015, and U.S. Provisional Patent Application No. 62/274,463 filed Jan. 4, 2016, the disclosures of all of which are incorporated herein by reference as if set forth in their entireties. The above-referenced PCT International Application was published in the English language as International Publication No. WO 2017/011431 A2 on Jan. 19, 2017.
- The present invention relates generally to monitoring devices and, more particularly, to monitoring devices for measuring physiological information.
- Wearable devices capable of monitoring physiological information, such as heart rate, are increasingly being used. These devices come in various form factors, including devices configured to be worn at the ear or at other locations of the body. U.S. Pat. Nos. 8,652,040, 8,700,111, 8,647,270, 8,788,002, 8,886,269, and 8,929,965, which are incorporated herein by reference in their entireties, describe various wearable devices configured to monitor physiological information, including headsets, earbuds, and wrist bands. Physiological information obtained from a subject can be used to generate various types of health and fitness assessments of the subject. For example, using a photoplethysmography (PPG) sensor incorporated into a wearable monitoring device, blood flow information can be measured during daily activities of a subject and this information can be used to generate assessments, such as maximum oxygen consumption VO2max, total energy expenditure (TEE), etc.
- It should be appreciated that this Summary is provided to introduce a selection of concepts in a simplified form, the concepts being further described below in the Detailed Description. This Summary is not intended to identify key features or essential features of this disclosure, nor is it intended to limit the scope of the invention.
- Embodiments of the present invention can facilitate identifying musical audio that can improve a person's exercise training, and can facilitate identifying the biometric parameters best suited to optimizing that training. For example, embodiments of the present invention can be used to study a person's biometric responses while the person exercises and listens to music, to learn how music tempo relates to a controllable biometric parameter, and then to learn how directly controlling that biometric parameter can serve as a means of indirectly controlling another biometric parameter. Moreover, embodiments of the present invention can be used to help a person learn how to minimize heart rate (HR) for a given workload and thus improve endurance during exercise (e.g., running, cycling, swimming, etc.). Alternatively, the person can learn how to maximize HR for a given workload and thus increase energy expenditure during exercise.
- According to some embodiments of the present invention, a method of controlling a biometric parameter of a subject engaged in an activity includes sensing the biometric parameter via a monitoring device worn by the subject, determining frequency characteristics of the biometric parameter, and presenting to the subject musical audio having a tempo correlated to the frequency characteristics of the biometric parameter. In some embodiments, the biometric parameter is breathing rate, and musical audio having a tempo correlated to frequency characteristics of the breathing rate is presented to the subject. In some embodiments, the biometric parameter is heart rate, and musical audio having a tempo correlated to frequency characteristics of the heart rate is presented to the subject.
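As a rough illustration of matching a music tempo to the frequency characteristics of a sensed biometric parameter, the sketch below picks a track whose tempo lies close to an integer multiple of a measured breathing rate. This is a minimal sketch, not the claimed method: the library format, the candidate multiples, and the tolerance are all illustrative assumptions.

```python
def pick_track(breaths_per_min, library, tolerance=5.0):
    """Return the (title, bpm) track whose tempo best matches an
    integer multiple of the sensed breathing rate (hypothetical helper)."""
    best, best_err = None, float("inf")
    for title, bpm in library:
        for multiple in (2, 3, 4, 6):  # candidate tempo-to-breath ratios
            err = abs(bpm - breaths_per_min * multiple)
            if err <= tolerance and err < best_err:
                best, best_err = (title, bpm), err
    return best

library = [("Song A", 60), ("Song B", 120), ("Song C", 168)]
print(pick_track(15.0, library))  # 15 breaths/min x 4 = 60 BPM -> ("Song A", 60)
```

A real implementation would draw tempo metadata from the user's music service and update the selection as the sensed rate drifts.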
- In some embodiments, the tempo of musical audio presented to the subject can be changed in order to cause a change in the biometric parameter. For example, if the biometric parameter is breathing rate, the tempo of the musical audio can be changed to cause a change in the breathing rate of the subject. If the biometric parameter is heart rate, the tempo of the musical audio can be changed to cause a change in the heart rate of the subject.
- In some embodiments, the monitoring device is configured to be positioned at or within an ear of the subject, and in other embodiments, the monitoring device is configured to be secured to an appendage of the subject or at a different location of the body of the subject. In some embodiments, the monitoring device is integrated within or otherwise associated with clothing worn by the subject.
- According to other embodiments of the present invention, a method of controlling a biometric parameter of a subject engaged in an activity includes sensing the biometric parameter via a monitoring device worn by the subject (e.g., a device positioned at or within an ear of the subject, secured to an appendage of the subject, integrated within or otherwise associated with clothing worn by the subject, etc.) as musical audio is presented to the subject. Characteristics of the musical audio are analyzed in context with frequency characteristics of the biometric parameter. One or more correlations between the musical audio characteristics and the frequency characteristics of the biometric parameter are identified, and then additional musical audio is selected for subsequent presentation to the subject based on the one or more correlations. Selecting additional musical audio may include selecting musical audio having a tempo correlated to the frequency characteristics of the biometric parameter. For example, if the biometric parameter is breathing rate, additional musical audio having a tempo correlated to frequency characteristics of the breathing rate is selected and presented. If the biometric parameter is heart rate, additional musical audio having a tempo correlated to frequency characteristics of the heart rate is selected and presented.
- In some embodiments, the tempo of the additional musical audio presented to the subject can be changed in order to cause a change in the biometric parameter.
- In some embodiments, selecting additional musical audio for presentation to the subject based on the one or more correlations includes selecting a playlist of additional musical audio.
- According to other embodiments of the present invention, a method of presenting musical audio to a subject engaged in an activity includes sensing physiological information from the subject via a monitoring device worn by the subject (e.g., a device positioned at or within an ear of the subject, secured to an appendage of the subject, integrated within or otherwise associated with clothing worn by the subject, etc.), analyzing the physiological information to identify a natural body frequency of the subject, and then presenting musical audio to the subject that is in resonance with the natural body frequency and/or in resonance with a harmonic of the natural body frequency. In some embodiments, the physiological information includes breathing rate, RRi (R-R interval) and/or heart rate.
- According to other embodiments of the present invention, a method of modulating heart rate of a subject engaged in an activity includes sensing a breathing rate of the subject via a monitoring device worn by the subject (e.g., a device positioned at or within an ear of the subject, secured to an appendage of the subject, integrated within or otherwise associated with clothing worn by the subject, etc.), and then presenting to the subject musical audio having a tempo selected to change the breathing rate by an amount sufficient to cause a change in the heart rate by a desired amount. In some embodiments, presenting to the subject musical audio having a tempo selected to change the breathing rate by an amount sufficient to cause a change in the heart rate may include presenting musical audio having a tempo selected to increase the breathing rate by an amount sufficient to cause an increase in the heart rate. Similarly, presenting to the subject musical audio having a tempo selected to change the breathing rate by an amount sufficient to cause a change in the heart rate may include presenting musical audio having a tempo selected to decrease the breathing rate by an amount sufficient to cause a decrease in the heart rate.
- According to other embodiments of the present invention, a method of determining a ventilatory threshold of a subject engaged in an activity includes sensing heart rate information, breathing rate information, and motion information from the subject via a monitoring device worn by the subject (e.g., a device positioned at or within an ear of the subject, secured to an appendage of the subject, integrated within or otherwise associated with clothing worn by the subject, etc.). The heart rate and breathing rate information is analyzed to identify one or more points in time where the subject's heart rate increased at a steady subject workload. Ventilatory threshold is then identified as occurring at a point in time where a rapid increase in breathing rate lagged a rapid increase in heart rate.
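The lag relationship described above can be sketched as a simple scan over sampled heart-rate and breathing-rate series: find a rapid breathing-rate rise that shortly follows a rapid heart-rate rise at steady workload. The jump thresholds, lag window, and one-sample-per-minute timing below are illustrative assumptions, not values taken from the disclosure.

```python
def find_vt(times, hr, br, hr_jump=5.0, br_jump=4.0, max_lag=3):
    """Return the time at which a rapid breathing-rate rise lags a
    recent rapid heart-rate rise (steady workload assumed), or None."""
    hr_rises = [i for i in range(1, len(hr)) if hr[i] - hr[i - 1] >= hr_jump]
    for i in range(1, len(br)):
        if br[i] - br[i - 1] >= br_jump:  # rapid breathing-rate rise...
            if any(0 < i - j <= max_lag for j in hr_rises):  # ...lagging an HR rise
                return times[i]
    return None

times = [0, 1, 2, 3, 4, 5]            # minutes into the activity
hr = [120, 126, 132, 133, 134, 135]   # rapid HR rises at t=1 and t=2
br = [20, 21, 22, 27, 28, 29]         # rapid BR rise at t=3
print(find_vt(times, hr, br))  # -> 3
```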
- According to other embodiments of the present invention, a method of determining body temperature of a subject engaged in an activity includes sensing heart rate information and motion information from the subject via a monitoring device worn by the subject (e.g., a device positioned at or within an ear of the subject, or secured to an appendage of the subject, etc.). The heart rate information is analyzed to identify one or more points in time where heart rate changed at a steady subject workload. The body temperature of the subject is then determined at each of the one or more points in time based on an amount of change in heart rate at each respective point in time.
- According to other embodiments of the present invention, a method of controlling a physical activity parameter of a subject engaged in an activity includes sensing the physical activity parameter and a periodic biometric parameter via at least one monitoring device worn by the subject. Frequency characteristics of the activity parameter and biometric parameter are determined and audio feedback is provided to the subject to encourage the subject to maintain a frequency of the physical activity parameter such that the physical activity parameter and biometric parameter share a common fundamental frequency. In some embodiments, the physical activity parameter includes an exercise cadence and the biometric parameter includes heart rate and/or breathing rate.
- According to other embodiments of the present invention, a method of generating a physiological assessment of a subject includes guiding the subject into a state of controlled breathing, sensing physical activity information via at least one monitoring device worn by the subject, sensing biometric information via the at least one monitoring device, processing the physical activity information and biometric information to generate a physiological assessment of the subject, and providing feedback to the subject related to the physiological assessment. The at least one monitoring device may include a PPG sensor, an ECG sensor, an auscultatory sensor, a piezoelectric sensor, a ballistogram sensor, or a bioimpedance sensor. A physiological assessment may include subject health status, subject physical fitness, subject physical stress status, and/or subject mental stress status.
- Guiding the subject into a state of controlled breathing may include providing the subject with audible and/or visual instructions. In some embodiments, audible and/or visual instructions may be provided via a human being. In other embodiments, audible and/or visual instructions may be provided via an electronic device, such as a cell phone, television, computer, etc. Providing feedback to the subject related to the physiological assessment may include providing the subject with audible and/or visual feedback.
- In some embodiments, sensing biometric information includes monitoring R-R interval in an electrocardiogram or photoplethysmogram of the subject. In some embodiments, sensing physical activity information includes sensing subject distance traveled, subject speed, subject acceleration, subject cadence, subject pace, or subject gait.
- It is noted that aspects of the invention described with respect to one embodiment may be incorporated in a different embodiment although not specifically described relative thereto. That is, all embodiments and/or features of any embodiment can be combined in any way and/or combination. Applicant reserves the right to change any originally filed claim or file any new claim accordingly, including the right to be able to amend any originally filed claim to depend from and/or incorporate any feature of any other claim although not originally claimed in that manner. These and other objects and/or aspects of the present invention are explained in detail below.
- The accompanying drawings, which form a part of the specification, illustrate various embodiments of the present invention. The drawings and description together serve to fully explain embodiments of the present invention.
- FIG. 1 illustrates an audio earbud capable of sensing physiological information from a person wearing the earbud.
- FIG. 2 illustrates a wrist band positioned around a wrist of a person (which may also be worn at another limb or a digit) and that includes a sensor module configured to sense physiological information from the person.
- FIG. 3 illustrates an audio earbud capable of sensing physiological information from a person wearing the earbud.
- FIG. 4A is a perspective view of the wrist band of FIG. 2.
- FIG. 4B is a cross-sectional view of the wrist band of FIG. 4A illustrating a sensor module on an inside surface of the wrist band.
- FIG. 5 is a block diagram of a system for enhancing the biometric performance of a person via musical audio, according to some embodiments of the present invention.
- FIG. 6 is a block diagram of a system for enhancing the biometric performance of a person via musical audio using both acute and chronic feedback, according to some embodiments of the present invention.
- FIG. 7 is a block diagram of a system for enhancing the biometric performance of a person via musical audio using both acute and chronic feedback, wherein the sensor data includes PPG data, inertial data, and audio data, according to some embodiments of the present invention.
- FIGS. 8-10 are flowcharts of operations for enhancing the biometric performance of a person via musical audio, according to some embodiments of the present invention.
- FIG. 11 is a plot illustrating a method of determining optimal breathing rate and music tempo for minimum or maximum cardiac exertion at a steady cadence, according to some embodiments of the present invention.
- FIG. 12 is a plot illustrating a method of determining optimal cadence and music tempo for minimum or maximum cardiac exertion at a steady speed, according to some embodiments of the present invention.
- FIG. 13 is a plot of breathing rate and heart rate data collected over a period of time for a person and from which the ventilatory threshold (VT) can be estimated, according to some embodiments of the present invention.
- FIG. 14 is a plot of breathing rate over time for a person, illustrating the manipulation of VT via musical audio, according to some embodiments of the present invention.
- FIGS. 15-17 are flowcharts of operations for enhancing the biometric performance of a person via audio, according to some embodiments of the present invention.
- FIGS. 18-19 are flowcharts of operations for guiding a subject to control breathing such that a physiological assessment may be generated for the subject based on biometric sensor data collected during the guided controlled breathing, according to some embodiments of the present invention.
- FIG. 20 is a flowchart of operations for determining if a subject is in a state of controlled breathing or uncontrolled breathing, according to some embodiments of the present invention.
- FIG. 21 is a plot of RRi vs. time collected by a user wearing a PPG sensor module having an inertial sensor, according to some embodiments of the present invention.
- FIG. 22A is a Poincaré plot showing successive R-R intervals from FIG. 21 plotted against each other.
- FIG. 22B illustrates ellipses that fit upon data points for controlled and uncontrolled breathing in FIG. 22A.
- FIG. 23 illustrates a physiological assessment presentation, according to some embodiments of the present invention.
- FIG. 24 is a table of physiological assessment parameters and potential assessments, according to some embodiments of the present invention.
- The present invention will now be described more fully hereinafter with reference to the accompanying figures, in which embodiments of the invention are shown. This invention may, however, be embodied in many different forms and should not be construed as limited to the embodiments set forth herein. Like numbers refer to like elements throughout. In the figures, certain components or features may be exaggerated for clarity. In addition, the sequence of operations (or steps) is not limited to the order presented in the figures and/or claims unless specifically indicated otherwise. Features described with respect to one figure or embodiment can be associated with another embodiment or figure although not specifically described or shown as such.
- It will be understood that when a feature or element is referred to as being “on” another feature or element, it can be directly on the other feature or element or intervening features and/or elements may also be present. In contrast, when a feature or element is referred to as being “directly on” another feature or element, there are no intervening features or elements present. It will also be understood that, when a feature or element is referred to as being “secured”, “connected”, “attached” or “coupled” to another feature or element, it can be directly secured, directly connected, attached or coupled to the other feature or element or intervening features or elements may be present. In contrast, when a feature or element is referred to as being “directly secured”, “directly connected”, “directly attached” or “directly coupled” to another feature or element, there are no intervening features or elements present. Although described or shown with respect to one embodiment, the features and elements so described or shown can apply to other embodiments.
- The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the invention. As used herein, the singular forms “a”, “an” and “the” are intended to include the plural forms as well, unless the context clearly indicates otherwise.
- As used herein, the terms “comprise”, “comprising”, “comprises”, “include”, “including”, “includes”, “have”, “has”, “having”, or variants thereof are open-ended, and include one or more stated features, integers, elements, steps, components or functions but do not preclude the presence or addition of one or more other features, integers, elements, steps, components, functions or groups thereof. Furthermore, as used herein, the common abbreviation “e.g.”, which derives from the Latin phrase “exempli gratia,” may be used to introduce or specify a general example or examples of a previously mentioned item, and is not intended to be limiting of such item. The common abbreviation “i.e.”, which derives from the Latin phrase “id est,” may be used to specify a particular item from a more general recitation.
- As used herein, the term “and/or” includes any and all combinations of one or more of the associated listed items and may be abbreviated as “/”.
- As used herein, phrases such as “between X and Y” and “between about X and Y” should be interpreted to include X and Y. As used herein, phrases such as “between about X and Y” mean “between about X and about Y.” As used herein, phrases such as “from about X to Y” mean “from about X to about Y.”
- Spatially relative terms, such as “under”, “below”, “lower”, “over”, “upper” and the like, may be used herein for ease of description to describe one element or feature's relationship to another element(s) or feature(s) as illustrated in the figures. It will be understood that the spatially relative terms are intended to encompass different orientations of the device in use or operation in addition to the orientation depicted in the figures. For example, if a device in the figures is inverted, elements described as “under” or “beneath” other elements or features would then be oriented “over” the other elements or features. Thus, the exemplary term “under” can encompass both an orientation of over and under. The device may be otherwise oriented (rotated 90 degrees or at other orientations) and the spatially relative descriptors used herein interpreted accordingly. Similarly, the terms “upwardly”, “downwardly”, “vertical”, “horizontal” and the like are used herein for the purpose of explanation only unless specifically indicated otherwise.
- It will be understood that although the terms first and second are used herein to describe various features or elements, these features or elements should not be limited by these terms. These terms are only used to distinguish one feature or element from another feature or element. Thus, a first feature or element discussed below could be termed a second feature or element, and similarly, a second feature or element discussed below could be termed a first feature or element without departing from the teachings of the present invention.
- Unless otherwise defined, all terms (including technical and scientific terms) used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this invention belongs. It will be further understood that terms, such as those defined in commonly used dictionaries, should be interpreted as having a meaning that is consistent with their meaning in the context of the specification and relevant art and should not be interpreted in an idealized or overly formal sense unless expressly so defined herein. Well-known functions or constructions may not be described in detail for brevity and/or clarity.
- The term “about”, as used herein with respect to a value or number, means that the value or number can vary more or less, for example by +/−20%, +/−10%, +/−5%, +/−1%, +/−0.5%, +/−0.1%, etc.
- The terms “sensor”, “sensing element”, and “sensor module”, as used herein, are interchangeable and refer to a sensor element or group of sensor elements that may be utilized to sense information, such as information (e.g., physiological information, body motion, etc.) from the body of a subject and/or environmental information in a vicinity of the subject. A sensor/sensing element/sensor module may comprise one or more of the following: a detector element, an emitter element, a processing element, optics, mechanical support, supporting circuitry, and the like. Both a single sensor element and a collection of sensor elements may be considered a sensor, a sensing element, or a sensor module.
- The term “optical emitter”, as used herein, may include a single optical emitter and/or a plurality of separate optical emitters that are associated with each other.
- The term “optical detector”, as used herein, may include a single optical detector and/or a plurality of separate optical detectors that are associated with each other.
- The term “wearable sensor module”, as used herein, refers to a sensor module configured to be worn on or near the body of a subject.
- The terms “monitoring device” and “biometric monitoring device”, as used herein, are interchangeable and include any type of device, article, or clothing that may be worn by and/or attached to a subject and that includes at least one sensor/sensing element/sensor module. Exemplary monitoring devices may be embodied in an earpiece, a headpiece, a finger clip, a digit (finger or toe) piece, a limb band (such as an arm band or leg band), an ankle band, a wrist band, a nose piece, a sensor patch, eyewear (such as glasses or shades), apparel (such as a shirt, hat, underwear, etc.), a mouthpiece or tooth piece, contact lenses, or the like.
- The term “monitoring” refers to the act of measuring, quantifying, qualifying, estimating, sensing, calculating, interpolating, extrapolating, inferring, deducing, or any combination of these actions. More generally, “monitoring” refers to a way of getting information via one or more sensing elements. For example, “blood health monitoring” includes monitoring blood gas levels, blood hydration, and metabolite/electrolyte levels.
- The term “headset”, as used herein, is intended to include any type of device or earpiece that may be attached to or near the ear (or ears) of a user and may have various configurations, without limitation. Headsets incorporating sensor modules, as described herein, may include mono headsets (a device having only one earbud, one earpiece, etc.) and stereo headsets (a device having two earbuds, two earpieces, etc.), true wireless headsets (having two wireless earpieces), earbuds, hearing aids, ear jewelry, face masks, headbands, glasses or eyewear, and the like. In some embodiments, the term “headset” may broadly include headset elements that are not located on the head but are associated with the headset. For example, in a “medallion” style wireless headset, where the medallion comprises the wireless electronics and the headphones are plugged into or hard-wired into the medallion, the wearable medallion would be considered part of the headset as a whole. Similarly, in some cases, if a mobile phone or other mobile device is intimately associated with a plugged-in headphone, then the term “headset” may refer to the headphone-mobile device combination. The terms “headset” and “earphone”, as used herein, are interchangeable.
- The term “physiological” refers to matter or energy of or from the body of a creature (e.g., humans, animals, etc.). In embodiments of the present invention, the term “physiological” is intended to be used broadly, covering both physical and psychological matter and energy of or from the body of a creature.
- The term “body” refers to the body of a subject (human or animal) that may wear a monitoring device, according to embodiments of the present invention.
- The term “processor” is used broadly to refer to a signal processor or computing system or processing or computing method which may be localized or distributed. For example, a localized signal processor may comprise one or more signal processors or processing methods localized to a general location, such as to a wearable device. Examples of such wearable devices may comprise an earpiece, a headpiece, a finger clip, a digit (finger or toe) piece, a limb band (such as an arm band or leg band), an ankle band, a wrist band, a nose piece, a sensor patch, eyewear (such as glasses or shades), apparel (such as a shirt, hat, underwear, etc.), a mouthpiece or tooth piece, contact lenses, or the like. Examples of a distributed processor comprise “the cloud”, the internet, a remote database, a remote processor computer, a plurality of remote processors or computers in communication with each other, or the like, or processing methods distributed amongst one or more of these elements. The key difference is that a distributed processor may include delocalized elements, whereas a localized processor may work independently of a distributed processing system. As a specific example, microprocessors, microcontrollers, ASICs (application specific integrated circuits), analog processing circuitry, or digital signal processors are a few non-limiting examples of physical signal processors that may be found in wearable devices.
- The term “remote” does not necessarily mean that a remote device is a wireless device or that it is a long distance away from a device in communication therewith. Rather, the term “remote” is intended to reference a device or system that is distinct from another device or system or that is not substantially reliant on another device or system for core functionality. For example, a computer wired to a wearable device may be considered a remote device, as the two devices are distinct and/or not substantially reliant on each other for core functionality. However, any wireless device (such as a portable device, for example) or system (such as a remote database for example) is considered remote to any other wireless device or system.
- The terms “music” and “musical audio”, as used herein, are interchangeable and refer to vocal and/or instrumental sounds that can be played or otherwise presented to a person, for example, via speakers, earbuds, etc.
- The term “tempo”, as used herein, refers to the speed of the rhythm at which a composition of music is played. Conventionally, tempo is measured according to beats per minute. For example, according to some conventions, a very fast tempo, prestissimo, has between 200 and 208 beats per minute, presto has 168 to 200 beats per minute, allegro has between 120 and 168 beats per minute, moderato has 108 to 120 beats per minute, andante has 76 to 108 beats per minute, adagio has 66 to 76 beats per minute, larghetto has 60 to 66 beats per minute, and largo has 40 to 60 beats per minute.
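Treating each of the conventional BPM ranges above as half-open (the stated conventions overlap slightly at their endpoints, and 208 is included for prestissimo), the mapping from tempo to its conventional name can be written as a small lookup:

```python
TEMPO_RANGES = [  # (name, low BPM inclusive, high BPM exclusive)
    ("largo", 40, 60), ("larghetto", 60, 66), ("adagio", 66, 76),
    ("andante", 76, 108), ("moderato", 108, 120), ("allegro", 120, 168),
    ("presto", 168, 200), ("prestissimo", 200, 209),
]

def tempo_name(bpm):
    """Map a tempo in beats per minute to its conventional name."""
    for name, lo, hi in TEMPO_RANGES:
        if lo <= bpm < hi:
            return name
    return None

print(tempo_name(90))   # -> andante
print(tempo_name(140))  # -> allegro
```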
- The term “cadence”, as used herein, refers to the frequency or repetition rate of an activity. During running or walking, for example, one's primary cadence may be their running or walking cadence (their footstep rate). During weightlifting, one's primary cadence may be their weightlifting cadence (their “rep rate” or repetition rate). Generally speaking, “exercise cadence” refers to the primary cadence of a particular exercise. It should be noted that some exercises may be characterized by actions that involve more than one cadence—for example, during swimming, one may have an arm-stroke cadence which is different than the cadence of their leg motion. Thus, in some cases, more than one exercise cadence may be required to accurately assess one's workload.
- The term “workload”, as used herein, refers to the amount of work required to perform a task, for example, the amount of work required by a person to perform a task (e.g., running, swimming, exercising, etc.). Workload is typically measured in terms of power (watts) or total energy (joules or calories burned). However, workload may be estimated by measuring or estimating a cadence, speed, or distance traveled by someone and applying various assumptions to correlate distance or exercise cadence with work performed. For example, a constant workload may be associated with running at a constant speed on a flat surface (altitude changes must be factored in, since running uphill is more work than running downhill at the same speed). Other indirect measurements, such as heart rate and respiration rate, can be used to estimate workload. Combinational models of biometric parameters (heart rate, respiration rate, blood pressure, and the like) and physical activity parameters (distance traveled, speed, acceleration, and the like) may also be applied toward indirectly measuring (estimating) workload. However, it should be noted that using heart rate to estimate workload can be deceiving, as heart rate may increase with internal body changes (such as temperature changes, digestion, exhaustion, and the like) independently of true physical workload. Examples of direct measurements of workload may include treadmill distance monitoring, cadence monitoring, power metering, video recording, or the like.
- Cadence monitoring may be particularly useful for a person wearing a wearable device, as cadence may be measured by an accelerometer. In a particular embodiment, an assessment of workload may be approximated by measuring a user's cadence. In another embodiment, an assessment of workload may be generated by measuring the user cadence of a physical activity by multiplying the user cadence by a scaler value “m” and adding a constant “b” in the form of a linear equation, where the “m” value is an experimentally derived slope relating a user's cadence to energy expenditure and the “b” value is related to an estimate of REE (resting energy expenditure).
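The linear cadence-to-energy model above can be sketched directly. The slope m and intercept b below are placeholder values chosen for illustration, not experimentally derived; in practice m would be fit to measured energy expenditure and b related to an estimate of REE.

```python
def energy_rate_from_cadence(steps_per_min, m=0.05, b=1.2):
    """Estimate energy expenditure rate (kcal/min) as m * cadence + b.
    m and b here are placeholder values: m would be an experimentally
    derived slope relating cadence to energy expenditure, and b would
    be related to resting energy expenditure (REE)."""
    return m * steps_per_min + b

print(energy_rate_from_cadence(160))  # 0.05 * 160 + 1.2 = 9.2 kcal/min
```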
- The terms “energy expenditure” and “total calories burned”, as used herein, are interchangeable. It should be understood that energy expenditure and workload are not necessarily equal, as energy can be expended at rest due to homeostasis or fidgeting, which is not considered workload. As with workload, energy expenditure may be indirectly measured (estimated) via physiological models, e.g., by applying models for heart rate vs. energy expenditure or accelerometry measurements vs. energy expenditure, or combination models of heart rate and accelerometry. However, it should be noted that using heart rate to estimate energy expenditure also may be deceiving, as heart rate may reach a physical limit (maximum possible heart rate) during intense exercise even when energy expenditure is increasing. Examples of direct measurements of energy expenditure may include gas exchange analysis, doubly-labeled water assays, metabolic chamber monitoring, or the like.
- The terms “respiration rate” and “breathing rate”, as used herein, are interchangeable.
- The terms “heart rate” and “pulse rate”, as used herein, are interchangeable.
- The term “RRi” refers to “R-R interval” in the electrocardiogram or photoplethysmogram of a person. Generally, where heart rate is used in embodiments of the present invention, RRi may also be applied in a similar manner. However, RRi and heart rate are generally related in an inverse fashion, such that 1/RRi=instantaneous heart rate.
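Since 1/RRi gives instantaneous heart rate in beats per second when RRi is expressed in seconds, multiplying by 60 yields the more familiar beats per minute:

```python
def rri_to_hr_bpm(rri_seconds):
    """Instantaneous heart rate in beats/min from an R-R interval in
    seconds: 1/RRi gives beats per second, so HR = 60 / RRi."""
    return 60.0 / rri_seconds

print(rri_to_hr_bpm(0.8))  # -> 75.0 (an 800 ms interval is 75 bpm)
```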
- The term “HRV” refers to “heart rate variability” or “R-R variability”, which is a statistical representation of a group of consecutive R-R intervals or N-N intervals (beat-to-beat intervals between consecutive heart beats). The types of statistics performed to generate an HRV value can be quite numerous and broad. In general, a variety of different time-domain and/or frequency domain statistics on heart beat intervals can be described as different HRV values. As one specific example of HRV, 2- or 5-minutes worth of R-R intervals may be processed to determine the mean and standard deviation (SDNN), which is a representation of HRV. In general, the higher the SDNN for a group of R-R intervals collected from a person, the more relaxed, physically fit, or healthy that person may be. N-N intervals may be collected via photoplethysmograms (PPG), electrocardiograms (ECG), blood pressure pulses, ballistocardiograms (BCG), and the like.
- The term “natural body frequency”, as used herein, refers to a resonant frequency of the body or other characteristic frequency of the body where some sort of resonance may occur. For example, resonance between breathing rate and heart rate may be a natural body frequency. Resonance can refer to the case where heart rate and respiration rate share a fundamental frequency (they are harmonics of the same fundamental frequency) or it can refer to the case where the heart rate variability (HRV) is at a maximum. As another example, a resonance between someone's running or walking cadence (or more broadly their exercise cadence) and heart rate (or other periodic vital sign) may be a natural body frequency. In another example, the mechanical structure of the body (in terms of springs, damping, and the like) may have a mechanical resonance frequency. In another example, the periodic neurological processing of the human body, or homeostasis, may be characterized by a natural frequency.
- The term “fundamental frequency”, as used herein, refers to the lowest frequency of a periodic waveform. For example, breathing rate and heart rate may resonate with each other when they share a fundamental frequency, such as when heart rate is 6× that of the breathing rate.
- Various biometric parameters and activity parameters may be described herein by using the name of the parameter (such as “heart rate”, VO2max, and the like). Generally speaking, these names may refer to instantaneous values, averaged values, or some other processing of the associated parameter(s). For example, a breathing rate of 14 BPM (breaths per minute) may refer to an instantaneous measurement or an averaged measurement (for example, an average breathing rate of 14 BPM as averaged over 5 minutes). Unless “instantaneous”, “average”, or some other adjective is used to describe the parameter, it should not be assumed that there is a limitation with respect to the processing of the parameter.
- The term “periodic biometric parameter”, as used herein, refers to a biometric parameter that is derived from a periodic process in the body of a subject such that it is characterized by a rate or frequency, such as heart rate, breathing rate, homeostasis, RRi, neurological functioning, sleep cycles, and the like.
- In the following figures, various monitoring devices will be illustrated and described for attachment to the ear or an appendage of the human body. However, it is to be understood that embodiments of the present invention are not limited to those worn by humans.
- The ear is an ideal location for wearable health and environmental monitors. The ear is a relatively immobile platform that does not obstruct a person's movement or vision. Monitoring devices located at an ear have, for example, access to the inner-ear canal and tympanic membrane (for measuring core body temperature), muscle tissue (for monitoring muscle tension), the pinna, earlobe, and elsewhere (for monitoring blood gas levels), the region behind the ear (for measuring skin temperature and galvanic skin response), and the internal carotid artery (for measuring cardiopulmonary functioning), etc. The ear is also at or near the point of exposure to: environmental breathable toxicants of interest (volatile organic compounds, pollution, etc.); noise pollution experienced by the ear; and lighting conditions for the eye. Furthermore, as the ear canal is naturally designed for transmitting acoustical energy, the ear provides a good location for monitoring internal sounds, such as heartbeat, breathing rate, and mouth motion. Accurate sensing of photoplethysmograms and heart rate from the ear has been demonstrated in regions between the concha and anti-tragus locations of the outer ear, and elsewhere at the ear.
- Optical coupling into the blood vessels of the ear may vary between individuals. As used herein, the term “coupling” refers to the interaction or communication between excitation energy (such as light) entering a region and the region itself. For example, one form of optical coupling may be the interaction between excitation light generated from within an optical sensor of an earbud (or other device positioned at or within an ear) and the blood vessels of the ear. In one embodiment, this interaction may involve excitation light entering the ear region and scattering from a blood vessel in the ear such that the temporal change in intensity of scattered light is proportional to a temporal change in blood flow within the blood vessel. Another form of optical coupling may be the interaction between excitation light generated by an optical emitter within an earbud and a light-guiding region of the earbud. Thus, an earbud with integrated light-guiding capabilities, wherein light can be guided to multiple and/or select regions along the earbud, can assure that each individual wearing the earbud will generate an optical signal related to blood flow through the blood vessels. Optical coupling of light to a particular ear region of one person may not yield photoplethysmographic signals for each person. Therefore, coupling light to multiple regions may assure that at least one blood-vessel-rich region will be interrogated for each person wearing an earbud. Coupling multiple regions of the ear to light may also be accomplished by diffusing light from a light source within an earbud.
-
FIGS. 1 and 3 illustrate an audio earbud 20 capable of sensing physiological information and configured to be positioned within an ear of a subject, according to some embodiments of the present invention. The illustrated apparatus 20 of FIG. 3 includes an earpiece body or housing 22, a sensor module 24, a stabilizer 25 (optional), and a sound port 26. When positioned within the ear of a subject, the sensor module 24 has a region 24a configured to contact a selected area of the ear. The illustrated sensor region 24a may be contoured (i.e., is "form-fitted") to matingly engage a portion of the ear between the anti-tragus and acoustic meatus, and the stabilizer 25 is configured to engage the anti-helix. However, monitoring devices in accordance with embodiments of the present invention can have sensor modules with one or more regions configured to engage various portions of the ear. Various types of devices configured to be worn at or near the ear may be utilized in conjunction with embodiments of the present invention. -
FIGS. 3 and 4A-4B illustrate a monitoring apparatus 30 in the form of a sensor band 32 configured to be secured to an appendage (e.g., an arm, wrist, hand, finger, toe, leg, foot, neck, etc.) of a subject. The band 32 includes a sensor module 34 on or extending from the inside surface 32a of the band 32. The sensor module 34 is configured to detect and/or measure physiological information from the subject and includes a sensor region 34a that may be contoured to contact the skin of a subject wearing the apparatus 30. For example, the sensor region 34a may comprise a photoplethysmography (PPG) sensor, a bioimpedance sensor, a ballistogram sensor, an auscultatory sensor, a thermal sensor, or the like. - The
sensor modules 24, 34 of the monitoring devices 20, 30 illustrated in FIGS. 1, 2, 3 and 4A-4B are configured to detect and/or measure physiological information from a subject wearing the monitoring devices 20, 30. - A
sensor module 24, 34 may include one or more sensors configured to sense such physiological information from the subject. - Referring to
FIG. 5, a system 100 for enhancing the biometric performance of a person via musical audio, according to some embodiments of the present invention, is illustrated. The system 100 includes at least one processor 40 that is coupled to the sensor(s) of a sensor module 24, 34. The processor 40 utilizes one or more algorithms 50 for enhancing the biometric performance of a person and/or modulating one or more biometric parameters via musical audio, as will be described below. -
FIG. 6 illustrates a system 100 for enhancing the biometric performance of a person via musical audio using both acute and chronic feedback, according to some embodiments of the present invention. The system 100 includes one or more monitoring devices 20, 30, whose local processors 40 are configured to receive data from the sensors associated with the monitoring devices 20, 30 and provide real-time (acute) feedback to the person. In addition, the system 100 includes one or more remote processors 40′ that are configured to process stored data from the devices 20, 30. - Additionally, a feedback loop is provided to update algorithms for personalized processing, based on long-term trends observed over time. For example, the
remote processor 40′ (such as a cloud processor) may process sets of acquired sensor data to determine that someone is at risk of a cardiac condition (such as arrhythmia, atrial fibrillation, a heart attack, stroke, and the like). In such case, feedback may be sent to at least one local processor 40 to update processing resources and to focus those resources on monitoring for the cardiac condition of interest. For example, the sampling frequency or polling rate of a sensor may be increased, or an unpowered or sleeping sensor may be turned on or awakened. -
FIG. 7 illustrates a specific embodiment of the system 100 of FIG. 6, namely a system 200 for enhancing the biometric performance of a person via musical audio using both acute and chronic feedback, wherein the sensor data includes PPG data, inertial data and audio data, according to some embodiments of the present invention. The system 200 includes one or more monitoring devices 20, 30, whose local processors 40 are configured to receive the PPG data, inertial data and audio data from the sensors associated with the monitoring devices and provide real-time (acute) feedback to the person based upon both personalized and generalized physiological models. In addition, the system 200 includes one or more remote processors 40′ that are configured to process stored data from the devices 20, 30. - Referring to
FIG. 8, a method of controlling a biometric parameter of a person includes sensing vital sign data, such as, in this particular example, heart rate and/or breathing rate, via one or more monitoring devices 20, 30 (Block 300) and determining frequency characteristics of the heart rate and/or breathing rate (Block 302). Musical audio is then selected and presented to the person (e.g., via an audio earbud 20) based on the frequency characteristics of the person's heart rate and/or breathing rate (Block 304). By varying the tempo or other characteristic(s) of the musical audio, the person's breathing rate and/or heart rate can be modulated. - However, it is to be understood that characteristics of a photoplethysmogram, other than heartbeat or respiration rate frequency, also may be employed in embodiments of the present invention. For example, the amplitude, ramp rate, decay rate, shape, etc. of a PPG waveform may be characterized by a processor and then music may be presented to the user based on at least one of these characteristics. For example, if a processor determines that a person has a sharp rise-time (or fall-time) to his/her PPG waveform, the music may be modified such that every up-beat (or down-beat) is accentuated or sped up in time in order to resonate or correlate with that of the PPG waveform. Similarly, the tempo or amplitude of the music may be modified with the ramp-rate or decay-rate of a heart rate or breathing rate of the user. As another example, if the shape of a person's PPG waveform is "sawtooth" in nature, then the music may be modified such that the acoustical waveforms have a sawtooth characteristic. As yet another example, if a processor determines a person's heart rate and breathing rate are characterized by distinct frequencies or frequency bands, then the processor may modify the music such that harmonics of each frequency or frequency band are introduced into the playlist or are incorporated or alternated in a selected song.
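One hypothetical way a processor might implement the selection step of Blocks 302-304, choosing a playlist tempo from the frequency characteristics of a measured breathing rate; the harmonic set and nearest-match rule are illustrative assumptions, not taken from the specification:

```python
def pick_tempo(breathing_rate_bpm, playlist_tempos, harmonics=(1, 2, 4)):
    """Pick the playlist tempo (beats/min) closest to a harmonic of the
    person's breathing rate. For example, a breathing rate of
    15 breaths/min yields candidate tempos of 15, 30, and 60 beats/min.
    """
    candidates = [breathing_rate_bpm * h for h in harmonics]
    return min(
        playlist_tempos,
        key=lambda tempo: min(abs(tempo - c) for c in candidates),
    )
```

The same nearest-harmonic rule could be applied with heart rate, or with both rates, as the frequency characteristic driving the selection.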
- As described earlier, PPG information may be processed into a variety of biometrics other than heart rate and breathing rate, for example, such as blood pressure, blood hydration level, blood analyte (blood oxygen, CO2, CO, glucose, etc.) level, R-R interval (RRi), heart rate variability (HRV) information, hemodynamic information, cardiac output, aerobic capacity (VO2max), VO2, metabolic rate, health status information, breathing volume (inhalation and exhalation volume), and the like. These biometrics may also comprise frequency characteristics that can be mapped to musical frequencies, and some embodiments of the present invention, such as those described with respect to
FIG. 8 and FIG. 9, may be applied using these biometrics instead of, or in addition to, heart rate and breathing rate. - Referring to
FIG. 9, a method of controlling a biometric parameter of a person includes sensing vital sign data, such as heart rate and breathing rate, via one or more monitoring devices 20, 30. - Referring to
FIG. 10, a method of enhancing the biometric performance of a person via audio includes analyzing vital sign data (e.g., heart rate, breathing rate, etc.) from a person to identify a natural body frequency (Block 320). Musical audio is then played to the person, for example via an audio earbud 20, in resonance with, or in a harmonic of, the natural body frequency of the person (Block 322). For example, in one embodiment of this invention, both heart rate and respiration rate are determined by a processor, and the audio is selected to manipulate the respiration rate (as described below with respect to FIG. 11) such that the respiration rate and heart rate are always in resonance with each other (i.e., they share a fundamental frequency or the heart rate variability is highest). - Another method of enhancing the biometric performance of a person via musical audio, in accordance with embodiments of the present invention, may include analyzing vital sign data (e.g., heart rate, breathing rate, etc.) from a person as well as analyzing physical activity data (user cadence, speed, pace, gait, etc.) from the person and notifying the person when the cadence and vital sign are characterized by the same fundamental frequency. Additionally, the music playlist may be adjusted to this common frequency (or pitch) or the songs may be stretched or compressed in time to match this common frequency or to match at least a harmonic of the fundamental frequency.
- There may be physiological benefits to running or exercising at a cadence that shares a fundamental frequency with one's heart rate and/or respiration rate. For example, as with impedance matching for electrical circuits, the pumping mechanism of the cardiopulmonary system may require less work to generate sufficient blood flow when the exercising cadence resonates with the heart rate of the person. As such, notifying a person that his/her cadence and heart rate are not resonating (i.e., are not equal or do not share a common fundamental frequency) can enable the person to change cadence and minimize total energy expenditure for a given workload. Alternately, if the person wants to increase total energy expenditure for a given workload, then the person may want to exercise at a cadence that does not resonate with heart rate. Thus, notifying the user of resonance between at least one vital sign and at least one exercise cadence can help train the person toward better exercise efficiency or, if desired, toward intentionally poorer exercise efficiency.
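The notification idea above can be sketched by computing the cadence nearest the person's current cadence that would resonate with heart rate (i.e., cadence = HR × p/q for small integers p and q); the harmonic limit is an illustrative assumption:

```python
def nearest_resonant_cadence(current_cadence_spm, hr_bpm, max_harmonic=4):
    """Suggest the cadence (steps/min) nearest the current one that
    shares a fundamental frequency with heart rate. Candidate cadences
    are HR * p/q for small integers p, q; the suggestion could be used
    to notify the person that a small cadence change would restore
    resonance (or, for higher energy expenditure, to avoid it).
    """
    candidates = {
        hr_bpm * p / q
        for p in range(1, max_harmonic + 1)
        for q in range(1, max_harmonic + 1)
    }
    return min(candidates, key=lambda c: abs(c - current_cadence_spm))
```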
-
FIG. 11 is a plot 400 of a response metric 404 and a controlled metric 402 over time that illustrates a method of determining optimal breathing rate and music tempo for minimum cardiac exertion at a steady cadence, according to some embodiments of the present invention. In FIG. 11, the controlled metric 402 is respiration rate and the response metric 404 is heart rate. Applicant has discovered that a person's respiration rate may subconsciously track with a harmonic of the tempo of a given musical audio heard by the person. In such case, the person's respiration rate may be controlled by the music, and heart rate may respond to the respiration rate. As such, an optimal controlled metric associated with an optimal response metric, for a given workload (or more generally, "activity state"), can be learned. - Initially, a person listens to musical audio while exercising at a constant cadence. The musical audio changes tempo 406 while the user is exercising, and the person's
breathing rate 402 may naturally lock in to a harmonic of the musical audio tempo 406. In turn, the user's heart rate may fluctuate with breathing rate during exercise at a constant workload by the user. Thus, the ideal breathing rate 402 (and associated musical audio tempo) for minimal cardiac exertion/workload (i.e., minimum heart rate (HR) at a given workload) may be identified as that in time period D, which is associated with the minimum HR 404 during a steady cadence (i.e., minimal energy expenditure for a given workload or minimal heart rate for a given workload). However, in the case where weight loss is desired, the opposite goal applies. Namely, the ideal breathing rate 402 (and associated music tempo 406) for maximal energy expenditure/workload is identified as that of time period E, which is associated with the maximum cardiac exertion/energy expenditure (i.e., maximum HR at a given workload). -
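The learning step described for FIG. 11 amounts to selecting, from successive constant-workload time periods, the tempo whose mean heart rate is lowest (or highest, for the weight-loss goal). A minimal sketch under that assumed data layout:

```python
def optimal_tempo(segments, goal="min_hr"):
    """Pick the music tempo associated with the lowest (or highest)
    mean heart rate across successive constant-workload time periods,
    as in the FIG. 11 / FIG. 12 experiments. `segments` is a list of
    (tempo, mean_heart_rate) pairs -- an assumed data layout.
    """
    pick = min if goal == "min_hr" else max
    tempo, _hr = pick(segments, key=lambda seg: seg[1])
    return tempo
```

The same selection applies whether the controlled metric is breathing rate (FIG. 11) or cadence (FIG. 12); only the meaning of the segment labels changes.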
FIG. 12 is a plot 500 of a response metric 504 and a controlled metric 502 over time that illustrates a method of determining optimal cadence and music tempo for minimum cardiac exertion at a steady speed, according to some embodiments of the present invention. In FIG. 12, the controlled metric 502 is cadence, such as step rate, cycling cadence, exercise cadence, etc., and the response metric 504 is heart rate (HR). Applicant has discovered that a person's cadence 502 may subconsciously track with a harmonic of the tempo of a given musical audio heard by the person. In such case, the person's cadence is controlled by the music, and heart rate responds to this cadence. As such, an optimal controlled metric associated with an optimal response metric, for a given activity state, such as workload, can be learned. - Initially, a person listens to musical audio while exercising at a steady speed. The musical audio changes
tempo 506 while the user is exercising, and the person's cadence rate 502 naturally locks in to a harmonic of the musical audio tempo 506. The ideal cadence rate 502, and associated musical audio tempo, for minimal cardiac exertion is identified as that associated with the minimum HR 504 during a steady cadence, as shown in time period C. However, in the case where weight loss is desired, the ideal cadence rate 502, and associated musical audio tempo 506, for maximal energy expenditure at the same running speed is identified as that associated with the maximum cardiac exertion (maximum HR 504), as shown in time period E. - In addition to heart rate (HR) and breathing rate (BR), other body metrics that may be modulated by musical audio and monitored to optimize the use of music with respect to exercise include, but are not limited to, body temperature, blood pressure, cardiac output, RRi, and ventilatory threshold (VT). Body temperature can be measured via a body temperature sensor, or estimated via monitoring HR and/or BR at a constant workload. Using musical audio to manipulate BR to reduce HR for a given speed can be used to push out VT in time (pushing it out to a higher heart rate), thereby delaying the transition between aerobic and anaerobic exercise. This is illustrated in
FIGS. 13 and 14 and described below. RRi may be measured by identifying heartbeat peaks or frequencies in a PPG waveform (using time-domain or frequency-domain analysis) and then reporting the time periods between each heartbeat. The user's time between heartbeats may lock in to the time between music beats, such that one's RRi may be controlled by the music beat. -
FIG. 13 is a plot 600 of BR data 602 and HR data 604 collected over a period of time for a person and from which ventilatory threshold VT can be estimated, according to some embodiments of the present invention. Data was collected from a person wearing a monitoring device having PPG-based sensor technology during a VO2max test. The person was wearing an audio earbud 20, but could have also been wearing a wrist band 30 (or armband, legband, ring, patch, etc.) containing a sensor module 24, 34. The monitoring device 20 sensed photoplethysmograms (via a PPG sensor) and motion (via a motion sensor) and processed the signals (via a processor) to attenuate motion noise and generate metrics for HR, BR, distance, speed, pace, cadence, VO2 (oxygen volume consumption), blood pressure, RRi (R-R interval), and other biometrics. -
- Additionally, a person's workload may also be factored into an algorithm for determining VT. As a specific example, if a
wearable device 20, 30 determines from inertial sensor data that the person's workload is steady and sufficiently high, confidence in a VT estimate during that period may be increased. - VT is an important biometric assessment because it can indicate the transition of a body from aerobic to anaerobic exercise, and can also be associated with lactate threshold. Additionally, before VT, an estimation of energy expenditure (related to VO2) can be a linear relationship factoring HR as the variable: EE=k*VO2=m*HR+b, where EE=energy expenditure, k=constant, m=slope, HR=heart rate, and b=intercept. After VT is reached, since HR may not be changing (and indeed HR may be saturating) in time, a different relationship for energy expenditure may be required. Namely, the algorithm for energy expenditure may branch at VT, such that two separate algorithms are employed before and after VT is reached. For example, before VT, the aforementioned EE=m*HR+b may be employed, and after VT, an alternative relationship between EE and HR may be employed. Alternatively, after VT is reached, a relationship between EE and BR may be employed, where heart rate is not factored into the equation. For example, EE may increase linearly with increasing BR after VT is reached.
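The VT branching described above can be sketched as follows: detect the point at which the BR slope jumps, then switch from an HR-based to a BR-based linear EE model. All coefficients, window sizes, and threshold factors below are illustrative placeholders, not values from the specification:

```python
def detect_vt_index(br_series, window=3, slope_jump=1.5):
    """Return the index where breathing rate starts rising much faster
    than before (a simple proxy for ventilatory threshold), or None.

    Compares the local slope of BR vs. time against the average slope
    over the preceding window.
    """
    for i in range(window, len(br_series) - 1):
        prior = (br_series[i] - br_series[i - window]) / window
        local = br_series[i + 1] - br_series[i]
        if prior > 0 and local > slope_jump * prior:
            return i
    return None


def energy_expenditure(hr, br, past_vt, m=0.125, b=1.0, m_br=0.25, b_br=2.0):
    """Branch the EE model at VT: EE = m*HR + b before VT, and a
    BR-based linear relationship (EE = m_br*BR + b_br) after VT."""
    return m_br * br + b_br if past_vt else m * hr + b
```

A real implementation would operate on smoothed, regularly sampled BR data and would fit the pre- and post-VT coefficients per user.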
-
FIG. 14 is a plot of BR over time for a person, without bio-tuned music (702) and with bio-tuned music (704), illustrating the manipulation of VT via musical audio, according to some embodiments of the present invention. Specifically, FIG. 14 illustrates how "bio-tuning" musical audio can push out VT. For example, once an optimal musical tempo is identified for minimizing energy expenditure (EE), VO2, or HR for a given workload, the bio-tuned musical audio can be used to shift out VT during maximal exercise, such as for a VO2max test or for intense training. Thus, the person may be better able to endure greater workloads without reaching exhaustion as quickly. - Referring to
FIG. 15, a method of determining VT in a person wearing a monitoring device 20, 30 is illustrated. - It should be noted that a variety of methods may be employed to determine the time at which VT is reached. For example, an alternate method of determining the time that VT is reached may be to analyze at least a few seconds of data to see when the slope of BR vs. HR changes substantially. As can be observed from
FIG. 13, the BR does not change substantially with increasing HR until VT is reached, at which point the change in BR vs. HR increases substantially (i.e., the slope of BR vs. HR increases substantially). By "substantially", this may mean a change in slope of 5% or higher over a period of sixty (60) or more seconds. - Referring to
FIG. 16, another method of determining VT in a person wearing a monitoring device 20, 30 is illustrated. - Activity level of the person is analyzed (Block 814) and a determination is made as to whether VT is viable at one or more time periods where the person has a steady workload (Block 816). For example, physiological models (such as theoretical or experiential models) may include relationships between a given workload and whether VT can be reached at that workload. These models may also incorporate static characteristics about a person, such as age, height, and gender, as well as quasi-static characteristics, such as weight and cardiac efficiency. Using cadence or speed as a proxy for workload, at least one model can be used to determine whether VT is viable. In some embodiments, the model may also include information about heart rate, such that VT is viable at only certain workloads and certain heart rates. In any case, if a processor determines that VT is viable, VT is identified as a point of rapid increase in HR or BR during the viable time period (Block 818). If not, VT is determined to be not viable in the time period (Block 820). The method of
FIG. 16 can be a "looped" process, such that activity data is continually analyzed over time to mark periods during which VT is viable or not viable. - Referring to
FIG. 17, a method of determining body temperature of a person wearing a monitoring device 20, 30 is illustrated. - In particular, HR and BR may significantly increase with increasing body temperature, and HRV may have spectral (frequency-domain) components with spectral coefficients that either increase or decrease with body temperature. Since HR can be measured using data from a PPG sensor, the body temperature of the person can then be estimated using a calibration factor or mathematical relationship between the measured HR and body temperature for a given workload (Block 836).
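A minimal sketch of the HR-to-body-temperature calibration idea (Block 836); the linear form and all coefficients are illustrative placeholders for a per-user calibration at a given workload, not clinically validated values:

```python
def estimate_core_temp_c(hr_bpm, baseline_hr_bpm, baseline_temp_c=37.0,
                         bpm_per_deg_c=10.0):
    """Estimate core body temperature from heart rate at a known,
    constant workload. Each degree C of temperature rise is assumed to
    add `bpm_per_deg_c` beats/min above the baseline HR measured at the
    baseline temperature for that same workload.
    """
    return baseline_temp_c + (hr_bpm - baseline_hr_bpm) / bpm_per_deg_c
```

An analogous calibration could substitute BR, or an HRV spectral coefficient, for HR, as described above.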
- Body temperature may also be estimated using a calibration factor or mathematical relationship between BR and temperature for a given workload or a mathematical relationship between HRV and body temperature for a given workload. In this particular example, it may be beneficial to estimate workload using data from the inertial sensor(s) in the wearable device worn by the user. A variety of methods for estimating workload using wearable inertial sensors are well known to those skilled in the art.
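As one simple instance of the count-based workload estimators referred to above, deviations of the accelerometer vector magnitude from 1 g (gravity) can be accumulated as a crude workload proxy; real implementations add band-pass filtering and fixed-length epoching:

```python
from math import sqrt


def activity_counts(samples_g, rest_g=1.0):
    """Crude workload proxy from inertial data: sum of absolute
    deviations of the accelerometer vector magnitude (in g) from the
    resting 1 g value, over a window of (x, y, z) samples.
    """
    total = 0.0
    for x, y, z in samples_g:
        total += abs(sqrt(x * x + y * y + z * z) - rest_g)
    return total
```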
- In some embodiments, the present invention may be used to help guide a user to controlled breathing, such that a physiological assessment may be generated for the user based on biometric sensor data collected during the guided controlled breathing, as shown in
FIG. 18. In this method, a user wearing a biometric sensor, such as the wearable sensors described in FIGS. 1-5, or a user utilizing a sensor system, such as that presented in FIGS. 6-7, may be audibly, and perhaps also visually, guided into a state of controlled breathing (Block 900). As a specific example, an animated character may be presented on a view-screen to demonstrate inhaling and exhaling, such that the user adapts to the breathing rate of the character on the view-screen. Alternatively or additionally, the user may be presented with audible instructions for inhaling and exhaling, such as being presented with breathing sounds for inhaling and exhaling or with music that follows inhalation and exhalation, at the frequency of the targeted breathing rate or breathing volume. The audio and visual feedback may be provided by the wearable device itself or another part of the system of FIG. 6 or FIG. 7, such as a phone, computer, or the like. - As the user is being guided, a processor may process biometric sensor data to generate a physiological assessment for the user (Block 902). Nonlimiting examples of such physiological assessments are presented in
FIGS. 22-24. But more generally, such physiological assessments may comprise an assessment of one's health status, physical fitness, stress status (physical and/or mental), a biometric (such as blood pressure, cardiac functioning, blood oxygenation, or the like), or the like. The assessment(s) may then be presented audibly, and perhaps also visually, to the user (Block 904).
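The guidance step (Block 900) can be sketched as a cue schedule at the targeted breathing rate, which the view-screen animation or audible instructions would then follow; the even inhale/exhale split is an illustrative default:

```python
def breathing_cues(target_br_bpm, duration_s, inhale_fraction=0.5):
    """Build a list of (time_s, "inhale"/"exhale") cues for guiding a
    user toward a target breathing rate over a session of the given
    duration. Each breath period is split between inhale and exhale
    according to `inhale_fraction`.
    """
    period = 60.0 / target_br_bpm
    cues, t = [], 0.0
    while t < duration_s:
        cues.append((round(t, 3), "inhale"))
        cues.append((round(t + inhale_fraction * period, 3), "exhale"))
        t += period
    return cues
```

For example, a target of 6 breaths/min yields a 10-second breath period with alternating cues every 5 seconds.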
- In some embodiments, the method of
FIG. 18 may be incorporated into a game having a goal. In this way, generating an important physiological assessment can also be entertaining. For example, a gaming character may be able to unlock and utilize certain skills, weapons, or powers once a state of controlled breathing or relaxation is detected. Once the gaming goals are achieved, a physiological assessment may be presented to the user. - In some embodiments, the method of
FIG. 18 may be utilized to generate a physiological assessment of one's stress sensitivity. For example, a physiological assessment may be generated before and after one has reached controlled breathing. The processor may then compare the before and after readings of this physiological assessment to determine if these readings are substantially different for uncontrolled vs. controlled breathing. The determination of a substantial difference may yield an assessment regarding one's stress sensitivity for presentation to the user, and this assessment may have therapeutic implications. As a specific example, if one's blood pressure readings are determined to be satisfactory (i.e., "normal") for controlled breathing and poor (i.e., "too low" or "too high") for uncontrolled breathing, one's stress sensitivity may be determined to be "high". A high stress sensitivity may then be presented to the user (or the user's trainer, physician, caretaker, or the like), implying that stress-reduction therapy may be effective in helping the user maintain a satisfactory blood pressure. In contrast, if one's before and after readings are found to be essentially identical, one's stress sensitivity assessment may be determined to be "low", such that stress-reduction therapy may not be particularly useful for the user. Some individuals may be more likely to benefit physiologically from controlled breathing and similar stress-reduction methodologies than others. For individuals who are stress-sensitive, non-pharmacological therapies may be more desirable for improving health than drugs, which may be associated with undesired side-effects. Thus, a key benefit of this invention is that enabling a physiological assessment before and after controlled breathing provides a measurable way of determining whether stress-reduction methodologies would be useful for improving an individual's health. - A specific embodiment of the method of
FIG. 18 is presented in FIG. 19. For example, FIG. 19 illustrates a method of generating assessments by guided controlled breathing while monitoring RRi and physical activity via a wearable sensor capable of measuring both RRi information and physical activity information. In this method, the user may be wearing a wearable sensor device (e.g., monitoring device 20, 30) comprising a PPG sensor or another sensor for measuring RRi (e.g., an ECG sensor, an auscultatory sensor, a piezoelectric sensor, a ballistogram sensor, a bioimpedance sensor, or the like) and physical activity information, wherein the device is in communication with a view-screen and/or audio device such that the user may be guided towards controlled breathing visually and/or audibly, as described above (Block 910). If the user is wearing a PPG device that also comprises an inertial sensor, this collected sensor information (Block 912) may be processed into RRi information and/or breathing rate information as described in FIG. 20, by using a variable filter as described in U.S. Patent Application Publication No. 2014/0114147, which is incorporated herein by reference in its entirety, or by using other methods. - Regardless of the signal processing methodology used to generate RRi, the RRi information may then be processed by a processor to determine if the user is in a state of controlled breathing or a state of uncontrolled breathing (Block 914). One method for determining controlled breathing vs. uncontrolled breathing is explained via
FIG. 20 and FIG. 21, but other methods may be used. Once the RRi data from periods of controlled and uncontrolled breathing are collected, the data may be processed to generate a physiological assessment for the user, as described above (Block 916). The assessment(s) may then be presented to the user, or someone monitoring the user, visually and/or audibly (Block 918). For example, FIGS. 22A-22B and FIG. 23 present particular examples of providing a health assessment with diagnostic value to an end user based on the method of FIG. 19. - A specific method of processing RRi and physical activity information to determine if the user is in a state of controlled breathing or uncontrolled breathing is presented in
FIG. 20. As described earlier, this method can be used within the assessment generation method presented in FIG. 19. In this method, a physiological waveform (PPG, ECG, bioimpedance, auscultatory, or the like) is processed to generate RRi information (Block 920). Similarly, a physical activity waveform (accelerometry, gyroscopy, speedometry, and the like) is processed to generate physical activity information (cadence, speed, acceleration, position, exercise intensity, motion intensity, motion or rest duration, and the like) for the user (Block 922). The RRi information is then processed to identify peaks in the RRi-vs.-time data (Block 924), and the physical activity information is processed to identify an activity state of the user (Block 926). The RRi peak information and activity state information can then be processed to determine if the user is (or is not) in a state of controlled breathing (Block 928). As a specific example, the RRi peaks during controlled breathing may have a period and frequency within a range that is characteristic of controlled breathing due to respiratory sinus arrhythmia, and knowledge about the activity state of the user may be applied towards ascertaining if the person is at a state of relative “rest” (low activity) suitable for controlled breathing. This information may be processed together (Block 928) to assess whether a subject is truly in a state of controlled breathing. - A specific non-limiting example of the method of
FIG. 20 is elucidated by FIG. 21. FIG. 21 is a plot 1000 of RRi vs. time collected by a user wearing a PPG sensor module at the wrist, wherein the PPG sensor module further comprises an inertial sensor (in this case an accelerometer). The user started the test with uncontrolled breathing, and at ˜360 seconds the user was instructed to start controlled breathing at 6 breaths per minute (0.1 Hz). In this case, the RRi-vs.-time information was generated in “real-time” by processing the raw PPG signal to determine the peak-to-peak (or valley-to-valley) points in time, in a pulse-picking fashion; RRi-vs.-time may develop a pronounced periodic characteristic during controlled breathing. Thus, there is some inherent latency in generating the RRi information, as each pulse wave must be analyzed by this technique before a pulse is identified in time. The time between pulses was processed as an RR-interval (RRi), and thus a stream of successive RR-intervals was processed in time and smoothed, using a smoothing algorithm (in this case a moving average filter) to help remove unwanted noise artifacts (such as motion-noise, environmental-noise, and electrical-noise), generating a smoothed RRi waveform. The derivative of the smoothed RRi waveform was then calculated to generate an RRi-derivative waveform, and positive-to-negative zero-crossings of the derivative waveform were recorded to indicate peaks in the RRi waveform in time. For controlled breathing at 6 breaths/minute, the periodic time between RRi peaks should be close to 10 seconds, and for uncontrolled breathing, the time between peaks is likely to be much smaller and/or aperiodic (i.e., not periodic). Thus, the time-between-peaks information was fed to a controlled breathing detection algorithm, where the signal output of the detection algorithm was incremented when the time between peaks was between 7 and 13 seconds and decremented otherwise.
There was also a maximum value by which the detection signal output was capped and a minimum value by which the detection signal output was floored. - It should be noted that the algorithm may have additional intelligence to change the “time between peaks” depending on the guided breathing rate. For example, if the guided controlled breathing is selected at 4 breaths per minute, then the signal output of the detection algorithm may be incremented when the time between peaks is between ˜14 and ˜17 seconds (as there are 15 seconds for each full breath in such case). Additionally, the breathing rate may be autonomously detected via a breathing rate detection algorithm (such as that described and referenced earlier) and then the “time between peaks” may be autonomously adjusted according to the detected breathing rate.
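The smoothing, derivative peak-picking, and increment/decrement counting described above can be sketched in a few lines of code. This is a minimal illustration under stated assumptions, not the patented implementation: the function names, the 5-sample moving-average window, and the ±30% tolerance (chosen because it reproduces the 7–13 second acceptance window quoted for 6 breaths per minute) are all choices made for this example.

```python
import numpy as np

def peak_window(breaths_per_min, tolerance=0.3):
    """Accepted time-between-RRi-peaks range for a guided breathing
    rate: one RRi peak per breath, nominal gap = 60 / rate seconds.
    The 0.3 tolerance is an assumed value that reproduces the
    7-13 s window quoted for 6 breaths/min."""
    nominal = 60.0 / breaths_per_min
    return (1.0 - tolerance) * nominal, (1.0 + tolerance) * nominal

def detect_controlled_breathing(rri_s, breaths_per_min=6,
                                smooth_n=5, cap=20, floor=0):
    """rri_s: successive RR-intervals in seconds.
    Returns the clamped detection-signal output, one value per
    gap between detected RRi peaks."""
    rri = np.asarray(rri_s, dtype=float)
    t = np.cumsum(rri)                      # beat times (s)

    # Moving-average smoothing to suppress noise artifacts
    kernel = np.ones(smooth_n) / smooth_n
    smoothed = np.convolve(rri, kernel, mode="same")

    # Peaks = positive-to-negative zero crossings of the derivative
    d = np.diff(smoothed)
    peaks = np.where((d[:-1] > 0) & (d[1:] <= 0))[0] + 1

    lo, hi = peak_window(breaths_per_min)
    signal, out = 0, []
    for gap in np.diff(t[peaks]):
        signal += 1 if lo <= gap <= hi else -1
        signal = min(cap, max(floor, signal))   # cap and floor the output
        out.append(signal)
    return out
```

Fed a synthetic RRi stream that oscillates with a 10-second period (mimicking respiratory sinus arrhythmia under 6 breaths/min pacing), the output climbs to its cap; an aperiodic RRi stream tends to hold the output near the floor.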
- The controlled breathing detection algorithm output for this dataset is presented in
FIG. 21, showing maximum values in the time periods between ˜380 and ˜680 seconds. The algorithm of FIG. 20 was tuned to report controlled breathing (to report a controlled breathing flag) when a predefined percentage of this maximum value was reached, in this case 70% of the maximum value, reporting the onset of controlled breathing at 375 seconds. Moreover, to prevent potential vacillation in reporting, the report of controlled breathing was continued (the flag was kept high) until more than 20 seconds passed with the detection signal output below the predefined percentage of the maximum value. Thus, controlled breathing was reported from ˜375 seconds to ˜680 seconds, as shown in FIG. 21. It should be noted that during the entire data collection period, the accelerometer readings from the PPG sensor module were monitored to determine if the activity level of the person was too high to trust the RRi readings. In such case, a flag would be generated to enable a processor to discount erroneous RRi data. In this particular dataset, the flag was always “0”, as there was no substantial physical activity detected. - Additionally, the identification of low activity was used to determine that the subject was truly in a state of rest suitable for enabling the subject to enter a state of controlled breathing. In one embodiment of the present invention, the determination that a subject is in a state of controlled breathing, leveraging biometric and activity sensing during the duration of the breathing session, may comprise the combination of: a) determining that the user's RRi-vs.-time plot is periodic in a manner that is consistent with controlled breathing (as described earlier), and b) determining that the person's activity state is at relative rest by sensing relatively low levels of motion (such as low accelerometry counts) and/or sensing that the user is at a seated or supine position (such as via body position sensing).
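The reporting logic just described — raise a flag once the detection signal reaches a predefined percentage of its capped maximum, keep the flag high until the signal has stayed below that threshold for more than 20 seconds, and never raise it while motion makes the RRi untrustworthy — amounts to a simple hysteresis loop. The sketch below is an illustrative reconstruction; the function name and the per-sample `activity_ok` gating are assumptions of this example, not elements recited by the specification.

```python
def report_controlled_breathing(times_s, det_signal, activity_ok,
                                cap=20, on_frac=0.70, dropout_s=20.0):
    """Hysteretic controlled-breathing flag, per the example above:
    raise the flag when the detection signal reaches on_frac of its
    cap, and keep it high until the signal has stayed below that
    threshold for more than dropout_s seconds. Samples where
    activity_ok is False (too much motion to trust RRi) never
    raise the flag."""
    threshold = on_frac * cap
    flag, below_since = False, None
    out = []
    for t, s, ok in zip(times_s, det_signal, activity_ok):
        if ok and s >= threshold:
            flag, below_since = True, None   # (re)assert the flag
        elif flag:
            if below_since is None:
                below_since = t              # start the dropout timer
            elif t - below_since > dropout_s:
                flag, below_since = False, None
        out.append(flag)
    return out
```

Applying this to a detection signal that holds its maximum for a stretch and then drops, the flag stays high for the 20-second dropout period after the drop, preventing the vacillation noted above.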
- The RRi data shown in
FIG. 21 can be further processed to generate a physiological assessment for the user. Numerous types of physiological assessments may be generated by processing RRi + physical activity information collected over a period of time, such as cardiovascular assessments, cardiac assessments, stress assessments, or the like. More specific examples of such assessments may comprise the identification of: arrhythmia, atrial fibrillation, fatigue, VO2max, lactic threshold, or the like. Specific examples of generating a physiological assessment based on RRi data are presented in FIGS. 22A, 22B and 23. -
FIG. 22A shows a Poincaré plot 1100 (a type of recurrence plot) comprising successive RR-intervals of FIG. 21 plotted against each other. Ellipses can be fit upon the data points for controlled and uncontrolled breathing, as shown in FIG. 22A and emphasized in the break-out diagram of FIG. 22B. The ellipses may be defined by characteristic standard deviations, SD1 and SD2, along the minor and major axes of the ellipses for controlled breathing (“CB”) and uncontrolled breathing (“UCB”). In general, SD2 may be more closely related with long-term variability in the RR-intervals and may reflect primarily sympathetic activity of the autonomic nervous system (the magnitude of SD2 is inversely related to sympathetic activity level), whereas SD1 may be more closely related with the short-term variability in the RR-intervals and reflects primarily parasympathetic activity of the autonomic nervous system (the magnitude of SD1 is directly related to parasympathetic activity level). The higher SD2 for controlled breathing (SD2CB) compared with the lower SD2 for uncontrolled breathing (SD2UCB) may be associated with a higher state of relaxation and lower stress for controlled breathing (when compared to uncontrolled breathing), showing that controlled breathing was able to bring the user to a state of lower stress. - Thus, the physiological assessment generated for the user may be that the user was originally in a higher stressed state (during uncontrolled breathing) that was corrected or at least ameliorated by a session of controlled breathing. More generally, physiological assessments may be generated for the user by processing the controlled breathing statistical parameters in comparison to the uncontrolled breathing statistical parameters, thereby generating physiological assessment parameters that may be processed via algorithms to generate physiological assessments.
- A key benefit of factoring both controlled and uncontrolled breathing statistical parameters in generating physiological assessments is that the assessments may then be less dependent on external variables, unrelated to health conditions, that may also affect SD1 and SD2. For example, a user who is fatigued from excessive exercise may have a lower SD1 and SD2 on the 3rd day than the 1st day due to a fatigue- or recovery-induced drop in HRV following excessive exercise on the 2nd day. However, in this case, the SD1CB/SD1UCB or SD2CB/SD2UCB ratio may not change as long as the health condition has not changed. This is because the ratio SD1CB/SD1UCB or SD2CB/SD2UCB may be effectively normalized such that the influence of day-to-day, non-health-related variability on physiological assessments can be lessened. In this manner, true health conditions can be exposed as the user moves from uncontrolled to controlled breathing. Some non-limiting examples of such physiological assessment parameters and potential assessments are summarized in the table 1300 of
FIG. 24. The term “average”, as used in the table 1300, generally refers to an average value of a group of users having demographics similar to those of the user under test. - As a specific example of visually presenting the physiological assessment of a user, in the context of the methods presented in
FIG. 18 and FIG. 19, is presented in FIG. 23. The visual display 1200 may summarize the user's Poincaré plot, common shapes of Poincaré plots (referred to as “Poincaré shapes”), and a description of what the user's Poincaré shape means. The identification of the user's Poincaré shape on the plot 1200 may be achieved by implementing a pattern recognition algorithm, a statistical analysis algorithm, or the like as processed by a processor, such as a processor of FIG. 6 or FIG. 7. A comet shape 1202 may be indicative of good health, a torpedo shape 1204 may be indicative of poor cardiovascular health or a particular disease condition, and a fan shape 1206 may be indicative of atrial fibrillation, arrhythmia, or another cardiac issue. In some cases, the user's shape may be unknown (represented in FIG. 23 by 1208) to the identification algorithm, and in such case the system of FIG. 6 or FIG. 7 may initiate a search through a plurality of different users' data to find similar plots via a correlational algorithm running on a processor. Moreover, once similar plots are found among a group of users, commonalities between meta data and/or associated health data of the users may be identified by a processor to help diagnose the particular user of interest. For example, if it is found that a group of users sharing a common Poincaré shape with the user of interest shows a high correlation with a diagnosis of diabetes (i.e., the meta data for this group and/or processed sensor data shows this group to be generally diabetic), then a notice may be given (e.g., visually and/or audibly) to the user of interest, or to someone monitoring the user of interest, that the user may be at risk of being diabetic. - Example embodiments are described herein with reference to block diagrams and flowchart illustrations.
It is understood that a block of the block diagrams and flowchart illustrations, and combinations of blocks in the block diagrams and flowchart illustrations, can be implemented by computer program instructions that are performed by one or more computer circuits. These computer program instructions may be provided to a processor circuit of a general purpose computer circuit, special purpose computer circuit, and/or other programmable data processing circuit to produce a machine, such that the instructions, which execute via the processor of the computer and/or other programmable data processing apparatus, transform and control transistors, values stored in memory locations, and other hardware components within such circuitry to implement the functions/acts specified in the block diagrams and flowchart block or blocks, and thereby create means (functionality) and/or structure for implementing the functions/acts specified in the block diagrams and flowchart blocks.
- These computer program instructions may also be stored in a tangible computer-readable medium that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable medium produce an article of manufacture including instructions which implement the functions/acts specified in the block diagrams and flowchart blocks.
- A tangible, non-transitory computer-readable medium may include an electronic, magnetic, optical, electromagnetic, or semiconductor data storage system, apparatus, or device. More specific examples of the computer-readable medium would include the following: a portable computer diskette, a random access memory (RAM) circuit, a read-only memory (ROM) circuit, an erasable programmable read-only memory (EPROM or Flash memory) circuit, a portable compact disc read-only memory (CD-ROM), and a portable digital video disc read-only memory (DVD/Blu-ray).
- The computer program instructions may also be loaded onto a computer and/or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer and/or other programmable apparatus to produce a computer-implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions/acts specified in the block diagrams and flowchart blocks. Accordingly, embodiments of the present invention may be embodied in hardware and/or in software (including firmware, resident software, micro-code, etc.) that runs on a processor such as a digital signal processor, which may collectively be referred to as “circuitry,” “a module” or variants thereof.
- It should also be noted that in some alternate implementations, the functions/acts noted in the blocks may occur out of the order noted in the flowcharts. For example, two blocks shown in succession may in fact be executed substantially concurrently or the blocks may sometimes be executed in the reverse order, depending upon the functionality/acts involved. Moreover, the functionality of a given block of the flowcharts and block diagrams may be separated into multiple blocks and/or the functionality of two or more blocks of the flowcharts and block diagrams may be at least partially integrated. Finally, other blocks may be added/inserted between the blocks that are illustrated. Moreover, although some of the diagrams include arrows on communication paths to show a primary direction of communication, it is to be understood that communication may occur in the opposite direction to the depicted arrows.
- The foregoing is illustrative of the present invention and is not to be construed as limiting thereof. Although a few exemplary embodiments of this invention have been described, those skilled in the art will readily appreciate that many modifications are possible in the exemplary embodiments without materially departing from the teachings and advantages of this invention. Accordingly, all such modifications are intended to be included within the scope of this invention as defined in the claims. The invention is defined by the following claims, with equivalents of the claims to be included therein.
Claims (32)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US17/347,293 US20210298614A1 (en) | 2015-07-15 | 2021-06-14 | Methods of determining ventilatory threshold |
Applications Claiming Priority (5)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US201562192683P | 2015-07-15 | 2015-07-15 | |
US201662274463P | 2016-01-04 | 2016-01-04 | |
PCT/US2016/041842 WO2017011431A2 (en) | 2015-07-15 | 2016-07-12 | Methods of controlling biometric parameters via musical audio |
US201815744642A | 2018-01-12 | 2018-01-12 | |
US17/347,293 US20210298614A1 (en) | 2015-07-15 | 2021-06-14 | Methods of determining ventilatory threshold |
Related Parent Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/744,642 Continuation US11058304B2 (en) | 2015-07-15 | 2016-07-12 | Methods of controlling biometric parameters via musical audio |
PCT/US2016/041842 Continuation WO2017011431A2 (en) | 2015-07-15 | 2016-07-12 | Methods of controlling biometric parameters via musical audio |
Publications (1)
Publication Number | Publication Date |
---|---|
US20210298614A1 true US20210298614A1 (en) | 2021-09-30 |
Family
ID=57757512
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/744,642 Active 2037-09-01 US11058304B2 (en) | 2015-07-15 | 2016-07-12 | Methods of controlling biometric parameters via musical audio |
US17/347,293 Pending US20210298614A1 (en) | 2015-07-15 | 2021-06-14 | Methods of determining ventilatory threshold |
Family Applications Before (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/744,642 Active 2037-09-01 US11058304B2 (en) | 2015-07-15 | 2016-07-12 | Methods of controlling biometric parameters via musical audio |
Country Status (2)
Country | Link |
---|---|
US (2) | US11058304B2 (en) |
WO (1) | WO2017011431A2 (en) |
Families Citing this family (20)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10632278B2 (en) | 2017-07-20 | 2020-04-28 | Bose Corporation | Earphones for measuring and entraining respiration |
US10682491B2 (en) | 2017-07-20 | 2020-06-16 | Bose Corporation | Earphones for measuring and entraining respiration |
US10848848B2 (en) | 2017-07-20 | 2020-11-24 | Bose Corporation | Earphones for measuring and entraining respiration |
JP2019076692A (en) * | 2017-10-26 | 2019-05-23 | 京セラ株式会社 | Measurement device and measurement system |
US11013416B2 (en) | 2018-01-26 | 2021-05-25 | Bose Corporation | Measuring respiration with an in-ear accelerometer |
EP3648470A1 (en) * | 2018-11-05 | 2020-05-06 | GN Hearing A/S | Hearing system with heart condition alert and related methods |
US10820810B2 (en) * | 2018-11-26 | 2020-11-03 | Firstbeat Analytics, Oy | Method and a system for determining the maximum heart rate of a user of in a freely performed physical exercise |
CN109394188B (en) * | 2018-11-27 | 2022-03-08 | 中山大学 | Method, device and equipment for detecting respiratory anomaly based on heart rate variability |
US10860114B1 (en) | 2019-06-20 | 2020-12-08 | Bose Corporation | Gesture control and pulse measurement through embedded films |
JP7539634B2 (en) * | 2019-09-10 | 2024-08-26 | 学校法人産業医科大学 | DEEP BODY TEMPERATURE ESTIMATION DEVICE, DEEP BODY TEMPERATURE ESTIMATION METHOD, AND DEEP BODY TEMPERATURE ESTIMATION PROGRAM |
WO2021068000A1 (en) * | 2019-10-02 | 2021-04-08 | Breathebeatz Llc | Breathing guidance based on real-time audio analysis |
EP3878357A1 (en) * | 2020-03-10 | 2021-09-15 | Vagus Health Ltd. | Wearable electrocardiogram devices and methods for early detection, diagnostics and monitoring of infections |
US20210321648A1 (en) * | 2020-04-16 | 2021-10-21 | John Martin | Acoustic treatment of fermented food products |
US11961332B1 (en) * | 2020-06-19 | 2024-04-16 | Apple Inc. | Electronic devices with 6 minute walk distance estimates |
CN112604123A (en) * | 2020-12-16 | 2021-04-06 | 中山职业技术学院 | Monitoring system of music therapy |
US20220192605A1 (en) * | 2020-12-22 | 2022-06-23 | West Affum Holdings Corp. | Managing cardiac risk based on physiological markers |
WO2022155423A1 (en) * | 2021-01-14 | 2022-07-21 | Giordano Matt | Systems and methods of reducing stress with music |
CN115191961B (en) * | 2021-04-09 | 2024-08-23 | 广东小天才科技有限公司 | Cardiopulmonary health detection method and device, wearable equipment and storage medium |
FR3122983A1 (en) * | 2021-05-18 | 2022-11-25 | Age Impulse | Portable device allowing to characterize with precision and in a synthetic way the state of physical form of individuals in activity as well as to calculate and detect in real time and with precision their ventilatory thresholds |
CN114403892B (en) * | 2022-01-24 | 2024-04-02 | 中山大学 | Method, device and equipment for interactively adjusting respiratory feedback based on heart rate variability |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20100179438A1 (en) * | 2006-11-01 | 2010-07-15 | Biancamed Limited | System and method for monitoring cardiorespiratory parameters |
US20130096403A1 (en) * | 2011-10-13 | 2013-04-18 | University Of Houston System | Apparatus and method for improving training threshold |
US20150297133A1 (en) * | 2012-11-28 | 2015-10-22 | Iee International Electronics & Engineering S.A. | Method and system for determining a ventilatory threshold |
Family Cites Families (21)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8672852B2 (en) * | 2002-12-13 | 2014-03-18 | Intercure Ltd. | Apparatus and method for beneficial modification of biorhythmic activity |
US7314451B2 (en) * | 2005-04-25 | 2008-01-01 | Earlysense Ltd. | Techniques for prediction and monitoring of clinical episodes |
AU2006217448A1 (en) * | 2005-02-22 | 2006-08-31 | Health-Smart Limited | Methods and systems for physiological and psycho-physiological monitoring and uses thereof |
US8157730B2 (en) * | 2006-12-19 | 2012-04-17 | Valencell, Inc. | Physiological and environmental monitoring systems and methods |
US8652040B2 (en) * | 2006-12-19 | 2014-02-18 | Valencell, Inc. | Telemetric apparatus for health and environmental monitoring |
US8251903B2 (en) * | 2007-10-25 | 2012-08-28 | Valencell, Inc. | Noninvasive physiological analysis using excitation-sensor modules and related devices and methods |
US8788002B2 (en) | 2009-02-25 | 2014-07-22 | Valencell, Inc. | Light-guiding devices and monitoring devices incorporating same |
US20100217100A1 (en) * | 2009-02-25 | 2010-08-26 | Leboeuf Steven Francis | Methods and Apparatus for Measuring Physiological Conditions |
WO2010098912A2 (en) * | 2009-02-25 | 2010-09-02 | Valencell, Inc. | Light-guiding devices and monitoring devices incorporating same |
US9750462B2 (en) * | 2009-02-25 | 2017-09-05 | Valencell, Inc. | Monitoring apparatus and methods for measuring physiological and/or environmental conditions |
WO2013016007A2 (en) * | 2011-07-25 | 2013-01-31 | Valencell, Inc. | Apparatus and methods for estimating time-state physiological parameters |
WO2013019494A2 (en) * | 2011-08-02 | 2013-02-07 | Valencell, Inc. | Systems and methods for variable filter adjustment by heart rate metric feedback |
US9522317B2 (en) * | 2011-08-19 | 2016-12-20 | Pulson, Inc. | Systems and methods for coordinating musculoskeletal and cardiovascular or cerebrovascular hemodynamics |
US8961185B2 (en) * | 2011-08-19 | 2015-02-24 | Pulson, Inc. | System and method for reliably coordinating musculoskeletal and cardiovascular hemodynamics |
JP2015521064A (en) * | 2012-05-14 | 2015-07-27 | ライオンズゲイト テクノロジーズ, インコーポレイテッドLionsGate Technologies, Inc. | System, method and apparatus for identifying physiological parameters |
US9005129B2 (en) * | 2012-06-22 | 2015-04-14 | Fitbit, Inc. | Wearable heart rate monitor |
KR102025571B1 (en) * | 2012-07-27 | 2019-09-27 | 삼성전자주식회사 | Apparatus and method for measuring change in blood pressure caused by breathing control |
CN104837403A (en) * | 2012-11-27 | 2015-08-12 | 佛吉亚汽车座椅有限责任公司 | Vehicle seat with integrated sensors |
EP2967401B1 (en) * | 2013-03-15 | 2019-02-20 | Pulson, Inc. | Coordinating musculoskeletal and cardiovascular hemodynamics |
US20150335288A1 (en) * | 2013-06-06 | 2015-11-26 | Tricord Holdings, Llc | Modular physiologic monitoring systems, kits, and methods |
US20150313484A1 (en) * | 2014-01-06 | 2015-11-05 | Scanadu Incorporated | Portable device with multiple integrated sensors for vital signs scanning |
Also Published As
Publication number | Publication date |
---|---|
WO2017011431A2 (en) | 2017-01-19 |
US11058304B2 (en) | 2021-07-13 |
US20180220901A1 (en) | 2018-08-09 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US20210298614A1 (en) | Methods of determining ventilatory threshold | |
US10798471B2 (en) | Methods for improving signal quality in wearable biometric monitoring devices | |
US11638561B2 (en) | Physiological monitoring devices with adjustable signal analysis and interrogation power and monitoring methods using same | |
TWI586322B (en) | Blood pressure management device and method |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
STPP | Information on status: patent application and granting procedure in general |
Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |
|
AS | Assignment |
Owner name: WESTERN ALLIANCE BANK, CALIFORNIA Free format text: SECURITY INTEREST;ASSIGNOR:VALENCELL, INC.;REEL/FRAME:059501/0119 Effective date: 20220308 |
|
AS | Assignment |
Owner name: VALENCELL, INC., NORTH CAROLINA Free format text: RELEASE BY SECURED PARTY;ASSIGNOR:WESTERN ALLIANCE BANK;REEL/FRAME:060919/0018 Effective date: 20220725 |
|
AS | Assignment |
Owner name: YUKKA MAGIC LLC, DELAWARE Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:VALENCELL, INC.;REEL/FRAME:061501/0082 Effective date: 20220707 |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: NON FINAL ACTION MAILED |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |