WO2002032208A2 - Method for determining an acoustic environment situation, application of the method and hearing aid - Google Patents
Method for determining an acoustic environment situation, application of the method and hearing aid
- Publication number
- WO2002032208A2 WO2002032208A2 PCT/CH2002/000049 CH0200049W WO0232208A2 WO 2002032208 A2 WO2002032208 A2 WO 2002032208A2 CH 0200049 W CH0200049 W CH 0200049W WO 0232208 A2 WO0232208 A2 WO 0232208A2
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- kin
- kil
- processing
- classification
- class information
- Prior art date
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04R—LOUDSPEAKERS, MICROPHONES, GRAMOPHONE PICK-UPS OR LIKE ACOUSTIC ELECTROMECHANICAL TRANSDUCERS; DEAF-AID SETS; PUBLIC ADDRESS SYSTEMS
- H04R25/00—Deaf-aid sets, i.e. electro-acoustic or electro-mechanical hearing aids; Electric tinnitus maskers providing an auditory perception
- H04R25/50—Customised settings for obtaining desired overall acoustical characteristics
- H04R25/505—Customised settings for obtaining desired overall acoustical characteristics using digital signal processing
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04R—LOUDSPEAKERS, MICROPHONES, GRAMOPHONE PICK-UPS OR LIKE ACOUSTIC ELECTROMECHANICAL TRANSDUCERS; DEAF-AID SETS; PUBLIC ADDRESS SYSTEMS
- H04R2225/00—Details of deaf aids covered by H04R25/00, not provided for in any of its subgroups
- H04R2225/41—Detection or adaptation of hearing aid parameters or programs to listening situation, e.g. pub, forest
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04R—LOUDSPEAKERS, MICROPHONES, GRAMOPHONE PICK-UPS OR LIKE ACOUSTIC ELECTROMECHANICAL TRANSDUCERS; DEAF-AID SETS; PUBLIC ADDRESS SYSTEMS
- H04R25/00—Deaf-aid sets, i.e. electro-acoustic or electro-mechanical hearing aids; Electric tinnitus maskers providing an auditory perception
- H04R25/40—Arrangements for obtaining a desired directivity characteristic
- H04R25/407—Circuits for combining signals of a plurality of transducers
Definitions
- The present invention relates to a method for determining an acoustic environment situation, an application of the method, a device for determining the acoustic environment situation, and a hearing aid.
- The hearing program can be selected either using a remote control or using a switch on the hearing aid itself. Switching between different hearing programs is, however, troublesome, if not impossible, for many users. Even for experienced hearing aid users it is not always easy to determine which program offers the best comfort and the best speech intelligibility at any given time. Automatic detection of the acoustic environment and the associated automatic switching of the hearing program in the hearing aid is therefore desirable.
- Various methods for the automatic classification of acoustic environment situations are currently known. In all of these methods, the input signal, which can come from one or more microphones of the hearing aid, is analyzed in different processing steps, after which a pattern recognizer uses an algorithm to decide whether the analyzed input signal belongs to a certain acoustic environment.
- The known methods differ, on the one hand, in the features used to describe the acoustic environment (signal analysis) and, on the other hand, in the pattern recognizer used to classify those features (signal identification).
- A method for determining the acoustic environment situation is known, for example, from WO 01/20 965. It is a single-stage processing of an acoustic input signal in a feature extraction unit and a classification unit connected downstream thereof, in which the extracted features are classified to generate class information.
- Good results are obtained, particularly when using auditory-based features.
- An improvement is desirable in particular in the refinement of the class subdivision: the noise class "noise", for example, contains a wide variety of sounds, such as background conversations, train station noise, or hair dryers, and the class "music" includes pop music, classical music, individual instruments, vocals, etc.
- The present invention is therefore based on the object of specifying a method for determining an acoustic environment situation which is more robust and more precise than the known methods. This object is achieved by the measures specified in claim 1.
- Advantageous embodiments of the invention, an application of the method, a device and a hearing aid are specified in further claims.
- By processing an acoustic input signal in a multi-stage process consisting of at least two classification stages, each stage preferably comprising an extraction phase and an identification phase, an extremely robust and accurate classification of the current acoustic environment is obtained.
- The method according to the invention can successfully avoid incorrect classification of pop music into the "speech in noise" category.
- The method according to the invention also enables a general noise class, for example noise, to be subdivided into subclasses, such as traffic noise or background conversation noise. Special situations, such as those that occur in the interior of an automobile ("in-the-car noise"), can also be recognized. In general, room properties can be identified and taken into account accordingly in the further processing of important signal components. It has been shown that the method according to the invention also makes it possible to localize noise sources, which in turn makes it possible to detect the presence of a specific noise source among several other noise sources.
- FIG. 1 shows a known single-stage device for determining the acoustic environment situation,
- FIG. 2 shows a first embodiment of a device according to the invention with two processing stages
- FIG. 3 shows a second, general embodiment of a multi-stage device according to the invention
- FIG. 4 shows a third, general embodiment of a multi-stage device according to the invention
- FIG. 5 shows a fourth, general embodiment of a multi-stage device according to the invention
- FIG. 6 shows a simplified embodiment compared to the two-stage embodiment according to FIG. 2 and
- FIG. 7 shows a hearing aid with a multi-stage device according to the invention according to FIGS. 2 to 6.
- FIG. 1 shows a known single-stage device for determining the acoustic environment situation, the device consisting of a series connection of a feature extraction unit F, a classification unit C and a post-processing unit P.
- An acoustic input signal IN, which was recorded with a microphone, for example, is applied to the feature extraction unit F, in which characteristic features are extracted.
- The use of an amplitude histogram has also been proposed to achieve the same goal. Finally, feature extraction by analyzing different modulation frequencies has also been examined and applied.
- The features M extracted in the feature extraction unit F are applied to the classification unit C, in which basically any of the known pattern identification methods for noise classification can be used. So-called distance estimators, Bayes classifiers, fuzzy logic systems, or neural networks are particularly suitable.
- As a result of these processing steps, class information KI is obtained, which may be fed to a post-processing unit P for any cleanup of the class membership. Subsequently, adjusted class information KI' is obtained.
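By way of illustration, the single-stage chain of FIG. 1 (feature extraction F, classification C, post-processing P) can be sketched as follows; the features, thresholds and class names here are invented placeholders, not the auditory-based features or the pattern recognizer of the known method:

```python
import numpy as np

def extract_features(signal):
    """F: extract a few simple descriptors (invented placeholders)."""
    s = np.asarray(signal, dtype=float)
    return {
        "rms": float(np.sqrt(np.mean(s ** 2))),
        # fraction of sample-to-sample sign changes, a crude noisiness cue
        "zero_cross": float(np.mean(np.abs(np.diff(np.sign(s))) > 0)),
    }

def classify(features):
    """C: toy threshold classifier standing in for a pattern recognizer."""
    if features["zero_cross"] > 0.3:
        return "noise"
    return "speech" if features["rms"] > 0.1 else "quiet"

def post_process(history, new_class):
    """P: clean raw class information KI into KI' by majority vote over
    the last few frames, smoothing spurious class flips."""
    history.append(new_class)
    recent = history[-5:]
    return max(set(recent), key=recent.count)
```

A frame-by-frame loop would call these three functions in series, mirroring the chain F, C, P of FIG. 1.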
- FIG. 2 shows a first embodiment variant of a device according to the invention. It is a device with two processing stages S1 and S2, each processing stage S1, S2 containing a feature extraction unit F1 or F2 and a classification unit C1 or C2.
- The original input signal IN is supplied to both processing stages S1 and S2, namely to both the feature extraction unit F1 and the feature extraction unit F2, each of which is operatively connected in sequence to the corresponding classification unit C1 or C2.
- The class information KI1, which is obtained on the basis of calculations in the classification unit C1 of the first processing stage S1, influences the classification unit C2 of the second processing stage S2, in such a way that, for example, one of several possible pattern identification methods is selected and used for the noise classification in the classification unit C2 of the second processing stage S2.
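The coupling between the two stages can be sketched as follows; the method table and its entries are invented for illustration and are not the patent's concrete classifiers:

```python
# KI1, the coarse class from stage S1, selects the pattern-identification
# method that the classification unit C2 of stage S2 applies.
def stage2_classify(features2, ki1, method_table):
    classifier = method_table[ki1]   # class information KI1 picks the method
    return classifier(features2)

# Invented example: a different (trivial) second-stage classifier per
# coarse class; the feature names are assumptions.
method_table = {
    "music": lambda f: "classical" if f["tempo_bpm"] < 100 else "pop",
    "noise": lambda f: "traffic noise" if f["low_freq_ratio"] > 0.5 else "babble",
}
```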
- The feature extraction unit F1 provides the features tonality, spectral center of gravity (CGAV), and fluctuation of the spectral center of gravity (CGFS).
- These features are classified in the classification unit C1, in which an HMM (Hidden Markov Model) classifier is used; with the aid of the HMM classification, the input signal IN is assigned to one of the following classes: "speech", "speech in noise", "noise", or "music". The result is the class information KI1.
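As an illustration of such an HMM-based coarse classification, the following sketch scores a one-dimensional feature track (e.g. tonality) against one small Gaussian-emission HMM per class and picks the most likely class; all model parameters are invented, and a real system would use the multi-dimensional feature set described above:

```python
import numpy as np

def _logsumexp(x, axis):
    m = np.max(x, axis=axis, keepdims=True)
    return np.squeeze(m, axis=axis) + np.log(np.sum(np.exp(x - m), axis=axis))

def forward_loglik(obs, pi, A, means, var):
    """log P(obs | model) for a Gaussian-emission HMM (forward algorithm)."""
    obs = np.asarray(obs, dtype=float)[:, None]                          # T x 1
    log_b = -0.5 * (np.log(2 * np.pi * var) + (obs - means) ** 2 / var)  # T x N
    log_alpha = np.log(pi) + log_b[0]
    for t in range(1, len(obs)):
        # alpha_t(j) = b_t(j) * sum_i alpha_{t-1}(i) * A[i, j]
        log_alpha = log_b[t] + _logsumexp(log_alpha[:, None] + np.log(A), axis=0)
    return float(_logsumexp(log_alpha, axis=0))

def classify_hmm(obs, models):
    """Assign obs to the class whose HMM yields the highest likelihood."""
    return max(models, key=lambda name: forward_loglik(obs, *models[name]))

# Invented two-class example (each model: start probabilities, transition
# matrix, state means, shared variance); a real system would train one
# HMM per class on labelled recordings.
models = {
    "music": (np.array([0.5, 0.5]), np.array([[0.9, 0.1], [0.1, 0.9]]),
              np.array([0.8, 0.9]), 0.01),
    "noise": (np.array([0.5, 0.5]), np.array([[0.9, 0.1], [0.1, 0.9]]),
              np.array([0.1, 0.2]), 0.01),
}
```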
- The result of the first processing stage S1 is applied to the classification unit C2 of the second processing stage S2, in which a second feature set is extracted using the feature extraction unit F2.
- The additional feature variance of the harmonic structure (pitch), also referred to below as Pitchvar, is extracted.
- The result of the first processing stage S1 is checked and, if necessary, corrected using a rule-based classifier in the classification unit C2.
- The rule-based classifier contains only a few simple heuristic decisions, which are based on the four features and on the following considerations:
- The tonality feature is used for correction in each class if the feature values lie completely outside an admissible value range of the class information KI1, which was generated in the first classification unit C1, i.e. by the HMM classifier.
- The tonality for "music" is high, for "speech" in the middle range, for "speech in noise" somewhat lower, and for "noise" low. For example, if an input signal IN is assigned to the "speech" class by the classification unit C1, it is expected that the corresponding features, which were determined in the feature extraction unit F1, indicated to the classification unit C1 that the relevant signal component in the input signal IN fluctuates strongly. If, on the other hand, the tonality for this input signal IN is very low, it is most likely not "speech" but "speech in noise". Similar considerations can be made for the other three features, namely the variance of the harmonic structure (Pitchvar), CGAV, and CGFS.
- The rules for the rule-based classifier, which are used in the classification unit C2, can be formulated as follows:
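The concrete rules are not reproduced in this extract. A hypothetical corrector in the spirit of the considerations above, with invented value ranges, might look like this:

```python
# Hypothetical admissible tonality ranges per class; the patent's actual
# numeric rules are not given in this text, so these values are invented.
TONALITY_RANGE = {
    "music": (0.7, 1.0),
    "speech": (0.4, 0.7),
    "speech in noise": (0.25, 0.5),
    "noise": (0.0, 0.3),
}

def correct_class(ki1, tonality):
    """Check the first-stage HMM decision KI1 against the tonality feature.

    If the value lies completely outside the admissible range of the
    proposed class, reassign to the class whose range fits best."""
    lo, hi = TONALITY_RANGE[ki1]
    if lo <= tonality <= hi:
        return ki1                                   # first stage confirmed
    # otherwise pick the class whose range midpoint is closest
    return min(TONALITY_RANGE,
               key=lambda c: abs(tonality - sum(TONALITY_RANGE[c]) / 2))
```

With these invented ranges, a "speech" decision with very low tonality is corrected to "speech in noise", matching the consideration described above.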
- A further embodiment is shown in a general representation in FIG. 3. It is a processing method with n stages.
- Each of the processing stages S1 to Sn has a feature extraction unit F1 to Fn and a classification unit C1 to Cn connected downstream thereof for generating the respective class information KI1 to KIn. If necessary, a post-processing unit P1 to Pn is provided in each or in individual processing stages S1 to Sn.
- The embodiment variant shown in FIG. 3 is particularly suitable for a so-called coarse-fine classification.
- A result obtained in processing stage i is refined in the subsequent processing stage i + 1.
- A rough classification is thus carried out in a higher processing stage, with a fine classification based on more specific feature extractions and/or classification methods being carried out in a lower processing stage on the basis of the rough classification.
- This process can also be viewed as the generation of hypotheses in a higher processing stage, which are checked, i.e. confirmed or rejected, in a lower processing stage.
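The hypothesis view can be sketched as a small generate-and-verify cascade; the propose/verify/fallback functions below are invented stand-ins for the higher and lower processing stages:

```python
def cascade(features, propose, verify, fallback):
    hypothesis = propose(features)           # higher stage: coarse guess
    if verify(features, hypothesis):         # lower stage: specific check
        return hypothesis                    # hypothesis confirmed
    return fallback(features, hypothesis)    # hypothesis rejected: refine

# Invented example echoing the "speech" vs. "speech in noise" discussion:
propose = lambda f: "speech"
verify = lambda f, h: f["tonality"] > 0.4
fallback = lambda f, h: "speech in noise"
```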
- Hypotheses that have been created in a higher processing stage can also be combined with other information, in particular with input from manual means such as remote controls or switches.
- This is indicated, representatively for the first processing stage S1, by a manipulated variable ST, by means of which, for example, the calculations in the classification unit C1 can be overridden.
- The manipulated variable ST can also be fed to a classification unit C2 to Cn or to a post-processing unit P1 to Pn of another processing stage S1 to Sn.
- Each processing stage S1 to Sn can, although this is not mandatory, be assigned a task, such as, for example: a rough classification, a fine classification, the localization of a noise source, a check whether a specific noise source is present, e.g. automotive noise in a vehicle, or an extraction of certain signal components from an input signal, e.g. the elimination of echo taking into account spatial characteristics.
- The individual processing stages S1 to Sn are therefore individual in the sense that different features are extracted and different classification methods are used in them.
- For example, a localization of the sound source can be provided in a first processing stage S1. The localization of the sound source carried out in the first processing stage can be followed by directional filtering, for example using multi-microphone technology.
- A feature extraction unit F1 to Fn can be shared among several classification units C1 to Cn, i.e. the results of one of the feature extraction units F1 to Fn can be used by several classification units C1 to Cn. It is also conceivable that a classification unit C1 to Cn is used in several processing stages S1 to Sn. Finally, it can be provided that the class information KI1 to KIn or the cleaned class information KI1' to KIn' obtained in the different processing stages S1 to Sn is weighted differently in order to obtain the final classification.
- FIG. 4 shows a further embodiment of the invention, in which a plurality of processing stages S1 to Sn is used. In contrast to the embodiment according to FIG. 3, the class information KI1 to KIn is used not only in the immediately following processing stage, but possibly in all subordinate processing stages.
- The results from previous processing stages S1 to Sn can also affect the subsequent feature extraction units F1 to Fn, i.e. the features to be extracted.
- Post-processing units P1 to Pn prepare the intermediate results of the classification and make them available as cleaned class information KI1' to KIn'.
- FIG. 5 shows a further embodiment variant of a multi-stage device for determining the acoustic environment situation, again in a general form.
- As in FIGS. 3 and 4, several processing stages S1 to Sn with feature extraction units F1 to Fn and classification units C1 to Cn are shown.
- The class information KI1 to KIn obtained in each processing stage S1 to Sn is supplied to a decision unit FD, in which the final classification is carried out by generating the class information KI.
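One plausible reading of the decision unit FD, sketched under the assumption of weighted voting over the per-stage class information (the weights and class labels are invented):

```python
from collections import defaultdict

def final_decision(stage_classes, stage_weights):
    """FD: combine the per-stage class information KI1..KIn by weighted
    voting into the final class information KI (weighting is an assumed
    illustration, not the patent's prescribed combination rule)."""
    score = defaultdict(float)
    for label, weight in zip(stage_classes, stage_weights):
        score[label] += weight   # e.g. finer, later stages may weigh more
    return max(score, key=score.get)
```

This matches the option mentioned above of weighting the class information KI1 to KIn differently in order to obtain the final classification.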
- In the decision unit FD, it is optionally provided to generate feedback signals which are fed back from the decision unit FD to the individual processing stages S1 to Sn.
- The information is exchanged via a wired or a wireless transmission link.
- The first processing stage S1 consists of the feature extraction unit F1 and the classification unit C1. In the second processing stage S2, the same features are used that were already used in the first processing stage S1. A recalculation of the features in processing stage S2 is therefore unnecessary, and the results of the feature extraction unit F1 of the first processing stage S1 can be used in the second processing stage S2. In the second processing stage S2, only the classification method is changed, depending on the class information KI1 of the first processing stage S1.
- FIG. 7 shows the use of the device according to the invention in a hearing aid, which essentially comprises a transmission unit 200.
- Reference numeral 100 denotes a multi-stage processing unit, which is implemented according to one of the embodiment variants shown in FIGS. 2 to 6.
- The input signal IN is applied to both the multi-stage processing unit 100 and the transmission unit 200, in which the acoustic input signal IN is processed with the aid of the class information KI1 to KIn or KI1' to KIn' generated in the multi-stage processing unit 100. It is preferably provided to select a suitable hearing program based on the determined acoustic environment situation, as has been described in the introduction and in the international patent application WO 01/20 965.
- Reference numeral 300 denotes a manual input unit, with the aid of which it is possible, for example via a radio link, as shown schematically in FIG. 7, to act both on the multi-stage processing unit 100 in the manner already explained and on the transmission unit 200.
- With regard to further details of the hearing aid, reference is made to the explanations in WO 01/20 965.
- The preferred application of the method according to the invention for determining the acoustic environment situation is the selection of a hearing program in a hearing aid.
Landscapes
- Health & Medical Sciences (AREA)
- General Health & Medical Sciences (AREA)
- Neurosurgery (AREA)
- Otolaryngology (AREA)
- Physics & Mathematics (AREA)
- Engineering & Computer Science (AREA)
- Acoustics & Sound (AREA)
- Signal Processing (AREA)
- Circuit For Audible Band Transducer (AREA)
- Information Retrieval, Db Structures And Fs Structures Therefor (AREA)
Priority Applications (6)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
AU2472202A AU2472202A (en) | 2002-01-28 | 2002-01-28 | Method for determining an acoustic environment situation, application of the method and hearing aid |
CA2439427A CA2439427C (en) | 2002-01-28 | 2002-01-28 | Method for determining an acoustic environment situation, application of the method and hearing aid |
JP2002535462A JP3987429B2 (en) | 2002-01-28 | 2002-01-28 | Method and apparatus for determining acoustic environmental conditions, use of the method, and listening device |
EP02706499.7A EP1470735B1 (en) | 2002-01-28 | 2002-01-28 | Method for determining an acoustic environment situation, application of the method and hearing aid |
AU2002224722A AU2002224722B2 (en) | 2002-01-28 | 2002-01-28 | Method for determining an acoustic environment situation, application of the method and hearing aid |
PCT/CH2002/000049 WO2002032208A2 (en) | 2002-01-28 | 2002-01-28 | Method for determining an acoustic environment situation, application of the method and hearing aid |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/CH2002/000049 WO2002032208A2 (en) | 2002-01-28 | 2002-01-28 | Method for determining an acoustic environment situation, application of the method and hearing aid |
Publications (2)
Publication Number | Publication Date |
---|---|
WO2002032208A2 true WO2002032208A2 (en) | 2002-04-25 |
WO2002032208A3 WO2002032208A3 (en) | 2002-12-05 |
Family
ID=4358282
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/CH2002/000049 WO2002032208A2 (en) | 2002-01-28 | 2002-01-28 | Method for determining an acoustic environment situation, application of the method and hearing aid |
Country Status (5)
Country | Link |
---|---|
EP (1) | EP1470735B1 (en) |
JP (1) | JP3987429B2 (en) |
AU (2) | AU2002224722B2 (en) |
CA (1) | CA2439427C (en) |
WO (1) | WO2002032208A2 (en) |
Families Citing this family (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7738666B2 (en) | 2006-06-01 | 2010-06-15 | Phonak Ag | Method for adjusting a system for providing hearing assistance to a user |
JP2012083746A (en) * | 2010-09-17 | 2012-04-26 | Kinki Univ | Sound processing device |
-
2002
- 2002-01-28 CA CA2439427A patent/CA2439427C/en not_active Expired - Lifetime
- 2002-01-28 EP EP02706499.7A patent/EP1470735B1/en not_active Expired - Lifetime
- 2002-01-28 WO PCT/CH2002/000049 patent/WO2002032208A2/en active Application Filing
- 2002-01-28 AU AU2002224722A patent/AU2002224722B2/en not_active Ceased
- 2002-01-28 JP JP2002535462A patent/JP3987429B2/en not_active Expired - Fee Related
- 2002-01-28 AU AU2472202A patent/AU2472202A/en active Pending
Patent Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2001076321A1 (en) * | 2000-04-04 | 2001-10-11 | Gn Resound A/S | A hearing prosthesis with automatic classification of the listening environment |
WO2001020965A2 (en) * | 2001-01-05 | 2001-03-29 | Phonak Ag | Method for determining a current acoustic environment, use of said method and a hearing-aid |
Cited By (38)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7889879B2 (en) | 2002-05-21 | 2011-02-15 | Cochlear Limited | Programmable auditory prosthesis with trainable automatic adaptation to acoustic conditions |
EP1532841A4 (en) * | 2002-05-21 | 2008-12-24 | Hearworks Pty Ltd | Programmable auditory prosthesis with trainable automatic adaptation to acoustic conditions |
EP1532841A1 (en) * | 2002-05-21 | 2005-05-25 | Hearworks Pty Ltd. | Programmable auditory prosthesis with trainable automatic adaptation to acoustic conditions |
US8532317B2 (en) | 2002-05-21 | 2013-09-10 | Hearworks Pty Limited | Programmable auditory prosthesis with trainable automatic adaptation to acoustic conditions |
EP2249586A3 (en) * | 2003-03-03 | 2012-06-20 | Phonak AG | Method for manufacturing acoustical devices and for reducing wind disturbances |
US8027495B2 (en) | 2003-03-07 | 2011-09-27 | Phonak Ag | Binaural hearing device and method for controlling a hearing device system |
US8111848B2 (en) | 2003-03-07 | 2012-02-07 | Phonak Ag | Hearing aid with acoustical signal direction of arrival control |
US7286672B2 (en) | 2003-03-07 | 2007-10-23 | Phonak Ag | Binaural hearing device and method for controlling a hearing device system |
EP1320281A2 (en) | 2003-03-07 | 2003-06-18 | Phonak Ag | Binaural hearing device and method for controlling a such a hearing device |
EP1326478A2 (en) | 2003-03-07 | 2003-07-09 | Phonak Ag | Method for producing control signals, method of controlling signal transfer and a hearing device |
US7773763B2 (en) | 2003-06-24 | 2010-08-10 | Gn Resound A/S | Binaural hearing aid system with coordinated sound processing |
CN108882136B (en) * | 2003-06-24 | 2020-05-15 | Gn瑞声达A/S | Binaural hearing aid system with coordinated sound processing |
WO2004114722A1 (en) * | 2003-06-24 | 2004-12-29 | Gn Resound A/S | A binaural hearing aid system with coordinated sound processing |
CN108882136A (en) * | 2003-06-24 | 2018-11-23 | Gn瑞声达A/S | With the binaural hearing aid system for coordinating acoustic processing |
US6912289B2 (en) | 2003-10-09 | 2005-06-28 | Unitron Hearing Ltd. | Hearing aid and processes for adaptively processing signals therein |
EP1538874A3 (en) * | 2003-12-01 | 2010-01-20 | Siemens Audiologische Technik GmbH | Hearing device with direction dependent signal processing and corresponding method |
EP1538874A2 (en) * | 2003-12-01 | 2005-06-08 | Siemens Audiologische Technik GmbH | Hearing device with direction dependent signal processing and corresponding method |
EP1691574A2 (en) | 2005-02-11 | 2006-08-16 | Phonak Communications Ag | Method and system for providing hearing assistance to a user |
EP1819195A2 (en) | 2006-02-13 | 2007-08-15 | Phonak Communications Ag | Method and system for providing hearing assistance to a user |
US9264822B2 (en) | 2006-03-14 | 2016-02-16 | Starkey Laboratories, Inc. | System for automatic reception enhancement of hearing assistance devices |
US7986790B2 (en) | 2006-03-14 | 2011-07-26 | Starkey Laboratories, Inc. | System for evaluating hearing assistance device settings using detected sound environment |
US8068627B2 (en) | 2006-03-14 | 2011-11-29 | Starkey Laboratories, Inc. | System for automatic reception enhancement of hearing assistance devices |
US8494193B2 (en) | 2006-03-14 | 2013-07-23 | Starkey Laboratories, Inc. | Environment detection and adaptation in hearing assistance devices |
EP1858292A1 (en) | 2006-05-16 | 2007-11-21 | Phonak AG | Hearing device and method of operating a hearing device |
US7957548B2 (en) | 2006-05-16 | 2011-06-07 | Phonak Ag | Hearing device with transfer function adjusted according to predetermined acoustic environments |
US8249284B2 (en) | 2006-05-16 | 2012-08-21 | Phonak Ag | Hearing system and method for deriving information on an acoustic scene |
WO2007131815A1 (en) * | 2006-05-16 | 2007-11-22 | Phonak Ag | Hearing device and method for operating a hearing device |
US8605923B2 (en) | 2007-06-20 | 2013-12-10 | Cochlear Limited | Optimizing operational control of a hearing prosthesis |
EP2192794A1 (en) | 2008-11-26 | 2010-06-02 | Oticon A/S | Improvements in hearing aid algorithms |
US8873780B2 (en) | 2010-05-12 | 2014-10-28 | Phonak Ag | Hearing system and method for operating the same |
WO2012010218A1 (en) | 2010-07-23 | 2012-01-26 | Phonak Ag | Hearing system and method for operating a hearing system |
US9167359B2 (en) | 2010-07-23 | 2015-10-20 | Sonova Ag | Hearing system and method for operating a hearing system |
US9131318B2 (en) | 2010-09-15 | 2015-09-08 | Phonak Ag | Method and system for providing hearing assistance to a user |
WO2010133703A2 (en) | 2010-09-15 | 2010-11-25 | Phonak Ag | Method and system for providing hearing assistance to a user |
WO2013170885A1 (en) * | 2012-05-15 | 2013-11-21 | Phonak Ag | Method for operating a hearing device as well as a hearing device |
US9584930B2 (en) | 2012-12-21 | 2017-02-28 | Starkey Laboratories, Inc. | Sound environment classification by coordinated sensing using hearing assistance devices |
US8958586B2 (en) | 2012-12-21 | 2015-02-17 | Starkey Laboratories, Inc. | Sound environment classification by coordinated sensing using hearing assistance devices |
CN112954569A (en) * | 2021-02-20 | 2021-06-11 | 深圳市智听科技有限公司 | Multi-core hearing aid chip, hearing aid method and hearing aid |
Also Published As
Publication number | Publication date |
---|---|
JP2005504325A (en) | 2005-02-10 |
WO2002032208A3 (en) | 2002-12-05 |
JP3987429B2 (en) | 2007-10-10 |
EP1470735B1 (en) | 2019-08-21 |
CA2439427A1 (en) | 2002-04-25 |
AU2472202A (en) | 2002-04-29 |
EP1470735A2 (en) | 2004-10-27 |
CA2439427C (en) | 2011-03-29 |
AU2002224722B2 (en) | 2008-04-03 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
EP1470735B1 (en) | Method for determining an acoustic environment situation, application of the method and hearing aid | |
DE60120949T2 (en) | A HEARING PROSTHESIS WITH AUTOMATIC HEARING CLASSIFICATION | |
DE69432943T2 (en) | Method and device for speech detection | |
WO2001020965A2 (en) | Method for determining a current acoustic environment, use of said method and a hearing-aid | |
EP1247425B1 (en) | Method for operating a hearing-aid and a hearing aid | |
EP1647972A2 (en) | Intelligibility enhancement of audio signals containing speech | |
EP2405673B1 (en) | Method for localising an audio source and multi-channel audio system | |
EP3386215B1 (en) | Hearing aid and method for operating a hearing aid | |
DE112016007138T5 (en) | DEVICE AND METHOD FOR MONITORING A WEARING STATE OF AN EARPHONE | |
EP2200341B1 (en) | Method for operating a hearing aid and hearing aid with a source separation device | |
DE69511602T2 (en) | Signal source characterization system | |
DE102015221764A1 (en) | Method for adjusting microphone sensitivities | |
DE19948907A1 (en) | Signal processing in hearing aid | |
DE102006001730A1 (en) | Sound system, method for improving the voice quality and / or intelligibility of voice announcements and computer program | |
DE10114101A1 (en) | Processing input signal in signal processing unit for hearing aid, involves analyzing input signal and adapting signal processing unit setting parameters depending on signal analysis results | |
EP2792165B1 (en) | Adaptation of a classification of an audio signal in a hearing aid | |
EP1303166B1 (en) | Method of operating a hearing aid and assembly with a hearing aid | |
EP3403260B1 (en) | Method and apparatus for conditioning an audio signal subjected to lossy compression | |
EP1649719B1 (en) | Device and method for operating voice-assisted systems in motor vehicles | |
EP1348315B1 (en) | Method for use of a hearing-aid and corresponding hearing aid | |
EP0540535B1 (en) | Process for speaker adaptation in an automatic speech-recognition system | |
DE19813512A1 (en) | Hearing aid with noise signal suppression | |
DE10137685C1 (en) | Speech signal detection method for hearing aid provides evaluation index from correlation between instant amplitude signal and instant frequency signal | |
EP1366617B1 (en) | Method and device for improving voice quality on transparent telecommunication-transmission paths | |
DE10137395C1 (en) | Noise interference suppression method in hearing aid, involves performing non-linear modification of instant amplitude signal to obtain modified signal for linking with instant phase signal to output voice signal |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AK | Designated states |
Kind code of ref document: A2 Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NO NZ OM PH PL PT RO RU SD SE SG SI SK SL TJ TM TN TR TT TZ UA UG US UZ VN YU ZA ZM ZW |
|
AL | Designated countries for regional patents |
Kind code of ref document: A2 Designated state(s): GH GM KE LS MW MZ SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE CH CY DE DK ES FI FR GB GR IE IT LU MC NL PT SE TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application | ||
AK | Designated states |
Kind code of ref document: A3 Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NO NZ OM PH PL PT RO RU SD SE SG SI SK SL TJ TM TN TR TT TZ UA UG US UZ VN YU ZA ZM ZW |
|
AL | Designated countries for regional patents |
Kind code of ref document: A3 Designated state(s): GH GM KE LS MW MZ SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE CH CY DE DK ES FI FR GB GR IE IT LU MC NL PT SE TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2002706499 Country of ref document: EP |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2002224722 Country of ref document: AU |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2439427 Country of ref document: CA |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2002535462 Country of ref document: JP |
|
WWP | Wipo information: published in national office |
Ref document number: 2002706499 Country of ref document: EP |
|
REG | Reference to national code |
Ref country code: DE Ref legal event code: 8642 |