CN104317388B - A kind of exchange method and wearable electronic equipment - Google Patents


Info

Publication number
CN104317388B
CN104317388B CN201410470040.2A
Authority
CN
China
Prior art keywords
user
unit
sensing
wearable electronic
mouth
Prior art date
Legal status
Active
Application number
CN201410470040.2A
Other languages
Chinese (zh)
Other versions
CN104317388A (en)
Inventor
许奔
Current Assignee
Lenovo Beijing Ltd
Original Assignee
Lenovo Beijing Ltd
Priority date
Filing date
Publication date
Application filed by Lenovo Beijing Ltd
Priority to CN201410470040.2A
Publication of CN104317388A
Application granted
Publication of CN104317388B
Legal status: Active


Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/011: Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F 3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0484: Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F 3/04847: Interaction techniques to control parameter settings, e.g. interaction with sliders or dials

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The invention discloses an interaction method and a wearable electronic device. The wearable electronic device has a sensing unit and is able to maintain a relative positional relationship with the user's ear. When the wearable electronic device maintains a first relative positional relationship with the user's ear, the interaction method includes: detecting a mouth action of the user through the sensing unit to obtain a sensing parameter; judging whether the sensing parameter meets a first preset condition to obtain a judgment result; and, when the judgment result shows that the sensing parameter meets the first preset condition, determining and executing a first operation instruction.

Description

Interaction method and wearable electronic equipment
Technical Field
The present invention relates to interaction technologies, and in particular, to an interaction method and a wearable electronic device.
Background
With the development of wearable electronic devices, they have come into wide use. In scenarios where both hands are occupied, for example when the user is typing on a keyboard, the user may want the wearable electronic device to perform operations on an electronic device through the action of the user's teeth; however, there is currently no effective solution for achieving this.
Disclosure of Invention
In order to solve the technical problem, embodiments of the present invention provide an interaction method and a wearable electronic device.
The interaction method provided by the embodiment of the invention is applied to wearable electronic equipment, and the wearable electronic equipment is provided with a sensing unit; the wearable electronic equipment can maintain the relative position relation with the ear of a user; when the wearable electronic device maintains a first relative positional relationship with an ear of a user, the interaction method comprises:
detecting mouth movements of the user through the sensing unit to obtain sensing parameters;
judging whether the sensing parameters meet a first preset condition or not to obtain a judgment result;
and when the judgment result shows that the sensing parameter meets the first preset condition, determining and executing the first operation instruction.
The wearable electronic equipment provided by the embodiment of the invention comprises a sensing unit; the wearable electronic device is capable of maintaining a first relative positional relationship with an ear of a user; the wearable electronic device further includes:
the control unit is used for controlling the sensing unit to detect the mouth movement of the user to obtain sensing parameters;
the judging unit is used for judging whether the sensing parameters meet a first preset condition or not to obtain a judging result;
and the processing unit is used for determining and executing the first operation instruction when the judgment result shows that the sensing parameter meets the first preset condition.
In the technical solution of the embodiments of the present invention, the wearable electronic device can maintain a first relative positional relationship with the user's ear; specifically, the wearable electronic device, such as a smart headset, can be worn on the user's ear. The wearable electronic device has a sensing unit that detects the user's mouth movement to obtain a sensing parameter; when the sensing parameter meets a first preset condition, a first operation instruction corresponding to the sensing parameter is determined and executed. Here, a matching rule between sensing parameters and operation instructions can be set in advance, and the operation instruction corresponding to a given sensing parameter is determined according to that rule. In the embodiments of the present invention, the electronic device is operated through the user's tooth movements, realizing mouth-movement operation; this operation mode is simple and convenient, frees the user's hands, makes interaction more engaging, and improves the user experience.
Drawings
Fig. 1 is a schematic flowchart of an interaction method according to a first embodiment of the present invention;
FIG. 2 is a flowchart illustrating an interaction method according to a second embodiment of the present invention;
FIG. 3 is a flowchart illustrating an interaction method according to a third embodiment of the present invention;
FIG. 4 is a flowchart illustrating an interaction method according to a fourth embodiment of the present invention;
FIG. 5 is a flowchart illustrating an interaction method according to a fifth embodiment of the present invention;
fig. 6 is a flowchart illustrating an interaction method according to a sixth embodiment of the present invention;
fig. 7 is a schematic structural component diagram of a wearable electronic device according to a first embodiment of the invention;
fig. 8 is a schematic structural composition diagram of a wearable electronic device according to a second embodiment of the present invention;
fig. 9 is a schematic structural composition diagram of a wearable electronic device according to a third embodiment of the present invention;
fig. 10 is a schematic structural composition diagram of a wearable electronic device according to a fourth embodiment of the present invention;
fig. 11 is a schematic structural composition diagram of a wearable electronic device according to a fifth embodiment of the present invention;
fig. 12 is a schematic structural composition diagram of a wearable electronic device according to a sixth embodiment of the present invention.
Detailed Description
So that the features and aspects of the embodiments of the present invention can be understood in detail, a more particular description of the embodiments, briefly summarized above, is given below with reference to the embodiments, some of which are illustrated in the appended drawings.
Fig. 1 is a schematic flowchart of an interaction method according to a first embodiment of the present invention, where the interaction method is applied to a wearable electronic device, and the wearable electronic device has a sensing unit; the wearable electronic equipment can maintain the relative position relation with the ear of a user; when the wearable electronic equipment maintains a first relative position relation with the ear of a user, the interaction method comprises the following steps:
step 101: and detecting the mouth action of the user through the sensing unit to obtain a sensing parameter.
In the embodiment of the present invention, the wearable electronic device is specifically an intelligent headset, and the intelligent headset can be worn on the ear of the user, that is, maintain a first relative position relationship with the ear of the user. The smart headset has a speaker and a sensing unit, and the sensing unit can detect the mouth movement of a user and obtain a sensing parameter.
In an embodiment of the present invention, the sensing unit may be a bone conduction unit. Detecting the user's mouth movement with the bone conduction unit works as follows: when the user bites his or her teeth together, a tapping sound is produced; different sounds correspond to mechanical vibrations of different frequencies, and mechanical vibrations of specific frequencies are transmitted from the user's mouth through the cheekbones to the user's ear. The bone conduction unit located at the ear can therefore detect the biting action of the user's teeth and obtain the mechanical vibration data generated by that action.
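The embodiment does not specify how the mechanical vibration data is segmented into discrete biting events. Purely as an illustrative sketch (the function name, threshold value and signal format are assumptions, not part of the patent), a simple amplitude threshold with hysteresis could mark candidate tooth taps in a sampled vibration signal:

```python
def detect_bite_events(samples, threshold=0.5):
    """Return the indices at which the vibration amplitude first rises
    above `threshold`, i.e. candidate tooth-tap events.

    `samples` is a sequence of amplitude values from the bone conduction
    unit; `threshold` is a hypothetical calibration constant.
    """
    events = []
    above = False  # hysteresis flag: are we currently inside a tap?
    for i, s in enumerate(samples):
        if not above and abs(s) >= threshold:
            events.append(i)   # rising edge: a new tap begins
            above = True
        elif above and abs(s) < threshold:
            above = False      # falling edge: the tap has ended
    return events
```

In a real device the threshold would be calibrated per user, and the signal would be band-limited to the vibration frequencies that cheekbone conduction actually carries.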
In an embodiment of the present invention, the sensing unit may also be a pressure detection unit. Detecting the user's mouth movement with the pressure detection unit works as follows: when the user bites his or her teeth together, the user's facial muscles change, causing the muscles around the ear to contract; this contraction exerts pressure on the pressure detection unit located at the ear, so the unit can detect the user's mouth movement and obtain the ear pressure data generated by the change in the user's mouth movement.
Step 102: and judging whether the sensing parameters meet a first preset condition or not to obtain a judgment result.
In the embodiment of the present invention, a matching rule between sensing parameters and operation instructions is preset. For example, the mouth motion of continuously tapping the teeth corresponds to the selection operation of an application program; accordingly, the matching rule is: the sensing parameter corresponding to continuous tapping of the teeth matches the application-selection operation instruction. As another example, matched mouth motions, that is, sensing parameters, may be set for operation instructions such as long press, short press and drag; when a corresponding sensing parameter is acquired, the first operation instruction corresponding to that sensing parameter can be determined according to the matching rule.
Based on this, judging whether the sensing parameter meets the first preset condition specifically is: judging, according to the matching rule, whether the sensing parameter matches an operation instruction; when the sensing parameter matches an operation instruction, that operation instruction is executed, see step 103 below.
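The matching rule described above is essentially a lookup from a recognized mouth action to an operation instruction. A minimal sketch, assuming the sensing parameter has already been classified into an action label (all names below are hypothetical illustrations, not from the patent):

```python
# Hypothetical matching rule: recognized mouth actions mapped to
# operation instructions, mirroring the examples in the embodiment.
MATCHING_RULE = {
    "continuous_tap": "select_application",
    "double_tap": "launch_application",
    "long_bite": "long_press",
    "short_bite": "short_press",
}

def match_instruction(mouth_action):
    """Return the operation instruction matched by the mouth action,
    or None when the first preset condition is not met."""
    return MATCHING_RULE.get(mouth_action)
```

Returning None for an unmatched action corresponds to the judgment result showing that the first preset condition is not satisfied, in which case no instruction is executed.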
Step 103: and when the judgment result shows that the sensing parameter meets the first preset condition, determining and executing the first operation instruction.
Based on step 102, it can be determined, according to the preset matching rule, whether the sensing parameter satisfies the first preset condition, that is, whether the sensing parameter matches an operation instruction in the matching rule. When the judgment result shows that the acquired sensing parameter meets the first preset condition, the first operation instruction corresponding to the sensing parameter is determined and executed.
For example, the following combined operation is implemented: continuously tapping the teeth selects a program, and then double-tapping the teeth launches the selected application. The wearable electronic device executes the following steps: the sensing unit detects the sensing parameter generated by the mouth movement; when the parameter is judged to represent a continuous tapping movement, the application-selection operation is determined and executed according to the matching rule. The sensing unit then detects the sensing parameter generated by the mouth movement again; when the parameter is judged to represent a double-tapping movement, the operation of launching the application is determined and executed according to the matching rule.
According to the embodiment of the invention, the electronic equipment is operated through the tooth action of the user, so that the mouth movement operation is realized, the operation mode is simple and convenient, the hands can be liberated, the interestingness of interaction is increased, and the user experience is improved.
Fig. 2 is a schematic flowchart of an interaction method according to a second embodiment of the present invention, where the interaction method in this example is applied to a wearable electronic device, the wearable electronic device has a sensing unit, and the sensing unit is a bone conduction unit; the wearable electronic equipment can maintain the relative position relation with the ear of a user; when the wearable electronic equipment maintains a first relative position relation with the ear of a user, the interaction method comprises the following steps:
step 201: and detecting mouth movements generated by the biting movements of the teeth of the user transmitted through the cheekbones of the user through the bone conduction unit to obtain tooth vibration data.
In the embodiment of the present invention, the wearable electronic device is specifically an intelligent headset, and the intelligent headset can be worn on the ear of the user, that is, maintain a first relative position relationship with the ear of the user. The intelligent earphone is provided with a speaker and a sensing unit, wherein the sensing unit is a bone conduction unit; the bone conduction unit can detect the mouth movement of the user.
Specifically, detecting the user's mouth movement with the bone conduction unit works as follows: when the user bites his or her teeth together, a tapping sound is produced; different sounds correspond to mechanical vibrations of different frequencies, and mechanical vibrations of specific frequencies are transmitted from the user's mouth through the cheekbones to the user's ear. The bone conduction unit located at the ear can therefore detect the biting action of the user's teeth and obtain the tooth vibration data generated by that action.
Step 202: and judging whether the tooth vibration data meet a first preset condition or not to obtain a judgment result.
In the embodiment of the present invention, a matching rule between tooth vibration data and operation instructions is preset. For example, the mouth motion of continuously tapping the teeth corresponds to the selection operation of an application program; accordingly, the matching rule is: the tooth vibration data corresponding to continuous tapping of the teeth matches the application-selection operation instruction. As another example, matched mouth motions, that is, tooth vibration data, may be set for operation instructions such as long press, short press and drag; when corresponding tooth vibration data is acquired, the first operation instruction corresponding to that data can be determined according to the matching rule.
Based on this, judging whether the tooth vibration data meets the first preset condition specifically includes: judging, according to the matching rule, whether the tooth vibration data matches an operation instruction; when the tooth vibration data matches an operation instruction, that operation instruction is executed, see step 203 below.
Step 203: and when the judgment result shows that the tooth vibration data meet the first preset condition, determining and executing the first operation instruction.
Based on step 202, it can be determined, according to the preset matching rule, whether the tooth vibration data satisfies the first preset condition, that is, whether the tooth vibration data matches an operation instruction in the matching rule. When the judgment result shows that the acquired tooth vibration data meets the first preset condition, the first operation instruction corresponding to the tooth vibration data is determined and executed.
For example, the following combined operation is implemented: continuously tapping the teeth selects a program, and then double-tapping the teeth launches the selected application. The wearable electronic device executes the following steps: the sensing unit detects the tooth vibration data generated by the mouth movement; when the data is judged to represent a continuous tapping movement, the application-selection operation is determined and executed according to the matching rule. The sensing unit then detects the tooth vibration data generated by the mouth movement again; when the data is judged to represent a double-tapping movement, the operation of launching the application is determined and executed according to the matching rule.
According to the embodiment of the invention, the electronic equipment is operated through the tooth action of the user, so that the mouth movement operation is realized, the operation mode is simple and convenient, the hands can be liberated, the interestingness of interaction is increased, and the user experience is improved.
Fig. 3 is a flowchart illustrating an interaction method according to a third embodiment of the present invention, where the interaction method is applied to a wearable electronic device, and the wearable electronic device has a sensing unit; the wearable electronic equipment can maintain the relative position relation with the ear of a user; when the wearable electronic equipment maintains a first relative position relation with the ear of a user, the interaction method comprises the following steps:
step 301: detecting, by the sensing unit, ear pressure data resulting from the change in the user's mouth motion.
In the embodiment of the present invention, the wearable electronic device is specifically an intelligent headset, and the intelligent headset can be worn on the ear of the user, that is, maintain a first relative position relationship with the ear of the user. The smart headset includes a speaker and a sensing unit, and the sensing unit can detect the mouth movement of the user.
In an embodiment of the present invention, the sensing unit is specifically a pressure detection unit. Detecting the user's mouth movement with the pressure detection unit works as follows: when the user bites his or her teeth together, the user's facial muscles change, causing the muscles around the ear to contract; this contraction exerts pressure on the pressure detection unit located at the ear, so the unit can detect the user's mouth movement and obtain the ear pressure data generated by the change in the user's mouth movement.
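As an illustrative sketch of the pressure-based detection (the baseline and delta values are assumptions; the patent gives no concrete thresholds), a mouth action could be flagged whenever the measured ear pressure deviates sufficiently from its resting baseline:

```python
def detect_pressure_action(pressure_samples, baseline, delta=0.2):
    """Return True when any sample deviates from the resting baseline
    by more than `delta`, indicating that an ear-muscle contraction
    (and hence a mouth action) occurred. Threshold values are
    illustrative, not taken from the patent."""
    return any(abs(p - baseline) > delta for p in pressure_samples)
```

The baseline would be measured while the user's mouth is at rest, and delta chosen above the sensor's noise floor.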
Step 302: and judging whether the ear pressure data meet a first preset condition or not to obtain a judgment result.
In the embodiment of the present invention, a matching rule between ear pressure data and operation instructions is preset. For example, the mouth motion of continuously tapping the teeth corresponds to the selection operation of an application program; accordingly, the matching rule is: the ear pressure data corresponding to continuous tapping of the teeth matches the application-selection operation instruction. As another example, matched mouth motions, that is, ear pressure data, may be set for operation instructions such as long press, short press and drag; when corresponding ear pressure data is collected, the first operation instruction corresponding to that data can be determined according to the matching rule.
Based on this, judging whether the ear pressure data meets the first preset condition specifically is: judging, according to the matching rule, whether the ear pressure data matches an operation instruction; when the ear pressure data matches an operation instruction, that operation instruction is executed, see step 303 below.
Step 303: and when the judgment result shows that the ear pressure data meet the first preset condition, determining and executing the first operation instruction.
Based on step 302, it can be determined, according to the preset matching rule, whether the ear pressure data satisfies the first preset condition, that is, whether the ear pressure data matches an operation instruction in the matching rule. When the judgment result shows that the acquired ear pressure data meets the first preset condition, the first operation instruction corresponding to the ear pressure data is determined and executed.
For example, the following combined operation is implemented: continuously tapping the teeth selects a program, and then double-tapping the teeth launches the selected application. The wearable electronic device executes the following steps: the sensing unit detects the ear pressure data generated by the mouth movement; when the data is judged to represent a continuous tapping movement, the application-selection operation is determined and executed according to the matching rule. The sensing unit then detects the ear pressure data generated by the mouth movement again; when the data is judged to represent a double-tapping movement, the operation of launching the application is determined and executed according to the matching rule.
According to the embodiment of the invention, the electronic equipment is operated through the tooth action of the user, so that the mouth movement operation is realized, the operation mode is simple and convenient, the hands can be liberated, the interestingness of interaction is increased, and the user experience is improved.
Fig. 4 is a schematic flowchart of an interaction method according to a fourth embodiment of the present invention, where the interaction method in this example is applied to a wearable electronic device, the wearable electronic device has a sensing unit, and the sensing unit is a bone conduction unit; the wearable electronic equipment can maintain the relative position relation with the ear of a user; when the wearable electronic equipment maintains a first relative position relation with the ear of a user, the interaction method comprises the following steps:
step 401: and detecting mouth movements generated by the biting movements of the teeth of the user transmitted through the cheekbones of the user through the bone conduction unit to obtain tooth vibration data.
In the embodiment of the present invention, the wearable electronic device is specifically an intelligent headset, and the intelligent headset can be worn on the ear of the user, that is, maintain a first relative position relationship with the ear of the user. The intelligent earphone is provided with a speaker and a sensing unit, wherein the sensing unit is a bone conduction unit; the bone conduction unit can detect the mouth movement of the user.
Specifically, detecting the user's mouth movement with the bone conduction unit works as follows: when the user bites his or her teeth together, a tapping sound is produced; different sounds correspond to mechanical vibrations of different frequencies, and mechanical vibrations of specific frequencies are transmitted from the user's mouth through the cheekbones to the user's ear. The bone conduction unit located at the ear can therefore detect the biting action of the user's teeth and obtain the tooth vibration data generated by that action.
Step 402: and determining the occlusion times of the teeth of the user according to the tooth vibration data.
In the embodiment of the invention, the number of occlusions of the teeth can be determined from the tooth vibration data. Specifically, the tooth vibration data is audio vibration data, and the number of wave crests in the audio vibration data corresponds to the number of occlusions of the teeth.
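Following the rule that each wave crest in the audio vibration data corresponds to one occlusion, counting bites reduces to counting peaks. A minimal sketch (the threshold value is an assumption; the patent does not specify one):

```python
def count_bites(vibration, threshold=0.5):
    """Count tooth occlusions as the number of peaks in the audio
    vibration data: samples above `threshold` that are strictly higher
    than their left neighbour and at least as high as their right one
    (so a flat crest is counted only once)."""
    count = 0
    for i in range(1, len(vibration) - 1):
        if (vibration[i] >= threshold
                and vibration[i] > vibration[i - 1]
                and vibration[i] >= vibration[i + 1]):
            count += 1
    return count
```

A production implementation would also enforce a minimum spacing between peaks so that the ringing of a single bite is not counted twice.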
Step 403: and judging whether the occlusion times meet a first preset condition or not to obtain a judgment result.
In the embodiment of the present invention, a matching rule between the number of occlusions and operation instructions is preset. For example, the mouth motion of continuously tapping the teeth corresponds to the selection operation of an application program; accordingly, the matching rule is: the number of occlusions corresponding to continuous tapping of the teeth matches the application-selection operation instruction.
Based on this, judging whether the number of occlusions meets the first preset condition specifically is: judging, according to the matching rule, whether the number of occlusions matches an operation instruction; when the number of occlusions matches an operation instruction, that operation instruction is executed, see step 404 below.
Step 404: and when the judgment result shows that the occlusion times meet the first preset condition, determining and executing the first operation instruction.
Based on step 403, it can be determined, according to the preset matching rule, whether the number of occlusions satisfies the first preset condition, that is, whether the number of occlusions matches an operation instruction in the matching rule. When the judgment result shows that the acquired number of occlusions meets the first preset condition, the first operation instruction corresponding to it is determined and executed.
For example, the following combined operation is implemented: continuously tapping the teeth selects a program, and then double-tapping the teeth launches the selected application. The wearable electronic device executes the following steps: the sensing unit detects the number of occlusions generated by the mouth movement; when that number is judged to represent a continuous tapping movement, the application-selection operation is determined and executed according to the matching rule. The sensing unit then detects the number of occlusions generated by the mouth movement again; when that number is judged to represent a double-tapping movement, the operation of launching the application is determined and executed according to the matching rule.
According to the embodiment of the invention, the electronic equipment is operated through the tooth action of the user, so that the mouth movement operation is realized, the operation mode is simple and convenient, the hands can be liberated, the interestingness of interaction is increased, and the user experience is improved.
Fig. 5 is a schematic flowchart of an interaction method according to a fifth embodiment of the present invention, where the interaction method in this example is applied to a wearable electronic device, the wearable electronic device has a sensing unit, and the sensing unit is a bone conduction unit; the wearable electronic equipment can maintain the relative position relation with the ear of a user; when the wearable electronic equipment maintains a first relative position relation with the ear of a user, the interaction method comprises the following steps:
step 501: and detecting mouth movements generated by the biting movements of the teeth of the user transmitted through the cheekbones of the user through the bone conduction unit to obtain tooth vibration data.
In the embodiment of the present invention, the wearable electronic device is specifically an intelligent headset, and the intelligent headset can be worn on the ear of the user, that is, maintain a first relative position relationship with the ear of the user. The intelligent earphone is provided with a speaker and a sensing unit, wherein the sensing unit is a bone conduction unit; the bone conduction unit can detect the mouth movement of the user.
Specifically, detecting the user's mouth movement with the bone conduction unit works as follows: when the user bites his or her teeth together, a tapping sound is produced; different sounds correspond to mechanical vibrations of different frequencies, and mechanical vibrations of specific frequencies are transmitted from the user's mouth through the cheekbones to the user's ear. The bone conduction unit located at the ear can therefore detect the biting action of the user's teeth and obtain the tooth vibration data generated by that action.
Step 502: and determining the occlusion duration of the teeth of the user according to the tooth vibration data.
In the embodiment of the invention, the occlusion duration of the teeth can be determined from the tooth vibration data. Specifically, the tooth vibration data is audio vibration data, and the time interval between two adjacent wave crests in the audio vibration data is the occlusion duration of the teeth.
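Per the rule above, the occlusion duration is the interval between two adjacent wave crests, so it can be read directly off the peak timestamps. A minimal sketch (the function name is an assumption for illustration):

```python
def occlusion_durations(peak_times):
    """Given the timestamps of successive vibration peaks, return the
    occlusion durations, taken (per the embodiment) as the interval
    between each pair of adjacent peaks."""
    return [t2 - t1 for t1, t2 in zip(peak_times, peak_times[1:])]
```

With fewer than two peaks no duration can be formed, and the function returns an empty list.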
Step 503: judging whether the occlusion duration meets a first preset condition to obtain a judgment result.
In the embodiment of the present invention, a matching rule between occlusion durations and operation instructions is preset. For example, if a long tooth occlusion corresponds to an application selection operation, the matching rule is: the occlusion duration of a long-occlusion mouth action matches the application selection operation instruction.
Based on this, judging whether the occlusion duration meets the first preset condition specifically is: judging, according to the matching rule, whether the occlusion duration matches an operation instruction; when the occlusion duration matches an operation instruction, that operation instruction is executed, see step 504 below.
Step 504: when the judgment result shows that the occlusion duration meets the first preset condition, determining and executing the first operation instruction.
Based on step 503, it can be determined whether the occlusion duration satisfies the first preset condition, that is, whether the occlusion duration matches an operation instruction in the preset matching rule. When the judgment result shows that the obtained occlusion duration meets the first preset condition, the first operation instruction corresponding to that occlusion duration is determined and executed.
For example, the following interaction is implemented: an application is selected by occluding the teeth for a long time, and the selected application is then launched by occluding the teeth for a short time. Here, long and short may be determined by preset time thresholds; for example, an occlusion longer than T1 is a long occlusion, and an occlusion shorter than T2 is a short occlusion. The wearable electronic device performs the following steps: the sensing unit detects the occlusion duration of a mouth action, and when the occlusion duration indicates that the user's mouth action is a long occlusion, the application selection operation is determined and executed according to the matching rule. The sensing unit then detects the occlusion duration of another mouth action, and when the occlusion duration indicates that the user's mouth action is a short occlusion, the operation of launching the application is determined and executed according to the matching rule.
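A minimal sketch of this long/short matching rule, assuming concrete values for the thresholds T1 and T2 (which the patent does not specify) and illustrative instruction names:

```python
LONG_THRESHOLD_T1 = 0.8   # seconds; hypothetical value for T1
SHORT_THRESHOLD_T2 = 0.3  # seconds; hypothetical value for T2


def classify_occlusion(duration_s):
    """Classify an occlusion duration as 'long', 'short', or neither."""
    if duration_s > LONG_THRESHOLD_T1:
        return "long"
    if duration_s < SHORT_THRESHOLD_T2:
        return "short"
    return None  # falls between T2 and T1: matches neither rule


# Preset matching rule: occlusion class -> operation instruction.
MATCHING_RULE = {
    "long": "select_application",
    "short": "launch_application",
}


def instruction_for(duration_s):
    """Return the matched operation instruction, or None if no match."""
    return MATCHING_RULE.get(classify_occlusion(duration_s))
```

A duration between T2 and T1 matches no instruction, which mirrors the description: an instruction is executed only when the judgment result shows the first preset condition is met.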
According to the embodiment of the invention, the electronic device is operated through the user's tooth actions, realizing mouth-action control. This operation mode is simple and convenient, frees the user's hands, makes the interaction more engaging, and improves the user experience.
Fig. 6 is a schematic flowchart of an interaction method according to a sixth embodiment of the present invention. The interaction method in this example is applied to a wearable electronic device having a sensing unit. The wearable electronic device can maintain a relative positional relationship with the ear of a user; when the wearable electronic device maintains a first relative positional relationship with the ear of the user, the interaction method comprises the following steps:
step 601: and detecting the mouth action of the user through the sensing unit to obtain a sensing parameter.
In the embodiment of the present invention, the wearable electronic device is specifically a smart headset, which can be worn on the user's ear, that is, maintain the first relative positional relationship with the user's ear. The smart headset has a speaker and a sensing unit, and the sensing unit can detect the user's mouth actions and obtain a sensing parameter.
In an embodiment of the present invention, the sensing unit may be a bone conduction unit. Detecting the user's mouth actions with the bone conduction unit proceeds as follows: when the user bites the teeth together, a tapping sound is produced; different sounds correspond to mechanical vibrations of different frequencies, and mechanical vibrations of a specific frequency are transmitted from the user's mouth to the user's ear through the cheekbones, so that the bone conduction unit located at the ear can detect the biting action of the user's teeth and obtain the mechanical vibration data generated by that action.
In an embodiment of the present invention, the sensing unit may also be a pressure detection unit. Detecting the user's mouth actions with the pressure detection unit proceeds as follows: when the user bites the teeth together, the user's facial muscles change, causing the muscles near the ear to contract; this contraction exerts pressure on the pressure detection unit located at the ear, so the pressure detection unit can detect the user's mouth action and obtain the ear pressure data generated by the change in the user's mouth action.
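The pressure-based detection can be illustrated as follows; the baseline, threshold margin, and function name are assumptions made for the sketch:

```python
def detect_bite_from_pressure(pressure_samples, baseline, delta):
    """Return indices of bite events in a stream of ear pressure data.

    A bite is registered each time the pressure rises above
    baseline + delta (a rising edge), modelling the muscle contraction
    near the ear pressing on the pressure detection unit.
    """
    events = []
    above = False
    for i, pressure in enumerate(pressure_samples):
        if not above and pressure > baseline + delta:
            events.append(i)  # new contraction: one bite event
            above = True
        elif above and pressure <= baseline + delta:
            above = False     # pressure relaxed: ready for the next bite
    return events
```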
Step 602: judging whether the sensing parameter meets a first preset condition to obtain a judgment result.
In the embodiment of the present invention, a matching rule between sensing parameters and operation instructions is preset. For example, if the mouth action of continuously tapping the teeth corresponds to an application selection operation, the matching rule is: the sensing parameter corresponding to the mouth action of continuously tapping the teeth matches the application selection operation instruction. As another example, matched mouth actions, that is, sensing parameters, can be set for operation instructions such as long press, short press, and drag; when the corresponding sensing parameter is acquired, the first operation instruction corresponding to that sensing parameter can be determined according to the matching rule.
Based on this, judging whether the sensing parameter meets the first preset condition specifically is: judging, according to the matching rule, whether the sensing parameter matches an operation instruction; when the sensing parameter matches an operation instruction, that operation instruction is executed, see step 603 below.
Step 603: and when the judgment result shows that the sensing parameter meets the first preset condition, determining and executing the first operation instruction.
Based on step 602, according to the preset matching rule, it may be determined whether the sensing parameter meets a first preset condition, that is, whether the sensing parameter matches with an operation instruction in the matching rule. And when the judgment result shows that the acquired sensing parameters meet a first preset condition, determining and executing a first operation instruction corresponding to the sensing parameters.
For example, the following interaction is implemented: an application is selected by continuously tapping the teeth, and the selected application is then launched by double-tapping the teeth. The wearable electronic device performs the following steps: the sensing unit detects the sensing parameter generated by a mouth action, and when the sensing parameter indicates that the user's mouth action is a continuous tapping action, the application selection operation is determined and executed according to the matching rule. The sensing unit then detects the sensing parameter generated by another mouth action, and when the sensing parameter indicates that the user's mouth action is a double-tapping action, the operation of launching the application is determined and executed according to the matching rule.
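The continuous-tap versus double-tap distinction could be implemented by grouping tap timestamps into bursts; the grouping window and category names below are illustrative assumptions, not values from the patent:

```python
def classify_tap_pattern(tap_times, window_s=1.0):
    """Classify the first burst of taps in a list of tap timestamps.

    Taps within `window_s` of the previous tap belong to the same burst;
    3 or more taps -> 'continuous', exactly 2 -> 'double', 1 -> 'single'.
    """
    if not tap_times:
        return None
    count = 1
    for prev, cur in zip(tap_times, tap_times[1:]):
        if cur - prev <= window_s:
            count += 1
        else:
            break  # the burst ended; later taps are a new gesture
    if count >= 3:
        return "continuous"
    if count == 2:
        return "double"
    return "single"
```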
Step 604: a second operation is obtained.
In the embodiment of the present invention, the second operation is triggered by a user, and specifically, the user may trigger the second operation through a key on the wearable electronic device, or trigger the second operation through a sound acquisition unit on the wearable electronic device, or trigger the second operation through an image acquisition unit on the wearable electronic device.
Step 605: and locking a sensing unit in the electronic equipment in response to the second operation to stop detecting the mouth action of the user.
In the embodiment of the invention, the wearable electronic device has a locking function. When the second operation triggered by the user is obtained, the sensing unit is locked and stops detecting the user's mouth actions, which prevents unintended bites from accidentally triggering operations.
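The locking behaviour of steps 604 and 605 can be sketched as a simple gate on the detection path (class and method names are illustrative):

```python
class SensingController:
    """Minimal sketch: a second operation locks the sensing unit so
    mouth actions are ignored until it is unlocked again."""

    def __init__(self):
        self.locked = False
        self.handled = []  # operation instructions actually executed

    def on_second_operation(self):
        # Step 605: lock the sensing unit, stop detecting mouth actions.
        self.locked = True

    def unlock(self):
        self.locked = False

    def on_mouth_action(self, instruction):
        """Handle a detected mouth action; returns True if executed."""
        if self.locked:
            return False  # accidental bites are ignored while locked
        self.handled.append(instruction)
        return True
```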
According to the embodiment of the invention, the electronic device is operated through the user's tooth actions, realizing mouth-action control. This operation mode is simple and convenient, frees the user's hands, makes the interaction more engaging, and improves the user experience.
Fig. 7 is a schematic structural composition diagram of a wearable electronic device according to a first embodiment of the present invention, in which the wearable electronic device includes a sensing unit; the wearable electronic device is capable of maintaining a first relative positional relationship with an ear of a user; the wearable electronic device further includes:
the control unit 71 is used for controlling the sensing unit to detect the mouth movement of the user to obtain sensing parameters;
the judging unit 72 is configured to judge whether the sensing parameter meets a first preset condition, so as to obtain a judgment result;
and the processing unit 73 is configured to determine and execute the first operation instruction when the determination result indicates that the sensing parameter meets the first preset condition.
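The control/judging/processing split described for Fig. 7 can be sketched as follows; the matching-rule dictionary and method names are illustrative assumptions:

```python
class WearableDevice:
    """Illustrative split into control, judging, and processing units."""

    def __init__(self, matching_rule):
        # Preset matching rule: sensing parameter -> operation instruction.
        self.matching_rule = matching_rule

    def control_unit(self, detect):
        # Drive the sensing unit and return the sensing parameter.
        return detect()

    def judging_unit(self, sensing_parameter):
        # Judge whether the sensing parameter meets the preset condition.
        return sensing_parameter in self.matching_rule

    def processing_unit(self, sensing_parameter):
        # Determine and (here, symbolically) execute the first instruction.
        return self.matching_rule[sensing_parameter]

    def interact(self, detect):
        parameter = self.control_unit(detect)
        if self.judging_unit(parameter):
            return self.processing_unit(parameter)
        return None
```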
It should be understood by those skilled in the art that the functions of each processing unit in the wearable electronic device according to the embodiments of the present invention may be realized by analog circuits that implement the functions described in the embodiments of the present invention, or by running software that performs the functions described in the embodiments of the present invention on a smart device, as described with reference to the foregoing description of the interaction method.
Fig. 8 is a schematic structural composition diagram of a wearable electronic device according to a second embodiment of the present invention, in which the wearable electronic device includes a sensing unit; the wearable electronic device is capable of maintaining a first relative positional relationship with an ear of a user; the wearable electronic device further includes:
the control unit 81 is used for controlling the sensing unit to detect the mouth movement of the user to obtain sensing parameters;
the judging unit 82 is configured to judge whether the sensing parameter meets a first preset condition, so as to obtain a judgment result;
and the processing unit 83 is configured to determine and execute the first operation instruction when the determination result indicates that the sensing parameter meets the first preset condition.
Preferably, the sensing unit is a bone conduction unit;
the control unit 81 includes a first control subunit 811 for controlling the bone conduction unit to detect mouth movements generated by biting movements of the user's teeth transmitted through the user's cheekbones, and to obtain tooth vibration data.
It should be understood by those skilled in the art that the functions of each processing unit in the wearable electronic device according to the embodiments of the present invention may be realized by analog circuits that implement the functions described in the embodiments of the present invention, or by running software that performs the functions described in the embodiments of the present invention on a smart device, as described with reference to the foregoing description of the interaction method.
Fig. 9 is a schematic structural composition diagram of a wearable electronic device according to a third embodiment of the present invention, where the wearable electronic device includes a sensing unit; the wearable electronic device is capable of maintaining a first relative positional relationship with an ear of a user; the wearable electronic device further includes:
the control unit 91 is used for controlling the sensing unit to detect the mouth movement of the user to obtain sensing parameters;
the judging unit 92 is configured to judge whether the sensing parameter meets a first preset condition, so as to obtain a judgment result;
and the processing unit 93 is configured to determine and execute the first operation instruction when the determination result indicates that the sensing parameter meets the first preset condition.
Preferably, the control unit 91 comprises a second control subunit 911 for controlling the sensing unit to detect ear pressure data generated by the change of the user's mouth movements.
It should be understood by those skilled in the art that the functions of each processing unit in the wearable electronic device according to the embodiments of the present invention may be realized by analog circuits that implement the functions described in the embodiments of the present invention, or by running software that performs the functions described in the embodiments of the present invention on a smart device, as described with reference to the foregoing description of the interaction method.
Fig. 10 is a schematic structural composition diagram of a wearable electronic device according to a fourth embodiment of the present invention, where the wearable electronic device includes a sensing unit; the wearable electronic device is capable of maintaining a first relative positional relationship with an ear of a user; the wearable electronic device further includes:
the control unit 11 is used for controlling the sensing unit to detect the mouth movement of the user to obtain sensing parameters;
the judging unit 12 is configured to judge whether the sensing parameter meets a first preset condition, so as to obtain a judgment result;
and the processing unit 13 is configured to determine and execute the first operation instruction when the determination result indicates that the sensing parameter meets the first preset condition.
Preferably, the sensing unit is a bone conduction unit;
the control unit 11 includes a first control subunit 111 for controlling the bone conduction unit to detect mouth movements generated by biting movements of the teeth of the user transmitted through the cheekbones of the user, and obtaining tooth vibration data.
Preferably, the wearable electronic device further comprises:
a first determining unit 14 for determining the number of times of occlusion of the user's teeth based on the tooth vibration data;
accordingly, the judging unit 12 includes a first judging subunit 121 configured to judge whether the number of engagements satisfies a first preset sub-condition.
It should be understood by those skilled in the art that the functions of each processing unit in the wearable electronic device according to the embodiments of the present invention may be realized by analog circuits that implement the functions described in the embodiments of the present invention, or by running software that performs the functions described in the embodiments of the present invention on a smart device, as described with reference to the foregoing description of the interaction method.
Fig. 11 is a schematic structural composition diagram of a wearable electronic device according to a fifth embodiment of the present invention, where the wearable electronic device in this example includes a sensing unit; the wearable electronic device is capable of maintaining a first relative positional relationship with an ear of a user; the wearable electronic device further includes:
the control unit 21 is used for controlling the sensing unit to detect the mouth movement of the user to obtain sensing parameters;
the judging unit 22 is configured to judge whether the sensing parameter meets a first preset condition, so as to obtain a judgment result;
and the processing unit 23 is configured to determine and execute the first operation instruction when the determination result indicates that the sensing parameter meets the first preset condition.
Preferably, the sensing unit is a bone conduction unit;
the control unit 21 comprises a first control subunit 211, configured to control the bone conduction unit to detect mouth movements generated by biting movements of the teeth of the user transmitted through the cheekbones of the user, so as to obtain tooth vibration data.
Preferably, the wearable electronic device further comprises:
a second determining unit 24 for determining the occlusion duration of the user's teeth according to the tooth vibration data;
accordingly, the judging unit 22 includes a second judging subunit 221, configured to judge whether the engagement duration satisfies a second preset sub-condition.
It should be understood by those skilled in the art that the functions of each processing unit in the wearable electronic device according to the embodiments of the present invention may be realized by analog circuits that implement the functions described in the embodiments of the present invention, or by running software that performs the functions described in the embodiments of the present invention on a smart device, as described with reference to the foregoing description of the interaction method.
Fig. 12 is a schematic structural composition diagram of a wearable electronic device according to a sixth embodiment of the present invention, where the wearable electronic device includes a sensing unit; the wearable electronic device is capable of maintaining a first relative positional relationship with an ear of a user; the wearable electronic device further includes:
the control unit 31 is used for controlling the sensing unit to detect the mouth movement of the user to obtain sensing parameters;
the judging unit 32 is configured to judge whether the sensing parameter meets a first preset condition, so as to obtain a judgment result;
and the processing unit 33 is configured to determine and execute the first operation instruction when the determination result indicates that the sensing parameter meets the first preset condition.
Preferably, the wearable electronic device further comprises:
an acquisition unit 34 for acquiring a second operation;
a response unit 35, configured to lock the sensing unit in the electronic device in response to the second operation, so as to stop detecting the mouth motion of the user.
It should be understood by those skilled in the art that the functions of each processing unit in the wearable electronic device according to the embodiments of the present invention may be realized by analog circuits that implement the functions described in the embodiments of the present invention, or by running software that performs the functions described in the embodiments of the present invention on a smart device, as described with reference to the foregoing description of the interaction method.
The technical schemes described in the embodiments of the present invention can be combined arbitrarily without conflict.
In the embodiments provided in the present invention, it should be understood that the disclosed method and intelligent device may be implemented in other ways. The above-described device embodiments are merely illustrative, for example, the division of the unit is only a logical functional division, and there may be other division ways in actual implementation, such as: multiple units or components may be combined, or may be integrated into another system, or some features may be omitted, or not implemented. In addition, the coupling, direct coupling or communication connection between the components shown or discussed may be through some interfaces, and the indirect coupling or communication connection between the devices or units may be electrical, mechanical or other forms.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, that is, may be located in one place, or may be distributed on a plurality of network units; some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, all the functional units in the embodiments of the present invention may be integrated into one second processing unit, or each unit may be separately regarded as one unit, or two or more units may be integrated into one unit; the integrated unit can be realized in a form of hardware, or in a form of hardware plus a software functional unit.
The above description is only for the specific embodiments of the present invention, but the scope of the present invention is not limited thereto, and any person skilled in the art can easily conceive of the changes or substitutions within the technical scope of the present invention, and all the changes or substitutions should be covered within the scope of the present invention.

Claims (8)

1. An interaction method is applied to wearable electronic equipment, and the wearable electronic equipment is provided with a sensing unit; the wearable electronic equipment can maintain the relative position relation with the ear of a user; when the wearable electronic device maintains a first relative positional relationship with an ear of a user, the interaction method comprises:
detecting mouth movements of the user through the sensing unit to obtain sensing parameters;
judging whether the sensing parameters meet a first preset condition or not to obtain a judgment result;
when the judgment result shows that the sensing parameters meet the first preset condition, determining and executing a first operation instruction;
wherein, the detecting the mouth action of the user through the sensing unit to obtain the sensing parameter comprises:
when the sensing unit is a bone conduction unit, mouth movements generated by the occlusion movements of the teeth of the user and transmitted by the cheekbones of the user are detected through the bone conduction unit to obtain tooth vibration data; or,
when the sensing unit is a pressure sensing unit, ear pressure data generated by the change of the mouth action of the user is detected through the pressure sensing unit.
2. The interaction method of claim 1, the method further comprising:
determining the occlusion times of the teeth of the user according to the tooth vibration data;
correspondingly, the judging whether the sensing parameter meets a first preset condition includes:
and judging whether the occlusion times meet a first preset condition.
3. The interaction method of claim 1, the method further comprising:
determining the occlusion duration of the teeth of the user according to the tooth vibration data;
correspondingly, the judging whether the sensing parameter meets a first preset condition includes:
and judging whether the meshing duration meets a first preset condition.
4. The interaction method according to any one of claims 1 to 3, the method further comprising:
obtaining a second operation;
and locking a sensing unit in the electronic equipment in response to the second operation to stop detecting the mouth action of the user.
5. A wearable electronic device, comprising a sensing unit; the wearable electronic device is capable of maintaining a first relative positional relationship with an ear of a user; the wearable electronic device further includes:
the control unit is used for controlling the sensing unit to detect the mouth movement of the user to obtain sensing parameters;
the judging unit is used for judging whether the sensing parameters meet a first preset condition or not to obtain a judging result;
the processing unit is used for determining and executing a first operation instruction when the judgment result shows that the sensing parameter meets the first preset condition;
wherein the sensing unit is a bone conduction unit;
the control unit comprises a first control subunit, a second control subunit and a control unit, wherein the first control subunit is used for controlling the bone conduction unit to detect mouth movements generated by the biting movements of the teeth of the user and transmitted by the cheekbones of the user, and obtaining tooth vibration data;
or, the sensing unit is a pressure sensing unit, and the control unit comprises a second control subunit for controlling the sensing unit to detect ear pressure data generated by the change of the mouth motion of the user.
6. The wearable electronic device of claim 5, further comprising:
a first determining unit for determining the occlusion times of the teeth of the user according to the tooth vibration data;
correspondingly, the judging unit comprises a first judging subunit for judging whether the occlusion times meet a first preset sub-condition.
7. The wearable electronic device of claim 5, further comprising:
the second determining unit is used for determining the occlusion time of the teeth of the user according to the tooth vibration data;
correspondingly, the judging unit comprises a second judging subunit, and is used for judging whether the occlusion duration meets a second preset sub-condition.
8. A wearable electronic device according to any of claims 5-7, further comprising:
an acquisition unit configured to acquire a second operation;
and the response unit is used for responding to the second operation and locking the sensing unit in the electronic equipment so as to stop detecting the mouth action of the user.
CN201410470040.2A 2014-09-15 2014-09-15 A kind of exchange method and wearable electronic equipment Active CN104317388B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201410470040.2A CN104317388B (en) 2014-09-15 2014-09-15 A kind of exchange method and wearable electronic equipment


Publications (2)

Publication Number Publication Date
CN104317388A CN104317388A (en) 2015-01-28
CN104317388B true CN104317388B (en) 2018-12-14

Family

ID=52372627


Families Citing this family (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105844851B (en) * 2016-05-13 2018-12-04 李玉婷 A kind of concealed alarm method and hidden alarming device
CN105913623A (en) * 2016-05-13 2016-08-31 李玉婷 Concealed alarming method and concealed alarming device
CN106774914A (en) * 2016-12-26 2017-05-31 苏州欧菲光科技有限公司 The control method and Wearable of Wearable
CN107242913A (en) * 2017-06-02 2017-10-13 京东方科技集团股份有限公司 A kind of dental prosthesis system and its method of work, terminal, signal interaction system
CN108958477A (en) * 2018-06-08 2018-12-07 张沂 Exchange method, device, electronic equipment and computer readable storage medium
CN110134249A (en) * 2019-05-31 2019-08-16 王刘京 Wear interactive display device and its control method
CN110286755B (en) * 2019-06-12 2022-07-12 Oppo广东移动通信有限公司 Terminal control method and device, electronic equipment and computer readable storage medium
CN111093134B (en) * 2019-12-23 2022-01-11 Oppo广东移动通信有限公司 Earphone control device, earphone control method and earphone
CN111248915B (en) * 2019-12-30 2021-08-17 联想(北京)有限公司 Processing method and device and electronic equipment
CN111050248B (en) * 2020-01-14 2021-10-01 Oppo广东移动通信有限公司 Wireless earphone and control method thereof
CN111785267A (en) * 2020-07-01 2020-10-16 Oppo广东移动通信有限公司 Interaction control method and device and computer readable storage medium
CN111768757A (en) * 2020-07-10 2020-10-13 Oppo(重庆)智能科技有限公司 Control method of wearable device, wearable device and storage medium
CN112286289A (en) * 2020-10-30 2021-01-29 刘啸 Buccal wearable device, processing method and storage medium
CN113010008A (en) * 2021-02-01 2021-06-22 深圳市沃特沃德信息有限公司 Tooth movement interaction method and device based on TWS earphone and computer equipment
CN113963528A (en) * 2021-10-20 2022-01-21 浙江理工大学 Man-machine interaction system

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1679371A (en) * 2002-08-30 2005-10-05 中岛淑贵 Microphone and communication interface system
CN101926185A (en) * 2008-02-15 2010-12-22 索尼图斯医疗公司 Headset systems and methods
CN103116405A (en) * 2013-03-06 2013-05-22 胡三清 Real-time detection and control device and method for brain and muscle electricity in tooth movement states
CN103412640A (en) * 2013-05-16 2013-11-27 胡三清 Device and method for character or command input controlled by teeth
CN103425489A (en) * 2012-05-03 2013-12-04 Dsp集团有限公司 A system and apparatus for controlling a device with a bone conduction transducer
CN103699226A (en) * 2013-12-18 2014-04-02 天津大学 Tri-modal serial brain-computer interface method based on multi-information fusion
CN103699227A (en) * 2013-12-25 2014-04-02 邵剑锋 Novel human-computer interaction system
CN104007826A (en) * 2014-06-17 2014-08-27 合一网络技术(北京)有限公司 Video control method and system based on face movement identification technology

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8994647B2 (en) * 2009-02-05 2015-03-31 Ercc Co., Ltd. Input device, wearable computer, and input method




Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant