US20180253213A1 - Intelligent Interaction Method, Device, and System - Google Patents
- Publication number
- US20180253213A1 (Application US15/559,691)
- Authority
- US
- United States
- Prior art keywords
- control information
- smart watch
- pointer icon
- arm
- smart
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
- G06F3/04812—Interaction techniques based on cursor appearance or behaviour, e.g. being affected by the presence of displayed objects
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/017—Head mounted
-
- G—PHYSICS
- G04—HOROLOGY
- G04G—ELECTRONIC TIME-PIECES
- G04G17/00—Structural details; Housings
- G04G17/02—Component assemblies
- G04G17/06—Electric connectors, e.g. conductive elastomers
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F1/00—Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
- G06F1/16—Constructional details or arrangements
- G06F1/1613—Constructional details or arrangements for portable computers
- G06F1/163—Wearable computers, e.g. on a belt
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/014—Hand-worn input/output arrangements, e.g. data gloves
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
- G06F3/0485—Scrolling or panning
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
-
- G—PHYSICS
- G08—SIGNALLING
- G08C—TRANSMISSION SYSTEMS FOR MEASURED VALUES, CONTROL OR SIMILAR SIGNALS
- G08C17/00—Arrangements for transmitting signals characterised by the use of a wireless electrical link
- G08C17/02—Arrangements for transmitting signals characterised by the use of a wireless electrical link using a radio link
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/017—Head mounted
- G02B2027/0178—Eyeglass type
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2200/00—Indexing scheme relating to G06F1/04 - G06F1/32
- G06F2200/16—Indexing scheme relating to G06F1/16 - G06F1/18
- G06F2200/163—Indexing scheme relating to constructional details of the computer
- G06F2200/1637—Sensing arrangement for detection of housing movement or orientation, e.g. for controlling scrolling or cursor movement on the display of an handheld computer
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2203/00—Indexing scheme relating to G06F3/00 - G06F3/048
- G06F2203/038—Indexing scheme relating to G06F3/038
- G06F2203/0383—Remote input, i.e. interface arrangements in which the signals generated by a pointing device are transmitted to a PC at a remote location, e.g. to a PC in a LAN
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
- G06F3/0482—Interaction with lists of selectable items, e.g. menus
-
- G—PHYSICS
- G08—SIGNALLING
- G08C—TRANSMISSION SYSTEMS FOR MEASURED VALUES, CONTROL OR SIMILAR SIGNALS
- G08C2201/00—Transmission systems of control signals via wireless link
- G08C2201/30—User interface
- G08C2201/32—Remote control based on movements, attitude of remote control device
Abstract
Description
- Embodiments of the present invention relate to wearable technologies, and in particular, to an intelligent interaction method, a device, and a system.
- With the development of wearable technologies, smart watches, smart glasses, and similar devices are becoming widely popular among consumers.
- For smart glasses, a touchpad on a leg of the glasses is currently used as the input interaction tool, and human-computer interaction is performed in combination with voice input. However, the interaction method has the following disadvantages:
- Due to the size limitation of the touchpad, only one-dimensional movement can be input, and the corresponding menus can only be made into one-dimensional scrolling menus, which leaves the available functions undiversified. In addition, although voice input is efficient, its application scenarios are limited. For example, both a noisy environment and a quiet public place such as a library limit the use of voice input.
- Embodiments of the present invention provide an intelligent interaction method, a device, and a system, to resolve problems of undiversified functions and a limited application scenario of the foregoing interaction method.
- According to a first aspect, an embodiment of the present invention provides an intelligent interaction method, where
- a pointer icon is set on a man-machine interface of smart glasses, where the method includes:
- receiving, by the smart glasses, control information sent by a smart watch, where the control information is generated by the smart watch according to a user input operation that is received; and
- controlling, by the smart glasses, movement of the pointer icon according to the control information.
- With reference to the first aspect, in a first possible implementation manner of the first aspect, the control information includes displacement information, where the displacement information includes a displacement that corresponds to the user input operation and that is obtained by a touchscreen of the smart watch; and
- the controlling, by the smart glasses, movement of the pointer icon according to the control information includes:
- controlling, by the smart glasses, the pointer icon to move by the displacement.
- With reference to the first possible implementation manner of the first aspect, in a second possible implementation manner of the first aspect, the control information further includes force information, and the force information is used to represent a pressing force corresponding to the user input operation; and
- the controlling, by the smart glasses, the pointer icon to move by the displacement includes:
- controlling, by the smart glasses, a movement speed of moving the pointer icon according to the force information.
- With reference to the first aspect, in a third possible implementation manner of the first aspect, the control information includes angle information, and the angle information includes a longitudinal rotation angle of the smart watch that is worn on an arm and that rotates about an axial direction of the arm, and a transverse rotation angle of the smart watch about an axial direction that is perpendicular to the arm by using an angle of the smart watch at a preset starting point as a reference when the arm is horizontally placed in front of a body; and
- the controlling, by the smart glasses, movement of the pointer icon according to the control information includes:
- controlling, by the smart glasses, the pointer icon to move, starting from the preset starting point, by a distance of p×β along an X-axis of the man-machine interface, and by a distance of p×α along a Y-axis of the man-machine interface, where α represents the longitudinal rotation angle, β represents the transverse rotation angle, and p is a preset constant.
- According to a second aspect, an embodiment of the present invention provides an intelligent interaction method, including:
- receiving, by smart glasses, control information sent by a smart watch, where the control information is an angle of the smart watch that is worn on an arm and that rotates about an axial direction of the arm when the arm is horizontally placed in front of a body; and
- controlling, by the smart glasses, scrolling of menus of the smart glasses on a man-machine interface of the smart glasses according to the control information, where a quantity of the scrolled menus on the man-machine interface depends on a magnitude of the angle.
- According to a third aspect, an embodiment of the present invention provides an intelligent device, where a pointer icon is set on a man-machine interface of the intelligent device, and the intelligent device includes:
- a receiver, where the receiver is configured to receive control information sent by a smart watch, where the control information is generated by the smart watch according to a user input operation that is received; and
- a processor, where the processor is configured to control movement of the pointer icon according to the control information.
- With reference to the third aspect, in a first possible implementation manner of the third aspect, the control information includes displacement information, where the displacement information includes a displacement that corresponds to the user input operation and that is obtained by a touchscreen of the smart watch, and the processor is specifically configured to control the pointer icon to move by the displacement.
- With reference to the first possible implementation manner of the third aspect, in a second possible implementation manner of the third aspect, the control information further includes force information, the force information is used to represent a pressing force corresponding to the user input operation, and the processor is further configured to:
- control a movement speed of moving the pointer icon according to the force information.
- With reference to the third aspect, in a third possible implementation manner of the third aspect, the control information includes angle information, the angle information includes a longitudinal rotation angle of the smart watch that is worn on an arm and that rotates about an axial direction of the arm, and a transverse rotation angle of the smart watch about an axial direction that is perpendicular to the arm by using an angle of the smart watch at a preset starting point as a reference when the arm is horizontally placed in front of a body, and the processor is specifically configured to:
- control the pointer icon to move, starting from the preset starting point, by a distance of p×β along an X-axis of the man-machine interface, and by a distance of p×α along a Y-axis of the man-machine interface, wherein α represents the longitudinal rotation angle, β represents the transverse rotation angle, and p is a preset constant.
- With reference to any one of the third aspect or the first to the third possible implementation manners of the third aspect, in a fourth possible implementation manner of the third aspect, the intelligent device is smart glasses.
- According to a fourth aspect, an embodiment of the present invention provides an intelligent device, including:
- a receiver, where the receiver is configured to receive control information sent by a smart watch, and the control information is an angle of the smart watch that is worn on an arm and that rotates about an axial direction of the arm when the arm is horizontally placed in front of a body; and
- a processor, where the processor is configured to control scrolling of menus of the intelligent device on a man-machine interface of the intelligent device according to the control information, where a quantity of the scrolled menus on the man-machine interface depends on a magnitude of the angle.
- With reference to the fourth aspect, in a first possible implementation manner of the fourth aspect, the intelligent device is smart glasses.
- According to a fifth aspect, an embodiment of the present invention provides an intelligent interaction system, including:
- a smart watch, configured to generate control information according to a user input operation that is received; and
- the intelligent device according to any one of the third aspect or the fourth aspect, where
- the smart watch is in communication connection with the intelligent device.
- According to the intelligent interaction method, the device, and the system in the embodiments of the present invention, a smart watch is used as a recipient of a user input operation. A structure of the smart watch is used to convert the user input operation into control information, so as to control movement of a pointer icon on a man-machine interface of smart glasses, implementing interaction between a user and the smart glasses. The interaction method is not limited by a structure of the smart glasses. Therefore, functions that can be implemented by the interaction method are greatly increased. In addition, the interaction method is not limited by a scenario, thereby improving convenience of interaction between the user and the smart glasses.
- To describe the technical solutions in the embodiments of the present invention or in the prior art more clearly, the following briefly describes the accompanying drawings required for describing the embodiments or the prior art. Apparently, the accompanying drawings in the following description show merely some embodiments of the present invention, and a person of ordinary skill in the art may still derive other drawings from these accompanying drawings without creative efforts.
- FIG. 1 is a diagram of an example of an application scenario of an intelligent interaction method according to the present invention;
- FIG. 2 is a flowchart of Embodiment 1 of an intelligent interaction method according to the present invention;
- FIG. 3A is a diagram of an example of a correspondence between force information (S) and a pressing force (F) in Embodiment 2 of an intelligent interaction method according to the present invention;
- FIG. 3B is a diagram of another example of a correspondence between force information (S) and a pressing force (F) in Embodiment 2 of an intelligent interaction method according to the present invention;
- FIG. 4 is a diagram of an example of another application scenario of an intelligent interaction method according to the present invention;
- FIG. 5 is a flowchart of Embodiment 2 of an intelligent interaction method according to the present invention;
- FIG. 6 is a diagram of an example of still another application scenario of an intelligent interaction method according to the present invention;
- FIG. 7 is a diagram of an example of yet another application scenario of an intelligent interaction method according to the present invention; and
- FIG. 8 is a schematic structural diagram of Embodiment 1 of an intelligent device according to the present invention.
- The following clearly and completely describes the technical solutions in the embodiments of the present invention with reference to the accompanying drawings in the embodiments of the present invention. Apparently, the described embodiments are merely some but not all of the embodiments of the present invention. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments of the present invention without creative efforts shall fall within the protection scope of the present invention.
- Smart glasses are also known as intelligent glasses. Smart glasses have an independent operating system, on which a user may install programs such as software and games provided by software service providers. Functions such as adding an agenda item, map navigation, interacting with a friend, photographing and videotaping, and carrying out a video call with a friend may be completed by means of voice or manual operation, and wireless network access may be implemented by using a mobile communications network.
- A basic architecture of the smart glasses includes a parallel frame that can be placed transversely on the bridge of the nose, a touchpad disposed on a leg of the frame, a computer in the form of a wide strip located on the right side of the frame, and a transparent display screen.
- The technical solutions in the embodiments of the present invention are applicable to a scenario in which smart glasses and a smart watch are worn simultaneously and the smart glasses need an external pointer tool, and to a scenario in which a portable intelligent device, for example, a personal computer (Personal Computer, PC for short), has no mouse and needs an external pointer tool.
- An embodiment of the present invention provides an intelligent interaction system. The intelligent interaction system includes: a smart watch and an intelligent device. The smart watch is configured to generate control information according to a user input operation that is received. The intelligent device is any intelligent device described below. The smart watch is in communication connection with the intelligent device. As shown in
FIG. 1, in this example, the intelligent device is described by using smart glasses as an example. Communication between the smart watch and the smart glasses is performed over Bluetooth (Bluetooth, BT for short) or Bluetooth Low Energy (Bluetooth Low Energy, BLE for short).
- FIG. 2 is a flowchart of Embodiment 1 of an intelligent interaction method according to the present invention. An embodiment of the present invention provides an intelligent interaction method, to implement interaction between a user and smart glasses. The method may be executed by any apparatus for executing an intelligent interaction method, and the apparatus may be implemented by means of software and/or hardware. In this embodiment, the apparatus may be integrated into the smart glasses, and a pointer icon is set on a man-machine interface of the smart glasses. As shown in FIG. 2, the method includes:
- S101: The smart glasses receive control information sent by a smart watch, where the control information is generated by the smart watch according to a user input operation that is received.
- S102: The smart glasses control movement of the pointer icon according to the control information.
- Specifically, a user performs touch input on a touchscreen of the smart watch or performs key-press input on the smart watch. The smart watch generates, according to the user input operation (including the touch input and the key-press input), control information that is used to control a pointer icon on a man-machine interface (Man-Machine Interface, MMI for short) of the smart glasses, and sends the control information to the smart glasses. The smart watch may send the control information to the smart glasses over BT or BLE, but the present invention is not limited thereto.
- The smart glasses control movement of the pointer icon according to the control information, so that the user interacts with the smart glasses.
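- The patent text does not fix a concrete wire format for this control information. As a hedged illustration only, the following Python sketch (the ControlInfo record and the encode/decode helpers are assumptions, not anything defined by the patent) shows one way the displacement, force, and angle variants described in the embodiments could be packaged on the smart watch and parsed on the smart glasses after transmission over BT or BLE:

```python
import json
from dataclasses import dataclass, asdict
from typing import Optional, Tuple

@dataclass
class ControlInfo:
    """Illustrative control-information record generated by the smart watch.

    Only the fields relevant to a given embodiment are populated:
      - displacement: (dx, dy) read from the touchscreen
      - force: force information S derived from the pressing force
      - angles: (alpha, beta) longitudinal and transverse rotation angles
    """
    displacement: Optional[Tuple[float, float]] = None
    force: Optional[float] = None
    angles: Optional[Tuple[float, float]] = None

def encode(info: ControlInfo) -> bytes:
    """Serialize the record so it can be sent over a BT/BLE link."""
    return json.dumps(asdict(info)).encode("utf-8")

def decode(payload: bytes) -> ControlInfo:
    """Parse a received payload back into a ControlInfo record on the glasses side."""
    fields = json.loads(payload.decode("utf-8"))
    return ControlInfo(
        displacement=tuple(fields["displacement"]) if fields["displacement"] else None,
        force=fields["force"],
        angles=tuple(fields["angles"]) if fields["angles"] else None,
    )

# Example: the watch packages a leftward flick together with a pressing-force value.
payload = encode(ControlInfo(displacement=(-12.0, 0.0), force=2.0))
print(decode(payload))
```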
- It should be noted that a pointer icon is set on the man-machine interface of the smart glasses, where the pointer icon is, for example, a cursor. The setting described herein may be implemented by software or by installing an application (Application, APP for short) in the smart glasses. For example, a layer is suspended on the man-machine interface, and the layer is configured to display the pointer icon.
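- As a minimal sketch of this suspended-layer idea (the class and method names below are assumptions for illustration, not part of the patent), an overlay layer can simply track a pointer position and be drawn above the regular man-machine interface content:

```python
class PointerOverlay:
    """A layer suspended above the man-machine interface that only draws the pointer icon."""

    def __init__(self, width: int, height: int):
        self.width, self.height = width, height
        self.x, self.y = width // 2, height // 2   # start at the center of the interface

    def move_by(self, dx: float, dy: float) -> None:
        """Apply a displacement received as control information, clamped to the screen."""
        self.x = max(0, min(self.width - 1, int(self.x + dx)))
        self.y = max(0, min(self.height - 1, int(self.y + dy)))

    def render(self) -> str:
        """Stand-in for compositing the layer over the MMI."""
        return f"cursor at ({self.x}, {self.y})"

overlay = PointerOverlay(640, 360)
overlay.move_by(-12, 0)   # leftward flick reported by the smart watch
print(overlay.render())   # cursor at (308, 180)
```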
- According to this embodiment of the present invention, a smart watch is used as a recipient of a user input operation. A structure of the smart watch is used to convert the user input operation into control information, so as to control movement of a pointer icon on a man-machine interface of smart glasses, implementing interaction between a user and the smart glasses. The interaction method is not limited by a structure of the smart glasses. Therefore, functions that can be implemented by the interaction method are greatly increased. In addition, the interaction method is not limited by a scenario, thereby improving convenience of interaction between the user and the smart glasses.
- The following describes the technical solutions of the present invention in detail by using several specific embodiments.
- In an embodiment, the control information may include displacement information. The displacement information may include a displacement that corresponds to the user input operation and that is obtained by a touchscreen of the smart watch. In this embodiment, S102 may include: controlling, by the smart glasses, a pointer icon to move by the foregoing displacement.
- Specifically, the smart watch reads a built-in sensor of the touchscreen, obtains coordinate information related to the contact positions of a finger, and obtains the displacement information by means of coordinate calculation. The smart watch obtains a displacement D1 = ((X2−X1), (Y2−Y1)) by comparing the coordinates P1 (X1, Y1) and P2 (X2, Y2) of the contact positions of the finger collected at two successive times. The smart watch can send the coordinate or displacement information to the smart glasses (or a PC) over BT or BLE. Correspondingly, the smart glasses obtain the displacement information from the smart watch. After entering an application function of the pointer icon, the smart glasses (or PC) implement corresponding movement of the pointer icon on the MMI according to the obtained coordinate or displacement information. For example, as shown in
FIG. 1, the finger flicks left (in the arrow direction) on the touchscreen of the smart watch, and correspondingly, the pointer icon on the MMI of the smart glasses moves left.
- Based on the above, the control information may further include force information. The force information is used to represent a pressing force corresponding to the user input operation. The smart watch reads a built-in force sensor of the touchscreen and obtains a pressing force. In this case, the controlling, by the smart glasses, of the pointer icon to move by the foregoing displacement may include: controlling, by the smart glasses according to the force information, a movement speed of the pointer icon to implement interaction between a user and the smart glasses, where a magnitude of the force information determines the movement speed of the pointer icon. For example, the movement speed of the pointer icon may increase as the force information increases, or the movement speed of the pointer icon may decrease as the force information increases.
- It should be further noted that the force information may be consecutive or segmentally discrete. A correspondence between the force information and the pressing force includes multiple types. For example, the force information (S) is directly proportional to the pressing force (F), as shown in FIG. 3A; or the correspondence between the force information (S) and the pressing force (F) is one-to-many, as shown in FIG. 3B. In the latter case, the force information corresponding to pressing forces ranging from 0 to F1 is 0, the force information corresponding to pressing forces ranging from F1 to F2 is S1, and so on. A displacement of the pointer icon on the smart glasses or the PC in a corresponding time period is D2 = a×S×D1 (a is a preset constant), where S may be S1, S2, or S3.
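- As a hedged sketch of this first embodiment (the helper names and the concrete threshold values are assumptions, not specified by the patent), the following Python code derives the displacement D1 from two touch samples, maps a pressing force F to segmentally discrete force information S as in FIG. 3B, and scales the result by D2 = a×S×D1:

```python
from typing import Tuple

def displacement(p1: Tuple[float, float], p2: Tuple[float, float]) -> Tuple[float, float]:
    """D1 = ((X2 - X1), (Y2 - Y1)) from two successive touch coordinates."""
    return (p2[0] - p1[0], p2[1] - p1[1])

def force_information(f: float,
                      thresholds=(1.0, 2.0, 3.0),
                      levels=(0.0, 1.0, 2.0, 3.0)) -> float:
    """Segmentally discrete mapping of pressing force F to force information S.

    Forces in [0, F1) map to 0, forces in [F1, F2) map to S1, and so on.
    The threshold and level values used here are arbitrary illustrative constants.
    """
    s = levels[0]
    for threshold, level in zip(thresholds, levels[1:]):
        if f >= threshold:
            s = level
    return s

def scaled_displacement(d1: Tuple[float, float], f: float, a: float = 1.5) -> Tuple[float, float]:
    """D2 = a * S * D1, where a is a preset constant."""
    s = force_information(f)
    return (a * s * d1[0], a * s * d1[1])

# A left flick (P1 -> P2) with a moderate pressing force moves the pointer left, faster.
d1 = displacement((120.0, 80.0), (100.0, 80.0))   # (-20.0, 0.0)
print(scaled_displacement(d1, f=2.4))              # (-60.0, 0.0) with S = 2.0 and a = 1.5
```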
- Referring to
FIG. 4 , a gyroscope in a smart watch is used as a sensor of a pointer icon of smart glasses. Using an angle during calibration as a reference, when a rotation angle of the smart watch about an axial direction of a forearm is α, and a rotation angle of the smart watch about an axis that is perpendicular to the forearm is β, a displacement D (X, Y) of the pointer icon of a target device (may be the smart glasses, a PC, or the like) relative to an original point (for example, a geometric center point of an eyeglass of the smart glasses) to a destination point can be obtained by means of DY=p×α and DX=p×β. -
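- The following sketch illustrates how these formulas might be applied (the function name and the value of the preset constant p are assumptions for illustration; reading the gyroscope itself is outside the scope of this snippet):

```python
from typing import Tuple

def pointer_displacement(alpha_deg: float, beta_deg: float, p: float = 4.0) -> Tuple[float, float]:
    """Map calibrated rotation angles to a pointer displacement on the man-machine interface.

    alpha_deg: longitudinal rotation about the axial direction of the forearm (degrees)
    beta_deg:  transverse rotation about an axis perpendicular to the forearm (degrees)
    p:         preset constant (here interpreted as pixels per degree)
    Returns (DX, DY) with DX = p * beta and DY = p * alpha, measured from the
    preset starting point (for example, the geometric center of an eyeglass).
    """
    return (p * beta_deg, p * alpha_deg)

# Rotating the watch 5 degrees about the forearm and 10 degrees transversely
# moves the pointer 40 px along the X-axis and 20 px along the Y-axis.
print(pointer_displacement(alpha_deg=5.0, beta_deg=10.0))  # (40.0, 20.0)
```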
- FIG. 5 is a flowchart of Embodiment 2 of an intelligent interaction method according to the present invention. This embodiment of the present invention provides an intelligent interaction method, to implement interaction between a user and smart glasses. The method may be executed by any apparatus for executing an intelligent interaction method, and the apparatus may be implemented by means of software and/or hardware. In this embodiment, the apparatus may be integrated into smart glasses. As shown in FIG. 5, the method includes:
- S501: Smart glasses receive control information sent by a smart watch, where the control information is an angle of the smart watch that is worn on an arm and that rotates about an axial direction of the arm when the arm is horizontally placed in front of a body.
- S502: The smart glasses control scrolling of menus of the smart glasses on a man-machine interface of the smart glasses according to the control information, where a quantity of the scrolled menus on the man-machine interface depends on a magnitude of the angle.
- This embodiment is applicable to smart glasses that are operated by scrolling menus, for example, early Google Glass.
- In this embodiment, the smart glasses can control the scrolling of their menus according to the control information, to implement interaction between a user and the smart glasses. According to this embodiment, the scrolling of the menus of the smart glasses is driven by sensing variations in the orientation of the smart watch, for example, an upward, downward, leftward, or rightward tilt of the smart watch.
- As shown in FIG. 6, just as when a user looks at a smart watch in a normal posture, the left arm is placed horizontally in front of the body, and the smart watch worn on the left arm rotates about the axial direction of the left arm. The smart watch obtains the angle by which it rotates about the axial direction of the left arm by reading data from its gyroscope, and then transfers control information to the smart glasses over a BT or BLE link, so as to determine the quantity of scrolled menus on the smart glasses. The currently selected menu is highlighted on the man-machine interface (MMI) of the smart glasses to remind the user of the current menu status, so that the user can adjust the rotation angle of the smart watch until a pre-selected menu is reached.
- According to this embodiment of the present invention, a smart watch is used as the recipient of a user input operation. The structure of the smart watch is used to convert the user input operation into control information, so as to control the display of a menu on the man-machine interface of the smart glasses, implementing interaction between the user and the smart glasses. The interaction method is not limited by the structure of the smart glasses; therefore, the functions that can be implemented by the interaction method are greatly increased. In addition, the interaction method is not limited to a particular scenario, thereby improving the convenience of interaction between the user and the smart glasses.
- Furthermore, the smart glasses can also receive startup information sent by the smart watch. On this basis, after the smart glasses detect the startup information, a menu is displayed on the man-machine interface. For example, when the smart watch detects that a finger of the user touches its touchscreen, it transfers the startup information to the smart glasses, so as to enter the menu of the smart glasses.
- It should be further noted that the user can also trigger different functions and shortcut keys of the smart glasses by knocking different side faces of the smart watch. For example, using the gesture of placing the left arm transversely and looking at the watch as a reference, knocking the top-left side of the smart watch can implement a confirmation action on the smart glasses. As another example, in a PC application, using the same gesture as a reference, knocking the top-left, top-right, and bottom-left sides of the smart watch can respectively implement a left mouse button, a calibration key, and a right mouse button, as shown in FIG. 7.
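- Such a knock-to-action assignment amounts to a small lookup table. The sketch below simply restates the PC example above in code; the enumeration values and action names are illustrative, and how the watch detects on which side face it was knocked is outside the scope of this sketch.

```python
from enum import Enum, auto


class KnockSide(Enum):
    TOP_LEFT = auto()
    TOP_RIGHT = auto()
    BOTTOM_LEFT = auto()


# PC-application mapping from the example above (reference gesture: left arm placed
# transversely in front of the body, user looking at the watch).
PC_KNOCK_ACTIONS = {
    KnockSide.TOP_LEFT: "left_mouse_button",
    KnockSide.TOP_RIGHT: "calibration_key",
    KnockSide.BOTTOM_LEFT: "right_mouse_button",
}


def handle_knock(side: KnockSide) -> str:
    """Translate a detected knock on a side face of the watch into a target-device action."""
    return PC_KNOCK_ACTIONS.get(side, "no_action")


print(handle_knock(KnockSide.TOP_RIGHT))  # -> "calibration_key"
```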
- FIG. 8 is a schematic structural diagram of Embodiment 1 of an intelligent device according to the present invention. This embodiment of the present invention provides an intelligent device, to implement interaction between a user and the intelligent device. As shown in FIG. 8, the intelligent device 80 includes a receiver 81 and a processor 82. - The
receiver 81 is configured to receive control information sent by a smart watch. The control information is generated by the smart watch according to a user input operation that is received. The processor 82 is configured to control movement of a pointer icon according to the control information. The pointer icon is set on a man-machine interface of the intelligent device 80. - The intelligent device in this embodiment may be configured to execute the technical solution of the method embodiment shown in
FIG. 2; the implementation principles and technical effects thereof are similar, and details are not described herein again. - In an implementation manner, the control information includes displacement information. The displacement information may include a displacement that corresponds to the user input operation and that is obtained by a touchscreen of the smart watch. The
processor 82 may be specifically configured to: control the pointer icon to move by the displacement. - Furthermore, the control information may further include force information. The force information is used to represent a pressing force corresponding to the user input operation. The
processor 82 may be further configured to: control, according to the force information, a movement speed of the pointer icon, to implement interaction between the user and the intelligent device 80, where the magnitude of the force information determines the movement speed of the pointer icon. - A correspondence between the force information and the pressing force may include at least the following types: the force information being directly proportional to the pressing force, a one-to-many correspondence between the force information and the pressing force, or the like.
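- A rough structural sketch of how such a device might separate the receiver and processor roles, and dispatch on the kind of control information it receives, is shown below. The class, message fields, and constants are assumptions made for illustration and are not part of the disclosure; the constants p and a mirror the earlier sketches.

```python
from dataclasses import dataclass
from typing import Optional, Tuple


@dataclass
class ControlInfo:
    """Illustrative control-information message received from the smart watch."""
    displacement: Optional[Tuple[float, float]] = None  # (dx, dy) from the watch touchscreen
    force_level: Optional[float] = None                 # quantized force information S
    angles: Optional[Tuple[float, float]] = None        # (alpha, beta) rotation angles


class IntelligentDevice:
    """Sketch of the receiver/processor split: receive() hands each message to process()."""

    def __init__(self, p: float = 4.0, a: float = 0.5):
        self.pointer = [0.0, 0.0]
        self.p = p  # preset constant for the angle mapping
        self.a = a  # preset constant for the force-based scaling

    def receive(self, info: ControlInfo) -> None:
        # Receiver 81: accept control information sent by the smart watch.
        self.process(info)

    def process(self, info: ControlInfo) -> None:
        # Processor 82: move the pointer icon according to the control information.
        if info.angles is not None:
            alpha, beta = info.angles
            self.pointer = [self.p * beta, self.p * alpha]  # absolute position from angles
        elif info.displacement is not None:
            dx, dy = info.displacement
            scale = self.a * info.force_level if info.force_level else 1.0
            self.pointer[0] += scale * dx                   # relative move, force-scaled
            self.pointer[1] += scale * dy


device = IntelligentDevice()
device.receive(ControlInfo(displacement=(10.0, 0.0), force_level=2.0))
print(device.pointer)  # -> [10.0, 0.0]
```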
- In another implementation manner, the control information may include angle information. The angle information may include a longitudinal rotation angle of the smart watch, which is worn on an arm, about the axial direction of the arm, and a transverse rotation angle of the smart watch about an axis that is perpendicular to the arm, using the angle of the smart watch at a preset starting point as a reference when the arm is horizontally placed in front of the body. A pointer icon is set on a man-machine interface of the smart glasses. The
processor 82 may be specifically configured to: control the pointer icon to move, starting from the preset starting point, by a distance of p×β along an X-axis of the man-machine interface and by a distance of p×α along a Y-axis of the man-machine interface, where α represents the longitudinal rotation angle, β represents the transverse rotation angle, p is a preset constant, and the X-axis and the Y-axis are perpendicular to each other on the man-machine interface. - It should be further noted that the
intelligent device 80 may be smart glasses. - Referring to the structure shown in
FIG. 8, the receiver 81 is configured to receive control information sent by a smart watch. The control information is the angle by which the smart watch, worn on an arm that is horizontally placed in front of the body, rotates about the axial direction of the arm. The processor 82 is configured to control scrolling of menus of the intelligent device on a man-machine interface of the intelligent device 80 according to the control information, where the quantity of scrolled menus on the man-machine interface depends on the magnitude of the angle. - The intelligent device in this embodiment may be configured to execute the technical solution of the method embodiment shown in
FIG. 5; the implementation principles and technical effects thereof are similar, and details are not described herein again. - In the several embodiments provided in the present application, it should be understood that the disclosed apparatus and method may be implemented in other manners. For example, the described device embodiment is merely an example; the unit or module division is merely logical function division and may be other division in actual implementation; and a plurality of units or modules may be combined or integrated into another system, or some features may be ignored or not performed. In addition, the displayed or discussed mutual couplings, direct couplings, or communication connections may be implemented through some interfaces. The indirect couplings or communication connections between the devices or modules may be implemented in electronic, mechanical, or other forms.
- The modules described as separate parts may or may not be physically separate, and parts displayed as modules may or may not be physical modules, may be located in one position, or may be distributed on a plurality of network units. Some or all of the modules may be selected according to actual needs to achieve the objectives of the solutions of the embodiments.
- Persons of ordinary skill in the art may understand that all or some of the steps of the method embodiments may be implemented by a program instructing relevant hardware. The program may be stored in a computer-readable storage medium. When the program runs, the steps of the method embodiments are performed. The foregoing storage medium includes: any medium that can store program code, such as a ROM, a RAM, a magnetic disk, or an optical disc.
- Finally, it should be noted that the foregoing embodiments are merely intended for describing the technical solutions of the present invention, but not for limiting the present invention. Although the present invention is described in detail with reference to the foregoing embodiments, persons of ordinary skill in the art should understand that they may still make modifications to the technical solutions described in the foregoing embodiments or make equivalent replacements to some or all technical features thereof, without departing from the scope of the technical solutions of the embodiments of the present invention.
Claims (12)
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/CN2015/074743 WO2016149873A1 (en) | 2015-03-20 | 2015-03-20 | Intelligent interaction method, equipment and system |
Publications (1)
Publication Number | Publication Date |
---|---|
US20180253213A1 true US20180253213A1 (en) | 2018-09-06 |
Family
ID=56979100
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/559,691 Abandoned US20180253213A1 (en) | 2015-03-20 | 2015-03-20 | Intelligent Interaction Method, Device, and System |
Country Status (6)
Country | Link |
---|---|
US (1) | US20180253213A1 (en) |
EP (1) | EP3264203A4 (en) |
JP (1) | JP2018508909A (en) |
KR (1) | KR20170124593A (en) |
CN (1) | CN107209483A (en) |
WO (1) | WO2016149873A1 (en) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2021145614A1 (en) * | 2020-01-14 | 2021-07-22 | 삼성전자 주식회사 | Electronic device for controlling external electronic device and method thereof |
US20220373863A1 (en) * | 2021-05-24 | 2022-11-24 | Lanto Electronic Limited | Photographing glasses |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20080165138A1 (en) * | 2004-12-31 | 2008-07-10 | Lenovo (Beijing) Limited | Information Input Device for Portable Electronic Apparatus and Control Method |
US20150123895A1 (en) * | 2013-11-05 | 2015-05-07 | Seiko Epson Corporation | Image display system, method of controlling image display system, and head-mount type display device |
US20150254882A1 (en) * | 2014-03-06 | 2015-09-10 | Ram Industrial Design, Inc. | Wireless immersive experience capture and viewing |
Family Cites Families (20)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH1115594A (en) * | 1997-06-20 | 1999-01-22 | Masanobu Kujirada | Three-dimensional pointing device |
KR20060065344A (en) * | 2004-12-10 | 2006-06-14 | 엘지전자 주식회사 | Means for jog-dial of mobile phone using camera and method thereof |
US8436810B2 (en) * | 2006-03-21 | 2013-05-07 | Koninklijke Philips Electronics N.V. | Indication of the condition of a user |
CN101984396A (en) * | 2010-10-19 | 2011-03-09 | 中兴通讯股份有限公司 | Method for automatically identifying rotation gesture and mobile terminal thereof |
KR20120105818A (en) * | 2011-03-16 | 2012-09-26 | 한국전자통신연구원 | Information input apparatus based events and method thereof |
US8194036B1 (en) * | 2011-06-29 | 2012-06-05 | Google Inc. | Systems and methods for controlling a cursor on a display using a trackpad input device |
JP5762892B2 (en) * | 2011-09-06 | 2015-08-12 | ビッグローブ株式会社 | Information display system, information display method, and information display program |
JP5576841B2 (en) * | 2011-09-09 | 2014-08-20 | Kddi株式会社 | User interface device capable of zooming image by pressing, image zoom method and program |
CN103946732B (en) * | 2011-09-26 | 2019-06-14 | 微软技术许可有限责任公司 | Video based on the sensor input to perspective, near-eye display shows modification |
JP2013125247A (en) * | 2011-12-16 | 2013-06-24 | Sony Corp | Head-mounted display and information display apparatus |
JP2013210963A (en) * | 2012-03-30 | 2013-10-10 | Denso Corp | Display control device and program |
CN103513908B (en) * | 2012-06-29 | 2017-03-29 | 国际商业机器公司 | For controlling light target method and apparatus on the touchscreen |
US20140152558A1 (en) * | 2012-11-30 | 2014-06-05 | Tom Salter | Direct hologram manipulation using imu |
US20140198034A1 (en) * | 2013-01-14 | 2014-07-17 | Thalmic Labs Inc. | Muscle interface device and method for interacting with content displayed on wearable head mounted displays |
CN103116411B (en) * | 2013-02-05 | 2015-12-09 | 上海飞智电子科技有限公司 | The method and system of positioning pointer position |
WO2014185146A1 (en) * | 2013-05-15 | 2014-11-20 | ソニー株式会社 | Display control device, display control method, and recording medium |
CN103309226B (en) * | 2013-06-09 | 2016-05-11 | 深圳先进技术研究院 | The intelligent watch that coordinates intelligent glasses to use |
CN103440097A (en) * | 2013-07-29 | 2013-12-11 | 康佳集团股份有限公司 | Method and terminal for controlling touch pad cursor to slide on the basis of pressure |
EP2843507A1 (en) * | 2013-08-26 | 2015-03-04 | Thomson Licensing | Display method through a head mounted device |
CN104317491B (en) * | 2014-09-30 | 2018-03-30 | 北京金山安全软件有限公司 | Display content control method and device and mobile terminal |
- 2015-03-20 KR KR1020177028787A patent/KR20170124593A/en not_active Application Discontinuation
- 2015-03-20 CN CN201580071192.8A patent/CN107209483A/en active Pending
- 2015-03-20 US US15/559,691 patent/US20180253213A1/en not_active Abandoned
- 2015-03-20 JP JP2017549229A patent/JP2018508909A/en active Pending
- 2015-03-20 EP EP15885812.6A patent/EP3264203A4/en not_active Withdrawn
- 2015-03-20 WO PCT/CN2015/074743 patent/WO2016149873A1/en active Application Filing
Also Published As
Publication number | Publication date |
---|---|
EP3264203A4 (en) | 2018-07-18 |
EP3264203A1 (en) | 2018-01-03 |
JP2018508909A (en) | 2018-03-29 |
WO2016149873A1 (en) | 2016-09-29 |
KR20170124593A (en) | 2017-11-10 |
CN107209483A (en) | 2017-09-26 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| AS | Assignment | Owner name: HUAWEI TECHNOLOGIES CO., LTD., CHINA Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:WANG, XILIN;REEL/FRAME:043705/0256 Effective date: 20170925 |
| STPP | Information on status: patent application and granting procedure in general | Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
| STPP | Information on status: patent application and granting procedure in general | Free format text: FINAL REJECTION MAILED |
| STPP | Information on status: patent application and granting procedure in general | Free format text: RESPONSE AFTER FINAL ACTION FORWARDED TO EXAMINER |
| STPP | Information on status: patent application and granting procedure in general | Free format text: ADVISORY ACTION MAILED |
| STCB | Information on status: application discontinuation | Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |