WO2017112714A1 - Combination computer keyboard and computer pointing device
- Publication number
- WO2017112714A1 (PCT application PCT/US2016/067890)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- touch
- keyboard
- data packet
- key
- computer
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/02—Input arrangements using manually operated switches, e.g. using keyboards or dials
- G06F3/0202—Constructional details or processes of manufacture of the input device
- G06F3/021—Arrangements integrating additional peripherals in a keyboard, e.g. card or barcode reader, optical scanner
- G06F3/0213—Arrangements providing an integrated pointing device in a keyboard, e.g. trackball, mini-joystick
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/033—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
- G06F3/038—Control and interface arrangements therefor, e.g. drivers or device-embedded control circuitry
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
- G06F3/044—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by capacitive means
- G06F3/0445—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by capacitive means using two or more layers of sensing electrodes, e.g. using two layers of electrodes separated by a dielectric layer
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
- G06F3/044—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by capacitive means
- G06F3/0446—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by capacitive means using a grid-like structure of electrodes in at least two directions, e.g. using row and column electrodes
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G06F3/04883—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0489—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using dedicated keyboard keys or combinations thereof
- G06F3/04892—Arrangements for controlling cursor position based on codes indicative of cursor displacements from one discrete location to another, e.g. using cursor control keys associated to different directions or using the tab key
-
- H—ELECTRICITY
- H03—ELECTRONIC CIRCUITRY
- H03K—PULSE TECHNIQUE
- H03K17/00—Electronic switching or gating, i.e. not by contact-making and -breaking
- H03K17/94—Electronic switching or gating, i.e. not by contact-making and -breaking characterised by the way in which the control signals are generated
- H03K17/945—Proximity switches
- H03K17/955—Proximity switches using a capacitive detector
-
- H—ELECTRICITY
- H03—ELECTRONIC CIRCUITRY
- H03K—PULSE TECHNIQUE
- H03K17/00—Electronic switching or gating, i.e. not by contact-making and -breaking
- H03K17/94—Electronic switching or gating, i.e. not by contact-making and -breaking characterised by the way in which the control signals are generated
- H03K17/96—Touch switches
- H03K17/962—Capacitive touch switches
- H03K17/9622—Capacitive touch switches using a plurality of detectors, e.g. keyboard
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2203/00—Indexing scheme relating to G06F3/00 - G06F3/048
- G06F2203/048—Indexing scheme relating to G06F3/048
- G06F2203/04806—Zoom, i.e. interaction techniques or interactors for controlling the zooming operation
Definitions
- the present embodiment relates generally to computer input devices, and more particularly, to a touch sensitive keyboard configured to perform at least one action by combining a touch screen gesture with a plurality of discrete keys without utilizing a conventional pointing device.
- the preferred embodiment of the present invention provides a keyboard system for displaying at least one action in response to at least one touch input without utilizing a pointing device.
- the keyboard system includes a touch sensitive keyboard, a touch event processor and at least one graphical interface device (GID).
- the touch sensitive keyboard in one embodiment includes a plurality of discrete keys and a plurality of sensor pads.
- each of the plurality of discrete keys includes a key press switch and a keycap where the top of the key incorporates a capacitive touch sensor pad whereby the proximity or direct touch of a finger can be detected.
- a plurality of sensor pads is arranged around and between each of the plurality of discrete keys of the touch sensitive keyboard.
- the touch sensitive surface is implemented by deploying a plurality of driver lines orthogonal to a plurality of sensor lines.
- the purpose of the sensors is to enable the determination of the location and movement of fingers touching the surface of the keys.
- the invention employs a separate keypress mechanism that registers when keys are depressed, as in the normal course of typing.
- touch event refers to a user's action of touching, tapping, or sliding one or more fingers on the surface of the keys without depressing the keys.
- keypress refers to completely depressing a key as would traditionally be done in a conventional keyboard to register a conventional key press.
- a keyboard touch is generated by a touch on at least one of the plurality of discrete keys and optional motion along the surface of the keys, in other words, a touch sensitive gesture.
- the plurality of sensor pads is configured to transfer the keyboard touch input signal.
- Each of the plurality of sensor pads includes a via connected to an underlying circuit board, either rigid or in some embodiments flexible, which routes the signals from the plurality of sensors either under or around the keys to a processor where the capacitance of each sensor is measured.
- the top layer of the circuit board is the sensor layer;
- the second is a ground plane layer which insulates the traces in the routing layer underneath from background interference.
- Each of the plurality of sensor pads, whether located on the surface of each key or to the side of each key incorporated into the frame of the keyboard, is connected through a circuit board ground layer to one or more routing layers where circuit traces connect the sensors to the processor.
- the raw capacitance values of the plurality of sensors are measured and collected by the touch determination module.
- This module combines this information to determine touches, much as a touchpad or a multitouch mouse does.
- the touch determination module may need to be modified for each different hardware configuration so that the output of this module is consistent across hardware implementations.
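As an illustration of what such a module does, here is a minimal sketch (not the patent's implementation): raw per-sensor capacitance counts are baseline-subtracted, thresholded, and reduced to a weighted-centroid touch position, so that downstream modules see the same output regardless of sensor hardware. The names and the threshold value are assumptions of this sketch.

```python
# Hypothetical touch determination step. All names and constants are
# illustrative; a real implementation would be tuned per hardware design.
TOUCH_THRESHOLD = 40  # counts above baseline that count as a touch

def determine_touch(raw, baseline, positions):
    """raw, baseline: {sensor_id: capacitance counts}
    positions: {sensor_id: (x, y)} sensor locations on the key surface.
    Returns one (x, y) touch centroid, or None if nothing is touched."""
    signal = {s: raw[s] - baseline[s] for s in raw}
    active = {s: v for s, v in signal.items() if v > TOUCH_THRESHOLD}
    if not active:
        return None
    total = sum(active.values())
    # weight each sensor position by its signal strength
    x = sum(positions[s][0] * v for s, v in active.items()) / total
    y = sum(positions[s][1] * v for s, v in active.items()) / total
    return (x, y)
```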
- a data packet describing the touch information is sent to a low-level event module
- the low-level event module combines the touch information into conventional mouse-touch events understood by existing GID systems. This information is described as a translated data packet and includes meaningful pointing device events such as mouseDown, mouseMove, touchBegin, etc. In practice, there is often a stream of data packets (and a corresponding stream of translated data packets) as the user uses the computer.
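The patent does not specify a wire format for these packets, but a sketch such as the following conveys the shape of the data flowing between the two modules; the field names here are assumptions.

```python
# Illustrative packet shapes only; not the patent's actual format.
from dataclasses import dataclass

@dataclass
class TouchPacket:          # output of the touch determination module
    x: float                # centroid of the touch on the key surface
    y: float
    fingers: int            # number of simultaneous touches
    timestamp: float

@dataclass
class TranslatedPacket:     # output of the low-level event module
    event: str              # e.g. "mouseDown", "mouseMove", "touchBegin"
    dx: float = 0.0         # relative motion, in the manner of a mouse
    dy: float = 0.0
```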
- the translated data packets are sent to the High-Level Event Module, which monitors the output of the low-level event module and provides to the computer new or additional gestures and commands defined using combinations of touches, taps and gestures. For example, tapping 'dw' (that is, quickly touching the 'd' and 'w' keys without fully depressing them) might be mapped by the High-Level Event Module to a command to delete the word in which the text cursor resides. Currently no applications respond to "tap events", so the High-Level Event Module recognizes this sequence and will output a mouse double click followed by the key code indicating the delete key was depressed.
- the High-Level Event Module translates this tap-touch gesture into a series of conventional events: a mouse down event, several mouse move events, a mouse release event and the command key for the copy command. These conventional pointing device and keyboard events are generated by the High-Level Event Module and sent to the user's computer. Thus, the High-Level Event Module enables the definition of new tap touch gestures that generate multiple conventional mouse and keyboard events. Note that the key upon which the tap-touch gesture began may be significant. Tapping 'd' and sliding might delete while tapping 'c' and sliding might copy.
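The expansion step described above is essentially a table lookup from recognized gestures to sequences of conventional events. A minimal sketch, assuming a hypothetical emit() event sink and illustrative gesture names:

```python
# Hypothetical gesture-to-event expansion table. The 'dw' entry follows
# the example above: select the word with a double click, then press
# forward delete. Gesture names and event tuples are assumptions.
GESTURE_MAP = {
    ("tap", "d", "w"): [("mouseDoubleClick",),
                        ("keyPress", "ForwardDelete")],
}

def expand_gesture(gesture, emit):
    """emit(event) sends one conventional mouse or keyboard event."""
    for event in GESTURE_MAP.get(gesture, [gesture]):  # pass through if unmapped
        emit(event)
```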
- the at least one graphical interface device is configured to display the at least one action performed by the computer.
- the keyboard input signal is generated by a touch on at least one of the plurality of discrete keys and a touch sensitive gesture.
- the touch sensitive keyboard enables a user to provide an input through the plurality of discrete keys by a tap-touch.
- the touch sensitive keyboard enables the keyboard system to perform the at least one action by combining the touch screen gesture or mouse gesture with the at least one of the plurality of discrete keys without utilizing a separate pointing device.
- Each of the plurality of sensor pads is connected to the routing layer through the sensor layer and the ground plane layer.
- the at least one touch input may be a low-level event or a high-level event.
- the low-level event is processed by a low-level event module 11 and the high-level event is processed by a high-level event module 12.
- the preferred embodiment includes a method for displaying the at least one action in response to the at least one touch input utilizing the keyboard system.
- the method includes the steps of: providing at least one touch input to the at least one of the plurality of discrete keys of the touch sensitive keyboard; enabling the plurality of sensor pads arranged around each of the plurality of discrete keys to transfer the keyboard input signal to the touch event processor installed with the event translation software; enabling the touch determination module at the touch event processor to interpret the keyboard input signal and to generate a data packet; enabling the low-level event module coupled with the touch determination module to receive the data packet, translate the data packet to a translated data packet, and send it to a High-Level Event Module, which, corresponding to the at least one touch input, either passes said translated data packet to the computer or modifies the translated data packet and passes the modified translated data packet to the computer, thereby instructing the computer to perform at least one action; and enabling the at least one graphical interface device (GID) connected with the touch event processor to display the at least one action performed by the computer in response to the at least one touch input.
- a first objective of the present invention is to provide a keyboard system having a touch sensitive keyboard that enables a user to perform at least one action by combining a touch screen gesture with at least one of a plurality of discrete keys without utilizing a separate pointing device such as a mouse.
- a second objective of the present invention is to provide a keyboard system that equals the ease of use of a pointing device but without the time penalty and other drawbacks involved in repeatedly moving a hand from the keyboard to the pointing device and back.
- a third objective of the present invention is to add a new layer of higher level events which incorporate tap-touch and slide gestures with positional information, providing simple, easy-to-remember, high-level commands that are translated into a sequence of mouse-touch events.
- a further objective of the present invention is to provide a keyboard system that greatly improves the efficiency of computer use.
- FIG. 1 is a high-level block diagram for displaying at least one action in response to at least one touch input without utilizing a pointing device in accordance with the preferred embodiment of the present invention
- FIG. 2 is a diagrammatic top view of a keyboard wherein the top surfaces of the mechanical keys are touch sensitive and when taken altogether make up a large touch sensitive area in accordance with the preferred embodiment of the present invention
- FIG. 3 is a diagrammatic side view of a typical single key in use in accordance with the preferred embodiment of the present invention
- FIG. 4 is a block diagram of a typical switch configuration for reporting key presses in accordance with the preferred embodiment of the present invention.
- FIG. 5A is a block diagram of a plurality of sensor layers of a touch sensitive keyboard;
- FIG. 5B is a block diagram of another embodiment of the plurality of circuit board layers of the touch sensitive keyboard shown in FIG. 5A;
- FIG. 6 is a block diagram of the touch sensitive components of a touch sensitive keyboard in accordance with an alternate embodiment of the present invention.
- FIG. 7 is a perspective front view of two typical keys in use in accordance with the present invention.
- FIG. 8 is a perspective front view of three typical keys in use in accordance with an alternative embodiment of the present invention.
- FIG. 9 is a diagrammatic top view of a keyboard according to one embodiment of the present invention wherein the keyboard is implemented using a contiguous touch sensitive surface as opposed to discrete mechanical keys;
- FIG. 10 is a block diagram showing the logic behind interpreting and translating several low-level events generated by the hardware circuitry into events which can be processed by existing applications in accordance with an embodiment of the present invention
- FIG. 11A is a top view diagrammatic depiction of motion possible by a user's fingers that does not require repositioning the user's hand in use according to a preferred method of the present invention
- FIG. 11B is a diagrammatic depiction of a keyboard layout with discrete left and right mouse buttons according to an alternative embodiment of the present invention.
- FIG. 12 is a diagrammatic depiction of a keyboard in use according to a preferred method embodiment of the present invention, "tap-touch";
- FIG. 13 shows a computer display in use according to a method of the present invention
- FIG. 14 shows a computer display and an exemplary case of the implied shift key
- FIG. 15 shows a computer display depicting a text insertion pointer's direct motion through text in magnetic mouse mode
- FIG. 16 shows a computer display depicting a text insertion pointer's second motion through text in a vertical direction in magnetic mouse mode
- FIG. 17 shows a computer display depicting a pointer's third motion through text in a direction in magnetic mouse mode alongside a depiction of the user's hand during this motion;
- FIG. 18 shows a computer display depicting a pointer's fourth motion through text in magnetic mouse mode alongside a depiction of the user's hand during this motion
- FIG. 19 shows a computer display depicting a pointer's fifth motion through text alongside a depiction of the user's hand during this motion
- FIG. 20 shows a computer display depicting a pointer's sixth motion through text alongside a depiction of the user's hand during this motion
- FIG. 21 shows a computer display depicting a pointer's seventh motion through text alongside a depiction of the user's hand during this motion
- FIG. 22 shows a computer display depicting a pointer's eighth motion through text alongside a depiction of the user's hand during this motion
- FIG. 23 shows a computer display depicting a pointer's ninth motion through text alongside a depiction of the user's hand during this motion
- FIG. 24 shows a computer display depicting a pointer's tenth motion through text alongside a depiction of the user's hand during this motion
- FIG. 25 shows a computer display depicting a pointer's eleventh motion through text alongside a depiction of the user's hand during this motion
- FIG. 26 is a top diagrammatic depiction of a keyboard showing a user's two hands making a first motion
- FIG. 27 is a top diagrammatic depiction of a keyboard showing a user's two hands making a second motion
- FIG. 28 shows a computer display depicting a pointer's twelfth motion through text alongside a depiction of the user's hand during this motion
- FIG. 29 shows the result on a computer display of the text after the user raises her hand after the tap-drag originating on the 'D' key.
- FIG. 30 shows another embodiment of the present invention, illustrating the sensors located on every space of the touch sensitive keyboard.
- GID: Graphical Interface Device. GIDs include Windows and Macintosh computers, mobile operating systems such as Apple's iOS and Google's Android, and any other computing devices with a graphical user interface that accept keyboard input as well as pointing device input in the form of mouse events as defined below.
- Modifiers are the keys on a standard computer keyboard, such as shift, command, option or alt, function, and control, which do not transmit an ASCII code when pressed but which modify the ASCII code transmitted by the remaining keys.
- “Mouse-touch events” shall be understood to be a catch-all phrase that includes well known pointing device position and button state events as generated by a computer mouse, track ball or other pointing device as well as touch events as generated by a touchscreen, touchpad or similar input device.
- Low-level user input events or simply "low-level events" refer to mouse-touch events that are closest to the physical actions of the hardware such as the click of a button, or a touch on a touch sensitive surface.
- high-level events are often combinations of low-level events. They may comprise a gesture and may be associated with a specific meaning or semantics.
- a swipe in the art may be composed of a touch followed by a quick motion in a single direction while remaining in contact with the touch sensitive surface and finishing with the lifting of the fingers from the touch surface.
- the semantics of the swipe may be to progress through an ordered collection such as advancing the current page in a document, scrolling a document up or down within a window, or moving a text cursor to the end or beginning of a line of text.
- Keyboard events shall be understood to include the hardware and software events typically generated by a mechanical, membrane, or virtual keyboard. Keys may be pressed individually, in sequence or in combination with modifier keys.
- "User input semantics" or just "semantics" shall be understood to be the meaning of user interactions with a GID within the current context of a software application or operating system. For example, the semantics of pressing a key on a keyboard might be to insert the associated ASCII character into a document, while a key pressed in conjunction with a modifier key might move the text cursor through the document, delete a line, or issue another command to the editor.
- a "Drag event" shall be understood to mean a subset of mouse-touch events as defined above that indicates the moving of the pointing device while the mouse button is held down or, in a touch environment, while in continuous contact with the surface.
- the semantics of a drag event may be the moving of a selected user interface object, the moving of selected text, or the selecting of text, depending on the current program state when the drag event occurs.
- a "touch sensitive keyboard” shall be understood to mean 1) a computer keyboard with discreet keys where a) the tops of the keys form a single touch sensitive surface that can be used as a pointing device in a similar manner to a touch pad or mouse, 2) a keyboard represented on a touch sensitive pad or touchscreen display (sometimes referred to as a virtual keyboard) where the pressure necessary to register as a mouse-touch event can be distinguished from a greater pressure that registers as a key press.
- the touch sensitive keyboard may be implemented using a tablet computer as long as it is capable of distinguishing more than one level of pressure.
- a keyboard system 1 including a touch sensitive keyboard 2, a touch event processor 3 and at least one graphical interface device (GID) 4 is illustrated.
- the touch sensitive keyboard 2 includes a plurality of discrete keys 7 and a plurality of sensor pads 1004.
- Each of the plurality of discrete keys 7 includes a key cap 110, a key press switch 120 and a touch sensitive top layer 100.
- FIG. 2 shows a keyboard 2 with discrete keys 7 where the top surfaces 100 of the keys 7 are touch sensitive and, when taken altogether, make up a large touch sensitive surface.
- the surface 100 of the keys 7 is thus akin to a large touchpad.
- the user may touch the surface with one or more fingers and may easily move his or her fingers across the surface. Doing so will generate touch movement signals in a similar manner to a touchpad or touch screen without depressing any key 7 far enough to generate a key press.
- FIG. 3 shows a typical single key 7, its keycap 110, and indicates the top of the key 7 may be a touch sensitive surface 100, while at the bottom of the key 7 the switch 120 is triggered when the key 7 is fully depressed.
- FIG. 4 shows a block diagram of a typical sensor configuration for reporting key presses. Key presses are typically detected by a mechanical or membrane switch 10 located under each key 7. Key presses are detected by the key press sensor 20 and converted to standardized electric signals that are transmitted externally from the keyboard, usually by USB (universal serial bus) 70, or alternatively wirelessly through Bluetooth or other networking technologies.
- the plurality of self capacitive sensor pads 1004 is arranged around each of the plurality of discrete keys 7 of the touch sensitive keyboard 2 and covered with a top cosmetic coating 1000.
- the plurality of sensor pads 1004 is configured to transfer a keyboard input signal.
- Each of the plurality of sensor pads 1004 includes a via 1005 connected to a signal trace line back to the touch event processor 3.
- a plurality of signal trace layers may be employed.
- the set of signal trace layers when taken together is referred to as the routing layer 1003.
- the plurality of circuit board layers includes a top sensor layer 1001, a ground plane layer 1002 and a routing layer 1003.
- Each of the plurality of sensor pads 1004 appears on the top sensor layer and is connected to the routing layer 1003 through the ground plane layer 1002.
- the routing layer 1003 connects each of the plurality of sensor pads 1004 with the touch event processor 3.
- the touch sensitive surface 100 on each of the plurality of discrete keys 7 and the top cosmetic coating 1000 of the plurality of sensor pads 1004 combine to form a touch sensitive surface on the touch sensitive keyboard 2.
- the sensor pads 2005 are placed on the tops of the keys, just under a thin key cap or in some embodiments, the sensor pads 2005 are fabricated on top of the key and then coated with an insulating layer.
- the frame of the keyboard is used to house circuit traces 2002 to route the electrical signals from the key sensor pads around the key wells 2006 to be processed.
- a flexible conducting connector bridges the conductive sensor pad on the key to a connection on the keyboard frame 2004.
- the end of each flexible connector connects through a via in the ground plane 2001 to the circuit trace layer below.
- multiple circuit board layers may be utilized to carry the signal given the narrowness of the passage between the keys on the keyboard. In some embodiments, these separate layers are separated by ground planes to minimize or prevent signals from interfering with each other.
- the touch event processor 3 is coupled to a memory unit wherein the event translation software enables the processor 3 to perform a set of predefined actions corresponding to at least one touch input on the touch sensitive keyboard 2.
- the touch event processor 3 receives the keyboard input signal from the plurality of sensor pads 1004 through the plurality of sensor layers.
- the keyboard input signal is generated by a touch on at least one of the plurality of discrete keys 7 and optionally a drag event.
- the touch event processor 3 includes a touch determination module 5, a low-level event module 11 for generating a data packet that describes the mouse-touch events, and a high-level event module 12 which monitors the stream of data packets and either sends the command directly to the computer or further modifies the translated data packet to a modified translated data packet, which is then sent to the computer.
- in some cases the high-level event module 12 passes it through directly; in others, the high-level event module 12 modifies it prior to it being sent.
- the computer is instructed to perform an action, and the at least one graphical interface device 4 is configured to display the at least one action.
- FIG. 6 shows a block diagram of the touch sensitive components of a touch sensitive keyboard 2.
- An alternative method of implementing touch sensitive surfaces is by deploying a plurality of driver lines 70 orthogonal to a plurality of sensor lines 60. Each of the plurality of driver lines 70 and each of the plurality of sensor lines 60 is separated by a non-conductive film 80 so that no physical connection between driver 70 and sensor lines 60 is made. Each of the plurality of driver lines 70 in sequence is momentarily connected to a voltage source, which induces capacitance in each of the plurality of sensor lines 60 where the driver 70 and sensor lines 60 cross. This capacitance can be measured.
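The scan procedure just described reduces to a simple loop: energize one driver line at a time and read every sensor line crossing. A minimal sketch, in which drive_line() and read_capacitance() stand in for hardware-specific calls and are assumptions of this sketch:

```python
# Hypothetical scan loop for the driver/sensor-line grid described above.
def scan_grid(n_drivers, n_sensors, drive_line, read_capacitance):
    """Returns an n_drivers x n_sensors matrix of capacitance readings."""
    frame = []
    for d in range(n_drivers):
        drive_line(d, on=True)       # momentarily connect to voltage source
        row = [read_capacitance(s) for s in range(n_sensors)]
        drive_line(d, on=False)
        frame.append(row)
    return frame
```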
- driver and sensor lines may be deployed under each key cap so that the position of a touch within the key can be determined.
- driver 140 and sensor lines 130 are shown attached to the undersurface of the keycap 150.
- FIG. 8 depicts an alternate embodiment wherein the driver 160 and sensor lines 170 may be embedded in a flexible cloth or film 180 that is bonded to the top and sides of the keys 7 with enough slack between keys 7 to allow the keys 7 to be pressed.
- FIG. 9 shows a functionally similar setup but is implemented using a touchpad without discrete keys 7.
- the key positions shown are thus drawn on the pad or displayed on the touchscreen.
- the keys 7 may be contoured or equipped with force feedback to give the user some tactile sense as to the position, extent, and sensitivity of the key areas.
- the surface must be capable of distinguishing between a light touch for generating mouse-touch events and a heavier touch to generate key press events. This may be accomplished in a plurality of ways, including a series of dome or membrane switches deployed under the touch surface, or the touch sensitive surface itself may be capable of detecting a light touch for generating mouse-touch events and a heavier touch to generate key presses.
- the surface is a contiguous touch sensor, functioning both as a touch surface and keyboard 2.
- a tablet computer may be used as the keyboard 2 with the keys 7 represented graphically on the screen.
- the tablet must be able to sense and provide feedback to the user as to the difference between touching the surface and a harder touch that corresponds to a keypress.
- One way of achieving this is to measure the size of the touch area. A light touch will produce an area in contact with the surface that is smaller than a heavier touch meant to serve as a key press.
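The contact-area test described above is a one-line classification once an area estimate is available. A minimal sketch, in which the threshold value is an assumption that would need per-device calibration:

```python
# Hypothetical light-touch vs. key-press discrimination by contact area.
KEYPRESS_AREA_MM2 = 55.0   # contact patches at or above this count as presses

def classify_contact(area_mm2):
    """Returns "keypress" for a heavy touch, "mouse-touch" for a light one."""
    return "keypress" if area_mm2 >= KEYPRESS_AREA_MM2 else "mouse-touch"
```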
- the touch sensitive keyboard 2 connects to the computer via a common USB, Bluetooth, or other networking connection.
- the touch sensitive keyboard 2 enables a user to provide an input through the plurality of discrete keys 7 by a tap-touch.
- the touch sensitive keyboard 2 enables the keyboard system 1 to perform the at least one action by combining the touch screen gesture with the at least one of the plurality of discrete keys 7 without utilizing an external pointing device.
- FIG. 10 depicts a block diagram showing the logic behind interpreting and translating the raw sensor outputs from multiple sensors into low-level events and then further to high-level events as defined above. These low-level events are events which can be processed by existing applications in accordance with an embodiment of the present invention.
- the high-level event module 12 also monitors low-level events output by the low-level event module 11, together with additional information provided by the touch determination module 5.
- the at least one touch input can be a low-level event or a high-level event.
- the low-level event is processed by a touch determination module 5 and the low-level event module 11 and the high-level event is processed by a high-level event module 12.
- touch sensors 8 sense touch events, tap-touch, and gesture input, while a key press sensor 9 senses key press inputs.
- each keypress also generates an initial touch
- a touch by itself has no semantics and generates no output unless it is part of a tap, tap-touch, double tap, drag, or three-fingered touch.
- a mouse down is generated by a double tap.
- a tap consists of a touch followed immediately by a release. Any key press immediately following a tap on the same key removes the touch from consideration. Thereafter, the gesture input and the key press inputs are transmitted to the touch event processor 3.
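A sketch of the rule just stated: a touch followed quickly by a release registers as a tap, and a key press arriving immediately after a tap on the same key removes that touch from consideration, since it was just typing. The time constants are illustrative assumptions.

```python
# Hypothetical tap filtering per the rule above; constants are assumptions.
TAP_MAX_S = 0.15       # max touch duration that still counts as a tap
CANCEL_WINDOW_S = 0.2  # a key press within this window voids the tap

def filter_taps(touch_events, key_presses):
    """touch_events: list of (key, t_down, t_up); key_presses: (key, t).
    Returns the taps that survive the key-press cancellation rule."""
    taps = [e for e in touch_events if e[2] - e[1] <= TAP_MAX_S]
    return [t for t in taps
            if not any(k == t[0] and 0 <= kt - t[2] <= CANCEL_WINDOW_S
                       for (k, kt) in key_presses)]
```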
- the low-level event module 11 receives a data packet describing the inputs, and using event translation software translates the data packet to a translated data packet and transmits the translated data packet to the high-level event module 12.
- a keyboard-mouse driver logic 13 coupled with an operating system 14 in the graphical interface device 4 displays the at least one action performed by the high-level event module 12. In this way, the user can enter data through the touch sensitive keyboard 2 without utilizing the pointing device.
- the low-level events generated by the hardware circuitry employed in a mechanical keyboard 2 design with touch sensitive keys 7 may be quite different from those generated by a keyboard 2 implemented on a touchpad or touchscreen tablet.
- a touch pad or tablet device often generates absolute position information: a touch at the lower left of the input device is mapped directly to a position on the lower left of the user's display in a linear fashion. Additionally, touches hard enough to trigger a key press event will have to be mapped from an x-y position to the key 7 presented at that location.
- the present invention translates all low-level events from absolute to relative motion events in the manner of a mouse or track pad and not a touch screen display: that is, a new position is determined by the previous position modified according to the direction and speed of the user's fingers.
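This absolute-to-relative translation amounts to differencing successive touch samples, optionally scaled by a gain. A minimal sketch under those assumptions:

```python
# Hypothetical absolute-to-relative translation: the pointer is advanced
# by the finger's displacement rather than mapped to screen coordinates.
def to_relative(prev_touch, cur_touch, gain=1.0):
    """prev_touch, cur_touch: objects with absolute .x and .y samples.
    Returns (dx, dy) pointer motion; gain could scale with finger speed."""
    dx = (cur_touch.x - prev_touch.x) * gain
    dy = (cur_touch.y - prev_touch.y) * gain
    return dx, dy
```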
- the preferred embodiment includes a method for displaying the at least one action in response to the at least one touch input utilizing the touch sensitive keyboard system 1. The method commences by providing the at least one touch input to the at least one of the plurality of discrete keys 7 of the touch sensitive keyboard 2. Next, the plurality of sensor pads 1004 arranged around each of the plurality of discrete keys 7 is enabled to transfer the keyboard input signal to the touch determination module 5 within the touch event processor 3.
- the touch determination module 5 is enabled to interpret the keyboard input signal and to generate the data packet which is sent to the low-level event module 11 for translation.
- the translated data packet is sent to the high-level event module, which receives the translated data packet corresponding to the at least one touch input and either passes said translated data packet to the computer or modifies said translated data packet and passes the modified translated data packet to the computer, thereby instructing the computer to perform at least one action.
- the at least one graphical interface device 4 connected with the touch event processor 3 is enabled to display the at least one action.
- lateral motion across the tops of keys 7 is not something users generally do today. Such a motion has no existing user interface semantics and is distinct from a key press and so the system is free to interpret it as mouse motion with low chance for error.
- the low-level event logic therefore ignores random mouse-touch events but responds to movements across the surface of the keys 7 to indicate motion events in the manner of an Apple multi-touch trackpad as used in macOS and other Apple operating systems.
- the position (or mouse) cursor may move anywhere a conventional mouse or trackpad might move it.
- hysteresis is employed: small movements that do not occur as part of a longer movement are ignored, that is to say a threshold of distance must be crossed before motion is reported. This is to eliminate random movements generated by a user as he/she types or randomly lifts or rests her fingers on the keyboard surface without intention to generate mouse events.
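The hysteresis rule described above can be sketched as a gate that accumulates motion silently until a distance threshold is crossed, so incidental finger contact during typing never produces pointer movement. The threshold value is an illustrative assumption.

```python
# Hypothetical motion gate implementing the hysteresis described above.
MOTION_THRESHOLD = 4.0   # e.g. millimeters of travel before reporting

class MotionGate:
    def __init__(self):
        self.acc_x = self.acc_y = 0.0
        self.open = False

    def feed(self, dx, dy):
        """Returns (dx, dy) to report, or None while below threshold."""
        if self.open:
            return dx, dy
        self.acc_x += dx
        self.acc_y += dy
        if (self.acc_x ** 2 + self.acc_y ** 2) ** 0.5 >= MOTION_THRESHOLD:
            self.open = True
            return self.acc_x, self.acc_y   # release the accumulated motion
        return None

    def reset(self):
        """Call when the touch lifts, closing the gate again."""
        self.__init__()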
- Method 1: Simply include mouse buttons on the keyboard 2 below the space bar as shown in FIG. 11B. This is common on existing laptops that do not implement mouse clicks through their trackpad.
- Method 2: The system uses a gesture to indicate the beginning of mouse events.
- we define "tap-touch" as shown in FIG. 12.
- the user would raise at least one finger off the touch sensitive surface of the keyboard 2, then quickly tap the surface of a key or keys, 240, followed immediately by a touch, 250, in the same position
- This gesture can be distinguished from random touches. If tap-touch is followed by a drag, then a selection is created from the position of the cursor as with a three-fingered tap drag in the Apple macOS operating system
- tap-touch replaces the movement of the hand from the keyboard 2 to the mouse or track pad, the pressing of the mouse button and returning of the hand to position on the keyboard and as such is an improvement in efficiency and speed.
- the touch event processor 3 of the keyboard 2 recognizes the tap-touch gesture, translating it to the equivalent mouse-down event.
- tap-touch-drag is defined as tap-touch with motion before raising the fingers again.
- the semantics defined for this gesture are usually selecting text, or selecting and moving a graphical object on the display. This is equivalent, and in most circumstances will be translated to mouse-down and mouse-move events. Raising the fingers at the end of the drag is equivalent to releasing a mouse button.
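The translation just described is mechanical enough to sketch: tap-touch maps to mouse-down, motion while still touching maps to mouse-move events, and raising the fingers maps to mouse-up. A minimal illustration, in which emit() is a hypothetical event sink:

```python
# Hypothetical tap-touch-drag translation per the description above.
def translate_tap_touch_drag(moves, emit):
    """moves: list of (dx, dy) motion samples between tap-touch and lift."""
    emit(("mouseDown",))             # the tap-touch itself
    for dx, dy in moves:
        emit(("mouseMove", dx, dy))  # dragging while still in contact
    emit(("mouseUp",))               # fingers raised = button released
```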
- Method 3: When there is a text insertion point in a block of text, holding the shift key down may be defined to create or extend a selection from the insertion point to the cursor position, and continues to modify the selection when followed immediately by a mouse-drag event.
- the semantics are similar to holding a shift key while using the arrow keys to select text in existing applications, or using the shift key with a trackpad. This is as opposed to holding the shift key followed by a key press, which simply generates the shifted value of the pressed key, e.g. 'a' to 'A'.
- In FIG. 13, with the text insert cursor at the position denoted by 300, the shift key is pressed and held down. The user drags across the keyboard until position 310, where the shift key is released and the desired text is selected.
- Method 5: a method for using the shift key as a mouse button outside of text; it is equivalent to a mouse click with the shift key down. This is referred to as an implied shift key.
- the user has momentarily pressed the shift key. Each time, a new graphic object is added to the selection. Alternatively, the user could have clicked a mouse button if one were provided on the keyboard 2, or tapped two fingers on the keyboard 2 with the shift key held to accomplish the same thing as described in the above-mentioned method.
- To release an item from the selection the user simply presses the shift key while the mouse is over the item (or, while the shift key is held, taps with two fingers or clicks the mouse button).
- To release all items simply tap two fingers, mouse button click, or press and release the shift key while the mouse cursor is over empty space in the document. Note that like many alternatives of the preferred embodiment, this application of the shift key is an optional optimization and is not required.
- the low-level and high-level event logics receive the keyboard input signals from the touch sensitive keyboard 2 and translate them into standard mouse- touch events and, when appropriate, higher level gestures.
- Existing applications and operating systems may use these higher-level events from the keyboard 2 with no need for modification to the software.
- the event translation software is configured to enable the touch sensitive keyboard 2 to operate in a plurality of modes, including but not limited to a magnetic cursor mode and a magnetic mouse mode. The plurality of modes is described below:
- An additional innovation of the present invention is the capability it presents to translate some mouse-touch events into a combination of mouse-touch events, keyboard command-key events and arrowKey events.
- the "magnetic cursor" mode is a technique to further reduce the number of mouse events that need to be specifically generated by the user and thereby making interaction quicker and more efficient. With magnetic mode selected, certain events are generated automatically for the user. In this case, when a user has a text insertion cursor in some block of text and the text cursor and mouse cursor are combined. Moving the mouse-pointer across text (by dragging across the surface of the touch sensitive keyboard's 2 keys 7) will advance the text insertion cursor through the text as shown in FIG. 15.
- a slow but deliberate leftward motion across the touch sensitive keyboard 2 is moving the text insertion cursor leftward through the text one character at a time as shown in FIG. 15. Unlike a typical use of a mouse, no mouse click is required at the end of the mouse movement to set a new insertion point.
- FIG. 16 shows a similar motion through text in a vertical direction in magnetic mouse mode. A slow upward drag movement from 500 to 510 is illustrated. It is similar to, and in fact the low-level event logic may choose to translate this motion into, multiple clicks of the up arrow.
- In FIG. 17, with the magnetic mouse mode active, a faster movement across the surface of the keys 7 is represented using the mouse cursor, which moves smoothly, without stopping to show the text insertion cursor at every possible intermediate position, starting from the existing text insertion point at 600 vertically to release at 610.
- the rapid motion causes the appearance of the mouse cursor.
- a mouse cursor is added when a touch sensitive keyboard is attached. This may require support from an application running on a tablet or from the tablet operating system itself, but would allow the tablet to be used for serious text processing without the fatigue and inefficiency of moving one's finger from the keyboard to the tablet surface and back for each touch input.
- rather than appearing at its previous position, the mouse cursor appears at the text insertion point and moves from there, as opposed to the common practice of the day.
- the preferred embodiment partially combines and more tightly relates the text insertion cursor and mouse cursor by sharing gestures.
- the raising of the dragging finger(s) off the surface causes a mouse Up event to be generated at the new position of the cursor.
- if the mouse cursor is past the end of a line when released, the text insertion cursor is placed in the nearest position where text may be entered, the same as it would be for a standard mouse click past the end of the existing line.
- the position of the mouse cursor is updated to share the position of the text cursor, and the mouse cursor is made invisible when the text cursor is displayed.
- Magnetic mouse mode includes efficiencies that save events. Using a conventional pointing device, the user would need to move a hand to the pointing device, move the mouse to the desired location, click the mouse, return the hand to the keyboard 2 and continue typing. Using cursor control with arrow keys would require moving of the hand and an additional six key presses. Using a command key combination might avoid moving a hand if the required modifier keys didn't require repositioning but would still require six key presses. In magnetic mouse mode, the user leaves both hands in place, keeps her eyes on the display and simply swipes upwards until the proper position is reached and continues typing. This requires a single gesture instead of four or more and no loss of attention on the working document.
- tap-touch-hold over a selection does the same; tap-touch-release would still remove the selection and create a new insertion point in text (or generate a mouseUp while not in text).
- FIG. 18 shows the result of a rapid diagonal motion from 720 to 730 by the user. This causes the mouse cursor to appear and move from 700 to 710 followed by a click at 730 when the fingers are released which places the text insert cursor at the new location.
- magnetic mouse mode is most useful in applications such as text editors or word processors where the predominance of the time is spent entering and moving about in text and where pushing a text insert cursor through text makes sense.
- the low-level event logic 11 may read motion events and then generate arrow key events in the case when the motion is slow. Alternatively, sensing a rapid motion, it may generate motion events and an automatic mouse click event at the end. The effect is to give the user the sense of pushing a text insert cursor through the text and it is this visual metaphor which makes this novel computer interaction intuitive. It is as if the pointing cursor was "magnetic" and with the pause in motion, sticks to the underlying virtual object.
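The dispatch just described splits on stroke speed: slow deliberate motion becomes arrow-key presses that push the text cursor, while fast motion becomes mouse moves followed by an automatic click. A minimal sketch, in which the speed threshold and the emit() helper are assumptions:

```python
# Hypothetical magnetic mouse dispatch per the description above.
SPEED_SPLIT = 30.0   # mm/s; below = arrow keys, above = mouse motion

def magnetic_dispatch(stroke, emit):
    """stroke: object with .speed, .dx, .dy, .chars_crossed.
    Assumes dy < 0 means upward motion on the key surface."""
    if stroke.speed < SPEED_SPLIT:
        if abs(stroke.dy) > abs(stroke.dx):
            key = "UpArrow" if stroke.dy < 0 else "DownArrow"
        else:
            key = "RightArrow" if stroke.dx > 0 else "LeftArrow"
        for _ in range(stroke.chars_crossed):
            emit(("keyPress", key))      # push the cursor through the text
    else:
        emit(("mouseMove", stroke.dx, stroke.dy))
        emit(("mouseClick",))            # sets the new insertion point
```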
- FIG. 21 shows how a longer, very rapid movement across two keys from 920 to 930 causes the text insert cursor to advance from 900 to the beginning of the next word at 910.
- FIG. 22 shows how a rapid two-key vertical movement from 1000 to 1010 causes the text insert cursor 1020 to move upward to the previous beginning of a block 1030, in this case the block where the text insert cursor already resided. On repetition of the gesture, the text insert cursor would move to the beginning of the next block above.
- a long swipe right, ending at 1130, when there is a text insert cursor, has the meaning of moving the cursor all the way to the right side of the current text line, from 1100 to 1110. Similarly defined, a long swipe left will move the text input cursor to the left side of the line.
- a long, fast swipe in any direction "throws" the text cursor in the direction of the motion.
- the throw distance is settable by a user preference and may cause the document to scroll if so desired. This enables the user to gesture from 1230 to 1240, moving the cursor from 1200 to 1210 and letting it "glide" the remaining distance. This allows the user in a large document to throw the cursor a distance greater than he/she could drag it in one stroke. The user can then precisely position the cursor with another stroke.
- an upward swipe, 1330 where the fingers remain on the top row of keys, 1340, will move the cursor, 1300, to the top of the display, 1310, and continue scrolling upwards with the cursor through the text until the touch is released.
- similarly defined are downward swipes that are held in the bottom row, and horizontal swipes that are held at the left and right extremes of the keyboard 2 for panning right and left.
- the event translation software is configured to enable the touch sensitive keyboard to operate with a plurality of commands to perform the at least one action.
- the plurality of commands includes but is not limited to a scrolling and a panning gesture command, a double hand gesture command, a tap key command and a tap gesture command.
- three-finger gestures are defined to generate the native platform's commands for scrolling or panning and selection.
- the system may define continuous scrolling or panning of the window contents (rather than the moving of a cursor through the document as described above.) Moving the appropriate number of fingers to the edge of the draggable area (again, typically one more finger than for mouse motion) on the touch sensitive keyboard 2 and pausing there may be defined to cause continuous scrolling in the indicated direction while the hand remains at the edge.
- Tablet computers may define a two-fingered pinch gesture to zoom in or out, using two fingers of the same hand and drawing them closer together or further apart.
- a tablet may define a rotation gesture. These may also be deployed by the touch sensitive keyboard 2 but the user's hand position on the keyboard 2 and the possibility of the user resting her fingers on the keyboard 2 makes it easier to define similar gestures by using two hands.
- the gestures are similar except that one or two fingers of each hand are brought closer together or further apart to indicate zooming as shown in FIG. 26, while moving the fingers of each hand in opposite directions indicates rotation as indicated at FIG. 27.
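One way to distinguish the two-hand gestures shown in FIGS. 26 and 27 is to compare the separation and relative motion of the two hands' touch points: a change in separation reads as zoom, opposed motion reads as rotation. The following is only a sketch under those assumptions; thresholds are illustrative.

```python
# Hypothetical two-hand zoom/rotate classification.
import math

def classify_two_hand(left0, left1, right0, right1, eps=2.0):
    """Each argument is an (x, y) touch; 0 = gesture start, 1 = gesture end."""
    d0 = math.dist(left0, right0)
    d1 = math.dist(left1, right1)
    if abs(d1 - d0) > eps:                 # hands converging or diverging
        return "zoomIn" if d1 > d0 else "zoomOut"
    dy_left = left1[1] - left0[1]
    dy_right = right1[1] - right0[1]
    if dy_left * dy_right < 0:             # hands moving in opposite directions
        return "rotate"
    return None
```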
- the force of the tap can be measured when the hardware is capable of such measurements, and the immediate raising of the finger inherent in a tap can be detected.
- Tapping 'b' might move backwards by a word.
- Tapping 'o' might open a new line after the current one without having to first go to the end of the line and hit return.
- a touch sensitive keyboard 2 can be used to combine key commands with mouse commands. This is possible because the touch sensitive keyboard is both keyboard 2 and mouse, and taps always take place on top of a key. Touching a modifier key while tap-touching distinguishes a mouse-touch event from a tap command. For example, touching but not depressing a shift key. In the same way that depressing the shift key modifies the key code sent when pressing a key (usually to send the uppercase version of the letter associated with the key), touching the shift key modifies the event to send a tap command.
- Tapping on a key 7 can also be combined with gestures. For example, tap-touch on the 'd' key while touching the shift key, (shift-tap-touch) with a single finger and sliding left may be defined to delete characters to the left of the text insert cursor. Similarly, as shown in FIG. 28, shift-tap-touch-slide right deletes characters to the right of the text insert cursor.
- FIG. 29 shows the result of the text after the user raises her hand after the tap-drag.
- shift-tapping 'd' twice ('dd') is defined to delete a line
- shift-tapping twice and then dragging up or down might be defined to delete line by line upwards or downwards respectively.
- Tapping of different letters can be combined. Tapping 'dw' might be defined to delete the word at the text insert cursor; 'dw'-slide might delete by word. Shift-tapping enables an easily accessible and intuitive way to define easy-to-remember commands by leveraging the difference between touching and depressing a key, along with the fact that every mouse-touch event also occurs on a particular key.
- gestures may be modified to increase functionality or ease of use, while supporting and taking advantage of not having to move one's hand to the pointing device.
- the following gestures are single-finger gestures unless otherwise noted. These include but are not limited to:
- a quick double tap generates a double click (same as Apple trackpad)
- a quick two-finger tap is a right mouse click (same as Apple trackpad)
- option-quick tap is also a right mouse click (same as Apple trackpad)
- tap-tap-touch-drag selects a word and continues selecting by word as the touch is dragged (differs from Apple)
- tap-tap-touch-drag is equivalent to a three-fingered double tap-drag on a Macintosh multi-touch track pad
- tap-touch generates a mouseDown (same as Apple)
- tapping a key while touching the Shift key sends a tap command if one is defined for the key.
- shift-touch-tap-d can delete in any direction. This command has to be processed by the High-Level Event Module, which issues the appropriate low-level events.
- g. preceding a tap by a tapped numerical value executes the command the indicated number of times, e.g. tapping 4 before a tap command will execute the tap command 4 times.
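The repeat prefix in item g reduces to parsing leading digit taps into a count and running the remaining tap command that many times. A minimal sketch, in which run_tap_command() is a hypothetical helper:

```python
# Hypothetical numeric repeat prefix: '4' tapped before a tap command
# executes that command four times, per item g above.
def apply_repeat_prefix(taps, run_tap_command):
    """taps: sequence of single-character tap symbols, e.g. ['4','d','w']."""
    count_str = ""
    i = 0
    while i < len(taps) and taps[i].isdigit():
        count_str += taps[i]
        i += 1
    count = int(count_str) if count_str else 1
    command = taps[i:]
    for _ in range(count):
        run_tap_command(command)
```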
- the following are tap events which may be defined for actions typical to a word or text editor.
- Other suitable tap events may be employed. Below, a letter or letters in single quotes indicates a tap on the associated key or keys.
- a repeated letter or groups of letters indicates repeated taps on the associated key(s).
- the symbol '#' indicates a number generated by tapping on the number keys on the keyboard.
- a single tap or multiple taps while holding the touch on the last tap may be defined to repeat the associated command until released.
- a. 'h', 'j', 'k', 'l': tap command alternative for cursor motion.
- Other combinations of letters are possible in software such as games and may be defined or customized.
- Tap on the associated key to move the cursor in the associated direction. For the given example keys used within text, tapping 'h' moves the text cursor one character to the left; 'k' moves it one line up; 'j', one line down; 'l', one character to the right. Tapping 'j' and 'k' may also be used to move through lists or menus.
- k. 'd'-swipe: delete in the direction of the swipe to the end of the logical unit in that direction, e.g. 'd'-swipe right might delete to the end of the line
- t. '{': move to the beginning of the current section or the beginning of the preceding section if already at the beginning of a section. That which constitutes a section is defined by the application. For example, a programmer's text editor might go to the beginning of the current code block, which is delimited in many programming languages with the character '{'.
- u. '}': move to the end of the current section
- Taps on keys may be defined to have application specific meanings.
- applications define command keys equivalents to use for menu shortcuts or other functions specific to the application.
- a keyboard utility can be used to map these functions to tap commands.
- an application can support tap commands directly, as well as tap-gesture commands. For example, in a graphics editor, tapping 'b' and dragging might draw with the brush tool, while tapping 'p' and dragging might draw with the pencil tool.
- the novelty lies in the ability to distinguish a gesture which begins with one key from the same gesture that begins on a different key.
- a mode is a state in which keys 7 typed on the keyboard are interpreted differently than the same keys typed while in a different mode. For example, using the text editor VIM, the user switches between "command mode" and "text insert mode". In command mode, each key is interpreted as being part of a command. In text insert mode, keys typed are interpreted as characters to be inserted into the file. In real world editing, commands and text tend to group together: one tends to type a few commands, then some text, then some commands.
- editing often involves issuing commands to position the text cursor properly, followed by a stream of text that is entered into the file, then more commands to go to another location.
- a command to enter command mode where all keys are commands.
- a command to switch to text insert mode can be issued and thereafter all key presses will be interpreted as text.
- the most frequent and damaging mistake with modal editors is the user being in one mode when he or she thought the other mode was selected.
- a modified VIM could be created that has but a single mode.
- the VIM commands are defined as taps on the surface of the keys 7 normally used for commands in command mode. Normal typing is always text insert mode.
- pressing the 'w' key always enters the 'w' character while tapping 'w' will move the cursor to the right one word.
- the user is never stuck in the wrong mode as a tap is always a command while a key press always enters text.
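- As an illustration of this single-mode behavior, the dispatch might be sketched as follows. This is a hypothetical Python fragment; the event kinds and the command table are assumptions chosen for the example.

      # Hypothetical sketch: a modeless editor loop where a full key press
      # always inserts text and a tap (touch and release without depressing
      # the key) always executes a command.
      COMMANDS = {'w': 'word_right', 'b': 'word_left', 'x': 'delete_char'}

      def handle_event(editor, event):
          if event.kind == 'keypress':
              editor.insert(event.char)        # typing always enters text
          elif event.kind == 'tap':
              command = COMMANDS.get(event.char)
              if command is not None:
                  editor.run(command)          # a tap is always a command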
- the low-level events are those which map closely to the actions of the hardware, e.g. a key press, a mouse click.
- High-level events are often combinations of low-level events, e.g. a fast swipe to the right initiated on the 'd' key is a combination of several low-level events that might be defined with the semantics "delete one character to the right of the cursor".
- This high-level event could be sent by the keyboard 2 to the GID 4 in different ways.
- the simplest event to send might not be a gesture or a series of mouse-touch events; instead, it might be more concise and accurate to translate this gesture into a press of the forward-delete key.
- tap 'd' and drag to delete may be directly defined and supported by an application or may be mapped into a sequence of existing mouse-touch and keyboard events with meaning to the operating system or application. For example, shift-tap-dragging from the 'd' key might generate a mouse-down and drag series of events to select text, followed by mouse-up event and a final press of the delete key.
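- By way of illustration, the expansion of one such gesture into conventional events might be sketched as follows. This is a hypothetical Python fragment; the event names and the tuple representation are assumptions, not any operating system's actual API.

      # Hypothetical sketch: expanding a high-level gesture (shift-tap-drag
      # beginning on the 'd' key) into a sequence of conventional events an
      # unmodified application already understands.
      def expand_shift_tap_drag_from_d(path):
          """path: list of (x, y) positions traced by the drag."""
          events = [('mouseDown', path[0])]
          events += [('mouseMove', p) for p in path[1:]]  # selects the text
          events.append(('mouseUp', path[-1]))
          events.append(('keyPress', 'DELETE'))           # deletes the selection
          return events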
- the low and high-level event manager software may be able to learn from the user's interaction to extend or contract the distance required to distinguish certain gestures from each other, for example allowing a longer drag for character positioning than simply one key distance.
- Each user has different sized hands and finger reach. When a gesture is followed by smaller counter gestures to correct it, it may be possible to recognize this as an error in interpretation and correct it to fit the user.
- gestures may benefit from training or user customization. Having the user repeat each gesture several times can allow the recognition software to adjust to the individual user. For example, a programmer might define a gesture and associate that with a compile and build command.
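- A threshold-adaptation step of this kind might be sketched as follows; this hypothetical Python fragment, including its starting threshold and blending factor, is an assumption used only to make the idea concrete.

      # Hypothetical sketch: adjusting a gesture-distance threshold when a
      # gesture is repeatedly followed by small corrective counter-gestures,
      # suggesting the recognizer over- or under-shot for this user's hand.
      class GestureTuner:
          def __init__(self, threshold=18.0):  # assumed default, millimetres
              self.threshold = threshold

          def observe(self, gesture_distance, num_corrections):
              if num_corrections >= 2:
                  # Blend the threshold toward the distance the user actually
                  # travelled, so future gestures fit this user's reach.
                  self.threshold = 0.9 * self.threshold + 0.1 * gesture_distance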
- a keyboard 2 may be customized to an individual as well as the tasks that the individual performs.
- a bicycle is fit to an individual bike racer's size and strength while the design of the bike varies for terrain, e.g. a road bike versus a mountain bike.
- a person with large hands and thick fingers can certainly benefit from a different keyboard than that tailored to someone with small hands and thin fingers.
- By eliminating the need for a trackpad entirely, the touch sensitive keyboard opens up space for customization.
- the touch sensitive keyboard and customization of keyboard layout and key sizes can achieve greater efficiency than ever before.
- the bracket keys may be moved between the 'g' and 'h' keys.
- the caps lock key, a large and prominently placed key which nevertheless has little utility in programming, may instead be used as a different, more heavily used key such as the control key, with a different key assigned to caps lock, or with a combination such as a double tap on the caps lock key required to activate or deactivate the caps lock function.
- the sensors are located between the keys of a keyboard.
- the sensors are shown as vertical lines 1400 between each of the keys, while the wiring for the sensors is shown in horizontal lines 1410 between rows of keys.
- the sensors are located on every space of the keyboard that has a key on its left and right; in other embodiments (not shown), fewer sensors are placed between keys.
- sensors may be located on the keys and between the keys. To further improve on these embodiments with sensors between the keys, additional sensors may be present to determine the height of a finger over the keys or to detect a tap gesture, so as to distinguish meaningful touch movements from the user simply resting his or her fingers on the keys.
- a non-transitory computer-readable medium comprises computer-executable instructions stored therein for causing a computer to implement a program executable on a keyboard system 1 for a method for displaying at least one action in response to at least one touch input.
- the non-transitory computer readable storage medium may comprise a memory card, USB-Drive, CD-ROM or flash memory to list but a few.
- a non-transitory computer-readable medium comprises computer-executable instructions stored therein for causing mobility solutions to implement a program executable on a keyboard system 1 that enables the keyboard system 1 to perform at least one action by combining a touch screen gesture with at least one of a plurality of discrete keys 7 without utilizing a pointing device.
- the present invention provides several benefits. Firstly, a substantial increase in the speed and efficiency of user interaction is provided even with existing software on GIDs 4 due primarily to a reduction of the number of required actions to execute a task on a GID 4 and the elimination of the time it takes for the user to move her hand from the keyboard 2 to a mouse, trackpad, or set of arrow keys to perform an action, and then move her hand back to the keyboard 2 and position it for text entry.
- a further goal is the elimination of the necessity to move one's eyes from the display in order to locate and use certain keys on the input device which cannot easily be found by touch alone, only then to return one's gaze to the display and refocus on the current task.
- this invention, a combination of three components (the touch sensitive keyboard 2, a simple but complete set of interactions, and the mapping of those actions to events compatible with existing applications), extends a similar increase in efficiency to the modern workflow that includes a display, keyboard 2 and pointing device.
- the preferred embodiment is far easier to learn than existing methodologies for speeding user interactions with a keyboard 2, which require the memorization, both mental and physical, of a set of complex and often awkward command-key combinations.
Abstract
A keyboard system performs at least one action by combining a touch screen gesture with a plurality of discrete keys without utilizing an external pointing device. The keyboard system includes a touch sensitive keyboard, a touch event processor and a graphical interface device. The touch sensitive keyboard includes a plurality of discrete keys which include a key press switch and a keycap where the top of the key incorporates a capacitive touch sensor pad. A keyboard input signal is generated by a touch on at least one discrete key and a touch sensitive gesture. A touch determination module interprets the keyboard input signal and generates a data packet. A low-level event module receives the data packet corresponding to the touch input, translates the data packet and sends it to a high-level event module, which instructs a computer to perform an action displayed by a graphical interface device.
Description
COMBINATION COMPUTER KEYBOARD AND COMPUTER POINTING DEVICE
Inventor: Michael Farr
RELATED APPLICATION
[0001] This application claims priority from the United States provisional application with Serial Number 62/270007, which was filed on December 20, 2015. The disclosure of that application is incorporated herein as if set out in full.
BACKGROUND OF THE DISCLOSURE
TECHNICAL FIELD OF THE DISCLOSURE
[0002] The present embodiment relates generally to computer input devices, and more particularly, to a touch sensitive keyboard configured to perform at least one action by combining a touch screen gesture with a plurality of discrete keys without utilizing a conventional pointing device.
DESCRIPTION OF THE RELATED ART
[0003] For several decades, computer keyboards have slowly evolved alongside computer pointing devices such as the mouse, the trackpad, and trackball. The universal adoption of the mouse or trackpad and the Graphical User Interface, as popularized by Apple's Macintosh, has made it possible for the average person to use a computer without having to memorize complex text commands or numerous arcane and difficult to type command key combinations ("shortcuts"). In such systems, the user uses a pointing device to indicate which object, section of text, or position in a document is subject to change. There are some task domains, however, where the time it takes to repeatedly move the hand back and forth from a keyboard to a mouse or touchpad can have a significant effect on the productivity of the user, slowing the user's work significantly. In 1984, Ted Selker, a researcher at PARC (Palo Alto Research Center Incorporated), determined that a typist needs a relatively long 0.75 sec to shift the hand from the keyboard to the mouse, and comparable time to shift back
from mouse to keyboard. The same is true of a trackpad or arrow keys, as the user must still lift and reposition the hand to utilize these input devices. For most users, this requires a look down at the keyboard in order to reset the hand's position. Thus, every use of the separate pointing device slows the user down a full second and a half, an interval often longer than required to complete the immediate task itself.
[0004] The time spent in moving back and forth between mouse and keyboard is small when considered discretely, but quickly adds up over the hundreds or thousands of iterations comprising a day's work in many fields. In one field, computer programming, this slowdown is so significant that many programmers adopt software tools from the 1970s, a time before mice and trackpads were invented. Such tools commonly take the form of early text editors such as Vim and EMACS or modern variants where commands comprising one or more modifier keys along with the familiar alphabetic keys on the keyboard are used to position, select, and manipulate text.
[0005] The advantage to these keyboard-only control systems is not needing to move the hand to a mouse or trackpad and back. The drawback is the incredibly steep learning curve. Despite that learning curve, many users do utilize these systems simply to offset the time loss otherwise associated with the use of additional input devices. A user skilled in the use of these tools can significantly outperform another who relies on the mouse or other traditional supplemental input device. For these users, it is much faster to type the command-key combinations, however arcane, to move a cursor around on the screen, select items, or issue a desired command. This gives the master of these systems a significant competitive advantage.
[0006] As noted, these keyboard-only systems are tedious, complicated, time consuming to memorize, and are often awkward to use. For example, issuing a distinct command may require the simultaneous pressing of two or three modifier keys in addition to one or more alphanumeric keys. In any work specialty where document creation and editing is extensive and/or where the user's hand goes back and forth to the mouse or pivots out of position to access a trackpad, the experience is the same.
[0007] Modern use of the mouse has also led to an epidemic of repetitive use injuries: in particular, carpal tunnel syndrome, where compression of the median nerve by the carpal ligament can result from the repeated mechanical process of using a mouse.
[0008] Lifting and moving the hand and arm back and forth to reach a mouse or
trackpad thousands of times a day, and doing this only on one side of the body, can also result in extra loads on the user and imbalances on the muscles in the upper back (Trapezius muscle) and shoulder (Deltoid muscle). Repeated use of the mouse, therefore, can cause aches, pains and potential degeneration in the shoulder and neck area.
[0009] Previous attempts to solve these problems have been introduced. The most notable is the IBM TrackPoint, a small rubber capped lever ("nub") inserted between the 'g', 'h', and 'b' keys of a keyboard. This pointing device is manipulated like a joystick to move the mouse cursor around the screen. Mouse "clicks" may be generated by tapping the nub or by use of mouse buttons on the keyboard. The advantage of the nub is to allow repositioning the computer's cursor without having to move the hand away from the keyboard. Although IBM's nub involved a new physical coordination skill to learn and never caught on in the industry, it nevertheless remains popular with those who have attained mastery of the device.
[00010] Previous attempts have sought to reorder the letters on the keyboard, for example the Dvorak layout, but still the general design has proven surprisingly resistant to change. People "know" what a keyboard should look like: the size and distance between keys, the arrangement in linear rows, and especially the staggered layout where each row is offset horizontally from the one above it tend to be rigidly copied even though the mechanical design reasons for these decisions no longer apply in the electronic age. Additionally, with modern manufacturing processes it should be easier to create customized keyboards that account for hand and finger size differences among users, yet the problems described herein remain and no solution has caught hold.
[00011] There is thus a need for a keyboard system that enables a user to perform at least one action without utilizing a pointing device such as a mouse. Such a needed system would equalize the ease of use of a pointing device but without the time penalty and other drawbacks involved in repeatedly moving a hand from the keyboard to the pointing device and back. Further, such a device would greatly improve the efficiency of computer use. The present embodiment accomplishes these objectives.
SUMMARY OF THE DISCLOSURE
[00012] To minimize the limitations found in the prior art, and to minimize other limitations that will be apparent upon the reading of the specification, the preferred embodiment of the present invention provides a keyboard system for displaying at least one action in response to at least one touch input without utilizing a pointing device. The keyboard system includes a touch sensitive keyboard, a touch event processor and at least one graphical interface device (GID). The touch sensitive keyboard in one embodiment includes a plurality of discrete keys and a plurality of sensor pads. In one embodiment, each of the plurality of discrete keys includes a key press switch and a keycap where the top of the key incorporates a capacitive touch sensor pad whereby the proximity or direct touch of a finger can be detected. In another embodiment, a plurality of sensor pads is arranged around and between each of the plurality of discrete keys of the touch sensitive keyboard. In a third embodiment, the touch sensitive surface is implemented by deploying a plurality of driver lines orthogonal to a plurality of sensor lines. However implemented, the purpose of the sensors is to enable the determination of the location and movement of fingers touching the surface of the keys. Additionally, as in a standard computer keyboard, the invention employs a separate keypress mechanism that registers when keys are depressed in the normal course of typing.
[00013] Throughout this document, the term touch event refers to a user's action of touching, tapping, or sliding one or more fingers on the surface of the keys without depressing the keys. The term keypress refers to completely depressing a key as would traditionally be done in a conventional keyboard to register a conventional key press. A keyboard touch is generated by a touch on at least one of the plurality of discrete keys and optional motion along the surface of the keys, in other words, a touch sensitive gesture. The plurality of sensor pads is configured to transfer the keyboard touch input signal. Each of the plurality of sensor pads includes a via connected to an underlying circuit board, either rigid or, in some embodiments, flexible, which routes the signals from the plurality of sensors either under or around the keys to a processor where the capacitance of each sensor is measured.
[00014] In the case where sensors surround the keys, the top layer of the circuit board is the sensor layer; the second is a ground plane layer which insulates the traces in the routing layer underneath from background interference. Each of the plurality of sensor pads, whether located on the surface of each key or to the side of each key incorporated into the frame of the keyboard, is connected through a circuit board ground layer to one or more routing layers where circuit traces connect the sensors to the processor.
[00015] At the lowest level, the raw capacitance values of the plurality of sensors are measured and collected by the touch determination module. This module combines this information to determine touches much as a touchpad or a multi-touch mouse does. The touch determination module may need to be modified for each different hardware configuration so that the output of this module is consistent across hardware implementations. A data packet describing the touch information is sent to a low-level event module.
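As an illustration only, the reduction of raw capacitance readings to a data packet might be sketched as follows. This hypothetical Python fragment assumes a calibrated baseline, a detection threshold, and a particular packet layout, none of which are specified by this disclosure.

    # Hypothetical sketch: the touch determination stage reduces raw
    # per-sensor capacitance readings to touch points and emits a data
    # packet for the low-level event module.
    BASELINE = 512   # assumed untouched capacitance count, per calibration
    THRESHOLD = 40   # assumed minimum delta that counts as a finger

    def build_data_packet(readings, timestamp):
        """readings: dict mapping a sensor's (x, y) grid position to its raw count."""
        touches = [
            {'pos': pos, 'strength': value - BASELINE}
            for pos, value in readings.items()
            if value - BASELINE > THRESHOLD
        ]
        return {'time': timestamp, 'touches': touches}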
[00016] The low-level event module combines the touch information into conventional mouse-touch events understood by existing GID systems. This information is described as a translated data packet and includes meaningful pointing device events such as mouseDown and mouseMove, touchBegin, etc. In practice, there is often a stream of data packets (and a corresponding stream of translated data packets) as the user uses the computer.
[00017] At the next level, the translated data packets are sent to the High-Level Event Module, which monitors the output of the low-level event module and provides to the computer new or additional gestures and commands to be defined using combinations of touches, taps and gestures. For example, tapping 'dw' (that is, quickly tapping the 'd' and 'w' keys without fully depressing them) might be mapped by the High-Level Event Module to a command to delete the word in which the text cursor resides. Currently no applications respond to "tap events", so the High-Level Event Module recognizes this sequence and will output a mouse double click followed by the key code indicating the delete key was depressed. Similarly, tap-touching the 'c' key and then moving the touch might be defined to select text in the direction of the touch and copy the selected contents to the clipboard. The High-Level Event Module translates this tap-touch gesture into a series of conventional events: a mouse down event, several mouse move events, a mouse release event and the command key for the copy command. These conventional pointing device and keyboard events are generated by the High-Level Event Module and sent to the user's computer. Thus, the High-Level Event Module enables the definition of new tap-touch gestures that generate multiple conventional mouse and keyboard events. Note that the key upon which the tap-touch gesture began may be significant. Tapping 'd' and sliding might delete while tapping 'c' and sliding might copy. The at least one graphical interface device is configured to display the at least one action performed by the computer.
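By way of illustration, the pass-through-or-modify behavior of the High-Level Event Module might be sketched as follows. This hypothetical Python fragment reuses the 'dw' example above; the event names are assumptions rather than any system's actual API.

    # Hypothetical sketch: the High-Level Event Module watches recent taps
    # and replaces a recognized sequence with conventional events; anything
    # unrecognized passes through to the computer unmodified.
    def high_level_translate(recent_taps):
        if recent_taps[-2:] == ['d', 'w']:
            # Delete the word under the text cursor: select it with a
            # double click, then press the delete key.
            return [('mouseDoubleClick', None), ('keyPress', 'DELETE')]
        return None  # None means: pass the translated data packet through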
[00018] The keyboard input signal is generated by a touch on at least one of the plurality of discrete keys and a touch sensitive gesture. The touch sensitive keyboard enables a user to provide an input through the plurality of discrete keys by a tap-touch. The touch sensitive keyboard enables the keyboard system to perform the at least one action by combining the touch screen gesture or mouse gesture with the at least one of the plurality of discrete keys without utilizing a separate pointing device. Each of the plurality of sensor pads is connected to the routing layer through the sensor layer and the ground plane layer. The at least one touch input may be a low-level event or a high-level event. The low-level event is processed by a low-level event module 11 and the high-level event is processed by a high-level event module 12.
[00019] The preferred embodiment includes a method for displaying the at least one action in response to the at least one touch input utilizing the keyboard system. The method includes the steps of: providing at least one touch input to the at least one of the plurality of discrete keys of the touch sensitive keyboard; enabling the plurality of sensor pads arranged around each of the plurality of discrete keys to transfer the keyboard input signal to the touch event processor installed with the event translation software; enabling the touch determination module at the touch event processor to interpret the keyboard input signal and to generate a data packet; enabling the low-level event module coupled with the touch determination module to receive the data packet, translate the data packet to a translated data packet and send it to the High-Level Event Module, which, corresponding to the at least one touch input, either passes said translated data packet to the computer or modifies the translated data packet and passes the modified translated data packet to the computer, thereby instructing the computer to perform at least one action; and enabling the at least one graphical interface device (GID) connected with the touch event processor to display the at least one action performed by the computer in response to direction from the High-Level Event Module. In other embodiments, the high-level event module commands the display directly.
[00020] A first objective of the present invention is to provide a keyboard system having a touch sensitive keyboard that enables a user to perform at least one action by combining a touch screen gesture with at least one of a plurality of discrete keys without utilizing a separate pointing device such as a mouse.
[00021] A second objective of the present invention is to provide a keyboard system that equalizes the ease of use of a pointing device but without the time penalty and other drawbacks involved in repeatedly moving a hand from the keyboard to the pointing device and back.
[00022] A third objective of the present invention is to add a new layer of higher level events incorporating tap-touch and slide gestures with positional information, providing simple, easy to remember, high-level commands that are translated into a sequence of mouse-touch events.
[00023] A further objective of the present invention is to provide a keyboard system that greatly improves the efficiency of computer use.
[00024] These and other advantages and features of the present invention are described with specificity to make the present invention understandable to one of ordinary skill in the art.
BRIEF DESCRIPTION OF THE FIGURES
[00025] To enhance their clarity and improve understanding of these various elements and embodiments of the invention, elements in the figures have not necessarily been shown to scale. Furthermore, elements that are known to be common and well understood to those in the industry are not depicted to provide a clear view of the various embodiments of the invention. Thus, the figures are generalized in form in the interest of clarity and conciseness.
[00026] The foregoing summary as well as the following detailed description of the preferred embodiment of the present invention that follows will be best understood when considered in conjunction with the accompanying figures, wherein like designations denote like elements throughout the figures, and wherein:
[00027] FIG. 1 is a high-level block diagram for displaying at least one action in response to at least one touch input without utilizing a pointing device in accordance with the preferred embodiment of the present invention;
[00028] FIG. 2 is a diagrammatic top view of a keyboard wherein the top surfaces of the mechanical keys are touch sensitive and when taken altogether make up a large touch sensitive area in accordance with the preferred embodiment of the present invention;
[00029] FIG. 3 is a diagrammatic side view of a typical single key in use in accordance with the preferred embodiment of the present invention;
[00030] FIG. 4 is a block diagram of a typical switch configuration for reporting key presses in accordance with the preferred embodiment of the present invention;
[00031] FIG. 5A is a block diagram of a plurality of sensor layers of a touch sensitive keyboard;
[00032] FIG. 5B is a block diagram of another embodiment of the plurality of circuit board layers of the touch sensitive keyboard shown in FIG. 5A;
[00033] FIG. 6 is a block diagram of the touch sensitive components of a touch sensitive keyboard in accordance with an alternate embodiment of the present invention;
[00034] FIG. 7 is a perspective front view of two typical keys in use in accordance with the present invention;
[00035] FIG. 8 is a perspective front view of three typical keys in use in accordance with an alternative embodiment of the present invention;
[00036] FIG. 9 is a diagrammatic top view of a keyboard according to one embodiment of the present invention wherein the keyboard is implemented using a contiguous touch sensitive surface as opposed to discrete mechanical keys;
[00037] FIG. 10 is a block diagram showing the logic behind interpreting and translating several low-level events generated by the hardware circuitry into events which can be processed by existing applications in accordance with an embodiment of the present invention;
[00038] FIG. 11A is a top view diagrammatic depiction of motion possible by a user' s fingers that does not require repositioning the user's hand in use according to a preferred method of the present invention;
[00039] FIG. 11B is a diagrammatic depiction of a keyboard layout with discrete left and right mouse buttons according to an alternative embodiment of the present invention;
[00040] FIG. 12 is a diagrammatic depiction of a keyboard in use according to a preferred method embodiment of the present invention, "tap-touch";
[00041] FIG. 13 shows a computer display in use according to a method of the present invention;
[00042] FIG. 14 shows a computer display and an exemplary case of the implied shift key;
[00043] FIG. 15 shows a computer display depicting a text insertion pointer's direct motion through text in magnetic mouse mode;
[00044] FIG. 16 shows a computer display depicting a text insertion pointer's second motion through text in a vertical direction in magnetic mouse mode;
[00045] FIG. 17 shows a computer display depicting a pointer's third motion through text in a direction in magnetic mouse mode alongside a depiction of the user's hand during this motion;
[00046] FIG. 18 shows a computer display depicting a pointer's fourth motion through text in magnetic mouse mode alongside a depiction of the user's hand during this motion;
[00047] FIG. 19 shows a computer display depicting a pointer's fifth motion through text alongside a depiction of the user's hand during this motion;
[00048] FIG. 20 shows a computer display depicting a pointer's sixth motion through text alongside a depiction of the user's hand during this motion;
[00049] FIG. 21 shows a computer display depicting a pointer's seventh motion through text alongside a depiction of the user's hand during this motion;
[00050] FIG. 22 shows a computer display depicting a pointer's eighth motion through text alongside a depiction of the user's hand during this motion;
[00051] FIG. 23 shows a computer display depicting a pointer's ninth motion through text alongside a depiction of the user's hand during this motion;
[00052] FIG. 24 shows a computer display depicting a pointer's tenth motion through text alongside a depiction of the user's hand during this motion;
[00053] FIG. 25 shows a computer display depicting a pointer's eleventh motion through text alongside a depiction of the user's hand during this motion;
[00054] FIG. 26 is a top diagrammatic depiction of a keyboard showing a user's two hands making a first motion;
[00055] FIG. 27 is a top diagrammatic depiction of a keyboard showing a user's two hands making a second motion;
[00056] FIG. 28 shows a computer display depicting a pointer's twelfth motion through text alongside a depiction of the user's hand during this motion;
[00057] FIG. 29 shows the result on a computer display of the text after the user raises her hand after the tap-drag originating on the 'D' key; and
[00058] FIG. 30 shows another embodiment of the present invention, illustrating the sensors located on every space of the touch sensitive keyboard.
[00059] It is noted that the figures are intended to depict only typical embodiments of the invention and therefore should not be considered as limiting the scope thereof. It is further noted that the figures are not necessarily to scale. The invention will now be described in greater detail with reference to the accompanying figures.
DETAILED DESCRIPTION OF THE FIGURES
[00060] In the following discussion that addresses a number of embodiments and applications of the present invention, reference is made to the accompanying figures that form a part hereof, and in which is shown by way of illustration specific embodiments in which the invention may be practiced. It is to be understood that other embodiments may be utilized and changes may be made without departing from the scope of the present invention.
[00061] Various inventive features are described below that can each be used independently of one another or in combination with other features. However, any single inventive feature may not address any of the problems discussed above or only address one of the problems discussed above. Further, one or more of the problems discussed above may not be fully addressed by any of the features described below.
[00062] For purposes of this document, the following definitions shall apply.
[00063] "Graphical Interface Device", (GID), shall be understood to include a superset of personal computing devices including but not limited to computing devices based on the Windows, Macintosh, and variants of the Unix operating system as well as mobile operating systems such as Apple's iOS, Google's Android and any other computing devices with a graphical user interface that accept keyboard input as well as pointing device input in the form of mouse events as defined below.
[00064] "Modifier keys" are the keys on a standard computer keyboard such as shift, command, option or alt, function, and control which do not transmit an ASCII code when pressed but which modify the ASCII code transmitted by the remaining keys.
[00065] "Mouse-touch events" shall be understood to be a catch-all phrase that includes well known pointing device position and button state events as generated by a computer mouse, track ball or other pointing device as well as touch events as generated by a touchscreen, touchpad or similar input device.
[00066] "Low-level user input events" or simply "low-level events" refer to mouse- touch events that are closest to the physical actions of the hardware such as the click of a button, or a touch on a touch sensitive surface. In contrast, "high-level events" are often combinations of low-level events. They may comprise a gesture and may be associated with a specific meaning or semantics. For example, a "swipe" in the art may be composed of a touch followed by a quick motion in a single direction while remaining in contact with the touch sensitive surface and finishing with the lifting of the fingers from the touch surface. The semantics of the swipe may be to progress through an ordered collection such as advancing the current page in a document, scrolling a document up or down within a window, or moving a text cursor to the end or beginning of a line of text.
[00067] "Keyboard events" shall be understood to include the hardware and software events typically generated by a mechanical, membrane, or virtual keyboard. Keys may be pressed individually, in sequence or in combination with modifier keys.
[00068] "User input semantics" or just "semantics" shall be understood to be the meaning of user interactions with a GID within the current context of a software application or operating system. For example, the semantics of pressing a key on a keyboard might be to insert the associated ASCII character into a document while a key pressed in conjunction with a modifier key might move the text cursor through the
document, delete a line or another command to the editor.
[00069] A "Drag event" shall be understood to mean a subset of mouse-touch events as defined above that indicates the moving of the pointing device while the mouse button is held down or, in a touch environment, while in continuous contact with the surface. For example, the semantics of a drag event may be the moving of a selected user interface object, the moving of selected text, or the selecting of text, depending on the current program state when the drag event occurs.
[00070] A "touch sensitive keyboard" shall be understood to mean either 1) a computer keyboard with discrete keys where the tops of the keys form a single touch sensitive surface that can be used as a pointing device in a similar manner to a touch pad or mouse, or 2) a keyboard represented on a touch sensitive pad or touchscreen display (sometimes referred to as a virtual keyboard) where the pressure necessary to register as a mouse-touch event can be distinguished from a greater pressure that registers as a key press. In this latter case the touch sensitive keyboard may be implemented using a tablet computer as long as it is capable of distinguishing more than one level of pressure.
[00071] Referring to FIGS. 1-4, a keyboard system 1 including a touch sensitive keyboard 2, a touch event processor 3 and at least one graphical interface device (GID) 4 is illustrated. The touch sensitive keyboard 2 includes a plurality of discrete keys 7 and a plurality of sensor pads 1004. Each of the plurality of discrete keys 7 includes a key cap 110, a key press switch 120 and a touch sensitive top layer 100.
[00072] FIG. 2 shows a keyboard 2 with discrete keys 7 where the top surfaces 100 of the keys 7 are touch sensitive and, when taken altogether, make up a large touch sensitive surface. The surface 100 of the keys 7 is thus akin to a large touchpad. The user may touch the surface with one or more fingers and may easily move his or her fingers across the surface. Doing so will generate touch movement signals in a similar manner to a touchpad or touch screen without depressing any key 7 far enough to generate a key press.
[00073] FIG. 3 shows a typical single key 7, its keycap 110, and indicates the top of the key 7 may be a touch sensitive surface 100, while at the bottom of the key 7 the switch 120 is triggered when the key 7 is fully depressed.
[00074] FIG. 4 shows a block diagram of a typical sensor configuration for reporting key presses. Key presses are typically detected by a mechanical or membrane switch
10 located under each key 7. Key presses are detected by the key press sensor 20 and converted to standardized electric signals that are transmitted externally from the keyboard, usually by USB (universal serial bus), 70, or alternatively wirelessly through Bluetooth or other networking technologies.
[00075] Referring to FIG. 5A, the plurality of self capacitive sensor pads 1004 is arranged around each of the plurality of discrete keys 7 of the touch sensitive keyboard 2 and covered with a top cosmetic coating 1000. The plurality of sensor pads 1004 is configured to transfer a keyboard input signal. Each of the plurality of sensor pads 1004 includes a via 1005 connected to a signal trace line back to the touch event processor 3. In order for multiple signal trace lines to travel around the keys, a plurality of signal trace layers may be employed. The set of signal trace layers when taken together is referred to as the routing layer 1003. The plurality of circuit board layers includes a top sensor layer 1001, a ground plane layer 1002 and a routing layer 1003. Each of the plurality of sensor pads 1004 appears on the top sensor layer and is connected to the routing layer 1003 through the ground plane layer 1002. The routing layer 1003 connects each of the plurality of sensor pads 1004 with the touch event processor 3. The touch sensitive surface 100 on each of the plurality of discrete keys 7 and the top cosmetic coating 1000 of the plurality of sensor pads 1004 combine to form a touch sensitive surface on the touch sensitive keyboard 2. In FIG. 5B, the sensor pads 2005 are placed on the tops of the keys, just under a thin key cap, or in some embodiments, the sensor pads 2005 are fabricated on top of the key and then coated with an insulating layer. The frame of the keyboard is used to house circuit traces 2002 to route the electrical signals from the key sensor pads around the key wells 2006 to be processed. A flexible conducting connector bridges the conductive sensor pad on the key to a connection on the keyboard frame 2004. In one embodiment, the end of each flexible connector connects through a via in the ground plane 2001 to the circuit trace layer below. In another embodiment, multiple circuit board layers may be utilized to carry the signal given the narrowness of the passage between the keys on the keyboard. In some embodiments, these separate layers are separated by ground planes to minimize or prevent signals from interfering with each other.
[00076] As shown in FIG. 1, the touch event processor 3 is coupled to a memory unit wherein the event translation software enables the processor 3 to perform a set of predefined actions corresponding to at least one touch input on the touch sensitive
keyboard 2. In the preferred embodiment, the touch event processor 3 receives the keyboard input signal from the plurality of sensor pads 1004 through the plurality of sensor layers. The keyboard input signal is generated by a touch on at least one of the plurality of discrete keys 7 and optionally a drag event. The touch event processor 3 includes a touch determination module 5, a low-level event module 11 for generating a data packet that describes the mouse touch events, and a high-level event module 12 which monitors the stream of data packets and either sends the command directly to the computer or further modifies the translated data packet to a modified translated data packet, which is then sent to the computer. In the case where the translated data packet is interpretable by the computer without modification, the high-level event module 12 passes it directly. In the case where the translated data packet is not interpretable by the computer, the high-level event module 12 modifies it prior to it being sent. In either case, the computer is instructed to perform an action, and the at least one graphical interface device 4 is configured to display the at least one action.
[00077] FIG. 6 shows a block diagram of the touch sensitive components of a touch sensitive keyboard 2. An alternative method of implementing touch sensitive surfaces is by deploying a plurality of driver lines 70 orthogonal to a plurality of sensor lines 60. Each of the plurality of driver lines 70 and each of the plurality of sensor lines 60 is separated by a non-conductive film 80 so that no physical connection between driver 70 and sensor lines 60 is made. Each of the plurality of driver lines 70 in sequence is momentarily connected to a voltage source, which induces capacitance in each of the plurality of sensor lines 60 where the driver 70 and sensor lines 60 cross. This capacitance can be measured.
[00078] Wherever a finger is touching a key cap 110, the capacitance reported by sensor lines 60 just under the surface of the key 7 will be different from where no finger is present. In this way, the presence and position of each user touch may be determined. In this diagram there are four such intersections of driver and sensor lines under each key cap, so that the position of a touch within the key, e.g. lower left or upper right, can be determined with greater resolution.
[00079] In general, more sensor-driver line intersections per unit area lead to greater position resolution. This precision is necessary so that motion across the surface of each key 7 as well as across the full keyboard surface can be computed.
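A scan of such a grid might be sketched as follows; this hypothetical Python fragment treats the line-driving and sensing hardware as injected functions, which is an assumption made purely for illustration.

    # Hypothetical sketch: scanning a mutual-capacitance grid by energizing
    # each driver line in turn and sampling every sensor line; changes in
    # the measured values mark finger positions at the crossings.
    def scan_grid(num_drivers, num_sensors, drive_line, read_sensor):
        frame = []
        for d in range(num_drivers):
            drive_line(d, on=True)   # momentarily connect one driver line
            row = [read_sensor(s) for s in range(num_sensors)]
            drive_line(d, on=False)
            frame.append(row)        # capacitance at each crossing
        return frame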
[00080] Turning now to FIG. 7, the driver 140 and sensor lines 130 (with separating layer, not shown) are shown attached to the undersurface of the keycap 150.
[00081] FIG. 8 depicts an alternate embodiment wherein the driver 160 and sensor lines
170 may be embedded in a flexible cloth or film, 180, that is bonded to the top and sides of the keys 7 with enough slack between keys 7 to allow the keys 7 to be pressed.
[00082] FIG. 9 shows a functionally similar setup implemented using a touchpad without discrete keys 7. The key positions shown are thus drawn on the pad or displayed on the touchscreen. In some embodiments, the keys 7 may be contoured or equipped with force feedback to give the user some tactile sense as to the position, extent, and sensitivity of the key areas. The surface must be capable of distinguishing between a light touch for generating mouse-touch events and a heavier touch to generate key press events. This may be accomplished in a plurality of ways, including a series of dome or membrane switches deployed under the touch surface, or the touch sensitive surface itself may be capable of detecting a light touch for generating mouse-touch events and a heavier touch to generate key presses. As shown in FIG. 9, the surface is a contiguous touch sensor, functioning both as a touch surface and keyboard 2.
[00083] In another embodiment, a tablet computer may be used as the keyboard 2 with the keys 7 represented graphically on the screen. Once again, the tablet must be able to sense and provide feedback to the user as to the difference between touching the surface and a harder touch that corresponds to a keypress. One way of achieving this is to measure the size of the touch area. A light touch will produce an area in contact with the surface that is smaller than a heavier touch meant to serve as a key press.
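A contact-area test of this kind might be sketched as follows; the hypothetical Python fragment and its threshold value are assumptions for illustration only.

    # Hypothetical sketch: distinguishing a light touch from an intended
    # key press on a tablet keyboard by the size of the contact area.
    AREA_KEYPRESS_MM2 = 55.0  # assumed calibration threshold

    def classify_contact(contact_area_mm2):
        if contact_area_mm2 >= AREA_KEYPRESS_MM2:
            return 'keypress'  # a heavier touch flattens more of the fingertip
        return 'touch'         # light contact generates mouse-touch events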
[00084] Regardless of embodiment, the touch sensitive keyboard 2 connects to the computer via a common USB, Bluetooth, or other networking connection.
[00085] In the preferred embodiment, the touch sensitive keyboard 2 enables a user to provide an input through the plurality of discrete keys 7 by a tap-touch. The touch sensitive keyboard 2 enables the keyboard system 1 to perform the at least one action by combining the touch screen gesture with the at least one of the plurality of discrete keys 7 without utilizing an external pointing device.
[00086] FIG. 10 depicts a block diagram showing the logic behind interpreting and translating the raw sensor outputs from multiple sensors into low-level events and then further to high-level events as defined above. These low-level events are events which can be processed by existing applications in accordance with an embodiment of the present invention.
[00087] The high-level event module 12 also monitors low-level events output by the low-level event module 11. Additional information is provided by the touch determination module 5, such as whether a shift key is being touched but not depressed and on which key a tap occurred. This information is used to add semantics to low-level events (e.g. a shift-tap on the 'd' key is interpreted differently from a shift-tap on an 'l' key, and may be interpreted differently if the shift key is held down). The at least one touch input can be a low-level event or a high-level event. The low-level event is processed by a touch determination module 5 and the low-level event module 11 and the high-level event is processed by a high-level event module 12. When a user provides a touch input to the plurality of discrete keys 7, touch sensors 8 sense touch events, tap-touch and gesture input while a key press sensor 9 senses key press inputs. Although each keypress also generates an initial touch, a touch by itself has no semantics and generates no output unless it is part of a tap, tap-touch, double tap, drag or three-fingered touch. A mouse down is generated by a double tap. A tap consists of a touch followed immediately by a release. Any key press immediately following a tap on the same key removes the touch from consideration. Thereafter, the gesture input and the key press inputs are transmitted to the touch event processor 3. The low-level event module 11 receives a data packet describing the inputs, and using event translation software translates the data packet to a translated data packet and transmits the translated data packet to the high-level event module 12. A keyboard-mouse driver logic 13 coupled with an operating system 14 in the graphical interface device 4 displays the at least one action performed by the high-level event module 12. In this way, the user can enter data through the touch sensitive keyboard 2 without utilizing the pointing device.
LOW-LEVEL AND HIGH-LEVEL EVENTS
[00088] The low-level events generated by the hardware circuitry employed in a mechanical keyboard 2 design with touch sensitive keys 7 may be quite different from
those generated by a keyboard 2 implemented on a touchpad or touchscreen tablet. For example, a touch pad or tablet device often generates absolute position information: a touch at the lower left of the input device is mapped directly to a position on the lower left of the user's display in a linear fashion. Additionally, touches hard enough to trigger a key press event will have to be mapped from an x-y position to the key 7 presented at that location. As a final example, dragging one's fingers across a touchscreen enabled tablet will generate a touch-move gesture while the surfaces of discrete keys 7 on a keyboard 2 may report contact with one to four keys 7 at a time, and positions on these keys 7 as well as the keys 7 being touched change as the drag continues. Whatever the differences, these low-level events must be cleaned up and translated into a common set of mouse-touch and keyboard events. This is the job of the low-level event module.
[00089] Referring to FIG. 11A, it is one object of the invention to avoid the need for the user to move her hands from the preferred touch typing position on the keyboard
2. This inherently limits the distance in any direction that fingers may be moved across the surface of the keyboard 2 without having to lift and move the hand. In order to generate mouse-touch position information for the GID 4 that potentially might include multiple displays, the present invention translates all low-level events from absolute to relative motion events in the manner of a mouse or track pad and not a touch screen display: that is, a new position is determined by the previous position modified according to the direction and speed of the user's fingers. The preferred embodiment includes a method for displaying the at least one action in response to the at least one touch input utilizing the touch sensitive keyboard system 1. The method commences by providing the at least one touch input to the at least one of the plurality of discrete keys 7 of the touch sensitive keyboard 2. Next, the plurality of sensor pads 1004 arranged around each of the plurality of discrete keys 7 is enabled to transfer the keyboard input signal to the touch determination module 5 within the event processor
3. Then the touch determination module 5 is enabled to interpret the keyboard input signal and to generate the data packet which is sent to the low-level event module 11 for translation. The translated data packet is sent to the high-level event module, which receives the translated data packet corresponding to the at least one touch input and either passes said translated data packet to the computer or modifies said translated data packet and passes the modified translated data packet to the computer, thereby instructing the computer to perform at least one action. Finally, the at least one
graphical interface device 4 connected with the touch event processor 3 is enabled to display the at least one action.
GENERATING A MOUSE CLICK AND MOUSE MOVEMENT
[00090] Even though the most fundamental user inputs are mouse events, the present embodiment does not employ a mouse but uses instead the touch sensitive keyboard 2. Some computer users rest their fingers lightly on the keys 7 or just above them. Requiring that the user never touch the keyboard 2 other than to generate a mouse event or a key press would require replacing an existing habit with one that requires more energy and effort of attention. Touching the tops of the keys 7 in and of itself is therefore not a good indicator of an intentional mouse-touch event and so a different methodology must be found.
[00091] In contrast, lateral motion across the tops of keys 7 is not something users generally do today. Such a motion has no existing user interface semantics and is distinct from a key press and so the system is free to interpret it as mouse motion with low chance for error. The low-level event logic therefore ignores random mouse-touch events but responds to movements across the surface of the keys 7 to indicate motion events in the manner of an Apple multi-touch trackpad as used in macOS and other Apple operating systems. The position (or mouse) cursor may move anywhere a conventional mouse or trackpad might move it. Typically, hysteresis is employed: small movements that do not occur as part of a longer movement are ignored, that is to say a threshold of distance must be crossed before motion is reported. This is to eliminate random movements generated by a user as he/she types or randomly lifts or rests her fingers on the keyboard surface without intention to generate mouse events.
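A relative-motion filter with hysteresis might be sketched as follows; this hypothetical Python fragment, including its threshold value, is an assumption used only to illustrate the behavior described above.

    # Hypothetical sketch: converting finger positions on the key tops into
    # relative cursor motion, ignoring travel below a hysteresis threshold
    # so that resting or typing fingers generate no motion events.
    HYSTERESIS = 3.0  # assumed minimum travel before motion is reported

    class RelativeMotion:
        def __init__(self):
            self.anchor = None   # position where the touch began
            self.moving = False

        def update(self, pos):
            if self.anchor is None:
                self.anchor = pos
                return (0.0, 0.0)
            dx, dy = pos[0] - self.anchor[0], pos[1] - self.anchor[1]
            if not self.moving and (dx * dx + dy * dy) ** 0.5 < HYSTERESIS:
                return (0.0, 0.0)  # small jitter: report nothing
            self.moving = True
            self.anchor = pos      # subsequent deltas are per-update
            return (dx, dy)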
[00092] Implementing mouse click events can be defined in the following alternative ways.
[00093] Method 1) Simply include mouse buttons on the keyboard 2 below the space bar as shown in FIG. 11B. This is common on existing laptops that do not implement mouse clicks through their trackpad.
[00094] Method 2) The system uses a gesture to indicate the beginning of mouse events. To illustrate this principle, we define "tap-touch", as shown in FIG. 12. In this method, the user would raise at least one finger off the touch sensitive surface of the keyboard 2, then quickly tap the surface of a key or keys, 240, followed immediately by a touch, 250, in the same position. This gesture can be distinguished from random touches. If tap-touch is followed by a drag, then a selection is created from the position of the cursor, as with a three-fingered tap-drag in the Apple macOS operating system. In this case, tap-touch replaces the movement of the hand from the keyboard 2 to the mouse or track pad, the pressing of the mouse button and the returning of the hand to position on the keyboard, and as such is an improvement in efficiency and speed. The touch event processor 3 of the keyboard 2 recognizes the tap-touch gesture, translating it to the equivalent mouse-down event.
[00095] In this method, "tap-touch-drag" is defined as tap-touch with motion before raising the fingers again. The semantics defined for this gesture are usually selecting text, or selecting and moving a graphical object on the display. This is equivalent, and in most circumstances will be translated to mouse-down and mouse-move events. Raising the fingers at the end of the drag is equivalent to releasing a mouse button.
[00096] In this method, "tap-tap-touch-release", where release is defined as raising the fingers above the surface, is translated to a mouse double click. Tap-tap-touch-drag generates a double click and drag.
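A classifier for these gestures might be sketched as follows; the hypothetical Python fragment and its timing windows are assumptions, since the disclosure does not fix particular values.

    # Hypothetical sketch: classifying raw contact activity on one key top
    # into the gestures defined above.
    TAP_MAX_S = 0.20      # assumed: touch-and-release this fast is a tap
    RETOUCH_MAX_S = 0.25  # assumed: a touch this soon after a tap forms tap-touch

    def classify(events):
        """events: chronological list of (time, 'down' or 'up') contacts."""
        if (len(events) >= 2 and events[1][1] == 'up'
                and events[1][0] - events[0][0] <= TAP_MAX_S):
            if (len(events) >= 3
                    and events[2][0] - events[1][0] <= RETOUCH_MAX_S):
                return 'tap-touch'  # translated to a mouse-down event
            return 'tap'
        return 'touch'              # no semantics by itself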
[00097] Method 3) When there is a text insertion point in a block of text, holding the shift key down may be defined to create or extend a selection from the insertion point to the cursor position, and to continue modifying the selection when followed immediately by a mouse-drag event. The semantics are similar to holding a shift key while using the arrow keys to select text in existing applications, or using the shift key with a trackpad. This is as opposed to holding the shift key followed by a key press, which simply generates the shift value of the pressed key, e.g. 'a' to 'A'. In FIG. 13, with the text insert cursor at the position denoted by 300, the shift key is pressed and held down. The user drags across the keyboard until position 310, where the shift key is released and the desired text is selected.
[00098] Method 4) When there is no text insertion cursor or when the mouse cursor is not over text, the shift key acts as it normally does, which an application usually interprets as a command to constrain motion. Double pressing a shift key, or double tapping the keyboard surface, is the equivalent of double clicking the mouse button.
[00099] Method 5) Referring now to FIG. 14, a method for using the shift key as a mouse button outside of text is equivalent to a mouse click with the shift key down. This is referred to as an implied shift key. At each location 400, the user has
momentarily pressed the shift key. Each time, a new graphic object is added to the selection. Alternatively, the user could have clicked a mouse button if one were provided on the keyboard 2, or tapped two fingers on the keyboard 2 with the shift key held to accomplish the same thing as described in the above-mentioned method. To release an item from the selection, the user simply presses the shift key while the mouse cursor is over the item (or, while the shift key is held, taps with two fingers or clicks the mouse button). To release all items, simply tap two fingers, click the mouse button, or press and release the shift key while the mouse cursor is over empty space in the document. Note that like many alternatives of the preferred embodiment, this application of the shift key is an optional optimization and is not required.
[000100] Together, the low-level and high-level event logics receive the keyboard input signals from the touch sensitive keyboard 2 and translate them into standard mouse-touch events and, when appropriate, higher level gestures. Existing applications and operating systems may use these higher-level events from the keyboard 2 with no need for modification to the software. The event translation software is configured to enable the touch sensitive keyboard 2 to operate in a plurality of modes, including but not limited to a magnetic cursor mode and a magnetic mouse mode. The plurality of modes is described below:
MOVING THROUGH TEXT
[000101] An additional innovation of the present invention is the capability it presents to translate some mouse-touch events into a combination of mouse-touch events, keyboard command-key events and arrowKey events. Also, the "magnetic cursor" mode is a technique to further reduce the number of mouse events that need to be specifically generated by the user, thereby making interaction quicker and more efficient. With magnetic mode selected, certain events are generated automatically for the user. In this case, when a user has a text insertion cursor in some block of text, the text cursor and mouse cursor are combined. Moving the mouse pointer across text (by dragging across the surface of the touch sensitive keyboard's 2 keys 7) will advance the text insertion cursor through the text as shown in FIG. 15. A slow but deliberate leftward motion across the touch sensitive keyboard 2 moves the text insertion cursor leftward through the text one character at a time, as shown in FIG. 15. Unlike a typical use of a mouse, no mouse click is required at the end of the mouse movement
to set a new insertion point.
[000102] This behavior is similar to, and may in fact be implemented by, mapping the motion over the surface of the keys 7 to arrow key presses when the keyboard 2 can determine that the user has a text insertion point in text, as would always be the case with a software programmer's text editor. Alternatively, the behavior can be requested via a preference switch or toggle on the touch sensitive keyboard 2, or may be defined as the default mode for a text editor specialized for computer programming. This technique is defined as "magnetic mouse" mode.
[000103] FIG. 16 shows a similar motion through text in a vertical direction in magnetic mouse mode. A slow upward drag movement from 500 to 510 is illustrated. It is similar to multiple presses of the up arrow, and the low-level event logic may in fact choose to translate the motion into such presses.
[000104] In FIG. 17, with the magnetic mouse mode active, a faster movement across the surface of the keys 7 is represented using the mouse cursor, which moves smoothly, without stopping to show the text insertion cursor at every possible intermediate position, starting from the existing text insertion point at 600 vertically to release at 610. The rapid motion causes the appearance of the mouse cursor. For devices such as a tablet where there is no mouse cursor, a mouse cursor is added when a touch sensitive keyboard is attached. This may require support from an application running on the tablet or from the tablet operating system itself, but would allow the tablet to be used for serious text processing without the fatigue and inefficiency of moving one's finger from the keyboard to the tablet surface and back for each touch input. Rather than appearing at the previous position of the mouse cursor, as is the common practice of the day, the mouse cursor appears at the text insertion point and moves from there. The preferred embodiment thus partially combines, and more tightly relates, the text insertion cursor and mouse cursor by sharing gestures. In magnetic mouse mode, raising the dragging finger(s) off the surface causes a mouseUp event to be generated at the new position of the cursor. As illustrated in FIG. 17, the mouse cursor is past the end of a line when released. The text insertion cursor is placed in the nearest position where text may be entered, the same as it would be for a standard mouse click past the end of the existing line. The position of the mouse cursor is updated to share the position of the text cursor and is made invisible when the text cursor is displayed.
[000105] Magnetic mouse mode includes efficiencies that save events. Using a conventional pointing device, the user would need to move a hand to the pointing device, move the mouse to the desired location, click the mouse, return the hand to the keyboard 2 and continue typing. Using cursor control with arrow keys would require moving the hand plus an additional six key presses. Using a command key combination might avoid moving a hand, if the required modifier keys did not require repositioning, but would still require six key presses. In magnetic mouse mode, the user leaves both hands in place, keeps her eyes on the display and simply swipes upwards until the proper position is reached, then continues typing. This requires a single gesture instead of four or more, with no loss of attention on the working document.
[000106] Moving a block of text or a group of selected objects normally requires that the user make the selection, position the cursor over the selection, and then click and hold the mouse, or press and hold on a trackpad, continuing to hold while repositioning the objects. In the case of a text selection, one may not be able to move it this way at all, or one must wait with the pointing device depressed until a timeout threshold has been met and the selection becomes available to be moved. There are ways to make this much easier. We may define that a three-finger touch while over the selection enables it to be moved without a wait state and without holding the pointing device depressed. We can also define that tap-touch-hold over a selection does the same (tap-touch-release would still remove the selection and create a new insertion point in text, or generate a mouseUp while not in text). We can also take advantage of the ability to touch without depressing a key, defining that touching a shift key while tap-touch-dragging will move the selection.
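By way of illustration, the selection-move rules just described might be expressed as a small decision function. All gesture names and the returned action strings are assumptions of this sketch rather than a defined interface.

```python
# Hypothetical sketch of the selection-move rules described above. The
# gesture names and action strings are illustrative assumptions only.
def selection_action(fingers: int, gesture: str, shift_touched: bool,
                     over_selection: bool) -> str:
    if not over_selection:
        return "ignore"
    if fingers == 3 and gesture == "touch-drag":
        return "move-selection"            # no wait state, no button held
    if gesture == "tap-touch-hold":
        return "move-selection"
    if shift_touched and gesture == "tap-touch-drag":
        return "move-selection"
    if gesture == "tap-touch-release":
        return "clear-selection-and-place-insertion-point"
    return "default"

print(selection_action(3, "touch-drag", False, True))   # -> move-selection
```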
[000107] FIG. 18 shows the result of a rapid diagonal motion from 720 to 730 by the user. This causes the mouse cursor to appear and move from 700 to 710, followed by a click at 730 when the fingers are released, which places the text insert cursor at the new location.
[000108] Note that magnetic mouse mode is most useful in applications such as text editors or word processors, where most of the time is spent entering and moving about in text and where pushing a text insert cursor through text makes sense.
[000109] At the level of the touch sensitive keyboard's 2 low-level event module, there is not a one-to-one mapping between events generated by the touch sensitive keyboard 2 and equivalent mouse events in magnetic mouse mode. When this mode is turned on, the low-level event logic 11 may read motion events and then generate arrow key events when the motion is slow. Alternatively, sensing a rapid motion, it may generate motion events and an automatic mouse click event at the end. The effect is to give the user the sense of pushing a text insert cursor through the text, and it is this visual metaphor which makes this novel computer interaction intuitive. It is as if the pointing cursor were "magnetic" and, with a pause in motion, sticks to the underlying virtual object.
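A minimal sketch of this speed-dependent translation follows. The speed threshold and event dictionaries are assumed tuning values for illustration only; paragraph [000109] does not prescribe particular numbers.

```python
# A minimal sketch, under assumed thresholds, of the speed-dependent mapping
# of paragraph [000109]: slow motion becomes arrow-key events, rapid motion
# becomes mouse-motion events with an automatic click on release.
SLOW_SPEED = 0.5   # keys per second -- an assumed tuning constant

def translate_motion(dx_keys: float, dt_seconds: float, released: bool) -> list:
    events = []
    speed = abs(dx_keys) / dt_seconds
    if speed <= SLOW_SPEED:
        arrow = "rightArrow" if dx_keys > 0 else "leftArrow"
        events += [{"type": "keyPress", "key": arrow}] * round(abs(dx_keys))
    else:
        events.append({"type": "mouseMove", "dx": dx_keys})
        if released:
            # the automatic click sets the new insertion point, as described
            events += [{"type": "mouseDown"}, {"type": "mouseUp"}]
    return events

print(translate_motion(3.0, 0.2, released=True))   # fast: move + click
print(translate_motion(1.0, 3.0, released=True))   # slow: one arrow key
```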
MAGNETIC MOUSE GESTURES
[000110] With magnetic mouse mode on, a short, very rapid swipe along the surface of only one key causes the equivalent arrow key event to be generated in the direction of the swipe or, when there is no text insertion cursor but there is a keyboard focus, moves the focus from one field or button to the next focusable object in order. An advantage of this gesture over the tab key is that the tab key can only proceed to the next object, whereas the gesture is inherently two-dimensional and can advance to the next focusable object or text insertion point in the direction of the swipe. Thus, as shown in FIG. 19, the text insert cursor 600 is moved one character to the right 610 by a fast, short swipe. In FIG. 20 a similar quick swipe upward from 800 to 810 moves the text insert cursor one line up, from 820 to 830.
[000111] Referring to FIG. 21, a longer, very rapid movement across two keys from 920 to 930 is illustrated, which causes the text insert cursor to advance from 900 to the beginning of the next word at 910. FIG. 22 shows a rapid two-key movement vertically from 1000 to 1010, which causes the text insert cursor 1020 to move upward to the previous beginning of a block 1030, in this case the block where the text insert cursor already resided. On repetition of the gesture, the text insert cursor would move to the beginning of the next block above.
[000112] Referring to FIG. 23, a longer horizontal swipe over three keys from 1120 to 1130, when there is a text insert cursor, is illustrated; it has the meaning of moving the cursor all the way to the right side of the current text line, from 1100 to 1110. Similarly defined, a long swipe left will move the text input cursor to the left side of the line.
[000113] Referring to FIG. 24, a long, fast swipe in any direction "throws" the text cursor in the direction of the motion. The throw distance is settable by a user preference and may cause the document to scroll if so desired. This enables the user to gesture from 1230 to 1240, moving the cursor from 1200 to 1210 and letting it "glide" onward. This allows the user in a large document to throw the cursor a distance greater than he or she could drag the cursor in one stroke. The user can then precisely position the cursor with another stroke.
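The swipe gestures of FIGS. 19 through 24 can be summarized as a classification by distance and speed. The following sketch assumes distances measured in key widths and an arbitrary speed threshold; both are illustrative assumptions, not values taken from the specification.

```python
# Hedged sketch: classifying the magnetic-mouse swipes of FIGS. 19-24 by the
# number of keys traversed. Key-width units and the speed threshold are assumed.
FAST = 10.0  # keys per second, assumed threshold for a "very rapid" swipe

def classify_swipe(keys_traversed: float, speed: float) -> str:
    if speed < FAST:
        return "ordinary drag"                      # handled as cursor motion
    if keys_traversed <= 1:
        return "arrow key / move focus"             # FIG. 19, FIG. 20
    if keys_traversed <= 2:
        return "move by word or block"              # FIG. 21, FIG. 22
    if keys_traversed <= 3:
        return "move to start/end of line"          # FIG. 23
    return "throw cursor"                           # FIG. 24

for k in (1, 2, 3, 6):
    print(k, "->", classify_swipe(k, speed=20.0))
```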
[000114] As shown in FIG. 25, an upward swipe, 1330, where the fingers remain on the top row of keys, 1340, will move the cursor, 1300, to the top of the display, 1310, and continue scrolling upwards with the cursor through the text until the touch is released. The same is true for downward swipes that are held in the bottom row and horizontal swipes that are held at the left and right extremes of the keyboard 2 for panning right and left.
[000115] The event translation software is configured to enable the touch sensitive keyboard to operate with a plurality of commands to perform the at least one action. The plurality of commands includes but is not limited to a scrolling and a panning gesture command, a double hand gesture command, a tap key command and a tap gesture command.
SCROLLING AND PANNING GESTURES
[000116] Two fingers are used to scroll or pan the document, as is consistent with Apple multi-touch devices except for the modifications listed within, while three-finger gestures are defined to generate the native platform's commands for scrolling or panning and selection. Similarly, the system may define continuous scrolling or panning of the window contents (rather than the moving of a cursor through the document as described above). Moving the appropriate number of fingers to the edge of the draggable area (again, typically one more finger than for mouse motion) on the touch sensitive keyboard 2 and pausing there may be defined to cause continuous scrolling in the indicated direction while the hand remains at the edge.
[000117] As an example, a swipe upwards with multiple fingers, where the fingers momentarily remain on the uppermost row of the keyboard 2 after the swipe, initiates continuous scrolling until the fingers are lifted. The same is true for downward as well as left and right swipes, to scroll or pan in the appropriate direction.
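A sketch of this edge-hold behavior follows, under the assumption of a polling loop with an arbitrary repeat interval; in a real implementation this logic would live inside the keyboard's event loop rather than a blocking loop.

```python
# Illustrative only: continuous scrolling while the fingers rest on the top
# row after an upward multi-finger swipe. The repeat interval is assumed.
import time

def continuous_scroll(is_held_at_top, emit, interval=0.05):
    """Emit upward scroll events while the touch remains on the top row."""
    while is_held_at_top():
        emit({"type": "scroll", "dy": -1})
        time.sleep(interval)

# Demo: simulate a touch that stays at the top edge for three ticks.
ticks = iter([True, True, True, False])
continuous_scroll(lambda: next(ticks), emit=print, interval=0.0)
```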
DOUBLE HAND GESTURES
[000118] Tablet computers may define a two-fingered pinch gesture to zoom in or out, using two fingers of the same hand and drawing them closer together or further apart. Similarly, a tablet may define a rotation gesture, like twisting a dial. These may also be deployed by the touch sensitive keyboard 2, but the user's hand position on the keyboard 2, and the possibility of the user resting her fingers on the keyboard 2, make it easier to define similar gestures using two hands. In this case, the gestures are similar except that one or two fingers of each hand are brought closer together or further apart to indicate zooming, as shown in FIG. 26, while moving the fingers of each hand in opposite directions indicates rotation, as indicated in FIG. 27.
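One plausible way to detect these two-hand gestures is to compare the motion of the touches under each hand: hands moving apart or together indicate zooming, while opposite vertical motion indicates rotation. The geometry below is an assumption made for illustration, not a recognizer defined by the specification.

```python
# A sketch, under assumed geometry, of distinguishing the two-hand gestures
# of FIGS. 26 and 27. Deltas are per-hand motion vectors in key widths.
def two_hand_gesture(left_delta: tuple, right_delta: tuple) -> str:
    lx, ly = left_delta
    rx, ry = right_delta
    if lx < 0 and rx > 0:
        return "zoom in"        # hands moving apart
    if lx > 0 and rx < 0:
        return "zoom out"       # hands moving together
    if ly * ry < 0:
        return "rotate"         # hands moving in opposite vertical directions
    return "none"

print(two_hand_gesture((-1.0, 0.0), (1.0, 0.0)))   # -> zoom in
print(two_hand_gesture((0.0, -1.0), (0.0, 1.0)))   # -> rotate
```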
TAP KEY COMMANDS
[000119] Touching a modifier key, such as the shift key, while tapping another key enables the system to interpret that tap as a tap command as opposed to a mouse command.
[000120] Shift-tap commands, or simply "tap commands", make it possible to utilize taps on keys 7 to issue commands. Modern programmers' text editors commonly use combinations of modifier keys to enter commands. Learning these combinations is a tedious business. Tap commands are an easier-to-type layer of command keys. Some of the key commands for the popular Sublime Text editor are reproduced later in this document. Instead, since the preferred embodiment establishes tapping on the surface of the keys 7 as a new user input capable of detection, the invention may imbue taps on specific keys with specific semantics. For example, if it is an understood convention that mouse-touch events require two fingers acting in tandem, then tapping the 'w' key with a single finger may move a text cursor forward to the beginning of the next word. To differentiate a tap from a random touch, the force of the tap can be measured when the hardware is capable of such measurements, and the immediate raising of the finger inherent in a tap can be detected. Tapping 'b' might move backwards by a word. Tapping 'o' might open a new line after the current one without having to first go to the end of the line and hit return.
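A hedged sketch of this tap-versus-rest discrimination follows; the duration and force thresholds are assumed tuning constants, and force sensing is treated as optional hardware.

```python
# Hedged sketch of tap detection as described above: a tap is distinguished
# from a resting touch by its brief duration and, where the hardware permits,
# its measured force. Both thresholds are illustrative assumptions.
MAX_TAP_SECONDS = 0.15
MIN_TAP_FORCE = 0.3    # normalized 0..1; usable only on force-sensing hardware

def is_tap(duration: float, force=None) -> bool:
    if duration > MAX_TAP_SECONDS:
        return False                     # finger lingered: a rest, not a tap
    if force is not None and force < MIN_TAP_FORCE:
        return False                     # too light to be deliberate
    return True

print(is_tap(0.08, force=0.6))   # deliberate tap -> True
print(is_tap(1.20))              # resting finger -> False
```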
TAP GESTURES
[000121] In one embodiment, a touch sensitive keyboard 2 can be used to combine key commands with mouse commands. This is possible because the touch sensitive keyboard 2 is both keyboard and mouse, and taps always take place on top of a key. Touching a modifier key while tap-touching, for example touching but not depressing a shift key, distinguishes a tap command from a mouse-touch event. In the same way that depressing the shift key modifies the key code sent when pressing a key (usually to send the uppercase version of the letter associated with the key), touching the shift key modifies the event to send a tap command.
[000122] Tapping on a key 7 can also be combined with gestures. For example, a tap-touch on the 'd' key while touching the shift key (shift-tap-touch) with a single finger, followed by sliding left, may be defined to delete characters to the left of the text insert cursor. Similarly, as shown in FIG. 28, shift-tap-touch-slide right deletes characters to the right of the text insert cursor. FIG. 29 shows the resulting text after the user raises her hand after the tap-drag. Deleting several characters on a line using this method is faster than having to press the delete key for each character deleted: tap-'d'-slide versus <delete> <delete> <delete> <delete> <delete> <delete> <delete> <delete>.
[000123] Likewise, if shift-tapping 'd' twice ('dd') is defined to delete a line, shift-tapping twice and then dragging up or down might be defined to delete line by line upwards or downwards respectively.
[000124] Taps of different letters can be combined. Tapping 'dw' might be defined to delete the word at the text insert cursor, and 'dw'-slide might delete by word. Shift-tapping enables an easily accessible and intuitive way to define easy-to-remember commands, by leveraging the difference between touching and depressing a key along with the fact that every mouse-touch event also occurs on a particular key.
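The following sketch shows one way multi-letter tap sequences such as 'dd' and 'dw' might be matched against a command table. The table contents are a small assumed subset of the commands listed later in this document, and the matching rule is an illustrative assumption.

```python
# Illustrative sketch of shift-tap command sequences such as 'dd' and 'dw'.
# Matching prefers the longest defined sequence ending the tap buffer.
from typing import Optional

TAP_COMMANDS = {
    "dd": "delete line",
    "dw": "delete word at cursor",
    "o":  "open line below",
    "oo": "open line above",
}

def match_taps(buffer: str) -> Optional[str]:
    for length in (2, 1):
        if buffer[-length:] in TAP_COMMANDS:
            return TAP_COMMANDS[buffer[-length:]]
    return None

print(match_taps("d"))    # None -- may be the start of 'dd' or 'dw'
print(match_taps("dw"))   # delete word at cursor
```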
SUMMARY OF CONSISTENCIES WITH, AND DIFFERENCES FROM, CONVENTIONAL MOUSE-TOUCH EVENTS
[000125] Different operating systems that employ a GID define mouse-touch events that are for the most part consistent. To create a multi-touch keyboard, some gestures may be modified to increase functionality or ease of use, while supporting and taking advantage of the fact that one's hand need not move to the pointing device. The following gestures are single-finger gestures unless otherwise noted. These include but are not limited to:
a. single finger touch-hold does nothing (diff from mouse, same as trackpad)
b. a quick tap generates a mouseUp (same as Apple trackpad)
c. a quick double tap generates a double click (same as Apple trackpad)
d. a quick two-finger tap is a right mouse click (same as Apple trackpad)
e. option-quick tap is also a right mouse click (same as Apple trackpad)
f. tap-touch generates a mouseDown (diff from Apple trackpad)
g. fast, short touch-move-release generates an arrow key or moves keyboard focus (diff from Apple trackpad)
h. tap-touch-move generates mouseDown and move (diff from Apple trackpad)
i. a release from the active touch when the mouse is down generates a mouseUp
j. tap-tap-touch within text selects a word, while subsequent movement of the touch continues to select by word
k. tap-tap-touch behaves as a triple-finger double-tap
l. tap-tap-touch-drag selects a word and continues selecting by word as the touch is dragged (diff from Apple)
m. tap-tap-touch-drag is equivalent to a three-fingered double tap-drag on a Macintosh multi-touch trackpad
n. once a touch is the active touch for motion purposes, it stays the active touch until released (diff from Apple trackpad)
o. a touch inactive for more than a threshold time is released automatically (diff from Apple trackpad)
p. two-finger touch-move is scroll/pan (same as Apple)
q. three-finger touch = one-finger tap-touch (diff from Apple)
r. three-finger tap is an alternative mouse down (same as Apple)
s. three-finger touch-pause is mouseDown (same as Apple)
t. three-finger touch-drag is mouseDown + drag (same as Apple)
u. three-finger tap and tap-touch do not change the x-y position; they just cause a mouseDown/mouseUp at the position from before the three-finger tap or tap-touch started (diff from Apple trackpad)
[000126] Tap Commands:
a. tapping a key while touching the Shift key sends a tap command, if one is defined for the key.
b. tap-touch-drag while touching the Shift key invokes a tap command with the move parameters as arguments. Thus shift-tap-touch-'d' can delete in any direction. This command has to be processed by the high-level event module, which issues the appropriate low-level events.
[000127] What follows is a non-exhaustive list of some of the commands available while touching a modifier key such as a touch sensitive shift key.
a. single or double tap on a specific key to execute a command associated with that key.
b. single tap and drag starting or stopping on a specific key or keys: perform a command associated with the key in question on the range of text or of objects selected by the drag
c. double tap and drag beginning or ending on specific key or keys: perform an alternative command associated with the key in question on the range of text or of objects selected by the drag
d. single or multiple taps on more than one key to execute a command associated with the keys tapped; for example, tapping 'var' in sequence might bring up a list of variable names to select from
e. single or multiple tap and drag on more than one key to execute a command associated with the keys tapped
f. single or multiple tap with or without a drag and including a modifier key executes an associated command
g. preceding a tap by a tapped numerical value (taps on the number keys) executes the command the indicated number of times, e.g. tapping 4 before a tap command will execute the tap command 4 times.
h. With two fingers:
i. drag right or left from within a selection = shift selected lines left or right
ii. fast swipe left in text with no selection = fast move to beginning of line
iii. fast swipe right in text with no selection = fast move to end of line
iv. fast swipe up with no selection = fast move to beginning of section
v. fast swipe down with no selection = fast move to end of section
[000128] Examples: The following are examples of tap events which may be defined for actions typical of a word or text editor. Other suitable tap events may be employed. Below, a letter or letters in single quotes indicates a tap on the associated key or keys. A repeated letter or group of letters indicates repeated taps on the associated key(s). The symbol '#' indicates a number generated by tapping on the number keys on the keyboard. A single tap, or multiple taps while holding the touch on the last tap, may be defined to repeat the associated command until released. A sketch implementing a few of these commands appears after the list.
a. 'h', 'j', 'k', 'l': tap command alternatives for cursor motion. Other combinations of letters are possible in software such as games and may be defined or customized. Tap on the associated key to move the cursor in the associated direction. For the given example keys used within text, tapping 'h' moves the text cursor one character to the left; 'k', one line up; 'j', one line down; 'l', one character to the right. Tapping 'j' and 'k' may also be used to move through lists or menus.
b. 'o' = open line below current line
c. 'oo' = open line above current line
d. shift + 'o' = open a dialog box wherein the name of a file or other object can be entered
e. '<return key>' = alternative for 'o'
f. '<return key, return key>' = alternative for 'oo'
g. 'dd' = delete line
h. '#dd' where '#' indicates a tapped number: delete '#' number of lines, a way to pass a count to the command
i. 'd' drag = delete in the direction of the drag
j. 'd' drag left = delete backward
k. 'd' swipe = delete in the direction of the swipe to the end of the logical unit in that direction, e.g. 'd' swipe right might delete to the end of the line
l. 'cc' = copy current line
m. 's' drag: select in the direction of the drag
n. 'ss' drag: alternative of 's' drag
o. 'ww': select the word at the current cursor position
p. 'ww' drag right or left: move the cursor, selecting right or left a word at a time
q. 'ww' drag up, down, or diagonally: vertical motion selects by line, horizontal by word
r. 'sw': select a word
s. 'sw' drag: alternative similar to 'ww' drag described above
t. '{': move to the beginning of the current section, or the beginning of the preceding section if already at the beginning of a section. That which constitutes a section is defined by the application. For example, a programmer's text editor might go to the beginning of the current code block, which is delimited in many programming languages with the character '{'.
u. '}': move to the end of the current section
v. : select a line
w. 'si': alternative for selecting a line
x. 'dl': alternative for deleting a line
y. 'p' paste
z. ']]' shift indicated line(s) right
aa. '[[' shift indicated line(s) left
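As promised above, here is a minimal sketch implementing a few of the listed commands, including the numeric count prefix of item h ('#dd'). The command table and parsing rule are illustrative assumptions only; a full implementation would wire up the whole list.

```python
# A minimal sketch, assuming the notation of paragraph [000128], of parsing
# a numeric count prefix followed by a tap command, e.g. '4dd' = delete 4 lines.
import re

COMMANDS = {"dd": "delete line", "cc": "copy line", "p": "paste"}

def run_tap_sequence(seq: str) -> list:
    m = re.match(r"^(\d*)(\D+)$", seq)
    if not m or m.group(2) not in COMMANDS:
        return []
    count = int(m.group(1)) if m.group(1) else 1
    return [COMMANDS[m.group(2)]] * count

print(run_tap_sequence("4dd"))   # -> ['delete line'] repeated 4 times
print(run_tap_sequence("p"))     # -> ['paste']
```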
[000129] Below are additional two-finger motion gestures.
a. drag right or left from within a selection = shift selected lines left or right
b. fast swipe left in text with no selection = fast move to beginning of line
c. fast swipe right in text with no selection = fast move to end of line
d. fast swipe up with no selection = fast move to beginning of section
e. fast swipe down with no selection = fast move to end of section
APPLICATION DEFINED TAP KEY COMMANDS AND TAP GESTURE COMMANDS
[000130] Taps on keys may be defined to have application-specific meanings. For example, applications define command-key equivalents to use for menu shortcuts or other functions specific to the application. A keyboard utility can be used to map these functions to tap commands. Alternatively, an application can support tap commands directly, as well as tap-gesture commands. For example, in a graphics editor, tapping 'b' and dragging might draw with the brush tool, while tapping 'p' and dragging might draw with the pencil tool. The novelty lies in the ability to distinguish a gesture which begins on one key from the same gesture beginning on a different key.
MODELESS TEXT EDITOR
[000131] Some popular text editors used by software engineers attempt to solve the difficulty of typing complex and convoluted command key combinations by creating modes. A mode is a state in which keys 7 typed on the keyboard are interpreted differently than the same keys typed while in a different mode. For example, using the text editor VIM, the user switches between "command mode" and "text insert mode". In command mode, each key is interpreted as being part of a command. In text insert mode, keys typed are interpreted as characters to be inserted into the file. In real-world editing, commands and text tend to group together: one tends to type a few commands, then some text, then some commands. For example, editing often involves issuing commands to position the text cursor properly, followed by a stream of text that is entered into the file, then more commands to go to another location. Thus, rather than needing to type a modifier key along with each command letter, one types a command to enter command mode, where all keys are commands. A command to switch to text insert mode can be issued, and thereafter all key presses will be interpreted as text.
[000132] The most frequent and damaging mistake with modal editors is the user being in one mode when he or she thought the other mode was selected. In one embodiment of the invention, a modified VIM could be created that has but a single mode. The VIM commands are defined as taps on the surface of the keys 7 normally used for commands in command mode. Normal typing is always text insert mode. Thus, in single-mode VIM, pressing the 'w' key always enters the 'w' character, while tapping 'w' will move the cursor to the right one word. The user is never stuck in the wrong mode, as a tap is always a command while a key press always enters text.
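A sketch of this single-mode dispatch follows: a key press always inserts text, while a tap always executes the command the key would carry in VIM's command mode. The command names are assumptions for illustration, not a complete VIM command set.

```python
# Hedged sketch of the single-mode editor described in [000132]: pressing
# always inserts text, tapping always runs the key's command-mode command.
VIM_TAP_COMMANDS = {"w": "move forward one word", "b": "move back one word"}

def handle_key(key: str, was_pressed: bool) -> str:
    if was_pressed:
        return f"insert character {key!r}"          # typing is always text
    return VIM_TAP_COMMANDS.get(key, "no command")  # tapping is always command

print(handle_key("w", was_pressed=True))    # insert character 'w'
print(handle_key("w", was_pressed=False))   # move forward one word
```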
LOW-LEVEL EVENT TO HIGH-LEVEL EVENT MAPPING
[000133] As described in FIG. 10, illustrating the processing of low and high-level events, the low-level events are those which map closely to the actions of the hardware, e.g. a key press or a mouse click. High-level events are often combinations of low-level events; e.g. a fast swipe to the right initiated on the 'd' key is a combination of several low-level events that might be defined with the semantics "delete one character to the right of the cursor". This high-level event could be sent by the keyboard 2 to the GID 4 in different ways. The simplest event to send might not be a gesture or a series of mouse-touch events; instead it might be more concise and accurate to translate this gesture into a press of the forward delete key. (Note that even if the keyboard does not have a forward delete key, as is the case for many Apple keyboards, the code for forward delete is still defined and understood by the application.)
[000134] Certainly, novel inputs such as tap-drag commands, for example tapping 'd' and dragging to delete, may be directly defined and supported by an application, or may be mapped into a sequence of existing mouse-touch and keyboard events with meaning to the operating system or application. For example, shift-tap-dragging from the 'd' key might generate a mouse-down and a series of drag events to select text, followed by a mouse-up event and a final press of the delete key.
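The two delivery strategies described in paragraphs [000133] and [000134], a concise synthetic key press versus an expanded event sequence, might be sketched as follows. The key code and event dictionaries are assumed placeholders, not a real HID protocol.

```python
# Illustration of [000133]-[000134]: a 'd'-key swipe right may be sent most
# concisely as a forward-delete key press, or expanded into a select-then-
# delete event sequence. The key code shown is an assumed placeholder.
FORWARD_DELETE = 0x75   # assumed key code, for illustration only

def emit_delete_right(concise: bool) -> list:
    if concise:
        return [{"type": "keyPress", "code": FORWARD_DELETE}]
    # Equivalent expansion into existing mouse-touch and keyboard events:
    return [
        {"type": "mouseDown"},
        {"type": "mouseDrag", "dx": +1},
        {"type": "mouseUp"},
        {"type": "keyPress", "key": "delete"},
    ]

print(emit_delete_right(concise=True))
print(emit_delete_right(concise=False))
```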
TRAINING AND LEARNING
[000135] One goal of the touch sensitive keyboard 2, combined with this method of user interaction, is to enable the user to be more productive by eliminating the need to move her hand from the rest typing position on the keyboard 2. To that end, the low and high-level event manager software may be able to learn from the user's interaction to extend or contract the distance required to distinguish certain gestures from each other, for example allowing a longer drag for character positioning than simply one key distance. Each user has different sized hands and finger reach. When a gesture is followed by smaller counter-gestures to correct it, it may be possible to recognize this as an error in interpretation and adjust the interpretation to fit the user.
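One speculative way to implement this adaptation is to widen the per-character drag threshold whenever a gesture is followed by corrective counter-gestures. The update rule and rate below are assumptions for illustration, not a method prescribed by the specification.

```python
# A speculative sketch of the adaptation described in [000135]: corrective
# counter-gestures suggest the drag was interpreted as travelling too far,
# so the distance required per character is nudged upward to fit the user.
def adapt_threshold(threshold: float, overshoot_chars: int,
                    rate: float = 0.05) -> float:
    return threshold * (1.0 + rate * overshoot_chars)

t = 1.0   # key widths of drag per character, assumed starting point
for correction in (2, 1, 0):
    t = adapt_threshold(t, correction)
print(round(t, 3))   # threshold after three observed gestures
```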
[000136] Some more complicated gestures may benefit from training or user customization. Having the user repeat each gesture several times can allow the recognition software to adjust to the individual user. For example, a programmer might define a gesture and associate that with a compile and build command.
CUSTOMIZATION
[000137] Originally typewriters, and even more so computers, were too expensive for each individual to have his or her own. Standards were created so that anyone trained on one typewriter could still type on another. This is similar to the piano keyboard where many people are expected to be able to use the same expensive instrument.
[000138] A keyboard 2 may be customized to an individual as well as to the tasks that the individual performs. As an analogy, a bicycle is fit to an individual bike racer's size and strength, while the design of the bike varies for terrain, e.g. a road bike versus a mountain bike. Thus, a keyboard 2, even a mass-produced one, for a programmer really should differ from the keyboard 2 used by a writer of fiction, as programmers use special punctuation characters far more frequently. Further, a person with large hands and thick fingers can certainly benefit from a different keyboard than that tailored to someone with small hands and thin fingers.
[000139] The strongest factor holding back the adoption of custom keyboard design is that moving one's hand to and from a mouse or trackpad dominates computer-human interaction time and, as such, can make other layout optimizations not worth the effort.
[000140] By eliminating the need for a trackpad entirely, the touch sensitive keyboard opens up space for customization. Thus, used in combination, the touch sensitive keyboard and customization of keyboard layout and key sizes can achieve greater efficiency than ever before. We may now propose an arrangement and sizing of keys based on a user's hand size and the tasks they perform. The further a finger must reach for a key, the more likely it is to miss the key and hit the wrong one. Thus, the further the user must reach, the larger the key should be if the error rate is to be minimized. Eliminating the trackpad from a laptop increases the area available for the keyboard, allowing harder-to-reach keys to be made larger and/or more spaced apart. We are now free to create a design based on the calculated frequency of use, error, and distance, as well as arrangements for specific hands and specific tasks such as programming. A premium is placed on reducing the need for difficult modifier key combinations or positions that require hand movement away from the base location or are otherwise slow and difficult to reach. For example, the '[', ']', '{', and '}' keys are rare in natural language usage but are some of the more commonly used keys in computer programming. Thus, for a programmer, one might move the bracket keys below the space bar, in the area freed up by the elimination of the trackpad, so that they can be struck by the user's thumbs. Alternatively, they might be moved into the gap opened up by the ergonomic splitting of the keyboard 2 and rotation of the two halves to be more in line with the hands of the user; thus, the bracket keys may be moved between the 'g' and 'h' keys. Normally, experimenting with such layouts would not be possible on a laptop. Alternatively, the caps lock key, a large and importantly placed key which nevertheless has little utility in programming, may be used instead as a different, more utilized key, such as the control key, with a different key being used for caps lock, or with a combination such as a double tap on the caps lock key required to activate/deactivate the caps lock function.
[000141] In addition to measurement of finger length and finger reach under bent and straight finger configurations, the user may be observed and the results measured during specific exercises and/or the typical work for which they use the keyboard, as directed by a specialist or by a series of self-directed exercises that involve custom mechanical or virtual keyboards with the ability to adjust layout. Different locations, arrangements, and sizes of keys may be tried to find the most rapid and accurate typing for the user's usual tasks.
[000142] In one such other embodiment, the sensors are located between the keys of a keyboard. Turning to FIG. 30, the sensors are shown as vertical lines 1400 between each of the keys, while the wiring for the sensors is shown as horizontal lines 1410 between rows of keys. In FIG. 30, the sensors are located on every space of the keyboard that has a key on its left and right; in other embodiments, not shown, fewer sensors are placed between keys. In still further embodiments, sensors may be located both on the keys and between the keys. To further improve on these embodiments with sensors between the keys, additional sensors may be present to determine the height of a finger over the keys, or for detecting a tap gesture, so as to distinguish meaningful touch movements from the user simply resting his or her fingers on the keys.
[000143] In one embodiment, a non-transitory computer-readable medium comprises computer-executable instructions stored therein for causing a computer to implement a program executable on a keyboard system 1 for a method for displaying at least one action in response to at least one touch input. The non-transitory computer readable storage medium may comprise a memory card, USB drive, CD-ROM or flash memory, to list but a few. In one embodiment, a non-transitory computer-readable medium comprises computer-executable instructions stored therein for causing mobility solutions to implement a program executable on a keyboard system 1 that enables the keyboard system 1 to perform at least one action by combining a touch screen gesture with at least one of a plurality of discrete keys 7 without utilizing a pointing device.
[000144] In use, the present invention provides several benefits. Firstly, a substantial increase in the speed and efficiency of user interaction is provided even with existing software on GIDs 4 due primarily to a reduction of the number of required actions to execute a task on a GID 4 and the elimination of the time it takes for the user to move her hand from the keyboard 2 to a mouse, trackpad, or set of arrow keys to perform an action, and then move her hand back to the keyboard 2 and position it for text entry. A further goal is the elimination of the necessity to move one's eyes from the display in
order to locate and use certain keys on the input device which cannot easily be found by touch alone, only then to return one's gaze to the display and refocus on the current task.
[000145] In the same way, then, that touch typing greatly increased the productivity of typists by eliminating the need to periodically look down at the typewriter keyboard, find the desired key, position the hand and type, this invention (a combination of three components: the touch sensitive keyboard 2, a simple but complete set of interactions, and the mapping of those actions to events compatible with existing applications) extends a similar increase in efficiency to the modern workflow that includes display, keyboard 2 and pointing device.
[000146] Additionally, the reduction of repetitive motions and especially the elimination of lifting actions that stress the body asymmetrically such as repeatedly lifting and lowering a hand only on one side of the body can reduce the epidemic of repetitive motion injuries caused by those same actions.
[000147] Finally, the preferred embodiment is far easier to learn than existing methodologies for speeding user interactions with a keyboard 2, which require the memorization, both mental and physical, of a set of complex and often awkward command key combinations.
[000148] The foregoing description of the preferred embodiment of the present invention has been presented for the purpose of illustration and description. It is not intended to be exhaustive or to limit the invention to the precise form disclosed. Many modifications and variations are possible in light of the above teachings. It is intended that the scope of the present invention not be limited by this detailed description, but by the claims and the equivalents to the claims appended hereto.
Claims
A method for using a keyboard system as a component of a computer, the method comprising: a. providing a touch sensitive keyboard comprising a plurality of discrete keys arranged thereon to form the touch sensitive keyboard, each of the plurality of discrete keys having a key press switch and a key cap incorporating a capacitive touch sensitive top layer; and b. providing a touch event processor in association with the touch sensitive keyboard, the touch event processor installed with event translation software and coupled with a memory unit, wherein the event translation software enables the touch event processor to perform a set of predefined actions corresponding to at least one touch input on the touch sensitive keyboard, wherein the touch event processor receives the keyboard input signal from the plurality of key caps, the touch event processor comprising a touch determination module which interprets the keyboard input signal and generates a data packet; c. sending the data packet to a low-level event module; d. translating the data packet; e. sending the translated data packet to a high-level event module which receives the translated data packet corresponding to the at least one touch input and either passes said translated data packet to the computer or modifies said translated data packet and passes the modified translated data packet to the computer, thereby instructing the computer to perform at least one action; and f. at least one graphical interface device configured to display the at least one action performed by the computer.
The keyboard system of claim 1 wherein the keyboard input signal is generated by a touch on at least one of the plurality of discrete keys and a touch sensitive gesture.
The keyboard system of claim 1 wherein the touch sensitive keyboard enables a user to provide an input through the plurality of discrete keys by a tap-touch.
The keyboard system of claim 1 wherein the touch sensitive keyboard enables the
keyboard system to perform the at least one action by combining the touch screen gesture with the at least one of the plurality of discrete keys without utilizing a pointing device.
5. The keyboard system of claim 1 wherein the at least one touch input is a low-level event processed by the touch event processor in the low-level event module.
6. The keyboard system of claim 5 wherein the low-level event includes but is not limited to mouse-touch events.
7. The keyboard system of claim 1 wherein the touch event processor is configured to enable the touch sensitive keyboard to operate in a plurality of modes, wherein the plurality of modes includes but is not limited to a magnetic cursor mode and a magnetic mouse mode.
8. The keyboard system of claim 1 wherein the touch event processor is configured to enable the touch sensitive keyboard to operate with a plurality of commands to perform the at least one action, wherein the plurality of commands includes but is not limited to a scrolling and a panning gesture command, a double hand gesture command, a tap key command and a tap gesture command.
9. A method for displaying at least one action in response to at least one touch input utilizing a keyboard system having a touch sensitive keyboard with a plurality of discrete keys and a plurality of sensor pads, a touch event processor and at least one graphical interface device directed by a computer, the method comprising the steps of: a. providing at least one touch input to at least one of the plurality of discrete keys of the touch sensitive keyboard; b. enabling the plurality of sensor pads arranged around each of the plurality of discrete keys to transfer a keyboard input signal to the touch event processor installed with event translation software; c. enabling a touch determination module within the touch event processor to interpret the keyboard input signal and to generate a data packet describing said signal; d. enabling a low-level event module coupled with the touch determination
module to receive and translate the data packet into a translated data packet; e. enabling a high-level event module to receive said translated data packet and to either pass said translated data packet to the computer or to modify said translated data packet and pass the modified translated data packet to the computer, thereby instructing the computer to perform at least one action; and f. enabling at least one graphical interface device to display the at least one action.
10. The method of claim 9 wherein the keyboard input signal is generated by a touch on at least one of the plurality of discrete keys and a touch sensitive gesture.
11. The method of claim 9 wherein the touch sensitive keyboard enables the keyboard system to perform the at least one action by combining the touch screen gesture with the at least one of the plurality of discrete keys without utilizing an external pointing device.
12. The method of claim 9 wherein the action is a mouse touch event.
13. The method of claim 9 wherein the event translation software is configured to enable the touch sensitive keyboard to operate in a plurality of modes, wherein the plurality of modes includes but is not limited to a magnetic cursor mode, a magnetic mouse mode, a text mode and a command mode.
14. The method of claim 9 wherein the event translation software is configured to enable the touch sensitive keyboard to operate with a plurality of commands to perform the at least one action, wherein the plurality of commands includes but is not limited to a scrolling and a panning gesture command, a double hand gesture command, a tap key command and a tap gesture command.
15. A method of using a computer program product in association with a computer, the computer program product comprising computer executable instructions embodied in a non-transitory computer readable storage medium having a computer readable program code embodied therein, the computer readable program code configured to be executed on a computer system to implement a method for displaying at least one action in response to at least one touch input utilizing a keyboard system having a touch sensitive keyboard with a plurality of discrete keys each with a keycap incorporating a touch sensor, a touch event processor and at least one graphical interface device, the
method comprising the steps of:
a. providing at least one touch input to at least one of the plurality of discrete keys of the touch sensitive keyboard;
b. enabling the plurality of keycaps to transfer a keyboard input signal to the touch event processor installed with event translation software;
c. enabling a touch determination module at the touch event processor to interpret the keyboard input signal and to generate a data packet describing said signal; d. enabling a low-level event module coupled with the touch determination module to receive and translate the data packet into a translated data packet; e. enabling a high-level event module to receive said translated data packet corresponding to the at least one touch input and to either pass said translated data packet to the computer or to modify said translated data packet and pass the modified translated data packet to the computer, thereby instructing the computer to perform at least one action; and
f. enabling at least one graphical interface device to display the at least one action.
16. The method of claim 15 wherein the keyboard input signal is generated by a touch on at least one of the plurality of discrete keys and a touch sensitive gesture.
17. The method of claim 15 wherein the touch sensitive keyboard enables the keyboard system to perform the at least one action by combining the touch screen gesture with the at least one of the plurality of discrete keys without utilizing an external pointing device.
18. The method of claim 15 wherein the at least one action is a mouse touch event.
19. The method of claim 15 wherein the event translation software is configured to enable the touch sensitive keyboard to operate in a plurality of modes, wherein the plurality of modes includes but is not limited to a magnetic cursor mode, a magnetic mouse mode, a text mode and a command mode.
20. The method of claim 15 wherein the event translation software is configured to enable the touch sensitive keyboard to operate with a plurality of commands to perform the at least one action, wherein the plurality of commands includes but is not limited to a scrolling and a panning gesture command, a double hand gesture command, a tap key command and a tap gesture command.
Applications Claiming Priority (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US201562270000P | 2015-12-20 | 2015-12-20 | |
US62/270,000 | 2015-12-20 | |
US62/270,007 | 2015-12-20 |
Publications (2)
Publication Number | Publication Date |
---|---|
WO2017112714A1 true WO2017112714A1 (en) | 2017-06-29 |
WO2017112714A8 WO2017112714A8 (en) | 2017-11-23 |
Family
ID=59089970
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/US2016/067890 WO2017112714A1 (en) | 2015-12-20 | 2016-12-20 | Combination computer keyboard and computer pointing device |
Country Status (1)
Country | Link |
---|---|
WO (1) | WO2017112714A1 (en) |
Patent Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20100306427A1 (en) * | 2009-05-29 | 2010-12-02 | Aten International Co., Ltd. | Ps/2 to usb keyboard adaptor supporting n-key rollover |
US20120054671A1 (en) * | 2010-08-30 | 2012-03-01 | Vmware, Inc. | Multi-touch interface gestures for keyboard and/or mouse inputs |
US20130300668A1 (en) * | 2012-01-17 | 2013-11-14 | Microsoft Corporation | Grip-Based Device Adaptations |
US20140118264A1 (en) * | 2012-10-30 | 2014-05-01 | Apple Inc. | Multi-functional keyboard assemblies |
US20150058809A1 (en) * | 2013-08-23 | 2015-02-26 | General Electric Company | Multi-touch gesture processing |
Cited By (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10908727B2 (en) | 2017-11-02 | 2021-02-02 | Blackberry Limited | Electronic device including touchpad and fingerprint sensor and method of detecting touch |
JP2019192124A (en) * | 2018-04-27 | 2019-10-31 | 株式会社東海理化電機製作所 | Switch device and controller |
JP7137962B2 (en) | 2018-04-27 | 2022-09-15 | 株式会社東海理化電機製作所 | Switching device and control device |
CN112384884A (en) * | 2019-05-09 | 2021-02-19 | 微软技术许可有限责任公司 | Quick menu selection apparatus and method |
CN112306248A (en) * | 2019-07-29 | 2021-02-02 | 瑟克公司 | Hybrid circuit for touch pad and keyboard |
CN112306248B (en) * | 2019-07-29 | 2024-03-05 | 瑟克公司 | Hybrid circuit for touch pad and keyboard |
CN112131076A (en) * | 2020-09-17 | 2020-12-25 | 上海上讯信息技术股份有限公司 | Method, equipment and system for acquiring mouse operation event information |
CN116467059A (en) * | 2023-04-21 | 2023-07-21 | 哈尔滨有初科技有限公司 | Data processing system and method based on distributed computing |
Also Published As
Publication number | Publication date |
---|---|
WO2017112714A8 (en) | 2017-11-23 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US10061510B2 (en) | Gesture multi-function on a physical keyboard | |
US8542206B2 (en) | Swipe gestures for touch screen keyboards | |
KR101117481B1 (en) | Multi-touch type input controlling system | |
US10444989B2 (en) | Information processing apparatus, and input control method and program of information processing apparatus | |
JP6115867B2 (en) | Method and computing device for enabling interaction with an electronic device via one or more multi-directional buttons | |
WO2017112714A1 (en) | Combination computer keyboard and computer pointing device | |
JP2013527539A5 (en) | ||
US20110169760A1 (en) | Device for control of electronic apparatus by manipulation of graphical objects on a multicontact touch screen | |
EP0660218A1 (en) | An improved graphical keyboard | |
US20150100911A1 (en) | Gesture responsive keyboard and interface | |
EP2898397A1 (en) | Gesture-initiated keyboard functions | |
JP2000501526A (en) | Multi-touch input device, method and system that minimizes memory requirements | |
US10409412B1 (en) | Multi-input element for electronic device | |
US20140317564A1 (en) | Navigation and language input using multi-function key | |
KR20130002983A (en) | Computer keyboard with integrated an electrode arrangement | |
CN102103461A (en) | Method for realizing shortcut key mode on touch pad of notebook computer | |
US20090262072A1 (en) | Cursor control system and method thereof | |
US8970498B2 (en) | Touch-enabled input device | |
Zhang et al. | Gestkeyboard: enabling gesture-based interaction on ordinary physical keyboard | |
US20140298275A1 (en) | Method for recognizing input gestures | |
TWM486807U (en) | Peripheral device with touch control function | |
CN104503591A (en) | Information input method based on broken line gesture | |
US9632591B1 (en) | Capacitive keyboard having variable make points | |
CN103365451B (en) | Multidimensional speedup space-efficient man-machine interaction method and device for intelligent platform | |
Hall et al. | T-Bars: towards tactile user interfaces for mobile touchscreens |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 16879997; Country of ref document: EP; Kind code of ref document: A1 |
| NENP | Non-entry into the national phase | Ref country code: DE |
| 122 | Ep: pct application non-entry in european phase | Ref document number: 16879997; Country of ref document: EP; Kind code of ref document: A1 |