US20160209924A1 - Controllable tactile sensations in a consumer device - Google Patents
- Publication number
- US20160209924A1 (application US15/079,660)
- Authority
- US
- United States
- Prior art keywords
- tactile
- display device
- texturized
- electronic device
- display
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
- G06F3/016—Input arrangements with force or tactile feedback as computer generated output to the user
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
- G06F3/0412—Digitisers structurally integrated in a display
- G06F3/0416—Control or interface arrangements specially adapted for digitisers
- G06F3/0346—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks, with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
- G06F3/04845—Interaction techniques based on graphical user interfaces [GUI] for image manipulation, e.g. dragging, rotation, expansion or change of colour
- G06F3/0485—Scrolling or panning
- G06F3/0486—Drag-and-drop
- G06F3/0488—Interaction techniques using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G06F3/04883—Interaction techniques using a touch-screen or digitiser for inputting data by handwriting, e.g. gesture or text
- G06F3/04886—Interaction techniques using a touch-screen or digitiser by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus
- G09G3/3208—Control arrangements for matrix displays using controlled light sources that are organic, e.g. using organic light-emitting diodes [OLED]
- H01H13/85—Push-button switches having a plurality of operating members, e.g. keyboards, characterised by tactile feedback features
- H03K17/9643—Piezoelectric touch switches using a plurality of detectors, e.g. keyboard
- G06F2203/04102—Flexible digitiser, i.e. constructional details for allowing the whole digitising part of a device to be flexed or rolled like a sheet of paper
- G06F2203/04104—Multi-touch detection in digitiser, i.e. details about the simultaneous detection of a plurality of touching locations, e.g. multiple fingers or pen and finger
- G06F2203/04105—Pressure sensors for measuring the pressure or force exerted on the touch surface without providing the touch position
- G06F2203/04806—Zoom, i.e. interaction techniques or interactors for controlling the zooming operation
- G06F2203/04809—Textured surface identifying touch areas, e.g. overlay structure for a virtual keyboard
- H01H2215/05—Tactile feedback electromechanical
- H01H2215/052—Tactile feedback electromechanical piezoelectric
- H01H2217/006—Different feeling for different switch sites
- H01H2217/038—Prompting
Definitions
- This application is related to an apparatus and method for providing and configuring an elevated, indented, or texturized display device. Moreover, processes are provided and described involving elevated, indented, or texturized portions of a display device.
- Display devices have become commonplace in electronic devices such as mobile devices, cellular phones, personal digital assistants, smart phones, televisions, monitors, touchscreens, picture frames, or the like. Display devices may be based on liquid crystal, plasma, or organic light emitting technologies using rigid substrates or, increasingly, flexible substrates. Although commonplace, when a display device functions as an input device, such as a touchscreen, its applications are limited to two dimensions. Another limitation of current display devices is the lack of texture. As the world becomes more electronic, texture is needed for enhancing and enabling certain applications and computer processes. Therefore, it is desirable to have display devices that can provide three dimensional and/or texturized structures or processes.
- An apparatus and method for providing and configuring an elevated, indented, or texturized display device is disclosed. Processes are also given involving elevated, indented, or texturized portions of a display device. By providing an elevated, indented, or texturized display device, enhanced input/output functions are enabled.
- FIG. 1 is a diagram of an electronic device having an elevated, indented, or texturized display device in accordance with one embodiment.
- FIGS. 2 a -2 e are diagrams of elevated, indented, or texturized display devices in accordance with another embodiment.
- FIG. 3 is a diagram of an elevated or texturized display device in accordance with another embodiment.
- FIG. 4 is a diagram of processes for an electronic device having a display device with elevated, indented, or texturized display portions in accordance with another embodiment.
- FIG. 5 is a process for detecting objects or shapes using a display device with elevated, indented, or texturized display portions in accordance with another embodiment.
- FIG. 6 is a process using an elevated, indented, or texturized display device in accordance with another embodiment.
- FIG. 7 is a process using an elevated, indented, or texturized display device for identifying intellectual property assets in accordance with another embodiment.
- Elevation or elevated describes an orientation where a given surface level is higher or raised relative to another surface level.
- The relative elevation may range from one or more millimeters to one or more centimeters, or up to an inch.
- Indenting describes an orientation where a given surface level is lower or depressed relative to another surface level.
- The relative indentation may range from one or more millimeters to one or more centimeters.
- Texturizing or texturing describes a process where a surface provides or mimics friction, variable smoothness, sandpaper like granularity, variable thickness, variable hardness, coarseness, fineness, irregularity, a movement sensation, bumpiness, or rigidness that is sensed by a human touch or detectable by electronic or mechanical sensors.
- FIG. 1 is a diagram of a fixed or mobile subscriber unit, user equipment (UE), mobile station, pager, cellular telephone, personal digital assistant (PDA), computing device, surface computer, monitor, general display, automobile computer system, vehicle computer system, or television device 100 for mobile or fixed applications.
- Device 100 comprises a computer bus 140 that couples one or more processors 102, one or more interface controllers 104, memory 106 having software 108, storage device 110, power source 112, and one or more display controllers 120.
- Device 100 also comprises a display(s) elevation, indenting, or texturizing controller 121 for one or more display devices 122.
- One or more display devices 122 can be configured as a liquid crystal display (LCD), light emitting diode (LED), field emission display (FED), organic light emitting diode (OLED), or flexible OLED display device.
- The one or more display devices 122 may be configured, manufactured, produced, or assembled based on the descriptions provided in US Patent Publication Nos. 2007-247422, 2007-139391, 2007-085838, or 2006-096392, U.S. Pat. No. 7,050,835, or WO Publication 2007-012899, all herein incorporated by reference as if fully set forth.
- The one or more electronic display devices 122 may be configured and assembled using organic light emitting diodes (OLEDs), liquid crystal displays using flexible substrate technology, flexible transistors, or field emission displays (FEDs) using flexible substrate technology, as desired.
- One or more display devices 122 can be configured as a touch screen display using resistive, surface-acoustic wave (SAW), capacitive, infrared, strain gauge, optical imaging, dispersive signal technology, acoustic pulse recognition, frustrated total internal reflection, or magneto-strictive technology, as understood by one of ordinary skill in the art.
- Coupled to one or more display devices 122 are pressure sensors 123 and, optionally, heating elements 127. Coupled to computer bus 140 are one or more input/output (I/O) controllers 116, I/O devices 118, GPS device 114, one or more network adapters 128, and one or more antennas 130.
- Device 100 may have one or more motion, light, optical, chemical, environmental, water, acoustic, heat, temperature, radio frequency identification (RFID), biometric, face recognition, image, or voice recognition sensors 126 and touch detectors 124 for detecting any touch inputs, including multi-touch inputs, for one or more display devices 122 .
- One or more interface controllers 104 may communicate with touch detectors 124 and I/O controller 116 for determining user inputs to device 100.
- Shape detectors 125 may be configured in combination with touch detectors 124 , display(s) elevation, indenting, or texturizing controller 121 , one or more display devices 122 , pressure sensors 123 , or sensors 126 to determine the shape, geometry or texture of an object placed on one or more display devices 122 , as will be explained in more detail below.
- Storage device 110 may be any disk-based or solid-state memory device for storing data.
- Power source 112 may be a plug-in, battery, solar panels for receiving and storing solar energy, or a device for receiving and storing wireless power as described in U.S. Pat. No. 7,027,311 herein incorporated by reference as if fully set forth.
- One or more network adapters 128 may be configured as a Time Division Multiple Access (TDMA), Code Division Multiple Access (CDMA), Orthogonal Frequency-Division Multiplexing (OFDM), Orthogonal Frequency-Division Multiple Access (OFDMA), Global System for Mobile (GSM) communications, Enhanced Data rates for GSM Evolution (EDGE), General Packet Radio Service (GPRS), cdma2000, wideband CDMA (W-CDMA), long term evolution (LTE), 802.11x, Wi-Max, mobile Wi-MAX, Bluetooth, or any other wireless or wired transceiver for modulating and demodulating information communicated via one or more antennas 130 . Additionally, any of devices, controllers, displays, components, etc. in device 100 may be combined, made integral, or separated as desired.
- FIGS. 2 a -2 e are diagrams of elevated, indented, or texturized display devices.
- A layer 204 lies in proximity to display device layer 202, with layer 203 providing separation.
- Layers 202, 203, and 204 can each be composed of a plurality of sublayers.
- A particular sublayer within layer 204 can be transflective to reflect any ambient light and emit white light, such as the average light emitted by surrounding display pixels or cells, so that the displayed image remains clear.
- Display device layer 202 can be either a flexible or rigid display device.
- Layer 204 can be configured and composed of a clear, flexible, electroactive polymer, polymer composite, or gel material.
- Electroactive polymers, also known as electroactive plastics, can be pulled, expand, contract, deform, change shapes in controllable directions, change dimensions in predetermined directions, or change sizes electronically by applying an electric current, potential difference, voltage, time-varying voltage, or electromagnetic fields across the material, as described in U.S. Pat. No. 6,117,296, U.S. Pat. No. 6,787,238, US Patent Publication No. 2008-188907, US Patent Publication No. 2004-199232, US Patent Publication No. 2005-157893, WO Publication 2007-114699, and "Electric Flex" by Yoseph Bar-Cohen (2001), all herein incorporated by reference as if fully set forth.
- Electroactive polymers may be dielectric or ionic EAPs.
- In dielectric EAPs, actuation is caused by electrostatic forces between two electrodes that squeeze or compress the polymer. Although they require a high actuation voltage, dielectric EAPs consume very little power and require no power to hold an actuator at a given position.
- Examples of dielectric EAPs include electrostrictive polymers and dielectric elastomers, which are used for artificial muscles.
- In ionic EAPs, actuation is caused by the displacement of ions inside the polymer. Only a few volts are needed for actuation, but the ionic flow implies a higher electrical power requirement, and energy is needed to keep the actuator at a given position.
- Examples of ionic EAPs include conductive polymers, ionic polymer-metal composites (IPMCs), and responsive gels.
- Layer 204 can also be configured and composed of piezoelectric materials or actuators that are bonded to a firm plastic component to form a piezo bending element, as explained in "TPaD: Tactile Pattern Display" by Colgate and Peshkin (2007), herein incorporated by reference as if fully set forth.
- layer 204 can also be configured and composed of organic transistors formed on a flexible substrate to drive or contract a surface creating texture, indentation, or elevation.
- Organic transistors are transistors that use organic molecules rather than silicon for their active material.
- An advantage of organic transistors is the ability to operate on a flexible substrate. Similar to EAPs, organic transistors also exhibit material properties such that they can be pulled, expand, contract, deform, change shapes in controllable directions, change dimensions in predetermined directions, or change sizes electronically by applying an electric current, potential difference, voltage, time varying voltage, or electromagnetic fields.
- Portions of surface 216 are selectively elevated, indented, or texturized with one or more of a substantially cubical segment 206, a dot or dimple segment 208, a substantially cylindrical segment 210, a bulge segment 212, or an indentation segment 214.
- the shape and texture of the indented or elevated portion depends on the image, document, or application to be displayed and the effects on the resolution of the display device. Because of their natural geometry, certain segments may provide a clearer display of the underlying image or document.
- Segments 206 , 208 , 210 , 212 , and 214 are controlled at least by displays controller 120 and controller 121 that adjust the height, indentation, or depression to multiple different levels, orientation, hardness, thickness, direction, vibration, or gyration individually for each segment.
- Display(s) elevation, indenting, or texturizing controller 121 may comprise analog or digital driving circuits (not shown) for driving the segments. Examples of display driving circuits are given in US Patent Publication Nos. 2008-062088, 2006-221016, or 2006-007078, all herein incorporated by reference as if fully set forth.
- The operation and configuration of layer 204 may be independent of display device layer 202, thereby simplifying manufacturing since layer 204 can be an add-on or attachment to preexisting display systems or technologies. Also, in certain applications images may not be displayed on surface 216 in an area that is elevated, indented, or texturized, leaving that area darkened in order to make it more noticeable to the user. For this configuration, the image displayed on display device layer 202 is rendered to adjust for the darkened area.
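- The per-segment control described above can be illustrated with a short sketch. The Python below is only a minimal illustration, not the patent's implementation: the ElevationController class, the set_segment method, the segment identifiers, and the units are all assumptions standing in for the driving circuits of controller 121 that adjust height, hardness, or vibration for segments such as 206 and 214.

```python
from dataclasses import dataclass


@dataclass
class SegmentState:
    """Hypothetical per-segment actuation state; units are illustrative only."""
    height_mm: float = 0.0      # positive = elevated, negative = indented
    hardness: float = 0.5       # 0.0 = soft, 1.0 = rigid
    vibration_hz: float = 0.0   # 0 = no vibration or gyration


class ElevationController:
    """Sketch of controller 121 driving segments such as 206, 208, 212, or 214."""

    def __init__(self):
        self.segments = {}

    def set_segment(self, segment_id, **params):
        """Update one segment's state and push it to the (simulated) driver."""
        state = self.segments.setdefault(segment_id, SegmentState())
        for name, value in params.items():
            setattr(state, name, value)
        self._drive(segment_id, state)

    def _drive(self, segment_id, state):
        # In hardware this would program the analog or digital driving circuits
        # (e.g., EAP or piezo drivers); here the command is only printed.
        print(f"drive {segment_id}: {state}")


# Example: raise a cubical segment and indent a neighboring one.
ctl = ElevationController()
ctl.set_segment("segment_206", height_mm=2.0, hardness=0.8)
ctl.set_segment("segment_214", height_mm=-1.0)
```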
- FIG. 2 b is a diagram of an elevated or texturized display device.
- Layer 218 lies in proximity to display device layer 220, with layer 219 providing separation. Although a single layer is shown, layers 218, 219, and 220 may be composed of a plurality of sublayers.
- Display device layer 220 is configured as a flexible display device, such as flexible OLED.
- Layer 218 may be comprised of the same composition or materials explained above for layer 204 such as EAPs, piezoelectric materials, or organic transistors.
- Portions of surface 231 are selectively elevated or texturized with one or more of a substantially cubical segment 222 1 controlling segment 222 2 , a dot or dimple segment 224 1 controlling segment 224 2 , a substantially cylindrical segment 226 1 controlling segment 226 2 , or a bulge segment 228 1 controlling segment 228 2 .
- Segments 222 2 , 224 2 , 226 2 , and 228 2 are controlled at least by displays controller 120 and/or controller 121 that adjust the height, orientation, direction, or gyration individually or collectively for each segment.
- Display(s) elevation, indenting, or texturizing controller 121 may comprise analog or digital driving circuits (not shown) for driving the segments.
- Because layer 218 is oriented below or behind display device layer 220, there is little interference with the resolution or clarity of images to be displayed on display device layer 220. Also, in certain applications images may not be displayed on surface 231 in an area that is elevated, indented, or texturized, leaving that area darkened in order to make it more noticeable to the user. For this configuration, the image displayed on display device layer 220 is rendered to adjust for the darkened area.
- FIG. 2 c is a diagram of an elevated, indented, or texturized display device.
- Display pixels 232 1 to 232 n lie adjacent to, on the same level as, or on the same layer as elevation, indenting, or texturizing cells 234 1 to 234 n .
- The display array or matrix 233 also comprises display pixels 236 1 to 236 n adjacent to elevation, indenting, or texturizing cells 238 1 to 238 n that are adjacent to display pixels 240 1 to 240 n .
- the elevation, indenting, or texturizing cells may be comprised of the same composition or materials explained above for layer 204 or 218 such as EAPs, piezoelectric material, or organic transistors.
- Cells 234 1 to 234 n and 238 1 to 238 n are controlled at least by displays controller 120 and/or controller 121 that adjust the height, orientation, direction, or gyration individually or collectively for each cell.
- Display(s) elevation, indenting, or texturizing controller 121 may comprise analog or digital driving circuits (not shown) for driving the cells.
- cells 234 1 to 234 n and 238 1 to 238 n may be illuminated based on the configuration of surrounding pixels to blend in with any images being displayed.
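- As a rough illustration of this blending behavior, a texturizing cell could be lit with the average color of its neighboring display pixels so that it disappears into the displayed image. The grid layout and function below are assumptions used only to show the averaging idea; the patent does not specify how the blending is computed.

```python
def blend_color_for_cell(pixels, row, col):
    """Average the colors of the display pixels surrounding a texturizing cell.

    `pixels` is a 2-D list of (r, g, b) tuples; (row, col) is the cell position.
    Returns the (r, g, b) value the cell could emit to blend with the image.
    """
    neighbors = []
    for dr in (-1, 0, 1):
        for dc in (-1, 0, 1):
            if dr == dc == 0:
                continue
            r, c = row + dr, col + dc
            if 0 <= r < len(pixels) and 0 <= c < len(pixels[0]):
                neighbors.append(pixels[r][c])
    if not neighbors:
        return (255, 255, 255)  # fall back to white, as described above
    return tuple(sum(ch) // len(neighbors) for ch in zip(*neighbors))


# Example: a 3x3 patch of mid-gray pixels around a cell at (1, 1).
patch = [[(128, 128, 128)] * 3 for _ in range(3)]
print(blend_color_for_cell(patch, 1, 1))  # -> (128, 128, 128)
```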
- FIG. 2 d shows an embodiment of a display device array or matrix 235 from a top view where elevation, indenting, or texturizing cells 239 are placed selectively within a small area footprint so that the surface of display device array or matrix 235 is mostly comprised of display pixels 237 . Having texturizing cells 239 sparsely placed in display device array or matrix 235 ensures minimal interference with a displayed image.
- the elevation, indented, or texturized cells may be unnoticeable to the human eye but detectable by touch or feeling of display device array or matrix 235 .
- FIG. 2 e is a diagram of an elevated, indented, or texturized display device.
- display pixels 242 are in the same layer or level but separate from elevation, indenting, or texturizing cells and display pixels areas 244 and 246 .
- FIG. 2 e provides a hybrid layout with display pixels 242 operating with selectively placed elevation, indenting, or texturizing cells and display pixels 244 and 246 .
- area 244 can provide scrolling functions while area 246 can be configured as a keyboard, dialpad, keypad, or any other interface.
- FIG. 3 is a diagram of an elevated or texturized display device.
- A matrix of pockets or cells 304 1 to 304 x lies on top of a display device 302 .
- Matrix of pockets or cells 304 1 to 304 x may be filled with compressed air or a low-heat-activated gel that becomes elevated or texturized by heating elements 127 as a result of thermal expansion, as understood by one of ordinary skill in the art.
- Matrix of pockets or cells 304 1 to 304 x can be tapered but flat and seamless when unexpanded.
- heating elements 127 can be used to provide different tactile sensations in combination with pockets or cells 304 1 to 304 x so that a user is provided varying temperatures, such as hot or cold information, relating to a displayed image.
- FIG. 4 is a diagram illustrating processes for an electronic device having a display device 402 with elevated, indented, or texturized display portions.
- Display device 402 can be assembled with at least some of the components described in device 100 .
- display device 402 may be configured with the devices described in FIG. 2 a , 2 c , or 2 d , as desired.
- display device 402 may be configured with the devices described in FIG. 2 b or 3 , as desired.
- a darkened or black portion represents an indented portion
- a white portion represents an elevated portion
- a checkered pattern represents a texturized portion.
- a “click here” displayed link is provided with a combination of an indented portion 404 1 and elevated portion 404 2 .
- part of a virtual or simulated keyboard displayed on display device 402 provides the letter “E” key with a partially displayed portion 406 , an elevated circular portion 414 and an elevated square portion 408 .
- display device 402 can be configured to show a whole QWERTY keyboard, a numbered keypad for dialing, or a combination of a whole QWERTY keyboard and a numbered keypad, as desired.
- the letter “S” key is provided by a partially displayed portion and an elevated circular portion 410 .
- the letter “Q” key is completely elevated by portion 412 .
- the virtual or simulated keyboard can also be programmed to replicate Braille lettering, as desired.
- Portions 414 , 408 , 410 , or 412 can detect different pressure forces when pushed down, pushed sideways, or pulled sideways, providing another metric or advantage for the man-machine interface. For instance, a pull in one direction may indicate a capital letter input while a push in another direction may indicate subscripting of the letter.
- These different pressure forces are detected by pressure sensors 123 in combination with touch detectors 124 and/or display(s) elevation, indenting, or texturizing controller 121 by measuring gradient, force, or potential difference values.
- haptic feedback, force feedback or tactile feedback in the form of a played sound, gyration, or vibration can be provided via I/O controller 116 .
- instructions in software 108 can be used to predict or anticipate keystrokes based on a word or sentence entered. In response to the anticipation, different keys can be raised, indented, or texturized in order to increase typing speeds.
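- A minimal sketch of that keystroke anticipation, assuming a small illustrative word list: likely next letters are predicted from the characters typed so far and the corresponding virtual keys are raised. The vocabulary, the scoring, and the set_segment call on a hypothetical elevation controller are assumptions; the patent only states that software 108 may anticipate keystrokes and raise, indent, or texturize keys in response.

```python
from collections import Counter

# Illustrative vocabulary; a real device would use a full dictionary or language model.
WORDS = ["the", "then", "there", "texture", "tactile", "touch"]


def likely_next_letters(prefix, words=WORDS, top_n=3):
    """Rank the letters most likely to follow `prefix` in the vocabulary."""
    counts = Counter(
        w[len(prefix)] for w in words
        if w.startswith(prefix) and len(w) > len(prefix)
    )
    return [letter for letter, _ in counts.most_common(top_n)]


def raise_predicted_keys(controller, prefix):
    """Ask a hypothetical elevation controller to raise the predicted keys."""
    for letter in likely_next_letters(prefix):
        controller.set_segment(f"key_{letter.upper()}", height_mm=1.5)


print(likely_next_letters("t"))   # -> ['h', 'e', 'a'] with this word list
print(likely_next_letters("te"))  # -> ['x'] ("texture")
```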
- Advertisement 416 can be sold to an advertiser for a certain price for having elevated portions 416 1 and 416 3 and indentation 416 2 on at least one part or the entire advertisement.
- Advertisement 418 can be sold to an advertiser for a different price, higher or lower, for having elevated portions 418 1 and 418 2 having different heights from other advertisements and can be based in combination with location determined by GPS device 114 .
- Advertisement 419 can be sold to an advertiser for a different price for having a plurality of elevated, indented, or texturized portions 419 1 .
- An embodiment of the present invention provides electronic business processes.
- a “Buy Now” button is provided with an elevated circular portion 422 1 and a square portion 422 2 .
- the “Buy Now” button is associated with triggering the purchasing of shirt 424 by sending a request to a server (not shown) over one or more network adapters 128 .
- a texturizing portion 426 is provided to replicate or simulate the surface of shirt 424 .
- Texturizing portion 426 can be a combination of elevated and indented cells.
- texturizing portion 426 can be used to provide surface information for any product being sold or displayed on display device 402 such as electronics, home goods, jewelry, etc.
- shirt 424 can be rotated in response to a multitouch input while texturizing portion 426 is dynamically changed to reflect the different surfaces or materials used in the product.
- Shirt 424 can be zoomed in and out using multitouch inputs detected by touch detectors 124 with each zoom level reflecting texture differences on portion 426 . For instance, a zoomed in view can be more grainy or rough compared to a zoomed out view.
- the zoom levels can also be configured with a fading in or out effect by one or more processors 102 and can involve retrieving additional information from a server (not shown) over one or more network adapters 128 .
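- One way to picture the zoom-dependent texture is as a simple mapping from zoom level to surface granularity that texturizing portion 426 then reproduces. The linear relation and the 0-to-1 granularity scale below are arbitrary placeholders chosen only for illustration.

```python
def texture_granularity(zoom_level):
    """Map a zoom level to a texture granularity for portion 426.

    Higher zoom shows a coarser, rougher surface; zoomed out appears smoother.
    The 0..1 granularity scale and the clamping bounds are illustrative only.
    """
    granularity = 0.2 + 0.2 * zoom_level   # arbitrary linear relation
    return max(0.0, min(1.0, granularity))


for zoom in (0.5, 1.0, 2.0, 4.0):
    print(f"zoom {zoom}x -> granularity {texture_granularity(zoom):.2f}")
```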
- legend 425 identifies or associates different materials, such as rabbit skin, llama wool, and rare silk, by texturized portions 425 1 , 425 2 , and 425 3 , respectively.
- texturized portions 425 1 , 425 2 , and 425 3 can be replicated or simulated by texturizing portion 426 .
- an embodiment of the present invention provides an electronic game with elevated, indented, or texturizing portion 428 (hereinafter “portion 428 ”), such as tic-tac-toe.
- portion 428 can detect different pressure forces when pushed down, pushed sideways, or pulled sideways providing another metric or feature for the man machine interface. These different pressure forces can be detected by pressure sensors 123 in combination with touch detectors 124 and/or display(s) elevation, indenting, or texturizing controller 121 by measuring gradient, force, or potential difference values of touches to raised portions in portion 428 .
- Another gaming application comprises portion 428 emulating a piano or guitar.
- a game can receive as inputs flicks, pinches, or scratches to portion 428 and generate an action on display device 402 in response to each detected action differently.
- a pinch to a raised portion of 428 can represent an object or block being picked up or opened in a game or any other simulated environment.
- Portion 428 can also control scrolling or drag and drop functions in combination with multitouch inputs detected by touch detectors 124 .
- certain portions 428 can be used as a miniature joystick or pointing stick for 360 degrees rotational input that is detected by pressure sensors 123 in combination with touch detectors 124 and/or display(s) elevation, indenting, or texturizing controller 121 .
- a three dimensional accelerometer can be included in sensors 126 to be used in combination with display(s) elevation, indenting, or texturizing controller 121 to raise part of portion 428 in response to a programmed action in the game.
- Portion 428 can also be used to simulate or replicate a lottery scratching/rubbing game or a children's productivity game, as desired.
- Portion 428 can also provide a gaming feature where tilting or rotation detected by an accelerometer in sensors 126 raises some portions while indenting others for four dimensional motion gaming.
- portion 428 can provide online collaboration, online conferencing, social networking, or online dating.
- portion 428 In response to push or pull input on a networked computing device (not shown) having an elevated, indented, or texturized display device, portion 428 provides feedback, similar to a poke on Facebook.
- tactile inputs to portion 428 can be used during a video conference application for additional interaction between conversing parties.
- adult entertainment can be enhanced by portion 428 providing stimulation in connection with a video, image, or audio media on display device 402 .
- Portion 428 can provide remote medical diagnostics and measurements, such as a pressure test, over a network using one or more network adapters 128 and pressure sensors 123 in combination with touch detectors 124 and/or display(s) elevation, indenting, or texturizing controller 121 . Diagnostics and measurements include tactile for motor skills, respiratory for testing lung pressure by a person blowing on portion 428 , or muscular by measuring range of motion and strength, as desired. Therefore, portion 428 can provide bi-directional information exchange.
- Portion 428 can be used by the elderly or disabled for simple, small-scale physical therapy exercises for the hand or fingers by allowing a user to pull raised portions at different tensions and resistances or to pick up objects. This is particularly useful for stroke victims by providing mental therapy, exercises, and massaging of small muscles and ligaments.
- portion 428 can be used to provide a plurality of processes or applications.
- Portion 428 can simulate maps, topography, geography, imagery, or location service processes in combination with GPS device 114 .
- Portion 428 can display mountainous regions on a map by elevating them and oceans by indenting them.
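- For the map example, terrain height could be quantized into per-cell elevations, with land scaled into positive cell heights and oceans mapped to indentation. The sketch below assumes a small grid of elevation samples in meters and an arbitrary millimeter range for the cells; neither scale comes from the patent.

```python
def terrain_to_cell_heights(elevations_m, max_cell_mm=2.0, min_cell_mm=-1.0):
    """Convert terrain elevations (meters) into cell heights (millimeters).

    Land is scaled into 0..max_cell_mm; oceans (elevation <= 0) are indented
    proportionally down to min_cell_mm. Both scales are illustrative only.
    """
    peak = max(max(row) for row in elevations_m) or 1.0
    trough = min(min(row) for row in elevations_m) or -1.0
    heights = []
    for row in elevations_m:
        heights.append([
            round(max_cell_mm * e / peak, 2) if e > 0
            else round(min_cell_mm * e / trough, 2)
            for e in row
        ])
    return heights


# 2x3 sample: a mountain ridge next to an ocean trench.
sample = [[1200, 800, 0], [300, -2000, -4000]]
print(terrain_to_cell_heights(sample))  # -> [[2.0, 1.33, 0.0], [0.5, -0.5, -1.0]]
```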
- Portion 428 can follow a musical sequence being played on device 100 for a ringtone during a received call or song.
- Portion 428 can be used to display content in an email, a 3rd Generation Partnership Project (3GPP) or 3GPP2 short message service (SMS) text message, a 3GPP or 3GPP2 multimedia message service (MMS) message, an image or video motion in an MMS message, a PDF application, a Word document, Excel graphs, Excel charts, a four dimensional (4-D) screensaver, 4-D art, 4-D drawings, 3-D imagery, a 3-D sculpture, a 4-D "etch-a-sketch", or architecture designs using scalable or vector graphics. Any of the content given above and displayed by portion 428 may be transmitted or received over one or more network adapters 128 .
- Portion 428 can provide, replicate, or simulate integrated circuit layouts, electrical circuit layouts, facial features, enhanced video clips, computer aided designs (CAD), semiconductor layouts, prototyping, modeling, molding for producing form factors, logos, trademarks, a children's educational product, a general education product, a 3-D drawing tool, distance learning, or pop-up children's books, as desired.
- portion 428 can be responsive to voice or visual commands or recognition detected by sensors 126 for being elevated, indented, or texturized.
- portion 428 can provide object awareness to display device 402 .
- A post-it note can be detected when pressure sensors 123 , in combination with touch detectors 124 and/or display(s) elevation, indenting, or texturizing controller 121 , determine that the adhesive on the note applies additional resistivity or elasticity to a raised or elevated cell in portion 428 .
- display device 402 can adapt and reformat the images around the note such that images are not obstructed to the user.
- portion 428 can provide advanced Bluetooth capabilities for Bluetooth keyboards, headsets and can function as a Bluetooth device itself for medical applications.
- a preprogrammed texturized pattern is reproduced on portion 428 for notifying the user, such as when display device 402 is in a shirt pocket in hands-free mode communicating with a wireless headset.
- the texturized pattern reproduced on portion 428 during an incoming call can be controlled, designed, or customized by the calling party if the function is enabled on device 100 .
- FIG. 5 is a process 500 for detecting objects or shapes using elevated, indented, or texturized display portions.
- While process 500 can be performed by device 100 in a fat client architecture, device 100 can also be configured as a thin client by sharing shape detection processing functions with a server (not shown) using one or more network adapters 128 .
- Cells are selectively raised around an object placed on area 429 by display(s) elevation, indenting, or texturizing controller 121 (step 502 ).
- the weight of the object is detected by pressure sensors 123 and shape detectors 125 and a height graph of the surface of the object is generated by one or more processors 102 (step 504 ).
- the perimeter of the object placed on area 429 is determined by one or more processors 102 and shape detectors 125 by raising or lowering cells in proximity to object by display(s) elevation, indenting, or texturizing controller 121 (step 506 ).
- One or more processors 102 calculate gradient values (step 508 ) and generate a surface graph (step 510 ) based on the previous measurements.
- Display device 402 may have infrared detectors 430 1 - 430 4 in a slightly beveled position or level with the frame of display device 402 .
- Display device 402 may also have digital cameras 434 1 - 434 4 for capturing, tracking, and detecting shapes using algorithms such as that described in U.S. Pat. No. 7,317,872, herein incorporated by reference as if fully set forth, that can be used to perform additional sensor measurements (step 512 ).
- Other sensor measurements for additional metrics and refinement include infrared or optical detection to detect depth. These sensors can be embedded next to or within each display cell in display device 402 .
- a preliminary image may be rendered by one or more processors 102 . The preliminary image can be compared and matched against a database of images in storage 110 , or stored remotely, using artificial intelligence algorithms. Information is then retrieved by one or more network adapters 128 based on the detected object and/or preliminary image (step 514 ).
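- Taken together, steps 502-514 amount to a small sensing pipeline: raise cells around the object, read pressure and perimeter, derive gradient values and a surface graph, refine with additional sensors, and then match the result against stored images. The function below is a high-level sketch of that flow under assumed sensor and lookup interfaces (raise_cells_around_object, sense_pressure_map, and so on); it is not the patent's implementation.

```python
def detect_object(sensors, lookup_service):
    """High-level sketch of process 500 (steps 502-514) under assumed interfaces."""
    sensors.raise_cells_around_object()              # step 502: raise cells around the object
    pressure_map = sensors.sense_pressure_map()      # step 504: weight and surface height graph
    perimeter = sensors.find_perimeter()             # step 506: raise/lower cells near the object

    # Step 508: gradient values across the pressure map (simple finite differences).
    gradients = [
        [row[c + 1] - row[c] for c in range(len(row) - 1)]
        for row in pressure_map
    ]

    surface_graph = {                                # step 510: assemble the surface graph
        "pressure": pressure_map,
        "gradients": gradients,
        "perimeter": perimeter,
    }
    surface_graph.update(sensors.extra_measurements())   # step 512: IR, camera, depth refinements

    return lookup_service.match(surface_graph)       # step 514: match against an image database


class StubSensors:
    """Placeholder standing in for pressure sensors 123, shape detectors 125, etc."""
    def raise_cells_around_object(self): pass
    def sense_pressure_map(self): return [[0.0, 0.4, 0.0], [0.0, 0.6, 0.0]]
    def find_perimeter(self): return [(0, 1), (1, 1)]
    def extra_measurements(self): return {"depth": 3.2}


class StubLookup:
    def match(self, surface_graph): return "ring (size 7)"


print(detect_object(StubSensors(), StubLookup()))  # -> ring (size 7)
```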
- A ring size is detected and related information, such as from an online jewelry store, is retrieved over one or more network adapters in response.
- the size and type of certain household goods, such as hardware, screws, nuts, light bulbs, batteries can be determined by area 429 and process 500 .
- a key placed on area 429 can be keyed and the information sent over one or more network adapters 128 for subsequent duplication and mail delivery by an online store.
- process 500 can be used to obtain biometric information for security purposes.
- FIG. 7 is a process using an elevated, indented, or texturized display device for identifying intellectual property assets.
- the shape of a widget 702 placed on area 429 is detected by process 500 and digitally rendered.
- the detected shape of the widget 702 is compared against widgets 706 1 , 706 2 , or 706 3 shown and described in U.S. Pat. No. X ( 704 ) stored in a database.
- the comparison between widgets can be performed graphically using image rendering, as understood to one of ordinary skill in the art.
- artificial intelligence algorithms can be used to compare claim text or descriptions 710 in U.S. Pat. No. X ( 704 ) against features detected by area 429 and process 500 . If a match is determined or found, the widget 702 is associated with U.S. Pat. No. X ( 708 ) and displayed in a map format on display device 402 .
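- A rough sketch of the matching step in FIG. 7: the rendered shape of widget 702 is compared against stored widget silhouettes from U.S. Pat. No. X, and any match above a threshold is associated with that patent. The pixel-overlap similarity measure below is a stand-in chosen for brevity; the patent leaves the image comparison and artificial intelligence algorithms unspecified.

```python
def silhouette_similarity(shape_a, shape_b):
    """Fraction of matching cells between two equally sized binary silhouettes."""
    cells = [(a, b) for row_a, row_b in zip(shape_a, shape_b)
             for a, b in zip(row_a, row_b)]
    return sum(a == b for a, b in cells) / len(cells)


def match_widget(detected, patent_widgets, threshold=0.9):
    """Return the IDs of stored patent widgets whose silhouettes match the detected shape."""
    return [wid for wid, shape in patent_widgets.items()
            if silhouette_similarity(detected, shape) >= threshold]


# Example: widget 702's detected silhouette against two stored figures.
detected = [[1, 1, 0],
            [1, 1, 0],
            [0, 0, 0]]
stored = {"706_1": [[1, 1, 0], [1, 1, 0], [0, 0, 0]],
          "706_2": [[0, 0, 0], [0, 1, 1], [0, 1, 1]]}
print(match_widget(detected, stored))  # -> ['706_1']
```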
- display device 402 replicates, mimics, or simulates a customizable or programmable interface or control panel for a remote control, instrument panel on a vehicle, an automobile dashboard configuration, audio equalizers, multitouch equalizers, radio button list, or a consumer electronics button surface with raised button portions 432 1 - 432 3 .
- the simulated interface can be used to sell consumer electronics or function as an advanced user guide whereby input, output, and programming functions are simulated with button portions 432 1 - 432 3 that have the same size and shape as the actual buttons on a product.
- 432 1 - 432 3 can be programmed for controlling volume control, replicating smart home switches or controllers, as desired.
- advanced web searching 436 is performed by measuring pressure applied or detecting depth perception to raised or elevated portion 438 .
- Web searching 436 can be used in combination with area 429 to display hits or webpages relevant to detected objects.
- FIG. 6 is a process 600 using an elevated, indented, or texturized display device that can be selectively performed by the display devices described above.
- Information is received from one or more network adapters 128 , I/O devices 118 , or storage device 110 (step 602 ).
- the sector of cells to be elevated, indented, or texturized based on the received information is checked and tested by display(s) elevation, indenting, or texturizing controller 121 and display controllers 120 (step 604 ) to determine how a high image quality in the area can be maintained (step 606 ) by raising or lowering selective cells.
- One or more processors 102 in combination with sensors 126 determine display orientation or viewing angle (step 608 ) that is taken into consideration to properly elevate, indent, or texturize the display devices described above. If an image of an object is to be simulated or replicated, it is rendered by one or more processors 102 and checked to determine if it can be properly displayed (step 610 ). The cells in the display device are elevated, indented, or texturized (step 612 ).
- ROM read only memory
- RAM random access memory
- register cache memory
- semiconductor memory devices magnetic media such as internal hard disks and removable disks, magneto-optical media, and optical media such as CD-ROM disks, digital versatile disks (DVDs), and BluRay discs.
- Suitable processors include, by way of example, a general purpose processor, a special purpose processor, a conventional processor, a digital signal processor (DSP), a plurality of microprocessors, one or more microprocessors in association with a DSP core, a controller, a microcontroller, Application Specific Integrated Circuits (ASICs), Field Programmable Gate Arrays (FPGAs) circuits, any other type of integrated circuit (IC), and/or a state machine.
- DSP digital signal processor
- ASICs Application Specific Integrated Circuits
- FPGAs Field Programmable Gate Arrays
- a processor in association with software may be used to implement a radio frequency transceiver for use in a computer, wireless transmit receive unit (WTRU), user equipment (UE), terminal, base station, radio network controller (RNC), or any host computer.
- the WTRU may be used in conjunction with modules, implemented in hardware and/or software, such as a camera, a video camera module, a videophone, a speakerphone, a vibration device, a speaker, a microphone, a television transceiver, a hands free headset, a keyboard, a Bluetooth® module, a frequency modulated (FM) radio unit, a liquid crystal display (LCD) display unit, an organic light-emitting diode (OLED) display unit, a digital music player, a media player, a video game player module, an Internet browser, and/or any wireless local area network (WLAN) or Ultra Wide Band (UWB) module.
- WLAN wireless local area network
- UWB Ultra Wide Band
Landscapes
- Engineering & Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Human Computer Interaction (AREA)
- Computer Hardware Design (AREA)
- User Interface Of Digital Computer (AREA)
- Controls And Circuits For Display Device (AREA)
- Control Of Indicators Other Than Cathode Ray Tubes (AREA)
Abstract
Description
- This application is a continuation of U.S. patent application Ser. No. 15/060,016 filed Mar. 3, 2016, which is a continuation of U.S. patent application Ser. No. 14/485,246 filed Sep. 12, 2014, which is a continuation of U.S. patent application Ser. No. 13/291,375 filed Nov. 8, 2011, which issued as U.S. Pat. No. 8,866,766 on Oct. 21, 2014, which is a continuation of U.S. patent application Ser. No. 12/406,273 filed Mar. 18, 2009, which issued as U.S. Pat. No. 8,686,951 on Apr. 1, 2014, the contents of which are all hereby incorporated by reference herein as if fully set forth.
- This application is related to an apparatus and method for providing and configuring an elevated, indented, or texturized display device. Moreover, processes are provided and described involving elevated, indented, or texturized portions of a display device.
- Display devices have become commonplace in electronic devices such as mobile devices, cellular phones, personal digital assistants, smart phones, televisions, monitors, touchscreens, picture frames, or the like. Display devices may be based on liquid crystal, plasma, or organic light emitting technologies using rigid substrates or soon-to-be-available flexible substrates. Although commonplace, when a display device functions as an input device, such as a touchscreen, its applications are limited to two dimensions. Another limitation of current display devices is the lack of texture. As the world becomes more electronic, texture is needed for enhancing and enabling certain applications and computer processes. Therefore, it is desirable to have display devices that can provide three dimensional and/or texturized structures or processes.
- An apparatus and method for providing and configuring an elevated, indented, or texturized display device is disclosed. Processes are also given involving elevated, indented, or texturized portions of a display device. By providing an elevated, indented, or texturized display device, enhanced input/output functions are enabled.
- A more detailed understanding may be had from the following description, given by way of example in conjunction with the accompanying drawings wherein:
-
FIG. 1 is a diagram of an electronic device having an elevated, indented, or texturized display device in accordance with one embodiment; -
FIGS. 2a-2e are diagrams of elevated, indented, or texturized display devices in accordance with another embodiment; -
FIG. 3 is a diagram of an elevated or texturized display device in accordance with another embodiment; -
FIG. 4 is a diagram comprising of processes for an electronic device having a display device with elevated, indented, or texturized display portions in accordance with another embodiment; -
FIG. 5 is a process for detecting objects or shapes using a display device with an elevated, indented, or texturized display portions in accordance with another embodiment; -
FIG. 6 is a process using an elevated, indented, or texturized display device in accordance with another embodiment; and -
FIG. 7 is a process using an elevated, indented, or texturized display device for identifying intellectual property assets in accordance with another embodiment.
- The present invention will be described with reference to the drawing figures wherein like numerals represent like elements throughout. In the description that follows, elevation or elevated describes an orientation where a given surface level is higher or raised relative to another surface level. As an example, the relative elevation may be by one or more millimeters to one or more centimeters, or up to an inch. Indenting describes an orientation where a given surface level is lower or depressed relative to another surface level. As an example, the relative indentation may be by one or more millimeters to one or more centimeters. Texturizing or texturing describes a process where a surface provides or mimics friction, variable smoothness, sandpaper-like granularity, variable thickness, variable hardness, coarseness, fineness, irregularity, a movement sensation, bumpiness, or rigidness that is sensed by a human touch or detectable by electronic or mechanical sensors.
- In addition, lines shown in the accompanying figures for areas having elevated, indented, or texturized portions or cells are for illustrative purposes; actual display devices may not show lines on the display surface. Moreover, in the processes described below, the recited steps may be performed out of sequence, and substeps not explicitly described or shown may be performed by one of ordinary skill in the art.
-
FIG. 1 is a diagram of a fixed or mobile subscriber unit, user equipment (UE), mobile station, pager, cellular telephone, personal digital assistant (PDA), computing device, surface computer, monitor, general display, automobile computer system, vehicle computer system, or television device 100 for mobile or fixed applications. Device 100 comprises computer bus 140 that couples one or more processors 102, one or more interface controllers 104, memory 106 having software 108, storage device 110, power source 112, and one or more display controllers 120. In addition to one or more display controllers 120, device 100 comprises a display(s) elevation, indenting, or texturizing controller 121 for one or more display devices 122.
- One or more display devices 122 can be configured as a liquid crystal display (LCD), light emitting diode (LED), field emission display (FED), organic light emitting diode (OLED), or flexible OLED display device. The one or more display devices 122 may be configured, manufactured, produced, or assembled based on the descriptions provided in US Patent Publication Nos. 2007-247422, 2007-139391, 2007-085838, or 2006-096392, U.S. Pat. No. 7,050,835, or WO Publication 2007-012899, all herein incorporated by reference as if fully set forth. In the case of a flexible display device, the one or more electronic display devices 122 may be configured and assembled using organic light emitting diodes (OLED), liquid crystal displays using flexible substrate technology, flexible transistors, or field emission displays (FED) using flexible substrate technology, as desired. One or more display devices 122 can be configured as a touch screen display using resistive, surface-acoustic wave (SAW), capacitive, infrared, strain gauge, optical imaging, dispersive signal technology, acoustic pulse recognition, frustrated total internal reflection, or magneto-strictive technology, as understood by one of ordinary skill in the art.
- Coupled to one or more display devices 122 are pressure sensors 123 and optionally heating elements 127. Coupled to computer bus 140 are one or more input/output (I/O) controllers 116, I/O devices 118, GPS device 114, one or more network adapters 128, and one or more antennas 130. Device 100 may have one or more motion, light, optical, chemical, environmental, water, acoustic, heat, temperature, radio frequency identification (RFID), biometric, face recognition, image, or voice recognition sensors 126 and touch detectors 124 for detecting any touch inputs, including multi-touch inputs, for one or more display devices 122. One or more interface controllers 104 may communicate with touch detectors 124 and I/O controller 116 for determining user inputs to device 100.
- Shape detectors 125 may be configured in combination with touch detectors 124, display(s) elevation, indenting, or texturizing controller 121, one or more display devices 122, pressure sensors 123, or sensors 126 to determine the shape, geometry, or texture of an object placed on one or more display devices 122, as will be explained in more detail below.
- Still referring to device 100, storage device 110 may be any disk based or solid state memory device for storing data. Power source 112 may be a plug-in, battery, solar panels for receiving and storing solar energy, or a device for receiving and storing wireless power as described in U.S. Pat. No. 7,027,311, herein incorporated by reference as if fully set forth. One or more network adapters 128 may be configured as a Time Division Multiple Access (TDMA), Code Division Multiple Access (CDMA), Orthogonal Frequency-Division Multiplexing (OFDM), Orthogonal Frequency-Division Multiple Access (OFDMA), Global System for Mobile (GSM) communications, Enhanced Data rates for GSM Evolution (EDGE), General Packet Radio Service (GPRS), cdma2000, wideband CDMA (W-CDMA), long term evolution (LTE), 802.11x, Wi-Max, mobile Wi-MAX, Bluetooth, or any other wireless or wired transceiver for modulating and demodulating information communicated via one or more antennas 130. Additionally, any of the devices, controllers, displays, components, etc. in device 100 may be combined, made integral, or separated, as desired.
- FIGS. 2a-2e are diagrams of elevated, indented, or texturized display devices. In FIG. 2a, layer 204 lays in proximity to display device layer 202 with layer 203 providing separation. Although a single layer is shown, layers 202, 203, and 204 may be composed of a plurality of sublayers. Display device layer 202 can be either a flexible or rigid display device. Layer 204 can be configured and composed of a clear, flexible, electroactive polymer, polymer composite, or gel material. Electroactive polymers (EAPs), also known as electroactive plastics, can be pulled, expand, contract, deform, change shapes in controllable directions, change dimensions in predetermined directions, or change sizes electronically by applying an electric current, potential difference, voltage, time varying voltage, or electromagnetic fields across the material, as described in U.S. Pat. No. 6,117,296, U.S. Pat. No. 6,787,238, US Patent Publication No. 2008-188907, US Patent Publication No. 2004-199232, US Patent Publication No. 2005-157893, WO Publication 2007-114699, and “Electric Flex” by Yoseph Bar-Cohen (2001), all herein incorporated by reference as if fully set forth.
- Electroactive polymers (EAPs) may be dielectric or ionic EAPs. For dielectric EAPs, actuation can be caused by electrostatic forces between two electrodes that squeeze or compress the polymer. Although requiring a high actuation voltage, dielectric EAPs consume very little power and require no power to keep an actuator at a given position. Examples of dielectric EAPs are electrostrictive polymers and dielectric elastomers that are used for artificial muscles. For ionic EAPs, actuation is caused by the displacement of ions inside the polymer. Only a few volts are needed for actuation, but the ionic flow implies a higher electrical power needed for actuation, and energy is needed to keep the actuator at a given position. Examples of ionic EAPs are conductive polymers, ionic polymer-metal composites (IPMCs), and responsive gels.
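- The dielectric-EAP behavior described above can be given a rough quantitative feel. The sketch below is illustrative only and not part of the disclosure; it uses the commonly cited Maxwell-stress approximation p = ε0·εr·(V/t)² together with placeholder material values (the voltage, thickness, permittivity, and modulus are assumptions, not numbers from the source).

```python
# Illustrative sketch (assumptions noted above): estimating how hard a
# dielectric EAP film such as layer 204 is squeezed for a given drive voltage.
EPS0 = 8.854e-12          # vacuum permittivity, F/m

def eap_actuation(voltage_v, thickness_m, eps_r, youngs_modulus_pa):
    """Return (effective Maxwell pressure in Pa, approximate thickness strain)."""
    e_field = voltage_v / thickness_m                 # electric field, V/m
    pressure = EPS0 * eps_r * e_field ** 2            # Maxwell stress, Pa
    strain = -pressure / youngs_modulus_pa            # small-strain estimate
    return pressure, strain

# Example: a 50 um acrylic-like film driven at 2 kV (hypothetical numbers).
p, s = eap_actuation(voltage_v=2000, thickness_m=50e-6, eps_r=4.7,
                     youngs_modulus_pa=1.0e6)
print(f"pressure ~ {p/1e3:.1f} kPa, thickness strain ~ {s*100:.1f}%")
```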
- In another embodiment, layer 204 can also be configured and composed of piezoelectric materials or actuators that are bonded to a firm plastic component to form a piezo bending element, as explained in “TPaD: Tactile Pattern Display” by Colgate and Peshkin (2007), herein incorporated by reference as if fully set forth. When a potential difference is applied to a bonded or constricted piezoelectric material, it changes shape. The shape change can be controlled electrically to provide different surface textures, indentation, and elevation.
- In another embodiment, layer 204 can also be configured and composed of organic transistors formed on a flexible substrate to drive or contract a surface creating texture, indentation, or elevation. Organic transistors are transistors that use organic molecules rather than silicon for their active material. An advantage of organic transistors is the ability to operate on a flexible substrate. Similar to EAPs, organic transistors also exhibit material properties such that they can be pulled, expand, contract, deform, change shapes in controllable directions, change dimensions in predetermined directions, or change sizes electronically by applying an electric current, potential difference, voltage, time varying voltage, or electromagnetic fields.
- Still referring to FIG. 2a, portions of surface 216 are selectively elevated, indented, or texturized with one or more substantially cubicle segments 206, dot or dimple segments 208, substantially cylindrical segments 210, bulge segments 212, or indentation segments 214. The shape and texture of the indented or elevated portion depends on the image, document, or application to be displayed and the effects on the resolution of the display device. Because of their natural geometry, certain segments may provide a clearer display of the underlying image or document. Segments 206, 208, 210, 212, and 214 are controlled at least by displays controller 120 and controller 121 that adjust the height, indentation, or depression to multiple different levels, orientation, hardness, thickness, direction, vibration, or gyration individually for each segment. Display(s) elevation, indenting, or texturizing controller 121 may comprise analog or digital driving circuits (not shown) for driving the segments. Examples of display driving circuits are given in US Patent Publication Nos. 2008-062088, 2006-221016, or 2006-007078, all herein incorporated by reference as if fully set forth.
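- As an illustration of per-segment control of the kind just described, the following sketch models a hypothetical software interface for controller 121. The class name, field names, and the linear height-to-voltage mapping are assumptions made for illustration; they are not the patent's actual interface.

```python
# Illustrative sketch only: a hypothetical software-side model of a display
# elevation/indenting/texturizing controller managing individual segments.
from dataclasses import dataclass

@dataclass
class SegmentState:
    height_mm: float = 0.0      # positive = elevated, negative = indented
    hardness: float = 0.5       # normalized 0..1
    vibration_hz: float = 0.0   # 0 = no vibration

class TexturizingController:
    def __init__(self, max_height_mm=2.0, volts_per_mm=400.0):
        self.segments = {}                  # segment id -> SegmentState
        self.max_height_mm = max_height_mm
        self.volts_per_mm = volts_per_mm    # assumed linear drive mapping

    def set_segment(self, seg_id, height_mm=0.0, hardness=0.5, vibration_hz=0.0):
        height_mm = max(-self.max_height_mm, min(self.max_height_mm, height_mm))
        self.segments[seg_id] = SegmentState(height_mm, hardness, vibration_hz)

    def drive_voltages(self):
        # Translate each segment's target height into a voltage for the
        # (hypothetical) analog/digital driving circuit.
        return {sid: st.height_mm * self.volts_per_mm
                for sid, st in self.segments.items()}

ctrl = TexturizingController()
ctrl.set_segment(206, height_mm=1.0)        # raise a cubicle segment
ctrl.set_segment(214, height_mm=-0.5)       # indent an indentation segment
print(ctrl.drive_voltages())
```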
- In FIG. 2a, the operation and configuration of layer 204 may be independent of display device layer 202, thereby simplifying manufacturing since it can be an add-on or attachment to preexisting display systems or technologies. Also, in certain applications images may not be displayed on surface 216 in an area that is elevated, indented, or texturized, thereby making it darkened in order to make the area more noticeable to the user. For this configuration, the image displayed on display device layer 202 is rendered to adjust for the darkened area.
- FIG. 2b is a diagram of an elevated or texturized display device. Layer 218 lays in proximity to display device layer 220 with layer 219 providing separation. Although a single layer is shown, layers 218, 219, and 220 may be composed of a plurality of sublayers. Display device layer 220 is configured as a flexible display device, such as a flexible OLED. Layer 218 may be comprised of the same composition or materials explained above for layer 204, such as EAPs, piezoelectric materials, or organic transistors.
- In FIG. 2b, portions of surface 231 are selectively elevated or texturized with one or more substantially cubicle segments 222 1 controlling segments 222 2, dot or dimple segments 224 1 controlling segments 224 2, substantially cylindrical segments 226 1 controlling segments 226 2, or bulge segments 228 1 controlling segments 228 2. Segments 222 2, 224 2, 226 2, and 228 2 are controlled at least by displays controller 120 and/or controller 121 that adjust the height, orientation, direction, or gyration individually or collectively for each segment. Display(s) elevation, indenting, or texturizing controller 121 may comprise analog or digital driving circuits (not shown) for driving the segments. Since layer 218 is oriented below or behind display device layer 220, there is little interference with the resolution or clarity of images to be displayed on display device layer 220. Also, in certain applications images may not be displayed on surface 231 in an area that is elevated, indented, or texturized, thereby making it darkened in order to make the area more noticeable to the user. For this configuration, the image displayed on display device layer 220 is rendered to adjust for the darkened area.
- FIG. 2c is a diagram of an elevated, indented, or texturized display device. Display pixels 232 1 to 232 n lay adjacent to, on the same level as, or on the same layer as elevation, indenting, or texturizing cells 234 1 to 234 n. The display array or matrix 233 also comprises display pixels 236 1 to 236 n adjacent to elevation, indenting, or texturizing cells 238 1 to 238 n that are adjacent to display pixels 240 1 to 240 n. The elevation, indenting, or texturizing cells may be comprised of the same composition or materials explained above for layer 204 or 218 and are controlled at least by displays controller 120 and/or controller 121 that adjust the height, orientation, direction, or gyration individually or collectively for each cell. Display(s) elevation, indenting, or texturizing controller 121 may comprise analog or digital driving circuits (not shown) for driving the cells. In this embodiment, cells 234 1 to 234 n and 238 1 to 238 n may be illuminated based on the configuration of surrounding pixels to blend in with any images being displayed.
- FIG. 2d shows an embodiment of a display device array or matrix 235 from a top view where elevation, indenting, or texturizing cells 239 are placed selectively within a small area footprint so that the surface of display device array or matrix 235 is mostly comprised of display pixels 237. Having texturizing cells 239 sparsely placed in display device array or matrix 235 ensures minimal interference with a displayed image. In this embodiment the elevated, indented, or texturized cells may be unnoticeable to the human eye but detectable by touch or feeling of display device array or matrix 235.
- FIG. 2e is a diagram of an elevated, indented, or texturized display device. In FIG. 2e, display pixels 242 are in the same layer or level but separate from the elevation, indenting, or texturizing cells and display pixels in areas 244 and 246. FIG. 2e provides a hybrid layout with display pixels 242 operating with selectively placed elevation, indenting, or texturizing cells and display pixels in areas 244 and 246. In the example given in FIG. 2e, area 244 can provide scrolling functions while area 246 can be configured as a keyboard, dialpad, keypad, or any other interface.
- FIG. 3 is a diagram of an elevated or texturized display device. A matrix of pockets or cells 304 1 to 304 x lays on top of a display device 302. Matrix of pockets or cells 304 1 to 304 x may be full of compressed air or low heat activated gel that becomes elevated or texturized by heating elements 127 as a result of thermal expansion, as understood by one of ordinary skill in the art. Matrix of pockets or cells 304 1 to 304 x can be tapered but flat and seamless when unexpanded. Moreover, heating elements 127 can be used to provide different tactile sensations in combination with pockets or cells 304 1 to 304 x so that a user is provided varying temperatures, such as hot or cold information, relating to a displayed image.
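- The thermally driven elevation in FIG. 3 can be reasoned about with a back-of-the-envelope gas-expansion estimate. The sketch below is illustrative only; it assumes a sealed pocket that expands at roughly constant pressure (Charles's law, V proportional to absolute temperature) and can only grow upward over a fixed footprint, and the dimensions and temperatures are placeholders rather than values from the source.

```python
# Illustrative estimate: how much a sealed air pocket like cells 304 could
# bulge when warmed by heating elements 127 (assumptions noted above).

def bump_height_mm(initial_height_mm, t_ambient_c, t_heated_c):
    t1 = t_ambient_c + 273.15   # convert to kelvin
    t2 = t_heated_c + 273.15
    new_height = initial_height_mm * (t2 / t1)   # V ~ T at constant pressure
    return new_height - initial_height_mm

# A 1 mm pocket warmed from 25 C to 60 C rises by roughly a tenth of a mm.
print(f"{bump_height_mm(1.0, 25.0, 60.0):.3f} mm of elevation")
```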
- FIG. 4 is a diagram illustrating processes for an electronic device having a display device 402 with elevated, indented, or texturized display portions. Display device 402 can be assembled with at least some of the components described in device 100. For elevated, indented, or texturized applications, display device 402 may be configured with the devices described in FIG. 2a, 2c, or 2d, as desired. For elevated or certain texturized applications, display device 402 may be configured with the devices described in FIG. 2b or 3, as desired. For illustrative purposes, in FIG. 4 a darkened or black portion represents an indented portion, a white portion represents an elevated portion, and a checkered pattern represents a texturized portion.
- For inputting data or triggering a request action, a “click here” displayed link is provided with a combination of an indented portion 404 1 and an elevated portion 404 2. Moreover, part of a virtual or simulated keyboard displayed on display device 402 provides the letter “E” key with a partially displayed portion 406, an elevated circular portion 414, and an elevated square portion 408. Although part of a virtual or simulated keyboard is shown, display device 402 can be configured to show a whole QWERTY keyboard, a numbered keypad for dialing, or a combination of a whole QWERTY keyboard and a numbered keypad, as desired. The letter “S” key is provided by a partially displayed portion and an elevated circular portion 410. The letter “Q” key is completely elevated by portion 412. The virtual or simulated keyboard can also be programmed to replicate Braille lettering, as desired.
- In addition to key inputs, different pressure forces applied to portions 404 1, 404 2, 406, 408, 410, 412, and 414 can be detected by pressure sensors 123 in combination with touch detectors 124 and/or display(s) elevation, indenting, or texturizing controller 121 by measuring gradient, force, or potential difference values. Moreover, in response to a detected force by pressure sensors 123 and touch detectors 124, haptic feedback, force feedback, or tactile feedback in the form of a played sound, gyration, or vibration can be provided via I/O controller 116.
- Still referring to the virtual or simulated keyboard on display device 402, instructions in software 108 can be used to predict or anticipate keystrokes based on a word or sentence entered. In response to the anticipation, different keys can be raised, indented, or texturized in order to increase typing speeds.
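- A minimal sketch of such keystroke anticipation follows. It is illustrative only: the tiny word list and the raise/lower callbacks are placeholders, and the prefix-matching heuristic is just one simple way the prediction in software 108 could be realized.

```python
# Illustrative sketch only: predict likely next letters and raise those keys.
VOCAB = ["hello", "help", "held", "hero", "world", "word"]

def likely_next_letters(prefix, vocab=VOCAB, top_n=3):
    counts = {}
    for word in vocab:
        if word.startswith(prefix) and len(word) > len(prefix):
            nxt = word[len(prefix)]
            counts[nxt] = counts.get(nxt, 0) + 1
    return sorted(counts, key=counts.get, reverse=True)[:top_n]

def update_keyboard(prefix, raise_key, lower_all):
    lower_all()                          # reset any previously raised keys
    for letter in likely_next_letters(prefix):
        raise_key(letter)                # elevate the most probable next keys

update_keyboard("hel",
                raise_key=lambda k: print("raise key:", k),
                lower_all=lambda: print("lower all keys"))
```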
- An embodiment of the present invention provides electronic advertising processes. Advertisement 416 can be sold to an advertiser for a certain price for having elevated portions 416 1 and indentation 416 2 on at least one part of the advertisement or on the entire advertisement. Advertisement 418 can be sold to an advertiser for a different price, higher or lower, for having elevated portions provided in combination with GPS device 114. Advertisement 419 can be sold to an advertiser for a different price for having a plurality of elevated, indented, or texturized portions 419 1.
- An embodiment of the present invention provides electronic business processes. A “Buy Now” button is provided with an elevated circular portion 422 1 and a square portion 422 2. The “Buy Now” button is associated with triggering the purchasing of shirt 424 by sending a request to a server (not shown) over one or more network adapters 128. For shirt 424, a texturizing portion 426 is provided to replicate or simulate the surface of shirt 424. Texturizing portion 426 can be a combination of elevated and indented cells. Although a shirt 424 is shown, texturizing portion 426 can be used to provide surface information for any product being sold or displayed on display device 402, such as electronics, home goods, jewelry, etc.
- Using touch detectors 124 in combination with display(s) elevation, indenting, or texturizing controller 121, shirt 424 can be rotated in response to a multitouch input while texturizing portion 426 is dynamically changed to reflect the different surfaces or materials used in the product. Shirt 424 can be zoomed in and out using multitouch inputs detected by touch detectors 124, with each zoom level reflecting texture differences on portion 426. For instance, a zoomed in view can be more grainy or rough compared to a zoomed out view. The zoom levels can also be configured with a fading in or out effect by one or more processors 102 and can involve retrieving additional information from a server (not shown) over one or more network adapters 128. Moreover, if certain rare or uncommon materials cannot be replicated or simulated by texturizing portion 426, legend 425 identifies or associates those materials, such as rabbit skin, llama wool, and rare silk, with texturized portions displayed alongside portion 426.
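- The zoom-dependent texture behavior above can be sketched as a simple mapping. This is illustrative only; the linear zoom-to-roughness relation, the clamping range, and the cell-drive callback are assumptions, not the disclosed implementation.

```python
# Illustrative sketch only: grainier texture on portion 426 when zoomed in.
def roughness_for_zoom(zoom, base_roughness=0.2, max_roughness=1.0):
    """zoom = 1.0 is the default view; larger values are zoomed in."""
    return min(max(base_roughness * zoom, 0.0), max_roughness)

def apply_texture(zoom, set_cell_roughness, cell_ids):
    r = roughness_for_zoom(zoom)
    for cid in cell_ids:
        set_cell_roughness(cid, r)   # drive each texturizing cell in portion 426
    return r

print(apply_texture(3.0, lambda cid, r: None, cell_ids=range(4)))  # ~0.6
```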
- Still referring to FIG. 4, an embodiment of the present invention provides an electronic game with elevated, indented, or texturizing portion 428 (hereinafter “portion 428”), such as tic-tac-toe. Portion 428 can detect different pressure forces when pushed down, pushed sideways, or pulled sideways, providing another metric or feature for the man-machine interface. These different pressure forces can be detected by pressure sensors 123 in combination with touch detectors 124 and/or display(s) elevation, indenting, or texturizing controller 121 by measuring gradient, force, or potential difference values of touches to raised portions in portion 428. Another gaming application comprises portion 428 emulating a piano or guitar.
- Moreover, a game can receive as inputs flicks, pinches, or scratches to portion 428 and generate an action on display device 402 in response to each detected action differently. A pinch to a raised portion of 428 can represent an object or block being picked up or opened in a game or any other simulated environment. Portion 428 can also control scrolling or drag and drop functions in combination with multitouch inputs detected by touch detectors 124. In another embodiment, certain portions 428 can be used as a miniature joystick or pointing stick for 360 degree rotational input that is detected by pressure sensors 123 in combination with touch detectors 124 and/or display(s) elevation, indenting, or texturizing controller 121. A three dimensional accelerometer can be included in sensors 126 to be used in combination with display(s) elevation, indenting, or texturizing controller 121 to raise part of portion 428 in response to a programmed action in the game. Portion 428 can also be used to simulate or replicate a lottery scratching/rubbing game or a children's productivity game, as desired. Portion 428 can also provide a gaming feature where tilting or rotation detected by an accelerometer in sensors 126 raises some portions while indenting others for four dimensional motion gaming.
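- One way the tilt-driven raising and indenting could work is sketched below. It is illustrative only; the 4x4 grid size, the gain, and the plane-like height mapping are assumptions chosen to show the idea of cells rising on the "uphill" side while the "downhill" side indents.

```python
# Illustrative sketch only: map accelerometer tilt (sensors 126) to cell heights.
GRID = 4          # portion 428 modeled as a 4x4 grid of cells (assumption)
GAIN_MM = 1.5     # how strongly tilt maps to elevation (assumption)

def cell_heights(tilt_x, tilt_y):
    """tilt_x/tilt_y are normalized accelerometer readings in [-1, 1]."""
    heights = []
    for row in range(GRID):
        for col in range(GRID):
            # Center the grid at 0 so tilt produces a plane through the middle.
            dx = (col - (GRID - 1) / 2) / (GRID - 1)
            dy = (row - (GRID - 1) / 2) / (GRID - 1)
            heights.append(round(GAIN_MM * (tilt_x * dx + tilt_y * dy), 2))
    return heights   # positive = elevate cell, negative = indent cell

print(cell_heights(tilt_x=0.5, tilt_y=-0.2))
```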
- In another embodiment, portion 428 can provide online collaboration, online conferencing, social networking, or online dating. In response to a push or pull input on a networked computing device (not shown) having an elevated, indented, or texturized display device, portion 428 provides feedback, similar to a poke on Facebook. In online conferencing, tactile inputs to portion 428 can be used during a video conference application for additional interaction between conversing parties. For social networking, adult entertainment can be enhanced by portion 428 providing stimulation in connection with video, image, or audio media on display device 402.
- Additional processes exist within the medical field for online collaboration. Portion 428 can provide remote medical diagnostics and measurements, such as a pressure test, over a network using one or more network adapters 128 and pressure sensors 123 in combination with touch detectors 124 and/or display(s) elevation, indenting, or texturizing controller 121. Diagnostics and measurements include tactile measurements for motor skills, respiratory measurements for testing lung pressure by a person blowing on portion 428, or muscular measurements of range of motion and strength, as desired. Therefore, portion 428 can provide bi-directional information exchange.
- Still referring to medical processes, portion 428 can be used for the elderly or disabled to provide simple, small scale physical therapy exercises for the hand or fingers by allowing a user to pull raised portions at different tensions and resistances or to pick up objects. This is particularly useful for stroke victims by providing mental therapy, exercise, and massaging of small muscles and ligaments.
- In another embodiment, portion 428 can be used to provide a plurality of processes or applications. Portion 428 can simulate maps, topography, geography, imagery, or location service processes in combination with GPS device 114. Portion 428 can display mountainous regions on a map by elevating and oceans by indenting. Portion 428 can follow a musical sequence being played on device 100 for a ringtone during a received call or song. Moreover, portion 428 can be used to display content in an email, a 3rd Generation Partnership Project (3GPP) or 3GPP2 short message service (SMS) text message, a 3GPP or 3GPP2 multimedia message service (MMS) message, an image or video motion in an MMS message, a PDF application, a word document, excel graphs, excel charts, a four dimensional (4-D) screensaver, 4-D art, 4-D drawings, 3-D imagery, a 3-D sculpture, a 4-D “etch-a-sketch”, or architecture designs using scalable or vector graphics. Any of the content given above and displayed by portion 428 may be transmitted or received over one or more network adapters 128.
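- The map application just mentioned, raising mountains and indenting oceans, can be sketched as a simple quantization of elevation data into cell heights. This is an illustrative example only; the tiny terrain grid, the height limits, and the linear scaling are placeholder assumptions.

```python
# Illustrative sketch only: terrain heights (meters) -> tactile cell heights (mm).
MAX_CELL_MM = 2.0    # assumed maximum cell elevation
MIN_CELL_MM = -1.0   # assumed maximum cell indentation (used for oceans)

def terrain_to_cells(elevation_m):
    """elevation_m: 2-D list of terrain heights in meters (negative = sea)."""
    peak = max(1.0, max(max(row) for row in elevation_m))
    cells = []
    for row in elevation_m:
        cells.append([
            MIN_CELL_MM if h < 0 else round(MAX_CELL_MM * h / peak, 2)
            for h in row
        ])
    return cells

terrain = [[-50, 0, 800],
           [200, 1500, 3000],
           [-10, 400, 2500]]
for line in terrain_to_cells(terrain):
    print(line)
```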
- Moreover, portion 428 can provide, replicate, or simulate integrated circuit layouts, electrical circuit layouts, facial features, enhanced video clips, computer aided designs (CAD), semiconductor layouts, prototyping, modeling, molding for producing form factors, logos, trademarks, a children's educational product, a general education product, a 3-D drawing tool, distance learning, or pop-up children's books, as desired. In addition, portion 428 can be responsive to voice or visual commands or recognition detected by sensors 126 for being elevated, indented, or texturized.
- Moreover, portion 428 can provide object awareness to display device 402. For instance, a post-it note can be detected when pressure sensors 123, in combination with touch detectors 124 and/or display(s) elevation, indenting, or texturizing controller 121, determine that the adhesive on the post-it note adds resistivity or elasticity to a raised or elevated cell in portion 428. In response to the detected post-it note, display device 402 can adapt and reformat the images around the note so that they are not obstructed from the user.
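- A minimal sketch of such object awareness follows. It is illustrative only: the "stickiness" threshold, the per-cell reading format, and the idea of returning a bounding box for the layout engine to avoid are all assumptions layered on the behavior described above.

```python
# Illustrative sketch only: flag cells whose raise/release behavior shows extra
# resistance (as a post-it's adhesive might) and report a region to reflow around.
STICKY_THRESHOLD = 0.3   # extra resistance vs. an unloaded cell (normalized)

def find_sticky_region(cell_resistance, baseline):
    """cell_resistance/baseline: dicts of (row, col) -> normalized resistance."""
    sticky = [rc for rc, r in cell_resistance.items()
              if r - baseline.get(rc, 0.0) > STICKY_THRESHOLD]
    if not sticky:
        return None
    rows = [r for r, _ in sticky]
    cols = [c for _, c in sticky]
    # Bounding box (top, left, bottom, right) the layout engine should avoid.
    return (min(rows), min(cols), max(rows), max(cols))

baseline = {(r, c): 0.1 for r in range(4) for c in range(4)}
reading = {**baseline, (1, 1): 0.6, (1, 2): 0.55, (2, 1): 0.5}
print(find_sticky_region(reading, baseline))   # -> (1, 1, 2, 2)
```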
- Moreover, portion 428 can provide advanced Bluetooth capabilities for Bluetooth keyboards or headsets and can function as a Bluetooth device itself for medical applications. When a call is received over one or more network adapters 128, a preprogrammed texturized pattern is reproduced on portion 428 for notifying the user, such as when display device 402 is in a shirt pocket in hands-free mode communicating with a wireless headset. Alternatively, the texturized pattern reproduced on portion 428 during an incoming call can be controlled, designed, or customized by the calling party if the function is enabled on device 100.
- Still referring to FIG. 4, another embodiment provides object detection for a 3-D object that is placed on area 429 having a combination of elevated cells, indented cells, and/or texturized cells. FIG. 5 is a process 500 for detecting objects or shapes using elevated, indented, or texturized display portions. Although process 500 can be performed by device 100 in a fat client architecture, device 100 can also be configured as a thin client by sharing shape detection processing functions with a server (not shown) using one or more network adapters 128. Cells are selectively raised around an object placed on area 429 by display(s) elevation, indenting, or texturizing controller 121 (step 502). The weight of the object is detected by pressure sensors 123 and shape detectors 125, and a height graph of the surface of the object is generated by one or more processors 102 (step 504). The perimeter of the object placed on area 429 is determined by one or more processors 102 and shape detectors 125 by raising or lowering cells in proximity to the object by display(s) elevation, indenting, or texturizing controller 121 (step 506). One or more processors 102 calculate gradient values (step 508) and generate a surface graph (step 510) based on the previous measurements made.
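- One possible software realization of steps 502-510 of process 500 is sketched below. It is illustrative only: the grid of per-cell pressure readings stands in for what pressure sensors 123 and shape detectors 125 report, and the pressure-to-height scale and example data are placeholders.

```python
# Illustrative sketch only: derive weight, height graph, perimeter, and
# gradients ("surface graph") from per-cell pressure readings.
MM_PER_UNIT = 2.0   # assumed conversion from a pressure unit to height in mm

def analyze_object(pressure):
    """pressure: 2-D list of normalized per-cell readings (0 = no contact)."""
    weight = sum(sum(row) for row in pressure)                     # step 504
    heights = [[p * MM_PER_UNIT for p in row] for row in pressure] # height graph
    contact = {(r, c) for r, row in enumerate(pressure)
               for c, p in enumerate(row) if p > 0}
    # Step 506: perimeter cells are contact cells with a non-contact neighbor.
    perimeter = {(r, c) for (r, c) in contact
                 if any((r + dr, c + dc) not in contact
                        for dr, dc in [(1, 0), (-1, 0), (0, 1), (0, -1)])}
    # Step 508: simple forward-difference gradients along rows.
    gradients = [[row[c + 1] - row[c] for c in range(len(row) - 1)]
                 for row in heights]
    return {"weight": weight, "heights": heights,                  # step 510
            "perimeter": sorted(perimeter), "gradients": gradients}

reading = [[0.0, 0.2, 0.0],
           [0.3, 0.9, 0.3],
           [0.0, 0.2, 0.0]]
print(analyze_object(reading)["perimeter"])
```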
- Moreover, display device 402 may have infrared detectors 430 1-430 4 in a slightly beveled position or level with the frame of display device 402. Display device 402 may also have digital cameras 434 1-434 4 for capturing, tracking, and detecting shapes using algorithms such as that described in U.S. Pat. No. 7,317,872, herein incorporated by reference as if fully set forth, that can be used to perform additional sensor measurements (step 512). Other sensor measurements for additional metrics and refinement include infrared or optical detection to detect depth. These sensors can be embedded next to or within each display cell in display device 402. Based on steps 502-512, a preliminary image may be rendered by one or more processors 102. The preliminary image can be compared and matched against a database of images in storage 110, or stored remotely, using artificial intelligence algorithms. Information is then retrieved by one or more network adapters 128 based on the detected object and/or preliminary image (step 514).
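- As an illustration of the matching in step 514, the sketch below compares a rendered preliminary image against a small database of stored silhouettes. It is illustrative only; a plain pixel-overlap score stands in for the artificial-intelligence matching mentioned above, and the tiny bitmaps are made-up examples.

```python
# Illustrative sketch only: match a preliminary image against stored images.
def overlap_score(a, b):
    """Jaccard-style agreement between two equal-sized binary bitmaps."""
    on_a = {(r, c) for r, row in enumerate(a) for c, v in enumerate(row) if v}
    on_b = {(r, c) for r, row in enumerate(b) for c, v in enumerate(row) if v}
    union = on_a | on_b
    return len(on_a & on_b) / len(union) if union else 0.0

def best_match(preliminary, database):
    scores = {name: overlap_score(preliminary, img) for name, img in database.items()}
    return max(scores, key=scores.get), scores

ring = [[0, 1, 0], [1, 0, 1], [0, 1, 0]]
db = {"ring":  [[0, 1, 0], [1, 0, 1], [0, 1, 0]],
      "screw": [[1, 0, 0], [1, 0, 0], [1, 1, 0]]}
name, scores = best_match(ring, db)
print(name, scores)   # "ring" scores 1.0, "screw" scores much lower
```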
- In a process involving area 429 and process 500, a ring size is detected and related information, such as from an online jewelry store, is retrieved over one or more network adapters in response. Alternatively, the size and type of certain household goods, such as hardware, screws, nuts, light bulbs, or batteries, can be determined by area 429 and process 500. Moreover, a key placed on area 429 can be keyed and the information sent over one or more network adapters 128 for subsequent duplication and mail delivery by an online store. In addition, process 500 can be used to obtain biometric information for security purposes.
- In another process involving area 429, intellectual property assets, such as patents, trademarks, or copyrights, relating to the shape of a detected object are retrieved and displayed in a map format on display device 402 to show a correspondence between similar features of an object and related intellectual property assets. FIG. 7 is a process using an elevated, indented, or texturized display device for identifying intellectual property assets. The shape of a widget 702 placed on area 429 is detected by process 500 and digitally rendered. The detected shape of the widget 702 is compared against widgets 706 1, 706 2, or 706 3 shown and described in U.S. Pat. No. X (704) stored in a database. The comparison between widgets can be performed graphically using image rendering, as understood by one of ordinary skill in the art. Moreover, artificial intelligence algorithms can be used to compare claim text or descriptions 710 in U.S. Pat. No. X (704) against features detected by area 429 and process 500. If a match is determined or found, the widget 702 is associated with U.S. Pat. No. X (708) and displayed in a map format on display device 402.
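- A minimal sketch of the matching decision in the FIG. 7 process follows. It is illustrative only: combining a graphical similarity score with a simple claim-term comparison, with weights, the feature list, and the threshold all chosen as assumptions rather than taken from the disclosure.

```python
# Illustrative sketch only: decide whether a detected widget matches a patent.
MATCH_THRESHOLD = 0.6

def text_feature_score(detected_features, claim_terms):
    """Fraction of claim terms that also appear among the detected features."""
    if not claim_terms:
        return 0.0
    hits = sum(1 for term in claim_terms if term in detected_features)
    return hits / len(claim_terms)

def match_patent(shape_score, detected_features, patent):
    """shape_score: graphical similarity in [0, 1] from image comparison."""
    text_score = text_feature_score(detected_features, patent["claim_terms"])
    combined = 0.5 * shape_score + 0.5 * text_score
    return combined >= MATCH_THRESHOLD, combined

patent_x = {"number": "U.S. Pat. No. X",
            "claim_terms": ["threaded shaft", "hexagonal head", "flange"]}
matched, score = match_patent(
    shape_score=0.8,
    detected_features={"threaded shaft", "hexagonal head"},
    patent=patent_x)
print(matched, round(score, 2))   # True, 0.73
```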
- In another embodiment, display device 402 replicates, mimics, or simulates a customizable or programmable interface or control panel for a remote control, an instrument panel on a vehicle, an automobile dashboard configuration, audio equalizers, multitouch equalizers, a radio button list, or a consumer electronics button surface with raised button portions 432 1-432 3. The simulated interface can be used to sell consumer electronics or function as an advanced user guide whereby input, output, and programming functions are simulated with button portions 432 1-432 3 that have the same size and shape as the actual buttons on a product. Moreover, button portions 432 1-432 3 can be programmed for volume control or for replicating smart home switches or controllers, as desired.
- Still referring to FIG. 4, advanced web searching 436 is performed by measuring the pressure applied to, or detecting the depth of, raised or elevated portion 438. Web searching 436 can be used in combination with area 429 to display hits or webpages relevant to detected objects.
- FIG. 6 is a process 600 using an elevated, indented, or texturized display device that can be selectively performed by the display devices described above. Information is received from one or more network adapters 128, I/O devices 118, or storage device 110 (step 602). The sector of cells to be elevated, indented, or texturized based on the received information is checked and tested by display(s) elevation, indenting, or texturizing controller 121 and display controllers 120 (step 604) to determine how high image quality in the area can be maintained (step 606) by raising or lowering selective cells. One or more processors 102 in combination with sensors 126 determine the display orientation or viewing angle (step 608), which is taken into consideration to properly elevate, indent, or texturize the display devices described above. If an image of an object is to be simulated or replicated, it is rendered by one or more processors 102 and checked to determine whether it can be properly displayed (step 610). The cells in the display device are elevated, indented, or texturized (step 612).
- Although features and elements are described above in particular combinations, each feature or element can be used alone without the other features and elements or in various combinations with or without other features and elements. The methods, processes, or flow charts provided herein may be implemented in a computer program, software, or firmware incorporated in a computer-readable storage medium for execution by a general purpose computer or a processor. Examples of computer-readable storage mediums include a read only memory (ROM), a random access memory (RAM), a register, cache memory, semiconductor memory devices, magnetic media such as internal hard disks and removable disks, magneto-optical media, and optical media such as CD-ROM disks, digital versatile disks (DVDs), and Blu-ray discs.
- Suitable processors include, by way of example, a general purpose processor, a special purpose processor, a conventional processor, a digital signal processor (DSP), a plurality of microprocessors, one or more microprocessors in association with a DSP core, a controller, a microcontroller, Application Specific Integrated Circuits (ASICs), Field Programmable Gate Arrays (FPGAs) circuits, any other type of integrated circuit (IC), and/or a state machine.
- A processor in association with software may be used to implement a radio frequency transceiver for use in a computer, wireless transmit receive unit (WTRU), user equipment (UE), terminal, base station, radio network controller (RNC), or any host computer. The WTRU may be used in conjunction with modules, implemented in hardware and/or software, such as a camera, a video camera module, a videophone, a speakerphone, a vibration device, a speaker, a microphone, a television transceiver, a hands free headset, a keyboard, a Bluetooth® module, a frequency modulated (FM) radio unit, a liquid crystal display (LCD) display unit, an organic light-emitting diode (OLED) display unit, a digital music player, a media player, a video game player module, an Internet browser, and/or any wireless local area network (WLAN) or Ultra Wide Band (UWB) module.
Claims (10)
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US15/079,660 US9405371B1 (en) | 2009-03-18 | 2016-03-24 | Controllable tactile sensations in a consumer device |
US15/222,265 US9778840B2 (en) | 2009-03-18 | 2016-07-28 | Electronic device with an interactive pressure sensitive multi-touch display |
US15/239,264 US9547368B2 (en) | 2009-03-18 | 2016-08-17 | Electronic device with a pressure sensitive multi-touch display |
US15/365,225 US9772772B2 (en) | 2009-03-18 | 2016-11-30 | Electronic device with an interactive pressure sensitive multi-touch display |
Applications Claiming Priority (5)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US12/406,273 US8686951B2 (en) | 2009-03-18 | 2009-03-18 | Providing an elevated and texturized display in an electronic device |
US13/291,375 US8866766B2 (en) | 2009-03-18 | 2011-11-08 | Individually controlling a tactile area of an image displayed on a multi-touch display |
US14/485,246 US9335824B2 (en) | 2009-03-18 | 2014-09-12 | Mobile device with a pressure and indentation sensitive multi-touch display |
US15/060,016 US9459728B2 (en) | 2009-03-18 | 2016-03-03 | Mobile device with individually controllable tactile sensations |
US15/079,660 US9405371B1 (en) | 2009-03-18 | 2016-03-24 | Controllable tactile sensations in a consumer device |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/060,016 Continuation US9459728B2 (en) | 2009-03-18 | 2016-03-03 | Mobile device with individually controllable tactile sensations |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/222,265 Continuation US9778840B2 (en) | 2009-03-18 | 2016-07-28 | Electronic device with an interactive pressure sensitive multi-touch display |
Publications (2)
Publication Number | Publication Date |
---|---|
US20160209924A1 true US20160209924A1 (en) | 2016-07-21 |
US9405371B1 US9405371B1 (en) | 2016-08-02 |
Family
ID=42737109
Family Applications (13)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US12/406,273 Active 2031-05-31 US8686951B2 (en) | 2009-03-18 | 2009-03-18 | Providing an elevated and texturized display in an electronic device |
US13/291,375 Active 2029-09-12 US8866766B2 (en) | 2009-03-18 | 2011-11-08 | Individually controlling a tactile area of an image displayed on a multi-touch display |
US14/485,246 Expired - Fee Related US9335824B2 (en) | 2009-03-18 | 2014-09-12 | Mobile device with a pressure and indentation sensitive multi-touch display |
US15/060,016 Active US9459728B2 (en) | 2009-03-18 | 2016-03-03 | Mobile device with individually controllable tactile sensations |
US15/061,580 Active US9400558B2 (en) | 2009-03-18 | 2016-03-04 | Providing an elevated and texturized display in an electronic device |
US15/079,660 Active - Reinstated US9405371B1 (en) | 2009-03-18 | 2016-03-24 | Controllable tactile sensations in a consumer device |
US15/080,025 Active US9423905B2 (en) | 2009-03-18 | 2016-03-24 | Providing an elevated and texturized display in a mobile electronic device |
US15/145,766 Active US9448632B2 (en) | 2009-03-18 | 2016-05-03 | Mobile device with a pressure and indentation sensitive multi-touch display |
US15/222,265 Active US9778840B2 (en) | 2009-03-18 | 2016-07-28 | Electronic device with an interactive pressure sensitive multi-touch display |
US15/239,264 Active US9547368B2 (en) | 2009-03-18 | 2016-08-17 | Electronic device with a pressure sensitive multi-touch display |
US15/365,225 Active US9772772B2 (en) | 2009-03-18 | 2016-11-30 | Electronic device with an interactive pressure sensitive multi-touch display |
US15/694,930 Active US10191652B2 (en) | 2009-03-18 | 2017-09-04 | Electronic device with an interactive pressure sensitive multi-touch display |
US16/234,078 Abandoned US20190129610A1 (en) | 2009-03-18 | 2018-12-27 | Electronic device with an elevated and texturized display |
Family Applications Before (5)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US12/406,273 Active 2031-05-31 US8686951B2 (en) | 2009-03-18 | 2009-03-18 | Providing an elevated and texturized display in an electronic device |
US13/291,375 Active 2029-09-12 US8866766B2 (en) | 2009-03-18 | 2011-11-08 | Individually controlling a tactile area of an image displayed on a multi-touch display |
US14/485,246 Expired - Fee Related US9335824B2 (en) | 2009-03-18 | 2014-09-12 | Mobile device with a pressure and indentation sensitive multi-touch display |
US15/060,016 Active US9459728B2 (en) | 2009-03-18 | 2016-03-03 | Mobile device with individually controllable tactile sensations |
US15/061,580 Active US9400558B2 (en) | 2009-03-18 | 2016-03-04 | Providing an elevated and texturized display in an electronic device |
Family Applications After (7)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/080,025 Active US9423905B2 (en) | 2009-03-18 | 2016-03-24 | Providing an elevated and texturized display in a mobile electronic device |
US15/145,766 Active US9448632B2 (en) | 2009-03-18 | 2016-05-03 | Mobile device with a pressure and indentation sensitive multi-touch display |
US15/222,265 Active US9778840B2 (en) | 2009-03-18 | 2016-07-28 | Electronic device with an interactive pressure sensitive multi-touch display |
US15/239,264 Active US9547368B2 (en) | 2009-03-18 | 2016-08-17 | Electronic device with a pressure sensitive multi-touch display |
US15/365,225 Active US9772772B2 (en) | 2009-03-18 | 2016-11-30 | Electronic device with an interactive pressure sensitive multi-touch display |
US15/694,930 Active US10191652B2 (en) | 2009-03-18 | 2017-09-04 | Electronic device with an interactive pressure sensitive multi-touch display |
US16/234,078 Abandoned US20190129610A1 (en) | 2009-03-18 | 2018-12-27 | Electronic device with an elevated and texturized display |
Country Status (1)
Country | Link |
---|---|
US (13) | US8686951B2 (en) |
Families Citing this family (56)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8686951B2 (en) | 2009-03-18 | 2014-04-01 | HJ Laboratories, LLC | Providing an elevated and texturized display in an electronic device |
US8730182B2 (en) * | 2009-07-30 | 2014-05-20 | Immersion Corporation | Systems and methods for piezo-based haptic feedback |
US8441465B2 (en) | 2009-08-17 | 2013-05-14 | Nokia Corporation | Apparatus comprising an optically transparent sheet and related methods |
US20110199342A1 (en) * | 2010-02-16 | 2011-08-18 | Harry Vartanian | Apparatus and method for providing elevated, indented or texturized sensations to an object near a display device or input detection using ultrasound |
US9791928B2 (en) * | 2010-04-26 | 2017-10-17 | Nokia Technologies Oy | Apparatus, method, computer program and user interface |
US9733705B2 (en) * | 2010-04-26 | 2017-08-15 | Nokia Technologies Oy | Apparatus, method, computer program and user interface |
US9715275B2 (en) * | 2010-04-26 | 2017-07-25 | Nokia Technologies Oy | Apparatus, method, computer program and user interface |
US9132352B1 (en) | 2010-06-24 | 2015-09-15 | Gregory S. Rabin | Interactive system and method for rendering an object |
US9971405B2 (en) * | 2010-09-27 | 2018-05-15 | Nokia Technologies Oy | Touch sensitive input |
US8174931B2 (en) * | 2010-10-08 | 2012-05-08 | HJ Laboratories, LLC | Apparatus and method for providing indoor location, position, or tracking of a mobile computer using building information |
US20120133494A1 (en) * | 2010-11-29 | 2012-05-31 | Immersion Corporation | Systems and Methods for Providing Programmable Deformable Surfaces |
US8743244B2 (en) | 2011-03-21 | 2014-06-03 | HJ Laboratories, LLC | Providing augmented reality based on third party information |
US8717151B2 (en) * | 2011-05-13 | 2014-05-06 | Qualcomm Incorporated | Devices and methods for presenting information to a user on a tactile output surface of a mobile device |
JP2013058037A (en) * | 2011-09-07 | 2013-03-28 | Konami Digital Entertainment Co Ltd | Item selection device, item selection method, and program |
US8928582B2 (en) | 2012-02-17 | 2015-01-06 | Sri International | Method for adaptive interaction with a legacy software application |
US20130215038A1 (en) * | 2012-02-17 | 2013-08-22 | Rukman Senanayake | Adaptable actuated input device with integrated proximity detection |
US9218526B2 (en) | 2012-05-24 | 2015-12-22 | HJ Laboratories, LLC | Apparatus and method to detect a paper document using one or more sensors |
KR102007651B1 (en) * | 2012-12-21 | 2019-08-07 | 삼성전자주식회사 | Touchscreen keyboard configuration method, apparatus, and computer-readable medium storing program |
JP6276385B2 (en) | 2013-04-26 | 2018-02-07 | イマージョン コーポレーションImmersion Corporation | Passive stiffness and active deformation haptic output device for flexible displays |
US9361709B2 (en) * | 2013-05-08 | 2016-06-07 | International Business Machines Corporation | Interpreting texture in support of mobile commerce and mobility |
US20150034469A1 (en) * | 2013-08-05 | 2015-02-05 | Samsung Display Co., Ltd. | Formable input keypad and display device using the same |
US9824642B2 (en) * | 2013-09-27 | 2017-11-21 | Intel Corporation | Rendering techniques for textured displays |
US20150234488A1 (en) * | 2014-02-17 | 2015-08-20 | Denso International America, Inc. | System for integrating smart device with vehicle |
US9965974B2 (en) * | 2014-03-11 | 2018-05-08 | Technologies Humanware Inc. | Portable device with virtual tactile keyboard and refreshable Braille display |
US10490167B2 (en) * | 2014-03-25 | 2019-11-26 | Intel Corporation | Techniques for image enhancement using a tactile display |
TWI489151B (en) * | 2014-05-09 | 2015-06-21 | Wistron Corp | Method, apparatus and cell for displaying three dimensional object |
US10101829B2 (en) | 2014-06-11 | 2018-10-16 | Optelec Holding B.V. | Braille display system |
US9904366B2 (en) * | 2014-08-14 | 2018-02-27 | Nxp B.V. | Haptic feedback and capacitive sensing in a transparent touch screen display |
US9690381B2 (en) | 2014-08-21 | 2017-06-27 | Immersion Corporation | Systems and methods for shape input and output for a haptically-enabled deformable surface |
US9535550B2 (en) | 2014-11-25 | 2017-01-03 | Immersion Corporation | Systems and methods for deformation-based haptic effects |
US20160240102A1 (en) * | 2015-02-12 | 2016-08-18 | Vikram BARR | System for audio-tactile learning with reloadable 3-dimensional modules |
CN104679220A (en) * | 2015-03-19 | 2015-06-03 | 上海华勤通讯技术有限公司 | Mobile terminal and message reminding method thereof |
CA2995815C (en) * | 2015-06-22 | 2021-08-24 | The Brigham And Women's Hospital, Inc | Home evaluation of the quality of semen samples |
WO2017030497A1 (en) * | 2015-08-20 | 2017-02-23 | Robert Bosch Gmbh | Layer arrangement and input/output device |
KR102472970B1 (en) | 2015-09-07 | 2022-12-01 | 엘지전자 주식회사 | Display device |
US10345905B2 (en) * | 2015-09-08 | 2019-07-09 | Apple Inc. | Electronic devices with deformable displays |
US20170195736A1 (en) * | 2015-12-31 | 2017-07-06 | Opentv, Inc. | Systems and methods for enabling transitions between items of content |
WO2018018442A1 (en) * | 2016-07-27 | 2018-02-01 | 深圳市柔宇科技有限公司 | Display interface control method and device for misoperation prevention, and terminal |
US9799279B1 (en) | 2016-09-15 | 2017-10-24 | Essential Products, Inc. | Electronic display with a relief |
US10484201B2 (en) | 2016-09-28 | 2019-11-19 | Samsung Electronics Co., Ltd. | Distributed platform for robust execution of smart home applications |
US10607506B2 (en) | 2016-10-05 | 2020-03-31 | International Business Machines Corporation | Braille reading using fingerprint scanner and varying vibration frequencies |
CN106814912B (en) * | 2017-01-17 | 2021-01-26 | 京东方科技集团股份有限公司 | Pressure touch sensor, display device and driving method thereof |
CN106844033B (en) * | 2017-01-23 | 2020-07-28 | 努比亚技术有限公司 | Application quick starting method and terminal |
US10396272B2 (en) * | 2017-05-04 | 2019-08-27 | International Business Machines Corporation | Display distortion for alignment with a user gaze direction |
US10460442B2 (en) | 2017-05-04 | 2019-10-29 | International Business Machines Corporation | Local distortion of a two dimensional image to produce a three dimensional effect |
US10692400B2 (en) | 2017-08-08 | 2020-06-23 | Educational Media Consulting, Llc | Method of mechanically translating written text to Braille on computer programmed machine using motion haptic stimulation technology |
CN107371063B (en) * | 2017-08-17 | 2020-03-10 | 广州视源电子科技股份有限公司 | Video playing method, device, equipment and storage medium |
US10725648B2 (en) | 2017-09-07 | 2020-07-28 | Paypal, Inc. | Contextual pressure-sensing input device |
EP3731922B1 (en) | 2017-10-23 | 2024-02-21 | DataFeel Inc. | Communication devices, methods, and systems |
US10955922B2 (en) * | 2017-11-29 | 2021-03-23 | International Business Machines Corporation | Simulating tactile information for haptic technology |
US10440848B2 (en) | 2017-12-20 | 2019-10-08 | Immersion Corporation | Conformable display with linear actuator |
WO2019198082A1 (en) * | 2018-04-12 | 2019-10-17 | Mttech Interactive Multimedia Systems Ltd | Pressure sensitive display device |
US11934583B2 (en) | 2020-10-30 | 2024-03-19 | Datafeel Inc. | Wearable data communication apparatus, kits, methods, and systems |
CN112516585A (en) * | 2020-11-25 | 2021-03-19 | 珠海市智迪科技股份有限公司 | Method for triggering signal by using virtual optical micro-motion key |
US12125337B2 (en) | 2021-03-16 | 2024-10-22 | Aristocrat Technologies, Inc. | Zero-cabling screen connection for gaming device |
GB2609463A (en) * | 2021-08-03 | 2023-02-08 | Continental Automotive Gmbh | Method and operating device for securing functions of the operating device |
Family Cites Families (299)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7274652B1 (en) | 2000-06-02 | 2007-09-25 | Conexant, Inc. | Dual packet configuration for wireless communications |
US4871992A (en) | 1988-07-08 | 1989-10-03 | Petersen Robert C | Tactile display apparatus |
US5327457A (en) | 1991-09-13 | 1994-07-05 | Motorola, Inc. | Operation indicative background noise in a digital receiver |
US5867144A (en) | 1991-11-19 | 1999-02-02 | Microsoft Corporation | Method and system for the direct manipulation of information, including non-default drag and drop operation |
US6597347B1 (en) | 1991-11-26 | 2003-07-22 | Itu Research Inc. | Methods and apparatus for providing touch-sensitive input in multiple degrees of freedom |
WO1994002911A1 (en) | 1992-07-24 | 1994-02-03 | Toda Koji | Ultrasonic touch system |
US5402490A (en) | 1992-09-01 | 1995-03-28 | Motorola, Inc. | Process for improving public key authentication |
US7084859B1 (en) | 1992-09-18 | 2006-08-01 | Pryor Timothy R | Programmable tactile touch screen displays and man-machine interfaces for improved vehicle instrumentation and telematics |
US5412189A (en) | 1992-12-21 | 1995-05-02 | International Business Machines Corporation | Touch screen apparatus with tactile information |
US5490087A (en) | 1993-12-06 | 1996-02-06 | Motorola, Inc. | Radio channel access control |
US5463657A (en) | 1994-02-15 | 1995-10-31 | Lockheed Missiles & Space Company, Inc. | Detection of a multi-sequence spread spectrum signal |
FI95178C (en) | 1994-04-08 | 1995-12-27 | Nokia Mobile Phones Ltd | Keyboard |
US5504938A (en) | 1994-05-02 | 1996-04-02 | Motorola, Inc. | Method and apparatus for varying apparent cell size in a cellular communication system |
WO1996006392A1 (en) | 1994-08-18 | 1996-02-29 | Interval Research Corporation | Content-based haptic input device for video |
US5602901A (en) | 1994-12-22 | 1997-02-11 | Motorola, Inc. | Specialized call routing method and apparatus for a cellular communication system |
US7500952B1 (en) | 1995-06-29 | 2009-03-10 | Teratech Corporation | Portable ultrasound imaging system |
US5673256A (en) | 1995-07-25 | 1997-09-30 | Motorola, Inc. | Apparatus and method for sending data messages at an optimum time |
US5712870A (en) | 1995-07-31 | 1998-01-27 | Harris Corporation | Packet header generation and detection circuitry |
US5752162A (en) | 1995-11-03 | 1998-05-12 | Motorola, Inc. | Methods for assigning subscriber units to visited gateways |
US5825308A (en) | 1996-11-26 | 1998-10-20 | Immersion Human Interface Corporation | Force feedback interface having isotonic and isometric functionality |
US5937049A (en) | 1995-12-29 | 1999-08-10 | Apropos Technology | Service bureau caller ID collection with ISDN BRI |
EP0789295B1 (en) | 1996-02-09 | 2003-05-02 | Seiko Instruments Inc. | Display unit having a transparent touchswitch and a liquid crystal display, and manufacturing method of the same |
US5724659A (en) | 1996-07-01 | 1998-03-03 | Motorola, Inc. | Multi-mode variable bandwidth repeater switch and method therefor |
US5892902A (en) | 1996-09-05 | 1999-04-06 | Clark; Paul C. | Intelligent token protected system with network authentication |
US5867789A (en) | 1996-12-30 | 1999-02-02 | Motorola, Inc. | Method and system for real-time channel management in a radio telecommunications system |
US6882086B2 (en) | 2001-05-22 | 2005-04-19 | Sri International | Variable stiffness electroactive polymer systems |
US6037882A (en) | 1997-09-30 | 2000-03-14 | Levy; David H. | Method and apparatus for inputting data to an electronic system |
US7102621B2 (en) | 1997-09-30 | 2006-09-05 | 3M Innovative Properties Company | Force measurement system correcting for inertial interference |
WO1999017929A1 (en) | 1997-10-03 | 1999-04-15 | The Trustees Of The University Of Pennsylvania | Polymeric electrostrictive systems |
US5995763A (en) | 1997-10-10 | 1999-11-30 | Posa; John G. | Remote microphone and range-finding configurations |
US6243078B1 (en) | 1998-06-23 | 2001-06-05 | Immersion Corporation | Pointing device with forced feedback button |
US6131032A (en) | 1997-12-01 | 2000-10-10 | Motorola, Inc. | Method and apparatus for monitoring users of a communications system |
US6256011B1 (en) | 1997-12-03 | 2001-07-03 | Immersion Corporation | Multi-function control device with force feedback |
US7663607B2 (en) | 2004-05-06 | 2010-02-16 | Apple Inc. | Multipoint touchscreen |
US8479122B2 (en) * | 2004-07-30 | 2013-07-02 | Apple Inc. | Gestures for touch sensitive input devices |
WO1999038149A1 (en) | 1998-01-26 | 1999-07-29 | Wayne Westerman | Method and apparatus for integrating manual input |
US7760187B2 (en) | 2004-07-30 | 2010-07-20 | Apple Inc. | Visual expander |
NL1008351C2 (en) | 1998-02-19 | 1999-08-20 | No Wires Needed B V | Data communication network. |
US6104922A (en) | 1998-03-02 | 2000-08-15 | Motorola, Inc. | User authentication in a communication system utilizing biometric information |
JP3739927B2 (en) | 1998-03-04 | 2006-01-25 | 独立行政法人科学技術振興機構 | Tactile sensor and tactile detection system |
US6185536B1 (en) | 1998-03-04 | 2001-02-06 | Motorola, Inc. | System and method for establishing a communication link using user-specific voice data parameters as a user discriminator |
US5888161A (en) * | 1998-03-19 | 1999-03-30 | Ford Global Technologies, Inc. | All wheel drive continuously variable transmission having dual mode operation |
US6211856B1 (en) | 1998-04-17 | 2001-04-03 | Sung M. Choi | Graphical user interface touch screen with an auto zoom feature |
US6563487B2 (en) | 1998-06-23 | 2003-05-13 | Immersion Corporation | Haptic feedback for directional control pads |
US6184868B1 (en) | 1998-09-17 | 2001-02-06 | Immersion Corp. | Haptic feedback control devices |
DE19827905C1 (en) | 1998-06-23 | 1999-12-30 | Papenmeier Friedrich Horst | Device for entering and reading out data |
US6429846B2 (en) | 1998-06-23 | 2002-08-06 | Immersion Corporation | Haptic feedback for touchpads and other touch controls |
EP0974948A1 (en) | 1998-07-20 | 2000-01-26 | Nec Corporation | Apparatus and method of assisting visually impaired persons to generate graphical data in a computer |
US6117296A (en) | 1998-07-21 | 2000-09-12 | Thomson; Timothy | Electrically controlled contractile polymer composite |
JP4633207B2 (en) | 1998-09-08 | 2011-02-16 | ソニー株式会社 | Image display device |
US6004049A (en) | 1998-10-29 | 1999-12-21 | Sun Microsystems, Inc. | Method and apparatus for dynamic configuration of an input device |
US6787238B2 (en) | 1998-11-18 | 2004-09-07 | The Penn State Research Foundation | Terpolymer systems for electromechanical and dielectric applications |
US6434702B1 (en) | 1998-12-08 | 2002-08-13 | International Business Machines Corporation | Automatic rotation of digit location in devices used in passwords |
US6313825B1 (en) | 1998-12-28 | 2001-11-06 | Gateway, Inc. | Virtual input device |
US6266690B1 (en) | 1999-01-27 | 2001-07-24 | Adc Telecommunications, Inc. | Enhanced service platform with secure system and method for subscriber profile customization |
DE19918432A1 (en) | 1999-04-23 | 2000-10-26 | Saueressig Gmbh & Co | Expansion layer of compressible material between core cylinder and its sleeve is provided with depressions on its outer or inner circumferential surface |
US6462840B1 (en) | 1999-05-17 | 2002-10-08 | Grigory Kravtsov | Three dimensional monitor and tactile scanner |
AU5549500A (en) | 1999-06-22 | 2001-01-09 | Peratech Ltd | Conductive structures |
US6417821B1 (en) | 1999-06-28 | 2002-07-09 | John V. Becker | Braille computer monitor |
US6766036B1 (en) | 1999-07-08 | 2004-07-20 | Timothy R. Pryor | Camera based man machine interfaces |
US6492979B1 (en) | 1999-09-07 | 2002-12-10 | Elo Touchsystems, Inc. | Dual sensor touchscreen utilizing projective-capacitive and force touch sensors |
JP2001117721A (en) | 1999-10-14 | 2001-04-27 | Mitsubishi Electric Corp | Touch panel input type keyboard |
US6535201B1 (en) | 1999-12-17 | 2003-03-18 | International Business Machines Corporation | Method and system for three-dimensional topographical modeling |
JP3913463B2 (en) | 1999-12-27 | 2007-05-09 | セイコーインスツル株式会社 | Pulse detection device and manufacturing method thereof |
AU2001234446A1 (en) | 2000-01-11 | 2001-07-24 | Cirque Corporation | Flexible touchpad sensor grid for conforming to arcuate surfaces |
US6822635B2 (en) | 2000-01-19 | 2004-11-23 | Immersion Corporation | Haptic interface for laptop computers and other portable devices |
JP2001282141A (en) | 2000-03-31 | 2001-10-12 | Sony Corp | Photon control device |
JP2001282140A (en) | 2000-03-31 | 2001-10-12 | Sony Corp | Information receiving display device |
GB0011829D0 (en) | 2000-05-18 | 2000-07-05 | Lussey David | Flexible switching devices |
EP1303853A4 (en) | 2000-05-24 | 2009-03-11 | Immersion Corp | Haptic devices using electroactive polymers |
US7210099B2 (en) | 2000-06-12 | 2007-04-24 | Softview Llc | Resolution independent vector display of internet content |
US6559620B2 (en) | 2001-03-21 | 2003-05-06 | Digital Angel Corporation | System and method for remote monitoring utilizing a rechargeable battery |
US6925495B2 (en) | 2000-07-13 | 2005-08-02 | Vendaria Media, Inc. | Method and system for delivering and monitoring an on-demand playlist over a network using a template |
US6571102B1 (en) | 2000-08-08 | 2003-05-27 | Motorola, Inc. | Channel management technique for asymmetric data services |
DE10046099A1 (en) | 2000-09-18 | 2002-04-04 | Siemens Ag | Touch sensitive display with tactile feedback |
US20020050983A1 (en) | 2000-09-26 | 2002-05-02 | Qianjun Liu | Method and apparatus for a touch sensitive system employing spread spectrum technology for the operation of one or more input devices |
US7615008B2 (en) | 2000-11-24 | 2009-11-10 | U-Systems, Inc. | Processing and displaying breast ultrasound information |
US6456245B1 (en) | 2000-12-13 | 2002-09-24 | Magis Networks, Inc. | Card-based diversity antenna structure for wireless communications |
US6782102B2 (en) | 2000-12-21 | 2004-08-24 | Motorola, Inc. | Multiple format secure voice apparatus for communication handsets |
US6842428B2 (en) | 2001-01-08 | 2005-01-11 | Motorola, Inc. | Method for allocating communication network resources using adaptive demand prediction |
US6628511B2 (en) | 2001-01-22 | 2003-09-30 | Xoucin, Inc. | Palm-sized handheld device with inverted ergonomic keypad |
US6776800B2 (en) * | 2001-02-28 | 2004-08-17 | Synthes (U.S.A.) | Implants formed with demineralized bone |
US6856816B2 (en) | 2001-03-23 | 2005-02-15 | Hall Aluminum Llc | Telephone quick dialing/re-dialing method and apparatus |
US6852416B2 (en) | 2001-03-30 | 2005-02-08 | The Penn State Research Foundation | High dielectric constant composites of metallophthalaocyanine oligomer and poly(vinylidene-trifluoroethylene) copolymer |
US6636202B2 (en) * | 2001-04-27 | 2003-10-21 | International Business Machines Corporation | Interactive tactile display for computer screen |
JP3913496B2 (en) | 2001-05-28 | 2007-05-09 | 独立行政法人科学技術振興機構 | Tactile detection system |
GB0113905D0 (en) | 2001-06-07 | 2001-08-01 | Peratech Ltd | Analytical device |
FI20012231A (en) | 2001-06-21 | 2002-12-22 | Ismo Rakkolainen | System for creating a user interface |
DE60116646T2 (en) | 2001-07-09 | 2006-08-10 | Nokia Corp. | PACKAGE DATA TRANSMISSION BY VARIABLE DQPSK MODULATION |
JP2003029898A (en) | 2001-07-16 | 2003-01-31 | Japan Science & Technology Corp | Tactile device |
CA2353697A1 (en) | 2001-07-24 | 2003-01-24 | Tactex Controls Inc. | Touch sensitive membrane |
US7042997B2 (en) | 2001-07-30 | 2006-05-09 | Persona Software, Inc. | Passive call blocking method and apparatus |
AUPR694401A0 (en) | 2001-08-10 | 2001-09-06 | University Of Wollongong, The | Bio-mechanical feedback device |
US20030048260A1 (en) | 2001-08-17 | 2003-03-13 | Alec Matusis | System and method for selecting actions based on the identification of user's fingers |
CA2398798A1 (en) | 2001-08-28 | 2003-02-28 | Research In Motion Limited | System and method for providing tactility for an lcd touchscreen |
US6703550B2 (en) | 2001-10-10 | 2004-03-09 | Immersion Corporation | Sound data output and manipulation using haptic feedback |
WO2003039080A1 (en) | 2001-10-31 | 2003-05-08 | Nokia Corporation | A method for handling of messages between a terminal and a data network |
KR20040062956A (en) | 2001-11-01 | 2004-07-09 | 임머숀 코퍼레이션 | Method and apparatus for providing tactile sensations |
SE0103835L (en) | 2001-11-02 | 2003-05-03 | Neonode Ab | Touch screen realized by display unit with light transmitting and light receiving units |
US8095879B2 (en) | 2002-12-10 | 2012-01-10 | Neonode Inc. | User interface for mobile handheld computer unit |
US8339379B2 (en) | 2004-04-29 | 2012-12-25 | Neonode Inc. | Light-based touch screen |
GB2382291A (en) | 2001-11-16 | 2003-05-21 | Int Computers Ltd | Overlay for touch sensitive screen |
US7050835B2 (en) | 2001-12-12 | 2006-05-23 | Universal Display Corporation | Intelligent multi-media display communication system |
US7352356B2 (en) | 2001-12-13 | 2008-04-01 | United States Of America | Refreshable scanning tactile graphic display for localized sensory stimulation |
US6864878B2 (en) | 2002-03-29 | 2005-03-08 | Xerox Corporation | Tactile overlays for screens |
KR100769783B1 (en) | 2002-03-29 | 2007-10-24 | 가부시끼가이샤 도시바 | Display input device and display input system |
CA2480797A1 (en) | 2002-04-15 | 2003-10-23 | Schott Ag | Method for producing a copy protection for an electronic circuit and corresponding component |
US7289826B1 (en) | 2002-04-16 | 2007-10-30 | Faulkner Interstices, Llc | Method and apparatus for beam selection in a smart antenna system |
US7400640B2 (en) | 2002-05-03 | 2008-07-15 | Conexant, Inc. | Partitioned medium access control implementation |
US7269153B1 (en) | 2002-05-24 | 2007-09-11 | Conexant Systems, Inc. | Method for minimizing time critical transmit processing for a personal computer implementation of a wireless local area network adapter |
TW554538B (en) | 2002-05-29 | 2003-09-21 | Toppoly Optoelectronics Corp | TFT planar display panel structure and process for producing same |
US6988247B2 (en) | 2002-06-18 | 2006-01-17 | Koninklijke Philips Electronics N.V. | Graphic user interface having touch detectability |
US11275405B2 (en) | 2005-03-04 | 2022-03-15 | Apple Inc. | Multi-functional hand-held device |
US6984208B2 (en) | 2002-08-01 | 2006-01-10 | The Hong Kong Polytechnic University | Method and apparatus for sensing body gesture, posture and movement |
US7841944B2 (en) | 2002-08-06 | 2010-11-30 | Igt | Gaming device having a three dimensional display device |
US6842607B2 (en) | 2002-09-09 | 2005-01-11 | Conexant Systems, Inc | Coordination of competing protocols |
US20040046739A1 (en) | 2002-09-11 | 2004-03-11 | Palm, Inc. | Pliable device navigation method and apparatus |
WO2004027314A1 (en) | 2002-09-19 | 2004-04-01 | Matsushita Electric Industrial Co., Ltd. | Illumination unit and liquid crystal display comprising it |
US7138985B2 (en) | 2002-09-25 | 2006-11-21 | Ui Evolution, Inc. | Tactilely enhanced visual image display |
US7253807B2 (en) | 2002-09-25 | 2007-08-07 | Uievolution, Inc. | Interactive apparatuses with tactiley enhanced visual imaging capability and related methods |
US7190416B2 (en) | 2002-10-18 | 2007-03-13 | Nitto Denko Corporation | Liquid crystal display with touch panel having internal front polarizer |
US20040100448A1 (en) | 2002-11-25 | 2004-05-27 | 3M Innovative Properties Company | Touch display |
EP1597716B1 (en) | 2003-02-24 | 2016-07-06 | Peichun Yang | Electroactive polymer actuator braille cell and braille display |
KR100563460B1 (en) | 2003-02-25 | 2006-03-23 | 엘지.필립스 엘시디 주식회사 | Liquid Crystal Display Associated with Touch Panel And Driving Method Thereof |
CA2430317A1 (en) | 2003-05-29 | 2004-11-29 | Vincent Hayward | Method and apparatus to record and reproduce tactile sensations |
US7567243B2 (en) | 2003-05-30 | 2009-07-28 | Immersion Corporation | System and method for low power haptic feedback |
US8373660B2 (en) | 2003-07-14 | 2013-02-12 | Matt Pallakoff | System and method for a portable multimedia client |
JP2005056267A (en) | 2003-08-06 | 2005-03-03 | Sony Corp | Kinesthetic sense feedback device |
EP1665880B1 (en) | 2003-09-03 | 2012-12-05 | SRI International | Surface deformation electroactive polymer transducers |
US20060172557A1 (en) | 2004-09-09 | 2006-08-03 | He Xinhua Sam | Electrical actuator having smart muscle wire |
TW594183B (en) | 2003-09-26 | 2004-06-21 | Chi Lin Technology Co Ltd | Panel positioning and testing device |
JP2007512796A (en) | 2003-10-17 | 2007-05-17 | ファイアフライ パワー テクノロジーズ,インコーポレイテッド | Method and apparatus for supplying power wirelessly |
US20050088417A1 (en) * | 2003-10-24 | 2005-04-28 | Mulligan Roger C. | Tactile touch-sensing system |
US7042711B2 (en) | 2003-11-18 | 2006-05-09 | Kabushiki Kaisha Toshiba | Multi-functional electronic device with a continuously accessible pointing device |
US7054145B2 (en) | 2003-11-18 | 2006-05-30 | Kabushiki Kaisha Toshiba | Mechanism for adjusting a display |
GB0402191D0 (en) | 2004-02-02 | 2004-03-03 | Eleksen Ltd | Linear sensor |
GB0406080D0 (en) | 2004-03-18 | 2004-04-21 | Eleksen Ltd | Sensor assembly |
GB0406079D0 (en) | 2004-03-18 | 2004-04-21 | Eleksen Ltd | Sensor response |
IL161002A0 (en) | 2004-03-22 | 2004-08-31 | Itay Katz | Virtual video keyboard system |
US7436318B2 (en) | 2004-04-19 | 2008-10-14 | Atg Designworks, Llc | Self contained device for displaying electronic information |
US7750890B2 (en) | 2004-05-11 | 2010-07-06 | The Chamberlain Group, Inc. | Movable barrier operator system display method and apparatus |
US7116855B2 (en) * | 2004-06-30 | 2006-10-03 | Xerox Corporation | Optical shuttle system and method used in an optical switch |
TWI287771B (en) | 2004-07-06 | 2007-10-01 | Au Optronics Corp | Active matrix organic light emitting diode (AMOLED) display and a pixel drive circuit thereof |
US7522153B2 (en) | 2004-09-14 | 2009-04-21 | Fujifilm Corporation | Displaying apparatus and control method |
CN101040245A (en) | 2004-10-12 | 2007-09-19 | 皇家飞利浦电子股份有限公司 | Ultrasound touchscreen user interface and display |
KR100682901B1 (en) * | 2004-11-17 | 2007-02-15 | 삼성전자주식회사 | Apparatus and method for providing fingertip haptics of visual information using electro-active polymer in a image displaying device |
EP1842172A2 (en) | 2005-01-14 | 2007-10-10 | Philips Intellectual Property & Standards GmbH | Moving objects presented by a touch input display device |
US7952564B2 (en) | 2005-02-17 | 2011-05-31 | Hurst G Samuel | Multiple-touch sensor |
US7602118B2 (en) | 2005-02-24 | 2009-10-13 | Eastman Kodak Company | OLED device having improved light output |
US7193350B1 (en) | 2005-02-25 | 2007-03-20 | United States Of America As Represented By The Secretary Of The Navy | Electroactive polymer structure |
JP4360497B2 (en) | 2005-03-09 | 2009-11-11 | 国立大学法人 東京大学 | Electric tactile presentation device and electric tactile presentation method |
DE102005011633A1 (en) | 2005-03-14 | 2006-09-21 | Siemens Ag | Touch screen with haptic feedback |
US20060209083A1 (en) | 2005-03-18 | 2006-09-21 | Outland Research, L.L.C. | Method and electroactive device for a dynamic graphical imagery display |
JP2006276707A (en) | 2005-03-30 | 2006-10-12 | Toshiba Matsushita Display Technology Co Ltd | Display device and its driving method |
US20070020589A1 (en) | 2005-04-06 | 2007-01-25 | Ethan Smith | Electrothermal refreshable Braille cell and method for actuating same |
US7382357B2 (en) | 2005-04-25 | 2008-06-03 | Avago Technologies Ecbu Ip Pte Ltd | User interface incorporating emulated hard keys |
US7368307B2 (en) | 2005-06-07 | 2008-05-06 | Eastman Kodak Company | Method of manufacturing an OLED device with a curved light emitting surface |
WO2007012899A1 (en) | 2005-07-25 | 2007-02-01 | Plastic Logic Limited | Flexible touch screen display |
US7878977B2 (en) | 2005-09-30 | 2011-02-01 | Siemens Medical Solutions Usa, Inc. | Flexible ultrasound transducer array |
US20070085828A1 (en) | 2005-10-13 | 2007-04-19 | Schroeder Dale W | Ultrasonic virtual mouse |
US20070085838A1 (en) | 2005-10-17 | 2007-04-19 | Ricks Theodore K | Method for making a display with integrated touchscreen |
US7659887B2 (en) | 2005-10-20 | 2010-02-09 | Microsoft Corp. | Keyboard with a touchpad layer on keys |
US7843449B2 (en) | 2006-09-20 | 2010-11-30 | Apple Inc. | Three-dimensional display system |
JP4536638B2 (en) | 2005-10-28 | 2010-09-01 | 株式会社スクウェア・エニックス | Display information selection apparatus and method, program, and recording medium |
US8059100B2 (en) | 2005-11-17 | 2011-11-15 | Lg Electronics Inc. | Method for allocating/arranging keys on touch-screen, and mobile terminal for use of the same |
EP1788473A1 (en) | 2005-11-18 | 2007-05-23 | Siemens Aktiengesellschaft | input device |
EP1970877B1 (en) | 2005-12-08 | 2017-10-18 | The University of Tokyo | Electric tactile display |
KR100801089B1 (en) | 2005-12-13 | 2008-02-05 | 삼성전자주식회사 | Mobile device and operation method control available for using touch and drag |
KR100791379B1 (en) | 2006-01-02 | 2008-01-07 | 삼성전자주식회사 | System and method for user interface |
KR100877067B1 (en) * | 2006-01-03 | 2009-01-07 | 삼성전자주식회사 | Haptic button, and haptic device using it |
US7956846B2 (en) | 2006-01-05 | 2011-06-07 | Apple Inc. | Portable electronic device with content-dependent touch sensitivity |
JP4412288B2 (en) * | 2006-01-26 | 2010-02-10 | セイコーエプソン株式会社 | Electro-optical device and electronic apparatus |
WO2007089819A1 (en) | 2006-01-30 | 2007-08-09 | Briancon Alain C | Skin tone mobile device and service |
US7594839B2 (en) | 2006-02-24 | 2009-09-29 | Eastman Kodak Company | OLED device having improved light output |
US20070247422A1 (en) | 2006-03-30 | 2007-10-25 | Xuuk, Inc. | Interaction techniques for flexible displays |
US7511702B2 (en) | 2006-03-30 | 2009-03-31 | Apple Inc. | Force and location sensitive display |
EP1843406A1 (en) | 2006-04-05 | 2007-10-10 | Nederlandse Organisatie voor Toegepast-Natuuurwetenschappelijk Onderzoek TNO | Actuator comprising an electroactive polymer |
WO2007114669A1 (en) | 2006-04-06 | 2007-10-11 | Samsung Electronics Co., Ltd. | Apparatus and method for identifying an application in the multiple screens environment |
US7978181B2 (en) | 2006-04-25 | 2011-07-12 | Apple Inc. | Keystroke tactility arrangement on a smooth touch surface |
US8279180B2 (en) | 2006-05-02 | 2012-10-02 | Apple Inc. | Multipoint touch surface controller |
US20070257891A1 (en) | 2006-05-03 | 2007-11-08 | Esenther Alan W | Method and system for emulating a mouse on a multi-touch sensitive surface |
US8552989B2 (en) | 2006-06-09 | 2013-10-08 | Apple Inc. | Integrated display and touch screen |
US8086971B2 (en) | 2006-06-28 | 2011-12-27 | Nokia Corporation | Apparatus, methods and computer program products providing finger-based and hand-based gesture commands for portable electronic device applications |
US8441467B2 (en) | 2006-08-03 | 2013-05-14 | Perceptive Pixel Inc. | Multi-touch sensing display through frustrated total internal reflection |
US8144271B2 (en) | 2006-08-03 | 2012-03-27 | Perceptive Pixel Inc. | Multi-touch sensing through frustrated total internal reflection |
CN101507290A (en) | 2006-08-24 | 2009-08-12 | 皇家飞利浦电子股份有限公司 | Device for and method of processing an audio signal and/or a video signal to generate haptic excitation |
KR100910577B1 (en) | 2006-09-11 | 2009-08-04 | 삼성전자주식회사 | Computer system and control method thereof |
KR20080023901A (en) | 2006-09-12 | 2008-03-17 | 삼성전자주식회사 | Method and device for presenting braille in a wireless mobile terminal |
US20080062088A1 (en) | 2006-09-13 | 2008-03-13 | Tpo Displays Corp. | Pixel driving circuit and OLED display apparatus and electrionic device using the same |
US20100315345A1 (en) | 2006-09-27 | 2010-12-16 | Nokia Corporation | Tactile Touch Screen |
TW200835995A (en) | 2006-10-10 | 2008-09-01 | Cbrite Inc | Electro-optic display |
US20080122589A1 (en) | 2006-11-28 | 2008-05-29 | Ivanov Yuri A | Tactile Output Device |
EP1930800A1 (en) | 2006-12-05 | 2008-06-11 | Electronics and Telecommunications Research Institute | Tactile and visual display device |
US9697556B2 (en) | 2007-09-06 | 2017-07-04 | Mohammad A. Mazed | System and method of machine learning based user applications |
US8970501B2 (en) | 2007-01-03 | 2015-03-03 | Apple Inc. | Proximity and multi-touch sensor detection and demodulation |
US7844915B2 (en) | 2007-01-07 | 2010-11-30 | Apple Inc. | Application programming interfaces for scrolling operations |
US20080180399A1 (en) * | 2007-01-31 | 2008-07-31 | Tung Wan Cheng | Flexible Multi-touch Screen |
US8269729B2 (en) | 2007-01-31 | 2012-09-18 | Perceptive Pixel Inc. | Methods of interfacing with multi-point input devices and multi-point input systems employing interfacing techniques |
US7731670B2 (en) | 2007-02-02 | 2010-06-08 | Honda Motor Co., Ltd. | Controller for an assistive exoskeleton based on active impedance |
US8098234B2 (en) | 2007-02-20 | 2012-01-17 | Immersion Corporation | Haptic feedback system with stored effects |
US20080211353A1 (en) | 2007-03-02 | 2008-09-04 | Charles Erklin Seeley | High temperature bimorph actuator |
US8253654B2 (en) | 2007-03-16 | 2012-08-28 | Motorola Mobility Llc | Visual interface control based on viewing display area configuration |
US20080259236A1 (en) | 2007-04-13 | 2008-10-23 | Saint-Gobain Ceramics & Plastics, Inc. | Electrostatic dissipative stage and effectors for use in forming lcd products |
KR100888480B1 (en) | 2007-05-23 | 2009-03-12 | 삼성전자주식회사 | Reflective unit using electro active polymer and flexible display |
KR100863571B1 (en) | 2007-05-23 | 2008-10-15 | 삼성전자주식회사 | Display pixel using electro active polymer and display employing the same |
WO2008146203A1 (en) | 2007-06-01 | 2008-12-04 | Koninklijke Philips Electronics, N.V. | Wireless ultrasound probe user interface |
US20080303795A1 (en) | 2007-06-08 | 2008-12-11 | Lowles Robert J | Haptic display for a handheld electronic device |
US20090002328A1 (en) * | 2007-06-26 | 2009-01-01 | Immersion Corporation, A Delaware Corporation | Method and apparatus for multi-touch tactile touch panel actuator mechanisms |
US7956770B2 (en) | 2007-06-28 | 2011-06-07 | Sony Ericsson Mobile Communications Ab | Data input device and portable electronic device |
US7952498B2 (en) * | 2007-06-29 | 2011-05-31 | Verizon Patent And Licensing Inc. | Haptic computer interface |
EP2165248A4 (en) | 2007-07-06 | 2011-11-23 | Neonode Inc | Scanning of a touch screen |
US20090015560A1 (en) * | 2007-07-13 | 2009-01-15 | Motorola, Inc. | Method and apparatus for controlling a display of a device |
US8226562B2 (en) | 2007-08-10 | 2012-07-24 | Ultrasonix Medical Corporation | Hand-held ultrasound system having sterile enclosure |
US20090198132A1 (en) | 2007-08-10 | 2009-08-06 | Laurent Pelissier | Hand-held ultrasound imaging device having reconfigurable user interface |
KR20090019161A (en) * | 2007-08-20 | 2009-02-25 | 삼성전자주식회사 | Electronic device and method for operating the same |
KR101430445B1 (en) * | 2007-08-20 | 2014-08-14 | 엘지전자 주식회사 | Terminal having function for controlling screen size and program recording medium |
EP2034399B1 (en) | 2007-09-04 | 2019-06-05 | LG Electronics Inc. | Scrolling method of mobile terminal |
EP2790088B1 (en) | 2007-09-18 | 2019-05-01 | Senseg Oy | Method and apparatus for sensory stimulation |
US20090102805A1 (en) * | 2007-10-18 | 2009-04-23 | Microsoft Corporation | Three-dimensional object simulation using audio, visual, and tactile feedback |
WO2009057062A1 (en) | 2007-10-29 | 2009-05-07 | Koninklijke Philips Electronics, N.V. | Systems and methods for ultrasound assembly including multiple imaging transducer arrays |
EP2065870A1 (en) | 2007-11-21 | 2009-06-03 | Roche Diagnostics GmbH | Medical device for visually impaired users and users not visually impaired |
US8136402B2 (en) | 2007-11-28 | 2012-03-20 | International Business Machines Corporation | Accelerometer module for use with a touch sensitive device |
KR101537524B1 (en) | 2007-11-29 | 2015-07-21 | 코닌클리케 필립스 엔.브이. | Method of providing a user interface |
KR20090062190A (en) | 2007-12-12 | 2009-06-17 | 삼성전자주식회사 | Input/output device for tactile sensation and driving method for the same |
JP2009151684A (en) | 2007-12-21 | 2009-07-09 | Sony Corp | Touch-sensitive sheet member, input device and electronic equipment |
US9857872B2 (en) | 2007-12-31 | 2018-01-02 | Apple Inc. | Multi-touch display screen with localized tactile feedback |
US8154527B2 (en) * | 2008-01-04 | 2012-04-10 | Tactus Technology | User interface system |
US8547339B2 (en) * | 2008-01-04 | 2013-10-01 | Tactus Technology, Inc. | System and methods for raised touch screens |
US8232973B2 (en) | 2008-01-09 | 2012-07-31 | Apple Inc. | Method, device, and graphical user interface providing word recommendations for text input |
US20090181724A1 (en) | 2008-01-14 | 2009-07-16 | Sony Ericsson Mobile Communications Ab | Touch sensitive display with ultrasonic vibrations for tactile feedback |
US7890257B2 (en) | 2008-01-14 | 2011-02-15 | Research In Motion Limited | Using a shape-changing display as an adaptive lens for selectively magnifying information displayed onscreen |
US8004501B2 (en) | 2008-01-21 | 2011-08-23 | Sony Computer Entertainment America Llc | Hand-held device with touchscreen and digital tactile pixels |
US20090184936A1 (en) | 2008-01-22 | 2009-07-23 | Mathematical Inventing - Slicon Valley | 3D touchpad |
US8310444B2 (en) | 2008-01-29 | 2012-11-13 | Pacinian Corporation | Projected field haptic actuation |
US20090195512A1 (en) | 2008-02-05 | 2009-08-06 | Sony Ericsson Mobile Communications Ab | Touch sensitive display with tactile feedback |
US20090199392A1 (en) | 2008-02-11 | 2009-08-13 | General Electric Company | Ultrasound transducer probes and system and method of manufacture |
US8022933B2 (en) | 2008-02-21 | 2011-09-20 | Sony Corporation | One button remote control with haptic feedback |
US9513704B2 (en) | 2008-03-12 | 2016-12-06 | Immersion Corporation | Haptically enabled user interface |
US20090237373A1 (en) * | 2008-03-19 | 2009-09-24 | Sony Ericsson Mobile Communications Ab | Two way touch-sensitive display |
US8786555B2 (en) | 2008-03-21 | 2014-07-22 | Sprint Communications Company L.P. | Feedback-providing keypad for touchscreen devices |
KR100943989B1 (en) | 2008-04-02 | 2010-02-26 | (주)엠아이디티 | Capacitive Touch Screen |
US8788967B2 (en) | 2008-04-10 | 2014-07-22 | Perceptive Pixel, Inc. | Methods of interfacing with multi-input devices and multi-input display systems employing interfacing techniques |
US8745514B1 (en) | 2008-04-11 | 2014-06-03 | Perceptive Pixel, Inc. | Pressure-sensitive layering of displayed objects |
US20090256807A1 (en) | 2008-04-14 | 2009-10-15 | Nokia Corporation | User interface |
US20090262078A1 (en) | 2008-04-21 | 2009-10-22 | David Pizzi | Cellular phone with special sensor functions |
TWI397850B (en) | 2008-05-14 | 2013-06-01 | Ind Tech Res Inst | Sensing apparatus and scanning actuation method thereof |
US9035886B2 (en) | 2008-05-16 | 2015-05-19 | International Business Machines Corporation | System and apparatus for a multi-point touch-sensitive sensor user interface using distinct digit identification |
US20090295760A1 (en) | 2008-06-02 | 2009-12-03 | Sony Ericsson Mobile Communications Ab | Touch screen display |
US7924143B2 (en) | 2008-06-09 | 2011-04-12 | Research In Motion Limited | System and method for providing tactile feedback to a user of an electronic device |
US8054300B2 (en) | 2008-06-17 | 2011-11-08 | Apple Inc. | Capacitive sensor panel having dynamically reconfigurable sensor size and shape |
US8754855B2 (en) | 2008-06-27 | 2014-06-17 | Microsoft Corporation | Virtual touchpad |
US8508495B2 (en) | 2008-07-03 | 2013-08-13 | Apple Inc. | Display with dual-function capacitive elements |
US10031549B2 (en) | 2008-07-10 | 2018-07-24 | Apple Inc. | Transitioning between modes of input |
US8106749B2 (en) | 2008-07-14 | 2012-01-31 | Sony Ericsson Mobile Communications Ab | Touchless control of a control device |
US8274484B2 (en) | 2008-07-18 | 2012-09-25 | Microsoft Corporation | Tracking input in a screen-reflective interface environment |
US9335868B2 (en) | 2008-07-31 | 2016-05-10 | Apple Inc. | Capacitive sensor behind black mask |
US10983665B2 (en) * | 2008-08-01 | 2021-04-20 | Samsung Electronics Co., Ltd. | Electronic apparatus and method for implementing user interface |
US7953462B2 (en) | 2008-08-04 | 2011-05-31 | Vartanian Harry | Apparatus and method for providing an adaptively responsive flexible display device |
KR101505198B1 (en) | 2008-08-18 | 2015-03-23 | 엘지전자 주식회사 | PORTABLE TERMINAL and DRIVING METHOD OF THE SAME |
US7982723B2 (en) | 2008-09-18 | 2011-07-19 | Stmicroelectronics Asia Pacific Pte. Ltd. | Multiple touch location in a three dimensional touch screen sensor |
JP2010086471A (en) | 2008-10-02 | 2010-04-15 | Sony Corp | Operation feeling providing device, and operation feeling feedback method, and program |
US8593409B1 (en) * | 2008-10-10 | 2013-11-26 | Immersion Corporation | Method and apparatus for providing haptic feedback utilizing multi-actuated waveform phasing |
KR20100041006A (en) | 2008-10-13 | 2010-04-22 | 엘지전자 주식회사 | A user interface controlling method using three dimension multi-touch |
US8427433B2 (en) | 2008-10-17 | 2013-04-23 | Honeywell International Inc. | Tactile-feedback touch screen |
US20100103115A1 (en) * | 2008-10-24 | 2010-04-29 | Sony Ericsson Mobile Communications Ab | Display arrangement and electronic device |
US8433138B2 (en) | 2008-10-29 | 2013-04-30 | Nokia Corporation | Interaction using touch and non-touch gestures |
US20120126959A1 (en) | 2008-11-04 | 2012-05-24 | Bayer Materialscience Ag | Electroactive polymer transducers for tactile feedback devices |
KR20100050103A (en) | 2008-11-05 | 2010-05-13 | 엘지전자 주식회사 | Method of controlling 3 dimension individual object on map and mobile terminal using the same |
US8413066B2 (en) | 2008-11-06 | 2013-04-02 | Dmytro Lysytskyy | Virtual keyboard with visually enhanced keys |
WO2010062901A1 (en) | 2008-11-26 | 2010-06-03 | Research In Motion Limited | Touch-sensitive display method and apparatus |
US8558803B2 (en) | 2008-11-28 | 2013-10-15 | Samsung Electronics Co., Ltd. | Input device for portable terminal and method thereof |
US9600070B2 (en) | 2008-12-22 | 2017-03-21 | Apple Inc. | User interface having changeable topography |
US8686952B2 (en) | 2008-12-23 | 2014-04-01 | Apple Inc. | Multi touch with multi haptics |
TW201025085A (en) | 2008-12-23 | 2010-07-01 | Hannstar Display Corp | Keyboard formed from a touch display, method of endowing a touch display with a keyboard function, and a device with functions of keyboard or writing pad input and image output |
US8760413B2 (en) | 2009-01-08 | 2014-06-24 | Synaptics Incorporated | Tactile surface |
US8255323B1 (en) | 2009-01-09 | 2012-08-28 | Apple Inc. | Motion based payment confirmation |
JP2010165032A (en) | 2009-01-13 | 2010-07-29 | Hitachi Displays Ltd | Touch panel display device |
US8345013B2 (en) * | 2009-01-14 | 2013-01-01 | Immersion Corporation | Method and apparatus for generating haptic feedback from plasma actuation |
CN102334089A (en) | 2009-01-21 | 2012-01-25 | 拜耳材料科技公司 | Electroactive polymer transducers for tactile feedback devices |
KR101632963B1 (en) | 2009-02-02 | 2016-06-23 | 아이사이트 모빌 테크놀로지 엘티디 | System and method for object recognition and tracking in a video stream |
US8406816B2 (en) | 2009-02-03 | 2013-03-26 | Research In Motion Limited | Method and apparatus for implementing a virtual rotary dial pad on a portable electronic device |
TW201030588A (en) | 2009-02-13 | 2010-08-16 | Hannstar Display Corp | In-cell touch panel |
US8188844B2 (en) | 2009-02-16 | 2012-05-29 | GM Global Technology Operations LLC | Reconfigurable tactile interface utilizing active material actuation |
US8077021B2 (en) | 2009-03-03 | 2011-12-13 | Empire Technology Development Llc | Dynamic tactile interface |
US20100225734A1 (en) | 2009-03-03 | 2010-09-09 | Horizon Semiconductors Ltd. | Stereoscopic three-dimensional interactive system and method |
US9874935B2 (en) * | 2009-03-12 | 2018-01-23 | Immersion Corporation | Systems and methods for a texture engine |
US9927873B2 (en) * | 2009-03-12 | 2018-03-27 | Immersion Corporation | Systems and methods for using textures in graphical user interface widgets |
EP2228711A3 (en) | 2009-03-12 | 2014-06-04 | Lg Electronics Inc. | Mobile terminal and method for providing user interface thereof |
US8686951B2 (en) | 2009-03-18 | 2014-04-01 | HJ Laboratories, LLC | Providing an elevated and texturized display in an electronic device |
KR101628782B1 (en) | 2009-03-20 | 2016-06-09 | 삼성전자주식회사 | Apparatus and method for providing haptic function using multi vibrator in portable terminal |
JP5347673B2 (en) | 2009-04-14 | 2013-11-20 | ソニー株式会社 | Information processing apparatus, information processing method, and program |
EP2427879B1 (en) | 2009-05-07 | 2015-08-19 | Immersion Corporation | Method and apparatus for providing a haptic feedback shape-changing display |
US8279200B2 (en) | 2009-05-19 | 2012-10-02 | Microsoft Corporation | Light-induced shape-memory polymer display screen |
US20110107958A1 (en) | 2009-11-12 | 2011-05-12 | Apple Inc. | Input devices and methods of operation |
US8766933B2 (en) | 2009-11-12 | 2014-07-01 | Senseg Ltd. | Tactile stimulation apparatus having a composite section comprising a semiconducting material |
US8624878B2 (en) | 2010-01-20 | 2014-01-07 | Apple Inc. | Piezo-based acoustic and capacitive detection |
US20110199342A1 (en) | 2010-02-16 | 2011-08-18 | Harry Vartanian | Apparatus and method for providing elevated, indented or texturized sensations to an object near a display device or input detection using ultrasound |
US10146426B2 (en) | 2010-11-09 | 2018-12-04 | Nokia Technologies Oy | Apparatus and method for user input for controlling displayed information |
US9471145B2 (en) | 2011-01-06 | 2016-10-18 | Blackberry Limited | Electronic device and method of displaying information in response to a gesture |
- 2009
  - 2009-03-18 US US12/406,273 patent/US8686951B2/en active Active
- 2011
  - 2011-11-08 US US13/291,375 patent/US8866766B2/en active Active
- 2014
  - 2014-09-12 US US14/485,246 patent/US9335824B2/en not_active Expired - Fee Related
- 2016
  - 2016-03-03 US US15/060,016 patent/US9459728B2/en active Active
  - 2016-03-04 US US15/061,580 patent/US9400558B2/en active Active
  - 2016-03-24 US US15/079,660 patent/US9405371B1/en active Active - Reinstated
  - 2016-03-24 US US15/080,025 patent/US9423905B2/en active Active
  - 2016-05-03 US US15/145,766 patent/US9448632B2/en active Active
  - 2016-07-28 US US15/222,265 patent/US9778840B2/en active Active
  - 2016-08-17 US US15/239,264 patent/US9547368B2/en active Active
  - 2016-11-30 US US15/365,225 patent/US9772772B2/en active Active
- 2017
  - 2017-09-04 US US15/694,930 patent/US10191652B2/en active Active
- 2018
  - 2018-12-27 US US16/234,078 patent/US20190129610A1/en not_active Abandoned
Also Published As
Publication number | Publication date |
---|---|
US20170364256A1 (en) | 2017-12-21 |
US9778840B2 (en) | 2017-10-03 |
US20160187986A1 (en) | 2016-06-30 |
US9423905B2 (en) | 2016-08-23 |
US20190129610A1 (en) | 2019-05-02 |
US9459728B2 (en) | 2016-10-04 |
US9335824B2 (en) | 2016-05-10 |
US8866766B2 (en) | 2014-10-21 |
US9772772B2 (en) | 2017-09-26 |
US20100238114A1 (en) | 2010-09-23 |
US20160202822A1 (en) | 2016-07-14 |
US20170083228A1 (en) | 2017-03-23 |
US20120050200A1 (en) | 2012-03-01 |
US8686951B2 (en) | 2014-04-01 |
US20160246380A1 (en) | 2016-08-25 |
US10191652B2 (en) | 2019-01-29 |
US9547368B2 (en) | 2017-01-17 |
US9400558B2 (en) | 2016-07-26 |
US20160357259A1 (en) | 2016-12-08 |
US9405371B1 (en) | 2016-08-02 |
US9448632B2 (en) | 2016-09-20 |
US20150002438A1 (en) | 2015-01-01 |
US20160334874A1 (en) | 2016-11-17 |
US20160188101A1 (en) | 2016-06-30 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US10191652B2 (en) | | Electronic device with an interactive pressure sensitive multi-touch display |
US10496170B2 (en) | | Vehicle computing system to provide feedback |
US10048758B2 (en) | | Haptic feedback for interactions with foldable-bendable displays |
US8981915B2 (en) | | System and method for display of multiple data channels on a single haptic display |
US8570296B2 (en) | | System and method for display of multiple data channels on a single haptic display |
US20100020036A1 (en) | | Portable electronic device and method of controlling same |
CN105094661B (en) | | Mobile terminal |
CN105005376A (en) | | Haptic device incorporating stretch characteristics |
JP2016197425A (en) | | Systems and methods for using textures in graphical user interface widgets |
WO2010009552A1 (en) | | Tactile feedback for key simulation in touch screens |
US20180011538A1 (en) | | Multimodal haptic effects |
US20200012348A1 (en) | | Haptically enabled overlay for a pressure sensitive surface |
Leslie | | Touch screen |
Ramstein et al. | | InformationDisplay Button |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| AS | Assignment | Owner name: HJ LABORATORIES, LLC, PENNSYLVANIA Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:VARTANIAN, HARRY;JURIKSON-RHODES, JARON;REEL/FRAME:038332/0223 Effective date: 20110725 |
| AS | Assignment | Owner name: HJ LABORATORIES LICENSING, LLC, PENNSYLVANIA Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:HJ LABORATORIES, LLC;REEL/FRAME:039063/0967 Effective date: 20160701 |
| STCF | Information on status: patent grant | Free format text: PATENTED CASE |
| FEPP | Fee payment procedure | Free format text: MAINTENANCE FEE REMINDER MAILED (ORIGINAL EVENT CODE: REM.); ENTITY STATUS OF PATENT OWNER: SMALL ENTITY |
| PRDP | Patent reinstated due to the acceptance of a late maintenance fee | Effective date: 20201001 |
| FEPP | Fee payment procedure | Free format text: PETITION RELATED TO MAINTENANCE FEES GRANTED (ORIGINAL EVENT CODE: PMFG); ENTITY STATUS OF PATENT OWNER: SMALL ENTITY Free format text: PETITION RELATED TO MAINTENANCE FEES FILED (ORIGINAL EVENT CODE: PMFP); ENTITY STATUS OF PATENT OWNER: SMALL ENTITY Free format text: SURCHARGE, PETITION TO ACCEPT PYMT AFTER EXP, UNINTENTIONAL. (ORIGINAL EVENT CODE: M2558); ENTITY STATUS OF PATENT OWNER: SMALL ENTITY |
| MAFP | Maintenance fee payment | Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YR, SMALL ENTITY (ORIGINAL EVENT CODE: M2551); ENTITY STATUS OF PATENT OWNER: SMALL ENTITY Year of fee payment: 4 |
| AS | Assignment | Owner name: IP3 2020, SERIES 500 OF ALLIED SECURITY TRUST I, NEW JERSEY Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:HJ LABORATORIES LICENSING, LLC;REEL/FRAME:054345/0298 Effective date: 20201028 |
| FEPP | Fee payment procedure | Free format text: ENTITY STATUS SET TO UNDISCOUNTED (ORIGINAL EVENT CODE: BIG.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY |
| AS | Assignment | Owner name: OMNISLASH DIGITAL LLC, TEXAS Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:IP3 2020, SERIES 500 OF ALLIED SECURITY TRUST I;REEL/FRAME:061510/0823 Effective date: 20211223 Owner name: HAPTIC SYNERGY LLC, TEXAS Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:OMNISLASH DIGITAL LLC;REEL/FRAME:061511/0596 Effective date: 20221014 |
| AS | Assignment | Owner name: OMNISLASH DIGITAL LLC, TEXAS Free format text: CORRECTIVE ASSIGNMENT TO CORRECT THE TWO APPLICATION NUMBERS INCORRECTLY LISTED AS PCT NUMBERS PREVIOUSLY RECORDED AT REEL: 061510 FRAME: 0823. ASSIGNOR(S) HEREBY CONFIRMS THE ASSIGNMENT;ASSIGNOR:IP3 2020, SERIES 500 OF ALLIED SECURITY TRUST I;REEL/FRAME:061773/0552 Effective date: 20211223 |
| MAFP | Maintenance fee payment | Free format text: PAYMENT OF MAINTENANCE FEE, 8TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1552); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY Year of fee payment: 8 |