US20110080344A1 - Blending touch data streams that include touch input data - Google Patents
- Publication number: US20110080344A1 (application US 12/893,427)
- Authority
- US
- United States
- Prior art keywords
- touch
- multitouch
- data
- input device
- user
- Prior art date
- Legal status: Abandoned (the status listed is an assumption by Google and is not a legal conclusion)
Classifications
- G06F3/0416—Control or interface arrangements specially adapted for digitisers
- G06F3/0425—Digitisers characterised by opto-electronic transducing means using a single imaging device (e.g. a video camera) for tracking the absolute position of a single or a plurality of objects with respect to an imaged reference surface (e.g. a display, projection screen, table, or wall surface on which a computer generated image is displayed or projected)
- G06F3/04883—Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser for inputting data by handwriting, e.g. gesture or text
- G06F2203/04808—Several contacts: gestures triggering a specific function (e.g. scrolling, zooming, right-click) when the user establishes several contacts with the surface simultaneously, e.g. using several fingers or a combination of fingers and pen
Definitions
- Multitouch systems: In multitouch systems, a user is able to provide input by touching or otherwise contacting multiple areas of a touch screen simultaneously. Multitouch systems provide opportunities for new types of user interactions and experiences that have not been possible in single touch or non-touch systems. Some multitouch systems have included frustrated total internal reflectance systems and capacitive systems.
- Embodiments of the present invention pertain to a system that includes multiple touch input devices.
- Each of the touch input devices generates a set of touch data.
- a blending component receives the sets of touch data and generates a corresponding combined set of touch data that is outputted to a touch input application.
- at least one of the multiple touch input devices is a multitouch input device that supports multiple simultaneous touch inputs.
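- As a non-limiting illustration (the code below is not part of the original disclosure), the blending described above can be sketched as a component that tags each touch with its source device so that identifiers from different devices cannot collide in the combined set; the class and field names are assumptions.

```python
# Illustrative sketch only: the disclosure describes a blending component that merges
# touch data sets from several input devices into one combined set; the class and
# field names below are assumptions, not the disclosed implementation.
from dataclasses import dataclass
from typing import Dict, List, Tuple


@dataclass
class Touch:
    touch_id: int   # identifier assigned by the originating device driver
    x: float        # x-axis coordinate reported by the device
    y: float        # y-axis coordinate reported by the device


class BlendingComponent:
    """Combines per-device touch data sets into a single set for an application."""

    def blend(self, per_device: Dict[str, List[Touch]]) -> List[Tuple[str, Touch]]:
        combined: List[Tuple[str, Touch]] = []
        for device_name, touches in per_device.items():
            for t in touches:
                # Keep the source device with each touch so identifiers from
                # different devices remain distinct in the combined stream.
                combined.append((device_name, t))
        return combined


# Example: a multitouch wall and a capacitive overlay reporting at the same time.
blender = BlendingComponent()
combined = blender.blend({
    "wall": [Touch(1, 0.25, 0.40), Touch(2, 0.70, 0.55)],
    "overlay": [Touch(1, 0.10, 0.90)],
})
print(combined)
```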
- FIG. 1 is a block diagram of a multitouch environment.
- FIG. 2 is a perspective view of a wall type multitouch device.
- FIG. 3 is a process flow diagram illustrating the operation of a multitouch device.
- FIG. 4 is a front view of the wall type multitouch device of FIG. 2 .
- FIG. 5 is a schematic diagram of a multitouch device light source.
- FIG. 6 is a back view of the wall type multitouch device of FIG. 2 .
- FIG. 7 is an electrical block diagram of a multitouch device.
- FIG. 8 is a cross-sectional view of the wall type multitouch device of FIG. 2 .
- FIG. 9 is a simplified perspective view of a parabolic multitouch device.
- FIG. 10 is a simplified front view of an airline gate wall multitouch device.
- FIG. 11 is a perspective view of a table type multitouch device.
- FIG. 12 is a simplified schematic diagram of a lighting scheme for a table type multitouch device.
- FIG. 13 is a cross-sectional view of the table type multitouch device of FIG. 11 .
- FIG. 14 is a perspective view of a capacitive multitouch system.
- FIG. 15 is a schematic diagram of a multitouch environment that incorporates multiple multitouch devices working together.
- FIG. 16 is a block diagram illustrating the operation of a universal multitouch driver.
- FIG. 17 is a schematic diagram of a universal multitouch driver.
- FIGS. 18-1, 18-2, and 18-3 are screenshots from an Application Launcher multitouch application.
- FIGS. 19-1, 19-2, 19-3, 19-4, 19-5, and 19-6 illustrate multitouch gestures that may be utilized with the multitouch applications described in this disclosure.
- FIGS. 20-1, 20-2, 20-3, 20-4, and 20-5 are screenshots of a Chat and Group Collaboration multitouch application.
- FIGS. 21-1 and 21-2 are screenshots of a Finger Painting multitouch application.
- FIG. 22 is a screenshot of a Falling Debris multitouch application.
- FIG. 23 is a screenshot of a Duck Shot multitouch application.
- FIG. 24 is a screenshot of a Text Messaging multitouch application.
- FIGS. 25-1, 25-2, 25-3, 25-4, 25-5, and 25-6 are screenshots of a Flight Scheduling multitouch application.
- FIG. 26 is a block diagram of a data blending component.
- FIG. 27 is a process flow diagram illustrating the operation of a blending component.
- FIG. 1 is a block diagram of one illustrative environment in which some embodiments may be incorporated. It should be noted however that FIG. 1 is for illustration purposes only and that embodiments are not limited to any particular environment. Embodiments are illustratively incorporated in any multitouch environment.
- FIG. 1 shows a multitouch device 100 that is optionally communicatively coupled to a radio frequency identification reader 102 , an object detection sensor 104 , and a barcode reader 106 .
- these input devices illustratively collect information that is utilized in combination with multitouch information from the multitouch device.
- a multitouch device 100 is running a flight scheduling application.
- users are illustratively able to use barcode reader 106 to scan their boarding pass and are then able to make changes to their flight plans by making selections on the screen of multitouch device 100 .
- FIG. 1 also shows that device 100 is optionally coupled to other input devices 108 . This is to illustrate that embodiments are not limited to having any particular other type of input devices connected to a multitouch device and optionally include any other type of input device.
- a multitouch device driver 110 communicates with multitouch device 100 to obtain multitouch input information collected from a user.
- multitouch device 100 may collect input information that includes x-axis coordinates and y-axis coordinates for touch gestures.
- Device driver 110 obtains that information from device 100 .
- the input information is then passed along to a universal multitouch device driver 120 .
- multitouch device drivers 110 commonly have different types of outputs.
- the format or syntax of outputs may vary (e.g. TUIO and HID).
- the outputs may have different types or amounts of information.
- some multitouch devices 100 only output x and y coordinates while other devices 100 output x and y coordinates along with a change in x, a change in y, a width, a height, and an acceleration.
- Universal multitouch device driver 120 is illustratively able to collect the outputs of many different types of device drivers 110 and convert the outputs into a standardized or universal format. The output of universal driver 120 is then transmitted to or collected by one or more applications 130. Applications 130 are illustratively multitouch applications. Several examples of multitouch applications 130 are discussed later in this description. Embodiments are not, however, limited to only multitouch applications and also illustratively include single touch or no touch applications.
- Embodiments of the present disclosure are illustratively practiced with any type of multitouch device.
- multitouch devices include those that detect touches utilizing capacitive screens and those that detect touches utilizing a vision based system. Some illustrative embodiments of multitouch devices are described below.
- FIGS. 2, 4, 6, and 8 show a wall type multitouch device 200.
- Device 200 is referred to as a wall system because its screen 202 has a relatively large form factor.
- the height 204 of screen 202 is forty-six inches, and the width 206 of screen 202 is eighty inches.
- Embodiments are not however limited to any particular dimensions, and embodiments illustratively include screens 202 that have any shape or size. Additionally, multiple devices 200 may be connected together and function as an even bigger wall.
- Device 200 also includes a depth 208 .
- Depth 208 may be any value. In one embodiment, for illustrative purposes only and not by limitation, depth 208 is twenty-six inches. In at least certain embodiments of the present disclosure, depths 208 are able to be kept relatively short by utilizing multiple projectors and/or low throw projectors to display graphics on screen 202 . Embodiments may however have other configurations such as, but not limited to, systems that utilize only one projector and/or those having longer depths 208 .
- FIG. 3 is a flow diagram showing one method of operating multitouch devices such as, but not limited to, device 200 in FIG. 2 .
- a blanket of light is generated over the front of screen 202 .
- a blanket of light is generated between screen 202 and user 210 .
- an imaging camera is positioned such that it is able to detect objects illuminated by the light blanket.
- graphical images are projected onto screen 202 .
- one or more users interact with the graphical images on the screen. For instance, in a flight scheduling application, a map may be displayed on the screen, and a user selects his destination by touching the corresponding portion of the screen.
- the imaging camera detects the user's interactions. This information is then used at block 312 to update the graphics on the screen. For instance, continuing with the flight scheduling application example, once a user touches his destination, the multitouch device may show available flights going to that destination. As is indicated by arrow 313 , the process then returns to block 308 , and blocks 308 , 310 , and 312 are repeated as needed.
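- The detect-and-update cycle described above (blocks 308, 310, and 312, repeated per arrow 313) can be sketched as a simple loop; the helper functions and the fake camera frames below are placeholders, not the disclosed implementation.

```python
# Hypothetical sketch of the repeating cycle of FIG. 3. The camera, projector, and
# touch-detection details are stand-ins, not the disclosed implementation.
import itertools


def detect_illuminated_objects(frame):
    # Placeholder: a real system would locate bright spots where fingers
    # intersect the infrared light blanket in front of the screen.
    return [(x, y) for (x, y, brightness) in frame if brightness > 200]


def run_multitouch_loop(capture_frame, project, update_graphics, iterations=3):
    graphics = "initial scene"
    for _ in range(iterations):                        # blocks 308, 310, 312 repeat (arrow 313)
        project(graphics)                              # display graphics on screen 202
        frame = capture_frame()                        # block 310: camera views the light blanket
        touches = detect_illuminated_objects(frame)    # user's interactions detected by the camera
        graphics = update_graphics(graphics, touches)  # block 312: update graphics from the touches


# Minimal stand-ins so the loop can be exercised end to end.
fake_frames = itertools.cycle([[(100, 120, 250)], [], [(40, 60, 230), (300, 200, 240)]])
run_multitouch_loop(
    capture_frame=lambda: next(fake_frames),
    project=lambda g: None,
    update_graphics=lambda g, touches: f"scene with {len(touches)} touch(es)",
)
```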
- FIG. 4 is a schematic diagram of the front of device 200 .
- Device 200 includes multiple light sources 212 that generate a light blanket over the front of screen 202 .
- light sources 212 are infrared laser diodes. Infrared light is useful in that it is not visually perceivable by human eyes so it does not interfere with graphics displayed on the screen. Embodiments are not however limited to any particular type of light sources and optionally include any type of light source.
- FIG. 5 shows one example of an infrared laser diode light source 212 .
- the output of light source 212 is passed through a line filter 214 .
- Filter 214 illustratively changes the light source output such that a beam of light 216 is formed that has an angle 218 and a height 217 .
- Different filters 214 may be used to generate any desired angle 218 and height 217 .
- screen 202 is curved.
- filters that are used in curved portions provide smaller angles 218 than filters used for flat portions. This enables the light blanket to be kept just in front of the screen instead of diverging away from the screen.
- flat screens or flat portions of screens use filters that have an angle 218 that is approximately one hundred and twenty degrees and a height 217 that is approximately two millimeters. Embodiments may however utilize filters providing different angles 218 and heights 217 .
- FIG. 4 shows that the light sources 212 at the top 220 of the screen and at the bottom 222 of the screen are each spaced apart from each other by a distance 224 , and that the light sources at the top 220 are offset from the light sources at the bottom 222 by a distance 225 .
- Distance 225 is optionally one half of distance 224 .
- This configuration, along with the "V" shaped light beam 216 shown in FIG. 5, illustratively provides a uniform blanket of light in front of screen 202.
- distance 224 is twelve and a half inches and distance 225 is six and one quarter inches.
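- The following back-of-the-envelope check is an editorial reading of this geometry, not a calculation from the disclosure: with a roughly one hundred and twenty degree fan, adjacent fans spaced twelve and a half inches apart on one edge merge within a few inches of the emitters, and the half-pitch offset of the opposite row is assumed to cover the small gaps that remain near each edge.

```python
# Back-of-the-envelope check (assumed geometry, not a formula from the disclosure):
# with a 120-degree fan (half-angle 60 degrees), how far from an emitter do adjacent
# fans on the same edge start to overlap when emitters sit 12.5 inches apart?
import math

fan_half_angle_deg = 60.0      # half of the ~120-degree dispersion angle 218
pitch_in = 12.5                # distance 224 between light sources on one edge

# Each fan reaches half the pitch (6.25 in) at distance d where tan(60 deg) * d = pitch / 2.
d_overlap_in = (pitch_in / 2.0) / math.tan(math.radians(fan_half_angle_deg))
print(f"Fans from one edge merge about {d_overlap_in:.1f} in from the emitters")
# ~3.6 inches; the offset row on the opposite edge (distance 225 = 6.25 in) is assumed
# to cover the wedge-shaped gaps that remain near the emitters' own edge.
```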
- FIG. 6 is a schematic diagram of the back of device 200 .
- Device 200 illustratively includes multiple projectors 226 .
- Projectors 226 are illustratively low throw projectors. The use of multiple low throw projectors helps to reduce the depth 208 (shown in FIG. 2 ) of device 200 such that it has a relatively slim form factor. For instance, if only one projector was used or if non-low throw projectors were used, projectors 226 would have to be spaced further away from screen 202 , and depth 208 would have to be increased.
- Embodiments of the present disclosure optionally include any type of projector and any number of projectors including one. For example, a system twice as long as the one shown in FIG. 6 may have eight low throw projectors.
- FIG. 6 also shows that device 200 includes an imaging camera 228 .
- Imaging camera 228 is not limited to any particular type of camera. The selection of camera 228 depends upon how the light blanket in front of the screen is formed. Camera 228 needs to be able to detect objects illuminated by the light blanket and needs to be selected accordingly. In one embodiment, in which infrared light is used to generate the light blanket, imaging camera 228 is either an infrared camera or is a camera fitted with an infrared filter such that camera 228 is able to image objects illuminated with infrared light. Camera 228 optionally includes any angle of field of view, frame rate, and resolution, and the selection of the camera is based upon the size of the screen and desired sensitivity of light detection.
- camera 228 has a field of view from 56 to 75 degrees, frame rates from 60 to 120 hertz, and resolutions from 320 by 240 to 640 by 480 pixels. In some embodiments, such as in the case of a device 200 having a longer length 206, multiple cameras 228 may be used as needed.
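- One hypothetical way to make "the selection of the camera is based upon the size of the screen" concrete is a small field-of-view calculation; the formulas and numbers below are illustrative assumptions rather than values from the disclosure.

```python
# Hypothetical sizing helper (not from the disclosure): given a camera's horizontal
# field of view, estimate the standoff needed to see a screen of a given width, or how
# many cameras are needed if the standoff is limited by the cabinet depth.
import math


def standoff_for_width(screen_width_in: float, fov_deg: float) -> float:
    """Distance behind the screen at which one camera spans the full width."""
    return (screen_width_in / 2.0) / math.tan(math.radians(fov_deg / 2.0))


def cameras_needed(screen_width_in: float, fov_deg: float, standoff_in: float) -> int:
    """Number of side-by-side cameras whose views tile the screen at a fixed standoff."""
    width_per_camera = 2.0 * standoff_in * math.tan(math.radians(fov_deg / 2.0))
    return math.ceil(screen_width_in / width_per_camera)


# An 80-inch-wide screen 202 with a 75-degree camera needs roughly 52 inches of standoff.
print(round(standoff_for_width(80, 75), 1))
# If the standoff were limited to roughly the 26-inch depth 208, several cameras would
# be needed across width 206 (an assumption for illustration only).
print(cameras_needed(80, 75, 26))
```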
- FIG. 6 further shows that device 200 includes a control system 230 and a power source 232 for lights 212 .
- Control system 230 illustratively runs the operations of device 200 .
- system 230 generates the graphics that are output to the projectors and processes any computations needed by applications running on device 200 .
- system 230 is one computer (e.g. a workstation, server, personal computer, blade, etc.). The exact implementation of system 230 is not however limited to any particular type of system or group of systems.
- device 200 may or may not need a power source 232 depending upon the type of light source 212 .
- FIG. 7 is a simplified electrical block diagram of device 200 .
- Control system 230 operates the light power source 232 such that it controls the turning on and turning off of light sources 212 .
- Control system 230 receives the output of imaging cameras 228 and processes the images to determine the positions of objects illuminated by the light blanket in front of the screen.
- Control system 230 also generates and supplies the projectors 226 with graphics that are displayed on the device's screen.
- FIG. 7 shows that device 200 is further optionally connected to other devices 234 .
- These other devices may include an RFID reader, a barcode reader, and an object detection sensor, such as those shown in FIG. 1 .
- Devices 234 may include any devices that are useful for applications being run on device 200 .
- Devices 234 may also include devices that are needed for or useful for the operation of device 200 .
- devices 234 include an additional camera that is positioned in front of device 200. The additional camera generates images that are used to blend the images of the multiple projectors together such that the image displayed on the screen looks as if it were generated by a single projector.
- FIG. 8 is a cross-section of device 200 from the perspective of line 8-8 in FIG. 4.
- the screen of device 200 includes several components.
- the screen illustratively includes a piece of transparent material 236 (e.g. glass or plexiglass), a backing/screen material 237 , and a light absorbing edge 238 (e.g. an acid washed edge).
- Backing material 237 provides a surface on which the images of projectors 226 are displayed.
- Material 237 is chosen such that a user on the opposite side of the screen from projectors 226 can view the images.
- material 237 is illustratively any translucent material.
- FIG. 8 shows a light blanket 240 in front of the screen.
- Blanket 240 illustratively has a depth 241 that extends along the entire surface of the screen of device 200 (i.e. it is a three-dimensional box having the dimensions of depth 241 by height 204 (shown in FIG. 2 ) by width 206 (also shown in FIG. 2 )).
- light sources 212 are positioned such that their outputs overlap the screen.
- the outputs of light sources 212 have heights (e.g. height 217 in FIG. 5) such that a portion of each output overlaps the screen.
- light sources 212 may be positioned such that there is no overlap with the screen or may even be positioned such that there is a gap between the light blanket and the screen. Embodiments are not limited to any particular positioning of the light blanket relative to the screen.
- FIG. 8 also shows imaginary lines 229 emanating from imaging camera 228 and quasi-imaginary lines 227 emanating from projectors 226 .
- the imaging camera lines 229 represent the field of view of the camera and illustrate that the camera is able to detect illuminated objects over the entire surface of the screen.
- multiple cameras are used and the combined field of view of all the cameras covers the entire surface of the screen.
- the quasi-imaginary projector lines 227 represent the graphical images generated by the projectors and displayed on the screen. As is indicated in the figure, the projectors each cover only a portion of the screen. For instance, in a four projector embodiment, each projector generates an image that covers approximately one quarter of the screen.
- FIG. 9 is a simplified schematic diagram of another embodiment of a multitouch device, device 900 .
- Device 900 illustratively operates similarly to device 200.
- One difference between device 900 and device 200 is that in device 900 , the multitouch screen 902 and its corresponding light blanket are parabolic in shape as opposed to being rectangular boxes (i.e. the top 920 and bottom 922 of screen 902 are parabolic or curved).
- the parabolic or curved light blanket is illustratively generated by adjusting the dispersion angles (i.e. angle 218 in FIG. 5 ) of the light sources 212 and the spacing 924 between the light sources.
- light sources illustratively have an angle of one hundred and twenty degrees and are spaced twelve and a half inches apart.
- both the angles and the spacing between light sources are reduced (e.g. less than one hundred and twenty degrees and less than twelve and a half inches). The amount of the reduction depends on the rate of change of the curve. For instance, if the screen curves thirty degrees over a length of ten feet, the angles and the spacings will need to be less than if the screen only curves ten degrees over a length of ten feet.
- the angles are reduced by connecting the light sources to line filters that provide the desired angle.
- FIG. 9 also has three boxes 928 . These boxes illustrate one potential placement of imaging cameras for device 900 (i.e. the boxes correspond to the centers of the fields of view of the cameras). As was previously mentioned, more or fewer imaging cameras may be used depending upon the size of the screen and the fields of view of the cameras.
- In one embodiment, device 900's height is the same as the height of device 200 (e.g. forty-six inches tall) but its width 906 is three times as long (e.g. device 200's width is eighty inches and device 900's width is two hundred and forty inches). In such a case, device 900 illustratively uses the same type of projectors but requires twelve instead of four.
- Embodiments of devices 200 and 900 are not however limited to any particular dimensions, number of imaging cameras, number of projectors, or shape. As will be appreciated by those skilled in the art, the techniques that have been described allow for multitouch devices to be made that have any size or shape.
- FIG. 10 is a simplified schematic view of another embodiment of a wall type multitouch device, device 1000 .
- device 1000 is an airline gate wall and is used at an airport.
- the screen of device 1000 has two portions, an upper portion 1001 and a lower portion 1002 .
- Device 1000 illustratively includes object detection sensors 104 . As is shown in the figure, sensors are optionally grouped into pairs of two that are aligned in the vertical direction. Sensors 104 can be used to detect the presence of a person who may wish to interact with device 1000 . In one configuration, if only the lower one of the two sensors 104 in a pair of sensors detects a person, device 1000 interprets that result as indicating that a child is present in front of its screen.
- Device 1000 illustratively responds by launching an application 131 on lower screen portion 1002 near where the child is detected.
- Application 131 can be any type of application. In one example, for illustration purposes only and not by limitation, application 131 is a game application that a child may enjoy playing with. If both of the two sensors 104 in a pair of sensors detect an object, device 1000 interprets that result as indicating that an adult is present in front of its screen.
- Device 1000 illustratively responds by launching an application 130 on upper screen portion 1001 near where the adult is detected.
- application 130 can be any type of application. In one embodiment, in which device 1000 is an airline gate wall, application 130 is a flight scheduling application.
- device 1000 has several pairs of sensors 104 . This illustrates that device 1000 supports multiple people interacting with the device simultaneously. In the specific example shown in FIG. 10 , device 1000 has four pairs of sensors 104 . Embodiments may of course have more or fewer pairs of sensors including no sensors.
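- The paired-sensor behavior described above can be sketched as a small decision routine; the function names and return strings below are illustrative only.

```python
# Sketch of the paired-sensor heuristic described for device 1000: if only the lower
# sensor of a vertical pair sees someone, treat it as a child and launch the game on
# lower portion 1002; if both sensors trip, treat it as an adult and launch the flight
# scheduling application on upper portion 1001. Names are illustrative, not disclosed.
from typing import Optional


def classify_presence(lower_tripped: bool, upper_tripped: bool) -> Optional[str]:
    if lower_tripped and upper_tripped:
        return "adult"
    if lower_tripped:
        return "child"
    return None  # nobody detected in front of this sensor pair


def respond_to_sensor_pair(pair_index: int, lower: bool, upper: bool) -> str:
    presence = classify_presence(lower, upper)
    if presence == "child":
        return f"launch game application 131 on lower portion 1002 near pair {pair_index}"
    if presence == "adult":
        return f"launch application 130 on upper portion 1001 near pair {pair_index}"
    return "no action"


# Four sensor pairs, matching the example in FIG. 10, each handled independently.
readings = [(True, True), (True, False), (False, False), (True, True)]
for i, (lower, upper) in enumerate(readings):
    print(respond_to_sensor_pair(i, lower, upper))
```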
- FIG. 10 shows that device 1000 has a height 1004 and a width 1006 . Height 1004 and width 1006 include any values and embodiments can be made to be any desired size to support the simultaneous use by any number of people. In one embodiment, for illustration purposes only, height 1004 is ninety inches and width 1006 is between nine and eighteen feet.
- Device 1000 further optionally has multiple barcode scanners 106 and multiple RFID readers 102 .
- Scanners 106 may be useful for certain applications 130 .
- barcode scanner 106 can be used to scan a boarding pass, a confirmation e-mail, etc.
- RFID readers 102 can similarly be used along with applications 130 . Readers 102 could for example be used to read credit card information or to read user identification information. The user identification information could be used to log into an application 130 .
- FIGS. 11, 12, and 13 illustrate yet another embodiment of a multitouch device, device 1100.
- device 1100 is a table type multitouch device and has a smaller form factor than the devices shown in FIGS. 2, 4, 6, 8, 9, and 10.
- device 1100 's width 206 and height 204 are illustratively each in the range of one to four feet.
- the screen 1102 of device 1100 is surrounded on all four of its sides by light sources 1112 .
- FIG. 12 is a simplified schematic drawing showing screen 1102 and light sources 1112 in more detail.
- Screen 1102 includes a transparent material 1136 (e.g. glass or plexiglass) and many small reflectors 1137 (e.g. small pieces of aluminum or chrome) embedded and dispersed throughout material 1136.
- Reflectors 1137 illustratively have random orientations such that they reflect light in all directions.
- light source 1112 is a ribbon of light emitting diodes 1113 . In one specific example, for illustration purposes only and not by limitation, diodes 1113 are three quarter inch infrared LEDs.
- Light source 1112 generates light (e.g. IR light) that is transmitted through transparent material 1136 and is reflected by reflectors 1137 .
- Light source 1112 is illustratively chosen such that light is transmitted throughout the entire volume of the screen (i.e. throughout its entire width by height by thickness).
- FIG. 13 is a simplified cross-sectional or side view of device 1100 .
- a projector 1126 projects an image onto a backing/screen material 1138 that is attached to the back of glass 1136 having embedded reflectors 1137 (shown in FIG. 12 ).
- a user 1110 interacts with images on the screen by touching the surface of the screen. When user 1110 touches the screen, light collects or becomes concentrated at the point of contact (i.e. the area of the screen being touched).
- Imaging camera 1128 illustratively has a field of view that covers the entire surface of the screen and is able to detect any touches (i.e. it is able to image the concentrated areas of light). For instance, in the case of light source 1112 generating infrared light, camera 1128 is optionally an infrared camera or a camera equipped with an IR filter.
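- One plausible (but not disclosed) way to turn the concentrated bright spots seen by camera 1128 into touch coordinates is a threshold followed by connected-component labeling, sketched below with scipy.ndimage; the thresholds and sizes are assumptions.

```python
# Hypothetical touch extraction for the table device: a finger on screen 1102
# concentrates infrared light at the contact point, so touches appear as bright blobs
# in camera 1128's frame. Thresholding plus connected-component labeling is one common
# way to turn those blobs into (x, y) coordinates; the disclosure does not prescribe it.
import numpy as np
from scipy import ndimage


def touches_from_ir_frame(frame: np.ndarray, threshold: int = 200, min_pixels: int = 4):
    """Return (x, y) centers of bright blobs in a grayscale infrared frame."""
    bright = frame > threshold                       # pixels lit up at a contact point
    labels, count = ndimage.label(bright)            # group adjacent bright pixels into blobs
    touches = []
    for blob_id in range(1, count + 1):
        ys, xs = np.nonzero(labels == blob_id)
        if xs.size >= min_pixels:                    # ignore speckle smaller than a fingertip
            touches.append((float(xs.mean()), float(ys.mean())))
    return touches


# Tiny synthetic frame with two "touches".
frame = np.zeros((240, 320), dtype=np.uint8)
frame[50:56, 100:106] = 255
frame[150:157, 250:257] = 255
print(touches_from_ir_frame(frame))
```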
- device 1100 is shown as having only one projector 1126 and one imaging camera 1128 . In other embodiments, device 1100 may have multiple projectors 1126 and/or multiple imaging cameras 1128 as is needed. Additionally, multiple devices 1100 are illustratively connected together and function as a single unit. For instance, up to five devices 1100 are optionally connected together and function as a single unit.
- FIG. 14 illustrates another type of multitouch system, a capacitive system 1400 .
- a multitouch capacitive screen overlay 1402 is positioned in front of a television or monitor 1404 .
- the multitouch capability and display device are not separate pieces and are illustratively integrated into one unit.
- a control system 1406 (e.g. a computer) is communicatively coupled to capacitive screen overlay 1402 and to television/monitor 1404.
- Control system 1406 generates graphical output 1411 that is transmitted to and displayed on television/monitor 1404 .
- a user interacts with the graphics on television/monitor 1404 by touching capacitive screen overlay 1402 .
- Overlay 1402 then sends multitouch output 1412 back to control system 1406 that indicates the user's touches.
- Control system 1406 then uses the multitouch output 1412 to update its graphics and sends the updated graphics back to television/monitor 1404 .
- FIG. 15 illustrates one example of such an environment.
- the environment in FIG. 15 has three multitouch devices, a wall type device 1510 in a living room, a table type device 1530 in the living room, and a second wall type device 1520 in a bedroom.
- Devices are illustratively coupled to each other through an intranet/router 1541 and are also coupled to the internet 1542 .
- the screen 1531 of table device 1530 has multiple windows, windows 1532 and window 1533 .
- Window 1533 illustratively includes television programming guide information such as, but not limited to, a listing of television stations and an indication of what programs will be on the stations at different times and dates.
- Windows 1532 are illustratively television program windows. For example, a user may select four different programs from guide window 1533 and each of the four programs is displayed in its own window 1532 . The user could then control what is being displayed on the screens of the wall type devices 1510 and 1520 . For instance, a user could touch one of windows 1532 and push or flick it in the direction of one of the wall devices, and the program being displayed in the window would then be displayed on the wall device.
- table device 1530 generates audio output for headphones and the selection of one or more of windows 1532 generates one or more channels of headphone audio output (i.e. the audio corresponding to the television program being displayed in the selected windows).
- the environment shown in FIG. 15 may have personal video recording capability, and a user is able to control the recording and playing of video utilizing the multiple multitouch devices. For example, for illustration purposes only and not by limitation, a user could touch the screen of table device 1530 to pause or rewind the program being displayed on wall device 1510 .
- the multiple multitouch device environment in FIG. 15 is not however limited to running any particular application and is illustratively used in running any application.
- Other illustrative applications include making or placing phone calls, making or placing video calls using optional cameras 1512 and 1522 , controlling various aspects of the home (e.g. temperature, garage door), making drawings or doodles, doing homework, shopping or ordering (e.g. ordering a pizza), surfing the internet, and playing video games.
- multitouch device drivers commonly output their touch information in different formats or syntax. Their various types of outputs may also have different kinds of information. For example, one multitouch device driver may only output an x and a y coordinate of a touch, and another device driver may have x and y coordinates as well as a change in x and a change in y (i.e. dx, dy). These varying forms of multitouch output can make implementing multitouch applications difficult and/or costly. For example, if an application developer wants to make a multitouch application that can be used on multiple different types of multitouch devices, he may have to make and maintain several different versions of the same application, with each of the versions being specially tailored to accommodate a particular device driver.
- Some embodiments include a universal multitouch driver that eliminates or reduces the need to specially tailor applications to multiple types of device drivers.
- the universal multitouch driver is able to receive and interpret the outputs of multiple different types of device drivers.
- the universal multitouch driver then converts the output it receives into a standardized format such that regardless of the format or type of information that it receives, it consistently outputs multitouch information in the same format and having the same type of information. Consequently, multitouch application developers do not have to make or maintain multiple versions of multitouch applications to implement an application across multiple types of multitouch devices. Instead, developers only need to make applications that are able to use the multitouch information from the universal multitouch driver.
- FIG. 16 is a block diagram illustrating the operation of a universal multitouch driver.
- a user interacts with a multitouch screen (e.g. the user touches the screen).
- a multitouch device driver generates touch data based upon the user interaction.
- this touch data can include several different parameters depending upon the device driver.
- Touch data commonly includes a touch identifier (e.g. touch #1, touch #2, etc.) and x-axis and y-axis coordinates that correspond to the location on the multitouch screen that was touched. Touch data may also include other types of information such as a change in the x-axis coordinate from the previously sent data (e.g. dx), a change in the y-axis coordinate from the previously sent data (e.g. dy), a width of the touch (e.g. w), a height of the touch (e.g. h), and an acceleration (e.g. m).
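- Gathering the parameters named above into a single record might look like the following; the dataclass layout is an editorial illustration, not a structure defined by the disclosure.

```python
# The full touch-data parameter set named in the description (identifier, x, y, dx,
# dy, w, h, m) gathered into one record. The layout is illustrative only.
from dataclasses import dataclass
from typing import Optional


@dataclass
class TouchData:
    touch_id: int                # e.g. touch #1, touch #2, ...
    x: float                     # x-axis coordinate of the touch
    y: float                     # y-axis coordinate of the touch
    dx: Optional[float] = None   # change in x since the previously sent data
    dy: Optional[float] = None   # change in y since the previously sent data
    w: Optional[float] = None    # width of the touch
    h: Optional[float] = None    # height of the touch
    m: Optional[float] = None    # acceleration of the touch
```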
- the universal multitouch driver obtains the touch data from the device driver.
- the device driver has an application programming interface, and the universal driver communicates with that interface to retrieve the data.
- the device driver automatically sends the data to the universal driver. Embodiments are not limited to any specific manner of obtaining the data.
- at block 1607, the universal multitouch driver optionally stores the data to a volatile (e.g. DRAM) or non-volatile (e.g. flash or hard disk drive) memory.
- the universal multitouch driver calculates any missing parameters.
- different device drivers do not always output the same types of information.
- One device driver may for example only output a touch identifier, an x-axis coordinate, a y-axis coordinate, a width, and a height.
- Universal multitouch driver optionally retrieves the data stored at block 1607 and uses the current information and the stored information to generate a full set of touch data parameters (i.e. the touch identifier, x, y, dx, dy, w, h, and m).
- the universal multitouch driver may fill in one or more missing parameters with a default value. For instance, if the device driver does not provide a width or a height, the universal driver illustratively uses a default value for those parameters (e.g. 0 or 1).
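- A minimal sketch of blocks 1607 and 1608 follows, using plain dictionaries for brevity rather than the record sketched above: missing dx and dy values are computed from the previously stored sample, and missing width, height, and acceleration values fall back to defaults.

```python
# Sketch of blocks 1607-1608: the universal driver fills in parameters a device driver
# did not supply, using the previously stored sample for dx/dy and default values
# (e.g. 0 or 1) for width and height. Key names mirror the parameter list above and
# are illustrative, not disclosed.
def complete_touch(current: dict, history: dict, default_size: float = 1.0) -> dict:
    previous = history.get(current["id"])
    if "dx" not in current:
        current["dx"] = current["x"] - previous["x"] if previous else 0.0
    if "dy" not in current:
        current["dy"] = current["y"] - previous["y"] if previous else 0.0
    current.setdefault("w", default_size)   # default width when the driver omits it
    current.setdefault("h", default_size)   # default height when the driver omits it
    current.setdefault("m", 0.0)            # no acceleration information available
    history[current["id"]] = current        # stored for the next dx/dy calculation (block 1607)
    return current


history: dict = {}
print(complete_touch({"id": 1, "x": 10.0, "y": 20.0}, history))
print(complete_touch({"id": 1, "x": 13.0, "y": 18.0}, history))
```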
- the universal multitouch driver formats or packages the touch data, including any calculated or default values, into a standardized format.
- the standardized format follows the “User Datagram Protocol” or “UDP” network protocol.
- the standardized format follows the “Extensible Markup Language” or “XML” rules.
- a universal multitouch driver formats the touch data and packages it using multiple different formats simultaneously (e.g. it packages the same touch data in both UDP and XML simultaneously). Embodiments are not however limited to any particular standardized format.
- the universal multitouch driver transfers the standardized touch data to one or more multitouch applications.
- the universal multitouch driver may send the data to the application, or alternatively the application may retrieve the data from the universal multitouch driver.
- the universal multitouch driver sends UDP formatted data to UDP port 3000 and XML formatted data to TCP port 3333.
- the universal multitouch driver may optionally transfer data in multiple ways simultaneously (e.g. it sends data to both port 3000 and 3333 simultaneously).
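- A hedged sketch of the transfer step follows: the description names only the protocols and port numbers (UDP port 3000 and TCP port 3333), so the payload layouts below (a comma-separated datagram and a simple <touch> element) are assumptions.

```python
# Sketch of transferring standardized touch data to applications. The port numbers
# come from the description; the payload formats are assumptions for illustration.
import socket


def send_touch_both_ways(touch: dict, host: str = "127.0.0.1") -> None:
    # UDP datagram to port 3000
    udp_payload = "{id},{x},{y},{dx},{dy},{w},{h},{m}".format(**touch).encode()
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as udp:
        udp.sendto(udp_payload, (host, 3000))

    # XML document over TCP to port 3333
    xml_payload = (
        '<touch id="{id}" x="{x}" y="{y}" dx="{dx}" dy="{dy}" '
        'w="{w}" h="{h}" m="{m}"/>'.format(**touch).encode()
    )
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as tcp:
        tcp.connect((host, 3333))
        tcp.sendall(xml_payload)


touch = {"id": 1, "x": 0.42, "y": 0.17, "dx": 0.01, "dy": -0.02, "w": 1, "h": 1, "m": 0.0}
# send_touch_both_ways(touch)  # requires listeners on ports 3000 and 3333
```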
- the multitouch application utilizes the standardized user touch data that it received from universal multitouch driver. The process then illustratively repeats itself and returns to block 1602 where a user interacts with a multitouch screen.
- FIG. 17 is a block diagram illustrating one embodiment of a universal multitouch driver 1700 .
- Universal multitouch driver 1700 illustratively has a device driver interface 1702 that is communicatively coupled to an application programming interface 1752 of multitouch device driver 1750 .
- universal multitouch driver 1700 utilizes this connection to obtain the touch data from the device driver.
- Universal multitouch driver 1700 also has a processing module 1704 and a storage module 1706 .
- Module 1704 is utilized in performing any computations, processing, or logical functions that may be required. For instance, processing module 1704 may perform calculations to determine additional parameters as was described in relation to block 1608 in FIG. 16 .
- Storage module 1706 is utilized in performing any data storage, data management, or data retrieval functions such as, but not limited to, storing touch data to calculate change in x and change in y parameters (e.g. blocks 1607 and 1608 in FIG. 16 ).
- Universal multitouch driver 1700 also has an application interface 1708 that is communicatively coupled to an interface 1782 of one or more multitouch applications 1780. This connection is illustratively used in transferring (e.g. sending or transmitting) the standardized touch data to the multitouch applications.
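- For illustration only, the components named in FIG. 17 can be arranged as a small class whose attributes stand in for device driver interface 1702, processing module 1704, storage module 1706, and application interface 1708; the method names and data flow are assumptions.

```python
# Minimal arrangement of the components named in FIG. 17. The wiring is illustrative.
class UniversalMultitouchDriver:
    def __init__(self, device_driver, applications):
        self.device_driver = device_driver   # stands in for interface 1702 <-> API 1752
        self.applications = applications     # stands in for interface 1708 <-> interface 1782
        self.storage = {}                    # storage module 1706: previous samples by touch id

    def poll_once(self):
        raw = self.device_driver.read_touches()        # obtain touch data from the device driver
        standardized = [self.process(t) for t in raw]  # processing module 1704 fills gaps
        for app in self.applications:                  # transfer to applications
            app.receive(standardized)

    def process(self, touch: dict) -> dict:
        previous = self.storage.get(touch["id"], touch)
        touch.setdefault("dx", touch["x"] - previous["x"])  # fill missing parameters
        touch.setdefault("dy", touch["y"] - previous["y"])
        touch.setdefault("w", 1.0)
        touch.setdefault("h", 1.0)
        self.storage[touch["id"]] = touch                   # storage module 1706
        return touch


class FakeDeviceDriver:
    def read_touches(self):
        return [{"id": 1, "x": 0.5, "y": 0.5}]


class PrintingApplication:
    def receive(self, touches):
        print("application received:", touches)


driver = UniversalMultitouchDriver(FakeDeviceDriver(), [PrintingApplication()])
driver.poll_once()
```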
- the methods and the universal multitouch drivers described above illustratively help to reduce problems associated with implementing multitouch applications across multiple types of multitouch devices having various types of drivers. Instead of making and maintaining multiple versions of the same application, developers only need to make and maintain one version of any given application. It should be noted however that embodiments of the present disclosure are not limited to multitouch systems or environments having universal multitouch drivers. Embodiments are illustratively practiced without a universal multitouch driver, and applications illustratively receive touch data directly from device drivers. It should also be noted that in systems having a universal multitouch driver, the universal multitouch driver is optionally bypassable. For instance, in a situation in which the data from the device driver is already in a suitable format for an application, the data may bypass the universal multitouch driver and go directly to the application.
- Embodiments of the present disclosure also illustratively include a number of multitouch applications. These applications may be practiced in environments such as those previously discussed. The applications are however not limited to any particular environment and may also be practiced in environments different than those previously discussed.
- FIGS. 18-1, 18-2, and 18-3 illustrate various states of an Application Launcher multitouch application.
- FIGS. 19-1, 19-2, 19-3, 19-4, 19-5, and 19-6 illustrate multitouch gestures that may be used along with Application Launcher and the other applications disclosed herein.
- FIG. 18-1 shows Application Launcher from a state in which it is only showing background graphics or wallpaper 1802 on the screen of a multitouch device. In one embodiment, graphics or wallpaper 1802 are animated.
- FIG. 18-2 shows Application Launcher from a state in which an application menu 1804 has been activated.
- FIG. 19-1 shows two alternative gestures that may be used to activate the application menu. One of the gestures includes a single touch (e.g. a tap) followed by a touch that makes a line to the right.
- the other illustrative activation method includes making a clockwise rotation gesture on the screen.
- Application menu 1804 illustratively has multiple icons that correspond to other applications that may be run on the multitouch device.
- application menu 1804 includes icons corresponding to or representing six applications. They are a Chat and Group Collaboration application 1811 , a Finger Painting application 1812 , a Falling Debris Game application 1813 , a Duck Shot Game application 1814 , a Text Messaging application 1815 , and a Flight Scheduler application 1816 .
- Each of applications 1811 - 1816 are discussed in greater detail below.
- the applications have text based icons.
- the icons are graphical icons that provide some indication of the corresponding applications (e.g. a picture of a plane for a flight scheduling application).
- FIG. 18-3 shows Application Launcher from a state in which two applications are launched and displayed on the screen of a multitouch device.
- the first application that has been launched is in one window 1821 of the screen
- the second application that has been launched is in a second window 1822 of the screen.
- Each of the two applications was illustratively launched by tapping on the corresponding icon 1811 - 1816 in application menu 1804 .
- multiple multitouch applications are able to be run on one device simultaneously.
- Application Launcher illustratively manages and allocates system resources to enable the multiple simultaneous applications.
- FIGS. 19-2, 19-3, 19-4, 19-5, and 19-6 show additional gestures that may be used in applications according to the present disclosure.
- FIG. 19-2 represents gestures for closing application menu 1804 .
- the gestures are essentially the opposite of the gestures shown in FIG. 19-1 for opening or launching application menu 1804 .
- FIG. 19-3 represents a gesture for increasing the size of a window.
- a user illustratively uses two fingers to touch two spots on an object displayed on a multitouch screen and then keeps touching the screen with the fingers while moving the fingers away from each other.
- This gesture is illustratively able to make any window larger.
- a user can increase the size of application windows 1821 and 1822 , and application menu 1804 .
- a user is able to use the gesture to magnify wallpaper or background graphics.
- FIG. 19-4 represents a gesture for decreasing the size of a window or de-magnifying background graphics.
- a user illustratively uses two fingers to touch two spots on an object and then keeps touching the screen with the fingers while moving the fingers towards each other.
- FIGS. 19-5 and 19-6 represent gestures for rotating windows or background graphics.
- the gesture in FIG. 19-5 corresponds to clockwise rotation
- the gesture in FIG. 19-6 corresponds to counter-clockwise rotation.
- the gestures include touching two spots on the screen and moving the fingers together while making parallel lines.
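- The two-finger resize and rotate gestures of FIGS. 19-3 through 19-6 are commonly implemented with standard two-point math (not spelled out in the disclosure): the change in distance between the contact points gives a scale factor, and the change in the angle of the line between them gives a rotation. A sketch follows.

```python
# Standard two-finger gesture math matching the gestures of FIGS. 19-3 through 19-6;
# this is common practice, not a formula taken from the disclosure.
import math


def two_finger_transform(p1_start, p2_start, p1_end, p2_end):
    """Return (scale, rotation_degrees) implied by two touches moving on the screen."""
    def distance(a, b):
        return math.hypot(b[0] - a[0], b[1] - a[1])

    def angle(a, b):
        return math.atan2(b[1] - a[1], b[0] - a[0])

    scale = distance(p1_end, p2_end) / distance(p1_start, p2_start)
    rotation = math.degrees(angle(p1_end, p2_end) - angle(p1_start, p2_start))
    return scale, rotation


# Fingers moving apart (FIG. 19-3) enlarges a window; moving together (FIG. 19-4) shrinks it.
print(two_finger_transform((100, 100), (200, 100), (80, 100), (220, 100)))   # scale > 1
# Both fingers tracing arcs (FIGS. 19-5 and 19-6) rotates the window.
print(two_finger_transform((100, 100), (200, 100), (100, 100), (200, 140)))  # nonzero rotation
```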
- FIGS. 20-1, 20-2, 20-3, 20-4, and 20-5 show several illustrative examples of graphical user interfaces (e.g. screenshots) for a multitouch Chat and Group Collaboration application.
- the application is illustratively launched by selecting the corresponding icon 1811 from Application Launcher application menu 1804 (shown in FIG. 18 ).
- FIG. 20-1 shows a first user interface 2001 that is illustratively utilized to log a user into the application.
- the log in user interface 2001 includes background graphics 2002 and a log in window 2004 .
- Background graphics 2002 can include any graphics desired by the user. Background graphics 2002 may also optionally include animated graphics.
- Log in window 2004 is illustratively used to collect information from a user such that the application is able to identify the user. This may be useful, for example, for retrieving a user's saved information (e.g. saved preferences, friend list, etc.) or for restricting access to the application to only certain people (e.g. paying subscribers to the service).
- Embodiments of log in window 2004 may have any features and collect any type of information. In the specific example shown in FIG. 20-1, log in window 2004 has a title or header 2005, a closing button 2006, a first side or portion 2007 that enables a user to log into the application as a guest (e.g. a person that enters another person's meeting), and a second side or portion 2008 that enables a user to log into the application as a host (e.g. a person holding a meeting that others attend).
- the guest portion 2007 has a guest user input field 2009 to collect identifying information (e.g. a user name or handle) from a person logging in as a guest, a label or header 2010 that describes or identifies input field 2009 , and a button 2011 to submit the user's identifying information and to log into the application as a guest.
- the host portion 2008 has a host user input field 2012 to collect identifying information (e.g. username, email address, handle, etc.) from a user who wishes to log in as a host, a label or header 2013 that describes or identifies input field 2012 , a second host user input field 2014 that collects authentication information (e.g. a password, PIN, etc.), a label or header 2015 that describes or identifies input field 2014 , and a button 2016 to submit the user's identifying/authentication information and to log into the application as a host.
- FIG. 20-2 shows a second user interface 2020 of the Chat and Group Collaboration application.
- Interface 2020 is illustratively displayed once a user has submitted his identification and/or authentication information to the application through a log in window (e.g. through log in window 2004 in FIG. 20-1 ), and the application has successfully verified the information (e.g. that the user is a registered user).
- Interface 2020 includes a user list window 2021.
- Window 2021 has an identification label or header 2022 that helps a user understand what window 2021 is, a closing button 2023 to close window 2021 , and multiple user identification fields 2024 .
- Each user identification field 2024 includes a text and/or image user identifier 2025 (e.g. a user name, a real name, a picture of the user, a handle, etc.) and a status field 2026 that indicates a status of the user (e.g. online/available or offline/not available).
- FIG. 20-3 shows a third user interface 2028 of the Chat and Group Collaboration application.
- Interface 2028 is illustratively displayed once a user has selected (e.g. touched) one or more of the user identification fields 2024 .
- a corresponding communication window 2030 is displayed on the interface.
- Communication window 2030 includes an identification field or header 2031 that describes a window (e.g. it identifies a selected user) and a closing button 2032 to terminate the communication session with the selected user.
- Communication window 2030 further has multiple buttons, 2033 , 2034 , and 2035 that allow a user to select various methods of communicating with the selected user.
- Button 2033 corresponds to communicating by video.
- Button 2034 corresponds to communicating by voice
- button 2035 corresponds to communicating by text.
- Window 2030 further optionally includes two video portions 2036 and 2037 , and a text portion 2038 .
- Video portions 2036 and 2037 may be used for video communication.
- one of the portions may show video of the selected user, and the other one of the portions may show the user's own video that is being transmitted to the selected user.
- Text portion 2038 may be used for communicating by text.
- Portion 2038 may include, for example, a series of type written messages along with identifiers that show which user generated each of the messages.
- the Chat and Group Collaboration application, as well as the rest of the applications described in this specification, has multitouch capabilities.
- the applications illustratively support the use of multitouch gestures such as, but not limited to, those shown in FIGS. 19-1, 19-2, 19-3, 19-4, 19-5, and 19-6.
- This allows for windows, backgrounds, and any object within an application to be manipulated or controlled by the multitouch gestures. For instance, windows can be increased or decreased in size or rotated utilizing multitouch gestures.
- Window 2030 in FIG. 20-3 further includes a drawing button 2039 that enables users to communicate by drawing images on their multitouch screens.
- when drawing button 2039 is selected, the user interface 2040 shown in FIG. 20-4 is illustratively displayed.
- Interface 2040 includes a drawing window 2042 .
- Window 2042 has a multitouch shared drawing area 2043 .
- Drawing area 2043 is a shared object in that the same image of area 2043 is displayed on the multitouch screens of each of the users logged into the application (i.e. users can see what other users draw).
- Window 2042 further optionally has multiple buttons 2044 that provide additional features.
- Buttons 2044 may include features that are useful in drawing in area 2043 . For instance, buttons 2044 may include a button to draw in area 2043 in a pencil or pen format, a button to draw in area 2043 with a paint brush format, and/or a button to draw in area 2043 with a spray paint format.
- FIG. 20-5 shows another user interface 2050 that may be included with a Chat and Group collaboration application.
- Interface 2050 includes a map window 2052 .
- Map window 2052 is illustratively shared amongst all of the users logged into the system.
- Each user is able to control or manipulate the map shown in the window by using multitouch gestures. For instance, users may zoom in on a location utilizing the gesture shown in FIG. 19-4, zoom out on a location utilizing the gesture shown in FIG. 19-3, and/or rotate the view of a location utilizing the gestures shown in FIGS. 19-5 and 19-6. Additionally, the window 2052 itself can be resized, rotated, etc. utilizing multitouch gestures.
- FIGS. 21-1 and 21-2 show graphical user interfaces of a multitouch Finger Painting application.
- the application is illustratively launched by selecting the corresponding icon 1812 from the Application Launcher application menu 1804 (shown in FIG. 18 ).
- Interface 2101 in FIG. 21-1 shows the painting area 2104 of the user interface before anything has been painted in it, and interface 2102 shows the user interface after some squiggly lines have been painted in painting area 2104 .
- Paint area 2104 illustratively has background graphics that look like a painting canvas or a piece of paper. Area 2104 however may include any type of graphics (e.g. one solid color).
- Interfaces 2101 and 2102 include a plurality of color selection buttons 2106 .
- a user illustratively selects one or more of the colors. The selected color is then activated, and when a user touches painting area 2104 , it paints with the selected color.
- Interfaces 2101 and 2102 may also have additional buttons such as, but not limited to, a clear button 2108 to erase the painting/drawing in area 2104 and/or a closing button 2109 to terminate the application.
- FIG. 22 shows a graphical user interface 2201 of a Falling Debris multitouch game application.
- the application is illustratively launched by selecting the corresponding icon 1813 from the Application Launcher application menu 1804 (shown in FIG. 18 ).
- falling debris 2202 move from the top of the interface towards the bottom of the interface.
- Debris 2202 have tails 2203 that represent the previous locations of the debris.
- the object of the game is for a user to trigger explosions 2205 to destroy the debris before they hit houses 2204 .
- explosions 2205 are triggered by a user touching the screen (i.e. an explosion occurs where the screen is touched). Given the multitouch capabilities of the application, a user is able to touch multiple spots on interface 2201 simultaneously, thus triggering multiple simultaneous explosions 2205 .
- FIG. 23 shows a graphical user interface 2301 of a Duck Shot multitouch game application.
- the application is illustratively launched by selecting the corresponding icon 1814 from the Application Launcher application menu 1804 (shown in FIG. 18 ).
- different types of objects such as, but not limited to, ducks 2302, fish 2303, birds 2304, and/or octopuses 2305 are intermittently displayed on the interface.
- the objects optionally include targets or bulls eyes that indicate that they are to be “shot.”
- the application is displayed on a relatively large multitouch device (e.g. 80′′ by 46′′) and a user throws bean bags at the device's multitouch screen.
- when the application is displayed on a smaller multitouch device, a user "shoots" the objects by touching them with his fingers.
- upon a successful shot, interface 2301 illustratively displays points 2306 that the user is awarded.
- interface 2301 may display negative points 2307 that are deducted from the user's score upon an unsuccessful shot. Because of the multitouch capability, a user is able to shoot multiple objects simultaneously and/or multiple users can play simultaneously with each of the users being able to shoot at the same time.
- FIG. 24 shows a graphical user interface 2401 of a Text Messaging multitouch application.
- the application is illustratively launched by selecting the corresponding icon 1815 from the Application Launcher application menu 1804 (shown in FIG. 18 ).
- the application illustratively includes two windows within its user interface.
- the first window 2402 is a text messaging window.
- text messaging window 2402 includes a label or header 2403 , a closing button 2404 , a phone number field 2405 , a phone number field label 2406 , a subject field 2407 , a subject field label 2408 , a message body field 2409 , a message body field label 2410 , and a button to send the message 2412 .
- text messaging window 2402 may have more or fewer fields and labels than what is shown in the figure.
- the second window in user interface 2401 is a user input window 2411 .
- the user input window 2411 is a QWERTY keyboard that has multitouch capability (e.g. a user could touch the “Shift” key and a letter key to type the capitalized version of the letter).
- a user illustratively utilizes the multitouch QWERTY keyboard displayed on the screen to fill out fields 2405 , 2407 , and 2409 in the text messaging window 2402 .
- User input window 2411 is not however limited to only QWERTY keyboards and illustratively includes any other type of input mechanism that can collect the required information from a user.
- FIGS. 25-1, 25-2, 25-3, 25-4, 25-5, and 25-6 show several graphical user interfaces of a Flight Scheduling application.
- the application is illustratively launched by selecting the corresponding icon 1816 from the Application Launcher application menu 1804 (shown in FIG. 18 ).
- FIG. 25-1 shows a user interface 2501 that is illustratively the background of the application or the default interface of the application.
- the interface illustratively has background graphics 2502 that are displayed. Graphics 2502 are optionally animated graphics in that they move.
- FIG. 25-2 shows a user interface 2503 that is used to gather information from a user.
- Interface 2503 illustratively has several windows that enable a user to provide various types of information. The ability to use multiple types of information may be more convenient to a user as opposed to requiring the user to know any one specific piece of information.
- a first window included within interface 2503 is a flight number window 2504 .
- a user illustratively is able to retrieve his flight information by inputting the flight number into flight number field 2505 .
- a second window is a record locator window 2507. It has a field 2508 that enables a user to enter his record locator (e.g. a code given by the airline).
- a third window is a boarding pass window 2510 .
- the multitouch device running the application illustratively has a bar code reader, and a user is able to input his information by scanning his boarding pass.
- the input fields optionally each include a corresponding label, 2506, 2509, and 2512, that indicates to a user what type of information is to be entered into the fields.
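- taken together, windows 2504, 2507, and 2510 let the application resolve the same flight record from whichever identifier the traveler happens to have. A minimal sketch of that lookup is shown below; the record layout, keys, and sample values are assumptions made for illustration.

```python
# Minimal sketch (assumed record layout and sample data): resolve an itinerary from a
# flight number (field 2505), a record locator (field 2508), or a scanned boarding pass.
ITINERARIES = {
    "flight:XX1234":  {"from": "MSP", "to": "LGA", "gate": "C12", "departs": "14:05"},
    "locator:QX7PLM": {"from": "MSP", "to": "LGA", "gate": "C12", "departs": "14:05"},
}

def find_itinerary(flight_number=None, record_locator=None, scanned_locator=None):
    if flight_number:                       # field 2505
        return ITINERARIES.get("flight:" + flight_number.strip().upper())
    if record_locator:                      # field 2508
        return ITINERARIES.get("locator:" + record_locator.strip().upper())
    if scanned_locator:                     # window 2510: assume the barcode reader has
        # already extracted the record locator from the boarding pass
        return ITINERARIES.get("locator:" + scanned_locator.strip().upper())
    return None
```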
- FIG. 25-3 shows a user interface 2513 .
- Interface 2513 is illustratively displayed after a user has entered an identifier that enables the application to retrieve his flight information.
- Interface 2513 has a main portion 2514 that includes a map.
- the map is of the lower forty-eight states of the U.S.
- the map that is displayed in the window is optionally selected based upon the user's flight plans. For instance, if a user was flying from Texas to France, the map could show the lower forty-eight states and Europe.
- the map includes graphics that indicate the user's departure location 2515, the user's destination location 2516, and the user's flight path 2517; if there are any intermediary layover destinations, the map may also show those.
- the graphics are animated (e.g. the location markers “bounce”).
- interface 2513 may also include a window 2518 that shows the user's flight information in text form.
- Window 2518 is shown in the figure as including the user's departure location, destination location, flight number, gate number, and date and time of the flight's departure.
- Window 2518 optionally includes any flight information (e.g. meal information, layover locations, flight duration, arrival time, etc.).
- Interface 2513 also includes several buttons: a search-for-alternative-flights button 2519, a weather button 2520, an information button 2521, and a customer service agent button 2522.
- FIG. 25-4 shows a window 2523 that is illustratively generated upon a user selecting the information button 2521 .
- Information window 2523 reports information about the user's flight. For instance, in the example shown in the figure, window 2523 states that the user's flight has been delayed because of weather, traffic, and mechanical issues. If the user's flight was on time with no issues, window 2523 could illustratively state that the flight was on schedule or on time.
- Window 2523 optionally includes an additional flight search button 2524 and customer service agent button 2525 .
- FIG. 25-5 shows a customer service window 2526 that is generated upon a user selecting a customer service agent button.
- Window 2526 illustratively displays live, real-time video of a customer service agent who can assist the user.
- this capability may eliminate the need to have customer service agents onsite, which may reduce costs for an airline.
- FIG. 25-6 shows an alternative flights window 2527 that is generated upon a user selecting a flight search button (e.g. button 2519 in FIG. 25-3 or button 2524 in FIG. 25-4).
- Window 2527 shows several flights that a user could take to get to his destination instead of his scheduled flight.
- Each of the different flights is illustratively positioned upon a tile or button 2528 , and a user can select one of the flights by touching the tile or button.
- the user interface is updated such that it shows the new information instead of the information for the previously scheduled flight (e.g. it shows updated departure, destination, layover, time, gate, terminal, etc. information).
- a user may be prompted to input credit card information (e.g. through an RFID reader) to pay for any additional costs for the flight change.
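- one way to sketch this rebooking step is to replace the active itinerary with the selected flight and request payment only when the change costs more than the original booking; the fare fields and payment hook below are assumptions, not part of the disclosure.

```python
# Minimal sketch (assumed fare fields and payment hook): apply the flight selected
# from a tile 2528 and prompt for payment only if an additional cost results.
def rebook(current_itinerary, selected_flight, request_payment):
    fare_difference = selected_flight.get("fare", 0) - current_itinerary.get("fare", 0)
    if fare_difference > 0:
        request_payment(fare_difference)    # e.g. prompt for a card, possibly via an RFID reader
    updated = dict(current_itinerary)
    updated.update(selected_flight)         # new departure, destination, gate, time, etc.
    return updated
```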
- a user is illustratively given weather information about his departure location, his destination location, or both upon selection of weather button 2520 (labeled in FIG. 25-3).
- in one embodiment, a new window is generated and the weather information is given in text form.
- in another embodiment, the information is given in graphical form.
- for instance, the map shown in the user interface is illustratively updated to include graphics that indicate the weather (e.g. a sun, a snowflake, a raindrop, a dark cloud, etc.).
- FIG. 26 is a block diagram of one embodiment of a blending component 2608.
- FIG. 27 is a process flow diagram illustrating the operation of a blending component.
- Blending component 2608 receives multiple data streams, combines or blends them into one data stream, and outputs the combined/blended data stream to one or more multitouch applications. Accordingly, blending component 2608 allows multitouch applications 2614 to treat or view multiple input devices as one input device. In other words, blending component 2608 allows multiple input devices to be used with an application without configuring or programming the application to use multiple input devices.
- blending component 2608 includes an input interface 2610 and an output interface 2612 .
- Input interface 2610 receives data streams or data sets from one or more input devices.
- input interface 2610 receives multitouch data from devices 2602 , receives multitouch and non-touch data from devices 2606 , and receives non-touch data from devices 2604 .
- Multitouch devices 2602 illustratively include any combination of multitouch devices.
- multitouch devices 2602 may comprise multitouch devices using different multitouch technologies (e.g. vision-based or capacitive), multitouch devices having different screen sizes, multitouch devices having different orientations (e.g. vertical vs. horizontal), or any other combination of multitouch devices.
- Non-touch devices 2604 illustratively include mice, keyboards, RFID sensors, object detection sensors, barcode readers, fiducial marker recognition systems, image recognition systems, and any other type of input device.
- Multitouch and non-touch devices 2606 illustratively include devices that generate and output a combination of multitouch and non-touch data.
- a multitouch and non-touch device 2606 has a system that recognizes an identity of an object placed on or near a multitouch screen and also recognizes its location.
- objects such as, but not limited to, blocks include fiducial marks.
- Device 2606 recognizes the positions of the objects/blocks and outputs the corresponding multitouch data.
- Device 2606 also recognizes the fiducial marks and determines the identities of the objects.
- a database optionally connected to blending component 2608 illustratively includes information that associates fiducial marks or other identifiers with information such as, but not limited to, a three-dimensional size, color, weight, etc.
- Multitouch and non-touch devices 2606 illustratively output the multitouch data and other data to the blending component input interface 2610 .
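- as a non-limiting illustration of the combining performed between input interface 2610 and output interface 2612, the sketch below tags each incoming event with the device that produced it and interleaves the per-device streams into a single time-ordered stream. The event fields and device names are assumptions made for the example and are not a required format.

```python
# Minimal sketch (illustrative event format): merge touch and non-touch events from
# several input devices into one stream, so applications can treat them as one device.
import heapq

def blend(streams):
    """streams: iterable of (device_id, events); each event is a dict containing a
    'timestamp' key, and each per-device event sequence is assumed to be time-ordered."""
    tagged = (
        ({"device": device_id, **event} for event in events)
        for device_id, events in streams
    )
    yield from heapq.merge(*tagged, key=lambda e: e["timestamp"])

# Example:
#   wall = [{"timestamp": 0.01, "type": "touch", "id": 1, "x": 0.40, "y": 0.25}]
#   rfid = [{"timestamp": 0.02, "type": "rfid", "tag": "card-123"}]
#   combined = list(blend([("multitouch-wall", wall), ("rfid-reader", rfid)]))
```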
- FIG. 27 is a process flow diagram illustrating the operation of a blending component 2608 .
- an object having an identifier (e.g. a fiducial mark, a 1D or 2D barcode, etc.) is placed on or near the multitouch screen.
- the location of the object is identified.
- a set of multitouch data is generated based on the location of the object relative to the multitouch screen.
- the identifier on the object is recognized.
- data associated with the identifier is retrieved.
- the multitouch data from block 2706 and the retrieved object data from block 2710 are combined or blended together into one data set or data stream.
- the blended data from block 2712 is outputted/transmitted to one or more multitouch or other applications.
- the application or applications interact with the blended data and output graphics, sounds, etc. based on the blended data.
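- read as code, the flow of FIG. 27 is a short pipeline: locate the object, build a multitouch data set from that location, recognize the identifier, retrieve the associated data, blend the two, and hand the result to the application(s). The helper names below are hypothetical stand-ins for whatever device, database, and application interfaces are actually present; only blocks 2706, 2710, and 2712 are named in the text above.

```python
# Minimal sketch (hypothetical helpers): the FIG. 27 pipeline from object placement
# to blended output delivered to one or more applications.
def process_object_event(sensor, object_db, applications):
    x, y = sensor.locate_object()                  # identify the object's location
    touch_data = {"x": x, "y": y}                  # block 2706: multitouch data set
    identifier = sensor.read_identifier()          # recognize the fiducial mark/barcode
    object_data = object_db.get(identifier, {})    # block 2710: retrieve associated data
    blended = {**touch_data,                       # block 2712: one combined data set
               "identifier": identifier,
               **object_data}
    for app in applications:                       # output the blended data
        app.handle_input(blended)                  # applications react to it
    return blended
```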
- embodiments of the present disclosure include multitouch devices, drivers, and applications that may have improved or useful features over existing multitouch devices, drivers, and applications.
- These various embodiments are illustratively practiced individually or in combination with each other. Also, it is to be understood that even though numerous characteristics and advantages of various embodiments have been set forth in the foregoing description, together with details of the structure and function of various embodiments, this detailed description is illustrative only, and changes may be made in detail, especially in matters of structure and arrangements of parts within the principles of the present disclosure to the full extent indicated by the broad general meaning of the terms in which the appended claims are expressed.
Landscapes
- Engineering & Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Multimedia (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
A system includes multiple touch input devices. Each of the touch input devices generates a set of touch data. A blending component receives the sets of touch data and generates a corresponding combined set of touch data that is outputted to a touch input application. In one embodiment, at least one of the multiple touch input devices is a multitouch input device that supports multiple simultaneous touch inputs.
Description
- The present application is based on and claims the benefit of U.S. provisional patent application Ser. No. 61/247,964, filed Oct. 2, 2009, the content of which is hereby incorporated by reference in its entirety.
- In multitouch systems, a user is able to provide input by touching or otherwise contacting multiple areas of a touch screen simultaneously. Multitouch systems provide for opportunities for new types of user interactions and experiences that have not been possible in single touch or non-touch systems. Some multitouch systems have included frustrated total internal reflectance systems and capacitive systems.
- Embodiments of the present invention pertain to a system that includes multiple touch input devices. Each of the touch input devices generates a set of touch data. A blending component receives the sets of touch data and generates a corresponding combined set of touch data that is outputted to a touch input application. In one embodiment, at least one of the multiple touch input devices is a multitouch input device that supports multiple simultaneous touch inputs.
- These and various other features and advantages that characterize the claimed embodiments will become apparent upon reading the following detailed description and upon reviewing the associated drawings.
-
FIG. 1 is a block diagram of a multitouch environment. -
FIG. 2 is a perspective view of a wall type multitouch device. -
FIG. 3 is a process flow diagram illustrating the operation of a multitouch device. -
FIG. 4 is a front view of the wall type multitouch device ofFIG. 2 . -
FIG. 5 is a schematic diagram of a multitouch device light source. -
FIG. 6 is a back view of the wall type multitouch device ofFIG. 2 . -
FIG. 7 is an electrical block diagram of a multitouch device. -
FIG. 8 is a cross-sectional view of the wall type multitouch device ofFIG. 2 . -
FIG. 9 is a simplified perspective view of a parabolic multitouch device. -
FIG. 10 is a simplified front view of an airline gate wall multitouch device. -
FIG. 11 is a perspective view of a table type multitouch device. -
FIG. 12 is a simplified schematic diagram of a lighting scheme for a table type multitouch device. -
FIG. 13 is a cross-sectional view of the table type multitouch device ofFIG. 11 . -
FIG. 14 is a perspective view of a capacitive multitouch system. -
FIG. 15 is a schematic diagram of a multitouch environment that incorporates multiple multitouch devices working together. -
FIG. 16 is a block diagram illustrating the operation of a universal multitouch driver. -
FIG. 17 is a schematic diagram of a universal multitouch driver. -
FIGS. 18-1 , 18-2, and 18-3 are screenshots from an Application Launcher multitouch application. -
FIGS. 19-1 , 19-2, 19-3, 19-4, 19-5, and 19-6 illustrate multitouch gestures that may be utilized with the multitouch applications described in this disclosure. -
FIGS. 20-1 , 20-2, 20-3, 20-4, and 20-5 are screenshots of a Chat and Group Collaboration multitouch application. -
FIGS. 21-1 and 21-2 are screenshots of a Finger Painting multitouch application. -
FIG. 22 is a screenshot of a Falling Debris multitouch application. -
FIG. 23 is a screenshot of a Duck Shot multitouch application. -
FIG. 24 is screenshot of a Text Messaging multitouch application. -
FIGS. 25-1 , 25-2, 25-3, 25-4, 25-5, and 25-6 are screenshots of a Flight Scheduling multitouch application. -
FIG. 26 is a block diagram of a data blending component. -
FIG. 27 is a process flow diagram illustrating the operation of a blending component. - Embodiments of the present disclosure include systems and methods that may be utilized in multitouch environments.
FIG. 1 is a block diagram of one illustrative environment in which some embodiments may be incorporated. It should be noted however thatFIG. 1 is for illustration purposes only and that embodiments are not limited to any particular environment. Embodiments are illustratively incorporated in any multitouch environment. -
FIG. 1 shows amultitouch device 100 that is optionally communicatively coupled to a radiofrequency identification reader 102, anobject detection sensor 104, and abarcode reader 106. As will be described in greater detail later, these input devices illustratively collect information that is utilized in combination with multitouch information from the multitouch device. For instance, in one embodiment, such as in that shown inFIGS. 25-1 to 25-6, amultitouch device 100 is running a flight scheduling application. In that embodiment, users are illustratively able to usebarcode reader 106 to scan their boarding pass and are then able to make changes to their flight plans by making selections on the screen ofmultitouch device 100.FIG. 1 also shows thatdevice 100 is optionally coupled toother input devices 108. This is to illustrate that embodiments are not limited to having any particular other type of input devices connected to a multitouch device and optionally include any other type of input device. - A
multitouch device driver 110 communicates withmultitouch device 100 to obtain multitouch input information collected from a user. For instance,multitouch device 100 may collect input information that includes x-axis coordinates and y-axis coordinates for touch gestures.Device driver 110 obtains that information fromdevice 100. The input information is then passed along to a universalmultitouch device driver 120. As those skilled in the art will recognize,multitouch device drivers 110 commonly have different types of outputs. For example, the format or syntax of outputs may vary (e.g. TUIO and HID). Also for example, the outputs may have different types or amounts of information. For instance, somemultitouch devices 100 only output x and y coordinates whileother devices 100 output x and y coordinates along with a change in x, a change in y, a width, a height, and an acceleration. Universalmultitouch device driver 120 is illustratively able to collect the outputs of many different types ofdevice drivers 110 and convert the outputs into a standardized or universal format. The output ofuniversal driver 120 is then transmitted or collected by one ormore applications 130Applications 130 are illustratively multitouch applications. Several examples ofmultitouch applications 130 are discussed later in this description. Embodiments are not however limited to only multitouch applications and also illustratively include single touch or no touch applications. - Embodiments of the present disclosure are illustratively practiced with any type of multitouch device. Several different types of systems and methods are able to collect multitouch information. Some common types of multitouch devices include those that detect touches utilizing capacitive screens and those that detect touches utilizing a vision based system. Some illustrative embodiments of multitouch devices are described below.
-
FIGS. 2 , 4, 6 and 8 show a walltype multitouch device 200.Device 200 is referred to as a wall system because itsscreen 202 has a relatively large form factor. For example, in one embodiment, theheight 204 ofscreen 202 is forty-six inches, and thewidth 206 ofscreen 202 is eighty inches. Embodiments are not however limited to any particular dimensions, and embodiments illustratively includescreens 202 that have any shape or size. Additionally,multiple devices 200 may be connected together and function as an even bigger wall. -
Device 200 also includes adepth 208.Depth 208 may be any value. In one embodiment, for illustrative purposes only and not by limitation,depth 208 is twenty-six inches. In at least certain embodiments of the present disclosure,depths 208 are able to be kept relatively short by utilizing multiple projectors and/or low throw projectors to display graphics onscreen 202. Embodiments may however have other configurations such as, but not limited to, systems that utilize only one projector and/or those havinglonger depths 208. -
FIG. 3 is a flow diagram showing one method of operating multitouch devices such as, but not limited to,device 200 inFIG. 2 . Atblock 302, a blanket of light is generated over the front ofscreen 202. For instance, inFIG. 2 , a blanket of light is generated betweenscreen 202 anduser 210. Atblock 304, an imaging camera is positioned such that it is able to detect objects illuminated by light blanket. Atblock 306, graphical images are projected ontoscreen 202. Atblock 308, one or more users interact with the graphical images on the screen. For instance, in a flight scheduling application, a map may be displayed on the screen, and a user selects his destination by touching the corresponding portion of the screen. Atblock 310, the imaging camera detects the user's interactions. This information is then used atblock 312 to update the graphics on the screen. For instance, continuing with the flight scheduling application example, once a user touches his destination, the multitouch device may show available flights going to that destination. As is indicated byarrow 313, the process then returns to block 308, and blocks 308, 310, and 312 are repeated as needed. -
FIG. 4 is a schematic diagram of the front ofdevice 200.Device 200 includes multiplelight sources 212 that generate a light blanket over the front ofscreen 202. In one embodiment,light sources 212 are infrared laser diodes. Infrared light is useful in that it is not visually perceivable by human eyes so it does not interfere with graphics displayed on the screen. Embodiments are not however limited to any particular type of light sources and optionally include any type of light source. -
FIG. 5 shows one example of an infrared laserdiode light source 212. InFIG. 5 , the output oflight source 212 is passed through aline filter 214.Filter 214 illustratively changes the light source output such that a beam oflight 216 is formed that has anangle 218 and aheight 217.Different filters 214 may be used to generate any desiredangle 218 andheight 217. For instance, in one embodiment,screen 202 is curved. In such a case, filters that are used in curved portions providesmaller angles 218 than filters used for flat portions. This enables the light blanket to be kept just in front of the screen instead of diverging away from the screen. In one embodiment, flat screens or flat portions of screens use filters that have anangle 218 that is approximately one hundred and twenty degrees and aheight 217 that is approximately two millimeters. Embodiments may however utilize filters providingdifferent angles 218 andheights 217. - Returning to
FIG. 4 ,FIG. 4 shows that thelight sources 212 at the top 220 of the screen and at the bottom 222 of the screen are each spaced apart from each other by adistance 224, and that the light sources at the top 220 are offset from the light sources at the bottom 222 by adistance 225.Distance 225 is optionally one half ofdistance 224. This configuration, along with the “V” shapedlight beam 216 shown inFIG. 5 , illustratively provides a uniform blanket of light in front ofscreen 202. In one particular embodiment, for illustration purposes only and not by limitation,distance 224 is twelve and a half inches anddistance 225 is six and one quarter inches. -
FIG. 6 is a schematic diagram of the back ofdevice 200.Device 200 illustratively includesmultiple projectors 226.Projectors 226 are illustratively low throw projectors. The use of multiple low throw projectors helps to reduce the depth 208 (shown inFIG. 2 ) ofdevice 200 such that it has a relatively slim form factor. For instance, if only one projector was used or if non-low throw projectors were used,projectors 226 would have to be spaced further away fromscreen 202, anddepth 208 would have to be increased. Embodiments of the present disclosure optionally include any type of projector and any number of projectors including one. For example, a system twice as long as the one shown inFIG. 6 may have eight low throw projectors. -
FIG. 6 also shows thatdevice 200 includes animaging camera 228.Imaging camera 228 is not limited to any particular type of camera. The selection ofcamera 228 depends upon how the light blanket in front of the screen is formed.Camera 228 needs to be able to detect objects illuminated by the light blanket and needs to be selected accordingly. In one embodiment, in which infrared light is used to generate the light blanket,imaging camera 228 is either an infrared camera or is a camera fitted with an infrared filter such thatcamera 228 is able to image objects illuminated with infrared light.Camera 228 optionally includes any angle of field of view, frame rate, and resolution, and the selection of the camera is based upon the size of the screen and desired sensitivity of light detection. In one particular example,camera 228 has a field of view from 56 to 75 degrees, frames rates from 60 to 120 hertz, and resolutions from 320 by 240 to 640 by 480 pixels. In some embodiments, such as in the case of adevice 200 having alonger length 206,multiple cameras 228 may be used as needed. -
FIG. 6 further shows thatdevice 200 includes acontrol system 230 and apower source 232 forlights 212.Control system 230 illustratively runs the operations ofdevice 200. For instance,system 230 generates the graphics that are output to the projectors and processes any computations needed by applications running ondevice 200. In one embodiment,system 230 is one computer (e.g. a workstation, server, personal computer, blade, etc.). The exact implementation ofsystem 230 is not however limited to any particular type of system or group of systems. Similarly,device 200 may or may not need apower source 232 depending upon the type oflight source 212. -
FIG. 7 is a simplified electrical block diagram ofdevice 200.Control system 230 operates thelight power source 232 such that it controls the turning on and turning off oflight sources 212. Control system receives the output ofimaging cameras 228 and processes the images to determine the positions of objects illuminate by the light blanket in front of the screen.Control system 230 also generates and supplies theprojectors 226 with graphics that are displayed on the device's screen. -
FIG. 7 shows thatdevice 200 is further optionally connected toother devices 234. These other devices may include an RFID reader, a barcode reader, and an object detection sensor, such as those shown inFIG. 1 .Devices 234 may include any devices that are useful for applications being run ondevice 200.Devices 234 may also include devices that are needed for or useful for the operation ofdevice 200. In one embodiment,devices 234 includes an additional camera that is put in front ofdevice 200. The additional camera generates images that are used to blend the images of the multiple projectors together such that the image displayed on the screen looks as if it were generated by a single projector. -
FIG. 8 is a cross-section ofdevice 200 from the perspective of line 8-8 inFIG. 4 . In one embodiment, such as in the one shown inFIG. 8 , the screen ofdevice 200 includes several components. The screen illustratively includes a piece of transparent material 236 (e.g. glass or plexiglass), a backing/screen material 237, and a light absorbing edge 238 (e.g. an acid washed edge).Backing material 237 provides a surface on which the images ofprojectors 226 are displayed.Material 237 is chosen such that a user on the opposite side of the screen fromprojectors 226 can view the images. For example,material 237 is illustratively any translucent material. -
FIG. 8 shows alight blanket 240 in front of the screen.Blanket 240 illustratively has adepth 241 that extends along the entire surface of the screen of device 200 (i.e. it is a three-dimensional box having the dimensions ofdepth 241 by height 204 (shown inFIG. 2 ) by width 206 (also shown inFIG. 2 )). In an embodiment,light sources 212 are positioned such that their outputs overlap the screen. For example, in one embodiment, the outputs oflight sources 212 have heights (e.g. height 217 inFIG. 5 ) that is two millimeters, and the light sources are positioned such that the one and a half millimeters of the height is in front of the glass, and the other half of a millimeter is directed to thelight absorbing edge 238 that extends along theentire length 206 of the top 220 of the screen and of the bottom 222 of the screen (note:length 206, top 220, and bottom 222 are shown and labeled inFIG. 4 ). In another embodiment,light sources 212 may be positioned such that there is no overlap with the screen or may even be positioned such that there is a gap between the light blanket and the screen. Embodiments are not limited to any particular positioning of the light blanket relative to the screen. -
FIG. 8 also showsimaginary lines 229 emanating fromimaging camera 228 andquasi-imaginary lines 227 emanating fromprojectors 226. Theimaging camera lines 229 represent the field of view of the camera and illustrate that the camera is able to detect illuminated objects over the entire surface of the screen. As was previously mentioned, in another embodiment, multiple cameras are used and the combined field of view of all the cameras covers the entire surface of the screen. Thequasi-imaginary projector lines 227 represent the graphical images generated by the projectors and displayed on the screen. As is indicated in the figure, the projectors each cover only a portion of the screen. For instance, in a four projector embodiment, each projector generates an image that covers approximately one quarter of the screen. -
FIG. 9 is a simplified schematic diagram of another embodiment of a multitouch device,device 900.Device 900 illustratively operates similar todevice 200. One difference betweendevice 900 anddevice 200 is that indevice 900, themultitouch screen 902 and its corresponding light blanket are parabolic in shape as opposed to being rectangular boxes (i.e. the top 920 andbottom 922 ofscreen 902 are parabolic or curved). The parabolic or curved light blanket is illustratively generated by adjusting the dispersion angles (i.e.angle 218 inFIG. 5 ) of thelight sources 212 and thespacing 924 between the light sources. For example, for straight or flat portions of a screen, light sources illustratively have an angle of one hundred and twenty degrees and light sources are spaced twelve and a half inches apart. For parabolic or curved portions of a screen, both the angles and the spacing between light sources are reduced (e.g. less than one hundred and twenty degrees and less than twelve and a half inches). The amount of the reduction depends on the rate of change of the curve. For instance, if the screen curves thirty degrees over a length of ten feet, the angles and the spacings will need to be less than if the screen only curves ten degrees over a length of ten feet. In an embodiment, the angles are reduced by connecting the light sources to line filters that provide the desired angle. These adjustments to the angles and the spacings illustratively help to ensure that the light blanket is located just above and follows the screen, as opposed to diverging away from the screen. -
FIG. 9 also has threeboxes 928. These boxes illustrate one potential placement of imaging cameras for device 900 (i.e. the boxes correspond to the centers of the fields of view of the cameras). As was previously mentioned, more or fewer imaging cameras may be used depending upon the size of the screen and the fields of view of the cameras. In one embodiment ofdevice 900, its height is the same as the height of device 200 (e.g. forty-six inches tall) but itswidth 906 is three times as long (e.g. device 200's width is eighty inches anddevice 900's width is two hundred and forty inches). In such a case,device 900 illustratively uses the same type of projectors but requires twelve instead of four. Embodiments ofdevices -
FIG. 10 is a simplified schematic view of another embodiment of a wall type multitouch device,device 1000. In one embodiment,device 1000 is an airline gate wall and is used at an airport. The screen ofdevice 1000 has two portions, anupper portion 1001 and alower portion 1002.Device 1000 illustratively includesobject detection sensors 104. As is shown in the figure, sensors are optionally grouped into pairs of two that are aligned in the vertical direction.Sensors 104 can be used to detect the presence of a person who may wish to interact withdevice 1000. In one configuration, if only the lower one of the twosensors 104 in a pair of sensors detects a person,device 1000 interprets that result as indicating that a child is present in front of its screen.Device 1000 illustratively responds by launching anapplication 131 onlower screen potion 1002 near where the child is detected.Application 131 can be any type of application. In one example, for illustration purposes only and not by limitation,application 131 is a game application that a child may enjoy playing with. If both of the twosensors 131 in a pair of sensors detects an object,device 1000 interprets that result as indicating that an adult is present in front of its screen.Device 1000 illustratively responds by launching anapplication 130 onupper screen portion 1001 near where the adult is detected. Again,application 130 can be any type of application. In one embodiment, in whichdevice 1000 is an airline gate wall,application 130 is a flight scheduling application. - As is shown in
FIG. 10 ,device 1000 has several pairs ofsensors 104. This illustrates thatdevice 1000 supports multiple people interacting with the device simultaneously. In the specific example shown inFIG. 10 ,device 1000 has four pairs ofsensors 104. Embodiments may of course have more or fewer pairs of sensors including no sensors.FIG. 10 shows thatdevice 1000 has aheight 1004 and awidth 1006.Height 1004 andwidth 1006 include any values and embodiments can be made to be any desired size to support the simultaneous use by any number of people. In one embodiment, for illustration purposes only,height 1004 is ninety inches andwidth 1006 is between nine and eighteen feet. -
Device 1000 further optionally hasmultiple barcode scanners 106 andmultiple RFID readers 102.Scanners 106 may be useful forcertain applications 130. For instance, in the case ofapplication 130 being a flight scheduling application,barcode scanner 106 can be used to scan a boarding pass, a confirmation e-mail, etc.RFID readers 102 can similarly be used along withapplications 130.Readers 102 could for example be used to read credit card information or to read user identification information. The user identification information could be used to log into anapplication 130. -
FIGS. 11 , 12, and 13 illustrate yet another embodiment of a multitouch device,device 1100. In one embodiment,device 1100 is a table type multitouch device and has a smaller form factor than the devices shown inFIGS. 2 , 4, 6, 8, 9, and 10. For instance,device 1100'swidth 206 andheight 204 are illustratively each in the range of one to four feet. - In one embodiment, the
screen 1102 ofdevice 1100 is surrounded on all four of its sides bylight sources 1112.FIG. 12 is a simplified schematicdrawing showing screen 1102 andlight sources 1112 in more detail.Screen 1102 includes a transparent material 1136 (e.g. glass or plexiglass) and many small reflectors 1137 (e.g. small pieces of aluminum or chrome) embedded and disbursed throughoutmaterial 1136.Reflectors 1137 illustratively have random orientations such that they reflect light in all directions. In one embodiment,light source 1112 is a ribbon oflight emitting diodes 1113. In one specific example, for illustration purposes only and not by limitation,diodes 1113 are three quarter inch infrared LEDs. Embodiments are not however limited to any particular type of light source.Light source 1112 generates light (e.g. IR light) that is transmitted throughtransparent material 1136 and is reflected byreflectors 1137.Light source 1112 is illustratively chosen such that light is transmitted throughout the entire volume of the screen (i.e. throughout its entire width by height by thickness). -
FIG. 13 is a simplified cross-sectional or side view ofdevice 1100. As is shown in the figure, aprojector 1126 projects an image onto a backing/screen material 1138 that is attached to the back ofglass 1136 having embedded reflectors 1137 (shown inFIG. 12 ). Auser 1110 interacts with images on the screen by touching the surface of the screen. Whenuser 1110 touches the screen, light collects or becomes concentrated at the point of contact (i.e. the area of the screen being touched).Imaging camera 1128 illustratively has a field of view that covers the retire surface of the screen and is able to detect any touches (i.e. it is able to image the concentrated areas of light). For instance, in the case oflight source 1112 generating infrared light,camera 1128 is optionally an infrared camera or a camera equipped with an IR filter. - In
FIG. 13 ,device 1100 is shown as having only oneprojector 1126 and oneimaging camera 1128. In other embodiments,device 1100 may havemultiple projectors 1126 and/ormultiple imaging cameras 1128 as is needed. Additionally,multiple devices 1100 are illustratively connected together and function as a single unit. For instance, up to fivedevices 1100 are optionally connected together and function as a single unit. - The multitouch devices shown in the previous figures have been vision based multitouch systems in that they detect touches utilizing optical imaging. Embodiments of the present disclosure are not however limited to only vision based systems. Embodiments are illustratively incorporated in or practiced with any type of multitouch device.
FIG. 14 illustrates another type of multitouch system, acapacitive system 1400. - In the example shown in
FIG. 14 , a multitouchcapacitive screen overlay 1402 is positioned in front of a television or monitor 1404. In another embodiment, the multitouch capability and display device are not separate pieces and are illustratively integrated into one unit. A control system 1406 (e.g. a computer) is communicatively coupled to television/monitor 1404 and tomultitouch screen overlay 1402. Control system 1406 generatesgraphical output 1411 that is transmitted to and displayed on television/monitor 1404. A user interacts with the graphics on television/monitor 1404 by touchingcapacitive screen overlay 1402.Overlay 1402 then sendsmultitouch output 1412 back to control system 1406 that indicates the user's touches. Control system 1406 then uses themultitouch output 1412 to update its graphics and sends the updated graphics back to television/monitor 1404. - It is also worth noting that any number of multiple multitouch systems are illustratively communicatively coupled together and work to run applications.
FIG. 15 illustrates one example of such an environment. The environment inFIG. 15 has three multitouch devices, awall type device 1510 in a living room, atable type device 1530 in the living room, and a secondwall type device 1520 in a bedroom. Devices are illustratively coupled to each other through an intranet/router 1541 and are also coupled to theinternet 1542. - In one embodiment, the
screen 1531 oftable device 1530 has multiple windows,windows 1532 andwindow 1533.Window 1533 illustratively includes television programming of guide information such as, but not limited to, a listing of television stations and an indication of what programs will be on the stations at different times and dates.Windows 1532 are illustratively television program windows. For example, a user may select four different programs fromguide window 1533 and each of the four programs is displayed in itsown window 1532. The user could then control what is being displayed on the screens of thewall type devices windows 1532 and push or flick it in the direction of one of the wall devices, and the program being displayed in the window would then be displayed on the wall device. In another embodiment,table device 1530 generates audio output for headphones and the selection of one or more ofwindows 1532 generates one or more channels of headphone audio output (i.e. the audio corresponding to the television program being displayed in the selected windows). Additionally, the environment shown inFIG. 15 may have personal video recording capability, and a user is able to control the recording and playing of video utilizing the multiple multitouch devices. For example, for illustration purposes only and not by limitation, a user could touch the screen oftable device 1530 to pause or rewind the program being displayed onwall device 1510. - The multiple multitouch device environment in
FIG. 15 is not however limited to running any particular application and is illustratively used in running any application. Other illustrative applications include making or placing phone calls, making or placing video calls usingoptional cameras - As was previously mentioned in reference to
FIG. 1 , multitouch device drivers commonly output their touch information in different formats or syntax. Their various types of outputs may also have different kinds of information. For example, one multitouch device driver may only output an x and a y coordinate of a touch, and another device driver may have x and y coordinates as well as a change in x and a change in y (i.e. dx, dy) These varying forms of multitouch output can make implementing multitouch applications difficult and/or costly. For example, if an application developer wants to make a multitouch application that can be used on multiple different types of multitouch devices, he may have to make and maintain several different versions of the same application, with each of the versions being specially tailored to accommodate for a particular device driver. - In one aspect of the present disclosure, a universal multitouch driver is provided that eliminates or educes the need to specially tailor applications to multiple types of device drivers. The universal multitouch driver is able to receive and interpret the outputs of multiple different types of device drivers. The universal multitouch driver then converts the output it receives into a standardized format such that regardless of the format or type of information that it receives, it consistently outputs multitouch information in the same format and having the same type of information. Consequently, multitouch application developers do not have to make or maintain multiple versions of multitouch applications to implement an application across multiple types of multitouch devices. Instead, developers only need to make applications that are able to use the multitouch information from the universal multitouch driver.
-
FIG. 16 is a block diagram illustrating the operation of a universal multitouch driver. Atblock 1602, a user interacts with a multitouch screen (e.g. the user touches the screen). Atblock 1604, a multitouch device driver generates touch data based upon the user interaction. As will be appreciated by those skilled in the art, this touch data can include several different parameters depending upon the device driver. Touch data commonly includes a touch identifier (e.g. touch # 1,touch # 2, etc.) and x-axis and y-axis coordinates that correspond to the location on the multitouch screen that was touched. Touch data may also include other types of information such a change in the x-axis coordinate from the previously sent data (e.g. dx), a change in the y-axis coordinate from the previously sent data (e.g. dy), a width of the touch (e.g. w), a height of the touch (e.g. h), and an acceleration (e.g. m). - At
block 1606, the universal multitouch driver obtains the touch data from the device driver. In one embodiment, the device driver has an application programming interface, and the universal driver communicates with that interface to retrieve the data. In another embodiment, the device driver automatically sends the data to the universal driver. Embodiments are not limited to any specific manner of obtaining the data. - At
block 1607, the universal multitouch driver optionally stores the data. For instance, the universal multitouch driver optionally stores the data to a volatile (e.g. DRAM) or non-volatile (e.g. flash or hard disk drive) memory. - At
block 1608, the universal multitouch driver calculates any missing parameters. As was mentioned previously, different device drivers do not always output the same types of information. One device driver may for example only output a touch identifier, an x-axis coordinate, a y-axis coordinate, a width, and a height. Universal multitouch driver optionally retrieves the data stored atblock 1607 and uses the current information and the stored information to generate a full set of touch data parameters (i.e. the touch identifier, x, y, dx, dy, w, h, and m). Alternatively or in addition to calculating missing parameters, the universal multitouch driver may fill in one or more missing parameters with a default value. For instance, if the device driver does not provide a width or a height, the universal driver illustratively uses a default value for those parameters (e.g. 0 or 1). - At
block 1610, the universal multitouch driver formats or packages the touch data, including any calculated or default values, into a standardized format. In one embodiment, the standardized format follows the “User Datagram Protocol” or “UDP” network protocol. In another embodiment, the standardized format follows the “Extensible Markup Language” or “XML” rules. In yet another embodiment, a universal multitouch driver formats the touch data and packages it using multiple different formats simultaneously (e.g. it packages the same touch data in both UDP and XML simultaneously). Embodiments are not however limited to any particular standardized format. - At
block 1612, the universal multitouch driver transfers the standardized touch data to one or more multitouch applications. Embodiments are not limited to any particular method of transferring the data. For instance, the universal multitouch driver may send the data to the application, or alternatively the application may retrieve the data from the universal multitouch driver. In one specific example, for illustration purposes only and not by limitation, universal multitouch driver sends UDP formatted data to the UDP port 3000 and XML formatted data to the TCP port 3333. Also, as was previously alluded to, the universal multitouch driver may optionally transfer data in multiple ways simultaneously (e.g. it sends data to both port 3000 and 3333 simultaneously). - At block 1614, the multitouch application utilizes the standardized user touch data that it received from universal multitouch driver. The process then illustratively repeats itself and returns to block 1602 where a user interacts with a multitouch screen.
-
FIG. 17 is a block diagram illustrating one embodiment of auniversal multitouch driver 1700.Universal multitouch driver 1700 illustratively has adevice driver interface 1702 that is communicatively coupled to anapplication programming interface 1752 ofmultitouch device driver 1750. In an embodiment,universal multitouch driver 1700 utilizes this connection to obtain the touch data from the device driver.Universal multitouch driver 1700 also has aprocessing module 1704 and astorage module 1706.Module 1704 is utilized in performing any computations, processing, or logical functions that may be required. For instance,processing module 1704 may perform calculations to determine additional parameters as was described in relation to block 1608 inFIG. 16 .Storage module 1706 is utilized in performing any data storage, data management, or data retrieval functions such as, but not limited to, storing touch data to calculate change in x and change in y parameters (e.g. blocks 1607 and 1608 inFIG. 16 ). Universal multitouch driver then has anapplication interface 1708 that is communicatively coupled to aninterface 1782 of one or moremultitouch applications 1780. This connection is illustratively used in transferring (e.g. sending or transmitting) the standardized touch data to the multitouch applications. - The methods and the universal multitouch drivers described above illustratively help to reduce problems associated with implementing multitouch applications across multiple types of multitouch devices having various types of drivers. Instead of making and maintaining multiple versions of the same application, developers only need to make and maintain one version of any given application. It should be noted however that embodiments of the present disclosure are not limited to multitouch systems or environments having universal multitouch drivers. Embodiments are illustratively practiced without a universal multitouch driver, and applications illustratively receive touch data directly from device drivers. It should also be noted that in systems having a universal multitouch driver, the universal multitouch driver is optionally bypassable. For instance, in a situation in which the data from the device driver is already in a suitable format for an application, the data may bypass the universal multitouch driver and go directly to the application.
- Embodiments of the present disclosure also illustratively include a number of multitouch applications. These applications may be practiced in environments such as those previously discussed. The applications are however not limited to any particular environment and may also be practiced in environments different than those previously discussed.
- The first application that will be discussed is illustratively an Application Launcher application. As will become more clear shortly, it is used to manage other applications on a multitouch device.
FIGS. 18-1 , 18-2, and 18-3 illustrate various states of Application Launcher, andFIGS. 19-1 , 19-2, 19-3, 19-4, 19-5, and 19-6 illustrate multitouch gestures that may be used along with Application Launcher and the other applications disclosed herein. -
FIG. 18-1 shows Application Launcher from a state in which it is only showing background graphics orwallpaper 1802 on the screen of a multitouch device. In one embodiment, graphics orwallpaper 1802 are animated.FIG. 18-2 shows Application Launcher from a state in which anapplication menu 1804 has been activated.FIG. 19-1 shows two alternative gestures that may be used to activate the application menu. One of the gestures includes a single touch (e.g. a tap) followed by a touch that makes a line to the right. The other illustrative activation method includes making a clockwise rotation gesture on the screen. -
Application menu 1804 illustratively has multiple icons that correspond to other applications that may be ran on the multitouch device. In the example shown inFIG. 18-2 ,application menu 1804 includes icons corresponding to or representing six applications. They are a Chat andGroup Collaboration application 1811, aFinger Painting application 1812, a FallingDebris Game application 1813, a DuckShot Game application 1814, aText Messaging application 1815, and aFlight Scheduler application 1816. Each of applications 1811-1816 are discussed in greater detail below. In the example shown in the figures, the applications have text based icons. However, in another embodiment, the icons are graphical icons that provide some indication of the corresponding applications (e.g. a picture of a plane for a flight scheduling application). -
FIG. 18-3 shows Application Launcher from a state in which two applications are launched and displayed on the screen of a multitouch device. In the illustrative example shown in the figure, the first application that has been launched is in onewindow 1821 of the screen, and the second application that has been launched is in asecond window 1822 of the screen. Each of the two applications was illustratively launched by tapping on the corresponding icon 1811-1816 inapplication menu 1804. In an embodiment, such as that shown inFIG. 18-3 , multiple multitouch applications are able to be ran on one device simultaneously. Application Launcher illustratively manages and allocates system resources to enable the multiple simultaneous applications. -
FIGS. 19-2 , 19-3, 19-4, 19-5, and 19-6 show additional gestures that may be used in applications according to the present disclosure.FIG. 19-2 represents gestures for closingapplication menu 1804. The gestures are essentially the opposite of the gestures shown inFIG. 19-1 for opening or launchingapplication menu 1804. -
FIG. 19-3 represents a gesture for increasing the size of a window. A user illustratively uses two fingers to touch two spots on an object displayed on a multitouch screen and then keeps touching the screen with the fingers while moving the fingers away from each other. This gesture is illustratively able to make any window larger. For example, a user can increase the size ofapplication windows application menu 1804. Additionally, in an embodiment, a user is able to use the gesture to magnify wallpaper or background graphics. -
FIG. 19-4 represents a gesture for decreasing the size of a window or de-magnifying background graphics. A user illustratively uses two fingers to touch two spots on an object and then keeps touching the screen with the fingers while moving the fingers towards each other. -
FIGS. 19-5 and 19-6 represents gestures for rotating windows or background graphics. The gesture inFIG. 19-5 corresponds to clockwise rotation, and the gesture inFIG. 19-6 corresponds to counter-clockwise rotation. As indicated in the figures, the gestures include touching two spots on the screen and moving the fingers together while making parallel lines. -
FIGS. 20-1 , 20-2, 20-3, 20-4, and 20-5 show several illustrative examples of graphical user interfaces (e.g. screenshots) for a multitouch Chat and Group Collaboration application. The application is illustratively launched by selecting thecorresponding icon 1811 from Application Launcher application menu 1804 (shown inFIG. 18 ). -
FIG. 20-1 shows afirst user interface 2001 that is illustratively utilized to log a user into the application. The log inuser interface 2001 includesbackground graphics 2002 and a log inwindow 2004.Background graphics 2002 can include any graphics desired by the user.Background graphics 2002 may also optionally include animated graphics. Log inwindow 2004 is illustratively used to collect information from a user such that the application is able to identify the user. This may be useful for example for retrieving a users saved information (e.g. saved preferences, friend list, etc.) or for restricting access to the application to only certain people (e.g. paying subscribers to the service). Embodiments of log inwindow 2004 may have any features and collect any type of information. In the specific example shown inFIG. 20-1 , log inwindow 2004 has a title orheader 2005, aclosing button 2006, a first side orportion 2007 that enables a user to log into the application as a guest (e.g. a person that enters another person's meeting), and a second side orportion 2008 that enables a user to log into the application as a host (e.g. a person holding a meeting that others attend). Theguest portion 2007 has a guestuser input field 2009 to collect identifying information (e.g. a user name or handle) from a person logging in as a guest, a label orheader 2010 that describes or identifiesinput field 2009, and abutton 2011 to submit the user's identifying information and to log into the application as a guest. Thehost portion 2008 has a hostuser input field 2012 to collect identifying information (e.g. username, email address, handle, etc.) from a user who wishes to log in as a host, a label orheader 2013 that describes or identifiesinput field 2012, a second hostuser input field 2014 that collects authentication information (e.g. a password, PIN, etc.), a label orheader 2015 that describes or identifiesinput field 2014, and abutton 2016 to submit the user's identifying/authentication information and to log into the application as a host. -
FIG. 20-2 shows asecond user interface 2020 of the Chat and Group Collaboration application.Interface 2020 is illustratively displayed once a user has submitted his identification and/or authentication information to the application through a log in window (e.g. through log inwindow 2004 inFIG. 20-1 ), and the application has successfully verified the information (e.g. that the user is a registered user).Interface 2020 include auser list window 2021.Window 2021 has an identification label orheader 2022 that helps a user understand whatwindow 2021 is, aclosing button 2023 to closewindow 2021, and multiple user identification fields 2024. Eachuser identification field 2024 includes a text and/or image user identifier 2025 (e.g. a user name, a real name, a picture of the user, a handle, etc.) and astatus field 2026 that indicates a status of the user (e.g. online/available or offline/not available). -
FIG. 20-3 shows athird user interface 2028 of the Chat and Group Collaboration application.Interface 2028 is illustratively displayed once a user has selected (e.g. touched) one or more of the user identification fields 2024. Upon the selection of a user, acorresponding communication window 2030 is displayed on the interface.Communication window 2030 includes an identification field orheader 2031 that describes a window (e.g. it identifies a selected user) and aclosing button 2032 to terminate the communication session with the selected user.Communication window 2030 further has multiple buttons, 2033, 2034, and 2035 that allow a user to select various methods of communicating with the selected user.Button 2033 corresponds to communicating by video.Button 2034 corresponds to communicating by voice, andbutton 2035 corresponds to communicating by text. One or more buttons are optionally selected to communicate with the selected user using the corresponding methods.Window 2030 further optionally includes twovideo portions text portion 2038.Video portions Text portion 2038 may be used for communicating by text.Portion 2038 may include, for example, a series of type written messages along with identifiers that show which user generated each of the messages. - It is worth mentioning again that in an embodiment, that the Chat and Group collaboration application, as well as the rest of the applications described in this specification, have multitouch capabilities. The applications illustratively support the use of multitouch gestures such as, but not limited to, those shown in
FIGS. 19-1 , 19-2, 19-3, 19-4, 19-5, and 19-6. This allows for windows, backgrounds, and any object within an application to be manipulated or controlled by the multitouch gestures. For instance, windows can be increased or decreased in size or rotated utilizing multitouch gestures. -
Window 2030 inFIG. 20-3 further includes adrawing button 2039 that enables users to communicate by drawing images on their multitouch screens. Upon selection ofbutton 2039, theuser interface 2040 shown inFIG. 20-4 is illustratively displayed.Interface 2040 includes adrawing window 2042.Window 2042 has a multitouch shareddrawing area 2043.Drawing area 2043 is a shared object in that the same image ofarea 2043 is displayed on the multitouch screens of each of the users logged into the application (i.e. users can see what other users draw).Window 2043 further optionally hasmultiple buttons 2044 that provide additional features.Buttons 2044 may include features that are useful in drawing inarea 2043. For instance,buttons 2044 may include a button to draw inarea 2043 in a pencil or pen format, a button to draw inarea 2043 with a paint brush format, and/or a button to draw inarea 2043 with a spray paint format. -
FIG. 20-5 shows another user interface 2050 that may be included with a Chat and Group Collaboration application. Interface 2050 includes a map window 2052. Map window 2052 is illustratively shared amongst all of the users logged into the system. Each user is able to control or manipulate the map shown in the window by using multitouch gestures. For instance, users may zoom in on a location utilizing the gesture shown in FIG. 19-4, zoom out on a location utilizing the gesture shown in FIG. 19-3, and/or rotate the view of a location utilizing the gestures shown in FIGS. 19-5 and 19-6. Additionally, the window 2052 itself can be resized, rotated, etc. utilizing multitouch gestures. -
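The zoom and rotate gestures referenced above are defined only pictorially in FIGS. 19-3 through 19-6; the disclosure gives no formula for them. One conventional way to derive a zoom factor and rotation angle from two tracked touch points is sketched below (function and variable names are illustrative only).

```python
import math

def pinch_and_rotate(p1_old, p2_old, p1_new, p2_new):
    """Given the previous and current positions of two touch points, return
    (scale_factor, rotation_degrees) to apply to the map view: spreading the
    fingers apart zooms in, and twisting them rotates the view."""
    def dist(a, b):
        return math.hypot(b[0] - a[0], b[1] - a[1])

    def angle(a, b):
        return math.degrees(math.atan2(b[1] - a[1], b[0] - a[0]))

    scale = dist(p1_new, p2_new) / dist(p1_old, p2_old)
    rotation = angle(p1_new, p2_new) - angle(p1_old, p2_old)
    return scale, rotation

# Two fingers move from 100 px apart to about 150 px apart while twisting:
print(pinch_and_rotate((0, 0), (100, 0), (0, 0), (140, 53)))  # ~ (1.497, 20.7)
```
-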
FIGS. 21-1 and 21-2 show graphical user interfaces of a multitouch Finger Painting application. The application is illustratively launched by selecting the corresponding icon 1812 from the Application Launcher application menu 1804 (shown in FIG. 18). Interface 2101 in FIG. 21-1 shows the painting area 2104 of the user interface before anything has been painted in it, and interface 2102 shows the user interface after some squiggly lines have been painted in painting area 2104. Painting area 2104 illustratively has background graphics that look like a painting canvas or a piece of paper. Area 2104, however, may include any type of graphics (e.g. one solid color). -
Interfaces 2101 and 2102 include a plurality of color selection buttons 2106. A user illustratively selects one or more of the colors. The selected color is then activated, and when a user touches painting area 2104, the touched location is painted with the selected color. Interfaces 2101 and 2102 may also have additional buttons such as, but not limited to, a clear button 2108 to erase the painting/drawing in area 2104 and/or a closing button 2109 to terminate the application. -
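As a rough, non-authoritative sketch of the paint-on-touch behavior just described, the painting area can be modeled as a list of colored dabs, one per simultaneous touch point. The class and method names below are invented for illustration; only the reference numerals in the comments come from the figures.

```python
from typing import List, Tuple

class PaintingArea:
    """Minimal model of painting area 2104: each touch point deposits a dab
    of the currently selected color."""

    def __init__(self) -> None:
        self.selected_color = "black"
        self.dabs: List[Tuple[float, float, str]] = []   # (x, y, color)

    def select_color(self, color: str) -> None:          # a color button 2106 was touched
        self.selected_color = color

    def on_touches(self, points: List[Tuple[float, float]]) -> None:
        # Multitouch: every simultaneous touch point paints independently.
        for x, y in points:
            self.dabs.append((x, y, self.selected_color))

    def clear(self) -> None:                              # clear button 2108 was touched
        self.dabs.clear()

canvas = PaintingArea()
canvas.select_color("blue")
canvas.on_touches([(0.2, 0.3), (0.8, 0.7)])   # two fingers painting at once
print(canvas.dabs)   # [(0.2, 0.3, 'blue'), (0.8, 0.7, 'blue')]
```
-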
FIG. 22 shows a graphical user interface 2201 of a Falling Debris multitouch game application. The application is illustratively launched by selecting the corresponding icon 1813 from the Application Launcher application menu 1804 (shown in FIG. 18). In the game, falling debris 2202 moves from the top of the interface toward the bottom of the interface. Debris 2202 has tails 2203 that represent the previous locations of the debris. The object of the game is for a user to trigger explosions 2205 to destroy the debris before it hits houses 2204. In an embodiment, explosions 2205 are triggered by a user touching the screen (i.e. an explosion occurs where the screen is touched). Given the multitouch capabilities of the application, a user is able to touch multiple spots on interface 2201 simultaneously, thus triggering multiple simultaneous explosions 2205. -
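A minimal sketch of the touch-to-explosion logic, assuming a hypothetical blast radius and normalized coordinates (neither of which is specified in the disclosure): each simultaneous touch point detonates at that point, and any debris within the radius is destroyed.

```python
import math
from typing import List, Tuple

def resolve_touches(debris: List[Tuple[float, float]],
                    touches: List[Tuple[float, float]],
                    blast_radius: float = 0.05) -> List[Tuple[float, float]]:
    """Each touch point triggers an explosion at that point; any debris within
    the blast radius is destroyed. Several simultaneous touches are handled in
    one pass, matching the multitouch behavior described above."""
    survivors = []
    for dx, dy in debris:
        hit = any(math.hypot(dx - tx, dy - ty) <= blast_radius for tx, ty in touches)
        if not hit:
            survivors.append((dx, dy))
    return survivors

# Three pieces of debris, two simultaneous touches destroy the first two:
print(resolve_touches([(0.10, 0.20), (0.50, 0.50), (0.90, 0.10)],
                      [(0.11, 0.21), (0.52, 0.49)]))   # -> [(0.9, 0.1)]
```
-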
FIG. 23 shows a graphical user interface 2301 of a Duck Shot multitouch game application. The application is illustratively launched by selecting the corresponding icon 1814 from the Application Launcher application menu 1804 (shown in FIG. 18). In the game, different types of objects, such as, but not limited to, ducks 2302, fish 2303, birds 2304, and/or octopuses 2305, are intermittently displayed on the interface. The objects optionally include targets or bull's-eyes that indicate that they are to be "shot." In one embodiment, the application is displayed on a relatively large multitouch device (e.g. 80″ by 46″) and a user throws bean bags at the device's multitouch screen. In another embodiment, the application is displayed on a smaller multitouch device, and a user "shoots" the objects by touching them with his fingers. Upon a successful shot, interface 2301 illustratively displays points 2306 that the user is awarded. Similarly, display 2301 may display negative points 2307 that are deducted from the user's score upon an unsuccessful shot. Because of the multitouch capability, a user is able to shoot multiple objects simultaneously and/or multiple users can play simultaneously, with each of the users being able to shoot at the same time. -
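Scoring for simultaneous shots (whether finger touches or bean-bag impacts) can be sketched as a simple hit test against the displayed targets. The target table, radii, and point values below are hypothetical; the disclosure only states that points are awarded for hits and deducted for misses.

```python
from typing import Dict, List, Tuple

# Hypothetical on-screen targets: name -> (x, y, radius, points awarded if hit)
TARGETS: Dict[str, Tuple[float, float, float, int]] = {
    "duck":    (0.30, 0.40, 0.05, 100),
    "fish":    (0.60, 0.70, 0.05, 150),
    "octopus": (0.80, 0.20, 0.05, 200),
}
MISS_PENALTY = -25   # hypothetical deduction for an unsuccessful shot

def score_shots(shots: List[Tuple[float, float]]) -> int:
    """Score every simultaneous shot: a shot landing inside a target's radius
    earns that target's points, and a miss is penalized."""
    total = 0
    for sx, sy in shots:
        hit = next((pts for (tx, ty, r, pts) in TARGETS.values()
                    if (sx - tx) ** 2 + (sy - ty) ** 2 <= r ** 2), None)
        total += hit if hit is not None else MISS_PENALTY
    return total

print(score_shots([(0.31, 0.41), (0.10, 0.10)]))   # one hit, one miss -> 75
```
-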
FIG. 24 shows a graphical user interface 2401 of a Text Messaging multitouch application. The application is illustratively launched by selecting the corresponding icon 1815 from the Application Launcher application menu 1804 (shown in FIG. 18). The application illustratively includes two windows within its user interface. The first window 2402 is a text messaging window. In the example shown in the figure, text messaging window 2402 includes a label or header 2403, a closing button 2404, a phone number field 2405, a phone number field label 2406, a subject field 2407, a subject field label 2408, a message body field 2409, a message body field label 2410, and a button 2412 to send the message. In other implementations, text messaging window 2402 may have more or fewer fields and labels than what is shown in the figure. - The second window in
user interface 2401 is a user input window 2411. In the example shown in the figure, the user input window 2411 is a QWERTY keyboard that has multitouch capability (e.g. a user could touch the "Shift" key and a letter key simultaneously to type the capitalized version of the letter; a sketch of this chorded behavior follows the next paragraph). A user illustratively utilizes the multitouch QWERTY keyboard displayed on the screen to fill out the fields of text messaging window 2402. User input window 2411 is not, however, limited to QWERTY keyboards and illustratively includes any other type of input mechanism that can collect the required information from a user. - As was previously mentioned, all of the windows in all of the multitouch applications discussed in this specification have multitouch capabilities such as, but not limited to, resizing and rotating windows. In the case of the Text Message application, for example, both the
user input window 2411 and the text messaging window 2402 can be resized, rotated, moved, etc. utilizing multitouch gestures. -
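Referring back to the multitouch QWERTY keyboard described above, here is a minimal, non-authoritative sketch of the Shift-plus-letter chord: because the on-screen keyboard receives simultaneous touch points, a letter touched while "Shift" is also held can be emitted in its capital form. The function name and key labels are invented for illustration.

```python
def key_for_touches(touched_keys: set[str]) -> str | None:
    """Return the character to emit for the currently touched on-screen keys.
    A letter touched while 'Shift' is also held produces its capital form."""
    letters = [k for k in touched_keys if len(k) == 1 and k.isalpha()]
    if not letters:
        return None
    letter = letters[0]
    return letter.upper() if "Shift" in touched_keys else letter.lower()

print(key_for_touches({"Shift", "a"}))  # -> "A"
print(key_for_touches({"a"}))           # -> "a"
```
-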
FIGS. 25-1, 25-2, 25-3, 25-4, 25-5, and 25-6 show several graphical user interfaces of a Flight Scheduling application. The application is illustratively launched by selecting the corresponding icon 1816 from the Application Launcher application menu 1804 (shown in FIG. 18). FIG. 25-1 shows a user interface 2501 that is illustratively the background of the application or the default interface of the application. The interface illustratively has background graphics 2502 that are displayed. Graphics 2502 are optionally animated graphics in that they move. -
FIG. 25-2 shows a user interface 2503 that is used to gather information from a user. Interface 2503 illustratively has several windows that enable a user to provide various types of information. The ability to use multiple types of information may be more convenient for a user than requiring the user to know any one specific piece of information. A first window included within interface 2503 is a flight number window 2504. A user illustratively is able to retrieve his flight information by inputting the flight number into flight number field 2505. A second window is a record locator window 2507. It has a field 2508 that enables a user to enter his record locator (e.g. a code given by the airline). A third window is a boarding pass window 2510. It has a field 2511 that collects boarding pass information from a user. In one embodiment, the multitouch device running the application illustratively has a bar code reader, and a user is able to input his information by scanning his boarding pass. The input fields optionally each include a corresponding label, 2506, 2509, and 2512, that indicates to a user what type of information is to be entered into the fields. -
FIG. 25-3 shows a user interface 2513. Interface 2513 is illustratively displayed after a user has entered an identifier that enables the application to retrieve his flight information. Interface 2513 has a main portion 2514 that includes a map. In the example shown in the figure, the map is of the lower forty-eight states of the U.S. The map that is displayed in the window is optionally selected based upon the user's flight plans. For instance, if a user were flying from Texas to France, the map could show the lower forty-eight states and Europe. The map includes graphics that indicate the user's departure location 2515, the user's destination location 2516, and the user's flight path 2517, and if there are any intermediate layover destinations, the map may also show those. In an embodiment, the graphics are animated (e.g. the location markers "bounce"). - In addition to showing a graphical representation of the user's flight information,
interface 2513 may also include a window 2518 that shows the user's flight information in text form. Window 2518 is shown in the figure as including the user's departure location, destination location, flight number, gate number, and date and time of the flight's departure. Window 2518 optionally includes any flight information (e.g. meal information, layover locations, flight duration, arrival time, etc.). -
Interface 2513 also includes several buttons: a search-for-alternative-flights button 2519, a weather button 2520, an information button 2521, and a customer service agent button 2522. FIG. 25-4 shows a window 2523 that is illustratively generated upon a user selecting the information button 2521. Information window 2523 reports information about the user's flight. For instance, in the example shown in the figure, window 2523 states that the user's flight has been delayed because of weather, traffic, and mechanical issues. If the user's flight was on time with no issues, window 2523 could illustratively state that the flight was on schedule or on time. Window 2523 optionally includes an additional flight search button 2524 and customer service agent button 2525. -
FIG. 25-5 shows a customer service window 2526 that is generated upon a user selecting a customer service agent button. Window 2526 illustratively displays live, real-time video of a customer service agent who can assist the user. As will be appreciated by those skilled in the art, this capability may eliminate the need to have customer service agents onsite, which may reduce costs for an airline. -
FIG. 25-6 shows an alternative flights window 2527 that is generated upon a user selecting a flight search button (e.g. button 2519 in FIG. 25-3 or button 2524 in FIG. 25-4). Window 2527 shows several flights that a user could take to get to his destination instead of his scheduled flight. Each of the different flights is illustratively positioned upon a tile or button 2528, and a user can select one of the flights by touching the tile or button. In an embodiment, upon a user selecting one of the alternative flights, the user interface is updated such that it shows the new information instead of the information for the previously scheduled flight (e.g. it shows updated departure, destination, layover, time, gate, terminal, etc. information). Alternatively, a user may be prompted to input credit card information (e.g. through an RFID reader) to pay for any additional costs for the flight change. - Finally, a user is illustratively given weather information about his departure location, his destination location, or both upon selection of weather button 2520 (labeled in
FIG. 25-3). In one embodiment, a new window is generated and the information is given in text form. In another embodiment, the information is given in graphical form. For instance, the map shown in the user interface is illustratively updated to include graphics that indicate the weather (e.g. a sun, a snowflake, a raindrop, a dark cloud, etc.). - Certain embodiments of the present disclosure also include blending components.
FIG. 26 is a block diagram of one embodiment of a blending component 2608, and FIG. 27 is a process flow diagram illustrating the operation of a blending component. Blending component 2608 receives multiple data streams, combines or blends the multiple data streams into one data stream, and outputs the combined/blended data stream to one or more multitouch applications. Accordingly, blending component 2608 allows multitouch applications 2614 to treat or view multiple input devices as one input device. In other words, blending component 2608 allows multiple input devices to be used with an application without configuring/programming the application to use multiple input devices. - In the embodiment shown in
FIG. 26, blending component 2608 includes an input interface 2610 and an output interface 2612. Input interface 2610 receives data streams or data sets from one or more input devices. In FIG. 26, input interface 2610 receives multitouch data from devices 2602, receives multitouch and non-touch data from devices 2606, and receives non-touch data from devices 2604. -
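The disclosure specifies the blending component only at the block-diagram level; as a rough, non-authoritative sketch, its input interface, output interface, and blending step might look like the following. Every name here (BlendingComponent, TouchEvent, receive, subscribe, flush, the normalized coordinate space) is invented for illustration.

```python
from dataclasses import dataclass
from typing import Callable, List

@dataclass
class TouchEvent:
    """One touch (or non-touch) input sample from some device."""
    device_id: str       # which physical device produced the sample
    x: float             # coordinates mapped into a shared, normalized space
    y: float
    kind: str = "touch"  # "touch", "mouse", "rfid", ...

class BlendingComponent:
    """Receives per-device data sets on its input interface and emits a
    single combined stream on its output interface."""

    def __init__(self) -> None:
        self._subscribers: List[Callable[[List[TouchEvent]], None]] = []
        self._pending: List[TouchEvent] = []

    # --- input interface (cf. input interface 2610) ---
    def receive(self, events: List[TouchEvent]) -> None:
        self._pending.extend(events)

    # --- output interface (cf. output interface 2612) ---
    def subscribe(self, application_callback: Callable[[List[TouchEvent]], None]) -> None:
        self._subscribers.append(application_callback)

    def flush(self) -> None:
        """Blend everything received since the last flush into one data set
        and hand it to every subscribed application."""
        combined, self._pending = self._pending, []
        for callback in self._subscribers:
            callback(combined)

# Usage: two devices feed one component; the application sees a single stream.
blender = BlendingComponent()
blender.subscribe(lambda batch: print(f"application received {len(batch)} events"))
blender.receive([TouchEvent("wall-display", 0.25, 0.40)])
blender.receive([TouchEvent("table-top", 0.70, 0.10),
                 TouchEvent("mouse-1", 0.50, 0.50, kind="mouse")])
blender.flush()   # -> application received 3 events
```
-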
Multitouch devices 2602 illustratively include any combination of multitouch devices. For instance, multitouch devices 2602 may comprise multitouch devices using different multitouch technologies (e.g. vision-based or capacitive), multitouch devices having different screen sizes, multitouch devices having different orientations (e.g. vertical vs. horizontal), or any other combination of multitouch devices. -
Non-touch devices 2604 illustratively include mice, keyboards, RFID sensors, object detection sensors, barcode readers, fiducial marker recognition systems, image recognition systems, and any other type of input device. - Multitouch and
non-touch devices 2606 illustratively include devices that generate and output a combination of multitouch and non-touch data. In one embodiment, a multitouch and non-touch device 2606 has a system that recognizes an identity of an object placed on or near a multitouch screen and also recognizes its location. For example, in one embodiment, objects such as, but not limited to, blocks include fiducial marks. Device 2606 recognizes the positions of the objects/blocks and outputs the corresponding multitouch data. Device 2606 also recognizes the fiducial marks and determines the identities of the objects. For instance, a database optionally connected to blending component 2608 illustratively includes information that associates fiducial marks or other identifiers with information such as, but not limited to, a three-dimensional size, color, weight, etc. Multitouch and non-touch devices 2606 illustratively output the multitouch data and other data to the blending component input interface 2610. -
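How the position data and the looked-up identity might be combined into one record (essentially the flow that FIG. 27 formalizes below) is sketched here. The lookup table, field names, and coordinate convention are hypothetical stand-ins for the database described above.

```python
# Hypothetical stand-in for the database that associates fiducial marks
# with object properties such as three-dimensional size, color, and weight.
FIDUCIAL_DATABASE = {
    "marker-042": {"name": "red block", "size_mm": (40, 40, 40), "weight_g": 55},
}

def blend_object_sample(x: float, y: float, fiducial_id: str) -> dict:
    """Combine the touch-style position data with the looked-up identity into
    a single record, as a blending component might forward it downstream."""
    properties = FIDUCIAL_DATABASE.get(fiducial_id, {"name": "unknown object"})
    return {"x": x, "y": y, "fiducial_id": fiducial_id, **properties}

# A block bearing fiducial mark "marker-042" is detected at (0.31, 0.62):
print(blend_object_sample(0.31, 0.62, "marker-042"))
# {'x': 0.31, 'y': 0.62, 'fiducial_id': 'marker-042', 'name': 'red block', ...}
```
-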
FIG. 27 is a process flow diagram illustrating the operation of a blending component 2608. At block 2702, an object having an identifier (e.g. a fiducial mark, 1D or 2D barcode, etc.) is placed on or near a multitouch screen. At block 2704, the location of the object is identified. At block 2706, a set of multitouch data is generated based on the location of the object relative to the multitouch screen. At block 2708, the identifier on the object is recognized. At block 2710, data associated with the identifier is retrieved. At block 2712, the multitouch data from block 2706 and the retrieved object data from block 2710 are combined or blended together into one data set or data stream. At block 2714, the blended data from block 2712 is outputted/transmitted to one or more multitouch or other applications. At block 2716, the application or applications interact with the blended data and output graphics, sounds, etc. based on the blended data. - As has been described above, embodiments of the present disclosure include multitouch devices, drivers, and applications that may have improved or useful features over existing multitouch devices, drivers, and applications. These various embodiments are illustratively practiced individually or in combination with each other. Also, it is to be understood that even though numerous characteristics and advantages of various embodiments have been set forth in the foregoing description, together with details of the structure and function of various embodiments, this detailed description is illustrative only, and changes may be made in detail, especially in matters of structure and arrangements of parts, within the principles of the present disclosure to the full extent indicated by the broad general meaning of the terms in which the appended claims are expressed.
Claims (21)
1. A system comprising:
multiple touch input devices, each of the touch input devices generating a set of touch data;
a blending component that receives the sets of touch data from the multiple touch input devices, the component combining the sets of touch data into a single combined set of touch data that is outputted to a touch input application.
2. The system of claim 1 , wherein at least one of said multiple touch input devices is a multitouch input device that supports multiple simultaneous touch inputs.
3. The system of claim 1 , wherein at least two of the multiple touch input devices have a different size.
4. The system of claim 1 , wherein at least two of the multiple touch input devices have a same shape.
5. The system of claim 1 , wherein at least two of the multiple touch input devices have different shapes.
6. The system of claim 1 , wherein at least two of the multiple touch input devices have a same orientation.
7. The system of claim 1 , wherein at least two of the multiple touch input devices have different orientations.
8. A system comprising:
at least one touch input device;
at least one non-touch user input device;
a component having a first interface and a second interface, the first interface receiving data from both the at least one touch input device and the at least one non-touch user input device, the second interface outputting a single data stream that is indicative of the data from both the at least one touch input device and the at least one non-touch user input device.
9. The system of claim 8 , wherein the at least one non-touch user input device comprises a radio frequency identification device.
10. The system of claim 8 , wherein the at least one non-touch user input device comprises a barcode scanner.
11. The system of claim 8 , wherein the at least one non-touch user input device comprises an object recognition component.
12. The system of claim 8 , wherein the at least one non-touch user input device comprises a mouse.
13. The system of claim 8 , wherein the at least one non-touch user input device comprises a keyboard.
14. The system of claim 8 , wherein the at least one non-touch user input device comprises a fiducial marker recognition system.
15. The system of claim 8 , wherein the at least one non-touch user input device comprises an image recognition system.
16. A method comprising:
receiving a first set of data from a touch input device;
receiving a second set of data from another device;
generating a third set of data that is representative of the first and the second sets of data; and
outputting the third set of data to an application.
17. The method of claim 16 , wherein the another device is a non-touch device.
18. The method of claim 17 , wherein the another device is another touch input device.
19. The method of claim 18 , wherein the touch input device and the another touch input device have different orientations.
20. The method of claim 18 , wherein the touch input device and the another touch input device have different shapes.
21. The method of claim 18 , wherein the touch input device and the another touch input device are both multitouch input devices.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US12/893,427 US20110080344A1 (en) | 2009-10-02 | 2010-09-29 | Blending touch data streams that include touch input data |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US24796409P | 2009-10-02 | 2009-10-02 | |
US12/893,427 US20110080344A1 (en) | 2009-10-02 | 2010-09-29 | Blending touch data streams that include touch input data |
Publications (1)
Publication Number | Publication Date |
---|---|
US20110080344A1 true US20110080344A1 (en) | 2011-04-07 |
Family
ID=43822809
Family Applications (3)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US12/893,427 Abandoned US20110080344A1 (en) | 2009-10-02 | 2010-09-29 | Blending touch data streams that include touch input data |
US12/893,381 Expired - Fee Related US8816991B2 (en) | 2009-10-02 | 2010-09-29 | Touch input apparatus including image projection |
US12/893,375 Expired - Fee Related US8760416B2 (en) | 2009-10-02 | 2010-09-29 | Universal touch input driver |
Family Applications After (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US12/893,381 Expired - Fee Related US8816991B2 (en) | 2009-10-02 | 2010-09-29 | Touch input apparatus including image projection |
US12/893,375 Expired - Fee Related US8760416B2 (en) | 2009-10-02 | 2010-09-29 | Universal touch input driver |
Country Status (1)
Country | Link |
---|---|
US (3) | US20110080344A1 (en) |
Cited By (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20110080360A1 (en) * | 2009-10-02 | 2011-04-07 | Dedo Interactive Inc. | Universal touch input driver |
EP2533141A1 (en) * | 2011-06-07 | 2012-12-12 | Amadeus S.A.S. | A personal information display system and associated method |
US20130031482A1 (en) * | 2011-07-28 | 2013-01-31 | Microsoft Corporation | Multi-Touch Remoting |
WO2013104061A1 (en) * | 2012-01-11 | 2013-07-18 | Smart Technologies Ulc | Calibration of an interactive light curtain |
US8954638B2 (en) | 2012-10-17 | 2015-02-10 | Perceptive Pixel, Inc. | Selective reporting of touch data |
US8971572B1 (en) * | 2011-08-12 | 2015-03-03 | The Research Foundation For The State University Of New York | Hand pointing estimation for human computer interaction |
EP2980683A1 (en) * | 2014-07-30 | 2016-02-03 | LG Electronics Inc. | Display apparatus and method for operating the same |
US20160260099A1 (en) * | 2015-03-03 | 2016-09-08 | Amadeus S.A.S. | Prioritizing transactions in a transaction queue |
US10032230B2 (en) | 2014-08-12 | 2018-07-24 | Amadeus S.A.S. | Auditing system with historic sale deviation database |
Families Citing this family (25)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8849560B2 (en) * | 2010-05-19 | 2014-09-30 | Arinc Incorporated | Method and apparatus for customer/passenger wayfinding using boarding pass barcode scanning capabilities on low-cost display devices |
US8619049B2 (en) | 2011-05-17 | 2013-12-31 | Microsoft Corporation | Monitoring interactions between two or more objects within an environment |
US8872799B2 (en) | 2011-06-20 | 2014-10-28 | The Regents Of The University Of California | Scalable distributed/cooperative/collaborative paradigm for multi-user interaction with projection-based display walls |
US9723293B1 (en) * | 2011-06-21 | 2017-08-01 | Amazon Technologies, Inc. | Identifying projection surfaces in augmented reality environments |
CN102890597A (en) * | 2011-07-18 | 2013-01-23 | 宏碁股份有限公司 | Input data processing method and relevant input data management system |
US8840250B1 (en) | 2012-01-11 | 2014-09-23 | Rawles Llc | Projection screen qualification and selection |
WO2014062197A1 (en) * | 2012-10-19 | 2014-04-24 | Aditi Majumder | Scalable distributed/cooperative/collaborative paradigm for multi-user interaction with projection-based display walls |
US9563955B1 (en) * | 2013-05-15 | 2017-02-07 | Amazon Technologies, Inc. | Object tracking techniques |
USD761860S1 (en) * | 2014-06-20 | 2016-07-19 | Samsung Electronics Co., Ltd. | Display screen or portion thereof with icon |
EP3250993B1 (en) | 2015-01-28 | 2019-09-04 | FlatFrog Laboratories AB | Dynamic touch quarantine frames |
CN107209609A (en) | 2015-02-09 | 2017-09-26 | 平蛙实验室股份公司 | It is included in the optical touch system of the device of the projection of transmission panel above and within and detection light beam |
US10915288B2 (en) * | 2015-03-27 | 2021-02-09 | Inkerz Pty Ltd. | Systems and methods for sharing physical writing actions |
CN106303325A (en) * | 2015-06-08 | 2017-01-04 | 中强光电股份有限公司 | Interactive projection system and projecting method thereof |
JP2018005806A (en) * | 2016-07-08 | 2018-01-11 | 株式会社スクウェア・エニックス | Position specification program, computer device, position specification method, and position specification system |
PL3667475T3 (en) | 2016-12-07 | 2022-11-21 | Flatfrog Laboratories Ab | A curved touch device |
CN110300950B (en) | 2017-02-06 | 2023-06-16 | 平蛙实验室股份公司 | Optical coupling in touch sensing systems |
WO2018174787A1 (en) | 2017-03-22 | 2018-09-27 | Flatfrog Laboratories | Eraser for touch displays |
WO2018182476A1 (en) | 2017-03-28 | 2018-10-04 | Flatfrog Laboratories Ab | Touch sensing apparatus and method for assembly |
WO2019045629A1 (en) | 2017-09-01 | 2019-03-07 | Flatfrog Laboratories Ab | Improved optical component |
WO2019172826A1 (en) | 2018-03-05 | 2019-09-12 | Flatfrog Laboratories Ab | Improved touch-sensing apparatus |
US12055969B2 (en) | 2018-10-20 | 2024-08-06 | Flatfrog Laboratories Ab | Frame for a touch-sensitive device and tool therefor |
WO2020153890A1 (en) | 2019-01-25 | 2020-07-30 | Flatfrog Laboratories Ab | A videoconferencing terminal and method of operating the same |
WO2021107840A1 (en) | 2019-11-25 | 2021-06-03 | Flatfrog Laboratories Ab | A touch-sensing apparatus |
EP4104042A1 (en) | 2020-02-10 | 2022-12-21 | FlatFrog Laboratories AB | Improved touch-sensing apparatus |
US11709568B2 (en) | 2020-02-25 | 2023-07-25 | Promethean Limited | Convex interactive touch displays and related systems and methods |
Family Cites Families (41)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US3737646A (en) * | 1971-10-15 | 1973-06-05 | J Burrows | Removable peripheral light assembly for bathroom mirror |
US4089587A (en) * | 1974-11-29 | 1978-05-16 | Schudel Conrad R | Projection screen surface and method of forming said surface |
US5732283A (en) * | 1995-10-24 | 1998-03-24 | International Business Machines Corporation | System and method of providing universal support for multiple pointing devices |
JP3627322B2 (en) * | 1995-11-07 | 2005-03-09 | ヤマハ株式会社 | Automatic piano |
US7007070B1 (en) * | 1996-03-06 | 2006-02-28 | Hickman Paul L | Method and apparatus for computing over a wide area network |
US5807175A (en) * | 1997-01-15 | 1998-09-15 | Microsoft Corporation | Dynamic detection of player actuated digital input devices coupled to a computer port |
US6282586B1 (en) * | 1998-10-28 | 2001-08-28 | 3Com Corporation | Method in an operating system, a method and system for supporting multiple hardware devices from a single communications port |
US7895342B2 (en) * | 2000-03-02 | 2011-02-22 | Dearborn Group, Inc. | Multi-protocol adapter for in-vehicle and industrial communications networks |
US20010037254A1 (en) * | 2000-03-09 | 2001-11-01 | Adi Glikman | System and method for assisting a customer in purchasing a commodity using a mobile device |
US6791530B2 (en) * | 2000-08-29 | 2004-09-14 | Mitsubishi Electric Research Laboratories, Inc. | Circular graphical user interfaces |
US8035612B2 (en) * | 2002-05-28 | 2011-10-11 | Intellectual Ventures Holding 67 Llc | Self-contained interactive video display system |
US20040019895A1 (en) * | 2002-07-29 | 2004-01-29 | Intel Corporation | Dynamic communication tuning apparatus, systems, and methods |
GB2394339A (en) * | 2002-10-08 | 2004-04-21 | Retail Experience Ltd | An automatic ordering system for supplying garments to a changing room |
US9024884B2 (en) * | 2003-09-02 | 2015-05-05 | Apple Inc. | Touch-sensitive electronic apparatus for media applications, and methods therefor |
US8287373B2 (en) * | 2008-12-05 | 2012-10-16 | Sony Computer Entertainment Inc. | Control device for communicating visual information |
JP3734815B2 (en) * | 2003-12-10 | 2006-01-11 | 任天堂株式会社 | Portable game device and game program |
US7620915B2 (en) * | 2004-02-13 | 2009-11-17 | Ludwig Lester F | Electronic document editing employing multiple cursors |
US7827573B2 (en) * | 2004-04-05 | 2010-11-02 | Comcast Cable Holdings, Llc | Method and system for provisioning a set-top box |
US8904273B2 (en) * | 2004-07-02 | 2014-12-02 | International Business Machines Corporation | System and method of format specification |
JP2006039982A (en) * | 2004-07-28 | 2006-02-09 | Canon Inc | Control method for information processor, information processor, and control program for information processor |
US20060044282A1 (en) * | 2004-08-27 | 2006-03-02 | International Business Machines Corporation | User input apparatus, system, method and computer program for use with a screen having a translucent surface |
US7928964B2 (en) * | 2005-04-22 | 2011-04-19 | Microsoft Corporation | Touch input data handling |
WO2007041646A2 (en) * | 2005-09-30 | 2007-04-12 | Infocus Corporation | Reconfigurable projection-screen system |
US7736231B2 (en) * | 2005-10-03 | 2010-06-15 | Microsoft Corporation | Common controller |
EP1802038B1 (en) * | 2005-12-23 | 2009-01-07 | Sony Deutschland GmbH | System and method for improving service and device discovery in a UPnP-based wireless communication network |
US7599561B2 (en) * | 2006-02-28 | 2009-10-06 | Microsoft Corporation | Compact interactive tabletop with projection-vision |
US7587348B2 (en) * | 2006-03-24 | 2009-09-08 | Basepoint Analytics Llc | System and method of detecting mortgage related fraud |
US20070257891A1 (en) * | 2006-05-03 | 2007-11-08 | Esenther Alan W | Method and system for emulating a mouse on a multi-touch sensitive surface |
US7552402B2 (en) * | 2006-06-22 | 2009-06-23 | Microsoft Corporation | Interface orientation using shadows |
US8022941B2 (en) * | 2006-10-12 | 2011-09-20 | Disney Enterprises, Inc. | Multi-user touch screen |
US8085249B2 (en) * | 2007-04-18 | 2011-12-27 | Luidia Inc. | Pre-assembled part with an associated surface convertible to a transcription apparatus |
US7889175B2 (en) * | 2007-06-28 | 2011-02-15 | Panasonic Corporation | Touchpad-enabled remote controller and user interaction methods |
US8581852B2 (en) * | 2007-11-15 | 2013-11-12 | Microsoft Corporation | Fingertip detection for camera based multi-touch systems |
US8390577B2 (en) * | 2008-07-25 | 2013-03-05 | Intuilab | Continuous recognition of multi-touch gestures |
US8159760B2 (en) * | 2008-08-19 | 2012-04-17 | Seiko Epson Corporation | Projector and control method of projector |
US8223121B2 (en) * | 2008-10-20 | 2012-07-17 | Sensor Platforms, Inc. | Host system and method for determining an attitude of a device undergoing dynamic acceleration |
US8591039B2 (en) * | 2008-10-28 | 2013-11-26 | Smart Technologies Ulc | Image projection methods and interactive input/projection systems employing the same |
US8515707B2 (en) * | 2009-01-07 | 2013-08-20 | Sensor Platforms, Inc. | System and method for determining an attitude of a device undergoing dynamic acceleration using a Kalman filter |
US7870496B1 (en) * | 2009-01-29 | 2011-01-11 | Jahanzeb Ahmed Sherwani | System using touchscreen user interface of a mobile device to remotely control a host computer |
US8730183B2 (en) * | 2009-09-03 | 2014-05-20 | Obscura Digital | Large scale multi-user, multi-touch system |
US20110080344A1 (en) * | 2009-10-02 | 2011-04-07 | Dedo Interactive Inc. | Blending touch data streams that include touch input data |
- 2010
- 2010-09-29 US US12/893,427 patent/US20110080344A1/en not_active Abandoned
- 2010-09-29 US US12/893,381 patent/US8816991B2/en not_active Expired - Fee Related
- 2010-09-29 US US12/893,375 patent/US8760416B2/en not_active Expired - Fee Related
Patent Citations (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5929844A (en) * | 1996-05-03 | 1999-07-27 | First Person Gaming, Inc. | First person perspective control system |
US6008777A (en) * | 1997-03-07 | 1999-12-28 | Intel Corporation | Wireless connectivity between a personal computer and a television |
US6198485B1 (en) * | 1998-07-29 | 2001-03-06 | Intel Corporation | Method and apparatus for three-dimensional input entry |
US20060004680A1 (en) * | 1998-12-18 | 2006-01-05 | Robarts James O | Contextual responses based on automated learning techniques |
US20040075820A1 (en) * | 2002-10-22 | 2004-04-22 | Chu Simon C. | System and method for presenting, capturing, and modifying images on a presentation board |
US20060178212A1 (en) * | 2004-11-23 | 2006-08-10 | Hillcrest Laboratories, Inc. | Semantic gaming and application transformation |
US20060235750A1 (en) * | 2005-04-13 | 2006-10-19 | Nec Infrontia Corporation | Point-of-sales terminal |
US20070126716A1 (en) * | 2005-11-17 | 2007-06-07 | Jonathan Haverly | Digital pen |
US20080029691A1 (en) * | 2006-08-03 | 2008-02-07 | Han Jefferson Y | Multi-touch sensing display through frustrated total internal reflection |
US20100259493A1 (en) * | 2009-03-27 | 2010-10-14 | Samsung Electronics Co., Ltd. | Apparatus and method recognizing touch gesture |
Cited By (26)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8760416B2 (en) * | 2009-10-02 | 2014-06-24 | Dedo Interactive, Inc. | Universal touch input driver |
US20110080360A1 (en) * | 2009-10-02 | 2011-04-07 | Dedo Interactive Inc. | Universal touch input driver |
AU2012266767B2 (en) * | 2011-06-07 | 2016-03-17 | Amadeus S.A.S. | A personal information display system and associated method |
KR101832022B1 (en) * | 2011-06-07 | 2018-02-23 | 아마데우스 에스.에이.에스. | A personal information display system and associated method |
EP2533141A1 (en) * | 2011-06-07 | 2012-12-12 | Amadeus S.A.S. | A personal information display system and associated method |
WO2012167924A1 (en) * | 2011-06-07 | 2012-12-13 | Amadeus S.A.S. | A personal information display system and associated method |
CN103597434A (en) * | 2011-06-07 | 2014-02-19 | 阿玛得斯两合公司 | A personal information display system and associated method |
US20120314082A1 (en) * | 2011-06-07 | 2012-12-13 | Benjamin Bezine | Personal information display system and associated method |
JP2014516186A (en) * | 2011-06-07 | 2014-07-07 | アマデウス エス.エイ.エス | Personal information display system and related method |
US20140219583A1 (en) * | 2011-06-07 | 2014-08-07 | Amadeus S.A.S. | Personal information display system and associated method |
US10311109B2 (en) * | 2011-06-07 | 2019-06-04 | Amadeus S.A.S. | Personal information display system and associated method |
US20130031482A1 (en) * | 2011-07-28 | 2013-01-31 | Microsoft Corporation | Multi-Touch Remoting |
US9727227B2 (en) * | 2011-07-28 | 2017-08-08 | Microsoft Technology Licensing, Llc | Multi-touch remoting |
US9372546B2 (en) * | 2011-08-12 | 2016-06-21 | The Research Foundation For The State University Of New York | Hand pointing estimation for human computer interaction |
US20150177846A1 (en) * | 2011-08-12 | 2015-06-25 | The Research Foundation For The State University Of New York | Hand pointing estimation for human computer interaction |
US9128530B2 (en) * | 2011-08-12 | 2015-09-08 | The Research Foundation For The State University Of New York | Hand pointing estimation for human computer interaction |
US20150378444A1 (en) * | 2011-08-12 | 2015-12-31 | The Research Foundation For The State University Of New York | Hand pointing estimation for human computer interaction |
US8971572B1 (en) * | 2011-08-12 | 2015-03-03 | The Research Foundation For The State University Of New York | Hand pointing estimation for human computer interaction |
US9207812B2 (en) | 2012-01-11 | 2015-12-08 | Smart Technologies Ulc | Interactive input system and method |
WO2013104061A1 (en) * | 2012-01-11 | 2013-07-18 | Smart Technologies Ulc | Calibration of an interactive light curtain |
US8954638B2 (en) | 2012-10-17 | 2015-02-10 | Perceptive Pixel, Inc. | Selective reporting of touch data |
EP2980683A1 (en) * | 2014-07-30 | 2016-02-03 | LG Electronics Inc. | Display apparatus and method for operating the same |
CN105320365A (en) * | 2014-07-30 | 2016-02-10 | Lg电子株式会社 | Display apparatus and method for operating same |
US9733765B2 (en) | 2014-07-30 | 2017-08-15 | Lg Electronics Inc. | Display apparatus and method for operating the same |
US10032230B2 (en) | 2014-08-12 | 2018-07-24 | Amadeus S.A.S. | Auditing system with historic sale deviation database |
US20160260099A1 (en) * | 2015-03-03 | 2016-09-08 | Amadeus S.A.S. | Prioritizing transactions in a transaction queue |
Also Published As
Publication number | Publication date |
---|---|
US20110080361A1 (en) | 2011-04-07 |
US20110080360A1 (en) | 2011-04-07 |
US8816991B2 (en) | 2014-08-26 |
US8760416B2 (en) | 2014-06-24 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US8816991B2 (en) | Touch input apparatus including image projection | |
US11561519B2 (en) | Systems and methods of gestural interaction in a pervasive computing environment | |
US9996197B2 (en) | Camera-based multi-touch interaction and illumination system and method | |
CN105378599B (en) | interactive digital display | |
US6414672B2 (en) | Information input apparatus | |
US10015402B2 (en) | Electronic apparatus | |
CN104364753B (en) | Method for highlighting active interface element | |
CN105229582A (en) | Based on the gestures detection of Proximity Sensor and imageing sensor | |
CN102222329B (en) | Raster scanning for depth detection | |
EP2959362B1 (en) | System and method for tracking a passive wand and actuating an effect based on a detected wand path | |
EP3271838B1 (en) | Image management device, image management method, image management program, and presentation system | |
EP2498485A2 (en) | Automated selection and switching of displayed information | |
CN103347437A (en) | Gaze detection in a 3d mapping environment | |
US20090315829A1 (en) | Multi-User Pointing Apparaus and Method | |
CN103914152A (en) | Recognition method and system for multi-point touch and gesture movement capturing in three-dimensional space | |
JPH1157216A (en) | Game device | |
EP1779226B1 (en) | Method and system for controlling a display | |
US10866779B2 (en) | User interactive display device and operating device | |
JPH1153111A (en) | Information input/output device | |
CN109144235A (en) | Man-machine interaction method and system based on head hand co-operating | |
CN210006135U (en) | Unmanned retail equipment | |
CN103677271A (en) | Remote pointing device and application method thereof | |
CN208240003U (en) | A kind of orientable spherical displaying device | |
WO2024039887A1 (en) | Interactive reality computing experience using optical lenticular multi-perspective simulation | |
JP2018165879A (en) | Electronic blackboard system and display method |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| AS | Assignment | Owner name: DEDO INTERACTIVE, INC., TEXAS; Free format text: ASSIGNMENT OF ASSIGNORS INTEREST; ASSIGNORS: DURA, DANIEL ADAM; MILLER, MARK F.; REEL/FRAME: 025494/0994; Effective date: 20101129 |
| STCB | Information on status: application discontinuation | Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |