US9535595B2 - Accessed location of user interface - Google Patents
- Publication number
- US9535595B2 (application US13/284,370)
- Authority
- US
- United States
- Prior art keywords
- user interface
- location
- shared portion
- transmit
- accessed
- Prior art date: 2011-10-28
- Legal status: Active, expires
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F21/00—Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
- G06F21/50—Monitoring users, programs or devices to maintain the integrity of platforms, e.g. of processors, firmware or operating systems
- G06F21/55—Detecting local intrusion or implementing counter-measures
- G06F21/56—Computer malware detection or handling, e.g. anti-virus arrangements
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F21/00—Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
- G06F21/60—Protecting data
- G06F21/606—Protecting data by securing the transmission between two devices or processes
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/14—Digital output to display device ; Cooperation and interconnection of the display device with other functional units
- G06F3/1454—Digital output to display device ; Cooperation and interconnection of the display device with other functional units involving copying of the display data of a local workstation or window to a remote workstation or window so that an actual copy of the data is displayed simultaneously on two or more displays, e.g. teledisplay
- G06F9/4445
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F9/00—Arrangements for program control, e.g. control units
- G06F9/06—Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
- G06F9/44—Arrangements for executing specific programs
- G06F9/451—Execution arrangements for user interfaces
- G06F9/452—Remote windowing, e.g. X-Window System, desktop virtualisation
Definitions
- a user can establish a remote connection with the other users.
- the other users can then view the visual content and can gain access to the user's device. This can create security concerns for the user as the content and data of the user's device becomes visible and accessible to the other users. Additionally, the user's device can be exposed and harmed by malware which may be present on the other user's devices.
- FIG. 1 illustrates a device coupled to a display component according to an example.
- FIG. 2 illustrates a sensor detecting a user accessing a location of a user interface according to an example.
- FIG. 3 illustrates a block diagram of an interface application transmitting information to a second device according to an example.
- FIG. 4 illustrates a device and a second device sharing accessed information according to an example.
- FIG. 5 is a flow chart illustrating a method for managing a device according to an example.
- FIG. 6 is a flow chart illustrating a method for managing a device according to another example.
- a device can include a display component to display a user interface with visual content for a user of the device to view and interact with.
- the visual content can include alphanumeric characters, images, and/or videos included on objects displayed as part of the user interface.
- An object can be wallpaper, a screensaver, media, a document, and/or an application of the device.
- a sensor of the device can detect for a user accessing a location of the user interface.
- a user can access a location of the user interface with a hand gesture for the device to detect information of the accessed location.
- the device can share a portion of the user interface with a second device.
- the second device can then render a second user interface corresponding to a portion of the user interface.
- the portion includes the entire user interface.
- the portion includes one or more of the objects included on the user interface.
- a user of the second device can view all or some of the user interface of the device.
- the device can transmit information of the accessed location to the second device.
- the second device can render a visual indicator at a location of the second user interface corresponding to the accessed location of the user interface.
- the visual indicator can be a visual mark and/or a modification to the second user interface, such as a magnification or an animation.
- FIG. 1 illustrates a device 100 coupled to a display component 160 according to an example.
- the device 100 can be a laptop, a notebook, a tablet, a netbook, an all-in-one system, and/or a desktop.
- the device 100 can be a cellular device, a PDA (Personal Digital Assistant), an E (Electronic)-Reader, and/or any additional device which can be coupled to a display component 160 .
- the device 100 includes a controller 120 , a display component 160 , a sensor 130 , a communication component 170 , and a communication channel 150 for components of the device 100 to communicate with one another.
- the device 100 includes an interface application which can be utilized independently and/or in conjunction with the controller 120 to manage the device 100 .
- the interface application can be a firmware or application which can be executed by the controller 120 from a non-transitory computer readable memory of the device 100 .
- the controller 120 and/or the interface application can render a user interface 165 on the display component 160 .
- the user interface 165 displays visual content included on objects rendered as part of the user interface 165 .
- the objects can be media, a document, wallpaper, a screensaver, a webpage, and/or an application of the device 100 .
- the display component 160 is a hardware output component which can display the user interface 165 as a visual interface of the device 100 .
- the display component 160 is a LCD (liquid crystal display), a LED (light emitting diode) display, a CRT (cathode ray tube) display, a plasma display, a projector and/or any additional device configured to display the user interface 165 .
- a sensor 130 of the device 100 can detect for a user of the device 100 accessing one or more locations of the user interface 165 .
- the sensor 130 is a hardware component of the device 100 which can detect for a hand gesture from the user to detect the user accessing a location of the user interface 165 .
- the sensor 130 can include an image capture component, a touch screen, a proximity sensor, radar, and/or any additional hardware component which can detect a hand gesture from the user.
- a hand gesture can include the user touching a location of the user interface 165 .
- the hand gesture can include the user hovering above or coming within proximity of a location of the user interface 165 .
- the hand gesture can include the user pointing at a location of the user interface 165 . If the sensor 130 detects the user accessing the user interface 165 , the sensor 130 can pass information of an accessed location 175 to the controller 120 and/or the interface application. In one embodiment, the information of the accessed location 175 can include a coordinate of the user interface 165 accessed by the user.
- the controller 120 and/or the interface application can then transmit information to a second device using a communication component 170 .
- the communication component 170 is a hardware component of the device which can receive and/or transmit data and information between the device 100 and the second device.
- the communication component 170 can be a network interface component, a radio component, an infrared component, a Bluetooth component, and/or any additional component which can receive and/or transmit data and/or information between the device 100 and the second device.
- the second device can be a laptop, a notebook, a tablet, a netbook, an all-in-one system, a desktop, a cellular device, a PDA (Personal Digital Assistant), an E (Electronic)-Reader, and/or any additional device which can be coupled to a display component to render a second user interface for a user of the second device.
- Transmitting information to the second device includes the controller 120 and/or the interface application sharing a portion of the user interface 165 with a second device.
- the controller 120 and/or the interface application initially determine which portion of the user interface 165 to share with the second device.
- a portion of the user interface 165 can include the entire user interface 165 .
- a portion of the user interface 165 can include a subset of the entire user interface 165 .
- the subset can include one or more objects located at the accessed location 175 of the user interface 165 .
- the controller 120 and/or the interface application can prompt the user through a visual or audio prompt to specify whether to share objects at the accessed location 175 or to share the entire user interface 165 .
- the controller 120 and/or the interface application can transmit a video stream of the portion of the user interface 165 to the second device.
- the controller 120 and/or the interface application can share a file of the objects included in the specified portion of the user interface 165 with the second device.
- the controller 120 and/or the interface application can share a name of the object or a link to where the object can be accessed.
- the second device can use the received information of the portion of the user interface 165 to render a second user interface for a user of the second device to view and/or interact with.
- the controller 120 and/or the interface application also transmit information of the accessed location 175 to the second device using the communication component 170 .
- the information of the accessed location 175 can include a coordinate of the user interface 165 accessed by the user.
- the information of the accessed location 175 can specify an object, alphanumeric characters, an image, and/or any additional visual content located at the accessed location 175 on the user interface 165 .
- the second device can render a visual indicator at a location of the second user interface.
- a visual indicator is a visual mark or a visual modification which can be rendered at a location of the second user interface based on the information of the accessed location 175 of the user interface 165 .
- the location of the second user interface to render the visual indicator corresponds to the accessed location of the user interface 165 .
- rendering the visual indicator can include the second device rendering a shape, such as a circle, an X, a square, or a star at a location of the second interface corresponding to the accessed location of the user interface 165 .
- rendering the visual indicator includes modifying the second user interface to magnify and/or animate the location of the second interface corresponding to the accessed location of the user interface 165.
- FIG. 2 illustrates a sensor 230 detecting a user 205 accessing a location of a user interface 265 according to an example.
- the user interface 265 can be rendered on a display component 260 , such as a LCD, a LED display, a CRT display, a plasma display, a projector and/or any additional device configured to display objects as part of the user interface 265 .
- the objects can include wallpaper, a screen saver, a document, media, a webpage, and/or an application of the device 200 for a user 205 of the device 200 to view and/or interact with.
- the objects can be locally stored on the device 200 or remotely accessed from another location by the device 200 .
- the user 205 can interact with the user interface 265 by accessing a location of the user interface 265 with a hand gesture.
- the hand gesture can be made with a hand, palm, and/or one or more fingers of the user 205 .
- a sensor 230 of the device 200 can detect for a hand, palm, and/or finger of the user 205 touching, hovering above, and/or pointing to a location of the user interface 265.
- the sensor 230 can be a touch screen, an image capture component, a proximity sensor, radar, and/or any additional component which can detect for the user 205 accessing a location of the user interface 265 .
- the user 205 has accessed an image of a person displayed on a lower left portion of the user interface 265 .
- the sensor 230 passes information of the accessed location 275 to the controller and/or interface application of the device 200 .
- the information from the sensor 230 can include a coordinate of the user interface 265 corresponding to the accessed location 275 .
- the controller and/or the interface application can use the coordinate as information of the accessed location 275 .
- the controller and/or the interface application can identify an object of the user interface 265 located at the accessed location 275 .
- the controller and/or the interface application can then use a communication component 270 of the device 200 to share a portion of the user interface 265 with the second device 280 and to transmit information of the accessed location 275 of the user interface 265 to the second device 280 .
- the communication component 270 can be a network interface component, a radio component, a Bluetooth component, and/or an Infrared component to physically or wirelessly couple the device 200 to the second device 280 and transfer information and/or data between the device 200 and the second device 280 .
- the second device 280 can be a computing device which can couple with the device 200 and receive information of a portion of the user interface 265 .
- the controller and/or the interface application can initially determine whether to share the entire user interface 265 or one or more objects of the user interface 265 located at the accessed location 275 .
- a default option for the device 200 includes not sharing the entire user interface 265 and instead sharing objects at the accessed location 275 as the portion of the user interface 265.
- the user 205 can be prompted with a visual or audio prompt to select whether to share the entire user interface 265 or to share objects of the user interface 265 at the accessed location 275.
- the portion of the user interface 265 can be shared by the controller and/or the interface application as a video stream transmitted to the second device 280 .
- the controller and/or the interface application can transfer a file of an object at the accessed location when sharing a portion of the user interface 265 .
- the controller and/or the interface component can transmit a link for the second device 280 to use to access or retrieve the objects located at the accessed location 275 .
- the controller and/or interface application can additionally transmit information of where the object should be rendered on a second user interface 245 of the second device 280 .
- the controller and/or the interface application can share a resolution of the user interface 265 for the second device 280 to use for the second user interface 245 .
- the second device can use the information of the portion of the user interface 265 to render a second user interface 245 on a second display component 240 of the second device 280.
- the entire user interface 265 can be shared as the portion of the user interface 265 .
- the second user interface 245 of the second device 280 matches the user interface 265 of the device 200 .
- the second user interface 245 can be rendered at a same resolution as the user interface 265 .
- the controller and/or the interface application can also transmit information of the accessed location 275 to the second device 280 .
- the information of the accessed location 275 can be transmitted as a data package and/or as a file to the second device 280.
- the information can include a coordinate of the user interface 265 for the accessed location 275 .
- the information can list an object of the user interface located at the accessed location 275 .
- using the information of the accessed location 275, the second device 280 renders a visual indicator 290 at a location of the second user interface 245.
- the visual indicator 290 is rendered at the location of the second user interface 245 corresponding to the accessed location 275 of the user interface 265 .
- the second device 280 accesses a coordinate listed in the information of the accessed location 275 and renders the visual indicator 290 at the corresponding coordinate on the second user interface 245 .
- the second device 280 can search the second user interface 245 for the object and proceed to render the visual indicator 290 at a location of the second user interface 245 where the object is found.
- a visual indicator 290 can be a visual mark, such as a rectangle, and the visual indicator 290 is rendered at a location of the second user interface 245 corresponding to the accessed location 275 of the user interface 265 .
- the visual indicator 290 can include other types of visual marks such as a square, a star, an X, and/or a circle.
- rendering the visual indicator 290 includes modifying the corresponding location of the second user interface 245 to magnify and/or animate it.
- the second device 280 can determine a type of visual indicator 290 to render on the second user interface 245 .
- the second device 280 can determine whether a type of visual indicator 290 has been specified.
- the type of visual indicator 290 can be specified in the information of the accessed location 275 by the user 205 of the device 200 .
- the type of visual indicator 290 can specify whether the visual indicator 290 is to be a visual mark and/or a modification to the second user interface 245. Additionally, the type of visual indicator 290 can specify whether the visual mark is a square, a star, an X, a circle, and/or any additional visual mark. Further, the type of visual indicator 290 can specify whether the modification is to magnify and/or animate the corresponding location of the second user interface 245. In one embodiment, the user 205 of the device 200 can be prompted by an audio or video prompt to specify the type of visual indicator 290 for the second device 280 to render.
- the second device 280 can randomly select a visual indicator 290 to render or a user of the second device 280 can be prompted to select the type of visual indicator 290 which he/she would like to view on the second user interface 245 .
- FIG. 3 illustrates a block diagram of an interface application 310 transmitting information to a second device 380 according to an example.
- the interface application 310 can be a firmware of the device or an application stored on a computer readable memory accessible to the device.
- the computer readable memory is any tangible apparatus, such as a hard drive, a compact disc, a flash disk, a network drive or any other form of computer readable medium that contains, stores, communicates, or transports the interface application 310 for use by the device.
- the controller 320 and/or the interface application 310 configure the display component 360 to display a user interface 365 with Document A.
- Document A is displayed on the user interface 365.
- the sensor 330 detects a user accessing a location of the user interface 365.
- the sensor 330 shares information of the accessed location (coordinate X, Y) with the controller 320 and/or the interface application 310 .
- the controller 320 and/or the interface application 310 identify a portion of the user interface 365 to share with the second device 380 and transmit information of the accessed location to the second device.
- the controller 320 and/or the interface application 310 can identify a portion of the user interface 365 to share before the sensor 330 has detected an accessed location of the user interface 365 .
- the controller 320 and/or the interface application 310 can prompt the user to specify the portion of the user interface 365 to share with the second device.
- a portion of the user interface 365 can be defined to be an object included at the accessed location of the user interface 365 . As noted above, the portion can include the entire user interface 365 or one or more objects displayed on the user interface 365 .
- Document A has been selected as the portion of the user interface 365 to share with the second device.
- the controller 320 and/or the interface application 310 use the communication component 370 to transmit Document A as a file to the second device 380 .
- the controller 320 and/or the interface application 310 can transmit a link, such as a web address, of where Document A can be accessed by the second device 380 .
- the controller 320 and/or the interface application 310 additionally transmit information of the accessed location of the user interface 365 to the second device 380 .
- the controller 320 and/or the interface application 310 transmit the coordinates X, Y, as the information of the accessed location to the second device 380 .
- the controller 320 and/or the interface application 310 additionally transmit a resolution which is currently used for the user interface 365 .
- the resolution can be used by the second device 380 as a resolution for the second user interface.
- the shared portion of the user interface 365 and the accessed location of the user interface 365 appear on the second user interface of the second device 380 to match the user interface 365 .
- FIG. 4 illustrates a device 400 and a second device 480 sharing accessed information according to an example.
- an object, Document A, is rendered for display on the user interface 465 of the device 400.
- user 405 has been detected accessing an image of a star at a far right location of Document A.
- the coordinate of the star on the user interface 465 is identified by the controller and/or the interface application to be X, Y.
- the communication component 470 shares a portion of the user interface 465 and transmits information of the accessed location to the second device 480.
- the communication component 470 shares Document A as a portion of the user interface 465 and the communication component transmits coordinate X, Y as information of the accessed location to the second device 480 .
- the user 405 of the device 400 can additionally create a message associated with the accessed location.
- the message can include information associated with the accessed location which the user 405 can use to communicate to user 408 of the second device 480 .
- the message can be a text message which describes what is located at the accessed location.
- the message can be an audio or video message which the user can use to notify the other user 408 that an item at the accessed location is incorrect or needs to be modified.
- the message can include any additional contextual information from the user 405 associated with the accessed location of the user interface 465 .
- the device 400 can include an input component 435 to capture the message associated with the accessed location.
- the input component 435 is a hardware component, such as an image capture component, a microphone, and/or a keyboard which the user 405 can use to create an alphanumeric message, a video message, and/or an audio message.
- the user interface 465 can display a message asking whether the user would like to create a message.
- the input component 435 can be enabled automatically to capture a message associated with the accessed location in response to the user 405 accessing a location of the user interface 465.
- the message can be transmitted with the information of the accessed location to the second device 480 with the communication component 470 .
- the second device 480 receives Document A and proceeds to render a second user interface 445 on a second display component 440 to include Document A.
- the second device 480 receives the coordinate X, Y and proceeds to render a visual indicator 490 at coordinate X, Y of the second user interface 445 .
- the second device 480 renders the visual indicator 490 by modifying the second user interface 445 to magnify the coordinate X, Y. As a result, the image of the star which was accessed by the user 405 of the device 400 is magnified on the second user interface 445 .
- the second device 480 can output the message on the second user interface 445 or with an audio speaker of the second device 480 .
- the user 405 of the device 400 can communicate contextual information of the accessed location with the user 408 of the second device 480 with the message.
- a user 408 of the second device 480 can also access a location of the second user interface 445 .
- the user 408 of the second device 480 accesses the alphanumeric characters “Text” located at a far left side of Document A.
- the second device 480 transmits information of the accessed location of the second user interface 445 over to the device 400 .
- the information of the accessed location is transmitted by the second device 480 as a string, “Text.”
- user 408 can additionally record a message associated with “Text” to transmit to the device 400 .
- the controller and/or the interface application can proceed to search Document A for the string “Text.”
- the controller and/or the interface application identify that the string “Text” is located at a far left side of Document A and proceed to render a visual indicator 490 around “Text.”
- the visual indicator 490 can include one or more shapes, such as a circle, rendered at the location of the user interface 465 corresponding to the accessed location.
- the message can be outputted on the device 400 for the user 405 along with the visual indicator 490 .
- FIG. 5 is a flow chart illustrating a method for managing a device according to an example.
- a controller and/or interface application can be utilized independently and/or in conjunction with one another to manage the device.
- the controller and/or the interface application initially render a user interface on a display component of the device at 500 .
- the user interface includes objects, such as a document, wallpaper, a screen saver, media, and/or an application which can include alphanumeric characters, images, and/or videos for a user of the device to view and/or interact with.
- a sensor of the device can detect for a hand gesture from the user when detecting the user accessing a location of the user interface at 510 .
- the controller and/or the interface application can share a portion of the user interface for a second device to render a second user interface at 520 .
- a portion of the user interface can include everything included on the user interface.
- a portion of the user interface includes an object located at the accessed location of the user interface.
- the controller and/or the interface application can share the portion of the user interface using a communication component.
- the portion of the user interface can be shared as a video stream, a file including the object, and/or a link to the object.
- the controller and/or the interface application can also use the communication component to transmit information of the accessed location to the second device for the second device to render a visual indicator at a location of the second user interface based on the information of the accessed location at 530 .
- the location of the second user interface of the second device corresponds to the accessed location of the user interface of the device.
- the visual indicator can be rendered at the location of the second user interface as a visual mark. In another embodiment, rendering the visual indicator includes magnifying or animating the location of the second user interface.
- the method is then complete.
- the method of FIG. 5 includes additional steps in addition to and/or in lieu of those depicted in FIG. 5 .
- FIG. 6 is a flow chart illustrating a method for managing a portable device according to an example.
- the controller and/or the interface application initially render a user interface to include one or more objects on a display component at 610 .
- a sensor such as an image capture component, a touch screen, a proximity sensor, and/or a radar can detect for a finger, a palm, and/or a hand of the user touching, coming within proximity, or hovering above a location of the user interface to detect for the user accessing a location of the user interface at 620 .
- the controller and/or the interface application can identify information of the accessed location by receiving a coordinate from the sensor at 630 .
- the controller and/or the interface application additionally identify an object, alphanumeric characters, an image, and/or a video displayed at the accessed location of the user interface.
- the controller and/or the interface application can use a communication component to share a portion of the user interface for a second device to render a second user interface at 640 .
- the shared portion of the user interface can include everything displayed on the user interface.
- the shared portion of the user interface includes an object, such as a document, media, an application, wallpaper, and/or a screensaver which is rendered on the user interface.
- the portion of the user interface can be shared as a video stream, as a file with the object, and/or with a link used by the second device to access the object.
- the second device can render a second user interface.
- the controller and/or the interface application can also use the communication component to transmit information of the accessed location to the second device at 650 .
- the information of the accessed location includes a coordinate of the accessed location.
- the information of the accessed location lists an object, alphanumeric characters, an image, and/or a video located at the accessed location of the user interface.
- the information can include a resolution of the user interface and/or a message associated with the accessed location.
- the controller and/or the interface application can also send an instruction for the second device to render a visual indicator at a location of the second user interface using the information of the accessed location at 660 .
- the location to render the visual indicator on the second user interface corresponds to the accessed location of the user interface.
- the visual indicator can be rendered as a visual mark on the second user interface.
- rendering the visual indicator includes modifying the corresponding location of the second user interface to magnify and/or animate.
- the communication component of the device can detect for information of an accessed location from the second device in response to a user of the second device accessing a location of the second user interface at 670 .
- the controller and/or the interface application can then render a visual indicator at a location of the user interface corresponding to the accessed location of the second user interface at 680 .
- the method is then complete.
- the method of FIG. 6 includes additional steps in addition to and/or in lieu of those depicted in FIG. 6 .
Priority Applications (1)
| Application Number | Priority Date | Filing Date | Title |
| --- | --- | --- | --- |
| US13/284,370 | 2011-10-28 | 2011-10-28 | Accessed location of user interface |
Publications (2)
| Publication Number | Publication Date |
| --- | --- |
| US20130111360A1 | 2013-05-02 |
| US9535595B2 | 2017-01-03 |
Family
ID=48173769
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
| --- | --- | --- | --- |
| US13/284,370 (Active, expires 2034-04-28) | Accessed location of user interface | 2011-10-28 | 2011-10-28 |
Country Status (1)
| Country | Link |
| --- | --- |
| US | US9535595B2 |
Families Citing this family (17)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9383888B2 (en) * | 2010-12-15 | 2016-07-05 | Microsoft Technology Licensing, Llc | Optimized joint document review |
US9013366B2 (en) * | 2011-08-04 | 2015-04-21 | Microsoft Technology Licensing, Llc | Display environment for a plurality of display devices |
US20130219303A1 (en) * | 2012-02-21 | 2013-08-22 | Research In Motion Tat Ab | Method, apparatus, and system for providing a shared user interface |
US10503373B2 (en) * | 2012-03-14 | 2019-12-10 | Sony Interactive Entertainment LLC | Visual feedback for highlight-driven gesture user interfaces |
US9489114B2 (en) * | 2013-06-24 | 2016-11-08 | Microsoft Technology Licensing, Llc | Showing interactions as they occur on a whiteboard |
WO2015167260A1 (en) * | 2014-04-30 | 2015-11-05 | Lg Innotek Co., Ltd. | Touch device, wearable device having the same and touch recognition method |
US9658836B2 (en) | 2015-07-02 | 2017-05-23 | Microsoft Technology Licensing, Llc | Automated generation of transformation chain compatible class |
US9712472B2 (en) | 2015-07-02 | 2017-07-18 | Microsoft Technology Licensing, Llc | Application spawning responsive to communication |
US9733993B2 (en) | 2015-07-02 | 2017-08-15 | Microsoft Technology Licensing, Llc | Application sharing using endpoint interface entities |
US9785484B2 (en) | 2015-07-02 | 2017-10-10 | Microsoft Technology Licensing, Llc | Distributed application interfacing across different hardware |
US10261985B2 (en) | 2015-07-02 | 2019-04-16 | Microsoft Technology Licensing, Llc | Output rendering in dynamic redefining application |
US10198252B2 (en) | 2015-07-02 | 2019-02-05 | Microsoft Technology Licensing, Llc | Transformation chain application splitting |
US9860145B2 (en) | 2015-07-02 | 2018-01-02 | Microsoft Technology Licensing, Llc | Recording of inter-application data flow |
US9733915B2 (en) | 2015-07-02 | 2017-08-15 | Microsoft Technology Licensing, Llc | Building of compound application chain applications |
US10198405B2 (en) | 2015-07-08 | 2019-02-05 | Microsoft Technology Licensing, Llc | Rule-based layout of changing information |
US10031724B2 (en) | 2015-07-08 | 2018-07-24 | Microsoft Technology Licensing, Llc | Application operation responsive to object spatial status |
US10277582B2 (en) | 2015-08-27 | 2019-04-30 | Microsoft Technology Licensing, Llc | Application service architecture |
Patent Citations (23)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5008853A (en) * | 1987-12-02 | 1991-04-16 | Xerox Corporation | Representation of collaborative multi-user activities relative to shared structured data objects in a networked workstation environment |
US5107443A (en) * | 1988-09-07 | 1992-04-21 | Xerox Corporation | Private regions within a shared workspace |
US6237025B1 (en) * | 1993-10-01 | 2001-05-22 | Collaboration Properties, Inc. | Multimedia collaboration system |
US5872922A (en) * | 1995-03-07 | 1999-02-16 | Vtel Corporation | Method and apparatus for a video conference user interface |
US20040250214A1 (en) * | 2003-01-07 | 2004-12-09 | Microsoft Corporation | Automatic image capture for generating content |
US7461121B2 (en) * | 2003-05-21 | 2008-12-02 | Hitachi, Ltd. | Controlling the display of contents designated by multiple portable terminals on a common display device in a segmented area having a terminal-specific cursor |
US20050273700A1 (en) * | 2004-06-02 | 2005-12-08 | Amx Corporation | Computer system with user interface having annotation capability |
US20060010392A1 (en) * | 2004-06-08 | 2006-01-12 | Noel Vicki E | Desktop sharing method and system |
US20060168533A1 (en) * | 2005-01-27 | 2006-07-27 | Microsoft Corporation | System and method for providing an indication of what part of a screen is being shared |
US7849410B2 (en) * | 2007-02-27 | 2010-12-07 | Awind Inc. | Pointing-control system for multipoint conferences |
US20080301101A1 (en) * | 2007-02-27 | 2008-12-04 | The Trustees Of Columbia University In The City Of New York | Systems, methods, means, and media for recording, searching, and outputting display information |
US8010901B1 (en) * | 2007-10-26 | 2011-08-30 | Sesh, Inc. | System and method for automated synchronized co-browsing |
US20090125586A1 (en) * | 2007-11-14 | 2009-05-14 | Canon Kabushiki Kaisha | Screen sharing system and data transfer method |
US20090167700A1 (en) | 2007-12-27 | 2009-07-02 | Apple Inc. | Insertion marker placement on touch sensitive display |
US20090235170A1 (en) * | 2008-03-17 | 2009-09-17 | Golden Signals, Inc. | Methods and apparatus for sharing either a computer display screen or a media file and selecting therebetween |
US20100131868A1 (en) * | 2008-11-26 | 2010-05-27 | Cisco Technology, Inc. | Limitedly sharing application windows in application sharing sessions |
US7870496B1 (en) * | 2009-01-29 | 2011-01-11 | Jahanzeb Ahmed Sherwani | System using touchscreen user interface of a mobile device to remotely control a host computer |
US20110085016A1 (en) | 2009-10-14 | 2011-04-14 | Tandberg Telecom As | Device, computer program product and method for providing touch control of a video conference |
US20110181492A1 (en) * | 2010-01-26 | 2011-07-28 | Canon Kabushiki Kaisha | Screen sharing apparatus, control method thereof, program and screen sharing system |
US20110191695A1 (en) * | 2010-02-03 | 2011-08-04 | Skype Limited | Screen sharing |
US20110202854A1 (en) * | 2010-02-17 | 2011-08-18 | International Business Machines Corporation | Metadata Capture for Screen Sharing |
US20120110474A1 (en) * | 2010-11-01 | 2012-05-03 | Google Inc. | Content sharing interface for sharing content in social networks |
US8843830B2 (en) * | 2010-12-30 | 2014-09-23 | Alticast Corporation | System and method of controlling a screen of a display device using a mobile terminal |
Non-Patent Citations (1)
Title |
---|
Jiazhi Ou et al., Gestural Communication over Video Stream: Supporting Multimodal Interaction for Remote Collaborative Physical Tasks, School of Computer Science, Carnegie Mellon University, Nov. 5, 2003, https://sfussell.hci.cornell.edu/pubs/Manuscripts/p107-ou.pdf. |
Also Published As
Publication number | Publication date |
---|---|
US20130111360A1 (en) | 2013-05-02 |
Legal Events

- AS (Assignment): Owner name: HEWLETT-PACKARD DEVELOPMENT COMPANY, L.P., TEXAS. Free format text: ASSIGNMENT OF ASSIGNORS INTEREST; ASSIGNORS: KODAMA, JUSTIN; HORNYAK, MATTHEW; ZNAMEROSKI, MATTHEW; AND OTHERS. Signing dates: from 20111023 to 20111026. Reel/frame: 027150/0420.
- AS (Assignment): Owner name: PALM, INC., CALIFORNIA. Free format text: ASSIGNMENT OF ASSIGNORS INTEREST; ASSIGNOR: HEWLETT-PACKARD DEVELOPMENT COMPANY, L.P. Reel/frame: 030341/0459. Effective date: 20130430.
- AS (Assignment): Owner name: HEWLETT-PACKARD DEVELOPMENT COMPANY, L.P., TEXAS. Free format text: ASSIGNMENT OF ASSIGNORS INTEREST; ASSIGNOR: PALM, INC. Reel/frame: 031837/0239. Effective date: 20131218. Owner name: HEWLETT-PACKARD DEVELOPMENT COMPANY, L.P., TEXAS. ASSIGNOR: PALM, INC. Reel/frame: 031837/0659. Effective date: 20131218. Owner name: PALM, INC., CALIFORNIA. ASSIGNOR: HEWLETT-PACKARD DEVELOPMENT COMPANY, L.P. Reel/frame: 031837/0544. Effective date: 20131218.
- AS (Assignment): Owner name: QUALCOMM INCORPORATED, CALIFORNIA. Free format text: ASSIGNMENT OF ASSIGNORS INTEREST; ASSIGNORS: HEWLETT-PACKARD COMPANY; HEWLETT-PACKARD DEVELOPMENT COMPANY, L.P.; PALM, INC. Reel/frame: 032177/0210. Effective date: 20140123.
- FEPP (Fee payment procedure): PAYOR NUMBER ASSIGNED (ORIGINAL EVENT CODE: ASPN); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY.
- STCF (Information on status: patent grant): PATENTED CASE.
- MAFP (Maintenance fee payment): PAYMENT OF MAINTENANCE FEE, 4TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1551); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY. Year of fee payment: 4.
- FEPP (Fee payment procedure): MAINTENANCE FEE REMINDER MAILED (ORIGINAL EVENT CODE: REM.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY.