US20160291694A1 - Haptic authoring tool for animated haptic media production - Google Patents
- Publication number
- US20160291694A1 (application Ser. No. 14/678,271; also identified as US201514678271A)
- Authority
- US
- United States
- Prior art keywords
- haptic
- actuators
- animation object
- time
- animation
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/016—Input arrangements with force or tactile feedback as computer generated output to the user
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T11/00—2D [Two Dimensional] image generation
- G06T11/20—Drawing from basic elements, e.g. lines or circles
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T13/00—Animation
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2210/00—Indexing scheme for image generation or computer graphics
- G06T2210/28—Force feedback
Definitions
- Embodiments disclosed herein relate to haptic feedback. More specifically, embodiments disclosed herein provide authoring tools for animated haptic media production.
- Haptic feedback has added a new dimension to modern entertainment and media applications, enhancing user interaction and immersion.
- Recent haptic technologies supplement movies, applications, games, and social activities with coherent and synchronized cues, creating dynamic and engaging haptic effects perceived by a user's body.
- However, these technologies have seen limited adoption, partly due to a dearth of haptic authoring tools that allow artists and designers to create rich haptic content.
- Similarly, existing tools are generally unable to generate content that is universally applicable to the varying types of haptic hardware.
- In addition, most existing tools are designed for a single actuator, and those that accommodate multiple actuators require individual control of each actuator. These multi-track authoring tools are cumbersome and complicated for systems having dozens of actuators.
- Embodiments disclosed herein include systems, methods, and computer program products to perform an operation comprising receiving input specifying one or more positional and dimensional properties of a first haptic animation object in an animation tool displaying a representation of a vibrotactile array comprising a plurality of actuators configured to output haptic feedback, computing, based on a rendering algorithm applied to the first haptic animation object, a vector profile for each of the actuators, and computing an intensity value for each of the actuators based on the vector profile of the respective actuator.
- FIGS. 1A-1B are schematics illustrating techniques to provide authoring tools for animated haptic media production, according to one embodiment.
- FIG. 2 illustrates a graphical user interface that provides authoring tools for animated haptic media production, according to one embodiment.
- FIG. 3 is a block diagram illustrating a system that provides authoring tools for animated haptic media production, according to one embodiment.
- FIG. 4 is a flow chart illustrating a method to provide authoring tools for animated haptic media production, according to one embodiment.
- FIG. 5 is a flow chart illustrating a method to render haptic animations, according to one embodiment.
- FIG. 6 is a block diagram illustrating components of a haptic animation application, according to one embodiment.
- Embodiments disclosed herein provide an authoring interface for users to generate animated haptic content.
- The interface allows users to create haptic animation objects, which are visualized abstract phantom sensations that can be moved in both time and space on multi-actuator vibrotactile arrays using the interface.
- Embodiments disclosed herein render the haptic animation objects using a rendering pipeline that translates the designed haptic patterns to vibrotactile arrays.
- The rendering pipeline uses perceptual models for user-optimized haptic media on actuator grids used in haptic hardware. By creating a phantom vibration sensation (a spatial vibration perceived in-between physical actuators), embodiments disclosed herein allow the creation of continuous spatial sensations with device independence.
- Embodiments disclosed herein use spatial and temporal characteristics of vibrotactile perception, allowing users to edit in space and time to generate dynamic and expressive haptic media.
- Embodiments disclosed herein allow both animation-based and individual track-based control of haptic sensation, thus allowing fine editing and tuning of haptic experiences on the fly.
- Furthermore, by leveraging hardware configuration files, the techniques described herein are independent of the underlying implementation of multi-actuator haptic hardware.
- In addition, users are allowed to “keyframe” tactile patterns by adding variations and fidelity in haptic content.
- Embodiments disclosed herein allow users to export media in different formats, allowing other tools and protocols to access the haptic media data, while also providing copy/paste-like editing features for quick authoring of repetitive haptic effects.
- FIG. 1A is a schematic 100 illustrating techniques to provide authoring tools for animated haptic media production, according to one embodiment.
- The schematic depicts stages 110, 120, and 130 of a processing pipeline: stage 110 depicts a haptic animation object 101 created by a user, stage 120 depicts a rendering stage, and stage 130 depicts a rasterization stage.
- The output of the stages 110, 120, 130 may be produced via haptic animation hardware disposed in one or more devices 151, 152, 153.
- Device 151 is a smart phone (or gaming device) that includes a vibrotactile array 161 of physical actuators, device 152 is a wearable jacket including a vibrotactile array 162 of physical actuators, and device 153 is a chair that includes a vibrotactile array 163 of physical actuators.
- Generally, haptic animation objects are high-level specifications of virtual sensations moving on a two-dimensional vibrotactile array of actuators. Therefore, as shown in stage 110, the haptic animation object 101 is depicted among a visualization of a set of actuators 102-106.
- The actuators 102-106 may represent physical actuators disposed in a given vibrotactile array 161-163 of actuators.
- Embodiments disclosed herein may leverage a configuration file specifying the actual configuration of actuators in the vibrotactile arrays 161-163 (therefore displaying, for the user, a pattern of actuators 102-106 matching the configuration of the arrays 161-163).
- The haptic animation object 101 may be static, or the user may specify a path 107 for the haptic animation object 101 to travel along.
- Generally, the user may create any number and type of haptic animation objects 101.
- A given haptic animation object 101 may include a number of parameters, such as location, size, duration, and other semantic qualities such as frequency.
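As a rough illustration of such an object, the sketch below collects these parameters into one data structure. The field names and types are assumptions for illustration only and are not taken from the patent:

```python
from dataclasses import dataclass
from typing import List, Optional, Tuple

@dataclass
class HapticAnimationObject:
    """Hypothetical container for the parameters of a haptic animation object."""
    position: Tuple[float, float]        # (x, y) location on the array, normalized
    radius: float                        # size; a larger radius implies stronger output
    start_time: float                    # seconds
    duration: float                      # seconds
    frequency: Optional[float] = None    # Hz, if the hardware exposes frequency control
    path: Optional[List[Tuple[float, float]]] = None  # optional polyline the object travels along
```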
- Once the user has created the desired haptic animation objects, embodiments disclosed herein may render the animation to create vector profiles for each actuator 102-106.
- As shown in stage 120, the output of the rendering stage is a vector profile for each actuator 102-106.
- The vector profile for each actuator 102-106 is reflected by a respective graphical representation 121-123.
- The vectors generated for each actuator 102-106 may include parameters such as duration, amplitude envelopes (such as fade-ins or fade-outs), frequencies, and start times.
- Generally, the vectors may include any type of information related to controlling a given actuator 102-106. Being device-specific, the vector formats offer finer sensation control than haptic animation objects.
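A per-actuator vector profile might be represented along the following lines; again, the exact fields are an illustrative assumption rather than the patent's format:

```python
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class ActuatorVector:
    """Hypothetical per-actuator vector profile produced by the rendering stage."""
    actuator_id: int
    start_time: float                    # seconds
    duration: float                      # seconds
    frequency: float                     # Hz
    envelope: List[Tuple[float, float]]  # (time, amplitude) breakpoints sorted by time, e.g. a fade-in/fade-out
```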
- In one embodiment, a rendering algorithm may be used to render the haptic animation object 101 to the hardware actuators.
- Generally, the rendering algorithm translates the animations created by the user to animated vibrotactile patterns on the underlying hardware.
- As shown in box 170, embodiments disclosed herein may compute the barycentric coordinates a1, a2, a3 of the haptic animation object 101 relative to a triangle defined by three actuators, in this example, actuators 102, 103, and 105.
- The barycentric coordinates a1, a2, a3 may then be scaled by an interpolation method to compute an actual intensity value for each actuator 102, 103, 105.
- Generally, an intensity value relates to one or more output properties of the actuators, such as amplitude, frequency, duration, and the like.
- The interpolation method (or rendering algorithm) may be selected from a linear, logarithmic, or power model, where Ai is the intensity of the output of an actuator i, ai is the corresponding barycentric coordinate, and Av is the intensity of the haptic animation object 101.
- In at least one embodiment, the intensity of the output of a physical actuator is based on the size (i.e., the radius or diameter) of the haptic animation object 101 created by a user. For example, the greater the diameter of the haptic animation object 101, the more intense the output of an actuator. Doing so allows the user to control the intensity of the actuator output based on the size of the haptic animation object 101.
- Applying the linear model provides a linear relationship between the barycentric coordinates and the amplitude of the vibrating actuators.
- In the logarithmic model, the barycentric coordinates are scaled in a logarithmic fashion, based on the fact that perceived intensity is logarithmically related to the physical amplitude of the vibrating actuators.
- In the power model, the barycentric coordinates are coupled to the power (square of the amplitude) of the vibrating actuators.
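The equations themselves are not reproduced in this text. One plausible reading of the linear and power models, consistent with the descriptions above and stated here only as an assumption, is:

```latex
\begin{align*}
  A_i &= a_i \, A_v && \text{(linear model)} \\
  A_i^2 &= a_i \, A_v^2 \quad\Longleftrightarrow\quad A_i = \sqrt{a_i}\, A_v && \text{(power model)}
\end{align*}
% The logarithmic model would scale the barycentric coordinates logarithmically so that
% perceived intensity tracks a_i; its exact form is not given in the text above.
```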
- FIG. 1B is a continuation of the schematic 100 and depicts the rasterization process.
- As shown in stage 130, the rasterization process rasterizes the vector sensations into a series of frames.
- The rasterization process is suitable for playback operations or exporting to a device-specific format.
- The output of the raster process is a matrix of actuator intensities.
- Each row, such as rows 140, 141, and 142, defines intensities for a given actuator, while a given column, such as the frame 131, defines the intensities at each time instance.
- A timestamp row 143 may be included, which specifies the timestamps of all rows at each frame, as defined by the rendering engine's framerate.
- A playback system may then parse the raster data, find the current row, and push the actuator settings to the device. Therefore, as shown, each vibrotactile array 161-163 of each device 151-153 may generate a phantom sensation mimicking the trajectory of the haptic animation object 101.
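A minimal sketch of such a rasterization step, assuming the vector profile structure sketched earlier and a fixed frame rate (both assumptions):

```python
import numpy as np

def rasterize(vectors, num_actuators, total_time, frame_rate=100.0):
    """Rasterize per-actuator vectors into a (num_actuators + 1) x num_frames matrix.

    Rows 0..num_actuators-1 hold actuator intensities; the final row holds the
    timestamp of each frame, mirroring the timestamp row described above.
    """
    num_frames = int(np.ceil(total_time * frame_rate))
    timestamps = np.arange(num_frames) / frame_rate
    matrix = np.zeros((num_actuators + 1, num_frames))
    matrix[-1, :] = timestamps

    for v in vectors:  # v is an ActuatorVector as sketched above
        active = (timestamps >= v.start_time) & (timestamps < v.start_time + v.duration)
        t_local = timestamps[active] - v.start_time
        env_t, env_a = zip(*v.envelope)            # (time, amplitude) breakpoints
        amp = np.interp(t_local, env_t, env_a)     # piecewise-linear amplitude envelope
        matrix[v.actuator_id, active] += amp       # sum overlapping contributions

    np.clip(matrix[:-1, :], 0.0, 1.0, out=matrix[:-1, :])  # keep intensities in [0, 1]
    return matrix
```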
- Although FIGS. 1A-1B depict a plurality of actuators, embodiments disclosed herein apply to vibrotactile arrays with any number of actuators.
- For example, the rendering process could translate vector profiles for two actuators (along a line between two actuators).
- As another example, the rendering process may translate a vector profile for a single actuator (e.g., a heartbeat could be generated using a single actuator; such a vector profile would not move locations, but may vary the properties of the actuator, such as its amplitude envelope).
- FIG. 2 illustrates a graphical user interface (GUI) 200 that provides authoring tools for animated haptic media production, according to one embodiment.
- As shown, the GUI 200 includes an animation window 201 and a timeline window 202.
- The animation window 201 may include representations of physical actuators 240-249.
- The placement of the actuators 240-249 may be generated based on a device configuration file that specifies the location and type of actuators, available rendering schemes, and any hardware-specific actuators.
- Based on this information, the GUI 200 may output a representation of triangles, such as the triangle 260, connecting each actuator 240-249.
- The triangles may be predefined in the configuration file, or may be programmatically generated using a triangulation algorithm (such as the Delaunay triangulation). Barycentric coordinates of a given haptic object can then be determined with respect to a particular triangle in order to determine the object's position relative to the actuators represented by the triangle.
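One way to implement the triangulation and the barycentric-coordinate lookup is sketched below using SciPy; the use of SciPy and the function name are assumptions, not part of the patent:

```python
import numpy as np
from scipy.spatial import Delaunay

def barycentric_for_point(actuator_positions, point):
    """Return (actuator indices, barycentric coordinates) for the triangle containing
    `point`, or (None, None) if the point lies outside the actuator mesh."""
    tri = Delaunay(np.asarray(actuator_positions, dtype=float))  # mesh over the actuator layout
    simplex = int(tri.find_simplex(np.asarray([point]))[0])
    if simplex == -1:
        return None, None
    # SciPy stores the affine transform used to obtain barycentric coordinates
    T = tri.transform[simplex]
    b = T[:2].dot(np.asarray(point, dtype=float) - T[2])
    coords = np.append(b, 1.0 - b.sum())             # (a1, a2, a3), summing to 1
    return tri.simplices[simplex], coords
```

In practice the triangulation would presumably be built once per device configuration and reused for every rendered frame.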
- Generally, a user may use the animation window 201 to create haptic animation objects 221, 222, for which the hardware ultimately generates corresponding vibrotactile sensations (or haptic feedback) on the actuators 240-249.
- For example, the user may select button 206 to create a new haptic animation object, which is displayed in the animation window 201 along with the size and position of the respective object.
- The position may be a set of (x,y) coordinates, which are not depicted in the animation window 201.
- For example, as shown, the haptic animation objects 221-222 have radii (or diameters) of 0.25 and 0.8, respectively.
- The radius of a given haptic animation object corresponds to the intensity of the haptic feedback ultimately output by the actuators 240-249. Therefore, a haptic animation object having a larger radius (object 222) will likely output haptic feedback with a greater intensity than a haptic animation object having a smaller radius (object 221).
- Although the diameters of the haptic animation objects 221, 222 are depicted as normalized numbers between 0 and 1, other units may be used. For example, decibels (dB) or physical radii (where a correlation between perceived size and intensity is found) may be used.
- As shown, buttons 207 and 208 may be used to add a path to, or remove a path from, a haptic animation object, respectively.
- For example, the user has added path 212 to haptic animation object 222.
- Generally, the motion of haptic animation objects is constrained to the defined path.
- For example, haptic animation object 222 may move from the beginning to the end of the path 212.
- An object having a path may have a single position parameter from 0 (beginning of the path) to 1 (end of the path), instead of (x,y) parameters, and is manipulated in different ways. Selecting the button 205 will cause the object 222 to move along the path 212 from start to end.
- Selecting the button 204 allows the user to move the object 222 and the path 212 together within the animation window 201 .
- The path 212 may be redefined by clicking and dragging either end of the path 212.
- When rendered, the hardware actuators may provide the sensation of the object 222 moving along the path 212.
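A small sketch of the single path parameter described above, evaluating a polyline path by arc length (the waypoint representation is an assumption):

```python
import numpy as np

def point_on_path(path, t):
    """Return the (x, y) point at parameter t in [0, 1] along a polyline path,
    where t = 0 is the beginning of the path and t = 1 is the end."""
    pts = np.asarray(path, dtype=float)
    if len(pts) < 2:
        return tuple(pts[0])
    seg = np.linalg.norm(np.diff(pts, axis=0), axis=1)      # segment lengths
    cum = np.concatenate([[0.0], np.cumsum(seg)])            # cumulative arc length
    target = np.clip(t, 0.0, 1.0) * cum[-1]
    i = min(int(np.searchsorted(cum, target, side="right")) - 1, len(seg) - 1)
    local = (target - cum[i]) / seg[i] if seg[i] > 0 else 0.0
    return tuple(pts[i] + local * (pts[i + 1] - pts[i]))
```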
- As shown, at any given point in time, each actuator 240-249 has a respective intensity value.
- For example, actuator 245 has an intensity value of 0.62 (on a scale from 0 to 1), actuator 244 has an intensity value of 0.56, and actuator 249 has an intensity value of 0.
- The intensity values may be based on the size of the haptic animation objects 221, 222, and their proximity to a given actuator.
- The intensity values at a given time may be computed using the rendering algorithms described above.
- Generally, actuators 240-249 output the sum of all values generated by each haptic animation object created by the user, such as the haptic animation objects 221, 222.
- Users may save or load animations to or from files via the save button 210 and load button 217, respectively. In at least one embodiment, the files are JavaScript Object Notation (JSON) files.
- As shown, button 211 allows users to load audio tracks, which may be visualized via the graph 215 in the timeline 202. Doing so allows the user to design haptic feedback for audio files. Overlay of video files may be provided in a similar manner (not shown).
- As shown, the timeline 202 represents any haptic animation objects in the animation window 201 by a respective track 223-225, showing each object's current position in time.
- For example, track 223 corresponds to haptic animation object 221, track 224 corresponds to haptic animation object 222, and track 225 corresponds to a haptic animation object that is not depicted in the animation window (for example, because the start time of that haptic animation object has not yet been reached).
- The cursor 216 shows the current time and can be dragged by the user.
- Users may manipulate haptic animation objects via their tracks in the timeline 202. For example, by clicking and resizing the track 224, the user may cause the haptic animation object 222 to have a longer or shorter duration.
- Similarly, the parameters 219 allow users to provide different parameter values for each actuator.
- By checking a box 220, a user may specify that a parameter is “keyframeable.”
- A keyframeable parameter has a value that depends on the current time.
- When the value is changed, a keyframe 218 is automatically created at the current time. Values may be linearly interpolated between keyframe values.
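The linear interpolation between keyframes can be sketched as follows (the list-of-pairs representation is assumed):

```python
def keyframe_value(keyframes, t):
    """Linearly interpolate a keyframeable parameter at time t.

    `keyframes` is a list of (time, value) pairs sorted by time, one pair per keyframe."""
    if not keyframes:
        raise ValueError("parameter has no keyframes")
    if t <= keyframes[0][0]:
        return keyframes[0][1]
    if t >= keyframes[-1][0]:
        return keyframes[-1][1]
    for (t0, v0), (t1, v1) in zip(keyframes, keyframes[1:]):
        if t0 <= t <= t1:
            alpha = (t - t0) / (t1 - t0)
            return v0 + alpha * (v1 - v0)
```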
- The new vector button 209 allows users to create vector sensations for a selected haptic animation object 221, 222.
- Vector sensations control each actuator 240-249 directly through parameter values, controlling the actuator's voltage from 0 to 1 (the same as controlling the radius parameter of the haptic animation objects 221, 222).
- The corresponding actuator may be highlighted in the animation window 201 when the text field 219 or track 223-225 is selected.
- In addition, each track 223-225 is keyframeable, allowing the user to manipulate each individual actuator 240-249 for fine-tuning.
- The play button 214 allows users to play an entire animation, while the pause button 231 allows users to pause the animation during playback.
- During playback, the animation runs in the animation window 201, while the corresponding parameters in the timeline 202 vary.
- Simultaneously, vibrotactile simulations (e.g., haptic feedback) may be activated on hardware connected to the system executing the GUI 200, allowing the user to feel the sensations.
- Further still, the GUI 200 allows for real-time feedback while manipulating haptic animation objects.
- FIG. 3 is a block diagram illustrating a system 300 that provides authoring tools for animated haptic media production, according to one embodiment.
- The system 300 includes a computer 302 connected to other computers via a network 330.
- In general, the network 330 may be a telecommunications network and/or a wide area network (WAN).
- In a particular embodiment, the network 330 includes access to the Internet.
- The computer 302 generally includes a processor 304, which obtains instructions and data via a bus 320 from a memory 306 and/or storage 308.
- The computer 302 may also include one or more network interface devices 318, input devices 322, output devices 324, and a vibrotactile array 325 connected to the bus 320.
- The computer 302 is generally under the control of an operating system. Examples of operating systems include the UNIX operating system, versions of the Microsoft Windows operating system, and distributions of the Linux operating system. (UNIX is a registered trademark of The Open Group in the United States and other countries. Microsoft and Windows are trademarks of Microsoft Corporation in the United States, other countries, or both. Linux is a registered trademark of Linus Torvalds in the United States, other countries, or both.) More generally, any operating system supporting the functions disclosed herein may be used.
- The processor 304 is a programmable logic device that performs instruction, logic, and mathematical processing, and may be representative of one or more CPUs.
- The network interface device 318 may be any type of network communications device allowing the computer 302 to communicate with other computers via the network 330.
- The storage 308 is representative of hard-disk drives, solid state drives, flash memory devices, optical media, and the like. Generally, the storage 308 stores application programs and data for use by the computer 302. In addition, the memory 306 and the storage 308 may be considered to include memory physically located elsewhere; for example, on another computer coupled to the computer 302 via the bus 320.
- The input device 322 may be any device for providing input to the computer 302.
- For example, a keyboard and/or a mouse may be used.
- The input device 322 represents a wide variety of input devices, including keyboards, mice, controllers, and so on.
- The output device 324 may include monitors, touch screen displays, and so on.
- The vibrotactile array 325 is an object that includes one or more hardware actuators configured to output haptic feedback.
- Similarly, each of a plurality of devices 350 includes a respective vibrotactile array 325.
- The devices 350 may be, for example and without limitation, smart phones, portable media players, portable gaming devices, wearable devices, furniture, and the like.
- As shown, the memory 306 contains a haptic animation application (HAA) 312, which is generally configured to provide the GUI 200 and generate haptic feedback that is outputted by the actuators of the vibrotactile array 325.
- The HAA 312 may apply a rendering algorithm to haptic animation objects, such as the haptic animation objects 221-222, to translate the visual objects to sensations outputted by the actuators of the vibrotactile array 325.
- Generally, the HAA 312 may apply a rendering pipeline that converts the visual haptic animation objects to vector sensations and then to a raster format, the latter of which is output to hardware, such as the vibrotactile array 325.
- The rendering algorithm applied by the HAA 312 translates virtual percepts to the vibrotactile array 325.
- Initially, the HAA 312 may construct a Delaunay triangulation for all actuators in the array 325 to define a mesh on the array 325.
- At each instant of rendering, the HAA 312 uses barycentric coordinates of the haptic animation objects relative to a triangle defined by three real actuators of the array 325.
- The HAA 312 may then apply an interpolation method (linear, logarithmic, or power, as described above) to determine the actual intensity of each actuator in the array 325.
- As shown, the storage 308 includes the output 316 and the settings 317.
- The output 316 may be configured to store output generated by the HAA 312, such as vector profiles generated during the rendering phase, the matrix of actuator intensity data generated during the rasterizing phase, and the like.
- The settings 317 may include hardware-specific settings and parameters for a given vibrotactile array 325.
- The settings 317 may specify, without limitation, a width and height of the vibrotactile array 325, a dictionary of actuator types (e.g., voice coils, rumble motors) each with a list of control parameters (e.g., frequency, intensity) specifying minimum and maximum allowable values, a location and type of each actuator in the vibrotactile array 325, communication protocols and rendering methods supported by the vibrotactile array 325, and default settings of the vibrotactile array.
- In the event a device 350 has more than one vibrotactile array 325 (such as in the back and seat of a chair), the settings 317 may specify a height and width for each array 325, with each array 325 separated by a corresponding identifier.
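The text does not give a concrete schema for the settings 317; a hypothetical configuration mirroring the fields listed above might look like this:

```python
# Hypothetical device configuration; every key name and value here is an assumption.
CHAIR_BACK_SETTINGS = {
    "array_id": "chair-back",
    "width": 0.40,    # assumed units (e.g., meters)
    "height": 0.60,
    "actuator_types": {
        "voice_coil":   {"frequency": {"min": 50, "max": 500}, "intensity": {"min": 0.0, "max": 1.0}},
        "rumble_motor": {"intensity": {"min": 0.0, "max": 1.0}},
    },
    "actuators": [
        {"id": 0, "type": "voice_coil", "position": [0.10, 0.10]},
        {"id": 1, "type": "voice_coil", "position": [0.30, 0.10]},
        # ... one entry per physical actuator in the array
    ],
    "protocols": ["usb-serial"],
    "rendering_methods": ["linear", "logarithmic", "power"],
    "defaults": {"rendering_method": "linear", "frame_rate": 100},
}
```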
- FIG. 4 is a flow chart illustrating a method 400 to provide authoring tools for animated haptic media production, according to one embodiment.
- As shown, the method 400 begins at step 410, where the haptic animation application (HAA) 312 provides the user interface, such as the GUI 200, that includes a representation of a vibrotactile array 325 including a plurality of actuators.
- At step 420, a user provides input creating haptic animation objects and specifying any movements or paths the objects should move along.
- The user may also provide other details regarding the haptic animation objects, such as radius (or size), start times, end times, durations, frequencies, and the like.
- For example, a user may specify a position, size, and frequency parameter for a moving virtual point, while specifying a position, frequency, and size of a “rain” effect.
- At step 430, the HAA 312 may render the haptic animation objects created by the user to create vectors for the actuators in the vibrotactile array 325.
- At step 440, the HAA 312 may rasterize the vectors created at step 430 to create a matrix of intensity values for the actuators in the vibrotactile array 325.
- Each row of the matrix of actuator intensities may define intensities of an actuator, while each column contains an intensity value at a corresponding instance of time.
- At step 450, the HAA 312 may store the computed matrix and/or send the computed matrix to a device, which may parse the raster data, find the correct row, and push the settings to the respective actuator.
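On the device side, the playback step might be sketched as follows, assuming the matrix layout used in the rasterization sketch above and a placeholder callback for the device-specific protocol:

```python
import time

def play(matrix, push_to_hardware):
    """Step through a rasterized matrix (one row per actuator plus a timestamp row)
    and push each frame's actuator intensities to the hardware."""
    timestamps = matrix[-1, :]
    start = time.monotonic()
    for frame in range(matrix.shape[1]):
        delay = timestamps[frame] - (time.monotonic() - start)
        if delay > 0:
            time.sleep(delay)                    # wait until this frame's timestamp
        push_to_hardware(matrix[:-1, frame])     # intensities for every actuator
```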
- FIG. 5 is a flow chart illustrating a method 500 corresponding to step 430 to render haptic animations, according to one embodiment.
- Generally, the HAA 312 may perform the steps of the method 500 to translate haptic animation objects created by users to animated patterns that can be played back by the actuators of the vibrotactile array 325. Although depicted as being performed on a single haptic animation object, the HAA 312 may perform the steps of the method 500 for each haptic animation object created by a user.
- At step 505, a user or the HAA 312 may select a target device including a vibrotactile array 325 (such as a chair, phone, and the like), and the HAA 312 may construct a Delaunay triangulation for all actuators to automatically define a mesh on the actuators of the vibrotactile array 325 of the selected device.
- At step 510, the HAA 312 (or a user) may select an interpolation algorithm, such as the linear, logarithmic, or power algorithms described above.
- At step 520, the HAA 312 may compute the barycentric coordinates of the haptic animation object relative to a triangle defined by three actuators.
- At step 530, the HAA 312 performs a loop including step 540 for each actuator.
- At step 540, the HAA 312 may apply the interpolation algorithm selected at step 510 to compute the intensity value for the current actuator based on the intensity of the haptic animation object.
- At step 550, the HAA 312 determines whether more actuators remain. If more actuators remain, the HAA 312 returns to step 530; otherwise, the method 500 ends.
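Putting steps 520-550 together, one frame of intensities might be computed roughly as below. This builds on the `barycentric_for_point` and object sketches above and is an illustration rather than the patent's implementation; the logarithmic variant is omitted because its exact form is not given here.

```python
def render_frame(objects, actuator_positions, model="linear"):
    """Sum each object's interpolated contribution into a per-actuator intensity list."""
    intensities = [0.0] * len(actuator_positions)
    for obj in objects:                               # HapticAnimationObject instances
        ids, bary = barycentric_for_point(actuator_positions, obj.position)
        if ids is None:
            continue                                  # object lies outside the actuator mesh
        for actuator_id, a_i in zip(ids, bary):
            a_i = max(float(a_i), 0.0)                # guard against tiny negative float error
            # obj.radius stands in for the object's intensity Av, per the size-to-intensity mapping above
            if model == "linear":
                intensities[actuator_id] += a_i * obj.radius            # Ai = ai * Av
            elif model == "power":
                intensities[actuator_id] += (a_i ** 0.5) * obj.radius   # Ai^2 = ai * Av^2
    return [min(1.0, v) for v in intensities]
```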
- FIG. 6 is a block diagram 600 illustrating components of the haptic animation application (HAA) 312 , according to one embodiment.
- As shown, the HAA 312 includes an interface component 601, a rendering component 602, and a rasterizing component 603.
- The interface component 601 is configured to output and manage a graphical user interface, such as the GUI 200, used to create haptic animation objects.
- The rendering component 602 is configured to render the created animation objects to compute output values for each actuator, as described in greater detail above.
- The output values may be a vector of values describing different properties of each actuator in the vibrotactile array 325.
- The rasterizing component 603 is configured to rasterize the parameters of the vectors created by the rendering component 602 into a matrix of intensity values for each actuator at a given point in time.
- Advantageously, embodiments disclosed herein allow users to easily generate haptic effects based on haptic animation objects created in a graphical user interface.
- Advantageously, the techniques described herein create data that is consumable by all types of hardware configured to generate haptic feedback via a plurality of actuators.
- Aspects of the present invention may be embodied as a system, method or computer program product. Accordingly, aspects of the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment (including firmware, resident software, micro-code, etc.) or an embodiment combining software and hardware aspects that may all generally be referred to herein as a “circuit,” “module” or “system.” Furthermore, aspects of the present invention may take the form of a computer program product embodied in one or more computer readable medium(s) having computer readable program code embodied thereon.
- The computer readable medium may be a computer readable signal medium or a computer readable storage medium.
- A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing.
- A computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
- A computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof.
- A computer readable signal medium may be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
- Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.
- Computer program code for carrying out operations for aspects of the present invention may be written in any combination of one or more programming languages, including an object oriented programming language such as Java, Smalltalk, C++ or the like and conventional procedural programming languages, such as the “C” programming language or similar programming languages.
- The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer or server.
- The remote computer may be connected to the user's computer through any type of network, including a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet Service Provider).
- These computer program instructions may also be stored in a computer readable medium that can direct a computer, other programmable data processing apparatus, or other devices to function in a particular manner, such that the instructions stored in the computer readable medium produce an article of manufacture including instructions which implement the function/act specified in the flowchart and/or block diagram block or blocks.
- The computer program instructions may also be loaded onto a computer, other programmable data processing apparatus, or other devices to cause a series of operational steps to be performed on the computer, other programmable apparatus or other devices to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide processes for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks.
- Embodiments of the invention may be provided to end users through a cloud computing infrastructure.
- Cloud computing generally refers to the provision of scalable computing resources as a service over a network.
- Cloud computing may be defined as a computing capability that provides an abstraction between the computing resource and its underlying technical architecture (e.g., servers, storage, networks), enabling convenient, on-demand network access to a shared pool of configurable computing resources that can be rapidly provisioned and released with minimal management effort or service provider interaction.
- Cloud computing allows a user to access virtual computing resources (e.g., storage, data, applications, and even complete virtualized computing systems) in “the cloud,” without regard for the underlying physical systems (or locations of those systems) used to provide the computing resources.
- Typically, cloud computing resources are provided to a user on a pay-per-use basis, where users are charged only for the computing resources actually used (e.g., an amount of storage space consumed by a user or a number of virtualized systems instantiated by the user).
- A user can access any of the resources that reside in the cloud at any time, and from anywhere across the Internet.
- A user may access applications or related data available in the cloud.
- For example, the haptic animation application (HAA) 312 could execute on a computing system in the cloud and allow users to create animations including haptic feedback.
- In such a case, the HAA 312 could create haptic feedback and store output files including haptic feedback at a storage location in the cloud. Doing so allows a user to access this information from any computing system attached to a network connected to the cloud (e.g., the Internet).
- Each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s).
- In some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order or out of order, depending upon the functionality involved.
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Human Computer Interaction (AREA)
- Processing Or Creating Images (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
Systems, methods, and computer program products to perform an operation comprising receiving input specifying one or more positional and dimensional properties of a first haptic animation object in an animation tool displaying a representation of a vibrotactile array comprising a plurality of actuators configured to output haptic feedback, computing, based on a rendering algorithm applied to the first haptic animation object, a vector profile for each of the actuators, and computing an intensity value for each of the actuators based on the vector profile of the respective actuator.
Description
- 1. Field of the Invention
- Embodiments disclosed herein relate to haptic feedback. More specifically, embodiments disclosed herein provide authoring tools for animated haptic media production.
- 2. Description of the Related Art
- Haptic feedback has added a new dimension to modern entertainment and media applications, enhancing user interaction and immersion. Recent haptic technologies supplement movies, applications, games, and social activities with coherent and synchronized cues, creating dynamic and engaging haptic effects perceived by a user's body. However, these technologies have seen limited adoption, partly due to a dearth of haptic authoring tools that allow artists and designers to create rich haptic content. Similarly, any existing tools are not able to generate content that is universally applicable to the varying types of haptic hardware. In addition, most existing tools are designed for a single actuator, and those that accommodate multiple actuators require individual control of each actuator. These multi-track authoring tools are cumbersome and complicated for systems having dozens of actuators.
- Embodiments disclosed herein include systems, methods, and computer program products to perform an operation comprising receiving input specifying one or more positional and dimensional properties of a first haptic animation object in an animation tool displaying a representation of a vibrotactile array comprising a plurality of actuators configured to output haptic feedback, computing, based on a rendering algorithm applied to the first haptic animation object, a vector profile for each of the actuators, and computing an intensity value for each of the actuators based on the vector profile of the respective actuator.
- So that the manner in which the above recited aspects are attained and can be understood in detail, a more particular description of embodiments of the invention, briefly summarized above, may be had by reference to the appended drawings.
- It is to be noted, however, that the appended drawings illustrate only typical embodiments of this invention and are therefore not to be considered limiting of its scope, for the invention may admit to other equally effective embodiments.
-
FIGS. 1A-1B are schematics illustrating techniques to provide authoring tools for animated haptic media production, according to one embodiment. -
FIG. 2 illustrates a graphical user interface that provides authoring tools for animated haptic media production, according to one embodiment. -
FIG. 3 is a block diagram illustrating a system that provides authoring tools for animated haptic media production, according to one embodiment. -
FIG. 4 is a flow chart illustrating a method to provide authoring tools for animated haptic media production, according to one embodiment. -
FIG. 5 is a flow chart illustrating a method to render haptic animations, according to one embodiment. -
FIG. 6 is a block diagram illustrating components of a haptic animation application, according to one embodiment. - Embodiments disclosed herein provide an authoring interface for users to generate animated haptic content. The interface allows users to create haptic animation objects, which are visualized abstract phantom sensations that can be moved in both time and space on multi-actuator vibrotactile arrays using the interface. Embodiments disclosed herein render the haptic animation objects using a rendering pipeline that translates the designed haptic patterns to vibrotactile arrays. The rendering pipeline uses perceptual models for user-optimized haptic media on actuator grids used in haptic hardware. By creating a phantom vibration sensation (a spatial vibration perceived in-between physical actuators), embodiments disclosed herein allow the creation of continuous spatial sensations with device independence.
- Embodiments disclosed herein use spatial and temporal characteristics of vibrotactile perception, allowing users to edit in space and time to generate dynamic and expressive haptic media. Embodiments disclosed herein allow both animation-based and individual track-based control of haptic sensation, thus allowing fine editing and tuning of haptic experiences on the fly. Furthermore, by leveraging hardware configuration files, the techniques described herein are independent of the underlying implementation of multi-actuator haptic hardware. In addition, users are allowed to “keyframe” tactile patterns by adding variations and fidelity in haptic content. Embodiments disclosed herein allow users to export media in different formats, allowing other tools and protocols to access the haptic media data, while also allowing copy/paste like editable features for quick authoring of repetitive haptic effects.
-
FIG. 1A is a schematic 100 illustrating techniques to provide authoring tools for animated haptic media production, according to one embodiment. As shown, the schematic depictsstages stage 110 depicts ahaptic animation object 101 created by a user, whilestage 120 depicts a rendering stage, andstage 130 depicts a rasterization stage. The output of thestages more devices device 151 is a smart phone (or gaming device) which includes avibrotactile array 161 of physical actuators. Similarly,device 152 is a wearable jacket including avibrotactile array 162 of physical actuators, whiledevice 153 is a chair that includes avibrotactile array 163 of physical actuators. - Generally, haptic animation objects are high-level specifications of virtual sensations moving on a two-dimensional vibrotactile array of actuators. Therefore, as shown in
stage 110, thehaptic animation object 101 is depicted among a visualization of a set of actuators 102-106. The actuators 102-106 may represent physical actuators disposed in a given vibrotactile array 161-163 of actuators. Embodiments disclosed herein may leverage a configuration file specifying the actual configuration of actuators in the vibrotactile arrays 161-163 (therefore displaying, for the user, a pattern of actuators 102-106 matching the configuration of the arrays 161-13). As shown instage 110, thehaptic animation object 101 may be static, or the user may specify apath 107 for thehaptic animation object 101 to travel across. Generally, the user may create any number and type ofhaptic animation objects 101. A givenhaptic animation object 101 may include a number of parameters, such as location, size, duration and other semantic qualities such as frequency. - Once the user has created the desired haptic animation objects, embodiments disclosed herein may render the animation to create vector profiles for each actuator 102-106. As shown in
stage 120, the output of the rendering stage is a vector profile for each actuator 102-106. The vector profile for each actuator 102-106, is reflected by a respective graphical representation 121-123. The vectors generated for each actuator 102-106 may include parameters such as duration, amplitude envelopes (such as fade-ins or fade-outs), frequencies, and start times. Generally, the vectors may include any type of information related to controlling a given actuator 102-106. Being device-specific, the vector formats offer finer sensation control than haptic animation objects. - In one embodiment, a rendering algorithm may be used to render the
haptic animation object 101 to the hardware actuators. Generally, the rendering algorithm translates the animations created in by the user to animated vibrotactile patterns on the underlying hardware. As shown inbox 170, embodiments disclosed herein may compute the barycentric coordinates a1, a2, a3, of thehaptic animation object 101 relative to a triangle defined by three actuators, in this example,actuators actuator -
- Where Ai is the intensity of the output of an actuator i, ai is a barycentric coordinate, and Av is the intensity of the
haptic animation object 101. In at least one one embodiment, the intensity of the output of a physical actuator is based on the size (i.e., the radius or diameter) of thehaptic animation object 101 created by a user. For example, the greater the diameter of thehaptic animation object 101, the more intense the output of an actuator. Doing so allows the user to control the intensity of the actuator output based on the size of thehaptic animation object 101. - For example, applying the linear model provides a linear relationship between the barycentric coordinates and the amplitude of the vibrating actuators. In the logarithmic model, the barycentric coordinates are scaled in a logarithmic fashion, based on the fact that the perceived intensity is logarithmic related to the physical amplitude of the vibrating actuators. In the power model, the barycentric coordinates are coupled to the power (square of the amplitude) of the vibrating actuators.
-
FIG. 1B is a continuation of the schematic 100 and depicts the rasterization process. As shown instage 130, the rasterization process rasterizes the vector sensations into a series of frames. The rasterization process is suitable for playback operations or exporting to a device specific format. The output of the raster process is a matrix of actuator intensities. Each row, such asrow frame 131, defines the intensities at each time instance. Atimestamp row 143 may be included which specifies timestamps of all rows at a frame defined by the rendering engine's framerate. A playback system may then parse the raster data, find a current row, and push the actuator settings to the device. Therefore, as shown, each vibrotactile array 161-163 of each device 151-153 may generate a phantom sensation mimicking the trajectory of thehaptic animation object 101. - Although
FIGS. 1A-1B depict a plurality of actuators, embodiments disclosed herein apply to vibrotactile arrays with any number of actuators. For example, the rendering process could translate vector profiles for two actuators (along a line between two actuators). As another example, the rendering process may translate a vector profile for a single actuator (e.g., a heartbeat could be generated by using a single actuator. A vector profile would not move locations, but may vary the properties of the actuator, such as amplitude envelope). -
FIG. 2 illustrates a graphical user interface (GUI) 200 that provides authoring tools for animated haptic media production, according to one embodiment. As shown, theGUI 200 includes ananimation window 201 and atimeline window 202. As shown, theanimation window 201 may include representations of physical actuators 240-249. The placement of the actuators 240-249 may be generated based on a device configuration file that specifies the location and type of actuators, available rendering schemes, and any hardware-specific actuators. Based on this information, theGUI 200 may output a representation of triangles, such as thetriangle 260, connecting each actuator 240-249. The triangles may be predefined in the configuration file, or may be programmatically generated using a triangulation algorithm (such as the Delaunay triangulation). Barycentric coordinates of a given haptic object can then be determined with respect to a particular triangle in order to determine the object's position relative to the actuators represented by the triangle. - Generally, a user may use the
animation window 201 to create haptic animation objects 221, 222, for which the hardware ultimately generates corresponding vibrotactile sensations (or haptic feedback) on the actuators 240-249. For example, the user may selectbutton 206 to create a new haptic animation object, which is displayed in theanimation window 201, which may display the size and position of the respective object. The position may be a set of (x,y) coordinates, which are not depicted in theanimation window 201. For example, as shown, the haptic animation objects 221-222 have respective radii (or diameters) of 0.25 and 0.8, respectively. The radius of a given haptic animation object corresponds to the intensity of the haptic feedback ultimately output by the actuators 240-249. Therefore, a haptic animation object having a larger radius (object 222) will likely output haptic feedback with a greater intensity than a haptic animation object having a smaller radius (object 221). Although shown as depicting the diameters of the haptic animation objects 221, 222 as a normalized number between 0 and 1, other units may be used. For example, decibels (dB) or physical radii (where a correlation between perceived size and intensity is found) may be used. - As shown,
buttons path 212 tohaptic animation object 222. Generally, the motion of haptic animation objects is constrained to the defined path. For example,haptic animation object 222 may move from the beginning to the end of thepath 212. An object having a path may have a single position parameter from 0 (beginning of the path) to 1 (end of the path), instead of (x,y) parameters, and is manipulated in different ways. Selecting thebutton 205 will cause theobject 222 to move along thepath 212 from start to end. Selecting thebutton 204 allows the user to move theobject 222 and thepath 212 together within theanimation window 201. Thepath 212 may be redefined by clicking and dragging either end of thepath 212. When rendered, the hardware actuators may provide the sensation of theobject 222 moving along thepath 212. - As shown, at any given point in time, each actuator 240-249 has a respective intensity value. For example, as shown,
actuator 245 has an intensity value of 0.62 (on a scale from 0-1), whileactuator 244 has an intensity value of 0.56, andactuator 249 has an intensity value of 0. The intensity values may be based on the size of the haptic animation objects 221, 222, and their proximity to a given actuator. The intensity values at a given time may be computed using the rendering algorithms described above. Generally, actuators 240-249 output the sum of all values generated by each haptic animation object created by the user, such as the haptic animation objects 221, 222. - Users may save or load animations to or from files via the save button 210 and
load button 217, respectively. In at least one embodiment, the files are JavaScript Object Notation (JSON) files. As shown,button 211 allows users to load audio tracks, which may be visualized via thegraph 215 in thetimeline 202. Doing so allows the user to design haptic feedback for audio files. Overlay of video files may be provided in a similar manner (not shown). - As shown, the
timeline 202 represents any haptic animation objects in theanimation window 201 by a respective track 223-225, showing the objects' current position in time. For example, track 223 corresponds tohaptic animation object 221,track 224 corresponds tohaptic animation object 222, and track 225 corresponds to a haptic animation object that is not depicted in the animation window (for example, because the start time of depicting the haptic animation object has not occurred). Thecursor 216 shows the current time, and can be dragged around by the user. Users may manipulate haptic animation objects via thetracks timeline 202. For example, by clicking and resizing thetrack 224, the user may cause thehaptic animation object 222 to have a longer or shorter duration. Similarly, theparameters 219 allows users to provide different parameter values for each actuator. - By checking a
box 220, a user may specify that a parameter is “keyframeable.” A keyframeable parameter has a value that depends on the current time. When the value is changed, akeyframe 218 is automatically created at the current time. Values may be linearly interpolated between keyframe values. -
New vector button 209 allows users to create vector sensations for a selectedhaptic animation object animation window 201 when thetext field 219 or track 223-225 is selected. In addition, each track 223-225 is keyframeable, allowing the user to manipulate each individual actuator 240-249 for fine tuning. - The
play button 214 allows users to play an entire animation, while thepause button 231 allows users to pause the animation during playback. During playback, the animation runs in theanimation window 201, while the corresponding parameters in thetimeline 202 vary. Simultaneously, vibrotactile simulations (e.g., haptic feedback) may be activated on hardware connected to the system executing theGUI 200, allowing the user to feel the sensations. Further still, theGUI 200 allows for real-time feedback while manipulating haptic animation objects. -
FIG. 3 is a block diagram illustrating a system 300 that provides authoring tools for animated haptic media production, according to one embodiment. The system 300 includes acomputer 302 connected to other computers via anetwork 330. In general, thenetwork 330 may be a telecommunications network and/or a wide area network (WAN). In a particular embodiment, thenetwork 330 includes access to the Internet. - The
computer 302 generally includes aprocessor 304 which obtains instructions and data via abus 320 from amemory 306 and/orstorage 308. Thecomputer 302 may also include one or morenetwork interface devices 318,input devices 322,output devices 324, andvibrotactile array 325 connected to thebus 320. Thecomputer 302 is generally under the control of an operating system. Examples of operating systems include the UNIX operating system, versions of the Microsoft Windows operating system, and distributions of the Linux operating system. (UNIX is a registered trademark of The Open Group in the United States and other countries. Microsoft and Windows are trademarks of Microsoft Corporation in the United States, other countries, or both. Linux is a registered trademark of Linus Torvalds in the United States, other countries, or both.) More generally, any operating system supporting the functions disclosed herein may be used. Theprocessor 304 is a programmable logic device that performs instruction, logic, and mathematical processing, and may be representative of one or more CPUs. Thenetwork interface device 318 may be any type of network communications device allowing thecomputer 302 to communicate with other computers via thenetwork 330. - The
storage 308 is representative of hard-disk drives, solid state drives, flash memory devices, optical media and the like. Generally, thestorage 308 stores application programs and data for use by thecomputer 302. In addition, thememory 306 and thestorage 308 may be considered to include memory physically located elsewhere; for example, on another computer coupled to thecomputer 302 via thebus 320. - The
input device 322 may be any device for providing input to thecomputer 302. For example, a keyboard and/or a mouse may be used. Theinput device 322 represents a wide variety of input devices, including keyboards, mice, controllers, and so on. Theoutput device 324 may include monitors, touch screen displays, and so on. Thevibrotactile array 325 is an object that includes one or more hardware actuators configured to output haptic feedback. Similarly, a plurality ofdevices 350 include a respectivevibrotactile array 325. Thedevices 350 may be, for example and without limitation, smart phones, portable media players, portable gaming devices, wearable devices, furniture, and the like. - As shown, the
memory 306 contains a haptic animation application (HAA) 312, which is generally configured to provide theGUI 200 and generate haptic feedback that is outputted by the actuators of thevibrotactile array 325. TheHAA 312 may apply a rendering algorithm to haptic animation objects, such as the haptic animation objects 221-222, to translate the visual objects to sensations outputted by the actuators of thevibrotactile array 325. Generally, theHAA 312 may apply a rendering pipeline that converts the visual haptic animation objects to vector sensations and a raster format, the latter of which outputs to hardware, such as thevibrotactile array 325. - The rendering algorithm applied by the
HAA 312 translates virtual percepts to thevibrotactile array 325. Initially, theHAA 312 may construct a Delaunay triangulation for all actuators in thearray 325 to define a mesh on thearray 325. At each instant of rendering, theHAA 312 uses barycentric coordinates of the haptic animation objects relative to a triangle defined by three real actuators of thearray 325. TheHAA 312 may then apply an interpolation method (linear, logarithmic, or power, as described above), to determine the actual intensity of each actuator in thearray 325. - As shown, the storage includes the output 316 and the
settings 317. The output 316 may be configured to store output generated by theHAA 312, such as vector profiles generated during the rendering phase, the matrix of actuator intensity data generated during the rasterizing phase, and the like. Thesettings 317 may include hardware-specific settings and parameters for a givenvibrotactile array 325. Thesettings 317 may specify, without limitation, a width and height of thevibrotactile array 325, a dictionary of actuator types (e.g., voice coils, rumble motors) each with a list of control parameters (e.g., frequency, intensity) specifying minimum and maximum allowable values, a location and type of each actuator in thevibrotactile array 325, communication protocols and rendering methods supported by thevibrotactile array 325, and default settings of the vibrotactile array. In the event adevice 350 has more than one vibrotactile array 325 (such as in the back and seat of a chair), thesettings 317 may specify height and width for eacharray 325, with eacharray 325 separated by a corresponding identifier. -
FIG. 4 is a flow chart illustrating amethod 400 to provide authoring tools for animated haptic media production, according to one embodiment. As shown, themethod 400 begins atstep 410, where the haptic animation application (HAA) 312 provides the user interface, such as theGUI 200, that includes a representation of avibrotactile array 325 including a plurality of actuators. Atstep 420, a user provides input creating haptic animation objects and specifying any movements or paths the objects should move along. The user may also provide other details regarding the haptic animation objects, such as radius (or size), start times, end times, durations, frequencies, and the like. For example, a user may specify a position, size, and frequency parameter for a moving virtual point, while specifying a position, frequency, and size of a “rain” effect. - At
step 430, theHAA 312 may render the haptic animation objects created by the user to create vectors for the actuators in thevibrotactile array 325. Atstep 440, theHAA 312 may rasterize the vectors created atstep 430 to create a matrix of intensity values for the actuators in thevibrotactile array 325. Each row of the matrix of actuator intensities may define intensities of an actuator, while each column contains an intensity value at a corresponding instance of time. Atstep 450, theHAA 312 may store the computed matrix and/or send the computed matrix to a device, which may parse the raster data, finds the correct row, and pushes the settings to the respective actuator. -
- FIG. 5 is a flow chart illustrating a method 500 corresponding to step 430 to render haptic animations, according to one embodiment. Generally, the HAA 312 may perform the steps of the method 500 to translate haptic animation objects created by users into animated patterns that can be played back by the actuators of the vibrotactile array 325. Although depicted as being performed on a single haptic animation object, the HAA 312 may perform the steps of the method 500 for each haptic animation object created by a user.
- At step 505, a user or the HAA 312 may select a target device including a vibrotactile array 325 (such as a chair, phone, and the like), and the HAA 312 may construct a Delaunay triangulation over all actuators to automatically define a mesh on the actuators of the vibrotactile array 325 of the selected device. At step 510, the HAA 312 (or a user) may select an interpolation algorithm, such as the linear, logarithmic, or power algorithms described above. At step 520, the HAA 312 may compute the barycentric coordinates of the haptic animation object relative to a triangle defined by three actuators. At step 530, the HAA 312 enters a loop that includes step 540 for each actuator. At step 540, the HAA 312 may apply the interpolation algorithm selected at step 510 to compute the intensity value for the current actuator based on the intensity of the haptic animation object. At step 550, the HAA 312 determines whether more actuators remain. If more actuators remain, the HAA 312 returns to step 530; otherwise, the method 500 ends.
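- The loop of steps 505 through 550 might look roughly like the sketch below, which uses SciPy's Delaunay triangulation to build the mesh and its affine transform to obtain barycentric coordinates. The function names, the interpolate callback (standing in for the linear, logarithmic, or power choice of step 510), and the handling of objects that fall outside the mesh are all assumptions made for illustration.

```python
import numpy as np
from scipy.spatial import Delaunay

def build_mesh(actuator_xy):
    """Step 505: triangulate the actuator layout once per selected device."""
    return Delaunay(np.asarray(actuator_xy, dtype=float))

def render_instant(tri, object_xy, object_intensity, interpolate):
    """Steps 520-550 for a single haptic animation object at one instant."""
    p = np.asarray(object_xy, dtype=float)
    simplex = int(tri.find_simplex(p[None, :])[0])
    intensities = np.zeros(tri.points.shape[0])
    if simplex < 0:                              # object lies outside the actuator mesh
        return intensities
    verts = tri.simplices[simplex]               # the three real actuators enclosing the object
    T = tri.transform[simplex]                   # affine map to barycentric coordinates
    b = T[:2] @ (p - T[2])
    weights = np.append(b, 1.0 - b.sum())
    for actuator, w in zip(verts, weights):      # step 540, applied per enclosing actuator
        intensities[actuator] = interpolate(w, object_intensity)
    return intensities

# Example with a 2x2 grid of actuators and a linear interpolation callback.
mesh = build_mesh([(0, 0), (1, 0), (0, 1), (1, 1)])
print(render_instant(mesh, (0.3, 0.4), 0.8, lambda w, a: a * w))
```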
- FIG. 6 is a block diagram 600 illustrating components of the haptic animation application (HAA) 312, according to one embodiment. As shown, the HAA 312 includes an interface component 601, a rendering component 602, and a rasterizing component 603. The interface component 601 is configured to output and manage a graphical user interface used to create haptic animation objects, such as the GUI 200. The rendering component 602 is configured to render the created animation objects to compute output values for each actuator, as described in greater detail above. The output values may be a vector of values describing different properties of each actuator in the vibrotactile array 325. The rasterizing component 603 is configured to rasterize the parameters of the vectors created by the rendering component 602 into a matrix of intensity values for each actuator at a given point in time.
- Advantageously, embodiments disclosed herein allow users to easily generate haptic effects based on haptic animation objects created in a graphical user interface. Advantageously, the techniques described herein create data that is consumable by all types of hardware configured to generate haptic feedback via a plurality of actuators.
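- Read together, the components of FIG. 6 suggest a simple pipeline. The sketch below shows one way they might be wired; the class and method names are invented for illustration and are not taken from the HAA 312.

```python
class HapticAnimationPipeline:
    """Illustrative wiring of interface, rendering, and rasterizing components."""

    def __init__(self, interface, renderer, rasterizer, device):
        self.interface = interface      # stands in for interface component 601
        self.renderer = renderer        # stands in for rendering component 602
        self.rasterizer = rasterizer    # stands in for rasterizing component 603
        self.device = device            # target hardware with a vibrotactile array

    def export(self, duration_s):
        objects = self.interface.collect_objects()                 # objects drawn in the GUI
        profiles = self.renderer.render(objects)                   # per-actuator vector profiles
        matrix = self.rasterizer.rasterize(profiles, duration_s)   # actuators x time intensities
        self.device.push(matrix)                                   # each actuator reads its row
        return matrix
```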
- In the foregoing, reference is made to embodiments of the invention. However, it should be understood that the invention is not limited to specific described embodiments. Instead, any combination of the recited features and elements, whether related to different embodiments or not, is contemplated to implement and practice the invention. Furthermore, although embodiments of the invention may achieve advantages over other possible solutions and/or over the prior art, whether or not a particular advantage is achieved by a given embodiment is not limiting of the invention. Thus, the recited aspects, features, embodiments and advantages are merely illustrative and are not considered elements or limitations of the appended claims except where explicitly recited in a claim(s). Likewise, reference to “the invention” shall not be construed as a generalization of any inventive subject matter disclosed herein and shall not be considered to be an element or limitation of the appended claims except where explicitly recited in a claim(s).
- As will be appreciated by one skilled in the art, aspects of the present invention may be embodied as a system, method or computer program product. Accordingly, aspects of the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment (including firmware, resident software, micro-code, etc.) or an embodiment combining software and hardware aspects that may all generally be referred to herein as a “circuit,” “module” or “system.” Furthermore, aspects of the present invention may take the form of a computer program product embodied in one or more computer readable medium(s) having computer readable program code embodied thereon.
- Any combination of one or more computer readable medium(s) may be utilized. The computer readable medium may be a computer readable signal medium or a computer readable storage medium. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. More specific examples (a non-exhaustive list) of the computer readable storage medium would include the following: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or Flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the context of this document, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
- A computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
- Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.
- Computer program code for carrying out operations for aspects of the present invention may be written in any combination of one or more programming languages, including an object oriented programming language such as Java, Smalltalk, C++ or the like and conventional procedural programming languages, such as the “C” programming language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer or entirely on the remote computer or server. In the latter scenario, the remote computer may be connected to the user's computer through any type of network, including a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet Service Provider).
- Aspects of the present invention are described below with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to embodiments of the invention. It will be understood that each block of the flowchart illustrations and/or block diagrams, and combinations of blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks.
- These computer program instructions may also be stored in a computer readable medium that can direct a computer, other programmable data processing apparatus, or other devices to function in a particular manner, such that the instructions stored in the computer readable medium produce an article of manufacture including instructions which implement the function/act specified in the flowchart and/or block diagram block or blocks.
- The computer program instructions may also be loaded onto a computer, other programmable data processing apparatus, or other devices to cause a series of operational steps to be performed on the computer, other programmable apparatus or other devices to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide processes for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks.
- Embodiments of the invention may be provided to end users through a cloud computing infrastructure. Cloud computing generally refers to the provision of scalable computing resources as a service over a network. More formally, cloud computing may be defined as a computing capability that provides an abstraction between the computing resource and its underlying technical architecture (e.g., servers, storage, networks), enabling convenient, on-demand network access to a shared pool of configurable computing resources that can be rapidly provisioned and released with minimal management effort or service provider interaction. Thus, cloud computing allows a user to access virtual computing resources (e.g., storage, data, applications, and even complete virtualized computing systems) in “the cloud,” without regard for the underlying physical systems (or locations of those systems) used to provide the computing resources.
- Typically, cloud computing resources are provided to a user on a pay-per-use basis, where users are charged only for the computing resources actually used (e.g. an amount of storage space consumed by a user or a number of virtualized systems instantiated by the user). A user can access any of the resources that reside in the cloud at any time, and from anywhere across the Internet. In context of the present invention, a user may access applications or related data available in the cloud. For example, the haptic animation application (HAA) 312 could execute on a computing system in the cloud and allow users to create animations including haptic feedback. In such a case, the
HAA 312 could create haptic feedback and store output files including haptic feedback at a storage location in the cloud. Doing so allows a user to access this information from any computing system attached to a network connected to the cloud (e.g., the Internet). - The flowchart and block diagrams in the Figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present invention. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order or out of order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems that perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
- While the foregoing is directed to embodiments of the present invention, other and further embodiments of the invention may be devised without departing from the basic scope thereof, and the scope thereof is determined by the claims that follow.
Claims (20)
1. A method, comprising:
receiving input specifying one or more positional and dimensional properties of a first haptic animation object in an animation tool displaying a representation of a vibrotactile array comprising a plurality of actuators configured to output haptic feedback;
computing, based on a rendering algorithm applied to the first haptic animation object, a vector profile for each of the actuators; and
computing an intensity value for each of the actuators based on the vector profile of the respective actuator.
2. The method of claim 1 , wherein the one or more positional and dimensional properties of the first haptic animation object comprise: (i) a location of the first haptic animation object, (ii) a size of the first haptic animation object, (iii) a start time for producing a haptic sensation representing the first haptic animation object, and (iv) a duration of the haptic sensation.
3. The method of claim 1 , wherein the actuators of the vibrotactile array are configured to output a haptic feedback representing the first haptic animation object responsive to receiving the computed intensity values.
4. The method of claim 1 , wherein the input further specifies a path of the first haptic animation object, wherein the path is from a first position to a second position, wherein the first haptic animation object travels from the first position at a first time to the second position at a second time.
5. The method of claim 4 , wherein the vector profile comprises a set of parameters for each of the plurality of actuators at the first time and the second time, wherein intensity values are computed for each of the actuators at the first time and the second time.
6. The method of claim 5 , wherein the set of parameters in the vector profile for the actuators comprise: (i) a duration of haptic feedback outputted by the respective actuator, (ii) an amplitude of the haptic feedback, (iii) a frequency of the haptic feedback, and (iv) a start time of the haptic feedback.
7. The method of claim 1 , wherein the actuators, when receiving the intensity values, generate a single haptic sensation.
8. The method of claim 1 , wherein the rendering algorithm comprises one of: (i) a linear function, (ii) a logarithmic function, and (iii) a power function, wherein the rendering algorithm is applied to a set of barycentric coordinates of the first haptic animation object.
9. The method of claim 1 , wherein the representation of the vibrotactile array is generated based on a layout of the actuators in the vibrotactile array stored in a configuration file.
10. A computer program product, comprising:
a computer-readable storage medium having computer-readable program code embodied therewith, which, when executed by a processor, performs an operation comprising:
receiving input specifying one or more positional and dimensional properties of a first haptic animation object in an animation tool displaying a representation of a vibrotactile array comprising a plurality of actuators configured to output haptic feedback;
computing, based on a rendering algorithm applied to the first haptic animation object, a vector profile for each of the actuators; and
computing an intensity value for each of the actuators based on the vector profile of the respective actuator.
11. The computer program product of claim 10 , wherein the one or more positional and dimensional properties of the first haptic animation object comprise: (i) a location of the first haptic animation object, (ii) a size of the first haptic animation object, (iii) a start time for producing a haptic sensation representing the first haptic animation object, and (iv) a duration of the haptic sensation.
12. The computer program product of claim 10 , wherein the actuators of the vibrotactile array are configured to output a haptic feedback representing the first haptic animation object responsive to receiving the computed intensity values.
13. The computer program product of claim 10 , wherein the input further specifies a path of the first haptic animation object, wherein the path is from a first position to a second position, wherein the first haptic animation object travels from the first position at a first time to the second position at a second time.
14. The computer program product of claim 13 , wherein the vector profile comprises a set of parameters for each of the plurality of actuators at the first time and the second time, wherein intensity values are computed for each of the actuators at the first time and the second time.
15. The computer program product of claim 14 , wherein the set of parameters in the vector profile for the actuators comprise: (i) a duration of haptic feedback outputted by the respective actuator, (ii) an amplitude of the haptic feedback, (iii) a frequency of the haptic feedback, and (iv) a start time of the haptic feedback.
16. The computer program product of claim 10, wherein the animation tool comprises a graphical user interface (GUI) configured to display the first haptic animation object and enable spatiotemporal manipulation of the first haptic animation object to receive the input.
17. A system, comprising:
one or more computer processors; and
a memory containing a program which when executed by the processors performs an operation comprising:
receiving input specifying one or more positional and dimensional properties of a first haptic animation object in an animation tool displaying a representation of a vibrotactile array comprising a plurality of actuators configured to output haptic feedback;
computing, based on a rendering algorithm applied to the first haptic animation object, a vector profile for each of the actuators; and
computing an intensity value for each of the actuators based on the vector profile of the respective actuator.
18. The system of claim 17 , wherein the one or more positional and dimensional properties of the first haptic animation object comprise: (i) a location of the first haptic animation object, (ii) a size of the first haptic animation object, (iii) a start time for producing a haptic sensation representing the first haptic animation object, and (iv) a duration of the haptic sensation.
19. The system of claim 17 , wherein the input further specifies a path of the first haptic animation object, wherein the path is from a first position to a second position, wherein the first haptic animation object travels from the first position at a first time to the second position at a second time.
20. The system of claim 19 , wherein the vector profile comprises a set of parameters for each of the plurality of actuators at the first time and the second time, wherein intensity values are computed for each of the actuators at the first time and the second time, wherein the set of parameters in the vector profile for the actuators comprise: (i) a duration of haptic feedback outputted by the respective actuator, (ii) an amplitude of the haptic feedback, (iii) a frequency of the haptic feedback, and (iv) a start time of the haptic feedback, wherein the rendering algorithm comprises one of: (i) a linear function, (ii) a logarithmic function, and (iii) a power function, wherein the rendering algorithm is applied to a set of barycentric coordinates of the first haptic animation object.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US14/678,271 US10013059B2 (en) | 2015-04-03 | 2015-04-03 | Haptic authoring tool for animated haptic media production |
Publications (2)
Publication Number | Publication Date |
---|---|
US20160291694A1 (en) | 2016-10-06 |
US10013059B2 US10013059B2 (en) | 2018-07-03 |
Family
ID=57016130
Family Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US14/678,271 Active 2036-08-11 US10013059B2 (en) | 2015-04-03 | 2015-04-03 | Haptic authoring tool for animated haptic media production |
Country Status (1)
Country | Link |
---|---|
US (1) | US10013059B2 (en) |
Patent Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20110157088A1 (en) * | 2009-05-21 | 2011-06-30 | Hideto Motomura | Tactile processing device |
US20120092146A1 (en) * | 2009-12-11 | 2012-04-19 | Gwangju Institute Of Science And Technology | Method for expressing haptic information using control information, and system for transmitting haptic information |
US20110242113A1 (en) * | 2010-04-06 | 2011-10-06 | Gary Keall | Method And System For Processing Pixels Utilizing Scoreboarding |
US20140189507A1 (en) * | 2012-12-27 | 2014-07-03 | Jaime Valente | Systems and methods for create and animate studio |
US20140257806A1 (en) * | 2013-03-05 | 2014-09-11 | Nuance Communications, Inc. | Flexible animation framework for contextual animation display |
Cited By (21)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20170060240A1 (en) * | 2015-08-31 | 2017-03-02 | Fujitsu Ten Limited | Input device, display device, and program |
US10943503B2 (en) * | 2017-04-17 | 2021-03-09 | Facebook, Inc. | Envelope encoding of speech signals for transmission to cutaneous actuators |
US10867526B2 (en) | 2017-04-17 | 2020-12-15 | Facebook, Inc. | Haptic communication system using cutaneous actuators for simulation of continuous human touch |
US10650701B2 (en) | 2017-04-17 | 2020-05-12 | Facebook, Inc. | Haptic communication using inside body illusions |
US10665129B2 (en) | 2017-04-17 | 2020-05-26 | Facebook, Inc. | Haptic communication system using broad-band stimuli |
US10748448B2 (en) | 2017-04-17 | 2020-08-18 | Facebook, Inc. | Haptic communication using interference of haptic outputs on skin |
US20180301140A1 (en) * | 2017-04-17 | 2018-10-18 | Facebook, Inc. | Envelope encoding of speech signals for transmission to cutaneous actuators |
US10854108B2 (en) | 2017-04-17 | 2020-12-01 | Facebook, Inc. | Machine communication system using haptic symbol set |
US11355033B2 (en) | 2017-04-17 | 2022-06-07 | Meta Platforms, Inc. | Neural network model for generation of compressed haptic actuator signal from audio input |
US11011075B1 (en) | 2017-04-17 | 2021-05-18 | Facebook, Inc. | Calibration of haptic device using sensor harness |
US11688386B2 (en) * | 2017-09-01 | 2023-06-27 | Georgetown University | Wearable vibrotactile speech aid |
JP7314926B2 (en) | 2018-02-20 | 2023-07-26 | ソニーグループ株式会社 | Information processing device, information processing method, and program |
WO2019163260A1 (en) * | 2018-02-20 | 2019-08-29 | ソニー株式会社 | Information processing apparatus, information processing method, and program |
US11334226B2 (en) * | 2018-02-20 | 2022-05-17 | Sony Corporation | Information processing device, information processing method, and program |
EP3757721A4 (en) * | 2018-02-20 | 2021-04-21 | Sony Corporation | Information processing apparatus, information processing method, and program |
CN111712779A (en) * | 2018-02-20 | 2020-09-25 | 索尼公司 | Information processing apparatus, information processing method, and program |
JPWO2019163260A1 (en) * | 2018-02-20 | 2021-02-04 | ソニー株式会社 | Information processing equipment, information processing methods, and programs |
JPWO2020080433A1 (en) * | 2018-10-19 | 2021-09-09 | ソニーグループ株式会社 | Information processing equipment, information processing methods, and programs |
US12141357B2 (en) | 2018-10-19 | 2024-11-12 | Sony Group Corporation | Information processor, information processing method, and program |
JP7424301B2 (en) | 2018-10-19 | 2024-01-30 | ソニーグループ株式会社 | Information processing device, information processing method, and program |
US20220083141A1 (en) * | 2019-01-07 | 2022-03-17 | Google Llc | Haptic output for trackpad controlled using force signal and sense signal |
Also Published As
Publication number | Publication date |
---|---|
US10013059B2 (en) | 2018-07-03 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US10013059B2 (en) | Haptic authoring tool for animated haptic media production | |
KR101511819B1 (en) | Method system and software for providing image sensor based human machine interfacing | |
Schneider et al. | Tactile animation by direct manipulation of grid displays | |
CN102999932B (en) | chart animation | |
US11010141B2 (en) | Graphical interface to generate instructions to control a representation by an output interface of one or more objects | |
KR101575092B1 (en) | Method, system and computer-readable recording medium for creating motion sequence of animation | |
CN110796712A (en) | Material processing method, device, electronic equipment and storage medium | |
US10983812B2 (en) | Replaying interactions with a graphical user interface (GUI) presented in a video stream of the GUI | |
CN108459704A (en) | Stroke for 3-dimensional digital content operates prediction | |
WO2020220773A1 (en) | Method and apparatus for displaying picture preview information, electronic device and computer-readable storage medium | |
KR101523328B1 (en) | Method of providing pose-library for three-dimensional animation character, apparatus performing the same and storage media storing the same | |
JP6352275B2 (en) | Method, system and computer readable recording medium for generating crowd animation | |
US10395412B2 (en) | Morphing chart animations in a browser | |
JP2011096190A (en) | Method, apparatus and program for analyzing by finite element method | |
US9396574B2 (en) | Choreography of animated crowds | |
CN109493428B (en) | Optimization method and device for three-dimensional virtual model, electronic equipment and storage medium | |
US11907503B2 (en) | Switching display of page between a window of a graphical user interface and an independent child window | |
US20110175908A1 (en) | Image Effect Display Method and Electronic Apparatus Thereof | |
US11068145B2 (en) | Techniques for creative review of 3D content in a production environment | |
CN111708475A (en) | Virtual keyboard generation method and device | |
CN110888787A (en) | Data monitoring method, device and system | |
Tadel et al. | Exploring server/web-client event display for CMS | |
Van de Broek et al. | Perspective Chapter: Evolution of User Interface and User Experience in Mobile Augmented and Virtual Reality Applications | |
CN116271832A (en) | Editing method, device, medium, electronic device and program product for virtual image | |
KR20230105375A (en) | 3d image model implementation method, apparatus and systems for metaverse platform environment production |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| AS | Assignment | Owner name: DISNEY ENTERPRISES, INC., CALIFORNIA Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:ISRAR, ALI;MORAN, JONATHAN;SCHNEIDER, OLIVER STIRLING;REEL/FRAME:035329/0539 Effective date: 20150401 |
| STCF | Information on status: patent grant | Free format text: PATENTED CASE |
| MAFP | Maintenance fee payment | Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1551); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY Year of fee payment: 4 |