WO2011078796A1 - Telepuppetry platform - Google Patents

Telepuppetry platform

Info

Publication number
WO2011078796A1
WO2011078796A1 (PCT/SG2010/000478)
Authority
WO
WIPO (PCT)
Prior art keywords
puppet
data
sensors
computer program
program product
Prior art date
Application number
PCT/SG2010/000478
Other languages
English (en)
Inventor
Lord Kenneth Pinpin
Shuzhi Ge
Original Assignee
National University Of Singapore
Priority date
Filing date
Publication date
Application filed by National University Of Singapore filed Critical National University Of Singapore
Publication of WO2011078796A1

Classifications

    • A: HUMAN NECESSITIES
    • A63: SPORTS; GAMES; AMUSEMENTS
    • A63H: TOYS, e.g. TOPS, DOLLS, HOOPS OR BUILDING BLOCKS
    • A63H3/00: Dolls
    • A63H3/28: Arrangements of sound-producing means in dolls; Means in dolls for producing sounds
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00: Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40: Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43: Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/4302: Content synchronisation processes, e.g. decoder synchronisation
    • H04N21/4307: Synchronising the rendering of multiple content streams or additional data on devices, e.g. synchronisation of audio on a mobile phone with the video output on the TV screen
    • H04N21/43074: Synchronising additional data with content streams on the same device, e.g. of EPG data or an interactive icon with a TV program
    • G: PHYSICS
    • G09: EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09B: EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B5/00: Electrically-operated educational appliances
    • G09B5/06: Electrically-operated educational appliances with both visual and audible presentation of the material to be studied
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00: Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40: Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/41: Structure of client; Structure of client peripherals
    • H04N21/4104: Peripherals receiving signals from specially adapted client devices
    • H04N21/4131: Peripherals receiving signals from specially adapted client devices: home appliances, e.g. lighting, air conditioning systems, metering devices
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00: Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40: Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/41: Structure of client; Structure of client peripherals
    • H04N21/422: Input-only peripherals, i.e. input devices connected to specially adapted client devices, e.g. global positioning system [GPS]
    • H04N21/42202: Input-only peripherals: environmental sensors, e.g. for detecting temperature, luminosity, pressure, earthquakes
    • A: HUMAN NECESSITIES
    • A63: SPORTS; GAMES; AMUSEMENTS
    • A63H: TOYS, e.g. TOPS, DOLLS, HOOPS OR BUILDING BLOCKS
    • A63H2200/00: Computerized interactive toys, e.g. dolls

Definitions

  • animal pets such as dogs and parrots can be more intelligent and emotionally responsive compared to the most expensive toy robots currently available.
  • Each character has a consistent and unique personality.
  • a puppet character has certain key phrases or expressions that he regularly uses.
  • Each character has a perceived age and intelligence that is slightly older than the child.
  • the present invention and its various embodiments create an ecosystem beneficial for children's TV show producers, toy companies, game developers, parents, researchers, and children.
  • Producers of current children's shows using puppets can use the platform to make their stories more interactive by creating puppet behavior databases for toy puppet versions of their characters.
  • Toy companies can use the platform to make more socially interactive puppets without significantly raising the cost of the toy.
  • Updating puppet behaviors by continually creating more sophisticated puppet behavior databases provides new revenue streams for toy companies, TV show producers, or independent developers.
  • Game developers for the console market can now target the younger age group by creating a new type of game environment where one of the characters is present with the player as a robotic puppet and is also the interface to the game.
  • Parents benefit by using online services that can keep track of the child's behavior and long-term development while playing with the puppets.
  • researchers in human-robot interaction will have a stable platform to conduct studies and also a robotic platform for testing advanced intelligence routines that can recognize emotions and react naturally to the child.
  • Such research can eventually be used by developers, toy companies, and TV show producers.
  • the children benefit by having a toy companion while learning from a TV show or while playing a computer game with the puppet.
  • a puppet can take on a variety of appearances. In general, a puppet resembles a character or even an object in a story and is able to exhibit behaviors that create a more immersive experience for the user.
  • the platform consists of a puppeteer base or puppet base station that is in communication with one or more puppets within its range.
  • the story meta-data for various media formats, such as television shows and video games, and corresponding puppet behavior database are downloaded from the internet using online puppet services.
  • Real-time sensor data from puppets, in combination with decoded story timing and meta-data, is analyzed at the puppeteer base to select the appropriate puppet behavior, consisting of joint movements and voice data which are played back by the puppet.
  • the puppet base station transmits information to the puppet indicating when a certain puppet behavior or action should be performed.
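
By way of illustration, here is a minimal sketch of what such a trigger message might look like. The patent does not specify a wire format, so the field names, the JSON encoding, and the puppet/behavior identifiers below are all assumptions.

```python
# Illustrative sketch only: the message layout and encoding are assumptions,
# not part of the published specification.
import json
from dataclasses import dataclass, asdict

@dataclass
class BehaviorTrigger:
    """A hypothetical command telling a puppet when to perform a behavior."""
    puppet_id: str
    behavior_id: str   # key into the downloaded puppet behavior database
    start_at: float    # story timestamp (seconds) at which to begin
    priority: int = 0  # higher priority may pre-empt an ongoing behavior

def encode_trigger(trigger: BehaviorTrigger) -> bytes:
    """Serialize the trigger for transmission over the wireless link."""
    return json.dumps(asdict(trigger)).encode("utf-8")

# Example: tell puppet "puppet-01" to start "sing_along" 42.5 s into the story.
msg = encode_trigger(BehaviorTrigger("puppet-01", "sing_along", start_at=42.5))
print(msg)
```
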
  • Puppet behaviors are developed and tested using puppetry development tools, which are also used to make stories or games compatible with the platform.
  • a telepuppetry-based business method is provided.
  • FIG. 1 is a diagram showing the elements of the platform in its preferred embodiment
  • FIG. 2 is a block diagram of a puppet showing its various functional elements
  • FIG. 3 is a block diagram showing the functional components of the puppeteer base in one embodiment
  • FIG. 4 is an alternative embodiment of the puppeteer base
  • FIG. 5 is an embodiment of the puppetry development tool.
  • the figures depict various embodiments of the present invention for purposes of illustration only. One skilled in the art will readily recognize from the following discussion that alternative embodiments of the structures and methods illustrated herein may be employed without departing from the principles of the invention described herein.
  • the primary goal of this entire system 1 is to provide a live puppetry experience for a child while watching a compatible story 22 on a video display device 14 such as a TV. It is designed such that the puppet is physically in the presence of the child and behaves in a way that is responsive to the child as well as to the context provided by the story 22.
  • One scenario by way of example, is that the puppet sings along with the other characters in the story 22 or even asks the child to sing along.
  • the puppet can ask the child to participate by asking questions that can help one of the other characters in the story 22.
  • This technique of using puppets 12 to ask questions of the child watching the TV is often used in children's educational TV shows. However, unlike in traditional TV shows, here the puppet is not on the TV screen but right beside the child. In this manner, the educational and entertainment benefits of children's TV shows are enhanced by the live interactive presence that traditional puppetry provides.
  • the telepuppetry platform is a system comprised of one or more robotic puppets 12 that are in wireless communication 24 with a puppeteer base 10.
  • the puppeteer base 10 controls each puppet's motion and speech.
  • the puppeteer base 10 also receives all the sensor data from each of the puppets 12 for interpretation of the child responses to the ongoing story 22.
  • the story 22 can take the form of a live children's show video episode or one downloaded beforehand from the Internet 20 using a set of online puppet services 18 and stored on the puppeteer base 10.
  • a story 22 may also take the form of a 2D or 3D game environment, with the puppet serving both as a character and as the game interface to effect changes in the story line.
  • the system also includes a set of puppetry development tools 16 that provide the means to make stories, in the form of videos or games, compatible with the platform.
  • the set of puppetry development tools 16 also provides the means for creating and testing puppet behaviors and different combinations of puppets 12.
  • the set of puppetry development tools 16, made available via the Internet 20, can be used by both professionals and hobbyists to make new puppet behaviors for old stories, or new stories for old puppets 12.
  • stories and puppet behaviors developed using the tools can be published for both free and commercial use using online puppet services 18.
  • Online puppet services 18 also provide the user community access to an online library of stories and compatible puppet behaviors that can be downloaded individually or, preferably, automatically by subscription to educational services supporting the platform. Online puppet services 18 extend the possible uses of the platform beyond puppetry.
  • parent-centered services can take advantage of the platform by monitoring a child's behavior and learning progress through the collected puppet sensor data, which may be analyzed professionally by educators or child behavior specialists.
  • the insight gained from the analysis of child behavioral data recorded using the puppeteer base 10 can be used for customizing puppet behaviors that are specific to the child's stage of emotional, intellectual, and social development.
  • a puppet includes a main controller 42, memory storage 44, power supply 36, wireless communication 24, joint control 38, sound speakers 30, and various sensors.
  • the main controller 42 runs a software program that manages the different functional elements of the puppet.
  • the main controller 42 may be implemented using microcontrollers or similar microprocessors with adequate computing capability, as can be determined by those familiar with the art of embedded systems.
  • Memory storage 44 is included for storing the puppet software and for storing parameters related to the operation of the puppet.
  • Memory storage 44 may take the form of solid-state memory devices such as flash memory, which have low power consumption and hold data in a non-volatile manner.
  • a puppet also has a power supply 36 unit that provides electric power for all electronic components and joint actuators 40. Typically, removable batteries that preferably are rechargeable are used for supplying the necessary power.
  • the wireless communication 24 element provides the means to communicate with the puppeteer base 10 in an untethered manner to allow flexibility in the use of the puppet while increasing safety for the intended child users of the puppet.
  • the specific mechanism for wireless communication 24 should provide sufficient bandwidth to allow real-time transfer of sensor data from the puppet to the puppeteer base 10 and the receipt of real-time behavioral commands to the puppet.
  • any of the WiFi standards, such as 802.11a/b/g/n, or Bluetooth communication can be used for the wireless communication 24.
  • the use of common wireless standards can help reduce costs, and such standards are constantly improving in important features such as power consumption and communication bandwidth.
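
As a rough illustration of this sensor uplink, the sketch below streams timestamped readings to the base over UDP. The transport choice, base address, and packet layout are assumptions rather than part of the disclosure.

```python
# Minimal sketch of the puppet-to-base sensor stream, assuming a UDP transport
# over WiFi; the packet structure (puppet id, timestamp, sensor payload) is an
# assumption for illustration only.
import json
import socket
import time

BASE_ADDR = ("192.168.1.10", 9000)  # hypothetical puppeteer base address

def send_sensor_packet(sock: socket.socket, puppet_id: str, readings: dict) -> None:
    """Send one timestamped sensor reading to the puppeteer base."""
    packet = {"puppet": puppet_id, "t": time.time(), "sensors": readings}
    sock.sendto(json.dumps(packet).encode("utf-8"), BASE_ADDR)

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
send_sensor_packet(sock, "puppet-01", {"touch_head": 0.8, "joint_neck_deg": 12.5})
```
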
  • the joint control 38 element of a puppet provides a means to control puppet joint motions by controlling one or more joint actuators 40. It uses the joint sensors 41 for feedback to implement an algorithm for controlling one or more variables such as position, velocity, and torque.
  • This algorithm for controlling joint variables may be implemented in hardware as dedicated circuitry or as software routines running in the main controller 42. Desired puppet movements are executed by the main controller 42 by sending the appropriate signals to joint control 38 which then executes the control algorithm for controlling each joint actuator.
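
For concreteness, the sketch below implements one such control algorithm in software as a conventional PID position loop over the joint-sensor feedback. The patent names position, velocity, and torque as controllable variables but does not prescribe an algorithm, so the PID structure and gains here are illustrative.

```python
# A minimal position-control sketch for one joint; the gains are placeholders
# that a real system would tune for its actuators.
class JointPID:
    def __init__(self, kp: float, ki: float, kd: float):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, target_pos: float, measured_pos: float, dt: float) -> float:
        """Return an actuator command from the joint-sensor feedback."""
        error = target_pos - measured_pos
        self.integral += error * dt
        derivative = (error - self.prev_error) / dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

# One control tick: drive the elbow joint toward 30 degrees at 100 Hz.
elbow = JointPID(kp=2.0, ki=0.1, kd=0.05)
command = elbow.update(target_pos=30.0, measured_pos=27.4, dt=0.01)
print(command)
```
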
  • Each puppet includes sensors for sound, proprioception, touch, and vision.
  • a sound system 32 captures signals from the microphone 28 and also plays back audio via the speakers 30 it drives.
  • One or more touch sensors 34 placed at strategic locations on the puppet's skin are used for detecting tactile gestures received by the puppet.
  • the joint sensors 41 of the puppet provide proprioceptive feedback. In combination with touch, proprioceptive feedback can be used for detection of gestures such as a child shaking the puppet's hand, a child tapping the puppet's face or head, or a child hugging the puppet.
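
A hedged sketch of this touch-plus-proprioception fusion follows. The thresholds and rules are invented for illustration; a production system would tune or learn them from recorded interactions.

```python
# Map raw touch-sensor and joint-velocity readings to the named gestures
# mentioned in the text. All sensor names and thresholds are assumptions.
def classify_gesture(touch: dict, joint_velocity: dict) -> str:
    """Label a gesture from fused touch and proprioceptive signals."""
    hand_speed = abs(joint_velocity.get("hand", 0.0))
    if touch.get("hand", 0.0) > 0.5 and hand_speed > 1.0:
        return "handshake"            # hand held and moving: child shaking hand
    if touch.get("face", 0.0) > 0.5 or touch.get("head", 0.0) > 0.5:
        return "tap_head_or_face"
    if touch.get("torso_front", 0.0) > 0.5 and touch.get("torso_back", 0.0) > 0.5:
        return "hug"                  # pressure on both sides of the torso
    return "none"

print(classify_gesture({"hand": 0.9}, {"hand": 2.3}))  # -> "handshake"
```
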
  • One or more cameras provide visual sensory data for the puppet. All sensory inputs are sent to the puppeteer base 10 for real-time analysis, recording, and interpretation. Although some minimal processing of the sensor data may be done at the puppet, it is preferred that the computationally demanding processing be performed at the puppeteer base 10, which has more significant computing resources. In effect, without the puppeteer base 10 a puppet cannot demonstrate sensitivity to the ongoing story 22 and the responses of the child. It requires a puppeteer base 10 that has the computational intelligence to interpret the puppet's sensors and coordinate the puppet's movement and speech in relation to the ongoing interaction between child, story 22, and puppet.
  • the elements of the puppet as shown in Figure 2 are shown separately, but in practice, as is known to those familiar with the art of building electronic systems, these may be highly integrated into just a few integrated circuits using a system-on-a-chip design, for example.
  • a wide variety of sensors can be used, and those shown in Figure 2 are by way of example only, to illustrate the functionality of the puppet.
  • accelerometers can be included to detect various puppet movements such as when the child shakes the puppet's body, hand, or head.
  • Other possible sensors are a gyroscope and an electronic compass, which can be used for determining the orientation and movement of the puppet in relation to the video display device 14, the floor, and the child.
  • Each puppet can receive and execute movement commands and playback audio received wirelessly from a puppeteer base 10.
  • Each puppet also communicates wirelessly to the puppeteer base 10 the sensor data it receives from the puppet's microphone 28, camera 26, touch sensors 34, and joint sensors 41.
  • Puppet joint commands 70 and voice data 72 are sent from a puppeteer base 10 to one or more puppets 12 within its range.
  • a collection of puppet joint commands 70 and voice data 72 makes up a puppet behavior database 66 that is specific to a TV episode of a children's show (here called a story 22) and is downloaded to the puppeteer base 10 via the Internet 20.
  • the video for a story 22 is watermarked with timing data 54 and story meta-data 52 that is detected by the puppeteer base 10.
  • the operation of the puppeteer base 10 is explained by showing the relationships between the internal functional blocks, one or more puppets 12, and the video display device 14. Watermarked video 50 from live TV or a recorded TV show containing embedded time and story meta-data 52 is obtained using a video capture 48 device. The captured video is processed by the meta-data decoder 46 to extract said timing data 54 and story meta-data 52. At the same time, sensor data from each puppet is continuously sent to the puppeteer base 10 for processing, interpretation, and recording. Interpretation of puppet sensor data is done by the sensory interpretation module 56, containing one or more of the following submodules: speech recognition 58, gesture recognition 60, tactile gesture recognition 62, facial expression recognition 76, face recognition 78, and emotion recognition 80.
  • the context of the story 22 is input to these modules in the form of the extracted story meta-data 52 and timing information.
  • the outputs of the recognition modules, the story meta-data 52, and the timing data 54 are used by the behavior selection logic 64 to choose an appropriate behavior for the puppets 12.
  • the puppet behavior selection logic 64 uses a puppet behavior database 66 appropriate for the story 22 being played.
  • the selected puppet behavior 68 is translated into joint commands 70 and voice data 72. These joint commands 70 and voice data 72 are sent to the puppets 12 using the wireless communication 24 means between the puppeteer base 10 and the puppets 12.
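
Putting the selection step together, the sketch below shows how decoded story context plus an interpreted gesture might index into the behavior database and yield joint commands and voice data. The database schema and every name here are illustrative, since the patent does not define them.

```python
# Hedged sketch of behavior selection logic: story context plus interpreted
# sensor data select an entry from a hypothetical behavior database 66.
from typing import Optional

BEHAVIOR_DB = {  # invented schema: (scene, gesture) -> behavior
    ("scene_3", "handshake"): {"joints": [("arm", 45.0)], "voice": "hello.wav"},
    ("scene_3", "none"):      {"joints": [("head", 10.0)], "voice": "watch.wav"},
}

def select_behavior(scene: str, gesture: str) -> Optional[dict]:
    """Fall back to the scene's default behavior if the gesture has no entry."""
    return BEHAVIOR_DB.get((scene, gesture)) or BEHAVIOR_DB.get((scene, "none"))

behavior = select_behavior("scene_3", "handshake")
if behavior:
    joint_commands, voice_data = behavior["joints"], behavior["voice"]
    # ...both would then be sent to the puppet over the wireless link.
    print(joint_commands, voice_data)
```
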
  • the received joint commands 70 are executed by the joint controller to move the actuators 40 of the puppet to effect the desired motion.
  • Joint commands 70 may take the form of joint position, velocity, or torque values.
  • Voice data 72 is played back on the puppet's sound system 32 in synchrony with the puppet's mouth, head, and body movements.
  • the puppets 12 can be made to appear to be participating with the story 22 being shown on the TV.
  • because the puppet behavior 68 is dependent on the interpreted puppet sensor data, the puppets 12 can be made to appear naturally responsive to the child's verbal and nonverbal responses and the child's emotional state.
  • the puppeteer base 10 can be implemented using a computer such as a desktop PC, laptop, or entertainment PC equipped with a video capture 48 device, internet 20 connectivity, and a compatible wireless communication 24 device. Alternatively, it can also be implemented on a gaming console with internet 20 connectivity, a compatible wireless communication 24 device, and an external video capture 48 device attached to the console by standard ports such as the universal serial bus (USB).
  • the puppeteer base 10 may be implemented on either a PC or a game console, such as the XBOX 360 or SONY PlayStation 3, with the video and audio generated by the said PC or game console.
  • the story 22 is implemented as a computer game using standard 3D or 2D game technology for the said PC or game console platforms.
  • the story meta-data 52 and timing data 54 are extracted from the state of the game engine 74 implementing the story 22. Similar to the live TV embodiment, the story meta-data 52, timing data 54, and the interpreted puppet sensor data are used by the behavior selection logic 64 to determine the appropriate puppet movements and voice response of each of the participating puppets 12. Such joint movements and voice data 72 are retrieved from a puppet behavior database 66.
  • the selected joint movements are translated as joint commands 70 and together with the voice data 72 are sent to each of the puppets 12 via the wireless communication 24 link.
  • the selected puppet behaviors are used as inputs to the game engine 74.
  • the progression of the story 22 is thus affected by the puppet behavior 68, which in turn is influenced by the detected verbal and nonverbal responses of the child.
  • the use of a game engine 74 allows for extensive branching of the story 22, with the path taken determined by the child through interaction with the puppet.
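
The following sketch illustrates that feedback loop between behavior selection and the game engine: the engine exposes story state directly (no watermark decoding needed), and the child's response, inferred through the puppet, branches the story. The engine API shown is a stand-in, not a real product interface.

```python
# Invented, minimal game-engine coupling for illustration only.
class StoryGameEngine:
    def __init__(self):
        self.branch = "intro"

    def current_meta_data(self) -> dict:
        """Story meta-data comes straight from engine state in this embodiment."""
        return {"scene": self.branch, "choices": ["left_door", "right_door"]}

    def apply_puppet_input(self, choice: str) -> None:
        """Branch the story based on the child's response via the puppet."""
        self.branch = choice

engine = StoryGameEngine()
meta = engine.current_meta_data()
# Suppose gesture recognition inferred that the child pointed left:
if "left_door" in meta["choices"]:
    engine.apply_puppet_input("left_door")
print(engine.branch)  # -> "left_door"
```
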
  • the puppet or puppets 12 can be considered a novel gaming interface where speech and gestures can be used to communicate with each puppet.
  • the gesture recognition 60 module uses the video coming from the camera 26 of the puppet and the story 22 context (that there are 2 doors, one on the left, one on the right) to infer the child's answer. If the response was verbal, the speech recognition 58 module would interpret the child's response. In the event that the child's response cannot be inferred, the behavior selection logic 64 could make the puppet ask the question again or ask if the child wishes to do something else (if the emotion recognition 80 system has detected boredom, for example). The possibilities are only limited by the creator of the story 22 and the associated puppet behaviors compatible with the story 22.
  • in FIG. 5, an embodiment of the puppetry development tool for creating and testing puppet behaviors for use with videos of children's shows is discussed in detail.
  • This software, running on a PC or workstation, imports videos and provides the means to encode story meta-data 52 into the video frames using standard digital watermarking techniques. It also interfaces with one or more puppets 12 to capture the joint movements, which can be initiated by clicking the puppet record button 98. Recorded puppet movements and recorded puppet voice data 72 are saved together with the story meta-data 52 in a puppet behavior database 66. The use of the software is similar to video editing software packages and is discussed in the succeeding paragraphs.
  • the story timeline 82 shows the timing relationship of the tracks that make up a story 22.
  • the first track is the video track 88, containing at least one video clip 90 of the story 22.
  • the editing and encoding process typically begins by dragging one of the video clips from the video clips collection 102 onto the video track 88. Then at least one audio track 104 containing the original sound of the video clip 90 is loaded.
  • a puppet is selected which creates at least one puppet voice track 92 and at least one puppet movement track 94 for each puppet supporting the story 22.
  • the meta-data track 96 contains the meta-data segments corresponding to a particular scene in the video.
  • Playing back the story 22 can be done using either the timeline controls 110 or the video playback controls 112.
  • the video scene corresponding to the current time is displayed in the video preview 84 window. If puppet movements are available for that scene snapshot, the puppet preview 86 displays the current pose of the puppet.
  • Puppet behaviors can be recorded using a puppet connected to the PC or workstation running the puppet development tool.
  • This puppet is operated in passive joint mode or puppeteering mode when recording the joint movements.
  • the puppet can also be controlled to demonstrate playback of the movements during the editing process.
  • the speech of the puppet is recorded onto the puppet voice track 92 using an actor's voice.
  • Puppet playback can be controlled using the puppet playback controls 100.
  • a 3D graphical representation of the puppet is also shown in the puppet preview 86 window. This 3D puppet is animated during recording and playback of puppet joint movements.
  • Puppet behaviors can also be synthesized using the puppet development tool or selected from various pre-defined puppet behaviors.
  • Puppet behaviors are then encoded as meta-data associated with a story such as a TV show or video game.
  • Story meta-data 52 can be edited on a per scene basis.
  • the meta- data segment 108 highlighted on the timeline is shown in detail in the meta-data editor window 114.
  • the validity of the meta-data syntax is checked against an XML schema describing how such XML documents are formed.
  • RRL, a Rich Representation Language, can be used as the XML standard for describing scenes, characters, semantic content, and the interaction between characters. To support multiple standards and conversion between them, an …
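
As an example of the schema check described above, the sketch below validates a toy meta-data segment with lxml. The miniature schema and document are invented stand-ins for a real RRL (or other standard) schema.

```python
# Validate a meta-data segment against an XML schema; both the schema and the
# segment are hypothetical examples, not the platform's actual format.
from lxml import etree

SCHEMA_XML = b"""<xs:schema xmlns:xs="http://www.w3.org/2001/XMLSchema">
  <xs:element name="scene">
    <xs:complexType>
      <xs:sequence>
        <xs:element name="character" type="xs:string" maxOccurs="unbounded"/>
      </xs:sequence>
      <xs:attribute name="id" type="xs:string" use="required"/>
    </xs:complexType>
  </xs:element>
</xs:schema>"""

schema = etree.XMLSchema(etree.fromstring(SCHEMA_XML))
segment = etree.fromstring(
    b'<scene id="scene_3"><character>puppet-01</character></scene>'
)
print(schema.validate(segment))  # True if the meta-data segment conforms
```
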
  • the meta-data is encoded in the video by pressing the encode story button 106.
  • only timing information can be encoded in each video frame to reduce the amount of data being encoded using digital watermarking techniques.
  • the timing data 54 can be used at the puppeteer base 10 to retrieve the appropriate story meta-data 52 that was downloaded and stored in the puppeteer base 10 in advance.
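
A minimal sketch of that lookup follows, assuming the watermark decoder yields a timestamp in seconds and the meta-data was pre-downloaded as time-indexed segments (both assumptions).

```python
# Map a decoded per-frame timestamp to the pre-downloaded meta-data segment
# active at that time. The segment contents are invented for illustration.
import bisect

# Hypothetical pre-downloaded meta-data: (start_time_seconds, segment) pairs.
STORY_META = [(0.0, {"scene": "intro"}), (42.5, {"scene": "song"}), (95.0, {"scene": "quiz"})]
STARTS = [t for t, _ in STORY_META]

def meta_for_time(t: float) -> dict:
    """Return the meta-data segment active at decoded watermark time t."""
    i = bisect.bisect_right(STARTS, t) - 1
    return STORY_META[max(i, 0)][1]

print(meta_for_time(50.0))  # -> {'scene': 'song'}
```
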
  • puppet behavior and story 22 editing follows a different process when the puppetry development tools 16 are incorporated as part of a game development engine.
  • Each puppet is a character in the game or story 22 but also serves as the input device to substitute for joystick or game pads.
  • the puppet can be simulated in 3D graphics with its physical behavior modeled to represent the real puppet.
  • the tactile sensors, which can be located in various parts of the puppet, are difficult to simulate using a mouse and keyboard. It would be simpler to use a real puppet during game development. That way, during testing, touch responses are produced simply by physically touching the puppet. Another example is detecting visual gesture responses.
  • gestures can only be recognized if the child or user is within the field of view of the puppet's camera 26.
  • the behavior selection logic 64 in this case should start the "face the child" behavior, which could result in head movement before asking the question. Such details are best anticipated when real puppets 12 are used during game or story 22 development.
  • terms such as "determining" refer to the actions and processes of a computer system or similar electronic computing device, which manipulates and transforms data represented as physical (electronic) quantities within the computer system's memories or registers or other such information storage, transmission, or display devices.
  • Embodiments of the invention may also relate to an apparatus for performing the operations herein.
  • This apparatus may be specially constructed for the required purposes, or it may comprise a general-purpose computing device selectively activated or reconfigured by a computer program stored in the computer.
  • a computer program may be stored in a computer readable storage medium, such as, but not limited to, any type of disk including floppy disks, optical disks, CD-ROMs, magnetic-optical disks, read-only memories (ROMs), random access memories (RAMs), EPROMs, EEPROMs, magnetic or optical cards, application specific integrated circuits (ASICs), or any type of media suitable for storing electronic instructions, and each coupled to a computer system bus.
  • the computers referred to in the specification may include a single processor or may be architectures employing multiple processor designs for increased computing capability.
  • Embodiments of the invention may also relate to a computer data signal embodied in a carrier wave, where the computer data signal includes any embodiment of a computer program product or other data combination described herein.
  • the computer data signal is a product that is presented in a tangible medium and modulated or otherwise encoded in a carrier wave transmitted according to any suitable transmission method.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Business, Economics & Management (AREA)
  • Physics & Mathematics (AREA)
  • Educational Administration (AREA)
  • Educational Technology (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Biodiversity & Conservation Biology (AREA)
  • Ecology (AREA)
  • Emergency Management (AREA)
  • Environmental & Geological Engineering (AREA)
  • Environmental Sciences (AREA)
  • Remote Sensing (AREA)
  • Automation & Control Theory (AREA)
  • Toys (AREA)

Abstract

The present invention relates to an educational entertainment platform consisting of puppets built with the platform technologies presented herein, making it possible to produce figures that exhibit emotional sensitivity, social intelligence, and adaptability, and that give the impression of being alive and growing up with the child. One or more puppets are capable of exhibiting puppet behaviors, notably movements and sound playback. Puppet behaviors are encoded as meta-data associated with a story, such as a television show or video game. A puppet base station receives the story and associated meta-data, and indicates to the puppets when to perform certain behaviors. The puppets also include a variety of sensors for gathering information about their environment. Sensor data is analyzed by the puppet base station, or by the puppet itself, to determine the behaviors to be performed in response to the collected sensor data.
PCT/SG2010/000478 2009-12-21 2010-12-20 Telepuppetry platform WO2011078796A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US28867409P 2009-12-21 2009-12-21
US61/288,674 2009-12-21

Publications (1)

Publication Number Publication Date
WO2011078796A1 (fr) 2011-06-30

Family

ID=44196050

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/SG2010/000478 WO2011078796A1 (fr) 2009-12-21 2010-12-20 Telepuppetry platform

Country Status (1)

Country Link
WO (1) WO2011078796A1 (fr)

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2016032835A1 (fr) * 2014-08-28 2016-03-03 Krolewski Jaroslaw Interactive smart doll
WO2016070593A1 (fr) * 2014-11-07 2016-05-12 深圳新创客电子科技有限公司 Control method for a smart toy, and control method and system for an electronic device
EP3040806A1 (fr) * 2014-12-31 2016-07-06 OpenTV, Inc. Media synchronized control of peripherals
US20160269760A1 (en) * 2013-12-02 2016-09-15 Panasonic Intellectual Property Management Co., Ltd. Repeating device, interlocking system, distribution device, processing method of repeating device, and a program
WO2018025067A1 (fr) * 2016-08-05 2018-02-08 Cubies, Sia Educational toy

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2001069572A1 (fr) * 2000-03-16 2001-09-20 Creator Ltd. Methods and apparatus for commercial transactions in an interactive play environment
WO2001070361A2 (fr) * 2000-03-24 2001-09-27 Creator Ltd. Interactive toy applications
US6773344B1 (en) * 2000-03-16 2004-08-10 Creator Ltd. Methods and apparatus for integration of interactive toys with interactive television and cellular communication systems
WO2008132486A1 (fr) * 2007-04-30 2008-11-06 Sony Computer Entertainment Europe Limited Interactive toy and entertainment device

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2001069572A1 (fr) * 2000-03-16 2001-09-20 Creator Ltd. Methods and apparatus for commercial transactions in an interactive play environment
US6773344B1 (en) * 2000-03-16 2004-08-10 Creator Ltd. Methods and apparatus for integration of interactive toys with interactive television and cellular communication systems
WO2001070361A2 (fr) * 2000-03-24 2001-09-27 Creator Ltd. Interactive toy applications
WO2008132486A1 (fr) * 2007-04-30 2008-11-06 Sony Computer Entertainment Europe Limited Interactive toy and entertainment device

Cited By (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20160269760A1 (en) * 2013-12-02 2016-09-15 Panasonic Intellectual Property Management Co., Ltd. Repeating device, interlocking system, distribution device, processing method of repeating device, and a program
EP3079367A4 (fr) 2013-12-02 2016-11-02 Panasonic Ip Man Co Ltd Relay apparatus, cooperative system, distribution apparatus, processing method for a relay apparatus, and associated program
JP2017139792A (ja) 2013-12-02 2017-08-10 パナソニックIpマネジメント株式会社 Relay device, interlocking system, distribution device, processing method of relay device, and program
WO2016032835A1 (fr) 2014-08-28 2016-03-03 Krolewski Jaroslaw Interactive smart doll
WO2016070593A1 (fr) 2014-11-07 2016-05-12 深圳新创客电子科技有限公司 Control method for a smart toy, and control method and system for an electronic device
EP3040806A1 (fr) 2014-12-31 2016-07-06 OpenTV, Inc. Media synchronized control of peripherals
US9833723B2 (en) 2014-12-31 2017-12-05 Opentv, Inc. Media synchronized control of peripherals
US11207608B2 (en) 2014-12-31 2021-12-28 Opentv, Inc. Media synchronized control of peripherals
EP4224866A1 (fr) 2014-12-31 2023-08-09 OpenTV, Inc. Media synchronized control of peripherals
US11944917B2 (en) 2014-12-31 2024-04-02 Opentv, Inc. Media synchronized control of peripherals
WO2018025067A1 (fr) 2016-08-05 2018-02-08 Cubies, Sia Educational toy

Similar Documents

Publication Publication Date Title
  • KR102328959B1 (ko) Robot, server, and human-machine interaction method
US11291919B2 (en) Development of virtual character in a learning game
US20150298315A1 (en) Methods and systems to facilitate child development through therapeutic robotics
  • JP7254772B2 (ja) Method and device for robot interaction
Aylett et al. Unscripted narrative for affectively driven characters
Bobick et al. The KidsRoom: A perceptually-based interactive and immersive story environment
  • EP2744579B1 (fr) Connected multifunctional system and method of use
  • JP5498938B2 (ja) Interactive toy and entertainment device
  • CN108919950A (zh) Kinect-based interactive imaging device and method for children with autism
Ryokai et al. Children's storytelling and programming with robotic characters
Fontijn et al. StoryToy the interactive storytelling toy
US20220093000A1 (en) Systems and methods for multimodal book reading
  • WO2011078796A1 (fr) Telepuppetry platform
Liang et al. Exploitation of novel multiplayer gesture-based interaction and virtual puppetry for digital storytelling to develop children's narrative skills
Gomes et al. Migration between two embodiments of an artificial pet
Carvajal et al. Robotic applications in virtual environments for children with autism
Marsh et al. The online and offline digital literacy practices of young children
Aylett Games robots play: once more, with feeling
Haskell et al. An extensible platform for interactive, entertaining social experiences with an animatronic character
  • EP3576075A1 (fr) Use of a toy for speech and language assessment and therapy
Koike et al. Tangible Scenography as a Holistic Design Method for Human-Robot Interaction
Wang Building Interactive Virtual Pet Experience in Augmented Reality
TWI406202B (zh) Leadership Robotics System, its control methods, computer program products and computer-readable recording media
  • WO2002029715A1 (fr) System and method for programming behaviors of synthetic creatures
Haibin Development of a robotic nanny for children and a case study of emotion recognition in human-robotic interaction

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 10839910

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 10839910

Country of ref document: EP

Kind code of ref document: A1