US7671916B2 - Motion sensor using dual camera inputs - Google Patents

Motion sensor using dual camera inputs

Info

Publication number
US7671916B2
Authority
US
United States
Prior art keywords
camera
images
motion
image
user
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active, expires
Application number
US10/861,582
Other versions
US20050270368A1
Inventor
Kazuyuki Hashimoto
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Electronic Arts Inc
Original Assignee
Electronic Arts Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Electronic Arts Inc filed Critical Electronic Arts Inc
Priority to US10/861,582
Assigned to ELECTRONIC ARTS INC. Assignment of assignors interest (see document for details). Assignors: HASHIMOTO, KAZUYUKI
Priority to JP2007515077A (published as JP2008502206A)
Priority to GB0624588A (published as GB2430042B)
Priority to PCT/US2005/012650 (published as WO2005122582A2)
Priority to TW094113107A (published as TWI282435B)
Publication of US20050270368A1
Application granted
Publication of US7671916B2
Status: Active
Adjusted expiration

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 7/00: Television systems
    • H04N 7/18: Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
    • H04N 7/181: Closed-circuit television systems for receiving images from a plurality of remote sources
    • H04N 7/188: Capturing isolated or intermittent images triggered by the occurrence of a predetermined event, e.g. an object reaching a predetermined position
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00: Image analysis
    • G06T 7/20: Analysis of motion
    • G06T 7/285: Analysis of motion using a sequence of stereo image pairs

Definitions

  • Sensing the motion of a device can be a time consuming task that requires intensive numerical processing. Additionally, motion sensing may only provide a gross measure of the motion of a device. Sensing motion of a portable object may require costly hardware and complicated numerical processing techniques.
  • Motion sensing using trilateration of positions may be inadequate for sensing motion of a portable device.
  • GPS (Global Positioning System)
  • a GPS system or hybrid position determination system typically provides inadequate precision to sense motion of a portable device. GPS can require a great deal of time to acquire an initial position fix. Additionally, a GPS system or hybrid position determination system typically cannot determine rotational motion of a portable device.
  • Motion sensing systems incorporating gyroscopes can be used to sense the motion of a portable device.
  • Such systems are typically costly and not suitable for low cost designs.
  • the costs associated with gyroscopes used in motion sensing may exceed a perceived value of the entire portable device.
  • a first camera can be directed along a first viewing axis and a second camera can be directed along a second viewing axis, different from the first viewing axis.
  • the second viewing axis can be substantially opposite the first viewing axis.
  • a processor can determine changes in images from the first camera and changes in images from the second camera to determine respective directions of change.
  • the processor can compare the direction of change determined from the first camera images relative to the direction of change determined from the second camera images.
  • the processor can then determine the motion of the device based in part on the comparison.
  • the motion sensing device can include a first camera configured to capture images along a first viewing axis, a second camera configured to capture images along a second viewing axis different from the first viewing axis, and a motion processing module configured to determine motion of the device based in part on the images from the first camera and the second camera.
  • the invention includes a motion sensing device.
  • the motion sensing device can include a first camera having a viewing axis directed away from a front of the device, and configured to capture images, a second camera having a viewing axis directed away from a rear of the device, and configured to capture images, a motion processing module configured to determine a motion of the device based in part on the images from the first camera and the second camera, and further configured to generate a display image based in part on the motion of the device, and a display coupled to the motion processing module and configured to display the display image.
  • the invention includes a motion sensing system.
  • the system can include a first camera positioned along a first viewing axis, and configured to capture first camera images, a second camera positioned along a second viewing axis different from the first viewing axis, and configured to capture second camera images, a motion processing module configured to determine a motion parameter based in part on at least two first camera images and at least two second camera images, a base device configured to process the motion parameter and generate a display signal based in part on the motion parameter, and a display configured to display the display signal generated by the base device.
  • the invention can include a motion sensing method.
  • the method can include receiving a plurality of images from a first camera, receiving a plurality of images from a second camera, determining a first image translation based in part on the first camera images, determining a second image translation based in part on the second camera images, comparing the first image translation to the second image translation, and determining the motion of the device based in part on the comparison.
  • the invention can include a motion sensing method.
  • the method can include receiving a motion parameter determined from images captured along a first viewing axis and images captured along a second viewing axis substantially opposite the first viewing axis, retrieving a display image from a plurality of stored images, modifying the display image based at least in part on the motion parameter, and communicating the modified display image to a portable device.
  • the invention can include one or more processor readable storage devices configured to store one or more processor readable instructions to be executed by one or more processors, the instructions directing the processor to perform a method.
  • the method can include receiving a plurality of images from a first camera, receiving a plurality of images from a second camera, determining a first image translation based in part on the first camera images, determining a second image translation based in part on the second camera images, comparing the first image translation to the second image translation, and determining the motion of the device based in part on the comparison.
  • the invention can include one or more processor readable storage devices configured to store one or more processor readable instructions to be executed by one or more processors, the instructions directing the processor to perform a method.
  • the method can include receiving a motion parameter determined from images captured along a first viewing axis and images captured along a second viewing axis substantially opposite the first viewing axis, retrieving a display image from a plurality of stored images, modifying the display image based at least in part on the motion parameter, and communicating the modified display image to a portable device.
  • FIG. 1 is a functional block diagram of a motion sensing device.
  • FIG. 2 is a functional block diagram of a motion sensing system.
  • FIGS. 3A-3D are views of an embodiment of a motion sensing device.
  • FIGS. 4A-4D are illustrative examples of images captured by cameras.
  • FIGS. 5A-5D are illustrative examples of images captured by cameras.
  • FIGS. 6A-6D are illustrative examples of images captured by cameras.
  • FIGS. 7A-7D are illustrative examples of images captured by cameras.
  • FIGS. 8A-8D are illustrative examples of images captured by cameras.
  • FIG. 9 is a flowchart of an embodiment of a motion sensing process.
  • a portable device having motion sensing capabilities and a method of sensing motion in a portable device use images captured from multiple cameras positioned along different viewing axes.
  • the images captured by the multiple cameras can be processed to determine a direction of change.
  • the direction of change in images captured from a first camera relative to the direction of change in images captured from a second camera can be used to sense motion of the device.
  • FIG. 1 is a functional block diagram of a motion sensing device 100 .
  • the device 100 includes a first camera 110 and a second camera 120 coupled to a motion processing module 150 .
  • the motion processing module 150 can include, for example, an Input/Output (I/O) controller 152 and a processor 154 .
  • the motion processing module 150 can be coupled to memory 160 , and can be coupled to a communication interface 170 .
  • the motion processing module 150 can also be coupled to a display 130 and I/O devices 140 .
  • the motion sensing device 100 can be configured to sense the motion of the device based in part on the images captured by the first camera 110 and the second camera 120 .
  • the first camera 110 can be positioned along a first viewing axis, where the viewing axis can include the line that extends from the imaging device in the camera to a center of the image captured by the camera.
  • the second camera 120 can be positioned along a second viewing axis.
  • the first viewing axis is different from the second viewing axis.
  • the second camera 120 can be positioned along a second viewing axis that is substantially opposite the first viewing axis.
  • the second camera 120 can be positioned to capture an image from a back of the device 100 . It may be advantageous to position the cameras 110 and 120 with substantially orthogonal viewing axes if the cameras 110 and 120 are not positioned substantially in opposite directions. However, any two viewing axes can be acceptable provided the images captured by the cameras 110 and 120 are not perfectly correlated.
  • the cameras 110 and 120 can be analog cameras or digital cameras, and each camera can be the same or different from other cameras.
  • a camera, for example 110 , can be an analog camera, a Charge Coupled Device (CCD) camera, a CMOS camera, and the like, or some other device for capturing images.
  • the cameras 110 and 120 may capture visible images or may be configured to capture images outside the visible spectrum.
  • the camera, for example 110 , can be a visible light camera, an infrared camera, and the like, or some other device for capturing images.
  • the first camera 110 can be configured to capture sequential images and the second camera 120 can also be configured to capture sequential images. Where a camera continually captures images, the images captured at two distinct times may be considered sequential images. It may be advantageous for a first image captured by the first camera 110 to be captured approximately at the same time as a first image captured by the second camera 120 . Similarly, it may be advantageous for a second or subsequent image captured by the first camera 110 to be captured approximately at the same time as an image captured by the second camera 120 .
  • the time between an image from the first camera 110 and its counterpart image from the second camera 120 should be less than 1 second and preferably less than 500 mSec, 250 mSec, 200 mSec, 150 mSec, 100 mSec, 50 mSec, 25 mSec or 10 mSec.
  • the difference between the time a first image is captured by the first camera 110 and the time a first image is captured by the second camera 120 can be less than 250 mSec.
  • the images captured by the cameras 110 and 120 are communicated to the motion processing module 150 .
  • the motion processing module can determine, based on the captured images, a direction of relative motion of the originating camera. For example, a left translation of a second image relative to a first image can indicate that the camera has moved right relative to the image. Alternatively, a left translation can indicate movement of the objects in the images.
  • the motion processing module 150 can process the images and determine a relative direction of motion of the first camera 110 based in part on sequential images captured by the first camera 110 . Similarly, the motion processing module 150 can process the sequential images from the second camera 120 and determine, based in part on the images, a relative direction of motion of the second camera 120 .
  • the motion processing module 150 can then compare the direction of motion of the first camera 110 to the direction of motion of the second camera 120 and, based at least in part on the comparison, determine a motion of the device 100 . Additional descriptions of motion sensing are provided below in conjunction with FIGS. 4-8 .
  • the motion processing module 150 can also receive a signal from a base device (not shown) via the communication interface 170 .
  • the received signal can be all or a portion of a signal to output on the display 130 .
  • the motion processing module 150 can output the signal on the display 130 .
  • the display can be, for example, a video display, a CRT display, an LCD display, a plasma display, an array of lights, and the like, or some other device for providing a display.
  • the motion processing module 150 can also output signals to another device (not shown) via the communication interface 170 .
  • the motion processing module 150 can provide signals to another device indicative of the sensed motion.
  • the motion processing module 150 can receive user input via one or more I/O devices 140 and can provide signals to the other device based in part on the user input signals.
  • the I/O devices 140 can include keys, keypads, buttons, joysticks, keyboards, switches, dials, knobs, microphones, touch screens, and the like, or some other device for receiving input.
  • the device 100 can receive, via the communication interface, signals that are produced, at least in part, based on the signals output from the device 100 .
  • the device 100 outputs signals corresponding to the sensed motion and receives a display signal that is modified based on the sensed motion signals.
  • the device 100 outputs the captured images from one or more cameras 110 and 120 and also outputs the sensed device motion.
  • the device 100 can then receive signals that are modified at least in part on the output signals.
  • the received signals can include display signals to be output to the display 130 as well as other output signals.
  • the device 100 outputs the captured images from one or more cameras 110 and 120 and also outputs the sensed device motion.
  • the device 100 can then receive an augmented reality image signal based on the signals output from the device 100 .
  • the device 100 can itself generate the augmented reality image.
  • the motion processing module 150 can then output the augmented reality image on the display 130 .
  • An augmented reality image can be a captured image that is modified to include synthetic or virtual images.
  • the augmented image can include synthetic images superimposed on the captured image.
  • an image captured by a camera, for example 120 can be modified to include virtual images not present in the captured image.
  • the modified image can be displayed to a user via the display 130 .
  • the device 100 can be a handheld device and the first camera 110 can be configured to capture an image of a user's face when held by a user in a typical orientation.
  • the motion processing module 150 can then determine a distance to the user's face based at least in part on the captured image.
  • the first camera 110 can capture an image that includes an image of a user's face.
  • the motion processing module 150 can process the image to locate points in the image corresponding to the user's eyes.
  • the motion processing module 150 can, for example, determine a distance between the eyes and estimate a distance from the user to the device 100 .
  • the first camera 110 can be configured to capture at least two images of a user's face.
  • the motion processing module 150 can then determine a change in distance to the user's face based at least in part on differences in the two captured images (see the distance-estimation sketch following this list).
  • the first camera 110 can capture an image that includes an image of a user's face.
  • the motion processing module 150 can process the image to locate points in the image corresponding to the user's eyes.
  • the motion processing module 150 determines a relative distance between the eyes.
  • the relative distance may be a number of pixels or a percentage of full image width.
  • the motion processing module 150 can repeat the relative eye distance measurement for the second captured image.
  • the motion processing module 150 can then compare the two relative distances to estimate a change in the user's distance to the device 100 .
  • the various modules shown in FIG. 1 and in the subsequent figures can be implemented as hardware, software executed by one or more processors, or a combination of hardware and software executed by processors.
  • Software can include one or more processor readable instructions stored within a processor readable storage device. The one or more instructions can be executed by a processor or a plurality of processors to perform some or all of the functionality related to a module.
  • the processor readable storage device can include the memory 160 .
  • the memory 160 can include ROM, RAM, non-volatile RAM, flash, magnetic memory, optical memory, floppy disks, storage tapes, CDROM, DVD, and the like, or some other device for storing instructions, data, or instructions and data.
  • FIG. 2 is a functional block diagram of a motion sensing system 200 having a plurality of remote devices 100 a and 100 b in communication with a base device 250 .
  • a first remote device 100 a is directly coupled and in direct communication with the base device 250 .
  • the second remote device 100 b couples to and is in communication with the base device using a network 220 .
  • Each of the remote devices 100 a and 100 b can be, for example, the motion sensing device of FIG. 1 .
  • the base device 250 can include a communication interface 252 configured to provide one or more interfaces to one or more remote devices 100 a - 100 b .
  • the communication interface 252 can be coupled to a processor 254 .
  • the processor 254 can also be coupled to memory 256 and a storage device 258 .
  • the storage device 258 can include any type of processor readable memory as previously described.
  • the remote devices 100 a and 100 b can be the same type of device or may be different devices. It may be advantageous for the remote device, for example 100 a , to be a portable device.
  • the remote device 100 a can be a portable gaming device, such as a game controller.
  • the remote device 100 a can be a portable computer.
  • the remote device 100 a can be a personal digital assistant, calculator, phone, or some other type of portable device.
  • the base device 250 can be a game console, a computer, a server, a base station, a host, a plurality of computers, and the like, or some other device. In some embodiments, the base device 250 can be another remote device.
  • the connection between the first remote device 100 a and the base device 250 can be a wired connection, a wireless connection, or some combination of wired and wireless connection.
  • a wired connection can include, but is not limited to, an electrical connection, a mechanical connection, an optical connection, and the like, or some other manner of communicatively coupling the first remote device 100 a to the base device 250 .
  • a wireless connection can include, but is not limited to, an RF connection, a microwave connection, an optical link, an infrared link, an audio link, and the like, or some other manner of communicatively coupling the first remote device 100 a to the base device 250 .
  • the network 220 connecting the second remote device 100 b to the base device 250 can be a private network or may be a public network. Additionally, the network may be a wired network, a wireless network, or some combination of wired and wireless network. The network may also be a Local Area Network (LAN), Metropolitan Area Network (MAN) or a Wide Area Network (WAN). In one embodiment, the network 220 can be a wireless LAN within a residence. In another embodiment, the network 220 can be the Internet.
  • LAN (Local Area Network)
  • MAN (Metropolitan Area Network)
  • WAN (Wide Area Network)
  • Although the communication interface 252 is shown connected to only two remote devices 100 a and 100 b , the communication interface 252 may have only a single port or may have multiple ports, each capable of connecting to one or more remote devices.
  • the motion of a remote device can be sensed completely within the remote device 100 a , completely within the base device 250 , or distributed across the remote device 100 a and the base device 250 .
  • the first remote device 100 a can capture images and sense its motion based in part on the captured images.
  • the first remote device 100 a can communicate the sensed motion and a subset of captured images to the base device 250 .
  • the first remote device 100 a may only communicate a subset of captured images from one of the plurality of cameras.
  • the base device 250 can then receive the sensed motion, receive the subset of captured images, and provide a signal to the remote device 100 a based in part on the received motion and images.
  • the base device 250 may provide an augmented reality display image to the remote device 100 a based in part on the received sensed motion signal and the received subset of images.
  • Each of the remote devices 100 a and 100 b can be a handheld controller having a video display 130 positioned on a surface directed towards the user.
  • the I/O devices 140 for a remote device 100 can be buttons, switches, or devices that are positioned and configured to accept user input, for example, from a user's thumbs or fingers.
  • the first camera 110 can be positioned at the top center of the handheld controller and can have a viewing axis that is aligned with the direction of the display 130 .
  • the second camera 120 can be positioned opposite the first camera 110 and can have a viewing axis that is substantially opposite the viewing axis of the first camera 110 .
  • a game that is played from a storage device 258 can be, for example, a driving game such as an interactive racing game or an interactive street driving game.
  • the base device 250 or game console can send information such as video display information and audio information to the handheld controller.
  • the system can use the multiple cameras to detect the motion of the handheld controller.
  • the handheld controller can operate as a steering wheel and the system can determine when the handheld controller is rotated. The rotation can be interpreted by the game system to be a rotation of a steering wheel.
  • the game console may then modify the video and audio information based on the detected motion.
  • the system may detect motion of the handheld controller and allow the user to drive a course by selectively rotating the handheld controller.
  • Another game that may be displayed from the storage device 258 can be, for example, an augmented reality game.
  • the user can point the second camera 120 of the handheld controller towards a scene, such as a map, image, or a live view.
  • the display 130 on the handheld controller can display an image captured by the second camera 120 that is augmented by one or more images from the base device 250 game console.
  • FIGS. 3A-3D are views of an embodiment of a motion sensing device 100 , which can be the motion sensing device of FIG. 1 .
  • FIG. 3A is a front view of the motion sensing device 100 and shows the position of the first camera 110 in the center top of the device 100 , and a display 130 positioned on the front of the device 100 .
  • the front of the device 100 can also include a first user input device 140 a positioned on the left hand side of the front panel and a second user input device 140 b positioned on a right hand side of the front panel.
  • the first user input device 140 a is shown as a navigation pad, while the second user input device 140 b is shown as a collection of buttons.
  • FIG. 3B is a rear view of the motion sensing device 100 and shows the position of the second camera 120 positioned approximately on a centerline of the back panel.
  • the viewing axis of the first camera 110 is substantially opposite the viewing axis of the second camera 120 .
  • the two viewing axes are not along the same line, although the two axes are substantially parallel.
  • FIG. 3C shows an orientation of the motion sensing device 100 in a rectangular coordinate system for purposes of explanation.
  • the directions referred to in the subsequent figures refer to the directions shown in FIG. 3C .
  • the front of the motion sensing device 100 faces the positive x direction and the rear of the device 100 faces the negative x direction.
  • the first viewing axis can be described as being in the +x direction and the second viewing axis can be described as being in the -x direction.
  • FIG. 3D shows an embodiment of the motion sensing device 100 in relation to a user.
  • the front panel of the device 100 faces the front of the user.
  • the user can see the display 130 and the camera 110 on the front panel of the device 100 can typically capture an image of the user's face.
  • the camera 120 on the back panel of the device 100 can capture an image that is within the user's field of view.
  • the system using the device, base device, or combination of devices, can then display augmented reality images on the display 130 .
  • the image presented on the display 130 can be sized; for example, the system can adjust the image size to make it appear as if the user is looking through a transparent window when the user views the display.
  • the system can add virtual or synthetic images to the displayed image to provide an augmented reality environment.
  • FIGS. 4-8 show examples of images that can be captured by a pair of cameras.
  • the cameras are configured with a first camera on a front of the device and a second camera on a rear of the device.
  • the images are used to help illustrate how the device can perform motion sensing.
  • the figures refer to first and second images captured by two separate cameras. As discussed previously, it may be advantageous for the first and second images captured by the rear facing camera to be captured substantially at the same time, respectively, that the front facing camera captures its first and second images. Additionally, the second image captured by a camera need not be the next image captured by the camera. Instead, the second image can refer to any subsequent image captured by the camera.
  • FIG. 4A shows a first image 400 that can be captured by a front facing camera.
  • the term first is a relative term used to provide a reference relative to subsequent images.
  • the first image 400 from the front facing camera shows an object 402 within the image.
  • FIG. 4B shows a second image 410 captured by the front facing camera.
  • the same object 402 can appear in the second image 410 and can be translated relative to the object 402 in the first image 400 .
  • the object 402 is illustrated as having moved to the left.
  • the motion sensing device can determine a direction of motion by comparing the two images, 400 and 410 .
  • the motion sensing device need not make a detailed determination of the image.
  • the motion sensing device may only determine that there was an image translation, and the direction of the image translation.
  • the motion sensing device may perform additional image analysis and may determine an approximate magnitude of translation in addition to a direction of translation. Because the object 402 in the captured images of the front facing camera is translated to the left, the motion sensing device may determine the direction of motion of the device is to the left. However, the motion sensing device may not be able to make a motion determination without analyzing the images captured by the rear facing camera.
  • FIG. 5A shows a first image 500 captured by a front facing camera.
  • the first image 500 shows an object 502 within the image 500 .
  • FIG. 5B shows a second image 510 captured by the front facing camera.
  • the second image 510 shows the same object 502 shown in the first image 500 , but translated downward.
  • the motion sensing device can determine an upward direction of motion.
  • FIG. 5C shows a first image 520 captured by a rear facing camera.
  • the first image 520 shows an object 512 within the image 520 .
  • FIG. 5D shows a second image 540 captured by the rear facing camera.
  • the second image 540 shows the same object 512 shown in the first image 520 of the rear facing camera. However, the image is translated downward, indicating an upward direction of device motion.
  • the motion sensing device can compare the direction of image translation from the two sets of images to determine a device motion. Because the captured images showed a downward image translation for both of the cameras, the motion of the device can be determined to be an upward translation. Alternatively, the motion sensing device can compare the direction of motion and determine a device motion. Because the motion sensing device determined an upward direction of motion from both sets of images, the motion sensing device can determine that the motion was an upward translation.
  • FIGS. 6A-6D show additional examples of images captured by a front facing camera and a rear facing camera.
  • the images illustrate a twisting translation of the motion sensing device.
  • the twisting translation can also be referred to as a rotation of the motion sensing device.
  • the rotation occurs along an axis of rotation extending in the x direction.
  • FIG. 6A shows a first image 600 captured by a front facing camera.
  • the first image 600 shows an object 602 within the image 600 .
  • FIG. 6B shows a second image 610 captured by the front facing camera.
  • the second image 610 shows the same object 602 shown in the first image 600 , but rotated counterclockwise relative to a viewing axis of the camera. This corresponds to a counterclockwise rotation of the motion sensing device, where the direction refers to counterclockwise as viewed from the front of the device.
  • the motion sensing device can determine a counterclockwise rotation as the direction of motion.
  • FIG. 6C shows a first image 620 captured by a rear facing camera.
  • the first image 620 from the rear facing camera shows an object 612 within the image 620 .
  • FIG. 6D shows a second image 640 captured by the rear facing camera.
  • the second image 640 shows the same object 612 shown in the first image 620 of the rear facing camera. However, the image is rotated clockwise relative to a viewing axis, indicating a counterclockwise rotation of the device as the direction of motion.
  • the motion sensing device can compare the direction of image translation from the two sets of images to determine a device motion. Because the captured images showed opposite image rotations for the two cameras, the motion of the device can be determined to be rotation.
  • the direction of rotation is the same as the direction of rotation shown by the front camera, which in this example is counterclockwise.
  • the motion sensing device can compare the direction of motion and determine a device motion. Because the motion sensing device determined a counterclockwise direction of motion from both sets of images, the motion sensing device can determine that the motion was a counterclockwise rotation along the camera viewing axis.
  • FIGS. 7A-7D show additional examples of images captured by a front facing camera and a rear facing camera.
  • the images illustrate a rotation of the motion sensing device. The rotation occurs along an axis of rotation extending in the z direction.
  • FIG. 7A shows a first image 700 captured by a front facing camera.
  • the first image 700 shows an object 702 within the image 700 .
  • FIG. 7B shows a second image 710 captured by the front facing camera.
  • the second image 710 shows the same object 702 shown in the first image 700 , but translated to the right of the image.
  • the translation can correspond to a right side horizontal translation of the motion sensing device.
  • the motion sensing device can determine that the direction of motion is to the right.
  • FIG. 7C shows a first image 720 captured by a rear facing camera.
  • the first image 720 from the rear facing camera shows an object 712 within the image 720 .
  • FIG. 7D shows a second image 740 captured by the rear facing camera.
  • the second image 740 shows the same object 712 shown in the first image 720 of the rear facing camera.
  • the image is horizontally translated to the right.
  • the horizontal translation can indicate the direction of motion is horizontally to the left.
  • FIGS. 8A-8D show additional examples of images captured by a front facing camera and a rear facing camera.
  • the images illustrate a rotation of the motion sensing device about an axis of rotation along the y axis.
  • FIG. 8A shows a first image 800 captured by a front facing camera.
  • the first image 800 shows an object 802 within the image 800 .
  • FIG. 8B shows a second image 810 captured by the front facing camera.
  • the second image 810 shows the same object 802 shown in the first image 800 , but translated upward.
  • the motion sensing device can determine that a device direction of motion is downward.
  • the opposite direction of image translation, and correspondingly, the opposite device direction of motion can indicate that the device has rotated about an axis of rotation that lies along the y direction.
  • the motion sensing device can use the image translations or the directions of motion to determine that the direction of rotation is counterclockwise as viewed from the right of the device.
  • a motion sensing device can thus determine a direction of motion based on images captured by a plurality of cameras. Only two cameras are shown in the examples, but more than two cameras can be used in the device. Additionally, two cameras are shown having viewing axes that are in substantially opposite directions. However, the multiple cameras can be positioned along any viewing axes that are not identical or that do not result in completely correlated images.
  • the motion sensing device can determine if the device has translated or rotated.
  • the motion sensing device can determine horizontal translation along an axis parallel to a viewing axis by comparing, for example, the size of objects in two or more images.
  • A summary of image translations and motion determination is provided in Table 1 for a two-camera embodiment having a first camera facing forward and a second camera facing rearward.
  • any device translation or rotation can be determined based on a combination of on-axis translations and/or rotations.
  • an arbitrary device translation can be determined as a combination of device translations in each axis direction.
  • the vector sum of each of the on-axis translations can equal a translation to any three dimensional location.
  • a rotation along any arbitrary axis can be determined as a combination of device rotations along each axis.
  • a translation of the device along an arbitrary direction can be determined as a vector sum of motion along the x, y, and z axes.
  • translation along each of the three axes can be determined exclusive of translation along another axis.
  • the motion sensing system may, for example, determine translation along the x-axis, followed by the y-axis and the z-axis.
  • the total translation can be determined as the vector sum of each of the translations.
  • rotation along an arbitrary axis can be determined as a sum of rotations along each of the axes.
  • As shown in Table 1, the movements of the camera images used to determine the axis of rotation are independent, and thus the individual axis rotations can be determined from an arbitrary rotation.
  • a rotation can be differentiated from a translation by comparing the magnitude of the different image translations (see the classification sketch following this list). For example, a y-axis translation and a z-axis rotation both can be determined based on left or right image translations. However, when the cameras are positioned substantially opposite one another, the magnitude of image translation should be approximately the same for images from the two cameras. When the magnitude of image translation from the two cameras differs, the system can determine a composite translation and rotation. The difference in the magnitude of translations, and the direction of the difference, can be used to determine the component of image translation attributable to either device translation or rotation.
  • FIG. 9 is a flowchart of an embodiment of a motion sensing process 900 that can be implemented, for example, in the motion sensing device of FIG. 1 or the motion sensing system of FIG. 2 .
  • the process 900 can begin at block 902 when the motion sensing device captures a first image from a first camera.
  • the first camera can be, for example, a forward facing camera.
  • the motion sensing device next proceeds to block 904 where the device captures a first image from a second camera.
  • the second camera can be, for example, a rear facing camera.
  • the motion sensing device next proceeds to block 912 and captures a second image from the first camera.
  • the motion sensing device captures the second image from the second camera.
  • the first and second images from each of the first and second cameras can be, for example, stored in memory or some storage device.
  • the motion sensing device next proceeds to block 920 and compares the first and second images from the first camera.
  • the motion sensing device determines an image translation based on the comparison of the captured images.
  • the motion sensing device can determine a device direction of motion from the first camera images.
  • the motion sensing device next proceeds to block 924 and determines a distance from the motion sensing device to a user.
  • the motion sensing device can determine or estimate the distance to a user by analyzing an image of the user's face to determine a distance between the eyes, and thus derive a distance to the device.
  • the motion sensing device then proceeds to block 930 and compares the images captured by the second camera.
  • the motion sensing device determines an image translation based on the comparison of the captured second camera images.
  • the motion sensing device can determine a device direction of motion from the second camera images.
  • the motion sensing device next proceeds to block 940 where it compares the image translation determined from the first camera images to the image translation determined from the second camera images.
  • the motion sensing device can compare, for example, the direction of horizontal or vertical translation and the direction of rotation.
  • the motion sensing device determines a direction of device motion based in part on the comparison of image translations.
  • the motion sensing device can also determine a magnitude of the device motion based in part on the magnitude of image translation or image rotation in each of the first and second camera image pairs.
  • the motion sensing device can communicate the motion parameters and one or more images to another device, such as the base device or another motion sensing device as shown in FIG. 2 .
  • the motion parameters can include, for example, the direction of motion, including translation and rotation, the magnitude of the motion, and the distance from the user.
  • the images can include some or all of the images captured by the cameras. In one embodiment, the motion sensing device communicates the images captured by the rear facing camera and does not communicate the images from the front facing camera.
  • the motion sensing device then proceeds to block 980 and can receive display signals from another device, such as the base device or another motion sensing device.
  • the display signals can be signals that are generated based in part on the images and motion parameters communicated by the motion sensing device.
  • the motion sensing device receives augmented reality display signals from a base device and displays the augmented reality signals on a display.
  • the various steps or blocks of the process 900 can be performed in another order and do not need to be performed in the order shown in FIG. 9 . Additionally, blocks, steps, or functions may be added to the process 900 or deleted from the process 900 . Additional blocks, steps, or functions can be added at the beginning of the process, at the end of the process, or in between one or more existing blocks of the process.
  • a motion sensing device can determine motion based in part on images captured from a plurality of cameras.
  • the motion sensing device can compare a first set of images from a first camera and determine image translation within the first set of images.
  • the motion sensing device can also compare a second set of images from a second camera and determine image translation within the second set of images.
  • the motion sensing device can then compare the image translation determined from the first set of images to the image translation from the second set of images.
  • the motion sensing device can then determine motion based in part on the comparison.
  • DSP (digital signal processor)
  • RISC (Reduced Instruction Set Computer)
  • ASIC (application specific integrated circuit)
  • FPGA (field programmable gate array)
  • a general purpose processor may be a microprocessor, but in the alternative, the processor may be any processor, controller, microcontroller, or state machine.
  • a processor may also be implemented as a combination of computing devices, for example, a combination of a DSP and a microprocessor, a plurality of microprocessors, one or more microprocessors in conjunction with a DSP core, or any other such configuration.
  • a software module may reside in RAM memory, flash memory, non-volatile memory, ROM memory, EPROM memory, EEPROM memory, registers, hard disk, a removable disk, a CD-ROM, or any other form of storage medium known in the art.
  • An exemplary storage medium can be coupled to the processor such that the processor can read information from, and write information to, the storage medium. In the alternative, the storage medium may be integral to the processor.
  • the processor and the storage medium may reside in an ASIC.
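
The bullets above describe the front/rear comparison and the eye-spacing distance estimate only qualitatively. The two Python sketches below show one possible reading of that logic; the function names, thresholds, sign conventions, and axis labels are illustrative assumptions, not the patented implementation. The first sketch covers the translation and rotation cases of FIGS. 5, 7, and 8 in the style of Table 1; the roll case of FIG. 6 would additionally require estimating in-image rotation, which is not shown here.

```python
def _sign(v: float, tol: float) -> int:
    """Quantize an image shift into -1, 0, +1 with a small dead-band for noise."""
    return 1 if v > tol else (-1 if v < -tol else 0)


def classify_motion(front_shift, rear_shift, tol: float = 2.0):
    """Classify device motion from the (dx, dy) image shifts measured by the
    front and rear cameras. Assumed convention: positive dx means image content
    moved right, positive dy means image content moved down, in each camera's
    own frame."""
    fdx, fdy = (_sign(v, tol) for v in front_shift)
    rdx, rdy = (_sign(v, tol) for v in rear_shift)
    motions = []

    # Vertical: the same image direction in both cameras indicates a vertical
    # device translation (FIG. 5); opposite directions indicate a rotation
    # about the side-to-side y axis (FIG. 8). Content moving down in both
    # images corresponds to the device moving up.
    if fdy and fdy == rdy:
        motions.append("translate up" if fdy > 0 else "translate down")
    elif fdy and rdy and fdy == -rdy:
        motions.append("rotate about y axis")

    # Horizontal: because the cameras face opposite directions, the same image
    # direction in both cameras indicates a rotation about the vertical z axis
    # (FIG. 7), while opposite directions indicate a horizontal translation.
    if fdx and fdx == rdx:
        motions.append("rotate about z axis")
    elif fdx and rdx and fdx == -rdx:
        motions.append("translate horizontally")

    return motions or ["no motion detected"]
```

For example, classify_motion((5, 0), (5, 0)) reports a rotation about the z axis, matching FIGS. 7A-7D, while classify_motion((0, 5), (0, 5)) reports an upward translation, matching FIGS. 5A-5D.

The eye-spacing distance estimate can be sketched the same way; the pinhole-camera model and the nominal 63 mm interpupillary distance are assumptions added for illustration.

```python
def user_distance_m(eye_separation_px: float, focal_length_px: float,
                    assumed_ipd_m: float = 0.063) -> float:
    """Pinhole-camera estimate of the user-to-device distance from the pixel
    distance between the user's eyes in a front-camera image."""
    return focal_length_px * assumed_ipd_m / eye_separation_px


def distance_change_ratio(eye_px_first: float, eye_px_second: float) -> float:
    """Relative change in distance between two front-camera images: a ratio
    greater than 1 means the eyes appear closer together in the second image,
    i.e. the user and device have moved farther apart."""
    return eye_px_first / eye_px_second
```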

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Studio Devices (AREA)
  • Telephone Function (AREA)
  • Image Analysis (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

Motion sensing of a portable device using two cameras. A first camera is directed along a first viewing axis and a second camera is directed along a second viewing axis, different from the first viewing axis. The second viewing axis can be substantially opposite the first viewing axis. A motion processing module determines changes in images from the first camera and changes in images from the second camera. The motion processing module compares the direction of change determined from the first camera images relative to the direction of change determined from the second camera images. The motion processing module determines the motion of the portable device based in part on the comparison.

Description

BACKGROUND OF THE INVENTION
Sensing the motion of a device can be a time consuming task that requires intensive numerical processing. Additionally, motion sensing may only provide a gross measure of the motion of a device. Sensing motion of a portable object may require costly hardware and complicated numerical processing techniques.
Motion sensing using trilateration of positions, as performed by a Global Positioning System (GPS) or a hybrid position determination system, may be inadequate for sensing motion of a portable device. A GPS system or hybrid position determination system typically provides inadequate precision to sense motion of a portable device. GPS can require a great deal of time to acquire an initial position fix. Additionally, a GPS system or hybrid position determination system typically cannot determine rotational motion of a portable device.
Motion sensing systems incorporating gyroscopes can be used to sense the motion of a portable device. However, such systems are typically costly and not suitable for low cost designs. The costs associated with gyroscopes used in motion sensing may exceed a perceived value of the entire portable device.
BRIEF SUMMARY OF THE INVENTION
Motion sensing of a device using a plurality of cameras is disclosed. A first camera can be directed along a first viewing axis and a second camera can be directed along a second viewing axis, different from the first viewing axis. The second viewing axis can be substantially opposite the first viewing axis. A processor can determine changes in images from the first camera and changes in images from the second camera to determine respective directions of change. The processor can compare the direction of change determined from the first camera images relative to the direction of change determined from the second camera images. The processor can then determine the motion of the device based in part on the comparison.
One aspect of the invention includes a motion sensing device. The motion sensing device can include a first camera configured to capture images along a first viewing axis, a second camera configured to capture images along a second viewing axis different from the first viewing axis, and a motion processing module configured to determine motion of the device based in part on the images from the first camera and the second camera.
In another aspect, the invention includes a motion sensing device. The motion sensing device can include a first camera having a viewing axis directed away from a front of the device, and configured to capture images, a second camera having a viewing axis directed away from a rear of the device, and configured to capture images, a motion processing module configured to determine a motion of the device based in part on the images from the first camera and the second camera, and further configured to generate a display image based in part on the motion of the device, and a display coupled to the motion processing module and configured to display the display image.
In still another aspect, the invention includes a motion sensing system. The system can include a first camera positioned along a first viewing axis, and configured to capture first camera images, a second camera positioned along a second viewing axis different from the first viewing axis, and configured to capture second camera images, a motion processing module configured to determine a motion parameter based in part on at least two first camera images and at least two second camera images, a base device configured to process the motion parameter and generate a display signal based in part on the motion parameter, and a display configured to display the display signal generated by the base device.
In still another aspect, the invention can include a motion sensing method. The method can include receiving a plurality of images from a first camera, receiving a plurality of images from a second camera, determining a first image translation based in part on the first camera images, determining a second image translation based in part on the second camera images, comparing the first image translation to the second image translation, and determining the motion of the device based in part on the comparison.
In yet another aspect, the invention can include a motion sensing method. The method can include receiving a motion parameter determined from images captured along a first viewing axis and images captured along a second viewing axis substantially opposite the first viewing axis, retrieving a display image from a plurality of stored images, modifying the display image based at least in part on the motion parameter, and communicating the modified display image to a portable device.
In yet another aspect, the invention can include one or more processor readable storage devices configured to store one or more processor readable instructions to be executed by one or more processors, the instructions directing the processor to perform a method. The method can include receiving a plurality of images from a first camera, receiving a plurality of images from a second camera, determining a first image translation based in part on the first camera images, determining a second image translation based in part on the second camera images, comparing the first image translation to the second image translation, and determining the motion of the device based in part on the comparison.
In still another aspect, the invention can include one or more processor readable storage devices configured to store one or more processor readable instructions to be executed by one or more processors, the instructions directing the processor to perform a method. The method can include receiving a motion parameter determined from images captured along a first viewing axis and images captured along a second viewing axis substantially opposite the first viewing axis, retrieving a display image from a plurality of stored images, modifying the display image based at least in part on the motion parameter, and communicating the modified display image to a portable device.
BRIEF DESCRIPTION OF THE DRAWINGS
The features, objects, and advantages of embodiments of the disclosure will become more apparent from the detailed description set forth below when taken in conjunction with the drawings, in which like elements bear like reference numerals.
FIG. 1 is a functional block diagram of a motion sensing device.
FIG. 2 is a functional block diagram of a motion sensing system.
FIGS. 3A-3D are views of an embodiment of a motion sensing device.
FIGS. 4A-4D are illustrative examples of images captured by cameras.
FIGS. 5A-5D are illustrative examples of images captured by cameras.
FIGS. 6A-6D are illustrative examples of images captured by cameras.
FIGS. 7A-7D are illustrative examples of images captured by cameras.
FIGS. 8A-8D are illustrative examples of images captured by cameras.
FIG. 9 is a flowchart of an embodiment of a motion sensing process.
DETAILED DESCRIPTION OF THE INVENTION
A portable device having motion sensing capabilities and a method of sensing motion in a portable device are disclosed. The device and method use images captured from multiple cameras positioned along different viewing axes. The images captured by the multiple cameras can be processed to determine a direction of change. The direction of change in images captured from a first camera relative to the direction of change in images captured from a second camera can be used to sense motion of the device.
FIG. 1 is a functional block diagram of a motion sensing device 100. The device 100 includes a first camera 110 and a second camera 120 coupled to a motion processing module 150. The motion processing module 150 can include, for example, an Input/Output (I/O) controller 152 and a processor 154. The motion processing module 150 can be coupled to memory 160, and can be coupled to a communication interface 170. The motion processing module 150 can also be coupled to a display 130 and I/O devices 140.
The motion sensing device 100 can be configured to sense the motion of the device based in part on the images captured by the first camera 110 and the second camera 120. The first camera 110 can be positioned along a first viewing axis, where the viewing axis can include the line that extends from the imaging device in the camera to a center of the image captured by the camera. The second camera 120 can be positioned along a second viewing axis. Preferably, the first viewing axis is different from the second viewing axis. In one embodiment, the second camera 120 can be positioned along a second viewing axis that is substantially opposite the first viewing axis. That is, if the first camera 110 captures an image from a front of the device 100, the second camera 120 can be positioned to capture an image from a back of the device 100. It may be advantageous to position the cameras 110 and 120 with substantially orthogonal viewing axes if the cameras 110 and 120 are not positioned substantially in opposite directions. However, any two viewing axes can be acceptable provided the images captured by the cameras 110 and 120 are not perfectly correlated.
The cameras 110 and 120 can be analog cameras or digital cameras, and each camera can be the same or different from other cameras. For example, a camera such as the camera 110 can be an analog camera, a Charge Coupled Device (CCD) camera, a CMOS camera, and the like, or some other device for capturing images. Additionally, the cameras 110 and 120 may capture visible images or may be configured to capture images outside the visible spectrum. For example, the camera 110 can be a visible light camera, an infrared camera, and the like, or some other device for capturing images.
The first camera 110 can be configured to capture sequential images and the second camera 120 can also be configured to capture sequential images. Where a camera continually captures images, the images captured at two distinct times may be considered sequential images. It may be advantageous for a first image captured by the first camera 110 to be captured approximately at the same time as a first image captured by the second camera 120. Similarly, it may be advantageous for a second or subsequent image captured by the first camera 110 to be captured approximately at the same time as an image captured by the second camera 120. The time between an image from the first camera 110 and its counterpart image from the second camera 120 should be less than 1 second and preferably less than 500 mSec, 250 mSec, 200 mSec, 150 mSec, 100 mSec, 50 mSec, 25 mSec or 10 mSec. Thus, for example, the difference between the time a first image is captured by the first camera 110 and the time a first image is captured by the second camera 120 can be less than 250 mSec.
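A minimal sketch of the pairing constraint described above, assuming each captured frame carries a timestamp; the tuple layout and the function name are illustrative choices, with the 250 mSec tolerance taken from the preferred bounds listed in the preceding paragraph.

```python
from bisect import bisect_left


def pair_frames(front_frames, rear_frames, max_skew_s: float = 0.25):
    """Pair each front-camera frame with the nearest-in-time rear-camera frame,
    keeping only pairs captured within max_skew_s seconds of each other.
    Each frame is a (timestamp_in_seconds, image) tuple; both lists are
    assumed to be sorted by timestamp."""
    rear_times = [t for t, _ in rear_frames]
    pairs = []
    for t, image in front_frames:
        i = bisect_left(rear_times, t)
        # consider the rear frames immediately before and after time t
        candidates = [j for j in (i - 1, i) if 0 <= j < len(rear_times)]
        if not candidates:
            continue
        j = min(candidates, key=lambda k: abs(rear_times[k] - t))
        if abs(rear_times[j] - t) <= max_skew_s:
            pairs.append(((t, image), rear_frames[j]))
    return pairs
```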
The images captured by the cameras 110 and 120 are communicated to the motion processing module 150. The motion processing module can determine, based on the captured images, a direction of relative motion of the originating camera. For example, a left translation of a second image relative to a first image can indicate that the camera has moved right relative to the image. Alternatively, a left translation can indicate movement of the objects in the images. The motion processing module 150 can process the images and determine a relative direction of motion of the first camera 110 based in part on sequential images captured by the first camera 110. Similarly, the motion processing module 150 can process the sequential images from the second camera 120 and determine, based in part on the images, a relative direction of motion of the second camera 120. The motion processing module 150 can then compare the direction of motion of the first camera 110 to the direction of motion of the second camera 120 and, based at least in part on the comparison, determine a motion of the device 100. Additional descriptions of motion sensing are provided below in conjunction with FIGS. 4-8.
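The disclosure does not specify how the direction of change is computed. As a minimal sketch, assuming grayscale frames are available as NumPy arrays, phase correlation (one common technique, not necessarily the one used here) can estimate the dominant shift between two sequential images and reduce it to a coarse direction label; the sign conventions below are illustrative and only need to be applied consistently to both cameras:

import cv2
import numpy as np

def image_shift(prev_gray, curr_gray):
    """Estimate the (dx, dy) shift between two sequential frames via phase correlation."""
    (dx, dy), _response = cv2.phaseCorrelate(np.float32(prev_gray), np.float32(curr_gray))
    return dx, dy

def shift_direction(dx, dy, threshold=1.0):
    """Reduce a sub-pixel shift to coarse labels; the device only needs a direction of change."""
    horizontal = "right" if dx > threshold else "left" if dx < -threshold else "none"
    vertical = "down" if dy > threshold else "up" if dy < -threshold else "none"
    return horizontal, vertical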
The motion processing module 150 can also receive a signal from a base device (not shown) via the communication interface 170. The received signal can be all or a portion of a signal to output on the display 130. The motion processing module 150 can output the signal on the display 130. The display can be, for example, a video display, a CRT display, an LCD display, a plasma display, an array of lights, and the like, or some other device for providing a display.
The motion processing module 150 can also output signals to another device (not shown) via the communication interface 170. For example, the motion processing module 150 can provide signals to another device indicative of the sensed motion. Additionally, the motion processing module 150 can receive user input via one or more I/O devices 140 and can provide signals to the other device based in part on the user input signals. The I/O devices 140 can include keys, keypads, buttons, joy sticks, keyboards, switches, dials, knobs, microphones, touch screens, and the like, or some other device for receiving input.
The device 100 can receive, via the communication interface, signals that are produced, at least in part, based on the signals output from the device 100. In one embodiment, the device 100 outputs signals corresponding to the sensed motion and receives a display signal that is modified based on the sensed motion signals.
In another embodiment, the device 100 outputs the captured images from one or more cameras 110 and 120 and also outputs the sensed device motion. The device 100 can then receive signals that are modified at least in part on the output signals. The received signals can include display signals to be output to the display 130 as well as other output signals.
In still another embodiment, the device 100 outputs the captured images from one or more cameras 110 and 120 and also outputs the sensed device motion. The device 100 can then receive an augmented reality image signal based on the signals output from the device 100. Alternatively, the device 100 can itself generate the augmented reality image.
The motion processing module 150 can then output the augmented reality image on the display 130. An augmented reality image can be a captured image that is modified to include synthetic or virtual images. The augmented image can include synthetic images superimposed on the captured image. Thus, an image captured by a camera, for example 120, can be modified to include virtual images not present in the captured image. The modified image can be displayed to a user via the display 130.
In another embodiment, the device 100 can be a handheld device and the first camera 110 can be configured to capture an image of a user's face when held by a user in a typical orientation. The motion processing module 150 can then determine a distance to the user's face based at least in part on the captured image. For example, the first camera 110 can capture an image that includes an image of a user's face. The motion processing module 150 can process the image to locate points in the image corresponding to the user's eyes. The motion processing module 150 can, for example, determine a distance between the eyes and estimate a distance from the user to the device 100.
In still another embodiment, the first camera 110 can be configured to capture at least two images of a user's face. The motion processing module 150 can then determine a change in distance to the user's face based at least in part on differences in the two captured images. For example, the first camera 110 can capture an image that includes an image of a user's face. The motion processing module 150 can process the image to locate points in the image corresponding to the user's eyes. The motion processing module 150 determines a relative distance between the eyes. For example, the relative distance may be a number of pixels or a percentage of full image width. The motion processing module 150 can repeat the relative eye distance measurement for the second captured image. The motion processing module 150 can then compare the two relative distances to estimate a change in the user's distance to the device 100.
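As a minimal sketch of this eye-separation comparison, assuming the eye locations are provided by a separate face or landmark detector (not specified in this disclosure), the apparent separation in pixels can be compared between two images; under a pinhole-camera assumption the separation scales inversely with the distance to the user:

import math

def eye_pixel_distance(left_eye, right_eye):
    """Pixel distance between two (x, y) eye locations in a single image."""
    return math.hypot(right_eye[0] - left_eye[0], right_eye[1] - left_eye[1])

def relative_distance_change(eyes_image1, eyes_image2):
    """Ratio of the user-to-device distance in the second image to that in the first.

    A value greater than 1.0 indicates the user has moved farther from the device
    (the eyes appear closer together); less than 1.0 indicates the user moved closer."""
    d1 = eye_pixel_distance(*eyes_image1)
    d2 = eye_pixel_distance(*eyes_image2)
    return d1 / d2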
The various modules shown in FIG. 1 and in the subsequent figures can be implemented as hardware, software executed by one or more processors, or a combination of hardware and software executed by processors. Software can include one or more processor readable instructions stored within a processor readable storage device. The one or more instructions can be executed by a processor or a plurality of processors to perform some or all of the functionality related to a module.
The processor readable storage device can include the memory 160. The memory 160 can include ROM, RAM, non-volatile RAM, flash, magnetic memory, optical memory, floppy disks, storage tapes, CDROM, DVD, and the like, or some other device for storing instructions, data, or instructions and data.
FIG. 2 is a functional block diagram of a motion sensing system 200 having a plurality of remote devices 100 a and 100 b in communication with a base device 250. In the embodiment shown in FIG. 2, a first remote device 100 a is directly coupled and in direct communication with the base device 250. The second remote device 100 b is coupled to and in communication with the base device 250 via a network 220. Each of the remote devices 100 a and 100 b can be, for example, the motion sensing device of FIG. 1.
The base device 250 can include a communication interface 252 configured to provide one or more interfaces to one or more remote devices 100 a-100 b. The communication interface 252 can be coupled to a processor 254. The processor 254 can also be coupled to memory 256 and a storage device 258. The storage device 258 can include any type of processor readable memory as previously described.
The remote devices 100 a and 100 b can be the same type of device or may be different devices. It may be advantageous for the remote device, for example 100 a, to be a portable device. In one embodiment, the remote device 100 a can be a portable gaming device, such as a game controller. In another embodiment, the remote device 100 a can be a portable computer. In still another embodiment, the remote device 100 a can be a personal digital assistant, calculator, phone, or some other type of portable device.
Similarly, the base device 250 can be a game console, a computer, a server, a base station, a host, a plurality of computers, and the like, or some other device. In some embodiments, the base device 250 can be another remote device.
The connection between the first remote device 100 a and the base device 250 can be a wired connection, a wireless connection, or some combination of wired and wireless connection. A wired connection can include, but is not limited to, an electrical connection, a mechanical connection, an optical connection, and the like, or some other manner of communicatively coupling the first remote device 100 a to the base device 250. A wireless connection can include, but is not limited to, an RF connection, a microwave connection, an optical link, an infrared link, an audio link, and the like, or some other manner of communicatively coupling the first remote device 100 a to the base device 250.
The network 220 connecting the second remote device 100 b to the base device 250 can be a private network or may be a public network. Additionally, the network may be a wired network, a wireless network, or some combination of wired and wireless network. The network may also be a Local Area Network (LAN), Metropolitan Area Network (MAN) or a Wide Area Network (WAN). In one embodiment, the network 220 can be a wireless LAN within a residence. In another embodiment, the network 220 can be the Internet.
Although the communication interface 252 is shown connected to only two remote devices 100 a and 100 b, the communication interface 252 may have only a single port or may have multiple ports, each capable of connecting to one or more remote devices.
In the motion sensing system of FIG. 2, the motion of a remote device, for example 100 a, can be sensed completely within the remote device 100 a, completely within the base device 250, or distributed across the remote device 100 a and the base device 250.
For example, in an embodiment the base device 250 can receive images captured by the cameras in the first remote device 100 a. The base device 250 can then determine the motion of the first remote device 100 a based in part on the captured images. The base device 250 can then provide display images to the first remote device 100 a based on the captured images and the sensed motion. The base device 250 can provide, for example, an augmented reality image to be displayed on the first remote device 100 a.
In another embodiment, the first remote device 100 a can capture images and sense its motion based in part on the captured images. The first remote device 100 a can communicate the sensed motion and a subset of captured images to the base device 250. For example, the first remote device 100 a may only communicate a subset of captured images from one of the plurality of cameras. The base device 250 can then receive the sensed motion, receive the subset of captured images, and provide a signal to the remote device 100 a based in part on the received motion and images. For example, the base device 250 may provide an augmented reality display image to the remote device 100 a based in part on the received sensed motion signal and the received subset of images.
Additionally, the base device 250 can estimate the distance of the first remote device 100 a from its user, based in part on the received captured images, and provide a display image that is modified based in part on the estimated distance. Alternatively, the remote device 100 a can estimate its distance from the user and communicate the estimate to the base device 250. The base device 250 can receive the distance estimate, generate the display image based in part on the distance estimate, and provide the display image to the remote device 100 a.
In still another embodiment, the base device 250 may receive a combination of captured images, sensed motion, and user distance estimate from a first remote device 100 a and generate a display image based in part on the received signals. The base device 250 can communicate the display image to one or more remote devices, for example 100 b, that may or may not include the first remote device 100 a.
In an embodiment, the motion sensing system of FIG. 2 can be a video gaming system. The remote devices 100 a and 100 b can be handheld controllers for the video gaming system and the base device 250 can be a video game console or game processor. The storage device 258 within the base device 250 can be, for example, an optical disk reader configured to interface with one or more optical disks, such as Compact Discs (CD) or Digital Versatile Discs (DVD) having stored thereon a video game or an interactive player game.
Each of the remote devices 100 a and 100 b can be a handheld controller having a video display 130 positioned on a surface directed towards the user. The I/O devices 140 for a remote device 100 can be buttons, switches, or devices that are positioned and configured to accept user input, for example, from a user's thumbs or fingers. The first camera 110 can be positioned at the top center of the handheld controller and can have a viewing axis that is aligned with the direction of the display 130. The second camera 120 can be positioned opposite the first camera 110 and can have a viewing axis that is substantially opposite the viewing axis of the first camera 110.
A game that is played from a storage device 258 can be, for example, a driving game such as an interactive racing game or an interactive street driving game. The base device 250 or game console can send information such as video display information and audio information to the handheld controller. The system can use the multiple cameras to detect the motion of the handheld controller. For example, the handheld controller can operate as a steering wheel and the system can determine when the handheld controller is rotated. The rotation can be interpreted by the game system to be a rotation of a steering wheel. The game console may then modify the video and audio information based on the detected motion. For example, the system may detect motion of the handheld controller and allow the user to drive a course by selectively rotating the handheld controller.
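As one hypothetical way such a game could consume the sensed rotation, the detected roll of the controller about its x axis might be mapped to a normalized steering value; the maximum roll angle below is an illustrative parameter, not something specified in this disclosure:

def steering_input(roll_degrees, max_roll_degrees=90.0):
    """Map sensed controller roll to a steering value clamped to [-1.0, 1.0]."""
    value = roll_degrees / max_roll_degrees
    return max(-1.0, min(1.0, value))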
Another game that may be displayed from the storage device 258 can be, for example, an augmented reality game. The user can point the second camera 120 of the handheld controller towards a scene, such as a map, image, or a live view. The display 130 on the handheld controller can display an image captured by the second camera 120 that is augmented by one or more images from the base device 250 game console.
The system may monitor the motion of the handheld controller as well as the image captured by the second camera 120 and provide an augmented reality signal to the handheld controller in response to the captured image and detected motion. Thus, for example, the display 130 can show the image captured by the second camera 120 having augmented reality images inserted based on information provided by the game console. In an outdoor game setting, such a game may allow a user to roam through a real environment, such as a home or outdoor setting, and have an interactive gaming experience using augmented reality. Such games may include, for example, hunting games or combat games. Augmented reality animals or combatants, for example, may be added to images captured by the second camera 120.
FIGS. 3A-3D are views of an embodiment of a motion sensing device 100, which can be the motion sensing device of FIG. 1. FIG. 3A is a front view of the motion sensing device 100 and shows the position of the first camera 110 in the center top of the device 100, and a display 130 positioned on the front of the device 100. The front of the device 100 can also include a first user input device 140 a positioned on the left hand side of the front panel and a second user input device 140 b positioned on a right hand side of the front panel. The first user input device 140 a is shown as a navigation pad, while the second user input device 140 b is shown as a collection of buttons.
FIG. 3B is a rear view of the motion sensing device 100 and shows the position of the second camera 120 positioned approximately on a centerline of the back panel. The viewing axis of the first camera 110 is substantially opposite the viewing axis of the second camera 120. The two viewing axes are not along the same line, although the two axes are substantially parallel.
FIG. 3C shows an orientation of the motion sensing device 100 in a rectangular coordinate system for purposes of explanation. The directions referred to in the subsequent figures refer to the directions shown in FIG. 3C. As shown in FIG. 3C, the front of the motion sensing device 100 faces the positive x direction and the rear of the device 100 faces the negative x direction. Thus, the first viewing axis can be described as being in the +x direction and the second viewing axis can be described as being in the −x direction.
FIG. 3D shows an embodiment of the motion sensing device 100 in relation to a user. In a typical configuration, the front panel of the device 100 faces the front of the user. The user can see the display 130 and the camera 110 on the front panel of the device 100 can typically capture an image of the user's face.
In an augmented reality embodiment, the camera 120 on the back panel of the device 100, which is not shown in FIG. 3D, can capture an image that is within the user's field of view. The system, using the device, base device, or combination of devices, can then display augmented reality images on the display 130. By estimating the distance from the device 100 to the user, the image presented on the display 130 can be sized. The system can adjust the image size to make it appear as if the user is looking through a transparent window when the user views the display. However, the system can add virtual or synthetic images to the displayed image to provide an augmented reality environment.
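A sketch of the sizing geometry, under the assumption that the display width and the rear camera's horizontal field of view are known (both are illustrative parameters here): the display subtends an angle of 2·atan(w / 2d) at the user's eye, so showing only that angular slice of the rear camera image approximates the transparent-window effect.

import math

def window_crop_fraction(distance_to_user_mm, display_width_mm, camera_hfov_deg):
    """Fraction of the rear camera's horizontal field of view to display so that
    the screen approximates a transparent window at the estimated user distance."""
    window_angle_deg = 2.0 * math.degrees(
        math.atan(display_width_mm / (2.0 * distance_to_user_mm)))
    return min(1.0, window_angle_deg / camera_hfov_deg)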
FIGS. 4-8 show examples of images that can be captured by a pair of cameras. The cameras are configured with a first camera on a front of the device and a second camera on a rear of the device. The images are used to help illustrate how the device can perform motion sensing. The figures refer to first and second images captured by two separate cameras. As discussed previously, it may be advantageous for the first and second images captured by the rear facing camera to be captured substantially at the same time, respectively, that the front facing camera captures its first and second images. Additionally, the second image captured by a camera need not be the next image captured by the camera. Instead, the second image can refer to any subsequent image captured by the camera.
FIGS. 4A-4D show examples of images captured by a front facing camera and a rear facing camera. The images illustrate a horizontal translation of the device. Using the coordinate system of FIG. 3C, the horizontal translation can refer to movement along the +y or −y direction.
FIG. 4A shows a first image 400 that can be captured by a front facing camera. The term first is a relative term used to provide a reference relative to subsequent images. The first image 400 from the front facing camera shows an object 402 within the image. FIG. 4B shows a second image 410 captured by the front facing camera. The same object 402 can appear in the second image 410 and can be translated relative to the object 402 in the first image 400. The object 402 is illustrated as having moved to the left. The motion sensing device can determine a direction of motion by comparing the two images, 400 and 410. The motion sensing device need not make a detailed determination of the image. The motion sensing device may only determine that there was an image translation, and the direction of the image translation. Of course, the motion sensing device may perform additional image analysis and may determine an approximate magnitude of translation in addition to a direction of translation. Because the object 402 in the captured images of the front facing camera is translated to the left, the motion sensing device may determine the direction of motion of the device is to the left. However, the motion sensing device may not be able to make a motion determination without analyzing the images captured by the rear facing camera.
FIG. 4C shows a first image 420 captured by a rear facing camera. The first image 420 also shows an object 412 within the image 420. FIG. 4D shows a second image 430 captured by the rear facing camera. The second image 430 shows the same object 412 as in the first image 420, but translated to one side. The object 412 captured by the rear facing camera is shown as translated to the right. Thus, the motion sensing device may determine a direction of motion of the device is to the left.
The objects in the captured images appear to be translated in opposite directions, but because the two cameras face in opposite directions, the motion sensing device is able to determine that the direction of motion is the same. Because the motion sensing device determined that the direction of motion detected by the two cameras is the same, the motion sensing device is able to determine that the device translated to the left. Alternatively, the motion sensing device may determine a direction of image translation from images from a first camera. The motion sensing device can also determine a direction of image translation from images from a second camera. The motion sensing device can compare the two directions of image translation and determine a direction of device motion.
FIGS. 5A-5D show additional examples of images captured by a front facing camera and a rear facing camera. The images illustrate a vertical translation of the motion sensing device.
FIG. 5A shows a first image 500 captured by a front facing camera. The first image 500 shows an object 502 within the image 500. FIG. 5B shows a second image 510 captured by the front facing camera. The second image 510 shows the same object 502 shown in the first image 500, but translated downward. Thus, the motion sensing device can determine an upward direction of motion.
FIG. 5C shows a first image 520 captured by a rear facing camera. The first image 520 shows an object 512 within the image 520. FIG. 5D shows a second image 540 captured by the rear facing camera. The second image 540 shows the same object 512 shown in the first image 520 of the rear facing camera. However, the image is translated downward, indicating an upward direction of device motion.
The motion sensing device can compare the direction of image translation from the two sets of images to determine a device motion. Because the captured images showed a downward image translation for both of the cameras, the motion of the device can be determined to be an upward translation. Alternatively, the motion sensing device can compare the direction of motion and determine a device motion. Because the motion sensing device determined an upward direction of motion from both sets of images, the motion sensing device can determine that the motion was an upward translation.
FIGS. 6A-6D show additional examples of images captured by a front facing camera and a rear facing camera. The images illustrate a twisting translation of the motion sensing device. The twisting translation can also be referred to as a rotation of the motion sensing device. The rotation occurs along an axis of rotation extending in the x direction.
FIG. 6A shows a first image 600 captured by a front facing camera. The first image 600 shows an object 602 within the image 600. FIG. 6B shows a second image 610 captured by the front facing camera. The second image 610 shows the same object 602 shown in the first image 600, but rotated counterclockwise relative to a viewing axis of the camera. This corresponds to a counterclockwise rotation of the motion sensing device, where the direction refers to counterclockwise as viewed from the front of the device. Thus, the motion sensing device can determine a counterclockwise rotation as the direction of motion.
FIG. 6C shows a first image 620 captured by a rear facing camera. The first image 620 from the rear facing camera shows an object 612 within the image 620. FIG. 6D shows a second image 640 captured by the rear facing camera. The second image 640 shows the same object 612 shown in the first image 620 of the rear facing camera. However, the image is rotated clockwise relative to a viewing axis, indicating a counterclockwise rotation of the device as the direction of motion.
As discussed above, the motion sensing device can compare the direction of image translation from the two sets of images to determine a device motion. Because the captured images showed opposite image rotations for the two cameras, the motion of the device can be determined to be rotation. The direction of rotation is the same as the direction of rotation shown by the front camera, which in this example is counterclockwise.
Alternatively, the motion sensing device can compare the direction of motion and determine a device motion. Because the motion sensing device determined a counterclockwise direction of motion from both sets of images, the motion sensing device can determine that the motion was a counterclockwise rotation along the camera viewing axis.
FIGS. 7A-7D show additional examples of images captured by a front facing camera and a rear facing camera. The images illustrate a rotation of the motion sensing device. The rotation occurs along an axis of rotation extending in the z direction.
FIG. 7A shows a first image 700 captured by a front facing camera. The first image 700 shows an object 702 within the image 700. FIG. 7B shows a second image 710 captured by the front facing camera. The second image 710 shows the same object 702 shown in the first image 700, but translated to the right of the image. The translation can correspond to a right side horizontal translation of the motion sensing device. Thus, the motion sensing device can determine that the direction of motion is to the right.
FIG. 7C shows a first image 720 captured by a rear facing camera. The first image 720 from the rear facing camera shows an object 712 within the image 720. FIG. 7D shows a second image 740 captured by the rear facing camera. The second image 740 shows the same object 712 shown in the first image 720 of the rear facing camera. However, the image is horizontally translated to the right. The horizontal translation can indicate the direction of motion is horizontally to the left.
Thus, the objects in the captured images appear to be translated in the same directions, but because the two cameras face in opposite directions, the motion sensing device determines opposite directions of motion. The motion sensing device determines the direction of motion of the front facing camera is right but the direction of motion of the rear facing camera is left. Because the motion sensing device determined that the directions of motion of the two cameras are opposite, the motion sensing device is able to determine that the device rotated about the z axis. The motion sensing device can further determine from the directions of motion that the rotation is counterclockwise when the device is viewed from above.
Alternatively, the motion sensing device may determine the movement of the device based on the direction of image translation. The motion sensing device can compare the two directions of image translation and determine a direction of device motion. Thus, because the image translated right in both the front and rear camera images, the device can determine the movement corresponds to a counterclockwise rotation of the device when viewed from above.
FIGS. 8A-8D show additional examples of images captured by a front facing camera and a rear facing camera. The images illustrate a rotation of the motion sensing device about an axis of rotation along the y axis.
FIG. 8A shows a first image 800 captured by a front facing camera. The first image 800 shows an object 802 within the image 800. FIG. 8B shows a second image 810 captured by the front facing camera. The second image 810 shows the same object 802 shown in the first image 800, but translated upward. Thus, the motion sensing device can determine that a device direction of motion is downward.
FIG. 8C shows a first image 820 captured by a rear facing camera. The first image 820 shows an object 812 within the image 820. FIG. 8D shows a second image 840 captured by the rear facing camera. The second image 840 shows the same object 812 shown in the first image 820 of the rear facing camera. The image is translated downward, indicating an upward direction of device motion.
The opposite direction of image translation, and correspondingly, the opposite device direction of motion can indicate that the device has rotated about an axis of rotation that lies along the y direction. The motion sensing device can use the image translations or the directions of motion to determine that the direction of rotation is counterclockwise as viewed from the right of the device.
A motion sensing device can thus determine a direction of motion based on images captured by a plurality of cameras. Only two cameras are shown in the examples, but more than two cameras can be used in the device. Additionally, two cameras are shown having viewing axes that are in substantially opposite directions. However, the multiple cameras can be positioned along any viewing axes that are not identical or that do not result in completely correlated images.
By comparing the image translations or device direction of motion determined from the images, the motion sensing device can determine if the device has translated or rotated. The motion sensing device can determine horizontal translation along an axis parallel to a viewing axis by comparing, for example, the size of objects in two or more images. A summary of image translations and motion determination is provided in Table 1 for a two camera embodiment having a first camera facing forward and a second camera facing rearward.
TABLE 1

Front Camera          Rear Camera           Device Motion
Image Translation     Image Translation
--------------------  --------------------  ---------------------
Larger                Smaller               +x Translation
Smaller               Larger                −x Translation
Left                  Right                 +y Translation
Right                 Left                  −y Translation
Down                  Down                  +z Translation
Up                    Up                    −z Translation
CCW Rotation          CW Rotation           CCW x-axis Rotation
CW Rotation           CCW Rotation          CW x-axis Rotation
Up                    Down                  CCW y-axis Rotation
Down                  Up                    CW y-axis Rotation
Right                 Right                 CCW z-axis Rotation
Left                  Left                  CW z-axis Rotation
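Table 1 lends itself to a simple lookup. The sketch below encodes the table directly; the direction labels are the same coarse labels used above, and any combination not listed is treated as indeterminate:

DEVICE_MOTION = {
    ("larger", "smaller"): "+x translation",
    ("smaller", "larger"): "-x translation",
    ("left", "right"):     "+y translation",
    ("right", "left"):     "-y translation",
    ("down", "down"):      "+z translation",
    ("up", "up"):          "-z translation",
    ("ccw", "cw"):         "ccw x-axis rotation",
    ("cw", "ccw"):         "cw x-axis rotation",
    ("up", "down"):        "ccw y-axis rotation",
    ("down", "up"):        "cw y-axis rotation",
    ("right", "right"):    "ccw z-axis rotation",
    ("left", "left"):      "cw z-axis rotation",
}

def device_motion(front_translation, rear_translation):
    """Return the device motion for a (front, rear) pair of image translations,
    or None if the combination is not one of the cases in Table 1."""
    return DEVICE_MOTION.get((front_translation, rear_translation))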
Although examples of device translations and rotations were provided for on-axis translations and rotations, any device translation or rotation can be determined based on a combination of on-axis translations and/or rotations. For example, an arbitrary device translation can be determined as a combination of device translations in each axis direction. The vector sum of each of the on-axis translations can equal a translation to any three dimensional location. Similarly, a rotation along any arbitrary axis can be determined as a combination of device rotations along each axis.
For example, a translation of the device along an arbitrary direction can be determined as a vector sum of motion along the x, y, and z axes. As can be seen from Table 1, translation along each of the three axes can be determined exclusive of translation along another axis. Thus, the motion sensing system may, for example, determine translation along the x-axis, followed by the y-axis and the z-axis. The total translation can be determined as the vector sum of each of the translations.
Additionally, rotation along an arbitrary axis can be determined as a sum of rotations along each of the axes. As can be seen from Table 1, the movements of the camera images used to determine the axis of rotation are independent, and thus the individual axis rotations can be determined from an arbitrary rotation.
Although some of the rotations and translations are determined from the same image translations, a rotation can be differentiated from a translation by comparing the magnitude of the different image translations. For example, a y-axis translation and a z-axis rotation both can be determined based on left or right image translations. However, when the cameras are positioned substantially opposite one another, the magnitude of image translation should be approximately the same for images from the two cameras. When the magnitude of image translation from the two cameras differs, the system can determine a composite translation and rotation. The difference in the magnitude of the translations, and the direction of the difference, can be used to determine the component of image translation attributable to either device translation or rotation.
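One way such a decomposition might be carried out, offered as a sketch rather than as the method prescribed here: with opposite-facing cameras, a pure y translation shifts the two images in opposite directions while a pure z-axis rotation shifts them in the same direction (Table 1), so the differential and common components of the two horizontal shifts separate the translation-like and rotation-like contributions. Signs and scaling are illustrative and depend on the image conventions used.

def split_translation_rotation(front_dx, rear_dx):
    """Split the horizontal image shifts from the two cameras into a
    translation-like component and a rotation-like component."""
    translation_component = (front_dx - rear_dx) / 2.0  # opposite-sign shifts combine here
    rotation_component = (front_dx + rear_dx) / 2.0     # same-sign shifts combine here
    return translation_component, rotation_component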
FIG. 9 is a flowchart of an embodiment of a motion sensing process 900 that can be implemented, for example, in the motion sensing device of FIG. 1 or the motion sensing system of FIG. 2. The process 900 can begin at block 902 when the motion sensing device captures a first image from a first camera. As previously discussed, the first camera can be, for example, a forward facing camera.
The motion sensing device next proceeds to block 904 where the device captures a first image from a second camera. The second camera can be, for example, a rear facing camera. The motion sensing device next proceeds to block 912 and captures a second image from the first camera. At block 914, the motion sensing device captures the second image from the second camera. The first and second images from each of the first and second cameras can be, for example, stored in memory or some storage device.
The motion sensing device next proceeds to block 920 and compares the first and second images from the first camera. At block 922, the motion sensing device determines an image translation based on the comparison of the captured images. Alternatively, the motion sensing device can determine a device direction of motion from the first camera images.
The motion sensing device next proceeds to block 924 and determines a distance from the motion sensing device to a user. As discussed earlier, the motion sensing device can determine or estimate the distance to a user by analyzing an image of the user's face to measure a distance between the eyes, and thus derive a distance to the device.
The motion sensing device then proceeds to block 930 and compares the images captured by the second camera. In block 932, the motion sensing device determines an image translation based on the comparison of the captured second camera images. Alternatively, the motion sensing device can determine a device direction of motion from the second camera images.
The motion sensing device next proceeds to block 940 where it compares the image translation determined from the first camera images to the image translation determined from the second camera images. The motion sensing device can compare, for example, the direction of horizontal or vertical translation and the direction of rotation. At block 950, the motion sensing device determines a direction of device motion based in part on the comparison of image translations.
Proceeding to block 950, the motion sensing device can also determine a magnitude of the device motion based in part on the magnitude of image translation or image rotation in each of the first and second camera image pairs. In block 960, the motion sensing device can communicate the motion parameters and one or more images to another device, such as the base device or another motion sensing device as shown in FIG. 2. The motion parameters can include, for example, the direction of motion, including translation and rotation, the magnitude of the motion, and the distance from the user. The images can include some or all of the images captured by the cameras. In one embodiment, the motion sensing device communicates the images captured by the rear facing camera and does not communicate the images from the front facing camera.
The motion sensing device then proceeds to block 980 and can receive display signals from another device, such as the base device or another motion sensing device. The display signals can be signals that are generated based in part on the images and motion parameters communicated by the motion sensing device. In one embodiment, the motion sensing device receives augmented reality display signals from a base device and displays the augmented reality signals on a display.
The various steps or blocks of the process 900 can be performed in another order and do not need to be performed in the order shown in FIG. 9. Additionally, blocks, steps, or functions may be added to the process 900 or deleted from the process 900. Additional blocks, steps, or functions can be added at the beginning of the process, at the end of the process, or in between one or more existing blocks of the process.
Thus a motion sensing device has been disclosed that can determine motion based in part on images captured from a plurality of cameras. The motion sensing device can compare a first set of images from a first camera and determine image translation within the first set of images. The motion sensing device can also compare a second set of images from a second camera and determine image translation within the second set of images. The motion sensing device can then compare the image translation determined from the first set of images to the image translation from the second set of images. The motion sensing device can then determine motion based in part on the comparison.
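Tying the earlier sketches together, an illustrative top-level pass of the FIG. 9 flow might look like the following; the camera objects and their capture_gray() method are hypothetical placeholders, and the helper functions are the sketches given above:

def sense_motion_once(front_camera, rear_camera):
    """Capture two frames per camera, estimate each camera's image shift,
    and look up the resulting device motion from Table 1 (a sketch)."""
    front_first, rear_first = front_camera.capture_gray(), rear_camera.capture_gray()
    front_second, rear_second = front_camera.capture_gray(), rear_camera.capture_gray()

    front_dir, _ = shift_direction(*image_shift(front_first, front_second))
    rear_dir, _ = shift_direction(*image_shift(rear_first, rear_second))

    return device_motion(front_dir, rear_dir)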
The various illustrative logical blocks, modules, and circuits described in connection with the embodiments disclosed herein may be implemented or performed with a general purpose processor, a digital signal processor (DSP), a Reduced Instruction Set Computer (RISC) processor, an application specific integrated circuit (ASIC), a field programmable gate array (FPGA) or other programmable logic device, discrete gate or transistor logic, discrete hardware components, or any combination thereof designed to perform the functions described herein. A general purpose processor may be a microprocessor, but in the alternative, the processor may be any processor, controller, microcontroller, or state machine. A processor may also be implemented as a combination of computing devices, for example, a combination of a DSP and a microprocessor, a plurality of microprocessors, one or more microprocessors in conjunction with a DSP core, or any other such configuration.
The steps of a method, process, or algorithm described in connection with the embodiments disclosed herein may be embodied directly in hardware, in a software module executed by a processor, or in a combination of the two. A software module may reside in RAM memory, flash memory, non-volatile memory, ROM memory, EPROM memory, EEPROM memory, registers, hard disk, a removable disk, a CD-ROM, or any other form of storage medium known in the art. An exemplary storage medium can be coupled to the processor such that the processor can read information from, and write information to, the storage medium. In the alternative, the storage medium may be integral to the processor. The processor and the storage medium may reside in an ASIC.
The above description of the disclosed embodiments is provided to enable any person skilled in the art to make or use the disclosure. Various modifications to these embodiments will be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other embodiments without departing from the scope of the disclosure. Thus, the disclosure is not intended to be limited to the embodiments shown herein but is to be accorded the widest scope consistent with the principles and novel features disclosed herein.

Claims (32)

1. A motion sensing device for use with interactive computer games, the device comprising:
a first camera configured to capture images along a first viewing axis;
a second camera configured to capture images along a second viewing axis different from the first viewing axis;
a motion processing module configured to receive images captured from the first camera, to receive images captured from the second camera, to determine a first direction of change based on the images captured along the first viewing axis, to determine a second direction of change based on the images captured along the second viewing axis, and to determine motion of the device based in part on comparing the first direction of change from the images captured from the first camera and the second direction of change from the images captured from the second camera;
a distance processing module configured to receive at least one image of a user's face from the first or second camera and determine a distance to the user based in part on the at least one image of the user's face; and
a display configured to display images representing an augmented reality environment associated with an interactive computer game based in part on the motion of the device and the distance to the user, the images representing the augmented reality environment including a portion of the images captured from the first camera or a portion of the images captured from the second camera automatically modified according to the interactive computer game in response to the motion of the device and the distance to the user with a virtual image associated with the interactive computer game.
2. The device of claim 1, wherein the display is configured to display the images representing the augmented reality environment associated with the interactive computer game using only the portion of the images captured from the first camera as modified by the motion processing module according to the interactive computer game based in part on the motion of the device.
3. The device of claim 1, wherein the display is configured to display the images representing the augmented reality environment associated with the interactive computer game using only the portion of the images captured from the second camera as modified by the motion processing module according to the interactive computer game based in part on the motion of the device.
4. The device of claim 1, wherein the motion processing module is further configured to communicate data based on motion of the device to a base unit, and to receive information from the base unit based in part on the communicated motion of the device.
5. The device of claim 4, wherein the received information comprises the images representing the augmented reality environment associated with the interactive computer game.
6. The device of claim 1, wherein the first viewing axis and the second viewing axis are in substantially opposite directions.
7. The device of claim 1, wherein the first camera is positioned along a front of the device.
8. The device of claim 7, wherein the second camera is positioned along a rear of the device.
9. The device of claim 1, wherein the motion processing module is configured to determine an image translation of the images captured from the first camera and an image translation of the images captured from the second camera, and to determine the motion of the device based in part on the image translation of the images captured from the first camera and the image translation of the images captured from the second camera; and
wherein the images representing the augmented reality environment associated with the interactive computer game include the portion of the images captured from the first camera or the portion of the images captured from the second camera that have been automatically modified according to the interactive computer game in response to the motion of the device determined based in part on the image translation of the images captured from the first camera and the image translation of the images captured from the second camera.
10. The device of claim 1, wherein the motion processing module is further configured to determine the distance to the user based in part on a user's eye distance determined from the at least one image of the user's face.
11. The device of claim 1, wherein the first camera, second camera, and motion processing module are contained within a portable device.
12. The device of claim 1, wherein the first camera and second camera are contained within a portable device remote from the motion processing module.
13. A motion sensing device for use with interactive computer games, the device comprising:
a first camera having a viewing axis directed away from a front of the device, the first camera configured to capture images along the viewing axis directed away from the front of the device;
a second camera having a viewing axis directed away from a rear of the device, the second camera configured to capture images along the viewing axis directed away from the rear of the device;
a motion processing module configured to receive images captured from the first camera, to receive images captured from the second camera, to determine a first direction of change based on the images captured from the first camera, to determine a second direction of change based on the images captured from the second camera, to determine a distance to a user based in part on at least one image of the user's face in the images captured from the first or second camera, to determine a motion of the device based in part on comparing the first direction of change from the images from the first camera and the second direction of change from the images from the second camera, and to generate a display image representing an augmented reality environment associated with an interactive computer game based in part on the motion of the device and the distance to the user, the display image representing the augmented reality environment including a portion of the images captured from the first camera or a portion of the images captured from the second camera automatically modified by the motion processing module according to the interactive computer game in response to the motion of the device and the distance to the user with a virtual image associated with the interactive computer game; and
a display coupled to the motion processing module and configured to display the display image representing the augmented reality environment associated with the interactive computer game.
14. A motion sensing system for use with interactive computer games, the system comprising:
a first camera positioned along a first viewing axis, the first camera configured to capture a first plurality of camera images along the first viewing axis;
a second camera positioned along a second viewing axis different from the first viewing axis, the second camera configured to capture a second plurality of camera images;
a motion processing module configured to receive the first plurality of camera images, to receive the second plurality of camera images, and to determine a motion parameter based in part on a comparison between a first change in direction indicated by at least two images in the first plurality of camera images and a second change in direction indicated by at least two images in the second plurality of camera images;
a distance processing module configured to receive at least one image of a user's face from a device associated with the first or second camera and determine a distance parameter based in part on the at least one image of the user's face, the distance parameter indicative of a distance to the user;
a base device configured to receive the motion parameter determined by the motion processing module and the distance parameter from the distance processing module, to process the motion parameter and the distance parameter to generate an augmented reality environment associated with an interactive computer game executed by the base device, and to generate a display signal representing the augmented reality environment associated with the interactive computer game based in part on the motion parameter and the distance parameter, the display signal representing the augmented reality environment including a portion of the first plurality of images or a portion of the second plurality of images automatically modified by the base device according to the interactive computer game in response to the motion parameter and the distance parameter with a virtual image associated with the interactive computer game; and
a display configured to display the display signal generated by the base device representing the augmented reality environment associated with the interactive computer game.
15. The system of claim 14 wherein the base device comprises:
a storage device configured to store one or more virtual images; and
wherein the base device is further configured to generate the display signal that represents the augmented reality environment associated with the interactive computer game by automatically modifying at least one of the one or more virtual images when retrieved from the storage device according to the interactive computer game based in part on the motion parameter.
16. The system of claim 14 wherein the motion processing module is further configured to communicate at least one of the second plurality of camera images to the base device, and wherein the base device generates the display signal representing the augmented reality environment associated with the interactive computer game based on only the at least one of the second plurality of camera images.
17. The system of claim 14 wherein the motion parameter comprises at least one parameter selected from the group comprising a device translation and a device rotation.
18. The system of claim 14 wherein the distance processing module is further configured to estimate the distance to the user based in part on a user's eye distance determined from the image of the user's face.
19. A motion sensing method, the method comprising:
receiving a first plurality of images at a processor associated with a computer from a first camera associated with a device;
receiving a second plurality of images at the processor from a second camera associated with the device, wherein the first or second plurality of images includes an image of a user's face;
determining a first image translation with the processor based in part on the first plurality of images from the first camera;
determining a second image translation with the processor based in part on the second plurality of images from the second camera;
comparing the first image translation to the second image translation with the processor;
determining a distance to the user based in part on the image of the user's face;
determining motion of the device with the processor based in part on the comparison; and
generating images that represent an augmented reality environment associated with an interactive computer game with the processor based in part on the motion of the device and the distance to the user, the images representing the augmented reality environment including a portion of the first plurality of images or a portion of the second plurality of images automatically modified by the processor according to the interactive computer game in response to the motion of the device and the distance to the user with a virtual image associated with the interactive computer game.
20. The method of claim 19, further comprising:
displaying the images representing the augmented reality environment associated with the interactive game on the device.
21. The method of claim 19, further comprising:
determining with the processor a distance from a user's eyes from the image of the user's face; and
estimating with the processor a distance between the user and the device based in part on the distance from the user's eyes.
22. The method of claim 19, wherein receiving the second plurality of images at the processor from the second camera comprises receiving images at the processor from the second camera positioned substantially opposite the first camera.
23. A motion sensing method, the method comprising:
receiving at a base device a motion parameter determined from a comparison between a first change of direction determined from images captured along a first viewing axis and a second change in direction determined from images captured along a second viewing axis substantially opposite the first viewing axis;
receiving at the base device a distance parameter determined from an image in the images captured along the first viewing axis or the images captured along the second viewing axis of a user's face, the distance parameter indicative of a distance to the user;
retrieving a first virtual image at the base device from a plurality of stored images using a storage device associated with the base device;
automatically modifying the first virtual image at the base device according to an interactive computer game based at least in part on the motion parameter and the distance parameter to create a modified virtual image that forms part of an augmented reality environment associated with the interactive computer game; and
communicating the modified virtual image to the portable device.
24. The method of claim 23, further comprising:
receiving at least one image captured along the first viewing axis at the base device; and
modifying the first virtual image at the base device based in part on the at least one image captured along the first viewing axis.
25. The method of claim 24, further comprising:
receiving at least one image captured along the second viewing axis at the base device; and
modifying the first virtual image at the base device based in part on the at least one image captured along the second viewing axis.
26. A computer readable storage medium having stored thereon instructions configured to cause determination of a device motion, the instructions comprising:
program code for receiving a first plurality of images from a first camera associated with a device;
program code for receiving a second plurality of images from a second camera associated with a device, wherein the first or second plurality of images includes an image of a user's face;
program code for determining a first image translation based in part on the first plurality of images from the first camera;
program code for determining a second image translation based in part on the second plurality of images from the second camera;
program code for comparing the first image translation to the second image translation;
program code for determining a distance to the user based in part on the image of the user's face;
program code for determining motion of the device based in part on the comparison; and
program code for generating images that represent an augmented reality environment associated with an interactive computer game based in part on the motion of the device and the distance to the user, the images representing the augmented reality environment including a portion of the first plurality of images or a portion of the second plurality of images automatically modified according to the interactive computer game in response to the motion of the device and the distance to the user with a virtual image associated with the interactive computer game.
27. A computer readable storage medium having stored thereon instructions configured to cause communication of a modified image to a portable device, the instructions comprising:
program code for receiving a motion parameter determined from images captured along a first viewing axis and images captured along a second viewing axis substantially opposite the first viewing axis;
program code for receiving a distance parameter determined from an image in the images captured along the first viewing axis or the images captured along the second viewing axis of a user's face, the distance parameter indicative of a distance to the user;
program code for retrieving a first virtual image from a plurality of stored images;
program code for automatically modifying the first virtual image according to an interactive computer game based at least in part on the motion parameter and the distance parameter to create a modified virtual image that forms part of an augmented reality environment associated with the interactive computer game; and
program code for communicating the modified virtual image to a portable device.
28. A motion sensing system for use with interactive computer games, the system comprising:
a first camera positioned on a first side of a game controller along a first viewing axis, the first camera configured to capture a first plurality of camera images along the first viewing axis;
a second camera positioned on the game controller along a second viewing axis substantially opposite the first viewing axis, the second camera configured to capture a second plurality of camera images along the second viewing axis;
a motion processing module in communication with the first and second cameras and configured to receive the first plurality of camera images, to receive the second plurality of camera images, and to determine a motion parameter based in part on a comparison between a first change in direction indicated by at least two images from the first plurality of camera images and a second change in direction indicated by at least two images from the second plurality of camera images;
a distance processing module configured to receive at least one image of a user's face from the first or second camera and to determine, based in part on the at least one image, a distance parameter indicative of a distance to the user;
a processor configured to retrieve a virtual image from a storage device and to automatically modify the virtual image based in part on the motion parameter and the distance parameter, the virtual image forming part of an augmented reality environment associated with an interactive computer game executed by the processor, the augmented reality environment including a portion of the first plurality of camera images or a portion of the second plurality of camera images automatically modified according to the interactive computer game in response to the motion of the game controller and the distance to the user, using the virtual image; and
a display positioned on the game controller and configured to display images representing the augmented reality environment generated by the processor.
29. The system of claim 28, wherein the processor and storage device are included in a game console that is external to the game controller.
30. The system of claim 28, wherein the processor and storage device are included in the game controller.
31. The system of claim 28, wherein the storage device comprises a storage medium having game images stored therein.
32. The system of claim 28, wherein the game controller comprises at least one input device configured to receive user input.
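
Finally, a speculative per-frame loop showing one way the motion processing module, distance processing module, processor, and display of claims 28-32 could be wired together on a game controller; the camera, storage, display, and game objects are placeholders, and every helper passed in (for example, the estimators sketched after claim 26) is hypothetical.

    def run_frame(front_camera, back_camera, storage, display, game,
                  estimate_shift, estimate_distance, modify_image,
                  prev_front, prev_back):
        # One controller iteration: capture from both cameras, derive the motion
        # parameter from the two image shifts, estimate the user's distance from the
        # front view, modify the stored virtual image, and show the augmented scene.
        front = front_camera.capture()
        back = back_camera.capture()

        motion_param = {
            "front": estimate_shift(prev_front, front),
            "back": estimate_shift(prev_back, back),
        }
        distance_param = estimate_distance(front)      # user's face assumed in front view

        virtual = storage.load(game.current_virtual_image_id())
        overlay = modify_image(virtual, motion_param, distance_param)
        display.show(game.compose(back, overlay))      # live camera view plus overlay

        return front, back                             # previous frames for the next call
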
US10/861,582 2004-06-04 2004-06-04 Motion sensor using dual camera inputs Active 2027-12-04 US7671916B2 (en)

Priority Applications (5)

Application Number Priority Date Filing Date Title
US10/861,582 US7671916B2 (en) 2004-06-04 2004-06-04 Motion sensor using dual camera inputs
JP2007515077A JP2008502206A (en) 2004-06-04 2005-04-15 Sensor with dual camera input
GB0624588A GB2430042B (en) 2004-06-04 2005-04-15 Motion sensor using dual camera inputs
PCT/US2005/012650 WO2005122582A2 (en) 2004-06-04 2005-04-15 Motion sensor using dual camera inputs
TW094113107A TWI282435B (en) 2004-06-04 2005-04-25 Motion sensor using dual camera inputs

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
US10/861,582 US7671916B2 (en) 2004-06-04 2004-06-04 Motion sensor using dual camera inputs

Publications (2)

Publication Number Publication Date
US20050270368A1 (en) 2005-12-08
US7671916B2 (en) 2010-03-02

Family

ID=35448422

Family Applications (1)

Application Number Title Priority Date Filing Date
US10/861,582 Active 2027-12-04 US7671916B2 (en) 2004-06-04 2004-06-04 Motion sensor using dual camera inputs

Country Status (5)

Country Link
US (1) US7671916B2 (en)
JP (1) JP2008502206A (en)
GB (1) GB2430042B (en)
TW (1) TWI282435B (en)
WO (1) WO2005122582A2 (en)

Cited By (27)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20060038833A1 (en) * 2004-08-19 2006-02-23 Mallinson Dominic S Portable augmented reality device and method
US20090224999A1 (en) * 2007-08-29 2009-09-10 Nintendo Co. Ltd. Imaging apparatus
US20090278764A1 (en) * 2007-08-29 2009-11-12 Nintendo Co., Ltd. Imaging apparatus
US20090327950A1 (en) * 2008-06-26 2009-12-31 Chi Mei Communication Systems, Inc. System and method for scrolling through an electronic document in a mobile device
US20100040257A1 (en) * 2003-06-02 2010-02-18 Fujifilm Corporation Image displaying system and apparatus for displaying images by changing the displayed images based on direction or direction changes of a displaying unit
US20100138797A1 (en) * 2008-12-01 2010-06-03 Sony Ericsson Mobile Communications Ab Portable electronic device with split vision content sharing control and method
US20110043598A1 (en) * 2009-08-20 2011-02-24 Oki Electric Industry Co., Ltd. Remote communication apparatus and method of estimating a distance between an imaging device and a user image-captured
US20110234857A1 (en) * 2008-06-13 2011-09-29 Nintendo Co., Ltd. Information processing apparatus and computer-readable storage medium recording information processing program
US20120105477A1 (en) * 2010-11-01 2012-05-03 Samsung Electronics Co., Ltd. Apparatus and method for displaying data in portable terminal
US20120172118A1 (en) * 2011-01-05 2012-07-05 Nintendo Co., Ltd. Game apparatus, information processing apparatus, storage medium having game program or information processing program stored therein, game system, delay measurement system, image display method, audio output method, and delay measurement method
US20140096084A1 (en) * 2012-09-28 2014-04-03 Samsung Electronics Co., Ltd. Apparatus and method for controlling user interface to select object within image and image input device
US8717294B2 (en) * 2010-03-05 2014-05-06 Sony Computer Entertainment America Llc Calibration of portable devices in a shared virtual space
US20140355895A1 (en) * 2013-05-31 2014-12-04 Lidong Xu Adaptive motion instability detection in video
US8957973B2 (en) 2012-06-11 2015-02-17 Omnivision Technologies, Inc. Shutter release using secondary camera
US8998712B2 (en) 2012-10-18 2015-04-07 Nintendo Co., Ltd. Game system, game apparatus, non-transitory computer-readable storage medium having game program stored thereon, and game processing control method
US9013550B2 (en) 2010-09-09 2015-04-21 Qualcomm Incorporated Online reference generation and tracking for multi-user augmented reality
US20150189149A1 (en) * 2013-12-27 2015-07-02 Panasonic Corporation Communication method
US9135026B2 (en) 2008-06-13 2015-09-15 Nintendo Co., Ltd. Information-processing apparatus having photography applications
US9250703B2 (en) 2006-03-06 2016-02-02 Sony Computer Entertainment Inc. Interface with gaze detection and voice input
US9268406B2 (en) 2011-09-30 2016-02-23 Microsoft Technology Licensing, Llc Virtual spectator experience with a personal audio/visual apparatus
US20160059128A1 (en) * 2014-08-28 2016-03-03 Nintendo Co., Ltd. Information processing terminal, non-transitory storage medium encoded with computer readable information processing program, information processing terminal system, and information processing method
US9286711B2 (en) 2011-09-30 2016-03-15 Microsoft Technology Licensing, Llc Representing a location at a previous time period using an augmented reality display
US9606992B2 (en) 2011-09-30 2017-03-28 Microsoft Technology Licensing, Llc Personal audio/visual apparatus providing resource management
US9630099B2 (en) 2008-10-01 2017-04-25 Nintendo Co., Ltd. Information processing device, information processing system, and launch program and storage medium storing the same providing photographing functionality
US10120438B2 (en) 2011-05-25 2018-11-06 Sony Interactive Entertainment Inc. Eye gaze to alter device behavior
US10565726B2 (en) 2017-07-03 2020-02-18 Qualcomm Incorporated Pose estimation using multiple cameras
US11120627B2 (en) * 2012-08-30 2021-09-14 Atheer, Inc. Content association and history tracking in virtual and augmented realities

Families Citing this family (37)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7682245B2 (en) 2000-02-29 2010-03-23 Igt Name your prize game playing methodology
FI20045300A (en) * 2004-08-17 2006-02-18 Nokia Corp Electronic device and procedure for controlling the functions of the electronic device and software product for implementing the procedure
US7667686B2 (en) * 2006-02-01 2010-02-23 Memsic, Inc. Air-writing and motion sensing input for portable devices
US8149277B2 (en) * 2006-07-13 2012-04-03 Nikon Corporation Display control device, display system, and television set
US20090017910A1 (en) * 2007-06-22 2009-01-15 Broadcom Corporation Position and motion tracking of an object
US8396321B1 (en) * 2007-04-25 2013-03-12 Marvell International Ltd. Method and apparatus for processing image data from a primary sensor and a secondary sensor
WO2010001756A1 (en) * 2008-06-30 2010-01-07 株式会社ソニー・コンピュータエンタテインメント Portable type game device and method for controlling portable type game device
CN101579571B (en) * 2009-04-30 2012-09-26 武汉市高德电气有限公司 Live-action game device and method for realizing live-action game
JP5898842B2 (en) 2010-01-14 2016-04-06 任天堂株式会社 Portable information processing device, portable game device
EP2355526A3 (en) 2010-01-14 2012-10-31 Nintendo Co., Ltd. Computer-readable storage medium having stored therein display control program, display control apparatus, display control system, and display control method
JP5800501B2 (en) 2010-03-12 2015-10-28 任天堂株式会社 Display control program, display control apparatus, display control system, and display control method
KR20110116525A (en) * 2010-04-19 2011-10-26 엘지전자 주식회사 Image display device and operating method for the same
US8633947B2 (en) 2010-06-02 2014-01-21 Nintendo Co., Ltd. Computer-readable storage medium having stored therein information processing program, information processing apparatus, information processing system, and information processing method
US8384770B2 (en) * 2010-06-02 2013-02-26 Nintendo Co., Ltd. Image display system, image display apparatus, and image display method
EP2395769B1 (en) 2010-06-11 2015-03-04 Nintendo Co., Ltd. Image display program, image display system, and image display method
JP5647819B2 (en) * 2010-06-11 2015-01-07 任天堂株式会社 Portable electronic devices
JP5739674B2 (en) 2010-09-27 2015-06-24 任天堂株式会社 Information processing program, information processing apparatus, information processing system, and information processing method
US8854356B2 (en) 2010-09-28 2014-10-07 Nintendo Co., Ltd. Storage medium having stored therein image processing program, image processing apparatus, image processing system, and image processing method
TWI419031B (en) * 2010-10-20 2013-12-11 Sonix Technology Co Ltd Optical touch module and loading data method thereof
US9507416B2 (en) 2011-02-22 2016-11-29 Robert Howard Kimball Providing a corrected view based on the position of a user with respect to a mobile platform
JP5890969B2 (en) 2011-06-03 2016-03-22 任天堂株式会社 Information processing program, information processing apparatus, information processing system, and information processing method
KR101880998B1 (en) * 2011-10-14 2018-07-24 삼성전자주식회사 Apparatus and Method for motion recognition with event base vision sensor
US11232626B2 (en) 2011-12-21 2022-01-25 Twenieth Century Fox Film Corporation System, method and apparatus for media pre-visualization
US9799136B2 (en) * 2011-12-21 2017-10-24 Twentieth Century Fox Film Corporation System, method and apparatus for rapid film pre-visualization
US9261878B2 (en) 2011-12-29 2016-02-16 Intel Corporation Electronic device having a motion detector
US9986208B2 (en) * 2012-01-27 2018-05-29 Qualcomm Incorporated System and method for determining location of a device using opposing cameras
JP5966624B2 (en) * 2012-05-29 2016-08-10 株式会社リコー Information processing apparatus and information display system
US9720508B2 (en) 2012-08-30 2017-08-01 Google Technology Holdings LLC System for controlling a plurality of cameras in a device
FR3000349B1 (en) * 2012-12-21 2015-02-27 Thales Sa METHOD FOR MOTION COMPENSATION ON SEVERAL IMAGE SEQUENCES
CN103900477A (en) * 2012-12-28 2014-07-02 北京三星通信技术研究有限公司 Method for judging relative displacement of handheld device, and handheld device
JP6219037B2 (en) * 2013-02-06 2017-10-25 任天堂株式会社 Information processing program, information processing apparatus, information processing system, and information processing method
KR102146641B1 (en) * 2013-04-08 2020-08-21 스냅 아이엔씨 Distance estimation using multi-camera device
CN103425402A (en) * 2013-08-28 2013-12-04 紫光股份有限公司 Detection method and device for mobile terminal posture
US20150138314A1 (en) * 2013-11-20 2015-05-21 Google Inc. Generating Panoramic Images
US9851805B2 (en) * 2014-12-24 2017-12-26 Immersion Corporation Systems and methods for haptically-enabled holders
US11763209B1 (en) * 2019-03-06 2023-09-19 American Airlines, Inc. Virtual measurement system for baggage management
KR20210007697A (en) 2019-07-12 2021-01-20 삼성전자주식회사 Image sensor and electronic device comprising the image sensor

Patent Citations (23)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4969036A (en) 1989-03-31 1990-11-06 Bir Bhanu System for computing the self-motion of moving images devices
US5473364A (en) 1994-06-03 1995-12-05 David Sarnoff Research Center, Inc. Video technique for indicating moving objects from a movable platform
US6522787B1 (en) 1995-07-10 2003-02-18 Sarnoff Corporation Method and system for rendering and combining images to form a synthesized view of a scene containing image information from a second image
US6233003B1 (en) 1996-07-22 2001-05-15 Fuji Photo Film Co., Ltd. Parallax image input apparatus
US20010017651A1 (en) 1996-12-11 2001-08-30 Baker Henry H. Moving imager camera for track and range capture
US6317151B1 (en) 1997-07-10 2001-11-13 Mitsubishi Denki Kabushiki Kaisha Image reproducing method and image generating and reproducing method
US6487421B2 (en) 1997-09-16 2002-11-26 Nokia Mobile Phones Limited Method for inputting information to a mobile radiotelephone
US6249285B1 (en) 1998-04-06 2001-06-19 Synapix, Inc. Computer assisted mark-up and parameterization for scene analysis
US6430304B2 (en) 1998-08-28 2002-08-06 Sarnoff Corporation Method and apparatus for processing images to compute image flow information
US6665003B1 (en) 1998-09-17 2003-12-16 Yissum Research Development Company Of The Hebrew University Of Jerusalem System and method for generating and displaying panoramic images and movies
US20010039212A1 (en) * 2000-04-25 2001-11-08 Takao Sawano Portable game machine with download capability
US20020024635A1 (en) 2000-05-09 2002-02-28 Jon Oshima Multiplexed motion picture camera
US20030118217A1 (en) * 2000-08-09 2003-06-26 Kenji Kondo Eye position detection method and device
US20020167537A1 (en) 2001-05-11 2002-11-14 Miroslav Trajkovic Motion-based tracking with pan-tilt-zoom camera
US20030048947A1 (en) 2001-09-07 2003-03-13 Grindstaff Gene Arthur Method, device and computer program product for demultiplexing of video images
US20030156304A1 (en) * 2002-02-19 2003-08-21 Eastman Kodak Company Method for providing affective information in an imaging system
US20050140565A1 (en) 2002-02-20 2005-06-30 Rainer Krombach Mobile telephone comprising wraparound display
US20030215010A1 (en) * 2002-03-14 2003-11-20 Kotaro Kashiwa Image pickup apparatus and method, signal processing apparatus and method, and wearable signal processing apparatus
US20030210461A1 (en) 2002-03-15 2003-11-13 Koji Ashizaki Image processing apparatus and method, printed matter production apparatus and method, and printed matter production system
US20060146009A1 (en) * 2003-01-22 2006-07-06 Hanno Syrbe Image control
US20040212586A1 (en) 2003-04-25 2004-10-28 Denny Trueman H. Multi-function pointing device
US20050212766A1 (en) 2004-03-23 2005-09-29 Reinhardt Albert H M Translation controlled cursor
US20050212757A1 (en) * 2004-03-23 2005-09-29 Marvit David L Distinguishing tilt and translation motion components in handheld devices

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Wagner, Daniel. Augmented Reality Kanji Learning. 2003. IEEE Computer Society. Proceedings of the 2nd IEEE/ACM International Symposium on Mixed and Augmented Reality, pp. 335-336. *
Wagner, Daniel. First Steps Towards Handheld Augmented Reality. 2003. IEEE Computer Society. Proceedings of the 7th IEEE International Symposium on Wearable Computers, pp. 127-135. *

Cited By (57)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20100040257A1 (en) * 2003-06-02 2010-02-18 Fujifilm Corporation Image displaying system and apparatus for displaying images by changing the displayed images based on direction or direction changes of a displaying unit
US8184156B2 (en) * 2003-06-02 2012-05-22 Fujifilm Corporation Image displaying system and apparatus for displaying images by changing the displayed images based on direction or direction changes of a displaying unit
US8547401B2 (en) * 2004-08-19 2013-10-01 Sony Computer Entertainment Inc. Portable augmented reality device and method
US20060038833A1 (en) * 2004-08-19 2006-02-23 Mallinson Dominic S Portable augmented reality device and method
US9250703B2 (en) 2006-03-06 2016-02-02 Sony Computer Entertainment Inc. Interface with gaze detection and voice input
US9264694B2 (en) 2007-08-29 2016-02-16 Nintendo Co., Ltd. Hand-held imaging apparatus and storage medium storing program
US20090278974A1 (en) * 2007-08-29 2009-11-12 Nintendo Co., Ltd. Hand-held imaging apparatus and storage medium storing program
US7817142B2 (en) * 2007-08-29 2010-10-19 Nintendo Co., Ltd. Imaging apparatus
US9325967B2 (en) 2007-08-29 2016-04-26 Nintendo Co., Ltd. Imaging apparatus
US9344706B2 (en) 2007-08-29 2016-05-17 Nintendo Co., Ltd. Camera device
US9894344B2 (en) 2007-08-29 2018-02-13 Nintendo Co., Ltd. Camera device
US20090278764A1 (en) * 2007-08-29 2009-11-12 Nintendo Co., Ltd. Imaging apparatus
US20090224999A1 (en) * 2007-08-29 2009-09-10 Nintendo Co. Ltd. Imaging apparatus
US10437424B2 (en) 2008-06-13 2019-10-08 Nintendo Co., Ltd. Information processing apparatus and computer-readable storage medium recording information processing program
US9135026B2 (en) 2008-06-13 2015-09-15 Nintendo Co., Ltd. Information-processing apparatus having photography applications
US10509538B2 (en) 2008-06-13 2019-12-17 Nintendo Co., Ltd. Information processing apparatus having a photographing-enabled state
US20110234857A1 (en) * 2008-06-13 2011-09-29 Nintendo Co., Ltd. Information processing apparatus and computer-readable storage medium recording information processing program
US8913172B2 (en) 2008-06-13 2014-12-16 Nintendo Co., Ltd. Information processing apparatus and computer-readable storage medium recording information processing program
US9256449B2 (en) 2008-06-13 2016-02-09 Nintendo Co., Ltd. Menu screen for information processing apparatus and computer-readable storage medium recording information processing program
US20090327950A1 (en) * 2008-06-26 2009-12-31 Chi Mei Communication Systems, Inc. System and method for scrolling through an electronic document in a mobile device
US10124247B2 (en) 2008-10-01 2018-11-13 Nintendo Co., Ltd. System and device for communicating images
US9630099B2 (en) 2008-10-01 2017-04-25 Nintendo Co., Ltd. Information processing device, information processing system, and launch program and storage medium storing the same providing photographing functionality
US10525334B2 (en) 2008-10-01 2020-01-07 Nintendo Co., Ltd. System and device for communicating images
US20100138797A1 (en) * 2008-12-01 2010-06-03 Sony Ericsson Mobile Communications Ab Portable electronic device with split vision content sharing control and method
US8525870B2 (en) * 2009-08-20 2013-09-03 Oki Electric Industry Co., Ltd. Remote communication apparatus and method of estimating a distance between an imaging device and a user image-captured
US20110043598A1 (en) * 2009-08-20 2011-02-24 Oki Electric Industry Co., Ltd. Remote communication apparatus and method of estimating a distance between an imaging device and a user image-captured
US9513700B2 (en) 2009-12-24 2016-12-06 Sony Interactive Entertainment America Llc Calibration of portable devices in a shared virtual space
US9310883B2 (en) 2010-03-05 2016-04-12 Sony Computer Entertainment America Llc Maintaining multiple views on a shared stable virtual space
US8717294B2 (en) * 2010-03-05 2014-05-06 Sony Computer Entertainment America Llc Calibration of portable devices in a shared virtual space
US9013550B2 (en) 2010-09-09 2015-04-21 Qualcomm Incorporated Online reference generation and tracking for multi-user augmented reality
US9558557B2 (en) 2010-09-09 2017-01-31 Qualcomm Incorporated Online reference generation and tracking for multi-user augmented reality
US9245469B2 (en) * 2010-11-01 2016-01-26 Samsung Electronics Co., Ltd. Apparatus and method for displaying data in portable terminal
US10102786B2 (en) * 2010-11-01 2018-10-16 Samsung Electronics Co., Ltd. Apparatus and method for displaying data in portable terminal
US20150379779A1 (en) * 2010-11-01 2015-12-31 Samsung Electronics Co., Ltd. Apparatus and method for displaying data in portable terminal
US20120105477A1 (en) * 2010-11-01 2012-05-03 Samsung Electronics Co., Ltd. Apparatus and method for displaying data in portable terminal
US20120172118A1 (en) * 2011-01-05 2012-07-05 Nintendo Co., Ltd. Game apparatus, information processing apparatus, storage medium having game program or information processing program stored therein, game system, delay measurement system, image display method, audio output method, and delay measurement method
US8715074B2 (en) * 2011-01-05 2014-05-06 Nintendo Co., Ltd. Game apparatus, information processing apparatus, storage medium having game program or information processing program stored therein, game system, delay measurement system, image display method, audio output method, and delay measurement method
US10120438B2 (en) 2011-05-25 2018-11-06 Sony Interactive Entertainment Inc. Eye gaze to alter device behavior
US9286711B2 (en) 2011-09-30 2016-03-15 Microsoft Technology Licensing, Llc Representing a location at a previous time period using an augmented reality display
US9606992B2 (en) 2011-09-30 2017-03-28 Microsoft Technology Licensing, Llc Personal audio/visual apparatus providing resource management
US9268406B2 (en) 2011-09-30 2016-02-23 Microsoft Technology Licensing, Llc Virtual spectator experience with a personal audio/visual apparatus
US8957973B2 (en) 2012-06-11 2015-02-17 Omnivision Technologies, Inc. Shutter release using secondary camera
US9313392B2 (en) 2012-06-11 2016-04-12 Omnivision Technologies, Inc. Shutter release using secondary camera
US11763530B2 (en) * 2012-08-30 2023-09-19 West Texas Technology Partners, Llc Content association and history tracking in virtual and augmented realities
US11120627B2 (en) * 2012-08-30 2021-09-14 Atheer, Inc. Content association and history tracking in virtual and augmented realities
US20220058881A1 (en) * 2012-08-30 2022-02-24 Atheer, Inc. Content association and history tracking in virtual and augmented realities
US20140096084A1 (en) * 2012-09-28 2014-04-03 Samsung Electronics Co., Ltd. Apparatus and method for controlling user interface to select object within image and image input device
US10101874B2 (en) * 2012-09-28 2018-10-16 Samsung Electronics Co., Ltd Apparatus and method for controlling user interface to select object within image and image input device
US8998712B2 (en) 2012-10-18 2015-04-07 Nintendo Co., Ltd. Game system, game apparatus, non-transitory computer-readable storage medium having game program stored thereon, and game processing control method
US20140355895A1 (en) * 2013-05-31 2014-12-04 Lidong Xu Adaptive motion instability detection in video
US9336460B2 (en) * 2013-05-31 2016-05-10 Intel Corporation Adaptive motion instability detection in video
US9413460B2 (en) 2013-12-27 2016-08-09 Panasonic Intellectual Property Corporation Of America Communication method
US9294666B2 (en) * 2013-12-27 2016-03-22 Panasonic Intellectual Property Corporation Of America Communication method
US20150189149A1 (en) * 2013-12-27 2015-07-02 Panasonic Corporation Communication method
US10004990B2 (en) * 2014-08-28 2018-06-26 Nintendo Co., Ltd. Information processing terminal, non-transitory storage medium encoded with computer readable information processing program, information processing terminal system, and information processing method
US20160059128A1 (en) * 2014-08-28 2016-03-03 Nintendo Co., Ltd. Information processing terminal, non-transitory storage medium encoded with computer readable information processing program, information processing terminal system, and information processing method
US10565726B2 (en) 2017-07-03 2020-02-18 Qualcomm Incorporated Pose estimation using multiple cameras

Also Published As

Publication number Publication date
WO2005122582A2 (en) 2005-12-22
GB2430042A (en) 2007-03-14
TWI282435B (en) 2007-06-11
GB0624588D0 (en) 2007-01-24
US20050270368A1 (en) 2005-12-08
GB2430042B (en) 2009-01-14
TW200540458A (en) 2005-12-16
JP2008502206A (en) 2008-01-24
WO2005122582A3 (en) 2007-01-18

Similar Documents

Publication Publication Date Title
US7671916B2 (en) Motion sensor using dual camera inputs
TWI544447B (en) System and method for augmented reality
US6198485B1 (en) Method and apparatus for three-dimensional input entry
US8854356B2 (en) Storage medium having stored therein image processing program, image processing apparatus, image processing system, and image processing method
JP5791433B2 (en) Information processing program, information processing system, information processing apparatus, and information processing method
EP2601640B1 (en) Three dimensional user interface effects on a display by using properties of motion
US8310537B2 (en) Detecting ego-motion on a mobile device displaying three-dimensional content
JP5586545B2 (en) GAME SYSTEM, PORTABLE GAME DEVICE, INFORMATION PROCESSOR CONTROL METHOD, AND INFORMATION PROCESSOR CONTROL PROGRAM
Tomioka et al. Approximated user-perspective rendering in tablet-based augmented reality
WO2016095057A1 (en) Peripheral tracking for an augmented reality head mounted device
JP6242039B2 (en) Apparatus and method for gyro controlled game viewpoint with automatic centering function
JP6021296B2 (en) Display control program, display control device, display control system, and display control method
TWI701941B (en) Method, apparatus and electronic device for image processing and storage medium thereof
JP5791434B2 (en) Information processing program, information processing system, information processing apparatus, and information processing method
US20150009119A1 (en) Built-in design of camera system for imaging and gesture processing applications
US8555205B2 (en) System and method utilized for human and machine interface
JP2013050883A (en) Information processing program, information processing system, information processor, and information processing method
EP2557482A2 (en) Input device, system and method
TWI825982B (en) Method for providing visual content, host, and computer readable storage medium
US10345595B2 (en) Head mounted device with eye tracking and control method thereof
US20200184675A1 (en) Positioning Method and Reality Presenting Device
US20170083105A1 (en) Electronic apparatuses and methods for providing a man-machine interface (mmi)
JP5739672B2 (en) Image display program, apparatus, system and method
WO2018163340A1 (en) Moving-image processing device, moving-image processing method, moving-image processing program and moving-image processing display system
GB2493646A (en) Stereoscopic mapping as input for an entertainment device

Legal Events

Date Code Title Description
AS Assignment

Owner name: ELECTRONIC ARTS INC., CALIFORNIA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:HASHIMOTO, KAZUYUKI;REEL/FRAME:015443/0667

Effective date: 20040402

STCF Information on status: patent grant

Free format text: PATENTED CASE

FPAY Fee payment

Year of fee payment: 4

MAFP Maintenance fee payment

Free format text: PAYMENT OF MAINTENANCE FEE, 8TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1552)

Year of fee payment: 8

MAFP Maintenance fee payment

Free format text: PAYMENT OF MAINTENANCE FEE, 12TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1553); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

Year of fee payment: 12