US20060164423A1 - Interactive device capable of transmitting parameters of image objects - Google Patents

Interactive device capable of transmitting parameters of image objects

Info

Publication number
US20060164423A1
Authority
US
United States
Prior art keywords
image object
image
interactive device
parameters
interface
Prior art date
Legal status
Abandoned
Application number
US11/279,257
Inventor
Hsuan-Hsien Lee
Chin-Hsin Yang
Tzu-Yi Chao
Current Assignee
Pixart Imaging Inc
Original Assignee
Pixart Imaging Inc
Priority date
Filing date
Publication date
Application filed by Pixart Imaging Inc
Priority to US11/279,257
Assigned to PIXART IMAGING INC. Assignors: CHAO, TZU-YI; LEE, HSUAN-HSIEN; YANG, CHIN-HSIN
Publication of US20060164423A1
Status: Abandoned

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N25/00: Circuitry of solid-state image sensors [SSIS]; Control thereof
    • H04N25/70: SSIS architectures; Circuits associated therewith
    • H04N25/76: Addressed sensors, e.g. MOS or CMOS sensors
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N3/00: Scanning details of television systems; Combination thereof with generation of supply voltages
    • H04N3/10: Scanning details of television systems; Combination thereof with generation of supply voltages by means not exclusively optical-mechanical
    • H04N3/14: Scanning details of television systems; Combination thereof with generation of supply voltages by means not exclusively optical-mechanical by means of electrically scanned solid-state devices
    • H04N3/15: Scanning details of television systems; Combination thereof with generation of supply voltages by means not exclusively optical-mechanical by means of electrically scanned solid-state devices for picture signal generation
    • H04N3/155: Control of the image-sensor operation, e.g. image processing within the image-sensor
    • H04N3/1562: Control of the image-sensor operation, e.g. image processing within the image-sensor for selective scanning, e.g. windowing, zooming

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Image Processing (AREA)
  • Transforming Light Signals Into Electric Signals (AREA)
  • Image Analysis (AREA)

Abstract

An interactive device includes an image sensor for generating a plurality of pixel signals corresponding to an image, and a processor for determining a static parameter of at least one image object of the image based on the plurality of pixel signals. A transmission interface is used for outputting a control signal based on the static parameter determined by the processor.

Description

    CROSS REFERENCE TO RELATED APPLICATIONS
  • This is a continuation-in-part of application Ser. No. 10/904,301, filed Nov. 3, 2004, which is incorporated herein by reference in its entirety.
  • BACKGROUND OF THE INVENTION
  • 1. Field of the Invention
  • The present invention relates to an interactive device, and more specifically, to an interactive device capable of transmitting parameters of image objects.
  • 2. Description of the Prior Art
  • In conventional interactive devices, image sensors are used to capture human motions as controlling instructions. Take electronic pets for example: the built-in image sensor installed inside an electronic pet functions as an “eye” of the interactive toy, capturing pictures of human motions. The captured and digitized pictures are then transmitted to a subsequent device that identifies the controlling instructions. Finally, the electronic pet acts according to the identified instructions.
  • Please refer to FIG. 1. FIG. 1 is a functional block diagram of an interactive device 10 according to the prior art. The interactive device 10 includes an image sensor 12, a micro-controller 14, and a parallel transmission bus 16. The image sensor 12 contains a CMOS sensing array 22 and an analog to digital converter (ADC) 24. Data sensed by the CMOS sensing array 22 is transmitted to the analog to digital converter 24. Because the CMOS sensing array 22 senses a plurality of pixel data for forming images, the image sensor 12 continuously generates large amounts of pixel data while capturing moving images. In order to transmit this considerable amount of pixel data, the pixel data are transmitted from the image sensor 12 to the micro-controller 14 through the parallel transmission bus 16; the micro-controller 14 then recomposes the images, extracts image objects from the recomposed images, and determines the condition of each image object to control the operation of the interactive device 10.
  • Here, an image object refers to a group of at least one pixel having similar properties, such as similar gray intensities or similar colors.
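  • For illustration only (not part of the original disclosure): a minimal C sketch of how pixels with similar gray intensities can be grouped into one image object by a flood fill from a seed pixel. The picture size W x H, the 8-bit grayscale format, the similarity threshold, and the function names are assumptions made for this example.

        #include <stdlib.h>

        #define W 64
        #define H 64
        /* two gray levels are "similar" if they differ by at most 8 (assumed) */
        #define SIMILAR(a, b) (abs((int)(a) - (int)(b)) <= 8)

        static void try_push(int p, unsigned char seed, const unsigned char *img,
                             unsigned char *label, int *stack, int *top)
        {
            if (!label[p] && SIMILAR(img[p], seed)) {
                label[p] = 1;                 /* pixel joins the image object */
                stack[(*top)++] = p;          /* visit its neighbors later    */
            }
        }

        /* Labels every 4-connected pixel whose gray level is similar to the
         * seed pixel (sx, sy); 'label' must be zeroed by the caller. */
        static void extract_object(const unsigned char *img, unsigned char *label,
                                   int sx, int sy)
        {
            static int stack[W * H];          /* each pixel is pushed at most once */
            int top = 0;
            unsigned char seed = img[sy * W + sx];

            try_push(sy * W + sx, seed, img, label, stack, &top);
            while (top > 0) {
                int p = stack[--top], x = p % W, y = p / W;
                if (x > 0)     try_push(p - 1, seed, img, label, stack, &top);
                if (x < W - 1) try_push(p + 1, seed, img, label, stack, &top);
                if (y > 0)     try_push(p - W, seed, img, label, stack, &top);
                if (y < H - 1) try_push(p + W, seed, img, label, stack, &top);
            }
        }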
  • The total amount of data is considerable, and the micro-controller 14 still has to determine and analyze the necessary data after receiving the data transmitted through the parallel transmission bus 16. However, for most applications, the micro-controller 14 does not need to deal with the entire image data. Take an object-tracking application for example: the micro-controller 14 does not need to obtain and process the entire image data; it only needs to calculate the difference between the gravity-center coordinates of corresponding image objects to obtain the trail of their relative motions. As a result, if users utilize the conventional image sensor 12 for generating pixel data, the micro-controller 14 has to receive and process all pixel data, resulting in a major burden while processing the image data.
  • SUMMARY OF THE INVENTION
  • Instead of transmitting the entire image data, the claimed invention discloses an interactive device capable of transmitting parameters of image objects. The interactive device comprises an image sensor, a processor, and a transmission interface. The image sensor generates a plurality of pixel signals corresponding to an image. The processor determines at least one static parameter of at least one image object within the image based on the plurality of pixel signals. Here, an image object refers to a group of at least one pixel having similar properties, such as similar gray intensities or similar colors. The transmission interface outputs a digitized signal comprising at least one value based on the at least one static parameter of at least one image object.
  • These and other objectives of the present invention will no doubt become obvious to those of ordinary skill in the art after reading the following detailed description of the preferred embodiment that is illustrated in the various figures and drawings.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • FIG. 1 is a functional block diagram of the interactive device according to the prior art.
  • FIG. 2 is a functional block diagram of the interactive device according to the present invention.
  • FIG. 3 shows multiple image pictures.
  • FIG. 4 is another functional block diagram of the interactive device according to the present invention.
  • DETAILED DESCRIPTION
  • Please refer to FIG. 2. FIG. 2 is a functional block diagram of an interactive device 30 according to the present invention. The interactive device 30 can be a component of an interface controller, a game controller, or an interactive toy. The interactive device 30 comprises a processing module 44, which is a chip, and a controller 54. The processing module 44 comprises an image sensor 42, which is a charge-coupled device (CCD) or a CMOS image sensor (CIS), for generating a plurality of digital pixel signals; the plurality of pixel signals is then transmitted to the other units of the processing module 44. The processing module 44 further comprises a substrate 41, an estimation unit 45, a calculation unit 46, and transmission interfaces 48, 52. In this embodiment, the image sensor 42, the estimation unit 45, the calculation unit 46, and the transmission interfaces 48, 52 are all integrated in a single chip. For an SOC solution, the image sensor 42, the estimation unit 45, the calculation unit 46, and the transmission interfaces 48, 52 are all formed on the substrate 41.
  • Please refer to FIG. 3. FIG. 3 shows multiple image pictures, each comprising a plurality of pixel signals. For each picture, the image sensor 42 generates a plurality of pixel signals, which are then transmitted to the estimation unit 45. Once the pixel signals are received, the estimation unit 45 estimates various parameters of each image object based on the pixel signals. Take a target picture 120 for example. A target object 100, comprising a group of at least one pixel with similar gray intensities or similar colors in the target picture 120, is extracted first. Then various image parameters of the target object are estimated. The image parameters include: the area of the target object 100, which indicates its total pixel number; the average color of the target object 100, averaged over the colors of all its pixels; the orientation of the target object 100; the boundaries of a minimum square enclosing the target object 100; the characteristic points of the target object 100, particularly its corner points and/or high-curvature points; the geometrical shape of the target object 100; the length to width ratio of the target object 100; and the coordinate of the gravity center of the target object 100, which can be estimated by equation (1):

    $$(\bar{X}, \bar{Y}) = \left[ \frac{\sum_{i=1}^{M} X_i}{M}, \; \frac{\sum_{i=1}^{M} Y_i}{M} \right] \qquad (1)$$

  • where $(\bar{X}, \bar{Y})$ denotes the coordinate of the gravity center of the target object, $(X_i, Y_i)$ denotes the coordinate of each pixel within the target object, and $M$ denotes the number of pixels within the target object.
  • Furthermore, image parameters indicating whether the interior of the target object 100 is filled or unfilled with background pixels, the number of objects enclosed in the target object 100 whose colors differ from it, etc., can also be estimated. After the aforementioned parameters are estimated, the estimation unit 45 can generate extended parameters based on them. For example, the estimation unit 45 can generate the normalized coordinate of the gravity center of the target object with respect to a specified length and a specified width, as in the sketch below.
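  • For illustration only (not part of the original disclosure): a minimal C sketch of the gravity-center estimate of equation (1) and of the normalized gravity center mentioned above, reusing W, H and the label buffer from the previous sketch; the point_t type and function names are assumptions.

        typedef struct { float x, y; } point_t;

        /* Equation (1): the gravity center is the mean coordinate of the M
         * pixels marked in 'label'. */
        static point_t gravity_center(const unsigned char *label)
        {
            long sum_x = 0, sum_y = 0, m = 0;
            for (int y = 0; y < H; ++y)
                for (int x = 0; x < W; ++x)
                    if (label[y * W + x]) {
                        sum_x += x;
                        sum_y += y;
                        ++m;
                    }
            point_t c = { 0.0f, 0.0f };
            if (m > 0) {                      /* guard against an empty object */
                c.x = (float)sum_x / (float)m;
                c.y = (float)sum_y / (float)m;
            }
            return c;
        }

        /* Extended parameter: gravity center normalized with respect to a
         * specified length and width. */
        static point_t normalized_center(point_t c, float spec_w, float spec_h)
        {
            point_t n = { c.x / spec_w, c.y / spec_h };
            return n;
        }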
  • The target object 100 is taken as a set of pixel signals with similar colors, and the estimation unit 45 is capable of determining parameters of the target object 100 in the target picture 120 (e.g. the area, the color, the orientation, and the boundaries) based on the number of the pixel signals, the pixel colors, and their corresponding coordinates. The estimation unit 45 can also determine parameters such as the characteristic points of the target object 100, the geometrical shape of the target object 100, the coordinate of the gravity center of the target object 100, and the length to width ratio of the target object 100. For example, if the target object 100 has a rectangular shape, the estimation unit 45 determines that the number of its corner points is 4. That is to say, the static image parameters are the parameters of the target object 100 that are measurable while the target object 100 is being statically captured by an image sensor; a sketch of such measurements follows.
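  • For illustration only (not part of the original disclosure): a C sketch of measuring a few of the static parameters named above (area, enclosing boundaries, and length to width ratio) from the labeled object pixels; the struct layout is an assumption.

        typedef struct {
            int   area;                        /* total pixel number of the object */
            int   min_x, min_y, max_x, max_y;  /* boundaries of the enclosing box  */
            float aspect;                      /* length to width ratio            */
        } static_params_t;

        static static_params_t measure_object(const unsigned char *label)
        {
            static_params_t p = { 0, W, H, -1, -1, 0.0f };
            for (int y = 0; y < H; ++y)
                for (int x = 0; x < W; ++x)
                    if (label[y * W + x]) {
                        ++p.area;
                        if (x < p.min_x) p.min_x = x;
                        if (x > p.max_x) p.max_x = x;
                        if (y < p.min_y) p.min_y = y;
                        if (y > p.max_y) p.max_y = y;
                    }
            if (p.area > 0)                    /* length taken as vertical extent */
                p.aspect = (float)(p.max_y - p.min_y + 1) /
                           (float)(p.max_x - p.min_x + 1);
            return p;
        }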
  • Furthermore, please keep referring to FIG. 3. The motion vector is estimated as the difference of the gravity-center coordinates of two image objects, either in the same picture or in different pictures obtained at different times. The image pixels with similar properties are first grouped into image objects. Then the coordinate difference between the reference object 150 and the target object 100, calculated as the difference between the coordinates of the gravity centers of the two image objects, represents the motion vector of the target image object. The calculation unit 46 determines the motion vector between two different objects in the above-mentioned way, as in the sketch below.
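  • For illustration only (not part of the original disclosure): the motion vector computed exactly as described above, as the difference of the gravity centers of the target and reference objects (point_t as in the earlier sketch).

        /* Motion vector of the target object relative to the reference object,
         * e.g. reference object 150 in picture 110 and target object 100 in
         * picture 120. */
        static point_t motion_vector(point_t target, point_t reference)
        {
            point_t v = { target.x - reference.x, target.y - reference.y };
            return v;
        }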
  • After obtaining the parameters of the image objects in one or more pictures, the estimation unit 45 and the calculation unit 46 transmit the parameters to the transmission interfaces 48, 52. The transmission interfaces 48, 52 can be universal asynchronous receiver/transmitter (UART) interfaces. Asynchronous serial transmission has the advantages of small size, low cost, and the ability to transmit over long distances. For instance, a universal asynchronous transceiver is an asynchronous serial/parallel data transmitter for transferring data between the interactive device 30 (or a processor) and the serial devices that control and connect to it.
  • In addition to the aforementioned UART interface (RS-232 is one kind of UART interface), the transmission interfaces 48, 52 can be I2C (inter-IC), USB, wireless USB, or SPI (serial peripheral interface) interfaces. Because the principle of converting between serial and parallel data over I2C, USB, wireless USB, or SPI is similar to that of the UART interface and is well known to those skilled in the art, it is not described further.
  • In other words, the first transmission interface 48 and the second transmission interface 52 can each use at least one kind of interface from the serial transmission group comprising the UART, I2C (inter-IC), USB, and wireless USB interfaces; one possible framing over such an interface is sketched below.
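  • For illustration only (not part of the original disclosure): a C sketch of packing the estimated parameters into a small frame for a serial interface such as a UART. The frame layout (start and end markers, object code, little-endian 16-bit fields, fixed-point scaling, XOR checksum) is an invented example, not the patent's wire format, and uart_send_byte() is a hypothetical hardware-abstraction routine.

        extern void uart_send_byte(unsigned char b);    /* hypothetical HAL call */

        static void put_u16(unsigned char *buf, int off, unsigned v)
        {
            buf[off]     = (unsigned char)(v & 0xFF);        /* low byte first */
            buf[off + 1] = (unsigned char)((v >> 8) & 0xFF);
        }

        static void send_object_params(unsigned char code,
                                       const static_params_t *sp, point_t center)
        {
            unsigned char frame[12];
            frame[0] = 0xA5;                                 /* start-of-frame  */
            frame[1] = code;                                 /* image object id */
            put_u16(frame, 2, (unsigned)sp->area);
            put_u16(frame, 4, (unsigned)(center.x * 16.0f));    /* 12.4 fixed point */
            put_u16(frame, 6, (unsigned)(center.y * 16.0f));
            put_u16(frame, 8, (unsigned)(sp->aspect * 256.0f)); /* 8.8 fixed point  */
            frame[10] = 0;
            for (int i = 0; i < 10; ++i)
                frame[10] ^= frame[i];                       /* XOR checksum    */
            frame[11] = 0x5A;                                /* end-of-frame    */
            for (int i = 0; i < 12; ++i)
                uart_send_byte(frame[i]);
        }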
  • Ultimately, after receiving the motion vectors or the static parameters (e.g. the coordinate of the gravity center of the image object, the area of the image object, the average color of the image object, the orientation of the image object, the boundary of the image object, the characteristic points such as corner points and/or high-curvature points, the geometrical shape of the image object, and the length to width ratio of the image object) transmitted from the transmission interfaces 48, 52, the controller 54 can utilize the codes of each object in the previous picture 110, together with the motion vectors and static parameters of each object, to recover the target picture 120, as sketched below. The controller 54 may take further action based on the parameters to control the operation of the interactive device 30.
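  • For illustration only (not part of the original disclosure): a minimal sketch of the controller-side recovery described above, in which the object records of the previous picture 110 are advanced by the received motion vectors to obtain their positions in the target picture 120; the record layout is an assumption.

        typedef struct {
            unsigned char code;     /* object code shared with picture 110 */
            point_t       center;   /* gravity center of the object        */
        } object_record_t;

        /* motion[i] is the received motion vector for objs[i]. */
        static void recover_picture(object_record_t *objs, int n,
                                    const point_t *motion)
        {
            for (int i = 0; i < n; ++i) {
                objs[i].center.x += motion[i].x;
                objs[i].center.y += motion[i].y;
            }
        }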
  • In another embodiment, the first transmission interface 48 for transmitting the data generated by the estimation unit 45 and the second transmission interface 52 for transmitting the motion vectors calculated by the calculation unit 46 can be combined into a single interface.
  • In the third embodiment, the processing module 44 comprises the image sensor 42, the calculation unit 46, and the second transmission interface 52, all integrated in a single chip. For an SOC solution, the image sensor 42, the calculation unit 46, and the second transmission interface 52 are all formed on the same substrate 41. Thus, the third embodiment does not make use of the estimation unit 45 or the first transmission interface 48.
  • In the fourth embodiment, the image sensor 42, the estimation unit 45, and the first transmission interface 48 are integrated in a single chip. For an SOC solution, the image sensor 42, the estimation unit 45, and the first transmission interface 48 are all formed on the same substrate 41, and the calculation unit 46 and the second transmission interface 52 are not used.
  • Please refer to FIG. 4, which is another functional block diagram of an interactive device 40 according to the present invention. The interactive device 40 comprises an image sensor 50, a processor 60, a transmission interface 70, and a controller 80. In this embodiment, the processor 60 determines static parameters of the image object as the estimation unit 45 does, and determines the motion vector between two different image objects as the calculation unit 46 does. Additionally, the processor 60 can be a digital signal processor (DSP), a micro control unit (MCU), or another module capable of determining static parameters and/or motion vectors. Data can be transmitted to the controller 80 in a serial or parallel manner through the transmission interface 70; thus the transmission interface 70 can be an I2C interface, a universal serial bus (USB) interface, a wireless USB interface, a universal asynchronous receiver/transmitter (UART) interface, a parallel transmission interface, or another interface. Data transmitted through the transmission interface 70 comprise the area of the image object, the color of the image object, the orientation of the image object, the boundaries of the image object, the characteristic points of the image object, the geometrical shape of the image object, the length to width ratio of the image object, and the coordinate of the gravity center of the image object; a possible record layout is sketched below.
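  • For illustration only (not part of the original disclosure): one possible record carrying the fields listed above through the transmission interface 70. The text does not specify an encoding, so the field types and fixed-point formats below are assumptions.

        typedef struct {
            unsigned short area;           /* pixel count of the image object  */
            unsigned char  color[3];       /* average color, e.g. RGB          */
            short          orientation;    /* orientation of the image object  */
            short          min_x, min_y;   /* boundaries (enclosing box)       */
            short          max_x, max_y;
            unsigned char  n_points;       /* number of characteristic points  */
            unsigned char  shape;          /* geometrical shape code           */
            unsigned short aspect_q8_8;    /* length to width ratio, 8.8 fixed */
            short          cx_q12_4;       /* gravity center x, 12.4 fixed     */
            short          cy_q12_4;       /* gravity center y, 12.4 fixed     */
        } object_params_t;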
  • In FIG. 2, the image sensor 42, the estimation unit 45, the calculation unit 46, and the transmission interfaces 48, 52 are all integrated in a single chip. For an SOC solution, the image sensor 42, the estimation unit 45, the calculation unit 46, and the transmission interfaces 48, 52 are all formed on the same substrate 41. Alternatively, these elements can be distributed among different chips. As shown in FIG. 4, the image sensor 50, the processor 60, and the transmission interface 70 are not necessarily integrated in the same chip; they can be distributed among different chips, that is, formed on different substrates.
  • The present invention determines static parameters of the image object and determines the motion vectors among different image objects before transmitting data to the controller at the rear end. The transmission interface transmits the calculated image parameters through the UART interface or any other serial and/or parallel transmission interface. In this way, the controller at the rear end no longer needs to process a considerable amount of sensed data, which reduces circuit design complexity and shortens the development period of interactive devices.
  • Those skilled in the art will readily observe that numerous modifications and alterations of the device and method may be made while retaining the teachings of the invention. Accordingly, the above disclosure should be construed as limited only by the metes and bounds of the appended claims.

Claims (15)

1. An interactive device capable of transmitting parameters of image objects, the interactive device comprising:
an image sensor for generating a plurality of pixel signals corresponding to an image;
a processor for determining at least one static parameter set of at least one image object within the image based on the plurality of pixel signals; and
a transmission interface for outputting at least one output signal.
2. The interactive device of claim 1, wherein the static parameter set comprises one or more parameters from a group comprising a coordinate of a gravity center of an image object, an area of the image object, a direction indicating the image object orientation, an average color of the image object, coordinates of some specified object points of the image object, a length to width ratio of the image object, a shape of the image object, and boundaries of the image object.
3. The interactive device of claim 2, wherein the some specified object points of the image object indicate corner points or high curvature points of the image object.
4. The interactive device of claim 1, wherein the output signal comprises at least one parameter from the at least one image object.
5. The interactive device of claim 1, wherein the output signal comprises at least one value calculated with a combination of at least one parameter from the at least one image object.
6. The interactive device of claim 1, wherein the output signal comprises a motion vector calculated in any combination of at least two parameters from the at least two image objects.
7. The interactive device of claim 4, wherein the parameters of the image objects comprise the one or more parameters from the group comprising the coordinate of the gravity center of the image object, the area of the image object, the direction indicating the image object orientation, the average color of the image object, coordinates of some specified object points of the image object, the length to width ratio of the image object, the shape of the image object, and the boundaries of the image object.
8. The interactive device of claim 5, wherein the parameters of the image objects comprise the one or more parameters from the group comprising the coordinate of the gravity center of the image object, the area of the image object, the direction indicating the image object orientation, the average color of the image object, coordinates of some specified object points of the image object, the length to width ratio of the image object, the shape of the image object, and the boundaries of the image object.
9. The interactive device of claim 6, wherein the parameters of the image objects comprise the one or more parameters from the group comprising the coordinate of the gravity center of the image object, the area of the image object, the direction indicating the image object orientation, the average color of the image object, coordinates of some specified object points of the image object, the length to width ratio of the image object, the shape of the image object, and the boundaries of the image object.
10. The interactive device of claim 1, wherein the transmission interface is selected from a group comprising an I2C interface, a universal serial bus (USB) interface, a wireless USB interface, a serial peripheral interface (SPI), a universal asynchronous receiver/transmitter (UART) interface, and a parallel transmission interface.
11. The interactive device of claim 1, wherein the image sensor is a CMOS sensor, or a charge-coupled device (CCD) sensor.
12. The interactive device of claim 1, wherein the processor is a digital signal processor (DSP), or a micro control unit (MCU).
13. The interactive device of claim 1 further comprising a controller for controlling operation of the interactive device based on the output signal.
14. The interactive device of claim 1, wherein the image sensor, the processor, and the transmission interface are integrated in a single chip.
15. The interactive device of claim 1, wherein the image sensor, the processor, and the transmission interface are formed on the same substrate.
US11/279,257 2004-08-11 2006-04-11 Interactive device capable of transmitting parameters of image objects Abandoned US20060164423A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US11/279,257 US20060164423A1 (en) 2004-08-11 2006-04-11 Interactive device capable of transmitting parameters of image objects

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
TW093124089A TWI236289B (en) 2004-08-11 2004-08-11 Interactive device capable of improving image processing
TW093124089 2004-08-11
US10/904,301 US8072426B2 (en) 2004-08-11 2004-11-03 Interactive device capable of improving image processing
US11/279,257 US20060164423A1 (en) 2004-08-11 2006-04-11 Interactive device capable of transmitting parameters of image objects

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US10/904,301 Continuation-In-Part US8072426B2 (en) 2004-08-11 2004-11-03 Interactive device capable of improving image processing

Publications (1)

Publication Number Publication Date
US20060164423A1 2006-07-27

Family

ID=35799594

Family Applications (3)

Application Number Title Priority Date Filing Date
US10/904,301 Active 2027-11-30 US8072426B2 (en) 2004-08-11 2004-11-03 Interactive device capable of improving image processing
US11/279,257 Abandoned US20060164423A1 (en) 2004-08-11 2006-04-11 Interactive device capable of transmitting parameters of image objects
US12/727,262 Active 2025-05-24 US8760390B2 (en) 2004-08-11 2010-03-19 Interactive device capable of improving image processing

Family Applications Before (1)

Application Number Title Priority Date Filing Date
US10/904,301 Active 2027-11-30 US8072426B2 (en) 2004-08-11 2004-11-03 Interactive device capable of improving image processing

Family Applications After (1)

Application Number Title Priority Date Filing Date
US12/727,262 Active 2025-05-24 US8760390B2 (en) 2004-08-11 2010-03-19 Interactive device capable of improving image processing

Country Status (3)

Country Link
US (3) US8072426B2 (en)
JP (1) JP4667833B2 (en)
TW (1) TWI236289B (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20150334356A1 (en) * 2014-05-14 2015-11-19 Hanwha Techwin Co., Ltd. Camera system and method of tracking object using the same
KR20150130901A (en) * 2014-05-14 2015-11-24 한화테크윈 주식회사 Camera apparatus and method of object tracking using the same

Families Citing this family (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9024880B2 (en) 2004-08-11 2015-05-05 Pixart Imaging Inc. Interactive system capable of improving image processing
TWI236289B (en) 2004-08-11 2005-07-11 Pixart Imaging Inc Interactive device capable of improving image processing
US8934545B2 (en) * 2009-02-13 2015-01-13 Yahoo! Inc. Extraction of video fingerprints and identification of multimedia using video fingerprinting
US8947350B2 (en) * 2009-09-14 2015-02-03 Broadcom Corporation System and method for generating screen pointing information in a television control device
TWM393739U (en) * 2010-02-12 2010-12-01 Pixart Imaging Inc Optical touch control apparatus
TWI508543B (en) * 2010-05-06 2015-11-11 Pixart Imaging Inc Interactive system capable of improving image processing
TWI507925B (en) * 2010-06-23 2015-11-11 Pixart Imaging Inc You can switch the range of interactive pointing devices and how to switch fetch ranges
TWI446218B (en) * 2010-06-30 2014-07-21 Pixart Imaging Inc A method of switching the range of interactive pointing devices and a handover fetch for interactive pointing devices
US9304574B2 (en) * 2012-04-13 2016-04-05 Pixart Imaging Inc. Remote device and power saving method of interactive system
TWI476706B (en) * 2012-04-30 2015-03-11 Pixart Imaging Inc Method for outputting command by detecting object movement and system thereof

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5897663A (en) * 1996-12-24 1999-04-27 Compaq Computer Corporation Host I2 C controller for selectively executing current address reads to I2 C EEPROMs
US6256016B1 (en) * 1997-06-05 2001-07-03 Logitech, Inc. Optical detection system, device, and method utilizing optical matching
US20030085878A1 (en) * 2001-11-06 2003-05-08 Xiadong Luo Method and apparatus for determining relative movement in an optical mouse
US20030193561A1 (en) * 2002-04-12 2003-10-16 Giampietro Tecchiolli "Electro-optical device for the acquisition and processing of images"
US20050071499A1 (en) * 2003-09-30 2005-03-31 Batra Rajesh K. Real-time diagnostic data streams for tunable optical devices
US7085408B1 (en) * 2002-07-16 2006-08-01 Magna Chip Semiconductor Method and system for testing image sensor system-on-chip
US7102615B2 (en) * 2002-07-27 2006-09-05 Sony Computer Entertainment Inc. Man-machine interface using a deformable device

Family Cites Families (30)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3030485B2 (en) 1994-03-17 2000-04-10 富士通株式会社 Three-dimensional shape extraction method and apparatus
JPH08125935A (en) 1994-10-28 1996-05-17 Canon Inc Semiconductor device and semiconductor circuit, correlation arithmetic device, a/d converter, d/a converter and signal processing system using the semiconductor device
US5619281A (en) * 1994-12-30 1997-04-08 Daewoo Electronics Co., Ltd Method and apparatus for detecting motion vectors in a frame decimating video encoder
US5956415A (en) * 1996-01-26 1999-09-21 Harris Corporation Enhanced security fingerprint sensor package and related methods
CN1164076A (en) 1996-01-26 1997-11-05 哈里公司 Enhanced security fingerprint sensor package and related methods
JPH10224696A (en) 1997-01-31 1998-08-21 Toshiba Corp Solid-state image pickup element and image system using the solid-state image pickup element
US5955415A (en) * 1997-08-04 1999-09-21 Lever Brothers Company, Division Of Conopco, Inc. Detergent compositions containing polyethyleneimines for enhanced peroxygen bleach stability
KR20010052282A (en) 1998-04-30 2001-06-25 뢰그렌, 크리스터 Input unit, method for using the same and input system
WO2000067960A1 (en) 1999-05-10 2000-11-16 Sony Corporation Toboy device and method for controlling the same
AUPQ289099A0 (en) 1999-09-16 1999-10-07 Silverbrook Research Pty Ltd Method and apparatus for manipulating a bayer image
JP2001056742A (en) 1999-08-19 2001-02-27 Alps Electric Co Ltd Input device
JP2001141981A (en) 1999-11-16 2001-05-25 Olympus Optical Co Ltd Range finder
JP2001209488A (en) * 2000-01-27 2001-08-03 Mitsubishi Electric Corp Information input device and recording medium recording information input program
JP2001242780A (en) 2000-02-29 2001-09-07 Sony Corp Information communication robot device, information communication method, and information communication robot system
US6924787B2 (en) 2000-04-17 2005-08-02 Immersion Corporation Interface for controlling a graphical image
JP2002101332A (en) 2000-09-26 2002-04-05 Kanebo Ltd Image pickup camera and digital signal processing method used for the image pickup camera
JP3576987B2 (en) * 2001-03-06 2004-10-13 株式会社東芝 Image template matching method and image processing apparatus
JP2002268663A (en) 2001-03-08 2002-09-20 Sony Corp Voice synthesizer, voice synthesis method, program and recording medium
JP2003110895A (en) 2001-09-28 2003-04-11 Olympus Optical Co Ltd Personal digital assistance device with camera function
US6859199B2 (en) * 2001-11-06 2005-02-22 Omnivision Technologies, Inc. Method and apparatus for determining relative movement in an optical mouse using feature extraction
DE10316208A1 (en) * 2002-04-12 2003-11-20 Samsung Electro Mech Navigation system and navigation method
KR100516629B1 (en) * 2003-10-02 2005-09-22 삼성전기주식회사 Optical nevigation sensor device and method for processing the image data using the 2-demention sequential process
TWI230890B (en) * 2003-12-29 2005-04-11 Pixart Imaging Inc Handheld pointing device and method for estimating a displacement
TW200522710A (en) * 2003-12-29 2005-07-01 Pixart Imaging Inc Image navigation chip
US9024880B2 (en) 2004-08-11 2015-05-05 Pixart Imaging Inc. Interactive system capable of improving image processing
TWI236289B (en) 2004-08-11 2005-07-11 Pixart Imaging Inc Interactive device capable of improving image processing
CN100559334C (en) 2004-08-18 2009-11-11 原相科技股份有限公司 Image-processing circuit and image sensor are integrated in the interactive apparatus of a substrate
TWI254581B (en) * 2004-12-27 2006-05-01 Sunplus Technology Co Ltd Method and device for detecting image movements
US7864159B2 (en) 2005-01-12 2011-01-04 Thinkoptics, Inc. Handheld vision based absolute pointing system
US9176598B2 (en) 2007-05-08 2015-11-03 Thinkoptics, Inc. Free-space multi-dimensional absolute pointer with improved performance

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5897663A (en) * 1996-12-24 1999-04-27 Compaq Computer Corporation Host I2 C controller for selectively executing current address reads to I2 C EEPROMs
US6256016B1 (en) * 1997-06-05 2001-07-03 Logitech, Inc. Optical detection system, device, and method utilizing optical matching
US20030085878A1 (en) * 2001-11-06 2003-05-08 Xiadong Luo Method and apparatus for determining relative movement in an optical mouse
US20030193561A1 (en) * 2002-04-12 2003-10-16 Giampietro Tecchiolli "Electro-optical device for the acquisition and processing of images"
US7085408B1 (en) * 2002-07-16 2006-08-01 Magna Chip Semiconductor Method and system for testing image sensor system-on-chip
US7102615B2 (en) * 2002-07-27 2006-09-05 Sony Computer Entertainment Inc. Man-machine interface using a deformable device
US20050071499A1 (en) * 2003-09-30 2005-03-31 Batra Rajesh K. Real-time diagnostic data streams for tunable optical devices

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20150334356A1 (en) * 2014-05-14 2015-11-19 Hanwha Techwin Co., Ltd. Camera system and method of tracking object using the same
KR20150130901A (en) * 2014-05-14 2015-11-24 한화테크윈 주식회사 Camera apparatus and method of object tracking using the same
US10334150B2 (en) * 2014-05-14 2019-06-25 Hanwha Aerospace Co., Ltd. Camera system and method of tracking object using the same
KR102282470B1 (en) * 2014-05-14 2021-07-28 한화테크윈 주식회사 Camera apparatus and method of object tracking using the same

Also Published As

Publication number Publication date
TWI236289B (en) 2005-07-11
JP2006054844A (en) 2006-02-23
JP4667833B2 (en) 2011-04-13
US8760390B2 (en) 2014-06-24
US8072426B2 (en) 2011-12-06
US20100177209A1 (en) 2010-07-15
US20060033822A1 (en) 2006-02-16
TW200607340A (en) 2006-02-16

Similar Documents

Publication Publication Date Title
US20060164423A1 (en) Interactive device capable of transmitting parameters of image objects
US8373751B2 (en) Apparatus and method for measuring location and distance of object by using camera
US7551955B2 (en) Device, system and method for image based size analysis
WO2009098743A1 (en) Vehicle periphery monitoring device, vehicle, vehicle periphery monitoring program, and vehicle periphery monitoring method
CN107683403B (en) Distance image acquisition device and distance image acquisition method
US11252306B2 (en) Method for generating depth information and electronic device supporting the same
CN106878668A (en) Mobile detection to object
US9628777B2 (en) Method of 3D reconstruction of a scene calling upon asynchronous sensors
JP2022160678A (en) Three-dimensional image apparatus
JP7280860B2 (en) Information processing device, system, information processing method and information processing program
US20100110209A1 (en) Fast motion measurement device for gaming
CN113822942A (en) Method for measuring object size by monocular camera based on two-dimensional code
CN106470277A (en) A kind of safety instruction method and device
JP2013124972A (en) Position estimation device and method and television receiver
US9024880B2 (en) Interactive system capable of improving image processing
JP4737073B2 (en) Human body detection device
CN100559334C (en) Image-processing circuit and image sensor are integrated in the interactive apparatus of a substrate
CN102238358B (en) Interactive system capable of improving image processing speed and method thereof
JP5044207B2 (en) Imaging device
TWI743379B (en) Vital sign signal detecting method of the vital sign detecting system and method for evaluating a confidence of the vital sign signal
CN114341650A (en) Event detection method and device, movable platform and computer readable storage medium
GB2424332A (en) An image processing system
KR20240084053A (en) Image recognition device and image recognition method using tactile stimulus information
JPH09243325A (en) Method and device for recognizing three-dimensional position
JP2005198059A (en) Data signal reception method, device thereof, program, and recording medium thereof

Legal Events

Date Code Title Description
AS Assignment

Owner name: PIXART IMAGING INC., TAIWAN

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:LEE, HSUAN-HSIEN;YANG, CHIN-HSIN;CHAO, TZU-YI;REEL/FRAME:017462/0334

Effective date: 20060407

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION