U.S. patent application number 11/279,257 was filed with the patent office on 2006-04-11 for interactive device capable of transmitting parameters of image objects, and was published on 2006-07-27 as publication number 20060164423.
Invention is credited to Tzu-Yi Chao, Hsuan-Hsien Lee, Chin-Hsin Yang.
Application Number:  11/279,257
Publication Number:  20060164423
Kind Code:           A1
Family ID:           35799594
Publication Date:    2006-07-27

United States Patent Application 20060164423
Lee; Hsuan-Hsien; et al.
July 27, 2006
INTERACTIVE DEVICE CAPABLE OF TRANSMITTING PARAMETERS OF IMAGE
OBJECTS
Abstract
An interactive device includes an image sensor for generating a
plurality of pixel signals corresponding to an image, and a processor
for determining a static parameter of at least one image object within
the image based on
the plurality of pixel signals. A transmission interface is used
for outputting a control signal based on the static parameter
determined by the processor.
Inventors:              Lee; Hsuan-Hsien (Hsin-Chu Hsien, TW);
                        Yang; Chin-Hsin (Hsin-Chu Hsien, TW);
                        Chao; Tzu-Yi (Hsin-Chu Hsien, TW)
Correspondence Address: NORTH AMERICA INTELLECTUAL PROPERTY CORPORATION
                        P.O. BOX 506
                        MERRIFIELD, VA 22116, US
Family ID:              35799594
Appl. No.:              11/279,257
Filed:                  April 11, 2006
Related U.S. Patent Documents

Application Number   Filing Date    Patent Number
10/904,301           Nov 3, 2004
11/279,257           Apr 11, 2006
Current U.S. Class:    345/474; 345/473; 348/E3.02
Current CPC Class:     H04N 3/1562 20130101; H04N 5/374 20130101
Class at Publication:  345/474; 345/473
International Class:   G06T 15/70 20060101 G06T015/70
Foreign Application Data

Date           Code   Application Number
Aug 11, 2004   TW     093124089
Claims
1. An interactive device capable of transmitting parameters of
image objects, the interactive device comprising: an image sensor
for generating a plurality of pixel signals corresponding to an
image; a processor for determining at least one static parameter
set of at least one image object within the image based on the
plurality of pixel signals; and a transmission interface for
outputting at least one output signal.
2. The interactive device of claim 1, wherein the static parameter
set comprises one or more parameters from a group comprising a
coordinate of a gravity center of an image object, an area of the
image object, a direction indicating the image object orientation,
an average color of the image object, coordinates of some specified
object points of the image object, a length to width ratio of the
image object, a shape of the image object, and boundaries of the
image object.
3. The interactive device of claim 2, wherein the some specified
object points of the image object indicate corner points or high
curvature points of the image object.
4. The interactive device of claim 1, wherein the output signal
comprises at least one parameter from the at least one image
object.
5. The interactive device of claim 1, wherein the output signal
comprises at least one value calculated with a combination of at
least one parameter from the at least one image object.
6. The interactive device of claim 1, wherein the output signal
comprises a motion vector calculated from any combination of at least
two parameters of at least two image objects.
7. The interactive device of claim 4, wherein the parameters of the
image objects comprise the one or more parameters from the group
comprising the coordinate of the gravity center of the image
object, the area of the image object, the direction indicating the
image object orientation, the average color of the image object,
coordinates of some specified object points of the image object,
the length to width ratio of the image object, the shape of the
image object, and the boundaries of the image object.
8. The interactive device of claim 5, wherein the parameters of the
image objects comprise the one or more parameters from the group
comprising the coordinate of the gravity center of the image
object, the area of the image object, the direction indicating the
image object orientation, the average color of the image object,
coordinates of some specified object points of the image object,
the length to width ratio of the image object, the shape of the
image object, and the boundaries of the image object.
9. The interactive device of claim 6, wherein the parameters of the
image objects comprise the one or more parameters from the group
comprising the coordinate of the gravity center of the image
object, the area of the image object, the direction indicating the
image object orientation, the average color of the image object,
coordinates of some specified object points of the image object,
the length to width ratio of the image object, the shape of the
image object, and the boundaries of the image object.
10. The interactive device of claim 1, wherein the transmission
interface is selected from a group comprising an I.sup.2C
interface, a universal serial bus (USB) interface, a wireless USB
interface, a serial peripheral interface (SPI), a universal
asynchronous receiver/transmitter (UART) interface, and a parallel
transmission interface.
11. The interactive device of claim 1, wherein the image sensor is
a CMOS sensor, or a charge-coupled device (CCD) sensor.
12. The interactive device of claim 1, wherein the processor is a
digital signal processor (DSP), or a micro control unit (MCU).
13. The interactive device of claim 1 further comprising a
controller for controlling operation of the interactive device
based on the at least one output signal.
14. The interactive device of claim 1, wherein the image sensor,
the processor, and the transmission interface are integrated in a
single chip.
15. The interactive device of claim 1, wherein the image sensor,
the processor, and the transmission interface are formed on the
same substrate.
Description
CROSS REFERENCE TO RELATED APPLICATIONS
[0001] This is a continuation-in-part of application Ser. No.
10/904,301, filed Nov. 3, 2004, which is incorporated herein by
reference in its entirety.
BACKGROUND OF THE INVENTION
[0002] 1. Field of the Invention
[0003] The present invention relates to an interactive device, and
more specifically, to an interactive device capable of transmitting
parameters of image objects.
[0004] 2. Description of the Prior Art
[0005] In conventional interactive devices, image sensors are used
to capture human motions as controlling instructions. Take
electronic pets for example: the image sensors built into the
electronic pets function as the "eyes" of the interactive toy,
capturing pictures of human motions. The captured and digitized
pictures are then transmitted to a downstream device that identifies
the controlling instructions, and the electronic pets finally act
according to the identified instructions.
[0006] Please refer to FIG. 1. FIG. 1 is a functional block diagram
of an interactive device 10 according to the prior art. The
interactive device 10 includes an image sensor 12, a
micro-controller 14, and a parallel transmission bus 16. The image
sensor 12 contains a CMOS sensing array 22 and an analog to digital
converter (ADC) 24. Data sensed by the CMOS sensing array 22 is
transmitted to the analog-to-digital converter 24. Because the CMOS
sensing array 22 senses a plurality of pixel data for forming images,
it generates a continuous stream of pixel data while capturing moving
images.
amount of pixel data, the pixel data are transmitted from the image
sensor 12 to the micro-controller 14 through the parallel
transmission bus 16, and then the micro-controller 14 recomposes
the images, extracts image objects on the recomposed images, and
then determines the condition of the image object to control the
operation of the interactive device 10.
[0007] Here, an image object refers to a group of at least one
pixel having similar properties, such as similar gray intensities
or similar colors.
[0008] The total amount of the data is considerable, and the
micro-controller 14 still has to determine and analyze the
necessary data after receiving the data transmitted through the
parallel transmission interface 16. However, for most applications,
the micro-controller 14 does not need to deal with the entire image
data. Take an object tracking application for example: the
micro-controller 14 does not need to obtain and process the entire
image data; it can simply calculate the difference between the
coordinates of the gravity centers of corresponding image objects to
obtain the trail of their relative motions. As a result, if users
utilize the conventional image sensor 12 for generating pixel data,
the micro-controller 14 has to receive and process all pixel data,
which imposes a major burden on image data processing.
SUMMARY OF THE INVENTION
[0009] Instead of transmitting the entire image data, the claimed
invention discloses an interactive device capable of transmitting
parameters of image objects. The interactive device comprises an
image sensor, a processor, and a transmission interface. The image
sensor generates a plurality of pixel signals corresponding to an
image. The processing module determines at lease one static
parameter of at least one image object within the image based on
the plurality of pixel signals. Here, an image object refers to a
group of at least one pixel having similar properties, such as
similar gray intensities or similar colors. The transmission
interface outputs a digitized signal comprising at least one value
based on the at least one static parameter of at least one image
object.
[0010] These and other objectives of the present invention will no
doubt become obvious to those of ordinary skill in the art after
reading the following detailed description of the preferred
embodiment that is illustrated in the various figures and
drawings.
BRIEF DESCRIPTION OF THE DRAWINGS
[0011] FIG. 1 is a functional block diagram of the interactive
device according to the prior art.
[0012] FIG. 2 is a functional block diagram of the interactive
device according to the present invention.
[0013] FIG. 3 shows multiple image pictures.
[0014] FIG. 4 is another functional block diagram of the
interactive device according to the present invention.
DETAILED DESCRIPTION
[0015] Please refer to FIG. 2. FIG. 2 is a functional block diagram
of an interactive device 30 according to the present invention. The
interactive device 30 can be one component of an interface
controller, one component of a game controller, or one component of
an interactive toy. The interactive device 30 comprises a processing
module 44, which is a chip, and a controller 54. The processing
module 44 comprises a substrate 41, an image sensor 42, an estimation
unit 45, a calculation unit 46, and transmission interfaces 48 and
52. The image sensor 42, which is a charge-coupled device (CCD) or a
CMOS image sensor (CIS), generates a plurality of digital pixel
signals and passes them to the other units of the processing module
44. In this embodiment, the image sensor 42, the estimation unit 45,
the calculation unit 46, and the transmission interfaces 48, 52 are
all integrated in a single chip. For an SOC solution, the image
sensor 42, the estimation unit 45, and the transmission interfaces
48, 52 are all formed on the substrate 41.
[0016] Please refer to FIG. 3. FIG. 3 shows multiple image
pictures. Each picture comprises a plurality of pixel signals. For
each picture, the image sensor 42 is used to generate a plurality
of pixel signals. Then, the plurality of generated pixel signals
are transmitted to the estimation unit 45. Once the pixel signals are
received, the estimation unit 45 estimates various parameters of each
image object based on the pixel signals. Take a target picture 120
for example. A target object 100, comprising a group of at least one
pixel with similar gray intensities or similar colors in the target
picture 120, is extracted first. Then various image parameters of the
target object are estimated. The image parameters include: the area
of the target object 100, which indicates the total pixel count of
the target object 100; the average color of the target object 100,
which indicates the average of all its pixels' colors; the
orientation of the target object 100; the boundaries of a minimum
square enclosing the target object 100; the characteristic points of
the target object 100, which especially indicate the corner points
and/or the high-curvature points of the target object 100; the
geometrical shape of the target object 100; the length-to-width ratio
of the target object 100; and the coordinate of the gravity center of
the target object 100, which can be estimated by
equation (1): ( X _ , Y _ ) = [ i = 1 M .times. X i M , i = 1 M
.times. Y i M ] , ( 1 ) ##EQU1##
[0017] where $(\bar{X}, \bar{Y})$ denotes the coordinate of the
gravity center of the target object, $(X_i, Y_i)$ denotes the
coordinate of each pixel within the image object, and $M$ denotes the
number of pixels within the target object.
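For illustration, equation (1) amounts to averaging the pixel
coordinates of the object. A minimal Python sketch (not part of the
patent; the function and variable names are illustrative) could read:

    def gravity_center(pixels):
        # Gravity center per equation (1): `pixels` is a sequence of
        # (x, y) coordinates belonging to the object, and M is simply
        # the number of pixels in that sequence.
        m = len(pixels)
        x_bar = sum(x for x, _ in pixels) / m
        y_bar = sum(y for _, y in pixels) / m
        return (x_bar, y_bar)

    # A 2x2 pixel object whose gravity center is (0.5, 0.5):
    print(gravity_center([(0, 0), (1, 0), (0, 1), (1, 1)]))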
[0018] Furthermore, image parameters indicating whether the interior
of the target object 100 is filled or unfilled with background
pixels, the number of objects enclosed in the target object 100 whose
colors differ from it, and so on, can also be estimated. After the
aforementioned parameters are
estimated, the estimation unit 45 can generate the extended
parameters based on the estimated parameters. For example, the
estimation unit 45 can generate the normalized coordinate of the
gravity center of the target object with respect to a specified
length and a specified width.
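As a small follow-on sketch under the same illustrative assumptions,
the normalized gravity center could be produced by dividing each
coordinate by the specified dimensions (`spec_width` and
`spec_length` are hypothetical names; the text above only speaks of a
specified width and a specified length):

    def normalized_gravity_center(center, spec_width, spec_length):
        # Normalize a gravity-center coordinate to the specified width
        # and length, yielding values in [0, 1] for points inside the
        # reference frame.
        x_bar, y_bar = center
        return (x_bar / spec_width, y_bar / spec_length)

    # A gravity center at (160, 120) normalized to a 320x240 frame:
    print(normalized_gravity_center((160, 120), 320, 240))  # (0.5, 0.5)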
[0019] The target object 100 is taken as a set of the pixel signals
with similar colors, and the estimation unit 45 is capable of
determining parameters of the target object 100 in the target
picture 120 (e.g. the area, the color, the orientation, and the
boundaries) based on the number of the pixel signals, the pixel
colors, and their corresponding coordinates. The estimation unit 45
can also determine parameters, such as characteristic points of the
target object 100, the geometrical shape of the target object 100,
the coordinate of the gravity center of the target object 100, and
the length to width ratio of the target object 100. For example, if
the target object 100 is rectangular, the estimation unit 45 can
determine that it has four corner points. That is to say, the static image
parameters are the measurable parameters of the target object 100
while the target object 100 is being statically captured by an
image sensor.
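To make the static parameters concrete, here is a hedged sketch of
how several of them (area, boundaries, average color, length-to-width
ratio, gravity center) could be derived from one pixel grouping. It
assumes each pixel is given as a coordinate plus an RGB color; it is
an illustration, not the patented implementation:

    def static_parameters(pixels):
        # `pixels` is a list of ((x, y), (r, g, b)) entries for one object.
        coords = [p[0] for p in pixels]
        colors = [p[1] for p in pixels]
        xs = [x for x, _ in coords]
        ys = [y for _, y in coords]
        area = len(pixels)                # total pixel count
        left, right = min(xs), max(xs)    # boundaries of the minimum
        top, bottom = min(ys), max(ys)    # rectangle enclosing the object
        width = right - left + 1
        length = bottom - top + 1
        avg_color = tuple(sum(ch) / area for ch in zip(*colors))
        return {
            "area": area,
            "boundaries": (left, top, right, bottom),
            "length_to_width_ratio": length / width,
            "average_color": avg_color,
            "gravity_center": (sum(xs) / area, sum(ys) / area),
        }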
[0020] Furthermore, please keep referring to FIG. 3. The motion
vector can be estimated as the difference between the gravity-center
coordinates of two image objects, either in the same picture or in
different pictures obtained at different times. First, image pixels
with similar properties are grouped into image objects. Then the
coordinate difference between the reference object 150 and the target
object 100, calculated as the difference between the coordinate of
the gravity center of one image object and that of the other,
represents the motion vector of the target image object. The
calculation unit 46 determines the motion vector between two
different objects in this way, as in the sketch below.
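Under the same illustrative assumptions as above, the calculation
unit's operation reduces to a difference of two gravity centers:

    def motion_vector(reference_center, target_center):
        # Difference between the gravity centers of a reference object
        # (e.g. object 150) and a target object (e.g. object 100), taken
        # from the same picture or from pictures captured at different
        # times.
        rx, ry = reference_center
        tx, ty = target_center
        return (tx - rx, ty - ry)

    # If the gravity center moved from (10, 20) to (14, 17):
    print(motion_vector((10, 20), (14, 17)))  # (4, -3)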
[0021] After obtaining parameters for image objects in one picture
or more than one picture, the estimation unit 45 and the
calculation unit 46 can transmit the parameters to the transmission
interfaces 48, 52. The transmission interfaces 48, 52 can be a
universal asynchronous receiver/transmitter (UART) interface.
Asynchronous serial transmission has the advantages of small size,
low cost, and the ability to transmit over a long distance. For
instance, a universal asynchronous receiver/transmitter converts data
between serial and parallel forms when transmitting between the
interactive device 30 (or a processor) and the serial devices that
control and connect to it.
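One hypothetical way to serialize a parameter set before handing it
to such a serial interface is sketched below; the packet layout
(16-bit coordinates, 32-bit area, 16-bit signed orientation,
little-endian) is an assumption for illustration, not a format
defined by the patent:

    import struct

    def pack_parameters(gravity_center, area, orientation):
        # Fixed hypothetical layout: two 16-bit unsigned coordinates,
        # a 32-bit unsigned area, and a 16-bit signed orientation
        # (degrees), all little-endian.
        x, y = (int(round(v)) for v in gravity_center)
        return struct.pack("<HHIh", x, y, area, int(orientation))

    payload = pack_parameters((160.0, 120.0), 4096, -45)
    # The payload could then be written out over a UART, for example
    # with pyserial: serial.Serial("/dev/ttyS0", 115200).write(payload)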
[0022] In addition to the aforementioned UART interface (RS-232 is
one kind of UART interface), the transmission interfaces 48, 52 can
be I.sup.2C (inter-IC), USB interfaces, wireless USB or SPI (serial
peripheral interface). Because the principle of converting between
serial and parallel data with I.sup.2C, USB, wireless USB, or SPI is
similar to that of the UART interface and is well known to those
skilled in the art, it is not described further here.
[0023] In other words, the first transmission interface 48 and the
second transmission interface 52 can each use at least one kind of
interface from the serial transmission group including the UART
interface, I.sup.2C (inter-IC), USB interface, and wireless USB
interface.
[0024] Ultimately, after receiving the motion vectors or the static
parameters (e.g. the coordinate of the gravity center of the image
object, the area of the image object, the average color of the
image object, the orientation of the image object, the boundary of
the image object, the characteristic points, such as corner points
and/or high curvature points, the geometrical shape of the image
object, and the length to width ratio of the image object)
transmitted from the transmission interfaces 48, 52, the controller
54 is able to utilize codes of each object in the previous picture
110 in cooperation with motion vectors and static parameters of
each object to recover the target picture 120. The controller 54
may take further action based on the parameters for controlling the
operation of the interactive device 30.
[0025] In another embodiment, the first transmission interface 48
for transmitting the data generated by the estimation unit 45 and
the second transmission interface 52 for transmitting the motion
vectors calculated by the calculation unit 46 can be combined into
a single interface.
[0026] In the third embodiment, the processing module 44 comprises
the image sensor 42, the calculation unit 46, and the second
transmission interface 52, and all are integrated in a single chip.
For an SOC solution, the image sensor 42, the calculation unit 46,
and the second transmission interface 52 are all formed on the same
substrate 41. Thus, the
third embodiment does not make use of the estimation unit 45 and
the first transmission interface 48.
[0027] In the fourth embodiment, the image sensor 42, the
estimation unit 45, and the first transmission interface 48 are
integrated in a single chip. For an SOC solution, the image sensor
42, the estimation unit 45, and the first transmission interface 48
are all formed on the same substrate 41, and the calculation unit
46 and the second transmission interface 52 are not used.
[0028] Please refer to FIG. 4, which is another functional block
diagram of an interactive device 40 according to the present
invention. The interactive device 40 comprises an image sensor 50,
a processor 60, a transmission interface 70, and a controller 80.
In this embodiment, the processor 60 determines static parameters
of the image object as the estimation unit 45 does, and determines
the motion vector between two different image objects as the
calculation unit 46 does. Additionally, the processor 60 can be a
digital signal processor (DSP), a micro control unit (MCU), or
other modules capable of determining static parameters and/or
motion vectors. Data can be transmitted to the controller 80 in a
serial or parallel manner through the transmission interface 70,
and thus the transmission interface 70 can be an I.sup.2C
interface, a universal serial bus (USB) interface, a wireless USB
interface, a universal asynchronous receiver/transmitter (UART)
interface, a parallel transmission interface, or other interfaces.
Data transmitted through the transmission interface 70 comprise the
area of the image object, the color of the image object, the
orientation of the image object, the boundaries of the
image object, the characteristic points of the image object, the
geometrical shape of the image object, the length to width ratio of
the image object, and the coordinate of the gravity center of the
image object.
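For illustration only, the per-object data listed above could be
grouped into a single record such as the following; the field names
mirror the list, but the grouping itself is an assumption, not a
structure defined by the patent:

    from dataclasses import dataclass
    from typing import List, Tuple

    @dataclass
    class ObjectParameters:
        # One object's parameter set as transmitted through the
        # transmission interface 70 (illustrative field names).
        area: int
        average_color: Tuple[float, float, float]
        orientation: float
        boundaries: Tuple[int, int, int, int]         # left, top, right, bottom
        characteristic_points: List[Tuple[int, int]]  # e.g. corner points
        shape: str
        length_to_width_ratio: float
        gravity_center: Tuple[float, float]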
[0029] In FIG. 2, the image sensor 42, the estimation unit 45, the
calculation unit 46, and the transmission interfaces 48, 52 are all
integrated in a single chip. For an SOC solution, the image sensor
42, the estimation unit 45, the calculation unit 46, and the
transmission interfaces 48, 52 are all formed on the same substrate
41. Also, these elements can be distributed in different chips. As
shown in FIG. 4, the image sensor 50, the processor 60, and the
transmission interface 70 are not necessarily integrated in the
same chip. They can be distributed in different chips. That is,
they can be formed on different substrates.
[0030] The present invention determines static parameters of the
image object, and determines the motion vector among different
image objects before transmitting data to the controller at the
rear end. The transmission interface transmits the calculated image
parameters through the UART interface or any other serial and/or
parallel transmission interface. In this way, the controller at the
rear end no longer needs to process a considerable amount of sensed
data, which reduces circuit design complexity and shortens the
development period of interactive devices.
[0031] Those skilled in the art will readily observe that numerous
modifications and alterations of the device and method may be made
while retaining the teachings of the invention. Accordingly, the
above disclosure should be construed as limited only by the metes
and bounds of the appended claims.
* * * * *