U.S. patent application number 11/446271, "Method and imager for
detecting the location of objects," was published by the patent
office on 2007-05-17 as publication number 20070109421. Invention is
credited to John Gregory Aceti, Herschel Clement Burstyn, Richard
Morgan Moroney III, Timothy Allen Pletcher, Peter John Zanzucchi.

United States Patent Application 20070109421
Kind Code: A1
Zanzucchi; Peter John; et al.
May 17, 2007
Method and imager for detecting the location of objects
Abstract
A device determines the location of objects in an environment by
receiving an optical image of the environment and converting the
optical image into a live color digital image. The device employs
software to perform an analysis of the live color digital image to
determine the location in the environment of one or more lost
objects by using color and shape characteristics of the one or more
objects. The software uses a range of the visible portion of the
color space uniquely identified for the type of object in that
environment and identifies those pixels in the color digital image
that may be possible targets. Intensity of background and object
size are used to exclude pixels as possible target objects.
Inventors: Zanzucchi; Peter John; (Mercer, NJ); Aceti; John Gregory;
(Lexington, MA); Moroney; Richard Morgan III; (Mercer, NJ);
Pletcher; Timothy Allen; (Burlington, NJ); Burstyn; Herschel
Clement; (Mercer, NJ)
Correspondence Address:
BINGHAM MCCUTCHEN LLP
Intellectual Property Department
2020 K Street, N.W.
WASHINGTON, DC 20006 US
Family ID: 35909241
Appl. No.: 11/446271
Filed: June 5, 2006
Related U.S. Patent Documents

Application Number | Filing Date  | Patent Number
10921294           | Aug 19, 2004 | 7119838
11446271           | Jun 5, 2006  |
Current U.S. Class: 348/222.1
Current CPC Class: G06T 7/20 20130101
Class at Publication: 348/222.1
International Class: H04N 5/228 20060101 H04N005/228
Claims
1. A method of determining the location of at least a portion of a
stationary object in an environment, the method comprising:
defining a target color space for the object type of the stationary
object, wherein the target color space is defined by reflected
light and emitted light observed in a series of reference live
digital color images of the object type; detecting the presence of
an object having a color space in the target color space; and
reporting the determination that there are a set of pixels within
the target color space defined for the type of object that is
stationary.
2. The method according to claim 1, wherein the color digital image
is generated by one of: a digital camera and a digital video
camera.
3. The method according to claim 2, further comprising providing a
light source for illuminating the environment.
4. The method according to claim 1, wherein the object is white or
any acceptable color for which RGB components may be
determined.
5. The method according to claim 4, wherein the stationary object
is a golf ball.
6. The method according to claim 5, wherein the environment
includes at least one of: grass, bushes, trees, and sand.
7. The method according to claim 1, further comprising determining
whether the set of pixels determined to be within the target color
space defined for the type of the stationary object satisfy a
characteristic of the object.
8. The method according to claim 7, wherein the characteristic is
one of: a size, color intensity and a shape.
9. The method according to claim 1, wherein defining the target
color space includes generating the series of reference digital
color images of the object type.
10. The method according to claim 1, further comprising storing the
target color space.
11. The method of claim 1, wherein the live digital color image of
the environment is one digital color image in a series of digital
color images of the environment.
12. The method of claim 1, wherein the determination that there are
a set of pixels within the target color space defined for the
object type is reported by one of: a visual display, a tactile
alert, a sound alert, and an odorous alert.
13. The method of claim 1, further comprising generating a live
color digital image of the environment; and determining whether there
are a set of pixels in the live digital color image of the
environment that are within the target color space defined for the
type of object that is stationary.
14. The method according to claim 13, further comprising reporting
the location of the set of pixels in the live digital color image,
wherein the set of pixels corresponds to the location of the at
least a portion of the stationary object in the environment.
15. An apparatus for determining the location of at least a portion
of a stationary object in an environment comprising: a processor
operable to execute computer program instructions; and a memory
operable to store computer program instructions executable by the
processor, for performing the steps of: defining a target color
space for the object type of the stationary object, wherein the
target color space is defined by reflected light and emitted light
observed in a series of reference live digital color images of the
object type; detecting the presence of an object having a color
space in the target color space; and reporting the determination
that there are a set of pixels within the target color space
defined for the type of object that is stationary.
16. The apparatus according to claim 15, wherein the color digital
image is generated by one of: a digital camera and a digital video
camera.
17. The apparatus according to claim 16, further comprising means
for providing a light source for illuminating the environment.
18. The apparatus according to claim 17, wherein the object is
white or any acceptable color for which RGB components may be
determined.
19. The apparatus according to claim 18, wherein the stationary
object is a golf ball.
20. The apparatus according to claim 18, wherein the environment
includes at least one of: grass, bushes, trees, and sand.
21. The apparatus according to claim 15, further comprising
determining whether the set of pixels determined to be within the
target color space defined for the object type satisfy a
characteristic of the object.
22. The apparatus according to claim 21, wherein the characteristic
is one of: a size, color intensity and a shape.
23. The apparatus according to claim 15, wherein defining the
target color space includes generating the series of reference
digital color images of the type of object.
24. The apparatus according to claim 15, further comprising means
for storing the target color space.
25. The apparatus of claim 15, wherein the digital color image is
one digital color image in a series of digital color images.
26. The apparatus of claim 15, wherein the determination that there
are a set of pixels within the target color space defined for the
type of object is reported by one of: a visual display, a tactile
alert, a sound alert, and an odorous alert.
27. The apparatus of claim 15, further comprising means for
generating a live color digital image of the environment; and means for
determining whether there are a set of pixels in the live digital
color image of the environment that are within the target color
space defined for the type of object that is stationary.
28. The apparatus according to claim 15, further comprising means
for reporting the location of the set of pixels in the live digital
color image, wherein the set of pixels corresponds to the location
of the at least a portion of the stationary object in the
environment.
Description
CROSS REFERENCE TO RELATED APPLICATIONS
[0001] This application claims the benefit of utility patent
application Ser. No. 10/921,294, filed on Aug. 19, 2004.
BACKGROUND OF THE INVENTION
[0002] 1. Field of the Invention:
[0003] The present invention relates to a method, a system and a
computer program product for determining the location of objects in
an environment. More particularly, the present invention relates to
a method, a device and a computer program product for determining
the location of the objects in environments where the color of the
object is not naturally found.
[0004] 2. Description of the Prior Art:
[0005] There are many circumstances where an object is lost and
determining its location is difficult due to the characteristics of
the environment in which it has been lost. One such circumstance
occurs during the playing of the sport of golf. Typically, the
sport of golf is played on terrain having a variety of
characteristics, such as grass, sand, trees, water, a specified
distance, etc. It is not uncommon for a golf ball to become lost
while playing golf due to the characteristics of the environment in
which it is played. Once a golf ball is lost, a substantial amount
of time can be spent trying to find it. This results in an increase
of playing time for the player who lost the ball, as well as other
players playing behind or with the player. In cases where the golf
ball cannot be located, the player who lost the ball is assessed a
penalty stroke, increasing the player's final score.
[0006] Accordingly, there is a need for a device that detects and
determines the location of an object in an environment having a
variety of characteristics. There is further need for the device to
be mobile. There is a further need for the device to detect the
location of an object over long distances. There is a need for the
device to be operable in a variety of lighting conditions. There is
a need for the device to reduce glare and related image artifacts.
There is a need for the device to reduce multiple reflections and
shadowing in the detection of the object. There is a need for the
device to decrease the amount of time required to locate an
object.
SUMMARY OF THE INVENTION
[0007] According to embodiments of the present invention, a method,
a device and a computer program product for determining the
location of an object in an environment are provided. The method
receives an optical image of an environment and converts the
optical image of the environment into a live color digital image of
the environment consisting of charge signals, where each charge
signal was generated by a pixel in an array of a Charge-Coupled
Device (CCD) by photoelectric conversion.
[0008] The live color digital image depicts an environment having
one or more similar objects. Software performs an analysis of the
live color digital image to detect and determine the location of
the one or more objects in the live color digital image of the
environment by using color and shape characteristics of the one or
more objects. The software uses a range of the visible portion of
the color space uniquely identified for the type of object in that
environment to detect and determine the location of the one or more
objects in the live color digital image of the environment. When
the object is a golf ball, color is defined by reflection of light
and by UV stimulated emission of blue fluorescence due to the
brighteners incorporated in the composition of golf balls. Color in
this application may be due to the reflection of light, stimulated
emission such as fluorescence, phosphorescence and like processes,
separately or in combination. In the presence of sufficient
sunlight, the color of a golf ball is expected to be unique, a blue
enhanced white not naturally found in objects. Furthermore, because
a lost golf ball is often only partially visible, with as little as
1% of its surface unobstructed, the golf ball is not identifiable
as an object by its shape. In general, the image of a lost golf
ball occupies a very small percentage of the image and statistical
approaches are needed to identify the pixels for a lost golf
ball.
[0009] The range of the color space is based at least in part on
the color spaces identified for the type of object, such as a golf
ball having a blue enhanced white not naturally found in objects,
under various lighting conditions in the environment where the type
of object would be lost. The color spaces for the object are
defined by analyzing the color spaces obtained from the object in
live color digital images of the object under the various lighting
conditions in a training mode and storing the color spaces. The
object is detected by using an algorithm that identifies pixels in
the live color digital image that correspond with a color space in
the range of color spaces. Once a pixel is identified, it is
recorded. Recorded pixels are analyzed to determine whether there
are clusters of pixels that meet particular criteria. The image
may be filtered using polarization to eliminate glare.
BRIEF DESCRIPTION OF THE DRAWINGS
[0010] The above described features and advantages of the present
invention will be more fully appreciated with reference to the
detailed description and appended figures in which:
[0011] FIG. 1 depicts an exemplary functional block diagram of a
device in which the present invention can find application;
FIG. 2 depicts an exemplary color digital image taken with the
device depicted in FIG. 1;
[0012] FIG. 3 depicts an exemplary flow diagram for detecting the
location of an object in an environment according to an embodiment
of the present invention; and
[0013] FIGS. 4a-4d depict exemplary color space diagrams of an
object shown in a color digital image.
DETAILED DESCRIPTION OF THE INVENTION
[0014] The present invention is now described more fully
hereinafter with reference to the accompanying drawings that show
embodiments of the present invention. The present invention,
however, may be embodied in many different forms and should not be
construed as limited to embodiments set forth herein.
Rather, these embodiments are provided so that this
disclosure will be thorough and complete, and will fully convey the
scope of the present invention.
[0015] According to embodiments of the present invention, a method,
a device and a computer program product for determining the
location of an object in an environment are provided. The method
receives an optical image of an environment and converts the
optical image of the environment into a live color digital image of
the environment consisting of charge signals, where each charge
signal was generated by a pixel in an array of a Charge-Coupled
Device (CCD) by photoelectric conversion.
[0016] The live color digital image depicts an environment having
one or more similar objects. Software performs an analysis of the
live color digital image to detect and determine the location of
the one or more objects in the live color digital image of the
environment by using color and shape characteristics of the one or
more objects. The software uses a range of the visible portion of
the color space uniquely identified for the type of object in that
environment to detect and determine the location of the one or more
objects in the live color digital image of the environment. When
the object is a golf ball, color is defined by reflection of light
and by UV stimulated emission of blue fluorescence due to the
brighteners incorporated in the composition of golf balls. Color in
this application may be due to the reflection of light, stimulated
emission such as fluorescence, phosphorescence and like processes,
separately or in combination. In the presence of sufficient
sunlight, the color of a golf ball is expected to be unique, a blue
enhanced white not naturally found in objects. Furthermore, because
a lost golf ball is often only partially visible, with as little as
1% of its surface unobstructed, the golf ball is not identifiable
as an object by its shape. In general, the image of a lost golf
ball occupies a very small percentage of the image and statistical
approaches are needed to identify the pixels for a lost golf
ball.
[0017] The range of the color space is based at least in part on
the color spaces identified for the type of object, such as a golf
ball having a blue enhanced white not naturally found in objects,
under various lighting conditions in the environment where the type
of object would be lost. The color spaces for the object are
defined by analyzing the color spaces obtained from the object in
live color digital images of the object under the various lighting
conditions in a training mode and storing the color spaces. The
object is detected by using an algorithm that identifies pixels in
the live color digital image that correspond with a color space in
the range of color spaces. Once a pixel is identified, it is
recorded. Recorded pixels are analyzed to determine whether there
are clusters of pixels that meet particular criteria. The image
may be filtered using polarization to eliminate glare.
[0018] FIG. 1 depicts a functional block diagram of an image taking
device in which the present invention can find application. In the
embodiment of FIG. 1, image taking device 100 can be implemented to
detect the presence of an object in an area in an environment and
determine the location of the object in the particular environment,
such as a golf ball on a golf course. In the FIG. 1 embodiment,
image taking device 100 is a system, such as a digital camera,
digital video camera, or the like, but can be any apparatus that
executes program instruction in accordance with the present
invention. In an embodiment of the present invention, the image
taking device 100 is hand-held. In an embodiment of the present
invention, the image taking device 100 is mountable on a mobile
object, such as a golf cart, aircraft, or automobile. In an
embodiment of the present invention, the imaging device 100 is
positioned at a fixed location, such as a position of the green of
a hole on a golf course.
[0019] In the FIG. 1 embodiment of the present invention, the image
taking device 100 includes a processor (CPU) 102, an input system
104, imaging circuitry 106, programmable gain amplifier (PGA) 108,
analog-to-digital converter 110, memory 112, data 116, and display 118. In
the FIG. 1 embodiment, the input system 104 is a digital image
system. The input system 104 provides an interface for acquiring
light reflected from an object, and by UV stimulated emission of
blue fluorescence due to the brighteners incorporated in the
composition of golf balls or light depicting an object. In an
embodiment of the present invention, the light acquired by the
input system can be used to form a live image of the object and a
live environment. The input system 104 includes imaging optics that
may be set to satisfy the Scheimpflug Condition and a
charge-coupled device sensor having a plurality of pixels. In the
Scheimpflug Condition, the object plane, the image plane, and the
median plane all intersect at a common point through the lens. This
condition has the effect that an object plane is mapped onto a
non-parallel image plane. The advantage of this condition is the
ability to focus on the ground where we expect the lost object (for
example, a golf ball) to be located with significantly improved
depth of focus.
[0020] The input system 104 is coupled to circuitry 106 and
provides an analog image signal to the circuitry 106. The circuitry
106 samples the analog image signal and extracts the voltage that
is proportional to the amount of light which fell on each pixel of
the charge-coupled device sensor of the input system 104 using
color components R (red), G (green) and B (blue). Programmable gain
amplifier (PGA) 108 is coupled to circuitry 106, amplifies the
voltages to the proper range and provides the voltages as input to
the analog-to-digital converter (ADC) 110. The ADC 110 is
coupled to CPU 102 and converts the voltage to a digital code
suitable for further digital signal processing by CPU 102. The CPU
102 is a microprocessor, such as an INTEL PENTIUM.RTM. or AMD.RTM.
processor, but can be any processor that executes program
instructions in order to carry out the functions of the present
invention.
[0021] In the FIG. 1 embodiment, the memory 112 is coupled to CPU
102 and stores object detecting program 114 and data 116. The data
116 includes, but is not limited to, a live color digital image
depicting a particular environment having one or more objects whose
locations in the environment are desired to be determined, a set of
color space ranges, where each color space range in the set of
color space ranges is uniquely identified for a type of object, and
the color space of one or more pixels of the live color digital
image. In an embodiment of the present invention, the color space
is a range of blue enhanced white not naturally found in objects.
The blue enhanced white is defined by the reflection of light of
the golf ball, by UV stimulated emission of blue fluorescence from
the golf ball due to the brighteners incorporated in the
composition of the golf ball or a combination.
[0022] In the FIG. 1 embodiment, the object detecting program 114
provides the functionality associated with detecting the presence
of an object in an environment and determining the location of the
object in the environment, such as a golf ball, in the live color
digital image of an environment as executed by the CPU 102. The
object detecting program 114 is designed to report the location of
an object in the live color digital image, such as on a display
118.
[0023] FIG. 2 depicts an exemplary live color digital image taken
with the device depicted in FIG. 1. In FIG. 2 the live color
digital image 200 shows golf balls 202a-202d distributed on terrain
with grass.
[0024] An exemplary flow diagram of an embodiment for determining
the location of an object in a particular environment is shown in
FIG. 3. FIG. 3 is best understood when read in combination with
FIG. 1. As shown in FIG. 3, the process begins with step 300, in
which a target color space for the type of object is defined based
on the observed R, G, B levels in a series of reference live images
of the object, such as a golf ball having a color of blue enhanced
white. The series of reference live images are taken several times
under various conditions to determine a desirable target color
space. This training produces slightly different color spaces.
Using a set of reference live images increases the robustness of
our approach relative to using just a single image. In the case of
typical golf balls, this results in a "blue enhanced white" space.
This space may be a restricted set from the universe of observed
colors, such as the space of colors that together account for 50%
of all observations. The color space uses two of the three
available degrees of freedom in the RGB measurement.
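The training described in step 300, collecting observed colors from a series of reference images and keeping the restricted set that accounts for, e.g., 50% of all observations, can be sketched as follows. This is a minimal illustration, not the patented implementation; the function names, the per-channel quantization step, and the use of a frequency counter are illustrative assumptions.

```python
from collections import Counter

def build_target_color_space(reference_pixels, coverage=0.5, quant=16):
    """Define a target color space from (R, G, B) values observed in a
    series of reference images of the object type: the most frequently
    observed quantized colors are admitted until they account for at
    least `coverage` of all observations (the text's example uses 50%)."""
    counts = Counter((r // quant, g // quant, b // quant)
                     for r, g, b in reference_pixels)
    total = sum(counts.values())
    target, covered = set(), 0
    for color, n in counts.most_common():
        target.add(color)
        covered += n
        if covered / total >= coverage:
            break
    return target

def in_target_space(pixel, target, quant=16):
    """Test whether a pixel's quantized color falls in the target space."""
    r, g, b = pixel
    return (r // quant, g // quant, b // quant) in target
```

Pooling pixels taken under several lighting conditions, as paragraph [0026] requires, amounts to extending `reference_pixels` with observations from each condition before training.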
[0025] The target color space for a lost golf ball depends on both
light reflected from the golf ball and light emitted from the golf
ball, i.e., fluorescence of the golf ball. The fluorescence is due
to brighteners added to golf balls to improve their appearance. Such
brighteners absorb UV from sunlight and re-emit the light at lower
energy as blue light. Blue color added to white is well known to
improve the "whiteness" of an object. The practice of adding
brighteners to golf balls is common for this reason. Hence, the
color of a golf ball has two components, reflected light and blue
fluorescence.
[0026] In defining a target color space, color shifts caused by the
specific lighting conditions of the particular type of object must
be considered and included in the target color space for the type
of object. Accordingly, the color shifts of the type of object must
be determined. This includes color shifts caused by "global"
lighting, such as sunny versus cloudy weather, as well as "local"
lighting, such as in grass or under a bush. For purposes of our
invention, we define "white" as the color of a typical golf
ball.
[0027] Turning briefly to FIGS. 4a-4d, an exemplary color space
diagram depicts the corresponding color space of a ball in a color
digital image. In the FIG. 4 embodiment, the color space
diagram 4b shows colors that in the picture 4a are provided in a
shade of gray. The shade level gives an indication of the relative
frequency of that particular color, with dark gray having few
occurrences and white having many occurrences. This is due to the
different color temperatures of the illumination. An automatic
white balance feature to correct for color shifts may be provided
on device 100 where the user can optionally select its operation.
In the FIG. 4c embodiment, a subset of the color space of FIG. 4b
is shown where only the colors that constitute 99% of the pixels
are selected and represented in white in FIG. 4c. In the FIG. 4d
embodiment, a subset of the color space of FIG. 4b is shown where
only the colors that constitute 50% of the pixels are selected and
represented in white in FIG. 4d.
[0028] Returning here to FIG. 3, in step 302 a live digital color
image of an environment where the object is thought to be located
is generated. This includes, but is not limited to, acquiring
object light or light depicting an object and forming an image,
providing an analog image signal for extraction of voltage which is
proportional to the amount of light which fell on each pixel of a
charge-coupled device sensor using color components R, G, and B,
and converting the voltage to a digital code suitable for further
digital signal processing. In an embodiment of the present
invention, a light source may be used to shift the color space back
into a regular detection range and to raise the light intensity
from a ball resting in a shadow back up to the high levels expected
if it was not shaded. One having ordinary skill in the art would
understand that the light source can be a UV light source where
device 100 employs UV color space. For the reasons cited, UV
lighting, particularly where the ambient lighting is poor, such as
in shaded areas or under overcast skies, will improve detection of
a golf ball by stimulating fluorescence and the emission of blue
light.
[0029] In step 304, the digital color image is processed to detect
the location of the object in the environment. This includes, but
is not limited to, identifying pixels in the live color digital
image that match a color space in the target color space defined
for the type of object. In the FIG. 3 embodiment of the present
invention, the processing is performed by an algorithm that looks
for blue and red pixels that fall into a color space in the target
color space. All the pixels that fall into a color space in the
target color space are then evaluated based on each pixel's total
luminance. Pixels that meet a minimum luminance value based on the
current lighting conditions are then grouped into clusters of
pixels that are all within a specific distance from each other.
Each cluster is evaluated to determine if it meets certain color
and luminance requirements to be a golf ball. If it does not, the
cluster is rejected; otherwise, the cluster is accepted as a
possible location. The accepted clusters are sorted so that the
most blue, least red, and brightest cluster in the image data comes
first, and the locations of the pixels whose color space matches a
color space in the defined target color space for the type of
object are stored.
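The color-and-luminance screening and cluster ranking of step 304 can be sketched as below. The dict-based image representation, the R+G+B luminance measure, and the exact sort key are illustrative assumptions rather than the patented algorithm.

```python
def screen_pixels(image, target, min_luminance, quant=16):
    """Keep locations of pixels whose quantized (R, G, B) color falls in
    the target color space and whose total luminance meets the minimum
    for the current lighting conditions. `image` maps (x, y) -> (R, G, B)."""
    return [(x, y)
            for (x, y), (r, g, b) in image.items()
            if (r // quant, g // quant, b // quant) in target
            and r + g + b >= min_luminance]

def rank_clusters(clusters, image):
    """Order candidate clusters so the most blue, least red, brightest
    cluster comes first, as step 304 describes."""
    def key(cluster):
        red = sum(image[p][0] for p in cluster)
        blue = sum(image[p][2] for p in cluster)
        luminance = sum(sum(image[p]) for p in cluster)
        return (-blue, red, -luminance)
    return sorted(clusters, key=key)
```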
[0030] In step 306, a decision statistic is defined that represents
the likely characteristics of the type of object. In an embodiment
of the present invention, the intensity of the background can be
used as a decision statistic. The intensity of the background can
be determined by processing the color digital image a second time.
With an image-specific histogram of the background intensity, a
lower-bound threshold for the expected target intensity can be
defined, such as at the 90%, 95%, or 99% level of the background
intensity. The pixels whose locations are stored can be screened
using this criterion, with those pixels not meeting the intensity
specification removed.
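The background-intensity screen of step 306 amounts to a percentile threshold over an image-specific intensity distribution. The sketch below assumes total R+G+B intensity and a simple sorted-list percentile; the function names are illustrative.

```python
def background_intensity_threshold(image, level=0.95):
    """Lower-bound threshold for the expected target intensity, set at
    the `level` point (e.g. 0.90, 0.95, or 0.99) of the distribution of
    per-pixel intensities in the image."""
    intensities = sorted(sum(rgb) for rgb in image.values())
    idx = min(int(level * (len(intensities) - 1)), len(intensities) - 1)
    return intensities[idx]

def screen_by_intensity(candidates, image, level=0.95):
    """Remove stored candidate pixels that fail the intensity threshold."""
    threshold = background_intensity_threshold(image, level)
    return [p for p in candidates if sum(image[p]) >= threshold]
```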
[0031] In an embodiment of the present invention, the size of the
type of object can be used as a decision statistic. The size of the
type of object can be used to identify the object by determining
the diameter of the object, such as a golf ball, in pixels. This value
can serve as a cluster distance. The pixels whose locations are
stored can be screened using this criterion by collecting into
groups, or clusters, those pixels that are within a cluster
distance of each other.
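The grouping of stored pixel locations into clusters using the expected diameter in pixels as the cluster distance can be sketched as a greedy merge; the Manhattan distance metric and the merge strategy are illustrative choices, not specified by the text.

```python
def cluster_pixels(pixels, cluster_distance):
    """Collect pixel locations into groups, or clusters: each pixel joins
    (and merges) every existing cluster that has a member within
    `cluster_distance`, measured here with Manhattan distance."""
    clusters = []
    for p in pixels:
        near = [c for c in clusters
                if any(abs(p[0] - q[0]) + abs(p[1] - q[1]) <= cluster_distance
                       for q in c)]
        merged = [p]
        for c in near:
            merged.extend(c)      # merge clusters linked through p
            clusters.remove(c)
        clusters.append(merged)
    return clusters
```

Setting `cluster_distance` to the golf ball's diameter in pixels, as the paragraph suggests, keeps pixels from one ball in a single cluster while separating distinct balls.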
[0032] In step 308, it is determined whether the object is
identified in the environment based on one or more statistics. A
statistic includes color space information, and may also include
intensity information and/or cluster information. A statistic may
also include weighting values from any reference images collected.
The preferred approach is to define one statistic, but it is
obvious that multiple statistics could be defined and used with
this method. In step 310, the object is reported if identified,
such as by display 118.
[0033] While specific embodiments of the present invention have
been illustrated and described, it will be understood by those
having ordinary skill in the art that changes can be made to those
embodiments without departing from the spirit and scope of the
invention. For example, while the present invention concentrates on
a single color digital image and stationary lost object analysis,
it is understood that information from a series of images, a moving
object or a specific object might advantageously be used as well.
Also, while our application to golf balls has us discussing UV and
visible light, the method is not dependent on this choice.
* * * * *