U.S. patent application number 10/471958 was published by the patent office on 2004-06-03 as publication number 20040104334, for an omni-directional radiation source and object locator. Invention is credited to Eyal, Reuven; Gal, Ehud; Graisman, Gil; and Liteyga, Gennadiy.
Application Number: 10/471958
Publication Number: 20040104334
Kind Code: A1
Family ID: 26958214
Publication Date: June 3, 2004

United States Patent Application
Gal, Ehud; et al.
Omni-directional radiation source and object locator
Abstract
The current invention describes a method for determining the azimuth and elevation angles of a radiation source or other physical object located anywhere within a cylindrical field of view. The invention makes use of an omni-directional imaging system comprising reflective surfaces, an image sensor, and an optional optical filter for filtration of the desired wavelengths. Said imaging system is designed to view an omni-directional field of view using a single image sensor, with no need for mechanical scanning to cover the full field of view. Use of two such systems separated by a known distance, each providing a different reading of the azimuth and elevation angles of the same object, enables classic triangulation for determination of the actual location of the object. The invention is designed to enable the use of low-cost omni-directional imaging systems for the location of radiation sources or objects. Many additional needs and applications are envisaged for such a method. Those needs include: location of flares and torches in search and rescue operations at sea or over land; detection of aircraft in close proximity for flight safety in VFR flight conditions; detection and location of weapon systems that employ Laser Range Finders; detection of and warning against Laser Target Designators used in conjunction with surface-launched or air-dropped precision guided munitions; operation of Infra-red countermeasures; and location of sparks resulting from enemy fire.
Inventors: Gal, Ehud (Reut, IL); Eyal, Reuven (Hasharon, IL); Graisman, Gil (Reut, IL); Liteyga, Gennadiy (Ashkelon, IL)
Correspondence Address: MARSTELLER & ASSOCIATES, P.O. BOX 803302, DALLAS, TX 75380-3302, US
Family ID: 26958214
Appl. No.: 10/471958
Filed: September 16, 2003
PCT Filed: March 20, 2002
PCT No.: PCT/IL02/00228
Current U.S. Class: 250/203.6
Current CPC Class: G02B 13/06 (2013.01)
Class at Publication: 250/203.6
International Class: G01C 021/24; G01C 021/02; G01J 001/20
Claims
What is claimed is:
1. A method for determining the elevation angle of an object imaged by a focal plane array sensor, comprising the following stages: a. Imaging a cylindrical field of view using an omni-directional imaging system which comprises an omni-directional lens assembly and a focal plane array. b. Detection of an object imaged by a first sensor element on said focal plane array. c. Registration of the coordinates of said first sensor element relative to its position on said focal plane array. d. Registration of the coordinates of a second sensor element which occupies the center of the entire image, relative to its position on said focal plane array. e. Determination of the distance between said first sensor element and said second sensor element. f. Determination of a transformation function, which assigns to each said distance the appropriate elevation angle value, said transformation function being compatible with the design of the omni-directional imaging system. g. Extraction of the elevation angle value which corresponds to said distance value from said transformation function. Wherein said focal plane array images an omni-directional field of view.
2. A method of claim 1, wherein said omni-directional lens assembly
comprises reflective lenses.
3. A method of claim 1, wherein said detection of an object is
accomplished by software processing of the image.
4. A method of claim 1, wherein said detection of an object is
performed by an electronic circuit connected to said focal plane
array.
5. An electronic circuit of claim 4, designed to detect charge changes on said focal plane array and register the coordinates of sensor elements in which changes have been detected.
6. A method of claim 1, further comprising placement of an optical filter, anywhere along the optical path of light rays captured by said omni-directional imaging system, selected to ensure filtration of specific wavelengths, and covering the entire field of view.
7. An optical filter of claim 6, comprising a multitude of optical filters.
8. A method of claim 1, wherein said object is a radiation
source.
9. A radiation source of claim 8, which emits in the visible
spectrum.
10. A radiation source of claim 8, which emits in the invisible
spectrum.
11. A method for determining the azimuth angle of an object imaged by a focal plane array sensor, comprising the following stages: a. Imaging a cylindrical field of view using an omni-directional imaging system which comprises an omni-directional lens assembly and a focal plane array. b. Detection of an object imaged by a first sensor element on said focal plane array. c. Registration of the coordinates of said first sensor element relative to its position on said focal plane array. d. Registration of the coordinates of a second sensor element which occupies the center of the entire image, relative to its position on said focal plane array. e. Determination of the distance between said first sensor element and said second sensor element. f. Superposition of a virtual two-dimensional coordinate system upon said focal plane array, in such a way that the origin of said coordinate system coincides with said second sensor element. g. Alignment of one of the axes of said coordinate system with true north. h. Determination of the angle between the line connecting said first sensor element with said second sensor element and the axis aligned with true north, said angle being the azimuth angle. Wherein said focal plane array images an omni-directional field of view.
12. A method of claim 11, wherein said omni-directional lens
assembly comprises reflective lenses.
13. A method of claim 11, wherein said detection of an object is
accomplished by software processing of the image.
14. A method of claim 11, wherein said detection of an object is
performed by an electronic circuit connected to said focal plane
array.
15. An electronic circuit of claim 14, designed to detect charge changes on said focal plane array and register the coordinates of sensor elements in which changes have been detected.
16. A method of claim 11, further comprising placement of an optical filter, anywhere along the optical path of light rays captured by said omni-directional imaging system, selected to ensure filtration of specific wavelengths, and covering the entire field of view.
17. An optical filter of claim 16, comprising a multitude of optical filters.
18. A method of claim 11, wherein said object is a radiation
source.
19. A radiation source of claim 18, which emits in the visible
spectrum.
20. A radiation source of claim 18, which emits in the invisible
spectrum.
Description
FIELD OF THE INVENTION
[0001] The invention relates to the field of omni-directional
imaging. More specifically but not exclusively, it relates to the
field of location of radiation sources and objects by using
omni-directional imaging systems.
DESCRIPTION OF RELATED ART
[0002] The present invention refers to a method for detection and
location of radiation sources and physical objects in a cylindrical
field of view. Radiation source detection systems are widely used,
mostly for military purposes. Current techniques are based on the employment of an imaging device with a focal plane array that is sensitive to a specific wavelength, thus enabling detection of energy radiated at this precise wavelength. Detection of a radiation source is done by detection of changes on the focal plane array, changes that occur only if a ray of the defined wavelength has penetrated the optical filter and come into contact with the focal plane array. Determination of the position (azimuth and elevation angles) of the radiation source is based on registration of each pixel's elevation and azimuth. Employment of two such systems, each detecting the same radiation source and each producing a different azimuth and elevation angle, enables determination of the radiation source's exact location by classic triangulation methods.
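By way of illustration only, the classic triangulation step can be sketched in a few lines of code. The sketch below (Python; all function and variable names are illustrative and not part of the patent) takes the positions of two imaging systems and their azimuth/elevation readings toward the same object and, since two measured sight lines rarely intersect exactly, returns the midpoint of the shortest segment between them:

    import numpy as np

    def direction(az_deg, el_deg):
        # Unit line-of-sight vector in an east-north-up frame;
        # azimuth is measured clockwise from true north.
        az, el = np.radians(az_deg), np.radians(el_deg)
        return np.array([np.cos(el) * np.sin(az),   # east
                         np.cos(el) * np.cos(az),   # north
                         np.sin(el)])               # up

    def triangulate(p1, az1, el1, p2, az2, el2):
        # Midpoint of the shortest segment between the two sight lines
        # p1 + t1*d1 and p2 + t2*d2 (standard closest-point solution).
        d1, d2 = direction(az1, el1), direction(az2, el2)
        a, b, c = d1 @ d1, d1 @ d2, d2 @ d2
        w = p1 - p2
        denom = a * c - b * b    # near zero if the lines are parallel
        t1 = (b * (d2 @ w) - c * (d1 @ w)) / denom
        t2 = (a * (d2 @ w) - b * (d1 @ w)) / denom
        return ((p1 + t1 * d1) + (p2 + t2 * d2)) / 2

For example, two systems 100 meters apart on an east-west baseline, each reporting its own azimuth and elevation toward the same flare, would locate it with triangulate(np.zeros(3), az1, el1, np.array([100.0, 0.0, 0.0]), az2, el2).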
[0003] The mentioned method is currently used with imaging systems
which are able to cover only a relatively narrow field of view.
Therefore, in order to cover a field of regard that is wider than
the field of view covered by the imaging system, it is customary to
use several imaging systems, each covering a different field of view. The use of several imaging systems in such a solution necessitates accurate alignment of the systems to assure that each of them covers a different sector with no gaps or overlaps, and that all of them together cover the full panoramic view. It is also required that advanced, synchronized software support all imaging devices and provide accurate readings and calculation of the azimuth and elevation of the illuminating source. Due to its complexity, this method is considered cumbersome and costly. Another commonly used method is to rotate a conventional system about its axis to achieve coverage of a full panoramic field of regard. Rotation of such a system requires a combination of smoothly moving mechanical components, accurately controlled and synchronized with the software's operation, to assure accurate determination of the azimuth and elevation angles of the illuminating source.
[0004] The current invention provides a static staring imaging
system that enables coverage of a full panoramic or nearly
spherical field of view, without mechanical movement or the need
for multiple imaging systems. The invention was disclosed in provisional patent application No. 60/276933 submitted by "Wave Group Ltd.". The optical structures that enable the unique coverage of a full panoramic field of view or the nearly spherical field of view are disclosed in provisional patent application No. 60/322737 submitted by "Wave Group Ltd." and provisional patent application No. 60/22565 submitted by "Wave Group Ltd.".
SUMMARY OF THE INVENTION
[0005] A first embodiment of the current invention provides a method for determining the elevation angle of an object imaged by a focal plane array sensor. Said focal plane array sensor is part of a focal plane array that images an omni-directional field of view. Said method comprises the following stages (a minimal code sketch of stages e. through g. follows the list):

[0006] a. Imaging a cylindrical field of view using an omni-directional imaging system which comprises an omni-directional lens assembly and a focal plane array.

[0007] b. Detection of an object imaged by a first sensor element on said focal plane array.

[0008] c. Registration of the coordinates of said first sensor element relative to its position on said focal plane array.

[0009] d. Registration of the coordinates of a second sensor element which occupies the center of the entire image, relative to its position on said focal plane array.

[0010] e. Determination of the distance between said first sensor element and said second sensor element.

[0011] f. Determination of a transformation function, which assigns to each said distance the appropriate elevation angle value, said transformation function being compatible with the design of the omni-directional imaging system.

[0012] g. Extraction of the elevation angle value which corresponds to said distance value from said transformation function.
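Purely as a sketch of stages e. through g. (the transformation function itself is design-specific, so a hypothetical linear one is assumed here for illustration; names are illustrative):

    import numpy as np

    def elevation_angle(first_rc, center_rc, transform):
        # Stage e: radial distance, in pixels, between the detecting
        # sensor element and the element at the image center.
        r = np.hypot(first_rc[0] - center_rc[0],
                     first_rc[1] - center_rc[1])
        # Stage g: look up the elevation angle that the design-specific
        # transformation function assigns to that distance.
        return transform(r)

    # Stage f, hypothetical example only: a linear mapping in which a
    # radius of 0 px corresponds to -20 degrees and 240 px to +30 degrees.
    linear_transform = lambda r: -20.0 + r * (50.0 / 240.0)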
[0013] Preferably, said omni-directional lens assembly, which is a part of the omni-directional imaging system, comprises reflective lenses, which create a reflection of the omni-directional field of view towards said focal plane array.

[0014] Said method may further incorporate placement of an optical filter anywhere along the optical path of light rays that are captured by said omni-directional imaging system. Said optical filter is selected to ensure filtration of specific wavelengths.

[0015] Said object that is detected by said omni-directional imaging system may be a radiation source. Said radiation source may emit in the visible or invisible spectrum.

[0016] Preferably, detection of said object or said radiation source on said focal plane array is accomplished by software processing of the image that is captured by said focal plane array.

[0017] Alternatively, detection of said object on said focal plane array is accomplished by employment of an electronic circuit, which is connected to said focal plane array.

[0018] Preferably, said electronic circuit is designed to detect charge changes on said focal plane array and register the coordinates of the sensor elements on which changes have been detected.
[0019] A second embodiment of the current invention provides a method for determining the azimuth angle of an object imaged by a focal plane array sensor. Said focal plane array sensor is part of a focal plane array that images an omni-directional field of view. Said method comprises the following stages (a minimal code sketch of stages e. through h. follows the list):

[0020] a. Imaging a cylindrical field of view using an omni-directional imaging system which comprises an omni-directional lens assembly and a focal plane array.

[0021] b. Detection of an object imaged by a first sensor element on said focal plane array.

[0022] c. Registration of the coordinates of said first sensor element relative to its position on said focal plane array.

[0023] d. Registration of the coordinates of a second sensor element which occupies the center of the entire image, relative to its position on said focal plane array.

[0024] e. Determination of the distance between said first sensor element and said second sensor element.

[0025] f. Superposition of a virtual two-dimensional coordinate system upon said focal plane array, in such a way that the origin of said coordinate system coincides with said second sensor element.

[0026] g. Alignment of one of the axes of said coordinate system with true north.

[0027] h. Determination of the angle between the line connecting said first sensor element with said second sensor element and the axis aligned with true north, said angle being the azimuth angle.
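Again purely as a sketch of stages e. through h. (Python; image rows are assumed to grow downward, and north_offset_deg stands in for stage g's rotation of the virtual coordinate system toward true north):

    import math

    def azimuth_angle(first_rc, center_rc, north_offset_deg=0.0):
        # Vector from the center element (stage f's origin) to the
        # detecting element, converted to an "up is north" convention.
        dy = -(first_rc[0] - center_rc[0])   # rows grow downward
        dx = first_rc[1] - center_rc[1]
        # Stage h: angle measured clockwise from the north-aligned axis.
        az = math.degrees(math.atan2(dx, dy))
        return (az + north_offset_deg) % 360.0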
[0028] Preferably, said omni-directional lens assembly, which is a part of the omni-directional imaging system, comprises reflective lenses, which create a reflection of the omni-directional field of view towards said focal plane array.

[0029] Said method may further incorporate placement of an optical filter anywhere along the optical path of light rays that are captured by said omni-directional imaging system. Said optical filter is selected to ensure filtration of specific wavelengths.

[0030] Said object that is detected by said omni-directional imaging system may be a radiation source. Said radiation source may emit in the visible or invisible spectrum.

[0031] Preferably, detection of said object or said radiation source on said focal plane array is accomplished by software processing of the image that is captured by said focal plane array.

[0032] Alternatively, detection of said object on said focal plane array is accomplished by employment of an electronic circuit, which is connected to said focal plane array.

[0033] Preferably, said electronic circuit is designed to detect charge changes on said focal plane array and register the coordinates of the sensor elements on which changes have been detected.
[0034] The embodiments as described hereby enable determination of the azimuth and elevation angles of objects or radiation sources, in the visible or invisible spectrum, which are located in a cylindrical field of view, reflected towards a focal plane array by a lens assembly comprising a reflective lens or a plurality of lenses, and detected on the focal plane array.
BRIEF DESCRIPTION OF THE DRAWINGS
[0035] For a better understanding of the invention and to show how
the same may be carried into effect, reference will now be made,
purely by way of example, to the accompanying drawings. With
specific reference now to the drawings in detail, it is stressed
that the particulars shown are by way of example and for purposes
of illustrative discussion of preferred embodiments of the present
invention only, and are presented in the cause of providing what is
believed to be the most useful and readily understood description
of the principles and conceptual aspects of the invention. In this
regard, no attempt is made to show structural details of the
invention in more detail than is necessary for a fundamental
understanding of the invention, the description taken with the
drawings making apparent to those skilled in the art how the
several forms of the invention may be embodied in practice. In the
accompanying drawings:
[0036] FIG. 1 is a schematic description of an imaging device which
provides a cylindrical field of view, and of the optical path of a
light beam traveling within the imaging device.
[0037] FIG. 2 is a schematic description of an imaging device which
provides a nearly spherical field of view and the optical path of a
light beam traveling within the imaging device.
[0038] FIG. 3 is a brief schematic description of prior art, used
to determine azimuth and elevation angles of an object imaged by a
narrow-angle imaging system.
[0039] FIG. 4 is a schematic description of the unique shape of the
image as acquired on the focal plane array of the imaging
device.
[0040] FIG. 5 is a schematic description of a method for
determination of the azimuth and elevation angles of the object or
radiation source that is imaged.
DETAILED DESCRIPTION
[0041] The preferred embodiments of the current invention provide
methods for determining the azimuth and elevation angles of a
radiation source or object located in a cylindrical field of view
and imaged by a Focal Plane Array (FPA) of an omni-directional
imaging device. The following detailed description will refer, in
brief, to the structure of a few omni-directional imaging devices.
It is stressed that, although only several forms of structure are demonstrated, the method described hereby for determining the azimuth and elevation angles of an object imaged by these systems is applicable to many other forms and structures of omni-directional imaging devices that use reflective surfaces. Therefore the
incorporation of figures and references to specific models of
omni-directional imaging devices is done purely by way of example,
and should not be considered as limiting the extent of this
invention.
[0042] FIG. 1 demonstrates detection of radiation (1), originating
at a radiation source (2). The radiation (1) is reflected from an
omni-directional mirror assembly (3) towards a focusing lens (4),
an optical filter (5) and a Focal Plane Array (6). Said
omni-directional mirror assembly (3) contains one or more
reflective surfaces and is designed to enable a panoramic field of
view. It is stressed that alternative designs are possible for
panoramic lens assemblies. Each such design may enable a full
panoramic view at different elevation and depression angles, and
specific designs can be determined according to the desired
applications and needs. It is further stressed that the optical filter may be matched to wavelengths of radiation of interest. The optical filter may be employed anywhere along the optical path of the radiation, as long as it is positioned before the Focal Plane Array. The radiation (1) is detected by one or more sensor elements (7) on the Focal Plane Array (6), for example by one or several pixels on a Charge-Coupled Device (CCD). The actual detection of a light beam may be done by employment of an electronic circuit, connected to the Focal Plane Array and designed to detect charge changes, or by means of software that examines or processes the output image.
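The software alternative reduces to a very small sketch: compare the current readout against a reference frame and report the coordinates of every sensor element whose value changed by more than some threshold (the threshold value below is illustrative, not taken from the patent):

    import numpy as np

    def detect_changed_elements(frame, reference, threshold=50):
        # Flag sensor elements whose value changed by more than
        # `threshold` and return their (row, column) coordinates.
        diff = np.abs(frame.astype(np.int32) - reference.astype(np.int32))
        return np.argwhere(diff > threshold)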
[0043] FIG. 2 demonstrates detection of radiation (8) originating
at a first radiation source (9) and radiation (10) originating at a
second radiation source (11). The figure demonstrates an
omni-directional lens assembly (12) which provides a nearly
spherical field of view. By using this kind of lens assembly, it is
possible to detect radiation sources or objects located within a
cylindrical field of view around the imaging device, as well as
radiation sources or objects located above the imaging device. The
radiation (8) originating at the first radiation source (9) is
reflected inside the lens assembly (12) and towards a focusing lens
(13), an optical filter (14) and a Focal Plane Array (15) and is
detected by a sensor element or a group of sensor elements (16) on
the Focal Plane Array (15). The radiation (10) originating at the second radiation source (11) penetrates the lens assembly (12) from above, passes through the lens assembly (12) and the focusing lens (13), is filtered by the optical filter (14), and is detected by a sensor element or a group of sensor elements (17) on the Focal Plane Array (15).
[0044] FIG. 3 is a schematic description of prior art, by which
determination of azimuth and elevation angles is made. This figure
refers to imaging systems which enable a conventional, narrow-angle
field of view. A scene (18) is imaged by a Focal Plane Array (19).
It is stressed that the Focal Plane Array (19) is part of an entire imaging system; however, in order to simplify the explanation, reference is made only to the Focal Plane Array (19). The image
produced by the Focal Plane Array (19) is that of a relatively
narrow field of view. It is assumed that the size, in terms of
angles, of the field of view covered by the imaging device, is
known and that the number of sensor elements per line and per
column on the Focal Plane Array is also known. Given this
information, it is easy to determine how many sensor elements per
column cover a single degree at elevation and how many sensor
elements per line cover a single degree in azimuth. Each sensor
element on the Focal Plane Array (19) is assigned a coordinate
which specifies its line number and column number. A point (20) in the scene is selected, with respect to which the center (21) of the Focal Plane Array is neither elevated, nor depressed, nor shifted in azimuth. An object (22) in the scene appears on a sensor element
(23) on the Focal Plane Array. Elevation and azimuth angles of the
object (22) need to be determined. Since the coordinates of the
sensor element (23) that images the object (22) are known, and the coordinates of the sensor element which coincides with the center (21) of the Focal Plane Array (19) are also known, it is easy to determine
the distance of the sensor element (23) from the sensor element
that coincides with the center (21) on the Focal Plane Array (19).
It is also known how many sensor elements per line and how many
sensor elements per column cover a degree in space. All this
information is easily used to determine the azimuth and elevation
angles of the object (22). This well-known method, commonly used in prior art, is not applicable when imaging a full panoramic field of view, since such imaging devices incorporate reflective surfaces, which cause reflections, and sometimes double reflections, of the scene, and distortions in ways other than in conventional imaging.
The irregular reflection of the scene causes the image acquired by
the focal plane array to have a unique shape, as illustrated
below.
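The prior-art computation of FIG. 3 amounts to simple per-pixel scaling, sketched here under the stated assumptions (known field of view, known array dimensions; names are illustrative):

    def narrow_fov_angles(pixel_rc, center_rc,
                          fov_h_deg, fov_v_deg, n_cols, n_rows):
        # Degrees of scene covered by one column and by one line.
        deg_per_col = fov_h_deg / n_cols
        deg_per_row = fov_v_deg / n_rows
        azimuth = (pixel_rc[1] - center_rc[1]) * deg_per_col
        elevation = -(pixel_rc[0] - center_rc[0]) * deg_per_row  # rows grow down
        return azimuth, elevation

This per-pixel scaling is exactly what breaks down for the reflective omni-directional image described next.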
[0045] FIG. 4 is a schematic description of the shape of the image
created on a Focal Plane Array, when using an omni-directional
imaging system, such as those demonstrated in FIGS. 1 and 2. In
this figure, a circular image (24) is acquired by the Focal Plane
Array (25). Those skilled in the art of omni-directional imaging
would appreciate that the circular image (24) actually consists of
an outer circle (26) and an inner circle (27). When imaging a
cylindrical field of view, the outer circle (26) will image the
cylindrical field of view and the inner circle (27) will image a
reflection of the lens that is inside the imaging system. When
imaging a nearly spherical field of view, the outer circle (26)
will image the cylindrical field of view from around the imaging
device, whereas the inner circle (27) will image the field of view
above the imaging device.
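When processing such an image in software, the two sectors can be separated with a simple annular mask; the sketch below assumes the circle center and the two radii are known from the particular lens design (all names are illustrative):

    import numpy as np

    def outer_circle_mask(shape, center_rc, r_inner, r_outer):
        # True for pixels in the annulus between the inner circle (27)
        # and the outer rim of the circular image (24).
        rows, cols = np.ogrid[:shape[0], :shape[1]]
        r2 = (rows - center_rc[0]) ** 2 + (cols - center_rc[1]) ** 2
        return (r2 >= r_inner ** 2) & (r2 <= r_outer ** 2)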
[0046] FIG. 5 illustrates the manner in which determination of
azimuth and elevation angles is made when using an omni-directional
imaging system. This demonstration applies to objects located
within the cylindrical field of view, imaged as the outer circle
(26) on the focal plane array (25). In this figure, a sensor
element (28) of the Focal Plane Array (25) images a radiation
source or object located somewhere within an omni-directional
scene. For the purpose of illustration only, it is assumed that the Focal Plane Array (25) is rectangular in shape and that the circular image (24) is located exactly at the center of the Focal Plane Array. The center (29) of the circular image (24) is determined, and a virtual two-dimensional coordinate system, having an "X" axis (30) and a "Y" axis (31), is imposed on the image, its origin coinciding with the center (29) of the circular image. The virtual coordinate system is rotated so that the "X" axis (30) is aligned with true north. Each sensor element on the Focal Plane Array (25) is assigned a coordinate specifying its line number and column number.
[0047] To determine the azimuth angle of an object or radiation
source that is imaged by a sensor (28) of the Focal Plane Array
(25):
[0048] A virtual line (32) is formed, which connects the sensor
element coinciding with the center of the circular image (29) with
the sensor element (28) which images the object of interest. Given
the coordinates of the two said sensor elements, and by using conventional trigonometry, the angle (33) between that line and any of the axes can be determined; the angle measured clockwise from the axis aligned with true north is the azimuth angle.
[0049] To determine the elevation angle of an object or radiation
source that is imaged by a sensor element (28) on the Focal Plane
Array (25):
[0050] A virtual line (32) is formed, which connects the sensor
element coinciding with the center of the circular image (29) with
the sensor element (28) which images the object of interest. Given
the coordinates of the two said sensors, it is easy to determine
the length of the virtual line (32) that connects them. The length of the virtual line (32) is then used as the input to a transformation function. The transformation function assigns to each "length" value a corresponding elevation angle. The transformation function is determined according to the specific design and parameters of the omni-directional lens assembly and the layout of the imaging system.
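Although the patent leaves the transformation function to the optical design, in practice such a function could also be realized empirically; the sketch below (an assumption offered for illustration, not the patent's method) interpolates over a hypothetical calibration table of radii at which targets of known elevation were imaged:

    import numpy as np

    # Hypothetical calibration data: radial distance (pixels) versus
    # known elevation angle (degrees) of calibration targets.
    radii_px = np.array([40.0, 90.0, 140.0, 190.0, 240.0])
    elev_deg = np.array([-20.0, -5.0, 5.0, 18.0, 30.0])

    def transform(r):
        # Piecewise-linear interpolation standing in for the
        # design-derived transformation function.
        return np.interp(r, radii_px, elev_deg)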
[0051] Those skilled in the art would appreciate that the
transformation function is a product of the detailed optical design
of the lens assembly. Since this invention does not refer to
optical design parameters, and is not intended to serve as a guide
in the process of optical design, no further reference is made to
the transformation function. It is stressed however, that although
the transformation function is needed for proper determination of
elevation angles, this function varies according to the specific
design of the lens assembly, and is considered as given information
to those skilled in the art of optical design.
[0052] It is further important to notice that the transformation
function should produce different values according to the position
of the imaging system itself. More explicitly, if the imaging
system itself is tilted (in elevation or in azimuth), the tilt angle is needed in order to produce a true result regarding the positions of objects that appear in the image.
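One conventional way to apply such a correction (offered here only as a sketch, with illustrative axis conventions) is to rotate the measured line-of-sight vector from the tilted sensor frame back into a level frame, using pitch and roll angles supplied by, for example, an inclinometer on the mount:

    import numpy as np

    def level_sight_vector(v_sensor, pitch_deg, roll_deg):
        # Rotation matrices about the x (pitch) and y (roll) axes;
        # v_sensor is the unit sight vector in the sensor frame.
        p, r = np.radians(pitch_deg), np.radians(roll_deg)
        Rx = np.array([[1, 0, 0],
                       [0, np.cos(p), -np.sin(p)],
                       [0, np.sin(p),  np.cos(p)]])
        Ry = np.array([[ np.cos(r), 0, np.sin(r)],
                       [0, 1, 0],
                       [-np.sin(r), 0, np.cos(r)]])
        return Ry @ Rx @ v_sensor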
[0053] Referring to the current invention in general, it is
stressed that although reference was made to several kinds of
omni-directional imaging systems, including both cylindrical field
of view imaging devices and nearly spherical field of view imaging
devices, the azimuth and elevation measurement methods described
hereby refer only to objects appearing in the field of view
acquired by the focal plane array after reflection, which is the
cylindrical field of view. It is important to note that the nearly spherical field of view imaging device produces two different image sectors on the FPA. One image sector, referred to in FIG. 5
as the outer circle (26) comprises the cylindrical field of view
which is generated after reflection. The other image sector,
referred to as the inner circle (27) comprises a landscape from
above the imaging system, which is imposed as direct light through
optical lenses and not as reflections from reflective surfaces.
Therefore, when implementing this method, it should be noted that the implementation is performed only on image sectors that are acquired after reflection, normally by a round mirror of axisymmetric shape.
* * * * *