U.S. patent application number 09/834403 was published by the patent office on 2001-10-18 for a target detection system using radar and image processing.
The invention is credited to Ohta, Akihiro and Oka, Kenji.
Application Number: 09/834403
Publication Number: 20010031068
Family ID: 26590424
Publication Date: 2001-10-18

United States Patent Application 20010031068
Kind Code: A1
Ohta, Akihiro; et al.
October 18, 2001
Target detection system using radar and image processing
Abstract
A target detection system using an EHF radar and image processing is
disclosed, in which the processing time is shortened and the
reliability improved by mutually complementing the disadvantages of
the EHF radar and the image processing. The system comprises a radar,
an image acquisition unit and an image processing ECU. The
microcomputer of the ECU specifies an image recognition area based on
the power output from the radar, and carries out the image processing,
for the image obtained from the image acquisition unit, only within
the specified recognition area. By performing the image processing
only for the area where a target is detected by the radar, the time
required for image processing is shortened on the one hand, and the
erroneous detection of letters on the road surface or the like is
eliminated on the other.
Inventors: Ohta, Akihiro (Kobe-shi, JP); Oka, Kenji (Kobe-shi, JP)

Correspondence Address:
CHRISTIE, PARKER & HALE, LLP
350 WEST COLORADO BOULEVARD, SUITE 500
PASADENA, CA 91105, US

Family ID: 26590424
Appl. No.: 09/834403
Filed: April 13, 2001
Current U.S. Class: 382/103; 382/104; 382/199

Current CPC Class: G01S 7/41 20130101; G01S 11/12 20130101; G06K 9/3233 20130101; G06T 7/97 20170101; G01C 3/08 20130101; G01S 13/931 20130101; G06V 10/25 20220101; G01S 13/867 20130101; G01C 11/06 20130101

Class at Publication: 382/103; 382/104; 382/199

International Class: G06K 009/00; G06K 009/48
Foreign Application Data

Date | Code | Application Number
Apr 14, 2000 | JP | 2000-118549
May 18, 2000 | JP | 2000-152695
Claims
1. A target detection system comprising: a radar; an image
acquisition unit; and a processing unit for specifying an area of
image recognition based on the data output from said radar and
processing the image data output from said image acquisition unit
only for said specified area.
2. A target detection system comprising: a radar for scanning a
specified area and outputting a signal of power corresponding to an
object scanned; an image acquisition unit for acquiring an image of
said specified area; and a processing unit for specifying an area
of image recognition based on the power of the signal output from
said radar, extracting the edge data from the image data output
from said image acquisition unit only for said specified area, and
detecting a target based on said edge data.
3. A target detection system according to claim 2, wherein said
processing unit specifies an area having said power not less than a
predetermined level as said image recognition area.
4. A target detection system according to claim 2, wherein said
processing unit specifies an area having said power between a first
predetermined level and a second predetermined level as said image
recognition area.
5. A target detection system according to claim 2, wherein, in the
case where the level of said power has a plurality of peaks, said
processing unit specifies an area constituting a valley between the
peaks as said image recognition area.
6. A target detection system according to claim 2, wherein said
processing unit specifies as said image recognition area an area having
said power not less than a predetermined level, an area having said
power between a first predetermined level and a second
predetermined level, and an area constituting a valley between
peaks in the case where the level of said power has a plurality of
peaks.
7. A target detection system according to claim 2, wherein said
processing unit extracts the peak position of the power of the
signal output from said radar, checks the density change, on the
left and right sides of said peak position, of the image data
output from said image acquisition unit, extracts the position
where the density change ceases to be laterally symmetric, and
specifies an area having a predetermined width about said
extraction position as said image recognition area.
8. A target detection system according to claim 2, wherein said
processing unit extracts the peak position of the power of the
signal output from said radar, examines the density projection
value, on the left and right sides of said peak position, of the
image data output from said image acquisition unit, extracts the
position where the density projection value ceases to be laterally
symmetric, and specifies an area having a predetermined width about
said extraction position as said image recognition area.
9. A target detection system comprising: a radar for scanning a
specified area and outputting a signal of power corresponding to an
object scanned; an image acquisition unit for acquiring an image of
said specified area; and a processing unit for extracting the edge
data from the image data output from said image acquisition unit,
specifying an area of image recognition based on the power of the
signal output from said radar, and detecting a target using those
of said extracted edge data existing in said specified area.
10. A target detection system according to claim 9, wherein said
processing unit specifies an area having said power not less than a
predetermined level as said image recognition area.
11. A target detection system according to claim 9, wherein said
processing unit specifies an area having said power between a first
predetermined level and a second predetermined level as said image
recognition area.
12. A target detection system according to claim 9, wherein, in the
case where the level of said power has a plurality of peaks, said
processing unit specifies an area constituting a valley between the
peaks as said image recognition area.
13. A target detection system according to claim 9, wherein said
processing unit specifies, as said image recognition area, an area
having said power not less than a predetermined level, an area
having said power between a first predetermined level and a second
predetermined level, and an area constituting a valley between
peaks in the case where the level of said power has a plurality of
the peaks.
14. A target detection system comprising: a radar for outputting a
near flag in the state corresponding to the distance upon
determination that a target exists in a near area; an image
acquisition unit for acquiring the image of a specified area; an
image recognition unit for outputting edge data by processing the
image data output from said image acquisition unit; and a
processing unit for determining the state of the near flag output
from said radar and detecting a target based on the edge data in
the distance range corresponding to the state of said near flag
among the edge data output from said image recognition unit.
15. A target detection system comprising: a radar for outputting a
near flag in the state corresponding to the distance upon
determination that a target exists in a near area; an image
acquisition unit for acquiring the image of a specified area; an
image recognition unit for acquiring edge data by processing the
image data output from said image acquisition unit and, among said
edge data, removing and outputting the edge data in the distance
range corresponding to the state of the near flag output from a
processing unit; and a processing unit for determining the state of
the near flag output from said radar, outputting the result thereof
to said image recognition unit, and detecting a target based on the
edge data output from said image recognition unit.
16. A target detection system comprising: a radar for outputting a
near flag in the state corresponding to the distance upon
determination that a target exists in a near area; an image
acquisition unit for acquiring the image of a specified area; an
image recognition unit for acquiring edge data by processing the
image data output from said image acquisition unit and, among said
edge data, performing the pattern matching processing in priority
for the edge data within the distance range corresponding to the
state of the near flag output from a processing unit, and
outputting said edge data; and a processing unit for determining
the state of the near flag output from said radar, outputting the
result thereof to said image recognition unit, and detecting a
target based on the edge data output from said image recognition
unit.
17. A target detection system comprising: a radar for outputting a
near flag in the state corresponding to the distance upon
determination that a target exists in a near area; an image
acquisition unit for acquiring the image of a specified area; an
image recognition unit for acquiring edge data by processing the
image data output from said image acquisition unit and, among said
edge data, outputting by attaching the road surface flag to the
edge data identified as an edge of the road surface from the height
information and the distance information obtained from said image
data; and a processing unit for determining the distance range
corresponding to the state of the near flag output from said radar,
invalidating the edge data output from said image recognition unit
and having said road surface flag attached thereto, in the case
where the distance information of said edge data with said road
surface flag attached thereto indicates a near distance and said
determined distance range indicates a far distance, and performing
said target detection process based on the remaining edge data.
18. A target detection system comprising: a radar for outputting
distance information; an image acquisition unit for acquiring the
image of a specified area; an image recognition unit for acquiring
edge data by processing the image data output from said image
acquisition unit and, among said edge data, outputting by attaching
the road surface flag to the edge data identified as an edge of the
road surface based on the height information and the distance
information obtained from said image data; and a processing unit
for determining whether the distance information of the edge data
output from said image recognition unit and having said road
surface flag attached thereto is within the allowable error range
of the distance information acquired from said radar, in the case
where said edge data with said road surface flag attached thereto
indicates a far distance, and invalidating said edge data and
performing the target detection process based on the remaining edge
data in the case where said distance information is not within said
allowable error range.
19. A target detection system comprising: a radar for outputting a
near flag in the state corresponding to the distance upon
determination that a target exists in a near area; an image
acquisition unit for acquiring the image of a specified area; an
image recognition unit for acquiring edge data by processing the
image data output from said image acquisition unit and, among said
edge data, outputting by attaching a letter flag to the edge data
identified as a letter on the road surface based on the density
information obtained from said image data; and a processing unit
for determining the distance range corresponding to the state of
the near flag output from said radar, invalidating the edge data
output from said image recognition unit and having said letter flag
attached thereto in the case where the distance information of the
edge data with said letter flag attached thereto indicates a near
distance and said determined distance range indicates a far
distance, and performing said target detection process based on the
remaining edge data.
20. A target detection system comprising: a radar for outputting
distance information; an image acquisition unit for acquiring the
image of a specified area; an image recognition unit for acquiring
edge data by processing the image data output from said image
acquisition unit and, among said edge data, outputting by attaching
a letter flag to the edge data identified as a letter on the road
surface based on the density information obtained from said image
data; and a processing unit for determining whether the distance
information of the edge data output from said image recognition
unit and having said letter flag attached thereto is within the
allowable error range of the distance information acquired from
said radar, in the case where said distance information of the edge
data with said letter flag attached thereto indicates a far
distance, and invalidating said edge data and performing the target
detection process based on the remaining edge data in the case
where said distance information is not within said allowable
range.
21. A target detection system comprising: a radar for outputting a
near flag in the state corresponding to the distance upon
determination that a target exists in a near area; an image
acquisition unit for acquiring the image of a specified area; an
image recognition unit for acquiring edge data by processing the
image data output from said image acquisition unit and, among said
edge data, outputting by attaching a road surface flag to the edge
data identified as an edge of the road surface based on the height
information and the distance information obtained from said image
data, and also by attaching a letter flag to the edge data
identified as a letter on the road surface based on the density
information; and a processing unit for determining the distance
range corresponding to the state of the near flag output from said
radar, and invalidating the edge data output from said image
recognition unit and having at least one of said road surface flag
and said letter flag attached thereto, in the case where the
distance information of said edge data having at least one of said
road surface flag and said letter flag attached thereto indicates a
near distance and said determined distance range indicates a far
distance, and performing said target detection process based on the
remaining edge data.
22. A target detection system comprising: a radar for outputting a
near flag in the state corresponding to the distance upon
determination that a target exists in a near area; an image
acquisition unit for acquiring the image of a specified area; an
image recognition unit for acquiring edge data by processing the
image data output from said image acquisition unit and, among said
edge data, outputting by attaching a road surface flag to the edge
data identified as an edge of the road surface based on the height
information and the distance information obtained from said image
data and also by attaching a letter flag to the edge data
identified as a letter on the road surface based on the density
information; and a processing unit for determining whether the
distance information of the edge data output from said image
recognition unit and having at least one of said road surface flag
and said letter flag attached thereto is within the allowable error
range of the distance data obtained from said radar, in the case
where the distance information of said edge data having at least
one of said road surface flag and said letter flag attached thereto
indicates a far distance, and invalidating said edge data having at
least one of said road surface flag and said letter flag attached
thereto and performing said target detection process based on the
remaining edge data, in the case where said distance information is
not within said allowable error range.
Description
BACKGROUND OF THE INVENTION
[0001] 1. Field of the Invention
[0002] The present invention relates to a target detection system.
The target detection system according to this invention is mounted
on an automotive vehicle, for example, and is used to aid the
driver by detecting a target located ahead of the vehicle, such as
a preceding vehicle running ahead or an obstacle lying on the road.
[0003] 2. Description of the Related Art
[0004] A conventional target detection system uses a fusion
algorithm in which the reliability of the data is evaluated using
both the result of detecting a target with an EHF radar and the
result of detecting a target by image processing, thereby achieving
the optimal result. Such a target detection system comprises an EHF
radar, a left camera, a right camera and an image processing ECU
(electronic control unit). The ECU includes an image processing
microcomputer and a fusion processing microcomputer.
[0005] In the processing method using the EHF radar, a specified
area ahead is scanned with an extremely high-frequency (EHF) wave. The
strength of the signal power output from the EHF radar and the
range are so related to each other that the signal power strength
is high for the portion where a target exists and low for the
portion where a target does not exist. The EHF radar can measure a
far distance with high accuracy but is low in accuracy for the
measurement of a near target. Also, the EHF radar outputs a near
flag upon detection of a near target.
[0006] The image processing microcomputer extracts the edge of each
of the two images acquired by the two cameras. The edge positions
of the two images are different due to the parallax between the
left and right cameras, and this difference is used to
calculate the distance to the target. Image processing can be used
to measure the distance over a wide range but is low in accuracy
for detection of a far target.
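The stereo ranging described above can be sketched as follows. This is a minimal illustration of triangulation from the disparity between edge positions in the two images; the function name, the focal length and the camera baseline are assumed values for illustration, not parameters given in the application.

```python
# Hypothetical sketch of stereo ranging: the distance to a target is
# computed from the disparity (difference in horizontal edge position)
# between the left and right camera images. Focal length and baseline
# are illustrative assumptions.

def distance_from_disparity(x_left, x_right, focal_length_px, baseline_m):
    """Distance Z = f * B / d, where d is the horizontal disparity in pixels."""
    disparity = x_left - x_right  # parallax between the two edge positions
    if disparity <= 0:
        return None  # no valid disparity: target at infinity or mismatched edges
    return focal_length_px * baseline_m / disparity

# Example: an edge at pixel 320 in the left image and 310 in the right,
# with an assumed 700-pixel focal length and a 0.12 m camera baseline.
print(round(distance_from_disparity(320, 310, 700.0, 0.12), 2))  # 8.4
```

As the formula shows, the disparity shrinks as the target recedes, which is why image processing is described as low in accuracy for a far target.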
[0007] The distance measurement by image processing further has the
following problems.
[0008] 1. (Erroneous recognition) Because the edge extraction
processing simply extracts edges from an image, the edges of
letters written on the road surface, shadows and other
non-three-dimensional objects different from the target may be
extracted erroneously due to their density difference. In such a
case, edges are output in spite of the absence of a target.
[0009] 2. (Erroneous distance measurement) In the case where an
edge is detected by the edge extraction processing, the distance is
measured by pattern matching between the images acquired by the two
cameras. In this processing, the result may become erroneous in the
case where a similar pattern happens to exist.
[0010] FIG. 1 shows detection areas defined for the target
detection system.
[0011] An area 2 in which a target can be detected by image
processing has a large range, while an area 3 where a target can be
detected by an EHF radar reaches a far distance. In an area 4 where
a target can be detected by using both the image processing and the
EHF radar, on the other hand, a target can be recognized very
reliably by the fusion processing between the output data of the
radar and the output data of the image processing. The area 4 is
called the fusion area. The microcomputer for fusion processing
determines the presence or absence of a target based on an overall
decision on both the result of detection by the EHF radar and the
result of detection by the image processing microcomputer, and thus
recognizes the presence of a target such as a preceding vehicle and
calculates the distance, etc.
[0012] In the conventional target detection system, the processing
time for the fusion algorithm in the fusion processing
microcomputer is required in addition to the processing time for
the EHF radar and the processing time for the image processing
microcomputer, and therefore the total processing time is long.
Also, the conventional target detection system has yet to
sufficiently overcome the disadvantages of both the EHF radar and
image processing.
SUMMARY OF THE INVENTION
[0013] An object of the present invention is to provide a target
detection system using the EHF radar and image processing, wherein
the processing time for detection is shortened while the
disadvantages of the EHF radar and of image processing are mutually
compensated.
[0014] Another object of the invention is to provide a target
detection system using both the EHF radar and the image processing,
wherein the reliability is improved by preventing erroneous
recognition and erroneous distance measurement in image recognition
processing.
[0015] The present invention has been developed in order to achieve
the objects described above. According to one aspect of the
invention, there is provided a target detection system comprising a
radar, an image acquisition unit and a processing unit. The
processing unit specifies an area for image recognition based on
the data output from the radar, and processes image data output
from the image acquisition unit only within the specified area
thereby to detect a target. According to this invention, objects
other than three-dimensional ones, including lines or letters on
the road surface are not detected by the radar as a target, and
therefore lines, letters and other auxiliary objects are not
detected as a target by image processing. Also, the image data are
processed only for an area where a target such as an obstacle or a
vehicle is detected by the radar, and therefore the time required
for processing the image data is shortened thereby shortening the
processing time, as a whole, for target detection.
[0016] In the target detection system according to this invention,
the image recognition area can be specified based on the power of
the signal output from the radar. Upon detection of a target such
as an obstacle or a vehicle, the radar outputs a signal of
predetermined power. A target is extracted only from an area having
such a target by extracting the edge of the image data only for the
particular area from which the signal power is detected. As a
result, the time required for image data processing can be
shortened. Alternatively, all the edges may be extracted from the
image data and only the edges existing in the image recognition
area may be processed as effective edges for target detection. In
such a case, the time required for image processing is not
shortened but the time required for fusion processing can be
shortened.
[0017] In the target detection system according to this invention,
the image recognition area can be determined based on the state of
the near flag output from the radar. Upon detection of a near
target, the radar outputs a near flag, the state of which changes
with the distance to the target. In the processing for target
detection, the edge data acquired in the image processing is
selected in accordance with the presence or absence and the state
of the near flag, and therefore the recognition error and the
distance measurement error of the target can be prevented before
the fusion processing.
[0018] Further, in the target detection system according to this
invention, a road surface flag and a letter flag can be attached to
the edge data extracted by image processing in the case where a
density difference on the image due to lines or letters on the road
surface is detected. For the edge data with the road surface flag
or the letter flag, it is determined whether the edge data
including the particular road surface flag or the letter flag
actually represents lines or characters written on the road
surface. In the case where the edge data are found to represent
lines or letters, the data in the particular area is invalidated.
As a result, the recognition error in which lines or characters on
the road surface are recognized as a target and the measurement
error can be prevented before the fusion processing.
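The flag-based filtering described in paragraph [0018] can be sketched as below. The record fields, flag names and the "near"/"far" encoding are assumptions for illustration; the application does not prescribe a data layout.

```python
# Illustrative sketch of flag-based edge invalidation: edge data carrying
# a road-surface or letter flag is invalidated when its distance reading
# ("near") contradicts the radar's determination that the target is far.
# Field names and flag names are assumed for this sketch.

def filter_edges(edges, radar_indicates_far):
    """Drop flagged near-distance edges when the radar reports a far target."""
    kept = []
    for e in edges:
        flagged = e.get("road_surface_flag") or e.get("letter_flag")
        if flagged and e["distance"] == "near" and radar_indicates_far:
            continue  # likely a line or letter painted on the road: invalidate
        kept.append(e)
    return kept

edges = [
    {"id": 1, "distance": "near", "letter_flag": True},        # painted letter
    {"id": 2, "distance": "far", "road_surface_flag": False},  # real target
]
print([e["id"] for e in filter_edges(edges, radar_indicates_far=True)])  # [2]
```

Target detection then proceeds on the remaining edge data, which is how recognition and distance-measurement errors are removed before fusion processing.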
BRIEF DESCRIPTION OF THE DRAWINGS
[0019] The above object and features of the present invention will
be more apparent from the following description of the preferred
embodiment with reference to the accompanying drawings,
wherein:
[0020] FIG. 1 shows different detection areas defined for a target
detection system mounted on an automotive vehicle;
[0021] FIG. 2 shows a basic configuration of a target detection
system;
[0022] FIG. 3 shows an image picked up as the condition ahead of
the target detection system;
[0023] FIG. 4 is a diagram showing the edges calculated by the
image processing in the system of FIG. 2;
[0024] FIG. 5 is a diagram showing typical edges extracted by
processing the edges of FIG. 4;
[0025] FIG. 6 is a diagram showing the data output from the EHF
radar of FIG. 2;
[0026] FIG. 7 is a vehicle target detection system according to a
first embodiment of the invention;
[0027] FIG. 8 is a flowchart showing the processing steps of the
microcomputer of FIG. 7;
[0028] FIG. 9 shows a first method of determining a search area
from the signal power strength obtained by the EHF radar of FIG.
7;
[0029] FIG. 10 shows edges extracted by the processing shown in
FIG. 8;
[0030] FIG. 11 shows a second method for determining a search area
from the strength of the signal power obtained by the EHF radar
shown in FIG. 7;
[0031] FIG. 12 shows edges extracted by the method shown in FIG.
11;
[0032] FIG. 13 shows a third method for determining a search area
from the strength of the signal power obtained by the EHF radar
shown in FIG. 7;
[0033] FIG. 14 shows an image of a plurality of preceding
vehicles;
[0034] FIG. 15 is a flowchart showing the second processing of the
microcomputer of FIG. 7;
[0035] FIG. 16 shows the strength of the power obtained by the EHF
radar in the processing shown in FIG. 15;
[0036] FIG. 17 shows a first method for determining a search area
in the process of FIG. 15;
[0037] FIG. 18 shows a second method for determining a search area
in the process of FIG. 15;
[0038] FIG. 19 is a flowchart showing the third processing of the
microcomputer of FIG. 7;
[0039] FIG. 20 is a vehicle target detection apparatus according to
a second embodiment of the invention;
[0040] FIGS. 21A to 21E are diagrams for explaining the processing
in the image recognition unit of FIG. 20;
[0041] FIG. 22 shows a first specific circuit configuration of a
target detection system according to a second embodiment;
[0042] FIG. 23 is a flowchart showing the operation of the system
of FIG. 22;
[0043] FIG. 24 shows a second specific circuit configuration of a
target detection system according to the second embodiment;
[0044] FIG. 25 is a flowchart showing the operation of the system
of FIG. 24;
[0045] FIG. 26 shows a third specific circuit configuration of a
target detection system according to the second embodiment;
[0046] FIG. 27 shows a pattern matching area in FIG. 26;
[0047] FIG. 28 is a flowchart showing the operation of the system
of FIG. 26;
[0048] FIG. 29 shows a fourth specific circuit configuration of a
vehicle target detection system according to the second
embodiment;
[0049] FIG. 30 is a flowchart showing the first operation of the
system of FIG. 29; and
[0050] FIG. 31 is a flowchart showing the second operation of the
system of FIG. 29.
DESCRIPTION OF THE PREFERRED EMBODIMENTS
[0051] First, an explanation will be given of the principle of
fusion processing for target detection which is used in a target
detection system according to this invention.
[0052] As shown in FIG. 2, the target detection system comprises an
EHF radar 11, a left camera 12, a right camera 13 and an image
processing ECU 14. The ECU 14 is configured with an image
processing microcomputer 21 and a fusion processing microcomputer
22. The image processing microcomputer 21 detects a target by
processing the image data obtained from the cameras 12, 13.
Specifically, edges are extracted from the images obtained from the
left and right cameras 12, 13, and the parallax is calculated from
the left and right positions of edge extraction thereby to
calculate the distance value. The processing operation of the image
processing microcomputer 21 will be explained with reference to
FIG. 3.
[0053] FIG. 3 shows a picked-up image of the condition ahead of the
target detection system. Ahead of the vehicle, there is a preceding
vehicle 6, lines 7 are drawn on the road surface, and a guard rail
8 is present at a shoulder. FIG. 4 shows the result of calculating
the edges by processing the image of FIG. 3, and FIG. 5 the result
of extracting the edges in the descending order of peak strength by
processing the result of FIG. 4. In FIG. 5, the vertical short
lines represent the edges extracted. The image processing
microcomputer 21 extracts the edges of FIG. 5 from the left and
right cameras 12, 13 and calculates the distance to the target
using the parallax.
[0054] In the processing method of the EHF radar 11, the interior
of a specified area is scanned by the EHF radar and the portion of
the output data having strong power is recognized as a target. FIG.
6 shows the relation between the horizontal position (angle or
range) of the scanned area and the power strength of the data
output from the EHF radar 11. It is seen that the power strength
is high at the portion where the target is present and low
elsewhere.
[0055] The fusion processing microcomputer 22 determines whether a
target is present or not by overall analysis of the detection
result of the EHF radar 11 and the detection result of the image
processing microcomputer 21, and thereby checks the presence of a
target such as a preceding vehicle and calculates the distance to
the preceding vehicle.
Embodiment 1
[0056] FIG. 7 shows a vehicle target detection system according to
a first embodiment of the invention.
[0057] A vehicle target detection system comprises an EHF radar 11,
a left camera 12, a right camera 13 and an image processing ECU 14.
The ECU 14 is configured with a microcomputer 15 having the dual
functions of image processing and fusion processing. Although the
two cameras 12, 13, left and right, are used for measuring the
distance by parallax in image processing, a single camera suffices
in the case where the distance is not measured by parallax.
[0058] Now, the processing in the microcomputer 15 will be
explained.
Embodiment 1-1
[0059] FIG. 8 is a flowchart showing the processing in the
microcomputer 15. The condition ahead of the vehicle is assumed to
be the same as shown in FIG. 3 and described above.
[0060] In FIG. 8, the interior of a specified area is scanned by
the EHF radar 11 in step S1. FIG. 9 shows the result of scanning
obtained from the EHF radar 11. In FIG. 9, the abscissa represents
the horizontal position (angle) of the area scanned, and the
ordinate the power strength in dB. In the case where a preceding
vehicle 6 is present, as shown, the signal power strength is high
at the horizontal position corresponding to the preceding vehicle
6.
[0061] In step S2, an object having a high signal power strength
(not less than P dB) is recognized as a target, and the range (X0
to X1) where the target is present is held as a search area. In the
case shown in FIG. 3, what is detected as a target is the preceding
vehicle 6 alone. Power is not detected from a planar object like
the lines 7 drawn on the road surface.
[0062] In step S3, the angular range (X0 to X1) obtained in step S2
is defined as a search area in the image acquired from the left and
right cameras 12, 13.
[0063] In step S4, edges are extracted in the search area thus
defined. The processing for extracting vertical edges is well known
by those skilled in the art and therefore will not be described
herein.
[0064] In step S5, only edges having a high peak strength are
extracted from among the extracted edges. Unlike FIG. 4 showing the
result of edge extraction over all the images obtained from the
cameras 12, 13, the present embodiment is such that the vertical
edges are extracted only for the search area but not over the whole
image. The result is as shown in FIG. 10, in which the edges are
represented by vertical short lines.
[0065] By extracting the vertical edges only within the specified
search area in this way, the processing time can be shortened as
compared with the case where a target is detected based on the
edges of the whole image. Also, no edges are extracted for the line
7, etc. (FIG. 3), which is not included in the search area of FIG.
9; therefore, lines or letters written on the road surface are not
erroneously detected as a target.
[0066] In step S6, the peaks in the left and right images are
matched, and in step S7, the parallax for the corresponding peaks
is calculated thereby to obtain the distance to the preceding
vehicle 6. The process of steps S6, S7 is well known by those
skilled in the art, and therefore will not be described.
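The distance calculation of step S7 rests on the standard stereo relation Z = f.multidot.B/d. A minimal sketch follows; the names, the focal length and the baseline are illustrative assumptions, not values from the application.

```python
def distance_from_parallax(parallax_px, focal_px, baseline_m):
    """Stereo range from parallax: Z = f * B / d, where f is the focal
    length in pixels, B the camera baseline in meters, and d the
    parallax between corresponding peaks in pixels."""
    if parallax_px <= 0:
        raise ValueError("matched edges must have positive parallax")
    return focal_px * baseline_m / parallax_px

# A 10-pixel parallax with f = 800 px and a 0.2 m baseline:
print(distance_from_parallax(10, 800, 0.2))  # 16.0 (meters)
```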
[0067] In the example described above, the search area is defined
for edge extraction by image processing and the processing time can
be shortened. Also, objects such as lines or letters on the road
surface are not reflected in the signal power of the EHF radar.
Thus, only such objects as an obstacle or a preceding vehicle
can be detected.
Embodiment 1-2
[0068] The detection of the search area in Step S2 of FIG. 8 can be
variously modified.
[0069] FIG. 11 shows a different method of extracting the search
area in step S2. FIG. 12 shows the result of this edge extraction.
In FIG. 11, the range of first level P0 to second level P1 dB of
the signal power strength obtained from the EHF radar 11 is defined
as a predetermined level range of power strength, and the ranges X0
to X1, X2 to X3 in the particular level range are extracted as a
search area. The portion of FIG. 11 where the power strength is
high represents the detection of a target. The range of P0 to P1 dB
where the signal power strength changes sharply represents the edge
position of the target. According to this embodiment, therefore,
the position where the edges can probably be extracted can be
further limited, and therefore, as shown in FIG. 12, only the edges
of the target can be extracted, thereby further shortening the
processing time.
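The level-range extraction of embodiment 1-2 can be sketched as below; all names and the sample values are assumptions for illustration.

```python
def level_range_areas(angles, powers, p0, p1):
    """Extract the contiguous angular runs where the power strength lies
    in the predetermined level range P0 to P1 dB. These runs sit on the
    flanks of a peak, i.e. near the edge positions of the target."""
    areas, start, prev = [], None, None
    for a, p in zip(angles, powers):
        inside = p0 <= p <= p1
        if inside and start is None:
            start = a
        elif not inside and start is not None:
            areas.append((start, prev))
            start = None
        prev = a
    if start is not None:
        areas.append((start, prev))
    return areas

# A single peak: the level range is crossed on the way up and down.
powers = [1, 12, 25, 40, 42, 38, 22, 11, 2]
print(level_range_areas(list(range(9)), powers, 10, 30))  # [(1, 2), (6, 7)]
```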
Embodiment 1-3
[0070] FIG. 13 shows another different method of extracting a
search area in step S2 of FIG. 8. The distribution of the signal
power strength obtained from the EHF radar 11 may be divided into a
plurality of peaks as shown in FIG. 13. This phenomenon often
occurs when two vehicles 6, 9 are running ahead as shown in FIG.
14. In the case where the power distribution is divided into two
peaks as described above, the horizontal positions X0 to X1 of the
valley (the portion where the signal power strength is not more
than P1 dB) are extracted as a search area.
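The valley extraction of embodiment 1-3 can be sketched as follows; the function name, thresholds and sample data are assumptions used only for illustration.

```python
def valley_area(angles, powers, p_peak, p_valley):
    """When the power distribution splits into two peaks (two preceding
    vehicles), hold the valley between them, i.e. the portion where the
    power strength is not more than P1 dB, as the search area."""
    peak_idx = [i for i, p in enumerate(powers) if p >= p_peak]
    if len(peak_idx) < 2:
        return None  # no pair of peaks to bracket a valley
    lo, hi = peak_idx[0], peak_idx[-1]
    valley = [angles[i] for i in range(lo, hi + 1) if powers[i] <= p_valley]
    return (min(valley), max(valley)) if valley else None

# Two peaks with a valley between them:
powers = [5, 40, 45, 12, 8, 11, 44, 41, 6]
print(valley_area(list(range(9)), powers, 35, 15))  # (3, 5)
```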
Embodiment 1-4
[0071] When a vehicle is actually running on a toll road or a
freeway, a single preceding vehicle is rarely present; in almost all
cases a plurality of vehicles are running ahead. The output of the
EHF radar therefore takes various patterns and cannot be matched to
a single pattern uniquely. In view of this, the actual driving
requirement is met by taking the union of all the search areas
described in embodiments 1-1 to 1-3 as the search area.
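Combining the areas from embodiments 1-1 to 1-3 amounts to an interval union, which can be sketched as below (names illustrative only):

```python
def merge_areas(areas):
    """Take the union of the search areas obtained by the methods of
    embodiments 1-1 to 1-3, merging overlapping angular ranges."""
    merged = []
    for x0, x1 in sorted(areas):
        if merged and x0 <= merged[-1][1]:
            merged[-1][1] = max(merged[-1][1], x1)  # extend current range
        else:
            merged.append([x0, x1])
    return [tuple(m) for m in merged]

print(merge_areas([(0, 4), (10, 14), (3, 6)]))  # [(0, 6), (10, 14)]
```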
Embodiment 1-5
[0072] FIG. 15 is a flowchart showing the second process in the
microcomputer 15.
[0073] In step S11, the interior of a specified area is scanned by
the EHF radar 11. FIG. 16 shows the result of scanning obtained
from the EHF radar 11. Assume that the condition ahead of the
vehicle is the same as that shown in FIG. 2.
[0074] In step S12, an object with high signal power strength is
recognized as a target, and the angle (X0) corresponding to the
peak of signal power strength in FIG. 16 is extracted and held.
[0075] In steps S13, S14, a search area is extracted based on the
density change of the image. FIG. 17 shows a density change of the
image obtained from the cameras 12, 13. This density change
represents the density of the image obtained from the cameras 12,
13, as expressed on a given horizontal line (X coordinate).
[0076] In step S13, an area of the density change laterally
symmetric about the coordinate X0 corresponding to a peak is
searched for, and the positions X1, X2 which have ceased to be
symmetric are held. In the case where the target is a vehicle, the
image density thereof is laterally symmetric about the center
while, outside of the vehicle, the image of the road surface, etc.
is detected and therefore is not laterally symmetric. This
indicates that a target area is probably located in the
neighborhood of the positions X1, X2 where the lateral symmetry has
disappeared. In view of the fact that perfect lateral symmetry
cannot actually be obtained even for the same target, however, a
certain degree of allowance is given in the symmetry decision.
[0077] In step S14, the areas (X3 to X4, X5 to X6) covering several
neighboring coordinate points about the positions X1, X2 are held
as a search area. In this way, the area in the vicinity of the
positions X1, X2 is specified to show that the edges of a target
are present in the particular search area.
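The symmetry search of step S13 can be sketched as follows. The tolerance value, names and sample profile are assumptions; the sketch returns only the positions X1, X2 where symmetry ceases.

```python
def symmetry_break(density, x0, tol):
    """Scan outward from the power peak X0 and return the positions at
    which the density profile stops being laterally symmetric. A
    tolerance `tol` allows for the imperfect symmetry of real targets."""
    k = 1
    while x0 - k >= 0 and x0 + k < len(density):
        if abs(density[x0 - k] - density[x0 + k]) > tol:
            break  # lateral symmetry has disappeared here
        k += 1
    return x0 - k, x0 + k  # positions X1, X2

# A symmetric vehicle profile flanked by an asymmetric road surface:
profile = [90, 10, 50, 80, 50, 10, 30]
print(symmetry_break(profile, 3, 5))  # (0, 6)
```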
[0078] The processes in the subsequent steps, i.e. steps S4 to S7,
using this search area are similar to those in the flowchart of FIG. 8
described above. Also in this embodiment, the time required for
image processing is shortened, and letters on the road surface are
prevented from being detected erroneously as a target.
Embodiment 1-6
[0079] The extraction of a search area by image processing in
steps S13, S14 described above can use a density projection value
instead of a density change of the image.
[0080] FIG. 18 shows a method of extracting a search area using the
density projection value of an image. The density projection value
is obtained by totaling the pixel densities in vertical direction
for the images obtained from the cameras 12, 13. In this
embodiment, too, the search areas X3 to X4, X5 to X6 are obtained
in a manner similar to the aforementioned embodiment 1-5.
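The density projection value itself is simply a column-wise total, which can be sketched as below (a small nested-list grayscale image stands in for the camera frame; names are illustrative):

```python
def density_projection(image):
    """Total the pixel densities in the vertical direction, producing
    one projection value per image column (X coordinate)."""
    return [sum(col) for col in zip(*image)]

img = [[1, 2, 3],
       [4, 5, 6]]
print(density_projection(img))  # [5, 7, 9]
```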
Embodiment 1-7
[0081] FIG. 19 is a flowchart showing the third process in the
microcomputer 15 of FIG. 7. In step S21, an image is acquired from
the cameras 12, 13.
[0082] In step S22, edges are extracted by image processing. In
this image processing, edges are extracted over the entire range of
the image, and therefore the result as shown in FIG. 4 described
above is obtained.
[0083] From the edges obtained, peaks are extracted in the
descending order of power strength in step S23. The result is as
shown in FIG. 5. The extraction position for each edge is held as
an angular position Xn.
[0084] In step S24, the interior of the specified area is scanned
by the EHF radar 11. In step S25, the angular position Yn is
extracted and held from the result of scanning. This angular
position Yn is similar to the one extracted as a search area in
embodiments described above, and any of the methods shown in FIGS.
9, 11 and 13 or a given combination thereof can be used.
[0085] In step S26, a portion shared by the angular positions Xn
and Yn is extracted. In step S27, the parallax is determined for
the target at the common angular position extracted, and by
converting it into a distance value, a target is detected.
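The intersection of step S26 can be sketched as follows; the angular tolerance and all names are assumptions for illustration.

```python
def common_positions(edge_angles, radar_angles, tol=1.0):
    """Extract the portion shared by the image edge positions Xn and the
    radar angular positions Yn, within an assumed angular tolerance.
    Edges with no radar counterpart (e.g. road letters) drop out."""
    return [x for x in edge_angles
            if any(abs(x - y) <= tol for y in radar_angles)]

# The edge at -12 deg (a road letter) has no radar return and is dropped:
print(common_positions([-12.0, -3.0, 0.5, 4.0], [-3.2, 0.0, 4.5]))
```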
[0086] In embodiment 1-7, the time required for image processing is
not shortened, but the measurement error due to letters or other
markings on the road surface can be eliminated.
Embodiment 2
[0087] FIG. 20 shows a target detection system for a vehicle
according to a second embodiment of the invention.
[0088] This vehicle target detection system comprises an EHF radar
11, a left camera 12, a right camera 13 and an ECU 14. The ECU 14
includes an image recognition unit 25 for processing the images
input from the two cameras 12, 13 and outputting edge data and a
processing unit 26 for detecting the presence of and measuring the
distance to a target by fusion processing of the edge data input
from the EHF radar 11 and the image recognition unit 25.
[0089] The configuration described above is similar to that of the
conventional target detection system. In the conventional system,
however, the result is output unidirectionally from the image
recognition unit 25 to the processing unit 26, whereas in the
target detection system shown in FIG. 20, bidirectional
communication is sometimes established between the processing unit
26 and the image recognition unit 25.
[0090] The EHF radar 11 radiates an EHF forward of the vehicle, and
detects the presence of and the distance to a target based on the
radio wave reflected from the target. The EHF radar 11, which has a
low accuracy of distance measurement for a near target, outputs a
near flag upon detection of a near target. The near flag is output
in a temporally stable state in the case where a target is located
very near (not farther than 5 m, for example), and output
intermittently in an unstable state in the case where a near target
is present (at about 5 m to 10 m). In the case where a target is
located far (not less than 10 m), on the other hand, no near flag
is output.
[0091] The processing in the image recognition unit 25 will be
explained with reference to FIGS. 21A to 21E. First, the image
recognition unit 25 extracts the edges of an input image (FIG. 21A)
of the camera 12. As a result, the edges shown in FIG. 21B are
obtained. Then, from the result of this edge extraction, N (say,
16) edges are extracted in the descending order of strength (FIG.
21C).
[0092] From each of the N edges, a matching pattern 17 including
M.times.M (say, 9.times.9) pixels is retrieved as shown in FIG.
21E, and the pattern matching is effected for the input image (FIG.
21D) from the other camera 13 thereby to detect corresponding
edges. From the parallax between the two edges, the distance to
each edge is calculated and the result is output to the processing
unit 26 as edge data.
[0093] As shown in FIG. 21B, the image recognition unit 25 may
erroneously output a distance by also extracting edges for the
density differences of white lines and other non-three-dimensional
objects other than the target. The distance may also be measured
erroneously in the case where the matching area happens to include
a pattern similar to the M.times.M-pixel pattern 17 used for the
pattern matching shown in FIG. 21E.
[0094] In view of this, according to this invention, the near flag
output from the EHF radar 11 and the letter flag and the road
surface flag output from the image recognition unit 25 are used so
that the recognition error and the distance measurement error of
the image recognition system for the fusion area 4 (FIG. 1) are
prevented before the fusion processing in the processing unit
26.
Embodiment 2-1
[0095] FIG. 22 shows a first specific configuration of a vehicle
target detection system. The component parts that have already been
explained with reference to FIG. 20 will not be explained
again.
[0096] When edge data are output from the image recognition unit
25, the pre-processing unit 28 of the processing unit 26 selects
the edge data in accordance with the near flag output from the EHF
radar 11. The edge data determined to be effective are employed and
output to the fusion processing unit 29.
[0097] This processing will be explained in detail with reference
to the flowchart of FIG. 23.
[0098] The image recognition unit 25 is supplied with images from
the cameras 12, 13 (step S31) and extracts the edges from one of
the images (step S32). From the edges thus extracted, a
predetermined number of edges having a strong peak are extracted
(step S33). The pattern matching for the other image is carried out
for each edge (step S34) thereby to measure the distance (step
S35).
[0099] The pre-processing unit 28 of the processing unit 26
determines whether the near flag is output from the EHF radar 11
(step S36), and if any is output, determines whether the near flag
is output in stable fashion (step S37).
[0100] In the case where it is determined that the near flag is
output in a stable fashion (temporally continuously), it is
determined that a target is present at a very near distance (say, 0
to 5 m), and the edge data having very-near-distance information
(say, not more than 5 m) are employed (step S38). In the case where
it is determined that the near flag is output in an unstable
fashion (intermittently), on the other hand, it is determined that
a target is located at a near distance (say, 5 to 10 m), and the
edge data having near-distance information (say, 5 to 10 m) are
employed (step S39). Further, in the case where the near flag is
not output, it is determined that a target is located far (say, not
less than 10 m), so that the edges having far-distance information
(say, not less than 10 m) in the fusion area 4 are employed (step
S40).
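The selection in steps S36 to S40 can be sketched as below; the distance bands follow the examples in the text, while the data layout and names are assumptions.

```python
def select_edges(edges, near_flag_stable, near_flag_present):
    """Select edge data by the near-flag state: a stable flag implies a
    very near target (0 to 5 m), an intermittent flag a near target
    (5 to 10 m), and no flag a far target (not less than 10 m).
    `edges` is a list of (edge_id, distance_m) pairs."""
    if near_flag_present and near_flag_stable:
        lo, hi = 0.0, 5.0        # very near distance
    elif near_flag_present:
        lo, hi = 5.0, 10.0       # near distance
    else:
        lo, hi = 10.0, float("inf")  # far distance
    return [e for e in edges if lo <= e[1] < hi]

edges = [("a", 3.0), ("b", 7.5), ("c", 22.0)]
# Intermittent near flag: only the 5-10 m edge is employed.
print(select_edges(edges, False, True))  # [('b', 7.5)]
```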
[0101] In the fusion processing unit 29, the fusion processing is
executed based on the edge data employed and the data output from
the EHF radar 11 thereby to recognize the presence of a target and
measure the distance to the target (step S41), followed by
outputting the result (step S42).
[0102] According to this embodiment, even in the case where the
edge data is recognized erroneously or the distance is measured
erroneously by the image recognition unit 25, the particular edge
data is eliminated unless a target is detected by the EHF radar 11
in the area of erroneous distance measurement. Thus, erroneous
recognition or erroneous distance measurement for the target can be
prevented. Also, invalid edge data is removed before the fusion
processing and, therefore, the processing time can be
shortened.
Embodiment 2-2
[0103] FIG. 24 shows a second specific circuit configuration of a
vehicle target detection system according to a second embodiment.
The component parts already explained will not be explained
again.
[0104] The continuity determination unit 30 of the processing unit
26 determines the state of the near flag output from the EHF radar
11, and the resulting data is sent to the invalid edge removing
unit 31 of the image recognition unit 25. In the invalid edge
removing unit 31, invalid edge data are removed in accordance with
the condition of the near flag and the edge data is output to the
fusion processing unit 29.
[0105] The aforementioned process will be explained in detail with
reference to the flowchart of FIG. 25.
[0106] In the image recognition unit 25, as in steps S31 to S33 in
the embodiment 2-1 described above, the image is input (step S51),
the edges are extracted (step S52) and the peak is extracted (step
S53).
[0107] The image recognition unit 25, as in steps S34, S35 in the
aforementioned embodiment, conducts pattern matching using the edge
data not removed (step S54) and measures the distance (step
S55).
[0108] Then, as in steps S36, S37 in the embodiment 2-1 described
above, the continuity determination unit 30 determines whether the
near flag is output or not from the EHF radar 11 (step S56) and
also whether the near flag is in stable state or not (step S57),
the result thereof being output to the invalid edge removing unit
31.
[0109] In the case where the near flag is output in stable fashion,
the invalid edge removing unit 31 removes the edge data having
other than the very near distance information (step S58). Upon
receipt of the data indicating that the near flag is output in
unstable fashion, on the other hand, the edges having other than
the near distance information are removed (step S59). Further, in
the case where no near flag is output, the edge data having other
than far distance information are removed (step S60).
[0110] The resulting edge data is output to the fusion processing
unit 29. In the fusion processing unit 29, as in steps S41, S42 of
the embodiment 2-1 described above, the fusion processing is
carried out (step S61) and the result is output (step S62).
[0111] This embodiment also produces the same effect as the
embodiment 2-1 described above.
Embodiment 2-3
[0112] FIG. 26 shows a third specific circuit configuration of a
vehicle target detection system according to a second embodiment.
The component parts already explained will not be explained
again.
[0113] The continuity determination unit 30 of the processing unit
26 determines the state of the near flag output from the EHF radar
11, and sends the result data to the image recognition unit 25. In
the image recognition unit 25, an area priority setting unit 32
determines the order of priority of the pattern matching areas
corresponding to the input result data, and performs the pattern
matching for the selected area in priority.
[0114] FIG. 27 shows areas for which the pattern matching is
conducted.
[0115] Upon extraction of the edges from one of the images, as
shown in FIG. 21A, a matching pattern corresponding to the edge
portion is taken out and, as shown in FIG. 21D, the pattern
matching is carried out for the other image. In the process, based
on the data input from the continuity determination unit 30, the
order of priority of areas is determined according to the edge
extraction position.
[0116] The image recognition unit 25, upon receipt of the data
indicating that a near flag is stably output, performs the pattern
matching for the area of the 26th to 80th pixels from the edge
extraction position as a very near area in priority over the other
areas. Upon receipt of the data indicating that the near flag is
output in an unstable fashion, on the other hand, the image
recognition unit 25 performs the pattern matching for the area of
the 10th to 25th pixels, for example, in priority as a near area.
Further, upon receipt of the data indicating that no near flag is
output, the image recognition unit 25 performs the pattern matching
for the area of the 0th to the 9th pixels, for example, in priority
as a far area.
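The priority ordering above can be sketched as follows. The pixel bounds follow the examples in the text; the ordering of the non-priority areas and all names are assumptions.

```python
def matching_search_order(near_flag_stable, near_flag_present):
    """Order the matching areas (in pixels from the edge extraction
    position) so the area most likely to match is searched first:
    very near = 26th-80th, near = 10th-25th, far = 0th-9th pixels."""
    very_near, near, far = (26, 80), (10, 25), (0, 9)
    if near_flag_present and near_flag_stable:
        return [very_near, near, far]
    if near_flag_present:
        return [near, very_near, far]
    return [far, near, very_near]

# Intermittent near flag: the near area is tried first.
print(matching_search_order(False, True))  # [(10, 25), (26, 80), (0, 9)]
```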
[0117] The aforementioned processing will be explained in detail
with reference to the flowchart of FIG. 28.
[0118] In the image recognition unit 25, an image is input (step
S71), edges are extracted (step S72) and a peak is extracted (step
S73), and the continuity determination unit 30 determines whether
the near flag is output or not (step S74) and whether the near flag
is stable or not (step S75). The result is output to the area
priority setting unit 32.
[0119] In the case where the near flag is output in a stable
fashion, the area priority setting unit 32 gives priority to
the very near distance for the pattern matching area (step S76).
Upon receipt of the data indicating that the near flag is output in
an unstable fashion, on the other hand, the near distance is given
priority (step S77). Further, in the case where no near flag is
output, the far distance is given priority (step S78).
[0120] The image recognition unit 25 performs the pattern matching
(step S79) and measures the distance (step S80) for the area given
priority. The resulting edge data is output to the fusion
processing unit 29.
[0121] In the fusion processing unit 29, as in steps S41 and S42 of
the embodiment 2-1 described above, the fusion processing is
carried out (step S81) and the result is output (step S82).
[0122] According to this embodiment, the pattern matching is
started from the area most likely to match, and therefore the
time until successful matching is shortened. Also, the possibility
of handling a similar matching pattern is reduced thereby to
prevent the erroneous distance measurement.
Embodiment 2-4
[0123] FIG. 29 shows a fourth specific circuit configuration of a
vehicle target detection system according to the second embodiment.
The component parts already explained will not be explained
again.
[0124] A road surface/letter edge determination unit 33 of the
image recognition unit 25 determines whether an extracted edge
represents a line or a letter on the road surface or not, and
outputs the result to the invalid edge removing unit 34 of the
processing unit 26. The invalid edge removing unit 34 removes the
invalid edges from the edge data input thereto from the image
recognition unit 25, and outputs the remaining edge data to the
fusion processing unit 29.
[0125] In the image recognition unit 25, the edges are extracted
according to the density difference on the image. Thus, the edges
of the letters and shadows on the road surface, though not a
target, are extracted undesirably according to the density
difference.
[0126] The road surface/letter edge determination unit 33
determines whether an extracted density difference belongs to the
road surface or to a target, based on the distance and height
information of the density difference. In the case where it is
determined that the density difference is that on the road surface,
the edge data corresponding to the particular density difference
are output, with the road surface flag attached thereto, to the
invalid edge removing unit 34.
[0127] In the vicinity of its edges, a letter written on the road
surface changes from the road surface color to white or yellow, or
from white or yellow to the road surface color. Using the density
information in the neighborhood of the extracted edge, the road
surface/letter edge determination unit 33 determines, upon either
of the changes mentioned above, that road surface letters are
detected. Upon determining that road surface letters are involved,
the road surface/letter edge determination unit 33 outputs the edge
with a letter flag attached thereto to the invalid edge removing
unit 34.
[0128] In the case where the road surface flag or the letter flag
is attached to the edge data and the distance information indicates
the near distance (say, not more than 10 m), the invalid edge
removing unit 34 determines whether there is a near flag output
from the EHF radar 11. Unless the near flag is output, the
particular edge is determined to be a density difference or
letters on the road surface and is removed, while the remaining
edge data are output to the fusion processing unit 29.
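The removal rule of the invalid edge removing unit 34 can be sketched as below; the dictionary keys and the 10 m threshold are illustrative assumptions.

```python
def remove_invalid_edges(edges, near_flag_output):
    """Drop edge data carrying a road surface flag or a letter flag when
    the distance information indicates a near distance (say, not more
    than 10 m) and the EHF radar outputs no near flag. Each edge is a
    dict with hypothetical keys: distance, road_flag, letter_flag."""
    kept = []
    for e in edges:
        flagged = e["road_flag"] or e["letter_flag"]
        if flagged and e["distance"] <= 10.0 and not near_flag_output:
            continue  # a density difference or letters on the road surface
        kept.append(e)
    return kept
```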
[0129] The aforementioned process will be explained in detail with
reference to the flowchart of FIG. 30.
[0130] In the image recognition unit 25, as in steps S31 to S35 of
the embodiment 2-1 described above, an image is input (step S91),
edges are extracted (step S92), a peak is extracted (step S93), the
pattern matching is carried out (step S94), and the distance is
measured (step S95). By using the technique mentioned above, the
road surface flag or the letter flag is attached to a predetermined
edge data (step S96).
[0131] The invalid edge removing unit 34 determines whether the
road surface flag or the letter flag exists or not (step S97),
determines whether the edge distance information indicates a near
distance or not (step S98), and determines whether the near flag is
output or not from the EHF radar 11 (step S99). In the case where
the road surface flag or the letter flag is attached, the edge
distance information indicates the near distance and the near flag
is not output, then the edge data of the road surface flag or the
letter flag, as the case may be, is removed (step S100), and the
remaining edge data is delivered to the fusion processing unit
29.
[0132] In the fusion processing unit 29, as in steps S41 and S42 of
the embodiment 2-1 described above, the fusion processing is
carried out (step S101) and the result is output (step S102).
[0133] The embodiment 2-4 can be modified in the following way.
[0134] The road surface/letter edge determination unit 33 may
output only the road surface flag from the distance and height of
the density difference of the road surface or, conversely, may
output only the letter flag from the change in the density
difference of the road surface.
[0135] Also, the process can be changed as shown in the flowchart
of FIG. 31. Specifically, the invalid edge removing unit 34
determines whether the road surface flag or the letter flag is
attached to the edge data or not and also determines in step S981
whether the distance information indicates a far distance (say, not
less than 10 m). In the case where the distance information
indicates a far distance, it is determined in step S991 whether the
distance data output from the EHF radar 11 is within the allowable
error range of the distance information of the edge data. In the
case where it is not within the allowable error range, the edge
data to which the road surface flag or the letter flag is attached
is removed in step S100.
[0136] According to this embodiment, the erroneous recognition and
the erroneous distance measurement in the image recognition system
can be prevented before the fusion processing by use of the letter
flag and the road surface flag of the image recognition system.
* * * * *