U.S. patent application number 11/758180, for radar, lidar and camera enhanced methods for vehicle dynamics estimation, was filed with the patent office on June 5, 2007 and published on January 21, 2010.
This patent application is currently assigned to GM GLOBAL TECHNOLOGY OPERATIONS, INC. The invention is credited to Shuqing Zeng.
United States Patent Application: 20100017128
Kind Code: A1
Zeng; Shuqing
January 21, 2010
Radar, Lidar and camera enhanced methods for vehicle dynamics
estimation
Abstract
A system for estimating vehicle dynamics, including vehicle
position and velocity, using a stationary object. The system
includes an object sensor that provides object signals of the
stationary object. The system also includes in-vehicle sensors that
provide signals representative of vehicle motion. The system also
includes an association processor that receives the object signals,
and provides object tracking through multiple frames of data. The
system also includes a longitudinal state estimation processor that
receives the object signals and the sensor signals, and provides a
correction of the vehicle speed in a forward direction. The system
also includes a lateral state estimation processor that receives
the object signals and the sensor signals, and provides a
correction of the vehicle speed in the lateral direction.
Inventors: Zeng; Shuqing (Sterling Heights, MI)
Correspondence Address: GENERAL MOTORS LLC; LEGAL STAFF, MAIL CODE 482-C23-B21, P O BOX 300, DETROIT, MI 48265-3000, US
Assignee: GM GLOBAL TECHNOLOGY OPERATIONS, INC., DETROIT, MI
Family ID: 40157523
Appl. No.: 11/758180
Filed: June 5, 2007
Current U.S. Class: 701/301; 342/70
Current CPC Class: G01S 2013/932 20200101; G01S 13/865 20130101; B60W 40/105 20130101; B60W 2050/0033 20130101; B60W 2420/52 20130101; G01S 13/60 20130101; G01S 13/867 20130101; B60W 40/107 20130101
Class at Publication: 701/301; 342/70
International Class: G08G 1/16 20060101 G08G001/16; G01S 13/93 20060101 G01S013/93
Claims
1. A system for estimating vehicle dynamics in a vehicle, said
system comprising: an object detection sensor for detecting
stationary objects relative to vehicle motion, said object sensor
providing object detection signals for tracking the object; a
plurality of in-vehicle sensors providing sensor signals
representative of the vehicle motion; an association processor
responsive to the object signals, said association processor
matching the object signals of consecutive views to track the
object, and providing object tracking signals; a longitudinal state
estimation processor responsive to the object tracking signals and
the sensor signals, and providing an estimation of vehicle speed in
a forward direction relative to the vehicle motion; and a lateral
state estimation processor responsive to the object tracking
signals and the sensor signals, said lateral state estimation
processor estimating vehicle speed in a lateral direction relative
to the vehicle motion.
2. The system according to claim 1 wherein the object detection
sensor is selected from the group consisting of radar devices,
lidar devices, cameras and vision systems.
3. The system according to claim 1 wherein the object detection
sensor is a long range radar sensor.
4. The system according to claim 3 wherein the long range sensor is
mounted at the center of gravity of the vehicle.
5. The system according to claim 1 wherein the object detection
sensor detects the position of the object by employing a bicycle
model.
6. The system according to claim 1 wherein the in-vehicle sensors
are selected from the group consisting of steering wheel angle
sensors, yaw-rate sensors, longitudinal speed sensors, longitudinal
acceleration sensors and lateral acceleration sensors.
7. The system according to claim 1 wherein the longitudinal state
estimation processor and the lateral state estimation processor
employ Kalman filters for determining the longitudinal and lateral
speed of the vehicle.
8. The system according to claim 7 wherein the longitudinal state
estimation processor receives object and sensor signals of the
vehicle longitudinal position, the vehicle longitudinal speed, the
vehicle longitudinal acceleration and vehicle speed, and uses the
Kalman filter and an auto-regression noise model to provide a
corrected longitudinal acceleration, a corrected vehicle speed and
a wheel slippage signal.
9. The system according to claim 7 wherein the lateral state
estimation processor receives object and sensor signals of azimuth
angle and lateral offset of an object and steering wheel angle,
yaw-rate and lateral acceleration of the vehicle, and uses the
Kalman filter and an auto-regression noise model to provide a
yaw-rate correction, a lateral acceleration correction and a
lateral velocity of the vehicle.
10. The system according to claim 1 wherein the object detection
sensor is a monocular camera.
11. The system according to claim 1 wherein the system is part of a
micro-electromechanical system based inertia measurement unit.
12. A system for determining the speed of a vehicle, said system
comprising: an object detection sensor for detecting stationary
objects relative to the vehicle motion; and a processor responsive
to the object detection signals, said processor tracking the
objects and employing a model for determining the longitudinal and
lateral speed of the vehicle relative to the stationary object,
said processor employing Kalman filtering.
13. The system according to claim 12 wherein the object detection
sensor is selected from the group consisting of radar devices,
lidar devices, cameras and vision systems.
14. The system according to claim 12 wherein the object detection
sensor detects the position of the object by employing a bicycle
model.
15. The system according to claim 12 further comprising in-vehicle
sensors for providing signals indicative of the vehicle motion,
said processor being responsive to the signals from the in-vehicle
sensors to determine the longitudinal and lateral speed of the
vehicle.
16. A method for estimating vehicle dynamics of a vehicle, said
method comprising: detecting stationary objects relative to the
motion of the vehicle and providing object detection signals;
measuring various vehicle states using in-vehicle sensors and
providing sensor signals; using the object signals and the sensor
signals to provide a corrected longitudinal acceleration and
vehicle speed of the vehicle; and using the object detection
signals and the sensor signals to provide a corrected lateral
acceleration and yaw-rate of the vehicle.
17. The method according to claim 16 wherein detecting stationary
objects includes using a radar device, a lidar device and/or a
camera device.
18. The method according to claim 16 wherein measuring the various
vehicle states using in-vehicle sensors includes using steering
wheel angle sensors, yaw-rate sensors, longitudinal speed sensors,
longitudinal acceleration sensors and lateral acceleration
sensors.
19. The method according to claim 16 wherein providing a corrected
longitudinal acceleration and vehicle speed includes receiving the
object signals and the sensor signals that identify vehicle
longitudinal position, vehicle longitudinal speed, vehicle
longitudinal acceleration and vehicle speed, and using a Kalman
filter and an auto-regression noise model to provide the corrected
longitudinal acceleration, vehicle speed and wheel slippage.
20. The method according to claim 16 wherein providing a corrected
lateral acceleration and yaw-rate includes using the object signals
that identify object azimuth angle and object lateral offset and
the sensor signals that identify steering wheel angle, yaw-rate and
lateral acceleration, and using a Kalman filter and an
auto-regression noise model to provide the lateral acceleration
correction, a lateral velocity correction and the yaw rate.
Description
BACKGROUND OF THE INVENTION
[0001] 1. Field of the Invention
[0002] This invention relates generally to a system and method for
determining vehicle dynamics and, more particularly, to a system
and method for determining vehicle speed and position that employs
radar, lidar and/or camera signals.
[0003] 2. Discussion of the Related Art
[0004] Various driver assist systems and autonomous driving operations in vehicles, such as electronic stability control (ESC), adaptive cruise control (ACC), lane keeping (LK), lane changing (LC), etc., require the development of highly robust and precise modules for estimating various vehicle dynamics. Such modules are necessary to provide the knowledge of vehicle position and velocity that is used to control the vehicle along a desired path.
[0005] Currently, micro-electromechanical system (MEMS) based
inertial measurement units (IMUs) and wheel speed sensors are used
to provide vehicle speed. However, the performance of wheel speed
sensors is reduced during wheel slippage conditions, such as when
the driver performs cornering and swerving maneuvers. Therefore, a dead-reckoning strategy based on the IMU is utilized at these times to produce the velocity and position of the vehicle. Because MEMS
IMUs usually have larger errors than expensive gyro-systems, errors
in position and velocity can grow rapidly. Thus, current
automotive-grade MEMS IMUs alone are typically not suitable for
dead-reckoning for a long period of time.
[0006] It has been proposed in the art to integrate GPS and a low
cost MEMS IMU to address the non-zero bias and drift issues of an
IMU. However, few of these systems address the issue that the GPS
signals are not always available, such as when the vehicle is in
"urban canyons" where an insufficient number of satellites are
tracked to determine the position and the velocity of the
vehicle.
[0007] Future advanced driver assist systems (ADS) for vehicles
will include various object detection sensors, such as long-range
radar and lidar sensors and ultrasonic parking aid sensors.
Further, camera-based systems for lane departure warning are
currently being developed. Thus, there has been an increased
interest in utilizing data from these devices to estimate vehicle
self-motion. For example, one system proposes to use an in-vehicle
mounted calibrating camera to estimate self-motion and to detect
moving objects on roads. Another proposed system uses a single
camera for computing ego-motion of the vehicle based on optical
flow. Further work includes the use of stereo-vision to the
ego-pose estimation problem in urban environments. However, none of
these approaches alone is reliable enough in cluttered scenes.
Further, few of these systems make explicit the essential need for
system integration that will be necessary in the future commercial
development of this technology.
SUMMARY OF THE INVENTION
[0008] In accordance with the teachings of the present invention, a system and method are disclosed that estimate vehicle dynamics, including vehicle position and velocity, using a stationary object.
The system includes an object sensor, such as a radar, lidar or
camera, that provides object signals of the stationary object. The
system also includes in-vehicle sensors that provide signals
representative of vehicle motion, such as steering wheel angle,
yaw-rate, longitudinal speed, longitudinal acceleration and lateral
acceleration. The system also includes an association processor
that receives the object signals, and provides object tracking
through multiple frames of data. The system also includes a
longitudinal state estimation processor that receives the object
signals and the sensor signals, and provides a correction of the
vehicle speed in a forward direction. The system also includes a
lateral state estimation processor that receives the object signals
and the sensor signals, and provides a correction of the vehicle
speed in a lateral direction.
[0009] Additional features of the present invention will become
apparent from the following description and appended claims taken
in conjunction with the accompanying drawings.
BRIEF DESCRIPTION OF THE DRAWINGS
[0010] FIG. 1 is a block diagram of a system for determining
vehicle state estimations using object sensors, according to an
embodiment of the present invention;
[0011] FIG. 2 is a plan view showing vehicle ego-motion from
stationary radar tracks;
[0012] FIG. 3 is an illustration of a bicycle model identifying
parameters for vehicle handling; and
[0013] FIG. 4 is a plan view of a vehicle coordinate system.
DETAILED DESCRIPTION OF THE EMBODIMENTS
[0014] The following discussion of the embodiments of the invention
directed to a system and method for estimating vehicle dynamics
using radar, lidar and/or camera signals is merely exemplary in
nature, and is in no way intended to limit the invention or its
applications or uses.
[0015] As will be discussed in detail below, the present invention
proposes an integrated system using low-cost MEMS IMUs and other
in-vehicle dynamic sensors to correct vehicle dynamics estimations
in real-time using supporting sensors, such as radar, lidar, vision
systems or combinations thereof. This will allow either improved
performance from existing sensors or the same performance from
smaller and cheaper sensors.
[0016] FIG. 1 is a block diagram of a system 10 that provides
vehicle state estimations, such as vehicle position and velocity,
according to an embodiment of the present invention. The system 10
includes one or more object sensors 12, such as radar, lidar,
vision systems, cameras, etc., that may be available on the vehicle
to provide signals that track objects external to the vehicle. The
signals from the sensors 12 are sent to an association processor 14
that matches the maps or images from the signals from consecutive
views to track the objects. The tracks from the association
processor 14 are sent to a longitudinal state estimation processor
16 that estimates vehicle speed in a forward direction and a
lateral state estimation processor 18 that estimates vehicle speed
in a lateral direction as a result of yaw-rate and side-slip angle.
The processors 16 and 18 also receive signals from in-vehicle
sensors 20, such as a steering wheel angle sensor, a yaw-rate
sensor, a longitudinal speed sensor, a longitudinal acceleration
sensor, a lateral acceleration sensor, etc., all of which are
well-known to those skilled in the art as part of the various
vehicle stability control and driver assist systems referred to
above.
[0017] The object sensors 12 determine the ego-motion of the
vehicle from the measurement of stationary objects. FIG. 2 is a
plan view of a vehicle 26 including a radar sensor 28 that is
tracking a stationary object 30, such as a tree. The radar sensor
28 is mounted at the center of gravity (CG) of the vehicle frame
and is directed along the direction of the vehicle travel. From the
radar signals, vehicle velocities can be calculated as:
$v_x = -(x' - x)/\Delta T$ (1)
$v_y = -(y' - y)/\Delta T$ (2)
$r = -(\Theta' - \Theta)/\Delta T$ (3)
Where the quantities $(x, y)$ and $(x', y')$ are the positions of the stationary object 30 at time $t$ and $t + \Delta T$, respectively, and $v_x$, $v_y$ and $r$ are the longitudinal velocity, lateral velocity and yaw-rate, respectively, of the vehicle 26.
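As a rough illustration of equations (1)-(3), the finite differences can be coded directly. The sketch below is a minimal, hypothetical helper; the function name, inputs and example numbers are illustrative and are not taken from the patent:

```python
import numpy as np

def ego_motion_from_stationary_track(p0, p1, theta0, theta1, dT):
    """Sketch of equations (1)-(3): recover host longitudinal velocity,
    lateral velocity and yaw-rate from the apparent motion of a tracked
    stationary object between two radar frames dT seconds apart.

    p0, p1:          (x, y) object positions in the vehicle frame at t and t+dT
    theta0, theta1:  object azimuth angles at t and t+dT (rad)
    """
    (x0, y0), (x1, y1) = p0, p1
    vx = -(x1 - x0) / dT          # equation (1)
    vy = -(y1 - y0) / dT          # equation (2)
    r  = -(theta1 - theta0) / dT  # equation (3)
    return vx, vy, r

# Example: an object 20 m ahead appears 1.5 m closer after 0.1 s -> about 15 m/s host speed
print(ego_motion_from_stationary_track((20.0, 0.5), (18.5, 0.45), 0.025, 0.024, 0.1))
```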
[0018] The range calculations referred to above can be extended using a bicycle model of a vehicle 32, shown in FIG. 3, to provide position tracking in the association processor 14. A range sensor 34 is mounted at position $(x^0, y^0, \theta^0)$ on the vehicle, and points in a forward direction. The notations for the bicycle model are given as:
[0019] $a$: Distance from the front wheel axis to the vehicle center of gravity (CG);
[0020] $b$: Distance from the rear wheel axis to the vehicle CG;
[0021] $m$: Vehicle mass;
[0022] $I_z$: Vehicle moment of inertia about the z-axis;
[0023] $\delta$: Front road wheel angle;
[0024] $r$: Vehicle yaw-rate;
[0025] $u$: Vehicle longitudinal velocity; and
[0026] $v$: Vehicle lateral velocity.
[0027] The radar output is a list of objects $\{o_i \mid i = 1, \ldots, K\}$ at time $t$. The measurement of the i-th object $o_i$ contains range $\rho$, range rate $\dot{\rho}$ and azimuth angle $\Theta$. Usually the field-of-view of a long-range radar is narrow, for example 15°, therefore each detected object lies in front of the vehicle along the x-axis. Equation (4) below can be used to determine whether the object is stationary:
$|\dot{\rho}\cos\Theta + v_x| < T$ (4)
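A minimal sketch of this gating test follows, written against the convention of observation equation (9) below (for a stationary object $\dot{\rho}\cos\Theta \approx -v_x$); the function name and the threshold value are assumptions for illustration only:

```python
import math

def is_stationary(range_rate, azimuth, vx_host, threshold=1.0):
    """Sketch of the gating test of equation (4): for a stationary object the
    range rate projected onto the x-axis should cancel the host speed.
    The 1.0 m/s threshold is an assumed example value, not from the patent."""
    return abs(range_rate * math.cos(azimuth) + vx_host) < threshold

# Host at 15 m/s closing on a stationary object dead ahead: range rate is about -15 m/s
print(is_stationary(-15.1, 0.0, 15.0))  # True
print(is_stationary(-5.0, 0.0, 15.0))   # False -> probably a moving vehicle
```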
[0028] The vehicle dynamics can be represented by a state vector $z$ whose components include:
[0029] $\delta a_x$: Correction of the longitudinal acceleration measurement;
[0030] $v_x$: Vehicle longitudinal velocity;
[0031] $\delta_r$: Correction of the yaw rate measurement;
[0032] $v_y$: Vehicle lateral velocity; and
[0033] $\{(x_i, y_i) \mid i = 1, \ldots, K\}$: List of positions of stationary objects.
[0034] The longitudinal state estimation processor 16 receives longitudinal position, longitudinal speed, longitudinal acceleration and vehicle speed signals from the sensors, and uses Kalman filtering and an auto-regression noise model to provide a corrected longitudinal acceleration, a corrected longitudinal velocity and a wheel slippage indication for the vehicle. The sensors 20, such as the accelerometers and wheel sensors, give measurements of the longitudinal acceleration $a_{xo}$ and the velocity $v_{xo}$ of the vehicle. The estimation processor 16 may receive the longitudinal position and longitudinal speed of the vehicle from the radar input and the longitudinal acceleration and vehicle speed from the in-vehicle sensors 20.
[0035] The processor 16 considers the acceleration correction as a
random walk process. The longitudinal system of the vehicle can be
written as:
$x_i(t+1) = x_i(t) - \Delta T\, v_x(t) - \frac{\Delta T^2}{2}\left(a_{xo}(t) + \delta a_x(t)\right) - \frac{\Delta T^2}{2}\,\epsilon$ (5)
$v_x(t+1) = v_x(t) + \Delta T\left(a_{xo}(t) + \delta a_x(t)\right) + \Delta T\,\epsilon$ (6)
$\delta a_x(t+1) = \delta a_x(t) + \epsilon$ (7)
Where $\epsilon$ is a zero-mean white random process with a Gaussian distribution.
[0036] Given a measurement $o_i = (\rho_i, \dot{\rho}_i, \Theta_i)$ from the i-th object, the observation equations can be written as:
$\rho_i \cos\Theta_i = x_i + v_1$ (8)
$\dot{\rho}_i \cos\Theta_i = -v_x + v_2$ (9)
$v_{xo} = v_x + v_3$ (10)
Where the quantities $v_1$, $v_2$ and $v_3$ are observation noises that are modeled as Gaussian white random processes.
[0037] The Kalman filtering process in the processor 16 is used to determine the corrected longitudinal acceleration $a_{xo} + \delta a_x$, the corrected longitudinal velocity $v_x$ and wheel slippage if the following condition is satisfied:
$|v_x - v_{xo}| > T$ (11)
Where $T$ is a threshold.
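One way to realize the longitudinal estimator of equations (5)-(11) is a small linear Kalman filter over the state $(x_i, v_x, \delta a_x)$, with the measured acceleration $a_{xo}$ treated as a known input. The Python sketch below is only illustrative; the noise covariances, the slip threshold and all names are assumptions rather than values from the patent:

```python
import numpy as np

def longitudinal_kf_step(z, P, a_xo, v_xo, rho, rho_dot, theta, dT,
                         q=0.5, r_radar=0.5, r_wheel=0.3, slip_thresh=2.0):
    """Illustrative sketch of the longitudinal estimator of equations (5)-(11).
    State z = [x_obj, v_x, delta_a_x]; the noise levels and the slip threshold
    are assumed example values."""
    # Plant model, equations (5)-(7): the object range shrinks as the host
    # advances, the speed integrates the corrected acceleration, and the
    # acceleration correction is a random walk.
    F = np.array([[1.0, -dT, -0.5 * dT**2],
                  [0.0, 1.0,  dT],
                  [0.0, 0.0,  1.0]])
    u = np.array([-0.5 * dT**2 * a_xo, dT * a_xo, 0.0])
    Q = q * np.diag([dT**2, dT**2, dT])

    z_pred = F @ z + u
    P_pred = F @ P @ F.T + Q

    # Observation model, equations (8)-(10)
    H = np.array([[1.0,  0.0, 0.0],    # rho*cos(theta)     =  x_obj
                  [0.0, -1.0, 0.0],    # rho_dot*cos(theta) = -v_x
                  [0.0,  1.0, 0.0]])   # wheel-speed v_xo   =  v_x
    y = np.array([rho * np.cos(theta), rho_dot * np.cos(theta), v_xo])
    R = np.diag([r_radar, r_radar, r_wheel])

    S = H @ P_pred @ H.T + R
    K = P_pred @ H.T @ np.linalg.inv(S)
    z_new = z_pred + K @ (y - H @ z_pred)
    P_new = (np.eye(3) - K @ H) @ P_pred

    wheel_slip = abs(z_new[1] - v_xo) > slip_thresh   # condition (11)
    return z_new, P_new, wheel_slip
```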
[0038] The lateral state estimation processor 18 receives object azimuth angle, object lateral offset, steering wheel angle, yaw-rate and lateral acceleration signals from the sensors, and uses Kalman filtering and an auto-regression noise model to provide a yaw-rate correction, a lateral acceleration correction and a lateral velocity signal of the vehicle. The in-vehicle sensors 20, such as the accelerometer and yaw-rate sensors, give measurements of the lateral acceleration $a_{yo}$ and yaw-rate $r_o$ of the vehicle. The steering wheel angle sensor gives the steering wheel angle $\delta_f$. The correction of the yaw-rate is modeled as a random walk. By letting $L = a + b$ be the wheel-base and $c_f$ and $c_r$ be the cornering stiffness coefficients of the front and rear tires, the lateral plant model can be written as:
$y_i(t+1) = y_i(t) - \Delta T\left((r_o + \delta r(t))\,x_i + v_y(t)\right) - \Delta T\,x_i\,\epsilon_1 - \Delta T\,\epsilon_2$ (12)
$v_y(t+1) = \left(1 - \Delta T\,\frac{c_f + c_r}{m\,v_x}\right) v_y(t) + \Delta T\left(\frac{c_r b - c_f a}{m\,v_x}\right)(r_o + \delta r(t)) + \Delta T\,\frac{c_f}{m}\,\delta_f + \epsilon_2$ (13)
$\delta r(t+1) = \Delta T\,\frac{-a c_f + b c_r}{I_z\,v_x}\,v_y(t) - \Delta T\,\frac{a^2 c_f + b^2 c_r}{I_z\,v_x}\,(r_o + \delta r(t)) + \epsilon_1$ (14)
Where $\epsilon_1$ and $\epsilon_2$ are two white Gaussian random processes.
[0039] Given a measurement $o_i = (\rho_i, \dot{\rho}_i, \Theta_i)$ from the i-th object, the observation equations can be written as:
$\rho_i \sin\Theta_i = y_i + v_1$ (15)
$a_{yo} = v_x(r_o + \delta_r) + v_2$ (16)
Where the quantities $v_1$ and $v_2$ are observation noises that are modeled as Gaussian white random processes. Here $v_1$ is the error introduced by the measurement and $v_2$ is the error introduced by a banked road or by sensor measurements.
[0040] The Kalman filter in the lateral estimation processor 18 is used to determine the corrected yaw-rate $(r_o + \delta r)$, the corrected lateral acceleration $(a_{yo} + \delta a_y)$, and the lateral velocity $v_y$.
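The lateral estimator of equations (12)-(16) can be sketched in the same style, treating $v_x$ and the tracked object's longitudinal position $x_i$ as known quantities at each step. The bicycle-model parameters and noise levels below are illustrative placeholders, not values from the patent:

```python
import numpy as np

def lateral_kf_step(z, P, r_o, delta_f, a_yo, rho, theta, v_x, x_i, dT,
                    m=1500.0, Iz=2500.0, a=1.2, b=1.6, cf=6.0e4, cr=6.0e4,
                    q=0.1, r_meas=0.5):
    """Illustrative sketch of the lateral estimator of equations (12)-(16).
    State z = [y_obj, v_y, delta_r]; vehicle parameters and noise levels are
    assumed example values."""
    # Plant model, equations (12)-(14), with v_x treated as a known parameter
    k1 = (cf + cr) / (m * v_x)
    k2 = (cr * b - cf * a) / (m * v_x)
    k3 = (b * cr - a * cf) / (Iz * v_x)
    k4 = (a**2 * cf + b**2 * cr) / (Iz * v_x)

    F = np.array([[1.0, -dT,            -dT * x_i],
                  [0.0,  1.0 - dT * k1,  dT * k2],
                  [0.0,  dT * k3,       -dT * k4]])
    u = np.array([-dT * r_o * x_i,
                   dT * k2 * r_o + dT * (cf / m) * delta_f,
                  -dT * k4 * r_o])
    Q = q * dT * np.eye(3)

    z_pred = F @ z + u
    P_pred = F @ P @ F.T + Q

    # Observation model, equations (15)-(16)
    H = np.array([[1.0, 0.0, 0.0],     # rho*sin(theta)      = y_obj
                  [0.0, 0.0, v_x]])    # a_yo - v_x*r_o      = v_x*delta_r
    y = np.array([rho * np.sin(theta), a_yo - v_x * r_o])
    R = r_meas * np.eye(2)

    S = H @ P_pred @ H.T + R
    K = P_pred @ H.T @ np.linalg.inv(S)
    z_new = z_pred + K @ (y - H @ z_pred)
    P_new = (np.eye(3) - K @ H) @ P_pred

    corrected_yaw_rate = r_o + z_new[2]
    return z_new, P_new, corrected_yaw_rate
```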
[0041] Most range sensors used in the art report an object with an identifier that remains the same across different time frames. This information is used to match object maps from consecutive frames. Mathematically, it can be assumed that the sensor gives the object map $\{(o_i(t), l_i(t)) \mid i = 1, \ldots, K_t\}$ at time $t$ and the map $\{(o_j(t'), l_j(t')) \mid j = 1, \ldots, K_{t'}\}$ at time $t'$, where $l$ denotes the object identifier. Therefore, the matches of consecutive maps can be defined as:
$\{(o_i(t), o_j(t')) \mid l_i(t) = l_j(t'),\ 1 \le i \le K_t,\ 1 \le j \le K_{t'}\}$ (17)
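Because the association of equation (17) reduces to matching identifiers, a dictionary keyed by the sensor-reported identifier is one straightforward realization. The data layout below is an assumption made for illustration; it is not specified by the patent:

```python
def match_consecutive_maps(prev_map, curr_map):
    """Sketch of the association rule of equation (17): objects from two
    consecutive frames are matched when the sensor-reported identifier is the
    same. Maps are assumed to be {identifier: measurement} dictionaries."""
    return [(prev_map[obj_id], curr_map[obj_id])
            for obj_id in prev_map.keys() & curr_map.keys()]

prev_map = {7: (20.0, -1.0), 9: (35.2, 3.1)}
curr_map = {7: (18.5, -0.9), 12: (50.0, 0.0)}
print(match_consecutive_maps(prev_map, curr_map))  # only identifier 7 persists
```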
[0042] In one embodiment, a monocular camera can be used as the object sensor. The camera recovers motion of the vehicle navigating on the ground surface by tracking certain scene elements on the ground. In a monocular sequence, a homography transformation exists between two views for the elements on the same plane. That is, for a set of point correspondences $x_i, x'_i$ between two images, assuming that the points are coplanar, there is a homography matrix $F$ defined as:
$x'_i = F x_i$ (18)
Where $x$ denotes a homogeneous image coordinate $(u, \upsilon, m)^T$, which represents the image pixel at $(u/m, \upsilon/m)$.
[0043] A world coordinate system at time $t$ is defined so that the plane $x$-$o$-$y$ is coincident with the ground plane, as shown in FIG. 4. The camera coordinate system $o'x'y'z'$ is fixed with the camera, whose image plane and symmetric axis are coincident with the plane $o'x'y'$ and the $z'$-axis, respectively. By letting the camera's projection matrices of the two views be $P = K[I \mid 0]$ and $P' = K[R \mid c]$, respectively, the homography matrix $F$ can be given as:
$F = K\left(R - c\,n^T/d\right)K^{-1}$ (19)
Where $K$ is the intrinsic matrix, $R$ is the camera rotation matrix, $c$ is the camera center coordinates and $\pi = (n^T, d)^T$ is the plane equation ($n^T X + d = 0$), all in the world coordinate system.
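Equation (19) is a direct matrix expression and can be evaluated in a few lines; the helper below is a hypothetical sketch, and the example values for $K$, $R$, $c$, $n$ and $d$ are illustrative only:

```python
import numpy as np

def plane_homography(K, R, c, n, d):
    """Sketch of equation (19): homography induced by the plane n^T X + d = 0
    between the views P = K[I|0] and P' = K[R|c]. K is the 3x3 intrinsic
    matrix, R the rotation, c the camera centre (3-vector)."""
    return K @ (R - np.outer(c, n) / d) @ np.linalg.inv(K)

# Example with an identity camera looking at the ground plane
K = np.eye(3)
R = np.eye(3)
c = np.array([0.5, 0.0, 0.0])           # pure sideways translation
n, d = np.array([0.0, 0.0, 1.0]), 1.2   # ground plane, camera 1.2 m above it
print(plane_homography(K, R, c, n, d))
```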
[0044] In FIG. 4, let the projection matrix at time $t$ be $P = KR[I \mid -c]$, where $c = (x, y, -d)^T$ and $R$ is the camera's extrinsic matrix. The ground plane is denoted as $\pi_0 = (n_0^T, 0)^T$ where $n_0 = (0, 0, 1)^T$. For the view of the next frame $t'$, the vehicle translates $\Delta c = (\Delta x, \Delta y, 0)^T$ and rotates by $\Delta\theta$ on the ground plane. The rotation matrix can be written as:
$\Delta R = \begin{pmatrix} \cos\Delta\theta & \sin\Delta\theta & 0 \\ -\sin\Delta\theta & \cos\Delta\theta & 0 \\ 0 & 0 & 1 \end{pmatrix}$ (20)
[0045] Then, the projection matrix $P'$ at time $t'$ can be written as $P' = KR'[I \mid -c']$, where $R' = R\,\Delta R$ and $c' = c + \Delta c$.
[0046] To apply equation (19), the world coordinate system is moved to the camera center at time $t$. The new projection matrices become:
$P = K[I \mid 0]$ (21)
$P' = K\left[R'R^{-1} \mid -R'\Delta c\right]$ (22)
[0047] The ground plane becomes $(R n_0, d)$. Thus:
$F = K\left(R'R^{-1} + R'\Delta c\,n_0^T R^T/d\right)K^{-1}$ (23)
[0048] Applying $R^T = R^{-1}$ and $R' = R\,\Delta R$ gives:
$F = K R\,\Delta R\left(I + \Delta c\,n_0^T/d\right)(KR)^{-1}$ (24)
[0049] If the calibrated camera is considered, i.e., $K$ and $R$ are known in advance, the essential matrix $E$ can be computed as:
$E = (KR)^{-1} F\,KR = \Delta R\left(I + \Delta c\,n_0^T/d\right)$ (25)
[0050] With $n_0 = (0, 0, 1)^T$, $\Delta c = (\Delta x, \Delta y, 0)^T$ and equation (20), the essential matrix $E$ can be written as:
$E = \begin{pmatrix} \cos\Delta\theta & \sin\Delta\theta & \frac{\Delta x}{d}\cos\Delta\theta + \frac{\Delta y}{d}\sin\Delta\theta \\ -\sin\Delta\theta & \cos\Delta\theta & -\frac{\Delta x}{d}\sin\Delta\theta + \frac{\Delta y}{d}\cos\Delta\theta \\ 0 & 0 & 1 \end{pmatrix}$ (26)
[0051] Usually the rotation angle $\Delta\theta$ is small (i.e., $\sin\Delta\theta \ll 1$) between two consecutive views. Thus, equation (26) can be approximated as:
$E \approx \begin{bmatrix} \cos\Delta\theta & \sin\Delta\theta & \frac{\Delta x}{d} \\ -\sin\Delta\theta & \cos\Delta\theta & \frac{\Delta y}{d} \\ 0 & 0 & 1 \end{bmatrix}$ (27)
[0052] The essential matrix $E$ in equation (27) is actually a two-dimensional transformation having a translation $(\Delta x/d, \Delta y/d)$ and a rotation $\Delta\theta$.
[0053] Given a set of matched feature point pairs $\{(x_i, x'_i) \mid i = 1, \ldots, N\}$, the self-motion estimation can be formulated as a least squares estimation:
$\arg\min_F \frac{1}{N}\sum_{i=1}^{N}\left\| x'_i - F x_i \right\|$ (28)
Which can be transformed into:
$\arg\min_E \frac{1}{N}\sum_{i=1}^{N}\left\| \hat{x}'_i - E \hat{x}_i \right\|$ (29)
Where $\hat{x} = KRx$ and $\hat{x}' = KRx'$ if the camera's calibration matrices $K$ and $R$ are known.
[0054] Examining equation (27), the normalized points between the two views are related to each other by a rigid rotation $\Delta\theta$ and a translation $(\Delta x/d, \Delta y/d)$. The following method can be utilized to recover the parameters.
[0055] The input is $N$ pairs of matched ground feature points $\{(x_i, x'_i) \mid i = 1, \ldots, N\}$ and the camera's intrinsic and extrinsic parametric matrices $K$ and $R$.
[0056] The output is the estimated self-motion parameters $c_2 = (\Delta x/d, \Delta y/d)$ and
$R_2 = \begin{pmatrix} \cos\Delta\theta & -\sin\Delta\theta \\ \sin\Delta\theta & \cos\Delta\theta \end{pmatrix}$.
[0057] 1. For all $x_i$ and $x'_i$, calculate $\hat{x}_i = KRx_i$ and $\hat{x}'_i = KRx'_i$.
[0058] 2. Compute $\bar{x} = \frac{1}{N}\sum_{i=1}^{N}\hat{x}_i$ and $\bar{x}' = \frac{1}{N}\sum_{i=1}^{N}\hat{x}'_i$.
[0059] 3. Compute:
$C = \frac{1}{N}\sum_{i=1}^{N}\left(\hat{x}'_i - \bar{x}'\right)\left(\hat{x}_i - \bar{x}\right)^T$ (30)
[0060] 4. Let the singular value decomposition of the matrix $C$ be written as $C = U W V^T$.
[0061] Then, the rotation $R_2$ and the translation $t_2$ can be solved as:
$R_2 = U \begin{pmatrix} 1 & 0 \\ 0 & \det(U V^T) \end{pmatrix} V^T$ (31)
$t_2 = \bar{x}' - R_2\,\bar{x}$ (32)
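Steps 1-4 amount to a standard SVD-based (Procrustes-style) fit of a two-dimensional rotation and translation to the matched, normalized points. The sketch below assumes the normalization of step 1 has already been applied by the caller; the function name and the synthetic example are illustrative:

```python
import numpy as np

def recover_planar_motion(pts, pts_prime):
    """Sketch of steps 1-4 (equations (30)-(32)): recover the 2-D rotation R2
    and translation t2 between two sets of already-normalized, matched ground
    points, given as (N, 2) arrays."""
    mean = pts.mean(axis=0)
    mean_prime = pts_prime.mean(axis=0)

    # Cross-covariance of the centred point sets, equation (30)
    C = (pts_prime - mean_prime).T @ (pts - mean) / len(pts)

    # Equation (31): closest proper rotation via the SVD C = U W V^T
    U, _, Vt = np.linalg.svd(C)
    D = np.diag([1.0, np.linalg.det(U @ Vt)])
    R2 = U @ D @ Vt

    # Equation (32)
    t2 = mean_prime - R2 @ mean
    return R2, t2

# Example: points rotated by 2 degrees and shifted by (0.1, 0.05)
rng = np.random.default_rng(0)
pts = rng.uniform(-1, 1, size=(20, 2))
ang = np.deg2rad(2.0)
R_true = np.array([[np.cos(ang), -np.sin(ang)], [np.sin(ang), np.cos(ang)]])
pts_prime = pts @ R_true.T + np.array([0.1, 0.05])
print(recover_planar_motion(pts, pts_prime))
```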
[0062] A Harris corner detector can be used to detect the feature points in the two consecutive images. Then, a correlation operation on the images is conducted to find the matching between the found feature points. The image point at $(u, \upsilon)$ in image $I$ and the image point at $(u', \upsilon')$ in image $I'$ are matched if, and only if, the following conditions are satisfied:
$(u - u')^2 + (\upsilon - \upsilon')^2 < T_1$ (33)
$\frac{1}{K^2}\sum_{i=u-K}^{u+K}\sum_{j=\upsilon-K}^{\upsilon+K}\left| I(i, j) - I'(i, j) \right| < T_2$ (34)
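A minimal sketch of the matching test of equations (33)-(34) follows. It assumes images are indexed as I[u, v], that the correlation window lies inside both images, and that the second window is centred on the candidate point in I'; the window half-size and the thresholds are example values only, not taken from the patent:

```python
import numpy as np

def points_match(I, I_prime, p, p_prime, K=3, T1=400.0, T2=20.0):
    """Sketch of equations (33)-(34) between a Harris corner p = (u, v) in
    image I and a candidate p' = (u', v') in image I'."""
    (u, v), (u_p, v_p) = p, p_prime

    # Equation (33): candidates must lie within a small image displacement
    if (u - u_p) ** 2 + (v - v_p) ** 2 >= T1:
        return False

    # Equation (34): mean absolute intensity difference over a (2K+1)^2 window
    win  = I[u - K:u + K + 1, v - K:v + K + 1].astype(float)
    win2 = I_prime[u_p - K:u_p + K + 1, v_p - K:v_p + K + 1].astype(float)
    return np.abs(win - win2).mean() < T2
```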
[0063] The estimated motion parameters from the previous cycle are used to guide the matching process. The following outlier deletion method is used to reject scene elements above the ground or from dynamic moving objects, such as vehicles on the road.
[0064] The input is two sets of scene elements (normalized with the intrinsic and extrinsic matrices) in pixel coordinates, denoted by $\{(\hat{u}_i, \hat{\upsilon}_i) \mid i = 1, \ldots, N\}$ and $\{(\hat{u}'_j, \hat{\upsilon}'_j) \mid j = 1, \ldots, M\}$.
[0065] The output is matched point pairs and estimated motion
parameters.
[0066] 1. Predict the location of the elements in the previous frame by using the previous motion parameters:
$\begin{pmatrix} \tilde{u}_i \\ \tilde{\upsilon}_i \end{pmatrix} = R_2 \begin{pmatrix} \hat{u}_i \\ \hat{\upsilon}_i \end{pmatrix} + t_2$
[0067] 2. Use the correlation method to match the set of predicted points $\{(\tilde{u}_i, \tilde{\upsilon}_i)\}$ with the set $\{(\hat{u}'_j, \hat{\upsilon}'_j)\}$.
[0068] 3. Randomly pick no less than four matched pairs that are
not co-linear, and then derive the self-motion parameters using the
method discussed above.
[0069] 4. Validate the derived motion parameters by using the
matched pairs from step 3.
[0070] 5. If the error of the majority of the matched points is sufficiently small, exit the process; otherwise, go to step 3.
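The outlier-deletion loop can be sketched as a RANSAC-style iteration around the SVD fit shown earlier. The `matcher` callable stands in for steps 1-2 (prediction plus correlation matching), the iteration count, tolerance and inlier ratio are illustrative assumptions, and the non-collinearity check of step 3 is omitted for brevity:

```python
import numpy as np

def estimate_motion_with_outlier_rejection(pts, pts_prime, matcher,
                                            n_iters=50, inlier_tol=1.0,
                                            inlier_ratio=0.6):
    """Sketch of the outlier deletion loop of paragraphs [0066]-[0070].
    `recover_planar_motion` is the SVD routine sketched earlier; all names
    and thresholds here are assumed for illustration."""
    matches = matcher(pts, pts_prime)         # steps 1-2: prediction + correlation
    src = np.array([m[0] for m in matches])
    dst = np.array([m[1] for m in matches])
    rng = np.random.default_rng(0)

    for _ in range(n_iters):
        # Step 3: draw four matched pairs (collinearity check omitted)
        idx = rng.choice(len(src), size=4, replace=False)
        R2, t2 = recover_planar_motion(src[idx], dst[idx])

        # Step 4: validate the derived motion against all matched pairs
        err = np.linalg.norm(dst - (src @ R2.T + t2), axis=1)
        inliers = err < inlier_tol

        # Step 5: accept when the majority of points fit the motion
        if inliers.mean() >= inlier_ratio:
            return recover_planar_motion(src[inliers], dst[inliers])
    return None
```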
[0071] Similarly to the range sensor described above, the ego-motion across multiple frames can be estimated by tracking. The plant and observation models can be written as follows.
[0072] The state of motion is denoted by:
[0073] $v_x$: Vehicle longitudinal velocity;
[0074] $v_y$: Vehicle lateral velocity;
[0075] $\delta r$: Yaw-rate measurement correction; and
[0076] $\delta a_x$: Longitudinal acceleration correction.
[0077] Let $v_{xo}$, $a_{xo}$, $r_o$ and $a_{yo}$ denote the measured longitudinal speed, longitudinal acceleration, yaw rate and lateral acceleration, respectively. Then, the plant model can be written as:
$\delta a_x(t+1) = \delta a_x(t) + \epsilon_1$ (35)
$v_x(t+1) = v_x(t) + \Delta T\left(\delta a_x(t) + a_{xo}\right) + \Delta T\,\epsilon_1$ (36)
$\delta r(t+1) = \Delta T\,\frac{-a c_f + b c_r}{I_z\,v_x}\,v_y(t) - \Delta T\,\frac{a^2 c_f + b^2 c_r}{I_z\,v_x}\,(r_o + \delta r(t)) + \epsilon_2$ (37)
$v_y(t+1) = \left(1 - \Delta T\,\frac{c_f + c_r}{m\,v_x}\right) v_y(t) + \Delta T\left(\frac{c_r b - c_f a}{m\,v_x} - v_x\right)(r_o + \delta r(t)) + \Delta T\,\frac{c_f}{m}\,\delta_f + \epsilon_3$ (38)
Where $\epsilon_1$, $\epsilon_2$ and $\epsilon_3$ are zero-mean Gaussian white noise.
[0078] Let the motion parameters recovered from two consecutive views be denoted by $(\Delta\theta_o(t), \Delta x_o(t), \Delta y_o(t))$, where $\Delta\theta_o$ is the rotation angle and $(\Delta x_o, \Delta y_o)$ is the translation. The observation equations can be written as:
$\Delta x_o = v_x\,\Delta T + v_1$ (39)
$\Delta y_o = v_y\,\Delta T + v_2$ (40)
$\Delta\theta_o = (r_o + \delta r)\,\Delta T + v_3$ (41)
$a_{yo} = v_x(r_o + \delta r) + v_4$ (42)
Where $v_1$, $v_2$, $v_3$ and $v_4$ are the noises introduced by the measurements, usually modeled as white zero-mean Gaussian random processes.
[0079] Thus, the Kalman filter can be used to determine the state
variables.
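As a small illustration, the observation equations (39)-(42) map the state directly to the camera-derived motion parameters and the measured lateral acceleration, so they can serve as the measurement function of an (extended) Kalman filter update. The helper below is a hypothetical sketch; all names and numbers are illustrative:

```python
import numpy as np

def camera_observation_model(state, r_o, dT):
    """Sketch of equations (39)-(42): predict the camera-derived motion
    parameters and the measured lateral acceleration from the state
    [v_x, v_y, delta_r, delta_a_x]. Assumed helper for a Kalman update."""
    v_x, v_y, delta_r, delta_a_x = state
    return np.array([
        v_x * dT,               # (39) predicted delta_x_o
        v_y * dT,               # (40) predicted delta_y_o
        (r_o + delta_r) * dT,   # (41) predicted delta_theta_o
        v_x * (r_o + delta_r),  # (42) predicted a_yo
    ])

# Residual fed to the Kalman update: measurement minus prediction (example numbers)
z_meas = np.array([1.50, 0.02, 0.003, 0.45])
print(z_meas - camera_observation_model([15.0, 0.2, 0.0, 0.0], r_o=0.03, dT=0.1))
```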
[0080] The foregoing discussion discloses and describes merely
exemplary embodiments of the present invention. One skilled in the
art will readily recognize from such discussion and from the
accompanying drawings and claims that various changes,
modifications and variations can be made therein without departing
from the spirit and scope of the invention as defined in the
following claims.
* * * * *