U.S. patent application number 15/624234 was filed with the patent office on June 15, 2017 for SMART VEHICLE, and was published as application 20170287335 on October 5, 2017.
The applicant listed for this patent is ARAFAT M.A. ANSARI. Invention is credited to ARAFAT M.A. ANSARI.
Application Number: 20170287335 (Appl. No. 15/624234)
Family ID: 57451874
Filed Date: June 15, 2017

United States Patent Application 20170287335
Kind Code: A1
ANSARI; ARAFAT M.A.
October 5, 2017
SMART VEHICLE
Abstract
Systems and methods are disclosed to control a vehicle by
creating a 3D model of a vehicle environment and the vehicle;
obtaining path information that provides an estimated location of a
travel path; and estimating a neighbor path for at least one
neighboring vehicle.
Inventors: ANSARI; ARAFAT M.A. (Fremont, CA)

Applicant:
Name: ANSARI; ARAFAT M.A.
City: Fremont
State: CA
Country: US

Family ID: 57451874
Appl. No.: 15/624234
Filed: June 15, 2017
Related U.S. Patent Documents

Application Number | Filing Date | Patent Number
14732555           | Jun 5, 2015 | 9711050
15624234           |             |
Current U.S. Class: 1/1
Current CPC Class: B60W 40/04 20130101; B60W 30/09 20130101; G06Q 10/00 20130101; B60K 28/00 20130101; G06Q 40/08 20130101; G06Q 50/30 20130101; G06Q 30/0251 20130101; H04L 67/12 20130101; G06K 9/00355 20130101; G08G 1/166 20130101; G08G 1/22 20130101; G06K 9/00805 20130101; G08G 1/167 20130101; G08G 1/165 20130101; G08G 1/162 20130101; B60W 30/12 20130101
International Class: G08G 1/16 20060101 G08G001/16; G06K 9/00 20060101 G06K009/00; B60W 30/09 20060101 B60W030/09; H04L 29/08 20060101 H04L029/08; G08G 1/00 20060101 G08G001/00
Claims
1. A method to control a vehicle, comprising: creating a 3D model
of a vehicle environment and the vehicle; obtaining path
information that provides an estimated location of a travel path;
estimating a neighbor path for at least one neighboring vehicle;
and when the path information has become unavailable or unreliable,
maintaining a relative position until the path information becomes
available again.
2. The method of claim 1, wherein the path information is based on
at least one of a lane marking on the road, a geographic location
of the vehicle, and a predetermined map of the road.
3. The method of claim 1, further comprising using radio
triangulation, laser ranging or camera estimation to monitor the
neighboring vehicle.
4. The method of claim 1, comprising communicating with neighboring
vehicles to determine a relative position of the vehicle to
neighboring vehicles.
5. The method of claim 1, further comprising: identifying vehicles
traveling to a common destination for a predetermined period;
organizing the vehicles into a flock.
6. The method of claim 5, comprising applying a separation rule to
avoid crowding vehicular neighbors, an alignment rule to steer
towards average heading of vehicular neighbors, and a cohesion rule
to steer towards average position of vehicular neighbors.
7. The method of claim 1, comprising flocking the vehicles over a
plurality of lanes.
8. The method of claim 1, comprising determining a common
destination based on a segment of a vehicle's final
destination.
9. The method of claim 1, wherein vehicles that share same driving
segments broadcast expressions indicating their path(s), comprising
detecting vehicles in the same segment as part of the proximity
services for capturing and sharing crowd-sourced navigation
data.
10. The method of claim 1, comprising capturing crowdsourced
information relating to: closing of a path; predicting an avoidance
maneuver; predicting a congestion with respect to a segment of the
route of the at least one vehicle; and predicting traffic
patterns.
11. The method of claim 1, comprising determining an obstacle in
the path and changing the vehicle's path to avoid the obstacle,
wherein the obstacle comprises rock, hidden object, submerged
object, a path closure, inoperative vehicle, or vehicle in an
accident.
12. The method of claim 1, wherein the vehicle comprises a drone, a
car, a plane, a ship, a motorcycle, or a bicycle.
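The separation, alignment, and cohesion rules recited in claim 6 correspond to the classic flocking ("boids") rules. The following Python sketch shows one possible reading of those three rules; the planar state representation, neighbor radii, and rule weights are illustrative assumptions rather than values from this application.

import numpy as np

def flock_step(positions, velocities, sep_radius=10.0, neighbor_radius=50.0,
               w_sep=1.5, w_align=1.0, w_coh=1.0):
    """One flocking update; returns a steering vector per vehicle.

    positions, velocities: (N, 2) float arrays of planar vehicle states.
    All radii and weights are illustrative assumptions."""
    steering = np.zeros_like(positions)
    for i in range(len(positions)):
        offsets = positions - positions[i]
        dists = np.linalg.norm(offsets, axis=1)
        neighbors = (dists > 0) & (dists < neighbor_radius)
        if not neighbors.any():
            continue
        # Separation rule: steer to avoid crowding vehicular neighbors.
        close = (dists > 0) & (dists < sep_radius)
        separation = -offsets[close].sum(axis=0) if close.any() else np.zeros(2)
        # Alignment rule: steer towards the average heading of vehicular neighbors.
        alignment = velocities[neighbors].mean(axis=0) - velocities[i]
        # Cohesion rule: steer towards the average position of vehicular neighbors.
        cohesion = positions[neighbors].mean(axis=0) - positions[i]
        steering[i] = w_sep * separation + w_align * alignment + w_coh * cohesion
    return steering

Each rule produces a steering contribution, and the weighted sum nudges every vehicle toward flock behavior, including flocking over a plurality of lanes as in claim 7.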
Description
BACKGROUND
[0001] The present invention relates to smart vehicles.
SUMMARY
[0002] In one aspect, systems and methods are disclosed to control a
vehicle by creating a 3D model of a vehicle environment and the
vehicle; obtaining path information that provides an estimated
location of a travel path; and estimating a neighbor path for at
least one neighboring vehicle.
[0003] In an alternative aspect, systems and methods are disclosed to
control a vehicle by creating a 3D model of a vehicle environment
and the vehicle; obtaining path information that provides an
estimated location of a travel path; and estimating a path for a
flock of vehicles.
[0004] In another aspect, systems and methods are disclosed to
control a vehicle by creating a 3D model of the road and the
vehicle; obtaining lane information that provides an estimated
location of a lane; and estimating a path for at least one
neighboring vehicle.
[0005] In other aspects, smart car operations are detailed. The
smart vehicle has a number of sensors such as IoT (internet of
things) sensors that can share data with other vehicles and that
can communicate with the cloud to provide intelligent handling of
the vehicle. The vehicle can be a drone, a car, a plane, a ship, a
motorcycle, or a bicycle.
BRIEF DESCRIPTION OF THE DRAWINGS
[0006] FIG. 1 shows an exemplary environmentally friendly
vehicle;
[0007] FIG. 2 illustrates an exemplary battery system and an
exemplary power cable system for a car;
[0008] FIG. 3 illustrates an exemplary battery system and an
exemplary power cable system for a car;
[0009] FIG. 4A shows an exemplary car electronic system;
[0010] FIG. 4B illustrates another exemplary car electronic
system;
[0011] FIG. 4C illustrates an exemplary gesture control sub-system
in the system of FIG. 4A or 4B.
[0012] FIGS. 5A-5L show exemplary gesture controls of the car;
[0013] FIGS. 6A-6C show exemplary obstacles that may be encountered
by vehicles;
[0014] FIGS. 7A-7H illustrate an exemplary process to fuse data for
3D models used for car navigation;
[0015] FIGS. 8A-8F show exemplary detection of objects outside of
the vehicle and guidance on their handling;
[0016] FIGS. 9A-9B show exemplary systems for capturing navigation
data and using such data for smart vehicles;
[0017] FIG. 10 shows an exemplary group of cars following flock
control behavior;
[0018] FIG. 11 illustrates a typical network environment in which
the systems and methods for cloud-based driver behavior capturing and
monitoring operate;
[0019] FIG. 12 is a diagram illustrating generally, a portion of a
vehicle along with possible locations of sensors, cameras, among
others;
[0020] FIG. 13 is a diagram illustrating generally, possible
locations of sensors, cameras, and/or other technologies;
[0021] FIG. 14 is a sequence diagram illustrating generally,
operations performed by the system as described in the FIG. 11;
[0022] FIG. 15 is a diagram illustrating generally, an overview of
a recommender system that may allow drivers to obtain action
recommendations based on the driver behavior parameters, according
to embodiments disclosed herein;
[0023] FIG. 16 is a diagram 600 illustrating generally, an overview
of preferences matching by the server 202, according to embodiments
disclosed herein;
[0024] FIG. 17 is a flow chart illustrating generally, a method for
selectively providing insurance information to a service provider,
according to embodiments as disclosed herein;
[0025] FIG. 18 is a diagram illustrating generally, an exemplary
system that customizes insurance rates to correspond to driver
behavior, according to embodiments as disclosed herein;
[0026] FIG. 19 is a diagram illustrating generally, an insurance rate
adjustment component that further includes an analyzer component,
according to embodiments as disclosed herein;
[0027] FIG. 20 illustrates generally, a method for customizing
insurance rates of a driver, according to embodiments as described
herein;
[0028] FIG. 21 illustrates generally, a method for presenting
information related to a real-time insurance rate, according to
embodiments as described herein;
[0029] FIG. 22 is a diagram illustrating generally, a method for
installation of a real-time insurance system, according to
embodiments disclosed herein;
[0030] FIG. 23 is a diagram illustrating generally, a method for
gathering information from an on-board monitoring system employed
in a real-time insurance system, according to embodiments as
disclosed herein;
[0031] FIG. 24 is a diagram illustrating generally, a method of
mounting cameras to capture traffic information, according to
embodiments as disclosed herein;
[0032] FIG. 25 is a diagram illustrating generally, a method of
mounting cameras to capture driver behavior, according to
embodiments as disclosed herein; and
[0033] FIG. 26 is a diagram illustrating generally, a first vehicle
program communicating with a second vehicle program through an
Inter-Vehicle Communication, according to embodiments as disclosed
herein.
DESCRIPTION
[0034] Reference in the specification to "one embodiment" or "an
embodiment" is intended to indicate that a particular feature,
structure, or characteristic described in connection with the
embodiment is included in at least one embodiment of the invention.
The appearances of the phrase "in one embodiment" or "an
embodiment" in various places in the specification are not
necessarily all referring to the same embodiment. Throughout the
drawings, reference numbers are re-used to indicate correspondence
between referenced elements. In addition, the first digit of each
reference number indicates the figure in which the element first
appears.
[0035] In the present specification and in the appended claims, the
term "road" is meant to be understood broadly as any digitally
represented or real-world path over which a user of a GPS system may
pass. In some examples, the road may be represented either
incorrectly or correctly in the digital map. In the real-world, the
road may be a newly created path through which a vehicle, a
cyclist, a pedestrian, etc. may travel in order to reach a
destination. The term "vehicle" is meant to be understood broadly
as a car or automobile. The term "GPS" stands for Global
Positioning System and refers to a radio navigation system that
allows land, sea, and airborne users to determine their exact
location. The term "GPS trace" is meant to be understood broadly as
any data defining the recent or past positions of a GPS system,
each GPS trace having a sequence of GPS points comprising:
latitude, longitude, and timestamp or time-delta. In one example,
each GPS point may further comprise data describing the measured
speed of the GPS system and the compass bearing towards which the GPS
system is directed, among others.
[0036] In the following description, for purposes of explanation,
numerous specific details are set forth in order to provide a
thorough understanding of the present systems and methods. It will
be apparent, however, to one skilled in the art that the present
apparatus, systems and methods may be practiced without these
specific details. Reference in the specification to "an example" or
similar language indicates that a particular feature, structure, or
characteristic described in connection with that example is
included as described, but may not be included in other
examples.
[0037] Methods and apparatus that implement the embodiments of the
various features of the disclosure will now be described with
reference to the drawings. The drawings and the associated
descriptions are provided to illustrate embodiments of the
invention and not to limit the scope of the invention.
[0038] Electric Car with Battery as Body
[0039] FIG. 1 shows an exemplary environmentally friendly (green)
vehicle such as a car 1 with a passenger compartment 2 and a
central engine compartment 3 behind passenger compartment 2 with a
front window 14 and one or more side windows and a rear window.
Although the engine compartment 3 is shown as a rear engine
compartment, the engine compartment 3 can also be a front engine
compartment. The engine can be an all-electric engine, a hydrogen
engine, a hybrid engine, or an ultra low emission gas (ULEG) engine.
To minimize emission, the ULEG
engine can be turned off when stopped, or the cylinders can be
disabled if the full power is not needed.
[0040] A frame 4 of the car 1 supports a roof 5 which can be a sun
roof that can expose the passenger compartment 2 in an open
position and can cover the passenger when closed. To support the
sun roof, the frame 4 provides two vertical posts 6 facing each
other on opposite sides of car 1, at the boundary between passenger
compartment 2 and engine compartment 3. When sun roof 5 is in the
closed position, roof members 7 and 8 are substantially horizontal,
substantially coplanar, and positioned seamlessly one behind the
other. The car contains a cooling system that minimizes the weight
and power consumption of conventional air conditioning system for
the car 1.
[0041] In one embodiment, the vehicle exterior body can be a
laminate defining the chamber and piping connected to the chamber.
The vehicle exterior body can be a lightweight composite material.
Composite body structures provide an impact-resistant exterior that
is lighter than steel but three times as strong. The car can
include front crash zones that absorb and deflect energy to keep
the passenger from harm. The car can also provide integrated
high-strength aluminum door beams that transfer crash loads into
the body and away from the cabin. A complement of driver and
passenger air bags is incorporated to ensure that each passenger is
protected and secure.
[0042] The vehicle can provide an evaporative cooling system with a
fluid. The fluid can be Freon or water or any suitable evaporative
fluid. Water is cheap and has no side effect. Thus, in one
embodiment, the system reduces the temperature of a space by making
use of the natural characteristic of water to absorb heat during
its vaporization from the body with which it is in contact.
[0043] In one embodiment, evaporation can be enhanced by creating
small rough surfaces on the floor of a container. Such rough
surfaces can be made by blasting, sanding, or depositing small
projective surfaces on the floor 56. The vapor eventually condenses
and is subsequently collected by the liquid reservoir. A pump can
circulate water needed for the vaporization according to the
specific conditions of each case so that it can be kept wet on its
whole surface. A wet surface such as a shroud or fabric reduces the
temperature of a space by making use of the natural characteristic
of water to absorb heat during its vaporization from the body with
which it is in contact. It includes large wet surfaces created with
a small mass of water within a limited space due to the activation
of the molecular powers of water and of other material with
molecular powers relevant to the ones of water.
[0044] FIG. 2 illustrates an exemplary battery and power cable
system for a car. In FIG. 2 each car body part is a battery shaped
to provide a particular mechanical function. The battery can be a
rechargeable battery such as a lithium type battery, among others.
For example, a battery shaped as hood 100 covers the engine and can
be opened to allow access to the engine and other drive train
components. A battery shaped left and right front portions 102, 104
covers the left and right front part of the car, while a front
battery shaped bumper 116 provides protection against frontal
collision. A battery shaped as a left door 108 and as a right door
110 allows passenger access to the vehicle, while a battery shaped
as a roof 106 protects the occupant from sun or rain. A battery
shaped as a trunk 112 covers a storage space, and a battery shaped
as a bumper 114 protects the vehicle from a rear collision.
[0045] The battery can be rechargeable lithium ion, although other
chemistries can be used. In one embodiment, conformal batteries
such as lithium polymer batteries can be formed to fit the
available space of the car body part regardless of the geometry of
the part. Alternatively, for batteries that are available only in
relatively standard prismatic shapes, the prismatic battery can be
efficiently constructed to fill the space available, be it
rectilinear or irregular (polyhedral) in shape. This conformal
space-filling shape applies in all three dimensions. In one
embodiment, this is done by selecting a slab of lithium polymer
battery material of a desired height; freezing the slab; vertically
cutting the slab to a desired shape thus forming a cut edge;
attaching an anode lead to each anode conductor of the cut slab
along the cut edge while maintaining the cut slab frozen; and
attaching a cathode lead to each cathode conductor of the cut
slab along the cut edge while maintaining the cut slab frozen. The
slab may contain one or many cells. The leads may be made of single
or multistranded, metallic wire, metallic ribbon, low melting point
alloy, self-healing metal, and litz wire. Attachment is
accomplished so as to minimize tension on the leads. The cut slab
may need to be deburred after cutting and before attaching leads.
The cut edge may be inspected for burrs before deburring is
performed. As discussed in US Application Serial 20070079500, the
content of which is incorporated by reference, burr formation can
be avoided by recessing the edge of each anodic half cell or each
cathodic half cell by mechanical means, blowing away dust; and
insulating the recessed edges with non-conductive polymer. Lead
attachment may be accomplished by a number of methods including:
wire bonding; wedge bonding; adhering the lead to the electrode
with conductive epoxy, anisotropic conductive adhesive or
conductive thermoplastic; stapling with microstaples; adhering the
lead to the electrode by electropolymerization; welding the lead to
the electrode with micro welding; and growing a lead in place by
electroless plating, electro-plating or a combination of
electroless plating and electroplating. The leads should be
insulated. Preferably the insulation is thermoplastic. If there is
more than one cell in the slab, the distal ends of the leads may be
connected together so that the cells are connected together in
series, in parallel or some in series and the remainder in
parallel. After the leads have been attached to the cut slab and
connected together, the assembly will preferably be wrapped with
standard packaging for lithium polymer batteries or a shrinkable
form fitting version thereof.
[0046] Because the starting material for the conformal battery is
purchased pre-made from a battery manufacturer, this approach
eliminates the considerable expense of formulating and producing
the materials for the anodes and cathodes as well as combining the
anodes and cathodes into battery cells. This reduces cost and
weight for the car.
[0047] The vehicle may include a liquid reservoir; a vehicle
exterior body having a chamber to contain water and adapted to
evaporate the liquid to cool the vehicle; and a pump coupled to the
chamber and the liquid reservoir to circulate the liquid. The
system includes a handle near a vehicle window, the handle having a
fan to draw air from the vehicle interior to the outside. The
vehicle window includes a motor to move the window up or down,
comprising a processor coupled to the motor to move the window down
to allow air circulation. The chamber can be safety glass. A clear
hydrophobic material can be used in the chamber. A moveable shade
can be provided under the chamber to shield UV rays. The vehicle
body that is cooled can be a rooftop or windows on the car. A solar
cell can be mounted above the chamber. A concentrator can be used
for focusing sunlight on the solar cell. A hydrophobic material can
be used in the chamber or a shroud in the chamber. A vent can be
used to bring evaporation into the vehicle interior to humidify the
interior. A processor can control the pump to vary the pump speed
to adjust the temperature of the vehicle interior. A battery
chamber can be connected to the chamber to cool the battery. The
chamber is positioned on a roof-top, window or trunk. A return path
can connect the liquid reservoir and the chamber, wherein the
return path runs through a passenger seat to cool the seat. The
vehicle exterior body can be a laminate defining the chamber and
piping connected to the chamber. The vehicle exterior body can be a
lightweight composite material. The system distributes recharging
energy so replenishing the battery can be done quickly and in a
distributed manner. Cost is minimized since overhead charging
control components are centralized in a controller. The actual
energy transfer switches are distributed to minimize energy losses.
The system is lightweight and distributes the weight of the
battery throughout the car. The battery can be air cooled since it
is not densely packed into a large brick. Battery repair and
replacement can be done easily as well. The strength of the battery
is available as structural support to provide safety to the
occupant of the vehicle.
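The paragraph above notes that a processor can vary the pump speed to adjust the interior temperature. A minimal proportional-control sketch in Python, assuming a normalized pump speed and an arbitrary gain and setpoint (neither appears in the specification):

def pump_speed(interior_temp_c, setpoint_c=22.0, gain=0.1,
               min_speed=0.0, max_speed=1.0):
    """Vary the evaporative-cooling pump speed with the cabin temperature error.

    Proportional control only; the gain, setpoint, and normalized speed
    range are assumed values."""
    error = interior_temp_c - setpoint_c            # positive when the cabin is too warm
    return max(min_speed, min(max_speed, gain * error))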
[0048] In one embodiment shown in FIG. 3, each of batteries 100-116
has a built in charger (103a-103g) and a switch to isolate each
battery to enable rapid parallel charging from one power cable.
During such parallel charging, each battery is charged independent
of the others. A master charging controller 119 controls and
coordinates the chargers 103a-103g to ensure quick charging.
Battery-monitoring systems can monitor the battery's state of
charge, which in turn determines the battery's cost and
performance. By knowing the battery's state of charge, the system
can use more capacity from each cell, use fewer cells, and maximize
the lifetimes of those cells. Voltage, current, charge, temperature
can provide a good indication of the state of charge. The
charging/discharging of series-connected cells must stop when any
cell reaches its maximum or minimum allowable state of charge. The
system keeps the capacity levels the same in all cells over time
and helps them age in unison. The battery-monitoring system can
tweak the charge level in each cell to derive more energy and
greater lifetime from the pack. Cell balancing is a critical
feature in EVs and HEVs.
[0049] In one embodiment, a passive-balancing technique places a
bleed resistor across a cell when its state of charge exceeds that
of its neighbors. Passive balancing doesn't increase the drive
distance after a charge because the technique dissipates, rather
than redistributes, power. In another embodiment, active balancing
is used so that charge shuttles between cells and does not end up
as wasted heat. This approach requires a storage element such as
capacitors, inductors, or transformers for the charge transfer. The
capacitor continuously switches between two adjacent cells. Current
flows to equalize the voltage and, therefore, the state of charge
of the two cells. Using a bank of switches and capacitors, the
voltage of all cells tends to equalize. The circuit continuously
balances cells in the background as long as the switching clock is
active. A transformer-based scheme transfers charge between a
single cell and a group of cells. The scheme requires
state-of-charge information to select the cell for charging and
discharging to and from the group of six cells.
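As a rough illustration of the balancing behavior described in the two paragraphs above, the sketch below stops charging when any series-connected cell reaches an allowable state-of-charge limit and selects cells for passive bleeding when their state of charge exceeds the pack average. The limits and tolerance are assumed values, not figures from the specification.

def charging_allowed(cell_soc, soc_min=0.05, soc_max=0.95):
    """Charging/discharging of series-connected cells must stop when any
    cell reaches its maximum or minimum allowable state of charge
    (limits assumed, state of charge normalized to 0..1)."""
    return all(soc_min < soc < soc_max for soc in cell_soc)

def cells_to_bleed(cell_soc, tolerance=0.01):
    """Passive balancing: switch a bleed resistor across a cell whose state
    of charge exceeds that of its neighbors, approximated here by the pack
    average plus a small tolerance (assumed)."""
    average = sum(cell_soc) / len(cell_soc)
    return [i for i, soc in enumerate(cell_soc) if soc > average + tolerance]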
[0050] FIG. 4A shows a block diagram of an embodiment of an
electrical power and automobile control system. The system is
controlled by a processor 202. The processor 202 is connected with
an inertial system (INS) 204 and a global positioning system (GPS)
receiver 206 that generate navigation information. The processor
202 is also connected with a wireless communication device 208 that
transmits and receives digital data as well as being a Doppler
radar when desired. The processor 202 drives a display 210 and a
speaker 212 for alerting a driver. The processor 202 provides
control inputs to the automobile's braking and steering systems
220. A power cable 200 carries power between the batteries 100-116
and an electric motor engine (not shown). The power cable 200 also
carries power to recharge the batteries 100-116 serially or in
parallel as discussed above.
[0051] The power cable 200 can be a coaxial cable or a power cable
and a data cable. In one embodiment, the same wire carrying power
also carries data. Data in the form of radio frequency (RF) energy
can be bundled on the same line that carries electrical current.
Since RF and electricity vibrate on different frequencies, there is
no interference between the two. As such, data packets transmitted
over RF frequencies are not overwhelmed or lost because of
electrical current. Eventually, the data can be provided to
wireless transmitters that will wirelessly receive the signal and
send the data on to computer stations. Exemplary protocols that can
be used include CAN-bus, LIN-bus over power line (DC-LIN), and
LonWorks power line based control. In one embodiment, the protocol
is compatible with the HomePlug specifications for home networking
technology that connects devices to each other through the power
lines in a home. Many devices have HomePlug built in and to connect
them to a network all one has to do is to plug the device into the
wall in a home with other HomePlug devices. In this way, when the
vehicle is recharged by plugging the home power line to the vehicle
connectors, automotive data is automatically synchronized with a
computer in the home or office.
[0052] Alternatively, two separate transmission media can be used:
one to carry power and a second to carry data. In one embodiment,
the data cable can be a fiber optic cable while the power cable can
be copper cable or even copper coated with silver or gold. The data
cable can also be an Ethernet cable. The data can be an Internet
Protocol (IP) in the cable. Each body panel can have a battery
recharger. The body panel can be made of lithium ion batteries. The
batteries can have a shape that conforms to a specific shape such
as a door or a hood or a seat, for example. To protect the
occupant, a beam can be used that transfers a crash load into the
vehicle body and away from a passenger cabin. Additionally, driver
and passenger air bags are positioned in the vehicle body. A wireless
transceiver can be connected to the power cable. The wireless
transceiver sends status of components in the vehicle to a remote
computer. The wireless transceiver communicates maintenance
information to a remote computer. If needed, the remote computer
orders a repair part based on the maintenance information and
schedules a visit to a repair facility to install the repair
part.
[0053] Sensors in the Car
[0054] This embodiment includes navigation systems, the INS 204 and
the GPS receiver 206. Alternate embodiments may feature an
integrated GPS and INS navigation system or other navigation
system. The use of only an INS 204 or only a GPS receiver 206 as
the sole source of navigation information is also contemplated.
Alternatively, the wireless communication device 208 can
triangulate with two other fixed wireless devices to generate
navigation information.
[0055] A display 210 and speaker/microphone 212 provide both visual
and audio situational awareness information to a driver. Alternate
embodiments may feature only a display 210 or only a speaker 212 as
the sole source of information for the driver. Embodiments that
interact directly with the braking and steering systems that
provide no audio information to the driver are also
contemplated.
[0056] The INS 204 supplies the processor 202 with navigation
information derived from accelerometers and angular position or
angular rate sensors. The processor 202 may also provide the INS
204 with initial position data or periodic position updates that
allow the INS 204 to correct drift errors, misalignment errors or
other errors.
[0057] The INS 204 may be a standard gimbal or strapdown INS having
one or more gyroscopes and substantially orthogonally mounted
accelerometers. Alternatively, the INS 204 may have accelerometers
and microelectromechanical systems (MEMS) that estimate angular
position or angular rates. An INS 204 having a gyroscope for
detecting automobile heading and a speed sensor is also
contemplated.
[0058] The GPS receiver 206 supplies the processor 202 with
navigation information derived from timing signals received from the
GPS satellite constellation. The processor 202 may provide the GPS
receiver 206 with position data to allow the GPS receiver 206 to
quickly reacquire the timing signals if the timing signals are
temporarily unavailable. GPS timing signals may be unavailable for a
variety of reasons, for example, antenna shadowing as a result of
driving through a tunnel or an indoor parking garage. The GPS
receiver 206 may also have a radio receiver for receiving
differential corrections that make the GPS navigation information
even more accurate.
[0059] The INS 204 and the GPS receiver 206 are complementary
navigation systems. The INS 204 is very responsive to changes in
the trajectory of the automobile. A steering or braking input is
sensed very quickly at the accelerometers and the angular position
sensors. INS 204 position and velocity estimates, however, are
derived by integrating accelerometer measurements and errors in the
estimates accumulate over time. The GPS receiver 206 is not
generally as responsive to changes in automobile trajectory but
continually estimates position very accurately. The use of both the
INS 204 and the GPS receiver 206 allows the processor 202 to
estimate the automobile's state more accurately than with a single
navigation system.
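One common way to combine a responsive-but-drifting INS with a slower-but-accurate GPS, as described above, is a small Kalman filter that predicts with the INS acceleration and corrects with each GPS position fix. The sketch below is a one-dimensional illustration with assumed noise variances; it is not the specific estimator used by the processor 202.

import numpy as np

def fuse_ins_gps(pos, vel, p_cov, accel, gps_pos, dt,
                 accel_var=0.5, gps_var=4.0):
    """One predict/update cycle of a 1D Kalman filter fusing INS and GPS.

    State is [position, velocity]; the INS accelerometer drives the
    prediction and the GPS position measurement corrects the drift.
    Noise variances are illustrative assumptions."""
    # Predict: integrate the INS acceleration over dt.
    F = np.array([[1.0, dt], [0.0, 1.0]])
    B = np.array([0.5 * dt * dt, dt])
    x = F @ np.array([pos, vel]) + B * accel
    Q = np.outer(B, B) * accel_var
    P = F @ p_cov @ F.T + Q
    # Update: correct with the GPS position measurement.
    H = np.array([[1.0, 0.0]])
    y = gps_pos - (H @ x)[0]                     # innovation
    S = (H @ P @ H.T)[0, 0] + gps_var
    K = (P @ H.T / S).ravel()                    # Kalman gain
    x = x + K * y
    P = (np.eye(2) - np.outer(K, H.ravel())) @ P
    return x[0], x[1], P

# Example call: pos, vel, P = fuse_ins_gps(0.0, 10.0, np.eye(2),
#                                          accel=0.2, gps_pos=0.5, dt=0.1)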
[0060] The wireless communication device 208 receives the
automobile's navigated state vector from the processor 202. The
wireless communication device 208 broadcasts this state
vector for use by neighboring automobiles. The wireless
communication device 208 also receives the state vectors from
neighboring automobiles. The received state vectors from the
neighboring automobiles are sent to the processor 202 for further
processing. The automobile state vector may have more or fewer
elements describing the state of the vehicle such as the XYZ
position and 3D velocity of the vehicle and 3D acceleration. Other
information may be provided. For example the state vector may
contain entries that describe the angular position, the angular
rates, and the angular accelerations. The state vector may be
described using any coordinate system or any type of units. The
state vector may also contain information about the vehicle such as
its weight, stopping distance, its size, its fuel state etc.
Information packed in the state vector may be of value in collision
avoidance trajectory analysis or may be useful for generating and
displaying more accurate display symbology for the driver. For
example, the automobile may receive a state vector from a
neighboring vehicle that identifies the vehicle as an eighteen
wheel truck with a ten ton load. Such information may be important
for trajectory analysis and for providing accurate and informative
display symbology.
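A state vector of the kind described above can be represented as a simple record and serialized for broadcast by the wireless communication device 208. The field names, units, and JSON encoding below are assumptions for illustration; the specification does not fix a message format.

from dataclasses import dataclass, asdict
import json

@dataclass
class VehicleStateVector:
    """Illustrative broadcast record; field names and units are assumptions."""
    vehicle_id: str
    position_xyz: tuple       # meters, in a shared reference frame
    velocity_xyz: tuple       # meters per second
    acceleration_xyz: tuple   # meters per second squared
    weight_kg: float
    stopping_distance_m: float

def encode_for_broadcast(state: VehicleStateVector) -> bytes:
    """Serialize the state vector for transmission to neighboring vehicles."""
    return json.dumps(asdict(state)).encode("utf-8")

def decode_broadcast(payload: bytes) -> VehicleStateVector:
    """Rebuild a neighboring vehicle's state vector from a received payload."""
    return VehicleStateVector(**json.loads(payload.decode("utf-8")))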
[0061] The wireless communication device 208 may be part of a local
area wireless network such as an IEEE 802.11 network. The local
area network may be a mesh network, ad-hoc network, contention
access network or any other type of network. The use of a device
that is mesh network enabled according to a widely accepted
standard such as 802.11(s) may be a good choice for a wireless
communication device 208. The wireless communication device 208 may
also feature a transmitter with low broadcast power to allow
automobiles in the area to receive the broadcast signal. The
broadcast of state vectors over a broad area network or the
internet is also contemplated.
[0062] The display 210 and the speaker 212 are features that
provide the driver with situational awareness. The processor 202
sends commands to the display 210 and the speaker 212 that alert
the driver to hazards. The display 210 may for example show the
relative positions and velocities of neighboring vehicles. The
display 210 may also warn the driver to slow down or apply the
brakes immediately. The speaker 212 may give aural warnings such as
"STOP" or "CAUTION VEHICLE APPROACHING".
[0063] The braking and steering systems 220 may also be commanded
by the processor 202. The processor 202 may command that the brakes
be applied to prevent collision with a vehicle ahead or may provide
a steering input to prevent the driver from colliding with a
vehicle. The processor 202 may also issue braking or steering
commands to minimize the damage resulting from a collision as
discussed in United States Patent Application 20080091352, the
content of which is incorporated by reference.
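A braking command of the kind described above can be triggered by a simple gap and time-to-collision check against the state vector received from the vehicle ahead. The thresholds and the constant-speed assumption in this sketch are illustrative only.

def should_brake(own_position_m, own_speed_mps, lead_position_m, lead_speed_mps,
                 min_gap_m=5.0, min_ttc_s=2.0):
    """Decide whether to command the brake unit, using 1D along-track states.

    Thresholds and the constant-speed closing model are assumed values."""
    gap = lead_position_m - own_position_m
    closing_speed = own_speed_mps - lead_speed_mps
    if gap <= min_gap_m:
        return True                              # already inside the minimum gap
    if closing_speed <= 0.0:
        return False                             # not closing on the vehicle ahead
    return (gap / closing_speed) < min_ttc_s     # time to collision too short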
[0064] FIG. 4B is a simplified block diagram of an example vehicle
700, in accordance with an embodiment. The vehicle 700 may, for
example, be similar to the vehicle 600 described above in
connection with FIG. 6. The vehicle 700 may take other forms as
well. While the vehicle 700 in FIG. 7 is described as being
configured to operate in an autonomous mode, in some embodiments
the above methods may be implemented in a vehicle that is not
configured to operate in an autonomous mode. In these embodiments,
the vehicle may include fewer and/or different systems and/or
components. The sensor system 704 may include a number of sensors
configured to sense information about an environment in which the
vehicle 700 is located, as well as one or more actuators 736
configured to modify a position and/or orientation of the sensors.
As shown, the sensors of the sensor system include a Global
Positioning System (GPS) 726, an inertial measurement unit (IMU)
728, a RADAR unit 730, a laser rangefinder and/or LIDAR unit 732,
and a camera 734. The sensor system 704 may include additional
sensors as well, including, for example, sensors that monitor
internal systems of the vehicle 700 (e.g., an O2 monitor, a fuel
gauge, an engine oil temperature, etc.). Other sensors are possible
as well.
[0065] The GPS 726 may be any sensor configured to estimate a
geographic location of the vehicle 700. To this end, the GPS 726
may include a transceiver configured to estimate a position of the
vehicle 700 with respect to the Earth. The GPS 726 may take other
forms as well.
[0066] The IMU 728 may be any combination of sensors configured to
sense position and orientation changes of the vehicle 700 based on
inertial acceleration. In some embodiments, the combination of
sensors may include, for example, accelerometers and gyroscopes.
Other combinations of sensors are possible as well.
[0067] The RADAR unit 730 may be any sensor configured to sense
objects in the environment in which the vehicle 700 is located
using radio signals. In some embodiments, in addition to sensing
the objects, the RADAR unit 730 may additionally be configured to
sense the speed and/or heading of the objects.
[0068] Similarly, the laser rangefinder or LIDAR unit 732 may be
any sensor configured to sense objects in the environment in which
the vehicle 700 is located using lasers. In particular, the laser
rangefinder or LIDAR unit 732 may include a laser source and/or
laser scanner configured to emit a laser and a detector configured
to detect reflections of the laser. The laser rangefinder or LIDAR
732 may be configured to operate in a coherent (e.g., using
heterodyne detection) or an incoherent detection mode.
[0069] In one embodiment, a LIDAR-on-a-chip system steers its beam
electronically using arrays of many small emitters that each put
out a signal at a slightly different phase. The new phased array
thus forms a synthetic beam that it can sweep from one extreme to
another and back again 100,000 times a second. In one embodiment,
each antenna, which consists of a silicon waveguide and five curved
grooves etched in silicon, is 3 micrometers long, 2.8 .mu.m wide,
and 0.22 .mu.m thick. An infrared laser beam is delivered to the
antennas through a waveguide.
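The phased-array behavior described above comes from driving each emitter at a slightly different phase: a linear phase ramp across the array steers the synthetic beam. The sketch below computes per-emitter phases for an assumed element count, element pitch, and wavelength; the specification gives antenna dimensions but not these array parameters.

import numpy as np

def emitter_phases(num_emitters=64, pitch_um=2.0, wavelength_um=1.55,
                   steer_angle_deg=0.0):
    """Per-emitter phase offsets that steer the synthetic beam.

    A phase ramp of 2*pi*pitch*sin(theta)/wavelength per element points
    the beam at the requested angle; element count, pitch, and wavelength
    here are assumed values."""
    theta = np.radians(steer_angle_deg)
    step = 2.0 * np.pi * pitch_um * np.sin(theta) / wavelength_um
    return (np.arange(num_emitters) * step) % (2.0 * np.pi)

# Sweeping steer_angle_deg rapidly back and forth reproduces the repeated
# beam scan described above.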
[0070] The camera 734 may be any camera (e.g., a still camera, a
video camera, etc.) configured to record three-dimensional images
of an interior portion of the vehicle 700. To this end, the camera
734 may be, for example, a depth camera. Alternatively or
additionally, the camera 734 may take any of the forms described
above in connection with the exterior camera 610. In some
embodiments, the camera 734 may comprise multiple cameras, and the
multiple cameras may be positioned in a number of positions on the
interior and exterior of the vehicle 700.
[0071] The control system 706 may be configured to control
operation of the vehicle 700 and its components. To this end, the
control system 706 may include a steering unit 738, a throttle 740,
a brake unit 742, a sensor fusion algorithm 744, a computer vision
system 746, a navigation or pathing system 748, and an obstacle
avoidance system 750. The steering unit 738 may be any combination
of mechanisms configured to adjust the heading of vehicle 700. The
throttle 740 may be any combination of mechanisms configured to
control the operating speed of the engine/motor 718 and, in turn,
the speed of the vehicle 700.
[0072] The brake unit 742 may be any combination of mechanisms
configured to decelerate the vehicle 700. For example, the brake
unit 742 may use friction to slow the wheels/tires 724. As another
example, the brake unit 742 may convert the kinetic energy of the
wheels/tires 724 to electric current. The brake unit 742 may take
other forms as well.
[0073] The sensor fusion algorithm 744 may be an algorithm (or a
computer program product storing an algorithm) configured to accept
data from the sensor system 704 as an input. The data may include,
for example, data representing information sensed at the sensors of
the sensor system 704. The sensor fusion algorithm 744 may include,
for example, a Kalman filter, a Bayesian network, or another
algorithm. The sensor fusion algorithm 744 may further be
configured to provide various assessments based on the data from
the sensor system 704, including, for example, evaluations of
individual objects and/or features in the environment in which the
vehicle 700 is located, evaluations of particular situations,
and/or evaluations of possible impacts based on particular
situations. Other assessments are possible as well.
[0074] The computer vision system 746 may be any system configured
to process and analyze images captured by the camera 734 in order
to identify objects and/or features in the environment in which the
vehicle 700 is located, including, for example, traffic signals and
obstacles (e.g., in embodiments where the camera 734 includes
multiple cameras, including a camera mounted on the exterior of the
vehicle 700). To this end, the computer vision system 746 may use
an object recognition algorithm, a Structure from Motion (SFM)
algorithm, video tracking, or other computer vision techniques. In
some embodiments, the computer vision system 746 may additionally
be configured to map the environment, track objects, estimate the
speed of objects, etc.
[0075] The navigation/path system 748 may be any system configured
to determine a driving path for the vehicle 700. The
navigation/path system 748 may additionally be configured to update
the driving path dynamically while the vehicle 700 is in operation.
In some embodiments, the navigation and path system 748 may be
configured to incorporate data from the sensor fusion algorithm
744, the GPS 726, and one or more predetermined maps so as to
determine the driving path for the vehicle 700.
[0076] The obstacle avoidance system 750 may be any system
configured to identify, evaluate, and avoid or otherwise negotiate
obstacles in the environment in which the vehicle 700 is located.
The control system 706 may additionally or alternatively include
components other than those shown.
[0077] Peripherals 708 may be configured to allow the vehicle 700
to interact with external sensors, other vehicles, and/or a user.
To this end, the peripherals 708 may include, for example, a
wireless communication system 752, a touchscreen 754, a microphone
756, and/or a speaker 758.
[0078] The wireless communication system 752 may take any of the
forms described above. In one embodiment, it can be the Dedicated
Short Range Communications (DSRC) which provides the
communications-based active safety systems. DSRC communications
take place over a dedicated 75 MHz spectrum band around 5.9 GHz,
allocated by the US Federal Communications Commission (FCC) for
vehicle safety applications. In contrast to WiFi, DSRC can
accommodate an extremely short time in which devices must recognize
each other and transmit messages to each other. A large number of
these safety applications require response times measured in
milliseconds. DSRC is targeted to operate in a 75 MHz licensed
spectrum around 5.9 GHz, as opposed to IEEE 802.11a that is allowed
to utilize only the unlicensed portions in the frequency band. DSRC
is meant for outdoor high-speed vehicle (up to 120 mph)
applications, as opposed to IEEE 802.11a originally designed for
indoor WLAN (walking speed) applications. In IEEE 802.11a, all PHY
parameters are optimized for the indoor low-mobility propagation
environment. Communications-based active safety applications use
vehicle-to-vehicle (V2V) and vehicle-to-infrastructure (V2I)
short-range wireless communications to detect potential hazards in
a vehicle's path, even those that the driver does not see. The connected
vehicle provides enhanced awareness at potentially reduced cost,
and offers additional functionality over autonomous sensor systems
available on some vehicles today. Communications-based sensor
systems provide a low-cost means of enabling hazard detection
capability on all vehicle classes, but require vehicles and
infrastructure to be outfitted with interoperable communications
capabilities of DSRC or similar Vehicle to Vehicle networks.
[0079] The touchscreen 754 may be used by a user to input commands
to the vehicle 700. To this end, the touchscreen 754 may be
configured to sense at least one of a position and a movement of a
user's finger via capacitive sensing, resistance sensing, or a
surface acoustic wave process, among other possibilities. The
touchscreen 754 may be capable of sensing finger movement in a
direction parallel or planar to the touchscreen surface, in a
direction normal to the touchscreen surface, or both, and may also
be capable of sensing a level of pressure applied to the
touchscreen surface. The touchscreen 754 may be formed of one or
more translucent or transparent insulating layers and one or more
translucent or transparent conducting layers. The touchscreen 754
may take other forms as well.
[0080] The microphone 756 may be configured to receive audio (e.g.,
a voice command or other audio input) from a user of the vehicle
700. Similarly, the speakers 758 may be configured to output audio
to the user of the vehicle 700.
[0081] The computer system 710 may be configured to transmit data
to and receive data from one or more of the propulsion system 702,
the sensor system 704, the control system 706, and the peripherals
708. To this end, the computer system 710 may be communicatively
linked to one or more of the propulsion system 702, the sensor
system 704, the control system 706, and the peripherals 708 by a
system bus, network, and/or other connection mechanism (not
shown).
[0082] The computer system 710 may be further configured to
interact with and control one or more components of the propulsion
system 702, the sensor system 704, the control system 706, and/or
the peripherals 708. For example, the computer system 710 may be
configured to control operation of the transmission 722 to improve
fuel efficiency. As another example, the computer system 710 may be
configured to cause the camera 734 to record three-dimensional
images of an interior of the vehicle. As yet another example, the
computer system 710 may be configured to store and execute
instructions corresponding to the sensor fusion algorithm 744. As
still another example, the computer system 710 may be configured to
store and execute instructions for displaying a display on the
touchscreen 754. Other examples are possible as well.
[0083] As shown, the computer system 710 includes the processor 712
and data storage 714. The processor 712 may comprise one or more
general-purpose processors and/or one or more special-purpose
processors. To the extent the processor 712 includes more than one
processor, such processors could work separately or in combination.
Data storage 714, in turn, may comprise one or more volatile and/or
one or more non-volatile storage components, such as optical,
magnetic, and/or organic storage, and data storage 714 may be
integrated in whole or in part with the processor 712.
[0084] In some embodiments, data storage 714 may contain
instructions 716 (e.g., program logic) executable by the processor
712 to execute various vehicle functions, including those described
above in connection with FIGS. 1 and 3A-4C. Further, data storage
714 may contain a correlation 762 for the vehicle 700, which may
take any of the forms described above. Data storage 714 may contain
additional instructions as well, including instructions to transmit
data to, receive data from, interact with, and/or control one or
more of the propulsion system 702, the sensor system 704, the
control system 706, and the peripherals 708.
[0085] As shown, the vehicle 700 further includes a power supply
760, which may be configured to provide power to some or all of the
components of the vehicle 700. To this end, the power supply 760
may include, for example, a rechargeable lithium-ion or lead-acid
battery. In some embodiments, one or more banks of batteries could
be configured to provide electrical power. Other power supply
materials and configurations are possible as well. In some
embodiments, the power supply 760 and energy source 720 may be
implemented together, as in some all-electric cars.
[0086] In some embodiments, one or more of the propulsion system
702, the sensor system 704, the control system 706, and the
peripherals 708 could be configured to work in an interconnected
fashion with other components within and/or outside their
respective systems.
[0087] Further, the vehicle 700 may include one or more elements in
addition to or instead of those shown. For example, the vehicle 700
may include one or more additional interfaces and/or power
supplies. Other additional components are possible as well. In such
embodiments, data storage 714 may further include instructions
executable by the processor 712 to control and/or communicate with
the additional components.
[0088] Still further, while each of the components and systems are
shown to be integrated in the vehicle 700, in some embodiments, one
or more components or systems may be removably mounted on or
otherwise connected (mechanically or electrically) to the vehicle
700 using wired or wireless connections.
[0089] Still further, while the above description focused on a
vehicle 700 configured to operate in an autonomous mode, in other
embodiments the vehicle may not be configured to operate in an
autonomous mode. In these embodiments, for example, one or more of
the following components may be omitted: the global positioning
system 726, the inertial measurement unit 728, the RADAR unit 730,
the laser rangefinder or LIDAR unit 732, the actuators 736, the
sensor fusion algorithm 744, the computer vision system 746, the
navigation or path system 748, the obstacle avoidance system 750,
the wireless communication system 752, the touchscreen 754, the
microphone 756, and the speaker 758.
[0090] Car Repair/Maintenance
[0091] The sensors can be used for maintenance prediction and in
case of component failure, to help the driver to navigate safely.
For example, the vehicle can monitor brake pad wear and adjust
how hard the brakes need to be applied in light of other vehicles
and how quickly the vehicle needs to come to a complete stop. In
addition to changing the way the vehicle brakes, the vehicle may
change the way it maneuvers in other ways as well, such as
accelerating differently or changing directions. For instance, the
vehicle may accelerate more slowly if the measured oil pressure is
excessively high. The vehicle may also turn more or less tightly in
order to mitigate wear. The vehicle may also use other systems and
methods to determine the state of a vehicle component. For example,
the vehicle may monitor how far it takes the car to stop compared
to expected braking distance. If the distance is longer than
expected, such as taking longer than it has in the past, the
computer system may determine that the brakes are worn and start
braking earlier. The system and method may also estimate the state
of a component based on its repair service record. In that regard,
the processor may query data 134 or an external database (e.g., a
server with which the vehicle is in wireless communication) for
repair records and estimate the wear on a component based on the
length of time since the last repair.
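Two of the checks described above, comparing the observed stopping distance to the expected distance and estimating wear from the repair service record, can be expressed as simple ratios. The cap and nominal service life in this sketch are assumed values.

def braking_margin_factor(observed_stop_m, expected_stop_m, max_factor=1.5):
    """Scale up the distance at which braking starts when observed stops run
    longer than expected; a ratio above 1.0 suggests worn brakes.
    The cap is an assumed value."""
    ratio = observed_stop_m / expected_stop_m
    return min(max(ratio, 1.0), max_factor)

def estimated_wear_from_service_record(days_since_last_repair,
                                       nominal_service_life_days=730):
    """Rough wear fraction based on the length of time since the last repair
    (nominal service life assumed)."""
    return min(days_since_last_repair / nominal_service_life_days, 1.0)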
[0092] The system and method may rely on other information to
change the way the vehicle is maneuvered. For instance, the vehicle
may sense weight distribution and adjust maneuvering in response to
the changes in the loading and/or weight distributions on the
vehicle. The vehicle may further move differently when there is
only one user in the vehicle than when four passengers are on board, or
differently with light loads than when hauling a trailer behind.
The vehicle may also adapt the driving to the observed
environmental changes such as weather or roadway conditions.
[0093] Modeling of the patterns of changes in the vehicle's
performance and conditions, as well as modeling of the patterns of
changes in the driving environment, may be performed by the
autonomous driving computer system. Alternatively, predetermined
models may be stored in the autonomous driving system. The computer
system may process the observed data, fit them into the 3D models
in FIGS. 7A-7H, and issue compensation signals accordingly.
[0094] The vehicle may take the steps necessary to repair a
component. By way of example, when the vehicle is not being used by
anyone, the vehicle may autonomously and without direct human
assistance navigate to a repair facility, notify the facility of
the component that requires repair and return to its original
location when the repair is finished.
[0095] Gesture Sensor for Vehicular Control
[0096] FIG. 4C shows an exemplary gesture recognition system. The
system takes advantage of the numerous cameras onboard the vehicle
for navigation and mapping purposes, and additionally includes the
gesture control feature. System 800 includes a pair of cameras 802,
804 coupled to an image-analysis system 806. Cameras 802, 804 can
be any type of camera, including cameras sensitive across the
visible spectrum or, more typically, with enhanced sensitivity to a
confined wavelength band (e.g., the infrared (IR) or ultraviolet
bands); more generally, the term "camera" herein refers to any
device (or combination of devices) capable of capturing an image of
an object and representing that image in the form of digital data.
For example, line sensors or line cameras rather than conventional
devices that capture a two-dimensional (2D) image can be employed.
The term "light" is used generally to connote any electromagnetic
radiation, which may or may not be within the visible spectrum, and
may be broadband (e.g., white light) or narrowband (e.g., a single
wavelength or narrow band of wavelengths).
[0097] Cameras 802, 804 are preferably capable of capturing video
images (i.e., successive image frames at a constant rate of at
least 15 frames per second), although no particular frame rate is
required. The capabilities of cameras 802, 804 are not critical to
the invention, and the cameras can vary as to frame rate, image
resolution (e.g., pixels per image), color or intensity resolution
(e.g., number of bits of intensity data per pixel), focal length of
lenses, depth of field, etc. In general, for a particular
application, any cameras capable of focusing on objects within a
spatial volume of interest can be used. For instance, to capture
motion of the hand of an otherwise stationary person, the volume of
interest might be defined as a cube approximately one meter on a
side.
[0098] System 800 also includes a pair of light sources 808, 810,
which can be disposed to either side of cameras 802, 804, and
controlled by image-analysis system 806. Light sources 808, 810 can
be infrared light sources of generally conventional design, e.g.,
infrared light-emitting diodes (LEDs), and cameras 802, 804 can be
sensitive to infrared light. Filters 820, 822 can be placed in
front of cameras 802, 804 to filter out visible light so that only
infrared light is registered in the images captured by cameras 802,
804. In some embodiments where the object of interest is a person's
hand or body, use of infrared light can allow the motion-capture
system to operate under a broad range of lighting conditions and
can avoid various inconveniences or distractions that may be
associated with directing visible light into the region where the
person is moving. However, no particular wavelength or region of the
electromagnetic spectrum is required.
[0099] It should be stressed that the foregoing arrangement is
representative and not limiting. For example, lasers or other light
sources can be used instead of LEDs. For laser setups, additional
optics (e.g., a lens or diffuser) may be employed to widen the
laser beam (and make its field of view similar to that of the
cameras). Useful arrangements can also include short- and
wide-angle illuminators for different ranges. Light sources are
typically diffuse rather than specular point sources; for example,
packaged LEDs with light-spreading encapsulation are suitable.
[0100] In operation, cameras 802, 804 are oriented toward a region
of interest 812 in which an object of interest 814 (in this
example, a hand) and one or more background objects 816 can be
present. Light sources 808, 810 are arranged to illuminate region
812. In some embodiments, one or more of the light sources 808, 810
and one or more of the cameras 802, 804 are disposed below the
motion to be detected, e.g., where hand motion is to be detected,
beneath the spatial region where that motion takes place. This is
an optimal location because the amount of information recorded
about the hand is proportional to the number of pixels it occupies
in the camera images, and the hand will occupy more pixels when the
camera's angle with respect to the hand's "pointing direction" is
as close to perpendicular as possible. Because it is uncomfortable
for a user to orient his palm toward a screen, the optimal
positions are either from the bottom looking up, from the top
looking down (which requires a bridge) or from the screen bezel
looking diagonally up or diagonally down. In scenarios looking up
there is less likelihood of confusion with background objects
(clutter on the user's desk, for example) and if it is directly
looking up then there is little likelihood of confusion with other
people out of the field of view (and also privacy is enhanced by
not imaging faces). Image-analysis system 806, which can be, e.g.,
a computer system, can control the operation of light sources 808,
810 and cameras 802, 804 to capture images of region 812. Based on
the captured images, image-analysis system 806 determines the
position and/or motion of object 814.
[0101] For example, as a step in determining the position of object
814, image-analysis system 806 can determine which pixels of
various images captured by cameras 802, 804 contain portions of
object 814. In some embodiments, any pixel in an image can be
classified as an "object" pixel or a "background" pixel depending
on whether that pixel contains a portion of object 814 or not. With
the use of light sources 808, 810, classification of pixels as
object or background pixels can be based on the brightness of the
pixel. For example, the distance (rO) between an object of interest
814 and cameras 802, 804 is expected to be smaller than the
distance (rB) between background object(s) 816 and cameras 802,
804. Because the intensity of light from sources 808, 810 decreases
as 1/r², object 814 will be more brightly lit than background 816,
and pixels containing portions of object 814 (i.e., object pixels)
will be correspondingly brighter than pixels containing portions of
background 816 (i.e., background pixels). For example, if rB/rO=2,
then object pixels will be approximately four times brighter than
background pixels, assuming object 814 and background 816 are
similarly reflective of the light from sources 808, 810, and
further assuming that the overall illumination of region 812 (at
least within the frequency band captured by cameras 802, 804) is
dominated by light sources 808, 810. These assumptions generally
hold for suitable choices of cameras 802, 804, light sources 808,
810, filters 810, 812, and objects commonly encountered. For
example, light sources 808, 810 can be infrared LEDs capable of
strongly emitting radiation in a narrow frequency band, and filters
810, 812 can be matched to the frequency band of light sources 808,
810. Thus, although a human hand or body, or a heat source or other
object in the background, may emit some infrared radiation, the
response of cameras 802, 804 can still be dominated by light
originating from sources 808, 810 and reflected by object 814 and/or
background 816.
[0102] In this arrangement, image-analysis system 806 can quickly
and accurately distinguish object pixels from background pixels by
applying a brightness threshold to each pixel. For example, pixel
brightness in a CMOS sensor or similar device can be measured on a
scale from 0.0 (dark) to 1.0 (fully saturated), with some number of
gradations in between depending on the sensor design. The brightness encoded by the camera pixels typically scales linearly with the luminance of the object, as a consequence of the deposited charge or diode voltages. In some embodiments, light
sources 808, 810 are bright enough that reflected light from an
object at distance rO produces a brightness level of 1.0 while an
object at distance rB=2rO produces a brightness level of 0.25.
Object pixels can thus be readily distinguished from background
pixels based on brightness. Further, edges of the object can also
be readily detected based on differences in brightness between
adjacent pixels, allowing the position of the object within each
image to be determined. Correlating object positions between images
from cameras 802, 804 allows image-analysis system 806 to determine
the location in 3D space of object 814, and analyzing sequences of
images allows image-analysis system 806 to reconstruct 3D motion of
object 814 using conventional motion algorithms.
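As an illustration of the brightness-based classification described above, the following sketch (in Python, with an illustrative 0.5 cutoff and array contents that are not part of the specification) labels pixels as object or background; with inverse-square falloff, an object saturating near 1.0 leaves a background at twice the distance near 0.25, so a mid-range threshold separates the two.

import numpy as np

def classify_pixels(image, threshold=0.5):
    """Label pixels brighter than the threshold as object, the rest as background.

    image: 2D array of brightness values normalized to the 0.0-1.0 range.
    Returns a boolean mask that is True for object pixels.
    """
    return image >= threshold

# Example: an object saturating near 1.0 against a background near 0.25
frame = np.array([[0.20, 0.25, 0.95],
                  [0.22, 0.98, 1.00],
                  [0.18, 0.24, 0.97]])
mask = classify_pixels(frame, threshold=0.5)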
[0103] In identifying the location of an object in an image
according to an embodiment of the present invention, light sources
808, 810 are turned on. One or more images are captured using
cameras 802, 804. In some embodiments, one image from each camera
is captured. In other embodiments, a sequence of images is captured
from each camera. The images from the two cameras can be closely
correlated in time (e.g., simultaneous to within a few
milliseconds) so that correlated images from the two cameras can be
used to determine the 3D location of the object. A threshold pixel
brightness is applied to distinguish object pixels from background
pixels. This can also include identifying locations of edges of the
object based on transition points between background and object
pixels. In some embodiments, each pixel is first classified as
either object or background based on whether it exceeds the
threshold brightness cutoff. Once the pixels are classified, edges
can be detected by finding locations where background pixels are
adjacent to object pixels. In some embodiments, to avoid noise
artifacts, the regions of background and object pixels on either
side of the edge may be required to have a certain minimum size
(e.g., 2, 4 or 8 pixels).
[0104] In other embodiments, edges can be detected without first classifying pixels as object or background. For example, Δβ can be defined as the difference in brightness between adjacent pixels, and |Δβ| above a threshold can indicate a transition from background to object or from object to background between adjacent pixels. (The sign of Δβ can indicate the direction of the transition.) In some instances where the object's edge is actually in the middle of a pixel, there may be a pixel with an intermediate value at the boundary. This can be detected, e.g., by computing two brightness values for a pixel i: βL=(βi+βi-1)/2 and βR=(βi+βi+1)/2, where pixel (i-1) is to the left of pixel i and pixel (i+1) is to the right of pixel i. If pixel i is not near an edge, |βL-βR| will generally be close to zero; if pixel i is near an edge, then |βL-βR| will be closer to 1, and a threshold on |βL-βR| can be used to detect edges.
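A minimal sketch of the βL/βR test on a single image row follows; the function name, the 0.3 threshold, and the sample values are illustrative only.

import numpy as np

def edge_pixels(row, threshold=0.3):
    """Flag pixels whose left/right half-sums differ strongly, indicating an edge.

    row: 1D array of brightness values for one image row (0.0-1.0).
    Returns indices i where |beta_L - beta_R| exceeds the threshold, with
    beta_L = (beta_i + beta_{i-1})/2 and beta_R = (beta_i + beta_{i+1})/2.
    """
    edges = []
    for i in range(1, len(row) - 1):
        beta_l = (row[i] + row[i - 1]) / 2.0
        beta_r = (row[i] + row[i + 1]) / 2.0
        if abs(beta_l - beta_r) > threshold:
            edges.append(i)
    return edges

row = np.array([0.22, 0.24, 0.60, 0.97, 0.98])
print(edge_pixels(row))  # the intermediate pixel near the transition is flagged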
[0105] In some instances, one part of an object may partially
occlude another in an image; for example, in the case of a hand, a
finger may partly occlude the palm or another finger. Occlusion
edges that occur where one part of the object partially occludes
another can also be detected based on smaller but distinct changes
in brightness once background pixels have been eliminated.
[0106] Detected edges can be used for numerous purposes. For
example, as previously noted, the edges of the object as viewed by
the two cameras can be used to determine an approximate location of
the object in 3D space. The position of the object in a 2D plane
transverse to the optical axis of the camera can be determined from
a single image, and the offset (parallax) between the position of
the object in time-correlated images from two different cameras can
be used to determine the distance to the object if the spacing
between the cameras is known.
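The parallax-to-distance relation can be sketched as below, assuming a pinhole model with a known baseline and a focal length expressed in pixels; the numbers in the example are illustrative, not taken from the specification.

def distance_from_parallax(disparity_px, baseline_m, focal_px):
    """Estimate distance to an object from stereo parallax.

    disparity_px: horizontal offset of the object between the two
                  time-correlated images, in pixels.
    baseline_m:   known spacing between the two cameras, in meters.
    focal_px:     camera focal length expressed in pixels.
    """
    if disparity_px <= 0:
        raise ValueError("object must shift between the two views")
    return baseline_m * focal_px / disparity_px

# Illustrative numbers only: a 4 cm baseline, 700-pixel focal length,
# and a 56-pixel offset place the object roughly half a meter away.
print(distance_from_parallax(56, 0.04, 700))  # ~0.5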
[0107] Further, the position and shape of the object can be
determined based on the locations of its edges in time-correlated
images from two different cameras, and motion (including
articulation) of the object can be determined from analysis of
successive pairs of images. An object's motion and/or position is
reconstructed using small amounts of information. For example, an
outline of an object's shape, or silhouette, as seen from a
particular vantage point can be used to define tangent lines to the
object from that vantage point in various planes, referred to
herein as "slices." Using as few as two different vantage points,
four (or more) tangent lines from the vantage points to the object
can be obtained in a given slice. From these four (or more) tangent
lines, it is possible to determine the position of the object in
the slice and to approximate its cross-section in the slice, e.g.,
using one or more ellipses or other simple closed curves. As
another example, locations of points on an object's surface in a
particular slice can be determined directly (e.g., using a
time-of-flight camera), and the position and shape of a
cross-section of the object in the slice can be approximated by
fitting an ellipse or other simple closed curve to the points.
Positions and cross-sections determined for different slices can be
correlated to construct a 3D model of the object, including its
position and shape. A succession of images can be analyzed using
the same technique to model motion of the object. Motion of a
complex object that has multiple separately articulating members
(e.g., a human hand) can be modeled using these techniques.
[0108] More particularly, an ellipse in the xy plane can be
characterized by five parameters: the x and y coordinates of the
center (xC, yC), the semimajor axis, the semiminor axis, and a
rotation angle (e.g., angle of the semimajor axis relative to the x
axis). With only four tangents, the ellipse is underdetermined.
However, an efficient process for estimating the ellipse in spite
of this fact involves making an initial working assumption (or
"guess") as to one of the parameters and revisiting the assumption
as additional information is gathered during the analysis. This
additional information can include, for example, physical
constraints based on properties of the cameras and/or the object.
In some circumstances, more than four tangents to an object may be
available for some or all of the slices, e.g., because more than
two vantage points are available. An elliptical cross-section can
still be determined, and the process in some instances is somewhat
simplified as there is no need to assume a parameter value. In some
instances, the additional tangents may create additional
complexity. In some circumstances, fewer than four tangents to an
object may be available for some or all of the slices, e.g.,
because an edge of the object is out of range of the field of view
of one camera or because an edge was not detected. A slice with
three tangents can be analyzed. For example, using two parameters
from an ellipse fit to an adjacent slice (e.g., a slice that had at
least four tangents), the system of equations for the ellipse and
three tangents is sufficiently determined that it can be solved. As
another option, a circle can be fit to the three tangents; defining
a circle in a plane requires only three parameters (the center
coordinates and the radius), so three tangents suffice to fit a
circle. Slices with fewer than three tangents can be discarded or
combined with adjacent slices.
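For the three-tangent case, fitting a circle reduces to a small linear solve: with each tangent line normalized and oriented so that the circle lies on its positive side, the signed distance from the center to every line equals the radius. The sketch below assumes that normalized-and-oriented input form; it is an illustration of the geometry, not the claimed procedure.

import numpy as np

def circle_from_tangents(lines):
    """Fit a circle to three tangent lines.

    lines: three (a, b, c) tuples for lines a*x + b*y + c = 0, each normalized so
    a**2 + b**2 == 1 and oriented so the circle lies on the positive side.
    Solves a_i*x + b_i*y + c_i = r, i.e. the signed distance to every line equals
    the radius, which is linear in (x, y, r).
    """
    A = np.array([[a, b, -1.0] for a, b, _ in lines])
    rhs = np.array([-c for _, _, c in lines])
    x, y, r = np.linalg.solve(A, rhs)
    return (x, y), r

# Illustrative tangents: the lines x = 0, y = 0 and x + y = 2 (normalized, oriented inward)
lines = [(1.0, 0.0, 0.0), (0.0, 1.0, 0.0), (-2**-0.5, -2**-0.5, 2**0.5)]
print(circle_from_tangents(lines))  # center near (0.586, 0.586), radius near 0.586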
[0109] To determine geometrically whether an object corresponds to an object of interest, one approach is to look for continuous volumes of ellipses that define an object and discard object segments geometrically inconsistent with the ellipse-based definition of the object--e.g., segments that are too cylindrical or too straight or too thin or too small or too far away. If a sufficient number of ellipses remain to
characterize the object and it conforms to the object of interest,
it is so identified, and may be tracked from frame to frame.
[0110] In some embodiments, each of a number of slices is analyzed
separately to determine the size and location of an elliptical
cross-section of the object in that slice. This provides an initial
3D model (specifically, a stack of elliptical cross-sections),
which can be refined by correlating the cross-sections across
different slices. For example, it is expected that an object's
surface will have continuity, and discontinuous ellipses can
accordingly be discounted. Further refinement can be obtained by
correlating the 3D model with itself across time, e.g., based on
expectations related to continuity in motion and deformation. In
some embodiments, light sources 808, 810 can be operated in a pulsed mode rather than being continually on. This can be useful, e.g., if light sources 808, 810 have the ability to produce brighter light in a pulse than in steady-state operation. Light sources 808, 810 can be pulsed on at regular intervals as shown at
502. The shutters of cameras 802, 804 can be opened to capture
images at times coincident with the light pulses. Thus, an object
of interest can be brightly illuminated during the times when
images are being captured. In some embodiments, the silhouettes of
an object are extracted from one or more images of the object that
reveal information about the object as seen from different vantage
points. While silhouettes can be obtained using a number of
different techniques, in some embodiments, the silhouettes are
obtained by using cameras to capture images of the object and
analyzing the images to detect object edges.
[0111] In some embodiments, the pulsing of light sources 808, 810
can be used to further enhance contrast between an object of
interest and background. In particular, the ability to discriminate
between relevant and irrelevant (e.g., background) objects in a
scene can be compromised if the scene contains objects that
themselves emit light or are highly reflective. This problem can be
addressed by setting the camera exposure time to extraordinarily
short periods (e.g., 800 microseconds or less) and pulsing the
illumination at very high powers (e.g., 5 to 20 watts or, in some
cases, to higher levels, e.g., 40 watts). This approach increases
the contrast of an object of interest with respect to other
objects, even those emitting in the same general band. Accordingly,
discriminating by brightness under such conditions allows
irrelevant objects to be ignored for purposes of image
reconstruction and processing. Average power consumption is also
reduced; in the case of 20 watts for 800 microseconds, the average
power consumption is under 80 milliwatts. In general, the light sources 808, 810 are operated so as to be on during the entire camera exposure period, i.e., the pulse width is equal to the exposure time and is coordinated therewith. It is also possible to coordinate the pulsing of lights 808, 810 for purposes of removing background light, by comparing images taken with lights 808, 810 on and images taken with lights 808, 810 off.
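One way to exploit the pulsing for contrast, sketched below under the assumption that a lit frame and an unlit frame of the same scene are available, is to difference the two frames so that pixels lit mainly by ambient or self-emitted light cancel out; the names and the threshold are illustrative.

import numpy as np

def pulsed_contrast(lit_frame, unlit_frame, threshold=0.2):
    """Suppress self-luminous or ambient-lit clutter by differencing frames.

    lit_frame:   image captured while the pulsed sources are on.
    unlit_frame: image captured with the sources off, shortly before or after.
    Pixels whose brightness barely changes between the two frames are treated
    as irrelevant; only pixels lit mainly by the pulsed sources survive.
    """
    difference = np.clip(lit_frame - unlit_frame, 0.0, 1.0)
    return difference >= threshold

lit = np.array([[0.90, 0.80, 0.30], [0.95, 0.85, 0.30]])
unlit = np.array([[0.10, 0.10, 0.30], [0.10, 0.15, 0.30]])  # the 0.30 column emits its own light
print(pulsed_contrast(lit, unlit))  # only the pulsed-lit columns remain True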
[0112] Hand-Gesture Control of Vehicle
[0113] FIGS. 5A-5L show exemplary hand control of a smart vehicle.
First, a Left Hand Gesture Based Car Control process is disclosed.
FIG. 5A shows the left arm gesture based window glass control
process. The process checks for the raised arm (1002). If the arm
is raised (1004) it checks for the number of fingers raised (1006).
The controls for the windows are activated if the first four fingers are raised (1008). The process allows control of either the driver's window alone or all of the window glass; this decision is based on the number of fingers raised, and a single finger selects only the driver's glass. Movement of the glass is then controlled by the angular movement of the arm (1020): a right movement slides the glass up (1028) and a left movement slides it down (1026). The process is concluded (1030) after the windows are at the required position. At any moment the driver can choose to exit the process by forming a fist with the left hand.
[0114] FIG. 5B shows the flow for the seat control process. The process is capable of controlling both the driver seat and the front passenger seat. The process starts by checking which arm is raised (1036). After the arm is detected, the process scans for the fingers (1038); the first two fingers initiate the seat actuators (1040). The driver can then choose to adjust his own seat or the passenger's seat, a decision that depends on whether one or two fingers are raised (1044). The seats can be moved forward or back according to the arm's angular movement (1052) (1050) until they are positioned as desired. After the adjustment is done, the process concludes (1062). At any point the process can be ended if the driver forms a fist with the left hand.
[0115] FIG. 5C shows an exemplary left-hand gesture based mechanism for unlocking the `Hood` and the `Trunk` of the car. As it is a left-arm based control, the mechanism uses a camera to check that the left arm is raised; a raised left arm initiates the process that unlocks the hood and trunk (1068). The camera then checks for the fingers that are raised (1070); the first finger is used to activate the hood and trunk control (1072). To open the trunk the driver makes a right angular movement (1076), and an opposite movement unlocks the hood (1078). As soon as either of the two is unlocked the process ends (1082). If the process is started by mistake, the driver can choose to exit by forming a fist with the left hand.
[0116] FIG. 5D shows an exemplary process for controlling the temperature of the driver and front passenger seats. After checking for a raised left arm (1088), the camera scans for the fingers raised (1090).
The first three fingers are to be used by the driver to activate
seat temperature controls (1092). The driver can choose to control
his seat temperature or the passenger's seat temperature by raising
the appropriate number of fingers (1096). The angular movements of
the left arm can be used to increase or decrease the temperature of
the selected seat (1104) (1102). The process can be ended after
adjusting the temperature or at any other point by forming a fist
(1114).
[0117] FIG. 5E is an example of a left arm gesture based navigation
(GPS) control for a car. The process initializes when the driver
raises his/her left arm (1120). The GPS system is activated if all the fingers are raised, i.e., an open palm (1124). Arm motion along the vertical and horizontal axes can then be used to move the
GPS pointer (1128). To select a particular destination, the pointer
must be kept at the same location for a pre-defined duration of
time (1144). Once the destination is set, the GPS starts routing
(1146) and then exits the process (1148). The process can be ended
abruptly if needed by forming a fist with the left hand.
[0118] FIG. 5F shows an exemplary gesture based control of the driver's mirror using the left arm. The driver initiates the process by raising
the left arm (1154). The thumb is used as a trigger for activating
the mirror actuators (1158). To adjust the mirror angle, the driver
can move his/her arm along the vertical or horizontal axis (1162).
The driver can form a fist (1170) (1168) or wait for a predefined
time interval to set the mirror angle (1182). This process also enables the driver to exit at any time by forming a fist with the left hand.
[0119] FIG. 5G shows an exemplary music control in the car using
gestures of right hand. The process is activated if the camera
scans a vertically standing right arm (1190). The car music system
is initiated if the driver has an open right palm (1194). Depending upon the fingers raised after the music system is initiated, either the radio or the MP3 player is started (1204). The angular movements of the arm can be used to switch between stations or songs (1206) (1202). Once the desired station or song is selected the driver can exit the process by forming a closed fist (1216); a closed fist formed at any time exits the process.
[0120] FIG. 5H shows an exemplary car temperature control using
gestures from the right arm. The driver is expected to raise the
first two fingers of the right arm to activate the temperature
controls (1246). The temperature controlling element is the angular
motion of the right arm (1250). A left motion causes a decrease in temperature and vice versa (1256) (1254). Once the desired temperature is achieved, the driver can stop the process by forming a fist; a fist exits the process at any point.
[0121] FIG. 5I shows exemplary control of the car volume using arm gestures. The camera initiates the process whenever the driver
raises his/her right arm (1222). The process expects the driver to
raise three fingers to initiate volume control (1226). Using the right or left angular motion, the volume can be increased or decreased (1230).
[0122] FIG. 5J shows an exemplary technique for sliding the sun roof by means of a hand gesture. The sun roof control process starts when the driver raises his/her right arm (1264) and the first four fingers of that arm (1268). The camera then scans for the
angular motion of the arm (1272). A left motion pulls the roof back
(1276) whereas a right motion pushes it forward so that it can be
closed (1274). The process ends once the roof is entirely opened or
closed and it can also be concluded by forming a fist on the right
arm.
[0123] FIG. 5K shows an exemplary arm gesture based technique for controlling the car windshield wipers. The wiper motors are activated when the right arm is raised along with the first finger (1284) (1288). The speed of the wiper motors can be controlled using the right arm's angular motion (1292). A left motion decreases the speed (1298), a right motion increases the wiper speed (1296), and to stop the wipers a still right arm with a closed fist should be scanned by the camera (1294).
[0124] FIG. 5L shows an exemplary right arm gesture based control
of the rear view mirror. The camera scans for the right arm; if it is up, the process is initiated (1306). The rear view mirror control is activated if the camera scans only a thumb on the right arm (1310). The rear view mirror can then be adjusted vertically and horizontally by moving the arm, with only the thumb raised, along the desired axis (1314). To lock the position of the
mirror, the same position is to be maintained for a pre-defined
interval of time. Once done the process locks the mirror and
concludes (1332). The process can be ended at any time by the driver forming a fist with his right hand.
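The flows of FIGS. 5A-5L can be summarized as a small dispatch table keyed on which arm is raised and how many fingers are shown, with the angular motion selecting the direction of adjustment. The sketch below is a schematic condensation (the uniform right-increase/left-decrease mapping and all names are illustrative assumptions), not the claimed control logic.

# (arm, fingers raised) -> subsystem, per FIGS. 5A-5L; "thumb" marks a thumb-only pose.
SUBSYSTEMS = {
    ("left", 4): "windows",       ("left", 2): "seats",
    ("left", 1): "hood_trunk",    ("left", 3): "seat_temperature",
    ("left", 5): "navigation",    ("left", "thumb"): "driver_mirror",
    ("right", 5): "music",        ("right", 2): "cabin_temperature",
    ("right", 3): "volume",       ("right", 4): "sun_roof",
    ("right", 1): "wipers",       ("right", "thumb"): "rear_view_mirror",
}

def dispatch(arm, fingers, motion):
    """Map a raised arm, finger count and angular motion to a vehicle command.

    A closed fist exits immediately, mirroring the exit rule in each flow.
    Returns (subsystem, action) or None if the pose is not recognized.
    """
    if fingers == "fist":
        return None
    subsystem = SUBSYSTEMS.get((arm, fingers))
    if subsystem is None:
        return None
    action = "increase" if motion == "right" else "decrease"
    return subsystem, action

print(dispatch("left", 4, "right"))     # ('windows', 'increase') -> slide glass up
print(dispatch("right", "fist", None))  # None -> exit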
[0125] In other embodiments, by cupping the hand on an object such as the steering wheel, the user can use voice to make calls, receive and respond to texts, launch apps, get turn-by-turn directions, find the nearest Chinese restaurant and other local businesses, or say "Play me some Barry Manilow." The user can also ask Siri or Google Now to search the Internet while rolling down the Interstate. The apps will be able to pull contacts directly from
the phone's address book, access favorites and bookmarks, and have
user location history close at hand.
[0126] A gesture is used to control an air conditioning system in
an example vehicle, in accordance with an embodiment. As shown, a
user 300 is driving the vehicle. The vehicle may maintain a
correlation between a plurality of predetermined gestures, in
combination with a plurality of predetermined regions of the
vehicle, and a plurality of functions, such that each gesture in
the plurality of predetermined gestures, in combination with a
particular region of the plurality of predetermined regions, is
associated with a particular function in the plurality of
functions, as described above. For example, the correlation may
include a downward swiping gesture in a region that includes an
air-conditioning vent associated with the function of decreasing a
fan speed of an air conditioning system. Other examples are
possible as well.
[0127] As shown, a fan speed indicator on the display indicates
that a fan speed of the air conditioning system in the vehicle is
high. At some point, the user may wish to lower the fan speed of
the air conditioning system. To this end, the user may make a
downward swiping gesture in a region that includes an
air-conditioning vent. The camera 304 may record three-dimensional
images of the downward swiping gesture in the region that includes
an air-conditioning vent. Based on the three-dimensional images,
the vehicle may detect the downward swiping gesture in the region
that includes the air-conditioning vent.
[0128] The vehicle may then select, based on the correlation, a
function associated with the downward swiping gesture in the region
that includes the air-conditioning vent. For example, the downward
swiping gesture in the region that includes the air-conditioning
vent may be associated with the function of decreasing a fan speed
of the air conditioning system, as described above. Other examples
are possible as well. Once the vehicle has selected the function
from the correlation, the vehicle may initiate the function in the
vehicle. That is, the vehicle may decrease the fan speed in the
vehicle.
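A minimal sketch of such a correlation is a lookup keyed on the (gesture, region) pair; the table entries and function names below are hypothetical examples, not part of the specification.

# Hypothetical correlation table: (gesture, region) -> function name.
CORRELATION = {
    ("swipe_down", "ac_vent"): "decrease_fan_speed",
    ("swipe_up", "ac_vent"): "increase_fan_speed",
    ("swipe_up", "window"): "close_window",
    ("tap", "speaker"): "lower_volume",
}

def select_function(gesture, region):
    """Select the vehicle function associated with a gesture made in a region."""
    return CORRELATION.get((gesture, region))

print(select_function("swipe_down", "ac_vent"))  # decrease_fan_speed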
[0129] In some embodiments, the vehicle may additionally determine an extent of the downward swiping gesture and may decrease the fan speed by an amount that is, for example, proportional to the extent.
[0130] In some embodiments, in addition to initiating the function,
the vehicle may trigger a feedback to the user, such as an audible
feedback, a visual feedback, and/or a haptic feedback. Such
feedback may be particularly useful when the function is not
immediately detectable by the user, such as a small decrease in the
fan speed of the climate control system or a slight repositioning
of a seat.
[0131] Further, in some embodiments, the vehicle may determine an
extent of the given gesture. For example, if the given gesture is a
swipe gesture, the vehicle may determine an extent of the swipe
(e.g., how long the swipe is in space and/or time). The vehicle may
then determine an operational parameter based on the extent. For
example, for a greater extent, the vehicle may determine a greater
operational parameter than for a lesser extent. The operational
parameter may be, for example, proportional to, or approximately
proportional to, the extent. In these embodiments, when the vehicle
initiates the function the vehicle may initiate the function with
the determined operational parameter.
[0132] For example, if the swipe gesture is in a region that
includes a window, and the swipe gesture in the region that
includes the window is associated with opening the window, the
vehicle may determine an extent of the swipe and further may
determine how far to open the window based on the extent of the
swipe. For instance, the vehicle may open the window further for a
longer swipe than for a shorter swipe.
[0133] As another example, if the swipe gesture is in a region that
includes an air-conditioning vent, and the swipe gesture in the
region that includes the air-conditioning vent is associated with
lowering a temperature in the vehicle, the vehicle may determine an
extent of the swipe and further may determine how much to lower the
temperature in the vehicle based on the extent of the swipe. For
instance, the vehicle may lower the temperature further for a
longer swipe than for a shorter swipe.
[0134] Such an extent could be determined for gestures other than a
swipe gesture as well. For example, if a tap gesture is in a region
that includes a speaker, and the tap gesture in the region that
includes the speaker is associated with lowering a volume of an
audio system, the vehicle may determine an extent of the tap (e.g.,
how many taps, how long the tap is held, etc.) and further may
determine how much to lower the volume of the audio system based on
the extent of the tap. For instance, the vehicle may lower the
volume more for more taps (or a longer tap) than for fewer taps (or
a shorter tap).
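The extent-to-parameter mapping described above can be sketched as a simple proportional scaling; the notion of a "full extent" and the numbers used here are illustrative assumptions.

def parameter_from_extent(extent, full_extent, max_value):
    """Scale an operational parameter proportionally to the extent of a gesture.

    extent:      measured length of the swipe (or number/duration of taps).
    full_extent: extent treated as 100% of the effect.
    max_value:   parameter applied at full extent (e.g. window fully open,
                 or the maximum volume step).
    """
    fraction = max(0.0, min(extent / full_extent, 1.0))
    return fraction * max_value

# A 15 cm swipe with 30 cm treated as "fully open" opens the window half way.
print(parameter_from_extent(0.15, 0.30, max_value=100))  # 50.0 percent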
[0135] In some embodiments, rather than determining the extent of
the gesture and the corresponding operational parameter and then
initiating the function with the determined operational parameter,
the vehicle may instead continuously determine the extent of the
gesture and update the corresponding operational parameter, and may
continuously initiate the function with the updated operational
parameter. For example, the vehicle may detect a cover gesture in a
region that includes an air-conditioning vent (e.g., such that the
air-conditioning vent is covered), and the cover gesture in the
region that includes the air-conditioning vent may be associated
with lowering a fan speed of the air conditioning system. Once the
vehicle detects the cover gesture in the region that includes the
air-conditioning vent, the vehicle may lower the fan speed (e.g.,
by a predetermined amount). As the vehicle continues to detect the
cover gesture, the vehicle may continue to lower the fan speed
(e.g., in increments of, for example, the predetermined amount,
growing amounts, etc.). Once the vehicle detects that the cover
gesture has ended, the vehicle may cease to lower the fan speed. As
a result, during the cover gesture the vehicle may lower the fan
speed by an amount that is based on the extent of the cover
gesture.
[0136] In some embodiments, the vehicle may have difficulty
detecting the given gesture and/or the given region. For example,
the vehicle may determine that a confidence level of one or both of
the given gesture and the given region is below a predetermined
threshold. In these embodiments, the vehicle may request an
occupant to repeat the given gesture in the given region. When the
occupant repeats the given gesture in the given region, the vehicle
may record additional three-dimensional images and may detect the
given gesture and the given region based on the additional
three-dimensional images (and, in some cases, the three-dimensional
images previously recorded).
[0137] Obstacle Detection
[0138] In some embodiments, a vehicle identifies obstacles on the
road, and the computer system may use one or more sensors to sense
the obstacles. For example, the computer system may use an
image-capture device to capture images of the road and may detect
the obstacles by analyzing the images for predetermined colors,
shapes, and/or brightness levels indicative of an obstacle. As
another example, the computer system may project LIDAR to detect
the obstacle. The computer system may estimate the location of the obstacle and control the vehicle to avoid the obstacle and yet maintain a predetermined distance from neighboring vehicles in both
directions. Other vehicles behind the lead vehicle can then simply
follow the lead vehicle as part of a flock. The computer system may
then control the vehicle to maintain a distance between the vehicle
and the at least one neighboring vehicle to be at least a
predetermined minimum distance to avoid colliding with the at least
one neighboring vehicle.
[0139] FIGS. 6A-6C show exemplary obstacles that may be encountered
by vehicles. FIG. 6A illustrates an example of a situation where an
obstacle 603 is present in front of a host vehicle 601 mounted with a sensor 602, such as a camera, in a front portion of the vehicle body. In vehicles, vehicle-mounted sensors such as cameras, radar, and LIDAR are used in a vehicle control system such
as an inter-vehicle distance alarm system, a preceding vehicle
following system or a collision reducing brake system. Where an
obstacle is not present in front of the host vehicle 601, since the
target data is not output from the sensor 602, the vehicle velocity
control system performs a control so that the vehicle operates
according to a planned path. However, the path may need adjustment
when an obstacle is encountered, when weather affects the operation, or when traffic conditions, emergencies or holiday patterns require a change in the planned path and speed.
[0140] In this example, the front obstacle 603 is another vehicle.
Furthermore, in this example the sensor 602, a radar in one embodiment, has horizontal resolution due to a plurality of arrays installed in the horizontal direction; however, it does not have vertical resolution. In this case, the sensor 602 outputs target
data having position information such as a relative longitudinal
distance, lateral position and velocity between the host vehicle
601 and the obstacle 603 to a vehicle velocity control system. In
another embodiment, the sensor 602 is a camera. Pictures captured
by the camera can be used to form a 3D reconstruction of the
obstacles and the road. The task of converting multiple 2D images into a 3D model consists of a series of processing steps. Camera calibration determines the intrinsic and extrinsic parameters and is usually required before depth can be computed. Depth determination solves the correspondence problem, finding matches between two images so that the positions of the matched elements can be triangulated in 3D space. Registration then combines the multiple depth maps, calculated by projecting out of each camera, into a final mesh; the calibration is used to identify where the many meshes created from the depth maps can be combined into a larger one, so that more than one view contributes to a complete 3D mesh.
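Once calibration is known, the correspondence and depth steps reduce to standard triangulation of matched points. The sketch below uses a generic linear (DLT) triangulation with illustrative intrinsics and a 0.5 m baseline; it is a stand-in for the depth-determination step, not the specific pipeline claimed.

import numpy as np

def triangulate(P1, P2, x1, x2):
    """Linear triangulation of one matched point from two calibrated views.

    P1, P2: 3x4 camera projection matrices (intrinsics and extrinsics from calibration).
    x1, x2: matching pixel coordinates (u, v) of the same feature in each image.
    Returns the 3D point in the common world frame.
    """
    A = np.vstack([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    _, _, vt = np.linalg.svd(A)
    X = vt[-1]
    return X[:3] / X[3]

# Illustrative calibration: identical intrinsics, second camera shifted 0.5 m along x.
K = np.array([[700.0, 0.0, 320.0], [0.0, 700.0, 240.0], [0.0, 0.0, 1.0]])
P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = K @ np.hstack([np.eye(3), np.array([[-0.5], [0.0], [0.0]])])
point = np.array([1.0, 0.5, 10.0, 1.0])
x1 = P1 @ point; x1 = x1[:2] / x1[2]
x2 = P2 @ point; x2 = x2[:2] / x2[2]
print(triangulate(P1, P2, x1, x2))  # recovers approximately [1.0, 0.5, 10.0]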
[0141] The vehicle velocity control system performs controls such
as a control for maintaining the distance from the obstacle 603,
and a control for executing an alarm or velocity reduction in a
case where collision with the obstacle 603 is predicted, according
to the position information about the input target data.
[0142] Here, FIG. 6B illustrates an example of a situation where
the obstacle 702 such as a boulder or a large object is not in the
database but is now present near the host vehicle 601. The obstacle
can be on the side or the rear as captured by sensors. As shown in FIG. 6C, when a downward-view structure 711 such as a manhole or a cat's eye is present, or, as another example, when a small fallen object such as an empty can is present, the sensor 602 may output target data, as if a front obstacle were present, according to a reflection intensity or camera capture. For example, obstacles
such as land slip and falling rocks can appear unexpectedly on a
mountain road. Image data in the fall monitoring area is processed
by car computers and/or transmitted to a cloud processing system,
and images are frame-difference-processed to identify the new
obstacles. Thereafter, in the case where there are no variations in
the extracted component data representing the body and in
particular in the case where there are no moving variations, when
several frames with an interval of a predetermined time are
processed in a similar way, the processor detects the obstacle and
transmits a warning signal to adjacent vehicles and/or to a traffic
display board. The warning information can be incorporated by the driving software of other vehicles to indicate that there are road obstacles caused by landslides.
[0143] In another embodiment, the obstacle can be the result of a
car accident or emergency. The system automatically detects the
occurrence of an emergency and provides safety at the scene. This
is done by diverting traffic flow near the point of emergency to a
point where traffic resumes normal flow. The system secures the
incident site to protect emergency personnel, their equipment and
the public, from hazardous conditions at the scene and throughout
the traffic control zone. The system can establish a traffic
control set-up that gives motorists adequate warning and reaction
time. The system also separates pedestrians from vehicular traffic
and limits access to the site to authorized persons only. One
embodiment directs vehicles through an emergency traffic control
zone with the following: the Advance Warning Area should alert vehicles that there is a traffic situation or difficulty ahead which will require some action on their part; the Approach Area should identify the nature of the equipment or vehicle that they are about to encounter and allow them to analyze the situation; the Transition Area should provide
an indication as to the expected action to be taken by the vehicle
to decide on a course of action and execute safe driving techniques
prior to entering the Activity Area; and Activity Area includes
Fend Off Position of the emergency vehicle, Buffer Zone (refers to
scene protection area between the first emergency vehicle and the
incident site), Incident Site (Restricted to authorized personnel
only), Traffic Space (Area where traffic is allowed to pass by the
Activity Area), and Staging Area (Emergency Vehicles not
immediately required to perform a function or shielding at the
incident scene should be directed to stage in this area. The area
should be downstream/upstream of the incident site and the location
should not create a traffic hazard or obstruction). The system can
determine a Termination Area from the downstream side of the
Staging Area to the point where normal traffic is able to resume.
The information for an emergency is incorporated into the 3D model
for vehicular processing.
[0144] Weather conditions can affect the driving plan. For cameras, weather affects the ability to see, which is very limited in adverse conditions such as rain, fog, ice, snow, and dust. For
example, if the fog becomes so thick the system can suggest the car
be moved completely off the road. The car system also slows down
for rain, drizzle, or snow on the road. This is when many road
surfaces are most slippery because moisture mixes with oil and dust
that has not been washed away. The slippery roads can reduce
traction and control of the vehicle may be compromised. The system
can detect wet road surface via its camera or water sensors. Wet
road surfaces can cause tires to hydroplane (skim on a thin layer
of water). This could result in loss of control and steering
ability. Hydroplaning is caused by a combination of standing water
on the road, car speed, and under-inflated or worn-out tires. Thus,
the system can check the pressure of the tires by communicating
with the tire sensors. The 3D modeling system also incorporates the
effects of high temperatures, sun glare and high winds. One
exemplary 3D modeling process for navigation is detailed next.
[0145] FIGS. 7A-7H illustrate an exemplary process to fuse data for
3D models used for car navigation. FIG. 7A shows an exemplary
system that performs data fusion based on sensor based detection of
objects, change in weather and traffic, and holiday/emergency
conditions, among others. The process checks all the sensors for
change in weather (2004), detection of object (2002) and the GPS
for current traffic conditions (2006). For each given sensor for
detecting objects in a vehicle's environment, the process generates a 3D model of the given sensor's field of view; obstacle information from front cars using vehicle-to-vehicle communication (DSRC); neighboring car driver preference information; and traffic information including emergency information. The process can adjust
one or more characteristics of the plurality of 3D models based on
the received weather information to account for an impact of the
actual or expected weather conditions on one or more of the
plurality of sensors. After the adjusting, a processor aggregates the plurality of 3D models to generate a comprehensive 3D model, combines the comprehensive 3D model with detailed map information, and uses the combined comprehensive 3D model with detailed map information to maneuver the vehicle. In FIG. 7A, the
process checks sensors for object detection (2008), then checks for confirmations from other vehicles over V2V communication such as DSRC, and generates a 3D model therefrom. The process can also
check for weather change (2004) and correlate the weather change to
generate an updated 3D model. Similarly, the process integrates
traffic flow information (2006) and updates the 3D model as needed.
FIG. 7B shows an exemplary process for identifying the object, while FIGS. 7C-7H show the object modeling process in more detail. The process checks sensors for object detection and scans the object against a 3D library for matches. If a match is found, the process sets the object to the object in the library; otherwise the process performs a best guess of what the object is and sends the object identification for subsequent 3D modeling use.
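A minimal sketch of this library-match-or-best-guess step might compare a shape descriptor of the sensed object against stored descriptors and fall back to the nearest entry as a best guess when nothing is close enough; the library contents, descriptor form, and threshold below are all hypothetical.

import numpy as np

# Hypothetical library of shape descriptors (feature vectors) for known objects.
LIBRARY = {
    "sedan": np.array([0.9, 0.1, 0.4]),
    "pedestrian": np.array([0.2, 0.9, 0.1]),
    "bicycle": np.array([0.3, 0.6, 0.7]),
}

def identify(descriptor, match_threshold=0.25):
    """Match a sensed object descriptor against the 3D library.

    Returns (label, matched) where matched is False when no library entry is
    close enough and the nearest label is only a best guess.
    """
    label, dist = min(((name, np.linalg.norm(descriptor - ref))
                       for name, ref in LIBRARY.items()), key=lambda t: t[1])
    return label, dist <= match_threshold

print(identify(np.array([0.85, 0.15, 0.35])))  # ('sedan', True)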
[0146] FIGS. 8A-8F show exemplary detection of objects outside of
the vehicle and guidance on their handling. The detected objects can include an automobile, a pedestrian, a structure, or a bicycle, for example. The system assists the driver by identifying the objects as potential "threats" and recommending options for the driver. For
example, the system can perform the following: [0147] detecting an
object external to a vehicle using one or more sensors; [0148]
determining a classification and a state of the detected object;
[0149] estimating the destination of the object; [0150] predicting
a likely behavior of the detected object based on prior behavior
data and destination; [0151] preparing the vehicle to respond based
at least in part on the likely behavior of the detected object; and
[0152] notifying a driver of options based on the likely
behavior.
[0153] FIG. 8A shows an exemplary process to identify a vehicle
based on the 3D models created in FIGS. 7A-7H. FIG. 8B shows an
exemplary handling where the detected object is an automobile--the
classification of the detected object includes the type of
automobile. FIG. 8C shows a process to retrieve prior behavior data
of the detected object by identifying at least one of a logo, a
bumper sticker, or a license plate. Such information is then used
to look up driver behavior. Public information such as driving tickets and insurance records can be extracted to see if the driver has a bad driving history and, if so, the system can take a defensive driving posture. FIG. 8D shows an exemplary process to
determine the state of the object. For example, the state of the
detected object can be related to at least one of: location,
traffic lane in which the detected object is traveling, speed,
acceleration, entry onto a road, exit off of a road, activation of
headlights, activation of taillights, or activation of blinkers.
The behavior data is based on movement data for a plurality of
other objects at one or more locations. The movement data are
tracked using one of: satellite imagery, roadside cameras, on-board
GPS data, or sensor data acquired for other nearby vehicles. FIG. 8E shows an exemplary process to predict other driver/rider behavior, while FIG. 8F generates a proposed response to the object's expected behavior. The system can send a driver recommendation or vehicle command to orient the vehicle, which includes positioning the vehicle at a predetermined distance from the detected object, the predetermined distance being based, at least in part, on the classification of the detected object. The likely
behavior of the detected object can be provided as a probability of
the detected object entering to one or more states. The process
includes receiving updated behavior data; and wherein predicting
the likely behavior of the detected object is based at least in
part on the updated behavior data. The driver can be informed of
the options using haptic interface or a heads-up display. The
process can also share the likely behavior of the object to
neighboring vehicles using vehicle-to-vehicle communication.
[0154] The process may cause the vehicle to take particular actions
in response to the predicted actions of the surrounding objects.
For example, if another car is turning at the next intersection, the process may slow the vehicle down as it approaches the
intersection. In this regard, the predicted behavior of other
objects is based not only on the type of object and its current
trajectory, but also based on some likelihood that the object may
obey traffic rules or pre-determined behaviors. In another example,
the process may include a library of rules about what objects will
do in various situations. For example, a car in a left-most lane
that has a left-turn arrow mounted on the light will very likely
turn left when the arrow turns green. The library may be built
manually, or by the vehicle's observation of other vehicles
(autonomous or not) on the roadway. The library may begin as a
human built set of rules which may be improved by the vehicle's
observations. Similarly, the library may begin as rules learned
from vehicle observation and have humans examine the rules and
improve them manually. This observation and learning may be
accomplished by, for example, tools and techniques of machine
learning. In addition to processing data provided by the various
sensors, the computer may rely on environmental data that was
obtained at a previous point in time and is expected to persist
regardless of the vehicle's presence in the environment. For
example, the system can use highly detailed maps identifying the
shape and elevation of roadways, lane lines, intersections,
crosswalks, speed limits, traffic signals, buildings, signs, real
time traffic information, or other such objects and information.
For example, the map information may include explicit speed limit
information associated with various roadway segments. The speed
limit data may be entered manually or scanned from previously taken
images of a speed limit sign using, for example, optical-character
recognition. The map information may include three-dimensional
terrain maps incorporating one or more of objects listed above. For
example, the vehicle may determine that another car is expected to
turn based on real-time data (e.g., using its sensors to determine
the current GPS position of another car) and other data (e.g.,
comparing the GPS position with previously-stored lane-specific map
data to determine whether the other car is within a turn lane).
These objects may have particular behavior patterns that depend on
the nature of the object. For example, a bicycle is likely to react
differently than a motorcycle in a number of ways. Specifically, a
bicycle is more likely to make erratic movements when compared with
a motorcycle, but is much slower and thus can be handled with ease
compared to a speeding motorcycle. For each classification, the
object data may also contain behavior information that indicates
how an object having a particular classification is likely to
behave in a given situation. Vehicle may then autonomously respond
to the object based, in part, on the predicted behavior.
[0155] FIG. 9A shows an exemplary system for crowd-sourcing
navigation data. The system includes a crowdsourcing server in
communication with a plurality of vehicles 1 . . . n. The vehicles in FIG. 9A perform peer-to-peer discovery and crowd-sourced navigation as shown in FIG. 9B. The system receives proximity services for a group of vehicles traveling a predetermined route using peer-to-peer discovery, receives crowdsourcing data from said plurality of vehicles, and shares the crowdsourcing data with the group of vehicles (or a subsequent group of vehicles) traveling the route of interest. Such information can be used in providing navigation guidance to a vehicle traveling the route using the crowdsourced data.
[0156] In one aspect, the vehicles traveling the same route can be
determined using a vehicle-to-vehicle communication protocol that facilitates identifying peers based upon encoded signals during peer discovery in a peer-to-peer network. The system can be WiFi or
cellular based such as the Proximity Services via LTE Device
Broadcast, among others.
[0157] In one embodiment, peers are identified based upon encoded signals during peer discovery in a peer-to-peer network. For example, direct signaling that partitions a
time-frequency resource into a number of segments can be utilized
to communicate an identifier within a peer discovery interval;
thus, a particular segment selected for transmission can signal a
portion of the identifier, while a remainder can be signaled based
upon tones communicated within the selected segment. Moreover, a
subset of symbols within the resource can be reserved (e.g.,
unused) to enable identifying and/or correcting timing offset.
Further, signaling can be effectuated over a plurality of peer
discovery intervals such that partial identifiers communicated
during each of the peer discovery intervals can be linked (e.g.,
based upon overlapping bits and/or bloom filter information). The
method can include transmitting a first partial identifier during a
first peer discovery interval. Also, the method can comprise
transmitting a second partial identifier during a second peer
discovery interval. Further, the method can include generating
bloom filter information based upon the combination of the first
partial identifier and the second partial identifier. Moreover, the
method can comprise transmitting the bloom filter information to
enable a peer to link the first partial identifier and the second
partial identifier.
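A toy sketch of linking partial identifiers with Bloom filter information follows; the identifier format, filter size, and hash construction are illustrative assumptions, not the protocol details described above.

import hashlib

def bloom_bits(full_id, num_bits=64, num_hashes=3):
    """Build a tiny Bloom filter over the full identifier.

    A peer that hears two partial identifiers in successive discovery
    intervals can test their concatenation against this filter to decide
    whether they belong to the same transmitter.
    """
    bits = 0
    for k in range(num_hashes):
        digest = hashlib.sha256(f"{k}:{full_id}".encode()).digest()
        bits |= 1 << (int.from_bytes(digest[:2], "big") % num_bits)
    return bits

def might_match(first_part, second_part, advertised_bits, num_bits=64, num_hashes=3):
    """Check whether two heard partial identifiers are consistent with the filter."""
    candidate = bloom_bits(first_part + second_part, num_bits, num_hashes)
    return candidate & advertised_bits == candidate

full_id = "VEH-5A2F-9C31"                    # illustrative vehicle identifier
first, second = full_id[:6], full_id[6:]     # sent in two discovery intervals
adv = bloom_bits(full_id)                    # transmitted as linking information
print(might_match(first, second, adv))       # True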
[0158] Another embodiment communicates using LTE Direct, a
device-to-device technology that enables discovering thousands of
devices and their services in the proximity of about 500 m, in a
privacy sensitive and battery efficient way. This allows the
discovery to be "Always ON" and autonomous, without drastically
affecting the device battery life. LTE Direct uses radio
signals--called `expressions`--which can be private and discreet
(targeted securely for certain audiences only) or public
(transmitted so that any application can receive them). Public
expressions are a common language available to any application to
discover each other, and this is the door to consumer utility and
adoption. Public expressions exponentially expand the field of
value. For example, vehicles that share same driving segments can
broadcast expressions indicating their path(s). The system detects
vehicles in the same segment as part of the proximity services for
capturing and sharing crowd-sourced navigation data. Public
expressions combine all applications--all value--into one single
network, thereby expanding the utility of the system.
[0159] The crowdsourcing data includes vehicle performance information and GPS locations of a vehicle; the vehicle data includes odometer information, speedometer information, fuel consumption information, and steering information.
[0160] The crowdsourcing data includes information for detecting the closing of a lane, predicting an avoidance maneuver, predicting congestion with respect to a segment of the route of the at least one vehicle, and predicting traffic light patterns.
[0161] The system can determine the presence of obstacles in a road
lane by monitoring a pattern of vehicle avoidance of a particular
location of the lane. The obstacles can be rocks or debris on the
lane, closure of a lane, inoperative vehicles on the lane, or
vehicles suffering from an accident, among others. The vehicular avoidance information can be sent to vehicles that are planning to use that particular road section so that they can optimize their planned routes.
[0162] The system can detect closing of a lane by monitoring
changes of vehicle direction at a location on the route of the at
least one vehicle; and determining a lane is closed in response to
a number of changes of vehicle direction being larger than a
predetermined threshold value.
[0163] The system can share prior vehicle's avoidance maneuver by
monitoring change of vehicle direction and distance traveled at a
close vicinity of a location on the route of a lead vehicle; and
determining an avoidance maneuver in response to a ratio of change
of vehicle direction and distance traveled being less than a
predetermined threshold value.
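Both heuristics in the two paragraphs above can be sketched as simple threshold tests, as shown below; the threshold values and units are illustrative assumptions.

def lane_closed(direction_changes, threshold=20):
    """Infer a closed lane when many vehicles change direction at one location.

    direction_changes: count of observed lane departures at the location.
    """
    return direction_changes > threshold

def avoidance_maneuver(heading_change_deg, distance_m, ratio_threshold=0.5):
    """Infer an avoidance maneuver near a location on a lead vehicle's route.

    As stated above, the ratio of direction change to distance traveled being
    below a predetermined threshold flags the maneuver; units and the threshold
    value here are illustrative.
    """
    return (heading_change_deg / max(distance_m, 1e-6)) < ratio_threshold

print(lane_closed(35))                 # True -> likely lane closure
print(avoidance_maneuver(12.0, 40.0))  # ratio 0.3 < 0.5 -> flagged as avoidance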
[0164] The system can determine a route based at least in part on an amount of time predicted for travelling from a starting location to a destination location of the route using the crowdsourcing data, and can determine a route based at least in part on a predicted fuel consumption of the route using the crowdsourcing data. Determining information corresponding to a route of interest to at least one vehicle can further include monitoring a distance
traveled by the at least one vehicle after reaching a destination,
and predicting availability of parking spaces at the destination
based at least in part on the distance traveled; and monitoring an
amount of time traveled by the at least one vehicle after reaching
a destination, and predicting availability of parking spaces at the
destination based at least in part on the amount of time traveled.
Determining information corresponding to a route of interest to at least one vehicle can further comprise measuring a time taken to
travel a predefined percent of the route until the at least one
vehicle comes to a halt at a predetermined location; and predicting
an average amount of time used to find parking at the predetermined
location using the time taken to travel a predefined percent of the
route. Determining information corresponding to a route of interest to at least one vehicle can further comprise at least one of:
determining popularity of a fueling station along the route;
determining type of fuel sold at the fueling station along the
route; determining popularity of a business along the route; and
determining popularity of a rest area along the route.
[0165] Crowd-Sourced Map Updating and Obstacle Annotating
[0166] Next, a system to crowd-source the updates of precision maps
with data from smart vehicles is detailed. In embodiments,
crowd-sourced obstacle data can be used to update a map with
precision. The obstacles can be rocks, boulders, pot-holes,
manhole, utility hole, cable chamber, maintenance hole, inspection
chamber, access chamber, sewer hole, confined space or can be water
pool or rising tidal waves that affect the road as detected by a
plurality of vehicles. Such crowd-sourced information is updated
into the map and annotated by time, weather and periodicity. The
detected obstacle information may include a geographic location of
the vehicle and a predetermined map of the road. The computer
system may determine the geographic location of the obstacle by,
for example, using a laser rangefinder or light detection and
ranging (LIDAR) unit to estimate a distance from the obstacle to
the at least two objects near the vehicle and determining the
geographic location of the obstacle using triangulation, for
example. Such information is updated into the map system and marked
as temporal. During use, if recent vehicles take defensive driving
around the temporary obstacle, the map adds the obstacles to the
map for the route guidance module to advise vehicles. If recent
vehicles drive the road as though the obstacle does not exist, the
system removes the obstacle from the map database, but keeps track
of the history in case it is a periodic obstacle. The obstacle
information is also reported to government agency for
repair/maintenance.
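A minimal data-structure sketch of this temporal annotation is given below; all field names, and the simplified rule that a single contradicting observation clears the live entry, are illustrative assumptions.

import time

class ObstacleMap:
    """Crowd-sourced obstacle annotations marked as temporal.

    Obstacles stay on the live map while recent vehicles drive defensively
    around them and are dropped when vehicles drive through the location as
    if nothing were there; history is kept so periodic obstacles (e.g. rising
    tidal water) can be recognized later.
    """
    def __init__(self):
        self.active = {}    # location -> list of confirmation timestamps
        self.history = {}   # location -> every observation ever made

    def report(self, location, avoided):
        self.history.setdefault(location, []).append((time.time(), avoided))
        if avoided:
            self.active.setdefault(location, []).append(time.time())
        else:
            # recent traffic ignores the obstacle: drop it from the live map
            self.active.pop(location, None)

    def obstacles_for_routing(self):
        return list(self.active)

m = ObstacleMap()
m.report((37.5485, -121.9886), avoided=True)   # hypothetical coordinates
m.report((37.5485, -121.9886), avoided=False)
print(m.obstacles_for_routing())  # [] -> removed from live map, kept in history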
[0167] In another embodiment, if vehicles drive through the lane in a smooth line or curve but brake abruptly, the system infers that the road has defects or potholes, for example, and the bad infrastructure is reported for path planning (to add more travel time, or to change the route to avoid the bad road infrastructure if it is extensive).
[0168] The new information is used to update a digital map that
lacks the current information or that contains inaccuracies or may
be incomplete. The digital map stored in the map database may be
updated using the information processed by a map matching module,
matched segment module, and unmatched segment module. The map
matching module, once it has received obstacle location and GPS
traces, processes obstacle locations and GPS traces by matching
them to a road defined in the digital map. The map matching module
matches the obstacles and the GPS traces with the most likely road
positions corresponding to a viable route through the digital map
by using the processor to execute a matching algorithm. In one
example, the matching algorithm may be a Viterbi matching
algorithm. Where the GPS traces do match a road defined in the
digital map, the matched trace to which the GPS traces match and
obstacle information are sent to the matched segment module for
further processing as will be described below. Where the GPS traces
do not match a road defined in the digital map, the unmatched trace
to which the GPS traces are correlated with and the obstacle
position information are sent to the unmatched segment module for
further processing. The matched segment module and unmatched
segment module both provide metadata to the map updating module.
The metadata may include obstacle metadata, road geometry refinement metadata, road closure and reopening metadata, missing intersection metadata, missing road data, and one-way correction metadata. The
map updating module updates the digital map in the map
database.
[0169] The process to update maps using crowd-sourced data may
begin with the unmatched segment module clustering the unmatched
GPS traces received from the map matching module. Many available
algorithms may be suitable for this process, but in one example, an
agglomerative clustering algorithm that iteratively compares GPS
traces with each other and combines those that fall within a
pre-determined tolerance into a cluster may be used. One example of such an algorithm uses the Hausdorff distance as its distance measure in the clustering algorithm. Once the cluster is selected,
the unmatched segment module may produce a single road geometry for
a cluster of unmatched GPS traces using a centerline fitting
procedure in which the single road geometry describes a new road
segment with the obstacle which is not described in the current map
database. In one example, a polygonal principal curve algorithm or
a Trace Clustering Algorithm (TC1) algorithm can be used. The
digital map can be modified to include the new road, including
possibly new intersections in the base map and any associated
pointers or indices updated.
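A simplified sketch of Hausdorff-based grouping of unmatched GPS traces follows; the greedy single-pass grouping and the tolerance value stand in for the iterative agglomerative procedure described above.

import numpy as np

def hausdorff(trace_a, trace_b):
    """Symmetric Hausdorff distance between two GPS traces (arrays of points)."""
    d = np.linalg.norm(trace_a[:, None, :] - trace_b[None, :, :], axis=2)
    return max(d.min(axis=1).max(), d.min(axis=0).max())

def cluster_traces(traces, tolerance):
    """Group unmatched traces whose Hausdorff distance falls within a tolerance.

    The tolerance is in the same units as the trace coordinates.
    """
    clusters = []
    for trace in traces:
        for cluster in clusters:
            if hausdorff(trace, cluster[0]) <= tolerance:
                cluster.append(trace)
                break
        else:
            clusters.append([trace])
    return clusters

a = np.array([[0.0, 0.00], [1.0, 0.10], [2.0, 0.00]])
b = np.array([[0.0, 0.05], [1.0, 0.00], [2.0, 0.05]])
c = np.array([[0.0, 5.00], [1.0, 5.10], [2.0, 5.00]])
print(len(cluster_traces([a, b, c], tolerance=0.5)))  # 2 clusters: {a, b} and {c}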
[0170] Flock Navigation
[0171] Next a flock control behavior is detailed. In one
embodiment, in FIG. 10, a plurality of cars follow a leader car,
who in turn is following a target vehicle or a target driving plan.
The leader, or the first car in the group would automatically or
manually take evasive actions to avoid an obstacle, and the
information is transmitted via vehicle to vehicle communication
such as DSRC to following vehicles, and the driving path of the
entire flock is adjusted according to the obstacle. "Flocking" is
the collective motion of a large number of self-propelled entities
and is a collective animal behavior exhibited by many living beings
such as birds, fish, bacteria, and insects. It is considered an
emergent behavior arising from simple rules that are followed by
individuals and does not involve any central coordination. The
vehicle communications identify vehicles traveling as a flock, and the vehicles perform a distributed flocking operation by communicating over the wireless network. One embodiment of the
vehicle flocking process has the following structure:
TABLE-US-00001
initialise_vehicle_positions( )
LOOP
    place_vehicles( )
    move_all_vehicles_to_new_positions( )
END LOOP
[0172] Each of the vehicle rules works independently, so, for each vehicle, the process calculates how much the vehicle will be moved by each of the three rules, generating three velocity vectors. The three vectors are then added to the vehicle's current velocity to work out its new velocity.
TABLE-US-00002
PROCEDURE move_all_vehicles_to_new_positions( )
    Vector v1, v2, v3
    Vehicle b
    FOR EACH VEHICLE b
        v1 = rule1(b)
        v2 = rule2(b)
        v3 = rule3(b)
        b.velocity = b.velocity + v1 + v2 + v3
        b.position = b.position + b.velocity
    END
[0173] The vehicle rules are discussed next. One embodiment
simulates simple agents (vehicles) that are allowed to move
according to a set of basic rules. The result is akin to a flock of
birds, a school of fish, or a swarm of insects. In one embodiment,
flocking behavior for each vehicle is controlled by three
rules:
[0174] Separation--avoid crowding neighbors (short range
repulsion)
[0175] Alignment--steer towards average heading of neighbors
[0176] Cohesion--steer towards average position of neighbors (long
range attraction)
[0177] Rule 1: Vehicles try to go towards the center of mass of
neighboring vehicles. The `center of mass` is simply the average
position of all the vehicles. Assume there are N vehicles, called
b1, b2, . . . , bN. Also, the position of a vehicle b is denoted
b.position. Then the `center of mass` c of all N vehicles is given
by: c=(b1.position+b2.position+ . . . +bN.position)/N
[0178] However, the `center of mass` is a property of the entire
flock of vehicles; it is not something that would be considered by
an individual vehicle. Each vehicle is moved toward its `perceived
center`, which is the center of all the other vehicles, not
including itself. Thus, for vehicle J (1<=J<=N), the
perceived center pcJ is given by:
pcJ=(b1.position+b2.position+ . . . +bJ-1.position+bJ+1.position+ .
. . +bN.position)/(N-1)
[0179] Having calculated the perceived center, the system moves the vehicle towards it. To move the vehicle 1% of the way towards the center, the offset is given by (pcJ - bJ.position)/100:
TABLE-US-00003
PROCEDURE rule1(vehicle bJ)
    Vector pcJ
    FOR EACH VEHICLE b
        IF b != bJ THEN
            pcJ = pcJ + b.position
        END IF
    END
    pcJ = pcJ / (N-1)
    RETURN (pcJ - bJ.position) / 100
[0180] Rule 2: Vehicles try to keep a small distance away from
other objects (including other vehicles). The rule ensures vehicles do not collide with each other. If a vehicle is within a defined small distance (say 100 units) of another vehicle, it is moved away. This is done by subtracting from a vector c the displacement of each nearby vehicle.
TABLE-US-00004
PROCEDURE rule2(vehicle bJ)
    Vector c = 0
    FOR EACH VEHICLE b
        IF b != bJ THEN
            IF |b.position - bJ.position| < 100 THEN
                c = c - (b.position - bJ.position)
            END IF
        END IF
    END
    RETURN c
[0181] If two vehicles are near each other, they will be slightly
steered away from each other, and at the next time step if they are
still near each other they will be pushed further apart. Hence, the
resultant repulsion takes the form of a smooth acceleration. If two
vehicles are very close to each other it's probably because they
have been driving very quickly towards each other, considering that
their previous motion has also been restrained by this rule.
Suddenly jerking them away from each other is not comfortable for passengers; instead, the process has them slow down and accelerate away from each other until they are sufficiently far apart.
[0182] Rule 3: Vehicles try to match velocity with near
vehicles.
[0183] This is similar to Rule 1; however, instead of averaging the positions of the other vehicles we average the velocities. We calculate a `perceived velocity`, pvJ, then add a small portion (about an eighth) to the vehicle's current velocity.
[0184] PROCEDURE rule3(vehicle bJ)
[0185] Vector pvJ
[0186] FOR EACH VEHICLE b
[0187] IF b != bJ THEN
[0188] pvJ = pvJ + b.velocity
[0189] END IF
[0190] END
[0191] pvJ = pvJ / (N-1)
[0192] RETURN (pvJ - bJ.velocity) / 8
[0193] END PROCEDURE
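Putting the three rules together, the following minimal Python sketch shows one update step for a flock of simulated vehicles; the two-dimensional Vehicle record, the helper functions, and the constants (100-unit separation radius, 1% cohesion factor, one-eighth alignment factor) mirror the pseudocode above but are otherwise illustrative.

```python
# Minimal sketch of the three flocking rules above for simple 2D vehicles;
# the record layout and constants are illustrative only.
from dataclasses import dataclass, field

@dataclass
class Vehicle:
    position: list
    velocity: list = field(default_factory=lambda: [0.0, 0.0])

def _add(a, b):   return [a[0] + b[0], a[1] + b[1]]
def _sub(a, b):   return [a[0] - b[0], a[1] - b[1]]
def _scale(a, s): return [a[0] * s, a[1] * s]

def rule1(bj, flock):
    """Cohesion: move 1% of the way towards the perceived center."""
    others = [b for b in flock if b is not bj]
    pc = [sum(b.position[i] for b in others) / len(others) for i in range(2)]
    return _scale(_sub(pc, bj.position), 0.01)

def rule2(bj, flock):
    """Separation: move away from vehicles closer than 100 units."""
    c = [0.0, 0.0]
    for b in flock:
        if b is not bj:
            offset = _sub(b.position, bj.position)
            if (offset[0] ** 2 + offset[1] ** 2) ** 0.5 < 100:
                c = _sub(c, offset)
    return c

def rule3(bj, flock):
    """Alignment: add one eighth of the perceived velocity difference."""
    others = [b for b in flock if b is not bj]
    pv = [sum(b.velocity[i] for b in others) / len(others) for i in range(2)]
    return _scale(_sub(pv, bj.velocity), 0.125)

def move_all_vehicles_to_new_positions(flock):
    for b in flock:
        for v in (rule1(b, flock), rule2(b, flock), rule3(b, flock)):
            b.velocity = _add(b.velocity, v)
        b.position = _add(b.position, b.velocity)

# Example: three vehicles updated for one time step.
flock = [Vehicle([0.0, 0.0]), Vehicle([50.0, 0.0]), Vehicle([0.0, 50.0])]
move_all_vehicles_to_new_positions(flock)
```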
[0194] Additional rules are implemented as new procedures returning a vector to be added to a vehicle's velocity.
[0195] The action of a crowd or traffic is discussed next, for example, to handle strong traffic:
[0196] PROCEDURE strong_traffic(Vehicle b)
[0197] Vector traffic
[0198] RETURN traffic
[0199] END PROCEDURE
[0200] This function returns the same value independent of the
vehicle being examined; hence the entire flock will have the same
push due to the traffic or crowd.
[0201] Limiting the speed of vehicles is discussed next. For a limiting speed vlim:
[0202] PROCEDURE limit_velocity(Vehicle b)
[0203] Integer vlim
[0204] Vector v
[0205] IF |b.velocity| > vlim THEN
[0206] b.velocity = (b.velocity / |b.velocity|) * vlim
[0207] END IF
[0208] END PROCEDURE
[0209] This procedure creates a unit vector by dividing b.velocity
by its magnitude, then multiplies this unit vector by vlim. The
resulting velocity vector has the same direction as the original
velocity but with magnitude vlim.
[0210] The procedure operates directly on b.velocity, rather than
returning an offset vector. It is not used like the other rules;
rather, this procedure is called after all the other rules have
been applied and before calculating the new position, i.e. within
the procedure move_all_vehicles_to_new_positions:
[0211] b.velocity = b.velocity + v1 + v2 + v3 + ...
[0212] limit_velocity(b)
[0213] b.position = b.position + b.velocity
[0214] Bounding the position is discussed next. In order to keep the flock within a certain zone, the vehicles are allowed to drive out of the zone but are then slowly turned back, avoiding any harsh motions.
TABLE-US-00005
PROCEDURE bound_position(Vehicle b)
    Integer Xmin, Xmax, Ymin, Ymax, Zmin, Zmax
    Vector v
    IF b.position.x < Xmin THEN v.x = 10
    ELSE IF b.position.x > Xmax THEN v.x = -10
    IF b.position.y < Ymin THEN v.y = 10
    ELSE IF b.position.y > Ymax THEN v.y = -10
    IF b.position.z < Zmin THEN v.z = 10
    ELSE IF b.position.z > Zmax THEN v.z = -10
    RETURN v
[0215] Here of course the value 10 is an arbitrary amount to
encourage them to drive in a particular direction.
[0216] During the course of flock control, one may want to break up
the flock for various reasons. For example the introduction of a
predator may cause the flock to scatter in all directions. The
predator can be an object on an impending collision course with the
flock. To scatter the flock, the vehicles disperse; they are not necessarily moving away from any particular object, but the cohesion of the flock is broken (for example, when the flock encounters a dangerously driven vehicle). Thus the system negates part of the influence of the vehicle rules.
TABLE-US-00006
PROCEDURE move_all_vehicles_to_new_positions( )
    FOR EACH VEHICLE b
        v1 = m1 * rule1(b)
        v2 = m2 * rule2(b)
        v3 = m3 * rule3(b)
        b.velocity = b.velocity + v1 + v2 + v3 + ...
        b.position = b.position + b.velocity
[0217] When the risk of collision arises, the process can make m1 negative to scatter the flock. Setting m1 to a positive value again will cause the flock to spontaneously re-form.
[0218] Tendency away from a particular place is handled next. If
the flock is to continue the flocking behavior but to move away
from a particular place or object (such as a car that appears to
collide with the flock), then we need to move each vehicle
individually away from that point. The calculation required is
identical to that of moving towards a particular place, implemented
above as tend_to_place; all that is required is a negative
multiplier: v=-m*tend_to_place(b).
[0219] The vehicles can be organized into a V formation (sometimes called a skein), the symmetric V-shaped formation for drag reduction and fuel saving where all the cars except the first drive
in the upwash from the wingtip vortices of the car ahead. The
upwash assists each car in supporting its own weight in flight, in
the same way a glider can climb or maintain height indefinitely in
rising air.
[0220] FIG. 6F shows a flock of automatically driven motor vehicles
each having the flock behavior. The motor vehicles of the flock establish a target motor vehicle which will be used as a reference for flocking. In FIG. 6F, the leading motor vehicle of the flock is established as the target motor vehicle by the motor vehicles of the flock. The target motor vehicle may be established before the motor vehicles start running in a flock. In another
embodiment, the first motor vehicle of the flock detects a
preceding motor vehicle with the information from the radar or the
CCD camera on the leading motor vehicle or flock leader, and
automatically establishes the detected preceding motor vehicle as a
new target motor vehicle. By successively changing new target motor
vehicles in this manner, new motor vehicles may automatically be
added to the flock. Even if a motor vehicle is incapable of
communication between motor vehicles, that motor vehicle may be
established as a target motor vehicle according to an algorithm
described later on.
[0221] In one embodiment, the leading motor vehicle of the flock
establishes a hypothetical target motor vehicle, and transmits
items of information of the hypothetical target motor vehicle to
the other motor vehicles of the flock which follow the flock leader
through the inter-vehicular communications such as DSRC.
[0222] Each vehicle in the flock is responsible for generating a
speed plan which governs the relationship between the position in
which the motor vehicle runs and the speed at which the motor
vehicle runs. The vehicles determine, based on the speed plan, a planned position to be reached from the present position of
the motor vehicle after a predetermined time t, e.g., 1.5 seconds,
and a planned speed of the motor vehicle at the planned position in
the flock. According to this function, if the speed plan from the
present position of the motor vehicle is generated such that the
motor vehicle is to maintain the speed of 80 km/h, i.e., 22.2
m/sec., then the planned position to be reached after the
predetermined time t, e.g., 1.5 seconds, is 33.3 m spaced from the
present position down the running path B, and the planned speed at
the planned position to be reached is 80 km/h.
[0223] The function as the predicted value calculating means serves
to determine a predicted position and a predicted speed to be
reached by the motor vehicle after the predetermined time t. The
predicted position is calculated from the present position, i.e.,
the traveled distance, the present speed, and the present
acceleration of the motor vehicle which are given from the
communication module 1, and the predicted speed is calculated from
the present speed and the present acceleration of the motor
vehicle.
[0224] The speed/acceleration of the vehicle, based on which the
predicted position and the predicted speed will be determined, is
basically determined from the speedometer. The predicted position
and the predicted speed are determined using the speed and the
acceleration of the motor vehicle and GPS position.
[0225] A distance deviation, i.e., a position error, between a
planned position to be reached by the motor vehicle after the
predetermined time t based on the speed plan and the predicted
position, described above, to be reached by the motor vehicle, and
a speed deviation, i.e., a speed error, between a planned speed to
be reached by the motor vehicle after the predetermined time t
based on the speed plan and the predicted speed, described above,
to be reached by the motor vehicle are determined. These deviations
are calculated by subtractions.
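These calculations can be illustrated with a minimal Python sketch, assuming constant acceleration over the prediction horizon; the function names are illustrative, while the 1.5 second horizon and the 80 km/h (22.2 m/s) speed follow the example in the text.

```python
def planned_state(present_position_m, planned_speed_ms, t=1.5):
    """Planned position and speed after t seconds under a constant-speed plan;
    at 22.2 m/s (80 km/h) the planned position is about 33.3 m down the path."""
    return present_position_m + planned_speed_ms * t, planned_speed_ms

def predicted_state(present_position_m, present_speed_ms, present_accel_ms2, t=1.5):
    """Predicted position and speed after t seconds, assuming constant acceleration."""
    position = present_position_m + present_speed_ms * t + 0.5 * present_accel_ms2 * t ** 2
    speed = present_speed_ms + present_accel_ms2 * t
    return position, speed

def deviations(planned, predicted):
    """Position error and speed error, obtained by simple subtraction."""
    return planned[0] - predicted[0], planned[1] - predicted[1]

# Example: the plan holds 22.2 m/s; the vehicle is at 22.0 m/s and accelerating slightly.
position_error, speed_error = deviations(planned_state(0.0, 22.2),
                                         predicted_state(0.0, 22.0, 0.1))
```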
[0226] The target motor vehicle may be a flock leader. If, however,
the target motor vehicle is not a flock leader, then the flock
leader calculates a position, a speed, and an acceleration of the
target motor vehicle using the laser radar, GPS, or triangulation
of RF signals, for example.
[0227] Based on the above control algorithm, the engine throttle
valve opening, the transmission, and the brake of each of plural
following motor vehicles are controlled to control the motor
vehicles in a flock.
[0228] The system detects the positional data of the preceding
motor vehicle through inter-vehicular communications or the laser
radar, and controls the following motor vehicle in the event that
the preceding motor vehicle drops out of a normal control range of
the vehicle flock control. Even when a motor vehicle drops out of
the normal range of the vehicle flock control, the control
algorithm controls a following motor vehicle to increase its inter-vehicular distance to such a motor vehicle. Therefore, the vehicle platoon control will not be interrupted even when one or more motor vehicles drop out of the platoon.
[0229] If it is known that a group of motor vehicles will travel in
platoon or motor vehicles are counted at a tollgate or the like and
the incremental count is indicated to each motor vehicle to let it
recognize its position in the platoon, then it is possible to
establish the position i for each of the motor vehicles before they
travel in platoon.
[0230] However, in order to handle a situation where another motor
vehicle pulls in between motor vehicles running in platoon or
another motor vehicle is added to a front or rear end of a platoon
of motor vehicles, the process according to the present invention
makes it possible for each of the motor vehicles running in flock
to recognize its position relative to a target motor vehicle
through inter-vehicular communications.
[0231] There are two procedures available for each of the motor
vehicles running in flock to recognize its position relative to a
target motor vehicle. The first procedure is applicable to local
inter-vehicular communications by which each of the motor vehicles
of the flock can communicate with only those motor vehicles which
run immediately in front of and behind the motor vehicle. If the
flock leader of a flock is selected as a target motor vehicle, then
the target motor vehicle transmits its own positional information
i=0 to a next motor vehicle which immediately follows the target
motor vehicle. The following motor vehicle adds 1 to i, producing
its own positional information i=1, recognizes that it is the
second motor vehicle from the target motor vehicle, and transmits
its own positional information i=1 to a next motor vehicle which
immediately follows the second motor vehicle. Having received the
positional information i=1, the next immediately following motor
vehicle adds 1 to i, producing its own positional information i=2,
recognizes that it is the third motor vehicle from the target motor
vehicle, and transmits its own positional information i=2 to a next
motor vehicle which immediately follows the third motor vehicle. In
this manner, each of the motor vehicles is able to recognize its
position relative to the target motor vehicle with a means for
counting its position and local inter-vehicular communications.
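The first procedure can be pictured with a minimal Python sketch, assuming each vehicle simply forwards an incremented index to the vehicle immediately behind it; the function name and data structures are illustrative.

```python
def assign_positions(flock_front_to_back):
    """Relay positional indices down the flock: the target vehicle transmits i=0,
    and each following vehicle adds 1 before forwarding the value to the vehicle
    immediately behind it."""
    positions = {}
    i = 0
    for vehicle in flock_front_to_back:
        positions[vehicle] = i   # this vehicle now knows it is the (i+1)-th in line
        i += 1                   # the value transmitted to its immediate follower
    return positions

# Example: {"target": 0, "follower_1": 1, "follower_2": 2}
positions = assign_positions(["target", "follower_1", "follower_2"])
```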
[0232] If a target motor vehicle is not the flock leader of a flock
and the target motor vehicle and the flock leader cannot
communicate with each other through inter-vehicular communications,
then the flock leader sets its own positional information to i=1,
and transmits the own positional information i=1 to a next motor
vehicle which immediately follows the target motor vehicle.
[0233] According to the present invention, as described above, a
longitudinal acceleration correcting quantity of each of the motor
vehicles of a flock is determined on the basis of predicted
deviations of a position and a speed that are predicted after a
predetermined time, from a speed plan, and the speed of the motor
vehicle is controlled on the basis of the determined longitudinal
acceleration correcting quantity. Therefore, the motor vehicles can
smoothly be controlled to run in flock along a running path on a
road.
[0234] A longitudinal acceleration correcting quantity of a motor
vehicle following a target motor vehicle is determined on the basis
of an inter-vehicular distance between the following motor vehicle
and the target motor vehicle and a speed difference there-between
after a predetermined time, and the speed of the following motor
vehicle is controlled on the basis of the determined longitudinal
acceleration correcting quantity. Consequently, the following motor
vehicle can automatically be driven smoothly along a running path
on a road while reliably keeping a proper inter-vehicular distance
between the following motor vehicle and the target motor
vehicle.
[0235] Since the system arrangements on a flock leader and a
following motor vehicle of a flock are identical to each other, the
flock leader and the following motor vehicle can automatically be
driven in a matching manner using slightly different software or program adaptations made therefor. Therefore, any one of the motor vehicles of the flock may become a flock leader or a following motor vehicle.
[0236] Each of following motor vehicles of a flock is not only
controlled with respect to a flock leader, but also always monitors
an inter-vehicular distance between itself and a preceding motor
vehicle, so that it can increase the inter-vehicular distance even
when a motor vehicle drops out of the flock. Therefore, it is not
necessary to stop controlling the vehicle flock control when a
motor vehicle drops out of the flock. Even when a motor vehicle
drops out of a flock, the vehicle flock control system does not
stop controlling the other motor vehicles to run in flock, and when
the motor vehicle that has dropped out returns to the flock, the
vehicle flock control system can continuously control the motor
vehicles to run in flock. The vehicle flock control system allows
different types of motor vehicles, such as trucks of different
lengths, smaller automobiles, larger automobiles, etc., to be mixed
in a flock, and can control those motor vehicles to run in flock.
Accordingly, the vehicle flock control system according to the
present invention is capable of stably controlling motor vehicles
to run in flock on a road designed for motor vehicles to run
automatically, and particularly of controlling the speeds of such
motor vehicles smoothly.
[0237] Lane Marking Visibility Handling
[0238] In some embodiments, a lead vehicle identifies lane
information that may include lane markings on the road, and the
computer system may use one or more sensors to sense the lane markings. For example, the computer system may use an image-capture
device to capture images of the road and may detect the lane
markings by analyzing the images for predetermined colors, shapes,
and/or brightness levels that are similar to a predetermined color,
shape, and/or brightness of the lane markings. As another example,
the computer system may project a laser onto the road and may
detect the lane markings by analyzing reflections off the road for
an intensity that is similar to a predetermined intensity of a
reflection off the lane markings. The computer system may estimate
the location of the lane based on the sensed lane markings and
control the vehicle to follow the lane. The vehicles behind the
lead vehicle can then simply follow the lead vehicle as part of a
flock.
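A minimal sketch of such image-based detection, assuming a grayscale image stored as a NumPy array and a simple brightness threshold; the threshold values and helper names are illustrative rather than taken from the specification.

```python
import numpy as np

def detect_lane_marking_pixels(gray_image, brightness_threshold=200):
    """Boolean mask of pixels whose brightness is close to the predetermined
    brightness expected of painted lane markings."""
    return gray_image >= brightness_threshold

def lane_marking_columns(mask, min_pixels=50):
    """Image columns containing enough bright pixels to be treated as
    candidate lane-marking locations."""
    return np.nonzero(mask.sum(axis=0) >= min_pixels)[0]

# Example: a synthetic 480x640 image with one bright vertical stripe at column 320.
image = np.zeros((480, 640), dtype=np.uint8)
image[:, 320:324] = 255
columns = lane_marking_columns(detect_lane_marking_pixels(image))
```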
[0239] At some point, the lead vehicle may determine that the lane
information has become unavailable or unreliable. For example,
severe fog may be present and severely obscure the lane markings. In
other examples, the vehicle may no longer be able to detect the
lane markings on the road, the vehicle may detect contradictory
lane markings on the road, the vehicle may no longer be able to
determine a geographic location of the vehicle, and/or the vehicle
may not be able to access a predetermined map of the road. Other
examples are possible as well. In response to determining that the
lane information has become unavailable or unreliable, the computer
system may use at least one sensor to monitor at least one
neighboring vehicle, such as a neighboring vehicle in a neighboring
lane or a neighboring vehicle behind the vehicle that is part of
the flock. The computer system may then control the vehicle to maintain a distance between the vehicle and the at least one neighboring vehicle of at least a predetermined minimum distance, so that even if the vehicle is unable to rely on the lane information to estimate a location of the lane on the road, the vehicle may avoid colliding with the at least one neighboring vehicle.
[0240] In other embodiments, the lane information may include a
geographic location of the vehicle and a predetermined map of the
road. The computer system may determine the geographic location of
the vehicle by, for example, querying a location server for the
geographic location of the vehicle. Alternatively, if the
predetermined map indicates a geographic location of at least two
objects near the vehicle, the computer system may determine the
geographic location of the vehicle by, for example, using a laser
rangefinder or light detection and ranging (LIDAR) unit to estimate
a distance from the vehicle to the at least two objects near the
vehicle and determining the geographic location of the vehicle
using triangulation. Other examples are possible as well. In any
case, the computer system may then locate the geographic location
of the vehicle on the predetermined map to determine a location of
the lane relative to the geographic location of the vehicle.
[0241] In still other embodiments, the lane information may be
derived from a leading vehicle that is in front of the vehicle in
the lane and correlation with other information such as map data
and independent lane analysis to prevent a blind-following-the-blind situation. The computer system may estimate a path of the
leading vehicle using, for example, a laser rangefinder and/or a
LIDAR unit. Other examples are possible as well. Once the computer
system has estimated the path of the leading vehicle, the computer
system may estimate the location of the lane based on the estimated
path. For example, the computer system may estimate the location of
the lane to include the estimated path (e.g., extend by half of a
predetermined lane width on either side of the estimated path).
Other examples are possible as well.
[0242] In some embodiments, the computer system may maintain a
predetermined threshold for the lane information, and the computer
system may determine that the lane information has become
unavailable or unreliable when the computer system detects that a
confidence of the lane information (e.g., how confident the
computer system is that the lane information is reliable) is below
the predetermined threshold. In some embodiments, the computer
system may additionally maintain a predetermined time period for
the lane information, and the computer system may determine that
the lane information has become unavailable or unreliable when the
computer system detects that a confidence of the lane information
is below the predetermined threshold for at least the predetermined
amount of time.
[0243] Upon determining that the lane information has become
unavailable or unreliable, the computer system may use at least one
sensor to monitor at least one neighboring vehicle. The at least
one neighboring vehicle may include, for example, a neighboring
vehicle in a lane adjacent to the lane in which the vehicle is
traveling. As another example, the at least one neighboring vehicle
may include a neighboring vehicle behind the vehicle in the lane in
which the vehicle is traveling. As still another example, the at
least one neighboring vehicle may include a first neighboring
vehicle and a second neighboring vehicle, each of which may be
either in a lane adjacent to the lane in which the vehicle is
traveling or behind the vehicle in the lane in which the vehicle is
traveling. Other examples are possible as well.
[0244] When the lane information has become unavailable or
unreliable, the computer system may control the vehicle to maintain
a distance between the vehicle and the at least one neighboring
vehicle to be at least a predetermined distance. The predetermined
distance may be, for example, a distance determined to be a safe
distance and/or a distance approximately equal to the difference
between a predetermined lane width and a width of the vehicle.
Other predetermined distances are possible as well.
[0245] In order to maintain the distance between the vehicle and
the at least one neighboring vehicle to be at least the
predetermined distance, the computer system may continuously or
periodically use the at least one sensor on the vehicle to monitor
the distance between the vehicle and the at least one neighboring
vehicle. The computer system may monitor the distance between the
vehicle and the at least one neighboring vehicle using, for
example, a laser rangefinder and/or LIDAR unit. If the distance
between the vehicle and the at least one neighboring vehicle
becomes less than the predetermined distance, the computer system
may move the vehicle away from the at least one neighboring vehicle
in order to maintain the distance between the vehicle and the at
least one neighboring vehicle to be at least the predetermined
distance.
[0246] In some embodiments, in addition to maintaining the distance
between the vehicle and the at least one neighboring vehicle to be
at least the predetermined distance, the computer system may
additionally maintain the distance between the vehicle and the at
least one neighboring vehicle to be within a predetermined range of
the predetermined distance. In these embodiments, if the distance
between the vehicle and the at least one neighboring vehicle
becomes too large (e.g., no longer within the predetermined range
of the predetermined distance), the computer system may move the
vehicle closer to the at least one neighboring vehicle. This may,
for example, prevent the vehicle from drifting so far away from the
neighboring vehicle that the vehicle drifts into a lane on the
opposite side of the vehicle from the neighboring vehicle.
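A minimal sketch of the distance-keeping behavior described in the two preceding paragraphs, assuming a measured range to the neighboring vehicle (for example, from a laser rangefinder or LIDAR unit) and a simple lateral offset command; the names and numeric values are illustrative.

```python
def lateral_adjustment(measured_distance_m, min_distance_m=2.0,
                       max_distance_m=3.5, step_m=0.1):
    """Small lateral offset command: move away if the neighbor is closer than the
    predetermined minimum distance, drift back if the gap leaves the allowed
    range, otherwise hold the current lateral position."""
    if measured_distance_m < min_distance_m:
        return +step_m   # steer away from the neighboring vehicle
    if measured_distance_m > max_distance_m:
        return -step_m   # steer back towards the neighboring vehicle
    return 0.0

# Example: a 1.6 m gap to the neighbor produces a 0.1 m move away from it.
command = lateral_adjustment(1.6)
```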
[0247] As noted above, in some embodiments the at least one vehicle
may include a first neighboring vehicle and a second neighboring
vehicle. In these embodiments, maintaining the distance between the
vehicle and the at least one neighboring vehicle may involve
maximizing both a first distance between the vehicle and the first
neighboring vehicle and a second distance between the vehicle and
the second neighboring vehicle (e.g., such that the vehicle remains
approximately in the middle between the first neighboring vehicle
and the second neighboring vehicle). Each of the first distance and
the second distance may be at least the predetermined distance.
[0248] In some embodiments, in addition to maintaining the distance
between the vehicle and the at least one neighboring vehicle to be
at least the predetermined distance, the computer system may
determine an updated estimated location of the lane. To this end,
the computer system may use the at least one sensor to monitor at
least a first distance to the at least one neighboring vehicle and
a second distance to the at least one neighboring vehicle. Based on the first
distance and the second distance, the computer system may determine
a first relative position and a second relative position (e.g.,
relative to the vehicle) of the at least one neighboring vehicle.
Based on the first relative position and the second relative
position, the computer system may estimate a path for the at least
one neighboring vehicle. The computer system may then use the
estimated path to determine an updated estimated location of the
lane. For example, in embodiments where the at least one
neighboring vehicle is traveling in a lane adjacent to the lane in
which the vehicle is traveling, the computer system may determine
the estimated location of the lane to be substantially parallel to
the estimated path (e.g., the lane may be centered on a path that
is shifted from the estimated path by, e.g., a predetermined lane
width and may extend by half of the predetermined lane width on
either side of the path). As another example, in embodiments where
the at least one neighboring vehicle is traveling behind the
vehicle in the lane in which the vehicle is traveling, the computer
system may determine the estimated location of the lane to be an
extrapolation (e.g., with constant curvature) of the estimated
path. Other examples are possible as well.
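A minimal sketch of deriving an updated lane estimate from two relative positions of a neighboring vehicle in an adjacent lane, assuming two-dimensional coordinates in the vehicle's own frame and a fixed lane width; every name and value here is illustrative.

```python
def estimate_adjacent_lane_center(rel_pos_1, rel_pos_2, lane_width_m=3.5):
    """Fit a straight path through two observed relative positions of a vehicle in
    the adjacent lane, then shift it laterally by one lane width towards the ego
    vehicle to estimate the centerline of the ego vehicle's own lane."""
    (x1, y1), (x2, y2) = rel_pos_1, rel_pos_2
    dx, dy = x2 - x1, y2 - y1
    length = (dx * dx + dy * dy) ** 0.5
    # Two unit normals to the neighbor's path; keep the one pointing towards the
    # ego vehicle (the origin of the relative frame).
    candidates = [(-dy / length, dx / length), (dy / length, -dx / length)]
    nx, ny = min(candidates,
                 key=lambda n: (x1 + n[0] * lane_width_m) ** 2 + (y1 + n[1] * lane_width_m) ** 2)
    shift_x, shift_y = nx * lane_width_m, ny * lane_width_m
    return [(x1 + shift_x, y1 + shift_y), (x2 + shift_x, y2 + shift_y)]

# Example: a neighbor observed 3.5 m to the left at two points along the road
# yields an ego-lane centerline passing through the origin.
lane_centerline = estimate_adjacent_lane_center((-3.5, 0.0), (-3.5, 10.0))
```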
[0249] In some embodiments, the computer system may additionally
use a speed sensor to monitor a speed of the at least one
neighboring vehicle and may modify a speed of the vehicle to be
less than the speed of the at least one neighboring vehicle. This
may allow the vehicle to be passed by the at least one neighboring
vehicle. Once the at least one neighboring vehicle has passed the
vehicle, the at least one neighboring vehicle may become a leading
vehicle, either in a lane adjacent to the lane in which the vehicle
is traveling or a leading vehicle that is in front of the vehicle
in the lane in which the vehicle is traveling, and the computer
system may estimate the location of the lane of the road based on
an estimated path of the leading vehicle, as described above.
[0250] In some embodiments, the computer system may begin to
monitor the at least one neighboring vehicle only in response to
determining that the lane information has become unavailable or
unreliable. In these embodiments, prior to determining that the
lane information has become unavailable or unreliable, the computer
system may rely solely on the lane information to estimate the
location of the lane. In other embodiments, however, the computer
system may also monitor the at least one neighboring vehicle prior
to determining that the lane information has become unavailable or
unreliable. In these embodiments, the computer system may
additionally use the distance to the at least one neighboring
vehicle to estimate the location of the lane in which the vehicle
is traveling. For example, if the at least one neighboring vehicle
is traveling in a lane adjacent to the lane in which the vehicle is
traveling, the computer system may determine that the lane does not
extend to the at least one neighboring vehicle. As another example,
if the at least one neighboring vehicle is traveling behind the
vehicle in the lane in which the vehicle is traveling, the computer
system may determine that the lane includes the at least one
neighboring vehicle. Other examples are possible as well.
Alternatively, in these embodiments, prior to determining that the
lane information has become unavailable or unreliable, the computer
system may simply use the distance to the at least one neighboring
vehicle to avoid collisions with the at least one neighboring
vehicle.
[0251] Further, in some embodiments, once the vehicle begins to
monitor the at least one neighboring vehicle, the computer system
may stop using the lane information to estimate the location of the
lane in which the vehicle is traveling. In these embodiments, the
computer system may rely solely on the distance to the at least one
neighboring vehicle to avoid collisions with the at least one
neighboring vehicle until the lane information becomes available or
reliable. For example, the computer system may periodically attempt
to obtain updated lane information. Once the computer system determines that the lane information has become available or reliable, the computer system may once again rely on the updated estimated location of the lane and less (or not at all) on the distance to the at least one neighboring vehicle. The computer system may determine that the updated lane information is reliable when, for example, the computer system determines that a confidence of the updated lane information is greater than a predetermined threshold. This threshold may be the same as or different from the predetermined threshold used to detect that the lane information had become unreliable.
[0252] FIG. 11 illustrates a typical network environment 4100 in which the systems, methods, and computer program products may be implemented, according to embodiments as disclosed herein. In an embodiment, the environment 4100 includes a plurality of drivers, who are seeking insurance, driving vehicles 4102. The vehicle 4102 described herein can be configured to include a driver monitoring unit 4104 installed thereon. The monitoring device may be self-contained, such as a single unit mounted on a windshield or
dashboard of the vehicle 4102. Alternatively, the monitoring device
4104 may include multiple components, such as a processor or
central unit mounted under a car seat or in a trunk of the vehicle
and a user interface mounted on a dashboard or windshield.
Similarly, the monitoring unit 4104 may have a self-contained
antenna in the unit or may be connected to remotely mounted
antennas for communication with remote systems.
[0253] Further, the driver monitoring units 4104 may be connected
to an on-board diagnostic system or data bus in the vehicle 4102.
Information and behavior data associated with the driver may be
collected from the on-board diagnostic system. The driver
monitoring system may receive inputs from internal and external
sources and sensors such as accelerometers, global positioning
systems (GPS), vehicle on-board diagnostic systems, seatbelt
sensors, wireless device, or cell phone use detectors, alcohol
vapor detectors, or trans-dermal ethanol detection. Further, the
details related to the driver monitoring unit 4104 are described in
conjunction with FIG. 12.
[0254] Further, the information may be exchanged between driver
monitoring unit 4104 and the central monitoring system or server 4106 in
real-time or at intervals. For example, the driver behavior
parameters may be transmitted to server 4106 via a communication
network 4108. In an embodiment, the communication network 4108
described herein can include for example, but not limited to, a
cellular, satellite, Wi-Fi, Bluetooth, infrared, ultrasound, short
wave, microwave, global system for mobile communication, or any
other suitable network. The information sent to the server 4106 may then be forwarded to one or more insurance providers 4110. The
server 4106 can be configured to process the driver behavior
parameters and/or store the data to a local or remote database. The
drivers or insurance provider can access the data on the server
4106. In some embodiments, the data captured by monitoring unit
4104 in the vehicle 4102 may be transmitted via a hardwired
communication connection, such as an Ethernet connection that is
attached to vehicle 4102 when the vehicle is within a service yard
or at a base station or near the server 4106. Alternatively, the
data may be transferred via a flash memory, diskette, or other
memory device that can be directly connected to the server
4106.
[0255] In one embodiment of the invention, the data captured by
driver monitoring unit 4104 can be used to monitor, provide
feedback, mentor, provide recommendations, adjust insurance rates,
and to analyze a driver's behavior during certain events. For
example, if vehicle 4102 is operated improperly, such as speeding,
taking turns too fast, colliding with another vehicle, or driving
in an unapproved area, then the driver monitoring unit 4104 or
server 4106 may adjust the insurance rates for the driver and
provide feedback and suggestions to the driver, such as to improve
the driving skills. Additionally, if the driver's behavior is
inappropriate or illegal, such as not wearing a seatbelt or using a
cell phone while driving, then feedback and suggestions can be provided to the driver to improve the driving skills.
[0256] In an embodiment, the insurance price may be adjusted based
on the driver behavior. For example, if an insurance company,
supervisor, or other authority determines that the driver is
uninsured, underinsured, lacking coverage required in a particular
jurisdiction, that the driver's insurance premiums are delinquent,
and/or if the vehicle is not properly registered and/or delinquent
in registration with the state, then the driver monitoring unit 4104 may be directed to disable or deactivate the vehicle. Alternatively, the driver monitoring unit 4104 can provide feedback and recommendations to the driver if it is determined that the driver is uninsured, underinsured, lacking coverage required in a particular jurisdiction, or that the driver's insurance premiums are delinquent. In an embodiment, the driver's behavior is typically evaluated while driving the vehicle 4102 with the driver monitoring unit 4104 installed thereon. After receiving the driver behavior data from the driver monitoring unit 4104, the insurance rates can be adjusted accordingly.
[0257] FIG. 12 is a diagram illustrating generally, a portion of
vehicle 4200 along with possible locations of sensors, cameras,
and/or other technologies, according to embodiments described
herein. In an embodiment, exemplary mounted locations for the
driver monitoring unit 4104 are illustrated, such as on a dashboard
4202, windshield 4204, headliner 4206, surface 4208, corner 4210.
It will be understood that all or parts of the driver monitoring
unit 4104 can be mounted in any other location that allows for
audio and/or visual feedback to the driver of the vehicle 4102
while the vehicle is in operation. The driver monitoring unit 4104
is illustrated as being coupled to the on-board diagnostic system, from which
it may receive inputs associated with the driver and vehicle
operating parameters. The driver monitoring units such as 4202,
4204, 4206, 4208, and 4210 can be coupled to the on-board diagnostic system (not shown). Moreover, the driver monitoring system may be coupled
to other sensors, such as a sensor for detecting the operation and
use of a cellular or wireless device in the vehicle 4102.
[0258] In an embodiment, the driver monitoring units can be configured to include sensors such as, for example, but not limited to, an accelerometer, cameras, a gyroscope, and a magnetometer. In an embodiment, the accelerometer can include at least
one accelerometer for measuring a lateral (sideways), longitudinal
(forward and aft) and vertical acceleration in order to determine
whether the driver is operating the vehicle in an unsafe or
aggressive manner. For example, excessive lateral acceleration may
be an indication that the driver is operating the vehicle at an
excessive speed around a turn along a roadway. Furthermore, it is
possible that the driver may be traveling at a speed well within
the posted speed limit for that area of roadway. However, excessive
lateral acceleration, defined herein as "hard turns," may be
indicative of aggressive driving behavior by the driver and may
contribute to excessive wear on tires and steering components as
well as potentially causing the load such as a trailer to shift and
potentially overturn.
[0259] As such, it can be seen that monitoring such driver behavior
by providing feedback and recommendations to the driver during the
occurrence of aggressive driving behavior such as hard turns can
improve safety and reduce accidents. In addition, providing
recommendations for such aggressive driver behavior can reduce wear
and tear on the vehicle and ultimately reduce fleet maintenance
costs as well as reduce insurance costs and identify at risk
drivers and driving behavior to fleet managers.
[0260] In one aspect, the driver monitoring system may be in data
communication with an on board diagnostic (OBD) system of the
vehicle such as via a port. In some vehicle models, the driver
monitoring system is in data communication with a controller area
network (CAN) system (bus) to allow acquisition of certain driver
and vehicle operating parameters including, but not limited to,
vehicle speed such as via the speedometer, engine speed or throttle
position such as via the tachometer, mileage such as via the
odometer reading, seat belt status, condition of various vehicle
systems including anti-lock-braking (ABS), turn signal, headlight,
cruise control activation and a multitude of various other
diagnostic parameters such as engine temperature, brake wear, and
the like. The OBD or CAN allows for acquisition of the
above-mentioned vehicle parameters for processing thereby and/or
for subsequent transmission to the server 4106.
[0261] In an embodiment, the driver monitoring system may also
include a GPS receiver (or other similar technology designed to
track location) configured to track the location and directional
movement of the driver in either real-time or over-time modes. As
is well known in the art, GPS signals may be used to calculate the
latitude and longitude of a driver as well as allowing for tracking
of driver movement by inferring speed and direction from positional
changes. Signals from GPS satellites also allow for calculating the
elevation and, hence, vertical movement, of the driver.
[0262] In an embodiment, the driver monitoring unit may further
include a mobile data terminal (MDT) mounted for observation and
manipulation by the driver, such as near the vehicle dash. The MDT
can be configured to include an operator interface such as a
keypad, keyboard, touch screen, display screen, or any suitable
user input device and may further include audio input capability
such as a microphone to allow voice communications. The driver
monitoring unit receives inputs from a number of internal and external sources. These include the OBD/CAN bus, which provides data from the vehicle's on-board diagnostic system, including engine performance data and system status information. A GPS receiver provides
location information. The CDR, XLM, or accelerometers provide
information regarding the vehicle's movement and driving
conditions. Any number of other sensors, such as but not limited
to, a seat belt sensor, proximity sensor, driver monitoring
sensors, or cellular phone use sensors, also provide inputs to the
driver monitoring system.
[0263] In an embodiment, the driver monitoring system may have any
type of user interface, such as a screen capable of displaying
messages to the vehicle's driver or passengers, and a keyboard,
buttons or switches that allow for user input. The system or the
user interface may have one or more status LEDs or other indicators
to provide information regarding the status of the device's
operation, power, communications, GPS lock, and the like.
Additionally, the LEDs or other indicators may provide feedback to
the driver when a driving violation occurs. Additionally, the monitoring system may have a speaker and microphone integral to the device.
[0264] In an embodiment, the monitoring system may be self-powered,
such as by a battery, or powered by the vehicle's battery and/or
power generating circuitry. Access to the vehicle's battery power
may be by accessing the power available on the vehicle's OBD and/or
CAN bus. The driver monitoring system may be self-orienting, which
allows it to be mounted in any position, angle or orientation in
the vehicle or on the dashboard. In an embodiment, the driver
monitoring system determines a direction of gravity and a direction
of driver movement and determines its orientation within the
vehicle using this information. In order to provide more accurate
measurements of driver behavior, the present invention filters
gravitational effects out of the longitudinal, lateral and vertical
acceleration measurements when the vehicle is on an incline or
changes its horizontal surface orientation. Driver behavior can be
monitored using the accelerometer, which preferably will be a
tri-axial accelerometer. Acceleration is measured in at least one
of lateral, longitudinal and/or vertical directions over a
predetermined time period, which may be a period of seconds or
minutes. An acceleration input signal is generated when a measured
acceleration exceeds a predetermined threshold.
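A minimal sketch of this threshold-based event generation, assuming gravity has already been filtered out of the accelerometer samples; the axis names, the 3 m/s^2 threshold, and the sample format are illustrative.

```python
def acceleration_events(samples, threshold_ms2=3.0):
    """Generate an input signal (event) whenever the measured lateral, longitudinal,
    or vertical acceleration exceeds a predetermined threshold. `samples` is an
    iterable of (timestamp, lateral, longitudinal, vertical) readings in m/s^2."""
    events = []
    for t, lateral, longitudinal, vertical in samples:
        for axis, value in (("lateral", lateral),
                            ("longitudinal", longitudinal),
                            ("vertical", vertical)):
            if abs(value) > threshold_ms2:
                events.append((t, axis, value))   # e.g. a hard turn on the lateral axis
    return events

# Example: one hard turn detected at t = 1.2 s.
events = acceleration_events([(1.0, 0.5, 0.2, 0.1), (1.2, 4.1, 0.3, 0.0)])
```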
[0265] It will be understood that the present invention may be used
for both fleets of vehicles and for individual drivers. For
example, the driver monitoring system described herein may be used
by insurance providers to monitor, recommend, provide feedback, and
adjust insurance rates based on the driving. A private vehicle owner may also use the present invention to monitor the driver behavior and use of the vehicle. For example, a parent may use the system described herein to monitor the behavior of a new or teenage driver.
[0266] An embodiment of the invention provides real-time
recommendations, training, or other feedback to a driver while
operating the vehicle. The recommendations are based upon observed
operation of the vehicle and are intended to change and improve
driver behavior by identifying improper or illegal operation of the
vehicle. The driver monitoring system may identify aggressive
driving violations. For example, based upon the inputs from an
accelerometer or CDR, aggressive driving behavior can be detected,
such as exceeding acceleration thresholds in a lateral,
longitudinal, or vertical direction, hard turns, hard acceleration
or jackrabbit starts, hard braking, and/or hard vertical movement
of the vehicle.
[0267] Further, in an embodiment, the sensor and camera described
herein can be configured to communicate with the vehicle
entertainment system. Typically, this functionality includes
pre-installed software or a user-downloadable application from a
network source (such as Apple's iTunes or Google's Android Market).
The system functionality may include mapping functions, directions,
landmark location, voice-control, and many other desirable
features. When such a mobile computing device is placed within the vehicle, a convenient vehicle entertainment system associated with the vehicle can be provided. In an embodiment, a remote switch
can be used to initiate the vehicle entertainment software
application by communicating with the cameras/sensors located in
the vehicle and/or software residing on the mobile computing
device. The remote switch described herein can include one of a number of well-known remote switches that use wireless or wired technology to communicate with the mobile computing device. For example, the remote switch may use, but is not limited to, Bluetooth, RF, infrared, or another well-known wireless communication technology, or it may be connected via one or more wires to the mobile computing device. The switch may be located on any
vehicle interior surface, such as on a steering wheel, visor,
dashboard, or any other convenient location.
[0268] FIG. 13 is a diagram 4300 illustrating generally, possible
locations of sensors, cameras, and/or other technologies, according
to embodiments described herein. FIG. 14 is a sequence diagram illustrating generally operations 300 performed by the system as described in FIG. 11, according to embodiments described herein. In an embodiment, at 4402, the driver monitoring unit 4104 can be
configured to monitor the behavior of the driver. The system can be
configured to include the driver monitoring unit 4104 installed in the vehicle 4102 to monitor the behavior parameters of the driver while the vehicle 4102 is being driven. The vehicle 4102 can
include cameras, gyroscope, magnetometer, accelerometer, and other
sensors installed thereon to monitor the behavior parameter of the
driver. In an embodiment, the cameras or sensors may be placed at
any place in the vehicle, such as for example at four corners of
the front windshield, in a way that it can directly capture the
behavior parameters of the driver. For example, based on the driver gestures, the cameras can detect finger position to determine that the driver is pointing at a particular object or vehicle and search the internet for the vehicle. Further, in an embodiment, a flexible display film may be adhesively secured on the front windshield. The display can be controlled by a computer to display information in a discreet way that does not take the driver's eyes off the road and opposing vehicles. In an embodiment, at 4404, the driver monitoring
unit 4104 can be configured to transmit the behavior parameters of
the driver to the server 4106. In an embodiment, the driver
behavior parameters described herein can include for example, but
not limited to, vehicle speed, vehicle accelerations, driver
location, seatbelt use, wireless device use, turn signal use,
driver aggression, detection of CO2 vapor, detection of alcohol,
driver seating position, time, and the like. In an embodiment, at
4406, the server 4106 can be configured to transmit the driver
behavior parameters to one or more insurance providers. In an
embodiment, at 4408, the server 4106 can be configured to analyze
the driver behavior parameters and adjust the insurance rates for
the driver. For example, if the driver is driving roughly after drinking alcohol, then the insurance rate may be increased. In an embodiment, at 4410, the server 4106 can be configured to match the driver behavior preferences with similar or substantially similar preferences of other drivers. The server 4106 can be configured to
generate action recommendations best matching the behavior of the
driver. In an embodiment at 4412, the server 4106 can be configured
to provide the generated recommendations to the driver. Based on
the driver behavior parameters, the server 4106 provides feedback
recommendations to the driver, such as to improve the driving
skills. In an
embodiment, at 4414, the server 4106 can be configured to
frequently monitor the behavior parameters associated with the
driver. Any changes in the behavior parameters can affect the
overall system performance and the driver experience. The server
4106 can be configured to frequently monitor and dynamically update
the insurance rate and action recommendations, which in turn helps the driver effectively improve his or her driving skills.
[0269] FIG. 15 is a diagram 4500 illustrating generally an overview of a recommender system that may allow drivers to obtain action
recommendations based on the driver behavior parameters, according
to embodiments disclosed herein. In an embodiment, the driver
behavior parameters can be used to provide customized
recommendations to drivers by comparing the driver behavior parameters with those of other drivers who have similar or substantially similar behavior parameters. Unlike conventional systems, the server 4106 can be configured to adaptively generate action recommendations for the driver based on the behavior parameters. The server 4106 can be configured to match the behavior parameters of the driver to similar behavior parameters of one or more other drivers, such as to provide personalized action recommendations to the driver. In an
embodiment, the recommendations can be filtered in advance of
display. In an embodiment, filtered recommendations may be derived
from the sources such as for example, but not limited to, those
sources that have added the data within a specified time, from
those sources that share specific similarities with the sources,
those sources that have been preselected by the driver as relevant,
those sources that are selected as friends or friends of friends,
and the like, those sources that are determined to provide valuable
reviews/ratings or are specifically declared to be experts within
the system or by the driver, or those users that have entered at
least a minimum amount of data into the system.
[0270] FIG. 16 is a diagram 4600 illustrating generally an overview of preferences matching by the server 4106, according to embodiments disclosed herein. FIG. 16 outlines recommender
functionality in accordance with an embodiment of the present
invention. The system 4100 can monitor the driver behavior and uses
the behavior data to match with the behavior data of other sources
and provide action recommendations to the driver. For example, if the driver behavior parameter indicates that the user is driving very fast (such as 70 km/h) in a school zone where the speed limit is no more than 30 km/h, then the system can be configured to execute one or more rules and suggest that the driver slow down.
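Such a rule can be pictured with a minimal Python sketch, assuming the zone type and its speed limit are already known from map data; the function name, message text, and thresholds are illustrative.

```python
def speed_recommendations(speed_kmh, zone_limit_kmh, zone_name="school zone"):
    """Apply a simple speed rule and return recommendations for the driver."""
    recommendations = []
    if speed_kmh > zone_limit_kmh:
        recommendations.append(
            f"Slow down: {speed_kmh} km/h exceeds the {zone_limit_kmh} km/h limit "
            f"in the {zone_name}."
        )
    return recommendations

# Example from the text: 70 km/h in a 30 km/h school zone.
messages = speed_recommendations(70, 30)
```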
[0271] In an embodiment, the activity recommendation rules may be
established in the recommendation system such as described in the
FIG. 16. Such rules may be derived from, for example, but not limited to, automatic generation using machine learning, automatic generation using a genetic algorithm, automatic generation using a neural network, automatic generation using a rule inference system, data mining, generation using a preset list of recommendations, and/or driver behavior. In an embodiment, the server 4106 can be configured to
receive the recommendation rules such as unidirectional rules,
bidirectional rules, generalized rules including multi-way rules,
rules among items, rules among sets, rules among collections, rules
with weight factors, rules with priorities, un-weighted and
un-prioritized rules, and the like.
[0272] FIG. 17 is a flow chart illustrating generally a method 4700 for selectively providing insurance information to a service
provider, according to embodiments as disclosed herein. At step
4702, the driver behavior is monitored. The behavior data can
include external parameters and/or internal parameters. In an
embodiment, the driver behavior data/parameters described herein
can include for example, but not limited to, vehicle speed, vehicle
accelerations, driver location, seatbelt use, wireless device use,
turn signal use, driver aggression, detection of ethanol vapor,
driver seating position, time, and the like. In an embodiment, the
behavior data can be over a period of hours, days, weeks, and so
forth. In an embodiment, the behavior data gathering can be
continuous, at predefined intervals, or at random intervals. In
accordance with some aspects, data can be gathered while a vehicle
is in operation and at other times (e.g., at two a.m. to determine
where the vehicle is parked overnight). In an embodiment, a change
to an insurance premium and/or an insurance coverage is prepared,
at 4704. The change is based on one or more of the driver behavior
data, wherein each item of driver behavior data can have a
different weight assigned. For example, data gathered related to
weather conditions might be given less weight than data gathered
related to user distractions (e.g., passengers, use of a mobile
device while vehicle is in operation, and so forth). In another
example, excessive speed might be assigned a higher weight than
data related to safety performance of the vehicle. As such, data
with a higher weight can be given more consideration than data with
a lower weight (e.g., data assigned a higher weight can have a
greater impact on the cost of insurance). Thus, if the user is
traveling at (or below) the speed limit and speed is assigned a
greater weight, then the safe speed will tend to decrease (or
remain constant) the cost of insurance.
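A minimal sketch of this weighted combination, assuming each item of behavior data has been reduced to a numeric score where positive values tend to raise the premium; the item names, weights, and scaling factor are illustrative.

```python
def premium_change(behavior_scores, weights, scale=1.0):
    """Combine weighted behavior scores into a single premium change; items with
    larger weights (e.g., excessive speed) influence the change more than items
    with smaller weights (e.g., weather conditions)."""
    return scale * sum(weights.get(item, 0.0) * score
                       for item, score in behavior_scores.items())

# Example: speeding is weighted more heavily than weather.
weights = {"excessive_speed": 3.0, "mobile_device_use": 2.0, "bad_weather": 0.5}
scores = {"excessive_speed": 0.0,    # traveling at or below the limit
          "mobile_device_use": 1.0,  # phone used while the vehicle is in operation
          "bad_weather": 1.0}
change = premium_change(scores, weights)   # positive values tend to raise the premium
```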
[0273] In an embodiment, the driver is notified of the change, at
4706. The notification can be in any perceivable format. In an
example, the notification is provided as a dashboard-mounted
display. In another example, presenting the change can include
displaying the modified cost of the insurance policy in a
dashboard-mounted display and/or a heads-up display. In an
embodiment, a service provider is notified of the change, at 4708.
At substantially the same time as notifying the service provider
(or trusted third party) of the change, parameters taken into
consideration (and associated weight) can also be provided. In such
a manner, the service provider (or third party) can selectively
further modify the cost of insurance, which can be communicated to the user through the vehicle display or through other means.
[0274] The service provider (or third party) might be provided the
change information less often than the insurance cost change
information is provided to the user. For example, the user can be
provided the insurance cost change information dynamically and
almost instantaneously with detection of one or more parameters
that can influence the insurance cost. However, the insurance
provider (or third party) might only be notified of the change
after a specified interval (or based on other intervals). For
example, insurance cost changes might be accumulated over a period
of time (e.g., two weeks) and an average of the insurance cost
changes might be supplied to insurance provider. In such a manner,
the user has time to adjust parameters that tend to increase (or
decrease) the cost of insurance, which allows the user to have more
control over the cost of insurance.
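For purposes of illustration only, the following Python sketch shows one way the immediate driver notification and the delayed, averaged provider notification described above might be buffered; the two-week window mirrors the example above, while the class and method names are assumptions and not part of the present disclosure.

# Non-limiting sketch: the driver sees each change immediately, while the
# provider only receives a periodic average of the accumulated changes.
from datetime import datetime, timedelta

class PremiumChangeBuffer:
    def __init__(self, reporting_interval=timedelta(weeks=2)):
        self.reporting_interval = reporting_interval
        self.window_start = datetime.now()
        self.changes = []

    def record(self, change, notify_driver):
        notify_driver(change)              # driver is notified dynamically
        self.changes.append(change)

    def maybe_report(self, notify_provider):
        if datetime.now() - self.window_start >= self.reporting_interval and self.changes:
            notify_provider(sum(self.changes) / len(self.changes))  # average only
            self.changes.clear()
            self.window_start = datetime.now()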
[0275] In an embodiment, Vertical market specialization for
insurance is provided where markets are defined based on granular
aspects of coverage and presented to one or more insurance
subsystems to obtain quotes for a coverage premium. Such
specialization allows insurance companies to compete in more
specific areas of insurance coverage, which allows for more
accurate premium rates focused on the specific areas or one or more
related scenarios. In addition, the granular aspects of coverage
can be provided to one or more advertising systems in exchange for
further lowered rates, if desired.
[0276] According to an example, an insurance market can be defined
based on granular information received regarding an item, a related
person, use of the item, etc. Based on the market, premium quotes
can be obtained from one or more insurance subsystems related to
one or more insurance brokers. In addition, rates can be decreased
where the granular information can be provided to an advertising
system, in one example. In this regard, targeted advertisements can
additionally be presented to a system related to requesting the
insurance coverage. Policies can be automatically selected based on
preferences, manually selected using an interface, and/or the
like.
[0277] FIG. 18 is a diagram 4800 illustrating, generally, an
exemplary system that customizes insurance rates to correspond to
driver behavior, according to embodiments as disclosed herein. In
an embodiment, the server 4106 can be configured to maintain a
database component 4802 including data related to different driver
behaviors. Such leveraging from data banks enables insurance
providers to bid in real time, and hence an owner and/or user of a
vehicle can benefit from competition among various insurance
providers, to obtain optimum rates. The server includes a rate
adjustment component 4804 that in real time can determine the
various rates from a plurality of insurance providers 4110 (1 to N,
where N is an integer). In one particular aspect, a retrieval agent
(not shown) associated with the rate adjustment component 4804 can
pull insurance data from the insurance providers based on the
contextual data supplied thereto. For example, such contextual data
can be data records related to driver behavior, the vehicle 4102
(such as auto shop service records, current service status for the
car, and the like), data related to the individual driver (such as
health records, criminal records, shopping habits, and the like),
data related to the environment (road condition, humidity,
temperature, and the like) and data related to real time driving
(frequency of braking, accelerating, intensity of such actions, and
the like).
[0278] The retrieval agent (not shown) can pull data from the
insurance providers 4110 and further publish such data to enable a
rich interaction between the users on a display or within a
written communication environment. The retrieval agent can further
generate an instance for a connection with the insurance providers.
Accordingly, a connection instance can be employed by the rate
adjustment component 4804 to store connection information such as
the state of data conveyance, the data being conveyed, connection
ID and the like. Such information can additionally be employed to
monitor progress of data transfer to the written communication
environment or display, for example.
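By way of a non-limiting illustration, the following Python sketch shows a retrieval agent that pulls quotes from several providers while a connection instance records the state of data conveyance; the provider objects and their quote() method are hypothetical and introduced only for explanation.

import uuid

class ConnectionInstance:
    def __init__(self):
        self.connection_id = str(uuid.uuid4())
        self.state = "idle"        # state of data conveyance
        self.payload = None        # the data being conveyed

class RetrievalAgent:
    def pull_quotes(self, providers, contextual_data):
        """Pull a premium quote from each provider for the supplied context."""
        results = {}
        for provider in providers:
            conn = ConnectionInstance()
            conn.state = "in_progress"
            try:
                conn.payload = provider.quote(contextual_data)
                conn.state = "succeeded"
                results[provider.name] = conn.payload
            except Exception:
                conn.state = "failed"   # the instance reflects the failed retrieval
        return results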
[0279] Accordingly, drivers/owners of motor vehicles can pull or
receive data from the insurance providers 4110, wherein received
data can be posted (e.g., displayed on a monitor) and the
connection instance can be concurrently updated to reflect any
successful and/or failed data retrievals. Thus, at any given moment
the connection instance can include the most up-to-date version of
data transferred between the motor vehicle and the insurance
providers. In an embodiment, a switching component 4806 can be
configured to automatically switch the user/driver to an insurance
provider/company that bids the best rate. Such switching component
4806 can employ interrupts both in hardware and/or software to
conclude the switching from one insurance provider to another
insurance provider. For example, the interrupt can convey receipt
of a more optimal insurance rate or completion of a pull request to
the insurance providers 4110 or that a configuration has changed.
In one particular aspect, once an interrupt occurs, an operating
system analyzes the state of the system and performs an action in
accordance with the interrupt, such as a change of insurance
provider, for example.
[0280] Such interrupts can be in the form of asynchronous external
events to the processor that can alter normal program flow.
Moreover, the interrupts can usually require immediate attention
from a processor(s) associated with the system. In one aspect, when
an interrupt is detected, the system often interrupts all
processing to attend to the interrupt, wherein the system can
further save state of the processor and instruction pointers on
related stacks.
[0281] According to a further aspect, the switching component 4806
can employ an interrupt dispatch table in memory, which can be
accessed by the processor to identify a function that is to be
called in response to a particular interrupt. For example, a
function can accept a policy from an insurance provider, cancel an
existing policy, and/or clear the interrupt for a variety of other
reasons. The function can execute processes such as clearing the
state of the interrupt, calling a driver function to check the
state of an insurance policy and clearing, setting a bit, and the
like.
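For purposes of illustration only, the following Python sketch shows an interrupt dispatch table in the sense described above, mapping interrupt types to handler functions; the interrupt names and handlers are assumptions introduced for explanation.

def accept_policy(event):
    print("accepting policy from", event["provider"])

def cancel_existing_policy(event):
    print("cancelling policy", event["policy_id"])

def clear_interrupt(event):
    pass  # e.g., clear the interrupt state or set/clear a status bit

INTERRUPT_DISPATCH_TABLE = {
    "BETTER_RATE_RECEIVED": accept_policy,
    "POLICY_SUPERSEDED": cancel_existing_policy,
    "CONFIGURATION_CHANGED": clear_interrupt,
}

def on_interrupt(kind, event):
    # The processor looks up the function to call in response to this interrupt.
    handler = INTERRUPT_DISPATCH_TABLE.get(kind, clear_interrupt)
    handler(event)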
[0282] FIG. 19 is a diagram 4900 illustrating, generally, the
switching component 4806, which further includes an analyzer
component 4902 that employs threshold ranges and/or value(s) (e.g.,
pricing ranges for insurance policies, terms of the insurance
policy, and the like) according to a further aspect of the present
invention. The analyzer component 4902 can be configured to compare
a received value for insurance coverage to the predetermined
thresholds, which can be designated by an owner/driver.
Accordingly, the analyzer component 4902 can determine whether the
received insurance coverage policies are within the desired range
specified by a user (an "accept" or a "reject"), and/or further
create a hierarchy from "low" to "high" based on criteria
designated by the user (e.g., price of the insurance policy, terms
of the insurance policy, and the like).
[0283] According to a further aspect, the analyzer component 4902
can further interact with a rule engine component 4904. For
example, a rule can be applied to define and/or implement a desired
evaluation method for an insurance policy. It is to be appreciated
that the rule-based implementation can automatically and/or
dynamically define and implement an evaluation scheme of the
insurance policies provided. Accordingly, the rule-based
implementation can evaluate an insurance policy by employing a
predefined and/or programmed rule(s) based upon any desired
criteria (e.g., criteria affecting an insurance policy such as
duration of the policy, number of drivers covered, type of risks
covered, and the like.).
[0284] In a related example, a user can establish a rule that can
implement an evaluation based upon a preferred hierarchy (e.g.,
weight) of criteria that affects the insurance policy. For example,
the rule can be constructed to evaluate the criteria based upon
predetermined thresholds, wherein if such criteria do not comply
with the set thresholds, the system can further evaluate another
criterion or attribute(s) to validate the status (e.g., "accept" or
"reject" the insurance bid and operate the switching component
based thereon). It is to be appreciated that any of the attributes
utilized in accordance with the subject invention can be programmed
into a rule-based implementation scheme.
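By way of a non-limiting illustration, the following Python sketch shows threshold-based acceptance followed by a simple rule-based "low" to "high" ranking; the threshold values and policy fields are assumptions for explanation only.

THRESHOLDS = {"max_price": 120.0, "min_term_months": 6}

def analyze(policy):
    """Return 'accept' or 'reject' based on owner/driver-designated thresholds."""
    if policy["price"] <= THRESHOLDS["max_price"] and \
       policy["term_months"] >= THRESHOLDS["min_term_months"]:
        return "accept"
    return "reject"

def rank(policies):
    """Rule engine: order the accepted policies from 'low' to 'high' by price."""
    accepted = [p for p in policies if analyze(p) == "accept"]
    return sorted(accepted, key=lambda p: p["price"])

bids = [{"price": 99.0, "term_months": 12}, {"price": 150.0, "term_months": 12}]
print(rank(bids))   # only the first bid survives the thresholds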
[0285] FIG. 20 illustrates, generally, a method 5000 for customizing
insurance rates of a driver, according to embodiments as described
herein and in accordance with a further aspect of the subject
innovation. While the
exemplary method is illustrated and described herein as a series of
blocks representative of various events and/or acts, the subject
innovation is not limited by the illustrated ordering of such
blocks. For instance, some acts or events may occur in different
orders and/or concurrently with other acts or events, apart from
the ordering illustrated herein, in accordance with the innovation.
In addition, not all illustrated blocks, events or acts, may be
required to implement a methodology in accordance with the subject
innovation. Moreover, it will be appreciated that the exemplary
method and other methods according to the innovation may be
implemented in association with the method illustrated and
described herein, as well as in association with other systems and
apparatus not illustrated or described. Initially and at 5002
contextual data from various data banks can be accessed by the
insurance providers or supplied thereto. As explained earlier, the
data banks can include data pertaining to the motor vehicle (e.g.,
maintenance history, current vehicle conditions, and the like),
data related to the driver (e.g., via health insurance records,
police records, internet records, and the like), and data related
to the operating environment (e.g., weather, geographical location,
and the like). Moreover, the real-time contextual driving data can
include both an intensity portion and a frequency portion, which
represent severity and regularity of driving episodes (e.g.,
slamming the brakes, gradual/sudden deceleration, velocity
variances, and the like). Subsequently and at 5004, such data can
be analyzed by the insurance providers as to customize an insurance
rate based thereon at 5006. In an embodiment, insurance rate can be
calculated in real-time and as such can more accurately reflect
appropriate coverage for a situation of a driver. A plurality of
different factors can influence a likelihood of the driver being
involved in an accident, having a vehicle stolen, and the like. For
example, if the driver is travelling through bad weather, then risk
can be higher and a rate can be increased in real-time as weather
conditions change; conversely, if there is relatively little traffic
surrounding the driver's vehicle, then the rate can be lowered. An
algorithm or complex model can be used to calculate the insurance
rates and can be disclosed to the driver through the display. In an
embodiment, the rate adjustment component 4804 can be configured to
evaluate the insurance rate information against current vehicle
operation by the driver. Specifically, the evaluation can compare
the current operation against insurance rate information to
determine if an appropriate rate is being used, if the rate should
be changed, what the change should be, etc. For instance, different
aspects of vehicle operation can be taken into account such as for
example, but not limited to, weather and how a driver reacts, speed
(of a vehicle), traffic and how the driver reacts, and noise (e.g.,
radio level), and the like.
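For purposes of illustration only, the following Python sketch shows a real-time rate calculation driven by episode intensity and frequency together with weather and traffic context; the factor names and multipliers are assumptions and not values from the present disclosure.

def real_time_rate(base_rate, context):
    """Adjust the rate from contextual driving data.

    context carries a severity (intensity) and regularity (frequency) score
    for driving episodes such as hard braking, plus weather and traffic flags.
    """
    rate = base_rate
    rate *= 1.0 + 0.5 * context["episode_intensity"] * context["episode_frequency"]
    if context["bad_weather"]:
        rate *= 1.15          # risk (and rate) rises as weather worsens
    if context["light_traffic"]:
        rate *= 0.95          # little surrounding traffic lowers the rate
    return rate

print(real_time_rate(80.0, {"episode_intensity": 0.4, "episode_frequency": 0.2,
                            "bad_weather": True, "light_traffic": False}))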
[0286] Subsequently, the customized insurance rate can then be sent
from an insurance provider to an owner/driver of the vehicle (e.g.,
in form of an insurance bid) at 5008. For example, the insurance
rate can be determined and represented upon the driver via the
display or controller in the vehicle. A processor that executes the
computer executable components stored on a storage medium can be
employed. In an embodiment, the monitoring unit can communicate
with an insurance company (e.g., continuous communication) and
obtain an insurance rate directly. The system can be configured to
customize the insurance based on the obtained insurance rates,
present it to the driver, and make appropriate modifications to the
display automatically.
[0287] FIG. 21 illustrates, generally, a method 5100 for presenting
information related to a real-time insurance rate, according to
embodiments as described herein. In an embodiment, at 5102,
metadata can be collected pertaining to real-time operation of a
vehicle and at least a portion of the metadata can be evaluated, as
shown at 5104. The metadata described herein can include driver
behavior data, contextual information, driver history, and
real-time driving information that relates to operation of a driver
and vehicle, and the like. Based upon a result of the evaluation,
there can be calculation a real-time insurance rate, such as shown
at 5106. In an embodiment, at 5108, determination can be made on
how to present the calculated rate. For example, the determination
can be if the rate should be shown on a center console or a
heads-up display. A determination can also be made on how to
display data (e.g., if a numerical rate should be disclosed or a
color element should be lit). Additionally, a determination can be
made on other data to disclose, such as safety, environment impact,
cost of operating vehicle, a target speed, group rank, and the
like. The determined rate and other determined data can be
presented through a display, such as shown at 5110. Thus, the
determined rate is presented upon a display viewable to the driver
of the vehicle.
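By way of a non-limiting illustration, the following Python sketch shows one way the presentation decisions at 5108 and 5110 could be made; the display names and the color rule are assumptions introduced for explanation.

def choose_presentation(rate, previous_rate, driver_prefers_hud):
    """Decide where and how to present the calculated rate."""
    target = "heads-up display" if driver_prefers_hud else "center console"
    # Either disclose the numeric rate or simply light a color element.
    if abs(rate - previous_rate) < 1.0:
        return target, "green indicator"        # negligible change
    return target, f"numeric rate {rate:.2f}"

print(choose_presentation(82.5, 80.0, driver_prefers_hud=True))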
[0288] In an embodiment, at 5112, the method 5100 includes
determining if feedback should be presented to the user. The
feedback can be supplied in real-time as well as be a collective
summary presented after a driving session is complete. If no
feedback should be presented, then the method 5100 can end at 5114.
In one instance, if there is a new driver attempting to obtain a
full driver's license (e.g., a teenage driver) or a newer driver,
then the check 5112 can determine that feedback should be
automatically provided. In another embodiment, an operator can be
asked whether feedback should be presented; depending on the
response, the method 5100 can end or continue.
[0289] Operation of the vehicle and driver can be evaluated at
5116, which can occur through different embodiments. As a user
operates a vehicle, metadata can be collected and evaluated in
real-time. In an alternative embodiment, data can be collected, but
evaluation does not occur until the check 5112 determines feedback
should be presented. At 5118, feedback can be determined that
suggests driving actions for the operator to perform in future
driving to lower the insurance rate. The method 5100 can
include presenting the feedback (e.g., through the display, through
a printout, transferring feedback as part of e-mail or a text
message, etc.) at 5120. The feedback can be directly related to a
driving session as well as is an aggregate analysis of overall
driving performance (e.g., over multiple driving sessions).
[0290] FIG. 22 is a diagram illustrating, generally, a method 5200
for installation of a real-time insurance system, according to
embodiments disclosed herein. In an embodiment, at 5202, an
on-board monitoring system (such as driver monitoring unit) 4102 is
installed in a vehicle to facilitate the collection of real-time
data from the vehicle and forwarding of the real-time data to an
insurance provider. At 5204, the on-board monitoring system can be
associated with the on-board data/diagnostic control units and
system(s) incorporated into the vehicle. The on-board
data/diagnostic control units and system(s) can include the
vehicle's engine control unit/module (ECU/ECM), transmission control
unit (TCU), power train control unit (PCU), on-board diagnostics
(OBD), sensors and processors associated with the transmission
system, and other aspects of the vehicle allowing the on-board
monitoring system to gather sufficient data from the vehicle for a
determination of how the vehicle is being driven to be made. The
on-board monitoring system can be communicatively coupled by hard
wiring to the on-board diagnostic system(s) or the systems can be
communicatively associated using wireless technologies.
[0291] In an embodiment, at 5206, a mobile device (e.g., a cell
phone) can be associated with the onboard monitoring system where
the mobile device can facilitate communication between the on-board
monitoring systems with a remote insurance provider system. The
mobile device provides identification information to the on-board
monitoring system to be processed by the on-board monitoring system
or forwarded to an insurance provider system to enable identification
of the driver.
[0292] In an embodiment, at 5208, communications are established
between the on-board monitoring system and the mobile device with
the remote insurance provider system. In one embodiment it is
envisaged that the on-board monitoring system and the insurance
provider system are owned and operated by the same insurance
company. However, the system could be less restricted whereby the
insurance provider system is accessible by a plurality of insurance
companies with the operator of the on-board monitoring system,
e.g., the driver of the vehicle to which the on-board monitoring
system is attached, choosing from the plurality of insurance
providers available for their particular base coverage. In such an
embodiment, upon startup of the system the insurance provider
system can default to the insurance company providing the base
coverage and the operator can select from other insurance companies
as they require. Over time, as usage of the on-board monitoring
system continues, at 5210, there is a likelihood that various
aspects of the system might need to be updated or replaced, e.g.,
software update, hardware updates, etc., where the updates might be
required for an individual insurance company system or to allow the
on-board monitoring system to function with one or more other
insurance company systems. Hardware updates may involve replacement
of a piece of hardware with another, while software updates can be
conducted by connecting the mobile device and/or the on-board
monitoring system to the internet and downloading the software from
a company website hosted thereon. Alternatively, the software
upgrade can be transmitted to the mobile device or the on-board
monitoring system by wireless means. As a further alternative the
updates can be conferred to the mobile device or the on-board
monitoring system by means of a plug-in module or the like, which
can be left attached to the respective device or the software can
be downloaded therefrom.
[0293] FIG. 23 is a diagram illustrating, generally, a method for
gathering information from an on-board monitoring system employed
in a real-time insurance system, according to embodiments as
disclosed herein. In an embodiment, at 5302, monitoring of the
driver and the vehicle they are operating is commenced. Monitoring
can employ components of an on-board monitoring system, mobile
device components, e.g., cell phone system, or any other system
components associated with monitoring the vehicle as it is being
driven. Such components can include a global positioning system
(GPS) to determine the location of the vehicle at any given time,
such a GPS can be located in a cell phone, as part of the on-board
monitoring system, or an external system coupled to the monitoring
system/cell phone--such an external system being an OEM or after
sales GPS associated with the vehicle to be/being driven. A video
data stream can be gathered from a video camera coupled to the
on-board monitoring system recording the road conditions, etc.
throughout the journey. Information can also be gathered from
monitoring/control system(s) that are integral to the vehicle,
e.g., the vehicle's engine control unit/module (ECU/ECM) that
monitors various sensors located throughout the engine, fuel and
exhaust systems, etc.
[0294] In an embodiment, at 5304, the dynamically gathered data (or
driver behavior data) is transmitted to an insurance evaluation
system. In an embodiment, at 5306, the gathered data is analyzed.
Such analysis can involve identifying the route taken by the
driver, the speed driven, time of day the journey was undertaken,
weather conditions during the journey, other road traffic, whether
the user used their cell phone during the journey, and the like. In an
embodiment, at 5308, the gathered data is assessed from which an
insurance rate(s) can be determined. For example, if the driver
drove above the speed limit then an appropriate determination could
be to increase the insurance premium. In an embodiment, at 5310,
the driver can be informed of the newly determined insurance rate.
Any suitable device can be employed such as informing the user by
cell phone, a display device associated with the on-board
monitoring system, or another device associated with the vehicle.
The information can be conveyed in a variety of ways, including a
text message, a verbal message, graphical presentation, change of
light emitting diodes (LEDs) on a display unit, a HUD, etc. At
5312, the driver can continue to drive the vehicle whereby the
method can return to 5302 where the data gathering is commenced
once more.
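For purposes of illustration only, the following Python sketch shows how gathered speed and position samples might be assessed to determine a new rate, as in the speeding example above; the record fields and the surcharge value are assumptions and not part of the present disclosure.

def assess_trip(samples, speed_limit_lookup):
    """samples: list of (gps_position, speed_mph) gathered during the journey."""
    over_limit = sum(1 for pos, speed in samples
                     if speed > speed_limit_lookup(pos))
    return over_limit / max(len(samples), 1)

def new_rate(current_rate, over_limit_fraction):
    # Driving above the speed limit leads to an increased premium.
    return current_rate * (1.0 + 0.10 * over_limit_fraction)

samples = [((37.5, -122.0), 62), ((37.6, -122.1), 71)]
fraction = assess_trip(samples, lambda pos: 65)
print(new_rate(100.0, fraction))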
[0295] Alternatively, in an embodiment, at 5312, the driver may
complete their journey and data gathering and analysis is
completed. In an embodiment, at 5314 the driver can be presented
with new insurance rates based upon the data gathered while they
were driving the vehicle. The new insurance rates can be delivered
and presented to the driver by any suitable means, for example the
new insurance rates and any pertinent information can be forwarded
and presented to the driver via a HUD employed as part of the real
time data gathering system. By employing a HUD, instantaneous
notifications regarding a change in the driver's insurance policy
can be presented while mitigating driver distractions (e.g., line
of sight remains substantially unchanged). Alternatively, the
on-board monitoring system can be used, or a remote
computer/presentation device coupled to the real time data
gathering system where the information is forwarded to the driver
via, e.g., email. In another embodiment, the driver can access a
website, hosted by a respective insurance company, where the driver
can view their respective rates/gathered information/analysis
system, etc. Further, traditional means of communication such as a
letter can be used to forward the insurance information to the
driver.
[0296] FIG. 24 is a diagram illustrating, generally, a method 5400
for mounting cameras to capture traffic information, according to
embodiments as disclosed herein. In an embodiment, at 5402, the
method 5400 includes mounting cameras on the car to monitor the
traffic information. For example, the car may include cameras
mounted to capture views in the rearward, downward, and the like
directions, on the upper surface at the leading end of the front
portion thereof. The position for mounting the cameras is not
limited to the left side, right side, upper surface, front side,
back side, and the like. For example, if the car has a left side
steering wheel, the camera may be mounted on a right upper surface
at a leading end of the front portion of the car. The cameras may
have an angle of view of about 60, 90, 180, or 360 degrees. With
this construction, since the camera is mounted for a view in the
rearward and downward directions on the front portion of the car,
it can capture a wide area of the surface of the road in the
vicinity of the driver's car, and an area in the vicinity of the
left front wheel. Furthermore, the camera can also capture a part
of the body of the car in the vicinity of the front wheel. Thereby,
the relation between the car and the surface of the road can be
recorded. In an example, the cameras can be configured to capture
images of the road views, including potential collision events such
as how closely the car follows the car in front, how often the
brakes are used within a period of time (hard braking counting more
heavily toward reducing the driver rating), and how frequently the
car comes close to objects and obstructions (such as trees,
oncoming cars, and cars traveling in the same direction) while
moving.
[0297] In an embodiment, at 5404, the method 5400 includes
receiving the recorded information from the camera and using image
processing techniques to process the information. For example, the
system uses image processing techniques to determine potential
collision events such as how closely the car follows the car in
front, how often the brakes are used within a period of time (hard
braking counting more heavily toward reducing the driver rating),
and how frequently the car comes close to objects and obstructions
(such as trees, oncoming cars, and cars traveling in the same
direction) while moving.
[0298] FIG. 25 is a diagram illustrating, generally, a method 5500
for mounting cameras to capture driver behavior, according to
embodiments as disclosed herein. In an embodiment, at 5502, the
method 5500 includes mounting cameras on the car to monitor the
driver behavior. The position for mounting the cameras is not
limited to the left side, right side, upper surface, front side,
back side, and the like. The cameras may have an angle of view of
about 60, 90, 180, or 360 degrees. For example, the camera can
capture driver behavior such as for example, but not limited to,
images of texting and use of phone while driving, speech of driver
shouting or cursing at other drivers or other occupants,
indications of intoxication, sleepiness, alcohol level, mood,
aggressiveness, and the like. In an embodiment, at 5504, the method
5500 includes receiving the recorded information from the camera
and using image processing techniques and voice recognition
techniques to process the information. For example, the system uses
image processing techniques to determine the driver activity, such
as whether the driver is using a mobile phone while driving. In
another example, the system uses voice recognition techniques to
determine the use of voice, texting, aggressiveness, and the like.
[0299] In an embodiment, the item-centric approach determines that
many drivers have similar behavior and that a driver who performs
activity-A will also perform activity-B. This has proven to be
fairly effective. On the other hand, many insurance providers
interact with drivers online/offline. Such interaction can produce
a stream of contextual information that recommendation engines can
use. Early systems were batch oriented and computed recommendations
in advance for each driver. Thus, they could not always react to a
driver's most recent behavior. Recommendation engines work by
trying to establish a statistical relationship between drivers and
activities associated with there behavior. The system establishes
these relationships via information about driver's behavior from
vehicle owner, monitoring devices, sensors, and the like.
[0300] In an embodiment, the recommender systems collect data via
APIs, insurance application, insurance databases, and the like
sources. The insurance sources can be available through social
networks, ad hoc and marketing networks, and other external
sources. For example, data can be obtained from insurance sites,
insurance providers, driver insurance history, and search engines.
All this enables recommendation engines to take a more holistic
view of the driver. The recommendation engine can recommend
different insurance products that save money for the driver, or
alternatively can even recommend different insurance companies to
save money. Using greater amounts of data lets the engines find
connections that might otherwise go unnoticed, which yields better
suggestions. This also sometimes requires recommendation systems to
use complex big-data analysis techniques. Online public profiles
and preference listings on social networking sites such as Facebook
add useful data.
[0301] Most recommendation engines use complex algorithms to
analyze driver behavior and suggest recommended activities that
employ personalized collaborative filtering, which use multiple
agents or data sources to identify behavior patterns and draw
conclusions. This approach helps determine that numerous drivers
who had the same or a similar type of behavior in the past may
perform one or more similar activities in the future. Many systems
use expert adaptive approaches. These techniques create new sets of
suggestions, analyze their performance, and adjust the
recommendation pattern for similar behavior of drivers. This lets
systems adapt quickly to new trends and behaviors. Rules-based
systems enable businesses to establish rules that optimize
recommendation performance.
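For purposes of illustration only, the following Python sketch shows a small item-based collaborative filtering step over a driver-by-activity matrix ("drivers who perform activity-A also tend to perform activity-B"); the example data and the use of cosine similarity are assumptions and not part of the present disclosure.

import math

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0

# Rows: drivers; columns: activities (activity-A, activity-B, activity-C).
matrix = {
    "driver1": [1, 1, 0],
    "driver2": [1, 1, 1],
    "driver3": [0, 1, 1],
}

def recommend(driver, activity_names):
    """Score unseen activities by similarity of their columns to the activities
    the driver already performs."""
    cols = list(zip(*matrix.values()))   # one column per activity
    mine = matrix[driver]
    scores = {}
    for j, name in enumerate(activity_names):
        if mine[j]:
            continue  # already performed
        scores[name] = sum(cosine(cols[j], cols[i]) for i, v in enumerate(mine) if v)
    return sorted(scores, key=scores.get, reverse=True)

print(recommend("driver1", ["activity-A", "activity-B", "activity-C"]))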
[0302] FIG. 26 is a diagram 5600 illustrating, generally, a first
vehicle program communicating with a second vehicle program through
inter-vehicle networking, according to embodiments as disclosed
herein. In an embodiment, the system develops inter-vehicular
networking, computing, transceivers, and sensing technologies in
the vehicles. Such vehicles have embedded computers, GPS receivers,
short-range wireless network interfaces, and potentially access to
in-car sensors and the Internet. Furthermore, they can interact
with road-side wireless sensor networks and sensors embedded in
other vehicles. These capabilities can be leveraged into
distributed computing and sensing applications over vehicular
networks for safer driving, dynamic route planning, mobile sensing,
or in-vehicle entertainment. The system can include
vehicular-specific network protocols, middleware platforms, and
security mechanisms to process the data. As shown in FIG. 26, a
first driver operating a vehicle observes a second driver operating
a vehicle within his visual range and wants to send a message to
the second driver. The vehicle can include identifying information
that is visually ascertainable such as the model, vehicle color,
number of doors, license plate number and state. The vehicle may
include additional information that is only ascertainable from up
close or at certain angles, or via certain technologies, such as a
roof top identification number, vehicle identification number, taxi
badge number, Bluetooth, or RFID code, and the like. In an
embodiment, a sender having access to the vehicle monitoring device
and viewing a second vehicle desires to contact the driver of the
second vehicle. In one embodiment, in case of an accident as
detected by an accelerometer or airbag deployment, both vehicles
automatically exchange insurance information and the drivers simply
confirm and sign to accept. In another embodiment, in case of a
hit-and-run, the vehicle computer would automatically capture
insurance information from the other vehicle and store all
parameters arising from the accident for accident investigator's
review. In another embodiment, if one vehicle detects that the
other vehicle has a low insurance rating, the vehicle automatically
enters a defensive driving mode around that vehicle. As best shown
in FIG. 16, the sender initiates communication via a telephone or
handheld computer or vehicle monitoring device and accesses the
interface to the inter-vehicle networking service and database. The
sender can select "send message" from the graphical or audio menu
to send message or directly communicate with the driver of the
second vehicle.
[0303] For example, the sender can directly communicate with the
driver using the inter-vehicle networking or the sender can choose
from a table of messages that can be sent to the driver using the
inter-vehicle networking. For example, the message can take the
form of voice, audio, video, or other data which can be converted
to a digital signal and sent to any communications terminal. The
inter-vehicle networking database receives the message or encrypted
message and reconstructs the message, including the address
information. The inter-vehicle networking then separates out the
address information including such as for example, but not limited
to, license plate number, vehicle identification number, and the
like.
[0304] In an embodiment, the message may include a return address
for the sender, so that a reply can be returned merely by hitting
the "reply to" or "call back" button on the message. One skilled in
the art would also recognize that the message could be sent
anonymously or by a non-returnable address. Alternatively, the
message could be a general broadcast sent by a police officer or
other official sending a warning message to speeders or an
informational message such as "road closed ahead" or other
message.
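By way of a non-limiting illustration, the following Python sketch shows an inter-vehicle message addressed by visually ascertainable identifiers, with an optional return address and a broadcast mode; the field names and the directory structure are assumptions introduced for explanation.

from dataclasses import dataclass
from typing import Optional

@dataclass
class VehicleMessage:
    to_license_plate: str
    to_state: str
    body: str                       # voice/audio/video/text payload reference
    reply_to: Optional[str] = None  # omit for an anonymous/non-returnable message
    broadcast: bool = False         # e.g., "road closed ahead" from an official

def route(message, directory):
    """Resolve the plate/state address to a communications terminal."""
    if message.broadcast:
        return list(directory.values())          # all terminals in range
    key = (message.to_state, message.to_license_plate)
    return [directory[key]] if key in directory else []

directory = {("CA", "7ABC123"): "terminal-42"}
msg = VehicleMessage("7ABC123", "CA", "Your trunk is open", reply_to="terminal-7")
print(route(msg, directory))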
[0305] In an embodiment, the transceiver can be a WiMAX system. In
another embodiment, the transceiver can be a meshed 802 protocol
network configuration with a constantly morphing mobile mesh
network that helps drivers avoid accidents, identify traffic jams
miles before they encounter them, and act as a relay point for
Internet access. In one embodiment, the mesh network can be the
ZigBee mesh network. In another embodiment, the mesh network can be
a modified Wi-Fi protocol called 802.11p standard for allowing data
exchange between moving vehicles in the 5.9 GHz band. 802.11p
operates in the 5.850-5.925 GHz range, divided into 7 channels of
10 MHz each. The standard defines mechanisms that allow IEEE
802.11.TM. technology to be used in high speed radio environments
typical of cars and trucks. In these environments, the 802.11p
enhancements to the previous standards enable robust and reliable
car-to-car and car-to-curb communications by addressing challenges
such as extreme Doppler shifts, rapidly changing multipath
conditions, and the need to quickly establish a link and exchange
data in very short times (less than 100 ms). Further enhancements
are defined to support other higher layer protocols that are
designed for the vehicular environment, such as the set of IEEE
1609.TM. standards for Wireless Access in Vehicular Environments
(WAVE). 802.11p supports Intelligent Transportation Systems (ITS)
applications such as cooperative safety, traffic and accident
control, intersection collision avoidance, and emergency
warning.
[0306] One variation of 802.11p is called the Dedicated Short Range
Communications (DSRC), a U.S. Department of Transportation project
as well as the name of the 5.9 GHz frequency band allocated for the
ITS communications. More information on the 802.11p standard can be
obtained from the IEEE. DSRC itself is not a mesh; it is a
broadcast, so it only reaches vehicles within range. Meshing
requires more sophistication, including a routing aspect for
relaying messages to other nodes, whereas DSRC is much simpler.
[0307] One embodiment uses high-powered, heavily encrypted Wi-Fi
that establishes point-to-point connections between cars within a
half-mile radius. Those connections are used to communicate vital
information between vehicles, either triggering alerts to the
driver or interpreted by the vehicle's computer. An intelligent car
slamming on its brakes could communicate to all of the vehicles
behind it that it is coming to a rapid halt, giving the driver that
much more warning that he too needs to hit the brakes.
[0308] But because these cars are networked--the car in front of
one vehicle is connected to the car in front of it and so forth--in a
distributed mesh, an intelligent vehicle can know if cars miles
down the road are slamming on their brakes, alerting the driver to
potential traffic jams. Given enough vehicles with the technology,
individual cars become nodes in a constantly changing, self-aware
network that can not only monitor what's going on in the immediate
vicinity, but across a citywide traffic grid.
[0309] In one embodiment, the processor receives travel routes and
sensor data from adjacent vehicles; such information is then used
to prepare the vehicular brakes for a detected turn or an
anticipated turn by adjacent vehicles. The travel routes can be
transmitted over a vehicular Wi-Fi system that sends protected
information to nearby vehicles equipped with Wi-Fi or Bluetooth or
ZigBee nodes. In one embodiment, a mesh-network is formed with
Wi-Fi transceivers, wherein each vehicle is given a temporary ID in
each vehicular block, similar to a cellular block where vehicles
can join or leave the vehicular block. Once the vehicle joins a
group, travel routes and sensor data are transferred among vehicles
in a group. Once travel routes are shared, the processor can
determine potential or desired actions from the adjacent vehicles
and adjust appropriately. For example, if the car in front of the
vehicle is about to make a turn, the system prepares the brakes and
gently tugs the driver's seat belt to give the driver notice that
the car in front is about to slow down. In another example, if the
processor detects that the driver is about to make a lane change to
the left based on sensor data and acceleration pedal actuation, but
if the processor detects that the vehicle behind in the desired
lane is also speeding up, the system can warn the driver and
disengage the lane change to avoid the accident. Thus, the
processor receives travel routes and sensor data from adjacent
vehicles and notifies the driver of a detected turn or an
anticipated turn from adjacent vehicles. The processor receives
travel routes and sensor data from adjacent vehicles and optimizes
group vehicular speed to improve fuel efficiency. The processor
receives travel routes and sensor data from adjacent vehicles and
sequences red light(s) to optimize fuel efficiency. The processor
notifies the driver of driving behaviors from other drivers at a
predetermined location. The processor switches turn signals and
brakes using a predetermined protocol to reduce insurance premium
for the driver. The processor warns the driver to avoid driving in
a predetermined pattern, driving during a predetermined time,
driving in a predetermined area, or parking in a predetermined area
to reduce insurance premium for the driver. The processor sends
driver behavior data to an insurer, including at least one of:
vehicle speed, vehicle accelerations, vehicle location, seatbelt
use, wireless device use, turn signal use, detection of ethanol
vapor, driver seating position, and time.
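For purposes of illustration only, the following Python sketch shows one way shared travel routes and sensor data from adjacent vehicles could be turned into the actions described above (preparing the brakes, tugging the seat belt, or disengaging a lane change); the message fields and actions are assumptions and not part of the present disclosure.

def handle_neighbor_intents(neighbors, own_intent):
    """neighbors: list of dicts like {'position': 'ahead', 'intent': 'turn',
    'speed_delta': +3}. own_intent: e.g. 'lane_change_left' or None."""
    actions = []
    for n in neighbors:
        if n["position"] == "ahead" and n["intent"] in ("turn", "slow"):
            actions.append("prepare_brakes")
            actions.append("tug_seat_belt")          # give the driver notice
        if (own_intent == "lane_change_left"
                and n["position"] == "behind_left" and n["speed_delta"] > 0):
            actions.append("warn_driver")
            actions.append("disengage_lane_change")  # avoid the accident
    return actions

print(handle_neighbor_intents(
    [{"position": "ahead", "intent": "turn", "speed_delta": 0},
     {"position": "behind_left", "intent": "cruise", "speed_delta": 4}],
    own_intent="lane_change_left"))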
[0310] The various systems described above may be used by the
computer to operate the vehicle and maneuver from one location to
another. For example, a user may enter destination information into
the navigation system, either manually or audibly. The vehicle may
determine its location to a few inches based on a combination of
the GPS receiver data, the sensor data, as well as the detailed map
information. In response, the navigation system may generate a
route between the present location of the vehicle and the
destination.
[0311] When the driver is ready to relinquish some level of control
to the autonomous driving computer, the user may activate the
computer. The computer may be activated, for example, by pressing a
button or by manipulating a lever such as gear shifter. Rather than
taking control immediately, the computer may scan the surroundings
and determine whether there are any obstacles or objects in the
immediate vicinity which may prohibit or reduce the ability of the
vehicle to avoid a collision. In this regard, the computer may
require that the driver continue controlling the vehicle manually
or with some level of control (such as the steering or
acceleration) before entering into a fully autonomous mode.
[0312] Once the vehicle is able to maneuver safely without the
assistance of the driver, the vehicle may become fully autonomous
and continue to the destination. The driver may continue to assist
the vehicle by controlling, for example, steering or whether the
vehicle changes lanes, or the driver may take control of the
vehicle immediately in the event of an emergency.
[0313] The vehicle may continuously use the sensor data to identify
objects, such as traffic signals, people, other vehicles, and other
objects, in order to maneuver the vehicle to the destination and
reduce the likelihood of a collision. The vehicle may use the map
data to determine where traffic signals or other objects should
appear and take actions, for example, by signaling turns or
changing lanes. Once the vehicle has arrived at the destination,
the vehicle may provide audible or visual cues to the driver, for
example, by displaying "You have arrived" on one or more of the
electronic displays.
[0314] The vehicle may be only partially autonomous. For example,
the driver may select to control one or more of the following:
steering, acceleration, braking, and emergency braking.
[0315] The vehicle may also have one or more user interfaces that
allow the driver to reflect the driver's driving style. For
example, the vehicle may include a dial which controls the level of
risk or aggressiveness with which a driver would like the computer
to use when controlling the vehicle. For example, a more aggressive
driver may want to change lanes more often to pass cars, drive in
the left lane on a highway, maneuver the vehicle closer to the
surrounding vehicles, and drive faster than less aggressive
drivers. A less aggressive driver may prefer for the vehicle to
take more conservative actions, such as driving somewhat at or below the
speed limit, avoiding congested highways, or avoiding populated
areas in order to increase the level of safety. By manipulating the
dial, the thresholds used by the computer to calculate whether to
pass another car, drive closer to other vehicles, increase speed
and the like may change. In other words, changing the dial may
affect a number of different settings used by the computer during
its decision making processes. A driver may also be permitted, via
the user interface, to change individual settings that relate to
the driver's preferences. In one embodiment, insurance rates for
the driver or vehicle may be based on the style of the driving
selected by the driver.
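By way of a non-limiting illustration, the following Python sketch maps an aggressiveness dial setting to the thresholds the computer might use when deciding whether to pass, how closely to follow, and how fast to drive; the specific numbers are assumptions introduced for explanation only.

def thresholds_for(dial, dangerous_cargo=False):
    """dial: 0.0 (least aggressive) to 1.0 (most aggressive)."""
    if dangerous_cargo:
        dial = min(dial, 0.3)   # cap aggressiveness for hazardous loads
    return {
        "min_following_gap_s": 3.0 - 1.5 * dial,          # follow closer when aggressive
        "overtake_speed_advantage_kmh": 15 - 10 * dial,   # pass other cars more readily
        "speed_limit_margin_kmh": -5 + 10 * dial,         # below the limit when conservative
    }

print(thresholds_for(0.8))
print(thresholds_for(0.8, dangerous_cargo=True))  # a truck with chemicals stays conservative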
[0316] Aggressiveness settings may also be modified to reflect the
type of vehicle and its passengers and cargo. For example, if an
autonomous truck is transporting dangerous cargo (e.g., chemicals
or flammable liquids), its aggressiveness settings may be less
aggressive than a car carrying a single driver--even if the
aggressiveness dials of both such a truck and car are set to "high."
Moreover, trucks or other vehicles traveling long distances over
narrow, unpaved, rugged, or icy terrain may be placed in a more
conservative mode in order to reduce the likelihood of a collision
or other incident.
[0317] In another example, the vehicle may include sport and
non-sport modes which the user may select or deselect in order to
change the aggressiveness of the ride. By way of example, while in
"sport mode", the vehicle may navigate through turns at the maximum
speed that is safe, whereas in "non-sport mode", the vehicle may
navigate through turns at the maximum speed which results in
g-forces that are relatively imperceptible by the passengers in the
car.
[0318] The vehicle's characteristics may also be adjusted based on
whether the driver or the computer is in control of the vehicle.
For example, when a person is driving manually the suspension may
be made fairly stiff so that the person may "feel" the road and
thus drive more responsively or comfortably, while, when the
computer is driving, the suspension may be made much softer so as
to save energy and make for a more comfortable ride for
passengers.
[0319] For purposes of illustration, a number of example
implementations are described. It is to be understood, however,
that the example implementations are illustrative only and are not
meant to be limiting. Other example implementations are possible as
well.
[0320] It should be understood, of course, that the foregoing
relates to exemplary embodiments of the invention and that
modifications may be made without departing from the spirit and
scope of the invention as set forth in the following claims.
* * * * *