U.S. patent application number 14/203344 was filed with the patent office on 2014-03-10 and published on 2014-09-11 for vehicle image and sound data gathering for insurance rating purposes.
This patent application is currently assigned to STATE FARM MUTUAL AUTOMOBILE INSURANCE COMPANY. The applicant listed for this patent is STATE FARM MUTUAL AUTOMOBILE INSURANCE COMPANY. Invention is credited to Todd Binion, Scott T. Christensen, Steven Cielocha, Christopher E. Gay, Gregory Hayward.
Application Number | 20140257872 14/203344 |
Document ID | / |
Family ID | 51229155 |
Publication Date | 2014-09-11 |
United States Patent Application | 20140257872 |
Kind Code | A1 |
Christensen; Scott T.; et al. | September 11, 2014 |
Vehicle Image and Sound Data Gathering for Insurance Rating Purposes
Abstract
A method and computer system provide vehicle insurance
underwriting and ratings to a policy holder. The method and system
receive captured sensor data associated with a vehicle from the
policy holder. For example, the captured sensor data may include
vehicle image and sound data. The method and system may compare the
received data to baseline data to determine an operating condition
of the vehicle. Based at least in part on the determined operating
condition of the vehicle, the method and system may identify a risk
of loss for the vehicle. The method and system may then determine
an insurance premium for the vehicle based at least in part on the
identified risk of loss, and provide the determined insurance
premium to the policy holder. In some embodiments, the sensor data
is image and sound data captured by a user's mobile phone.
Inventors: | Christensen; Scott T.; (Gibson City, IL); Hayward; Gregory; (Bloomington, IL); Gay; Christopher E.; (Dallas, TX); Cielocha; Steven; (Bloomington, IL); Binion; Todd; (Bloomington, IL) |
Applicant: | STATE FARM MUTUAL AUTOMOBILE INSURANCE COMPANY; Bloomington, IL, US |
Assignee: | STATE FARM MUTUAL AUTOMOBILE INSURANCE COMPANY; Bloomington, IL |
Family ID: | 51229155 |
Appl. No.: | 14/203344 |
Filed: | March 10, 2014 |
Related U.S. Patent Documents
Application Number | Filing Date | Patent Number
61775652 | Mar 10, 2013 | |
Current U.S. Class: | 705/4 |
Current CPC Class: | G06Q 30/0207 20130101; G06Q 30/0208 20130101; H04W 4/40 20180201; B60C 1/00 20130101; G07C 5/008 20130101; G06Q 40/00 20130101; G06Q 40/08 20130101; G07C 5/00 20130101; B60Q 1/00 20130101; B60R 25/20 20130101 |
Class at Publication: | 705/4 |
International Class: | G06Q 40/08 20120101 G06Q040/08 |
Claims
1. A computer-implemented method for providing vehicle insurance,
the method comprising: receiving, via one or more processors, data
associated with a vehicle, the data captured by a sensor of a
computing device and including sound data recorded by an audio
sensor of the computing device; comparing, via one or more
processors, the received data to stored baseline vehicle data to
determine an operating condition of the vehicle, the stored
baseline vehicle data including a baseline engine sound model;
identifying, via one or more processors, a risk of loss for the
vehicle based on the determined operating condition of the vehicle,
wherein the risk of loss includes the sound data recorded by the
audio sensor of the computing device deviating from the baseline
engine sound model by a threshold amount; determining, via one or
more processors, an insurance premium for the vehicle based at
least in part on the identified risk of loss; and providing, via
one or more processors, the determined insurance premium to a
user.
2. The computer-implemented method of claim 1, wherein the captured
data includes one or more of sound data or image data, the image
data captured by an imaging sensor of the computing device.
3. The computer-implemented method of claim 2, wherein the imaging
sensor includes one or more of a camera or a video recorder, and
the audio sensor includes a microphone.
4. The computer-implemented method of claim 3, wherein the image
data includes a captured odometer reading indicating the mileage of
the vehicle and the sound data includes a recorded engine sound
while the vehicle is running.
5. The computer-implemented method of claim 4, wherein the image
data further includes a captured image of the vehicle indicating
the appearance of the vehicle such as a vehicle make, a vehicle
model, a vehicle model year, paint conditions of the vehicle, tire
conditions of the vehicle, window conditions of the vehicle,
interior conditions of the vehicle, dents on the vehicle, scratches
on the vehicle, or other vehicle defects.
6. The computer-implemented method of claim 1, further comprising
receiving vehicle owner or operator data associated with an owner
or operators of the vehicle, wherein identifying the risk of loss
for the vehicle is further based on comparing the received vehicle
owner or operator data to stored vehicle owner or operator
statistics.
7. The computer-implemented method of claim 6, wherein the vehicle
owner or operator data includes one or more of an age, gender, or
marital status of the vehicle owner or operator, a home address of
the vehicle owner or operator, how the vehicle is used by the
vehicle owner or operator, where the vehicle is driven to by the
vehicle owner or operator, how many miles driven by the vehicle
owner or operator, how many speeding tickets that the vehicle owner
or operator has, how many auto accidents that the vehicle owner or
operator has been involved in, and how many vehicle insurance
claims that the vehicle owner or operator has previously
filed.
8. The computer-implemented method of claim 2, further comprising:
receiving, via one or more processors, updated image data, sound
data or vehicle owner or operator data associated with the vehicle;
and calculating, via one or more processors, an adjustment for the
determined insurance premium based on the updated image data, sound
data or vehicle owner or operator data.
9. The computer-implemented method of claim 1, further comprising
identifying a beginning or ending of a trip using the vehicle based
at least in part on the sound data recorded by the audio sensor of
the computing device, wherein the beginning of the trip causes
collection of vehicle usage data including one or more of miles
driven, a speed, a time of day, or a geographic location.
10. A non-transitory computer-readable storage medium including
computer-readable instructions to be executed on one or more
processors of a system for providing vehicle insurance, the
instructions when executed causing the one or more processors to:
receive data associated with a vehicle, the data captured by a
sensor of a computing device and including sound data recorded by
an audio sensor of the computing device; compare the received data
to stored baseline vehicle data to determine an operating condition
of the vehicle, the stored baseline vehicle data including a
baseline engine sound model; identify a risk of loss for the
vehicle based on the determined operating condition of the vehicle,
wherein the risk of loss includes the sound data recorded by the
audio sensor of the computing device deviating from the baseline
engine sound model by a threshold amount; determine an insurance
premium for the vehicle based at least in part on the identified
risk of loss; and provide the determined insurance premium to a
user.
11. The non-transitory computer-readable storage medium of claim
10, wherein the received data includes one or more of sound data or
image data, the image data captured by an imaging sensor of the
computing device.
12. The non-transitory computer-readable storage medium of claim
11, wherein the imaging sensor includes one or more of a camera or
a video recorder, and the audio sensor includes a microphone.
13. The non-transitory computer-readable storage medium of claim
12, wherein the image data includes a captured odometer reading
indicating the mileage of the vehicle and the sound data includes a
recorded engine sound while the vehicle is running.
14. The non-transitory computer-readable storage medium of claim
11, wherein the image data further includes a captured image of the
vehicle indicating the appearance of the vehicle such as a vehicle
make, a vehicle model, a vehicle model year, paint conditions of
the vehicle, tire conditions of the vehicle, window conditions of
the vehicle, interior conditions of the vehicle, dents on the
vehicle, scratches on the vehicle, or other vehicle defects.
15. The non-transitory computer-readable storage medium of claim
10, further including instructions that, when executed, cause the
one or more processors to receive vehicle owner or operator data
associated with an owner or operators of the vehicle, and wherein
the instructions to identify the risk of loss for the vehicle
include identifying the risk of loss based on comparing the
received vehicle owner or operator data to stored vehicle owner or
operator statistics.
16. The non-transitory computer-readable storage medium of claim
15, wherein the vehicle owner or operator data includes one or more
of an age, gender, or marital status of the vehicle owner or
operator, a home address of the vehicle owner or operator, how the
vehicle is used by the vehicle owner or operator, where the vehicle
is driven to by the vehicle owner or operator, how many miles
driven by the vehicle owner or operator, how many speeding tickets
that the vehicle owner or operator has, how many auto accidents
that the vehicle owner or operator has been involved in, how many
vehicle insurance claims that the vehicle owner or operator has
previously filed, or other relevant information.
17. The non-transitory computer-readable storage medium of claim
10, further including instructions that, when executed, cause the
one or more processors to: receive updated image data, sound data
or vehicle owner or operator data associated with the vehicle; and
calculate an adjustment for the determined insurance premium based
on the updated image data, sound data or vehicle owner or operator
data.
18. A computer system for providing vehicle insurance, the system
comprising: a vehicle data repository; and an insurance server,
including a memory having instructions for execution on one or more
processors, wherein the instructions, when executed by the one or
more processors, cause the insurance server to: retrieve, via a
network connection, sound data associated with a vehicle from the
vehicle data repository; retrieve, via a network connection, a
baseline engine sound model from the vehicle data repository;
compare the sound data associated with the vehicle with the
baseline engine sound model to determine an operating condition of
the vehicle; identify a risk of loss for the vehicle based on the
determined operating condition of the vehicle, wherein the risk of
loss includes the sound data recorded by the audio sensor of the
computing device deviating from the baseline engine sound model by
a threshold amount; determine an insurance premium for the vehicle
based at least in part on the identified risk of loss; and provide,
via a network connection, the determined insurance premium to a
user.
19. The computer system of claim 18, wherein the instructions of
the insurance server, when executed by the one or more processors,
further cause the insurance server to: retrieve, via a network
connection, vehicle owner or operator data associated with an owner
or operators of the vehicle from the vehicle data repository;
retrieve, via a network connection, vehicle owner or operator
statistics from the vehicle data repository; and identify the risk
of loss based on comparing the vehicle owner or operator data to
the vehicle owner or operator statistics.
20. The computer system of claim 18, wherein the instructions of
the insurance server, when executed by the one or more processors,
further cause the insurance server to: retrieve, via a network
connection, updated data including image data, updated sound data
or vehicle owner or operator data associated with the vehicle from
the vehicle data repository; and calculate an adjustment for the
determined insurance premium based on the image data, updated sound
data or vehicle owner or operator data.
21. The computer system of claim 20, wherein the image data
includes a captured odometer reading indicating the mileage of the
vehicle, and the updated sound data includes a further recording of
engine sound while the vehicle is running.
Description
CROSS-REFERENCE TO RELATED APPLICATIONS
[0001] This application claims the benefit of U.S. Provisional
Application No. 61/775,652, filed Mar. 10, 2013, which is
incorporated by reference herein.
TECHNICAL FIELD
[0002] The present application relates generally to insurance and,
more specifically, to systems and methods for collecting and
processing vehicle image and sound data for insurance rating or
underwriting purposes.
BACKGROUND
[0003] In vehicle insurance, insurance providers generally seek to
determine an insurance policy premium for a vehicle based on the
risk of losses associated with the condition of the vehicle. For
purposes of making this determination, it is well understood that
various vehicle conditions can influence the probability that the
vehicle will experience a loss that is recognizable under the
policy. For example, mileage accrued on the vehicle can affect the
overall operating condition of the vehicle. As such, vehicles with
less mileage or that are driven less generally have a lower risk of
loss, and therefore may be offered lower premiums for a given level
of coverage. Conversely, vehicles with high mileage or that are
driven often correspond to a higher risk of loss, and therefore may
be offered higher premiums for the same level of coverage.
[0004] Currently, insurance providers have limited or inadequate
access to information regarding vehicle condition. Most of the
information is gathered through questionnaires provided to
prospective policy holders who own or operate the vehicle. However,
responses obtained from the questionnaires may not always be
accurate or complete. Thus, the determined insurance policy premium
for the vehicle may be poorly correlated with the actual risk of
losses associated with the conditions of the vehicle.
SUMMARY
[0005] The features and advantages described in this summary and
the following detailed description are not all-inclusive. Many
additional features and advantages will be apparent to one of
ordinary skill in the art in view of the drawings, specification,
and claims hereof. Additionally, other embodiments may omit one or
more (or all) of the features and advantages described in this
summary.
[0006] A computer-implemented method for providing vehicle
insurance may include receiving, via one or more processors, data
associated with a vehicle. The data may be captured by a sensor of
a computing device and may include image data recorded by a photo
sensor of the computing device and sound data recorded by an audio
sensor of the computing device. The method may also compare, via
one or more processors, the received data to stored baseline
vehicle data to determine an operating condition of the vehicle,
the stored baseline vehicle data including a baseline vehicle image
or baseline engine sound model. Further, the method may identify,
via one or more processors, a risk of loss for the vehicle based on
the determined operating condition of the vehicle, where the risk
of loss includes the image data deviating from the baseline image
by a threshold amount or the sound data recorded by the audio
sensor of the computing device deviating from the baseline engine
sound model by a threshold amount. The method may then determine,
via one or more processors, an insurance premium for the vehicle
based at least in part on the identified risk of loss. Finally, the
method may provide, via one or more processors, the determined
insurance premium to a user.
[0007] A non-transitory computer-readable storage medium may include
computer-readable instructions to be executed on one or more
processors of a system for providing vehicle insurance. The
instructions, when executed, may cause the one or more processors to
receive data associated with a vehicle. The data may be captured by
a sensor of a computing device. The instructions, when executed, may
also cause the one or more processors to compare the received data
to stored baseline vehicle data to determine an operating condition
of the vehicle. Further, the instructions, when executed, may cause
the one or more processors to identify a risk of loss for the
vehicle based on the determined operating condition of the vehicle.
The instructions, when executed, may then cause the one or more
processors to determine an insurance premium for the vehicle based
at least in part on the identified risk of loss. Finally, the
instructions, when executed, may cause the one or more processors to
provide the determined insurance premium to a user.
[0008] A computer system for providing vehicle insurance may
comprise a vehicle data repository and an insurance server that
includes a memory having instructions for execution on one or more
processors. The instructions, when executed by the one or more
processors, may cause the insurance server to retrieve, via a
network connection, sound data associated with a vehicle from the
vehicle data repository. The instructions may also cause the
insurance server to retrieve, via a network connection, baseline
engine sound model data from the vehicle data repository. Further,
the instructions may cause the insurance server to compare the sound
data associated with the vehicle with the baseline engine sound
model to determine an operating condition of the vehicle. The
instructions may cause the insurance server to identify a risk of
loss for the vehicle based on the determined operating condition of
the vehicle, wherein the risk of loss includes the sound data
recorded by the audio sensor of the computing device deviating from
the baseline engine sound model by a threshold amount. The
instructions may then cause the insurance server to determine an
insurance premium for the vehicle based at least in part on the
identified risk of loss. Finally, the instructions may cause the
insurance server to provide, via a network connection, the
determined insurance premium to a user.
BRIEF DESCRIPTION OF THE DRAWINGS
[0009] FIG. 1 is a block diagram of an example system for providing
insurance ratings based on vehicle image and sound data.
[0010] FIG. 2 is a screenshot of an example user interface for
providing insurance ratings based on vehicle image and sound
data.
[0011] FIG. 3 is a flow diagram of an example method for providing
insurance ratings based on vehicle image and sound data.
[0012] FIG. 4 is a block diagram of a computing environment that
implements a system and method for providing insurance ratings
based on vehicle image and sound data.
[0013] The figures depict a preferred embodiment of the present
invention for purposes of illustration only. One skilled in the art
will readily recognize from the following discussion that
alternative embodiments of the structures and methods illustrated
herein may be employed without departing from the principles of the
invention described herein.
DETAILED DESCRIPTION
[0014] Generally speaking, the disclosed system collects and
analyzes image and sound data associated with the conditions of a
vehicle in order to provide a policy holder with vehicle insurance
ratings. As described herein, the conditions of the vehicle may
correspond to the vehicle mileage, engine sounds, appearance,
etc.
[0015] FIG. 1 is a block diagram of an example system 100 for
providing insurance ratings based on vehicle image and sound data.
The example system 100 includes a computing device 102 that is
communicatively coupled to an insurance server 104 via a network
106 (e.g., a local area network, a wide area network, a mobile
network, a wired or wireless network, a private network, etc.). The
computing device 102 may include a processor 108, a memory 110,
user interfaces 112 (e.g., a display screen, a touchscreen, a
keyboard, an analog control panel, etc.), and one or more sensors
114. In general, the computing device 102 may be a smart phone, a
personal digital assistant (PDA), a laptop computer, a tablet
computer, a smart watch, smart glasses, etc.
[0016] Generally, a policy holder (e.g., a current or potential
policy holder) may operate the one or more sensors 114 on the
computing device 102 to collect or capture image and sound data
associated with the conditions of the vehicle. For example, the one
or more sensors 114 may include an imaging sensor (e.g., a camera,
a video recorder, etc.) that the policy holder may operate to
capture images and/or videos of the vehicle. As another example,
the one or more sensors 114 may include an audio sensor (e.g., a
microphone) that the policy holder may operate to record sounds
related to the vehicle such as engine sounds while the vehicle is
running.
[0017] Once image and sound data associated with the conditions of
the vehicle is collected or captured by the one or more sensors
114, the processor 108 may cause the data to be stored in the
memory 110 before being transmitted to the insurance server 104 via
the network 106. As such, the memory 110 may store captured images
and/or videos as vehicle image data 110A, and recorded sounds as
vehicle sound data 110B.
[0018] Additionally, the memory 110 may store vehicle owner or
operator data 110C related to the owner or operators of the vehicle
(e.g., the policy holder). For example, the owner or operators may
input personal information (e.g., age, gender, home address,
previous insurance claims, etc.) via, for example, the user
interface 112. The processor 108 may then cause the personal
information to be stored as the vehicle owner or operator data 110C
in the memory 110. In some embodiments, the processor 108 may cause
the gathered vehicle image data 110A, vehicle sound data 110B, and
vehicle owner or operator data 110C to be transmitted directly to
the insurance server 104 via the network 106.
[0019] In some embodiments, some or all of the vehicle image data
110A, vehicle sound data 110B, and vehicle owner or operator data
110C may be sent to the insurance server 104 via a third party. For
example, a server of a vehicle data provider (not shown in FIG. 1)
may first store and then transmit the vehicle image data 110A,
vehicle sound data 110B, and vehicle owner or operator data 110C to
the insurance server 104 via the network 106 or a different
network.
[0020] With continued reference to FIG. 1, the insurance server 104
may be a single server or a plurality of servers with distributed
processing. The insurance server 104 may be coupled to a vehicle
data repository 120. In an embodiment, the insurance server 104 may
receive the data 110A-110C stored in the memory 110 via the network
106. The received data 110A-110C may be stored in the vehicle data
repository 120 as vehicle image data 120A, vehicle sound data 120B,
and vehicle owner or operator data 120C. The vehicle data
repository 120 may also include one or more vehicle condition
models 122A and vehicle owner or operator statistics 122B that the
insurance server 104 may use to correlate with the data 120A-120C
in order to determine the likelihood of incurring recognizable
losses for the vehicle. In some embodiments, the data 120A, 120B,
120C, 122A and/or 122B may be stored separately in different data
repositories.
[0021] A processor 104A of the insurance server 104 may execute
instructions stored in a memory 104B of the insurance server 104 to
retrieve data stored in the vehicle data repository 120. The
insurance server 104 may operate directly on the data 120A-120C
provided in the vehicle data repository 120, or may operate on
other data that is generated based on the data 120A-120C. For
example, the insurance server 104 may convert any or all of the
data 120A-120C to a particular format (e.g., for efficient
storage), and later utilize the modified data. In some embodiments,
the vehicle data repository 120 may not be directly coupled to the
insurance server 104, but instead may be accessible by the
insurance server 104 via a network such as the network 106.
[0022] The insurance server 104 may be configured to provide
insurance ratings for the vehicle based on image and sound data
associated with the condition of the vehicle. To this end, the
insurance server 104 may compare the data 120A-120C with the one or
more vehicle condition models 122A and vehicle owner or operator
statistics 122B or other data to determine a risk rating or a
parameter corresponding to a risk rating. The sound data may also
be used to identify the beginning and ending of a trip using the
vehicle. For example, an audio device may listen for a particular
engine sound for the vehicle (i.e., like an audio fingerprint for
the vehicle) and the functions described herein may determine when
a trip has started and then when the trip has ended. The vehicle
sound fingerprint could also trigger collection of other usage-based
data such as miles driven, speed, time of day, geographic location,
etc., or distinguish a trip in Vehicle A from a trip in Vehicle
B. Some example embodiments and scenarios are described here for
illustration purposes.
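For illustration only, the trip-attribution idea described above may be sketched as follows. The fingerprint representation (a few dominant engine frequencies), the example vehicles, and the matching tolerance are hypothetical assumptions, not part of the disclosure.

```python
# Illustrative sketch: attribute a recorded engine sound to a known
# vehicle by matching its dominant frequencies (the "audio
# fingerprint"). All fingerprint values and the tolerance are
# hypothetical.

VEHICLE_FINGERPRINTS = {
    "Vehicle A": [120.0, 240.0, 600.0],
    "Vehicle B": [90.0, 180.0, 450.0],
}

def match_vehicle(observed_freqs, tolerance_hz=15.0):
    """Return the vehicle whose fingerprint best matches the observed
    dominant frequencies, or None when no vehicle is close enough."""
    best_vehicle, best_error = None, float("inf")
    for vehicle, fingerprint in VEHICLE_FINGERPRINTS.items():
        if len(fingerprint) != len(observed_freqs):
            continue
        # Mean absolute frequency deviation across the fingerprint.
        error = sum(abs(o - f) for o, f in zip(observed_freqs, fingerprint))
        error /= len(fingerprint)
        if error < best_error:
            best_vehicle, best_error = vehicle, error
    return best_vehicle if best_error <= tolerance_hz else None

# A sample close to Vehicle A's fingerprint would mark the start of a
# trip in that vehicle; an unmatched sample would not.
print(match_vehicle([118.0, 243.0, 598.0]))   # Vehicle A
print(match_vehicle([500.0, 900.0, 2000.0]))  # None
```

A match appearing in the audio stream could mark the beginning of a trip and trigger collection of usage-based data; the match disappearing could mark the end.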
[0023] In an example embodiment, a condition of the vehicle may
correspond to the vehicle mileage. For example, the vehicle image
data 120A in the repository 120 may specify odometer reading
information. As such, the insurance server 104 may compare the
odometer information in the data 120A to a vehicle odometer model
(e.g., stored as one of the vehicle condition models 122A) in the
repository 120. The vehicle odometer model may identify one or more
ranges of vehicle mileage, where each range corresponds to an
indicator of loss likelihood. Thus, the insurance server 104 may
determine a risk indicator by matching the odometer information in
the data 120A to one of the vehicle mileage ranges in the vehicle
odometer model. The insurance server 104 may then determine an
insurance premium for the vehicle based at least in part on the
determined risk indicator.
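The range-matching step described in this paragraph may be sketched as follows. The mileage ranges, risk labels, base premium, and multipliers are hypothetical illustration values, not figures from the disclosure.

```python
# Hypothetical vehicle odometer model: mileage ranges mapped to risk
# indicators, each of which adjusts a base premium.
ODOMETER_MODEL = [
    (0, 30_000, "low"),
    (30_000, 90_000, "medium"),
    (90_000, float("inf"), "high"),
]

BASE_PREMIUM = 500.0
RISK_MULTIPLIER = {"low": 0.9, "medium": 1.0, "high": 1.25}

def risk_indicator(odometer_miles):
    """Match an odometer reading to a mileage range in the model."""
    for low, high, indicator in ODOMETER_MODEL:
        if low <= odometer_miles < high:
            return indicator
    raise ValueError("negative odometer reading")

def premium(odometer_miles):
    """Determine a premium based in part on the risk indicator."""
    return BASE_PREMIUM * RISK_MULTIPLIER[risk_indicator(odometer_miles)]

print(risk_indicator(42_000))  # medium
print(premium(120_000))        # 625.0
```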
[0024] In another example embodiment, a condition of the vehicle
may correspond to the vehicle engine sound. For example, the
vehicle sound data 120B in the repository 120 may specify engine
sound information. As such, the insurance server 104 may compare
the engine sound information in the data 120B to a baseline engine
sound model (e.g., stored as one of the vehicle condition models
122A) in the repository 120. The baseline engine sound model may
indicate that certain engine sounds correspond to particular
operating conditions or states of the vehicle that can lead to a
high risk of loss, such as states of disrepair or indications of
inappropriate or illegal modifications to the vehicle engine, when
the sound data recorded by the audio sensor of the computing device
deviates from the baseline engine sound model by a threshold
amount. For example, where an exhaust note of a normally operating
vehicle at idle may include a frequency of 600 Hz, the recorded
sound data may indicate a low frequency of 300 Hz or a high
frequency of 1000 Hz. Either the low or the high frequency may
indicate disrepair or other engine states that correspond to a
high risk of loss. Thus, by comparing the engine sound information
in the data 120B to the baseline engine sound model, the insurance
server 104 may determine an appropriate risk of loss. Based at
least in part on the identified risk of loss, the insurance server
104 may then determine an appropriate insurance premium for the
vehicle.
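The threshold comparison in the example above may be sketched as follows. The 600 Hz baseline and the 300 Hz and 1000 Hz readings come from the example in the text; the 100 Hz threshold is an illustrative assumption.

```python
# Sketch of the baseline-deviation check described above. The 600 Hz
# idle frequency is taken from the text's example; the threshold is a
# hypothetical value.
BASELINE_IDLE_HZ = 600.0
THRESHOLD_HZ = 100.0

def deviates_from_baseline(recorded_hz,
                           baseline_hz=BASELINE_IDLE_HZ,
                           threshold_hz=THRESHOLD_HZ):
    """Return True when the recorded idle frequency deviates from the
    baseline engine sound model by more than the threshold amount."""
    return abs(recorded_hz - baseline_hz) > threshold_hz

# Both the low (300 Hz) and high (1000 Hz) readings from the example
# exceed the threshold, indicating a higher risk of loss.
print(deviates_from_baseline(300.0))   # True
print(deviates_from_baseline(1000.0))  # True
print(deviates_from_baseline(620.0))   # False
```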
[0025] In some embodiments, the insurance server 104 may use other
conditions of the vehicle (e.g., paint conditions, tire conditions,
window conditions, interior conditions, dents, scratches, or other
vehicle defects) for vehicle insurance rating purposes.
[0026] Further, the insurance server 104 may utilize the vehicle
owner or operator data 120C to provide insurance ratings for the
vehicle. In an example embodiment, the vehicle owner or operator
data 120C may specify information such as the age, gender, and
marital status of the vehicle owner or operator, the home address
or the neighborhood that the vehicle owner or operator resides, how
the vehicle is used, how far the vehicle owner or operator drives
every day and where the vehicle is driven to, how many miles driven
per day, week, month, etc., how many speeding tickets the vehicle
owner or operator has, how many accidents the vehicle owner or
operator has been involved in, how many vehicle insurance claims
that the vehicle owner or operator has filed, and other relevant
information. The insurance server 104 may compare any or all of the
information in the data 120C to the vehicle owner or operator
statistics 122B in the repository 120. For example, the vehicle
owner or operator statistics 122B may indicate a high risk of loss
for the vehicle if the vehicle owner or operator lives in a
dangerous neighborhood. In another example, the vehicle owner or
operator statistics 122B may indicate a higher risk of loss for the
vehicle if the vehicle owner or operator has received speeding
tickets or has been involved in accidents in the past. Conversely,
the vehicle owner or operator statistics 122B may indicate a lower
risk of loss if the vehicle owner or operator has a clean driving
record. Thus, by comparing the information in the data 120C to the
vehicle owner or operator statistics 122B, the insurance server 104
may further refine or modify the risk of loss associated with the
vehicle in order to provide more accurate vehicle insurance
ratings.
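One hypothetical way to refine a base risk score using owner or operator statistics, in the spirit of the comparison described above; the factor weights and the clamping range are illustrative assumptions only.

```python
# Illustrative refinement of a base risk-of-loss score using operator
# history. The weights (0.05 per speeding ticket, 0.10 per accident,
# a 0.05 credit for a clean record) are hypothetical.
def refine_risk(base_risk, speeding_tickets, accidents, clean_record):
    """Adjust a base risk-of-loss score in [0, 1] using operator data."""
    risk = base_risk
    risk += 0.05 * speeding_tickets
    risk += 0.10 * accidents
    if clean_record:
        risk -= 0.05
    return min(max(risk, 0.0), 1.0)  # clamp to [0, 1]

print(round(refine_risk(0.30, speeding_tickets=2, accidents=1,
                        clean_record=False), 2))  # 0.5
print(round(refine_risk(0.30, speeding_tickets=0, accidents=0,
                        clean_record=True), 2))   # 0.25
```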
[0027] It is understood that the above examples are not exclusive,
and that more than one such embodiment may coexist within a single
system.
[0028] Moreover, the insurance server 104 may be configured to
provide renewals, updates, and/or adjustments of an existing
insurance premium. To do so, the insurance server 104 may perform
an audit, where the policy holder submits new or updated
information (e.g., new image and sound data) regarding the
conditions of the vehicle and/or the owner or operators of the
vehicle. Using the new or updated information, the insurance server
104 may calculate a new or updated risk rating to determine or
update the existing insurance premium.
[0029] In an example embodiment, the insurance server 104 may
determine an insurance premium for a vehicle based at least in part
on a mileage estimation provided by the policy holder. The mileage
estimation may specify that the policy holder will only accrue a
certain amount of mileage on the vehicle over a certain period of
time covered by the insurance premium. At renewal time, the
insurance server 104 may audit the policy holder to determine the
actual mileage accrued on the vehicle. If the actual mileage is
within the mileage estimation, then the insurance server 104 may
renew the existing insurance premium for the vehicle. However, if
the actual mileage is much greater than the mileage estimation,
then the insurance server 104 may calculate a new risk rating and
determine a new insurance premium for the vehicle.
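The renewal audit described above may be sketched as follows. The disclosure does not define "much greater" or how the new premium is computed; the per-mile overage surcharge used here is an illustrative assumption.

```python
# Sketch of the mileage audit at renewal time. Any mileage within the
# estimate renews the premium unchanged; overage adds a hypothetical
# surcharge of 1% per 1,000 miles over the estimate.
def renew_premium(existing_premium, estimated_miles, actual_miles,
                  overage_rate=0.01):
    """Return the renewed (or recalculated) premium after an audit."""
    if actual_miles <= estimated_miles:
        return existing_premium  # within the estimate: renew as-is
    overage = actual_miles - estimated_miles
    return existing_premium * (1.0 + overage_rate * overage / 1000.0)

print(renew_premium(600.0, 12_000, 11_500))           # 600.0
print(round(renew_premium(600.0, 12_000, 17_000), 2)) # 630.0
```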
[0030] In some embodiments, the policy holder may be requested to
use a link to connect with a professional agent to perform the
auditing process. For example, using a video chat service, the
professional agent may inspect the vehicle and submit
information regarding the conditions of the vehicle to the
insurance server 104.
[0031] FIG. 2 is a screenshot of an example user interface 200 for
providing insurance ratings based on vehicle image and sound data.
In FIG. 2, the user interface 200 is shown in a computing device
(e.g., a smart phone) such as the computing device 102 of FIG. 1.
As such, the user interface 200 may be stored in the memory 110 and
executed by the processor 108. When executed by the processor 108,
the user interface 200 may be displayed to a policy holder via the
user interfaces 112 (e.g., a display screen) of the computing
device 102.
[0032] In the embodiment of FIG. 2, the user interface 200 may
include an odometer reading input field 204 and an engine sound
input field 206, which allow the policy holder to enter odometer
reading information and engine sound information, respectively. For
example, the policy holder may take an image and/or a video of the
current vehicle mileage as shown in the vehicle odometer by using
an imaging sensor (e.g., a camera or video recorder that is part of
the one or more sensors 114) on the computing device 102. The
captured image and/or video may be stored in the memory 110 of the
computing device 102 as the vehicle image data 110A. The policy
holder may then access and select the vehicle image data 110A in
the memory 110 via the input field 204. The policy holder may also
record the sound of the vehicle engine while the vehicle is running
by using an audio sensor (e.g., a microphone that is part of the
one or more sensors 114) on the computing device 102. The recorded
engine sound may be stored in the memory 110 of the computing
device 102 as the vehicle sound data 110B. Accordingly, the policy
holder may access and select the vehicle sound data 110B in the
memory 110 via the input field 206.
[0033] In addition, the user interface 200 may include a vehicle
information input field 208, which allows the policy holder to
enter information about the vehicle as well as the owner or
operators of the vehicle (which may be the policy holder himself or
herself). In the embodiment of FIG. 2, the input field 208 includes
entries for a vehicle make 208A, a vehicle year 208B, a vehicle
identification number (VIN) 208C, a vehicle owner or operator age
208D, a vehicle owner or operator gender 208E, a vehicle owner or
operator marital status 208F, a vehicle owner or operator home
address 208G, how the vehicle is used by the vehicle owner or
operator 208H, where the vehicle is commonly driven to by the
vehicle owner or operator 208I, an amount of miles driven by the
vehicle owner or operator 208J, a number of speeding tickets that
the vehicle owner or operator has 208K, a number of auto accidents
that the vehicle owner or operator has been involved in 208L, and a
number of vehicle insurance claims that the vehicle owner or
operator has previously filed 208M. Information entered into the
input field 208 may be stored in the memory 110 as the vehicle
owner or operator data 110C, for example. Although FIG. 2 shows the
entries 208A-208M in the vehicle information input field 208, in
general, the input field 208 may include any number of entries that
provide relevant information about the vehicle and/or the vehicle
owner or operator.
[0034] Once the policy holder has entered or selected all the
necessary information in the input fields 204, 206 and 208, the
policy holder may execute the button 210 to submit the information
to the insurance server 104. For example, the computing device 102
may transmit the information to the insurance server 104 for
storage and processing via a network connection such as the network
106.
[0035] Referring now to FIG. 3, a flow diagram of an example method
300 for providing insurance ratings based on vehicle image and sound
data is shown. The method 300 may include one
more blocks, routines or functions in the form of computer
executable instructions that are stored in a tangible
computer-readable medium (e.g., 104B of FIG. 1) and executed using
a processor (e.g., 104A of FIG. 1).
[0036] The method 300 begins by receiving image and sound data
associated with a vehicle (block 302). For example, the method 300
may receive a captured image and/or a video of the vehicle odometer
reading as the image data, and a recorded sound of the vehicle
engine as the sound data.
[0037] Next, the method 300 analyzes the received image and sound
data by comparing the received image and sound data to stored
baseline vehicle data in order to determine an operating condition
of the vehicle (block 304). For example, the method 300 may compare
mileage information in the odometer reading (as received in the
image data or as entered into the user interface 200) to a set of
mileage ranges that are part of the stored baseline vehicle data.
From the comparison, the method 300 may determine how well the
vehicle can operate given the mileage that is already accrued on
the vehicle. Block 304 may also analyze the sound data to identify
the beginning and ending of a trip using the vehicle, and may
further trigger collection of other usage-based data such as miles
driven, speed, time of day, geographic location, etc.
[0038] The method 300 then identifies a risk of loss for the
vehicle based at least in part on the determined operating
condition of the vehicle (block 306). Continuing with the above
example, the method 300 may determine in the block 304 that the
vehicle has a high mileage. Accordingly, the method 300 may
identify a high risk of loss associated with the vehicle.
[0039] Finally, the method 300 may determine an insurance premium
(or provide an indication of the determined insurance premium) for
the vehicle based at least in part on the identified risk of loss
(block 308). The method 300 may provide (e.g., display or
communicate) the insurance premium to a policy holder via a display
screen (e.g., the user interfaces 112 of FIG. 1), for example.
[0040] In some embodiments, the method 300 may include additional
functionalities not shown in FIG. 3. For example, in addition to
image and sound data, the method 300 may receive data regarding the
owner or operators of the vehicle in the block 302. The method 300
may also analyze the received data regarding the owner or operators
of the vehicle in the block 304. In particular, the method 300 may
perform this analysis by comparing the data regarding the owner or
operators of the vehicle to stored vehicle owner or operator
statistics in the block 304, for example. Further, based at least
in part on the analysis in the block 304, the method 300 may
further develop or refine the identified risk of loss in the block
306.
[0041] In some embodiments, the method 300 may include additional
blocks not shown in FIG. 3. For example, the method 300 may include
blocks that provide renewals, updates, and/or adjustments of the
determined insurance premium in the block 308. In an example
embodiment, the method 300 may receive new or updated data (e.g.,
new image and sound data) associated with the vehicle, and
calculate a new or updated risk of loss in order to update or
modify the insurance premium determined in the block 308.
[0042] Using the system (e.g., 100), user interface (e.g., 200),
and method (e.g., 300) described herein, insurance ratings may be
provided based on vehicle image and sound data.
[0043] FIG. 4 is a block diagram of an example computing
environment for a system 400 having a computing device 401 that may
be used to implement the systems and methods described herein. The
computing device 401 may include a computing device 102, an
insurance server 104, a mobile computing device (e.g., cellular
phone, a tablet computer, a Wi-Fi-enabled device or other personal
computing device capable of wireless or wired communication), a
thin client, or other known type of computing device. As will be
recognized by one skilled in the art, in light of the disclosure
and teachings herein, other types of computing devices can be used
that have different architectures. Processor systems similar or
identical to the example system 400 may be used to implement and
execute the example system of FIG. 1, the user interface of FIG. 2,
the method of FIG. 3, and the like. Although the example system 400
is described below as including a plurality of peripherals,
interfaces, chips, memories, etc., one or more of those elements
may be omitted from other example processor systems used to
implement and execute the example system 100. Also, other
components may be added.
[0044] As shown in FIG. 4, the computing device 401 includes a
processor 402 that is coupled to an interconnection bus 404. The
processor 402 includes a register set or register space 406, which
is depicted in FIG. 4 as being entirely on-chip, but which could
alternatively be located entirely or partially off-chip and
directly coupled to the processor 402 via dedicated electrical
connections and/or via the interconnection bus 404. The processor
402 may be any suitable processor, processing unit or
microprocessor. Although not shown in FIG. 4, the computing device
401 may be a multi-processor device and, thus, may include one or
more additional processors that are identical or similar to the
processor 402 and that are communicatively coupled to the
interconnection bus 404.
[0045] The processor 402 of FIG. 4 is coupled to a chipset 408,
which includes a memory controller 410 and a peripheral
input/output (I/O) controller 412. As is well known, a chipset
typically provides I/O and memory management functions as well as a
plurality of general purpose and/or special purpose registers,
timers, etc. that are accessible or used by one or more processors
coupled to the chipset 408. The memory controller 410 performs
functions that enable the processor 402 (or processors if there are
multiple processors) to access a system memory 414 and a mass
storage memory 416, which may include either or both of an in-memory
cache (e.g., a cache within the memory 414) or an on-disk cache
(e.g., a cache within the mass storage memory 416).
[0046] The system memory 414 may include any desired type of
volatile and/or non-volatile memory such as, for example, static
random access memory (SRAM), dynamic random access memory (DRAM),
flash memory, read-only memory (ROM), etc. The mass storage memory
416 may include any desired type of mass storage device. For
example, the computing device 401 may be used to implement an
application 418 having an API 419 (including functions and
instructions as described by the method 300 of FIG. 3). The mass
storage memory 416 may include a hard disk drive, an optical drive,
a tape storage device, a solid-state memory (e.g., a flash memory,
a RAM memory, etc.), a magnetic memory (e.g., a hard drive), or any
other memory suitable for mass storage. As used herein, the terms
module, block, function, operation, procedure, routine, step, and
method refer to tangible computer program logic or tangible
computer executable instructions that provide the specified
functionality to the computing device 401 and the system 400. Thus,
a module, block, function, operation, procedure, routine, step, and
method can be implemented in hardware, firmware, and/or software.
In one embodiment, program modules and routines (e.g., the
application 418, the API 419, etc.) are stored in mass storage
memory 416, loaded into system memory 414, and executed by a
processor 402 or can be provided from computer program products
that are stored in tangible computer-readable storage mediums
(e.g., RAM, hard disk, optical/magnetic media, etc.).
[0047] The peripheral I/O controller 412 performs functions that
enable the processor 402 to communicate with peripheral
input/output (I/O) devices 422 and 424, a network interface 426, a
local network transceiver 427, a cellular network transceiver 428,
and a GPS transceiver 429. The I/O
devices 422 and 424 may be any desired type of I/O device such as,
for example, a keyboard, a display (e.g., a liquid crystal display
(LCD), a cathode ray tube (CRT) display, etc.), a navigation device
(e.g., a mouse, a trackball, a capacitive touch pad, a joystick,
etc.), etc. The cellular network transceiver 428 may be resident
with the local network transceiver 427. The local network
transceiver 427 may include support for a Wi-Fi network, Bluetooth,
Infrared, or other wireless data transmission protocols. In other
embodiments, one element may simultaneously support each of the
various wireless protocols employed by the computing device 401.
For example, a software-defined radio may be able to support
multiple protocols via downloadable instructions. In operation,
the computing device 401 may periodically poll for visible
wireless network transmitters (both cellular and local network).
Such polling may be possible even while normal wireless traffic is
being supported on the computing device 401.
The network interface 426 may be, for example, an Ethernet device,
an asynchronous transfer mode (ATM) device, an 802.11 wireless
interface device, a DSL modem, a cable modem, a cellular modem,
etc., that enables the system 100 to communicate with another
computer system having at least the elements described in relation
to the system 100.
[0048] While the memory controller 410 and the I/O controller 412
are depicted in FIG. 4 as separate functional blocks within the
chipset 408, the functions performed by these blocks may be
integrated within a single integrated circuit or may be implemented
using two or more separate integrated circuits. The system 400 may
also implement the application 418 on remote computing devices 430
and 432. The remote computing devices 430 and 432 may communicate
with the computing device 401 over an Ethernet link 434. For
example, the computing device 401 may receive vehicle image and
sound data created by an application executing on a remote
computing device 430, 432. In some embodiments, the application 418
may be retrieved by the computing device 401 from a cloud computing
server 436 via the Internet 438. When using the cloud computing
server 436, the retrieved application 418 may be programmatically
linked with the computing device 401. The application 418 may be a
Java.RTM. applet executing within a Java.RTM. Virtual Machine (JVM)
environment resident in the computing device 401 or the remote
computing devices 430, 432. The application 418 may also be a
"plug-in" adapted to execute in a web-browser located on the
computing devices 401, 430, and 432. In some embodiments, the
application 418 may communicate with a backend component 440 such
as the insurance server 104 via the Internet 438.
[0049] The system 400 may include, but is not limited to, any
combination of a LAN, a MAN, a WAN, a mobile network, a wired or
wireless network, a private network, or a virtual private network.
Moreover,
while only two remote computing devices 430 and 432 are illustrated
in FIG. 4 to simplify and clarify the description, it is understood
that any number of client computers are supported and can be in
communication within the system 400.
[0050] Additionally, certain embodiments are described herein as
including logic or a number of components, modules, or mechanisms.
Modules may constitute either software modules (e.g., code or
instructions embodied on a machine-readable medium or in a
transmission signal, wherein the code is executed by a processor)
or hardware modules. A hardware module is a tangible unit capable of
performing certain operations and may be configured or arranged in
a certain manner. In example embodiments, one or more computer
systems (e.g., a standalone, client or server computer system) or
one or more hardware modules of a computer system (e.g., a
processor or a group of processors) may be configured by software
(e.g., an application or application portion) as a hardware module
that operates to perform certain operations as described
herein.
[0051] In various embodiments, a hardware module may be implemented
mechanically or electronically. For example, a hardware module may
comprise dedicated circuitry or logic that is permanently
configured (e.g., as a special-purpose processor, such as a field
programmable gate array (FPGA) or an application-specific
integrated circuit (ASIC)) to perform certain operations. A
hardware module may also comprise programmable logic or circuitry
(e.g., as encompassed within a general-purpose processor or other
programmable processor) that is temporarily configured by software
to perform certain operations. It will be appreciated that the
decision to implement a hardware module mechanically, in dedicated
and permanently configured circuitry, or in temporarily configured
circuitry (e.g., configured by software) may be driven by cost and
time considerations.
[0052] Accordingly, the term "hardware module" should be understood
to encompass a tangible entity, be that an entity that is
physically constructed, permanently configured (e.g., hardwired),
or temporarily configured (e.g., programmed) to operate in a
certain manner or to perform certain operations described herein.
As used herein, "hardware-implemented module" refers to a hardware
module. Considering embodiments in which hardware modules are
temporarily configured (e.g., programmed), each of the hardware
modules need not be configured or instantiated at any one instance
in time. For example, where the hardware modules comprise a
general-purpose processor configured using software, the
general-purpose processor may be configured as respective different
hardware modules at different times. Software may accordingly
configure a processor, for example, to constitute a particular
hardware module at one instance of time and to constitute a
different hardware module at a different instance of time.
[0053] Hardware modules can provide information to, and receive
information from, other hardware modules. Accordingly, the
described hardware modules may be regarded as being communicatively
coupled. Where multiple such hardware modules exist
contemporaneously, communications may be achieved through signal
transmission (e.g., over appropriate circuits and buses) that
connect the hardware modules. In embodiments in which multiple
hardware modules are configured or instantiated at different times,
communications between such hardware modules may be achieved, for
example, through the storage and retrieval of information in memory
structures to which the multiple hardware modules have access. For
example, one hardware module may perform an operation and store the
output of that operation in a memory device to which it is
communicatively coupled. A further hardware module may then, at a
later time, access the memory device to retrieve and process the
stored output. Hardware modules may also initiate communications
with input or output devices, and can operate on a resource (e.g.,
a collection of information).
[0054] The various operations of example methods described herein
may be performed, at least partially, by one or more processors
that are temporarily configured (e.g., by software) or permanently
configured to perform the relevant operations. Whether temporarily
or permanently configured, such processors may constitute
processor-implemented modules that operate to perform one or more
operations or functions. The modules referred to herein may, in
some example embodiments, comprise processor-implemented
modules.
[0055] Similarly, the methods or routines described herein may be
at least partially processor-implemented. For example, at least
some of the operations of a method may be performed by one or more
processors or processor-implemented hardware modules. The
performance of certain of the operations may be distributed among
the one or more processors, not only residing within a single
machine, but deployed across a number of machines. In some example
embodiments, the processor or processors may be located in a single
location (e.g., within a home environment, an office environment or
as a server farm), while in other embodiments the processors may be
distributed across a number of locations.
[0056] The one or more processors may also operate to support
performance of the relevant operations in a "cloud computing"
environment or as a "software as a service" (SaaS). For example, at
least some of the operations may be performed by a group of
computers (as examples of machines including processors), these
operations being accessible via a network (e.g., the Internet) and
via one or more appropriate interfaces (e.g., application program
interfaces (APIs)).
[0057] The performance of certain of the operations may be
distributed among the one or more processors, not only residing
within a single machine, but deployed across a number of machines.
In some example embodiments, the one or more processors or
processor-implemented modules may be located in a single geographic
location (e.g., within a home environment, an office environment,
or a server farm). In other example embodiments, the one or more
processors or processor-implemented modules may be distributed
across a number of geographic locations.
[0058] Some portions of this specification are presented in terms
of algorithms or symbolic representations of operations on data
stored as bits or binary digital signals within a machine memory
(e.g., a computer memory). These algorithms or symbolic
representations are examples of techniques used by those of
ordinary skill in the data processing arts to convey the substance
of their work to others skilled in the art. As used herein, an
"algorithm" is a self-consistent sequence of operations or similar
processing leading to a desired result. In this context, algorithms
and operations involve physical manipulation of physical
quantities. Typically, but not necessarily, such quantities may
take the form of electrical, magnetic, or optical signals capable
of being stored, accessed, transferred, combined, compared, or
otherwise manipulated by a machine. It is convenient at times,
principally for reasons of common usage, to refer to such signals
using words such as "data," "content," "bits," "values,"
"elements," "symbols," "characters," "terms," "numbers,"
"numerals," or the like. These words, however, are merely
convenient labels and are to be associated with appropriate
physical quantities.
[0059] Unless specifically stated otherwise, discussions herein
using words such as "processing," "computing," "calculating,"
"determining," "presenting," "displaying," or the like may refer to
actions or processes of a machine (e.g., a computer) that
manipulates or transforms data represented as physical (e.g.,
electronic, magnetic, or optical) quantities within one or more
memories (e.g., volatile memory, non-volatile memory, or a
combination thereof), registers, or other machine components that
receive, store, transmit, or display information.
[0060] As used herein any reference to "some embodiments" or "an
embodiment" means that a particular element, feature, structure, or
characteristic described in connection with the embodiment is
included in at least one embodiment. The appearances of the phrase
"in some embodiments" in various places in the specification are
not necessarily all referring to the same embodiment.
[0061] Some embodiments may be described using the expression
"coupled" and "connected" along with their derivatives. For
example, some embodiments may be described using the term "coupled"
to indicate that two or more elements are in direct physical or
electrical contact. The term "coupled," however, may also mean that
two or more elements are not in direct contact with each other, but
yet still co-operate or interact with each other. The embodiments
are not limited in this context.
[0062] Further, the figures depict preferred embodiments of a
system for providing insurance ratings based on vehicle image and
sound data for purposes of illustration only. One skilled in the
art will readily recognize from the following discussion that
alternative embodiments of the structures and methods illustrated
herein may be employed without departing from the principles
described herein.
[0063] Upon reading this disclosure, those of skill in the art will
appreciate still additional alternative structural and functional
designs for a system and a process for providing insurance ratings
based on vehicle image and sound data through the disclosed
principles herein. Thus, while particular embodiments and
applications have been illustrated and described, it is to be
understood that the disclosed embodiments are not limited to the
precise construction and components disclosed herein. Various
modifications, changes and variations, which will be apparent to
those skilled in the art, may be made in the arrangement, operation
and details of the method and apparatus disclosed herein without
departing from the spirit and scope defined in the appended
claims.
* * * * *