U.S. patent application number 15/803071 was filed with the patent office on 2017-11-03 and published on 2018-02-22 for unmanned aircraft structure evaluation system and method.
The applicant listed for this patent is Pictometry International Corp. Invention is credited to John Monaco and Stephen L. Schultz.
Application Number: 15/803071
Publication Number: 20180053054
Family ID: 53524310
Filed: November 3, 2017
Published: February 22, 2018

United States Patent Application 20180053054
Kind Code: A1
Schultz; Stephen L.; et al.
February 22, 2018
UNMANNED AIRCRAFT STRUCTURE EVALUATION SYSTEM AND METHOD
Abstract
Unmanned aerial vehicle (UAV) systems and methods are disclosed,
including displaying a first graphical representation of a
structure to be inspected, the first graphical representation
comprising one or more images describing an aerial view of the
structure; determining a flight path according to which a UAV is to
navigate above the structure and obtain sensor data describing an
aerial view of the structure from the first graphical
representation; receiving sensor data obtained by the UAV including
multiple images describing an aerial view depicting a structure
rooftop; generating a second graphical representation of the
structure, based on the multiple images, comprising composite
imagery of the rooftop based on the multiple images; determining an
occurrence of damaged areas of the rooftop; and displaying the
second graphical representation of the rooftop with damaged areas
of the rooftop identified.
Inventors: Schultz; Stephen L. (West Henrietta, NY); Monaco; John (Penfield, NY)

Applicant:
Name: Pictometry International Corp.
City: Rochester
State: NY
Country: US
Family ID: 53524310
Appl. No.: 15/803071
Filed: November 3, 2017
Related U.S. Patent Documents

Application Number | Filing Date  | Patent Number | Child Application
15475978           | Mar 31, 2017 |               | 15803071
14591556           | Jan 7, 2015  | 9612598       | 15475978
61926137           | Jan 10, 2014 |               |
Current U.S. Class: 1/1

Current CPC Class: B64C 2201/146 20130101; H04N 21/47 20130101; G08G 5/0039 20130101; B64D 47/08 20130101; B60R 1/00 20130101; G08G 5/045 20130101; G08G 5/0069 20130101; H04N 21/4316 20130101; G06F 16/5838 20190101; G06F 16/5866 20190101; G01S 19/39 20130101; B64C 39/024 20130101; B60R 2300/8093 20130101; G06T 11/60 20130101; G06K 9/0063 20130101; B64C 2201/141 20130101; G05D 1/0094 20130101; B64C 2201/127 20130101; B64C 2201/123 20130101; G06Q 40/08 20130101; G06F 16/51 20190101; G08G 5/0086 20130101; G06K 9/00637 20130101

International Class: G06K 9/00 20060101 G06K009/00; H04N 5/445 20110101 H04N005/445; G08G 5/00 20060101 G08G005/00; G06F 17/30 20060101 G06F017/30; B64D 47/08 20060101 B64D047/08; G05D 1/00 20060101 G05D001/00; B64C 39/02 20060101 B64C039/02
Claims
1. A method performed by a system having one or more processors,
the method comprising: displaying, via a user interface, a first
graphical representation of a structure to be inspected, the first
graphical representation comprising one or more images describing
an aerial view of the structure; determining a flight path
according to which an unmanned aerial vehicle (UAV) is to navigate
above the structure and obtain sensor data describing an aerial
view of the structure from the first graphical representation, the
flight path including waypoints; receiving sensor data obtained by
the UAV during performance of the flight path, wherein the sensor
data includes multiple images describing an aerial view depicting a
rooftop of the structure; generating a second graphical
representation of the structure based on the multiple images,
wherein the second graphical representation comprises composite
imagery of the rooftop based on the multiple images; determining an
occurrence of damaged areas of the rooftop in the multiple images;
and displaying, via the user interface, the second graphical
representation of the rooftop with damaged areas of the rooftop
identified in the second graphical representation, wherein the
damaged areas of the rooftop are represented by a graphical
indication of the respective damaged areas, wherein the user
interface is configured to receive selection of a damaged area
identified in the second graphical representation.
2. The method of claim 1, wherein determining whether surface
damage has occurred about a particular area of the rooftop
comprises: determining whether a feature indicative of rooftop
damage exists in a digital image; and weighting the feature to
determine that the feature is indicative of a damaged area or an
undamaged area.
3. The method of claim 1, wherein a type of rooftop damage identified is one of hail damage, tornado damage, or flood damage.
4. The method of claim 1, further comprising: receiving, via the
user interface, positions about the second graphical representation
of the rooftop indicating locations of damaged areas.
5. The method of claim 1, wherein a position above the structure is
set at an altitude such that a required image pixel resolution is
obtained.
6. The method of claim 1, further comprising: displaying, via the user interface, a graphical representation of the flight path above the first graphical representation of the structure.
7. A system comprising one or more processors and a non-transitory
computer storage media storing instructions that when executed by
the one or more processors cause the one or more processors to
perform operations comprising: displaying, via a user interface, a
first graphical representation of a structure to be inspected, the
first graphical representation comprising one or more images
describing an aerial view of the structure; determining a flight
path according to which an unmanned aerial vehicle (UAV) is to
navigate above the structure and obtain sensor data describing an
aerial view of the structure from the first graphical
representation, the flight path including waypoints; receiving
sensor data obtained by the UAV during performance of the flight
path, wherein the sensor data includes multiple digital images
describing an aerial view depicting a rooftop of the structure, and
wherein a subset of the digital images are detailed images;
generating a second graphical representation of the structure based
on the multiple digital images, wherein the second graphical
representation comprises composite imagery of the rooftop based on
the multiple digital images; determining an occurrence of damaged
areas of the rooftop in the multiple digital images; and
displaying, via the user interface, the second graphical
representation of the rooftop with damaged areas of the rooftop
identified in the second graphical representation, wherein the
damaged areas of the rooftop are represented by a graphical
indication of the respective damaged areas, wherein the user
interface is configured to receive selection of a damaged area and
display one or more detailed images of the damaged area.
8. The system of claim 7, wherein determining the occurrence of
damaged areas of the rooftop comprises: determining whether a
feature indicative of rooftop damage exists in a digital image; and
weighting the feature to determine that the feature is indicative
of a damaged area or an undamaged area.
9. The system of claim 7, wherein a type of rooftop damage identified is one of hail damage, tornado damage, or flood damage.
10. The system of claim 7, further comprising the operations of:
receiving, via the user interface, positions about the second
graphical representation of the rooftop indicating locations of
damaged areas of the rooftop.
11. The system of claim 7, wherein waypoints above the structure
are set at an altitude such that a required image pixel resolution
for the multiple digital images of the rooftop is obtained.
12. The system of claim 7, further comprising the operations of:
displaying, via the user interface, a graphical representation of
the flight path above the first graphical representation of the
structure.
13. A computer storage medium having instructions that when
executed by one or more processors, cause the processors to perform
operations comprising: determining a flight path from a first
graphical representation of a structure to be inspected according
to which an unmanned aerial vehicle (UAV) is to navigate above the
structure and obtain sensor data describing an aerial view of the
structure, the flight path including waypoints, the graphical
representation comprising one or more images describing an aerial
view of the structure; receiving sensor data obtained by the UAV
during performance of the flight path, wherein the sensor data
includes multiple digital images describing an aerial view
depicting a rooftop of the structure, and wherein a first subset of
the digital images include a threshold level of detail of
particular areas of the rooftop, and a second subset of the digital
images include a greater than the threshold level of detail of
particular areas of the rooftop; generating a second graphical
representation of the structure based on the multiple digital
images, wherein the second graphical representation comprises
composite imagery of the rooftop based on the multiple digital
images; determining an occurrence of damaged areas of the rooftop
in the multiple digital images; and displaying, via a user
interface, the second graphical representation of the rooftop with
damaged areas of the rooftop identified in the second graphical
representation, wherein the damaged areas of the rooftop are
represented by a graphical indication of the respective damaged
areas, wherein the user interface is configured to receive
selection of a damaged area and display one or more digital images
that include greater than the threshold level of detail of the
damaged area.
14. The computer storage medium of claim 13, wherein determining
the occurrence of damaged areas of the rooftop comprises:
determining whether a feature indicative of rooftop damage exists
in a digital image; and weighting the feature to determine that the
feature is indicative of a damaged area or an undamaged area.
15. The computer storage medium of claim 13, wherein a type of rooftop damage identified is one of hail damage, tornado damage, or flood damage.
16. The computer storage medium of claim 13, further comprising the
operations of: receiving, via the user interface, positions about
the second graphical representation of the rooftop indicating
locations of damaged areas of the rooftop.
17. The computer storage medium of claim 13, wherein waypoints
above the structure are set at an altitude such that a required
image pixel resolution for the multiple digital images of the
rooftop is obtained.
18. The computer storage medium of claim 13, further comprising the
operations of: displaying, via the user interface, a graphical
representation of the flight path above the first graphical
representation of the structure.
19. A system, comprising one or more processors and a
non-transitory computer storage media storing instructions that
when executed by the one or more processors cause the one or more
processors to perform operations, comprising: displaying, via a
user interface, a first graphical representation of a structure to
be inspected, the first graphical representation comprising one or
more images describing an aerial view of the structure; determining
a flight path from the first graphical representation according to
which an unmanned aerial vehicle (UAV) is to navigate above the
structure and obtain sensor data describing an aerial view of the
structure, the flight path including waypoints; receiving first
sensor data obtained by the UAV during performance of the flight
path, and second sensor data from an adjustment of the flight path
closer to the structure in which finer resolution data of the
structure is obtained, wherein the first and second sensor data
includes multiple digital images describing an aerial view
depicting a rooftop of the structure, and wherein a subset of the
digital images are detailed images; generating a second graphical
representation of the structure based on the multiple digital
images, wherein the second graphical representation comprises
composite imagery of the rooftop based on the multiple digital
images; determining an occurrence of damaged areas of the rooftop
in the multiple digital images; and displaying, via the user
interface, the second graphical representation of the rooftop with
damaged areas of the rooftop identified in the second graphical
representation, wherein the damaged areas of the rooftop are
represented by a graphical indication of the respective damaged
areas, wherein the user interface is configured to receive
selection of a damaged area and display one or more detailed images
of the damaged area.
20. The system of claim 19, wherein determining the occurrence of
damaged areas of the rooftop comprises: determining whether a
feature indicative of rooftop damage exists in a digital image; and
weighting the feature to determine that the feature is indicative
of a damaged area or an undamaged area.
21. The system of claim 20, wherein a type of rooftop damage
identified is one of hail damage, tornado damage, or flood
damage.
22. The system of claim 20, further comprising the operations of:
receiving, via the user interface, positions about the second
graphical representation of the rooftop indicating locations of
damaged areas of the rooftop.
23. The system of claim 20, wherein waypoints above the structure
are set at an altitude such that a required image pixel resolution
for the multiple digital images of the rooftop is obtained.
24. The system of claim 20, further comprising the operations of:
displaying, via the user interface, a graphical representation of
the flight path above the first graphical representation of the
structure.
Description
CROSS REFERENCE TO RELATED APPLICATION/INCORPORATION BY
REFERENCE
[0001] The present patent application is a continuation of U.S.
patent application Ser. No. 15/475,978, filed Mar. 31, 2017, which
claims priority to U.S. patent application Ser. No. 14/591,556,
filed Jan. 7, 2015, which issued as U.S. Pat. No. 9,612,598, which
claims priority to the provisional patent application identified by
U.S. Ser. No. 61/926,137, filed on Jan. 10, 2014, the entire
contents of each of which are hereby expressly incorporated by
reference herein.
BACKGROUND
[0002] Unmanned aerial vehicles (UAVs), commonly known as drones,
are aircraft without a human pilot on board. Flight may be
controlled by computers or by remote control of a pilot located on
the ground.
[0003] Within the insurance industry, use of UAVs may aid in
obtaining evaluation estimates for structures, such as roofs, that
may be difficult to access. For example, a camera may be placed on
the UAV so that the roof of a structure may be viewed without
having to physically climb onto the roof.
[0004] The flight plan of the UAV may be based on evaluation of the
geographic area around the structure, and is generally
individualized for each structure. Currently within the industry,
flight plans and locations of image capture are manually selected by a user.
BRIEF DESCRIPTION OF THE SEVERAL VIEWS OF THE DRAWINGS
[0005] Like reference numerals in the figures represent and refer
to the same or similar element or function. Implementations of the
disclosure may be better understood when consideration is given to
the following detailed description thereof. Such description makes
reference to the annexed pictorial illustrations, schematics,
graphs, drawings, and appendices. In the drawings:
[0006] FIG. 1 is a schematic diagram of an embodiment of an
unmanned aircraft structure evaluation system according to the
instant disclosure.
[0007] FIG. 2 is an image of an unmanned aircraft with a camera
positioned about a structure of interest.
[0008] FIG. 3 is a flow chart of an exemplary embodiment of a
program logic according to the instant disclosure.
[0009] FIG. 4 is an exemplary screen shot of an oblique image of
the structure of interest shown in FIG. 2.
[0010] FIG. 5 is an exemplary diagram illustrating lateral and
vertical offset of an unmanned aircraft in relation to a structure
in accordance with the present disclosure.
[0011] FIG. 6 is an exemplary screen shot of a nadir image of the
structure of interest shown in FIG. 4, the screen shot illustrating
an exemplary flight plan for an unmanned aircraft.
[0012] FIG. 7 is another exemplary screen shot of a nadir image of
the structure shown in FIG. 6, the screen shot illustrating another
exemplary flight plan for an unmanned aircraft.
[0013] FIG. 8 is an exemplary screen shot of a nadir image of the
structure of interest shown in FIG. 4, the screen shot illustrating
a camera path of an unmanned aircraft.
[0014] FIG. 9 is an exemplary screen shot of a structure report
displayed on a display unit of a user terminal.
[0015] FIG. 10 is an exemplary screen shot of two oblique images of
a structure, each oblique image showing the structure at a distinct
time period.
DETAILED DESCRIPTION
[0016] Before explaining at least one embodiment of the inventive
concept disclosed herein in detail, it is to be understood that the
inventive concept is not limited in its application to the details
of construction and the arrangement of the components or steps or
methodologies set forth in the following description or illustrated
in the drawings. The inventive concept disclosed herein is capable
of other embodiments or of being practiced or carried out in
various ways. Also, it is to be understood that the phraseology and
terminology employed herein is for the purpose of description and
should not be regarded as limiting in any way.
[0017] In the following detailed description of embodiments of the
inventive concept, numerous specific details are set forth in order
to provide a more thorough understanding of the inventive concept.
It will be apparent to one of ordinary skill in the art, however,
that the inventive concept within the disclosure may be practiced
without these specific details. In other instances, well-known
features have not been described in detail to avoid unnecessarily
complicating the instant disclosure.
[0018] As used herein, the terms "network-based", "cloud-based" and
any variations thereof, are intended to include the provision of
configurable computational resources on demand via interfacing with
a computer and/or computer network, with software and/or data at
least partially located on the computer and/or computer network, by
pooling processing power of two or more networked processors.
[0019] As used herein, the terms "comprises", "comprising",
"includes", "including", "has", "having", or any other variation
thereof, are intended to be non-exclusive inclusions. For example,
a process, method, article, or apparatus that comprises a set of
elements is not limited to only those elements but may include
other elements not expressly listed or even inherent to such
process, method, article, or apparatus.
[0020] As used in the instant disclosure, the terms "provide",
"providing", and variations thereof comprise displaying or
providing for display a webpage (e.g., roofing webpage) to one or
more user terminals interfacing with a computer and/or computer
network(s) and/or allowing the one or more user terminal(s) to
participate, such as by interacting with one or more mechanisms on
a webpage (e.g., roofing webpage) by sending and/or receiving
signals (e.g., digital, optical, and/or the like) via a computer
network interface (e.g., Ethernet port, TCP/IP port, optical port,
cable modem, and combinations thereof). A user may be provided with
a web page in a web browser, or in a software application, for
example.
[0021] As used herein, the term "structure request", "structure
order", "flight plan request", "flight plan order", and any
variations thereof may comprise a feature of the graphical user
interface or a feature of a software application, allowing a user
to indicate to a host system that the user wishes to place an
order, such as by interfacing with the host system over a computer
network and exchanging signals (e.g., digital, optical, and/or the
like), with the host system using a network protocol, for example.
Such mechanism may be implemented with computer executable code
executed by one or more processors, for example, with a button, a
hyperlink, an icon, a clickable symbol, and/or combinations
thereof, that may be activated by a user terminal interfacing with
the at least one processor over a computer network, for
example.
[0022] Further, unless expressly stated to the contrary, "or"
refers to an inclusive or and not to an exclusive or. For example,
a condition A or B is satisfied by any one of the following: A is
true (or present) and B is false (or not present), A is false (or
not present) and B is true (or present), and both A and B are true
(or present).
[0023] In addition, the articles "a" and "an" are employed to describe elements and components of the embodiments herein. This is
done merely for convenience and to give a general sense of the
inventive concept. This description should be read to include one
or more, and the singular also includes the plural unless it is
obvious that it is meant otherwise.
[0024] Finally, as used herein any reference to "one embodiment" or
"an embodiment" means that a particular element, feature,
structure, or characteristic described in connection with the
embodiment is included in at least one embodiment. The appearances of the phrase "in one embodiment" in various places in the specification are not necessarily all referring to the same embodiment.
[0025] Referring now to FIGS. 1 and 2, shown therein is an
exemplary embodiment of an unmanned aircraft structure evaluation
system 10 according to the instant disclosure. The unmanned
aircraft structure evaluation system 10 comprises one or more host
systems 12 interfacing and/or communicating with one or more user
terminals 14 via a network 16. Generally, the one or more host
systems 12 receive identification information relating to a
structure of interest 21 (e.g., building) via the user terminals
14, and data indicative of the geographic positions of the
structure. Using the identification information and the geographic
positioning of the structure of interest 21, the one or more host
systems 12 may generate unmanned aircraft information including
flight path information, camera control information, and/or gimbal
control information. The unmanned aircraft information may be used
by an unmanned aircraft 18 to capture one or more aerial images
(e.g., oblique images) of the structure of interest 21. In some
embodiments, the flight path information, camera control
information, and/or gimbal control information may be determined
automatically by analyzing and using geo-referenced images. As such, manual manipulation and/or analysis by a user may be minimized and/or eliminated. In other embodiments, the flight path
information, camera control information and/or gimbal control
information may be determined with the aid of a user who supplies
data by clicking on one or more displayed oblique images of the
structure of interest 21 and/or otherwise inputs data into one or
more of the user terminals 14.
[0026] The structure of interest 21 may be a man-made structure,
such as a building. For example, in FIG. 2, the structure of
interest 21 is a residential building. Alternatively, the structure
may be a naturally occurring structure, such as a tree, for
example.
[0027] The unmanned aircraft 18 may be any type of unmanned aerial
vehicle that can be controlled by using a flight plan. Flight of
the unmanned aircraft 18 may be controlled autonomously as
described in further detail herein. In some embodiments, flight may
be controlled using a flight plan in combination with piloting by a
user located on the ground. An exemplary unmanned aircraft 18 may
include the Professional SR100 UAC Camera Drone manufactured and
distributed by Cadence Technology located in Singapore.
[0028] Generally, the unmanned aircraft 18 may include one or more
cameras 19 configured to provide aerial images. In some
embodiments, the camera 19 may be mounted on a gimbal support
(e.g., three-axis gimbal). Additionally, in some embodiments, the
unmanned aircraft 18 may include one or more global positioning
system (GPS) receivers, one or more inertial navigation units
(INU), one or more clocks, one or more gyroscopes, one or more
compasses, one or more altimeters, and/or the like so that the
position and orientation of the unmanned aircraft 18 at specific
instances of time can be monitored, recorded and/or stored with
and/or correlated with particular images.
[0029] The one or more cameras 19 may be capable of capturing
images photographically and/or electronically as well as recording
the time at which particular images are captured. In one
embodiment, this can be accomplished by sending a signal to a
processor (that receives time signals from the GPS) each time an
image is captured. The one or more cameras 19 may include, but are
not limited to, conventional cameras, digital cameras, digital
sensors, charge-coupled devices, and/or the like. In some
embodiments, one or more cameras 19 may be ultra-high resolution
cameras.
[0030] The one or more cameras 19 may include known or determinable
characteristics including, but not limited to, focal length, sensor
size, aspect ratio, radial and other distortion terms, principal
point offset, pixel pitch, alignment, and/or the like.
[0031] Referring to FIG. 1, the unmanned aircraft 18 may
communicate with the one or more user terminals 14. The one or more
user terminals 14 may be implemented as a personal computer, a
handheld computer, a smart phone, a wearable computer,
network-capable TV set, TV set-top box, a tablet, an e-book reader,
a laptop computer, a desktop computer, a network-capable handheld
device, a video game console, a server, a digital video recorder, a
DVD-player, a Blu-Ray player and combinations thereof, for example.
In an exemplary embodiment, the user terminal 14 may comprise an
input unit 20, a display unit 22, a processor (not shown) capable
of interfacing with the network 16, processor executable code (not
shown), and a web browser capable of accessing a website and/or
communicating information and/or data over a network, such as the
network 16. As will be understood by persons of ordinary skill in
the art, the one or more user terminals 14 may comprise one or more
non-transient memories comprising processor executable code and/or
software applications, for example.
[0032] The input unit 20 may be capable of receiving information
input from a user and/or other processor(s), and transmitting such
information to the user terminal 14 and/or to the one or more host
systems 12. The input unit 20 may be implemented as a keyboard, a
touchscreen, a mouse, a trackball, a microphone, a fingerprint
reader, an infrared port, a slide-out keyboard, a flip-out
keyboard, a cell phone, a PDA, a video game controller, a remote
control, a fax machine, a network interface, and combinations
thereof, for example. In some embodiments, the user terminal 14 is
loaded with flight management software for controlling the unmanned
aircraft 18.
[0033] The display unit 22 may output information in a form
perceivable by a user and/or other processor(s). For example, the
display unit 22 may be a server, a computer monitor, a screen, a
touchscreen, a speaker, a website, a TV set, a smart phone, a PDA,
a cell phone, a fax machine, a printer, a laptop computer, a
wearable display, and/or combinations thereof. It is to be
understood that in some exemplary embodiments, the input unit 20
and the display unit 22 may be implemented as a single device, such
as, for example, a touchscreen or a tablet. It is to be further
understood that as used herein the term user is not limited to a
human being, and may comprise a computer, a server, a website, a
processor, a network interface, a human, a user terminal, a virtual
computer, and combinations thereof, for example.
[0034] As discussed above, the system 10 may include one or more
host systems 12. The one or more host systems 12 may be partially
or completely network-based or cloud based, and not necessarily
located in a single physical location. Each of the host systems 12
may further be capable of interfacing and/or communicating with the
one or more user terminals 14 via the network 16, such as by
exchanging signals (e.g., digital, optical, and/or the like) via
one or more ports (e.g., physical or virtual) using a network
protocol, for example. Additionally, each host system 12 may be
capable of interfacing and/or communicating with other host systems
directly and/or via the network 16, such as by exchanging signals
(e.g., digital, optical, and/or the like) via one or more
ports.
[0035] It should be noted that multiple host systems 12 may be
independently controlled by separate entities. For example, in some
embodiments, system 10 may include two host systems 12 with a first
host system controlled by a first company and a second host system
controlled by a second company distinct from the first company.
[0036] The one or more host systems 12 may comprise one or more processors 24 working together, or independently, to execute processor executable code, one or more memories 26 capable of
storing processor executable code, one or more input devices 28,
and one or more output devices 30. Each element of the one or more
host systems 12 may be partially or completely network-based or
cloud-based, and not necessarily located in a single physical
location. Additionally, in embodiments having multiple host systems
12, each host system may directly communicate with additional host
systems and/or third party systems via the network 16.
[0037] The one or more processors 24 may be implemented as a single processor or a plurality of processors 24 working together, or independently, to execute the logic as described herein. Exemplary embodiments of the
one or more processors 24 include a digital signal processor (DSP),
a central processing unit (CPU), a field programmable gate array
(FPGA), a microprocessor, a multi-core processor, and/or
combinations thereof. The one or more processors 24 may be capable
of communicating with the one or more memories 26 via a path (e.g.,
data bus). The one or more processors 24 may be capable of
communicating with the input devices 28 and the output devices
30.
[0038] The one or more processors 24 may be further capable of
interfacing and/or communicating with the one or more user
terminals 14 and/or unmanned aircraft 18 via the network 16. For
example, the one or more processors 24 may be capable of
communicating via the network 16 by exchanging signals (e.g.,
digital, optical, and/or the like) via one or more physical or
virtual ports (i.e., communication ports) using a network protocol.
It is to be understood that in certain embodiments using more than
one processor 24, the one or more processors 24 may be located
remotely from one another, located in the same location, or may comprise a unitary multi-core processor (not shown). The one or
more processors 24 may be capable of reading and/or executing
processor executable code and/or of creating, manipulating,
altering, and/or storing computer data structures into one or more
memories 26.
[0039] The one or more memories 26 may be capable of storing
processor executable code. Additionally, the one or more memories
26 may be implemented as a conventional non-transient memory, such
as, for example, random access memory (RAM), a CD-ROM, a hard
drive, a solid state drive, a flash drive, a memory card, a
DVD-ROM, a floppy disk, an optical drive, and/or combinations
thereof. It is to be understood that while one or more memories 26
may be located in the same physical location as the host system 12,
the one or more memories 26 may be located remotely from the host
system 12, and may communicate with the one or more processors 24
via the network 16. Additionally, when more than one memory 26 is
used, a first memory may be located in the same physical location
as the host system 12, and additional memories 26 may be located in
a remote physical location from the host system 12. The physical
location(s) of the one or more memories 26 may be varied.
Additionally, one or more memories 26 may be implemented as a
"cloud memory" (i.e., one or more memory 26 may be partially or
completely based on or accessed using the network 16).
[0040] The one or more input devices 28 may transmit data to the
processors 24, and may be implemented as a keyboard, a mouse, a
touchscreen, a camera, a cellular phone, a tablet, a smart phone, a
PDA, a microphone, a network adapter, a wearable computer and/or
combinations thereof. The input devices 28 may be located in the
same physical location as the host system 12, or may be remotely
located and/or partially or completely network-based.
[0041] The one or more output devices 30 may transmit information
from the processor 24 to a user, such that the information may be
perceived by the user. For example, the output devices 30 may be
implemented as a server, a computer monitor, a cell phone, a
tablet, a speaker, a website, a PDA, a fax, a printer, a projector,
a laptop monitor, a wearable display and/or combinations thereof.
The output device 30 may be physically co-located with the host
system 12, or may be located remotely from the host system 12, and
may be partially or completely network based (e.g., website). As
used herein, the term "user" is not limited to a human, and may
comprise a human, a computer, a host system, a smart phone, a
tablet, and/or combinations thereof, for example.
[0042] The network 16 may permit bi-directional communication of
information and/or data between the one or more host systems 12,
the user terminals 14 and/or the unmanned aircraft 18. The network
16 may interface with the one or more host systems 12, the user
terminals 14, and the unmanned aircraft 18 in a variety of ways. In
some embodiments, the one or more host systems 12, the user
terminals 14 and/or the unmanned aircraft 18 may communicate via a
communication port. For example, the network 16 may interface by
optical and/or electronic interfaces, and/or may use a plurality of
network topologies and/or protocols including, but not limited
to, Ethernet, TCP/IP, circuit switched paths, and/or combinations
thereof. For example, the network 16 may be implemented as the
World Wide Web (or Internet), a local area network (LAN), a wide
area network (WAN), a metropolitan network, a wireless network, a
cellular network, a GSM-network, a CDMA network, a 3G network, a 4G
network, a satellite network, a radio network, an optical network,
a cable network, a public switched telephone network, an Ethernet
network, and/or combinations thereof. Additionally, the network 16
may use a variety of network protocols to permit bi-directional
interface and/or communication of data and/or information between
the one or more host systems 12, the one or more user terminals 14
and/or the unmanned aircraft 18.
[0043] In some embodiments, the one or more host systems 12, the
user terminals 14, and/or the unmanned aircraft 18 may communicate
by using a non-transitory computer readable medium. For example,
data obtained from the user terminal 14 may be stored on a USB
flash drive. The USB flash drive may be transferred to and received
by the unmanned aircraft 18 thereby communicating information, such
as the unmanned aircraft information including flight path
information, camera control information, and/or gimbal control
information from the user terminal 14 to the unmanned aircraft 18.
The USB flash drive may also be used to transfer images captured by
the camera 19, as well as position, orientation and time data, to the user terminal(s) 14.
[0044] Referring to FIGS. 1 and 2, the one or more memories 26 may
store processor executable code and/or information comprising a
structure database 32, one or more images databases 34, and program
logic 36. The processor executable code may be stored as a data
structure, such as a database and/or a data table, for example. In
some embodiments, one or more memories of the user terminal 14 may
include a structure database 32, one or more image databases 34 and
program logic 36 as described in further detail herein.
[0045] The structure database 32 may include information (e.g.,
location, GIS data) about the structure of interest. For example,
the structure database 32 may store identification information
about the structure including, but not limited to, address,
geographic location, latitude/longitude, and/or the like.
[0046] The one or more memories 26 may include one or more image
databases 34. The one or more image databases 34 may store
geo-referenced imagery. Such imagery may be represented by a single
pixel map, and/or by a series of tiled pixel maps that when
aggregated recreate the image pixel map. Imagery may include nadir,
ortho-rectified and/or oblique geo-referenced images. The one or
more processors 24 may provide the images via the image database 34
to users at the one or more user terminals 14. In some embodiments,
one or more image databases 34 may be included within the user
terminals 14.
[0047] The one or more memories 26 may further store processor
executable code and/or instructions, which may comprise the program
logic 36. The program logic 36 may comprise processor executable
instructions and/or code, which when executed by the processor 24,
may cause the processor 24 to execute image display and analysis
software to generate, maintain, provide, and/or host a website
providing one or more structure evaluation requests, for example.
The program logic 36 may further cause the processor 24 to collect
identification information about the structure of interest 21
(e.g., address), allow one or more users to validate a location of
the structure, obtain geographical positions of the structure, and
the like, as described herein.
[0048] Referring to FIG. 3, shown therein is an exemplary flow
chart 40 of program logic 36 for creating a structure evaluation
report according to the instant disclosure. Program logic 36 may
comprise executable code, which when executed by the one or more
processors 24 may cause the one or more processors 24 to execute
one or more of the following steps.
[0049] In a step 42, the one or more host systems 12 may receive
identification information of the structure from the user terminal
14. For example, the one or more host systems 12 may receive the
address of the structure, geographic location of the structure
(e.g., X, Y, Z coordinates, latitude/longitude coordinates), a
location of the user terminal 14 determined by a Global Positioning System (GPS) and/or the like.
[0050] In some embodiments, the user may validate the location of
the structure of interest 21. One or more processors 24 may provide
one or more images via the image database 34 to the display unit 22
of the user terminal 14. For example, FIG. 4 illustrates an
exemplary screen shot 60 of an oblique image 62 of the structure of
interest 21 that may be displayed on the display unit 22 of the
user terminal 14, shown in the block diagram of FIG. 1. The one or
more images 62 may be geo-referenced images illustrating portions
or all of the structure of interest 21. Referring to FIGS. 1 and 4,
the program logic 36 may cause the processor 24 to provide users
the one or more geo-referenced images 62 (e.g., via the display
unit 22), and allow the user to validate the location of the
structure of interest 21 (e.g., via the input unit 20). For
example, the user may be able to use a drag-and-drop element
provided by the program logic 36 via user terminal 14 to select the
structure of interest 21 within the one or more geo-referenced
images 62. Selection of the structure of interest 21 within the one
or more geo-referenced images 62 may provide one or more validated
images and a validated location of the structure of interest. It
should be noted, that in some embodiments, the program logic of the
user terminal 14, with or in lieu of the program logic 36 of the
processor 24, may provide users the one or more geo-referenced
images 62 to allow for validation of the location of the structure
of interest 21.
[0051] In some embodiments, validation of the geo-referenced images
may be provided by one or more additional host systems via the one
or more processors 24 in lieu of, or in combination with host
system 12. For example, the host system 12 may direct the user to a
second host system wherein one or more processors of the second
host system may provide geo-referenced images 62 from the image database to the user for validation of one or more structures of
interest 21.
[0052] In some embodiments, the geographic location may include
coordinates, and validation of the geographic location may be
provided by the user by altering one or more coordinates of the
geographic location. Users may alter the one or more coordinates by
methods including, but not limited to, manual manipulation,
drag-and-drop elements, and the like.
[0053] In some embodiments, location of the structure of interest
21 may be automatically determined by location of the user terminal
14. For example, a user may be physically present at the structure
of interest 21, and the user may be holding the user terminal 14
which determines its location using any suitable technology, such
as GPS. Using location coordinates of the user terminal 14, the
location of the structure of interest 21 may be determined.
[0054] In a step 44, a footprint of the structure of interest 21
may be determined. The footprint may provide a two-dimensional
boundary (e.g., sides) and/or outline of the structure of interest
21. For example, the outline of the structure of interest 21 may be
determined using systems and methods including, but not limited to,
those described in U.S. Patent Publication No. 2010/0179787, now
U.S. Pat. No. 8,145,578; U.S. Patent Publication No. 2010/0110074,
now U.S. Pat. No. 8,170,840; U.S. Patent Publication No.
2010/0114537, now U.S. Pat. No. 8,209,152; U.S. Patent Publication
No. 2011/0187713; U.S. Pat. No. 8,078,436; and U.S. Ser. No.
12/909,692, now U.S. Pat. No. 8,977,520; all of which are
incorporated by reference herein in their entirety. In some
embodiments, the footprint of the structure of interest 21 may be
provided to the user via the display unit 22. For example, in some
embodiments, the footprint of the structure of interest 21 may be
displayed as a layer on one or more images (e.g., nadir image) via
the display unit 22.
[0055] In some embodiments, the one or more processors 24 may
provide, via the display unit 22, one or more websites to the user
for evaluation of multiple oblique images to provide the footprint
of the structure of interest 21. For example, the user and/or the
processors 24 may identify edges of the structure of interest 21.
Two-dimensional and/or three-dimensional information regarding the
edges (e.g., position, orientation, and/or length) may be obtained
from the images using user selection of points within the images
and the techniques taught in U.S. Pat. No. 7,424,133, and/or
stereo-photogrammetry. Using the two-dimensional and/or
three-dimensional information (e.g., position, orientation, and/or
length), line segments may be determined with multiple line
segments forming at least a portion of the footprint of the
structure of interest 21.
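For illustration only, the line segments that make up the footprint might be assembled from ordered corner points along the following lines; this is a minimal Python sketch under an assumed local x/y ground coordinate system, and the function names are hypothetical rather than part of the methods cited above.

```python
from typing import List, Tuple

Point = Tuple[float, float]  # x, y in a local ground coordinate system (illustrative)

def footprint_segments(corners: List[Point]) -> List[Tuple[Point, Point]]:
    """Join ordered corner points into the closed set of line segments
    forming the footprint outline."""
    return [(corners[i], corners[(i + 1) % len(corners)]) for i in range(len(corners))]

def footprint_perimeter(corners: List[Point]) -> float:
    """Sum the lengths of the footprint's line segments."""
    return sum(
        ((x2 - x1) ** 2 + (y2 - y1) ** 2) ** 0.5
        for (x1, y1), (x2, y2) in footprint_segments(corners)
    )

# A simple rectangular footprint, 40 ft by 30 ft:
print(footprint_perimeter([(0.0, 0.0), (40.0, 0.0), (40.0, 30.0), (0.0, 30.0)]))  # 140.0
```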
[0056] In a step 46, data indicative of geographic positions
pertaining to the footprint of the structure of interest 21 and/or
structure height information may be obtained. For example, in some
embodiments, the height of structure of interest 21 above the
ground may be determined. The height of the structure of interest
21 above the ground may aid in determining altitude for the flight
plan of the unmanned aircraft 18 as discussed in further detail
herein. Measurements of the geographic positions of the structure
of interest 21, such as a vertical structure, may include
techniques as described in U.S. Pat. No. 7,424,133, which is hereby
incorporated herein by reference in its entirety. The term
"vertical structures", as used herein includes structures that have
at least one portion of one surface that is not fully horizontal.
For example, "vertical structures" as described herein includes
structures that are fully vertical and structures that are not
fully vertical, such as structures that are pitched at an angle
and/or that drop into the ground. The side of a structure is not
limited to only one or more walls of the structure of interest 21,
but may include all visible parts of the structure of interest 21
from one viewpoint. For instance, when the present disclosure is
discussing a structure of interest 21, such as a house, a "side" or
"vertical side" includes the wall of the house and the roof above
the wall up to the highest point on the house.
[0057] In some embodiments, more than one height may be used. For
example, if the structure of interest 21 is a split-level building
having a single story part and a two story part, a first height may
be determined for the first story and a second height may be
determined for the second story. Altitude for the flight path of
the unmanned aircraft 18 may vary based on the differing heights of
the structure of interest 21.
[0058] In some embodiments, using the input unit 20 and/or the
display unit 22, the user may give additional details regarding
geographic positions pertaining to the outline of the structure of
interest 21 and/or structure height information. For example, if
the structure of interest 21 is a roof of a building, the user may
include identification of areas such as eaves, drip edges, ridges,
and/or the like. Additionally, the user may manually give values
for pitch, distance, angle, and/or the like. Alternatively, the one
or more processors 24 may evaluate imagery and determine areas
including eaves, drip edges, ridges and/or the like without manual
input of the user.
[0059] In a step 48, using the footprint, height, and possibly
additional geographic positions or information pertaining to the
structure of interest 21 including the geographic location of
obstructions in potential flight paths such as trees and utility
wires, unmanned aircraft information may be generated by the one or
more host systems 12 and/or the user terminal 14. The unmanned
aircraft information may include flight path information, camera
control information, and/or gimbal control information.
[0060] Flight path information may be configured to direct the
unmanned aircraft 18 to fly a flight path around the structure of
interest 21. In some embodiments, a flight path may be displayed to
the user on one or more images (e.g., nadir, oblique) via the
display unit 22. For example, FIG. 6 illustrates an exemplary
screen shot 66 of a nadir image 68 showing a flight path 70 about
the structure of interest 21. In some embodiments, the flight path
70 may be displayed as a layer overlapping the nadir image 68 of
the structure of interest 21 on the display unit 22 of FIG. 1.
[0061] Generally, the flight path information directs the unmanned
aircraft 18 in three dimensions. Referring to FIGS. 5 and 6, the
flight path information may be determined such that the flight path
70 around the structure of interest 21 is laterally and/or
vertically offset from the geographic positions of the outline of
the structure of interest 21. In particular, lateral offset
L.sub.OFFSET and vertical offset V.sub.OFFSET may be dependent upon
the height H of the structure 21, orientation of the camera
relative to the unmanned aircraft 18, and characteristics of the
camera 19.
[0062] Referring to FIG. 5, generally in determining offset from
the structure 21, the field of view (FOV) of the camera 19 may be
positioned such that a center C.sub.1 is at one half the height H
of the structure 21, for example. Additionally, one or more buffer
regions B may be added to the FOV. Buffer regions B may increase
the angle of the FOV by a percentage. For example, buffer regions
B.sub.1 and B.sub.2 illustrated in FIG. 5 may increase the angle of
the FOV by 20-50%. To determine the lateral offset L.sub.OFFSET and
the vertical offset V.sub.OFFSET of the camera 19 from the
structure 21, a predetermined angle .theta. within a range of 25-75
degrees may be set. Once the angle .theta. is set, the lateral
offset L.sub.OFFSET and the vertical offset V.sub.OFFSET of the
camera 19 relative to the structure 21 may be determined using
trigonometric principles, for example. For example, lateral offset
L.sub.OFFSET may be determined based on the following equation:
L.sub.OFFSET=C.sub.1*sin(.theta.) (EQ. 1)
wherein C.sub.1 is the centerline of the field of view FOV. The
vertical offset V.sub.OFFSET may be determined based on the
following equation:
V.sub.OFFSET=C.sub.1*cos(.theta.) (EQ. 2)
[0063] wherein C.sub.1 is the centerline of the field of view
FOV.
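A minimal Python sketch of EQ. 1 and EQ. 2 follows, treating C.sub.1 as the length of the camera's centerline of the field of view to the structure and .theta. as the predetermined angle; the function name and units are assumptions made for illustration.

```python
import math

def camera_offsets(centerline_distance_ft, theta_deg):
    """Apply EQ. 1 and EQ. 2: given C1 (the distance along the camera's
    centerline of the field of view) and the predetermined angle theta
    (typically 25-75 degrees), return the lateral and vertical offsets."""
    theta = math.radians(theta_deg)
    lateral_offset = centerline_distance_ft * math.sin(theta)   # EQ. 1
    vertical_offset = centerline_distance_ft * math.cos(theta)  # EQ. 2
    return lateral_offset, vertical_offset

# Example: a 100 ft centerline distance at a 45-degree angle yields
# lateral and vertical offsets of roughly 70.7 ft each.
print(camera_offsets(100.0, 45.0))
```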
[0064] The flight path information may optionally direct the roll,
pitch and yaw of the unmanned aircraft 18. For example, some
versions of the unmanned aircraft 18 may not have a multi-axis
gimbal and as such, can be directed to aim the camera 19 by
changing the yaw, pitch or roll of the unmanned aircraft 18. The
current yaw, pitch and roll of the unmanned aircraft 18 may be
measured using a position and orientation system that is a part of
the unmanned aircraft 18. In some embodiments, the position and
orientation system may be implemented using microelectromechanical
based accelerometers and/or microelectromechanical based
gyrometers.
[0065] In many cases, there may be obstacles that lie along the
flight path. Some of those obstacles may be able to be detected by
the system through use of the imagery. In some embodiments, the
flight path 70 may be determined such that interference with
outside elements (e.g., trees and telephone wires) may be
minimized. For example, FIG. 7 illustrates a variation of the
flight path 70 determined in FIG. 6, wherein the flight path 70a of
FIG. 7 minimizes interference by following the outline of the
structure of interest 21.
[0066] A ground confidence map, as described in U.S. Pat. No.
8,588,547, which disclosure is hereby incorporated herein by
reference, could be used to identify objects for which there is a
high degree of confidence that the object lies elevated off of the
ground. Auto-correlation and auto-aerial triangulation methods
could then be used to determine the heights of these potential
obstructions. If the flight path would go through one of these
obstructions, it could be flagged and the algorithm could then
attempt to find the best solution for getting past the
obstructions: either flying closer to the structure of interest 21
as shown in FIG. 7, which might necessitate additional passes due
to a finer resolution and therefore smaller path width, or by
flying over the obstruction and aiming the camera 19 at a steeper
oblique angle, which again may require an adjustment to the flight
path to ensure full coverage. For any flight paths that are flagged
for possible obstructions, a system operator could validate the
corrective route chosen and alter it as necessary.
[0067] In addition to those obstacles that are identified within
the image, there may also be obstacles that cannot be identified in
the image. These could be newer trees or structures that were not
in the original images used for flight planning, wires or other
objects that may not show up in the images in enough detail to be
able to determine their location, or other unexpected obstacles. As
such, the unmanned aircraft 18 may also incorporate a collision
detection and avoidance system in some embodiments. The collision
detection and avoidance system could either be imaging based, or
active sensor based. When an obstacle lies along the Flight Path,
the software guiding the unmanned aircraft 18 could first attempt
to move closer to the structure of interest 21 along the path from
the Flight Path to the Target Path. If, after a suitable threshold, which may be set at 10% of the distance (104' in the above examples, so 10% being 10.4') so that the 20% overlap still ensures complete coverage, the unmanned aircraft 18 is unable to bypass the obstacle, the collision detection and avoidance system would steer the unmanned aircraft 18 back to its original point of collision detection and would then attempt to fly above the obstacle.
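The bypass-then-climb behavior described above can be sketched as follows. This is an illustrative Python outline only, assuming the stated 10% threshold; the callback parameters stand in for the collision detection and avoidance sensing and are hypothetical.

```python
def avoid_obstacle(offset_distance_ft, can_pass_at_fraction, can_pass_above):
    """Decide how to get past an obstacle detected on the Flight Path.

    First try stepping inward (toward the Target Path) by up to 10% of the
    lateral offset, so the 20% image overlap is preserved; if that fails,
    return to the detection point and try to fly above the obstacle.
    can_pass_at_fraction and can_pass_above are hypothetical callbacks that
    stand in for the collision detection and avoidance sensing."""
    max_inward_fraction = 0.10   # e.g., 10% of a 104 ft offset is 10.4 ft
    step = 0.02
    fraction = step
    while fraction <= max_inward_fraction:
        if can_pass_at_fraction(fraction):
            return "bypassed inward at %.1f ft" % (fraction * offset_distance_ft)
        fraction += step
    if can_pass_above():
        return "bypassed by flying above the obstacle"
    return "operator intervention required"

# Example: the obstacle can be cleared after moving 8% of the offset inward.
print(avoid_obstacle(104.0, lambda f: f >= 0.08, lambda: True))
```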
[0068] Since the software controlling the unmanned aircraft 18
keeps the camera 19 aimed at the Target Path, flying higher may
still capture the necessary portions of the structure of interest
21; but the oblique down-look angle may change and the resolution
may become a bit coarser. In extreme circumstances, the unmanned
aircraft 18 may require operator intervention to properly negotiate
around the obstacle. In these cases, the software running on a
processor of the unmanned aircraft 18 would transmit a signal to
the operator in the form of an audible alarm, for example, and
allow the operator to steer the unmanned aircraft 18 around the
obstacle. As the unmanned aircraft 18 passes the Flight Capture
Points, the camera(s) 19 would fire. To ensure this, the Flight
Capture Points are not just points, but may be a vertical plane
that is perpendicular to the Flight Path and that passes through
the Flight Capture Point. Thus, even if the unmanned aircraft 18 is
30' above or away from the Flight Path at the time, as it passes
through that plane, and thus over or to the side of the Flight
Capture Point, the software controlling the unmanned aircraft 18
would cause the camera 19 to fire.
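One way to realize such a plane-crossing trigger is to watch the sign of the aircraft's along-path distance to each Flight Capture Point. The Python sketch below assumes simple Cartesian coordinates and a known direction of travel, which are simplifications for illustration rather than the actual flight software. Treating the capture point as a plane rather than a single point keeps the trigger robust to the vertical or lateral deviations noted above.

```python
import numpy as np

def crossed_capture_plane(prev_pos, curr_pos, capture_point, path_direction):
    """Return True when the aircraft passes through the vertical plane that is
    perpendicular to the Flight Path and contains the Flight Capture Point.

    Positions are simple 3-D coordinates and path_direction is the direction
    of travel along the Flight Path (both illustrative simplifications)."""
    d = np.asarray(path_direction, dtype=float)
    d /= np.linalg.norm(d)
    # Signed along-path distance of each position from the capture point.
    s_prev = float(np.dot(np.asarray(prev_pos, dtype=float) - np.asarray(capture_point, dtype=float), d))
    s_curr = float(np.dot(np.asarray(curr_pos, dtype=float) - np.asarray(capture_point, dtype=float), d))
    return s_prev < 0.0 <= s_curr  # fire the camera on the first sample past the plane

# Even when 30 ft above or to the side of the planned path, crossing the plane
# still triggers the capture:
print(crossed_capture_plane([0, -5, 30], [0, 5, 30], [0, 0, 0], [0, 1, 0]))  # True
```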
[0069] The camera control information may be loaded into the
software running on the processor of the unmanned aircraft 18 to
control actuation of the camera 19 of the unmanned aircraft 18. For
example, the camera control information may direct the camera 19 to
capture images (e.g., oblique images) at one or more predefined
geographic locations 74 (which are referred to herein below as
Flight Capture Points), as illustrated in screen shot 72 of FIG. 8.
In some embodiments, the camera control information may direct the
camera 19 to capture images on a schedule (e.g., periodic, random).
Further, the camera control information may control camera
parameters including, but not limited to, zoom, focal length,
exposure control and/or the like.
[0070] The gimbal control information may be loaded into the
software running on the processor of the unmanned aircraft 18 to
control the direction of the camera 19 relative to the structure of
interest 21. For example, the gimbal control information may
control the orientation of the camera 19 in three dimensions such that, during capture of an image, the camera 19 is aligned with pre-determined locations on the structure of interest 21 that are referred to below as Target Capture Points.
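As a hedged illustration, a gimbal orientation toward a Target Capture Point could be computed from the relative positions of the aircraft and the target, for example as yaw and pitch angles; the east-north-up coordinate convention below is an assumption made only for this sketch and is not specified by the disclosure.

```python
import math

def gimbal_angles(aircraft_pos, target_point):
    """Aim the camera at a Target Capture Point: return (yaw_deg, pitch_deg)
    in an east-north-up frame, with yaw measured clockwise from north and
    pitch negative for a downward look angle (convention assumed here)."""
    dx = target_point[0] - aircraft_pos[0]  # east
    dy = target_point[1] - aircraft_pos[1]  # north
    dz = target_point[2] - aircraft_pos[2]  # up
    yaw = math.degrees(math.atan2(dx, dy)) % 360.0
    pitch = math.degrees(math.atan2(dz, math.hypot(dx, dy)))
    return yaw, pitch

# Aircraft 70 ft east of and 55 ft above a point at mid-height of the structure:
print(gimbal_angles((70.0, 0.0, 70.0), (0.0, 0.0, 15.0)))  # about (270.0, -38.2)
```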
[0071] In a step 50, the unmanned aircraft information may be
stored on one or more non-transitory computer readable medium of
the host system 12 and/or user terminal 14. For example, in some
embodiments, the host system 12 may determine the unmanned aircraft
information, communicate the unmanned aircraft information to the
user terminal 14 via the network 16, such that the unmanned
aircraft information may be stored on one or more non-transitory
computer readable medium. Alternatively, the user terminal 14 may
determine the unmanned aircraft information and store the unmanned
aircraft information on one or more non-transitory computer
readable medium. In some embodiments, the one or more
non-transitory computer readable medium may include a USB flash
drive or other similar data storage device.
[0072] In a step 52, the unmanned aircraft information may be
loaded onto the unmanned aircraft 18. For example, the unmanned
aircraft information may then be loaded onto the unmanned aircraft
18 via transfer of the non-transitory computer readable medium
(e.g., USB flash drive) from the user terminal 14. It should be
noted that the unmanned aircraft information may be loaded and/or
stored onto the unmanned aircraft 18 by any communication,
including communication via the network 16.
[0073] The unmanned aircraft 18 may use the unmanned aircraft
information to capture one or more oblique images of the structure
of interest 21. Generally, the unmanned aircraft 18 may follow the
flight path within the unmanned aircraft information, obtaining the one or more oblique images as set out within the camera control information and gimbal control information. In some embodiments, a
user may manually manipulate the flight path 70 of the unmanned
aircraft information during flight of the unmanned aircraft 18. For
example, the user may request the unmanned aircraft 18 to add an
additional flight path 70 or repeat the same flight path 70 to
obtain additional images.
[0074] In a step 54, the one or more processors 24 may receive one
or more oblique images captured by the unmanned aircraft 18. The
flight path information, camera control information and gimbal
control information may direct the unmanned aircraft 18 to capture
one or more oblique images at predetermined locations and times as
described herein. The one or more oblique images may be
communicated to the one or more processors 24 via the network 16 and/or stored on one or more non-transitory computer readable medium.
The one or more oblique images may be stored in one or more image
database 34. In some embodiments, the one or more oblique images
may be communicated to the user terminal 14, and the user terminal
14 may communicate the images to the one or more processors 24.
[0075] In a step 56, the one or more processors 24 may generate a
structure report. The program logic 36 may provide for one or more
user terminals 14 interfacing with the processor 24 over the
network 16 to provide one or more structure report website pages
allowing users to view the structure report. For example, FIG. 9
illustrates an exemplary screen shot 76 of a structure report 78 on
the display unit 22 of a user terminal 14.
[0076] One or more images 80 obtained from the camera 19 of the
unmanned aircraft 18 may be used for evaluation of the structure of
interest 21 for the structure report 78. For example, if the
structure of interest 21 is a building, the images obtained from
the camera 19 may be used in an insurance evaluation (e.g., flood
damage, hail damage, tornado damage).
[0077] One or more images 80 obtained from the camera may be
provided in the structure report 78. For example, the structure
report 78 in FIG. 9 includes an image data set 82. The image data
set 82 may include nadir and/or oblique images 80 of the structure
of interest 21. Additionally, the image data set 82 may include one
or more images 80 of objects of interest on and/or within the
structure of interest 21. For example, if the structure report 78
details damage to a roof of the structure of interest 21, one or
more images 80 of damage to the roof may be included within the
image data set 82. In some embodiments, third party images of the
structure of interest 21 may be included within the structure
report 78.
[0078] Structural details may be provided in the structure report
78 within a structure data set 84 as illustrated in FIG. 9. The
structure data set 84 may include information related to the structure of interest 21 including, but not limited to, area of the structure
of interest 21 (e.g., square feet), roof details (e.g., pitch,
ridge length, valley length, eave length, rake length), height of
the structure of interest 21, and/or the like. Additionally, the
structure data set 84 may include order information for the
structure report 78. For example, the structure data set 84 may
include information regarding the time an order for the structure
report 78 was placed, the time the order for the structure report
78 was completed, the delivery mechanism for the structure report
78, the price of the order for the structure report 78, and/or the like.
[0079] Based on the flight path information, camera control
information, and gimbal control information, during image capture,
the location of the camera 19 relative to the structure of interest
21 for images captured may also be known. For example, in some
embodiments, the X, Y, Z location (e.g., latitude, longitude, and
altitude) of a location seen within each image may be determined.
The information may be used to further evaluate objects on and/or
within the structure of interest 21. In some embodiments, images 80
captured by the unmanned aircraft 18 may be used to generate a two
or three-dimensional model of the structure of interest 21.
[0080] The unmanned aircraft structure evaluation system 10 may be
used as follows.
[0081] An insurance adjustor or other field operator would arrive
at the house being assessed for damage or for underwriting. He
would go to an online application on a portable networked computer
device (e.g., user terminal 14), such as a tablet, smart phone, or
laptop, and select the property and structure of interest 21. This
selection could be done with identification information, such as a
GPS determining his current location, through entering a street
address into the search bar, through entering the geographic
location into the user terminal 14, through scrolling on a map or
aerial image displayed on the user terminal 14 of the current
location, or through a preselected target property made by
virtually any method that results in finding the property and
storing it for later retrieval.
[0082] Once the location is found, an image or 3-D Model for that
property and structure of interest 21 is displayed on the screen.
An oblique image, or a street side image, would provide more
information to the operator for property verification as
traditional orthogonal images do not include any portion of the sides of the structure. The 3D model (which may be textured with an
oblique or street side image) would work as well. The operator
verifies that the property and structure of interest 21 on the
screen matches the property and structure of interest 21 that he is
standing in front of to ensure that the operator generates the
proper report.
[0083] The operator then clicks on the structure of interest 21 and
requests a flight plan for that structure of interest 21. Software,
running on either or both of the user terminal 14 and the host
system 12 then isolates the structure of interest 21 and generates
an outline as described above. The software also causes the user terminal 14 to determine the height H of the structure,
either by using an automated method, or by having the operator use
a height tool on the oblique image, such as through the method
described in U.S. Pat. No. 7,424,133. This height H is then used to
automatically determine the proper flying height, lateral offset
L.sub.OFFSET, and vertical offset V.sub.OFFSET for the flight path for the unmanned aircraft 18 (which may be an unmanned
aerial system). The height H may also be used to aim the steerable
camera 19 carried by the unmanned aircraft 18.
[0084] In this embodiment, first, a "Target Path" is generated that
follows the path of the perimeter of the structure 21 and that is
at a height over ground such that a center C.sub.1 of the field of
view may be located at one-half the height of the structure of
interest 21 as illustrated in FIG. 5. Thus, if it is a
two-and-a-half story structure of 28' height, the Target Path would
be generated such that the center C.sub.1 of the field of view may
be at 14' height over ground. It should be understood, however, that the height over ground does not have to place the center C.sub.1 of the field of view at one-half the height of the structure of interest 21 and can vary.
[0085] Next, characteristics of the camera 19 may be used, such as,
for example, the desired effective resolution of the image as well
as the overall sensor size of the camera 19 onboard the unmanned
aircraft 18, to determine the maximum vertical swath width that may
be captured on a single pass. So, for instance, if the desired effective image resolution is 1/4'' GSD, and the sensor has 6,000 pixels in the vertical orientation, then the maximum vertical swath width would be 1,500'' or 125'. A significant buffer B may be
subtracted out to allow for position and orientation errors when
flying, for buffeting due to wind, and for absolute position errors
in the reference imagery. The size of the buffer B can vary, but
can be about a 20% buffer on all sides of the imagery. As such, in
this example, the maximum vertical swath width would be 75'. If the
structure of interest 21 has a greater height H than this, then the
structure of interest 21 may need to be captured in multiple
passes. If so, using the same example numbers above, the first pass
would be captured at 37.5' above ground, the second at 112.5' above
ground, the third at 187.5' above ground, and so on until the
entire structure of interest 21 is covered.
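The swath and pass calculation described above can be illustrated with the following Python sketch. It assumes the buffer B is removed from both the top and the bottom of the frame; the function names and the 6,000-pixel vertical dimension are example values chosen only to reproduce the 75' usable swath and the 37.5'/112.5'/187.5' pass heights of the example:

def max_swath_ft(gsd_in, vertical_pixels, buffer_fraction=0.20):
    # Usable vertical swath per pass: pixel count times GSD (inches),
    # converted to feet, less the buffer taken from top and bottom.
    raw_ft = vertical_pixels * gsd_in / 12.0
    return raw_ft * (1.0 - 2.0 * buffer_fraction)

def pass_altitudes_ft(structure_height_ft, swath_ft):
    # Center height over ground for each pass until the full structure
    # height is covered.
    altitudes, covered = [], 0.0
    while covered < structure_height_ft:
        altitudes.append(covered + swath_ft / 2.0)
        covered += swath_ft
    return altitudes

swath = max_swath_ft(0.25, 6000)          # 125' of raw coverage -> 75' usable
print(swath)                               # 75.0
print(pass_altitudes_ft(200.0, swath))     # [37.5, 112.5, 187.5]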
[0086] If the structure of interest 21 is smaller than the maximum
vertical swath width, then the resolution can be increased beyond
the desired effective image resolution. So in the above example of
the two-and-a-half story house, the resolution could be switched to 1/8'' GSD, which would yield a maximum swath width of 37.5', which is more
than sufficient to cover the 28' of structure height while still
including the 20% buffer B on all sides.
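One simple policy for that refinement, offered only as an illustrative sketch (repeatedly halving the GSD is an assumption, not the only possibility), is to keep increasing the resolution while the buffered swath still covers the structure height:

def finest_gsd_in(structure_height_ft, vertical_pixels,
                  max_gsd_in=0.25, buffer_fraction=0.20):
    # Halve the GSD (double the resolution) as long as the buffered swath
    # still covers the structure height in a single pass.
    gsd = max_gsd_in
    while True:
        candidate = gsd / 2.0
        usable_ft = vertical_pixels * candidate / 12.0 * (1.0 - 2.0 * buffer_fraction)
        if usable_ft < structure_height_ft:
            return gsd
        gsd = candidate

print(finest_gsd_in(28.0, 6000))   # 0.125, i.e. 1/8'' GSD with a 37.5' usable swath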
[0087] Once the effective image resolution has been determined, the
lateral offset L.sub.OFFSET and vertical offset V.sub.OFFSET can
then be determined by calculating the path length that achieves the
determined resolution. For instance, with a 5-micron sensor pitch
size and a 50-mm lens, the path length would be 104'. If the
desired imagery is to be captured at a .theta. of 40-degrees (an
angle from 40-degrees to 50-degrees down from horizontal is
typically optimal for oblique aerial imagery) then that translates
to a lateral offset L.sub.OFFSET of 79.6' stand-off distance
(cosine of 40.times.104') and a vertical offset V.sub.OFFSET of
66.8' vertical height adjustment (sine of 40.times.104').
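That calculation can be written out as a short Python sketch; the function name and unit conversions are illustrative, and the small differences from the 79.6'/66.8' figures arise only because the example above rounds the path length to 104' before applying the cosine and sine:

import math

def standoff_and_offsets(gsd_in, pixel_pitch_m, focal_length_m, theta_deg):
    # Path length that achieves the desired GSD for a given pixel pitch
    # and focal length, plus the lateral and vertical offsets for an
    # oblique down-look angle theta.
    gsd_m = gsd_in * 0.0254
    path_length_ft = (gsd_m * focal_length_m / pixel_pitch_m) / 0.3048
    lateral_ft = math.cos(math.radians(theta_deg)) * path_length_ft
    vertical_ft = math.sin(math.radians(theta_deg)) * path_length_ft
    return path_length_ft, lateral_ft, vertical_ft

# 1/8'' GSD, 5-micron pitch, 50-mm lens, 40-degree down-look angle:
print(standoff_and_offsets(0.125, 5e-6, 0.050, 40.0))
# approximately (104.2, 79.8, 67.0) feet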
[0088] Using the Target Path as a starting point, the path would
now be grown by the requisite lateral offset L.sub.OFFSET and
vertical offset V.sub.OFFSET distance using standard geometry or
morphological operators to create the Flight Path. For instance, if
the target path were a perfect circle, the radius would be extended
by the 79.6' lateral offset L.sub.OFFSET distance. If the target
path were a rectangle, each side would be extended outward by the
79.6' lateral offset L.sub.OFFSET distance. The flying altitude for
the Flight Path would be determined by adding the vertical offset
V.sub.OFFSET distance to the height of the Target Path and then
adding that to the ground elevation for the starting point of the
flight path. So in the example of the 28' house, the flight
altitude would be the sum of the 14' Target Path height over
ground, the 66.8' vertical offset V.sub.OFFSET for the desired
resolution, and the base elevation at the start, which for this
example will be 280' above ellipsoid. Thus, the resulting flight
height would be 360.8' above ellipsoid.
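For the simple case of an axis-aligned rectangular Target Path, the growing step and the altitude sum can be sketched as follows; the rectangle representation is an illustrative stand-in for the general geometric or morphological grow operation described above:

def grow_rectangle(min_x, min_y, max_x, max_y, lateral_offset_ft):
    # Push each side of a rectangular Target Path outward by the lateral
    # offset to obtain the horizontal footprint of the Flight Path.
    return (min_x - lateral_offset_ft, min_y - lateral_offset_ft,
            max_x + lateral_offset_ft, max_y + lateral_offset_ft)

def flight_altitude_ft(target_path_height_ft, vertical_offset_ft,
                       ground_elevation_ft):
    # Flying altitude above the ellipsoid for the Flight Path.
    return target_path_height_ft + vertical_offset_ft + ground_elevation_ft

print(grow_rectangle(0.0, 0.0, 40.0, 30.0, 79.6))
print(flight_altitude_ft(14.0, 66.8, 280.0))   # 360.8' above ellipsoid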
[0089] Ellipsoidal heights are used by GPS-based systems. If the available elevation data, such as an industry standard Digital
Elevation Model or as the Tessellated Ground Plane information
contained in the oblique images, as described in U.S. Pat. No.
7,424,133, is defined in mean sea level, the geoidal separation
value for that area can be backed out to get to an ellipsoidal
height, as is a well-known photogrammetric practice. From a
software stand-point, a software library such as is available from
Blue Marble Geo can be used to perform this conversion
automatically.
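The conversion itself reduces to adding the local geoid separation N to the mean-sea-level height, as in the following minimal sketch; obtaining N for the area of interest from a geoid model or a library such as the one mentioned above is assumed and not shown, and the numeric values are purely illustrative:

def msl_to_ellipsoid(height_msl, geoid_separation):
    # Ellipsoidal height = orthometric (mean sea level) height + N, where
    # N is negative wherever the geoid lies below the ellipsoid.
    return height_msl + geoid_separation

# Illustrative values only: a 310' MSL elevation with N = -30' gives 280'
# above the ellipsoid, the base elevation used in the example above.
print(msl_to_ellipsoid(310.0, -30.0))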
[0090] Next, the software would determine Target Capture Points of
the camera control information. The Target Capture Points may be
spaced along the Target Path in such a manner as to ensure full
coverage of the vertical structure of interest 21. This would be
determined using a similar method as was done with the maximum
vertical swath width. Once the desired resolution is known, it is
multiplied by the number of pixels in the horizontal orientation of
the sensor of the camera 19, and then sufficient overlap is
subtracted. Using the above example, if there are 3,000 pixels in
the sensor of the camera 19 in the horizontal orientation and the
software uses the same 20% overlap and 1/8'' GSD effective image
resolution that is discussed above, then a suitable spacing
distance for the Target Capture Points would be 18.75'. Thus, an
arbitrary start point would be selected (typically a corner along
the front wall is used) and then going in an arbitrary direction, a
Target Capture Point would be placed on the Target Path every
18.75' as well as one at the next corner if it occurs before a full
increment. A Target Capture Point may then be placed on the start
of the next segment along the Target Path and this pattern may be
repeated until all the segments have Target Capture Points.
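The spacing rule described above can be illustrated with the following Python sketch; the polygon representation of the Target Path and the function name are assumptions for the example, and a point is dropped at the start of every segment, so each corner receives a capture point whether or not it falls on a full increment:

import math

def target_capture_points(corners, spacing_ft):
    # Walk each segment of the closed Target Path, placing a point at the
    # segment start and then every spacing_ft along it; the far corner is
    # covered as the start of the next segment.
    points = []
    n = len(corners)
    for i in range(n):
        (x0, y0), (x1, y1) = corners[i], corners[(i + 1) % n]
        length = math.hypot(x1 - x0, y1 - y0)
        d = 0.0
        while d < length:
            t = d / length
            points.append((x0 + t * (x1 - x0), y0 + t * (y1 - y0)))
            d += spacing_ft
    return points

# A 40' x 30' rectangular Target Path with the 18.75' spacing of the example:
print(target_capture_points([(0.0, 0.0), (40.0, 0.0), (40.0, 30.0), (0.0, 30.0)], 18.75))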
[0091] Once all the Target Capture Points have been determined, the
Target Capture Points can be projected onto the Flight Path to
create Flight Capture Points. This projection may be accomplished by extending a line outward from the Target Capture Point that is perpendicular to the Target Path and finding where it intersects the Flight Path. This
has the effect of applying the lateral offset L.sub.OFFSET distance
and vertical offset V.sub.OFFSET calculated earlier. These Flight
Capture Points are then used to fire the camera 19 as the unmanned
aircraft 18 passes by the Flight Capture Points. When doing so, the
unmanned aircraft 18 keeps the camera aimed at the respective
Target Capture Point. This aiming can be accomplished by a number
of methods, such as an unmanned aircraft 18 that can turn, but is
best accomplished with a computer controlled gimbal mount for the
camera 19.
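As an illustrative sketch only (the outward direction and the east/north/up coordinate convention are assumptions), the projection can be expressed by stepping perpendicular to the local Target Path direction by the lateral offset and upward by the vertical offset:

import math

def flight_capture_point(target_pt, path_dir_xy, lateral_offset_ft,
                         vertical_offset_ft, outward_sign=1.0):
    # Move the Target Capture Point outward along the horizontal normal to
    # the Target Path by the lateral offset, and up by the vertical offset,
    # to obtain the corresponding Flight Capture Point. outward_sign picks
    # whichever of the two normals points away from the structure.
    x, y, z = target_pt
    dx, dy = path_dir_xy
    n = math.hypot(dx, dy)
    nx, ny = outward_sign * dy / n, -outward_sign * dx / n
    return (x + lateral_offset_ft * nx,
            y + lateral_offset_ft * ny,
            z + vertical_offset_ft)

# A Target Capture Point 14' over ground on a wall running along +x, with
# the 79.6' / 66.8' offsets of the example:
print(flight_capture_point((20.0, 0.0, 14.0), (1.0, 0.0), 79.6, 66.8, -1.0))
# -> (20.0, 79.6, 80.8)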
[0092] Alternatively, the camera 19 on the unmanned aircraft 18
could be put into "full motion video mode" whereby continuous
images are captured at a high rate of speed (typically greater than
1 frame per second up to and even beyond 30 frames per second).
Capturing at high frame rates ensures sufficient overlap. However,
capturing at high frame rates also results in a much greater amount
of image data than is needed which means longer upload times. In
addition, many cameras 19 can capture higher resolution imagery in
"still frame video" mode versus "full motion video" mode. But while
still frame video mode is preferred from a resolution and data
transfer standpoint, if the camera 19 has a full motion video mode,
then the full motion video mode can also be used. When in full
motion video mode, the unmanned aircraft 18 simply follows the
Flight Path keeping the camera 19 aimed towards the Target
Path.
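The relationship between frame rate, ground speed, and the capture-point spacing can be illustrated with a one-line sketch; the numbers are examples only:

def min_frame_rate_hz(capture_spacing_ft, ground_speed_ft_s):
    # Lowest frame rate that still yields at least one frame per
    # capture-spacing interval at the given ground speed.
    return ground_speed_ft_s / capture_spacing_ft

# An 18.75' spacing flown at 15 ft/s needs only 0.8 frames per second, so
# full motion video rates of 1 to 30 frames per second give ample overlap.
print(min_frame_rate_hz(18.75, 15.0))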
[0093] The unmanned aircraft 18 would follow the indicated Flight
Path through autonomous flight. There are numerous computer systems available on the market today that can be configured as a flight management system to achieve this. The flight management system,
either onboard, or on the ground and communicating to the unmanned
aircraft 18 through some form of remote communication, would then
track the progress of the unmanned aircraft 18 along the Flight
Path and each time the unmanned aircraft 18 passes a Flight Capture
Point, the camera 19 would be triggered to capture a frame. Or in
the event that full motion video was selected, the camera 19 would
be continually firing as the unmanned aircraft 18 flew along the Flight Path. The
position and orientation of the unmanned aircraft 18 would be
monitored and the camera 19 would be aimed towards the
corresponding Target Capture Point, or in the event that full
motion video was selected, the flight management system would keep
the camera aimed towards the nearest point on the Target Path. This
may be accomplished by calculating the relative directional offset
between the line moving forward on the Flight Path and the line
from the Flight Capture Point to the Target Capture Point (or the nearest point on the Target Path for full motion video). This then
results in a yaw and declination offset for the camera gimbal.
Typically, these offsets are going to be a relative yaw of
90-degrees and a relative declination equal to the oblique
down-look angle selected above (in the example, 40-degrees).
However, since airborne systems are continually moved around by the
air, offsets for a shift in position, a shift due to crabbing, or a
shift in the yaw, pitch, or roll of the unmanned aircraft 18 would
need to be accounted for. Again, this may be done by using the
forward path along the Flight Path that the unmanned aircraft 18 is
currently on and offsetting it by the relative yaw, pitch, and roll
offsets of the unmanned aircraft 18 as measured by the position and
orientation system, and then further adjusted by the relative yaw
and declination as described above.
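A simplified Python sketch of that offset calculation follows; it assumes an east/north/up coordinate convention, handles only a yaw (crab) correction, and leaves the pitch and roll corrections mentioned above to the position and orientation system:

import math

def gimbal_offsets_deg(flight_pt, target_pt, path_forward_xy,
                       aircraft_yaw_offset_deg=0.0):
    # Relative yaw (from the forward direction of travel) and declination
    # (below horizontal) that aim the camera from the Flight Capture Point
    # at the Target Capture Point, corrected for the aircraft's yaw offset
    # from the Flight Path.
    dx = target_pt[0] - flight_pt[0]
    dy = target_pt[1] - flight_pt[1]
    dz = target_pt[2] - flight_pt[2]
    sight_bearing = math.degrees(math.atan2(dx, dy))
    forward_bearing = math.degrees(math.atan2(path_forward_xy[0], path_forward_xy[1]))
    relative_yaw = (sight_bearing - forward_bearing
                    - aircraft_yaw_offset_deg + 180.0) % 360.0 - 180.0
    declination = math.degrees(math.atan2(-dz, math.hypot(dx, dy)))
    return relative_yaw, declination

# Travelling in the +x direction past the Flight Capture Point of the earlier
# sketch: a relative yaw of 90 degrees and a declination of about 40 degrees.
print(gimbal_offsets_deg((20.0, 79.6, 80.8), (20.0, 0.0, 14.0), (1.0, 0.0)))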
[0094] Once the complete circuit of the Flight Path has been
completed, the flight management system may instruct the unmanned
aircraft 18 to return to its launch point and land. The operator
may pull any detachable storage or otherwise transfer the imagery
from the onboard storage to a removable storage system or transfer
the imagery via some form of network or communications link. The
resulting images may then be used by the user terminal 14 and/or
the host system 12 to produce a structure and damage report.
Systems for producing a structure and/or damage report are
described in U.S. Pat. Nos. 8,078,436; 8,145,578; 8,170,840; 8,209,152; and 8,401,222, and in a patent application identified by U.S. Ser. No. 12/909,692, now U.S. Pat. No. 8,977,520, the entire content of each of which is hereby
incorporated herein by reference. The completed report would then
be provided to the operator.
[0095] In some embodiments, additional data sets may be included
within the structure report 78. For example, data sets may include,
but are not limited to, weather data, insurance/valuation data,
census data, school district data, real estate data, and the
like.
[0096] Weather data sets may be provided by one or more databases
storing information associated with weather (e.g., inclement
weather). A weather data set within the structure report 78 may
include, but is not limited to, hail history information and/or
location, wind data, severe thunderstorm data, hurricane data,
tornado data, and/or the like. In some embodiments, the one or more
databases providing weather information may be hosted by a separate
system (e.g., LiveHailMap.com) and provide information to the host
system 12.
[0097] Insurance and/or valuation data sets may be provided by one
or more databases storing information associated with housing
insurance and/or valuation. An insurance and/or valuation data set
may include, but is not limited to, insured value of the home,
insurance premium amount, type of residence (e.g., multi-family,
single family), number of floors (e.g., multi-floor, single-floor),
building type, and/or the like. In some embodiments, the one or
more databases may be hosted by a separate system (e.g., Bluebook,
MSB, 360Value) and provide information to the host system 12.
[0098] The insurance and/or valuation data set may be included
within the structure report 78 and provided to the user. For
example, during underwriting of a home, an insurance company may be
able to request the structure report 78 on a home that is recently
purchased. The information within the structure report 78 may be
integrated with insurance information provided by an insurance
database and used to form a quote report. The quote report may be
sent to the user and/or insurance company. Alternatively, the
structure report 78 may be solely sent to the insurance company
with the insurance company using the information to formulate a
quote.
[0099] In another example, the structure report 78 may be used in
an insurance claim. In the case of a catastrophe affecting a customer, one
or more databases may be used to provide an insurance dataset with
claim information in the structure report 78. For example, an
insurance database having a policy in force (PIF) and a weather
database may be used to correlate information regarding an
insurance claim for a particular roof.
[0100] This information may be provided within the structure report
78. Additionally, in the case of loss or substantial alterations to
the structure 21, multiple images may be provided within the
structure report 78 showing the structure 21 at different time
periods (e.g., before loss, after loss). For example, FIG. 9 illustrates an exemplary screen shot 86 of the structure 21 with an image 88a captured at a first time period (e.g., before
loss), and an image 88b captured at a second time period (e.g.,
after loss).
[0101] Real estate and/or census data sets may also be included within the structure report 78. The real estate and/or census data sets
may be provided by one or more databases having detailed
information of a home. For example, a real estate data set may
include, but is not limited to, the homeowner's name, the purchase
price of the home, number of times the home has been on the market,
the number of days the home has been on the market, the lot size,
and/or the like. The census data set may include information
concerning the number of residents within the home. In some
embodiments, the one or more databases may be hosted by a separate
system (e.g., Core Logic) and provide information to the host
system 12 to provide data sets as described herein.
[0102] Other services related to the structure of interest 21 may be provided within
the structure report 78. For example, using the square footage of
the roofing footprint, a price quote may be generated on the cost
of insulation for the roof (e.g., energy efficiency, insulation
replacement, and the like). Additionally, audits may be performed
using information within one or more databases. For example, using
the roofing area of a structure, historically paid insurance claims
for comparables, and validation of payment for a specific claim for
the home, a comparison may be made to determine whether the service
payment for the specific claim was within a certain threshold.
Auditing, it should be understood, may be applied to other areas as
described herein as well.
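One minimal sketch of such an audit check, with the dollar figures and threshold chosen purely for illustration, is:

def claim_within_threshold(paid_amount, roof_area_sqft, comparable_paid_per_sqft,
                           threshold_fraction=0.20):
    # Compare the payment for a specific claim against historically paid
    # claims for comparable roofs, normalized by roof area.
    expected = roof_area_sqft * comparable_paid_per_sqft
    return abs(paid_amount - expected) <= threshold_fraction * expected

print(claim_within_threshold(11500.0, 2300.0, 4.50))   # expected ~10,350 -> True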
[0103] Although images of residential structures are shown
herein, it should be noted that the systems and methods in the
present disclosure may be applied to any residential and/or
commercial building or structure. Further, the systems and methods
in the present disclosure may be applied to any man-made structure
and/or naturally occurring structure.
[0104] From the above description, it is clear that the inventive
concept(s) disclosed herein is well adapted to carry out the
objects and to attain the advantages mentioned herein as well as
those inherent in the inventive concept(s) disclosed herein. While
presently preferred embodiments of the inventive concept(s)
disclosed herein have been described for purposes of this
disclosure, it will be understood that numerous changes may be made
which will readily suggest themselves to those skilled in the art
and which are accomplished within the scope and spirit of the
inventive concept(s) disclosed herein and defined by the appended
claims.
* * * * *