U.S. patent application number 14/246181 was filed with the patent office on 2014-04-07 and published on 2014-12-04 for a cloud based command and control system.
This patent application is currently assigned to AAI CORPORATION. The applicants listed for this patent are Chad Chauffe, Chris Ellsworth, Anthony Neis, and Johann Nguyen. Invention is credited to Chad Chauffe, Chris Ellsworth, Anthony Neis, and Johann Nguyen.
United States Patent Application 20140358252
Kind Code: A1
Ellsworth; Chris; et al.
Published: December 4, 2014
Cloud Based Command and Control System
Abstract
A command and control system is provided which links users and
platforms in real time and with touch screen ease, delivering a
highly intuitive, integrated user experience with minimal
infrastructure. Capitalizing on a cloud based architecture, from
the cloud, to the touch table, to a hand held device, the command
and control system creates seamless connections between sensors,
leaders and users for up-to-the-minute information clarity.
Inventors: Ellsworth; Chris (Huntsville, AL); Chauffe; Chad (Slidell, LA); Nguyen; Johann (Huntsville, AL); Neis; Anthony (Scottsboro, AL)
Applicant:
  Name               City        State   Country
  Ellsworth; Chris   Huntsville  AL      US
  Chauffe; Chad      Slidell     LA      US
  Nguyen; Johann     Huntsville  AL      US
  Neis; Anthony      Scottsboro  AL      US
Assignee: AAI CORPORATION (Hunt Valley, MD)
Family ID: 51985988
Appl. No.: 14/246181
Filed: April 7, 2014
Related U.S. Patent Documents
  Application Number   Filing Date     Patent Number
  61827783             May 28, 2013
Current U.S. Class: 700/17
Current CPC Class: G08B 25/08 20130101; G08B 25/00 20130101
Class at Publication: 700/17
International Class: G05B 15/02 20060101 G05B015/02
Claims
1. A cloud based command and control system comprising: a central
command hub configured to communicate over wired and wireless
connections; a sensor in wireless bi-directional communication with
said central command hub, and a computing device in bi-directional
communication with said central command hub, said computing device
having a graphical user interface configured to display data
received from said sensor, wherein a user may control said sensor
by manipulating said graphical user interface.
2. The command and control system of claim 1, further comprising: a
geosynchronous satellite in wireless bi-directional communication
with said central command hub, and wherein said sensor is an aerial
vehicle in bi-directional communication with said satellite.
3. The command and control system of claim 2, further comprising: a
ground based sensor, said ground based sensor being in
bi-directional communication with said central command hub, and
wherein a user may control said ground based sensor by manipulating said
graphical user interface.
4. The command and control system of claim 2, wherein said aerial
vehicle is an unmanned aerial vehicle.
5. The command and control system of claim 1, wherein said
computing device is one selected from the group consisting of
desktop computer, tablet computer and mobile phone.
6. The command and control system of claim 1, wherein said
computing device is configured to record and playback data
associated with said sensor.
7. The command and control system of claim 1, wherein a user may
selectively associate certain data with said sensor using said
graphical user interface.
8. A method for data integration and processing in a command and
control system comprising the steps of: receiving raw data from a
sensor; converting said raw data into a common data format;
generating a set of processing rules by analyzing the data in the
common data format and communicating said processing rules to a
processing rules data bus; broadcasting said converted data to an
unprocessed data processor, said unprocessed data processor being
configured to manipulate the data according to said set of rules
contained on said processing rules data bus; transforming the data
using a data processing plugin and broadcasting said transformed
data to a post processing data message bus, and transmitting said
data from said post processing data message bus to an archiving
service for storage of said data in a database.
9. The method of claim 8, further comprising the steps of:
submitting query requests to said archiving service, and
broadcasting query results to a user filter for filtering and
transforming the data for presentation to a user.
10. The method of claim 8, further comprising the step of allowing
a user to add contextual data to said raw data.
11. The method of claim 10, further comprising the step of
communicating said contextual data to all users of the command and
control system.
12. The method of claim 9, further comprising the step of viewing
the stored data associated with a sensor in chronological and
reverse chronological order.
Description
CROSS-REFERENCE TO RELATED APPLICATIONS
[0001] This application claims the benefit of U.S. Provisional Patent Application No. 61/827,787, filed on May 28, 2013, which is incorporated herein by reference in its entirety.
[0002] This invention is related to command and control systems,
and more specifically, to such systems that employ detection,
analysis, data processing, and communications for military
operations, emergency services and commercial platform
management.
BACKGROUND
[0003] The advent of global communications networks such as the
Internet has facilitated numerous collaborative enterprises.
Telephone and IP networks (e.g., the Internet) facilitate bringing
individuals together in communication sessions to conduct business
via voice and video conferencing, for example. However, the
challenge of communications interoperability continues to plague
military and public safety agencies. Such interoperability could
give military personnel, first responders, elected officials, and
public safety agencies the capability to exchange video, voice and
data on-demand and in real time, when needed and as authorized.
[0004] National security incidents (e.g., terrorist attacks,
bombings, . . . ) and natural disasters (e.g., hurricanes,
earthquakes, floods, . . . ) have exposed that true
interoperability requires first responders and elected officials to
be able to communicate not just within their units, but also across
disciplines and jurisdictions. Additionally, full communications
interoperability is required at all levels, for example, at the
local, state, and federal levels. Conventional network availability
has proven to be difficult to maintain in unpredictable
environments such as firestorms, natural disasters, and terrorist
situations. Too often communications depend on access to fixed or
temporary infrastructure and are limited by range or line-of-sight
constraints. Moreover, radio interoperability between jurisdictions
(e.g., local, state, federal) is always an issue for responders and
has become a homeland security matter. Furthermore, proprietary
radios and multiple standards and their lack of interoperability
with wired and wireless telephony (also called telecommunications)
networks make it virtually impossible for different agencies to
cooperate in a scaled response to a major disaster.
[0005] Accordingly, reliable wireless and/or wired communications
that enable real time information sharing, constant availability,
and interagency interoperability are imperative in emergency
situations. Additionally, greater situational awareness is an
increasingly important requirement that enables soldiers and
emergency first responders to know each other's position in
relation to the incident, terrain, neighborhood, or perimeter being
secured. Live video, voice communication, sensor, and location data
provide mission-critical information, but low-speed data networks
cannot adequately meet the bandwidth requirements to support such
critical real time information. Large scale military operations require a comprehensive and coordinated effort based on timely, effective communications between any or all of the military's soldiers and weapons systems.
Therefore, what is needed is an improved interoperable command and
control communications architecture.
SUMMARY OF THE INVENTION
[0006] The following presents a simplified summary of the invention
in order to provide a basic understanding of some aspects of the
invention. This summary is not an extensive overview of the
invention. It is not intended to identify key/critical elements of
the invention or to delineate the scope of the invention. Its sole
purpose is to present some concepts of the invention in a
simplified form as a prelude to the more detailed description that
is presented later.
[0007] The invention disclosed and claimed herein, in one aspect
thereof, comprises a command and control architecture that
facilitates detection of a situation or event that is taking place.
The architecture employs sensors and sensor systems, as well as
existing systems, for processing, notifying and communicating
alerts, and calling for the appropriate military and/or public
safety and emergency services. Thus, whatever the situation or event, whether a sensor senses it, a human observes it, and/or the physical locations of military vehicles (including armored vehicles, UAVs, etc.), police cars, emergency vehicles, and fire vehicles are ascertained, attributes of each of the sensors, observers, and/or assets can be passed to a central communications system for further processing and analysis by a command center and/or the lower level personnel involved. For example, a mapping component can be employed
that generates one or more maps for routing services to and from
the situation location. The attribute data is also analyzed, with
the results data passed to the central communications system for
data and communications management, further facilitating
notification and alerting of the appropriate services to get the
right people and equipment involved, and then linking it to other data sources in further support of the system functions.
[0008] In support thereof, there is provided a command and control
system, comprising a detection component that facilitates sensing
of a situation and data analysis of detection data, a central
communications component (e.g., Internet-based) that provides data
and communications management related to the detection data, and a
mapping component that processes the detection data and presents
real time location information related to a location of the
situation. The detection component includes at least one of a
sensor that senses situation parameters, an observer that observes
the situation, and/or an asset that is located near the
situation.
[0009] The mapping component includes a geographic location
technology that facilitates locating at least one of the sensor,
the observer, and the asset. The sensor is associated with
situation attributes that are analyzed, the observer is associated
with human attributes that are analyzed, and the asset is
associated with asset attributes that are analyzed. The asset
attributes are representative of a location of at least one of a
fire vehicle, a medical vehicle, and a law enforcement vehicle. The
sensor attributes are representative of at least one of chemical
data, explosives data, drug data, motion data, biological data,
weapons data, acoustical data, nuclear data, audio data, and video
data.
[0010] The human attributes are representative of at least one of
voice data, visual data, tactile data, motion data, and audio data.
The system further comprises a tactical component that processes
tactical data for at least one of the mapping component, the
central communications component, and the detection component. The
system further comprises a security system that initiates a
security action based on the detection data. The security action
includes requesting at least one of a fire services, medical
services, and law enforcement services. The central communications
component facilitates communications over at least one of a
cellular network and an IP network. The central communications
component facilitates at least one of information rights
management, voice/video and data collaboration, file management,
workflow management, searching and indexing, and voice/text
alerting. The voice/text alerting includes an alert related to
detection by the detection component of at least one of nuclear data,
chemical data, biological data, and radiological data.
[0011] To the accomplishment of the foregoing and related ends,
certain illustrative aspects of the invention are described herein
in connection with the following description and the annexed
drawings. These aspects are indicative, however, of but a few of
the various ways in which the principles of the invention can be
employed and the subject invention is intended to include all such
aspects and their equivalents. Other advantages and novel features
of the invention will become apparent from the following detailed
description of the invention when considered in conjunction with
the drawings.
DESCRIPTION OF DRAWINGS OF INVENTION
[0012] The Applicant has attached the following figures of the
invention at the end of this patent application:
[0013] FIG. 1 is a simplified interconnection diagram in accordance
with an embodiment of the invention;
[0014] FIG. 2 is an interconnection diagram showing the various
components in accordance with an embodiment of the invention;
[0015] FIG. 3 is a simplified network diagram showing the various
components in accordance with an embodiment of the invention; and
[0016] FIG. 4 is a view of a multi-touch video screen in accordance
with an embodiment of the invention.
DETAILED DESCRIPTION OF INVENTION
[0017] The invention is now described with reference to the
drawings, wherein like reference numerals are used to refer to like
elements throughout. In the following description, for purposes of
explanation, numerous specific details are set forth in order to
provide a thorough understanding of the subject invention. It may
be evident, however, that the invention can be practiced without
these specific details. In other instances, well-known structures
and devices are shown in block diagram form in order to facilitate
describing the invention.
[0018] As used in this application, the terms "component" and
"system" are intended to refer to a computer-related entity, either
hardware, a combination of hardware and software, software, or
software in execution. For example, a component can be, but is not
limited to being, a process running on a processor, a processor, an
object, an executable, a thread of execution, a program, and/or a
computer. By way of illustration, both an application running on a
server and the server can be a component. One or more components
can reside within a process and/or thread of execution, and a
component can be localized on one computer and/or distributed
between two or more computers.
[0019] As used herein, terms "to infer" and "inference" refer
generally to the process of reasoning about or inferring states of
the system, environment, and/or user from a set of observations as
captured via events and/or data. Inference can be employed to
identify a specific context or action, or can generate a
probability distribution over states, for example. The inference
can be probabilistic--that is, the computation of a probability
distribution over states of interest based on a consideration of
data and events. Inference can also refer to techniques employed
for composing higher-level events from a set of events and/or data.
Such inference results in the construction of new events or actions
from a set of observed events and/or stored event data, whether or
not the events are correlated in close temporal proximity, and
whether the events and data come from one or several event and data
sources.
[0020] Referring now to FIG. 1, there is shown a top level interconnection diagram in accordance with an embodiment of the invention 10, which depicts the concept of having all processing centralized into a cloud based architecture for a command and control system. The central communication hub 24 allows for
creating and viewing one singular view of an entire operational
environment known as the Common Operating Picture (COP). This
shared COP forms the basis for collaboration between users, sensors
and platforms. A computer server 26, well known in the art, provides a virtualized computer environment where the various computer services are run on virtual machines (VMs), which makes the command and control system extremely portable and easily deployable as a software appliance.
[0021] The command and control system 10 can be viewed by a wide
range of client devices. Some of the most common devices are desktop computers 12 and tablet computers 16, which may use, for example, Microsoft Windows, Unix, or Android based operating systems. A
client can be run using a keyboard, mouse and monitor; however, the
system is optimized for a multi-touch screen display 14 for a
quicker and simpler user experience. Client devices may be deployed
with different client applications that offer unique sets of
capabilities and features to visualize and interact with the
cloud-based data. Cloud-based services and databases provide client
applications with the ability to recall and playback data that was
recorded to enhance situational awareness and decision making. Each
client presents the user with a user-specific display of the Cloud
data and also provides a means for collaboration and platform
tasking.
[0022] For users 18 in a tactical environment that would not
typically have the ability to use larger computer devices, a mobile
application is also available. This mobile application can be run
on any tablet 16 or smart phone 20, which may employ the Windows or
Android mobile operating system, for example. The mobile
application is a unique tool that provides multi-touch situational
awareness and collaboration for the tactical edge by displaying the
same Common Operating Picture to the user 18 while still remaining
lightweight and responsive. The edge user may collaborate with
other users and platforms across units and echelons.
[0023] Data and platform integration is performed by creating
custom services, known as gateways, that listen to and communicate
with already existing data feeds from sensors 22 and systems.
Sensors 22 can be, as shown in the figure, an aircraft, a ground
based vehicle or the like which generates and communicates various
real time data associated with the sensor 22. The real time data
may include GPS coordinates, heading and velocity information, live
video feeds, environmental information or the like. This enables
the gateways to send information to and from the central
communications hub 24, comprising server computer equipment and systems 26, in such a way that all clients (14, 16, 20) will be able to visualize the information on the client's screen. In some cases these gateways even allow users to communicate directly back to the sensor 22 from which the data is coming, so the communication is
bi-directional. This bi-directional communication allows for users
to collaborate, send tasking requests and/or requests for
information (RFI) to a given sensor which can provide direct field
support, advanced warning of hazardous situations, navigational
guidance and/or any other situational awareness details.
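By way of illustration only, the following sketch shows how such a gateway might translate a platform-specific telemetry message into a common format, forward it toward the central hub, and relay a tasking request or RFI back to the sensor. The class and field names (Gateway, hub_queue, sensor_queue, and the message shapes) are assumptions made for this example, not the embodiments described herein.

```python
# Hypothetical sketch of a bi-directional gateway; names and message
# shapes are illustrative assumptions, not the patented implementation.
import json
import queue


class Gateway:
    """Listens to an existing sensor feed and bridges it to the hub."""

    def __init__(self, hub_queue, sensor_queue):
        self.hub_queue = hub_queue        # messages toward the central hub
        self.sensor_queue = sensor_queue  # tasking back toward the sensor

    def on_sensor_message(self, raw: str) -> None:
        # Translate the platform-specific feed into a common format.
        native = json.loads(raw)
        common = {
            "type": "track",
            "lat": native["gps"]["lat"],
            "lon": native["gps"]["lon"],
            "heading": native.get("hdg"),
            "velocity": native.get("spd"),
        }
        self.hub_queue.put(common)

    def on_tasking_request(self, request: dict) -> None:
        # Bi-directional path: relay a tasking request or RFI to the sensor.
        self.sensor_queue.put(json.dumps(request))


if __name__ == "__main__":
    hub, sensor = queue.Queue(), queue.Queue()
    gw = Gateway(hub, sensor)
    gw.on_sensor_message('{"gps": {"lat": 34.7, "lon": -86.6}, "hdg": 270, "spd": 80}')
    gw.on_tasking_request({"type": "rfi", "question": "current fuel state"})
    print(hub.get(), sensor.get())
```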
[0024] The command and control system 10 can be synchronized across
multiple sites for extended collaboration through a method known as
cross-site data synchronization. Cross-site data synchronization
allows for data and services that are processed and centralized in a
location, such as a CONUS Cloud environment 38, to be transmitted
and synchronized to a deployed cloud environment 36 where this data
and information would not normally be readily available. Each
environment 38 and 36 hosts its own internal cloud 34 and 32 and
the cloud environments 38 and 36 then communicate with each other
to synchronize communications. A benefit to this is that each site
can operate completely independently of the others, and whenever they
are configured to communicate they will be able to share data that
was not readily available before. If one site loses communication,
it does not affect the other sites. In such a case, the site that
loses communication will then continue to operate in a stand-alone
state and no longer share data with the rest of the previously
synchronized Cloud environment(s). Moreover, the site(s) that did
not lose communication will simply no longer see the data from the
Cloud that lost communication and will continue to operate.
Communication between the Cloud environments may be supported by a
satellite link 30 which is in wireless communication with the
various cloud environments.
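A minimal sketch of this cross-site behavior follows, assuming a simple in-memory model in which each site holds its own store, pushes updates only to currently linked peers, and keeps operating stand-alone when a link drops. The CloudSite class and its fields are assumptions made for this example only.

```python
# Illustrative sketch of cross-site data synchronization.
class CloudSite:
    def __init__(self, name):
        self.name = name
        self.data = {}       # local store: entity id -> latest record
        self.peers = set()   # sites currently linked to this one

    def publish(self, entity_id, record):
        # Local writes always succeed, even with no peers connected.
        self.data[entity_id] = record
        for peer in self.peers:            # push only to linked sites
            peer.data[entity_id] = record

    def link(self, other):
        # When two sites connect, each receives what it was missing.
        self.peers.add(other)
        other.peers.add(self)
        merged = {**self.data, **other.data}
        self.data.update(merged)
        other.data.update(merged)

    def unlink(self, other):
        # A dropped link leaves both sites running stand-alone.
        self.peers.discard(other)
        other.peers.discard(self)


conus = CloudSite("CONUS")
deployed = CloudSite("Deployed")
conus.publish("uav-1", {"lat": 34.70, "lon": -86.60})
conus.link(deployed)        # deployed site now sees uav-1
conus.unlink(deployed)      # communication is lost
conus.publish("uav-1", {"lat": 34.80, "lon": -86.70})
print(deployed.data["uav-1"])   # stale copy, but the site keeps operating
```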
[0025] Referring now to FIG. 2, which depicts an interconnection
diagram showing the various components in accordance with an
embodiment 100 of the invention. The command and control system 100
provides a flexible and innovative solution based on the concept of
a Service Oriented Architecture (SOA). As mentioned previously, the
SOA allows for data integrations to be performed through services
known as gateways, which allows them to run in complete isolation.
Therefore, in order to integrate a new data feed on an already
existing and running command and control network, a new gateway
would be created and once it is started within the Cloud, each
client would then be able to view the data from this new gateway
without needing to upgrade the software running on the client. This
also allows for quick integrations for rapid deliveries of stable
systems.
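To illustrate the idea of integrating a new data feed without upgrading clients, the following sketch registers a gateway at run time with a simple in-memory registry; plain callables stand in for connected clients. All names here are illustrative assumptions, not the system's actual service infrastructure.

```python
# Sketch of hot-adding a gateway service under the SOA concept above.
class ServiceRegistry:
    def __init__(self):
        self.gateways = {}   # service name -> gateway object
        self.clients = []    # notification callbacks for connected clients

    def register(self, name, gateway):
        # Starting a new gateway inside the Cloud makes its feed visible
        # to every client already connected; no client upgrade is needed.
        self.gateways[name] = gateway
        for notify in self.clients:
            notify(f"new data feed available: {name}")


registry = ServiceRegistry()
registry.clients.append(print)                    # a "client" display
registry.register("weather-gateway", object())    # integrated at run time
```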
[0026] The concept of having Clouds running in multiple aerial
nodes 130 and 132 and a ground node 116 allows for wider coverage
of the grand battle space. Each aircraft 131 and 133 can host its
own Cloud 130 and 132 respectively with a number of gateway
services 136a, 136b, 138a, 138b and 138c running and sharing data
through a message bus 140 on each of the Cloud environments. Once
these aircraft 131 and 133 connect with one another, the services
hosted within the aircraft can then be shared to create an airborne network 142. Moreover, once even one of those aircraft comes within range of a ground unit 116, data and services can be
shared with the Cloud running on the ground unit via a Line of
Sight (LOS) link 135.
[0027] The benefit to this approach is that all the nodes that are
now connected form a network that spans a much greater area for an
even larger view of the battle space. Services that are run on any
of these nodes can then be accessed by any client 112 and 114
connected to the network. In the same case as the cross-site
synchronization, if communication is lost by any of the nodes,
the services running on those nodes will simply no longer be
available and the remainder of the connected nodes will continue to
run as they did before the connection was lost.
[0028] Users connected to the network 100 will be able to view a
web portal displayed inside items 112 and 114 for example
containing widgets 118a, 118b, 118c, 120a and 120b which
communicate using an HTTP Session 122 and 124 via web sockets 126a,
126b, 126c, 128a and 128b. Once a Cloud starts sharing data across
other Clouds on the network, all the clients connected to any of
the Cloud environments will be able to view and use any widget
being supported by any Cloud on the entire network. If one Cloud
loses connectivity, clients will not be able to use the widgets
supported in that Cloud, but will still be able to use the rest of
the widgets so long as their corresponding Clouds are still
connected.
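A small sketch of this widget-availability behavior follows. In the system described above the updates would travel over HTTP sessions and web sockets; this in-memory example simply simulates them with direct calls, and the Portal class and node names are assumptions made for the example.

```python
# Sketch of widget availability tracking in the web portal.
class Portal:
    def __init__(self):
        self.widgets = {}   # widget name -> hosting Cloud

    def cloud_connected(self, cloud, widget_names):
        for name in widget_names:
            self.widgets[name] = cloud

    def cloud_lost(self, cloud):
        # Widgets backed by the lost Cloud become unavailable; the rest
        # keep working because their Clouds are still connected.
        self.widgets = {w: c for w, c in self.widgets.items() if c != cloud}


portal = Portal()
portal.cloud_connected("air-node-130", ["fmv-viewer", "track-map"])
portal.cloud_connected("ground-node-116", ["chat"])
portal.cloud_lost("air-node-130")
print(portal.widgets)   # {'chat': 'ground-node-116'}
```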
[0029] Referring now to FIG. 3, which shows a simplified data
integration architecture diagram in accordance with an embodiment
200 of the invention. This figure depicts how data flows from data
sources and feeds 201 to a user's 215 unique client data view 214.
Data sources and feeds 201 provide data for services 202a, 202b,
202c, 202d to consume and process. The data services 202a-d may
convert the data into a common data format and broadcast the
converted data in the common data format to the Unprocessed Data
Message Bus 203. The Unprocessed Data Message Bus 203 provides a
medium for transferring messages from the data services 202a-d to
unprocessed data processor 204 and data analysis tools 206. The
unprocessed data processor 204 receives data from the unprocessed
message bus 203 and utilizes a "plug-in" architecture to delegate
the logic of processing and transforming the data to data
processing plugins 205a and 205b. After processing the data in the
plug-ins 205a and 205b, the data is broadcast to a post processed
data message bus 208.
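The first leg of this flow, a data service converting a feed-specific record into the common format and broadcasting it on the unprocessed data message bus, might look like the following minimal sketch. The MessageBus class, topic names, and field names are assumptions made for illustration.

```python
# Minimal message-bus sketch of the flow in FIG. 3.
from collections import defaultdict


class MessageBus:
    def __init__(self):
        self.subscribers = defaultdict(list)

    def subscribe(self, topic, handler):
        self.subscribers[topic].append(handler)

    def broadcast(self, topic, message):
        for handler in self.subscribers[topic]:
            handler(message)


unprocessed_bus = MessageBus()


def radar_data_service(raw):
    # Convert the feed-specific record into the common data format.
    common = {"source": "radar", "lat": raw[0], "lon": raw[1], "ts": raw[2]}
    unprocessed_bus.broadcast("unprocessed", common)


# Both the unprocessed data processor and the analysis tools listen here.
unprocessed_bus.subscribe("unprocessed", lambda m: print("processor got", m))
unprocessed_bus.subscribe("unprocessed", lambda m: print("analysis got", m))
radar_data_service((34.73, -86.59, 1401235200))
```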
[0030] The plug-ins 205a and 205b for the unprocessed data
processor 204 are configured to manipulate data according to a set
of rules broadcasted to a processing rules data bus 207 or other
external configurations stored on hard disk (not pictured). A data
analysis tool 206 receives data from the unprocessed message bus
203 and analyzes the data and determines how data should be
processed and manipulated and broadcasts processing rules on how
data should be processed to the processing rules data bus 207. The
processing rules data bus 207 provides a medium for transferring
rules for processing data from data analysis tools 206 to data
processing plugins 205a and 205b.
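As a sketch of this rule-driven arrangement, the example below has an analysis step broadcast a rule onto a simple stand-in for the processing rules data bus, and a processing plug-in apply the latest rules when transforming each message. The class names, rule name, and scale factor are illustrative assumptions rather than specified rule content.

```python
# Sketch of a rule-driven data processing plug-in.
class ProcessingRulesBus:
    def __init__(self):
        self.rules = {}

    def broadcast_rule(self, name, rule):
        # e.g. published by a data analysis tool after inspecting the feed
        self.rules[name] = rule


class UnitConversionPlugin:
    def __init__(self, rules_bus):
        self.rules_bus = rules_bus

    def transform(self, message):
        # Manipulate the data according to whatever rules are on the bus.
        factor = self.rules_bus.rules.get("velocity_scale", 1.0)
        message["velocity"] = message.get("velocity", 0) * factor
        return message


rules_bus = ProcessingRulesBus()
rules_bus.broadcast_rule("velocity_scale", 0.514)   # e.g. knots to m/s
plugin = UnitConversionPlugin(rules_bus)
print(plugin.transform({"source": "uav", "velocity": 120}))
```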
[0031] Processed data message bus 208 provides a medium for
transferring messages from the unprocessed data processor 204 to
the archiving services 209 and user filters 211. Archiving services
209 receives messages from the processed data message bus 208 and
stores them in a database 210. Query requests are received from
client applications 215 on the archived data query requests message
bus (not depicted). Query results are broadcast to the archive data
message bus 213. Database 210 stores and retrieves data for the archiving services 209. User filters 211 receive data from the
processed data message bus 208 and the archive data message bus
213. User filters 211 utilizes a "plug-in" architecture to delegate
the logic of filtering and transforming the data to user filter
plugins 212a and 212b. The transformation of data allows entity
attribution to be managed for all users of the system (provided by
220: entity update plugin). For example, entity symbol, name, and
payload type can be specified by the end user to add context to the
raw data, which may initially enter the system with no attribution.
Entity layering may be controlled. Attachments in the form of
documents and presentations may be added to the entity to further
add context to the raw data. This collapses previously disparate
data onto the entities being managed with the objective of reducing
operator decision cycle time. As events change, entity attribution
can be updated on the fly and all users on the system see the changes
immediately.
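A minimal sketch of such an entity attribution update follows: a user merges context (symbol, name, payload type, attachments) into an otherwise unattributed track and every connected client is notified at once. The EntityStore class and attribute names are assumptions made for this example, and callables stand in for connected users.

```python
# Sketch of entity attribution management with immediate fan-out.
class EntityStore:
    def __init__(self):
        self.entities = {}
        self.clients = []   # callables standing in for connected users

    def update_attribution(self, entity_id, **attrs):
        # Raw tracks may enter with no attribution; user-supplied context
        # (symbol, name, payload type, attachments) is merged in here.
        entity = self.entities.setdefault(entity_id, {})
        entity.update(attrs)
        for notify in self.clients:   # every user sees the change at once
            notify(entity_id, entity)


store = EntityStore()
store.clients.append(lambda eid, e: print("client sees", eid, e))
store.update_attribution("track-42", symbol="friendly-air", name="Raider 1",
                         payload_type="EO/IR",
                         attachments=["mission-brief.pdf"])
```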
[0032] After filtering the data, the data is broadcast to the
respective client message bus 214. User filter plugins 212a and
212b are able to filter the data based on what the client is
interested in viewing (area of interest) and based on what the
client is allowed to view (active directory group policies). Data
can also be manipulated based on how the user would like to display
the data.
[0033] The archive data message bus 213 provides a medium for
transferring archived data from the archiving services 209 to the
user filters 211. The client message bus 214 provides a medium for
transferring data from the user filter 211 to the client 215. The
client 215 receives data from the client message bus 214 and
broadcasts archive data query requests to the archived data query
requests message bus (not depicted).
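To make the archive round trip concrete, the sketch below stores post-processed records and answers a time-bounded query of the kind a client might broadcast; an in-memory list stands in for the database, and the class and field names are illustrative assumptions.

```python
# Sketch of the archive store-and-query round trip.
class ArchivingService:
    def __init__(self):
        self.database = []   # stored post-processed messages

    def store(self, message):
        self.database.append(message)

    def query(self, start_ts, end_ts):
        # Return archived records whose timestamps fall in the window.
        return [m for m in self.database if start_ts <= m["ts"] <= end_ts]


archive = ArchivingService()
archive.store({"entity": "uav-1", "lat": 34.70, "ts": 100})
archive.store({"entity": "uav-1", "lat": 34.71, "ts": 160})

# Client-side query request, e.g. to drive a replay or history view.
print(archive.query(start_ts=90, end_ts=120))
```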
[0034] Referring now to FIG. 4 which shows a view of a multi-touch
video screen 14 in accordance with an embodiment of the invention.
Item 310 is a dynamically adjusting stare-point feature that allows the user to drag and drop an ISR (Intelligence, Surveillance, and Reconnaissance) icon to send a collaboration message which may
dynamically re-task a platform's sensor payload. Users can
dynamically collaborate with platforms in the client map
application through a drag and drop interface. Such interactions
include dynamically adjusting a sensor's stare-point or a
platform's commanded loiter location. This is accomplished by the
placement of an appropriate drag and drop icon, which initiates a
collaboration message for a given platform. Item 312 is a window in
which users can also view a live full motion video (FMV) feed of a
given platform's sensor 22 (FIG. 1) payload in an associated
context menu. Item 314 is an icon button that allows a user to take
a snapshot from the live FMV feed 312 to upload and share as a spot
report to the command and control network.
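The drag-and-drop interaction described above might reduce, in code, to building and sending a collaboration message when the icon is released on the map. The function names and message fields below are illustrative assumptions, not the client's actual message schema.

```python
# Sketch of the drag-and-drop re-tasking interaction.
def build_stare_point_message(platform_id, drop_lat, drop_lon, user):
    return {
        "type": "collaboration",
        "action": "adjust_stare_point",
        "platform": platform_id,
        "target": {"lat": drop_lat, "lon": drop_lon},
        "requested_by": user,
    }


def on_icon_drop(platform_id, map_coords, send):
    # Called by the touch client when the ISR icon is released on the map.
    send(build_stare_point_message(platform_id, *map_coords, user="operator1"))


on_icon_drop("uav-1", (34.731, -86.586), send=print)
```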
[0035] Item 316 allows a user to scale a viewport by adjusting a
slider or using touch-based gestures to match a desired Area Of
Responsibility (AOR). Item 318 is a platform/sensor field of view
capability that allows a user to project a platform's sensor's
Field Of View (FOV) onto the map. Item 320 depicts a mission replay
capability that allows a user to adjust a timeline slider to
dynamically retrieve, view, and replay archived operational map
data. Item 322 allows users to request a sensor 22 to loiter or
slew its payload by dragging and dropping the corresponding icon
which allows the user to send a collaboration message to re-task a
platform's commanded loiter position or payload target.
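As a final illustration, the mission replay control of item 320 could be modeled as selecting, for each entity, its most recent archived record at or before the slider's replay time. This sketch reuses the idea of the archive query shown earlier; all names and sample records are assumptions made for the example.

```python
# Sketch of reconstructing the map state for the replay timeline slider.
def replay_at(archive_records, replay_ts):
    # Keep, for each entity, its latest archived position at or before
    # the selected replay time.
    state = {}
    for record in sorted(archive_records, key=lambda r: r["ts"]):
        if record["ts"] <= replay_ts:
            state[record["entity"]] = record
    return state


records = [
    {"entity": "uav-1", "lat": 34.70, "lon": -86.60, "ts": 100},
    {"entity": "uav-1", "lat": 34.72, "lon": -86.62, "ts": 200},
    {"entity": "convoy-3", "lat": 34.65, "lon": -86.55, "ts": 150},
]
print(replay_at(records, replay_ts=160))   # uav-1 at ts=100, convoy-3 at ts=150
```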
* * * * *