U.S. patent application number 14/686,192 was filed on April 14, 2015 and published by the patent office on 2016-03-03 for shared server methods and systems for information storage, access, and security.
This patent application is currently assigned to COBAN TECHNOLOGIES, INC. The applicant listed for this patent is Coban Technologies, Inc. The invention is credited to Allan Chen and Yun Long Tan.
Publication Number: 20160062992
Application Number: 14/686,192
Family ID: 54932528
Published: 2016-03-03
United States Patent Application 20160062992
Kind Code: A1
Chen; Allan; et al.
March 3, 2016
SHARED SERVER METHODS AND SYSTEMS FOR INFORMATION STORAGE, ACCESS,
AND SECURITY
Abstract
Devices and methods for managing multi-media files and
associated metadata in a hybrid manner are disclosed. Methods for
using the device(s) to implement different methods for managing
information obtained (e.g., recorded) by a plurality of recording
devices are also disclosed. This disclosure also relates to
comprehensive use of multiple distinct surveillance systems in a
coordinated manner. For example, a set of surveillance devices
configured for use by one or more law enforcement agencies or other
government agencies may share metadata to facilitate indexing,
sharing, accessing, and coordinating potential surveillance
recordings. In one example, metadata may be uploaded to cloud
storage while associated multi-media files are maintained locally
by the responsible agency. Maintaining metadata and actual
multi-media content separately may reduce bandwidth transmission
requirements and maintain confidentiality of surveillance
recordings. Further, chain of custody of evidence requirements
regarding digitally recorded evidence may be complied with.
Inventors: Chen; Allan; (Sugar Land, TX); Tan; Yun Long; (Sugar Land, TX)
Applicant: Coban Technologies, Inc., Houston, TX, US
Assignee: COBAN TECHNOLOGIES, INC., Houston, TX
Family ID: 54932528
Appl. No.: 14/686,192
Filed: April 14, 2015
Related U.S. Patent Documents

Application Number: 62/044,139
Filing Date: Aug. 29, 2014
Current U.S. Class: 707/736; 707/803
Current CPC Class: G06F 2212/1052 20130101; H04L 9/3231 20130101; G06F 16/21 20190101; G06F 2212/7207 20130101; H04L 9/3242 20130101; H04N 21/239 20130101; H04N 21/2743 20130101; G06Q 50/26 20130101; G06F 21/79 20130101; H04N 7/185 20130101; G06F 2212/7202 20130101; G06F 12/0246 20130101; G06F 2221/0704 20130101; H04N 21/2543 20130101; H04N 21/25816 20130101; G11B 27/10 20130101; G06F 21/44 20130101; G06F 16/43 20190101; H04N 21/2353 20130101; G06F 12/1408 20130101; G06F 2212/1024 20130101; G06F 16/13 20190101; H04N 21/214 20130101
International Class: G06F 17/30 20060101 G06F017/30; G06Q 50/26 20060101 G06Q050/26
Claims
1. A computer system, comprising: one or more processors; and one
or more network communication interfaces communicatively coupled to
the one or more processors, wherein the one or more processors are
configured to execute instructions to cause the one or more
processors to: receive, via the one or more network communication
interfaces, first metadata information pertaining to a multi-media
recording, the first metadata information comprising information
regarding attributes describing recording circumstances of the
multi-media recording and an access location of the multi-media
recording; initiate transmission of at least a portion of the first
metadata information to a first storage location; and categorize
the multi-media recording using the first metadata information
independently of the multi-media recording.
2. The computer system of claim 1, wherein the one or more
processors are further configured to execute instructions to cause
the one or more processors to: process the first metadata
information pertaining to the multi-media recording; and
incorporate the processed first metadata information in an index
containing information pertaining to additional multi-media
recordings obtained from multiple distinct portable recording
devices.
3. The computer system of claim 2, wherein the one or more
processors are further configured to execute instructions to cause
the one or more processors to: receive a query request to identify
potentially applicable multi-media recordings; and utilize the
index to provide a response to the query request, wherein the
response to the query request comprises metadata information
pertaining to one or more potentially applicable multi-media
recordings or information identifying a storage location of one or
more potentially applicable multi-media recordings.
4-6. (canceled)
7. The computer system of claim 1, wherein the information
regarding attributes describing recording circumstances comprises
information pertaining to items selected from the group consisting
of: recording location, recording time, recording initiation,
recording termination, recording duration, event type/tag, and a
user who performed the multi-media recording.
8. The computer system of claim 1, wherein the one or more
processors are further configured to execute instructions to cause
the one or more processors to: apply audit controls regarding
access to and/or alteration of the first metadata information
and/or the multi-media recording.
9. (canceled)
10. The computer system of claim 1, wherein the one or more
processors are further configured to execute instructions to cause
the one or more processors to: receive an indication identifying
one or more potentially applicable multi-media recordings, the
indication responsive to a query request; and request initiation of
transmission of at least one of the one or more potentially
applicable multi-media recordings from a remote storage location to
a storage location accessible to the computer system.
11. The computer system of claim 1, wherein the one or more
processors are further configured to execute instructions to cause
the one or more processors to: apply a data retention policy to the
multi-media recording, wherein the data retention policy indicates
a data retention period for the multi-media recording based on the
first metadata information.
12-22. (canceled)
23. The computer system of claim 1, wherein the one or more
processors are further configured to execute instructions to cause
the one or more processors to: initiate transmission of the
multi-media recording to a second storage location different from
the first storage location.
24. The computer system of claim 23, wherein the first storage
location is a local network accessible storage location and the
second storage location is a remote storage location.
25. The computer system of claim 1, wherein the one or more
processors are further configured to execute instructions to cause
the one or more processors to: copy the first metadata information
and/or the multi-media recording to a local storage area distinct
from the first storage location.
26. The computer system of claim 1, further comprising: a local
storage area communicatively coupled to the one or more processors;
and a plug-in port communicatively coupled to the one or more
processors and configured to interface with a portable recording
device, wherein the initiation of transmission of the first
metadata information to the first storage location and/or
initiation of transmission of the multi-media recording to a second
storage location occurs automatically upon connection of a portable
recording device to the plug-in port, the portable recording device
storing the first metadata information and/or the multimedia
recording, respectively.
27. The computer system of claim 1, wherein the one or more
processors are further configured to execute instructions to cause
the one or more processors to: correlate the first metadata
information with additional metadata information to produce
correlated metadata information, the additional metadata
information comprising information regarding attributes describing
recording circumstances of one or more additional multi-media
recordings.
28. The computer system of claim 27, wherein the one or more
processors are further configured to execute instructions to cause
the one or more processors to: provide an interface to access the
correlated information, wherein the interface provides information
to assist in audit control of the first and additional metadata
information and the first and the one or more additional
multi-media recordings.
29. The computer system of claim 27, wherein the correlating of the
first metadata information with the additional metadata information
to produce correlated metadata information comprises correlating
the first and the one or more additional multi-media recordings
based on an event type associated with each of the first and the
one or more multi-media recordings, respectively, a recording
location of each of the first and the one or more multi-media
recordings, respectively, and/or a recording time of each of the
first and the one or more multi-media recordings, respectively.
30. A method comprising: receiving at a computer system, via one or
more network communication interfaces, first metadata information
pertaining to a first multi-media recording, the first metadata
information comprising information regarding attributes describing
recording circumstances of the first multi-media recording and an
access location of the first multi-media recording; initiating
transmission of at least a portion of the first metadata
information to a first storage location, the first storage location
comprising a storage area configured for storing at least metadata;
and categorizing the first multi-media recording using the first
metadata information independently of the first multi-media
recording.
31. The method of claim 30, further comprising: receiving a query
request to identify potentially applicable multi-media recordings;
and utilizing an index to provide a response to the query request,
wherein the response to the query request comprises metadata
information pertaining to one or more potentially applicable
multi-media recordings or information identifying a storage
location of one or more potentially applicable multi-media
recordings, and wherein the index contains metadata information
pertaining to multi-media recordings obtained from one or more
portable recording devices.
32. The method of claim 30, further comprising: applying audit
controls regarding access to and/or alteration of the first
metadata information and/or the first multi-media recording.
33. The method of claim 30, further comprising: receiving an
indication identifying one or more potentially applicable
multi-media recordings, the indication responsive to a query
request; and requesting initiation of transmission of at least one
of the one or more potentially applicable multi-media recordings
from a remote storage location to a storage location accessible to
the computer system.
34. The method of claim 30, further comprising: applying a data
retention policy to the first multi-media recording, wherein the
data retention policy indicates a data retention period for the
first multi-media recording based on the first metadata
information.
35. The method of claim 30, further comprising: initiating
transmission of the first multi-media recording to a second storage
location different from the first storage location, the second
storage location comprising a storage area configured for storing
at least multi-media recordings.
36. The method of claim 35, wherein the first storage location is a
local network accessible storage location and the second storage
location is a remote storage location.
37. The method of claim 30, further comprising: correlating the
first metadata information with additional metadata information to
produce correlated metadata information, the additional metadata
information comprising information regarding attributes describing
recording circumstances of one or more additional multi-media
recordings; and generating an index based on the correlated
metadata information, the index containing information pertaining
to the first and the one or more additional multi-media recordings.
Description
CROSS-REFERENCE TO RELATED APPLICATIONS
[0001] This application claims the benefit of, and priority to,
U.S. Provisional Application No. 62/044,139, filed Aug. 29, 2014,
and entitled, "Compact Multi-Function DVR with Multiple Integrated
Wireless Data Communication Devices," which is incorporated herein
by reference.
STATEMENT REGARDING FEDERALLY SPONSORED RESEARCH
[0002] Not applicable.
FIELD OF THE INVENTION
[0003] This disclosure relates generally to systems and methods to
assist in managing information including both multi-media and
associated metadata obtained (e.g., recorded) by a recording
device. More particularly, but not by way of limitation, this
disclosure relates to systems and methods for maintaining large
multi-media files on local storage and associated metadata files on
remote (e.g., cloud based) storage to facilitate searching,
cataloging, indexing, audit tracking, accessibility, and other
maintenance functions without requiring upload of large volumes of
data.
BACKGROUND
[0004] Today's law enforcement agencies are increasing their use of
digital data to collect surveillance information and other forms of
data to be used as evidence in legal proceedings. Devices and
methods for managing multi-media files collected as part of this
surveillance and evidence collection continue to proliferate.
Multi-media files may be large; for example, a video or audio file
may easily reach megabytes in size, depending on the length of the
recording. Video files are typically larger than audio files, and
they grow with the resolution of the video recording. That is, even
with current audio and video compression techniques, higher
resolution video files typically require more storage than either
audio or lower resolution video. Video files are also typically
larger than corresponding audio files because they include more
data than an audio recording.
[0005] Metadata associated with either audio recordings or video
recordings is a relatively small amount of data compared to the
audio or video data. However, today's systems typically embed the
metadata as part of the audio or video data file such that access
to the metadata requires access to the potentially large
multi-media file. Also, most access programs require an entire file
to understand the structure and content of the file itself.
Accordingly, to access any metadata associated with a typical
multi-media file, one must have complete access to the entire
multi-media file.
SUMMARY
[0006] According to a first aspect of the invention, a computer
system configured to collect and manage metadata associated with
one or more multi-media recordings is disclosed. The computer
system includes one or more processors and one or more network
communication interfaces communicatively coupled to the one or more
processors. The computer system also includes a storage area
accessible to the one or more processors. The storage area may be
used to store executable instructions for the processor(s) and to
store any collected (e.g., recorded) information. Of course, these
two types of data may be stored in separate logical areas of the
storage area. Overall, the computer system may be configured, by
the executable instructions, to receive, via the one or more
network communication interfaces, metadata information pertaining
to at least one multi-media recording. The metadata information may
include information regarding attributes describing recording
circumstances for the at least one multi-media recording and an
access location for the at least one multi-media recording. The
attributes describing recording circumstances will generally
provide information about when, where, why, and possibly how the
recording was made. This information about recording circumstances
may be helpful to determine which recordings may be of interest for
a given activity or search query.
[0007] In a second aspect of this disclosure, the computer system
(or a separate computer system) may be further configured to
process the metadata for the at least one multi-media recording to
incorporate information into a global index or catalog of
additional multi-media recordings. The additional multi-media
recordings may be obtained from the same or a plurality of distinct
capture devices. The overall global index may be useful to respond
to query requests to identify potentially applicable multi-media
recordings.
[0008] In a third aspect of this disclosure, a method of managing a
plurality of multi-media recordings is disclosed. The method may
include receiving first metadata information having information
regarding attributes describing recording circumstances
attributable to a first multi-media recording obtained by a first
recording device. The first metadata information may be stored in
an associated external file rather than embedded into the
multi-media recordings. The metadata may be correlated with other
information about additional multi-media recordings. Overall, the
metadata may be managed independently of the recordings and provide
location information (e.g., storage location) for selected
multi-media files. A user interface may be provided to allow query
type functions to interface with the correlated information to
identify potentially relevant recordings based on a query
request.
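The metadata-first workflow described in this aspect can be sketched in code. The following Python fragment is purely illustrative and not part of the disclosure; all class, field, and location names (e.g., `RecordingMetadata`, the `agency-nas://` scheme) are invented for the example.

```python
from dataclasses import dataclass
from typing import List, Tuple

# Hypothetical metadata record; field names are illustrative only.
@dataclass
class RecordingMetadata:
    recording_id: str
    officer_id: str            # who initiated the recording
    event_tag: str             # why the recording was made
    start_time: str            # when (ISO-8601 timestamp)
    gps_location: Tuple[float, float]  # where (latitude, longitude)
    storage_location: str      # access location of the full multi-media file

class MetadataIndex:
    """Correlates metadata records independently of the recordings themselves."""

    def __init__(self):
        self._records: List[RecordingMetadata] = []

    def ingest(self, record: RecordingMetadata) -> None:
        self._records.append(record)

    def query(self, event_tag: str) -> List[str]:
        # Returns storage locations of potentially applicable recordings;
        # the large multi-media files are never transferred during the query.
        return [r.storage_location for r in self._records
                if r.event_tag == event_tag]

index = MetadataIndex()
index.ingest(RecordingMetadata("rec-001", "officer-17", "traffic-stop",
                               "2015-04-14T10:32:00Z", (29.62, -95.63),
                               "agency-nas://vault/rec-001.mp4"))
index.ingest(RecordingMetadata("rec-002", "officer-22", "pursuit",
                               "2015-04-14T11:05:00Z", (29.60, -95.61),
                               "agency-nas://vault/rec-002.mp4"))

print(index.query("traffic-stop"))  # → ['agency-nas://vault/rec-001.mp4']
```

Note that a query response carries only metadata and storage locations, consistent with the aspect above: the recordings themselves are fetched separately, and only if needed.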
[0009] In a fourth aspect of this disclosure, a docking station is
disclosed. The docking station may be configured to manage the
multi-media recordings and assist with overall management of
multi-media recordings as discussed throughout this disclosure. The
docking station may be configured to automate and possibly
prioritize some or all of the disclosed management functions.
[0010] Other aspects of the embodiments described herein will
become apparent from the following description and the accompanying
drawings, illustrating the principles of the embodiments by way of
example only.
BRIEF DESCRIPTION OF THE DRAWINGS
[0011] It being understood that the figures presented herein should
not be deemed to limit or define the subject matter claimed herein,
the applicants' disclosure may be understood by reference to the
following description taken in conjunction with the accompanying
drawings, in which like reference numerals identify like
elements.
[0012] FIGS. 1A-B illustrate a rear view and a front view,
respectively, of a device for capturing (e.g., recording)
multi-media and metadata according to some disclosed
embodiments.
[0013] FIGS. 2A-C illustrate block diagrams of a processing system
and two example removable storage devices that may be used for the
disclosed integrated mobile surveillance system to capture and
store multi-media files and associated metadata according to some
disclosed embodiments.
[0014] FIG. 3 illustrates a block system diagram showing some
additional internal components for the device of FIGS. 1A-B,
according to some disclosed embodiments.
[0015] FIG. 4 illustrates an intelligent docking, upload, and
charging station for battery packs and portable recording devices
according to some disclosed embodiments.
[0016] FIG. 5 illustrates a possible process flow to "checkout" a
portable device (e.g., body worn camera, wireless microphone),
including a storage device, that may be used by specific law
enforcement personnel for the duration of checkout and assist in
chain of custody procedures according to some disclosed
embodiments.
[0017] FIG. 6 illustrates possible data flow and Software as a
Service (SaaS) components for working with information stored in a
"hybrid" manner according to some disclosed embodiments.
[0018] FIG. 7 illustrates a flow chart depicting one possible
process for data mining of information collected by a plurality of
surveillance systems according to some disclosed embodiments.
[0019] FIGS. 8A-F illustrate excerpts of metadata files using
eXtensible Markup Language (XML) for the data format, according to
some disclosed embodiments.
NOTATION AND NOMENCLATURE
[0020] Certain terms are used throughout the following description
and claims to refer to particular system components and
configurations. As one skilled in the art will appreciate, the same
component may be referred to by different names. This document does
not intend to distinguish between components that differ in name
but not function. In the following discussion and in the claims,
the terms "including" and "comprising" are used in an open-ended
fashion, and thus should be interpreted to mean "including, but not
limited to . . . ." Also, the term "couple" or "couples" is
intended to mean either an indirect or direct connection. Thus, if
a first device couples to a second device, that connection may be
through a direct connection, or through an indirect connection via
other devices and connections.
[0021] As used throughout this disclosure the terms "computer
device" and "computer system" will both be used to refer to an
apparatus that may be used in conjunction with disclosed
embodiments of connectable storage drives and self-contained
removable storage devices. As used herein, a computer device may be
thought of as having a subset of functionalities as compared to a
computer system. That is, a computer device may refer to a special
purpose processor-based device such as a digital video surveillance
system primarily configured for executing a limited number of
applications. A computer system may more generally refer to a
general purpose computer such as a laptop, workstation, or server
which may be configured by a user to run any number of off the
shelf or specially designed software applications. Computer systems
and computer devices will generally interact with disclosed storage
drives included in embodiments of the disclosed portable recording
device in the same or similar ways.
[0022] The term "hybrid storage" is used in this disclosure to
describe that data associated with accessing and managing
multi-media files may be stored in a plurality of locations as
opposed to a single location and not embedded within the
multi-media file itself. For example, metadata files containing
attributes of associated multi-media files, and/or data collected
or maintained in association with multi-media files, may be stored
remotely from the multi-media files themselves. Metadata files are
typically considerably smaller in size than multi-media files.
Thus, metadata files are more easily transferred across data links
that may have limited bandwidth. As explained further below, hybrid
storage may allow for searching and indexing of numerous
multi-media files without requiring unnecessary transfer of the
potentially large multi-media files (e.g., video/audio recordings).
For simplicity, the term "multi-media" will be used throughout this
disclosure to refer to files collected (e.g., recorded) by an audio
or audio/video recorder. Multi-media files may include only audio,
only video, or audio and video together and the information may be
compressed using an industry standard compression technology (e.g.,
Motion Picture Expert Group (MPEG) standards, Audio Video
Interleave (AVI), etc.) or another proprietary compression or
storage format.
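The "hybrid storage" split described above can be sketched as follows. This is a minimal illustration, not the disclosed implementation: the "cloud" and local "vault" are plain in-memory dictionaries standing in for real remote and on-premises stores, and the `local-vault://` scheme is invented for the example.

```python
import json

cloud_metadata_store = {}  # remote, searchable store for small sidecar metadata
local_media_vault = {}     # large recordings stay on agency premises

def store_hybrid(recording_id, media_bytes, metadata):
    # Keep the large multi-media recording local...
    local_media_vault[recording_id] = media_bytes
    # ...and upload only the sidecar metadata, annotated with where the
    # full recording can later be retrieved from.
    sidecar = dict(metadata, access_location=f"local-vault://{recording_id}")
    cloud_metadata_store[recording_id] = json.dumps(sidecar)

media = b"\x00" * 50_000_000  # stand-in for a ~50 MB video recording
meta = {"event_tag": "traffic-stop", "officer_id": "officer-17"}
store_hybrid("rec-001", media, meta)

uploaded = len(cloud_metadata_store["rec-001"].encode())
print(f"uploaded {uploaded} bytes of metadata instead of {len(media)} bytes of video")
```

The asymmetry is the point: only a few hundred bytes of metadata traverse the (possibly bandwidth-limited) link, while the recording itself never leaves the responsible agency unless explicitly requested.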
[0023] The term "recording circumstances" is used herein to
describe that metadata information associated with an instance of a
multi-media recording may contain information describing attributes
associated with the act of actual recording of that multi-media
file. That is, the metadata may describe who (e.g., Officer ID) or
what (e.g., automatic trigger) initiated the recording. The
metadata may also describe where the recording was made. For
example, location may be obtained using global positioning system
(GPS) information. The metadata may also describe why (e.g., event
tag) the multi-media recording was made. In addition, the metadata
may also describe when the recording was made using timestamp
information obtained in association with GPS information or from an
internal clock, for example. Taken together, these types of
metadata describe the circumstances under which the multi-media
recording was made and provide additional context for it. This metadata may
include useful information to correlate multi-media recordings from
multiple distinct surveillance systems. This type of correlation
information, as described further below, may assist in many
different functions (e.g., query, data retention, chain of custody,
and so on).
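Since the figures (FIGS. 8A-F) present metadata excerpts in XML, the who/what/where/why/when attributes above might be captured in an XML sidecar along the following lines. The element and attribute names here are invented for illustration and do not reproduce the actual schema of the figures.

```python
import xml.etree.ElementTree as ET

# Hypothetical XML sidecar capturing the recording circumstances of one
# multi-media file. All names below are illustrative assumptions.
def build_circumstances(officer_id, trigger, lat, lon, event_tag, timestamp):
    root = ET.Element("RecordingCircumstances")
    ET.SubElement(root, "Who", officer_id=officer_id)
    ET.SubElement(root, "What", trigger=trigger)              # manual vs. automatic
    ET.SubElement(root, "Where", lat=str(lat), lon=str(lon))  # e.g., from GPS
    ET.SubElement(root, "Why", event_tag=event_tag)
    ET.SubElement(root, "When", timestamp=timestamp)          # GPS or internal clock
    return ET.tostring(root, encoding="unicode")

xml_doc = build_circumstances("officer-17", "manual", 29.62, -95.63,
                              "traffic-stop", "2015-04-14T10:32:00Z")
print(xml_doc)
```

Because such a record is a small, self-describing text file, it can be indexed and correlated across multiple distinct surveillance systems without touching the recordings themselves.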
[0024] This disclosure also refers to storage devices and storage
drives interchangeably. In general, a storage device/drive
represents a medium accessible by a computer to store data and
executable instructions. Also, throughout this disclosure reference
will be made to "plugging in" a storage drive. It is noted that
"plugging in" a storage drive is just one way to connect a storage
drive to a computer device/system. This disclosure is not intended
to be limited to drives that physically "plug in" and disclosed
embodiments are also applicable to devices that are "connected" to
a computer device or computer system. For example, devices may be
connected using a cable or a computer bus.
Additionally, references to "removable" storage are analogous to
plugging-in/unplugging a device, connecting/disconnecting cabled
access to a device, and/or establishing/disconnecting networked
access to a device or storage area on a network (either wired or
wireless).
DETAILED DESCRIPTION OF DISCLOSED EMBODIMENTS
[0025] While various embodiments are described herein, it should be
appreciated that the present disclosure encompasses many inventive
concepts that may be embodied in a wide variety of contexts. Thus,
the following detailed description of exemplary embodiments, read
in conjunction with the accompanying drawings, is merely
illustrative and is not to be taken as limiting the scope of this
disclosure. Rather, the scope of the invention is defined by the
appended claims and equivalents thereof.
[0026] Illustrative embodiments of this disclosure are described
below. In the interest of clarity, not all features of an actual
implementation are described for every embodiment disclosed in this
specification. In the development of any such actual embodiment,
numerous implementation-specific decisions must be made to achieve
the design-specific goals, which will vary from one implementation
to another. It will be appreciated that such a development effort,
while possibly complex and time-consuming, would nevertheless be a
routine undertaking for persons of ordinary skill in the art having
the benefit of this disclosure.
[0027] Embodiments of the present disclosure provide for management
of multi-media files and associated metadata that might be
collected by one or more mobile surveillance systems, portable
video recording devices, and other types of data recorders. The
mobile (and possibly stationary) surveillance system devices may be
configured to capture video, audio, and data parameters pertaining
to activity in the vicinity of the surveillance system, for example
a police vehicle. Other types of vehicles and other situations
requiring a surveillance unit are also within the scope of this
disclosure. Other types of vehicles may include, but are not
limited to, any transportation means equipped with a mobile
surveillance system (e.g., civilian transport trucks). The
disclosed embodiments are explained in the context of mobile
surveillance systems for vehicles that aid in law enforcement such
as buses, ambulances, police motorcycles or bicycles, fire trucks,
airplanes, boats, military vehicles, etc. However, in some
embodiments, data from other types of vehicles, including
non-law-enforcement vehicles, may also be collected as a possible
aid to law enforcement (or for other applicable uses), at least in
part because of the disclosed data mining and coordination
techniques.
[0028] Mobile surveillance systems have been in use by police
departments for the past few decades. Over that period of time,
several advances have been introduced in the technology used to
provide video/audio and data regarding specific police events. In
the late 1990s through the early 2000s, digital technologies became
prevalent in the industry, replacing existing analog technologies.
With the use of digital technologies, law enforcement agencies
obtained several advances over previous technologies and may
further benefit from additional advances (e.g., as described in
this disclosure). In general, digital technologies are more
adaptable and offer more opportunities for improvement than
corresponding analog technologies. This is largely because digital
video/audio files can be processed in a multitude of ways by
specifically configured computer devices. This disclosure
elaborates on several novel techniques to enhance the capability,
reliability, ease of use, security, integrity, and other aspects of
mobile surveillance systems and the information they collect.
[0029] Today, there are numerous surveillance systems in use by law
enforcement and the data they collect continues to increase in
volume and complexity. Accordingly, enhanced management techniques
for the amount of available data may be required. Additionally,
there is a need to improve data access and distribution, integrity,
reliability, and security throughout the lifecycle of that data.
Legal requirements for data collected by a remote/mobile
surveillance system include conformance to judiciary requirements
such as "chain of custody/evidence," and "preservation of
evidence." Chain of custody (CoC), in legal contexts, refers to the
chronological documentation or paper trail audit, showing the
seizure, custody, control, transfer, analysis, and disposition of
physical or electronic evidence. Preservation of evidence is a
closely related concept that refers to maintaining and securing
evidence from a particular crime scene before it ultimately appears
in a courtroom. For example, the evidence may go to a forensic
laboratory prior to arriving at the courtroom. Evidence
admissibility in court is predicated upon an unbroken chain of
custody. It is important to demonstrate that the evidence
introduced at trial is the same evidence collected at the crime
scene [that is, all access to the evidence (e.g., electronic files)
was controlled and documented], and that the evidence was
not altered in any way. Requirements for law enforcement are
further described in "Criminal Justice Information Services (CJIS)
Security Policy," version 5.3 published Aug. 4, 2014 referenced as
"CJISD-ITS-DOC-08140-5.3" which is hereby incorporated by reference
in its entirety.
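The disclosure requires that all access to digital evidence be logged to preserve an unbroken chain of custody. One conventional way to make such a log tamper-evident, offered here as an illustrative assumption rather than as the disclosed mechanism, is to chain each audit entry to its predecessor with a cryptographic hash:

```python
import hashlib
import json

# Sketch of a tamper-evident access log for chain-of-custody purposes.
# Hash-chaining is one common technique; the disclosure mandates logging,
# not this particular mechanism.
class CustodyLog:
    def __init__(self):
        self.entries = []

    def record(self, actor, action, evidence_id):
        prev_hash = self.entries[-1]["hash"] if self.entries else "0" * 64
        body = {"actor": actor, "action": action,
                "evidence_id": evidence_id, "prev": prev_hash}
        digest = hashlib.sha256(
            json.dumps(body, sort_keys=True).encode()).hexdigest()
        self.entries.append({**body, "hash": digest})

    def verify(self):
        prev = "0" * 64
        for e in self.entries:
            body = {k: e[k] for k in ("actor", "action", "evidence_id", "prev")}
            expected = hashlib.sha256(
                json.dumps(body, sort_keys=True).encode()).hexdigest()
            if e["prev"] != prev or e["hash"] != expected:
                return False
            prev = e["hash"]
        return True

log = CustodyLog()
log.record("officer-17", "upload", "rec-001")
log.record("evidence-clerk", "view", "rec-001")
print(log.verify())                  # True: unbroken chain
log.entries[0]["action"] = "delete"  # tamper with history...
print(log.verify())                  # False: alteration is detected
```

Because each entry's hash covers the previous entry's hash, altering or deleting any historical entry invalidates every subsequent hash, supporting the demonstration that logged evidence access was controlled and documented.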
[0030] As will be recognized, disclosed embodiments may allow for
comprehensive back-office video management software to be provided
using a Software as a Service (SaaS) architecture, giving each
agency (even small remote agencies) the tools they need to capture,
transfer, store and manage their digital video evidence from car to
court. That is, the disclosed system and back-office management
techniques meet the preservation of evidence requirements outlined
above with respect to management of digital evidence for law
enforcement. All activity with respect to digital evidence in the
back-office system may be logged to ensure proper documentation of
evidence handling. The disclosed system may include electronic
transfer of evidence in a controlled manner and may provide
comprehensive coordination of potential evidence captured from a
plurality of surveillance systems. The disclosed system may also
include integrated DVD burning software for easy and accurate
evidence transfer.
[0031] Referring now to FIGS. 1A-B, disclosed embodiments of an
integrated mobile surveillance system 100 are intended to
incorporate a plurality of functions "built in" to mobile
surveillance system 100. Additionally, aspects of integrated mobile
surveillance system 100 have been designed with consideration for
future expansion as new technologies and capabilities become
available. Aspects of integrated system 100 include, but are not
limited to, the following integrated functional units. Integrated
system 100 may be configured to have one or more of each
of these functional units, as appropriate. Integrated wireless
microphone antenna connector 105 allows capture of audio from a
remote wireless microphone located within proximity of integrated
system 100. An external multi-conductor interface cable 110 allows
a wired connection to one or more internal interfaces of integrated
system 100. One or more Universal serial bus (USB) ports, such as
USB port 140, may be provided for general peripheral connectivity
and expansion according to some disclosed embodiments. An
integrated global positioning system (GPS) module 120 with optional
external antenna or connector 115 is used in part for capturing
location data, time sync, and speed logging. The GPS information
may also be used for time synchronization and to coordinate data,
ultimately facilitating map based search and synchronization (e.g.,
locate recorded information from a time and/or location across a
plurality of recording devices). Dual front facing cameras 125 may
include both a wide angle video camera and a tight field of view
camera for optical zoom effect snap shots. A record indicator 130
provides an indication of a current operating mode for integrated
system 100. A wired Ethernet adapter (e.g., Gigabit, 10/100 BASE-T,
etc.) 135 (or a wireless network adapter, not shown) may be provided
for data upload, computer interfacing, remote display, and configuration.
Additionally, multiple wireless data communication devices (not
shown) may be integrated for flexibility and expansion. For
example, the system may include adapters conforming to wireless
communication specifications and technologies such as 802.11,
Bluetooth, radio-frequency identification (RFID), and near field
communication (NFC). Each of these interfaces may be used, at least
in part, for data exchange, device authentication, and device
control. A serial port (not shown) may be used to interface with
radar/laser speed detection devices and other devices as needed. A
G-Sensor/Accelerometer (not shown) may be used for impact detection
and to automatically initiate record mode. The
G-Sensor/Accelerometer may also provide data logging for impact
statistics and road condition data. A DIO (Digital Input/Output)
(not shown) may be used for external triggers to activate
record mode and/or provide metadata to the system. The DIO can also
be used to control external relays or other devices as appropriate.
The DIO can also be used to detect brake, light bar, car door, and
gun lock so that the video recording can be automatically
triggered. As shown in FIGS. 1A-B, a combination power button and
brightness control 145 can be used to turn on the system and
control the brightness of the monitor after the system is turned
on. Programmable function button 150 provides a user definable
external button for easy access to instigate any function provided
by integrated system 100. For example, rather than traversing
through a set of menus on articulating touchscreen 165, a user may
define function button 150 to perform an action with one touch
(e.g., instant replay, event tagging of a particular type, etc.).
An articulating touchscreen 165 may be used to view video in
real-time, or in one or more playback modes. Touchscreen 165 may
also serve as an input mechanism, providing a user interface to
integrated system 100. An integrated speaker (not shown) may be
used for in-car audio monitoring and in-car video/audio file
playback. An integrated internal battery 155 is shown, which
permits proper shutdown in the event of sudden power loss from the
vehicle, such as might occur as a result of a crash. Also depicted is a
removable SSD Flash drive 170 (e.g., secure digital (SD) or
universal serial bus (USB) type), including any type of storage
that may be inserted or attached to the system via a storage
interface (e.g., SCSI, SATA, etc.). For security of access to data,
removable SSD flash drive 170 may be secured via a mechanical
removable media key lock 160. In some embodiments, event based data
is recorded and written to the removable drive to be transferred to
a back office server for storage and management. Wireless
microphone sync contacts 175 may be configured to synchronize a
wireless microphone/camera, such as a body worn camera and
microphone, for communication with integrated system 100. In
addition to actual sync contacts, which require physical contact,
other synchronization methods for wireless microphone/cameras
include utilizing NFC or RFID capability between the wireless
device and integrated system 100.
[0032] In addition to the components mentioned above, disclosed
embodiments of integrated mobile surveillance system 100 may be
configured to include functional components to provide operational
characteristics that may include the following. In accordance with
some embodiments, a pre-event playback function may be used to tag
historical events. In normal operation, integrated mobile
surveillance system 100 may record continuously to internal storage
and store tagged information (e.g., marked for export) to removable
storage. However, for the case of an incident that occurs without a
timely event trigger, the operator may instruct the system to
navigate back to an earlier time captured in the internal storage
and play back that portion of video/audio information. The selected
video, at any available point in time, may be marked, tagged for
extraction, and stored to removable storage, as if the event had
been tagged at that point in time. In accordance with some other
embodiments, a component may provide an instant replay function
configured to play back the last predetermined amount of time with
one button press. Note that both the instant replay and pre-event
playback (along with general system operation) allow for
simultaneous playback while the system is concurrently recording
information. Pre-defined event tags and pre-defined event tagging
functions may also be provided. For example, tags may include DWI,
felony, speeding, stop sign, chase, etc. The tagging action may be
used to catalog portions of recorded data. For example, after an
event is indicated as ending (e.g., such as stop recording
indication), an option to select a predefined event may be
displayed. Upon selection, the system may allow an associated
portion of collected information to be marked in a text file for
current and future identification and storage. Further, when the
tagged information is transferred to the data management software,
the tagged information may be searched by event type and maintained
on the server for a predefined retention period based on the event
type. A streaming function may also be provided to stream live view
and recorded video, audio, and/or data over available wireless and
wired networks. The integrated system 100 may also integrate
"hotspot" capabilities which allow the system to serve as an agency
accessible, mobile wireless local area network (WLAN).
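The pre-event playback and tagging behavior described above may be sketched, for illustration only, as follows. The segment structure, names, and durations are hypothetical assumptions rather than the disclosed implementation; the point shown is that a portion of the continuous internal recording can be retroactively marked and copied to removable storage as if an event trigger had fired at that earlier time:

```python
from dataclasses import dataclass, field

@dataclass
class Segment:
    start: float      # seconds since shift start (hypothetical clock)
    end: float
    data: bytes

@dataclass
class Recorder:
    internal: list = field(default_factory=list)   # continuous recording
    removable: list = field(default_factory=list)  # tagged/exported events

    def record(self, seg: Segment):
        self.internal.append(seg)

    def tag_pre_event(self, t_start: float, t_end: float, tag: str):
        """Mark an earlier portion of the continuous recording for export,
        as if an event trigger had fired at that time."""
        hits = [s for s in self.internal
                if s.end > t_start and s.start < t_end]
        self.removable.append({"tag": tag, "segments": hits})
        return len(hits)

rec = Recorder()
for i in range(6):                           # six 10-second segments
    rec.record(Segment(i * 10, (i + 1) * 10, b""))
n = rec.tag_pre_event(15, 35, "stop sign")   # retroactively tag 15s-35s
```

Because tagging only copies references out of the continuous buffer, playback and tagging can proceed while recording continues, consistent with the simultaneous-playback behavior noted above.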
[0033] Referring now to FIGS. 2A-C, possible internals and
peripheral components of an example device 200, which may be used
to practice the disclosed functional capabilities of integrated
surveillance system 100, are shown. Example device 200 comprises a
programmable control device 210 which may be optionally connected
to input device 260 (e.g., keyboard, mouse, touchscreen, etc.),
display 270 or program storage device 280. Also, included with
programmable control device 210 is a network interface 240 for
communication via a network with other computers and infrastructure
devices (not shown). Note that network interface 240 may be included
within programmable control device 210 or be external to
programmable control device 210. In either case, programmable
control device 210 may be communicatively coupled to network
interface 240. Also, note that Program Storage Device (PSD) 280
represents any form of non-volatile storage including, but not
limited to, all forms of optical and magnetic storage elements
including solid-state storage.
[0034] Program control device 210 may be included in a device 200
and be programmed to perform methods, including hybrid storage of
metadata and associated multi-media files, in accordance with this
disclosure. Program control device 210 comprises a processing unit
(PU) 220, input-output (I/O) interface 250 and memory 230.
Processing unit (PU) 220 may include any programmable controller
device including, for example, the Intel Core®, Pentium®
and Celeron® processor families from Intel and the Cortex ARM
processor families from ARM® (INTEL® CORE®,
PENTIUM® and CELERON® are registered trademarks of the
Intel Corporation. CORTEX® is a registered trademark of the ARM
Limited Corporation. ARM® is a registered trademark of the ARM
Limited Company). Memory 230 may include one or more memory modules
and comprise random access memory (RAM), read only memory (ROM),
programmable read only memory (PROM), programmable read-write
memory, and solid state memory. One of ordinary skill in the art
will also recognize that PU 220 may also include some internal
memory including, for example, cache memory.
[0035] Various changes in the materials, components, circuit
elements, as well as in the details of the illustrated systems,
devices and below described operational methods are possible
without departing from the scope of the claims herein. For
instance, acts in accordance with disclosed functional capabilities
may be performed by a programmable control device executing
instructions organized into one or more modules (comprised of
computer program code or instructions). A programmable control
device may be a single computer processor (e.g., PU 220), a
plurality of computer processors coupled by a communications link
or one or more special purpose processors (e.g., a digital signal
processor or DSP). Such a programmable control device may be one
element in a larger data processing system such as a general
purpose computer system. Storage media, as embodied in storage
devices such as PSD 280 and memory internal to program control
device 210, are suitable for tangibly embodying computer program
instructions. Storage media may include, but not be limited to:
magnetic disks (fixed, floppy, and removable) and tape; optical
media such as CD-ROMs and digital video disks (DVDs); and
semiconductor memory devices such as Electrically Programmable
Read-Only Memory (EPROM), Electrically Erasable Programmable
Read-Only Memory (EEPROM), Programmable Gate Arrays and flash
devices. These types of storage media are also sometimes referred
to as computer readable medium or program storage devices.
[0036] FIG. 2B illustrates a secure digital (SD) card 285 that may
be configured as the programmable storage device described above.
An SD card is a nonvolatile memory card format for use in portable
devices, such as mobile phones, digital cameras, handheld consoles,
and tablet computers, etc. An SD card may be inserted into a
receptacle on the device conforming to the SD specification or may
alternately be configured with an interface to allow plugging into
a standard USB port (or other port). An example of the adapter for
USB compatibility 286 is illustrated in FIG. 2C. Modern computer
operating systems are typically configured to automatically permit
access to an SD card when it is plugged into an active computer
system (sometimes referred to as plug-n-play). In computing
technologies, a plug and play device or computer bus is one with a
specification that provides for or facilitates the discovery of a
hardware component in a system without the need for physical device
configuration or user intervention in resolving resource conflicts.
Because of additional security requirements regarding data access
with respect to the law enforcement field, disclosed systems may
incorporate a specifically modified interface to the removable
storage drive utilized in device 100 (i.e., removable media 170).
Modifications permitting specialized access to removable media,
such as a secure storage drive, are described in co-pending U.S.
patent application Ser. No. 14/588,139, entitled "Hidden Plug-in
Storage Drive for Data Integrity," by Hung C. Chang, which is
incorporated by reference herein. Modifications permitting
specialized functionality from removable media are described in
co-pending U.S. patent application Ser. No. 14/593,722, entitled
"Self-contained Storage Device for Self-contained Application
Execution," by Allan Chen et al., which is incorporated by
reference herein.
[0037] Referring now to FIG. 3, block diagram 300 illustrates one
embodiment of an integrated audio-video-data surveillance system.
Note that each of the components shown in block diagram 300 may be
communicatively coupled to other components via communication
channels (e.g., bus) not shown in the block diagram. The flow
arrows of block diagram 300 are general in nature to illustrate the
movement of information. In use, video and audio may be captured by
camera 305 and microphone 306 respectively. Captured data may be
provided initially to video/audio encoder 310 to encode and
optionally compress the raw video data and the encoded data may be
stored in a memory area (not shown) for access by CPU 315. Encoded
data may also be selectively stored to either internal failsafe
hard drive 320 or removable mobile hard drive 325 individually or
to both simultaneously. Data may also be transferred, for example
at the direction of a user, from internal failsafe hard drive 320
to removable mobile hard drive 325. Data capture devices such as
general purpose input output (GPIO) 330 and GPS 331 may be used to
capture metadata to associate with captured surveillance
information (e.g., multi-media files). All pertinent captured
metadata may be associated with captured video/audio recordings
using structured text files such as, for example, eXtensible Markup
Language (XML) files. An example of such structured text files is
explained in more detail below with reference to FIGS. 8A-F. In
addition to captured metrics provided by real-time capture inputs,
XML files may be utilized to store many different types of metadata
associated with captured video and data. This collection of
metadata may be used to describe "recording circumstances"
attributable to the surveillance information (e.g., multi-media
recordings). That is, the metadata may describe when, where, who,
and why information, among other things, to indicate information
about the act of recording the surveillance information. The
metadata may include, but not be limited to, timestamps of capture
[the internal clock (not shown) of system 100 may be synchronized
using GPS data], event tags, GPS coordinates, GPS and RADAR/LIDAR
measurements from a target vehicle, breathalyzer analysis
information, analytical information, and so on. Analytical
information will be discussed in more detail below with reference
to FIG. 7. Wireless interface 335 (or a wired interface (not shown)
when available) may be used to upload information from one or more
surveillance systems to back office servers located, for example,
at a police station or to cloud based resources. Back office
servers and cloud based resources will be discussed in more detail
below with reference to FIG. 6.
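As a purely illustrative sketch of how captured metrics might be associated with a recording in a structured text file, the fragment below builds an XML document with Python's standard library. The element and attribute names here are hypothetical assumptions; the actual schema used by disclosed embodiments is the one explained with reference to FIGS. 8A-F:

```python
import xml.etree.ElementTree as ET

# Hypothetical element names -- the actual schema is shown in FIGS. 8A-F.
meta = ET.Element("Recording")
ET.SubElement(meta, "Timestamp").text = "2015-04-14T13:05:22Z"  # GPS-synced
ET.SubElement(meta, "OfficerID").text = "Joe Smith"
ET.SubElement(meta, "EventTag").text = "DWI"
gps = ET.SubElement(meta, "GPS")
gps.set("lat", "29.6197")
gps.set("lon", "-95.6349")
ET.SubElement(meta, "RadarSpeed", units="mph").text = "87"

# Serialize the metadata for storage alongside the multi-media file.
xml_text = ET.tostring(meta, encoding="unicode")
```

Keeping such a file small and separate from the video it describes is what later permits metadata to be uploaded to the cloud while the multi-media file itself remains on local storage.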
[0038] Referring now to FIG. 4, advanced docking station 400 may
provide additional benefits for users that maintain a plurality of
portable body worn cameras 450 and/or a plurality of surveillance
systems. Some or all portable body worn cameras 450 may incorporate
one or more programmable function buttons 405. As shown in FIG. 4,
docking station 400 may have multiple ports/cradles 415. Docking
station 400 may assist in data upload, device checkout, device
upgrade (e.g., firmware/software update), recharging of battery
packs 420 and other maintenance type functions that may be
performed, for example, at a police station. For clarity, not all
repeated elements in FIG. 4 have an associated reference number.
Embodiments of the disclosed docking station may support
maintenance functions for multiple portable devices such as body
worn cameras 450 concurrently. The disclosed docking station 400
may be multifunctional for uploading and/or downloading of
video/audio and associated metadata. Configuration data such as
unit ID, user ID, operational modes, updates, and so on, may be
maintained and versions of such configuration information may be
presented on display screen 410 (which may also be a touchscreen
interface to docking station 400).
[0039] Docking station 400 may have integrated interfaces to
different types of surveillance systems. Interfaces such as USB,
wired Ethernet, or wireless network, as well as interface ports for
battery charging, may be included. Docking station 400 may also
contain a CPU and be configured as a computer device (see FIG. 1)
with an optional integrated touchscreen display 410 and output
connectors (not shown) for an optional external display/mouse or
device expansion. Docking station 400 may have an option for a wireless
display (not shown) to be used for status indication as well as for
an interface for checkout/assignment of surveillance system devices
to a user or group of users (See FIG. 5). Docking station 400 may
include wireless communications such as Bluetooth and/or
802.11ac/ad. Docking station 400 may also be configured to work as
an Access Point for a wireless network or may be configured to act
as a bridge to allow portable client devices to access
functionality of docking station 400 and possibly connect to other
system components including local or cloud based servers. Docking
station 400 may also include functional software or firmware
modules to support hybrid storage of recorded multi-media and
associated metadata automatically. Hybrid storage is discussed in
more detail below with reference to FIG. 7.
[0040] Docking station 400 may also have an internal storage device
to facilitate fast off-load storage which may be used to facilitate
a download/forward process for audio/video and metadata captured on
a surveillance system device (e.g., the body worn camera 450). For
example, the user may place the body worn camera 450 into a docking
station cradle 415 and docking station 400 offloads the data to the
local onboard storage drive (not shown) which can immediately (or
based on a timer) upload that information, or a portion thereof if
a hybrid model is used, to a server (e.g., back office server or cloud
storage). Uploads may be prioritized based on many different
attributes such as time, size, event type priority, and so on.
Docking station 400 may also have an integrated locking mechanism
for one or more of the uploading/charging ports/cradles 415. The
docking station 400 may be configured to control the locking
mechanism to hold or release the wearable device in order to
prevent the user from taking it out during uploading/downloading,
or to make sure that only the recently "checked out" device is
removed, for example.
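The attribute-based upload prioritization described above can be sketched, as an illustration only, with a simple composite sort key. The priority table and record fields below are hypothetical assumptions; an agency would configure its own attributes and ordering:

```python
# Hypothetical priority table; agencies would configure their own.
EVENT_PRIORITY = {"felony": 0, "DUI": 1, "chase": 2, "traffic": 9}

uploads = [
    {"id": "A", "event": "traffic", "captured": 1700, "size_mb": 900},
    {"id": "B", "event": "felony",  "captured": 1800, "size_mb": 4000},
    {"id": "C", "event": "DUI",     "captured": 1600, "size_mb": 1200},
]

def upload_key(item):
    # Highest event-type priority first, then oldest capture, then
    # smallest file; unknown event types sort last.
    return (EVENT_PRIORITY.get(item["event"], 99),
            item["captured"], item["size_mb"])

queue = sorted(uploads, key=upload_key)
order = [i["id"] for i in queue]
```

With these sample values the felony video uploads first despite being the largest and most recent, reflecting that event-type priority can dominate time and size.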
[0041] The touchscreen display 410 of FIG. 4 illustrates one
possible graphical user interface (GUI) layout as an example only.
Actual layouts may contain more information and features and may be
configurable based on requirements of different end users. In FIG.
4, the GUI shows examples of upload status and battery charging
progress. Other screens may be available on the GUI display 410 to
provide other status information such as unit ID, user ID, and/or
to assist with user checkout and assignment of devices to different
mobile surveillance systems.
[0042] Referring now to FIG. 5, process flow 500 illustrates a
possible method for assisting law enforcement personnel with
compliance to chain of custody of evidence requirements for legal
evidence. Chain of custody of evidence requirements may be
implemented with the assistance of docking station 400. In this
example, the computer device at the police station is considered to
be docking station 400 (but may be another workstation type device
for example) and a computer device in a police car, for example,
will be referred to as a "mobile surveillance system." Both docking
station 400 and the mobile surveillance system are example
embodiments of computer device 100 of FIG. 1 described above.
Beginning at block 505, a portable recording apparatus (e.g., body
worn camera 450) including a storage device (e.g., 285, 286) is
"checked in" at a police station, for example. In the "checked in"
state the portable recording device may be connected to docking
station 400 that is configured to interact with the storage device
of the portable recording device. At block 510, docking station 400
receives a request to assign a portable recording device (e.g.,
body worn camera 450, or wireless microphone) to an officer (e.g.,
Officer "Joe Smith") for use in a patrol "shift." The request may,
for example, come from a GUI presented on touchscreen 410.
Optionally, the request may also include information to assign the
portable recording device to a particular mobile surveillance
system for that shift (e.g., surveillance system of "patrol car
54"). At block 515, docking station 400 writes control information
to the storage device of the portable recording device to identify an
appropriate mobile device (e.g., 301). The control information may
include storage serial number, officer's ID (e.g., "Joe"), patrol
car (e.g., "54"), officer's password (likely encrypted), recording
parameter settings, or other information useful in assisting in
audit tracking of the portable recording device and any information
collected on the storage device of the portable recording device
during the shift. At block 520, the portable recording device is
removed from docking station 400 for association with a mobile
surveillance system (e.g., 301). The portable recording device
(e.g., 450) is now in a "checked out" state.
[0043] At block 525, the officer authenticates to a mobile
surveillance system. The portable recording device is connected to
the mobile surveillance system at block 530. Flow continues to
block 535 where the storage device of the portable recording device
(e.g., 450) becomes accessible to the mobile surveillance system if
authentication information is accurate. Authentication may require
that the mobile surveillance system match a previously identified
(e.g., at checkout) mobile surveillance system and may optionally
only become available after a second check that a proper officer
has authenticated to the mobile surveillance system. That is, the
portable recording device must be associated with a proper
surveillance system (e.g., 301), and the authenticated user must be
validated as a proper user. Thus, in this example, Officer "Joe
Smith" is authenticated to the mobile surveillance system and the
mobile surveillance system is the one in patrol car 54. In this
example, the surveillance system in patrol car 54 is the system
which Officer Smith should be using for his shift. Accordingly,
prior to allowing any access to the storage drive of the portable
recording device from the mobile surveillance system both
attributes should be verified. Such increased authentication
methods may assist in compliance with chain of custody of evidence
requirements for gathering and maintenance of evidence. Note that
some law enforcement agencies require a two-factor authentication
for access to data. Validating "checkout information" regarding
both the portable device and the authenticated officer (e.g., both
the association with the surveillance system of patrol car 54 and
confirming Officer Smith is logged into that system) is one example
of two-factor authentication.
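The two-factor validation of checkout information described above may be sketched, for illustration only, as a check of both factors against the control record written at checkout (block 515). The field names below are hypothetical assumptions, not the disclosed format:

```python
# Hypothetical control record written at checkout (block 515).
control = {"officer_id": "Joe Smith", "car_id": "54"}

def grant_storage_access(control, authenticated_officer, car_id):
    """Both factors must match the checkout record before the mobile
    surveillance system may access the portable device's storage."""
    return (control["officer_id"] == authenticated_officer
            and control["car_id"] == car_id)

ok = grant_storage_access(control, "Joe Smith", "54")    # both factors match
bad = grant_storage_access(control, "Joe Smith", "12")   # wrong patrol car
```

Requiring both matches before exposing the storage device means that neither a stolen recording device nor a wrong-car connection alone grants access, which is the property relied on for chain-of-custody compliance.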
[0044] At block 540, as the officer performs his shift duties
(e.g., goes on patrol, etc.), the mobile surveillance system
records and stores evidence and surveillance data onto the storage
device of the portable recording device. During the shift, all data
recorded on the storage device may be associated with the officer
for audit tracking purposes as indicated at block 545. For example,
a metadata file may be used to "mark" any recorded data with
officer's ID, event type, date/time, GPS location, etc.
[0045] Next, at block 550 actions that may take place at the end of
a shift, for example, are performed. After a shift is completed and
the officer, mobile surveillance system, and portable recording
device return to the police station, recorded data may be securely
(for example, but not limited to, by data encryption) uploaded
wirelessly to a back office system at the police station. Securely
uploaded, as used here, indicates that the recorded data will be
uploaded in a manner as to maintain its association with the
officer and maintain chain of custody of evidence requirements as
well as any other type of security regarding the wireless network,
etc. As an alternative to wireless upload, the officer may remove
(e.g., disconnect) the portable recording device (e.g., 450) and
relocate the portable recording device to the same or a different
docking station 400 for upload at the police station. At block 555,
the officer may "check in" the portable recording device so as to
allow a different officer to use it on a subsequent shift. For
example, checking in may be performed using a GUI interface to
docking station 400.
[0046] In accordance with some embodiments, the above description
discloses how multi-media files and associated metadata may be
collected. In accordance with other embodiments, a hybrid model for
storing and analyzing information may be beneficial for small and
large law-enforcement agencies. Law-enforcement agencies with
limited staffing and resources may find it difficult to adopt
in-car or wearable video system technologies that involve complex,
expensive and cumbersome components. For example, an in-house
server based solution may require experienced computer
technicians/specialists to maintain proper hardware operations. A
non-server based solution may also be challenging because it may
lack functions such as system configuration, video search and
storage management, and evidence life-cycle maintenance. It is
contemplated that a cloud based SaaS solution may offer the proper
flexibility and convenience required for such law enforcement
agencies. Additionally, the disclosed hybrid model for storing
metadata independently from actual multi-media files may more
effectively work for agencies having limited bandwidth
capabilities.
[0047] In some disclosed embodiments, a remote application and
database server may be hosted by a software as a service (SaaS)
cloud application to reduce (or eliminate) the need to hire
additional computer technicians. Some disclosed embodiments may be
implemented in a hybrid cloud and provide local (on site) data
storage for portions of data that require high bandwidth across a
network (e.g., Internet, police network) while maintaining metadata
in the cloud. This configuration may help ensure security and
integrity of digital evidentiary data by maintaining a single
global copy of metadata in the cloud (for storage) while still
allowing fast local access speeds for review of potentially large
video/audio files. Also, optionally, data on a shared server may be
downloaded to the local data storage site as backup data and then
re-uploaded to a remote (or cloud based) site if there is a system
failure or "intrusion" attack at the remote (or cloud based)
site.
[0048] To eliminate the need for (or to augment) a conventional DVD
burner based system, the user may auto upload all data and metadata
to the cloud. Optionally, a user may provide (or user event tags
may be used as) identification criteria for certain types of videos
(and their metadata) to be sent to the cloud automatically as soon
as the videos are uploaded to a server (or staged on docking
station 400) with certain "event type" metadata. For example, an
administrator may define a rule that all DUI videos are sent to cloud
based storage and two DVD copies are burned. When an officer tags a video
as a DUI event type, as soon as the video is uploaded to the cloud,
the video may also be sent to a DVD burner for two copies
automatically. Alternatively, rather than burning DVD copies, an
email may be automatically generated and sent or instructions may
be provided to an employee to create and send an email with a time
limited access link to personnel or third parties (e.g.,
prosecuting attorney) who may have an interest in a DUI event.
Based on the tag type assigned, a wide number of triggers and
follow-on responses may be generated automatically. Furthermore,
actions relating to compliance with record retention policies may
be automatically generated so that as specific retention periods
pass, records are automatically deleted. Thus, the user may readily
and easily take advantage of cloud-based storage as an almost
limitless cataloguing and archiving resource.
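The tag-driven triggers and retention rules described above can be sketched, as a non-limiting illustration, as a per-event-type policy table. The policy fields, event types, and default values below are hypothetical assumptions an administrator might configure:

```python
# Hypothetical per-event-type policy table an administrator might define.
POLICIES = {
    "DUI":      {"cloud_upload": True,  "dvd_copies": 2, "retain_days": 730},
    "speeding": {"cloud_upload": False, "dvd_copies": 0, "retain_days": 90},
}

def actions_for(event_type):
    # Unknown event types fall back to a hypothetical default policy.
    policy = POLICIES.get(event_type, {"cloud_upload": False,
                                       "dvd_copies": 0, "retain_days": 30})
    acts = []
    if policy["cloud_upload"]:
        acts.append("upload_to_cloud")
    if policy["dvd_copies"]:
        acts.append(f"burn_dvd x{policy['dvd_copies']}")
    acts.append(f"delete_after_{policy['retain_days']}d")
    return acts

dui_actions = actions_for("DUI")
```

Because retention is expressed as data rather than manual procedure, record-deletion actions can be generated automatically as each event type's retention period passes.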
[0049] Referring now to FIG. 6, data flow in a content management
system that integrates with SaaS functionality is illustrated in
block diagram 600. The SaaS component may be a system which
typically includes a web-based portal that is the entry point to
the software services for all users requiring data access. As with
other data access points, access may be controlled by
authentication means such as, but not limited to, passwords,
fingerprints, encryption, and so on. Authorized users may search
media catalogues which may be generated from metadata obtained from
a single agency or from multiple jurisdictional agencies. Users may
also manage all the configuration settings of mobile/portable
video/audio recording devices via a cloud based control portal.
Having metadata in the cloud facilitates many different functions,
such as query search of metadata associated with audio, video, or
print media. The metadata in the cloud and an associated interface
portal may allow access to any evidentiary logs associated with the
data (local or cloud based) and to a user's local
hardware/software storage to review media that may not have been
uploaded to cloud storage (e.g., because of bandwidth/storage
constraints). That is, the cloud based system may include enough
information to allow secure access back to local storage (e.g., 644
and 642) so that a user at police station 640 may efficiently view
locally stored multi-media files. Alternatively, a user located
remotely from police station 640 may obtain access (e.g., secure
access via virtual private network VPN) to network and storage
infrastructure at police station 640 and perform desired actions on
multi-media files. Of course, bandwidth constraints of the obtained
remote access (e.g., VPN) may have an effect on what actions a
remote user decides to perform.
[0050] Local hardware/software storage at police station 640 may be
any storage device, such as local hard drives, removable drives, or
any type of network storage device, and so on. As shown in FIG. 6,
the SaaS functions may incorporate cloud storage (630) which is not
typically as limited in storage capacity as local hardware/software
storage. However, remote access to large files may have associated
communication bandwidth concerns. Such a SaaS content management
system may limit data handling (and thus the potential for breaking
the evidentiary chain of custody). Data handling may also be
limited by initiating data transfer from the local collection point
via an upload of data to the cloud storage using the web-based
portal. The user may determine which data will remain on local
storage and which data resides in the cloud. For example, in such a
hybrid storage solution, metadata relating to GIS information and
applications for performing data analysis may reside in the cloud,
while the related audio/video files remain at the user's facility.
This division is largely based on the size of the files and the
recognition that limited bandwidth to cloud storage may affect access
to large files.
However, in some situations bandwidth concerns are not a
determining factor and other segmentation of data may be desired.
In the case where hybrid storage is implemented and a user has
local access to large files, a user may more efficiently interact
with metadata in the cloud and local multi-media files.
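The hybrid segmentation decision described in this paragraph may be sketched as a simple policy check. The size threshold below is a hypothetical stand-in for whatever bandwidth- or policy-derived criterion an agency adopts; metadata is assumed to always be small enough to upload.

```python
# Illustrative hybrid-storage policy, assuming a hypothetical per-file
# size threshold: small files (e.g., metadata) go to the cloud, while
# large audio/video files remain on local storage.

def upload_plan(files, max_upload_bytes):
    """Split captured files into cloud-upload and local-retention lists."""
    to_cloud, keep_local = [], []
    for name, size in files:
        (to_cloud if size <= max_upload_bytes else keep_local).append(name)
    return to_cloud, keep_local

files = [("clip1.mp4", 900_000_000),   # large video: keep locally
         ("clip1.xml", 12_000)]        # small metadata: upload
cloud, local = upload_plan(files, max_upload_bytes=50_000_000)
```

Other segmentation criteria (event tags, third-party access needs) could replace the size test without changing the overall structure.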
[0051] A cloud-based video export and access system (630) does away
with the hardware and ongoing maintenance costs of optical-media-based
systems by providing users a secure, controlled, reliable and
cost-effective method for sending video and data to third parties.
Video and data may be uploaded to the cloud for storage, one or
more third party recipients may be assigned access rights, and a
defined expiration date for third party access may also be
provided. Additionally, use of the cloud may permit real-time data
upload and storage, which provides nearly limitless data storage
capacity for integrated system 100 (FIGS. 1A and 1B). Hybrid
storage models may be implemented to define pre-requisites as to
what actual multi-media files are stored in the cloud. In some
embodiments, only multi-media files requiring access by third
parties are uploaded to the cloud. In other embodiments, only
multi-media files that have been tagged with a particular event
type are uploaded to the cloud. In either or both of these
embodiments, other multi-media files that may be less important or
have not yet been fully analyzed may be maintained on local storage
for future consideration. Note that even though multi-media files
may be maintained on local storage it may be desirable to upload
associated metadata to the cloud based system to provide more
comprehensive indexing and searching functionality across all
recorded data.
[0052] Exported data may be stored in cloud-based storage that is
remotely accessible through a secured means (for example, but not
limited to, a password, a fingerprint reader, etc.). The system may
be configured to send one or more recipients an access link through
automated communication methods such as email, text message, or MMS.
The link sent to each recipient may include an expiration date for
accessing the associated data. The system may also allow a
recipient of the link to review the data stored in the cloud via
the Internet, download a local copy of the data for future use, and
delete the data after review or download. The link sent to each
recipient may also limit the access rights of recipients (e.g., read
only, data editing, deletion, etc.).
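One way such an expiring, rights-limited link could be constructed is sketched below, assuming a signed-token approach: the link embeds a file identifier, access rights, and an expiry timestamp, signed with a server-side secret so that none of them can be altered by the recipient. The secret, URL format, and field names are assumptions for illustration only, not the disclosed implementation.

```python
import hashlib
import hmac
import time

# Hypothetical expiring access link: the rights and expiry are part of
# the signed message, so tampering invalidates the signature.
SECRET = b"server-side-secret"  # illustrative; would be managed securely

def make_access_link(file_id, rights, expires_at):
    """Build a link whose parameters are bound by an HMAC signature."""
    msg = "%s|%s|%d" % (file_id, rights, expires_at)
    sig = hmac.new(SECRET, msg.encode(), hashlib.sha256).hexdigest()
    return "https://portal.example/access?%s&sig=%s" % (msg, sig)

def link_is_valid(file_id, rights, expires_at, sig, now=None):
    """Reject links with bad signatures or past their expiration date."""
    now = time.time() if now is None else now
    msg = "%s|%s|%d" % (file_id, rights, expires_at)
    expected = hmac.new(SECRET, msg.encode(), hashlib.sha256).hexdigest()
    return hmac.compare_digest(sig, expected) and now < expires_at
```

Because the expiry is inside the signed message, the server can refuse expired links without consulting any per-link state.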
[0053] In order to comply with laws, court orders or
record-retention policies relating to data access, the system may
be configured to remove the accessible data after a predetermined
expiration date. A cloud-based system thus allows users to retain
the original data while limiting third party access to such data.
Once an access link has expired, no third party may access the
expired data. The disclosed SaaS system may also provide
bookkeeping functions to track content access, bandwidth usage,
subscription expiration, etc. This bookkeeping function may be
capable of statistical analysis and billing and may generate
reports and invoices as needed.
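The retention-driven removal described in this paragraph may be sketched as a periodic purge over a catalog of shared data, with each removal recorded for the bookkeeping and audit functions mentioned above. The catalog structure and field names are illustrative assumptions.

```python
import datetime

# Minimal sketch, assuming an in-memory catalog: entries past their
# predetermined expiration date are expunged, and each removal is
# logged so the bookkeeping function can report on it.

def purge_expired(catalog, today, audit_log):
    """Remove expired entries; return the surviving catalog."""
    kept = []
    for entry in catalog:
        if entry["expires"] < today:
            audit_log.append(("expunged", entry["file_id"], today.isoformat()))
        else:
            kept.append(entry)
    return kept

catalog = [
    {"file_id": "a.mp4", "expires": datetime.date(2016, 1, 1)},
    {"file_id": "b.mp4", "expires": datetime.date(2017, 1, 1)},
]
log = []
remaining = purge_expired(catalog, datetime.date(2016, 6, 1), log)
```

Note that only third-party *access* is removed; as stated above, the originating agency retains the original data.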
[0054] FIG. 6 also graphically illustrates an example data exchange
flow in block diagram 600, through which video, audio, and print
data and associated metadata may be shared. Numerous users,
computer-based functionalities, storage options, and associated
lines of communication may be involved in data uploading and
downloading. For example, one or several police vehicles 610 may
transmit video and audio data and associated metadata via wireless
communication means 605 to a cloud storage system 630. Concurrently
(or as needed), this data or a subset of this data may be made
accessible to software applications, for example SaaS functions
620, via communication link 606. Police vehicle(s) 610 may also
manually download data and metadata to local storage 644 upon
arrival at police station 640 using data transmission channel 660.
Data transmission channel 660 may be a wired connection or a
wireless connection. In an alternative, a classical "sneakernet"
may be used by connecting a portable recording device to another
device (e.g., docking station 400). After connection, data may be
uploaded to local storage 644, which is located at the police
station, and then optionally (based on a number of different
criteria) to the cloud 630 using any appropriate connection (e.g.,
645, 650 or another available connection).
[0055] In the example of block diagram 600, an integrated
surveillance system vendor 670 oversees and maintains SaaS
functions 620 utilizing communication channel 665. The vendor may
also optionally maintain the security and integrity of any
cloud-based storage system 630 utilizing communication channel 666.
Vendor 670 may also provide all necessary technical support through
its SaaS functions 620 and communication channel 645 to assist
police station 640 in implementing best practices in the
preservation of data evidence. Police station 640, depending on
available resources, may have "in-house" routers (not shown) and
surveillance system backend server(s) 642 which provide redundant
data storage systems. Police station 640, in order to avoid
expensive data storage solutions, may optionally utilize cloud
storage 630 via communication channel 650 in a hybrid manner. Cloud
storage system 630 may also communicate directly with SaaS
functions through communications channel 655. Having multiple
channels of secured communications may provide rapid and efficient
data exchange, while use of various storage means (local or
cloud-based) allows an inexpensive and flexible alternative for
resource-limited users.
[0056] Referring now to FIG. 7, flow chart 700 illustrates a
potential data mining strategy for captured data. The disclosed
data mining strategy may benefit from the above discussed hybrid
storage model in a number of ways. Example benefits across a single
agency or multiple jurisdictionally distinct entities may include
sharing of information without violating privacy or other data
access concerns. Sample use cases are described following this
overview of flow chart 700. Beginning at block 705, at least one
surveillance system automatically captures video and audio data and
associates that captured data with GPS positioning, timestamp, and
other information captured as metadata while the vehicle containing
the surveillance system is "on-patrol". All such video and audio
data (including metadata) from a single or multiple "on-patrol"
vehicles at block 710 may be uploaded to a central storage area
(e.g., a cloud) at the end of a law enforcement personnel shift. At
decision 715, specially configured software/firmware determines
whether a captured data segment has an associated tag (e.g., event
type). If not (the NO prong of decision 715), then a
default tag and associated data retention policy may be applied to
the captured data as shown at block 725. The captured data segment
for untagged capture may be stored in an area of a computer hard
drive with a continuous-loop function such that the oldest data is
overwritten by newer data. Alternatively, if a data segment has an
associated tag (the YES prong of decision 715), a retention
criterion based on the tag type may be applied and the appropriate
data stored with other tagged data as illustrated by block 720. As
required, any necessary evidentiary access controls as illustrated
by block 720 may be considered. At block 730, data mining is
performed. Information gathered from the data mining function may
be used to provide a global index of data (e.g., index of data
across all available metadata). Indexed data remains available
until the time limit for data retention is reached and then data
(and its associated index information) may be expunged. However, if,
as in block 735, an unpredicted event occurs (for example, a bombing,
terrorist activity, or a report of previous criminal activity), then,
at block 740, the data mined in block 730 (for a
particular location, date and time) may be retrieved to assist with
investigation and evidence gathering. Optionally, as shown in block
745, the overall system may be configured to proactively apply
analytics to the captured metadata to identify possible criminal
activity or potential threats to public health and safety (e.g.,
face or pattern recognition analysis to identify a known criminal
or threat). Such analysis may then be used to produce an analytics
report as is shown at block 750. The analytics report, for example,
may then be reviewed by law enforcement personnel to assist with an
investigation or determine if further investigation is
required.
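The tag decision at block 715 and its two outcomes (blocks 720 and 725) may be sketched as follows, under assumed retention rules: untagged captures go to a fixed-size continuous-loop store in which the oldest data is overwritten first, while tagged captures are filed by event type with a tag-specific retention period. The tag names and retention periods below are hypothetical.

```python
from collections import deque

# Hypothetical retention periods, in days, keyed by event tag.
RETENTION_DAYS = {"arrest": 3650, "traffic-stop": 365, "default": 30}

untagged_loop = deque(maxlen=3)  # continuous-loop storage (block 725):
                                 # appending past capacity drops the oldest
tagged_store = {}                # event type -> [(segment, retention_days)]

def ingest(segment, tag=None):
    """Apply the block-715 decision to one captured data segment."""
    if tag is None:
        untagged_loop.append(segment)  # default policy: loop storage
        return "default", RETENTION_DAYS["default"]
    days = RETENTION_DAYS.get(tag, RETENTION_DAYS["default"])
    tagged_store.setdefault(tag, []).append((segment, days))  # block 720
    return tag, days

for seg in ["u1", "u2", "u3", "u4"]:  # 4th untagged segment evicts the 1st
    ingest(seg)
ingest("arrest-video", tag="arrest")
```

The `deque(maxlen=...)` models the continuous-loop hard-drive area: once full, each new segment silently overwrites the oldest one.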
[0057] Collecting metadata from multiple surveillance systems to
create a comprehensive index may allow a law-enforcement agency to
correlate information from different systems. For example, if a set
of recordings from different patrol cars at a given geographical
location are of interest, then the metadata containing GPS
information may identify a subset of multi-media files that may be
of interest. If multiple agencies use a common global index, they
may be made aware of recordings that other agencies have obtained
that would otherwise be unknown to them. After following
appropriate legal procedures, they may obtain access to recordings
from other agencies to assist in gathering evidence. Note that
access to actual multi-media recordings may not be made available
because of privacy concerns, for example, but the global index
informs of the existence of potentially relevant information. In
this manner, coordinated inter-agency information sharing may be
enhanced. The hybrid storage model facilitates creation of a global
index because the overall size of actual multi-media recordings
across numerous surveillance devices may quickly become
unmanageable. Additionally, chain of custody of evidence and access
controls to actual multi-media files may be maintained.
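The GPS-based correlation described in this paragraph amounts to a query over the shared metadata index that never touches the multi-media files themselves. A sketch, assuming a simple list-of-records index structure (the field names are illustrative):

```python
# Illustrative query against a shared global metadata index: find
# recordings from any agency whose GPS metadata falls inside a
# bounding box, without accessing the multi-media files themselves.

def recordings_near(index, lat_min, lat_max, lon_min, lon_max):
    """Return (agency, file_key) pairs whose coordinates fall in the box."""
    return [
        (rec["agency"], rec["file_key"])
        for rec in index
        if lat_min <= rec["lat"] <= lat_max
        and lon_min <= rec["lon"] <= lon_max
    ]

index = [
    {"agency": "PD-640", "file_key": "v1.mp4", "lat": 29.60, "lon": -95.64},
    {"agency": "Sheriff", "file_key": "v2.mp4", "lat": 29.61, "lon": -95.63},
    {"agency": "PD-640", "file_key": "v3.mp4", "lat": 30.50, "lon": -97.00},
]
hits = recordings_near(index, 29.5, 29.7, -95.7, -95.6)
```

An agency querying this index learns only that potentially relevant recordings exist and who holds them; obtaining the recordings themselves would still require the legal procedures noted above.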
[0058] Each agency may implement the hybrid storage model as
necessary based on their size and infrastructure capabilities.
Hybrid techniques may also be implemented as a sliding scale. That
is, at one extreme a maximal hybrid technique uploads all (or
nearly all) captured metadata and associated multi-media files. For
example, a large police station with a big cloud presence and high
bandwidth might use the maximal hybrid model. At the other extreme, a
minimal hybrid technique would upload only enough metadata for
indexing and very few (if any) multi-media files. The minimal
amount of metadata may allow for global indexing as necessary so
that, when required, additional upload of data may be requested
from the agency implementing the minimal hybrid model.
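The sliding scale between the minimal and maximal hybrid models may be sketched as a per-agency upload policy. The policy names, field lists, and capture structure below are assumptions for illustration.

```python
# Hypothetical sliding-scale hybrid policies: "minimal" uploads only
# index-level metadata and no media; "maximal" uploads everything.
POLICIES = {
    "minimal": {"metadata_fields": ["file_key", "gps", "timestamp"],
                "upload_media": False},
    "maximal": {"metadata_fields": "all", "upload_media": True},
}

def to_upload(capture, policy_name):
    """Return (metadata, media) to send to the cloud under a policy."""
    policy = POLICIES[policy_name]
    if policy["metadata_fields"] == "all":
        meta = dict(capture["metadata"])
    else:
        meta = {k: capture["metadata"][k] for k in policy["metadata_fields"]}
    media = capture["media"] if policy["upload_media"] else None
    return meta, media

capture = {"metadata": {"file_key": "v1", "gps": "29.6,-95.6",
                        "timestamp": "2016-03-03T12:00:00", "officer": "123"},
           "media": b"...video bytes..."}
min_meta, min_media = to_upload(capture, "minimal")
```

Even under the minimal policy, the uploaded fields suffice for global indexing, so additional uploads can be requested from the agency later when required.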
[0059] Referring now to FIGS. 8A-F, examples of the metadata
referenced throughout this disclosure are shown in an example XML
file format. Note that because of the structure provided by XML
each of the metadata portions of FIGS. 8A-F may be stored in a
single file, multiple files or any other appropriate segregation.
Each element of an XML file is delimited by tags (i.e., <row>
followed by </row>, as shown in FIGS. 8A-F). For example, element 805
begins with a start tag (i.e., <row>) and ends with an end tag (i.e.,
</row>). According to some disclosed embodiments, the actual root
name of the file (e.g., the filename with no extension) is
used as a key for associating the recorded audio/video with the
appropriate metadata file. Inside the example element 805 of FIG.
8A there are attribute/value pairs to provide a metadata parameter
name and its associated value for that attribute. Metadata
attributes shown in FIGS. 8A-F have self-evident names and
therefore are not discussed individually here. The examples
provided are simply to illustrate that a multitude of different
types of data may be captured and used to index or further maintain
associated captured surveillance data. In these examples, FIG. 8A
illustrates a video metadata file for a captured video segment
while FIG. 8F illustrates an XML segment that contains the VX-Sync
data. In this embodiment, the "V" of "VX" is used to reference the
particular video and the "X" to reference any event variable
associated with the particular video. For example, during the
recording, any action taken by a user, such as an action that
triggered the recording (e.g., activating a wireless microphone,
activating a light bar, or taking a snapshot), will be
recorded in this metadata file and associated with the variable X
for future connection with the video V. FIG. 8B illustrates a
sample metadata file with attributes and values relating to in-car
activity logs based on personnel shifts, while FIG. 8C illustrates
a file portion that relates to GPS location metadata based on
personnel shifts, and FIG. 8D illustrates in-car error logs per
personnel shift. In the example shown in FIG. 8D, there are no
errors to report. FIG. 8E illustrates an example metadata file that
may be used to establish an audit trail for the associated video to
satisfy evidentiary requirements relating to chain of custody.
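Reading a FIG. 8-style metadata file can be sketched with the standard XML library: each `<row>` element carries attribute/value pairs, and the file's root name (filename without extension) keys the metadata to its recording. The attribute names in the sample below are illustrative, not copied from the figures.

```python
import os
import xml.etree.ElementTree as ET

# Illustrative FIG. 8-style fragment; attribute names are assumptions.
sample = '<log><row patrolUnit="610" officerID="4521"/></log>'

def parse_rows(xml_text):
    """Return one attribute dictionary per <row> element."""
    return [dict(row.attrib) for row in ET.fromstring(xml_text).iter("row")]

def media_key(metadata_path):
    """Root filename (no extension) associating metadata with its video."""
    return os.path.splitext(os.path.basename(metadata_path))[0]

rows = parse_rows(sample)
key = media_key("/evidence/veh610_cam1.xml")
```

Because XML imposes this uniform structure, the metadata portions of FIGS. 8A-F could be parsed the same way whether stored in a single file or in multiple files.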
[0060] FIG. 8C illustrates a series of "collected" metadata
elements where several attributes have been assigned values based
on data collection. For example, the attribute "patrol unit" has a
value to identify a particular police vehicle and the officerID
attribute has a value corresponding to the identification of a
specific officer. Note that officerID may be initially blank as in
elements 820 and 825, and then be assigned an ID number as an
officer logs onto (e.g., successfully authenticates to) the
integrated system 100 (FIGS. 1A and 1B), as shown in element 830.
Another attribute may be "log time" (element 835, FIG. 8E), which is
the date and time that a data record is captured. Yet other
attributes, which are self-explanatory based on their names, may
indicate changes in longitude and latitude reflecting that the
vehicle is in motion. In addition, the speed and velocity of the
vehicle in motion may be reflected in the metadata.
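As a worked example of how motion can be derived from the collected metadata, two successive (latitude, longitude, time) samples yield an average speed via the haversine great-circle distance. The sample coordinates and timing below are hypothetical.

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two WGS-84 points."""
    r = 6_371_000.0  # mean Earth radius, meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = (math.sin(dp / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

def speed_mps(sample1, sample2):
    """Average speed between two (lat, lon, epoch_seconds) samples."""
    dist = haversine_m(sample1[0], sample1[1], sample2[0], sample2[1])
    return dist / (sample2[2] - sample1[2])

# Hypothetical samples 30 seconds apart along a line of latitude.
v = speed_mps((29.6000, -95.6400, 0), (29.6000, -95.6300, 30))
```

This is the kind of derivation the data-mining functions of FIG. 7 could apply across collected GPS metadata without accessing the multi-media files.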
[0061] In light of the principles and example embodiments described
and illustrated herein, it will be recognized that the example
embodiments can be modified in arrangement and detail without
departing from such principles. Also, the foregoing discussion has
focused on particular embodiments, but other configurations are
also contemplated. In particular, even though expressions such as
"in one embodiment," "in another embodiment," or the like are used
herein, these phrases are meant to generally reference embodiment
possibilities, and are not intended to limit the invention to
particular embodiment configurations. As used herein, these terms
may reference the same or different embodiments that are combinable
into other embodiments. As a rule, any embodiment referenced herein
is freely combinable with any one or more of the other embodiments
referenced herein, and any number of features of different
embodiments are combinable with one another, unless indicated
otherwise.
[0062] Similarly, although example processes have been described
with regard to particular operations performed in a particular
sequence, numerous modifications might be applied to those
processes to derive numerous alternative embodiments of the present
invention. For example, alternative embodiments may include
processes that use fewer than all of the disclosed operations,
processes that use additional operations, and processes in which
the individual operations disclosed herein are combined,
subdivided, rearranged, or otherwise altered.
[0063] This disclosure may include descriptions of various benefits
and advantages that may be provided by various embodiments. One,
some, all, or different benefits or advantages may be provided by
different embodiments. In view of the wide variety of useful
permutations that may be readily derived from the example
embodiments described herein, this detailed description is intended
to be illustrative only, and should not be taken as limiting the
scope of the invention. What is claimed as the invention,
therefore, are all implementations that come within the scope of
the following claims, and all equivalents to such
implementations.
* * * * *