U.S. patent application number 13/385040 was published by the patent office on 2013-08-01 for an anti-sniper targeting and detection system.
The applicants listed for this patent are John Hiett, Steven Gregory Scott, and Kenneth Varga. Invention is credited to John Hiett, Steven Gregory Scott, and Kenneth Varga.
Application Number: 20130192451 / 13/385040
Document ID: /
Family ID: 48869129
Filed Date: 2013-08-01
United States Patent Application 20130192451
Kind Code: A1
Scott; Steven Gregory; et al.
August 1, 2013
Anti-sniper targeting and detection system
Abstract
An anti-sniper targeting system in which a spherical omni-directional depth stereoscopic camera, a radar, and a microphone identify and detect targets, determine target positions and bearings, detect target weapon flash, detect glint, track bullet trajectory, and coordinate, track, share, and assign targets. Target bearings and ranges are determined from sound, from heat signatures detected by an infrared camera, from glint, and from radar, and are used to rapidly position a fire control arm with a camera onto assigned targets from calculations of target positions and optimal trajectory. The system can account for firing corrections due to target range and wind effects using wind, pressure, and temperature sensors, and accommodates earth curvature in bullet trajectory over large ranges. It can be an offensive sniper system whereby a target stays locked in spite of movements, such as from a vehicle, using stabilizing gyros and accelerometers, image processing, or sensor data to adjust for the movements.
Inventors: Scott; Steven Gregory (Peoria, AZ); Varga; Kenneth (Peoria, AZ); Hiett; John (Tempe, AZ)

Applicant:
Name | City | State | Country
Scott; Steven Gregory | Peoria | AZ | US
Varga; Kenneth | Peoria | AZ | US
Hiett; John | Tempe | AZ | US
Family ID: 48869129
Appl. No.: 13/385040
Filed: January 30, 2012
Related U.S. Patent Documents

Application Number | Filing Date | Patent Number
61626702 | Sep 30, 2011 |
61575131 | Aug 16, 2011 |
61626701 | Sep 30, 2011 |
61571113 | Jun 20, 2011 |
Current U.S. Class: 89/41.05
Current CPC Class: F41G 3/147 20130101; F41G 3/00 20130101
Class at Publication: 89/41.05
International Class: F41G 3/00 20060101 F41G003/00
Claims
1. An anti-sniper targeting and detection system comprising: a. a camera, b. a microphone, c. a robotic weapon mounted with a zoomable camera, d. a data processing system whereby targets are detected from said camera and microphone, target position is computed, and gimbal angles are computed to move to targets, e. a radar system, f. a transceiver, and g. a heads-up display.
Description
CROSS-REFERENCE TO RELATED APPLICATIONS
[0001] This application claims the benefit under 35 U.S.C. secs. 119 and 120 of the filing dates of U.S. patent applications No. 61/626,702; No. 2010/0238161 A1; Ser. No. 61/575,131; Ser. No. 61/626,701; Ser. No. 61/571,113; and U.S. patent application Ser. No. 12/460,552, particularly the Sep. 30, 2011 filing date of 61/626,702.
FEDERALLY SPONSORED RESEARCH
[0002] None.
SEQUENCE LISTING
[0003] None.
BACKGROUND
[0004] This application relates to pre-empting and counteracting sniper attacks by detecting snipers or potential snipers, tracking targets and weapons, and rapidly determining, assigning, coordinating, and transferring target information between anti-sniper systems. The sniper position information can be used by counteracting forces through target information sharing and/or by rapidly positioning a counter-sniper weapon. The counter-sniper weapon robotic arm can rapidly zoom, pan, and tilt a camera (infrared or otherwise as appropriate) and, based on calculations of target bearing, elevation, range, and wind conditions, be moved immediately onto the sniper position to rapidly and accurately return fire against one or multiple snipers. Small human adjustments of pan, tilt, and zoom can be made upon human verification of the target from the rapidly zoomed camera. Pooled together, multiple systems can
be designed to cooperate, nearly simultaneously coordinating, assigning, and communicating target data and counter-fire on multiple automatically or semi-automatically assigned sniper targets at once, where targets can be chosen programmatically (automatically or semi-automatically) and optimally by relative unit positions. Targets can also be assigned based on terrain occlusions, to maximize the safety of units, using a terrain occlusion (line of sight from target) database or calculation and/or system instrumentation of the terrain (such as from three dimensional depth cameras). Snipers can be dealt with in multiple stages:
pre-detection, barrel/glint (pre-fire) detection, fire detection,
bullet trajectory tracking, and fire return, as snipers come and
go.
[0005] The combination of stereoscopic/spherical/depth
(omni-directional) cameras as well as a spherical/omni-directional
microphone system and a radar system can be used to measure target
range. Other techniques to determine target range can be optic flow
estimation, laser range finding, terrain database information or
any other suitable technique. If a muzzle flash or heated muzzle can be detected optically, then, because the speed of light is much greater than the speed of sound through air, the muzzle flash and muzzle sound detections can be used to determine range: take the time difference between the start of the optical muzzle-flash detection and the start of the acoustic detection, and multiply it by the speed of sound in air, which can be refined using air pressure and temperature sensors if needed. Sensor integration and synthesis can be achieved
by weighting the probability of accuracy, precision, and
tolerances. Many of these techniques are well known in the art.
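The flash-to-bang range calculation described above can be sketched as follows. This is a minimal illustration, not from the application: the function names are hypothetical, and the temperature correction uses the standard ideal-gas approximation for the speed of sound in dry air.

```python
import math

def speed_of_sound(temp_c: float) -> float:
    # Ideal-gas approximation for dry air: ~331.3 m/s at 0 degrees C,
    # scaled by the square root of absolute temperature.
    return 331.3 * math.sqrt(1.0 + temp_c / 273.15)

def flash_to_bang_range(t_flash: float, t_sound: float, temp_c: float = 20.0) -> float:
    # Range = (acoustic onset - optical onset) * speed of sound;
    # light travel time is negligible at these distances.
    dt = t_sound - t_flash
    if dt <= 0:
        raise ValueError("acoustic detection must follow the optical detection")
    return dt * speed_of_sound(temp_c)
```

For example, a one second flash-to-bang delay at 20 degrees C corresponds to roughly 343 m. In an ideal gas the speed of sound depends on temperature but not pressure, so this sketch omits the pressure sensor the application mentions.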
[0006] Pre-sniper detection techniques using radar signature
reflection of gun barrel are described in U.S. Pat. No. 8,049,659;
as well as described in T. CIPARA; "Using Radar Signals to
Safeguard Our Troops"; Mar. 15, 2011; George Mason University;
Fairfax, Va.; USA. Other pre-sniper detection techniques using
detection of glint or reflection from a scope, binocular, or even a
human eye lens are described in part in U.S. Pat. App. No.
2008/0136626 and also described in part in H. HASHARON; "SLD500
Sniper Locator CILAS"; 2006; Defense Update International Online
Defense Magazine; Issue 2; Israel; N. SHACHTMAN; "Lasers Stop
Snipers Before they Fire"; Apr. 26, 2007; Wired; New York, N.Y.;
USA; as well as described in Torrey Pines Logic product brochures
for their optical detection products, "Mirage 1200", "Sentinel
S30", "Beam 50/60", and "Beam 1000"; San Diego, Calif. where the
"Mirage 1200" is mentioned in D. CRANE; "Torrey Pines Logic
Mirage-1200 and Myth-350 Handheld Sniper Detection Systems"; Dec.
8, 2008; Defense Review; Miami, Fla.; USA; H. HASHARON; "SLD500
Sniper Locator CILAS"; 2006; Defense Update International Online
Defense Magazine; Issue 2; Israel; Y. ASFAW; "Impact of Pose and
Glasses on Face Detection Using the Red Eye Effect"; May 2003;
CCECE 2003; IEEE; Montreal; Canada. A countermeasure to sniper scopes is described in M. NAIMARK; "How to ZAP a Camera: Using
Lasers to Temporarily Neutralize Camera Sensors"; October 2002;
USA.
[0007] Acoustic detection of gun fire is described in U.S. Pat.
Nos. 7,796,470 and 6,178,141, as well as in U.S. Pat. App. No.
2010/0226210 and also described in G. L. DUCKWORTH; "Fixed and
wearable acoustic counter-sniper systems for law enforcement"; Nov.
3, 1998; SPIE Proceedings Vol. 3577; Boston, Mass.; USA; M. V.
SCANLON; "Networked Acoustic Sensor Array's Performance During 2004
Horizontal Fusion--Warrior's Edge Demonstration"; December 2004; US
Army Research Laboratory; USA; J. DUNNIGAN;"Sniper Detectors
Arrive"; Aug. 22, 2009; Strategy World; USA; F. SIMONIS;
"Nanotechnology: innovation opportunities for tomorrow's defense";
March 2006; TNO Science & Industry; Netherlands; G. SIMON;
"Sensor Network-Based Counter-sniper System"; Nov. 3, 2004;
SenSys'04; Baltimore, Md., USA; A. WHITE; "Fighting fire with fire:
technology finds a solution to sniper attacks"; June 2009; pg.
52-57; Jane's International Defense Review; Englewood, Colo.; USA;
J. KELLER; "Sniper-detection systems to provide perimeter security
for Army forces in Afghanistan to come from Raytheon BBN"; Feb. 15,
2011; Military & Aero. Elect.; USA; T. V. BROOK; "High-tech
device helps U.S. troops pinpoint snipers"; Mar. 2, 2011; USA
Today; USA; C. HUGHES; "British troops to get iPod-sized `sniper
finders` to take on deadly sharpshooters in Afghanistan"; Mar. 8,
2011; Daily Mirror; UK; T. HORNYAK; "U.S. troops getting wearable
gunshot detectors"; Mar. 21, 2011; CNET News; San Francisco,
Calif.; USA; A. BARRIE; "Sniper Detectors Coming to America's
Heartland"; Dec. 22, 2011; FOX NEWS; USA.
[0008] The barrel flash detection is described in U.S. Pat. Nos.
7,947,954; 3,699,341; and in U.S. Pat. App. No. 2011/0095187 as
well as in A. GOLDBERG; "Infrared Signatures of the Muzzle Flash of
a 120 mm Tank Gun and their Implications for the Kinetic Energy
Active Protection System (KEAPS)"; October 2001; USA; M. ISAAC et.
al.; "Infrared Detects Sniper Gunfire"; Oct. 29, 2005; Wired; New
York, N.Y.; USA; S. A. MOROZ; "Airborne Deployment of and Recent
Improvements to the Viper Counter Sniper System"; 1999; Naval
Research Laboratory; Washington, D.C.; USA.
[0009] Combined acoustic and optical fire detection systems are
described in RAFAEL; "Anti-Sniper Systems Finding Their Range";
Nov. 3, 2005; Defense Industry Daily; USA; M. C. ERTEM; "An
acoustic sensor for the viper infrared sniper detection system";
August 1999; Maryland Advanced Development Laboratory; Greenbelt,
Md.; USA.
[0010] The trajectory tracking of bullets fired is described in X.
L. ZHANG; "Real-time tracking of bullet trajectory based on chirp
transform in a multi-sensor multi-freq radar"; May 10, 2010; Radar
Conference, 2010 IEEE; USA; Y. ZHANG; "Real-time acquisition and
tracking of sniper bullets using multi-sensor multi-frequency radar
techniques"; Aug. 31, 2009; SSP '09. IEEE; USA; BROWN, E. R.;
"Ku-band retrodirective radar for ballistic projectile detection
and tracking"; May 4, 2009; Radar Conference, 2009 IEEE; Pasadena,
Calif.; USA.
[0011] Systems and techniques that are able to return fire against
snipers, or avoid fire, are described in U.S. Pat. Nos. 6,357,158;
7,484,451 and U.S. Pat. App. Nos. 2009/0320348, 2009/0292467,
2009/0290019, 2008/0291075, as well as in N. F. EVANS; "British
Artillery Fire Control Ballistics & Data"; Apr. 11, 2010;
Australia; F. FLINCH et. al.; "External Ballistics"; Jan. 10, 2012;
Wikipedia; San Francisco, Calif.; USA.
[0012] A general discussion of various other anti-sniper systems is
included in D. CRANE; "Anti-Sniper/Sniper Detection/Gunfire
Detection Systems at a Glance"; New and Future Technology; Jul. 19,
2006; Defense Review; Miami, Fla.; USA; P. SARKA; "iRobot and
Boston Univ. Photonics Center Unveil Advanced Sniper Detection
System for iRobot Packbot"; Oct. 3, 2005; iRobot Corp. Press
Release; R. DOUGLAS; "The Objective Force Soldier/Soldier
Team--Volume II--The Science and Technology Challenges"; November
2001; Army Science Board SAAL-ASB; Arlington, Va.; USA; C. CALLAN;
"Sensors to Support the Soldier"; Feb. 3, 2005; pgs. 41-84; Jason
the MITRE Corporation; McLean, Va.; USA; P. A. BUXBAUM;
"Pinpointing Sniper Perches"; August, 2010; SOTECH 8.6; pgs. 11-14;
KMI Media Group; Rockville, Md.; USA; A. NATIVI; "Counter-sniper
Systems Detect Hidden Shooters"; Dec. 22, 2011; Aviation Week--The
McGraw-Hill Co.; USA.
[0013] We are unaware of any anti-sniper system that incorporates
and integrates all of the methods described in the prior art, as
well as providing a seamless response at every stage of sniper
interaction from pre-detection, to pre-fire warning, target
assignment, fire detection, trajectory tracking, to coordinated
fire response, as well as neural-fuzzy reinforcement optimization.
For an example application of reinforcement optimization learning
see M. MCPARTLAND, "Reinforcement Learning in First Person Shooter
Games"; March, 2011; IEEE Transactions on Computational
Intelligence and AI Games; Vol. 3 No. 1; USA. This invention fully integrates all of the methods, applied at every stage of sniper interaction, so that the anti-sniper system can automatically respond, as if well prepared, through continuous vigilant sensor monitoring to snipe the sniper in advance. It does this by continuously and autonomously monitoring and tracking the target(s) as well as atmospheric conditions such as wind speed, wind direction, temperature, and air pressure, along with unit positions, and incorporating this data in real time with target bearings and computed optimal counter-sniper bullet trajectories based on ballistics (e.g. bullet/projectile mass, wind speed, distance to target, etc.).
SUMMARY
[0014] A rapid and accurate counter-sniper force response system that not only allows operators to respond immediately but can also pre-empt the sniper by identifying sniper targets in advance, using frame-by-frame image detection of object movement or infrared signatures, gun barrel radar detection, processing that adjusts for vehicle motion and position, and existing information about the terrain. With a fast, autonomous, robotically gimbaled, zoom-able camera an operator can quickly scan and verify suspect targets.
This can be done as a vehicle progresses through the field of
operation, by target locking and tracking, while allowing the
operator to simply press a "next target" (or "last target", or
"next coordinated assigned target", like a target slide show)
activation to quickly evaluate suspect targets in order to
differentiate real from non-real targets. The return fire weapon
and rapid zoom camera can help an operator further evaluate, from a
great distance, what the target is holding or doing, and if the
target is verified as threatening, the anti-sniper system can fire
at the target with great accuracy. Highly robust smooth image stabilizers, gimbals, and laser locking techniques, along with gyroscopes, can help stabilize, fix, and follow the highly zoomed (infrared and/or other) camera onto the target while the vehicle is still in motion, further helping the operator verify whether a target is threatening before being sniped, allowing a pre-emptive snipe at a sniper. Anti-sniper systems can share critical data and coordinate actions with each other in real time during a firefight, such as friendly positions, target positions, friendly weapon vectors and trajectories, and friendly as well as target firing times.
[0015] The anti-sniper camera system can also be made to
incorporate a multitude of zoomed cameras per target, as well as
multiple robotic anti-sniper weapons so that even more rapid target
assessment and response can be made. The anti-sniper system's ultimate objective is to act as a highly assured deterrent against firing any weapon at the anti-sniper system: to re-assert balance in asymmetrical warfare through mutually assured destruction between systems of equal capability upon any gun being fired, or even verifiably pointed, thus creating a tremendous counter-incentive to firing a gun at, or even threatening, any force carrying a fully autonomous (with manual override) integrated anti-sniper system. It greatly reduces the element of chance involved, and it is a powerful deterrent to not only firing a weapon, but even pointing it.
DRAWINGS
[0016] FIG. 1 is a block diagram of the overall anti-sniper
targeting and detection sharing system.
[0017] FIG. 2 shows prior art calculations of how range can be
estimated by combining acoustic sensors (microphones) with optics
and taking advantage of the speed of light being much greater than
that of sound to determine range of gun fire.
[0018] FIG. 3 shows the anti-sniper targeting and detection sharing system on top of an armored personnel vehicle, with spherical
stereoscopic camera & microphone system, gimbaled weapon system
with laser tracking and zoom IR (or other suitable) camera mounted
on pole arm, with wind sensors, differential global positioning
system, radar, glint detector, on top of spherical stereoscopic
camera & microphone system.
[0019] FIG. 4 shows a planar geometry of the field of view of a
camera with projection of a target onto the field of view used to
calculate target angles and range.
[0020] FIG. 5 shows a planar geometry of a pair of stereoscopic
cameras projected onto the plane of the robotic camera weapon laser
arm.
[0021] FIG. 6 shows a three dimensional perspective geometry of a stereoscopic camera pair with the robotic camera weapon laser arm, illustrating the calculations for rapidly and automatically determining the angles and range needed to position the zoomed camera/weapon system onto detected target(s) in rapid succession.
[0022] FIG. 7 is a flow chart of the system process of detecting
targets and rapidly automatically positioning the zoom camera gyro
stabilized laser weapon system onto the potential targets.
[0023] FIG. 8 is a stage diagram of coordinated sniper-fire pre-detect, glint/barrel detect, fire detect, trajectory track, and coordinated fire return, showing how the coordinated systems function in action.
DETAILED DESCRIPTION
[0024] FIG. 1 shows a system block diagram of the anti-sniper
targeting and detection system 2. A pre-snipe omni-directional
sensor scope/eye glint/barrel IR/radar detection system 34 is shown
connected to computer system 10 to detect evidence of snipers,
scopes, other optical sensors as well as gun barrels, if they are
present, before a sniper is able to fire. Countermeasures to
anti-sniper detection use anti-reflective layers, as well as
honeycomb shapes on scope lenses. Some of these can be overcome by
using different radar techniques such as varying the radar
frequency to resonate at the shape of the honeycomb, or varying the
frequency in the range of the possible shapes.
[0025] The anti-sniper system 2 is shown utilizing a spherical or
omni-directional high speed stereoscopic IR and/or visible depth
camera and stereoscopic or omni-directional microphone system 4
that contains a spherical (omni-directional) high speed
stereoscopic infrared (IR, or other appropriate, such as a
RGB--red, green, blue, ranging, time of flight) depth camera system
6 as well as a spherical omni-directional microphone system for
left ear orientation 8A as well as a spherical microphone system
for right ear orientation 8B. The spherical (omni-directional) microphone system can be used not only to detect the source bearing (azimuth and elevation) of initial ordnance firing, but also to detect trajectory from the bullet's whizzing sound if the initial firing was not acoustically detected, such as when a weapon silencer is used.
[0026] The computer or micro-controller system 10 can have terrain
and earth curvature data to use in projectile calculations. The
computer also can process the target data from the
camera/microphone system 4 as well as from other sensors. The
sensors can include a Differential Global Positioning System (DGPS)
14, bullet trajectory radar system 32, accelerometers, compass,
gyros 12 used to stabilize zoom-able gimbaled IR camera IR and/or
visible camera 16 and weapon 18, wind direction, air temperature,
air pressure, wind speed, or other sensors 20, to calculate source
and trajectory to/from target information. Target information is exchanged with other anti-sniper systems 2, for real-time triangulation and target location fixes, through high speed wireless communications 26. Bullet trajectory radar 32 can provide near instantaneous ordnance collision avoidance warning commands by determining ordnance trajectory paths and anti-sniper system 2 unit positions. Microphones 8A can be used to detect bullet impact sounds, and the sound of bullets whizzing through the air near the microphones, to verify trajectory tracking performance from trajectory radar 32. Warnings on bullets fired, detected, and tracked at long range, such as halt, duck, move left, move right, move forward, and move backward, can be displayed on the anti-sniper system 2 HUD display input control 24 or annunciated on speakers in or outside computer 10.
[0027] Other wireless communications 26 data can be sent to and
from other remote systems 27, to be relayed or routed, such as
through satellites, drones, aircraft, or other vehicles. Target
data that is processed is used to rapidly position gimbaled weapon
system with laser designator 18 that can be mechanically connected
to automatically or manually zoom-able gimbaled IR camera and/or
visible camera 16 through computer 10. Multiple sniper targets can
be assigned, shuffled, prioritized, and have status tracked and
shared amongst multiple anti-sniper systems 2 to autonomously
coordinate a rapid anti-sniper response optimally assigning sniper
targets to each crew based on unit position, status, and weapons
capabilities. The robotic gimbaled weapon system with laser
designator 18 and zoom-able gimbaled visible and IR camera 16 can
rapidly and automatically swing to the positions of highest sniper probability based on prior history/intelligence data, and also has manual override capability for a human operator. The robotic gimbaled weapon system with laser designator
18 and zoom-able gimbaled visible and IR camera 16 can be made to
move at high speed, faster than any human can move, and be made
more accurate and precise at dynamically firing back, even while
vehicle is in motion, than a human sniper by using gyros with high
speed actuators, with automatically stabilizing shock absorbing
mast/boom, where the human decision is made to fire from the zoomed
scope view. To further enhance the response time, the gimbaled
weapon 18 itself can be a high powered laser. The gimbaled weapon
18 can also act just as a target designator to work in coordination
with an aircraft or ship, or other weapon system.
[0028] Computer 10 can display target data including zoomed target
on a HUD (Heads Up Display) with input controls and speaker/alarm
24 for user 30. If user 30 determines that a target is real and is a threat, the user can fire at the target using the gimbaled weapon (rifle,
automatic weapon, missile, high powered laser, or other weapon)
system with laser designator 18 controlled by user weapon fire
control 22 via fire control switch 28. The anti-sniper system 2 can
work independently of other anti-sniper system 2 units while at the
same time also join in to work as a coordinated crew or to break
off if needed. The sensors of the anti-sniper system 2 can
self-check and report whether they are valid, invalid, or failed by
marking the sensor data accordingly. The anti-sniper system 2
detection and fire response can incorporate neural-fuzzy
reinforcement learning technology to optimize detection and
response. A high speed rapid response, such as returning sniper
fire when fired upon from a high speed passing vehicle can be
incorporated into the anti-sniper system 2. Incorporating an autonomous (or semi-autonomous) zoom camera system as well as a radar can be useful in preventing false alarms that could be
triggered in acoustic and fire flash detection systems alone due to
other events such as from firecrackers being ignited.
[0029] Multi anti-sniper system 2 target assignments can be both independent and dependent, or handled by a ranked order of anti-sniper systems 2 such that one anti-sniper system unit 2 acts as a target assignment controller, which can automatically hand off target assignment control to other anti-sniper units 2 as units are removed from and added to the system.
[0030] FIG. 2 illustrates using fire detection to determine target range by taking the time difference between two measured events: the detection time of the non-friendly muzzle flash heat signature pixel in the infrared camera subtracted from the arrival time of the non-friendly gun fire sound wave of highest magnitude. Friendlies and non-friendlies can be clearly identified and displayed on a Heads-Up Display (such as the user 30 HUD 24 of FIG. 1). The gun fire sound has a peak signal 38 shown in the upper
graph of microphone fire detection magnitude 50 amongst sound
echoes 40 and noise 52 where the start of the gunfire sound signal
starts at t.sub.s 42. The sound from known friendly fire can be
filtered out, based on known time, duration and pulse width of
friendly fire, and relative friendly firing position (all
wirelessly transferred within the system), thus reducing false
alarms and system confusion during a fire fight with lots of
bullets being fired. The lower graph shows the IR camera muzzle
flash/heat detection magnitude 53 where the peak detection 44 at
time t.sub.f 46 shown amongst signal noise 54. The range to sniper
can then be calculated by subtracting the two times and multiplying
by the speed of sound in air as shown 48. Just as friendly acoustic
fire sounds can be filtered, so can muzzle flashes from friendly positions be identified and filtered via high speed real-time encrypted position network transfer, where laser communications can be used in case spread spectrum radio frequency jamming is occurring.
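The onset-time extraction of FIG. 2 can be sketched as a simple threshold crossing on the two sensor streams. This is an illustrative sketch only; a real system would add matched filtering, echo rejection, and friendly-fire filtering as described above, and all names and thresholds here are assumptions.

```python
def first_onset_time(samples, sample_rate_hz, threshold):
    # Time (s) of the first sample at or above threshold, else None.
    for i, s in enumerate(samples):
        if s >= threshold:
            return i / sample_rate_hz
    return None

def range_from_detections(mic_samples, ir_samples, sample_rate_hz,
                          mic_threshold, ir_threshold, v_sound=343.0):
    # t_s: gunfire sound onset; t_f: muzzle-flash onset in the IR stream.
    t_s = first_onset_time(mic_samples, sample_rate_hz, mic_threshold)
    t_f = first_onset_time(ir_samples, sample_rate_hz, ir_threshold)
    if t_s is None or t_f is None or t_s <= t_f:
        return None  # no valid flash-then-bang pair detected
    return (t_s - t_f) * v_sound
```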
[0031] FIG. 3 shows the anti-sniper targeting and detection system
applied to an armored personnel vehicle 56 on ground surface 62
where a mounting post 58 is used to support the spherical high
speed stereoscopic depth camera and omni-directional microphone
system 4 as well as fire control arm 60 with zoom-able gimbaled
infrared (or other appropriate) camera 16 with weapon that has a
laser designator system 18, whereby if a target is an unfriendly tank or similar, a drone or aerial strike can be called in on the target. Mounted on top of the spherical high speed stereoscopic camera and microphone system 4 are accelerometers and gyros 12, wind speed and direction sensors 20, and a differential global positioning system 14, all used to more accurately aim the fire control arm onto target 76 on mountain terrain 64. Next target 78
in system targeting sequence is where fire control arm 60 can
rapidly move to and zoom to automatically. The field of view 68
including sky 66, and mountains 64, of one camera of the spherical
stereoscopic camera system 4 are shown with edges 70. Gyros 12 can
also be mounted in fire control arm 60 as well as on camera 16,
with any means necessary for shock absorption. Zoom camera 16 can
also be mounted on an independent robotic arm (not shown) of the
fire arm 60 such that the zoomed view is maintained even while
firing.
[0032] FIG. 4 shows the surface geometry of one camera 6 with field
of view 68 with field of view projection edges 70 with target
horizontal projection point 80. The angle of the target,
.theta..sub.T, can be calculated by the distances shown, given the
angle of the field of view (2.times..theta..sub.H).
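The angle recovery of FIG. 4 can be sketched with a pinhole-camera model: given the half field of view .theta..sub.H and the target's horizontal offset in the image, the bearing .theta..sub.T follows from the implied focal length. The code below is an assumed illustration, not taken from the application.

```python
import math

def target_bearing(px_offset: float, half_width_px: float, half_fov_rad: float) -> float:
    # Pinhole model: the image plane lies at focal distance
    # f = half_width_px / tan(half_fov_rad) from the optical center,
    # so a pixel offset x maps to bearing atan2(x, f).
    focal_px = half_width_px / math.tan(half_fov_rad)
    return math.atan2(px_offset, focal_px)
```

A target at the image edge (offset equal to half_width_px) comes out at exactly the half field of view, consistent with the projection edges 70 in the figure.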
[0033] FIG. 5 shows the surface geometry of a stereoscopic camera
pair 6 with control fire arm 16, 18 all projected on one plane with
target 80 also projected onto the same plane. Given the distance of
the fire control arm, the distance between the stereoscopic
cameras, the length between the camera focus and the center point
of rotation of the control arm, and the distances provided, the
horizontal angle of the control fire arm 16, 18, can be positioned
onto the target, given the control arm angle which can be rotated
into position so no target is occluded for optimal zooming, laser
designating and firing. If all angles of the spherical stereoscopic
camera pairs 6 are known relative to the fire control arm, all
targets can be easily identified and processed to rapidly calculate
and position the fire control arm onto the target. Target locking
can be done in priority order such as nearest range target, or
highest up target, or any other desired order. Range to target can
also be determined if a camera used is a depth sensor camera.
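For the stereoscopic pair of FIG. 5, range can be recovered from horizontal disparity using the standard rectified-stereo relation Z = B.times.f/d (baseline times focal length over disparity). This is the textbook formula, offered here as an assumed sketch of how the geometry would be computed rather than the application's own method.

```python
def stereo_range(baseline_m: float, focal_px: float, disparity_px: float) -> float:
    # Rectified stereo: depth = baseline * focal length / disparity.
    # Disparity is the horizontal pixel shift of the target between
    # the left and right camera images.
    if disparity_px <= 0:
        raise ValueError("non-positive disparity: no depth solution")
    return baseline_m * focal_px / disparity_px
```

With a 0.5 m baseline and a 1000 px focal length, a 5 px disparity corresponds to a 100 m range; small disparities at long range are why a wide baseline helps.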
[0034] FIG. 6 shows a three dimensional perspective geometry of one
spherical stereoscopic camera pair with microphone 4 (that would
receive the highest amplitude gun fire sound) mounted on support
post 58 and the fire control arm 16, 18 aimed at target 76
projected on target plane camera viewing plane 68 via straight line
vector 72 to target 76. Return fire trajectory is not shown, but
would be a slight arc, due to gravitation on path to the target.
Center of right camera field of view 82 as well as center left
camera field of view 84 is shown on target plane camera viewing
plane 68. Target 76 is shown projected at 80 to plane perpendicular
to target plane 68 that intersects fire control arm 16, 18 mounting
plane. The horizontal angle .gamma..sub.g as well as vertical angle
.phi. of the fire control arm 16, 18 can then be calculated and
thus rapidly moved to the target vector for each target 76 or other
targets in assigned or selected sequence, from the distances
provided from the stereoscopic camera, as well as the fire control
arm rotated position .alpha..sub.g.
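The arm angles of FIG. 6 (horizontal .gamma..sub.g and vertical .phi.) amount to pointing a vector from the arm's pivot at the target position. A minimal sketch, with assumed names and an x-forward, y-left, z-up frame:

```python
import math

def arm_angles(target_xyz, pivot_xyz=(0.0, 0.0, 0.0)):
    # Pan: rotation in the horizontal plane toward the target.
    # Tilt: elevation above the horizontal plane.
    dx = target_xyz[0] - pivot_xyz[0]
    dy = target_xyz[1] - pivot_xyz[1]
    dz = target_xyz[2] - pivot_xyz[2]
    pan = math.atan2(dy, dx)
    tilt = math.atan2(dz, math.hypot(dx, dy))
    return pan, tilt
```

As the description notes, the actual return-fire trajectory is a slight arc due to gravity, so a ballistic elevation correction would be added on top of this straight-line tilt.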
[0035] FIG. 7 shows a flow chart of the anti-sniper targeting system. The system scans via radar/laser/optics for evidence of human presence/movement(s); barrel, scope, eye, binocular, or sensor glint; and muzzle event(s) in IR images and from microphones at process block 100. If any are detected at decision block 102, the location and range of the target(s) are computed at process block 104, where valid target information as well as other sensor data is shared wirelessly amongst networked anti-sniper systems. Targets are assigned to anti-sniper systems such that all target assignments are distributed optimally, making each easiest and most effective to hit back, such as by closest range and best line of sight bearing to the anti-sniper unit. Zoomed target camera data can
also be shared amongst anti-sniper systems. Target assignment can
be one to one, or by many to one, or assigned in a maximally
optimal tactical manner.
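One simple way to realize the "closest and best line of sight" assignment described above is a greedy pairing over visible unit-target combinations. This is an illustrative sketch; the data shapes and names are assumptions, and a fully optimal assignment could instead use, e.g., the Hungarian algorithm.

```python
import math

def assign_targets(units, targets, visible):
    # units, targets: dicts mapping id -> (x, y) position.
    # visible: set of (unit_id, target_id) pairs with line of sight.
    # Greedily pair the closest visible (unit, target) combinations,
    # one target per unit.
    pairs = sorted(
        (math.dist(u_pos, t_pos), uid, tid)
        for uid, u_pos in units.items()
        for tid, t_pos in targets.items()
        if (uid, tid) in visible
    )
    assigned, used_units, used_targets = {}, set(), set()
    for _, uid, tid in pairs:
        if uid not in used_units and tid not in used_targets:
            assigned[uid] = tid
            used_units.add(uid)
            used_targets.add(tid)
    return assigned
```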
[0036] The wind speed, wind direction, temperature, pressure, and other ballistic-factor sensors are measured at process block 106, distributed amongst units, and estimated by averaging multiple valid sensors to compute any firing compensation required over long ranges. The fire control arm is then moved to the coordinated assigned target vector, with the firing angle adjusted, automatically and/or manually, for aggregated wind speed, wind direction, temperature, air pressure, and range, the camera zoomed onto the target, and fire commanded if ordered at process block 108, where a hit target is assigned a probability of being disabled that is reported to the other anti-sniper systems. The system checks for shutdown at decision block 110; if not shutting down, a next-target check occurs at decision block 112, and if there is a next target, the control arm is rapidly positioned onto it. If there are no new targets, further target scanning occurs.
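The sensor aggregation and wind compensation at process blocks 106-108 might be sketched as follows. The averaging mirrors the description (mean of the valid sensors); the drift estimate is a deliberately crude rule of thumb, wind speed times time of flight, which treats the bullet as fully carried by the wind and ignores drag, so all names and formulas here are assumed illustrations rather than the application's ballistics.

```python
def aggregate_wind(readings):
    # readings: list of (speed_mps, is_valid) tuples shared by
    # networked units; average only the sensors marked valid.
    valid = [speed for speed, ok in readings if ok]
    if not valid:
        return None
    return sum(valid) / len(valid)

def crosswind_drift(range_m, muzzle_velocity_mps, wind_mps):
    # Crude upper-bound hold-off: lateral drift ~ wind speed times
    # time of flight. Real exterior ballistics uses the lag time
    # (actual minus vacuum time of flight) instead.
    time_of_flight = range_m / muzzle_velocity_mps
    return wind_mps * time_of_flight
```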
[0037] FIG. 8 shows how the coordinated anti-sniper systems can
work in action to detect, and suppress sniper fire. This is shown
in five stages: STAGE I 200, pre-detect; STAGE II 202, Barrel/Glint
detect; STAGE III 204, fire detect; STAGE IV 206, trajectory track;
STAGE V 208, coordinated fire return. Vehicles in motion can fire
in response while in motion using calculations to adjust, based on
the vehicles telemetry data or the first vehicle 56A may detect the
sniper, then become out of line of sight of the sniper, where the
second vehicle 56B may become in range (line of sight), and be able
to automatically be assigned the target because of position, and
programmatically respond near instantly with camera near
instantaneous zoom onto sniper for verification and firing upon as
the real time target data is wirelessly transferred from 56A to
56B. This data is able to be passed on to all vehicles and
personnel (56A, 56B, 56C, and 56D) in the group so that each
vehicle and personnel passing can fire back and the crew can
visually identify and verify target before firing. Although not
shown in FIG. 8, upon sniper detection, warning terrain zones can
be clearly marked in real time of each unit's HUD (particularly
useful for dismounted unit 56D) for anything within line of sight
or within projectile range of hostile detected. This automatic
target sharing system with a manual override is not limited to
vehicles; it can be applied to aircraft, ships, or a
combination.
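The 56A-to-56B handoff described above amounts to re-assigning a target to whichever unit still has line of sight. A minimal sketch, assuming hypothetical unit records and an externally supplied line-of-sight test (the names `handoff` and `has_los` are illustrative, not from the application):

```python
import math

def handoff(target, units, has_los):
    """Assign the target to the nearest unit that still has line of
    sight, as when 56A detects a sniper but 56B must take the shot."""
    candidates = [u for u in units if has_los(u, target)]
    if not candidates:
        return None   # no unit can engage; keep tracking only
    return min(candidates, key=lambda u: math.dist(u["pos"], target))

units = [
    {"id": "56A", "pos": (0.0, 0.0)},
    {"id": "56B", "pos": (50.0, 0.0)},
]
# Hypothetical LOS test: 56A's view is now blocked, 56B's is clear.
blocked = {"56A"}
best = handoff((60.0, 40.0), units, lambda u, t: u["id"] not in blocked)
# best["id"] == "56B"
```

In practice the assignment would also weigh weapons capability and threat level, as the later stages describe; distance-with-LOS is the simplest tie-breaker.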
[0038] In STAGE I 200 of FIG. 8, three armored personnel vehicles
56A, 56B, and 56C are shown with anti-sniper systems 2, where the
forward personnel vehicle 56A can be specialized in IED
(Improvised Explosive Device)/land mine detection. Dismounted
person 56D, with a miniaturized (trimmed down) soldier-carried
anti-sniper system 2, is shown between armored personnel vehicles
56A and 56B. The vehicles 56A, 56B, and 56C and dismounted person
56D are shown travelling from left to right in front of mountainous
terrain 64 under sky 66. A tree 67 is shown next to a building 57
with hidden sniper 71D. Occlusion regions that are out of sight of
the anti-sniper system 2 sensors are shown as 90A and 90B. These
occlusion regions can be calculated and displayed on the
anti-sniper HUD using terrain databases and data from depth
sensors. Other undetected sniper positions 71A, 71B, and 71C are
shown along mountain ridges 64.
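The occlusion-region calculation from terrain databases can be sketched as a sight-line march over a heightmap. This is an editor's illustrative sketch, assuming a grid heightmap in metres indexed `[y][x]` and a sensor mast height of 2 m (both assumptions, not figures from the application):

```python
import math

def is_occluded(heightmap, cell, sensor, sensor_h=2.0, step=1.0):
    """March a sight line from the sensor toward the cell over a
    terrain heightmap; the cell is occluded if terrain rises above
    the sight line anywhere along the way."""
    (sx, sy), (cx, cy) = sensor, cell
    dist = math.hypot(cx - sx, cy - sy)
    if dist == 0:
        return False
    z0 = heightmap[sy][sx] + sensor_h     # sensor elevation
    z1 = heightmap[cy][cx]                # ground at the queried cell
    n = max(1, int(dist / step))
    for i in range(1, n):
        t = i / n
        x = int(round(sx + t * (cx - sx)))
        y = int(round(sy + t * (cy - sy)))
        sight_z = z0 + t * (z1 - z0)      # interpolated sight line
        if heightmap[y][x] > sight_z:
            return True                   # terrain blocks the view
    return False

terrain = [[0, 0, 0, 0, 10, 0, 0, 0, 0, 0]]   # one row, a ridge at x=4
```

Running this test over every cell in sensor range yields the shaded zones 90A and 90B; depth-sensor returns would refine the heightmap where the database is stale.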
[0039] In STAGE II 202 of FIG. 8, the three armored personnel
vehicles 56A, 56B, and 56C, as well as dismounted person 56D, are
shown moved slightly forward, with occlusion zones 90A and 90B
updated accordingly. Armored personnel vehicle 56A detects barrel
or glint, or detects barrel radar reflection, of three snipers 71A,
71B, and 71C via non-occluded sensor sight lines 86, whereby
gimbaled weapon system 18 is automatically locked and zoomed onto
detected sniper target 71A via planned, calculated counter-fire
trajectory path 87. Detected sniper targets 71A, 71B, and 71C are
displayed with squares on them, recorded, encrypted, and wirelessly
shared amongst anti-sniper units in the system in real time (the
three armored personnel vehicles 56A, 56B, and 56C, as well as
dismounted person 56D), and can also be encrypted and communicated
in real time to other systems in the network (27, through wireless
communications 26 of FIG. 1, such as through satellite).
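The shared target reports must survive transit between units intact. A minimal sketch of the sharing path, using only Python's standard library: the report is serialised and tagged with an HMAC so receivers can verify it came from a unit holding the pre-shared group key (the key and field names are hypothetical; a fielded system would additionally encrypt the payload, as the text requires, which the standard library alone does not provide):

```python
import hashlib
import hmac
import json

SHARED_KEY = b"unit-group-key"   # hypothetical pre-shared group key

def pack_target(report):
    """Serialise a target report and append an authentication tag so
    receiving anti-sniper units can verify its origin and integrity."""
    payload = json.dumps(report, sort_keys=True).encode()
    tag = hmac.new(SHARED_KEY, payload, hashlib.sha256).hexdigest()
    return {"payload": payload.decode(), "tag": tag}

def unpack_target(msg):
    """Reject any report whose tag does not verify (tampered in transit
    or sent by a unit without the group key)."""
    payload = msg["payload"].encode()
    tag = hmac.new(SHARED_KEY, payload, hashlib.sha256).hexdigest()
    if not hmac.compare_digest(tag, msg["tag"]):
        raise ValueError("tampered or unauthenticated report")
    return json.loads(payload)

msg = pack_target({"id": "71A", "bearing_deg": 312.5, "range_m": 640})
report = unpack_target(msg)   # round-trips intact
```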
[0040] In STAGE III, after sniper units were preemptively detected
via IR glint or radar detection of barrels in STAGE II, target data
was passed to dismounted person 56D as a preemptive warning, and
the anti-sniper system (2 of FIG. 1) automatically recommends
taking an anti-sniper firing position on top of small hill 64,
based on terrain data calculations, by indicating optimal
positioning directions and annunciating sniper warnings in
dismounted person 56D's HUD. Dismounted person 56D scopes out and
aims at sniper 71A with line of sight 88, awaiting further command
and automatically reporting to the anti-sniper system (2 of FIG. 1)
units that target 71A is acquired by dismounted person 56D. Armored
personnel vehicles 56A, 56B, and 56C are shown moved further
forward to the right to optimize targeting of detected targets as
recommended, such as by annunciation of "unit `56A` move forward 30
meters" by the anti-sniper system (2 of FIG. 1). Sniper 71D inside
building 57 is spotted by armored personnel vehicle 56A's
anti-sniper system (2 of FIG. 1) via glint/barrel detection path
86, and the vehicle weapon is automatically positioned and locked
onto planned trajectory path 87. Armored personnel vehicle 56B is
shown in front of tree 67, locked onto detected sniper 71C via
glint/barrel sighting 86; in the planned real-time fire response,
the arm is rapidly and autonomously rotated and adapted onto the
calculated and sensed optimal trajectory 87 based on real wind
magnitude and angle acquired from wind direction and magnitude
sensors, which can be autonomously corrected for apparent wind from
vehicle motion as well as aggregated amongst sensors.
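The apparent-wind correction above is a vector identity: a wind sensor on a moving vehicle measures the true wind minus the vehicle's velocity, so recovering the true wind just adds the velocity back. A minimal sketch in 2-D (coordinate frame and values are illustrative):

```python
def true_wind(apparent, vehicle_v):
    """Recover the true wind vector from the apparent wind measured on
    a moving vehicle: the vehicle's motion contributes a headwind equal
    and opposite to its velocity, so true = apparent + vehicle velocity."""
    ax, ay = apparent
    vx, vy = vehicle_v
    return (ax + vx, ay + vy)

# A vehicle driving east at 10 m/s in still air feels a 10 m/s headwind;
# adding the vehicle velocity back recovers zero true wind.
w = true_wind((-10.0, 0.0), (10.0, 0.0))   # → (0.0, 0.0)
```

The recovered true-wind vectors from each unit can then be aggregated across sensors as described for process block 106.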
[0041] Friendly unit positions (56A, 56B, 56C, and 56D), weapon
angle, weapon target, and weapon or other ordnance firing (to
differentiate from enemy fire/explosions, such as by HUD color
display indicating whether a firing/explosion cause was friendly or
hostile) are autonomously reported, shared, and tracked in real
time, encrypted and wirelessly transferred, to avoid friendly fire
incidents and to optimize coordination and response of target
sharing, where data can be displayed/annunciated on all personnel
HUDs. Friendlies are clearly identified and marked in the HUD
display. Sniper targets 71B and 71A are also detected by IR glint
and/or radar barrel detection by armored personnel vehicles 56A and
56B, as shown by the detection paths 86. Armored personnel vehicle
56B has sniper 71C locked on with automated-positioning fire arm
planned trajectory path 87, where detection of sniper 71C's weapon
IR muzzle flash indicates that sniper 71C's weapon has fired,
further verifying the position if not preemptively detected.
Snipers can also be preemptively detected, targeted, and tracked
(via image processing or otherwise) by IR heat signature, which can
be visibly distinguished from terrain. Sniper 71B is shown scoped
out on line of sensor sight 86 by armored personnel vehicle 56C,
whose weapon is rapidly and automatically positioned, adapted, and
optimized based on calculated real wind and vehicle motion along
with trajectory equations using sniper 71B's three-dimensional
relative position. Snipers 71A and 71C are also detected by armored
personnel vehicle 56C, as shown by line of sensor sight paths 86.
The statuses of sniper targets 71A, 71B, 71C, and 71D, such as
firing trajectories, no longer firing, appears injured, or killed,
are reported and distributed as marked probabilities in real time.
In STAGE III, if no snipers were preemptively detected, they can
otherwise be detected at this stage by their weapon firing.
[0042] In STAGE IV, sniper 71C's bullet 304 has its trajectory 300
tracked in real time (using the mean of multiple triangulated
anti-sniper unit (2 of FIG. 1) sensors, throwing out spurious data
and noise) by high speed bullet tracking radar (32 of FIG. 1) sight
lines 302 on armored personnel vehicles 56A, 56B, and 56C. At this
stage it is clear that sniper 71C is hostile, whereby a highly
coordinated, efficient, semi-autonomous rapid return fire response
occurs at STAGE V, controlled by rapid autonomous zoomed visual
inspection: gyro-stabilized robotic weapon zoom views can be
rapidly shuffled through to verify whether each target is hostile.
Targets can be rapidly assigned in real time to optimal units based
on position, available weapons capability, threat level, and target
type.
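The "mean of multiple triangulated sensors, throwing out spurious data" can be sketched as a robust fusion of simultaneous per-unit position fixes. This is an illustrative 2-D version; the 5 m gate and the componentwise-median reference are editor assumptions, not parameters from the application:

```python
import math
import statistics

def fuse_fixes(fixes, gate_m=5.0):
    """Fuse simultaneous bullet-position fixes from several radar
    units: discard fixes more than gate_m metres from the componentwise
    median (spurious data/noise), then average the survivors."""
    mx = statistics.median(f[0] for f in fixes)
    my = statistics.median(f[1] for f in fixes)
    good = [f for f in fixes
            if math.hypot(f[0] - mx, f[1] - my) <= gate_m]
    n = len(good)
    return (sum(f[0] for f in good) / n, sum(f[1] for f in good) / n)

# Three consistent radar fixes plus one spurious return:
fixes = [(100.0, 50.0), (101.0, 49.0), (99.0, 51.0), (250.0, 90.0)]
est = fuse_fixes(fixes)   # the (250, 90) outlier is discarded
```

Repeating this fusion at each radar sample time yields the tracked trajectory 300; extrapolating the fused points back along the flight path gives the firing position.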
[0043] At STAGE V, the targets were verified hostile, engaged, and
destroyed in rapid, near-simultaneous succession, as shown by the
X's. If targets were missed or new targets were found, targets can
be re-assigned and transferred between units rapidly and
autonomously in real time, with target status (Engaged, New,
Detected, Tracked, Lost, Partly Destroyed, Destroyed) continually
updated, and with the numbers of rounds, round types, and round
source into each target tracked, recorded, and reported, as well as
used for computing the probability of the target being disabled. A
performance recording and measurement system can be incorporated
for post-battle analysis, whereby successful and unsuccessful
battles can be reinforced into the neural-fuzzy reinforcement
system, based on parameters such as injury types, numbers, and
fatalities, if any, to improve the system through the Monte Carlo
process.
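One simple way to turn tracked rounds into a probability of the target being disabled, as described above, is to treat each round as an independent chance to disable. This independence assumption and the per-round probabilities below are illustrative, not values from the application:

```python
def p_disabled(hits):
    """Probability the target is disabled, given recorded hits: a dict
    mapping round type to (count, per-round disable probability).
    Rounds are assumed independent, so the target survives only if it
    survives every round."""
    p_survive = 1.0
    for count, p in hits.values():
        p_survive *= (1.0 - p) ** count
    return 1.0 - p_survive

# Hypothetical per-round probabilities for two tracked round types:
p = p_disabled({"7.62mm": (3, 0.4), "12.7mm": (1, 0.8)})
# survive = 0.6**3 * 0.2 = 0.0432, so p ≈ 0.9568
```

The reported probability would then be broadcast with the other target status fields (Engaged, Destroyed, etc.) so that units do not re-engage a likely-disabled target.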
[0044] To help avoid friendly fire, each anti-sniper system 2 can
share its gun position, using gun orientation sensors to provide
gun barrel angle and the complete calculated planned fire
trajectory, which can be displayed on the HUD of each anti-sniper
system 2 user, where anti-sniper system 2 users are clearly
identified on the HUDs.
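With shared gun positions and planned trajectories, a unit can check whether its planned fire line passes dangerously close to a shared friendly position before firing. A minimal 2-D sketch (the 25 m safety radius and the straight-line trajectory are illustrative simplifications):

```python
import math

def crosses_friendly(gun_pos, target_pos, friendlies, safe_m=25.0):
    """Flag a planned fire line that passes within safe_m metres of
    any shared friendly position (segment-to-point distance check)."""
    gx, gy = gun_pos
    tx, ty = target_pos
    dx, dy = tx - gx, ty - gy
    seg_len2 = dx * dx + dy * dy
    for fx, fy in friendlies:
        # Parameter of the closest point on the fire line, clamped to it.
        t = 0.0 if seg_len2 == 0 else max(
            0.0, min(1.0, ((fx - gx) * dx + (fy - gy) * dy) / seg_len2))
        cx, cy = gx + t * dx, gy + t * dy
        if math.hypot(fx - cx, fy - cy) < safe_m:
            return True   # hold fire: friendly inside the safety corridor
    return False

# A friendly 10 m off a 500 m fire line blocks the shot; 200 m off does not.
blocked = crosses_friendly((0, 0), (500, 0), [(250, 10)])   # → True
clear = crosses_friendly((0, 0), (500, 0), [(250, 200)])    # → False
```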
REFERENCE NUMERALS
[0045] 2 anti-sniper targeting and detection system
[0046] 4 spherical high speed stereoscopic IR and/or visible camera and stereoscopic microphone system
[0047] 6 spherical high speed stereoscopic IR and/or visible camera system
[0048] 8A spherical microphone system left ears
[0049] 8B spherical microphone system right ears
[0050] 10 computer system
[0051] 12 accelerometers, compass, gyros/inertial reference (in case GPS is bad)
[0052] 14 differential or other global positioning system (GPS) navigation sensor, combined with omni-directional RGB-D (red, green, blue, depth) camera, along with glint IR laser detection system, barrel and bullet trajectory tracking radar detector
[0053] 16 zoom-able gimbaled camera
[0054] 18 gimbaled weapon system with laser designator, gyros
[0055] 20 wind direction and speed sensor
[0056] 22 user weapon fire control interface
[0057] 24 user display and input control
[0058] 26 wireless communications system
[0059] 27 other networked systems such as satellites, drones, aircraft, robots
[0060] 28 fire control switch
[0061] 30 user
[0062] 32 bullet trajectory radar
[0063] 34 omni-directional pre-snipe (i.e. laser to scope/eye glint, radar, barrel) sensor detection system
[0064] 38 muzzle sound detection peak
[0065] 40 muzzle sound echoes
[0066] 42 time detected at start of muzzle firing peak
[0067] 44 IR muzzle heat signature
[0068] 46 time detected at start of muzzle heat signature
[0069] 48 range equation using combination of detected acoustic and IR peak times
[0070] 50 magnitude axis of acoustic signal
[0071] 52 ambient microphone noise
[0072] 53 magnitude axis of IR signal
[0073] 54 ambient IR pixel noise
[0074] 56A forward armored personnel vehicle that carries the muzzle event target detection system (2), and that can also contain a mine or Improvised Explosive Device (IED) detection system
[0075] 56D dismounted person that carries a miniature or trimmed down anti-sniping system (2)
[0076] 57 building
[0077] 58 system mounting post
[0078] 60 swivel beam to support gimbaled weapons system with laser designator (18) and zoom-able gimbaled IR and/or visible camera (16)
[0079] 62 road
[0080] 64 terrains
[0081] 66 sky
[0082] 67 tree
[0083] 68 camera view rectangle/square
[0084] 70 camera projected corner through 3D space
[0085] 71 target
[0086] 72 true target (T1) vector
[0087] 74 true target (T2) vector
[0088] 76 target (T1) locked & engaged highlighted circle with square
[0089] 78 target (T2) lock highlighted circle
[0090] 80 target projected in plane at camera level or gun plane
[0091] 82 right camera center pixel
[0092] 84 left camera center pixel
[0093] 86 target detected by either glint from IR laser or radar barrel
[0094] 87 robotic fire arm target automatic engagement lock
[0095] 88 manual target engagement lock
[0096] 90 occlusion zones
[0097] 90A left occlusion zone
[0098] 90B right occlusion zone
[0099] 100 Flowchart process block: Scan for evidence of human presence/movement(s), as well as muzzle event(s), in IR images and microphones
[0100] 102 Flowchart condition block: Target or muzzle event(s) detected?
[0101] 104 Flowchart process block: Locate Target(s) and compute range(s)
[0102] 106 Flowchart process block: determine wind speed & direction
[0103] 108 Flowchart process block: move firing arm to target vector adjusting for wind speed & direction and range, zoom camera to target, fire, adjust, if commanded
[0104] 110 Flowchart condition block: shutdown?
[0105] 112 Flowchart condition block: next target?
[0106] 200 pre-detect stage I (no targets detected)
[0107] 202 Barrel/glint detection stage II (target detected via radar or IR/laser glint of scope/eye: pre-emptive offensive fire opportunity if verified threat, such as through zoomed visual/IR camera)
[0108] 204 Fire detection stage III (fire from target detected via IR muzzle flash and/or acoustic array)
[0109] 206 Bullet trajectory tracking stage IV
[0110] 208 Coordinated return fire stage V
[0111] 300 bullet trajectory
[0112] 302 bullet trajectory radar reflection
[0113] 304 bullet
Operation
[0114] The anti-sniper targeting and detection system operates by
automatically detecting target(s) in advance, calculating the
target positions relative to the system, computing the fire control
arm angles based on aggregate sensor data and trajectory
calculations, and rapidly moving the fire control arm to the target
in an assigned shared priority sequence, where targets can be
viewed, assessed, verified as unfriendly, and fired upon in rapid
succession. If the vehicle and targets are moving, the fire control
arm and target data can be continually adjusted accordingly.
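The "assigned shared priority sequence" above can be sketched as a priority queue of shared targets, popped in threat order. A minimal sketch; the priority numbering and target records are illustrative assumptions:

```python
import heapq

def engagement_order(targets):
    """Return target IDs in shared priority order (lower number =
    higher threat), matching the sequence in which targets are viewed,
    verified, and fired upon."""
    heap = [(t["priority"], t["id"]) for t in targets]
    heapq.heapify(heap)
    order = []
    while heap:
        order.append(heapq.heappop(heap)[1])
    return order

targets = [
    {"id": "71C", "priority": 1},   # actively firing: highest threat
    {"id": "71A", "priority": 2},
    {"id": "71B", "priority": 3},
]
# engagement_order(targets) → ["71C", "71A", "71B"]
```

As targets are hit or lost, their entries would be re-inserted with updated priorities, matching the continual status updates described in STAGE V.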
* * * * *