U.S. patent application number 13/743695 was filed with the patent office on 2013-01-17 and published on 2013-12-26 for target shot placement apparatus and method.
The applicant listed for this patent is Jonathan D. Lenoff. Invention is credited to Jonathan D. Lenoff.
Application Number: 13/743695
Publication Number: 20130341869
Family ID: 49773766
Filed: January 17, 2013

United States Patent Application 20130341869
Kind Code: A1
Inventor: Lenoff, Jonathan D.
Published: December 26, 2013
Target Shot Placement Apparatus and Method
Abstract
A target shot placement apparatus and methods thereof are
described according to the teachings of the present invention.
Inventors: Lenoff, Jonathan D. (Erie, PA)
Applicant: Lenoff, Jonathan D.; Erie, PA, US
Family ID: 49773766
Appl. No.: 13/743695
Filed: January 17, 2013
Related U.S. Patent Documents

Application Number: 61587899
Filing Date: Jan 18, 2012
Current U.S. Class: 273/406
Current CPC Class: F41J 5/14 (2013.01); F41J 9/02 (2013.01); F41J 5/02 (2013.01)
Class at Publication: 273/406
International Class: F41J 9/02 (2006.01)
Claims
1. A target shot placement apparatus, comprising: a housing which includes a target support structure, a projector, and a first camera; wherein said target support structure includes a target frame and a first sheet; wherein the back surface of said first sheet is in direct contact with said target frame; a transporting means for said housing, wherein said means is affixed to said housing; and wherein the line of sight of said first camera includes a surface of said first sheet.
Description
CROSS-REFERENCE TO RELATED APPLICATIONS
[0001] This application claims the benefit of provisional
Application No. 61/587,899, filed Jan. 18, 2012, which is
incorporated by reference herein.
BACKGROUND
[0002] In order to maintain proficiency in the use of firearms, it is common for law enforcement officers, sportsmen, military personnel, and other individuals to engage in target practice, which provides valuable training that increases skill and efficiency with a firearm.
[0003] Target practice can also improve group cohesion, efficiency, and effectiveness when the individuals in a group deal with situations involving firearms or other weapons. Accordingly, target practice increases the ability of an individual or group to use a firearm safely and effectively.
[0004] The use of shooting ranges for target practice provides a
level of training which is difficult to duplicate in other types of
target practice. Shooting ranges can provide multiple targets,
moving targets, and other stimuli which may increase the
effectiveness of the target practice in training the individual or
group.
[0005] Target practice is categorized into basic target practice in
which a trainee improves his hitting accuracy when using live
bullets, and advanced target practice in which the trainee shoots
while judging a suitable timing and situation for firing.
[0006] In target practice, a trainee generally shoots at a
stationary, moving or bobbing target, and the trainee or a judge
visually checks the impact position on the target to evaluate the
hitting accuracy and the ability of the trainee to make a proper
circumstantial judgment.
[0007] To automatically and safely check such an impact position,
various target practice apparatuses have been proposed. However,
position detection mechanisms that are disclosed in the art are not
suitable for use with live bullets, or require modifications to the
weapon, the shooter or the shooter's position.
[0008] There exists no method to effectively and remotely operate a
moving target capable of dynamically changing the target displayed,
its orientation and location while providing valuable information
to the shooter, judge, or training personnel. Accordingly, there is
a need for new and improved target shot placement apparatuses that
allow for the use of live bullets without the need to modify the
weapon, the shooter, the shooter's position or the shooter's other
equipment.
SUMMARY
[0009] Target shot placement apparatus embodiments and methods of
use thereof are described according to the teachings of the present
invention.
DESCRIPTION OF DRAWINGS
[0010] FIG. 1 is a perspective front view of a first embodiment of
the present invention.
[0011] FIG. 2 is a perspective top view of the embodiment of FIG.
1.
[0012] FIG. 3 is a perspective top view of a second embodiment of
the present invention.
[0013] FIG. 4 is a side view of the embodiment of FIG. 1.
[0014] FIG. 5A is a front cross-section view along section 5A-5A of
FIG. 4. FIG. 5B is a rear cross-section view along section 5B-5B of
FIG. 4.
[0015] FIGS. 6, 7, 8, 9, 10, 11, and 12 show various alternative
arrangements of the target sheets and surfaces, cameras, and
projectors for use in, for example, the embodiments shown in FIG. 1
through FIG. 5.
[0016] FIG. 13A is a cross-section view along section 13A-13A of FIG. 13B; together, FIG. 13A and FIG. 13B display the lines of sight of the various possible cameras and projectors shown in FIGS. 6, 7, 8, 9, 10, 11, and 12.
[0017] FIG. 14A displays an alternative embodiment of the target
surface shown in FIG. 1 through FIG. 5.
[0018] FIG. 14B shows a top cross-section view along section
14B-14B of the target surface and stand displayed in FIG. 14A.
[0019] FIG. 14C shows an enlarged portion circled in FIG. 14B.
[0020] FIG. 14D shows a side cross-section view along section
14D-14D of the target surface and stand displayed in FIG. 14A.
[0021] FIG. 14E shows an enlarged portion circled in FIG. 14D.
[0022] FIGS. 15A, 15B, 15C, 15D, 15E, 15F, 15G, 15H, 15I and 15J
show target surfaces used in various manners in embodiments of the
present invention.
[0023] FIGS. 16A, 16B and 16C display a target surface with an
embodiment of interactive targeting for the invention.
[0024] FIGS. 16D, 16E, and 16F display a target surface with a
second embodiment of interactive targeting for the invention.
[0025] FIG. 17 displays an embodiment of the invention at different
points in time as it interacts with multiple users.
[0026] FIG. 18 displays an embodiment of the invention at different
points in time as it interacts with a single user.
[0027] FIG. 19A, FIG. 19B, and FIG. 19C show a combination of video
feed and target projection that may be used in conjunction with one
or more embodiments of the invention.
[0028] FIG. 20A, FIG. 20B, and FIG. 20C show various embodiments of
possible display and control screens for user interaction with
embodiments of the invention.
[0029] FIG. 21A shows a top view of a user interacting with an
embodiment of the invention, with locations labeled for equations
to calculate the shooter position, as described in the detailed
description below.
[0030] FIG. 21B shows a side view of the user and embodiment of
FIG. 21A, with locations labeled for equations to calculate the
shooter position, as described in the detailed description
below.
[0031] FIG. 22 shows a top view of a user interacting with an
embodiment of the target, with distance and angle variables labeled
for equations to calculate the shooter position, as described in
the detailed description below.
[0032] FIG. 23A shows a front view of an embodiment of a target
stand.
[0033] FIG. 23B shows a side cross-sectional view along section
23B-23B of the target embodiment of FIG. 23A, with camera's field
of vision, projection throw, and lighting elements illustrated
therein.
[0034] FIG. 23C shows a front cross-section view along section
23C-23C of the upper portion of the target embodiment of FIG.
23B.
[0035] FIG. 23D shows a rear cross-section view along section
23B-23B of the upper portion of the target embodiment of FIG.
23B.
[0036] FIG. 24A and FIG. 24B show the interaction with a user
through a rotation of the target.
[0037] FIG. 25 shows a side view of a roll system included in one
or more embodiments of the invention.
DETAILED DESCRIPTION
[0038] The present invention will now be described more fully with
reference to the accompanying drawings in which alternate
embodiments of the invention are shown and described.
[0039] It is to be understood that the invention may be embodied in
many different forms and should not be construed as limited to the
illustrated embodiments set forth herein. Rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the scope of the invention to those skilled in the art.
[0040] Referring to FIG. 1, embodiments of the present invention
are directed to a shot placement apparatus 10 that is utilized to
train individuals and/or groups in live fire exercises. Although the description herein refers to bullets and live fire, the device may be used with other types of projectiles or transmissions, for example, arrows, BBs, lasers, paintballs, and so on.
[0041] The shot placement apparatus 10 comprises a target stand 12
and a housing 14. In one embodiment, the target stand 12 comprises
a target sheet 16 and a target support structure 18 (depicted in
FIG. 2).
[0042] The target sheet 16 provides a surface on which a training
image 20 is displayed.
[0043] In one embodiment, the target sheet 16 is made of a material
for displaying a projected image in the most life-like fashion.
[0044] In one or more embodiments, the target sheet 16 may be made of, for example, a resilient material capable of withstanding multiple rounds of training, a disposable paper-like surface, a bullet-proof material, or a combination thereof.
[0045] In yet another embodiment, the target sheet 16 has
pre-printed markings.
[0046] In one specific embodiment of the invention, the target
sheet 16 is a target roll which is rolled and can be replaced
automatically without human interaction or without a need to stop
the training exercise.
[0047] This target sheet 16 ensures a consistent shooting surface
for accuracy and reliability, provides for the best surface to
display the target image 20, and may aid in the position detection
of impact locations.
[0048] As illustrated in FIG. 2, the target support structure 18
may include a frame 22 and at least one leg 24. In this embodiment,
the frame 22 surrounds the periphery of the target sheet 16 to
provide support and stability for the target sheet 16.
[0049] The leg 24 comprises a top end 25 and a bottom end 27, and
is attached to the lower surface of the support structure 18 at the
top end 25, extending downward. Leg 24 may be connected to the
rotatable base 40 (depicted in FIG. 3) at the bottom end 27 to
provide support for the target sheet 16 and target stand 12.
[0050] With reference to FIG. 3, one embodiment of the invention as
herein described by way of example, is directed to a housing 14.
With reference to FIG. 5A, FIG. 5B, and FIG. 3, the housing 14
comprises a bottom wall 28, a front wall 30, a back wall 32, and
two opposing sidewalls, the left sidewall 34 and the right sidewall
36, respectively.
[0051] The walls 28, 30, 32, 34 and 36 form a cavity 38 (depicted
in FIG. 1) within the housing 14.
[0052] As illustrated in FIG. 3, a rotatable base 40 is fitted in
the housing cavity 38 and is attached to the bottom wall 28 inside
the cavity 38 through, for example, the rotary connection 58
(depicted in FIG. 5B) which allows free rotation while maintaining
necessary electrical connections. A communication antenna 41 for
receiving or sending, for example, projection instructions,
movement instructions, or for sending, for example, target hit
success, is attached to rotatable base 40 but may also be attached
elsewhere on the device. This antenna 41 could allow users to interact with the unit using, for example, personal electronics or computers over communication protocols such as wifi, radio, cellular telephone, or satellite communication. In another embodiment, this antenna 41 is used to receive data such as a GPS signal to keep track of the unit's position and aid in semi- or fully autonomous operation.
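Commands arriving over the antenna link could be handled by a small parser on the onboard computer. The sketch below is purely illustrative: the text wire format and the command vocabulary (MOVE, ROTATE, PROJECT, REPORT, STOP) are invented for this example and do not appear in the specification.

```python
def parse_command(datagram: bytes):
    """Parse a 'VERB arg1 arg2 ...' text command received over the
    antenna link into a (verb, args) pair.

    The verbs accepted here are hypothetical examples only."""
    parts = datagram.decode("ascii").strip().split()
    if not parts:
        raise ValueError("empty command")
    verb = parts[0].upper()
    if verb not in {"MOVE", "ROTATE", "PROJECT", "REPORT", "STOP"}:
        raise ValueError("unknown command: " + verb)
    return verb, parts[1:]
```

A datagram such as `b"ROTATE 45"` would then map to a rotation instruction for the rotatable base 40.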
[0053] In one or more embodiments, the rotatable base 40 is
cylindrical in shape and can freely rotate on the bottom wall 28
within the cavity 38.
[0054] In one or more embodiments, the bottom end 27 of the leg 24
of the target stand 12 (depicted in FIG. 2) is connected to the
rotatable base 40 to allow for the movement of the target stand
12.
[0055] In yet another embodiment, the rotatable base 40 is fixed to
the bottom wall 28 and moves relative to the shot placement
apparatus 10.
[0056] In one or more embodiments, the placement apparatus 10 is
highly mobile and powered with autonomous or semi-autonomous
operation to simulate real-life targets and situations with
accurate movement of the target sheet 16.
[0057] Referring to FIG. 1, in one specific embodiment, the shot
placement apparatus 10 further comprises at least one
transportation device 42, which allows for the mobility of the
placement apparatus 10. Non-limiting examples of the transportation
device 42 include a wheel, track or pedrail. Accordingly, the
apparatus 10 is highly mobile in a vast variety of terrains and
conditions whether indoors or outdoors. In one specific embodiment,
the apparatus operates at different speeds to simulate real-life
scenarios, and may mimic vehicles, people, objects, or animals.
However, the transportation device 42 is not necessary in other
embodiments of the invention, and the device 10 does not
necessarily need to travel during use.
[0058] As shown in FIG. 4 and FIG. 5A, in one or more embodiments, outer front, right, outer back, and left control cameras 47, 50, 51, and 52 may assist the device 10 in automated or guided motion or in the detection of shooters. These control cameras 47, 50, 51, 52, or other cameras on the unit, may aid in the unit's interaction with the user by tracking the user's movements so that the user's actions can be used as inputs to, for example, begin training scenarios, stop training scenarios, choose between different programs, and so on. As an alternative or in addition, an outer front rotatable control camera 49 (depicted in FIG. 1) may be included for the same purposes.
[0059] With reference to FIG. 5A and FIG. 5B, the inside of housing
14 may include an internal cavity 54 for support electronics and
propulsion mechanisms 56, with a rotary electrical connection or
conduit 58 that may be used, for example, for connecting wires
60.
[0060] The combination of the rotatable base 40 and the transportation device 42 may be used to present the target sheet 16 in the most effective orientation with respect to the shooter(s) in a variety of terrains and conditions. Referring to FIG. 24A and FIG. 24B, the rotatable base 40 can also be used to aid in the detection of the impact location of the projectiles. The rotatable base 40 includes a point 1040 about which the target stand 26 rotates. This target stand 26 may include a front sheet 1020 and a back sheet 1030. A projectile traveling along line of motion 1045 will first strike the front sheet 1020 at a particular point 1050 and then continue to travel and strike the back sheet 1030 at a different point 1055. If the target stand 26 is rotated about the center of rotation 1040, then even when a second projectile travels along a line of motion 1046 identical to the first projectile's line of motion 1045, the angle 1075 of the second projectile's line of motion 1046 relative to the target stand 26 or front sheet 1020 will differ from the angle 1070 of the first projectile's line of motion 1045 relative to the target stand 26. The rotation thus creates unique impact points 1060 and 1065 in the front and back sheets, respectively, distinct from the original impact points 1050 and 1055. This rotation allows for optimum operation: areas of the front sheet 1020 or back sheet 1030 will not become over-fatigued from repeated impacts, as the impacts are spread out, increasing the lifetime and usability of the sheets in target stand 26. The rotation also allows the unit to accurately and consistently calculate the position of impacts (for example, 1050, 1055, 1060, and 1065) regardless of the projectile's line of motion 1045 or 1046, and addresses the problem of "through hole" detection: a new projectile traveling through the same location in a target as a previous projectile would not necessarily make a new marking or hole, making it very difficult to discern where that projectile actually hit the target sheet 16.
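The two-sheet geometry described in this paragraph can be sketched numerically. This is an illustrative calculation only, reduced to a single horizontal axis; the function and variable names are assumptions and are not drawn from the specification.

```python
import math

def line_of_motion_angle(front_hit_x, back_hit_x, sheet_separation):
    """Angle (radians) between the projectile's line of motion (e.g. 1045)
    and the normal of the front sheet (e.g. 1020), computed from the
    horizontal offset between the front-sheet impact (e.g. 1050) and the
    back-sheet impact (e.g. 1055), with the sheets `sheet_separation`
    apart."""
    return math.atan2(back_hit_x - front_hit_x, sheet_separation)
```

With the sheets 0.20 m apart and impacts recorded at 0.10 m and 0.15 m, the angle is about 0.245 rad (roughly 14 degrees). Rotating the stand changes this angle even for an unchanged line of motion, which is what makes repeated impacts distinguishable.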
[0061] Referring back to FIG. 3, in accordance with embodiments of the present invention, a projector 44 is fitted in the cavity 38 of the housing 14 and projects the image 20 onto the target sheet 16. The projected image 20 originates from a computer or other digital media storage device and is projected onto the target sheet 16 through the projector 44.
[0062] In one specific example, the projector 44 and target sheet
16 rotate independently of each other, and independently of the
apparatus 10 to present the shooter with different angles of
attack. In addition, such movement allows the apparatus 10 to move
around without affecting the shooter's target as shown in FIG. 17
and FIG. 18.
[0063] In yet another embodiment, as depicted in FIG. 2, at least one anterior camera 46 is placed inside the cavity 38 and focused on the target sheet 16 to accurately calculate exact shot placement. A posterior camera 48 may be placed inside the cavity 38, in addition to or in place of the anterior camera 46, and focused on the back of the target sheet 16.
[0064] Anterior camera 46 and posterior camera 48 collect the
information relating to the exact placement of the shot and feed
this information to a computer system which will store the data and
transmit the necessary information back to the shooter or any other
designated individual through the use of wired or wireless
communication.
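One plausible way for the computer system to locate a new impact from these camera feeds is to difference consecutive frames of the sheet. The sketch below does this in pure Python on small grayscale frames; it is an assumed approach for illustration, since the patent does not specify a detection algorithm.

```python
def detect_new_impact(before, after, threshold=40):
    """Compare two grayscale frames of the target sheet (lists of rows of
    0-255 pixel values) and return the centroid (row, col) of the pixels
    that changed by more than `threshold`, or None if nothing changed."""
    changed = [(r, c)
               for r, row in enumerate(after)
               for c, value in enumerate(row)
               if abs(value - before[r][c]) > threshold]
    if not changed:
        return None
    n = len(changed)
    return (sum(r for r, _ in changed) / n,
            sum(c for _, c in changed) / n)
```

In practice the camera image would also need to be mapped from pixel coordinates to positions on the sheet, a step omitted here.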
[0065] The combination of anterior camera 46 and posterior camera
48 may be used to calculate such information as the speed or firing
rate of the projectile(s) that move through the target sheet 16
and/or rear target sheet 660 (depicted in FIG. 6).
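Assuming the computer time-stamps the moment each camera first records an impact, the speed and firing-rate calculations mentioned here reduce to simple arithmetic. This sketch is illustrative; the function names and the use of a fixed sheet separation are assumptions.

```python
def projectile_speed(t_front, t_back, sheet_separation):
    """Average projectile speed between the target sheet 16 and the rear
    target sheet 660, from the impact timestamps (seconds) at each sheet."""
    dt = t_back - t_front
    if dt <= 0:
        raise ValueError("rear-sheet impact must follow front-sheet impact")
    return sheet_separation / dt

def firing_rate(impact_times):
    """Shots per second, from the timestamps of consecutive impacts on the
    front sheet."""
    if len(impact_times) < 2:
        return 0.0
    return (len(impact_times) - 1) / (impact_times[-1] - impact_times[0])
```

For example, a 0.3 m separation crossed in 1 ms corresponds to roughly 300 m/s.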
[0066] Generally, the computer system affords access and
information to individuals, including but not limited to the
shooter and/or instructor, through a server, an interface and/or
database. The server, the interface, and/or the database are
realized by at least one processor executing program instructions
stored on machine readable memory. The present invention is not
necessarily limited to any particular number, type or configuration
of processors, nor to any particular programming language, memory
storage format or memory storage medium.
[0067] Furthermore, in implementations of the server involving
multiple processors and/or storage media, the system is not limited
to any particular geographical location or networking or connection
of the processors and/or storage media, provided that the
processors and/or storage media are able to cooperate to execute
the disclosed interfaces and databases. Additionally, it is not
necessarily required that the processors and/or storage be commonly
owned or controlled.
[0068] In yet another embodiment, the apparatus 10 will have the
capability to react dynamically to the shooter's accuracy, firing
rate, and/or weapon selection.
[0069] The apparatus 10 offers the flexibility of a combination of
the selection of the projected image or video, modification to the
angle of the target sheet in relation to the shooter, as well as
moving the system as a whole.
[0070] As depicted in FIGS. 5A and 5B, several cameras 47 (depicted
in FIG. 2), 50, 51, and 52 may be positioned around the exterior of
the housing 14 to provide the necessary information to the
computer, the shooter, and any other designated individuals.
[0071] Such information includes, but is not limited to, the
shooter's posture during different shots in correlation to shot
placement, the distance from the apparatus 10 to the shooter, team
formations during exercises, video feed of the surrounding
terrain/area, as well as breach and room sweeping
effectiveness.
[0072] In one embodiment and as depicted in FIG. 5A, FIG. 5B, and
FIG. 2 by way of example, a front camera 47 is placed on the
exterior of the front wall 30; a back camera 51 is placed on the
exterior of the back wall 32; a left side camera 52 is placed on
the exterior of the left side wall 34; and a right side camera 50
is placed on the exterior of the right side wall 36 of the housing
14.
[0073] In one specific embodiment of the invention, the cameras 47,
50, 51, and 52 may each be placed within a bullet-proof,
transparent enclosed structure to avoid damage to the cameras.
[0074] In yet another embodiment, the housing 14 comprises armor material to protect against bullets, ricochets, and vibrations. Housing 14 may be wrapped in a removable, impact-absorbent casing (not shown) to minimize the risk of bullets ricocheting off of the housing 14.
[0075] In addition, safety measures are taken to protect the
electronics, other components, and all individuals present during
training exercises.
[0076] The cameras 47, 50, 51, and 52 gather information from different angles surrounding the apparatus 10 in real time, assess the placement of the shooter and other factors which may affect training, and transmit this information to a data collection device.
[0077] This data collected from cameras 47, 50, 51, and 52 can also
be used to aid navigation and movement of the system to either an
operator or the onboard computer allowing for autonomous or
semi-autonomous operation.
[0078] In yet another embodiment, a software program is utilized
that provides such real-time information in an effective and
sensible data-format to the shooter and other interested parties.
The interested parties have the possibility to store or print this
data. Data can also be relayed to provide shooting instructions
such as firing commands or sequences through the use of speakers,
headphones, monitors, projected images, personal electronics such
as cellular phones or tablets, or optional attachments to the
shooter's body or weapon.
[0079] FIGS. 6 through 12 refer to exemplary alternative arrangements of the target, one or more cameras, and one or more projectors within, for example, the housing 14 of FIG. 3. These
arrangements, however, are by way of demonstration only, and one of
ordinary skill in the art could conceive of virtually limitless
combinations and arrangements of target(s), camera(s), and/or
projector(s) that would still fall within the invention as
disclosed.
[0080] Referring first to FIG. 6, a target sheet 650 is adjacent to
a front frame 652. Target sheet 650 may be replaced in whole or in
part by additional continuous sheets on rolls 670. In the
embodiments disclosed, the rolls 670 are automated so that a user
is not required to approach the target shot placement apparatus 10
in order to refresh a punctured or damaged portion of sheet 650
during a training/shooting session. However, rolls 670 could be
manually rotated as well.
[0081] Back sheet 660 is adjacent and affixed in place against back
frame 662. It would also be possible, for example, to have a
similar set of rolls for the back sheet if it was necessary to
replace a damaged or punctured portion of back sheet 660.
[0082] Projector 610 may project a targeting image 612 onto, for
example, the front surface of target sheet 650. A front camera 620
is positioned so that its line of sight 622 will include the front
surface of target sheet 650. A rear camera 630 is positioned so
that its line of sight 632 will include the back surface of back
sheet 660.
[0083] Referring now to FIG. 25, in one or more embodiments the
rolls 670 may be fabricated in a unique manner to create a
different type of target roll system 1105. This target roll system
1105 allows for a multitude of target sheets, for example, 1080,
1085, 1090, and 1095 to be connected together at junctions 1100 to
be selected by either the user or the onboard computer by simply
rolling/unrolling the target roll 1105 to a different location.
These different target sheets 1080, 1085, 1090, and 1095 may vary,
for example, by material composition, surface treatments, colors,
or thicknesses. In one or more embodiments, one of the different
target sheets 1080, 1085, 1090, or 1095 may have preprinted
markings while the others are more suitable to display projected
images. This target roll system 1105 would allow the unit to
operate in a variety of weather, lighting, and training
conditions/scenarios to present an optimal target sheet to the user
and aid in the detection of the impacts. This can be useful as a
particular target sheet 1080, 1085, 1090, or 1095, can, for
example, substantially change the manner in which an image is
displayed or projected. For example, depending on ambient lighting conditions and a user's preference, the target surface gain (a measure of the reflectivity of light incident on the surface) and the target surface treatments can vary from surface to surface so that the necessary and desired resolution, contrast, color shift, luminance, and viewing angle can be obtained. Another advantage of these different surfaces 1080, 1085, 1090, and 1095 is that properties such as flame retardancy, mildew resistance, and tear resistance can be changed in real time as training scenarios or training conditions change. This target roll 1105 may also allow the user to selectively change out portions of the target sheets as necessary.
[0084] Referring now to FIG. 7, the arrangements of projector 610,
front camera 620, and back camera 630 are similar to those in FIG.
6. However, roll 670 is omitted. This omission may be optimal in
situations where target sheet 650 is, for example, not flexible
enough to be rolled, or where sheet 650 is a particular type of
target not available in roll format, or where sheet 650 is not
subject to damage (e.g., live fire is not being used) and therefore
does not need to be replaced.
[0085] Referring now to FIG. 8, an alternative arrangement is shown
where cameras 720, 724, 730, and 734 are arranged between/above and
between/below target sheet 650 and back sheet 660. The line of
sight 712 of projector 710 includes the front surface of target
sheet 650. The line of sight 722 of inside top front camera 720
includes the back surface of target sheet 650. The line of sight
732 of inside top back camera 730 includes the front surface of
back sheet 660. The line of sight 726 of inside bottom front camera
724 includes the back surface of target sheet 650. The line of
sight 736 of inside bottom back camera 734 includes the front
surface of back sheet 660. Similar to FIG. 6, a roll 770 of
additional sheet may be included in order to replace target sheet
650 when necessary. FIG. 9 shows an embodiment similar to FIG. 8,
only without the rolls 770.
[0086] The use of internal cameras 720, 724, 730, 734, particularly
when the internal cameras are protected by camera enclosures 777
(see FIG. 14E), is advantageous because, for example, the interior
detection method will be less susceptible to dirt or debris from
outdoor use that may affect external cameras or the front of the
target sheet 750 or the back of the back sheet 762 (see FIG. 3). In
addition, any lighting 782, 783 (see, e.g., FIG. 14C and FIG. 14D)
used in between an enclosed area of the target sheet and back sheet
will be subject to greater control by the user, than for example,
external lighting, particularly if the device 10 is in motion.
Moreover, an optimal target surface for projecting a target image
(e.g., the front of the target sheet 750) may not be the optimal
surface for projectile impact detection.
[0087] Referring now to FIG. 10, an alternative arrangement is
shown with only one frame 752, with the target sheet 750 affixed
thereto. Projector 710 has a line of sight 712 that includes the
front of target sheet 750. Only two cameras 720, 724 are used in
this embodiment, and both cameras' lines of sight 722, 726 include
the back of target sheet 750.
[0088] Similarly, an alternative embodiment at FIG. 11 shows only
one frame 652 to which target sheet 650 is affixed. Front camera
620 has a line of sight 622 that includes the front surface of
target sheet 650. Back camera 630 has a line of sight 632 that
includes the back surface of target sheet 650.
[0089] Another alternative arrangement is shown in FIG. 12, which
includes a mirror or reflective material 680. Camera and/or
projector 682 may be placed directly beneath (or above) target sheet 650. Camera and/or projector 682 has a line of sight/image
684 that reflects off of mirror 680 to create a new line of
sight/image 686 that thereby includes the front surface of target
sheet 650. As with other embodiments, this arrangement may include
one or more sheet rolls, additional cameras, additional projectors,
and so on.
[0090] FIG. 13A shows various cross-hatching designs and their overlap to demonstrate the combined example lines of sight and projected images discussed above in reference to FIGS. 6-12,
with respect to projector 610/710, front camera 620, back camera
630, inside top front camera 720, inside top back camera 730,
inside bottom front camera 724, inside bottom back camera 734, and
camera and/or projector 682. The hatching marked as 712 represents
the images projected from the projector 610/710 that can be used
for displaying targets or providing user feedback. The hatching
marked as 713 represents the primary line of sight 713 of the front
camera 620 and can be seen to overlap with the projected image 712
in FIG. 13B. Hatching 714 refers to the primary line of sight for
the back camera 630. Hatching 715 refers to the primary line of
sight of the top front camera 720. Hatching 716 refers to the
primary line of sight of the inside top back camera 730. Hatching
717 refers to the primary line of sight of the inside bottom back
camera 734. Hatching 718 refers to the primary line of sight of the
inside bottom front camera 724. Hatching 719 refers to the primary
line of sight of the camera/projector 682. These lines of sight are
referred to as primary lines of sight because a selection of camera
lens can substantially modify the line of sight.
[0091] Referring to FIGS. 14A-14C, a target stand 26 may include
internal cameras 724, 734, as, for example, shown in the
arrangement of FIG. 9. FIG. 14C shows enlarged circled portion 780
of FIG. 14B. In addition, the target stand 26 may include internal
lights 782 that project light onto and between the back surface of
target sheet 650 and the front surface of back sheet 660. FIG. 14D
shows a side cross-section of target stand 26. FIG. 14E shows an
enlarged circled portion 784 of FIG. 14D. Leg 24/main support 702
includes a conduit 704 for control and power wires (not shown) that
may be connected to internal cameras 724, 730, roll 770, lighting
782, 783 and so on.
[0092] Internal cameras 724, 730 are protected from live fire by
camera enclosure 777, which may be made of a bullet-proof material.
Cameras 724, 730 are held in position by various camera support
structures 778 as well as other internal support structures 740,
which are well-known to those of ordinary skill in the art. Said
camera support structures 778 may also include conduits (not shown)
for power or control wires (not shown).
[0093] Each roll 770 includes support bar 776. This support bar 776
also acts as the center of rotation for roll 770 and provides for a
shaft to apply powered rotation from a motor (not shown) in order
to automatically rotate the roll 770. When an exchange of target sheet 750 is desired, additional sheet may be drawn from roll 770, exiting the protective camera enclosure 777 at 774. The used target sheet 750 is then wound onto the top roll 770 at 772. Of
course, the functions of the top and bottom rolls may be reversed
so that the top roll unspools the sheet and the bottom roll takes
in the sheet.
[0094] FIGS. 15A-15J represent various images or scans projected
onto, for example, the front or back of target sheet 750. The
hatching of FIG. 15A represents a target surface where the
projector 710 is off. FIG. 15B displays a hatching representing
where a target is ready for use by a user, where an image from, for
example, a projector 710 is placed onto the target sheet 750, or
where the target sheet 750 is otherwise marked. FIG. 15C shows an
impact area 800 causing, for example, a hole or other marking on
the target sheet 750.
[0095] Referring now to FIGS. 23A-23D in conjunction with FIGS.
15A-15C referenced above, the impact hole 800 is shown at 2301. In
addition, a second hole is created at 2302 as the projectile
travels through the inside of target stand 26 and exits through the
back sheet 762.
[0096] The position of projectile impact holes 800 at 2301 and 2302
will create a distinct light pattern, as shown by light projections
2351, 2352, 2353, 2354, and 2356.
[0097] First, light 2355 originating at projector 610 will come
through the hole at 2301, creating an
extension of the light 2356 that projects onto the front of back
sheet 762 at 2303. In this embodiment, the projector 710 and target
stand 26 are in fixed positions relative to one another, so that
light projection at 2356 remains consistent and allows cameras 720,
724, 730, and 734 to consistently record the location of the light
at the front sheet and back sheet at 2301 and 2303, respectively.
Thus, the fixed relative positions of the projector and target
stand assist in avoiding the problem of changing light patterns at,
for example, 2356, when, for example, the entire device 10 is in
motion and may be subject to changing light angles from various
external lighting sources. The light 2355 coming from the projector
710 comes from a known location and angle relative to the target
sheet 750. There can be only one point of origin 2301 where the
hole 800 would allow light 2353 to pass through and create the
particular light pattern at 2303. By examining the orientation and
geometry of the light 2356 at 2303, the characteristics of the
impact hole at 2301 may be determined by considering the angle at
which the projected light 2355 hits the target sheet 750 along with
other characteristics such as the distance between 2301 and
2303.
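By way of example only, the relationship between the hole at 2301, the light spot at 2303, and the beam's angle of incidence may be sketched numerically as follows; the function name and the numeric dimensions are illustrative assumptions, not part of the specification:

```python
import math

def beam_angle_from_spot(sheet_gap, in_plane_offset):
    """Return the angle (degrees) between the projector beam and the
    normal of the target sheet, recovered from where light passing
    through the front hole (2301) lands on the back sheet (2303).

    sheet_gap       -- perpendicular distance between front and back sheets
    in_plane_offset -- distance, in the plane of the back sheet, between
                       the point directly behind the hole and the light spot
    """
    return math.degrees(math.atan2(in_plane_offset, sheet_gap))

# A spot offset equal to the sheet gap implies a 45-degree beam.
angle = beam_angle_from_spot(0.1, 0.1)
```

Because the projector and target stand are fixed relative to one another, the expected spot offset is known in advance, so the observed offset constrains the hole location.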
[0098] It should be noted that the same angle calculations apply
even where the shooting device does not necessarily create a hole
800 such as shown at 2301 or 2302, as long as the target sheet 750
or back sheet 762 is transparent enough to allow light to pass
through. By way of example only, a laser could impact a transparent
front sheet at 2301, and travel through to impact the back sheet
762 at 2302. The illuminated points at 2301 and 2302 would thus be
detected by internal or external cameras.
[0099] It may be the case that the external lighting is not
consistent due to, for example, the motion of the device 10 or a
changing projector image 2355, and therefore any light source that
may enter at 2301 or 2302 may not create consistent light beams
measurable by internal cameras 720, 724, 730, and 734. Therefore,
an alternative method of measurement may include internal cameras
720, 724, 730, 734 scanning the back of the target sheet 750 and
the front of the back sheet 762 for holes 800. When impact holes
2301 and 2302 are created by a projectile, the cameras 720, 724,
730, 734 will scan only for holes 2301 and 2302 and not for any
beams of light created thereby.
[0100] If internal lighting is utilized, by, for example, lighting
782, 783 inside camera enclosure 777 both above and below target
sheet 750, then additional light projections 2351, 2352, 2353, and
2354 are created. For example, if lighting 783 is present in the
lower camera enclosure 777, then the light emanating therefrom will
enter the hole at 2301 and create light projection 2351, and enter
the hole at 2302 and create light projection 2353. Likewise, if
lighting 782 is present in the upper camera enclosure 777, then the
light emanating therefrom will enter the hole at 2301 and create
light projection 2352, and enter the hole at 2302 and create light
projection 2354. As with the external projector 710 being in fixed
relative position to the target stand 26, a similar advantage is
created here wherein the fixed relative positions of the lighting
782, 783 and target and back sheets 750, 762 will create consistent
light projections 2351, 2352, 2353, 2354 for given impact points
2301 and 2302. By examining the differences in diameter of the
impact points at 2301 and 2302 (2371 and 2372, respectively), the
system would be able to gain valuable information from the entry
and exit characteristics of the projectile. This information gained
from the diameter analysis would be similar to the information
calculated by topographical analysis explained below and may
include, for example, projectile diameter or projectile
composition. The difference in height between 2301 and 2302 would
help to determine the projectile's rate of travel if the location
of the shooter is known, as the system could apply kinematic
equations of motion familiar to those skilled in the
art.
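By way of example only, the kinematic estimate mentioned above may be sketched as follows, under the strong simplifying assumptions of a horizontally fired projectile, negligible air resistance, and a known shooter distance; all names and numeric values are illustrative assumptions:

```python
import math

G = 9.81  # gravitational acceleration, m/s^2

def estimate_speed(shooter_distance, entry_height, exit_height, sheet_gap):
    """Estimate projectile speed from the height drop between the entry
    hole (2301) and exit hole (2302).

    For a horizontally fired, drag-free projectile the trajectory is
    y = -g*x^2/(2*v^2), so the local downward slope at range d is
    g*d/v^2; measuring that slope across the sheet gap yields v.
    """
    slope = (entry_height - exit_height) / sheet_gap
    return math.sqrt(G * shooter_distance / slope)

# e.g. a 1 mm drop across a 0.1 m sheet gap at a 25 m range
v = estimate_speed(25.0, 1.000, 0.999, 0.1)
```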
[0101] In the event that light projections 2351, 2352, 2353, 2354
are not interfered with too greatly by external lighting sources,
an alternative detection system will be for outside front camera
620 and outside back camera 630 to scan the outer surfaces, i.e.,
the front of target sheet 650 and the back of back sheet 660, for
holes to detect light 2351, 2352, 2353, 2354 emanating
therefrom.
[0102] FIG. 15D represents a target sheet 750 (or back sheet 762)
without an image projection. Thus, the detecting system would be
comparing the original surface of the target sheet 750 at FIG. 15A
against what the surface of the target sheet 750 looks like after
impact 800.
[0103] FIG. 15E represents a target sheet 750 (or back sheet 762)
with an image projection or preprinted marking. Thus, a detection
system would compare the target image at FIG. 15B with the target
image after impact at 800, as shown in FIG. 15E. Thus, the device
10 could continue unhampered with a projected image when an impact
point 800 is being detected.
[0104] FIG. 15F shows a mark detection system, where permanent or
projected markings are placed on a surface of target sheet 750 or
back sheet 762, thereby allowing the user or detection system to determine
where impact point 800 is specifically located relative to the
markings. If the markings were placed on the visible portion of a
sheet, for example the front of target sheet 750 or the back of
back sheet 762, the detection marking pattern could be invisible to
the human eye by the use of, for example, infrared reflective
paint, ultraviolet paint, and so on. On the "internal" surfaces
(e.g., the back of target sheet 750 or the front of back sheet
762), however, the markings would not need to be invisible as these
markings would not interfere with the user's view of any
target.
[0105] If a projected marking of a visible surface, e.g., the front
of target sheet 750, is desired, this may be accomplished without
interfering with the target image by imposing the detection pattern
at intervals faster than can be detected by the human eye. By
way of example only, if the projector 710 projects an image at "X"
Hz, the detection pattern is shown at (1/Y)*X Hz; e.g., if the
projector projects a target image at 120 Hz and the detection
pattern is to be displayed 1 out of every 10 frames, then
the detection pattern would display at 12 Hz. The controlling
computer would cause a camera, for example, outside front camera
620, to take a snapshot only when the detection pattern is active.
The detection pattern frequency could be adjusted based upon, for
example, the user's rate of fire.
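By way of example only, the interleaving of target image and detection pattern described above may be sketched as follows; the function and variable names are illustrative assumptions:

```python
def detection_frame_indices(projector_hz, show_every_n, duration_s):
    """Return the frame indices at which the detection pattern, rather
    than the target image, is displayed. The controlling computer would
    trigger a camera snapshot only on these frames."""
    total_frames = int(projector_hz * duration_s)
    return [i for i in range(total_frames) if i % show_every_n == 0]

# 120 Hz projector, pattern shown 1 out of every 10 frames -> 12 Hz
pattern_hz = 120 / 10
frames = detection_frame_indices(120, 10, 1.0)
```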
[0106] FIG. 15G is an example of a projectile interacting with a
reactive target surface, where the surface changes color, texture,
and so on when impacted by a projectile. By way of example only, a
sheet side may be coated with a thin layer of paint overlaying a
colored target surface, and when the target is struck a portion of
the paint is removed and the colored surface below is seen. If it
were desired that the reactive target surface were not to be
visible to the human eye, the paint overlay or the underlying color
could be made of, for example, infrared or ultraviolet paint, and
any projected image or other visible targeting could be unaffected
by the reactive coating that is detectable by a camera but unseen
by the user.
[0107] An alternative means of projected detection pattern marking
is shown in FIG. 15H. In this embodiment, projected marker 804
would move across the target surface in a "scanning" type method,
comparing the target surface/image at the location of the
projected marker. This detection would occur relative to the
target image. In this type of "scanning" detection, use of the
device 10 would not have to stop because only a portion of the
targeting image would be affected at any given time.
[0108] An alternative means of impact detection is shown in FIG.
15I. Here, a hole 800 is detected from the resulting light 805 that
is able to pass from one side of the sheet through to the other
side. Depending upon the positioning of light sources and cameras,
this detection method is applicable for, for example, either side
of both the target sheet 750 and back sheet 762.
[0109] FIG. 15J represents a sheet used in conjunction with light
beam 806 for "blocked light" detection. In this detection
embodiment, impact 800 creates a three-dimensional "hole" where,
for example, the sheet is made of a material such as thin metal
or stiff paper such that the edges of the impacted hole 800 will
stick out, for example, at 808 and interfere with light source 806,
thereby creating a shadow or distortion at 810 that may be detected
by internal or external cameras. Although FIG. 15J shows the light
beam 806 emanating from above, a second light beam could come from
below when the first beam is off, thereby creating a second shadow
or distortion that may be detected by internal or external cameras.
Moreover, each light may reflect off of the protruding part 808 of
hole 800, thereby aiding in detection by external cameras.
[0110] The detection of "three-dimensional" holes such as the one
shown in FIG. 15J may be useful, for example, in topographical
detection of entry and exit holes of various bullet calibers,
allowing analysis of caliber (i.e., hole size) or bullet
composition (e.g., by how the bullet broke up), depending on the
material of the target sheet 750 and back sheet 762.
[0111] Referring now to FIGS. 16A-16C, a projected target 900 is
displayed on a sheet surface, e.g., the front surface of front
sheet 750. In FIG. 16A, the target is projected on a surface that
has not been impacted by a projectile, either because the shooter
has not yet fired or has simply been unsuccessful in hitting the
target. In FIG. 16B, the area where the target 900 is projected has
received multiple impacts 904. One or more cameras, e.g., front
camera 620, detect the successful hitting of target 900 and
transmit this information to the control computer; in reaction, a
projector projects the target 900 to a new location through motion
908, as shown in FIG. 16C. This type of moving target has the
advantage of spreading shots over a sheet, thereby increasing the
useful lifespan of the sheet.
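By way of example only, relocating the projected target away from prior impacts may be sketched as follows; the rejection-sampling approach and all names and dimensions are illustrative assumptions:

```python
import random

def relocate_target(impacts, sheet_w, sheet_h, target_radius, tries=1000):
    """Pick a new target center on the sheet whose distance from every
    prior impact exceeds the target radius, spreading shots over the
    sheet and extending its useful lifespan. Returns None if no clear
    spot is found within the allotted tries."""
    rng = random.Random(0)  # fixed seed so this sketch is repeatable
    for _ in range(tries):
        x = rng.uniform(target_radius, sheet_w - target_radius)
        y = rng.uniform(target_radius, sheet_h - target_radius)
        if all((x - ix) ** 2 + (y - iy) ** 2 > target_radius ** 2
               for ix, iy in impacts):
            return (x, y)
    return None

new_center = relocate_target([(0.5, 0.5), (0.52, 0.48)], 1.0, 1.0, 0.1)
```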
[0112] FIGS. 16D-16F show a similar system where the difficulty of
the target is increased after a successful round of shooting. FIG.
16D is similar to FIG. 16A, where the target 900 projected on the
surface has not been impacted yet by a projectile. FIG. 16E shows
an extremely successful round of projectiles creating a tight set
of impacts 906. In reaction to the expert shooting, the target may
increase in difficulty by moving in an erratic pattern 910 and
decreasing in size 902. Other dynamic responses that the system
would be capable of performing include, for example, changing the
color of the target shown, strobing the target image, or moving
the unit itself faster or slower. Factors such as the speed at
which the unit moves or the changing size of the target could then
be tied to accuracy and communicated to the appropriate and
interested parties.
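By way of example only, tying target difficulty to measured accuracy may be sketched as follows; the tightness metric, thresholds, and scaling factors are illustrative assumptions, not part of the specification:

```python
import math

def group_radius(impacts):
    """Mean distance of impacts from their centroid -- a simple measure
    of shot-group tightness."""
    cx = sum(x for x, _ in impacts) / len(impacts)
    cy = sum(y for _, y in impacts) / len(impacts)
    return sum(math.hypot(x - cx, y - cy) for x, y in impacts) / len(impacts)

def next_target_size(current_size, impacts, tight=0.02, loose=0.10):
    """Shrink the projected target after a tight group, grow it after a
    loose one; the thresholds and factors are illustrative."""
    r = group_radius(impacts)
    if r < tight:
        return current_size * 0.8
    if r > loose:
        return current_size * 1.25
    return current_size

# A tight three-shot group shrinks the target.
size = next_target_size(0.30, [(0.50, 0.50), (0.51, 0.50), (0.50, 0.51)])
```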
[0113] FIGS. 19A-19C show an alternative targeting embodiment, by
combining a virtual reality with a target. FIG. 19A shows the feed
903 from one of the cameras, for example, cameras 47, 49, 50, 51,
or 52, focused on the surroundings of the unit 10. FIG. 19B shows a
target 900 in motion 910. FIG. 19C shows FIGS. 19A and 19B
combined, where the projected target 900 is superimposed on the
video feed 903 of the surrounding area creating the "virtual
reality" that the target is actually in the environment, thereby
enhancing the training experience for the user.
[0114] FIGS. 20A-20C refer to various examples of user computer
interaction with the device 10 and its projectors, cameras,
transportation means, and so on. For example, in FIG. 20A, a user
would be able to choose between a target 900 that is static or in
motion. Also in FIG. 20A, a user could select various training
scenarios, e.g., where the device 10 moves at different speeds, or
where various images or virtual realities (e.g., FIG. 19C) can be
chosen.
[0115] FIG. 20B shows an example of a user screen that assists a
shooter in determining how to change and improve the shooter's
aim, as well as in locating where different shots hit the target.
Various data are collected from the cameras of the device 10 to
determine, for example, the number of hits on a target 900, where
exactly the hits occurred, and the target distance. As seen in FIG. 20B,
particular impact locations can be singled out with, for example,
the use of the projector to show arrows or shade the region around
a particular impact to "highlight" a particular impact. This
"highlighting" could serve as a training aid if, for example, the
user is too far away from the unit to accurately judge where the
impact occurred unaided.
[0116] FIG. 20C shows the shooter's physical actions either
simultaneously with the resulting impacts on the target 900 or a
playback relating a particular shot to the shooter's posture and
movements during said shot. For example, outer front rotatable
controls camera 49 may record the shooter's actions at the same
time outside front camera 620 records the target 900 being impacted
by the shooter's projectiles. When the unit 10 has found an impact
location it could, for example, create a time stamp so that all the
data tracked by the unit 10, for example, unit speed, direction of
travel, program in use, etc. can be correlated to the exact time of
a particular shot. The two recorded events may then be
simultaneously replayed on a split screen, as shown in FIG. 20C, in
order to increase the shooter's self-awareness of the shooter's
actions and physical posture during shooting or to provide a judge
or training personnel with valuable information.
[0117] By way of example, one way for a computer to interpret the
user's interaction with the unit would be as if the user's
identified shot placement acted as a computer mouse "click" at the
particular location on the projection screen. Thus, for example,
any program or interaction typically reserved for a computer with a
visual interface where a user must select a location on that
interface and "click" a computer mouse would be analogous to a the
user using the unit 10. With additional interaction hardware, such
as a keyboard, the possibilities for different programming on the
unit 10 are only limited by what any other computer could run. Some
non-limiting examples of programming and the user's interactions
include: playing any computer game or even surfing the internet.
Due to the capability for wireless communication through the
antenna 41, the unit would be able to interact, for example, with
other units, locally or not. This communication allows users to
interact with each other although they may be physically separated,
allowing, for example, users to compete in real time with each other
in a video game or training situation.
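By way of example only, treating a detected impact as a mouse "click" may be sketched as follows; the coordinate mapping and the button layout are illustrative assumptions:

```python
def impact_to_click(impact_xy, sheet_size, screen_size):
    """Map an impact location on the target sheet to pixel coordinates
    on the projected 'screen', so a hit can be handled like a mouse
    click at that point."""
    sx, sy = sheet_size
    w, h = screen_size
    x, y = impact_xy
    return (int(x / sx * w), int(y / sy * h))

def dispatch_click(pixel, buttons):
    """Return the name of the projected button (x, y, w, h rectangle)
    containing the click, if any."""
    px, py = pixel
    for name, (bx, by, bw, bh) in buttons.items():
        if bx <= px < bx + bw and by <= py < by + bh:
            return name
    return None

# An impact at the sheet's horizontal center, a quarter of the way down,
# lands inside an illustrative "static" button on a 1920x1080 projection.
click = impact_to_click((0.5, 0.25), (1.0, 1.0), (1920, 1080))
hit = dispatch_click(click, {"static": (800, 200, 320, 160)})
```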
[0118] Referring now to FIGS. 21A, 21B and 22, a means for the
device 10 (or accompanying computer system) to determine the
location of the user 1010 is shown. In order to keep track of both
its own position and that of different users, a global coordinate
system is utilized. After any impact, the system will be able to
detect the angle of the shot origin P.sub.1 at 1015 relative to the
unit itself. This angle is denoted in FIG. 22 as .angle.A0. Once
this angle is known, alternative range finding methods can be used
to determine the exact location of the user with respect to the
global coordinate system, for example, by using the cameras mounted
on the outside of the device 10. Any additional impacts occurring
in other locations on the target surface, originating from the same
location, regardless of the unit's 10 motion, will allow the unit
to determine the exact position of that user relative to itself and
the global coordinate frame. In one embodiment the device 10 will
rotate the target surface 1020, 1030 so that no successive impacts
can pass through the same points (P.sub.2, P.sub.3, P.sub.5, and
P.sub.6) for one user who is not moving, as seen also in FIG.
24.
[0119] Although the system tracks the following variables as
three-dimensional vectors, only two dimensions are necessary for
these equations, so the third dimension is left out for simplicity
of explanation; it can be easily calculated by those with ordinary
skill in the art based upon the illustrations herein. The line
extending from F.sub.1 to F.sub.2 represents the front target
surface of target sheet 1020. The line extending from B.sub.1 to
B.sub.2 represents the back target surface of back sheet 1030. The
center of rotation of the target is P.sub.0 at 1040. The unit
tracks the center of rotation as it will not change relative to the
unit itself and therefore unit movement/orientation can be factored
out of the equations. The projectile starting point 1015 of user
1010 is represented by P.sub.1. The line from P.sub.1 to P.sub.11
represents a parallel line to the global "x" axis through the user.
The line from P.sub.1 to P.sub.3 represents the first projectile's
trajectory 1045 where P.sub.2 is the impact location 1050 for the
front target surface and P.sub.3 is the impact location 1055 for
the back surface. The line from P.sub.10 to P.sub.4 represents a
perpendicular line to the target surface. The location of P.sub.2
and P.sub.3 are calculated from the detection algorithms. The
distance from P.sub.2 to P.sub.4, D.sub.24, remains constant and is
therefore a known value. Equation 1 below calculates the distance
from P.sub.2 to P.sub.3, denoted as D.sub.23, where P.sub.2 to
P.sub.3 are broken up into their x and y components, namely
P.sub.2=(x.sub.2,y.sub.2) and P.sub.3=(x.sub.3,y.sub.3). Utilizing
D.sub.24 and D.sub.23 along with trigonometric ratios evident for
right triangles, the incident angle A0 can be calculated. In order
to calculate the incident angle relative to the coordinate frame,
.angle.P.sub.10P.sub.1P.sub.2, .angle.B0 is calculated in equation
3 below and then used in equation 4 below where angle D refers to
the known rotation of the target surface itself relative to the
global coordinate system.
[0120] The same math and logic holds for any shot. For example, for
a second shot 1046 the equations 5-8 below operate just as
equations 1 through 4 where the following have similar roles
applicable for the respective impact: P.sub.2 at 1050 and P.sub.5
at 1060, P.sub.3 at 1055 and P.sub.6 at 1065, P.sub.4 and P.sub.7,
P.sub.5 and P.sub.2, P.sub.8 and P.sub.9, P.sub.10 and P.sub.11,
.angle.A0 and .angle.A1, .angle.B.sub.0 and .angle.B.sub.1, and
.angle.P.sub.10P.sub.1P.sub.2 and .angle.P.sub.11P.sub.1P.sub.5.
.angle.P.sub.1P.sub.2P.sub.8=.angle.P.sub.4P.sub.2P.sub.3 and
.angle.P.sub.1P.sub.5P.sub.9=.angle.P.sub.7P.sub.8P.sub.6 as they
are vertical angles created by their respective transversals.
.angle.P.sub.13P.sub.1P.sub.2=.angle.P.sub.1P.sub.2P.sub.8,
.angle.P.sub.13P.sub.1P.sub.9=.angle.P.sub.1P.sub.5P.sub.9,
.angle.P.sub.8P.sub.1P.sub.2=.angle.P.sub.2P.sub.3P.sub.4, and
.angle.P.sub.9P.sub.1P.sub.6=.angle.P.sub.7P.sub.6P.sub.5 as they
are alternate interior angles created by their respective
transversals and parallel lines.
[0121] Once two different impact locations are identified, the
system can now calculate the position of the user relative to the
global coordinate frame. Equation 9 below calculates the distance
between the two impact locations P.sub.2 and P.sub.5, denoted as
D.sub.25, where P.sub.2 to P.sub.5 are broken up into their x and y
components, namely P.sub.2=(x.sub.2, y.sub.2) and P.sub.5=(x.sub.5,
y.sub.5). Equation 10 below relies on fundamental geometric
principles for triangles and intersecting lines. Equation 11 below
utilizes the Law of Sines to calculate the two remaining sides, the
side formed from P.sub.1 to P.sub.5 and the side formed from
P.sub.1 to P.sub.2, the distances to the projectile starting point
1015, as a function of ratios with respect to the known distances
and angles.
[0122] Impact 0:

D.sub.23 = sqrt((.DELTA.x).sup.2 + (.DELTA.y).sup.2) = sqrt((x.sub.3 - x.sub.2).sup.2 + (y.sub.3 - y.sub.2).sup.2) (Equation 1)

.angle.A.sub.0 = cos.sup.-1(D.sub.24/D.sub.23) (Equation 2)

.angle.B.sub.0 = 180 - 90 - .angle.A.sub.0 = 90 - .angle.A.sub.0 (Equation 3)

.angle.P.sub.10P.sub.1P.sub.2 = .angle.D + .angle.B.sub.0 (Equation 4)

[0123] After Impact 1:

D.sub.56 = sqrt((x.sub.6 - x.sub.5).sup.2 + (y.sub.6 - y.sub.5).sup.2) (Equation 5)

.angle.A.sub.1 = cos.sup.-1(D.sub.57/D.sub.56) (Equation 6)

.angle.B.sub.1 = 180 - 90 - .angle.A.sub.1 = 90 - .angle.A.sub.1 (Equation 7)

.angle.P.sub.11P.sub.1P.sub.5 = .angle.D + .angle.B.sub.1 (Equation 8)

D.sub.25 = sqrt((x.sub.5 - x.sub.2).sup.2 + (y.sub.5 - y.sub.2).sup.2) (Equation 9)

.angle.P.sub.5P.sub.1P.sub.2 = .angle.A.sub.1 - .angle.A.sub.0 (Equation 10)

D.sub.25/sin(.angle.A.sub.1 - .angle.A.sub.0) = D.sub.12/sin(90 - .angle.A.sub.1) = D.sub.15/sin(90 + .angle.A.sub.0) (Equation 11)
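By way of example only, equations 1-3 and 9-11 may be sketched numerically as follows; equations 4 and 8 merely add the known target rotation .angle.D and are omitted here, and the coordinate frame and the 30/45-degree example geometry are illustrative assumptions, not part of the specification:

```python
import math

def incident_angle(front_hole, back_hole, d_perp):
    """Equations 1-3 (and 5-7): incident angle A in degrees from the
    sheet normal, and B = 90 - A, from the front- and back-sheet hole
    locations and the known perpendicular sheet separation d_perp."""
    d = math.dist(front_hole, back_hole)        # Equation 1 / 5
    a = math.degrees(math.acos(d_perp / d))     # Equation 2 / 6
    return a, 90.0 - a                          # Equation 3 / 7

def shooter_distances(p2, p5, a0, a1):
    """Equations 9-11: distances from the shot origin P1 to the two
    front-sheet impacts P2 and P5, via the Law of Sines on triangle
    P1-P2-P5, whose apex angle is A1 - A0 (Equation 10)."""
    d25 = math.dist(p2, p5)                     # Equation 9
    k = d25 / math.sin(math.radians(a1 - a0))   # common ratio, Equation 11
    d12 = k * math.sin(math.radians(90 - a1))
    d15 = k * math.sin(math.radians(90 + a0))
    return d12, d15

# Illustrative check: shooter at the origin, front sheet on the line
# x = 10, back sheet on x = 10.5 (D24 = D57 = 0.5), shots at 30 and 45
# degrees from the sheet normal.
p2 = (10.0, 10.0 * math.tan(math.radians(30)))
p3 = (10.5, 10.5 * math.tan(math.radians(30)))
p5 = (10.0, 10.0)
p6 = (10.5, 10.5)
a0, _ = incident_angle(p2, p3, 0.5)
a1, _ = incident_angle(p5, p6, 0.5)
d12, d15 = shooter_distances(p2, p5, a0, a1)
```

For this assumed geometry the sketch recovers the 30- and 45-degree incident angles and the true shooter-to-impact distances (10/cos 30 and 10*sqrt(2)).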
[0124] Referring now to FIG. 18, an embodiment of the device in
motion is shown at particular points in time 960a-f. One or more
optimal embodiments of the device will have the target sheets 750,
762 (and/or the target stand 26 in general), for example,
perpendicular to the shooter 965 to increase the effectiveness of,
for example, the image projection. Thus, the rotatable base 40 will
allow the target sheets 750, 762 to continually face the user 965
regardless of where the device is along, for example, the
programmed path of travel 970.
[0125] Referring now to FIG. 17, the device 10 of FIG. 18 is shown
interacting with multiple shooters/users 930, 932, 934. In order to
maintain an optimal target surface orientation relative to a
shooter, the device moving along a preprogrammed path of travel 952
will alternate orientations by, for example, facing user 930 at
time 950a, user 932 at times 950b and 950d, and user 934 at times
950c, 950e, and 950f. Rotation to display the target to different
users as shown in FIG. 17, or for one user as in FIG. 18,
replicates the common training method where an individual or group
must track a target while maintaining readiness to fire on that
target when the situation calls for it. The equations explained
above for determining the position of a shooter could apply to as
many shooters as were using the unit; the unit would simply
allocate memory for each new shooter as that shooter is found,
storing the appropriate distances, angles, etc. for later use. In
another embodiment, different users could, for example, wear or use
identifying markers/electronics that the system would be able to
detect. By way of example only, if the different users wore
different colors, the system would be able to use the cameras
focused towards those users to differentiate between them.
[0126] Various changes, alternatives, and modifications to the
above embodiments will become apparent to a person of ordinary
skill in the art after a reading of the foregoing specification. It
is intended that all such changes, alternatives, and modifications
as fall within the scope of the appended claims be considered part
of the present invention.
* * * * *