U.S. patent application number 17/577858 was published by the patent office on 2022-07-07 for "Jump Shot and Athletic Activity Analysis System."
The applicant listed for this patent is Mo' Motion Ventures. Invention is credited to Maureen Holohan.
Application Number: 17/577858
Publication Number: 20220215689
Publication Date: 2022-07-07
United States Patent Application 20220215689
Kind Code: A1
Holohan; Maureen
July 7, 2022
Jump Shot and Athletic Activity Analysis System
Abstract
Techniques are described where a computer or mobile device
performs video analysis of a person performing a basketball jump
shot. The computer or mobile device performs video analysis to (1)
determine whether the basketball jump shot resulted in the
basketball going through a hoop as a made shot, (2) determine
metric data for the basketball jump shot based on a plurality of
pixels within a plurality of the frames of the video data, and (3)
repeat the made shot and metric data determinations for a plurality
of basketball jump shots by the person to thereby track the made
shots and the metric data for those shots. The metric data may
include knee bend data, release point data, and/or arc angle data
for the shots.
Inventors: Holohan; Maureen (Troy, NY)
Applicant: Mo' Motion Ventures, New York, NY, US
Appl. No.: 17/577858
Filed: January 18, 2022
Related U.S. Patent Documents

Application Number   Filing Date    Patent Number   Continued By
16881716             May 22, 2020   11227150        17577858
15451044             Mar 6, 2017    10664690        16881716
14086554             Nov 21, 2013   9589207         15451044
International Class: G06V 40/20 (2006.01); G06T 7/73 (2006.01); G06T 7/246 (2006.01); G09B 19/00 (2006.01); G16H 20/30 (2006.01); H04M 1/72403 (2006.01); G06V 10/75 (2006.01)
Claims
1. A computer program product for video analysis of a basketball
jump shot, the computer program product comprising: a plurality of
processor-executable instructions configured for processing video
data, the video data comprising a plurality of frames of a person
performing the basketball jump shot with a basketball, each frame
comprising a plurality of pixels, the instructions being resident
on a non-transitory computer-readable storage medium and being
configured, upon execution by a processor, to: perform pixel
processing on a plurality of the frames of the video data to (1)
determine whether the basketball jump shot resulted in the
basketball going through a hoop as a made shot based on pixels
within one or more frames of the video data that depicts the
basketball relative to the hoop, (2) determine metric data for the
basketball jump shot, wherein the metric data comprises at least
two members of the group consisting of (i) data relating to a knee
bend for the basketball jump shot based on pixels within one or
more frames of the video data that depict the person in a base
phase of the basketball jump shot, (ii) data relating to a release
point for the basketball jump shot based on pixels within one or
more frames of the video data that depict the person in a release
phase of the basketball jump shot, and (iii) data relating to an
arc angle for the basketball jump shot based on pixels within a
plurality of frames of the video data that depict the basketball in
flight; and repeat the pixel processing for a plurality of
basketball jump shots by the person to thereby track the made shots
and the metric data for the plurality of basketball jump shots by
the person.
2. The computer program product of claim 1, wherein the metric
data comprises the knee bend data and the release point data.
3. The computer program product of claim 2, wherein the release
point data comprises data indicative of how quickly the person
released the basketball for the basketball jump shots.
4. The computer program product of claim 2, wherein the release
point data comprises data indicative of a release pinnacle for the
basketball jump shots.
5. The computer program product of claim 2, wherein the metric
data further comprises the arc angle data.
6. The computer program product of claim 1, wherein the metric
data comprises the knee bend data and the arc angle data.
7. The computer program product of claim 1, wherein the metric
data comprises the release point data and the arc angle data.
8. The computer program product of claim 1, wherein the plurality
of processor-executable instructions are further configured to:
compare body positions of the person in a plurality of the frames
to reference skeletons that represent a fundamental basketball jump
shot in the release phase and the base phase; determine deviations
for the basketball jump shots from the fundamental basketball jump
shot based on the comparisons; and track the determined deviations
in a database.
9. The computer program product of claim 8, wherein the plurality
of processor-executable instructions are further configured to:
correlate the determined deviations with the made shot data to
identify a plurality of the determined deviations that are
outcome-determinative.
10. The computer program product of claim 1, wherein the plurality
of processor-executable instructions are embodied by a mobile
application for download onto a smart phone or tablet computer.
11. The computer program product of claim 10, wherein the mobile
application is configured to obtain the video data from a camera on
the smart phone or tablet computer.
12. A method for video analysis of a basketball jump shot, the
method comprising: processing video data by a processor, the video
data comprising a plurality of frames of a person performing the
basketball jump shot with a basketball, each frame comprising a
plurality of pixels, wherein the video data processing step
comprises: the processor determining whether the basketball jump
shot resulted in the basketball going through a hoop as a made shot
based on a plurality of pixels within one or more frames of the
video data that depicts the basketball relative to the hoop; the
processor determining metric data for the basketball jump shot
based on a plurality of pixels within a plurality of the frames of
the video data, wherein the metric data comprises at least two
members of the group consisting of (i) data relating to a knee bend
for the basketball jump shot based on pixels within one or more
frames of the video data that depict the person in a base phase of
the basketball jump shot, (ii) data relating to a release point for
the basketball jump shot based on pixels within one or more frames
of the video data that depict the person in a release phase of the
basketball jump shot, and (iii) data relating to an arc angle for
the basketball jump shot based on pixels within a plurality of
frames of the video data that depict the basketball in flight; and
repeating the made shot determining step and the metric data
determining step for a plurality of basketball jump shots by the
person to thereby track the made shots and the metric data for the
plurality of basketball jump shots by the person.
13. The method of claim 12 wherein the processor comprises a
plurality of processors.
14. A system for video analysis of a basketball jump shot, the
system comprising: a processor configured for processing video
data, the video data comprising a plurality of frames of a person
performing the basketball jump shot with a basketball, each frame
comprising a plurality of pixels, wherein the processor is
configured to: determine whether the basketball jump shot resulted
in the basketball going through a hoop as a made shot based on a
plurality of pixels within one or more frames of the video data
that depicts the basketball relative to the hoop; determine metric
data for the basketball jump shot based on a plurality of pixels
within a plurality of the frames of the video data, wherein the
metric data comprises at least two members of the group consisting
of (i) data relating to a knee bend for the basketball jump shot
based on pixels within one or more frames of the video data that
depict the person in a base phase of the basketball jump shot, (ii)
data relating to a release point for the basketball jump shot based
on pixels within one or more frames of the video data that depict
the person in a release phase of the basketball jump shot, and
(iii) data relating to an arc angle for the basketball jump shot
based on pixels within a plurality of frames of the video data that
depict the basketball in flight; and repeat the made shot
determination and the metric data determination for a plurality of
basketball jump shots by the person to thereby track the made shots
and the metric data for the plurality of basketball jump shots by
the person.
15. The system of claim 14, further comprising: a camera for
cooperation with the processor, the camera configured to generate
the video data.
16. The system of claim 15, wherein the processor and the camera
are part of a mobile device.
17. The system of claim 16, wherein the mobile device comprises a
smart phone.
18. The system of claim 14, wherein the processor comprises a
plurality of processors.
19. The system of claim 14, wherein the processor is part of a
server.
20. The system of claim 14, wherein the metric data comprises the
knee bend data and the release point data.
21. The system of claim 14, wherein the metric data comprises the
release point data and the arc angle data.
22. The system of claim 14, wherein the metric data comprises the
knee bend data and the arc angle data.
Description
CROSS-REFERENCE AND PRIORITY CLAIM TO RELATED PATENT
APPLICATIONS
[0001] This patent application is a continuation of U.S. patent
application Ser. No. 16/881,716, filed May 22, 2020, and entitled
"Jump Shot and Athletic Activity Analysis System", now U.S. patent
Ser. No. ______, which is a divisional of U.S. patent application
Ser. No. 15/451,044, filed Mar. 6, 2017, and entitled "Jump Shot
and Athletic Activity Analysis System", now U.S. Pat. No.
10,664,690, which is a continuation-in-part of U.S. patent
application Ser. No. 14/086,554, filed Nov. 21, 2013, and entitled
"Jump Shot and Athletic Activity Analysis System", now U.S. Pat.
No. 9,589,207, the entire disclosure of each of which is
incorporated herein by reference.
INTRODUCTION
[0002] Finely tuned fundamentals are often what separate highly
skilled basketball players from less skilled basketball players.
The acquisition and mastery of skills and fundamentals help players
compensate for lack of height or speed on the basketball court.
Skills and fundamentals should be regularly assessed and improved
as a player's body and mind grow and develop. One of the most
common and valued indicators of a player's basketball skills is the
player's ability to shoot the basketball and score consistently. A
player demonstrating a fundamentally sound jump shot generally uses
proper balance, alignment, mechanics, and leverage to generate the
appropriate arc angle and in-flight ball rotation that increases
the likelihood of consistently scoring baskets during training and
in competition over the course of the player's career.
[0003] A fundamentally sound jump shot has many important mechanics
that should be regularly assessed and improved during the process
of learning how to shoot a proper jump shot which has, for example,
five parts from start to finish. These mechanics may include a body
and base that is square to the basket, a proper knee bend and
squat, a straight shooting elbow in the frame, a ball release at
the pinnacle of an athlete's jump, a release that leads to optimal
arc, a smooth follow through to generate in-flight ball rotation,
and a consistent finishing pose. Collectively, these exemplary
parts comprise the one- to two-second action involved in executing
a jump shot. The mechanics that generate arc and rotation lead
to softer bounces on the rim, which leads to more made baskets.
Other mechanics of the fundamentally sound jump shot enhance the
likelihood that the shooter directs the basketball directly at the
basket. Still other mechanics decrease the likelihood of having a
jump shot blocked by a defender. Having strong mechanics in a jump
shot may compensate for physical limitations, such as height and
arm length. Some players are gifted with the ability to jump 42
inches in the air and others grow to be seven feet tall, but the
rest can compensate for such physical limitations with strong jump
shot fundamentals.
[0004] Coaches often teach strong jump shot fundamentals to players
because the proper jump shot motions are unnatural to many players.
The most common conventional method for improving a jump shot
remains coaching: a teacher or coach watches a player take practice
jump shots, critiques the player's fundamentals, and ultimately
offers corrections to improve them. In short, the human eye is
responsible for finding flaws in jump shot fundamentals.
[0005] Some coaches use video cameras to assist in the jump shot
lessons they give to basketball players. By showing the video to a
basketball player, the basketball player can see the mistakes he or
she is making and hopefully quickly fix the problems. While video
footage can show fundamental flaws in a basketball player's jump
shot to the basketball player, the coach is still largely
responsible for identifying the flaws in the jump shot technique
based on his or her own visual analysis of the video. So, even
though an element of technology is introduced when filming a
basketball player's jump shot, the human eye is still mainly
responsible for identifying mistakes in the jump shot.
[0006] Strong jump shot fundamentals are usually developed over a
long time with many hours of practice. The best results are
achieved when a coach insists that a player shoots perfect
repetitions from a realistic and appropriate range based on the
player's height, weight, and age. One-on-one coaching sessions are
costly, and not every basketball player can afford frequent (e.g.,
daily) tutoring lessons from a coach or trainer. A trainer,
moreover, often feels pressured by the paying customer to have the
player shoot from outside his most productive range, which produces
instant gratification when a shot is made but poor results in both
the short and long term. A coach also has limited time and likely
many pupils, making it difficult to properly pace and adjust each
player's mechanics and range over time as bodies grow and change.
A basketball player who really wants to improve cannot practice his
jump shot only when the coach is available to teach the importance
of shooting from a productive range with realistic expectations.
Disciplined individual practice time devoted to mastering proper
shooting mechanics from a productive and reasonable range is an
absolute necessity if a player wants to become a highly skilled
basketball player.
[0007] When the player participates in individual practice
sessions, the player still needs feedback on whether his
fundamentals are improving and whether he is following the coach's
instructions. Muscle memory and "feeling" can help a player
remember solid fundamentals during individual practices, but bad
habits are easy to form when not under the watchful eye of a
knowledgeable coach. Often, players forget the feeling they had
during the training session with the coach during individual
practices. Further, as a player grows or his body changes, bad
habits and improper mechanics may develop. Shooting improvement
requires constant checks for proper alignment, balance, and
mechanics, but having the constant attention of a coach is simply
not feasible for most players. Without constantly checking for
improper mechanics, a player can quickly develop bad habits, which
may be difficult to correct after the bad habit has been ingrained.
So a method for self-checking mechanics and fundamentals is
necessary for development early on and throughout one's basketball
career. Unfortunately, conventional jump shot teaching methods lack
a consistent method for self-checking and self-regulating
fundamentals.
[0008] Even in situations where a player has many basketball
resources, such as a college player, the college team rarely
provides a coach whose sole job is to teach the jump shot. College
head coaches and assistant coaches are often very busy teaching
players offensive plays, scouting opponents, watching game tape, or
designing defensive sets, and they do not have time to teach jump
shooting fundamentals. So, in just about every setting, from a
child learning to play in the backyard to a college player
competing for a national championship, players need a way to learn,
fine-tune, and self-regulate their jump shot.
[0009] In light of all these problems with conventional jump shot
training methods, there is room for improvement in this technology
area. Players need a convenient and inexpensive way to maximize
their individual practices so that good fundamentals are honed
during these valuable practice sessions.
[0010] It is in view of the above problems that the present
invention was developed. The invention provides a system and method
for analyzing captured video of a physical activity based on a
reference skeleton of the physical activity applied to the captured video.
The skeleton exhibits a fundamentally strong version of a human
performing the physical activity based on a realistic and optimal
expectation for a player who is of the same age, body weight,
gender, or size. The skeleton may also be adapted for any skill
level from a beginner to a professional basketball player. Based on
the skeleton, a computer performing video analysis finds deviations
from the reference skeleton in the person's performance of the
physical activity.
[0011] The invention described herein provides a system that
combines video analysis with working knowledge and proven results
to identify and teach the physical activity for any person. The
skeleton and video analysis may be altered based on age, gender,
size, skill level, shooting hand, or any other variable. The
skeleton may be modified for younger players who lack the strength
to perform the physical activity in a way performed by grown adults
and professionals. The skeleton may also be modified for skill
level, so that the computer performing video analysis focuses on
more basic fundamentals and mechanics for less skilled athletes and
on more refined mechanics for highly skilled athletes.
[0012] In one embodiment, a method for video analysis of a physical
activity, the method comprises: processing video data, the video
data comprising a plurality of frames of a person performing the
physical activity, wherein the processing step comprises:
determining a frame of the video data that depicts a body position
of the person in a phase of the physical activity; comparing the
body position in the frame with a reference skeleton for the phase
of the physical activity; and determining a deviation between the
frame body position and the reference skeleton based on the
comparing step; and wherein the frame determining, comparing, and
deviation determining steps are performed by a processor.
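As a rough illustration of the frame-comparison and deviation-determination steps described above, the body position in a frame can be reduced to joint angles and compared against reference angles for the phase. The joint names, keypoint format, and 10-degree tolerance below are illustrative assumptions, not details taken from this disclosure.

```python
import math

def joint_angle(a, b, c):
    """Angle in degrees at vertex b formed by pixel points a-b-c."""
    ang = math.degrees(
        math.atan2(c[1] - b[1], c[0] - b[0])
        - math.atan2(a[1] - b[1], a[0] - b[0])
    )
    ang = abs(ang)
    return ang if ang <= 180 else 360 - ang

def find_deviations(frame_keypoints, reference_angles, tolerance_deg=10.0):
    """Compare joint angles in one video frame to a reference skeleton.

    frame_keypoints: dict of joint name -> (x, y) pixel coordinate.
    reference_angles: dict of joint triple, e.g. ('hip', 'knee', 'ankle'),
        -> expected angle in degrees for this phase of the activity.
    Returns the joints whose observed angle deviates beyond the tolerance.
    """
    deviations = {}
    for (a, b, c), ref in reference_angles.items():
        observed = joint_angle(frame_keypoints[a],
                               frame_keypoints[b],
                               frame_keypoints[c])
        if abs(observed - ref) > tolerance_deg:
            deviations[(a, b, c)] = observed - ref
    return deviations
```

A perfectly straight hip-knee-ankle line measures 180 degrees and would produce no deviation against a 180-degree reference, while a pronounced knee bend in the base phase would be flagged.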
[0013] In another embodiment, an apparatus for video analysis of a
physical activity, the apparatus comprises: a processor configured
for processing video data, the video data comprising a plurality of
frames of a person performing the physical activity, wherein the
processor is configured to: determine a frame of the video data
that depicts a body position of the person in a phase of the
physical activity; compare the body position in the frame with a
reference skeleton for the phase of the physical activity; and
determine a deviation between the frame body position and the
reference skeleton based on the comparison operation.
[0014] In another embodiment, a computer program product for video
analysis of a physical activity, the computer program product
comprises: a plurality of processor-executable instructions
configured for processing video data, the video data comprising a
plurality of frames of a person performing the physical activity,
the instructions being resident on a non-transitory
computer-readable storage medium and being configured, upon
execution by a processor, to: determine a frame of the video data
that depicts a body position of the person in a phase of the
physical activity; compare the body position in the frame with a
reference skeleton for the phase of the physical activity; and
determine a deviation between the frame body position and the
reference skeleton based on the comparison operation.
[0015] In another embodiment, a computer program product comprises:
a plurality of processor-executable instructions, the instructions
being resident on a non-transitory computer-readable storage medium
of a computing device and being configured, upon execution by a
processor, to: capture video of a person performing the physical
activity using a camera associated with a mobile device; determine
a first frame of the captured video that depicts the person
positioned in a first phase of the physical activity by searching
for characteristics identifying the first phase of the physical
activity in the captured video; compare the person's body position
in the first frame to a first reference skeleton corresponding to
the first phase of the physical activity; and determine whether the
person demonstrated a derivation from the first reference skeleton
in the first phase of the physical activity while performing the
physical activity as a result of comparing the first frame to the
first reference skeleton.
[0016] In another embodiment, an apparatus for video analysis of a
physical activity, the apparatus comprises: a processor configured
for processing video data, the video data comprising a plurality of
frames of a person performing the physical activity, wherein the
processor is configured to: determine a frame of the video data
that depicts a body position of the person in a phase of the
physical activity; compare the body position in the frame with a
reference skeleton for the phase of the physical activity;
determine a deviation between the frame body position and the
reference skeleton based on the comparison operation; determine
whether the basketball jump shot went through a basketball hoop; a
database for storing data, the database configured to: maintain
data indicative of whether the basketball jump shot went through a
basketball hoop; and maintain data indicative of one or more
determined deviations between the frame body position and the
reference skeleton.
[0017] In other example embodiments, techniques are disclosed for
processing video data to track aspects of basketball jump shots,
such as whether the shots were made or missed, release points,
release quickness, knee bend, and/or arc angle for the shots.
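As a hedged sketch of one such metric: an arc angle can be estimated by fitting a parabola to tracked ball-centroid pixels. The centroid list, the image coordinate convention (y increasing downward), and the use of the first tracked point as the release side are assumptions for illustration, not the disclosure's stated algorithm.

```python
import math
import numpy as np

def arc_angle_degrees(centroids):
    """Estimate the launch (arc) angle of a tracked ball.

    centroids: list of (x, y) ball-centroid pixel positions across frames,
    in image coordinates where y increases downward. Fits y = a*x^2 + b*x + c
    by least squares and returns the angle, in degrees, of the fitted
    trajectory at the first tracked point.
    """
    xs = np.array([p[0] for p in centroids], dtype=float)
    ys = np.array([p[1] for p in centroids], dtype=float)
    a, b, _ = np.polyfit(xs, ys, 2)
    slope = 2.0 * a * xs[0] + b             # dy/dx at the release-side point
    return math.degrees(math.atan(-slope))  # negate: image y axis points down
```

For example, a trajectory traced by y = 0.01*x**2 - x + 100 has slope -1 at x = 0 in image coordinates, giving an estimated arc angle of about 45 degrees.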
[0018] Further features and advantages of the present invention, as
well as the structure and operation of various embodiments of the
present invention, are described in detail below with reference to
the accompanying drawings.
BRIEF DESCRIPTION OF THE DRAWINGS
[0019] The accompanying drawings, which are incorporated in and
form a part of the specification, illustrate the embodiments of the
present invention and together with the description, serve to
explain the principles of the invention. In the drawings:
[0020] FIG. 1 illustrates an exemplary system that employs a mobile
device to capture video of a physical activity.
[0021] FIG. 2(a) illustrates an exemplary embodiment for a mobile
device.
[0022] FIG. 2(b) illustrates an exemplary architecture for a mobile
application executed by a mobile device.
[0023] FIG. 3 illustrates an exemplary embodiment for a remote
server.
[0024] FIG. 4 illustrates a reference skeleton of a basketball jump
shot from a front perspective according to an exemplary
embodiment.
[0025] FIG. 5 illustrates the reference skeleton of a basketball
jump shot from a side perspective according to an exemplary
embodiment.
[0026] FIG. 6 illustrates a flow chart of a mobile device
performing video analysis of the jump shot according to an
exemplary embodiment.
[0027] FIG. 7 illustrates a flow chart of a remote server
performing video analysis of the jump shot according to an
exemplary embodiment.
[0028] FIGS. 8(a) and 8(b) illustrate deviations from a reference
skeleton in a base stage of the jump shot.
[0029] FIGS. 9(a) and 9(b) illustrate deviations from a reference
skeleton in a frame stage of the jump shot.
[0030] FIG. 10 illustrates a deviation from a reference skeleton
in a release stage of the jump shot.
[0031] FIGS. 11(a) and 11(b) illustrate deviations from a reference
skeleton in a finish stage of the jump shot.
[0032] FIG. 12 illustrates an exemplary result spreadsheet
displayed to a user after video analysis.
[0033] FIG. 13 illustrates a reference skeleton superimposed over
video frames of a player taking a jump shot according to an
exemplary embodiment.
[0034] FIG. 14 depicts an example embodiment of a multi-camera
arrangement for generating video for use in basketball jump shot
analysis.
DETAILED DESCRIPTION OF EXAMPLE EMBODIMENTS
[0035] Referring to the accompanying drawings in which like
reference numbers indicate like elements, FIG. 1 illustrates an
exemplary system 100 for an embodiment that employs a mobile device
to capture videos of a player taking a jump shot 108. The system
100 may comprise at least one mobile device 102 in communication
with a remote server 106 via a data communications network 104. In
one embodiment, the remote server 106 receives videos taken by the
mobile device 102 of the player's jump shot 108 and performs video
analysis on the videos. Based on a reference jump shot skeleton,
the remote server 106 may determine deviations from the reference
skeleton in the player's jump shot 108. While FIG. 1 has been
illustrated with the remote server 106 performing the video
analysis, the mobile device 102 may instead perform it. In
another embodiment, the mobile device 102 may connect to a remote
server 106 even though the mobile device 102 performs video
analysis. In this embodiment, the remote server 106 may be a web
server where basketball players may upload their results, receive
training schedules, games, and tutorials, or share their training
results with social media connections. The remote server 106 may be
configured to track a player's progress as they train using the
system 100. As will be seen by the exemplary embodiments described
below, the remote server 106 may provide numerous functions to help
players improve.
[0036] The mobile device 102 may be a smart phone (e.g., an iPhone,
a Google Android device, a BlackBerry device, etc.), tablet
computer (e.g., an iPad), a laptop computer (e.g., a MacBook), or the
like. Furthermore, the mobile device 102 may be a position
tracking-enabled mobile device. That is, the mobile device 102 may
be configured to track its geographic position and communicate data
regarding same to other computing devices (e.g., the remote server
106). The mobile device preferably employs a touchscreen or the
like for interacting with a user. However, it should be understood
that any of a variety of data display techniques and data input
techniques could be employed by the mobile device. For example, to
receive inputs from a user, the mobile device need not necessarily
employ a touchscreen--it could also or alternatively employ a
keyboard or other mechanisms. The mobile device 102 also comprises
a video camera for capturing photos and videos of the player's jump
shot 108.
[0037] FIG. 2(a) depicts an exemplary embodiment for a mobile
device 102. The mobile device 102 may comprise a processor 200 and
associated memory 202, where the processor 200 and memory 202 are
configured to cooperate to execute software and/or firmware that
support operation of the mobile device 102. Furthermore, the mobile
device 102 may include an I/O device 204 (e.g., a touchscreen user
interface for graphically displaying output data and receiving
input data from a user), a camera 206, wireless I/O 208 for sending
and receiving data, a microphone 210 for sensing sound and
converting the sensed sound into an electrical signal for
processing by the mobile device 102, and a speaker 212 for
converting sound data into audible sound. The wireless I/O 208 may
include capabilities for making and taking telephone calls,
communicating with nearby objects via near field communication
(NFC), communicating with nearby objects via RF, and/or
communicating with nearby objects via Bluetooth. These components
are now resident in many standard models of smart phones and other
mobile devices.
[0038] FIG. 2(b) depicts an exemplary mobile application 250 for an
exemplary embodiment. Mobile application 250 may be installed on
the mobile device 102 for execution by processor 200. The mobile
application 250 preferably comprises a plurality of
computer-executable instructions resident on a non-transitory
computer-readable storage medium such as a computer memory. The
instructions may include instructions defining a plurality of GUI
screens for presentation to the user through the I/O device 204.
The instructions may also include instructions defining various I/O
programs 256 such as: [0039] a GUI data out interface 258 for
interfacing with the I/O device 204 to present one or more GUI
screens 252 to the user; [0040] a GUI data in interface 260 for
interfacing with the I/O device 204 to receive user input data
therefrom; [0041] a camera interface 264 for interfacing with the
camera 206 to communicate instructions to the camera 206 for
capturing an image in response to user input and to receive image
data corresponding to a captured image from the camera 206; [0042]
a wireless data out interface 266 for interfacing with the wireless
I/O 208 to provide the wireless I/O with data for communication
over the network 104; and [0043] a wireless data in interface 268
for interfacing with the wireless I/O 208 to receive data
communicated over the network 104 to the portable computing device
for processing by the mobile application 250. The instructions may
further include instructions defining a control program 254. The
control program may be configured to provide the primary
intelligence for the mobile application 250, including
orchestrating the data outgoing to and incoming from the I/O
programs 256 (e.g., determining which GUI screens 252 are to be
presented to the user).
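The orchestration role of control program 254 described above might be sketched as a dispatcher between the GUI, camera, and wireless I/O programs. The class and method names below are hypothetical, assumed purely for illustration.

```python
class ControlProgram:
    """Hypothetical sketch of control program 254: routes data between
    the GUI, camera, and wireless I/O programs, and decides which GUI
    screen to present next."""

    def __init__(self, gui_out, gui_in, camera, wireless_out):
        self.gui_out = gui_out            # GUI data out interface 258
        self.gui_in = gui_in              # GUI data in interface 260
        self.camera = camera              # camera interface 264
        self.wireless_out = wireless_out  # wireless data out interface 266

    def record_and_upload_shot(self):
        """Present the recording screen, capture a shot on user input,
        and send the video over the network for remote analysis."""
        self.gui_out.show_screen("record_shot")
        if self.gui_in.wait_for_tap() == "record":
            video = self.camera.capture_video()
            self.wireless_out.send(video)
            self.gui_out.show_screen("analysis_pending")
```

In this structure, only the control program decides screen flow; the I/O programs remain thin adapters over the device hardware, matching the division of responsibility in FIG. 2(b).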
[0044] The network 104 may be any data communications network
capable of supporting communications between the remote server 106
and the mobile device 102, wherein at least a portion of the data
communication is wireless data communication as shown in FIG. 1. It
should be understood that network 104 may comprise multiple data
communication networks that interconnect to form a larger network.
The network 104 may be public, private, or a mix of public and
private networks.
[0045] The remote server 106 may be a server or collection of
servers that are configured to support video processing as
described herein. The remote server 106 may perform other
activities other than or in addition to video analysis, such as
tracking improvements in a player's jump shot 108, storing tutorial
and training videos, storing training games, generating a training
schedule including training activities to be performed on each day,
sharing training results via social media, storing all the videos
taken of the player's jump shot 108, connecting with coaches,
providing a community where a player may upload questions about
training, or any other web-based service to help a player improve
their training and jump shot 108. The remote server 106 may be
operated by a sports training company. However, it should be
understood that other entities may operate the system.
[0046] FIG. 3 depicts an exemplary embodiment of the remote server
106. The remote server 106 may comprise a processor 300 and
associated memory 302, where the processor 300 and memory 302 are
configured to cooperate to execute software that performs
transaction processing as described herein. A memory in the form of
a database 308 may be configured to store various data structures
representative of jump shot videos, jump shot training schedules,
tutorial videos, and previous jump shot video analysis results. The
system 106 may further comprise one or more I/O interfaces (e.g.,
I/O interfaces 304 and 306 for communicating via wireless RF and
Internet links, respectively). It should be understood that FIG. 3
is exemplary only, and a practitioner may distribute the processing
tasks described herein among numerous processors as well as
distribute the data storage tasks described herein among numerous
databases.
[0047] While basketball has been and will be the focus of the
disclosure, the video analysis features taught by the present
disclosure can be applied to any sport or physical activity.
[0048] Either the mobile device 102 processor 200 or the remote
server 106 processor 300 performs video analysis of a player's jump
shot 108 based on a reference jump shot skeleton. If the remote
server 106 performs video analysis, the mobile device 102 transmits
the video files of the jump shot to the remote server 106. If the
mobile device 102 performs video analysis, the mobile device may
still communicate with the remote server 106 by requesting the
remote server to store training results, training videos, or other
data, in the database 308.
[0049] FIGS. 4 and 5 illustrate a front view reference jump shot
skeleton 400 and a side view reference jump shot skeleton 500. A
jump shot has multiple components or phases as a player jumps in
the air and releases the basketball. The exemplary embodiments
disclosed herein have determined that five phases represent the
most important aspects of a jump shot, but the jump shot may have
more intermediate phases. These phases include: a base phase 402,
502, a frame phase 404, 504, a release phase 406, 506, a
follow-through phase 408, 508, and a finish phase 410, 510. The
reference jump shot skeletons 400, 500 may be a collection of
images each representing a phase of the jump shot. So, just as the
jump shot may have five phases, the reference jump shot skeleton
400, 500 may have five or more images matching the five phases of
the jump shot. Alternatively, the reference jump shot skeletons
400, 500 may be data points derived from an image.
[0050] FIG. 4 illustrates the reference jump shot skeleton 400 from
a front view, or a view where the player is facing the camera 206.
FIG. 5 illustrates the reference jump shot skeleton 500 from a side
view, or a view where the camera captures the jump shot from the
right or left side of the player. In the examples shown in FIGS. 4
and 5, a skeleton for a male aged 14-16 is illustrated. The player
430 illustrated in FIGS. 4 and 5 may be right-handed, weigh
approximately 125-145 pounds, and have a height of about
5'6''-5'10''.
[0051] The skeleton 400, 500 illustrated in FIGS. 4 and 5
illustrates a mid-air jump shot, where the player releases the ball
at the pinnacle of their jump. Not all people can perform a mid-air
jump shot, such as younger players or some female players. For
example, a younger player who has not fully grown may demonstrate
more of a "push" shot (set shot), wherein the player uses the arm
motion and leg jump to help push the ball at the basket. This
pushing mechanic occurs because younger players are not strong
enough to shoot the ball at a ten-foot rim with simply a wrist
flick and arm extension. Other players are strong enough to perform
a mid-air jump shot, but still need to perform some pushing action
for longer jump shots (e.g., shots taken behind the three-point
line). The jump shot with some pushing action may be called a
modified jump shot. A skeleton 400, 500 may be altered to analyze a
modified jump shot, a set shot, or a mid-air jump shot upon user
command. The skeleton 400, 500 may be changed for gender, age,
height, weight, shooting hand, and skill level. For example, a
skeleton for an 8-10 year old may accommodate a set shot so that the
younger player can focus on other mechanics such as alignment,
balance, and footwork, which are things the younger player can
control while he is still growing. For example, the skeleton age
ranges may include skeletons for ages of 9-11 years, 12-13
years, 14-16 years, and 17+ years. In this way, the skeleton 400,
500 grows with the player. The skeleton 400, 500 is not just a
"one-size-fits-all" model. The skeleton 400, 500 accounts for a
player's physical limitations, basketball skill, and body size.
[0052] The side view angle may capture ball flight and also whether
the ball went through the hoop. The front view skeleton 400 may
have the same number of phases as the side view skeleton 500, but
the front view skeleton 400 is different than the side view
skeleton 500 because of the camera's 206 perspective. FIGS. 4 and 5
show the skeletons 400, 500 superimposed over a basketball
player. The skeletons 400, 500 are generated by analyzing strong
shooters, gathering data about the joint locations and body
positions of those shooters, calculating a
standard deviation across all analyzed shooters, and ultimately
generating a reference skeleton 400, 500.
video analysis (either the mobile device 102 or the remote server
106) may generate different skeletons 400, 500 based on age, race,
gender, height, weight, etc. of a player 430. A plurality of
skeletons 400, 500 may be saved in a database 308 and applied based
on data inputted by a user of the mobile device 102 about the
player 430. Or, the database 308 may store very few skeletons 400,
500 and stretch the dimensions of generic skeletons 400, 500 based
on the player's 430 body measurements.
[0053] The reference jump shot skeleton 400, 500 may comprise
joints and connecting lines between the joints. The reference jump
shot skeletons 400, 500 should roughly resemble a human body taking
a jump shot, and the joints of the reference jump shot skeletons
400, 500 should generally overlap the joints of the player being
filmed. Referring to FIG. 4, the shoulder joints 420, 422
correspond to the player's 430 shoulders. The same is true for the
player's 430 knees, hips, elbows, or even the basketball itself.
Matching skeleton 400, 500 joints with joints found on the player
in the video taken of the player's 430 jump shot 108 may be
performed while analyzing the player's 430 jump shot 108. By
matching skeleton 400, 500 joints with joints on the basketball
player 430, the connecting lines match the player's body, limbs,
head, etc. If a player's joints do not closely overlap with the
reference skeleton 400, 500 when the skeleton 400, 500 is
superimposed on a video of the player's 430 jump shot, deviations
from the skeleton 400, 500 may be determined by the remote server
106 or the mobile device 102.
[0054] In another embodiment, the computer performing video
analysis 102 or 106 may analyze a frame of video and find joints,
limbs, or body positions of the player 430. The computer performing
video analysis 102 or 106 may determine joint locations and body
positions by determining the edges of a player's body and the size
of the player's body. The computer performing video analysis 102 or
106 may quantify the size of the player 430 and also define
coordinates for certain edges of the player 430 in the video. Based
on the size and edge coordinates, the computer performing video
analysis 102 or 106 may determine joint coordinates in the image.
Coordinates may be pixel locations, generated X-Y plane values, or
any other generated image coordinate system. The computer
performing video analysis 102 or 106 has a basic understanding of
the shape of a human and may determine shoulder joints from elbow
joints using the edge coordinates and the general shape
understanding. For example, if a first edge coordinate is located
relatively close to a second edge coordinate along the X-axis,
the computer may determine that the narrow span between them
represents an arm or a leg. The computer may scan a
number of lines along the X-axis searching for known areas of the
body. After finding the joints and limbs, the computer performing
video analysis 102 or 106 may generate a player skeleton that
actually represents the player's body positions in the video. The
player skeleton may be compared to the reference skeleton on an X-Y
plane to find deviations from the skeleton 400, 500. In places
where joints or connecting lines of the reference skeleton 400, 500
substantially deviate from the player skeleton, the computer
performing video analysis 102 or 106 may determine deviations from
the skeleton 400, 500 in the jump shot. The computer performing
video analysis 102 or 106 may choose either method of video
analysis.
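The edge-scanning idea in paragraph [0054] can be sketched as follows. This is a minimal illustration, assuming a binary foreground mask and a simple "widest upper-body row = shoulders" rule; the mask, helper names, and that rule are illustrative assumptions, not the disclosed method:

```python
# Sketch: scan each row (a line along the X-axis) of a binary
# foreground mask, record the player's left and right edge
# coordinates, and use the width profile to locate a known body
# area (here, the shoulders as the widest row). Illustrative only.

def edge_profile(mask):
    """Return (row, left_x, right_x) for each row containing foreground."""
    profile = []
    for y, row in enumerate(mask):
        xs = [x for x, v in enumerate(row) if v]
        if xs:
            profile.append((y, min(xs), max(xs)))
    return profile

def estimate_shoulder_row(mask):
    """Pick the row with the greatest left-to-right width as the shoulders."""
    profile = edge_profile(mask)
    return max(profile, key=lambda p: p[2] - p[1])

# Toy silhouette: narrow head, wide shoulders, tapering torso and legs.
mask = [
    [0, 0, 0, 1, 0, 0, 0],   # head
    [0, 0, 1, 1, 1, 0, 0],
    [1, 1, 1, 1, 1, 1, 1],   # shoulders (widest row)
    [0, 1, 1, 1, 1, 1, 0],
    [0, 0, 1, 1, 1, 0, 0],
    [0, 0, 1, 0, 1, 0, 0],   # legs
]

row, left, right = estimate_shoulder_row(mask)
print(row, left, right)  # shoulder row index and its edge coordinates
```

A real implementation would first separate the player from the background (easier, as noted later, against a consistent background) before deriving such a profile.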
[0055] The skeleton 400, 500 has a finite number of stages. FIGS. 4
and 5 show that the skeleton 400, 500 has five stages, matching the
five above-described phases of the jump shot. The skeleton 400, 500
may have more or fewer than five stages, but if the skeleton 400,
500 has many more stages, the amount of video analysis performed by
a computer substantially increases. A substantial amount of video
analysis becomes a problem for processors with lower performance
specifications, such as the processor 200 in the mobile device 102.
So a balance needs to be found between the number of video frames
analyzed and the amount of processing power possessed by the
processor performing video analysis.
[0056] Basketball players differ in height, weight, and gender, so
the skeleton 400, 500 may stretch or constrict based on a player's
size. In order to account for a basketball player's size, a user of
the mobile device 102 may input size measurements before capturing
video so that the computer performing video analysis 102 or 106 may
generate a skeleton roughly matching the basketball player's 430
size. These body measurements may be general or specific. For
example, a general body measurement may include the basketball
player's 430 height, weight, and gender. Based on the height,
weight, and gender, the computer performing video analysis 102 or
106 may generate a skeleton that roughly matches what an average
person having those height and weight attributes should look like.
For example, if a player is very tall, the computer performing
video analysis 102 or 106 may stretch the connecting lines between
joints representing the legs, arms, and torso. As another example,
a male skeleton may have a slightly longer torso than a female
skeleton because females generally have a lower center of gravity.
Alternatively, a user could input more specific body measurements,
such as height, weight, arm length, shoulder width, leg length,
etc. The application 250 may request even more specific attributes,
like forearm length, a measurement from knee to ankle, foot size,
or the like. Basketball players tend to be lanky, so arm length and
leg length may greatly vary from player to player. If the computer
performing video analysis 102 or 106 receives very accurate and
specific measurements, the computer performing video analysis 102
or 106 may generate a very accurate skeleton 400, 500 and determine
jump shot deviations from the skeleton 400, 500 very
accurately.
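The skeleton stretching described above can be sketched as follows. The joint names, the 68-inch reference height, and uniform scaling are assumptions for illustration; the described system may instead scale individual limbs from specific measurements such as arm length:

```python
# Sketch: stretch or constrict a generic reference skeleton to a
# player's size by the ratio of player height to reference height.
# All names and values below are illustrative assumptions.

REFERENCE_HEIGHT_IN = 68.0  # assumed height of the generic skeleton

REFERENCE_JOINTS = {         # (x, y) image-plane coordinates, assumed
    "head":           (50.0, 10.0),
    "left_shoulder":  (40.0, 25.0),
    "right_shoulder": (60.0, 25.0),
    "left_knee":      (45.0, 70.0),
    "right_knee":     (55.0, 70.0),
}

def scale_skeleton(joints, player_height_in, ref_height_in=REFERENCE_HEIGHT_IN):
    """Uniformly scale joint coordinates by the height ratio."""
    s = player_height_in / ref_height_in
    return {name: (x * s, y * s) for name, (x, y) in joints.items()}

tall = scale_skeleton(REFERENCE_JOINTS, 74.0)  # a 6'2" player
print(tall["head"])
```

With more specific inputs (arm length, shoulder width, leg length), each connecting line could be scaled by its own ratio rather than one global factor.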
[0057] The computer performing video analysis 102 or 106 may
further be able to stretch or shrink the skeleton 400, 500 based on
the distance between the player 430 and the camera 206. The player
430 may appear smaller or larger depending on the distance between
the camera 206 and the player 430. The computer performing video
analysis 102 or 106 may account for camera distance by stretching
or constricting the skeleton 400, 500. The application 250 may also
notify a user of a suggested distance between the camera 206 and
the player 430 for the video footage from each angle.
[0058] The computer performing video analysis 102 or 106 may
receive multiple videos from a front perspective and a side
perspective. The video taken from the side perspective allows the
computer performing video analysis 102 or 106 to determine the arc of
the jump shot, determine whether the ball went through the hoop,
and perform analysis of the player's 430 mechanics from the side
based on the side perspective skeleton 500. For example, the
computer performing video analysis 102 or 106 may determine if a
player 430 is jumping forward a significant amount. Ideally, the
location where the player 430 left the ground and the location
where the player 430 lands should be substantially similar. If the
player 430 jumped forward a significant amount or leaned forward
while shooting, the computer performing video analysis 102 or 106
may detect such mechanical flaws using video taken from the side
perspective.
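The side-view body-control check above can be sketched as a comparison of takeoff and landing positions. The pixel tolerance and coordinates below are illustrative assumptions:

```python
# Sketch: flag a forward jump if the horizontal position where the
# player left the ground differs substantially from where he landed.
# The 15-pixel tolerance is an assumed value, not from the source.

def jumped_forward(takeoff_x, landing_x, tolerance_px=15):
    """True if takeoff and landing locations are not substantially similar."""
    return abs(landing_x - takeoff_x) > tolerance_px

print(jumped_forward(200, 210))  # within tolerance
print(jumped_forward(200, 260))  # jumped forward a significant amount
```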
[0059] Receiving high quality video greatly reduces errors in video
analysis. Typically, good video analysis footage includes a high
resolution, good lighting, camera stability, and a consistent
background. For example, if the background is substantially the
same color, the computer performing video analysis 102 or 106 may
determine the subject of the analysis quickly and with less
processing power. The application 250 may offer video capturing
tips before the application 250 begins capturing video of the jump
shot 108, such as tips for holding the camera still, tips for a
preferred background, and an ideal distance from the shooter.
Further, the application 250 may check the video footage quality
before performing video analysis or sending the video to the remote
server 106 for video analysis. If the application 250 determines
that the video quality is poor, the application 250 may alert the
user that another video should be taken of the player's 430 jump
shot 108.
[0060] When a video is taken of the player's 430 jump shot 108, the
mobile device's 102 camera 206 captures a plurality of still frames,
which when played in a sequence appear to the human eye as a moving
picture. The mobile device 102 camera 206 captures approximately
24-30 frames per second. So, depending on how long the camera 206
captures video, the computer performing video analysis 102 or 106
determines which frames match the various skeleton 400, 500 stages.
The computer performing video analysis 102 or 106 may need to only
analyze one video frame per skeleton 400, 500 stage. So, if the
skeleton comprises five stages, the computer performing video
analysis 102 or 106 may search for five video frames that match the
five skeleton 400, 500 stages. For example, the base stage 402, 502
has a low ball position and slightly bent knees. So, the computer
performing video analysis may determine which video frame depicts
the player 430 with bent knees and a low ball position. Given the
characteristics of the base phase 402, 502, the computer performing
video analysis 102 or 106 may determine which frame captured the
player 430 with the most knee bend by finding, for example, the
frame where the player's head is lowest in the frame. A frame
determined to be the frame embodying the base phase 402, 502 may be
analyzed, for example, by superimposing the base stage 402, 502
image of the skeleton 400, 500. Because each phase of the jump shot
has identifying characteristics, the computer performing video
analysis 102 or 106 may use the identifying characteristics to find
a frame that matches each phase of the jump shot. For example, the
computer performing video analysis 102 or 106 may find a frame
where the player 430 is on his toes but not quite in the air for
the frame phase 404, 504, a frame where the player 430 is at the
crest of his jump for the release phase 406, 506, a frame where the
ball has left the player's hand for the follow through phase 408,
508, and a frame where the player has landed again for the finish
phase 410, 510.
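The base-phase frame-selection heuristic above can be sketched as follows. Since image y-coordinates grow downward, the frame where the player's head y-coordinate is largest is the frame where the player is deepest in his squat. The per-frame head positions here are made up for illustration and would come from the joint-finding step:

```python
# Sketch: pick the frame embodying the base phase as the frame where
# the player's head is lowest (largest image y). Illustrative values.

def find_base_phase_frame(head_y_per_frame):
    """Return the index of the frame with the lowest head position."""
    return max(range(len(head_y_per_frame)), key=lambda i: head_y_per_frame[i])

# Head y per frame: the player dips into the squat, then rises and jumps.
head_y = [100, 104, 112, 125, 131, 126, 110, 90, 70]
print(find_base_phase_frame(head_y))  # frame 4 is the deepest squat
```

Analogous one-frame-per-phase heuristics (on toes but not airborne, crest of the jump, ball departed, landed) would select the frames for the remaining four stages.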
[0061] The computer performing video analysis 102 or 106 finds a
video frame for each jump shot phase from both the front
perspective and the side perspective. In this way, the computer
performing video analysis 102 or 106 finds a frame from the front
perspective for each jump shot phase and a frame from the side
perspective for each jump shot phase. Subsequently, each skeleton
stage from the front skeleton 400 is superimposed on a
corresponding frame captured from the front perspective, and each
skeleton stage from the side skeleton is superimposed on a
corresponding frame captured from the side perspective.
[0062] It is important that the computer performing video analysis
102 or 106 determine the base phase 402, 502 first because then the
computer 102 or 106 may ignore any frames occurring before the
determined base phase 402, 502 when searching for the subsequent
phases of the jump shot.
[0063] Based on the number of frames between the frame determined
to be the base phase 402, 502 and the frame determined to be the
follow through phase 408, 508, the computer performing video
analysis may determine how quickly the player is releasing the
ball. A quick release is very important in basketball because it
gives a defender less time to react to a jump shot, thereby
decreasing the likelihood of a blocked shot or the defender
blocking the player's 430 view of the basket. So, the computer
performing video analysis 102 or 106 may suggest drills and
fundamental tweaks for a quicker release if the computer performing
video analysis 102 or 106 determines that the release is happening
too slowly.
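The release-speed calculation in paragraph [0063] reduces to frame counting at a known capture rate. The 0.6-second threshold for a "too slow" release is an assumed coaching value, not a figure from the source:

```python
# Sketch: elapsed release time is the number of frames between the
# base-phase frame and the follow-through frame, divided by the
# frames-per-second capture rate (approximately 24-30 fps per the
# passage above). The max_s threshold is an illustrative assumption.

def release_time_seconds(base_frame, follow_through_frame, fps=30):
    return (follow_through_frame - base_frame) / fps

def release_too_slow(base_frame, follow_through_frame, fps=30, max_s=0.6):
    return release_time_seconds(base_frame, follow_through_frame, fps) > max_s

print(release_time_seconds(12, 30))  # 18 frames at 30 fps
print(release_too_slow(12, 33))      # 21 frames exceeds the threshold
```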
[0064] Further still, the computer performing video analysis 102 or
106 may determine whether the ball was released at the pinnacle of
the player's 430 jump by determining which video frame displays the
player at the highest jump point. The computer performing video
analysis 102 or 106 may determine where the ball is located within
the frame determined to be the highest jump point to determine
whether the player released the ball at the pinnacle of the jump
shot. The computer performing video analysis 102 or 106 may analyze
body control during all phases of the jump shot. For example, the
computer performing video analysis 102 or 106 may analyze a
player's balance and body position while stepping into the jump
shot, or the computer performing video analysis 102 or 106 may
analyze whether the player jumped and landed at substantially the
same location on the floor. If body control issues are found, the
computer performing video analysis 102 or 106 may suggest that the
player move closer to the basket, since it is easier to master
mechanics from a closer, achievable range without jumping forward.
The computer performing video analysis 102 or 106 may also suggest
drills and critiques to improve the player's body control during
the jump shot. For example, having a proper knee bend and squat may
help the player control his body during the jump and increase the
player's range over time.
[0065] The computer performing video analysis 102 or 106 may also
track the arc angle of the shot taken. Using the side angle video,
the computer performing video analysis 102 or 106 may find the ball
and track the ball's position at various points in the ball flight
to calculate an arc angle. For example, the optimal arc angle may
be between 45 and 60 degrees. The arc angle may be calculated on
the ascent when the ball leaves the player's 430 hand and during
the descent as the ball travels toward the basket. Based on the
calculated arc angle, the computer performing video analysis 102 or
106 may suggest critiques, games, or training videos to help a
player improve their jump shot arc angle.
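The arc-angle calculation above can be sketched from two ball-center positions sampled in the side-view video just after release. The sample coordinates are illustrative; the 45-60 degree window is the one stated in the passage:

```python
# Sketch: the launch angle is the arctangent of rise over run between
# two tracked ball positions. Image y grows downward, hence the sign
# flip on dy. The coordinates below are assumed for illustration.

import math

def arc_angle_degrees(p1, p2):
    """Arc angle from two (x, y) ball positions in image coordinates."""
    dx = p2[0] - p1[0]
    dy = p1[1] - p2[1]  # smaller image y means higher in the air
    return math.degrees(math.atan2(dy, dx))

def in_optimal_window(angle, low=45.0, high=60.0):
    return low <= angle <= high

angle = arc_angle_degrees((100, 300), (130, 260))  # ball rising after release
print(round(angle, 1), in_optimal_window(angle))
```

The same function applied to positions on the descent would give the entry angle toward the basket.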
[0066] As described above, the computer performing video analysis
102 or 106 may superimpose one of the skeleton 400 or 500 stages
onto a frame determined to correspond with the stage. For example,
the computer performing video analysis 102 or 106 determines a
frame that best matches the base phase 402 and superimposes a base
skeleton stage 402s onto the frame. The computer performing video
analysis 102 or 106 compares the skeleton's 402s joint and
connector locations and the player's 430 joint, limb, and body
positions. Alternatively, the computer performing video analysis
102 or 106 may generate the player skeleton and then compare the
player skeleton to the reference skeleton 400, 500. Based on the
comparisons, the computer performing video analysis 102 or 106
determines deviations from the skeleton 400, 500 in the jump shot,
if any.
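The joint-by-joint comparison described above can be sketched as follows. The joint names, coordinates, and 10-pixel deviation threshold are assumptions for illustration:

```python
# Sketch: for each joint, measure the distance between the
# superimposed reference skeleton's joint and the player's detected
# joint, and flag joints deviating beyond a threshold. Illustrative
# names, coordinates, and threshold only.

import math

def find_deviations(reference_joints, player_joints, threshold_px=10.0):
    """Return {joint: distance} for joints deviating beyond the threshold."""
    deviations = {}
    for name, (rx, ry) in reference_joints.items():
        px, py = player_joints[name]
        d = math.hypot(px - rx, py - ry)
        if d > threshold_px:
            deviations[name] = d
    return deviations

reference = {"left_knee": (45.0, 70.0), "right_knee": (55.0, 70.0)}
player = {"left_knee": (45.0, 55.0), "right_knee": (56.0, 68.0)}

# The player's left knee sits 15 px above the reference: too little
# knee bend in the base phase.
print(find_deviations(reference, player))
```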
[0067] For example, if the player 430 performs very little knee
bend and squat in the base phase 402, 502, the knees and hip joints
for the skeleton 400, 500 are going to be lower than the player's
430 knee and hip locations. After determining that the skeleton's
400, 500 knees and hip joint are lower than the player's 430 knees
and hips, the computer performing video analysis 102 or 106 may
find the deviations from the skeleton 400, 500 in the player's base
phase 402, 502, more specifically, in their knee bend and squat. A
deeper knee bend and squat may produce more upward thrust when
leaping, which helps send the ball toward the basket.
[0068] For example, FIGS. 8(a) and 8(b) illustrate two common
problems the computer performing video analysis 102 or 106 may
determine in the base phase 402, 502. As shown in FIG. 8(a), the
player 430 is too upright, without enough knee bend and squat. As
shown in FIG. 8(b), the player 430 is not balanced on his feet, and
his toes are off the floor too early.
[0069] The computer performing video analysis 102 or 106 performs
the same frame determination and skeleton 400, 500 comparison processes
for all phases of the jump shot from both the front perspective and
the side perspective.
[0070] For example, FIGS. 9(a) and 9(b) illustrate two deviations
from the skeleton 400, 500 in the frame phase 404, 504. As shown in
FIG. 9(a), the player's 430 elbow flares out away from his body
during the frame phase 404, 504. As shown in FIG. 9(b), the
player 430 misplaces his guide hand, adding resistance
against the player's 430 shooting hand.
[0071] As another example, FIG. 10 illustrates a deviation from the
skeleton 400, 500 in the release phase 406, 506. As shown in FIG.
10, the player's 430 release point is too low, which may lead to a
poor ball flight arc.
[0072] FIGS. 11(a) and 11(b) illustrate two deviations from the
skeleton 400, 500 in the finish phase 410, 510. As shown in FIG.
11(a), the player 430 is falling backwards showing that the player
430 was not balanced throughout the entire jump shot. As shown in
FIG. 11(b), the player's 430 legs are twisted, again showing a lack
of balance and that the player 430 was not square to the
basket.
[0073] FIGS. 8-11 show only a few of the many possible deviations
from the skeleton 400, 500 that the computer performing video
analysis 102 or 106 may find when analyzing the video frames
captured by the mobile device 102. Also, FIGS. 8-11 illustrate the
embodiment where the computer performing video analysis generates a
player skeleton before comparing the jump shot to the reference
skeleton 400, 500. When comparing these generated skeletons to the
reference skeletons 400, 500 shown in FIGS. 4-5, the problems in
the jump shot may be determined by the computer performing video
analysis 102 or 106.
[0074] After deviations from the skeleton 400, 500 are determined,
the computer performing video analysis 102 or 106 may detect a
player's skill level. The computer performing video analysis 102 or
106 may determine the player's skill level based on the number of
determined deviations between the reference skeleton 400, 500 and
the player's body movements. If the player is a beginner, the
computer performing video analysis 102 or 106 may report only the
most basic, major deviations from the skeleton 400, 500 to the
player so that the player may build a solid foundation. For
example, if a beginner player does not have his feet and shoulders
square in the base phase 402, 502, the computer performing video
analysis 102 or 106 may recommend drills and critiques to fix this
basic deviation from the skeleton 400, 500. After basic deviations
from the skeleton 400, 500 are fixed, the computer 102 or 106 may
suggest further refinements for more advanced players. Meanwhile,
the computer performing video analysis 102 or 106 may report more
minor deviations from the skeleton 400, 500 to a more advanced
player. For example, if an advanced player has a square body to the
basket and a strong follow through, the computer performing video
analysis 102 or 106 may recommend drills and methods to take
quicker shots or increase ball rotation. The computer may track the
player's 430 progress.
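The skill-aware reporting in paragraph [0074] can be sketched as follows. The deviation counts, severity labels, and thresholds below are illustrative assumptions, not values from the source:

```python
# Sketch: infer skill level from the number of deviations found, and
# report only the basic, major deviations to a beginner so the player
# may build a solid foundation. All thresholds here are assumed.

def skill_level(num_deviations):
    return "beginner" if num_deviations > 5 else "advanced"

def report(deviations):
    """deviations: list of (name, severity) with severity 'major'/'minor'."""
    if skill_level(len(deviations)) == "beginner":
        return [name for name, sev in deviations if sev == "major"]
    return [name for name, sev in deviations]

found = [("feet not square", "major"), ("elbow flare", "major"),
         ("guide hand", "minor"), ("low release", "minor"),
         ("leaning forward", "major"), ("slow release", "minor")]
print(report(found))  # beginner sees only the three major deviations
```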
[0075] FIG. 6 illustrates a method 600 for video analysis, whereby
the mobile device 102 performs video analysis. The method 600
begins in step 602 when the mobile device 102 captures video of the
player 430 taking a jump shot. The mobile device 102 captures the
player 430 taking a jump shot from at least two perspectives.
Preferably, these perspectives are a front perspective and a side
perspective. Also, preferably, the mobile device 102 is held very
still and captures high quality video footage against a still
background.
[0076] Subsequently, in step 604, the mobile device 102 determines
which frames of the video footage captured by the mobile device 102
camera 206 actually represent the player 430 taking a jump shot.
Any video frames not representing the jump shot, such as frames
where the player is dribbling, are disregarded. Only frames where
the player 430 is taking a jump shot are analyzed. As a result,
only about one to three seconds of footage needs to be analyzed for
jump shot fundamentals. This determining step is performed on both
the front perspective footage and the side perspective footage.
[0077] After disregarding extra footage, the processor 200 of the
mobile device 102 searches the frames depicting the jump shot for a
frame best representing the base phase 402, 502 in step 606. For
example, the processor 200 finds the frame where the player is
lowest in his squat to find the frame representing the base phase
402, 502 of the jump shot. This searching step is performed on both
the front perspective footage and the side perspective footage.
[0078] After finding the frame depicting the base phase, the
processor 200 generates a skeleton 400, 500 that matches the player's
430 measurements in step 608. The method 600 assumes that a user
has already inputted player 430 measurements (height, weight, arm
length, gender, etc.) into the mobile device 102. If not, the
application 250 may prompt the user to enter such information
before generating the skeleton 400, 500. The processor 200
references the front perspective skeleton 400 matching the player's
measurements and the side perspective skeleton 500 matching the
player's measurements.
[0079] In step 610, the processor 200 superimposes the base stage
402, 502 skeleton 400, 500 onto the frame determined to depict the
base phase 402, 502 of the jump shot. This superimposing step is
performed on both the front perspective frame and the side
perspective frame.
[0080] Next, the processor 200 analyzes the player's 430 base phase
402, 502 fundamentals based on the superimposed skeleton 400, 500
in step 612. Depending on the player's 430 skill level, either
minor or major deviations from the skeleton 400, 500 are flagged.
The processor 200 may find deviations from the skeleton 400, 500 in
the base phase 402, 502 of the player's jump shot by comparing the
joints of the skeleton 400, 500 to the player's 430 joints. Also,
the computer performing video analysis 102 or 106 may find
deviations from the skeleton 400, 500 in the base phase 402, 502 by
generating the player skeleton and comparing it to the reference
skeleton 400, 500. For example, if the player's 430 shoulders do
not match the skeleton's 400, 500 shoulder joints, the player's 430
shoulders may not be square to the basket. The orientation of the
player's 430 shoulders may be a result of feet that are not square
to the basket. Feet location may also be analyzed. The processor
200 may further analyze how closely the player's limbs and torso
match vectors connecting the skeleton's 400, 500 joints. This
analysis step is performed on both the front perspective footage
and the side perspective footage.
[0081] After analyzing the base phase 402, 502 against the base
stage skeleton, the computer performing video analysis 102 or 106
repeats steps 606-612 for the other phases of the jump shot. For
example, the processor 200 may repeat steps 606-612 for the frame
phase 404, the release phase 406, the follow through phase 408, and
the finish phase 410.
[0082] After analyzing all phases of the jump shot, the mobile
device 102 uploads the results to the remote server 106 in step
614, and the mobile device 102 also presents the video analysis
results to the user on the I/O device 204 in step 616. FIG. 12
illustrates an exemplary results spreadsheet showing all the areas
of the jump shot analyzed by the computer performing video analysis
102 or 106.
[0083] FIG. 7 illustrates a method 700 for video analysis, whereby
the remote server 106 performs video analysis. The method 700
begins in step 702 when the remote server 106 receives video of the
player 430 taking a jump shot from the mobile device 102. The
mobile device 102 captures the player 430 taking a jump shot from two
perspectives. Preferably, these perspectives are a front
perspective and a side perspective. So, the remote server 106
receives two video clips. Also, preferably, the mobile device 102
is held very still and captures high quality video footage against
a still background.
[0084] Subsequently, in step 704, the remote server 106 processor
300 determines which frames of the video footage captured by the
mobile device 102 camera 206 actually represent the player 430
taking a jump shot. Any video frames not representing the jump
shot, such as frames where the player 430 is dribbling, are
disregarded. Only frames where the player 430 is taking a jump shot
are analyzed. This determining step is performed on both the front
perspective footage and the side perspective footage.
[0085] After disregarding extra footage, the processor 300 searches
the frames depicting the jump shot for a frame best representing
the base phase 402, 502 of the jump shot in step 706. For example,
the processor 300 finds the frame where the player is lowest in his
squat to find the frame representing the base phase 402, 502 of the
jump shot. This searching step is performed on both the front
perspective footage and the side perspective footage.
[0086] After finding the frame depicting the base phase 402, 502,
the processor 300 generates a skeleton 400, 500 that matches the
player's 430 body measurements in step 708. The method 700 assumes
that the remote server 106 has already received the body
measurements from the mobile device 102. In the exemplary
embodiments, the application 250 on the mobile device 102 may not
allow the user to upload video to the remote server 106 until the
body measurements have been entered into the mobile device 102. In
one embodiment, the mobile device 102 sends player body
measurements with the captured video. Alternatively, the remote
server 106 may save player body measurements for each player
430.
[0087] In step 710, the processor 300 superimposes a base stage
402, 502 of the referenced skeleton 400, 500 onto the frame
determined to depict the base phase 402, 502 of the jump shot. This
superimposing step is performed on both the front perspective frame
and the side perspective frame.
[0088] Next, the processor 300 analyzes the player's 430 base phase
402, 502 fundamentals based on the superimposed skeleton 400, 500
in step 712. Depending on the player's 430 skill level, either
minor or major deviations from the skeleton 400, 500 are flagged.
The processor 300 may find deviations from the skeleton 400, 500 in
the base phase 402, 502 of the player's jump shot by comparing the
joints of the skeleton 400, 500 to the player's 430 joints. Also,
the computer performing video analysis 102 or 106 may find
deviations from the skeleton 400, 500 in the base phase 402, 502 by
generating the player skeleton and comparing it to the reference
skeleton 400, 500. For example, if the player's 430 shoulders do
not match the skeleton's 400, 500 shoulder joints, the player's 430
shoulders may not be square to the basket. The orientation of the
player's 430 shoulders may be a result of feet that are not square
to the basket. Feet location may also be analyzed. The processor
300 may further analyze how closely the player's limbs and torso
match vectors connecting the skeleton's 400, 500 joints. This
analysis step is performed on both the front perspective footage
and the side perspective footage.
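The joint-by-joint comparison of step 712 might look like the following sketch. The joint names, pixel tolerances, and skill levels are illustrative assumptions, not values from the specification; the idea is only that advanced players are held to a stricter tolerance:

```python
import math

def flag_deviations(player_joints, skeleton_joints, skill_level):
    """Compare each player joint to the corresponding reference-skeleton
    joint and flag joints whose pixel distance exceeds a skill-dependent
    tolerance (stricter for advanced players). Thresholds are illustrative."""
    tolerance = {"beginner": 40.0, "intermediate": 25.0, "advanced": 12.0}[skill_level]
    flagged = []
    for name, (px, py) in player_joints.items():
        sx, sy = skeleton_joints[name]
        if math.hypot(px - sx, py - sy) > tolerance:
            flagged.append(name)
    return flagged

player = {"left_shoulder": (212, 140), "right_shoulder": (298, 188)}
reference = {"left_shoulder": (210, 142), "right_shoulder": (300, 144)}
# The right shoulder sits far from the reference joint, suggesting the
# shoulders are not square to the basket.
print(flag_deviations(player, reference, "intermediate"))  # ['right_shoulder']
```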
[0089] After analyzing the base frame 402, 502 against the base
phase skeleton 400, 500, the processor 300 repeats steps 706-712 for
the other phases of the jump shot. For example, the processor 300 may complete steps
706-712 for the frame phase 404, the release phase 406, the follow
through phase 408, and the finish phase 410.
[0090] After analyzing all phases of the jump shot, the remote
server 106 saves the results to a database in step 714 and sends
the results to the mobile device 102 in step 716. When the remote
server 106 sends the results, the remote server 106 may include
hyperlinks to training videos to help the player fix the
fundamental deviations from the skeleton 400, 500 found by the
remote server 106.
[0091] As described above, the remote server 106 may perform tasks
other than just video analysis. For example, the remote server 106
may save a video of the player's 430 best shot in the database 308.
The player 430 can download that shot at any time to view what they
did right and what still needs improvement. The database 308 may
save many best shot videos for all players 430 registered with the
remote server 106. This best shot practice may be used more
frequently by professionals who do not need to improve mechanics,
but should simply maintain their method of shooting. For example,
the professional player may save their best shot in the database
308, and the computer performing video analysis 102 or 106 may
generate a skeleton 400, 500 based on that best shot saved in the
database 308. In this way, the professional player may quickly
determine how his current jump shot has drifted from his best
shot.
[0092] In addition, the remote server 106 or the mobile device 102
may prepare a video of the player's 430 jump shot with the skeleton
400, 500 superimposed over the video. In this way, the player 430
can see how the computer performing video analysis 102 or 106
determined the deviations from the skeleton 400, 500 reported to
the player 430. FIG. 13 illustrates the reference skeleton 400, 500
superimposed over the video.
[0093] The remote server 106 may save many videos in the database
308. For example, the database 308 may save one video from each
training session, and the player 430 may view past jump shot videos
to see the progress he is making. The database 308 may further save
progress data. For example, in training session one the remote
server 106 may note in the database 308 that the player 430 was not
square to the basket. However, in training session ten, the remote
server 106 may find that the player 430 was square to the basket.
Using these two pieces of information, the remote server 106 may
note that the player 430 has made significant progress on a
previous issue with his or her jump shot.
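The session-over-session progress check described above reduces to a comparison of the deviation flags stored per session. The helper and the flag labels below are hypothetical illustrations of how the remote server 106 might summarize findings saved in the database 308:

```python
def note_progress(earlier_flags, later_flags):
    """Compare deviation flags from an earlier and a later training
    session and report which issues the player has resolved and which
    persist. Flag names are illustrative."""
    return {
        "resolved": sorted(set(earlier_flags) - set(later_flags)),
        "persisting": sorted(set(earlier_flags) & set(later_flags)),
    }

# Session one flagged squareness to the basket; session ten no longer does.
progress = note_progress({"not_square", "elbow_out"}, {"elbow_out"})
# progress == {'resolved': ['not_square'], 'persisting': ['elbow_out']}
```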
[0094] The video analysis system 100 may further be configured to
detect the ball and a basketball hoop to determine whether the
player shot the ball through the hoop. The remote server 106 or the
mobile device 102 may further be configured to track the number of
made baskets during a training session. The remote server 106 or
the mobile device 102 may compile statistics for the player 430
detailing the number of made shots, the number of shots taken, the
percentage of shots made, etc.
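The session statistics described above amount to a simple tally. This sketch assumes each shot has already been classified as made or missed by the hoop-detection step; the function and field names are hypothetical:

```python
def compile_shot_stats(shot_outcomes):
    """Summarize a training session from per-shot outcomes
    (True = made, False = missed)."""
    made = sum(shot_outcomes)
    taken = len(shot_outcomes)
    return {
        "shots_made": made,
        "shots_taken": taken,
        "shooting_pct": round(100.0 * made / taken, 1) if taken else 0.0,
    }

stats = compile_shot_stats([True, False, True, True, False])
# stats == {'shots_made': 3, 'shots_taken': 5, 'shooting_pct': 60.0}
```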
[0095] If the video analysis system 100 is configured to detect
whether the ball went through the basketball hoop, the video
analysis system 100 may use this outcome data (e.g. whether the
ball went through the hoop) and correlate the outcome data with
detected deviations from the reference skeleton 400, 500. For
example, if a player 430 has a tendency to jump forward on his jump
shots, this would show up as a deviation from the reference
skeleton 400, 500. But, if the player 430 generally makes most of
the shots where he jumps forward, the jumping forward deviation may
be a low priority fundamental deviation for the player 430 to work
on. As another example, the player 430 may miss 80% of his shots
when he leaves his elbow out versus making 70% of his shots when
his elbow is properly positioned. So if the video analysis system
100 detects both the jumping forward deviation and the elbow
deviation, the video analysis system 100 may only report the elbow
deviation to the player 430. In this way, the video analysis system
100 communicates which deviations are outcome determinative, which
may be helpful in identifying which aspects of the jump shot the
player 430 should focus on. In subsequent training sessions, when
the elbow problem has been addressed, the video analysis system 100
may suggest a more balanced jump and less jumping forward.
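One way to realize the outcome correlation above is to compare the make rate when each deviation is present against the overall make rate, then report the most harmful deviation first. This is a sketch of the idea rather than the claimed method; the deviation labels are illustrative, and a real system would need far more shots for the rates to be meaningful:

```python
def rank_deviations_by_impact(shots):
    """Given per-shot records of (made, set-of-detected-deviations),
    estimate how much each deviation depresses the make rate relative
    to the overall rate, and return deviations sorted most-harmful first."""
    overall_pct = sum(made for made, _ in shots) / len(shots)
    all_devs = set().union(*(devs for _, devs in shots))
    impact = {}
    for dev in all_devs:
        with_dev = [made for made, devs in shots if dev in devs]
        if with_dev:
            # Positive impact means the shot is made less often with this deviation.
            impact[dev] = overall_pct - sum(with_dev) / len(with_dev)
    return sorted(impact, key=impact.get, reverse=True)

shots = [
    (False, {"elbow_out"}), (False, {"elbow_out", "jump_forward"}),
    (True, {"jump_forward"}), (True, {"jump_forward"}), (True, set()),
]
# 'elbow_out' correlates with misses far more than 'jump_forward', so it
# would be reported to the player first.
print(rank_deviations_by_impact(shots))  # ['elbow_out', 'jump_forward']
```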
[0096] The video analysis process described herein can easily be
performed on other physical activities, such as a golf swing,
throwing a football, a baseball swing, a pitching throw, running,
or any other physical activity. To apply the exemplary embodiments
to other sports or activities, the remote server 106 needs only to
receive skeletons for other physical activities. More specifically,
the video analysis process may be applied to a volleyball serve, a
tennis serve, throwing a football, kicking or punting a football, a
golf swing, discus throwing, shot put throwing, track events (high
jump, pole vaulting, hurdling, long jumping), or swimming strokes
(e.g. back stroke, free style, butterfly, breast stroke). To apply
the video analysis process described herein to other physical
activities, the system may include different skeletons uploaded to
the database 308 based on the various physical activities.
[0097] For physical activities other than a jump shot, the remote
server 106 or the mobile device 102 may store different skeletons
than the jump shot skeleton 400, 500 illustrated in FIGS. 4 and 5.
Different physical activities comprise a variety of body movements.
For example, a pitching skeleton may include a leg kick, an upright
stance, a step forward, and an arm follow through for the pitch
delivery. The pitching skeleton may include locations for other
"joints," such as glove hand position before delivery, head
position during delivery, or any other body positioning. Further,
some physical activities may benefit from more than two
perspectives of the activity, and so, more than two skeletons may
be generated (one for each perspective).
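One possible layout for holding multiple skeletons in the database 308 is a mapping keyed by activity, phase, and camera perspective, with each entry mapping joint names to positions. Every key, joint name, and coordinate below is a made-up illustration, not data from the specification:

```python
# Hypothetical reference-skeleton store: (activity, phase, perspective)
# -> {joint name: (x, y) in pixels}.
reference_skeletons = {
    ("jump_shot", "base", "front"): {"left_knee": (205, 390), "right_knee": (295, 390)},
    ("jump_shot", "base", "side"): {"hip": (240, 330), "knee": (250, 395)},
    ("pitching", "leg_kick", "side"): {"lead_knee": (260, 250), "glove_hand": (235, 280)},
}

def get_skeleton(activity, phase, perspective):
    """Look up the reference skeleton for one activity, phase, and camera view."""
    return reference_skeletons[(activity, phase, perspective)]

side_base = get_skeleton("jump_shot", "base", "side")  # {'hip': ..., 'knee': ...}
```

Adding a new activity then amounts to uploading new keyed entries, which matches the observation above that only new skeletons need to be received.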
[0098] The skeleton-based video analysis process may also be
configured to perform injury prevention tests. For example, the
system may receive video of a person performing an anterior
cruciate ligament (ACL) jump test. During the ACL jump test, a
person may jump from a relatively low height (e.g. a chair or
bench) and land on the floor. The way the person jumps and lands
may indicate weakness in the person's knee ligaments, which may
lead to significant injury. The skeleton for the ACL test may
illustrate an ideal way to jump and land from a bench. The computer
performing video analysis 102 or 106 may look for deviations from
the ACL test skeleton. If the deviations found have been shown to
increase injury risk, the person may try to correct their jumping
and landing style to prevent injury.
[0099] Similarly to the ACL test, under-pronation or over-pronation
during running, which can lead to serious foot or ankle injuries,
may be detected while a person runs. By analyzing video of a person
running based on a running skeleton, the computer performing video
analysis 102 or 106 may determine whether the person exhibits
over-pronation or under-pronation problems. The computer 102 or 106
may then suggest simple injury prevention videos to help the person
prevent injury as a result of their over-pronation or
under-pronation problems.
[0100] In view of the foregoing, it will be seen that the several
advantages of the invention are achieved and attained.
[0101] The exemplary embodiments can include one or more computer
programs that embody the functions described herein and illustrated
in the appended flow charts. However, it should be apparent that
there could be many different ways of implementing aspects of the
exemplary embodiments in computer programming, and these aspects
should not be construed as limited to one set of computer
instructions. Further, those skilled in the art will appreciate
that one or more acts described herein may be performed by
hardware, software, or a combination thereof, as may be embodied in
one or more computing systems.
[0102] The functionality described herein can be implemented by
numerous modules or components that can perform one or multiple
functions. Each module or component can be executed by a computer,
such as a server, having a non-transitory computer-readable medium
and processor. In one alternative, multiple computers may be
necessary to implement the functionality of one module or
component.
[0103] In another example embodiment, the jump shot analysis system
can be deployed in a multi-camera environment on a basketball court
as shown by FIG. 14. FIG. 14 shows a basketball hoop 1400 in a
basketball court, where numerous cameras are positioned around the
court to generate images of a basketball player as he or she shoots
baskets toward the hoop 1400.
[0104] For example, a first camera 1402 can be referred to as a
baseline camera because it is positioned near baseline 1404 at a
location sufficient to capture a baseline perspective view of a
player shooting a jump shot on the court. It should be understood
that camera 1402 can be positioned at any point along the baseline
1404 sufficient to generate a desired perspective of the shooter
(as indicated by the arrows near camera 1402). It should also be
understood that the baseline camera 1402 need not be positioned
precisely on the baseline so long as it is positioned to generate
baseline perspective video of the shooter.
[0105] A second camera 1406 can be referred to as a sideline camera
because it is positioned near sideline 1408 at a location
sufficient to capture a sideline perspective view of a player
shooting a jump shot on the court. A third camera 1410 can also be
referred to as a sideline camera, and it is positioned near the
sideline 1412 opposite from sideline 1408. As with baseline camera
1402, it should be understood that cameras 1406 and 1410 can be
positioned at any point along their respective sidelines 1408 and
1412 sufficient to generate a desired perspective of the shooter
(as indicated by the arrows near cameras 1406 and 1410). It should
also be understood that the sideline cameras 1406 and 1410 need not
be positioned precisely on the sidelines so long as they are
positioned to generate sideline perspective videos of the
shooter.
[0106] A fourth camera 1414 can be referred to as a backcourt
camera because it is positioned behind where a shooter will likely
be shooting (e.g., near half court or the baseline opposite
baseline 1404) so as to generate a back view of the shooter.
[0107] Through any combination of these cameras, multi-perspective
views (e.g., front views, side views, back views, angle views),
including 3D models, can be generated of players who are shooting
jump shots. The video data generated by these cameras can be
transmitted to a computer 1450 via wired or wireless links 1452.
For ease of illustration, link 1452 is shown with respect to
baseline camera 1402. However, it should be understood that each
camera can include such a link 1452 with computer 1450 so as to
provide computer 1450 with the shooter videos. Computer 1450 can be
configured to execute program code as discussed herein that
analyzes the video data to perform jump shot analysis. For example,
computer 1450 can process the video data from multiple perspectives
to detect and identify deviations from a reference skeleton and
suggest corrections.
[0108] Furthermore, if desired by a practitioner, any of the
cameras 1402, 1406, 1410, and 1414 can be positioned at a
sufficiently high elevation that reduces the impact of obstructions
caused by other players who are sharing the court with the shooter.
For example, if used during a live game or scrimmage environment,
other players on the court can be expected to be positioned between
one or more of the cameras and the shooter when the shooter shoots
the ball. Elevating the cameras above player height can help reduce
possible occlusions/blocking. For example, the cameras could be
positioned at a height of around 10 feet to help reduce
obstruction. However, when used in a practice environment where
there is less risk of obstruction, the cameras can be held or
positioned at shoulder height or the like.
[0109] Unless specifically stated otherwise as apparent from the
following discussion, it is appreciated that throughout the
description, discussions utilizing terms such as "determining" or
"comparing" or "processing" or the like, can refer to the action
and processes of a data processing system, or similar electronic
device, that manipulates and transforms data represented as
physical (electronic) quantities within the system's registers and
memories into other data similarly represented as physical
quantities within the system's memories or registers or other such
information storage, transmission or display devices.
[0110] The exemplary embodiments can relate to an apparatus for
performing one or more of the functions described herein. This
apparatus may be specially constructed for the required purposes,
or it may comprise a general purpose computer selectively activated
or reconfigured by a computer program stored in the computer. Such
a computer program may be stored in a machine (e.g. computer)
readable storage medium, such as, but not limited to, any type
of disk including floppy disks, optical disks, CD-ROMs and
magneto-optical disks, read only memories (ROMs), random access
memories (RAMs), erasable programmable ROMs (EPROMs), electrically
erasable programmable ROMs (EEPROMs), magnetic or optical cards, or
any type of media suitable for storing electronic instructions, and
each coupled to a bus.
[0111] The exemplary embodiments described herein are described as
software executed on at least one server, though it is understood
that embodiments can be configured in other ways and retain
functionality. The embodiments can be implemented on known devices
such as a personal computer, a special purpose computer, a cellular
telephone, a personal digital assistant ("PDA"), a digital camera, a
digital tablet, an electronic gaming system, a programmed
microprocessor or microcontroller with peripheral integrated circuit
element(s), an ASIC or other integrated circuit, a digital signal
processor, a hard-wired electronic or logic circuit such as a
discrete element circuit, a programmable logic device such as a
PLD, PLA, FPGA, PAL, or the like. In general, any device capable of
implementing the processes described herein can be used to
implement the systems and techniques according to this
invention.
[0112] It is to be appreciated that the various components of the
technology can be located at distant portions of a distributed
network and/or the Internet, or within a dedicated secure,
unsecured and/or encrypted system. Thus, it should be appreciated
that the components of the system can be combined into one or more
devices or co-located on a particular node of a distributed
network, such as a telecommunications network. As will be
appreciated from the description, and for reasons of computational
efficiency, the components of the system can be arranged at any
location within a distributed network without affecting the
operation of the system. Moreover, the components could be embedded
in a dedicated machine.
[0113] Furthermore, it should be appreciated that the various links
connecting the elements can be wired or wireless links, or any
combination thereof, or any other known or later developed
element(s) that is capable of supplying and/or communicating data
to and from the connected elements. The term module as used herein
can refer to any known or later developed hardware, software,
firmware, or combination thereof that is capable of performing the
functionality associated with that element. The terms determine,
calculate and compute, and variations thereof, as used herein are
used interchangeably and include any type of methodology, process,
mathematical operation or technique.
[0114] The embodiments were chosen and described in order to best
explain the principles of the invention and its practical
application to thereby enable others skilled in the art to best
utilize the invention in various embodiments and with various
modifications as are suited to the particular use contemplated.
[0115] As various modifications could be made in the constructions
and methods herein described and illustrated without departing from
the scope of the invention, it is intended that all matter
contained in the foregoing description or shown in the accompanying
drawings shall be interpreted as illustrative rather than limiting.
Thus, the breadth and scope of the present invention should not be
limited by any of the above-described exemplary embodiments, but
should be defined only in accordance with the following claims
appended hereto and their equivalents.
* * * * *