U.S. patent application number 15/281,070 was filed with the patent office on 2016-09-30 and published on 2017-04-13 as publication number 20170100637 for a fitness training guidance system and method thereof. The applicant listed for this patent is SceneSage, Inc. Invention is credited to Wlodek P. Kubalski, Kwai Wing Egan Lau, Yuk Kwai Lee, and John Peter Princen.

Application Number: 20170100637 (appl. no. 15/281,070)
Document ID: /
Family ID: 58499291
Publication Date: 2017-04-13

United States Patent Application 20170100637
Kind Code: A1
Princen; John Peter; et al.
April 13, 2017

Fitness training guidance system and method thereof
Abstract
A real-time personal fitness training guidance system is described. A camera unit including one or more cameras is provided to take images of a user who is ready to use, or is using, an exercise equipment. One or more current vital signs and physical activity information are derived from the captured images. Further, a personal fitness training guidance is generated for the user in accordance with the current vital signs, the physical activity information, and a workout program the user is doing. To allow the user to see how he or she is doing in the workout, an output unit is provided to show the personal fitness training guidance to the user in real-time. An external device is also provided to generate a coaching guidance for the user.
Inventors: Princen; John Peter; (Cupertino, CA); Lee; Yuk Kwai; (Fremont, CA); Lau; Kwai Wing Egan; (San Pablo, CA); Kubalski; Wlodek P.; (Los Gatos, CA)

Applicant:
Name: SceneSage, Inc.
City: Santa Clara
State: CA
Country: US

Family ID: 58499291
Appl. No.: 15/281,070
Filed: September 30, 2016
Related U.S. Patent Documents

Application Number 62/284,735, filed Oct 8, 2015 (claimed by the present application 15/281,070)
Current U.S. Class: 1/1

Current CPC Class: G06K 9/00342 20130101; A63B 2225/20 20130101; A63B 2225/50 20130101; A63B 2071/063 20130101; A61B 5/11 20130101; A63B 2220/806 20130101; A63B 2071/065 20130101; A63B 2220/805 20130101; G16H 30/40 20180101; A63B 2230/42 20130101; A63B 2071/068 20130101; G16H 20/30 20180101; A63B 2230/06 20130101; A63B 2071/0625 20130101; A63B 2230/207 20130101; A63B 71/0622 20130101; A63B 2220/30 20130101; A63B 2230/50 20130101; A63B 2220/18 20130101; A63B 22/02 20130101

International Class: A63B 24/00 20060101 A63B024/00; A63B 71/06 20060101 A63B071/06; A63B 22/02 20060101 A63B022/02
Claims
1. A personal fitness training system comprising: a fitness
equipment including: a camera unit with at least one camera
provided to capture images of a user using the fitness equipment; a
data processor, coupled to the camera unit, receiving the captured
images to derive from the captured images one or more vital signs
of the user exercising with the fitness equipment; and a console
providing a training guidance to the user to achieve at least one
goal through an exercise with the fitness equipment, wherein the
guidance is personalized per the user in accordance with a
plurality of attributes about the user derived from the captured
images.
2. The personal fitness training system as recited in claim 1,
wherein the training guidance is generated in accordance with a
profile of the user.
3. The personal fitness training system as recited in claim 2,
wherein the data processor is configured to detect a presence of
the user when the user is ready to use the fitness equipment.
4. The personal fitness training system as recited in claim 3,
wherein the data processor is configured to profile the user from
the images.
5. The personal fitness training system as recited in claim 1,
wherein the camera unit is further positioned to take images of the
user so that the data processor detects workout activities being
performed by the user.
6. The personal fitness training system as recited in claim 5,
wherein the training guidance is personalized to the user based on
the one or more vital signs and the workout activities.
7. The personal fitness training system as recited in claim 6,
wherein the data processor receives status information from the
fitness equipment, the status information includes at least one of:
speed, incline, and resistance.
8. The personal fitness training system as recited in claim 7,
wherein the training guidance is personalized to the user based on
the one or more vital signs, the workout activities and the status
information.
9. The personal fitness training system as recited in claim 6,
further comprising: at least one server, located remotely with
respect to the fitness equipment, communicating with the data
processor over a data network to receive the profile of the user
and the attributes about the user for an exercising session,
wherein the server is configured to archive historic data for each
of users registered therewith, the historic data for the user is
updated with the profile of the user and the attributes about the
user for the exercising session.
10. The personal fitness training system as recited in claim 9, wherein the server is coupled to or serves as persistent storage and central processing for a plurality of users who have signed up with the system.
11. The personal fitness training system as recited in claim 10, wherein the server receives user workout data from each of the users when each of the users uses the system, stores the workout data persistently for each of the users, generates statistics data on historic user workout data from the users, and indexes user profiles through facial and physical features.
12. The personal fitness training system as recited in claim 11,
wherein the training guidance is personalized to the user based on
the one or more vital signs, the workout activities, and the
statistics data.
13. The personal fitness training system as recited in claim 11,
wherein the server further generates coaching parameters as an
input to a coaching module which in turn generates a coaching
guidance sent to a corresponding processing unit for display to a
user.
14. The personal fitness training system as recited in claim 13,
further comprising a mobile device communicating with the server,
the mobile device receives the coaching guidance from the server
over a data network.
15. The personal fitness training system as recited in claim 9,
wherein the server includes a coaching module designed to customize
a coaching guidance for the user based on inputs from the data
processor, an analytic engine periodically analyzing a historical
workout database to derive correlations between experience types,
intensities, user profile types, and vital sign measures based on
machine learning techniques.
16. A method for providing a training guidance in a personal fitness training system, the method comprising: capturing by a camera unit in real-time images of a user doing a workout on the fitness equipment; analyzing the captured images to obtain one or more vital signs and physical activity information of the user; obtaining a workout program the user is doing; generating a personal fitness training guidance for the user based on the vital signs, the physical activity information and the workout program; and delivering the personal fitness training guidance to the user.
17. The method as recited in claim 16, wherein the camera unit
includes at least one camera or is repositioned to image different
parts of the user.
18. The method as recited in claim 17, further comprising:
profiling the user from a set of facial images from the camera
unit; and detecting workout activities being performed by the user
from a set of images.
19. The method as recited in claim 18, further comprising:
determining a fitness level of the user; obtaining a fitness goal
of the user; and generating a workout program for the user based on
the fitness level of the user and the fitness goal of the user in
accordance with the status information.
20. The method as recited in claim 18, wherein said profiling the
user from a set of facial images comprises: determining an identity
of the user from the facial images; and obtaining the profile of
the user corresponding to the identity to indicate that the user is
a returning user.
Description
CROSS-REFERENCE TO RELATED APPLICATION
[0001] This application claims the benefit of U.S. provisional application No. 62/284,735, entitled "Real-time Personal Fitness Training Guidance System", filed on Oct. 8, 2015, which is hereby incorporated by reference for all purposes.
BACKGROUND OF THE INVENTION
[0002] 1. Field of the Invention
[0003] The present invention is generally related to the area of
fitness equipment. Particularly, the present invention is related
to a real-time personal fitness training guidance system and a
method thereof.
[0004] 2. Description of the Related Art
[0005] Existing solutions that assist users of fitness equipment such as treadmills have a number of limitations and fail to truly help users become more effective in attaining their exercise goals. In the case of treadmills, a user has to press both hands on sensor bars installed on the equipment periodically to measure his/her heart rate. As a result, the exercising movements on the treadmill are restricted while both hands are on the sensor bars, and only a numerical reading is provided to the user. Some of the limitations of the fitness equipment currently on the market are summarized below.
[0006] 1. Users face difficulties in getting their vital signs measured continuously during an exercise. Sensor bars are seen on some treadmills, but they must be held constantly in both hands for the vital signs to be measured continuously, so the user's hand movement is considerably restricted. Existing systems that monitor the vital signs continuously require a user to wear some type of device, such as a heart-rate monitor strap, that communicates with the fitness equipment through wireless technologies. They require the user to consciously wear electronic devices during a workout. In reality, only the most motivated users may remember to bring the devices; if a device is not worn, there is a data gap in the workout history. Further, the devices are prone to loss or wear and tear, and may disrupt the workout experience. More importantly, there is no correlation between the vital sign data and the activities performed by a user, hence no feedback can be provided to the user on how to improve the effectiveness of the workout.
[0007] 2. Fitness equipment cannot seamlessly detect a user's profile (gender, age, body mass, etc.). Each time a user gets on a machine, the user needs to re-enter his/her profile information, or go through the hassle of hooking up a mobile phone if he/she has already registered and the machine is network connected. In other words, the user has to do some work each time he/she uses an exercise machine to get recognized; the machine can neither identify the user nor do the profiling automatically. Further, these existing systems cannot measure the physical body changes of the user.
[0008] 3. User activities on fitness equipment are not monitored. Existing systems cannot detect what activities the user is doing on the equipment, nor can they obtain any feedback on how the user is feeling or doing during a workout.
[0009] 4. No exercise history records are shared across exercise equipment. Most often the results data are not persistently stored by the fitness equipment after the workout is done. If wearable devices are used, the results data can only be stored in the online repository of a user account, disconnected from the fitness equipment. Fitness equipment units cannot communicate with each other to recognize the user's identity and continue the tracking from where the user last left off.
[0010] 5. Feedback on how to improve the exercises is not provided by existing exercise equipment. Users do not tangibly feel the impact of a workout exercise toward their exercise goals. There are no insightful interpretations of workout data results to feed back to a user during and after the exercises.
[0011] Accordingly, there is a great need for fitness equipment or a system that provides a user with a training guide based on how the user is doing at the moment, in view of his/her past exercises.
BRIEF SUMMARY OF THE INVENTION
[0012] This section is for the purpose of summarizing some aspects
of the present invention and to briefly introduce some preferred
embodiments. Simplifications or omissions may be made to avoid
obscuring the purpose of the section. Such simplifications or
omissions are not intended to limit the scope of the present
invention.
[0013] In general, the present invention is related to a real-time personal fitness training guidance system and a method for the same, where such a system overcomes at least one of the aforementioned problems. According to one aspect of the present invention, a real-time personal fitness training guidance system is described that is able to take images of a user ready to use or using an exercise equipment. One or more current vital signs and physical activity information are derived from the captured images. Further, a personal fitness training guidance is generated for the user in accordance with the current vital signs, the physical activity information and a workout program the user is doing. To allow the user to see how he or she is doing in his/her workout, an output unit is provided to show the personal fitness training guidance to the user in real-time.
[0014] According to another aspect of the present application, a computer vision system is provided to profile the user seamlessly and automatically in view of the fitness equipment and a workout program the user is doing. The profile of the user is also derived from the captured images. The derived profile is retained or used to update a stored profile of the user, and is subsequently used for generating the personal fitness training guidance.
[0015] According to still another aspect of the present application, a user does not need to wear devices or press buttons periodically during a workout. As a result, the user experience of using the fitness equipment during a workout is maximized. In addition, the present invention not only displays the measured vital signs to users but also provides the user with personal fitness training guidance on the basis of the current vital signs, the physical activity information and the workout program being used.
[0016] According to still another aspect of the present
application, a fitness training guidance system utilizes an
external device to record continuous workout data for users, which
is helpful for the users to measure their performance, and achieve
their workout goals. Depending on the implementation, the external
device may be a part of cloud computing system or a mobile device
communicating with a server.
[0017] According to yet another aspect of the present application, as part of the fitness training guidance system, a server or a collection of such servers is provided to: serve as persistent storage and central processing for a plurality of users who have signed up with the system; receive user workout data from a processing unit; store the data persistently for each of the users; generate statistics data on historic user workout data from multiple users and previously known population data sets; index user profiles through facial and physical features; and generate coaching parameters as an input to a coaching module, which in turn generates coaching guidance that is sent to the corresponding processing unit for display to the user.
[0018] The present invention may be implemented in numerous ways,
including a method, a system, a device or a part of a system, each
yielding different benefits, objects and advantages. According to
one embodiment, the present invention is a personal fitness
training system comprising: a fitness equipment including a camera
unit with at least one camera provided to capture images of a user
using the fitness equipment; a data processor, coupled to the
camera unit, receiving the captured images to derive therefrom
vital signs of the user exercising with the fitness equipment; and
a console providing a training guidance to the user to achieve at
least one goal through an exercise with the fitness equipment,
wherein the guidance is personalized per the user and generated in
the data processor in accordance with a profile of the user and a
plurality of attributes about the user derived from the captured
images. The personal fitness training system further comprises at
least one server, located remotely with respect to the fitness
equipment, communicating with the processing unit over a data
network to receive the profile of the user and the attributes about
the user for an exercising session. The server is configured to archive historic data for each of the users registered therewith; the historic data for the user is updated with the profile of the user and the attributes about the user for the exercising session.
[0019] According to another embodiment, the present invention is a
method for a personal fitness training system, the method
comprises: capturing in real-time images of a user doing a workout
on the fitness equipment; analyzing the captured images to obtain
one or more vital signs and physical activity information of the
user; obtaining a workout program the user is doing; generating a
personal fitness training guidance for the user based on the vital
signs, the physical activity information and the workout program;
and delivering the personal fitness training guidance to the
user.
[0020] Accordingly, one of the objects of the present invention is to provide a mechanism to free a user exercising on fitness equipment from any restriction, to identify the user and measure the vital signs of the user without any intervention from the user, and to subsequently provide the user with an effective fitness training guidance specifically personalized for the user.
[0021] Other objects, features, and advantages of the present
invention will become apparent upon examining the following
detailed description of an embodiment thereof, taken in conjunction
with the attached drawings.
BRIEF DESCRIPTION OF THE DRAWINGS
[0022] The invention will be readily understood by the following
detailed description in conjunction with the accompanying drawings,
wherein like reference numerals designate like structural elements,
and in which:
[0023] FIG. 1 shows a diagram of a complete exercise system which
includes a real-time personal fitness training guidance system;
[0024] FIG. 2 shows a block diagram of a local processing unit according to an embodiment of the present invention;
[0025] FIG. 3 shows a block diagram of the cloud processing
unit;
[0026] FIG. 4A shows a detailed flowchart or process of providing
an exercise guidance to a user according to one embodiment of the
present invention;
[0027] FIG. 4B shows a detailed flowchart or process of recognizing
the user;
[0028] FIG. 4C shows a detailed flowchart or process of profiling
the user;
[0029] FIG. 4D shows a detailed flowchart of guiding a user in
exercising on an exercising equipment; and
[0030] FIG. 5 shows an overall functional diagram with exemplary data flows according to one embodiment of the present invention.
DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS
[0031] The detailed description of the present invention is
presented largely in terms of procedures, steps, logic blocks,
processing, or other symbolic representations that directly or
indirectly resemble the operations of data processing devices.
These descriptions and representations are typically used by those
skilled in the art to most effectively convey the substance of
their work to others skilled in the art. Numerous specific details
are set forth in order to provide a thorough understanding of the
present invention. However, it will become obvious to those skilled
in the art that the present invention may be practiced without
these specific details. In other instances, well known methods,
procedures, components, and circuitry have not been described in
detail to avoid unnecessarily obscuring aspects of the present
invention.
[0032] Reference herein to "one embodiment" or "an embodiment"
means that a particular feature, structure, or characteristic
described in connection with the embodiment can be included in at
least one embodiment of the invention. The appearances of the
phrase "in one embodiment" in various places in the specification
are not necessarily all referring to the same embodiment, nor are
separate or alternative embodiments mutually exclusive of other
embodiments.
[0033] The present invention pertains to a system, a method, a
platform and an application each of which is uniquely designed,
implemented or configured to use images of a user ready to use a
fitness equipment or exercising with the fitness equipment. As used
herein, any pronoun references to gender (e.g., he, him, she, her,
etc.) are meant to be gender-neutral. Unless otherwise explicitly
stated, the use of the pronoun "he", "his" or "him" hereinafter is
only for administrative clarity and convenience. Additionally, any
use of the singular or to the plural shall also be construed to
refer to the plural or to the singular, respectively, as warranted
by the context.
[0034] To facilitate the description of the present invention, a
treadmill is used as an example to illustrate how one embodiment of
the present invention can be applied thereto. Those skilled in the
art shall appreciate that the present invention can be applied to
other exercise devices, applications and common platforms.
Embodiments of the present invention are discussed herein with
reference to FIGS. 1-5. However, those skilled in the art will
readily appreciate that the detailed description given herein with
respect to these figures is for explanatory purposes only as the
invention extends beyond these limited embodiments.
[0035] Referring now to the drawings, in which like numerals refer
to like parts throughout the several views. FIG. 1 shows a
configuration diagram of an exemplary personal fitness training
system according to an embodiment of the present application. As
shown in FIG. 1, the personal fitness training guidance system
includes a fitness equipment 120 (e.g., a treadmill), a local
processing unit 100 embedded in, affixed to or provided to a
console 121, and a server or a cloud processing unit 200. It should be noted that the server or cloud processing unit 200 is not part of the fitness equipment 120 but is provided in the personal fitness training guidance system to achieve one or more of the benefits, advantages and objectives of the present invention. According to one embodiment, the console 121 is shown in FIG. 1 to include a user interface or controls and is coupled to the local processing unit 100, which includes a camera unit 101, a data processor 102, and a display 103. The data processor is designed or programmed to implement or perform some or all of the functions contemplated in the present invention, with or without an external device (e.g., a server or a mobile device).
[0036] Depending on the embodiment, the console 121 may be situated in, on or near the fitness equipment 120. In the example of FIG. 1, the console 121 is provided at the front of a treadmill, and the camera unit 101 is also disposed at the front of the treadmill to take real-time images of a user exercising on the treadmill. The data processor 102 is designed or programmed to derive user workout data (including current vital signs and physical activity information, which will be described in detail later) from the captured images, and to provide in real-time the personal fitness training guidance to the user based on the user workout data. The data processor 102 may optionally receive some workout data from the exercise equipment itself; for example, it could receive speed, incline and resistance directly from the treadmill. The data may be processed locally or transported to the server 200 for further processing.
[0037] In one embodiment, the server 200 is located remotely with respect to the fitness equipment 120 but in communication with the processing unit 100 over a data network (e.g., the Internet, a wired or wireless network). As part of the fitness training guidance system, the server or a collection of such servers 200 serves as persistent storage and central processing for a plurality of users who have signed up with the system; receives the user workout data from a processing unit; stores the data persistently for each of the users; generates statistics data on historic user workout data from multiple users and previously known population data sets; indexes user profiles through facial and physical features; and generates coaching parameters as an input to a coaching module, which in turn generates coaching guidance that is sent to the corresponding processing unit for display to the user. In one embodiment, the coaching parameters refer to all physiological and biomechanical data derived from the profile, exercise responses and fitness level associated with a user while he is doing an exercise with the system. It should be noted that a single cloud processing instance can support or connect with many entities of local processing units.
[0038] Referring now to FIG. 2, it shows a functional block diagram of an exemplary local processing unit 100 according to an embodiment of the present invention. As shown in FIG. 2, the local processing unit 100 comprises a camera unit 101 configured to take real-time images of the user doing a workout at the fitness equipment. The camera unit 101 includes at least one camera to capture images that include the user's head. The types of camera can include visual light cameras, infrared cameras, depth sensors, or sensors that capture other regions of the spectrum (visible or invisible), and provide a sequence of data in time that can be indexed by a set of two or more coordinates (e.g., horizontal and vertical coordinates). As shown in FIG. 1, the camera unit 101 may include additional cameras that capture images of other areas of the user's body. For example, an additional camera can take images of the user's lower body, including the legs, so that additional information about the user's workout activities can be determined.
[0039] The data processor 102, communicating with the camera unit 101, receives the captured images therefrom, performs real-time analysis of the images to obtain one or more current vital signs (e.g., a heart rate or a body temperature) and physical activity information (e.g., information indicating whether the user is running, walking or resting) of the user, and generates in real-time a personal fitness training guidance message (e.g., prompting the user to increase his speed of movement) for the user on the basis of the current vital signs, the physical activity information and a workout program the user is doing. While the derived current vital signs and the physical activity information are used to generate the personal fitness training guidance message, they are also sent to the cloud processing unit 200 for record keeping and statistical analysis. The data processor 102 can also receive status information from the exercise equipment that reflects the state of the machine, such as speed, incline and resistance.
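The application does not disclose a specific algorithm for deriving vital signs from the captured images. As a purely illustrative sketch of one known approach (remote photoplethysmography), a heart rate could be estimated from the periodic variation of the mean green-channel intensity over a facial region; the function name, scan range and sample data below are assumptions, not part of the disclosure.

```python
import math

def estimate_heart_rate_bpm(green_means, fps):
    """Estimate heart rate from per-frame mean green-channel intensities
    by a brute-force DFT scan over the plausible heart-rate band."""
    n = len(green_means)
    mean = sum(green_means) / n
    signal = [v - mean for v in green_means]     # remove the DC component
    best_bpm, best_power = 0.0, -1.0
    # Scan 42-240 bpm in 1-bpm steps and keep the strongest component.
    for bpm in range(42, 241):
        f = bpm / 60.0                           # candidate frequency in Hz
        re = sum(s * math.cos(2 * math.pi * f * k / fps)
                 for k, s in enumerate(signal))
        im = sum(s * math.sin(2 * math.pi * f * k / fps)
                 for k, s in enumerate(signal))
        power = re * re + im * im
        if power > best_power:
            best_bpm, best_power = float(bpm), power
    return best_bpm

# Synthetic example: a 1.2 Hz (72 bpm) pulse sampled at 30 fps for 10 s.
fps = 30.0
fake = [0.5 * math.sin(2 * math.pi * 1.2 * k / fps) + 100.0 for k in range(300)]
print(estimate_heart_rate_bpm(fake, fps))  # -> 72.0
```

In a real system the per-frame green means would come from a face region located by the camera unit; the brute-force scan here simply makes the frequency-analysis idea explicit without any library dependencies.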
[0040] An output unit 103 (e.g., a display) communicating with the local data processor 102 receives the personal fitness training guidance message from the local data processor 102 and provides it in real-time to the user. The output unit could be a screen or any other kind of visual display, or speakers, headphones, or any other kind of visual/auditory output device.
[0041] In one embodiment, the local processing unit 100 is integrated with the exercise equipment, in which case the camera unit 101 is part of the console on the exercise equipment. Depending on the implementation, the local processing unit 100 can be a separate unit internal to the console, or its functions could be implemented on an existing processing unit in the exercise equipment, with the output messages shown on the exercise equipment display.
[0042] FIG. 3 shows a functional block diagram of a cloud processing unit 200 according to an embodiment of the present application. As shown
in FIG. 3, the cloud processing unit 200 includes a tracking agent
or module 201, a coaching agent or module 203, and an analytic
agent or module 205. Depending on the implementation, each of the
modules 201, 203 and 205 may be implemented in software or a
combination of software and hardware.
[0043] The tracking module 201 is designed to process requests to store or retrieve historic exercise and profile data for all users registered in the system. The data is persistently stored in a user database 202, which contains the historic data of all exercise records by a registered user. For example, it stores vital sign readings, body form readings, equipment statuses and guidance messages along a timeline (e.g., by second, minute, hour or day).
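The record layout is not specified in the application; as a minimal sketch under that caveat, the timeline-indexed storage and retrieval handled by the tracking module might look like the following (the class and method names are hypothetical):

```python
from bisect import bisect_left, bisect_right
from collections import defaultdict

class UserDatabase:
    """Minimal in-memory stand-in for the user database 202: per-user
    exercise records indexed along a timeline."""

    def __init__(self):
        self._records = defaultdict(list)   # user_id -> [(timestamp, record), ...]

    def store(self, user_id, timestamp, record):
        """Append a reading (vital signs, body form, equipment status,
        guidance message, ...) for the given user at the given time."""
        entries = self._records[user_id]
        entries.append((timestamp, record))
        entries.sort(key=lambda e: e[0])    # keep the timeline ordered

    def retrieve(self, user_id, start, end):
        """Return all records with start <= timestamp <= end for the user."""
        entries = self._records[user_id]
        keys = [t for t, _ in entries]
        lo, hi = bisect_left(keys, start), bisect_right(keys, end)
        return [r for _, r in entries[lo:hi]]

db = UserDatabase()
db.store("alice", 60, {"heart_rate": 95, "speed_kph": 6.0})
db.store("alice", 120, {"heart_rate": 118, "speed_kph": 8.5})
db.store("alice", 180, {"heart_rate": 131, "speed_kph": 9.0})
print(len(db.retrieve("alice", 100, 200)))  # -> 2
```

A production tracking module would of course persist to durable storage; the sketch only shows the timeline-keyed store/retrieve contract.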
[0044] The coaching module 203 manages a set of personalized workout instructions for each workout session. It loads a workout program template from a Workout Program Database 204 and customizes it based on inputs from the local processing unit 100. The workout program refers to a program for training an exercising user, customized for the user by the real-time personal fitness training guidance system. According to one embodiment, a workout program is a collection of one or more training phases. For example, a workout program could have three phases: a warm-up phase, a main phase and a cool-down phase. The goals of each phase, in terms of the activities and the vital sign limits, are customized for each user based on the profile information and the user's goals. The phase goals include, but are not limited to, duration, distance, physiological limit (e.g., heart rate) and the number of repetitions. A goal can also be in the form of a virtual score derived from the tangible data listed above. The coaching module 203 provides guidance to the user in achieving each of the phase goals. When a phase goal is met or a time limit has passed, the coaching module 203 guides the user to move to the next phase.
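The phase-driven behavior described above can be sketched as a small state machine. The phase fields, message strings and durations below are illustrative assumptions; the application only states that a phase ends when its goal is met or its time limit passes.

```python
class CoachingModule:
    """Sketch of phase-driven guidance: a workout program is a list of
    phases, each with a duration goal and a heart-rate limit; the module
    advances to the next phase when the phase time limit has passed."""

    def __init__(self, phases):
        self.phases = phases            # e.g. [{"name": ..., "duration_s": ..., "max_hr": ...}]
        self.index = 0
        self.elapsed_in_phase = 0

    def step(self, seconds, heart_rate):
        """Advance the session clock and return a guidance message."""
        if self.index >= len(self.phases):
            return "workout complete"
        phase = self.phases[self.index]
        self.elapsed_in_phase += seconds
        if self.elapsed_in_phase >= phase["duration_s"]:
            self.index += 1
            self.elapsed_in_phase = 0
            if self.index >= len(self.phases):
                return "workout complete"
            return "move to next phase: " + self.phases[self.index]["name"]
        if heart_rate > phase["max_hr"]:
            return "slow down: heart rate above the %s limit" % phase["name"]
        return "keep going: %s" % phase["name"]

program = [
    {"name": "warm-up", "duration_s": 300, "max_hr": 110},
    {"name": "main", "duration_s": 1200, "max_hr": 155},
    {"name": "cool-down", "duration_s": 300, "max_hr": 120},
]
coach = CoachingModule(program)
print(coach.step(60, 95))    # -> keep going: warm-up
print(coach.step(300, 100))  # -> move to next phase: main
```

Distance or repetition goals would slot in the same way as the duration check; only the time-limit goal is shown here for brevity.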
[0045] The analytic engine 205 periodically analyzes the historical workout database and derives correlations between experience types, intensities, user profile types and vital sign measures based on machine learning techniques. It is used to predict the exercise response and personal preference of a user based on the behavior of other users who share similar physical or behavioral attributes. The exercise response refers to all physiological and biomechanical changes in the user's body right after performing a certain exercise. These exercise-response pattern insights are stored in a Population Segment Database 206, which stores the exercise-response data and preference of each of the users. The Population Segment Database 206 also stores general population data sets and the exercise best practices applicable to each of them.
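The application names machine learning techniques without detailing them. As a deliberately minimal stand-in, one correlation the analytic engine might compute for a population segment is shown below; the sample values (treadmill speed vs. steady-state heart rate) are hypothetical illustration data, not from the disclosure.

```python
import math

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical historic data for one population segment:
# workout intensity (treadmill speed, km/h) vs. steady-state heart rate (bpm).
speeds = [4.0, 6.0, 8.0, 10.0, 12.0]
heart_rates = [92, 105, 121, 138, 151]
print(round(pearson(speeds, heart_rates), 3))  # -> 0.999
```

A strong positive coefficient like this is the kind of intensity-to-vital-sign relationship that could then be used to predict a similar user's exercise response.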
[0046] The above introduces the components of the real-time personal fitness training guidance system, the local processing unit 100, the cloud processing unit 200 and the signal transmission relationships among them according to an embodiment. Below is a description of a workflow of the real-time personal fitness training guidance system according to one embodiment of the present application with respect to FIG. 4A. FIG. 4B, FIG. 4C and FIG. 4D provide the expanded details of FIG. 4A.
[0047] Referring now to FIG. 4A, it shows a flowchart or process 450 of providing an exercise guidance to a user. Depending on the implementation, the process 450 may be implemented in software or a combination of hardware and software. The process 450 starts at 300 to recognize the user, namely one or more cameras are put into a working mode to take images of a user ready to use, or while using, the exercise equipment.
[0048] FIG. 4B shows a flowchart or process 460 of recognizing the
user. At 301, a camera unit or one of the cameras therein is
positioned to monitor the location where a user shall appear to use
the fitness equipment. Until a user shows up, the system may remain
idle or show promotional video and audio messages to promote the
system to potential users walking by. At
302, the camera unit recognizes that a user is on the fitness
equipment 120. The user detection or recognition can be implemented
using face detection from facial images (when the camera is focused
on the facial part of the user). Detecting a human face within a
certain defined region in the field of view of the camera unit can
be used as an indication that a user is ready to use the exercise
equipment. When a user is detected, the system or the console is
caused to prompt the user with a welcome message at 303. The
welcome message includes a call-to-action message that asks the
user if they wish to use the exercise guidance system.
[0049] At 304, the system waits for the user to agree to use the
guidance system. If the user agrees, the exercise guidance system
proceeds to 305. Otherwise the system does not start, and the user
simply uses the exercise equipment as usual without a real-time
guidance generated based on his state. To provide a more personalized
training guide, inputs from the user to the system can be made
through a variety of methods, including but not limited to a gesture
input through the camera sub-system, a voice input, or inputs via a
touch screen. Eventually, the process 460 goes to 305, which returns
to 300 of FIG. 4A. The process 450 of FIG. 4A now goes to 400 to
profile the user to establish the context for executing the exercise
program, which is further detailed in FIG. 4C.
[0050] Referring now to FIG. 4C, it shows a flowchart or process
470 of profiling the user. It is assumed that the user agrees to use
the guidance system. The profile of a user refers to basic
information (e.g., age, gender, and weight) about the user which is
useful in generating a set of coaching parameters for the user. The
profile also contains an ID of the user, which could be generated by
extracting facial attributes from images of the user's face.
[0051] At 401, the camera unit is driven to capture images of the
user positioned at the fitness equipment. At 402, the local data
processor detects a predefined image region in one or more of the
captured images that maps to the face of the user. The local data
processor sends detected facial image regions to the tracking
module 201 of FIG. 3, for example. The tracking module 201 uses a
face recognition technique based on the captured image regions and
data stored in the user database 202 to recognize known users. In
other words, users who have previously used the system and have
recorded profile information and exercise histories can be confirmed
as returning users. When there is a match of user ID with the user,
the profile of the user corresponding to the identity of the user is
obtained from the tracking module for subsequent exercise guidance
processing.
[0052] In the event that the user is not recognized, the user is
treated as a new user. Using the face region as an input, at 404,
the local processor unit estimates the birth year of the user. At
405, a gender analyzer is designed to analyze the face region of
the image to estimate the gender (e.g., male or female) of the
user. There are a variety of methods known in the art for
estimating age and gender from facial images. These methods all use
sophisticated machine learning methods that have been trained on
large databases of labeled data, and their reliability is
sufficient for the system to use to derive initial coaching
parameters for the exercise guidance system.
[0053] At 406, the camera unit 101 takes images of the body of the
user. These images are used to determine measures of physical
attributes of the user such as weight and height. A physical
attribute refers to an appearance characteristic or biological
characteristic of the user, including body shape, body movement, age,
weight, and gender. These attributes can be measured objectively, but
here are approximately inferred from the appearance of the user. The
physical attributes can be used to more accurately determine certain
workout data such as calories burnt, and also help to determine the
fitness level changes of the user over time.
There are a variety of methods known in the art for estimating
physical attributes. For example, if the camera system includes
depth cameras, a full body outline with dimensions can be extracted
from the images, and from this outline, the height and weight can
be fairly estimated. If a camera sub-system includes only RGB
cameras, it is still possible to estimate physical attributes using
multiple cameras, when the locations of the cameras relative to
each other are known, and the cameras have overlapping views.
[0054] At 407, exercise profile information is created for this new
user. The new profile is stored in the user database 202 of FIG. 3.
If this user returns to use the system next time, his existing record
can be found and retrieved at 408.
[0055] The user's profile information is used to prepare the coaching
parameters at 409, which are further used in the subsequent exercise
guidance. The coaching parameters that are derived include the
maximum heart rate and heart rate zones. There are well-known methods
that take gender and age into consideration when estimating the
maximum heart rate. For example, one gender-independent formula is
220-age. More sophisticated formulas can include gender and weight.
For example, if gender is included, one common formula is 220-age for
men and 226-age for women. If weight is included, another popular
formula is 216-0.5*age-0.05*weight (in lbs) for men, or
211-0.5*age-0.05*weight (in lbs) for women. Heart rate zones can be
determined from the maximum heart rate. These heart rate zones are
important in the exercise guidance to ensure that the user is working
at the correct intensity to make improvements according to their
goals. For example, an endurance aerobic zone can be estimated as
50-60% of the maximum heart rate. Exercising in this zone is
excellent for losing weight and increasing aerobic endurance. The
threshold zone, which is important for improving running economy and
pushing the limits of running speed for 5K and 10K races, is 80-90%
of the maximum heart rate. The recovery heart rate zone, below 50% of
the maximum heart rate, is also important for resting between the
intense intervals in an interval workout.
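The maximum heart rate formulas and the example zone boundaries above can be sketched as follows. This is an illustrative sketch only; the function names and the gender-independent fallback behavior are assumptions, not part of the described system.

```python
def max_heart_rate(age, gender=None, weight_lbs=None):
    """Estimate maximum heart rate (bpm) using the formulas above."""
    if weight_lbs is not None:
        # 216 - 0.5*age - 0.05*weight for men, 211 - ... for women
        base = 211 if gender == "female" else 216
        return base - 0.5 * age - 0.05 * weight_lbs
    if gender == "female":
        return 226 - age
    # gender-independent (and male) formula
    return 220 - age

def heart_rate_zones(hr_max):
    """Example zones described above, as (low, high) bounds in bpm."""
    return {
        "recovery":  (0.0, 0.50 * hr_max),            # below 50% of max
        "endurance": (0.50 * hr_max, 0.60 * hr_max),  # aerobic, 50-60%
        "threshold": (0.80 * hr_max, 0.90 * hr_max),  # 80-90%
    }
```

For a 40-year-old man, `max_heart_rate(40, gender="male")` gives 180 bpm, so the endurance zone spans roughly 90-108 bpm.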
[0056] At 410, the profile of the user from 408 and the coaching
parameters are used to create a list of potential workout programs,
from the list of workout programs stored in the workout program
database 204. Different workout programs are catered to different
fitness goals (e.g., losing weight, toning up, getting stronger, or
gaining muscle mass). The user can make a selection from the list.
If the user does not select a program, the system is programmed or
designed to select a workout program for the user based on his
profile information. This workout program along with the user's
profile, and the coaching parameters are used for the system to
generate a training guidance.
[0057] After the user profile has been derived and the workout
program is selected, according to 400 of FIG. 4A, the system starts
the guidance according to 500 of FIG. 4A. Referring now to FIG. 4D,
it shows a flowchart or process 480 of producing a training
guidance to the user. At 501, the local data processor 102 captures
images from the camera unit 101. These images are used to estimate
vital signs at 502, and extract what activities the user is doing
at 503.
[0058] According to one embodiment, at 502 the system uses the
images captured to estimate the user's vital signs and sends the
vital signs to the coaching module 203. The vital signs in the
embodiment include a heart rate, a respiration rate, and a blood
oxygen level. Estimation of the heart rate from the images is known
as remote photoplethysmography. It uses RGB data from the cameras
to sense the periodic changes in blood volume in the face that
reflects the periodic beating of the heart. The user's breathing
rate can also be determined by observing the movement of the chest.
Periodic motions of the chest correspond to the periodic breathing
patterns of the user. With these methods it is possible to obtain a
user's vital signs continuously without any wearable sensor, and
without requiring the user to touch any sensor on the exercise
equipment.
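As a rough illustration of the remote photoplethysmography approach described above, the sketch below (one possible implementation assumed for illustration; the application does not specify the algorithm) averages the green channel over the detected face region in each frame and takes the dominant frequency within a plausible pulse band:

```python
import numpy as np

def estimate_heart_rate(face_frames, fps, lo_bpm=40, hi_bpm=200):
    """Toy remote-photoplethysmography sketch: the mean green-channel
    intensity of the face region varies with blood volume, so the
    dominant frequency of that signal approximates the pulse rate.
    face_frames: array of shape (n_frames, height, width, 3), RGB.
    """
    green = face_frames[..., 1].mean(axis=(1, 2))  # one sample per frame
    green = green - green.mean()                   # remove DC component
    spectrum = np.abs(np.fft.rfft(green))
    freqs = np.fft.rfftfreq(len(green), d=1.0 / fps)  # in Hz
    band = (freqs >= lo_bpm / 60.0) & (freqs <= hi_bpm / 60.0)
    peak_hz = freqs[band][np.argmax(spectrum[band])]
    return peak_hz * 60.0  # beats per minute
```

A synthetic face signal pulsing at 1.2 Hz yields an estimate of about 72 bpm; a practical implementation would add face tracking, detrending, and band-pass filtering.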
[0059] At 503, the system uses the images to extract the physical
activity the user is performing and sends these activity
measurements to the coaching module 203. The physical activity
measurements sent to the coaching module can include information
obtained from the exercise equipment, as well as information
derived by processing the captured images. The physical activity
information in one embodiment includes information showing an
exercise form, an intensity level, a cadence, a speed, a distance,
etc. For example, the user's cadence can be estimated by
tracking the position of the user's face within the field of view
of the camera unit. The periodic motion of the face can be used to
derive the running or walking cadence of the user. The motion of
the user's head during running can also be used to estimate bounce,
which is the vertical motion that the user undergoes during their
running pattern. Bounce is an indication of the efficiency of the
running technique. Lower bounce is more efficient. If the camera
unit includes a field of view of the user's feet, the system can
also estimate a time of ground contact. Lower ground contact time
is an indicative of a more efficient running style. If the camera
unit includes a depth camera sensor, then the complete skeleton of
the user in running can be extracted, and such metrics as posture,
including body lean and footfall can be derived.
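The cadence and bounce estimates described above can be illustrated with a similar frequency-domain sketch. The function names and the 60-240 steps-per-minute search band are assumptions for illustration:

```python
import numpy as np

def estimate_cadence(face_y, fps, lo_spm=60, hi_spm=240):
    """Toy cadence sketch: the tracked face oscillates vertically once
    per step, so the dominant frequency of the vertical-position
    signal, converted to cycles per minute, estimates steps per
    minute. face_y: per-frame vertical face coordinate in pixels."""
    y = np.asarray(face_y, dtype=float)
    y = y - y.mean()
    spectrum = np.abs(np.fft.rfft(y))
    freqs = np.fft.rfftfreq(len(y), d=1.0 / fps)
    band = (freqs >= lo_spm / 60.0) & (freqs <= hi_spm / 60.0)
    return freqs[band][np.argmax(spectrum[band])] * 60.0

def estimate_bounce(face_y):
    """Peak-to-peak vertical travel, a rough proxy for running bounce."""
    y = np.asarray(face_y, dtype=float)
    return float(y.max() - y.min())
```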
[0060] At 504, the coaching parameters from 409 are processed. The
coaching module 203 takes the coaching parameters and the workout
programs to generate the guidance for the user to complete an
exercise. When the user has no workout history in the system, the
coaching parameters are derived from the most basic profile
information, which includes age, gender, and body weight as described
earlier. As the user exercises under the guidance, the system
tracks the user's exercise responses over time, and will include
these exercise responses when deriving the coaching parameters for
the coaching module 203 to use in guiding the user. An exercise
response from the user is derived at 505 by continuously monitoring
the user's vital signs and the exercise activities.
[0061] Once some exercise response data are accumulated over time,
the analytic module can assess a fitness level for the user. Because
the physical attributes of different users with different profiles
(e.g., different ages and genders) are all input for the fitness
level assessment, the analytic module 205 can obtain predicted
physical attributes of users with the same or similar profiles, e.g.,
in the same age group (e.g., 41-50 years old), of the same gender, or
in the same weight interval (e.g., 60-65 kg), and then assess the
fitness level of the user based on a comparison between the obtained
exercise response of the user and the predicted physical attributes
of other users with similar profiles. As more and more physical
attributes are input to the fitness level assessment model, the
accuracy of the assessment increases.
[0062] At 505, the coaching module takes the derived vital signs,
the activity information, the coaching parameters derived from the
user profile, the physical attributes of the user, and the workout
program, and generates a guidance for the user to complete the
workout program. A workout program includes one or more phases and
each phase defines one or more goals. The goals include but are not
limited to a duration, a distance, a physiological limit (e.g., a
heart rate), the number of repetitions, etc. The goal can also
be in the form of a virtual score derived from the tangible data
listed above. The coaching module 203 provides the guidance to the
user in achieving the goals within a phase. When the goal for a
phase is met or a time limit has passed, the coaching module 203
guides the user to move to the next phase.
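The phase and goal structure of a workout program described above might be modeled as below; the field names, the 600-second default time limit, and the example program are illustrative assumptions:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Phase:
    name: str
    goal_seconds: Optional[float] = None     # duration goal, if any
    goal_distance_m: Optional[float] = None  # distance goal, if any
    time_limit_s: float = 600.0              # move on after this regardless

@dataclass
class WorkoutState:
    elapsed_s: float = 0.0
    distance_m: float = 0.0

def phase_complete(phase, state):
    """A phase ends when any goal is met or its time limit has passed."""
    if phase.goal_seconds is not None and state.elapsed_s >= phase.goal_seconds:
        return True
    if phase.goal_distance_m is not None and state.distance_m >= phase.goal_distance_m:
        return True
    return state.elapsed_s >= phase.time_limit_s

# Example program: warm-up by time, main set by distance, cool-down by time.
program = [Phase("warm-up", goal_seconds=300),
           Phase("main set", goal_distance_m=3000),
           Phase("cool-down", goal_seconds=180)]
```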
[0063] The coaching module 203 generates in real-time a personal
fitness training guidance message for the user on the basis of the
current vital signs, the physical activity information and the
workout program. The guidance messages are instructions that clearly
describe how the user should change the current settings of the
exercise equipment (e.g., "Short break. Let's drop the speed to 3"),
the exercise behavior (e.g., "Try to speed up your cadence to 180"),
or a body form (e.g., "Keep your body straight up"), or statements
that instruct the user on what to do given the current exercise and
the physiological state of the user (e.g., "Just reached the target
zone. Let's keep it there"), or statements that encourage the user to
attain higher exercise performance (e.g., "Use the last 30 seconds
for a hard push, increase speed by half a mile or more. You can do
it! It is only for 30 seconds"). The guidance messages are
personalized for the
user and based on the predicted exercise responses that the
coaching module learns from the real-time and historical workout
data of the user and the general population. For example, the
physical activity information may show that the speed of the user is
slow and that the current heart rate of the user is still 10% below
the lower limit of the target heart rate zone (e.g., the fat burning
zone). The coaching module is designed to know that the user has
selected a weight loss program, and that exercising in the fat
burning zone for 20 minutes every day can help the user reach the
fitness goal in one month. In one embodiment, with the profile of the user,
the past workout history and general population data sets, the
coaching module is designed to predict that the user can reach the
fat burning zone if the user increases the treadmill speed by 0.5
mph. Then, the local data processor 102 generates a personal
fitness training guidance message to instruct the user to increase
the speed by 0.5 mph so that the heart rate can reach the desired
fat burning intensity zone. The coaching module also continuously
generates messages to encourage the user to keep exercising at
speed levels that maintain the heart rate for 20 minutes. This
makes the exercise much more effective and predictable in helping
the user achieve his goals.
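The zone-targeting example above can be reduced to a simple rule, sketched below. The message wording and the 0.5 mph adjustment step follow the example, but the rule itself is an illustrative assumption rather than the claimed prediction model:

```python
def zone_guidance(heart_rate, zone_low, zone_high, step_mph=0.5):
    """Toy rule: nudge the treadmill speed so the user's heart rate
    reaches and then stays inside the target zone (in bpm)."""
    if heart_rate < zone_low:
        return f"Increase the speed by {step_mph} mph to reach your target zone."
    if heart_rate > zone_high:
        return f"Short break. Let's drop the speed by {step_mph} mph."
    return "Just reached the target zone. Let's keep it there."
```

A fuller implementation would predict the heart-rate response to a speed change from the user's workout history rather than applying a fixed step.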
[0064] In addition, the exercise activities can be analyzed to
determine the exercise form of the user, such as the cadence, the
footfall, the lean, and the bounce, which can be recognized from the
captured images. The coaching module can have an exercise form
target for the user to follow. The system can generate an action
correction coaching parameter based on a comparison of the exercise
form target and the body form of the user. If the exercise form
target does not match the body form of the user, the action
correction message instructs the user to stop the current exercise
form and advises the user of another exercise form which best
matches the body form of the user. For example, it is well
recognized that an efficient cadence for running is close to 180
steps per minute, while most beginning runners have a cadence well
below this target. The system can guide the user to increase their
cadence if it is below the target, and so help them improve their
running efficiency.
[0065] The body form guidance can be done by analyzing the full
body motion images of the users, possibly using multiple RGB or
depth cameras so that a full 3D view of the user exercising can be
derived. This 3D view can be displayed so that the user can feel he
is exercising in front of a mirror. The system evaluates the
running form against some best-practice benchmark rules. For
example, the system can monitor the footfall position and guide the
user toward midfoot striking (foot landing right underneath the
body) as often as possible. The system can monitor posture, and
guide the user to have the correct slight forward leaning
posture.
[0066] The guidance message is provided to the user through the
output unit 103. If the output unit 103 is a screen, the output
unit 103 can display the personal fitness training guidance message
of the user on the screen. If the output unit 103 is a speaker, the
output unit 103 can output the personal fitness training guidance
message as speech.
[0067] While the user is exercising under the guidance of the
coaching module, the system will continuously monitor the exercise
activities, and vital signs at 506. By this continuous monitoring,
the system can derive the vital signs vs. exercise intensity
response, and at the same time use the measurements, along with
known population data sets, to determine various measures of the
user's performance, including his fitness level. The fitness level
can be quantified in metrics like VO2Max. The fitness level can be
measured based on vital sign responses relative to the exercise
intensity. One way to derive fitness level is to use the stable
heart rate of the user at a predetermined running speed and
incline. Alternatively, the same method plus the user's physical
attributes, such as age and gender, can be used. The fitness level
can be used to predict an exercise race performance. A good
training program should be based upon the current fitness level of
the user, and should adapt as the fitness of the user changes. The
analytic module 205 can compare the user's exercise response with
comparable users' historical exercise data. An updated fitness level
can be generated and added to the coaching parameter list.
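As an example of quantifying fitness level from heart-rate responses, the sketch below uses one published heart-rate-only estimate of VO2Max (the Uth et al. resting/maximum heart rate ratio formula) together with a simple peer comparison. The described system may use a different model, and the helper names are assumptions:

```python
def estimate_vo2max(hr_max, hr_rest):
    """Rough VO2Max estimate (ml/kg/min) from heart rates alone,
    per Uth et al.: VO2Max ~= 15.3 * HRmax / HRrest."""
    return 15.3 * hr_max / hr_rest

def fitness_percentile(user_stable_hr, peer_stable_hrs):
    """Fraction of comparable users whose stable heart rate at the same
    predetermined speed and incline is higher than this user's (a lower
    heart rate at equal intensity indicates better fitness)."""
    higher = sum(1 for hr in peer_stable_hrs if hr > user_stable_hr)
    return higher / len(peer_stable_hrs)
```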
[0068] At 508, after the user completes the workout program, the
user is shown the workout results. Based on the fitness performance
evaluation, the system can rank the user's fitness performance
against the general public. For example, based on the running pace
and the vital sign responses, the system can estimate the pace of
the user at various standard race distances (5K, 10K, half-marathon,
marathon, etc.). The system can also estimate the user's race
ranking according to published records of races. The user can see a
summary of their total workout efforts in terms of calories burnt,
and how much of the government-published weekly exercise guideline
it accounts for.
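The race-pace projection described above could, for example, use Riegel's endurance formula, a common published model for projecting a measured time to other distances; the text does not specify the model actually used:

```python
def predict_race_time(known_dist_m, known_time_s, target_dist_m, exponent=1.06):
    """Riegel's endurance formula: t2 = t1 * (d2 / d1) ** 1.06.
    Projects a measured effort to a standard race distance."""
    return known_time_s * (target_dist_m / known_dist_m) ** exponent
```

With this model, a 25-minute (1500-second) 5K projects to roughly a 52-minute 10K.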
[0069] At 509, the user indicates if he wants the system to store
his workout records. If the user permits, the workout records are
sent to the tracking module for storage at 510. The user input to
the system can be through a variety of methods, including but not
limited to a gesture input through the camera sub-system, a voice
input, or interactions via a touch screen.
[0070] Referring now to FIG. 5, it shows a systemic diagram 550
with exemplary data flows among different devices according to one
embodiment of the present invention. Besides a fitness equipment
552 (shown as a user using the equipment), a local processing unit
554 is designed to recognize the user from a certain set of images,
derive the vital signs of the user from another set of images and
detect activities from still another set of images. Based on these
obtained data, a guidance is generated by the local processing unit
554. Meanwhile, the workout record collected in the local
processing unit 554 is transmitted to a cloud computing device 556
that is provided to store records for all registered users. The
cloud computing device 556 can be programmed to compute various
statistics based on different groups (e.g., an age group, a gender
group, and a region group), where the statistics may be shared with
other fitness clubs or a third party 558 (e.g., an advertiser).
[0071] In addition, as shown in FIG. 5, a personal device (e.g., a
smartphone) 560 may be used to sign in the user when he is ready to
use the fitness equipment. The fitness goal of the user may be stored
in the personal device 560 and loaded to the local processing unit
554 to generate or modify the fitness training guidance.
[0072] The invention is preferably implemented by software, but can
also be implemented in hardware or a combination of hardware and
software. The invention can also be embodied as computer readable
code on a computer readable medium. The computer readable medium is
any data storage device that can store data which can thereafter be
read by a computer system. Examples of the computer readable medium
include read-only memory, random-access memory, CD-ROMs, DVDs,
magnetic tape, optical data storage devices, and carrier waves. The
computer readable medium can also be distributed over
network-coupled computer systems so that the computer readable code
is stored and executed in a distributed fashion.
[0073] The present invention has been described in sufficient
detail with a certain degree of particularity. It is understood by
those skilled in the art that the present disclosure of embodiments
has been made by way of examples only and that numerous changes in
the arrangement and combination of parts may be resorted to without
departing from the spirit and scope of the invention as claimed.
Accordingly, the scope of the present invention is defined by the
appended claims rather than the foregoing description of
embodiments.
* * * * *