U.S. patent application number 12/216540, filed on 2008-07-07 and published by the patent office on 2009-01-22, is for exercise systems in virtual environment. The invention is credited to Youngtack Shim.
Application Number: 12/216540
Publication Number: 20090023554
Family ID: 40265318
Publication Date: 2009-01-22

United States Patent Application 20090023554
Kind Code: A1
Inventor: Shim; Youngtack
Published: January 22, 2009
Exercise systems in virtual environment
Abstract
An exercise system includes at least two exercise modules and is
arranged to allow multiple users to perform exercises on, with, or
against the modules in different locations while performing at
least one preset task defined in a context of a story, a scenery, or
a video (or computer) game, each in turn preferably defined in a
virtual environment. The exercise system generally generates the
task of the story, scenery, or game in images of the virtual
environment and simulates the exercising users as simulated users in
such images, while allowing the users to manipulate the simulated
users for attaining a goal of the task based on features related
with the exercises performed by the users, or allowing the task to
manipulate operations of the modules based on the features. The
exercise system includes at least two output modules, each including a
visual unit to provide the images for the task to each user and an
olfactory or tactile unit to provide smell or tactile sensation to
the user, respectively.
Inventors: Shim; Youngtack (Port Moody, CA)
Correspondence Address: Youngtack Shim, 155 Aspenwood Drive, Port Moody, BC V3H 5A5, CA
Family ID: 40265318
Appl. No.: 12/216540
Filed: July 7, 2008
Related U.S. Patent Documents

Application Number | Filing Date | Patent Number
60/959,464 | Jul 16, 2007 |
60/959,564 | Jul 16, 2007 |
Current U.S. Class: 482/4
Current CPC Class: A63F 13/245 20140902; A63B 22/02 20130101; A63B 2225/20 20130101; A63F 13/80 20140902; A63B 22/04 20130101; A63F 13/795 20140902; A63B 2071/0658 20130101; A63B 2225/50 20130101; A63B 69/06 20130101; A63B 2071/0666 20130101; A63B 22/0076 20130101; A63B 2071/0638 20130101; A63B 2230/42 20130101; A63B 24/0087 20130101; A63B 24/0062 20130101; A63B 21/0628 20151001; A63B 69/16 20130101; A63B 2230/06 20130101; A63B 69/0028 20130101; A63B 2071/0625 20130101; A63B 2230/50 20130101; A63B 2230/65 20130101; A63B 71/0622 20130101; G16H 20/30 20180101; A63B 2230/30 20130101; A63B 22/0664 20130101; A63B 22/0023 20130101; A63B 2024/0096 20130101; A63B 24/0084 20130101; A63B 22/0605 20130101; A63B 2230/60 20130101; A63B 2220/806 20130101
Class at Publication: 482/4
International Class: A63B 24/00 20060101 A63B024/00
Claims
1. An exercise system which is configured to provide at least two
users with at least one preset task of at least one of a story, a
scenery, a video game, and a computer game each of which defines a
preset task goal and is provided in images of at least one virtual
environment, to allow said users to simultaneously perform
exercises, to relate at least one first feature of one of said
exercises with at least one second feature of another of said
exercises, and to one of directly and indirectly manipulate said
second feature at least partly based upon said first feature
comprising: a first standard exercise module corresponding to at
least one of a first, second, third, fourth, fifth, sixth, seventh,
eighth, and ninth exercise module each configured to allow a first
user to perform a first exercise at least one of thereon,
therewith, and thereagainst while consuming energy thereof during
said first exercise, wherein said first exercise module is
configured to allow said first user to perform said first exercise
at least one of on, with, and against at least one portion thereof,
wherein said second exercise module is configured to define at
least one preset load, to incorporate therein at least one
actuating part operatively coupling to said load and contacting at
least one body part of said first user, and to allow said user to
perform said first exercise by contacting said actuating part and
moving said actuating part against said load while consuming said
energy during said first exercise, wherein said third exercise
module is configured to include therein at least one track
translating along a preset direction and to allow said first user
to perform said first exercise of at least one of walking and
running on said track while consuming said energy during said first
exercise, wherein said fourth exercise module is configured to
include at least one rotation axis, to define at least one preset
load, to include therein at least one pedal coupling with said load
and rotating about said axis, and to allow said first user to
perform said first exercise of rotating said pedal against said
load while consuming said energy during said first exercise,
wherein said fifth exercise module is configured to include therein
at least one movable weight, and to allow said first user to
perform said first exercise of at least one of translating,
reciprocating, pivoting, rotating, and moving said weight while
consuming said energy during said first exercise, wherein said
sixth exercise module is configured to define at least one central
point, to define at least one preset load, to include therein at
least one lever coupling with said load and pivoting about said
point, and to allow said first user to perform said first exercise
of at least one of reciprocating, translating, pivoting, rotating,
displacing, and moving said lever about said point against said
load while consuming said energy during said first exercise,
wherein said seventh exercise module is configured to incorporate
therein at least one belt capable of enclosing at least one body
part of said first user therearound, and to allow said first user
to perform said first exercise of vibrating said body part while
consuming said energy during said first exercise, wherein said
eighth exercise module is configured to define a preset load, to
include therein at least one pad capable of coupling to said load
and at least one of moving and deforming in response to energy
supplied by said user thereonto, and to allow said first user to
perform said first exercise of at least one of translating,
reciprocating, rotating, pivoting, deforming, pushing, and pulling
at least a portion of said pad against said load while consuming
said energy during said first exercise, wherein said ninth exercise
module is configured to define at least one preset load, to
incorporate therein at least one handle operatively coupling with
said load, and to allow said first user to perform said first
exercise of at least one of translating, displacing, reciprocating,
rotating, pivoting, and moving said handle against said load while
consuming said energy during said first exercise, wherein said
first exercise defines a first type and a first extent, wherein
said first standard exercise module is disposed in a first
location, and wherein said first feature is related to at least one
of said task, said first user, said first exercise, and at least
one operation of said first standard exercise module; a second
standard exercise module corresponding to at least one of said
first, second, third, fourth, fifth, sixth, seventh, eighth, and
ninth exercise module each of which is configured to allow a second
user to perform a second exercise at least one of thereon,
therewith, and thereagainst while consuming energy thereof during
said second exercise, wherein said second exercise is configured to
define a second type and a second extent which are one of the same
as, similar to, and different from said first type and extent,
respectively, wherein said second exercise module is disposed in a
second location which is one of the same as and different from said
first location in which said first standard exercise module is
disposed, wherein said second feature is related to at least one of
said task, said second user, said second exercise, and at least one
operation of said second standard exercise module, and wherein said
second standard exercise module is configured to be operatively
coupled to said first standard exercise module through one of a
local network and a global network one of indirectly and directly
and to allow said first and second users to simultaneously perform
said exercises; at least one output module which includes at least
two visual units and at least one of at least one olfactory unit
and at least one tactile unit, wherein each of said visual units is
disposed in a preset disposition and arrangement and is configured
to display said images to each of said users, wherein said
olfactory unit is configured to provide smell of said virtual
environment to said user, and wherein said tactile unit is
configured to provide sensation of said virtual environment to said
user; and at least one control module which is configured to
operatively couple with said output module and at least one of said
standard exercise modules one of directly and indirectly, to
provide said task in said images of said virtual environment, to
display said images on said visual units, to assign at least one
preset goal to said task, to monitor at least one of said first
feature and second feature, to relate one of said first and second
features to another thereof at least partly based upon at least one
preset relation one of stored therein, generated thereby, and
supplied thereto by at least one of said users, and to perform
manipulation of one of said first and second features at least
partly based on another of said features, while providing at least
one of said smells and sensation, wherein said control module is
configured to provide said relation as well as to perform said
manipulation not only when said types and extents of said first and
second exercises are identical to each other but also when said
types and extents of said first and second exercises are not
identical to each other based on at least one of said first and
second features, thereby allowing said first and second users to
compete for accomplishing said task goal while simultaneously
performing said first and second exercises respectively in said
first and second locations.
2. The system of claim 1, wherein said relation relates said first
feature of a preset characteristic related with at least one of
said first user, first exercise, and operation of said first
standard exercise module with said second feature of said preset
characteristic related with at least one of said second user,
second exercise, and operation of said second standard exercise
module.
3. The system of claim 1, wherein said relation is configured to
relate said first feature defining one characteristic which is
related with at least one of said first user, first exercise, and
operation of said first standard exercise module with said second
feature defining a different characteristic which is then related
with at least one of said second user, second exercise, and
operation of said second standard exercise module.
4. The system of claim 3, wherein said control module is configured
to convert said first feature from a first unit into a second
different unit and to perform said manipulation based upon said
relation which is configured to compare said converted first
feature with said second feature.
5. The system of claim 1, wherein said relation is configured to
relate said first feature defining one characteristic which is
related with at least one of said first user, first exercise, and
operation of said first standard exercise module with said second
feature having said characteristic which is then related to at
least one of said second user, second exercise, and operation of
said second standard exercise module.
6. The system of claim 1, wherein said control module is further
configured to manipulate at least one of said first and second
features at least partly based upon another thereof, whereby said
users simultaneously proceed along said task for said goal while
simultaneously performing said exercises at least one of on, with,
and against said standard exercise modules which are incorporated
in said locations and whereby said control module manipulates said
another feature of one of said task and standard exercise modules
one of directly and indirectly at least partly based upon at least
one of said relation and monitored feature while communicating with
at least one of said visual units and standard exercise module
through one of said local and global networks encompassing said
locations.
7. The system of claim 1, wherein said first feature corresponds to
said feature of at least one of said first exercise and first user
and wherein said second feature corresponds to said feature of said
task, whereby said control module is configured to manipulate said
task at least partly based upon at least one of said exercises and
users.
8. The system of claim 1, wherein said first feature corresponds to
said feature of said task and wherein said second feature
corresponds to said feature of said operation, and wherein said
control module is configured to manipulate said operation of at
least one of said standard exercise modules at least partly based
on said task performed by at least one of said users.
9. The system of claim 1, wherein said control module is also
configured to simulate at least one of said users as at least one
simulated user, to incorporate said simulated user in said images
for said virtual environment, and to allow said at least one of
said users to perform said feature of said task by manipulating
said simulated user at least partly based on said feature of at
least one of said exercises and users.
10. The system of claim 1, wherein said control module is
configured to be disposed in one of said locations and to
communicate with at least one of said standard exercise modules
disposed in another of said locations for said manipulation one of
wirelessly and through wire.
11. An exercise system which is configured to connect a plurality
of locations, to incorporate at least one exercise module in each
of said locations, to define at least one preset task of at least
one of a story, a scenery, a video game, and a computer game each
defining a preset goal and provided in images for at least one
virtual environment, and to allow each of a plurality of users to
simultaneously perform exercises at least one of on, with, and
against each of said exercise modules in each of said locations
while competing with each other in said images for said goal of said
task at least partly based on said exercises performed by said
users comprising: a first exercise module which is disposed in a
first location and configured to allow a first user to perform a
first exercise while consuming energy thereof during said first
exercise, wherein a first feature is related with at least one of
said first user, first exercise, and at least one operation of said
first exercise module; a second exercise module which is disposed
in a second location different from said first location and configured to
allow a second user to perform a second exercise while consuming
energy thereof during said second exercise, wherein a second
feature is related with at least one of said second user, second
exercise, and at least one operation of said second exercise
module, and wherein said second exercise module is configured to
form an operative coupling to said first exercise module via one of
a local network and a global network one of indirectly and
directly, thereby allowing said first and second users to
simultaneously perform said exercises while maintaining said
coupling through said network; at least one output module which
includes at least two visual units and at least one of at least one
olfactory unit and at least one tactile unit, wherein each of said
visual units is disposed in a preset disposition and arrangement
and is configured to display said images to each of said users,
wherein said olfactory unit is configured to provide smell of said
virtual environment to said user, and wherein said tactile unit is
configured to provide sensation of said virtual environment to said
user; and at least one control module which is configured to
operatively couple with at least one of said output module and
exercise modules one of directly and indirectly, to provide said
task in said images, to display said images on said visual units,
to assign at least one goal to said task, to be disposed in at
least one of said first and second locations, to monitor at least
one of said first and second features, to relate one of said first
and second exercises to another based on at least one preset
relation, and to perform manipulation of at least one of said
features at least partly based on at least one another feature,
wherein said control module is configured to provide said relation
as well as to perform said manipulation not only when said types
and extents of said first and second exercises are identical to
each other but also when said types and extents of said first and
second exercises are not identical to each other based on at least
one of said first and second features, thereby allowing said first
and second users to compete for accomplishing said task goal while
simultaneously performing said first and second exercises
respectively in said first and second locations.
12. The system of claim 11, wherein said relation relates said
first feature of a first characteristic related with at least one
of said first user, first exercise, and operation of said first
exercise module with said second feature of said first
characteristic which is related with at least one of said second
user, second exercise, and operation of said second exercise
module.
13. The system of claim 11, wherein said relation is configured to
relate said first feature defining one characteristic related with
at least one of said first user, first exercise, and operation of
said first exercise module with said second feature defining a
different characteristic which is related with at least one of said
second user, second exercise, and operation of said second exercise
module.
14. The system of claim 13, wherein said control module is
configured to convert said first feature from a first unit into a
second different unit and to perform said manipulation based upon
said relation which is configured to compare said converted first
feature with said second feature.
15. The system of claim 11, wherein said relation is configured to
relate said first feature defining one characteristic which is
related with at least one of said first user, first exercise, and
operation of said first standard exercise module with said second
feature having said characteristic which is then related to at
least one of said second user, second exercise, and operation of
said second standard exercise module.
16. An exercise system which is configured to operatively connect a
plurality of locations via at least one of a local network and a
global network, to include at least one exercise module in each of
said locations, to define at least one preset task of at least one
of a story, a scenery, a video game, and a computer game each
defining a preset goal for said task and provided in images for at
least one virtual environment, and to allow each of a plurality of
users to simultaneously perform exercises at least one of on, with,
and against each of said exercise modules disposed in each of said
locations while competing with each other in said images for said task
goal at least partly based on said exercises performed by said
users comprising: a first exercise module which is configured to
define a first exercise type and a first exercise load and to allow
a first user to perform a first exercise while consuming energy
thereof during said first exercise; a second exercise module which
is configured to define a second exercise type and a second
exercise load and to also allow a second user to perform a second
exercise while consuming energy thereof during said second
exercise, wherein said second exercise module is configured to
operatively couple with said first exercise module via said network
one of indirectly and directly, thereby allowing said first and
second users to simultaneously perform said exercises while
pursuing said goal of said task; at least one output module which
includes at least two visual units and at least one of at least one
olfactory unit and at least one tactile unit, wherein each of said
visual units is disposed in a preset disposition and arrangement
and is configured to display said images to each of said users,
wherein said olfactory unit is configured to provide smell of said
virtual environment to said user, and wherein said tactile unit is
configured to provide sensation of said virtual environment to said
user; and at least one control module which is configured to
operatively couple with at least one of said output module and
exercise modules one of directly and indirectly, to define said
task goal, to provide said task in said images, to display said
images on said visual units, to monitor at least one of said first
and second types and loads as well as extents of said first and
second exercises each performed by each of said users, to simulate
said users into simulated users included in said images, to relate
at least one of said type, load, and extent of said first exercise
with at least one of those of said second exercise based on at
least one preset relation, and then to perform manipulation of at
least one of said simulated users in said images at least partly
based on at least one of said types, loads, and extents related to
each other by said relation, thereby allowing said users to compete
for accomplishing said task goal while simultaneously performing
said exercises in said locations regardless of whether said types
of said first and second exercises are identical to each other.
17. The system of claim 16, wherein said types of said first and
second exercises are identical to each other and wherein said
control module performs said manipulation at least substantially
based on at least one of said loads and extents of said
exercises.
18. The system of claim 16, wherein said types of said first and
second exercises are different from each other and wherein said
control module is configured to convert at least one of said
extents from one unit to another unit and to perform comparison of
said converted extent with another of said extents, thereby
performing said manipulation at least substantially based on said
comparison.
19. The system of claim 16, wherein said control module is
configured to perform said manipulation by manipulating said
simulated user in said images.
20. The system of claim 19, wherein said control module is
configured to perform said manipulation not only by manipulating
said simulated user in said images but also by manipulating said
operation of at least one of said exercise modules based on at
least one of said types, loads, and extents.
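The unit-conversion step recited in claims 4, 14, and 18 (converting a first feature from a first unit into a second unit, then comparing the converted feature against the second feature) can be sketched in code. The following is a hypothetical illustration only, not part of the claimed subject matter; the exercise types, calorie factors, and function names are all invented for clarity:

```python
# Invented calories-per-unit factors for two different exercise types,
# so that exercises of different types can be compared on a common scale.
CALORIES_PER_UNIT = {
    "treadmill_km": 65.0,  # assumed kcal burned per km of running
    "bicycle_km": 30.0,    # assumed kcal burned per km of cycling
}

def convert_extent(extent: float, unit: str) -> float:
    """Convert an exercise extent from its native unit into calories."""
    return extent * CALORIES_PER_UNIT[unit]

def compare_features(first_extent: float, first_unit: str,
                     second_extent: float, second_unit: str) -> int:
    """Compare two exercise extents after converting both to calories.

    Returns -1, 0, or 1 as the first user's converted extent is less
    than, equal to, or greater than the second user's.
    """
    a = convert_extent(first_extent, first_unit)
    b = convert_extent(second_extent, second_unit)
    return (a > b) - (a < b)

# A runner who covered 2 km compared against a cyclist who covered 5 km:
# 2 * 65 = 130 kcal versus 5 * 30 = 150 kcal, so the runner trails.
print(compare_features(2.0, "treadmill_km", 5.0, "bicycle_km"))  # -> -1
```

In this sketch the common unit is calories, but any shared characteristic (speed, heart rate, energy expenditure) could serve as the comparison basis the claims contemplate.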
Description
CROSS-REFERENCES TO RELATED APPLICATIONS
[0001] The present application relates to various patent
applications which have been filed with the U.S. Patent and Trademark
Office by the same Applicant, such as, e.g., a U.S. Provisional
Patent Application which is entitled "Local exercise systems with
compact visual units," was filed on Jul. 16, 2007, and bears the
Serial Number U.S. Ser. No. 60/959,464; a second U.S. Provisional
Patent Application which is entitled "Local exercise systems with
full-size visual units," was filed on Jul. 16, 2007, and bears the
Serial Number U.S. Ser. No. 60/959,564; and a third U.S.
Provisional Patent Application which is entitled "Global exercise
systems and methods," was filed on Aug. ______, 2007, and bears the
Serial Number U.S. Ser. No. 60/9______. All of the above
Applications will be referred to as the "co-pending Applications"
hereinafter, and all of the Applications are incorporated
herein in their entirety by reference.
FIELD OF THE INVENTION
[0002] An exercise system includes at least two exercise modules
and is arranged to allow multiple users to perform exercises on,
with, or against the modules in different locations while performing
at least one preset task defined in a context of a story, a scenery,
or a video (or computer) game, each in turn preferably defined in a
virtual environment. The exercise system generally generates the
task of the story, scenery, or game in images of the virtual
environment and simulates the exercising users as simulated users in
such images, while allowing the users to manipulate the simulated
users for attaining a goal of the task based on features related
with the exercises performed by the users, or allowing the task to
manipulate operations of the modules based on the features. The
exercise system includes at least two output modules, each including a
visual unit to provide the images for the task to each user and an
olfactory or tactile unit to provide smell or tactile sensation to
the user, respectively.
BACKGROUND OF THE INVENTION
[0003] A recent trend in many civilized countries is that their
populations tend to get obese. With more nutritious foods available
and less time for exercise, excess nutrients are converted into
cholesterol and stored in fat cells. Since obesity is related
to high blood cholesterol and various diseases, not only
individuals but also governments focus on reducing the obese
population.
[0004] As a result, it is not uncommon to find people engaging in
various exercises. Some people jog, whereas others choose to go to
gyms and to exercise on, with, or against various exercise equipment
provided there. Such exercise or fitness equipment is generally
intended to improve and/or enhance the muscle tone of a user, to
increase the muscle mass and volume of the user, to force or
facilitate the user to reduce his or her weight, to increase the
physical stamina of the user, and the like.
[0005] Although conventional exercise or fitness equipment is
effective in helping the user burn his or her excess energy, it
typically requires its user to spend a fair amount of time
therewith. For example, the user may consume a piece of cake causing
a surge of hundreds of calories within a minute, but she or he has
to run on a treadmill for at least an hour to burn those excess
calories. In addition, the user has to engage in various aerobic
and non-aerobic exercises for a prolonged period of time to improve
or enhance the muscle tone and to increase the muscle mass and
volume, not to mention reducing his or her weight.
Accordingly, patience and endurance are vital virtues for the users
of such exercise or fitness equipment. In order to help the user
exercise for a proper duration, gyms offer a variety of
amenities such as TVs and DVD players to keep the user from
getting bored. Other users choose to carry their
radios or audio players and listen to music while performing
exercise. Whatever they may resort to, prior art exercise or
fitness equipment is, however, of little or at most limited value
in eliminating the boredom of the user during exercise.
[0006] Various fitness or exercise equipment has been suggested to
alleviate such problems, where examples of such equipment have been
disclosed in U.S. Pat. Nos. 5,322,490 and 5,425,691 to M. A. van
der Hoeven entitled "Stepping and sliding exerciser," U.S. Pat. No.
6,013,007 to G. M. Root and F. Hoorn entitled "Athlete's GPS-based
performance monitor," U.S. Pat. No. 6,106,297 to E. Pollak and S.
Vaquerizo entitled "Distributed interactive simulation exercise
manager system and method," U.S. Pat. No. 6,159,131 to L. Pfeffer
entitled "Fitness triage system and method," U.S. Pat. No.
6,302,789 B2 to T. Harada and K. Shimizu which is entitled
"Pedometer with game mode," U.S. Pat. Nos. 6,312,363 B1 and
6,626,799 B2 both issued to S. R. Watterson et al. entitled "System
and methods for providing an improved exercise device with
motivational programming," U.S. Pat. No. 6,336,891 B1 to R.
Fedrigon et al. which is entitled "Interactive exercise pad
system," U.S. Pat. No. 6,453,111 B1 to J. H. Sklar et al. entitled
"Interactive workstation for creating customized, watch and do
physical exercise programs," U.S. Pat. No. 6,468,086 B1 to S.
Brady-Koontz which is entitled "Method of display of video images
of exercises," U.S. Pat. No. 6,561,952 B2 issued to M. C. Wu
entitled "Turning control device for a virtual stationary bike,"
U.S. Pat. No. 6,590,536 B1 to C. A. Walton which is entitled "Body
motion detecting system with correction for tilt of accelerometers
and remote measurement of body position," U.S. Pat. No. 6,604,138
B1 issued to L. D. Virine and T. G. Simpson entitled "System and
method for providing demographically targeted information," U.S.
Pat. No. 6,620,078 B2 to L. Pfeffer entitled "Fitness triage system
and nutrition gets personal," U.S. Pat. No. 6,635,013 B2 to L.
Pfeffer entitled "Fitness triage system and exercise gets
personal," U.S. Pat. No. 6,643,385 B1 to M. J. Bravomalo entitled
"System and method for weight-loss goal visualization and planning
and business method for use therefor," U.S. Pat. No. 6,656,091 B1
to K. G. Abelbeck et al. entitled "Exercise device control and
billing system," U.S. Pat. No. 6,613,000 B1 to D. J. Reinkensmeyer
et al. which is entitled "Method and apparatus for mass-delivered
movement rehabilitation," U.S. Pat. No. 6,671,736 B2 to L. D.
Virine and T. G. Simpson which is entitled "System and method for
providing demographically targeted information," U.S. Pat. No.
6,672,991 B2 issued to S. M. O'Malley entitled "Guided
instructional cardiovascular exercise with accompaniment," U.S.
Pat. No. 6,749,536 B1 to S. M. Cuskaden and A. G. Evans entitled
"Exercising using public communication network," U.S. Pat. No.
6,749,537 B1 to P. L. Hickman entitled "Method and apparatus for
remote interactive exercise and health equipment," U.S. Pat. No.
6,786,848 B2 issued to A. Yamashita entitled "Exercise assisting
method and apparatus implementing such method," U.S. Pat. No.
6,796,927 B2 issued to M. Toyama entitled "Exercise assistance
controlling method and exercise assisting apparatus," U.S. Pat. No.
6,852,069 B2 to S. H. Park which is entitled "Method and system for
automatically evaluating physical health state using a game," U.S.
Pat. No. 6,898,411 B2 to S. G. Ziv-el et al. entitled "Method and
system for online teaching using web pages," U.S. Pat. No.
6,918,860 B1 to N. H. Nusbaum entitled "Exercise bicycle virtual
reality apparatus," U.S. Pat. No. 6,921,351 B1 to P. L. Hickman and
M. L. Gough entitled "Method and apparatus for remote interactive
exercise and health equipment," U.S. Pat. No. 6,997,853 B1 to S. M.
Cuskaden and A. G. Evans entitled "Exercising using a public
communication network," U.S. Pat. No. 7,022,048 B1 to John
Fernandez and Juan Fernandez entitled "Video fitness machine," U.S.
Pat. No. 7,074,162 B2 to P. Kuo entitled "Exercise device," U.S.
Pat. No. 7,307,241 B1 to P. Kuo entitled "Exercise device," U.S.
Pat. No. 7,335,134 B1 to R. LaVelle, entitled "Exercise and game
controller apparatus and method," U.S. Pat. No. 7,347,779 B2 to R.
James-Herbert entitled "Computer game controller," and the like.
However, none of such equipment disclosed in the above patents, nor
any combination thereof, has completely solved the above
problem.
[0007] Therefore, there is a need for an exercise system capable of
allowing the users to engage in exercises in geographically
different locations without getting bored. To this end, there is a
need for an exercise system providing images of a task of a story,
a scenery or a video (or computer) game in a preset viewpoint such
that the users may engage in the task while performing exercises
on, with or against multiple exercise modules also disposed in
different locations. There is a need for an exercise system for
providing the images of the task of the story, scenery or game on
multiple visual units each disposed in a single or multiple view
angles of the exercising users. There is a need for the exercise
system including multiple exercise modules which are disposed in
geographically different locations and communicate with each other
either directly, indirectly through another module of the system or
through an external provider. There is a need for the exercise
system capable of providing auditory, olfactory, and/or tactile
features of the task each synchronized with the images of the task.
There also is a need for the exercise system capable of
manipulating at least one feature of the task of the story, scenery
or game based upon at least one other feature of the exercise or,
in the alternative, manipulating at least one feature of at least
one operation of the exercise system based upon at least one
feature of the task.
SUMMARY OF THE INVENTION
[0008] The present invention generally relates to an exercise
system providing users who perform exercises on, with or against
exercise modules of the system in different locations with a task
which is defined as a story, scenery or video (or computer) game in
a virtual environment. More particularly, the present invention
relates to the exercise system for generating the task of the
story, scenery or game in images of the virtual environment,
simulating the users as simulated users in such images, and
allowing the users to manipulate the simulated users for attaining
a goal of the task based on features related with the exercises or,
alternatively, allowing the task to manipulate operations of the
exercise modules based upon such features. The system includes at
least two visual units for providing such images for the task to
each exercising user. The present invention also relates to various
methods of and processes for providing such images of the task to
the exercising users and allowing such users to manipulate the task
or allowing the task to manipulate the exercise based on such
features.
[0009] The exercise systems of this invention may be provided in
various embodiments. For example, the exercise system may be
fabricated as an assembly of multiple exercise modules each
disposed in different locations and providing at least one preset
exercise, at least one output module for providing the virtual
environment (i.e., images, optional sounds, smell or sensations,
and the like) of the task of a story, a scenery or a game to
multiple users who perform the same or different exercises in
different locations, at least one control module for manipulating
operations of the output and exercise modules or the rest of the
system, and the like. In another example, the system may be
provided as an add-on assembly of the output and control modules
operatively coupling with conventional exercise or fitness modules
disposed in different locations and providing the virtual
environment with the images, optional sounds, smells or sensations
to the users while manipulating the operations of the exercise
modules. In another example, the exercise system may include such
exercise and control modules, where the latter may operatively
couple to a prior art audiovisual, olfactory or tactile output
device of the users and use visual, auditory, olfactory or tactile
capability of the device to display the images, play such
sounds, give off smells, or generate sensations for the task,
where the device may be a portable or stationary image display
device (e.g., a TV, a monitor, a palm device or DVD player with a
display panel, a game console with a display panel, a communication
device with a display panel, and so on), or a portable or
stationary audio device (e.g., a sound-generating device with a
speaker, a CD player with a speaker, a communication device with a
speaker, a game console with a speaker, and the like).
[0010] Therefore, a primary objective of the present invention is
to provide an exercise system with multiple exercise modules
disposed in different (or geographically separate) locations and
on, with or against each of which each user simultaneously performs
physical exercises while participating in a preset task of a story,
a scenery or a video (or computer) game each provided in images
(and optional sounds, smells, and/or sensations) of a virtual
environment and each defining at least one preset goal. Therefore,
a related objective of this invention is to include in the exercise
system multiple compact or full-size visual units which may display
the images for the task for each user and to allow the users to
view the images on the visual units while simultaneously performing
the same or different exercises. Another related objective of this
invention is to generate the task of the story, scenery or game
each incorporating therein at least one simulated user which
simulates at least one feature of at least one of the exercising
users. Another related objective of this invention is to provide
the task of the story, scenery or game each defining visual,
auditory, olfactory or tactile feature at least one of which may be
manipulated by various features of the users, exercises performed
by the users, or operations of the system. Another related
objective of this invention is to provide the system which can
manipulate at least one operation of the exercise modules based on
at least one feature of the task of the story, scenery or game.
Another related objective of this invention is to enable the
exercise system with at least two exercise modules to transfer at
least one feature of the task, users, exercises or operations from
one to another of the exercise modules via a local or global
network in order to allow such users to participate in the task
while simultaneously performing the same or different exercises.
[0011] Another objective of the present invention is to provide an
exercise system on, against, and/or with which the users
simultaneously perform physical exercises while simulating each of
such users as at least one simulated user and then incorporating
the simulated users into a preset task of a story, a scenery or a
video (or computer) game generated in images of the virtual
environment and defining at least one preset goal. Accordingly, a
related objective of this invention is to generate the simulated
users by simulating at least one feature of each user such that at
least one feature of each simulated user in such images is related
to the feature of each user. Another related objective of this
invention is to allow such users to manipulate the simulated users
based on at least one feature of such users, exercises, or
operations of the system so that the users manipulate at least one
feature of the task of the story, scenery or game into which the
simulated users are included. Another related objective of this
invention is to allow the task to manipulate the simulated users so
that the system may manipulate at least one of its operations based
on at least one feature of such simulated users, thereby requiring
the users to perform the exercises at least one feature of which
may be decided by at least one of the simulated users. Another
related objective of this invention is to arrange the system to
manipulate at least one feature of the task between at least two
exercise modules so that at least two users may compete with each other
in the task while simultaneously performing the same or different
exercises with, on, or against each of the exercise modules
disposed in the same or different locations.
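The two-way coupling described above (user features drive the simulated user, and the task in turn drives the exercise modules) can be illustrated with a minimal Python sketch. All names and the particular mappings (belt speed to avatar speed, heart rate to stamina, a virtual hill raising a load) are illustrative assumptions for exposition, not part of the disclosed system.

```python
from dataclasses import dataclass

@dataclass
class SimulatedUser:
    """Avatar representing one exercising user inside the task images."""
    speed: float    # avatar speed in the virtual environment (m/s)
    stamina: float  # avatar stamina in [0, 1], depleted as the user tires

def update_simulated_user(avatar: SimulatedUser,
                          belt_speed_kmh: float,
                          heart_rate_bpm: float) -> SimulatedUser:
    """Forward direction: map monitored exercise features onto the avatar."""
    avatar.speed = belt_speed_kmh / 3.6            # km/h -> m/s in the scene
    avatar.stamina = max(0.0, 1.0 - heart_rate_bpm / 200.0)
    return avatar

def required_load(avatar: SimulatedUser, hill_grade: float) -> float:
    """Reverse direction: the task decides a feature of the exercise,
    e.g. a virtual hill sets the treadmill incline (percent), scaled
    up as the avatar's stamina drops."""
    return hill_grade * (1.0 + (1.0 - avatar.stamina))
```

In this sketch the same avatar object mediates both directions of manipulation, which is one simple way to realize the mutual coupling between user, simulated user, and task.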
[0012] Another objective of the present invention is to provide an
exercise system on, against, and/or with which multiple users
simultaneously perform physical exercises while participating in a
task of a story, a scenery or a video (or computer) game and
proceeding therethrough so as to attain a preset goal of the task.
Accordingly, a related objective of this invention is to provide
the story, scenery or game for the task while relating at least one
feature of the images for the task to at least one feature of the
exercises, users, and/or operation of the exercise system so that
the latter may manipulate the feature of the images. Another
related objective of this invention is to simulate each of the
users into at least one simulated user and include the simulated
users as a part of the images while manipulating at least one
feature of such simulated users based upon at least one feature of
the exercises, users, and/or operations of the system. Another
related objective of this invention is to arrange the system to
manipulate at least one feature of the task between at least two
exercise modules so that at least two users may compete with each other
in the task of the game while simultaneously performing the same or
different exercises on, with or against each of the exercise
modules.
[0013] Another objective of the present invention is to allow
multiple users exercising on, with or against multiple exercise
modules of an exercise system to simultaneously participate in a
preset task of a story, a scenery or a video (or computer) game
while manipulating at least one operation of the system based on
the task. Therefore,
a related objective of this invention is to provide the task while
relating at least one feature of such images for the game with at
least one feature of the operation of the system such that the
former may manipulate at least one operation of the system. Another
related objective of this invention is to simulate each user into
at least one simulated user and then incorporate such simulated users as a part of
the task while manipulating at least one feature of the operations
of the system based upon at least one feature of the task. Another
related objective of this invention is to arrange the exercise
system to manipulate at least one feature of the task between at
least two exercise modules so that at least two users compete with each
other in the task of the game while simultaneously performing the
same or different exercises on, with or against each of the
exercise modules.
[0014] Another objective of the present invention is to provide
communication between multiple users exercising on, with or against
multiple exercise modules while allowing such users to participate
in a task of a story, scenery or video (or computer) game and to
compete with each other in images (or other features of a virtual
environment) for the task. Accordingly, a related objective of this
invention is to arrange the system to monitor at least one feature
of one of the users, at least one feature of one of such exercises,
at least one feature of at least one of the exercise modules, or at
least one feature of the task through the local or global network.
Another related objective of this invention is to monitor the
features of the users, the features of the exercises, the features
of operations of such exercise modules or the features of the task
through the local or global network. Another related objective of
this invention is to transfer the feature monitored in one location
to another location without altering the monitored feature (i.e., a
simple transfer). Another related objective of this invention is to
transfer the feature monitored in one location to another while
altering or converting the feature based on a preset relation
(i.e., an equivalent conversion), where the relation is defined
between at least one feature of the exercises and that of the task,
between at least one feature of the exercises and the goal and/or
stages of the task, between at least one feature of such users and
that of the task, and/or between at least one feature of such users
and the goal and/or stages of the task, and where the relation may
be defined between at least one feature of the operation and at
least one feature of the task, and/or between at least one feature
of the operation and the goal and/or stages of the task.
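The distinction drawn above between a "simple transfer" and an "equivalent conversion" can be sketched as follows. The function and the example relation (equating pedal cadence on a stationary bicycle with belt speed on a treadmill so users on different module types can compete in one task) are hypothetical illustrations, not a definitive implementation.

```python
def transfer_feature(value, relation=None):
    """Transfer a monitored feature from one location to another.

    With no preset relation, the value is forwarded unchanged
    (a "simple transfer"); with a relation, the value is altered
    or converted before delivery (an "equivalent conversion")."""
    if relation is None:
        return value
    return relation(value)

# Hypothetical preset relation: convert pedal cadence (rpm) on one
# module into an equivalent belt speed (km/h) on another module type.
cadence_to_speed = lambda rpm: rpm * 0.12
```

A simple transfer would then be `transfer_feature(90)`, while `transfer_feature(100, cadence_to_speed)` performs the equivalent conversion before the feature reaches the other location.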
[0015] In all of such objectives, the system may be arranged to
provide the virtual environment which not only provides the images
for the task but also generates the sounds, smells, or sensations
for the task, thereby providing the visual feature as well as the
optional auditory, olfactory or tactile features. The system may
then manipulate at least one feature of the images, sounds, smells,
or sensations for the task of the story, scenery or game at least
partly based on other features of the users, exercises, or
operation of the exercise modules. Alternatively, the system may
manipulate at least one feature of operations of the exercise
modules at least partly based on the task, users, or performed
exercises.
[0016] Various exercise systems of this invention may be
constructed in various arrangements. For example, the exercise
system may have at least two exercise modules on, with or against
which the users simultaneously perform such same or different
exercises. In another example, the system may include a single
exercise module in one location and on, with or against which one
user performs the exercise, and may then couple with an external
exercise module in a different location and on, with or against
which another user performs the exercise.
[0017] Various apparatus, method, and process aspects of such
exercise systems and embodiments thereof are now enumerated. It is
appreciated, however, that following system, method, and process
aspects of the present invention may also be embodied in many other
different modes and, therefore, should not be limited to such
aspects and their embodiments which are to be set forth herein.
Rather, various exemplary aspects and their embodiments set forth
herein are provided so that this disclosure is thorough and
complete, and fully conveys the scope of the present invention to
one of ordinary skill in the relevant art.
[0018] In one aspect of the present invention, an exercise system
is arranged to provide at least two users with at least one preset
task of a story, a scenery, and/or a video (or computer) game each
of which is provided in images of at least one virtual environment,
to allow such users to simultaneously perform exercises (or to
allow the user to perform such exercises in a delayed mode), and to
directly or indirectly manipulate at least one feature of the
preset task at least partly based upon at least one feature of at
least one of the exercises performed by the users.
[0019] In one exemplary embodiment of this aspect of the invention,
an exercise system includes a first standard exercise module, at
least one of a second and third standard exercise modules, at least
one first output module, and at least one first control module. The
first standard exercise module may be one of a first exercise
module through a ninth exercise module, where the first exercise
module is arranged to allow the user to perform exercise on, with,
or against at least one portion thereof while consuming energy of
its user during the exercise, while the second exercise module is
arranged to define at least one preset load, to include at least
one actuating part capable of coupling with the load and contacting
at least one body part of the user, and then to allow the user to
perform the exercise by contacting the actuating part and by moving
such a part against the load while consuming energy thereof during
the exercise. The third exercise module may be arranged to include
at least one track capable of translating in a preset direction and
to allow the user to perform the exercise of walking or running
over the track while consuming energy thereof during the exercise,
while the fourth exercise module is arranged to define at least one
rotation axis, to define at least one preset load, to include at
least one pedal coupling with the load and rotating about the axis,
and to allow the user to perform the exercise of rotating the pedal
against such a load while consuming energy thereof during its
exercise. The fifth exercise module may be arranged to include at
least one movable weight, and to allow its user to perform its
exercise of pivoting, translating, reciprocating, rotating or
moving the weight while consuming energy thereof during the
exercise, while the sixth exercise module is arranged to define at
least one central point, to define at least one preset load, to
include at least one lever coupling with the load and pivoting
about the point, and to allow the user to perform the exercise of
reciprocating, translating, pivoting, rotating, displacing or
moving such a lever about the point against the load while
consuming energy thereof during the exercise. The seventh exercise
module is arranged to include at least one belt capable of
enclosing at least one body part of the user therearound, and then
to allow the user to perform the exercise of vibrating the body
part while consuming energy thereof during the exercise, while the
eighth exercise module is arranged to define a preset load, to
include at least one pad capable of coupling with the load and
moving or deforming in response to energy supplied thereto by the
user, and then to allow the user to perform the exercise of
translating, reciprocating, rotating, deforming, pivoting, pushing,
or pulling at least a portion of the pad against the load while
consuming energy thereof during the exercise. The ninth exercise
module may be arranged to define at least one preset load, to
include at least one handle coupling with the load, and to allow
the user to perform the exercise of translating, reciprocating,
rotating, pivoting, displacing, or moving the handle against such a
load while consuming energy of the user during the exercise.
Regardless of its configuration and operation, the first standard
exercise module is disposed in a first location, while the second
standard exercise module is one of the first to ninth exercise
module, but disposed in a second location which is different (or
geographically separate or apart) from the first location, whereby
different users may simultaneously perform such exercises of
different types and of the same, similar or different extents. The
third standard exercise module is one of the first through ninth
exercise modules and disposed in the second location, whereby
different users may simultaneously perform the exercises of the
same or similar types and of the same, similar or different
extents. The output module may include at least two or full-size
visual units each of which is provided in a disposition and
arrangement for displaying an entire portion of each image in a
single view angle of each user and within a viewable distance so
that each user may simultaneously view the entire portion of each
image displayed on each visual unit while performing each exercise
on, with or against each of the above standard exercise modules (to
be referred to as the "first output module" hereinafter). The
control module is arranged to operatively couple with at least one
of the output module and standard exercise modules directly or
indirectly, to provide the task in the images of the virtual
environment, to display the images on the visual units, to assign
at least one preset goal to the task, to monitor at least one
feature of the exercises provided by the standard exercise modules,
the users simultaneously performing such exercises, and/or at least
one operation of the standard exercise modules, and to relate the
exercises to each other based on at least one preset relation (to
be referred to as the "first control module" hereinafter). The
first control module may also be arranged to manipulate at least
one feature of the images at least partly based on the relation and
at least partly based upon at least one feature of such exercises,
users, or operation, whereby the users simultaneously proceed to
attain the task goal while simultaneously performing the exercises
on, with, and/or against the standard exercise modules which are
provided in the same or different locations and whereby the control
module directly or indirectly manipulates the task feature at least
partly based on the relation or upon the feature of at least one of
such exercises, users, and operation while communicating with the
visual units or standard exercise modules via a local or global
network encompassing such locations (to be referred to as the
"first control functions" hereinafter).
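The "first control functions" described above (monitor a feature of each exercise, relate the exercises through a preset relation, and manipulate a feature of the images accordingly) can be sketched in Python. The class, the `read_feature()` callable on each module, and the choice of relative avatar position as the manipulated image feature are assumptions made for illustration only.

```python
class FirstControlModule:
    """Sketch of a control module that monitors exercise features and
    manipulates a feature of the task images based on a preset relation."""

    def __init__(self, relation):
        self.relation = relation   # preset relation applied to each exercise

    def monitor(self, modules):
        """Poll each standard exercise module (possibly over a network)
        for one feature; here each module exposes a read_feature() call."""
        return {m.location: m.read_feature() for m in modules}

    def manipulate_images(self, features):
        """Derive one feature of the images, e.g. the relative positions
        of the simulated users, from the related monitored features."""
        related = {loc: self.relation(v) for loc, v in features.items()}
        total = sum(related.values()) or 1.0
        return {loc: v / total for loc, v in related.items()}
```

In this sketch, two users whose related features are 6 and 2 would occupy relative positions 0.75 and 0.25 in the shared images, so each user's exercise output directly shapes the task displayed on every visual unit.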
[0020] In another exemplary embodiment of this aspect of the
invention, another exercise system may include the first standard
exercise module and at least one of the second and third standard
exercise modules, at least one output module, and the first control
module which may perform the first control functions. The output
module may include at least two visual units each provided in a
disposition and an arrangement for displaying different portions of
each of the images in each of multiple view angles of each user but
within a viewable distance therefrom so that each user
simultaneously views each portion of each image displayed on each
visual unit one at a time while performing each exercise on, with
or against each standard exercise module (to be referred to as the
"second output module").
[0021] In another exemplary embodiment of this aspect of the
invention, another exercise system may include the first standard
exercise module and at least one of the second and third standard
exercise modules, at least one output module, and the first control
module which may be arranged to simulate at least one of the users
as at least one simulated user included in the images for such one
of the users, and to manipulate at least one feature of the
simulated user of the images at least partly based on the relation
and also at least partly based upon at least one feature of such
exercises, users or operation, whereby the users may proceed along
the task for the task goal while simultaneously performing the
exercises on, with or against the standard exercise modules to be
disposed in the different locations and whereby the control module
may directly or indirectly manipulate the feature of the simulated
user of the task at least partly based upon the relation and the
feature of the exercises, users or operation while communicating
with at least one visual unit and/or standard exercise module via a
local or global network encompassing the locations (to be referred
to as the "second control functions"). The output module may
include at least two visual units each of which defines a preset
configuration and is provided in an arrangement and disposition for
displaying an entire portion of each image in a single view angle
of each user and in a viewable distance therefrom due to the
configuration, disposition, or arrangement so that each user
simultaneously views the entire portion of each image displayed on
each visual unit while performing each exercise on, with or against
each standard exercise module (to be referred to as the "third
output module" hereinafter).
[0022] In another exemplary embodiment of this aspect of the
invention, another exercise system may include the first standard
exercise module and at least one of the second and third standard
exercise modules, at least one output module, and the first control
module which performs the second control functions. The output
module may include at least two visual units each having a preset
configuration and provided in a disposition and an arrangement to
display different portions of each image in each of multiple view
angles of each user but also in a viewable distance therefrom so
that each user may simultaneously view each portion of each image
displayed on each visual unit sequentially (or one at a time) due
to the configuration, disposition, or arrangement while performing
each exercise on, against or with each standard exercise module (to
be referred to as the "fourth output module").
[0023] In another aspect of the present invention, an exercise
system is arranged to provide at least two users with at least one
preset task of a story, scenery or video (or computer) game in
images of at least one virtual environment, to allow the users to
simultaneously perform exercises (or to perform such exercises in a
delayed mode), and then to directly or indirectly manipulate at
least one feature of at least one of such exercises at least partly
based on at least one feature of the task.
[0024] In one exemplary embodiment of this aspect of the invention,
an exercise system includes the first standard exercise module, at
least one of such second and third standard exercise modules, the
first output module, and the first control module which is arranged
to manipulate at least one feature of the operation at least partly
based on the relation and also on at least one feature of the
images of the task, whereby the users simultaneously proceed along
the task for the task goal while simultaneously performing such
exercises on, with or against the standard exercise modules which
are disposed in different locations and at least one feature of
which is arranged to be manipulated either indirectly or directly
by the control module at least partly based on the task performed
by at least one of the users while communicating with the visual
units or standard exercise modules via a local or global network
which is to encompass those locations (to be referred to as the
"third control functions").
[0025] In another exemplary embodiment of this aspect of the
invention, another exercise system may include the first standard
exercise module and at least one of the second and third standard
exercise modules, the second output module, and the first control
module performing the third control functions.
[0026] In another exemplary embodiment of this aspect of the
invention, another exercise system may include the first standard
exercise module and at least one of the second and third standard
exercise modules, the third output module, and the first control
module. Such a first control module may also be arranged to
simulate at least one of the users as at least one simulated user
included in the images for such a user, and to manipulate the
feature of the operation at least partly based on the relation and
on at least one feature of the images for the task, whereby the
users simultaneously proceed along the task for the task goal while
simultaneously performing the exercises on, with or against the
standard exercise modules and at least one feature of which is
arranged to be directly or indirectly manipulated by the control
module at least partly based on the simulated user while
communicating with the visual units or the standard exercise
modules via a local or global network which encompasses such
locations (to be referred to as the "fourth control
functions").
[0027] In another exemplary embodiment of this aspect of the
invention, another exercise system may include the first standard
exercise module and at least one of the second and third standard
exercise modules, the fourth output module, and the first control
module performing the fourth control functions.
[0028] In another aspect of the present invention, an exercise
system is arranged to provide at least two users with at least one
preset task of a story, scenery or video (or computer) game in
images of at least one virtual environment, to allow such users to
simultaneously perform exercises defining the same, similar or
different types and extents (or to perform exercises in a delayed
mode), to relate at least one first feature of the task, exercises,
users or operation of the system to at least one second feature of
at least one other of the task, exercises, users or operation,
and to directly or indirectly manipulate one of the first and
second features at least partly based on the other thereof.
[0029] In one exemplary embodiment of this aspect of the invention,
an exercise system may include the first standard exercise module
as well as at least one of the second and third standard exercise
modules, where at least one of the first and second (or third)
standard exercise modules provides the operation defining such a
feature of the system. The system may also include the first output
module, and at least one control module which is arranged to
operatively couple with at least one of the output module and
standard exercise modules directly or indirectly, to provide the
task in such images of the virtual environment, to display the
images on the visual units, to assign at least one goal to the
task, to monitor at least one of the first and second features, to
relate one of the first and second features to another thereof at
least partly based upon at least one preset relation which may be
stored therein, generated thereby, supplied by at least one of the
users, and the like (to be referred to as the "second control
module"). The second control module may be arranged to manipulate
at least one of the first and second features at least partly based
on another thereof, whereby the users may simultaneously proceed
along the task for the goal while simultaneously performing the
exercises on, with or against the standard exercise modules
disposed in different locations, whereby the control module
directly or indirectly manipulates another task feature or standard
exercise modules at least partly based on the relation or monitored
feature while communicating with the visual units or standard
exercise modules via a local or global network linking those
locations (to be referred to as the "fifth control functions").
[0030] In another exemplary embodiment of this aspect of the
invention, another exercise system may include the first standard
exercise module and at least one of such second and third standard
exercise modules, where at least one of the first and second (or
third) standard exercise modules provides the operation which
defines such a feature of the system. The system may also have the
second output module, and the second control module which may also
perform the fifth control functions.
[0031] In another exemplary embodiment of this aspect of the
invention, an exercise system has the first standard exercise
module and at least one of such second and third standard exercise
modules, where at least one of the first and second or third
standard exercise modules provides the operation defining the
feature of the system. The system may include the third output
module, and the second control module. The second control module
may be arranged to simulate at least one of the users as at least
one simulated user in the images for such a user, and to
manipulate at least one of the first and second features at least
partly based on another thereof, whereby the users may
simultaneously proceed along the task for the task goal while
simultaneously performing such exercises on, with or against the
standard exercise modules incorporated in the different locations
and whereby the control module directly or indirectly manipulates
another feature of the task or standard exercise modules at least
partly based on the preset relation or monitored feature while
communicating with at least one of the visual units or standard
exercise modules via a local or global network covering such
locations (to be referred to as the "sixth control functions").
[0032] In another exemplary embodiment of this aspect of the
invention, another exercise system may include the first standard
exercise module and at least one of such second and third standard
exercise modules, where at least one of the first and second (or
third) standard exercise modules provides the operation defining
such a feature of the system. The system may also have the fourth
output module, and the second control module which may also perform
the sixth control functions.
[0033] In another aspect of the present invention, an exercise
system is arranged to connect multiple different locations, to
include at least one standard exercise module in each location, to
define at least one preset task of a story, a scenery or a video or
computer game each provided in images of at least one virtual
environment, and then to allow multiple users to simultaneously
perform exercises on, with or against the standard exercise modules
disposed in the locations (or to perform such exercises in a
delayed mode) while competing with each other in the task images at
least partly based on such exercises performed by the users.
[0034] In one exemplary embodiment of this aspect of the invention,
an exercise system may include the first standard exercise module,
at least one of the second and third standard exercise modules, at
least one output module, and at least one control module. The
output module may include at least two visual units one of which
may be provided in the first location, another of which is provided
in the second location, and each of which is provided in a
disposition and an arrangement for displaying an entire portion of
each image in a single view angle of each user and in a viewable
distance therefrom so that each user may simultaneously view the
entire portion of each image on each visual unit while performing
each exercise on, with or against each of such standard exercise
modules (which is to be referred to as the "fifth output module").
The control module is arranged to operatively couple with at least
one of the output module and the standard exercise modules directly
or indirectly, to provide the task in such images, to display the
images on the visual units, to assign at least one goal to the
task, to be disposed in one of the first and second locations, to
monitor at least one feature of such exercises provided by the
standard exercise modules, users simultaneously performing the
exercises or at least one operation of the standard exercise
modules, to relate such exercises with each other based on at least
one preset relation (to be referred to as the "third control
module"). The third control module may manipulate at least one
feature of the images at least partly based on the relation and on
at least one feature of the exercises, users or operation, whereby
such users may simultaneously proceed along the task for the goal
while simultaneously performing the exercises on, with or against
the standard exercise modules in such locations and whereby the
control module directly or indirectly manipulates the task feature
at least partly based upon the relation or feature of the
exercises, users or operation while communicating with the visual
units or standard exercise modules disposed in another location
through a local or global network (to be referred to as the
"seventh control functions").
[0035] In another exemplary embodiment of this aspect of the
invention, an exercise system has the first standard exercise
module and at least one of the second and third standard exercise
modules, at least one output module, and the third control module
which performs such seventh control functions. The output module
may also include at least two visual units, one provided in the
first location, another provided in the second location, and each
provided in a disposition and an arrangement for displaying
different portions of each image in each view angle of each user
but in a viewable distance therefrom so that each user
simultaneously views each portion of each image displayed on each
of such visual units sequentially (or one at a time) while
performing each exercise on, with or against each standard exercise
module (to be referred to as the "sixth output module").
[0036] In another exemplary embodiment of this aspect of the
invention, an exercise system has the first standard exercise
module and at least one of the second and third standard exercise
modules, at least one output module, and the third control module.
The output module may have at least two visual units, one provided
in the first location, another provided in the second location, and
each defining a configuration and provided in an arrangement and a
disposition for displaying an entire portion of each image in a
single view angle of each user and within a viewable distance due
to such a configuration, disposition or arrangement so that each
user may simultaneously view the entire portion of the image on
each visual unit while performing each exercise on, with, and/or
against each standard exercise module (to be referred to as the
"seventh output module"). The third control module may be arranged
to simulate at least one of the users as at least one simulated
user included in the images of the task, to manipulate at least one
feature of such images at least partly based on the relation and on
at least one feature of the exercises, users or operation, whereby
such users may simultaneously proceed along the task for the task
goal while simultaneously performing the exercises on, with or
against the standard exercise modules in different locations and
whereby the control module directly or indirectly manipulates the
task feature at least partly based on the relation or such a
feature of the exercises, users or operation while communicating
with the visual units or standard exercise modules disposed in
another location in a local or global network (to be referred to as
the "eighth control functions").
[0037] In another exemplary embodiment of this aspect of the
invention, an exercise system has the first standard exercise
module and at least one of the second and third standard exercise
modules, at least one output module, and the third control module
performing such eighth control functions. The output module may
include at least two visual units one provided in the first
location, another provided in the second location, and each
defining a preset configuration and also provided in a disposition
and arrangement for displaying different portions of each image in
each view angle of each user but also in a viewable distance
therefrom so that each user may simultaneously view each portion of
each of the images on each visual unit sequentially due to the
configuration, disposition or arrangement while performing each
exercise on, with or against each standard exercise module (to be
referred to as the "eighth output module").
[0038] In another aspect of the present invention, an exercise
system may be arranged to operatively connect multiple locations
through a local network or a global network, to include at least
one exercise module in each of the locations, to define at least
one preset task of a story, a scenery, a video game, and/or a
computer game each defining a preset goal for the task and provided
in images for at least one virtual environment, and to allow each
of multiple users to simultaneously perform exercises on, with, or
against each of the exercise modules disposed in each of the
locations while competing with each other in the images for the task
goal at least partly based on said exercises performed by the
users.
[0039] In one exemplary embodiment of this aspect of the invention,
an exercise system may have a first exercise module, a second
exercise module, at least one output module, and at least one
control module. The first exercise module is arranged to define a
first exercise type and a first exercise load and to allow a first
user to perform a first exercise while consuming energy thereof
during the first exercise. The second exercise module is arranged
to define a second exercise type and a second exercise load and to
also allow a second user to perform a second exercise while
consuming energy thereof during the second exercise, where the
second exercise module may be arranged to operatively couple with the
first exercise module via the network indirectly or directly,
thereby allowing the first and second users to simultaneously
perform the exercises while pursuing the goal of the task. The
output module includes at least two visual units one of which is
provided in a first location, another of which is provided in a
second location, and each of which is provided in a preset
disposition as well as in a preset arrangement so as to display
such images to each of the exercising users. The control module is
arranged to operatively couple with the output module and/or
exercise modules indirectly or directly, to define the goal of the
task, to provide the task in the images, to display such images on
the visual units, to monitor the types and/or loads as well as
extents of the first and second exercises each
performed by each of the users, to simulate such users into
simulated users included in the images, to relate the type, load,
and/or extent of the first exercise with at least one of those of
the second exercise based on at least one preset relation, and then
to perform manipulation of at least one of such simulated users in
the images at least partly based on the types, loads, and/or
extents related to each other by such a relation, thereby allowing
the users to compete for attaining the task goal while
simultaneously performing the exercises in such locations
regardless of whether the types of the first and second exercises
are identical to each other.
[0040] The types of the first and second exercises may be identical
to each other, where the control module performs such manipulation
at least substantially based on the loads and/or extents of the
first and/or second exercises. Such types of the first and second
exercises may be different from each other, where the control
module may be arranged to convert at least one of the extents from
one unit to another unit and then to perform comparison of such
converted extent with another of the extents, thereby performing
such manipulation at least substantially based on such a
comparison. The control module may be arranged to perform the
manipulation by manipulating the simulated user in the images. The
control module may instead be arranged to perform such manipulation
not only by manipulating the simulated user in such images but also
by manipulating the operation of at least one of such exercise
modules based on such types, loads, and/or extents.
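The convert-and-compare step described above (used when the two exercise types differ) can be sketched as follows. This is a minimal illustrative sketch, not part of the application: the unit names, conversion factors, and function names are all assumptions.

```python
# Hypothetical sketch: comparing extents of two different exercise types
# by converting both into a common unit (here, estimated kcal consumed).
# The conversion factors are illustrative placeholders, not physiology.

CALORIES_PER_UNIT = {
    "treadmill_km": 60.0,      # assumed kcal per km run
    "cycle_km": 30.0,          # assumed kcal per km cycled
    "rowing_stroke": 0.5,      # assumed kcal per stroke
}

def to_common_unit(exercise_type: str, extent: float) -> float:
    """Convert a raw exercise extent into the common unit (kcal)."""
    return extent * CALORIES_PER_UNIT[exercise_type]

def compare_users(type1, extent1, type2, extent2):
    """Return +1 if user 1 leads, -1 if user 2 leads, 0 if tied."""
    e1 = to_common_unit(type1, extent1)
    e2 = to_common_unit(type2, extent2)
    return (e1 > e2) - (e1 < e2)
```

Under this sketch, the control module would call such a comparison on each monitoring cycle and manipulate the leading simulated user accordingly; when the exercise types are identical, the conversion step collapses to a direct comparison of loads and extents.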
[0041] Embodiments of such apparatus aspects of the present
invention may include one or more of the following features, while
configurational and/or operational variations and/or modifications
of the foregoing systems also fall within the scope of the present
invention.
[0042] The system may provide the task in the auditory, olfactory
or tactile features (i.e., the sounds, smells or sensations,
respectively) of the virtual environment with the visual feature
(i.e., the images), where each feature may represent, connote or be
associated with at least one of multiple elements such as, e.g., a
preset object, background, event, geographic region, activity,
surrounding, and so on. The task may include the simulated user
which is at least one object or background to be included in the
images of the task of the story, scenery or game and which may be
manipulatable or controllable by at least one of the features of
such exercises, users, or operations of the exercise modules. The
system (or its control module) may simulate only one of such users
into a single simulated user, only one of the users into multiple
simulated users, multiple users as a single simulated user,
multiple users into multiple simulated users each simulating only
one of the users, and the like.
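The user-to-simulated-user mappings listed above (one user to one avatar, one user to several avatars, several users sharing one avatar) can all be represented with one small structure. The following is an illustrative sketch; the identifiers and interface are assumptions, not part of the application.

```python
# Hypothetical sketch of the user-to-simulated-user mappings described
# above. A mapping is built from (user_ids, avatar_id) pairs, so the
# same structure covers one-to-one, one-to-many, and many-to-one cases.

def build_mapping(pairs):
    """pairs: iterable of (tuple_of_user_ids, avatar_id)."""
    mapping = {}
    for users, avatar in pairs:
        for user in users:
            mapping.setdefault(user, []).append(avatar)
    return mapping

# Two users steering a single shared avatar, plus one user with two avatars:
mapping = build_mapping([
    (("user1", "user2"), "avatar_team"),
    (("user3",), "avatar_a"),
    (("user3",), "avatar_b"),
])
```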
[0043] The exercise may include voluntary or involuntary physical
or electrical activities of muscles of the users, leading to
improvement or enhancement of a muscle tone, to an increase in a
muscle mass or muscle volume, to a reduction in his or her weight,
to an increased physical stamina, and the like. The standard
exercise modules may be installed in the different locations so as
to prevent each user from accessing both of the standard exercise
modules without physically moving from one to another location. The
standard exercise modules may directly couple to each other or,
alternatively, indirectly couple to each other through the control
module. At least one of the standard exercise modules may (or not)
be synchronized with another of the modules based on at least one
of the features. All (or at least two) of different exercise
modules may be disposed in different locations. In the alternative,
all (or at least two) of different exercise modules may be disposed
in the same locations.
[0044] The images for the task may be a single still picture, an
entire portion of which is displayed on the visual unit, a single
still picture with multiple portions each displayed on the visual
unit, a series of still pictures, or a video clip. The visual
features may include at least one visual aspect such as, e.g., a
shape, a size, a content, a color, a brightness, a contrast, a
sharpness, a zoom, and a view angle of the image, a distance of
portraying the image, temporal or spatial characteristics,
distributions, or variations of any of the above, and so on. The
visual features may include at least one visual aspect such as,
e.g., a shape or size of the object (or background) or variations
thereof provided in different view angles or distances, contents
carried by the object (or background), color or brightness thereof,
contrast or sharpness thereof, zoom and view angles thereof, and so
on. The visual unit may acquire the images by storing such and then
retrieving such, by receiving the images from the control module,
by obtaining the images from an external source such as, e.g., the
user, other persons, an internet, a broadcast, an external storage,
and/or game console through wire or wirelessly, or by synthesizing
(or composing) the images for itself. The visual unit may repeat at
least a portion of such images in a preset sequence, randomly or
based on another sequence at least partly determined by at least
one of such features. The visual unit may define at least one image
domain on which such images may be displayed and which may consist
of a single portion, a pair of portions or multiple portions. The
visual unit may provide the images in black-and-white, multicolor
or a mixture. The visual unit may zoom in or out of such images,
vary their view angles, rotate the images with respect to a
rotation base, and the like. The visual unit may form such images
by acquiring the object and background simultaneously, by acquiring
the object and background independently and superposing one onto
the other, by acquiring one of the object and background followed
by synthesizing the other and composing the object and background,
and the like. The visual unit may form the images by acquiring the
simulated user and the rest of such elements simultaneously, by
acquiring the simulated user and the rest of such elements
independently and then superposing one over the other, by acquiring
one of the simulated user and the rest of the elements,
synthesizing the other, and then composing the object and
background, and the like. The visual unit may be disposed on the
exercise module, on the control module, away from the user, to be
worn by the user over or around at least one of his or her body
part, or to be carried by the user. The visual unit may be
incorporated into or provided as a wearable article such as, e.g.,
glasses, goggles, or helmets. The visual unit may display the
object (or background) in a perspective of the user or in another
perspective defined away from the user. The images may portray the
object or background as may be perceived by the user or, in the
alternative, the images may rather portray the object or background
including the simulated user in another perspective of a third
party. The control and/or output modules may vary the perspective
during the exercise based on at least one of the features. The
output and/or control modules may zoom in or out of the images
while maintaining or changing the perspective based on at least one
of the features. The control module may provide the simulated user
by animating an appearance of the user, by adjusting a size of the
user, by selecting one of prestored multiple simulated users, and
the like.
[0045] The output module includes at least one auditory unit to
provide the sounds. The system may include at least two auditory
units each capable of playing such sounds to each user
simultaneously performing the same, similar or different exercises.
The auditory features may include, e.g., a volume or loudness of
the sounds, their tone, a balance when the auditory unit includes
multiple speakers, a frequency of the sounds, their frequency
distribution, their direction, temporal or spatial distributions,
characteristics or variations of such sounds, and so on. The
auditory unit may acquire the sounds by storing and retrieving
such, by receiving the sounds from the control module, by obtaining
the sounds through external sources including the users, an
internet, a broadcast, an external storage, and/or the external
game console either by wire or wirelessly, or by synthesizing or
composing the sounds. The auditory unit may repeat at least a
portion of the sounds in a preset sequence, randomly or based on a
sequence at least partly decided by at least one of such features.
The sounds may be real sounds, abstract sounds, or their mixture.
The sounds may be a voice, a conversation, music, sounds of the
animal or plant, sounds from or generated by the object,
synthesized or composed sounds, and their mixture. The sounds may
be in a mono or stereo mode. The auditory unit may provide the
sounds as acquired or retrieved, with or without modifying at least
one of the auditory features, by synthesizing or composing the
sound, and the like. The auditory unit may include at least one
cone-drive speaker, piezoelectric speaker, or electrostatic
speaker. The auditory unit may be disposed on the exercise or
control module, spaced away from the user, carried by the user,
worn by the user over or around at least one of his or her ears or
body parts, and the like. The auditory unit may be incorporated
into or formed as a wearable article such as a helmet, an earphone,
a headphone, and the like.
[0046] The output module may also include at least one olfactory
unit which may provide the smells. The olfactory feature may
include a type of the smells, an intensity of the smells, a
temporal or spatial distribution, characteristics, or variations
thereof, and the like. The olfactory unit may include at least one
storage storing at least one substance for the smells and may
dispense the substance to create the virtual environment. The
olfactory unit may include multiple storages each storing a
substance for a preset smell, may dispense a mixture of at least
two of the substances for the virtual environment, may dispense the
mixture of different substances in a preset order or randomly, and
the like. Such an olfactory unit may include at least one dispenser
which manipulates the substance to be discharged from the storage,
where the dispenser may include therein at least one wick, nozzle
or evaporator. The olfactory unit may give off the smells in a
preset order, randomly, or based on at least one of the features.
The olfactory unit may dispense the smells to a space adjacent to a
portion of the body part of the user, where the portion may be an
entire area around the nose of the user, a space covering an upper
torso of the user, and the like. Such an olfactory unit may be
disposed away from the user, carried by the user, or worn by the
user near or around his or her nose.
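The dispensing schemes described above (a preset order, a random order, or an order driven by a monitored feature) can be sketched as follows. This is an illustrative sketch only; the storage names and the dispensing interface are assumptions.

```python
# Hypothetical sketch of an olfactory unit selecting which smell
# storages to dispense from: a preset order when one is supplied,
# otherwise a random order. Storage names are illustrative.

import random

def dispense_sequence(storages, preset_order=None, count=3, rng=random):
    """Return storage names to dispense, preset or random."""
    if preset_order is not None:
        # Keep only storages actually present, preserving the preset order.
        return [s for s in preset_order if s in storages][:count]
    return [rng.choice(list(storages)) for _ in range(count)]
```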
[0047] The output module may include at least one tactile unit
which may provide the sensations. The sensations may include a
mechanical sensation, an air flow, heat, coldness, electrical
sensation, and the like. The tactile unit may repeat the sensations
in a preset order, randomly, based on at least one of the features,
and the like. The tactile features may include mechanical, thermal,
optical, or electrical properties of at least one portion of the
exercise module or, alternatively, the properties sensed by the
user away from the exercise module. The tactile unit may have at
least one actuator which provides different sensations at a contact
between the user and exercise module by changing the mechanical
property of the exercise module at a point contacting the user,
where the mechanical properties may be, e.g., an elasticity, a
modulus, a stiffness, a deformability, a bulk structure, a
roughness, a surface structure, and the like. The tactile unit may
include at least one air pump for generating an air flow to the
user, where such mechanical properties may include an air flow
rate, an air velocity, temporal or spatial distributions,
characteristics, or variations thereof, and the like. The tactile
unit may include at least one heater for irradiating heat (or
infrared) rays to the user, for heating the part of the exercise
module, for heating the air flow, and the like, where the thermal
properties may include a temperature of the part of the exercise
module or flow of air, a heat flux rate thereof or therethrough, a
position of the heater relative to the user, temporal and/or
spatial distributions, characteristics, and/or variations thereof,
and the like. The tactile unit may be provided to be disposed away
from the user, to be worn by the user over or around at least a
portion of his or her body, to be carried by the user, and the
like.
[0048] The system (or its control module) may allow transfer of the
task feature only from one to the other of the standard exercise
modules, between the standard exercise modules, and the like. Such
a system (or its control module) may allow the transfer wirelessly
or by wire, through control module or between other modules, and
the like. The system (or its control module) may perform the
transfer without altering the task feature (i.e., a simple
transfer) or, in the alternative, by altering or converting the
task feature based upon the relation (i.e., an equivalent
conversion). The relation may be defined between at least one feature
of the exercises and that of the task, between at least one feature
of such exercises and the goal and/or stages of the task, between
at least one feature of the users and that of the task, between at
least one feature of the users and the goal and/or stages of the
task, and the like. The relation may be defined between at least
one feature of the operation and at least one of the features of
the task, between at least one feature of the operation and the
goal and/or stages of the task, and the like. The relation may be
maintained constant during the exercises, may change during the
exercises at least partly based upon at least one of the features,
may account for physical fatigue of the users which may be
reflected by the duration of the exercises and/or load, and the
like.
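A preset relation that changes during the exercises to account for fatigue, as mentioned above, can be sketched as a weight that decays with elapsed duration and load. The decay constants and function names below are illustrative assumptions, not values from the application.

```python
# Hypothetical sketch of a time-varying relation accounting for user
# fatigue: the weight given to a monitored extent decays with elapsed
# duration and accumulated load. Constants are illustrative only.

import math

def fatigue_weight(duration_min: float, load: float,
                   tau_min: float = 30.0, load_scale: float = 100.0) -> float:
    """Return a multiplier in (0, 1] that shrinks as fatigue grows."""
    fatigue = duration_min / tau_min + load / load_scale
    return math.exp(-fatigue)

def related_extent(raw_extent: float, duration_min: float, load: float) -> float:
    """Apply the (time-varying) relation to a monitored raw extent."""
    return raw_extent * fatigue_weight(duration_min, load)
```

Holding the weight constant recovers the case where the relation is maintained constant during the exercises.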
[0049] The control module may be disposed in only one of the
locations or, in the alternative, may be disposed in a location
which may be different from such locations of the standard exercise
modules. The control module may communicate with at least one of
the standard exercise modules (or at least one of the users)
wirelessly or by wire, may also communicate with at least one of
such visual units wirelessly or through wire, and the like. The
control module may receive at least one of the features from at
least one another of the modules of the system and/or at least one
of the users through wire or wirelessly, may transmit at least one
of the features to at least one another of the modules of the
system and/or at least one of the users by wire or wirelessly. The
network may cover the locations of a single city (or different
cities) or those of a single country (or different countries). The
network may encompass the locations of the same time zone or
different time zones.
[0050] The system may operatively couple with at least one external
visual unit capable of displaying at least a portion of the images
thereon. The external visual unit may operatively couple to the
output and/or control modules through wire or wirelessly in order
to supplement or replace at least a portion of the modules. The
external visual unit may be a visual device including the CRT, LCD,
OLED, IOLED, PDP or any screen for displaying the images in the
black-and-white or color mode, where examples of the device may
include a stationary or portable audiovisual device including at
least one screen (e.g., a DVD player, a TV, and the like), a
portable data processing device with at least one screen (e.g., a
PDA, a data organizer, a laptop computer, and the like), a portable
communication device including at least one screen (e.g., a
cellular phone), and the like. The system may operatively couple
with at least one external game console for providing at least a
portion of the task. The external game console may operatively
couple with the output and/or control modules by wire or wirelessly
in order to supplement or replace at least a portion of such
modules. The external game console may provide the task of the game
in signals where the control module may generate the images for the
task or, in the alternative, may provide the task in the images so
that the control module may relay the images to the output module.
Such an external game console may be a game device including at
least one storage and at least one processor, where the storage may
store algorithms for the task of the game, while the processor may
execute the algorithms to provide the game to the user.
[0051] In another aspect of the invention, a method is provided to
allow at least two exercising users to participate in at least one task of a
story, a scenery, or a video (or computer) game each defining at
least one task goal and provided in images of at least one virtual
environment while allowing the exercising users to manipulate at
least one feature of the images and to compete with each other for the
task goal at least partly based on at least one of at least two
exercises performed by the users.
[0052] In one exemplary embodiment of this aspect of the invention,
a method includes such steps of: exercising a first user on, with
or against a first standard exercise module of an exercise system
which is incorporated in a first location and arranged to
facilitate the first user in consuming energy by performing a first
exercise provided by the first standard exercise module (to be
referred to as the "first exercising"); exercising a second user
simultaneously with the first user on, with or against a second
standard exercise module which is disposed in a second location and
is also arranged to facilitate the second user in consuming energy by
performing a second exercise provided by the second standard exercise
module, where the second location is different from the first
location (to be referred to as the "second exercising"); arranging
and disposing at least two visual units each providing an entire
portion of the images in a single view angle of each of the users
who simultaneously perform such exercises and each disposed
within a viewable distance therefrom (to be referred to as the
"first arranging"); displaying the images for the task on each of
such visual units (which is to be referred to as the "first
displaying"); monitoring at least one feature directly or
indirectly related to at least one of the first and second
exercises (to be referred to as the "first monitoring"); relating
the first and second exercises to each other at least partly based
upon at least one preset relation (to be referred to as the "first
relating"); and manipulating at least one feature of the images at
least partly based upon the relation and the monitored feature,
thereby allowing at least one of the users to compete with another
in such images of the task for the task goal through the exercises
which are simultaneously performed by the users (to be referred to
as the "first manipulating"). The above steps of monitoring to
manipulating may be replaced by such steps of: monitoring at least
one feature directly or indirectly related to both of the first and
second exercises (to be referred to as the "second monitoring");
the first relating; and manipulating at least one feature of the
images at least partly based on the relation and the monitored
feature, thereby allowing the users to compete with each other in the
images for the task goal through the exercises simultaneously
performed by the users (to be referred to as the "second
manipulating").
[0053] In another exemplary embodiment of this aspect of the
invention, a method includes the steps of: the first exercising;
the second exercising; arranging and disposing at least two visual
units each providing different portions of the images in multiple
view angles of each of the users simultaneously performing the
exercises and each disposed in a viewable distance (to be referred
to as the "second arranging"); the first displaying; the first
monitoring; the first relating; and the first manipulating. The
above steps of monitoring to manipulating may be replaced by the
steps of: the second monitoring; the first relating; and the second
manipulating.
[0054] In another exemplary embodiment of this aspect of the
invention, a method includes the steps of: the first exercising;
the second exercising; the first arranging; simulating at least one
of the users as at least one simulated user (to be referred to as
the "first simulating"); including the simulated user in the images
of the virtual environment for at least one of such users (to be
referred to as the "first including"); displaying the images
including the simulated user on at least one of the visual units
(to be referred to as the "second displaying"); the first
monitoring; the first relating; and manipulating at least one
feature of the simulated user at least partly based on the relation
and monitored feature, thereby allowing at least one of the users
to compete with another user in the images of the task for its goal
during the exercises simultaneously performed by such users (to be
referred to as the "third manipulating"). The above steps of
simulating to manipulating may be replaced by the steps of:
simulating each user as at least one simulated user (to be referred
to as the "second simulating"); including the simulated users in
the images (which will be referred to as the "second including"
hereinafter); displaying such images including the simulated users
on the visual units (to be referred to as the "third displaying");
the first monitoring; the first relating; and manipulating at least
one feature of the simulated users at least partly based on the
relation and monitored feature, thereby allowing the users to
compete with each other in the images of the task for the task goal
during the exercises simultaneously performed by the users (to be
referred to as the "fourth manipulating").
[0055] In another exemplary embodiment of this aspect of the
invention, a method includes the steps of: the first exercising;
the second exercising; the second arranging; the first simulating;
the first including; the second displaying; the first monitoring;
the first relating; and the third manipulating. The above steps of
simulating to manipulating may be replaced by the steps of: the
second simulating; the second including; the third displaying; the
first monitoring; the first relating; and the fourth
manipulating.
[0056] In another aspect of the invention, a method is provided to
allow at least two exercising users to participate in at least one task of a
story, a scenery, or a video (or computer) game each defining at
least one preset goal and provided in images of at least one
virtual environment while manipulating at least one of multiple
exercises performed by the users at least partly based upon at
least one of multiple exercises performed by at least one of the
users.
[0057] In one exemplary embodiment of this aspect of the invention,
a method includes the steps of: the first exercising; the second
exercising; the first arranging; the first displaying; monitoring
at least one feature directly or indirectly related to the first
exercise which is performed by the first user (to be referred to as
the "third monitoring"); the first relating; and manipulating at
least one feature of the second exercise performed by the second
user at least partly based upon the relation and monitored feature,
thereby allowing the first user to compete with the second user in
such images for the task by the first exercise (to be referred to
as the "fifth manipulating"). The above steps of monitoring to
manipulating may be replaced by such steps of: monitoring at least
one feature directly or indirectly related to each of the exercises
performed by each user (to be referred to as the "fourth
monitoring"); the first relating; and manipulating at least one
feature of the first and second exercises at least partly based on
the relation and monitored features of the second and first
exercises, respectively, thereby allowing the users to compete with
each other in the images of the task for the goal during the exercises
simultaneously performed by the users (to be referred to as the
"sixth manipulating").
[0058] In another exemplary embodiment of this aspect of the
invention, such a method may include the steps of: the first
exercising; the second exercising; the second arranging; the first
displaying; the third monitoring; the first relating; and the fifth
manipulating. Such steps of monitoring to manipulating may be
replaced by the steps of: the fourth monitoring; the first
relating; and the sixth manipulating.
[0059] In another exemplary embodiment of this aspect of the
invention, a method has the steps of: the first exercising; the
second exercising; the first arranging; the first simulating; the
first including; the second displaying; the third monitoring; the
first relating; and manipulating at least one feature of the second
exercise performed by the second user at least partly based on the
relation and monitored feature, thereby allowing the first user to
compete with the second user through the simulated user in the
images through the first exercise (to be referred to as the
"seventh manipulating"). Such steps of simulating to manipulating
may further be replaced by the steps of: the second simulating; the
second including; the third displaying; the fourth monitoring; the
first relating; and then manipulating at least one feature of the
first and second exercises at least partly based on the relation
and monitored features of the second and first exercises,
respectively, thereby allowing the users to compete with each other by
the simulated users in the images for the task goal through the
exercises simultaneously performed by the users (which will be
referred to as the "eighth manipulating").
[0060] In another exemplary embodiment of the same aspect, a method
includes the steps of: the first exercising; the second exercising;
the second arranging; the first simulating; the first including;
the second displaying; the third monitoring; the first relating;
and the seventh manipulating. The steps of simulating to
manipulating may further be replaced by the steps of: the second
simulating; the second including; the third displaying; the fourth
monitoring; the first relating; and the eighth manipulating.
[0061] In another aspect of the invention, a method is provided for
competing at least two users who perform exercises in different
locations with each other for at least one preset goal of at least
one task of a story, scenery or video (or computer) game each
having the goal and provided in images of a virtual environment at
least partly based on at least one feature of at least one of the
exercises.
[0062] In one exemplary embodiment of this aspect of the invention,
such a method includes the steps of: the first exercising; the
second exercising; operatively coupling such standard exercise
modules to each other via a local or global network encompassing
the locations (to be referred to as the "first coupling"); the
first arranging; the first displaying; the first monitoring through
such a network; the first relating; and the first manipulating via
the network. The steps of monitoring to manipulating may be
replaced by the steps of: the second monitoring through the
network; the first relating; and the second manipulating via the
network. Such steps of monitoring to manipulating may be replaced
by the steps of: the third monitoring via the network; the first
relating; and the fifth manipulating via the network. Such steps of
monitoring to manipulating may be replaced by the steps of: the
fourth monitoring via the network; the first relating; and the
sixth manipulating through the network.
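The couple, monitor, relate, and manipulate sequence recited above can be sketched in code. The following is a minimal illustration only, assuming hypothetical names (`ExerciseModule`, `relate_features`, `manipulate_task`) and a single monitored speed feature; it is not part of the claimed system:

```python
# Illustrative sketch of the first coupling / first monitoring / first
# relating / manipulating sequence for two networked exercise modules.
# All class and function names here are assumptions for illustration.

class ExerciseModule:
    """One standard exercise module reporting a monitored feature."""
    def __init__(self, user, speed_kmh):
        self.user = user
        self.speed_kmh = speed_kmh  # monitored exercise feature

def relate_features(a, b):
    """The relating step: map each monitored feature onto a common scale."""
    total = a.speed_kmh + b.speed_kmh
    return a.speed_kmh / total, b.speed_kmh / total

def manipulate_task(a, b):
    """The manipulating step: decide which user leads in the task images."""
    share_a, share_b = relate_features(a, b)
    return a.user if share_a >= share_b else b.user

# Two users exercising in different locations, coupled over a network:
leader = manipulate_task(ExerciseModule("user1", 10.0),
                         ExerciseModule("user2", 8.5))
```

In a real networked embodiment the monitored features would travel over the local or global network between the locations; the sketch only shows the data flow once both features are available.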
[0063] In another exemplary embodiment of this aspect of the
invention, such a method includes the steps of: the first
exercising; the second exercising; the first coupling; the second
arranging; the first displaying; the first monitoring through the
network; the first relating; and the first manipulating through the
network. The above steps of monitoring to manipulating may be
replaced by such steps of: the second monitoring via the network;
the first relating; and the second manipulating via the network.
The above steps of monitoring to manipulating may be replaced by
the steps of: the third monitoring through the network; the first
relating; and the fifth manipulating through the network. Such
steps of monitoring to manipulating may be replaced by the steps
of: the fourth monitoring through the network; the first relating;
and the sixth manipulating through the network.
[0064] In another exemplary embodiment of this aspect of the
invention, a method includes the steps of: the first exercising;
the second exercising; the first coupling; the first arranging; the
first simulating; the first including; the second displaying; the
first monitoring via the network; the first relating; and the third
manipulating through the network. Such steps of simulating to
manipulating may be replaced by the steps of: the second
simulating; the second including; the third displaying; the first
monitoring via the network; the first relating; and the fourth
manipulating through the network. The above steps of simulating to
manipulating may be replaced by the steps of: the first simulating;
the first including; the second displaying; the third monitoring
via the network; the first relating; and the seventh manipulating
through the network. Such steps of simulating to manipulating may
further be replaced by the steps of: the second simulating; the
second including; the third displaying; the fourth monitoring
through the network; the first relating; and then the eighth
manipulating through the network.
[0065] In another exemplary embodiment of this aspect of the
invention, such a method includes the steps of: the first
exercising; the second exercising; the first coupling; the second
arranging; the first simulating; the first including; the second
displaying; the first monitoring through the network; the first
relating; and the third manipulating through the network. The steps
of simulating to manipulating may be replaced by the steps of: the
second simulating; the second including; the third displaying; the
first monitoring through the network; the first relating; and the
fourth manipulating through the network. Such steps of simulating
to manipulating may be replaced by the steps of: the first
simulating; the first including; the second displaying; the third
monitoring through the network; the first relating; and the seventh
manipulating through the network. Such steps of simulating to
manipulating may be replaced by the steps of: the second simulating;
the second including; the third displaying; the fourth monitoring
through the network; the first relating; and the eighth
manipulating via the network.
[0066] Embodiments of such method aspects of the present invention
may include one or more of the following features, while
configurational or operational variations and/or modifications of
the foregoing methods also fall within the scope of the present
invention.
[0067] Such defining the goal may include the step of: arranging
one of the users to perform the task against another user
performing the exercise defining the same, similar or different
type and/or extent on, with, and/or against another standard
exercise module. The defining the goal may include at least one of
the steps of: fighting (or opposing, hiding from) at least one
opponent manipulated by the task; overcoming (or opposing, hiding
from) at least one obstacle provided by the task; proceeding
through the obstacles; seeking at least one preset object hidden in
the task; or assembling at least one preset shape from multiple
objects provided thereby.
[0068] Such simulating may include one of the steps of: simulating
the single user into one or multiple simulated users included in
the images for the task; simulating each of at least two users as
the single simulated user (or each of at least two simulated
users); simulating each of at least two but not all of the users
into the single simulated user (or each of at least two simulated
users), and the like. Such simulating may include the steps of:
forming the simulated user as at least one object or background of
the images; and changing at least one visual feature of the object
or background at least partly based on at least one of the features
of the exercise, user, or operation. The simulating may have the
steps of: forming the simulated user as at least one object (or
background) in the images; and changing at least one feature of the
object (or background) with respect to other objects (or
backgrounds) of the images at least partly based on at least one of
the features of the exercise, user, or operation while maintaining
other visual features. The changing may include at least one of the
steps of: changing a shape or size of the simulated user; changing
its color; changing its contrast or sharpness; changing its zoom;
changing its view angle or distance; and changing its position. The
changing may include at least one of the steps of: moving the
simulated user with respect to the background based on at least one
of the factors; varying a shape or size of the simulated user based
on at least one of the factors; or changing an orientation of the
simulated user based upon at least one of the factors. The changing
may include at least one of the steps of: changing temporal
characteristics of the visual feature; and varying its spatial
characteristics. The simulating may include at least one of the
steps of: forming the simulated user as a living organism (e.g., a
person, animal, plant, and the like); forming the simulated user as
a nonliving object; forming the simulated user as a mixture of the
living organism and nonliving object, and the like. The forming may
include at least one of the steps of: animating the simulated user
after the user; synthesizing the simulated user using a preset
program; selecting one of the simulated user from multiple
simulated users, and the like. The simulating may also include one
of the steps of: defining the simulated user in a perspective of
the user; forming the simulated user in a perspective defined away
(or at a preset distance) from the user, and the like. The
simulating may include one of the steps of: maintaining a
perspective depicting the simulated user throughout the task;
changing the perspective during at least a portion of the task; and
constantly changing the perspective during the task. The simulating
may also include the step of: arranging the simulated user to walk,
run, sprint, jump, throw, row, push, pull, turn, bend, rotate,
swing, hit, and/or otherwise move based on at least one of the
features. The simulating may also include at least one of the steps
of: simulating at least one feature of the user into the simulated
user manipulated by the user; simulating at least one feature of
another user into the simulated user manipulated by the user, and
the like.
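The step of arranging the simulated user to walk, run, sprint, or otherwise move based on the monitored features can be illustrated by a short sketch. The threshold values and the names `gait_for_speed` and `update_simulated_user` are illustrative assumptions, not taken from the specification:

```python
# Sketch: selecting a simulated user's gait and advancing its position
# from a monitored treadmill speed feature. Thresholds are assumptions.

def gait_for_speed(speed_kmh):
    """Arrange the simulated user to walk, run, or sprint from the feature."""
    if speed_kmh < 6.0:
        return "walk"
    if speed_kmh < 12.0:
        return "run"
    return "sprint"

def update_simulated_user(avatar, speed_kmh):
    """Change visual features of the avatar based on the exercise feature,
    while other visual features of the images are maintained."""
    avatar["gait"] = gait_for_speed(speed_kmh)
    avatar["position_m"] += speed_kmh / 3.6  # metres advanced per second
    return avatar
```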
[0069] The disposing may also include at least one of the steps of:
incorporating at least a portion of the visual unit into the
standard exercise module(s); coupling such a portion away from the
standard exercise module(s); and disposing such a portion on a
structure supporting or enclosing the standard exercise module(s).
The disposing may also include at least one of the steps of:
incorporating at least a portion of the visual unit into a wearable
article such as glasses, a goggle, a helmet, a hat, a cap, a head
band, and the like; releasably coupling the wearable article with
the user; releasably (or fixedly) coupling the article to a cloth
of the user, and the like. The disposing may include one of the
steps of: disposing a single visual unit for multiple users;
providing each of the users with at least one visual unit, and the
like. The disposing may include one of the steps of: disposing the
visual units of same shapes or sizes to the user(s); and disposing
the visual units of different shapes or sizes thereto.
[0070] Such coupling may have one of the steps of: directly
coupling the standard exercise modules; indirectly coupling the
standard exercise modules; coupling the standard modules through a
provider which is not a part of the system, and the like. Such
coupling may include the step of: coupling the standard exercise
modules wirelessly or through wire. The method may include the
steps of: placing the standard exercise modules in the different
(or geographically separate) locations; and preventing each of the
users from accessing both of the standard exercise modules without
having to physically move out from one to another of the locations.
Such a method may further include at least one of the steps of:
encompassing such locations of a single city (or different cities)
by the global network; encompassing such locations of a single
country (or different countries) by the global network,
encompassing the locations of a single time zone by the global
network; and encompassing the locations of different time
zones.
[0071] The monitoring may include at least one of the steps of:
monitoring at least one feature of the user(s); monitoring at least
one feature of the exercise(s); and monitoring at least one feature
of the operation of the exercise module(s). Such monitoring the
user (or exercise, operation) feature may include at least one of
the steps of: sensing at least one factor of at least one type of
the user(s) (or exercises, operations); sensing at least one factor
of at least one extent of the user(s) (or exercises, operations),
and the like. The monitoring may include at least one of the steps
of: sensing at least one of the features away from the user(s);
sensing at least one of the features by contacting the user(s), and
the like.
[0072] The manipulating the task (or its feature) may include at
least one of the steps of: changing at least a portion of the
images only based on at least one feature of the exercises,
operations or users; changing such a portion of such images with
respect to the rest of the elements of the images based thereon;
and varying the perspective, view angle, or distance related to the
images. The manipulating the task (or its feature) may include at
least one of the steps of: varying the simulated user only based
upon at least one feature of the exercises, operations or users;
changing the simulated user relative to the rest of the elements of
the images based thereupon; and changing the perspective, view
angle, or distance related with the simulated user. The
manipulating the exercises (or feature thereof) may include at least
one of the steps of: requiring the users to maintain the posture
during the exercises (or task); requiring the users to vary the
posture during the exercise (or task) based on at least one feature
of the users, task, operations, and user input. The manipulating
the exercise (or its feature) may include at least one of the steps
of: maintaining the load during the exercises (or task); changing
the load during the exercises (or task) based on at least one
feature of the user, tasks, operation, or user input, and so on.
The manipulating the exercise (or feature thereof) may include at
least one of the steps of: varying the load based on a fatigue of
the users; and varying the load based on at least one feature of
the task without considering the fatigue. The manipulating the
operation (or its feature) may include at least one of the steps
of: controlling a configuration, arrangement or disposition of the
actuating part based on at least one feature of the exercises,
users or task; controlling the load of the standard exercise
modules based on at least one feature thereof, and the like. The
manipulating the operation (or its feature) may include at least
one of the steps of: maintaining the load of the standard exercise
modules during the exercises (or task); and then varying the load
during the exercises. The manipulating the operation (or feature
thereof) may include at least one of the steps of: performing the
manipulating manually; performing the manipulating based on at
least one feature of the task, user, and exercise; and performing
the manipulating based on the user input.
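The alternatives of maintaining the load, or varying it based on a fatigue of the user, or varying it without considering the fatigue, can be sketched as follows. The heart-rate-reserve fatigue model and the halving factor are assumptions chosen only to make the example concrete:

```python
# Sketch of load manipulation: hold the load during the task, or vary it
# with an estimated fatigue of the user. The fatigue model is an assumed
# heart-rate-reserve proxy, not a model given in the specification.

def fatigue_estimate(heart_rate, resting_hr=60, max_hr=190):
    """Crude fatigue proxy in [0, 1] from heart-rate reserve."""
    return max(0.0, min(1.0, (heart_rate - resting_hr) / (max_hr - resting_hr)))

def manipulate_load(base_load_kg, heart_rate, consider_fatigue=True):
    """Vary the load based on fatigue, or maintain it during the task."""
    if not consider_fatigue:
        return base_load_kg
    # Reduce the load by up to half as fatigue approaches its maximum.
    return base_load_kg * (1.0 - 0.5 * fatigue_estimate(heart_rate))
```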
[0073] The method may include the steps of: providing at least one
auditory unit; and playing sounds (or at least one auditory
feature) during such exercises. The method may include at least one
of the steps of: providing at least one auditory feature (or
sounds) related with the task, users, exercises, or operations;
providing at least one olfactory feature (or smells) related to the
task, users, exercises, or operations; further providing at least
one tactile feature (or sensations) related to the task, users,
exercises, or operations, and so on. The method may also include
one of the steps of: synchronizing the auditory, olfactory, or
tactile features with the images; arranging the auditory,
olfactory, or tactile feature independent of the images, and the
like. The method may also include the steps of: coupling the system
to at least one external visual device; and utilizing visual
capacity of the device to display at least a portion of the images.
The method may include the steps of: coupling the system to at
least one external game console; and utilizing game generating
capacity of the device to define at least a portion of the
task.
[0074] More product-by-process claims may be constructed by
modifying the foregoing preambles of the apparatus (or system)
claims and/or method claims and by appending thereto such bodies of
the apparatus (or system) claims and/or method claims. In addition,
such process claims may include one or more of such features of the
apparatus (or system) claims and/or method claims of this
invention.
[0075] As used herein, the term "exercise equipment" is synonymous
to the term "fitness equipment" and refers to various prior art
equipment which is primarily intended to improve or enhance a
muscle tone of a user, to increase his or her muscle mass or
volume, to force or facilitate the user to reduce his or her
weight, to increase physical stamina of the user, and the like. To
such ends, the "exercise equipment" typically forces or facilitates
the user to consume energy by performing physical work on, with or
against the equipment or by receiving physical or electrical energy
from the equipment in order to twitch his or her muscles based
thereon. Therefore, the "exercise equipment" within the scope of
the invention does not refer to those prior art devices primarily
intended to engage the user in playing physically simulated games
or video (or computer) games, although such "exercise equipment" of
this invention may be modified to allow the user to engage in such
simulated or video (or computer) games while improving or enhancing
the muscle tone of the user, increasing the muscle mass or volume
of the user, forcing or facilitating the user to reduce his or her
weight, or increasing physical stamina of the user. Accordingly,
such "exercise equipment" of the present invention may refer to
various prior art equipment, examples of which may include, but not
be limited to, cardio-exercise equipment, weight training equipment
such as, e.g., abdominal machines and stretching machines, and the
like. Examples of the cardio-exercise equipment may include, but
not be limited to, treadmills, running machines, stair climbers,
exercise cycles and/or bikes, rowing machines, combinations of
such, and so on, whereas examples of the weight training equipment
may include, but not limited to, various home gyms, weight
machines, curls, extensions, racks such as squat racks, presses,
crunches, benches which include incline and decline types,
extension benches, bench racks, weight benches, various exercise
chairs, leverages, dips, boards, and the like. In addition, such
"exercise equipment" may include various prior art devices capable
of delivering the physical energy to the user in order to improve
or enhance the muscle tone of the user, to increase his or her
muscle mass or volume, to force or facilitate the user to reduce
his or her weight, or to increase physical stamina of the user.
Such "exercise equipment" may also include various prior art
devices capable of providing electrical energy to the user in order
to improve or enhance the muscle tone of a user, to increase the
muscle mass or volume, to force or facilitate the user to reduce
his or her weight, or to increase such physical stamina of the
user.
[0076] An "exercise module" of an exercise system of this invention
is generally similar or identical to the exercise equipment
described in the previous paragraph. More particularly, the
"exercise module" corresponds to any of the above exercise
equipment as well as any modifications thereof according to various
teachings as set forth herein. For example, the "exercise module"
may refer to any prior art exercise equipment or, in the
alternative, may incorporate thereinto one or more of various units
of the output and/or control modules of the exercise system as set
forth herein. It is appreciated, however, that each "exercise
module" always includes at least one actuating part which is
typically designed to contact a body part of the user and to
receive physical energy from the user therewith and that such an
actuating part may be fabricated as a track, a pedal, a weight, a
lever, a handle, a belt, a pad, and the like. More particularly,
the track is arranged to translate and to allow the user to perform
physical exercise of walking or running on such a track while
consuming energy during the exercise, while the pedal is arranged
to couple with a preset load and to rotate about an axis of
rotation against the load when the user performs physical exercise
of rotating the pedal while consuming energy during such exercise.
The weight is arranged to translate vertically or transversely and
to be moved as the user performs physical exercise of lifting the
weight while consuming energy during such exercise, while the lever
is arranged to couple with a preset load and to pivot about a
central point against the load as the user performs physical
exercise of displacing, reciprocating or otherwise pivoting the
lever while consuming energy during the exercise. The belt is
arranged to enclose at least one body part of the user thereabout
and to translate or reciprocate to allow the body part of the user
to perform physical exercise of vibration while consuming energy
during such exercise, and the pad is arranged to couple to a preset
load and to translate, rotate, pivot, deform, or otherwise move
against the load for allowing the user to perform exercise of
translation, rotation, pivoting, deformation or other movements
during such exercise. The handle is arranged to couple with a
preset load and/or the above weight, lever, or pad and to
translate, reciprocate, rotate, pivot, deform or otherwise move
against such a load in order to allow the user to perform physical
exercise of translation, rotation, reciprocation, pivoting or other
movements of the handle while consuming energy during the exercise.
It is noted that the actuating part may be an electrode through
which electrical energy is supplied to the user and to twitch the
muscle of the user while forcing or facilitating the user to
consume the energy.
[0077] The term "exercise" refers to any voluntary or involuntary
activities of various muscles of the user which consume energy of
the user and which lead to improvement or enhancement of a muscle
tone of the user, to an increase in a muscle mass or volume of the
user, to reduction in the weight, or to an increased physical
stamina of the user. In general, various characteristics of the
"exercise" are typically determined by the exercise module or, more
specifically, operations of the exercise module.
[0078] As used herein, the term "user inputs" refers to various
inputs which are supplied by the user onto various modules of the
system of this invention in order to manipulate various operations
of such modules such as the exercise, output, and control modules.
The user may supply such "user inputs" by applying mechanical
inputs to various input units such as, e.g., conventional keys, key
pads, touch screens, track pads, track balls, track sticks or rods,
mice, handles, joysticks, pedals, and the like. The user may
supply the "user inputs" by applying mechanical, thermal, electric
or magnetic signals to the modules of the system, by generating
movements of his or her body part which are monitored by at least
one of such modules of the system, by generating voice, face or
body signals which may also be monitored or sensed by at least one
of such modules, and the like.
[0079] As used herein, "features of operation" (or "operation
features") include "types of operation" (or "operation types") and
"extents of operation" (or "operation extents") and are attributed
to and/or determined by a specific exercise module of the system.
Such "operation types" may be affected or determined by various
factors which may include, but not limited to, a shape or size of
the actuating part of the exercise module, a position of the
actuating part in the exercise module, a contacting mode between a
body part of a user and the actuating part during an operation of
the exercise module, a movement (i.e., a direction, a displacement,
or a sequence) of the actuating part during the operation, and so
on. The "operation extents" may also be affected or determined by
various factors which may include, but not limited to, a load
required for operating the actuating part, a duration of the
operation, an amount of energy provided to the user by the exercise
module, a mathematical function of the load, duration, and/or
amount, and the like.
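The division of "operation features" into operation types and operation extents can be modelled as a small record. The field names and the energy formula (work against gravity, E = m·g·d) are illustrative assumptions:

```python
# Sketch of "operation features" split into types (actuating part and its
# movement) and extents (load, duration, energy). Field names are assumed.

from dataclasses import dataclass

@dataclass
class OperationFeatures:
    # Operation types: attributes of the actuating part and its movement.
    actuating_part: str      # e.g. "track", "pedal", "weight", "lever"
    movement: str            # direction, displacement, or sequence
    # Operation extents: load and duration of the operation.
    load_kg: float
    duration_s: float

    def energy_estimate_j(self, displacement_m):
        """A mathematical function of the load: assumed here as E = m*g*d."""
        g = 9.81  # gravitational acceleration, m/s^2
        return self.load_kg * g * displacement_m
```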
[0080] As used herein, "features of exercise" (or "exercise
features") include "types of exercise" (or "exercise types"),
"extents of exercise" (or "exercise extents"), and results of such
operation features. The "exercise types" may be affected or
determined by various factors which may include, but not be limited
to, a posture of a user required for a specific exercise, an
orientation of the user therefor, a movement of the user required
therefor, a body part of the user required or recruited therefor, a
body part of the user contacting the actuating part of the exercise
module therefor, a body part of the user to which mechanical or
electrical energy is supplied therefor, and so on. The "exercise
extents" may be determined or affected by various factors which may
include, but not be limited to, the load which is imposed to such
exercise by the exercise module, a duration of the exercise, an
amount of energy provided to or consumed by the user during the
exercise, a number of calories measured or estimated to be consumed
by the user, a product of the load and duration, a temporal
integration of the load over the duration, a mathematical function
of the load, duration, and/or amount, and the like.
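Two of the listed "exercise extent" measures, the product of the load and duration and the temporal integration of the load over the duration, can be computed directly. The trapezoidal sampling scheme below is an assumption about how a varying load might be recorded:

```python
# Sketch of two "exercise extent" measures from the list above:
# the product of load and duration, and the temporal integration of a
# sampled load over the duration (trapezoidal rule; sampling is assumed).

def load_duration_product(load_kg, duration_s):
    """Exercise extent as the product of a constant load and the duration."""
    return load_kg * duration_s

def integrated_load(samples, dt_s):
    """Temporal integration of a sampled load over the duration."""
    total = 0.0
    for a, b in zip(samples, samples[1:]):
        total += 0.5 * (a + b) * dt_s
    return total
```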
[0081] As used herein, "user features" include "user types" and
"user extents" and are related to or affected by a period of a
specific exercise performed by a user, thereby reflecting a
physical fatigue of the user when desired. The "user types" may be
affected or determined by a variety of factors of the user which
may include, but not be limited to, a height, a weight, a body fat
percent, a sex, an age, a race, a health or disease status, a
handicap, a physical and/or physiological condition before, during
or after an exercise (i.e., body temperature, systolic, diastolic
or other cardiovascular conditions such as a blood flow rate, blood
pressure, heart rate, or ECG, a respiratory condition such as a
respiratory rate, a respiratory air flow rate, and a pressure along
an air way, an EEG, an EMG, and the like), an orientation of the
user needed for the exercise, a posture of the user needed
therefor, a movement of the user needed therefor, a body part of
the user needed therefor, a body part of the user in contact with
the exercise module (or its actuating part) therefor, a body part
of the user to which mechanical or electrical energy is supplied,
and so on. The "user extents" may similarly be affected or
determined by various factors which may include, but not limited
to, the physical or physiological conditions of the user before,
during or after the exercise, a duration of the exercise, an amount
of energy consumed by or provided to the user during the exercise,
a mathematical function of the load, duration, and/or amount, and
the like.
[0082] It is to be understood that the above "load" generally
represents resistance to the operation of the actuating part of the
exercise module, resistance to the exercise, and the like. Such a
"load" may then be represented by various means, examples of which
may include, but not be limited to, a mass of at least a portion of
the actuating part, a mass of a weight coupling with the actuating
part, an angle of the actuating part with respect to the user
and/or exercise module, a spring constant or elasticity of at least
a portion of the actuating part, an electrical property of an
electric element functionally related to the actuating part, a
viscosity of at least a portion of the actuating part, a viscosity
of a dash pot (or a viscous element) coupling with the actuating
part, a speed and/or acceleration of the actuating part, a
displacement of the actuating part, and the like. In this context,
such a "load" is deemed as a variable and/or a parameter
determining an amount of energy which is to be consumed by the user
in order to consummate a unit displacement or deformation of the
actuating part of the exercise module and/or a specific body part
of the user. It is to be understood that this "load" may also be
quantified by various amounts of energy which may be required for
the user to walk or run a unit distance with respect to the
actuating part of the exercise module, which may be required to
translate, rotate, pivot, deform or bend at least a portion of the
actuating part by a preset linear or angular length, which may be
needed to rotate, pivot, bend or deform at least a portion of the
actuating part about a preset angle, and so on. This "load" may be
adjusted by various means, examples of which may include, but not be
limited to, adjusting the speed or acceleration of the actuating
part of the exercise module, adjusting the angle of the actuating
part, the mass of the weight or at least a portion of the movable
or deformable portion of the actuating part, the spring constant or
modulus thereof, the viscosity of at least a portion thereof, a
length of such a movable or deformable portion, a curvilinear
trajectory of the movable or deformable portion, and the like. In
general, the "load" is defined in such a manner that the user has
to consume a greater amount of energy as the "load" increases.
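The definition of the "load" as the energy the user must consume per unit displacement of the actuating part, and one of the listed means of adjusting it (changing the mass of the coupled weight), can be sketched numerically. The formula E = m·g·d and the function names are assumptions for illustration:

```python
# Sketch of the "load" as a parameter fixing the energy consumed per unit
# displacement of the actuating part, adjusted here by scaling the mass of
# the weight coupled to the part. Names and formula are assumptions.

G = 9.81  # gravitational acceleration, m/s^2

def energy_per_unit_displacement_j(mass_kg):
    """Energy to move the coupled weight through one metre: E = m*g*1."""
    return mass_kg * G * 1.0

def adjust_load(mass_kg, factor):
    """Adjust the load by scaling the coupled mass; a greater load forces
    the user to consume a greater amount of energy per unit displacement."""
    return mass_kg * factor
```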
[0083] As used herein, "task features" include "primary task
features" and "secondary task features" and are defined by or in a
specific task. The "primary task features" include "primary task
types" and "primary task extents." Such "primary task types" may be
affected or determined by various factors which may include, but not
be limited to, a type of a goal of the task, a number of stages
defined therein, levels or skills needed to complete a stage of the
task, means of accomplishing the task goal, means of proceeding
through such stages, characteristics (i.e., configuration,
arrangement, or disposition) of a simulated user (only if the
system defines at least one simulated user), means of manipulating
such a simulated user, and the like. The "primary task extents" may
be affected or determined by various factors which may include, but
not be limited to, a stage of the task in which the user or simulated
user is currently disposed, a level or skill needed in the current
stage, a status of the user or simulated user, a duration of the
user or simulated user engaged in the current stage or in the task,
and the like. The "secondary task features" may similarly include
"secondary task types" and "secondary task extents." The "secondary
task types" may be affected or determined by various factors which
may include, but not be limited to, a type of images or visual
feature (i.e., a single still picture, an entire or only a portion
of which is to be displayed, a series of still pictures, a video
clip, and the like), a mode of such images or visual feature (i.e.,
black and white, grey-scale, color-scale), a dimension of such
images or visual feature (i.e., two-dimensional or
three-dimensional), and so on. The "secondary task extents" may be
affected or determined by various factors which may include, but
not be limited to, a portion selected from multiple portions of a
still picture, a still picture selected from a series of still
pictures or video clip, a sequence of selecting a next portion of
the still picture, a sequence of selecting a next still picture in
a series of still pictures or video clip, a speed or a gap between
displaying the portions or pictures, a viewing area or a zoom, a
view angle or perspective angle of the picture or video clip, a
basis of the view or perspective angle, and so on. It is understood
that the "task features" are to collectively refer to the "primary
task features" and the "secondary task features" unless otherwise
specified, that the "task types" are to collectively refer to the
"primary task types" and the "secondary task types" unless
otherwise specified, and that the "task extents" similarly
collectively refer to the "primary task extents" and the "secondary
task extents" unless otherwise specified.
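The taxonomy above can be summarized in a simple data structure. The following Python sketch is purely illustrative (none of these class or field names appear in the specification); it groups the primary and secondary task features, with the "types" and "extents" of each kept side by side:

```python
from dataclasses import dataclass, field

@dataclass
class PrimaryTaskFeatures:
    # "primary task types": goal type, stages, skills needed per stage
    goal_type: str = "race"
    num_stages: int = 1
    skill_per_stage: dict = field(default_factory=dict)
    # "primary task extents": where the user currently is in the task
    current_stage: int = 0
    current_skill_level: int = 0
    elapsed_seconds: float = 0.0

@dataclass
class SecondaryTaskFeatures:
    # "secondary task types": what kind of images are shown and how
    image_kind: str = "video_clip"   # still picture, series, or video clip
    color_mode: str = "color"        # black-and-white, grey-scale, or color
    dimensions: int = 2              # two- or three-dimensional
    # "secondary task extents": which portion or frame is selected and how
    selected_frame: int = 0
    zoom: float = 1.0
    view_angle_deg: float = 0.0

@dataclass
class TaskFeatures:
    primary: PrimaryTaskFeatures = field(default_factory=PrimaryTaskFeatures)
    secondary: SecondaryTaskFeatures = field(default_factory=SecondaryTaskFeatures)
```

The collective terms then fall out naturally: "task features" is the whole `TaskFeatures` record, while "task types" and "task extents" are the corresponding halves of each sub-record.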
[0084] The term "location" is to mean a three-dimensional space
which includes at least one entrance or door. The space of a
"single location" may be open or include at least one partition
which does not block a user from getting therearound or
thereacross, whereby the user may access every corner of the
"location." Therefore, when multiple exercise modules are disposed
in a "single location," all of the exercise modules are to be
physically accessible to the user, without requiring the user to
get out of the "location" through one of its doors and then to
enter the same "location" through another door in order to access
another exercise module disposed in a different corner of that
"location." On the contrary, "different locations" are to mean
three-dimensional spaces which do not share any common entrance or
door therebetween. Accordingly, when the exercise modules are
disposed in "different locations," the user cannot physically
access all exercise modules from one "location." Rather, he or she
has to get out of one "location" and then to enter a different
"location" in order to access the exercise module disposed in that
different "location."
[0085] As used herein, a "game" collectively refers to what is
conventionally known as computer or video games. The "game" is
typically played by a specific program, e.g., by generating two- or
three-dimensional images using a visual unit based on the program
and then electrically manipulating at least a portion of such
images so as to attain a preset goal specifically defined by the
"game." The "game" incorporates in such images at least one (and
preferably multiple) animated and/or imaginary objects or
backgrounds such that a user of an exercise system of this
invention may manipulate at least one of such objects or
backgrounds to accomplish the goal of the "game" according to
various preset rules defined therefor. The "game" is typically
arranged to allow the user to compete with the program or to allow
multiple users to compete for the same goal, where examples of such
"games" may include, but not be limited to, prior art video games in
which the user has to fight one or multiple animated or imaginary
opponents manipulated by the program or another user, prior art
video games in which the user has to proceed through animated or
imaginary obstacles provided or manipulated by the programs or
another user, conventional video games in which the user has to
identify or find preset animated or imaginary objects hidden by the
program or another user, prior art computerized card games in which
the user plays against the program or another user using animated
or imaginary cards, prior art computerized board games in which the
user plays against the program or another user on animated or
imaginary boards with their animated or imaginary pieces,
conventional computerized puzzles in which the user plays against
the program or another user using animated or imaginary pieces, and
the like. Examples of the "game" may also include computerized
equivalents of any other conventional sport games, war games, card
games, board games, puzzles, and the like. In each of the above
examples, the user can preferably manipulate at least one object or
background incorporated into such images so as to attain the goal
of the "game," where the manipulatable or controllable object or
background will be referred to as a "simulated user" hereinafter.
When desirable, the "game" may be arranged to synchronize the
images with sounds for various purposes such as, e.g., assisting
the user, audibly depicting certain stages, events, and/or
landmarks of the "game," enhancing audiovisual quality of the
"game," and the like. When the exercise system of this invention is
synchronized with the "game" and the user is to play the "game"
while performing the exercise, the "game" is to correspond to the
task of the exercise performed by the user, whereas the goal of the
"game" is to correspond to the goal of the task, i.e., to
accomplish the preset goal of the "game" by manipulating the
simulated user of the images according to the preset rules of the
"game" against the rest of the animated or imaginary objects or
backgrounds manipulated by the program or another user. The "game"
or task may include any arbitrary number of stages therein. For
example, the "game" may define a single stage in which the user
manipulates the simulated user for the task goal, may define
multiple stages in which the user may proceed to a next stage only
by achieving a preset goal defined in a current stage, and the
like. The "game" may define a single or multiple goals and maintain
such goals therethrough, through the operation of the exercise
module, through exercise of the user, and the like. Alternatively,
the "game" may define a single goal and then change the goal
therealong, through the operation of the exercise module, through
exercise of the user, and the like. Alternatively, the "game" may
define multiple goals, select one of such goals, and thereafter
select another goal, through the operation of the exercise module,
through exercise of the user. In another alternative, the "game"
may define a single or multiple goals and then manipulate its goal
at least partly based upon any of such features. Similarly, such a
"game" may define a single or multiple objects (or backgrounds) and
employ such objects (or backgrounds) therethrough, through the
operation of the exercise module, through exercise of the user, and
the like. In the alternative, the "game" may define a preset number
of the objects (or backgrounds) and thereafter vary a number or
characteristics of the objects (or backgrounds), during the
operation of the exercise module, through exercise of the user, and
so on. Alternatively, the "game" may define a preset number of
objects (or backgrounds), select therefrom a certain number or
characteristics of such objects (or backgrounds), and thereafter
select another number or characteristics thereof, through the
operation of the exercise module, through exercise of the user, and
the like. In the alternative, the "game" may define a preset number
of objects (or backgrounds) and then manipulate at least one of the
objects (or backgrounds) at least partly based on any of the above
features.
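The goal-switching behavior described above can be modeled, as an illustrative sketch only (none of these names or the threshold rule come from the specification), by a game that re-selects its active goal from a preset list whenever a feature of the user's exercise crosses a threshold:

```python
class Game:
    """Toy model of a task whose active goal is re-selected during exercise."""
    def __init__(self, goals):
        self.goals = list(goals)      # preset single or multiple goals
        self.active = self.goals[0]   # initially selected goal

    def update(self, treadmill_speed_kmh):
        # Re-select the goal based on a feature of the user's exercise:
        # faster running unlocks a more demanding goal (an assumed rule).
        index = min(int(treadmill_speed_kmh // 5), len(self.goals) - 1)
        self.active = self.goals[index]
        return self.active

g = Game(["walk the trail", "catch the pacer", "win the sprint"])
g.update(3.0)    # slow walking keeps the first goal active
g.update(12.0)   # faster running selects the most demanding goal
```

The same pattern covers the object-and-background variants in the paragraph: replace the goal list with a list of objects (or backgrounds) and re-select or re-parameterize them from the operation of the exercise module or the user's exercise.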
[0086] It is appreciated that the "game" within the scope of this
invention is to provide such images including therein only those
objects (or backgrounds) which are either animated or imaginary,
but not to provide such images including therein any objects (or
backgrounds) which are merely photographs or photographical
versions of real objects (or backgrounds) present in a real
environment, where the latter images will be explained in greater
detail below in conjunction with a "story" or a "scenery." It is
appreciated that the "game" of this invention is to be embodied
differently from such prior art video or computer games as commonly
seen in prior art game consoles. More particularly, the "game"
primarily distinguishes itself from the prior art video or computer
games in that the simulated user included in the images of the
"game" is to be manipulated at least partly based on at least one
of various features of a preset task (i.e., the "game"), a user of
the exercise system, at least one operation of an exercise module
of the system, or an exercise provided by the exercise module,
where all of such features are either directly or indirectly
affected or determined by such exercise performed by the user on,
with or against the exercise module through applying mechanical
energy to the actuating part of the exercise module while allowing
the user to consume energy during the exercise. To the contrary,
the simulated user included in the images of the prior art video or
computer games is manipulated entirely based on user inputs
applied thereonto by various body parts of the user and, more
importantly, not based upon user inputs applied to the actuating
part of the exercise module.
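The distinction drawn in this paragraph can be sketched in a few lines of Python (an illustrative model only; the function name, parameters, and incline rule are assumptions, not part of the specification): the simulated user's motion derives from features measured at the actuating part, not from controller button presses:

```python
def simulated_user_speed(belt_speed_ms, incline_deg, scale=1.0):
    """Map features of the exercise (belt speed and incline of a
    treadmill's actuating part) to the simulated user's in-game speed.
    Steeper inclines cost the user more energy, so the avatar receives
    a small bonus per degree of incline (an assumed rule)."""
    incline_bonus = 1.0 + 0.02 * max(incline_deg, 0.0)
    return belt_speed_ms * incline_bonus * scale

# A prior art game pad would feed button states into the game instead;
# here the avatar moves only when the user drives the actuating part.
avatar_speed = simulated_user_speed(belt_speed_ms=2.5, incline_deg=5.0)
```

Because the avatar's speed is zero whenever the belt is stationary, the user necessarily consumes mechanical energy at the actuating part to make any progress in the task, which is exactly the property the paragraph uses to distinguish this "game" from prior art games.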
[0087] As used herein, a "story" is to collectively refer to images
which are provided sequentially and which generally correspond to
prior art movies, plays, shows, musicals, concerts, and the like,
where the images of each of the "stories" as a whole define
artistic (i.e., literary or musical) themes and are arranged in
artistic sequences. In general, such a "story" within the scope of
the present invention is to be distinguished from the "game" in two
major aspects. In one aspect, the "game" always includes therein at
least one simulated user to be manipulated based on at least one of
various features of the task, user, exercise, or operation of an
exercise module, while the "story" is not to include therein any
simulated user. Therefore, the user of the "story" cannot
manipulate any objects (or backgrounds) of such images contrary to
the user of the "game" who can manipulate at least one of such
objects (or backgrounds) as set forth above. It is to be understood
that, although the user of the "story" may not manipulate any
objects (or backgrounds) in the images, the user may manipulate
other features of the images such as, e.g., a view angle of the
images, a speed of displaying each image, a temporal gap between
displaying such images, a sequence of displaying the images or
groups of such images, and the like. In another aspect, the "game"
always includes therein such objects (or backgrounds) which are
animated or imaginary, while the "story" is to include only
photographs or photographical versions of real objects (or
backgrounds) present in a real environment. Within the scope of
this invention, the images including not only the real object (or
background) but also the animated or imaginary object (or
background) are to be deemed as the "story" as far as the images
define the artistic themes and are arranged in the artistic
sequence. When desirable, the "story" may be arranged to
synchronize such images with sounds for various purposes such as,
e.g., assisting the user, audibly depicting certain stages, events
or landmarks of the "story," enhancing audiovisual quality of the
"story," and the like. When the exercise system of this invention
is synchronized with the "story" and the user is to view the
"story" while performing the exercise, viewing such a "story" is to
correspond to the task of his or her exercise, while the goal of
the "story" is to correspond to the goal of the task of the user,
i.e., to view a preset or entire portion of the "story" by
manipulating various features of such images as set forth in this
paragraph. The "story" or task may include any arbitrary number of
stages therein, where such stages may be formed as chapters, plots
or parts, primarily based upon contents or contexts thereof, time,
and the like. Accordingly, the "story" may include a single stage
in which the user views only a portion or entire portion thereof
depending upon various features, may have multiple stages in which
the user is allowed to proceed to a next stage only upon achieving
a preset goal defined for a stage, and so on. The "story" may also
define a single or multiple goals and maintain the goals
therethrough, through the operation of the exercise module, through
exercise of the user, and so on. Alternatively, the "story" may
define a single goal and thereafter vary the goal, through the
operation of the exercise module, through user's exercise, and so
on. In the alternative, the "story" may set forth multiple goals,
select one of the goals and then select another goal, through the
operation of the exercise module, or through exercise of the user.
In another alternative, the "story" may define a single or multiple
goals and thereafter manipulate its goal at least partly based upon
any of the features.
[0088] As used herein, a "scenery" is to collectively refer to
images which are provided sequentially and generally correspond to
prior art visual archives, documentaries, movies, and the like,
where the images of each of such "sceneries" define aesthetic
themes as a whole and where such images are arranged in geographic
sequences. Similar to the above "story," a "scenery" within the
scope of this invention is to be also distinguished from the "game"
in two major aspects. In one aspect, the "game" always includes at
least one simulated user to manipulate the rest of the objects (or
backgrounds) of such images therewith, while the "scenery" is not
to include therein any simulated users. Therefore, the user cannot
manipulate any objects (or backgrounds) of the images of the
"scenery" contrary to the user of the "game" who manipulates at
least one of such objects (or backgrounds) as described above. It
is to be understood that, although the user of the "scenery" may
not manipulate any objects (or backgrounds) in the images, the user
may manipulate other features of the images such as, e.g., a view
angle of the images, a speed of displaying each image, a temporal
gap between displaying the images, a sequence of displaying the
images or groups of multiple images, and the like. In addition, the
"scenery" may incorporate at least one simulated user which may be
manipulated based upon various features while not changing any
objects (or backgrounds) included in the images. In another aspect,
the "game" always includes therein the above objects (or
backgrounds) which are either animated or imaginary, while the
"scenery" is to include therein only photographs or photographical
versions of real objects (or backgrounds) present in a real
environment. Within the scope of this invention, the images which
may include not only the real object (or background) but also the
animated or imaginary object (or background) are to be deemed as
the "scenery" as long as such images may define the aesthetic theme
and are arranged in the geographic sequence. When desired, the
"scenery" may be arranged to synchronize the images with sounds for
similar purposes such as, e.g., assisting the user, audibly
depicting certain stages, events, and/or landmarks of the
"scenery," enhancing audiovisual quality of the "scenery," and the
like. When such an exercise system of this invention is
synchronized with the "scenery" and the user is to view the
"scenery" while performing the exercise, viewing the "scenery" is
to correspond to the task of his or her exercise, while the goal of
the "scenery" is to correspond to the goal of the task, i.e., to
view a preset or entire portion of the "scenery" by manipulating
the above features of such images set forth in this paragraph. The
"scenery" or task may include any arbitrary number of stages
therein, where the stages may be formed as chapters, plots, and/or
parts, primarily based on contents or contexts, time, geography,
landmarks, distance defined in the real environment or in the
images, and the like. Accordingly, the "scenery" may form a single
stage therein in which the user views only a portion or entire
portion thereof depending on various features, may include multiple
stages in which the user is allowed to proceed to a next stage only
by achieving a preset goal which is defined for a current stage,
and the like. The "scenery" may further define a single or multiple
goals and maintain the goals therethrough, through the operation of
the exercise module, through exercise of the user, and so on.
Alternatively, the "scenery" may define a single goal and
thereafter vary the goal, through the operation of the exercise
module, through exercise of the user, and the like. In the
alternative, the "scenery" may define multiple goals, select one of
such goals, and thereafter select another of the goals, through the
operation of the exercise module, through exercise of the user. In
another alternative, the "scenery" may define a single or multiple
goals and thereafter manipulate its goal at least partly based on
any of the features. It is noted that the "scenery" may be defined
based on various different geographies so that the "scenery" may
represent the images of land geography, underwater geography,
astronomical geography, and the like, as far as such images define
aesthetic themes as a whole and are arranged in geographic
sequences.
[0089] As used herein, a "virtual environment" (or to be
abbreviated as "VE" hereinafter) is to refer to an environment
always including at least one visual feature (or "images") and also
optionally including at least one auditory feature (or "sounds"),
at least one olfactory feature (or "smells"), and/or at least one
tactile feature (or "sensations"), where each of the above features
may represent, connote or be associated with at least one of
multiple elements such as a preset object, background, event,
activity, surrounding, geographic region, and so on. Therefore, the
"VE" may include such features pertaining to at least one of the
above elements or, alternatively, may include such features
pertaining to at least one of such elements and those features
pertaining to at least one of a different object, background,
event, activity, surrounding, region, and so on. It is appreciated
within the scope of this invention that a mere display of
alphanumerals, symbols, or loads imposed by the exercise equipment
on the user is deemed not to constitute the "VE" and that a mere
display of a temporal characteristic, distribution, or variation of
such loads is deemed not to constitute the "VE" as well. That is,
the "VE" in this invention is to preferably include at least one of
such visual, auditory, olfactory or tactile features which relate
the user with at least one of the elements. Therefore, the "VE" in
this invention must include the visual feature and may optionally
include at least one of the auditory, olfactory, and tactile
features related to the above elements.
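Under the definition above, a "VE" must carry at least one visual feature, may optionally carry auditory, olfactory, and tactile features, and a bare readout of loads does not qualify. A minimal Python sketch (the class and argument names are illustrative assumptions) encodes exactly that constraint:

```python
class VirtualEnvironment:
    """A VE always requires images; sounds, smells, and tactile
    sensations are optional additions (per the definition in [0089])."""
    def __init__(self, images, sounds=None, smells=None, sensations=None):
        if not images:
            # A mere numeric display of loads, with no visual feature
            # depicting an object, background, event, etc., is not a VE.
            raise ValueError("a VE requires at least one visual feature")
        self.images = images
        self.sounds = sounds or []
        self.smells = smells or []
        self.sensations = sensations or []

ve = VirtualEnvironment(images=["forest_trail.png"], smells=["pine"])
```

The optional features default to empty rather than absent, mirroring the text: their presence enriches the VE but only the visual feature is mandatory.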
[0090] As used herein, "features" refer to various aspects related
to or associated with a preset VE, including temporal or spatial
characteristics, distributions, or variations of such aspects. For
example, the "visual feature" may refer to various visual aspects
examples of which may include, but not limited to, a shape and a
size of an image, contents carried thereby, its color and
brightness, its contrast and sharpness, its zoom, its temporal or
spatial characteristics, distributions or variations, and so on.
The "auditory feature" may refer to various auditory (or audible)
aspects examples of which may include, but not be limited to, a
volume or loudness of a sound, its tone, its balance (e.g., in a
stereo mode), its frequency distribution, a direction from its
source, and temporal or spatial characteristics, distributions or
variations of the audible (or auditory) aspects, and so on. The
"olfactory feature" may then include various olfactory aspects
examples of which may include, but not be limited to, a type of a
smell, its intensity, temporal or spatial characteristics,
distributions or variations of such olfactory aspects, and the
like. The "tactile feature" may then refer to various tactile
aspects examples of which may include, but not be limited to,
mechanical, thermal, and/or electrical properties of various parts
of such exercise and/or output modules of the exercise system of
this invention.
[0091] Unless otherwise defined in the following specification, all
technical and scientific terms used herein have the same meaning as
commonly understood by one of ordinary skill in the art to which
the present invention belongs. Although methods or materials
equivalent or similar to those described herein can be used in the
practice or in the testing of the present invention, suitable
methods and materials are described below. All publications, patent
applications, patents, and/or other references mentioned herein are
incorporated by reference in their entirety. In case of any
conflict, the present specification, including definitions, will
control. In addition, the materials, methods, and examples are
illustrative only and are not intended to limit the scope of the
present invention.
[0092] Other features and/or advantages of the present invention
will be apparent from the following detailed description, and from
the claims as well.
BRIEF DESCRIPTION OF THE DRAWINGS
[0093] FIG. 1 is a schematic diagram of various modules of an
exemplary exercise system according to the present invention;
[0094] FIGS. 2A to 2F are schematic diagrams of exemplary exercise
systems incorporating therein a different number of modules;
[0095] FIGS. 3A and 3B are schematic perspective views of exemplary
exercise systems according to the present invention; and
[0096] FIGS. 4A and 4B are schematic perspective views of exemplary
exercise systems simulating exercising users as simulated users of
tasks of video games and allowing the users to compete with each other
according to the present invention.
DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS
[0097] The present invention generally relates to an exercise
system which includes multiple exercise modules and which provides
a preset task of a story, a scenery, and/or a video (or computer)
game to multiple users who simultaneously perform the same or
different exercises on, with, and/or against such exercise modules.
More particularly, the present invention relates to an exercise
system which includes multiple exercise modules disposed in
different locations but coupling to each other through a local or
global network and then provides multiple users exercising on, with
or against the exercise modules with a preset task in a format of a
story, scenery, or video (or computer) game each defining at least
one preset goal and provided in images of a virtual environment so
that the users can compete with each other in the images of the task
for the goal at least partly based upon at least one feature of the
users, exercises or operation of the exercise modules. To this end,
the exercise system may simulate at least one of the users as at
least one simulated user which is included in the images (or as at
least one element of the images), and manipulate the simulated user
at least partly based upon at least one feature of the task, users,
exercises or operation of the exercise modules, where examples of
such elements include a preset object and background in the images.
Thereby, the users can perform the exercises on, with, and/or
against such exercise modules while simultaneously pursuing the
goal of the task based on at least one feature of the exercises.
Conversely, the present invention relates to an exercise system
which provides the users with such a task and manipulates at least
one feature of the operation of at least one of its exercise
modules directly or indirectly related to the exercises at least
partly based on at least one feature of the task. To this end, such
an exercise system simulates at least one of the users as at least
one simulated user and manipulates the operation of the system at
least partly based on at least one feature of the task with or
without allowing the users to manipulate other elements of the task
through at least one feature directly or indirectly related to such
exercises. Whereby, the users can perform the exercises while
pursuing the goal of the task based on progress of the task which
is determined at least partly based on the exercises. In all of
such embodiments, the system always includes at least one output
module which in turn incorporates at least two visual units
disposed in different locations and displaying the task images to
each exercising user and which also includes at least one olfactory
or tactile unit providing respectively smells or sensations related
to the task. Therefore, the exercising users can perform the
exercise while participating in the task as well as monitoring
progress of the task by watching the images, hearing the sounds,
smelling the smells or feeling the sensations.
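The overall flow described above (two or more networked exercise modules feeding user features into a shared task whose images are displayed back to each user) might be sketched as follows; this is an illustrative toy loop and every name in it is an assumption:

```python
class ExerciseModule:
    """One networked station: reports exercise features for its user."""
    def __init__(self, user):
        self.user = user
        self.belt_speed = 0.0   # a feature of the exercise (e.g., treadmill)

    def report_features(self):
        # In the real system these features would travel over the
        # local or global network linking the different locations.
        return {"user": self.user, "speed": self.belt_speed}

def run_task_step(modules):
    """One tick of the shared task: rank simulated users by exercise effort."""
    features = [m.report_features() for m in modules]
    standings = sorted(features, key=lambda f: f["speed"], reverse=True)
    # Each location's visual unit would render these standings in the
    # task images, alongside any synchronized sounds, smells, etc.
    return [f["user"] for f in standings]

a, b = ExerciseModule("Alice"), ExerciseModule("Bob")
a.belt_speed, b.belt_speed = 2.0, 3.5
run_task_step([a, b])   # Bob currently leads the simulated race
```

The converse embodiment described in the paragraph would invert the last step: rather than the exercise features driving the task standings, the task's state would be fed back to adjust each module's operation (e.g., its load).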
[0098] The present invention also relates to various methods of
providing such a task of a scenery, a story or a game to multiple
users engaged in exercises on, with or against multiple exercise
modules incorporated in different locations via a local or global
network and relating at least one feature of the task with at least
one feature related with the exercises or vice versa. More
particularly, the present invention relates to various methods of
providing communication between multiple exercise modules in
different locations, monitoring various features of the task,
users, exercises performed by such users or operation of the
exercise modules, and then allowing the users to compete in a
virtual environment defining images, sounds, smells or sensations
while transferring such features between the exercise modules
through the local or global network. Based thereon, the present
invention relates to various methods of providing the users with
such images for the task, those of simulating such users as the
simulated users included into the task, and those of manipulating
the simulated users in the images for the task at least partly
based upon the features of the exercises, users or operations to
manipulate at least one feature of the task to accomplish the task
goal. Conversely, the present invention relates to various methods
of providing such images to the exercising users, those of
simulating such users as the simulated users, and those of
manipulating at least one feature of the operation of such exercise
modules at least partly based on at least one task feature, thereby
affecting at least one feature of the exercises for the goal of the
task. In all of these embodiments, the present invention relates to
various methods of performing the above manipulations when such
users simultaneously perform the same or different exercises on,
with or against the exercise modules. The present invention further
relates to various methods of providing a compact or full-size
visual unit for each user, those of disposing such visual units in
a single or multiple view angles of such users, and those of
displaying such images for the users during their exercises. The
present invention relates to various methods of providing such
users with sounds, smells or sensations for the task, those of
synchronizing such sounds, smells or sensations with the images for
the task, and the like.
[0099] The present invention further relates to various processes
for fabricating an exercise system capable of providing the task of
the story, scenery or video (or computer) game to multiple users
who engage in the same or different exercises in different
locations while competing with each other in such a task. More
particularly, the present invention relates to various processes
for providing the system which allows multiple users to
simultaneously perform the exercises, those of generating the task
in a format of the story, scenery or game each defining at least
one preset goal and provided in images of a virtual environment
therefor, and those of manipulating progress of the task at least
partly based on at least one feature directly or indirectly related
with the exercises performed by the users. To these ends, the
present invention relates to various processes for defining and
assessing various features of the exercises performed by the users
or provided by the exercise modules, at least one operation of the
exercise modules, users engaging such exercises or the task, those
of simulating the users as the simulated users, those of
manipulating the simulated users at least partly based on the
features of the exercises, users or operation, and the like.
Thereby, multiple users can simultaneously perform the same or
different exercises while competing with each other in the images of the
task and pursuing the task goal based upon the exercises.
Conversely, the present invention relates to various processes for
providing the exercising users with the task, and those of
manipulating at least one feature of the operation of the system
directly or indirectly related to the exercises at least partly
based on at least one feature of the task. To this end, the present
invention relates to various processes for defining and then
assessing the above features, those of simulating the users as the
simulated users, those of manipulating the operation of the
exercise modules at least partly based on at least one feature of
the task with or without allowing the users to manipulate other
elements of such a task based on at least one feature directly or
indirectly related to the exercises, and so on. Whereby, the users
can perform the exercises while pursuing the preset task goal based
on progress of the task which is affected at least partly by the
exercises. In either embodiment, the present invention relates to
various processes of including in the exercise system multiple
visual units which display the images of the task for each user,
define compact or full-size configurations, and are in dispositions
and arrangements providing a single or multiple view angles to the
users who then monitor progress of the task and manipulate such
simulated users during the exercises while competing with each other
for the task. The present invention further relates to various processes
of providing such users with sounds, smells or sensations for the
task and those of synchronizing such sounds, smells or sensations
with the images for the task.
[0100] Various aspects and/or embodiments of various systems,
methods, and/or processes of this invention will now be described
more particularly with reference to the accompanying drawings and
text, where such aspects and/or embodiments thereof only represent
different forms. Such systems, methods, and/or processes of this
invention, however, may also be embodied in many other different
forms and, accordingly, should not be limited to such aspects
and/or embodiments which are set forth herein. Rather, various
exemplary aspects and/or embodiments described herein are provided
so that this disclosure will be thorough and complete, and fully
convey the scope of the present invention to one of ordinary skill
in the relevant art.
[0101] Unless otherwise specified, various modules, units,
elements, and parts of various exercise systems of this invention
are typically not drawn to scale or proportion for ease of
illustration. It is appreciated that such modules, units, elements,
and parts of the exercise systems of this invention designated by
the same and/or similar numerals generally represent the same,
similar or functionally equivalent modules, units, elements, and/or
parts thereof, respectively.
[0102] In one aspect of the present invention, an exemplary
exercise system is fabricated in various arrangements, while
incorporating therein various modules and units for providing such
exercise and task each defining various features. FIG. 1 describes
a schematic diagram of various modules of an exemplary exercise
system according to the present invention, where the system 10
includes therein at least two standard exercise modules or simply
exercise modules 20 (or 20A and 20B), at least one control module
40, and at least one output module 50. It is noted that the
exercise modules 20 may be disposed in the same location or in
different, geographically separate locations.
[0103] The exercise system 10 may be deemed to include all the
modules 20, 40, 50 or, alternatively, to have only the control and
output modules 40, 50 which are added to at least two pieces of prior
art exercise equipment which may be deemed not to be a part of the
system 10. It is appreciated that classification of various members
20, 40, 50 is not critical to the scope of this invention as far as
such a system 10 performs various functions set forth herein.
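The coupling among the exercise, control, and output modules 20, 40, 50 described above can be illustrated with a minimal sketch. All class and method names below are hypothetical and only mirror the roles this disclosure assigns to each module; they are not part of the disclosed system itself.

```python
# Hypothetical sketch of the module roles described above; names are
# illustrative assumptions, not part of the disclosure.

class ExerciseModule:
    """One exercise module 20 on, with or against which a single user exercises."""
    def __init__(self, name):
        self.name = name
        self.speed = 0.0  # an example operation feature

    def report_features(self):
        # Transmit operation features toward the control module.
        return {"module": self.name, "speed": self.speed}

class OutputModule:
    """An output module 50 with a visual unit displaying task images."""
    def __init__(self):
        self.frames = []

    def display(self, image):
        self.frames.append(image)

class ControlModule:
    """A control module 40 coupling the exercise modules 20 with the output module 50."""
    def __init__(self, exercise_modules, output_module):
        self.exercise_modules = exercise_modules
        self.output_module = output_module

    def update(self):
        # Reflect each module's operation features in the displayed images.
        for module in self.exercise_modules:
            features = module.report_features()
            self.output_module.display(
                f"{features['module']}: {features['speed']:.1f} km/h")

system = ControlModule([ExerciseModule("20A"), ExerciseModule("20B")],
                       OutputModule())
system.exercise_modules[0].speed = 8.5
system.update()
```

The sketch shows only the direction of coupling (exercise modules report features; the control module reflects them in the output module), which is the relationship the paragraphs above describe.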
[0104] Each exercise module 20A, 20B is arranged to force or
facilitate a single user to consume energy by
performing an exercise, e.g., by performing physical work thereon,
therewith or thereagainst or by receiving physical or electrical
energy therefrom and then vibrating or twitching his or her muscles
based thereupon. As a result, each exercise module 20 may improve
or enhance a muscle tone of the user, increase his or her muscle
mass or volume, force or facilitate the user to reduce his or her
weight, or increase his or her physical stamina, primarily by
performing such exercise. In this context, the exercise modules 20
may include any prior art exercise or fitness equipment as set
forth herein. In general, the exercise modules 20 allow the user
to exercise by performing voluntary or involuntary activities of
various muscles of the user leading to the above results.
Accordingly, the exercise module 20 may include not only the prior
art exercise or fitness equipment commonly seen in a gymnasium or
fitness center but also any conventional equipment supplying
electrical energy to the user and inducing involuntary contraction
of the muscles. It is appreciated that the exercise modules 20
perform various operations in order to set a specific exercise to
be performed by the user and the operations are characterized by
various features (i.e., "features of operation" or "operation
features" as defined above). It is also noted that the exercise set
by such operations is also characterized by various features
(i.e., "features of exercise" or "exercise features" as defined
above).
[0105] Regardless of its particular operating mechanism, each exercise
module 20A, 20B typically includes at least one actuating part
which is specifically shaped and sized to contact at least one body
part of the user for receiving mechanical energy therefrom. For
example, the actuating part may be shaped and sized as a track, a
pedal, a weight, a lever, a handle, a belt, a pad, and the like,
where details of such have been set forth herein above. The
exercise module 20 may incorporate at least one load as set forth
herein and couple the actuating part with the load so that an
amount of energy to be supplied by the user to the actuating part
per unit exercise may vary depending upon a magnitude of the
load. When the exercise module 20 is to provide the electrical
energy to the user, it may include at least one actuating part
which contacts a body part of the user and supplies electric
current therein or electric voltage thereacross, where examples of
the actuating part may include, but are not limited to, electrodes,
handles, pedals, belts, and so on. Thus, the exercise module 20 of
this invention does not include the prior art device which is
preferentially intended to engage the user in solely playing the
prior art video or computer game, although the exercise module 20
of this invention may be modified to allow the user to engage in
any of such games to obtain the aforementioned results.
[0106] As set forth herein, the exercise modules 20 may be disposed
in different locations, where the exercise modules 20A, 20B
preferably couple with each other directly or indirectly by a local
or global network encompassing those locations such as, e.g., those
locations in the same or different districts (or cities), those in
the same or different countries (or time zones), and those in the
same or different planets. It is also appreciated that the above
exercise modules 20 may provide such exercises which define the
same or similar types and different extents or, alternatively,
which have different types and same, similar or different extents.
More particularly, such exercise modules 20 preferably provide the
same, similar or different exercises when the users are to
simultaneously perform such exercises on, with or against each
exercise module 20. When desirable, such exercise modules 20 may
provide the same, similar or different exercises in a delayed mode
when the users are to perform the exercises at different times.
Based thereupon, each exercise module 20 may include the same,
similar or different actuating parts to provide such exercises. The
exercise modules 20 may be operatively coupled with each other
directly or indirectly by another unit or module or disposed
independently of each other.
[0107] The exercise system 10 operatively couples the exercise
modules 20 with the control module 40 so as to perform various
functions as set forth herein. For example, the exercise modules 20
may receive at least one control signal from the control module 40
so as to manipulate at least one of their operations. Conversely,
such exercise modules 20 may transmit various signals to the
control module 40 which monitors various variables or parameters of
operations of the exercise modules 20 based thereon. The exercise
system 10 also couples the exercise modules 20 with the output
module 50 by the control module 40 so that at least one feature of
the operation of the exercise modules 20 may be reflected by the
output module 50 or at least one feature of such images for the
task may be reflected by the operation of the exercise modules 20.
Alternatively, the system 10 may also allow the exercise modules 20
to directly couple with the output module 50. In any of such examples,
the modules 20, 40, 50 may also couple to each other by
transmitting and/or receiving at least one feature of the task,
users, exercises or operation of such exercise modules 20 via a
local or global network encompassing the different locations
as will be described below. Although the embodiment of FIG. 1
describes the exercise system 10 with a single control module 40,
it is appreciated that such a system 10 may also include multiple
control modules when the exercise modules 20 are disposed in
different locations and it is desired to perform various control
functions for each exercise module 20 in each location.
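The exchange described in paragraph [0107], in which the control module 40 monitors operation features reported by an exercise module 20 and returns control signals manipulating its operation, can be sketched as a simple feedback loop. The proportional correction, gain, and speed values below are illustrative assumptions, not a disclosed control law.

```python
# Illustrative control loop: the control module 40 monitors an operation
# feature (speed) reported by an exercise module 20 and returns a control
# signal that manipulates that operation. Gain and targets are assumptions.

def control_signal(reported_speed, target_speed, gain=0.5):
    """Compute a speed adjustment driving the module toward a task-set target."""
    # A simple proportional correction; an actual controller could be far
    # more elaborate (scripted profiles, safety limits, etc.).
    return gain * (target_speed - reported_speed)

def apply_control(module_speed, target_speed, steps=10):
    """Iteratively apply control signals, simulating the module's response."""
    for _ in range(steps):
        module_speed += control_signal(module_speed, target_speed)
    return module_speed

# The module's speed converges toward the target set by the task.
final_speed = apply_control(module_speed=4.0, target_speed=10.0)
```

The same loop runs in reverse in the other embodiment: the task features drive the target, and the exercise module's operation follows.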
[0108] As will be described in detail below, the exercise modules
20 may couple to or incorporate at least one of various units of
the control or output module 40, 50 so that the exercise modules 20
may directly participate in providing the task of the story, scenery
or game in the images, in performing various control functions, in
providing the images or other features of the virtual environment,
and the like. The exercise modules 20 are preferably provided as
separate modules of the system 10 as shown in FIG. 1. The system 10
may instead consist only of the control and output modules 40, 50
to which multiple prior art fitness or exercise equipment are
retrofit, where the equipment serves as external modules.
[0109] The output module 50 provides the users with at least one
virtual environment which includes at least one visual feature
(i.e., images) and may optionally include at least one auditory
feature (i.e., sounds), at least one olfactory feature (i.e.,
smells), and at least one tactile feature (i.e., sensations) each
of which is related to the task of the story, scenery or video (or
computer) game provided in the images. To this end, such a module
50 includes multiple visual (output) units 51, at least one
optional auditory (output) unit 52, olfactory (output) unit 53,
tactile (output) unit 54, and display unit 55.
[0110] To display such images or provide other features of the
virtual environment, the output module 50 operatively couples with
the control module 40 and optionally at least one of the exercise
modules 20. As in FIG. 1, the output module 50 may be provided as a
module of the system 10 or, alternatively, may be provided as an
add-on module coupling with or retrofit to the system 10 which
consists of the exercise and control modules 20, 40. The system 10
may not include any output module but couple to an external
audiovisual device to recruit its pre-existing audiovisual capacity
such as, e.g., a CRT, an LCD, an OLED, an IOLED, a PDP or any
screen capable of displaying the images in a black-and-white or
color mode, where examples of such external audiovisual devices may
include, but are not limited to, a stationary or portable audiovisual
device including at least one screen (e.g., a DVD player and TV), a
portable data processing device including at least one screen
(e.g., a PDA, a data organizer or laptop computer), a portable
communication device with at least one screen (e.g., a cellular or
mobile phone), and the like. Similarly, the system 10 may couple to
an external olfactory or tactile device to recruit a pre-existing
olfactory or tactile capacity of the device. In addition to various
functions to be described in conjunction with such units 51-55, the
output module 50 may include at least a portion of the control
module 40, may perform at least one function of the control module
40, or may be recruited as at least a portion of the control module
40.
[0111] The exercise system 10 may include a single output module 50
including multiple visual units 51 and one or multiple auditory,
olfactory, tactile, and/or display units 52-55 or, alternatively,
may include multiple output modules 50 each including a single or
multiple of such units 51-55, each of which may define similar or
different configurations and perform similar or different
functions. Accordingly, the system 10 may display images of a task
by a single or multiple output modules 50 to the users
simultaneously exercising on, with or against the exercise modules
20 or performing the exercise in a delayed mode. Therefore, the
single output module 50 may serve as a "common" output module 50 of
the system 10, whereas each output module 50 may function as an
"individual" output module 50. The same applies to each unit 51-55
of the output module 50 as well.
[0112] As will be disclosed below, the output module 50 is
preferably disposed to allow multiple users to view the images
displayed on multiple compact or full-size visual units 51 in a
single or multiple view angles and within a viewable distance
therefrom, to allow the users to hear the sounds played by the
auditory unit 52, to allow the users to smell substances delivered
by the olfactory unit 53, to allow the users to feel the sensations
generated by the tactile unit 54, and so on. Therefore, the output
module 50 (or at least one unit thereof) may be disposed away from
the exercise and control modules 20, 40, while being supported by a
structure in which the system 10 is installed. In the alternative,
the output module 50 (or at least one unit thereof) may be
incorporated into various portable articles carried by the users or
various wearable articles to be worn thereby, where examples of
such articles may include, e.g., glasses, goggles, a helmet, a hat,
a cap, a head band, an earphone, a headphone, an earpiece, a
headpiece, a hair band, a ring, a glove, or any other articles for
releasably coupling such to the users or their clothes. Other
configurations of the output module 50 (or at least one of its
units) may depend on a specific type of the task provided in the
images, and their disposition or arrangement may depend on the type
of the task or the number of the exercise modules 20A, 20B. Although
the embodiment of FIG. 1 describes the exercise system 10 including
a single output module 50, such a system 10 may also include
multiple output modules when the exercise modules 20 are disposed
in different locations and it is desired to perform various output
functions for each user exercising on, with, and/or against a
different exercise module 20 in each location.
[0113] It is appreciated that the output module 50 (or at least one
of its units) may perform at least one function of the control
module 40 or that at least a portion of the control module 40 may
be included in such a module 50. For example, the output module 50
may be arranged to generate at least a portion of the task in such
images. In general, the output module 50 (or at least one of its
units) is arranged to operate at least partly based on various
control signals supplied by the control module 40. The output
module 50 (or at least one of its units) may also operate at least
partly based on user inputs supplied directly thereto by the user
or transmitted thereto through the exercise or control modules 20,
40.
[0114] A main function of each visual unit 51 is to display the
images (i.e., visual feature) of the virtual environment for the
task of the story, scenery or game, where the images may be
generated thereby, stored and retrieved therefrom, supplied from or
transmitted by the control module 40 wirelessly or by wire,
transmitted through wire or wirelessly from an external source such
as, e.g., an external image-supplying device, user(s), other
persons, and the like. Each visual unit 51 may preferably embody
the task by displaying the images of various types such as, e.g., a
still picture of a single portion (i.e., an entire portion of the
picture is displayed simultaneously), a still picture of multiple
portions, a different portion of which is displayed one at a time, a
series of the still pictures, a video clip, or a mixture thereof.
[0115] At least one of the visual units 51 of the output module 50
may define a compact configuration so that, when disposed and
arranged properly within the viewable distance, one exercising user
can view an entire portion of the unit 51 in a single view angle,
where details of such a visual unit 51 have been disclosed in the
co-pending Applications. At least one of the visual units 51 may
have a full-size configuration so that, when disposed and arranged
properly in a viewable distance, one exercising user may view
different portions of the visual unit 51 in each of such view
angles of the user. Details of the full-size visual unit 51 have
also been disclosed in the co-pending Applications.
[0116] The images displayed on the visual units 51 may be in
black-and-white or multiple colors, and may be two- or
three-dimensional. Such images may include various features such
as, e.g., at least one object, background, optional simulated user,
manipulatable feature, and the like, where the object and
background may correspond to a living organism (e.g., a person, an
animal, a plant, and the like), a nonliving object or a mixture
thereof. When the task is the video or computer game, each object
and background is preferably synthesized or, at best, animated to
at least partly but not entirely resemble a real appearance of the
user(s). The object and background may be synthesized by a program
with or without animating the user(s) as well, where the simulated
user may correspond to the animated or simulated version of at
least one user feature. When the task is the story or scenery, each
object and background preferably describes an actual configuration
of the living organism, nonliving object, their mixture in the real
world, and so on. The story or scenery may not typically include
the simulated user therein, although the task thereof may also
include at least one manipulatable feature. In particular, the
background may represent an outdoor or indoor environment, a
ground, a desert, a forest or woods, a mountain, a cave, a
building, a stadium, a ring, a river, a lake, a water fall, an
ocean, an underwater environment, a sky, a universe, a planet, a
moon, a star, or other settings against or in which such an object
may be included or portrayed, where the event may include an
athletic game, a festival, a combat, a meeting, a contest, an exam,
a business, or any other happenings related to the object or
background, where the activity may then include walking, running,
sprinting, jogging, jumping, rowing, throwing, pushing with or
pulling a leg or an arm of the user, bending a leg, an arm, or any
joint of the user, or rotating a leg, an arm, a joint, a waist, or
a neck of the user, while the geographic region may include any
landmark such as, e.g., a city, an urban or rural area, a monument,
a theater, a building, a street, a road, a tunnel, a park, a
desert, a forest, a mountain, a cave, a river, a lake,
woods, a waterfall, an ocean, a sky, a star, a universe, and so
on. Each visual unit 51 may then provide the images for the task on
a single portion of its image domain, where further division of the
images is not plausible. Alternatively, each visual unit 51 may
provide the images simultaneously on multiple portions of its image
domain, where each of such portions of the image domain may display
at least one object or background thereon.
[0117] The images for the task of the story, scenery or video (or
computer) game also define various features which are referred to
as "features" of the task or, simply, "task features" as defined
above. At least one of the task features is preferably manipulated
at least partly based on at least one feature of the exercises
engaged by the users (i.e., exercise features), at least one
operation of the exercise modules 20 (i.e., operation features), or
users (i.e., user features) such that the task features may be
manipulated directly or indirectly by the exercise features. In
general, the system 10 may manipulate any task feature, although
examples of the manipulatable task features may include, but are not
limited to, a shape and a size of the object or background, a shape
and size of the simulated user, a number of the object, background
or simulated user, their configuration, their arrangement, their
disposition, their orientation, their color or hue, their contrast
or brightness, their sharpness, view or projection angles thereof,
a distance thereto, temporal or spatial characteristics,
distributions, or variations of any of the above, and
the like.
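The manipulation described in paragraph [0117] amounts to a mapping from exercise, operation, or user features to task features. The sketch below is a hypothetical example of such a mapping; the feature names ("cadence_rpm", "resistance") and scaling factors are illustrative assumptions, not the disclosed method.

```python
# Hypothetical mapping of exercise features to manipulatable task features.
# Feature names and scaling factors are illustrative only.

def manipulate_task_features(exercise_features):
    """Derive task features (e.g., of the simulated user) from exercise features."""
    cadence = exercise_features.get("cadence_rpm", 0)    # pedaling rate
    resistance = exercise_features.get("resistance", 1)  # load magnitude
    return {
        # The simulated user's speed in the images tracks the pedaling cadence.
        "simulated_user_speed": cadence * 0.2,
        # The apparent zoom of the background varies with the load.
        "background_zoom": 1.0 + 0.1 * resistance,
    }

task = manipulate_task_features({"cadence_rpm": 90, "resistance": 3})
```

Any of the manipulatable task features enumerated above (shape, size, disposition, color, view angle, and so on) could stand on the left-hand side of such a mapping.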
[0118] The task images may be provided or generated by various
means. For example, the visual unit 51 (or control module 40) may
store a single or multiple images and retrieve one or more images,
may generate a single or multiple images by superposing one of the
object and background onto the other or by composing such images
from one or multiple visual elements, may receive (or download) one
or multiple images from the control module 40 or at least one
external source such as, e.g., an internet, a wired broadcast or
wireless broadcast, another user of another system, and the like.
The visual unit 51 may also acquire one or multiple images by the
sensor unit 42. In all of such examples, each image may include at
least one object and background as set forth herein. As set forth
herein, the visual unit 51 performs any of such functions based
upon the control signals (i.e., automatically or adaptively), based
on the user inputs (i.e., manually), based on at least one of the
features of the user, exercises, or operations of the exercise
module 20, and the like.
[0119] To provide the images for the task, the visual unit 51 (or
control module 40) may preferably be arranged to provide the
desired visual feature in a preset viewpoint with respect to the
user. To this end, the visual unit 51 (or control module 40) may
control various features of the images as described above. The task
images may be provided based on a preset view angle or distance
with respect to the user, or the images may be provided to
simulate the view angle or distance. When desirable, the visual
unit 51 may zoom in or out of such images, change the view angle of
the images, vary the distance to the images, rotate the images with
respect to a preset reference point, and the like.
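The viewpoint control described in paragraph [0119] (zooming, changing the view angle, varying the distance, rotating about a reference point) reduces to a simple coordinate transform. The sketch below applies such a transform to a 2-D image coordinate; the function name and parameters are assumptions for illustration only.

```python
import math

# Illustrative viewpoint control: zoom and rotate image coordinates about a
# preset reference point, as the visual unit 51 (or control module 40) might.

def transform_point(x, y, ref=(0.0, 0.0), zoom=1.0, angle_deg=0.0):
    """Zoom and rotate the point (x, y) about the reference point."""
    rx, ry = x - ref[0], y - ref[1]
    a = math.radians(angle_deg)
    # Rotate about the reference, scale, then translate back.
    xr = (rx * math.cos(a) - ry * math.sin(a)) * zoom + ref[0]
    yr = (rx * math.sin(a) + ry * math.cos(a)) * zoom + ref[1]
    return xr, yr

# Zooming in by 2x doubles the offset from the reference point.
p = transform_point(1.0, 0.0, zoom=2.0)        # -> (2.0, 0.0)
# A 90-degree rotation carries the point onto the other axis.
q = transform_point(1.0, 0.0, angle_deg=90.0)  # -> (~0.0, 1.0)
```

Applying the same transform to every point of an image realizes the zoom, rotation, and view-angle changes named above.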
[0120] The visual unit 51 (or control module 40) may control the
temporal or spatial characteristics or distributions of such images
in various modes. For example, the visual unit 51 (or control
module 40) may display the still image, a series of still images, a
video clip or a mixture thereof each obtained by the above means or
sources in a preset order, randomly or based on at least one of
various features of the task, users, exercises, operations, and the
like. In addition, the visual unit 51 (or control module 40) may
also perform the control for a preset period of time or a preset
duration, where examples of such periods may be a period selected
by the user, a period determined based on such features, and the
like, and where examples of the durations may be a duration for the
user to finish a preset portion of the exercises (or task), a
duration till the images proceed to a preset stage, and so on. The
visual unit 51 may display such images by acquiring the object and
background simultaneously, by acquiring the object and background
independently and superposing one over the other, by acquiring one
of the object and background followed by synthesizing the other and
composing the object and background, and the like. The visual unit
51 (or control module 40) may also repeat a preset portion of the
images, display the images without any repetition, or display such
images in a preset order or randomly.
[0121] As set forth above, the output modules 50 may include one or
multiple visual units 51 for each exercise module 20. All visual
units 51 may define the same shapes or sizes and display the same
or different images in the same or different view angles, at least
two of the units 51 may define different shapes or sizes and
display the same or different images in the same or different view
angles, and the like, where at least one of such units 51 may
provide multiple view angles to the user who may view different
portions of the images in each view angle such as, e.g., a
panoramic view of such images.
[0122] Each visual unit 51 may be constructed from any prior art
display device capable of defining at least one image domain, where
the domain may define a single portion or multiple portions related
to or independent of such images. The visual unit 51 may include at
least one driver or storage to store a single or multiple images
or retrieve one or multiple images from various prior art storage
media such as, e.g., electric, magnetic or optical tapes or disks,
semiconductor or other equivalent memory chips, and the like. Thus,
examples of such display devices may include, but not be limited
to, a cathode-ray tube (or CRT), a liquid crystal display (or LCD),
a display device with an organic light emitting diode (or OLED), an
inorganic light emitting diode (or IOLED), a plasma display panel
(or PDP), a beam projector with a screen, and the like. When the
visual unit 51 defines multiple image domains or instead defines a
single image domain with multiple portions, the domains or portions
may display the same or different images, where such images may be
independent of each other or may cooperate with each other to form
a bigger and wider coherent image. Other configurations,
arrangements or dispositions of the visual unit 51 may be similar
or identical to those of the output module 50 as far as the visual
unit 51 may display the images in multiple view angles of the user
and within a viewable distance therefrom.
[0123] A primary function of the auditory unit 52 is to provide the
sounds (i.e., auditory features) for the task, where the sounds may
be generated thereby, stored and retrieved therefrom, supplied from
or transmitted by the control module 40 through wire or wirelessly,
transmitted by wire or wirelessly from various external sources
which are generally similar to those for the visual unit 51 such
as, e.g., an external sound-supplying device, user(s), other
persons, and so on. In general, the auditory unit 52 may embody the
task by playing such sounds of various types such as, e.g., voices
of (or sounds generated by) the user or others, sounds of (or
generated by) an animal or a plant, musical sounds, sounds from (or
generated by) a nonliving object, sounds related with (or
connoting) preset events or geographic regions, and/or synthetic
sounds not related to any of the above, where the sounds may be
played in a mono or stereo mode. Depending on their types, such
sounds may carry text contents as is the case of the conversation,
carry a melody or other musical contents as is the case of music, or
may not include any of the contents as is the case of instrumental
music.
[0124] The auditory unit 52 is to play such sounds corresponding to
the auditory feature of the virtual environment for the task. The
sounds may be provided in a mono, stereo or surround mode, and may
include various features such as, e.g., a melody, tune, tempo, and
optional verse, where the features may reflect or represent a
living organism (e.g., a person, animal, or plant), a nonliving
object, or their mixture. When the task is the game, the sounds may
be synthesized or, at best, animated for at least partly but not
entirely resembling real sounds of (or from) the organism or object
or, alternatively, may be synthesized with or without animating the
users. Therefore, the simulated user may correspond to an animated
or simulated version of at least one feature of the sounds. When
the task is the story or scenery, however, each of the features of
the sounds may preferably describe actual sounds of the living
organism, nonliving object or their mixture in a real world. The
scenery or story may not include the simulated user, although the
task may define at least one manipulatable feature. In either case,
the features of the sounds may represent those of the objects or
backgrounds of the images.
[0125] The sounds for the task of the story, scenery or video (or
computer) game also define various features which are referred to
as "features" of the task or, simply, "task features" as set forth
herein. It is to be understood that at least one of the task
features may also be preferably manipulated at least partly based
on at least one exercise feature, at least one operation feature,
and/or at least one user feature such that the task features may
be manipulated either directly or indirectly by such exercise
features. In general, the system 10 may arrange any task feature to
be manipulated thereby, although examples of such preferred
manipulatable task features may include, but not be limited to, a
volume or loudness of such sounds, a frequency or frequency
distribution thereof, a balance thereof, temporal or spatial
characteristics, distributions, and/or variations thereof, and the
like.
[0126] Such sounds may be provided or generated by various means.
In one example, each auditory unit 52 (or control module 40) may
store a single or multiple sounds and retrieve one or more sounds
therefrom, generate a single or multiple sounds by superposing one
onto the other or by composing such sounds from one or multiple
auditory elements, receive or download a single or multiple sounds
from the control module 40 or at least one external source such as,
e.g., an internet, wired broadcast or wireless broadcast, another
user, and the like. The auditory unit 52 may acquire a single or
multiple sounds by the sensor unit 42. In all of such examples,
each sound may include at least one feature representing the object
and/or background each of which is preferably animated or
synthesized. As set forth herein, any of such functions of the
auditory unit 52 may be decided at least partly based on the
control signals (i.e., automatically or adaptively), user inputs
(i.e., manually), at least one feature of the user(s), exercises,
and/or operations of the exercise module 20, and the like. In each
example, the sounds may be related to and/or associated with the
event, region, timing, user or another person, and/or object. In
general, the auditory unit 52 (or control module 40) may manipulate
various aspects of the sounds based upon at least one feature of
the task, user(s), exercises, and/or operations such as, e.g., the
source of the sounds, type thereof, contents thereof, and the like,
each of which may in turn be determined by the control signals,
above features, and the like.
[0127] The auditory unit 52 is to play such sounds representing the
auditory feature of the task. The sounds may include various
features each describing the object, background, optional simulated
user, optional manipulatable feature, and the like. To provide the
sounds with desired auditory features, the auditory unit 52 (or
control module 40) may preferably provide the sounds in a preset
viewpoint of the user(s). To this end, the auditory unit 52 (or
control module 40) may control various features of such sounds,
where examples of the features may include, but are not limited to, a
volume or amplitudes of the sounds, a tone or frequency
distribution thereof, contents carried by the sounds, a direction
thereof, a balance thereof, temporal and/or spatial
characteristics, distributions, and/or variations of the above
features, and so on. It is appreciated that the sounds may be
generated along a preset direction or in a preset distance with
respect to the user, or that such sounds may simulate the direction
or distance. When desirable, the auditory unit 52 may control
various features of the sounds, change the direction of and/or
distance to such sounds, and the like.
[0128] The auditory unit 52 (or control module 40) may control the
temporal or spatial characteristics or distributions of the sounds
in various modes. For example, the auditory unit 52 may play the
same sounds, a series of different sounds, or a mixture thereof, each
obtained by such means or sources in a preset order, randomly or
based on at least one of such features. In addition, the auditory
unit 52 may perform the control for a preset period of time (or
duration), where examples of the preset periods are similar to
those of the visual unit 51. The auditory unit 52 may play the
sounds by acquiring different sounds simultaneously, acquiring
different sounds independently and superposing one over another,
acquiring one of the sounds, synthesizing the other, and composing
mixed sounds therefrom, and the like. It is appreciated that the
system 10 may also be arranged to provide the user with the task in
the images alone, and/or in such sounds along with the images,
where the sounds may also be provided independently of the images
or synchronized therewith.
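The auditory feature control described in the preceding paragraphs may be illustrated with a minimal sketch. The function name and the particular mappings used here (distance-based volume, angle-based left/right balance, exercise-speed-based playback tempo) are illustrative assumptions only, not features recited by this disclosure.

```python
# Illustrative sketch: deriving sound playback features from task and
# exercise features. All names and formulas are assumptions for illustration.

def sound_features(exercise_speed, sim_distance, sim_angle_deg):
    """Map exercise/task features to playback features of the auditory unit.

    - volume falls off with the simulated distance to the sound source
    - left/right balance follows the simulated direction of the source
    - tempo (playback rate) tracks the user's exercise speed
    """
    volume = max(0.0, min(1.0, 1.0 / (1.0 + sim_distance)))  # closer -> louder
    balance = max(-1.0, min(1.0, sim_angle_deg / 90.0))      # -1 left .. +1 right
    rate = 0.5 + exercise_speed / 10.0                       # faster exercise -> faster tempo
    return {"volume": volume, "balance": balance, "rate": rate}
```

In such a sketch, the control module 40 would recompute these features as the sensed exercise features change, so that the sounds are provided in the preset viewpoint of the user.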
[0129] Each auditory unit 52 may also be fabricated from any
conventional device equipped with speakers and, when necessary, at
least one driver
and at least one storage capable of storing therein such sounds
and/or retrieving such sounds from various prior art storage media
such as electric, magnetic, and/or optical tapes or disks,
semiconductor or other equivalent memory chips, and the like. In
general, the auditory unit 52 may include any prior art speakers
examples of which may include, but not be limited to, cone-drive
speakers, piezoelectric speakers, electrostatic speakers, and any
equivalents of the speakers. The auditory unit 52 may also include
a single speaker or multiple speakers which generate the same or
different sounds therefrom. The auditory unit 52 may propagate such
sounds to a preset area of the user(s), where such areas may be
selected similar to those of the visual unit 51. In addition, such
an auditory unit 52 may be provided in the dispositions similar to
those of the visual unit 51, except that the auditory unit 52 is
to be placed closer to ears (than eyes) of the user(s). Other
configurations, arrangements, and/or dispositions of the auditory
unit 52 are similar or identical to those of the overall output
module 50 as long as the auditory unit 52 may generate the sounds
and deliver such sounds to the exercising user(s).
[0130] A main function of the olfactory unit 53 is to provide the
olfactory feature (i.e., smells) of the virtual environment for the
task of the story, scenery or game, where such smells may be
generated thereby, stored and retrieved therefrom, and/or delivered
from an external source such as, e.g., an external device for
generating such smells, users, another system or other persons. In
general, the olfactory unit 53 (or control module 40) preferably
embodies the task by giving off smells provided (or generated) by
various means. For example, the olfactory unit 53 may store a
single strand or multiple strands of smells therein and retrieve
one or more smells therefrom, may generate a single or multiple
smells by mixing or reacting at least one chemical substance with
another substance, may receive a single or multiple smells from a
storage or at least one external source connected thereto by
various tubing. The olfactory unit 53 may acquire a single or
multiple smells using the sensor unit 42. In all of these examples,
the smells may be real ones (i.e., those existing in nature)
for the task of the story or scenery, synthesized (i.e., those not
existing in nature) for the task of the game, and/or mixtures
thereof. The real smells may be obtained from a living organism
(such as a person, an animal, a plant, and the like) or nonliving
object and correspond to the smells of the objects or backgrounds
displayed on the images, user's smells, the synthesized smells for
the virtual environment, and the like. In each of the examples,
such smells may be related or associated with a preset event,
geographic location, timing, and the like. In general, the
olfactory unit 53 may be arranged to manipulate various aspects of
the smells depending on various factors such as, e.g., the source
of the smells, their types, contents thereof, and the like, each of
which is decided by various features such as, e.g., user inputs,
sensed or command signals, variables and/or parameters sensed by
the sensor unit 42, conditions of the user and/or exercise module
20, or features of the exercise module 20 and/or task.
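As a purely illustrative sketch of the smell retrieval and mixing described above, a hypothetical olfactory unit might map a task event to a mixture of stored scent "strands." The cartridge names, event names, and mixing ratios below are invented for illustration and do not appear in this disclosure.

```python
# Illustrative sketch: selecting and mixing stored smell strands for a
# task event. All names and ratios are hypothetical.

SCENT_CARTRIDGES = {"pine": 0, "ocean": 1, "grass": 2}  # cartridge indexes

EVENT_SCENTS = {
    "forest_trail": {"pine": 0.7, "grass": 0.3},  # story/scenery: real-world smells
    "alien_cave":   {"ocean": 0.2, "pine": 0.1},  # game: synthesized mixture
}

def mix_for_event(event):
    """Return (cartridge_index, fraction) pairs forming the smell mix."""
    recipe = EVENT_SCENTS.get(event, {})
    return [(SCENT_CARTRIDGES[name], frac) for name, frac in recipe.items()]
```

An event with no entry yields an empty mix, i.e., no smell is released for that portion of the task.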
[0131] Such smells may be typically provided in a mono, stereo or
surround mode and include various features such as, e.g., a type of
the chemical substance, an intensity thereof, its direction,
temporal or spatial characteristics, distributions, and/or
variations thereof, and the like, where such features may reflect
or represent a living organism (e.g., a person, an animal, a plant,
and the like), a nonliving object or a mixture thereof, and the
like. When the task is the game, the smells are preferably
synthesized smells or, at best, animated smells at least partly
but not entirely resembling the real smells of (or from) the
organism or object.
[0132] The smells for the task of the story, scenery or video (or
computer) game also define various features which are referred to
as "features" of the task or, simply, "task features" as set forth
herein. It is to be understood that at least one of the task
features may also be preferably manipulated at least partly based
on at least one exercise feature, at least one operation feature,
and/or at least one user feature such that the task features may be
manipulated either directly or indirectly by such exercise
features. In general, the system 10 may arrange any task feature to
be manipulated thereby, where examples of the manipulatable task
features coincide with those enumerated in the above paragraph.
[0133] A major function of the tactile unit 54 is to provide such
sensations (i.e., the tactile feature) of the virtual environment,
where such sensations may then be generated thereby, retrieved
therefrom, generated by the control module 40, retrieved from the
control module 40, transmitted by the control module 40,
transmitted by an external source such as, e.g., an external
device, user, another system, other persons, and the like. In
general, the tactile unit 54 may preferably embody the tactile
feature of the virtual environment by providing various mechanical,
thermal, electrical, and/or optical sensations to the user(s) who
is to sense such sensations by his or her skin of a head, a neck, a
hand, an arm, a shoulder, an upper torso, a lower torso, a thigh, a
leg, a foot, and the like. The tactile unit 54 may also provide the
user(s) with the sensations of various features, e.g., by
generating a single sensation of a constant amplitude or
time-varying amplitudes, a temporal series of such sensations, a
spatial series thereof, and a mixture thereof. Such sensations may
be provided and/or generated by various means as well. For example,
the tactile unit 54 (or control module 40) may store mechanical,
thermal, optical or electrical energy therein and utilize at least
a portion of the energy for providing a single or multiple
sensations, may receive at least a portion of such energy from the
storage unit 43 or from at least one external source of the energy
through a wire or wirelessly such as, e.g., an external energy
source, a wireless or wired transmission of such energy, user, or
user of another device which may not be the exercise module 20. The
tactile unit 54 may also preferably generate such sensations
related to or associated with a preset object, a preset background,
and the like.
[0134] As described above, the tactile unit 54 is to provide such
sensations which correspond to the tactile feature of the virtual
environment for the task of the story, scenery or video or computer
game. Such sensations may be provided in a mono, stereo or surround
mode, and include various features such as, e.g., a type of the
sensations, an intensity thereof, a body part onto which such
sensations are applied, temporal and/or spatial characteristics,
distributions, and/or variations thereof, and so on, where such
features may reflect or represent a living organism (e.g., a
person, an animal, a plant, and the like), a nonliving object or
their mixture. When the task is the game, such sensations are
preferably synthesized or, at best, animated sensations
at least partly but not entirely resembling real sensations of
(or from) the organism or object. Alternatively, the features of
the sensations may be synthesized using a program with or without
animating the user. When such a task is the story or scenery, each
of such features of the sensations preferably describes actual
sensations of (or from) the living organism, nonliving object or
their mixture in the real world. In either case, such features of
the sensations may represent those represented by the objects or
backgrounds of the images.
[0135] The sensations for the task of the story, scenery or video
(or computer) game define various features which are referred to as
"features" of the task or, simply, "task features" as set forth
herein. It is to be understood that at least one of the task
features may also be preferably manipulated at least partly based
on at least one exercise feature, at least one operation feature,
and/or at least one user feature so that the task features may be
manipulated directly or indirectly by the exercise features. In
general, the system 10 may arrange any task feature to be
manipulated thereby, where examples of such manipulatable task
features coincide with those of the above paragraph.
[0136] To provide such sensations of the virtual environment to the
exercising user, the tactile unit 54 (or control module 40)
preferably provides desired sensations in the preset viewpoint of
the user. To this end, the tactile unit 54 manipulates various
factors of such sensations, where examples of such factors may
include, but not be limited to, a type of such sensations, an
intensity thereof, a body part of the user to which the sensations
are delivered, temporal or spatial characteristics, distributions,
or variations of such factors, and the like. It is appreciated that
such sensations may be provided in a preset angle of application or
direction with respect to the user or, alternatively, the
sensations may be provided in order to simulate the angle or
direction thereof. When desirable, the tactile unit 54 may vary the
angle or direction of the application, change the type of the
sensations, apply the sensations to different body parts of the
user, and the like.
[0137] The tactile unit 54 (or control module 40) may control such
temporal or spatial characteristics or distributions of the
sensations in various modes. For example, the tactile unit 54 may
generate the sensations with various mechanical, thermal,
electrical, and/or optical properties with or without using at
least one applicator, where examples of such mechanical properties
may include, but not limited to, amplitudes of such mechanical
sensations, an area of the body part applied with such sensations,
a configuration and/or a number of such applicators, an arrangement
of such an applicator, a hardness or softness thereof, elasticity
or rigidness thereof, a surface structure thereof, where examples
of such thermal properties may include, but not be limited to,
temperature of the sensations, temperature of the applicator, an
area of the body part applied with the sensations, thermal
conductivity of the applicator, a number of the applicators, an
arrangement thereof, and so on, where examples of such electrical
properties may include, but not be limited to, amplitudes of
electric current or voltage for the sensations, an area of the body
part applied with such sensations, electrical conductivity or
resistivity of the applicator, a number of the applicators, and/or
an arrangement of such applicators, and where examples of the
optical properties may include, but not limited to, amplitudes of
the sensations, optical characteristics of lights causing such
sensations, an area applied with such sensations, a number of the
applicators, an arrangement of the applicators, and the like. It is
appreciated that the tactile unit 54 may be arranged to control
various temporal and/or spatial characteristics of such sensations,
where examples of such characteristics may include, but not limited
to, a duration of the sensations, a total number thereof, an interval
therebetween, a frequency thereof, a sequence thereof, and the
like. The tactile unit 54 may provide such sensations of various
properties to various body parts of the user in various modes. For
example, the tactile unit 54 may be arranged to contact its
applicator with the body part of the user and to directly deliver
such sensations thereto. It is appreciated that the applicator of
such a tactile unit 54 may be deemed as a part of the tactile unit
54 or, alternatively, may be deemed as a part of the exercise
and/or control modules 20, 40. Therefore, this application mode may
be deemed to be a direct mode in the former case, whereas this mode
may be deemed as the indirect mode in the latter case. In another
example, the tactile unit 54 may also be arranged to induce such
sensations without necessarily including the applicator or without
contacting its applicator with the desired body part of the user.
To this end, the tactile unit 54 may generate forced convection
such as a stream of wind of desired temperature (e.g., an ambient
air, heated air, cooled air, and the like), irradiate
electromagnetic waves such as infrared rays, and the like. The
tactile unit 54 may generate or induce the sensations each of which
may be obtained by the above means or sources in a preset order,
randomly or based on at least one of such factors. In addition, the
tactile unit 54 may perform such control for a preset period of
time or a preset duration, where examples of such preset periods
may be a period selected by the user, a period determined based on
such factors, and the like, where examples of such preset durations
may be a duration until the user(s) may finish the exercise, a
duration until the tactile feature of the environment reaches a
preset feature, and the like.
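The temporal control described above (a duration of the sensations, an interval therebetween, and a total number thereof) may be sketched as a pulse timeline. The timeline representation and the use of millisecond units are assumptions for illustration only.

```python
# Illustrative sketch: temporal control of tactile sensations as a
# sequence of pulses. Units (milliseconds) and names are assumptions.

def pulse_schedule(amplitude, duration_ms, interval_ms, count):
    """Build a timeline of tactile pulses as (start, end, amplitude) tuples."""
    timeline = []
    t = 0
    for _ in range(count):
        timeline.append((t, t + duration_ms, amplitude))
        t += duration_ms + interval_ms  # wait out the interval before the next pulse
    return timeline
```

A tactile unit 54 following such a schedule could drive a mechanical applicator, a heating element, or an electrode with the same timing parameters.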
[0138] It is to be understood that the tactile unit 54 may be
provided to operatively couple with a single exercise module 20A or
20B and to generate the tactile feature of a virtual environment to
the user. Alternatively, the system 10 may include multiple tactile
units 54 at least one of which may serve as a "master" tactile unit
54 controlling the rest thereof. Similarly, multiple systems 10 may
include at least one tactile unit 54 which serves as the "master"
tactile unit 54 which controls the rest thereof.
[0139] The tactile unit 54 may also be constructed from any
conventional mechanical devices capable of generating and/or
delivering the mechanical sensations, heating and/or cooling
devices capable of generating and/or delivering the thermal
sensations, electrical devices for generating and/or delivering the
electrical sensations, optical devices for generating and/or
delivering such optical sensations, and so on. As described above,
the tactile unit 54 preferably includes at least one applicator for
providing mechanical and/or electrical sensations. In addition,
such a unit 54 may include a conductive wire or an electrode for
delivering the electrical energy to the preset body part. For such
thermal sensations, the tactile unit 54 may include at least one
heating element which may deliver the thermal sensations to the
user either through a direct contact, an indirect contact or
through irradiating such electromagnetic waves thereto. The tactile
unit 54 may be arranged to deliver such sensations across a preset
area which may correspond to a preset portion of the user, where
such portions may be an area around the eyes of the user, an area
encompassing an upper torso of the user, an area with a height
similar to that of the user, an area capable of receiving and
recognizing the mechanical, electrical, thermal or optical
sensations. The tactile unit 54 may also be provided to be disposed
away from the user, to be worn by the user over and/or around at
least one of his or her eyes, to be worn by the user over or around
his or her head, to be carried by or coupling with the user, and
the like. Such a tactile unit 54 may also be incorporated into or
provided as glasses, goggles, helmets, and the like. When
desirable, at least a portion of the tactile unit 54 may be
incorporated to the exercise module 20 or, alternatively, at least
a portion of the exercise module 20, or an external device which
may deliver such sensations may be recruited as the tactile unit 54
of the exercise system 10. Other configurational or operational
characteristics of the tactile unit 54 may be similar or identical
to those of the overall output module 50 as long as the tactile
unit 54 may generate the sensations and deliver such to the
exercising user.
[0140] A major function of the display unit 55 is to provide the
exercising user(s) with various system and/or operation variables
and/or parameters visually or audibly. To this end, the display
unit 55 may include any prior art audiovisual display elements such
as, e.g., display panels, speakers, and the like. The display unit
55 may be disposed into the control module 40 or at least a portion
of the unit 55 may be disposed into the exercise and/or output
modules 20, 50. Alternatively, various display devices of the
exercise module 20 may also be recruited as the display unit 55 of
the system 10 as well. Other configurations, arrangements, and/or
dispositions of the display unit 55 are similar or identical to
those of the overall output module 50 as long as the display unit
55 may provide the above variables and/or parameters to the
exercising user(s) during such exercise(s).
[0141] Various units 51-55 of the output module 50 may operatively
couple with each other in various modes. For example, each unit
51-55 may operatively couple with the rest thereof so that each
unit 51-55 may receive or transmit various information as
electrical or optical signals through wire or wirelessly. In an
opposite example, at least one of the units 51-55 may couple with
not all of the rest of the units 51-55 so that, e.g., the visual
unit 51 may couple with the auditory unit 52 but not with the
display unit 55, and the like. Accordingly, the detailed coupling
modes of such units 51-55 depend not only upon an overall
configuration of the output module 50 but also upon assigned
functions of each unit 51-55. As described above, the output module
50 includes the visual unit 51 but does not necessarily include
other units 52-55. By the same token, the output module 50 may include
multiple units of the same type, e.g., by including multiple visual
units 51 for displaying different and/or overlapping images, or two
or more auditory units 52 for generating different or overlapping
sounds. It is appreciated that at least one unit 51-55 of the
output module 50 may be incorporated into such exercise modules
20A, 20B and/or control module 40 and perform the same or similar
functions as described herein. In such an embodiment, that unit may
be deemed as a part of the output module 50 or may be deemed to
form a part of the exercise or control module 20A, 20B, 40,
depending upon detailed definition thereof.
[0142] The control module 40 generally includes at least one input
unit 41, at least one control unit 44, at least one optional sensor
unit 42 and at least one optional storage unit 43, and is arranged
to control various operations of the system 10 as a whole, i.e., to
generate and manipulate various features of the task of the story,
scenery or game in the images, to relate at least one task feature
to at least one feature of the users, exercises or operation of the
exercise modules 20, to relate at least one feature of the
operation (or exercises) with at least one feature of the task,
users, exercises or operation, to manipulate at least one feature
of the task at least partly based on at least one feature of such
users, exercises, and/or operation, to manipulate at least one
operation feature at least partly based upon at least one feature
of the task, users or exercises, and the like. The control module
40 may manipulate various operations of the exercise or output
modules 20, 50. The control module 40 may operatively couple with
an external audiovisual device or other external devices including
the display screens or speakers and supplement (or replace) the
visual, auditory, olfactory, tactile or display units 51-55 of the
output module 50, may operatively couple with an external device so
as to provide the task and to replace (or supplement) at least a
portion of the control module 40, may be operatively coupled to an
external device including the processor and replace or supplement
at least a portion of itself, and the like. To these ends, the
units 41-44 of the control module 40 may perform various functions
such as, e.g., generating the task in the images (or optional
sounds), receiving the user inputs from the users in order to
convert such to the command signals, monitoring variables and/or
parameters and generating the sensed signals based thereupon,
generating the control signals based on the command or sensed
signals or independently thereof, generating the simulated user(s),
manipulating at least one feature of the task by manipulating at
least one feature of the images or simulated user(s), manipulating
at least one feature of the images (or sounds) displayed (or
played) by the output module 50, manipulating at least one
operation of at least one of such exercise modules 20, and the
like. The control module 40 may operate in various modes as will be
provided below and may include one or more of at least one of its
units 41-44. It is appreciated that the control module 40 may be
arranged to allow communication between at least two exercise
modules 20 either directly or therethrough via a local or global
network, where the exercise modules 20 are disposed in different
locations such that the user(s) may not access both modules without
leaving one location and entering the other.
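The signal flow just described, in which user inputs become command signals, sensed variables become sensed signals, and both drive control signals for the exercise and output modules, may be sketched as follows. The names, the linear load mapping, and the avatar-speed coupling are illustrative assumptions.

```python
# Illustrative sketch of the control module 40 signal flow. All names
# and mappings are hypothetical.

def control_signals(user_input, sensed_speed):
    """Convert user inputs and sensed variables into control signals."""
    command = {"resistance": user_input.get("harder", 0)}  # command signal
    sensed = {"speed": sensed_speed}                       # sensed signal
    # Control signals: raise the exercise-module load per the commanded
    # resistance, and drive the simulated user's on-screen speed from
    # the sensed exercise speed.
    return {
        "exercise_module": {"load": 1.0 + 0.1 * command["resistance"]},
        "output_module": {"avatar_speed": sensed["speed"]},
    }
```

In this sketch the task feature (the simulated user's speed) is manipulated indirectly by the exercise feature, while the operation feature (the load) is manipulated by the user input, mirroring the couplings recited above.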
[0143] In order to manipulate their operations, the control module
40 may operatively couple with such exercise and output modules
20A, 20B, 50. As shown in the figure, such a control module 40 may
be provided as a module of the exercise system 10. Alternatively,
the control module 40 may be provided as an add-on module and
couple with the system 10 which then consists of such exercise and
output modules 20A, 20B, 50. In another alternative, the system 10
may include none or only some units of the control module 40 but
operatively couple with an external device equipped with at least
one processor in order to recruit and use pre-existing
task-providing and/or task-processing capability of the device
examples of which may include, but not be limited to, a laptop
computer, a PDA, a data organizer, an external story and/or scenery
generator, an external game console, or an external audiovisual,
visual or communication device including such a processor, and so
on. In this embodiment, the system 10 may include a preset program
for borrowing such task-generating or processing capability of the
external devices or include an interface to operatively couple
therewith. Alternatively, the external device may be equipped with
a preset program which may perform at least one function of the
control module 40 as set forth herein. In addition to various
functions described hereinabove, the control module 40 may
incorporate therein at least a portion of the exercise and/or
output modules 20A, 20B, 50, may perform at least one function of
the exercise and/or output modules 20A, 20B, 50, may be recruited
as at least a portion of such modules 20A, 20B, 50, and the
like.
[0144] The exercise system 10 of this invention may include a
single control module 40 including one or multiple input, sensor,
storage, and control units 41-44 or, alternatively, may include
multiple control modules 40 each including one or multiple of the
above units 41-44 each of which may have similar or different
configurations and each of which may perform similar or different
functions. The system 10 may have a single control module 40 for a
single or multiple output modules 50, where such a control module
40 may operate as the common module in the latter case. The system
10 may instead include multiple control modules 40 for a single or
multiple output modules 50, where such control modules 40 may
correspond to individual control modules 40 for each output module
50 and where at least one of the control modules 40 may serve as a
master control module 40. Similarly, the system 10 may include a
single control module 40 for multiple exercise modules 20, where
the control module 40 may operate as the common module. The system
10 may instead include multiple control modules 40 for multiple
exercise modules 20, where the control modules 40 may be
individual modules for each exercise module 20,
and at least one of the control modules 40 may serve as a master
module. When the system 10 includes multiple control modules 40,
their number may be more or less than a number of the exercise
modules 20 or output modules 50. Regardless of its number, such a
control module 40 may allow the user to manipulate the exercise
and/or output modules 20, 50 manually, may manipulate such modules
20, 50 automatically or adaptively at least partly based upon at
least one feature of the task, users, exercises, operation of the
exercise modules 20, and the like. It is to be understood that the
exercise modules 20 are to be disposed in different locations and,
accordingly, that such a control module 40 may have to communicate
with at least one of such exercise modules 20 through the global
network when the module 40 is disposed in the location of one
exercise module 20. When the control module 40 is disposed in a
third location, such a module 40 communicates with both exercise
modules 20 via the global network encompassing those locations of
the exercise modules 20 and itself 40.
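The master/common arrangement described in this paragraph may be sketched as a master controller fanning a control signal out to individual per-module controllers, e.g., across the global network when the exercise modules 20 sit in different locations. The class names and the fan-out scheme are illustrative assumptions.

```python
# Illustrative sketch: a "master" control module relaying control
# signals to individual control modules. Names are hypothetical.

class Controller:
    """An individual control module serving one exercise module."""
    def __init__(self, module_id):
        self.module_id = module_id
        self.last_signal = None

    def apply(self, signal):
        self.last_signal = signal  # in practice: drive the exercise module

class MasterController(Controller):
    """A master control module that also controls the rest thereof."""
    def __init__(self, module_id, slaves):
        super().__init__(module_id)
        self.slaves = slaves

    def apply(self, signal):
        super().apply(signal)
        for s in self.slaves:  # fan out to each individual controller
            s.apply(signal)
```

The same pattern applies whether the controllers serve exercise modules 20 or output modules 50, and whether the fan-out is wired, wireless, or routed through the network.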
[0145] It is appreciated that such a control module 40 (or at least
one of its units 41-44) may perform at least one function of the
output module 50 or that at least a portion of the output module 50
may be included in the control module 40. For example, the control
module 40 may be arranged to display the images and/or play the
sounds. In general, the control module 40 is arranged to generate
the control signals supplied to other modules 20, 50. The control
module 40 may transmit the user inputs supplied directly thereto to
other modules 20, 50. The control module 40 may also perform at
least one function of the exercise modules 20 or at least a portion
of the exercise modules 20 may be incorporated in the control
module 40. For example, the control module 40 may directly provide
or manipulate the load of the actuating part of such modules 20.
Conversely, at least a portion of the control module 40 may be
disposed in the exercise or output module 20, 50 while performing
its intended function. For example, at least a portion of the
control unit 44 may be disposed in such exercise modules 20 for
manipulating their operations, may be disposed in the output module
50 for manipulating various visual elements of such images for the
task, for manipulating various auditory elements of the sounds, and
the like.
[0146] The control module 40 (or at least one unit thereof) may
fixedly or releasably couple with both of the exercise and output
modules 20, 50. The control module 40 may be disposed away from
such modules 20, 50, while being supported by a structure in which
the system 10 is installed as well. The control module 40 (or at
least one of its units 41-44) may be incorporated in the portable
or wearable articles. Other configurations, arrangements, and/or
dispositions of the control module 40 depend on a specific type of
the task in the images or sounds, where its disposition and/or
arrangement may also depend upon whether the task is visual or
audiovisual.
[0147] The control module 40 may include various
communication-related units such as, e.g., at least one audio in
unit for acquiring such sounds, at least one audio out unit for
generating signals carrying therealong the sounds, at least one
video in unit for acquiring such images, at least one video out
unit for generating signals carrying therealong such images, at
least one receiving unit for receiving such signals, at least one
sending unit for transmitting such signals, and the like, where
such receiving and sending units may respectively receive and
transmit such signals through wire or wirelessly through a global
network which covers different locations in which the exercise
modules 20 are disposed and where such reception or transmission
may be unilateral or bilateral. Such communication-related units
may be provided as separate units or, in the alternative, may be
incorporated into one or more of such units 41-44. In particular,
the control module 40 may provide real-time communication of such
signals between the exercise modules 20 directly or indirectly, may
instead provide real-time communication between the exercise and
output modules 20, 50 directly or indirectly, or both, thereby
providing a transfer of such signals between the modules 20, 50,
where such
signals preferably carry therein at least one feature of the task,
exercises, users, operation of the exercise modules 20, and the
like. As will be described below, the control module 40 may also be
arranged to transfer such signals without altering the task feature
(i.e., a simple transfer) or, in the alternative, to transfer such
signals by altering or converting the task feature based upon at
least one preset relation (i.e., an equivalent conversion).
Accordingly, the control module 40 may perform such transfer
sequentially or simultaneously, based on a mode of the users
performing the exercises on, with or against the exercise modules
20. When desirable, the control module 40 may operatively couple
with external communication devices, where examples of such
devices may include, but not be limited to, a wired or wireless
telephone, a wireless portable or mobile phone, a beeper, a
walkie-talkie, and other prior art communication devices.
[0148] A major function of the input unit 41 is to receive user
inputs which are supplied by the user(s) and related or associated
with desired features of the task (i.e., the story, scenery or
game), user(s), exercise(s), and/or operation of the exercise
modules 20. Based upon its operating mechanisms, the input unit 41
may receive the user inputs with or without contacting
the user(s). The input unit 41 then converts the user inputs into
the (electric or optical) command signals. Any prior art input
device may be used as the input unit 41. Accordingly, the input
unit 41 may receive the user inputs by sensing mechanical,
electrical, optical, magnetic or electromagnetic input signals
supplied thereto by movements of various body parts of the user(s),
compression thereby, or contact therewith, where examples of such
body parts may include, but not limited to, fingers, hands, wrists,
arms, toes, feet, thighs, legs, shoulders, neck, head, eyes, back,
belly, sides, and the like. To this end, the input unit 41 may be
fabricated similar to various prior art input devices examples of
which may include, but not be limited to, a key, key pad, array of
the keys, button or array of buttons, switch or array of switches,
touch screen, mouse, track pad, track ball, track stick, joystick,
and the like. It is appreciated that the input unit 41 may define
any of such configurations or modifications thereof depending upon
types of the user inputs, body parts of the user(s) contacting the
input unit 41 for applying the user inputs, and the like, that the
input unit 41 may move or deform in response to the user inputs, or
that the input unit 41 may not move or deform in response thereto.
The input unit 41 may receive the user inputs without mechanically
contacting any body part of the user(s). To this end, such an input
unit 41 may generate therearound electric or magnetic fields and
receive the user inputs by monitoring perturbation of such fields
which is caused by the body part of the user(s) disposed in its
vicinity but not contacting such.
[0149] The input unit 41 may be incorporated in various positions
around the system 10. For example, the input unit 41 may be
provided physically separate from the exercise and output modules
20, 50 or, alternatively, at least a portion of the input unit 41
may be disposed on or in such modules 20, 50. The input unit 41 may
operatively couple to other units and/or modules of the system 10
wirelessly or by wire, depending on its disposition and configuration.
At least a portion of the input unit 41 may be included into the
portable or wearable articles so that the users may perform the
exercises while providing the user inputs without disengaging
himself or herself from the exercises. At least a portion of the
input unit 41 may be worn around other body parts and allow the
user(s) to provide the user inputs without using his or her hand
and/or stopping such exercises.
[0150] It is appreciated that the system 10 may include a single
input unit 41 capable of receiving such user inputs for the
exercise modules 20 or, alternatively, may have multiple input
units 41 at least one of which operates as a "master" input unit 41
controlling the rest thereof. Similarly, the input unit 41 may
receive the user inputs for a single output module 50 or,
alternatively, the system 10 may include multiple input units 41 at
least one of which serves as the master input unit 41. The input
units 41 may be of the same or different type, disposed in the same
or different positions, and/or receive the same or different user
inputs. The input unit 41 may receive the user inputs from one or
more body parts of the user(s) which have to contact the actuating
parts of the exercise modules 20 for such exercises. However, the
input unit 41 may receive such user inputs from other body parts of
the user(s) which may not be necessary for the exercises, which may
not contact such actuating parts of the exercise modules 20, and so
on. Therefore, the input unit 41 may receive the user inputs
through the first body part of the user(s) which is required for
such exercises and may additionally receive the user inputs by the
second body part of the user(s) which is different from such a
first body part and which is not necessarily required to perform
the exercise. This latter embodiment may be particularly useful
when the task requires multiple user inputs for the user(s) to
proceed through the task while performing the exercises. For
example, the input unit 41 receives the primary user inputs from
the feet of the user running on the treadmill-type exercise module
20 and monitors the exercise or user feature therefrom, and
receives the auxiliary user inputs from the hands of the user which
are not related to the running exercise but required for
manipulating various features of the task. Such primary and
auxiliary user inputs may generally be differentiated from each
other based on an amount of energy associated with or consumed by
the related body parts, i.e., whether the body part delivers a
significant amount of the user's energy. Other configurations,
arrangements or dispositions of the input unit 41 are similar or
identical to those of the overall control module 40 as far as the
input unit 41 can receive such user inputs.
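By way of a non-limiting illustration, the energy-based differentiation of primary and auxiliary user inputs described above may be sketched as follows; the threshold value, function names, and units are assumptions of this sketch and are not drawn from the application:

```python
# Illustrative sketch: classify a user input as "primary" (exercise-driving,
# high-energy, e.g. the feet of a running user) or "auxiliary" (task-
# manipulating, low-energy, e.g. a hand pressing a button). The cutoff of
# 50 joules is a hypothetical value chosen only for illustration.
PRIMARY_ENERGY_THRESHOLD = 50.0  # hypothetical cutoff, in joules

def classify_input(body_part: str, energy_joules: float) -> str:
    """Return 'primary' or 'auxiliary' based on the estimated energy
    delivered by the body part producing the input. The body_part name
    is carried only for logging in this sketch."""
    if energy_joules >= PRIMARY_ENERGY_THRESHOLD:
        return "primary"
    return "auxiliary"

# A runner's feet deliver most of the exercise energy; a button press does not.
classify_input("feet", 320.0)   # primary input from the running exercise
classify_input("hand", 2.5)     # auxiliary input manipulating the task
```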
[0151] A primary function of the optional sensor unit 42 is to
monitor various variables or parameters of (or related with)
various operations of the exercise and/or output modules 20, 50,
i.e., at least one feature of the task, exercising users,
exercises, and/or operation of the exercise modules 20, and to
convert the monitored features into the sensed signals. Depending
upon its configuration or operating characteristics, the sensor
unit 42 monitors such features by contacting at least one body part
of the users or without necessarily contacting such. The control
module 40 (or its control unit 44) receives the sensed signal and
converts such into the control signal for various purposes such as
a feedback control of other modules 20, 50, where the sensed
signals are electric or optical.
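As a non-limiting illustration of the feedback control described above, the following sketch converts a sensed signal (here assumed to be a heart rate) into a control signal (here assumed to be a load correction for the exercise module); the target, gain, and names are assumptions of this sketch only:

```python
# Illustrative proportional feedback sketch: the control module receives a
# sensed signal (heart rate, beats per minute) and emits a control signal
# (a load adjustment) nudging the user toward a target heart rate.
# The set point and gain are hypothetical values.
def control_signal(sensed_hr: float, target_hr: float = 130.0,
                   gain: float = 0.1) -> float:
    """Return a proportional load correction: positive raises the
    mechanical load of the exercise module, negative lowers it."""
    error = target_hr - sensed_hr
    return gain * error
```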
[0152] Any conventional sensing device may be used as the sensor
unit 42 of the control module 40. In particular, the sensor unit 42
may monitor various features of physical or physiological
conditions of the user(s), operations, exercises, and/or task.
Accordingly, any prior art sensing device capable of measuring such
conditions and features may be used as the sensor unit 42, where
examples of such conditions and features may include, but not
limited to, a presence or absence of the user(s) (or body part)
with respect to a preset landmark of the system 10 or exercise
modules 20, a distance from the landmark to the user(s) or body
part, a position or posture thereof, a movement (including its
direction or displacement) thereof, temperature thereof, heart rate
or blood pressure thereof, blood O.sub.2 level or sugar
concentration thereof, ECG, EEG, EMG, height, weight, or body fat
content or percentage.
[0153] The sensor unit 42 may be incorporated into various
positions of the exercise system 10. For example, the sensor unit
42 may be formed physically separate from such exercise or output
modules 20, 50. At least a portion of the sensor unit 42 may
instead be disposed in or on at least one of such modules 20, 50.
The sensor unit 42 may operatively couple with other parts of the
system 10 by wire or wirelessly, depending on its disposition or
structure. In addition, at least a portion of the sensor unit 42
may be included into the portable or wearable articles so that the
user(s) may perform the exercise while providing the user inputs
without disengaging himself or herself from such exercises. At
least a portion of the sensor unit 42 may also be worn around other
body parts of the users to allow the unit 42 to monitor various
variables and parameters of other modules 20, 50 and/or conditions
and various features of the user. An exact configuration,
disposition or arrangement of such a unit 42, however, depends upon
such variables or parameters, feature to be monitored or operating
mechanism thereof.
[0154] The system 10 may include multiple sensor units 42
monitoring such variables or parameters of the exercise modules 20
and monitoring various features of multiple users simultaneously
performing the exercises on, with, and/or against the exercise
modules 20. All sensor units 42 may define the same or different
configurations, may monitor the same or different variables
and/or parameters, may be disposed in the same or different
positions, and the like. In addition, at least one of the sensor
units 42 may also serve as a "master" sensor unit 42 manipulating
the rest thereof. Similarly, the sensor units 42 may monitor the
variables and/or parameters for a single output module 50 or, in
the alternative, the system 10 may have multiple sensor units 42 at
least one of which serves as the master sensor unit 42. Further
configurations, arrangements or dispositions of the sensor units 42
are similar or identical to those of the overall control module 40
as far as the sensor units 42 monitor the features, variables,
and/or parameters with or without contacting the user(s).
[0155] A main function of the optional storage unit 43 is to store
information required for generating or providing various features
of the task of the story, scenery, and/or game in such images (or
other features) of the virtual environment or which may be needed
to transfer various features of the task, user(s), exercise(s),
and/or operation(s). Accordingly, the storage unit 43 may store an
algorithm for generating and proceeding along the task, an
algorithm (i.e., relation) for manipulating the features of the
task based on various features of the exercises, and the like.
Depending on the configuration of the storage unit 43, such
information may be stored therein or may be retrieved from any
other units 41, 42, 44 of the control module 40. In order to
facilitate retrieval of desirable information, the storage unit 43
may include a driver for accessing such information and/or capable
of storing such thereinto, where examples of the information may
include various features of the images for the task, those of the
objects, backgrounds, and/or simulated user, various set points
and/or control thresholds for any of such variables, parameters,
features, control programs and/or algorithms, and the like.
[0156] Any conventional storage device may be used as the storage
unit 43 of the control module 40. Thus, the storage unit 43 may be
magnetic tapes or disks, optical disks or semiconductor data
storage devices, in each of which such information may be stored
in an analog or digital mode. As far as the storage unit 43 may
store such information, the storage unit 43 may be formed in
almost any prior art processes and in almost any prior art
configurations. Such a storage unit 43 may be incorporated into
those positions as disclosed in conjunction with the sensor unit
42.
[0157] A major function of the control unit 44 is to perform all of
the aforementioned functions except those of other units 41-43 of
the control module 40, although the control unit 44 may also
perform the functions of those units 41-43 in order to assist or
supplement such units 41-43. Most importantly, the control unit 44
may generate the task in the images (or virtual environment) and
allow multiple users to perform the same or different exercises
while transferring at least one feature of the task, exercises,
users, and/or operation of the exercise modules 20 in a preset
mode.
[0158] The first main function of the control unit 44 is to provide
the task of the story, scenery, and/or video (or computer) game
with primary (or 1.degree.) features. Such 1.degree. task features
typically consist of 1.degree. task types and 1.degree. task extents,
where the task may be provided in such images of a single still
picture with a single or multiple portions, a series of the
pictures, and/or a video clip, or optionally provided in such
auditory, olfactory, and/or tactile features. The 1.degree. task
types include various features such as, e.g., a task goal (e.g.,
viewing or watching the images or other features of the virtual
environment for the task, proceeding along such stages of the
task, attaining a preset objective by competing against a preset
program or another user, and the like), a number of stages or levels required
therefor (i.e., a number of portions formed in the still picture, a
number of still pictures in the series, and/or a number of parts
in the video clip), means to attain the task goal (e.g., performing
the exercise, applying the user inputs, and the like), means to
proceed through the task stages (e.g., performing the exercise,
applying such user inputs, monitoring the variables or parameters
of the modules 20, 50, and monitoring the features of the user(s),
exercise(s), and/or operation(s)), and the like. The 1.degree. task
extents may include various features such as, e.g., a number of
stages defined in the task, a current stage of the user in the
task, a duration of the user in the current stage, a duration of
the task, and the like.
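The 1.degree. task types and extents enumerated above may be illustrated, in a non-limiting manner, by the following data-structure sketch; the field and method names are assumptions of this sketch, as the application prescribes no particular data layout:

```python
# Illustrative sketch of primary (1-degree) task features: a task goal
# (task type) together with task extents (number of stages, the user's
# current stage, and the time spent in that stage).
from dataclasses import dataclass

@dataclass
class Task:
    goal: str                      # 1-degree task type: the goal of the task
    num_stages: int                # 1-degree task extent: stages in the task
    current_stage: int = 0         # 1-degree task extent: user's current stage
    stage_duration_s: float = 0.0  # duration of the user in the current stage

    def advance(self) -> bool:
        """Proceed to the next stage; return True while stages remain,
        False once the final stage has been reached."""
        if self.current_stage + 1 < self.num_stages:
            self.current_stage += 1
            self.stage_duration_s = 0.0
            return True
        return False
```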
[0159] The second main function of the control unit 44 is to
provide the task of the story, scenery or video (or computer) game
in the images (or optionally sounds, smells, sensations) with
secondary (or 2.degree.) features. The 2.degree. task features
typically consist of 2.degree. task types and 2.degree. task
extents, where such a task is provided in such images of a single
still picture including a single or multiple portions, a series of
such pictures, and/or a video clip, optionally provided in such
sounds, and/or optionally provided in such smells and/or
sensations. The 2.degree. features of the task defined in the
images of a still picture with a single or multiple portions may
include, e.g., selecting a preset portion, a direction or a
sequence of viewing a next portion, a speed or a temporal gap
between viewing different portions, a viewing area or an extent of
zoom, a view angle when the images may be rotated, and the like.
Such 2.degree. features of the task defined in the images of a
series of still pictures, and/or video clip may include, e.g.,
selecting a preset picture thereof, a direction or sequence of
viewing a next picture, a speed or a gap between viewing different
pictures, a viewing area or an extent of zoom, a perspective angle
of such images, a view angle when the images may be rotated, and
the like.
[0160] Another main function of the control unit 44 is to relate at
least one feature of one of the task, user, exercise done by the
user, and operation of the exercise module 20 with at least one
feature of the other thereof, whether directly or through at least
one simulated user. The control unit 44 may be arranged to perform
such relating based upon a fixed relation defined between at least
two of such features (i.e., automatically), based on a relation
defined between the features and varied according to at least one
of such features (i.e., automatically and/or adaptively), based on
the command signals (i.e., manually), based on the sensed signals
(i.e., automatically and/or adaptively), and the like.
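By way of non-limiting illustration, the fixed and adaptive relations described above may be sketched as follows, mapping an exercise feature (treadmill belt speed) onto a task feature (speed of the simulated user); all constants and names are assumptions of this sketch:

```python
# Illustrative sketch of relating an exercise feature to a task feature.
# A fixed relation applies a constant mapping (automatic); an adaptive
# relation rescales that mapping according to another monitored feature
# (here a hypothetical 0..1 fatigue level), so the relation varies with
# at least one of such features (automatic and adaptive).
def fixed_relation(treadmill_speed_kmh: float) -> float:
    """Fixed relation: simulated-user speed proportional to belt speed."""
    return 0.5 * treadmill_speed_kmh

def adaptive_relation(treadmill_speed_kmh: float, fatigue: float) -> float:
    """Adaptive relation: the same mapping, boosted as fatigue rises so
    a tiring user may still attain the goal of the task."""
    return fixed_relation(treadmill_speed_kmh) * (1.0 + fatigue)
```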
[0161] Based upon these major functions, the control unit 44
performs numerous other functions. For example, the control unit 44
receives the command and sensed signals respectively from the input
and sensor units 41, 42 and performs various control operations
on such exercise and/or output modules 20, 50 in various control
modes. The control unit 44 generally determines various features of
the task of the story, scenery or video (or computer) game defined
in such images of the virtual environment, selects desirable
images from multiple sets of images stored in the storage unit 43
or supplied thereto from various sources, generates the images by
assembling or composing various features therefor, and the like. In
general, the control unit 44 performs such functions based on
various features which include the user inputs or command signals
derived therefrom, variables and/or parameters monitored by the
sensor unit 42 or sensed signals derived therefrom, and the like.
Therefore, the features also include the physical or physiological
conditions of the user(s) monitored by the sensor unit 42, control
programs or algorithms stored therein or supplied thereto by the
user(s) or external sources, and so on. In addition, such features
may include various variables and/or parameters related or
associated with the exercise modules 20A, 20B, exercises available
therewith, and so on, where examples of such features may include,
but not limited to, a type of the exercises, an extent of exercises
attained by the user(s) during the exercise(s), a movement thereof
for the exercise(s), and the like. It is appreciated that the
extent of exercises may be defined as any of various criteria
examples of which may include, but not limited to, a duration of a
presence or absence of the user(s) on or near the exercise modules
20, a duration of exercises done against or onto the exercise
modules 20, a duration of the exercises performed by the user(s), a
duration of the exercises (or energy) done on the user(s), the
mechanical load presented by and/or set in the exercise modules 20
against the user(s), a product of the load and any of such
durations, a variable which may be represented as a mathematical
function of the load, a number of calories consumed by the user(s),
a work done against or onto the exercise modules 20 by the user(s),
a work done on the user(s) by the exercise modules 20, and the
like, where the load may be deemed as a variable or a parameter
determining an amount of energy consumed by the user(s) in
consummating a unit displacement or a unit deformation of a
specific part of the exercise modules 20 or body part of such
user(s). It is understood that such a load may be quantified by
various means examples of which may include, but not limited to, a
distance in which the user(s) travels with respect to a preset part
of the exercise modules 20, a length along which such a part
travels or deforms, an angle about which the part bends or deforms,
or a weight of such a part which is moved or deformed by the
user(s). To such ends, the exercise modules 20 may manipulate such
a load by various means examples of which may include, but not
limited to, adjusting a speed or an angle of a preset part of the
exercise modules 20, a modulus of the part, its spring constant or
viscosity, its weight or its length.
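Two of the "extent of exercise" criteria enumerated above may be sketched, without limitation, as follows; the units and formulas are illustrative assumptions of this sketch:

```python
# Illustrative sketch of two extent-of-exercise criteria named above:
# (1) the product of the mechanical load and a duration of the exercise,
# and (2) the work done by the user against the exercise module.
def load_duration_product(load_n: float, duration_s: float) -> float:
    """Extent as the product of load (newtons) and duration (seconds)."""
    return load_n * duration_s

def work_done(load_n: float, displacement_m: float) -> float:
    """Work done against the module: load (newtons) times the
    displacement of a preset part of the module (meters), in joules."""
    return load_n * displacement_m
```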
[0162] The control unit 44 may acquire various features of the
virtual environment, may transfer such images or other features
related thereto to other units 41-43 of the control module 40 or,
alternatively, to other modules 20, 50. The control unit 44 may
receive such images (or other features) from other units 41-43 of
the control module 40 or other modules 20, 50 of the system 10. To
this end, various units of the control unit 44 may perform
communication-related functions or, alternatively, the control
module 40 may include various communication-related units, both for
performing transfer of at least one feature between the exercise
modules 20 or between multiple users.
[0163] In one example, the control module 40 may include at least
one video-in unit for acquiring such images (including their visual
features or elements) or, at least a portion of such a control unit
44 may acquire the images. In general, the images are generated by
or retrieved from the video-in unit (or the portion of the control
unit 44), generated by or retrieved from other units 41-43 of the
control module 40, or transmitted thereto by other units 41-43
thereof or by an external source such as the external device or
user(s). The images may be acquired as a still picture, a series of
the still pictures, a video clip or a mixture, where the images may
be in black-and-white or in multiple colors, while each image
includes at least one object or background. The images may be
formed or generated in various ways. For example, the video-in unit
(or a portion of the control unit 44) may store a single or
multiple images and retrieve one or more of such therefrom, may
provide a single or multiple images by superposing the object onto
the background or composing such images from one or multiple visual
elements, and/or may receive or download a single or multiple
images from the storage unit 43 or the external source such as,
e.g., the internet, a wired or wireless broadcast, a user of another
exercise module, a user of another external device, and the like.
The video-in unit may acquire a single or multiple images by itself
or using the sensor unit 42. In all of such examples, each image
may include at least one object or background each respectively
representing an animation of a real or abstract object, an
animation of a real or abstract background, and the like, where the
object may be a living organism (such as a person, an animal, a
plant, and the like) or a nonliving object, while the background
may be the living organism or nonliving object. Accordingly, the
object may correspond to the user(s) recreated by the animation,
simulated object included in the images, controllable object
included in the background, and the like. In each example, contents
of the images (i.e., object and background) may be an animated or
arbitrary object and/or arbitrary scene. In addition, the object
and/or background may be associated or related to a preset event,
geographic location, timing, and the like. In general, the video-in
unit may manipulate various features of the images depending upon
various factors as, e.g., the source of the images, type of the
source, contents of the images, and/or such aspects each determined
by various factors such as, e.g., the user inputs, command or
sensed signals, control signals, variables and/or parameters
monitored by the sensor unit 42, conditions of the user(s), and
operations.
[0164] To provide the images of desired visual features, the
video-in unit may provide desired images in a preset viewpoint of
the user. To this end, the video-in unit may manipulate various
features of the images, where examples of such features may
include, but not limited to, shapes and/or sizes of such images or
their portions, contents or colors thereof, brightness or hues
thereof, sharpness or zoom thereof, contrasts thereof, temporal or
spatial characteristics, distributions, and/or variations of such
aspects, and the like. It is appreciated that the images may be
provided based on a preset view angle of or distance to the user or
that such images may be provided to simulate the view angle or
distance. When desirable, the control unit 44 may zoom in or out
such images, vary the view angle thereof, vary the distance
thereto, rotate such images with respect to a preset base, and the
like.
[0165] To this end, the video-in unit may include any prior art
camera, camcorder, and/or other image recording devices including
charge-coupled devices capable of acquiring such images. The
control unit 44 may include a single camera (or camcorder) or
multiple cameras (or camcorders) for acquiring the same or
different images therewith. The video-in unit may acquire the
images of the body part of the user(s), where the video-in unit may
be disposed to preferentially aim at the body part. Accordingly, the
preset or target area for the video-in unit may be, e.g., an entire
visible area of the user(s), area around the face thereof, area
covering an upper torso thereof or area defining a height similar
to that of the user(s). The video-in unit may be disposed in the
exercise modules 20 or, alternatively, at least a portion of the
exercise modules 20 (or another external audiovisual device) may
also be recruited as the video-in unit. The sensor unit 42 may be
used to acquire the images as well.
[0166] In another example, the control module 40 may have at least
one video-out unit for generating such images (including visual
features or elements), or at least a portion of such a control unit
44 may generate the images. Such images may be generated by and/or
retrieved from the video-out unit (or a portion of the control unit
44), generated by or retrieved from other units 41-43 of such a
module 40 or other modules 20, 50 of the system 10, or transmitted
thereto by an external source such as, e.g., an external device or
user(s). Depending on the types of sources, the images may include
a still picture, a series of multiple pictures, a video clip, a
mixture thereof, and so on. The video-out unit is to perform
various functions similar or identical to those of the visual unit
51 and, therefore, may be replaced by the visual unit 51.
Otherwise, further configurations, arrangements, and/or
dispositions of the video-out unit are similar or identical to
those of the visual unit 51.
[0167] The video-in and/or video-out units (or control unit 44) may
acquire and/or generate the images associated or synchronized with
other features. In one example, the video-in and video-out units
may associate or synchronize at least one feature of such images
with at least one feature of the user(s) which may include, but not
limited to, a face, hand or arm, foot or leg, other body parts,
appearance, orientation or posture, movement, and physical or
physiological condition thereof. In another example, such video-in
and/or video-out units may associate or synchronize at least one
feature of the images with at least one feature of the exercises
as set forth herein, each of which may be determined by various
aspects and/or factors as described above. In another example, the
video-in and/or video-out units may associate or synchronize at
least one feature of the images with at least one feature of the
operations which may also include, but not limited to, variables
and/or parameters of such operations of the exercise modules 20, a
preset control program designed therefor, and the like.
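By way of non-limiting illustration, associating or synchronizing an image feature with an exercise feature may be sketched as follows, scaling the frame rate of the virtual environment with the user's running cadence; the baseline numbers and names are assumptions of this sketch:

```python
# Illustrative sketch of synchronizing an image feature (frame rate of the
# virtual environment) with an exercise feature (running cadence in steps
# per minute), so the scenery appears to pass faster as the user speeds up.
# The baseline of 24 fps at 120 steps/min is a hypothetical choice.
def frame_rate_for_cadence(steps_per_min: float,
                           base_fps: float = 24.0,
                           base_cadence: float = 120.0) -> float:
    """Return a frame rate proportional to cadence, floored at 1 fps
    so the images never freeze entirely."""
    return max(1.0, base_fps * steps_per_min / base_cadence)
```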
[0168] In another example, the control module 40 may include at
least one audio-in unit to acquire the sounds (i.e., auditory
features or elements) or, at least a portion of the control unit 44
may acquire the sounds. The sounds may be generated by or retrieved
from the audio-in unit (or a portion of the unit 44), generated by
or retrieved from other units 41-43 of the control module 40 or
other modules 20 of the system 10, or transmitted thereto by an
external source such as an external device or users. The audio-in
unit may acquire the sounds by various means, e.g., by acquiring
voices of or sounds of the user(s) or other persons, those
generated by a plant or animal, those of music or sounds generated
by a nonliving object, where the sounds may be provided in a mono,
stereo or surround mode. Based on the types of the sources, the
sounds may carry the contents (e.g., conversation) or melody (e.g.,
instrumental or non-instrumental music) or may not carry any
contents (e.g., instrumental music). The sounds may be provided by
various means. For example, the audio-in unit may store the sounds
and retrieve such therefrom, may synthesize the sounds or
superpose at least one sound onto another, may receive or
download the sounds from the storage unit 43 or at least one
external source such as the internet, a wired or wireless broadcast,
a user of another exercise module, or a user of another device.
The audio-in unit may also acquire the sounds with the sensor unit
42. In such examples, the sounds may include real or synthesized
sounds, where each sound may be generated or represent each source.
Accordingly, the sounds may correspond to the sounds of the user(s)
in their own voices or sounds synthesized or simulated by various prior
art means. In each example, such sounds may be related to or
associated with the preset event, geographic location, timing,
person, and/or object. The audio-in unit may manipulate various
aspects of the sounds depending on various factors such as the
source of such sounds, type thereof, their contents or aspects each
of which may be determined by various factors such as, e.g., the
user inputs, command or sensed signals, control signals, variables
or parameters sensed by the sensor unit 42, conditions of the
user(s), operations, and the like.
[0169] To provide such sounds with desired auditory features, the
audio-in unit preferably provides the desired sounds in the preset
viewpoint of the user. To this end, the audio-in unit may
manipulate various features or aspects of the sounds, where
examples of such aspects may include, but not be limited to, a
volume or tone of the sounds, a content thereof, frequency
distribution thereof, a direction thereof, temporal or spatial
characteristics, distributions, and variations of such aspects and
features, and the like. It is appreciated that the sounds may be
provided based on a preset direction or distance with respect to
the user or that the sounds may be provided to simulate the
direction or distance. The audio-in unit may also control such
aspects or features of the sounds, change the direction of and/or
distance to such sounds, and the like. To this end, the audio-in
unit may be constructed from any prior art microphones for
acquiring such sounds in an audible (or inaudible) frequency range.
The audio-in unit may include a single or multiple microphones for
acquiring the same or different sounds therefrom. The audio-in unit
may acquire such sounds from a preset target area of the user,
where the audio-in unit may be disposed near a mouth or other body
parts of the user(s). Therefore, the preset or target area may
include, e.g., an entire audible area of the user(s), an area
around a mouth thereof, an area encompassing an upper torso
thereof, an area defining a height similar to that of the user(s),
and the like. The audio-in unit may be disposed away from the
user(s), to be worn thereby over or around a mouth or vocal cord
thereof, to be worn thereby over or around a head thereof, to be
portably carried thereby, and the like. When desirable, the
audio-in unit may be incorporated into the exercise modules 20A,
20B, or at least a portion of the exercise modules 20A, 20B or
external audiovisual devices may be recruited therefor. The sensor
unit 42 of the control module 40 may be recruited therefor as
well.
[0170] In another example, the control module 40 may include at
least one audio-out unit for generating the sounds (including
auditory features or elements), or the control unit 44 may generate
the sounds. Such sounds may be generated by or retrieved from the
audio-out unit (or a portion of the control unit 44), generated by
or retrieved from other units 41-43 of the control module 40 or
other modules 20 of the system 10, transmitted thereto by an
external source such as, e.g., an external device or user(s).
Depending on the source types, such sounds may be in a mono, stereo
or surround mode. The audio-out unit is to perform various
functions similar or identical to those of the auditory unit 52 of
the output module 50 and, thus, may be replaced by the auditory
unit 52. Further configurations, arrangements or dispositions of
the audio-out unit are similar or identical to those of the
auditory unit 52.
[0171] The audio-in and/or audio-out units (or control unit 44) may
acquire and generate such sounds associated or synchronized with
other features. In one example, such audio-in and/or audio-out
units may associate or synchronize at least one feature of the
sounds with at least one feature of the user(s) as set forth
herein. In another example, such audio-in and/or audio-out units
may associate or synchronize at least one feature of the sounds
with at least one feature of the exercises as set forth herein,
which may be decided by various aspects or factors as set forth
herein. In another example, the audio-in and/or audio-out units may
associate or synchronize at least one feature of such sounds with
at least one feature of the operations such as, e.g., variables or
parameters of such operations and a preset control program designed
therefor.
[0172] In another example, the control module 40 includes at least
one receiving unit for receiving the signals wirelessly or by wire,
or at least a portion of the control unit 44 may receive the
signals. The receiving unit (or the portion of the control unit 44)
may receive the signals from the exercise or output modules 20, 50,
other units 41-43 of the control module 40, or an external source
such as an internet, wired or wireless broadcast. It is appreciated
that the signals may carry various information which includes
various features described above. Therefore, the receiving unit (or
the portion of the control unit 44) defines a desirable frequency
response or sensitivity capable of receiving such information and
features with minimal distortion. The receiving unit may be made
from any prior art wireless or wired receiver capable of receiving
the signals of preset frequency ranges. The receiving unit may
receive the information from the storage unit 43 of the control
module 40, from a provider of a wired or wireless communication,
through a global network, and so on. The receiving unit may
operatively couple with the exercise or output modules 20, 50,
where this unit operates as a "common" receiving unit of the system
10. The system 10 may also include multiple receiving (or control)
units at least one of which serves as a "master" receiving unit and
controls the rest thereof. Regardless of its types and number, the
receiving unit may be at least partially enclosed by a cover, a
divider or a partition as it is desired to enclose the receiving
unit inside the system 10. The receiving unit may include various
prior art wave guides or paths for enhancing reception of such
waves. Similar to other modules, the receiving unit may be
incorporated into other units or modules of the system 10.
[0173] In another example, the control module 40 may include at
least one sending unit for transmitting various signals wirelessly
or through wire, or at least a portion of the control unit 44 may
transmit the signals. The signals transmitted by the sending unit
(or a portion of the control unit 44) carry various informations
therealong which include various features set forth herein and,
accordingly, the sending unit preferably includes a desirable
frequency response or sensitivity for transmitting the informations
or features with minimal distortion. The sending unit transmits
such informations other modules 20, 50 while obtaining at least one
of such features from the storage unit 43, user(s), other modules
20, 50, external sources, and the like. The sending unit may be
fabricated by any prior art wireless or wired transmitter for
transmitting such signals of preset frequency ranges. The sending
unit may transmit such information to a provider of the wired or
wireless communication, to a global network, and the like. Other
configurations, arrangements, and/or dispositions of the sending
unit are similar or identical to those of the receiving unit
described in the above paragraph.
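The receiving and sending units described in paragraphs [0172] and [0173] amount to a message-passing layer between modules. The following is a minimal sketch of such a layer, assuming a simple publish-subscribe design; all names (`Signal`, `SignalBus`, the feature strings) are illustrative and not prescribed by this application.

```python
from dataclasses import dataclass
from typing import Callable, Dict, List

@dataclass
class Signal:
    source: str    # e.g. "exercise_module_20A" (hypothetical identifier)
    feature: str   # e.g. "speed", "load", "heart_rate"
    value: float

class SignalBus:
    """Carries signals between exercise, output, and control modules."""

    def __init__(self) -> None:
        self._subscribers: Dict[str, List[Callable[[Signal], None]]] = {}

    def subscribe(self, feature: str, handler: Callable[[Signal], None]) -> None:
        # A "receiving unit" registers for the features it needs.
        self._subscribers.setdefault(feature, []).append(handler)

    def send(self, signal: Signal) -> None:
        # The "sending unit": deliver the signal to every registered
        # receiver with the payload unaltered (minimal distortion).
        for handler in self._subscribers.get(signal.feature, []):
            handler(signal)
```

In this sketch an exercise module would publish its monitored features onto the bus, while the control and output modules subscribe to the features they need.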
[0174] Such communication-related units may be operatively coupled
to each other in various modes. For example, each unit may
operatively couple with the rest thereof so that each unit may
receive or transmit the signals as the electrical or optical
signals by wire or wirelessly. In an opposite example, at least one
of such units may couple with not all but only some of the rest of
the units such that, e.g., the video-in or video-out unit may
couple to the audio-in or audio-out unit but not with the rest
thereof. Therefore, detailed coupling modes of such units depend
not only on a configuration of such units but also on assigned
functions of each of such units. As described above, such
communication-related units may include multiples of at least one of
such units of the same type, e.g., including multiple video-in
and/or video-out units in order to display different and/or
overlapping images, including two or more audio-in and/or audio-out
units to play different or overlapping sounds, and the like. At
least a portion of the communication-related units may be included
in the exercise and/or output modules 20, 50, and perform the same
or similar functions as set forth herein. In this embodiment, such
a portion may be classified as a part of the control module 40 or
may be deemed as a part of such exercise or control module 20, 40,
depending upon detailed definition thereof.
[0175] Such a control unit 44 may be arranged to transfer at least
one feature of the task, users, exercises, and/or at least one
operation of the exercise modules 20 from one of the exercise
modules 20 to another, between the exercise modules 20, and the
like. The control unit 44 allows the transfer wirelessly or by
wire, either without altering any feature (i.e., a simple transfer)
or by converting or altering at least one feature based on at least
one preset relation (i.e., an equivalent conversion). To
this end, various portions of the control unit 44 may preferably
perform various transfer-related functions or such a control module
40 may include various conversion-related units, both for
performing transfer of at least one feature between the users or
between such exercise modules 20. It is appreciated that such
transfer (including both the simple transfer and the equivalent
conversion) is to transfer the task feature from one exercise or
output module 20, 50 to another and that the control unit 44 may
perform the transfer directly between the modules 20, 50 or
indirectly therethrough. Such a control unit 44 may transfer at
least one feature of the users, exercises, and/or operations
between the exercise and/or output modules 20, 50 as well.
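The distinction drawn above between a "simple transfer" and an "equivalent conversion" can be sketched as a single transfer function with an optional preset relation; the example relation (halving a wattage to compensate for a heavier destination load) is hypothetical.

```python
from typing import Callable, Optional

def transfer_feature(value: float,
                     relation: Optional[Callable[[float], float]] = None) -> float:
    """Return the feature value as delivered to the destination module."""
    if relation is None:
        return value            # simple transfer: feature unaltered
    return relation(value)      # equivalent conversion: preset relation applied

# Hypothetical preset relation: the destination module imposes twice
# the load, so the equivalent feature value is halved.
half_load = lambda watts: watts * 0.5
```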
[0176] Such conversion-related units (or control unit 44) may
receive at least one feature of the task, user(s), exercises,
and/or operations and convert such into the converted (or control)
signals at least partly based on a preset relation. In particular,
such conversion-related units (or control unit 44) may generate the
converted signals based on the control signals or various features
of the task, user(s), exercises, and/or operations, may convert the
converted signals into the control signals, and the like, although
such units may generate the converted or control signals (to be
referred to as the "signals" hereinafter) for manipulating various
features of the task provided by the control module 40 or other
external devices defining story-generating, scenery-generating or
game-generating capabilities which may or may not be deemed as a
part of the system 10. It is appreciated that such
conversion-related units generate the signals and deliver such
signals to the external device, thereby manipulating the device
based on the signals and manipulating at least one task feature
based thereon. To these ends, such conversion-related units may
include various units such as, e.g., at least one simulator unit,
at least one converter unit, and at least one driver unit, where
such units may be provided in singular or plural arrangements. As
set forth herein, however, at least one of such units may be
incorporated into and/or replaced by at least a portion of the
control module 40, other modules of the system 10, and/or the
external device. Accordingly, exact
disposition of various conversion-related units or classifications
thereof may not be critical to the scope of this invention as long
as the system 10 may provide various functions to be disclosed in
conjunction with various conversion-related units.
[0177] In one example, such a control module 40 may include at
least one simulator unit for acquiring at least one feature of the
task, user(s), exercise(s), and/or operations, or at least a
portion of such a control unit 44 may be arranged to acquire the
feature. The simulator unit (or the portion of the control unit 44)
may passively receive the feature from another module (or unit) of
the system 10 or actively monitor and acquire the feature using any
prior art sensors or the sensor unit 42. As set forth herein, the
simulator unit acquires at least one exercise feature and then
converts the acquired feature into the signals, where the simulator
unit may acquire the desired feature by directly monitoring such,
by estimating the desirable feature from at least one other
exercise feature, by estimating the desired feature based on at
least one of such features of the user(s), tasks, and/or
operations, by receiving the desirable feature from another module
or unit, and the like. It is appreciated that the simulator unit
may acquire the desired feature by analyzing various images or
sounds provided by other modules or units or by the external device
which may or may not be deemed as the part of the system 10. When
the simulator unit acquires at least one feature of the task,
user(s), and/or operations, such a unit may acquire the feature by
monitoring or estimating through various means similar to those of
acquiring the exercise feature. The simulator unit may acquire at
least one task feature and convert the acquired feature into the
converted or control signals, where the simulator unit acquires the
desired feature of the task by directly monitoring the task, by
estimating the desirable feature from at least one other task
feature, by estimating the desired feature based on at least one of
the features of such user(s), exercise(s), and/or operations, by
receiving the desirable feature from another module and/or unit, by
receiving the desirable feature from the external device, by
receiving the desired feature from another user of another exercise
module, and the like. The simulator unit may acquire the desired
feature by analyzing the images and/or sounds generated for the
task by the control module 40 or by the external device which may
or may not be the part of the system 10.
[0178] The simulator unit couples with such exercise modules 20 or
external device to manipulate at least one operation feature of the
modules 20 (or device) at least partly based on at least one
feature of the task, users, exercises, and/or operations. The
simulator unit may acquire the desired exercise feature by directly
coupling to the exercise modules 20, indirectly coupling thereto
through the control module 40, and the like. Similarly, the
simulator unit may directly couple with the external device when
the system 10 drives the device and manipulates at least one
feature of the task provided by the device. The simulator unit may
also operatively couple to the external device through at least one
of the other modules or units of the system 10. The simulator unit
may be
constructed from any prior art devices capable of acquiring such
desired exercises, task, and/or other features and converting such
into the converted and/or control signals. The simulator unit may
include at least one receiver to receive such a desired feature
from the exercise or control modules 20, 40 or external device, at
least one sensor for monitoring the desired exercises, users,
and/or task features, and the like. The simulator unit may be
provided as software driven by other modules 20, 50, or may be
incorporated thereinto.
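The acquisition strategy of the simulator unit, i.e., directly monitoring a desired feature or estimating it from other available features, can be sketched as follows; the speed, distance, and duration features used here are illustrative assumptions.

```python
def acquire_speed(monitored: dict) -> float:
    """Acquire a desired exercise feature (speed), preferring a directly
    monitored value and falling back to an estimate from other features."""
    if "speed" in monitored:
        return monitored["speed"]                 # direct monitoring
    if "distance" in monitored and "duration" in monitored:
        # Estimate the desired feature from two other exercise features.
        return monitored["distance"] / monitored["duration"]
    raise ValueError("speed can be neither acquired nor estimated")
```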
[0179] In another example, the control module 40 may include at
least one converter unit for assisting the simulator unit (or the
portion of the control unit 44) while performing conversion of the
acquired feature into converted or control signals by providing at
least one relation and equivalence respectively for the "simple
transfer" and "equivalent conversion," both of which are to be
collectively referred to as the "conversion" hereinafter. At least
a portion of the control unit 44 may perform such conversion by
providing the relation and equivalence. The converter unit may
preferably provide such a relation for associating or synchronizing
at least one feature of one of such a task, user(s), exercise(s),
and operation of the exercise modules 20 with at least one feature
of at least one another thereof so that various features of
different types may be related to each other at least partly based
on the relation. The converter unit may arrange the relation to
account for discrepancies in amounts of energy which may be
required for performing a unit of various exercises and which may
be attributed to different types of exercises, different loads
imposed by different exercise modules 20, differences in physical
abilities of the users, and the like. Thus, the system 10 may
perform the "simple transfer" when the control module 40
delivers or transmits at least one preset feature between at least
two modules 20, 50 without modifying the preset feature or,
alternatively, the system 10 may perform the "equivalent
conversion" as the control module 40 may deliver or transmit at
least one preset feature between the modules 20, 50 while altering
such a feature based on the relation. In either example, the system
10 may relate at least one feature of the task, users, exercises or
operations to a different feature of the same type, to the same
feature of the different type, or to a different feature of a
different type.
[0180] The converter unit (or the portion of the control unit 44)
may perform the simple transfer and/or equivalent conversion based
on the preset relation which may be decided at least partly based
on at least one feature of various types such as, e.g., the task,
user(s), exercise(s), operation, and the like. For example, the
converter unit may provide the relation and convert at least one
feature of one of the above types defined in a specific unit (e.g.,
calories, watts, N, N/m, N-m, minute, meter, and so on) into the
same feature of the same type but in a different unit, into the
same feature of a different type but in the same unit, into a
different feature of the same type but in a different unit, and/or
into a different feature of a different type and in a different
unit. As a result, the converter unit may synchronize or associate
at least one feature of the exercise(s) with at least one feature
of the task (or vice versa), may associate or synchronize at least
one feature of the user(s) and/or operation(s) with at least one
feature of the task or vice versa, and the like. This function may
be of particular importance when the converter unit is to simulate
the user(s) into the simulated user such as, e.g., at least one
object or background in the images for the task. The converter unit
may generate such a relation at least partly based on the control
signals generated at least partly based on the user inputs and/or
preset program, may retrieve one or more from multiple relations
stored in the system 10, and the like.
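As a concrete illustration of the unit and type conversions listed above, the sketch below relates a power feature in watts to the same underlying quantity (energy) in kilocalories, and then relates that exercise feature to a hypothetical task feature (a game score). The physical conversion factor is standard; the points-per-kilocalorie relation is an assumption, not taken from this application.

```python
JOULES_PER_KCAL = 4184.0  # standard thermochemical kilocalorie

def watts_to_kcal(power_watts: float, duration_s: float) -> float:
    """Convert a power feature (watts) sustained over a duration into
    the same underlying quantity (energy) in a different unit (kcal)."""
    return power_watts * duration_s / JOULES_PER_KCAL

def kcal_to_task_points(kcal: float, points_per_kcal: float = 10.0) -> float:
    """Relate an exercise feature (energy) to a task feature (score)
    via a preset relation; the 10 points/kcal factor is hypothetical."""
    return kcal * points_per_kcal
```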
[0181] The converter unit may keep the relation constant during the
exercises or at least one stage of the task. The converter unit may
allow the users to manually control the relation during the
exercises or stage of the task. The converter unit may determine
the relation at least partly based upon various signals provided by
the users, a user of another exercise module or the external
device, and so on, with or without any intervention therefrom. The
converter unit may vary the relation automatically or adaptively at
least partly based on at least one factor of the users, task,
exercises, and/or operations. The converter unit may be made of any
prior art devices for generating or retrieving the basic relation
or equivalence and utilizing such to generate the converted or
control signals. The converter unit may include at least one
optional receiver to receive such a feature from the exercise or
output module 20, 50 (or external device), at least one optional
sensor for monitoring such a feature of the task, users or
exercises, at least one processor to generate the relation, and the
like. The converter unit may also be provided as software and
driven by the control and/or other modules 20, 40, 50. The
converter unit may be incorporated into the control module 40, or
at least a portion thereof may be incorporated into the exercise or
output modules 20, 50 (or external device) when desirable.
[0182] In another example, the control module 40 may optionally
have at least one driver unit capable of providing the converted or
control signals and manipulating at least one feature of the task
provided by the external device, or at least a portion of the
control unit 44 may provide such signals and control the task
feature. To this end, the driver unit may have a configuration or
arrangement to communicate with the external device of only a
certain type or at least two external devices of different
operating types. It is appreciated that detailed configurations or
arrangements of the driver unit are not critical to the scope of
this invention as far as the driver unit drives the external
device. The control module 40 may not incorporate any driver unit
when the simulator unit, converter unit or its other units 41-44
may generate the task and manipulate various features thereof or
may directly control the external device, which explains why the
driver unit is merely an optional unit. The external device may
also perform the function of the driver unit when desirable. The
driver unit may be provided as a driver of any prior art
audiovisual external device capable of communicating with at least
one module of the system 10. The driver unit may have at least one
receiver and converter, where the receiver may receive various
signals from the simulator or converter unit, or exercise or output
modules 20, 50, while the converter may convert the signals into
desired formats readable by the device. The driver unit
may be provided as software driven by the control or other
modules of the system 10. Such a driver unit may be incorporated
into the control module 40 or at least a portion of the driver unit
may be included in the exercise or output modules 20, 50 or
external device.
[0183] Based upon these major functions, the control unit 44
performs numerous other functions. For example, the control unit 44
receives the command and sensed signals respectively from the input
and sensor units 41, 42 and performs various control operations on
such exercise or output module 20, 50 in various modes. The control
unit 44 may determine the features of the task defined in the
images, select desirable images from multiple sets of images
stored in the storage unit 43 or supplied thereto from various
sources, generate the images by assembling or composing various
features, and the like. The control unit 44 performs such functions
based on various features including the user inputs or command
signals, variables or parameters monitored by the sensor unit 42,
sensed signals, and so on. Thus, the control unit 44 may monitor
the features such as the physical or physiological conditions of
the users monitored by the sensor unit 42, control programs or
algorithms stored therein or supplied thereto by the users or
external sources, and the like. Such features may include various
variables or parameters related or associated with the exercise
modules 20 or exercises, where examples of such features may
include, but not be limited to, a type of the exercise and an
extent thereof attained by the user. Such an extent is defined by
various criteria, examples of which may include, but not be limited
to, a duration of presence or absence of the users on or near the
exercise modules 20, a duration of such exercises done against or
onto the exercise modules 20, a duration of the exercises performed
by the users, a duration of the exercises or energy done by or onto
the users, a load presented by or set in the exercise modules 20
against the users, a product of the load and any of such durations,
a mathematical function of the load, a number of calories estimated
to be consumed by the user, a work done against or onto the
exercise modules 20 by the users, a work done on the users by the
exercise modules 20, and the like, where such a load may be deemed
as a variable or a parameter determining an amount of energy
consumed by the user in consummating a unit displacement and/or
deformation of a specific part (e.g., the actuating part) of the
exercise modules 20 or body parts of the users.
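The extent criteria enumerated above (durations, loads, their products, work, and estimated calories) can be gathered into a single computation, sketched below with illustrative inputs; the application lists the criteria but prescribes no particular formula set, so the structure and field names here are assumptions.

```python
def exercise_extents(load_n: float, duration_s: float,
                     displacement_m: float) -> dict:
    """Compute several 'extent' criteria for one exercise session."""
    work_j = load_n * displacement_m        # work done against the module
    return {
        "duration_s": duration_s,           # duration of the exercises
        "load_n": load_n,                   # load set in the exercise module
        "load_x_duration": load_n * duration_s,  # product of load and duration
        "work_j": work_j,
        "kcal": work_j / 4184.0,            # estimated calories consumed
    }
```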
[0184] As described above, the control unit 44 may be arranged to
manipulate at least one feature of one of the task, users,
exercises, and/or operations of the exercise modules 20 at least
partly based on at least one feature of another thereof. Such
manipulation may be classified into three modes, i.e., manipulating
the task based on the exercises, manipulating the exercises based
on the task, and both.
[0185] For example, the control unit 44 may monitor the users or
exercise modules 20, acquire at least one feature of such users,
exercises, and/or operation of the exercise modules 20, and
manipulate at least one task feature at least partly based on the
monitored feature. To this end, the control unit 44 monitors the
operation or user feature by the sensor unit 42, generates the task
in the images (with or without including the simulated user) of the
virtual environment at least partly based on the monitored feature,
and manipulates the output module 50 to display the images (or with
the sounds). Therefore, the control module 40 may manipulate the
task (i.e., its features) at least partly directly based upon the
exercises (i.e., exercise feature) or indirectly based thereon
(i.e., user or operation feature), thereby allowing the users to
manipulate the task and to proceed through the task at least partly
based on the exercises. The control unit 44 may vary the mode of
manipulation in various means, e.g., in response to the command
signals (i.e., manually), to the sensed signals (i.e.,
automatically or adaptively), or at least partly based upon at
least one of such features of the task, users, exercises, and/or
operation features (i.e., automatically or adaptively).
[0186] In another example, the control unit 44 may monitor the
images for the task (or simulated user therein), acquire at least
one feature of the images (or that of the simulated user), and then
manipulate at least one operation feature of the exercise modules
20 at least partly based on the monitored task feature. To this
end, the control unit 44 may operatively couple to the load of
the exercise modules 20, manipulate the load at least partly based
on the monitored feature, and then manipulate the output module 50
to display such images (or simulated user) of which the feature is
determined at least partly based on the manipulated load of such
exercise modules 20. Thus, the control module 40 manipulates the
operation feature of the exercise modules 20 at least partly based
upon the task, thereby directly affecting the operation of such
exercises by the task feature and indirectly affecting the types
and/or extents of such exercises to be performed by the users and
the physical or physiological conditions of the users which result
from such exercises at least partly based on performance of the
task. The control unit 44 may change the manipulation mode in
various means, e.g., in response to the command signals (i.e.,
manually), to the sensed signals (i.e., adaptively or
automatically), or at least partly based on at least one of the
features including those features of the task, users, exercises,
and/or operation (i.e., automatically or adaptively).
[0187] In another example, the control unit 44 may perform such
manipulations either sequentially (i.e., one after another) or
simultaneously. That is, the control unit 44 may manipulate at
least one feature of the task at least partly based on the
monitored features of the users, exercises, and/or operation or may
manipulate at least one operation feature at least partly based on
the monitored task feature. This control unit 44 may change its
mode of manipulation based upon the command signals (i.e.,
manually), sensed signals (i.e., automatically or adaptively), at
least one of such features (i.e., automatically or adaptively), and
the like. It is appreciated that the sequential manipulation is
best suited when a single user performs different exercises on,
with or against one exercise module 20 and views the images for the
task by the output module 50, although it is not impossible to use
the simultaneous manipulation mode therefor. In contrast, the
simultaneous manipulation is best suited when multiple users
perform the same or different exercises on, with or against
multiple exercise
modules 20 while viewing such images for the task using individual
output modules 50. The sequential or simultaneous manipulation is
performed in real time so that a desired feature may be transferred
between the exercise and output modules 20, 50, although the
control unit 44 may store such features of the first exercising
user and then transfer such to another user later. In both
manipulations, the control unit 44 may transfer the feature by wire
or wirelessly or may transfer the feature with or without modifying
at least a portion thereof.
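The two manipulation directions described above, and their combination in the simultaneous mode, can be sketched with hypothetical features: the user's speed on the exercise module drives the simulated user's speed in the task, while a gradient encountered in the task drives the load set on the module. The scale factors are illustrative assumptions.

```python
def manipulate_task(exercise_speed: float, scale: float = 1.2) -> float:
    """Task manipulated by the exercise: the simulated user's speed in
    the images follows the monitored speed on the exercise module."""
    return exercise_speed * scale

def manipulate_module(task_gradient: float, base_load: float = 50.0) -> float:
    """Exercise manipulated by the task: the module load is raised
    when the task presents an uphill gradient."""
    return base_load * (1.0 + task_gradient)

def simultaneous_step(exercise_speed: float, task_gradient: float):
    """Simultaneous mode: both manipulations in one control cycle."""
    return manipulate_task(exercise_speed), manipulate_module(task_gradient)
```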
[0188] When the control unit 44 is to manipulate various features
related to at least one object of such images, such features may
include various features associated with the objects, where
examples of the features may include, but not be limited to, the
type of the object (i.e., an animated or synthesized object), a
mode thereof (i.e., black-and-white, grey-scale, or color), a
dimension thereof (i.e., two- or three-dimensional), a
configuration thereof, an arrangement and/or
disposition thereof, and so on. As the control unit 44 simulates
the user(s) into the simulated user, the manipulatable features of
the simulated user may similarly include, but not be limited to,
the type of the simulated user, its mode, its dimension, its
configuration, or its arrangement and/or disposition. Such an
object may be selected to be directly manipulated by at least one
feature of the user(s), task, exercise, and/or operation or, in the
alternative, may be the simulated user simulating or synthesizing
the user(s). It is appreciated that the control unit 44 may
generate one or more simulated users for the sequential
manipulation but multiple simulated users for the simultaneous
manipulation, although the reverse is feasible.
[0189] The control unit 44 may manipulate at least one of such
features based on at least one other of such features while
providing the
user with the task in the images of such desired features. In one
example, the control unit 44 may provide (i.e., generate or select)
the images of the task in a preset perspective of the user. For
example, such a feature associated or related with the users may
not be included in the images. The control unit 44 may also include
in the images at least one user feature which may be constant or
varying based upon other features. In another example, the control
unit 44 may construct the images in an arrangement that any feature
of the images may be varied by the users depending on other
features. In particular, the control unit 44 may include in such
images at least one object at least one feature of which is
determined at least partially based on at least one other feature.
Thus, the control unit 44 may allow the users to manipulate at
least one feature of the object, thereby allowing them to
manipulate the images directly or indirectly by
manipulating at least one of such features. The control unit 44 may
instead construct the images for the task which the users may not
be able to directly control. In another example, the control unit
44 may include at least one object in such images, while simulating
at least one characteristic of the users by the object
corresponding to the simulated user. Therefore, the control unit 44
may change at least one feature of the images for the task (or its
simulated user) based on various features of the users or exercise
modules 20 (i.e., their operations). The control unit 44 may
thereafter manipulate at least one feature of such images based
upon at least one of such features, user inputs, command or sensed
signals, conditions of the users, or operations.
[0190] The control unit 44 may arrange such images to simulate the
users into at least one object or background of the task, or as at
least one voice or other auditory feature, and the like. It is
appreciated that the control unit 44 may unilaterally manipulate
the object, background, voices, sounds, and so on, while simulating
the users as such, but the users may not manipulate such in a
reverse mode. In the alternative, the control unit 44 may be
arranged to allow such control by the users. Accordingly, the
control unit 44 may manipulate at least one feature of the images
for the task based on at least one of such features, user inputs,
command or sensed signals, conditions of the users, or
operations.
[0191] The control unit 44 may use the images for the task or
features to control various operations of the exercise modules 20.
For example, the control unit 44 may provide (i.e., generate or
select) such images in the users' (or another) perspective and then
manipulate such operations of the exercise modules 20 based upon
at least one of the features. In another example, the control unit
44 simulates the users into at least one of such features of the
images and control such operations by manipulating the simulated
portion of the feature of such images. In another example, the
control unit 44 may allow the users to directly manipulate at least
one feature of the images and directly control such operations or
may control the operations indirectly using the manipulatable
feature of the images for the task.
[0192] Any conventional control device may be used as the control
unit 44 of the control module 40 of the present invention.
Accordingly, the control unit 44 may include various electric
elements such as a resistor, a capacitor, an inductor, an
amplifier, a diode, and the like, details of which are well known
to one of ordinary skill in the art of the electrical control
system. Such a control unit 44 may be formed on a circuit board,
may be fabricated as a microchip, and the like. As far as the
control unit 44 may be able to perform various control functions
set forth herein, the control unit 44 may be provided by almost any
prior art process and in almost any prior art configuration.
[0193] The control unit 44 may be disposed at various positions
of the system 10. For example, the control unit 44 may be formed
physically separate from the exercise or output module 20, 50, or
may be disposed on or in the modules 20, 50. The control unit 44
may operatively couple with other parts of the system 10 by wire or
wirelessly, depending on its disposition or structure. In addition,
at least a portion of the control unit 44 may be incorporated in
the portable or wearable articles so that the users may perform the
exercises while providing the user inputs without stopping the
exercises. At least a portion of the control unit 44 may be worn
around the users to allow the control unit 44 to perform its
functions. Other configurations, arrangements, and/or dispositions
of the control unit 44 are the same as or similar to those of the
overall
control module 40 as far as the control unit 44 performs such
functions.
[0194] Various units (including those units related to the
communication or conversion) of the control module 40 may
operatively couple to each other in various modes. For example,
each unit may couple with the rest thereof and receive or transmit
various information carried along the electrical or optical
signals by wire or wirelessly. Conversely, at least one of such
units may couple with one or more but not all of such units where,
e.g., the storage unit 43 couples with the control unit 44 but not
with the sensor unit 42. Accordingly, the detailed coupling modes
of such units depend not only on an overall configuration of the
control module 40 but also on those functions assigned to each
unit. The control module 40 requires the control unit 44 but does
not necessarily need the input, sensor or storage units 41-43. The
input unit 41 may also be replaced by at least one exercise module
20 capable of receiving the user inputs and relaying such (or
command signals) to the control unit 44. The sensor unit 42 may not
be needed when the control unit 44 itself monitors the variables or
parameters, when the control unit 44 does not include a feedback
control mechanism, and the like. The control module 40 may not need
the storage unit 43 as the control unit 44 provides or generates
the task and its features based on various information supplied
thereto by the users or from an external source. By the same token,
the control module 40 may include multiple units of the same or
different types such that, e.g., the control module 40 may include
two or more control units 44 performing different or redundant
functions, two or more sensor units 42 monitoring the same or
different variables or parameters, each monitoring the same
variable or parameter in different positions, and so on. At least a
portion of the control module 40 may be disposed in the exercise or
output module 20, 50 in order to perform the same or similar
functions. In this embodiment, this portion may be classified as a
part of the control module 40 or may form a part of the exercise
and/or output module 20, 50.
[0195] In another aspect of the present invention, an exemplary
exercise system may be embodied in various configurations and/or
arrangements. FIGS. 2A to 2F depict schematic diagrams of exemplary
exercise systems incorporating therein a different number of
modules. It is appreciated that not every module or unit of the
system is included in the figures but that the modules and/or units
set forth herein may be incorporated into such systems and perform
the above functions as set forth hereinabove.
[0196] In one exemplary embodiment and as exemplified in FIG. 2A,
an exemplary exercise system 10 includes two exercise modules 20A,
20B, control modules 40A, 40B, and output modules 50A, 50B, where
one control module 40A manipulates one exercise and output module
20A, 50A, while another control module 40B manipulates another
exercise and output module 20B, 50B. Each pair of such modules 20A
and 20B, 40A and 40B, 50A and 50B is disposed in different locations
and may communicate with each other directly, indirectly through
another module, through an external provider, and so on. Such
modules 20A and 20B, 40A and 40B, 50A and 50B may be independent of
each other, may perform the same, different or redundant functions,
and the like.
[0197] In another exemplary embodiment and as exemplified in FIG.
2B, an exemplary exercise system 10 has two exercise modules 20A,
20B, two output modules 50A, 50B, but a single control module 40,
where the exercise modules 20A, 20B may communicate with each other
or be independent of each other, while the output modules 50A, 50B
may communicate with each other or may be independent of each
other, may perform the same, different or redundant functions, and
so on. However, the single control module 40 may manipulate the
exercise modules 20A, 20B and output modules 50A, 50B. In a related
embodiment and as exemplified in FIG. 2C, an exemplary exercise
system 10 has two exercise modules 20A, 20B, two control modules
40A, 40B, and a single output module 50, where the exercise modules
20A, 20B may communicate with each other or may be independent of
each other, while the control modules 40A, 40B may communicate with
each other, be independent of each other, perform the same,
different or redundant functions, and the like. The single output
module 50 may provide the images for the exercise modules 20A, 20B
and may be manipulated by the control modules 40A, 40B.
[0198] In another exemplary embodiment and as exemplified in FIG.
2D, an exemplary exercise system 10 includes two exercise modules
20A, 20B, a single output module 50, and a single control module
40, where the exercise modules 20A, 20B may communicate with each
other or be independent of each other. The single control module 40
may manipulate the exercise modules 20A, 20B and the output module
50, and the single output module 50 may then display the images for the
exercise modules 20A, 20B either sequentially or
simultaneously.
[0199] In another exemplary embodiment and as exemplified in FIG.
2E, an exemplary exercise system 10 may include a single exercise
module 20, a single output module 50, and a single control module
40. This system 10 cooperates with another exercise module or
another system which is disposed in the same location and includes
at least one exercise module so that a single user may perform
different exercises sequentially or multiple users perform the
same or different exercises simultaneously. In a related embodiment
and as described in FIG. 2F, an exemplary exercise system 10
includes a single exercise module 20, a single control module 40,
and a pair of output modules 50A, 50B. This system 10 preferably
cooperates with another exercise module or another system which is
disposed in the same location and includes at least one exercise
module such that a single user may perform different exercises
sequentially or multiple users perform the same or different
exercises simultaneously.
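The module counts of FIGS. 2A to 2F described in paragraphs [0196] to [0199] may be summarized in a brief illustrative sketch. The mapping below is simply a reading of the text above, and the helper name is hypothetical.

```python
# Module counts per figure, as read from the descriptions above.
configurations = {
    "2A": {"exercise": 2, "control": 2, "output": 2},
    "2B": {"exercise": 2, "control": 1, "output": 2},
    "2C": {"exercise": 2, "control": 2, "output": 1},
    "2D": {"exercise": 2, "control": 1, "output": 1},
    "2E": {"exercise": 1, "control": 1, "output": 1},
    "2F": {"exercise": 1, "control": 1, "output": 2},
}

def shared_modules(cfg):
    """A module type is 'shared' when a single instance serves both users."""
    return [kind for kind, n in cfg.items() if n == 1]

print(shared_modules(configurations["2B"]))  # -> ['control']
```

For instance, FIG. 2D shares both the control and output modules between the two exercise modules, while FIG. 2A shares nothing.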
[0200] In another aspect of the present invention, such exercise
systems may be embodied in various configurations and/or
arrangements. FIGS. 3A and 3B are schematic perspective views of
exemplary exercise systems each of which includes two exercise
modules according to the present invention. It is appreciated that
not every module and/or unit of the systems is shown in the
figures but that such modules and/or their units described
hereinabove may be incorporated into such systems and perform the
above functions as set forth hereinabove.
[0201] In one exemplary embodiment of this aspect of the invention
and as exemplified in FIG. 3A, an exemplary exercise system 10
includes two exercise modules 20A, 20B, at least one control module
(not shown in the figure), and two output modules 50A, 50B (only
their portions included in the figure). As set forth herein, the
control and output modules 40, 50A, 50B may include at least one or
all of the aforementioned units thereof.
[0202] The system 10 includes two exercise modules 20A, 20B each of
which is provided as a prior art treadmill machine with a frame
21A, 21B and a track 22A, 22B, where each frame 21A, 21B forms a
basic body of each exercise module 20A, 20B, while each track 22A,
22B is arranged to translate and to allow the user to walk or to
run thereon. On a front end of each frame 21A, 21B is provided a
stand 21S extending upward and bifurcating vertically to form a
pair of handles 21H which the user may hold while running or
walking on or with each exercise module 20A, 20B. The frames 21A,
21B further extend forward to form couplers 21C whose functions are
described below. Such
exercise modules 20A, 20B include numerous other parts which are
not included in the figure but commonly seen in the prior art
treadmill machine. For example, the exercise modules 20A, 20B have
electric actuators and controllers capable of translating the
tracks 22A, 22B at preset speeds or those selected by the user.
Such modules 20A, 20B may have gear assemblies or their equivalents
to adjust the speeds of the tracks 22A, 22B, to manipulate heights
or ascending angles of the tracks 22A, 22B with respect to the
ground, and the like.
[0203] Each output module 50A, 50B includes a full-size visual unit
51A, 51B forming an image domain 51M, as well as an auditory unit
52, an olfactory unit 53, and a tactile unit 54. Each visual unit
51A, 51B is provided as a projection screen including a projector
(not shown in the figure), TV, LED display panel, PDP, and the
like, each capable of displaying thereon a single or multiple
images to provide such images of the virtual environment for the
task of the story, scenery, and/or video (or computer) game. It is
noted that each visual unit 51A, 51B of this embodiment defines the
image domain slightly convex toward each exercise module 20A, 20B,
although each visual unit 51A, 51B may instead define a flat image
domain 51M. When desirable, the system 10 may include a single
visual unit 51 defining multiple portions in its image domain 51M
or, in the alternative, such a system 10 may have multiple visual
units 51A, 51B to display multiple images on their multiple image
domains 51M. Each visual unit 51A, 51B may be disposed at a preset
arrangement with respect to each exercise module 20A, 20B by being
at least partially supported by or coupled to the coupler 21C. The
couplers 21C may support such visual units 51A, 51B in a fixed
disposition or such couplers 21C may movably support the visual
units 51A, 51B to adjust their disposition with respect thereto.
The visual units 51A, 51B may further define an image domain 51M
which is asymmetric vertically or horizontally, which is disposed
preferentially to one side of each exercise module 20A, 20B, or
which encloses a greater portion or angle of each exercise module
20A, 20B in a transverse direction perpendicular to a vertical
axis of each exercise module 20A, 20B, and the like. Such visual units 51A, 51B
may also be disposed in other positions or orientations as far as
the user(s) may watch the images thereon while engaging the same or
different exercises. Each auditory unit 52A, 52B includes multiple
speakers and is generally disposed alongside each visual unit 51A,
51B. In this embodiment, each auditory unit 52A, 52B disposes
multiple sets of speakers (e.g., a first set of speakers on one
vertical edge of the visual unit 51A, 51B and a second set thereof
on an opposing vertical edge thereof), thereby playing sounds in a
stereo mode. Different numbers of speakers may be disposed in
different positions of the system 10 in different arrangements as
well, as far as each auditory unit 52A, 52B may be able to
effectively deliver the sounds to the user(s) engaged in exercise
on, with or against each exercise module 20A, 20B. Although not
shown in this figure, the visual or auditory units 51A, 51B, 52A,
52B may include at least one storage or driver for retrieving
stored images or sounds and for providing retrieved images or
sounds to the user(s), where such storage or driver may be disposed
in various positions of the system 10 as long as such a disposition
may not hinder normal operation of the visual and/or auditory units
51A, 51B, 52A, 52B. It is also appreciated that the electrical
energy for actuating such visual or auditory units 51A, 51B, 52A,
52B may be directly supplied thereto or such units 51A, 51B, 52A,
52B may receive electrical energy from a power supply providing the energy
to the exercise modules 20A, 20B. Although not shown in the figure,
the exercise modules 20A, 20B are disposed in different locations
and include one of the control and output modules 40A, 40B, 50A,
50B, where at least one module in one location communicates with at
least another module in another location via a local or global
network.
[0204] The output module 50A, 50B may also include the olfactory
units 53A, 53B to give off various smells to the user(s) during
exercises. To this end, the olfactory units 53A, 53B have at least one
storage for storing at least one chemical substance and at least
one dispenser for discharging the substance out of the storage
toward the user(s) through outlets such as, e.g., openings,
nozzles, tubes, and the like. The olfactory units 53A, 53B of this
embodiment provide a series of nozzles or openings on top of the
left and right sets of speakers of the auditory units 52A, 52B,
thereby giving off different smells in a stereo mode or giving off
the same smell of different intensity therefrom. It is appreciated
that the storage or dispenser of such olfactory units 53A, 53B may
be disposed in any position of the system 10 as long as such a
disposition does not hinder normal operation thereof. It is further
appreciated that the olfactory units 53A, 53B or their various
parts may receive the electrical energy by a common power line or
directly through another line coupled to a source of such energy.
The output modules 50A, 50B may have the tactile units 54A, 54B to
generate and deliver the mechanical, thermal, electrical, and/or
optical sensations to the user(s). The tactile units 54A, 54B of
this embodiment may include multiple applicators preferentially
disposed along a pair of handles 21H of each exercise module 20A,
20B. Depending upon its characteristics, such tactile units 54A,
54B may generate translating movements, rotational movements, or
vibration of such applicators for delivering such mechanical
sensations such that the user(s) may feel the intended tactile
feature of the virtual environment. The tactile units 54A, 54B may
instead apply the electrical potential across such applicators or
may flow the electric current through at least one of such
applicators, thereby allowing the user(s) to feel the electrical
sensations for the virtual environment. Unlike those shown in the
figure, the tactile units 54A, 54B may include applicators
which are disposed in other parts of the exercise and/or output
modules 20A, 20B, 50 and deliver various sensations to the user(s),
where such dispositions may not be critical to the scope of the
present invention as long as the tactile units 54A, 54B may
properly operate in such dispositions.
[0205] The output modules 50A, 50B may include the display units
55A, 55B disposed on top of such stands 21S of each exercise module
20A, 20B. The display units 55A, 55B display various variables or
parameters associated with operations of various modules 20A, 20B,
40A, 40B including the output modules 50A, 50B. Therefore, each
output module 50A, 50B may have its own display unit 55A, 55B or
may share a display panel of each exercise module 20A, 20B as its
display unit. In this respect, the display units may be deemed a
part of the exercise or control modules 20A, 20B, 40A, 40B,
depending on the usage and/or mode of coupling. Other
characteristics of various modules and their units of the system 10
have been set forth hereinabove.
[0206] In operation, multiple (two in this example) users perform
the same or different exercises using multiple (two in this
example) exercise modules 20A, 20B. That is, a first user sets a
first exercise module 20A or a first control module 40A in order to
perform first exercise and to provide desired first images (or
first virtual environment) for the task. For example, the first
user selects what kind of first images are generated by the first
control module in a desired mode, and provides the settings to
those modules. The user then turns on the first exercise module
20A, translates its track 22A at a desirable speed, begins walking
or running thereon, and so on. Concurrently with the first user, a
second user may set a second exercise module 20B and perform second
exercise which is identical to, similar to or different from the
first exercise, where the second user also performs the exercise of
running or walking on the track 22B in this example. The second
user also selects second images for the task, where the task for
the second user is typically same as that for the first user and
where the second images may be identical or similar to the first
images, may be such first images viewed in a different perspective,
or may be different images of the same task. Such first and/or
second control modules may monitor at least one feature of the
first and second exercises, the first and second users, and/or
operation of such first and second exercise modules 20A, 20B,
respectively, and manipulate at least one feature of the task at
least partly based on the monitored feature, whereby the first and
second users perform the exercises while proceeding along the task
directly or indirectly based upon the first and/or second
exercises. Alternatively, the first and/or second control modules
may monitor at least one feature of the task performed by the first
and/or second users included in the images for the task, and then
manipulate at least one feature of the operation of the first
and/or second exercise modules 20A, 20B at least partly based on
the monitored feature, whereby the first and second users perform
the first and second exercises of which the features are directly
or indirectly decided by the task in which the first and second
users proceed while competing with each other.
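The two control directions described above, where monitored exercise features drive task features, or conversely task features drive the operation of the exercise modules, may be sketched as follows. All function names, scaling factors, and thresholds are illustrative assumptions, not values from the specification.

```python
def exercise_drives_task(track_speed_kmh, base_avatar_speed=1.0):
    """Forward mode: a monitored exercise feature (track speed) sets a
    task feature (the simulated user's in-game speed)."""
    return base_avatar_speed * (1.0 + track_speed_kmh / 10.0)

def task_drives_exercise(avatar_on_uphill, flat_speed_kmh=8.0):
    """Reverse mode: a task feature (terrain in the virtual environment)
    sets an operation feature of the exercise module (track incline)."""
    incline_pct = 6.0 if avatar_on_uphill else 0.0
    return {"speed_kmh": flat_speed_kmh, "incline_pct": incline_pct}

print(exercise_drives_task(10.0))  # faster running -> faster avatar
print(task_drives_exercise(True))  # uphill scene -> inclined track
```

Either mapping, or both, may run in a control module's loop, depending on which embodiment is selected.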
[0207] The system 10 transfers the task (or another) feature
between the first and second exercise modules 20A, 20B by
transmitting the features of the task (e.g., a status of the task,
a current stage of the user, a status of the simulated user, other
variables or parameters of the task, and the like) or other
features from the first modules 20A, 40A, 50A and then receiving
such features by the second modules 20B, 40B, 50B. It is
appreciated that such transfer may be performed directly between
such exercise modules 20A, 20B, may be performed by the output
module(s), may be controlled by the first (or second) control
module which monitors and supplies requisite features to such
exercise modules 20A, 20B and/or output modules, and the like.
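The transfer of task features between the first and second modules over a local or global network might, for example, use a simple serialized message. The wire format and field names below are purely illustrative assumptions, not part of the specification.

```python
import json

def encode_task_features(stage, score, avatar_position):
    """Pack a few task features (status of the task, current stage of
    the user, status of the simulated user) for transmission."""
    return json.dumps({
        "stage": stage,
        "score": score,
        "avatar": avatar_position,
    })

def decode_task_features(payload):
    """Unpack the features on the receiving modules."""
    return json.loads(payload)

msg = encode_task_features(stage=3, score=1200, avatar_position=[4, 7])
received = decode_task_features(msg)
print(received["stage"])  # -> 3
```

The same message could travel directly between exercise modules, through the output modules, or via a supervising control module, as described above.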
[0208] In addition, the first and/or second control modules 40 may
preferably compare the features of the task performed by the first
and second users, monitored features of such first and second
users, monitored features of such first and second exercises,
and/or monitored features of the operations of the first and second
exercise modules 20A, 20B. Based upon these features, the first
and/or second control modules 40 may manipulate the task feature of
such first and/or second users, may manipulate the operation
feature of such first and second exercise modules 20A, 20B, and the
like, whereby the users may compete with each other in the task. To this
end, the first and/or second control modules may manipulate such
features of the task either without altering any of such features
or after modifying at least one of the features at least partly
based on the preset relation. When the users engage in such
exercises of the same type as exemplified in the figure, the first
and/or second control modules may compare such features without any
alteration, although such control modules may also alter one of the
features when the users select the loads of the exercise modules
20A, 20B at different levels.
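The comparison of features with or without modification by a preset relation, as described above, may be sketched as follows. The linear load scaling is an assumed example relation, not one prescribed by the specification.

```python
def normalized_effort(raw_feature, load_level, reference_load=1.0):
    """Scale a monitored feature so that users exercising at different
    load settings can be compared fairly in the shared task."""
    return raw_feature * (load_level / reference_load)

# Same exercise type, same load: compare without any alteration.
print(normalized_effort(100.0, 1.0))  # -> 100.0
# Same exercise type, one user chose a heavier load: scale first.
print(normalized_effort(80.0, 1.5))   # -> 120.0
```

Under this assumed relation, a user producing a smaller raw feature at a heavier load can still outrank a user at the reference load.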
[0209] In another exemplary embodiment of this aspect of the
invention and as exemplified in FIG. 3B, an exemplary exercise
system 10 also includes two exercise modules 20A, 20B, at least one
control module (not shown in the figure), and two output modules
50A, 50B (only their portions included in the figure). As set forth
herein, such control and output modules 40, 50A, 50B may include at
least one or all of the aforementioned units thereof.
[0210] The system 10 includes two exercise modules 20A, 20B, a first
of which is similar or identical to that of FIG. 3A and a second of
which is a conventional weight lifting machine. The second exercise
module 20B includes a frame 21C, a pair of handles 21HC, a chair
23, and multiple weights 24, where such a frame 21C defines a basic
body of the second exercise module 20B, where such weights 24 are
selectively loaded and coupled to the handles 21HC, and where the
second user sits on the chair 23 and engages in weight lifting by
moving the handles 21HC. More particularly, the frame 21C encloses
the chair 23 and movably retains a pair of handles 21HC disposed in
locations accessible by a second user when sitting on the chair 23.
Multiple weights 24 are stacked behind the chair 23, and arranged
to be releasably loaded onto a connector (not shown in this figure)
which mechanically couples with the handles 21HC. These handles
21HC are arranged to pivot about centers of rotation (not shown in
the figure) and to be disposed at a shoulder level of the second
user sitting on the chair 23. Similar to that of FIG. 3A, the
second exercise module 20B may include numerous other parts not
shown in the figure but commonly seen in the prior art lifting
machines as well. Although not included in the figure, the exercise
modules 20A, 20B are disposed in different locations and include
one of the control and output modules 40A, 40B, 50A, 50B, where at
least one module in one location communicates with at least one
another module in another location via a global network.
[0211] In operation, multiple (e.g., two in this example) users
perform the same or different exercises on, with or against two
exercise modules 20A, 20B of the system 10. For example, a first
user sets a first exercise module 20A and/or a first control module
40A to perform first exercise and to provide the first images (or
sounds) for the task. The user turns on the first exercise module
20A, translates its track 22A at a preset speed, and begins walking
or running thereover. A second user also sets a second exercise
module 20B and performs second exercise of lifting the weights 24
by pivoting the handles 21HC. The second user selects second images
(and/or second sounds) for the task similar to that of FIG. 3A. The
first and/or second control modules monitor at least one feature of
the first and second exercises, first and second users, and/or
operation of the first and second exercise modules 20A, 20B,
respectively, and manipulate at least one feature of the task at
least partly based upon the monitored feature, whereby the first
and second users perform such exercises while performing the task
directly or indirectly based upon the first and/or second
exercises. Alternatively, the first and/or second control modules
monitor at least one feature of the task performed by the first
and/or second users and displayed on such images for the task, and
manipulate at least one feature of at least one operation of the
first and/or second exercise modules 20A, 20B at least partly based
on the monitored feature, whereby the first and second users perform
such first and second exercises whose features are directly or
indirectly decided by the task in which the users proceed, in which
the first and second users compete with each other, and the like.
[0212] In addition, the first and/or second control modules 40 may
preferably compare the features of the task performed by the first
and second users, monitored features of such first and second
users, monitored features of such first and second exercises,
and/or monitored features of the operations of the first and second
exercise modules 20A, 20B. Based upon these features, the first
and/or second control modules 40 may manipulate the task feature of
such first and/or second users, may manipulate the operation
feature of such first and second exercise modules 20A, 20B, and the
like, whereby the users may compete with each other in the task. To this
end, the first and/or second control modules may manipulate such
features of the task either without altering any of such features
or after modifying at least one of the features at least partly
based on the preset relation. When the users engage in such
exercises of the different type, such first and/or second control
modules may compare such features and then modify at least one of
such features as explained in the case of the equivalent
conversion.
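An equivalent conversion between features of different exercise types might, for illustration, map both exercises onto a common effort scale so the runner and the lifter can compete in one task. The conversion factors below are assumptions chosen only so the two users come out comparable; they are not prescribed by the specification.

```python
def running_effort(track_speed_kmh):
    """Assumed conversion: effort units per km/h of track speed."""
    return track_speed_kmh * 10.0

def lifting_effort(weight_kg, reps_per_min):
    """Assumed conversion: effort units per kg x rep/min of lifting."""
    return weight_kg * reps_per_min * 0.5

# A runner at 12 km/h and a lifter doing 20 reps/min with 12 kg
# come out comparable under this assumed relation.
print(running_effort(12.0), lifting_effort(12.0, 20.0))  # -> 120.0 120.0
```

A control module could then compare the two converted values directly when deciding which simulated user advances in the task.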
[0213] In another aspect of the present invention, a simulating
exercise system may simulate a user thereof into at least one
simulated user defined in a task and manipulate various features of
the user, task, exercise, and/or exercise module at least partly
based on at least one feature of the same type or a different type.
FIG. 4A shows a schematic perspective view of an exemplary
simulating exercise system including an exercise module and
simulating a user of the exercise module into a simulated user
which is defined in a task and which is arranged to compete against
a preset program stored in and/or provided to the system according
to the present invention. It is to be understood that not every
module and/or unit of the simulating system may be shown in the
figure but that those modules and/or units thereof described
hereinabove as well as those of the co-pending Applications may be
included in the system for performing various functions as set
forth herein and in the co-pending Applications, respectively. It
is also appreciated that any of the above units of the control and
conversion modules may be incorporated into various exposed and/or
hidden locations of the simulating exercising system. It is further
appreciated that an upper panel of the figure represents the
perspective view of the entire system, while a lower panel of the
figure visually explains a preset task defined for a user who may
be engaged in exercise also provided by the system.
[0214] An exemplary simulating exercise system 10 is similar to
that of FIG. 3A, except that the figure only focuses on one
exercise module 20 disposed in one location. Accordingly, it is
appreciated that the system 10 includes at least another
exercise module which is not shown in this figure but is disposed
in a different location and on, with or against which another user
performs the same, similar or different exercise. When desired, the
user exercising on the exercise module 20 of this figure may be
arranged to compete against a control module (not shown in the figure) of
the system 10 in a common task of a story, scenery or game.
[0215] As described in the lower panel of FIG. 4A, the control
module (or game console) defines the preset task which is to be
performed by the user. In this particular example, the task is
similar to the prior art video game which has been known as the
"Pac Man" or an equivalent thereof which defines the image domain
51M on which several mobile and/or stationary objects and a
stationary background are defined therein. More particularly, the
task defines a two-dimensional background which consists of
multiple rectangular blocks which are arranged in rows and columns
and spaced away from each other while providing vertical and
horizontal routes therebetween. The task defines a simulated user
81, multiple opposing users 82, and multiple credits 83, where the
simulated user 81 may preferably be manipulated by the user and
travel vertically and/or horizontally along the routes defined
between the blocks of the background, while catching the credits 83
and avoiding encounter with the opposing users 82. In addition, the
task may define a preset goal such as, e.g., catching all of such
credits 83, surviving through the task for a preset period of time
without being attacked by such opposing users 82, and the like. In
general, the task may dispose the credits 83 in preset locations
along such routes as defined by a preset program stored in the
control module 40 (or game console), as selected based on the user
inputs, as determined by at least one of such features of the user,
task, exercise, and/or exercise module 20 (or operations thereof),
and the like. Similarly, the task may dispose the opposing users 82
in preset locations along the routes and move the opposing users 82
along such routes in a preset manner which may be determined by a
preset program stored in such a control module 40 (or game
console), which may be selected based upon the user inputs, which
may be determined by at least one of the above features of the
user, task, exercise, and/or exercise module 20 (or operations
thereof), and the like.
[0216] The task may be arranged to define multiple stages each of
which may be provided to the user in a preset sequence which may be
decided at least partly based on a preset program of the control
module 40 or game console, based on the control and/or converted
signals, based on the user inputs provided by the user of the
exercise module 20 of the same or different system 10, and the
like. Each of such stages may also define identical, similar or
different objects and/or background therein, may define levels or
difficulties of different extents by varying characteristics of the
opposing users 82 or credits 83, and the like. The task may also
assign various thresholds onto those stages such that the simulated
user may proceed from one stage to the next one when the simulated
user accomplishes a preset goal in that stage.
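The stage progression with per-stage thresholds described above may be sketched as follows; the threshold values and function names are illustrative assumptions.

```python
# Credits needed to clear stages 1..3 (assumed values).
STAGE_THRESHOLDS = [10, 25, 50]

def next_stage(current_stage, credits_caught):
    """Advance the simulated user to the next stage when the preset
    goal for the current stage is met; otherwise stay put."""
    if current_stage <= len(STAGE_THRESHOLDS) and \
            credits_caught >= STAGE_THRESHOLDS[current_stage - 1]:
        return current_stage + 1
    return current_stage

print(next_stage(1, 10))  # goal met -> stage 2
print(next_stage(2, 12))  # goal not met -> stays at stage 2
```

The same thresholds could equally be chosen by a preset program of the control module 40, by the user inputs, or by monitored exercise features, as the paragraph above describes.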
[0217] The control and/or conversion modules 40, 70 may then be
arranged to manipulate at least one of various features of the
above task such that the simulated user 81 defined in the image
domain 51M may proceed through such stages of the task at least
partly based on at least one of the features of the user, exercise,
and/or exercise module 20 (i.e., various operations thereof).
Therefore, the user of the exercise module 20 may manipulate the
simulated user 81 of the task to proceed through such stages of the
task while performing exercise thereon, therewith, and/or
thereagainst. Alternatively, such control and/or conversion modules
40, 70 may be arranged to manipulate at least one of various
features of the exercise and/or exercise module 20 such that the
control module 40 may manipulate at least one operation of the
exercise module 20 and/or exercise provided by such a module 20 at
least partly based on at least one of such features of the task.
Accordingly, the user of the control module 40 (or game console)
may manipulate the simulated user 81 of the task to proceed through
the stages of the task while performing the exercise of which
features are determined at least partly based upon at least one of
such features of the task.
[0218] Other than the Pac Man game exemplified herein, the control
module 40 (or game console) may be arranged to provide the user
with different audiovisual games each of which may define the above
or similar features, while requiring the user to resort to specific
means of accomplishing the task goal such as, e.g., by fighting an
opposing user manipulated by a preset program and/or another user,
by proceeding against opposing users manipulated by the preset
program or another user, by arriving at a preset stage or a preset
location thereof without or against manipulation by such a preset
program or another user, finding a hidden object without or against
such manipulation, and the like. Details of the task provided by
the control module 40 (and/or game console), however, may not
necessarily be critical to the scope of the present invention, as
long as the user may manipulate various task features defined in
the image domain 51M at least partly based on such features of the
user, exercise, and/or exercise module 20 while performing such
exercise on, with, and/or against the exercise module 20, as long
as the exercise module 20 may change at least one of its operations
at least partly based on such features of the user, task, and/or
exercise offered by such a module 20, and the like.
[0219] Depending on the nature of the task, the control module 40
(or game console) may manipulate only a single preset feature or
multiple features of the simulated user. For example, the control
module 40 may convert one or more features of the user and/or
exercise into such control and/or converted signals and move the
simulated user 81 in a single or multiple directions at preset or
variable speeds, either directly or in conjunction with the
conversion module 70 to incorporate the equivalence between
different features of the same or different types. As is manifest
in the figure, however, it is preferred that the control module 40
manipulate such a simulated user 81 to move in different directions
and/or at different speeds. To this end, the control module 40 may
utilize its input unit 41 to receive various user inputs capable of
manipulating at least one feature of the simulated user 81 in the
image domain 51M. For example, the input unit 41 may be disposed in
and/or operatively couple with the track 22 of the exercise module
20, monitor a force applied thereto by the user, a direction of a
movement of the user, and/or an acceleration thereof, extract an
intended user input therefrom, and manipulate at least one feature
of the simulated user 81 at least partly based thereupon. It is
appreciated in this example that the user may not only perform the
exercise but also provide the intended user input while walking or
running on such a track 22 by manipulating various features of his
or her exercise and that a single part of the system 10 may not
only receive the energy associated with the exercise but also
receive the user input for manipulating the simulated user 81. In
another example, such an input unit 41 may be disposed on and/or
operatively couple to the handle 21H of the exercise module 20,
monitor a force and/or torque applied thereto by the user, extract
the intended user input therefrom, and manipulate at least one
feature of the simulated user 81 at least partly based thereon. In
another example, the input unit 41 may be portably carried by
and/or disposed on the user, monitor or receive the user input, and
manipulate at least one feature of the simulated user 81 at least
partly based thereon. It is appreciated in these last two examples
that the system 10 includes at least two parts, i.e., at least one
major part for receiving the energy from the user for such exercise
and at least one minor part for receiving the user input for
manipulating the simulated user 81 in the task, that the track 22
of the exercise module 20 may function as the major part in this
embodiment, and that the input unit 41 of the control module 40 may
function as the minor part herein.
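For purposes of illustration only, the extraction of an intended user input from track measurements described above may be sketched as follows. This is a non-limiting assumption: the function name, feature names, and thresholds are hypothetical and do not appear in the specification.

```python
# Illustrative sketch: deriving an intended user input for the simulated
# user 81 from readings taken at the track 22 of the exercise module 20.
# Threshold values and command names are assumed for illustration.

def extract_user_input(force_n, direction_deg, accel_ms2,
                       force_threshold=50.0, accel_threshold=0.5):
    """Map raw track readings to a command for the simulated user.

    force_n       -- force the user applies to the track, in newtons
    direction_deg -- lateral lean of the user's movement, in degrees
                     (negative = left, positive = right)
    accel_ms2     -- acceleration of the user along the track, in m/s^2
    """
    command = {"turn": "none", "speed_change": "none"}
    # A sustained lateral lean is read as a steering input.
    if direction_deg < -10:
        command["turn"] = "left"
    elif direction_deg > 10:
        command["turn"] = "right"
    # A burst of force or acceleration is read as a speed-up request;
    # a deceleration is read as a slow-down request.
    if force_n > force_threshold or accel_ms2 > accel_threshold:
        command["speed_change"] = "faster"
    elif accel_ms2 < -accel_threshold:
        command["speed_change"] = "slower"
    return command
```

In such a sketch, the same track readings both quantify the exercise and carry the user input, consistent with a single part of the system serving both roles.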
[0220] It is to be understood that the major and minor parts may be
provided in various arrangements. For example, the major part may
be arranged to receive energy from the user while acquiring the
user input at least partly based on a direction of input force
related and/or associated with such energy, a velocity thereof, an
acceleration thereof, a duration thereof, and the like, where the
energy supplied to such a minor part and associated with the user
input may correspond to at most 5%, 10%, 15%, 20%, 25%, 30%, 35%,
40%, 45%, 50%, 55%, 60%, 65%, 70%, 75%, 80%, 85% or 90% of another
energy supplied to the above major part. The major part may have a
configuration and/or may be incorporated into locations capable of
contacting a foot or feet of the user, a leg or legs thereof, a
thigh or thighs thereof, a back thereof, a belly thereof, a side or
sides thereof, a finger or fingers thereof, a hand or hands
thereof, an arm or arms thereof, a shoulder or shoulders thereof, a
head thereof, and/or a neck thereof. The minor part may similarly
define a configuration and/or may be incorporated into locations
capable of contacting a foot or feet of the user, a leg or legs
thereof, a thigh or thighs thereof, a back thereof, a belly
thereof, a side or sides thereof, a hand or hands thereof, an arm
or arms thereof, a neck thereof, a shoulder or shoulders thereof, a
head thereof, a finger or fingers thereof, and the like. The major
part may be designed and/or disposed to contact a first body part
which may be capable of providing more energy than another energy
capable of being provided by a second body part to said minor part.
The major and minor parts may be designed and/or disposed to
respectively allow the first and second body parts to move and/or
to be depressed in the same or different directions. The major and
minor parts may be designed and/or disposed to respectively contact
such first and second body parts which are spaced away from each
other by at least 1 cm, 2 cm, 3 cm, 4 cm, 5 cm, 7 cm, 10 cm, and
the like. The minor part may be disposed away from the major part,
around the major part, inside the major part, or side by side with
respect to the major part, may be disposed at an elevation higher
than that of the major part, may be recessed, may be flush
with the major part, and the like. Therefore, the user may provide
various input signals to move the simulated user 81 along a desired
direction, to perform a preset function, to cope with the opposing
user 82, and the like.
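The energy criterion distinguishing the major and minor parts above may be sketched, purely as an illustrative assumption, by a simple fractional comparison (the function name and the default 50% cap are hypothetical; the specification recites caps from 5% to 90%):

```python
# Illustrative sketch: a candidate minor part qualifies when the energy
# it receives does not exceed a chosen fraction of the energy received
# by the major part. The 0.5 default is one of the recited caps.

def is_minor_part(minor_energy_j, major_energy_j, max_fraction=0.5):
    """Return True when the minor-part energy is within the cap."""
    return minor_energy_j <= max_fraction * major_energy_j
```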
[0221] In operation, the user supplies various settings for desired
exercise in which the user intends to be engaged and for a desired
task or, more particularly, the user selects which features
of a task are to be provided by the system 10 in a desired mode and
provides settings thereof. Thereafter, the user turns on the
exercise module 20, moves the track 22 at a desired speed, and
begins walking and/or running on the track 22. As the user begins
the exercise, the output module provides the user with such images,
sounds, and/or virtual environment with intended features. For
example, the output module may provide the user with the visual
feature of the virtual environment by displaying desired images
and/or visual feature of the environment on the image domain 51M of
its visual unit 51, where the images may be a still picture, a
series of still pictures, a video clip, a combination thereof, and
the like. It is preferred, however, that the output module may
manipulate the visual unit 51 to display such images and/or visual
feature related and/or associated with a desired type and/or extent
of exercise and/or task selected by the user. When commanded by the
user, the output module may provide such sounds and/or virtual
environment with the auditory feature, where it is preferred that
such sounds or auditory feature may be related or associated with
the type and/or extent of the exercise and/or task, the images
and/or visual feature displayed on the image domain 51M, and the
like. The output module may also provide the virtual environment
with the olfactory feature by giving off various smells related
and/or associated with such images, sounds, and/or virtual
environment so that the user is provided with not only the images
and sounds but also with the smells related thereto or associated
therewith. Upon being instructed, the output module may provide the
virtual environment with the tactile feature by generating various
sensations. It is preferred that the output module provide the
above features of the virtual environment in the preset perspective
with respect to the user as described above and that the output
module may vary at least one of the features and/or at least one of
temporal and/or spatial characteristics of at least one of such
features during the exercise based on various factors which have
also been disclosed above.
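The output module's dispatch of the visual, auditory, olfactory, and tactile features described above may be sketched, for illustration only, as a mapping from enabled features to installed units (all class, unit, and feature names here are assumed, not taken from the specification):

```python
# Illustrative sketch: the output module renders each enabled feature
# of the virtual environment for which a corresponding unit exists,
# and silently skips features whose units are absent.

class OutputModule:
    def __init__(self, units):
        # units: mapping from feature name to a callable that renders it,
        # e.g. {"visual": display_images, "auditory": play_sounds}
        self.units = units

    def present(self, enabled_features, scene):
        """Render each enabled feature for which a unit is installed."""
        rendered = []
        for feature in enabled_features:
            unit = self.units.get(feature)
            if unit is not None:
                rendered.append(unit(scene))
        return rendered

# Example: only visual and auditory units are installed, so an enabled
# olfactory feature is skipped rather than raising an error.
out = OutputModule({
    "visual": lambda scene: f"displaying {scene}",
    "auditory": lambda scene: f"playing sounds for {scene}",
})
result = out.present(["visual", "olfactory"], "mountain trail")
```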
[0222] Depending on a preset mode of operation, the control module
40 acquires at least one feature of the user, task, exercise,
and/or exercise module 20 (i.e., operations thereof) using its input
and/or sensor units 41, 42, and transmits the acquired feature to
the conversion module 70. The conversion module 70 is generally
charged with providing a preset relation relating the acquired
feature to at least one of multiple features of the exercise (or
task), generating the equivalence relating such features of the
same type (or different types) based on the relation, and then
converting the acquired feature into the control and/or converted
signals. In general, the conversion module 70 performs these
functions through its converter unit 72. Thereafter, such a
conversion module 70, with its simulator unit 71, may simulate the
user as the simulated user 81 and then display the simulated user
81 on the image domain 51M of the output unit. The conversion
module 70 (with its driver and/or interface units 73, 74) and/or
control module 40 (using its control unit 44) may manipulate at
least one feature of the simulated user 81 in order to allow the
simulated user 81 to proceed through the preset stage of the task
for the goal of the task by, e.g., fighting against or avoiding the
opposing users 82, collecting such credits 83, and the like. More
importantly, such control and/or conversion modules 40, 70 may
manipulate at least one of multiple features of the simulated user
81 at least partly based upon at least one of multiple features of
the user, exercise, and/or exercise module 20, whereby the user may
manipulate various features of the simulated user 81 at least
partly based upon various features of the exercise which he or she
performs on, with, and/or against the exercise module 20.
Conversely, the control and/or conversion modules 40, 70 may
manipulate at least one of such features of at least one operation
of the exercise module 20 at least partly based upon at least one
of such features of the user, task, and/or exercise. Accordingly,
the user may perform the exercise which is in turn provided by the
exercise module 20 and of which the features may be at least partly
dependent upon various user features such as his or her physical or
physiological conditions, upon various task features such as a
status of the simulated user 81 in the task, and/or upon various
exercise features such as the type and/or extent thereof. In each
of the examples, various task features may also be manipulated in
various modes. For example, the control module 40 may manipulate
the task feature when such a module 40 is to provide the task. In
another example, the conversion module 70 may manipulate the task
feature when the system 10 is operatively coupled to the game
console which may be a part of the system 10 or a console external
thereto. In the latter case, the system 10 may be arranged to
directly manipulate the task feature or, in the alternative, may
generate and then supply the control and/or converted signals with
which such a game console may manipulate the task feature.
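The signal path of this paragraph, from acquired feature through the preset relation to a feature of the simulated user, may be sketched as follows, strictly as a non-limiting illustration: the class names, the linear relation, and the `"speed"` feature are all assumptions for the example and not part of the specification.

```python
# Illustrative sketch of the acquire-convert-manipulate path: the
# control module acquires an exercise feature, the conversion module
# applies a preset relation to produce a converted signal, and the
# corresponding feature of the simulated user 81 is updated.

class ConversionModule:
    def __init__(self, relation):
        # relation: preset mapping from an acquired exercise feature
        # (e.g. track speed) to a feature of the simulated user.
        self.relation = relation

    def convert(self, acquired_feature):
        return self.relation(acquired_feature)

class ControlModule:
    def __init__(self, conversion_module):
        self.conversion = conversion_module

    def update_simulated_user(self, simulated_user, acquired_feature):
        converted = self.conversion.convert(acquired_feature)
        simulated_user["speed"] = converted
        return simulated_user

# Example: an assumed linear equivalence in which the user's track
# speed drives the avatar's speed at twice the rate.
conversion = ConversionModule(lambda track_speed: 2.0 * track_speed)
control = ControlModule(conversion)
avatar = control.update_simulated_user({"speed": 0.0}, 1.5)
```

The converse mode described above would run the same relation in the opposite direction, deriving an exercise-module operation (e.g. track resistance) from a task feature.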
[0223] Such a system 10 may be arranged to allow the user to
manipulate at least one feature of the user, task, exercise, and/or
exercise module 20 (i.e., operations thereof) either directly or
indirectly. In one example, the user may directly supply the user
input to the system 10 which may then change at least one feature
of the task at least partly based thereon. In another example, the
user may supply the user input to the exercise module 20 which may
then deliver the control and/or converted signals to the control
and/or conversion modules 40, 70 for manipulating the task feature
at least partly based thereon. The system 10 may instead be
arranged to allow the user to change the task feature at least
partly based upon at least one feature of the user, task, exercise,
and/or exercise module 20 with or without accompanying changes in
the operations of the exercise module 20. Accordingly, the system
10 may adaptively change at least one feature of the task,
exercise, and/or operations of the exercise module 20 based on at
least one of such features acquired thereby.
[0224] In another aspect of the present invention, a simulating
exercise system may simulate a user thereof as at least one
simulated user defined in a task and manipulate various features of
the user, task, exercise, and/or exercise module at least partly
based on at least one feature of the same type or a different type.
FIG. 4B shows a schematic perspective view of an exemplary
simulating exercise system including an exercise module and
simulating a user of such a module as a simulated user of a task
playing a board game against another simulated user of another
exercise module according to the present invention. It is
appreciated that not every module and/or unit of the simulating
system may be incorporated into the figure but that those modules
and/or units thereof described hereinabove as well as those of the
co-pending Applications may be incorporated into the system so as
to perform various functions as set forth herein and in the
co-pending Applications, respectively. It is further appreciated
that any of the above units of the control and conversion modules
may also be incorporated to various exposed and/or hidden locations
of the simulating exercising system. It is further appreciated that
an upper panel of the figure represents the perspective view of the
entire system, while a lower panel of the figure visually explains
a preset task defined for a user engaged in exercise which is
provided by the system as well.
[0225] An exemplary simulating exercise system 10 is similar to
that of FIG. 3B, except that the figure only focuses on one
exercise module 20 disposed in one location. Accordingly, it is
appreciated that the system 10 includes at least one other
exercise module which is not shown in this figure but is disposed
in a different location and on, with or against which another user
performs the same, similar or different exercise. When desired, the
user exercising on the exercise module 20 of this figure may be
arranged to compete against a control module (not shown in the
figure) of the system 10 in a common task of a story, scenery or game.
[0226] As described in the lower panel of FIG. 4B, the control
module (or game console) defines the preset task which is to be
performed by the user. In this particular example, the task is
similar to the prior art "go" game or an equivalent thereof which
defines the image domain 51M on which nineteen horizontal and
vertical lines are to intersect each other. More particularly, such
a task defines a two-dimensional background which consists of
multiple points of intersection of such lines arranged in a
19-by-19 matrix. The task defines multiple simulated users 81 and
multiple opposing users 82, where the simulated users 81 may be
represented by one of black and white marbles, whereas the opposing
users 82 may be represented by the other of such marbles. Such
simulated users 81 may preferably be manipulated by the user and
disposed in any of the above intersections of the background, while
competing against the opposing users 82 according to preset rules
of the task such as, e.g., creating as much territory as possible
while surrounding and capturing such opposing users 82. In general,
the task may decide where the opposing users 82 are to be disposed
in response to positioning of the simulated users 81 based on a
preset program stored in the control module 40 (or game console),
as selected based upon the user inputs, as determined by at least
one of such features of the user, task, exercise, and/or exercise
module 20 (or operations thereof), and the like.
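The 19-by-19 placement task described in this paragraph may be sketched, as a non-limiting illustration, by a minimal board model. Only placement on empty intersections is modeled here; the capture and territory rules of the full game are omitted, and all names are assumed for the example.

```python
# Illustrative sketch of the board-game task: a 19-by-19 grid of
# intersections on which the simulated users 81 (one color) and the
# opposing users 82 (the other color) are placed.

SIZE = 19
EMPTY, SIMULATED, OPPOSING = ".", "B", "W"

def new_board():
    return [[EMPTY] * SIZE for _ in range(SIZE)]

def place(board, row, col, stone):
    """Place a stone on an empty intersection; return True on success."""
    if 0 <= row < SIZE and 0 <= col < SIZE and board[row][col] == EMPTY:
        board[row][col] = stone
        return True
    return False

board = new_board()
place(board, 3, 3, SIMULATED)   # the user positions a simulated user 81
place(board, 15, 15, OPPOSING)  # the task responds with an opposing user 82
```

In the system described above, the `(row, col)` chosen for the simulated user would itself be derived from the user, exercise, and/or exercise-module features rather than typed directly.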
[0227] The task may define multiple stages each provided to the
user in a preset sequence which may be decided at least partly
based on a preset program of the control module 40 or game console,
based on the control and/or converted signals, based on the user
inputs provided by the user of the exercise module 20 of the same
or different system 10, and the like. Each of such stages may
define levels or difficulties of different extents by varying
skills of such opposing users 82.
[0228] The control and/or conversion modules 40, 70 may then be
arranged to manipulate at least one of various features of the
above task such that the simulated user 81 defined in the image
domain 51M may be positioned at least partly based upon at least
one of the features of the user, exercise, and/or exercise module
20 (i.e., various operations thereof). Therefore, the user of the
exercise module 20 may manipulate the simulated users 81 of the
task to be disposed in preferable positions of the image domain 51M
of such a task while performing exercise thereon, therewith, and/or
thereagainst. In the alternative, the control and/or conversion
modules 40, 70 may be arranged to manipulate at least one of the
features of the exercise and/or exercise module 20 so that the
control module 40 may control at least one operation of the
exercise module 20 and/or exercise provided by such a module 20 at
least partly based on at least one of such features of the task.
Accordingly, the user of the control module 40 (or game console)
may manipulate the simulated users 81 while performing the exercise
of which features are determined at least partly based upon at
least one of such features of the task.
[0229] Other than the go game exemplified herein, the control
module 40 and/or game console may be arranged to provide the user
with different board and/or card games each of which may define
similar or different features, while requiring the user to resort
to specific means of accomplishing the goal of the task such as,
e.g., by positioning multiple simulated users while competing
against opposing users, by moving one or more simulated users
against one or more opposing users, by collecting preferable
simulated users (or cards) from a given set of cards, and the like.
Details of the task provided by the control module 40 (and/or game
console), however, may not necessarily be critical to the scope of
the present invention, as long as the user may manipulate such task
features defined in the image domain 51M at least partly based upon
such features of the user, exercise, and/or exercise module 20
while performing such exercise on, with or against the exercise
module 20, as long as the exercise module 20 may change at least
one of its operations at least partly based on such features of the
user, task, and/or exercise offered by such a module 20, and the
like.
[0230] In operation, the user supplies various settings for desired
exercise in which the user intends to be engaged and for a desired
task or, more particularly, the user selects which features
of a task are to be provided by the system 10 in a desired mode and
provides settings thereof. Thereafter, the user couples a desired
number of the weights 24 with the handle 21H, sits on the chair 23
of the exercise module 20, grabs the handle 21H, and then begins
lifting the weights 24 by pivoting and/or reciprocating the handle
21H. As the user begins the exercise, the output module provides
the user with such images, sounds, and/or virtual environment with
intended features, while providing the user with the images,
sounds, and/or virtual environment as disclosed in conjunction with
FIG. 4A.
[0231] Depending on a preset mode of operation, the control module
40 acquires at least one feature of the user, task, exercise,
and/or exercise module 20 (i.e., operations thereof) using its
input and/or sensor units 41, 42, and transmits the acquired
feature to the conversion module 70. The conversion module 70 is
generally charged with providing a preset relation relating the
acquired feature to at least one of multiple features of the
exercise (or task), generating the equivalence relating such
features of the same type (or different types) based on the
relation, and then converting the acquired feature into the control
and/or converted signals. In general, the conversion module 70
performs these functions through its converter unit 72. Thereafter,
such a conversion module 70, with its simulator unit 71, may
simulate the user as the simulated user 81 and then display the
simulated user 81 on the image domain 51M of the output unit. The
conversion module 70 (with its driver and/or interface units 73,
74) and/or control module 40 (using its control unit 44) may
manipulate at least one feature of the simulated user 81 in order
to allow the simulated user 81 to proceed through the preset stage
of the task for the goal of the task by, e.g., fighting against or
avoiding the opposing users 82, collecting such credits 83, and the
like. More importantly, such control and/or conversion modules 40,
70 may manipulate at least one of multiple features of the
simulated user 81 at least partly based upon at least one of
multiple features of the user, exercise, and/or exercise module 20,
whereby the user may manipulate various features of the simulated
user 81 at least partly based upon various features of the exercise
which he or she performs on, with, and/or against the exercise
module 20. Conversely, the control and/or conversion modules 40, 70
may manipulate at least one of such features of at least one
operation of the exercise module 20 at least partly based upon at
least one of such features of the user, task, and/or exercise.
Accordingly, the user may perform the exercise which is in turn
provided by the exercise module 20 and of which the features may be
at least partly dependent upon various user features such as his or
her physical or physiological conditions, upon various task
features such as a status of the simulated user 81 in the task,
and/or upon various exercise features such as the type and/or
extent thereof. In each of the examples, various task features may
also be manipulated in various modes. For example, the control
module 40 may manipulate the task feature when such a module 40 is
to provide the task. In another example, the conversion module 70
may manipulate the task feature when the system 10 is operatively
coupled to the game console which may be a part of the system 10 or
a console external thereto. In the latter case, the system 10 may
be arranged to directly manipulate the task feature or, in the
alternative, may generate and then supply the control and/or
converted signals with which such a game console may manipulate the
task feature. Other configurational or operational characteristics
of the exercise system 10 of FIG. 4B are similar or identical to
those of the system of FIG. 4A.
[0232] Configurational and/or operational variations and/or
modifications of the above embodiments of various exemplary
exercise systems, their modules, or units shown in FIG. 1, FIGS. 2A
to 2F, FIGS. 3A and 3B, and FIGS. 4A and 4B also fall within the
scope of this invention.
[0233] As shown in conjunction with FIG. 1, the system typically
consists of three types of modules, where the control module
includes four units, while the output module includes five units.
However, the control and/or output modules may not necessarily
include all of the units. Accordingly, the control module may only
include the input and control units, whereas the output module may
include only the visual unit. In other words, the exact number of those
units of the control and output modules may not be critical to the
scope of this invention as far as each module performs its intended
functions. Similarly, various units of such modules may be deemed
to belong to other modules different from those set forth in FIG.
1. For example, the control unit may belong to the output module,
while the storage unit may belong to the exercise modules. At least
a portion of at least one of the visual, auditory, and/or display
units of the output module may be incorporated into the control
module. In other words, classifications of such units are not
critical to the scope of this invention as far as each unit
performs its intended function.
[0234] By the same token, such exercise modules may not be necessary
for the system of this invention. To the contrary, the system may be
deemed to include the control and output modules, where the
exercise modules may be the external equipment to which the system
is operatively coupled. The same applies to other auxiliary modules
needed for various operations of the system. For example, a power
supply module may be required to power such a system, where the
power supply module may or may not be deemed to be a module of the
system. The system may also require at least one support which may
physically retain various modules, where the support may or may not
be deemed as a part thereof. That is, the exercise system
requires the control and output modules, the control module
requires the control unit, the output module requires the visual
unit, and so on. Other units of the control and output modules set
forth in this description, accordingly, may be deemed as optional
units of the system.
[0235] The output module may operatively couple to multiple
exercise modules and generate the same or different images (and/or
sounds) for multiple users simultaneously performing the exercises.
Such a control module may perform the same control function for
each exercise module while providing the same or different images
(or sounds) for the user(s) or, alternatively, perform at least
one different control function for each exercise module. When
desirable, the control module may operatively couple to multiple
exercise modules of different systems which are disposed in the
single location, where this control module is then deemed as a
"common" module for multiple exercise systems.
[0236] Each visual unit preferably defines at least one image
domain for displaying the images for the task of the story, scenery
or game thereon. Such a visual unit may utilize an entire portion
of its image domain for displaying the images. When desirable, at
least one visual unit may define multiple portions in the image
domain, where such portions may define the same or different shapes
and/or sizes, may be arranged in rows and/or columns, may be
disposed symmetrically or asymmetrically, and the like.
[0237] The visual units may be arranged to manipulate the
configurations or dispositions of the images and/or their domains.
For example, at least one visual unit may form the image domain,
define a preset number of portions therein, and change a shape or
size of at least one of the portions while changing shapes or sizes
of the rest thereof or, alternatively, while maintaining the shapes
or sizes of the rest thereof. At least one visual unit may also
define the image domain, define a preset number of portions
therein, and change a disposition of at least one of such portions
while maintaining dispositions of the rest. At least one visual
unit may define the image domain, form a preset number of portions
therein while assigning the object and/or background thereto, and
change assignments of at least one of the portions so that, e.g.,
one portion assigned with a single object may be assigned with the
background, another object, a combination thereof, and so on. In
addition, at least one visual unit may be arranged to change the
number of portions defined in the image domain during exercise. At
least one visual unit may be arranged to assign the object and
background to such portions of the image domain based on various
arrangements. For example, the visual unit may assign only one of
the object and background into each portion so that the object may
be assigned to one of such portions, while the background or
another object may be assigned to another thereof. In an opposite
example, the visual unit may assign the object and background to
one or both of the portions.
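The manipulation of image-domain portions described in this paragraph may be sketched, purely for illustration, by a small portion model whose shape, disposition, and assignment can each be changed independently (the class and attribute names, and the 1920x1080 dimensions, are assumptions):

```python
# Illustrative sketch of an image domain 51M divided into portions:
# each portion has a position, a size, and an assigned content (an
# object, the background, or the like), any of which may be changed
# during exercise as described above.

class Portion:
    def __init__(self, x, y, width, height, assignment="background"):
        self.x, self.y = x, y
        self.width, self.height = width, height
        self.assignment = assignment

    def resize(self, width, height):
        # change the shape/size of this portion only
        self.width, self.height = width, height

    def move(self, x, y):
        # change the disposition of this portion only
        self.x, self.y = x, y

    def reassign(self, assignment):
        # e.g. a portion showing one object may later show the background
        self.assignment = assignment

# Example: split an assumed 1920x1080 image domain into two
# side-by-side portions, then change one assignment and one size.
left = Portion(0, 0, 960, 1080, assignment="object")
right = Portion(960, 0, 960, 1080, assignment="background")
right.reassign("object")   # change the assignment of one portion
left.resize(1280, 1080)    # change the size of one portion
```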
[0238] The visual and/or control units may further be arranged to
manipulate the configurations and/or dispositions of the image
domain as well as its portions. For example, such unit(s) may first
form the image domain, define a preset number of portions therein,
and then change a shape and/or size of at least one of such
portions while changing shapes and/or sizes of the rest of such
portions or, in the alternative, while maintaining the shapes
and/or sizes of the rest thereof. Such unit(s) may also first form
the image domain, define a preset number of portions therein, and
then change a disposition of at least one of the portions while
maintaining dispositions of the rest of such portions. Such unit(s)
may also first form the image domain, define a preset number of
portions therein while assigning the object and/or background
thereto, and thereafter change such assignment for at least one of
such portions such that, e.g., one portion assigned with a single
object may then be assigned with the background, another object, a
combination thereof, and the like. In addition, such unit(s) may
also be arranged to change the number of portions defined in the
image domain during exercise.
[0239] The exercise system may also include at least one input unit
capable of receiving various user inputs supplied thereto by the
user(s). The system may also receive such user inputs by the
exercise or control modules by including various prior art input
devices as set forth hereinabove. Such user(s) may supply the user
inputs by applying mechanical, thermal, and/or electric signals to
various parts of the exercise and/or output modules, by generating
body movements which may be monitored by such parts of the
exercise, control, and/or output modules, by generating voice
signals, face signals, and/or body signals which may similarly be
monitored by those parts of such exercise, control, and/or output
modules, and the like. The system may then utilize such user inputs
for manipulating the task feature, whether or not the body
movements of the user(s) required for generating such user inputs
may be necessary or commensurate for consuming the energy of the
exercising user(s). The control module may further extract the user
inputs by monitoring and analyzing the body movements of the
user(s), voices thereof, facial expressions thereof, and/or other
body signals.
[0240] The task of the story, scenery, and/or game may be provided
in such images using the control module alone, by the external
story, scenery or game console alone, by both, and the like, where
the control module may provide the images and optional sounds,
smells, and/or sensations for the task to the output module for
providing such features of the virtual environment. The control
module may also manipulate at least one operation feature of at
least one of the exercise modules at least partly based on the
exercise and/or user features.
[0241] Unless otherwise specified, various features of one
embodiment of one aspect of the present invention may apply
interchangeably to other embodiments of the same aspect of this
invention and/or embodiments of one or more of different aspects of
the present invention. Accordingly, any module of the system may be
equipped with communication capabilities in order to communicate
with at least one other module thereof, as long as the part of
the system disposed in each of such locations may have such
capabilities. In other words, such communication may not have to be
performed solely by the control module and may rather be performed
by the exercise and/or output modules 20, 50.
[0242] As described hereinabove, various systems, methods, and/or
processes of this invention may be applied to any prior art
exercise equipment. For example, such systems, methods, and
processes may be applied to the exercise equipment normally
requiring its user(s) to perform such physical work thereon or
thereagainst. In another example, such systems, methods, and
processes may be applied to the exercise equipment providing the
user(s) physical and/or electrical energy while forcing and/or
facilitating the body of the user(s) to vibrate or twitch the
muscles thereof based thereon. In another example, the systems,
methods, or processes of this invention may be applied to convert any
prior art equipment not intended as the exercise system of this
invention. Therefore, any conventional devices primarily intended
to engage the user(s) in playing physically simulated games or
video games may be converted to the exercise system of this
invention which may improve or enhance the muscle tone of the
user(s), increase the muscle mass or volume thereof, force and/or
facilitate the user to reduce the weight, increase the physical
stamina of the user, and the like.
[0243] It is to be understood that, while various aspects and
embodiments of the present invention have been described in
conjunction with the detailed description thereof, the foregoing
description is intended to illustrate and not to limit the scope of
the invention, which is defined by the scope of the appended
claims. Other embodiments, aspects, advantages, and modifications
are within the scope of the following claims.
* * * * *