U.S. patent application number 13/084488 was filed with the patent office on 2011-04-11 and published on 2011-10-13 as publication number 20110250962 for a system and method for a 3D computer game with true vector of gravity. The invention is credited to Steven K. Feiner and Ohan Oda.
United States Patent Application 20110250962
Kind Code: A1
Feiner; Steven K.; et al.
October 13, 2011

SYSTEM AND METHOD FOR A 3D COMPUTER GAME WITH TRUE VECTOR OF GRAVITY
Abstract
A computer interaction system includes an augmented interaction
device and a computer. The augmented interaction device includes a
display device that displays augmented reality or virtual reality
images, a first tracking mechanism that tracks a position of a
physical object relative to the first tracking mechanism and an
orientation of the physical object relative to the first tracking
mechanism, and a second tracking mechanism that tracks a position
of the second tracking mechanism relative to a reference and an
orientation of the second tracking mechanism relative to the
reference. The computer includes a processor and a memory, and
processes position and orientation information received from the
first tracking mechanism, and position and orientation information
received from the second tracking mechanism. The computer can be
configured to compensate for movement of the first and second
tracking mechanisms relative to the physical object, and to output
augmented reality or virtual reality information to the display
device.
Inventors: Feiner; Steven K. (New York, NY); Oda; Ohan (New York, NY)
Family ID: 44761324
Appl. No.: 13/084488
Filed: April 11, 2011
Related U.S. Patent Documents

Application Number: 61322521
Filing Date: Apr 9, 2010
Current U.S. Class: 463/31
Current CPC Class: A63F 13/57 20140902; A63F 2300/1043 20130101; A63F 13/212 20140902; A63F 2300/204 20130101; A63F 2300/1093 20130101; A63F 2300/6045 20130101; A63F 2300/105 20130101; A63F 13/211 20140902; A63F 13/428 20140902; A63F 2300/8082 20130101; A63F 13/213 20140902
Class at Publication: 463/31
International Class: A63F 9/24 20060101 A63F009/24
Claims
1. An interaction delivery device comprising: a display device
configured to display virtual objects; a first tracking mechanism
configured to track one or more of a position of a physical object
relative to the first tracking mechanism and an orientation of the
physical object relative to the first tracking mechanism; and a
second tracking mechanism configured to track one or more of a
position of the second tracking mechanism relative to a reference
and an orientation of the second tracking mechanism relative to the
reference, wherein one or more of position information received
from the first tracking mechanism and orientation information
received from the first tracking mechanism is used to generate
motion data for the physical object, wherein one or more of
position information received from the second tracking mechanism
and orientation information received from the second tracking
mechanism is used to generate adjusted motion data for the physical
object, wherein the adjusted motion data for the physical object
compensates for movement relative to the physical object of at
least one of the first tracking mechanism and the second tracking
mechanism, and wherein the adjusted motion data for the physical
object is used to generate virtual object information, and wherein
the virtual object information is received by the display device.
2. An interaction processing device comprising: a processor; a
memory; an input unit configured to receive information; and an
output unit configured to output information, wherein the input
unit is configured to receive at least one of physical object
position information, physical object orientation information,
tracking mechanism position information, and tracking mechanism
orientation information, wherein the processor is configured to
generate motion data for a physical object using at least one of
the physical object position information and the physical object
orientation information, wherein the processor is configured to
generate adjusted motion data for the physical object using at
least one of tracking mechanism position information and tracking
mechanism orientation information, wherein the adjusted motion data
for the physical object compensates for movement of a tracking
mechanism relative to the physical object, wherein the processor is
configured to generate virtual object information using the
adjusted motion data for the physical object, and wherein the
output unit is configured to output the virtual object
information.
3. A computer interaction system, comprising: an interaction
delivery device comprising: a display device configured to display
virtual objects, a first tracking mechanism configured to track one
or more of a position of a physical object relative to the first
tracking mechanism and an orientation of the physical object
relative to the first tracking mechanism, and a second tracking
mechanism configured to track one or more of a position of the
second tracking mechanism relative to a reference and an
orientation of the second tracking mechanism relative to the
reference; and a computer comprising: a processor, and a memory,
wherein the computer is configured to process at least one of
position information received from the first tracking mechanism,
orientation information received from the first tracking mechanism,
position information received from the second tracking mechanism,
and orientation information received from the second tracking
mechanism, wherein the computer is further configured to compensate
for movement relative to the physical object of at least one of the
first tracking mechanism and the second tracking mechanism, and
wherein the computer is further configured to output virtual object
information to the display device.
4. The computer interaction system of claim 3, wherein the first
tracking mechanism is an optical tracking mechanism.
5. The computer interaction system of claim 4, wherein the
interaction delivery device is configured to be head-worn and to
display see-through video.
6. The computer interaction system of claim 3, wherein the second
tracking mechanism comprises a three-axis accelerometer.
7. The computer interaction system of claim 3, wherein the second
tracking mechanism comprises a six-degree-of-freedom tracker
configured to determine a three-dimensional position of the second
tracking mechanism relative to the reference and a
three-dimensional orientation of the second tracking mechanism
relative to the reference.
8. The computer interaction system of claim 7, wherein the
reference is earth or is fixed to earth.
9. The computer interaction system of claim 7, wherein the
reference is fixed to an object moving relative to earth.
10. The computer interaction system of claim 3, wherein the second
tracking mechanism is configured to determine a true direction of a
natural force vector.
11. The computer interaction system of claim 10, wherein the
computer is configured to simulate motion of virtual objects based
on a true direction of a natural force and based on at least one of
position information received from the first tracking mechanism,
orientation information received from the first tracking mechanism,
position information received from the second tracking mechanism,
and orientation information received from the second tracking
mechanism.
12. The computer interaction system of claim 10, wherein the
computer is configured to simulate motion of virtual objects based
on a direction of force that is different from the true direction
of the natural force and based on at least one of position
information received from the first tracking mechanism, orientation
information received from the first tracking mechanism, position
information received from the second tracking mechanism, and
orientation information received from the second tracking
mechanism.
13. The computer interaction system of claim 3, wherein the
computer further comprises a physics engine, and wherein the
computer is configured to simulate virtual objects.
14. The computer interaction system of claim 13, wherein the
computer is configured to simulate natural forces acting, in the
true direction, on the virtual objects based on at least one of the
position information received from the first tracking mechanism,
the orientation information received from the first tracking
mechanism, the position information received from the second
tracking mechanism, and the orientation information received from
the second tracking mechanism, wherein the position information
received from the first tracking mechanism comprises a first
position of the physical object relative to the first tracking
mechanism at a first time and a second position of the physical
object relative to the first tracking mechanism at a second time,
wherein the orientation information received from the first
tracking mechanism comprises a first orientation of the physical
object relative to the first tracking mechanism at the first time
and a second orientation of the physical object relative to the
first tracking mechanism at the second time, wherein the position
information received from the second tracking mechanism comprises a
first position of the second tracking mechanism relative to the
reference at the first time and a second position of the second
tracking mechanism relative to the reference at the second time,
and wherein the orientation information received from the second
tracking mechanism comprises a first orientation of the second
tracking mechanism relative to the reference at the first time and
a second orientation of the second tracking mechanism relative to
the reference at the second time.
15. The computer interaction system of claim 14, wherein the second
tracking mechanism has a position rigidly fixed relative to the
first tracking mechanism, wherein a predetermined time exists
between the first time and the second time, wherein the computer is
configured to determine a physical object movement vector between
the first position of the physical object and the second position
of the physical object, wherein the computer is configured to
determine a second tracking mechanism movement vector between the
first position of the second tracking mechanism and the second
position of the second tracking mechanism, wherein the computer is
configured to calculate an adjusted physical object movement vector
if a magnitude of the second tracking mechanism movement vector is
greater than or equal to a predetermined threshold distance, and to
simulate natural forces acting on the virtual objects based on at
least the adjusted physical object movement vector if the magnitude
of the second tracking mechanism movement vector is greater than or
equal to the predetermined threshold distance, and wherein the
computer is configured to simulate natural forces acting on the
virtual objects based on at least the physical object movement
vector if the magnitude of the second tracking mechanism movement
vector is less than the predetermined threshold distance.
16. The computer interaction system of claim 14, wherein the
computer is configured to determine a physical object rotation
tensor between the first orientation of the physical object and the
second orientation of the physical object, wherein the computer is
configured to determine a second tracking mechanism rotation tensor
between the first orientation of the second tracking mechanism and
the second orientation of the second tracking mechanism, wherein
the computer is configured to calculate an adjusted physical object
rotation tensor if a resultant rotation of the second tracking
mechanism is greater than or equal to a predetermined threshold
rotation value, wherein the computer is configured to simulate
natural forces acting on the virtual objects based on at least the
adjusted physical object rotation tensor if the resultant rotation
of the second tracking mechanism is greater than or equal to the
predetermined threshold rotation value, and wherein the computer is
configured to simulate natural forces acting on the virtual objects
based on at least the physical object rotation tensor if the
resultant rotation of the second tracking mechanism is less than
the predetermined threshold rotation value.
17. A non-transitory computer readable medium having computer
readable instructions stored thereon, which, when executed by a
computer having a processor to execute a plurality of processes,
are configured to cause the processor to: obtain first tracking
mechanism information; obtain second tracking mechanism
information; determine a physical object movement vector using the
first tracking mechanism information; determine a second tracking
mechanism movement vector using the second tracking mechanism
information; determine a direction of a true physical gravity
vector relative to the second tracking mechanism; simulate motion
data of a virtual object using: an adjusted physical object
movement vector if a magnitude of the second tracking mechanism
movement vector is greater than or equal to a predetermined
threshold distance, the physical object movement vector if the
magnitude of the second tracking mechanism movement vector is less
than the predetermined threshold distance, and the direction of a
true physical gravity vector relative to the second tracking
mechanism; and output the motion data of the virtual object to a
display.
18. The non-transitory computer readable medium having computer
readable instructions stored thereon of claim 17, which, when
executed by a computer having a processor to execute a plurality of
processes, are configured further to cause the processor to:
determine a physical object rotation tensor using the first
tracking mechanism information; determine a second tracking
mechanism rotation tensor using the second tracking mechanism
information; and simulate motion data of a virtual object using
additionally: an adjusted physical object rotation tensor if a
resultant rotation of the second tracking mechanism is greater than
or equal to a predetermined threshold rotation value, and the
physical object rotation tensor if the resultant rotation of the
second tracking mechanism is less than the predetermined threshold
rotation value.
19. The non-transitory computer readable medium having computer
readable instructions stored thereon according to claim 18,
wherein: the first tracking mechanism information comprises: a
first position of a physical object relative to a first tracking
mechanism and a first orientation of the physical object at a first
time, and a second position of the physical object relative to the
first tracking mechanism and a second orientation of the physical
object at a second time; and the second tracking mechanism
information comprises: a first position of a second tracking
mechanism relative to a reference and a first orientation of the
second tracking mechanism at the first time, and a second position
of the second tracking mechanism relative to the reference and a
second orientation of the second tracking mechanism at the second
time.
20. A method of facilitating interaction between an interaction
delivery device and a physical object in an environment, the method
comprising: generating one or more virtual objects in the
environment; detecting a change in the physical object; determining
whether the change in the physical object is based on a change in
the state of the virtual objects and the physical object, or both a
force applied to the interaction delivery device and a change in
the state of the virtual objects and the physical object; measuring
a direction and effect of a natural force interacting with the
environment; and updating the virtual objects based on a result of
the determining and the measuring.
21. The method of facilitating interaction between an interaction
delivery device and a physical object in an environment of claim
20, wherein the detecting further comprises: detecting a change in
position of the physical object over a given time, and detecting a
change in position of the interaction delivery device over the
given time; wherein the determining further comprises: determining
whether a magnitude of the change in position of the interaction
delivery device over the given time is greater than or equal to a
threshold value, determining that the detected change in the
physical object is based on a change in the state of the virtual
objects and the physical object if the change in position of the
interaction delivery device over the given time is less than the
threshold value, and determining that the detected change in the
physical object is based on both a force applied to the interaction
delivery device and a change in the state of the virtual objects
and the physical object if the change in position of the
interaction delivery device over the given time is greater than or
equal to the threshold value; and wherein the updating further
comprises: updating positions of the virtual objects to simulate
motion consistent with the natural force and the detected change in
position of the physical object over the given time if the detected
change in the physical object is based on a change in the state of
the virtual objects and the physical object, and updating positions
of the virtual objects to simulate motion consistent with the
natural force and the detected change in position of the physical
object and adjusted to remove effects caused by the force applied
to the interaction delivery device over the given time if the
detected change in the physical object is based on both a force
applied to the interaction delivery device and a change in the
state of the virtual objects and the physical object.
22. An interaction system comprising: a physical object; at least
one virtual object; an interaction delivery device comprising: a
tracking mechanism configured to track at least one of a position
of the physical object relative to the tracking mechanism and
an orientation of the physical object relative to the
tracking mechanism, a detecting mechanism configured to detect
motion of the detecting mechanism, wherein a position of the
detecting mechanism relative to a position of the tracking
mechanism is predetermined, and a display configured to display the
at least one virtual object; and a processing device, wherein the
processing device is configured to receive physical object position
information from the tracking mechanism, wherein the processing
device is configured to receive detecting mechanism motion
information from the detecting mechanism, wherein the processing
device is configured to perform at least one of: determining if a
magnitude of acceleration of the detecting mechanism is greater
than a predetermined acceleration threshold value and generating
adjusted physical object position information if the magnitude of
acceleration of the detecting mechanism is greater than the
predetermined acceleration threshold value, and determining if a
magnitude of velocity of the detecting mechanism is greater than a
predetermined velocity threshold value and generating adjusted
physical object position information if the magnitude of velocity
of the detecting mechanism is greater than the predetermined
velocity threshold value, wherein the processing device is
configured to generate motion information for the at least one
virtual object based on the adjusted physical object position
information if the processing device generates the adjusted
physical object position information, and wherein the processing
device is configured to generate the motion information for the at
least one virtual object based on the physical object position
information if the processing device does not generate the adjusted
physical object position information, and wherein the processing
device is configured to output the motion information for the at
least one virtual object to the display.
23. The interaction system of claim 22, wherein the detecting
mechanism is further configured to detect a correct direction and
magnitude of a physical gravity vector, and wherein the processing
device is further configured to generate the motion information for
the at least one virtual object additionally based on the correct
direction and magnitude of the physical gravity vector.
24. The interaction system of claim 23, wherein the at least one
virtual object is configured to move according to a virtual gravity
vector, and wherein the virtual gravity vector is substantially
identical to the physical gravity vector.
25. The interaction system of claim 23, wherein the at least one
virtual object is configured to move according to a virtual gravity
vector, and wherein the virtual gravity vector is different from
the physical gravity vector.
26. The interaction system of claim 22, wherein the tracking
mechanism is an optical tracking device and the physical object
does not include attached or embedded electronic devices or
components.
27. An interaction delivery system comprising: a first tracking
mechanism rigidly attached to a display device configured to track
the position and orientation of a physical object relative to the
display device; a second tracking mechanism rigidly attached to the
display device, configured to track the absolute orientation of the
display device relative to the earth; and a processing device
configured to process tracking information output from the first
tracking mechanism and tracking information output from the second
tracking mechanism, wherein the processing device is configured to
simulate motion information of at least one virtual object from the
tracking information output from the first tracking mechanism and
tracking information output from the second tracking mechanism,
wherein the processing device is configured to output the simulated
motion information of the at least one virtual object to the
display device, wherein the display device is configured to display the at least
one virtual object disposed relative to the physical object, and
wherein the at least one virtual object is configured to behave as
if acted on by a virtual gravity vector in a same direction as a
physical gravity vector acting on the physical object.
Description
PRIORITY
[0001] This application claims the benefit of U.S. Provisional
Application No. 61/322,521, filed Apr. 9, 2010, which is hereby
incorporated by reference in its entirety.
FIELD
[0002] The present application relates to a computer interaction
system and a method of facilitating interaction between an
interaction delivery device and a physical object in an
environment.
BACKGROUND
[0003] Computer games have been developed that account for gravity.
An example of a game accounting for gravity is a virtual marble
labyrinth game. In a virtual marble labyrinth game, a user tilts a
gameboard having one or more virtual balls such that the virtual
balls roll about the gameboard as if the virtual balls were real
balls on the surface of the gameboard being influenced by a
gravitational force and the user's actions. Additionally, the
virtual balls can encounter a number of virtual objects that can
affect the virtual balls' motion in the same manner as a physical
object on the gameboard would affect the physical balls' motion.
The objective of the virtual marble labyrinth game is to move the
virtual balls to a desired position on the gameboard.
SUMMARY
[0004] One aspect of the presently disclosed subject matter
provides an interaction delivery device that can include a display
device, a first tracking mechanism, and a second tracking
mechanism. The display device can be configured to display virtual
objects. The first tracking mechanism can be configured to track
one or more of a position of a physical object relative to the
first tracking mechanism and an orientation of the physical object
relative to the first tracking mechanism. The second tracking
mechanism can be configured to track one or more of a position of
the second tracking mechanism relative to a reference and an
orientation of the second tracking mechanism relative to the
reference. In certain embodiments, one or more of position
information received from the first tracking mechanism and
orientation information received from the first tracking mechanism
can be used to generate motion data for the physical object. In
certain embodiments, one or more of position information received
from the second tracking mechanism and orientation information
received from the second tracking mechanism can be used to generate
adjusted motion data for the physical object. In certain
embodiments, the adjusted motion data for the physical object is
used to generate virtual object information. In certain
embodiments, the virtual object information is received by the
display device.
[0005] Another aspect of the presently disclosed subject matter
provides an interaction processing device that can include a
processor, a memory, an input unit configured to receive
information, and an output unit configured to output information.
In certain embodiments, the input unit can be configured to receive
at least one of physical object position information (e.g., data,
video, images, etc.), physical object orientation information
(e.g., data, video, images, etc.), tracking mechanism position
information (e.g., data, video, images, etc.), and tracking
mechanism orientation information (e.g., data, video, images,
etc.). In certain embodiments, the processor can be configured to
generate motion data for a physical object using at least one of
the physical object position information and the physical object
orientation information. In certain embodiments, the processor can
be configured to generate adjusted motion data for the physical
object using at least one of tracking mechanism position
information and tracking mechanism orientation information. In
certain embodiments, the adjusted motion data for the physical
object can compensate for movement of a tracking mechanism relative
to the physical object. In certain embodiments, the processor can
be configured to generate virtual object information using the
adjusted motion data for the physical object. In certain
embodiments, the output unit can be configured to output the
virtual object information.
[0006] Another aspect of the presently disclosed subject matter
provides a computer interaction system that can include an
interaction delivery device and a computer. The interaction
delivery device can include a display device configured to display
one or more of virtual objects, images, videos, and other
information or media, a first tracking mechanism configured to
track one or more of a position of a physical object relative to
the first tracking mechanism and an orientation of the physical
object relative to the first tracking mechanism, and a second
tracking mechanism configured to track one or more of a position of
the second tracking mechanism relative to a reference and an
orientation of the second tracking mechanism relative to the
reference. The computer can include a processor and a memory, and
can be configured to process one or more of position information
received from the first tracking mechanism, orientation information
received from the first tracking mechanism, position information
received from the second tracking mechanism, and orientation
information received from the second tracking mechanism. Further,
in certain embodiments, the computer can be configured to
compensate for movement relative to the physical object of at least
one of the first tracking mechanism, the second tracking mechanism,
and the display device. In some embodiments, the computer can be
further configured to output virtual object information to the
display device. The computer can be any processing device.
[0007] In one embodiment, the first tracking mechanism can be an
optical tracking mechanism which can include one or more cameras.
Further, the interaction delivery device can be configured to be
head-worn and to display one or more of video see-through augmented
reality (e.g., video depicting a physical environment viewed
through one or more attached cameras that is augmented with
additional virtual graphics), virtual reality video (e.g., video
depicting an entirely virtual environment), and optical see-through
augmented reality (e.g., viewing the physical environment directly,
rather than through video, with virtual graphics overlaid on the
field of vision by using optical elements such as mirrors, lenses,
etc.). The second tracking mechanism can include one or more of a
three-axis accelerometer and a magnetometer. Moreover, the second
tracking mechanism can be configured to determine a true direction
of a natural force, such as gravity or magnetism, acting on the
second tracking mechanism. The second tracking mechanism can have a
position that is rigidly fixed relative to the first tracking
mechanism, or a position that is otherwise known relative to the
first tracking mechanism.
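By way of illustration only (the following sketch is not part of the original disclosure), a three-axis accelerometer can estimate the true direction of gravity because, when the device is approximately at rest, the sensor measures only the specific force opposing gravity. A minimal Python sketch, in which the function name, rest-detection tolerance, and sign convention are illustrative assumptions:

```python
import numpy as np

G = 9.81  # nominal gravitational acceleration, m/s^2

def gravity_direction(accel_reading, tolerance=0.5):
    """Estimate the unit "down" vector in the sensor frame from one
    three-axis accelerometer sample (m/s^2). Returns None when the
    device appears to be accelerating, so the sample is unreliable."""
    a = np.asarray(accel_reading, dtype=float)
    magnitude = np.linalg.norm(a)
    if abs(magnitude - G) > tolerance:
        return None  # device is moving; cannot isolate gravity
    # At rest a typical accelerometer reads the reaction to gravity,
    # so gravity points opposite the reading (convention may vary).
    return -a / magnitude
```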
[0008] In a further embodiment, the second tracking mechanism can
be configured to comprise a six-degree-of-freedom tracker that can
be configured to determine a three-dimensional position of the
second tracking mechanism relative to the reference and a
three-dimensional orientation of the second tracking mechanism
relative to the reference. The reference can be the earth or fixed
to the earth. Alternatively, the reference can be fixed to one or
more additional physical objects moving relative to the
earth.
[0009] In another embodiment, the second tracking mechanism can be
configured to determine a true direction of a natural force vector.
Further, the computer can be configured to simulate motion of
virtual objects based on a true direction of a natural force and
based on at least one of position information received from the
first tracking mechanism, orientation information received from the
first tracking mechanism, position information received from the
second tracking mechanism, and orientation information received
from the second tracking mechanism. Alternatively or additionally,
the computer can be configured to simulate motion of virtual
objects based on a direction of force that is different from the
true direction of a natural force and based on at least one of
position information received from the first tracking mechanism,
orientation information received from the first tracking mechanism,
position information received from the second tracking mechanism,
and orientation information received from the second tracking
mechanism.
[0010] In a further embodiment, the computer can further include
one or more of a game engine and a physics engine, wherein the
computer can be configured to simulate virtual objects. For
example, the computer can be configured to simulate virtual forces
acting, in a true physical direction, on the virtual objects based
on one or more of the position information received from the first
tracking mechanism, the orientation information received from the
first tracking mechanism, the position information received from
the second tracking mechanism, the orientation information received
from the second tracking mechanism, acceleration information about
the second tracking mechanism received from the second tracking
mechanism, velocity information about the second tracking mechanism
received from the second tracking mechanism, and true physical
gravity vector information. The position information received from
the first tracking mechanism can include a first position of the
physical object relative to the first tracking mechanism at a first
time and a second position of the physical object relative to the
first tracking mechanism at a second time. The orientation
information received from the first tracking mechanism can include
a first orientation of the physical object relative to the first
tracking mechanism at the first time and a second orientation of
the physical object relative to the first tracking mechanism at the
second time. The position and orientation information received from
the first tracking mechanism can be data, video streams, images, or
other forms of information. The position information received from
the second tracking mechanism can include a first position of the
second tracking mechanism relative to the reference at the first
time and a second position of the second tracking mechanism
relative to the reference at the second time. The orientation
information received from the second tracking mechanism can include
a first orientation of the second tracking mechanism relative to
the reference at the first time and a second orientation of the
second tracking mechanism relative to the reference at the second
time.
[0011] Additionally, the second tracking mechanism can have a
position rigidly fixed, or otherwise known, relative to the first
tracking mechanism. A predetermined time can exist between the
first time and the second time. Moreover, the computer can be
configured to determine a physical object movement vector between
the first position of the physical object and the second position
of the physical object. The computer can be configured to determine
a second tracking mechanism movement vector between the first
position of the second tracking mechanism and the second position
of the second tracking mechanism. Further, the computer can be
configured to calculate an adjusted physical object movement vector
by subtracting the second tracking mechanism movement vector from
the physical object movement vector if a magnitude of the second
tracking mechanism movement vector exceeds a predetermined
threshold distance (e.g., the predetermined threshold distance can
be set as moving about 0.02 meters in about 1 second). The
computer can be configured to simulate virtual forces acting on the
virtual objects based on at least the adjusted physical object
movement vector if the magnitude of the second tracking mechanism
movement vector exceeds, or is equal to, the predetermined
threshold distance. Additionally, the computer can be configured to
simulate virtual forces acting on the virtual objects based on at
least the physical object movement vector if the magnitude of the
second tracking mechanism movement vector is less than the
predetermined threshold distance.
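A minimal sketch of the movement compensation just described, assuming three-dimensional positions sampled at the first and second times; the subtraction rule and the 0.02 m threshold follow the example above, while the function and variable names are illustrative:

```python
import numpy as np

THRESHOLD_DISTANCE = 0.02  # meters, per the example above

def adjusted_object_movement(obj_pos_t1, obj_pos_t2,
                             tracker_pos_t1, tracker_pos_t2):
    """Movement vector of the physical object, compensated for motion
    of the second tracking mechanism when that motion is large."""
    obj_vec = np.asarray(obj_pos_t2) - np.asarray(obj_pos_t1)
    tracker_vec = np.asarray(tracker_pos_t2) - np.asarray(tracker_pos_t1)
    if np.linalg.norm(tracker_vec) >= THRESHOLD_DISTANCE:
        # The head (and rigidly attached first tracker) moved: remove
        # its displacement so only real gameboard motion remains.
        return obj_vec - tracker_vec
    # Below threshold: treat all apparent motion as gameboard motion.
    return obj_vec
```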
[0012] In yet another embodiment, the computer can be configured to
determine a physical object rotation tensor between the first
orientation of the physical object and the second orientation of
the physical object. The computer can be configured to determine a
second tracking mechanism rotation tensor between the first
orientation of the second tracking mechanism and the second
orientation of the second tracking mechanism. Additionally, the
computer can be configured to calculate an adjusted physical object
rotation tensor by applying a transformation based on a resultant
rotation of the second tracking mechanism, if the resultant
rotation of the second tracking mechanism exceeds, or is equal to,
a predetermined threshold rotation value (e.g., a threshold rotation
value of 3 degrees in a predetermined time of 1 second). The computer
can be configured to
simulate natural forces acting on the virtual objects based on at
least the adjusted physical object rotation tensor if the resultant
rotation of the second tracking mechanism exceeds, or is equal to,
the predetermined threshold rotation value. Additionally, the computer can be
configured to simulate natural forces acting on the virtual objects
based on at least the physical object rotation tensor if the
resultant rotation of the second tracking mechanism is less than
the predetermined threshold rotation value.
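The rotation case can be sketched analogously. The fragment below stands in for the rotation tensors using SciPy rotation objects; composing with the inverse of the second tracking mechanism's resultant rotation is one plausible form of the compensating transformation, not necessarily the one intended, and the 3-degree threshold is taken from the example above:

```python
import numpy as np
from scipy.spatial.transform import Rotation

THRESHOLD_ROTATION_DEG = 3.0  # hypothetical, per the example above

def adjusted_object_rotation(obj_rot_t1, obj_rot_t2,
                             tracker_rot_t1, tracker_rot_t2):
    """All arguments are scipy Rotation instances. Returns the object's
    frame-to-frame rotation, with the second tracking mechanism's own
    rotation removed when its resultant rotation is large enough."""
    obj_delta = obj_rot_t2 * obj_rot_t1.inv()
    tracker_delta = tracker_rot_t2 * tracker_rot_t1.inv()
    # Resultant rotation of the tracker expressed as a single angle.
    tracker_angle_deg = np.degrees(tracker_delta.magnitude())
    if tracker_angle_deg >= THRESHOLD_ROTATION_DEG:
        # Compensate by composing with the inverse tracker rotation.
        return tracker_delta.inv() * obj_delta
    return obj_delta

# Example: apparent object rotation caused entirely by a 5-degree head
# turn is compensated away, leaving ~0 residual rotation.
r0 = Rotation.identity()
head = Rotation.from_euler("z", 5, degrees=True)
print(adjusted_object_rotation(r0, head, r0, head).magnitude())  # ~0.0
```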
[0013] Another aspect of the presently disclosed subject matter
provides a non-transitory computer readable medium having computer
readable instructions stored thereon, which, when executed by a
computer having a processor to execute a plurality of processes,
are configured to cause the processor to perform several functions.
In certain embodiments, the computer readable instructions can
cause the processor to obtain one or more of first tracking
mechanism information and second tracking mechanism information. In
certain embodiments, the computer readable instructions can cause
the processor to determine one or more of a physical object
movement vector using the first tracking mechanism information, a
second tracking mechanism movement vector using the second tracking
mechanism information, and direction of a true physical gravity
vector relative to the second tracking mechanism. In certain
embodiments, the computer readable instructions can cause the
processor to simulate motion data of a virtual object. The
processor can simulate motion data of the virtual object based on a
true or not true virtual gravity vector using an adjusted physical
object movement vector if a magnitude of the second tracking
mechanism movement vector is greater than or equal to a
predetermined threshold distance, the physical object movement
vector if the magnitude of the second tracking mechanism movement
vector is less than the predetermined threshold distance, and the
direction of a true physical gravity vector relative to the second
tracking mechanism. In certain embodiments, the computer readable
instructions can cause the processor to output the motion data of
the virtual object to a display.
[0014] In one embodiment, the non-transitory computer readable
medium having computer readable instructions stored thereon, which,
when executed by a computer having a processor to execute a
plurality of processes, are configured further to cause the
processor to determine one or more of a physical object rotation
tensor using the first tracking mechanism information, and a second
tracking mechanism rotation tensor using the second tracking
mechanism information. The computer readable instructions can also
cause the processor to simulate motion data of a virtual object
using additionally: an adjusted physical object rotation tensor if
a resultant rotation of the second tracking mechanism is greater
than or equal to a predetermined threshold rotation value, and the
physical object rotation tensor if the resultant rotation of the
second tracking mechanism is less than the predetermined threshold
rotation value.
[0015] The first tracking mechanism information can include a first
position of a physical object relative to a first tracking
mechanism and a first orientation of the physical object at a first
time, and a second position of the physical object relative to the
first tracking mechanism and a second orientation of the physical
object at a second time. The second tracking mechanism information
can include a first position of a second tracking mechanism
relative to a reference and a first orientation of the second
tracking mechanism at the first time, and a second position of the
second tracking mechanism relative to the reference and a second
orientation of the second tracking mechanism at the second
time.
[0016] Another aspect of the presently disclosed subject matter
provides a method of facilitating interaction between an
interaction delivery device and a physical object in an
environment, the method including generating one or more virtual
objects in the environment; detecting a change in the physical
object; determining whether the change in the physical object is
based on a change in the state of the virtual objects and the
physical object, or both a force applied to the interaction
delivery device and a change in the state of the virtual objects
and the physical object; measuring a direction and effect (e.g.,
magnitude, etc.) of a natural force interacting with the
environment; and updating the virtual objects based on a result of
the determining and the measuring. In certain embodiments, the
detecting can further include: detecting a change in position of
the physical object over a given time, and detecting a change in
position of the interaction delivery device over the given time. In
certain embodiments, the determining can further include
determining whether a magnitude of the change in position of the
interaction delivery device over the given time exceeds, or is
equal to, a predetermined threshold value, determining that the
detected change in the physical object is based on a change in the
state of the virtual objects and physical object if the change in
position of the interaction delivery device over the given time is
less than the predetermined threshold value, and determining that
the detected change in the physical object is based on both a force
applied to the interaction delivery device and a change in the
state of the virtual objects and the physical object if the change
in position of the interaction delivery device over the given time
exceeds or is equal to the predetermined threshold value. The
updating can further include updating positions of the virtual
objects to simulate motion consistent with the natural force and
the detected change in position of the physical object over the
given time if the detected change in the physical object is based
on a change in the state of the virtual objects and the physical
object, and updating positions of the virtual objects to simulate
motion consistent with the natural force and the detected change in
position of the physical object and adjusted to remove effects
caused by the force applied to the interaction delivery device over
the given time if the detected change in the physical object is
based on both a force applied to the interaction delivery device
and a change in the state of the virtual objects and the physical
object.
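A deliberately simplified sketch of this method's decide-and-update loop, assuming the natural force is gravity, that only positions (not orientations) are tracked, and a fixed 60 Hz update interval; the names, the threshold, and the way board displacement is applied to a virtual ball are all illustrative:

```python
import numpy as np

GRAVITY_MAG = 9.81  # m/s^2, magnitude of the natural force
DT = 1.0 / 60.0     # update interval, assuming a 60 Hz loop
THRESHOLD = 0.02    # meters of device movement per interval (hypothetical)

def update_virtual_ball(pos, vel, board_delta, device_delta, down):
    """One update of a virtual ball: remove the part of the detected
    board motion caused by motion of the interaction delivery device,
    then integrate gravity along the measured "down" direction."""
    pos = np.asarray(pos, dtype=float)
    vel = np.asarray(vel, dtype=float)
    board_delta = np.asarray(board_delta, dtype=float)
    device_delta = np.asarray(device_delta, dtype=float)
    if np.linalg.norm(device_delta) >= THRESHOLD:
        # Change is due to both device motion and board motion:
        # subtract the device's contribution.
        board_delta = board_delta - device_delta
    vel = vel + GRAVITY_MAG * np.asarray(down, dtype=float) * DT
    pos = pos + vel * DT + board_delta  # ball rides with the board
    return pos, vel
```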
[0017] Another aspect of the presently disclosed subject matter
provides an interaction system that can include a physical object,
at least one virtual object, an interaction delivery device, and a
processing device. The interaction delivery device can include: a
tracking mechanism that can be configured to track at least one of
a position of the physical object relative to the tracking
mechanism and an orientation of the physical object relative to the
tracking mechanism; a detecting mechanism configured to
detect motion of the detecting mechanism, wherein a position of the
detecting mechanism relative to a position of the tracking
mechanism is predetermined; and a display configured to display the
at least one virtual object. The processing device can be
configured to receive physical object position information from the
tracking mechanism. The processing device can be configured to
receive detecting mechanism motion information from the detecting
mechanism. The processing device can be configured to perform at
least one of: determining if a magnitude of acceleration of the
detecting mechanism is greater than a predetermined acceleration
threshold value and generating adjusted physical object position
information if the magnitude of acceleration of the detecting
mechanism is greater than the predetermined acceleration threshold
value; and determining if a magnitude of velocity of the detecting
mechanism is greater than a predetermined velocity threshold value
and generating adjusted physical object position information if the
magnitude of velocity of the detecting mechanism is greater than
the predetermined velocity threshold value. The processing device
can be configured to generate motion information for the at least
one virtual object based on the adjusted physical object position
information if the processing device generates the adjusted
physical object position information. The processing device can be
configured to generate the motion information for the at least one
virtual object based on the physical object position information if
the processing device does not generate the adjusted physical
object position information. The processing device can be
configured to output the motion information for the at least one
virtual object to the display.
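Both determinations reduce to magnitude tests against thresholds. A minimal sketch follows; the threshold values and names are hypothetical, since the disclosure does not fix them:

```python
import numpy as np

ACCEL_THRESHOLD = 0.5  # m/s^2, hypothetical
VEL_THRESHOLD = 0.02   # m/s, hypothetical

def select_position_info(raw_position, adjust, accel=None, velocity=None):
    """Use adjusted physical object position information when the
    detecting mechanism's acceleration or velocity magnitude exceeds
    its threshold; otherwise use the raw position information.
    `adjust` is a callable producing the adjusted information."""
    moving = (
        (accel is not None and np.linalg.norm(accel) > ACCEL_THRESHOLD)
        or (velocity is not None and np.linalg.norm(velocity) > VEL_THRESHOLD)
    )
    return adjust(raw_position) if moving else raw_position
```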
[0018] The detecting mechanism can be configured to detect a
correct direction and magnitude of a physical gravity vector.
Additionally, the processing device can be further configured to
generate the motion information for the at least one virtual object
additionally based on the correct direction and magnitude of the
physical gravity vector. Further, the tracking mechanism can be an
optical tracking device. The physical object can be configured to
not include attached or embedded electronic devices or
components.
[0019] In one embodiment of the present aspect of the application,
the at least one virtual object can be configured to move according
to a virtual gravity vector, wherein the virtual gravity vector is
substantially identical to the physical gravity vector. In another
embodiment of the present aspect of the application, the at least
one virtual object can be configured to move according to a virtual
gravity vector, wherein the virtual gravity vector is different
from the physical gravity vector.
[0020] Another aspect of the presently disclosed subject matter
describes an interaction delivery system comprising: a first
tracking mechanism rigidly attached to a display device configured
to track the position and orientation of a physical object relative
to the display device; a second tracking mechanism rigidly attached
to the display device, configured to track the absolute orientation
of the display device relative to the earth; and a processing
device configured to process tracking information output from the
first tracking mechanism and tracking information output from the
second tracking mechanism, wherein the processing device is
configured to simulate motion information of at least one virtual
object from the tracking information output from the first tracking
mechanism and tracking information output from the second tracking
mechanism, wherein the processing device is configured to output
the simulated motion information of the at least one virtual object
to the display device, wherein the display device is configured to display the at
least one virtual object disposed relative to the physical object,
and wherein the at least one virtual object is configured to behave
as if acted on by a virtual gravity vector in a same direction as a
physical gravity vector acting on the physical object.
[0021] It is to be understood that both the foregoing general
description and the following detailed description are exemplary
and are intended to provide further explanation of the application
claimed.
[0022] The accompanying drawings, which are incorporated in and
constitute part of this specification, are included to illustrate
and provide a further understanding of the apparatus and method of
the application. Together with the written description, the
drawings serve to explain the principles of the application.
BRIEF DESCRIPTION OF THE DRAWINGS
[0023] FIG. 1 depicts a computer interaction system according to a
non-limiting embodiment.
[0024] FIG. 2A depicts a front side of a head-worn unit, and FIG.
2B depicts a back side of the head-worn unit according to a
non-limiting embodiment.
[0025] FIG. 3 depicts, in detail, a gameboard component of the
computer interaction system depicted in FIG. 1 according to a
non-limiting embodiment.
[0026] FIG. 4 depicts a computer interaction system including
virtual graphics according to a non-limiting embodiment.
[0027] FIG. 5 is a schematic of the head-worn unit connected to a
computer according to a non-limiting embodiment.
[0028] FIG. 6 is a schematic of a computer according to a
non-limiting embodiment.
[0029] FIG. 7 is a flow chart depicting operations of a computer
according to a non-limiting embodiment.
[0030] FIG. 8 is a flow chart depicting operations of the computer
which compensate for movement and rotation of the optical tracker
according to a non-limiting embodiment.
[0031] FIG. 9A depicts a front side of a head-worn unit, and FIG.
9B depicts a back side of the head-worn unit according to a
non-limiting embodiment.
[0032] FIG. 10 is a schematic of the head-worn unit connected to a
processing device according to a non-limiting embodiment.
[0033] FIG. 11 is a flow chart depicting operations of a processing
device according to a non-limiting embodiment.
[0034] FIG. 12A is a flow chart depicting operations of the
processing device which compensate for movement and rotation of the
optical tracker, based on acceleration information of the head
motion detector, according to a non-limiting embodiment. FIG. 12B
is a flow chart depicting operations of the processing device which
compensate for movement and rotation of the optical tracker, based
on velocity information of the head motion detector, according to a
non-limiting embodiment.
DETAILED DESCRIPTION
[0035] The disclosed subject matter can be exemplified with, but is
not limited to, a virtual marble labyrinth game. The virtual marble
labyrinth game has been implemented using a number of technologies.
For example, the virtual marble labyrinth game can be implemented
with a hand-held display for displaying the virtual balls and other
virtual objects, such as obstacles, bumps, ramps, prizes, walls,
pools, holes, channels, etc. The hand-held display is further
integrated with a set of built-in accelerometers that can directly
measure the direction of gravity relative to the display. An
example of this technology in action is available at
http://www.youtube.com/watch?v=lipHmNbi7ss. This implementation has
several drawbacks: the game graphics are restricted to the
hand-held display; the weight and fragility of the gameboard depend
upon the technology it contains; and the position and orientation
of the player's head relative to the gameboard are not known, so
graphics must be generated for an assumed head position and
orientation, and cannot respond to changes in actual head position
and orientation during gameplay.
[0036] Another example of a virtual marble labyrinth game uses a
static camera to optically track a passive physical gameboard. This
makes it possible to determine a six-degree-of-freedom position and
orientation of the gameboard relative to the camera. The virtual
objects are displayed on an external monitor, with the virtual
objects overlaid on the actual view from the camera. An example of
this technology in action is available at
http://www.youtube.com/watch?v=L7dC8HU2KJY. In this case, the
camera is assumed to be in a specific fixed pose relative to the
direction of gravity. This fixed pose can be either a hardwired
pose or a pose that is calibrated prior to gameplay. Consequently,
changing the position or orientation of the camera during gameplay
will be indistinguishable from a complementary change in the
position or orientation of the gameboard. This is typically
undesirable in gameplay, as it rules out having the camera move,
since that would cause an unwanted change in the effective
direction of gravity, preventing mobile game play or the use of a
head-worn camera. Further, if the camera is rigidly affixed to a
moving platform, such as a car, ship, plane, or train, on which the
game is being played, the motion of the platform relative to the
earth will not be properly taken into account.
[0037] Additional examples of a virtual marble labyrinth game use a
display with a known or assumed orientation. One implementation
allows the user to manipulate a virtual gameboard using an input
device such as a mouse. This approach does not provide the natural
feel of controlling a virtual gameboard in the same manner as one
would feel while controlling a physical gameboard. Another
implementation allows the user to manipulate a game controller
containing an orientation tracker, effectively changing the
direction of gravity in the game, but without changing the visual
orientation of the gameboard. An example of this technology is
available at http://www.youtube.com/watch?v=BjEKoDW9S-4. In both of
these approaches, the gameboard is not held by the player.
[0038] Yet another example of the virtual marble labyrinth game
uses a gameboard and head-worn unit. The gameboard and the
head-worn unit are each individually tracked in six-degrees of
freedom relative to a real world reference. This implementation
requires that additional infrastructure for tracking the gameboard
be added to the environment or directly to the gameboard.
Adding such additional infrastructure to the environment limits
where the game can be played. Adding such additional infrastructure
to the gameboard makes it heavier or more fragile. In both cases,
adding additional infrastructure can make the system more
expensive, more complicated, and awkward for the user.
[0039] Reference will now be made in detail to embodiments of the
disclosed subject matter, examples of which are illustrated in the
accompanying drawings and on the Internet at
http://www.youtube.com/watch?v=6AKgH4On65A.
[0040] While, solely for purposes of convenience, the computer
interaction system and the method of facilitating interaction
between an interaction delivery device and physical objects in an
environment presented herein are described in the context of
developing and playing augmented reality computer games, use of the
systems and methods of the presently disclosed subject matter is
not limited thereto. For example, the presently disclosed systems
and methods can be employed for other uses, such as, but not
limited to, pedagogical computer environments for teaching physics,
controls for user interfaces, pedagogical "virtual reality"
computer environments for medical training, pilot training, etc.
Additionally, the disclosed subject matter can be employed to
develop and play entirely virtual games using a physical object as
an interface to control virtual objects within an immersive virtual
environment.
[0041] In accordance with the disclosed subject matter, a computer
interaction system is provided. The computer interaction system
includes an interaction delivery device and a computer. The
interaction delivery device can include a display device configured
to display virtual objects, a first tracking mechanism configured
to track one or more of a position of a physical object relative to
the first tracking mechanism and an orientation of the physical
object relative to the first tracking mechanism, and a second
tracking mechanism configured to track one or more of a position of
the second tracking mechanism relative to a reference and an
orientation of the second tracking mechanism relative to the
reference. The computer can include a processor and a memory, and
can be configured to process at least one of position information
received from the first tracking mechanism, orientation information
received from the first tracking mechanism, position information
received from the second tracking mechanism, and orientation
information received from the second tracking mechanism. Further,
the computer can be configured to compensate for movement relative
to the physical object of at least one of the first tracking
mechanism and the second tracking mechanism and to output virtual
object information to the display device.
[0042] For purpose of explanation and illustration, and not
limitation, an exemplary embodiment of the computer interaction
system in accordance with the application is shown in FIGS.
1-12.
[0043] FIG. 1 depicts a computer interaction system that includes a
head-worn unit 10 (an interaction delivery device) and a computer
14 that can be connected to the head-worn unit 10 wirelessly or
through a cable. Further, the computer interaction system can
include a gameboard 16 (or other physical object) that can be
manipulated to move in an x-direction, a y-direction, and a
z-direction with respect to head-worn unit 10 or to rotate to a new
orientation by a user. Moreover, the head-worn unit 10 can move in
an a-direction, a b-direction, and a c-direction or rotate to a new
orientation as the user's head moves. A force of gravity can act
upon the computer interaction environment in the direction of a
true virtual gravity vector pointing downward relative to earth 25,
as indicated in FIG. 1. In this embodiment, the true virtual
gravity vector acts with the same magnitude and direction as the
physical gravitational force acting on the user and gameboard
(i.e., toward earth 25). Accordingly, as used herein, "true"
indicates a natural and correct direction and magnitude, as would
exist in real life. Alternatively, the virtual gravity vector can
be configured to be a not true virtual gravity vector, where, as
used herein, "not true" indicates that the vector has at least one
of a direction and magnitude that is different from the natural and
correct direction and magnitude of the physical force, as would
exist in real life.
[0044] FIG. 2A depicts the front side of head-worn unit 10. This
embodiment depicts a modified version of the Vuzix Wrap.TM. 920AR,
but alternative devices can be used as head-worn unit 10. The
head-worn unit 10 can include one or more optical sensors 11 (first
tracking mechanisms) for detecting optical markers on a gameboard
(see, e.g., optical marker 23 on gameboard 16 in FIG. 3). Optical
sensors 11 detect the position and orientation of optical markers
and transmit that information to the computer (not shown). Further,
head-worn unit 10 can include a head tracker 12 (a second tracking
mechanism) for detecting at least one of the orientation and
position of head tracker 12 relative to a reference 24. Reference
24 can be a position on earth 25 or an object moving
relative to earth 25 such as a car, a boat, or a plane. Head
tracker 12 can be, but is not limited to being, rigidly fixed
relative to at least one optical sensor 11. Accordingly, one or
more of the orientation of optical sensors 11 and the position of
optical sensors 11 can be readily determined from the orientation
and position of head tracker 12 if head tracker 12 is rigidly fixed
relative to at least one optical sensor 11.
[0045] FIG. 2B depicts the back side of head-worn unit 10. Head
tracker 12 can be disposed on the back side of head-worn unit 10.
Further, display devices 13 can be disposed on the back side of
head-worn unit 10. Display devices 13 can be configured to display
augmented reality images 2 based on augmented reality information
output by computer 14. Augmented reality images 2 can include
physical objects, such as gameboard 16, and virtual objects 15,
such as ball 15a and obstacles 15b as depicted in FIG. 4. Display
devices 13 can also be configured to allow a user to directly view
physical objects and to overlay virtual objects on the physical
environment viewed by the user. Alternatively or additionally,
display devices 13 can be configured to display entirely virtual
environments.
[0046] FIG. 3 depicts board 16, which can, for example, serve as a
gameboard. Board 16 can have a plurality of optical markers 23.
Nevertheless, the optical markers on board 16 are not limited to an
array of optical markers 23 as depicted in FIG. 3. For example,
optical markers may include text, barcodes, patterns, colors, or
other indicia that can be identified by an optical sensor.
Furthermore, the markers can be integrated within the gameboard to
create a visually appealing appearance that can, for example,
simulate an environment consistent with the theme of the video
game. Alternatively or additionally, board 16 can have designs that
include a more general set of optical features (color, patterns,
shapes, images, etc.) that can be recognized by optical tracking
software using markerless optical tracking. Additionally or
alternatively, board 16 can include natural or artificial physical
features (texture, protrusions, holes, edges, grooves, notches,
perforations, dips, divots, cavities, slits, etc.) that can be
recognized by optical tracking software using optical feature
tracking. The system can be configured to pre-store information
regarding these optical and physical features prior to tracking,
but it is not necessary that the system pre-store this
information.
[0047] FIG. 4 further depicts the computer interaction system
including virtual objects 15, such as ball 15a, obstacles 15b, and
ramp 15c, or other virtual objects that can affect motion of at
least one virtual object 15. Virtual objects 15 are not limited to
balls, obstacles, and ramps, however. Virtual objects 15 can be any
number of objects found in virtual environments, such as pools,
pits, prizes, bumpers, accelerators, edges, walls, etc. Board 16,
ball 15a, and obstacles 15b are shown as they appear in the
displays of head-worn unit 10 (see FIG. 2B). In this particular
embodiment, ball 15a is simulated to engage in motion consistent
with a true virtual gravity vector, substantially identical to the
true physical gravity vector. For example, if board 16 is rotated
such that corner 16a is lowered, ball 15a will appear to move
toward corner 16a. Ball 15a will appear to move with a velocity and
acceleration similar to that which would be observed if a physical
ball having equivalent size, shape, and composition characteristics
was placed upon board 16. However, the computer interaction
environment may also include virtual objects 15, such as obstacles
15b, which alter the motion of ball 15a. For example, when ball 15a
approaches an obstacle 15b, ball 15a can be configured to stop,
decelerate, accelerate, bounce, deform, change direction, or engage
in any number of other changes in motion or characteristic
properties. Thus, ball 15a can be configured to move as if it were
a physical ball encountering physical obstacles on board 16, even
though ball 15a and obstacles 15b are virtual objects.
Additionally, a variety of nonphysical behaviors (shrinking,
growing, color change, gravitational shifting, acceleration,
deceleration, disappearing, etc.) can be simulated when the ball
15a is in contact with or in close proximity to other virtual
objects 15.
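For purpose of illustration only, and not limitation, the following
Python sketch shows one way the motion described above could be
simulated: the true virtual gravity vector is projected onto the
plane of board 16 so that ball 15a accelerates toward whichever
corner is lowered. The names, values, and conventions used are
illustrative assumptions and not part of any embodiment.

    import numpy as np

    # True virtual gravity vector (m/s^2), expressed in the reference
    # frame and matching the physical gravitational force toward earth 25.
    GRAVITY = np.array([0.0, -9.81, 0.0])

    def step_ball(position, velocity, board_normal, dt=0.03):
        """Advance ball 15a by one discrete time interval on board 16."""
        n = board_normal / np.linalg.norm(board_normal)
        # Keep only the component of gravity tangent to the (tilted)
        # board, so the ball rolls toward the lowered corner (e.g., 16a).
        g_tangent = GRAVITY - np.dot(GRAVITY, n) * n
        velocity = velocity + g_tangent * dt
        position = position + velocity * dt
        return position, velocity

Obstacles 15b could then be modeled by, for example, zeroing or
reflecting the velocity whenever the ball reaches an obstacle
boundary.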
[0048] FIG. 5 is a schematic diagram depicting the flow of
information between head-worn unit 10 and computer 14. Optical
sensors 11 can output images or video streams to computer 14. In
the present embodiment, one optical sensor 11 can output video to
computer 14, which computer 14 can process to create at least one
of position and orientation data for gameboard 16, as will be
described herein. Further, another optical sensor 11 can output
video that can be combined with virtual objects 15 to generate an
augmented reality environment. However, it is not necessary that
optical sensors 11 each output video for separate processing. One
or more of optical sensors 11 can be configured to output video to
computer 14 for both processing to generate at least one of
position and orientation data and for combining with virtual
objects 15 to generate an augmented reality environment. Head
tracker 12 can track the position and orientation of head-worn unit
10. Head tracker 12 can include one or more of, but is not limited
to, a three-axis accelerometer, a magnetometer, a gyroscope, and a
full six-degrees-of-freedom tracker. Further, head tracker 12 can
detect the relative direction of the true physical gravity vector
acting on head-worn unit 10. Head tracker 12 can output this
position information and orientation information, which includes
information regarding the relative direction of the true physical
gravity vector, to computer 14. Computer 14 can then process the
information to create augmented reality information, as will be
described herein. Alternatively, computer 14 can be configured to
create purely virtual reality information. Computer 14 can then
output the augmented reality or virtual reality information to
display screens 13. The augmented reality information can be, but
is not limited to, an augmented reality video stream including a
sequence of augmented reality images. Further, the virtual reality
information can be, but is not limited to, a virtual reality video
stream including a sequence of virtual reality images. Display
screens 13 can be configured to display one or more of virtual
reality information and augmented reality information. Further,
display screens 13 can be optically transparent or disposed at a
periphery of a user's field of vision, allowing a user to view the
physical environment directly, and can be configured to overlay
virtual objects 15 over the directly viewed physical environment.
Although this embodiment describes that computer 14 can be separate
from head-worn unit 10, head-worn unit 10 can include computer 14.
Computer 14 can be any processing device. Further, one or more of
optical sensors 11 and head tracker 12 can be configured to output
one or more of position information, orientation information, and
information regarding the relative direction of the true physical
gravity vector, alone or in combination with other information, to
computer 14.
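For purpose of illustration only, and not limitation, the flow of
information depicted in FIG. 5 can be summarized as a per-frame loop,
sketched below in Python. Every function shown is an illustrative
placeholder for the corresponding hardware or software component
rather than an actual interface.

    import numpy as np

    def capture_video():
        # Placeholder for video output by an optical sensor 11.
        return np.zeros((480, 640, 3), dtype=np.uint8)

    def track_board(video_frame):
        # Placeholder for marker tracking: position and orientation
        # of board 16 relative to the optical sensors.
        return np.zeros(3), np.eye(3)

    def read_head_tracker():
        # Placeholder for head tracker 12: position, orientation, and
        # the relative direction of the true physical gravity vector.
        return np.zeros(3), np.eye(3), np.array([0.0, -1.0, 0.0])

    def render(video_frame, board_pose, head_pose, gravity_dir):
        # Placeholder for computer 14 combining the video with
        # virtual objects 15 into an augmented reality image.
        return video_frame

    for _ in range(3):  # one iteration per displayed frame
        video_frame = capture_video()
        board_pos, board_rot = track_board(video_frame)
        head_pos, head_rot, gravity_dir = read_head_tracker()
        image = render(video_frame, (board_pos, board_rot),
                       (head_pos, head_rot), gravity_dir)
        # image would then be output to display screens 13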
[0049] FIG. 6 is a schematic diagram of computer 14. Computer 14
can include, but is not limited to, a processor 17, memory 18,
input unit 21, and output unit 22. Memory 18 can include, but is
not limited to, one or more of a game engine 19 and a physics
engine 20. Game engine 19 can be software stored in memory 18 and
can be used for generating the graphics associated with the virtual
objects 15 and the virtual objects' interactions with physical
objects in the computer interaction environment. Accordingly,
although this embodiment describes that memory 18 can include game
engine 19, it is not necessary that memory 18 include game engine
19. Further, physics engine 20 can be software stored in memory 18
and can be used to simulate physical interactions in the computer
interaction system. Although this embodiment describes that physics
engine 20 can be software stored in memory 18, this is not
necessary. Physics engine 20 can be a separate processor in
computer 14 or a combination of a separate processor in computer 14
and software. Alternatively, processor 17 can include physics
engine 20. Physics engine 20 can perform the calculations needed to
model the physical interactions of virtual objects 15 with
additional virtual objects and, if desired, with physical objects,
according to the laws of physics. The present embodiment can
include the Newton Game Dynamics (http://www.newtondynamics.com)
physics engine, but other physics engines can be used, such as
Havok Physics (http://www.havok.com), NVIDIA PhysX
(http://www.nvidia.com/object/physx_ne.html), etc. Input unit 21
and output unit 22 can link computer 14 to head-worn unit 10. Input
unit 21 and output unit 22 can be independent of each other or can
be integrated together. Further, input unit 21 and output unit 22
can be wireless communication devices or wired communication
devices.
[0050] FIG. 7 outlines the operation of receiving orientation and
position information and determining relevant information. Memory
18 can store tracking software that can instruct processor 17 to
process the information (e.g., video, images, data, etc.) output by
optical sensors 11 in S1-A and S1-C. Specifically, when optical
sensors 11 output a video stream or other data to computer 14, the
tracking software can instruct processor 17 to process the location
of optical markers 23 in the video stream or other data and to
determine one or more of the position and orientation of board 16
relative to optical sensors 11. The software can instruct processor
17 to store the position and orientation of board 16 relative to
optical sensors 11 at discrete time intervals, each time interval
being a predetermined length of time (e.g., 0.03 seconds). S1-A and
S1-C can occur separately or simultaneously.
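For purpose of illustration only, and not limitation, the sampling of
board pose at discrete time intervals described above could take the
following form; the callable get_board_pose and the one-second
duration are illustrative assumptions.

    import time

    DT = 0.03  # predetermined length of each discrete time interval (s)

    def sample_poses(get_board_pose, duration=1.0):
        """Store the pose of board 16 relative to optical sensors 11
        at discrete time intervals, as in S1-A and S1-C.
        get_board_pose returns a (position, orientation) pair."""
        samples = []
        t_end = time.monotonic() + duration
        while time.monotonic() < t_end:
            samples.append((time.monotonic(), get_board_pose()))
            time.sleep(DT)
        return samples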
[0051] A similar process occurs in S1-B and S1-D with respect to
tracking data output from head tracker 12. Specifically, when head
tracker 12 outputs head tracking data to computer 14, the tracking
software can instruct processor 17 to process the location of head
tracker 12, if necessary, and to determine one or more of the
position and orientation of head tracker 12 relative to reference
24. The software can instruct processor 17 to store the position
and orientation of head tracker 12 relative to reference 24 in
memory 18 at discrete time intervals, each time interval being a
predetermined length of time (e.g., 0.03 seconds). S1-B and S1-D
can occur separately or simultaneously. If necessary, processor 17
can store information about the relative positions of head tracker
12 and optical sensors 11 in memory 18.
[0052] For each successive change in position of board 16 over a
discrete time interval, the tracking software can instruct the
processor to determine a board movement vector, as shown in S2-A.
The processor can also determine a board rotation tensor for the
successive change in orientation of board 16 over each discrete
time interval as shown in S2-C.
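For purpose of illustration only, and not limitation, S2-A and S2-C
could be computed from two successive stored poses as sketched below,
with 3x3 rotation matrices standing in for the board rotation tensor.

    import numpy as np

    def board_movement_vector(p_prev, p_curr):
        # S2-A: translation of board 16 over one discrete time interval.
        return p_curr - p_prev

    def board_rotation_tensor(R_prev, R_curr):
        # S2-C: relative rotation taking the previous orientation to
        # the current one; for orthonormal rotation matrices the
        # inverse is the transpose.
        return R_curr @ R_prev.T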
[0053] Further, the tracking software can instruct processor 17 to
process one or more of the orientation and position information
output by head tracker 12 to determine the position and orientation
of head tracker 12 relative to reference 24. Processor 17 can also
determine one or more of a head tracker movement vector and a head
tracker rotation tensor for each successive change in position of
head tracker 12 and orientation of head tracker 12, respectively, as
shown in S2-B and S2-D. If head tracker 12 is not configured to output
information regarding the relative direction of the true physical
gravity vector to computer 14, the tracking software can instruct
processor 17 to determine the direction of the true physical
gravity vector relative to head tracker 12 from the position and
orientation of head tracker 12 relative to reference 24 as shown in
S2-E.
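For purpose of illustration only, and not limitation, the S2-E
determination could be performed as follows; the convention that
R_head rotates head-tracker coordinates into reference coordinates is
an assumption of the sketch.

    import numpy as np

    # Unit vector toward earth 25, expressed in the frame of reference 24.
    GRAVITY_DIR_REF = np.array([0.0, -1.0, 0.0])

    def gravity_in_tracker_frame(R_head):
        """S2-E: express the true physical gravity direction in head
        tracker coordinates. R_head is assumed to rotate head-tracker
        coordinates into reference coordinates, so its transpose maps
        reference-frame vectors into the tracker frame."""
        return R_head.T @ GRAVITY_DIR_REF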
[0054] The tracking software can also instruct processor 17 to
perform a compensation procedure to account for motion of optical
sensors 11, as shown in FIG. 8. In S3-A, processor 17 can compare
the magnitude of head tracker movement vector with a predetermined
threshold distance stored in memory 18. This predetermined
threshold distance stored in memory 18 can be determined based on
the particular implementation of the application (e.g., smaller
thresholds for complex simulations such as simulated medical
training, and larger thresholds for simple simulations such as
basic physics simulations). If the magnitude of head tracker
movement vector exceeds or is equal to the predetermined threshold
distance, processor 17 can determine that head tracker 12 has
changed position. If the magnitude of head tracker movement vector
is less than the predetermined threshold distance, processor 17 can
determine that head tracker 12 has not changed position.
[0055] If processor 17 determines that head tracker 12 has changed
position, then processor 17 can determine a resultant rotation from
head tracker rotation tensor and compare the resultant rotation of
head tracker 12 with a predetermined threshold rotation value
stored in memory 18 as shown in S4-A. This predetermined threshold
rotation value stored in memory 18 can be determined based on the
particular implementation of the application (e.g., smaller
thresholds for complex simulations such as simulated medical training, and
larger thresholds for simple simulations such as basic physics
simulations). If the resultant rotation of head tracker 12 exceeds
or is equal to the predetermined threshold rotation value,
processor 17 can determine that head tracker 12 has changed both
position and orientation. If the resultant rotation of head tracker
12 is less than the predetermined threshold rotation value,
processor 17 can determine that head tracker 12 has not changed
orientation but has changed position.
[0056] If processor 17 determines that head tracker 12 has not
changed position, then processor 17 can determine a resultant
rotation from head tracker rotation tensor and compare the
resultant rotation of head tracker 12 with a predetermined
threshold rotation value stored in memory 18 as shown in S3-C. If
the resultant rotation of head tracker 12 exceeds or is equal to
the predetermined threshold rotation value, processor 17 can
determine that head tracker 12 has changed orientation but not
position. If the resultant rotation of head tracker 12 is less than
the predetermined threshold rotation value, processor 17 can
determine that head tracker 12 has not changed position or
orientation.
[0057] Alternatively, one or more of the comparisons of S3-A, S3-C,
and S4-A can occur simultaneously or as part of the same process.
It is not necessary to separately compare relative rotation
information and movement information of head tracker 12.
Accordingly, relative rotation information and movement information
of head tracker 12 can be compared with a combined threshold
value.
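For purpose of illustration only, and not limitation, the comparisons
of S3-A, S4-A, and S3-C could be combined as sketched below; the
extraction of the resultant rotation angle from the trace of an
orthonormal rotation tensor is one conventional choice, and the
thresholds are supplied by the caller as described above.

    import numpy as np

    def classify_head_motion(move_vec, rot_tensor, dist_thresh, rot_thresh):
        """Return (position_changed, orientation_changed) per
        S3-A, S4-A, and S3-C."""
        position_changed = np.linalg.norm(move_vec) >= dist_thresh  # S3-A
        # Resultant rotation angle of an orthonormal rotation tensor:
        # trace(R) = 1 + 2*cos(angle).
        cos_angle = np.clip((np.trace(rot_tensor) - 1.0) / 2.0, -1.0, 1.0)
        orientation_changed = np.arccos(cos_angle) >= rot_thresh  # S4-A/S3-C
        return position_changed, orientation_changed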
[0058] If processor 17 determines that both the position and
orientation of head tracker 12 have changed, processor 17 can apply
adjustments to the corresponding board movement vector and the
corresponding board rotation tensor to account for the change in
position and orientation of head tracker 12 and optical sensors 11,
as described in S5-A. Specifically, because optical sensors 11 are
rigidly fixed to head tracker 12, the orientation changes and
position changes of optical sensors 11 can be determined by
applying the proper adjustments. Here, the adjustments include
subtracting the head tracker movement vector from the board
movement vector for corresponding discrete time intervals and
applying frame rotations to correct for changes in orientation in
the proper sequence. This adjustment produces an adjusted board
movement vector and an adjusted board rotation tensor which more
accurately model motion of board 16.
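For purpose of illustration only, and not limitation, one concrete
reading of the S5-A adjustments is sketched below, with rotation
matrices standing in for the tensors and all quantities assumed to be
expressed in a common frame for the interval.

    import numpy as np

    def adjust_board_motion(board_move, board_rot, head_move, head_rot):
        """S5-A: remove head tracker (and thus optical sensor) motion
        from the measured motion of board 16."""
        # Subtract the head tracker movement vector for the interval.
        adjusted_move = board_move - head_move
        # Remove the head rotation so only the board's own rotation
        # remains; the inverse of an orthonormal rotation matrix is
        # its transpose.
        adjusted_rot = head_rot.T @ board_rot
        return adjusted_move, adjusted_rot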
[0059] If processor 17 determines that the position of head tracker
12 has changed but not the orientation, processor 17 can apply
adjustments to the corresponding board movement vector to account
for the change in position of head tracker 12 and optical sensors
11, as described in S5-B. The position changes of optical sensors
11 can be determined by applying the proper sequence of
translations. Here, the adjustments include subtracting the head
tracker movement vector from the board movement vector for
corresponding discrete time intervals. This adjustment produces an
adjusted board movement vector which more accurately models motion
of board 16 and retains the previously determined board rotation
tensor.
[0060] If processor 17 determines that the orientation of head
tracker 12 has changed but not the position, processor 17 can apply
adjustments to the corresponding board rotation tensor to account
for the change in orientation of head tracker 12 and optical
sensors 11, as described in S5-C. The orientation changes of
optical sensors 11 can be determined by applying the proper
sequence of rotations. Here, the adjustments include applying
coordinate transformations to the board rotation tensor which
remove effects of the resultant rotation of the second tracking
mechanism in the proper sequence. This adjustment produces an
adjusted board rotation tensor which more accurately models motion
of board 16 and retains the previously determined board movement
vector.
[0061] If processor 17 determines that both the position and
orientation of head tracker 12 have not changed, processor 17 is
not instructed to perform an adjustment and proceeds directly to
simulation.
[0062] If processor 17 has determined an adjusted board movement
vector and an adjusted board rotation tensor, processor 17 can use
the information regarding the relative direction of the true
physical gravity vector determined in S2-E, the adjusted board
movement vector, and the adjusted board rotation tensor to simulate
the motion of a virtual object 15, such as ball 15a, under a true
virtual gravity vector based on the actual motion of board 16, as
described in S6-A.
[0063] If processor 17 has determined an adjusted board movement
vector but has not adjusted the board rotation tensor,
processor 17 can use the information about the relative direction
of the true physical gravity vector determined in S2-E, the
adjusted board movement vector, and the unadjusted board rotation
tensor to simulate the motion of a virtual object 15, such as ball
15a, under a true virtual gravity vector based on the actual motion
of board 16, as described in S6-B.
[0064] If processor 17 has determined an adjusted board rotation
tensor but has not adjusted the board movement vector, processor 17
can use the information about the relative direction of the true
physical gravity vector determined in S2-E, the unadjusted board
movement vector, and the adjusted board rotation tensor to simulate
the motion of a virtual object 15, such as ball 15a, under a true
virtual gravity vector based on the actual motion of board 16, as
described in S6-C.
[0065] If processor 17 has not adjusted the board movement vector
and the board rotation tensor, processor 17 can use the information
about the relative direction of the true physical gravity vector
determined in S2-E, the unadjusted board movement vector, and the
unadjusted board rotation tensor to simulate the motion of a
virtual object 15, such as ball 15a, under a true virtual gravity
vector based on the actual motion of board 16, as described in
S6-D.
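For purpose of illustration only, and not limitation, S6-A through
S6-D differ only in which inputs are fed to the simulation, so the
four cases could be expressed as a single selection step:

    def select_simulation_inputs(move_adjusted, rot_adjusted,
                                 move, adj_move, rot, adj_rot):
        """Pick the inputs for S6-A (both adjusted), S6-B (movement
        only), S6-C (rotation only), or S6-D (neither). The selected
        pair, together with the gravity direction from S2-E, is then
        passed to the physics engine to simulate ball 15a."""
        board_move = adj_move if move_adjusted else move
        board_rot = adj_rot if rot_adjusted else rot
        return board_move, board_rot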
[0066] Once processor 17 performs one or more of S6-A, S6-B, S6-C, and
S6-D to simulate the motion of a virtual object 15, computer 14
outputs this simulation as augmented reality information or virtual
reality information to head-worn unit 10. Accordingly, displays 13
can display a combination of virtual objects 15 and physical
objects, such as board 16, in real-time. Alternatively, displays 13
can display only virtual objects 15.
[0067] For purpose of explanation and illustration, and not
limitation, another exemplary embodiment of the interaction system
in accordance with the application is shown in FIGS. 9A-12. For
brevity, only the aspects of this exemplary embodiment that
are different from the previously described embodiment will be
described.
[0068] FIG. 9A depicts the front side of a head-worn unit 110. The
head-worn unit 110 can include a pair of optical sensors 111 (first
tracking mechanisms) for detecting optical markers on a gameboard
(see, e.g., optical marker 23 on gameboard 16 in FIG. 3). Optical
sensors 111 detect the position and orientation of optical markers
and transmit that information to the processing device 114 (not
shown). Further, head-worn unit 110 can include a head motion
detector 126 (a detecting mechanism) for detecting at least one of
an acceleration of the head motion detector 126, a velocity of the
head motion detector 126, and an orientation of head motion
detector 126. Head motion detector 126 can be rigidly fixed
relative to optical sensors 111, but is not limited to being so
fixed as long as the position of head motion detector 126 relative
to at least one optical sensor 111 is predetermined. Accordingly,
the orientation, position, and motion of at least one optical
sensor 111 can be readily determined from the motion of head motion
detector 126 if head motion detector 126 is rigidly fixed relative
to at least one optical sensor 111, or if the orientation and
position of head motion detector 126 relative to at least one
optical sensor 111 is otherwise known. Alternatively or
additionally, head motion detector 126 can be configured to detect
a correct direction and magnitude of a physical gravity vector.
Further, head motion detector 126 can alternatively include one or
more distinctly separate motion detectors each configured to detect
one or more of correct direction and magnitude of a physical
gravity vector, acceleration of the head motion detector 126, a
velocity of the head motion detector 126, and an orientation of
head motion detector 126. Head motion detector 126 can include
accelerometers, gyroscopes, magnetometers, etc.
[0069] FIG. 9B depicts the back side of head-worn unit 110. Head
motion detector 126 can be disposed on the back side of head-worn
unit 110. Further, display devices 113 can be disposed on the back
side of head-worn unit 110. Display devices 113 can be configured
to display augmented reality images or virtual reality images as
described in other embodiments of the application.
[0070] FIG. 10 is a schematic diagram depicting the flow of
information between head-worn unit 110 and processing device 114.
Optical sensors 111 can output images or video streams to
processing device 114, as described in other embodiments of the
application. Processing device 114 performs functions similar to
computer 14 (e.g., FIG. 5), as described in the other embodiments
of the application. Head motion detector 126 can track the motion
of head-worn unit 110. Head motion detector 126 can include one or
more of, but is not limited to, a three-axis accelerometer, a
three-axis magnetometer, a three-axis gyroscope, and a full
six-degrees-of-freedom tracker. Further, head motion detector 126
can detect the relative direction of the true physical gravity
vector acting on head-worn unit 110 and head motion detector 126.
Head motion detector 126 can output this motion information, which
includes information regarding the relative direction of the true
physical gravity vector, to processing device 114. Processing
device 114 can then process the information to create augmented
reality information, as will be described herein. Alternatively,
processing device 114 can be configured to create purely virtual
reality information. Processing device 114 can then output the
augmented reality or virtual reality information to display screens
113. The augmented reality information can be, but is not limited
to, an augmented reality video stream including a sequence of
augmented reality images. Further, the virtual reality information
can be, but is not limited to, a virtual reality video stream
including a sequence of virtual reality images. Although this
embodiment describes that processing device 114 can be separate
from head-worn-unit 110, head-worn unit 110 can include processing
device 114. Processing device 114 can be any processing device.
Further, one or more of optical sensors 111 and head motion detector
126 can be configured to output one or more of position
information, orientation information, information regarding the
relative direction of the true physical gravity vector, and head
motion detector motion information alone or in combination with
other information, to processing device 114.
[0071] FIG. 11 outlines the operation of receiving orientation,
position, and motion information and determining relevant
information. Processing device 114 can process the information
(e.g., video, images, data, etc.) output by optical sensors 111 in
S11-A and S11-B. Specifically, when optical sensors 111 output a
video stream or other data to processing device 114, processing
device 114 can process the location of optical markers in the video
stream or other data and can determine one or more of the position
and orientation of a gameboard relative to optical sensors 111.
Processing device 114 can store the position and orientation of the
gameboard relative to optical sensors 111 at discrete time
intervals, each time interval being a predetermined length of time
(e.g., 0.03 seconds). S11-A and S11-B can occur separately or
simultaneously.
[0072] A similar process occurs in S11-C and S11-D with respect to
head motion detector motion information and physical gravity vector
information output from head motion detector 126. Specifically,
when head motion detector 126 outputs head motion detector motion
information and physical gravity vector information to processing
device 114, processing device 114 can process the motion
information of head motion detector 126, if necessary, and determine
one or more of the acceleration and velocity of head motion
detector 126 relative to a reference frame. Processing device 114 can
store the acceleration and velocity of head motion detector 126
relative to a reference frame at discrete time intervals, each time
interval being a predetermined length of time (e.g., 0.03 seconds).
S11-C and S11-D can occur separately or simultaneously. If
necessary, processing device 114 can store information about the
relative positions of head motion detector 126 and optical sensors
111.
[0073] For each successive change in position of the gameboard over
a discrete time interval, processing device 114 can determine a
board movement vector, as shown in S12-A. Processing device 114 can
also determine a board rotation tensor for the successive change in
orientation of the gameboard over each discrete time interval as
shown in S12-B.
[0074] Further, in S12-C, processing device 114 can process the
motion information output by head motion detector 126
to determine one or more of the velocity and acceleration of head
motion detector 126 relative to the reference frame.
[0075] Processing device 114 can perform a compensation procedure
to account for motion of optical sensors 111, as shown in FIGS. 12A
and 12B. Processing device 114 performs a comparison of
acceleration values in S13 if processing device 114 determines the
acceleration of head motion detector 126 relative to the reference
frame in S12-C. In S13 of FIG. 12A, processing device 114 can
compare the magnitude of acceleration of head motion detector 126
over a predetermined time with a predetermined threshold
acceleration value. This predetermined threshold acceleration value
can be determined based on the particular implementation of the
application (e.g., smaller thresholds for complex simulations such
as simulated medical training, and larger thresholds for simple
simulations such as basic physics simulations). If the magnitude of
acceleration of head motion detector 126 over the predetermined
time is greater than or equal to the predetermined threshold
acceleration value, processing device 114 can determine that head
motion detector 126 has changed at least one of position and
orientation. If the magnitude of acceleration of head motion
detector 126 over the predetermined time is less than the
predetermined threshold acceleration value, processing device 114
can determine that head motion detector 126 has not changed
position and orientation, unless processing device 114 determines
that head motion detector 126 has changed at least one of position
and orientation as a result of a velocity comparison in S23, which
will be explained herein.
[0076] If processing device 114 determines that head motion
detector 126 has changed at least one of position and orientation
as a result of acceleration, processing device 114 can apply
adjustments to the corresponding at least one of board movement
vector and board rotation tensor to account for the change in at
least one of position and orientation of head motion detector 126
and optical sensors 111, as described in S14-A. Specifically,
because optical sensors 111 are rigidly fixed to head motion
detector 126, or have an otherwise known relative position, the
orientation changes and position changes of optical sensors 111 can
be determined by applying the proper adjustments. Here, the
adjustments can include one or more heuristics for compensation,
including, but not limited to: ignoring changes in one or more of
board position information and board orientation information over
the corresponding discrete time interval when simulating the
physics of the virtual reality object, while simulating a
physically accurate field of vision. These adjustments produce at
least one of an adjusted board movement vector and an adjusted
board rotation tensor which more accurately model motion of the
gameboard.
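For purpose of illustration only, and not limitation, one possible
form of the S14-A heuristic is sketched below: apparent board pose
changes are ignored for physics purposes whenever the head motion
detector reports motion, and rendering continues from the measured
pose so that the field of vision remains physically accurate. The
names are illustrative.

    def s14a_heuristic(prev_pose, measured_pose, head_in_motion):
        """Return (pose for physics simulation, pose for rendering)."""
        # Ignore apparent board motion while the head is moving (it is
        # likely sensor motion, not board motion), but keep rendering
        # from the measured pose for an accurate field of vision.
        physics_pose = prev_pose if head_in_motion else measured_pose
        return physics_pose, measured_pose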
[0077] Processing device 114 performs a comparison of velocity
values in S23 if processing device 114 determines the velocity of head
motion detector 126 relative to the reference frame in S12-C. In
S23 of FIG. 12B, processing device 114 can compare the magnitude of
velocity of head motion detector 126 over a predetermined time with
a predetermined threshold velocity value. This predetermined
threshold velocity value can be determined based on the particular
implementation of the application (e.g., smaller thresholds for
complex simulations such as simulated medical training, and larger
thresholds for simple simulations such as basic physics
simulations). If the magnitude of velocity of head motion detector
126 over the predetermined time is greater than or equal to the
predetermined threshold velocity value, processing device 114 can
determine that head motion detector 126 has changed at least one of
position and orientation. If the magnitude of velocity of head
motion detector 126 over the predetermined time is less than the
predetermined threshold velocity value, processing device 114 can
determine that head motion detector 126 has not changed position
and orientation, unless processing device 114 determines that head
motion detector 126 has changed at least one of position and
orientation as a result of an acceleration comparison in S13, as
explained above.
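For purpose of illustration only, and not limitation, the S13 and S23
comparisons could be implemented together as sketched below; the
default threshold values are illustrative and would be tuned per
application as described above.

    import numpy as np

    def head_motion_detected(accel, vel, accel_thresh=0.5, vel_thresh=0.05):
        """S13/S23: report head motion if either the acceleration
        magnitude or the velocity magnitude meets its predetermined
        threshold."""
        return (np.linalg.norm(accel) >= accel_thresh or
                np.linalg.norm(vel) >= vel_thresh)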
[0078] If processing device 114 infers that head motion detector
126 has changed at least one of position and orientation as a
result of velocity, processing device 114 can apply adjustments
to the corresponding at least one of board movement vector and
board rotation tensor to account for the change in at least one of
position and orientation of head motion detector 126 and optical
sensors 111, as described in S24-A. Specifically, because optical
sensors 111 are rigidly fixed to head motion detector 126, or have
an otherwise known relative position, the orientation changes and
position changes of optical sensors 111 can be determined by
applying the proper adjustments. Here, the adjustments include one
or more of subtracting the head motion detector movement vector from the
board movement vector for corresponding discrete time intervals and
applying frame rotations to correct for changes in orientation in
the proper sequence. This adjustment produces at least one of an
adjusted board movement vector and an adjusted board rotation
tensor which more accurately model motion of the gameboard.
[0079] If processing device 114 has adjusted one or more of the
board movement vector and the board rotation tensor, processing
device 114 can use the information about the relative direction of
the true physical gravity vector received in S11-D and one or more
of the adjusted board movement vector determined in S14-A or S24-A,
and the adjusted board rotation tensor determined in S14-A or S24-A
to simulate the motion of a virtual object under a true virtual
gravity vector based on the actual motion of the gameboard, as
described in S15-A (comparing acceleration) or S25-A (comparing
velocity). If an adjusted board movement vector is determined in
both S14-A and S24-A, the processor can be programmed to select one
of the adjusted board movement vector determined in S14-A and the
adjusted board movement vector determined in S24-A, or the processor
can be programmed to implement only one of S15-A and S25-A.
Likewise, if an adjusted board rotation tensor is determined in both
S14-A and S24-A, the processor can be programmed to select one of
the adjusted board rotation tensor determined in S14-A and the
adjusted board rotation tensor determined in S24-A, or the processor
can be programmed to implement only one of S15-A and S25-A.
[0080] If processing device 114 infers that both the position and
orientation of head motion detector 126 have not changed as a
result of acceleration (S13) or velocity (S23), processing device
114 does not perform an adjustment and proceeds directly to
simulation.
[0081] If processing device 114 has not adjusted the board movement
vector and the board rotation tensor, processing device 114 can use
the information about the relative direction of the true physical
gravity vector received in S11-D, the unadjusted board movement
vector determined in S12-A, and the unadjusted board rotation
tensor determined in S12-B to simulate the motion of a virtual
object under a true virtual gravity vector based on the actual
motion of the gameboard, as described in S15-B (comparing
acceleration) or S25-B (comparing velocity).
[0082] Once processing device 114 performs one or more of S15-A,
S25-A, S15-B and S25-B to simulate the motion of at least one
virtual object, processing device 114 outputs this simulation as
augmented reality information or virtual reality information to
head-worn unit 110. Accordingly, displays 113 can display a
combination of virtual objects and physical objects, such as the
gameboard, in real-time. Alternatively, displays 113 can display
only virtual objects.
[0083] As described above, a virtual gravitational force can act on
ball 15a as natural gravity would act on a physical object placed
on board 16. Thus, even when the head-worn unit 10 moves
independently of board 16 or changes orientation with respect to
board 16, the virtual force of gravity will appear to act in the
true physical gravity vector direction with respect to board
16.
[0084] The computer interaction environment is not limited to the
embodiments described above. Interaction delivery devices other
than head-worn unit 10 can be used in place of or in combination
with head-worn unit 10. For example, interaction delivery devices
can include tracked physical displays that are held, worn on
additional or alternative parts of the body, or mounted on objects
in the environment. Alternatively, one or more of the first
tracking mechanism, the second tracking mechanism, and each display
device, individually or in combination, can be disposed separately
from the interaction delivery device.
[0085] Further, the first tracking mechanism can be a device other
than optical sensors 11 and can include fewer than two optical
sensors or other tracking mechanisms. For example, the first
tracking mechanism can include depth cameras or acoustic tracking
devices, such as sonar devices, etc. Additionally, second tracking
mechanisms other than head tracker 12 can be used in place of or in
combination with head tracker 12. For example, the second tracking
mechanism can be fixed to earth or to an object moving relative
to earth rather than being fixed to or at a predetermined position
relative to the first tracking mechanism. Here, the second tracking
mechanism can track the first tracking mechanism as the reference.
The second tracking mechanism can include one or more of a
three-axis accelerometer, a three-axis magnetometer, a three-axis
gyroscope, and a full six-degrees-of-freedom tracker. Additionally,
the second tracking mechanism can determine the relative direction
of a true force vector acting in the force's natural direction
(true physical vector direction) other than that of gravity,
including, but not limited to, the true vectors of magnetic forces,
electrostatic forces, friction, additional artificial forces,
etc.
[0086] Further, the tracking software can be configured to cause
the processor to determine only one of a movement vector and a
rotation tensor for each of the board and the second tracking
mechanism. Alternatively, the tracking software can be configured
to cause the processor to determine both a movement vector and a
rotation tensor for each of the board and the second tracking
mechanism as described above. Additionally, the tracking software
can be configured to cause the processor to apply an adjustment to
only movement vectors, only rotation tensors, or some combination
of rotation tensors and movement vectors.
[0087] Further, the simulated forces are not limited to
gravitational forces applied in the true physical gravity vector
direction. The virtual gravity vector can be intentionally
different from the true physical gravity vector (e.g., a not true
virtual gravity vector). For example, the simulated gravitational
force can have a different magnitude or act in a direction
different from the true physical gravity vector. Additionally, the
simulated force can include, but is not limited to, magnetic
forces, electrostatic forces, and additional artificial forces in
true or not true directions. Thus, the virtual gravity vector can
be an intentionally not true virtual gravity vector.
[0088] In addition to the specific embodiments and features
disclosed herein, this application also incorporates by reference
the entire disclosure of each and every patent publication
identified herein. This application therefore includes any possible
combination of the various features disclosed, incorporated by
reference or claimed herein. As such, the particular features
presented in the dependent claims and disclosed above can be
combined with each other in other manners within the scope of the
application such that the application should be recognized as also
specifically directed to other embodiments having any other
possible combinations. Thus, the foregoing description of specific
embodiments of the application has been presented for purposes of
illustration and description. It is not intended to be exhaustive
or to limit the application to those embodiments disclosed.
* * * * *