U.S. patent application number 15/044053, "Use of Comparative Sensor Data to Determine Orientation of Head Relative to Body," was published by the patent office on 2016-06-09. The applicants listed for this patent are Michael Patrick Johnson and Thad Eugene Starner. The invention is credited to Michael Patrick Johnson and Thad Eugene Starner.

Application Number: 15/044053
Publication Number: 20160161240
Family ID:          55314598
Filed:              2016-02-15
Published:          2016-06-09
United States Patent Application 20160161240
Kind Code: A1
Starner; Thad Eugene; et al.
June 9, 2016

Use of Comparative Sensor Data to Determine Orientation of Head Relative to Body
Abstract
Methods and systems are described that involve a wearable computing device, or an associated device, determining the orientation of a person's head relative to their body. To do so, example methods and systems may compare sensor data from the wearable computing device to corresponding sensor data from a tracking device that is expected to move in a manner that follows the wearer's body, such as a mobile phone located in the wearer's pocket.
Inventors: Starner; Thad Eugene (Mountain View, CA); Johnson; Michael Patrick (Sunnyvale, CA)

Applicant:
  Name                      City           State  Country
  Starner; Thad Eugene      Mountain View  CA     US
  Johnson; Michael Patrick  Sunnyvale      CA     US

Family ID: 55314598
Appl. No.: 15/044053
Filed:     February 15, 2016
Related U.S. Patent Documents

  Application Number  Filing Date   Patent Number
  13631454            Sep 28, 2012  9268136
  15044053
Current U.S. Class:  702/150
Current CPC Class:   G01C 17/02 20130101; G01C 19/00 20130101; G01P 15/00 20130101; G02B 2027/0187 20130101; G06F 3/03547 20130101; G06F 3/0346 20130101; G01B 7/00 20130101; G06F 3/017 20130101; G06F 3/011 20130101; G02B 2027/0178 20130101; G02B 27/0179 20130101; G02B 27/0093 20130101; G02B 27/017 20130101
International Class: G01B 7/00 20060101 G01B007/00; G01C 19/00 20060101 G01C019/00; G01P 15/00 20060101 G01P015/00; G01C 17/02 20060101 G01C017/02; G06F 3/01 20060101 G06F003/01; G02B 27/01 20060101 G02B027/01
Claims
1. A computer-implemented method comprising: detecting, by a
computing device, sensor data that is indicative of an association
between movement of a tracking device and body movement; in
response to detecting the sensor data that is indicative of the
association: determining a forward-backward body axis of a body
corresponding to a wearable computing device; and determining a
base orientation of the tracking device relative to the
forward-backward body axis; determining a first orientation of the
wearable computing device relative to the tracking device; and
determining a first head orientation relative to the body based on
both: (a) the first orientation of the wearable computing device
relative to the tracking device and (b) the base orientation of the
tracking device relative to the forward-backward body axis.
2. The method of claim 1, wherein determining the first head
orientation relative to the body comprises offsetting the first
orientation of the wearable computing device relative to the
tracking device by the base orientation of the tracking device
relative to the forward-backward body axis.
3. The method of claim 1, wherein the computing device is the
wearable computing device.
4. The method of claim 1, wherein the tracking device is a mobile
phone.
5. The method of claim 1, wherein detecting the sensor data that is indicative of the association comprises: receiving sensor data from at least one of the wearable computing device and the tracking device, wherein the sensor data is indicative of movement; and determining that the sensor data is characteristic of movement along the forward-backward body axis.
6. The method of claim 5, wherein determining that the sensor data
is characteristic of movement along the forward-backward body axis
comprises determining that the sensor data is characteristic of
walking.
7. The method of claim 5, wherein the sensor data comprises at
least one of: (a) gyroscope data associated with the wearable
computing device, (b) accelerometer data associated with the
wearable computing device, (c) gyroscope data associated with the
tracking device, and (d) accelerometer data associated with the
tracking device.
8. The method of claim 1, wherein determining a forward-backward
body axis of a body comprises determining a direction of forward
body movement.
9. The method of claim 1, wherein determining a base orientation of
the tracking device relative to the forward-backward body axis
comprises determining an angle between a forward-facing direction
of the tracking device and the directional component along the
forward-backward body axis.
10. The method of claim 1, wherein the first orientation of the
wearable computing device relative to the tracking device is
determined based on both: (a) magnetometer data associated with the
wearable computing device and (b) magnetometer data associated with
the tracking device.
11. The method of claim 10, wherein determining the first
orientation of the wearable computing device relative to the tracking device comprises:
determining a first orientation of the wearable computing device
relative to magnetic north based on the magnetometer data
associated with the wearable computing device; determining a first
orientation of the tracking device relative to magnetic north based
on the magnetometer data associated with the tracking device; and
determining the first orientation of the wearable computing device
relative to the tracking device based on a difference between (a)
the first orientation of the wearable computing device relative to
magnetic north and (b) the first orientation of the tracking device
relative to magnetic north.
12. The method of claim 11, wherein determining the first head
orientation relative to the body comprises: offsetting the first
orientation of the wearable computing device relative to the
tracking device by the base orientation of the tracking device
relative to the forward-backward body axis.
13. The method of claim 1, further comprising initiating a
computing action based on the first head orientation relative to
the body.
14. The method of claim 1, wherein determining a first head
orientation relative to the body comprises determining a rotation
of the head relative to the forward-backward body axis.
15. The method of claim 1, wherein determining a first head
orientation relative to the body comprises determining two or more
of: (a) a rotation of the head relative to the forward-backward
body axis, (b) a pitch of the head relative to an upward-downward
body axis and (c) a yaw of the head relative to the
forward-backward body axis and the upward-downward body axis.
16. A non-transitory computer readable medium having stored therein
instructions that are executable to cause a computing device to
perform functions comprising: detecting sensor data that is
indicative of an association between movement of a tracking device
and body movement; in response to detecting the sensor data that is
indicative of the positional association: determining a
forward-backward body axis of a body associated with a wearable
computing device; and determining a base orientation
θ_TD-B of the tracking device relative to the
forward-backward body axis; determining a first orientation of the
wearable computing device relative to the tracking device; and
determining a first head orientation relative to the body based on
both: (a) the first orientation of the wearable computing device
relative to the tracking device and (b) the base orientation
θ_TD-B of the tracking device relative to the
forward-backward body axis.
17. A computing system comprising: a non-transitory computer
readable medium; program instructions stored on the non-transitory
computer readable medium and executable by at least one processor
to: detect sensor data that is indicative of an association between
movement of a tracking device and body movement; in response to
detecting the sensor data that is indicative of the positional
association: determine a forward-backward body axis of a body
associated with a wearable computing device; and determine a base
orientation of the tracking device relative to the forward-backward
body axis; determine a first orientation of the wearable computing
device relative to the tracking device; and determine a first head
orientation relative to the body based on both: (a) the first
orientation of the wearable computing device relative to the
tracking device and (b) the base orientation of the tracking device
relative to the forward-backward body axis.
18. The computing system of claim 17, wherein the computing system
is implemented in or takes the form of the wearable computing
device.
19. The computing system of claim 17, wherein the first orientation
of the wearable computing device relative to the tracking device is
determined based on both: (a) magnetometer data associated with the
wearable computing device and (b) magnetometer data associated with
the tracking device.
20. The computing system of claim 17, wherein the determined first
head orientation θ_H-B_1 relative to the body
comprises two or more of: (a) a rotation of the head relative to
the forward-backward body axis, (b) a pitch of the head relative to
an upward-downward body axis and (c) a yaw of the head relative to
the forward-backward body axis and the upward-downward body axis.
Description
CROSS-REFERENCE TO RELATED APPLICATION
[0001] This patent application is a continuation of U.S.
application Ser. No. 13/631,454, which was filed on Sep. 28, 2012,
the contents of which are entirely incorporated herein by reference
as if fully set forth in this application.
BACKGROUND
[0002] Unless otherwise indicated herein, the materials described
in this section are not prior art to the claims in this application
and are not admitted to be prior art by inclusion in this
section.
[0003] Computing devices such as personal computers, laptop
computers, tablet computers, cellular phones, and countless types
of Internet-capable devices are increasingly prevalent in numerous
aspects of modern life. Over time, the manner in which these
devices are providing information to users is becoming more
intelligent, more efficient, more intuitive, and/or less
obtrusive.
[0004] The trend toward miniaturization of computing hardware,
peripherals, as well as of sensors, detectors, and image and audio
processors, among other technologies, has helped open up a field
sometimes referred to as "wearable computing." In the area of image
and visual processing and production, in particular, it has become
possible to consider wearable displays that place a very small
image display element close enough to a wearer's (or user's) eye(s)
such that the displayed image fills or nearly fills the field of
view, and appears as a normal sized image, such as might be
displayed on a traditional image display device. The relevant
technology may be referred to as "near-eye displays."
[0005] Near-eye displays are fundamental components of wearable
displays, also sometimes called "head-mounted displays" (HMDs). A
head-mounted display places a graphic display or displays close to
one or both eyes of a wearer. To generate the images on a display,
a computer processing system may be used. Such displays may occupy
a wearer's entire field of view, or only occupy part of a wearer's
field of view. Further, head-mounted displays may be as small as a
pair of glasses or as large as a helmet.
[0006] Emerging and anticipated uses of wearable displays include
applications in which users interact in real time with an augmented
or virtual reality. Such applications can be mission-critical or
safety-critical, such as in a public safety or aviation setting.
The applications can also be recreational, such as interactive
gaming.
SUMMARY
[0007] In one aspect, an exemplary computer-implemented method may
involve a computing device: (i) detecting sensor data that is
indicative of an association between movement of a tracking device
and body movement; (ii) in response to detecting the sensor data
that is indicative of the positional association: (a) determining a
forward-backward body axis of a body and (b) determining a base
orientation of a tracking device relative to the forward-backward
body axis; (iii) determining a first orientation of a
head-mountable device (HMD) relative to the tracking device; and
(iv) determining a first head orientation relative to the body
based on both: (a) the first orientation of the HMD relative to the
tracking device and (b) the base orientation of the tracking device
relative to the forward-backward body axis.
[0008] In another aspect, a non-transitory computer readable medium
may have stored therein instructions that are executable to cause a
computing system to perform functions comprising: (i) detecting
sensor data that is indicative of an association between movement
of a tracking device and body movement; (ii) in response to
detecting the sensor data that is indicative of the positional
association: (a) determining a forward-backward body axis of a body
and (b) determining a base orientation of a tracking device
relative to the forward-backward body axis; (iii) determining a
first orientation of a head-mountable device (HMD) relative to the
tracking device; and (iv) determining a first head orientation
relative to the body based on both: (a) the first orientation of
the HMD relative to the tracking device and (b) the base
orientation of the tracking device relative to the forward-backward
body axis.
[0009] In a further aspect, a computing system may include a
non-transitory computer readable medium and program instructions
stored on the non-transitory computer readable medium. The program
instructions may be executable by at least one processor to: (i)
detect sensor data that is indicative of an association between
movement of a tracking device and body movement; (ii) in response
to detecting the sensor data that is indicative of the positional
association: (a) determine a forward-backward body axis of a body
and (b) determine a base orientation of a tracking device relative
to the forward-backward body axis; (iii) determine a first
orientation of a head-mountable device (HMD) relative to the
tracking device; and (iv) determine a first head orientation
relative to the body based on both: (a) the first orientation of
the HMD relative to the tracking device and (b) the base
orientation of the tracking device relative to the forward-backward
body axis.
[0010] These as well as other aspects, advantages, and
alternatives, will become apparent to those of ordinary skill in
the art by reading the following detailed description, with
reference where appropriate to the accompanying drawings.
BRIEF DESCRIPTION OF THE DRAWINGS
[0011] FIG. 1A illustrates a wearable computing system according to
an exemplary embodiment.
[0012] FIG. 1B illustrates an alternate view of the wearable
computing device illustrated in FIG. 1A.
[0013] FIG. 1C illustrates another wearable computing system
according to an exemplary embodiment.
[0014] FIG. 1D illustrates another wearable computing system
according to an exemplary embodiment.
[0015] FIG. 2 is a simplified illustration of a network via which
one or more devices may engage in communications, according to an
exemplary embodiment.
[0016] FIG. 3 is a flow chart illustrating a method, according to
an example embodiment.
[0017] FIG. 4A is a top-down illustration of a scenario in which an
HMD wearer has a tracking device on their person.
[0018] FIG. 4B is a top-down illustration of a scenario in which an
HMD wearer has a tracking device on their person while walking.
[0019] FIG. 5 is a flow chart illustrating a method for determining
the orientation of an HMD relative to a tracking device.
[0020] FIG. 6 illustrates a side-view of a scenario in which an HMD
wearer moves their head with respect to their body.
DETAILED DESCRIPTION
[0021] Exemplary methods and systems are described herein. It
should be understood that the word "exemplary" is used herein to
mean "serving as an example, instance, or illustration." Any
embodiment or feature described herein as "exemplary" is not
necessarily to be construed as preferred or advantageous over other
embodiments or features. The exemplary embodiments described herein
are not meant to be limiting. It will be readily understood that
certain aspects of the disclosed systems and methods can be
arranged and combined in a wide variety of different
configurations, all of which are contemplated herein.
I. OVERVIEW
[0022] There are scenarios where it may be useful for a
head-mountable display (HMD), such as a glasses-style wearable
computer, to know how a wearer's head is oriented with respect to
their body. For example, the position of the head relative to the
body may be used to provide augmented-reality style graphics in the
HMD, in which certain applications or graphics may then be rendered
at the same place relative to the body. As a specific example, a
wearer might look over their left shoulder to view an e-mail
application, and look over their right shoulder to view a web
browser. As such, the e-mail application and web browser may appear
to exist in space over the user's left and right shoulder,
respectively.
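The body-anchored rendering described above can be sketched in a few lines. The following is a hypothetical illustration only; the patent provides no code, and the function names, anchor angles, and field-of-view value are assumptions for the example:

```python
# Body-relative yaw (degrees) at which each application appears to "hang" in
# space around the wearer. Negative angles are over the left shoulder.
APP_ANCHORS = {
    "email": -90.0,    # over the left shoulder
    "browser": 90.0,   # over the right shoulder
}

FIELD_OF_VIEW_DEG = 40.0  # assumed horizontal field of view of the HMD display


def visible_apps(head_yaw_relative_to_body):
    """Return the apps whose body-anchored position falls inside the display's
    field of view for the given head yaw (degrees, relative to the body)."""
    half_fov = FIELD_OF_VIEW_DEG / 2.0
    return [
        name for name, anchor in APP_ANCHORS.items()
        if abs(anchor - head_yaw_relative_to_body) <= half_fov
    ]


print(visible_apps(-85.0))  # looking over the left shoulder -> ['email']
print(visible_apps(0.0))    # looking straight ahead -> []
```

Because the anchors are defined relative to the body rather than to magnetic north, the rendered applications stay fixed over the shoulders even as the wearer's whole body turns.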
[0023] An example embodiment may utilize data from an HMD and a
tracking device located on the body of the HMD wearer to determine
the orientation of the head relative to the wearer's body. The
tracking device may take various forms, such as a computing device
the wearer typically has on their person (e.g., a mobile phone) or
a separate dedicated tracking device that the user can put in their
pocket or attach to their clothing. Importantly, example
embodiments may help an HMD to determine head position relative to
a wearer's body, without requiring that the user mount the tracking
device at a certain location and in a certain position on the
body.
[0024] In an exemplary embodiment, an HMD (or a remote computing
system in communication with an HMD) may compare the HMD's sensor
data to the sensor data from a tracking device that is expected or
inferred to have a certain physical association to the wearer's
body. For example, the HMD may determine that the wearer has a
mobile phone in their pocket (or in another location where the
movement of the phone is generally expected to follow the movement
of the wearer's body). The HMD may then detect when the wearer is
walking and determine the forward direction by sensing which
direction the body is moving while walking. The HMD or mobile phone
can then use sensor data from the mobile phone to determine the
orientation of the phone with respect to the body (e.g., the
orientation with respect to an axis aligned with the direction the
wearer is walking). In addition, the HMD may use magnetometer data
(and possibly gyroscope data) from both the HMD and the mobile
phone to determine each device's orientation relative to magnetic
north. The HMD can then use this information to determine the
orientation of the HMD with respect to the tracking device. Then,
to determine the orientation of the HMD with respect to the body,
the HMD may offset its orientation with respect to the tracking
device by the orientation of the tracking device with respect to
the body.
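The offsetting step described above reduces to simple angle arithmetic. The sketch below is a hypothetical illustration of that computation; the function names and sign conventions are assumptions, not taken from the patent:

```python
def wrap_degrees(angle):
    """Wrap an angle into the range [-180, 180)."""
    return (angle + 180.0) % 360.0 - 180.0


def head_orientation_relative_to_body(hmd_heading, phone_heading, phone_base_offset):
    """Estimate head yaw relative to the body, in degrees.

    hmd_heading:       HMD orientation relative to magnetic north (magnetometer).
    phone_heading:     tracking-device orientation relative to magnetic north.
    phone_base_offset: base orientation of the tracking device relative to the
                       forward-backward body axis, calibrated while walking.
    """
    # Orientation of the HMD relative to the tracking device...
    hmd_relative_to_phone = wrap_degrees(hmd_heading - phone_heading)
    # ...offset by the tracking device's orientation relative to the body.
    return wrap_degrees(hmd_relative_to_phone + phone_base_offset)


# Example: HMD heading 120 degrees, phone heading 60 degrees, calibrated base
# offset of -30 degrees (all values synthetic):
print(head_orientation_relative_to_body(120.0, 60.0, -30.0))  # -> 30.0
```

Working with heading differences rather than absolute headings is what lets the result stay meaningful even though neither device knows where the body is pointing on its own.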
[0025] The above technique, which can provide the horizontal
rotation of the head relative to the body, relies on the difference
between magnetometer readings at the HMD and a tracking device such
as a mobile phone. In a further aspect, the differences between the
accelerometer and/or gyroscope readings of the tracking device and
the HMD can be used in a similar manner to determine the pitch
and/or yaw of the head relative to the body. Therefore, by
analyzing the differences in sensor data between the
accelerometers, magnetometers, and/or gyroscopes of an HMD and a
tracking device, the HMD may detect three-dimensional changes in
head position relative to the body.
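One way the accelerometer comparison could work is to estimate each device's pitch from its gravity-dominated accelerometer reading and take the difference. This is a rough sketch under assumed axis conventions (x forward, z up in each device frame), not the patent's method:

```python
import math


def pitch_from_accelerometer(ax, ay, az):
    """Estimate device pitch in degrees from a gravity-dominated accelerometer
    reading, assuming x points forward and z points up in the device frame."""
    return math.degrees(math.atan2(ax, math.sqrt(ay * ay + az * az)))


def head_pitch_relative_to_body(hmd_accel, phone_accel):
    """Difference of the two pitch estimates, as a rough head-vs-body pitch."""
    return pitch_from_accelerometer(*hmd_accel) - pitch_from_accelerometer(*phone_accel)


# Phone roughly vertical in a pocket; HMD tilted about 30 degrees downward
# (synthetic readings, in units of g):
print(round(head_pitch_relative_to_body((-0.5, 0.0, 0.866), (0.0, 0.0, 1.0)), 1))  # -> -30.0
```

In practice the raw readings would include motion acceleration as well as gravity, so a real implementation would likely low-pass filter the accelerometer data or fuse it with gyroscope data before comparing the two devices.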
II. ILLUSTRATIVE SYSTEMS
[0026] Systems and devices in which exemplary embodiments may be
implemented will now be described in greater detail. In general, an
exemplary system may be implemented in or may take the form of a
wearable computer. In particular, an exemplary system may be
implemented in association with or take the form of a
head-mountable display (HMD), or a computing system that receives
data from an HMD, such as a cloud-based server system.
[0027] However, an exemplary system may also be implemented in or
take the form of other devices, such as a mobile phone, among
others. Further, an exemplary system may take the form of
non-transitory computer readable medium, which has program
instructions stored thereon that are executable by a processor
to provide the functionality described herein. An exemplary system
may also take the form of a device such as a wearable computer or
mobile phone, or a subsystem of such a device, which includes such
a non-transitory computer readable medium having such program
instructions stored thereon.
[0028] FIG. 1A illustrates a wearable computing system according to
an exemplary embodiment. In FIG. 1A, the wearable computing system
takes the form of a head-mounted device (HMD) 102 (which may also
be referred to as a head-mountable display). It should be
understood, however, that exemplary systems and devices may take
the form of or be implemented within or in association with other
types of devices, without departing from the scope of the
invention. As illustrated in FIG. 1A, the head-mounted device 102
comprises frame elements including lens-frames 104, 106 and a
center frame support 108, lens elements 110, 112, and extending
side-arms 114, 116. The center frame support 108 and the extending
side-arms 114, 116 are configured to secure the head-mounted device
102 to a user's face via a user's nose and ears, respectively.
[0029] Each of the frame elements 104, 106, and 108 and the
extending side-arms 114, 116 may be formed of a solid structure of
plastic and/or metal, or may be formed of a hollow structure of
similar material so as to allow wiring and component interconnects
to be internally routed through the head-mounted device 102. Other
materials may be possible as well.
[0030] One or more of each of the lens elements 110, 112 may be
formed of any material that can suitably display a projected image
or graphic. Each of the lens elements 110, 112 may also be
sufficiently transparent to allow a user to see through the lens
element. Combining these two features of the lens elements may
facilitate an augmented reality or heads-up display where the
projected image or graphic is superimposed over a real-world view
as perceived by the user through the lens elements.
[0031] The extending side-arms 114, 116 may each be projections
that extend away from the lens-frames 104, 106, respectively, and
may be positioned behind a user's ears to secure the head-mounted
device 102 to the user. The extending side-arms 114, 116 may
further secure the head-mounted device 102 to the user by extending
around a rear portion of the user's head. Additionally or
alternatively, for example, the HMD 102 may connect to or be
affixed within a head-mounted helmet structure. Other possibilities
exist as well.
[0032] The HMD 102 may also include an on-board computing system
118, a video camera 120, a sensor 122, and a finger-operable touch
pad 124. The on-board computing system 118 is shown to be
positioned on the extending side-arm 114 of the head-mounted device
102; however, the on-board computing system 118 may be provided on
other parts of the head-mounted device 102 or may be positioned
remote from the head-mounted device 102 (e.g., the on-board
computing system 118 could be wire- or wirelessly-connected to the
head-mounted device 102). The on-board computing system 118 may
include a processor and memory, for example. The on-board computing
system 118 may be configured to receive and analyze data from the
video camera 120 and the finger-operable touch pad 124 (and
possibly from other sensory devices, user interfaces, or both) and
generate images for output by the lens elements 110 and 112.
[0033] The video camera 120 is shown positioned on the extending
side-arm 114 of the head-mounted device 102; however, the video
camera 120 may be provided on other parts of the head-mounted
device 102. The video camera 120 may be configured to capture
images at various resolutions or at different frame rates. Many
video cameras with a small form-factor, such as those used in cell
phones or webcams, for example, may be incorporated into an example
of the HMD 102.
[0034] Further, although FIG. 1A illustrates one video camera 120,
more video cameras may be used, and each may be configured to
capture the same view, or to capture different views. For example,
the video camera 120 may be forward facing to capture at least a
portion of the real-world view perceived by the user. This forward
facing image captured by the video camera 120 may then be used to
generate an augmented reality where computer generated images
appear to interact with the real-world view perceived by the
user.
[0035] The sensor 122 is shown on the extending side-arm 116 of the
head-mounted device 102; however, the sensor 122 may be positioned
on other parts of the head-mounted device 102. The sensor 122 may
include one or more of a gyroscope or an accelerometer, for
example. Other sensing devices may be included within, or in
addition to, the sensor 122 or other sensing functions may be
performed by the sensor 122.
[0036] The finger-operable touch pad 124 is shown on the extending
side-arm 114 of the head-mounted device 102. However, the
finger-operable touch pad 124 may be positioned on other parts of
the head-mounted device 102. Also, more than one finger-operable
touch pad may be present on the head-mounted device 102. The
finger-operable touch pad 124 may be used by a user to input
commands. The finger-operable touch pad 124 may sense at least one
of a position and a movement of a finger via capacitive sensing,
resistance sensing, or a surface acoustic wave process, among other
possibilities. The finger-operable touch pad 124 may be capable of
sensing finger movement in a direction parallel or planar to the
pad surface, in a direction normal to the pad surface, or both, and
may also be capable of sensing a level of pressure applied to the
pad surface. The finger-operable touch pad 124 may be formed of one
or more translucent or transparent insulating layers and one or
more translucent or transparent conducting layers. Edges of the
finger-operable touch pad 124 may be formed to have a raised,
indented, or roughened surface, so as to provide tactile feedback
to a user when the user's finger reaches the edge, or other area,
of the finger-operable touch pad 124. If more than one
finger-operable touch pad is present, each finger-operable touch
pad may be operated independently, and may provide a different
function.
[0037] FIG. 1B illustrates an alternate view of the wearable
computing device illustrated in FIG. 1A. As shown in FIG. 1B, the
lens elements 110, 112 may act as display elements. The
head-mounted device 102 may include a first projector 128 coupled
to an inside surface of the extending side-arm 116 and configured
to project a display 130 onto an inside surface of the lens element
112. Additionally or alternatively, a second projector 132 may be
coupled to an inside surface of the extending side-arm 114 and
configured to project a display 134 onto an inside surface of the
lens element 110.
[0038] The lens elements 110, 112 may act as a combiner in a light
projection system and may include a coating that reflects the light
projected onto them from the projectors 128, 132. In some
embodiments, a reflective coating may not be used (e.g., when the
projectors 128, 132 are scanning laser devices).
[0039] In alternative embodiments, other types of display elements
may also be used. For example, the lens elements 110, 112
themselves may include: a transparent or semi-transparent matrix
display, such as an electroluminescent display or a liquid crystal
display, one or more waveguides for delivering an image to the
user's eyes, or other optical elements capable of delivering an in
focus near-to-eye image to the user. A corresponding display driver
may be disposed within the frame elements 104, 106 for driving such
a matrix display. Alternatively or additionally, a laser or LED
source and scanning system could be used to draw a raster display
directly onto the retina of one or more of the user's eyes. Other
possibilities exist as well.
[0040] FIG. 1C illustrates another wearable computing system
according to an exemplary embodiment, which takes the form of an
HMD 152. The HMD 152 may include frame elements and side-arms such
as those described with respect to FIGS. 1A and 1B. The HMD 152 may
additionally include an on-board computing system 154 and a video
camera 206, such as those described with respect to FIGS. 1A and
1B. The video camera 206 is shown mounted on a frame of the HMD
152. However, the video camera 206 may be mounted at other
positions as well.
[0041] As shown in FIG. 1C, the HMD 152 may include a single
display 158 which may be coupled to the device. The display 158 may
be formed on one of the lens elements of the HMD 152, such as a
lens element described with respect to FIGS. 1A and 1B, and may be
configured to overlay computer-generated graphics in the user's
view of the physical world. The display 158 is shown to be provided
in a center of a lens of the HMD 152; however, the display 158 may
be provided in other positions. The display 158 is controllable via
the computing system 154 that is coupled to the display 158 via an
optical waveguide 160.
[0042] FIG. 1D illustrates another wearable computing system
according to an exemplary embodiment, which takes the form of an
HMD 172. The HMD 172 may include side-arms 173, a center frame
support 174, and a bridge portion with nosepiece 175. In the
example shown in FIG. 1D, the center frame support 174 connects the
side-arms 173. The HMD 172 does not include lens-frames containing
lens elements. The HMD 172 may additionally include an on-board
computing system 176 and a video camera 178, such as those
described with respect to FIGS. 1A and 1B.
[0043] The HMD 172 may include a single lens element 180 that may
be coupled to one of the side-arms 173 or the center frame support
174. The lens element 180 may include a display such as the display
described with reference to FIGS. 1A and 1B, and may be configured
to overlay computer-generated graphics upon the user's view of the
physical world. In one example, the single lens element 180 may be
coupled to the inner side (i.e., the side exposed to a portion of a
user's head when worn by the user) of the extending side-arm 173.
The single lens element 180 may be positioned in front of or
proximate to a user's eye when the HMD 172 is worn by a user. For
example, the single lens element 180 may be positioned below the
center frame support 174, as shown in FIG. 1D.
[0044] FIG. 2 illustrates a schematic drawing of a computing device
according to an exemplary embodiment. In system 200, a device 210
communicates using a communication link 220 (e.g., a wired or
wireless connection) to a remote device 230. The device 210 may be
any type of device that can receive data and display information
corresponding to or associated with the data. For example, the
device 210 may be a heads-up display system, such as the
head-mounted devices 102, 152, or 172 described with reference to
FIGS. 1A-1D.
[0045] Thus, the device 210 may include a display system 212
comprising a processor 214 and a display 216. The display 216 may
be, for example, an optical see-through display, an optical
see-around display, or a video see-through display. The processor
214 may receive data from the remote device 230, and configure the
data for display on the display 216. The processor 214 may be any
type of processor, such as a micro-processor or a digital signal
processor, for example.
[0046] The device 210 may further include on-board data storage,
such as memory 218 coupled to the processor 214. The memory 218 may
store software that can be accessed and executed by the processor
214, for example.
[0047] The remote device 230 may be any type of computing device or
transmitter, including a laptop computer, a mobile telephone, or a
tablet computing device, that is configured to transmit data
to the device 210. The remote device 230 and the device 210 may
contain hardware to enable the communication link 220, such as
processors, transmitters, receivers, antennas, etc.
[0048] In an illustrative embodiment, a remote device 230 such as a
mobile phone, a tablet computing device, a laptop computer, etc.,
could be utilized as a tracking device. More specifically, device
210 may be an HMD, and sensor data from sensors on the remote
device 230, such as data from one or more magnetometers,
accelerometers, and/or gyroscopes, may be compared to corresponding
sensor data from the HMD to determine the position of the HMD
wearer's head relative to their body.
[0049] The remote device 230 could also be a remote computing
system that is configured to perform functions on behalf of device
210; i.e., a "cloud" computing system. In such an embodiment, the
remote computing system may receive data from device 210 via link
220, perform certain processing functions on behalf of device 210,
and then send the resulting data back to device 210.
[0050] Further, device 210 may be in communication with a number of
remote devices, such as remote device 230. For example, an HMD
could be in communication with a remote computing system that
provides certain functionality to the HMD, as well as a mobile
phone, or another such device that may serve as a tracking device
in embodiments described herein. Other examples are also
possible.
[0051] In FIG. 2, the communication link 220 is illustrated as a
wireless connection; however, wired connections may also be used.
For example, the communication link 220 may be a wired serial bus
such as a universal serial bus or a parallel bus. A wired
connection may be a proprietary connection as well. The
communication link 220 may also be a wireless connection using,
e.g., Bluetooth.RTM. radio technology, communication protocols
described in IEEE 802.11 (including any IEEE 802.11 revisions),
Cellular technology (such as GSM, CDMA, UMTS, EV-DO, WiMAX, or
LTE), or Zigbee.RTM. technology, among other possibilities. The
remote device 230 may be accessible via the Internet and may
include a computing cluster associated with a particular web
service (e.g., social-networking, photo sharing, address book,
etc.).
III. ILLUSTRATIVE METHODS
[0052] FIG. 3 is a flow chart illustrating a method 300, according
to an example embodiment. Illustrative methods, such as method 300,
may be carried out in whole or in part by an HMD, such as the
head-mountable displays shown in FIGS. 1A to 1D. Example methods,
or portions thereof, could also be carried out by a tracking device
(e.g., a mobile phone or another mobile device that the wearer of
an HMD might carry on their person), alone or in combination with
an HMD. Further, an example method, or portions thereof, may be
carried out by a computing device that is in communication with an
HMD and/or in communication with a tracking device. An example
method may also be carried out by other types of computing devices
and/or combinations of computing devices, without departing from
the scope of the invention.
[0053] As shown by block 302, method 300 involves a computing
device detecting that sensor data is indicative of an association
between movement of a tracking device and body movement. For
example, an HMD may detect that data from an accelerometer and/or
gyroscope of the HMD (and/or data from such sensors on the tracking
device) corresponds to the HMD wearer walking forward.
[0054] In response to detecting the sensor data that is indicative
of such an association, the computing device may perform a
calibration routine to determine an orientation of the tracking
device with respect to the HMD wearer's body. (This assumes that
the HMD is being worn; if it is not, then the calibration routine
may still be implemented, but will produce an orientation of the
tracking device with respect to a hypothetical position of the
body.) More specifically, the computing device may determine a
forward-backward body axis of a body, as shown by block 304.
Further, the computing device may determine a base orientation of a
tracking device relative to the forward-backward body axis, which
may be referred to as .theta..sub.TD-B, as shown by block 306.
[0055] Once the computing device has performed the calibration
routine, the computing device may use the base orientation
.theta..sub.TD-B of the tracking device relative to the
forward-backward body axis in order to determine an orientation of
the head relative to the body. For instance, in the illustrated
method 300, the computing device may determine a first orientation
.theta..sub.HMD-TD.sub._.sub.1 of the HMD relative to the tracking
device, as shown by block 308. The computing device may then
determine a first head orientation .theta..sub.H-B.sub._.sub.1
relative to the body (e.g., forward-backward body axis) based on
both: (a) the first orientation .theta..sub.HMD-TD.sub._.sub.1 of
the HMD relative to the tracking device and (b) the base
orientation .theta..sub.TD-B of the tracking device relative to the
forward-backward body axis, as shown by block 310. In a further
aspect, the computing device may initiate a computing action based
on the first head orientation .theta..sub.H-B.sub._.sub.1 relative
to the body, as shown by block 312.
[0056] Method 300 may be described herein with reference to the
scenario illustrated in FIG. 4A. In particular, FIG. 4A is a
top-down illustration of a scenario 400 in which an HMD wearer has
a tracking device on their person. More specifically, FIG. 4A shows
a top down view of the head 402 and the body 404 of a person that
is wearing an HMD 406 and has a mobile phone 408 on their person.
FIG. 4A also shows the north-south (N-S) and east-west (E-W) axes
that are defined by magnetic north (N).
[0057] A. Detecting an Association Between Movement of a Tracking
Device and Body Movement
[0058] Referring to FIG. 3, at block 302, a computing device may
use various techniques to detect an association between the
movement of a tracking device and body movement. Further, to do so,
various types of sensor data may be utilized.
[0059] For example, an HMD or a mobile phone may receive data that
is indicative of movement of the HMD and/or data that is indicative
of movement of the tracking device, such as data from a gyroscope,
accelerometer, and/or a compass on one or both of the devices. The
computing device may then analyze such sensor data and determine
that it is characteristic of movement along the forward-backward
body axis.
[0060] For instance, when the wearer of an HMD is walking or
driving, and has a mobile phone in their pocket (which serves as
the tracking device), the accelerometer(s) and/or gyroscopes of the
mobile phone and/or of the HMD may indicate movement of the mobile
phone and/or movement of the HMD that is characteristic of walking.
Similarly, there may be movement patterns that are indicative of
the wearer driving or riding in a car. More generally, the
computing device may detect the association when it detects another
action where the wearer's body is typically facing in the
direction they are moving, such that the movement of the wearer's
body has a significant directional component in the forward or
backward direction (e.g., along the forward-backward body axis
Y.sub.B of the wearer's body, as shown in FIG. 4A).
[0061] Techniques for analyzing sensor data, such as data from
accelerometers, gyroscopes, and/or compasses, to detect actions
where the wearer's body is typically aligned with the
forward-backward body axis Y.sub.B of the wearer's body, such as
walking, driving or riding in a car, are known in the art.
Accordingly, the details of such techniques are not discussed
further herein.
[0062] When the wearer is engaged in an action such as walking or
driving, it may often be the case that the wearer has a tracking
device such as a mobile phone on their person (e.g., in their
pocket, purse, backpack, etc.) or nearby in an orientation that is
relatively stable with respect to their body (e.g., sitting on the
passenger seat of their car while the HMD wearer is driving). In
such a scenario, the position of the tracking device may therefore
provide an indication of the position of the wearer's body. For
example, when an HMD wearer has a mobile phone or another type of
tracking device in their pocket, movements of the mobile phone will
typically follow the movements of the wearer's body.
[0063] Accordingly, an HMD may use its orientation with respect to
the mobile phone to determine the HMD's orientation with respect to
the wearer's body. And since the HMD may generally be assumed to
align with the wearer's head (possibly after adjusting to account
for translation between the wearer's field of view and sensors on
the HMD), the HMD may use the HMD's orientation with respect to the
wearer's body as a measure of the orientation of the wearer's head
with respect to the wearer's body.
[0064] B. Calibration
[0065] i. Determining the Forward-Backward Body Axis
[0066] At block 304, the computing device may use various techniques to
define the forward-backward axis Y.sub.B. In particular, and as
noted above, a wearer's movement that typically has a significant
forward or backward component along the forward-backward body axis
Y.sub.B, such as walking, may be interpreted to indicate the
association between movement of a tracking device and movement of
the body. Therefore, when a computing device detects sensor data
that is indicative of such a movement by the wearer, some or all of
this data may be analyzed to determine the direction of forward
body movement, which may then be used to define the
forward-backward body axis Y.sub.B.
[0067] As a specific example, FIG. 4B is a top-down illustration of
a scenario 450 in which an HMD wearer has a tracking device on
their person while walking. More specifically, FIG. 4B shows a
top-down view of the head 452 and the body 454 of a person that is
wearing an HMD 456 and has a mobile phone 458 on their person
(e.g., in their pocket). As is often the case, the person's head
452 and body 454 are facing forward, in the direction 460 that the
person is walking.
[0068] To determine the forward-backward body axis Y.sub.B, a
computing device such as the mobile phone 458 and/or HMD 456 may
first determine an up-down body axis. To do so, the mobile phone
458 and/or HMD 456 may determine the direction of gravitational
force, which is aligned with the up-down body axis (assuming the
wearer is in an upright position). In particular, the mobile phone
458 and/or HMD 456 may utilize data from the HMD's accelerometer(s)
and/or gyroscope(s), and/or data from the tracking device's
accelerometer(s) and/or gyroscope(s). Then, to determine the
forward-backward axis Y.sub.B, the computing device may
evaluate accelerometer readings that are perpendicular to the
downward direction (i.e., perpendicular to the direction of
gravitational force).
[0069] More specifically, during the movement that indicates the
association between the mobile phone 458 and the wearer's body 454,
the accelerometer data from the mobile phone 458 and/or from the
HMD 456 may be expected to have the highest variance in the forward
direction. As such, the HMD 456 and/or the mobile phone 458 may
analyze the accelerometer data to determine the forward direction
of the body by determining the direction having the highest
variance, or possibly the direction having the highest average
magnitude, over a predetermined period of time (e.g., a two-second
window). The computing device may then align the forward-backward
body axis Y.sub.B with the direction that is determined to be
forward.
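The axis determination described above may be sketched as follows. This is an illustrative Python sketch only, not part of the disclosed embodiments; the function name, the NumPy dependency, and the assumption that samples arrive as an N x 3 array in a common device frame are illustrative choices.

```python
import numpy as np

def estimate_forward_axis(accel_samples, gravity_est):
    """Estimate the forward-backward body axis Y_B from a window of
    accelerometer samples (an N x 3 array in the device frame).

    gravity_est is a 3-vector along the gravitational (down) direction,
    e.g., obtained by low-pass filtering the accelerometer signal.
    Returns a unit 3-vector in the horizontal plane along the direction
    of highest variance, taken to be the forward-backward direction.
    """
    a = np.asarray(accel_samples, dtype=float)
    g = np.asarray(gravity_est, dtype=float)
    g = g / np.linalg.norm(g)
    # Remove each sample's component along gravity, leaving readings
    # perpendicular to the downward direction.
    horiz = a - np.outer(a @ g, g)
    horiz = horiz - horiz.mean(axis=0)
    # The principal component of the centered horizontal samples is the
    # direction of highest variance over the window (e.g., two seconds).
    _, _, vt = np.linalg.svd(horiz, full_matrices=False)
    return vt[0]
```

Note that the sign of the returned axis is ambiguous (forward versus backward); resolving it may require an additional cue, such as the sign of the mean horizontal acceleration at the start of a stride.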
[0070] ii. Determining the Offset Between the Tracking Device and
Body
[0071] Once the computing device has determined the wearer's
forward-backward body axis Y.sub.B, the computing device may
determine the base orientation .theta..sub.TD-B of the tracking
device relative to the forward-backward body axis. Since the
association between the movement of the tracking device and body
movement has been detected, it may be inferred that the tracking
device will follow the wearer's body. As such, the base orientation
.theta..sub.TD-B of the tracking device relative to the
forward-backward body axis may be used as a reference to determine
the orientation of the HMD (and thus the head) with respect to the
wearer's body.
[0072] For instance, if the tracking device is a mobile phone that
is located in the wearer's pocket, it may be expected that the
tracking device is likely to stay in the wearer's pocket for at
least a short period of time. The wearer's pocket may further be
expected to hold the mobile phone in substantially the same place
with respect to the wearer's body. Thus, the orientation
.theta..sub.TD-B of the mobile phone 408 with respect to the
wearer's body 406 may be expected and/or assumed to remain
substantially the same over a certain period of time. Therefore, at
a given point in time, the orientation .theta..sub.TD-B of the
mobile phone 408 with respect to the wearer's body 406 may be used
to offset the orientation of the HMD relative to the tracking
device, to determine the orientation of the HMD relative to the
body.
[0073] More specifically, at block 306 of method 300, the
determination of the base orientation .theta..sub.TD-B of the
tracking device relative to the forward-backward body axis may
involve the computing device determining an angle between a forward
direction of the tracking device and the directional component
along the forward-backward body axis. To do so, a compass and/or
other sensors of the tracking device may be configured so as to
indicate the orientation of the tracking device relative to
magnetic north. Data from the compass and/or the other sensors may
therefore be used to determine the direction that the tracking
device is facing.
[0074] As a specific example, in FIG. 4A, the direction that the
mobile phone 408 is facing may define the tracking device's
forward-backward axis Y.sub.TD, as shown in FIG. 4A. The computing
device may then use the forward direction along the tracking
device's forward-backward axis Y.sub.TD and the forward direction
along the body's forward-backward Y.sub.B, to determine the base
orientation .theta..sub.TD-B of the tracking device with respect to
the body.
[0075] Note that in FIG. 4A, the tracking device's forward-backward
axis Y.sub.TD is shifted such that it aligns with the up-down axis
of the wearer's head, instead of the up-down axis of the body. This
is done for illustrative purposes. More specifically, in an
illustrative embodiment, the orientation .theta..sub.TD-B of the
tracking device relative to the body is measured in a plane
parallel to the plane of the forward-backward body axis Y.sub.B
(e.g., parallel to the yaw planes of the wearer's body and the
wearer's head). Therefore, in such an embodiment, the shift of the
tracking device's forward-backward axis Y.sub.TD does not affect
how the orientation .theta..sub.TD-B is calculated.
[0076] C. Determining the Orientation of the HMD Relative to the
Tracking Device
[0077] Referring again to example method 300 of FIG. 3, various
techniques may be used to determine the first orientation
.theta..sub.HMD-TD.sub._.sub.1 of the HMD relative to the tracking
device. For example, FIG. 5 is a flow chart illustrating a method
for determining the orientation of an HMD relative to a tracking
device. In particular, method 500 may determine the orientation of
an HMD relative to a tracking device based on: (a) magnetometer
data associated with the HMD and (b) magnetometer data associated
with the tracking device.
[0078] More specifically, at block 502 of method 500, a computing
device may determine a first orientation
.theta..sub.HMD-N.sub._.sub.1 of the HMD relative to magnetic
north. This determination may be based on magnetometer data
associated with the HMD (e.g., data captured by the HMD's
magnetometer). The computing device may also determine a first
orientation .theta..sub.TD-N.sub._.sub.1 of the tracking device relative
to magnetic north, as shown by block 504. This determination may
be based on magnetometer data associated with the tracking device
(e.g., data captured by the tracking device's
magnetometer). The computing device
may then determine the first orientation
.theta..sub.HMD-TD.sub._.sub.1 of the HMD relative to the tracking
device based on a difference between (a) the first orientation
.theta..sub.HMD-N.sub._.sub.1 of the HMD relative to magnetic north
and (b) the first orientation .theta..sub.TD-N.sub._.sub.1 of the
tracking device relative to magnetic north, as shown by block
506.
[0079] For instance, FIG. 4A shows the orientation
.theta..sub.HMD-TD of the HMD 406 relative to the mobile phone 408
in example scenario 400. Applying method 500 in scenario 400, block
502 may involve the HMD 406 (or a remote computing system in
communication with the HMD) analyzing data from a compass (e.g., a
magnetometer) and/or other sensors attached to or integrated in the
HMD 406, and determining the first orientation of the HMD 406
relative to magnetic north (.theta..sub.HMD-N.sub._.sub.1)
therefrom. Similarly, block 504 may involve the HMD 406 and/or the
mobile phone 408 analyzing data from the mobile phone's compass
and/or other sensors of the mobile phone 408, and determining the
first orientation of the mobile phone 408 relative to magnetic
north (.theta..sub.TD-N.sub._.sub.1) therefrom. The HMD 406 may
then calculate its orientation relative to the mobile phone 408
(.theta..sub.HMD-TD.sub._.sub.1) as being equal to the angular
difference between .theta..sub.HMD-N.sub._.sub.1 and
.theta..sub.TD-N.sub._.sub.1.
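The angular-difference computation of block 506 may be sketched as follows. This is an illustrative Python sketch; the function name and the convention of headings measured in degrees clockwise from magnetic north are assumptions, not part of the disclosure.

```python
def heading_difference(theta_hmd_n, theta_td_n):
    """First orientation of the HMD relative to the tracking device,
    computed as the difference between the HMD's heading and the
    tracking device's heading (both in degrees relative to magnetic
    north), wrapped into the interval (-180, 180]."""
    diff = (theta_hmd_n - theta_td_n) % 360.0
    if diff > 180.0:
        diff -= 360.0
    return diff
```

The wrapping step matters near north: naively subtracting headings of 350 degrees and 10 degrees yields 340 degrees, whereas the angular difference between the two devices is only -20 degrees.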
[0080] D. Determining the First Head Orientation Relative to the
Body
[0081] At block 310 of method 300, various techniques may be used
to determine the head orientation relative to the wearer's
body.
[0082] In particular, by determining the base orientation of a
tracking device relative to the forward-backward body axis
(.theta..sub.TD-B) at block 306, the computing device has a
quantitative measure of how the tracking device is positioned with
respect to the body. Further, by determining the orientation of the
HMD relative to the tracking device .theta..sub.HMD-TD at block
308, the computing device has a quantitative measure of how the HMD
is positioned with respect to the tracking device. As such, a
computing device may determine the orientation .theta..sub.HMD-TD
of the HMD relative to the tracking device, and then adjust
according to the base orientation .theta..sub.TD-B to determine the
orientation .theta..sub.H-B of the wearer's head relative to their
body. As a specific example, and referring again to FIG. 4A, a
computing device may offset .theta..sub.HMD-TD to account for the
orientation .theta..sub.TD-B of the mobile phone 408 with respect
to the body 404, in order to determine the orientation
.theta..sub.H-B of the head 402 with respect to the body 404.
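The offset described above composes as a sum of yaw angles: since .theta..sub.HMD-TD=.theta..sub.HMD-N-.theta..sub.TD-N and .theta..sub.TD-B=.theta..sub.TD-N-.theta..sub.B-N, it follows that .theta..sub.H-B=.theta..sub.HMD-TD+.theta..sub.TD-B. A minimal Python sketch (the function name and sign conventions are illustrative assumptions):

```python
def head_relative_to_body(theta_hmd_td, theta_td_b):
    """Yaw of the head relative to the body: the orientation of the HMD
    relative to the tracking device, offset by the base orientation of
    the tracking device relative to the forward-backward body axis.
    Angles are in degrees; the result is wrapped into (-180, 180]."""
    theta = (theta_hmd_td + theta_td_b) % 360.0
    if theta > 180.0:
        theta -= 360.0
    return theta
```

For instance, an HMD turned 30 degrees relative to a phone that itself sits at -10 degrees relative to the body yields a head yaw of 20 degrees relative to the body.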
[0083] Further, it may be assumed that the base orientation
.theta..sub.TD-B of the tracking device relative to the
forward-backward body axis stays substantially the same over
certain periods of time (such as when the tracking device is a
mobile phone in the wearer's pocket or purse). It is also possible
that the computing device may explicitly determine that the
tracking device has not or is unlikely to have moved relative to
the body since calculating .theta..sub.TD-B. In either case, the
initially-determined base orientation .theta..sub.TD-B of the
tracking device relative to the forward-backward body axis can thus
be used to offset a subsequent calculation of .theta..sub.HMD-TD
and determine the orientation of the head relative to the body
(.theta..sub.H-B) at the time of the subsequent calculation.
[0084] E. Determining Three-Dimensional Head Position Relative to
Body
[0085] In the above description for FIGS. 3 to 5, differences
between the magnetometer readings from an HMD and a mobile phone or
another tracking device are used to determine two-dimensional
orientation of the HMD wearer's head with respect to the wearer's
body (i.e., the yaw of the head with respect to the body). In some
embodiments, additional sensor data may be used to determine the
orientation of the wearer's head relative to their body in three
dimensions. For example, the differences between the accelerometer
readings of the tracking device and the HMD can be used in a
similar manner to determine upward and downward movements of the
head relative to the body.
[0086] For example, FIG. 6 illustrates a side-view of a scenario
600 in which an HMD wearer moves their head with respect to their
body. In particular, the person moves their head from a first
position 602a to a second position 602b. In the first position
602a, the person is facing forward, such that their head is
substantially aligned with their body 604. In other words, there is
no rotation or pitch of the head with respect to their body, so the
axes of the body are generally parallel to the axes of the head.
However, when the person moves their head to the second position
602b, the position of their head is such that there is a pitch
.PHI..sub.H-B of their head 602 relative to their body 604.
Further, roll of the head may be determined by combining the
analysis of differences in accelerometer data with analysis of
differences in magnetometer data (and possibly differences in
gyroscope data as well) between the HMD and the tracking
device.
[0087] More specifically, to determine the pitch .PHI..sub.H-B of
the head with respect to the ground, a low-pass filter may be
applied to the accelerometer signal from the tracking device and/or
the HMD to substantially cancel out other motions and determine a
direction of gravity; i.e., to determine the downward direction and
thus provide the alignment of the UP-DOWN axis shown in FIG. 6. The
HMD could then determine the pitch .PHI..sub.H-B of the head based
on the measured gravity vector (e.g., in the downward direction on
the UP-DOWN axis) in the coordinate frame of the HMD. (Note that the
coordinate frame of the HMD is shown by Y.sub.HMD and Z.sub.HMD in
FIG. 6.) The angle of the gravity vector as compared to the
Y.sub.HMD axis may then be used to determine the pitch
.PHI..sub.H-B of the head.
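The low-pass filtering and pitch computation may be sketched as follows. This is an illustrative Python sketch; the exponential filter coefficient, the function names, and the assumption that the device's Y axis points downward and its Z axis points forward are not specified by the description above.

```python
import math

def lowpass_gravity(accel_samples, alpha=0.1):
    """Exponential low-pass filter over accelerometer samples (each a
    3-element [x, y, z] reading in the device frame), substantially
    cancelling out other motions to isolate the gravity vector."""
    g = list(accel_samples[0])
    for sample in accel_samples[1:]:
        g = [alpha * s + (1.0 - alpha) * gi for s, gi in zip(sample, g)]
    return g

def pitch_from_gravity(gravity):
    """Pitch of the device relative to the ground: the angle of the
    measured gravity vector as compared to the assumed downward (Y)
    axis, measured in the Y-Z plane of the device's coordinate frame."""
    _, gy, gz = gravity
    return math.degrees(math.atan2(gz, gy))
```

With the head level, gravity lies along the Y axis and the pitch is zero; tilting the head forward rotates part of the gravity vector into the Z component, yielding a nonzero pitch.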
[0088] Further, to determine the roll of the head with respect to
the ground, an HMD may apply a similar technique as that used to
determine the pitch, except that the HMD (or associated computing
device) may determine the downward direction and the direction to
the right or left of the body. In particular, the HMD may determine
a coordinate frame defined by the direction of gravity (e.g., the
UP-DOWN axis shown in FIG. 6), and the right-left axis of the body
(e.g., the X.sub.B axis shown in FIG. 4B). The angle of the gravity
vector as compared to the X.sub.HMD axis (as shown in FIG. 4A) may
then be used to determine the roll of the head.
[0089] By performing the above process, the HMD may determine the pitch
and/or roll of the HMD, and thus the angle of the head, with
respect to the ground (i.e., with respect to gravity). The same
process may be carried out by a tracking device such as a phone to
determine the pitch and/or roll of the tracking device with respect
to the ground (i.e., with respect to gravity). Then, the pitch
and/or roll of the HMD with respect to the body may be determined
by using the axis defined by gravity (e.g., the UP-DOWN axis) in a
similar manner as the forward-backward axis is used to determine
the yaw.
[0090] F. Re-calibration
[0091] As noted above, it may be assumed that the base orientation
.theta..sub.TD-B of the tracking device relative to the
forward-backward body axis stays substantially the same over
certain periods of time. However, it is possible that
.theta..sub.TD-B can change over time. For instance, the
orientation of a mobile phone with respect to the body may change
when, e.g., the mobile phone shifts within the wearer's pocket or
the wearer moves the mobile phone from a console in their car to
their purse. Accordingly, an example method may further involve a
computing device re-calibrating to compensate for changes in a
tracking device's position relative to the body.
[0092] For example, in some embodiments, a computing device may
periodically repeat blocks 302 to 306 of method 300 in order to
re-determine the base orientation .theta..sub.TD-B of the tracking
device relative to the forward-backward body axis. By doing so, the
computing device may update the offset that is applied to the
orientation .theta..sub.HMD-TD of the HMD relative to the tracking
device.
[0093] In some embodiments, a computing device may additionally or
alternatively monitor or periodically check whether the orientation
of the tracking device relative to the body has changed. The computing
device may then re-calibrate when it determines that the
orientation of the tracking device relative to the body has changed (or
is likely to have changed). For example, a computing device may
receive or detect an indication that the tracking device has moved
in relation to the body (e.g., in a message from the tracking
device). In response to the indication that the tracking device has
moved in relation to the body, the computing device may monitor the
HMD's sensor data and/or the tracking device's sensor data until it
detects sensor data that indicates a calibration event (e.g., the
wearer walking forward), and then re-determine the base orientation
.theta..sub.TD-B of the tracking device relative to the
forward-backward body axis.
IV. CONCLUSION
[0094] While various aspects and embodiments have been disclosed
herein, other aspects and embodiments will be apparent to those
skilled in the art. The various aspects and embodiments disclosed
herein are for purposes of illustration and are not intended to be
limiting, with the true scope and spirit being indicated by the
following claims.
* * * * *