U.S. patent application number 15/248369, for determination of device placement using pose angle, was filed with the patent office on 2016-08-26 and published on 2016-12-15.
This patent application is currently assigned to Apple Inc. The applicant listed for this patent is Apple Inc. Invention is credited to Sunny Kai Pang Chow, Gunes Dervisoglu, Digvijay A. Jadhav, Anil K. Kandangath, Jonathan C. King, Hung A. Pham, Indranil S. Sen, Umamahesh Srinivas, Xiaoyuan Tu.
Publication Number | 20160366654 |
Application Number | 15/248369 |
Family ID | 57517614 |
Publication Date | 2016-12-15 |
United States Patent Application | 20160366654 |
Kind Code | A1 |
Dervisoglu; Gunes; et al. |
December 15, 2016 |
Determination of Device Placement Using Pose Angle
Abstract
In an example method, a mobile device connects a voice call for
a user. The voice call causes one or more radio frequency
transmitters of the mobile device to transmit radio waves at a
first power level. Motion data describing movement of the mobile
device is obtained, and an orientation of the mobile device over a
period of time is determined based on the motion data. A
determination whether the mobile device is on the user's body or on
an inanimate object is made based on the orientation of the mobile
device over the period of time. The transmit power level is
adjusted based on the determination.
Inventors: | Dervisoglu; Gunes; (Santa Clara, CA); Sen; Indranil S.; (Cupertino, CA); Srinivas; Umamahesh; (Milpitas, CA); Jadhav; Digvijay A.; (San Jose, CA); Chow; Sunny Kai Pang; (San Jose, CA); Pham; Hung A.; (Oakland, CA); Kandangath; Anil K.; (Santa Clara, CA); Tu; Xiaoyuan; (Sunnyvale, CA); King; Jonathan C.; (San Jose, CA) |
Applicant: |
Name | City | State | Country | Type |
Apple Inc. | Cupertino | CA | US | |
Assignee: | Apple Inc. (Cupertino, CA) |
Family ID: | 57517614 |
Appl. No.: | 15/248369 |
Filed: | August 26, 2016 |
Related U.S. Patent Documents

Application Number | Filing Date | Patent Number | Child Application |
14292750 | May 30, 2014 | | 15248369 |
13913271 | Jun 7, 2013 | 9432954 | 14292750 |
Current U.S. Class: | 1/1 |
Current CPC Class: | H04W 52/246 20130101; H04W 4/027 20130101; H04W 52/285 20130101; H04B 1/3838 20130101; H04W 76/12 20180201; H04B 17/27 20150115; H04W 52/283 20130101; H04W 52/146 20130101; H04W 52/367 20130101; H04W 88/02 20130101 |
International Class: | H04W 52/28 20060101 H04W052/28; H04W 4/02 20060101 H04W004/02 |
Claims
1. A method comprising: connecting, by a mobile device, a voice
call for a user of the mobile device, wherein connecting the voice
call causes one or more radio frequency transmitters of the mobile
device to transmit radio waves at a first power level; obtaining
motion data describing movement of the mobile device; determining,
based on the motion data, an orientation of the mobile device over
a period of time; determining whether the mobile device is on the
user's body or on an inanimate object based on the orientation of
the mobile device over the period of time; and adjusting the
transmit power level based on the determination whether the mobile
device is on the user's body or on an inanimate object.
2. The method of claim 1, wherein determining the orientation of
the mobile device over the period of time comprises: determining,
based on the motion data, an average angle of the device with
respect to a reference vector over the period of time.
3. The method of claim 1, wherein determining the orientation of
the mobile device over the period of time comprises: determining,
based on the motion data, an average angle of a vector extending
perpendicular to a surface of a display screen of the device with
respect to a gravity vector over the period of time.
4. The method of claim 3, wherein determining the orientation of
the mobile device over the period of time further comprises:
determining, based on the motion data, a standard deviation of the
angle over the period of time.
5. The method of claim 4, wherein determining whether the mobile
device is on the user's body or on an inanimate object comprises:
determining whether the mobile device is on the user's body or on
an inanimate object based on the average angle over the period of
time and the standard deviation of the angle over the period of
time.
6. The method of claim 5, wherein determining whether the mobile
device is on the user's body or on an inanimate object comprises:
calculating a ratio between the standard deviation of the angle
over the period of time and the average angle over the period of
time; and comparing the ratio to a threshold value.
7. The method of claim 6, wherein determining whether the mobile
device is on the user's body or on an inanimate object further
comprises: determining, based on the comparison, that the ratio
exceeds the threshold value; and upon determining that the ratio
exceeds the threshold value, determining that the mobile device is
on the user's body.
8. The method of claim 6, wherein determining whether the mobile
device is on the user's body or on an inanimate object further
comprises: determining, based on the comparison, that the ratio
does not exceed the threshold value; and upon determining that the
ratio does not exceed the threshold value, determining that the
mobile device is on an inanimate object.
9. The method of claim 1, further comprising: making an initial
determination that the mobile device is on the user's body; and
responsive to making the initial determination that the mobile
device is on the user's body: determining the orientation of the
mobile device over the period of time; and determining whether the
mobile device is on the user's body or on an inanimate object based
on the orientation of the mobile device over the period of
time.
10. The method of claim 9, wherein the initial determination is
based on the motion data and proximity data describing a proximity
of the mobile device to a detected object.
11. The method of claim 1, further comprising: determining that the
mobile device is on the user's body; and adjusting the transmit
power level to a second power level when the mobile device is on
the user's body, wherein the second power level is less than the
first power level.
12. The method of claim 1, further comprising: determining that the
mobile device is on an inanimate object; and adjusting the transmit
power level to a second power level when the mobile device is on an
inanimate object, wherein the second power level is greater than
the first power level.
13. A mobile device comprising: one or more radio frequency
transmitters; one or more processors; a non-transitory
computer-readable medium including one or more sequences of
instructions which, when executed by the one or more processors,
causes: connecting, by the mobile device, a voice call for a user
of the mobile device, wherein connecting the voice call causes the
one or more radio frequency transmitters of the mobile device to
transmit radio waves at a first power level; obtaining motion data
describing movement of the mobile device; determining, based on the
motion data, an orientation of the mobile device over a period of
time; determining whether the mobile device is on the user's body
or on an inanimate object based on the orientation of the mobile
device over the period of time; and adjusting the transmit power
level based on the determination whether the mobile device is on
the user's body or on an inanimate object.
14. The mobile device of claim 13, wherein determining the
orientation of the mobile device over the period of time comprises:
determining, based on the motion data, an average angle of the
device with respect to a reference vector over the period of
time.
15. The mobile device of claim 13, wherein determining the
orientation of the mobile device over the period of time comprises:
determining, based on the motion data, an average angle of a vector
extending perpendicular to a surface of a display screen of the
device with respect to a gravity vector over the period of
time.
16. The mobile device of claim 15, wherein determining the
orientation of the mobile device over the period of time further
comprises: determining, based on the motion data, a standard
deviation of the angle over the period of time.
17. The mobile device of claim 16, wherein determining whether the
mobile device is on the user's body or on an inanimate object
comprises: determining whether the mobile device is on the user's
body or on an inanimate object based on the average angle over the
period of time and the standard deviation of the angle over the
period of time.
18. The mobile device of claim 17, wherein determining whether the
mobile device is on the user's body or on an inanimate object
comprises: calculating a ratio between the standard deviation of
the angle over the period of time and the average angle over the
period of time; and comparing the ratio to a threshold value.
19. The mobile device of claim 18, wherein determining whether the
mobile device is on the user's body or on an inanimate object
further comprises: determining, based on the comparison, that the
ratio exceeds the threshold value; and upon determining that the
ratio exceeds the threshold value, determining that the mobile
device is on the user's body.
20. The mobile device of claim 18, wherein determining whether the
mobile device is on the user's body or on an inanimate object
further comprises: determining, based on the comparison, that the
ratio does not exceed the threshold value; and upon determining
that the ratio does not exceed the threshold value, determining
that the mobile device is on an inanimate object.
21. The mobile device of claim 13, wherein the one or more
sequences of instructions, when executed by the one or more
processors, further causes: making an initial determination that the
mobile device is on the user's body; and responsive to making the
initial determination that the mobile device is on the user's body:
determining the orientation of the mobile device over the period of
time; and determining whether the mobile device is on the user's
body or on an inanimate object based on the orientation of the
mobile device over the period of time.
22. The mobile device of claim 21, wherein the initial
determination is based on the motion data and proximity data
describing a proximity of the mobile device to a detected
object.
23. The mobile device of claim 13, wherein the one or more
sequences of instructions, when executed by the one or more
processors, further causes: determining that the mobile device is on
the user's body; and adjusting the transmit power level to a second
power level when the mobile device is on the user's body, wherein
the second power level is less than the first power level.
24. The mobile device of claim 13, wherein the one or more
sequences of instructions, when executed by the one or more
processors, further causes: determining that the mobile device is on
an inanimate object; and adjusting the transmit power level to a
second power level when the mobile device is on an inanimate
object, wherein the second power level is greater than the first
power level.
25. A non-transitory computer-readable medium including one or more
sequences of instructions which, when executed by one or more
processors, causes: connecting, by a mobile device, a voice call
for a user of the mobile device, wherein connecting the voice call
causes one or more radio frequency transmitters of the mobile
device to transmit radio waves at a first power level; obtaining
motion data describing movement of the mobile device; determining,
based on the motion data, an orientation of the mobile device over
a period of time; determining whether the mobile device is on the
user's body or on an inanimate object based on the orientation of
the mobile device over the period of time; and adjusting the
transmit power level based on the determination whether the mobile
device is on the user's body or on an inanimate object.
26. The computer-readable medium of claim 25, wherein determining
the orientation of the mobile device over the period of time
comprises: determining, based on the motion data, an average angle
of the device with respect to a reference vector over the period of
time.
27. The computer-readable medium of claim 25, wherein determining
the orientation of the mobile device over the period of time
comprises: determining, based on the motion data, an average angle
of a vector extending perpendicular to a surface of a display
screen of the device with respect to a gravity vector over the
period of time.
28. The computer-readable medium of claim 27, wherein determining
the orientation of the mobile device over the period of time
further comprises: determining, based on the motion data, a
standard deviation of the angle over the period of time.
29. The computer-readable medium of claim 28, wherein determining
whether the mobile device is on the user's body or on an inanimate
object comprises: determining whether the mobile device is on the
user's body or on an inanimate object based on the average angle
over the period of time and the standard deviation of the angle
over the period of time.
30. The computer-readable medium of claim 29, wherein determining
whether the mobile device is on the user's body or on an inanimate
object comprises: calculating a ratio between the standard
deviation of the angle over the period of time and the average
angle over the period of time; and comparing the ratio to a
threshold value.
31. The computer-readable medium of claim 30, wherein determining
whether the mobile device is on the user's body or on an inanimate
object further comprises: determining, based on the comparison,
that the ratio exceeds the threshold value; and upon determining
that the ratio exceeds the threshold value, determining that the
mobile device is on the user's body.
32. The computer-readable medium of claim 30, wherein determining
whether the mobile device is on the user's body or on an inanimate
object further comprises: determining, based on the comparison,
that the ratio does not exceed the threshold value; and upon
determining that the ratio does not exceed the threshold value,
determining that the mobile device is on an inanimate object.
33. The computer-readable medium of claim 25, wherein the one or
more sequences of instructions, when executed by the one or more
processors, further causes: making an initial determination that
the mobile device is on the user's body; and responsive to making
the initial determination that the mobile device is on the user's
body: determining the orientation of the mobile device over the
period of time; and determining whether the mobile device is on the
user's body or on an inanimate object based on the orientation of
the mobile device over the period of time.
34. The computer-readable medium of claim 33, wherein the initial
determination is based on the motion data and proximity data
describing a proximity of the mobile device to a detected
object.
35. The computer-readable medium of claim 25, wherein the one or
more sequences of instructions, when executed by the one or more
processors, further causes: determining that the mobile device is on
the user's body; and adjusting the transmit power level to a second
power level when the mobile device is on the user's body, wherein
the second power level is less than the first power level.
36. The computer-readable medium of claim 25, wherein the one or
more sequences of instructions, when executed by the one or more
processors, further causes: determining that the mobile device is on
an inanimate object; and adjusting the transmit power level to a
second power level when the mobile device is on an inanimate
object, wherein the second power level is greater than the first
power level.
Description
CROSS REFERENCE TO RELATED APPLICATIONS
[0001] This application is a continuation-in-part of and claims
priority to U.S. patent application Ser. No. 13/913,271, filed on
Jun. 7, 2013, and U.S. patent application Ser. No. 14/292,750,
filed on May 30, 2014, which is also a continuation-in-part of U.S.
patent application Ser. No. 13/913,271, filed on Jun. 7, 2013, the
entire contents of all of which are hereby incorporated by
reference.
TECHNICAL FIELD
[0002] The disclosure generally relates to motion analysis on a
mobile device.
BACKGROUND
[0003] Mobile devices often include telephony features that allow
users to receive and place voice calls over a wireless network. For
example, modern smartphones include wireless transceivers that
allow the smartphones to receive and place telephone calls over
cellular voice and/or data networks (e.g., CDMA, 2G, 3G, 4G LTE,
etc.). These wireless transceivers can transmit at different power
levels. Transmitting at high power levels can improve the quality
of the voice calls. Transmitting at low power levels may be
required when the mobile device is worn on or near the user's body
to comply with government regulations.
SUMMARY
[0004] In some implementations, a mobile device can analyze motion
sensor data and proximity sensor data during a voice call to
determine whether the mobile device is on a stationary object
(e.g., an inanimate or inert object, such as a table, a desk, or a
chair) or worn on a user's body (e.g., in the lap or pocket of a
user of the mobile device, or against a part of the user's body,
such as his head). The mobile device can adjust the transmit power
level of the telephony transceiver during the voice call based on
the determination.
[0005] Particular implementations provide at least the following
advantages. The user can experience better quality voice calls when
the mobile device is on a stationary object (e.g., a table) because
the voice call can be transmitted using a relatively high
transmission power level. Further, the mobile device can reduce its
transmission power level when it is on the user's body, thereby
reducing the amount of energy absorbed by the user's body. Thus,
the user can operate the mobile device more safely and/or in
accordance with government regulations. Further still, the mobile
device can more accurately distinguish between being on a
stationary object and being on a user's body, thereby improving its
performance and safety in a variety of operating environments.
[0006] In general, in an aspect, a method includes connecting, by a
mobile device, a voice call for a user of the mobile device.
Connecting the voice call causes one or more radio frequency
transmitters of the mobile device to transmit radio waves at a
first power level. The method also includes obtaining motion data
describing movement of the mobile device, and determining, based on
the motion data, an orientation of the mobile device over a period
of time. The method also includes determining whether the mobile
device is on the user's body or on an inanimate object based on the
orientation of the mobile device over the period of time, and
adjusting the transmit power level based on the determination
whether the mobile device is on the user's body or on an inanimate
object.
[0007] Implementations of this aspect may include one or more of
the following features.
[0008] In some implementations, determining the orientation of the
mobile device over the period of time can include determining,
based on the motion data, an average angle of the device with
respect to a reference vector over the period of time.
[0009] In some implementations, determining the orientation of the
mobile device over the period of time can include determining,
based on the motion data, an average angle of a vector extending
perpendicular to a surface of a display screen of the device with
respect to a gravity vector over the period of time.
[0010] In some implementations, determining the orientation of the
mobile device over the period of time can further include
determining, based on the motion data, a standard deviation of the
angle over the period of time.
[0011] In some implementations, determining whether the mobile
device is on the user's body or on an inanimate object can include
determining whether the mobile device is on the user's body or on
an inanimate object based on the average angle over the period of
time and the standard deviation of the angle over the period of
time.
[0012] In some implementations, determining whether the mobile
device is on the user's body or on an inanimate object can include
calculating a ratio between the standard deviation of the angle
over the period of time and the average angle over the period of
time, and comparing the ratio to a threshold value.
[0013] In some implementations, determining whether the mobile
device is on the user's body or on an inanimate object can further
include determining, based on the comparison, that the ratio
exceeds the threshold value, and upon determining that the ratio
exceeds the threshold value, determining that the mobile device is
on the user's body.
[0014] In some implementations, determining whether the mobile
device is on the user's body or on an inanimate object can further
include determining, based on the comparison, that the ratio does
not exceed the threshold value, and upon determining that the ratio
does not exceed the threshold value, determining that the mobile
device is on an inanimate object.
[0015] In some implementations, the method can further include
making an initial determination that the mobile device is on the
user's body, and responsive to making the initial determination
that the mobile device is on the user's body, determining the
orientation of the mobile device over the period of time, and
determining whether the mobile device is on the user's body or on
an inanimate object based on the orientation of the mobile device
over the period of time.
[0016] In some implementations, the initial determination can be
based on the motion data and proximity data describing a proximity
of the mobile device to a detected object.
[0017] In some implementations, the method can further include
determining that the mobile device is on the user's body, and
adjusting the transmit power level to a second power level when the
mobile device is on the user's body. The second power level can be
less than the first power level.
[0018] In some implementations, the method can further include
determining that the mobile device is on an inanimate object, and
adjusting the transmit power level to a second power level when the
mobile device is on an inanimate object. The second power level can
be greater than the first power level.
[0019] Other implementations are directed to systems, devices, and
non-transitory, computer-readable mediums.
[0020] Details of one or more implementations are set forth in the
accompanying drawings and the description below. Other features,
aspects, and potential advantages will be apparent from the
description and drawings, and from the claims.
DESCRIPTION OF DRAWINGS
[0021] FIG. 1 illustrates an example graph showing motion variance
of the mobile device for stationary-object and on-user
positions.
[0022] FIG. 2 is a flow diagram of an example process for
determining whether the mobile device is on a stationary object or
on a user.
[0023] FIG. 3 illustrates an example state machine for determining
whether the mobile device is on a stationary object or on the
user's body.
[0024] FIG. 4 is a flow diagram of an example process for adjusting
the transmit power of a mobile device based on a user body location
of the mobile device.
[0025] FIG. 5 is a flow diagram of another example process for
adjusting the transmit power of a mobile device based on a user
body location of the mobile device.
[0026] FIG. 6 is a scatter plot for classifying acceleration
signals.
[0027] FIG. 7 is a flow diagram of another example process for
adjusting the transmit power of a mobile device based on a user
body location of the mobile device.
[0028] FIG. 8A illustrates example directional axes for a mobile
device.
[0029] FIG. 8B illustrates an example pose angle of a mobile
device.
[0030] FIG. 9 is a block diagram of an example system for adjusting
the transmit power of a mobile device.
[0031] FIG. 10 is a block diagram illustrating an example API
architecture, which can be used in some implementations.
[0032] FIG. 11 illustrates an example software stack that includes
various application programming interfaces.
[0033] FIG. 12 is a block diagram of an example computing device
that can implement the features and processes of FIGS. 1-11.
[0034] Like reference symbols in the various drawings indicate like
elements.
DETAILED DESCRIPTION
Overview
[0035] In some implementations, a mobile device can be configured
with one or more motion sensors. For example, a motion sensor can
be an accelerometer, gyroscope or other type of motion sensor. In
some implementations, the mobile device can be configured with
telephony components that allow a user to make and receive
telephone calls. For example, telephony components can include one
or more radio frequency transceivers that are configured to access
one or more radio access technologies (e.g., GSM, UMTS, 2G, 3G,
LTE, etc.).
[0036] In some implementations, the mobile device can be configured
to adjust the transmission power of the telephony components when
the mobile device connects to a voice call. For example, the mobile
device can be configured to adjust the transmission power based on
whether the mobile device is on a stationary object (e.g., a table)
or worn on the user's body (e.g., on the user's lap, in the user's
pocket, against the user's head, etc.).
[0037] Although the examples herein are described in the context of
voice calls (e.g., adjusting a mobile device's transmission power
during the course of a voice call), it is understood that the
disclosed subject matter also can be implemented in the context of
any wireless data transmission. For example, in some
implementations, a mobile device can be configured to adjust the
transmission power of its telephony components when the mobile
device transmits non-voice data across a cellular network.
[0038] FIG. 1 illustrates an example graph 100 showing motion
variance of the mobile device for stationary object 102 and on user
positions 104. The vertical axis 106 of graph 100 indicates the
amount of variance (e.g., a statistical variance). In some
implementations, the determination of whether the mobile device is
on a stationary object or worn by the user can be based on the
variance in magnitude (i.e., amplitude) of a motion signal
generated by the mobile device's motion sensor. For example, when
the mobile device is on a stationary object 102 (e.g., a table),
the variance in the motion signal magnitude (i.e., amplitude) will
be small, as demonstrated by variance measurements 108. When the
mobile device is on the lap or in the pocket of a user, the
variance of the motion signal will be similar to that of the motion
signal when the mobile device is on a stationary object, as
demonstrated by variance measurements 110. However, when the mobile
device is worn on the user's body, the user will eventually move
and thereby create a magnitude spike in the motion signal that will
temporarily increase the variance of the motion signal, as
indicated by variance measurement 112. When the mobile device
detects that the variance of the motion signal has exceeded a
variance threshold 114 (e.g., a predefined, empirically determined
threshold), the mobile device can determine that the mobile device
is located or positioned on the user's body (e.g., lap, pocket,
etc.). If the mobile device does not detect the variance increase
within a period of time (e.g., a predefined, empirically determined
period of time), then the mobile device can determine that the
mobile device is located or positioned on a stationary (e.g.,
inert, inanimate) object.
[0039] FIG. 2 is a flow diagram of an example process 200 for
determining whether the mobile device is on a stationary object or
on a user. In some implementations, the mobile device can connect
to a voice call 202 at a time T.sub.0 (e.g., T.sub.0 is the time when the
call is connected). For example, the mobile device can be
configured with telephony features that allow a user to place and
receive voice calls. When the mobile device connects to a voice
call, the mobile device can obtain motion data (e.g., a motion
signal) from a motion sensor of the mobile device 204. For example,
the mobile device can generate a motion signal that indicates the
amount (e.g., magnitude and frequency) of motion over time based on
motion data received from the motion sensor. The mobile device can
apply a low pass filter 206 to the motion signal to filter out
(e.g., attenuate) high frequency motion signals (e.g., noise).
[0040] In some implementations, the mobile device can be configured
to sample the filtered motion signal 208 for a period of time
(i.e., the sample period). For example, the mobile device can be
configured to collect samples of the motion signal over the sample
period. The mobile device can sample and store into a buffer five
(5) seconds of the filtered motion signal, for example. In some
implementations, the motion signal samples can be analyzed to
determine the variance of the motion signal 210 during the sample
period. For example, the motion signal sample can be passed through
a variance filter or other variance calculating function that will
determine the variance of the motion signal samples.
[0041] Once the variance of the motion signal samples is
determined, the variance can be compared to a variance threshold
value 212 to determine if the variance threshold value has been
exceeded by the motion signal samples. For example, if the variance
of the motion signal samples is greater than the motion variance
threshold, then the mobile device can determine that the mobile
device is located or positioned on the user's body 214. If the
variance of the motion signal samples is not greater than the
motion variance threshold, then the mobile device can compare the
amount of time that has passed since the voice call was connected
(e.g., T.sub.current-T.sub.0) to a threshold period of time 216.
For example, if the threshold period of time has elapsed since the
voice call was connected and the variance of the motion signal (as
determined from the motion signal samples) has not exceeded the
variance threshold within the threshold period of time, then the
mobile device can determine that the mobile device is located on a
stationary object 218 and not located on the user. Thus, the mobile
device can distinguish between being on a stationary object and
being on a user's body.
[0042] If the threshold period of time has not elapsed since the
call was connected, then the mobile device can collect another set
of motion signal samples 208 and determine whether the variance of
the motion signal samples 210 exceeds the variance threshold 212.
Collecting samples and comparing the variance of the samples to the
variance threshold can continue until the threshold period of time
has elapsed 216 and the on object location 218 of the mobile device
is determined. If the variance of a set of motion signal samples
210 exceeds the variance threshold 212 before the threshold period
of time has elapsed, then an on user location 214 of the mobile
device can be determined.
[0043] In some implementations, the mobile device can determine
that the threshold period of time has elapsed using a counter. For
example, if the threshold period of time is one minute and the
sample period is ten seconds, then the mobile device can determine
that the threshold period of time has elapsed after six motion
signal samples have been taken. Thus, in some implementations, each
time the mobile device determines whether the threshold period of
time has elapsed 216, the mobile device can increment a counter
(e.g., starting from zero). Once the counter has reached a number
(n) equal to the threshold period of time divided by the sample
period (n=threshold period/sample period), then the mobile device
can determine that the mobile device is on a stationary object 218,
as described above.
[0044] In some implementations, when a voice call is connected 202,
the mobile device can be configured to adjust the transmit power of
the telephony components of the mobile device to a low power level
as if it had determined that the mobile device is being worn on the
user's body. For example, the mobile device can be configured to
default or initialize to the on body location. After the threshold
period of time has passed 216, and if the variance of the motion
signal never exceeds the threshold variance 212, then the mobile
device can determine that the mobile device is located or
positioned on a stationary object and can increase the transmit
power to a high power level.
[0045] FIG. 3 illustrates an example state machine 300 for
determining whether the mobile device is on a stationary object
or on the user's body. For example, state machine 300 can be used
to determine where the mobile device is currently positioned (e.g.,
on a stationary object, on the user's body). The mobile device can
use the current state of the state machine 300 to determine whether
to adjust the transmit power of the mobile device.
[0046] In some implementations, the mobile device can be configured
to start in an unknown state 304 when a voice call is connected
302. In some implementations, the mobile device can be configured
to start in the on body state 306 when a call is connected 302.
Once the call is connected, the mobile device can monitor the
movement of the mobile device to determine a stationary object 308
or on body 306 position for the mobile device, as described above
with reference to FIG. 2. For example, if an on stationary object
location (e.g., on table) is determined, then the state machine 300
can transition from unknown state 304 to on object state 308. If an
on body location (e.g., on user's lap, in user's pocket) is
determined, then the state machine 300 can transition from unknown
state 304 to on body state 306.
[0047] In some implementations, the state machine 300 can include
an in hand state 310 for transitioning between the on object state
308 and on body state 306. For example, if the mobile device is on
an object, such as a table, the user will likely pick up the mobile
device before putting the mobile device on the user's body (e.g.,
in the user's pocket, on the user's lap). If the mobile device is
on the user's body, the user will likely pick up the mobile device
before putting the mobile device on a stationary object. In either
case, the mobile device will be held in the user's hand during the
transition from an on body location to an on object location or
from an on object location to an on body location.
[0048] To detect these transitions, once in the on object state 308
or the on body state 306, the mobile device can continue analyzing
motion data (e.g., motion signal) from the mobile device's motion
sensor. If the motion data indicates a pattern of motion consistent
with the mobile device being picked up or held in hand, the state
machine 300 can transition from on object state 308 or on body
state 306 to in hand state 310.
[0049] In some implementations, when the state machine 300
indicates that the mobile device is in the in hand state 310, the
mobile device can perform process 200 to determine whether to
transition to on object state 308 or on body state 306. For
example, when the state machine 300 is in the in hand state 310,
the mobile device can perform steps 204-218 of process 200 to
determine whether and when to transition from the in hand state 310
to the on body state 306 or on object state 308.
[0050] FIG. 4 is a flow diagram of an example process 400 for
adjusting the transmit power of a mobile device based on a user
body location of the mobile device. At step 402, the mobile device
can connect to a voice call. For example, a user can use the mobile
device to initiate or receive voice calls (e.g., telephone call) on
the mobile device.
[0051] At step 404, the mobile device can determine whether the
mobile device is on the user's body or on an object. For example,
the mobile device can perform process 200 of FIG. 2 and/or use
state machine 300 of FIG. 3 to determine whether the mobile device
is located on the user's body (e.g., in pocket, on lap, etc.) or on
a stationary object (e.g., table, desk, floor, etc.).
[0052] At step 406, the mobile device can adjust the transmit power
of the mobile device based on the location or position of the
mobile device. For example, if the mobile device is on the user's
body, the mobile device can reduce the transmit power of the mobile
device to a predetermined low power level. If the mobile device is
on a stationary object (e.g., away from the user's body), then the
mobile device can increase the transmit power of the mobile device
to a predetermined high power level.
[0053] In some implementations, the transmit power of a mobile
device can be adjusted based on both motion data (e.g., data from a
motion sensor describing the movement of a mobile device) and
proximity data (e.g., data from a proximity sensor describing the
proximity of the mobile device to a detected object). In some
cases, by using both motion data and proximity data, the location
of the mobile device can be determined with a greater degree of
accuracy or confidence.
[0054] Proximity data can be collected in a variety of ways. For
example, proximity data can be obtained from a proximity sensor,
such as a light-emitting diode (LED) and an associated
photodetector (e.g., a photodiode). In an example implementation,
the light-emitting diode may be an infrared light-emitting diode.
Reflected light from nearby objects may be detected using the
photodiode. When sufficient reflected light is detected, it can be
concluded that a human body part (e.g., a head, finger, or hand) or
other object (e.g., a desk, table, or chair) is located close to
the proximity sensor. When insufficient reflected light is
detected, it can be concluded that no objects are located near to
the proximity sensor. If desired, emitted light from the proximity
sensor may be concentrated at a particular distance from the
proximity sensor using a lens or other focusing structure. This may
help to enhance the strength of reflected signals from objects
located at this particular distance.
[0055] In some implementations, the light-emitting diode in the
proximity sensor may be modulated at a particular frequency or may
be modulated using any other suitable modulation pattern. The use
of a modulation pattern to drive the light-emitting diode may help
to discriminate reflected light-emitting diode signals from
background illumination. This may increase the signal-to-noise
ratio of the proximity sensor. If desired, the proximity sensor may
be based on proximity detection arrangements other than
light-emitting diode arrangements. For example, a proximity sensor
for a mobile device may be based on a capacitive sensor, a
photodetector that works only with ambient light (and not emitted
light from the mobile device), an acoustic proximity sensor (e.g.,
a sensor that uses ultrasonic sound waves to determine the presence
or absence of a nearby object), a sensor that detects reflected
electromagnetic radiation (e.g., radio-frequency radiation), or any
other suitable sensor capable of detecting the presence of a nearby
object.
[0056] In some implementations, an ambient light sensor may be used
to detect the level of ambient illumination around the mobile
device. An ambient light sensor may be implemented using a
photodiode that is sensitive to visible light. Separate photodiodes
can be used for proximity sensors and ambient light sensors, but
the photodiode functionality of an ambient light sensor and the
photodiode functionality of a proximity sensor (in a light-based
proximity detector) may be implemented using a common photodiode if
desired. Information on the amount of light that is gathered by an
ambient light sensor may be used to adjust the screen brightness of
a mobile device's display (as an example).
[0057] If desired, proximity sensor functionality may be
implemented in the mobile device using a device that serves
multiple functions. As an example, a capacitive touch sensor or
other such touch sensor that is part of a touch display may be used
in detecting the presence of a nearby object. During normal
operation, touch sensor output signals may be used to identify user
input selections as a user presses a finger against various
portions of the mobile device's display. When used as a proximity
sensor, the output signals of the touch screen may be processed to
determine whether or not an object is adjacent to the mobile
device. With this type of arrangement, the capacitive readings
obtained from the touch sensor portion of the display may be
processed, for example, to determine whether a user has placed the
mobile device next to the user's head. Because the presence of the
user's head in the vicinity of the screen will change the
capacitive reading (or other such touch sensor reading) from the
display, the presence of the user's head can be detected without
using a conventional proximity sensor. As another example, light
readings from an ambient light sensor may be used as an indicator
of the proximity of an object to the mobile device (e.g., by
detecting shadows that indicate the presence of an object). Touch
pads without displays may also be used to produce proximity
data.
[0058] To improve accuracy, signals from multiple proximity sensor
devices (e.g., an LED-based proximity sensor, an ambient light
sensor used to detect proximity, a capacitive touch screen, etc.)
may be processed in parallel. With this type of arrangement, the
mobile device can more accurately determine whether or not the
mobile device has been placed in close proximity to an object.
[0059] In some implementations, one or more proximity sensors can
be used to determine the distance of a detected object from the
proximity sensor. The proximity sensor can, in some cases,
determine the absolute distance of the detected object from the
proximity sensor. In some cases, the proximity sensor can determine
an approximate or relative distance of the detected object from the
proximity sensor. For example, the proximity sensor might determine
that a detected object is either "near" the proximity sensor (e.g.,
when the detected object is within a particular distance from the
proximity sensor), within an "intermediate" distance from the
proximity sensor (e.g., when the detected object is within a
particular larger distance from the proximity sensor), or "far"
from the proximity sensor (e.g., when the detected object is
beyond a particular distance from the proximity sensor, or when no
object is detected at all).
[0060] Although example implementations of proximity sensors are
described above, these are merely examples. Other proximity sensors
capable of determining proximity information can also be used. As
an example, additional proximity sensors are described in U.S. Pat.
No. 8,417,296, which is incorporated herein by reference.
[0061] FIG. 5 is a flow diagram of an example process for adjusting
the transmit power of a mobile device based on a user body location
of the mobile device. At step 502, the mobile device can connect to
a voice call. For example, a user can use the mobile device to
initiate or receive voice calls (e.g., telephone call) on the
mobile device.
[0062] At step 504, the mobile device can determine whether the
mobile device is on the user's body or on an object using both
motion data and proximity data. In some implementations, the mobile
device can consider motion data and proximity data individually,
such that individual predictions are made using each of the
different types of data. For instance, in some cases, a first
prediction can be made using solely the motion data, while a second
prediction can be made using solely the proximity data.
[0063] Predictions can be made using motion data in a manner
similar to that described above. For example, using the motion
data, the mobile device can perform process 200 of FIG. 2 and/or
use state machine 300 of FIG. 3 to predict whether the mobile
device is located on the user's body (e.g., in a pocket, on lap,
etc.) or on a stationary object (e.g., table, desk, floor, etc.).
In some implementations, this prediction might be a determination
regarding the "motion state" of the mobile device. For example, the
mobile device can determine if it is in an "on-object" motion state
(e.g., corresponding to state 308 of FIG. 3), an "on-body" motion
state (e.g., corresponding to state 306 of FIG. 3), or an "in-hand"
motion state (e.g., corresponding to state 310 of FIG. 3). The
mobile device can, for instance, determine it is in an on-object
motion state upon predicting that the mobile device is on an
inanimate object.
[0064] Predictions can be made using proximity data in a variety of
ways. For example, in some implementations, a prediction might be a
determination regarding the "proximity state" of the mobile device.
This determination can be based, at least in part, on proximity
data from a proximity sensor. For example, the mobile device can
determine that it is in a "near" proximity state when it determines
that the mobile device is within a particular distance from a
detected object. As another example, the mobile device can
determine that it is in an "intermediate proximity state" when it
determines that the mobile device is within a particular further
distance from the detected object. As another example, the mobile
device can determine it is in a "far" proximity state when it
determines that the mobile device is even farther from the detected
object. Different states can, for example, be defined by particular
ranges of distances. For example, the near proximity state can
correspond to situations when the mobile device is within a first
distance from a detected object. As another example, the
intermediate proximity state can correspond to situations when the
mobile device is between a second distance and the first distance
from the detected object, where the second distance is greater than
the first. As another example, the far proximity state can
correspond to situations when the mobile device is further than the
second distance away from the detected object. While different
example states are described above, these are provided only to
illustrate how different proximity states can be defined. Different
proximity states and different numbers of proximity states can be
used, depending on the implementation.
[0065] Predictions based on motion data and predictions based on
proximity data can be interpreted in conjunction in order to
determine if the mobile device is on body (e.g., on a user's body)
or on object (e.g., on an inanimate object). For example, in some
implementations, some combinations of motion states and proximity
states can correspond to a determination that the mobile device is
on body, while other combinations of motion states and proximity
states can correspond to a determination that the mobile device is
on object. As an example, if the mobile device is in an "on-body"
motion state and a "near" proximity state," the mobile device might
determine that it is on body. As another example, if the mobile
device is in an "on-object" motion state and a "far" proximity
state," the mobile device might determine that is on object. These
example combinations are provided to demonstrate how combinations
of different motion states and proximity states can be used to
determine if the mobile device is on body or on object. In
practice, other combinations of states and corresponding
determinations can be implemented, either in addition to or instead
of those described above.
[0066] Motion data and proximity data need not be collected
concurrently at all times. For example, in some implementations,
the mobile device can make a determination regarding whether it is
on body or on object using only motion data (e.g., as shown in FIG.
2). At some point in the future (e.g., after proximity data is made
available at a later time), the mobile device can incorporate
proximity data in its determination (e.g., as shown in FIG. 5).
[0067] In some implementations, the length of time required to
collect sufficient motion data to make a motion state prediction
might differ from the length of time required to collect sufficient
proximity data to make a proximity state prediction. As an example,
in some implementations, a motion sensor might need to collect data
for several seconds in order to determine a motion state, while a
proximity sensor might only need to collect data for less than a
second in order to determine a proximity state. In these
implementations, the mobile device might make a determination
regarding whether it is on body or on object using data from only
one sensor (e.g., using only proximity data), until sufficient data
has been acquired from the other sensor. In this manner, the mobile
device can make a determination as soon as it receives sufficient
measurement data from at least one of the sensors, and can update
its determination upon receipt of additional measurement data from
other sensors.
[0068] At step 506, the mobile device can adjust the transmit power
of the mobile device based on the location or position of the
mobile device. For example, if the mobile device is on the user's
body (e.g., as determined using motion data and proximity data, as
described above) the mobile device can reduce the transmit power of
the mobile device to a predetermined low power level. If the mobile
device is on a stationary object (e.g., away from the user's body),
then the mobile device can increase the transmit power of the
mobile device to a predetermined high power level.
[0069] In the examples above, the motion state of a mobile device
can be determined based on the variance of a motion signal (e.g.,
as shown in FIGS. 1 and 2). Other techniques can also be used to
determine the motion state of a mobile device. For example, in some
implementations, motion data can include information that describes
the acceleration experienced by a mobile device over a period of
time (e.g., an acceleration signal). For instance, the acceleration
signal might include a series of values, where each value describes
the acceleration experienced by the mobile device at a particular
moment in time. When placed into a sequence, these values can be
represented as a time-dependent waveform that describes the
acceleration experienced by the mobile device during the given span
of time. This acceleration signal can be obtained, for example,
using an accelerometer that obtains acceleration measurements
periodically over a particular span of time (e.g., over the span of
one second, two seconds, three seconds, or some other span of
time).
[0070] Different portions of the acceleration signal can be
considered in order to determine the motion state of the mobile
device. For instance, in some implementations, the power of the
acceleration signal within two or more different frequency bands
can be compared in order to classify the motion state of the mobile
device. As an example, an acceleration signal can be interpreted
based on its low frequency power and its high frequency power. Low
frequency power and high frequency power can be determined, for
example, by calculating the signal's power within a particular
frequency band. As an example, a low frequency band can correspond
to the band of frequencies between 0 and 10 Hz, and the high
frequency band can correspond to the band of frequencies between 10
and 50 Hz. These values are provided only as examples, and in
practice, each frequency band can vary, depending on the
application.
[0071] Given an acceleration signal having particular low frequency
and high frequency components, the mobile device can determine the
motion state of the mobile device. For example, in some
implementations, a ratio can be calculated between an acceleration
signal's high frequency power and low frequency power. If the ratio
exceeds a particular threshold ratio value, the mobile device might
predict that it is in a particular motion state (e.g., the
on-object motion state). If the ratio does not exceed that
threshold ratio value, the mobile device might predict that it is
in another motion state (e.g., the on-body motion state). In this
manner, the mobile device predicts whether it is on body or on
object based not only on the overall power of the acceleration
signal, but also on the power contained within particular frequency
bands of the acceleration signal in relation to that of other
frequency bands of the acceleration signal.
[0072] In some implementations, the threshold ratio value can
depend on the power of one or more of the frequency bands. As an
example, in some implementations, the threshold ratio value might
increase as the power of one of the frequency bands increases. As
another example, in some implementations, the threshold ratio might
increase as the power of one of the frequency bands decreases. In
some implementations, the relationship between the power of one or
more of the frequency bands and the threshold ratio value can be
described as a mathematical function. In this manner, the
threshold ratio value need not remain the same in any particular
implementation.
[0073] In some implementations, the high frequency power and low
frequency power of an acceleration signal can be represented as a
point on a scatter plot. Depending on where the acceleration signal
lies on the scatter plot, a determination can be made regarding the
motion state of the mobile device. For example, as shown in FIG. 6,
a scatter plot 600 is divided into two regions 602a-b, separated by
a line 604. When an acceleration signal falls within region 602a,
the mobile device predicts that it is not on a user's body
(e.g., in an on-object motion state). When the acceleration signal
falls within region 602b, the mobile device predicts that it is on
a user's body (e.g., in an on-body motion state). The path of line
604 through scatter plot 600 can be determined in various ways. For
example, as shown in FIG. 6, a series of acceleration measurements
can be obtained corresponding to mobile devices known not to be on
a user's body (e.g., points 606a), and a series of acceleration
measurements can be obtained corresponding to mobile devices known
to be on a user's body (e.g., points 606b). Based on these known
observations, line 604 can be drawn to divide, either completely or
partially, these points into two different regions. The path of
line 604 can be determined empirically, for example, based on
several experimental measurements obtained when a mobile device was
known to be on a user's body and when a mobile device was known not
to be on a user's body. In some implementations, line 604 need not
be a straight line. For example, portions of line 604 can be
straight (e.g., horizontal, vertical, or diagonal) or curved,
depending on the implementation.
[0074] While an example scatter plot 600 and line 604 are shown in
FIG. 6, this is only an example. A scatter plot can be divided into
different regions, depending on the application. Further, while the
above examples describe the use of two different frequency bands in
order to interpret an acceleration signal, in some implementations,
three or more different frequency bands can be used. For example,
in some implementations, a mobile device may consider a low
frequency band (e.g., frequencies from 0-10 Hz), an intermediate
frequency band (e.g., frequencies from 10-20 Hz), and a high
frequency band (e.g., frequencies from 20-60 Hz). In these
implementations, instead of representing the acceleration signals
as a two-dimensional scatter plot, the acceleration signals can
instead be represented in a three-dimensional scatter plot.
Likewise, in these implementations, line 604 may be a
two-dimensional surface instead of a line. In a similar manner, more
frequency bands can be considered (e.g., four, five, six, or more),
depending on the implementation.
[0075] Further, while two regions are shown in scatter plot 600 of
FIG. 6, in some implementations, there can be a greater number of
regions (e.g., three, four, five, and so forth), corresponding to
additional motion states. These motion states can, for example,
represent the acceleration experienced by a mobile device when it
is in a car, when it is in an airplane, when it is on a train, or
in any other environment. Accordingly, additional lines can be
defined to divide a scatter plot into the desired number of
regions.
[0076] Considering different portions of the acceleration signal
(e.g., the high frequency components and the low frequency
components) can provide certain benefits. In some circumstances,
certain types of motion contain comparatively higher energy in
certain frequency bands relative to certain other frequency bands.
In some implementations, by considering the distribution of energy
between various frequency bands, different types of motion can be
differentiated with a greater degree of accuracy. For example, in
some cases, a mobile device on a user's body might be expected to
have generally greater energy in lower frequency bands relative to
higher frequency bands, while a mobile device being operated on an
inanimate object might be expected to generally have greater energy
in higher frequency bands relative to lower frequency bands (e.g.,
corresponding to the mobile device striking a firm surface). By
determining the amount of energy contained with different frequency
bands, a mobile device can thus determine its location with a
greater degree of accuracy.
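As an illustration, the band energies discussed above might be computed
along the following lines (a sketch assuming a single-axis or magnitude
acceleration signal and an FFT-based power estimate; the band edges
follow the examples given earlier, and the function name is
hypothetical):

    import numpy as np

    def band_powers(accel, fs=100.0, bands=((0, 10), (10, 20), (20, 60))):
        """Estimate the power of an acceleration signal in each
        frequency band. Note that at fs = 100 Hz the spectrum extends
        only to the 50 Hz Nyquist frequency, so a nominal 20-60 Hz band
        is effectively truncated to 20-50 Hz."""
        accel = np.asarray(accel, dtype=float)
        accel = accel - accel.mean()             # remove static (gravity) offset
        power = np.abs(np.fft.rfft(accel)) ** 2  # one-sided power spectrum
        freqs = np.fft.rfftfreq(accel.size, d=1.0 / fs)
        return [power[(freqs >= lo) & (freqs < hi)].sum()
                for lo, hi in bands]

    # Example: one 2.56 s epoch sampled at 100 Hz (256 samples).
    epoch = np.random.randn(256)
    low, mid, high = band_powers(epoch)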
[0077] Other techniques also can be used to determine whether the
mobile device is on a user's body or on an inanimate object. For
instance, in some implementations, this determination can be made,
at least in part, based on the orientation or "pose angle" of the
mobile device with respect to its environment. As an example, the
pose angle can refer to the orientation of the mobile device with
respect to a reference vector (e.g., a vector indicating the
direction of gravity, or some other vector derived from the
operating environment). The orientation of the mobile device can be
monitored over a period of time (e.g., 1 second, 2.5 seconds, 5
seconds, or any other period of time), and the variation of the
orientation over that period of time can be considered to determine
whether the mobile device is on a user's body or on an inanimate
object.
[0078] As an example, if the orientation of the device experiences
a relatively high degree of variation over a period of time, the
mobile device is determined to be on the user's body (e.g., in the
user's lap or pocket, or held against the user's head).
Accordingly, the mobile device can reduce the transmission power of
its transceiver to reduce the amount of energy absorbed by the
user's body during use of the mobile device. As another example, if
the orientation of the device experiences a relatively low degree
of variation over the period of time, the mobile device is
determined to be on an inanimate or stationary object (e.g., a
table, desk, or chair). Accordingly, the mobile device can increase
the transmission power of its transceiver to improve its
performance.
[0079] In some cases, this technique can be used as the sole
technique for determining whether the mobile device is on a user's
body or on an inanimate object.
[0080] In other cases, this technique can be used in conjunction
with one or more other techniques (e.g., one or more of the
techniques described herein). For instance, an initial
determination can be made whether a mobile device is on a user's
body or on an inanimate object based on motion data and/or
proximity data (e.g., as described with respect to FIGS. 1-6). If
the mobile device is determined to be on a user's body, a secondary
determination can be made based on the orientation of the device
over a period of time.
[0081] If the secondary determination is in agreement that the
device is on a user's body (e.g., if the orientation of the device
experiences a relatively high degree of variation over the period of
time), the mobile device is determined to be on the user's body.
Accordingly, the mobile device can decrease the transmission power
of its transceiver.
[0082] However, if the secondary determination is in disagreement,
and instead indicates that the device is on an inanimate object
(e.g., if the orientation of the device experiences a relatively
low degree of variation over the period of time), the initial
determination can be overruled, and the mobile device is determined
to be on an inanimate object. Accordingly, the mobile device can
increase the transmission power of its transceiver.
[0083] In this manner, the orientation of the device can be
selectively considered, depending on the outcome of the initial
determination. This can be useful, for example, to validate a
determination made using one or more other techniques, and to
selectively reduce the occurrence of false positives when using
those techniques. For instance, this validation process can reduce
situations where the mobile device might otherwise incorrectly
determine that it is on a user's body based on the techniques
described with respect to FIGS. 1-6, when in fact it is on an
inanimate object. As a result, a more accurate determination can
be made regarding the state of the mobile device.
[0084] As an example, FIG. 7 is a flow diagram of an example process
700 for adjusting the transmit power of a mobile device based on a
user body location of the mobile device. At step 702, the mobile
device can connect to a voice call. For example, a user can use the
mobile device to initiate or receive voice calls (e.g., telephone
calls).
[0085] At step 704, the mobile device can make an initial
determination regarding whether the mobile device is on the user's
body or on an object. For example, the mobile device can perform
process 200 of FIG. 2 and/or use state machine 300 of FIG. 3 to
determine whether the mobile device is located on the user's body
(e.g., in pocket, on lap, etc.) or on a stationary object (e.g.,
table, desk, floor, etc.). As another example, the mobile device
can obtain an acceleration signal corresponding to the acceleration
that it has experienced over an "epoch" or period of time, and
compare the power of one or more frequency bands of the
acceleration signal against one or more threshold values (e.g., as
described with respect to FIG. 6).
[0086] In some cases, the mobile device can obtain the acceleration
signal by sampling the acceleration experienced by the mobile
device (e.g., using an accelerometer) at a rate of 100 Hz, over a
2.56 second period of time (i.e., 256 samples per epoch). Although
example sample rates and lengths of time are described, they are
merely illustrative
examples. Other sampling rates (e.g., 1 Hz, 10 Hz, 50 Hz, 200 Hz,
and so forth) and periods of time (e.g., 0.5 seconds, 1 second, 2
seconds, and so forth), are also possible, depending on the
implementation.
[0087] If the mobile device determines that it is on an object, at
step 706, the mobile device can increase the transmit power of its
transceiver. In some cases, if the mobile device is already
operating its transceiver according to a high transmit power, the
mobile device can maintain the high transmit power.
[0088] If the mobile device determines that it is on the user's
body, at step 708, the mobile device can make a secondary
determination regarding whether the mobile device is on the user's
body or on an object. As described above, the secondary
determination can be made, at least in part, based on the
orientation or "pose angle" of the mobile device with respect to
its environment.
[0089] In some cases, the orientation of the mobile device can be
monitored over a period of time (e.g., an epoch of 2.56 seconds, at
a sample rate of 100 Hz), and the variation of the orientation over
that period of time can be considered to determine whether the
mobile device is on a user's body or on an inanimate object. As
noted above, although example sample rates and lengths of time are
described, they are merely illustrative examples. Other sampling
rates (e.g., 1 Hz, 10 Hz, 50 Hz, 200 Hz, and so forth) and periods
of time (e.g., 0.5 seconds, 1 second, 2 seconds, and so forth), are
also possible, depending on the implementation.
[0090] In some cases, the pose angle of the mobile device can be
calculated by determining the orientation of the mobile device with
respect to a reference vector (e.g., a vector indicating the
direction of gravity). For example, as shown in FIG. 8A, the
acceleration of a mobile device 800 can be determined along its
x-axis 802, y-axis 804, and z-axis 806. The x-axis 802 can refer to
an axis parallel to a front surface 808 of the mobile device 800
(e.g., a display screen). The y-axis 804 can refer to an axis
parallel to the front surface 808, and orthogonal to the x-axis
802. The z-axis 806 can refer to an axis orthogonal to the x-axis
802 and the y-axis 804, and orthogonal to the front surface 808
(e.g., extending outward from the front surface 808).
[0091] As shown in FIG. 8B, the pose angle 810 of the mobile device
800 can be determined by comparing the orientation of the device
with respect to a reference vector 812 (e.g., a vector indicating
the direction of gravity). In the example shown in FIG. 8B, the
mobile device 800 is oriented such that its front surface is
pointing diagonally upward, and the pose angle 810 refers to the
angle between the reference vector 812 (e.g., a gravity vector),
and the z-axis 806.
[0092] The pose angle 810 can be determined based on the
acceleration experienced by the mobile device 800 along each of the
axes 802, 804, and 806. For example, the pose angle 810 can be
determined by first obtaining values a.sub.x, a.sub.y, and a.sub.z,
representing the acceleration experienced by the mobile device 800
along the x-axis 802, y-axis 804, and z-axis 806, respectively. The
pose angle 810 (.theta.) can then be calculated using the
equation:
.theta.=arctan(a.sub.x/{square root over (a.sub.y.sup.2+a.sub.z.sup.2)})
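For example, the computation might be sketched as follows (a
hypothetical Python sketch following the equation above as
reconstructed; atan2 is used so the result remains well defined when
a.sub.y and a.sub.z are both zero, and the function name is
illustrative):

    import math

    def pose_angle(ax, ay, az):
        """Pose angle (radians) from the acceleration measured along
        the device's x-, y-, and z-axes, per the equation above."""
        return math.atan2(ax, math.sqrt(ay * ay + az * az))

    # Example: per the equation, a reading of (0, 0, 1 g) yields a
    # pose angle of 0 degrees.
    print(math.degrees(pose_angle(0.0, 0.0, 1.0)))  # 0.0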
[0093] As described above, in some cases, if the orientation of the
device experiences a relatively high degree of variation over a
period of time (e.g., greater than or equal to a threshold value),
the mobile device is determined to be on the user's body (e.g., in
the user's lap or pocket, or held against the user's head).
However, if the orientation of the device experiences a relatively
low degree of variation over the period of time (e.g., less than a
threshold value), the mobile device is determined to be on an
inanimate or stationary object (e.g., a table, desk, or chair).
[0094] In some cases, the mobile device's variation in orientation
can be determined by calculating a "coefficient of variation." This
can be calculated by sampling the mobile device's pose angle multiple
times over a period of time (e.g., 1, 10, 50, 100, 200 times per
epoch, or any other number of times). The mobile device's
coefficient of variation during that period of time can be
subsequently calculated by dividing the standard deviation of the
pose angles over the period of time, by the absolute value of the
mean of the pose angles over the period of time. For example, the
coefficient of variation, C.sub.v, can be calculated using the
equation:
C.sub.v=stdev(.theta..sub.1, . . .
,.theta..sub.n)/abs(mean(.theta..sub.1, . . . ,.theta..sub.n)),
[0095] where .theta..sub.1, . . . , .theta..sub.n are the pose angles
determined over the period of time.
[0096] If the coefficient of variation, C.sub.v, is relatively
large (e.g., greater than or equal to a threshold value), the
mobile device is determined to be on the user's body (e.g., in the
user's lap or pocket, or held against the user's head). However, if
the coefficient of variation, C.sub.v, is relatively small (e.g.,
less than a threshold value), the mobile device is determined to be
on an inanimate or stationary object (e.g., a table, desk, or
chair). The threshold value can be assigned a value that accurately
distinguishes between the mobile device being on the user's body
and the mobile device being on an inanimate or stationary object.
In some cases, the threshold value can be assigned a value based on
empirical tests.
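A minimal sketch of this secondary determination follows (hypothetical
Python; the threshold value of 0.1 is an illustrative placeholder, not
a value from this disclosure, since the threshold would be set
empirically as described above):

    import statistics

    def on_body_by_pose_variation(pose_angles, threshold=0.1):
        """Secondary determination from the coefficient of variation
        of the pose angles sampled over an epoch."""
        mean = statistics.mean(pose_angles)
        cv = statistics.stdev(pose_angles) / abs(mean)  # C_v = stdev/|mean|
        return "on-body" if cv >= threshold else "on-object"

    # Example: nearly constant pose angles suggest an inanimate object.
    print(on_body_by_pose_variation([0.50, 0.51, 0.50, 0.49, 0.50]))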
[0097] In some cases, considering the coefficient of variation can
enable the mobile device to more accurately distinguish between
being on the user's body and being on an
inanimate or stationary object. However, in some cases, other
metrics of variation can be considered (e.g., standard deviation
alone, variance, or other metrics).
[0098] If the secondary determination is in agreement that the
device is on a user's body (e.g., if the orientation of the device
experiences a relatively high degree of variation over the period of
time), the mobile device is determined to be on the user's body.
Accordingly, at step 710, the mobile device can decrease the
transmission power of its transceiver. In some cases, if the mobile
device is already operating its transceiver according to a low
transmit power, the mobile device can maintain the low transmit
power.
[0099] However, if the secondary determination is in disagreement
with the initial determination, and instead indicates that the
device is on an inanimate object (e.g., if the orientation of the
device experiences a relatively low degree of variation over the
period of time), the initial determination can be overruled, and
the mobile device is determined to be on an inanimate object.
Accordingly, at step 712, the mobile device can increase the
transmission power of its transceiver. In some cases, if the mobile
device is already operating its transceiver according to a high
transmit power, the mobile device can maintain the high transmit
power.
[0100] In some cases, the initial determination and the secondary
determination can be carried out simultaneously or substantially
simultaneously (e.g., over a common epoch or period of time). For
example, the initial determination and the secondary determination
both can be carried out every 2.56 seconds. If, during a particular
2.56 second period of time, the mobile device initially determines
that it is on a user's body, the mobile device can consult the
secondary determination to confirm or override the initial
determination. As a result, a more accurate determination can be
made regarding the state of the mobile device.
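Pulling these pieces together, the two-stage logic of process 700
might be sketched as follows (hypothetical Python reusing the
illustrative band_powers, classify_motion_state, and
on_body_by_pose_variation helpers sketched earlier; the power levels
are symbolic):

    def select_transmit_power(epoch_accel, pose_angles):
        """Two-stage decision of process 700 over one epoch."""
        low, mid, high = band_powers(epoch_accel)   # features for step 704
        initial = classify_motion_state(low, high)  # initial determination
        if initial == "on-object":
            return "HIGH"                           # step 706
        # Step 708: the secondary, pose-angle-based determination can
        # confirm (step 710) or override (step 712) the initial result.
        secondary = on_body_by_pose_variation(pose_angles)
        return "LOW" if secondary == "on-body" else "HIGH"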
[0101] An example system 900 is shown in FIG. 9. System 900 can be,
for example, included as a part of a mobile device. System 900
includes a motion sensor 902 and a proximity sensor 904 in
communication with a control module 906. During operation, the
motion sensor 902 obtains motion data, either continuously,
periodically, or intermittently (e.g., in response to an
instruction by the control module 906), and transmits the motion
data to control module 906. Motion sensor 902 can be implemented,
for example, using one or more of the motion sensors described
above. In a similar manner, during operation, the proximity sensor
904 obtains proximity data, either continuously, periodically, or
intermittently, and transmits the proximity data to the control
module 906. Proximity sensor 904 can be implemented, for example,
using one or more of the proximity sensors described above.
[0102] Based on the motion data and/or proximity data obtained from
motion sensor 902 and proximity sensor 904, the control module 906
adjusts the transmit power of the mobile device. For example, the
control module 906 can use implementations of processes 400, 500,
and/or 700 to determine whether the mobile device is on a user's body or on an
object. In response, the control module 906 can adjust the transmit
power of the mobile device, for example by sending instructions to
baseband module 908 (e.g., a processor or other component that
manages the operation of a data transmission module). In response
to these instructions, the baseband module 908 adjusts the transmit
power of the mobile device accordingly.
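As a sketch of how the components of system 900 might interact, the
following hypothetical Python outlines a control module that polls the
sensors and instructs the baseband module (all class and method names
are illustrative; the disclosure does not specify a programming
interface):

    class ControlModule:
        """Hypothetical sketch of control module 906: it gathers
        motion and proximity data and instructs the baseband module
        to adjust transmit power accordingly."""

        def __init__(self, motion_sensor, proximity_sensor, baseband):
            self.motion = motion_sensor        # e.g., motion sensor 902
            self.proximity = proximity_sensor  # e.g., proximity sensor 904
            self.baseband = baseband           # e.g., baseband module 908

        def on_epoch(self):
            # Intermittent read, e.g., in response to a control-module
            # request, as described above.
            accel, pose_angles = self.motion.read_epoch()
            if not self.proximity.is_near():
                # Nothing detected nearby: treat the device as on an object.
                self.baseband.set_transmit_power("HIGH")
                return
            # Otherwise apply the two-stage decision sketched earlier.
            self.baseband.set_transmit_power(
                select_transmit_power(accel, pose_angles))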
Application Programming Interfaces
[0103] One or more Application Programming Interfaces (APIs) may be
used in implementations described herein. An API is an interface
implemented by a program code component or hardware component
(hereinafter "API-implementing component") that allows a different
program code component or hardware component (hereinafter
"API-calling component") to access and use one or more functions,
methods, procedures, data structures, classes, and/or other
services provided by the API-implementing component. An API can
define one or more parameters that are passed between the
API-calling component and the API-implementing component.
[0104] An API allows a developer of an API-calling component (which
may be a third party developer) to leverage specified features
provided by an API-implementing component. There may be one
API-calling component or there may be more than one such component.
An API can be a source code interface that a computer system or
program library provides in order to support requests for services
from an application. An operating system (OS) can have multiple
APIs to allow applications running on the OS to call one or more of
those APIs, and a service (such as a program library) can have
multiple APIs to allow an application that uses the service to call
one or more of those APIs. An API can be specified in terms of a
programming language that can be interpreted or compiled when an
application is built.
[0105] In some implementations, the API-implementing component may
provide more than one API, each providing access to different aspects
of the functionality implemented by the API-implementing component.
For example, one API of an API-implementing component can provide a
first set of functions and can be exposed to third party
developers, and another API of the API-implementing component can
be hidden (not exposed) and provide a subset of the first set of
functions and also provide another set of functions, such as
testing or debugging functions which are not in the first set of
functions. In other implementations, the API-implementing component
may itself call one or more other components via an underlying API
and thus be both an API-calling component and an API-implementing
component.
[0106] An API defines the language and parameters that API-calling
components use when accessing and using specified features of the
API-implementing component. For example, an API-calling component
accesses the specified features of the API-implementing component
through one or more API calls or invocations (embodied for example
by function or method calls) exposed by the API and passes data and
control information using parameters via the API calls or
invocations. The API-implementing component may return a value
through the API in response to an API call from an API-calling
component. While the API defines the syntax and result of an API
call (e.g., how to invoke the API call and what the API call does),
the API may not reveal how the API call accomplishes the function
specified by the API call. Various API calls are transferred via
the one or more application programming interfaces between the
calling component (the API-calling component) and an API-implementing component.
Transferring the API calls may include issuing, initiating,
invoking, calling, receiving, returning, or responding to the
function calls or messages; in other words, transferring can
describe actions by either of the API-calling component or the
API-implementing component. The function calls or other invocations
of the API may send or receive one or more parameters through a
parameter list or other structure. A parameter can be a constant,
key, data structure, object, object class, variable, data type,
pointer, array, list or a pointer to a function or method or
another way to reference a data or other item to be passed via the
API.
[0107] Furthermore, data types or classes may be provided by the
API and implemented by the API-implementing component. Thus, the
API-calling component may declare variables, use pointers to such
types or classes, and use or instantiate constant values of such
types or classes by using definitions provided in the API.
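These general concepts can be illustrated with a short, purely
hypothetical sketch in Python: one component implements an API
(defining its calls, parameters, and return values) while a separate
component calls it without any visibility into how the calls are
carried out. All names here are illustrative placeholders.

    class PowerAPI:
        """The API: declares the calls, their parameters, and their
        return values, without revealing how they are accomplished."""
        def current_power_level(self) -> int:
            raise NotImplementedError
        def set_power_level(self, level: int) -> bool:
            raise NotImplementedError

    class BasebandPower(PowerAPI):
        """API-implementing component: provides the behavior behind
        the API (placeholder logic for illustration only)."""
        def __init__(self):
            self._level = 1
        def current_power_level(self) -> int:
            return self._level
        def set_power_level(self, level: int) -> bool:
            self._level = level
            return True  # value returned through the API to the caller

    def api_calling_component(api: PowerAPI) -> None:
        """API-calling component: passes parameters and receives
        results via the API, not the implementation."""
        if api.current_power_level() > 2:
            api.set_power_level(2)

    api_calling_component(BasebandPower())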
[0108] Generally, an API can be used to access a service or data
provided by the API-implementing component or to initiate
performance of an operation or computation provided by the
API-implementing component. By way of example, the API-implementing
component and the API-calling component may each be any one of an
operating system, a library, a device driver, an API, an
application program, or other module (e.g., the API-implementing
component and the API-calling component may be the same or
different type of module from each other). API-implementing
components may in some cases be embodied at least in part in
firmware, microcode, or other hardware logic.
[0109] In some implementations, an API may allow a client program
to use the services provided by a Software Development Kit (SDK)
library. In other embodiments an application or other client
program may use an API provided by an Application Framework. In
these implementations, the application or client program may
incorporate calls to functions or methods provided by the SDK
and/or provided by the API or use data types or objects defined in
the SDK and provided by the API. An Application Framework may in
these implementations provide a main event loop for a program that
responds to various events defined by the Framework. The API allows
the application to specify the events and the responses to the
events using the Application Framework. In some implementations, an
API call can report to an application the capabilities or state of
a hardware device, including those related to aspects such as input
capabilities and state, output capabilities and state, processing
capability, power state, storage capacity and state, communications
capability, etc., and the API may be implemented in part by
firmware, microcode, or other low level logic that executes in part
on the hardware component.
[0110] The API-calling component may be a local component (e.g., on
the same data processing system as the API-implementing component)
or a remote component (e.g., on a different data processing system
from the API-implementing component) that communicates with the
API-implementing component through the API over a network. An
API-implementing component may also act as an API-calling component
(e.g., it may make API calls to an API exposed by a different
API-implementing component) and an API-calling component may also
act as an API-implementing component by implementing an API that is
exposed to a different API-calling component.
[0111] The API may allow multiple API-calling components written in
different programming languages to communicate with the
API-implementing component; thus, the API may include features for
translating calls and returns between the API-implementing
component and the API-calling component. However, the API may be
implemented in terms of a specific programming language. An
API-calling component can, in one embodiment, call APIs from
different providers such as a set of APIs from an OS provider and
another set of APIs from a plug-in provider and another set of APIs
from another provider (e.g., the provider of a software library) or
creator of another set of APIs.
[0112] FIG. 10 is a block diagram illustrating an example API
architecture 1000, which can be used in some implementations. As
shown in FIG. 10, the API architecture 1000 includes the
API-implementing component 1010 (e.g., an operating system, a
library, a device driver, an API, an application program, software
or other module) that implements the API 1020. The API 1020 can
specify one or more functions, methods, classes, objects,
protocols, data structures, formats and/or other features of the
API-implementing component that may be used by the API-calling
component 1030. The API 1020 can specify at least one calling
convention that specifies how a function in the API-implementing
component receives parameters 1032 from the API-calling component
and how the function returns a result 1022 to the API-calling
component. The API-calling component 1030 (e.g., an operating
system, a library, a device driver, an API, an application program,
software or other module) makes API calls through the API 1020 to
access and use the features of the API-implementing component 1010
that are specified by the API 1020. The API-implementing component
1010 may return a value through the API 1020 to the API-calling
component 1030 in response to an API call.
[0113] For example, the API-implementing component 1010 can include
additional functions, methods, classes, data structures, and/or
other features that are not specified through the API 1020 and are
not available to the API-calling component 1030. The API-calling
component 1030 may be on the same system as the API-implementing
component 1010 or may be located remotely and access the
API-implementing component 1010 using the API 1020 over a network.
While FIG. 10 illustrates a single API-calling component 1030
interacting with the API 1020, other API-calling components, which
may be written in different programming languages (or the same
language) as the API-calling component 1030, may use the API 1020.
[0114] The API-implementing component 1010, the API 1020, and the
API-calling component 1030 may be stored in a machine-readable
medium, which includes any mechanism for storing information in a
form readable by a machine (e.g., a computer or other data
processing system). For example, a machine-readable medium includes
magnetic disks, optical disks, random access memory, read-only
memory, flash memory devices, etc.
[0115] FIG. 11 illustrates an example software stack 1100 that
includes various application programming interfaces. As illustrated
by FIG. 11, applications 1102 and 1104 can make calls to Service A
1106 or Service B 1108 using several Service APIs 1110, 1112, and
1114 and to Operating System (OS) 1118 using several OS APIs
1120-1122. Service A 1106 or Service B 1108 can make calls to OS
1118 using several OS APIs 1120-1122.
[0116] Note that the Service B 1108 has two APIs 1112 and 1114, one
of which, Service B API 1 1112, receives calls from and returns
values to Application 1 1102 and the other, Service B API 2 1114,
receives calls from and returns values to Application 2 1104.
Service A 1106 (which can be, for example, a software library)
makes calls to and receives returned values from OS API 1 1120, and
Service B 1108 (which can be, for example, a software library)
makes calls to and receives returned values from both OS API 1 1120
and OS API 2 1122. Application 2 1104 makes calls to and receives
returned values from OS API 2 1122.
Example System Architecture
[0117] FIG. 12 is a block diagram of an example computing device
1200 that can implement the features and processes of FIGS. 1-11.
As an example, computing device 1200 can be used to implement
system 900 of FIG. 9, and perform implementations of process 400 of
FIG. 4, process 500 of FIG. 5, and/or process 700 of FIG. 7. The
computing device 1200 can include a memory interface 1202, one or
more data processors, image processors and/or central processing
units 1204, and a peripherals interface 1206. The memory interface
1202, the one or more processors 1204 and/or the peripherals
interface 1206 can be separate components or can be integrated in
one or more integrated circuits. The various components in the
computing device 1200 can be coupled by one or more communication
buses or signal lines.
[0118] Sensors, devices, and subsystems can be coupled to the
peripherals interface 1206 to facilitate multiple functionalities.
For example, a motion sensor 1210, a light sensor 1212, and a
proximity sensor 1214 can be coupled to the peripherals interface
1206 to facilitate orientation, lighting, and proximity functions.
Other sensors 1216 can also be connected to the peripherals
interface 1206, such as a global navigation satellite system (GNSS)
(e.g., GPS receiver), a temperature sensor, a biometric sensor,
a magnetometer, or other sensing device, to facilitate related
functionalities.
[0119] A camera subsystem 1220 and an optical sensor 1222, e.g., a
charge-coupled device (CCD) or a complementary metal-oxide
semiconductor (CMOS) optical sensor, can be utilized to facilitate
camera functions, such as recording photographs and video clips.
The camera subsystem 1220 and the optical sensor 1222 can be used
to collect images of a user to be used during authentication of a
user, e.g., by performing facial recognition analysis.
[0120] Communication functions can be facilitated through one or
more wireless communication subsystems 1224, which can include
radio frequency receivers and transmitters and/or optical (e.g.,
infrared) receivers and transmitters. The specific design and
implementation of the communication subsystem 1224 can depend on
the communication network(s) over which the computing device 1200
is intended to operate. For example, the computing device 1200 can
include communication subsystems 1224 designed to operate over a
GSM network, a GPRS network, an EDGE network, a Wi-Fi or WiMax
network, and a Bluetooth.TM. network. In particular, the wireless
communication subsystems 1224 can include hosting protocols such
that the device 1200 can be configured as a base station for other
wireless devices.
[0121] An audio subsystem 1226 can be coupled to a speaker 1228 and
a microphone 1230 to facilitate voice-enabled functions, such as
speaker recognition, voice replication, digital recording, and
telephony functions. The audio subsystem 1226 can be configured to
facilitate processing voice commands, voiceprinting and voice
authentication, for example.
[0122] The I/O subsystem 1240 can include a touch-surface
controller 1242 and/or other input controller(s) 1244. The
touch-surface controller 1242 can be coupled to a touch surface
1246. The touch surface 1246 and touch-surface controller 1242 can,
for example, detect contact and movement or break thereof using any
of a plurality of touch sensitivity technologies, including but not
limited to capacitive, resistive, infrared, and surface acoustic
wave technologies, as well as other proximity sensor arrays or
other elements for determining one or more points of contact with
the touch surface 1246.
[0123] The other input controller(s) 1244 can be coupled to other
input/control devices 1248, such as one or more buttons, rocker
switches, thumb-wheel, infrared port, USB port, and/or a pointer
device such as a stylus. The one or more buttons (not shown) can
include an up/down button for volume control of the speaker 1228
and/or the microphone 1230.
[0124] In one implementation, a pressing of the button for a first
duration can disengage a lock of the touch surface 1246; and a
pressing of the button for a second duration that is longer than
the first duration can turn power to the computing device 1200 on
or off. Pressing the button for a third duration can activate a
voice control, or voice command, module that enables the user to
speak commands into the microphone 1230 to cause the device to
execute the spoken command. The user can customize a functionality
of one or more of the buttons. The touch surface 1246 can, for
example, also be used to implement virtual or soft buttons and/or a
keyboard.
[0125] In some implementations, the computing device 1200 can
present recorded audio and/or video files, such as MP3, AAC, and
MPEG files. In some implementations, the computing device 1200 can
include the functionality of an MP3 player, such as an iPod.TM..
The computing device 1200 can, therefore, include a 36-pin
connector that is compatible with the iPod. Other input/output and
control devices can also be used.
[0126] The memory interface 1202 can be coupled to memory 1250. The
memory 1250 can include high-speed random access memory and/or
non-volatile memory, such as one or more magnetic disk storage
devices, one or more optical storage devices, and/or flash memory
(e.g., NAND, NOR). The memory 1250 can store an operating system
1252, such as Darwin, RTXC, LINUX, UNIX, OS X, WINDOWS, or an
embedded operating system such as VxWorks.
[0127] The operating system 1252 can include instructions for
handling basic system services and for performing hardware
dependent tasks. In some implementations, the operating system 1252
can be a kernel (e.g., UNIX kernel). In some implementations, the
operating system 1252 can include instructions for determining
whether the mobile device is on a user's body or on a stationary
object and adjusting the transmit power of the mobile device
accordingly. For example, operating system 1252 can implement the
mobile device locating and transmission power adjustment features
as described with reference to FIGS. 1-11.
[0128] The memory 1250 can also store communication instructions
1254 to facilitate communicating with one or more additional
devices, one or more computers and/or one or more servers. The
memory 1250 can include graphical user interface instructions 1256
to facilitate graphic user interface processing; sensor processing
instructions 1258 to facilitate sensor-related processing and
functions; phone instructions 1260 to facilitate phone-related
processes and functions; electronic messaging instructions 1262 to
facilitate electronic-messaging related processes and functions;
web browsing instructions 1264 to facilitate web browsing-related
processes and functions; media processing instructions 1266 to
facilitate media processing-related processes and functions;
GNSS/Navigation instructions 1268 to facilitate GNSS and
navigation-related processes and functions; and/or camera
instructions 1270 to facilitate camera-related processes and
functions. The memory 1250 can also store software instructions 1272 to
facilitate other processes and functions, such as the mobile device
locating and transmission power adjustment processes and functions
as described with reference to FIGS. 1-11.
[0129] The memory 1250 can also store other software instructions
1274, such as web video instructions to facilitate web
video-related processes and functions; and/or web shopping
instructions to facilitate web shopping-related processes and
functions. In some implementations, the media processing
instructions 1266 are divided into audio processing instructions
and video processing instructions to facilitate audio
processing-related processes and functions and video
processing-related processes and functions, respectively.
[0130] Each of the above identified instructions and applications
can correspond to a set of instructions for performing one or more
functions described above. These instructions need not be
implemented as separate software programs, procedures, or modules.
The memory 1250 can include additional instructions or fewer
instructions. Furthermore, various functions of the computing
device 1200 can be implemented in hardware and/or in software,
including in one or more signal processing and/or application
specific integrated circuits.
* * * * *