U.S. patent application number 17/176,908 was filed with the patent office on 2021-02-16 and published on 2021-06-17 under publication number 20210181903, entitled "User Interfaces for Playing and Managing Audio Items." The applicant listed for this patent is Apple Inc. The invention is credited to Taylor G. CARRIGAN, Christopher Patrick FOSS, and Stephen O. LEMAY.

United States Patent Application 20210181903
Kind Code: A1
CARRIGAN; Taylor G.; et al.
June 17, 2021

USER INTERFACES FOR PLAYING AND MANAGING AUDIO ITEMS
Abstract
The present disclosure generally relates to playing and managing
audio items. In some examples, an electronic device provides
intuitive user interfaces for playing and managing audio items on
the device. In some examples, an electronic device provides
seamless transitioning from navigating a stack of items
corresponding to groups of audio items to navigating a list of
menus. In some examples, an electronic device provides for quick
and easy access between different applications that are active on
the device. In some examples, an electronic device enables
automatic transmission of data associated with audio items to be
stored locally on a linked external device.
Inventors: CARRIGAN; Taylor G. (San Francisco, CA); FOSS; Christopher Patrick (San Francisco, CA); LEMAY; Stephen O. (Palo Alto, CA)

Applicant: Apple Inc. (Cupertino, CA, US)

Family ID: 1000005417557

Appl. No.: 17/176908

Filed: February 16, 2021
Related U.S. Patent Documents

Application Number | Filing Date  | Patent Number
15/730,610         | Oct 11, 2017 | 10,928,980
62/505,760         | May 12, 2017 |
Current U.S. Class: 1/1

Current CPC Class: H04N 21/8113 20130101; H04N 21/41407 20130101; H04N 21/4126 20130101; H04N 21/4586 20130101; H04N 21/43637 20130101; H04N 21/8173 20130101; G06F 16/639 20190101; H04N 21/4825 20130101; H04N 21/433 20130101; H04N 21/47214 20130101; G06F 16/64 20190101; G06F 3/0482 20130101; G06F 3/0362 20130101; G06F 3/0488 20130101

International Class: G06F 3/0482 20060101 G06F003/0482; G06F 3/0362 20060101 G06F003/0362; G06F 3/0488 20060101 G06F003/0488; G06F 16/64 20060101 G06F016/64; G06F 16/638 20060101 G06F016/638; H04N 21/4363 20060101 H04N021/4363; H04N 21/81 20060101 H04N021/81; H04N 21/41 20060101 H04N021/41; H04N 21/433 20060101 H04N021/433; H04N 21/414 20060101 H04N021/414; H04N 21/458 20060101 H04N021/458; H04N 21/472 20060101 H04N021/472; H04N 21/482 20060101 H04N021/482
Claims
1. An electronic device, comprising: a touch-sensitive display; a
wireless communication radio; one or more processors; and memory
storing one or more programs configured to be executed by the one
or more processors, the one or more programs including instructions
for: displaying, on the touch-sensitive display, a user interface
including a plurality of item groups and a plurality of selection
affordances associated with the plurality of item groups, wherein a
selection affordance has a first state and a second state, and
wherein data of the plurality of item groups are stored on the
electronic device; receiving user input on a first selection
affordance associated with a first item group; in accordance with a
determination that the first selection affordance is in the first
state, designating the first item group; in accordance with a
determination that the first selection affordance is in the second
state, forgoing designating the first item group; and subsequent to
detecting, via the wireless communication radio, an external
device: in accordance with a determination that the first item
group is designated, automatically transmitting data of the items
associated with the first item group to the external device to be
stored on the external device; in accordance with a determination
that the first item group is not designated, forgoing automatically transmitting data of the items associated with the first item group to the external device to be stored on the external device.
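The branching recited in claim 1 can be sketched as follows; this is purely an illustrative model, not the claimed implementation, and every name in it (SyncModel, on_selection_input, and so on) is invented for the example:

```python
# Hypothetical sketch of claim 1's logic: a selection affordance in its
# first state designates an item group; once an external device is
# detected, only designated groups' item data are transmitted to it.

class SyncModel:
    def __init__(self, item_groups):
        self.item_groups = item_groups   # group name -> items stored locally
        self.designated = set()

    def on_selection_input(self, group, affordance_state):
        # First state: designate the group. Second state: forgo designating it.
        if affordance_state == "first":
            self.designated.add(group)

    def on_external_device_detected(self, external_storage):
        # Automatically transmit item data for designated groups only.
        for group, items in self.item_groups.items():
            if group in self.designated:
                external_storage[group] = list(items)


model = SyncModel({"Workout": ["song_a", "song_b"], "Chill": ["song_c"]})
model.on_selection_input("Workout", "first")    # designated
model.on_selection_input("Chill", "second")     # designation forgone
external = {}
model.on_external_device_detected(external)
```

After detection, only the "Workout" group's items reach the external store; the undesignated "Chill" group is skipped, mirroring the two determination branches of the claim.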
2. The electronic device of claim 1, wherein the one or more
programs further include instructions for: in accordance with the
determination that the first item group is designated: in
accordance with a determination that a first item associated with
the first item group is stored on the external device, forgoing automatically transmitting the data of the first item to the external
device.
3. The electronic device of claim 1, wherein the one or more
programs further include instructions for: in accordance with the
determination that the first item group is designated: in
accordance with a determination that a second item not associated
with the first item group stored on the electronic device is stored
on the external device, causing data of the second item to be
removed from the external device.
4. The electronic device of claim 1, wherein the one or more
programs further include instructions for: receiving a second user
input on the user interface; in response to receiving the second
user input, displaying, on the display, at least one stored item
group of a plurality of stored item groups stored on the external
device; receiving user selection of an edit affordance; in response
to receiving the user selection of the edit affordance, displaying,
on the touch-sensitive display, a plurality of removal affordances
associated with the plurality of stored item groups; receiving user
selection of a first removal affordance of the plurality of removal
affordances associated with a first stored item group of the
plurality of stored item groups; and in response to receiving the
user selection of the first removal affordance, causing data of the
first stored item group to be removed from the external device.
5. The electronic device of claim 4, wherein the one or more
programs further include instructions for: prior to causing the
data of the first stored item group to be removed from the external
device, displaying, on the touch-sensitive display, a confirmation
affordance; receiving user selection of the confirmation
affordance; and in response to receiving the user selection of the
confirmation affordance, causing data of the first stored item
group to be removed from the external device.
6. The electronic device of claim 1, wherein the one or more
programs further include instructions for: prior to displaying, on
the touch-sensitive display, the user interface including the
plurality of item groups and the plurality of selection affordances
associated with the plurality of item groups, displaying, on the
touch-sensitive display, an initial setup user interface including
a proceeding affordance; receiving user selection of the proceeding
affordance; and in response to receiving the user selection of the
proceeding affordance, displaying the user interface.
7. The electronic device of claim 6, wherein the initial setup user
interface is displayed in response to detecting, via the wireless
communication radio, connectivity with the external device.
8. The electronic device of claim 1, wherein the one or more
programs further include instructions for: prior to automatically
transmitting the data of the items in the first item group to the
external device to be stored on the external device, displaying, on
the touch-sensitive display, a confirmation sheet indicating that
the first item group is designated.
9. The electronic device of claim 1, wherein a default state of the
first selection affordance is the first state.
10. The electronic device of claim 1, wherein the one or more
programs further include instructions for: prior to automatically
transmitting the data of the items in the first item group to the
external device to be stored on the external device, receiving, via
the wireless communication radio, charge state information of the
external device; in accordance with a determination, based on the
received charge state information, that the external device is
currently being charged, automatically transmitting the data of the
items associated with the first item group to the external device;
and in accordance with a determination, based on the received
charge state information, that the external device is not currently
being charged, forgoing automatically transmitting the data of the items associated with the first item group to the external device.
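The charge-state gating of claim 10 amounts to a single conditional on the reported charge state; a minimal sketch, with all names invented for illustration:

```python
# Hypothetical sketch of claim 10: the automatic transfer proceeds only
# when the received charge-state information indicates the external
# device is currently being charged.

def maybe_transmit(items, is_charging, send):
    # Forgo the transfer entirely unless the device is on the charger.
    if not is_charging:
        return False
    for item in items:
        send(item)
    return True


sent = []
charging_result = maybe_transmit(["song_a", "song_b"], True, sent.append)
idle_result = maybe_transmit(["song_c"], False, sent.append)
```

Gating on charge state means a bulk transfer never drains the external device's battery, which is presumably the motivation for this limitation.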
11. The electronic device of claim 1, wherein the user interface
includes a storage limit indicator of the external device.
12. The electronic device of claim 1, wherein the user interface
includes a storage bar indicating storage information of the
external device.
13. The electronic device of claim 12, wherein the one or more
programs further include instructions for: subsequent to
transmitting the data of the items associated with the first item
group to the external device, receiving, via the wireless
communication radio, updated storage information of the external
device; and in response to receiving the updated storage
information of the external device, updating the storage bar to
reflect the updated storage information.
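The storage bar of claims 12 and 13 can be modeled as a fraction of the external device's capacity, recomputed whenever updated storage information arrives; a hypothetical sketch (the function name and byte figures are invented):

```python
# Hypothetical sketch of the storage-bar update in claims 12-13: after
# a transfer, the device receives updated storage information and
# redraws the bar as a used/total fraction of external capacity.

def storage_fraction(used_bytes, total_bytes):
    # Clamp so a stale or inconsistent report cannot overflow the bar.
    if total_bytes <= 0:
        return 0.0
    return max(0.0, min(1.0, used_bytes / total_bytes))


before = storage_fraction(2_000_000_000, 8_000_000_000)   # quarter full
after = storage_fraction(6_000_000_000, 8_000_000_000)    # after a transfer
```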
14. The electronic device of claim 1, wherein the electronic device
is paired with the external device.
15. A non-transitory computer-readable storage medium storing one
or more programs configured to be executed by one or more
processors of an electronic device with a touch-sensitive display
and a wireless communication radio, the one or more programs
including instructions for: displaying, on the touch-sensitive
display, a user interface including a plurality of item groups and
a plurality of selection affordances associated with the plurality
of item groups, wherein a selection affordance has a first state
and a second state, and wherein data of the plurality of item
groups are stored on the electronic device; receiving user input on
a first selection affordance associated with a first item group; in
accordance with a determination that the first selection affordance
is in the first state, designating the first item group; in
accordance with a determination that the first selection affordance
is in the second state, forgoing designating the first item group;
and subsequent to detecting, via the wireless communication radio,
an external device: in accordance with a determination that the
first item group is designated, automatically transmitting data of
the items associated with the first item group to the external
device to be stored on the external device; in accordance with a
determination that the first item group is not designated, forgoing automatically transmitting data of the items associated with the first item group to the external device to be stored on the external device.
16. A method, comprising: at an electronic device with a
touch-sensitive display and a wireless communication radio:
displaying, on the touch-sensitive display, a user interface
including a plurality of item groups and a plurality of selection
affordances associated with the plurality of item groups, wherein a
selection affordance has a first state and a second state, and
wherein data of the plurality of item groups are stored on the
electronic device; receiving user input on a first selection
affordance associated with a first item group; in accordance with a
determination that the first selection affordance is in the first
state, designating the first item group; in accordance with a
determination that the first selection affordance is in the second
state, forgoing designating the first item group; and subsequent to
detecting, via the wireless communication radio, an external
device: in accordance with a determination that the first item
group is designated, automatically transmitting data of the items
associated with the first item group to the external device to be
stored on the external device; in accordance with a determination
that the first item group is not designated, forgoing automatically transmitting data of the items associated with the first item group to the external device to be stored on the external device.
Description
CROSS-REFERENCE TO RELATED APPLICATION
[0001] This application is a continuation of U.S. application Ser.
No. 15/730,610, filed Oct. 11, 2017, entitled "USER INTERFACES FOR
PLAYING AND MANAGING AUDIO ITEMS," which claims priority to U.S.
Provisional Application Ser. No. 62/505,760, filed May 12, 2017,
entitled "USER INTERFACES FOR PLAYING AND MANAGING AUDIO ITEMS,"
each of which is hereby incorporated by reference in its entirety for all purposes.
FIELD
[0002] The present disclosure relates generally to computer user
interfaces, and more specifically to techniques for playing and
managing audio items.
BACKGROUND
[0003] Playing and managing audio items, such as music, using
electronic devices is a common occurrence. Further, audio items are
often played and managed across multiple devices. Sometimes, a
device belonging to a user does not store all of the audio items
belonging to the user.
BRIEF SUMMARY
[0004] Attempting to play and manage audio items while engaging in a physical activity such as commuting or exercising, particularly on a portable electronic device with a limited amount of display real estate, can be a cumbersome task. The task is even more cumbersome when a user owns more than one electronic device and audio items must be played and managed across those devices. Therefore, faster, more efficient methods and interfaces for playing and managing audio items are needed.
[0005] Some techniques for playing and managing audio items using
electronic devices, however, are generally cumbersome and
inefficient. For example, some existing techniques use a complex
and time-consuming user interface, which may include multiple key
presses or keystrokes. Existing techniques require more time than
necessary, wasting user time and device energy. This latter
consideration is particularly important in battery-operated
devices.
[0006] Accordingly, the present technique provides electronic
devices with faster, more efficient methods and interfaces for
playing and managing audio items. Such methods and interfaces
optionally complement or replace other methods for playing and
managing audio items. Such methods and interfaces reduce the
cognitive burden on a user and produce a more efficient
human-machine interface. For battery-operated computing devices,
such methods and interfaces conserve power and increase the time
between battery charges. Such methods and interfaces also reduce
the number of unnecessary, extraneous, or repetitive inputs required at computing devices, such as smartphones and smartwatches.
[0007] In accordance with some embodiments, a method performed at
an electronic device with a touch-sensitive display is described.
The method comprises: displaying, on the display, a first user
interface, wherein the first user interface includes a scrollable
plurality of audio playlist items associated with a plurality of
audio playlists; receiving a first user input on a first audio
playlist item of the plurality of audio playlist items; in response
to receiving the first user input on the first audio playlist item:
displaying, on the display, a second user interface, wherein the
second user interface includes an indication of a first audio item
of a first audio playlist associated with the first audio playlist
item, and displaying, on the display, a plurality of indicia icons,
wherein a first indicia icon associated with the second user
interface includes an indication that the second user interface is
currently displayed; receiving a second user input on the second
user interface; and in response to receiving the second user input
on the second user interface: displaying, on the display, a third
user interface, wherein the third user interface includes a
plurality of audio items of the first audio playlist, and updating
display of the plurality of indicia icons, wherein a second indicia
icon associated with the third user interface includes the
indication that the third user interface is currently
displayed.
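The navigation described above involves three interfaces and a row of indicia icons whose filled member tracks the visible interface, much like page dots. A hypothetical model (page names and icon labels are invented for illustration):

```python
# Hypothetical model of the paragraph-[0007] navigation: three user
# interfaces (playlist list, now-playing view, track list) and a row
# of indicia icons, with the filled icon marking the interface shown.

PAGES = ["playlists", "now_playing", "track_list"]

def indicia_icons(current_page):
    # One icon per interface; the filled dot follows the visible page.
    return ["filled" if page == current_page else "empty" for page in PAGES]


icons = indicia_icons("now_playing")
```

On each transition between interfaces, recomputing this list and redrawing it is all the "updating display of the plurality of indicia icons" step requires.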
[0008] In accordance with some embodiments, a non-transitory
computer-readable storage medium is described. The non-transitory
computer-readable storage medium storing one or more programs
configured to be executed by one or more processors of an
electronic device with a touch-sensitive display, the one or more
programs including instructions for: displaying, on the display, a
first user interface, wherein the first user interface includes a
scrollable plurality of audio playlist items associated with a
plurality of audio playlists; receiving a first user input on a
first audio playlist item of the plurality of audio playlist items;
in response to receiving the first user input on the first audio
playlist item: displaying, on the display, a second user interface,
wherein the second user interface includes an indication of a first
audio item of a first audio playlist associated with the first
audio playlist item, and displaying, on the display, a plurality of
indicia icons, wherein a first indicia icon associated with the
second user interface includes an indication that the second user
interface is currently displayed; receiving a second user input on
the second user interface; and in response to receiving the second
user input on the second user interface: displaying, on the
display, a third user interface, wherein the third user interface
includes a plurality of audio items of the first audio playlist,
and updating display of the plurality of indicia icons, wherein a
second indicia icon associated with the third user interface
includes the indication that the third user interface is currently
displayed.
[0009] In accordance with some embodiments, a transitory
computer-readable storage medium is described. The transitory
computer-readable storage medium storing one or more programs
configured to be executed by one or more processors of an
electronic device with a touch-sensitive display, the one or more
programs including instructions for: displaying, on the display, a
first user interface, wherein the first user interface includes a
scrollable plurality of audio playlist items associated with a
plurality of audio playlists; receiving a first user input on a
first audio playlist item of the plurality of audio playlist items;
in response to receiving the first user input on the first audio
playlist item: displaying, on the display, a second user interface,
wherein the second user interface includes an indication of a first
audio item of a first audio playlist associated with the first
audio playlist item, and displaying, on the display, a plurality of
indicia icons, wherein a first indicia icon associated with the
second user interface includes an indication that the second user
interface is currently displayed; receiving a second user input on
the second user interface; and in response to receiving the second
user input on the second user interface: displaying, on the
display, a third user interface, wherein the third user interface
includes a plurality of audio items of the first audio playlist,
and updating display of the plurality of indicia icons, wherein a
second indicia icon associated with the third user interface
includes the indication that the third user interface is currently
displayed.
[0010] In accordance with some embodiments, an electronic device is
described. The electronic device comprises: a touch-sensitive
display; one or more processors; and memory storing one or more
programs configured to be executed by the one or more processors,
the one or more programs including instructions for: displaying, on
the display, a first user interface, wherein the first user
interface includes a scrollable plurality of audio playlist items
associated with a plurality of audio playlists; receiving a first
user input on a first audio playlist item of the plurality of audio
playlist items; in response to receiving the first user input on
the first audio playlist item: displaying, on the display, a second
user interface, wherein the second user interface includes an
indication of a first audio item of a first audio playlist
associated with the first audio playlist item, and displaying, on
the display, a plurality of indicia icons, wherein a first indicia
icon associated with the second user interface includes an
indication that the second user interface is currently displayed;
receiving a second user input on the second user interface; and in
response to receiving the second user input on the second user
interface: displaying, on the display, a third user interface,
wherein the third user interface includes a plurality of audio
items of the first audio playlist, and updating display of the
plurality of indicia icons, wherein a second indicia icon
associated with the third user interface includes the indication
that the third user interface is currently displayed.
[0011] In accordance with some embodiments, an electronic device is
described. The electronic device comprises: a touch-sensitive
display; means for displaying, on the display, a first user
interface, wherein the first user interface includes a scrollable
plurality of audio playlist items associated with a plurality of
audio playlists; means for receiving a first user input on a first
audio playlist item of the plurality of audio playlist items;
means, in response to receiving the first user input on the first
audio playlist item, for: displaying, on the display, a second user
interface, wherein the second user interface includes an indication
of a first audio item of a first audio playlist associated with the
first audio playlist item, and displaying, on the display, a
plurality of indicia icons, wherein a first indicia icon associated
with the second user interface includes an indication that the
second user interface is currently displayed; means for receiving a
second user input on the second user interface; and means, in
response to receiving the second user input on the second user
interface, for: displaying, on the display, a third user interface,
wherein the third user interface includes a plurality of audio
items of the first audio playlist, and updating display of the
plurality of indicia icons, wherein a second indicia icon
associated with the third user interface includes the indication
that the third user interface is currently displayed.
[0012] In accordance with some embodiments, a method performed at
an electronic device with a touch-sensitive display is described.
The method comprises: displaying, on the display, an ordered stack
of audio playlist items in a first position, wherein the ordered
stack of audio playlist items includes a first item, a second item,
and a third item, and wherein the first item is displayed in the
first position; receiving a first input in a first direction; in
response to receiving the first input, displaying, on the display,
the ordered stack of audio playlist items in a second position,
wherein the second item is displayed in the second position;
receiving a second input in the first direction; and in response to
receiving the second input: in accordance with a determination that
the second item is a terminal item in the ordered stack of audio
playlist items, displaying, on the display, at least one menu
affordance of a plurality of menu affordances, and in accordance
with a determination that the second item is an intermediate item
in the ordered stack of audio playlist items, displaying, on the
display, the ordered stack of audio playlist items in a third
position, wherein the third item is displayed in the third
position.
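The terminal-versus-intermediate determination above reduces to a bounds check on the stack index; a minimal sketch under that reading (function and item names are invented):

```python
# Hypothetical sketch of paragraph [0012]: an input in the same
# direction either advances through the ordered stack (intermediate
# item) or, at the terminal item, transitions to the menu affordances.

def next_view(stack, displayed_index):
    if displayed_index >= len(stack) - 1:
        return ("menus", None)                 # terminal item: show menus
    return ("stack", displayed_index + 1)      # intermediate item: advance


stack = ["first_item", "second_item", "third_item"]
advance = next_view(stack, 1)    # second item is intermediate
to_menu = next_view(stack, 2)    # third item is terminal
```

This is what makes the transition "seamless" in the abstract's sense: the same directional input both scrolls the stack and, at its end, hands off to the menu list.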
[0013] In accordance with some embodiments, a non-transitory
computer-readable storage medium is described. The non-transitory
computer-readable storage medium storing one or more programs
configured to be executed by one or more processors of an
electronic device with a touch-sensitive display, the one or more
programs including instructions for: displaying, on the display, an
ordered stack of audio playlist items in a first position, wherein
the ordered stack of audio playlist items includes a first item, a
second item, and a third item, and wherein the first item is
displayed in the first position; receiving a first input in a first
direction; in response to receiving the first input, displaying, on
the display, the ordered stack of audio playlist items in a second
position, wherein the second item is displayed in the second
position; receiving a second input in the first direction; and in
response to receiving the second input: in accordance with a
determination that the second item is a terminal item in the
ordered stack of audio playlist items, displaying, on the display,
at least one menu affordance of a plurality of menu affordances,
and in accordance with a determination that the second item is an
intermediate item in the ordered stack of audio playlist items,
displaying, on the display, the ordered stack of audio playlist
items in a third position, wherein the third item is displayed in
the third position.
[0014] In accordance with some embodiments, a transitory
computer-readable storage medium is described. The transitory
computer-readable storage medium storing one or more programs
configured to be executed by one or more processors of an
electronic device with a touch-sensitive display, the one or more
programs including instructions for: displaying, on the display, an
ordered stack of audio playlist items in a first position, wherein
the ordered stack of audio playlist items includes a first item, a
second item, and a third item, and wherein the first item is
displayed in the first position; receiving a first input in a first
direction; in response to receiving the first input, displaying, on
the display, the ordered stack of audio playlist items in a second
position, wherein the second item is displayed in the second
position; receiving a second input in the first direction; and in
response to receiving the second input: in accordance with a
determination that the second item is a terminal item in the
ordered stack of audio playlist items, displaying, on the display,
at least one menu affordance of a plurality of menu affordances,
and in accordance with a determination that the second item is an
intermediate item in the ordered stack of audio playlist items,
displaying, on the display, the ordered stack of audio playlist
items in a third position, wherein the third item is displayed in
the third position.
[0015] In accordance with some embodiments, an electronic device is
described. The electronic device comprises: a touch-sensitive
display; one or more processors; and memory storing one or more
programs configured to be executed by the one or more processors,
the one or more programs including instructions for: displaying, on
the display, an ordered stack of audio playlist items in a first
position, wherein the ordered stack of audio playlist items
includes a first item, a second item, and a third item, and wherein
the first item is displayed in the first position; receiving a
first input in a first direction; in response to receiving the
first input, displaying, on the display, the ordered stack of audio
playlist items in a second position, wherein the second item is
displayed in the second position; receiving a second input in the
first direction; and in response to receiving the second input: in
accordance with a determination that the second item is a terminal
item in the ordered stack of audio playlist items, displaying, on
the display, at least one menu affordance of a plurality of menu
affordances, and in accordance with a determination that the second
item is an intermediate item in the ordered stack of audio playlist
items, displaying, on the display, the ordered stack of audio
playlist items in a third position, wherein the third item is
displayed in the third position.
[0016] In accordance with some embodiments, an electronic device is
described. The electronic device comprises: a touch-sensitive
display; means for displaying, on the display, an ordered stack of
audio playlist items in a first position, wherein the ordered stack
of audio playlist items includes a first item, a second item, and a
third item, and wherein the first item is displayed in the first
position; means for receiving a first input in a first direction;
means, in response to receiving the first input, for displaying, on
the display, the ordered stack of audio playlist items in a second
position, wherein the second item is displayed in the second
position; means for receiving a second input in the first
direction; and means, in response to receiving the second input,
for: in accordance with a determination that the second item is a
terminal item in the ordered stack of audio playlist items,
displaying, on the display, at least one menu affordance of a
plurality of menu affordances, and in accordance with a
determination that the second item is an intermediate item in the
ordered stack of audio playlist items, displaying, on the display,
the ordered stack of audio playlist items in a third position,
wherein the third item is displayed in the third position.
[0017] In accordance with some embodiments, a method performed at
an electronic device with a touch-sensitive display is described.
The method comprises: receiving user input initiating a first
application while a second application different from the first
application is active on the electronic device; displaying, on the
display, a first user interface associated with the first
application and a first affordance associated with the second
application; receiving user selection of the first affordance; in
response to receiving the user selection of the first affordance:
replacing display of the first user interface with display of a
second user interface associated with the second application,
wherein the first application remains active on the electronic
device, and replacing display of the first affordance with display
of a second affordance associated with the first application.
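The interface-and-affordance swap described above is symmetric: both applications stay active, and selecting the affordance exchanges their on-screen roles. A hypothetical sketch (class and app names are invented for illustration):

```python
# Hypothetical sketch of paragraph [0017]: both applications remain
# active; selecting the affordance swaps which one's interface is
# displayed and which is represented only by an affordance.

class TwoAppScreen:
    def __init__(self, displayed_app, affordance_app):
        self.displayed_app = displayed_app     # full user interface shown
        self.affordance_app = affordance_app   # shown only as an affordance

    def select_affordance(self):
        # Replace the displayed UI with the other app's UI, and flip
        # the affordance to point back at the previously displayed app.
        self.displayed_app, self.affordance_app = (
            self.affordance_app,
            self.displayed_app,
        )


screen = TwoAppScreen(displayed_app="workout", affordance_app="music")
screen.select_affordance()
```

Selecting the affordance again restores the original arrangement, giving the "quick and easy access between different applications" the abstract describes.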
[0018] In accordance with some embodiments, a non-transitory
computer-readable storage medium is described. The non-transitory
computer-readable storage medium storing one or more programs
configured to be executed by one or more processors of an
electronic device with a touch-sensitive display, the one or more
programs including instructions for: receiving user input
initiating a first application while a second application different
from the first application is active on the electronic device;
displaying, on the display, a first user interface associated with
the first application and a first affordance associated with the
second application; receiving user selection of the first
affordance; in response to receiving the user selection of the
first affordance: replacing display of the first user interface
with display of a second user interface associated with the second
application, wherein the first application remains active on the
electronic device, and replacing display of the first affordance
with display of a second affordance associated with the first
application.
[0019] In accordance with some embodiments, a transitory
computer-readable storage medium is described. The transitory
computer-readable storage medium storing one or more programs
configured to be executed by one or more processors of an
electronic device with a touch-sensitive display, the one or more
programs including instructions for: receiving user input
initiating a first application while a second application different
from the first application is active on the electronic device;
displaying, on the display, a first user interface associated with
the first application and a first affordance associated with the
second application; receiving user selection of the first
affordance; in response to receiving the user selection of the
first affordance: replacing display of the first user interface
with display of a second user interface associated with the second
application, wherein the first application remains active on the
electronic device, and replacing display of the first affordance
with display of a second affordance associated with the first
application.
[0020] In accordance with some embodiments, an electronic device is
described. The electronic device comprises: a touch-sensitive
display; one or more processors; and memory storing one or more
programs configured to be executed by the one or more processors,
the one or more programs including instructions for: receiving user
input initiating a first application while a second application
different from the first application is active on the electronic
device; displaying, on the display, a first user interface
associated with the first application and a first affordance
associated with the second application; receiving user selection of
the first affordance; in response to receiving the user selection
of the first affordance: replacing display of the first user
interface with display of a second user interface associated with
the second application, wherein the first application remains
active on the electronic device, and replacing display of the first
affordance with display of a second affordance associated with the
first application.
[0021] In accordance with some embodiments, an electronic device is
described. The electronic device comprises: a touch-sensitive
display; means for receiving user input initiating a first
application while a second application different from the first
application is active on the electronic device; means for
displaying, on the display, a first user interface associated with
the first application and a first affordance associated with the
second application; means for receiving user selection of the first
affordance; means in response to receiving the user selection of
the first affordance, for: replacing display of the first user
interface with display of a second user interface associated with
the second application, wherein the first application remains
active on the electronic device, and replacing display of the first
affordance with display of a second affordance associated with the
first application.
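The application-switching behavior recited in paragraphs [0017]-[0021] can be sketched in pseudocode-like Python. This is an illustrative model only, not the disclosed implementation; the class and attribute names (`App`, `Device`, `launch`, `select_affordance`) are assumptions introduced for the sketch:

```python
from dataclasses import dataclass


@dataclass
class App:
    name: str
    active: bool = False


@dataclass
class Device:
    """Minimal model of the two-application affordance swap."""
    displayed_app: App = None
    affordance_app: App = None  # app the on-screen affordance points to

    def launch(self, new_app: App) -> None:
        # Initiating a new application while another is active displays
        # the new application's user interface together with an
        # affordance associated with the previously active application.
        previous = self.displayed_app
        new_app.active = True
        self.displayed_app = new_app
        if previous is not None and previous.active:
            self.affordance_app = previous

    def select_affordance(self) -> None:
        # Selecting the affordance replaces the displayed user interface
        # with that of the affordance's application; the formerly
        # displayed application remains active and becomes the new
        # affordance target.
        target = self.affordance_app
        self.affordance_app = self.displayed_app
        self.displayed_app = target
```

For example, if a music application is active and the user launches a maps application, selecting the affordance returns to the music interface while the maps application remains active, with the affordance now pointing back at maps.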
[0022] In accordance with some embodiments, a method performed at
an electronic device with a touch-sensitive display and a wireless
communication radio is described. The method comprises: displaying,
on the display, a user interface including a plurality of item
groups and a plurality of selection affordances associated with the
plurality of item groups, wherein a selection affordance has a
first state and a second state, and wherein data of the plurality
of item groups are stored on the electronic device; receiving user
input on a first selection affordance associated with a first item
group; in accordance with a determination that the first selection
affordance is in the first state, designating the first item group;
in accordance with a determination that the first selection
affordance is in the second state, forgoing designating the first
item group; subsequent to detecting, via the wireless communication
radio, an external device: in accordance with a determination that
the first item group is designated, automatically transmitting data
of the items associated with the first item group to the external
device to be stored on the external device, and in accordance with
a determination that the first item group is not designated,
forgoing automatically transmitting data of the items associated
with the first item group to the external device to be stored on
the external device.
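The designation and conditional auto-transfer logic of paragraph [0022] can be summarized as follows. This is a hedged sketch, not the disclosed implementation; the function names, the `"on"`/`"off"` state labels, and the data shapes are illustrative assumptions:

```python
def toggle_designation(designated: set, group: str, affordance_state: str) -> None:
    # A selection affordance in the first ("on") state designates its
    # item group for transfer; in the second ("off") state, designation
    # is forgone.
    if affordance_state == "on":
        designated.add(group)
    else:
        designated.discard(group)


def on_external_device_detected(designated: set, groups: dict, transmit) -> None:
    # Subsequent to detecting the external device via the wireless
    # communication radio: data of designated item groups is transmitted
    # automatically; transmission is forgone for non-designated groups.
    for group, items in groups.items():
        if group in designated:
            transmit(group, items)
```

The key design point is that the transfer decision is made per item group at detection time, so the user configures designations in advance and no further input is needed when the external device comes into range.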
[0023] In accordance with some embodiments, a non-transitory
computer-readable storage medium is described. The non-transitory
computer-readable storage medium storing one or more programs
configured to be executed by one or more processors of an
electronic device with a touch-sensitive display and a wireless
communication radio, the one or more programs including
instructions for: displaying, on the display, a user interface
including a plurality of item groups and a plurality of selection
affordances associated with the plurality of item groups, wherein a
selection affordance has a first state and a second state, and
wherein data of the plurality of item groups are stored on the
electronic device; receiving user input on a first selection
affordance associated with a first item group; in accordance with a
determination that the first selection affordance is in the first
state, designating the first item group; in accordance with a
determination that the first selection affordance is in the second
state, forgoing designating the first item group; subsequent to
detecting, via the wireless communication radio, an external
device: in accordance with a determination that the first item
group is designated, automatically transmitting data of the items
associated with the first item group to the external device to be
stored on the external device, and in accordance with a
determination that the first item group is not designated, forgoing
automatically transmitting data of the items associated with the
first item group to the external device to be stored on the
external device.
[0024] In accordance with some embodiments, a transitory
computer-readable storage medium is described. The transitory
computer-readable storage medium storing one or more programs
configured to be executed by one or more processors of an
electronic device with a touch-sensitive display and a wireless
communication radio, the one or more programs including
instructions for: displaying, on the display, a user interface
including a plurality of item groups and a plurality of selection
affordances associated with the plurality of item groups, wherein a
selection affordance has a first state and a second state, and
wherein data of the plurality of item groups are stored on the
electronic device; receiving user input on a first selection
affordance associated with a first item group; in accordance with a
determination that the first selection affordance is in the first
state, designating the first item group; in accordance with a
determination that the first selection affordance is in the second
state, forgoing designating the first item group; subsequent to
detecting, via the wireless communication radio, an external
device: in accordance with a determination that the first item
group is designated, automatically transmitting data of the items
associated with the first item group to the external device to be
stored on the external device, and in accordance with a
determination that the first item group is not designated, forgoing
automatically transmitting data of the items associated with the
first item group to the external device to be stored on the
external device.
[0025] In accordance with some embodiments, an electronic device is
described. The electronic device comprises: a touch-sensitive
display; a wireless communication radio; one or more processors;
and memory storing one or more programs configured to be executed
by the one or more processors, the one or more programs including
instructions for: displaying, on the display, a user interface
including a plurality of item groups and a plurality of selection
affordances associated with the plurality of item groups, wherein a
selection affordance has a first state and a second state, and
wherein data of the plurality of item groups are stored on the
electronic device; receiving user input on a first selection
affordance associated with a first item group; in accordance with a
determination that the first selection affordance is in the first
state, designating the first item group; in accordance with a
determination that the first selection affordance is in the second
state, forgoing designating the first item group; subsequent to
detecting, via the wireless communication radio, an external
device: in accordance with a determination that the first item
group is designated, automatically transmitting data of the items
associated with the first item group to the external device to be
stored on the external device, and in accordance with a
determination that the first item group is not designated, forgoing
automatically transmitting data of the items associated with the
first item group to the external device to be stored on the
external device.
[0026] In accordance with some embodiments, an electronic device is
described. The electronic device comprises: a touch-sensitive
display; a wireless communication radio; means for displaying, on
the display, a user interface including a plurality of item groups
and a plurality of selection affordances associated with the
plurality of item groups, wherein a selection affordance has a
first state and a second state, and wherein data of the plurality
of item groups are stored on the electronic device; means for
receiving user input on a first selection affordance associated
with a first item group; means, in accordance with a determination
that the first selection affordance is in the first state, for
designating the first item group; means, in accordance with a
determination that the first selection affordance is in the second
state, for forgoing designating the first item group; means,
subsequent to detecting, via the wireless communication radio, an
external device, for: in accordance with a determination that the
first item group is designated, automatically transmitting data of
the items associated with the first item group to the external
device to be stored on the external device, and in accordance with
a determination that the first item group is not designated,
forgoing automatically transmitting data of the items associated
with the first item group to the external device to be stored on
the external device.
[0027] Executable instructions for performing these functions are,
optionally, included in a non-transitory computer-readable storage
medium or other computer program product configured for execution
by one or more processors. Executable instructions for performing
these functions are, optionally, included in a transitory
computer-readable storage medium or other computer program product
configured for execution by one or more processors.
[0028] Thus, devices are provided with faster, more efficient
methods and interfaces for playing and managing audio items,
thereby increasing the effectiveness, efficiency, and user
satisfaction with such devices. Such methods and interfaces may
complement or replace other methods for playing and managing audio
items.
DESCRIPTION OF THE FIGURES
[0029] For a better understanding of the various described
embodiments, reference should be made to the Description of
Embodiments below, in conjunction with the following drawings in
which like reference numerals refer to corresponding parts
throughout the figures.
[0030] FIG. 1A is a block diagram illustrating a portable
multifunction device with a touch-sensitive display in accordance
with some embodiments.
[0031] FIG. 1B is a block diagram illustrating exemplary components
for event handling in accordance with some embodiments.
[0032] FIG. 2 illustrates a portable multifunction device having a
touch screen in accordance with some embodiments.
[0033] FIG. 3 is a block diagram of an exemplary multifunction
device with a display and a touch-sensitive surface in accordance
with some embodiments.
[0034] FIG. 4A illustrates an exemplary user interface for a menu
of applications on a portable multifunction device in accordance
with some embodiments.
[0035] FIG. 4B illustrates an exemplary user interface for a
multifunction device with a touch-sensitive surface that is
separate from the display in accordance with some embodiments.
[0036] FIG. 5A illustrates a personal electronic device in
accordance with some embodiments.
[0037] FIG. 5B is a block diagram illustrating a personal
electronic device in accordance with some embodiments.
[0038] FIGS. 6A-6S illustrate exemplary user interfaces for
navigating an application for playing and managing audio items and
managing storage of the audio items, in accordance with some
embodiments.
[0039] FIGS. 7A-7C are a flow diagram illustrating methods of
navigating an application for playing and managing audio items and
managing storage of the audio items, in accordance with some
embodiments.
[0040] FIGS. 8A-8AC illustrate exemplary user interfaces for
navigating an application for playing and managing audio items
using different techniques, in accordance with some
embodiments.
[0041] FIGS. 9A-9D are a flow diagram illustrating methods of user
interfaces for navigating an application for playing and managing
audio items using different techniques, in accordance with some
embodiments.
[0042] FIGS. 10A-10H illustrate exemplary user interfaces for
easily transitioning amongst active applications, in accordance
with some embodiments.
[0043] FIGS. 11A-11B are a flow diagram illustrating methods of
easily transitioning amongst active applications, in accordance
with some embodiments.
[0044] FIGS. 12A-12AE illustrate exemplary user interfaces for
configuring and managing automatic transmission of audio-related
data from one device to another device, in accordance with some
embodiments.
[0045] FIGS. 13A-13C are a flow diagram illustrating methods of
configuring and managing automatic transmission of audio-related
data from one device to another device, in accordance with some
embodiments.
DESCRIPTION OF EMBODIMENTS
[0046] The following description sets forth exemplary methods,
parameters, and the like. It should be recognized, however, that
such description is not intended as a limitation on the scope of
the present disclosure but is instead provided as a description of
exemplary embodiments.
[0047] There is a need for electronic devices that provide
efficient methods and interfaces for playing and managing audio
items. When playing and managing audio items (e.g., songs, radio
stations, podcasts), a user is often engaged in a physical activity
(e.g., walking, exercising, commuting, driving). When engaged in
such physical activities, the user cannot easily devote full
attention to playing and managing audio items. Such techniques can
reduce the cognitive burden on a user who accesses audio items,
thereby enhancing productivity. Further, such techniques can reduce
processor and battery power otherwise wasted on redundant user
inputs.
[0048] Below, FIGS. 1A-1B, 2, 3, 4A-4B, and 5A-5B provide a
description of exemplary devices for performing the techniques for
playing and managing audio items. FIGS. 6A-6S illustrate exemplary user
interfaces for navigating an application for playing and managing
audio items and managing storage of the audio items, in accordance
with some embodiments. FIGS. 7A-7C are a flow diagram illustrating
methods of navigating an application for playing and managing audio
items and managing storage of the audio items, in accordance with
some embodiments. FIGS. 8A-8AC illustrate exemplary user interfaces
for navigating an application for playing and managing audio items
using different techniques, in accordance with some embodiments.
FIGS. 9A-9D are a flow diagram illustrating methods of user
interfaces for navigating an application for playing and managing
audio items using different techniques, in accordance with some
embodiments. FIGS. 10A-10H illustrate exemplary user interfaces for
easily transitioning amongst active applications, in accordance
with some embodiments. FIGS. 11A-11B are a flow diagram
illustrating methods of easily transitioning amongst active
applications, in accordance with some embodiments. FIGS. 12A-12AE
illustrate exemplary user interfaces for configuring and managing
automatic transmission of audio-related data from one device to
another device, in accordance with some embodiments. FIGS. 13A-13C
are a flow diagram illustrating methods of configuring and managing
automatic transmission of audio-related data from one device to
another device, in accordance with some embodiments.
[0049] Although the following description uses terms "first,"
"second," etc. to describe various elements, these elements should
not be limited by the terms. These terms are only used to
distinguish one element from another. For example, a first touch
could be termed a second touch, and, similarly, a second touch
could be termed a first touch, without departing from the scope of
the various described embodiments. The first touch and the second
touch are both touches, but they are not the same touch.
[0050] The terminology used in the description of the various
described embodiments herein is for the purpose of describing
particular embodiments only and is not intended to be limiting. As
used in the description of the various described embodiments and
the appended claims, the singular forms "a," "an," and "the" are
intended to include the plural forms as well, unless the context
clearly indicates otherwise. It will also be understood that the
term "and/or" as used herein refers to and encompasses any and all
possible combinations of one or more of the associated listed
items. It will be further understood that the terms "includes,"
"including," "comprises," and/or "comprising," when used in this
specification, specify the presence of stated features, integers,
steps, operations, elements, and/or components, but do not preclude
the presence or addition of one or more other features, integers,
steps, operations, elements, components, and/or groups thereof.
[0051] The term "if" is, optionally, construed to mean "when" or
"upon" or "in response to determining" or "in response to
detecting," depending on the context. Similarly, the phrase "if it
is determined" or "if [a stated condition or event] is detected"
is, optionally, construed to mean "upon determining" or "in
response to determining" or "upon detecting [the stated condition
or event]" or "in response to detecting [the stated condition or
event]," depending on the context.
[0052] Embodiments of electronic devices, user interfaces for such
devices, and associated processes for using such devices are
described. In some embodiments, the device is a portable
communications device, such as a mobile telephone, that also
contains other functions, such as PDA and/or music player
functions. Exemplary embodiments of portable multifunction devices
include, without limitation, the iPhone®, iPod Touch®, and
iPad® devices from Apple Inc. of Cupertino, Calif. Other
portable electronic devices, such as laptops or tablet computers
with touch-sensitive surfaces (e.g., touch screen displays and/or
touchpads), are, optionally, used. It should also be understood
that, in some embodiments, the device is not a portable
communications device, but is a desktop computer with a
touch-sensitive surface (e.g., a touch screen display and/or a
touchpad).
[0053] In the discussion that follows, an electronic device that
includes a display and a touch-sensitive surface is described. It
should be understood, however, that the electronic device
optionally includes one or more other physical user-interface
devices, such as a physical keyboard, a mouse, and/or a
joystick.
[0054] The device typically supports a variety of applications,
such as one or more of the following: a drawing application, a
presentation application, a word processing application, a website
creation application, a disk authoring application, a spreadsheet
application, a gaming application, a telephone application, a video
conferencing application, an e-mail application, an instant
messaging application, a workout support application, a photo
management application, a digital camera application, a digital
video camera application, a web browsing application, a digital
music player application, and/or a digital video player
application.
[0055] The various applications that are executed on the device
optionally use at least one common physical user-interface device,
such as the touch-sensitive surface. One or more functions of the
touch-sensitive surface as well as corresponding information
displayed on the device are, optionally, adjusted and/or varied
from one application to the next and/or within a respective
application. In this way, a common physical architecture (such as
the touch-sensitive surface) of the device optionally supports the
variety of applications with user interfaces that are intuitive and
transparent to the user.
[0056] Attention is now directed toward embodiments of portable
devices with touch-sensitive displays. FIG. 1A is a block diagram
illustrating portable multifunction device 100 with touch-sensitive
display system 112 in accordance with some embodiments.
Touch-sensitive display 112 is sometimes called a "touch screen"
for convenience and is sometimes known as or called a
"touch-sensitive display system." Device 100 includes memory 102
(which optionally includes one or more computer-readable storage
mediums), memory controller 122, one or more processing units
(CPUs) 120, peripherals interface 118, RF circuitry 108, audio
circuitry 110, speaker 111, microphone 113, input/output (I/O)
subsystem 106, other input control devices 116, and external port
124. Device 100 optionally includes one or more optical sensors
164. Device 100 optionally includes one or more contact intensity
sensors 165 for detecting intensity of contacts on device 100
(e.g., a touch-sensitive surface such as touch-sensitive display
system 112 of device 100). Device 100 optionally includes one or
more tactile output generators 167 for generating tactile outputs
on device 100 (e.g., generating tactile outputs on a
touch-sensitive surface such as touch-sensitive display system 112
of device 100 or touchpad 355 of device 300). These components
optionally communicate over one or more communication buses or
signal lines 103.
[0057] As used in the specification and claims, the term
"intensity" of a contact on a touch-sensitive surface refers to the
force or pressure (force per unit area) of a contact (e.g., a
finger contact) on the touch-sensitive surface, or to a substitute
(proxy) for the force or pressure of a contact on the
touch-sensitive surface. The intensity of a contact has a range of
values that includes at least four distinct values and more
typically includes hundreds of distinct values (e.g., at least
256). Intensity of a contact is, optionally, determined (or
measured) using various approaches and various sensors or
combinations of sensors. For example, one or more force sensors
underneath or adjacent to the touch-sensitive surface are,
optionally, used to measure force at various points on the
touch-sensitive surface. In some implementations, force
measurements from multiple force sensors are combined (e.g., a
weighted average) to determine an estimated force of a contact.
Similarly, a pressure-sensitive tip of a stylus is, optionally,
used to determine a pressure of the stylus on the touch-sensitive
surface. Alternatively, the size of the contact area detected on
the touch-sensitive surface and/or changes thereto, the capacitance
of the touch-sensitive surface proximate to the contact and/or
changes thereto, and/or the resistance of the touch-sensitive
surface proximate to the contact and/or changes thereto are,
optionally, used as a substitute for the force or pressure of the
contact on the touch-sensitive surface. In some implementations,
the substitute measurements for contact force or pressure are used
directly to determine whether an intensity threshold has been
exceeded (e.g., the intensity threshold is described in units
corresponding to the substitute measurements). In some
implementations, the substitute measurements for contact force or
pressure are converted to an estimated force or pressure, and the
estimated force or pressure is used to determine whether an
intensity threshold has been exceeded (e.g., the intensity
threshold is a pressure threshold measured in units of pressure).
Using the intensity of a contact as an attribute of a user input
allows for user access to additional device functionality that may
otherwise not be accessible by the user on a reduced-size device
with limited real estate for displaying affordances (e.g., on a
touch-sensitive display) and/or receiving user input (e.g., via a
touch-sensitive display, a touch-sensitive surface, or a
physical/mechanical control such as a knob or a button).
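The two threshold strategies described in paragraph [0057] — comparing a substitute measurement directly against a threshold in the same units, versus first converting it to an estimated force or pressure — can be sketched as below. The linear conversion and its calibration constant are assumptions for illustration only; the disclosure does not specify a conversion model:

```python
def exceeds_threshold_direct(substitute_value: float,
                             substitute_threshold: float) -> bool:
    # Strategy 1: compare the raw substitute measurement (e.g., contact
    # area or capacitance change) against an intensity threshold stated
    # in units corresponding to the substitute measurement.
    return substitute_value > substitute_threshold


def exceeds_threshold_converted(substitute_value: float,
                                pressure_threshold: float,
                                calibration: float = 0.5) -> bool:
    # Strategy 2: convert the substitute measurement to an estimated
    # pressure (here a hypothetical linear model), then compare against
    # a threshold measured in units of pressure.
    estimated_pressure = calibration * substitute_value
    return estimated_pressure > pressure_threshold
```

Either strategy yields the same kind of boolean decision, so the choice affects only where the calibration burden sits: in the threshold definition (strategy 1) or in the conversion step (strategy 2).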
[0058] As used in the specification and claims, the term "tactile
output" refers to physical displacement of a device relative to a
previous position of the device, physical displacement of a
component (e.g., a touch-sensitive surface) of a device relative to
another component (e.g., housing) of the device, or displacement of
the component relative to a center of mass of the device that will
be detected by a user with the user's sense of touch. For example,
in situations where the device or the component of the device is in
contact with a surface of a user that is sensitive to touch (e.g.,
a finger, palm, or other part of a user's hand), the tactile output
generated by the physical displacement will be interpreted by the
user as a tactile sensation corresponding to a perceived change in
physical characteristics of the device or the component of the
device. For example, movement of a touch-sensitive surface (e.g., a
touch-sensitive display or trackpad) is, optionally, interpreted by
the user as a "down click" or "up click" of a physical actuator
button. In some cases, a user will feel a tactile sensation such as
a "down click" or "up click" even when there is no movement of a
physical actuator button associated with the touch-sensitive
surface that is physically pressed (e.g., displaced) by the user's
movements. As another example, movement of the touch-sensitive
surface is, optionally, interpreted or sensed by the user as
"roughness" of the touch-sensitive surface, even when there is no
change in smoothness of the touch-sensitive surface. While such
interpretations of touch by a user will be subject to the
individualized sensory perceptions of the user, there are many
sensory perceptions of touch that are common to a large majority of
users. Thus, when a tactile output is described as corresponding to
a particular sensory perception of a user (e.g., an "up click," a
"down click," "roughness"), unless otherwise stated, the generated
tactile output corresponds to physical displacement of the device
or a component thereof that will generate the described sensory
perception for a typical (or average) user.
[0059] It should be appreciated that device 100 is only one example
of a portable multifunction device, and that device 100 optionally
has more or fewer components than shown, optionally combines two or
more components, or optionally has a different configuration or
arrangement of the components. The various components shown in FIG.
1A are implemented in hardware, software, or a combination of both
hardware and software, including one or more signal processing
and/or application-specific integrated circuits.
[0060] Memory 102 optionally includes high-speed random access
memory and optionally also includes non-volatile memory, such as
one or more magnetic disk storage devices, flash memory devices, or
other non-volatile solid-state memory devices. Memory controller
122 optionally controls access to memory 102 by other components of
device 100.
[0061] Peripherals interface 118 can be used to couple input and
output peripherals of the device to CPU 120 and memory 102. The one
or more processors 120 run or execute various software programs
and/or sets of instructions stored in memory 102 to perform various
functions for device 100 and to process data. In some embodiments,
peripherals interface 118, CPU 120, and memory controller 122 are,
optionally, implemented on a single chip, such as chip 104. In some
other embodiments, they are, optionally, implemented on separate
chips.
[0062] RF (radio frequency) circuitry 108 receives and sends RF
signals, also called electromagnetic signals. RF circuitry 108
converts electrical signals to/from electromagnetic signals and
communicates with communications networks and other communications
devices via the electromagnetic signals. RF circuitry 108
optionally includes well-known circuitry for performing these
functions, including but not limited to an antenna system, an RF
transceiver, one or more amplifiers, a tuner, one or more
oscillators, a digital signal processor, a CODEC chipset, a
subscriber identity module (SIM) card, memory, and so forth. RF
circuitry 108 optionally communicates with networks, such as the
Internet, also referred to as the World Wide Web (WWW), an intranet
and/or a wireless network, such as a cellular telephone network, a
wireless local area network (LAN) and/or a metropolitan area
network (MAN), and other devices by wireless communication. The RF
circuitry 108 optionally includes well-known circuitry for
detecting near field communication (NFC) fields, such as by a
short-range communication radio. The wireless communication
optionally uses any of a plurality of communications standards,
protocols, and technologies, including but not limited to Global
System for Mobile Communications (GSM), Enhanced Data GSM
Environment (EDGE), high-speed downlink packet access (HSDPA),
high-speed uplink packet access (HSUPA), Evolution, Data-Only
(EV-DO), HSPA, HSPA+, Dual-Cell HSPA (DC-HSDPA), long term
evolution (LTE), near field communication (NFC), wideband code
division multiple access (W-CDMA), code division multiple access
(CDMA), time division multiple access (TDMA), Bluetooth, Bluetooth
Low Energy (BTLE), Wireless Fidelity (Wi-Fi) (e.g., IEEE 802.11a,
IEEE 802.11b, IEEE 802.11g, IEEE 802.11n, and/or IEEE 802.11ac),
voice over Internet Protocol (VoIP), Wi-MAX, a protocol for e-mail
(e.g., Internet message access protocol (IMAP) and/or post office
protocol (POP)), instant messaging (e.g., extensible messaging and
presence protocol (XMPP), Session Initiation Protocol for Instant
Messaging and Presence Leveraging Extensions (SIMPLE), Instant
Messaging and Presence Service (IMPS)), and/or Short Message
Service (SMS), or any other suitable communication protocol,
including communication protocols not yet developed as of the
filing date of this document.
[0063] Audio circuitry 110, speaker 111, and microphone 113 provide
an audio interface between a user and device 100. Audio circuitry
110 receives audio data from peripherals interface 118, converts
the audio data to an electrical signal, and transmits the
electrical signal to speaker 111. Speaker 111 converts the
electrical signal to human-audible sound waves. Audio circuitry 110
also receives electrical signals converted by microphone 113 from
sound waves. Audio circuitry 110 converts the electrical signal to
audio data and transmits the audio data to peripherals interface
118 for processing. Audio data is, optionally, retrieved from
and/or transmitted to memory 102 and/or RF circuitry 108 by
peripherals interface 118. In some embodiments, audio circuitry 110
also includes a headset jack (e.g., 212, FIG. 2). The headset jack
provides an interface between audio circuitry 110 and removable
audio input/output peripherals, such as output-only headphones or a
headset with both output (e.g., a headphone for one or both ears)
and input (e.g., a microphone).
[0064] I/O subsystem 106 couples input/output peripherals on device
100, such as touch screen 112 and other input control devices 116,
to peripherals interface 118. I/O subsystem 106 optionally includes
display controller 156, optical sensor controller 158, intensity
sensor controller 159, haptic feedback controller 161, and one or
more input controllers 160 for other input or control devices. The
one or more input controllers 160 receive/send electrical signals
from/to other input control devices 116. The other input control
devices 116 optionally include physical buttons (e.g., push
buttons, rocker buttons, etc.), dials, slider switches, joysticks,
click wheels, and so forth. In some alternate embodiments, input
controller(s) 160 are, optionally, coupled to any (or none) of the
following: a keyboard, an infrared port, a USB port, and a pointer
device such as a mouse. The one or more buttons (e.g., 208, FIG. 2)
optionally include an up/down button for volume control of speaker
111 and/or microphone 113. The one or more buttons optionally
include a push button (e.g., 206, FIG. 2).
[0065] A quick press of the push button optionally disengages a
lock of touch screen 112 or optionally begins a process that uses
gestures on the touch screen to unlock the device, as described in
U.S. patent application Ser. No. 11/322,549, "Unlocking a Device by
Performing Gestures on an Unlock Image," filed Dec. 23, 2005, U.S.
Pat. No. 7,657,849, which is hereby incorporated by reference in
its entirety. A longer press of the push button (e.g., 206)
optionally turns power to device 100 on or off. The functionality
of one or more of the buttons is, optionally, user-customizable.
Touch screen 112 is used to implement virtual or soft buttons and
one or more soft keyboards.
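The press-duration behavior described above can be summarized in a minimal sketch. This is illustrative only: the threshold value and function names are assumptions, not part of the application.

```python
# Hypothetical sketch of classifying a physical push-button press by
# duration, per paragraph [0065]: a quick press begins the unlock
# process, while a longer press toggles device power. The threshold
# is an assumed value for illustration.

LONG_PRESS_SECONDS = 2.0  # assumed threshold separating quick from long presses

def classify_push_button_press(duration_seconds: float) -> str:
    """Map a press duration to one of the two actions described above."""
    if duration_seconds >= LONG_PRESS_SECONDS:
        return "toggle_power"
    return "begin_unlock"
```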
[0066] Touch-sensitive display 112 provides an input interface and
an output interface between the device and a user. Display
controller 156 receives and/or sends electrical signals from/to
touch screen 112. Touch screen 112 displays visual output to the
user. The visual output optionally includes graphics, text, icons,
video, and any combination thereof (collectively termed
"graphics"). In some embodiments, some or all of the visual output
optionally corresponds to user-interface objects.
[0067] Touch screen 112 has a touch-sensitive surface, sensor, or
set of sensors that accepts input from the user based on haptic
and/or tactile contact. Touch screen 112 and display controller 156
(along with any associated modules and/or sets of instructions in
memory 102) detect contact (and any movement or breaking of the
contact) on touch screen 112 and convert the detected contact into
interaction with user-interface objects (e.g., one or more soft
keys, icons, web pages, or images) that are displayed on touch
screen 112. In an exemplary embodiment, a point of contact between
touch screen 112 and the user corresponds to a finger of the
user.
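Converting a detected contact into interaction with a displayed user-interface object is, at its core, a hit test. The sketch below illustrates that step under assumed representations (named rectangles ordered back to front); it is not the application's implementation.

```python
# Minimal hit-testing sketch for paragraph [0067]: map a point of
# contact to the user-interface object displayed at that location.
# The (name, bounds) representation is an assumption for illustration.

def hit_test(objects, x, y):
    """Return the topmost object whose bounds contain (x, y), else None.

    `objects` is an iterable of (name, (left, top, width, height))
    tuples, ordered back to front.
    """
    hit = None
    for name, (left, top, w, h) in objects:
        if left <= x < left + w and top <= y < top + h:
            hit = name  # later (frontmost) objects win
    return hit
```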
[0068] Touch screen 112 optionally uses LCD (liquid crystal
display) technology, LPD (light emitting polymer display)
technology, or LED (light emitting diode) technology, although
other display technologies are used in other embodiments. Touch
screen 112 and display controller 156 optionally detect contact and
any movement or breaking thereof using any of a plurality of touch
sensing technologies now known or later developed, including but
not limited to capacitive, resistive, infrared, and surface
acoustic wave technologies, as well as other proximity sensor
arrays or other elements for determining one or more points of
contact with touch screen 112. In an exemplary embodiment,
projected mutual capacitance sensing technology is used, such as
that found in the iPhone® and iPod Touch® from Apple Inc.
of Cupertino, Calif.
[0069] A touch-sensitive display in some embodiments of touch
screen 112 is, optionally, analogous to the multi-touch sensitive
touchpads described in the following: U.S. Pat. No. 6,323,846
(Westerman et al.), U.S. Pat. No. 6,570,557 (Westerman et al.),
and/or U.S. Pat. No. 6,677,932 (Westerman), and/or U.S. Patent
Publication 2002/0015024A1, each of which is hereby incorporated by
reference in its entirety. However, touch screen 112 displays
visual output from device 100, whereas touch-sensitive touchpads do
not provide visual output.
[0070] A touch-sensitive display in some embodiments of touch
screen 112 is described in the following applications: (1) U.S.
patent application Ser. No. 11/381,313, "Multipoint Touch Surface
Controller," filed May 2, 2006; (2) U.S. patent application Ser.
No. 10/840,862, "Multipoint Touchscreen," filed May 6, 2004; (3)
U.S. patent application Ser. No. 10/903,964, "Gestures For Touch
Sensitive Input Devices," filed Jul. 30, 2004; (4) U.S. patent
application Ser. No. 11/048,264, "Gestures For Touch Sensitive
Input Devices," filed Jan. 31, 2005; (5) U.S. patent application
Ser. No. 11/038,590, "Mode-Based Graphical User Interfaces For
Touch Sensitive Input Devices," filed Jan. 18, 2005; (6) U.S.
patent application Ser. No. 11/228,758, "Virtual Input Device
Placement On A Touch Screen User Interface," filed Sep. 16, 2005;
(7) U.S. patent application Ser. No. 11/228,700, "Operation Of A
Computer With A Touch Screen Interface," filed Sep. 16, 2005; (8)
U.S. patent application Ser. No. 11/228,737, "Activating Virtual
Keys Of A Touch-Screen Virtual Keyboard," filed Sep. 16, 2005; and
(9) U.S. patent application Ser. No. 11/367,749, "Multi-Functional
Hand-Held Device," filed Mar. 3, 2006. All of these applications
are incorporated by reference herein in their entirety.
[0071] Touch screen 112 optionally has a video resolution in excess
of 100 dpi. In some embodiments, the touch screen has a video
resolution of approximately 160 dpi. The user optionally makes
contact with touch screen 112 using any suitable object or
appendage, such as a stylus, a finger, and so forth. In some
embodiments, the user interface is designed to work primarily with
finger-based contacts and gestures, which can be less precise than
stylus-based input due to the larger area of contact of a finger on
the touch screen. In some embodiments, the device translates the
rough finger-based input into a precise pointer/cursor position or
command for performing the actions desired by the user.
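One common way a device can reduce a "rough" finger contact patch to a single precise position is a weighted centroid. The sketch below assumes that approach and that representation for illustration; the application does not specify the method.

```python
# Illustrative sketch for paragraph [0071]: collapse a finger's
# contact patch, modeled here as (x, y, weight) samples, into one
# precise pointer position via a weighted centroid. The sample
# representation is an assumption, not from the application.

def contact_centroid(samples):
    """Return the weighted-average (x, y) of the contact samples."""
    total = sum(w for _, _, w in samples)
    x = sum(x * w for x, _, w in samples) / total
    y = sum(y * w for _, y, w in samples) / total
    return x, y
```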
[0072] In some embodiments, in addition to the touch screen, device
100 optionally includes a touchpad (not shown) for activating or
deactivating particular functions. In some embodiments, the
touchpad is a touch-sensitive area of the device that, unlike the
touch screen, does not display visual output. The touchpad is,
optionally, a touch-sensitive surface that is separate from touch
screen 112 or an extension of the touch-sensitive surface formed by
the touch screen.
[0073] Device 100 also includes power system 162 for powering the
various components. Power system 162 optionally includes a power
management system, one or more power sources (e.g., battery,
alternating current (AC)), a recharging system, a power failure
detection circuit, a power converter or inverter, a power status
indicator (e.g., a light-emitting diode (LED)) and any other
components associated with the generation, management and
distribution of power in portable devices.
[0074] Device 100 optionally also includes one or more optical
sensors 164. FIG. 1A shows an optical sensor coupled to optical
sensor controller 158 in I/O subsystem 106. Optical sensor 164
optionally includes charge-coupled device (CCD) or complementary
metal-oxide semiconductor (CMOS) phototransistors. Optical sensor
164 receives light from the environment, projected through one or
more lenses, and converts the light to data representing an image.
In conjunction with imaging module 143 (also called a camera
module), optical sensor 164 optionally captures still images or
video. In some embodiments, an optical sensor is located on the
back of device 100, opposite touch screen display 112 on the front
of the device so that the touch screen display is enabled for use
as a viewfinder for still and/or video image acquisition. In some
embodiments, an optical sensor is located on the front of the
device so that the user's image is, optionally, obtained for video
conferencing while the user views the other video conference
participants on the touch screen display. In some embodiments, the
position of optical sensor 164 can be changed by the user (e.g., by
rotating the lens and the sensor in the device housing) so that a
single optical sensor 164 is used along with the touch screen
display for both video conferencing and still and/or video image
acquisition.
[0075] Device 100 optionally also includes one or more contact
intensity sensors 165. FIG. 1A shows a contact intensity sensor
coupled to intensity sensor controller 159 in I/O subsystem 106.
Contact intensity sensor 165 optionally includes one or more
piezoresistive strain gauges, capacitive force sensors, electric
force sensors, piezoelectric force sensors, optical force sensors,
capacitive touch-sensitive surfaces, or other intensity sensors
(e.g., sensors used to measure the force (or pressure) of a contact
on a touch-sensitive surface). Contact intensity sensor 165
receives contact intensity information (e.g., pressure information
or a proxy for pressure information) from the environment. In some
embodiments, at least one contact intensity sensor is collocated
with, or proximate to, a touch-sensitive surface (e.g.,
touch-sensitive display system 112). In some embodiments, at least
one contact intensity sensor is located on the back of device 100,
opposite touch screen display 112, which is located on the front of
device 100.
[0076] Device 100 optionally also includes one or more proximity
sensors 166. FIG. 1A shows proximity sensor 166 coupled to
peripherals interface 118. Alternately, proximity sensor 166 is,
optionally, coupled to input controller 160 in I/O subsystem 106.
Proximity sensor 166 optionally performs as described in U.S.
patent application Ser. No. 11/241,839, "Proximity Detector In
Handheld Device"; Ser. No. 11/240,788, "Proximity Detector In
Handheld Device"; Ser. No. 11/620,702, "Using Ambient Light Sensor
To Augment Proximity Sensor Output"; Ser. No. 11/586,862,
"Automated Response To And Sensing Of User Activity In Portable
Devices"; and Ser. No. 11/638,251, "Methods And Systems For
Automatic Configuration Of Peripherals," which are hereby
incorporated by reference in their entirety. In some embodiments,
the proximity sensor turns off and disables touch screen 112 when
the multifunction device is placed near the user's ear (e.g., when
the user is making a phone call).
[0077] Device 100 optionally also includes one or more tactile
output generators 167. FIG. 1A shows a tactile output generator
coupled to haptic feedback controller 161 in I/O subsystem 106.
Tactile output generator 167 optionally includes one or more
electroacoustic devices such as speakers or other audio components
and/or electromechanical devices that convert energy into linear
motion such as a motor, solenoid, electroactive polymer,
piezoelectric actuator, electrostatic actuator, or other tactile
output generating component (e.g., a component that converts
electrical signals into tactile outputs on the device). Contact
intensity sensor 165 receives tactile feedback generation
instructions from haptic feedback module 133 and generates tactile
outputs on device 100 that are capable of being sensed by a user of
device 100. In some embodiments, at least one tactile output
generator is collocated with, or proximate to, a touch-sensitive
surface (e.g., touch-sensitive display system 112) and, optionally,
generates a tactile output by moving the touch-sensitive surface
vertically (e.g., in/out of a surface of device 100) or laterally
(e.g., back and forth in the same plane as a surface of device
100). In some embodiments, at least one tactile output generator
sensor is located on the back of device 100, opposite touch screen
display 112, which is located on the front of device 100.
[0078] Device 100 optionally also includes one or more
accelerometers 168. FIG. 1A shows accelerometer 168 coupled to
peripherals interface 118. Alternately, accelerometer 168 is,
optionally, coupled to an input controller 160 in I/O subsystem
106. Accelerometer 168 optionally performs as described in U.S.
Patent Publication No. 20050190059, "Acceleration-based Theft
Detection System for Portable Electronic Devices," and U.S. Patent
Publication No. 20060017692, "Methods And Apparatuses For Operating
A Portable Device Based On An Accelerometer," both of which are
incorporated by reference herein in their entirety. In some
embodiments, information is displayed on the touch screen display
in a portrait view or a landscape view based on an analysis of data
received from the one or more accelerometers. Device 100 optionally
includes, in addition to accelerometer(s) 168, a magnetometer (not
shown) and a GPS (or GLONASS or other global navigation system)
receiver (not shown) for obtaining information concerning the
location and orientation (e.g., portrait or landscape) of device
100.
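The portrait/landscape decision from accelerometer data can be sketched as comparing which axis gravity dominates. The axis conventions below are assumptions for illustration; the application only states that orientation is based on an analysis of accelerometer data.

```python
# Hypothetical sketch for paragraph [0078]: pick a display orientation
# from the x/y components of measured gravity. Gravity dominant along
# the device's long (y) axis suggests portrait; along the short (x)
# axis, landscape. Axis conventions are assumed.

def display_orientation(ax: float, ay: float) -> str:
    """Return "portrait" or "landscape" from gravity components."""
    return "portrait" if abs(ay) >= abs(ax) else "landscape"
```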
[0079] In some embodiments, the software components stored in
memory 102 include operating system 126, communication module (or
set of instructions) 128, contact/motion module (or set of
instructions) 130, graphics module (or set of instructions) 132,
text input module (or set of instructions) 134, Global Positioning
System (GPS) module (or set of instructions) 135, and applications
(or sets of instructions) 136. Furthermore, in some embodiments,
memory 102 (FIG. 1A) or 370 (FIG. 3) stores device/global internal
state 157, as shown in FIGS. 1A and 3. Device/global internal state
157 includes one or more of: active application state, indicating
which applications, if any, are currently active; display state,
indicating what applications, views or other information occupy
various regions of touch screen display 112; sensor state,
including information obtained from the device's various sensors
and input control devices 116; and location information concerning
the device's location and/or attitude.
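The components of device/global internal state 157 enumerated above can be pictured as a simple container. Field names below are illustrative assumptions; the application lists the categories but not a concrete layout.

```python
# Illustrative data-structure sketch of device/global internal state
# 157, per paragraph [0079]. Field names are assumptions.

from dataclasses import dataclass, field
from typing import Optional

@dataclass
class DeviceGlobalInternalState:
    active_applications: list = field(default_factory=list)  # active application state
    display_state: dict = field(default_factory=dict)        # what occupies each display region
    sensor_state: dict = field(default_factory=dict)         # readings from sensors / input devices
    location: Optional[tuple] = None                         # location and/or attitude
```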
[0080] Operating system 126 (e.g., Darwin, RTXC, LINUX, UNIX, OS X,
iOS, WINDOWS, or an embedded operating system such as VxWorks)
includes various software components and/or drivers for controlling
and managing general system tasks (e.g., memory management, storage
device control, power management, etc.) and facilitates
communication between various hardware and software components.
[0081] Communication module 128 facilitates communication with
other devices over one or more external ports 124 and also includes
various software components for handling data received by RF
circuitry 108 and/or external port 124. External port 124 (e.g.,
Universal Serial Bus (USB), FIREWIRE, etc.) is adapted for coupling
directly to other devices or indirectly over a network (e.g., the
Internet, wireless LAN, etc.). In some embodiments, the external
port is a multi-pin (e.g., 30-pin) connector that is the same as,
or similar to and/or compatible with, the 30-pin connector used on
iPod® (trademark of Apple Inc.) devices.
[0082] Contact/motion module 130 optionally detects contact with
touch screen 112 (in conjunction with display controller 156) and
other touch-sensitive devices (e.g., a touchpad or physical click
wheel). Contact/motion module 130 includes various software
components for performing various operations related to detection
of contact, such as determining if contact has occurred (e.g.,
detecting a finger-down event), determining an intensity of the
contact (e.g., the force or pressure of the contact or a substitute
for the force or pressure of the contact), determining if there is
movement of the contact and tracking the movement across the
touch-sensitive surface (e.g., detecting one or more
finger-dragging events), and determining if the contact has ceased
(e.g., detecting a finger-up event or a break in contact).
Contact/motion module 130 receives contact data from the
touch-sensitive surface. Determining movement of the point of
contact, which is represented by a series of contact data,
optionally includes determining speed (magnitude), velocity
(magnitude and direction), and/or an acceleration (a change in
magnitude and/or direction) of the point of contact. These
operations are, optionally, applied to single contacts (e.g., one
finger contacts) or to multiple simultaneous contacts (e.g.,
"multitouch"/multiple finger contacts). In some embodiments,
contact/motion module 130 and display controller 156 detect contact
on a touchpad.
[0083] In some embodiments, contact/motion module 130 uses a set of
one or more intensity thresholds to determine whether an operation
has been performed by a user (e.g., to determine whether a user has
"clicked" on an icon). In some embodiments, at least a subset of
the intensity thresholds are determined in accordance with software
parameters (e.g., the intensity thresholds are not determined by
the activation thresholds of particular physical actuators and can
be adjusted without changing the physical hardware of device 100).
For example, a mouse "click" threshold of a trackpad or touch
screen display can be set to any of a large range of predefined
threshold values without changing the trackpad or touch screen
display hardware. Additionally, in some implementations, a user of
the device is provided with software settings for adjusting one or
more of the set of intensity thresholds (e.g., by adjusting
individual intensity thresholds and/or by adjusting a plurality of
intensity thresholds at once with a system-level click "intensity"
parameter).
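The key point of the paragraph above is that the click threshold is a software parameter rather than a property of a physical actuator. A minimal sketch, with assumed names and units:

```python
# Illustrative sketch for paragraph [0083]: whether a contact counts
# as a "click" depends on an intensity threshold held in software,
# which can be adjusted without changing the hardware. Names and the
# 0..1 intensity scale are assumptions.

class IntensityThresholds:
    def __init__(self, click_threshold: float = 0.5):
        self.click_threshold = click_threshold  # adjustable in software

    def is_click(self, contact_intensity: float) -> bool:
        return contact_intensity >= self.click_threshold
```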
[0084] Contact/motion module 130 optionally detects a gesture input
by a user. Different gestures on the touch-sensitive surface have
different contact patterns (e.g., different motions, timings,
and/or intensities of detected contacts). Thus, a gesture is,
optionally, detected by detecting a particular contact pattern. For
example, detecting a finger tap gesture includes detecting a
finger-down event followed by detecting a finger-up (liftoff) event
at the same position (or substantially the same position) as the
finger-down event (e.g., at the position of an icon). As another
example, detecting a finger swipe gesture on the touch-sensitive
surface includes detecting a finger-down event followed by
detecting one or more finger-dragging events, and subsequently
followed by detecting a finger-up (liftoff) event.
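The tap and swipe patterns described above can be sketched as a small classifier over the event sequence. The event encoding and the "same position" slop radius are assumptions for illustration.

```python
# Sketch of gesture detection by contact pattern, per paragraph
# [0084]: finger-down followed directly by finger-up near the same
# position is a tap; intervening finger-dragging events make it a
# swipe. Event tuples and TAP_SLOP are assumed representations.

TAP_SLOP = 10  # max movement (points) still treated as "the same position"

def classify_gesture(events):
    """`events` is a list of ("down" | "drag" | "up", x, y) tuples."""
    if not events or events[0][0] != "down" or events[-1][0] != "up":
        return None
    if any(kind == "drag" for kind, _, _ in events[1:-1]):
        return "swipe"
    _, x0, y0 = events[0]
    _, x1, y1 = events[-1]
    if abs(x1 - x0) <= TAP_SLOP and abs(y1 - y0) <= TAP_SLOP:
        return "tap"
    return None
```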
[0085] Graphics module 132 includes various known software
components for rendering and displaying graphics on touch screen
112 or other display, including components for changing the visual
impact (e.g., brightness, transparency, saturation, contrast, or
other visual property) of graphics that are displayed. As used
herein, the term "graphics" includes any object that can be
displayed to a user, including, without limitation, text, web
pages, icons (such as user-interface objects including soft keys),
digital images, videos, animations, and the like.
[0086] In some embodiments, graphics module 132 stores data
representing graphics to be used. Each graphic is, optionally,
assigned a corresponding code. Graphics module 132 receives, from
applications etc., one or more codes specifying graphics to be
displayed along with, if necessary, coordinate data and other
graphic property data, and then generates screen image data to
output to display controller 156.
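The code-based dispatch described above amounts to a registry lookup followed by command generation. The registry contents and output format below are illustrative assumptions.

```python
# Sketch for paragraph [0086]: applications pass codes naming the
# graphics to display, plus coordinate data; the graphics module
# resolves each code and produces drawing commands. Names and the
# command format are assumed.

GRAPHICS_REGISTRY = {}  # code -> graphic description

def register_graphic(code, description):
    GRAPHICS_REGISTRY[code] = description

def render_commands(requests):
    """Turn (code, x, y) requests into (description, x, y) draw commands."""
    return [(GRAPHICS_REGISTRY[code], x, y) for code, x, y in requests]
```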
[0087] Haptic feedback module 133 includes various software
components for generating instructions used by tactile output
generator(s) 167 to produce tactile outputs at one or more
locations on device 100 in response to user interactions with
device 100.
[0088] Text input module 134, which is, optionally, a component of
graphics module 132, provides soft keyboards for entering text in
various applications (e.g., contacts 137, e-mail 140, IM 141,
browser 147, and any other application that needs text input).
[0089] GPS module 135 determines the location of the device and
provides this information for use in various applications (e.g., to
telephone 138 for use in location-based dialing; to camera 143 as
picture/video metadata; and to applications that provide
location-based services such as weather widgets, local yellow page
widgets, and map/navigation widgets).
[0090] Applications 136 optionally include the following modules
(or sets of instructions), or a subset or superset thereof:
[0091] Contacts module 137 (sometimes called an address book or contact list);
[0092] Telephone module 138;
[0093] Video conference module 139;
[0094] E-mail client module 140;
[0095] Instant messaging (IM) module 141;
[0096] Workout support module 142;
[0097] Camera module 143 for still and/or video images;
[0098] Image management module 144;
[0099] Video player module;
[0100] Music player module;
[0101] Browser module 147;
[0102] Calendar module 148;
[0103] Widget modules 149, which optionally include one or more of:
weather widget 149-1, stocks widget 149-2, calculator widget 149-3,
alarm clock widget 149-4, dictionary widget 149-5, and other widgets
obtained by the user, as well as user-created widgets 149-6;
[0104] Widget creator module 150 for making user-created widgets 149-6;
[0105] Search module 151;
[0106] Video and music player module 152, which merges video player
module and music player module;
[0107] Notes module 153;
[0108] Map module 154; and/or
[0109] Online video module 155.
[0110] Examples of other applications 136 that are, optionally,
stored in memory 102 include other word processing applications,
other image editing applications, drawing applications,
presentation applications, JAVA-enabled applications, encryption,
digital rights management, voice recognition, and voice
replication.
[0111] In conjunction with touch screen 112, display controller
156, contact/motion module 130, graphics module 132, and text input
module 134, contacts module 137 is, optionally, used to manage an
address book or contact list (e.g., stored in application internal
state 192 of contacts module 137 in memory 102 or memory 370),
including: adding name(s) to the address book; deleting name(s)
from the address book; associating telephone number(s), e-mail
address(es), physical address(es) or other information with a name;
associating an image with a name; categorizing and sorting names;
providing telephone numbers or e-mail addresses to initiate and/or
facilitate communications by telephone 138, video conference module
139, e-mail 140, or IM 141; and so forth.
[0112] In conjunction with RF circuitry 108, audio circuitry 110,
speaker 111, microphone 113, touch screen 112, display controller
156, contact/motion module 130, graphics module 132, and text input
module 134, telephone module 138 is, optionally, used to enter a
sequence of characters corresponding to a telephone number, access
one or more telephone numbers in contacts module 137, modify a
telephone number that has been entered, dial a respective telephone
number, conduct a conversation, and disconnect or hang up when the
conversation is completed. As noted above, the wireless
communication optionally uses any of a plurality of communications
standards, protocols, and technologies.
[0113] In conjunction with RF circuitry 108, audio circuitry 110,
speaker 111, microphone 113, touch screen 112, display controller
156, optical sensor 164, optical sensor controller 158,
contact/motion module 130, graphics module 132, text input module
134, contacts module 137, and telephone module 138, video
conference module 139 includes executable instructions to initiate,
conduct, and terminate a video conference between a user and one or
more other participants in accordance with user instructions.
[0114] In conjunction with RF circuitry 108, touch screen 112,
display controller 156, contact/motion module 130, graphics module
132, and text input module 134, e-mail client module 140 includes
executable instructions to create, send, receive, and manage e-mail
in response to user instructions. In conjunction with image
management module 144, e-mail client module 140 makes it very easy
to create and send e-mails with still or video images taken with
camera module 143.
[0115] In conjunction with RF circuitry 108, touch screen 112,
display controller 156, contact/motion module 130, graphics module
132, and text input module 134, the instant messaging module 141
includes executable instructions to enter a sequence of characters
corresponding to an instant message, to modify previously entered
characters, to transmit a respective instant message (for example,
using a Short Message Service (SMS) or Multimedia Message Service
(MMS) protocol for telephony-based instant messages or using XMPP,
SIMPLE, or IMPS for Internet-based instant messages), to receive
instant messages, and to view received instant messages. In some
embodiments, transmitted and/or received instant messages
optionally include graphics, photos, audio files, video files
and/or other attachments as are supported in an MMS and/or an
Enhanced Messaging Service (EMS). As used herein, "instant
messaging" refers to both telephony-based messages (e.g., messages
sent using SMS or MMS) and Internet-based messages (e.g., messages
sent using XMPP, SIMPLE, or IMPS).
[0116] In conjunction with RF circuitry 108, touch screen 112,
display controller 156, contact/motion module 130, graphics module
132, text input module 134, GPS module 135, map module 154, and
music player module, workout support module 142 includes executable
instructions to create workouts (e.g., with time, distance, and/or
calorie burning goals); communicate with workout sensors (sports
devices); receive workout sensor data; calibrate sensors used to
monitor a workout; select and play music for a workout; and
display, store, and transmit workout data.
[0117] In conjunction with touch screen 112, display controller
156, optical sensor(s) 164, optical sensor controller 158,
contact/motion module 130, graphics module 132, and image
management module 144, camera module 143 includes executable
instructions to capture still images or video (including a video
stream) and store them into memory 102, modify characteristics of a
still image or video, or delete a still image or video from memory
102.
[0118] In conjunction with touch screen 112, display controller
156, contact/motion module 130, graphics module 132, text input
module 134, and camera module 143, image management module 144
includes executable instructions to arrange, modify (e.g., edit),
or otherwise manipulate, label, delete, present (e.g., in a digital
slide show or album), and store still and/or video images.
[0119] In conjunction with RF circuitry 108, touch screen 112,
display controller 156, contact/motion module 130, graphics module
132, and text input module 134, browser module 147 includes
executable instructions to browse the Internet in accordance with
user instructions, including searching, linking to, receiving, and
displaying web pages or portions thereof, as well as attachments
and other files linked to web pages.
[0120] In conjunction with RF circuitry 108, touch screen 112,
display controller 156, contact/motion module 130, graphics module
132, text input module 134, e-mail client module 140, and browser
module 147, calendar module 148 includes executable instructions to
create, display, modify, and store calendars and data associated
with calendars (e.g., calendar entries, to-do lists, etc.) in
accordance with user instructions.
[0121] In conjunction with RF circuitry 108, touch screen 112,
display controller 156, contact/motion module 130, graphics module
132, text input module 134, and browser module 147, widget modules
149 are mini-applications that are, optionally, downloaded and used
by a user (e.g., weather widget 149-1, stocks widget 149-2,
calculator widget 149-3, alarm clock widget 149-4, and dictionary
widget 149-5) or created by the user (e.g., user-created widget
149-6). In some embodiments, a widget includes an HTML (Hypertext
Markup Language) file, a CSS (Cascading Style Sheets) file, and a
JavaScript file. In some embodiments, a widget includes an XML
(Extensible Markup Language) file and a JavaScript file (e.g.,
Yahoo! Widgets).
[0122] In conjunction with RF circuitry 108, touch screen 112,
display controller 156, contact/motion module 130, graphics module
132, text input module 134, and browser module 147, the widget
creator module 150 is, optionally, used by a user to create
widgets (e.g., turning a user-specified portion of a web page into
a widget).
[0123] In conjunction with touch screen 112, display controller
156, contact/motion module 130, graphics module 132, and text input
module 134, search module 151 includes executable instructions to
search for text, music, sound, image, video, and/or other files in
memory 102 that match one or more search criteria (e.g., one or
more user-specified search terms) in accordance with user
instructions.
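Matching stored items against one or more user-specified search terms can be sketched as a simple filter. Treating a file as just a name is an assumption for illustration.

```python
# Sketch for paragraph [0123]: return the stored items whose names
# match every user-specified search term, case-insensitively. The
# name-only file model is assumed.

def search(files, terms):
    """Return files whose names contain every search term."""
    terms = [t.lower() for t in terms]
    return [f for f in files if all(t in f.lower() for t in terms)]
```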
[0124] In conjunction with touch screen 112, display controller
156, contact/motion module 130, graphics module 132, audio
circuitry 110, speaker 111, RF circuitry 108, and browser module
147, video and music player module 152 includes executable
instructions that allow the user to download and play back recorded
music and other sound files stored in one or more file formats,
such as MP3 or AAC files, and executable instructions to display,
present, or otherwise play back videos (e.g., on touch screen 112
or on an external, connected display via external port 124). In
some embodiments, device 100 optionally includes the functionality
of an MP3 player, such as an iPod (trademark of Apple Inc.).
[0125] In conjunction with touch screen 112, display controller
156, contact/motion module 130, graphics module 132, and text input
module 134, notes module 153 includes executable instructions to
create and manage notes, to-do lists, and the like in accordance
with user instructions.
[0126] In conjunction with RF circuitry 108, touch screen 112,
display controller 156, contact/motion module 130, graphics module
132, text input module 134, GPS module 135, and browser module 147,
map module 154 is, optionally, used to receive, display, modify,
and store maps and data associated with maps (e.g., driving
directions, data on stores and other points of interest at or near
a particular location, and other location-based data) in accordance
with user instructions.
[0127] In conjunction with touch screen 112, display controller
156, contact/motion module 130, graphics module 132, audio
circuitry 110, speaker 111, RF circuitry 108, text input module
134, e-mail client module 140, and browser module 147, online video
module 155 includes instructions that allow the user to access,
browse, receive (e.g., by streaming and/or download), play back
(e.g., on the touch screen or on an external, connected display via
external port 124), send an e-mail with a link to a particular
online video, and otherwise manage online videos in one or more
file formats, such as H.264. In some embodiments, instant messaging
module 141, rather than e-mail client module 140, is used to send a
link to a particular online video. Additional description of the
online video application can be found in U.S. Provisional Patent
Application No. 60/936,562, "Portable Multifunction Device, Method,
and Graphical User Interface for Playing Online Videos," filed Jun.
20, 2007, and U.S. patent application Ser. No. 11/968,067,
"Portable Multifunction Device, Method, and Graphical User
Interface for Playing Online Videos," filed Dec. 31, 2007, the
contents of which are hereby incorporated by reference in their
entirety.
[0128] Each of the above-identified modules and applications
corresponds to a set of executable instructions for performing one
or more functions described above and the methods described in this
application (e.g., the computer-implemented methods and other
information processing methods described herein). These modules
(e.g., sets of instructions) need not be implemented as separate
software programs, procedures, or modules, and thus various subsets
of these modules are, optionally, combined or otherwise rearranged
in various embodiments. For example, video player module is,
optionally, combined with music player module into a single module
(e.g., video and music player module 152, FIG. 1A). In some
embodiments, memory 102 optionally stores a subset of the modules
and data structures identified above. Furthermore, memory 102
optionally stores additional modules and data structures not
described above.
[0129] In some embodiments, device 100 is a device where operation
of a predefined set of functions on the device is performed
exclusively through a touch screen and/or a touchpad. By using a
touch screen and/or a touchpad as the primary input control device
for operation of device 100, the number of physical input control
devices (such as push buttons, dials, and the like) on device 100
is, optionally, reduced.
[0130] The predefined set of functions that are performed
exclusively through a touch screen and/or a touchpad optionally
includes navigation between user interfaces. In some embodiments,
the touchpad, when touched by the user, navigates device 100 to a
main, home, or root menu from any user interface that is displayed
on device 100. In such embodiments, a "menu button" is implemented
using a touchpad. In some other embodiments, the menu button is a
physical push button or other physical input control device instead
of a touchpad.
[0131] FIG. 1B is a block diagram illustrating exemplary components
for event handling in accordance with some embodiments. In some
embodiments, memory 102 (FIG. 1A) or 370 (FIG. 3) includes event
sorter 170 (e.g., in operating system 126) and a respective
application 136-1 (e.g., any of the aforementioned applications
137-151, 155, 380-390).
[0132] Event sorter 170 receives event information and determines
the application 136-1 and application view 191 of application 136-1
to which to deliver the event information. Event sorter 170
includes event monitor 171 and event dispatcher module 174. In some
embodiments, application 136-1 includes application internal state
192, which indicates the current application view(s) displayed on
touch-sensitive display 112 when the application is active or
executing. In some embodiments, device/global internal state 157 is
used by event sorter 170 to determine which application(s) is (are)
currently active, and application internal state 192 is used by
event sorter 170 to determine application views 191 to which to
deliver event information.
[0133] In some embodiments, application internal state 192 includes
additional information, such as one or more of: resume information
to be used when application 136-1 resumes execution, user interface
state information that indicates information being displayed or
that is ready for display by application 136-1, a state queue for
enabling the user to go back to a prior state or view of
application 136-1, and a redo/undo queue of previous actions taken
by the user.
[0134] Event monitor 171 receives event information from
peripherals interface 118. Event information includes information
about a sub-event (e.g., a user touch on touch-sensitive display
112, as part of a multi-touch gesture). Peripherals interface 118
transmits information it receives from I/O subsystem 106 or a
sensor, such as proximity sensor 166, accelerometer(s) 168, and/or
microphone 113 (through audio circuitry 110). Information that
peripherals interface 118 receives from I/O subsystem 106 includes
information from touch-sensitive display 112 or a touch-sensitive
surface.
[0135] In some embodiments, event monitor 171 sends requests to the
peripherals interface 118 at predetermined intervals. In response,
peripherals interface 118 transmits event information. In other
embodiments, peripherals interface 118 transmits event information
only when there is a significant event (e.g., receiving an input
above a predetermined noise threshold and/or for more than a
predetermined duration).
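The "significant event" filtering described above can be sketched as a simple predicate over an input's intensity and duration. This is an illustrative sketch only, not any actual implementation; all type names and threshold values are invented.

```swift
// Hypothetical sketch: a peripherals interface forwards an input as event
// information only when it is "significant" -- here, when its intensity
// exceeds a noise threshold and/or it persists beyond a minimum duration.
struct RawInput {
    let intensity: Double   // e.g., contact pressure, arbitrary units
    let duration: Double    // seconds the input has persisted
}

struct SignificanceFilter {
    let noiseThreshold: Double
    let minDuration: Double

    // Returns true when the input should be transmitted as event information.
    func isSignificant(_ input: RawInput) -> Bool {
        input.intensity > noiseThreshold || input.duration > minDuration
    }
}
```

In the polling variant described first, the same check could instead run once per predetermined interval rather than on every raw input.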
[0136] In some embodiments, event sorter 170 also includes a hit
view determination module 172 and/or an active event recognizer
determination module 173.
[0137] Hit view determination module 172 provides software
procedures for determining where a sub-event has taken place within
one or more views when touch-sensitive display 112 displays more
than one view. Views are made up of controls and other elements
that a user can see on the display.
[0138] Another aspect of the user interface associated with an
application is a set of views, sometimes herein called application
views or user interface windows, in which information is displayed
and touch-based gestures occur. The application views (of a
respective application) in which a touch is detected optionally
correspond to programmatic levels within a programmatic or view
hierarchy of the application. For example, the lowest level view in
which a touch is detected is, optionally, called the hit view, and
the set of events that are recognized as proper inputs are,
optionally, determined based, at least in part, on the hit view of
the initial touch that begins a touch-based gesture.
[0139] Hit view determination module 172 receives information
related to sub-events of a touch-based gesture. When an application
has multiple views organized in a hierarchy, hit view determination
module 172 identifies a hit view as the lowest view in the
hierarchy which should handle the sub-event. In most circumstances,
the hit view is the lowest level view in which an initiating
sub-event occurs (e.g., the first sub-event in the sequence of
sub-events that form an event or potential event). Once the hit
view is identified by the hit view determination module 172, the
hit view typically receives all sub-events related to the same
touch or input source for which it was identified as the hit
view.
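The hit-view search described above amounts to finding the deepest (lowest-level) view in the hierarchy that contains the initiating sub-event's location. The following is a minimal sketch under simplifying assumptions: all names are invented, and frames are kept in one absolute coordinate space rather than converted between view coordinate systems.

```swift
// Illustrative sketch of hit-view determination (hypothetical types).
struct Point { let x: Double; let y: Double }

final class View {
    let name: String
    let frame: (x: Double, y: Double, w: Double, h: Double)
    var subviews: [View] = []

    init(name: String, frame: (x: Double, y: Double, w: Double, h: Double)) {
        self.name = name
        self.frame = frame
    }

    func contains(_ p: Point) -> Bool {
        p.x >= frame.x && p.x < frame.x + frame.w &&
        p.y >= frame.y && p.y < frame.y + frame.h
    }
}

// Depth-first search: recurse into subviews first, so the deepest view
// containing the point wins; otherwise the current view itself is the hit.
func hitView(in root: View, at p: Point) -> View? {
    guard root.contains(p) else { return nil }
    for sub in root.subviews {
        if let hit = hitView(in: sub, at: p) { return hit }
    }
    return root
}
```

Once identified, the hit view would then receive all subsequent sub-events of the same touch, per paragraph [0139].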
[0140] Active event recognizer determination module 173 determines
which view or views within a view hierarchy should receive a
particular sequence of sub-events. In some embodiments, active
event recognizer determination module 173 determines that only the
hit view should receive a particular sequence of sub-events. In
other embodiments, active event recognizer determination module 173
determines that all views that include the physical location of a
sub-event are actively involved views, and therefore determines
that all actively involved views should receive a particular
sequence of sub-events. In other embodiments, even if touch
sub-events were entirely confined to the area associated with one
particular view, views higher in the hierarchy would still remain
as actively involved views.
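Under the "actively involved views" policy just described, delivery is not limited to the hit view: every view whose area contains the sub-event's physical location receives the sequence. A rough sketch, with invented names and frames in a single absolute coordinate space:

```swift
// Illustrative sketch: collect every view in the hierarchy whose area
// contains the sub-event's physical location ("actively involved views").
struct Rect {
    let x, y, w, h: Double
    func contains(x px: Double, y py: Double) -> Bool {
        px >= x && px < x + w && py >= y && py < y + h
    }
}

struct ViewNode {
    let name: String
    let frame: Rect        // absolute coordinates, for simplicity
    var children: [ViewNode] = []
}

// Returns the names of all views containing the point, outermost first.
func activelyInvolvedViews(in root: ViewNode, x: Double, y: Double) -> [String] {
    guard root.frame.contains(x: x, y: y) else { return [] }
    return [root.name] + root.children.flatMap {
        activelyInvolvedViews(in: $0, x: x, y: y)
    }
}
```

Note how ancestors of the deepest containing view stay in the result, matching the observation that views higher in the hierarchy remain actively involved.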
[0141] Event dispatcher module 174 dispatches the event information
to an event recognizer (e.g., event recognizer 180). In embodiments
including active event recognizer determination module 173, event
dispatcher module 174 delivers the event information to an event
recognizer determined by active event recognizer determination
module 173. In some embodiments, event dispatcher module 174 stores
in an event queue the event information, which is retrieved by a
respective event receiver 182.
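The queueing arrangement in the last sentence, where the dispatcher stores event information and a receiver later retrieves it, can be sketched as a minimal FIFO queue. Names here are hypothetical, not drawn from any real API.

```swift
// Illustrative sketch of the dispatcher's event queue.
struct QueuedEvent {
    let description: String   // stand-in for event information
}

final class EventQueue {
    private var storage: [QueuedEvent] = []

    // Called by the event dispatcher to store event information.
    func enqueue(_ e: QueuedEvent) { storage.append(e) }

    // Called by an event receiver to retrieve the next event, FIFO order.
    func dequeue() -> QueuedEvent? {
        storage.isEmpty ? nil : storage.removeFirst()
    }
}
```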
[0142] In some embodiments, operating system 126 includes event
sorter 170. Alternatively, application 136-1 includes event sorter
170. In yet other embodiments, event sorter 170 is a stand-alone
module, or a part of another module stored in memory 102, such as
contact/motion module 130.
[0143] In some embodiments, application 136-1 includes a plurality
of event handlers 190 and one or more application views 191, each
of which includes instructions for handling touch events that occur
within a respective view of the application's user interface. Each
application view 191 of the application 136-1 includes one or more
event recognizers 180. Typically, a respective application view 191
includes a plurality of event recognizers 180. In other
embodiments, one or more of event recognizers 180 are part of a
separate module, such as a user interface kit (not shown) or a
higher level object from which application 136-1 inherits methods
and other properties. In some embodiments, a respective event
handler 190 includes one or more of: data updater 176, object
updater 177, GUI updater 178, and/or event data 179 received from
event sorter 170. Event handler 190 optionally utilizes or calls
data updater 176, object updater 177, or GUI updater 178 to update
the application internal state 192. Alternatively, one or more of
the application views 191 include one or more respective event
handlers 190. Also, in some embodiments, one or more of data
updater 176, object updater 177, and GUI updater 178 are included
in a respective application view 191.
[0144] A respective event recognizer 180 receives event information
(e.g., event data 179) from event sorter 170 and identifies an
event from the event information. Event recognizer 180 includes
event receiver 182 and event comparator 184. In some embodiments,
event recognizer 180 also includes at least a subset of: metadata
183, and event delivery instructions 188 (which optionally include
sub-event delivery instructions).
[0145] Event receiver 182 receives event information from event
sorter 170. The event information includes information about a
sub-event, for example, a touch or a touch movement. Depending on
the sub-event, the event information also includes additional
information, such as location of the sub-event. When the sub-event
concerns motion of a touch, the event information optionally also
includes speed and direction of the sub-event. In some embodiments,
events include rotation of the device from one orientation to
another (e.g., from a portrait orientation to a landscape
orientation, or vice versa), and the event information includes
corresponding information about the current orientation (also
called device attitude) of the device.
[0146] Event comparator 184 compares the event information to
predefined event or sub-event definitions and, based on the
comparison, determines an event or sub-event, or determines or
updates the state of an event or sub-event. In some embodiments,
event comparator 184 includes event definitions 186. Event
definitions 186 contain definitions of events (e.g., predefined
sequences of sub-events), for example, event 1 (187-1), event 2
(187-2), and others. In some embodiments, sub-events in an event
(187) include, for example, touch begin, touch end, touch movement,
touch cancellation, and multiple touching. In one example, the
definition for event 1 (187-1) is a double tap on a displayed
object. The double tap, for example, comprises a first touch (touch
begin) on the displayed object for a predetermined phase, a first
liftoff (touch end) for a predetermined phase, a second touch
(touch begin) on the displayed object for a predetermined phase,
and a second liftoff (touch end) for a predetermined phase. In
another example, the definition for event 2 (187-2) is a dragging
on a displayed object. The dragging, for example, comprises a touch
(or contact) on the displayed object for a predetermined phase, a
movement of the touch across touch-sensitive display 112, and
liftoff of the touch (touch end). In some embodiments, the event
also includes information for one or more associated event handlers
190.
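The event definitions above (a double tap as begin/end/begin/end, a drag as begin/move/end) can be modeled as predefined sub-event sequences that a comparator matches against the observed sequence. This sketch is illustrative only; it omits the per-phase timing checks the text describes, and all names are invented.

```swift
// Hypothetical sketch of event definitions as sub-event sequences.
enum SubEvent { case touchBegin, touchEnd, touchMove, touchCancel }

struct EventDefinition {
    let name: String
    let sequence: [SubEvent]
}

// e.g., event 1: a double tap (timing/phase checks omitted)
let doubleTap = EventDefinition(
    name: "double tap",
    sequence: [.touchBegin, .touchEnd, .touchBegin, .touchEnd])

// e.g., event 2: a drag, modeled here with a single movement sub-event
let drag = EventDefinition(
    name: "drag",
    sequence: [.touchBegin, .touchMove, .touchEnd])

// Returns the name of the first definition the observed sub-events match,
// or nil -- loosely corresponding to the "event failed" state of [0149].
func recognize(_ observed: [SubEvent],
               against definitions: [EventDefinition]) -> String? {
    definitions.first { $0.sequence == observed }?.name
}
```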
[0147] In some embodiments, event definition 187 includes a
definition of an event for a respective user-interface object. In
some embodiments, event comparator 184 performs a hit test to
determine which user-interface object is associated with a
sub-event. For example, in an application view in which three
user-interface objects are displayed on touch-sensitive display
112, when a touch is detected on touch-sensitive display 112, event
comparator 184 performs a hit test to determine which of the three
user-interface objects is associated with the touch (sub-event). If
each displayed object is associated with a respective event handler
190, the event comparator uses the result of the hit test to
determine which event handler 190 should be activated. For example,
event comparator 184 selects an event handler associated with the
sub-event and the object triggering the hit test.
[0148] In some embodiments, the definition for a respective event
(187) also includes delayed actions that delay delivery of the
event information until after it has been determined whether the
sequence of sub-events does or does not correspond to the event
recognizer's event type.
[0149] When a respective event recognizer 180 determines that the
series of sub-events does not match any of the events in event
definitions 186, the respective event recognizer 180 enters an
event impossible, event failed, or event ended state, after which
it disregards subsequent sub-events of the touch-based gesture. In
this situation, other event recognizers, if any, that remain active
for the hit view continue to track and process sub-events of an
ongoing touch-based gesture.
[0150] In some embodiments, a respective event recognizer 180
includes metadata 183 with configurable properties, flags, and/or
lists that indicate how the event delivery system should perform
sub-event delivery to actively involved event recognizers. In some
embodiments, metadata 183 includes configurable properties, flags,
and/or lists that indicate how event recognizers interact, or are
enabled to interact, with one another. In some embodiments,
metadata 183 includes configurable properties, flags, and/or lists
that indicate whether sub-events are delivered to varying levels in
the view or programmatic hierarchy.
[0151] In some embodiments, a respective event recognizer 180
activates event handler 190 associated with an event when one or
more particular sub-events of an event are recognized. In some
embodiments, a respective event recognizer 180 delivers event
information associated with the event to event handler 190.
Activating an event handler 190 is distinct from sending (and
deferred sending of) sub-events to a respective hit view. In some
embodiments, event recognizer 180 throws a flag associated with the
recognized event, and event handler 190 associated with the flag
catches the flag and performs a predefined process.
[0152] In some embodiments, event delivery instructions 188 include
sub-event delivery instructions that deliver event information
about a sub-event without activating an event handler. Instead, the
sub-event delivery instructions deliver event information to event
handlers associated with the series of sub-events or to actively
involved views. Event handlers associated with the series of
sub-events or with actively involved views receive the event
information and perform a predetermined process.
[0153] In some embodiments, data updater 176 creates and updates
data used in application 136-1. For example, data updater 176
updates the telephone number used in contacts module 137, or stores
a video file used in video player module. In some embodiments,
object updater 177 creates and updates objects used in application
136-1. For example, object updater 177 creates a new user-interface
object or updates the position of a user-interface object. GUI
updater 178 updates the GUI. For example, GUI updater 178 prepares
display information and sends it to graphics module 132 for display
on a touch-sensitive display.
[0154] In some embodiments, event handler(s) 190 includes or has
access to data updater 176, object updater 177, and GUI updater
178. In some embodiments, data updater 176, object updater 177, and
GUI updater 178 are included in a single module of a respective
application 136-1 or application view 191. In other embodiments,
they are included in two or more software modules.
[0155] It shall be understood that the foregoing discussion
regarding event handling of user touches on touch-sensitive
displays also applies to other forms of user inputs to operate
multifunction devices 100 with input devices, not all of which are
initiated on touch screens. For example, mouse movement and mouse
button presses, optionally coordinated with single or multiple
keyboard presses or holds; contact movements such as taps, drags,
scrolls, etc. on touchpads; pen stylus inputs; movement of the
device; oral instructions; detected eye movements; biometric
inputs; and/or any combination thereof are optionally utilized as
inputs corresponding to sub-events which define an event to be
recognized.
[0156] FIG. 2 illustrates a portable multifunction device 100
having a touch screen 112 in accordance with some embodiments. The
touch screen optionally displays one or more graphics within user
interface (UI) 200. In this embodiment, as well as others described
below, a user is enabled to select one or more of the graphics by
making a gesture on the graphics, for example, with one or more
fingers 202 (not drawn to scale in the figure) or one or more
styluses 203 (not drawn to scale in the figure). In some
embodiments, selection of one or more graphics occurs when the user
breaks contact with the one or more graphics. In some embodiments,
the gesture optionally includes one or more taps, one or more
swipes (from left to right, right to left, upward and/or downward),
and/or a rolling of a finger (from right to left, left to right,
upward and/or downward) that has made contact with device 100. In
some implementations or circumstances, inadvertent contact with a
graphic does not select the graphic. For example, a swipe gesture
that sweeps over an application icon optionally does not select the
corresponding application when the gesture corresponding to
selection is a tap.
[0157] Device 100 optionally also includes one or more physical
buttons, such as "home" or menu button 204. As described
previously, menu button 204 is, optionally, used to navigate to any
application 136 in a set of applications that are, optionally,
executed on device 100. Alternatively, in some embodiments, the
menu button is implemented as a soft key in a GUI displayed on
touch screen 112.
[0158] In some embodiments, device 100 includes touch screen 112,
menu button 204, push button 206 for powering the device on/off and
locking the device, volume adjustment button(s) 208, subscriber
identity module (SIM) card slot 210, headset jack 212, and
docking/charging external port 124. Push button 206 is, optionally,
used to turn the power on/off on the device by depressing the
button and holding the button in the depressed state for a
predefined time interval; to lock the device by depressing the
button and releasing the button before the predefined time interval
has elapsed; and/or to unlock the device or initiate an unlock
process. In an alternative embodiment, device 100 also accepts
verbal input for activation or deactivation of some functions
through microphone 113. Device 100 also, optionally, includes one
or more contact intensity sensors 165 for detecting intensity of
contacts on touch screen 112 and/or one or more tactile output
generators 167 for generating tactile outputs for a user of device
100.
[0159] FIG. 3 is a block diagram of an exemplary multifunction
device with a display and a touch-sensitive surface in accordance
with some embodiments. Device 300 need not be portable. In some
embodiments, device 300 is a laptop computer, a desktop computer, a
tablet computer, a multimedia player device, a navigation device,
an educational device (such as a child's learning toy), a gaming
system, or a control device (e.g., a home or industrial
controller). Device 300 typically includes one or more processing
units (CPUs) 310, one or more network or other communications
interfaces 360, memory 370, and one or more communication buses 320
for interconnecting these components. Communication buses 320
optionally include circuitry (sometimes called a chipset) that
interconnects and controls communications between system
components. Device 300 includes input/output (I/O) interface 330
comprising display 340, which is typically a touch screen display.
I/O interface 330 also optionally includes a keyboard and/or mouse
(or other pointing device) 350 and touchpad 355, tactile output
generator 357 for generating tactile outputs on device 300 (e.g.,
similar to tactile output generator(s) 167 described above with
reference to FIG. 1A), sensors 359 (e.g., optical, acceleration,
proximity, touch-sensitive, and/or contact intensity sensors
similar to contact intensity sensor(s) 165 described above with
reference to FIG. 1A). Memory 370 includes high-speed random access
memory, such as DRAM, SRAM, DDR RAM, or other random access solid
state memory devices; and optionally includes non-volatile memory,
such as one or more magnetic disk storage devices, optical disk
storage devices, flash memory devices, or other non-volatile solid
state storage devices. Memory 370 optionally includes one or more
storage devices remotely located from CPU(s) 310. In some
embodiments, memory 370 stores programs, modules, and data
structures analogous to the programs, modules, and data structures
stored in memory 102 of portable multifunction device 100 (FIG.
1A), or a subset thereof. Furthermore, memory 370 optionally stores
additional programs, modules, and data structures not present in
memory 102 of portable multifunction device 100. For example,
memory 370 of device 300 optionally stores drawing module 380,
presentation module 382, word processing module 384, website
creation module 386, disk authoring module 388, and/or spreadsheet
module 390, while memory 102 of portable multifunction device 100
(FIG. 1A) optionally does not store these modules.
[0160] Each of the above-identified elements in FIG. 3 is,
optionally, stored in one or more of the previously mentioned
memory devices. Each of the above-identified modules corresponds to
a set of instructions for performing a function described above.
The above-identified modules or programs (e.g., sets of
instructions) need not be implemented as separate software
programs, procedures, or modules, and thus various subsets of these
modules are, optionally, combined or otherwise rearranged in
various embodiments. In some embodiments, memory 370 optionally
stores a subset of the modules and data structures identified
above. Furthermore, memory 370 optionally stores additional modules
and data structures not described above.
[0161] Attention is now directed towards embodiments of user
interfaces that are, optionally, implemented on, for example,
portable multifunction device 100.
[0162] FIG. 4A illustrates an exemplary user interface for a menu
of applications on portable multifunction device 100 in accordance
with some embodiments. Similar user interfaces are, optionally,
implemented on device 300. In some embodiments, user interface 400
includes the following elements, or a subset or superset thereof:
[0163] Signal strength indicator(s) 402 for wireless communication(s), such as cellular and Wi-Fi signals;
[0164] Time 404;
[0165] Bluetooth indicator 405;
[0166] Battery status indicator 406;
[0167] Tray 408 with icons for frequently used applications, such as:
[0168] Icon 416 for telephone module 138, labeled "Phone," which optionally includes an indicator 414 of the number of missed calls or voicemail messages;
[0169] Icon 418 for e-mail client module 140, labeled "Mail," which optionally includes an indicator 410 of the number of unread e-mails;
[0170] Icon 420 for browser module 147, labeled "Browser;" and
[0171] Icon 422 for video and music player module 152, also referred to as iPod (trademark of Apple Inc.) module 152, labeled "iPod;" and
[0172] Icons for other applications, such as:
[0173] Icon 424 for IM module 141, labeled "Messages;"
[0174] Icon 426 for calendar module 148, labeled "Calendar;"
[0175] Icon 428 for image management module 144, labeled "Photos;"
[0176] Icon 430 for camera module 143, labeled "Camera;"
[0177] Icon 432 for online video module 155, labeled "Online Video;"
[0178] Icon 434 for stocks widget 149-2, labeled "Stocks;"
[0179] Icon 436 for map module 154, labeled "Maps;"
[0180] Icon 438 for weather widget 149-1, labeled "Weather;"
[0181] Icon 440 for alarm clock widget 149-4, labeled "Clock;"
[0182] Icon 442 for workout support module 142, labeled "Workout Support;"
[0183] Icon 444 for notes module 153, labeled "Notes;" and
[0184] Icon 446 for a settings application or module, labeled "Settings," which provides access to settings for device 100 and its various applications 136.
[0185] It should be noted that the icon labels illustrated in FIG.
4A are merely exemplary. For example, icon 422 for video and music
player module 152 is labeled "Music" or "Music Player." Other
labels are, optionally, used for various application icons. In some
embodiments, a label for a respective application icon includes a
name of an application corresponding to the respective application
icon. In some embodiments, a label for a particular application
icon is distinct from a name of an application corresponding to the
particular application icon.
[0186] FIG. 4B illustrates an exemplary user interface on a device
(e.g., device 300, FIG. 3) with a touch-sensitive surface 451
(e.g., a tablet or touchpad 355, FIG. 3) that is separate from the
display 450 (e.g., touch screen display 112). Device 300 also,
optionally, includes one or more contact intensity sensors (e.g.,
one or more of sensors 359) for detecting intensity of contacts on
touch-sensitive surface 451 and/or one or more tactile output
generators 357 for generating tactile outputs for a user of device
300.
[0187] Although some of the examples that follow will be given with
reference to inputs on touch screen display 112 (where the
touch-sensitive surface and the display are combined), in some
embodiments, the device detects inputs on a touch-sensitive surface
that is separate from the display, as shown in FIG. 4B. In some
embodiments, the touch-sensitive surface (e.g., 451 in FIG. 4B) has
a primary axis (e.g., 452 in FIG. 4B) that corresponds to a primary
axis (e.g., 453 in FIG. 4B) on the display (e.g., 450). In
accordance with these embodiments, the device detects contacts
(e.g., 460 and 462 in FIG. 4B) with the touch-sensitive surface 451
at locations that correspond to respective locations on the display
(e.g., in FIG. 4B, 460 corresponds to 468 and 462 corresponds to
470). In this way, user inputs (e.g., contacts 460 and 462, and
movements thereof) detected by the device on the touch-sensitive
surface (e.g., 451 in FIG. 4B) are used by the device to manipulate
the user interface on the display (e.g., 450 in FIG. 4B) of the
multifunction device when the touch-sensitive surface is separate
from the display. It should be understood that similar methods are,
optionally, used for other user interfaces described herein.
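The correspondence described above, where contacts on a separate touch-sensitive surface map to respective locations on the display, can be sketched as a proportional mapping between the two coordinate spaces, assuming their primary axes are aligned as in FIG. 4B. Names and dimensions are illustrative only.

```swift
// Illustrative sketch: map a contact location on a separate touch-sensitive
// surface to the corresponding location on the display.
struct Size { let width: Double; let height: Double }

// Scale the contact's coordinates proportionally from surface space into
// display space along each (aligned) axis.
func displayLocation(forSurfaceX sx: Double, surfaceY sy: Double,
                     surface: Size, display: Size) -> (x: Double, y: Double) {
    (x: sx / surface.width * display.width,
     y: sy / surface.height * display.height)
}
```

A contact at the center of the surface thus maps to the center of the display, regardless of the two devices' physical sizes.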
[0188] Additionally, while the following examples are given
primarily with reference to finger inputs (e.g., finger contacts,
finger tap gestures, finger swipe gestures), it should be
understood that, in some embodiments, one or more of the finger
inputs are replaced with input from another input device (e.g., a
mouse-based input or stylus input). For example, a swipe gesture
is, optionally, replaced with a mouse click (e.g., instead of a
contact) followed by movement of the cursor along the path of the
swipe (e.g., instead of movement of the contact). As another
example, a tap gesture is, optionally, replaced with a mouse click
while the cursor is located over the location of the tap gesture
(e.g., instead of detection of the contact followed by ceasing to
detect the contact). Similarly, when multiple user inputs are
simultaneously detected, it should be understood that multiple
computer mice are, optionally, used simultaneously, or a mouse and
finger contacts are, optionally, used simultaneously.
[0189] FIG. 5A illustrates exemplary personal electronic device
500. Device 500 includes body 502. In some embodiments, device 500
can include some or all of the features described with respect to
devices 100 and 300 (e.g., FIGS. 1A-4B). In some embodiments,
device 500 has touch-sensitive display screen 504, hereafter touch
screen 504. Alternatively, or in addition to touch screen 504,
device 500 has a display and a touch-sensitive surface. As with
devices 100 and 300, in some embodiments, touch screen 504 (or the
touch-sensitive surface) optionally includes one or more intensity
sensors for detecting intensity of contacts (e.g., touches) being
applied. The one or more intensity sensors of touch screen 504 (or
the touch-sensitive surface) can provide output data that
represents the intensity of touches. The user interface of device
500 can respond to touches based on their intensity, meaning that
touches of different intensities can invoke different user
interface operations on device 500.
[0190] Exemplary techniques for detecting and processing touch
intensity are found, for example, in related applications:
International Patent Application Serial No. PCT/US2013/040061,
titled "Device, Method, and Graphical User Interface for Displaying
User Interface Objects Corresponding to an Application," filed May
8, 2013, published as WIPO Publication No. WO/2013/169849, and
International Patent Application Serial No. PCT/US2013/069483,
titled "Device, Method, and Graphical User Interface for
Transitioning Between Touch Input to Display Output Relationships,"
filed Nov. 11, 2013, published as WIPO Publication No.
WO/2014/105276, each of which is hereby incorporated by reference
in its entirety.
[0191] In some embodiments, device 500 has one or more input
mechanisms 506 and 508. Input mechanisms 506 and 508, if included,
can be physical. Examples of physical input mechanisms include push
buttons and rotatable mechanisms. In some embodiments, device 500
has one or more attachment mechanisms. Such attachment mechanisms,
if included, can permit attachment of device 500 with, for example,
hats, eyewear, earrings, necklaces, shirts, jackets, bracelets,
watch straps, chains, trousers, belts, shoes, purses, backpacks,
and so forth. These attachment mechanisms permit device 500 to be
worn by a user.
[0192] FIG. 5B depicts exemplary personal electronic device 500. In
some embodiments, device 500 can include some or all of the
components described with respect to FIGS. 1A, 1B, and 3. Device
500 has bus 512 that operatively couples I/O section 514 with one
or more computer processors 516 and memory 518. I/O section 514 can
be connected to display 504, which can have touch-sensitive
component 522 and, optionally, intensity sensor 524 (e.g., contact
intensity sensor). In addition, I/O section 514 can be connected
with communication unit 530 for receiving application and operating
system data, using Wi-Fi, Bluetooth, near field communication
(NFC), cellular, and/or other wireless communication techniques.
Device 500 can include input mechanisms 506 and/or 508. Input
mechanism 506 is, optionally, a rotatable input device. Input
mechanism 508 is, optionally, a button, in some examples.
[0193] Input mechanism 508 is, optionally, a microphone, in some
examples. Personal electronic device 500 optionally includes
various sensors, such as GPS sensor 532, accelerometer 534,
directional sensor 540 (e.g., compass), gyroscope 536, motion
sensor 538, and/or a combination thereof, all of which can be
operatively connected to I/O section 514.
[0194] Memory 518 of personal electronic device 500 can include one
or more non-transitory computer-readable storage mediums, for
storing computer-executable instructions, which, when executed by
one or more computer processors 516, for example, can cause the
computer processors to perform the techniques described below,
including processes 700, 900, 1100, and 1300 (FIGS. 7A-7C, 9A-9D,
11A-11B, and 13A-13C). A computer-readable storage medium can be
any medium that can tangibly contain or store computer-executable
instructions for use by or in connection with the instruction
execution system, apparatus, or device. In some examples, the
storage medium is a transitory computer-readable storage medium. In
some examples, the storage medium is a non-transitory
computer-readable storage medium. The non-transitory
computer-readable storage medium can include, but is not limited
to, magnetic, optical, and/or semiconductor storages. Examples of
such storage include magnetic disks, optical discs based on CD,
DVD, or Blu-ray technologies, as well as persistent solid-state
memory such as flash, solid-state drives, and the like. Personal
electronic device 500 is not limited to the components and
configuration of FIG. 5B, but can include other or additional
components in multiple configurations.
[0195] As used here, the term "affordance" refers to a
user-interactive graphical user interface object that is,
optionally, displayed on the display screen of devices 100, 300,
and/or 500 (FIGS. 1A, 3, and 5A-5B). For example, an image (e.g.,
icon), a button, and text (e.g., hyperlink) each optionally
constitute an affordance.
[0196] As used herein, the term "focus selector" refers to an input
element that indicates a current part of a user interface with
which a user is interacting. In some implementations that include a
cursor or other location marker, the cursor acts as a "focus
selector" so that when an input (e.g., a press input) is detected
on a touch-sensitive surface (e.g., touchpad 355 in FIG. 3 or
touch-sensitive surface 451 in FIG. 4B) while the cursor is over a
particular user interface element (e.g., a button, window, slider,
or other user interface element), the particular user interface
element is adjusted in accordance with the detected input. In some
implementations that include a touch screen display (e.g.,
touch-sensitive display system 112 in FIG. 1A or touch screen 112
in FIG. 4A) that enables direct interaction with user interface
elements on the touch screen display, a detected contact on the
touch screen acts as a "focus selector" so that when an input
(e.g., a press input by the contact) is detected on the touch
screen display at a location of a particular user interface element
(e.g., a button, window, slider, or other user interface element),
the particular user interface element is adjusted in accordance
with the detected input. In some implementations, focus is moved
from one region of a user interface to another region of the user
interface without corresponding movement of a cursor or movement of
a contact on a touch screen display (e.g., by using a tab key or
arrow keys to move focus from one button to another button); in
these implementations, the focus selector moves in accordance with
movement of focus between different regions of the user interface.
Without regard to the specific form taken by the focus selector,
the focus selector is generally the user interface element (or
contact on a touch screen display) that is controlled by the user
so as to communicate the user's intended interaction with the user
interface (e.g., by indicating, to the device, the element of the
user interface with which the user is intending to interact). For
example, the location of a focus selector (e.g., a cursor, a
contact, or a selection box) over a respective button while a press
input is detected on the touch-sensitive surface (e.g., a touchpad
or touch screen) will indicate that the user is intending to
activate the respective button (as opposed to other user interface
elements shown on a display of the device).
[0197] As used in the specification and claims, the term
"characteristic intensity" of a contact refers to a characteristic
of the contact based on one or more intensities of the contact. In
some embodiments, the characteristic intensity is based on multiple
intensity samples. The characteristic intensity is, optionally,
based on a predefined number of intensity samples, or a set of
intensity samples collected during a predetermined time period
(e.g., 0.05, 0.1, 0.2, 0.5, 1, 2, 5, 10 seconds) relative to a
predefined event (e.g., after detecting the contact, prior to
detecting liftoff of the contact, before or after detecting a start
of movement of the contact, prior to detecting an end of the
contact, before or after detecting an increase in intensity of the
contact, and/or before or after detecting a decrease in intensity
of the contact). A characteristic intensity of a contact is,
optionally, based on one or more of: a maximum value of the
intensities of the contact, a mean value of the intensities of the
contact, an average value of the intensities of the contact, a top
10 percentile value of the intensities of the contact, a value at
the half maximum of the intensities of the contact, a value at the
90 percent maximum of the intensities of the contact, or the like.
In some embodiments, the duration of the contact is used in
determining the characteristic intensity (e.g., when the
characteristic intensity is an average of the intensity of the
contact over time). In some embodiments, the characteristic
intensity is compared to a set of one or more intensity thresholds
to determine whether an operation has been performed by a user. For
example, the set of one or more intensity thresholds optionally
includes a first intensity threshold and a second intensity
threshold. In this example, a contact with a characteristic
intensity that does not exceed the first threshold results in a
first operation, a contact with a characteristic intensity that
exceeds the first intensity threshold and does not exceed the
second intensity threshold results in a second operation, and a
contact with a characteristic intensity that exceeds the second
threshold results in a third operation. In some embodiments, a
comparison between the characteristic intensity and one or more
thresholds is used to determine whether or not to perform one or
more operations (e.g., whether to perform a respective operation or
forgo performing the respective operation), rather than being used
to determine whether to perform a first operation or a second
operation.
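By way of illustration only, the characteristic-intensity strategies and the two-threshold comparison described above can be sketched as follows. This is a hedged sketch, not the disclosed implementation: the function names, the choice of Python, and the threshold values are hypothetical, and the paragraph above permits many other reductions (half-maximum value, 90-percent-maximum value, duration-weighted averages) not shown here.

```python
import statistics

def characteristic_intensity(samples, method="mean"):
    """Reduce a window of intensity samples, collected relative to a
    predefined event, to a single characteristic value using one of
    the strategies named above."""
    if not samples:
        raise ValueError("no intensity samples collected")
    if method == "max":
        return max(samples)
    if method == "mean":
        return statistics.mean(samples)
    if method == "top10":
        # top-10-percentile value of the sorted intensities
        ordered = sorted(samples)
        return ordered[int(0.9 * (len(ordered) - 1))]
    raise ValueError(f"unknown method: {method}")

def classify(samples, first_threshold, second_threshold):
    """Map a characteristic intensity onto a first, second, or third
    operation, as in the two-threshold example above. Threshold
    values are supplied by the caller; none are taken from the text."""
    ci = characteristic_intensity(samples)
    if ci <= first_threshold:
        return "first operation"
    if ci <= second_threshold:
        return "second operation"
    return "third operation"
```

A contact whose mean intensity never exceeds the first threshold yields the first operation; one that exceeds the second threshold yields the third.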
[0198] In some embodiments, a portion of a gesture is identified
for purposes of determining a characteristic intensity. For
example, a touch-sensitive surface optionally receives a continuous
swipe contact transitioning from a start location and reaching an
end location, at which point the intensity of the contact
increases. In this example, the characteristic intensity of the
contact at the end location is, optionally, based on only a portion
of the continuous swipe contact, and not the entire swipe contact
(e.g., only the portion of the swipe contact at the end location).
In some embodiments, a smoothing algorithm is, optionally, applied
to the intensities of the swipe contact prior to determining the
characteristic intensity of the contact. For example, the smoothing
algorithm optionally includes one or more of: an unweighted
sliding-average smoothing algorithm, a triangular smoothing
algorithm, a median filter smoothing algorithm, and/or an
exponential smoothing algorithm. In some circumstances, these
smoothing algorithms eliminate narrow spikes or dips in the
intensities of the swipe contact for purposes of determining a
characteristic intensity.
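Two of the smoothing algorithms named above can be sketched minimally as follows. These are illustrative assumptions (window size, edge handling, and function names are not specified in the disclosure); they merely show how narrow spikes or dips in the swipe-contact intensities are damped or eliminated before a characteristic intensity is computed.

```python
def sliding_average(intensities, window=3):
    """Unweighted sliding-average smoothing: replace each sample with
    the mean of a centered window, attenuating narrow spikes/dips."""
    half = window // 2
    smoothed = []
    for i in range(len(intensities)):
        lo = max(0, i - half)
        hi = min(len(intensities), i + half + 1)
        chunk = intensities[lo:hi]
        smoothed.append(sum(chunk) / len(chunk))
    return smoothed

def median_filter(intensities, window=3):
    """Median-filter smoothing: spikes narrower than the window are
    removed entirely rather than merely attenuated."""
    half = window // 2
    out = []
    for i in range(len(intensities)):
        lo = max(0, i - half)
        hi = min(len(intensities), i + half + 1)
        out.append(sorted(intensities[lo:hi])[(hi - lo) // 2])
    return out
```

Note the qualitative difference: the median filter deletes a one-sample spike outright, while the sliding average spreads it over the window.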
[0199] The intensity of a contact on the touch-sensitive surface
is, optionally, characterized relative to one or more intensity
thresholds, such as a contact-detection intensity threshold, a
light press intensity threshold, a deep press intensity threshold,
and/or one or more other intensity thresholds. In some embodiments,
the light press intensity threshold corresponds to an intensity at
which the device will perform operations typically associated with
clicking a button of a physical mouse or a trackpad. In some
embodiments, the deep press intensity threshold corresponds to an
intensity at which the device will perform operations that are
different from operations typically associated with clicking a
button of a physical mouse or a trackpad. In some embodiments, when
a contact is detected with a characteristic intensity below the
light press intensity threshold (e.g., and above a nominal
contact-detection intensity threshold below which the contact is no
longer detected), the device will move a focus selector in
accordance with movement of the contact on the touch-sensitive
surface without performing an operation associated with the light
press intensity threshold or the deep press intensity threshold.
Generally, unless otherwise stated, these intensity thresholds are
consistent between different sets of user interface figures.
[0200] An increase of characteristic intensity of the contact from
an intensity below the light press intensity threshold to an
intensity between the light press intensity threshold and the deep
press intensity threshold is sometimes referred to as a "light
press" input. An increase of characteristic intensity of the
contact from an intensity below the deep press intensity threshold
to an intensity above the deep press intensity threshold is
sometimes referred to as a "deep press" input. An increase of
characteristic intensity of the contact from an intensity below the
contact-detection intensity threshold to an intensity between the
contact-detection intensity threshold and the light press intensity
threshold is sometimes referred to as detecting the contact on the
touch-surface. A decrease of characteristic intensity of the
contact from an intensity above the contact-detection intensity
threshold to an intensity below the contact-detection intensity
threshold is sometimes referred to as detecting liftoff of the
contact from the touch-surface. In some embodiments, the
contact-detection intensity threshold is zero. In some embodiments,
the contact-detection intensity threshold is greater than zero.
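The threshold crossings described in the two paragraphs above can be sketched as a small classifier. The numeric thresholds here are illustrative placeholders only (the disclosure assigns no particular values), and the banding logic is an assumed simplification of the behavior described.

```python
def classify_transition(previous, current,
                        contact_detect=0.05, light=0.3, deep=0.6):
    """Name the input event for a change in characteristic intensity,
    using the contact-detection, light press, and deep press
    thresholds described above. Threshold values are hypothetical."""
    def band(x):
        if x < contact_detect:
            return 0  # below contact detection: no contact
        if x < light:
            return 1  # contact detected, below light press
        if x < deep:
            return 2  # at/above light press, below deep press
        return 3      # at/above deep press
    before, after = band(previous), band(current)
    if before == 0 and after >= 1:
        return "contact detected"
    if before <= 1 and after == 2:
        return "light press"
    if before <= 2 and after == 3:
        return "deep press"
    if before >= 1 and after == 0:
        return "liftoff"
    return "no event"
```

For example, an increase from below the light press threshold to between the light and deep press thresholds classifies as a "light press" input, and a decrease below the contact-detection threshold classifies as liftoff.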
[0201] In some embodiments described herein, one or more operations
are performed in response to detecting a gesture that includes a
respective press input or in response to detecting the respective
press input performed with a respective contact (or a plurality of
contacts), where the respective press input is detected based at
least in part on detecting an increase in intensity of the contact
(or plurality of contacts) above a press-input intensity threshold.
In some embodiments, the respective operation is performed in
response to detecting the increase in intensity of the respective
contact above the press-input intensity threshold (e.g., a "down
stroke" of the respective press input). In some embodiments, the
press input includes an increase in intensity of the respective
contact above the press-input intensity threshold and a subsequent
decrease in intensity of the contact below the press-input
intensity threshold, and the respective operation is performed in
response to detecting the subsequent decrease in intensity of the
respective contact below the press-input threshold (e.g., an "up
stroke" of the respective press input).
[0202] In some embodiments, the device employs intensity hysteresis
to avoid accidental inputs sometimes termed "jitter," where the
device defines or selects a hysteresis intensity threshold with a
predefined relationship to the press-input intensity threshold
(e.g., the hysteresis intensity threshold is X intensity units
lower than the press-input intensity threshold or the hysteresis
intensity threshold is 75%, 90%, or some reasonable proportion of
the press-input intensity threshold). Thus, in some embodiments,
the press input includes an increase in intensity of the respective
contact above the press-input intensity threshold and a subsequent
decrease in intensity of the contact below the hysteresis intensity
threshold that corresponds to the press-input intensity threshold,
and the respective operation is performed in response to detecting
the subsequent decrease in intensity of the respective contact
below the hysteresis intensity threshold (e.g., an "up stroke" of
the respective press input). Similarly, in some embodiments, the
press input is detected only when the device detects an increase in
intensity of the contact from an intensity at or below the
hysteresis intensity threshold to an intensity at or above the
press-input intensity threshold and, optionally, a subsequent
decrease in intensity of the contact to an intensity at or below
the hysteresis intensity threshold, and the respective operation is
performed in response to detecting the press input (e.g., the
increase in intensity of the contact or the decrease in intensity of
the contact, depending on the circumstances).
in response to detecting the press input (e.g., the increase in
intensity of the contact or the decrease in intensity of the
contact, depending on the circumstances).
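The down-stroke/up-stroke detection with hysteresis described in the two paragraphs above can be sketched as a small state machine. This is a simplified, hypothetical rendering: the class name, sample-by-sample interface, and threshold values are assumptions, and the 75% ratio is just one of the proportions the text offers as an example.

```python
class PressDetector:
    """Press detection with intensity hysteresis: a press begins when
    intensity rises to/above the press-input threshold (down stroke)
    and ends only when it falls to/below the lower hysteresis
    threshold (up stroke), suppressing jitter near the press-input
    threshold. Threshold values are illustrative placeholders."""

    def __init__(self, press_threshold=0.6, hysteresis_ratio=0.75):
        self.press_threshold = press_threshold
        # e.g., hysteresis threshold is 75% of the press-input threshold
        self.hysteresis = press_threshold * hysteresis_ratio
        self.pressed = False

    def feed(self, intensity):
        """Return 'down stroke', 'up stroke', or None for one sample."""
        if not self.pressed and intensity >= self.press_threshold:
            self.pressed = True
            return "down stroke"
        if self.pressed and intensity <= self.hysteresis:
            self.pressed = False
            return "up stroke"
        return None
```

A dip from 0.7 to 0.5 and back produces no spurious up stroke, because 0.5 remains above the 0.45 hysteresis threshold; only a drop to or below 0.45 ends the press.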
[0203] For ease of explanation, the descriptions of operations
performed in response to a press input associated with a
press-input intensity threshold or in response to a gesture
including the press input are, optionally, triggered in response to
detecting either: an increase in intensity of a contact above the
press-input intensity threshold, an increase in intensity of a
contact from an intensity below the hysteresis intensity threshold
to an intensity above the press-input intensity threshold, a
decrease in intensity of the contact below the press-input
intensity threshold, and/or a decrease in intensity of the contact
below the hysteresis intensity threshold corresponding to the
press-input intensity threshold. Additionally, in examples where an
operation is described as being performed in response to detecting
a decrease in intensity of a contact below the press-input
intensity threshold, the operation is, optionally, performed in
response to detecting a decrease in intensity of the contact below
a hysteresis intensity threshold corresponding to, and lower than,
the press-input intensity threshold.
[0204] Attention is now directed towards embodiments of user
interfaces ("UI") and associated processes that are implemented on
an electronic device, such as portable multifunction device 100,
device 300, or device 500.
[0205] FIGS. 6A-6S illustrate exemplary user interfaces for playing
and managing audio items, in accordance with some embodiments. The
user interfaces in these figures are used to illustrate the
processes described below, including the processes in FIGS.
7A-7C.
[0206] FIG. 6A illustrates an electronic device 600 (e.g., portable
multifunction device 100, device 300, or device 500). In the
non-limiting exemplary embodiment depicted in FIGS. 6A-6S,
electronic device 600 is a smartwatch. In other embodiments,
electronic device 600 can be a different type of electronic device,
such as a different type of wearable device or a smartphone.
[0207] As shown in FIG. 6A, electronic device 600 has a
touch-sensitive display 602. In some embodiments, electronic device
600 also has a rotatable input mechanism 606 for navigating the
displayed user interface. For example, rotatable input mechanism
606 can be used to scroll the displayed user interface upwards and
downwards. For another example, the rotatable input mechanism
606 can be used to zoom in and zoom out of the displayed user
interface. For another example, the rotatable input mechanism 606
can be used to zoom in to a displayed icon corresponding to an
application and launch the application corresponding to the
zoomed-in icon.
[0208] In FIG. 6A, electronic device 600 displays, on the
touch-sensitive display 602, a portion of a navigation user
interface 604 of an application (e.g., a music application) for
playing audio items (e.g., a song, an audio file, a podcast, a
radio channel, a recording). Navigation user interface 604 enables
a user of electronic device 600 to navigate through audio
playlists, albums, artists, and tracks that can be managed and
played using electronic device 600. In some embodiments, navigation
user interface 604 includes an indication 608 (e.g., "MUSIC" for a
music application) that the currently-displayed application is an
application for playing audio items. In some embodiments,
navigation user interface 604 includes a time indication 610 of the
current time. In some embodiments, navigation user interface 604
includes a scroll bar 612 indicating the portion (or region) of
navigation user interface 604 that is currently being displayed on
touch-sensitive display 602. For example, in FIG. 6A, scroll bar
612 indicates that the very bottom portion of navigation user
interface 604 is currently displayed.
[0209] Navigation user interface 604 also includes a graphical
depiction of a stack of audio group items 614 corresponding to
audio items that can be played through electronic device 600 (e.g.,
via internal speakers of electronic device 600 or via an external
device, such as headphones paired with electronic device 600, an
external speaker paired with electronic device 600, or speakers of
a different device, such as a smartphone, tablet, laptop computer,
or desktop computer, paired with electronic device 600). In some
embodiments, as shown in FIG. 6A, the stack of audio group items
614 is located at a bottom region of navigation user interface 604.
In some embodiments, as shown in FIG. 6A, the stack of audio group
items 614 is organized based on albums (e.g., a musical album, a
collection of podcasts, a collection of listings from a radio
channel), and each stack item of the stack of audio group items 614
corresponds to an album. For example, the first stack item 616 of
the stack corresponds to a "Classics Album." In some embodiments,
the stack of audio group items 614 are organized based on playlists
created by a user of electronic device 600, and each stack item of
the stack of audio group items 614 corresponds to a playlist. In
some embodiments, the stack of audio group items 614 are organized
based on artists, and each stack item of the stack of audio group
items 614 corresponds to an artist. In some embodiments, navigation
user interface 604 optionally displays, below the stack of audio
group items 614, a textual indication 618 of an album, playlist, or
radio station corresponding to the first record of the stack and a
textual indication 620 of an artist associated with the album,
playlist, or radio station.
[0210] In FIG. 6B, while displaying navigation user interface 604,
electronic device 600 receives a tap gesture 601 by a user of
electronic device 600 on first stack item 616 of the stack of audio
group items 614. In FIG. 6C, in response to receiving tap gesture 601
on first stack item 616, electronic device 600 displays, on
touch-sensitive display 602, a control user interface 622 of the
application for playing audio items (e.g., a music application). In
some embodiments, in addition to displaying control user interface
622, electronic device 600 causes audio output (e.g., via internal
speakers of electronic device 600 or via an external device, such
as headphones paired with electronic device 600, an external
speaker paired with electronic device 600, or speakers of a
different device, such as a smartphone, tablet, laptop computer, or
desktop computer, paired with electronic device 600) of a first
audio item associated with the album corresponding to the selected
stack item 616. In some embodiments, electronic device 600 displays
control user interface 622 but does not cause audio output of the
first audio item associated with the album corresponding to the
selected stack item 616.
[0211] Control user interface 622 includes an indication 626 (e.g.,
the title) of the currently-playing audio item (e.g., "First
Track"). In some examples, the audio item is the first listed audio
item in the album (e.g., "Classics Album") corresponding to the
selected stack item 616. In some embodiments, control user
interface 622 also includes an indication 628 of the artist (e.g.,
"Classics Band") associated with the currently-playing audio item
(e.g., "First Track").
[0212] In some embodiments, control user interface 622 displays a
selectable indication 624 which indicates a progress time (e.g.,
"0:00") of the currently-playing audio item (e.g., "First Track").
In some embodiments, in response to detecting user selection of
selectable indication 624, electronic device 600 replaces display
of the current user interface with display of the previous user
interface (e.g., replaces display of control user interface 622
with display of navigation user interface 604). In some embodiments,
selectable indication 624 replaces display of indication 608. In
some embodiments, electronic device 600 maintains display of time
indication 610.
[0213] Control user interface 622 further includes a rewind icon
630, a play/pause icon 632, and a forwards icon 634. In some
embodiments, play/pause icon 632 is displayed in "pause" mode, as
depicted in FIG. 6C, when the audio item (e.g., "First Track") is
currently being played. In some embodiments, play/pause icon 632 is
displayed in "play" mode when the audio item (e.g., "First Track")
is currently not being played. In some embodiments, user activation
of rewind icon 630 causes the currently-playing audio item (e.g.,
"First Track") to rewind by a predetermined increment of time
(e.g., 3 seconds, 5 seconds, 10 seconds, etc.). In some
embodiments, user activation of rewind icon 630 causes the
currently-playing audio item (e.g., "First Track") to be played
from the beginning of the file. In some embodiments, user
activation of rewind icon 630 causes a previous listed audio item
in the album (e.g., "Classics Album") to be played. In some
embodiments, user activation of forwards icon 634 causes the
currently-playing audio item (e.g., "First Track") to jump forwards
by a predetermined increment of time (e.g., 3 seconds, 5 seconds,
10 seconds, etc.). In some embodiments, user activation of forwards
icon 634 causes a subsequent listed audio item in the album (e.g.,
"Classics Album") to be played.
[0214] Control user interface 622 further includes an
add-to-library icon 636, a volume control icon 638, and a show-more
icon 640. Volume control icon 638 enables the user to manipulate
the output volume setting. In some embodiments, user activation of
volume control icon 638 causes display of a volume bar for
increasing or decreasing the volume. Add-to-library icon 636 is
described in greater detail below with reference to FIGS. 6L-6M and
show-more icon 640 is described in greater detail below with
reference to FIGS. 6N-6Q.
[0215] Control user interface 622 further includes a first indicia
icon 642 corresponding to control user interface 622 and second
indicia icon 644 corresponding to a user interface different from
control user interface 622. In FIG. 6C, first indicia icon 642 is
highlighted (e.g., visually darkened, visually marked) to be
visually distinguished relative to second indicia icon 644 in order
to indicate to the user that the currently-displayed user interface
is control user interface 622 (as opposed to a different user
interface).
[0216] In FIG. 6D, electronic device 600 receives a user input on
control user interface 622. In some embodiments, as shown in the
transition of FIG. 6D through FIG. 6F, the user input is a swipe
gesture 603 in a right-to-left horizontal direction, beginning on
control user interface 622 and moving in a leftwards direction on
touch-sensitive display 602. In some embodiments, as shown in FIG.
6E, as swipe gesture 603 is being detected, control user interface
622 is gradually removed from display towards a side (e.g., the
left side) of touch-sensitive display 602, and a track list user
interface 646 moves into display from an opposite side (e.g., the
right side) of touch-sensitive display 602. Thus, swipe gesture 603
causes electronic device 600 to smoothly transition from control
user interface 622 to track list user interface 646. In some
embodiments, during the transition of control user interface 622 to
track list user interface 646, electronic device 600 continues causing
audio output of the currently-playing audio item (e.g., "First
Track"). In some embodiments, during the transition of control user
interface 622 to track list user interface 646, electronic device
600 maintains display of selectable indication 624 indicating the
time progress of the currently-playing audio item (e.g., "First
Track") and time indication 610.
[0217] As shown in FIG. 6G, track list user interface 646 includes
display of a list of selectable titles 648-656 associated with
audio items in the currently-selected album (or playlist or radio
station) (e.g., "Classics Album"), including the title 648
corresponding to the currently-playing audio item (e.g., "First
Track"). In some embodiments, as shown in FIG. 6G, track list user
interface 646 updates display of selectable indication 624 to
indicate the title of the currently-playing audio item (e.g.,
"First Track"). In some embodiments, user activation of selectable
indication 624 causes electronic device 600 to replace display of
track list user interface 646 with control user interface 622. In
some embodiments, electronic device 600 maintains display of time
indication 610 in track list user interface 646.
[0218] Track list user interface 646 also includes display of first
indicia icon 642 corresponding to control user interface 622 and
second indicia icon 644 corresponding to track list user interface
646. Because track list user interface 646 is being displayed,
second indicia icon 644, instead of first indicia icon 642, is
highlighted (e.g., visually darkened, visually marked) to be
visually distinguished relative to first indicia icon 642 in order
to indicate to the user that the currently-displayed user interface
is track list user interface 646 (as opposed to control user
interface 622).
[0219] In FIG. 6H, electronic device 600 receives user selection,
via touch-sensitive display 602 (or, alternatively, a voice input
via a mic or rotation input via rotatable input mechanism 606), of
an audio item of the current album (e.g., "Classics Album")
different from the currently-playing audio item (e.g., "First
Track"). For example, electronic device 600 detects tap gesture 605
on selectable title 652 corresponding to the third audio item
(e.g., "Third Track") of the current album (e.g., "Classics
Album"). In response to detecting tap gesture 605 on selectable
title 652, electronic device 600 causes audio output (e.g., via
internal speakers of electronic device 600 or via an external
device, such as headphones paired with electronic device 600, an
external speaker paired with electronic device 600, or speakers of
a different device, such as a smartphone, tablet, laptop computer,
or desktop computer, paired with electronic device 600) of the
audio item corresponding to selectable title 652 (e.g., "Third
Track"). In some embodiments, selectable indication 624 is updated
to indicate that the audio item corresponding to selectable title
652 is now playing.
[0220] As shown in the transition of FIG. 6I through FIG. 6K,
electronic device 600 detects a swipe gesture 607 in a
left-to-right horizontal direction, beginning on track list user
interface 646 and moving in a rightwards direction on
touch-sensitive display 602. Through the transition, electronic
device 600 smoothly replaces display of track list user interface
646 with control user interface 622. As shown in FIG. 6K, control
user interface 622 displays indication 626 (e.g., the title) of the
currently-playing audio item (e.g., "Third Track") and indication
628 of the artist (e.g., "Classics Band") associated with the
currently-playing audio item (e.g., "Third Track"). As also shown
in FIG. 6K, electronic device 600 updates display of first indicia
icon 642 and second indicia icon 644 such that first indicia icon
642 (corresponding to control user interface 622) is highlighted
(e.g., visually darkened, visually marked) relative to second
indicia icon 644 (corresponding to track list user interface 646)
to indicate to the user that the currently-displayed user interface
is control user interface 622.
[0221] As shown in FIG. 6K, control user interface 622 includes
display of add-to-library icon 636. In some embodiments,
add-to-library icon 636 has a first display mode (e.g., displayed with
a "+" symbol) and a second display mode (e.g., displayed with a checkmark
symbol). The first display mode indicates that the
currently-playing audio item (e.g., "Third Track") is not contained
in a media library (e.g., a music library, a library of audio items
stored in and accessible from a cloud service) associated with a
user account that is logged into electronic device 600, and the
second display mode indicates that the currently-playing audio item
(e.g., "Third Track") is contained in a media library (e.g., a
music library, a library of audio items stored in and accessible
from a cloud service) associated with a user account that is logged
into electronic device 600. In FIG. 6K, the currently-playing audio
item (e.g., "Third Track") is not contained in a media library
associated with a user account of the user of electronic device
600, and thus add-to-library icon 636 is displayed in the first
display mode (e.g., is displayed with a "+" symbol).
[0222] In FIG. 6L, while playing an audio item (e.g., "Third
Track") that is not contained in a media library associated with a
user account of the user, electronic device 600 detects, via
touch-sensitive display 602 (or, alternatively, a voice input via a
mic or rotation input via rotatable input mechanism 606), user
selection of add-to-library icon 636 (in the first display mode).
For example, as shown in FIG. 6L, the user selection is a tap
gesture 609 on add-to-library icon 636. In response to detecting
tap gesture 609 on add-to-library icon 636 (in the first display
mode), electronic device 600 causes the current audio item (e.g.,
"Third Track") to be added to a media library.
[0223] As shown in FIG. 6M, once the current audio item (e.g., "Third
Track") has been added to the media library, electronic device 600
changes the display mode of add-to-library icon 636 from the first
display mode (e.g., displayed with a "+" symbol) to the second
display mode (e.g., displayed with a checkmark symbol). Add-to-library
icon 636 in the second display mode indicates to the user that the
current audio item is already contained in the media library. In
some embodiments, when in the second display mode, upon receiving
user selection of the add-to-library icon 636, electronic device
600 displays a notification (or a prompt) indicating to the user
that the current audio item (e.g., "Third Track") is already
contained in the media library. In some embodiments, when in the
second display mode, add-to-library icon 636 is no longer
selectable by the user.
[0224] In FIG. 6N, while displaying control user interface 622,
electronic device 600 detects, via touch-sensitive display 602 (or,
alternatively, a voice input via a mic or rotation input via
rotatable input mechanism 606), user selection of show-more icon
640. For example, the user selection is a tap gesture 611 on
show-more icon 640. In some embodiments, as shown in FIG. 6N,
show-more icon 640 is displayed with a " . . . " symbol (e.g.,
representing a "show more" option) to indicate that more user
actions can be taken with respect to the current audio item (e.g.,
"Third Track").
[0225] In some embodiments, in response to detecting tap gesture
611 on show-more icon 640, electronic device 600 displays (replaces
display of control user interface 622 with) an additional options
user interface 658, as shown in FIG. 6O. In some embodiments, in
additional options user interface 658, selectable indication 624
indicates (e.g., by displaying "CANCEL") to the user that the user
can select selectable indication 624 to leave additional options
user interface 658 (and return to control user interface 622). In
some embodiments, additional options user interface 658 maintains
display of time indication 610. In some embodiments, additional
options user interface 658 maintains display of indication 626
(e.g., the title) of the current audio item (e.g., "Third Track").
In some embodiments, additional options user interface 658
maintains display of indication 628 of the artist (e.g., "Classics
Band") corresponding to the current audio item (e.g., "Third
Track").
[0226] In some embodiments, as further shown in FIG. 6O, additional
options user interface 658 includes an add-locally icon 660 and
play-remotely icon 662. In some embodiments, add-locally icon 660
indicates (e.g., by displaying a textual indication, such as "Add
to Watch") to the user that the current audio item (e.g., "Third
Track") can be stored locally on electronic device 600. In some
embodiments, play-remotely icon 662 indicates (e.g., via a textual
indication, such as "AirPlay To" or "Stream To") to the user that
the current audio item can be caused to play remotely at a
different device (e.g., a smartphone, a laptop computer, a desktop
computer) that is paired (e.g., connected with and recognized by)
electronic device 600.
[0227] In FIG. 6P, while displaying additional options user
interface 658, electronic device 600 detects, via touch-sensitive
display 602 (or, alternatively, a voice input via a mic or rotation
input via rotatable input mechanism 606), user selection of
add-locally icon 660. For example, as shown in FIG. 6P, the user
selection is a tap gesture 613 on add-locally icon 660.
[0228] As shown in FIG. 6Q, in response to detecting tap gesture
613 on add-locally icon 660, electronic device 600 displays a
second additional options user interface 664. In some embodiments,
second additional options user interface 664 maintains display of
selectable indication 624 in the cancel mode. In some embodiments, user
activation of selectable indication 624 on second additional
options user interface 664 causes electronic device 600 to return
to display of additional options user interface 658. In some
embodiments, user activation of selectable indication 624 on second
additional options user interface 664 causes electronic device 600
to return to display of control user interface 622.
[0229] As also shown in FIG. 6Q, second additional options user
interface 664 includes an add-track-only icon 668 and an
add-entire-album icon 670. Add-track-only icon 668 indicates (e.g.,
via a textual indication, such as "Add Song," "Add Track," "Add
Audio item," etc.) to the user that only the current audio item
(e.g., "Third Track"), but not other audio items associated with
the current album (e.g., "Classics Album"), will be added (stored)
locally on electronic device 600. Add-entire-album icon 670
indicates (e.g., via a textual indication, such as "Add Album,"
"Add Playlist," "Add Folder," etc.) that all audio items associated
with the current album (or current playlist, current folder,
current radio station, etc.), will be added (stored) locally on
electronic device 600.
[0230] In some embodiments, as shown in FIG. 6R, while displaying
second additional options user interface 664, electronic device 600
detects, via touch-sensitive display 602 (or, alternatively, a
voice input via a mic or rotation input via rotatable input
mechanism 606), user selection of add-track-only icon 668. For
example, the user selection is a tap gesture 615 on add-track-only
icon 668. In response to detecting user activation of
add-track-only icon 668, electronic device 600 proceeds to add
(store) the current audio item (e.g., "Third Track") locally on
electronic device 600, but does not add (store) other audio items
(e.g., "First Track," "Second Track," "Fourth Track," "Fifth
Track") of the current album (e.g., "Classics Album") locally on
electronic device 600, even if one or more of the other audio items
are not stored locally on electronic device 600.
[0231] In some embodiments, as shown in FIG. 6S, while displaying
second additional options user interface 664, electronic device 600
receives, via touch-sensitive display 602 (or, alternatively, a
voice input via a mic or rotation input via rotatable input
mechanism 606), user selection of add-entire-album icon 670. For
example, the user input is a tap gesture 617 on add-entire-album icon
670. In response to detecting user activation of add-entire-album icon
670, electronic device 600 proceeds to add (store) all audio items
(e.g., "First Track," "Second Track," "Third Track," "Fourth
Track," "Fifth Track") in the current album (e.g., "Classics
Album") locally on electronic device 600.
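The branching behavior described in paragraphs [0230] and [0231] can be sketched as follows. This is a minimal illustrative model, not the disclosed implementation: the `LocalLibrary` class and its method names are assumptions introduced only for this sketch.

```python
class LocalLibrary:
    """Minimal model of the audio items stored locally on the device."""

    def __init__(self):
        self.stored = set()

    def add_track_only(self, track):
        # Store just the selected track; other tracks of the album are
        # skipped even if they are not yet stored locally (cf. icon 668).
        self.stored.add(track)

    def add_entire_album(self, album_tracks):
        # Store every track of the current album, playlist, folder, or
        # radio station locally (cf. icon 670).
        self.stored.update(album_tracks)


album = ["First Track", "Second Track", "Third Track",
         "Fourth Track", "Fifth Track"]

lib = LocalLibrary()
lib.add_track_only("Third Track")
assert lib.stored == {"Third Track"}

lib.add_entire_album(album)
assert lib.stored == set(album)
```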
[0232] FIGS. 7A-7C are a flow diagram illustrating a method for
playing and managing audio items using an electronic device in
accordance with some embodiments. Method 700 is performed at a
device (e.g., 100, 300, 500, 600) with a touch-sensitive display
(e.g., touch-sensitive display 602). Some operations in method 700
are, optionally, combined, the order of some operations is,
optionally, changed, and some operations are, optionally,
omitted.
[0233] As described below, method 700 provides an intuitive way for
playing and managing audio items. The method reduces the cognitive
burden on a user for playing and managing audio items, thereby
creating a more efficient human-machine interface. For
battery-operated computing devices, enabling a user to play and
manage audio files faster and more efficiently conserves power and
increases the time between battery charges.
[0234] At block 702, the electronic device (e.g., 600) displays, on
the display (e.g., 602), a first user interface (e.g., 604), where
the first user interface includes a scrollable plurality of audio
playlist items (e.g., 614) associated with a plurality of audio
playlists. In some examples, the scrollable plurality of audio
playlist items (e.g., 614) is a stack of records. In some examples,
the scrollable plurality of audio playlist items (e.g., 614) is a
stack of audio tracks. In some examples, the scrollable plurality
of audio playlist items (e.g., 614) is a collection of titles or
albums. In some examples, the scrollable plurality of audio
playlist items (e.g., 614) is a collection of radio items, news
items, or podcasts. In some examples, the scrollable plurality of
audio playlist items (e.g., 614) is a collection of audio
recordings. Displaying a user interface that includes a scrollable
plurality of audio playlist items associated with a plurality of
audio playlists provides a visual feedback that enables a user to
visualize (and thus experience the sensation of) flipping through a
real stack of playlist items. Providing improved visual feedback to
the user enhances the operability of the device and makes the
user-device interface more efficient (e.g., by helping the user to
provide proper inputs and reducing user mistakes when
operating/interacting with the device) which, additionally, reduces
power usage and improves battery life of the device by enabling the
user to use the device more quickly and efficiently.
[0235] In some embodiments, at block 704, while displaying the
first user interface (e.g., 604), the electronic device (e.g., 600)
receives a third input. In some examples, the third input is a
touch gesture on the touch-sensitive display (e.g., 602)
corresponding to a scroll in an upwards direction. In some
examples, the third input is a touch gesture on the touch-sensitive
display (e.g., 602) corresponding to a scroll in a downwards
direction. In some examples, the third input is a movement of a
rotatable crown (e.g., 606) of the electronic device (e.g., 600) in
a clockwise direction. In some examples, the third input is a
movement of a rotatable crown (e.g., 606) of the electronic device
(e.g., 600) in a counter-clockwise direction. In some embodiments,
at block 706, in response to receiving the third input, the
electronic device (e.g., 600) displays, on the display (e.g., 602),
a plurality of menu affordances. In some embodiments, at block 708,
the electronic device (e.g., 600) receives user selection of a
first menu affordance (e.g., a "library" menu) of the plurality of
menu affordances (e.g., a "now playing" menu, a "search" menu, or a
"library" menu), where the menu affordance is associated with the
media library. In some embodiments, at block 710, in response to
receiving the user selection of the first menu affordance (e.g.,
the "library" menu), the electronic device (e.g., 600) displays, on
the display (e.g., 602), one or more audio items associated with
(e.g., contained in) the media library.
[0236] In some embodiments, at block 712, while displaying the one
or more audio items associated with the media library, the
electronic device (e.g., 600) receives user selection of a second
audio item of the one or more audio items associated with the media
library. In some embodiments, at block 714, in response to
receiving the user selection of the second audio item, the
electronic device (e.g., 600) displays, on the display, the second
user interface (e.g., 622) (e.g., a control user interface of a
music application), where the second user interface (e.g., 622)
includes a first indication (e.g., 626) (e.g., a title of the audio
item or an artist/album associated with the audio item) of the
second audio item and a second indication (e.g., 636) indicating
that the second audio item is contained in the media library.
[0237] At block 716, the electronic device (e.g., 600) receives a
first user input (e.g., 601) on a first audio playlist item of the
plurality of audio playlist items (e.g., 614). In some examples,
the first user input (e.g., 601) is a touch gesture detectable by
the touch-sensitive display (e.g., 602), such as a tap.
[0238] At block 718, in response to receiving the first user input
(e.g., 601) on the first audio playlist item, the electronic device
(e.g., 600), at block 720, displays, on the display (e.g., 602), a
second user interface (e.g., 622) (e.g., a control user interface
of a music application), where the second user interface (e.g.,
622) includes an indication (e.g., 626) of a first audio item of a
first audio playlist associated with the first audio playlist item,
and, at block 722, displays, on the display, a plurality of indicia
icons (e.g., 642, 644), where a first indicia icon (e.g., 642)
associated with the second user interface includes an indication
(e.g., highlighting or marking of the indicia icon) that the second
user interface is currently displayed. Thus, the first indicia icon
(e.g., 642) is visually distinguished from other indicia icons that
are not associated with the second user interface (e.g., 622).
Displaying a plurality of indicia icons, where the first indicia
icon associated with the second user interface includes the
indication that the second user interface is currently displayed,
provides the user with feedback about the currently-displayed user
interface and about other user interfaces that the user can
navigate to relative to the currently-displayed user interface.
Providing improved visual feedback to the user enhances the
operability of the device and makes the user-device interface more
efficient (e.g., by helping the user to provide proper inputs and
reducing user mistakes when operating/interacting with the device)
which, additionally, reduces power usage and improves battery life
of the device by enabling the user to use the device more quickly
and efficiently.
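The indicia icons of blocks 720-722 behave like a page-dot indicator: one icon per navigable user interface, with the icon for the currently-displayed interface visually distinguished. A hypothetical sketch (the function name `render_indicia` and the dot characters are illustrative assumptions):

```python
def render_indicia(num_interfaces, current):
    """Return one dot per user interface, filling only the dot for the
    currently-displayed interface (cf. indicia icons 642 and 644)."""
    return "".join("●" if i == current else "○"
                   for i in range(num_interfaces))


# Control user interface (622) displayed: first indicia icon highlighted.
assert render_indicia(2, 0) == "●○"
# After swiping to the track-list interface (646): second icon highlighted.
assert render_indicia(2, 1) == "○●"
```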
[0239] In some embodiments, at block 724, in response to receiving
the first user input (e.g., 601) on the first audio playlist item
(e.g., 616), the electronic device (e.g., 600) causes audio output
of the first audio item. For example, the electronic device (e.g.,
600) causes the audio output of the first audio item via internal
speakers of the electronic device or via an external device, such
as headphones paired with the electronic device, an external
speaker paired with the electronic device, or the speakers of a
different device (e.g., a smartphone, tablet, laptop computer, or
desktop computer) paired with the electronic device. Causing audio
output of the first audio item in response to receiving the first
user input on the first audio playlist item allows the user to
quickly and easily listen to and recognize an audio item within the
audio playlist item without having to manually view and select an
audio item within the audio playlist item. Performing an operation
when a set of conditions has been met without requiring further
user input enhances the operability of the device and makes the
user-device interface more efficient (e.g., by helping the user to
provide proper inputs and reducing user mistakes when
operating/interacting with the device) which, additionally, reduces
power usage and improves battery life of the device by enabling the
user to use the device more quickly and efficiently.
[0240] In some embodiments, at block 728, in accordance with a
determination that the first audio item is not contained in a media
library associated with a user account that is logged into the
electronic device (e.g., 600), the electronic device displays, on
the display (e.g., 602), a first affordance (e.g., 636). In some
examples, the media library is a library of audio items stored in
and accessible from a cloud service. In some examples, the first
affordance (e.g., 636) is displayed with a "+" to indicate that
selecting the affordance will cause the first audio item to be
added to the media library. In some embodiments, at block 730, the
electronic device (e.g., 600) receives user selection of the first
affordance (e.g., 636). In some embodiments, at block 732, in
response to receiving the user selection of the first affordance
(e.g., 636), the electronic device (e.g., 600) causes the first
audio item to be added to the media library.
[0241] Displaying, in accordance with a determination that the
audio item is not contained in a media library associated with a
user account that is logged into the device, the affordance
provides the user with feedback about the current state of the
audio item and whether or not the user would want to take
additional action regarding the audio item. Providing improved
visual feedback to the user enhances the operability of the device
and makes the user-device interface more efficient (e.g., by
helping the user to provide proper inputs and reducing user
mistakes when operating/interacting with the device) which,
additionally, reduces power usage and improves battery life of the
device by enabling the user to use the device more quickly and
efficiently.
[0242] In some embodiments, at block 734, in accordance with a
determination that the first audio item is contained in the media
library associated with the user account that is logged into the
electronic device (e.g., 600), the electronic device displays, on
the display (e.g., 602), an indication (e.g., 636) that the first
audio item is contained in the media library. In some examples, the
first affordance (e.g., 636) is displayed with a checkmark (instead
of a "+") to indicate that the first audio item is already contained in
the media library. Displaying the indication that the audio item is
contained in the media library provides the user with visual
feedback about the current state of the audio item. Providing
improved visual feedback to the user enhances the operability of
the device and makes the user-device interface more efficient
(e.g., by helping the user to provide proper inputs and reducing
user mistakes when operating/interacting with the device) which,
additionally, reduces power usage and improves battery life of the
device by enabling the user to use the device more quickly and
efficiently.
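The determinations at blocks 728 and 734 reduce to a membership test on the media library, which selects the symbol shown on affordance 636. A sketch under stated assumptions (the function name `library_affordance` and the returned symbols are illustrative, not part of the disclosure):

```python
def library_affordance(audio_item, media_library):
    """Return the symbol for affordance 636: "+" invites adding the item
    to the media library; a checkmark shows it is already contained."""
    return "✓" if audio_item in media_library else "+"


library = {"First Track", "Second Track"}
assert library_affordance("Third Track", library) == "+"

library.add("Third Track")  # user selects the "+" affordance (block 732)
assert library_affordance("Third Track", library) == "✓"
```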
[0243] In some embodiments, at block 736, the second user interface
(e.g., 622) (e.g., a control user interface of a music application)
includes a second affordance (e.g., 640), and the electronic device
(e.g., 600) receives user selection of the second affordance. In
some examples, the second affordance (e.g., 640) is displayed with
a " . . . " (e.g., representing a "show more" option) to indicate
that more actions can be taken with respect to the first audio
item. In some embodiments, at block 738, subsequent to receiving
the user selection of the second affordance (e.g., 640), the
electronic device (e.g., 600) displays, on the display (e.g., 602),
a first add affordance (e.g., 660, 668). In some examples, the
first add affordance (e.g., 668) indicates to the user that the
current audio item can be stored locally on the electronic device
(e.g., 600). In some embodiments, at block 740, the electronic
device (e.g., 600) receives user selection of the first add
affordance (e.g., 660, 668). In some embodiments, at block 742, in
response to receiving the user selection of the first add
affordance (e.g., 660, 668), the electronic device (e.g., 600)
stores the first audio item on the electronic device, and forgoes
storing a second audio item of the plurality of audio items of the
first audio playlist different from the first audio item. For
example, the electronic device (e.g., 600) stores only the current
audio item of the playlist but does not store all other audio items
of the playlist.
[0244] In some embodiments, at block 744, subsequent to receiving
the user selection of the second affordance (e.g., 670), the
electronic device (e.g., 600) displays, on the display (e.g., 602),
a second add affordance (e.g., 670). In some examples, the second
add affordance (e.g., 670) indicates to the user that all the audio
items of the current playlist can be stored locally on the device
(e.g., 600). In some embodiments, at block 746, the electronic
device (e.g., 600) receives user selection of the second add
affordance (e.g., 670). In some embodiments, at block 748, in
response to receiving the user selection of the second add
affordance (e.g., 670), the electronic device (e.g., 600) stores
all of the plurality of audio items of the first audio playlist on
the electronic device.
[0245] At block 750, the electronic device (e.g., 600) receives a
second user input on the second user interface (e.g., 622). In some
examples, the second user input is a swipe gesture (e.g., 607) in a
horizontal direction on the second user interface (e.g., 622). In
some examples, the second user input is a swipe gesture in a
vertical direction on the second user interface.
[0246] At block 752, in response to receiving the second user input
(e.g., 607) on the second user interface, the electronic device
(e.g., 600), at block 754, displays, on the display (e.g., 602), a
third user interface (e.g., 646) (e.g., a user interface showing
the list of tracks in the playlist that is associated with the
currently-playing track), where the third user interface includes a
plurality of audio items (e.g., 648-656) of the first audio
playlist and, at block 756, updates display of the plurality of
indicia icons, where a second indicia icon (e.g., 644) associated
with the third user interface (e.g., 646) includes the indication
(e.g., highlighting or marking of the indicia icon) that the third
user interface is currently displayed.
[0247] Updating display of the plurality of indicia icons, where
the first indicia icon associated with the second user interface
includes the indication that the second user interface is currently
displayed, provides the user with feedback about the
currently-displayed user interface and about other user interfaces
that the user can navigate to relative to the currently-displayed
user interface. Providing improved visual feedback to the user
enhances the operability of the device and makes the user-device
interface more efficient (e.g., by helping the user to provide
proper inputs and reducing user mistakes when operating/interacting
with the device) which, additionally, reduces power usage and
improves battery life of the device by enabling the user to use the
device more quickly and efficiently.
[0248] In some embodiments, at block 758, while the electronic
device (e.g., 600) has no connectivity with an external device
(e.g., a smartphone, tablet, laptop computer, or desktop computer)
that is storing a second audio item of the first audio playlist,
the electronic device receives user selection of the second audio
item. In some embodiments, at block 760, in response to receiving
the user selection of the second audio item of the first audio
playlist, the electronic device (e.g., 600), at block 762, in
accordance with a determination that the second audio item is
stored on the electronic device, causes audio output of the second
audio item, and, at block 764, in accordance with a determination
that the second audio item is not stored on the electronic device,
forgoes causing audio output of the second audio item. For example,
the electronic device (e.g., 600) causes the audio output of the
second audio item via internal speakers of the electronic device or
via an external device, such as headphones paired with the
electronic device, an external speaker paired with the electronic
device, or the speakers of a different device (e.g., a smartphone,
tablet, laptop computer, or desktop computer) paired with the
electronic device.
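The offline gating of blocks 758-764 can be sketched as: while disconnected from the external device, cause audio output only for items stored locally, and forgo output otherwise. The function name `try_play_offline` is a hypothetical illustration:

```python
def try_play_offline(item, locally_stored):
    """While the device has no connectivity with the external device,
    cause audio output only for locally stored items (block 762);
    otherwise forgo causing audio output (block 764)."""
    if item in locally_stored:
        return f"playing: {item}"
    return None  # forgo causing audio output


stored = {"Third Track"}
assert try_play_offline("Third Track", stored) == "playing: Third Track"
assert try_play_offline("First Track", stored) is None
```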
[0249] Causing audio output of the audio item in accordance with a
determination that the audio item is stored on the device and
forgoing causing audio output of the audio item in accordance with
a determination that the audio item is not stored on the device
provides the user with feedback indicative of whether or not the
audio item is stored on the device. Providing improved feedback
enhances the operability of the device and makes the user-device
interface more efficient (e.g., by helping the user to provide
proper inputs and reducing user mistakes when operating/interacting
with the device) which, additionally, reduces power usage and
improves battery life of the device by enabling the user to use the
device more quickly and efficiently.
[0250] In some embodiments, at block 766, the electronic device
(e.g., 600) receives user input (e.g., 605) on a second audio item
(e.g., a different track from the current track) of the plurality
of audio items of the first audio playlist displayed on the third
user interface, where the second audio item is different from the
first audio item. In some embodiments, at block 768, in response to
receiving the user input (e.g., 605) on the second audio item, at
the electronic device (e.g., 600), at block 770, displays, on the
display (e.g., 602), the second user interface (e.g., 622), where
the second user interface includes an indication (e.g., 626) of the
second audio item. In some embodiments, at block 772, the
electronic device (e.g., 600) also updates display of the plurality
of indicia icons, where the first indicia icon (e.g., 642)
associated with the second user interface (e.g., 622) includes the
indication (e.g., highlighting or marking of the indicia icon) that
the second user interface is currently displayed. In some
embodiments, at block 774, the electronic device (e.g., 600) also
causes audio output of the second audio item. For example, the
electronic device (e.g., 600) causes the audio output of the second
audio item via internal speakers of the electronic device or via an
external device, such as headphones paired with the electronic
device, an external speaker paired with the electronic device, or
the speakers of a different device (e.g., a smartphone, tablet,
laptop computer, or desktop computer) paired with the electronic
device.
[0251] Updating display of the plurality of indicia icons, where
the first indicia icon associated with the second user interface
includes the indication that the second user interface is currently
displayed, provides the user with feedback about the
currently-displayed user interface and about other user interfaces
that the user can navigate to relative to the currently-displayed
user interface. Providing improved visual feedback to the user
enhances the operability of the device and makes the user-device
interface more efficient (e.g., by helping the user to provide
proper inputs and reducing user mistakes when operating/interacting
with the device) which, additionally, reduces power usage and
improves battery life of the device by enabling the user to use the
device more quickly and efficiently.
[0252] Note that details of the processes described above with
respect to method 700 (e.g., FIGS. 7A-7C) are also applicable in an
analogous manner to the methods described below. For example,
methods 900, 1100, and 1300 optionally include one or more of the
characteristics of the various methods described above with
reference to method 700. For example, the method of navigating the
displayed stack of stack items described in method 900 can be used
to navigate audio items and select audio items to be played through
electronic device 600. For another example, the method of quickly
and efficiently switching between user interfaces of active
applications described in method 1100 can be used to switch amongst
active applications on electronic device 600. For another example,
the method of updating data associated with audio files using a
different device as described in method 1300 can be used to update
locally stored data on electronic device 600. For brevity, these
details are not repeated below.
[0253] FIGS. 8A-8AC illustrate exemplary user interfaces for
playing and managing audio items, in accordance with some
embodiments. The user interfaces in these figures are used to
illustrate the processes described below, including the processes
in FIGS. 9A-9D.
[0254] FIG. 8A illustrates the face of an electronic device 800
(e.g., portable multifunction device 100, device 300, or device
500). In this non-limiting exemplary embodiment depicted in FIGS.
8A-8AC, electronic device 800 is a smartwatch. In other
embodiments, electronic device 800 can be a different type of
electronic device, such as a different type of wearable device or a
smartphone.
[0255] As shown in FIG. 8A, electronic device 800 has a
touch-sensitive display 802. In some embodiments, electronic device
800 also has a rotatable input mechanism 806 for navigating the
displayed user interface. For example, rotatable input mechanism
806 can be used to scroll (upwards and downwards) the displayed
user interface. For another example, the rotatable input mechanism
806 can be used to zoom in and zoom out of the displayed user
interface. For another example, the rotatable input mechanism 806
can be used to zoom in to a displayed icon corresponding to an
application and launch the application corresponding to the
zoomed-in icon.
[0256] In FIG. 8A, electronic device 800 is displaying, on the
touch-sensitive display 802, a portion of a navigation user
interface 804 of an application (e.g., a music application) for
playing audio items (e.g., a song, an audio file, a podcast, a
radio channel). Navigation user interface 804 enables a user of
electronic device 800 to navigate through audio playlists, albums,
artists, and/or tracks that can be managed and played using
electronic device 800. In some embodiments, navigation user
interface 804 includes an indication 808 (e.g., "MUSIC" for a music
application) that the currently-displayed application is an
application for playing audio items (e.g., songs, podcasts, radio
channels). In some embodiments, navigation user interface 804
includes a time indication 810 of the current time. In some
embodiments, navigation user interface 804 includes a scroll bar
812 indicating the portion (or region) of navigation user interface
804 that is currently being displayed on touch-sensitive display
802. For example, in FIG. 8A, scroll bar 812 indicates that the
very bottom portion of navigation user interface 804 is currently
displayed.
[0257] Navigation user interface 804 also includes a graphical
depiction of a stack of audio group items 814 corresponding to
audio items that can be played through electronic device 800 (e.g.,
via internal speakers of electronic device 800 or via an external
device, such as headphones paired with electronic device 800, an
external speaker paired with electronic device 800, or speakers of
a different device, such as a smartphone, tablet, laptop computer,
or desktop computer, paired with electronic device 800).
[0258] As shown in FIG. 8A, the stack of audio group items 814 is
located at a bottom region of navigation user interface 804. In
some embodiments, as shown in FIG. 8A, the stack of audio group
items 814 is organized based on albums (e.g., a musical album, a
collection of podcasts, a collection of listings from a radio
channel), and each stack item of the stack of audio group items 814
corresponds to an album. For example, the first stack item 816 of
stack 814, which is shown at the top of stack 814, corresponds to a
"Classics Album." In some embodiments, the stack of audio group
items 814 are organized based on playlists created by a user of
electronic device 800, and each stack item of the stack of audio
group items 814 corresponds to a playlist. In some embodiments, the
stack of audio group items 814 are organized based on artists, and
each stack item of the stack of audio group items 814 corresponds
to an artist.
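The alternative organizations of stack 814 described above (by album, by playlist, or by artist) can be sketched as a single grouping operation over the audio items. The function `build_stack` and the sample track records (e.g., "New Song") are assumptions introduced for illustration:

```python
from collections import defaultdict


def build_stack(audio_items, group_by):
    """Group audio items into stack items keyed by "album", "playlist",
    or "artist", mirroring the organizations described for stack 814."""
    groups = defaultdict(list)
    for item in audio_items:
        groups[item[group_by]].append(item["title"])
    # Each (key, titles) pair models one stack item of stack 814.
    return list(groups.items())


tracks = [
    {"title": "Third Track", "album": "Classics Album",
     "artist": "Classics Band"},
    {"title": "New Song", "album": "Modern Album",
     "artist": "Modern Band"},
]
stack_items = build_stack(tracks, "album")
assert stack_items[0] == ("Classics Album", ["Third Track"])
```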
[0259] As further shown in FIG. 8A, the stack of audio group items
814 includes one or more additional records, in addition to first
stack item 816 of the stack. For example, stack 814 further
includes at least a second audio record 824 and a third audio
record 826. In some embodiments, a portion of the one or more
additional records is displayed beneath first stack item 816. This
configuration of stack 814 provides a graphical depiction of album
records being piled on top of one another in a stack.
[0260] In some embodiments, navigation user interface 804
optionally displays, below the stack of audio group items 814, a
textual indication 820 of an album, playlist, or radio station
corresponding to the first record of the stack and a textual
indication 822 of an artist associated with the album, playlist, or
radio station.
[0261] In FIG. 8B, while displaying stack 814 of navigation user
interface 804, electronic device 800 receives, via touch-sensitive
display 802 (or alternatively, a voice input via a mic or rotation
input via rotatable input mechanism 806), user input navigating
navigation user interface 804. For example, in FIG. 8B, the user
input is a scrolling 803 of rotatable input mechanism 806 in a
counter-clockwise direction.
[0262] As shown in the transition of FIG. 8B through FIG. 8E, while
detecting scrolling 803 of rotatable input mechanism 806 in the
counter-clockwise direction, electronic device 800 replaces display of first
stack item 816 at the top of stack 814 with second audio record 824
(e.g., "Modern Album"). In some embodiments, electronic device 800
graphically depicts this transition to resemble a "flipping
through" of records within stack 814. In some embodiments, as
depicted in FIGS. 8B-8D, electronic device 800 dynamically depicts
first stack item 816 sliding down from the top of stack 814 towards
a bottom edge of touch-sensitive display 802 and relocating to the
back of stack 814. In some embodiments, electronic device 800
dynamically depicts first stack item 816 flipping forward from the
top of stack 814 and rotating based on its bottom edge to the end
(i.e., bottom) of stack 814.
[0263] In some embodiments, while detecting scrolling 803 of
rotatable input mechanism 806 and displaying the "flipping through"
animation of stack 814, electronic device 800 continually updates
display of scroll bar 812 to indicate an amount of scrolling of
navigation user interface 804 that resulted from scrolling 803 of
rotatable input mechanism 806. In the transition from FIG. 8B to
FIG. 8E, because stack 814 was displayed at the very bottom of
navigation user interface 804 and scrolling 803 corresponded to an
upwards scroll on navigation user interface 804, scroll bar 812
indicates an amount of the scrolling of navigation user interface
804 in an upwards direction.
[0264] As shown in FIG. 8E, once scrolling 803 of rotatable input
mechanism 806 fully replaces the top stack item (e.g., first stack
item 816) of stack 814 with the next stack item (e.g., second audio
record 824), navigation user interface 804 depicts stack 814 with
the next stack item (e.g., second audio record 824)
as the top stack item. The previous top stack item, first stack
item 816, is now located at the bottom of stack 814. In some
embodiments, as shown in FIG. 8E, a portion (e.g., a top portion)
of first stack item 816 remains visible in stack 814 as the last
stack item in the stack. In some embodiments, navigation user
interface 804 updates display of textual indication 820 to the
album, playlist, or radio station corresponding to the second audio
record (e.g., "Modern Album") and textual indication 822 to the
artist (e.g., "Modern Band") associated with the album, playlist,
or radio station corresponding to the second audio record (e.g.,
"Modern Album").
[0265] In FIG. 8F, while displaying stack 814 with the top stack
item as second stack item 824, electronic device 800 receives, via
touch-sensitive display 802 (or alternatively, a voice input via a
mic or rotation input via rotatable input mechanism 806), user
input on stack 814. For example, as illustrated in the
transition from FIG. 8F through FIG. 8I, the user input is a
scrolling (or sliding) gesture 803 in a downwards direction on
stack 814 that replaces the top stack item of stack 814 from second
stack item 824 to third stack item 826. In some embodiments, the
transition of stack 814 depicted in FIGS. 8F-8I involves the same
"flipping through" animation corresponding to the transition
illustrated in FIGS. 8B-8E (replacing first stack item 816 with
second stack item 824 at the top of stack 814). In some
embodiments, while detecting scrolling gesture 803 and displaying
the "flipping through" animation of stack 814, electronic device
800 continually updates display of scroll bar 812 to indicate an
amount of scrolling of navigation user interface 804 that resulted
from scrolling gesture 803 on stack 814. In some embodiments,
scrolling gesture 803 corresponds to "flipping through" stack 814
from the top of the stack towards the bottom of the stack, and
"flipping through" stack 814 in the top to bottom direction
corresponds to an upwards scroll of navigation user interface 804
(or alternatively, a downwards scroll of navigation user interface
804).
[0266] In FIG. 8I, subsequent to navigating ("flipping through")
stack 814 from first stack item 816 to second stack item 824, then
from second stack item 824 to third stack item 826, electronic
device 800 displays stack 814 with third stack item 826 as the top
stack item. In some embodiments, a user account (of the user)
logged into electronic device 800 is associated with one or more
additional albums (or playlists, artists, radio stations, etc.),
and thus stack 814 includes one or more additional stack items, in
addition to first stack item 816, second stack item 824, and third
stack item 826, corresponding to the one or more additional
albums.
[0267] In some embodiments, as displayed in the transition from
FIG. 8J through FIG. 8Q, further scrolling (e.g., "flipping
through") of stack 814 (via scrolling of rotatable input mechanism
806, scrolling gesture on touch-sensitive display 802, or a
different input mechanism, such as a voice input command) enables
the user to view the one or more additional stack items in stack
814.
[0268] For example, in FIG. 8J, electronic device 800 receives
scrolling 805 of rotatable input mechanism 806 in the
counter-clockwise direction. In some embodiments, scrolling of
rotatable input mechanism 806 in the counter-clockwise direction
corresponds to scrolling of navigation user interface 804 in an
upwards direction. Scrolling 805 of rotatable input mechanism 806
causes further navigation (e.g., "flipping through") of stack 814,
as third stack item 826 flips away from the top of stack 814,
revealing fourth stack item 830 (e.g., a stack item corresponding
to a "Workout" album). In some embodiments, as shown in FIG. 8J,
additional stack items 828 beneath fourth stack item 830 are at
least partially visible. Further, electronic device 800 continually
updates display of scroll bar 812 to correspond with the continued
navigation (e.g., "flipping through") of stack 814. In some
embodiments, because the scrolling 805 of rotatable input mechanism
806 corresponds to an upwards scrolling of navigation user
interface 804, scroll bar 812 of FIG. 8J indicates that navigation
user interface 804 is further scrolled in the upwards direction
relative to scroll bar 812 of FIG. 8I.
[0269] In FIG. 8K, electronic device 800 continues to receive
scrolling 805 of rotatable input mechanism 806 in the same
counter-clockwise direction. The continued scrolling 805 of
rotatable input mechanism 806 continues to cause navigation (e.g.,
"flipping through") of stack 814, as third stack item 826 flips
further away from the top of stack 814, and fourth stack item 830
also flips away from the top of stack 814, revealing fifth stack
item 832 (e.g., a stack item corresponding to a "Sunday Chill List"
album). Further, electronic device 800 continually updates display
of scroll bar 812 to correspond with the continued navigation
(e.g., "flipping through") of stack 814. In some embodiments,
because the scrolling 805 of rotatable input mechanism 806
corresponds to an upwards scrolling of navigation user interface
804, scroll bar 812 of FIG. 8K indicates that navigation user
interface 804 is further scrolled in the upwards direction relative
to scroll bar 812 of FIG. 8J.
[0270] In FIG. 8L, electronic device 800 continues to receive more
scrolling 805 of rotatable input mechanism 806 in the same
counter-clockwise direction. The additional continued scrolling 805
of rotatable input mechanism 806 causes further navigation (e.g.,
continued "flipping through") of stack 814, as third stack item 826
is completely flipped away and to the back of stack 814, fourth
stack item 830 further flips away from the top of stack 814, and
fifth stack item 832 also flips away from the top of stack 814,
revealing
sixth stack item 834 (e.g., a stack item corresponding to a "Road
Trip Favorites" album). In some embodiments, sixth stack item 834
is the sixth and final stack item in stack 814 (e.g., if first stack
item 816 is considered the "first" stack item of stack 814, sixth
stack item 834 is the "last" stack item of stack 814). Further,
electronic device 800 continually updates display of scroll bar 812
to correspond with the continued navigation (e.g., "flipping
through") of stack 814. In some embodiments, because the scrolling
805 of rotatable input mechanism 806 corresponds to an upwards
scrolling of navigation user interface 804, scroll bar 812 of FIG.
8L indicates that navigation user interface 804 is further scrolled
in the upwards direction relative to scroll bar 812 of FIG. 8K.
[0271] In FIG. 8M, electronic device 800 continues to receive
scrolling 805 of rotatable input mechanism 806 in the same
counter-clockwise direction. The continued scrolling 805 of
rotatable input mechanism 806 causes continued navigation (e.g.,
continued "flipping through") of stack 814, as fourth stack item
830 is completely flipped away and to the back of stack 814, fifth
stack item 832 further flips away from the top of stack 814, and
sixth stack item 834 (e.g., the "last" stack item of the stack) is
increasingly revealed. In some embodiments, as shown in FIG. 8M,
the flipped stack items 836, which correspond to stack items that
have been "flipped" to the back of stack 814 via scrolling inputs
801, 803, and 805 (e.g., first stack item 816, second stack item
824, third stack item 826, fourth stack item 830) are partially
visible in the stack. Further, electronic device 800 continually
updates display of scroll bar 812 to correspond with the continued
navigation (e.g., "flipping through") of stack 814. In some
embodiments, because the scrolling 805 of rotatable input mechanism
806 corresponds to an upwards scrolling of navigation user
interface 804, scroll bar 812 of FIG. 8M indicates that navigation
user interface 804 is further scrolled in the upwards direction
relative to scroll bar 812 of FIG. 8L.
[0272] In FIG. 8N, in response to receiving scrolling 805 of
rotatable input mechanism 806, electronic device 800 displays sixth
stack item 834 in stack 814. In some embodiments, electronic device
800 recognizes that sixth stack item 834 is the "last" stack item
of stack 814. In some embodiments, electronic device 800 recognizes
sixth stack item 834 as being the "last" stack item based on first
stack item 816 having been the top-most stack item and sixth stack
item 834 having been the bottom-most stack item of stack 814 prior
to the device receiving scrolling 801. In some embodiments,
electronic device 800 recognizes sixth stack item 834 as being the
"last" stack item based on a predetermined (or pre-configured)
label of each stack item. In some embodiments, electronic device
800 recognizes sixth stack item 834 as being the "last" stack item
based on a structured ordering of the stack items (e.g.,
alphanumeric-based ordering based on a name, such as album name,
playlist name, artist name, folder name, or radio station name)
corresponding to the album (or playlist, artist, folder, radio
station) associated with the stack item.
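Paragraph [0272] lists several ways the device may recognize the "last" stack item: the item's original bottom-most position before any scrolling, a predetermined label, or a structured (e.g., alphanumeric) ordering by name. Two of these strategies can be sketched as follows; the function names are hypothetical and the sketch is illustrative only.

```python
def last_by_initial_order(initial_order: list) -> str:
    """Strategy 1: the item that was bottom-most before scrolling 801 was received."""
    return initial_order[-1]

def last_by_name(items: list) -> str:
    """Strategy 2: the final item under an alphanumeric ordering of item names."""
    return sorted(items)[-1]
```

Under strategy 2, which item is "last" depends only on the names, not on how the stack happened to be ordered when displayed.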
[0273] In FIG. 8O, electronic device 800 receives a user input
corresponding to further navigation of navigation user interface
804 in an upwards direction. For example, as shown in FIG. 8O, the
user input is a scrolling 807 of rotatable input mechanism 806 in
the counter-clockwise (or alternatively, clockwise) direction that
corresponds to a navigation (e.g., scrolling) of navigation user
interface 804 in an upwards direction. Because sixth stack item 834
is the "last" item of stack 814, scrolling 807 causes (instead of
continuing to "flip through" stack 814) navigation user interface
804 to transition from display of stack 814 to display of one or
more menu icons (e.g., library menu icon 838) of navigation user
interface 804. Further, electronic device 800 continually updates
display of scroll bar 812 to correspond with the navigation (e.g.,
scrolling) of navigation user interface 804 in the upwards
direction.
[0274] In FIG. 8P, electronic device 800 continues to
receive scrolling 807 of rotatable input mechanism 806 in the
counter-clockwise (or alternatively, clockwise) direction that
corresponds to a navigation (e.g., scrolling) of navigation user
interface 804 in an upwards direction until navigation user
interface 804 reaches the top of navigation user interface 804
(e.g., the opposite end of navigation user interface 804 from the
region displaying stack 814). Further upwards navigation (e.g.,
scrolling) of navigation user interface 804 causes display of
additional menu icons (e.g., search menu icon 840, now playing menu
icon 842, stations menu icon). Further, electronic device 800
continually updates display of scroll bar 812 to correspond with
the navigation (e.g., scrolling) of navigation user interface 804 in
the upwards direction until scroll bar 812 indicates that the user
interface has been navigated to the very top.
[0275] FIG. 8Q illustrates navigation user interface 804 navigated
to the top of the user interface and displaying the menu icons
region. In some embodiments, navigation user interface 804 includes
library menu icon 838, search menu icon 840, and now playing menu
icon 842. In some embodiments, in addition to or instead of library
menu icon 838, navigation user interface 804 includes a (radio)
stations icon. In some embodiments, at least a portion (e.g., a top
portion) of stack 814 is still visible in the menu icons region of
navigation user interface 804.
[0276] In some embodiments, in response to detecting user selection
of now playing menu icon 842, electronic device 800 displays (e.g.,
replaces navigation user interface 804 with) control user interface
888, which includes an indication of the currently-playing audio
item.
[0277] In some embodiments, in response to detecting user selection
of the stations icon, electronic device 800 displays a stations
list user interface that includes a list of available radio
stations by genre. In some embodiments, in response to detecting
user selection of a genre from the displayed list, electronic
device 800 displays a list of (radio) stations corresponding to the
selected genre. In some embodiments, in response to detecting user
selection of a station from the list of presented stations,
electronic device 800 displays control user interface 888 (which
includes an indication of the selected station) and causes audio
output of the selected station.
[0278] In some embodiments, in response to detecting user selection
of search menu icon 840, electronic device 800 displays a search
user interface that includes a dictate icon and a scribble icon. In
some embodiments, in response to detecting user selection of the
dictate icon, electronic device 800 enables a user to search for an
audio item using dictation. In some embodiments, in response to
detecting user selection of the scribble icon, electronic device
800 enables a user to search for an audio item using "scribble"
input (e.g., handwriting input) on touch-sensitive display 802 of
electronic device 800. In some embodiments, when a first letter
(e.g., "A") is entered in a search, electronic device 800 provides
selectable suggestions corresponding to the entered letter. In some
embodiments, once the user returns to the search function, the
search mode is automatically selected to be the previously-used
mode (e.g., dictation mode or scribble mode). In some embodiments,
electronic device 800 enables the user to configure a default
setting between the dictation mode and the scribble mode. In some
embodiments, when a search is performed (using either dictation
mode or scribble mode), electronic device 800 displays a list of
audio items which, when selected, causes the device to display
control user interface 888 and cause audio output of the selected
audio item.
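Paragraph [0278] describes the search function reopening in the previously-used mode (dictation or scribble), with a user-configurable default as a fallback. A minimal sketch of that selection logic follows; the class and method names are hypothetical, not part of the disclosure.

```python
class SearchSettings:
    """Remembers the most recently used search mode so it can be preselected."""

    def __init__(self, default_mode="dictation"):
        self.default_mode = default_mode  # user-configurable default
        self.last_used_mode = None        # set once a search mode is used

    def mode_on_open(self):
        # The previously-used mode wins; otherwise fall back to the default.
        return self.last_used_mode or self.default_mode

    def record_use(self, mode):
        self.last_used_mode = mode
```

On first use the configured default applies; thereafter the previously-used mode is preselected automatically.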
[0279] In FIG. 8R, electronic device 800 detects, via
touch-sensitive display 802 (or alternatively, a voice input via a
mic or rotation input via rotatable input mechanism 806), user
activation of library menu icon 838. For example, the user
activation is a tap gesture 809 on library menu icon 838. As shown
in FIG. 8S, in response to detecting tap gesture 809, electronic
device 800 displays (e.g., replaces display of navigation user
interface 804 with) a library user interface 844 that includes a
plurality of library icons. In some embodiments, library user
interface 844 includes a first library icon 846 (e.g.,
"Playlists"), a second library icon 848 (e.g., "Artists"), a third
library icon 850 (e.g., "Albums"), and a fourth library icon 852
(e.g., "Songs"). In some embodiments, in library user interface
844, electronic device 800 maintains display of selectable
indication 808. In some embodiments, in library user interface 844,
selectable indication 808 includes an indication that the
currently-displayed user interface is library user interface 844
(e.g., a textual indication stating "LIBRARY"). Further, in some
embodiments, in library user interface 844, user selection of
selectable indication 808 causes electronic device 800 to display
(e.g., replace display of library user interface 844 with)
navigation user interface 804.
[0280] In FIG. 8T, electronic device 800 detects, via
touch-sensitive display 802 (or alternatively, a voice input via a
mic or rotation input via rotatable input mechanism 806), user
activation of first library icon 846 (e.g., "Playlists"). For
example, as shown in FIG. 8T, the user activation is a tap gesture
811 on first library icon 846. As shown in FIG. 8U, in response to
detecting tap gesture 811, electronic device 800 displays (e.g.,
replaces display of library user interface 844 with) a playlists
user interface 854 that includes a plurality of playlist items
corresponding to the albums associated with the stack items in
stack 814. For example, playlist user interface 854 includes a
first playlist item 856 (e.g., "My New Music Mix") corresponding to
third stack item 826, a second playlist item 858 (e.g., "Workout")
corresponding to fourth stack item 830, a third playlist item 860
(e.g., "Sunday Chill List") corresponding to fifth stack item 832,
and a fourth playlist item 862 (e.g., "Roadtrip Favorites")
corresponding to sixth stack item 834. In some embodiments, in
playlist user interface 854, electronic device 800 maintains
display of selectable indication 808. In some embodiments, in
playlist user interface 854, selectable indication 808 includes an
indication that the currently-displayed user interface is playlist
user interface 854 (e.g., a textual indication stating
"PLAYLISTS"). Further, in some embodiments, in playlist user
interface 854, user selection of selectable indication 808 causes
electronic device 800 to display (e.g., replace display of playlist
user interface 854 with) library user interface 844.
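Paragraphs [0279] and [0280] describe selectable indication 808 acting as a back control: from playlist user interface 854 it returns to library user interface 844, and from there to navigation user interface 804. This hierarchy can be sketched as a simple navigation stack; the class and screen labels below are hypothetical, for illustration only.

```python
class NavigationStack:
    """Minimal back-navigation model: selecting the indication returns to the parent."""

    def __init__(self, root):
        self._screens = [root]

    @property
    def current(self):
        return self._screens[-1]

    def open(self, screen):
        self._screens.append(screen)

    def select_indication(self):
        """Pop back to the parent interface, as when tapping selectable indication 808."""
        if len(self._screens) > 1:
            self._screens.pop()
        return self.current

nav = NavigationStack("navigation")  # navigation user interface 804
nav.open("library")                  # library user interface 844
nav.open("playlists")                # playlists user interface 854
```

Each tap of the indication unwinds one level; at the root, further taps leave the interface unchanged.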
[0281] In FIG. 8V, electronic device 800 detects, via
touch-sensitive display 802 (or alternatively, a voice input via a
mic or rotation input via rotatable input mechanism 806), user
activation of second playlist item 858 (e.g., "Workout"). For
example, as shown in FIG. 8V, the user activation is a tap gesture
813 on second playlist item 858. As shown in FIG. 8W, in response
to detecting the user activation of second playlist item 858,
electronic device 800 displays (e.g., replaces display of playlist
user interface 854 with) a control user interface 888 that
corresponds to control user interface 622 described above with
reference to FIGS. 6C-6S. As with control user interface 622,
control user interface 888 includes display of an indication 864 of
the currently-playing audio item (e.g., "First WO Track") and an
indication 866 of an artist (e.g., "First WO Artist") corresponding
to the currently-playing audio item. In some embodiments,
electronic device 800 automatically causes audio output (e.g., via
internal speakers of electronic device 800 or via an external
device, such as headphones paired with electronic device 800, an
external speaker paired with electronic device 800, or speakers of
a different device, such as a smartphone, tablet, laptop computer,
or desktop computer, paired with electronic device 800) of a first
audio item associated with second playlist item 858 (which, in the
embodiment depicted in FIG. 8W, is "First WO Track"). In some
embodiments, in response to detecting the user activation of second
playlist item 858, electronic device displays (e.g., replaces
display of playlist user interface 854 with) a tracks user
interface (e.g., tracks user interface 878 depicted in FIG.
8AA).
[0282] FIG. 8X illustrates electronic device 800 displaying library
user interface 844 and, while displaying library user interface
844, detecting, via touch-sensitive display 802 (or alternatively,
a voice input via a mic or rotation input via rotatable input
mechanism 806), user activation of second library icon 848 (e.g.,
"Artists"). For example, the user activation is a tap gesture 815
on second library icon 848. As shown in FIG. 8Y, in response to
detecting tap gesture 815 on second library icon 848 (e.g.,
"Artists"), electronic device 800 displays (e.g., replaces display
of library user interface 844 with) an artists user interface 868
that includes a first artist icon 870 (e.g., displaying "First
Artist," an indication of the name of the first artist), a second
artist icon 872 (e.g., displaying "Second Artist," an indication of
the name of the second artist), a third artist icon 874 (e.g.,
displaying "Third Artist," an indication of the name of the third
artist), and a fourth artist icon 876 (e.g., displaying "Fourth
Artist," an indication of the name of the fourth artist). In some
embodiments, in artists user interface 868, electronic device 800
maintains display of selectable indication 808. In some
embodiments, in artists user interface 868, selectable indication
808 includes an indication that the currently-displayed user
interface is artists user interface 868 (e.g., a textual indication
stating "ARTISTS"). Further, in some embodiments, in artists user
interface 868, user selection of selectable indication 808 causes
electronic device 800 to display (e.g., replace display of artists
user interface 868 with) library user interface 844.
[0283] In FIG. 8Z, electronic device 800 detects, via
touch-sensitive display 802 (or alternatively, a voice input via a
mic or rotation input via rotatable input mechanism 806), user
activation of fourth artist icon 876 (e.g., "Fourth Artist"). For
example, as shown in FIG. 8Z, the user activation is a tap gesture
817 on fourth artist icon 876. As shown in FIG. 8AA, in response to
detecting the user activation of fourth artist icon 876, electronic
device 800 displays (e.g., replaces display of artist user
interface 868 with) a tracks user interface 878 that includes one
or more track items corresponding to audio items associated with
the fourth artist. For example, as shown in FIG. 8AA, tracks user
interface 878 includes a first track item 880 (e.g., displaying "FA
Track One," an indication of the name of the first track), a second
track item 882 (e.g., displaying "FA Track Two," an indication of
the name of the second track), a third track item 884 (e.g.,
displaying "FA Track Three," an indication of the name of the third
track), and a fourth track item 886 (e.g., displaying "FA Track
Four," an indication of the name of the fourth track). In some
embodiments, in tracks user interface 878, electronic device 800
maintains display of selectable indication 808. In some
embodiments, in tracks user interface 878, selectable indication
808 includes an indication that the currently-displayed user
interface is tracks user interface 878 (e.g., a textual indication
stating the name of the artist, such as "FOURTH ARTIST"). Further,
in some embodiments, in tracks user interface 878, user selection
of selectable indication 808 causes electronic device 800 to
display (e.g., replace display of tracks user interface 878 with)
artists user interface 868.
[0284] In FIG. 8AB, electronic device 800 detects, via
touch-sensitive display 802 (or alternatively, a voice input via a
mic or rotation input via rotatable input mechanism 806), user
activation of first track item 880 (e.g., "FA Track One"). For
example, as shown in FIG. 8AB, the user activation is a tap gesture
819 on first track item 880. As shown in FIG. 8AC, in response to
detecting tap gesture 819, electronic device 800 displays (e.g.,
replaces display of tracks user interface 878 with) control user
interface 888 that corresponds to control user interface 622
described above with reference to FIGS. 6C-6S. As with control user
interface 622, control user interface 888 includes indication 864
of the currently-playing audio item (e.g., "FA Track One") and
indication 866 of an artist (e.g., "Fourth Artist") corresponding
to the currently-playing audio item.
[0285] FIGS. 9A-9D are a flow diagram illustrating a method for
playing and managing audio items using an electronic device in
accordance with some embodiments. Method 900 is performed at a
device (e.g., 100, 300, 500) with a touch-sensitive display. Some
operations in method 900 are, optionally, combined, the order of
some operations is, optionally, changed, and some operations are,
optionally, omitted.
[0286] As described below, method 900 provides an intuitive way for
playing and managing audio items. The method reduces the cognitive
burden on a user for playing and managing audio items, thereby
creating a more efficient human-machine interface. For
battery-operated computing devices, enabling a user to play and
manage audio items faster and more efficiently conserves power and
increases the time between battery charges.
[0287] At block 902, the electronic device (e.g., 800) displays, on
the display (e.g., 802), an ordered stack (e.g., 814) of audio
playlist items in a first position, where the ordered stack of
audio playlist items includes a first item (e.g., 816), a second
item, and a third item, and where the first item is displayed in
the first position. In some examples, the ordered stack of audio
playlist items (e.g., 814) is a stack of records. In some examples,
the ordered stack of audio playlist items (e.g., 814) is a stack of
audio tracks. In some examples, the ordered stack of audio playlist
items (e.g., 814) is a collection of titles or albums. In some
examples, the ordered stack of audio playlist items (e.g., 814) is
a collection of radio items, news items, or podcasts. In some
examples, the ordered stack of audio playlist items (e.g., 814) is
a collection of audio recordings.
[0288] Displaying the ordered stack of audio playlist items
provides a visual feedback that enables the user to visualize (and
thus experience the sensation of) flipping through a real stack of
playlist items. Providing improved visual feedback to the user
enhances the operability of the device and makes the user-device
interface more efficient (e.g., by helping the user to provide
proper inputs and reducing user mistakes when operating/interacting
with the device) which, additionally, reduces power usage and
improves battery life of the device by enabling the user to use the
device more quickly and efficiently.
[0289] In some embodiments, at block 904, while displaying, on the
display (e.g., 802), the ordered stack of audio playlist items
(e.g., 814) in the first position, the electronic device (e.g.,
800) receives user selection of the first item. In some
embodiments, at block 906, in response to receiving the user
selection of the first item, the electronic device (e.g., 800)
displays, on the display (e.g., 802), a control user interface
(e.g., 888) (e.g., a main user interface of a music application),
where the control user interface includes an indication of a first
audio item (e.g., a first track of the selected playlist)
associated with the first item. In some embodiments, at block 908,
the electronic device (e.g., 800) causes audio output of the first
audio item. For example, the electronic device (e.g., 800) causes
the audio output of the first audio item via internal speakers of
the electronic device or via an external device, such as headphones
paired with the electronic device, an external speaker paired with
the electronic device, or the speakers of a different device (e.g.,
a smartphone, tablet, laptop computer, or desktop computer) paired
with the electronic device.
[0290] At block 910, the electronic device (e.g., 800) receives a
first input (e.g., 801, 803, 805) in a first direction. In some
embodiments, the electronic device (e.g., 800) includes a rotatable
input mechanism (e.g., 806) (e.g., a physical rotatable crown of
the electronic device for navigating the display of the electronic
device), and the first input is based on a movement of the
rotatable input mechanism in the first direction (which can either
be rotation in a clockwise direction or in a counter-clockwise
direction), and the second input is based on (additional)
continued movement of the rotatable input mechanism in the first
direction. In some embodiments, the first input is a gesture on the
touch-sensitive display (e.g., 802) corresponding to a request to
scroll in the first direction (e.g., an upwards direction or a
downwards direction), and the second input is an additional finger
scroll in the first direction.
[0291] Having the first input be based on the movement of the
rotatable input mechanism in the first direction and the second
input be based on the continued movement of the rotatable input
mechanism in the first direction provides an improved input
technique for navigating a user interface that seamlessly
transitions from one navigation mode to a different navigation mode.
Providing a seamless transition between different navigation modes
within a user interface using the same input technique and reducing
the number of inputs needed to perform the technique enhances the
operability of the device and makes the user-device interface more
efficient (e.g., by helping the user to provide proper inputs and
reducing user mistakes when operating/interacting with the device)
which, additionally, reduces power usage and improves battery life
of the device by enabling the user to use the device more quickly
and efficiently.
[0292] At block 912, in response to receiving the first input, the
electronic device (e.g., 800) displays, on the display (e.g., 802),
the ordered stack of audio playlist items (e.g., 814) in a second
position, where the second item is displayed in the second
position.
[0293] At block 914, the electronic device (e.g., 800) receives a
second (additional) input in the first direction.
[0294] At block 916, in response to receiving the second input, the
electronic device (e.g., 800), at block 918, in accordance with a
determination that the second item is a terminal item (e.g., the
first item in the stack or the last item in the stack) in the
ordered stack of audio playlist items (e.g., 814), displays, on the
display (e.g., 802), at least one menu affordance of a plurality of
menu affordances, and, at block 920, in accordance with a
determination that the second item is an intermediate item (e.g.,
any item in the stack that is not the first or the last item) in
the ordered stack of audio playlist items (e.g., 814), displays, on
the display (e.g., 802), the ordered stack of audio playlist items
in a third position, where the third item is displayed in the third
position. In some examples, the plurality of menu affordances are a
plurality of quick access menus (e.g., 842, 840, 838), such as a
"now playing" menu, a "search" menu, and a "library" menu.
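The branch in blocks 916-920 can be sketched as a single scroll handler: a further input in the same direction either reveals the menu affordances (when the displayed item is a terminal item) or advances the stack (when it is an intermediate item). The sketch below handles only the last-item case of a terminal item, for brevity; the function name and return values are hypothetical.

```python
def handle_scroll(stack, top_index):
    """Sketch of blocks 916-920: decide what a further scroll in the same
    direction displays, given the index of the currently-displayed top item."""
    if top_index == len(stack) - 1:
        # Terminal item: transition to the menu affordances region.
        return ("show_menus", top_index)
    # Intermediate item: flip to the next stack item.
    return ("show_stack", top_index + 1)
```

The same input thus drives both navigation modes, with the mode switch determined solely by the position of the displayed item in the ordered stack.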
[0295] Displaying at least one menu affordance of the plurality of
menu affordances in accordance with a determination that the second
item is a terminal item and displaying the ordered stack of audio
playlist items in a third position in accordance with a
determination that the second item is an intermediate item allows
for the user to seamlessly transition between viewing the plurality
of menu affordances and viewing the ordered stack of audio playlist
items without providing additional input or providing a request to
change navigation modes. Performing an operation when a set of
conditions has been met without requiring further user input
enhances the operability of the device and makes the user-device
interface more efficient (e.g., by helping the user to provide
proper inputs and reducing user mistakes when operating/interacting
with the device) which, additionally, reduces power usage and
improves battery life of the device by enabling the user to use the
device more quickly and efficiently.
[0296] In some embodiments, the terminal item (e.g., an audio
playlist item that is in the first or last position of the ordered
stack) is the first of the audio playlist items in the ordered
stack of audio playlist items (e.g., 814). In some examples, the
terminal item (e.g., an audio playlist item that is in the first or
last position of the ordered stack) is the last of the audio
playlist items in the ordered stack of audio playlist items (e.g.,
814). In some examples, the intermediate item (e.g., an audio
playlist item that is not in the first or last position of the
ordered stack) is located between a first of the audio playlist
items in the ordered stack of audio playlist items (e.g., 814) and
a last of the audio playlist items in the ordered stack of audio
playlist items.
[0297] In some embodiments, at block 922, while displaying the at
least one menu affordance of the plurality of menu affordances
(e.g., 842, 840, 838) (e.g., a "now playing" menu, a "search" menu,
a "library" menu), the electronic device (e.g., 800) receives user
selection of a first menu affordance (e.g., the "library" menu) of
the plurality of menu affordances. In some embodiments, at block
924, in response to receiving the user selection of the first menu
affordance (e.g., the "library" menu), the electronic device (e.g.,
800) displays, on the display, a plurality of audio group
affordances (e.g., a list of available playlists, artists, albums,
songs). In some embodiments, at block 926, the electronic device
receives user selection (e.g., 811) of a first audio group
affordance (e.g., 846) (e.g., playlists) of the plurality of
audio group affordances. In some embodiments, at block 928, in
response to receiving the user selection of the first audio group
affordance (e.g., playlists), the electronic device displays, on
the display, one or more audio group items (e.g., 856, 858, 860,
862) (e.g., one or more playlists) associated with the first audio
group affordance. In some embodiments, at block 930, the electronic
device receives user selection (e.g., 813) of a first audio group
item (e.g., 858) (e.g., a particular playlist) of the one or more
audio group items. In some embodiments, at block 932, in response
to receiving the user selection of the first audio group item, the
electronic device (e.g., 800), at block 934, displays, on the
display, a control user interface (e.g., 888) (e.g., a control
music interface of a music application), where the control user
interface includes an indication of a first audio item (e.g., 864)
(e.g., a song associated with the selected playlist) of the first
audio group item, and, at block 936, causes audio output of the
first audio item. For example, the electronic device causes the
audio output of the first audio item via internal speakers of the
electronic device or via an external device, such as headphones
paired with the electronic device, an external speaker paired with
the electronic device, or the speakers of a different device (e.g.,
a smartphone, tablet, laptop computer, or desktop computer) paired
with the electronic device.
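The navigation described in blocks 922-936 can be summarized as a small hierarchy: a menu affordance exposes audio group affordances, a group exposes its items, and selecting an item opens the control user interface and starts playback of the first audio item. The following is a hypothetical sketch of that flow; the data and function names are illustrative only and are not part of the disclosure.

```python
# Hypothetical sketch of blocks 922-936: menu -> audio groups -> group
# items -> control user interface with playback. Names are illustrative.

library = {"playlists": {"Relax": ["FA Track One", "FA Track Two"]}}

def select_audio_group_item(group, item):
    # Selecting an audio group item displays the control user interface
    # and causes audio output of the first audio item of that group item.
    first_audio_item = library[group][item][0]
    return {"interface": "control", "now_playing": first_audio_item}
```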
[0298] In some embodiments, at block 938, while displaying the at
least one menu affordance of the plurality of menu affordances
(e.g., 842, 840, 838) (e.g., a "now playing" menu, a "search" menu,
a "library" menu), the electronic device (e.g., 800) receives user
selection (e.g., 809) of a first menu affordance (e.g., 838) (e.g.,
the "library" menu) of the plurality of menu affordances. In some
embodiments, at block 940, in response to receiving the user
selection of the first menu affordance (e.g., the "library" menu),
the electronic device (e.g., 800) displays, on the display, a
plurality of audio group affordances (e.g., 846, 848, 850, 852)
(e.g., a list of available playlists, artists, albums, songs). In
some embodiments, at block 942, the electronic device (e.g., 800)
receives user selection (e.g., 815) of a first audio group
affordance (e.g., 848) (e.g., artists) of the plurality of audio
group affordances. In some embodiments, at block 944, in response to
receiving the user selection of the first audio group affordance
(e.g., artists), the electronic device displays, on the display,
one or more audio group items (e.g., one or more different artists)
associated with the first audio group affordance.
[0299] In some embodiments, at block 946, the electronic device
(e.g., 800) receives user selection (e.g., 817) of a first audio
group item (e.g., 876) (e.g., a particular artist) of the one or
more audio group items. In some embodiments, at block 948, in
response to receiving the user selection of the first audio group
item, the electronic device (e.g., 800) displays, on the display
(e.g., 802), one or more audio group sub-items (e.g., albums
associated with the selected artist) associated with the first
audio group item. In some embodiments, at block 950, the electronic
device (e.g., 800) receives user selection of a first audio
sub-group item (e.g., a particular album associated with the
selected artist). In some embodiments, at block 952, in response to
receiving the user selection of the first audio sub-group item
(e.g., the particular album), the electronic device (e.g., 800), at
block 954, displays, on the display, a control user interface
(e.g., 888) (e.g., a control music interface of a music
application), where the control user interface includes an
indication of a first audio item (e.g., a song associated with the
selected album) of the first audio sub-group item, and, at block
956, causes audio output of the first audio item. For example, the
electronic device causes the audio output of the first audio item
via internal speakers of the electronic device or via an external
device, such as headphones paired with the electronic device, an
external speaker paired with the electronic device, or the speakers
of a different device (e.g., a smartphone, tablet, laptop computer,
or desktop computer) paired with the electronic device.
[0300] Note that details of the processes described above with
respect to method 900 (e.g., FIGS. 9A-9D) are also applicable in an
analogous manner to the methods described above and below. For
example, methods 700, 1100, and 1300 optionally include one or more
of the characteristics of the various methods described above with
reference to method 900. For example, the method of navigating and
selecting an audio item to play described in method 700 can be used
to select audio items to be played through electronic device 800.
For another example, the method of quickly and efficiently
switching between user interfaces of active applications described
in method 1100 can be used to switch amongst active applications on
electronic device 800. For another example, the method of updating
data associated with audio files using a different device as
described in method 1300 can be used to update locally stored data
on electronic device 800. For brevity, these details are not
repeated below.
[0301] FIGS. 10A-10H illustrate exemplary user interfaces for
playing and managing audio items, in accordance with some
embodiments. The user interfaces in these figures are used to
illustrate the processes described below, including the processes
in FIGS. 11A-11B.
[0302] FIG. 10A illustrates the face of an electronic device 1000
(e.g., portable multifunction device 100, device 300, or device
500). In this non-limiting exemplary embodiment depicted in FIGS.
10A-10H, electronic device 1000 is a smartwatch. In other
embodiments, electronic device 1000 can be a different type of
electronic device, such as a different type of wearable device or a
smartphone.
[0303] As shown in FIG. 10A, electronic device 1000 displays, on
touch-sensitive display 1002, a control user interface 1004 of an
application (e.g., music application) for playing audio items
(e.g., songs, podcasts, radio channels). Control user interface
1004 of FIG. 10A corresponds to control user interface 622
described with reference to FIGS. 6C-6S. In addition, control user
interface 1004 includes a quick access menu corresponding to an
application (e.g., a workout application, a music application) that
is active on electronic device 1000. For example, quick access menu
1010 can correspond to any application that can run on electronic device
1000. In some embodiments, quick access menu 1010 is displayed
adjacent to time indication 1012 at a first location (e.g., a
top-right corner region) on display 1002.
[0304] In some embodiments, the quick access menu can correspond to a
music application. In some embodiments, a music application is
active on electronic device 1000 if the device is causing (e.g.,
via internal speakers or via an external device, such as headphones
paired with the device, an external speaker paired with the device,
a different device, such as a smartphone, tablet, laptop computer,
or desktop computer, paired with the device) audio output of an
audio item (e.g., a song, a media item, live radio, podcast) using
the music application, even when the application itself is not
currently being displayed on electronic device 1000. In some
embodiments, a music application is also deemed active when the
application is pausing audio output of an audio item (e.g., a song,
a media item, live radio, podcast), even when the application
itself is not currently being displayed on electronic device 1000.
In some embodiments, a music application is inactive when the
application has stopped (instead of paused) causing audio output of
an audio item (e.g., a song, a media item, live radio,
podcast).
[0305] In some embodiments, the quick access menu can correspond to a
workout application. In some embodiments, a workout application is
active when a workout routine (e.g., distance traveled, time
traveled, steps taken, distance remaining to goal, time remaining
to goal) is enabled and running on the application, even when the
application itself is not currently being displayed on electronic
device 1000. In some embodiments, a workout application is inactive
when no workout tracking features (e.g., distance traveled, time
traveled, steps taken, distance remaining to goal, time remaining
to goal) are enabled on the application.
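The active/inactive conditions in paragraphs [0304] and [0305] can be captured as simple predicates: the music application is active while playing or paused (but not stopped), and the workout application is active while any tracking feature is enabled. The sketch below is hypothetical; the class and attribute names are illustrative only.

```python
# Hypothetical sketch of the "active application" rules in [0304]-[0305].
# Names and structure are illustrative, not part of the disclosure.

class MusicApp:
    def __init__(self):
        self.playback_state = "stopped"  # "playing", "paused", or "stopped"

    def is_active(self):
        # Playing or paused audio keeps the application active, even when
        # its interface is not currently displayed; stopped does not.
        return self.playback_state in ("playing", "paused")

class WorkoutApp:
    def __init__(self):
        self.tracking_features = set()  # e.g. {"distance", "steps"}

    def is_active(self):
        # Active while at least one workout tracking feature is enabled.
        return len(self.tracking_features) > 0
```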
[0306] As mentioned above, FIG. 10A illustrates control user
interface 1004 of a music application. In FIG. 10A, electronic
device 1000 is causing (e.g., via internal speakers of electronic
device 1000 or via an external device, such as headphones paired
with electronic device 1000, an external speaker paired with
electronic device 1000, or speakers of a different device, such as
a smartphone, tablet, laptop computer, or desktop computer, paired
with electronic device 1000) audio output of, through the music
application, a song titled "FA Track One." Thus, the music
application is active on electronic device 1000 of FIG. 10A. In
addition, a workout routine is also running on electronic device
1000, and thus a workout application is also active on the device.
As such, FIG. 10A further illustrates electronic device 1000
displaying, on control user interface 1004 corresponding to the
music application, quick access menu 1010 corresponding to a
workout application.
[0307] As shown in FIG. 10B, electronic device 1000 detects, via
touch-sensitive display 1002 (or alternatively, a voice input via a
mic), user activation of quick access menu 1010A corresponding to
the workout application. For example, the user activation is a tap
gesture 1001 on quick access menu 1010A corresponding to the workout
application on control user interface 1004.
[0308] In FIG. 10C, in response to detecting tap gesture 1001 on
quick access menu 1010A corresponding to the workout application on
control user interface 1004, electronic device 1000 displays (e.g.,
replaces display of control user interface 1004 of the music
application with) a main workout user interface 1014 of the workout
application. In some embodiments, main workout user interface 1014
of the workout application includes representations of activity
monitoring data, including, for example, active calorie data 1016,
total calorie data 1018, bpm data 1020, and distance travelled data
1022, corresponding to the active workout routine.
[0309] While electronic device 1000 is now displaying main workout
user interface 1014 of the workout application, the music
application is still causing audio output of "FA Track One." Thus,
the music application remains active on electronic device 1000. As
such, in addition to displaying main workout user interface 1014,
electronic device 1000 displays a quick access menu 1010B
corresponding to the music application at the same location on
display 1002 as the previous display of quick access menu 1010A
corresponding to the workout application.
[0310] As shown in FIG. 10D, while the music application and the
workout application remain active (e.g., the music application
continues to cause audio output of tracks, the workout application
continues to run the workout routine) on electronic device 1000,
electronic device 1000 detects, via touch-sensitive display 1002
(or alternatively, a voice input via a mic), user activation of
quick access menu 1010B corresponding to the music application. For
example, as shown in FIG. 10D, the user activation is a tap gesture
1003 on quick access menu 1010B corresponding to the music
application.
[0311] In FIG. 10E, in response to detecting tap gesture 1003 on
quick access menu 1010B corresponding to the music application,
electronic device 1000 displays (e.g., replaces display of main
workout user interface 1014 corresponding to the workout
application with) control user interface 1004 corresponding to the
music application. Further, while displaying control user interface
1004, the workout routine remains running on the device. As such,
in addition to displaying control user interface 1004, electronic
device 1000 displays quick access menu 1010A corresponding to the
workout application at the same location on display 1002 as the
previous display of quick access menu 1010B corresponding to the
music application.
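The back-and-forth behavior of FIGS. 10B-10E amounts to an exchange: activating the quick access menu displays the application the menu represents, and the menu then represents the application that was just displayed. A hypothetical sketch of this symmetry follows; the state keys are illustrative only.

```python
# Hypothetical sketch of the interface swap in FIGS. 10B-10E: the
# displayed application and the quick access menu's application trade
# places on each activation. Names are illustrative.

def activate_quick_access_menu(state):
    # state maps "displayed" and "menu" to application identifiers.
    state["displayed"], state["menu"] = state["menu"], state["displayed"]
    return state
```

Activating the menu twice therefore returns the user to the original interface, which is the quick transition the paragraphs above describe.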
[0312] FIG. 10F illustrates electronic device 1000 displaying a
time user interface 1016. Time user interface 1016 includes an
indication 1018 of the current time. In some embodiments, the
device is in a locked mode (e.g., in a user interface locked state)
while displaying time user interface 1016. In some embodiments, the
device is in an unlocked mode (e.g., in a user interface unlocked
state) while displaying time user interface 1016.
[0313] In FIG. 10F, electronic device 1000 is causing (e.g., via
internal speakers of electronic device 1000 or via an external
device, such as headphones paired with electronic device 1000, an
external speaker paired with electronic device 1000, or speakers of
a different device, such as a smartphone, tablet, laptop computer,
or desktop computer, paired with electronic device 1000) audio
output of a track (e.g., "FA Track One") through the music
application, but no workout routine is in progress. Thus, in FIG.
10F, the music application is active but the workout application is
inactive on the device.
[0314] Because the music application is active on the device,
electronic device 1000 also displays, at a second location (e.g.,
top-center region) of display 1002 different from the first
location (e.g., top-right corner region), quick access menu 1010B
corresponding to the music application. Subsequent to displaying
quick access menu 1010B on time user interface 1016 at the second
location, in FIG. 10G, electronic device 1000 detects, via
touch-sensitive display 1002 (or alternatively, a voice input via a
mic), user activation of quick access menu 1010B corresponding to
the music application. For example, the user activation is a tap
gesture 1005 on quick access menu 1010B corresponding to the music
application on time user interface 1016.
[0315] In some embodiments, if electronic device 1000 is in a
locked mode (e.g., in a user interface locked state) when tap
gesture 1005 is detected, tap gesture 1005 also causes the device
to convert from the locked mode to an unlocked mode (e.g., a user
interface unlocked state). In some embodiments, if electronic
device 1000 is in a locked mode (e.g., in a user interface locked
state) when tap gesture 1005 is detected, tap gesture 1005 causes
display of control user interface 1004 of the music application but
does not cause the device to convert from the locked mode to an
unlocked mode.
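Paragraph [0315] describes two embodiments that differ only in whether activating the quick access menu also unlocks the device; in both, the control user interface is displayed. The sketch below models this as a hypothetical configuration flag; the names are illustrative only.

```python
# Hypothetical sketch of the two locked-mode behaviors in [0315]:
# the tap always displays the control user interface, and unlocking
# on activation is embodiment-dependent. Names are illustrative.

def handle_quick_access_tap(device, unlock_on_activation):
    device["displayed"] = "control_user_interface"
    if device["locked"] and unlock_on_activation:
        # First embodiment: the tap also converts locked -> unlocked.
        device["locked"] = False
    # Second embodiment: the device remains locked while displaying
    # the control user interface.
    return device
```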
[0316] As shown in FIG. 10H, in response to detecting tap gesture
1005, electronic device 1000 displays (e.g., replaces display of
time user interface 1016 with) control user interface 1004 of the
music application. As also shown in FIG. 10H, because a workout
routine is not currently active on electronic device 1000, control
user interface 1004 does not include display of quick access menu
1010A corresponding to the workout application. Electronic device
1000 continues to cause (e.g., via internal speakers of electronic
device 1000 or via an external device, such as headphones paired
with electronic device 1000, an external speaker paired with
electronic device 1000, or speakers of a different device, such as
a smartphone, tablet, laptop computer, or desktop computer, paired
with electronic device 1000) audio output of the currently-playing
track (e.g., "FA Track One").
[0317] FIGS. 11A-11B are a flow diagram illustrating a method for
playing and managing audio items using an electronic device in
accordance with some embodiments. Method 1100 is performed at a
device (e.g., 100, 300, 500) with a touch-sensitive display. Some
operations in method 1100 are, optionally, combined, the order of
some operations is, optionally, changed, and some operations are,
optionally, omitted.
[0318] As described below, method 1100 provides an intuitive way
for playing and managing music. The method reduces the cognitive
burden on a user for playing and managing music, thereby creating a
more efficient human-machine interface. For battery-operated
computing devices, enabling a user to play and manage music faster
and more efficiently conserves power and increases the time between
battery charges.
[0319] At block 1102, the electronic device (e.g., 1000) receives
user input initiating a first application while a second
application different from the first application is active (e.g.,
running in the background) on the electronic device. In some
examples, the first application is a music application, a video
application, or a media application, and the second application is
a workout application, an exercise application, or a health
monitoring application (or vice versa).
[0320] In some embodiments, the first application is a music
application, and the first application is active when the
electronic device (e.g., 1000) is causing audio output of an audio
item (e.g., a song, a media item, live radio, podcast) associated
with the first application. For example, the electronic device
(e.g., 1000) causes the audio output of an audio item via internal
speakers of the electronic device or via an external device, such
as headphones paired with the electronic device, an external
speaker paired with the electronic device, or the speakers of a
different device (e.g., a smartphone, tablet, laptop computer, or
desktop computer) paired with the electronic device.
[0321] In some embodiments, the first application remains active
when the first application causes the electronic device (e.g.,
1000) to pause causing of the audio output of the audio item. In
some examples, the first application causes the electronic device
(e.g., 1000) to pause in response to receiving user input on a
"pause" affordance of the music application to pause playing of the
audio item. In some embodiments, the first application is inactive
when the first application causes the electronic device (e.g.,
1000) to stop causing audio output of the audio item. In some
examples, the first application causes the electronic device (e.g.,
1000) to stop causing audio output of the audio item in response to
receiving user input on a "stop" affordance of the music
application to stop playing of the audio item.
[0322] In some embodiments, the second application is a workout
application, and the second application is active when a workout
routine (e.g., distance traveled, time traveled, steps taken,
distance remaining to goal, time remaining to goal) is enabled on
the second application.
[0323] In some embodiments, the second application is inactive when
all workout tracking features (e.g., distance traveled, time
traveled, steps taken, distance remaining to goal, time remaining
to goal) are disabled on the second application.
[0324] At block 1104, the electronic device (e.g., 1000) displays,
on the display, a first user interface (e.g., 1004) associated with
the first application and a first affordance (e.g., 1010A)
associated with the second application. In some examples, the first
user interface (e.g., 1004) is the home screen of a music
application. In some examples, the first user interface (e.g.,
1004) is the home screen of a video player application. In some
examples, the first user interface (e.g., 1004) is the home screen
of a media application. In some examples, the first affordance
(e.g., 1010A) is a mini-icon for a workout application. In some
examples, the first affordance (e.g., 1010A) is a mini-icon for an
exercise application. In some examples, the first affordance (e.g.,
1010A) is a mini-icon for a health monitoring application.
[0325] Displaying the first user interface associated with the
first application and the first affordance associated with the
second application provides the user with a user interface that
allows the user to, while viewing the first user interface of the
first application, recognize that a second application is active on
the device and that a selection of the first affordance can cause
display of the second application. Providing additional control
options without cluttering the UI with additional displayed
controls enhances the operability of the device and makes the
user-device interface more efficient (e.g., by helping the user to
provide proper inputs and reducing user mistakes when
operating/interacting with the device) which, additionally, reduces
power usage and improves battery life of the device by enabling the
user to use the device more quickly and efficiently.
[0326] At block 1106, the electronic device (e.g., 1000) receives
user selection (e.g., 1001) (e.g., a tap, a detectable touch
gesture) of the first affordance.
[0327] At block 1108, in response to receiving the user selection
(e.g., 1001) of the first affordance (e.g., 1010A), the electronic
device (e.g., 1000), at block 1110, replaces display of the first
user interface (e.g., 1004) with display of a second user interface
(e.g., 1014) associated with the second application, where the
first application remains active on the electronic device, and, at
block 1112, replaces display of the first affordance (e.g., 1010A)
with display of a second affordance (e.g., 1010B) associated with
the first application. In some examples, the second user interface
(e.g., 1014) is the home screen of a workout application. In some
examples, the second user interface (e.g., 1014) is the home screen
of an exercise application. In some examples, the second user
interface (e.g., 1014) is the home screen of a health monitoring
application. In some examples, the second affordance (e.g., 1010B)
is a mini-icon for a music application. In some examples, the
second affordance (e.g., 1010B) is a mini-icon for a video player
application. In some examples, the second affordance (e.g., 1010B)
is a mini-icon for a media application.
[0328] Replacing display of the first user interface with display
of the second user interface associated with the second
application, where the first application remains active on the
device, and replacing display of the first affordance with display
of the second affordance associated with the first application in
response to receiving the user selection of the first affordance
allows a user to quickly and easily transition back and forth
between viewing one active application (e.g., the first
application) and viewing another active application (e.g., the
second application). Reducing the number of inputs needed to
perform an operation enhances the operability of the device and
makes the user-device interface more efficient (e.g., by helping
the user to provide proper inputs and reducing user mistakes when
operating/interacting with the device) which, additionally, reduces
power usage and improves battery life of the device by enabling the
user to use the device more quickly and efficiently.
[0329] In some embodiments, at block 1114, the electronic device
(e.g., 1000) determines that the second application is no longer
active. In some examples, the electronic device (e.g., 1000)
determines that the second application is no longer active based on
user input ending or canceling an ongoing workout session. In some
examples, the electronic device determines that the second
application is no longer active based on expiration of an ongoing
workout session. In some examples, the electronic device determines
that the second application is no longer active based on user input
stopping play of an audio item. In some examples, the electronic
device determines that the second application is no longer active
based on all tracks of a playlist having been played. In some
embodiments, at block 1116, the electronic device (e.g., 1000)
receives user selection of the second affordance (e.g., 1010B)
(e.g., a quick access menu associated with the music application)
associated with the first application. In some embodiments, at
block 1118, in response to receiving the user selection of the
second affordance (e.g., 1010B), the electronic device (e.g.,
1000), at block 1120 replaces display of the second user interface
(e.g., 1014) (e.g., of the workout application) with display of the
first user interface (e.g., 1004) (e.g., of the music application)
associated with the first application without replacing display of
the second affordance (e.g., 1010B) with the first affordance
(e.g., 1010A), and, at block 1122, ceases to display the first
affordance. For example, the electronic device replaces display of
the second user interface with display of the first user interface
associated with the first application without replacing display of
the second affordance with the first affordance and ceases to
display the first affordance because the second application is no
longer active.
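Blocks 1108-1112 and 1114-1122 differ in one respect: when the previously displayed application remains active, its affordance replaces the selected one; when it is no longer active, no replacement affordance is displayed. A single hypothetical sketch can cover both cases; the parameter and key names are illustrative only.

```python
# Hypothetical sketch of blocks 1108-1122: selecting an affordance
# displays that application's interface; an affordance for the
# previously displayed application appears only if it is still active.
# Names are illustrative, not part of the disclosure.

def select_affordance(screen, other_app_active):
    # screen holds the displayed application and the affordance shown.
    displayed, affordance = screen["displayed"], screen["affordance"]
    screen["displayed"] = affordance
    # Blocks 1110-1112 (active) vs. blocks 1120-1122 (ceased display).
    screen["affordance"] = displayed if other_app_active else None
    return screen
```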
[0330] Replacing display of the second user interface with display
of the first user interface associated with the first application
without replacing display of the second affordance with the first
affordance and ceasing to display the first affordance in response
to receiving the user selection of the second affordance provides a
visual feedback indicating to the user that the second application
is no longer active on the device. Providing improved feedback
enhances the operability of the device and makes the user-device
interface more efficient (e.g., by helping the user to provide
proper inputs and reducing user mistakes when operating/interacting
with the device) which, additionally, reduces power usage and
improves battery life of the device by enabling the user to use the
device more quickly and efficiently.
[0331] In some embodiments, at block 1124, the electronic device
(e.g., 1000) displays, on the display (e.g., 1002), a home user
interface (e.g., 1016) (e.g., a main user interface of the device,
a time user interface of the device), where the home user interface
includes the second affordance (e.g., 1010B) (e.g., a quick access
menu associated with the music application) associated with the
first application (e.g., a music application) at a first location
of the display different from a second location of the display. In
some examples, the first location is a top-center region of the
display (e.g., 1002), which can be a portion of the display that is
more readily visible to a user. In some examples, the second location is a
top-corner region of the display (e.g., 1002), which can allow for
less interference with other elements of the displayed
interface.
[0332] Displaying the second affordance associated with the first
application on the home user interface provides a visual indication
to the user while the user is viewing the home user interface that
the first application is active on the device. Providing improved
feedback enhances the operability of the device and makes the
user-device interface more efficient (e.g., by helping the user to
provide proper inputs and reducing user mistakes when
operating/interacting with the device) which, additionally, reduces
power usage and improves battery life of the device by enabling the
user to use the device more quickly and efficiently.
[0333] In some embodiments, at block 1126, the electronic device
(e.g., 1000) receives user selection (e.g., 1005) of the second
affordance (e.g., 1010B) associated with the first application. In
some embodiments, at block 1128, in response to receiving the user
selection of the second affordance associated with the first
application, the electronic device (e.g., 1000), at block 1130,
replaces display of the home user interface (e.g., 1016) with
display of the first user interface (e.g., 1004) associated with
the first application, at block 1132, ceases to display the second
affordance (e.g., 1010B) (e.g., a quick access menu associated with
the music application) at the first location of the display, and,
at block 1134, displays the first affordance (e.g., 1010A) (e.g., a
quick access menu associated with the workout application)
associated with the second application at the second location of
the display.
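Blocks 1124-1134 describe a location change alongside the interface change: on the home user interface the second affordance occupies the first (e.g., top-center) location, and selecting it displays the first user interface with the first affordance at the second (e.g., top-corner) location. A hypothetical sketch follows; the region and identifier names are illustrative only.

```python
# Hypothetical sketch of blocks 1124-1134: selecting the music
# affordance on the home user interface opens the control interface,
# removes the affordance from the top-center location, and displays
# the workout affordance at the top-right location. Names illustrative.

def open_music_from_home(display):
    # display maps screen regions to what is currently shown there.
    display["screen"] = "control_user_interface"
    display["top_center"] = None           # cease second affordance (1132)
    display["top_right"] = "workout_menu"  # first affordance (1134)
    return display
```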
[0334] Note that details of the processes described above with
respect to method 1100 (e.g., FIGS. 11A-11B) are also applicable in
an analogous manner to the methods described above and below. For
example, methods 700, 900, and 1300 optionally include one or more
of the characteristics of the various methods described above with
reference to method 1100. For example, the method of navigating and
selecting an audio item to play described in method 700 can be used
to select audio items to be played through electronic device 1000.
For another example, the method of navigating the displayed stack
of stack items described in method 900 can be used to navigate
audio items and select audio items to be played through electronic
device 1000. For another example, the method of updating data
associated with audio files using a different device as described
in method 1300 can be used to update locally stored data on
electronic device 1000. For brevity, these details are not repeated
below.
[0335] FIGS. 12A-12AE illustrate exemplary user interfaces for
playing and managing audio items, in accordance with some
embodiments. The user interfaces in these figures are used to
illustrate the processes described below, including the processes
in FIGS. 13A-13C.
[0336] FIG. 12A illustrates the face of an electronic device 1200
(e.g., portable multifunction device 100, device 300, or device
500). In this non-limiting exemplary embodiment depicted in FIGS.
12A-12AE, electronic device 1200 is a smartphone. In other
embodiments, electronic device 1200 can be a different type of
electronic device, such as a wearable device (e.g., a
smartwatch).
[0337] In some embodiments, electronic device 1200 detects, via a
wireless communication radio (e.g., a WiFi connection, a Bluetooth
connection), an external device (e.g., a smartwatch). In some
embodiments, the external device, such as a smartwatch, is paired
with electronic device 1200 (e.g., both devices are associated with
the same user account). In some embodiments, a user of electronic
device 1200 has not yet configured automatic push (e.g., automatic
transfer of files, such as audio files, audio playlists, audio
albums, folders, etc., from electronic device 1200 to the external
device) settings between electronic device 1200 and the external
device.
[0338] As shown in FIG. 12A, in some embodiments, in response to
establishing a communication link (via the wireless communication
radio) between electronic device 1200 and the external device
(e.g., a smartwatch), electronic device 1200 displays a setup user
interface 1208 for configuring automatic push settings between
electronic device 1200 and the external device. In some
embodiments, setup user interface 1208 is displayed only when the
user of the device has not yet configured automatic push settings
on the device. In some embodiments, setup user interface 1208
includes a textual indication 1210 (e.g., "Add Music To Your
Watch") and a graphical indication 1212 of the availability of
automatic push settings and a request that the user configure
automatic push settings between electronic device 1200 and the
external device. Setup user interface 1208 also includes a proceed
icon 1216 for proceeding with configuring automatic push settings
and a cancel icon 1218 (e.g., "Not Now") for forgoing configuring
automatic push settings at this time.
[0339] In FIG. 12B, electronic device 1200 detects, via
touch-sensitive display 1202 (or alternatively, a voice input via a
mic) user activation of proceed icon 1216. For example, the user
activation is a tap gesture 1201 on proceed icon 1216.
[0340] As shown in FIG. 12C, in response to detecting tap gesture
1201 on proceed icon 1216, electronic device 1200 displays (e.g.,
replaces display of setup user interface 1208 with) setup selection
user interface 1220. In some embodiments, setup selection user
interface 1220 includes a description 1224 of automatic push
functionality and a cancel icon 1240 for forgoing configuring
initial automatic push settings. In some embodiments, setup
selection user interface 1220 also includes one or more playlists
(or albums, artists, folders) available on (e.g., stored on)
electronic device 1200 (or available via electronic device 1200 through
a cloud service) to designate or de-designate for automatic push
between electronic device 1200 and the external device (e.g., a
smartwatch).
[0341] For example, in FIG. 12C, setup selection user interface
1220 includes a first playlist 1226 (e.g., "My Top Hits"), a second
playlist 1230 (e.g., "Classics"), and a third playlist 1234 (e.g.,
"Workout"). In some embodiments, setup selection user interface
also includes toggle icons 1228, 1232, 1236 associated with each
available playlist. In some embodiments, a toggle icon is in the
"designated" or "on" state when it is toggled to the right (as are
toggle icons 1228, 1232, and 1236) and is in the "non-designated"
or "off" state when it is toggled to the left. Thus, in FIG. 12C,
all three playlists 1226, 1230, and 1234 are designated (via toggle
icons 1228, 1232, and 1236) to be automatically pushed from
electronic device 1200 to the external device. In some embodiments,
setup selection user interface 1220 also includes a second proceed
icon 1238 for proceeding with initial automatic push
configuration.
[0342] In FIG. 12D, electronic device 1200 detects, via
touch-sensitive display 1202 (or alternatively, a voice input via a
mic) user input on toggle icon 1236 corresponding to third playlist
1234 (e.g., "Workout"). For example, the user input is a tap
gesture 1203 (or alternatively, a short swipe gesture) on toggle
icon 1236 to de-designate third playlist 1234 for automatic push.
As shown in FIG. 12E, tap gesture 1203 on toggle icon 1236 causes
toggle icon 1236 to shift from the "designated" (or "on") state to
the "non-designated" (or "off") state. Thus, as shown in FIG. 12E,
following tap gesture 1203 on toggle icon 1236, third playlist 1234
(e.g., "Workout") is no longer designated for automatic push.
[0343] In FIG. 12F, with first playlist 1226 and second playlist
1230 designated for automatic push and third playlist 1234 no
longer designated for automatic push, electronic device 1200
detects, via touch-sensitive display 1202 (or alternatively, a
voice input via a mic) user activation of second proceed icon 1238
for proceeding with configuring automatic push. For example, the
user activation is a tap gesture 1205 on proceed icon 1238.
[0344] As shown in FIG. 12G, in response to detecting tap gesture
1205, electronic device 1200 displays (e.g., replaces display of
setup selection user interface 1220 with) a setup confirmation user
interface 1242 for completing initial automatic push configuration.
In some embodiments, setup confirmation user interface 1242
maintains display of cancel icon 1240 for cancelling the setup and
forgoing completing initial automatic push configuration. In some
embodiments, setup confirmation user interface 1242 includes a back
icon 1252 for returning to setup selection user interface 1220
(e.g., to change selection settings). In some embodiments, setup
confirmation user interface 1242 includes a graphical indication
1244 and a textual indication 1248 that automatic push
configuration settings can be changed and additional playlists (or
tracks, albums, folders) can be added later on. Setup confirmation
user interface 1242 also includes a completing icon 1250 for
affirming confirmation of the selected initial automatic push
settings. In FIG. 12H, electronic device 1200 detects, via
touch-sensitive display 1202 (or alternatively, a voice input via a
mic), user selection of completing icon 1250. For example, as shown
in FIG. 12H, the user selection is a tap gesture 1207 on completing
icon 1250. In response to detecting tap gesture 1207, electronic
device 1200 stores and applies the setup settings and exits the
setup process.
[0345] In FIG. 12I, electronic device 1200 displays, on
touch-sensitive display 1202, a homescreen user interface 1254. In
some embodiments, homescreen user interface 1254 includes one or
more icons 1256A-H corresponding to one or more applications
installed on electronic device 1200. In particular, homescreen user
interface 1254 of electronic device 1200 includes an external
device configuration icon 1256A (e.g., for configuring, changing
settings of an external device linked to electronic device 1200,
such as a paired smartwatch) corresponding to a configuration
application for an external device linked with electronic device
1200 (e.g., two devices linked to the same user account).
[0346] In FIG. 12J, while displaying homescreen user interface
1254, electronic device 1200 detects, via touch-sensitive display
1202 (or alternatively, a voice input via a mic), user activation
of external device configuration icon 1256A. For example, as shown
in FIG. 12J, the user activation is a tap gesture 1209 on external
device configuration icon 1256A.
[0347] As shown in FIG. 12K, in response to detecting tap gesture
1209, electronic device 1200 displays (e.g., replaces display of
homescreen user interface 1254 with) external device configuration
user interface 1258. In some embodiments, external device
configuration user interface 1258 includes a graphical indication
1260 of the linked external device. In some embodiments, external
device configuration user interface 1258 includes a listing 1266 of
setting options 1266A-E (e.g., general settings option 1266A,
passcode settings option 1266B, clock settings option 1266C, mail
settings option 1266D, music settings option 1266E). In some
embodiments, external device configuration user interface 1258
includes an application bar 1268 consisting of
operating-system-controlled applications, including the external
device configuration application, and an indication 1270 that the
currently-displayed user interface (i.e., external device
configuration user interface 1258) corresponds to the external
device configuration application.
[0348] In FIG. 12L, electronic device 1200 detects, via
touch-sensitive display 1202 (or alternatively, a voice input via a
mic), user selection of music settings option 1266E within listing
1266. For example, the user selection is a tap gesture 1211 on
music settings option 1266E. As shown in FIG. 12M, in response to
detecting tap gesture 1211, electronic device 1200 displays (e.g.,
replaces display of external device configuration user interface
1258 with) an audio settings user interface 1272 associated with
the linked external device. In some embodiments, audio settings
user interface 1272 includes an indication 1274 of the
currently-used storage amount and the storage limit of the linked
external device (e.g., "USED 3.2 GB OF 4 GB"). In some embodiments,
audio settings user interface 1272 includes a storage bar 1276 that
proportionally indicates the various different types of files (e.g., music
files, podcast files, photo files) that are stored on the linked
external device and the amount of free storage available on the
linked external device. In some embodiments, audio settings user
interface 1272 includes a storage limit indicator 1278 indicating
the storage limit of the linked external device. In some
embodiments, audio settings user interface 1272 includes an "update
only with charger" setting toggle option 1280 which, in FIG. 12M,
is currently in the "off" state. In some embodiments, audio
settings user interface 1272 includes an edit icon 1288 for editing
list entries included in the user interface (e.g., entries of list
1290 depicted in FIG. 12R). In some embodiments, audio settings
user interface 1272 includes a selectable indication 1252 that
includes an indication (e.g., a textual indication stating "My
Watch") that the current application relates to the linked external
device, and, when selected, causes electronic device 1200 to return
to display of external device configuration user interface
1258.
[0349] In some embodiments, audio settings user interface 1272
includes a list 1284 of playlists available on (e.g., stored on) or
available via (e.g., from a cloud service) electronic device 1200
that can be automatically pushed to the linked external device. The
list 1284 of playlists corresponds to the list of playlists (e.g.,
first playlist 1226 (e.g., "My Top Hits"), second playlist 1230
(e.g., "Classics"), third playlist 1234 (e.g., "Workout"))
displayed in setup selection user interface 1220 described with
reference to FIGS. 12C-12F, with corresponding toggle icons 1228,
1232, and 1236. Because third playlist 1234 (e.g., "Workout") was
de-designated during the initial automatic push configuration
process described with reference to FIGS. 12A-12H, third playlist
1234 remains non-designated while first playlist 1226 and second
playlist 1230 remain designated for automatic push.
[0350] In FIG. 12N, while displaying audio settings user interface
1272, electronic device 1200 detects, via touch-sensitive display
1202 (or alternatively, a voice input via a mic), user input on
toggle icon 1232 corresponding to second playlist 1230 (e.g.,
"Classics"). For example, the user selection is a tap gesture 1213
(or alternatively, a short swipe gesture) on toggle icon 1232
corresponding to second playlist 1230. As shown in FIG. 12O, in
response to detecting tap gesture 1213, toggle icon 1232
corresponding to second playlist 1230 shifts to the
"non-designated" (e.g., "off") mode, thereby de-designating second
playlist 1230 for automatic push from electronic device 1200 to the
linked external device.
[0351] In FIG. 12P, while displaying audio settings user interface
1272, electronic device 1200 detects, via touch-sensitive display
1202 (or alternatively, a voice input via a mic), a scrolling input
on audio settings user interface 1272. For example, the scrolling
input is a scrolling gesture 1215 in an upwards direction on audio
settings user interface 1272. As shown in the transition from
FIG. 12P through FIG. 12R, scrolling gesture 1215 causes gradual
display (e.g., a shifting up from the bottom edge of display 1202)
of a bottom region of audio settings user interface 1272 that
previously could not be displayed on display 1202. As shown in
FIGS. 12Q-12R, scrolling gesture 1215 reveals a list 1290 of albums
(or playlists, folders) currently stored locally on the linked
external device. For example, as shown in FIGS. 12Q-12R, list 1290
includes a first album 1292 (e.g., "Current Favorites"), a second
album 1294 (e.g., "Heavy Rotation"), a third album 1296 (e.g.,
"Best Classics"), and a fourth album 1298 (e.g., "90s Playlist")
currently stored on the linked external device.
[0352] In FIG. 12S, while displaying list 1290 of audio settings
user interface 1272, electronic device 1200 detects, via
touch-sensitive display 1202 (or alternatively, a voice input via a
mic), user selection of edit icon 1288. For example, as shown in
FIG. 12S, the user selection is a tap gesture 1217 on edit icon
1288. As shown in FIG. 12T, in response to detecting tap gesture
1217, electronic device 1200 displays, on audio settings user
interface 1272, a plurality of remove icons 1292A, 1294A, 1296A, and
1298A corresponding to first album 1292, second album 1294, third
album 1296, and fourth album 1298 of list 1290, respectively.
Electronic device 1200 also displays, on audio settings user
interface 1272, a cancel icon for removing display of the plurality
of remove icons.
[0353] In FIG. 12U, while displaying the plurality of remove icons
1292A, 1294A, 1296A, and 1298A corresponding to first album 1292,
second album 1294, third album 1296, and fourth album 1298 of list
1290 on audio settings user interface 1272, electronic device 1200
detects, via touch-sensitive display 1202 (or alternatively, a
voice input via a mic), user selection of first remove icon 1292A
corresponding to first album 1292. For example, as shown in FIG.
12U, the user selection is a tap gesture 1217 on first remove icon
1292A. As shown in FIG. 12V, in response to detecting tap gesture
1217 on first remove icon 1292A corresponding to first album 1292,
electronic device 1200 displays (e.g., at a bottom region of the
user interface), a prompt 1299 (e.g., a pop-up prompt) over audio
settings user interface 1272 requesting user confirmation for
removing data corresponding to first album 1292 from local storage
on the linked external device. In some embodiments, prompt 1299
includes an indication 1299A (e.g., a textual indication)
confirming the user's intent to have the data associated with the
selected album (e.g., first album 1292) removed from the linked
external device, a confirmation icon 1299B (e.g., an icon stating
"REMOVE"), and a cancel icon 1299C for canceling the removal.
[0354] In FIG. 12W, while displaying prompt 1299, electronic device
1200 detects, via touch-sensitive display 1202 (or alternatively, a
voice input via a mic), user selection of confirmation icon 1299B.
For example, as shown in FIG. 12W, the user selection is a tap
gesture 1219 on confirmation icon 1299B. As shown in FIG. 12X, in
response to detecting tap gesture 1219 confirming the request to
remove data associated with first album from local storage on the
linked external device, electronic device 1200 removes display of
prompt 1299 and removes display of first album 1292 and remove icon
1292A corresponding to first album 1292 from audio settings user
interface 1272. As such, listing 1290 now shows only second album
1294, third album 1296, and fourth album 1298 as being stored
locally on the linked external device. Further, electronic device
1200 causes data corresponding to first album 1292 to be removed
from the linked external device.
[0355] In FIG. 12Y, while displaying the entries (e.g., second
album 1294, third album 1296, and fourth album 1298) of listing
1290 and remove icons 1294A, 1296A, and 1298A corresponding to the
entries of listing 1290, electronic device 1200 detects, via
touch-sensitive display 1202 (or alternatively, a voice input via a
mic), user selection of cancel icon 1240. For example, the user
selection is a tap gesture 1221 on cancel icon 1240. As shown in
FIG. 12Z, in response to detecting tap gesture 1221, electronic
device 1200 ceases display of remove icons 1294A, 1296A, and 1298A
corresponding to second album 1294, third album 1296, and fourth
album 1298, respectively, on audio settings user interface
1272.
[0356] In FIG. 12AA, while displaying audio settings user interface
1272, electronic device 1200 detects, via touch-sensitive display
1202 (or alternatively, a voice input via a mic), user input on
toggle icon 1236, which is in the "non-designated" mode (e.g.,
"off" mode), corresponding to third playlist 1234 (e.g.,
"Workout"). For example, as shown in FIG. 12AA, the user selection
is a tap gesture 1223 (or alternatively, a short swipe gesture) on
toggle icon 1236. As shown in FIG. 12AB, in response to detecting
tap gesture 1223, toggle icon 1236 corresponding to third playlist
1234 (e.g., "Workout") switches from the "non-designated" mode
(e.g., "off" mode) to the "designated" mode (e.g., "on" mode).
[0357] As shown in the transition from FIG. 12AC to FIG. 12AD,
electronic device 1200 detects, via touch-sensitive display 1202
(or alternatively, a voice input via a mic), a scrolling input on
audio settings user interface 1272. For example, the scrolling
input is a scrolling gesture 1225 in a downwards direction (thus
again revealing the top portion of audio settings user interface
1272, as shown in FIG. 12AD). As shown in FIG. 12AD, toggle icon
1236 corresponding to third playlist 1234 (e.g., "Workout") remains
in the "designated" mode. Further, as also shown in FIG. 12AD,
toggle icon 1280 corresponding to an update mode (e.g., update the
linked external device only when the external device is being
charged vs. update the linked external device irrespective of
whether or not the linked external device is being charged) is in
the "off" mode (e.g., update the linked external device
irrespective of whether or not the linked external device is being
charged). As such, as long as the external device is currently
linked to electronic device 1200, electronic device 1200 causes data
associated with third playlist 1234 to be transmitted to the linked
external device to be locally stored on the linked external device.
If toggle icon 1280 corresponding to an update mode is in the "on"
mode (e.g., update the linked external device only when the
external device is being charged), electronic device 1200 causes
data transmission to the linked external device only when the
linked device is currently being charged.
[0358] In some embodiments, as further shown in FIG. 12AD, when
initiating update of (e.g., causing data corresponding to third
playlist 1234 to be transmitted to) the linked external device,
electronic device 1200 displays, on audio settings user interface
1272, an indication 1297 (e.g., a textual indication stating
"UPDATING . . . ") that the linked external device is being updated
to locally store the transmitted data (e.g., corresponding to third
playlist 1234).
[0359] In some embodiments, as shown in FIG. 12AE, as the update is
progressing, electronic device 1200 updates indication 1297 to
include a current progress (e.g., "UPDATING SONG 8 OF 25") of the
update. In some embodiments, as also shown in FIG. 12AE, as the
update is progressing, electronic device 1200 displays (e.g.,
adjacent to or below indication 1297) an update status bar 1293
graphically indicating the current progress of the update. In some
embodiments, as shown in FIG. 12AE, while causing update of the
linked external device, electronic device 1200 updates display of
indication 1274 (e.g., from "USED 3.2 GB of 4 GB" to
"USED 3.5 GB of 4 GB") and storage bar 1276 to account for change
in local storage use of the linked external device as the update
progresses (i.e., as the external device continues to locally
store the transmitted data (e.g., corresponding to third playlist
1234)).
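The progress text and status bar described above can be driven from a simple count of transferred items. The following Python sketch is purely illustrative and is not part of the disclosed embodiments; the function name and return shape are assumptions for exposition.

```python
# Hypothetical sketch of the update-progress indication of FIG. 12AE:
# derive the "UPDATING SONG m OF n" text and a status-bar fill
# fraction from the number of items transferred so far.
def progress_indication(done: int, total: int):
    # Guard against division by zero when the update set is empty.
    fraction = done / total if total else 1.0
    return f"UPDATING SONG {done} OF {total}", fraction
```

For example, after 8 of 25 songs have been transmitted, the indication reads "UPDATING SONG 8 OF 25" and the status bar is 32% filled.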
[0360] FIGS. 13A-13C are a flow diagram illustrating a method for
playing and managing audio items using an electronic device in
accordance with some embodiments. Method 1300 is performed at a
device (e.g., 100, 300, 500) with a touch-sensitive display and a
wireless communication radio (e.g., Bluetooth, WiFi, NFC, etc.).
Some operations in method 1300 are, optionally, combined, the order
of some operations is, optionally, changed, and some operations
are, optionally, omitted.
[0361] As described below, method 1300 provides an intuitive way
for playing and managing audio items. The method reduces the
cognitive burden on a user for playing and managing audio items,
thereby creating a more efficient human-machine interface. For
battery-operated computing devices, enabling a user to play and
manage audio items faster and more efficiently conserves power and
increases the time between battery charges.
[0362] At block 1302, the electronic device (e.g., 1200) displays,
on the display (e.g., 1202), a user interface (e.g., 1220, 1272)
including a plurality of item groups (e.g., 1226, 1230, 1234)
(e.g., a plurality of audio playlists, a plurality of audio albums,
a plurality of track lists) and a plurality of selection
affordances (e.g., 1228, 1232, 1236) associated with the plurality
of item groups, where a selection affordance has a first state and
a second state, and where data of the plurality of item groups are
stored on the electronic device. In some examples, the first state
is a selected state. In some examples, the first state is a
"checked" state. In some examples, the first state is an "on"
state. In some examples, the second state is a non-selected state.
In some examples, the second state is an "un-checked" state. In
some examples, the second state is an "off" state.
[0363] Displaying a plurality of item groups and a plurality of
selection affordances associated with the plurality of item groups,
where a selection affordance can be switched between a first state
and a second state to designate or de-designate the corresponding
item group provides the user with a quick and efficient way to
designate or de-designate the plurality of item groups. Providing
additional control options without cluttering the UI with
additional displayed controls enhances the operability of the
device and makes the user-device interface more efficient (e.g., by
helping the user to provide proper inputs and reducing user
mistakes when operating/interacting with the device) which,
additionally, reduces power usage and improves battery life of the
device by enabling the user to use the device more quickly and
efficiently.
[0364] In some embodiments, at block 1304, prior to displaying, on
the display (e.g., 1202), the user interface (e.g., 1220, 1272)
including the plurality of item groups (e.g., a plurality of audio
playlists, a plurality of audio albums, a plurality of track lists)
and the plurality of selection affordances associated with the
plurality of item groups, the electronic device (e.g., 1200)
displays, on the display, an initial setup user interface (e.g.,
1208) including a proceeding affordance (e.g., 1216). In some
examples, the initial setup user interface (e.g., 1208) is a user
interface that is displayed only when a user has not yet configured
automatic push settings on the device (e.g., a "get started" user
interface). In some examples, the proceeding affordance (e.g.,
1216) is an affordance for agreeing to proceed with setting up
automatic push functionality with the external device (e.g., a "get
started" affordance). In some embodiments, at block 1306, the
electronic device receives user selection (e.g., a touch gesture,
such as a tap) of the proceeding affordance. In some embodiments,
at block 1308, in response to receiving the user selection of the
proceeding affordance, the electronic device displays the user
interface. In some embodiments, the initial setup user interface is
displayed in response to detecting, via the wireless communication
radio, connectivity with the external device.
[0365] At block 1310, the electronic device (e.g., 1200) receives
user input on a first selection affordance associated with a first
item group. At block 1312, in accordance with a determination that
the first selection affordance is in the first state, the
electronic device designates the first item group. In some
examples, the electronic device (e.g., 1200) designates the first
item group to be transmitted to a device different from the
electronic device. At block 1314, in accordance with a
determination that the first selection affordance is in the second
state, the electronic device forgoes designating the first item
group.
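The designation logic of blocks 1310-1314 can be summarized as a mapping from item groups to the states of their selection affordances. The following Python sketch is an illustrative model only, not the disclosed implementation; the class and method names are assumptions.

```python
# Hypothetical model of blocks 1310-1314: each item group (e.g., a
# playlist) has a selection affordance whose first ("on") state
# designates the group for automatic push.
class PushSettings:
    def __init__(self):
        self.designated = {}

    def add_group(self, group_id: str):
        # Per the described embodiments, the default state of a newly
        # added group's selection affordance is the first ("on") state.
        self.designated[group_id] = True

    def toggle(self, group_id: str):
        # User input on the selection affordance flips its state.
        self.designated[group_id] = not self.designated[group_id]

    def groups_to_push(self):
        # Only groups whose affordance is in the first state are
        # designated for automatic transmission; others are forgone.
        return [g for g, on in self.designated.items() if on]
```

Mirroring FIGS. 12C-12E: after adding "My Top Hits", "Classics", and "Workout" and toggling "Workout" off, only the first two remain designated.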
[0366] In some embodiments, at block 1316, the electronic device
(e.g., 1200) receives a second user input (e.g., 1215) (e.g., a
finger scroll gesture) on the user interface (e.g., 1272). In some
embodiments, at block 1318, in response to receiving the second
user input (e.g., in response to a user scrolling the user
interface to view user interface elements that are not currently
visible on the display), the electronic device (e.g., 1200)
displays, on the display (e.g., 1202), at least one stored item
group of a plurality of stored item groups (e.g., 1294, 1296, 1298)
(e.g., at least one playlist of the plurality of playlists
currently stored on the external device) stored on the external
device. In some embodiments, at block 1320, the electronic device
(e.g., 1200) receives user selection (e.g., 1217) (e.g., a
detectable touch gesture, such as a tap) of an edit affordance
(e.g., 1288). In some embodiments, at block 1322, in response to
receiving the user selection of the edit affordance, the electronic
device (e.g., 1200) displays, on the display (e.g., 1202), a
plurality of removal affordances (e.g., 1294A, 1296A, 1298A)
associated with the plurality of stored item groups.
[0367] Displaying the plurality of removal affordances associated
with the plurality of stored item groups in response to receiving
the user selection of the edit affordance provides a visual
feedback to the user indicating that one or more of the plurality
of stored item groups can be removed from local storage on the
external device. Providing improved visual feedback enhances the
operability of the device and makes the user-device interface more
efficient (e.g., by helping the user to provide proper inputs and
reducing user mistakes when operating/interacting with the device)
which, additionally, reduces power usage and improves battery life
of the device by enabling the user to use the device more quickly
and efficiently.
[0368] In some embodiments, at block 1324, the electronic device
(e.g., 1200) receives user selection of a first removal affordance
(e.g., 1294A) of the plurality of removal affordances associated
with a first stored item group of the plurality of stored item
groups. In some embodiments, at block 1326, in response to
receiving the user selection of the first removal affordance, the
electronic device (e.g., 1200) causes data of the first stored item
group (e.g., 1294) to be removed from the external device.
[0369] In some embodiments, at block 1328, prior to causing the
data of the first stored item group (e.g., 1294) to be removed from
the external device, the electronic device (e.g., 1200) displays,
on the display (e.g., 1202), a confirmation affordance (e.g.,
1299B). In some embodiments, at block 1330, the electronic device
(e.g., 1200) receives user selection (e.g., 1219) of the
confirmation affordance. In some embodiments, at block 1332, in
response to receiving the user selection of the confirmation
affordance, the electronic device (e.g., 1200) causes data of the
first stored item group (e.g., 1294) to be removed from the
external device.
[0370] Causing data of the first stored item group to be removed
from the external device in response to receiving the user
selection of the first removal affordance, without additional user
input on the external device, enables a user to easily and
efficiently control (e.g., remove) data corresponding to items stored
on the external device. Performing an operation when a set of
conditions has been met without requiring further user input
enhances the operability of the device and makes the user-device
interface more efficient (e.g., by helping the user to provide
proper inputs and reducing user mistakes when operating/interacting
with the device) which, additionally, reduces power usage and
improves battery life of the device by enabling the user to use the
device more quickly and efficiently.
[0371] At block 1334, subsequent to detecting, via the wireless
communication radio, an external device (e.g., a smartphone, a
smartwatch, a tablet computer, a laptop computer, a desktop
computer), the electronic device (e.g., 1200), at block 1336, in
accordance with a determination that the first item group is
designated, automatically transmits (without any user input) data
of the items associated with the first item group to the external
device to be stored on the external device, and, at block 1338, in
accordance with a determination that the first item group is not
designated, the electronic device forgoes automatically
transmitting data of the items associated with the first item group
to the external device to be stored on the external device. In some
embodiments, the electronic device is paired with the external
device.
[0372] Automatically transmitting data of the items associated with
the first item group to the external device to be stored on the
external device in accordance with a determination that the first
item group is designated, and forgoing automatically transmitting
data of the items associated with the first item group to the
external device in accordance with a determination that the first
item group is not designated, enables a
user to easily and efficiently control the transmission of data
associated with the first item group to the external device by
simply designating or de-designating the item group using the
electronic device. Performing an operation when a set of conditions
has been met without requiring further user input enhances the
operability of the device and makes the user-device interface more
efficient (e.g., by helping the user to provide proper inputs and
reducing user mistakes when operating/interacting with the device)
which, additionally, reduces power usage and improves battery life
of the device by enabling the user to use the device more quickly
and efficiently.
[0373] In some embodiments, in accordance with a determination that
the first item group is designated, the electronic device (e.g.,
1200), at block 1340, in accordance with a determination that a
first item associated with the first item group is stored on the
external device (e.g., an audio item of the designated playlist is
already stored on the external device), forgoes automatically
transmitting the data of the first item to the external device.
[0374] In some embodiments, in accordance with the determination
that the first item group is designated, the electronic device
(e.g., 1200), at block 1342, in accordance with a determination
that a second item not associated with the first item group stored
on the electronic device is stored on the external device (e.g., an
audio item exists in the corresponding playlist of the external
device because the playlist previously contained the audio item,
but the audio item has been removed from the playlist on the
electronic device), causes data of the second item to be removed
from the external device.
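Blocks 1340 and 1342 together describe an incremental update: items of the designated group already stored on the external device are not re-transmitted, and items stored on the external device but no longer in the group are removed. A minimal illustrative sketch in Python (the function name and signature are assumptions, not the disclosed implementation):

```python
# Hypothetical sketch of the incremental update of blocks 1340-1342:
# given the designated group's current contents and what the external
# device already stores, compute what to transmit and what to remove.
def compute_sync_plan(designated_items, stored_items):
    designated = set(designated_items)
    stored = set(stored_items)
    # Block 1336 minus block 1340: transmit only items not yet stored.
    to_transmit = designated - stored
    # Block 1342: remove stored items no longer in the designated group.
    to_remove = stored - designated
    return sorted(to_transmit), sorted(to_remove)
```

This set-difference formulation avoids redundant transfers while keeping the external device's local copy consistent with the designated group.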
[0375] In some embodiments, at block 1344, prior to automatically
transmitting (without any user input) the data of the items in the
first item group to the external device to be stored on the
external device, the electronic device (e.g., 1200) displays, on
the display (e.g., 1202), a confirmation sheet indicating that the
first item group is designated. In some examples, the confirmation
sheet is a pop-up sheet that partially covers the display. In some
examples, the confirmation sheet is a confirmation page that
entirely covers the display.
[0376] In some embodiments, at block 1346, prior to automatically
transmitting (without any user input) the data of the items in the
first item group to the external device to be stored on the
external device, the electronic device (e.g., 1200) receives, via
the wireless communication radio, charge state information (e.g.,
information concerning whether or not the external device is being
charged) of the external device. In some embodiments, at block
1348, in accordance with a determination, based on the received
charge state information, that the external device is currently
being charged, the electronic device (e.g., 1200) automatically
transmits (without any user input) the data of the items associated
with the first item group to the external device. In some
embodiments, at block 1350, in accordance with a determination,
based on the received charge state information, that the external
device is not currently being charged, the electronic device (e.g.,
1200) forgoes automatically transmitting the data of the items
associated with the first item group to the external device.
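The conditions of blocks 1336-1338 and 1346-1350 combine into a single gating decision: transmit only for a designated group, and, when the "update only with charger" setting is on, only while the received charge state information reports charging. An illustrative sketch (names are assumptions for exposition, not the disclosed implementation):

```python
# Hypothetical sketch of the transmission gate of blocks 1336-1350.
def should_transmit(group_designated: bool,
                    update_only_with_charger: bool,
                    is_charging: bool) -> bool:
    if not group_designated:
        return False  # block 1338: forgo transmission entirely
    if update_only_with_charger and not is_charging:
        return False  # block 1350: forgo while not charging
    return True       # blocks 1336/1348: transmit automatically
```

For instance, with the charger-only setting on (toggle 1280 in the "on" state) and the external device unplugged, no data is transmitted until charging begins.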
[0377] Conditioning the automatic transmission on the received
charge state information, such that the data of the items associated
with the first item group is transmitted to the external device when
the external device is currently being charged and the transmission
is forgone when it is not, enables a user to not have to worry about
the charge state of the external device while the data is being
transmitted to the external device for local storage, which can be a
battery-intensive process. Performing an operation when a set of conditions has been
met without requiring further user input enhances the operability
of the device and makes the user-device interface more efficient
(e.g., by helping the user to provide proper inputs and reducing
user mistakes when operating/interacting with the device) which,
additionally, reduces power usage and improves battery life of the
device by enabling the user to use the device more quickly and
efficiently.
[0378] In some embodiments, a default state of the first selection
affordance is the first state. Thus, in some embodiments, the
default setting for a playlist newly added to the electronic device
is to be automatically pushed.
[0379] In some embodiments, the user interface (e.g., 1272)
includes a storage limit indicator (e.g., 1274, 1278) of the
external device. In some examples, the storage limit indicator
indicates the maximum storage capacity of the external device
(e.g., "4 GB," "16 GB," "32 GB").
[0380] In some embodiments, the user interface (e.g., 1272)
includes a storage bar (e.g., 1276) indicating storage information
(e.g., types of data files stored on the external device, such as
music, applications, photos, media) of the external device. In some
examples, the different types of data files are indicated
proportionally to their respective amounts of used storage by
adjusting the lengths of mini-bars associated with the different
data types within the storage bar. In some
examples, the different types of data files are indicated using
different colors. In some examples, while automatically
transmitting data of the items associated with the first item group
to the external device, the electronic device displays a status
indicator (e.g., a status bar that indicates current progress by
"filling up" the bar, text that indicates current progress (e.g.,
"updating 5 of 40")) indicating the current progress of the
transmission.
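The proportional mini-bars and the textual status indicator described above can be sketched as follows. The function names, the gigabyte units, and the display-width parameter are illustrative assumptions, not the claimed implementation:

```python
def storage_bar_segments(usage_by_type, capacity_gb, bar_width):
    """Compute mini-bar widths proportional to each data type's used
    storage relative to the external device's maximum capacity, so
    that unused capacity is left unfilled.

    usage_by_type: mapping such as {"music": 2.0, "photos": 1.0},
        in GB (hypothetical units).
    capacity_gb: maximum storage capacity, e.g. 16 for "16 GB".
    bar_width: total width of the storage bar in display units.
    Returns (data_type, width) pairs for the mini-bars.
    """
    return [(t, bar_width * used / capacity_gb)
            for t, used in usage_by_type.items()]


def progress_text(done, total):
    """Status indicator text shown while transmitting item data,
    in the style of the example above, e.g. 'updating 5 of 40'."""
    return f"updating {done} of {total}"
```

For a 16 GB device rendered in a 160-unit-wide bar, 2 GB of music would occupy a 20-unit mini-bar, with each type optionally drawn in a different color as described above.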
[0381] In some embodiments, at block 1352, subsequent to
transmitting the data of the items associated with the first item
group to the external device, the electronic device (e.g., 1200)
receives, via the wireless communication radio, updated storage
information (reflecting the data transfer) of the external device.
In some embodiments, at block 1354, in response to receiving the
updated storage information of the external device, the electronic
device (e.g., 1200) updates the storage bar to reflect the updated
storage information. Updating the storage bar to reflect the
updated storage information in response to receiving the updated
storage information of the external device provides the user with
easily recognizable feedback regarding the local storage status of
the external device, which may have a limited amount of available
local storage. Providing improved feedback enhances the operability
of the device and makes the user-device interface more efficient
(e.g., by helping the user to provide proper inputs and reducing
user mistakes when operating/interacting with the device) which,
additionally, reduces power usage and improves battery life of the
device by enabling the user to use the device more quickly and
efficiently.
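The refresh flow of blocks 1352-1354 can be sketched as a small model object that recomputes the storage bar whenever updated storage information arrives; the class and method names are hypothetical:

```python
class StorageBarModel:
    """Minimal model of the storage bar (illustrative only)."""

    def __init__(self, capacity_gb, bar_width):
        self.capacity_gb = capacity_gb
        self.bar_width = bar_width
        self.segments = []  # (data_type, width) pairs

    def on_storage_info(self, usage_by_type):
        """Blocks 1352-1354: updated storage information, reflecting
        the data transfer, is received via the wireless communication
        radio; update the storage bar to match."""
        self.segments = [
            (t, self.bar_width * used / self.capacity_gb)
            for t, used in usage_by_type.items()
        ]
```

After a transfer completes, calling `on_storage_info` with the new usage figures is enough to give the user the recognizable feedback described above.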
[0382] Note that details of the processes described above with
respect to method 1300 (e.g., FIGS. 13A-13C) are also applicable in
an analogous manner to the methods described above. For example,
the method of navigating and selecting an audio item to play
described in method 700 can be used to select audio items to be
played through electronic device 1200. For another example, the
method of navigating the displayed stack of stack items described
in method 900 can be used to navigate audio items and select audio
items to be played through electronic device 1200. For another
example, the method of quickly and efficiently switching between
user interfaces of active applications described in method 1100 can
be used to switch amongst active applications on electronic device
1200. For brevity, these details are not repeated here.
[0383] The foregoing description, for purpose of explanation, has
been described with reference to specific embodiments. However, the
illustrative discussions above are not intended to be exhaustive or
to limit the invention to the precise forms disclosed. Many
modifications and variations are possible in view of the above
teachings. The embodiments were chosen and described in order to
best explain the principles of the techniques and their practical
applications. Others skilled in the art are thereby enabled to best
utilize the techniques and various embodiments with various
modifications as are suited to the particular use contemplated.
[0384] Although the disclosure and examples have been fully
described with reference to the accompanying drawings, it is to be
noted that various changes and modifications will become apparent
to those skilled in the art. Such changes and modifications are to
be understood as being included within the scope of the disclosure
and examples as defined by the claims.
* * * * *