U.S. patent application number 16/832285 was filed with the patent office on 2020-03-27 and published on 2020-07-16 for replacing display of icons in response to a gesture.
The applicant listed for this patent is Apple Inc. Invention is credited to Imran CHAUDHRI, Gregory N. CHRISTIE, and Scott M. HERZ.
Publication Number: 20200225843
Application Number: 16/832285
Family ID: 40845592
Publication Date: 2020-07-16
United States Patent Application 20200225843
Kind Code: A1
HERZ; Scott M.; et al.
July 16, 2020
REPLACING DISPLAY OF ICONS IN RESPONSE TO A GESTURE
Abstract
In one aspect of the invention, a computer-implemented method at
a computing device with a touch screen display includes: displaying
a first set of a first plurality of icons in a first area of the
touch screen display, wherein the first plurality of icons includes
a plurality of sets of icons that are separately displayed in the
first area of the touch screen display; displaying a second
plurality of icons in a second area on the touch screen display,
wherein the second area is different from the first area; detecting
a first finger gesture on the touch screen display; in response to
detecting the first finger gesture, initiating a user interface
reconfiguration process, and varying positions of one or more icons
in the first set of the first plurality of icons about respective
average positions.
Inventors: HERZ; Scott M.; (San Jose, CA); CHAUDHRI; Imran; (San Francisco, CA); CHRISTIE; Gregory N.; (San Jose, CA)
Applicant: Apple Inc., Cupertino, CA, US
Family ID: 40845592
Appl. No.: 16/832285
Filed: March 27, 2020
Related U.S. Patent Documents

Application Number | Filing Date | Patent Number
15426836 (parent of 16832285) | Feb 7, 2017 | 10628028
12242851 (parent of 15426836) | Sep 30, 2008 | 9619143
61010208 (provisional) | Jan 6, 2008 |
Current U.S. Class: 1/1

Current CPC Class: G06F 3/017 20130101; G06F 3/0482 20130101; G06F 9/451 20180201; G06F 3/0486 20130101; G06F 15/0291 20130101; G06F 3/04883 20130101; G06F 8/38 20130101; G06F 2203/04803 20130101; G06Q 10/109 20130101; G06F 3/04886 20130101; G06F 9/542 20130101; G06F 3/0488 20130101; G06F 16/9577 20190101; G06F 3/048 20130101; G06F 3/0481 20130101; G06F 3/0483 20130101; G06F 2200/1637 20130101; G06F 3/04817 20130101

International Class: G06F 3/0488 20060101 G06F003/0488; G06F 9/54 20060101 G06F009/54; G06F 3/0482 20060101 G06F003/0482; G06F 3/01 20060101 G06F003/01; G06F 3/048 20060101 G06F003/048; G06F 3/0481 20060101 G06F003/0481; G06F 3/0483 20060101 G06F003/0483; G06F 3/0486 20060101 G06F003/0486; G06F 16/957 20060101 G06F016/957
Claims
1. (canceled)
2. A computing device, comprising: a touch screen display; one or
more processors; memory; and one or more programs, wherein the one
or more programs are stored in the memory and configured to be
executed by the one or more processors, the one or more programs
including instructions for: displaying a first set of a first
plurality of icons on the touch screen display, wherein: the first
plurality of icons includes a plurality of sets of icons that are
separately displayed on the touch screen display, and the first
plurality of icons includes application launch icons, wherein each
application launch icon represents a particular application, and
activation of a respective application launch icon causes:
launching and displaying of the particular application represented
by the activated application launch icon, when the particular
application is not already launched, and displaying of the
particular application represented by the activated application
launch icon, when the particular application is already launched;
while displaying the first set of the first plurality of icons on
the touch screen display, detecting a first finger swipe gesture on
the touch screen display; in response to detecting the first finger
swipe gesture on the touch screen display: in accordance with a
determination that the first finger swipe gesture is in a first
direction, replacing display of the first set of the first
plurality of icons with display of a second set of the first
plurality of icons on the touch screen display, wherein the second
set of the first plurality of icons is distinct from the first set
of the first plurality of icons; and in accordance with a
determination that the first finger swipe gesture is in a direction
that is substantially opposite the first direction, replacing
display of the first set of the first plurality of icons with a
display of information, other than a set in the plurality of sets
of icons, customized to a user of the device.
3. The computing device of claim 2, wherein replacing display of
the first set of the first plurality of icons with display of a
second set of the first plurality of icons on the touch screen
display comprises an animation that moves the first set out of the
first area and the second set into the first area.
4. The computing device of claim 2, further including instructions
for: detecting a second finger gesture on an icon in the second set
of the first plurality of icons; and in response to detecting the
second finger gesture, displaying an application that corresponds
to the icon in the second set upon which the second finger gesture
was detected.
5. The computing device of claim 2, further including instructions
for: detecting a third finger gesture on the touch screen display;
in response to detecting the third finger gesture, replacing
display of the second set of the first plurality of icons with
display of a third set of the first plurality of icons on the touch
screen display; detecting a fourth finger gesture on an icon in the
third set of the first plurality of icons; and in response to
detecting the fourth finger gesture, displaying an application that
corresponds to the icon in the third set upon which the fourth
finger gesture was detected.
6. The computing device of claim 2, wherein the plurality of sets
of icons includes a number of sets of icons that are configured to
be separately displayed as a sequence of sets of icons on the touch
screen display; and further including instructions for: displaying
two or more set-sequence-indicia icons, wherein the
set-sequence-indicia icons provide information about the number of
sets of icons in the plurality of sets of icons and a position of a
displayed set of icons in the sequence of sets of icons; and in
response to detecting the first finger swipe gesture, updating the
information provided by the set-sequence-indicia icons to reflect
the replacement of the displayed first set by the second set.
7. The computing device of claim 6, further including instructions
for: in accordance with the determination that the first finger
swipe gesture is in a direction that is substantially opposite the
first direction, updating the information provided by a
customized-information indicia icon and the set-sequence-indicia
icons to reflect the replacement of the displayed first set by the
information customized to the user.
8. The computing device of claim 2, wherein the first finger swipe
gesture is at a location on the touch screen display that does not
align with any indicia representative of said replacing.
9. A method, comprising: at a computing device with a touch screen
display: displaying a first set of a first plurality of icons on
the touch screen display, wherein: the first plurality of icons
includes a plurality of sets of icons that are separately displayed
on the touch screen display, and the first plurality of icons
includes application launch icons, wherein each application launch
icon represents a particular application, and activation of a
respective application launch icon causes: launching and displaying
of the particular application represented by the activated
application launch icon, when the particular application is not
already launched, and displaying of the particular application
represented by the activated application launch icon, when the
particular application is already launched; while displaying the
first set of the first plurality of icons on the touch screen
display, detecting a first finger swipe gesture on the touch screen
display; in response to detecting the first finger swipe gesture on
the touch screen display: in accordance with a determination that
the first finger swipe gesture is in a first direction, replacing
display of the first set of the first plurality of icons with
display of a second set of the first plurality of icons on the
touch screen display, wherein the second set of the first plurality
of icons is distinct from the first set of the first plurality of
icons; and in accordance with a determination that the first finger
swipe gesture is in a direction that is substantially opposite the
first direction, replacing display of the first set of the first
plurality of icons with a display of information, other than a set
in the plurality of sets of icons, customized to a user of the
device.
10. The method of claim 9, wherein replacing display of the first
set of the first plurality of icons with display of a second set of
the first plurality of icons on the touch screen display comprises
an animation that moves the first set out of the first area and the
second set into the first area.
11. The method of claim 9, further comprising: detecting a second
finger gesture on an icon in the second set of the first plurality
of icons; and in response to detecting the second finger gesture,
displaying an application that corresponds to the icon in the
second set upon which the second finger gesture was detected.
12. The method of claim 9, further comprising: detecting a third
finger gesture on the touch screen display; in response to
detecting the third finger gesture, replacing display of the second
set of the first plurality of icons with display of a third set of
the first plurality of icons on the touch screen display; detecting
a fourth finger gesture on an icon in the third set of the first
plurality of icons; and in response to detecting the fourth finger
gesture, displaying an application that corresponds to the icon in
the third set upon which the fourth finger gesture was
detected.
13. The method of claim 9, wherein the plurality of sets of icons
includes a number of sets of icons that are configured to be
separately displayed as a sequence of sets of icons on the touch
screen display; and wherein the method further comprises:
displaying two or more set-sequence-indicia icons, wherein the
set-sequence-indicia icons provide information about the number of
sets of icons in the plurality of sets of icons and a position of a
displayed set of icons in the sequence of sets of icons; and in
response to detecting the first finger swipe gesture, updating the
information provided by the set-sequence-indicia icons to reflect
the replacement of the displayed first set by the second set.
14. The method of claim 13, further comprising: in accordance with
the determination that the first finger swipe gesture is in a
direction that is substantially opposite the first direction,
updating the information provided by a customized-information
indicia icon and the set-sequence-indicia icons to reflect the
replacement of the displayed first set by the information
customized to the user.
15. The method of claim 9, wherein the first finger swipe gesture
is at a location on the touch screen display that does not align
with any indicia representative of said replacing.
16. A non-transitory computer-readable storage medium storing one
or more programs configured to be executed by one or more
processors of a device with a touch screen display, the one or more
programs including instructions for: displaying a first set of a
first plurality of icons on the touch screen display, wherein: the
first plurality of icons includes a plurality of sets of icons that
are separately displayed on the touch screen display, and the first
plurality of icons includes application launch icons, wherein each
application launch icon represents a particular application, and
activation of a respective application launch icon causes:
launching and displaying of the particular application represented
by the activated application launch icon, when the particular
application is not already launched, and displaying of the
particular application represented by the activated application
launch icon, when the particular application is already launched;
while displaying the first set of the first plurality of icons on
the touch screen display, detecting a first finger swipe gesture on
the touch screen display; in response to detecting the first finger
swipe gesture on the touch screen display: in accordance with a
determination that the first finger swipe gesture is in a first
direction, replacing display of the first set of the first
plurality of icons with display of a second set of the first
plurality of icons on the touch screen display, wherein the second
set of the first plurality of icons is distinct from the first set
of the first plurality of icons; and in accordance with a
determination that the first finger swipe gesture is in a direction
that is substantially opposite the first direction, replacing
display of the first set of the first plurality of icons with a
display of information, other than a set in the plurality of sets
of icons, customized to a user of the device.
17. The non-transitory computer-readable storage medium of claim
16, wherein replacing display of the first set of the first
plurality of icons with display of a second set of the first
plurality of icons on the touch screen display comprises an
animation that moves the first set out of the first area and the
second set into the first area.
18. The non-transitory computer-readable storage medium of claim
16, further including instructions for: detecting a second finger
gesture on an icon in the second set of the first plurality of
icons; and in response to detecting the second finger gesture,
displaying an application that corresponds to the icon in the
second set upon which the second finger gesture was detected.
19. The non-transitory computer-readable storage medium of claim
16, further including instructions for: detecting a third finger
gesture on the touch screen display; in response to detecting the
third finger gesture, replacing display of the second set of the
first plurality of icons with display of a third set of the first
plurality of icons on the touch screen display; detecting a fourth
finger gesture on an icon in the third set of the first plurality
of icons; and in response to detecting the fourth finger gesture,
displaying an application that corresponds to the icon in the third
set upon which the fourth finger gesture was detected.
20. The non-transitory computer-readable storage medium of claim
16, wherein the plurality of sets of icons includes a number of
sets of icons that are configured to be separately displayed as a
sequence of sets of icons on the touch screen display; and further
including instructions for: displaying two or more
set-sequence-indicia icons, wherein the set-sequence-indicia icons
provide information about the number of sets of icons in the
plurality of sets of icons and a position of a displayed set of
icons in the sequence of sets of icons; and in response to
detecting the first finger swipe gesture, updating the information
provided by the set-sequence-indicia icons to reflect the
replacement of the displayed first set by the second set.
21. The non-transitory computer-readable storage medium of claim
20, further including instructions for: in accordance with the
determination that the first finger swipe gesture is in a direction
that is substantially opposite the first direction, updating the
information provided by a customized-information indicia icon and
the set-sequence-indicia icons to reflect the replacement of the
displayed first set by the information customized to the user.
22. The non-transitory computer-readable storage medium of claim
16, wherein the first finger swipe gesture is at a location on the
touch screen display that does not align with any indicia
representative of said replacing.
Description
RELATED APPLICATIONS
[0001] This application claims priority to U.S. Provisional Patent
Application No. 61/010,208, "Portable Multifunction Device with
Interface Reconfiguration Mode," filed Jan. 16, 2008, which is
incorporated by reference herein in its entirety.
[0002] This application is related to the following applications:
(1) U.S. patent application Ser. No. 10/188,182, "Touch Pad For
Handheld Device," filed on Jul. 1, 2002; (2) U.S. patent
application Ser. No. 10/722,948, "Touch Pad For Handheld Device,"
filed on Nov. 25, 2003; (3) U.S. patent application Ser. No.
10/643,256, "Movable Touch Pad With Added Functionality," filed on
Aug. 18, 2003; (4) U.S. patent application Ser. No. 10/654,108,
"Ambidextrous Mouse," filed on Sep. 2, 2003; (5) U.S. patent
application Ser. No. 10/840,862, "Multipoint Touchscreen," filed on
May 6, 2004; (6) U.S. patent application Ser. No. 10/903,964,
"Gestures For Touch Sensitive Input Devices," filed on Jul. 30,
2004; (7) U.S. patent application Ser. No. 11/038,590, "Mode-Based
Graphical User Interfaces For Touch Sensitive Input Devices" filed
on Jan. 18, 2005; (8) U.S. patent application Ser. No. 11/057,050,
"Display Actuator," filed on Feb. 11, 2005; (9) U.S. patent
application Ser. No. 11/367,749, "Multi-Functional Hand-Held
Device," filed Mar. 3, 2006; (10) U.S. patent application Ser. No.
11/850,011, "Web Clip Widgets on a Portable Multifunction Device,"
filed Sep. 4, 2007; (11) U.S. patent application Ser. No.
11/969,912, "Web-Clip Widgets on a Portable Multifunction Device,"
filed Jan. 6, 2008; (12) U.S. patent application Ser. No.
11/459,602, "Portable Electronic Device with Interface
Reconfiguration Mode," filed Jul. 24, 2006; and (13) U.S. patent
application Ser. No. 11/850,635, "Touch Screen Device, Method, and
Graphical User Interface for Determining Commands by Applying
Heuristics," filed Sep. 5, 2007. All of these applications are
incorporated by reference herein in their entirety.
TECHNICAL FIELD
[0003] The disclosed embodiments relate generally to portable
electronic devices, and more particularly, to user interfaces on
portable multifunction devices with touch-sensitive displays that
include an interface reconfiguration mode and to creating widgets
for displaying specified areas of web pages (i.e., creating
web-clip widgets) on portable multifunction devices.
BACKGROUND
[0004] As portable electronic devices become more compact, and the
number of functions performed by a given device increases, it has
become a significant challenge to design a user interface that
allows users to easily interact with a multifunction device. This
challenge is particularly significant for handheld portable
devices, which have much smaller screens than desktop or laptop
computers. This situation is unfortunate because the user interface
is the gateway through which users receive not only content but
also responses to user actions or behaviors, including user
attempts to access a device's features, tools, and functions. Some
portable communication devices (e.g., mobile telephones, sometimes
called mobile phones, cell phones, cellular telephones, and the
like) have resorted to adding more pushbuttons, increasing the
density of pushbuttons, overloading the functions of pushbuttons,
or using complex menu systems to allow a user to access, store and
manipulate data. These conventional user interfaces often result in
complicated key sequences and menu hierarchies that must be
memorized by the user.
[0005] Many conventional user interfaces, such as those that
include physical pushbuttons, are also inflexible. This may prevent
a user interface from being configured and/or adapted by either an
application running on the portable device or by users. When
coupled with the time consuming requirement to memorize multiple
key sequences and menu hierarchies, and the difficulty in
activating a desired pushbutton, such inflexibility is frustrating
to most users.
[0006] Some conventional user interfaces can be configured by
users, thereby allowing at least partial customization.
Unfortunately, the process of modifying such conventional user
interfaces is often as cumbersome and complicated as the use of the
conventional user interface itself. In particular, the required
behaviors during configuration of such conventional user interfaces
are often counterintuitive and the corresponding indicators guiding
user actions are often difficult to understand. These challenges
are often a source of additional frustration for users.
[0007] Accordingly, there is a need for more transparent and
intuitive user interfaces for portable devices that enable a user
to easily configure the user interface.
[0008] In addition, as a result of the small size of display
screens on portable electronic devices, frequently only a portion
of a web page of interest to a user can be displayed on the screen
at a given time. Furthermore, the scale of display may be too small
for comfortable or practical viewing. Users thus will frequently
need to scroll and to scale a web page to view a portion of
interest each time that they access the web page. However, the
limitations of conventional user interfaces can cause this
scrolling and scaling to be awkward to perform.
[0009] Accordingly, there is a need for portable multifunction
devices with more transparent and intuitive user interfaces for
creating widgets for displaying specified areas of web pages (i.e.,
for creating web-clip widgets) that are easy to use, configure,
and/or adapt. In addition, once the web-clip widgets are created,
there is a need for transparent and intuitive methods for
configuring user interfaces that include icons for activating
web-clip widgets.
SUMMARY
[0010] The above deficiencies and other problems associated with
user interfaces for portable devices are reduced or eliminated by
the disclosed portable multifunction device. In some embodiments,
the device has a touch-sensitive display (also known as a "touch
screen") with a graphical user interface (GUI), one or more
processors, memory and one or more modules, programs or sets of
instructions stored in the memory for performing multiple
functions. In some embodiments, the user interacts with the GUI
primarily through finger contacts and gestures on the
touch-sensitive display. In some embodiments, the functions may
include telephoning, video conferencing, e-mailing, instant
messaging, blogging, digital photographing, digital videoing, web
browsing, digital music playing, and/or digital video playing.
Instructions for performing these functions may be included in a
computer readable storage medium or other computer program product
configured for execution by one or more processors.
[0011] In one aspect of the invention, a computer-implemented
method at a computing device with a touch screen display includes:
displaying a first set of a first plurality of icons in a first
area of the touch screen display, wherein the first plurality of
icons includes a plurality of sets of icons that are separately
displayed in the first area of the touch screen display; displaying
a second plurality of icons in a second area on the touch screen
display, wherein the second area is different from the first area;
detecting a first finger gesture on the touch screen display in the
first area; and in response to detecting the first finger gesture
on the touch screen display in the first area, replacing display of
the first set of the first plurality of icons with display of a
second set of the first plurality of icons in the first area on the
touch screen display while maintaining the display of the second
plurality of icons in the second area on the touch screen
display.
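The following is a minimal sketch, in Swift, of the behavior described in the preceding paragraph: a swipe in the first area replaces the displayed set of icons with another set, while the icons in the second area remain displayed. It is an illustration only, not the patent's implementation; the type HomeScreenModel, the SwipeDirection enum, and the sample icon names are hypothetical.

    struct HomeScreenModel {
        var iconSets: [[String]]   // the first plurality of icons, divided into sets
        var dockIcons: [String]    // the second plurality of icons (second area, always shown)
        var currentSet = 0

        enum SwipeDirection { case left, right }

        // Replace the displayed set in response to a horizontal swipe; the dock is untouched.
        mutating func handleSwipe(_ direction: SwipeDirection) {
            switch direction {
            case .left where currentSet < iconSets.count - 1:
                currentSet += 1
            case .right where currentSet > 0:
                currentSet -= 1
            default:
                break // already at the end of the sequence of sets
            }
        }

        var visibleIcons: [String] { iconSets[currentSet] }
    }

    var home = HomeScreenModel(iconSets: [["Phone", "Mail"], ["Stocks", "Weather"]],
                               dockIcons: ["Safari", "Music"])
    home.handleSwipe(.left)
    print(home.visibleIcons, home.dockIcons)   // ["Stocks", "Weather"] ["Safari", "Music"]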
[0012] In another aspect of the invention, a computer-implemented
method at a computing device with a touch screen display includes:
displaying a first set of a first plurality of icons in a first
area of the touch screen display, wherein the first plurality of
icons includes a plurality of sets of icons that are separately
displayed in the first area of the touch screen display; displaying
a second plurality of icons in a second area on the touch screen
display, wherein the second area is different from the first area;
detecting a first finger gesture on the touch screen display; in
response to detecting the first finger gesture, initiating a user
interface reconfiguration process, and varying positions of one or
more icons in the first set of the first plurality of icons about
respective average positions.
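A minimal sketch, again in Swift with hypothetical names, of varying the positions of icons about respective average positions once the reconfiguration process begins. The sinusoidal offset and the amplitude value are assumptions chosen for illustration; the patent does not specify how the variation is computed.

    import Foundation

    struct Icon {
        var averagePosition: (x: Double, y: Double)   // the respective average (home) position
        var amplitude: Double = 2.0                   // points of excursion about that position
    }

    // Offset of an icon from its average position at time t, producing the oscillation.
    func jiggleOffset(for icon: Icon, at t: Double, phase: Double = 0) -> (dx: Double, dy: Double) {
        (dx: icon.amplitude * sin(2 * Double.pi * t + phase),
         dy: icon.amplitude * cos(2 * Double.pi * t + phase))
    }

    let weather = Icon(averagePosition: (x: 60, y: 120))
    let offset = jiggleOffset(for: weather, at: 0.25)
    print("display at", weather.averagePosition.x + offset.dx,
          weather.averagePosition.y + offset.dy)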
[0013] Thus, interface reconfiguration in accordance with the
disclosed embodiments allows a user to reposition displayed icons
(e.g., icons for activating applications and/or web-clip widgets)
in a simple, intuitive manner with finger gestures.
BRIEF DESCRIPTION OF THE DRAWINGS
[0014] For a better understanding of the aforementioned embodiments
of the invention as well as additional embodiments thereof,
reference should be made to the Description of Embodiments below,
in conjunction with the following drawings in which like reference
numerals refer to corresponding parts throughout the figures.
[0015] FIGS. 1A and 1B are block diagrams illustrating a portable
multifunction device with a touch-sensitive display in accordance
with some embodiments.
[0016] FIG. 2 illustrates a portable multifunction device having a
touch screen in accordance with some embodiments.
[0017] FIG. 3 illustrates an exemplary user interface for unlocking
a portable electronic device in accordance with some
embodiments.
[0018] FIGS. 4A-4B illustrate exemplary user interfaces having
menus of applications and/or widgets on a portable multifunction
device in accordance with some embodiments.
[0019] FIG. 4C illustrates an exemplary user interface having a
list of user-created widgets on a portable multifunction device in
accordance with some embodiments.
[0020] FIGS. 5A-5K illustrate an exemplary user interface for a
browser in accordance with some embodiments.
[0021] FIGS. 5L and 5M illustrate exemplary user interfaces for
displaying web-clip widgets in accordance with some
embodiments.
[0022] FIGS. 6A-6D illustrate an animation for creating and
displaying an icon corresponding to a web-clip widget in accordance
with some embodiments.
[0023] FIG. 6E illustrates an exemplary user interface for
activating a web-clip widget in accordance with some
embodiments.
[0024] FIGS. 7A-7E are flow diagrams illustrating processes for
creating and using a web-clip widget in accordance with some
embodiments.
[0025] FIGS. 7F-7H are flow diagrams illustrating processes for
displaying web-clip widgets in accordance with some
embodiments.
[0026] FIGS. 8A-8D illustrate exemplary user interfaces for
displaying icons in accordance with some embodiments.
[0027] FIGS. 9A and 9B are flow diagrams of an icon display process
in accordance with some embodiments.
[0028] FIG. 10 is a flow diagram of a position adjustment process
for a portable multifunction device in accordance with some
embodiments.
[0029] FIGS. 11A-11OO illustrate exemplary user interfaces during
interface reconfiguration in accordance with some embodiments.
[0030] FIGS. 12A-12F are flow diagrams of icon reconfiguration
processes in accordance with some embodiments.
DESCRIPTION OF EMBODIMENTS
[0031] Reference will now be made in detail to embodiments,
examples of which are illustrated in the accompanying drawings. In
the following detailed description, numerous specific details are
set forth in order to provide a thorough understanding of the
present invention. However, it will be apparent to one of ordinary
skill in the art that the present invention may be practiced
without these specific details. In other instances, well-known
methods, procedures, components, circuits, and networks have not
been described in detail so as not to unnecessarily obscure aspects
of the embodiments.
[0032] It will also be understood that, although the terms first,
second, etc. may be used herein to describe various elements, these
elements should not be limited by these terms. These terms are only
used to distinguish one element from another. For example, a first
gesture could be termed a second gesture, and, similarly, a second
gesture could be termed a first gesture, without departing from the
scope of the present invention.
[0033] The terminology used in the description of the invention
herein is for the purpose of describing particular embodiments only
and is not intended to be limiting of the invention. As used in the
description of the invention and the appended claims, the singular
forms "a", "an" and "the" are intended to include the plural forms
as well, unless the context clearly indicates otherwise. It will
also be understood that the term "and/or" as used herein refers to
and encompasses any and all possible combinations of one or more of
the associated listed items. It will be further understood that the
terms "comprises" and/or "comprising," when used in this
specification, specify the presence of stated features, integers,
steps, operations, elements, and/or components, but do not preclude
the presence or addition of one or more other features, integers,
steps, operations, elements, components, and/or groups thereof.
[0034] As used herein, the term "if' may be construed to mean
"when" or "upon" or in response to determining" or "in response to
detecting," depending on the context. Similarly, the phrase "if it
is determined" or "if [a stated condition or event] is detected"
may be construed to mean "upon determining" or "in response to
determining" or "upon detecting [the stated condition or event]" or
"in response to detecting [the stated condition or event],"
depending on the context.
[0035] Embodiments of a portable multifunction device, user
interfaces for such devices, and associated processes for using
such devices are described. In some embodiments, the device is a
portable communications device such as a mobile telephone that also
contains other functions, such as PDA and/or music player
functions.
[0036] The user interface may include a physical click wheel in
addition to a touch screen or a virtual click wheel displayed on
the touch screen. A click wheel is a user-interface device that may
provide navigation commands based on an angular displacement of the
wheel or a point of contact with the wheel by a user of the device.
A click wheel may also be used to provide a user command
corresponding to selection of one or more items, for example, when
the user of the device presses down on at least a portion of the
wheel or the center of the wheel. Alternatively, breaking contact
with a click wheel image on a touch screen surface may indicate a
user command corresponding to selection. For simplicity, in the
discussion that follows, a portable multifunction device that
includes a touch screen is used as an exemplary embodiment. It
should be understood, however, that some of the user interfaces and
associated processes may be applied to other devices, such as
personal computers and laptop computers, which may include one or
more other physical user-interface devices, such as a physical
click wheel, a physical keyboard, a mouse and/or a joystick.
[0037] The device supports a variety of applications, such as one
or more of the following: a telephone application, a video
conferencing application, an e-mail application, an instant
messaging application, a blogging application, a photo management
application, a digital camera application, a digital video camera
application, a web browsing application, a digital music player
application, and/or a digital video player application.
[0038] The various applications that may be executed on the device
may use at least one common physical user-interface device, such as
the touch screen. One or more functions of the touch screen as well
as corresponding information displayed on the device may be
adjusted and/or varied from one application to the next and/or
within a respective application. In this way, a common physical
architecture (such as the touch screen) of the device may support
the variety of applications with user interfaces that are intuitive
and transparent.
[0039] The user interfaces may include one or more soft keyboard
embodiments. The soft keyboard embodiments may include standard
(QWERTY) and/or non-standard configurations of symbols on the
displayed icons of the keyboard, such as those described in U.S.
patent application Ser. No. 11/459,606, "Keyboards For Portable
Electronic Devices," filed Jul. 24, 2006, and Ser. No. 11/459,615,
"Touch Screen Keyboards For Portable Electronic Devices," filed
Jul. 24, 2006, the contents of which are hereby incorporated by
reference in their entirety. The keyboard embodiments may include a
reduced number of icons (or soft keys) relative to the number of
keys in existing physical keyboards, such as that for a typewriter.
This may make it easier for users to select one or more icons in
the keyboard, and thus, one or more corresponding symbols. The
keyboard embodiments may be adaptive. For example, displayed icons
may be modified in accordance with user actions, such as selecting
one or more icons and/or one or more corresponding symbols. One or
more applications on the portable device may utilize common and/or
different keyboard embodiments. Thus, the keyboard embodiment used
may be tailored to at least some of the applications. In some
embodiments, one or more keyboard embodiments may be tailored to a
respective user. For example, one or more keyboard embodiments may
be tailored to a respective user based on a word usage history
(lexicography, slang, individual usage) of the respective user.
Some of the keyboard embodiments may be adjusted to reduce a
probability of a user error when selecting one or more icons, and
thus one or more symbols, when using the soft keyboard
embodiments.
[0040] Attention is now directed towards embodiments of the device.
FIGS. 1A and 1B are block diagrams illustrating portable
multifunction devices 100 with touch-sensitive displays 112 in
accordance with some embodiments. The touch-sensitive display 112
is sometimes called a "touch screen" for convenience, and may also
be known as or called a touch-sensitive display system. The device
100 may include a memory 102 (which may include one or more
computer readable storage mediums), a memory controller 122, one or
more processing units (CPU's) 120, a peripherals interface 118, RF
circuitry 108, audio circuitry 110, a speaker 111, a microphone
113, an input/output (I/O) subsystem 106, other input or control
devices 116, and an external port 124. The device 100 may include
one or more optical sensors 164. These components may communicate
over one or more communication buses or signal lines 103.
[0041] It should be appreciated that the device 100 is only one
example of a portable multifunction device 100, and that the device
100 may have more or fewer components than shown, may combine two
or more components, or may have a different configuration or
arrangement of the components. The various components shown in
FIGS. 1A and 1B may be implemented in hardware, software or a
combination of both hardware and software, including one or more
signal processing and/or application specific integrated
circuits.
[0042] Memory 102 may include high-speed random access memory and
may also include non-volatile memory, such as one or more magnetic
disk storage devices, flash memory devices, or other non-volatile
solid-state memory devices. Access to memory 102 by other
components of the device 100, such as the CPU 120 and the
peripherals interface 118, may be controlled by the memory
controller 122.
[0043] The peripherals interface 118 couples the input and output
peripherals of the device to the CPU 120 and memory 102. The one or
more processors 120 run or execute various software programs and/or
sets of instructions stored in memory 102 to perform various
functions for the device 100 and to process data.
[0044] In some embodiments, the peripherals interface 118, the CPU
120, and the memory controller 122 may be implemented on a single
chip, such as a chip 104. In some other embodiments, they may be
implemented on separate chips.
[0045] The RF (radio frequency) circuitry 108 receives and sends RF
signals, also called electromagnetic signals. The RF circuitry 108
converts electrical signals to/from electromagnetic signals and
communicates with communications networks and other communications
devices via the electromagnetic signals. The RF circuitry 108 may
include well-known circuitry for performing these functions,
including but not limited to an antenna system, an RF transceiver,
one or more amplifiers, a tuner, one or more oscillators, a digital
signal processor, a CODEC chipset, a subscriber identity module
(SIM) card, memory, and so forth. The RF circuitry 108 may
communicate with networks, such as the Internet, also referred to
as the World Wide Web (WWW), an intranet and/or a wireless network,
such as a cellular telephone network, a wireless local area network
(LAN) and/or a metropolitan area network (MAN), and other devices
by wireless communication. The wireless communication may use any
of a plurality of communications standards, protocols and
technologies, including but not limited to Global System for Mobile
Communications (GSM), Enhanced Data GSM Environment (EDGE),
high-speed downlink packet access (HSDPA), wideband code division
multiple access (W-CDMA), code division multiple access (CDMA),
time division multiple access (TDMA), Bluetooth, Wireless Fidelity
(Wi-Fi) (e.g., IEEE 802.11a, IEEE 802.11b, IEEE 802.11g and/or IEEE
802.11n), voice over Internet Protocol (VoIP), Wi-MAX, a protocol
for email (e.g., Internet message access protocol (IMAP) and/or
post office protocol (POP)), instant messaging (e.g., extensible
messaging and presence protocol (XMPP), Session Initiation Protocol
for Instant Messaging and Presence Leveraging Extensions (SIMPLE),
and/or Instant Messaging and Presence Service (IMPS)), and/or Short
Message Service (SMS), or any other suitable communication
protocol, including communication protocols not yet developed as of
the filing date of this document.
[0046] The audio circuitry 110, the speaker 111, and the microphone
113 provide an audio interface between a user and the device 100.
The audio circuitry 110 receives audio data from the peripherals
interface 118, converts the audio data to an electrical signal, and
transmits the electrical signal to the speaker 111. The speaker 111
converts the electrical signal to human-audible sound waves. The
audio circuitry 110 also receives electrical signals converted by
the microphone 113 from sound waves. The audio circuitry 110
converts the electrical signal to audio data and transmits the
audio data to the peripherals interface 118 for processing. Audio
data may be retrieved from and/or transmitted to memory 102 and/or
the RF circuitry 108 by the peripherals interface 118. In some
embodiments, the audio circuitry 110 also includes a headset jack
(e.g. 212, FIG. 2). The headset jack provides an interface between
the audio circuitry 110 and removable audio input/output
peripherals, such as output-only headphones or a headset with both
output (e.g., a headphone for one or both ears) and input (e.g., a
microphone).
[0047] The I/O subsystem 106 couples input/output peripherals on
the device 100, such as the touch screen 112 and other
input/control devices 116, to the peripherals interface 118. The
I/O subsystem 106 may include a display controller 156 and one or
more input controllers 160 for other input or control devices. The
one or more input controllers 160 receive/send electrical signals
from/to other input or control devices 116. The other input/control
devices 116 may include physical buttons (e.g., push buttons,
rocker buttons, etc.), dials, slider switches, joysticks, click
wheels, and so forth. In some alternate embodiments, input
controller(s) 160 may be coupled to any (or none) of the following:
a keyboard, infrared port, USB port, and a pointer device such as a
mouse. The one or more buttons (e.g., 208, FIG. 2) may include an
up/down button for volume control of the speaker 111 and/or the
microphone 113. The one or more buttons may include a push button
(e.g., 206, FIG. 2). A quick press of the push button may disengage
a lock of the touch screen 112 or begin a process that uses
gestures on the touch screen to unlock the device, as described in
U.S. patent application Ser. No. 11/322,549, "Unlocking a Device by
Performing Gestures on an Unlock Image," filed Dec. 23, 2005, which
is hereby incorporated by reference in its entirety. A longer press
of the push button (e.g., 206) may turn power to the device 100 on
or off. The user may be able to customize a functionality of one or
more of the buttons. The touch screen 112 is used to implement
virtual or soft buttons and one or more soft keyboards.
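A minimal sketch, assuming a simple duration threshold, of distinguishing the quick press of the push button (which may disengage the lock or begin the unlock-gesture process) from the longer press (which may turn device power on or off) described above. The 1.5-second threshold and the names are illustrative, not values taken from the patent.

    import Foundation

    enum ButtonAction { case beginUnlock, togglePower }

    // Map a measured press duration to one of the two behaviors described above.
    func action(forPressDuration duration: TimeInterval,
                longPressThreshold: TimeInterval = 1.5) -> ButtonAction {
        duration < longPressThreshold ? .beginUnlock : .togglePower
    }

    print(action(forPressDuration: 0.2))  // beginUnlock
    print(action(forPressDuration: 3.0))  // togglePower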
[0048] The touch-sensitive touch screen 112 provides an input
interface and an output interface between the device and a user.
The display controller 156 receives and/or sends electrical signals
from/to the touch screen 112. The touch screen 112 displays visual
output to the user. The visual output may include graphics, text,
icons, video, and any combination thereof (collectively termed
"graphics"). In some embodiments, some or all of the visual output
may correspond to user-interface objects, further details of which
are described below.
[0049] A touch screen 112 has a touch-sensitive surface, sensor or
set of sensors that accepts input from the user based on haptic
and/or tactile contact. The touch screen 112 and the display
controller 156 (along with any associated modules and/or sets of
instructions in memory 102) detect contact (and any movement or
breaking of the contact) on the touch screen 112 and converts the
detected contact into interaction with user-interface objects
(e.g., one or more soft keys, icons, web pages or images) that are
displayed on the touch screen. In an exemplary embodiment, a point
of contact between a touch screen 112 and the user corresponds to a
finger of the user.
[0050] The touch screen 112 may use LCD (liquid crystal display)
technology, or LPD (light emitting polymer display) technology,
although other display technologies may be used in other
embodiments. The touch screen 112 and the display controller 156
may detect contact and any movement or breaking thereof using any
of a plurality of touch sensing technologies now known or later
developed, including but not limited to capacitive, resistive,
infrared, and surface acoustic wave technologies, as well as other
proximity sensor arrays or other elements for determining one or
more points of contact with a touch screen 112.
[0051] A touch-sensitive display in some embodiments of the touch
screen 112 may be analogous to the multi-touch sensitive tablets
described in the following U.S. Pat. No. 6,323,846 (Westerman et
al.), U.S. Pat. No. 6,570,557 (Westerman et al.), and/or U.S. Pat.
No. 6,677,932 (Westerman), and/or U.S. Patent Publication
2002/0015024A1, each of which is hereby incorporated by reference
in its entirety. However, a touch screen 112 displays visual output
from the portable device 100, whereas touch sensitive tablets do
not provide visual output.
[0052] A touch-sensitive display in some embodiments of the touch
screen 112 may be as described in the following applications: (1)
U.S. patent application Ser. No. 11/381,313, "Multipoint Touch
Surface Controller," filed May 2, 2006; (2) U.S. patent application
Ser. No. 10/840,862, "Multipoint Touchscreen," filed May 6, 2004;
(3) U.S. patent application Ser. No. 10/903,964, "Gestures For
Touch Sensitive Input Devices," filed Jul. 30, 2004; (4) U.S.
patent application Ser. No. 11/048,264, "Gestures For Touch
Sensitive Input Devices," filed Jan. 31, 2005; (5) U.S. patent
application Ser. No. 11/038,590, "Mode-Based Graphical User
Interfaces For Touch Sensitive Input Devices," tiled Jan. 18, 2005;
(6) U.S. patent application Ser. No. 11/228,758, "Virtual Input
Device Placement On A Touch Screen User Interface," filed Sep. 16,
2005; (7) U.S. patent application Ser. No. 11/228,700, "Operation
Of A Computer With A Touch Screen Interface," filed Sep. 16, 2005;
(8) U.S. patent application Ser. No. 11/228,737, "Activating
Virtual Keys Of A Touch-Screen Virtual Keyboard," filed Sep. 16,
2005; and (9) U.S. patent application Ser. No. 11/367,749,
"Multi-Functional Hand-Held Device," filed Mar. 3, 2006. All of
these applications are incorporated by reference in their entirety
herein.
[0053] The touch screen 112 may have a resolution in excess of 100
dpi. In an exemplary embodiment, the touch screen has a resolution
of approximately 160 dpi. The user may make contact with the touch
screen 112 using any suitable object or appendage, such as a
stylus, a finger, and so forth. In some embodiments, the user
interface is designed to work primarily with finger-based contacts
and gestures, which are much less precise than stylus-based input
due to the larger area of contact of a finger on the touch screen.
In some embodiments, the device translates the rough finger-based
input into a precise pointer/cursor position or command for
performing the actions desired by the user.
[0054] In some embodiments, in addition to the touch screen, the
device 100 may include a touchpad (not shown) for activating or
deactivating particular functions. In some embodiments, the
touchpad is a touch-sensitive area of the device that, unlike the
touch screen, does not display visual output. The touchpad may be a
touch-sensitive surface that is separate from the touch screen 112
or an extension of the touch-sensitive surface formed by the touch
screen.
[0055] In some embodiments, the device 100 may include a physical
or virtual click wheel as an input control device 116. A user may
navigate among and interact with one or more graphical objects
(henceforth referred to as icons) displayed in the touch screen 112
by rotating the click wheel or by moving a point of contact with
the click wheel (e.g., where the amount of movement of the point of
contact is measured by its angular displacement with respect to a
center point of the click wheel). The click wheel may also be used
to select one or more of the displayed icons. For example, the user
may press down on at least a portion of the click wheel or an
associated button. User commands and navigation commands provided
by the user via the click wheel may be processed by an input
controller 160 as well as one or more of the modules and/or sets of
instructions in memory 102. For a virtual click wheel, the click
wheel and click wheel controller may be part of the touch screen
112 and the display controller 156, respectively. For a virtual
click wheel, the click wheel may be either an opaque or
semitransparent object that appears and disappears on the touch
screen display in response to user interaction with the device. In
some embodiments, a virtual click wheel is displayed on the touch
screen of a portable multifunction device and operated by user
contact with the touch screen.
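A minimal sketch of measuring the amount of movement of a point of contact by its angular displacement with respect to the center point of the click wheel, as described above. The helper name and the normalization step are assumptions made for the example.

    import Foundation

    // Angular displacement (radians) of a contact point moving from p0 to p1 about the wheel center.
    func angularDisplacement(center: (x: Double, y: Double),
                             from p0: (x: Double, y: Double),
                             to p1: (x: Double, y: Double)) -> Double {
        let a0 = atan2(p0.y - center.y, p0.x - center.x)
        let a1 = atan2(p1.y - center.y, p1.x - center.x)
        var delta = a1 - a0
        // Normalize to (-pi, pi] so a small move across the seam is not read as a full turn.
        if delta > Double.pi { delta -= 2 * Double.pi }
        if delta <= -Double.pi { delta += 2 * Double.pi }
        return delta // the sign gives the direction of rotation
    }

    let quarterTurn = angularDisplacement(center: (x: 0, y: 0),
                                          from: (x: 1, y: 0),
                                          to: (x: 0, y: 1))
    print(quarterTurn) // ~1.5708 radians, about a quarter turn counterclockwise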
[0056] The device 100 also includes a power system 162 for powering
the various components. The power system 162 may include a power
management system, one or more power sources (e.g., battery,
alternating current (AC)), a recharging system, a power failure
detection circuit, a power converter or inverter, a power status
indicator (e.g., a light-emitting diode (LED)) and any other
components associated with the generation, management and
distribution of power in portable devices.
[0057] The device 100 may also include one or more optical sensors
164. FIGS. 1A and 1B show an optical sensor coupled to an optical
sensor controller 158 in I/O subsystem 106. The optical sensor 164
may include charge-coupled device (CCD) or complementary
metal-oxide semiconductor (CMOS) phototransistors. The optical
sensor 164 receives light from the environment, projected through
one or more lenses, and converts the light to data representing an
image. In conjunction with an imaging module 143 (also called a
camera module), the optical sensor 164 may capture still images or
video. In some embodiments, an optical sensor is located on the
back of the device 100, opposite the touch screen display 112 on
the front of the device, so that the touch screen display may be
used as a viewfinder for either still and/or video image
acquisition. In some embodiments, an optical sensor is located on
the front of the device so that the user's image may be obtained
for videoconferencing while the user views the other video
conference participants on the touch screen display. In some
embodiments, the position of the optical sensor 164 can be changed
by the user (e.g., by rotating the lens and the sensor in the
device housing) so that a single optical sensor 164 may be used
along with the touch screen display for both video conferencing and
still and/or video image acquisition.
[0058] The device 100 may also include one or more proximity
sensors 166. FIGS. 1A and 1B show a proximity sensor 166 coupled to
the peripherals interface 118. Alternately, the proximity sensor
166 may be coupled to an input controller 160 in the I/O subsystem
106. The proximity sensor 166 may perform as described in U.S.
patent application Ser. No. 11/241,839, "Proximity Detector In
Handheld Device," Sep. 30, 2005; Ser. No. 11/240,788, "Proximity
Detector In Handheld Device," Sep. 30, 2005; Ser. No. 11/620,702,
"Using Ambient Light Sensor To Augment Proximity Sensor Output";
Ser. No. 11/586,862, "Automated Response To And Sensing Of User
Activity In Portable Devices." filed Oct. 24, 2006; and Ser. No.
11/638,251, "Methods And Systems For Automatic Configuration Of
Peripherals," which are hereby incorporated by reference herein in
their entirety. In some embodiments, the proximity sensor turns off
and disables the touch screen 112 when the multifunction device is
placed near the user's ear (e.g., when the user is making a phone
call). In some embodiments, the proximity sensor keeps the screen
off when the device is in the user's pocket, purse, or other dark
area to prevent unnecessary battery drainage when the device is in a
locked state.
[0059] The device 100 may also include one or more accelerometers
168. FIGS. 1A and 1B show an accelerometer 168 coupled to the
peripherals interface 118. Alternately, the accelerometer 168 may
be coupled to an input controller 160 in the I/O subsystem 106. The
accelerometer 168 may perform as described in U.S. Patent
Publication No. 20050190059, "Acceleration-based Theft Detection
System for Portable Electronic Devices," and U.S. Patent
Publication No. 20060017692, "Methods And Apparatuses For Operating
A Portable Device Based On An Accelerometer," both of which are
which are incorporated herein by reference in their entirety. In
some embodiments, information is displayed on the touch screen
display in a portrait view or a landscape view based on an analysis
of data received from the one or more accelerometers.
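A minimal sketch, assuming raw gravity components are available from the one or more accelerometers, of choosing between a portrait and a landscape presentation. The dominant-axis rule and the names are illustrative only; the patent does not describe the analysis itself.

    enum Orientation { case portrait, landscape }

    // Pick the view orientation from which gravity axis dominates in the device frame.
    func orientation(gravityX x: Double, gravityY y: Double) -> Orientation {
        abs(y) >= abs(x) ? .portrait : .landscape
    }

    print(orientation(gravityX: -0.1, gravityY: -0.98)) // portrait (device held upright)
    print(orientation(gravityX: 0.97, gravityY: -0.05)) // landscape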
[0060] In some embodiments, the software components stored in
memory 102 may include an operating system 126, a communication
module (or set of instructions) 128, a contact/motion module (or
set of instructions) 130, a graphics module (or set of
instructions) 132, a text input module (or set of instructions)
134, a Global Positioning System (GPS) module (or set of
instructions) 135, and applications (or set of instructions)
136.
[0061] The operating system 126 (e.g., Darwin, RTXC, LINUX, UNIX,
OS X, WINDOWS, or an embedded operating system such as VxWorks)
includes various software components and/or drivers for controlling
and managing general system tasks (e.g., memory management, storage
device control, power management, etc.) and facilitates
communication between various hardware and software components.
[0062] The communication module 128 facilitates communication with
other devices over one or more external ports 124 and also includes
various software components for handling data received by the RF
circuitry 108 and/or the external port 124. The external port 124
(e.g., Universal Serial Bus (USB), FIREWIRE, etc.) is adapted for
coupling directly to other devices or indirectly over a network
(e.g., the Internet, wireless LAN, etc.). In some embodiments, the
external port is a multi-pin (e.g., 30-pin) connector that is the
same as, or similar to and/or compatible with the 30-pin connector
used on iPod (trademark of Apple Computer, Inc.) devices.
[0063] The contact/motion module 130 may detect contact with the
touch screen 112 (in conjunction with the display controller 156)
and other touch sensitive devices (e.g., a touchpad or physical
click wheel). The contact/motion module 130 includes various
software components for performing various operations related to
detection of contact, such as determining if contact has occurred,
determining if there is movement of the contact and tracking the
movement across the touch screen 112, and determining if the
contact has been broken (i.e., if the contact has ceased).
Determining movement of the point of contact may include
determining speed (magnitude), velocity (magnitude and direction),
and/or an acceleration (a change in magnitude and/or direction) of
the point of contact. These operations may be applied to single
contacts (e.g., one finger contacts) or to multiple simultaneous
contacts (e.g., "multitouch"/multiple finger contacts). In some
embodiments, the contact/motion module 130 and the display
controller 156 also detect contact on a touchpad. In some
embodiments, the contact/motion module 130 and the controller 160
detect contact on a click wheel.
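A minimal sketch of one of the operations attributed to the contact/motion module 130 above: determining velocity (magnitude and direction) and speed of a point of contact from two successive touch samples. TouchSample and the small guard value for the time interval are hypothetical.

    import Foundation

    struct TouchSample {
        var x: Double
        var y: Double
        var timestamp: TimeInterval
    }

    // Velocity in points per second between two samples; speed is its magnitude.
    func velocity(from a: TouchSample, to b: TouchSample) -> (vx: Double, vy: Double, speed: Double) {
        let dt = max(b.timestamp - a.timestamp, 0.0001) // guard against a zero time interval
        let vx = (b.x - a.x) / dt
        let vy = (b.y - a.y) / dt
        return (vx: vx, vy: vy, speed: (vx * vx + vy * vy).squareRoot())
    }

    let v = velocity(from: TouchSample(x: 10, y: 300, timestamp: 0.00),
                     to: TouchSample(x: 90, y: 300, timestamp: 0.05))
    print(v.speed) // 1600.0 points per second, moving to the right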
[0064] The graphics module 132 includes various known software
components for rendering and displaying graphics on the touch
screen 112, including components for changing the intensity of
graphics that are displayed. As used herein, the term "graphics"
includes any object that can be displayed to a user, including
without limitation text, web pages, icons (such as user-interface
objects including soft keys), digital images, videos, animations
and the like. An animation in this context is a display of a
sequence of images that gives the appearance of movement, and
informs the user of an action that has been performed (such as
expanding a user-selected web-page portion to fill a browser
window). In this context, a respective animation that executes an
action, or confirms an action by the user of the device, typically
takes a predefined, finite amount of time, typically between 0.2
and 1.0 seconds, and generally less than two seconds.
[0065] The text input module 134, which may be a component of
graphics module 132, provides soft keyboards for entering text in
various applications (e.g., contacts 137, e-mail 140, IM 141,
blogging 142, browser 147, and any other application that needs
text input).
[0066] The GPS module 135 determines the location of the device and
provides this information for use in various applications (e.g., to
telephone 138 for use in location-based dialing, to camera 143
and/or blogger 142 as picture/video metadata, and to applications
that provide location-based services such as weather widgets, local
yellow page widgets, and map/navigation widgets).
[0067] The applications 136 may include the following modules (or
sets of instructions), or a subset or superset thereof: [0068] a
contacts module 137 (sometimes called an address book or contact
list); [0069] a telephone module 138; [0070] a video conferencing
module 139; [0071] an e-mail client module 140; [0072] an instant
messaging (IM) module 141; [0073] a blogging module 142; [0074] a
camera module 143 for still and/or video images; [0075] an image
management module 144; [0076] a video player module 145; [0077] a
music player module 146; [0078] a browser module 147; [0079] a
calendar module 148; [0080] widget modules 149, which may include
weather widget 149-1, stocks widget 149-2, calculator widget 149-3,
alarm clock widget 149-4, dictionary widget 149-5, and other
widgets obtained by the user, as well as user-created widgets
149-6; [0081] widget creator module 150 for making user-created
widgets 149-6; [0082] search module 151; [0083] video and music
player module 152, which merges video player module 145 and music
player module 146; [0084] notes module 153; [0085] map module 154;
and/or [0086] online video module 155.
[0087] Examples of other applications 136 that may be stored in
memory 102 include other word processing applications, JAVA-enabled
applications, encryption, digital rights management, voice
recognition, and voice replication.
[0088] In conjunction with touch screen 112, display controller
156, contact module 130, graphics module 132, and text input module
134, the contacts module 137 may be used to manage an address book
or contact list, including: adding name(s) to the address book;
deleting name(s) from the address book; associating telephone
number(s), e-mail address(es), physical address(es) or other
information with a name; associating an image with a name;
categorizing and sorting names; providing telephone numbers or
e-mail addresses to initiate and/or facilitate communications by
telephone 138, video conference 139, e-mail 140, or IM 141; and so
forth.
[0089] In conjunction with RF circuitry 108, audio circuitry 110,
speaker 111, microphone 113, touch screen 112, display controller
156, contact module 130, graphics module 132, and text input module
134, the telephone module 138 may be used to enter a sequence of
characters corresponding to a telephone number, access one or more
telephone numbers in the address book 137, modify a telephone
number that has been entered, dial a respective telephone number,
conduct a conversation and disconnect or hang up when the
conversation is completed. As noted above, the wireless
communication may use any of a plurality of communications
standards, protocols and technologies.
[0090] In conjunction with RF circuitry 108, audio circuitry 110,
speaker 111, microphone 113, touch screen 112, display controller
156, optical sensor 164, optical sensor controller 158, contact
module 130, graphics module 132, text input module 134, contact
list 137, and telephone module 138, the videoconferencing module
139 may be used to initiate, conduct, and terminate a video
conference between a user and one or more other participants.
[0091] In conjunction with RF circuitry 108, touch screen 112,
display controller 156, contact module 130, graphics module 132,
and text input module 134, the e-mail client module 140 may be used
to create, send, receive, and manage e-mail. In conjunction with
image management module 144, the e-mail module 140 makes it very
easy to create and send e-mails with still or video images taken
with camera module 143.
[0092] In conjunction with RF circuitry 108, touch screen 112,
display controller 156, contact module 130, graphics module 132,
and text input module 134, the instant messaging module 141 may be
used to enter a sequence of characters corresponding to an instant
message, to modify previously entered characters, to transmit a
respective instant message (for example, using a Short Message
Service (SMS) or Multimedia Message Service (MMS) protocol for
telephony-based instant messages or using XMPP, SIMPLE, or IMPS for
Internet-based instant messages), to receive instant messages and
to view received instant messages. In some embodiments, transmitted
and/or received instant messages may include graphics, photos,
audio files, video files and/or other attachments as are supported
in an MMS and/or an Enhanced Messaging Service (EMS). As used
herein, "instant messaging" refers to both telephony-based messages
(e.g., messages sent using SMS or MMS) and Internet-based messages
(e.g., messages sent using XMPP, SIMPLE, or IMPS).
[0093] In conjunction with RF circuitry 108, touch screen 112,
display controller 156, contact module 130, graphics module 132,
text input module 134, image management module 144, and browser
module 147, the blogging module 142 may be used to send text, still
images, video, and/or other graphics to a blog (e.g., the user's
blog).
[0094] In conjunction with touch screen 112, display controller
156, optical sensor(s) 164, optical sensor controller 158, contact
module 130, graphics module 132, and image management module 144,
the camera module 143 may be used to capture still images or video
(including a video stream) and store them into memory 102, modify
characteristics of a still image or video, or delete a still image
or video from memory 102.
[0095] In conjunction with touch screen 112, display controller
156, contact module 130, graphics module 132, text input module
134, and camera module 143, the image management module 144 may be
used to arrange, modify or otherwise manipulate, label, delete,
present (e.g., in a digital slide show or album), and store still
and/or video images.
[0096] In conjunction with touch screen 112, display controller
156, contact module 130, graphics module 132, audio circuitry 110,
and speaker 111, the video player module 145 may be used to
display, present or otherwise play back videos (e.g., on the touch
screen or on an external, connected display via external port
124).
[0097] In conjunction with touch screen 112, display system
controller 156, contact module 130, graphics module 132, audio
circuitry 110, speaker 111, RF circuitry 108, and browser module
147, the music player module 146 allows the user to download and
play back recorded music and other sound files stored in one or
more file formats, such as MP3 or AAC files. In some embodiments,
the device 100 may include the functionality of an MP3 player, such
as an iPod (trademark of Apple Computer, Inc.).
[0098] In conjunction with RF circuitry 108, touch screen 112,
display system controller 156, contact module 130, graphics module
132, and text input module 134, the browser module 147 may be used
to browse the Internet, including searching, linking to, receiving,
and displaying web pages or portions thereof, as well as
attachments and other files linked to web pages. Embodiments of
user interfaces and associated processes using browser module 147
are described further below.
[0099] In conjunction with RF circuitry 108, touch screen 112,
display system controller 156, contact module 130, graphics module
132, text input module 134, e-mail module 140, and browser module
147, the calendar module 148 may be used to create, display,
modify, and store calendars and data associated with calendars
(e.g., calendar entries, to do lists, etc.).
[0100] In conjunction with RF circuitry 108, touch screen 112,
display system controller 156, contact module 130, graphics module
132, text input module 134, and browser module 147, the widget
modules 149 are mini-applications that may be downloaded and used
by a user (e.g., weather widget 149-1, stocks widget 149-2,
calculator widget 149-3, alarm clock widget 149-4, and dictionary
widget 149-5) or created by the user (e.g., user-created widget
149-6). In some embodiments, a widget includes an HTML (Hypertext
Markup Language) file, a CSS (Cascading Style Sheets) file, and a
JavaScript file. In some embodiments, a widget includes an XML
(Extensible Markup Language) file and a JavaScript file (e.g.,
Yahoo! Widgets). Embodiments of user interfaces and associated
processes using widget modules 149 are described further below.
[0101] In conjunction with RF circuitry 108, touch screen 112,
display system controller 156, contact module 130, graphics module
132, text input module 134, and browser module 147, the widget
creator module 150 may be used by a user to create widgets (e.g.,
turning a user-specified portion of a web page into a web-clip
widget). In some embodiments, a web-clip widget comprises a file
containing an XML property list that includes a URL for the web
page and data indicating the user-specified portion of the web
page. In some embodiments, the data indicating the user-specified
portion of the web page includes a reference point and a scale
factor. In some embodiments, the data indicating the user-specified
portion of the web page includes a set of coordinates within the
web page or an identification of a structural element within the
web page. Alternatively, in some embodiments a web-clip widget
includes an HTML (Hypertext Markup Language) file, a CSS (Cascading
Style Sheets) file, and a JavaScript file. Alternatively, in some
embodiments a web-clip widget includes an XML (Extensible Markup
Language) file and a JavaScript file.
[0102] In some embodiments a web-clip widget includes an image file
(e.g., a png file) of an icon corresponding to the widget. In some
embodiments, a web-clip widget corresponds to a folder containing
the image file and a file that includes the URL for the web page
and data indicating the user-specified portion of the web page. In
some embodiments, a web-clip widget corresponds to a folder
containing the image file and an executable script.
[0103] Embodiments of user interfaces and associated processes
using widget creator module 150 are described further below.
[0104] In conjunction with touch screen 112, display system
controller 156, contact module 130, graphics module 132, and text
input module 134, the search module 151 may be used to search for
text, music, sound, image, video, and/or other files in memory 102
that match one or more search criteria (e.g., one or more
user-specified search terms).
[0105] In conjunction with touch screen 112, display controller
156, contact module 130, graphics module 132, and text input module
134, the notes module 153 may be used to create and manage notes,
to do lists, and the like.
[0106] In conjunction with RF circuitry 108, touch screen 112,
display system controller 156, contact module 130, graphics module
132, text input module 134, GPS module 135, and browser module 147,
the map module 154 may be used to receive, display, modify, and
store maps and data associated with maps (e.g., driving directions;
data on stores and other points of interest at or near a particular
location; and other location-based data).
[0107] In conjunction with touch screen 112, display system
controller 156, contact module 130, graphics module 132, audio
circuitry 110, speaker 111, RF circuitry 108, text input module
134, e-mail client module 140, and browser module 147, the online
video module 155 allows the user to access, browse, receive (e.g.,
by streaming and/or download), play back (e.g., on the touch screen
or on an external, connected display via external port 124), send
an e-mail with a link to a particular online video, and otherwise
manage online videos in one or more file formats, such as H.264. In
some embodiments, instant messaging module 141, rather than e-mail
client module 140, is used to send a link to a particular online
video. Additional description of the online video application can
be found in U.S. Provisional Patent Application No. 60/936,562,
"Portable Multifunction Device, Method, and Graphical User
Interface for Playing Online Videos," filed Jun. 20, 2007, the
content of which is hereby incorporated by reference in its
entirety.
[0108] Each of the above identified modules and applications
corresponds to a set of instructions for performing one or more
functions described above. These modules (i.e., sets of
instructions) need not be implemented as separate software
programs, procedures or modules, and thus various subsets of these
modules may be combined or otherwise re-arranged in various
embodiments. For example, video player module 145 may be combined
with music player module 146 into a single module (e.g., video and
music player module 152, FIG. 1B). In some embodiments, memory 102
may store a subset of the modules and data structures identified
above. Furthermore, memory 102 may store additional modules and
data structures not described above.
[0109] In some embodiments, the device 100 is a device where
operation of a predefined set of functions on the device is
performed exclusively through a touch screen 112 and/or a touchpad.
By using a touch screen and/or a touchpad as the primary
input/control device for operation of the device 100, the number of
physical input/control devices (such as push buttons, dials, and
the like) on the device 100 may be reduced.
[0110] The predefined set of functions that may be performed
exclusively through a touch screen and/or a touchpad include
navigation between user interfaces. In some embodiments, the
touchpad, when touched by the user, navigates the device 100 to a
main, home, or root menu from any user interface that may be
displayed on the device 100. In such embodiments, the touchpad may
be referred to as a "menu button." In some other embodiments, the
menu button may be a physical push button or other physical
input/control device instead of a touchpad.
[0111] FIG. 2 illustrates a portable multifunction device 100
having a touch screen 112 in accordance with some embodiments. The
touch screen may display one or more graphics within user interface
(UI) 200. In this embodiment, as well as others described below, a
user may select one or more of the graphics by making contact or
touching the graphics, for example, with one or more fingers 202
(not drawn to scale in the figure). In some embodiments, selection
of one or more graphics occurs when the user breaks contact with
the one or more graphics. In some embodiments, the contact may
include a gesture, such as one or more taps, one or more swipes
(from left to right, right to left, upward and/or downward) and/or
a rolling of a finger (from right to left, left to right, upward
and/or downward) that has made contact with the device 100. In some
embodiments, inadvertent contact with a graphic may not select the
graphic. For example, a swipe gesture that sweeps over an
application icon may not select the corresponding application when
the gesture corresponding to selection is a tap.
[0112] The device 100 may also include one or more physical
buttons, such as "home" or menu button 204. As described
previously, the menu button 204 may be used to navigate to any
application 136 in a set of applications that may be executed on
the device 100. Alternatively, in some embodiments, the menu button
is implemented as a soft key in a GUI in touch screen 112.
[0113] In one embodiment, the device 100 includes a touch screen
112, a menu button 204, a push button 206 for powering the device
on/off and locking the device, volume adjustment button(s) 208, a
Subscriber Identity Module (SIM) card slot 210, a head set jack
212, and a docking/charging external port 124. The push button 206
may be used to turn the power on/off on the device by depressing
the button and holding the button in the depressed state for a
predefined time interval; to lock the device by depressing the
button and releasing the button before the predefined time interval
has elapsed; and/or to unlock the device or initiate an unlock
process. In an alternative embodiment, the device 100 also may
accept verbal input for activation or deactivation of some
functions through the microphone 113.
[0114] Attention is now directed towards embodiments of user
interfaces ("Ur) and associated processes that may be implemented
on a portable multifunction device 100.
[0115] FIG. 3 illustrates an exemplary user interface for unlocking
a portable electronic device in accordance with some embodiments.
In some embodiments, user interface 300 includes the following
elements, or a subset or superset thereof:
[0116] Unlock image 302 that is moved with a finger gesture to unlock the device;
[0117] Arrow 304 that provides a visual cue to the unlock gesture;
[0118] Channel 306 that provides additional cues to the unlock gesture;
[0119] Time 308;
[0120] Day 310;
[0121] Date 312; and
[0122] Wallpaper image 314.
[0123] In some embodiments, the device detects contact with the
touch-sensitive display (e.g., a user's finger making contact on or
near the unlock image 302) while the device is in a user-interface
lock state. The device moves the unlock image 302 in accordance
with the contact. The device transitions to a user-interface unlock
state if the detected contact corresponds to a predefined gesture,
such as moving the unlock image across channel 306. Conversely, the
device maintains the user-interface lock state if the detected
contact does not correspond to the predefined gesture. This process
saves battery power by ensuring that the device is not accidentally
awakened. This process is easy for users to perform, in part
because of the visual cue(s) provided on the touch screen.
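
Paragraph [0123] describes the lock-state decision only in behavioral terms. A minimal sketch, with hypothetical names and thresholds, of moving the unlock image with the contact and unlocking only when the image is dragged across the channel might look like this:

```swift
/// Minimal sketch of the unlock decision in paragraph [0123]; all names and
/// geometry are hypothetical, not taken from the actual device software.
enum DeviceUIState { case locked, unlocked }

struct UnlockChannel {
    let startX: Double      // where the unlock image rests
    let endX: Double        // far end of channel 306
}

/// Move the unlock image with the finger, clamped to the channel, and decide
/// whether the predefined gesture (dragging fully across the channel) occurred.
func updateLockState(fingerX: Double,
                     fingerLifted: Bool,
                     channel: UnlockChannel,
                     state: inout DeviceUIState) -> Double {
    let imageX = min(max(fingerX, channel.startX), channel.endX)  // image follows contact
    if fingerLifted {
        // Unlock only if the image was moved across the channel; otherwise stay locked.
        state = (imageX >= channel.endX) ? .unlocked : .locked
    }
    return imageX
}
```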
[0124] As noted above, processes that use gestures on the touch
screen to unlock the device are described in U.S. patent
application Ser. No. 11/322,549, "Unlocking A Device By Performing
Gestures On An Unlock Image," filed Dec. 23, 2005, and Ser. No.
11/322,550, "Indication Of Progress Towards Satisfaction Of A User
Input Condition," filed Dec. 23, 2005, which are hereby
incorporated by reference in their entirety.
[0125] FIG. 4A illustrates an exemplary user interface for a menu
of applications on a portable multifunction device in accordance
with some embodiments. In some embodiments, user interface 400A
includes the following elements, or a subset or superset thereof:
[0126] Signal strength indicator 402 for wireless communication;
[0127] Time 404;
[0128] Battery status indicator 406;
[0129] Tray 408 with icons for frequently used applications, such as:
[0130] Phone 138;
[0131] E-mail client 140, which may include an indicator 410 of the number of unread e-mails;
[0132] Browser 147; and
[0133] Music player 146; and
[0134] Icons for other applications, such as:
[0135] IM 141;
[0136] Image management 144;
[0137] Camera 143;
[0138] Video player 145;
[0139] Weather 149-1;
[0140] Stocks 149-2;
[0141] Blog 142;
[0142] Calendar 148;
[0143] Calculator 149-3;
[0144] Alarm clock 149-4;
[0145] Dictionary 149-5;
[0146] User-created widget 149-6; and
[0147] Other applications (not shown) (e.g., map 154 and online video 155).
[0148] In some embodiments, UI 400A displays all of the available
applications 136 on one screen so that there is no need to scroll
through a list of applications (e.g., via a scroll bar). In some
embodiments, as the number of applications increases, the icons
corresponding to the applications may decrease in size so that all
applications may be displayed on a single screen without scrolling.
In some embodiments, having all applications on one screen and a
menu button enables a user to access any desired application with
at most two inputs, such as activating the menu button 204 and then
activating the desired application (e.g., by a tap or other finger
gesture on the icon corresponding to the application).
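
The resizing behavior mentioned above (shrinking icons as the number of applications grows so that everything still fits on one screen) is not specified in detail. One illustrative way to pick an icon size for a given icon count and screen size, under an assumed uniform-grid layout with fixed spacing, is sketched below:

```swift
import Foundation

/// Illustrative only: pick an icon side length so that `count` icons fit a
/// fixed screen area without scrolling (paragraph [0148]). The grid policy,
/// spacing, and function name are assumptions.
func iconSideLength(count: Int, screenWidth: Double, screenHeight: Double,
                    spacing: Double = 8) -> Double {
    guard count > 0 else { return 0 }
    // Try increasing column counts (smaller icons) until the rows fit vertically.
    for columns in 1...count {
        let rows = Int(ceil(Double(count) / Double(columns)))
        let side = (screenWidth - Double(columns + 1) * spacing) / Double(columns)
        if side <= 0 { break }
        let neededHeight = Double(rows) * side + Double(rows + 1) * spacing
        if neededHeight <= screenHeight { return side }   // largest size that fits
    }
    return 0
}
```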
[0149] In some embodiments, UI 400A provides integrated access to
both widget-based applications and non-widget-based applications.
In some embodiments, all of the widgets, whether user-created or
not, are displayed in UI 400A. In other embodiments, activating the
icon for user-created widget 149-6 may lead to another UI that
displays the user-created widgets or icons corresponding to the
user-created widgets. For example, UI 400B (FIG. 4B) displays a
menu of six icons corresponding to six user-created widgets 149-6-1
through 149-6-6 in accordance with some embodiments. A user may
activate a particular widget by gesturing on the corresponding
icon. Alternatively, user-created widgets may be displayed in a
list. UI 400C (FIG. 4C) illustrates a list of names of six
user-created widgets 149-6-1 through 149-6-6 along with
corresponding icons in accordance with some embodiments. A user may
activate a particular widget by gesturing on the corresponding name
or icon.
[0150] In some embodiments, a user may rearrange the icons in UI
400A, UI 400B, or UI 400C, e.g., using processes described in U.S.
patent application Ser. No. 11/459,602, "Portable Electronic Device
With Interface Reconfiguration Mode," filed Jul. 24, 2006, which is
hereby incorporated by reference in its entirety. For example, a
user may move application icons in and out of tray 408 using finger
gestures.
[0151] In some embodiments, UI 400A includes a gauge (not shown)
that displays an updated account usage metric for an account
associated with usage of the device (e.g., a cellular phone
account), as described in U.S. patent application Ser. No.
11/322,552, "Account Information Display For Portable Communication
Device," filed Dec. 23, 2005, which is hereby incorporated by
reference in its entirety.
Making and Using Web-Clip Widgets
[0152] FIGS. 5A-5I illustrate an exemplary user interface for a
browser in accordance with some embodiments.
[0153] In some embodiments, user interface 3900A (FIG. 5A) includes
the following elements, or a subset or superset thereof:
[0154] 402, 404, and 406, as described above;
[0155] Previous page icon 3902 that when activated (e.g., by a finger tap on the icon) initiates display of a previous web page (if any);
[0156] Web page name 3904;
[0157] Next page icon 3906 that when activated (e.g., by a finger tap on the icon) initiates display of a next web page (if any);
[0158] URL (Uniform Resource Locator) entry box 3908 for inputting URLs of web pages;
[0159] Refresh icon 3910 that when activated (e.g., by a finger tap on the icon) initiates a refresh of the web page;
[0160] Web page 3912 or other structured document, which includes a plurality of blocks 3914 of text content and other graphics (e.g., images);
[0161] Settings icon 3916 that when activated (e.g., by a finger tap on the icon) initiates display of a settings menu for the browser;
[0162] Bookmarks icon 3918 that when activated (e.g., by a finger tap on the icon) initiates display of a bookmarks list or menu for the browser;
[0163] Options icon 3920 that when activated (e.g., by a finger tap on the icon) initiates display of a plurality of options, including options for creating a web-clip widget, adding a bookmark, and emailing a link to the displayed web page 3912 (e.g., UI 3900F, FIG. 5F, which, like other UIs and pages, can be displayed in either portrait or landscape view); and
[0164] New window icon 3922 that when activated (e.g., by a finger tap on the icon) initiates display of a UI for adding new windows to the browser (e.g., UI 3900G, FIG. 5G).
[0165] In some embodiments, in response to a predefined gesture by
the user on a block 3914 (e.g., a single tap gesture or a double
tap gesture), the block is enlarged and centered (or substantially
centered) in the web page display. For example, in response to a
single tap gesture 3923 on block 3914-5, the user-selected block
3914-5 may be enlarged and centered in the display, as shown in UI
3900C (FIG. 5C). In some embodiments, the width of the
user-selected block is scaled to fill the touch screen display. In
some embodiments, the width of the user-selected block is scaled to
fill the touch screen display with a predefined amount of padding
along the sides of the display. In some embodiments, a zooming
animation of the user-selected block is displayed during
enlargement of the block. Similarly, in response to a single tap
gesture 3925 on block 3914-2, block 3914-2 may be enlarged with a
zooming animation and two-dimensionally scrolled to the center of
the display (not shown).
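
The enlarge-and-center behavior of paragraph [0165] can be summarized numerically: the zoom scale is chosen so the block's width (plus optional padding) fills the display, and the content offset is chosen so the block's center coincides with the display's center. The following sketch, with assumed rectangle types and padding value, illustrates that computation:

```swift
/// Minimal sketch of the scale/center math implied by paragraph [0165].
/// The block and display are simple rectangles; padding and names are assumed.
struct Rect { var x, y, width, height: Double }

func zoomToBlock(block: Rect, display: Rect,
                 padding: Double = 10) -> (scale: Double, offsetX: Double, offsetY: Double) {
    // Scale so the block's width (plus padding on each side) fills the display width.
    let scale = display.width / (block.width + 2 * padding)
    // Offset so the scaled block is centered in the display.
    let offsetX = display.width / 2 - (block.x + block.width / 2) * scale
    let offsetY = display.height / 2 - (block.y + block.height / 2) * scale
    return (scale, offsetX, offsetY)
}
```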
[0166] In some embodiments, the device analyzes the render tree of
the web page 3912 to determine the blocks 3914 in the web page. In
some embodiments, a block 3914 corresponds to a render node that
is: replaced; a block; an inline block; or an inline table.
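
A minimal illustration of the render-node test in paragraph [0166], using a hypothetical enumeration of node kinds, is:

```swift
/// Hypothetical render-node categories; the real render tree types are not
/// given in the specification.
enum RenderNodeKind {
    case replaced, block, inlineBlock, inlineTable, inlineText, other
}

/// A node qualifies as a selectable block 3914 if it is one of the four
/// kinds listed in paragraph [0166].
func isSelectableBlock(_ kind: RenderNodeKind) -> Bool {
    switch kind {
    case .replaced, .block, .inlineBlock, .inlineTable:
        return true
    default:
        return false
    }
}
```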
[0167] In some embodiments, in response to the same predefined
gesture by the user on a block 3914 (e.g., a single tap gesture or
a double tap gesture) that is already enlarged and centered, the
enlargement and/or centering is substantially or completely
reversed. For example, in response to a single tap gesture 3929 on
block 3914-5 (FIG. 5C), the web page image may zoom out and return
to UI 3900A (FIG. 5A).
[0168] In some embodiments, in response to a predefined gesture
(e.g., a single tap gesture or a double tap gesture) by the user on
a block 3914 that is already enlarged but not centered, the block
is centered (or substantially centered) in the web page display.
For example, in response to a single tap gesture 3927 on block
3914-4 (FIG. 5C), block 3914-4 may be centered (or substantially
centered) in the web page display. Similarly, in response to a
single tap gesture 3935 on block 3914-6, block 3914-6 may be
centered (or substantially centered) in the web page display. Thus,
for a web page display that is already enlarged, in response to a
predefined gesture, the device may display in an intuitive manner a
series of blocks that the user wants to view. This same gesture may
initiate different actions in different contexts (e.g., (1) zooming
and/or enlarging in combination with scrolling when the web page is
reduced in size (UI 3900A), and (2) reversing the enlargement and/or
centering if the block is already centered and enlarged).
[0169] In some embodiments, in response to a multi-touch (3931 and
3933) de-pinching gesture by the user (FIG. 5C), the web page may
be enlarged. Conversely, in response to a multi-touch pinching
gesture by the user, the web page may be reduced.
[0170] In some embodiments, in response to a substantially vertical
upward (or downward) swipe gesture by the user, the web page (or,
more generally, other electronic documents) may scroll
one-dimensionally upward (or downward) in the vertical direction.
For example, in response to an upward swipe gesture 3937 by the
user that is within a predetermined angle (e.g., 27°) of
being perfectly vertical, the web page may scroll one-dimensionally
upward in the vertical direction.
[0171] Conversely, in some embodiments, in response to a swipe
gesture that is not within a predetermined angle (e.g., 27°)
of being perfectly vertical, the web page may scroll
two-dimensionally (i.e., with simultaneous movement in both the
vertical and horizontal directions). For example, in response to an
upward or diagonal swipe gesture 3939 by the user that is not
within a predetermined angle (e.g., 27°) of being perfectly
vertical, the web page may scroll two-dimensionally along the
direction of the swipe 3939.
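
Paragraphs [0170] and [0171] together define a simple rule: a swipe within a predetermined angle of vertical produces one-dimensional vertical scrolling, and any other swipe produces two-dimensional scrolling along the swipe direction. An illustrative sketch of that rule (the 27° threshold comes from the example in the text; the function names are assumptions) is:

```swift
import Foundation

/// Decide a scroll delta from a swipe, following paragraphs [0170]-[0171]:
/// swipes within `thresholdDegrees` of vertical scroll one-dimensionally;
/// anything else scrolls along the swipe direction.
func scrollDelta(dx: Double, dy: Double,
                 thresholdDegrees: Double = 27) -> (dx: Double, dy: Double) {
    // Angle of the swipe away from the vertical axis, in degrees.
    let angleFromVertical = atan2(abs(dx), abs(dy)) * 180 / .pi
    if angleFromVertical <= thresholdDegrees {
        return (0, dy)      // one-dimensional vertical scroll
    } else {
        return (dx, dy)     // two-dimensional scroll along the swipe
    }
}
```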
[0172] In some embodiments, in response to a multi-touch (3941 and
3943) rotation gesture by the user, the web page may be rotated
exactly 90° (UI 3900D, FIG. 5D) for landscape viewing, even
if the amount of rotation in the multi-touch (3941 and 3943)
rotation gesture is substantially different from 90°.
Similarly, in response to a multi-touch (3945 and 3947) rotation
gesture by the user (UI 3900D, FIG. 5D), the web page may be
rotated exactly 90° for portrait viewing, even if the amount
of rotation in the multi-touch (3945 and 3947) rotation gesture is
substantially different from 90°.
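
The snapping behavior of paragraph [0172], in which an imprecise rotation gesture yields an exact quarter-turn, reduces to choosing +90° or -90° based on the sign of the measured rotation once it exceeds some minimum. The minimum threshold below is an assumption; the specification does not state one:

```swift
/// Sketch of the snapping behavior in paragraph [0172]: whatever the measured
/// rotation of the two-finger gesture, the page rotates by exactly +90 or -90
/// degrees (or not at all, if the gesture is too small).
func snappedRotation(measuredDegrees: Double, minimumDegrees: Double = 20) -> Double {
    if abs(measuredDegrees) < minimumDegrees { return 0 }   // ignore tiny rotations
    return measuredDegrees > 0 ? 90 : -90                   // snap to a quarter turn
}
```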
[0173] Thus, in response to imprecise gestures by the user, precise
movements of graphics occur. The device behaves in the manner
desired by the user despite inaccurate input by the user. Also,
note that the gestures described for UI 3900C, which has a portrait
view, are also applicable to UIs with a landscape view (e.g., UI
3900D, FIG. 5D) so that the user can choose whichever view
(portrait or landscape) the user prefers for web browsing.
[0174] In some embodiments, in response to a tap or other
predefined user gesture on URL entry box 3908 (UI 3900A, FIG. 5A),
the touch screen displays an enlarged entry box 3926 and a keyboard
616 (e.g., UI 3900B, FIG. 5B in portrait viewing and UI 3900E,
FIG. 5E in landscape viewing). In some embodiments, the touch
screen also displays:
[0175] Contextual clear icon 3928 that when activated (e.g., by a finger tap on the icon) initiates deletion of all text in entry box 3926;
[0176] a search icon 3930 that when activated (e.g., by a finger tap on the icon) initiates an Internet search using the search terms input in box 3926; and
[0177] Go to URL icon 3932 that when activated (e.g., by a finger tap on the icon) initiates acquisition of the web page at the URL in box 3926.
[0178] Thus, the same entry box 3926 may be used for inputting both
search terms and URLs. In some embodiments, whether or not clear
icon 3928 is displayed depends on the context.
[0179] UI 3900G (FIG. 5G) is a UI for adding new windows to an
application, such as the browser 147. UI 3900G displays an
application (e.g., the browser 147), which includes a displayed
window (e.g., web page 3912-2) and at least one hidden window
(e.g., web pages 3912-1 and 3912-3 and possibly other web pages
that are completely hidden off-screen). UI 3900G also displays an
icon for adding windows to the application (e.g., new window or new
page icon 3936). In response to detecting activation of the icon
3936 for adding windows, the browser adds a window to the
application (e.g., a new window for a new web page 3912).
[0180] In response to detecting a gesture on the touch screen
display, a displayed window in the application is moved off the
display and a hidden window is moved onto the display. For example,
in response to detecting a tap gesture 3949 on the left side of the
screen, the window with web page 3912-2 is moved partially or fully
off-screen to the right, the window with web page 3912-3 is moved
completely off-screen, the partially hidden window with web page 3912-1
is moved to the center of the display, and another completely
hidden window (not shown in FIG. 5G) with a web page may be moved
partially onto the display. Alternatively, detection of a
left-to-right swipe gesture 3951 may achieve the same effect.
[0181] Conversely, in response to detecting a tap gesture 3953 on
the right side of the screen, the window with web page 3912-2 is
moved partially or fully off-screen to the left, the window with
web page 3912-1 is moved completely off-screen, the partially hidden
window with web page 3912-3 is moved to the center of the display,
and another completely hidden window (not shown in FIG. 5G) with a
web page may be moved partially onto the display. Alternatively,
detection of a right-to-left swipe gesture 3951 may achieve the
same effect.
[0182] In some embodiments, in response to a tap or other
predefined gesture on a delete icon 3934 (e.g., 3934-2 or 3934-3),
the corresponding window 3912 is deleted. In some embodiments, in
response to a tap or other predefined gesture on Done icon 3938,
the window in the center of the display (e.g., 3912-2) is enlarged
to fill the screen.
[0183] A user may create a web-clip widget in accordance with some
embodiments. Activation of the user-created web-clip widget
displays a previously specified area in a web page (having a
specified URL) at a specified display size or scale factor. In some
embodiments, the area in the web page is specified by scaling
and/or translating the display of the web page. For example, a
specified area in the web page is enlarged and centered. The
specified area may be displayed in a browser application (e.g., the
browser 147) or other application. For example, activation of the
web-clip widget may display a particular block that is of interest
to the user within the web page; furthermore, the block may be
enlarged. Activation of the web-clip widget thus enables the user
to view the particular block of interest without having to enlarge
and center the web page area that is of interest each time the user
visits the web page. In some embodiments, after activation of the
web-clip widget, the user may manipulate the display to view other
portions of the web page by scaling and/or translating the display.
Alternatively, in some embodiments, the user may not be permitted
to manipulate the display.
[0184] Web-clip widgets provide more functionality than mere
bookmarks: activation of a bookmark only displays a specified web
page, while activation of a web-clip widget displays a specified
area of a web page at a specified display size or scale factor in
accordance with some embodiments. Similarly, a web-clip widget is
distinguishable from a hyperlink. To view a web page or portion
thereof specified by a hyperlink, the user must activate the
browser application, navigate to a web page containing the
hyperlink, activate the hyperlink, and then potentially scroll
and/or scale the resulting web page. In contrast, to view an area
of a web page specified by a web-clip widget, the user merely
activates the widget.
[0185] In some embodiments, the web-clip widget corresponds to a
block or other structural element of the web page. As described in
U.S. patent application Ser. No. 11/620,492, "Selecting and
Manipulating Web Content," filed on Jan. 5, 2007, which application
is incorporated by reference herein in its entirety, structural
elements that are displayed in a web page may be identified during
the web-clip widget creation process. In some embodiments, if the
dimensions of a selected structural element change after creation
of a web-clip widget, the area that is displayed upon activation of
the web-clip widget is changed accordingly.
[0186] In some embodiments, a web-clip widget comprises a URL for
the web page and data (e.g., metadata) indicating the
user-specified portion of the web page. For example, in some
embodiments the web-clip widget comprises a file containing an XML
property list that includes the URL and the data indicating the
user-specified portion of the web page. In some embodiments, the
data indicating the user-specified portion of the web page includes
a reference point (e.g., a corner point or center point for the
widget) and a scale factor. In some embodiments, the data
indicating the user-specified portion of the web page includes a
set of coordinates within the web page (e.g., a user-defined
rectangle) or an identification of a structural element within the
web page. The application for viewing the web-clip widget (e.g.,
the browser 147) is configured to process the data indicating the
user-specified portion of the web page and to display the
corresponding portion.
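
An illustrative data model for the web-clip widget contents listed in paragraph [0186] (a URL plus either a reference point and scale factor, a coordinate rectangle, or a structural-element identifier) might look like the following. The field names and the example page are hypothetical, and this is not the on-disk format used by the device:

```swift
import Foundation

/// Illustrative data model for the web-clip widget contents described in
/// paragraph [0186]; names are assumptions, not the actual format.
struct WebClipWidget {
    enum Region {
        case referencePoint(x: Double, y: Double, scale: Double)      // corner/center point plus scale factor
        case rectangle(x: Double, y: Double, width: Double, height: Double)
        case structuralElement(identifier: String)                    // e.g., a block's element identifier
    }
    var name: String
    var pageURL: URL
    var region: Region
}

// Example: a widget showing a block of a (hypothetical) page at 2x scale.
let widget = WebClipWidget(
    name: "Headlines",
    pageURL: URL(string: "https://www.example.com/news")!,
    region: .referencePoint(x: 120, y: 340, scale: 2.0))
```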
[0187] In some embodiments a web-clip widget comprises an
executable script. In some embodiments, the widget includes an HTML
(Hypertext Markup Language) file, a CSS (Cascading Style Sheets)
file, and a JavaScript file. In some embodiments, the widget
includes an XML (Extensible Markup Language) file and a JavaScript
file (e.g., Yahoo! Widgets).
[0188] To the extent that any application incorporated by reference
herein includes a definition of web-clip widgets that contradicts
the definition in the preceding five paragraphs, the definition in
the preceding five paragraphs is to be considered controlling for
purposes of interpreting the specification and claims of the
present application.
[0189] Referring to FIG. 5C, in some embodiments, once a user has
centered and/or enlarged an area of a web page (e.g., block
3914-5), the user may initiate creation of a web-clip widget by
activating the options icon 3920. The options icon 3920 is an
example of an options icon referenced in operation 706 of process
700 (FIG. 7A, below). In some embodiments, the user activates the
options icon 3920 by performing a tap or other predefined gesture
on the options icon 3920.
[0190] As a result of activating the options icon 3920, a user
interface such as UI 3900F (FIG. 5F) is displayed (e.g., operation
708, FIG. 7A), which includes a plurality of icons 3972. In some
embodiments, the plurality of icons 3972 includes an icon 3973 for
creating a web-clip widget, an icon 3974 for adding a bookmark
(e.g., via UI 3900I, FIG. 5I), an icon 3975 for emailing a link
corresponding to the displayed web page 3912, and a cancel icon
3976 for returning to the previous UI. If the user activates the
"create web-clip widget" icon 3973, a web-clip widget corresponding
to the centered and/or enlarged area of the web page (e.g., block
3914-5 or the entire displayed portion of the web page 3912) will
be created (e.g., operations 710 and 712, FIG. 7A). Text and/or
graphics displayed for the icon 3973 may vary. In some embodiments,
for example, the icon 3973 may be labeled "Add to Home Screen."
[0191] In some embodiments, in response to user activation of the
"create web-clip widget" icon 3973 (FIG. SF), UI 3900H (FIG. 5H)
will appear and will prompt the user to enter the widget name in
text entry box 3960 using the contextual keyboard 616. In some
embodiments, the user can access other keyboards that display other
symbols by activating the alternate keyboard selector icon 618. In
some embodiments, UI 3900H includes an image 3978 of the selected
area of the web page. Once the user has completed entering the
widget name in the text entry box 3960, the user activates the
add-widget icon 3928 and the widget is created. Alternately, the
user may activate the cancel icon 3928 to avoid creating the
widget.
[0192] In some embodiments, as a result of activating the "create
web-clip widget" icon 3973, a web-clip widget corresponding to the
centered and/or enlarged area of the web page will be created and
assigned a name without any further actions by a user. In some
embodiments, instead of displaying a user interface such as UI
3900H (FIG. 5H) for receiving a name, the newly created web-clip
widget may be assigned the same name as the web page name 3904.
[0193] An icon corresponding to the newly created widget may be
created and displayed on a menu in a UI such as UI 400A or UI 400B
(FIG. 4A or 4B). Alternatively, the icon and/or the name of the
newly created widget may be listed on a UI such as UI 400C (FIG.
4C). Subsequent activation of the newly created widget will launch
an application (e.g., the browser 147) that will display the
web-clip widget. In some embodiments, the web-clip widget is
displayed within the browser UI (e.g., UI 3900C, FIG. 5C). In some
embodiments, the web-clip widget is displayed without other
elements of the browser UI (e.g., without elements 3902, 3906,
3908, and/or 3910), such that the web-clip widget appears to be its
own mini-application rather than a portion of a web page displayed
in a browser. In some embodiments, the web-clip widget is displayed
with decorative features such as a decorative frame or a border
resembling a torn page. In some embodiments, the decorative
features are user-customizable.
[0194] For example, as described above, a user viewing web page
3912 (FIG. 5A) may enlarge and center block 3914-5 by performing a
tap gesture 3923 (e.g., a single tap or a double tap) on block
3914-5. As a result, block 3914-5 appears enlarged and centered in
the browser window, as shown in FIG. 5C. The user then may perform
gestures (e.g., taps) on the options icon 3920 and the web-clip
widget creation icon 3973 (FIG. 5F) to create a widget
corresponding to block 3914-5, in accordance with some embodiments.
In some embodiments, the user then enters a widget name in the text
entry box 3960 (FIG. 5H) and activates the add-widget icon 3928. A
corresponding icon may be created and displayed on a menu such as
in UI 400A or 400B (FIG. 4A or 4B) or in a list such as in UI 400C
(FIG. 4C). In some embodiments, subsequent activation of the newly
created widget will launch the browser 147, which will display
block 3914-5, as shown in UI 3900C (FIG. 5C).
[0195] In some embodiments, instead of or in addition to performing
a tap gesture 3923 (FIG. 5A) to center and enlarge a block, a user
may define the area of a web page to be associated with a widget by
performing one or more other gestures. Examples of gestures that
may be used to define the area of the web page include a tap
gesture 3927 or 3935 (FIG. 5C) to center an adjacent enlarged
block; a multi-touch depinching gesture (3931 and 3933) (FIG. 5C)
to enlarge the web page; a multi-touch pinching gesture (not shown)
to reduce the web page; swipe gestures such as a substantially
vertical swipe 3937 (FIG. 5C), an upward or diagonal swipe 3939
(FIG. 5C), and/or other swipe gestures (not shown) to scroll the
web page; and/or a multi-touch rotation gesture (3941 and 3943) to
select a portrait or landscape view (FIG. 5C).
[0196] In some embodiments, instead of first defining the area of
the web page to be associated with the web-clip widget and then
activating the options icon 3920 (e.g., FIG. 5C) and the "create
web-clip widget" icon 3973 (FIG. 5F), a user may first activate the
icons 3920 and 3973 and then define the area by performing gestures
that are detected by the touch screen display, such as those
described above. Once the area has been selected and/or scaled, the
user may make a gesture on the touch screen to indicate that the
area of the web page to be associated with the widget has been
defined.
[0197] In some embodiments, in response to the user activating the
"create web-clip widget" icon 3973 (FIG. 5F), the device displays a
user interface (e.g., UI 3900K, FIG. 5K) that lets the user define
the area of the web page to be associated with the widget. The user
may define the area using gestures such as the gestures described
above with reference to UIs 3900A, 3900C, and 3900D (FIGS. 5A, 5C,
and 5D). In some embodiments, the user interface may include
information 3950 to help guide the user. In some embodiments, the
user may activate a cancel icon 3952 to abort the widget creation
process and may activate an add widget icon 3954 to complete the
widget creation process. In some embodiments, a rotation gesture
such as multi-touch rotation gesture (3941 and 3943, FIG. 5C)
rotates the entire UI 3900K, and not just the defined area, from
portrait viewing to landscape viewing or vice versa.
[0198] In some embodiments, in response to the user activating the
"create web-clip widget" icon 3973 (FIG. 5F), the device displays a
user interface (e.g., UI 3900J, FIG. 5J) that lets the user define
the area of a web page to be associated with a widget by toggling
between frames. The frames are successively overlaid on the web
page to frame or highlight successive blocks and other structural
elements of the web page. For example, in UI 3900J a frame 3958
frames block 2 3914-2. The user may activate a toggle icon 3956 to
toggle between successive blocks. Once a block of interest is
framed, the user may activate an add widget icon 3954 to create a
widget corresponding to the framed block. The user may activate a
cancel icon 3952 to end the widget creation process.
[0199] In some embodiments, creating and displaying an icon
corresponding to the newly created web-clip widget includes
displaying an animation, as illustrated in FIGS. 6A-6D in
accordance with some embodiments. The animation may be displayed,
for example, after activation of the add-widget icon 3928 (FIG. 5H)
or after activation of the "create web-clip widget" icon 3973 (FIG.
5F). In the animation, the selected area of the web page 3912
corresponding to the newly created web-clip widget (e.g., block
3914-5 in UI 3900C) is displayed, as illustrated in FIG. 6A. The
displayed image is shrunk down, as illustrated for image 602 (FIG.
6B), and displayed over a menu of icons. In some embodiments, the
menu of icons includes vacant areas (e.g., 604-1 and 604-2, FIG.
6B) in which an icon could be displayed but is not currently
displayed. The image 602 may be moved (FIG. 6C) into the first
available vacancy 604-1, where it is displayed as an icon
corresponding to the new web-clip widget 149-6-7 (FIG. 6D). In some
embodiments, the first available vacancy is the left-most vacancy
in the highest row with a vacancy. In other embodiments, the image
is moved into another vacancy or is appended to the menu after the
last (e.g., lowest and right-most) vacancy.
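
The "first available vacancy" rule described above (the left-most vacancy in the highest row with a vacancy) amounts to a row-major scan of the icon grid. A small sketch, assuming a Boolean occupancy grid, is:

```swift
/// Sketch of the "first available vacancy" rule from paragraph [0199]:
/// scan rows top to bottom and, within a row, columns left to right, and
/// return the first empty slot. The Bool grid (true = occupied) is assumed.
func firstAvailableVacancy(in grid: [[Bool]]) -> (row: Int, column: Int)? {
    for (row, slots) in grid.enumerated() {
        for (column, occupied) in slots.enumerated() where !occupied {
            return (row, column)
        }
    }
    return nil   // no vacancy; the icon would instead be appended after the last slot
}

// Example: the second slot of the first row is free.
let menu = [[true, false, true], [true, true, true]]
// firstAvailableVacancy(in: menu) == (row: 0, column: 1)
```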
[0200] In some embodiments, instead of displaying an animation, the
icon corresponding to the newly created web-clip widget is simply
displayed in a first available vacancy in a menu of icons or in
another available vacancy in the menu, or is appended to the
menu.
[0201] Activation of the icon corresponding to the newly created
web-clip widget 149-6-7 (e.g., by a gesture 606 (FIG. 6E) on the
icon, such as a tap gesture) results in display of the
corresponding web-clip widget (e.g., display of block 3914-5, as
shown in FIG. 6A) in the browser application or in its own
mini-application without other elements of the browser UI.
[0202] UI 3900L (FIG. 5L) is a UI for displaying a portion of two
or more web-clip widgets, in accordance with some embodiments. The
displayed portion may include a first web-clip widget (e.g.,
149-6-1), and may include all or a portion of additional web-clip
widgets (e.g., 149-6-2). The displayed portion is scrolled in
response to detecting a gesture on the touch screen display, such
as a swipe gesture 3962.
[0203] UI 3900M (FIG. 5M) is a UI for displaying a web-clip widget
(e.g., 149-6-2) in accordance with some embodiments. In response to
detecting a gesture on the touch screen display, display of the
web-clip widget is ceased and another web-clip widget is displayed.
For example, in response to detecting a downward swipe 3962 or a
tap gesture 3964 at the top of the displayed widget 149-6-2,
display of the web-clip widget 149-6-2 is ceased and a previous
user-created widget 149-6-1 is displayed. In response to detecting
an upward swipe 3962 or a tap gesture 3966 at the bottom of the
displayed widget 149-6-2, display of the web-clip widget 149-6-2 is
ceased and a next user-created widget 149-6-3 is displayed.
Alternatively, in response to detecting a substantially horizontal
right-to-left swipe 3963 or a tap gesture 3965 at the right side of
the displayed widget 149-6-2, display of the web-clip widget
149-6-2 is ceased and a next user-created widget 149-6-3 is
displayed. In response to detecting a substantially horizontal
left-to-right swipe 3963 or a tap gesture 3967 at the left side of
the displayed widget 149-6-2, display of the web-clip widget
149-6-2 is ceased and a previous user-created widget 149-6-1 is
displayed.
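
The gesture-to-navigation mapping of paragraph [0203] can be summarized as a small lookup from gesture to the next widget index. The following sketch uses hypothetical gesture names and clamps at the ends of the widget list; the text does not say whether navigation wraps around:

```swift
/// Gesture-to-navigation mapping described in paragraph [0203].
/// The gesture case names are illustrative.
enum WidgetGesture {
    case swipeDown, swipeUp, swipeRightToLeft, swipeLeftToRight
    case tapTop, tapBottom, tapRight, tapLeft
}

/// Returns the index of the user-created widget to display next.
func nextWidgetIndex(current: Int, count: Int, gesture: WidgetGesture) -> Int {
    switch gesture {
    case .swipeUp, .tapBottom, .swipeRightToLeft, .tapRight:
        return min(current + 1, count - 1)        // next widget (e.g., 149-6-3)
    case .swipeDown, .tapTop, .swipeLeftToRight, .tapLeft:
        return max(current - 1, 0)                // previous widget (e.g., 149-6-1)
    }
}
```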
[0204] FIG. 7A is a flow diagram illustrating a process 700 for
creating a web-clip widget from a web page or portion thereof on a
portable multifunction device with a touch screen display in
accordance with some embodiments. While the web-clip widget
creation process 700 described below includes a number of
operations that appear to occur in a specific order, it should be
apparent that the process 700 can include more or fewer operations,
which can be executed serially or in parallel (e.g., using parallel
processors or a multi-threading environment), an order of two or
more operations may be changed, and/or two or more operations may
be combined into a single operation.
[0205] In some embodiments, selection of a web page or portion
thereof for display is detected (702). For example, one or more
finger gestures are detected on the touch screen display to select
the web page or portion thereof. In some embodiments, the one or
more finger gestures include one or more finger gestures to scale
an area in the web page. In some embodiments, the one or more
finger gestures include one or more finger gestures to center an
area in the web page. Examples of finger gestures used to select,
center, and/or scale an area in the web page include a tap gesture
3923 or 3925 to center and enlarge a block (FIG. 5A); a tap gesture
3927 or 3935 to center an adjacent enlarged block; a multi-touch
depinching gesture (3931 and 3933) to enlarge the web page; a
multi-touch pinching gesture (not shown) to reduce the web page;
swipe gestures such as a substantially vertical swipe 3937, an
upward or diagonal swipe 3939, and/or other swipe gestures (not
shown) to translate the web page; and/or a multi-touch rotation
gesture (3941 and 3943) to select a portrait or landscape view
(FIG. 5C).
[0206] The web page or portion thereof is displayed (704) on the
touch screen display. In the example of FIG. 5C, block 3914-5 is
displayed on the touch screen display.
[0207] An activation of an options icon (e.g., icon 3920) is
detected (706). In some embodiments, detecting activation of the
options icon includes detecting a finger gesture (e.g., a tap
gesture) on the options icon.
[0208] In response to detecting activation of the options icon, a
plurality of icons (e.g., 3972, FIG. 5F) is displayed (708)
including a web-clip widget creation icon (e.g., icon 3973, FIG.
5F). In some embodiments, the web-clip widget creation icon
includes text, such as "Create Web-Clip Widget" or "Add to Home
Screen."
[0209] An activation of the web-clip widget creation icon (e.g.,
3973) is detected (710). In some embodiments, detecting activation
of the web-clip widget creation icon includes detecting a finger
gesture (e.g., a tap gesture) on the web-clip widget creation
icon.
[0210] In response to detecting activation of the web-clip widget
creation icon, a web-clip widget is created (712) corresponding to
the displayed web page or portion thereof.
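
Operations 706 through 712 describe an event-driven flow: activating the options icon reveals the plurality of icons, and activating the web-clip widget creation icon then creates the widget. A condensed, purely illustrative sketch of that flow (handler and state names are hypothetical) is:

```swift
/// Condensed sketch of operations 706-712 of process 700; the real control
/// flow is not specified at this level of detail.
enum BrowserEvent { case optionsIconTapped, createWebClipIconTapped, cancelTapped }

final class WidgetCreationFlow {
    private(set) var optionsMenuVisible = false
    private(set) var widgetCreated = false

    func handle(_ event: BrowserEvent) {
        switch event {
        case .optionsIconTapped:          // operation 706 leads to 708
            optionsMenuVisible = true     // display the plurality of icons 3972
        case .createWebClipIconTapped:    // operation 710 leads to 712
            guard optionsMenuVisible else { return }
            widgetCreated = true          // create a widget for the displayed page area
            optionsMenuVisible = false
        case .cancelTapped:
            optionsMenuVisible = false
        }
    }
}
```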
[0211] In some embodiments, the web-clip widget corresponds to a
structural element of the web page, such as a particular block
within the web page. In some embodiments, the web-clip widget
corresponds to a user-specified rectangle in the web page.
[0212] In some embodiments, creating the web-clip widget includes
(714) requesting a name for the web-clip widget, receiving the
name, and storing the name. In some embodiments, requesting the
name includes displaying a keyboard to receive input for the name.
For example, in UI 3900H (FIG. 5H), the user is prompted to enter
the widget name in the text entry box 3960 using the keyboard
616.
[0213] In some embodiments, creating the web-clip widget includes
creating (716) an icon corresponding to the web-clip widget and
displaying (718) the icon corresponding to the web-clip widget in a
menu (e.g., UI 400A or 400B, FIG. 4A or 4B) or list (e.g., UI 400C,
FIG. 4C) of icons. In some embodiments, the icon corresponding to
the web-clip widget is created in response to detecting an
activation of an add-widget icon (e.g., icon 3928, FIG. 5H). In
some embodiments, the icon corresponding to the web-clip widget is
created in response to detecting an activation of the web-clip
widget creation icon (e.g., 3973, FIG. 5F).
[0214] In some embodiments, the menu or list of icons comprises a
menu or list of applications and widgets (e.g., UI 400A, FIG. 4A)
on the multifunction device. In some embodiments, the menu or list
of icons comprises a menu or list of widgets on the multifunction
device. In some embodiments, the menu or list of icons comprises a
menu or list of user-created widgets (e.g., UI 400B or 400C, FIG.
4B or 4C) on the multifunction device.
[0215] In some embodiments, the icon corresponding to the web-clip
widget is displayed in a previously vacant area in the menu of
icons. In some embodiments, the previously vacant area is a first
available vacancy (e.g., 604-1, FIG. 6B) in the menu of icons. In
some embodiments, an animation is displayed of the icon
corresponding to the web-clip widget moving into the previously
vacant area. For example, FIGS. 6A-6D illustrate an animation in
which an icon corresponding to the web-clip widget 149-6-7 is
created and moved into a previously vacant area in UI 600B.
[0216] In some embodiments, the web-clip widget is stored (720) as
a bookmark in a browser application. In some embodiments, as
described in U.S. patent application Ser. No. 11/469,838,
"Presenting and Managing Clipped Content," filed on Sep. 1, 2006,
which application is incorporated by reference herein in its
entirety, the web-clip widget is encoded as a URL associated with
the bookmark.
[0217] In some embodiments, the web-clip widget is sent (722) to a
web server for storage. In some embodiments, the web-clip widget
stored on the web server is publicly accessible. Storing a
user-created web-clip widget on a publicly accessible server allows
the user to share the web-clip widget with other users.
[0218] In some embodiments, as illustrated in FIG. 7B, an
activation of the icon corresponding to the web-clip widget is
detected (724). For example, a finger gesture (e.g., a tap gesture
606, FIG. 6E) is detected on the icon. In response, the web-clip
widget is displayed (726). For example, in response to detecting
the tap gesture 606, block 3914-5 is displayed, as illustrated in
FIG. 6A in the browser application or, as described above, as its
own mini-application without other elements of the browser UI.
[0219] In some embodiments, as illustrated in FIG. 7C, the web-clip
widget is sent (728) to an electronic device external to the
portable multifunction device. For example, the web-clip widget may
be sent to another portable multifunction device 100. The external
electronic device stores (730) the web-clip widget, detects an
activation (732) of the web-clip widget, and displays the web-clip
widget (734). In some embodiments, the web-clip widget is sent to
the external electronic device via email. In some embodiments, the
web-clip widget is sent to the external electronic device via
instant messaging. As used herein, "instant messaging" refers to
both telephony-based messages (e.g., messages sent using Multimedia
Message Service (MMS)) and Internet-based messages (e.g., messages
sent using Extensible Messaging and Presence Protocol (XMPP),
Session Initiation Protocol for Instant Messaging and Presence
Leveraging Extensions (SIMPLE), or Instant Messaging and Presence
Service (IMPS)). Sending a user-created web-clip widget to another
electronic device provides the user with a way to share the
web-clip widget with other users. Operations 728-734 of FIG. 7C may
be performed as part of process 700 or may be performed as an
independent process.
[0220] In some embodiments, as illustrated in FIG. 7D, an
activation of a widget editing icon (e.g., edit widget icon 3970,
FIG. 5M) is detected (736). In response to detecting the activation
of the widget editing icon, one or more settings associated with
the web-clip widget are displayed (738). In some embodiments, an
animation is displayed (740) of flipping the web-clip widget, to
reveal the one or more settings. As described in U.S. patent
application Ser. No. 11/145,561, "Presenting Clips of Content,"
filed on Jun. 3, 2005, which application is incorporated by
reference herein in its entirety, settings (e.g., preferences)
associated with a web-clip widget may be displayed by flipping the
widget to reveal a user interface to edit the settings. A change to
a setting of the one or more settings is received (742). In some
embodiments, one or more finger gestures are detected to refocus
(744) an area in the web-clip or portion thereof for use by the
web-clip widget. As described in the "Presenting Clips of Content"
application, the user interface revealed by flipping the widget may
include a refocus preference to allow redefinition of the selected
area of the web page for use by the web-clip widget. The change is
stored (746) and display of the one or more settings is ceased
(748). Operations 736-748 of FIG. 7D may be performed as part of
process 700 or may be performed as an independent process.
[0221] In some embodiments, each operation of process 700 is
performed by a portable multifunction device. In some embodiments,
however, one or more operations of process 700 are performed by a
server system in communication with a portable multifunction device
via a network connection. The portable multifunction device may
transmit data associated with the widget creation process to the
server system and may receive information corresponding to the
widget in return. For example, code (e.g., an HTML file, a CSS
file, and/or a JavaScript file, in accordance with some
embodiments, or an XML file and/or a JavaScript file, in accordance
with some other embodiments) associated with the widget may be
generated by the server system and then transmitted to the portable
multifunction device. In general, operations in the widget creation
process may be performed by the portable multifunction device, by
the server system, or by a combination thereof.
[0222] Process 700 creates a widget that allows a user to view a
specified area in a web page upon activation of the widget. The
user thus is spared from having to enlarge and center the area of
the web page that is of interest, such as a particular block of
interest, each time the user visits the web page.
[0223] FIG. 7E is a flow diagram illustrating a process 750 for
creating a web-clip widget from a web page or portion thereof in
accordance with some embodiments. While the web-clip widget
creation process 750 described below includes a number of
operations that appear to occur in a specific order, it should be
apparent that the process 750 can include more or fewer operations,
which can be executed serially or in parallel (e.g., using parallel
processors or a multi-threading environment), an order of two or
more operations may be changed and/or two or more operations may be
combined into a single operation.
[0224] On a touch screen display of a portable multifunction
device, an activation of an options icon (e.g., icon 3920, FIG. 5A)
is detected (752). In some embodiments, a finger gesture (e.g., a
tap gesture) is detected (753) on the options icon.
[0225] An activation of a web-clip widget creation icon (e.g., icon
3973, FIG. 5F) is detected (754). In some embodiments, a finger
gesture (e.g., a tap gesture) is detected (756) on the web-clip
widget creation icon.
[0226] An area in a web page or portion thereof displayed on the
touch screen display is selected (758). In some embodiments,
selecting the area includes toggling (760) between frames that are
successively overlaid on the displayed web page or portion thereof.
For example, in UI 3900J (FIG. 5J), a frame 3958 is displayed
overlaid on the web page 3912 such that it frames block 2 3914-2.
Upon activation of a toggle icon 3956, display of the frame 3958 is
ceased and another frame is displayed overlaid on the web page 3912
such that it frames another block (e.g., block 3 3914-3). Thus, in
some embodiments, the frames successively highlight blocks and
other structural elements of the web page. As described in U.S.
patent application Ser. No. 11/620,492, "Selecting and Manipulating
Web Content," filed on Jan. 5, 2007, which application is
incorporated by reference herein in its entirety, structural
elements that are displayed in a web page can be identified during
the web-clip widget creation process.
[0227] In some embodiments, selecting the area includes detecting
(762) one or more finger gestures to select an area in the web page
or portion thereof for use by the web-clip widget. In some
embodiments, selecting the area includes detecting (764) one or
more finger gestures to scale an area in the web page or portion
thereof for display by the web-clip widget. Examples of finger
gestures used to select and/or scale an area in the web page or
portion thereof include a single tap gesture 3923 or 3925 to center
and enlarge a block (FIG. 5A); a single tap gesture 3927 or 3935 to
center an adjacent enlarged block; a multi-touch depinching gesture
(3931 and 3933) to enlarge the web page; a multi-touch pinching
gesture (not shown) to reduce the web page; swipe gestures such as
a substantially vertical swipe 3937, an upward or diagonal swipe
3939, and/or other swipe gestures (not shown) to scroll the web
page; and/or a multi-touch rotation gesture (3941 and 3943) to
select a portrait or landscape view (FIG. 5C).
[0228] A finishing gesture is detected (766). In some embodiments,
a finger gesture (e.g., a tap gesture) on an icon (e.g., add widget
icon 3954, FIG. 5J or 5K) is detected (768). A web-clip widget is
created (770) from the selected area.
[0229] In some embodiments, creating the web-clip widget includes
requesting a name for the web-clip widget, receiving the name, and
storing the name, in accordance with operation 714 of process 700
(FIG. 7A).
[0230] In some embodiments, creating the web-clip widget includes
creating an icon corresponding to the web-clip widget, in
accordance with operation 716 of process 700. In some embodiments,
the icon corresponding to the web-clip widget is displayed in a
menu or list of icons, in accordance with operation 718 of process
700. In some embodiments, the menu or list of icons comprises a
menu or list of applications and widgets on the multifunction
device. In some embodiments, the menu or list of icons comprises a
menu or list of widgets on the multifunction device. In some
embodiments, the menu or list of icons comprises a menu or list of
user-created widgets on the multifunction device.
[0231] In some embodiments, an activation of the icon corresponding
to the web-clip widget is detected and the web-clip widget is
displayed, in accordance with operations 724 and 726 (FIG. 6B) of
process 700.
[0232] In some embodiments, settings associated with the web-clip
widget are edited, in accordance with operations 736-748 (FIG. 7D)
of process 700.
[0233] In some embodiments, the web-clip widget is stored as a
bookmark in a browser application, in accordance with operation 720
of process 700 (FIG. 7A).
[0234] In some embodiments, the web-clip widget is sent to a web
server for storage, in accordance with operation 722 of process
700. In some embodiments, the web-clip widget is sent to an
external electronic device, in accordance with operations 728-734
(FIG. 7C) of process 700.
[0235] In some embodiments, each operation of process 750 is
performed by a portable multifunction device. In some embodiments,
however, one or more operations of process 750 are performed by a
server system in communication with a portable multifunction device
via a network connection. The portable multifunction device may
transmit data associated with the widget creation process to the
server system and may receive information corresponding to the
widget in return. For example, code (e.g., an HTML file, a CSS
file, and/or a JavaScript file, in accordance with some
embodiments, or an XML file and/or a JavaScript file, in accordance
with some other embodiments) associated with the widget may be
generated by the server system and then transmitted to the portable
multifunction device. In general, operations in the widget creation
process may be performed by the portable multifunction device, by
the server system, or by a combination thereof.
[0236] Process 750, like process 700, creates a widget that allows
a user to view a specified area in a web page upon activation of
the widget, thus sparing the user from having to enlarge and center
the area of the web page that is of interest each time the user
visits the web page.
[0237] FIG. 7F is a flow diagram illustrating a process 780 for
displaying web-clip widgets in accordance with some embodiments. On
a touch screen display on a portable multifunction device, an icon
is displayed (781) corresponding to a plurality of widgets,
including two or more web-clip widgets. For example, in some
embodiments, the icon for user-created widget 149-6 (FIG. 4A)
corresponds to multiple widgets including multiple web-clip
widgets.
[0238] An activation of the icon is detected (782). For example, a
finger gesture (e.g., a tap gesture) on the icon is detected.
[0239] In response to detecting the activation, a first portion of
the two or more web-clip widgets is displayed (783). For example,
UI 3900L (FIG. 5L) displays a first portion that includes a first
user-created widget 149-6-1 and a portion of a second user-created
widget 149-6-2. In another example, UI 3900M (FIG. 5M) displays a
first portion that includes the second user-created widget 149-6-2
and no other widgets or portions thereof. Thus, in some
embodiments, the first portion is a first web-clip widget.
[0240] A gesture is detected (784) on the touch screen display. In
some embodiments, the gesture is a scrolling gesture. For example,
a swipe gesture 3962 (FIGS. 5L and 5M) or 3963 (FIG. 5M) is
detected on the touch screen display.
[0241] In response to detecting the gesture, a second portion of
the two or more web-clip widgets is displayed (785). In some
embodiments, in response to detecting the gesture, a displayed
portion of the two or more web-clip widgets is scrolled from the
first portion to the second portion. For example, in response to
detecting an upward scroll gesture 3962 in UI 3900L (FIG. 5L), a
second portion is displayed that includes more or all of the second
user-created widget 149-6-2 and less or none of the first
user-created widget 149-6-1. In some embodiments, the second
portion is a second web-clip widget (e.g., the second user-created
widget 149-6-2).
[0242] In some embodiments, the gesture is a de-pinching gesture
(e.g., gestures 3931 and 3933, FIG. 5C). In response to detecting
the de-pinching gesture, a displayed portion of the two or more
web-clip widgets is zoomed in from the first portion to the second
portion.
[0243] In some embodiments, the gesture is a finger tap on an area
within the first portion (e.g., a finger tap analogous to gesture
3923, FIG. 5A), and the displayed second portion is centered on the
area and is zoomed in with respect to the first portion.
[0244] FIG. 7G is a flow diagram illustrating a process 790 for
displaying web-clip widgets in accordance with some embodiments. On
a touch screen display on a portable multifunction device, an icon
is displayed (791) corresponding to a plurality of widgets,
including two or more web-clip widgets. For example, in some
embodiments, the icon for user-created widget 149-6 (FIG. 4A)
corresponds to multiple widgets including multiple web-clip
widgets.
[0245] An activation of the icon is detected (792). For example, a
finger gesture (e.g., a tap gesture) on the icon is detected.
[0246] In response to detecting the activation of the icon, a
plurality of icons corresponding to respective widgets in the
plurality of widgets is displayed (793). In some embodiments, the
plurality of icons is displayed in a menu, or in a list. For
example, UI 400B (FIG. 4B) displays a menu of icons corresponding
to user-created widgets 149-6-1 through 149-6-6, and UI 400C (FIG.
4C) displays a list of icons corresponding to user-created widgets
149-6-1 through 149-6-6.
[0247] An activation is detected (794) of a respective icon in the
plurality of icons corresponding to a respective web-clip widget.
In response to detecting the activation of the respective icon, the
respective web-clip widget is displayed (795). For example, in
response to detecting an activation of an icon corresponding to
user-created widget 149-6-2 in UI 400B or UI 400C, user-created
widget 149-6-2 is displayed in UI 3900M (FIG. 5M).
[0248] A gesture is detected (796) on the touch screen display. For
example, a swipe gesture 3962 or 3963 (FIG. 5M) is detected on the
touch screen display. Alternately, a tap gesture 3964 at the top or
a tap gesture 3966 at the bottom of the displayed widget 149-6-2 is
detected. In another example, a tap gesture 3965 at the right side
or a tap gesture 3967 at the left side of the displayed widget
149-6-2 is detected.
[0249] In response to detecting the gesture, display of the
respective web-clip widget is ceased and another web-clip widget is
displayed (797). For example, in response to detecting a downward
swipe 3962, a substantially horizontal left-to-right swipe 3963, a
tap gesture 3967 at the left side of the displayed widget 149-6-2,
or a tap gesture 3964 at the top of the displayed widget 149-6-2, a
previous user-created widget 149-6-1 is displayed. In response to
detecting an upward swipe 3962, a substantially horizontal
right-to-left swipe 3963, a tap gesture 3965 at the right side of
the displayed widget 149-6-2, or a tap gesture 3966 at the bottom
of the displayed widget 149-6-2, a next user-created widget 149-6-3
is displayed.
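The gesture-to-navigation mapping in process 790 reduces to a small dispatch rule, sketched below in Python with hypothetical gesture names and widget identifiers: downward or left-to-right swipes and taps at the top or left select the previous widget, while upward or right-to-left swipes and taps at the bottom or right select the next widget.

    WIDGETS = ["149-6-1", "149-6-2", "149-6-3"]  # hypothetical widget IDs

    def navigate(current_index, gesture):
        """Return the index of the widget to display after `gesture`."""
        previous_gestures = {"swipe_down", "swipe_left_to_right",
                             "tap_top", "tap_left"}
        next_gestures = {"swipe_up", "swipe_right_to_left",
                         "tap_bottom", "tap_right"}
        if gesture in previous_gestures:
            return max(current_index - 1, 0)  # clamp at the first widget
        if gesture in next_gestures:
            return min(current_index + 1, len(WIDGETS) - 1)  # clamp at the last
        return current_index

    # From widget 149-6-2, an upward swipe shows the next widget, 149-6-3,
    # and a tap at the top shows the previous widget, 149-6-1.
    assert WIDGETS[navigate(1, "swipe_up")] == "149-6-3"
    assert WIDGETS[navigate(1, "tap_top")] == "149-6-1"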
[0250] Processes 780 and 790 thus provide user-friendly ways to
view multiple specified areas in web pages without having to surf
between successive web pages and without having to enlarge and
center an area of interest in each web page.
[0251] FIG. 7H is a flow diagram illustrating a process 7000 for
displaying a web-clip widget in accordance with some embodiments.
On a touch screen display on a portable multifunction device, an
icon for a web-clip widget (e.g., 149-6-7, FIG. 6E) is displayed
(7002). The web-clip widget corresponds to a user-specified area of
a web page (e.g., block 3914-5, FIG. 6A).
[0252] In some embodiments, the icon is displayed (7004) in a menu
or list of icons. In some embodiments, the menu or list of icons
comprises a menu or list of applications and widgets (e.g., UI
400A, FIG. 4A) on the multifunction device. In some embodiments,
the menu or list of icons comprises a menu or list of widgets on
the multifunction device. In some embodiments, the menu or list of
icons comprises a menu or list of user-created widgets (e.g., UI
400B or 400C, FIG. 4B or 4C) on the multifunction device.
[0253] In some embodiments, the user-specified area was previously
selected by translating and scaling (7006) a displayed portion of
the web page. In some embodiments, the user-specified area was
previously selected by centering and enlarging (7008) a displayed
portion of the web page. Examples of finger gestures used to
translate, scale, center, and/or enlarge an area in the web page
include a tap gesture 3923 or 3925 to center and enlarge a block
(FIG. 5A); a tap gesture 3927 or 3935 to center an adjacent
enlarged block; a multi-touch depinching gesture (3931 and 3933,
FIG. 5C) to enlarge the web page; a multi-touch pinching gesture
(not shown) to reduce the web page; swipe gestures such as a
substantially vertical swipe 3937, an upward or diagonal swipe
3939, and/or other swipe gestures (not shown) to translate the web
page; and/or a multi-touch rotation gesture (3941 and 3943, FIG.
5C) to select a portrait or landscape view.
[0254] An activation of the icon is detected (7010). In some
embodiments, a finger gesture (e.g., a tap gesture 606, FIG. 6E) is
detected (7012) on the icon.
[0255] In response to detecting activation of the icon, the
user-specified area of the web page is displayed (7014). For
example, in response to activation of the icon for the web-clip
widget 149-6-7 (FIG. 6E), block 3914-5 is displayed (FIG. 6A).
[0256] The process 7000 allows a user to view a specified area in a
web page upon activation of the corresponding icon. The user thus
is spared from having to enlarge and center the area of the web
page that is of interest, such as a particular block of interest,
each time the user visits the web page.
Icon Display and Interface Reconfiguration
[0257] FIGS. 8A-8D illustrate exemplary user interfaces for
displaying icons in accordance with some embodiments. FIGS. 9A and
9B are flow diagrams of an icon display process 900 in accordance
with some embodiments. The process is performed by a computing
device with a touch screen display (e.g., portable multifunction
device 100). The process provides a simple intuitive way for a user
to view a large number of icons (e.g., multiple pages of
application icons and web-clip widget icons) on a touch screen
display.
[0258] The computing device displays (902) a first set of a first
plurality of icons in a first area of the touch screen display
(e.g., area 802, FIG. 8A). The first plurality of icons includes a
plurality of sets of icons that are separately displayed in the
first area of the touch screen display. For example, in FIGS.
8A-8C, icons 141, 148, 144, 143, 155, 149-2, 154, 149-1, 149-4,
149-3, 153, 412, 149-6, 149-6-1, 149-6-2, 149-6-3, 149-6-4,
149-6-5, 149-6-6, 149-6-7, 149-6-8, 149-6-9, 149-6-10, 149-6-11,
149-6-12, 149-6-13, 149-6-14, and 149-6-15 are a first plurality of
icons in area 802. Icons 141, 148, 144, 143, 155, 149-2, 154,
149-1, 149-4, 149-3, 153, 412, and 149-6 form a first set in area
802 in FIG. 8A; icons 149-6-1, 149-6-2, 149-6-3, 149-6-4, 149-6-5,
and 149-6-6 form a second set in area 802 in FIG. 8B; and icons
149-6-7, 149-6-8, 149-6-9, 149-6-10, 149-6-11, 149-6-12, 149-6-13,
149-6-14, and 149-6-15 form a third set in area 802 in FIG. 8C. In
this context, "separately displayed" means when one of the sets is
displayed, the other sets are not concurrently displayed, except
possibly during a brief transition from one set of icons to the
next (e.g., an animation). As this example illustrates, the first
and second sets of the first plurality of icons are distinct sets
of icons.
[0259] In some embodiments, the first plurality of icons includes a
plurality of application launch icons, wherein in response to
detecting activation of an application launch icon in the plurality
of application icons, an application that corresponds to the
activated application icon is launched and displayed. In some
embodiments, the applications include a default set of
applications, third-party applications, and/or web-clip widget
applications. The application launch icons are not for issuing
commands or subcommands within an application. Rather, they are for
launching applications. If an application is already launched, then
activation of the corresponding application launch icon results in
display of the application.
[0260] In some embodiments, the first plurality of icons includes
one or more web-clip widget icons (e.g., widget icon 149-6, FIG.
8A), wherein in response to detecting activation of a web-clip
widget icon, a portion of a web page that corresponds to the
activated web-clip widget icon is displayed.
[0261] The computing device displays (904) a second plurality of
icons in a second area (e.g., tray 408, FIG. 8A) on the touch
screen display while displaying icons in the first plurality of
icons in the first area. For example, in FIGS. 8A-8C, application
launch icons 138, 140, 147, and 152 are displayed in tray 408. The
second area is different (e.g., visually distinct) from the first
area. For example, tray 408 is different from area 802 in FIG. 8A.
In some embodiments, the second plurality of icons correspond to
applications or functions that are frequently used by a user.
[0262] In some embodiments, the second plurality of icons includes
a plurality of application launch icons, wherein in response to
detecting activation of an application icon in the plurality of
application icons, an application that corresponds to the activated
application icon is launched and/or displayed, as explained above.
In some embodiments, the applications include a default set of
applications, third-party applications, and/or web-clip widget
applications.
[0263] The computing device detects (906) a first finger gesture on
the touch screen display in the first area. In some embodiments,
the first finger gesture is a swipe gesture (e.g., swipe 808, FIG.
8A). In some embodiments, the swipe gesture is a horizontal (or
substantially horizontal) swipe gesture on the touch screen
display, from left to right or from right to left on the touch
screen display. In some embodiments, the swipe gesture is a
vertical (or substantially vertical) swipe gesture on the touch
screen display.
[0264] In response to detecting the first finger gesture on the
touch screen display in the first area, the computing device
replaces (908) display of the first set of the first plurality of
icons with display of a second set of the first plurality of icons
in the first area on the touch screen display while maintaining the
display of the second plurality of icons in the second area on the
touch screen display. For example, in response to swipe 808, UI
800A (FIG. 8A) transitions to UI 800B (FIG. 8B). The first set of
icons (141, 148, 144, 143, 155, 149-2, 154, 149-1, 149-4, 149-3,
153, 412, and 149-6 in area 802, FIG. 8A) are replaced by a second
set of icons (149-6-1, 149-6-2, 149-6-3, 149-6-4, 149-6-5, and
149-6-6 in area 802, FIG. 8B) while the display of the second
plurality of icons (138, 140, 147, and 152) is maintained.
[0265] In some embodiments, replacing display of the first set of
the first plurality of icons with display of a second set of the
first plurality of icons in the first area on the touch screen
display comprises an animation that moves the first set out of the
first area and the second set into the first area.
[0266] In some embodiments, the plurality of sets of icons includes
a number of sets of icons that are configured to be separately
displayed as a sequence of sets of icons in the first area of the
touch screen display. In some embodiments, the computing device
displays two or more set-sequence-indicia icons (e.g., icons 804-1,
804-2, and 804-3 in FIGS. 8A-8D). The set-sequence-indicia icons
provide information about the number of sets of icons in the
plurality of sets of icons and a position of a displayed set of
icons in the sequence of sets of icons. In response to detecting
the first finger gesture, the computing device updates (910) the
information provided by the set-sequence-indicia icons to reflect
the replacement of the displayed first set by the second set. For
example, set-sequence-indicia icons 804-1, 804-2, and 804-3 in
FIGS. 8A-8D indicate that there are three sets of icons in the
plurality of sets of icons. The set-sequence-indicia icons 804-1,
804-2, and 804-3 also indicate a position of a displayed set of
icons in the sequence of sets of icons. For example, the
set-sequence-indicia icons are displayed in a sequence, with the
icon that corresponds to the set that is currently displayed being
visually distinguished from the other set-sequence-indicia icons
(e.g., icon 804-1 is darkened in FIG. 8A when the first set is
displayed, icon 804-2 is darkened in FIG. 8B when the second set is
displayed, and icon 804-3 is darkened in FIG. 8C when the third set
is displayed).
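The bookkeeping for the set-sequence-indicia icons amounts to rendering one indicium per set and visually distinguishing the indicium of the currently displayed set. The sketch below is illustrative only; the characters and function name are assumptions, with '*' standing in for a darkened icon.

    def render_indicia(num_sets, displayed_set):
        """One indicium per set; '*' marks the currently displayed set."""
        return "".join("*" if i == displayed_set else "o"
                       for i in range(num_sets))

    print(render_indicia(3, 0))  # *oo  (first set displayed, cf. FIG. 8A)
    print(render_indicia(3, 1))  # o*o  (second set displayed, cf. FIG. 8B)
    print(render_indicia(3, 2))  # oo*  (third set displayed, cf. FIG. 8C)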
[0267] In some embodiments, the computing device detects (912) a
second finger gesture on an icon in the second set of the first
plurality of icons. In response to detecting the second finger
gesture, the computing device displays (914) an application that
corresponds to the icon in the second set upon which the second
finger gesture was detected. For example, in response to a finger
tap gesture 814 (FIG. 8B), user-created widget 149-6-5 is
displayed.
[0268] In some embodiments, the computing device detects (916) a
third finger gesture on the touch screen display while the second
set of the first plurality of icons are displayed. In response to
detecting the third finger gesture, the computing device replaces
(918) display of the second set of the first plurality of icons
with display of a third set of the first plurality of icons in the
first area on the touch screen display while maintaining the
display of the second plurality of icons in the second area on the
touch screen display. For example, in response to detecting swipe
812 (FIG. 8B), the computing device replaces (918) display of the
second set of the first plurality of icons (icons 149-6-1, 149-6-2,
149-6-3, 149-6-4, 149-6-5, and 149-6-6, FIG. 8B) with display of a
third set of the first plurality of icons (icons 149-6-7, 149-6-8,
149-6-9, 149-6-10, 149-6-11, 149-6-12, 149-6-13, 149-6-14, and
149-6-15, FIG. 8C) in area 802 on the touch screen display while
maintaining the display of the second plurality of icons in the
second area on the touch screen display (icons 138, 140, 147, and
152 in tray 408).
[0269] In some embodiments, the computing device detects (920) a
fourth finger gesture on an icon in the third set of the first
plurality of icons. In response to detecting the fourth finger
gesture, the computing device displays (922) an application that
corresponds to the icon in the third set upon which the fourth
finger gesture was detected. For example, in response to a finger
tap gesture 816 (FIG. 8C), user-created widget 149-6-11 is
displayed.
[0270] In some embodiments, the first finger gesture is a swipe
gesture in a first direction and the computing device detects (924)
a second finger swipe gesture on the touch screen display in a
direction that is opposite (or substantially opposite) the first
direction. In response to detecting the second finger swipe
gesture, the computing device replaces (926) display of the first
set of the first plurality of icons with a display of information,
other than a set in the plurality of sets of icons, customized to a
user of the device. In some embodiments, the customized information
includes: local time, location, weather, stocks, calendar entries,
and/or recent messages for the user. For example, in response to
detecting finger swipe gesture 810 (FIG. 8A), the computing device
replaces (926) display of the first set of the first plurality of
icons (icons 141, 148, 144, 143, 155, 149-2, 154, 149-1, 149-4,
149-3, 153, 412, and 149-6, FIG. 8A) with a display of information,
other than a set in the plurality of sets of icons, customized to a
user of the device (e.g., local time, location, weather, stocks,
calendar entries, and recent messages for the user in area 802,
FIG. 8D).
[0271] In some embodiments, the first finger gesture is a swipe
gesture (e.g., swipe 808, FIG. 8A) in a first direction and the
computing device detects (924) a second finger swipe gesture (e.g.,
swipe 810, FIG. 8A) on the touch screen display in a direction that
is opposite (or substantially opposite) the first direction. In
response to detecting the second finger swipe gesture, the
computing device replaces (926) display of the first set of the
first plurality of icons with a display of information, other than
a set in the plurality of sets of icons, customized to a user of
the device, and updates (928) the information provided by a
customized-information indicia icon (e.g., icon 806, FIGS. 8A-8D)
and the set-sequence-indicia icons (e.g., icons 804) to reflect the
replacement of the displayed first set by the information
customized to the user (e.g., icon 806 is darkened in FIG. 8D and
none of the set-sequence-indicia icons 804 are darkened). In some
embodiments, the customized-information indicia icon and the
set-sequence-indicia icons have the same visual appearance (e.g.,
all are circles, not shown). In some embodiments, the
customized-information indicia icon and the set-sequence-indicia
icons are visually distinct (e.g., the customized-information
indicia icon 806 is a star and the set-sequence-indicia icons 804
are circles). In some embodiments, the customized-information
indicia icon 806 and the set-sequence-indicia icons 804 are
adjacent to each other (e.g., as shown in FIGS. 8A-8D).
[0272] Attention is now directed towards interface reconfiguration.
In response to a user initiating an interface reconfiguration mode,
positions of one or more icons displayed on the portable device may
be varied about respective average positions. The varying of the
positions of the one or more icons may include animating the one or
more icons to simulate floating of the one or more icons on a
surface corresponding to a surface of a display in the portable
device. The display may be a touch-sensitive display, which
responds to physical contact by a stylus or one or more fingers at
one or more contact points. While the following embodiments may be
equally applied to other types of displays, a touch-sensitive
display is used as an illustrative example.
[0273] The varying of the positions of the one or more icons may
intuitively indicate to the user that the positions of the one or
more icons may be reconfigured by the user. The user may modify,
adapt and/or reconfigure the positions of the one or more icons. In
embodiments where the portable device includes a touch-sensitive
display, the user may make contact with the touch-sensitive display
proximate to a respective icon at a first position. Upon making
contact with the touch-sensitive display, the respective icon may
cease varying its position. The user may drag the respective icon
to a second position. Upon breaking contact with the
touch-sensitive display, the respective icon may resume varying its
position. In some embodiments, the respective icon can be "thrown,"
so that the final position of the respective icon is different from
the point at which the icon is released. In this embodiment, the
final position can depend on a variety of factors, such as the
speed of the "throw," the parameters used in a simulated equation
of motion for the "throw" (e.g., coefficient of friction), and/or
the presence of a layout grid with simulated attractive forces. In
some embodiments, the display may include two regions. During the
interface reconfiguration mode, positions of one or more icons
displayed in the first region may be varied while positions of one
or more icons displayed in the second region may be stationary.
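A hedged sketch of the "throw" computation, under two stated assumptions that are not taken from the text: the friction is modeled as a constant deceleration (so the stopping distance is v^2/2a), and the layout grid's simulated attractive forces are approximated by snapping the stopping point to the nearest grid intersection.

    import math

    def throw_endpoint(release_xy, velocity_xy, deceleration=600.0, grid=80):
        """Final icon position: friction stopping distance, snapped to a grid.
        Units are illustrative (points and points/second)."""
        vx, vy = velocity_xy
        speed = math.hypot(vx, vy)
        if speed > 0:
            distance = speed ** 2 / (2 * deceleration)  # stopping distance
            x = release_xy[0] + vx / speed * distance
            y = release_xy[1] + vy / speed * distance
        else:
            x, y = release_xy
        # "Attractive" layout grid: snap to the nearest grid intersection.
        return (round(x / grid) * grid, round(y / grid) * grid)

    # A faster throw travels farther before settling onto the grid.
    print(throw_endpoint((100, 100), (300, 0)))  # -> (160, 80)
    print(throw_endpoint((100, 100), (900, 0)))  # -> (800, 80), farther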
[0274] The user may similarly modify, adapt and/or reconfigure the
positions of additional icons during the interface reconfiguration
mode. When the user has completed these changes (at least for the
time being), he or she may terminate the interface reconfiguration
mode. In response to this user action, the portable device may
return to a normal mode of operation and the varying of the
displayed positions of the one or more icons will cease.
[0275] The user may initiate or terminate the interface
reconfiguration process by selecting one or more appropriate
physical buttons on the portable device (e.g., menu button 204,
FIG. 2), by a gesture (such as making contact and swiping one or
more fingers across the touch-sensitive display or making contact
and holding for more than a predefined time period) and/or by
selecting one or more soft buttons (such as one or more icons that
are displayed on the touch-sensitive display). As used herein, a
gesture is a motion of the object/appendage making contact with the
touch screen display surface. Exemplary gestures include finger tap
gestures and finger swipe gestures. In some embodiments, the
interface reconfiguration process terminates a predefined time
after the interface reconfiguration process is initiated, i.e.,
there is a time out.
[0276] The one or more icons displayed on the portable device may
be graphical objects. In some embodiments, the one or more icons
may be on-screen representations of controls that may be
manipulated by the user, such as bars, buttons and text boxes. In
some embodiments, the one or more icons correspond to application
programs (email, browser, address book, etc.) and/or web-clip
widgets that may be selected by the user by contacting the
touch-sensitive display proximate to an icon of interest.
[0277] FIG. 10 is a flow diagram of a position adjustment process
1000 for a portable multifunction device in accordance with some
embodiments. While the position adjustment process 1000 described
below includes a number of operations that appear to occur in a
specific order, it should be apparent that the process 1000 can
include more or fewer operations, which can be executed serially or
in parallel (e.g., using parallel processors or a multi-threading
environment); the order of two or more operations may be changed,
and/or two or more operations may be combined into a single
operation.
[0278] In the position adjustment process 1000, a plurality of
icons are displayed in a GUI in a touch-sensitive display (1002). A
first predefined user action that initiates an interface
reconfiguration process is detected (1004). Exemplary predefined
user actions include selecting a physical button on the portable
device, making a predefined gesture on the touch screen display
surface, or selecting a soft button. Position(s) of one or more of
the plurality of displayed icons are varied about respective
average position(s) (1006). A point of contact with the
touch-sensitive display at a first position of a respective icon is
detected (1008). Movement of the point of contact to a second
position is detected (1010). Movement of the respective icon to the
second position is displayed and the respective icon is displayed
at the second position (1012).
[0279] If a second predefined user action that terminates the
interface reconfiguration process is detected (1014--yes), the
position(s) of the one or more icons is fixed (1016). Exemplary
predefined user actions include selecting or deselecting a physical
button on the portable device (e.g., menu button 204, FIG. 2),
making another predefined gesture on the touch screen display
surface, or selecting or deselecting a soft button. The fixed
position(s) may correspond to a respective average position(s) for
the one or more icons. If a second predefined user action that
terminates the interface reconfiguration process is not detected
(1014--no), the process may continue when a point of contact
proximate to the same or another icon is detected (1008).
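The control flow of process 1000 can be summarized as a small event loop; the event names and data shapes below are hypothetical, but the branches mirror operations 1008-1016.

    def run_reconfiguration(events, positions):
        """positions: dict icon -> (x, y). Returns the final fixed positions."""
        dragging = None
        for kind, payload in events:
            if kind == "contact":              # (1008) finger lands on an icon
                dragging = payload
            elif kind == "move" and dragging:  # (1010) the contact moves
                positions[dragging] = payload  # (1012) the icon follows it
            elif kind == "release":
                dragging = None                # icon may resume varying
            elif kind == "terminate":          # (1014--yes) e.g. menu button
                return positions               # (1016) positions are fixed
        return positions

    events = [("contact", "stocks 149-2"), ("move", (110, 240)),
              ("release", None), ("terminate", None)]
    print(run_reconfiguration(events, {"stocks 149-2": (40, 80)}))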
[0280] FIGS. 11A-11OO illustrate exemplary user interfaces during
interface reconfiguration in accordance with some embodiments.
[0281] In some embodiments, the user interface on the touch screen
display 112 is divided into multiple sections or windows. For
example, in FIG. 11A, a region of UI 1100A may include a tray 408
for holding icons or graphical objects representing functions that
are frequently used by the user (e.g., phone 138, mail 140, and
browser 147) and a region or area 802 for holding icons or
graphical objects representing functions that are used less
frequently by the user (e.g., IM 141, calendar 148, image
management 144, etc.).
[0282] FIGS. 11B-11D illustrate the portable multifunction device
100 during the interface reconfiguration mode in accordance with
some embodiments. After the interface reconfiguration mode is
initiated, the display of one or more of the icons in the area 802
is modified from the previous stationary positions to time-varying
positions. As noted previously, the display may include animating
one or more of the icons to simulate floating of one or more of the
icons on a surface corresponding to the display surface. For
example, the animated varying of the positions of one or more of
the icons during the interface reconfiguration mode may resemble
that of a hockey puck in an air hockey game. The displayed
position(s) of a respective icon in the one or more icons may be varied in a
region 1104 (FIG. 11B) centered on the average position of the
respective icon.
[0283] While FIGS. 11B-11D illustrate movement of one or more of
the icons in the area 802, in other embodiments positions of one or
more of the icons in another region of the user interface, such as
tray 408, may be varied separately or in addition to those of one
or more of the icons in area 802.
[0284] The time-varying position(s) of one or more of the icons in
area 802 intuitively indicate to the user that the positions of one
or more of the icons may be modified. This is illustrated in FIGS.
11C-11D, which show the repositioning of an icon during the
interface reconfiguration mode. The user makes contact with one of
the icons that is moving at a position 1108 and moves the point of
contact across the display surface. The contact and the motion are
detected by the portable multifunction device 100. As a
consequence, the displayed icon, in this example corresponding to a
stocks application 149-2, is moved accordingly.
[0285] As shown in FIG. 11D, the user moves the stocks application
icon 149-2 to position 1110 and breaks contact with the display
surface. The stocks application icon 149-2 is now displayed at the
position 1110. While the displayed position of the stocks
application icon 149-2 is shown as stationary in FIG. 11D, in some
embodiments the position of the stocks application icon 149-2 may
be varied once the user breaks contact with the display surface. In
some embodiments, only icons displayed in one or more subsections
of the user interface are displayed with a varying position during
the interface reconfiguration mode. Thus, if the stocks application
icon 149-2 had been dragged to another position in the area 802, it
may be displayed with a varying position after the user breaks
contact with the display. In some embodiments, the device may
provide audio and/or tactile feedback when an icon is moved to a
new position, such as an audible chime and/or a vibration.
[0286] FIG. 11D also illustrates the optional displacement of the
browser icon 147 to position 1112. The browser icon 147 was
displaced from its initial position to its new position 1112 due to
at least partial overlap with the stocks application icon 149-2,
i.e., when the portable multifunction device 100 determined that
the user positioned the stocks application icon 149-2 over the
browser icon 147, the displayed position of the browser icon 147
was changed.
[0287] In other embodiments, an icon may be evicted or removed from
the tray 408 when an additional icon, such as the iPod icon 152, is
added to the tray 408. For example, the tray 408 may be configured
to accommodate a finite number of icons, such as 4 icons. If an
additional icon is added to the tray 408, a nearest icon to the
additional icon or an icon that at least partially overlaps the
additional icon may be evicted or removed from the tray 408. In
some embodiments, the evicted icon floats or zooms from its
position in tray 408 to a new position in area 802, where it may
join a sorted list of icons. In some embodiments, if the eviction
process is not completed (e.g., the additional icon is not added to
tray 408), the evicted icon may halt its progress towards its new
position in area 802 and return to its position in tray 408.
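A minimal sketch of this eviction rule, with hypothetical names and the four-icon capacity used in the example above: adding an icon to a full tray evicts the occupant of the drop slot (standing in for the nearest or overlapped icon), which then joins a sorted list in area 802.

    TRAY_CAPACITY = 4

    def add_to_tray(tray, area, new_icon, drop_slot):
        """Add `new_icon` to `tray` at `drop_slot`, evicting if the tray is full."""
        if len(tray) >= TRAY_CAPACITY:
            evicted = tray.pop(min(drop_slot, len(tray) - 1))
            area.append(evicted)
            area.sort()                  # evicted icon joins a sorted list
        tray.insert(min(drop_slot, len(tray)), new_icon)
        return tray, area

    tray = ["phone 138", "mail 140", "browser 147", "music 152"]
    area = ["calendar 148", "camera 143"]
    tray, area = add_to_tray(tray, area, "stocks 149-2", drop_slot=2)
    print(tray)  # browser 147 evicted from slot 2; stocks 149-2 replaces it
    print(area)  # browser 147 now sorted into the main area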
[0288] FIG. 11E illustrates the user interface after the interface
reconfiguration mode has been terminated or has terminated (due to
a time out) in accordance with some embodiments. The icons in UI
1100E have stationary positions. The stocks application icon 149-2
and the browser icon 147 are displayed in their new positions in
the tray 408.
[0289] The animated effects during the interface reconfiguration
mode, such as the varying position(s) of one or more of the icons,
may be in accordance with corresponding equations of motion for one
or more of the icons in a plane substantially coincident with the
touch screen display surface. The equations of motion may have a
coefficient of friction less than a threshold allowing the
simulation and/or animation of floating or sliding of one or more
of the icons. The equation of motion for the respective icon may
have a non-zero initial velocity, a non-zero angular velocity,
and/or a restoring force about the respective average position of
the respective icon such that the position of the respective icon
oscillates in the region 1104 (FIG. 11B) substantially centered on
the respective average position of the respective icon.
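These equations of motion can be illustrated with a lightly damped spring: a restoring force about the average position, a friction coefficient below the threshold, and a non-zero initial velocity yield a position that oscillates in a small region centered on the average. The integrator and constants below are assumptions for illustration, not values from the text.

    def jiggle_positions(average_xy, v0=(40.0, 0.0), k=30.0, friction=0.5,
                         dt=1/60, steps=120):
        """Integrate x'' = -k*x - friction*x' about the average position."""
        x, y = 0.0, 0.0      # displacement from the average position
        vx, vy = v0          # non-zero initial velocity starts the motion
        frames = []
        for _ in range(steps):
            ax, ay = -k * x - friction * vx, -k * y - friction * vy
            vx, vy = vx + ax * dt, vy + ay * dt   # semi-implicit Euler step
            x, y = x + vx * dt, y + vy * dt
            frames.append((average_xy[0] + x, average_xy[1] + y))
        return frames

    # The icon oscillates in a region substantially centered on (200, 300).
    path = jiggle_positions((200.0, 300.0))
    print(min(p[0] for p in path), max(p[0] for p in path))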
[0290] In some embodiments, the position of the respective icon may
be varied during the interface reconfiguration mode in such a way
that the respective icon rotates about the respective average
position of the respective icon while maintaining a fixed
orientation with respect to the user interface and the portable
electronic device 100. This is illustrated in FIGS. 11F and 11G. In
this example, the position of the online video icon 155 in area 802
is varied in such a way that it maintains a fixed orientation in
region 1104. This may make it easier for the user to determine the
function of the respective icon during the interface
reconfiguration mode.
[0291] FIGS. 12A-12F are flow diagrams of icon reconfiguration
processes 1200 in accordance with some embodiments. The processes
are performed by a computing device with a touch screen display
(e.g., portable multifunction device 100). While the icon
reconfiguration processes 1200 described below include a number of
operations that appear to occur in a specific order, it should be
apparent that the processes 1200 can include more or fewer
operations, which can be executed serially or in parallel (e.g.,
using parallel processors or a multi-threading environment); the
order of two or more operations may be changed, and/or two or more
operations may be combined into a single operation.
[0292] The computing device displays (1202) a first set of a first
plurality of icons in a first area of the touch screen display
(e.g., area 802, FIG. 11H). The first plurality of icons includes a
plurality of sets of icons that are separately displayed in the
first area of the touch screen display. For example, in FIGS.
11H-11OO, icons 141, 148, 144, 143, 155, 149-2, 154, 149-1, 149-4,
149-3, 153, 412, 152, 149-6-20, 149-6-21, 149-6-22, 149-6-30,
149-6-31, 149-6-32, 149-6-33, 149-6-34, 149-6-35, 149-6-40,
149-6-41, 149-6-42, 149-6-43, 149-6-44, and 149-6-45 are a first
plurality of icons in area 802. Icons 141, 148, 144, 143, 155,
149-2, 154, 149-1, 149-4, 149-3, 153, 412, 152, 149-6-20, 149-6-21,
and 149-6-22 form a first set in area 802 in FIG. 11H; icons
149-6-30, 149-6-31, 149-6-32, 149-6-33, 149-6-34 and 149-6-35 form
a second set in area 802 in FIG. 11Z; and icons 149-6-40, 149-6-41,
149-6-42, 149-6-43, 149-6-44, and 149-6-45 form a third set in area
802 in FIG. 11KK. In this context, "separately displayed" means
when one of the sets is displayed, the other sets are not
concurrently displayed, except possibly during a brief transition
from one set of icons to the next (e.g., an animation). As this
example illustrates, respective sets in the first plurality of
icons are distinct sets of icons, although an icon can be moved
from one set to another set during the icon reconfiguration process
(e.g., as described below using calculator icon 149-3 as an
example).
[0293] In some embodiments, the first plurality of icons includes a
plurality of application launch icons, wherein in response to
detecting activation of an application icon in the plurality of
application icons when the user interface reconfiguration process
is not active, an application that corresponds to the activated
application icon is launched and displayed. In some embodiments,
the applications include a default set of applications, third-party
applications, and/or web-clip widget applications. As noted above,
the application launch icons are not for issuing commands or
subcommands within an application. Rather, they are for launching
applications. If an application is already launched, then
activation of the corresponding application launch icon results in
display of the application.
[0294] In some embodiments, the first plurality of icons includes
one or more web-clip widget icons (e.g., web-clip widget icons
149-6-20, 149-6-21, and 149-6-22, FIG. 11H), wherein in response to
detecting activation of a web-clip widget icon when the user
interface reconfiguration process is not active, a portion of a web
page that corresponds to the activated web-clip widget icon is
displayed.
[0295] The computing device displays (1204) a second plurality of
icons in a second area on the touch screen display (e.g., tray 408,
FIG. 11H) while displaying icons in the first plurality of icons in
the first area. The second area is different (e.g., visually
distinct) from the first area. For example, tray 408 is different
from area 802 in FIG. 11H. In some embodiments, the second
plurality of icons correspond to applications or functions that are
frequently used by a user.
[0296] In some embodiments, the second plurality of icons includes
a plurality of application launch icons, wherein in response to
detecting activation of an application icon in the plurality of
application icons when the user interface reconfiguration process
is not active, an application that corresponds to the activated
application icon is launched and/or displayed, as explained above.
In some embodiments, the applications include a default set of
applications, third-party applications, and/or web-clip widget
applications.
[0297] The computing device detects (1206) a first finger gesture
on the touch screen display. In some embodiments, the first finger
gesture is a stationary (or substantially stationary) contact with
an icon in the first set of the first plurality of icons (e.g.,
gesture 1114 on stocks icon 149-2, FIG. 11H) for greater than a
predetermined time (e.g., 0.5-2.0 seconds). In some embodiments,
the first finger gesture is on an edit icon (not shown). In some
embodiments, the first finger gesture is on any application
icon.
[0298] In response to detecting the first finger gesture, the
computing device initiates a user interface reconfiguration
process, and varies positions of one or more icons in the first set
of the first plurality of icons about respective average positions
(1208). In some embodiments, in response to detecting the first
finger gesture, the computing device also varies (1210) positions
of one or more icons in the second plurality of icons about
respective average positions (e.g., UI 1100I, FIG. 11I).
[0299] In some embodiments, the varying includes animating the one
or more icons to simulate floating of the one or more icons on a
surface corresponding to a surface of the touch screen display.
[0300] In some embodiments, the varying position of a respective
icon in the one or more icons corresponds to an equation of motion
in a plane substantially coincident with the touch screen display,
the equation of motion having a coefficient of friction less than a
threshold. In some embodiments, the equation of motion for the
respective icon has a non-zero initial velocity. In some
embodiments, the equation of motion for the respective icon has a
restoring force about a respective average position of the
respective icon such that the position of the respective icon
oscillates in a region substantially centered on the respective
average position of the respective icon. In some embodiments, the
equation of motion for the respective icon includes a non-zero
angular velocity. In some embodiments, the respective icon rotates
about the respective average position of the respective icon while
maintaining a fixed orientation with respect to the touch screen
display.
[0301] In some embodiments, the varying includes randomly varying
each icon in the first set of the first plurality of icons about a
respective average position.
[0302] In some embodiments, icons displayed in at least one of the
first area and the second area include icons that may be deleted by
a user and icons that may not be deleted by the user. In some
embodiments, the computing device visually distinguishes (1212) the
icons that may be deleted by the user from the icons that may not
be deleted by the user; detects (1214) one or more finger gestures
corresponding to a request to delete an icon that may be deleted by
the user; and, in response to detecting the one or more finger
gestures corresponding to the request to delete the icon, deletes
(1216) the icon. For example, in FIG. 11I, only the web-clip
widgets 149-6 may be deleted, so these icons have circled "X"
deletion icons 1116 next to them to visually indicate that these
icons may be deleted. In response to detecting a finger gesture on
the deletion icon 1116 (FIG. 11I) for icon 149-6-22 (FIG. 11I),
icon 149-6-22 is deleted (FIG. 11J).
[0303] In some embodiments, third-party applications and web-clip
widgets may be deleted, but core or default applications may not be
deleted. In some embodiments, if the device is reset, the default
applications are displayed in the first set in area 802 and in tray
408, with the third-party applications and web-clip widgets
deleted. In some embodiments, if the device is reset, the default
applications are displayed in the first set in area 802 and in tray
408, with the third-party applications and web-clip widgets
displayed after the default applications in the first set in area
802. In some embodiments, if the device is reset, the default
applications are displayed in the first set in area 802 and in tray
408, with the third-party applications and web-clip widgets
displayed in a second set in area 802.
[0304] In some embodiments, the computing device detects (1218) a
user making a point of contact with the touch screen display at a
first position corresponding to a first icon in the first set and
detects movement of the point of contact to a second position on
the touch screen display. In response to detecting the point of
contact and detecting movement of the point of contact, the
computing device displays (1220) movement of the first icon to the
second position on the touch screen display and displays the first
icon at the second position. In some embodiments, the second
position is in the first area. For example, in response to
detecting point of contact 1118 on stocks icon 149-2 (FIG. 11J) and
detecting movement of the point of contact, the computing device
displays (1220) movement of the stocks icon 149-2 to the second
position (FIG. 11J) on the touch screen display and displays the
stocks icon 149-2 at the second position (FIG. 11L).
[0305] In some embodiments, the computing device moves (1222) a
second icon from a respective initial position to a respective new
position when the second position of the first icon at least
partially overlaps with the respective initial position of the
second icon. For example, the iPod icon 152, which overlaps with
the stocks icon 149-2 (FIG. 11J), is moved to a new position (FIGS.
11K-11L). In some embodiments, the second icon is either in the
first area (e.g., area 802) or the second area (e.g., tray
408).
[0306] In some embodiments, the second position is in the first
area and the computing device rearranges (1224) icons in the first
set other than the first icon to accommodate display of the first
icon at the second position in the first area (e.g., as shown in
FIGS. 11K-11L).
[0307] In some embodiments, rearranging (1224) icons in the first
set other than the first icon includes compacting (1226) at least
some of the icons in the first set other than the first icon to
place an icon in the first position, which was previously occupied
by the first icon (e.g., as shown in FIGS. 11K-11L).
[0308] In some embodiments, rearranging (1224) icons in the first
set other than the first icon includes snaking (1228) at least some
of the icons in the first set other than the first icon to place an
icon in the first position, which was previously occupied by the
first icon (e.g., as shown in FIGS. 11K-11L).
[0309] In some embodiments, rearranging (1224) icons in the first
set other than the first icon includes moving (1230) an icon in the
first set to the first position, which was previously occupied by
the first icon, wherein the moved icon was at the second position
prior to movement of the first icon (e.g., as shown in FIG. 11M).
In other words, the icons in the first position and the second
position are swapped.
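The compacting/snaking and swapping variants of operation 1224 amount to two different list rearrangements, sketched below with hypothetical icon labels: snaking shifts every icon between the vacated slot and the drop slot by one position, while swapping exchanges only the two icons.

    def snake_move(icons, src, dst):
        """Move icons[src] to index dst; the icons in between shift by one."""
        icon = icons.pop(src)
        icons.insert(dst, icon)
        return icons

    def swap_move(icons, src, dst):
        """Exchange the icons at src and dst, leaving the rest in place."""
        icons[src], icons[dst] = icons[dst], icons[src]
        return icons

    row = ["IM 141", "calendar 148", "photos 144", "camera 143", "stocks 149-2"]
    print(snake_move(row[:], 4, 1))  # stocks 149-2 lands at slot 1; rest snake
    print(swap_move(row[:], 4, 1))   # stocks 149-2 and calendar 148 trade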
[0310] In some embodiments, the computing device fixes (1232) a
position of the first icon at the second position and ceases to
vary positions of the one or more icons in the first set in
response to detecting a predefined user action for terminating the
user interface reconfiguration process (e.g., as shown in FIG.
11N). In some embodiments, the predefined user action is activation
of a physical button (e.g., menu button 204, FIG. 11N) or a soft
button (e.g., a done icon, not shown).
[0311] In some embodiments, the computing device detects (1234) a
user making a first point of contact (e.g., contact 1120, FIG. 11O)
with the touch screen display at a first position corresponding to
a first icon in the first set and detects movement of the first
point of contact to a second position in the second area on the
touch screen display (e.g., as shown in FIG. 11O). In response to
detecting the first point of contact and detecting movement of the
first point of contact, the computing device displays (1236)
movement of the first icon to the second position in the second
area on the touch screen display and displays the first icon at the
second position (e.g., as shown in FIG. 11Q). In some embodiments,
icons in the second area are symmetrically distributed about the
center of the second area (e.g., as shown in FIG. 11R).
[0312] In some embodiments, the computing device moves (1238) a
third icon in the second plurality of icons from a respective
initial position to a respective new position when the new position
of the first icon at least partially overlaps with the respective
initial position of the third icon (e.g., as shown in FIGS.
11P-11R, where mail icon 140 and browser icon 147 move to new
positions). In some embodiments, the size of the icons in the
second area (e.g., tray 408) is reduced as more icons are added,
until a predetermined maximum number (e.g., 6 icons) is reached
(e.g., as shown in FIG. 11S). In some embodiments, after the
maximum is reached, icons must be removed from the second area
prior to adding more icons to the second area. In some embodiments,
after the maximum is reached, icons in the second area are evicted
from the second area when more icons are added to the second
area.
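The tray-sizing rule reduces to shrinking the icon edge length as icons are added until the maximum count is reached. In the sketch below, the tray width, base size, and padding are hypothetical values; the maximum of six icons is taken from the example above.

    TRAY_WIDTH = 320.0   # points, hypothetical
    MAX_ICONS = 6        # predetermined maximum from the example above

    def tray_icon_size(count, base=76.0, padding=4.0):
        """Icon edge length so `count` padded icons fit across the tray."""
        if count > MAX_ICONS:
            raise ValueError("remove or evict an icon before adding more")
        fitted = TRAY_WIDTH / count - padding
        return min(base, fitted)

    for n in (4, 5, 6):
        print(n, round(tray_icon_size(n), 1))  # size shrinks as icons are added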
[0313] In some embodiments, the computing device detects (1240) the
user making a second point of contact (e.g., contact 1122, FIG.
11T) with the touch screen display at a third position
corresponding to a second icon in the second plurality of icons in
the second region on the touch screen display and detects movement
of the second point of contact to a fourth position in the first
region on the touch screen display (e.g., as shown in FIG. 11T).
The computing device responds (1242) to detecting the second point
of contact and detecting movement of the second point of contact by
displaying movement of the second icon to the fourth position of
the touch screen display and displaying the second icon at the
fourth position (e.g., as shown in FIGS. 11T-11V, where mail icon
140 moves from the tray 408 to the first area 802). In some
embodiments, the computing device fixes (1244) a position of the
second icon at the fourth position and ceases to vary positions of
the one or more icons in the first set in response to detecting a
predefined user action for terminating the user
interface reconfiguration process (e.g., as shown in FIG. 11W). In
some embodiments, the predefined user action is activation of a
physical button (e.g., menu button 204, FIG. 11W) or a soft button
(e.g., a done icon, not shown).
[0314] In some embodiments, the computing device detects (1246) a
second finger gesture on a first icon in the first set on the touch
screen display. In response to detecting the second finger gesture,
the computing device replaces (1256) display of the first set of
the first plurality of icons with display of a second set of the
first plurality of icons in the first area on the touch screen
display, and moves the first icon from the first set to the second
set (e.g., as shown in FIGS. 11X-11Z, where the first set of icons
(141, 148, 144, 143, 155, 149-2, 154, 149-1, 149-4, 149-3, 153,
412, 152, 149-6-20, 149-6-21, and 149-6-22) is replaced by the
second set of icons (149-6-30, 149-6-31, 149-6-32, 149-6-33,
149-6-34 and 149-6-35) and the calculator icon 149-3 is moved from
the first set to the second set).
[0315] In some embodiments, detecting the second finger gesture
includes: detecting (1248) a user making a first point of contact
(e.g., contact 1124, FIG. 11X) with the touch screen display at a
first position corresponding to the first icon in the first set and
detecting movement of the first point of contact to an edge of the
first area; and in response to detecting the first point of contact
and detecting movement of the first point of contact to the edge of
the first area, displaying (1250) movement of the first icon to the
edge of the first area (e.g., as shown in FIG. 11X for calculator
icon 149-3). In some embodiments, the edge of the first area coincides
with an edge of the touch screen display. In some embodiments, the
plurality of sets form a sequence of sets and going to the right
edge results in display of the next set in the sequence of sets and
going to the left edge results in display of the previous set in
the sequence of sets. In some embodiments, the plurality of sets
form a sequence of sets and going to the bottom edge results in
display of the next set in the sequence of sets and going to the
top edge results in display of the previous set in the sequence of
sets.
[0316] In some embodiments, detecting the second finger gesture
includes detecting (1252) the first point of contact at the edge of
the first area for greater than a predetermined time (e.g., 0.2-1.0
seconds).
[0317] In some embodiments, detecting the second finger gesture
includes detecting (1254) movement of the first point of contact
away from the edge of the first area and then detecting another
movement of the first point of contact back to the edge of the
first area (e.g., as shown in FIG. 11II for calculator icon 149-3)
within a predetermined time (e.g., 0.2-0.5 seconds).
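A sketch of this drag-to-edge paging rule under stated assumptions: an illustrative edge width, and a dwell threshold chosen within the 0.2-1.0 second range mentioned above. Holding a dragged icon at the right or left edge long enough pages to the next or previous set, clamped at the ends of the sequence.

    EDGE_WIDTH = 20      # points from the screen edge that count as "the edge"
    DWELL_SECONDS = 0.5  # within the 0.2-1.0 s range mentioned above

    def page_for_drag(x, screen_width, dwell, current_set, num_sets):
        """Return the index of the set to display for a drag at position x."""
        if dwell < DWELL_SECONDS:
            return current_set
        if x >= screen_width - EDGE_WIDTH:      # right edge -> next set
            return min(current_set + 1, num_sets - 1)
        if x <= EDGE_WIDTH:                     # left edge -> previous set
            return max(current_set - 1, 0)
        return current_set

    # Dragging the calculator icon to the right edge and holding pages from
    # the first set (index 0) to the second set (index 1).
    print(page_for_drag(x=315, screen_width=320, dwell=0.6,
                        current_set=0, num_sets=3))  # -> 1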
[0318] In some embodiments, the computing device detects (1258) a
user making a second point of contact with the touch screen display
at a second position corresponding to the first icon in the second
set and detects movement of the second point of contact to a third
position on the touch screen display. In response to detecting the
second point of contact and detecting movement of the second point
of contact, the computing device displays (1260) movement of the
first icon to the third position on the touch screen display and
displays the first icon at the third position (e.g., as shown in
FIGS. 11Z, 11AA, and 11CC-11EE for calculator icon 149-3). In some
embodiments, the third position is in the first area. In some
embodiments, the first icon is the only icon in the second set
(e.g., as shown in FIG. 11BB for calculator icon 149-3). In other
words, the first icon is added to an otherwise empty first area. In
some embodiments, the plurality of sets of icons that are
separately displayed in the first area comprise a sequence of sets
and, during the reconfiguration process, an empty set is added
after the last set of icons in the sequence of sets.
[0319] In some embodiments, positions of one or more icons in the
second set of the first plurality of icons vary about respective
average positions (e.g., as shown in FIGS. 11Z and 11AA). In some
embodiments, positions of all of the icons in the second set vary
about respective average positions. In some embodiments, positions
of all of the icons in the second set except the first icon vary
about respective average positions.
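The variation of icon positions about respective average positions may be pictured as a small periodic offset applied around each icon's resting (average) position while the reconfiguration process is active. One way to compute such an offset is sketched below; the sinusoidal form, amplitude, frequency, and per-icon phase are illustrative assumptions and are not taken from the disclosure.

    import Foundation

    // Hypothetical per-frame offset that makes an icon oscillate about its
    // average position while the reconfiguration process is active.

    struct Point { var x: Double; var y: Double }

    /// Offset of an icon from its average position at time `t`.
    /// Each icon gets its own phase so the icons do not move in lockstep.
    func jiggleOffset(iconIndex: Int, t: TimeInterval,
                      amplitude: Double = 2.0, frequency: Double = 4.0) -> Point {
        let phase = Double(iconIndex) * 0.7
        return Point(x: amplitude * sin(2 * .pi * frequency * t + phase),
                     y: amplitude * cos(2 * .pi * frequency * t + phase))
    }

    /// Displayed position = average position + time-varying offset.
    func displayedPosition(average: Point, iconIndex: Int, t: TimeInterval) -> Point {
        let d = jiggleOffset(iconIndex: iconIndex, t: t)
        return Point(x: average.x + d.x, y: average.y + d.y)
    }

    let p = displayedPosition(average: Point(x: 100, y: 200), iconIndex: 3, t: 0.25)
    print(p.x, p.y)  // a point a few pixels away from (100, 200)

Holding the first icon fixed while the others vary, as in the last-mentioned embodiment, would simply skip this offset for the dragged icon.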
[0320] In some embodiments, the third position is in the first area
and the computing device rearranges (1262) icons in the second set
other than the first icon to accommodate display of the first icon
at the third position in the first area (e.g., as shown in FIGS.
11CC and 11DD). In some embodiments, rearranging icons in the
second set other than the first icon includes compacting (1264) at
least some of the icons in the first set and the second set other
than the first icon (e.g., as shown in FIGS. 11FF and 11GG, where
icon 149-6-30 is compacted into the first set). In some
embodiments, rearranging icons in the second set other than the
first icon includes snaking (1266) at least some of the icons in
the second set other than the first icon (e.g., as shown in FIGS.
11CC and 11DD).
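The "snaking" rearrangement may be modeled on a grid laid out in row-major (reading) order: inserting the dragged icon at the target cell shifts the icons that follow it one cell onward, wrapping to the next row as needed. The following is a minimal sketch under that assumption; the function names are hypothetical.

    // Hypothetical "snaking" rearrangement on a grid stored in row-major
    // order: inserting the dragged icon at the target cell pushes the
    // following icons one position onward, wrapping across rows.

    func snakeInsert(_ icons: [String], dragged: String, targetIndex: Int) -> [String] {
        var rest = icons.filter { $0 != dragged }            // remove the dragged icon
        rest.insert(dragged, at: min(targetIndex, rest.count))
        return rest                                          // row-major order = display order
    }

    /// Convert a row-major index to a (row, column) grid cell.
    func cell(forIndex i: Int, columns: Int) -> (row: Int, column: Int) {
        return (i / columns, i % columns)
    }

    let before = ["A", "B", "C", "D", "E", "F"]
    let after = snakeInsert(before, dragged: "F", targetIndex: 1)
    print(after)                          // ["A", "F", "B", "C", "D", "E"]
    print(cell(forIndex: 4, columns: 4))  // (row: 1, column: 0): "D" snaked onto the next row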
[0321] In some embodiments, the plurality of sets of icons includes
a number of sets of icons that are configured to be separately
displayed as a sequence of sets of icons in the first area of the
touch screen display. The computing device displays (1268) two or
more set-sequence-indicia icons (e.g., icons 804-1, 804-2, and
804-3 in FIGS. 11H-11OO, which operate in the same manner as the
corresponding icons in FIGS. 8A-8D, described above). The
set-sequence-indicia icons provide information about the number of
sets of icons in the plurality of sets of icons and a position of a
displayed set of icons in the sequence of sets of icons. In
response to detecting the second finger gesture, the computing
device updates (1270) the information provided by the
set-sequence-indicia icons to reflect the replacement of the
displayed first set by the second set (e.g., icon 804-1 is darkened
in FIG. 11X when the first set is displayed and icon 804-2 is
darkened in FIG. 11Z when the second set is displayed).
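The set-sequence-indicia icons behave like a page indicator: one indicium per set, with the indicium for the currently displayed set shown differently (e.g., darkened), and the row of indicia updated whenever the displayed set changes. A text-only sketch of that update is given below; it is purely illustrative and the function name is hypothetical.

    // Hypothetical text rendering of the set-sequence-indicia icons: one
    // mark per set, with the indicium for the displayed set darkened.

    func sequenceIndicia(totalSets: Int, displayedSet: Int) -> String {
        return (0..<totalSets)
            .map { $0 == displayedSet ? "●" : "○" }   // darkened vs. open indicium
            .joined(separator: " ")
    }

    // First set displayed (cf. icon 804-1 darkened), then the second set
    // displayed after the gesture is detected (cf. icon 804-2 darkened).
    print(sequenceIndicia(totalSets: 3, displayedSet: 0))  // ● ○ ○
    print(sequenceIndicia(totalSets: 3, displayedSet: 1))  // ○ ● ○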
[0322] In some embodiments, the computing device fixes (1272) a
position of the first icon at the third position and ceases to vary
positions of the one or more icons in the second set in response to
detecting a predefined user action for terminating the user
interface reconfiguration process (e.g., as shown in FIGS. 11EE and
11HH). In some embodiments, the predefined user action is
activation of a physical button (e.g., menu button 204, FIG. 11EE
or 11HH) or a soft button (e.g., a done icon, not shown).
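Terminating the reconfiguration process, as described in the preceding paragraph, amounts to a mode change: upon the predefined user action (a physical or soft button), the dragged icon's position is committed and the position-varying behavior stops. A minimal sketch of that state change follows; the type and property names are hypothetical.

    // Hypothetical reconfiguration state: while active, icon positions vary
    // and icons can be dragged; the predefined terminating action commits
    // the dragged icon's position and stops the variation.

    struct ReconfigurationState {
        var isActive = true
        var committedPosition: (x: Double, y: Double)? = nil

        /// Called when the predefined terminating action is detected.
        mutating func terminate(fixingIconAt position: (x: Double, y: Double)) {
            committedPosition = position   // fix the first icon at the third position
            isActive = false               // cease varying the icon positions
        }
    }

    var state = ReconfigurationState()
    state.terminate(fixingIconAt: (x: 48.0, y: 120.0))
    print(state)   // isActive is now false; committedPosition holds the fixed point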
[0323] In some embodiments, the computing device detects (1274) a
second finger gesture on a first icon in the first set on the touch
screen display. In response to detecting the second finger gesture,
the computing device replaces (1276) display of the first set of
the first plurality of icons with display of a second set of the
first plurality of icons in the first area on the touch screen
display, and moves the first icon from the first set to the second
set (e.g., as shown in FIGS. 11X-11Z, where the first set of icons
(141, 148, 144, 143, 155, 149-2, 154, 149-1, 149-4, 149-3, 153,
412, 152, 149-6-20, 149-6-21, and 149-6-22) is replaced by the
second set of icons (149-6-30, 149-6-31, 149-6-32, 149-6-33,
149-6-34 and 149-6-35) and the calculator icon 149-3 is moved from
the first set to the second set). The computing device detects
(1278) a third finger gesture on the first icon in the second set
on the touch screen display. In response to detecting the third
finger gesture, the computing device replaces (1280) display of the
second set of the first plurality of icons with display of a third
set of the first plurality of icons in the first area on the touch
screen display, and moves the first icon from the second set to the
third set (e.g., as shown in FIGS. 11II-11KK for calculator icon
149-3). The computing device detects (1282) a user making a second
point of contact with the touch screen display at a second position
corresponding to the first icon in the third set and detects
movement of the second point of contact to a third position on the
touch screen display. In response to detecting the second point of
contact and detecting movement of the second point of contact, the
computing device displays (1284) movement of the first icon to the
third position on the touch screen display and displays the first
icon at the third position (e.g., as shown in FIGS. 11LL-11NN for
calculator icon 149-3). In some embodiments, the third position is
in the first area. In some embodiments, the first icon is the only
icon in the third set. In other words, the first icon is added to
an otherwise empty first area. In some embodiments, positions of
one or more icons in the third set of the first plurality of icons
vary about respective average positions (e.g., as shown in FIG.
11NN). In some embodiments, positions of all of the icons in the
third set vary about respective average positions. In some
embodiments, positions of all of the icons in the third set except
the first icon vary about respective average positions.
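The repeated gestures in the preceding paragraph carry the same icon along as the displayed set advances through the sequence: each replacement moves the icon out of the outgoing set and into the incoming one. A minimal sketch of that carry-along behavior is shown below; all names are hypothetical and the icon labels are abridged.

    // Hypothetical carry-along: each successive gesture advances the
    // displayed set and moves the dragged icon into the newly displayed set.

    var sets: [[String]] = [
        ["141", "148", "149-3"],        // first set (abridged)
        ["149-6-30", "149-6-31"],       // second set (abridged)
        []                              // third (initially empty) set
    ]

    /// Advance from `displayed` to the next set, moving `icon` along with
    /// the replacement; returns the index of the newly displayed set.
    func advanceCarrying(icon: String, displayed: Int, sets: inout [[String]]) -> Int {
        let next = min(displayed + 1, sets.count - 1)
        if next != displayed, let i = sets[displayed].firstIndex(of: icon) {
            sets[displayed].remove(at: i)
            sets[next].append(icon)
        }
        return next
    }

    var shown = 0
    shown = advanceCarrying(icon: "149-3", displayed: shown, sets: &sets)  // second set displayed
    shown = advanceCarrying(icon: "149-3", displayed: shown, sets: &sets)  // third set displayed
    print(shown, sets[2])   // 2 ["149-3"]: the icon is the only icon in the third set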
[0324] In some embodiments, the computing device fixes (1286) a
position of the first icon at the third position and ceases to vary
positions of the one or more icons in the third set in response to
detecting a predefined user action for terminating the user
interface reconfiguration process (e.g., as shown in FIG. 11OO). In
some embodiments, the predefined user action is activation of a
physical button (e.g., menu button 204, FIG. 11OO) or a soft button
(e.g., a done icon, not shown).
[0325] The foregoing description, for purpose of explanation, has
been described with reference to specific embodiments. However, the
illustrative discussions above are not intended to be exhaustive or
to limit the invention to the precise forms disclosed. Many
modifications and variations are possible in view of the above
teachings. The embodiments were chosen and described in order to
best explain the principles of the invention and its practical
applications, to thereby enable others skilled in the art to best
utilize the invention and various embodiments with various
modifications as are suited to the particular use contemplated.
* * * * *