U.S. patent application number 15/143902 was filed with the patent office on 2016-05-02 and published on 2017-03-30 as publication number 2017/0090748 for a portable device, method, and graphical user interface for scrolling to display the top of an electronic document. The applicant listed for this patent is Apple Inc. The invention is credited to Stephen O. LEMAY and Richard WILLIAMSON.

Application Number: 20170090748 (Appl. No. 15/143902)
Family ID: 41449180
Publication Date: 2017-03-30
United States Patent Application 20170090748
Kind Code: A1
WILLIAMSON; Richard; et al.
March 30, 2017

PORTABLE DEVICE, METHOD, AND GRAPHICAL USER INTERFACE FOR SCROLLING TO DISPLAY THE TOP OF AN ELECTRONIC DOCUMENT
Abstract
Techniques for use in conjunction with a computing device with a
touch screen display comprise displaying a text entry area while
concurrently displaying a Uniform Resource Locator (URL) entry area
for inputting URLs of web pages. A gesture is detected at a
location on the touch screen display. In accordance with the
location corresponding to the text entry area, the technique
displays a first soft keyboard. In accordance with the location
corresponding to the URL entry area, the technique displays a
second soft keyboard that is different from the first soft
keyboard.
Inventors: WILLIAMSON; Richard (Los Gatos, CA); LEMAY; Stephen O. (Palo Alto, CA)
Applicant: Apple Inc., Cupertino, CA, US
Family ID: 41449180
Appl. No.: 15/143902
Filed: May 2, 2016
Related U.S. Patent Documents

Application Number | Filing Date | Patent Number | Continued By
13/959,631 | Aug 5, 2013 | 9,329,770 | 15/143,902 (this application)
12/163,899 | Jun 27, 2008 | 8,504,946 | 13/959,631
Current U.S. Class: 1/1
Current CPC Class: G06F 40/166 (20200101); G06F 3/04886 (20130101); G06F 3/0481 (20130101); G06F 3/0488 (20130101); G06F 3/04883 (20130101)
International Class: G06F 3/0488 (20060101) G06F003/0488; G06F 17/24 (20060101) G06F017/24; G06F 3/0481 (20060101) G06F003/0481
Claims
1. (canceled)
2. A non-transitory computer-readable storage medium including one
or more programs configured to be executed by one or more
processors of an electronic device with a touch screen display, the
one or more programs including instructions for: concurrently
displaying, on the touch screen display: a text entry area, and a
Uniform Resource Locator (URL) entry area for inputting URLs of web
pages; detecting a gesture at a location on the touch screen
display; and in response to detecting the gesture on the touch
screen display: in accordance with a determination that the
location of the gesture corresponds to the text entry area,
displaying, on the touch screen display, a first soft keyboard; and
in accordance with a determination that the location of the gesture
corresponds to the URL entry area, displaying, on the touch screen
display, a second soft keyboard that is different from the first
soft keyboard.
3. The non-transitory computer-readable storage medium of claim 2,
wherein displaying the second soft keyboard includes replacing
display of the first soft keyboard with display of the second soft
keyboard.
4. The non-transitory computer-readable storage medium of claim 2,
wherein displaying the first soft keyboard includes replacing
display of the second soft keyboard with display of the first soft
keyboard.
5. The non-transitory computer-readable storage medium of claim 2,
wherein the gesture is a finger gesture on the touch screen
display.
6. The non-transitory computer-readable storage medium of claim 2,
wherein the gesture is a tap gesture on the touch screen
display.
7. The non-transitory computer-readable storage medium of claim 2,
the one or more programs further including instructions for:
displaying a number indicator of active web pages.
8. The non-transitory computer-readable storage medium of claim 2,
the one or more programs further including instructions for:
displaying an indication of a number of currently active web
pages.
9. The non-transitory computer-readable storage medium of claim 2,
the one or more programs further including instructions for:
detecting, on the touch screen display, a gesture corresponding to
a clearing affordance of the URL entry area; and in response to
detecting the gesture corresponding to the clearing affordance of
the URL entry area, clearing the contents of the URL entry
area.
10. The non-transitory computer-readable storage medium of claim 2,
the one or more programs further including instructions for:
detecting, on the touch screen display, a gesture corresponding to
a clearing affordance of the text entry area; and in response to
detecting the gesture corresponding to the clearing affordance of
the text entry area, clearing the contents of the text entry
area.
11. The non-transitory computer-readable storage medium of claim 2,
the one or more programs further including instructions for: while
displaying one of the first soft keyboard and the second soft
keyboard, detecting, on the touch screen display, a gesture at a
location that corresponds to a web page portion; and in response to
detecting the gesture at the location that corresponds to the web
page portion, ceasing display of the respective soft keyboard.
12. The non-transitory computer-readable storage medium of claim 2,
wherein the first soft keyboard includes a search key.
13. The non-transitory computer-readable storage medium of claim 2,
the one or more programs further including instructions for:
detecting, on the touch screen display, a gesture at a location
that corresponds to a web address shortcut key; and in response to
detecting the gesture at the location that corresponds to the web
address shortcut key, inserting a set of URL domain characters in
the URL entry area.
14. The non-transitory computer-readable storage medium of claim
13, wherein inserting a set of URL domain characters in the URL
entry area includes inserting a top-level domain.
15. The non-transitory computer-readable storage medium of claim 2,
wherein the text entry area is part of a web page displayed on the
touch screen display.
16. The non-transitory computer-readable storage medium of claim 2,
the one or more programs further including instructions for: prior
to detecting the gesture at the location on the touch screen
display, displaying a web page on the touch screen display; and in
response to detecting the gesture at the location on the touch
screen display, maintaining display of at least a portion of the
web page.
17. The non-transitory computer-readable storage medium of claim
16, comprising: further in response to detecting the gesture at the
location on the touch screen display, darkening the at least the
portion of the web page.
18. The non-transitory computer-readable storage medium of claim 2,
comprising: while displaying the first soft keyboard, detecting a
second gesture at a location on the touch screen display
corresponding to the URL entry area; and in response to detecting
the second gesture, replacing display of the first soft keyboard
with display of the second soft keyboard that is different from the
first soft keyboard.
19. The non-transitory computer-readable storage medium of claim 2,
comprising: while displaying the second soft keyboard, detecting a
second gesture at a location on the touch screen display
corresponding to the text entry area; and in response to detecting
the second gesture, replacing display of the second soft keyboard
with display of the first soft keyboard that is different from the
second soft keyboard.
20. An electronic device, comprising: a touch screen display; one
or more processors; memory; and one or more programs stored in the
memory, the one or more programs configured to be executed by the
one or more processors and the one or more programs including
instructions for: concurrently displaying, on the touch screen
display: a text entry area, and a Uniform Resource Locator (URL)
entry area for inputting URLs of web pages; detecting a gesture at
a location on the touch screen display; and in response to
detecting the gesture on the touch screen display: in accordance
with a determination that the location of the gesture corresponds
to the text entry area, displaying, on the touch screen display, a
first soft keyboard; and in accordance with a determination that
the location of the gesture corresponds to the URL entry area,
displaying, on the touch screen display, a second soft keyboard
that is different from the first soft keyboard.
21. A computer-implemented method, comprising: at an electronic
device with a touch screen display: concurrently displaying, on the
touch screen display: a text entry area, and a Uniform Resource
Locator (URL) entry area for inputting URLs of web pages; detecting
a gesture at a location on the touch screen display; and in
response to detecting the gesture on the touch screen display: in
accordance with a determination that the location of the gesture
corresponds to the text entry area, displaying, on the touch screen
display, a first soft keyboard; and in accordance with a
determination that the location of the gesture corresponds to the
URL entry area, displaying, on the touch screen display, a second
soft keyboard that is different from the first soft keyboard.
Description
RELATED APPLICATIONS
[0001] This application is a continuation of U.S. patent
application Ser. No. 12/163,899, "Portable Device, Method, and
Graphical User Interface for Automatically Scrolling to Display the
Top of an Electronic Document," filed Jun. 27, 2008, which
application is incorporated by reference herein in its
entirety.
[0002] This application is related to the following applications:
(1) U.S. patent application Ser. No. 10/188,182, "Touch Pad For
Handheld Device," filed Jul. 1, 2002; (2) U.S. patent application
Ser. No. 10/722,948, "Touch Pad For Handheld Device," filed Nov.
25, 2003; (3) U.S. patent application Ser. No. 10/643,256, "Movable
Touch Pad With Added Functionality," filed Aug. 18, 2003; (4) U.S.
patent application Ser. No. 10/654,108, "Ambidextrous Mouse," filed
Sep. 2, 2003; (5) U.S. patent application Ser. No. 10/840,862,
"Multipoint Touchscreen," filed May 6, 2004; (6) U.S. patent
application Ser. No. 10/903,964, "Gestures For Touch Sensitive
Input Devices," filed Jul. 30, 2004; (7) U.S. patent application
Ser. No. 11/038,590, "Mode-Based Graphical User Interfaces For
Touch Sensitive Input Devices" filed Jan. 18, 2005; (8) U.S. patent
application Ser. No. 11/057,050, "Display Actuator," filed Feb. 11,
2005; (9) U.S. patent application Ser. No. 11/367,749,
"Multi-Functional Hand-Held Device," filed Mar. 3, 2006; and (10)
U.S. patent application Ser. No. 11/850,635, "Touch Screen Device,
Method, and Graphical User Interface for Determining Commands by
Applying Heuristics," filed Sep. 5, 2007. All of these applications
are incorporated by reference herein in their entirety.
TECHNICAL FIELD
[0003] The disclosed embodiments relate generally to portable
electronic devices with touch screen displays, and more
particularly, to portable electronic devices with touch screen
displays that display a portion of an electronic document, such as
a portion of a web page.
BACKGROUND
[0004] As portable electronic devices become more compact and the
number of functions performed by a given device increases, it has
become a significant challenge to design a user interface that
allows users to easily interact with a multifunction device. This
challenge is particularly significant for handheld portable devices,
which have much smaller screens than desktop or laptop computers.
This situation is unfortunate because the user interface is the
gateway through which users receive not only content but also
responses to user actions or behaviors, including user attempts to
access a device's features, tools, and functions. Some portable
communication devices (e.g., mobile telephones, sometimes called
mobile phones, cell phones, cellular telephones, and the like) have
resorted to adding more pushbuttons, increasing the density of push
buttons, overloading the functions of pushbuttons, or using complex
menu systems to allow a user to access, store and manipulate data.
These conventional user interfaces often result in complicated key
sequences and menu hierarchies that must be memorized by the
user.
[0005] Many conventional user interfaces, such as those that
include physical pushbuttons, are also inflexible. This may prevent
a user interface from being configured and/or adapted by either an
application running on the portable device or by users. When
coupled with the time-consuming requirement to memorize multiple
key sequences and menu hierarchies, and the difficulty in
activating a desired pushbutton, such inflexibility is frustrating
to most users.
[0006] Because of the small size of display screens on portable
electronic devices, only a portion of an electronic document is
typically displayed on the screen at a given time. Thus, users need
to translate (e.g., scroll) the displayed document to view the
entire content of the document. Frequently, users need to scroll
back to the top of the document after scrolling down, for example
to view a title or other content at the top of the document.
Limitations of conventional user interfaces require
the user to manually scroll back to the top of a lengthy document.
Manually scrolling to the top of the document on a small display
screen may require additional key and gesture sequences that are
time-consuming or awkward to perform.
[0007] Additionally, if the electronic document is a web page,
users may need to scroll to the top of the web page or memorize key
sequences to display the Uniform Resource Locator (URL) entry
area.
[0008] Accordingly, there is a need for portable electronic devices
with touch screen displays that have more transparent and
efficient user interfaces for displaying and navigating an
electronic document (e.g., a web page). Such interfaces increase
efficiency and user satisfaction with portable devices.
SUMMARY
[0009] The above deficiencies and other problems associated with
user interfaces for portable devices are reduced or eliminated by
the disclosed multifunction device. In some embodiments, the device
has a touch-sensitive display (also known as a "touch screen") with
a graphical user interface (GUI), one or more processors, memory
and one or more modules, programs or sets of instructions stored in
the memory for performing multiple functions. In some embodiments,
the user interacts with the GUI primarily through finger contacts
and gestures on the touch-sensitive display. In some embodiments,
the functions may include telephoning, video conferencing,
e-mailing, instant messaging, blogging, digital photographing,
digital videoing, web browsing, digital music playing, and/or
digital video playing. Instructions for performing these functions
may be included in a computer readable storage medium or other
computer program product configured for execution by one or more
processors.
[0010] In accordance with some embodiments, a computer-implemented
method is performed at a portable electronic device with a touch
screen display. The computer-implemented method includes displaying
a portion of a web page in a web browser application without
concurrently displaying a URL entry area for inputting URLs of web
pages. A gesture is detected in a predefined area at the top of the
touch screen display. In response to detecting the gesture in the
predefined area at the top of the touch screen display, the URL
entry area is displayed.
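
By way of illustration only, the gesture dispatch described in the preceding paragraph can be reduced to a few lines of code. The following Swift sketch is hypothetical: the type name BrowserChrome, the topActivationArea rectangle, and the 20-point strip height are assumptions made for the example, not part of the disclosure.

import CoreGraphics

// Hypothetical model of the behavior in paragraph [0010]: a web page is
// shown without a URL entry area until a gesture lands in a predefined
// area at the top of the touch screen display.
struct BrowserChrome {
    let topActivationArea: CGRect   // predefined area at the top of the display
    var isURLEntryAreaVisible = false

    mutating func handleGesture(at location: CGPoint) {
        // In response to a gesture in the predefined top area,
        // display the URL entry area.
        if topActivationArea.contains(location) {
            isURLEntryAreaVisible = true
        }
    }
}

// Example: a 320x480 screen with a 20-point activation strip at the top.
var chrome = BrowserChrome(topActivationArea: CGRect(x: 0, y: 0, width: 320, height: 20))
chrome.handleGesture(at: CGPoint(x: 160, y: 10))
assert(chrome.isURLEntryAreaVisible)
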
[0011] In accordance with some embodiments, a graphical user
interface on a portable electronic device with a touch screen
display includes: a portion of a web page in a web browser
application; a URL entry area for inputting URLs of web pages; and
a predefined area at the top of the touch screen display. Prior to
detecting a gesture in the predefined area at the top of the touch
screen display, the portion of the web page is displayed in the web
browser application without concurrently displaying the URL entry
area for inputting URLs of web pages. In response to detecting the
gesture in the predefined area at the top of the touch screen
display, the URL entry area is displayed.
[0012] In accordance with some embodiments, a portable computing
device includes: a touch screen display; one or more processors;
memory; and one or more programs. The one or more programs are
stored in the memory and configured to be executed by the one or
more processors. The one or more programs include instructions for:
displaying a portion of a web page in a web browser application
without concurrently displaying a URL entry area for inputting URLs
of web pages; detecting a gesture in a predefined area at the top
of the touch screen display; and in response to detecting the
gesture in the predefined area at the top of the touch screen
display, displaying the URL entry area.
[0013] In accordance with some embodiments, a computer readable
storage medium has stored therein instructions, which when executed
by a portable electronic device with a touch screen display, cause
the portable electronic device to: display a portion of a web page
in a web browser application without concurrently displaying a URL
entry area for inputting URLs of web pages; detect a gesture in a
predefined area at the top of the touch screen display; and in
response to detecting the gesture in the predefined area at the top
of the touch screen display, display the URL entry area.
[0014] In accordance with some embodiments, a portable electronic
device includes: a touch screen display; means for displaying a
portion of a web page in a web browser application without
concurrently displaying a URL entry area for inputting URLs of web
pages; means for detecting a gesture in a predefined area at the
top of the touch screen display; and, in response to detecting the
gesture in the predefined area at the top of the touch screen
display, means for displaying the URL entry area.
[0015] In accordance with some embodiments, a computer-implemented
method is performed at a portable electronic device with a touch
screen display. The computer-implemented method includes displaying
a portion of an electronic document on the touch screen display.
The electronic document has an electronic document length. The
displayed portion of the electronic document has a displayed
portion length that is less than the electronic document length. A
gesture is detected in a predefined area at the top of the touch
screen display. In response to detecting the gesture in the
predefined area at the top of the touch screen display, a top
portion of the electronic document is displayed.
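
As a purely illustrative sketch of paragraph [0015], the state change involved amounts to resetting a scroll offset. The Swift code below is hypothetical; the names DocumentView and scrollOffset and the example dimensions are assumptions, not the disclosed implementation.

// Hypothetical state for paragraph [0015]: an electronic document whose
// displayed portion length is less than the electronic document length.
struct DocumentView {
    let documentLength: Double          // electronic document length
    let displayedPortionLength: Double  // displayed portion length (< documentLength)
    var scrollOffset: Double            // distance scrolled from the top
    let topActivationArea: ClosedRange<Double>  // y-range of the predefined top area

    // In response to a gesture in the predefined area at the top of the
    // touch screen display, display the top portion of the document.
    mutating func handleGesture(atY y: Double) {
        if topActivationArea.contains(y) {
            scrollOffset = 0
        }
    }
}

var view = DocumentView(documentLength: 5000, displayedPortionLength: 480,
                        scrollOffset: 3200, topActivationArea: 0...20)
view.handleGesture(atY: 12)
assert(view.scrollOffset == 0)  // the top portion is now displayed
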
[0016] In accordance with some embodiments, a graphical user
interface on a portable electronic device with a touch screen
display includes: a portion of an electronic document and a
predefined area at the top of the touch screen display. The
electronic document has an electronic document length. The
displayed portion of the electronic document has a displayed
portion length that is less than the electronic document length. In
response to detecting a gesture in the predefined area at the top
of the touch screen display, a top portion of the electronic
document is displayed.
[0017] In accordance with some embodiments, a portable computing
device includes: a touch screen display; one or more processors;
memory; and one or more programs. The one or more programs are
stored in the memory and configured to be executed by the one or
more processors. The one or more programs include instructions for:
displaying a portion of an electronic document on the touch screen
display, wherein the electronic document has an electronic document
length and the displayed portion of the electronic document has a
displayed portion length that is less than the electronic document
length; detecting a gesture in a predefined area at the top of the
touch screen display; and, in response to detecting the gesture in
the predefined area at the top of the touch screen display,
displaying a top portion of the electronic document.
[0018] In accordance with some embodiments, a computer readable
storage medium has stored therein instructions, which when executed
by a portable electronic device with a touch screen display, cause
the portable electronic device to: display a portion of an
electronic document on the touch screen display, wherein the
electronic document has an electronic document length and the
displayed portion of the electronic document has a displayed
portion length that is less than the electronic document length;
detect a gesture in a predefined area at the top of the touch
screen display; and, in response to detecting the gesture in the
predefined area at the top of the touch screen display, display a
top portion of the electronic document.
[0019] In accordance with some embodiments, a portable electronic
device includes: a touch screen display; means for displaying a
portion of an electronic document on the touch screen display,
wherein the electronic document has an electronic document length
and the displayed portion of the electronic document has a
displayed portion length that is less than the electronic document
length; means for detecting a gesture in a predefined area at the
top of the touch screen display; and, in response to detecting the
gesture in the predefined area at the top of the touch screen
display, means for displaying a top portion of the electronic
document.
[0020] Thus, the invention provides an efficient, easy-to-use
interface for displaying and navigating an electronic document on a
portable electronic device with a touch screen display.
BRIEF DESCRIPTION OF THE DRAWINGS
[0021] For a better understanding of the aforementioned embodiments
of the invention as well as additional embodiments thereof,
reference should be made to the Description of Embodiments below,
in conjunction with the following drawings in which like reference
numerals refer to corresponding parts throughout the figures.
[0022] FIGS. 1A and 1B are block diagrams illustrating portable
multifunction devices with touch-sensitive displays in accordance
with some embodiments.
[0023] FIG. 2 illustrates a portable multifunction device having a
touch screen in accordance with some embodiments.
[0024] FIGS. 3A-3C illustrate exemplary user interfaces for
unlocking a portable electronic device in accordance with some
embodiments.
[0025] FIGS. 4A and 4B illustrate exemplary user interfaces for a
menu of applications on a portable multifunction device in
accordance with some embodiments.
[0026] FIGS. 5A-5J illustrate exemplary user interfaces for
displaying and navigating a portion of a web page in accordance
with some embodiments.
[0027] FIGS. 6A-6D illustrate exemplary user interfaces for
displaying and navigating a portion of an electronic document in
accordance with some embodiments.
[0028] FIGS. 7A-7C are a flow diagram illustrating a method of
displaying and navigating a portion of a web page in accordance
with some embodiments.
[0029] FIGS. 8A-8C are a flow diagram illustrating a method of
displaying and navigating a portion of an electronic document in
accordance with some embodiments.
DESCRIPTION OF EMBODIMENTS
[0030] Reference will now be made in detail to embodiments,
examples of which are illustrated in the accompanying drawings. In
the following detailed description, numerous specific details are
set forth in order to provide a thorough understanding of the
present invention. However, it will be apparent to one of ordinary
skill in the art that the present invention may be practiced
without these specific details. In other instances, well-known
methods, procedures, components, circuits, and networks have not
been described in detail so as not to unnecessarily obscure aspects
of the embodiments.
[0031] It will also be understood that, although the terms first,
second, etc. may be used herein to describe various elements, these
elements should not be limited by these terms. These terms are only
used to distinguish one element from another. For example, a first
gesture could be termed a second gesture, and, similarly, a second
gesture could be termed a first gesture, without departing from the
scope of the present invention.
[0032] The terminology used in the description of the invention
herein is for the purpose of describing particular embodiments only
and is not intended to be limiting of the invention. As used in the
description of the invention and the appended claims, the singular
forms "a", "an" and "the" are intended to include the plural forms
as well, unless the context clearly indicates otherwise. It will
also be understood that the term "and/or" as used herein refers to
and encompasses any and all possible combinations of one or more of
the associated listed items. It will be further understood that the
terms "comprises" and/or "comprising," when used in this
specification, specify the presence of stated features, integers,
steps, operations, elements, and/or components, but do not preclude
the presence or addition of one or more other features, integers,
steps, operations, elements, components, and/or groups thereof.
[0033] As used herein, the term "if" may be construed to mean
"when" or "upon" or "in response to determining" or "in response to
detecting," depending on the context. Similarly, the phrase "if it
is determined" or "if [a stated condition or event] is detected"
may be construed to mean "upon determining" or "in response to
determining" or "upon detecting [the stated condition or event]" or
"in response to detecting [the stated condition or event],"
depending on the context.
[0034] Embodiments of a portable multifunction device, user
interfaces for such devices, and associated processes for using
such devices are described. In some embodiments, the device is a
portable communications device such as a mobile telephone that also
contains other functions, such as PDA and/or music player
functions.
[0035] The user interface may include a physical click wheel in
addition to a touch screen or a virtual click wheel displayed on
the touch screen. A click wheel is a user-interface device that may
provide navigation commands based on an angular displacement of the
wheel or a point of contact with the wheel by a user of the device.
A click wheel may also be used to provide a user command
corresponding to selection of one or more items, for example, when
the user of the device presses down on at least a portion of the
wheel or the center of the wheel. Alternatively, breaking contact
with a click wheel image on a touch screen surface may indicate a
user command corresponding to selection. For simplicity, in the
discussion that follows, a portable multifunction device that
includes a touch screen is used as an exemplary embodiment. It
should be understood, however, that some of the user interfaces and
associated processes may be applied to other devices, such as
personal computers and laptop computers, which may include one or
more other physical user-interface devices, such as a physical
click wheel, a physical keyboard, a mouse and/or a joystick.
[0036] The device supports a variety of applications, such as one
or more of the following: a telephone application, a video
conferencing application, an e-mail application, an instant
messaging application, a blogging application, a photo management
application, a digital camera application, a digital video camera
application, a web browsing application, a digital music player
application, and/or a digital video player application.
[0037] The various applications that may be executed on the device
may use at least one common physical user-interface device, such as
the touch screen. One or more functions of the touch screen as well
as corresponding information displayed on the device may be
adjusted and/or varied from one application to the next and/or
within a respective application. In this way, a common physical
architecture (such as the touch screen) of the device may support
the variety of applications with user interfaces that are intuitive
and transparent.
[0038] The user interfaces may include one or more soft keyboard
embodiments. The soft keyboard embodiments may include standard
(QWERTY) and/or non-standard configurations of symbols on the
displayed icons of the keyboard, such as those described in U.S.
patent application Ser. No. 11/459,606, "Keyboards For Portable
Electronic Devices," filed Jul. 24, 2006, and Ser. No. 11/459,615,
"Touch Screen Keyboards For Portable Electronic Devices," filed
Jul. 24, 2006, the contents of which are hereby incorporated by
reference in their entirety. The keyboard embodiments may include a
reduced number of icons (or soft keys) relative to the number of
keys in existing physical keyboards, such as that for a typewriter.
This may make it easier for users to select one or more icons in
the keyboard, and thus, one or more corresponding symbols. The
keyboard embodiments may be adaptive. For example, displayed icons
may be modified in accordance with user actions, such as selecting
one or more icons and/or one or more corresponding symbols. One or
more applications on the portable device may utilize common and/or
different keyboard embodiments. Thus, the keyboard embodiment used
may be tailored to at least some of the applications. In some
embodiments, one or more keyboard embodiments may be tailored to a
respective user. For example, one or more keyboard embodiments may
be tailored to a respective user based on a word usage history
(lexicography, slang, individual usage) of the respective user.
Some of the keyboard embodiments may be adjusted to reduce a
probability of a user error when selecting one or more icons, and
thus one or more symbols, when using the soft keyboard
embodiments.
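
The claims of this publication (claims 2, 20, and 21) recite two such context-dependent keyboard embodiments: a first soft keyboard for a text entry area and a different second soft keyboard for a URL entry area. The dispatch they describe can be sketched briefly; this Swift fragment is a hypothetical illustration, and the enum and function names are assumptions.

import CoreGraphics

// The two keyboards recited in claims 2, 20, and 21: a first soft
// keyboard for the text entry area (which may include a search key,
// claim 12) and a different second soft keyboard for the URL entry area
// (which may include a web address shortcut key, claims 13-14).
enum SoftKeyboard {
    case text
    case url
}

// Hypothetical dispatch: pick a keyboard from the gesture location,
// assuming non-overlapping entry areas. A gesture elsewhere (e.g., on
// the web page portion, claim 11) selects no keyboard.
func keyboard(for location: CGPoint,
              textEntryArea: CGRect,
              urlEntryArea: CGRect) -> SoftKeyboard? {
    if textEntryArea.contains(location) { return .text }
    if urlEntryArea.contains(location) { return .url }
    return nil
}
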
[0039] Attention is now directed towards embodiments of the device.
FIGS. 1A and 1B are block diagrams illustrating portable
multifunction devices 100 with touch-sensitive displays 112 in
accordance with some embodiments. The touch-sensitive display 112
is sometimes called a "touch screen" for convenience, and may also
be known as or called a touch-sensitive display system. The device
100 may include a memory 102 (which may include one or more
computer readable storage mediums), a memory controller 122, one or
more processing units (CPU's) 120, a peripherals interface 118, RF
circuitry 108, audio circuitry 110, a speaker 111, a microphone
113, an input/output (I/O) subsystem 106, other input or control
devices 116, and an external port 124. The device 100 may include
one or more optical sensors 164. These components may communicate
over one or more communication buses or signal lines 103.
[0040] It should be appreciated that the device 100 is only one
example of a portable multifunction device 100, and that the device
100 may have more or fewer components than shown, may combine two
or more components, or may have a different configuration or
arrangement of the components. The various components shown in
FIGS. 1A and 1B may be implemented in hardware, software or a
combination of both hardware and software, including one or more
signal processing and/or application specific integrated
circuits.
[0041] Memory 102 may include high-speed random access memory and
may also include non-volatile memory, such as one or more magnetic
disk storage devices, flash memory devices, or other non-volatile
solid-state memory devices. Access to memory 102 by other
components of the device 100, such as the CPU 120 and the
peripherals interface 118, may be controlled by the memory
controller 122.
[0042] The peripherals interface 118 couples the input and output
peripherals of the device to the CPU 120 and memory 102. The one or
more processors 120 run or execute various software programs and/or
sets of instructions stored in memory 102 to perform various
functions for the device 100 and to process data.
[0043] In some embodiments, the peripherals interface 118, the CPU
120, and the memory controller 122 may be implemented on a single
chip, such as a chip 104. In some other embodiments, they may be
implemented on separate chips.
[0044] The RF (radio frequency) circuitry 108 receives and sends RF
signals, also called electromagnetic signals. The RF circuitry 108
converts electrical signals to/from electromagnetic signals and
communicates with communications networks and other communications
devices via the electromagnetic signals. The RF circuitry 108 may
include well-known circuitry for performing these functions,
including but not limited to an antenna system, an RF transceiver,
one or more amplifiers, a tuner, one or more oscillators, a digital
signal processor, a CODEC chipset, a subscriber identity module
(SIM) card, memory, and so forth. The RF circuitry 108 may
communicate with networks, such as the Internet, also referred to
as the World Wide Web (WWW), an intranet and/or a wireless network,
such as a cellular telephone network, a wireless local area network
(LAN) and/or a metropolitan area network (MAN), and other devices
by wireless communication. The wireless communication may use any
of a plurality of communications standards, protocols and
technologies, including but not limited to Global System for Mobile
Communications (GSM), Enhanced Data GSM Environment (EDGE),
high-speed downlink packet access (HSDPA), wideband code division
multiple access (W-CDMA), code division multiple access (CDMA),
time division multiple access (TDMA), Bluetooth, Wireless Fidelity
(Wi-Fi) (e.g., IEEE 802.11a, IEEE 802.11b, IEEE 802.11g and/or IEEE
802.11n), voice over Internet Protocol (VoIP), Wi-MAX, a protocol
for email (e.g., Internet message access protocol (IMAP) and/or
post office protocol (POP)), instant messaging (e.g., extensible
messaging and presence protocol (XMPP), Session Initiation Protocol
for Instant Messaging and Presence Leveraging Extensions (SIMPLE),
and/or Instant Messaging and Presence Service (IMPS)), and/or Short
Message Service (SMS), or any other suitable communication
protocol, including communication protocols not yet developed as of
the filing date of this document.
[0045] The audio circuitry 110, the speaker 111, and the microphone
113 provide an audio interface between a user and the device 100.
The audio circuitry 110 receives audio data from the peripherals
interface 118, converts the audio data to an electrical signal, and
transmits the electrical signal to the speaker 111. The speaker 111
converts the electrical signal to human-audible sound waves. The
audio circuitry 110 also receives electrical signals converted by
the microphone 113 from sound waves. The audio circuitry 110
converts the electrical signal to audio data and transmits the
audio data to the peripherals interface 118 for processing. Audio
data may be retrieved from and/or transmitted to memory 102 and/or
the RF circuitry 108 by the peripherals interface 118. In some
embodiments, the audio circuitry 110 also includes a headset jack
(e.g. 212, FIG. 2). The headset jack provides an interface between
the audio circuitry 110 and removable audio input/output
peripherals, such as output-only headphones or a headset with both
output (e.g., a headphone for one or both ears) and input (e.g., a
microphone).
[0046] The I/O subsystem 106 couples input/output peripherals on
the device 100, such as the touch screen 112 and other
input/control devices 116, to the peripherals interface 118. The
I/O subsystem 106 may include a display controller 156 and one or
more input controllers 160 for other input or control devices. The
one or more input controllers 160 receive/send electrical signals
from/to other input or control devices 116. The other input/control
devices 116 may include physical buttons (e.g., push buttons,
rocker buttons, etc.), dials, slider switches, joysticks, click
wheels, and so forth. In some alternate embodiments, input
controller(s) 160 may be coupled to any (or none) of the following:
a keyboard, infrared port, USB port, and a pointer device such as a
mouse. The one or more buttons (e.g., 208, FIG. 2) may include an
up/down button for volume control of the speaker 111 and/or the
microphone 113. The one or more buttons may include a push button
(e.g., 206, FIG. 2). A quick press of the push button may disengage
a lock of the touch screen 112 or begin a process that uses
gestures on the touch screen to unlock the device, as described in
U.S. patent application Ser. No. 11/322,549, "Unlocking a Device by
Performing Gestures on an Unlock Image," filed Dec. 23, 2005, which
is hereby incorporated by reference in its entirety. A longer press
of the push button (e.g., 206) may turn power to the device 100 on
or off. The user may be able to customize a functionality of one or
more of the buttons. The touch screen 112 is used to implement
virtual or soft buttons and one or more soft keyboards.
[0047] The touch-sensitive touch screen 112 provides an input
interface and an output interface between the device and a user.
The display controller 156 receives and/or sends electrical signals
from/to the touch screen 112. The touch screen 112 displays visual
output to the user. The visual output may include graphics, text,
icons, video, and any combination thereof (collectively termed
"graphics"). In some embodiments, some or all of the visual output
may correspond to user-interface objects.
[0048] A touch screen 112 has a touch-sensitive surface, sensor or
set of sensors that accepts input from the user based on haptic
and/or tactile contact. The touch screen 112 and the display
controller 156 (along with any associated modules and/or sets of
instructions in memory 102) detect contact (and any movement or
breaking of the contact) on the touch screen 112 and convert the
detected contact into interaction with user-interface objects
(e.g., one or more soft keys, icons, web pages or images) that are
displayed on the touch screen. In an exemplary embodiment, a point
of contact between a touch screen 112 and the user corresponds to a
finger of the user.
[0049] The touch screen 112 may use LCD (liquid crystal display)
technology, or LPD (light emitting polymer display) technology,
although other display technologies may be used in other
embodiments. The touch screen 112 and the display controller 156
may detect contact and any movement or breaking thereof using any
of a plurality of touch sensing technologies now known or later
developed, including but not limited to capacitive, resistive,
infrared, and surface acoustic wave technologies, as well as other
proximity sensor arrays or other elements for determining one or
more points of contact with a touch screen 112.
[0050] A touch-sensitive display in some embodiments of the touch
screen 112 may be analogous to the multi-touch sensitive tablets
described in the following U.S. Pat. No. 6,323,846 (Westerman et
al.), U.S. Pat. No. 6,570,557 (Westerman et al.), and/or U.S. Pat.
No. 6,677,932 (Westerman), and/or U.S. Patent Publication
2002/0015024A1, each of which is hereby incorporated by reference
in its entirety. However, a touch screen 112 displays visual output
from the portable device 100, whereas touch sensitive tablets do
not provide visual output.
[0051] A touch-sensitive display in some embodiments of the touch
screen 112 may be as described in the following applications: (1)
U.S. patent application Ser. No. 11/381,313, "Multipoint Touch
Surface Controller," filed May 2, 2006; (2) U.S. patent application
Ser. No. 10/840,862, "Multipoint Touchscreen," filed May 6, 2004;
(3) U.S. patent application Ser. No. 10/903,964, "Gestures For
Touch Sensitive Input Devices," filed Jul. 30, 2004; (4) U.S.
patent application Ser. No. 11/048,264, "Gestures For Touch
Sensitive Input Devices," filed Jan. 31, 2005; (5) U.S. patent
application Ser. No. 11/038,590, "Mode-Based Graphical User
Interfaces For Touch Sensitive Input Devices," filed Jan. 18, 2005;
(6) U.S. patent application Ser. No. 11/228,758, "Virtual Input
Device Placement On A Touch Screen User Interface," filed Sep. 16,
2005; (7) U.S. patent application Ser. No. 11/228,700, "Operation
Of A Computer With A Touch Screen Interface," filed Sep. 16, 2005;
(8) U.S. patent application Ser. No. 11/228,737, "Activating
Virtual Keys Of A Touch-Screen Virtual Keyboard," filed Sep. 16,
2005; and (9) U.S. patent application Ser. No. 11/367,749,
"Multi-Functional Hand-Held Device," filed Mar. 3, 2006. All of
these applications are incorporated by reference herein in their
entirety.
[0052] The touch screen 112 may have a resolution in excess of 100
dpi. In an exemplary embodiment, the touch screen has a resolution
of approximately 160 dpi. The user may make contact with the touch
screen 112 using any suitable object or appendage, such as a
stylus, a finger, and so forth. In some embodiments, the user
interface is designed to work primarily with finger-based contacts
and gestures, which are much less precise than stylus-based input
due to the larger area of contact of a finger on the touch screen.
In some embodiments, the device translates the rough finger-based
input into a precise pointer/cursor position or command for
performing the actions desired by the user.
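
The disclosure does not specify how rough finger-based input is translated into a precise position. As one hedged illustration only, a device could take the centroid of the activated sensor points, as in this hypothetical Swift sketch:

// Hypothetical translation of a rough finger contact, sensed as a patch
// of activated sensor points, into a single precise position: the
// centroid of the patch. This is one plausible mapping, not the
// algorithm the disclosure uses.
func precisePosition(of contactPatch: [(x: Double, y: Double)]) -> (x: Double, y: Double)? {
    guard !contactPatch.isEmpty else { return nil }
    let n = Double(contactPatch.count)
    let sumX = contactPatch.reduce(0) { $0 + $1.x }
    let sumY = contactPatch.reduce(0) { $0 + $1.y }
    return (x: sumX / n, y: sumY / n)
}
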
[0053] In some embodiments, in addition to the touch screen, the
device 100 may include a touchpad (not shown) for activating or
deactivating particular functions. In some embodiments, the
touchpad is a touch-sensitive area of the device that, unlike the
touch screen, does not display visual output. The touchpad may be a
touch-sensitive surface that is separate from the touch screen 112
or an extension of the touch-sensitive surface formed by the touch
screen.
[0054] In some embodiments, the device 100 may include a physical
or virtual click wheel as an input control device 116. A user may
navigate among and interact with one or more graphical objects
(henceforth referred to as icons) displayed in the touch screen 112
by rotating the click wheel or by moving a point of contact with
the click wheel (e.g., where the amount of movement of the point of
contact is measured by its angular displacement with respect to a
center point of the click wheel). The click wheel may also be used
to select one or more of the displayed icons. For example, the user
may press down on at least a portion of the click wheel or an
associated button. User commands and navigation commands provided
by the user via the click wheel may be processed by an input
controller 160 as well as one or more of the modules and/or sets of
instructions in memory 102. For a virtual click wheel, the click
wheel and click wheel controller may be part of the touch screen
112 and the display controller 156, respectively. For a virtual
click wheel, the click wheel may be either an opaque or
semitransparent object that appears and disappears on the touch
screen display in response to user interaction with the device. In
some embodiments, a virtual click wheel is displayed on the touch
screen of a portable multifunction device and operated by user
contact with the touch screen.
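
Paragraph [0054] measures click-wheel movement by the angular displacement of the point of contact about the wheel's center. The following Swift sketch is a minimal, hypothetical rendering of that measurement; the wrap-around normalization is an assumption about how a real implementation might behave.

import Foundation  // for atan2

// Hypothetical measurement for paragraph [0054]: angular displacement of
// the point of contact with respect to the center point of the wheel.
func angularDisplacement(center: (x: Double, y: Double),
                         from a: (x: Double, y: Double),
                         to b: (x: Double, y: Double)) -> Double {
    let angleA = atan2(a.y - center.y, a.x - center.x)
    let angleB = atan2(b.y - center.y, b.x - center.x)
    var delta = angleB - angleA
    // Normalize to (-pi, pi] so a small rotation across the 180-degree
    // boundary is not read as a nearly full turn.
    while delta > .pi { delta -= 2 * .pi }
    while delta <= -.pi { delta += 2 * .pi }
    return delta   // radians; the sign gives the direction of rotation
}
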
[0055] The device 100 also includes a power system 162 for powering
the various components. The power system 162 may include a power
management system, one or more power sources (e.g., battery,
alternating current (AC)), a recharging system, a power failure
detection circuit, a power converter or inverter, a power status
indicator (e.g., a light-emitting diode (LED)) and any other
components associated with the generation, management and
distribution of power in portable devices.
[0056] The device 100 may also include one or more optical sensors
164. FIGS. 1A and 1B show an optical sensor coupled to an optical
sensor controller 158 in I/O subsystem 106. The optical sensor 164
may include charge-coupled device (CCD) or complementary
metal-oxide semiconductor (CMOS) phototransistors. The optical
sensor 164 receives light from the environment, projected through
one or more lenses, and converts the light to data representing an
image. In conjunction with an imaging module 143 (also called a
camera module), the optical sensor 164 may capture still images or
video. In some embodiments, an optical sensor is located on the
back of the device 100, opposite the touch screen display 112 on
the front of the device, so that the touch screen display may be
used as a viewfinder for still and/or video image
acquisition. In some embodiments, an optical sensor is located on
the front of the device so that the user's image may be obtained
for videoconferencing while the user views the other video
conference participants on the touch screen display. In some
embodiments, the position of the optical sensor 164 can be changed
by the user (e.g., by rotating the lens and the sensor in the
device housing) so that a single optical sensor 164 may be used
along with the touch screen display for both video conferencing and
still and/or video image acquisition.
[0057] The device 100 may also include one or more proximity
sensors 166. FIGS. 1A and 1B show a proximity sensor 166 coupled to
the peripherals interface 118. Alternately, the proximity sensor
166 may be coupled to an input controller 160 in the I/O subsystem
106. The proximity sensor 166 may perform as described in U.S.
patent application Ser. No. 11/241,839, "Proximity Detector In
Handheld Device"; Ser. No. 11/240,788, "Proximity Detector In
Handheld Device"; Ser. No. 11/620,702, "Using Ambient Light Sensor
To Augment Proximity Sensor Output"; Ser. No. 11/586,862,
"Automated Response To And Sensing Of User Activity In Portable
Devices"; and Ser. No. 11/638,251, "Methods And Systems For
Automatic Configuration Of Peripherals," which are hereby
incorporated by reference in their entirety. In some embodiments,
the proximity sensor turns off and disables the touch screen 112
when the multifunction device is placed near the user's ear (e.g.,
when the user is making a phone call). In some embodiments, the
proximity sensor keeps the screen off when the device is in the
user's pocket, purse, or other dark area to prevent unnecessary
battery drainage when the device is in a locked state.
[0058] The device 100 may also include one or more accelerometers
168. FIGS. 1A and 1B show an accelerometer 168 coupled to the
peripherals interface 118. Alternately, the accelerometer 168 may
be coupled to an input controller 160 in the I/O subsystem 106. The
accelerometer 168 may perform as described in U.S. Patent
Publication No. 20050190059, "Acceleration-based Theft Detection
System for Portable Electronic Devices," and U.S. Patent
Publication No. 20060017692, "Methods And Apparatuses For Operating
A Portable Device Based On An Accelerometer," both of which are
incorporated by reference herein in their entirety. In
some embodiments, information is displayed on the touch screen
display in a portrait view or a landscape view based on an analysis
of data received from the one or more accelerometers.
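
Paragraph [0058] leaves the analysis of accelerometer data unspecified. A minimal, hypothetical sketch is to compare the gravity components along the device's short (x) and long (y) axes and pick the view whose axis carries the larger component; a real device would add thresholds and hysteresis.

enum DisplayOrientation { case portrait, landscape }

// Hypothetical orientation analysis for paragraph [0058]: choose the
// view whose axis carries the larger gravity component.
func orientation(gravityX: Double, gravityY: Double) -> DisplayOrientation {
    abs(gravityY) >= abs(gravityX) ? .portrait : .landscape
}

assert(orientation(gravityX: 0.1, gravityY: -0.98) == .portrait)   // held upright
assert(orientation(gravityX: -0.97, gravityY: 0.05) == .landscape) // on its side
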
[0059] In some embodiments, the software components stored in
memory 102 may include an operating system 126, a communication
module (or set of instructions) 128, a contact/motion module (or
set of instructions) 130, a graphics module (or set of
instructions) 132, a text input module (or set of instructions)
134, a Global Positioning System (GPS) module (or set of
instructions) 135, and applications (or set of instructions)
136.
[0060] The operating system 126 (e.g., Darwin, RTXC, LINUX, UNIX,
OS X, WINDOWS, or an embedded operating system such as VxWorks)
includes various software components and/or drivers for controlling
and managing general system tasks (e.g., memory management, storage
device control, power management, etc.) and facilitates
communication between various hardware and software components.
[0061] The communication module 128 facilitates communication with
other devices over one or more external ports 124 and also includes
various software components for handling data received by the RF
circuitry 108 and/or the external port 124. The external port 124
(e.g., Universal Serial Bus (USB), FIREWIRE, etc.) is adapted for
coupling directly to other devices or indirectly over a network
(e.g., the Internet, wireless LAN, etc.). In some embodiments, the
external port is a multi-pin (e.g., 30-pin) connector that is the
same as, or similar to and/or compatible with the 30-pin connector
used on iPod (trademark of Apple Computer, Inc.) devices.
[0062] The contact/motion module 130 may detect contact with the
touch screen 112 (in conjunction with the display controller 156)
and other touch sensitive devices (e.g., a touchpad or physical
click wheel). The contact/motion module 130 includes various
software components for performing various operations related to
detection of contact, such as determining if contact has occurred,
determining if there is movement of the contact and tracking the
movement across the touch screen 112, and determining if the
contact has been broken (i.e., if the contact has ceased).
Determining movement of the point of contact may include
determining speed (magnitude), velocity (magnitude and direction),
and/or an acceleration (a change in magnitude and/or direction) of
the point of contact. These operations may be applied to single
contacts (e.g., one finger contacts) or to multiple simultaneous
contacts (e.g., "multitouch"/multiple finger contacts). In some
embodiments, the contact/motion module 130 and the display
controller 156 also detect contact on a touchpad. In some
embodiments, the contact/motion module 130 and the controller 160
detect contact on a click wheel.
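
The speed, velocity, and acceleration determinations in paragraph [0062] amount to finite differences over successive contact samples. The following Swift sketch is a hypothetical illustration of that arithmetic; the sample and type names are assumptions.

// A touch sample: where the point of contact was, and when (seconds).
struct ContactSample {
    var x: Double
    var y: Double
    var time: Double
}

// A 2D vector used for both velocity and acceleration. Speed is the
// magnitude of velocity, mirroring the distinction in paragraph [0062]
// between speed (magnitude) and velocity (magnitude and direction).
struct Vector2 {
    var dx: Double
    var dy: Double
    var magnitude: Double { (dx * dx + dy * dy).squareRoot() }
}

// Velocity of the point of contact between two successive samples.
func velocity(from a: ContactSample, to b: ContactSample) -> Vector2 {
    let dt = b.time - a.time
    guard dt > 0 else { return Vector2(dx: 0, dy: 0) }
    return Vector2(dx: (b.x - a.x) / dt, dy: (b.y - a.y) / dt)
}

// Acceleration as a change in velocity (magnitude and/or direction)
// over the interval dt between the two velocity estimates.
func acceleration(_ v1: Vector2, _ v2: Vector2, dt: Double) -> Vector2 {
    Vector2(dx: (v2.dx - v1.dx) / dt, dy: (v2.dy - v1.dy) / dt)
}
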
[0063] The graphics module 132 includes various known software
components for rendering and displaying graphics on the touch
screen 112, including components for changing the intensity of
graphics that are displayed. As used herein, the term "graphics"
includes any object that can be displayed to a user, including
without limitation text, web pages, icons (such as user-interface
objects including soft keys), digital images, videos, animations
and the like.
[0064] The text input module 134, which may be a component of
graphics module 132, provides soft keyboards for entering text in
various applications (e.g., contacts 137, e-mail 140, IM 141,
blogging 142, browser 147, and any other application that needs
text input).
[0065] The GPS module 135 determines the location of the device and
provides this information for use in various applications (e.g., to
telephone 138 for use in location-based dialing, to camera 143
and/or blogger 142 as picture/video metadata, and to applications
that provide location-based services such as weather widgets, local
yellow page widgets, and map/navigation widgets).
[0066] The applications 136 may include the following modules (or
sets of instructions), or a subset or superset thereof:
[0067] a contacts module 137 (sometimes called an address book or contact list);
[0068] a telephone module 138;
[0069] a video conferencing module 139;
[0070] an e-mail client module 140;
[0071] an instant messaging (IM) module 141;
[0072] a blogging module 142;
[0073] a camera module 143 for still and/or video images;
[0074] an image management module 144;
[0075] a video player module 145;
[0076] a music player module 146;
[0077] a browser module 147;
[0078] a calendar module 148;
[0079] widget modules 149, which may include weather widget 149-1, stocks widget 149-2, calculator widget 149-3, alarm clock widget 149-4, dictionary widget 149-5, and other widgets obtained by the user, as well as user-created widgets 149-6;
[0080] widget creator module 150 for making user-created widgets 149-6;
[0081] search module 151;
[0082] video and music player module 152, which merges video player module 145 and music player module 146;
[0083] notes module 153;
[0084] map module 154; and/or
[0085] online video module 155.
[0086] Examples of other applications 136 that may be stored in
memory 102 include other word processing applications, JAVA-enabled
applications, encryption, digital rights management, voice
recognition, and voice replication.
[0087] In conjunction with touch screen 112, display controller
156, contact module 130, graphics module 132, and text input module
134, the contacts module 137 may be used to manage an address book
or contact list, including: adding name(s) to the address book;
deleting name(s) from the address book; associating telephone
number(s), e-mail address(es), physical address(es) or other
information with a name; associating an image with a name;
categorizing and sorting names; providing telephone numbers or
e-mail addresses to initiate and/or facilitate communications by
telephone 138, video conference 139, e-mail 140, or IM 141; and so
forth.
[0088] In conjunction with RF circuitry 108, audio circuitry 110,
speaker 111, microphone 113, touch screen 112, display controller
156, contact module 130, graphics module 132, and text input module
134, the telephone module 138 may be used to enter a sequence of
characters corresponding to a telephone number, access one or more
telephone numbers in the address book 137, modify a telephone
number that has been entered, dial a respective telephone number,
conduct a conversation and disconnect or hang up when the
conversation is completed. As noted above, the wireless
communication may use any of a plurality of communications
standards, protocols and technologies.
[0089] In conjunction with RF circuitry 108, audio circuitry 110,
speaker 111, microphone 113, touch screen 112, display controller
156, optical sensor 164, optical sensor controller 158, contact
module 130, graphics module 132, text input module 134, contact
list 137, and telephone module 138, the videoconferencing module
139 may be used to initiate, conduct, and terminate a video
conference between a user and one or more other participants.
[0090] In conjunction with RF circuitry 108, touch screen 112,
display controller 156, contact module 130, graphics module 132,
and text input module 134, the e-mail client module 140 may be used
to create, send, receive, and manage e-mail. In conjunction with
image management module 144, the e-mail module 140 makes it very
easy to create and send e-mails with still or video images taken
with camera module 143.
[0091] In conjunction with RF circuitry 108, touch screen 112,
display controller 156, contact module 130, graphics module 132,
and text input module 134, the instant messaging module 141 may be
used to enter a sequence of characters corresponding to an instant
message, to modify previously entered characters, to transmit a
respective instant message (for example, using a Short Message
Service (SMS) or Multimedia Message Service (MMS) protocol for
telephony-based instant messages or using XMPP, SIMPLE, or IMPS for
Internet-based instant messages), to receive instant messages and
to view received instant messages. In some embodiments, transmitted
and/or received instant messages may include graphics, photos,
audio files, video files and/or other attachments as are supported
in an MMS and/or an Enhanced Messaging Service (EMS). As used
herein, "instant messaging" refers to both telephony-based messages
(e.g., messages sent using SMS or MMS) and Internet-based messages
(e.g., messages sent using XMPP, SIMPLE, or IMPS).
[0092] In conjunction with RF circuitry 108, touch screen 112,
display controller 156, contact module 130, graphics module 132,
text input module 134, image management module 144, and browsing
module 147, the blogging module 142 may be used to send text, still
images, video, and/or other graphics to a blog (e.g., the user's
blog).
[0093] In conjunction with touch screen 112, display controller
156, optical sensor(s) 164, optical sensor controller 158, contact
module 130, graphics module 132, and image management module 144,
the camera module 143 may be used to capture still images or video
(including a video stream) and store them into memory 102, modify
characteristics of a still image or video, or delete a still image
or video from memory 102.
[0094] In conjunction with touch screen 112, display controller
156, contact module 130, graphics module 132, text input module
134, and camera module 143, the image management module 144 may be
used to arrange, modify or otherwise manipulate, label, delete,
present (e.g., in a digital slide show or album), and store still
and/or video images.
[0095] In conjunction with touch screen 112, display controller
156, contact module 130, graphics module 132, audio circuitry 110,
and speaker 111, the video player module 145 may be used to
display, present or otherwise play back videos (e.g., on the touch
screen or on an external, connected display via external port
124).
[0096] In conjunction with touch screen 112, display system
controller 156, contact module 130, graphics module 132, audio
circuitry 110, speaker 111, RF circuitry 108, and browser module
147, the music player module 146 allows the user to download and
play back recorded music and other sound files stored in one or
more file formats, such as MP3 or AAC files. In some embodiments,
the device 100 may include the functionality of an MP3 player, such
as an iPod (trademark of Apple Computer, Inc.).
[0097] In conjunction with RF circuitry 108, touch screen 112,
display system controller 156, contact module 130, graphics module
132, and text input module 134, the browser module 147 may be used
to browse the Internet, including searching, linking to, receiving,
and displaying web pages or portions thereof, as well as
attachments and other files linked to web pages.
[0098] In conjunction with RF circuitry 108, touch screen 112,
display system controller 156, contact module 130, graphics module
132, text input module 134, e-mail module 140, and browser module
147, the calendar module 148 may be used to create, display,
modify, and store calendars and data associated with calendars
(e.g., calendar entries, to do lists, etc.).
[0099] In conjunction with RF circuitry 108, touch screen 112,
display system controller 156, contact module 130, graphics module
132, text input module 134, and browser module 147, the widget
modules 149 are mini-applications that may be downloaded and used
by a user (e.g., weather widget 149-1, stocks widget 149-2,
calculator widget 149-3, alarm clock widget 149-4, and dictionary
widget 149-5) or created by the user (e.g., user-created widget
149-6). In some embodiments, a widget includes an HTML (Hypertext
Markup Language) file, a CSS (Cascading Style Sheets) file, and a
JavaScript file. In some embodiments, a widget includes an XML
(Extensible Markup Language) file and a JavaScript file (e.g.,
Yahoo! Widgets).
[0100] In conjunction with RF circuitry 108, touch screen 112,
display system controller 156, contact module 130, graphics module
132, text input module 134, and browser module 147, the widget
creator module 150 may be used by a user to create widgets (e.g.,
turning a user-specified portion of a web page into a widget).
[0101] In conjunction with touch screen 112, display system
controller 156, contact module 130, graphics module 132, and text
input module 134, the search module 151 may be used to search for
text, music, sound, image, video, and/or other files in memory 102
that match one or more search criteria (e.g., one or more
user-specified search terms).
[0102] In conjunction with touch screen 112, display controller
156, contact module 130, graphics module 132, and text input module
134, the notes module 153 may be used to create and manage notes,
to do lists, and the like.
[0103] In conjunction with RF circuitry 108, touch screen 112,
display system controller 156, contact module 130, graphics module
132, text input module 134, GPS module 135, and browser module 147,
the map module 154 may be used to receive, display, modify, and
store maps and data associated with maps (e.g., driving directions;
data on stores and other points of interest at or near a particular
location; and other location-based data).
[0104] In conjunction with touch screen 112, display system
controller 156, contact module 130, graphics module 132, audio
circuitry 110, speaker 111, RF circuitry 108, text input module
134, e-mail client module 140, and browser module 147, the online
video module 155 allows the user to access, browse, receive (e.g.,
by streaming and/or download), play back (e.g., on the touch screen
or on an external, connected display via external port 124), send
an e-mail with a link to a particular online video, and otherwise
manage online videos in one or more file formats, such as H.264. In
some embodiments, instant messaging module 141, rather than e-mail
client module 140, is used to send a link to a particular online
video. Additional description of the online video application can
be found in U.S. Provisional Patent Application No. 60/936,562,
"Portable Multifunction Device, Method, and Graphical User
Interface for Playing Online Videos," filed Jun. 20, 2007, and U.S.
patent application Ser. No. 11/968,067, "Portable Multifunction
Device, Method, and Graphical User Interface for Playing Online
Videos," filed Dec. 31, 2007, the content of which is hereby
incorporated by reference in its entirety.
[0105] Each of the above identified modules and applications
corresponds to a set of instructions for performing one or more
functions described above. These modules (i.e., sets of
instructions) need not be implemented as separate software
programs, procedures or modules, and thus various subsets of these
modules may be combined or otherwise re-arranged in various
embodiments. For example, video player module 145 may be combined
with music player module 146 into a single module (e.g., video and
music player module 152, FIG. 1B). In some embodiments, memory 102
may store a subset of the modules and data structures identified
above. Furthermore, memory 102 may store additional modules and
data structures not described above.
[0106] In some embodiments, the device 100 is a device where
operation of a predefined set of functions on the device is
performed exclusively through a touch screen 112 and/or a touchpad.
By using a touch screen and/or a touchpad as the primary
input/control device for operation of the device 100, the number of
physical input/control devices (such as push buttons, dials, and
the like) on the device 100 may be reduced.
[0107] The predefined set of functions that may be performed
exclusively through a touch screen and/or a touchpad includes
navigation between user interfaces. In some embodiments, the
touchpad, when touched by the user, navigates the device 100 to a
main, home, or root menu from any user interface that may be
displayed on the device 100. In such embodiments, the touchpad may
be referred to as a "menu button." In some other embodiments, the
menu button may be a physical push button or other physical
input/control device instead of a touchpad.
[0108] FIG. 2 illustrates a portable multifunction device 100
having a touch screen 112 in accordance with some embodiments. The
touch screen may display one or more graphics within user interface
(UI) 200. In this embodiment, as well as others, a user may select
one or more of the graphics by making contact or touching the
graphics, for example, with one or more fingers 202 (not drawn to
scale in the figure). In some embodiments, selection of one or more
graphics occurs when the user breaks contact with the one or more
graphics. In some embodiments, the contact may include a gesture,
such as one or more taps, one or more swipes (from left to right,
right to left, upward and/or downward) and/or a rolling of a finger
(from right to left, left to right, upward and/or downward) that
has made contact with the device 100. In some embodiments,
inadvertent contact with a graphic may not select the graphic. For
example, a swipe gesture that sweeps over an application icon may
not select the corresponding application when the gesture
corresponding to selection is a tap.
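As a non-limiting illustration of this selection rule, the following Swift sketch classifies a finger track at lift-off and activates an icon only for a tap; the types, the 10-point movement threshold, and the classification heuristic are assumptions rather than details of the embodiments.

```swift
import Foundation

// Hypothetical sketch: selection is decided when contact is broken, and only
// the gesture bound to selection (here, a tap) activates the icon.
struct TouchSample {
    let x: Double
    let y: Double
    let time: TimeInterval
}

enum Gesture { case tap, swipe }

func classify(_ track: [TouchSample], swipeThreshold: Double = 10.0) -> Gesture? {
    guard let first = track.first, let last = track.last else { return nil }
    let dx = last.x - first.x
    let dy = last.y - first.y
    // A contact that travels farther than the threshold is a swipe, not a tap.
    return (dx * dx + dy * dy).squareRoot() > swipeThreshold ? .swipe : .tap
}

// A swipe that merely sweeps over an application icon returns false here,
// so the inadvertent contact does not select the icon.
func shouldSelectIcon(onLiftOff track: [TouchSample]) -> Bool {
    classify(track) == .tap
}
```

Deciding at lift-off, rather than at first contact, is what allows a swipe to pass over an icon without activating the corresponding application.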
[0109] The device 100 may also include one or more physical
buttons, such as "home" or menu button 204. As described
previously, the menu button 204 may be used to navigate to any
application 136 in a set of applications that may be executed on
the device 100. Alternatively, in some embodiments, the menu button
is implemented as a soft key in a GUI in touch screen 112.
[0110] In one embodiment, the device 100 includes a touch screen
112, a menu button 204, a push button 206 for powering the device
on/off and locking the device, volume adjustment button(s) 208, a
Subscriber Identity Module (SIM) card slot 210, a headset jack
212, and a docking/charging external port 124. The push button 206
may be used to turn the power on/off on the device by depressing
the button and holding the button in the depressed state for a
predefined time interval; to lock the device by depressing the
button and releasing the button before the predefined time interval
has elapsed; and/or to unlock the device or initiate an unlock
process. In an alternative embodiment, the device 100 also may
accept verbal input for activation or deactivation of some
functions through the microphone 113.
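The press-and-hold timing just described can be sketched as follows; for simplicity this hypothetical version decides between the two actions at release (a real device would trigger the power action as soon as the interval elapses while the button is still held), and the two-second interval and action names are assumptions.

```swift
import Foundation

enum ButtonAction { case togglePower, lockDevice }

// Hypothetical model of push button 206: holding past a predefined time
// interval powers the device on/off; releasing earlier locks it.
struct PushButton {
    let holdInterval: TimeInterval = 2.0   // predefined time interval (assumed)
    private var pressedAt: TimeInterval?

    mutating func press(at time: TimeInterval) {
        pressedAt = time
    }

    mutating func release(at time: TimeInterval) -> ButtonAction? {
        guard let start = pressedAt else { return nil }
        pressedAt = nil
        // Held past the interval: power on/off; released early: lock.
        return (time - start) >= holdInterval ? .togglePower : .lockDevice
    }
}
```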
[0111] Attention is now directed towards embodiments of user
interfaces ("UI") and associated processes that may be implemented
on a portable multifunction device 100.
[0112] FIGS. 3A-3C illustrate exemplary user interfaces for
unlocking a portable electronic device in accordance with some
embodiments. In some embodiments, user interface 300A includes the
following elements, or a subset or superset thereof: [0113] Unlock
image 302 that is moved with a finger gesture to unlock the device;
[0114] Arrow 304 that provides a visual cue to the unlock gesture;
[0115] Channel 306 that provides additional cues to the unlock
gesture; [0116] Time 308; [0117] Day 310; [0118] Date 312; and
[0119] Wallpaper image 314.
[0120] In some embodiments, in addition to or in place of wallpaper
image 314, an unlock user interface may include a device charging
status icon 316 and a headset charging status icon 318 (e.g., UI
300B, FIG. 3B). The device charging status icon 316 indicates the
battery status while the device 100 is being recharged (e.g., in a
dock). Similarly, headset charging status icon 318 indicates the
battery status of a headset associated with device 100 (e.g., a
Bluetooth headset) while the headset is being recharged (e.g., in
another portion of the dock).
[0121] In some embodiments, the device detects contact with the
touch-sensitive display (e.g., a user's finger making contact on or
near the unlock image 302) while the device is in a user-interface
lock state. The device moves the unlock image 302 in accordance
with the contact. The device transitions to a user-interface unlock
state if the detected contact corresponds to a predefined gesture,
such as moving the unlock image across channel 306. Conversely, the
device maintains the user-interface lock state if the detected
contact does not correspond to the predefined gesture. This process
saves battery power by ensuring that the device is not accidentally
awakened. This process is easy for users to perform, in part
because of the visual cue(s) provided on the touch screen.
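A minimal sketch of this lock/unlock behavior, assuming a horizontal channel measured in points, is shown below; the type names and snap-back rule are illustrative assumptions.

```swift
import Foundation

enum DeviceUIState { case locked, unlocked }

// Hypothetical model of unlock image 302 dragged along channel 306.
struct UnlockChannel {
    let width: Double             // channel length in points (assumed)
    var imageOffset: Double = 0   // current position of the unlock image

    mutating func drag(to fingerX: Double) {
        // The unlock image moves in accordance with the contact,
        // clamped to the channel.
        imageOffset = min(max(fingerX, 0), width)
    }

    mutating func liftFinger(state: inout DeviceUIState) {
        if imageOffset >= width {
            state = .unlocked     // predefined gesture completed
        } else {
            imageOffset = 0       // snap back; the lock state is maintained
            state = .locked
        }
    }
}
```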
[0122] In some embodiments, after detecting an unlock gesture, the
device displays a passcode (or password) interface (e.g., UI 300C,
FIG. 3C) for entering a passcode to complete the unlock process.
The addition of a passcode protects against unauthorized use of the
device. In some embodiments, the passcode interface includes an
emergency call icon that permits an emergency call (e.g., to 911)
without entering the passcode. In some embodiments, the use of a
passcode is a user-selectable option (e.g., part of settings
412).
[0123] As noted above, processes that use gestures on the touch
screen to unlock the device are described in U.S. patent
application Ser. No. 11/322,549, "Unlocking A Device By Performing
Gestures On An Unlock Image," filed Dec. 23, 2005, and Ser. No.
11/322,550, "Indication Of Progress Towards Satisfaction Of A User
Input Condition," filed Dec. 23, 2005, which are hereby
incorporated by reference in their entirety.
[0124] FIGS. 4A and 4B illustrate exemplary user interfaces for a
menu of applications on a portable multifunction device in
accordance with some embodiments. In some embodiments, user
interface 400A includes the following elements, or a subset or
superset thereof: [0125] Signal strength indicator(s) 402 for
wireless communication(s), such as cellular and Wi-Fi signals;
[0126] Time 404; [0127] Battery status indicator 406; [0128] Tray
408 with icons for frequently used applications, such as: [0129]
Phone 138, which may include an indicator 414 of the number of
missed calls or voicemail messages; [0130] E-mail client 140, which
may include an indicator 410 of the number of unread e-mails;
[0131] Browser 147; and [0132] Music player 146; and [0133] Icons
for other applications, such as: [0134] IM 141; [0135] Image
management 144; [0136] Camera 143; [0137] Video player 145; [0138]
Weather 149-1; [0139] Stocks 149-2; [0140] Blog 142; [0141]
Calendar 148; [0142] Calculator 149-3; [0143] Alarm clock 149-4;
[0144] Dictionary 149-5; and [0145] User-created widget 149-6.
[0146] In some embodiments, user interface 400B includes the
following elements, or a subset or superset thereof: [0147] 402,
404, 406, 141, 148, 144, 143, 149-3, 149-2, 149-1, 149-4, 410, 414,
138, 140, and 147, as described above; [0148] Bluetooth indicator
405; [0149] Map 154; [0150] Notes 153; [0151] Settings 412, which
provides access to settings for the device 100 and its various
applications 136, as described further below; [0152] Video and
music player module 152, also referred to as iPod (trademark of
Apple Computer, Inc.) module 152; and [0153] Online video module
155, also referred to as YouTube (trademark of Google, Inc.) module
155.
[0154] In some embodiments, UI 400A or 400B displays all of the
available applications 136 on one screen so that there is no need
to scroll through a list of applications (e.g., via a scroll bar).
In some embodiments, as the number of applications increases, the
icons corresponding to the applications may decrease in size so
that all applications may be displayed on a single screen without
scrolling. In some embodiments, having all applications on one
screen and a menu button enables a user to access any desired
application with at most two inputs, such as activating the menu
button 204 and then activating the desired application (e.g., by a
tap or other finger gesture on the icon corresponding to the
application). In some embodiments, a predefined gesture on the menu
button 204 (e.g., a double tap or a double click) acts as a short
cut that initiates display of a particular user interface in a
particular application. In some embodiments, the short cut is a
user-selectable option (e.g., part of settings 412). For example,
if the user makes frequent calls to persons listed in a Favorites
UI (e.g., UI 2700A, FIG. 27A) in the phone 138, the user may choose
to have the Favorites UI be displayed in response to a double click
on the menu button. As another example, the user may choose to have
a UI with information about the currently playing music (e.g., UI
4300S, FIG. 43S) be displayed in response to a double click on the
menu button.
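The icon-shrinking behavior mentioned above amounts to picking a grid whose cells still fit the screen as the application count grows. A hypothetical computation follows; the screen dimensions, the grid heuristic, and the 57-point cap are all assumptions.

```swift
import Foundation

// Choose an icon size so that `appCount` icons fit one screen without
// scrolling, never exceeding the normal full-size icon.
func iconSize(screenWidth: Double, screenHeight: Double,
              appCount: Int, maxIconSide: Double = 57.0) -> Double {
    guard appCount > 0 else { return maxIconSide }
    // Pick roughly square cells: columns proportional to the aspect ratio.
    let columns = Int(ceil((Double(appCount) * screenWidth / screenHeight).squareRoot()))
    let rows = Int(ceil(Double(appCount) / Double(columns)))
    let side = min(screenWidth / Double(columns), screenHeight / Double(rows))
    return min(side, maxIconSide)
}

// Example: 16 apps on a 320x480 screen keep full-size (57-point) icons,
// while 200 apps shrink to roughly 27 points per icon.
```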
[0155] In some embodiments, UI 400A or 400B provides integrated
access to both widget-based applications and non-widget-based
applications. In some embodiments, all of the widgets, whether
user-created or not, are displayed in UI 400A or 400B. In other
embodiments, activating the icon for user-created widget 149-6 may
lead to another UI that contains the user-created widgets or icons
corresponding to the user-created widgets.
[0156] In some embodiments, a user may rearrange the icons in UI
400A or 400B, e.g., using processes described in U.S. patent
application Ser. No. 11/459,602, "Portable Electronic Device With
Interface Reconfiguration Mode," filed Jul. 24, 2006, which is
hereby incorporated by reference in its entirety. For example, a
user may move application icons in and out of tray 408 using finger
gestures.
[0157] In some embodiments, UI 400A or 400B includes a gauge (not
shown) that displays an updated account usage metric for an account
associated with usage of the device (e.g., a cellular phone
account), as described in U.S. patent application Ser. No.
11/322,552, "Account Information Display For Portable Communication
Device," filed Dec. 23, 2005, which is hereby incorporated by
reference in its entirety.
[0158] In some embodiments, a signal strength indicator 402 (FIG.
4B) for a Wi-Fi network is replaced by a symbol for a cellular
network (e.g., the letter "E" for an EDGE network, FIG. 4A) when
the device switches from using the Wi-Fi network to using the
cellular network for data transmission (e.g., because the Wi-Fi
signal is weak or unavailable).
[0159] FIGS. 5A-5J illustrate exemplary user interfaces for
displaying and navigating a portion of a web page in accordance
with some embodiments.
[0160] In some embodiments, UIs 500A-500E and UI 500J (FIGS. 5A-5E
and 5J) include the following elements, or a subset or superset
thereof: [0161] 402, 404, and 406 as described above; [0162] Status
bar 501 for displaying one or more status indicators, such as the
signal strength indicator 402, time 404, and battery status
indicator 406; [0163] Previous page icon 502 that when activated
(e.g., by a finger tap on the icon) initiates display of the
previous web page; [0164] Web page name 504; [0165] Next page icon
506 that when activated (e.g., by a finger tap on the icon)
initiates display of the next web page; [0166] URL (Uniform
Resource Locator) entry area 508 for inputting URLs or other
Uniform Resource Identifiers (URIs) of web pages; [0167] Refresh
icon 510 that when activated (e.g., by a finger tap on the icon)
initiates a refresh of the web page; [0168] Portions 512 of a web
page (e.g., bottom portion 512-3 (FIG. 5A), intermediate portion
512-2 (FIG. 5J), and top portion 512-1 (FIG. 5D)); [0169] Bookmarks
icon 518 that when activated (e.g., by a finger tap on the icon)
initiates display of a bookmarks list or menu for the browser;
[0170] Add bookmark icon 520 that when activated (e.g., by a finger
tap on the icon) initiates display of a UI for adding bookmarks;
and [0171] New window icon 522 that when activated (e.g., by a
finger tap on the icon) initiates display of a UI for adding new
windows (e.g., web pages) to the browser, and which may also
indicate the number of windows (e.g., "4" in icon 522, FIG.
5A).
[0172] In some embodiments, a portion 512 of the web page (e.g.,
portion 512-3 that displays items 96 to 100 on a web page with a
"Top 100 List," FIG. 5A) is displayed in a web browser application
147, without concurrently displaying a URL entry area 508 for
inputting URLs of web pages (e.g., UI 500A, FIG. 5A).
[0173] In response to detecting a gesture 580 (UI 500B, FIG. 5B) on
a predefined area at the top of the touch screen display (e.g.,
status bar 501), a URL entry area 508 is displayed (UI 500C, FIG.
5C). In some embodiments, in response to detecting the gesture on
the predefined area at the top of the touch screen display, the web
page is translated (e.g., scrolled) to display the top portion of
the web page (e.g., portion 512-1 that displays items 1 to 4 on the
web page with the "Top 100 List," FIG. 5D).
[0174] In some embodiments, UIs 500F-500I (FIGS. 5F-5I) include the
following elements, or a subset or superset thereof: [0175] 402,
404, 406, 501, 504, and 508, as described above; [0176] Cancel icon
505 that when activated (e.g., by a finger tap on the icon) cancels
a URL or search term input process and ceases display of the
corresponding keyboard (e.g., URL input keyboard 550 or Search
input keyboard 562, described below); [0177] URL clear icon 532
that when activated (e.g., by a finger tap on the icon) clears any
input in URL text entry area 508; [0178] Search term entry area 534
for displaying search terms input for web searches; [0179] URL
input keyboard 550 (FIG. 5F) with period key 556, backslash key
558, ".com" key 552, and "Go" key 560, which streamlines entering
common characters in URLs; [0180] Alternate keyboard selector icons
554 (FIG. 5F) and 564 (FIG. 5H) that when activated (e.g., by a
finger tap on the icon) initiate the display of a different
keyboard (e.g., a number/symbol keyboard, not shown); [0181] Search
input keyboard 562 (FIG. 5H) with alternate keyboard selector icon
564, space icon 568 and search icon 570, for entering search
term(s); and [0182] Background web page portion 572 for providing
application context and that when activated (e.g., by a finger tap
on the portion 572) cancels the URL or search term input process
and ceases display of the corresponding keyboard (e.g., URL input
keyboard 550 or Search input keyboard 562).
[0183] In some embodiments, a gesture 582 is detected on URL entry
area 508 (UI 500E, FIG. 5E). In response to detecting the gesture
582 on the URL entry area 508, URL input keyboard 550 is displayed
for entering text, such as letters, numbers, punctuation, and other
symbols, in the URL entry area 508 (UI 500F, FIG. 5F). In some
embodiments, in response to detecting the gesture 582 on the URL
entry area 508, a search term entry area (e.g., area 534) is also
displayed for inputting search terms for web searches (UI 500F,
FIG. 5F).
[0184] In some embodiments, a gesture 584 is detected on search
term entry area 534 (UI 500G, FIG. 5G). In response to detecting
the gesture 584 on search term entry area 534, URL input keyboard
550 for entering characters in the URL text entry area 508 is
replaced with search input keyboard 562 for entering characters in
the search term entry area 534 (UI 500H, FIG. 5H).
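A compact sketch of this keyboard replacement follows, with the two entry areas modeled as an enum and the special keys taken from FIGS. 5F and 5H; the Swift types themselves are assumptions.

```swift
import Foundation

enum EntryArea { case url, searchTerm }

struct SoftKeyboard {
    let name: String
    let specialKeys: [String]
}

let urlKeyboard = SoftKeyboard(name: "URL input keyboard 550",
                               specialKeys: [".", "\\", ".com", "Go"])
let searchKeyboard = SoftKeyboard(name: "Search input keyboard 562",
                                  specialKeys: ["space", "Search"])

// One keyboard is displayed at a time: a gesture on the other entry area
// replaces the current keyboard rather than stacking a second one.
func keyboard(forGestureOn area: EntryArea) -> SoftKeyboard {
    switch area {
    case .url:        return urlKeyboard
    case .searchTerm: return searchKeyboard
    }
}
```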
[0185] The user interfaces in FIGS. 5A-5J are used to illustrate
the processes described below with respect to FIGS. 7A-7C and
8A-8C.
[0186] FIGS. 6A-6D illustrate exemplary user interfaces for
displaying and navigating a portion of an electronic document
(e.g., an electronic note with a shopping list) in accordance with
some embodiments.
[0187] In some embodiments, UIs 600A-600D (FIGS. 6A-6D) include the
following elements, or a subset or superset thereof: [0188] 402,
404, 406, and 501, as described above; [0189] Title 610 of the
electronic document; [0190] Portions 614 of an electronic document
(e.g., bottom portion 614-3 (FIG. 6A), intermediate portion 614-2
(FIG. 6D), and top portion 614-1 (FIG. 6C)); [0191] Add note icon
616 that when activated (e.g., by a finger tap on the icon)
initiates display of a new note (not shown); [0192] Notes icon 620
that when activated (e.g., by a finger tap on the icon) initiates
display of a list of notes in the notes application 153 (not
shown); [0193] Previous note icon 632 that when activated (e.g., by
a finger tap on the icon) initiates display of the previous note
(not shown); [0194] Create email icon 634 that when activated
(e.g., by a finger tap on the icon) initiates transfer to the email
application 140 and display of a UI for creating an email message
(not shown); [0195] Trash icon 636 that when activated (e.g., by a
finger tap on the icon) initiates display of a UI for deleting the
note; and [0196] Next note icon 638 that when activated (e.g., by a
finger tap on the icon) initiates display of the next note (not
shown).
[0197] In some embodiments, a portion of an electronic document is
displayed on the touch screen (e.g., portion 614-3 that displays
items 9 to 14 of an electronic note with a Christmas Shopping List,
FIG. 6A). In some embodiments, a gesture 680 (FIG. 6B) is detected
in a predefined area at the top of the touch screen display (e.g.,
status bar 501, FIG. 6B). In response to detecting the gesture 680
at the top of the touch screen display, a top portion of the
electronic document is displayed (e.g., portion 614-1 that displays
items 1 to 6 of the electronic note with the Christmas Shopping
List, FIG. 6C).
[0198] In some embodiments, an upward swipe gesture 688 (FIG. 6C)
is detected on the touch screen display (UI 600C, FIG. 6C). In
response to detecting the upward swipe gesture on the touch screen
display, the electronic document is translated to display a portion
of the electronic document other than the top portion of the
electronic document. For example, in response to detecting the
upward swipe gesture 688 on the touch screen display, the
electronic note is translated (e.g., scrolled) to display portion
614-2 (FIG. 6D) rather than the top portion 614-1 of the note (FIG.
6C).
[0199] The user interfaces in FIGS. 6A-6D are used to illustrate
the process described below with respect to FIGS. 8A-8C.
[0200] FIGS. 7A-7C are a flow diagram illustrating a method of
displaying and navigating a portion of a web page in accordance
with some embodiments. The method 7000 is performed at a portable
electronic device with a touch screen display (e.g., portable
multifunction device 100).
[0201] The device displays (7020) a portion of a web page (e.g.,
portion 512-3 that displays items 96 to 100 on a web page with a
"Top 100 List," FIG. 5A) in a web browser application 147 without
concurrently displaying a URL entry area 508 for inputting URLs of
web pages.
[0202] The device detects (7040) a gesture in a predefined area at
the top of the touch screen display. For example, a gesture 580 (UI
500B, FIG. 5B) is detected on status bar 501.
[0203] In some embodiments, the gesture (e.g., gesture 580, FIG.
5B) is a finger gesture (7060). In some embodiments, the finger
gesture is a finger tap gesture (7080) (e.g., a single tap gesture
or a double tap gesture).
[0204] In some embodiments, the gesture is made with a stylus
(7100). In some embodiments, the gesture is a tap gesture (7120)
(e.g., a single tap gesture or a double tap gesture).
[0205] In some embodiments, the predefined area at the top of the
touch screen display is a status bar (7140) for the portable
electronic device (e.g., status bar 501). In some embodiments, the
gesture 580 may be detected anywhere along the top of the touch
screen display (e.g., anywhere along the status bar 501) so that
the user does not need to touch a precise location.
[0206] In response to detecting the gesture 580 in the predefined
area at the top of the touch screen display, the device displays
(7160) the URL entry area 508 (FIG. 5C), thereby providing a simple
and efficient way for a user to display a URL entry area with the
address of a displayed web page.
[0207] In some embodiments, in response to detecting the gesture
580 in the predefined area at the top of the touch screen display,
the device also translates (7180) the web page to display the top
portion 512-1 of the web page (e.g., UI 500D, FIG. 5D). In these
embodiments, in response to a single gesture, the device displays a
URL entry area and also displays the top portion of the web page,
thereby simplifying navigation of the web page and use of the
browser application. In some embodiments, the top portion 512-1 of
the web page is displayed (7200) adjacent to the URL entry area 508
(FIG. 5D).
[0208] In some embodiments, the device detects (7220) a gesture on
the URL entry area 508 (e.g., gesture 582, FIG. 5E). In response to
detecting the gesture on the URL entry area 508, the device
displays (7240) a soft keyboard for entering text in the URL entry
area 508 (e.g., URL input keyboard 550 in UI 500F, FIG. 5F). In
some embodiments, in response to detecting the gesture on the URL
entry area 508, the device also displays a background web page
portion 572 to provide application context (FIG. 5F). In some
embodiments, the background web page portion 572 is a darkened
portion of the web page.
[0209] In some embodiments, the soft keyboard 550 for entering text
in the URL entry area 508 includes (7260) a single key for entering
".com" in the URL entry area (e.g., ".com" key 552, FIG. 5F).
[0210] In some embodiments, in response to detecting the gesture
(e.g., gesture 582, FIG. 5E) on the URL entry area 508, the device
displays (7280): a soft keyboard for entering text in the URL entry
area, and a search term entry area for inputting search terms for
web searches (e.g., URL input keyboard 550 and search term entry
area 534 in UI 500F, FIG. 5F).
[0211] In some embodiments, the device detects (7300) a gesture on
the search term entry area (e.g., gesture 584, FIG. 5G). In
response to detecting the gesture 584 on the search term entry
area, the device replaces (7320) the display of the soft keyboard
for entering text in the URL entry area with display of a soft
keyboard for entering text in the search term entry area. For
example, URL input keyboard 550 (FIG. 5G) for entering characters
in the URL text entry area 508 (FIG. 5G) is replaced with search
input keyboard 562 (FIG. 5H) for entering text in the search term
entry area 534 (FIG. 5H). In some embodiments, in response to
gestures (e.g., 582 and 584) on the URL entry area 508 and the
search term entry area 534, the device displays the corresponding
keyboard (e.g., 550 and 562, respectively).
[0212] In some embodiments, the device detects (7340) an upward
swipe gesture on the touch screen display (e.g., gesture 586, FIG.
5E). In response to detecting the upward swipe gesture 586 on the
touch screen display, the device translates (7360) the web page
(e.g., from portion 512-1 (FIG. 5E) to portion 512-2 (FIG. 5J)) and
ceases to display the URL entry area 508, thereby hiding the URL
entry area 508 when the top portion 512-1 of the web page is no
longer displayed.
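One way to model this hide-on-scroll behavior in code is shown below; the struct and its constants are illustrative assumptions.

```swift
import Foundation

// Hypothetical page state: once an upward swipe scrolls the top portion out
// of view, the URL entry area is dismissed.
struct PageState {
    var scrollOffsetY: Double = 0       // 0 == top portion 512-1 visible
    var urlEntryAreaVisible = true

    mutating func handleUpwardSwipe(distance: Double) {
        scrollOffsetY += distance       // translate the web page
        if scrollOffsetY > 0 {
            // The top portion is no longer displayed, so cease displaying
            // the URL entry area.
            urlEntryAreaVisible = false
        }
    }
}
```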
[0213] A graphical user interface on a portable electronic device
with a touch screen display comprises: a portion of a web page in a
web browser application (e.g., portion 512-3, FIG. 5A); a URL entry
area 508 for inputting URLs of web pages (FIG. 5C or 5D); and a
predefined area at the top of the touch screen display (e.g., area
501, FIG. 5A). Prior to detecting a gesture in the predefined area
at the top of the touch screen display, the portion 512 of the web
page is displayed in the web browser application without
concurrently displaying the URL entry area 508 for inputting URLs
of web pages (e.g., UI 500A, FIG. 5A). In response to detecting the
gesture (e.g., gesture 580, FIG. 5B) in the predefined area at the
top of the touch screen display, the URL entry area 508 is
displayed (FIG. 5C or 5D).
[0214] FIGS. 8A-8C are a flow diagram illustrating a method of
displaying and navigating a portion of an electronic document in
accordance with some embodiments. The method 8000 is performed at a
portable electronic device with a touch screen display (e.g.,
portable multifunction device 100). The method 8000 provides a
simple and efficient way for a user to quickly display the top of
the electronic document, as further described below.
[0215] The device displays (8020) a portion of an electronic
document on the touch screen display (e.g., portion 512-3 of a web
page (FIG. 5A) or portion 614-3 of an electronic note (FIG.
6A)).
[0216] The electronic document has (8040) an electronic document
length. The displayed portion of the electronic document has (8060)
a displayed portion length that is less than the electronic
document length. For example, the length of the displayed portion
512-3 (FIG. 5A) of the web page is less than the length of the web
page. Similarly, the length of the displayed portion 614-3 (FIG.
6A) of the electronic note is less than the length of the
electronic note. In other words, less than the entire length of the
electronic document is displayed on the touch screen display.
[0217] In some embodiments, the electronic document is a web page
(8080). In some embodiments, the electronic document is a word
processing document (8100). In some embodiments, the electronic
document is a PDF file (8120). In some embodiments, the electronic
document is a digital image, a presentation document, or a
spreadsheet (8140).
[0218] The device detects (8160) a gesture (e.g., gesture 580 (FIG.
5B) or gesture 680 (FIG. 6B)) in a predefined area at the top of
the touch screen display (e.g., status bar 501). In some
embodiments, the gesture is a contact on any part of the status bar
501.
[0219] In some embodiments, the gesture (e.g., gesture 580 (FIG.
5B) or gesture 680 (FIG. 6B)) is a finger gesture (8180). In some
embodiments, the finger gesture is a finger tap gesture (8200)
(e.g., a single tap gesture or a double tap gesture).
[0220] In some embodiments, the gesture is made with a stylus
(8220). In some embodiments, the gesture is a tap gesture (8240)
(e.g., a single tap gesture or a double tap gesture).
[0221] In some embodiments, the predefined area (8260) at the top
of the touch screen display is a status bar for the portable
electronic device (e.g., status bar 501).
[0222] In response to detecting the gesture in the predefined area
at the top of the touch screen display, the device displays (8280)
a top portion of the electronic document. For example, in response
to detecting the gesture 580 (FIG. 5B) in the status bar 501, the
device displays the top portion 512-1 of the web page (FIG. 5D).
Similarly, in response to detecting the gesture 680 (FIG. 6B) in
the status bar 501, the device displays the top portion 614-1 of
the electronic note (FIG. 6C).
[0223] In some embodiments, displaying the top portion of the
electronic document includes (8300) translating (e.g., scrolling)
the electronic document to display the top portion of the
electronic document. In some embodiments, displaying the top
portion of the electronic document includes jumping from a
currently displayed portion of the electronic document to a display
of the top portion of the electronic document.
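The two display strategies (scrolling versus jumping) can be captured in a small sketch; the fixed-rate step below is a hypothetical stand-in for whatever animation the device actually uses.

```swift
import Foundation

enum TopTransition {
    case translate(pointsPerStep: Double)   // animated scroll toward the top
    case jump                               // immediate display of the top
}

// Advance the scroll offset one step toward 0 (the top of the document).
func stepTowardTop(offset: inout Double, using transition: TopTransition) {
    switch transition {
    case .jump:
        offset = 0
    case .translate(let rate):
        offset = max(0, offset - rate)
    }
}

// Usage: call stepTowardTop(...) from a display-refresh timer until the
// offset reaches 0, or once with .jump for the non-animated variant.
```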
[0224] In some embodiments, if the electronic document is a web
page (e.g., FIG. 5B), in response to detecting the gesture 580 in
the predefined area at the top of the touch screen display, the
device translates (8320) the web page to display the top portion of
the web page and concurrently displays a URL entry area for
inputting URLs of web pages. For example, in response to detecting
the gesture 580 (FIG. 5B) in the status bar 501, the web page is
translated to display the top portion 512-1 of the web page and the
URL entry area 508 is concurrently displayed (FIG. 5D).
[0225] In some embodiments, the device detects (8340) an upward
swipe gesture (e.g., gesture 586 (FIG. 5E) or gesture 688 (FIG.
6C)) on the touch screen display. In response to detecting the
upward swipe gesture on the touch screen display, the device
translates (8360) the electronic document to display a portion of
the electronic document other than the top portion of the
electronic document. For example, in response to detecting the
upward swipe gesture 586 (FIG. 5E) on the touch screen display, the
device translates the web page to display an intermediate portion
512-2 of the web page (FIG. 5J), rather than the top portion 512-1
of the web page. Similarly, in response to detecting the upward
swipe gesture 688 (FIG. 6C) on the touch screen display, the device
translates the electronic note to display an intermediate portion
614-2 of the electronic note (FIG. 6D), rather than the top portion
614-1 of the electronic note.
[0226] A graphical user interface on a portable electronic device
with a touch screen display comprises: a portion of an electronic
document (e.g., portion 512-3 (FIG. 5A) or portion 614-3 (FIG. 6A));
and a predefined area at the top of the touch screen display (e.g.,
status bar 501). The electronic document has an electronic document
length and the displayed portion of the electronic document has a
displayed portion length that is less than the electronic document
length. In response to detecting a gesture in the predefined area
at the top of the touch screen display (e.g., gesture 580 (FIG. 5B)
or gesture 680 (FIG. 6B)), a top portion of the electronic document
is displayed (e.g., the top portion 512-1 of a web page (FIG. 5D)
or the top portion 614-1 of an electronic note (FIG. 6C)).
[0227] The foregoing description, for purposes of explanation, has
been presented with reference to specific embodiments. However, the
illustrative discussions above are not intended to be exhaustive or
to limit the invention to the precise forms disclosed. Many
modifications and variations are possible in view of the above
teachings. The embodiments were chosen and described in order to
best explain the principles of the invention and its practical
applications, to thereby enable others skilled in the art to best
utilize the invention and various embodiments with various
modifications as are suited to the particular use contemplated.
* * * * *