U.S. patent application number 14/536,646 was filed with the patent office on 2014-11-09 and published on 2015-03-05 for device, method, and graphical user interface for providing tactile feedback for operations performed in a user interface.
The applicant listed for this patent is APPLE INC. Invention is credited to Jeffrey Traer Bernstein and Julian Missig.
United States Patent Application 20150067496
Kind Code: A1
Missig; Julian; et al.
March 5, 2015

Application Number: 14/536,646
Family ID: 48468818
Publication Date: 2015-03-05

Device, Method, and Graphical User Interface for Providing Tactile Feedback for Operations Performed in a User Interface
Abstract
An electronic device with a touch-sensitive surface and a
display displays a user interface object on the display, detects a
contact on the touch-sensitive surface, and detects a first
movement of the contact across the touch-sensitive surface, the
first movement corresponding to performing an operation on the user
interface object, and, in response to detecting the first movement,
the device performs the operation and generates a first tactile
output on the touch-sensitive surface. The device also detects a
second movement of the contact across the touch-sensitive surface,
the second movement corresponding to reversing the operation on the
user interface object, and in response to detecting the second
movement, the device reverses the operation and generates a second
tactile output on the touch-sensitive surface, where the second
tactile output is different from the first tactile output.
Inventors: Missig; Julian (Redwood City, CA); Bernstein; Jeffrey Traer (San Francisco, CA)

Applicant:
Name: APPLE INC.
City: Cupertino
State: CA
Country: US

Family ID: 48468818
Appl. No.: 14/536,646
Filed: November 9, 2014
Related U.S. Patent Documents

Application Number | Filing Date | Patent Number
PCT/US2013/040070 (parent of the present application, 14/536,646) | May 8, 2013 |
61/778,284 | Mar 12, 2013 |
61/747,278 | Dec 29, 2012 |
61/688,227 | May 9, 2012 |
Current U.S. Class: 715/702
Current CPC Class: G06F 3/016 (20130101); G06F 3/04883 (20130101); G06F 3/0488 (20130101); G06F 2203/04105 (20130101); G06F 3/04842 (20130101); G06F 3/04886 (20130101)
Class at Publication: 715/702
International Class: G06F 3/01 (20060101); G06F 3/0484 (20060101); G06F 3/0488 (20060101)
Claims
1. A non-transitory computer readable storage medium storing one or
more programs, the one or more programs comprising instructions,
which when executed by an electronic device with a display and a
touch-sensitive surface cause the device to: display a user
interface object on the display; detect a contact on the
touch-sensitive surface; detect a first movement of the contact
across the touch-sensitive surface, the first movement
corresponding to performing an operation on the user interface
object; in response to detecting the first movement: perform the
operation; and generate a first tactile output on the
touch-sensitive surface; detect a second movement of the contact
across the touch-sensitive surface, the second movement
corresponding to reversing the operation on the user interface
object; and in response to detecting the second movement: reverse
the operation; and generate a second tactile output on the
touch-sensitive surface, wherein the second tactile output is
different from the first tactile output.
2. The non-transitory computer readable storage medium of claim 1,
wherein the first movement of the contact across the
touch-sensitive surface and the second movement of the contact
across the touch-sensitive surface are part of a single continuous
gesture performed without detecting a liftoff of the contact from
the touch-sensitive surface.
3. The non-transitory computer readable storage medium of claim 1,
wherein: performing the operation includes snapping the user
interface object into a respective object placement guide; and
reversing the operation includes snapping the user interface object
out of the respective object placement guide.
4. The non-transitory computer readable storage medium of claim 1,
wherein: performing the operation includes marking data
corresponding to the user interface object for deletion; and
reversing the operation includes unmarking the data corresponding
to the user interface object for deletion.
5. The non-transitory computer readable storage medium of claim 1,
wherein: the user interface object corresponds to a file;
performing the operation includes placing the file in a directory;
and reversing the operation includes removing the file from the
directory.
6. The non-transitory computer readable storage medium of claim 1,
wherein: the user interface object corresponds to an application;
performing the operation includes placing the user interface object
in an application launch region; and reversing the operation
includes removing the user interface object from the application
launch region.
7. The non-transitory computer readable storage medium of claim 1,
wherein: the first tactile output is generated by movement of the
touch-sensitive surface that includes a first dominant movement
component; the second tactile output is generated by movement of
the touch-sensitive surface that includes a second dominant
movement component; and the first dominant movement component and
the second dominant movement component have a same movement profile
and different amplitudes.
8. The non-transitory computer readable storage medium of claim 1,
wherein: the first tactile output is generated by movement of the
touch-sensitive surface that includes a first dominant movement
component; the second tactile output is generated by movement of
the touch-sensitive surface that includes a second dominant
movement component; and the first dominant movement component and
the second dominant movement component have different movement
profiles and a same amplitude.
9. An electronic device, comprising: a display; a touch-sensitive
surface; one or more processors; memory; and one or more programs,
wherein the one or more programs are stored in the memory and
configured to be executed by the one or more processors, the one or
more programs including instructions for: displaying a user
interface object on the display; detecting a contact on the
touch-sensitive surface; detecting a first movement of the contact
across the touch-sensitive surface, the first movement
corresponding to performing an operation on the user interface
object; in response to detecting the first movement: performing the
operation; and generating a first tactile output on the
touch-sensitive surface; detecting a second movement of the contact
across the touch-sensitive surface, the second movement
corresponding to reversing the operation on the user interface
object; and in response to detecting the second movement: reversing
the operation; and generating a second tactile output on the
touch-sensitive surface, wherein the second tactile output is
different from the first tactile output.
10. A method, comprising: at an electronic device with a
touch-sensitive surface and a display: displaying a user interface
object on the display; detecting a contact on the touch-sensitive
surface; detecting a first movement of the contact across the
touch-sensitive surface, the first movement corresponding to
performing an operation on the user interface object; in response
to detecting the first movement: performing the operation; and
generating a first tactile output on the touch-sensitive surface;
detecting a second movement of the contact across the
touch-sensitive surface, the second movement corresponding to
reversing the operation on the user interface object; and in
response to detecting the second movement: reversing the operation;
and generating a second tactile output on the touch-sensitive
surface, wherein the second tactile output is different from the
first tactile output.
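Claims 7 and 8 distinguish the two tactile outputs by their dominant movement component: the same movement profile at different amplitudes (claim 7), or different movement profiles at the same amplitude (claim 8). As a rough, non-limiting sketch of one way such waveforms might be parameterized (the MovementProfile and TactileWaveform types below are hypothetical and not part of the claims):

```swift
import Foundation

// Hypothetical parameterization of a tactile output's dominant movement
// component, mirroring the profile/amplitude distinction of claims 7 and 8.
enum MovementProfile {
    case square, sawtooth, sine
}

struct TactileWaveform {
    var profile: MovementProfile  // shape of the dominant movement component
    var amplitude: Double         // peak displacement of the surface (arbitrary units)
    var period: TimeInterval      // duration of one cycle
}

// Claim 7: same movement profile, different amplitudes.
let firstOutput = TactileWaveform(profile: .square, amplitude: 1.0, period: 0.05)
let secondOutputSameProfile = TactileWaveform(profile: .square, amplitude: 0.5, period: 0.05)

// Claim 8: different movement profiles, same amplitude.
let secondOutputSameAmplitude = TactileWaveform(profile: .sawtooth, amplitude: 1.0, period: 0.05)
```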
Description
RELATED APPLICATIONS
[0001] This application is a Continuation of PCT Patent Application
Serial No. PCT/US2013/040070, filed on May 8, 2013, entitled
"Device, Method, and Graphical User Interface for Providing Tactile
Feedback for Operations Performed in a User Interface," which
claims the benefit of and priority to U.S. Provisional Patent
Application Ser. No. 61/778,284, filed on Mar. 12, 2013, entitled
"Device, Method, and Graphical User Interface for Providing Tactile
Feedback for Operations Performed in a User Interface;" U.S.
Provisional Patent Application No. 61/747,278, filed Dec. 29, 2012,
entitled "Device, Method, and Graphical User Interface for
Manipulating User Interface Objects with Visual and/or Haptic
Feedback;" and U.S. Provisional Patent Application No. 61/688,227,
filed May 9, 2012, entitled "Device, Method, and Graphical User
Interface for Manipulating User Interface Objects with Visual
and/or Haptic Feedback," which applications are incorporated by
reference herein in their entireties.
[0002] This application is also related to the following: U.S.
Provisional Patent Application Ser. No. 61/778,092, filed on Mar.
12, 2013, entitled "Device, Method, and Graphical User Interface
for Selecting Object within a Group of Objects;" U.S. Provisional
Patent Application Ser. No. 61/778,125, filed on Mar. 12, 2013,
entitled "Device, Method, and Graphical User Interface for
Navigating User Interface Hierarchies;" U.S. Provisional Patent
Application Ser. No. 61/778,156, filed on Mar. 12, 2013, entitled
"Device, Method, and Graphical User Interface for Manipulating
Framed Graphical Objects;" U.S. Provisional Patent Application Ser.
No. 61/778,179, filed on Mar. 12, 2013, entitled "Device, Method,
and Graphical User Interface for Scrolling Nested Regions;" U.S.
Provisional Patent Application Ser. No. 61/778,171, filed on Mar.
12, 2013, entitled "Device, Method, and Graphical User Interface
for Displaying Additional Information in Response to a User
Contact;" U.S. Provisional Patent Application Ser. No. 61/778,191,
filed on Mar. 12, 2013, entitled "Device, Method, and Graphical
User Interface for Displaying User Interface Objects Corresponding
to an Application;" U.S. Provisional Patent Application Ser. No.
61/778,211, filed on Mar. 12, 2013, entitled "Device, Method, and
Graphical User Interface for Facilitating User Interaction with
Controls in a User Interface;" U.S. Provisional Patent Application
Ser. No. 61/778,239, filed on Mar. 12, 2013, entitled "Device,
Method, and Graphical User Interface for Forgoing Generation of
Tactile Output for a Multi-Contact Gesture;" U.S. Provisional
Patent Application Ser. No. 61/778,287, filed on Mar. 12, 2013,
entitled "Device, Method, and Graphical User Interface for
Providing Feedback for Changing Activation States of a User
Interface Object;" U.S. Provisional Patent Application Ser. No.
61/778,363, filed on Mar. 12, 2013, entitled "Device, Method, and
Graphical User Interface for Transitioning between Touch Input to
Display Output Relationships;" U.S. Provisional Patent Application
Ser. No. 61/778,367, filed on Mar. 12, 2013, entitled "Device,
Method, and Graphical User Interface for Moving a User Interface
Object Based on an Intensity of a Press Input;" U.S. Provisional
Patent Application Ser. No. 61/778,265, filed on Mar. 12, 2013,
entitled "Device, Method, and Graphical User Interface for
Transitioning between Display States in Response to a Gesture;"
U.S. Provisional Patent Application Ser. No. 61/778,373, filed on
Mar. 12, 2013, entitled "Device, Method, and Graphical User
Interface for Managing Activation of a Control Based on Contact
Intensity;" U.S. Provisional Patent Application Ser. No.
61/778,412, filed on Mar. 13, 2013, entitled "Device, Method, and
Graphical User Interface for Displaying Content Associated with a
Corresponding Affordance;" U.S. Provisional Patent Application Ser.
No. 61/778,413, filed on Mar. 13, 2013, entitled "Device, Method,
and Graphical User Interface for Selecting User Interface Objects;"
U.S. Provisional Patent Application Ser. No. 61/778,414, filed on
Mar. 13, 2013, entitled "Device, Method, and Graphical User
Interface for Moving and Dropping a User Interface Object;" U.S.
Provisional Patent Application Ser. No. 61/778,416, filed on Mar.
13, 2013, entitled "Device, Method, and Graphical User Interface
for Determining Whether to Scroll or Select Content;" and U.S.
Provisional Patent Application Ser. No. 61/778,418, filed on Mar.
13, 2013, entitled "Device, Method, and Graphical User Interface
for Switching between User Interfaces," which are incorporated
herein by reference in their entireties.
[0003] This application is also related to the following: U.S.
Provisional Patent Application Ser. No. 61/645,033, filed on May 9,
2012, entitled "Adaptive Haptic Feedback for Electronic Devices;"
U.S. Provisional Patent Application Ser. No. 61/665,603, filed on
Jun. 28, 2012, entitled "Adaptive Haptic Feedback for Electronic
Devices;" and U.S. Provisional Patent Application Ser. No.
61/681,098, filed on Aug. 8, 2012, entitled "Adaptive Haptic
Feedback for Electronic Devices," which are incorporated herein by
reference in their entireties.
TECHNICAL FIELD
[0004] This relates generally to electronic devices with
touch-sensitive surfaces, including but not limited to electronic
devices with touch-sensitive surfaces that detect inputs for
manipulating user interfaces.
BACKGROUND
[0005] The use of touch-sensitive surfaces as input devices for
computers and other electronic computing devices has increased
significantly in recent years. Exemplary touch-sensitive surfaces
include touch pads and touch screen displays. Such surfaces are
widely used to manipulate user interface objects on a display.
[0006] Exemplary manipulations include adjusting the position
and/or size of one or more user interface objects or activating
buttons or opening files/applications represented by user interface
objects, as well as associating metadata with one or more user
interface objects or otherwise manipulating user interfaces.
Exemplary user interface objects include digital images, video,
text, icons, control elements such as buttons and other graphics. A
user will, in some circumstances, need to perform such
manipulations on user interface objects in a file management
program (e.g., Finder from Apple Inc. of Cupertino, Calif.), an
image management application (e.g., Aperture or iPhoto from Apple
Inc. of Cupertino, Calif.), a digital content (e.g., videos and
music) management application (e.g., iTunes from Apple Inc. of
Cupertino, Calif.), a drawing application, a presentation
application (e.g., Keynote from Apple Inc. of Cupertino, Calif.), a
word processing application (e.g., Pages from Apple Inc. of
Cupertino, Calif.), a website creation application (e.g., iWeb from
Apple Inc. of Cupertino, Calif.), a disk authoring application
(e.g., iDVD from Apple Inc. of Cupertino, Calif.), or a spreadsheet
application (e.g., Numbers from Apple Inc. of Cupertino,
Calif.).
[0007] But existing methods for performing these manipulations are
cumbersome and inefficient. In addition, existing methods take
longer than necessary, thereby wasting energy. This latter
consideration is particularly important in battery-operated
devices.
SUMMARY
[0008] Accordingly, there is a need for electronic devices with
faster, more efficient methods and interfaces for manipulating user
interfaces. Such methods and interfaces optionally complement or
replace conventional methods for manipulating user interfaces. Such
methods and interfaces reduce the cognitive burden on a user and
produce a more efficient human-machine interface. For
battery-operated devices, such methods and interfaces conserve
power and increase the time between battery charges.
[0009] The above deficiencies and other problems associated with
user interfaces for electronic devices with touch-sensitive
surfaces are reduced or eliminated by the disclosed devices. In
some embodiments, the device is a desktop computer. In some
embodiments, the device is portable (e.g., a notebook computer,
tablet computer, or handheld device). In some embodiments, the
device has a touchpad. In some embodiments, the device has a
touch-sensitive display (also known as a "touch screen" or "touch
screen display"). In some embodiments, the device has a graphical
user interface (GUI), one or more processors, memory and one or
more modules, programs or sets of instructions stored in the memory
for performing multiple functions. In some embodiments, the user
interacts with the GUI primarily through finger contacts and
gestures on the touch-sensitive surface. In some embodiments, the
functions optionally include image editing, drawing, presenting,
word processing, website creating, disk authoring, spreadsheet
making, game playing, telephoning, video conferencing, e-mailing,
instant messaging, workout support, digital photographing, digital
videoing, web browsing, digital music playing, and/or digital video
playing. Executable instructions for performing these functions
are, optionally, included in a non-transitory computer readable
storage medium or other computer program product configured for
execution by one or more processors.
[0010] There is a need for electronic devices with more efficient methods and interfaces for providing tactile feedback for operations performed
in a user interface. Such methods and interfaces may complement or
replace conventional methods for providing feedback for operations
performed in a user interface. Such methods and interfaces reduce
the cognitive burden on a user and produce a more efficient
human-machine interface.
[0011] In accordance with some embodiments, a method is performed
at an electronic device with a display and a touch-sensitive
surface. The method includes displaying a user interface object on
the display, detecting a contact on the touch-sensitive surface,
and detecting a first movement of the contact across the
touch-sensitive surface, the first movement corresponding to
performing an operation on the user interface object, and, in
response to detecting the first movement, performing the operation
and generating a first tactile output on the touch-sensitive
surface. The method further includes detecting a second movement of
the contact across the touch-sensitive surface, the second movement
corresponding to reversing the operation on the user interface
object, and in response to detecting the second movement, reversing
the operation and generating a second tactile output on the
touch-sensitive surface, wherein the second tactile output is
different from the first tactile output.
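A minimal sketch of this perform/reverse flow follows, assuming a hypothetical TactileOutputGenerator hook (a real device would drive a haptic actuator here) and a SnapToGuideHandler type invented for illustration:

```swift
// Hypothetical haptics hook; a real device would drive an actuator here.
protocol TactileOutputGenerator {
    func generate(_ identifier: String)
}

enum DragPhase { case performing, reversing }

struct SnapToGuideHandler {
    let haptics: TactileOutputGenerator
    private(set) var operationPerformed = false

    // Called for each detected movement of the contact; the phase is derived
    // from whether the movement corresponds to performing the operation
    // (e.g., snapping into a placement guide) or reversing it.
    mutating func handleMovement(_ phase: DragPhase) {
        switch phase {
        case .performing where !operationPerformed:
            operationPerformed = true
            haptics.generate("operation-performed")  // first tactile output
        case .reversing where operationPerformed:
            operationPerformed = false
            haptics.generate("operation-reversed")   // second, different tactile output
        default:
            break  // movement does not change the operation state
        }
    }
}
```

Because both movements can occur without liftoff, the two distinct outputs let a user feel whether the gesture has performed or undone the operation without watching the display.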
[0012] In accordance with some embodiments, an electronic device
includes a display unit configured to display a user interface
object, a touch-sensitive surface unit configured to detect user
contacts, and a processing unit coupled to the display unit and the
touch-sensitive surface unit. The processing unit is configured to
detect a contact on the touch-sensitive surface unit, detect a
first movement of the contact across the touch-sensitive surface
unit, the first movement corresponding to performing an operation
on the user interface object, and, in response to detecting the first movement, perform the operation and generate a first tactile output
on the touch-sensitive surface unit. The processing unit is further
configured to detect a second movement of the contact across the
touch-sensitive surface unit, the second movement corresponding to
reversing the operation on the user interface object, and in
response to detecting the second movement, reverse the operation
and generate a second tactile output on the touch-sensitive surface
unit, where the second tactile output is different from the first
tactile output.
[0013] Thus, electronic devices with displays and touch-sensitive
surfaces are provided with more efficient methods and interfaces for
providing tactile feedback for operations performed in a user
interface, thereby increasing the effectiveness, efficiency, and
user satisfaction with such devices. Such methods and interfaces
may complement or replace conventional methods for providing
feedback for operations performed in a user interface.
[0014] There is a need for electronic devices with faster, more
efficient methods and interfaces for indicating changes in the
z-order of user interface objects. Such methods and interfaces may
complement or replace conventional methods for indicating changes
in the z-order of user interface objects. Such methods and
interfaces reduce the cognitive burden on a user and produce a more
efficient human-machine interface. For battery-operated devices,
such methods and interfaces conserve power and increase the time
between battery charges.
[0015] In accordance with some embodiments, a method is performed
at an electronic device with a display and a touch-sensitive
surface. The method includes: displaying a plurality of user
interface objects on the display, where: the plurality of user
interface objects have a z-order, the plurality of user interface
objects includes a first user interface object and a second user
interface object, and the first user interface object is above the
second user interface object in the z-order; while detecting a
contact on the touch-sensitive surface, receiving a request to move
the first user interface object below the second user interface
object in the z-order; and in response to the request: moving the
first user interface object below the second user interface object
in the z-order; in accordance with a determination that the first
user interface object overlaps at least a portion of the second
user interface object, generating a tactile output associated with
moving the first user interface object below the second user
interface object on the touch-sensitive surface in conjunction with
moving the first user interface object below the second user
interface object; and in accordance with a determination that the
first user interface object does not overlap the second user
interface object, forgoing generating the tactile output associated
with moving the first user interface object below the second user
interface object.
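A minimal sketch of the overlap test described above, reusing the hypothetical TactileOutputGenerator and representing the z-order simply as a front-to-back array; all names are illustrative only:

```swift
import CoreGraphics

protocol TactileOutputGenerator { func generate(_ identifier: String) }

// Move `objectID` directly below `referenceID` in the z-order and emit a
// tactile output only when the two objects visibly overlap on the display.
func moveBelow(objectID: String, referenceID: String,
               zOrder: inout [String],      // front-to-back order
               frames: [String: CGRect],
               haptics: TactileOutputGenerator) {
    zOrder.removeAll { $0 == objectID }
    guard let to = zOrder.firstIndex(of: referenceID) else { return }
    zOrder.insert(objectID, at: to + 1)      // now just behind the reference object

    if let a = frames[objectID], let b = frames[referenceID], a.intersects(b) {
        haptics.generate("z-order-changed")  // change is visible: give feedback
    }
    // No overlap: the reordering has no visible effect, so forgo the output.
}
```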
[0016] In accordance with some embodiments, an electronic device
includes a display unit configured to display a plurality of user
interface objects on the display unit, where: the plurality of user
interface objects have a z-order, the plurality of user interface
objects includes a first user interface object and a second user
interface object, and the first user interface object is above the
second user interface object in the z-order; a touch-sensitive
surface unit configured to receive contacts; and a processing unit
coupled to the display unit and the touch-sensitive surface unit.
The processing unit is configured to: while detecting a contact on
the touch-sensitive surface unit, receive a request to move the
first user interface object below the second user interface object
in the z-order; and in response to the request: move the first user
interface object below the second user interface object in the
z-order; in accordance with a determination that the first user
interface object overlaps at least a portion of the second user
interface object, generate a tactile output associated with moving
the first user interface object below the second user interface
object on the touch-sensitive surface unit in conjunction with
moving the first user interface object below the second user
interface object; and in accordance with a determination that the
first user interface object does not overlap the second user
interface object, forgo generating the tactile output associated
with moving the first user interface object below the second user
interface object.
[0017] Thus, electronic devices with displays and touch-sensitive
surfaces are provided with faster, more efficient methods and
interfaces for indicating changes in the z-order of user interface
objects, thereby increasing the effectiveness, efficiency, and user
satisfaction with such devices. Such methods and interfaces may
complement or replace conventional methods for indicating changes
in the z-order of user interface objects.
[0018] There is a need for electronic devices with faster, more
efficient methods and interfaces for providing feedback when an
action will result in the adjustment of a parameter beyond a
predefined limit. Such methods and interfaces may complement or
replace conventional methods for providing feedback when an action
will result in the adjustment of a parameter beyond a predefined
limit. Such methods and interfaces reduce the cognitive burden on a
user and produce a more efficient human-machine interface. For
battery-operated devices, such methods and interfaces conserve
power and increase the time between battery charges.
[0019] In accordance with some embodiments, a method is performed
at an electronic device with a display and a touch-sensitive surface.
The method includes: displaying, on the display, a control for
controlling a parameter associated with respective content. The
method further includes: detecting a gesture on the touch-sensitive
surface for adjusting the parameter. The method further includes,
in response to detecting the gesture: determining an adjustment of
the parameter that corresponds to an extent of the gesture; in
accordance with a determination that the adjustment of the
parameter would cause one or more predefined adjustment limits to
be exceeded, generating a respective tactile output on the
touch-sensitive surface; and in accordance with a determination
that the adjustment of the parameter would not cause the one or
more predefined adjustment limits to be exceeded, performing the
adjustment of the parameter without generating the respective
tactile output on the touch-sensitive surface.
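A minimal sketch of this limit check, again with the hypothetical TactileOutputGenerator; whether the device clamps the parameter at the limit or leaves it unchanged is a design choice not fixed by this summary:

```swift
protocol TactileOutputGenerator { func generate(_ identifier: String) }

// Adjust a content parameter (e.g., a clip's trimmed length) by the extent
// of a gesture; generate a tactile output only when the requested
// adjustment would exceed a predefined limit.
func adjust(parameter: inout Double, by gestureExtent: Double,
            limits: ClosedRange<Double>, haptics: TactileOutputGenerator) {
    let requested = parameter + gestureExtent
    if limits.contains(requested) {
        parameter = requested                         // within limits: no feedback
    } else {
        parameter = min(max(requested, limits.lowerBound), limits.upperBound)
        haptics.generate("adjustment-limit-reached")  // signal the exceeded limit
    }
}
```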
[0020] In accordance with some embodiments, an electronic device
includes a display unit configured to display a control for
controlling a parameter associated with respective content; a
touch-sensitive surface unit configured to receive user contacts;
and a processing unit coupled to the display unit and the
touch-sensitive surface unit. The processing unit is configured to:
enable display of a control for controlling a parameter associated
with respective content on the display unit; and detect a gesture
on the touch-sensitive surface unit for adjusting the parameter.
The processing unit is further configured to, in response to
detecting the gesture: determine an adjustment of the parameter
that corresponds to an extent of the gesture; in accordance with a
determination that the adjustment of the parameter would cause one
or more predefined adjustment limits to be exceeded, generate a
respective tactile output on the touch-sensitive surface unit; and
in accordance with a determination that the adjustment of the
parameter would not cause the one or more predefined adjustment
limits to be exceeded, perform the adjustment of the parameter
without generating the respective tactile output on the
touch-sensitive surface unit.
[0021] Thus, electronic devices with displays and touch-sensitive
surfaces are provided with faster, more efficient methods and
interfaces for providing feedback when an action will result in the
adjustment of a parameter beyond a predefined limit, thereby
increasing the effectiveness, efficiency, and user satisfaction
with such devices. Such methods and interfaces may complement or
replace conventional methods for providing feedback when an action
will result in the adjustment of a parameter beyond a predefined
limit.
[0022] There is a need for electronic devices with more efficient methods and
interfaces for providing feedback corresponding to a clock. Such
methods and interfaces may complement or replace conventional
methods for displaying a clock. Such methods and interfaces reduce
the cognitive burden on a user and produce a more efficient
human-machine interface. For battery-operated devices, such methods
and interfaces conserve power and increase the time between battery
charges.
[0023] In accordance with some embodiments, a method is performed
at an electronic device with a display and a touch-sensitive
surface. The method includes displaying a representation of a clock
on the display, detecting movement of a focus selector over the
representation of the clock; while detecting the focus selector
over the representation of the clock, providing tactile feedback
that corresponds to the clock, where the tactile feedback includes
a regular pattern of tactile outputs on the touch-sensitive
surface. The method further includes, while providing the tactile
feedback, detecting movement of the focus selector away from the
representation of the clock, and in response to detecting movement
of the focus selector away from the representation of the clock,
ceasing to provide the tactile feedback corresponding to the
clock.
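A minimal sketch of such a regular pattern, assuming Foundation's Timer for scheduling and the hypothetical haptics hook; the alternating identifiers stand in for the two outputs of a "tick tock" pattern:

```swift
import Foundation

protocol TactileOutputGenerator { func generate(_ identifier: String) }

// While the focus selector is over the clock, emit a regular, alternating
// pattern of tactile outputs; stop as soon as the selector moves away.
final class ClockFeedback {
    private var timer: Timer?
    private var tick = true
    private let haptics: TactileOutputGenerator

    init(haptics: TactileOutputGenerator) { self.haptics = haptics }

    func focusSelectorEnteredClock() {
        timer = Timer.scheduledTimer(withTimeInterval: 1.0, repeats: true) { [weak self] _ in
            guard let self = self else { return }
            self.haptics.generate(self.tick ? "tick" : "tock")
            self.tick.toggle()
        }
    }

    func focusSelectorLeftClock() {
        timer?.invalidate()  // cease the tactile feedback
        timer = nil
    }
}
```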
[0024] In accordance with some embodiments, an electronic device
includes a display unit configured to display a representation of a
clock, a touch-sensitive surface unit, and a processing unit
coupled to the display unit and the touch-sensitive surface unit.
The processing unit is configured to: detect movement of a focus selector over the representation of the clock and, while detecting the focus selector over the representation of the clock, provide
tactile feedback that corresponds to the clock, where the tactile
feedback includes a regular pattern of tactile outputs on the
touch-sensitive surface unit. The processing unit is further
configured to, while providing the tactile feedback, detect
movement of the focus selector away from the representation of the
clock, and in response to detecting movement of the focus selector
away from the representation of the clock, cease to provide the
tactile feedback corresponding to the clock.
[0025] Thus, electronic devices with displays and touch-sensitive
surfaces are provided with more efficient methods and interfaces for
providing feedback corresponding to a clock, thereby increasing the
effectiveness, efficiency, and user satisfaction with such devices.
Such methods and interfaces may complement or replace conventional
methods for providing feedback corresponding to a clock.
[0026] There is a need for electronic devices with faster, more
efficient methods and interfaces for providing feedback that
corresponds to beats of a piece of music. Such methods and
interfaces may complement or replace conventional methods for
providing feedback that corresponds to beats of a piece of music.
Such methods and interfaces reduce the cognitive burden on a user
and produce a more efficient human-machine interface. For
battery-operated devices, such methods and interfaces conserve
power and increase the time between battery charges.
[0027] In accordance with some embodiments, a method is performed
at an electronic device with a display and a touch-sensitive surface.
The method includes displaying a representation of a piece of music
on the display. The method further includes detecting movement of a
focus selector over the representation of the piece of music. The
method further includes, while detecting the focus selector over
the representation of the piece of music, providing tactile
feedback that corresponds to at least a subset of beats of the
piece of music. The method further includes, after providing the
tactile feedback, detecting movement of the focus selector away
from the representation of the piece of music. The method further
includes, in response to detecting movement of the focus selector
away from the representation of the piece of music, ceasing to
provide the tactile feedback that corresponds to the beats of the
piece of music.
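A minimal sketch of beat-aligned feedback, assuming precomputed beat times for the piece and the hypothetical haptics hook; taking the "subset of beats" to be the downbeat of each measure is one plausible reading, not the only one:

```swift
import Foundation

protocol TactileOutputGenerator { func generate(_ identifier: String) }

// Schedule a tactile output for the downbeat of each measure while the
// focus selector is over the representation of the piece of music.
func scheduleBeatFeedback(beatTimes: [TimeInterval], beatsPerMeasure: Int,
                          haptics: TactileOutputGenerator) -> [Timer] {
    return beatTimes.enumerated()
        .filter { $0.offset % beatsPerMeasure == 0 }  // subset: downbeats only
        .map { pair in
            Timer.scheduledTimer(withTimeInterval: pair.element, repeats: false) { _ in
                haptics.generate("beat")
            }
        }
}

// When the focus selector moves away from the representation, invalidate
// the returned timers to cease the feedback.
```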
[0028] In accordance with some embodiments, an electronic device
includes a display unit configured to display a representation of a
piece of music; a touch-sensitive surface unit configured to
receive user contacts; and a processing unit coupled to the display
unit and the touch-sensitive surface unit. The processing unit is
configured to: enable display of a representation of a piece of
music; and detect movement of a focus selector over the
representation of the piece of music. The processing unit is
further configured to, while detecting the focus selector over the
representation of the piece of music, provide tactile feedback that
corresponds to at least a subset of beats of the piece of music.
The processing unit is further configured to, after providing the
tactile feedback, detect movement of the focus selector away from
the representation of the piece of music. The processing unit is
further configured to, in response to detecting movement of the
focus selector away from the representation of the piece of music,
cease to provide the tactile feedback that corresponds to the beats
of the piece of music.
[0029] Thus, electronic devices with displays and touch-sensitive
surfaces are provided with faster, more efficient methods and
interfaces for providing feedback that corresponds to beats of a
piece of music, thereby increasing the effectiveness, efficiency,
and user satisfaction with such devices. Such methods and
interfaces may complement or replace conventional methods for
providing feedback that corresponds to beats of a piece of
music.
[0030] In accordance with some embodiments, an electronic device
includes a display, a touch-sensitive surface, optionally one or
more sensors to detect intensity of contacts with the
touch-sensitive surface, one or more processors, memory, and one or
more programs; the one or more programs are stored in the memory
and configured to be executed by the one or more processors and the
one or more programs include instructions for performing the
operations of any of the methods referred to in paragraph [0058].
In accordance with some embodiments, a graphical user interface on
an electronic device with a display, a touch-sensitive surface,
optionally one or more sensors to detect intensity of contacts with
the touch-sensitive surface, a memory, and one or more processors
to execute one or more programs stored in the memory includes one
or more of the elements displayed in any of the methods referred to
in paragraph [0058], which are updated in response to inputs, as
described in any of the methods referred to in paragraph [0058]. In
accordance with some embodiments, a computer readable storage
medium has stored therein instructions which when executed by an
electronic device with a display, a touch-sensitive surface, and
optionally one or more sensors to detect intensity of contacts with
the touch-sensitive surface, cause the device to perform the
operations of any of the methods referred to in paragraph [0058].
In accordance with some embodiments, an electronic device includes:
a display, a touch-sensitive surface, and optionally one or more
sensors to detect intensity of contacts with the touch-sensitive
surface; and means for performing the operations of any of the
methods referred to in paragraph [0058]. In accordance with some
embodiments, an information processing apparatus, for use in an
electronic device with a display and a touch-sensitive surface,
optionally one or more sensors to detect intensity of contacts with
the touch-sensitive surface, includes means for performing the
operations of any of the methods referred to in paragraph
[0058].
BRIEF DESCRIPTION OF THE DRAWINGS
[0031] For a better understanding of the various described
embodiments, reference should be made to the Description of
Embodiments below, in conjunction with the following drawings in
which like reference numerals refer to corresponding parts
throughout the figures.
[0032] FIG. 1A is a block diagram illustrating a portable
multifunction device with a touch-sensitive display in accordance
with some embodiments.
[0033] FIG. 1B is a block diagram illustrating exemplary components
for event handling in accordance with some embodiments.
[0034] FIG. 2 illustrates a portable multifunction device having a
touch screen in accordance with some embodiments.
[0035] FIG. 3 is a block diagram of an exemplary multifunction
device with a display and a touch-sensitive surface in accordance
with some embodiments.
[0036] FIG. 4A illustrates an exemplary user interface for a menu
of applications on a portable multifunction device in accordance
with some embodiments.
[0037] FIG. 4B illustrates an exemplary user interface for a
multifunction device with a touch-sensitive surface that is
separate from the display in accordance with some embodiments.
[0038] FIGS. 5A-5O illustrate exemplary user interfaces for
providing tactile feedback for operations performed in a user
interface in accordance with some embodiments.
[0039] FIGS. 6A-6C are flow diagrams illustrating a method of
providing tactile feedback for operations performed in a user
interface in accordance with some embodiments.
[0040] FIG. 7 is a functional block diagram of an electronic device
in accordance with some embodiments.
[0041] FIGS. 8A-8S illustrate exemplary user interfaces for
indicating changes in z-order of user interface objects in
accordance with some embodiments.
[0042] FIGS. 9A-9D are flow diagrams illustrating a method of
indicating changes in z-order of user interface objects in
accordance with some embodiments.
[0043] FIG. 10 is a functional block diagram of an electronic
device in accordance with some embodiments.
[0044] FIGS. 11A-11T illustrate exemplary user interfaces for
providing feedback when an action will result in the adjustment of
a parameter beyond a predefined limit in accordance with some
embodiments.
[0045] FIGS. 12A-12B are flow diagrams illustrating a method of
providing feedback when an action will result in the adjustment of
a parameter beyond a predefined limit in accordance with some
embodiments.
[0046] FIG. 13 is a functional block diagram of an electronic
device in accordance with some embodiments.
[0047] FIGS. 14A-14J illustrate exemplary user interfaces for
providing tactile feedback corresponding to a clock in accordance
with some embodiments.
[0048] FIGS. 15A-15B are flow diagrams illustrating a method of
providing tactile feedback corresponding to a clock in accordance
with some embodiments.
[0049] FIG. 16 is a functional block diagram of an electronic
device in accordance with some embodiments.
[0050] FIGS. 17A-17L illustrate exemplary user interfaces for
providing feedback that corresponds to beats of a piece of music in
accordance with some embodiments.
[0051] FIGS. 17M-17O illustrate exemplary waveforms of movement
profiles for generating tactile outputs in accordance with some
embodiments.
[0052] FIGS. 18A-18B are flow diagrams illustrating a method of
providing feedback that corresponds to beats of a piece of music in
accordance with some embodiments.
[0053] FIG. 19 is a functional block diagram of an electronic
device in accordance with some embodiments.
DESCRIPTION OF EMBODIMENTS
[0054] The methods, devices and GUIs described herein provide
visual and/or haptic feedback that makes manipulation of user
interface objects more efficient and intuitive for a user. For
example, in a system where the clicking action of a trackpad is
decoupled from the contact intensity (e.g., contact force, contact
pressure, or a substitute therefor) that is needed to reach an
activation threshold, the device can generate different tactile
outputs (e.g., "different clicks") for different activation events
(e.g., so that clicks that accomplish a particular result are
differentiated from clicks that do not produce any result or that
accomplish a different result from the particular result).
Additionally, tactile outputs can be generated in response to other
events that are not related to increasing intensity of a contact,
such as generating a tactile output (e.g., a "detent") when a user
interface object is moved to a particular position, boundary or
orientation, or when an event occurs at the device.
[0055] Additionally, in a system where a trackpad or touch-screen
display is sensitive to a range of contact intensity that includes
more than one or two specific intensity values (e.g., more than a
simple on/off, binary intensity determination), the user interface
can provide responses (e.g., visual or tactile cues) that are
indicative of the intensity of the contact within the range. In
some implementations, a pre-activation-threshold response and/or a
post-activation-threshold response to an input are displayed as
continuous animations. As one example of such a response, a preview
of an operation is displayed in response to detecting an increase
in contact intensity that is still below an activation threshold
for performing the operation. As another example of such a
response, an animation associated with an operation continues even
after the activation threshold for the operation has been reached.
Both of these examples provide a user with a continuous response to
the force or pressure of a user's contact, which provides a user
with visual and/or haptic feedback that is richer and more
intuitive. More specifically, such continuous force responses give
the user the experience of being able to press lightly to preview
an operation and/or press deeply to push "past" or "through" a
predefined user interface state corresponding to the operation.
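A minimal sketch of such a continuous, pre-activation response; the linear mapping from intensity to preview progress is an assumption for illustration:

```swift
// Map contact intensity to preview-animation progress: below the activation
// threshold the preview advances continuously; at or above it, the operation
// is fully activated. Assumes activationThreshold > 0.
func previewProgress(intensity: Double, activationThreshold: Double) -> Double {
    return min(max(intensity / activationThreshold, 0.0), 1.0)  // 0 = none, 1 = activated
}

// Example: scale an icon toward its "open" state as the user presses harder.
let iconScale = 1.0 + 0.2 * previewProgress(intensity: 0.3, activationThreshold: 0.6)
```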
[0056] Additionally, for a device with a touch-sensitive surface
that is sensitive to a range of contact intensity, multiple contact
intensity thresholds can be monitored by the device and different
functions can be mapped to different contact intensity thresholds.
This serves to increase the available "gesture space," providing
easy access to advanced features for users who know that increasing
the intensity of a contact at or beyond a second "deep press"
intensity threshold will cause the device to perform a different
operation from an operation that would be performed if the
intensity of the contact is between a first "activation" intensity
threshold and the second "deep press" intensity threshold. An
advantage of assigning additional functionality to a second "deep
press" intensity threshold while maintaining familiar functionality
at a first "activation" intensity threshold is that inexperienced
users who are, in some circumstances, confused by the additional
functionality can use the familiar functionality by just applying
an intensity up to the first "activation" intensity threshold,
whereas more experienced users can take advantage of the additional
functionality by applying an intensity at the second "deep press"
intensity threshold.
[0057] Additionally, for a device with a touch-sensitive surface
that is sensitive to a range of contact intensity, the device can
provide additional functionality by allowing users to perform
complex operations with a single continuous contact. For example,
when selecting a group of objects, a user can move a continuous
contact around the touch-sensitive surface and can press while
dragging (e.g., applying an intensity greater than a "deep press"
intensity threshold) to add additional elements to a selection. In
this way, a user can intuitively interact with a user interface
where pressing harder with a contact causes objects in the user
interface to be "stickier."
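A minimal sketch of press-while-dragging selection, with a hypothetical DragSelection type; the intensity units and deep-press threshold value are placeholders:

```swift
// During a single continuous drag, a press above the "deep press" threshold
// adds the object currently under the contact to the selection, with no
// liftoff required.
struct DragSelection {
    private(set) var selected: Set<String> = []
    let deepPressThreshold: Double

    mutating func update(objectUnderContact: String?, intensity: Double) {
        if let object = objectUnderContact, intensity >= deepPressThreshold {
            selected.insert(object)  // pressing harder makes objects "stickier"
        }
    }
}

var selection = DragSelection(deepPressThreshold: 0.8)
selection.update(objectUnderContact: "photo-3", intensity: 0.9)  // added
selection.update(objectUnderContact: "photo-4", intensity: 0.4)  // below threshold: ignored
```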
[0058] A number of different approaches to providing an intuitive
user interface on a device where a clicking action is decoupled
from the force that is needed to reach an activation threshold
and/or the device is sensitive to a wide range of contact
intensities are described below. Using one or more of these
approaches (optionally in conjunction with each other) helps to
provide a user interface that intuitively provides users with
additional information and functionality, thereby reducing the
user's cognitive burden and improving the human-machine interface.
Such improvements in the human-machine interface enable users to
use the device faster and more efficiently. For battery-operated
devices, these improvements conserve power and increase the time
between battery charges. For ease of explanation, systems, methods and user interfaces that include illustrative examples of some of these approaches are described below, as follows: [0059] Many
electronic devices have graphical user interfaces that include user
interface objects. There are usually many operations which can be
performed on the interface objects. For example, an interface
object can be snapped to a guideline or removed from a guideline.
Some user interfaces provide visual feedback indicating whether an
operation has been performed or reversed. However, in some
situations, a user will not notice the visual feedback and thus
will be confused as to whether the operation has been performed or
reversed. The embodiments described below improve on these methods
by generating tactile outputs for the user corresponding to the
operations performed, thereby providing a more convenient and
efficient user interface. In particular, FIGS. 5A-5O illustrate
exemplary user interfaces for providing tactile feedback for
operations performed in a user interface. FIGS. 6A-6C are flow
diagrams illustrating a method of providing tactile feedback for
operations performed in a user interface. The user interfaces in
FIGS. 5A-5O are further used to illustrate the processes described
below with reference to FIGS. 6A-6C. [0060] Many electronic devices
display user interface objects that have a layer order (e.g., a
z-order or front-to-back order of the user interface objects). In
some circumstances, a user interacts with such objects by
repositioning them on the display, and overlapping objects are
displayed on the display in accordance with their front-to-back
order (e.g., an object that is "in front" of another object is
displayed where the two objects overlap). In addition to
repositioning the objects on the display, a user often wants to
change the front-to-back order of the objects on the display. In
some methods, changes in the z-order are indicated with visual
feedback. However, in some situations, a user will not notice the
visual feedback and thus will be confused as to whether the
operation has been performed. The embodiments described below
improve on these methods by providing for tactile outputs when
objects overlap each other and their z-order changes, thereby
providing a more convenient and efficient user interface. In
particular, FIGS. 8A-8S illustrate exemplary user interfaces for
indicating changes in the z-order of user interface objects. FIGS.
9A-9D are flow diagrams illustrating a method of indicating changes
in the z-order of user interface objects. The user interfaces in
FIGS. 8A-8S are used to illustrate the processes in FIGS. 9A-9D.
[0061] Many electronic devices have graphical user interfaces that
display user interface objects that can be manipulated by adjusting
one or more associated parameters, such as the size of a user interface object. For practical reasons, predefined adjustment limits are commonly assigned to some of these parameters, limiting the extent to which their properties can be adjusted. Some user interfaces provide visual
feedback indicating whether a predefined adjustment limit has been
exceeded. However, in some situations, a user will not notice the
visual feedback and thus will be confused as to whether or not the
predefined adjustment limit has been exceeded. The embodiments
described below provide improved methods and user interfaces for
generating feedback to a user navigating a complex user interface
by providing tactile feedback when an action will result in the
adjustment of a parameter beyond a predefined adjustment limit,
thereby providing a more convenient and efficient user interface.
In particular, FIGS. 11A-11T illustrate exemplary user interfaces
for providing feedback when an action will result in the adjustment
of a parameter beyond a predefined limit. FIGS. 12A-12B are flow
diagrams illustrating a method of providing feedback when an action
will result in the adjustment of a parameter beyond a predefined
limit. The user interfaces in FIGS. 11A-11T are used to illustrate
the processes in FIGS. 12A-12B. [0062] Many electronic devices have
graphical user interfaces that include a representation of a clock.
There is often a need to provide efficient and convenient ways for
users to receive feedback corresponding to the clock. Some user
interfaces provide visual feedback indicating advancement of time
on a clock. However, in some situations, a user will look away from
the clock or be distracted and will not be able to pay attention to
the visual feedback while performing another task. The embodiments
below improve on these methods by generating tactile outputs
for the user that correspond to the clock (e.g., a `tick tock`
pattern of tactile outputs), thereby providing a more convenient
and efficient user interface by enabling the user to pay attention
to a different visual element while monitoring the advancement of
time on the clock. In particular, FIGS. 14A-14J illustrate
exemplary user interfaces for providing tactile feedback
corresponding to a clock in accordance with some embodiments. The
user interfaces in these figures are used to illustrate the
processes described below, including the processes described below
with reference to FIGS. 15A-15B. [0063] Many electronic devices
have graphical user interfaces that display application windows
showing representations of a piece of music (e.g., a graphical
representation of a piece of cover art for an album of the piece of
music, a region indicating that a piece of music is currently being played, or notes of a piece of music in a graphical
representation of a music score corresponding to a piece of music).
Given the complexity of a user interface environment that includes
application windows corresponding to applications having both audio
and visual components (e.g., music playback, music composition,
video playback or video composition applications), there is a need
to provide feedback that enables the user to more efficiently and
conveniently navigate through the user interface environment. Some
user interfaces provide visual feedback indicating notes of a piece
of music. However, in some situations, a user will look away from
the region of the user interface providing visual feedback
indicating notes of a piece of music or be distracted and will not
be able to pay attention to the visual feedback while performing
another task. The embodiments described below provide improved
methods and user interfaces for generating feedback to a user
navigating a complex user interface environment by generating
tactile outputs corresponding to notes in a piece of music, thereby
providing a more convenient and efficient user interface by
enabling the user to pay attention to a different visual element
while monitoring the notes in the piece of music. More
specifically, these methods and user interfaces provide feedback
that corresponds to beats of a piece of music represented on a
display. Below, FIGS. 17A-17L illustrate exemplary user interfaces
for providing feedback that corresponds to beats of a piece of
music. FIGS. 18A-18B are flow diagrams illustrating a method of
providing feedback that corresponds to beats of a piece of music.
The user interfaces in FIGS. 17A-17L are used to illustrate the
processes in FIGS. 18A-18B.
Exemplary Devices
[0064] Reference will now be made in detail to embodiments,
examples of which are illustrated in the accompanying drawings. In
the following detailed description, numerous specific details are
set forth in order to provide a thorough understanding of the
various described embodiments. However, it will be apparent to one
of ordinary skill in the art that the various described embodiments
may be practiced without these specific details. In other
instances, well-known methods, procedures, components, circuits,
and networks have not been described in detail so as not to
unnecessarily obscure aspects of the embodiments.
[0065] It will also be understood that, although the terms first,
second, etc. are, in some instances, used herein to describe
various elements, these elements should not be limited by these
terms. These terms are only used to distinguish one element from
another. For example, a first contact could be termed a second
contact, and, similarly, a second contact could be termed a first
contact, without departing from the scope of the various described
embodiments. The first contact and the second contact are both
contacts, but they are not the same contact.
[0066] The terminology used in the description of the various
described embodiments herein is for the purpose of describing
particular embodiments only and is not intended to be limiting. As
used in the description of the various described embodiments and
the appended claims, the singular forms "a", "an" and "the" are
intended to include the plural forms as well, unless the context
clearly indicates otherwise. It will also be understood that the
term "and/or" as used herein refers to and encompasses any and all
possible combinations of one or more of the associated listed
items. It will be further understood that the terms "includes,"
"including," "comprises," and/or "comprising," when used in this
specification, specify the presence of stated features, integers,
steps, operations, elements, and/or components, but do not preclude
the presence or addition of one or more other features, integers,
steps, operations, elements, components, and/or groups thereof.
[0067] As used herein, the term "if" is, optionally, construed to
mean "when" or "upon" or "in response to determining" or "in
response to detecting," depending on the context. Similarly, the
phrase "if it is determined" or "if [a stated condition or event]
is detected" is, optionally, construed to mean "upon determining"
or "in response to determining" or "upon detecting [the stated
condition or event]" or "in response to detecting [the stated
condition or event]," depending on the context.
[0068] Embodiments of electronic devices, user interfaces for such
devices, and associated processes for using such devices are
described. In some embodiments, the device is a portable
communications device, such as a mobile telephone, that also
contains other functions, such as PDA and/or music player
functions. Exemplary embodiments of portable multifunction devices
include, without limitation, the iPhone.RTM., iPod Touch.RTM., and
iPad.RTM. devices from Apple Inc. of Cupertino, Calif. Other
portable electronic devices, such as laptops or tablet computers
with touch-sensitive surfaces (e.g., touch screen displays and/or
touch pads), are, optionally, used. It should also be understood
that, in some embodiments, the device is not a portable
communications device, but is a desktop computer with a
touch-sensitive surface (e.g., a touch screen display and/or a
touch pad).
[0069] In the discussion that follows, an electronic device that
includes a display and a touch-sensitive surface is described. It
should be understood, however, that the electronic device
optionally includes one or more other physical user-interface
devices, such as a physical keyboard, a mouse and/or a
joystick.
[0070] The device typically supports a variety of applications,
such as one or more of the following: a drawing application, a
presentation application, a word processing application, a website
creation application, a disk authoring application, a spreadsheet
application, a gaming application, a telephone application, a video
conferencing application, an e-mail application, an instant
messaging application, a workout support application, a photo
management application, a digital camera application, a digital
video camera application, a web browsing application, a digital
music player application, and/or a digital video player
application.
[0071] The various applications that are executed on the device
optionally use at least one common physical user-interface device,
such as the touch-sensitive surface. One or more functions of the
touch-sensitive surface as well as corresponding information
displayed on the device are, optionally, adjusted and/or varied
from one application to the next and/or within a respective
application. In this way, a common physical architecture (such as
the touch-sensitive surface) of the device optionally supports the
variety of applications with user interfaces that are intuitive and
transparent to the user.
[0072] Attention is now directed toward embodiments of portable
devices with touch-sensitive displays. FIG. 1A is a block diagram
illustrating portable multifunction device 100 with touch-sensitive
displays 112 in accordance with some embodiments. Touch-sensitive
display 112 is sometimes called a "touch screen" for convenience,
and is sometimes known as or called a touch-sensitive display
system. Device 100 includes memory 102 (which optionally includes
one or more computer readable storage mediums), memory controller
122, one or more processing units (CPU's) 120, peripherals
interface 118, RF circuitry 108, audio circuitry 110, speaker 111,
microphone 113, input/output (I/O) subsystem 106, other input or
control devices 116, and external port 124. Device 100 optionally
includes one or more optical sensors 164. Device 100 optionally
includes one or more intensity sensors 165 for detecting intensity
of contacts on device 100 (e.g., a touch-sensitive surface such as
touch-sensitive display system 112 of device 100). Device 100
optionally includes one or more tactile output generators 167 for
generating tactile outputs on device 100 (e.g., generating tactile
outputs on a touch-sensitive surface such as touch-sensitive
display system 112 of device 100 or touchpad 355 of device 300).
These components optionally communicate over one or more
communication buses or signal lines 103.
[0073] As used in the specification and claims, the term
"intensity" of a contact on a touch-sensitive surface refers to the
force or pressure (force per unit area) of a contact (e.g., a
finger contact) on the touch-sensitive surface, or to a substitute
(proxy) for the force or pressure of a contact on the touch-sensitive
surface. The intensity of a contact has a range of values
that includes at least four distinct values and more typically
includes hundreds of distinct values (e.g., at least 256).
Intensity of a contact is, optionally, determined (or measured)
using various approaches and various sensors or combinations of
sensors. For example, one or more force sensors underneath or
adjacent to the touch-sensitive surface are, optionally, used to
measure force at various points on the touch-sensitive surface. In
some implementations, force measurements from multiple force
sensors are combined (e.g., a weighted average) to determine an
estimated force of a contact. Similarly, a pressure-sensitive tip
of a stylus is, optionally, used to determine a pressure of the
stylus on the touch-sensitive surface. Alternatively, the size of
the contact area detected on the touch-sensitive surface and/or
changes thereto, the capacitance of the touch-sensitive surface
proximate to the contact and/or changes thereto, and/or the
resistance of the touch-sensitive surface proximate to the contact
and/or changes thereto are, optionally, used as a substitute for
the force or pressure of the contact on the touch-sensitive
surface. In some implementations, the substitute measurements for
contact force or pressure are used directly to determine whether an
intensity threshold has been exceeded (e.g., the intensity
threshold is described in units corresponding to the substitute
measurements). In some implementations, the substitute measurements
for contact force or pressure are converted to an estimated force
or pressure and the estimated force or pressure is used to
determine whether an intensity threshold has been exceeded (e.g.,
the intensity threshold is a pressure threshold measured in units
of pressure).
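For illustration only, the weighted-average force estimate and the
threshold comparison described in this paragraph can be sketched in
Swift; the ForceSensorReading type, its fields, and the threshold units
are assumptions introduced here, not part of the application:

// One reading from one force sensor underneath or adjacent to the
// touch-sensitive surface; closer sensors get larger weights when
// estimating the force of a particular contact.
struct ForceSensorReading {
    let force: Double   // raw force reported by the sensor
    let weight: Double  // e.g., inversely related to distance from contact
}

// Combine measurements from multiple force sensors (a weighted average)
// into a single estimated force for the contact.
func estimatedContactForce(from readings: [ForceSensorReading]) -> Double {
    let totalWeight = readings.reduce(0) { $0 + $1.weight }
    guard totalWeight > 0 else { return 0 }
    let weightedSum = readings.reduce(0) { $0 + $1.force * $1.weight }
    return weightedSum / totalWeight
}

// When the intensity threshold is described in the same units as the
// (possibly substitute) measurement, no conversion is required.
func exceedsIntensityThreshold(readings: [ForceSensorReading],
                               threshold: Double) -> Bool {
    return estimatedContactForce(from: readings) >= threshold
}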
[0074] As used in the specification and claims, the term "tactile
output" refers to physical displacement of a device relative to a
previous position of the device, physical displacement of a
component (e.g., a touch-sensitive surface) of a device relative to
another component (e.g., housing) of the device, or displacement of
the component relative to a center of mass of the device that will
be detected by a user with the user's sense of touch. For example,
in situations where the device or the component of the device is in
contact with a surface of a user that is sensitive to touch (e.g.,
a finger, palm, or other part of a user's hand), the tactile output
generated by the physical displacement will be interpreted by the
user as a tactile sensation corresponding to a perceived change in
physical characteristics of the device or the component of the
device. For example, movement of a touch-sensitive surface (e.g., a
touch-sensitive display or trackpad) is, optionally, interpreted by
the user as a "down click" or "up click" of a physical actuator
button. In some cases, a user will feel a tactile sensation such as a
"down click" or "up click" even when there is no movement of a
physical actuator button associated with the touch-sensitive
surface that is physically pressed (e.g., displaced) by the user's
movements. As another example, movement of the touch-sensitive
surface is, optionally, interpreted or sensed by the user as
"roughness" of the touch-sensitive surface, even when there is no
change in smoothness of the touch-sensitive surface. While such
interpretations of touch by a user will be subject to the
individualized sensory perceptions of the user, there are many
sensory perceptions of touch that are common to a large majority of
users. Thus, when a tactile output is described as corresponding to
a particular sensory perception of a user (e.g., an "up click," a
"down click," "roughness"), unless otherwise stated, the generated
tactile output corresponds to physical displacement of the device
or a component thereof that will generate the described sensory
perception for a typical (or average) user.
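For illustration only, the idea that distinct physical displacements
produce distinguishable sensations (e.g., a "down click" versus an "up
click") can be sketched in Swift as two different drive profiles; the
TactileWaveform type and all parameter values are assumptions
introduced here:

// A drive profile for a tactile output generator; amplitude, duration,
// and direction of surface motion are the illustrative knobs.
struct TactileWaveform {
    let amplitude: Double   // relative surface displacement
    let durationMs: Double  // how long the actuator is driven
    let lateral: Bool       // true: in-plane motion; false: out of plane
}

// Different profiles so that a simulated "down click" feels different
// from a simulated "up click", even though no physical button moves.
let downClick = TactileWaveform(amplitude: 1.0, durationMs: 10, lateral: false)
let upClick   = TactileWaveform(amplitude: 0.6, durationMs: 8,  lateral: false)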
[0075] It should be appreciated that device 100 is only one example
of a portable multifunction device, and that device 100 optionally
has more or fewer components than shown, optionally combines two or
more components, or optionally has a different configuration or
arrangement of the components. The various components shown in FIG.
1A are implemented in hardware, software, or a combination of both
hardware and software, including one or more signal processing
and/or application specific integrated circuits.
[0076] Memory 102 optionally includes high-speed random access
memory and optionally also includes non-volatile memory, such as
one or more magnetic disk storage devices, flash memory devices, or
other non-volatile solid-state memory devices. Access to memory 102
by other components of device 100, such as CPU 120 and the
peripherals interface 118, is, optionally, controlled by memory
controller 122.
[0077] Peripherals interface 118 can be used to couple input and
output peripherals of the device to CPU 120 and memory 102. The one
or more processors 120 run or execute various software programs
and/or sets of instructions stored in memory 102 to perform various
functions for device 100 and to process data.
[0078] In some embodiments, peripherals interface 118, CPU 120, and
memory controller 122 are, optionally, implemented on a single
chip, such as chip 104. In some other embodiments, they are,
optionally, implemented on separate chips.
[0079] RF (radio frequency) circuitry 108 receives and sends RF
signals, also called electromagnetic signals. RF circuitry 108
converts electrical signals to/from electromagnetic signals and
communicates with communications networks and other communications
devices via the electromagnetic signals. RF circuitry 108
optionally includes well-known circuitry for performing these
functions, including but not limited to an antenna system, an RF
transceiver, one or more amplifiers, a tuner, one or more
oscillators, a digital signal processor, a CODEC chipset, a
subscriber identity module (SIM) card, memory, and so forth. RF
circuitry 108 optionally communicates with networks, such as the
Internet, also referred to as the World Wide Web (WWW), an intranet
and/or a wireless network, such as a cellular telephone network, a
wireless local area network (LAN) and/or a metropolitan area
network (MAN), and other devices by wireless communication. The
wireless communication optionally uses any of a plurality of
communications standards, protocols and technologies, including but
not limited to Global System for Mobile Communications (GSM),
Enhanced Data GSM Environment (EDGE), high-speed downlink packet
access (HSDPA), high-speed uplink packet access (HSUPA), Evolution-Data
Only (EV-DO), HSPA, HSPA+, Dual-Cell HSDPA (DC-HSDPA), long
term evolution (LTE), near field communication (NFC), wideband code
division multiple access (W-CDMA), code division multiple access
(CDMA), time division multiple access (TDMA), Bluetooth, Wireless
Fidelity (Wi-Fi) (e.g., IEEE 802.11a, IEEE 802.11b, IEEE 802.11g
and/or IEEE 802.11n), voice over Internet Protocol (VoIP), Wi-MAX,
a protocol for e-mail (e.g., Internet message access protocol
(IMAP) and/or post office protocol (POP)), instant messaging (e.g.,
extensible messaging and presence protocol (XMPP), Session
Initiation Protocol for Instant Messaging and Presence Leveraging
Extensions (SIMPLE), Instant Messaging and Presence Service
(IMPS)), and/or Short Message Service (SMS), or any other suitable
communication protocol, including communication protocols not yet
developed as of the filing date of this document.
[0080] Audio circuitry 110, speaker 111, and microphone 113 provide
an audio interface between a user and device 100. Audio circuitry
110 receives audio data from peripherals interface 118, converts
the audio data to an electrical signal, and transmits the
electrical signal to speaker 111. Speaker 111 converts the
electrical signal to human-audible sound waves. Audio circuitry 110
also receives electrical signals converted by microphone 113 from
sound waves. Audio circuitry 110 converts the electrical signal to
audio data and transmits the audio data to peripherals interface
118 for processing. Audio data is, optionally, retrieved from
and/or transmitted to memory 102 and/or RF circuitry 108 by
peripherals interface 118. In some embodiments, audio circuitry 110
also includes a headset jack (e.g., 212, FIG. 2). The headset jack
provides an interface between audio circuitry 110 and removable
audio input/output peripherals, such as output-only headphones or a
headset with both output (e.g., a headphone for one or both ears)
and input (e.g., a microphone).
[0081] I/O subsystem 106 couples input/output peripherals on device
100, such as touch screen 112 and other input control devices 116,
to peripherals interface 118. I/O subsystem 106 optionally includes
display controller 156, optical sensor controller 158, intensity
sensor controller 159, haptic feedback controller 161 and one or
more input controllers 160 for other input or control devices. The
one or more input controllers 160 receive/send electrical signals
from/to other input or control devices 116. The other input control
devices 116 optionally include physical buttons (e.g., push
buttons, rocker buttons, etc.), dials, slider switches, joysticks,
click wheels, and so forth. In some alternate embodiments, input
controller(s) 160 are, optionally, coupled to any (or none) of the
following: a keyboard, infrared port, USB port, and a pointer
device such as a mouse. The one or more buttons (e.g., 208, FIG. 2)
optionally include an up/down button for volume control of speaker
111 and/or microphone 113. The one or more buttons optionally
include a push button (e.g., 206, FIG. 2).
[0082] Touch-sensitive display 112 provides an input interface and
an output interface between the device and a user. Display
controller 156 receives and/or sends electrical signals from/to
touch screen 112. Touch screen 112 displays visual output to the
user. The visual output optionally includes graphics, text, icons,
video, and any combination thereof (collectively termed
"graphics"). In some embodiments, some or all of the visual output
corresponds to user-interface objects.
[0083] Touch screen 112 has a touch-sensitive surface, sensor or
set of sensors that accepts input from the user based on haptic
and/or tactile contact. Touch screen 112 and display controller 156
(along with any associated modules and/or sets of instructions in
memory 102) detect contact (and any movement or breaking of the
contact) on touch screen 112 and convert the detected contact into
interaction with user-interface objects (e.g., one or more soft
keys, icons, web pages or images) that are displayed on touch
screen 112. In an exemplary embodiment, a point of contact between
touch screen 112 and the user corresponds to a finger of the
user.
[0084] Touch screen 112 optionally uses LCD (liquid crystal
display) technology, LPD (light emitting polymer display)
technology, or LED (light emitting diode) technology, although
other display technologies are used in other embodiments. Touch
screen 112 and display controller 156 optionally detect contact and
any movement or breaking thereof using any of a plurality of touch
sensing technologies now known or later developed, including but
not limited to capacitive, resistive, infrared, and surface
acoustic wave technologies, as well as other proximity sensor
arrays or other elements for determining one or more points of
contact with touch screen 112. In an exemplary embodiment,
projected mutual capacitance sensing technology is used, such as
that found in the iPhone.RTM., iPod Touch.RTM., and iPad.RTM. from
Apple Inc. of Cupertino, Calif.
[0085] Touch screen 112 optionally has a video resolution in excess
of 100 dpi. In some embodiments, the touch screen has a video
resolution of approximately 160 dpi. The user optionally makes
contact with touch screen 112 using any suitable object or
appendage, such as a stylus, a finger, and so forth. In some
embodiments, the user interface is designed to work primarily with
finger-based contacts and gestures, which can be less precise than
stylus-based input due to the larger area of contact of a finger on
the touch screen. In some embodiments, the device translates the
rough finger-based input into a precise pointer/cursor position or
command for performing the actions desired by the user.
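For illustration only, one way to translate a rough finger contact into
a precise pointer position is a pressure-weighted centroid of the
contact patch; the PatchSample type and this particular scheme are
assumptions introduced here, not the application's method:

// A sampled point within the finger's contact patch on the screen.
struct PatchSample {
    let x: Double, y: Double
    let pressure: Double  // local pressure (or a proxy) at this point
}

// The centroid of the patch, weighted by local pressure, yields a
// single precise cursor position for the rough finger-based input.
func cursorPosition(for patch: [PatchSample]) -> (x: Double, y: Double)? {
    let total = patch.reduce(0) { $0 + $1.pressure }
    guard total > 0 else { return nil }
    let cx = patch.reduce(0) { $0 + $1.x * $1.pressure } / total
    let cy = patch.reduce(0) { $0 + $1.y * $1.pressure } / total
    return (cx, cy)
}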
[0086] In some embodiments, in addition to the touch screen, device
100 optionally includes a touchpad (not shown) for activating or
deactivating particular functions. In some embodiments, the
touchpad is a touch-sensitive area of the device that, unlike the
touch screen, does not display visual output. The touchpad is,
optionally, a touch-sensitive surface that is separate from touch
screen 112 or an extension of the touch-sensitive surface formed by
the touch screen.
[0087] Device 100 also includes power system 162 for powering the
various components. Power system 162 optionally includes a power
management system, one or more power sources (e.g., battery,
alternating current (AC)), a recharging system, a power failure
detection circuit, a power converter or inverter, a power status
indicator (e.g., a light-emitting diode (LED)) and any other
components associated with the generation, management and
distribution of power in portable devices.
[0088] Device 100 optionally also includes one or more optical
sensors 164. FIG. 1A shows an optical sensor coupled to optical
sensor controller 158 in I/O subsystem 106. Optical sensor 164
optionally includes charge-coupled device (CCD) or complementary
metal-oxide semiconductor (CMOS) phototransistors. Optical sensor
164 receives light from the environment, projected through one or
more lenses, and converts the light to data representing an image. In
conjunction with imaging module 143 (also called a camera module),
optical sensor 164 optionally captures still images or video. In
some embodiments, an optical sensor is located on the back of
device 100, opposite touch screen display 112 on the front of the
device, so that the touch screen display is enabled for use as a
viewfinder for still and/or video image acquisition. In some
embodiments, another optical sensor is located on the front of the
device so that the user's image is, optionally, obtained for
videoconferencing while the user views the other video conference
participants on the touch screen display.
[0089] Device 100 optionally also includes one or more contact
intensity sensors 165. FIG. 1A shows a contact intensity sensor
coupled to intensity sensor controller 159 in I/O subsystem 106.
Contact intensity sensor 165 optionally includes one or more
piezoresistive strain gauges, capacitive force sensors, electric
force sensors, piezoelectric force sensors, optical force sensors,
capacitive touch-sensitive surfaces, or other intensity sensors
(e.g., sensors used to measure the force (or pressure) of a contact
on a touch-sensitive surface). Contact intensity sensor 165
receives contact intensity information (e.g., pressure information
or a proxy for pressure information) from the environment. In some
embodiments, at least one contact intensity sensor is collocated
with, or proximate to, a touch-sensitive surface (e.g.,
touch-sensitive display system 112). In some embodiments, at least
one contact intensity sensor is located on the back of device 100,
opposite touch screen display 112 which is located on the front of
device 100.
[0090] Device 100 optionally also includes one or more proximity
sensors 166. FIG. 1A shows proximity sensor 166 coupled to
peripherals interface 118. Alternatively, proximity sensor 166 is
coupled to input controller 160 in I/O subsystem 106. In some
embodiments, the proximity sensor turns off and disables touch
screen 112 when the multifunction device is placed near the user's
ear (e.g., when the user is making a phone call).
[0091] Device 100 optionally also includes one or more tactile
output generators 167. FIG. 1A shows a tactile output generator
coupled to haptic feedback controller 161 in I/O subsystem 106.
Tactile output generator 167 optionally includes one or more
electroacoustic devices such as speakers or other audio components
and/or electromechanical devices that convert energy into linear
motion such as a motor, solenoid, electroactive polymer,
piezoelectric actuator, electrostatic actuator, or other tactile
output generating component (e.g., a component that converts
electrical signals into tactile outputs on the device). Tactile output
generator 167 receives tactile feedback generation instructions from
haptic feedback module 133 and generates tactile
outputs on device 100 that are capable of being sensed by a user of
device 100. In some embodiments, at least one tactile output
generator is collocated with, or proximate to, a touch-sensitive
surface (e.g., touch-sensitive display system 112) and, optionally,
generates a tactile output by moving the touch-sensitive surface
vertically (e.g., in/out of a surface of device 100) or laterally
(e.g., back and forth in the same plane as a surface of device
100). In some embodiments, at least one tactile output generator
sensor is located on the back of device 100, opposite touch screen
display 112 which is located on the front of device 100.
[0092] Device 100 optionally also includes one or more
accelerometers 168. FIG. 1A shows accelerometer 168 coupled to
peripherals interface 118. Alternatively, accelerometer 168 is,
optionally, coupled to an input controller 160 in I/O subsystem
106. In some embodiments, information is displayed on the touch
screen display in a portrait view or a landscape view based on an
analysis of data received from the one or more accelerometers.
Device 100 optionally includes, in addition to accelerometer(s)
168, a magnetometer (not shown) and a GPS (or GLONASS or other
global navigation system) receiver (not shown) for obtaining
information concerning the location and orientation (e.g., portrait
or landscape) of device 100.
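For illustration only, choosing a portrait or landscape view from
accelerometer data can be sketched in Swift by checking which screen
axis gravity dominates; the types below are assumptions introduced
here:

// A 3-axis accelerometer sample in the device's coordinate frame
// (x across the screen, y along the screen's long axis), in g.
struct AccelerometerSample {
    let x: Double, y: Double, z: Double
}

enum InterfaceOrientation { case portrait, landscape }

// If gravity pulls mostly along the screen's long axis, display a
// portrait view; otherwise display a landscape view.
func orientation(from sample: AccelerometerSample) -> InterfaceOrientation {
    return abs(sample.y) >= abs(sample.x) ? .portrait : .landscape
}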
[0093] In some embodiments, the software components stored in
memory 102 include operating system 126, communication module (or
set of instructions) 128, contact/motion module (or set of
instructions) 130, graphics module (or set of instructions) 132,
text input module (or set of instructions) 134, Global Positioning
System (GPS) module (or set of instructions) 135, and applications
(or sets of instructions) 136. Furthermore, in some embodiments
memory 102 stores device/global internal state 157, as shown in
FIGS. 1A and 3. Device/global internal state 157 includes one or
more of: active application state, indicating which applications,
if any, are currently active; display state, indicating what
applications, views or other information occupy various regions of
touch screen display 112; sensor state, including information
obtained from the device's various sensors and input control
devices 116; and location information concerning the device's
location and/or attitude.
[0094] Operating system 126 (e.g., Darwin, RTXC, LINUX, UNIX, OS X,
WINDOWS, or an embedded operating system such as VxWorks) includes
various software components and/or drivers for controlling and
managing general system tasks (e.g., memory management, storage
device control, power management, etc.) and facilitates
communication between various hardware and software components.
[0095] Communication module 128 facilitates communication with
other devices over one or more external ports 124 and also includes
various software components for handling data received by RF
circuitry 108 and/or external port 124. External port 124 (e.g.,
Universal Serial Bus (USB), FIREWIRE, etc.) is adapted for coupling
directly to other devices or indirectly over a network (e.g., the
Internet, wireless LAN, etc.). In some embodiments, the external
port is a multi-pin (e.g., 30-pin) connector that is the same as,
or similar to and/or compatible with the 30-pin connector used on
iPod (trademark of Apple Inc.) devices.
[0096] Contact/motion module 130 optionally detects contact with
touch screen 112 (in conjunction with display controller 156) and
other touch sensitive devices (e.g., a touchpad or physical click
wheel). Contact/motion module 130 includes various software
components for performing various operations related to detection
of contact, such as determining if contact has occurred (e.g.,
detecting a finger-down event), determining an intensity of the
contact (e.g., the force or pressure of the contact or a substitute
for the force or pressure of the contact), determining if there is
movement of the contact and tracking the movement across the
touch-sensitive surface (e.g., detecting one or more
finger-dragging events), and determining if the contact has ceased
(e.g., detecting a finger-up event or a break in contact).
Contact/motion module 130 receives contact data from the
touch-sensitive surface. Determining movement of the point of
contact, which is represented by a series of contact data,
optionally includes determining speed (magnitude), velocity
(magnitude and direction), and/or an acceleration (a change in
magnitude and/or direction) of the point of contact. These
operations are, optionally, applied to single contacts (e.g., one
finger contacts) or to multiple simultaneous contacts (e.g.,
"multitouch"/multiple finger contacts). In some embodiments,
contact/motion module 130 and display controller 156 detect contact
on a touchpad.
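For illustration only, determining speed and velocity from a series of
contact data can be sketched in Swift from two consecutive samples; the
ContactSample type is an assumption introduced here:

import Foundation

// One sample of the point of contact: a position on the surface plus
// the time at which it was reported.
struct ContactSample {
    let x: Double, y: Double       // position in points
    let timestamp: TimeInterval    // seconds
}

// Velocity (magnitude and direction) of the point of contact between
// two consecutive samples; speed is the magnitude of this vector.
func velocity(from a: ContactSample, to b: ContactSample) -> (dx: Double, dy: Double) {
    let dt = b.timestamp - a.timestamp
    guard dt > 0 else { return (0, 0) }
    return ((b.x - a.x) / dt, (b.y - a.y) / dt)
}

func speed(from a: ContactSample, to b: ContactSample) -> Double {
    let v = velocity(from: a, to: b)
    return (v.dx * v.dx + v.dy * v.dy).squareRoot()
}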
[0097] In some embodiments, contact/motion module 130 uses a set of
one or more intensity thresholds to determine whether an operation
has been performed by a user (e.g., to determine whether a user has
"clicked" on an icon). In some embodiments at least a subset of the
intensity thresholds are determined in accordance with software
parameters (e.g., the intensity thresholds are not determined by
the activation thresholds of particular physical actuators and can
be adjusted without changing the physical hardware of device 100).
For example, a mouse "click" threshold of a trackpad or touch
screen display can be set to any of a large range of predefined
thresholds values without changing the trackpad or touch screen
display hardware. Additionally, in some implementations a user of
the device is provided with software settings for adjusting one or
more of the set of intensity thresholds (e.g., by adjusting
individual intensity thresholds and/or by adjusting a plurality of
intensity thresholds at once with a system-level click "intensity"
parameter).
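For illustration only, software-defined intensity thresholds with a
system-level click "intensity" setting can be sketched in Swift; the
struct, the threshold values, and the scaling scheme are assumptions
introduced here, not Apple API:

// Thresholds determined by software parameters rather than by the
// activation thresholds of physical actuators; values are normalized
// intensity units chosen for illustration.
struct IntensityThresholds {
    var lightPress: Double = 0.25
    var deepPress: Double  = 0.60

    // Adjust a plurality of thresholds at once, with no hardware change.
    mutating func applyClickIntensity(scale: Double) {
        lightPress *= scale
        deepPress  *= scale
    }
}

var thresholds = IntensityThresholds()
thresholds.applyClickIntensity(scale: 1.5)  // user prefers firmer clicks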
[0098] Contact/motion module 130 optionally detects a gesture input
by a user. Different gestures on the touch-sensitive surface have
different contact patterns and intensities. Thus, a gesture is,
optionally, detected by detecting a particular contact pattern. For
example, detecting a finger tap gesture includes detecting a
finger-down event followed by detecting a finger-up (lift off)
event at the same position (or substantially the same position) as
the finger-down event (e.g., at the position of an icon). As
another example, detecting a finger swipe gesture on the
touch-sensitive surface includes detecting a finger-down event
followed by detecting one or more finger-dragging events, and
subsequently followed by detecting a finger-up (lift off)
event.
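For illustration only, detecting a gesture by its contact pattern can
be sketched in Swift as classifying a sequence of finger sub-events;
the FingerEvent and Gesture types and the slop distance are assumptions
introduced here:

// The sub-event stream for one finger: down, zero or more drags, up.
enum FingerEvent {
    case down(x: Double, y: Double)
    case drag(x: Double, y: Double)
    case up(x: Double, y: Double)
}

enum Gesture { case tap, swipe, unrecognized }

// A tap: finger-down followed by finger-up at (substantially) the same
// position. A swipe: finger-down, one or more drags, then finger-up.
func classify(_ events: [FingerEvent], slop: Double = 10) -> Gesture {
    guard case let .down(x0, y0)? = events.first,
          case let .up(x1, y1)? = events.last else { return .unrecognized }
    let moved = ((x1 - x0) * (x1 - x0) + (y1 - y0) * (y1 - y0)).squareRoot()
    let dragged = events.dropFirst().dropLast().contains {
        if case .drag = $0 { return true }
        return false
    }
    if !dragged && moved <= slop { return .tap }
    if dragged && moved > slop { return .swipe }
    return .unrecognized
}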
[0099] Graphics module 132 includes various known software
components for rendering and displaying graphics on touch screen
112 or other display, including components for changing the visual
impact (e.g., brightness, transparency, saturation, contrast or
other visual property) of graphics that are displayed. As used
herein, the term "graphics" includes any object that can be
displayed to a user, including without limitation text, web pages,
icons (such as user-interface objects including soft keys), digital
images, videos, animations and the like.
[0100] In some embodiments, graphics module 132 stores data
representing graphics to be used. Each graphic is, optionally,
assigned a corresponding code. Graphics module 132 receives, from
applications etc., one or more codes specifying graphics to be
displayed along with, if necessary, coordinate data and other
graphic property data, and then generates screen image data to
output to display controller 156.
[0101] Haptic feedback module 133 includes various software
components for generating instructions used by tactile output
generator(s) 167 to produce tactile outputs at one or more
locations on device 100 in response to user interactions with
device 100.
[0102] Text input module 134, which is, optionally, a component of
graphics module 132, provides soft keyboards for entering text in
various applications (e.g., contacts 137, e-mail 140, IM 141,
browser 147, and any other application that needs text input).
[0103] GPS module 135 determines the location of the device and
provides this information for use in various applications (e.g., to
telephone 138 for use in location-based dialing, to camera 143 as
picture/video metadata, and to applications that provide
location-based services such as weather widgets, local yellow page
widgets, and map/navigation widgets).
[0104] Applications 136 optionally include the following modules
(or sets of instructions), or a subset or superset thereof:
[0105] contacts module 137 (sometimes called an address book or
contact list);
[0106] telephone module 138;
[0107] video conferencing module 139;
[0108] e-mail client module 140;
[0109] instant messaging (IM) module 141;
[0110] workout support module 142;
[0111] camera module 143 for still and/or video images;
[0112] image management module 144;
[0113] browser module 147;
[0114] calendar module 148;
[0115] widget modules 149, which optionally include one or more of:
weather widget 149-1, stocks widget 149-2, calculator widget 149-3,
alarm clock widget 149-4, dictionary widget 149-5, and other widgets
obtained by the user, as well as user-created widgets 149-6;
[0116] widget creator module 150 for making user-created widgets
149-6;
[0117] search module 151;
[0118] video and music player module 152, which is, optionally, made
up of a video player module and a music player module;
[0119] notes module 153;
[0120] map module 154; and/or
[0121] online video module 155.
[0122] Examples of other applications 136 that are, optionally,
stored in memory 102 include other word processing applications,
other image editing applications, drawing applications,
presentation applications, JAVA-enabled applications, encryption,
digital rights management, voice recognition, and voice
replication.
[0123] In conjunction with touch screen 112, display controller
156, contact module 130, graphics module 132, and text input module
134, contacts module 137 is, optionally, used to manage an address
book or contact list (e.g., stored in application internal state
192 of contacts module 137 in memory 102 or memory 370), including:
adding name(s) to the address book; deleting name(s) from the
address book; associating telephone number(s), e-mail address(es),
physical address(es) or other information with a name; associating
an image with a name; categorizing and sorting names; providing
telephone numbers or e-mail addresses to initiate and/or facilitate
communications by telephone 138, video conference 139, e-mail 140,
or IM 141; and so forth.
[0124] In conjunction with RF circuitry 108, audio circuitry 110,
speaker 111, microphone 113, touch screen 112, display controller
156, contact module 130, graphics module 132, and text input module
134, telephone module 138 is, optionally, used to enter a sequence
of characters corresponding to a telephone number, access one or
more telephone numbers in address book 137, modify a telephone
number that has been entered, dial a respective telephone number,
conduct a conversation and disconnect or hang up when the
conversation is completed. As noted above, the wireless
communication optionally uses any of a plurality of communications
standards, protocols and technologies.
[0125] In conjunction with RF circuitry 108, audio circuitry 110,
speaker 111, microphone 113, touch screen 112, display controller
156, optical sensor 164, optical sensor controller 158, contact
module 130, graphics module 132, text input module 134, contact
list 137, and telephone module 138, videoconferencing module 139
includes executable instructions to initiate, conduct, and
terminate a video conference between a user and one or more other
participants in accordance with user instructions.
[0126] In conjunction with RF circuitry 108, touch screen 112,
display controller 156, contact module 130, graphics module 132,
and text input module 134, e-mail client module 140 includes
executable instructions to create, send, receive, and manage e-mail
in response to user instructions. In conjunction with image
management module 144, e-mail client module 140 makes it very easy
to create and send e-mails with still or video images taken with
camera module 143.
[0127] In conjunction with RF circuitry 108, touch screen 112,
display controller 156, contact module 130, graphics module 132,
and text input module 134, the instant messaging module 141
includes executable instructions to enter a sequence of characters
corresponding to an instant message, to modify previously entered
characters, to transmit a respective instant message (for example,
using a Short Message Service (SMS) or Multimedia Message Service
(MMS) protocol for telephony-based instant messages or using XMPP,
SIMPLE, or IMPS for Internet-based instant messages), to receive
instant messages and to view received instant messages. In some
embodiments, transmitted and/or received instant messages
optionally include graphics, photos, audio files, video files
and/or other attachments as are supported in a MMS and/or an
Enhanced Messaging Service (EMS). As used herein, "instant
messaging" refers to both telephony-based messages (e.g., messages
sent using SMS or MMS) and Internet-based messages (e.g., messages
sent using XMPP, SIMPLE, or IMPS).
[0128] In conjunction with RF circuitry 108, touch screen 112,
display controller 156, contact module 130, graphics module 132,
text input module 134, GPS module 135, map module 154, and video and
music player module 152, workout support module 142 includes executable
instructions to create workouts (e.g., with time, distance, and/or
calorie burning goals); communicate with workout sensors (sports
devices); receive workout sensor data; calibrate sensors used to
monitor a workout; select and play music for a workout; and
display, store and transmit workout data.
[0129] In conjunction with touch screen 112, display controller
156, optical sensor(s) 164, optical sensor controller 158, contact
module 130, graphics module 132, and image management module 144,
camera module 143 includes executable instructions to capture still
images or video (including a video stream) and store them into
memory 102, modify characteristics of a still image or video, or
delete a still image or video from memory 102.
[0130] In conjunction with touch screen 112, display controller
156, contact module 130, graphics module 132, text input module
134, and camera module 143, image management module 144 includes
executable instructions to arrange, modify (e.g., edit), or
otherwise manipulate, label, delete, present (e.g., in a digital
slide show or album), and store still and/or video images.
[0131] In conjunction with RF circuitry 108, touch screen 112,
display system controller 156, contact module 130, graphics module
132, and text input module 134, browser module 147 includes
executable instructions to browse the Internet in accordance with
user instructions, including searching, linking to, receiving, and
displaying web pages or portions thereof, as well as attachments
and other files linked to web pages.
[0132] In conjunction with RF circuitry 108, touch screen 112,
display system controller 156, contact module 130, graphics module
132, text input module 134, e-mail client module 140, and browser
module 147, calendar module 148 includes executable instructions to
create, display, modify, and store calendars and data associated
with calendars (e.g., calendar entries, to do lists, etc.) in
accordance with user instructions.
[0133] In conjunction with RF circuitry 108, touch screen 112,
display system controller 156, contact module 130, graphics module
132, text input module 134, and browser module 147, widget modules
149 are mini-applications that are, optionally, downloaded and used
by a user (e.g., weather widget 149-1, stocks widget 149-2,
calculator widget 149-3, alarm clock widget 149-4, and dictionary
widget 149-5) or created by the user (e.g., user-created widget
149-6). In some embodiments, a widget includes an HTML (Hypertext
Markup Language) file, a CSS (Cascading Style Sheets) file, and a
JavaScript file. In some embodiments, a widget includes an XML
(Extensible Markup Language) file and a JavaScript file (e.g.,
Yahoo! Widgets).
[0134] In conjunction with RF circuitry 108, touch screen 112,
display system controller 156, contact module 130, graphics module
132, text input module 134, and browser module 147, the widget
creator module 150 is, optionally, used by a user to create
widgets (e.g., turning a user-specified portion of a web page into
a widget).
[0135] In conjunction with touch screen 112, display system
controller 156, contact module 130, graphics module 132, and text
input module 134, search module 151 includes executable
instructions to search for text, music, sound, image, video, and/or
other files in memory 102 that match one or more search criteria
(e.g., one or more user-specified search terms) in accordance with
user instructions.
[0136] In conjunction with touch screen 112, display system
controller 156, contact module 130, graphics module 132, audio
circuitry 110, speaker 111, RF circuitry 108, and browser module
147, video and music player module 152 includes executable
instructions that allow the user to download and play back recorded
music and other sound files stored in one or more file formats,
such as MP3 or AAC files, and executable instructions to display,
present or otherwise play back videos (e.g., on touch screen 112 or
on an external, connected display via external port 124). In some
embodiments, device 100 optionally includes the functionality of an
MP3 player, such as an iPod (trademark of Apple Inc.).
[0137] In conjunction with touch screen 112, display controller
156, contact module 130, graphics module 132, and text input module
134, notes module 153 includes executable instructions to create
and manage notes, to do lists, and the like in accordance with user
instructions.
[0138] In conjunction with RF circuitry 108, touch screen 112,
display system controller 156, contact module 130, graphics module
132, text input module 134, GPS module 135, and browser module 147,
map module 154 is, optionally, used to receive, display, modify,
and store maps and data associated with maps (e.g., driving
directions; data on stores and other points of interest at or near
a particular location; and other location-based data) in accordance
with user instructions.
[0139] In conjunction with touch screen 112, display system
controller 156, contact module 130, graphics module 132, audio
circuitry 110, speaker 111, RF circuitry 108, text input module
134, e-mail client module 140, and browser module 147, online video
module 155 includes instructions that allow the user to access,
browse, receive (e.g., by streaming and/or download), play back
(e.g., on the touch screen or on an external, connected display via
external port 124), send an e-mail with a link to a particular
online video, and otherwise manage online videos in one or more
file formats, such as H.264. In some embodiments, instant messaging
module 141, rather than e-mail client module 140, is used to send a
link to a particular online video.
[0140] Each of the above identified modules and applications
corresponds to a set of executable instructions for performing one
or more functions described above and the methods described in this
application (e.g., the computer-implemented methods and other
information processing methods described herein). These modules
(i.e., sets of instructions) need not be implemented as separate
software programs, procedures or modules, and thus various subsets
of these modules are, optionally, combined or otherwise re-arranged
in various embodiments. In some embodiments, memory 102 optionally
stores a subset of the modules and data structures identified
above. Furthermore, memory 102 optionally stores additional modules
and data structures not described above.
[0141] In some embodiments, device 100 is a device where operation
of a predefined set of functions on the device is performed
exclusively through a touch screen and/or a touchpad. By using a
touch screen and/or a touchpad as the primary input control device
for operation of device 100, the number of physical input control
devices (such as push buttons, dials, and the like) on device 100
is, optionally, reduced.
[0142] The predefined set of functions that are performed
exclusively through a touch screen and/or a touchpad optionally
include navigation between user interfaces. In some embodiments,
the touchpad, when touched by the user, navigates device 100 to a
main, home, or root menu from any user interface that is displayed
on device 100. In such embodiments, a "menu button" is implemented
using a touchpad. In some other embodiments, the menu button is a
physical push button or other physical input control device instead
of a touchpad.
[0143] FIG. 1B is a block diagram illustrating exemplary components
for event handling in accordance with some embodiments. In some
embodiments, memory 102 (in FIG. 1A) or 370 (FIG. 3) includes event
sorter 170 (e.g., in operating system 126) and a respective
application 136-1 (e.g., any of the aforementioned applications
137-151, 155, 380-390).
[0144] Event sorter 170 receives event information and determines
the application 136-1 and application view 191 of application 136-1
to which to deliver the event information. Event sorter 170
includes event monitor 171 and event dispatcher module 174. In some
embodiments, application 136-1 includes application internal state
192, which indicates the current application view(s) displayed on
touch sensitive display 112 when the application is active or
executing. In some embodiments, device/global internal state 157 is
used by event sorter 170 to determine which application(s) is (are)
currently active, and application internal state 192 is used by
event sorter 170 to determine application views 191 to which to
deliver event information.
[0145] In some embodiments, application internal state 192 includes
additional information, such as one or more of: resume information
to be used when application 136-1 resumes execution, user interface
state information that indicates information being displayed or
that is ready for display by application 136-1, a state queue for
enabling the user to go back to a prior state or view of
application 136-1, and a redo/undo queue of previous actions taken
by the user.
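For illustration only, the state queue and redo/undo queue mentioned
above can be sketched in Swift as a pair of stacks; the UndoRedoQueue
type and its Action placeholder are assumptions introduced here:

// A generic redo/undo queue of previous actions taken by the user;
// Action stands in for whatever the application records per action.
struct UndoRedoQueue<Action> {
    private var undoStack: [Action] = []
    private var redoStack: [Action] = []

    mutating func record(_ action: Action) {
        undoStack.append(action)
        redoStack.removeAll()  // a new action invalidates redo history
    }
    mutating func undo() -> Action? {
        guard let action = undoStack.popLast() else { return nil }
        redoStack.append(action)
        return action
    }
    mutating func redo() -> Action? {
        guard let action = redoStack.popLast() else { return nil }
        undoStack.append(action)
        return action
    }
}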
[0146] Event monitor 171 receives event information from
peripherals interface 118. Event information includes information
about a sub-event (e.g., a user touch on touch-sensitive display
112, as part of a multi-touch gesture). Peripherals interface 118
transmits information it receives from I/O subsystem 106 or a
sensor, such as proximity sensor 166, accelerometer(s) 168, and/or
microphone 113 (through audio circuitry 110). Information that
peripherals interface 118 receives from I/O subsystem 106 includes
information from touch-sensitive display 112 or a touch-sensitive
surface.
[0147] In some embodiments, event monitor 171 sends requests to the
peripherals interface 118 at predetermined intervals. In response,
peripherals interface 118 transmits event information. In other
embodiments, peripherals interface 118 transmits event information
only when there is a significant event (e.g., receiving an input
above a predetermined noise threshold and/or for more than a
predetermined duration).
[0148] In some embodiments, event sorter 170 also includes a hit
view determination module 172 and/or an active event recognizer
determination module 173.
[0149] Hit view determination module 172 provides software
procedures for determining where a sub-event has taken place within
one or more views, when touch sensitive display 112 displays more
than one view. Views are made up of controls and other elements
that a user can see on the display.
[0150] Another aspect of the user interface associated with an
application is a set of views, sometimes herein called application
views or user interface windows, in which information is displayed
and touch-based gestures occur. The application views (of a
respective application) in which a touch is detected optionally
correspond to programmatic levels within a programmatic or view
hierarchy of the application. For example, the lowest level view in
which a touch is detected is, optionally, called the hit view, and
the set of events that are recognized as proper inputs are,
optionally, determined based, at least in part, on the hit view of
the initial touch that begins a touch-based gesture.
[0151] Hit view determination module 172 receives information
related to sub-events of a touch-based gesture. When an application
has multiple views organized in a hierarchy, hit view determination
module 172 identifies a hit view as the lowest view in the
hierarchy which should handle the sub-event. In most circumstances,
the hit view is the lowest level view in which an initiating
sub-event occurs (i.e., the first sub-event in the sequence of
sub-events that form an event or potential event). Once the hit
view is identified by the hit view determination module, the hit
view typically receives all sub-events related to the same touch or
input source for which it was identified as the hit view.
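For illustration only, identifying the hit view as the lowest view in
the hierarchy containing the initiating sub-event can be sketched in
Swift; the View type and its rectangular frames in a shared coordinate
space are assumptions introduced here:

// A minimal view node: a rectangle plus subviews lower in the hierarchy.
final class View {
    let frame: (x: Double, y: Double, w: Double, h: Double)
    var subviews: [View] = []
    init(frame: (x: Double, y: Double, w: Double, h: Double)) {
        self.frame = frame
    }
    func contains(_ px: Double, _ py: Double) -> Bool {
        px >= frame.x && px < frame.x + frame.w &&
        py >= frame.y && py < frame.y + frame.h
    }
}

// Recurse into children first so that the deepest (lowest-level) view
// containing the sub-event's location is returned as the hit view.
func hitView(in root: View, x: Double, y: Double) -> View? {
    guard root.contains(x, y) else { return nil }
    for child in root.subviews {
        if let hit = hitView(in: child, x: x, y: y) { return hit }
    }
    return root
}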
[0152] Active event recognizer determination module 173 determines
which view or views within a view hierarchy should receive a
particular sequence of sub-events. In some embodiments, active
event recognizer determination module 173 determines that only the
hit view should receive a particular sequence of sub-events. In
other embodiments, active event recognizer determination module 173
determines that all views that include the physical location of a
sub-event are actively involved views, and therefore determines
that all actively involved views should receive a particular
sequence of sub-events. In other embodiments, even if touch
sub-events were entirely confined to the area associated with one
particular view, views higher in the hierarchy would still remain
as actively involved views.
[0153] Event dispatcher module 174 dispatches the event information
to an event recognizer (e.g., event recognizer 180). In embodiments
including active event recognizer determination module 173, event
dispatcher module 174 delivers the event information to an event
recognizer determined by active event recognizer determination
module 173. In some embodiments, event dispatcher module 174 stores
in an event queue the event information, which is retrieved by a
respective event receiver module 182.
[0154] In some embodiments, operating system 126 includes event
sorter 170. Alternatively, application 136-1 includes event sorter
170. In yet other embodiments, event sorter 170 is a stand-alone
module, or a part of another module stored in memory 102, such as
contact/motion module 130.
[0155] In some embodiments, application 136-1 includes a plurality
of event handlers 190 and one or more application views 191, each
of which includes instructions for handling touch events that occur
within a respective view of the application's user interface. Each
application view 191 of the application 136-1 includes one or more
event recognizers 180. Typically, a respective application view 191
includes a plurality of event recognizers 180. In other
embodiments, one or more of event recognizers 180 are part of a
separate module, such as a user interface kit (not shown) or a
higher level object from which application 136-1 inherits methods
and other properties. In some embodiments, a respective event
handler 190 includes one or more of: data updater 176, object
updater 177, GUI updater 178, and/or event data 179 received from
event sorter 170. Event handler 190 optionally utilizes or calls
data updater 176, object updater 177 or GUI updater 178 to update
the application internal state 192. Alternatively, one or more of
the application views 191 includes one or more respective event
handlers 190. Also, in some embodiments, one or more of data
updater 176, object updater 177, and GUI updater 178 are included
in a respective application view 191.
[0156] A respective event recognizer 180 receives event information
(e.g., event data 179) from event sorter 170, and identifies an
event from the event information. Event recognizer 180 includes
event receiver 182 and event comparator 184. In some embodiments,
event recognizer 180 also includes at least a subset of: metadata
183, and event delivery instructions 188 (which optionally include
sub-event delivery instructions).
[0157] Event receiver 182 receives event information from event
sorter 170. The event information includes information about a
sub-event, for example, a touch or a touch movement. Depending on
the sub-event, the event information also includes additional
information, such as location of the sub-event. When the sub-event
concerns motion of a touch, the event information optionally also
includes speed and direction of the sub-event. In some embodiments,
events include rotation of the device from one orientation to
another (e.g., from a portrait orientation to a landscape
orientation, or vice versa), and the event information includes
corresponding information about the current orientation (also
called device attitude) of the device.
[0158] Event comparator 184 compares the event information to
predefined event or sub-event definitions and, based on the
comparison, determines an event or sub-event, or determines or
updates the state of an event or sub-event. In some embodiments,
event comparator 184 includes event definitions 186. Event
definitions 186 contain definitions of events (e.g., predefined
sequences of sub-events), for example, event 1 (187-1), event 2
(187-2), and others. In some embodiments, sub-events in an event
187 include, for example, touch begin, touch end, touch movement,
touch cancellation, and multiple touching. In one example, the
definition for event 1 (187-1) is a double tap on a displayed
object. The double tap, for example, comprises a first touch (touch
begin) on the displayed object for a predetermined phase, a first
lift-off (touch end) for a predetermined phase, a second touch
(touch begin) on the displayed object for a predetermined phase,
and a second lift-off (touch end) for a predetermined phase. In
another example, the definition for event 2 (187-2) is a dragging
on a displayed object. The dragging, for example, comprises a touch
(or contact) on the displayed object for a predetermined phase, a
movement of the touch across touch-sensitive display 112, and
lift-off of the touch (touch end). In some embodiments, the event
also includes information for one or more associated event handlers
190.
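For illustration only, comparing a sub-event sequence against event
definitions such as event 1 (double tap) and event 2 (drag) can be
sketched in Swift; the SubEvent enum, the prefix-matching step, and the
omission of per-phase timing are assumptions introduced here:

// The sub-event vocabulary used by the illustrative definitions below.
enum SubEvent { case touchBegin, touchEnd, touchMove, touchCancel }

// Event 1: double tap = begin, end, begin, end on the same displayed
// object (the predetermined duration of each phase is omitted here).
let doubleTapDefinition: [SubEvent] = [.touchBegin, .touchEnd,
                                       .touchBegin, .touchEnd]

// Event 2: drag = begin, one or more movements, then end (lift-off).
func matchesDrag(_ sequence: [SubEvent]) -> Bool {
    guard sequence.first == .touchBegin,
          sequence.last == .touchEnd else { return false }
    let middle = sequence.dropFirst().dropLast()
    return !middle.isEmpty && middle.allSatisfy { $0 == .touchMove }
}

// A recognizer that has seen a prefix of its definition stays possible;
// a mismatch sends it to a failed state (cf. paragraph [0161]), and a
// full match recognizes the event.
enum RecognizerState { case possible, recognized, failed }

func step(seenSoFar: [SubEvent], next: SubEvent,
          definition: [SubEvent]) -> RecognizerState {
    let seen = seenSoFar + [next]
    guard Array(definition.prefix(seen.count)) == seen else { return .failed }
    return seen.count == definition.count ? .recognized : .possible
}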
[0159] In some embodiments, event definition 187 includes a
definition of an event for a respective user-interface object. In
some embodiments, event comparator 184 performs a hit test to
determine which user-interface object is associated with a
sub-event. For example, in an application view in which three
user-interface objects are displayed on touch-sensitive display
112, when a touch is detected on touch-sensitive display 112, event
comparator 184 performs a hit test to determine which of the three
user-interface objects is associated with the touch (sub-event). If
each displayed object is associated with a respective event handler
190, the event comparator uses the result of the hit test to
determine which event handler 190 should be activated. For example,
event comparator 184 selects an event handler associated with the
sub-event and the object triggering the hit test.
[0160] In some embodiments, the definition for a respective event
187 also includes delayed actions that delay delivery of the event
information until after it has been determined whether the sequence
of sub-events does or does not correspond to the event recognizer's
event type.
[0161] When a respective event recognizer 180 determines that the
series of sub-events does not match any of the events in event
definitions 186, the respective event recognizer 180 enters an
event impossible, event failed, or event ended state, after which
it disregards subsequent sub-events of the touch-based gesture. In
this situation, other event recognizers, if any, that remain active
for the hit view continue to track and process sub-events of an
ongoing touch-based gesture.
[0162] In some embodiments, a respective event recognizer 180
includes metadata 183 with configurable properties, flags, and/or
lists that indicate how the event delivery system should perform
sub-event delivery to actively involved event recognizers. In some
embodiments, metadata 183 includes configurable properties, flags,
and/or lists that indicate how event recognizers interact, or are
enabled to interact, with one another. In some embodiments,
metadata 183 includes configurable properties, flags, and/or lists
that indicate whether sub-events are delivered to varying levels in
the view or programmatic hierarchy.
[0163] In some embodiments, a respective event recognizer 180
activates event handler 190 associated with an event when one or
more particular sub-events of an event are recognized. In some
embodiments, a respective event recognizer 180 delivers event
information associated with the event to event handler 190.
Activating an event handler 190 is distinct from sending (and
deferred sending) sub-events to a respective hit view. In some
embodiments, event recognizer 180 throws a flag associated with the
recognized event, and event handler 190 associated with the flag
catches the flag and performs a predefined process.
[0164] In some embodiments, event delivery instructions 188 include
sub-event delivery instructions that deliver event information
about a sub-event without activating an event handler. Instead, the
sub-event delivery instructions deliver event information to event
handlers associated with the series of sub-events or to actively
involved views. Event handlers associated with the series of
sub-events or with actively involved views receive the event
information and perform a predetermined process.
[0165] In some embodiments, data updater 176 creates and updates
data used in application 136-1. For example, data updater 176
updates the telephone number used in contacts module 137, or stores
a video file used in video and music player module 152. In some
embodiments,
object updater 177 creates and updates objects used in application
136-1. For example, object updater 177 creates a new user-interface
object or updates the position of a user-interface object. GUI
updater 178 updates the GUI. For example, GUI updater 178 prepares
display information and sends it to graphics module 132 for display
on a touch-sensitive display.
[0166] In some embodiments, event handler(s) 190 includes or has
access to data updater 176, object updater 177, and GUI updater
178. In some embodiments, data updater 176, object updater 177, and
GUI updater 178 are included in a single module of a respective
application 136-1 or application view 191. In other embodiments,
they are included in two or more software modules.
[0167] It shall be understood that the foregoing discussion
regarding event handling of user touches on touch-sensitive
displays also applies to other forms of user inputs to operate
multifunction devices 100 with input devices, not all of which are
initiated on touch screens. For example, mouse movement and mouse
button presses, optionally coordinated with single or multiple
keyboard presses or holds; contact movements such as taps, drags,
scrolls, etc., on touch-pads; pen stylus inputs; movement of the
device; oral instructions; detected eye movements; biometric
inputs; and/or any combination thereof are optionally utilized as
inputs corresponding to sub-events which define an event to be
recognized.
[0168] FIG. 2 illustrates a portable multifunction device 100
having a touch screen 112 in accordance with some embodiments. The
touch screen optionally displays one or more graphics within user
interface (UI) 200. In this embodiment, as well as others described
below, a user is enabled to select one or more of the graphics by
making a gesture on the graphics, for example, with one or more
fingers 202 (not drawn to scale in the figure) or one or more
styluses 203 (not drawn to scale in the figure). In some
embodiments, selection of one or more graphics occurs when the user
breaks contact with the one or more graphics. In some embodiments,
the gesture optionally includes one or more taps, one or more
swipes (from left to right, right to left, upward and/or downward)
and/or a rolling of a finger (from right to left, left to right,
upward and/or downward) that has made contact with device 100. In
some implementations or circumstances, inadvertent contact with a
graphic does not select the graphic. For example, a swipe gesture
that sweeps over an application icon optionally does not select the
corresponding application when the gesture corresponding to
selection is a tap.
[0169] Device 100 optionally also includes one or more physical
buttons, such as "home" or menu button 204. As described
previously, menu button 204 is, optionally, used to navigate to any
application 136 in a set of applications that are, optionally,
executed on device 100. Alternatively, in some embodiments, the
menu button is implemented as a soft key in a GUI displayed on
touch screen 112.
[0170] In one embodiment, device 100 includes touch screen 112,
menu button 204, push button 206 for powering the device on/off and
locking the device, volume adjustment button(s) 208, Subscriber
Identity Module (SIM) card slot 210, head set jack 212, and
docking/charging external port 124. Push button 206 is, optionally,
used to turn the power on/off on the device by depressing the
button and holding the button in the depressed state for a
predefined time interval; to lock the device by depressing the
button and releasing the button before the predefined time interval
has elapsed; and/or to unlock the device or initiate an unlock
process. In an alternative embodiment, device 100 also accepts
verbal input for activation or deactivation of some functions
through microphone 113. Device 100 also, optionally, includes one
or more contact intensity sensors 165 for detecting intensity of
contacts on touch screen 112 and/or one or more tactile output
generators 167 for generating tactile outputs for a user of device
100.
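For illustration only, the push-button timing logic just described
(hold past a predefined interval to power off, release earlier to lock)
can be sketched in Swift; the interval value is an assumption
introduced here:

import Foundation

enum ButtonAction { case powerOff, lock }

// The predefined time interval after which a held press powers the
// device off rather than locking it; 3 seconds is illustrative only.
let predefinedHoldInterval: TimeInterval = 3.0

func action(forHoldDuration duration: TimeInterval) -> ButtonAction {
    return duration >= predefinedHoldInterval ? .powerOff : .lock
}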
[0171] FIG. 3 is a block diagram of an exemplary multifunction
device with a display and a touch-sensitive surface in accordance
with some embodiments. Device 300 need not be portable. In some
embodiments, device 300 is a laptop computer, a desktop computer, a
tablet computer, a multimedia player device, a navigation device,
an educational device (such as a child's learning toy), a gaming
system, or a control device (e.g., a home or industrial
controller). Device 300 typically includes one or more processing
units (CPU's) 310, one or more network or other communications
interfaces 360, memory 370, and one or more communication buses 320
for interconnecting these components. Communication buses 320
optionally include circuitry (sometimes called a chipset) that
interconnects and controls communications between system
components. Device 300 includes input/output (I/O) interface 330
comprising display 340, which is typically a touch screen display.
I/O interface 330 also optionally includes a keyboard and/or mouse
(or other pointing device) 350 and touchpad 355, tactile output
generator 357 for generating tactile outputs on device 300 (e.g.,
similar to tactile output generator(s) 167 described above with
reference to FIG. 1A), sensors 359 (e.g., optical, acceleration,
proximity, touch-sensitive, and/or contact intensity sensors
similar to contact intensity sensor(s) 165 described above with
reference to FIG. 1A). Memory 370 includes high-speed random access
memory, such as DRAM, SRAM, DDR RAM or other random access solid
state memory devices; and optionally includes non-volatile memory,
such as one or more magnetic disk storage devices, optical disk
storage devices, flash memory devices, or other non-volatile solid
state storage devices. Memory 370 optionally includes one or more
storage devices remotely located from CPU(s) 310. In some
embodiments, memory 370 stores programs, modules, and data
structures analogous to the programs, modules, and data structures
stored in memory 102 of portable multifunction device 100 (FIG.
1A), or a subset thereof. Furthermore, memory 370 optionally stores
additional programs, modules, and data structures not present in
memory 102 of portable multifunction device 100. For example,
memory 370 of device 300 optionally stores drawing module 380,
presentation module 382, word processing module 384, website
creation module 386, disk authoring module 388, and/or spreadsheet
module 390, while memory 102 of portable multifunction device 100
(FIG. 1A) optionally does not store these modules.
[0172] Each of the above-identified elements in FIG. 3 is,
optionally, stored in one or more of the previously mentioned
memory devices. Each of the above-identified modules corresponds to
a set of instructions for performing a function described above.
The above-identified modules or programs (i.e., sets of
instructions) need not be implemented as separate software
programs, procedures or modules, and thus various subsets of these
modules are, optionally, combined or otherwise re-arranged in
various embodiments. In some embodiments, memory 370 optionally
stores a subset of the modules and data structures identified
above. Furthermore, memory 370 optionally stores additional modules
and data structures not described above.
[0173] Attention is now directed towards embodiments of user
interfaces ("UI") that are, optionally, implemented on portable
multifunction device 100.
[0174] FIG. 4A illustrates an exemplary user interface for a menu
of applications on portable multifunction device 100 in accordance
with some embodiments. Similar user interfaces are, optionally,
implemented on device 300. In some embodiments, user interface 400
includes the following elements, or a subset or superset thereof:
[0175] Signal strength indicator(s) 402 for wireless communication(s), such as cellular and Wi-Fi signals;
[0176] Time 404;
[0177] Bluetooth indicator 405;
[0178] Battery status indicator 406;
[0179] Tray 408 with icons for frequently used applications, such as:
[0180] Icon 416 for telephone module 138, labeled "Phone," which optionally includes an indicator 414 of the number of missed calls or voicemail messages;
[0181] Icon 418 for e-mail client module 140, labeled "Mail," which optionally includes an indicator 410 of the number of unread e-mails;
[0182] Icon 420 for browser module 147, labeled "Browser;" and
[0183] Icon 422 for video and music player module 152, also referred to as iPod (trademark of Apple Inc.) module 152, labeled "iPod;" and
[0184] Icons for other applications, such as:
[0185] Icon 424 for IM module 141, labeled "Text;"
[0186] Icon 426 for calendar module 148, labeled "Calendar;"
[0187] Icon 428 for image management module 144, labeled "Photos;"
[0188] Icon 430 for camera module 143, labeled "Camera;"
[0189] Icon 432 for online video module 155, labeled "Online Video;"
[0190] Icon 434 for stocks widget 149-2, labeled "Stocks;"
[0191] Icon 436 for map module 154, labeled "Map;"
[0192] Icon 438 for weather widget 149-1, labeled "Weather;"
[0193] Icon 440 for alarm clock widget 149-4, labeled "Clock;"
[0194] Icon 442 for workout support module 142, labeled "Workout Support;"
[0195] Icon 444 for notes module 153, labeled "Notes;" and
[0196] Icon 446 for a settings application or module, which provides access to settings for device 100 and its various applications 136.
[0197] It should be noted that the icon labels illustrated in FIG.
4A are merely exemplary. For example, icon 422 for video and music
player module 152 is, optionally, labeled "Music" or "Music Player." Other
labels are, optionally, used for various application icons. In some
embodiments, a label for a respective application icon includes a
name of an application corresponding to the respective application
icon. In some embodiments, a label for a particular application
icon is distinct from a name of an application corresponding to the
particular application icon.
[0198] FIG. 4B illustrates an exemplary user interface on a device
(e.g., device 300, FIG. 3) with a touch-sensitive surface 451
(e.g., a tablet or touchpad 355, FIG. 3) that is separate from the
display 450 (e.g., touch screen display 112). Device 300 also,
optionally, includes one or more contact intensity sensors (e.g.,
one or more of sensors 359) for detecting intensity of contacts on
touch-sensitive surface 451 and/or one or more tactile output
generators 357 for generating tactile outputs for a user of device
300.
[0199] Although some of the examples which follow will be given
with reference to inputs on touch screen display 112 (where the
touch-sensitive surface and the display are combined), in some
embodiments, the device detects inputs on a touch-sensitive surface
that is separate from the display, as shown in FIG. 4B. In some
embodiments, the touch-sensitive surface (e.g., 451 in FIG. 4B) has
a primary axis (e.g., 452 in FIG. 4B) that corresponds to a primary
axis (e.g., 453 in FIG. 4B) on the display (e.g., 450). In
accordance with these embodiments, the device detects contacts
(e.g., 460 and 462 in FIG. 4B) with the touch-sensitive surface 451
at locations that correspond to respective locations on the display
(e.g., in FIG. 4B, 460 corresponds to 468 and 462 corresponds to
470). In this way, user inputs (e.g., contacts 460 and 462, and
movements thereof) detected by the device on the touch-sensitive
surface (e.g., 451 in FIG. 4B) are used by the device to manipulate
the user interface on the display (e.g., 450 in FIG. 4B) of the
multifunction device when the touch-sensitive surface is separate
from the display. It should be understood that similar methods are,
optionally, used for other user interfaces described herein.
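One plausible realization of this correspondence, sketched below in Swift with hypothetical names and under the assumption of a simple linear mapping, normalizes the contact's position within the touch-sensitive surface and rescales it into the display's coordinate space:

    import Foundation

    // Maps a contact location on a separate touch-sensitive surface to
    // the corresponding location on the display, so that the primary
    // axes of the two coordinate spaces correspond (cf. axes 452/453).
    func displayLocation(ofContactAt point: CGPoint,
                         surface: CGRect,
                         display: CGRect) -> CGPoint {
        let normalizedX = (point.x - surface.minX) / surface.width
        let normalizedY = (point.y - surface.minY) / surface.height
        return CGPoint(x: display.minX + normalizedX * display.width,
                       y: display.minY + normalizedY * display.height)
    }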
[0200] Additionally, while the following examples are given
primarily with reference to finger inputs (e.g., finger contacts,
finger tap gestures, finger swipe gestures), it should be
understood that, in some embodiments, one or more of the finger
inputs are replaced with input from another input device (e.g., a
mouse-based input or stylus input). For example, a swipe gesture
is, optionally, replaced with a mouse click (e.g., instead of a
contact) followed by movement of the cursor along the path of the
swipe (e.g., instead of movement of the contact). As another
example, a tap gesture is, optionally, replaced with a mouse click
while the cursor is located over the location of the tap gesture
(e.g., instead of detection of the contact followed by ceasing to
detect the contact). Similarly, when multiple user inputs are
simultaneously detected, it should be understood that multiple
computer mice are, optionally, used simultaneously, or a mouse and
finger contacts are, optionally, used simultaneously.
[0201] As used herein, the term "focus selector" refers to an input
element that indicates a current part of a user interface with
which a user is interacting. In some implementations that include a
cursor or other location marker, the cursor acts as a "focus
selector," so that when an input (e.g., a press input) is detected
on a touch-sensitive surface (e.g., touchpad 355 in FIG. 3 or
touch-sensitive surface 451 in FIG. 4B) while the cursor is over a
particular user interface element (e.g., a button, window, slider
or other user interface element), the particular user interface
element is adjusted in accordance with the detected input. In some
implementations that include a touch-screen display (e.g.,
touch-sensitive display system 112 in FIG. 1A or touch screen 112
in FIG. 4A) that enables direct interaction with user interface
elements on the touch-screen display, a detected contact on the
touch-screen acts as a "focus selector," so that when an input
(e.g., a press input by the contact) is detected on the
touch-screen display at a location of a particular user interface
element (e.g., a button, window, slider or other user interface
element), the particular user interface element is adjusted in
accordance with the detected input. In some implementations focus
is moved from one region of a user interface to another region of
the user interface without corresponding movement of a cursor or
movement of a contact on a touch-screen display (e.g., by using a
tab key or arrow keys to move focus from one button to another
button); in these implementations, the focus selector moves in
accordance with movement of focus between different regions of the
user interface. Without regard to the specific form taken by the
focus selector, the focus selector is generally the user interface
element (or contact on a touch-screen display) that is controlled
by the user so as to communicate the user's intended interaction
with the user interface (e.g., by indicating, to the device, the
element of the user interface with which the user is intending to
interact). For example, the location of a focus selector (e.g., a
cursor, a contact or a selection box) over a respective button
while a press input is detected on the touch-sensitive surface
(e.g., a touchpad or touch screen) will indicate that the user is
intending to activate the respective button (as opposed to other
user interface elements shown on a display of the device).
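The following Swift sketch captures the three forms of focus selector described above and resolves the element a press input should act on; the types and names are illustrative assumptions, not any device's actual API:

    import Foundation

    // The focus selector is whatever currently routes input: a cursor
    // position, the contact itself on a touch screen, or keyboard
    // focus moved with tab or arrow keys.
    enum FocusSelector {
        case cursor(CGPoint)                  // pointer-based interfaces
        case contact(CGPoint)                 // direct-touch interfaces
        case keyboardFocus(elementID: String) // tab/arrow-key focus
    }

    struct Element {
        let id: String
        let frame: CGRect
    }

    // Resolves the element the user intends to act on when a press
    // input arrives, given the current focus selector.
    func target(of selector: FocusSelector,
                in elements: [Element]) -> Element? {
        switch selector {
        case .cursor(let point), .contact(let point):
            return elements.first { $0.frame.contains(point) }
        case .keyboardFocus(let id):
            return elements.first { $0.id == id }
        }
    }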
[0202] The user interface figures described below include various
intensity diagrams that show the current intensity of the contact
on the touch-sensitive surface relative to one or more intensity
thresholds (e.g., a contact detection intensity threshold IT.sub.0,
a light press intensity threshold IT.sub.L, a deep press intensity
threshold IT.sub.D, and/or one or more other intensity thresholds).
These intensity diagrams are typically not part of the displayed
user interface, but are provided to aid in the interpretation of
the figures. In some embodiments, the light press intensity threshold
corresponds to an intensity at which the device will perform
operations typically associated with clicking a button of a
physical mouse or a trackpad. In some embodiments, the deep press
intensity threshold corresponds to an intensity at which the device
will perform operations that are different from operations
typically associated with clicking a button of a physical mouse or
a trackpad. In some embodiments, when a contact is detected with an
intensity below the light press intensity threshold (e.g., and
above a nominal contact-detection intensity threshold IT.sub.0
below which the contact is no longer detected), the device will
move a focus selector in accordance with movement of the contact on
the touch-sensitive surface without performing an operation
associated with the light press intensity threshold or the deep
press intensity threshold. Generally, unless otherwise stated,
these intensity thresholds are consistent between different sets of
user interface figures.
[0203] An increase of intensity of the contact from an intensity
below the light press intensity threshold IT.sub.L to an intensity
between the light press intensity threshold IT.sub.L and the deep
press intensity threshold IT.sub.D is sometimes referred to as a
"light press" input. An increase of intensity of the contact from
an intensity below the deep press intensity threshold IT.sub.D to
an intensity above the deep press intensity threshold IT.sub.D is
sometimes referred to as a "deep press" input. An increase of
intensity of the contact from an intensity below the
contact-detection intensity threshold IT.sub.0 to an intensity
between the contact-detection intensity threshold IT.sub.0 and the
light press intensity threshold IT.sub.L is sometimes referred to
as detecting the contact on the touch-surface. A decrease of
intensity of the contact from an intensity above the
contact-detection intensity threshold IT.sub.0 to an intensity
below the contact-detection intensity threshold IT.sub.0 is sometimes
referred to as detecting liftoff of the contact from the
touch-surface. In some embodiments IT.sub.0 is zero. In some
embodiments IT.sub.0 is greater than zero. In some illustrations a
shaded circle or oval is used to represent intensity of a contact
on the touch-sensitive surface. In some illustrations a circle or
oval without shading is used to represent a respective contact on the
touch-sensitive surface without specifying the intensity of the
respective contact.
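As a schematic illustration of these thresholds, the following Swift sketch classifies a contact's intensity; the numeric values are arbitrary placeholders, since IT0, ITL and ITD are device-calibrated and not specified here:

    enum ContactState {
        case noContact   // below IT0: no contact is detected
        case tracking    // between IT0 and ITL: move the focus selector only
        case lightPress  // between ITL and ITD
        case deepPress   // at or above ITD
    }

    // Illustrative placeholder values only.
    let contactDetectionThreshold = 0.05 // IT0
    let lightPressThreshold = 0.30       // ITL
    let deepPressThreshold = 0.70        // ITD

    func contactState(forIntensity intensity: Double) -> ContactState {
        switch intensity {
        case ..<contactDetectionThreshold: return .noContact
        case ..<lightPressThreshold:       return .tracking
        case ..<deepPressThreshold:        return .lightPress
        default:                           return .deepPress
        }
    }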
[0204] In some embodiments described herein, one or more operations
are performed in response to detecting a gesture that includes a
respective press input or in response to detecting the respective
press input performed with a respective contact (or a plurality of
contacts), where the respective press input is detected based at
least in part on detecting an increase in intensity of the contact
(or plurality of contacts) above a press-input intensity threshold.
In some embodiments, the respective operation is performed in
response to detecting the increase in intensity of the respective
contact above the press-input intensity threshold (e.g., a "down
stroke" of the respective press input). In some embodiments, the
press input includes an increase in intensity of the respective
contact above the press-input intensity threshold and a subsequent
decrease in intensity of the contact below the press-input
intensity threshold, and the respective operation is performed in
response to detecting the subsequent decrease in intensity of the
respective contact below the press-input threshold (e.g., an "up
stroke" of the respective press input).
[0205] In some embodiments, the device employs intensity hysteresis
to avoid accidental inputs sometimes termed "jitter," where the
device defines or selects a hysteresis intensity threshold with a
predefined relationship to the press-input intensity threshold
(e.g., the hysteresis intensity threshold is X intensity units
lower than the press-input intensity threshold or the hysteresis
intensity threshold is 75%, 90% or some reasonable proportion of
the press-input intensity threshold). Thus, in some embodiments,
the press input includes an increase in intensity of the respective
contact above the press-input intensity threshold and a subsequent
decrease in intensity of the contact below the hysteresis intensity
threshold that corresponds to the press-input intensity threshold,
and the respective operation is performed in response to detecting
the subsequent decrease in intensity of the respective contact
below the hysteresis intensity threshold (e.g., an "up stroke" of
the respective press input). Similarly, in some embodiments, the
press input is detected only when the device detects an increase in
intensity of the contact from an intensity at or below the
hysteresis intensity threshold to an intensity at or above the
press-input intensity threshold and, optionally, a subsequent
decrease in intensity of the contact to an intensity at or below
the hysteresis intensity, and the respective operation is performed
in response to detecting the press input (e.g., the increase in
intensity of the contact or the decrease in intensity of the
contact, depending on the circumstances).
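The hysteresis behavior can be sketched as follows (Swift; the 75% proportion is one of the examples given above, and the names are illustrative):

    enum PressEvent { case pressed, released }

    // Press detection with hysteresis: the press is recognized when
    // intensity rises above the press-input threshold, but is not
    // released until intensity falls below a lower hysteresis
    // threshold, which suppresses jitter around the press threshold.
    struct HysteresisPressDetector {
        let pressThreshold: Double
        var hysteresisThreshold: Double { pressThreshold * 0.75 }
        private var isPressed = false

        mutating func feed(_ intensity: Double) -> PressEvent? {
            if !isPressed && intensity >= pressThreshold {
                isPressed = true
                return .pressed
            }
            if isPressed && intensity <= hysteresisThreshold {
                isPressed = false
                return .released
            }
            // Fluctuations between the two thresholds are ignored.
            return nil
        }
    }

The asymmetry of the two thresholds is the design point: a single threshold would emit spurious press/release pairs whenever the intensity hovered near it.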
[0206] For ease of explanation, operations described as being
performed in response to a press input associated with a
press-input intensity threshold, or in response to a gesture
including the press input, are, optionally, triggered in response to
detecting any of the following: an increase in intensity of a contact above the
press-input intensity threshold, an increase in intensity of a
contact from an intensity below the hysteresis intensity threshold
to an intensity above the press-input intensity threshold, a
decrease in intensity of the contact below the press-input
intensity threshold, and/or a decrease in intensity of the contact
below the hysteresis intensity threshold corresponding to the
press-input intensity threshold. Additionally, in examples where an
operation is described as being performed in response to detecting
a decrease in intensity of a contact below the press-input
intensity threshold, the operation is, optionally, performed in
response to detecting a decrease in intensity of the contact below
a hysteresis intensity threshold corresponding to, and lower than,
the press-input intensity threshold.
User Interfaces and Associated Processes
Providing Tactile Feedback for Operations Performed in a User
Interface
[0207] Many electronic devices have graphical user interfaces that
include user interface objects. There are usually many operations
which can be performed on the interface objects. For example, an
interface object can be snapped to a guideline or removed from a
guideline. Another example would be moving a user interface object
(e.g., a file) into or out of a folder. There is often a need to
provide efficient and convenient ways for users to receive feedback
for operations performed on these user interface objects. The
embodiments below improve on existing methods by generating tactile
outputs for the user corresponding to the operations performed.
[0208] FIGS. 5A-5O illustrate exemplary user interfaces for
providing tactile feedback for operations performed in a user
interface in accordance with some embodiments. The user interfaces
in these figures are used to illustrate the processes described
below, including the processes described below with reference to
FIGS. 6A-6C.
[0209] FIG. 5A illustrates an example of a user interface that
includes a user interface object. User interface 10200 in FIGS.
5A-5C is displayed on display 450 of a device (e.g., device 300)
and is responsive to contacts (e.g., a finger contact) on
touch-sensitive surface 451. User interface 10200 includes user
interface object 10202 and, in some embodiments, object placement
guide 10208. FIG. 5A further illustrates contact 10204 at position
10204-a on touch-sensitive surface 451 and a displayed
representation of a focus selector (e.g., cursor 10206)
corresponding to contact 10204.
[0210] In some embodiments, the device is an electronic device with
a separate display (e.g., display 450) and a separate
touch-sensitive surface (e.g., touch-sensitive surface 451). In
some embodiments, the device is portable multifunction device 100,
the display is touch-sensitive display system 112, and the
touch-sensitive surface includes tactile output generators 167 on
the display (FIG. 1A). For convenience of explanation, the
embodiments described with reference to FIGS. 5A-5O and FIGS. 6A-6C
will be discussed with reference to display 450 and a separate
touch-sensitive surface 451; however, analogous operations are,
optionally, performed on a device with a touch-sensitive display
system 112 in response to detecting the contacts described in FIGS.
5A-5O on the touch-sensitive display system 112 while displaying
the user interfaces shown in FIGS. 5A-5O on the touch-sensitive
display system 112; in such embodiments, the focus selector is,
optionally: a respective contact, a representative point
corresponding to a contact (e.g., a centroid of a respective
contact or a point associated with a respective contact), or a
centroid of two or more contacts detected on the touch-sensitive
display system 112, in place of cursor 10206.
[0211] FIGS. 5A-5B illustrate an example of performing an operation
that includes snapping a user interface object into a respective
object placement guide. In this example, the device detects contact
10204 and movement 10210 of contact 10204 from position 10204-a in
FIG. 5A to position 10204-b in FIG. 5B on touch-sensitive surface
451. In response to detecting movement 10210, which corresponds to
moving user interface object 10202 within a snapping distance of
object placement guide 10208, the device snaps user interface
object 10202 into object placement guide 10208 and generates
tactile output 10211 on touch-sensitive surface 451.
[0212] FIGS. 5B-5C illustrate an example of reversing the operation
shown in FIGS. 5A-5B by snapping a user interface object out of a
respective object placement guide. In this example, the device
detects contact 10204 and movement 10212 of contact 10204 from
position 10204-b in FIG. 5B to position 10204-c in FIG. 5C on
touch-sensitive surface 451. In response to detecting movement
10212, which corresponds to moving user interface object 10202 at
or beyond an unsnapping distance of object placement guide 10208,
the device snaps user interface object 10202 out of object
placement guide 10208 (e.g., reversing the operation described
above with reference to FIGS. 5A-5B). The device also generates
tactile output 10213 on touch-sensitive surface 451 in response to
movement 10212 of contact 10204 that corresponds to snapping user
interface object 10202 out of object placement guide 10208.
[0213] In some embodiments, the device compares a respective amount
of time between movement 10210 in FIG. 5B and movement 10212 in
FIG. 5C with a predefined time threshold (e.g., the device
determines a magnitude of a pause time between the end of the
movement 10210 and the beginning of movement 10212 and compares the
magnitude of the pause time with the predefined time threshold). In
some embodiments, in accordance with a determination that the
respective amount of time (e.g., the pause time) between movement
10210 and movement 10212 is less than the predefined time
threshold, the device forgoes generating tactile output 10211
and/or tactile output 10213. In contrast, in some embodiments, in
accordance with a determination that the respective amount of time
(e.g., the pause time) between movement 10210 and movement 10212 is
greater than the predefined time threshold, the device generates
tactile output 10211 and tactile output 10213, as described in
greater detail above. In some embodiments, when the respective
amount of time is less than the predefined time threshold, the
tactile output corresponding to performing the operation (e.g.,
tactile output 10211) is generated and the tactile output
corresponding to reversing the operation (e.g., tactile output
10213) is not generated. In some embodiments, when the respective
amount of time is less than the predefined time threshold, the
tactile output corresponding to performing the operation (e.g.,
tactile output 10211) and the tactile output corresponding to
reversing the operation (e.g., tactile output 10213) are both not
generated. In some embodiments, when the respective amount of time
is less than the predefined time threshold, the tactile output
corresponding to performing the operation (e.g., tactile output
10211) is not generated and the tactile output corresponding to
reversing the operation (e.g., tactile output 10213) is generated.
Forgoing generating one or more tactile outputs corresponding to
performing and reversing the operation when the operation is
performed and reversed quickly (e.g., before the predefined time
threshold has elapsed) prevents tactile outputs from being
generated when the user accidentally performs and then reverses the
operation. These tactile outputs, if generated, would likely
confuse the user or distract the user from other, more important
tactile and visual feedback. As such, selectively suppressing
(e.g., forgoing) generating tactile outputs based on one or more
predefined time thresholds, as described above, provides a more
efficient and intuitive user interface, thereby improving the user
experience when interacting with the user interface.
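The suppression decision can be sketched as follows (Swift; the 0.3 s pause threshold is an illustrative assumption, and this variant suppresses both outputs, whereas, as described above, some embodiments suppress only one of the two):

    import Foundation

    // Illustrative placeholder for the predefined time threshold.
    let pauseTimeThreshold: TimeInterval = 0.3

    // Decides which tactile outputs to generate given the pause between
    // the end of the first movement (perform) and the beginning of the
    // second movement (reverse). A quick perform-then-reverse is
    // treated as accidental, so both outputs are suppressed.
    func tactileOutputsToGenerate(pause: TimeInterval)
        -> (onPerform: Bool, onReverse: Bool) {
        if pause < pauseTimeThreshold {
            return (onPerform: false, onReverse: false)
        }
        return (onPerform: true, onReverse: true)
    }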
[0214] FIG. 5D illustrates another example of a user interface that
includes a user interface object. In this example, user interface
10200 in FIGS. 5D-5F includes user interface object 10222 and
folder 10228 which is a trash folder used to mark data for
deletion. FIG. 5D further illustrates contact 10224 at position
10224-a on touch-sensitive surface 451 and a displayed
representation of a focus selector (e.g., cursor 10206)
corresponding to contact 10224.
[0215] FIGS. 5D-5E illustrate an example of performing an
operation. In this example, performing an operation includes
marking data corresponding to a user interface object for deletion.
In this example, the device detects contact 10224 and movement
10230 of contact 10224 from position 10224-a in FIG. 5D to position
10224-b in FIG. 5E on touch-sensitive surface 451. In response to
detecting movement 10230, the device moves user interface object
10222 over folder 10228, as shown in FIG. 5E, marks data
corresponding to user interface object 10222 for deletion, and
generates tactile output 10231 on touch-sensitive surface 451.
Optionally, marking the user interface object for deletion is
performed in response to detecting an input such as liftoff of the
contact.
[0216] FIGS. 5E-5F illustrate an example of reversing the operation
shown in FIGS. 5D-5E, by unmarking data corresponding to a user
interface object for deletion. In this example, the device detects
contact 10224 and movement 10232 of contact 10224 from position
10224-b in FIG. 5E to position 10224-c in FIG. 5F on
touch-sensitive surface 451. In response to detecting movement
10232, the device moves user interface object 10222 away from folder
icon 10228 and unmarks data corresponding to user interface object
10222 for deletion (e.g., reversing the data marking operation
described above with reference to FIGS. 5D-5E). The device also
generates tactile output 10233 on touch-sensitive surface 451 in
response to movement 10232 of contact 10224 that corresponds to
unmarking data corresponding to user interface object 10222 for deletion. In
circumstances where marking the user interface object for deletion
is performed in response to detecting an input such as liftoff of
the contact, contact 10224 in FIG. 5F is optionally a different
contact from contact 10224 in FIG. 5E.
[0217] FIG. 5G illustrates another example of a user interface that
includes a user interface object. In this example, user interface
10200 in FIGS. 5G-5I includes user interface object 10242,
representing a file in this example, and folder 10248 representing
a directory in a file system. FIG. 5G further illustrates contact
10244 at position 10244-a on touch-sensitive surface 451 and a
displayed representation of a focus selector (e.g., cursor 10206)
corresponding to contact 10244.
[0218] FIGS. 5G-5H illustrate another example of performing an
operation. In this example the operation includes placing a file in
a directory. In this example, the device detects contact 10244 and
movement 10250 of contact 10244 from position 10244-a in FIG. 5G to
position 10244-b in FIG. 5H on touch-sensitive surface 451. In
response to detecting movement 10250, the device moves user
interface object 10242 over folder 10248, places the file,
represented by user interface object 10242, in the directory,
represented by folder 10248, and generates tactile output 10251 on
touch-sensitive surface 451. Optionally, placing the file
represented by the user interface object in the folder is performed
in response to detecting an input such as liftoff of the
contact.
[0219] FIGS. 5H-5I illustrate an example of reversing the operation
shown in FIGS. 5G-5H by removing a file from a directory. In this
example, the device detects contact 10244 and movement 10252 of
contact 10244 from position 10244-b in FIG. 5H to position 10244-c
in FIG. 5I on touch-sensitive surface 451. In response to detecting
movement 10252, the device moves user interface object 10242 away
from folder icon 10248 and removes the file represented by user
interface object 10242 from the directory represented by folder
10248 (e.g., reversing the operation described above with reference
to FIGS. 5G-5H). The device also generates tactile output 10253 on
touch-sensitive surface 451 in response to movement 10252 of contact
10244 that corresponds to removing the file from the directory. In
circumstances where placing the file represented by the user
interface object in the folder is performed in response to
detecting an input such as liftoff of the contact, contact 10244 in
FIG. 5I is optionally a different contact from contact 10244 in
FIG. 5H.
[0220] FIG. 5J illustrates another example of a user interface that
includes a user interface object. In this example, user interface
10200 in FIGS. 5J-5L includes user interface object 10262,
corresponding to an application (or an application launch icon) and
application launch region 10268. FIG. 5J further illustrates
contact 10264 at position 10264-a on touch-sensitive surface 451
and a displayed representation of a focus selector (e.g., cursor
10206) corresponding to contact 10264.
[0221] FIGS. 5J-5K illustrate an example of performing an
operation. In this example the operation includes placing a user
interface object in an application launch region. In this example,
the device detects contact 10264 and movement 10270 of contact
10264 from position 10264-a in FIG. 5J to position 10264-b in FIG.
5K on touch-sensitive surface 451. In response to detecting
movement 10270, the device moves user interface object 10262 over
application launch region 10268 and places user interface object
10262 in application launch region 10268 and generates tactile
output 10271 on touch-sensitive surface 451. Optionally, placing
the user interface object in the application launch region is
performed in response to detecting an input such as liftoff of the
contact.
[0222] FIGS. 5K-5L illustrate an example of reversing the operation
shown in FIGS. 5J-5K by removing a user interface object from an
application launch region. In this example, the device detects
contact 10264 and movement 10272 of contact 10264 from position
10264-b in FIG. 5K to position 10264-c in FIG. 5L on
touch-sensitive surface 451. In response to detecting movement
10272, the device moves user interface object 10262 away from
application launch region 10268 and removes user interface object
10262 from application launch region 10268 (e.g., reversing the
operation described above with reference to FIGS. 5J-5K). The
device also generates tactile output 10273 on touch-sensitive
surface 451 in response to movement 10272 of contact 10264 that
corresponds to removing user interface object 10262 from
application launch region 10268. In circumstances where placing the
user interface object in the application launch region is performed
in response to detecting an input such as liftoff of the contact,
contact 10264 in FIG. 5L is optionally a different contact from
contact 10264 in FIG. 5K.
[0223] FIGS. 5M-5O illustrate example waveforms of movement
profiles for generating the tactile output. FIG. 5M illustrates a
triangle waveform with period 10280-1. FIG. 5N illustrates a square
waveform with period 10280-2, and FIG. 5O illustrates a sawtooth
waveform with period 10280-3. One of these movement profiles
illustrated in FIGS. 5M-5O is, optionally, utilized when
generating a tactile output corresponding to performing an operation
(e.g., tactile outputs 10211, 10231, 10251 or 10271) or reversing a
performed operation (e.g., tactile outputs 10213, 10233, 10253 or
10273), as discussed above. In some embodiments, another waveform
is used to generate tactile outputs corresponding to the different
operations described with reference to FIGS. 5A-5L, above. In some
embodiments the tactile outputs corresponding to performing an
operation (e.g., tactile outputs 10211, 10231, 10251 or 10271) are
generated using the same waveform. In some embodiments the tactile
outputs corresponding to reversing an operation (e.g., tactile
outputs 10213, 10233, 10253 or 10273) are generated using the same
waveform. In some embodiments the tactile outputs corresponding to
performing an operation (e.g., tactile outputs 10211, 10231, 10251
or 10271) are generated using a first waveform that is different
from a second waveform used to generate the tactile outputs
corresponding to reversing an operation (e.g., tactile outputs
10213, 10233, 10253 or 10273).
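The three profiles of FIGS. 5M-5O can be sampled as follows (a minimal Swift sketch at unit amplitude; an actuator driving the touch-sensitive surface would scale the result by the desired output amplitude):

    import Foundation

    enum Waveform { case triangle, square, sawtooth }

    // Samples one period of a movement profile at unit amplitude.
    func sample(_ waveform: Waveform, atPhase t: Double) -> Double {
        let p = t - floor(t) // wrap the phase into [0, 1)
        switch waveform {
        case .square:   return p < 0.5 ? 1.0 : -1.0
        case .sawtooth: return 2.0 * p - 1.0
        case .triangle: return p < 0.5 ? 4.0 * p - 1.0 : 3.0 - 4.0 * p
        }
    }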
[0224] FIGS. 6A-6C are flow diagrams illustrating a method 10300 of
providing tactile feedback for operations performed in a user
interface in accordance with some embodiments. Method 10300 is
performed at an electronic device (e.g., device 300, FIG. 3, or
portable multifunction device 100, FIG. 1A) with a display and a
touch-sensitive surface. In some embodiments, the display is a
touch screen display and the touch-sensitive surface is on the
display. In some embodiments, the display is separate from the
touch-sensitive surface. Some operations in method 10300 are,
optionally, combined and/or the order of some operations is,
optionally, changed.
[0225] As described below, the method 10300 provides an intuitive
way to provide tactile feedback for operations performed in a user
interface. The method reduces the cognitive burden on a user when
performing operations in a user interface, thereby creating a more
efficient human-machine interface. For battery-operated electronic
devices, enabling a user to perform operations in a user interface
faster and more efficiently conserves power and increases the time
between battery charges.
[0226] The device displays (10302) a user interface object. FIG.
5A, for example, shows user interface object 10202, displayed in
graphical user interface 10200. The device detects (10304) a
contact (e.g., a finger contact) on the touch-sensitive surface.
For example, FIG. 5A shows contact 10204 at position 10204-a on
touch-sensitive surface 451.
[0227] The device detects (10306) a first movement of the contact
across the touch-sensitive surface, where the first movement
corresponds to performing an operation on the user interface
object. For example, FIG. 5B shows contact 10204 and subsequent
movement 10210 on touch-sensitive surface 451 corresponding to
snapping user interface object 10202 into object placement guide
10208. In response to detecting (10310) the first movement, the
device performs (10312) the operation and generates (10314) a first
tactile output on the touch-sensitive surface. FIG. 5B, for
example, shows user interface object 10202 snapping into object
placement guide 10208 and tactile output 10211 generated on
touch-sensitive surface 451.
[0228] The device detects (10316) a second movement of the contact
across the touch-sensitive surface, where the second movement
corresponds to reversing the operation on the user interface
object. For example, FIG. 5C shows contact 10204 and subsequent
movement 10212 on touch-sensitive surface corresponding to snapping
user interface object 10202 out of object placement guide
10208.
[0229] In some embodiments or circumstances, the first movement of
the contact across the touch-sensitive surface and the second
movement of the contact across the touch-sensitive surface are
(10318) part of a single continuous gesture performed without
detecting a liftoff of the contact from the touch-sensitive
surface. In some embodiments, even if there is a pause in movement
of the contact, the first movement and the second movement are
considered to be part of the same continuous gesture as long as the
contact continues to be detected on the touch-sensitive surface. In
some embodiments, if the same first and second movement are
detected as part of two different gestures (e.g., there is a
liftoff of the contact between when the first movement is detected
and when the second movement is detected), then the same tactile
output is generated in response to both performing the operation
and reversing the operation. In some embodiments, if the same
first and second movement are detected as part of two different
gestures (e.g., there is a liftoff of the contact between when the
first movement is detected and when the second movement is
detected) then different tactile outputs are still generated in
response to both performing the operation and reversing the
operation. For example, FIGS. 5A-5C show contact 10204 moving from
position 10204-a to position 10204-b (shown in FIGS. 5A-5B) which
corresponds to snapping user interface object 10202 to object
placement guide 10208, then moving from position 10204-b to
position 10204-c (shown in FIGS. 5B-5C) which corresponds to
unsnapping user interface object 10202 from object placement guide
10208. In this example, tactile output 10211 is generated in
response to contact 10204 moving from position 10204-a to position
10204-b and tactile output 10213 is generated in response to
contact 10204 moving from position 10204-b to position 10204-c. In
some embodiments, tactile output 10213 (sometimes called the second
tactile output) is generated in response to detecting a movement
corresponding to reversing a prior operation on a respective user
interface object. In some embodiments, tactile output 10211
(sometimes called the first tactile output) is generated in
response to detecting a movement corresponding to performing an
operation that is not a reversal of a prior operation (e.g., an
immediately prior operation) on a respective user interface
object.
[0230] In response to detecting (10320) the second movement, the
device reverses (10322) the operation. It should be understood that
reversing an operation does not necessarily entail performing an
exact mirror image of the procedure undertaken to perform the
operation. For example, to snap an object to a guide the object is
moved to a position within a snapping distance from the guide,
while to move an object away from the guide, movement of a contact
is detected that corresponds to movement of the object more than an
unsnapping distance from the guide, without respect to the
particular path taken by the contact or the object. FIG. 5C, for
example, shows user interface object 10202 unsnapping from object
placement guide 10208 and tactile output 10213, different from
tactile output 10211 shown in FIG. 5B, being generated on
touch-sensitive surface 451.
[0231] In some embodiments, performing the operation includes
snapping (10324) the user interface object (e.g., a picture, text
box, shape or some other moveable user interface object) into a
respective object placement guide and reversing the operation
includes snapping the user interface object out of the respective
object placement guide. In some embodiments, snapping a user
interface object into a respective object placement guide includes
detecting user-controlled movement of the user interface object
within a predefined distance from the respective object placement
guide and, in response to detecting the user-controlled movement of
the user interface object within the predefined distance of the
respective object placement guide, automatically moving (e.g., via
device-controlled movement) the respective user interface object
adjacent to the respective object placement guide. In some
embodiments, once the user interface object has been snapped into a
respective object placement guide, subsequent movement of the
contact across the touch-sensitive surface does not cause movement
of the user interface object until a predefined precondition is
met. In particular, in some embodiments, snapping a user interface
object out of a respective object placement guide includes
detecting movement of a contact that would correspond to
user-controlled movement of the user interface object more than a
predefined distance away from the respective object placement guide
if the object were not snapped to the respective object placement
guide and in response to detecting the movement of the contact,
automatically moving (e.g., via device-controlled movement) the
respective user interface object to a location on the display that
is away from the respective object placement guide in accordance
with the movement of the contact on the touch-sensitive surface.
For example, FIGS. 5A-5C show user interface object 10202 snapping
to object placement guide 10208 (FIGS. 5A-5B), then snapping user
interface object 10202 out of object placement guide 10208 (FIGS.
5B-5C).
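The snap/unsnap logic, including the asymmetry between the snapping and unsnapping distances, can be sketched as follows for a vertical guide (Swift; the distances are illustrative assumptions, and `proposedX` stands for the user-controlled position implied by the contact):

    import Foundation

    // Illustrative distances only.
    let snapDistance: CGFloat = 8
    let unsnapDistance: CGFloat = 24

    struct DragState {
        var snappedToGuide = false
    }

    // Resolves the object's x position from the user-controlled
    // position implied by the contact. Once snapped, the object stays
    // on the guide until the contact pulls far enough away.
    func resolvedX(proposedX: CGFloat, guideX: CGFloat,
                   state: inout DragState) -> CGFloat {
        let distance = abs(proposedX - guideX)
        if state.snappedToGuide {
            if distance >= unsnapDistance {
                state.snappedToGuide = false
                return proposedX
            }
            return guideX
        }
        if distance <= snapDistance {
            state.snappedToGuide = true
            return guideX
        }
        return proposedX
    }

Making the unsnapping distance larger than the snapping distance mirrors the hysteresis described earlier for intensity thresholds: it prevents the object from flickering on and off the guide near the snapping boundary.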
[0232] In some embodiments, performing the operation includes
marking (10326) data corresponding to the user interface object for
deletion (e.g., placing an icon corresponding to a file in a trash
or recycle folder) and reversing the operation includes unmarking
the data corresponding to the user interface object for deletion
(e.g., removing/restoring an icon corresponding to a file from a
trash or recycle folder). FIGS. 5D-5F, for example, show user
interface object 10222 moving over trash folder 10228 and causing
the device to, in response, mark data corresponding to user
interface object 10222 for deletion (FIGS. 5D-5E) then moving user
interface object 10222 away from trash folder 10228 and causing the
device to, in response, unmark data corresponding to user interface
object 10222 for deletion (FIGS. 5E-5F).
[0233] In some embodiments, the user interface object corresponds
to a file, performing the operation includes placing (10328) the
file in a directory and reversing the operation includes removing
the file from the directory. For example, in some embodiments,
performing the operation includes moving a user interface object
that is a graphical representation of a file to a location
corresponding to a folder icon that represents the directory and
reversing the operation includes removing the icon from a graphical
representation of the folder/directory. FIGS. 5G-5I, for example,
show user interface object 10242, corresponding to a file, moving
over folder 10248 and causing the device to, in response, place the
file in a directory represented by folder 10248 (FIGS. 5G-5H), then
moving user interface object 10242 away from folder 10248, causing
the device to, in response, remove the file from the directory
(FIGS. 5H-5I).
[0234] In some embodiments, the user interface object corresponds
to an application, performing the operation includes placing
(10330) the user interface object in an application launch region
and reversing the operation includes removing the user interface
object from the application launch region. Examples of an
application launch region include a dock or a quick launch bar.
FIGS. 5J-5L, for example, show user interface object 10262,
corresponding to an application, moving over application launch
region 10268 and causing the device to, in response, place user
interface object 10262 in application launch region 10268 (FIGS.
5J-5K) then moving user interface object 10262 away from
application launch region 10268 and causing the device to, in
response, remove user interface object 10262 from application
launch region 10268 (FIGS. 5K-5L).
[0235] In response to detecting (10320) the second movement, in
addition to reversing the operation, the device generates (10332) a
second tactile output on the touch-sensitive surface, where the
second tactile output is different from the first tactile output.
As a result, the device undoes the previously performed operation.
In some embodiments the first tactile output is different from the
second tactile output based on differences in amplitudes of the
tactile outputs. In some embodiments, the first type of tactile
output is generated by movement of the touch-sensitive surface that
includes a first dominant movement component. For example, the
generated movement corresponds to an initial impulse of the first
tactile output, ignoring any unintended resonance. In some
embodiments, the second type of tactile output is generated by
movement of the touch-sensitive surface that includes a second
dominant movement component. For example, the generated movement
corresponds to an initial impulse of the second tactile output,
ignoring any unintended resonance. In some embodiments, the first
dominant movement component and the second dominant movement
component have (10334) a same movement profile and different
amplitudes. For example, the first dominant movement component and
the second dominant movement component have the same movement
profile when the first dominant movement component and the second
dominant movement component have a same waveform shape, such as
square, sine, sawtooth or triangle, and approximately the same
period.
[0236] In some embodiments the first tactile output is different
from the second tactile output based on differences in movement
profiles of the tactile outputs. In some embodiments, the first
type of tactile output is generated by movement of the
touch-sensitive surface that includes a first dominant movement
component. For example, the generated movement corresponds to an
initial impulse of the first tactile output, ignoring any
unintended resonance. In some embodiments, the second type of
tactile output is generated by movement of the touch-sensitive
surface that includes a second dominant movement component. For
example, the generated movement corresponds to an initial impulse
of the second tactile output, ignoring any unintended resonance. In
some embodiments, the first dominant movement component and the
second dominant movement component have (10336) different movement
profiles and a same amplitude. For example, the first dominant
movement component and the second dominant movement component have
different movement profiles when the first dominant movement
component and the second dominant movement component have a
different waveform shape, such as square, sine, sawtooth or
triangle, and/or a different period.
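The two ways of distinguishing the first and second tactile outputs described in the two preceding paragraphs can be summarized in a short Swift sketch; the TactileOutput type and all values are illustrative assumptions only:

    enum WaveShape { case square, sine, sawtooth, triangle }

    struct TactileOutput {
        let shape: WaveShape   // movement profile: waveform shape...
        let period: Double     // ...and period, in seconds
        let amplitude: Double  // normalized 0...1
    }

    let performOutput =
        TactileOutput(shape: .square, period: 0.01, amplitude: 1.0)

    // Same movement profile, different amplitudes:
    let reverseByAmplitude =
        TactileOutput(shape: .square, period: 0.01, amplitude: 0.5)

    // Different movement profiles, same amplitude:
    let reverseByProfile =
        TactileOutput(shape: .sawtooth, period: 0.01, amplitude: 1.0)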
[0237] It should be understood that the particular order in which
the operations in FIGS. 6A-6C have been described is merely
exemplary and is not intended to indicate that the described order
is the only order in which the operations could be performed. One
of ordinary skill in the art would recognize various ways to
reorder the operations described herein. Additionally, it should be
noted that details of other processes described herein (e.g., those
listed in paragraph [0058]) with respect to other methods described
herein are also applicable in an analogous manner to method 10300
described above with respect to FIGS. 6A-6C. For example, the
contacts, movements, user interface objects, focus selectors, and
tactile outputs described above with reference to method 10300
optionally have one or more of the characteristics of contacts,
movements, user interface objects, focus selectors, and tactile
outputs described herein with reference to other methods described
herein (e.g., those listed in paragraph [0058]). For brevity, these
details are not repeated here.
[0238] In accordance with some embodiments, FIG. 7 shows a
functional block diagram of an electronic device 10400 configured
in accordance with the principles of the various described
embodiments. The functional blocks of the device are, optionally,
implemented by hardware, software, or a combination of hardware and
software to carry out the principles of the various described
embodiments. It is understood by persons of skill in the art that
the functional blocks described in FIG. 7 are, optionally, combined
or separated into sub-blocks to implement the principles of the
various described embodiments. Therefore, the description herein
optionally supports any possible combination or separation or
further definition of the functional blocks described herein.
[0239] As shown in FIG. 7, an electronic device 10400 includes a
display unit 10402 configured to display a user interface object; a
touch-sensitive surface unit 10404 configured to detect user
contacts; and a processing unit 10406 coupled to display unit 10402
and touch-sensitive surface unit 10404. In some embodiments, the
processing unit includes a detecting unit 10408, a display enabling
unit 10410, a performing unit 10412, a generating unit 10414, and a
reversing unit 10416.
[0240] The processing unit 10406 is configured to detect a contact
on the touch-sensitive surface unit, detect a first movement of the
contact across the touch-sensitive surface unit (e.g., with
detecting unit 10408), the first movement corresponding to
performing an operation on the user interface object, and, in
response to detecting the first movement, perform the operation
(e.g., with performing unit 10412) and generate a first tactile
output on the touch-sensitive surface unit (e.g., with generating
unit 10414). The processing unit 10406 is further configured to
detect a second movement of the contact across the touch-sensitive
surface unit (e.g., with detecting unit 10408), the second movement
corresponding to reversing the operation on the user interface
object; and in response to detecting the second movement, reverse
the operation (e.g., with reversing unit 10416) and generate a
second tactile output on the touch-sensitive surface unit (e.g.,
with generating unit 10414), where the second tactile output is
different from the first tactile output.
[0241] In some embodiments, the first movement of the contact
across the touch-sensitive surface unit and the second movement of
the contact across the touch-sensitive surface unit are part of a
single continuous gesture performed without detecting a liftoff of
the contact from the touch-sensitive surface unit 10404.
[0242] In some embodiments, performing the operation (e.g., with
the performing unit 10412) includes snapping the user interface
object into a respective object placement guide and reversing the
operation (e.g., with the reversing unit 10416) includes snapping
the user interface object out of the respective object placement
guide.
[0243] In some embodiments, performing the operation (e.g., with
the performing unit 10412) includes marking data corresponding to
the user interface object for deletion and reversing the operation
(e.g., with reversing unit 10416) includes unmarking the data
corresponding to the user interface object for deletion.
[0244] In some embodiments, the user interface object corresponds
to a file, performing the operation (e.g., with the performing unit
10412) includes placing the file in a directory and reversing the
operation (e.g., with the reversing unit 10416) includes removing
the file from the directory.
[0245] In some embodiments, the user interface object corresponds
to an application, performing the operation (e.g., with the
performing unit 10412) includes placing the user interface object
in an application launch region and reversing the operation (e.g.,
with the reversing unit 10416) includes removing the user interface
object from the application launch region.
[0246] In some embodiments, the first tactile output is generated
(e.g., with the generating unit 10414) by movement of the
touch-sensitive surface unit that includes a first dominant
movement component, the second tactile output is generated (e.g.,
with the generating unit 10414) by movement of the touch-sensitive
surface unit that includes a second dominant movement component,
and the first dominant movement component and the second dominant
movement component have a same movement profile and different
amplitudes.
[0247] In some embodiments, the first tactile output is generated
(e.g., with the generating unit 10414) by movement of the
touch-sensitive surface unit that includes a first dominant
movement component, the second tactile output is generated (e.g.,
with the generating unit 10414) by movement of the touch-sensitive
surface unit that includes a second dominant movement component,
and the first dominant movement component and the second dominant
movement component have different movement profiles and a same
amplitude.
[0248] The operations in the information processing methods
described above are, optionally, implemented by running one or more
functional modules in information processing apparatus such as
general purpose processors (e.g., as described above with respect
to FIGS. 1A and 3) or application specific chips.
[0249] The operations described above with reference to FIGS. 6A-6C
are, optionally, implemented by components depicted in FIGS. 1A-1B
or FIG. 7. For example, detection operations 10304 and 10306,
performing operation 10312, generating operations 10314 and 10332,
and reversing operation 10322 are, optionally, implemented by event
sorter 170, event recognizer 180, and event handler 190. Event
monitor 171 in event sorter 170 detects a contact on
touch-sensitive display 112, and event dispatcher module 174
delivers the event information to application 136-1. A respective
event recognizer 180 of application 136-1 compares the event
information to respective event definitions 186, and determines
whether a first contact at a first location on the touch-sensitive
surface corresponds to a predefined event or sub-event, such as
selection of an object on a user interface. When a respective
predefined event or sub-event is detected, event recognizer 180
activates an event handler 190 associated with the detection of the
event or sub-event. Event handler 190 optionally utilizes or calls
data updater 176 or object updater 177 to update the application
internal state 192. In some embodiments, event handler 190 accesses
a respective GUI updater 178 to update what is displayed by the
application. Similarly, it would be clear to a person having
ordinary skill in the art how other processes can be implemented
based on the components depicted in FIGS. 1A-1B.
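The dispatch chain just described can be sketched schematically in Swift; these types are illustrative stand-ins, not the actual event sorter 170, event recognizer 180, or event handler 190 of the figures:

    import Foundation

    // A monitor detects the contact, a dispatcher delivers the event
    // information, recognizers compare it to event definitions, and a
    // matching handler updates application state.
    struct Event { let kind: String; let location: CGPoint }

    struct EventRecognizer {
        let definition: String       // e.g., "object-selection"
        let handler: (Event) -> Void
        func matches(_ event: Event) -> Bool { event.kind == definition }
    }

    struct EventDispatcher {
        let recognizers: [EventRecognizer]
        func dispatch(_ event: Event) {
            for recognizer in recognizers where recognizer.matches(event) {
                recognizer.handler(event)
            }
        }
    }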
Indicating Changes in the Z-Order of User Interface Objects
[0250] Many electronic devices display user interface objects that
have a layer order (e.g., a z-order or front-to-back order of the
user interface objects). A user typically interacts with such
objects by repositioning them on the display, and overlapping
objects are displayed on the display in accordance with their
front-to-back order (e.g., an object that is "in front" of another
object is displayed where the two objects overlap). In addition to
repositioning the objects on the display, a user often wants to
change the front-to-back order of the objects on the display. In
some methods, changes in the z-order are indicated visually. The
embodiments described below improve on these methods by providing
for tactile outputs when objects overlap each other and their
z-order changes. Thus, the user has a tactile as well as a visual
indication of the change in z-order when the objects overlap, since
in that case which object covers the other changes with the change
in z-order.
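The overlap condition that gates the tactile output can be sketched as follows (Swift; hypothetical types):

    import Foundation

    struct LayerObject {
        var frame: CGRect
        var zIndex: Int
    }

    // A tactile output accompanying a z-order change is warranted only
    // when the reordered objects overlap, since only then does the
    // change alter which object covers the other on the display.
    func shouldGenerateTactileOutput(moving a: LayerObject,
                                     below b: LayerObject) -> Bool {
        return a.frame.intersects(b.frame)
    }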
[0251] FIGS. 8A-8S illustrate exemplary user interfaces for
indicating changes in the z-order of user interface objects in
accordance with some embodiments. The user interfaces in these
figures are used to illustrate the processes described below,
including the processes in FIGS. 9A-9D. FIGS. 8A-8S include
intensity diagrams that show the current intensity of the contact
on the touch-sensitive surface relative to a plurality of intensity
thresholds including a light press intensity threshold (e.g.,
"IT.sub.L") and a deep press intensity threshold (e.g.,
"IT.sub.D"). In some embodiments, operations similar to those
described below with reference to "IT.sub.D" are performed with
reference to a different intensity threshold (e.g.,
"IT.sub.L").
[0252] FIG. 8A shows user interface objects 10506-1 and 10506-2
displayed on display 450 (e.g., display 340, touch screen 112) of a
device (e.g., device 300, 100). Objects 10506 are, optionally,
windows of respective applications, shapes or other graphics in a
drawing, or objects (e.g., text block, picture, etc.) in a
presentation. Objects 10506-1 and 10506-2 are displayed with a
z-order. In FIG. 8A, object 10506-1 is in front of (or "above")
object 10506-2 in the z-order, and, conversely, object 10506-2 is in
back of (or "behind" or "below") object 10506-1 in the z-order. If
two objects do not overlap (e.g., objects 10506-1 and 10506-2 in
FIG. 8A), their z-order relative to each other may not be visually
displayed to a user. In FIG. 8D, object 10506-1 is in front of
object 10506-2 in the z-order and that relative z-order is visually
displayed to the user, as objects 10506-1 and 10506-2 overlap, and
object 10506-1 covers at least a part of object 10506-2.
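One way to model the z-order and the overlap determination described above is sketched below in Swift; UIObject and zOrder are hypothetical names, with objects earlier in the array treated as further toward the front.

    import CoreGraphics

    struct UIObject {
        let id: Int
        var frame: CGRect
    }

    // Objects earlier in the array are "in front" in the z-order.
    var zOrder: [UIObject] = []

    // Two objects overlap when at least a portion of one covers the other.
    func overlaps(_ a: UIObject, _ b: UIObject) -> Bool {
        a.frame.intersects(b.frame)
    }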
[0253] FIG. 8A also shows cursor 10504 displayed on display 450.
Cursor 10504 is an example of a focus selector. A user optionally
positions cursor 10504 over an object 10506 to bring that object
into focus. In FIG. 8A, cursor 10504 is located over object
10506-1.
[0254] FIG. 8B shows contact 10510 detected on touch-sensitive
surface 451. While contact 10510 is detected on touch-sensitive
surface 451, a request to move object 10506-1 below object 10506-2
in the z-order is received by the device (e.g., as shown in FIG.
8C). The device optionally receives the request in the form of, for
example, a gesture input performed on touch-sensitive surface 451
(e.g., a gesture performed with contact 10510) while cursor 10504
is located over object 10506-1, an increase in the intensity of
contact 10510 above an intensity threshold while cursor 10504 is
located over object 10506-1, or an input made using a keyboard or
other input device (e.g., a keyboard shortcut, a selection of a
menu option using the keyboard or other input device). The
intensity (and the change in intensity) is, optionally, detected by
one or more sensors, included in the device, that are configured to
detect intensity of contacts with touch-sensitive surface 451. In
some embodiments, while contact 10510 has an intensity between
IT.sub.L and IT.sub.D, the user is enabled to move the object
associated with cursor 10504 by moving contact 10510 on the
touch-sensitive surface.
[0255] In response to the request (e.g., the increase in intensity
of contact from an intensity below IT.sub.D in FIG. 8B to an
intensity above IT.sub.D in FIG. 8C), object 10506-1 is moved below
object 10506-2 in the z-order. The change in z-order is,
optionally, not visually displayed to the user if objects 10506-1
and 10506-2 do not overlap, as shown in FIG. 8C. In accordance with
a determination that objects 10506-1 and 10506-2 do not overlap, no
tactile output associated with the move of object 10506-1 below
object 10506-2 is generated.
[0256] FIG. 8D shows objects 10506-1 and 10506-2 displayed, in a
z-order in which object 10506-1 is in front of object 10506-2, on
display 450. In FIG. 8D, objects 10506-1 and 10506-2 overlap, with
object 10506-1 covering a part of object 10506-2. Cursor 10504 is
displayed as located over object 10506-1. For example, in FIG. 8D,
the device detected movement 10511 of contact 10512 down and to the
right on the touch-sensitive surface 451 while cursor 10504 was
over object 10506-1 and the intensity of contact 10512 was between
IT.sub.L and IT.sub.D, and in response to detecting the movement
10511 of contact 10512, the device moved cursor 10504 and object
10506-1 down and to the right on the display 450 in accordance with
the movement of contact 10512 on the touch-sensitive surface.
[0257] FIG. 8D also shows contact 10512 detected on touch-sensitive
surface 451. While contact 10512 is detected on touch-sensitive
surface 451, a request to move object 10506-1 below object 10506-2
in the z-order is received by the device (e.g., as shown in FIG.
8E). The device optionally receives the request in the form of, for
example, a gesture input performed on touch-sensitive surface 451
(e.g., a gesture performed with contact 10512) while cursor 10504
is located over object 10506-1, an increase in the intensity of
contact 10512 above the intensity threshold while cursor 10504 is
located over object 10506-1, or an input made using a keyboard or
other input device (e.g., a keyboard shortcut, a selection of a
menu option using the keyboard or other input device).
[0258] In response to the request (e.g., the increase in intensity
of contact from an intensity below IT.sub.D in FIG. 8D to an
intensity above IT.sub.D in FIG. 8E), object 10506-1 is moved below
object 10506-2 in the z-order, as shown in FIG. 8E. With the change
in z-order, object 10506-2 now covers a part of object 10506-1. In
accordance with a determination that objects 10506-1 and 10506-2
overlap, a tactile output 10513 associated with the movement of
object 10506-1 below object 10506-2 is generated in conjunction
with the move of object 10506-1 below object 10506-2. The tactile
output may be sensed by the user via contact 10512 as a tactile
sensation. In some embodiments, the tactile output is generated by
movement of touch-sensitive surface 451, and the movement includes
a dominant movement component, which optionally has a waveform
shape with a wavelength, such as a square, sine, squine, sawtooth,
or triangle.
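The following Swift sketch illustrates sampling such a dominant movement component for the listed waveform shapes (the "squine" square/sine hybrid is omitted for brevity); the amplitude normalization and phase units are assumptions.

    import Foundation

    enum Waveform {
        case square, sine, sawtooth, triangle
    }

    // Samples the dominant movement component; amplitude is normalized
    // to [-1, 1] and `phase` advances by 1.0 per wavelength.
    func sample(_ waveform: Waveform, phase: Double) -> Double {
        let t = phase - floor(phase)   // position within the current cycle
        switch waveform {
        case .square:   return t < 0.5 ? 1.0 : -1.0
        case .sine:     return sin(2 * .pi * t)
        case .sawtooth: return 2 * t - 1
        case .triangle: return t < 0.5 ? 4 * t - 1 : 3 - 4 * t
        }
    }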
[0260] Thus, when the relative z-order between two objects is
changed, a tactile output is generated if the two objects overlap
(e.g., 10513 in FIG. 8E), and a tactile output is not generated if
the two objects do not overlap (e.g., as shown in FIG. 8C). The
tactile output gives the user an indication that the change in
z-order affects which object is covered by the other object.
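Building on the UIObject sketch above, the following Swift sketch shows this overlap-conditioned behavior in one possible form; reorder, move, and generateTactileOutput are hypothetical names, and the tactile call is a placeholder for driving the device's tactile output generator.

    // Placeholder for driving the tactile output generator.
    func generateTactileOutput() {
        print("tactile output")
    }

    // Moves `first` to the position just below `second` in the z-order.
    func reorder(_ first: UIObject, below second: UIObject) {
        guard let from = zOrder.firstIndex(where: { $0.id == first.id }),
              let to = zOrder.firstIndex(where: { $0.id == second.id }),
              from < to else { return }
        let moved = zOrder.remove(at: from)
        zOrder.insert(moved, at: to)   // `second` is now at index to - 1
    }

    func move(_ first: UIObject, below second: UIObject) {
        reorder(first, below: second)
        // A tactile output is generated only when the change in z-order
        // changes which object covers the other, i.e., when they overlap.
        if overlaps(first, second) { generateTactileOutput() }
    }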
[0261] FIGS. 8F-8I illustrate an example of moving an object within
the z-order using a control for changing the z-order. FIG. 8F shows
objects 10506-1 and 10506-2, and cursor 10504, displayed on display
450. Objects 10506-1 and 10506-2 are displayed in a z-order, with
object 10506-1 in front of object 10506-2 in the z-order. Z-order
slider 10514 is also displayed on display 450. Z-order slider 10514
includes slider thumbs 10516-1 and 10516-2. Thumb 10516-1
corresponds to object 10506-1 and thumb 10516-2 corresponds to
object 10506-2. The position of a thumb 10516 relative to the other
thumbs on z-order slider 10514 corresponds to the corresponding
object's position in the z-order. For example, as depicted in FIG.
8F, the further left a thumb 10516 is on z-order slider 10514, the
further toward the front the corresponding object 10506 is in the
z-order. Thus, thumb 10516-1 is to the left of thumb 10516-2,
corresponding to the z-order of objects 10506-1 and 10506-2 as
shown. It should be appreciated that the correspondence between
being further left on z-order slider 10514 and the corresponding
object being further toward the front in the z-order is a design
choice; optionally, the further right a thumb 10516 is on z-order
slider 10514, the further toward the front the corresponding object
10506 is in the z-order.
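A minimal Swift sketch of this slider-to-z-order mapping, assuming the leftmost-is-frontmost convention shown in FIG. 8F; SliderThumb and zOrderFromSlider are hypothetical names.

    struct SliderThumb {
        let objectID: Int
        var x: Double   // horizontal position on the slider
    }

    // Sorting thumbs by horizontal position yields the z-order, with the
    // leftmost thumb corresponding to the frontmost object.
    func zOrderFromSlider(_ thumbs: [SliderThumb]) -> [Int] {
        thumbs.sorted { $0.x < $1.x }.map { $0.objectID }
    }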
[0262] In FIG. 8F, the device detects movement 10517 of contact
10518 downward on the touch-sensitive surface 451, and in response
to detecting the movement 10517 of contact 10518, the device moves
cursor 10504 over thumb 10516-1. FIG. 8F also shows contact 10518
detected on touch-sensitive surface 451 while cursor 10504 is
located over thumb 10516-1. The gesture including contact 10518
includes movement 10519 of contact 10518 on touch-sensitive
surface 451 while contact 10518 has an intensity between IT.sub.L
and IT.sub.D. In response to detection of the gesture including
movement 10519 of contact 10518, thumb 10516-1 is moved rightward
on slider 10514 past thumb 10516-2, so that thumb 10516-2 is to the
left of thumb 10516-1 on slider 10514, as shown in FIG. 8G. In
response to the movement of thumb 10516-1 to the right of thumb
10516-2, object 10506-1, which corresponds to thumb 10516-1, is
moved downward, below object 10506-2 in the z-order. In accordance
with a determination that objects 10506-1 and 10506-2 overlap, a
tactile output 10513 associated with the move of object 10506-1
below object 10506-2 is generated in conjunction with the move of
object 10506-1 below object 10506-2. The tactile output may be
sensed by the user via contact 10518 as a tactile sensation. In
some embodiments, the tactile output is generated by movement of
touch-sensitive surface 451, and the movement includes a dominant
movement component, which optionally has a waveform shape, such as
a square, sine, squine, sawtooth, or triangle.
[0263] FIG. 8H shows objects 10506-1 and 10506-2 and slider 10514
displayed on display 450, with object 10506-1 in front of object
10506-2 in the z-order. In FIG. 8H, the device detects movement
10521 of contact 10520 downward and to the right on the
touch-sensitive surface 451, and in response to detecting the
movement 10521 of contact 10520, the device moves cursor 10504 over
thumb 10516-2. FIG. 8H also shows cursor 10504, located over thumb
10516-2, displayed on display 450, and a gesture including movement
10523 of contact 10520 to the left detected on touch-sensitive
surface 451 while cursor 10504 is located over thumb 10516-2. The
gesture including contact 10520 includes movement 10523 of contact
10520 on touch-sensitive surface 451. In response to detection of
the gesture including contact 10520, thumb 10516-2 is moved
leftward on slider 10514 past thumb 10516-1, so that thumb 10516-2
is to the left of thumb 10516-1 on slider 10514, as shown in FIG.
8I. In response to the movement of thumb 10516-2 to the left of
thumb 10516-1, object 10506-2, which corresponds to thumb 10516-2,
is moved upward, in front of object 10506-1 in the z-order. In
accordance with a determination that objects 10506-1 and 10506-2
overlap, a tactile output 10513 associated with the move of object
10506-2 above object 10506-1 is generated in conjunction with the
move of object 10506-2 above object 10506-1. The tactile output may
be sensed by the user via contact 10520 as a tactile sensation. In
some embodiments, the tactile output is generated by movement of
touch-sensitive surface 451, and the movement includes a dominant
movement component, which optionally has a waveform shape, such as
a square, sine, squine, sawtooth, or triangle.
[0264] In some embodiments, the tactile output associated with the
move of object 10506-1 below object 10506-2 in the z-order has a
wavelength that is determined based on a position of object 10506-2
in the z-order prior to receiving the request to move object
10506-1 below object 10506-2 in the z-order.
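For example, under an assumed linear scaling, the wavelength could be derived from the z-order position as sketched below; the base wavelength and the scaling rule are illustrative assumptions, not values from the disclosure.

    // Position 0 is the front of the z-order; passing an object deeper
    // in the z-order yields a longer wavelength (a lower pitch).
    func wavelength(forZPosition position: Int,
                    base: Double = 0.005) -> Double {
        base * Double(position + 1)
    }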
[0265] FIG. 8J shows user interface objects 10506-1, 10506-2, and
10506-3 displayed on display 450. Objects 10506-1, 10506-2, and
10506-3 are displayed with a z-order. In FIG. 8J, object 10506-1 is
in front of objects 10506-2 and 10506-3. Object 10506-3 is in front
of object 10506-2 but in back of object 10506-1. Thus, object
10506-3 is an intervening object between objects 10506-1 and
10506-2 in the z-order, even though there is no visual indication
of this ordering in FIG. 8J. Objects 10506-1 and 10506-3 do not
overlap, and objects 10506-1 and 10506-2 overlap. Cursor 10504 is
displayed over object 10506-1.
[0266] FIG. 8J shows contact 10524 detected on touch-sensitive
surface 451. While contact 10524 is detected on touch-sensitive
surface 451, a request to move object 10506-1 below object 10506-2
in the z-order is received by the device (e.g., an increase in
intensity of contact 10524 from an intensity below IT.sub.L in FIG.
8J to an intensity above IT.sub.D in FIG. 8L). The device
optionally receives the request in the form of, for example, a
gesture input performed on touch-sensitive surface 451 (e.g., a
gesture performed with contact 10524) while cursor 10504 is located
over object 10506-1, an increase in the intensity of contact 10524
above the intensity threshold while cursor 10504 is located over
object 10506-1, or an input made using a keyboard or other input
device (e.g., a keyboard shortcut, a selection of a menu option
using the keyboard or other input device).
[0267] In response to the request, object 10506-1 is moved below
intervening object 10506-3 (e.g., when contact 10524 reaches an
intensity above IT.sub.L in FIG. 8K) and then below object 10506-2
(e.g., when contact 10524 reaches an intensity above IT.sub.D in
FIG. 8L) in the z-order, as shown in FIGS. 8K-8L. When object
10506-1 is moved below object 10506-3, in accordance with a
determination that objects 10506-1 and 10506-3 do not overlap, no
tactile output associated with the move of object 10506-1 below
object 10506-3 is generated. When object 10506-1 is moved below
object 10506-2, in accordance with a determination that objects
10506-1 and 10506-2 overlap, a tactile output 10525 associated with
the move of object 10506-1 below object 10506-2 is generated in
conjunction with the move of object 10506-1 below object 10506-2.
The tactile output may be sensed by the user via contact 10524 as a
tactile sensation. In some embodiments, the tactile output is
generated by movement of touch-sensitive surface 451, and the
movement includes a dominant movement component, which optionally
has a waveform shape, such as a square, sine, squine, sawtooth, or
triangle.
[0268] FIG. 8M shows user interface objects 10506-1, 10506-2, and
10506-3 displayed on display 450. Objects 10506-1, 10506-2, and
10506-3 are displayed with a z-order. In FIG. 8M, object 10506-1 is
in front of objects 10506-2 and 10506-3. Object 10506-3 is in front
of object 10506-2 but in back of object 10506-1. Thus, object
10506-3 is an intervening object between objects 10506-1 and
10506-2 within the z-order. Objects 10506-1, 10506-2, and 10506-3
overlap. Cursor 10504 is displayed over object 10506-1.
[0269] FIG. 8M shows contact 10526 detected on touch-sensitive
surface 451. While contact 10526 is detected on touch-sensitive
surface 451, a request to move object 10506-1 below object 10506-2
in the z-order is received by the device (e.g., an increase in
intensity of contact 10526 from an intensity below IT.sub.L in FIG.
8M to an intensity above IT.sub.D in FIG. 8O). In some
circumstances, the device receives the request in the form of, for
example, a gesture input performed on touch-sensitive surface 451
(e.g., a gesture performed with contact 10526) while cursor 10504
is located over object 10506-1, an increase in the intensity of
contact 10526 above the intensity threshold while cursor 10504 is
located over object 10506-1, or an input made using a keyboard or
other input device (e.g., a keyboard shortcut, a selection of a
menu option using the keyboard or other input device).
[0270] In response to the request, object 10506-1 is moved below
object 10506-3 (e.g., when contact 10526 reaches an intensity above
IT.sub.L in FIG. 8N) and then below object 10506-2 (e.g., when
contact 10526 reaches an intensity above IT.sub.D in FIG. 8O) in
the z-order, as shown in FIGS. 8N-8O. When object 10506-1 is moved
below object 10506-3, in accordance with a determination that
objects 10506-1 and 10506-3 overlap, a tactile output 10527
associated with the move of object 10506-1 below object 10506-3 is
generated in conjunction with the move of object 10506-1 below
object 10506-3. This tactile output associated with the move of
object 10506-1 below object 10506-3 may be sensed by the user via
contact 10526 as a tactile sensation. When object 10506-1 is moved
below object 10506-2, in accordance with a determination that
objects 10506-1 and 10506-2 overlap, a tactile output 10528
associated with the move of object 10506-1 below object 10506-2 is
generated in conjunction with the move of object 10506-1 below
object 10506-2. This tactile output associated with the move of
object 10506-1 below object 10506-2 may be sensed by the user via
contact 10526 as a tactile sensation. In some embodiments, the
tactile outputs are generated by respective movements of
touch-sensitive surface 451, and the respective movements each
include a dominant movement component, which optionally has a
waveform shape, such as a square, sine, squine, sawtooth, or
triangle.
[0271] In some embodiments, the tactile output 10527 associated
with the move of object 10506-1 below object 10506-3 (hereinafter
"Tactile Output A") and the tactile output 10528 associated with
the move of object 10506-1 below object 10506-2 after moving below
object 10506-3 (hereinafter "Tactile Output B") are different. For
example, the dominant movement component for Tactile Output A 10527
optionally has a different wavelength than the dominant movement
component for Tactile Output B 10528. In some embodiments, the
wavelength for Tactile Output A 10527 is determined based on a
position of object 10506-3 in the z-order, and the wavelength for
Tactile Output B 10528 is determined based on a position of object
10506-2 in the z-order.
[0272] In some embodiments, the wavelength of Tactile Output A
10527 is determined based on a number of user interface objects
10506, that object 10506-1 overlaps, that are between object
10506-1 and object 10506-3 in the z-order. In some embodiments, the
wavelength of Tactile Output B 10528 is determined based on a
number of user interface objects 10506, that object 10506-1
overlaps, that are between object 10506-1 and object 10506-2 in the
z-order.
[0273] Object 10506-1 optionally overlaps multiple other user
interface objects arranged in a respective z-order sequence,
irrespective of whether object 10506-1 overlaps with object
10506-2. Thus, in some embodiments, the z-order includes object
10506-1, the multiple other user interface objects behind object
10506-1, and then object 10506-2. Thus, when object 10506-1 is
moved below object 10506-2 in the z-order, in accordance with a
request to move object 10506-1 below object 10506-2 in the z-order,
object 10506-1 is moved below each of the multiple other user
interface objects in sequence before being moved below object
10506-2. For each of the multiple other user interface objects that
object 10506-1 moves below, a tactile output is generated in
conjunction with the move of object 10506-1 below the respective
user interface object. Thus, as object 10506-1 is moved below the
multiple other user interface objects, a sequence of tactile
outputs is generated. In some embodiments, the sequence of tactile
outputs is generated based on a mathematical progression. For
example, each successive tactile output has a wavelength that is
double the wavelength of the preceding tactile output. In some
other embodiments, the sequence of tactile outputs is generated
based on a musical progression. For example, each successive
tactile output corresponds to a next note in a predefined musical
scale or the same note in a lower octave.
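The two progressions can be sketched as follows in Swift; the base values, units, and scale encoding are assumptions for illustration.

    import Foundation

    // Mathematical progression: each successive tactile output doubles
    // the wavelength of the preceding one.
    func doublingWavelengths(base: Double, count: Int) -> [Double] {
        (0..<count).map { base * pow(2.0, Double($0)) }
    }

    // Musical progression: each successive output descends a predefined
    // scale (frequencies in Hz; wavelength varies inversely with
    // frequency, and halving the frequency drops the same note an octave).
    func descendingScale(from frequency: Double,
                         semitoneSteps: [Int]) -> [Double] {
        var frequencies = [frequency]
        for step in semitoneSteps {
            frequencies.append(frequencies.last! / pow(2.0, Double(step) / 12.0))
        }
        return frequencies
    }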
[0274] FIGS. 8P-8S illustrate an example of the user interfaces
described above with reference to FIGS. 8A-8O implemented on a
device with a touch-sensitive display (e.g., device 100 with touch
screen 112). FIGS. 8P-8Q show objects 10532-1 and 10532-2 displayed
on touch-sensitive display 112. Objects 10532-1 and 10532-2, which
do not overlap, are displayed in a z-order, with object 10532-1 in
front of object 10532-2 in the z-order.
[0275] FIG. 8P shows contact 10534 detected on touch-sensitive
display 112 at a position over object 10532-1. While contact 10534
is detected on touch-sensitive display 112, a request to move
object 10532-1 below object 10532-2 in the z-order is received by
the device. The device optionally receives the request in the form
of, for example, a gesture input performed on touch-sensitive
display 112 (e.g., a gesture performed with contact 10534 over
object 10532-1 or with another contact on touch-sensitive display
112) or an increase in the intensity of contact 10534 over object
10532-1 above the intensity threshold (e.g., an increase in
intensity of contact 10534 from an intensity below IT.sub.D in FIG.
8P to an intensity above IT.sub.D in FIG. 8Q).
[0276] In response to the request, object 10532-1 is moved below
object 10532-2 in the z-order, as shown in FIG. 8Q. In accordance
with a determination that objects 10532-1 and 10532-2 do not
overlap, no tactile output associated with the move of object
10532-1 below object 10532-2 is generated.
[0277] FIG. 8R shows overlapping objects 10532-1 and 10532-2
displayed on touch-sensitive display 112. Objects 10532-1 and
10532-2 are displayed in a z-order, with object 10532-1 in front of
object 10532-2 in the z-order. FIG. 8R also shows contact 10536
detected on touch-sensitive display 112 at a position over object
10532-1. While contact 10536 is detected on touch-sensitive display
112, a request to move object 10532-1 below object 10532-2 in the
z-order is received by the device. The device optionally receives
the request in the form of, for example, a gesture input performed
on touch-sensitive display 112 (e.g., a gesture performed with
contact 10536 over object 10532-1 or with another contact on
touch-sensitive display 112) or an increase in the intensity of
contact 10536 over object 10532-1 above the intensity threshold
(e.g., an increase in intensity of contact 10536 from an intensity
below IT.sub.D in FIG. 8R to an intensity above IT.sub.D in FIG.
8S).
[0278] In response to the request, object 10532-1 is moved below
object 10532-2 in the z-order, as shown in FIG. 8S. In accordance
with a determination that objects 10532-1 and 10532-2 overlap, a
tactile output 10537 associated with the move of object 10532-1
below object 10532-2 is generated in conjunction with the move of
object 10532-1 below object 10532-2. The tactile output may be
sensed by the user via contact 10536 as a tactile sensation. In some
embodiments, the tactile output is generated by movement of
touch-sensitive display 112, and the movement includes a dominant
movement component, which optionally has a waveform shape, such as
a square, sine, squine, sawtooth, or triangle. In some embodiments,
the tactile output associated with the move of object 10532-1 below
object 10532-2 in the z-order has a wavelength that is determined
based on a position of object 10532-2 in the z-order prior to
receiving the request to move object 10532-1 below object 10532-2
in the z-order.
[0279] FIGS. 9A-9D are flow diagrams illustrating a method 10600 of
indicating changes in the z-order of user interface objects in
accordance with some embodiments. The method 10600 is performed at
an electronic device (e.g., device 300, FIG. 3, or portable
multifunction device 100, FIG. 1A) with a display and a
touch-sensitive surface. In some embodiments, the display is a
touch screen display and the touch-sensitive surface is on the
display. In some embodiments, the display is separate from the
touch-sensitive surface. Some operations in method 10600 are,
optionally, combined and/or the order of some operations is,
optionally, changed.
[0280] As described below, the method 10600 provides an intuitive
way to indicate changes in the z-order of user interface objects.
The method reduces the cognitive burden on a user when indicating
changes in the z-order of user interface objects, thereby creating
a more efficient human-machine interface. For battery-operated
electronic devices, enabling a user to perceive changes in the
z-order of user interface objects faster and more efficiently
conserves power and increases the time between battery charges.
[0281] The device displays (10602) a plurality of user interface
objects on the display, where the plurality of user interface
objects have a z-order, the plurality of user interface objects
includes a first user interface object and a second user interface
object, and the first user interface object is above the second
user interface object in the z-order. Multiple user interface
objects, such as objects 10506-1 and 10506-2, or objects 10532-1
and 10532-2, are, optionally, displayed with a z-order, as shown in
FIGS. 8A, 8D, and 8P, respectively. In FIGS. 8A and 8D, object
10506-1 is above object 10506-2 in the z-order. In FIG. 8P, object
10532-1 is above object 10532-2 in the z-order.
[0282] While detecting a contact (e.g., a finger contact) on the
touch-sensitive surface, the device receives (10604) a request to
move the first user interface object below the second user
interface object in the z-order. For example, while a contact
(e.g., contact 10510, FIG. 8A; contact 10512, FIG. 8D; contact
10518, FIG. 8F; contact 10520, FIG. 8H; contact 10524, FIG. 8J;
contact 10526, FIG. 8M; contact 10534, FIG. 8Q; contact 10536, FIG.
8R), is detected on the touch-sensitive surface (e.g.,
touch-sensitive surface 451, touch-sensitive display 112), a
request to move object 10506-1 below object 10506-2 is
received.
[0283] In some embodiments, receiving the request to move the first
user interface object below the second user interface object
includes (10606), while a focus selector is over the first user
interface object, detecting an increase in intensity of the contact
above a respective intensity threshold (e.g., the deep press
intensity threshold IT.sub.D). Thus, the user is intuitively enabled
to press the first user interface object "down" below the second
user interface object in the z-order. For example, while cursor
10504 is located over object 10506-1, the intensity of
contact 10510 (or contact 10512 or 10524 or 10526) is, optionally,
increased from an intensity below IT.sub.D to an intensity above
IT.sub.D. In embodiments where the touch-sensitive surface is a
touch-sensitive display (e.g., touch-sensitive display 112),
receiving the request includes detecting an increase in intensity
of the contact (e.g., an increase in intensity of contact 10534 or
10536 above the deep press intensity threshold IT.sub.D) while the
contact is over the first user interface object (e.g., object 10532-1).
[0284] In some embodiments, receiving the request to move the first
user interface object below the second user interface object
includes (10608), while displaying a control for changing a z-order
of the first user interface object (e.g., a slider or other control
that is separate/distinct from the first user interface object that
determines a z-order of the first user interface object), detecting
an input on the control that corresponds to moving the first user
interface object downward in the z-order. The control for changing
z-order is, optionally, slider 10514 (FIG. 8F). While z-order
slider 10514 is displayed, an input corresponding to moving object
10506-1 downward (e.g., the gesture including movement 10519 of
contact 10518 on touch-sensitive surface 451 while cursor 10504 is
over slider thumb 10516-1) is detected. In response to detecting
the gesture including movement 10519 of contact 10518, the device
moves thumb 10516-1, which corresponds to object 10506-1, rightward
on slider 10514, which corresponds to moving object 10506-1 downward
in the z-order.
[0285] In some embodiments, receiving the request to move the first
user interface object below the second user interface object
includes (10610), while displaying a control for changing a z-order
of the second user interface object (e.g., a slider or other
control that is separate/distinct from the second user interface
object that determines a z-order of the second user interface
object), detecting an input on the control that corresponds to
moving the second user interface object upward in the z-order. The
control for changing z-order is, optionally, slider 10514 (FIG.
8H). While z-order slider 10514 is displayed, an input
corresponding to moving object 10506-2 upward (e.g., the gesture
including movement 10523 of contact 10520 on touch-sensitive
surface 451 while cursor 10504 is over slider thumb 10516-2) is
detected. In response to detecting the gesture including movement
10523 of contact 10520, the device moves thumb 10516-2, which
corresponds to object 10506-2, leftward on slider 10514, which
corresponds to moving object 10506-2 upward in the z-order.
[0286] In response (10612) to the request, the device moves (10614)
the first user interface object below the second user interface
object in the z-order. In response to the request, object 10506-1
(or 10532-1) is moved below object 10506-2 (or 10532-2) in the
z-order, as shown in FIG. 8C (or 8E or 8G or 8I or 8L or 8O or 8Q
or 8S).
[0287] In accordance with a determination that the first user
interface object overlaps at least a portion of the second user
interface object, the device generates (10616) a tactile output
(e.g., 10513 in FIG. 8E, 8G or 8I; 10525 in FIG. 8L; 10528 in FIG.
8O; or 10537 in FIG. 8S) associated with moving the first user
interface object below the second user interface object on the
touch-sensitive surface in conjunction with moving the first user
interface object below the second user interface object. If objects
10506-1 and 10506-2 (or objects 10532-1 and 10532-2) overlap, as
shown in FIGS. 8D-8O or 8R-8S, a tactile output associated with the
move of object 10506-1 (or object 10532-1) below object 10506-2 (or
object 10532-2) is generated when object 10506-1 (or object
10532-1) is moved below object 10506-2 (or object 10532-2). In some
embodiments, the tactile output associated with moving the first
user interface object below the second user interface object has
(10618) a wavelength that is determined based on a position of the
second user interface object in the z-order prior to receiving the
request to move the first user interface object below the second
user interface object in the z-order (e.g., the lower the first
user interface object is moved in z-order, the lower the pitch of
the tactile output). For example, the tactile output associated
with moving object 10506-1 below 10506-2 has a wavelength that is
determined based on a position of object 10506-2 in the z-order
prior to receiving the request to move object 10506-1 below object
10506-2 in the z-order.
[0288] In accordance with a determination that the first user
interface object does not overlap the second user interface object,
the device forgoes (10620) generating the tactile output associated
with moving the first user interface object below the second user
interface object. If objects 10506-1 and 10506-2 (or objects
10532-1 and 10532-2) do not overlap, as shown in FIGS. 8A-8C or
8P-8Q, no tactile output associated with the move of object
10506-1 (or object 10532-1) below object 10506-2 (or object
10532-2) is generated when object 10506-1 (or object 10532-1) is
moved below object 10506-2 (or object 10532-2).
[0289] In some embodiments, the first user interface object
overlaps (10622) at least a portion of the second user interface
object when at least a portion of the first user interface object
covers at least a portion of the second user interface object. In
some embodiments, the first user interface object partially
overlaps the second user interface object. In some embodiments, the
first user interface object completely overlaps the second user
interface object (e.g., the first user interface object covers all
of the second user interface object). For example, in FIG. 8A,
objects 10506-1 and 10506-2 do not overlap, and in FIG. 8D objects
10506-1 and 10506-2 overlap.
[0290] In some embodiments, the plurality of user interface objects
includes (10624) an intervening user interface object, and the
intervening user interface object has a position in the z-order
between the first user interface object and the second user
interface object. As shown in FIGS. 8J and 8M, for example, there
is, optionally, an intervening object 10506-3 between objects
10506-1 and 10506-2 in the z-order. In some embodiments, when there
is an intervening user interface object, in response (10612) to the
request to move the first user interface object below the second
user interface object in the z-order, the device moves (10630) the
first user interface object below the intervening user interface
object in the z-order prior to moving the first user interface
object below the second user interface object in the z-order. In
accordance with a determination that the first user interface
object overlaps at least a portion of the intervening user
interface object, the device generates (10632), on the
touch-sensitive surface, a tactile output associated with moving
the first user interface object below the intervening user
interface object in conjunction with moving the first user
interface object below the intervening user interface object. In
accordance with a determination that the first user interface
object does not overlap the intervening user interface object, the
device forgoes generating the tactile output associated with moving
the first user interface object below the intervening user
interface object. In response to the request to move object 10506-1
below object 10506-2 in the z-order, object 10506-1 is moved below
object 10506-3 on the way to being moved below object 10506-2 in
the z-order, as shown in FIGS. 8J-8O. If objects 10506-1 and
10506-3 do not overlap, as shown in FIGS. 8J-8K, no tactile output
associated with the move of object 10506-1 below object 10506-3 is
generated. If objects 10506-1 and 10506-3 overlap, as shown in
FIGS. 8M-8N, a tactile output associated with the move of object
10506-1 below object 10506-3 is generated.
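Continuing the earlier Swift sketch, the intervening-object behavior might take the following form, generating one tactile output per overlapped object that the first object passes; the function names remain hypothetical.

    // The first object passes each intervening object in turn, then the
    // target; a tactile output is generated only for objects it overlaps.
    func move(_ first: UIObject, below target: UIObject,
              passing intervening: [UIObject]) {
        for object in intervening + [target] {
            reorder(first, below: object)
            if overlaps(first, object) {
                generateTactileOutput()   // one output per overlapped object
            }
            // Non-overlapping objects are passed silently.
        }
    }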
[0291] In some embodiments, the first user interface object
overlaps (10635) the intervening user interface object and the
second user interface object, and in response (10612) to the
request to move the first user interface object below the second
user interface object in the z-order, the device generates (10636)
a first tactile output, on the touch-sensitive surface, associated
with moving the first user interface object below the intervening
user interface object, prior to moving the first user interface
object below the second user interface object in the z-order; and
generates (10638) a second tactile output, on the touch-sensitive
surface, associated with moving the first user interface object
below the second user interface object, wherein the first tactile
output is different from the second tactile output. For example, as
shown in FIG. 8M, object 10506-1 overlaps with object 10506-3 and
object 10506-2. In response to the request to move object 10506-1
below object 10506-2 in the z-order, object 10506-1 is moved below
object 10506-3 and then moved below object 10506-2 in the z-order.
As object 10506-1 overlaps with object 10506-3 and object 10506-2,
Tactile Output A 10527 is generated for the move of object 10506-1
below object 10506-3, and Tactile Output B 10528 is generated for
the move of object 10506-1 below object 10506-2. Tactile Output A
10527 and Tactile Output B 10528 are, optionally, different. For
example, Tactile Output A 10527 and Tactile Output B 10528
optionally have different wavelengths or amplitudes.
[0292] In some embodiments, the first tactile output is (10640)
generated by movement of the touch-sensitive surface that includes
a first dominant movement component (e.g., movement corresponding
to an initial impulse of the first tactile output, ignoring any
unintended resonance), the second tactile output is generated by
movement of the touch-sensitive surface that includes a second
dominant movement component (e.g., movement corresponding to an
initial impulse of the second tactile output, ignoring any
unintended resonance), and the first dominant movement component
and the second dominant movement component have different
wavelengths (e.g., while maintaining a same movement profile such
as a same waveform shape such as square, sine, squine, sawtooth or
triangle and/or while maintaining a same amplitude). Tactile Output
A 10527 and Tactile Output B 10528 have respective dominant
movement components that have different wavelengths. Thus, Tactile
Output A 10527 for the move below object 10506-3 and Tactile Output
B 10528 for the move below object 10506-2 may feel different to the
user.
[0293] In some embodiments, the wavelength of the first tactile
output is (10642) determined based on a position of the intervening
user interface object in the z-order, and the wavelength of the
second tactile output is determined based on a position of the
second user interface object in the z-order (e.g., the tactile
output that is generated when the first user interface object is
moved past a respective user interface object is determined based
on an absolute position of the respective user interface object in
the z-order, which provides feedback to the user as to how far the
first user interface object has been pushed down into the z-order).
In FIGS. 8M-8O, for example, the wavelength of Tactile Output A
10527 is, optionally, determined based on the absolute position of
object 10506-3 in the z-order, and the wavelength of Tactile Output
B 10528 is, optionally, determined based on the absolute position
of object 10506-2 in the z-order.
[0294] In some embodiments, the wavelength of the first tactile
output is (10644) determined based on a number of user interface
objects that the first user interface object overlaps that are
between the first user interface object and the intervening user
interface object in the z-order. In FIGS. 8M-8O, for example, the
wavelength of Tactile Output A 10527 is, optionally, determined
based on the number of user interface objects that overlap object
10506-1 and are between object 10506-1 and object 10506-3 in the
z-order.
[0295] In some embodiments, the wavelength of the second tactile
output is (10646) determined based on a number of user interface
objects that the first user interface object overlaps that are
between the first user interface object and the second user
interface object in the z-order (e.g., the tactile output that is
generated when the first user interface object is moved past a
respective user interface object is determined based on how many
other objects are between the first user interface object and the
respective object, which provides feedback to the user as to how
far the user interface object has been pushed down into a "local"
z-order for objects that are in the same general area of the user
interface and overlap each other). In FIGS. 8M-8O, for example, the
wavelength of Tactile Output B 10528 is, optionally, determined
based on the number of user interface objects that overlap object
10506-1 and are between object 10506-1 and object 10506-2 in the
z-order.
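A sketch of this "local" z-order computation, building on the earlier UIObject sketch and assuming the same linear wavelength scaling as before:

    // Count only the overlapped objects between `first` and the
    // respective object, then derive the wavelength from that count.
    func localDepth(of first: UIObject, to respective: UIObject,
                    in order: [UIObject]) -> Int {
        guard let i = order.firstIndex(where: { $0.id == first.id }),
              let j = order.firstIndex(where: { $0.id == respective.id }),
              i < j else { return 0 }
        return order[(i + 1)..<j].filter { overlaps(first, $0) }.count
    }

    func wavelength(forLocalDepth depth: Int,
                    base: Double = 0.005) -> Double {
        base * Double(depth + 1)   // deeper locally, longer wavelength
    }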
[0296] In some embodiments, the first user interface object overlaps
(10648) a plurality of other user interface objects arranged in a
respective z-order sequence and a next tactile output (e.g., a
pitch/wavelength/intensity of the next tactile output)
corresponding to movement of the first user interface object below
a next user interface object in the z-order sequence is based on a
mathematical progression from a prior tactile output corresponding
to movement of the first user interface object below a prior user
interface object in the z-order sequence (e.g., each successive
tactile output doubles the wavelength of the prior tactile
output).
[0297] In some embodiments, the first user interface object overlaps
(10650) a plurality of other user interface objects arranged in a
respective z-order sequence and a next tactile output (e.g., a
pitch/wavelength/intensity of the next tactile output)
corresponding to movement of the first user interface object below
a next user interface object in the z-order sequence is based on a
musical progression from a prior tactile output corresponding to
movement of the first user interface object below a prior user
interface object in the z-order sequence (e.g., each successive
tactile output corresponds to a next note in a predefined musical
scale or the same note in a lower octave).
[0298] For example, in some circumstances, object 10506-1 overlaps
multiple other user interface objects arranged in a respective
z-order sequence irrespective of whether object 10506-1 overlaps
with object 10506-2. Thus, in some embodiments, the z-order
includes object 10506-1, the multiple other user interface objects
behind object 10506-1, and then object 10506-2. Thus, when object
10506-1 is moved below object 10506-2 in the z-order, in accordance
with a request to move object 10506-1 below object 10506-2 in the
z-order, object 10506-1 is moved below each of the multiple other
user interface objects in z-order sequence before being moved below
object 10506-2. In this example, for each of the multiple other
user interface objects that object 10506-1 moves below, a tactile
output is generated in conjunction with the move of object 10506-1
below the respective other user interface object. Thus, as object
10506-1 is moved below the multiple other user interface objects, a
sequence of tactile outputs is generated. In some embodiments, the
sequence of tactile outputs is generated based on a mathematical
progression. For example, each successive tactile output has a
wavelength that is double the wavelength of the preceding tactile
output. In some other embodiments, the sequence of tactile outputs
is generated based on a musical progression. For example, each
successive tactile output corresponds to a next note in a
predefined musical scale or the same note in a lower octave.
[0299] It should be understood that the particular order in which
the operations in FIGS. 9A-9D have been described is merely
exemplary and is not intended to indicate that the described order
is the only order in which the operations could be performed. One
of ordinary skill in the art would recognize various ways to
reorder the operations described herein. Additionally, it should be
noted that details of other processes described herein with respect
to other methods described herein (e.g., those listed in paragraph
[0058]) are also applicable in an analogous manner to method 10600
described above with respect to FIGS. 9A-9D. For example, the
contacts, gestures, user interface objects, tactile outputs,
intensity thresholds, and focus selectors described above with
reference to method 10600 optionally have one or more of the
characteristics of the contacts, gestures, user interface objects,
tactile outputs, intensity thresholds, and focus selectors
described herein with reference to other methods described herein
(e.g., those listed in paragraph [0058]). For brevity, these
details are not repeated here.
[0300] In accordance with some embodiments, FIG. 10 shows a
functional block diagram of an electronic device 10700 configured
in accordance with the principles of the various described
embodiments. The functional blocks of the device are, optionally,
implemented by hardware, software, or a combination of hardware and
software to carry out the principles of the various described
embodiments. It is understood by persons of skill in the art that
the functional blocks described in FIG. 10 are, optionally,
combined or separated into sub-blocks to implement the principles
of the various described embodiments. Therefore, the description
herein optionally supports any possible combination or separation
or further definition of the functional blocks described
herein.
[0301] As shown in FIG. 10, an electronic device 10700 includes a
display unit 10702 configured to display a plurality of user
interface objects on the display unit 10702, where: the plurality
of user interface objects have a z-order, the plurality of user
interface objects includes a first user interface object and a
second user interface object, and the first user interface object
is above the second user interface object in the z-order; a
touch-sensitive surface unit 10704 configured to receive contacts;
and a processing unit 10706 coupled to the display unit 10702 and
the touch-sensitive surface unit 10704. In some embodiments, the
processing unit 10706 includes a receiving unit 10708, a moving
unit 10710, and a generating unit 10712.
[0302] The processing unit 10706 is configured to: while detecting
a contact on the touch-sensitive surface unit 10704, receive a
request to move the first user interface object below the second
user interface object in the z-order (e.g., with the receiving unit
10708); and in response to the request: move the first user
interface object below the second user interface object in the
z-order (e.g., with the moving unit 10710); in accordance with a
determination that the first user interface object overlaps at
least a portion of the second user interface object, generate a
tactile output associated with moving the first user interface
object below the second user interface object on the
touch-sensitive surface unit 10704 in conjunction with moving the
first user interface object below the second user interface object
(e.g., with the generating unit 10712); and in accordance with a
determination that the first user interface object does not overlap
the second user interface object, forgo generating the tactile
output associated with moving the first user interface object below
the second user interface object (e.g., with the generating unit
10712).
[0303] In some embodiments, the first user interface object
overlaps at least a portion of the second user interface object
when at least a portion of the first user interface object covers
at least a portion of the second user interface object.
[0304] In some embodiments, receiving the request to move the first
user interface object below the second user interface object
includes, while a focus selector is over the first user interface
object, detecting an increase in intensity of the contact above a
respective intensity threshold.
[0305] In some embodiments, receiving the request to move the first
user interface object below the second user interface object
includes, while displaying a control for changing a z-order of the
first user interface object, detecting an input on the control that
corresponds to moving the first user interface object downward in
the z-order.
[0306] In some embodiments, receiving the request to move the first
user interface object below the second user interface object
includes, while displaying a control for changing a z-order of the
second user interface object, detecting an input on the control
that corresponds to moving the second user interface object upward
in the z-order.
[0307] In some embodiments, the plurality of user interface objects
includes an intervening user interface object, and the intervening
user interface object has a position in the z-order between the
first user interface object and the second user interface object.
The processing unit 10706 is configured to: in response to the
request to move the first user interface object below the second
user interface object in the z-order: move the first user interface
object below the intervening user interface object in the z-order
prior to moving the first user interface object below the second
user interface object in the z-order (e.g., with the moving unit
10710); in accordance with a determination that the first user
interface object overlaps at least a portion of the intervening
user interface object, generate, on the touch-sensitive surface
unit 10704, a tactile output associated with moving the first user
interface object below the intervening user interface object in
conjunction with moving the first user interface object below the
intervening user interface object (e.g., with the generating unit
10712); and in accordance with a determination that the first user
interface object does not overlap the intervening user interface
object, forgo generating the tactile output associated with moving
the first user interface object below the intervening user
interface object (e.g., with the generating unit 10712).
[0308] In some embodiments, the first user interface object
overlaps the intervening user interface object and the second user
interface object. The processing unit 10706 is configured to: in
response to the request to move the first user interface object
below the second user interface object in the z-order: generate a
first tactile output, on the touch-sensitive surface unit 10704,
associated with moving the first user interface object below the
intervening user interface object, prior to moving the first user
interface object below the second user interface object in the
z-order (e.g., with the generating unit 10712); and generate a
second tactile output, on the touch-sensitive surface unit 10704,
associated with moving the first user interface object below the
second user interface object, wherein the first tactile output is
different from the second tactile output (e.g., with the generating
unit 10712).
[0309] In some embodiments, the first tactile output is generated
by movement of the touch-sensitive surface unit 10704 that includes
a first dominant movement component, the second tactile output is
generated by movement of the touch-sensitive surface unit 10704
that includes a second dominant movement component, and the first
dominant movement component and the second dominant movement
component have different wavelengths.
[0310] In some embodiments, the wavelength of the first tactile
output is determined based on a position of the intervening user
interface object in the z-order, and the wavelength of the second
tactile output is determined based on a position of the second user
interface object in the z-order.
[0311] In some embodiments, the wavelength of the first tactile
output is determined based on a number of user interface objects
that the first user interface object overlaps that are between the
first user interface object and the intervening user interface
object in the z-order.
[0312] In some embodiments, the wavelength of the second tactile
output is determined based on a number of user interface objects
that the first user interface object overlaps that are between the
first user interface object and the second user interface object in
the z-order.
[0313] In some embodiments, the tactile output associated with
moving the first user interface object below the second user
interface object has a wavelength that is determined based on a
position of the second user interface object in the z-order prior
to receiving the request to move the first user interface object
below the second user interface object in the z-order.
[0314] In some embodiments, the first user interface object overlaps a
plurality of other user interface objects arranged in a respective
z-order sequence and a next tactile output corresponding to
movement of the first user interface object below a next user
interface object in the z-order sequence is based on a mathematical
progression from a prior tactile output corresponding to movement
of the first user interface object below a prior user interface
object in the z-order sequence.
[0315] In some embodiments, the first user interface object overlaps a
plurality of other user interface objects arranged in a respective
z-order sequence and a next tactile output corresponding to
movement of the first user interface object below a next user
interface object in the z-order sequence is based on a musical
progression from a prior tactile output corresponding to movement
of the first user interface object below a prior user interface
object in the z-order sequence.
[0316] The operations in the information processing methods
described above are, optionally, implemented by running one or more
functional modules in information processing apparatus such as
general purpose processors (e.g., as described above with respect
to FIGS. 1A and 3) or application specific chips.
[0317] The operations described above with reference to FIGS. 9A-9D
are, optionally, implemented by components depicted in FIGS. 1A-1B
or FIG. 10. For example, receiving operation 10604, moving
operation 10614, generating operation 10616, and forgoing operation
10620 are, optionally, implemented by event sorter 170, event
recognizer 180, and event handler 190. Event monitor 171 in event
sorter 170 detects a contact on touch-sensitive display 112, and
event dispatcher module 174 delivers the event information to
application 136-1. A respective event recognizer 180 of application
136-1 compares the event information to respective event
definitions 186, and determines whether a first contact at a first
location on the touch-sensitive surface (or whether rotation of the
device) corresponds to a predefined event or sub-event, such as
selection of an object on a user interface, or rotation of the
device from one orientation to another. When a respective
predefined event or sub-event is detected, event recognizer 180
activates an event handler 190 associated with the detection of the
event or sub-event. Event handler 190 optionally utilizes or calls
data updater 176 or object updater 177 to update the application
internal state 192. In some embodiments, event handler 190 accesses
a respective GUI updater 178 to update what is displayed by the
application. Similarly, it would be clear to a person having
ordinary skill in the art how other processes can be implemented
based on the components depicted in FIGS. 1A-1B.
Providing Tactile Feedback Warning a User
[0318] Many electronic devices have graphical user interfaces that
display user interface objects that can be manipulated by adjusting
one or more associated parameters. For example, a graphical user
interface optionally displays one or more user interface objects
(e.g., an image, media clip, audio clip, shape, application window,
folder, menu or status bar) that the user can customize (e.g.,
enlarge, shrink, crop, rotate, increase volume, decrease volume or
otherwise manipulate a visual or audio parameter) through a user
interface (e.g., mouse, touch-sensitive surface or keyboard). Due
to practical considerations, such as size constraints of an
associated display, power constraints of an associated speaker and
inherent properties of the user interface object (e.g., size,
shape, length and volume), predefined adjustment limits are
commonly assigned to these user interface objects, limiting the
extent to which their properties can be adjusted. Given the
complexity of a user interface environment where predefined
adjustment limits are applied to user interface objects, there is a
need to provide feedback that enables the user to more efficiently
and conveniently adjust the properties of these user interface
objects with respect to the predefined adjustment limits and to alert
a user when the predefined adjustment limits have been reached or
exceeded.
[0319] The embodiments described below provide improved methods and
user interfaces for generating feedback to a user navigating a
complex user interface. More specifically, these methods and user
interfaces provide tactile feedback that warns the user when an
action will result in the adjustment of a user interface object
parameter beyond a predefined adjustment limit. In this
object parameter beyond a predefined adjustment limit. In this
fashion, the methods and user interfaces provided below allow the
user to more efficiently discern between allowed, forbidden and
non-recommended parameter adjustments by providing tactile
feedback, instead of or in addition to audible and/or visual
feedback. Some methods for warning a user that a predefined
adjustment limit has been exceeded rely on an audible or visual
cue. However, there are many situations (e.g., at work, in a
theater and in various social situations) where the volume of an
electronic device will be lowered or muted, rendering audible cues
ineffective. Advantageously, the methods and user interfaces
described below augment or replace audible feedback by providing
tactile feedback indicating that a predefined adjustment limit has
been or will be exceeded, rendering the warning effective even when
the volume of the electronic device has been lowered or muted.
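As a minimal sketch of this warning behavior, assuming a clamped parameter and reusing the hypothetical generateTactileOutput placeholder from the earlier sketches:

    // When a requested adjustment would push a parameter past its
    // predefined limits, the value is clamped and a tactile warning is
    // generated instead of (or alongside) an audible or visual cue.
    func adjust(_ value: Double, by delta: Double,
                within limits: ClosedRange<Double>) -> Double {
        let requested = value + delta
        guard limits.contains(requested) else {
            generateTactileOutput()   // warns that the limit was reached
            return min(max(requested, limits.lowerBound), limits.upperBound)
        }
        return requested
    }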
[0320] FIGS. 11A-11T illustrate exemplary user interfaces for
providing feedback when an action will result in the adjustment of
a parameter beyond a predefined limit in accordance with some
embodiments. The user interfaces in these figures are used to
illustrate the processes described below, including the processes
in FIGS. 12A-12B.
[0321] FIG. 11A illustrates exemplary user interface 10808
displaying images 10814-10830 and a control icon 10802 (e.g., a
thumb or handle of a control) for controlling a parameter (e.g.,
size) of the images in accordance with some embodiments. In FIG.
11A, user interface 10808 is displayed on display 450 of an
electronic device that also includes touch-sensitive surface 451
and one or more sensors for detecting intensity of contacts with
the touch-sensitive surface. In some embodiments, touch-sensitive
surface 451 is a touch screen display that, optionally, is display
450 or a display separate from display 450. User interface 10808
displays a control
icon 10802 for controlling a parameter (e.g., size) associated with
respective content (e.g., images 10814, 10816, 10818, 10820, 10822,
10824, 10826, 10828 and 10830 displayed on user interface 10808).
In FIG. 11A, user interface 10808 also displays cursor 10806,
controllable by the user through contacts on touch-sensitive
surface 451. For example, detection of movement of a contact (e.g.,
a gesture) on touch-sensitive surface 451 corresponds to movement
of cursor 10806 on user interface 10808. In FIG. 11A, user
interface 10808 also displays sizing bar 10804, corresponding to a
plurality of sizes for the displayed content (e.g., images). In
FIG. 11A, the left and right boundaries of sizing bar 10804
correspond to predefined sizing limits for the displayed content
(e.g., images). For example, when control icon 10802 is moved to
the left boundary of sizing bar 10804, the displayed content (e.g.,
images) is displayed at a size corresponding to a predefined
minimum size (e.g., the displayed images are shrunk to a smallest
allowable size). Likewise, when control icon 10802 is moved to the
right boundary of sizing bar 10804, the displayed content (e.g.,
images) is displayed at a size corresponding to a predefined
maximum size (e.g., the displayed images are expanded to a largest
allowable size).
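For illustration only, the mapping just described, from a control
icon position on sizing bar 10804 to a displayed image size bounded
by the predefined sizing limits, can be sketched as follows. This
is a minimal sketch in Swift; the type and its names are
hypothetical and are not part of the described embodiments.

    // Minimal sketch: map a thumb position on a sizing bar to an
    // image size clamped to predefined minimum and maximum limits.
    struct SizingBar {
        let minSize: Double  // left boundary (predefined minimum)
        let maxSize: Double  // right boundary (predefined maximum)

        // `position` is 0.0 at the left boundary and 1.0 at the
        // right boundary; values outside that range correspond to
        // a cursor that has moved past an end of the bar.
        func size(atPosition position: Double) -> Double {
            let clamped = min(max(position, 0.0), 1.0)
            return minSize + clamped * (maxSize - minSize)
        }
    }

    let bar = SizingBar(minSize: 50, maxSize: 400)
    print(bar.size(atPosition: 0.5))  // 225.0
    print(bar.size(atPosition: 1.3))  // 400.0 (clamped at the limit)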
[0322] In some embodiments, the device is an electronic device with
a separate display (e.g., display 450) and a separate
touch-sensitive surface (e.g., touch-sensitive surface 451). In
some embodiments, the device is portable multifunction device 100,
the display is touch-sensitive display system 112, and the
touch-sensitive surface includes tactile output generators 167 on
the display (FIG. 1A). For convenience of explanation, the
embodiments described with reference to FIGS. 11A-11T and FIGS.
12A-12B will be discussed with reference to display 450 and a
separate touch-sensitive surface 451; however, analogous operations
are, optionally, performed on a device with a touch-sensitive
display system 112 in response to detecting movement of the
contacts described in FIGS. 11A-11T on the touch-sensitive display
system 112 while displaying the user interfaces shown in FIGS.
11A-11T on the touch-sensitive display system 112; in such
embodiments, the focus selector is, optionally: a respective
contact, a representative point corresponding to a contact (e.g., a
centroid of a respective contact or a point associated with a
respective contact), or a centroid of two or more contacts detected
on the touch-sensitive display system 112, in place of cursor
10806.
[0323] FIGS. 11A-11H illustrate that contact 10810 and a gesture
including movement of contact 10810 are detected on touch-sensitive
surface 451 (e.g., movement 10812-a of contact 10810 from location
10810-a in FIG. 11A to location 10810-b in FIG. 11B; movement
10812-b of contact 10810 from location 10810-b in FIG. 11B to
location 10810-c in FIG. 11C, FIG. 11D, FIG. 11E or FIG. 11F;
and/or liftoff of contact 10810 from location 10810-c in FIG. 11G
or FIG. 11H). Contact 10810 is detected at a position on
touch-sensitive surface 451 corresponding to an area on display 450
occupied by control icon 10802 (e.g., contact 10810 corresponds to
a focus selector on the display, such as cursor 10806 which is at
or near a location of user interface object 10802). In some
embodiments, movement of contact 10810 on touch-sensitive surface
451 corresponds to movement of a focus selector (e.g., cursor
10806) on display 450 (e.g., as illustrated in FIGS. 11A-11F).
[0324] FIGS. 11A-11B illustrate an example of a beginning of a
gesture where the device adjusts a size of images 10814, 10816,
10818, 10820, 10822, 10824, 10826, 10828 and 10830 in accordance
with movement 10812-a of contact 10810 that controls movement of
cursor 10806 corresponding to movement of a control icon 10802 in
sizing bar 10804. In FIG. 11B, the device does not generate a
tactile output corresponding to exceeding the predefined adjustment
limit (e.g., a size limit corresponding to the end of sizing bar
10804), because the predefined adjustment limit has not been
exceeded. FIG. 11B illustrates an example where, in accordance with
a determination that the adjustment of the parameter (e.g., size)
would not cause one or more predefined adjustment limits to be
exceeded (e.g., movement of control icon 10802 into a respective
area of the display that is within a predefined adjustment limit
indicated by the right boundary of sizing bar 10804), the
electronic device adjusts the parameter without generating a
tactile output on the touch-sensitive surface (e.g., the device
increases the size of images 10814, 10816, 10818, 10820, 10822,
10824, 10826, 10828 and 10830 displayed in user interface 10808 to
a size smaller than the predefined maximum size limit in accordance
with the value of the parameter that corresponds to the current
location of control icon 10802 in the respective area of the
display). In contrast, FIGS. 11C-11F, described below, illustrate
examples where, in accordance with a determination that the
adjustment of the parameter (e.g., size) would cause one or more
predefined adjustment limits to be exceeded (e.g., where the
movement of contact 10810 on the touch-sensitive surface 451
corresponds to a size adjustment of the displayed pictures
exceeding a predefined limit indicated by the right boundary of
sizing bar 10804), tactile output generators 167 generate tactile
outputs 10832 on touch-sensitive surface 451.
[0325] FIGS. 11B-11F illustrate various examples where the device
detects a continuation of a gesture including movement 10812-b of
contact 10810 that controls movement of cursor 10806 beyond an end
of sizing bar 10804. In FIG. 11C, in response to detecting the
continuation of the gesture including movement 10812-b, the device
continues to increase the size of the images up to the predefined
adjustment limit (e.g., a size limit) and moves control icon 10802
beyond an end of sizing bar 10804 in accordance with movement of
cursor 10806. In FIG. 11D, in response to detecting the
continuation of the gesture including movement 10812-b, the device
continues to increase the size of the images up to the predefined
adjustment limit (e.g., a size limit) and moves control icon 10802
up to an end of sizing bar 10804 in accordance with movement of
cursor 10806. In FIG. 11E, in response to detecting the
continuation of the gesture including movement 10812-b, the device
cancels the increase in size of the images corresponding to the
gesture and moves control icon 10802 up to an end of sizing bar
10804 in accordance with movement of cursor 10806. In FIG. 11F, in
response to detecting the continuation of the gesture including
movement 10812-b, the device continues to increase the size of the
images beyond the predefined adjustment limit (e.g., a size limit)
and moves control icon 10802 beyond an end of sizing bar 10804 in
accordance with movement of cursor 10806.
[0326] FIGS. 11C and 11F illustrate examples where the device
detects movement 10812-b of contact 10810 on touch-sensitive
surface 451 that corresponds to movement of cursor 10806 and
control icon 10802 past the right boundary of sizing bar 10804
displayed on display 450 (e.g., movement of cursor 10806 and
control icon 10802 into a respective area of the display that
corresponds to a parameter adjustment exceeding a predefined
adjustment limit for the displayed content).
[0327] FIGS. 11D-11E illustrate examples where the device detects
movement 10812-b of contact 10810 on touch-sensitive surface 451
that corresponds to movement of cursor 10806 past the right
boundary of sizing bar 10804 displayed on display 450 (e.g.,
movement of cursor 10806 into a respective area of the
display that corresponds to a parameter adjustment exceeding a
predefined adjustment limit for the displayed content) and movement
of control icon 10802 to the right boundary of sizing bar 10804
displayed on the display (e.g., movement of control icon 10802 into
a respective area of the display that corresponds to a
predefined adjustment limit for the displayed content, even though
the extent of the movement 10812-b of contact 10810 corresponds to
an adjustment of the parameter that exceeds the predefined
adjustment limit).
[0328] FIGS. 11C-11F illustrate examples where, in accordance with
a determination that the adjustment of the parameter (e.g., size)
would cause one or more predefined adjustment limits to be exceeded
(e.g., where the extent of the movement 10812-b of contact 10810
corresponds to a size adjustment of the displayed pictures
exceeding a predefined limit indicated by the right boundary of
sizing bar 10804), tactile output generators 167 generate tactile
outputs 10832 on touch-sensitive surface 451 that indicate to the
user that the gesture corresponds to an adjustment of the parameter
that would exceed the predefined adjustment limit.
[0329] FIGS. 11C-11D illustrate examples where, in accordance with
a determination that the adjustment of the parameter would cause
the one or more predefined adjustment limits to be exceeded, the
parameter is adjusted so that the predefined adjustment limit is
reached (e.g., the size of images 10814, 10816, 10818, 10820,
10822, 10824, 10826, 10828 and 10830 displayed on user interface
10808 is increased to the predefined maximum size limit).
[0330] FIG. 11E illustrates an example where, in accordance with a
determination that the adjustment of the parameter would cause the
one or more predefined adjustment limits to be exceeded, adjustment
of the parameter is cancelled (e.g., the size of images 10814,
10816, 10818, 10820, 10822, 10824, 10826, 10828 and 10830 displayed
on user interface 10808 is adjusted back to the size the images
were displayed at prior to detecting the movement 10812-a and
10812-b of contact 10810).
[0331] FIG. 11F illustrates an example where, in accordance with a
determination that the adjustment of the parameter would cause the
one or more predefined adjustment limits to be exceeded, the
parameter is adjusted accordingly such that the predefined
adjustment limit is exceeded (e.g., the size of images 10814,
10816, 10818, 10820, 10822, 10824, 10826, 10828 and 10830 displayed
on user interface 10808 is increased past the predefined maximum
size limit).
[0332] FIGS. 11G-11H illustrate various examples where the device
detects liftoff of a contact 10810 used to perform one of the
gestures described above with reference to FIGS. 11A-11F. FIG. 11G
illustrates an example where, in response to liftoff of contact
10810 in FIG. 11F, adjustment of the parameter exceeding the
predefined adjustment limit is not reversed (e.g., the size of
images 10814, 10816, 10818, 10820, 10822, 10824, 10826, 10828 and
10830 displayed on user interface 10808, increased past the
predefined maximum size limit in FIG. 11F, is maintained after
liftoff of contact 10810). In this example, in response to
detecting liftoff of contact 10810, the device moves control icon
10802 back to an end of sizing bar 10804. FIG. 11H illustrates an
example where, in response to liftoff of contact 10810 in FIG. 11F,
adjustment of the parameter exceeding the predefined adjustment
limit is partially reversed to match the predefined adjustment
limit (e.g., the size of images 10814, 10816, 10818, 10820, 10822,
10824, 10826, 10828 and 10830 displayed on user interface 10808,
increased past the predefined maximum size limit in FIG. 11F, is
shrunk back down to the predefined maximum size limit after liftoff
of contact 10810). FIGS. 11G-11H also illustrate examples where, in
response to liftoff of contact 10810 in FIG. 11F, display of
control icon 10802 is adjusted to correspond to the right boundary
of sizing bar 10804 (e.g., movement of control icon 10802 back to
the right boundary of sizing bar 10804).
[0333] FIGS. 11I-11K illustrate a contact 10840 and a gesture
including movement 10842 of contact 10840 that are detected on
touch-sensitive surface 451 (e.g., movement 10842-a of contact
10840 from location 10840-a in FIG. 11I to location 10840-b in FIG.
11J and/or movement 10842-b of contact 10840 from location 10840-b
in FIG. 11J to location 10840-c in FIG. 11K). Contact 10840 is
detected at a position on touch-sensitive surface 451 corresponding
to an area on display 450 occupied by control 10836 (e.g., contact
10840 corresponds to a focus selector on the display, such as
cursor 10806 which is at or near a location of user interface
object 10836). The gesture in FIGS. 11I-11K includes movement 10842
of contact 10840 on touch-sensitive surface 451 that corresponds to
movement of a focus selector (e.g., a cursor 10806) on display
450.
[0334] In some embodiments, as illustrated in FIGS. 11I-11K, the
content is a media clip (e.g., media clips 10844, 10846, 10848,
10850, 10852 and/or 10854) that includes audio (e.g., audio 10834),
the parameter is a volume level (e.g., volume level 10836), the
predefined adjustment limits include a clipping limit (e.g.,
clipping limit 10838), and the clipping limit is exceeded when a
maximum volume that occurs in the content is above the clipping
limit. For example, as illustrated in FIG. 11J, in response to
detecting movement 10842-a of contact 10840 corresponding to
adjustment of a parameter that would not cause a predefined
adjustment limit to be exceeded (e.g., increasing volume level
10836 such that the maximum volume of audio 10834 does not exceed
volume clipping limit 10838), the electronic device adjusts the
parameter without generating a tactile output on the
touch-sensitive surface. In contrast, as illustrated in FIG. 11K,
in response to detecting movement 10842-b of contact 10840
corresponding to adjustment of the parameter that would cause the
predefined adjustment limit to be exceeded (e.g., increasing volume
level 10836 such that the maximum volume of audio 10834 exceeds
volume clipping limit 10838), tactile output generators 167
generate tactile outputs 10832 on touch-sensitive surface 451.
[0335] FIGS. 11L-11N illustrate a contact 10860 and a gesture
including movement 10862 of contact 10860 that are detected on
touch-sensitive surface 451 (e.g., movement 10862-a of contact
10860 from location 10860-a in FIG. 11L to location 10860-b in FIG.
11M and/or movement 10862-b of contact 10860 from location 10860-b
in FIG. 11M to location 10860-c in FIG. 11N). Contact 10860 is
detected at a position on touch-sensitive surface 451 corresponding
to an area on display 450 occupied by control 10858 (e.g., contact
10860 corresponds to a focus selector on the display, such as
cursor 10864 which is at or near a location of user interface
object 10858). The gesture in FIGS. 11L-11N includes movement 10862
of contact 10860 on touch-sensitive surface 451 that corresponds to
movement of a focus selector (e.g., a cursor 10864) on display
450.
[0336] In some embodiments, as illustrated in FIGS. 11L-11N, the
content is a media clip (e.g., media clip 10856), the parameter is
a cropping mask (e.g., cropping mask 10858), the predefined
adjustment limits include a time-based content boundary (e.g., the
right boundary of media clip 10856), and the time-based content
boundary is exceeded when the cropping mask extends beyond the
time-based content boundary. For example, as illustrated in FIG.
11M, in response to detecting movement 10862-a of contact 10860
corresponding to adjustment of a parameter that would not cause a
predefined adjustment limit to be exceeded (e.g., extending
cropping mask 10858 to, but not past, the right boundary of media
clip 10856),
the electronic device adjusts the parameter (e.g., size of cropping
mask 10858) without generating a tactile output on the
touch-sensitive surface. In contrast, as illustrated in FIG. 11N,
in response to detecting movement 10862-b of contact 10860
corresponding to adjustment of the parameter that would cause the
predefined adjustment limit to be exceeded (e.g., extending
cropping mask 10858 past the right boundary of media clip 10856),
tactile output generators 167 generate tactile outputs 10832 on
touch-sensitive surface 451.
[0337] FIGS. 11O-11Q illustrate a contact 10870 and a gesture
including movement 10872 of contact 10870 that are detected on
touch-sensitive surface 451 (e.g., movement 10872-a of contact
10870 from location 10870-a in FIG. 11O to location 10870-b in FIG.
11P and/or movement 10872-b of contact 10870 from location 10870-b
in FIG. 11P to location 10870-c in FIG. 11Q). Contact 10870 is
detected at a position on touch-sensitive surface 451 corresponding
to an area on display 450 occupied by control 10866 (e.g., contact
10870 corresponds to a focus selector on the display, such as
cursor 10806 which is at or near a location of user interface
object 10866). The gesture in FIGS. 11O-11Q includes movement 10872
of contact 10870 on touch-sensitive surface 451 that corresponds to
movement of a focus selector (e.g., a cursor 10806) on display
450.
[0338] In some embodiments, as illustrated in FIGS. 11O-11Q, the
content is an image (e.g., image 10868), the parameter is a
cropping mask (e.g., cropping mask 10866), the predefined
adjustment limits include a content boundary (e.g., an outer edge
of image 10868), and the content boundary is exceeded when the
cropping mask extends beyond the content boundary. For example, as
illustrated in FIG. 11P, in response to detecting movement 10872-a
of contact 10870 corresponding to adjustment of a parameter that
would not cause a predefined adjustment limit to be exceeded
(e.g., extending cropping mask 10866 to, but not past, the lower
and right borders of image 10868), the electronic device adjusts
the parameter (e.g., size of cropping mask 10866) without
generating a tactile output on the touch-sensitive surface. In
contrast, as illustrated in FIG. 11Q, in response to detecting
movement 10872-b corresponding to adjustment of the parameter that
would cause a predefined adjustment limit to be exceeded (e.g.,
extending cropping mask 10866 past the lower and right borders of
image 10868), tactile output generators 167 generate tactile
outputs 10832 on touch-sensitive surface 451.
[0339] FIGS. 11R-11T illustrate a contact 10880 and a gesture
including movement 10882 of contact 10880 that are detected on
touch-sensitive surface 451 (e.g., movement 10882-a of contact
10880 from location 10880-a in FIG. 11R to location 10880-b in FIG.
11S and/or movement 10882-b of contact 10880 from location 10880-b
in FIG. 11S to location 10880-c in FIG. 11T). Contact 10880 is
detected at a position on touch-sensitive surface 451 corresponding
to an area on display 450 occupied by control 10878 (e.g., contact
10880 corresponds to a focus selector on the display, such as
cursor 10806 which is at or near a location of user interface
object 10878). The gesture in FIGS. 11R-11T includes movement 10882
of contact 10880 on touch-sensitive surface 451 that corresponds to
movement of a focus selector (e.g., a cursor 10806) on display
450.
[0340] In some embodiments, as illustrated in FIGS. 11R-11T, the
content is a shape (e.g., square 10876), the parameter is a shape
adjustment parameter (e.g., roundness of the corners on square
10876), and the predefined adjustment limits include a maximum
allowable adjustment to a respective portion of a respective shape
(e.g., maximum radius for rounding the corners of square 10876).
For example, as illustrated in FIG. 11S, in response to detecting
movement 10882-a corresponding to adjustment of a parameter that
would not cause a predefined adjustment limit to be exceeded
(e.g., rounding the corners of square 10876 using a radius that
matches, but does not exceed, a predetermined maximum radius), the
electronic device adjusts the parameter (e.g., roundness of the
corners of square 10876) without generating a tactile output on the
touch-sensitive surface. In contrast, as illustrated in FIG. 11T,
in response to detecting movement 10882-b corresponding to
adjustment of the parameter that would cause a predefined
adjustment limits to be exceeded (e.g., rounding the corners of
square 10876 using a radius that exceeds a predetermined maximum
radius), tactile output generators 167 generate tactile outputs
10832 on touch-sensitive surface 451.
[0341] FIGS. 12A-12B are flow diagrams illustrating a method 10900
of providing feedback when an action will result in the adjustment
of a parameter beyond a predefined limit in accordance with some
embodiments. The method 10900 is performed at an electronic device
(e.g., device 300, FIG. 3, or portable multifunction device 100,
FIG. 1A) with a display and a touch-sensitive surface. In some
embodiments, the display is a touch screen display and the
touch-sensitive surface is on the display. In some embodiments, the
display is separate from the touch-sensitive surface. Some
operations in method 10900 are, optionally, combined and/or the
order of some operations is, optionally, changed.
[0342] As described below, the method 10900 provides an intuitive
way to provide feedback when an action will result in the
adjustment of a parameter beyond a predefined limit. The method
reduces the cognitive burden on a user when adjusting a parameter
with respect to a predefined limit, thereby creating a more
efficient human-machine interface. For battery-operated electronic
devices, enabling a user to detect such feedback faster and more
efficiently conserves power and increases the time between battery
charges.
[0343] In some embodiments, the device displays (10902), on a
display (e.g., display 450 in FIGS. 11A-11T), a control (e.g., a
resizing control including sizing bar 10804 and control icon 10802
in FIGS. 11A-11H, control 10836 in FIGS. 11I-11K, control 10858 in
FIGS. 11L-11N, control 10866 in FIGS. 11O-11Q, or control 10878 in
FIGS. 11R-11T) for controlling a parameter associated with
respective content (e.g., the size of images 10814, 10816, 10818,
10820, 10822, 10824, 10826, 10828 and 10830 in FIGS. 11A-11H, the
volume level of audio 10834 in FIGS. 11I-11K, cropping mask 10858
applied to media clip 10856 in FIGS. 11L-11N, cropping mask 10866
applied to image 10868 in FIGS. 11O-11Q, or the roundness of the
corners of square 10876 in FIGS. 11R-11T).
[0344] In some embodiments, while the device displays the control,
the device detects (10904) a gesture (e.g., movement 10812 of
contact 10810 in FIGS. 11A-11H, movement 10842 of contact 10840 in
FIGS. 11I-11K, movement 10862 of contact 10860 in FIGS. 11L-11N,
movement 10872 of contact 10870 in FIGS. 11O-11Q, or movement 10882
of contact 10880 in FIGS. 11R-11T) on a touch-sensitive surface
(e.g., touch-sensitive surface 451) for adjusting the
parameter.
[0345] In some embodiments, in response (10906) to detecting the
gesture: the device determines (10908) an adjustment of the
parameter that corresponds to an extent of the gesture (e.g., an
extent of lateral movement, an extent of rotation, or an extent of
increase/decrease in intensity of a contact detected on the
touch-sensitive surface).
[0346] In response (10906) to detecting the gesture: in accordance
with a determination that the adjustment of the parameter would
cause one or more predefined adjustment limits to be exceeded, the
device generates (10910) a respective tactile output (e.g., tactile
outputs 10832 in FIGS. 11C-11F, FIG. 11K, FIG. 11N, FIG. 11Q or
FIG. 11T) on the touch-sensitive surface (e.g., touch-sensitive
surface 451).
[0347] In some embodiments, in response (10906) to detecting the
gesture: in accordance with a determination that the adjustment of
the parameter would cause one or more predefined adjustment limits
to be exceeded, the device forgoes (10912) adjusting the parameter
(e.g., adjustment of the parameter is cancelled if the requested
adjustment exceeds the adjustment limits for the parameter). For
example, as illustrated in FIGS. 11C and 11E, where a gesture
including movement 10812-b of contact 10810 corresponds to a
size adjustment of images 10814, 10816, 10818, 10820, 10822, 10824,
10826, 10828 and 10830 that would exceed the predefined adjustment
limit corresponding to the right boundary of sizing bar 10804, the
images are displayed at a magnification corresponding to the size
of the images prior to the gesture, as shown in FIG. 11E.
[0348] In some embodiments, in response (10906) to detecting the
gesture: in accordance with a determination that the adjustment of
the parameter would cause the one or more predefined adjustment
limits to be exceeded, the device adjusts (10914) the parameter so
that the predefined adjustment limit is reached (e.g., the extent
of the adjustment of the parameter is limited by the predetermined
adjustment limit). For example, as illustrated in FIGS. 11C-11D,
where the gesture including movement 10812-b of contact 10810
corresponds to a size adjustment of images 10814, 10816, 10818,
10820, 10822, 10824, 10826, 10828 and 10830 that would exceed the
predefined adjustment limit corresponding to the right boundary of
sizing bar 10804, the images are displayed at a magnification
corresponding to the predefined maximum size adjustment limit, as
shown in FIG. 11D. In some embodiments, the parameter is adjusted
to a respective predefined adjustment limit, but the respective
predefined adjustment limit is not exceeded (e.g., a thumb on a
slider is adjusted to the end of the slider, but no further).
[0349] In some embodiments, in response (10906) to detecting the
gesture: in accordance with a determination that the adjustment of
the parameter would cause the one or more predefined adjustment
limits to be exceeded, the device performs (10916) the adjustment
of the parameter (e.g., the extent of the adjustment of the
parameter is not limited by the predetermined adjustment limit).
For example, as illustrated in FIG. 11F, where the gesture
including movement 10812-b of contact 10810 corresponds to a size
adjustment of images 10814, 10816, 10818, 10820, 10822, 10824,
10826, 10828 and 10830 that would exceed the predefined adjustment
limit corresponding to the right boundary of sizing bar 10804, the
images are displayed at a magnification corresponding to the size
exceeding the predefined maximum size adjustment limit (e.g., even
if adjusting the parameter in this way causes the one or more
predefined adjustment limits to be exceeded), as shown in FIG. 11F.
For example, a user is allowed to extend a cropping mask outside of
a canvas (e.g., as shown in FIG. 11Q) or beyond an end of a media
clip (e.g., as shown in FIG. 11N), but the device warns the user
that cropping while the cropping mask is outside of the canvas or
beyond the end of the media clip will add blank content to the
respective content.
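Taken together, operations 10912, 10914 and 10916 describe three
alternative policies for handling an adjustment that would exceed a
predefined limit: cancel it, clamp it to the limit, or perform it
anyway, in each case after generating the respective tactile output
(operation 10910). A minimal Swift sketch of that branching, using
hypothetical names and a simple scalar parameter, follows.

    // Hypothetical sketch of the limit-handling branch of method
    // 10900; `requested` is the parameter value implied by the
    // extent of the gesture.
    enum LimitPolicy {
        case cancel  // forgo the adjustment (FIG. 11E)
        case clamp   // adjust up to the limit, no further (FIG. 11D)
        case allow   // adjust past the limit (FIG. 11F)
    }

    func adjust(parameter current: Double,
                requested: Double,
                limits: ClosedRange<Double>,
                policy: LimitPolicy,
                generateTactileOutput: () -> Void) -> Double {
        guard !limits.contains(requested) else {
            // Within the predefined limits: adjust the parameter
            // without a tactile output (operation 10926).
            return requested
        }
        // The adjustment would exceed a predefined limit: generate
        // the respective tactile output (operation 10910), then
        // apply the chosen policy.
        generateTactileOutput()
        switch policy {
        case .cancel: return current
        case .clamp:  return min(max(requested, limits.lowerBound),
                                 limits.upperBound)
        case .allow:  return requested
        }
    }

    // Example: a size gesture past the maximum limit, clamped.
    let newSize = adjust(parameter: 100, requested: 450,
                         limits: 50...400, policy: .clamp) {
        print("tactile output")
    }
    print(newSize)  // 400.0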
[0350] In some embodiments, the content is (10918) a media clip
(e.g., media clips 10844, 10846, 10848, 10850, 10852 and/or 10854
in FIGS. 11I-11K) that includes audio (e.g., audio 10834 in FIGS.
11I-11K), the parameter is a volume level (e.g., volume level 10836
in FIGS. 11I-11K), the predefined adjustment limits include a
clipping limit (e.g., clipping limit 10838 in FIGS. 11I-11K), and
the clipping limit is exceeded when a maximum volume that occurs in
the content is above the clipping limit (e.g., when the tallest
peaks of audio 10834 exceed clipping limit 10838 in FIG. 11K). In
some embodiments, the clipping limit is an analog clipping limit
(e.g., the sound is limited by the physical/electrical
characteristics of an amplifier such that sounds above the clipping
limit would push an amplifier to create a signal with more power
than its power supply can produce and thus the amplifier amplifies
the signal only up to its maximum capacity). In some embodiments,
the clipping limit is a digital clipping limit (e.g., the signal is
restricted by the predetermined range of digital representations of
the sound, and a sound with an amplitude above the range of the
chosen digital representation will be represented as the maximum
digital representation).
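As a hedged illustration of the clipping determination described
above, the check reduces to asking whether the loudest sample in
the clip, after the requested volume adjustment, would exceed the
clipping limit. The function and its names below are hypothetical.

    // Hypothetical sketch: would raising the volume level cause the
    // maximum volume that occurs in the content to exceed the
    // clipping limit? `samples` are normalized amplitudes, and
    // `clippingLimit` is, e.g., 1.0 for full-scale digital audio.
    func wouldClip(samples: [Double], gain: Double,
                   clippingLimit: Double) -> Bool {
        guard let peak = samples.map({ abs($0) }).max() else {
            return false  // no audio, nothing to clip
        }
        return peak * gain > clippingLimit
    }

    let audio = [0.2, -0.7, 0.5]
    print(wouldClip(samples: audio, gain: 1.2, clippingLimit: 1.0))
    // false -- peak becomes 0.84; adjust without a tactile output
    print(wouldClip(samples: audio, gain: 1.6, clippingLimit: 1.0))
    // true -- peak becomes 1.12; a tactile output is generated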
[0351] In some embodiments, the content is (10920) a media clip
(e.g., media clip 10856 in FIGS. 11L-11N), the parameter is a
cropping mask (e.g., cropping mask 10858 in FIGS. 11L-11N), the
predefined adjustment limits include a time-based content boundary
(e.g., the right boundary of media clip 10856 in FIGS. 11L-11N),
and the time-based content boundary is exceeded when the cropping
mask extends beyond the time-based content boundary (e.g., the user
tries to perform an operation that corresponds to cropping the
media clip before the beginning time of the media clip, or after
the end time of the media clip). For example, as illustrated in
FIG. 11N, the time-based content boundary is exceeded when cropping
mask 10858 extends past the right time-based content boundary of
media clip 10856.
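A hedged, one-dimensional sketch of this time-based check: the
boundary is exceeded when the cropping interval starts before the
clip's beginning time or ends after the clip's end time (all names
below are hypothetical).

    // Hypothetical sketch: a time-based content boundary is
    // exceeded when the cropping mask extends before the clip's
    // start time or past its end time (times in seconds).
    func cropExceedsTimeBoundary(cropStart: Double, cropEnd: Double,
                                 clipStart: Double,
                                 clipEnd: Double) -> Bool {
        cropStart < clipStart || cropEnd > clipEnd
    }

    print(cropExceedsTimeBoundary(cropStart: 0, cropEnd: 9,
                                  clipStart: 0, clipEnd: 10)) // false
    print(cropExceedsTimeBoundary(cropStart: 0, cropEnd: 12,
                                  clipStart: 0, clipEnd: 10)) // true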
[0352] In some embodiments, the content is an image (e.g., image
10868 in FIGS. 11O-11Q), the parameter is a cropping mask (e.g.,
cropping mask 10866 in FIGS. 11O-11Q), the predefined adjustment
limits include a content boundary (e.g., an outer edge of image
10868 in FIGS. 11O-11Q), and the content boundary is exceeded when
the cropping mask extends beyond the content boundary (e.g., beyond
the border of an image). For example, as illustrated in FIG. 11Q,
the content boundary is exceeded when cropping mask 10866 extends
beyond the lower and right borders of image 10868.
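The two-dimensional analogue for an image can be sketched the same
way: the content boundary is exceeded when any edge of the cropping
mask extends past the corresponding edge of the image. This is a
hypothetical sketch; `Rect` is a stand-in for whatever rectangle
type an implementation uses.

    // Hypothetical sketch: does the cropping mask extend beyond the
    // content boundary of the image?
    struct Rect {
        var x, y, width, height: Double
        var maxX: Double { x + width }
        var maxY: Double { y + height }
    }

    func maskExceedsBoundary(mask: Rect, content: Rect) -> Bool {
        mask.x < content.x || mask.y < content.y
            || mask.maxX > content.maxX || mask.maxY > content.maxY
    }

    let image = Rect(x: 0, y: 0, width: 800, height: 600)
    let inside = Rect(x: 100, y: 100, width: 400, height: 300)
    let outside = Rect(x: 500, y: 400, width: 400, height: 300)
    print(maskExceedsBoundary(mask: inside, content: image))
    // false -- adjust without a tactile output
    print(maskExceedsBoundary(mask: outside, content: image))
    // true -- past the lower and right borders, as in FIG. 11Q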
[0353] In some embodiments, the content is a shape (e.g., square
10876 in FIG. 11R), the parameter is a shape adjustment parameter
(e.g., a parameter corresponding to the roundness of the corners on
square 10876 in FIGS. 11R-11T), and the predefined adjustment
limits include a maximum allowable adjustment to a respective
portion of a respective shape (e.g., maximum radius for rounding
the corners of square 10876 in FIGS. 11R-11T). In some embodiments,
the shape adjustment parameter is, for example, roundness of a
shape corner, opacity or line width. In some embodiments, the
predefined adjustment limit is, for example, a maximum radius for
rounding a corner, a minimum/maximum opacity, or a minimum/maximum
line width.
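For this family of shape limits the determination is again a simple
comparison against the maximum allowable adjustment; a hypothetical
Swift sketch for the corner-radius case:

    // Hypothetical sketch: clamp a requested corner radius to the
    // maximum allowable radius, reporting whether the limit would
    // have been exceeded (which is what triggers the tactile
    // output).
    func roundedCornerRadius(requested: Double, maximum: Double)
        -> (radius: Double, limitExceeded: Bool) {
        (min(requested, maximum), requested > maximum)
    }

    print(roundedCornerRadius(requested: 12, maximum: 20))
    // (radius: 12.0, limitExceeded: false)
    print(roundedCornerRadius(requested: 28, maximum: 20))
    // (radius: 20.0, limitExceeded: true)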
[0354] In response (10906) to detecting the gesture: in accordance
with a determination that the adjustment of the parameter would not
cause the one or more predefined adjustment limits to be exceeded,
the device performs (10926) the adjustment of the parameter without
generating the respective tactile output (e.g., a tactile output
corresponding to exceeding the predefined adjustment limit) on the
touch-sensitive surface (e.g., touch-sensitive surface 451). For
example, as illustrated in FIGS. 11B, 11J, 11M, 11P and 11S, where
a gesture corresponds to an adjustment of the parameter that does
not exceed a predefined adjustment limit, the adjustment is
performed and no tactile output is generated on the touch-sensitive
surface.
[0355] It should be understood that the particular order in which
the operations in FIGS. 12A-12B have been described is merely
exemplary and is not intended to indicate that the described order
is the only order in which the operations could be performed. One
of ordinary skill in the art would recognize various ways to
reorder the operations described herein. Additionally, it should be
noted that details of other processes described herein with respect
to other methods described herein (e.g., those listed in paragraph
[0058]) are also applicable in an analogous manner to method 10900
described above with respect to FIGS. 12A-12B. For example, the
contacts, gestures, user interface objects, tactile sensations and
focus selectors described above with reference to method 10900
optionally have one or more of the characteristics of the contacts,
gestures, user interface objects, tactile sensations and focus
selectors described herein with reference to other methods
described herein (e.g., those listed in paragraph [0058]). For
brevity, these details are not repeated here.
[0356] In accordance with some embodiments, FIG. 13 shows a
functional block diagram of an electronic device 11000 configured
in accordance with the principles of the various described
embodiments. The functional blocks of the device are, optionally,
implemented by hardware, software, or a combination of hardware and
software to carry out the principles of the various described
embodiments. It is understood by persons of skill in the art that
the functional blocks described in FIG. 13 are, optionally,
combined or separated into sub-blocks to implement the principles
of the various described embodiments. Therefore, the description
herein optionally supports any possible combination or separation
or further definition of the functional blocks described
herein.
[0357] As shown in FIG. 13, an electronic device 11000 includes a
display unit 11002 configured to display a control for controlling
a parameter associated with respective content, a touch-sensitive
surface unit 11004 configured to receive user contacts; optionally,
one or more sensor units 11006 configured to detect intensity of
contacts with the touch-sensitive surface unit 11004; and a
processing unit 11008 coupled to the display unit 11002, the
touch-sensitive surface unit 11004 and optionally the one or more
sensor units 11006. In some embodiments, the processing unit 11008
includes a display enabling unit 11010, a detecting unit 11012, a
determining unit 11014, a generating unit 11016, and an adjusting
unit 11018.
[0358] In some embodiments, the processing unit 11008 is configured
to enable display (e.g., with the display enabling unit 11010) of a
control for controlling a parameter associated with respective
content. In some embodiments, the processing unit 11008 is further
configured to detect a gesture on the touch-sensitive surface unit
11004 for adjusting the parameter (e.g., with the detecting unit
11012); and in response to detecting the gesture: the processing
unit 11008 is configured to determine an adjustment of the parameter
that corresponds to an extent of the gesture (e.g., with the
determining unit 11014); in accordance with a determination that
the adjustment of the parameter would cause one or more predefined
adjustment limits to be exceeded, the processing unit 11008 is
configured to generate a respective tactile output on the
touch-sensitive surface unit (e.g., with the generating unit
11016); and in accordance with a determination that the adjustment
of the parameter would not cause the one or more predefined
adjustment limits to be exceeded, the processing unit 11008 is
configured to perform the adjustment of the parameter (e.g., with
the adjusting unit 11018) without generating the respective tactile
output on the touch-sensitive surface unit 11004.
[0359] In some embodiments, the processing unit 11008 is further
configured to, in accordance with a determination that the
adjustment of the parameter would cause the one or more predefined
adjustment limits to be exceeded, forgo adjusting the parameter
(e.g., with the adjusting unit 11018).
[0360] In some embodiments, the processing unit 11008 is further
configured to, in accordance with a determination that the
adjustment of the parameter would cause the one or more predefined
adjustment limits to be exceeded, adjust the parameter so that the
predefined adjustment limit is reached (e.g., with the adjusting
unit 11018).
[0361] In some embodiments, the processing unit 11008 is further
configured to, in accordance with a determination that the
adjustment of the parameter would cause the one or more predefined
adjustment limits to be exceeded, perform the adjustment of the
parameter (e.g., with the adjusting unit 11018).
[0362] In some embodiments, the content is a media clip that
includes audio, the parameter is a volume level, the predefined
adjustment limits include a clipping limit and the clipping limit
is exceeded when a maximum volume that occurs in the content is
above the clipping limit.
[0363] In some embodiments, the content is a media clip, the
parameter is a cropping mask, the predefined adjustment limits
include a time-based content boundary and the time-based content
boundary is exceeded when the cropping mask extends beyond the
time-based content boundary.
[0364] In some embodiments, the content is an image, the parameter
is a cropping mask, the predefined adjustment limits include a
content boundary and the content boundary is exceeded when the
cropping mask extends beyond the content boundary.
[0365] In some embodiments, the content is a shape, the parameter
is a shape adjustment parameter and the predefined adjustment
limits include a maximum allowable adjustment to a respective
portion of a respective shape.
[0366] The operations in the information processing methods
described above are, optionally, implemented by running one or more
functional modules in information processing apparatus such as
general purpose processors (e.g., as described above with respect
to FIGS. 1A and 3) or application specific chips.
[0367] The operations described above with reference to FIGS.
12A-12B are, optionally, implemented by components depicted in
FIGS. 1A-1B or FIG. 13. For example, detection operation 10904 and
determination operations 10908, 10910, 10912, 10914, 10916 and
10926 are, optionally, implemented by event sorter 170, event
recognizer 180, and event handler 190. Event monitor 171 in event
sorter 170 detects a contact on touch-sensitive display 112, and
event dispatcher module 174 delivers the event information to
application 136-1. A respective event recognizer 180 of application
136-1 compares the event information to respective event
definitions 186, and determines whether a first contact at a first
location on the touch-sensitive surface corresponds to a predefined
event or sub-event, such as selection of an object on a user
interface, adjustment of a parameter associated with respective
content, or generation of a tactile output (e.g., corresponding to
a determination that an adjustment of a parameter would cause one
or more predefined adjustment limits to be exceeded). When a
respective predefined event or sub-event is detected, event
recognizer 180 activates an event handler 190 associated with the
detection of the event or sub-event. Event handler 190 optionally
utilizes or calls data updater 176 or object updater 177 to update
the application internal state 192. In some embodiments, event
handler 190 accesses a respective GUI updater 178 to update what is
displayed by the application. Similarly, it would be clear to a
person having ordinary skill in the art how other processes can be
implemented based on the components depicted in FIGS. 1A-1B.
Providing Tactile Feedback Corresponding to a Clock
[0368] Many electronic devices have graphical user interfaces that
include a representation of a clock. For example, many cellular
phones, laptops, and tablets have a representation of a clock
prominently displayed on the graphical user interface. There is
often a need to provide efficient and convenient ways for users to
receive feedback corresponding to the clock. The embodiments below
improve on existing methods by generating tactile outputs for the
user that correspond to the clock (e.g., a "tick tock" pattern of
tactile outputs) indicating that a focus selector is over the
representation of the clock and, optionally, providing an
indication of the rate at which time is passing.
[0369] FIGS. 14A-14J illustrate exemplary user interfaces for
providing tactile feedback corresponding to a clock in accordance
with some embodiments. The user interfaces in these figures are
used to illustrate the processes described below, including the
processes described below with reference to FIGS. 15A-15B.
[0370] FIG. 14A illustrates an example of a user interface that
includes a representation of a clock. User interface 11100 is
displayed on display 450 of a device (e.g., device 300) and is
responsive to contacts (e.g., a finger contact) on touch-sensitive
surface 451. User interface 11100 includes representation 11102 of
a clock. FIG. 14A further illustrates contact 11106 on
touch-sensitive surface 451 and, in accordance with some
embodiments, a displayed representation of a focus selector (e.g.,
cursor 11104) at position 11104-a, corresponding to contact 11106.
[0371] In some embodiments, the device is portable multifunction
device 100, the display is touch-sensitive display system 112, and
the touch-sensitive surface includes tactile output generators 167
on the display (FIG. 1A). For convenience of explanation, the
embodiments described with reference to FIGS. 14A-14J and FIGS.
15A-15B will be discussed with reference to display 450 and a
separate touch-sensitive surface 451; however, analogous operations
are, optionally, performed on a device with a touch-sensitive
display system 112 in response to detecting the contacts described
in FIGS. 14A-14J on the touch-sensitive display system 112 while
displaying the user interfaces shown in FIGS. 14A-14J on the
touch-sensitive display system 112; in such embodiments, the focus
selector is, optionally: a respective contact, a representative
point corresponding to a contact (e.g., a centroid of a respective
contact or a point associated with a respective contact), or a
centroid of two or more contacts detected on the touch-sensitive
display system 112, in place of cursor 11104.
[0372] FIGS. 14A and 14B illustrate an example of detecting a focus
selector over a representation of a clock. In this example, contact
11106 and movement 11108 of contact 11106 are detected on
touch-sensitive surface 451. Movement of a focus selector (e.g.,
cursor 11104), corresponding to movement 11108, causes the focus
selector (e.g., cursor 11104) to move from position 11104-a in
FIG. 14A that is not over representation 11102 of a clock to
position 11104-b in FIG. 14B that is over the representation 11102
of the clock and the device starts to provide tactile feedback
11110 (e.g., generating tactile outputs corresponding to a tick
tock sensation) on touch-sensitive surface 451.
[0373] FIG. 14C illustrates an example of continuing to provide
tactile feedback while detecting a focus selector over a
representation of a clock. In this example, cursor 11104 at
position 11104-b is over representation 11102 of a clock and
tactile feedback 11110 continues to be provided on touch-sensitive
surface 451. As discussed below with reference to FIGS. 14F-14J,
tactile
feedback 11110 includes a regular pattern of tactile outputs on
touch-sensitive surface 451.
[0374] FIGS. 14C and 14D illustrate an example of movement of a
focus selector that maintains the focus selector over a
representation of a clock. In this example, a focus selector (e.g.,
cursor 11104) is initially at position 11104-b as shown in FIG.
14C. As shown in FIG. 14D, contact 11112 and movement 11114 are
detected on touch-sensitive surface 451 and the corresponding
movement of cursor 11104 causes cursor 11104 to move to position
11104-c. Since cursor 11104 is over representation 11102 of a clock
when at position 11104-c, tactile feedback 11110 is provided on
touch-sensitive surface 451. In some embodiments, the period of the
regular pattern of tactile feedback is not based on movement of a
focus selector (e.g., cursor 11104) that maintains the focus
selector (e.g., cursor 11104) over representation 11102 of the
clock. For example, in some embodiments, while the cursor 11104
remains over representation 11102 of the clock, the period of the
regular pattern of tactile feedback is independent of movement of
cursor 11104.
[0375] FIGS. 14D and 14E illustrate an example of movement of a
focus selector away from a representation of a clock and therefore
ceasing to provide tactile feedback corresponding to the clock. In
this example, cursor 11104 is initially at position 11104-c as
shown in FIG. 14D. As shown in FIG. 14E, contact 11116 and movement
11118 are detected on touch-sensitive surface 451 and the
corresponding movement of cursor 11104 causes cursor 11104 to move
to position 11104-d. Since cursor 11104 is no longer over
representation 11102 of a clock when at position 11104-d, tactile
feedback is no longer provided on touch-sensitive surface 451.
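The start/continue/stop behavior illustrated in FIGS. 14B-14E
amounts to a hit test against the clock's displayed bounds each
time the focus selector moves: feedback starts when the selector
enters those bounds and ceases when it leaves. A minimal Swift
sketch with hypothetical names:

    // Hypothetical sketch: start or cease tick-tock feedback as the
    // focus selector moves onto or off of the clock representation.
    struct Point { var x, y: Double }
    struct Bounds {
        var x, y, width, height: Double
        func contains(_ p: Point) -> Bool {
            p.x >= x && p.x < x + width &&
            p.y >= y && p.y < y + height
        }
    }

    final class ClockFeedback {
        let clockBounds: Bounds
        private(set) var isTicking = false
        init(clockBounds: Bounds) { self.clockBounds = clockBounds }

        func focusSelectorMoved(to p: Point) {
            let over = clockBounds.contains(p)
            // Movement that stays over (or off) the clock changes
            // nothing; only crossing the boundary starts or ceases
            // the regular pattern of tactile outputs.
            guard over != isTicking else { return }
            isTicking = over
            print(over ? "start tick-tock feedback"
                       : "cease tick-tock feedback")
        }
    }

    let clock = ClockFeedback(
        clockBounds: Bounds(x: 10, y: 10, width: 100, height: 40))
    clock.focusSelectorMoved(to: Point(x: 50, y: 20))  // start
    clock.focusSelectorMoved(to: Point(x: 60, y: 25))  // no change
    clock.focusSelectorMoved(to: Point(x: 300, y: 25)) // cease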
[0376] FIGS. 14F-14H illustrate example waveforms of movement
profiles for generating the tactile feedback. FIG. 14F illustrates
a triangle waveform with period 11130-1. FIG. 14G illustrates a
square waveform with period 11130-2 and FIG. 14H illustrates a
sawtooth waveform with period 11130-3. In some embodiments, one of
the movement profiles illustrated in FIGS. 14F-14H will be utilized
when generating tactile feedback 11110 corresponding to a clock, as
discussed above. In these examples, since the regular pattern
comprises repetition of a single waveform, the period of the regular
pattern is the same as the period of the individual waveform in the
regular pattern. In some embodiments, the period (e.g., from peak
to peak or leading edge to leading edge) of the regular pattern is
1 second.
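A hedged Swift sketch of these movement profiles as periodic
functions of time, returning a normalized displacement in the range
-1 to 1 (the enum and its cases are illustrative, not an actual
API):

    // Hypothetical sketch of the movement profiles of FIGS.
    // 14F-14H.
    enum MovementProfile {
        case triangle, square, sawtooth

        // `t` is time in seconds; `period` is the waveform period,
        // e.g., 1 second as described above.
        func displacement(at t: Double, period: Double) -> Double {
            let phase = (t / period)
                .truncatingRemainder(dividingBy: 1.0)  // 0..<1
            switch self {
            case .square:   return phase < 0.5 ? 1.0 : -1.0
            case .sawtooth: return 2.0 * phase - 1.0
            case .triangle:
                return phase < 0.5 ? 4.0 * phase - 1.0
                                   : 3.0 - 4.0 * phase
            }
        }
    }

    let wave = MovementProfile.triangle
    for step in 0..<4 {
        print(wave.displacement(at: Double(step) * 0.25,
                                period: 1.0))
    }
    // -1.0, 0.0, 1.0, 0.0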
[0377] In accordance with some embodiments, FIGS. 14I-14J
illustrate example
waveforms of movement profiles that include an alternating sequence
of outputs. In some embodiments, tactile feedback 11110 includes an
alternating sequence of tactile outputs that have different output
characteristics. FIG. 14I illustrates an alternating sequence of
square waves with approximately the same period and with different
amplitudes 11132-1 and 11132-2. FIG. 14J illustrates an alternating
sequence of square and sawtooth waves with approximately the same
period and amplitudes. In some embodiments, the period (e.g., from
peak to peak or leading edge to leading edge of outputs with the
same movement profile) of the regular pattern is 2 seconds, so that
the time between two successive outputs is 1 second or
approximately 1 second (e.g., the time between a tick and a tock is
1 second and the time between a tock and a tick is 1 second).
[0378] In some embodiments, the tactile feedback 11110 includes
regular patterns of tactile outputs on touch-sensitive surface 451
other than the ones shown in FIGS. 14F-14J. For example, the
regular pattern can be a sequence of component waveforms of length
L (where L is an integer greater than 0) that is repeatedly
generated. In some embodiments, at least one
component in the sequence of component waveforms is distinct from
at least one of the other components in at least one respect (e.g.,
amplitude, period and/or shape). In some embodiments, each
component waveform in the sequence of component waveforms is
distinct from the other components in at least one respect (e.g.,
amplitude, period and/or shape), while in other embodiments, some
components ("repeated components") in the sequence are the same
(e.g., every Nth component in the sequence of L components is the
same), while other components are different from the repeated
components. In these embodiments, the period of the regular pattern
is the time taken to generate the sequence of component waveforms
(i.e., from the start time of a first instance of the regular
pattern until the start time of a next instance of the regular
pattern).
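A hedged sketch of such a pattern: the regular pattern repeats a
sequence of L components, so the component used for the i-th output
is simply the sequence element at index i mod L (the names below
are hypothetical).

    // Hypothetical sketch: a regular pattern built by repeatedly
    // generating a sequence of L component waveforms.
    struct RegularPattern<Component> {
        let components: [Component]  // the sequence of length L

        func component(forOutput i: Int) -> Component {
            components[i % components.count]
        }
    }

    // L = 3, with a repeated "low" component (every component need
    // not be distinct).
    let pattern = RegularPattern(components: ["high", "low", "low"])
    print((0..<6).map(pattern.component(forOutput:)))
    // ["high", "low", "low", "high", "low", "low"]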
[0379] FIGS. 15A-15B are flow diagrams illustrating a method 11200
of providing tactile feedback corresponding to a clock in
accordance with some embodiments. Method 11200 is performed at an
electronic device (e.g., device 300, FIG. 3, or portable
multifunction device 100, FIG. 1A) with a display and a
touch-sensitive surface. In some embodiments, the display is a
touch screen display and the touch-sensitive surface is on the
display. In some embodiments, the display is separate from the
touch-sensitive surface. Some operations in method 11200 are,
optionally, combined and/or the order of some operations is,
optionally, changed.
[0380] As described below, the method 11200 provides an intuitive
way to provide tactile feedback corresponding to a clock. The
method reduces the cognitive burden on a user when displaying a
clock, thereby creating a more efficient human-machine interface.
For battery-operated electronic devices, enabling a user to
interact with a clock faster and more efficiently conserves power
and increases the time between battery charges.
[0381] The device displays (11202) a representation of a clock.
FIG. 14A, for example, shows representation 11102 of a clock,
displayed in graphical user interface 11100. In some embodiments,
prior to detecting the focus selector over the representation of
the clock, the device displays (11204) the representation of the
clock without providing the tactile feedback that corresponds to
the clock on the touch-sensitive surface (e.g., a tick-tock output
corresponding to the clock is not generated prior to the focus
selector moving over the clock). In FIG. 14A, for example, cursor
11104 is at position 11104-a and is not over representation 11102
of the clock and therefore tactile feedback 11110 is not generated
by the device.
[0382] While displaying the representation of the clock, the device
detects (11206) movement of a focus selector over the
representation of the clock. As shown in FIG. 14B, for example,
cursor 11104 moves to position 11104-b over representation 11102 of
the clock from a position 11104-a that was not over the
representation of the clock. While detecting the focus selector
over the representation of the clock, the device provides (11208)
tactile feedback that corresponds to the clock, where the tactile
feedback includes a regular pattern of tactile outputs on the
touch-sensitive surface. For example, FIG. 14C shows cursor 11104
at position 11104-b over representation 11102 of the clock and
tactile feedback 11110 provided on touch-sensitive surface 451,
where the tactile feedback includes a regular pattern of tactile
outputs, such as those described above with reference to FIGS.
14F-14J.
[0383] In some embodiments, the tactile outputs in the regular
pattern of tactile outputs are generated at evenly spaced intervals
(11210). For example, in some embodiments, the regular pattern of
tactile outputs will have a period of one second. In some other
embodiments, the regular pattern of tactile outputs will have a
period of 0.5 seconds, 2 seconds, or another length of time between
0.25 seconds and ten seconds.
[0384] In some embodiments, the regular pattern of tactile outputs
on the touch-sensitive surface includes one of the regular patterns
described above with reference to FIGS. 14F-14J. For example, in
some embodiments, the regular pattern of tactile outputs on the
touch-sensitive surface includes an alternating sequence of tactile
outputs that have different output characteristics (11212). For
example, in some embodiments, the pattern of tactile outputs on the
touch-sensitive surface will generate a tick-tock sensation where
"tick" tactile outputs and "tock" tactile outputs are selected so
as to produce "tick" sensations that correspond to "tick" tactile
outputs feel different to a user than "tock" sensations that
correspond to "tock" tactile outputs. FIGS. 14I-14J, for example,
illustrate example waveforms of movement profiles that include an
alternating sequence of outputs.
[0385] In some embodiments, the alternating sequence of tactile
outputs includes a first type of tactile output alternating with a
second type of tactile output (11214) with a different amplitude.
For example, in some embodiments, various combinations of the
waveforms of movement profiles illustrated in FIGS. 14F-14H would
be utilized. In some embodiments, the first type of tactile output
is generated by movement of the touch-sensitive surface that
includes a first dominant movement component (e.g., movement
corresponding to an initial impulse of the first tactile output,
ignoring any unintended resonance). In some embodiments, the second
type of tactile output is generated by movement of the
touch-sensitive surface that includes a second dominant movement
component (e.g., movement corresponding to an initial impulse of
the second tactile output, ignoring any unintended resonance). In
some embodiments, the first dominant movement component and the
second dominant movement component have a same movement profile
(e.g., same waveform shape such as square, sine, squine, sawtooth
or triangle; and/or approximately the same width/period) and
different amplitudes. FIG. 14I, for example, illustrates an example
waveform of movement profiles with alternating square waves having
different amplitudes 11132-1 and 11132-2.
[0386] In some embodiments, the alternating sequence of tactile
outputs includes a first type of tactile output alternating with a
second type of tactile output (11216) with a different movement
profile. For example, in some embodiments, various combinations of
the waveforms of movement profiles illustrated in FIGS. 14F-14H
would be utilized. In some embodiments, the first type of tactile
output is generated by movement of the touch-sensitive surface that
includes a first dominant movement component (e.g., movement
corresponding to an initial impulse of the first tactile output,
ignoring any unintended resonance). In some embodiments, the second
type of tactile output is generated by movement of the
touch-sensitive surface that includes a second dominant movement
component (e.g., movement corresponding to an initial impulse of
the second tactile output, ignoring any unintended resonance). In
some embodiments, the first dominant movement component and the
second dominant movement component have different movement profiles
(e.g., different waveform shapes such as square, sine, squine,
sawtooth or triangle; and/or different width/period) and a same
amplitude. FIG. 14J, for example, illustrates an example waveform
of movement profiles with alternating square waves and sawtooth
waves with the same amplitude.
[0387] In some embodiments, while detecting (11218) the focus
selector over the representation of the clock, the device detects
(11220) movement of the focus selector that maintains the focus
selector over the representation of the clock and in response to
detecting movement of the focus selector, the device continues to
provide (11222) the tactile feedback corresponding to the clock
without changing a period of the regular pattern. In some
embodiments, the period of the regular pattern of tactile outputs
is not changed based on movement of the focus selector while over
the representation of the clock. For instance, in these
embodiments, the period of tactile feedback 11110 would not change
when cursor 11104 moves from position 11104-b in FIG. 14C to
position 11104-c in FIG. 14D.
[0388] While providing the tactile feedback, the device detects
(11224) movement of the focus selector away from the representation
of the clock. FIG. 14E, for example, shows cursor 11104 moving to
position 11104-d away from representation 11102 of the clock. In
response to detecting movement of the focus selector away from the
representation of the clock, the device ceases (11226) to provide
the tactile feedback corresponding to the clock. For example, this
is illustrated in FIGS. 14D-14E. In FIG. 14D, cursor 11104 is at
position 11104-c over representation 11102 of the clock and the
device provides tactile feedback 11110. However, in FIG. 14E,
cursor 11104 moves to position 11104-d away from representation
11102 of the clock and the device ceases to generate tactile
feedback 11110.
[0389] It should be understood that the particular order in which
the operations in FIGS. 15A-15B have been described is merely
exemplary and is not intended to indicate that the described order
is the only order in which the operations could be performed. One
of ordinary skill in the art would recognize various ways to
reorder the operations described herein. Additionally, it should be
noted that details of other processes described herein with respect
to other methods described herein (e.g., those listed in paragraph
[0058]) are also applicable in an analogous manner to method 11200
described above with respect to FIGS. 15A-15B. For example, the
contacts, focus selectors, and tactile feedback (e.g., tactile
outputs) described above with reference to method 11200 optionally
have one or more of the characteristics of contacts, focus
selectors, and tactile feedback (e.g., tactile outputs) described
herein with reference to other methods described herein (e.g.,
those listed in paragraph [0058]). For brevity, these details are
not repeated here.
[0390] In accordance with some embodiments, FIG. 16 shows a
functional block diagram of an electronic device 11300 configured
in accordance with the principles of the various described
embodiments. The functional blocks of the device are, optionally,
implemented by hardware, software, or a combination of hardware and
software to carry out the principles of the various described
embodiments. It is understood by persons of skill in the art that
the functional blocks described in FIG. 16 are, optionally,
combined or separated into sub-blocks to implement the principles
of the various described embodiments. Therefore, the description
herein optionally supports any possible combination or separation
or further definition of the functional blocks described
herein.
[0391] As shown in FIG. 16, an electronic device 11300 includes a
display unit 11302 configured to display a representation of a
clock; a touch-sensitive surface unit 11304; and a processing unit
11306 coupled to the display unit 11302 and the touch-sensitive
surface unit 11304. In some embodiments, the processing unit
includes a detecting unit 11308, a display enabling unit 11310, a
feedback unit 11312, and a generating unit 11314.
[0392] The processing unit 11306 is configured to: detect movement
of a focus selector over the representation of the clock (e.g.,
with the detecting unit 11308); while detecting the focus selector
over the representation of the clock, provide tactile feedback
(e.g., with the feedback unit 11312) that corresponds to the clock,
where the tactile feedback includes a regular pattern of tactile
outputs on the touch-sensitive surface unit. The processing unit is
further configured to, while providing the tactile feedback, detect
movement of the focus selector away from the representation of the
clock (e.g., with the detecting unit 11308), and in response to
detecting movement of the focus selector away from the
representation of the clock, cease to provide the tactile feedback
(e.g., with the feedback unit 11312) corresponding to the
clock.
[0393] In some embodiments, the tactile outputs in the regular
pattern of tactile outputs are generated at evenly spaced intervals
(e.g., with the generating unit 11314).
[0394] In some embodiments, the processing unit 11306 is further
configured to, while detecting the focus selector over the
representation of the clock (e.g., with detecting unit 11308),
detect movement of the focus selector that maintains the focus
selector over the representation of the clock (e.g., with the
detecting unit 11308), and in response to detecting movement of the
focus selector, continue to provide the tactile feedback
corresponding to the clock (e.g., with feedback unit 11312) without
changing a period of the regular pattern.
[0395] In some embodiments, the processing unit 11306 is further
configured to, prior to detecting the focus selector over the
representation of the clock, display the representation of the
clock on the display unit (e.g., with the display enabling unit
11310) without providing the tactile feedback that corresponds to
the clock on the touch-sensitive surface unit.
[0396] In some embodiments, the regular pattern of tactile outputs
on the touch-sensitive surface unit includes an alternating
sequence of tactile outputs that have different output
characteristics.
[0397] In some embodiments, the alternating sequence of tactile
outputs includes a first type of tactile output alternating with a
second type of tactile output, the first type of tactile output is
generated by movement of the touch-sensitive surface unit (e.g.,
with the generating unit 11314) that includes a first dominant
movement component, the second type of tactile output is generated
by movement of the touch-sensitive surface unit (e.g., with the
generating unit 11314) that includes a second dominant movement
component, and the first dominant movement component and the second
dominant movement component have a same movement profile and
different amplitudes.
[0398] In some embodiments, the alternating sequence of tactile
outputs includes a first type of tactile output alternating with a
second type of tactile output, the first type of tactile output is
generated by movement of the touch-sensitive surface unit (e.g.,
with the generating unit 11314) that includes a first dominant
movement component, the second type of tactile output is generated
by movement of the touch-sensitive surface unit (e.g., with the
generating unit 11314) that includes a second dominant movement
component, and the first dominant movement component and the second
dominant movement component have different movement profiles and a
same amplitude.
[0399] The operations in the information processing methods
described above are, optionally, implemented by running one or more
functional modules in information processing apparatus such as
general purpose processors (e.g., as described above with respect
to FIGS. 1A and 3) or application specific chips.
[0400] The operations described above with reference to FIGS.
15A-15B are, optionally, implemented by components depicted in
FIGS. 1A-1B or FIG. 16. For example, detection operations 11206,
11220 and 11224 and tactile feedback operation 11208 are,
optionally, implemented by event sorter 170, event recognizer 180,
and event handler 190. Event monitor 171 in event sorter 170
detects a contact on touch-sensitive display 112, and event
dispatcher module 174 delivers the event information to application
136-1. A respective event recognizer 180 of application 136-1
compares the event information to respective event definitions 186,
and determines whether a first contact at a first location on the
touch-sensitive surface corresponds to a predefined event or
sub-event, such as selection of an object on a user interface. When
a respective predefined event or sub-event is detected, event
recognizer 180 activates an event handler 190 associated with the
detection of the event or sub-event. Event handler 190 optionally
utilizes or calls data updater 176 or object updater 177 to update
the application internal state 192. In some embodiments, event
handler 190 accesses a respective GUI updater 178 to update what is
displayed by the application. Similarly, it would be clear to a
person having ordinary skill in the art how other processes can be
implemented based on the components depicted in FIGS. 1A-1B.
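The dispatch flow just described (event monitor, event dispatcher, event recognizer, event handler) can be caricatured as follows. These Swift types are illustrative stand-ins only, not the event sorter 170, event recognizer 180, or event handler 190 of FIGS. 1A-1B.

```swift
// Illustrative stand-ins only; not the framework components cited above.
struct TouchEvent { let x: Double; let y: Double }

struct Recognizer {
    let definition: (TouchEvent) -> Bool   // compared against the event
    let handler: (TouchEvent) -> Void      // activated when the definition matches
}

struct Sorter {
    var recognizers: [Recognizer]

    // Deliver the event to the first recognizer whose definition it satisfies.
    func dispatch(_ event: TouchEvent) {
        for recognizer in recognizers where recognizer.definition(event) {
            recognizer.handler(event)      // e.g., update state, then the GUI
            return
        }
    }
}

let sorter = Sorter(recognizers: [
    Recognizer(definition: { $0.x > 100 },  // stand-in for "over the clock"
               handler: { _ in print("provide tactile feedback") })
])
sorter.dispatch(TouchEvent(x: 120, y: 40))  // prints the feedback message
```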
Providing Tactile Feedback Corresponding to Beats of a Piece of
Music
[0401] Many electronic devices have graphical user interfaces that
display application windows showing representations of a piece of
music (e.g., a graphical representation of a piece of cover art for
an album of the piece of music, a region indicating that a piece of
music is currently being played, or notes of a piece of music
in a graphical representation of a music score corresponding to a
piece of music). For example, a media player application window
(e.g., an audio or video player) optionally includes a display
region that provides information on a selected piece of music
(e.g., the name, composer, artist, associated album, recording
date, publisher and/or length of the piece of music). Likewise, a
composing application window optionally displays an interactive
representation of the musical score of a piece of music being
composed, allowing the user to manipulate the piece of music by
adding, removing or changing notes displayed in the score. Given
the complexity of a user interface environment that includes
application windows corresponding to applications having both audio
and visual components (e.g., music playback, music composition,
video playback or video composition applications), there is a need
to provide feedback that enables the user to more efficiently and
conveniently navigate through the user interface environment.
[0402] The embodiments described below provide improved methods and
user interfaces for generating feedback to a user navigating a
complex user interface environment. More specifically, these
methods and user interfaces provide feedback that corresponds to
beats of a piece of music represented on a display. The tactile
feedback provides the user with a sense of the beat of the piece of
music. In this fashion, the methods and user interfaces provided
below allow the user to more efficiently and conveniently achieve
an understanding of the beat of the music, as well as a greater
understanding of the piece of music as a whole, by providing
tactile feedback, instead of or in addition to audible and/or
visual feedback. For example, in some embodiments, a user searching
for a piece of music with a specific beat (e.g., to accompany a
visual display or to listen to while running) moves a focus
selector over a representation of the piece of music on the
display, and receives tactile feedback corresponding to the beat of
the music, without having to listen to the music. Likewise, in some
embodiments, a user composing a piece of music moves a focus
selector over a representation of the musical piece they are
composing and receives tactile feedback corresponding to the beat
of the music, expediting the composition process.
[0403] Some methods for sensing the beat of a piece of music rely
on the user listening to the piece of music and picking up the beat
themselves. Other methods for sensing the beat of a piece of music
rely on the user detecting a visual cue (e.g., a flash or pulse on
a display) corresponding to the beat of the music. However, there
are many situations (e.g., at work, in a theatre and in various
social situations) where the volume of an electronic device will be
lowered or muted, rendering audible cues ineffective.
Advantageously, the methods and user interfaces described below
augment or replace audible cues by providing tactile feedback that
corresponds to beats of the piece of music.
[0404] FIGS. 17A-17L illustrate exemplary user interfaces for
providing feedback that corresponds to beats of a piece of music in
accordance with some embodiments. The user interfaces in these
figures are used to illustrate the processes described below,
including the processes in FIGS. 18A-18B. FIGS. 17A-17B, 17G-17I
and 17K-17L include intensity diagrams that show the current
intensity of the contact on the touch-sensitive surface relative to
a plurality of intensity thresholds including a contact detection
intensity threshold (e.g., "IT.sub.0") and a light press intensity
threshold (e.g., "IT.sub.L"). In some embodiments, operations
similar to those described below with reference to IT.sub.L are
performed with reference to a different intensity threshold (e.g.,
"IT.sub.D"). In some embodiments, the operations described below
are not dependent on an intensity of the contact. FIGS. 17C,
17E-17G and 17J include musical scores and waveform diagrams that
show the amplitude (e.g., a high amplitude "A.sub.H" or low
amplitude "A.sub.L") and shape (e.g., square or sawtooth) of the
waveform corresponding to tactile output generated on the
touch-sensitive surface in response to a tactile output generating
event (e.g., selection or playback of a beat in a piece of music).
These musical scores and waveform diagrams are typically not part
of the displayed user interface, but are provided to aid in the
interpretation of the figures.
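For readers unfamiliar with the threshold notation, the comparison can be sketched as a simple classification. The numeric values below are invented for illustration; the description above names the thresholds but does not quantify them.

```swift
// Invented threshold values; IT0, ITL and ITD are only named, not
// quantified, in the description above.
enum PressState { case none, detected, lightPress, deepPress }

func classify(intensity: Double,
              it0: Double = 0.05,    // contact detection intensity threshold
              itL: Double = 0.30,    // light press intensity threshold
              itD: Double = 0.70)    // deep press intensity threshold
    -> PressState {
    switch intensity {
    case ..<it0: return .none
    case ..<itL: return .detected
    case ..<itD: return .lightPress
    default:     return .deepPress
    }
}

let state = classify(intensity: 0.4)  // .lightPress under these example values
```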
[0405] FIG. 17A illustrates exemplary user interface 11408
displaying one or more user interface objects, for example, user
interface 11408 displays media player window 11402 that includes
representations of a piece of music (e.g., graphical
representations 11406-1, 11406-2, and 11406-3 of a piece of cover
art for a musical album) and cursor 11404 (e.g., a displayed
representation of a focus selector). In FIG. 17A, user interface
11408 is displayed on display 450 of an electronic device that also
includes touch-sensitive surface 451 and one or more sensors for
detecting intensity of contacts with the touch-sensitive surface. In
some embodiments, touch-sensitive surface 451 is a touch screen
display that is optionally display 450 or a separate display.
[0406] In some embodiments, the device is an electronic device with
a separate display (e.g., display 450) and a separate
touch-sensitive surface (e.g., touch-sensitive surface 451). In
some embodiments, the device is portable multifunction device 100,
the display is touch-sensitive display system 112, and the
touch-sensitive surface includes tactile output generators 167 on
the display (FIG. 1A). For convenience of explanation, the
embodiments described with reference to FIGS. 17A-17O and 18A-18B
will be discussed with reference to display 450 and a separate
touch-sensitive surface 451; however, analogous operations are,
optionally, performed on a device with a touch-sensitive display
system 112 in response to detecting movement of the contacts
described in FIGS. 17A-17O on the touch-sensitive display system
112 while displaying the user interfaces shown in FIGS. 17A-17O on
the touch-sensitive display system 112; in such embodiments, the
focus selector is, optionally: a respective contact, a
representative point corresponding to a contact (e.g., a centroid
of a respective contact or a point associated with a respective
contact), or a centroid of two or more contacts detected on the
touch-sensitive display system 112, in place of cursor 11404.
[0407] FIGS. 17A-17G illustrate various embodiments where user
interface 11408 displays representations 11406 of a piece of music
on display 450. User interface 11408 also displays cursor 11404,
controlled by contact 11410 on touch-sensitive surface 451 and
movement 11412 thereof. In some embodiments, cursor 11404 moves
over representation 11406 of a piece of music, and in response,
tactile generators 167 provide tactile feedback (e.g., tactile
outputs 11414) that corresponds to at least a subset of beats in
the piece of music (e.g., beats 11418). In some embodiments, after
the tactile feedback has been provided, cursor 11404 moves away
from representation 11406 of a piece of music, and in response,
tactile generators 167 cease to provide tactile feedback
corresponding to beats in the piece of music.
[0408] FIGS. 17A-17G illustrate that contact 11410, corresponding
to cursor 11404 displayed on display 450, and a gesture including
movement 11412 of contact 11410 (e.g., movement 11412-a of contact
11410 from location 11410-a in FIG. 17A to location 11410-b in
FIGS. 17B-17F and/or movement 11412-b of contact 11410 from
location 11410-b in FIGS. 17B-17F to location 11410-c in FIG. 17G)
are detected on touch-sensitive surface 451. Contact 11410 is
detected at a position on touch-sensitive surface 451 corresponding
to an area on display 450 occupied by focus selector 11404 (e.g.,
contact 11410 corresponds to a focus selector on the display, such
as cursor 11404 which is at or near a location of user interface
object 11402). In some embodiments, movement of contact 11410 on
touch-sensitive surface 451 corresponds to movement of focus
selector (e.g., a cursor 11404) on display 450 (e.g., as
illustrated in FIGS. 17A-17G).
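Where the display and the touch-sensitive surface are separate, the correspondence between contact movement and cursor movement can be as simple as a proportional mapping. The following Swift sketch uses invented dimensions and is not the described device's actual mapping.

```swift
// Hypothetical proportional mapping from a contact on touch-sensitive
// surface 451 to a cursor position on display 450; dimensions are invented.
struct Point { var x: Double; var y: Double }

func cursorPosition(forContact p: Point,
                    surface: (width: Double, height: Double),
                    display: (width: Double, height: Double)) -> Point {
    Point(x: p.x / surface.width * display.width,
          y: p.y / surface.height * display.height)
}

let cursor = cursorPosition(forContact: Point(x: 50, y: 20),
                            surface: (width: 100, height: 80),
                            display: (width: 1440, height: 900))
// cursor is (720, 225): halfway across and a quarter of the way down
```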
[0409] FIGS. 17A-17F illustrate various examples of a beginning of
a gesture where cursor 11404 moves over representation 11406 of a
piece of music, in accordance with movement 11412-a of contact
11410 on touch-sensitive surface 451, corresponding to cursor 11404
on display 450. In FIGS. 17B-17F, while focus selector 11404
remains over representation 11406 of a piece of music, the device
generates tactile outputs 11414 (e.g., via tactile output
generators 167), corresponding to a subset of beats (e.g., beats
11418 shown in FIGS. 17C-17F) of the piece of music.
[0410] In some embodiments, as illustrated in FIGS. 17A-17G, the
piece of music is being played in a media player application (e.g.,
illustrated as media player application window 11402). In some
embodiments, as illustrated in FIGS. 17A-17G, the representation of
the piece of music is a graphical representation of the piece of
music (e.g., images 11406 of cover art corresponding to an album of
the piece of music).
[0411] FIGS. 17C-17F illustrate various embodiments where
representation 11406 of the piece of music is displayed in media
player application window 11402 and tactile feedback (e.g., tactile
outputs 11414-1, 11414-3, 11414-5, and 11414-7 in FIGS. 17C-17F)
is generated when corresponding beats (e.g., beats 11418-1,
11418-3, 11418-5, and 11418-7 in FIGS. 17C-17F) in a subset of
beats (e.g., a subset of all of the beats in piece of music 11416)
are played by the media player application. FIGS. 17C-17F also
illustrate various embodiments, where the subset of beats (e.g.,
those beats that correspond to tactile feedback provided to the
user) includes stressed beats on every other beat, including the
first (e.g., beat 11418-1), third (e.g., beat 11418-3), fifth
(e.g., beat 11418-5) and seventh (e.g., beat 11418-7) beats of the
piece of music.
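A brief Swift sketch of this subset selection, covering both this paragraph and the contrast drawn in the next, follows. The beat representation is invented; beats are 1-based, with stressed beats on every other beat starting at the first.

```swift
// Invented representation: 1-based beats, stressed on every other beat
// starting with the first, matching the description of FIGS. 17C-17F.
struct Beat {
    let index: Int
    var isStressed: Bool { index % 2 == 1 }
}

let beats = (1...8).map(Beat.init)

// FIG. 17C-style subset: unstressed beats are excluded from feedback.
let stressedOnly = beats.filter { $0.isStressed }  // beats 1, 3, 5, 7

// FIGS. 17D-17F-style subset: stressed and unstressed beats both get outputs.
let allBeats = beats                               // beats 1 through 8
```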
[0412] FIG. 17C illustrates an embodiment where the subset of beats
excludes unstressed beats, including the second (e.g., beat
11418-2), fourth (e.g., beat 11418-4), sixth (e.g., beat 11418-6)
and eighth (e.g., beat 11418-8) beats of the piece of music (e.g.,
piece of music 11416). In contrast, FIGS. 17D-17F, described below,
illustrate various embodiments, where the subset of beats includes
both stressed beats (e.g., every odd beat 11418) and unstressed
beats (e.g., every even beat 11418) of the piece of music. FIG. 17D
illustrates an embodiment, where the tactile outputs 11414 are
substantially the same, regardless of whether they correspond to a
stressed beat or an unstressed beat. In contrast, FIGS. 17E-17F,
described below, illustrate various embodiments, where first
tactile outputs 11414 corresponding to stressed beats (e.g., odd
numbered beats 11418) are substantially different from second
tactile outputs 11414 corresponding to unstressed beats (e.g., even
numbered beats 11418).
[0413] For example, FIG. 17E illustrates an embodiment where first
tactile outputs 11414 corresponding to stressed beats 11418 have a
same or substantially same amplitude (e.g., high amplitude
"A.sub.H") but a substantially different movement profile (e.g.,
square waveform shape 11436 as compared to sawtooth waveform shape
11434) from second tactile outputs 11414 corresponding to unstressed
beats 11418 of the piece of music. In contrast, FIG. 17F
illustrates an embodiment, where first tactile outputs 11414
corresponding to stressed beats 11418 have a substantially
different amplitude (e.g., high amplitude "A.sub.H" as compared to
low amplitude "A.sub.L") but substantially a same movement profile
(e.g., square waveform shape 11434) as second tactile outputs 11414
corresponding to unstressed beats 11418 of the piece of music.
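The two contrasts in this paragraph, varying the movement profile at a shared amplitude (FIG. 17E) versus varying the amplitude at a shared profile (FIG. 17F), might be selected as in the following sketch. The amplitudes stand in for A.sub.H and A.sub.L and are invented values.

```swift
// Minimal re-declaration for self-containment; see the earlier waveform sketch.
enum MovementProfile { case square, sawtooth }

struct TactileOutput { let amplitude: Double; let profile: MovementProfile }

let aH = 1.0   // stands in for high amplitude "A.sub.H" (invented value)
let aL = 0.4   // stands in for low amplitude "A.sub.L" (invented value)

func output(forStressedBeat stressed: Bool, varyProfile: Bool) -> TactileOutput {
    if varyProfile {
        // FIG. 17E: same high amplitude, different movement profiles.
        return TactileOutput(amplitude: aH, profile: stressed ? .square : .sawtooth)
    } else {
        // FIG. 17F: same square movement profile, different amplitudes.
        return TactileOutput(amplitude: stressed ? aH : aL, profile: .square)
    }
}
```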
[0414] FIGS. 17B-17G illustrate various embodiments where after
providing tactile feedback, as illustrated in FIGS. 17B-17F, the
device detects movement 11412-b of contact 11410 on touch sensitive
surface 451 that corresponds to movement of cursor 11404 away from
representation 11406 of the piece of music. As illustrated in FIG.
17G, in response to the movement of cursor 11404 away from
representation 11406 of the piece of music, tactile generators 167
cease providing tactile outputs 11414, corresponding to beats of
the piece of music.
[0415] FIGS. 17H-17L illustrate various embodiments where user
interface 11408 displays music composition application window
11422, which includes representation 11424 of a musical score, on
display 450. User interface 11408 also displays cursor 11404,
controlled by contact 11426 on touch-sensitive surface 451 and
movement 11428 thereof. In some embodiments, cursor 11404 moves
over representation 11424 of a musical score, and in response,
tactile generators 167 provide tactile feedback (e.g., tactile
outputs 11414) that corresponds to at least a subset of beats in
the piece of music (e.g., beats 11418). In some embodiments, after
the tactile feedback has been provided, cursor 11404 moves away
from representation 11424 of the musical score, and in response,
tactile generators 167 cease to provide tactile feedback
corresponding to beats in the piece of music.
[0416] In some embodiments, as illustrated in FIGS. 17I-17J,
movement of cursor 11404 over representation 11424 of a musical
score, in accordance with movement 11428-a of contact 11426 on
touch-sensitive surface 451, while the piece of music (e.g., a
composition) is being played back by the composition application,
results in the generation of tactile outputs 11414 corresponding to
beats 11418 of the piece of music when the beats are played by the
composition application. Although FIG. 17J illustrates that first tactile
outputs 11414 corresponding to stressed beats 11418 (e.g., the odd
numbered tactile outputs 11414 and beats 11418, respectively) and
second tactile outputs 11414 corresponding to unstressed beats
11418 (e.g., the even numbered tactile outputs 11414 and beats
11418, respectively) feel substantially different (e.g., have a
substantially different amplitude, but a same or substantially same
square waveform movement profile), in some embodiments, the first
tactile outputs and second tactile outputs are generated with
substantially the same amplitude and movement profile and thus will
feel substantially the same to a user. In some embodiments, the
second tactile outputs corresponding to unstressed beats are
excluded (e.g., are not generated by the device).
[0417] In some embodiments, as illustrated in FIGS. 17K-17L,
movement of cursor 11404 over a representation 11430-3 of
corresponding beat 11418-3 in musical score 11424, in accordance
with movement 11428-c of contact 11426 on touch-sensitive surface
451, results in the generation of tactile feedback (e.g., tactile
output 11414-3). For example, FIG. 17K illustrates an embodiment
where cursor 11404, in accordance with movement 11428-b of contact
11426 from position 11426-1 to position 11426-3 on touch-sensitive
surface 451, moves over representation 11424 of a musical score on
display 450 and tactile feedback is not generated because the
cursor is displayed at a position not corresponding to a
representation of a beat in the piece of music. In contrast, as
illustrated in FIG. 17L, in accordance with movement 11428-c of
contact 11426 from position 11426-3 to position 11426-d on
touch-sensitive surface 451, cursor 11404 moves over representation
11430-3 of beat 11418-3 in representation 11424 of the musical
score and tactile feedback (e.g., tactile output 11414-3) is
generated on touch-sensitive surface 451, regardless of whether or
not music corresponding to representation 11424 of the score of the
piece of music is concurrently being played by the music
composition application.
[0418] FIGS. 17M-17O illustrate example waveforms of movement
profiles for generating these tactile outputs. FIG. 17M illustrates
a sawtooth waveform 11434. FIG. 17N illustrates a square waveform
11436 and FIG. 17O illustrates a square waveform 11438 that has a
lower amplitude than square waveform 11436 of FIG. 17N. Sawtooth
waveform 11434 has a different movement profile from square
waveforms 11436 and 11438 and substantially the same amplitude as
square waveform 11436. Square waveform 11436 has substantially the
same movement profile as, and a substantially different amplitude
from, square waveform 11438.
[0419] FIGS. 18A-18B are flow diagrams illustrating a method 11500
of providing feedback that corresponds to beats of a piece of music
in accordance with some embodiments. The method 11500 is performed
at an electronic device (e.g., device 300, FIG. 3, or portable
multifunction device 100, FIG. 1A) with a display and a
touch-sensitive surface. In some embodiments, the display is a
touch screen display and the touch-sensitive surface is on the
display. In some embodiments, the display is separate from the
touch-sensitive surface. Some operations in method 11500 are,
optionally, combined and/or the order of some operations is,
optionally, changed.
[0420] As described below, the method 11500 provides an intuitive
way to provide feedback that corresponds to beats of a piece of
music. The method reduces the cognitive burden on a user when
detecting feedback that corresponds to beats of a piece of music,
thereby creating a more efficient human-machine interface. For
battery-operated electronic devices, enabling a user to detect
feedback that corresponds to beats of a piece of music faster and
more efficiently conserves power and increases the time between
battery charges.
[0421] In some embodiments, the device displays (11502) a
representation of a piece of music (e.g., representations 11406 of
a piece of cover art corresponding to a piece of music in FIGS.
17A-17G or representation 11424 of a musical score corresponding to
a piece of music in FIGS. 17H-17L) on a display (e.g., display 450
in FIGS. 17A-17L). In some embodiments, the piece of music (e.g.,
music 11416 in FIGS. 17C-17F and 17J) is played (11504) in a media
player application (e.g., media player application window 11402 in
FIGS. 17A-17G), and the representation of the piece of music is a
graphical representation of the piece of music (e.g.,
representations 11406 of a piece of cover art for an album of the
piece of music in FIGS. 17A-17G). In some embodiments, the
representation of the piece of music is a piece of cover art for an
album of the piece of music (e.g., representations 11406). In some
embodiments, the representation of the piece of music is a "now
playing" region of the media player application (e.g., region 11420
of media player application window 11402) including a name and play
time of the piece of music.
[0422] In some embodiments, while the device displays the
representation of a piece of music, the device detects (11506)
movement of a focus selector (e.g., cursor 11404 in FIGS. 17A-17L)
over the representation of the piece of music.
[0423] In some embodiments, while detecting the focus selector over
the representation of the piece of music, the device provides
(11508) tactile feedback (e.g., tactile outputs 11414 in FIGS.
17B-17F, 17I-17J and 17L) that corresponds to at least a subset of
beats (e.g., beats 11418 in FIGS. 17C-17F and 17J) of the piece of
music (e.g., piece of music 11416 in FIGS. 17C-17F and 17J). In
some embodiments, the representation of the piece of music is a
representation of the musical notes (e.g., a musical score). In
some embodiments, the representation of the piece of music includes
visual media corresponding to the piece of music (e.g., an image, a
video, a text description of the piece of music or an album/video
including the piece of music, or an audio visualizer associated
with the piece of music). For example, the tactile feedback
corresponding to a piece of music is generated while a focus
selector (e.g., a displayed cursor or a contact) is over an album
cover for the piece of music, over a composer or artist image for
the piece of music, or over a currently playing video that includes
the piece of music as currently playing audio content (e.g., the
background music in a movie or the music in a music video).
[0424] In some embodiments, after providing the tactile feedback,
the device detects (11528) movement of the focus selector away from
the representation of the piece of music (e.g., movement of cursor
11404, corresponding to movement 11412-b of contact 11410 on
touch-sensitive surface 451, in FIG. 17G).
[0425] In some embodiments, in response to detecting movement of
the focus selector away from the representation of the piece of
music, the device ceases (11530) to provide the tactile feedback
(e.g., tactile outputs 11414) that corresponds to the beats of the
piece of music.
[0426] In some embodiments, the focus selector (e.g., cursor 11404
in FIGS. 17A-17L) is moved (11510) in accordance with movement
(e.g., movements 11412 in FIGS. 17A-17G or movements 11428 in FIGS.
17H-17L) of a contact (e.g., contact 11410 in FIGS. 17A-17G or
contact 11426 in FIGS. 17H-17L) on a touch-sensitive surface (e.g.,
touch-sensitive surface 451), and the tactile feedback is provided
by generating tactile outputs (e.g., tactile outputs 11414 in FIGS.
17B-17F, 17I-17J and 17L) on the touch-sensitive surface. In some
embodiments, the contact is the focus selector (e.g., when the
device has a touch screen, the focus selector is, optionally,
contact 11410). In some embodiments, the contact corresponds to a
cursor or selection box that is displayed on the display.
[0427] In some embodiments, the representation of the piece of
music (e.g., representations 11406 of a piece of cover art for an
album of the piece of music in FIGS. 17A-17G) is displayed (11512)
in a media player application (e.g., media player application
window 11402 in FIGS. 17A-17G), and the tactile feedback includes a
plurality of tactile outputs (e.g., tactile outputs 11414)
generated when corresponding beats (e.g., beats 11418) in the
subset of beats are played by the media player application (e.g.,
the touch-sensitive surface generates tactile outputs in time with
music being played in the media player application).
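One plausible way to keep tactile outputs in time with playback is to track the next unplayed beat against the playback clock. The following Swift sketch, with invented names, performs only that bookkeeping; emitting the actual tactile output is left abstract.

```swift
// Hypothetical bookkeeping for emitting outputs in time with played beats.
struct BeatFeedbackScheduler {
    let beatTimes: [Double]   // beat timestamps, in seconds from the start
    private var nextBeat = 0

    init(beatTimes: [Double]) { self.beatTimes = beatTimes }

    // Call as playback time advances; returns beats due since the last call,
    // each of which would trigger one tactile output.
    mutating func beatsDue(atPlaybackTime t: Double) -> [Double] {
        var fired: [Double] = []
        while nextBeat < beatTimes.count, beatTimes[nextBeat] <= t {
            fired.append(beatTimes[nextBeat])
            nextBeat += 1
        }
        return fired
    }
}

var scheduler = BeatFeedbackScheduler(beatTimes: [0.0, 0.5, 1.0, 1.5])
let due = scheduler.beatsDue(atPlaybackTime: 0.6)  // [0.0, 0.5]
```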
[0428] In some embodiments, the representation of the piece of
music is displayed (11514) as a musical score (e.g., representation
11424 of a score of the piece of music in FIGS. 17H-17L) in a music
composing application (e.g., music composition application window
11422 in FIGS. 17H-17L). For example, the piece of music is
represented by displaying representations of notes (e.g., black bar
11430-9, corresponding to the note played at beat 11418-9 of piece
of music 11416, in FIGS. 17H-17L) in a representation of a musical
score (e.g., representation 11424 of a musical score) corresponding
to the piece of music.
[0429] In some embodiments, while the representation of the piece
of music is displayed as a musical score in a music composing
application, the tactile feedback includes (11516) a plurality of
tactile outputs (e.g., tactile outputs 11414) generated when the
focus selector moves over representations of corresponding beats
(e.g., beat 11418-3 represented as vertical line 11432-3 or beat
representation 11430-3 in FIG. 17L) in the subset of beats in the
musical score. In some embodiments, the focus selector moves in
accordance with movement of a contact on the touch-sensitive
surface (e.g., movement 11428-c of contact 11426 on touch-sensitive
surface 451 in FIGS. 17K-17L), and thus the tactile outputs (e.g.,
tactile outputs 11414) are generated in accordance with movement of
the contact on the touch-sensitive surface.
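In the musical-score case, outputs are tied to the focus selector crossing beat representations rather than to playback time. A sketch with invented geometry:

```swift
// Invented geometry: each displayed beat mark spans a horizontal range.
struct BeatMark { let beatIndex: Int; let xRange: ClosedRange<Double> }

func beatMark(underCursorX x: Double, in marks: [BeatMark]) -> Int? {
    marks.first { $0.xRange.contains(x) }?.beatIndex
}

let marks = [BeatMark(beatIndex: 1, xRange: 10...14),
             BeatMark(beatIndex: 3, xRange: 30...34)]

var lastHit: Int? = nil
for x in [5.0, 12.0, 20.0, 32.0] {   // successive cursor x-positions
    let hit = beatMark(underCursorX: x, in: marks)
    if let beat = hit, beat != lastHit {
        print("tactile output for beat \(beat)")  // fires at x = 12 and x = 32
    }
    lastHit = hit
}
```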
[0430] In some embodiments, the subset of beats (e.g., beats 11418)
includes (11518) stressed beats in the piece of music (e.g., odd
numbered beats 11418 in piece of music 11416 in FIGS. 17C-17F and
17J). In some embodiments, a beat is the basic unit of time in
music (e.g., a quarter note in a piece of music having a 4/4 time
signature or an eighth note in a piece of music having a 6/8 time
signature), where a stressed beat is a stronger, louder or
otherwise more emphatic beat of a plurality of beats. Some typical
beat patterns include stressing every fourth beat (e.g., as
commonly done in music having a 4/8 time signature), stressing
every other beat (e.g., as commonly done in music having a 4/4 time
signature) or stressing every third beat (e.g., as commonly done in
music having a 3/4 or 6/8 time signature, such as a waltz). A beat
that is not stressed is sometimes referred to as an unstressed
beat. In some embodiments, a beat is a subunit of the basic unit of
time in music (e.g., an eighth note in music having a 4/4 or 3/4
time signature).
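The stress patterns named above reduce to a modulus on the 1-based beat index, as the following sketch (with invented names) shows:

```swift
// 1-based beat index; the stress falls on the first beat of each group of n.
func isStressed(beat: Int, groupOf n: Int) -> Bool {
    (beat - 1) % n == 0
}

// Stressing every other beat, as is common in 4/4: beats 1, 3, 5, 7.
let duple = (1...8).filter { isStressed(beat: $0, groupOf: 2) }

// Stressing every third beat, as in a 3/4 or 6/8 waltz: beats 1, 4, 7.
let triple = (1...9).filter { isStressed(beat: $0, groupOf: 3) }
```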
[0431] In some embodiments, the subset of beats excludes (11520)
unstressed beats of the piece of music. For example, as illustrated
in FIG. 17C, the subset of beats 11418 includes stressed beats
(e.g., odd numbered beats 11418) but excludes unstressed beats
(e.g., even numbered beats 11418), and tactile outputs 11414 are
only generated corresponding to the stressed beats.
[0432] In some embodiments, where tactile feedback corresponding to
at least a subset of beats of the piece of music is provided
(11508) while detecting the focus selector over the representation
of the piece of music, the subset of beats includes (11522) one
more stressed beats (e.g., odd numbered beats 11418) and one or
more unstressed beats (e.g., even numbered beats 11418), the
tactile feedback includes a plurality of tactile outputs (e.g.,
tactile outputs 11414) for corresponding beats in the subset of
beats, a first tactile output is generated for stressed beats, and
a second tactile output, different from the first tactile output,
is generated for non-stressed beats (e.g., even numbered tactile
outputs 11414 and odd numbered tactile outputs 11414 feel
substantially different to the user, as represented in FIGS.
17E-17F and 17J). In contrast, in some embodiments, a first tactile
output corresponding to a stressed beat and a second tactile output
corresponding to an unstressed beat are substantially the same
(e.g., odd numbered tactile outputs 11414 corresponding to odd
numbered stressed beats 11418 and even numbered tactile outputs
11414 corresponding to even numbered unstressed beats 11418 feel
substantially the same to the user, as represented in FIG. 17D). In
some embodiments, the first tactile output is more prominent (e.g.,
has a larger amplitude) than the second tactile output. In some
embodiments, the second tactile output is more prominent (e.g., has
a larger amplitude) than the first tactile output.
[0433] In some embodiments, the first tactile output is generated
(11524) by movement of the touch-sensitive surface that includes a
first dominant movement component, the second tactile output is
generated by movement of the touch-sensitive surface that includes
a second dominant movement component, and the first dominant
movement component and the second dominant movement component have
a same or substantially same amplitude (e.g., high amplitude
"A.sub.H" of all tactile outputs 11414 in FIG. 17E) and
substantially different movement profiles (e.g., square waveform
shape 11436 of odd numbered tactile outputs 11414 and sawtooth
waveform shape 11434 of even numbered tactile outputs 11414 in FIG.
17E). In some embodiments, movement of the touch-sensitive surface
corresponds to an initial impulse, ignoring any unintended
resonance. In some embodiments, the movement profiles differ in
their waveform shape (e.g., square, sine, squine, triangle or
sawtooth waveform shape), waveform pulse width and/or waveform
pulse period (e.g., frequency). For example, as illustrated in FIG.
17E, a "detent" that is generated on the touch-sensitive surface
corresponding to a stressed beat of the music has a square waveform
movement profile (e.g., square waveform 11436 of odd numbered
tactile outputs 11414 in FIG. 17E), whereas a "click" that is
generated on the touch-sensitive surface corresponding to an
unstressed beat of the music has a sawtooth waveform movement
profile (e.g., sawtooth waveform 11434 of even numbered tactile
outputs 11414 in FIG. 17E), or vice versa.
[0434] In some embodiments, the first tactile output is generated
(11526) by movement of the touch-sensitive surface that includes a
first dominant movement component, the second tactile output is
generated by movement of the touch-sensitive surface that includes
a second dominant movement component, and the first dominant
movement component and the second dominant movement component have
a same or substantially same movement profile (e.g., square
waveform 11436 of odd numbered tactile outputs 11414 and square
waveform 11438 of even numbered tactile outputs 11414 in FIGS. 17F
and 17J) and substantially different amplitudes (e.g., high
amplitude "A.sub.H" of odd numbered tactile outputs 11414 is
greater than low amplitude "A.sub.L" of even numbered tactile
outputs 11414 in FIGS. 17F and 17J). In some embodiments, movement
of the touch-sensitive surface corresponds to an initial impulse,
ignoring any unintended resonance. In some embodiments, the
movement profiles differ in their waveform shape (e.g., square,
sine, squine, triangle or sawtooth waveform shape), waveform pulse
width and/or waveform pulse period (e.g., frequency). For example,
as illustrated in FIGS. 17F and 17J, a "detent" that is generated
on the touch-sensitive surface corresponding to a stressed beat of
the music has a greater amplitude than a "detent" that is generated
on the touch-sensitive surface corresponding to an unstressed beat
of the music (e.g., high amplitude "A.sub.H" of odd numbered
tactile outputs 11414 in FIGS. 17F and 17J is greater than low
amplitude "A.sub.L" of even numbered tactile output 11414 in FIGS.
17F and 17J), or vice versa.
[0435] In some embodiments, after providing tactile feedback, the
device detects (11528) movement of the focus selector away from the
representation of the piece of music. For example, as illustrated
in FIG. 17G, the device detects movement 11412-b of contact 11410
from position 11410-b to position 11410-c on touch-sensitive
surface 451, corresponding to movement of cursor 11404 away from
representation 11406 of the piece of music. In some
embodiments, in response to detecting movement of the focus
selector away from the representation of the piece of music, the
device ceases (11530) to provide tactile feedback that corresponds
to the beats of the piece of music. For example, as illustrated in
FIG. 17G, when cursor 11404 moves away from representation 11406 of
a piece of music, tactile output generators 167 stop generating
tactile outputs 11414 on touch-sensitive surface 451 because the
cursor is no longer positioned over the representation of the piece
of music.
[0436] It should be understood that the particular order in which
the operations in FIGS. 18A-18B have been described is merely
exemplary and is not intended to indicate that the described order
is the only order in which the operations could be performed. One
of ordinary skill in the art would recognize various ways to
reorder the operations described herein. Additionally, it should be
noted that details of other processes described herein with respect
to other methods described herein (e.g., those listed in paragraph
[0058]) are also applicable in an analogous manner to method 11500
described above with respect to FIGS. 18A-18B. For example, the
contacts, gestures, user interface objects, tactile sensations and
focus selectors described above with reference to method 11500
optionally have one or more of the characteristics of the contacts,
gestures, user interface objects, tactile sensations and focus
selectors described herein with reference to other methods
described herein (e.g., those listed in paragraph [0058]). For
brevity, these details are not repeated here.
[0437] In accordance with some embodiments, FIG. 19 shows a
functional block diagram of an electronic device 11600 configured
in accordance with the principles of the various described
embodiments. The functional blocks of the device are, optionally,
implemented by hardware, software, or a combination of hardware and
software to carry out the principles of the various described
embodiments. It is understood by persons of skill in the art that
the functional blocks described in FIG. 19 are, optionally,
combined or separated into sub-blocks to implement the principles
of the various described embodiments. Therefore, the description
herein optionally supports any possible combination or separation
or further definition of the functional blocks described
herein.
[0438] As shown in FIG. 19, an electronic device 11600 includes a
display unit 11602 configured to display one or more user interface
objects, a touch-sensitive surface unit 11604 configured to receive
user contacts, optionally one or more sensor units 11606 configured
to detect intensity of contacts with the touch-sensitive surface
unit 11604; and a processing unit 11608 coupled to the display unit
11602, the touch-sensitive surface unit 11604 and optionally the
one or more sensor units 11606. In some embodiments, the processing
unit 11608 includes a display enabling unit 11610, a detecting unit
11612, a providing unit 11614, and a ceasing unit 11616.
[0439] In some embodiments, the processing unit 11608 is configured
to enable display (e.g., with the display enabling unit 11610) of a
representation of a piece of music on display unit 11602. In some
embodiments, the processing unit 11608 is configured to detect
movement of a focus selector over the representation of the piece
of music (e.g., with detecting unit 11612); and while detecting the
focus selector over the representation of the piece of music, the
processing unit 11608 is configured to provide tactile feedback
that corresponds to at least a subset of beats of the piece of
music (e.g., with providing unit 11614). In some embodiments, after
providing the tactile feedback, the processing unit 11608 is
configured to detect movement of the focus selector away from the
representation of the piece of music (e.g., with the detecting unit
11612); and in response to detecting movement of the focus selector
away from the representation of the piece of music, the processing
unit 11608 is configured to cease to provide the tactile feedback
that corresponds to the beats of the piece of music (e.g., with the
ceasing unit 11616).
[0440] In some embodiments, the processing unit 11608 is configured
to enable display of movement of the focus selector (e.g., with the
display enabling unit 11610) in accordance with movement of a
contact on touch-sensitive surface unit 11604, and the tactile
feedback is provided by generating tactile outputs on the
touch-sensitive surface unit 11604 (e.g., with the providing unit
11614).
[0441] In some embodiments, the piece of music is currently being
played in a media player application; and the representation of the
piece of music is a graphical representation of the piece of
music.
[0442] In some embodiments, the processing unit 11608 is configured
to display the representation of the piece of music in a media
player application (e.g., with the display enabling unit 11610),
and the tactile feedback includes a plurality of tactile outputs
generated when corresponding beats in the subset of beats are
played by the media player application.
[0443] In some embodiments, the processing unit 11608 is configured
to display the representation of the piece of music as a musical
score in a music composing application (e.g., with the display
enabling unit 11610).
[0444] In some embodiments, the tactile feedback includes a
plurality of tactile outputs generated when the focus selector
moves over representations of corresponding beats in the subset of
beats in the musical score.
[0445] In some embodiments, the subset of the beats includes
stressed beats of the piece of music.
[0446] In some embodiments, the subset of the beats excludes
unstressed beats of the piece of music.
[0447] In some embodiments, the subset of beats includes one or more
stressed beats and one or more unstressed beats, the tactile
feedback includes a plurality of tactile outputs for corresponding
beats in the subset of beats, the processing unit 11608 is
configured to generate a first tactile output for stressed beats
(e.g., with the providing unit 11614), and the processing unit
11608 is configured to generate a second tactile output, different
from the first tactile output, for non-stressed beats (e.g., with
the providing unit 11614).
[0448] In some embodiments, the first tactile output is generated
by movement of the touch-sensitive surface unit 11604 that includes
a first dominant movement component, the second tactile output is
generated by movement of the touch-sensitive surface unit 11604
that includes a second dominant movement component, and the first
dominant movement component and the second dominant movement
component have a same amplitude and different movement
profiles.
[0449] In some embodiments, the first tactile output is generated
by movement of the touch-sensitive surface unit 11604 that includes
a first dominant movement component, the second tactile output is
generated by movement of the touch-sensitive surface unit 11604
that includes a second dominant movement component, and the first
dominant movement component and the second dominant movement
component have a same movement profile and different
amplitudes.
[0450] The operations in the information processing methods
described above are, optionally, implemented by running one or more
functional modules in information processing apparatus such as
general purpose processors (e.g., as described above with respect
to FIGS. 1A and 3) or application specific chips.
[0451] The operations described above with reference to FIGS.
18A-18B are, optionally, implemented by components depicted in
FIGS. 1A-1B or FIG. 19. For example, detection operations 11506 and
11528 and providing operation 11508 are, optionally, implemented by
event sorter 170, event recognizer 180, and event handler 190.
Event monitor 171 in event sorter 170 detects a contact on
touch-sensitive display 112, and event dispatcher module 174
delivers the event information to application 136-1. A respective
event recognizer 180 of application 136-1 compares the event
information to respective event definitions 186, and determines
whether a first contact at a first location on the touch-sensitive
surface corresponds to a predefined event or sub-event, such as
selection of an object on a user interface or display of a focus
selector over a representation of a piece of music on a user
interface. When a respective predefined event or sub-event is
detected, event recognizer 180 activates an event handler 190
associated with the detection of the event or sub-event. Event
handler 190 optionally utilizes or calls data updater 176 or object
updater 177 to update the application internal state 192. In some
embodiments, event handler 190 accesses a respective GUI updater
178 to update what is displayed by the application. In some
embodiments, event handler 190 accesses a respective tactile output
generator 167 to generate a tactile output. Similarly, it would be
clear to a person having ordinary skill in the art how other
processes can be implemented based on the components depicted in
FIGS. 1A-1B.
[0452] It should be understood that the particular order in which
the operations have been described above is merely exemplary and is
not intended to indicate that the described order is the only order
in which the operations could be performed. One of ordinary skill
in the art would recognize various ways to reorder the operations
described herein. Additionally, it should be noted that the various
processes separately described herein (e.g., those listed in
paragraph [0058]) can be combined with each other in different
arrangements. For example, the contacts, user interface objects,
tactile sensations, intensity thresholds, and/or focus selectors
described above with reference to any one of the various processes
separately described herein (e.g., those listed in paragraph
[0058]) optionally have one or more of the characteristics of the
contacts, gestures, user interface objects, tactile sensations,
intensity thresholds, and focus selectors described herein with
reference to one or more of the other methods described herein
(e.g., those listed in paragraph [0058]). For brevity, all of the
various possible combinations are not specifically enumerated here,
but it should be understood that the claims described above may be
combined in any way that is not precluded by mutually exclusive
claim features.
[0453] The foregoing description, for purpose of explanation, has
been described with reference to specific embodiments. However, the
illustrative discussions above are not intended to be exhaustive or
to limit the various described embodiments to the precise forms
disclosed. Many modifications and variations are possible in view
of the above teachings. The embodiments were chosen and described
in order to best explain the principles of the various described
embodiments and their practical applications, to thereby enable
others skilled in the art to best utilize the various described
embodiments with various modifications as are suited to the
particular use contemplated.
* * * * *