U.S. patent application number 12/948675 was filed with the patent office on 2010-11-17 and published on 2012-04-05 as publication number 20120084736 for gesture controlled screen repositioning for one or more displays.
This patent application is currently assigned to FLEXTRONICS ID, LLC. Invention is credited to Sanjiv Sirpal.
Application Number: 12/948675
Publication Number: 20120084736
Document ID: /
Family ID: 45889332
Filed Date: 2010-11-17
United States Patent Application
Publication Number: 20120084736
Kind Code: A1
Inventor: Sirpal; Sanjiv
Publication Date: April 5, 2012

GESTURE CONTROLLED SCREEN REPOSITIONING FOR ONE OR MORE DISPLAYS
Abstract
Control of a computing device using gesture inputs. The
computing device may be a handheld computing device with a
plurality of displays. The displays may be capable of displaying a
graphical user interface (GUI). The plurality of displays may be
modified in response to receipt of a gesture input such that the
displays are changed from a first state to a second state. The
change of the displays from the first state to the second state may
include moving a GUI from a first display to a second display.
Additionally, a second GUI may be moved from the second display to
the first display. The gesture input may comprise multiple touches,
such as a pinch gesture.
Inventors: Sirpal; Sanjiv (Oakville, CA)
Assignee: FLEXTRONICS ID, LLC (Broomfield, CO)
Family ID: 45889332
Appl. No.: 12/948675
Filed: November 17, 2010
Related U.S. Patent Documents

Application Number | Filing Date | Patent Number
61/389,000 | Oct 1, 2010 |
61/389,117 | Oct 1, 2010 |
61/389,087 | Oct 1, 2010 |
Current U.S. Class: 715/863
Class at Publication: 715/863
International Class: G06F 3/033 20060101 G06F003/033
Current CPC Class: G06F 3/04886 20130101; G06F 3/017 20130101; G06F 3/0488 20130101; G06F 3/0482 20130101; G06F 1/1641 20130101; G06F 3/0412 20130101; G06F 3/04842 20130101; G06F 3/04845 20130101; G06F 3/0483 20130101; G06F 3/0416 20130101; G06F 3/0486 20130101; G06F 3/04847 20130101; G06F 1/1616 20130101; G06F 3/04883 20130101; G06F 3/0481 20130101; G06F 3/04817 20130101; G06F 3/1423 20130101; G06F 1/1647 20130101
Claims
1. A method for controlling a plurality of displays of a handheld
computing device, comprising: displaying a first screen in a first
display of the plurality of displays when in a first display state;
receiving a first gesture input at said handheld computing device
and a second gesture input at said handheld computing device,
wherein at least a portion of the first gesture input occurs
simultaneously with at least a portion of the second gesture input;
modifying the plurality of displays from the first display state to
a second display state in response to the receiving step such
that the first screen is displayed in a second display of the
plurality of displays when in the second display state.
2. The method as recited in claim 1, wherein the first gesture
input is a drag gesture in a first direction.
3. The method as recited in claim 2, wherein the second gesture
input is a drag gesture in a second direction, and wherein said
first direction and said second direction are opposite.
4. The method as recited in claim 1, wherein at least one of the
first gesture input and second gesture input is received at a touch
sensitive device.
5. The method as recited in claim 4, wherein said touch sensitive
device is an off display touch sensitive device.
6. The method as recited in claim 1, wherein a second screen is
displayed in the second display when in the first display state and
the first display when in the second display state.
7. The method as recited in claim 6, wherein the first screen is
associated with a first application executing on the handheld
computing device and the second screen is associated with a second
application executing on the handheld computing device.
8. The method as recited in claim 7, wherein at least one of the
first and second applications is a single screen application.
9. The method as recited in claim 7, wherein at least one of the
first and second applications is a multi screen application
executing in a single screen mode.
10. The method as recited in claim 1, wherein a first desktop
screen is displayed in the second display in the first display
state and a second desktop screen is displayed in the first display
in the second display state.
11. The method as recited in claim 1, wherein the first gesture
input is a touch input received at a first touch sensitive portion
of said handheld computing device and the second gesture input is a
touch input received at a second touch sensitive portion of said
handheld computing device.
12. The method as recited in claim 11, wherein said first touch
sensitive portion is associated with the first display to comprise
a first touch screen display, and wherein said second touch
sensitive portion is associated with the second display to comprise
a second touch screen display.
13. The method as recited in claim 11, wherein said first touch
sensitive portion is disposed apart from said first display, and
wherein said second touch sensitive portion is disposed apart from
said second display.
14. The method as recited in claim 1, wherein the plurality of
displays comprise separate portions of a single display, wherein
the first display corresponds with a first portion of the single
display and the second display corresponds with a second portion of
the single display.
15. A handheld computing device, comprising: a processor; a first
display in operative communication with the processor and operable
to display a first screen in a first display state; a second
display in operative communication with the processor; a first
gesture sensor in operative communication with the processor and
operable to receive a first gesture input; a second gesture sensor
in operative communication with the processor and operable to
receive a second gesture input; wherein the processor, upon receipt
of the first and second gesture inputs, changes the first and
second displays to a second display state such that the first
screen is displayed on the second display in the second display
state.
16. The device as recited in claim 15, wherein a second screen is
displayed on the second display in the first display state and the
second screen is displayed on the first display in the second display
state.
17. The device as recited in claim 15, wherein the first gesture
sensor is a first touch sensitive device, and the first display and
the first touch sensitive device comprise a first touch screen display,
and wherein the second gesture sensor is a second touch sensitive
device, and the second display and the second touch sensitive
device comprise a second touch screen display.
18. The device as recited in claim 15, wherein the handheld device
is a smart phone.
19. The device as recited in claim 18, wherein the first display
and second display are positionable with respect to each other
between an open and a closed position.
20. The device as recited in claim 19, wherein when in the open
position, the first display and the second display are both visible
from the vantage point of a user.
21. The device as recited in claim 20, wherein when in the closed
position only one of the first display and the second display is
visible from the vantage point of a user.
22. A method for controlling a plurality of displays of a handheld
computing device, comprising: displaying a first screen in a first
display and a second screen in a second display, when the plurality
of displays are in a first display state; receiving a first gesture
input at said handheld computing device and a second gesture input
at said handheld computing device, wherein at least a portion of
the first gesture input occurs simultaneously with at least a
portion of the second gesture input; modifying the plurality of
displays from the first state to a second display state in response
to the receiving step; and wherein the first screen is displayed in
the second display and the second screen is displayed in the first
display when the plurality of displays are in the second display
state.
Description
RELATED APPLICATIONS
[0001] This application claims priority to U.S. Provisional
Application Ser. No. 61/389,000, filed Oct. 1, 2010, entitled "DUAL
DISPLAY WINDOWING SYSTEM"; Provisional Application Ser. No.
61/389,117, filed Oct. 1, 2010, entitled "MULTI-OPERATING SYSTEM
PORTABLE DOCKETING DEVICE"; and Provisional Application Ser. No.
61/389,087, filed Oct. 1, 2010, entitled "TABLET COMPUTING USER
INTERFACE". Each and every part of the foregoing provisional
applications is hereby incorporated by reference in its
entirety.
BACKGROUND
[0002] As the computing and communication functions of handheld
computing devices become more powerful, the user interface and
display elements of such devices have evolved by attempting to
adapt user interface regimes developed for personal computers for
use with handheld computing devices. However, this attempt to adapt
prior user interface regimes has been met with various hurdles.
[0003] For instance, the majority of current handheld computing
devices make use of a physical keypad for user interface. Many
different implementations of physical keypads exist that vary in
orientation and relationship to the device screen. However, in
every case the physical keypads take up a certain percentage of the
physical space of the device and increase the weight of the device.
In addition to the disadvantages of size and weight, physical
keypads are not configurable in the same manner as a touch screen
based user interface. While certain limited forms of physical
keypads currently have, on the keys themselves, configurable
displays, such as eInk or OLED surfaces, to allow for
reconfiguration of the keys, even in these cases, the physical
layout of keys is not modifiable. Rather, only the values
associated with the physical keys on the keypad may be changed.
[0004] Other methods may provide increased user configurability of
physical keypads. These methods may include stickers and/or labels
that can be added to keys to reference modified functions or
plastic overlays on top of the keypad denoting different functional
suites. For instance, the ZBoard keyboard, meant for laptop or
desktop computer use, incorporates a dual layered physical keyboard
which separates the keys and their layout from the connections
which send signals to the machine. As such, different physical
keyboard inserts for different applications can be inserted into a
holder allowing full configurability such that the orientation and
layout of the keys in addition to their denotation of function is
configurable. This model could be extended to handheld computing
devices; however, the rate at which such a modular keypad can
change functions is much slower than a touch screen user interface.
Furthermore, for each potential functional suite, an additional
physical key layout must be carried by the user, greatly increasing
the overall physical size and weight of such implementations. One
advantage of a physical keypad for handheld computing devices is
that the user input space is extended beyond the user display space
such that none of the keys themselves, the housing of the keys, a
user's fingers, or a pointing device obscure any screen space
during user interface activities.
[0005] A substantial number of handheld computing devices make use
of a small touch screen display to deliver display information to
the user and to receive input commands from the user. In
this case, while the configurability of the device may be greatly
increased and a wide variety of user interface options may be
available to the user, this flexibility comes at a price. Namely,
such arrangements require shared screen space between the display
and the user interface. While this issue is shared with other types
of touch screen display/user interface technology, the small form
factor of handheld computing devices results in a tension between
the displayed graphics and area provided for receiving inputs. For
instance, the small display further constrains the display space,
which may increase the difficulty of interpreting actions or
results while a keypad or other user interface scheme is laid
overtop or to the side of the applications in use such that the
application is squeezed into an even smaller portion of the
display. Thus a single display touch screen solution, which solves
the problem of flexibility of the user interface, may create an even
more substantial set of problems of obfuscation of the display,
visual clutter, and an overall conflict of action and attention
between the user interface and the display.
[0006] Single display touch screen devices thus benefit from user
interface flexibility, but are crippled by their limited screen
space such that when users are entering information into the device
through the display, the ability to interpret information in the
display can be severely hampered. This problem is exacerbated in
several key situations when complex interaction between display and
interface is required, such as when manipulating layers on maps,
playing a game, or modifying data received from a scientific
application. This conflict between user interface and screen space
severely limits the degree to which the touch based user interface
may be used in an intuitive manner.
SUMMARY
[0007] A first aspect of the present invention includes a method
for controlling a plurality of displays of a handheld computing
device. The method includes displaying a first screen in a first
display of the plurality of displays when in a first display state.
A first gesture input and a second gesture input are received at
the handheld computing device. At least a portion of the first
gesture input occurs simultaneously with at least a portion of the
second gesture input. In response to this receiving, the plurality
of displays are modified from the first display state to a second
display state such that the first screen is displayed in a second
display of the plurality of displays when in the second display
state.
[0008] A second aspect of the present invention includes a handheld
computing device that includes a processor. The device also
includes a first display in operative communication with the
processor that is also operable to display a first screen in a
first display state. The device further includes a second display
in operative communication with the processor, a first gesture
sensor in operative communication with the processor that is also
operable to receive a first gesture input, and a second gesture
sensor in operative communication with the processor that is also
operable to receive a second gesture input. The processor, upon
receipt of the first and second gesture inputs, changes the
first and second displays to a second display state such that the
first screen is displayed on the second display in the second
display state.
[0009] A third aspect of the present invention includes another
method for controlling a plurality of displays of a handheld
computing device. This method includes displaying a first screen in
a first display and a second screen in a second display when the
plurality of displays are in a first display state. The method
further includes receiving a first gesture input at the handheld
computing device and a second gesture input at the handheld
computing device. At least a portion of the first gesture input
occurs simultaneously with at least a portion of the second gesture
input. The method also includes modifying the plurality of displays
from the first state to a second display state in response to the
receiving of the first and second gesture inputs. In turn, the
first screen is displayed in the second display and the second
screen is displayed in the first display when the plurality of
displays are in the second display state.
[0010] A number of feature refinements and additional features are
applicable to the foregoing aspects. These feature refinements and
additional features may be used individually or in any combination.
As such, each of the following features that will be discussed may
be, but are not required to be, used with any other feature or
combination of features of any of the aspects presented herein.
[0011] In one embodiment, the first gesture input may be a drag
gesture in a first direction. The second gesture input may be a
drag gesture in a second direction. The first direction and the
second direction may be opposite. As such, the first gesture input
and second gesture input may combine to define a pinch gesture. At
least one of the first gesture input and second gesture input may
be received at a touch sensitive device. As such, the gesture
inputs may be touch gesture inputs. The touch sensitive device may
be an off display touch sensitive device provided separately from
the first or second display. As such, the gesture inputs may be
received away from the first and second display such that the first
and second displays are not obscured when receiving the gesture
input. The first touch sensitive portion may be associated with the
first display to comprise a first touch screen display, and the
second touch sensitive portion may be associated with the second
display to comprise a second touch screen
display. Alternatively, the first touch sensitive portion may be
disposed apart from said first display, and the second touch
sensitive portion may be disposed apart from said second
display.
[0012] In another embodiment, a second screen may be displayed in
the second display when in the first display state and the first
display when in the second display state. As such, the first screen
and second screen may be exchanged between the first display and
second display when the device is modified from the first display
state to the second display state. The first screen may be
associated with a first application executing on the handheld
computing device, and the second screen may be associated with a
second application executing on the handheld computing device. At
least one of the first and second applications may be a single
screen application. Additionally or alternatively, at least one of
the first and second applications may be a multi screen application
executing in a single screen mode. A first desktop screen may be
displayed in the second display in the first display state and a
second desktop screen may be displayed in the first display in the
second display state.
[0013] In another embodiment, the plurality of displays comprise
separate portions of a single display. The first display
corresponds with a first portion of the single display and the
second display corresponds with a second portion of the single
display.
[0014] In still further embodiments, the handheld device may be a
smartphone. The first display and second display may be
positionable with respect to each other between an open and a
closed position. When in the open position, the first display and
the second display may both be visible from the vantage point of a
user. When in the closed position only one of the first display and
the second display may be visible from the vantage point of a
user.
BRIEF DESCRIPTION OF THE DRAWINGS
[0015] FIG. 1 is a schematic view of an embodiment of a handheld
computing device.
[0016] FIGS. 2A-D are graphical representations of an embodiment of
a handheld computing device in various instances of operation.
[0017] FIGS. 3A-K are graphical representations of an embodiment of
a handheld computing device provided in different positions,
orientations, and instances of operation.
[0018] FIG. 4 includes graphical representations of various gesture
inputs for controlling a handheld computing device.
[0019] FIGS. 5A and 5B are graphical representations of an
embodiment of a handheld computing device functioning in response
to a gesture input to change between a first display state and a
second display state.
[0020] FIGS. 6A and 6B are graphical representations of another
embodiment of a handheld computing device functioning in response
to a gesture input to change between a first display state and a
second display state.
[0021] FIGS. 7A and 7B are graphical representations of an
embodiment of a handheld computing device before and after
receiving a gesture input.
[0022] FIGS. 8A and 8B are graphical representations of another
embodiment of a handheld computing device before and after receiving
a gesture input.
[0023] FIGS. 9A, 9B, and 9C are graphical representations of
another embodiment of a handheld computing device functioning in
response to a received gesture input to change between a first
display state and a second display state.
[0024] FIGS. 10A and 10B are schematic views of two embodiments of
a handheld computing device provided with touch sensitive
devices.
[0025] FIG. 11 is a graphical representation of an embodiment of a
gesture input.
DETAILED DESCRIPTION
[0026] The present disclosure is generally related to gesture
inputs for interaction with a computing device. The interface
controls are particularly suited for control of devices that have
one or more displays capable of displaying graphical user
interfaces (GUIs) on a handheld portable device. The following
disclosure may, in various embodiments, be applied to other
computing devices capable of displaying and responding to a GUI
(e.g., laptop computers, tablet computers, desktop computers, touch
screen monitors, etc.) and is not intended to be limited to
handheld computing devices unless otherwise explicitly
specified.
[0027] FIG. 1 depicts an embodiment of a handheld computing device
100. The handheld computing device 100 may include a first display
102 and a second display 104. Additionally, while two displays
(102, 104) may be shown and described below with regard to the
functionality of various embodiments of handheld computing devices,
a handheld computing device may be provided that includes more than
two displays. In any regard, the first display 102 and the second
display 104 may be independently controllable. The displays may be
operative to display a displayed image or "screen". As used herein,
the term "display" is intended to connote device hardware, whereas
"screen" is intended to connote the displayed image produced on the
display. In this regard, a display is a physical piece of hardware
that is operable to render a screen. A screen may refer to a
majority of the display. For instance, a screen may occupy all of
the display area except for areas dedicated to other functions
(e.g., menu bars, status bars, etc.). A screen may be associated
with an application and/or an operating system executing on the
handheld computing device 100. For instance, application screens or
desktop screens may be displayed. An application may have various
kinds of screens that are capable of being manipulated as will be
described further below. In an embodiment, each display may have a
resolution of 480 pixels by 800 pixels, although higher and lower
resolution displays may also be provided.
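For illustration, the display/screen distinction above can be modeled as a small data structure: a display is hardware that renders, and a screen is the image it renders. The following Kotlin sketch is a minimal illustration; the names (Resolution, Screen, Display) are invented for the example, with the 480 by 800 resolution taken from the paragraph above.

```kotlin
data class Resolution(val width: Int, val height: Int)

// A screen is a displayed image, e.g. an application screen or a desktop screen.
sealed class Screen {
    data class ApplicationScreen(val appName: String) : Screen()
    data class DesktopScreen(val index: Int) : Screen()
}

// A display is a physical piece of hardware operable to render a screen.
class Display(val id: Int, val resolution: Resolution = Resolution(480, 800)) {
    var current: Screen? = null   // the screen currently occupying this display
        private set

    fun render(screen: Screen?) {
        current = screen
        println("Display $id (${resolution.width}x${resolution.height}) shows $screen")
    }
}

fun main() {
    val first = Display(id = 1)
    val second = Display(id = 2)
    first.render(Screen.ApplicationScreen("Application A1"))  // independently controllable
    second.render(Screen.DesktopScreen(index = 0))
}
```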
[0028] A screen may be associated with an operating system, an
application, or the like. In some instances, a screen may include
interactive features (e.g., buttons, text fields, toggle fields,
etc.) capable of manipulation by way of a user input. The user
input may be received by various input devices (e.g., a physical
keyboard, a roller ball, directional keys, a touch sensitive
device, etc.). In some instances, a screen may simply include
graphics and have no ability to receive an input by a user. In
other instances, graphics features and input features may both be
provided by a screen. As such, the one or more displays of a
handheld computing device, the screens displayed on the one or more
displays, and various user input devices may comprise a GUI that
allows a user to exploit functionality of the handheld computing
device.
[0029] The handheld computing device 100 may be configurable
between a first position and a second position. In the first
position, a single display (e.g., the first display 102 or the
second display 104) may be visible from the perspective of a user.
Both displays 102, 104 may be exposed on an exterior of the
handheld device 100 when in the first position, but the displays
102, 104 may be arranged in a non-adjacent manner such that both
displays 102, 104 are not concurrently visible from the perspective
of a user (e.g., one display may be visible from the front of the
device 100 and the other display may be visible from the back of
the device 100).
[0030] The handheld computing device 100 may also be provided in
the second position such that the displays 102, 104 may be
concurrently viewable from the perspective of a user (e.g., the
displays 102, 104 may be positioned adjacent to one another). The
displays 102, 104 may be arranged in the second position such that
the displays 102, 104 are arranged end-to-end or side-by-side.
Additionally, the displays 102, 104 may be arranged in a portrait
orientation or a landscape orientation with respect to a user. As
will be discussed further below, a portrait orientation is intended
to describe an arrangement of the handheld computing device,
wherein the longer dimension of the display of the handheld
computing device is vertically oriented (e.g., with respect to
gravity or the perspective of a user). A landscape orientation is
intended to describe an arrangement wherein the shorter dimension
of the display of the handheld computing device is vertically
oriented (e.g., with respect to gravity or the perspective of a
user). Furthermore, the longer dimension and shorter dimension may
refer to each display individually or the combined viewing area of
the one or more displays of the device. Thus, when the individual
displays are arranged in a portrait orientation, the overall
display area may be arranged in a landscape orientation, and vice
versa. Additionally, the displays and screens may be in different
respective orientations. For instance, when the displays are in a
landscape orientation, one or more screens may be rendered in a
portrait orientation on the displays.
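The dependency between individual and combined orientation can be made concrete: with two equal displays side by side, a portrait orientation of each individual display yields a landscape orientation of the overall display area, and vice versa. A minimal Kotlin sketch, with invented names:

```kotlin
enum class Orientation { PORTRAIT, LANDSCAPE }

// Overall orientation of the display area: if the displays are used
// individually, it matches the individual orientation; if used collectively
// as one side-by-side area, it flips (portrait displays -> landscape area).
fun overallOrientation(individual: Orientation, usedCollectively: Boolean): Orientation =
    if (!usedCollectively) individual
    else when (individual) {
        Orientation.PORTRAIT -> Orientation.LANDSCAPE
        Orientation.LANDSCAPE -> Orientation.PORTRAIT
    }

fun main() {
    println(overallOrientation(Orientation.PORTRAIT, usedCollectively = true))   // LANDSCAPE
    println(overallOrientation(Orientation.LANDSCAPE, usedCollectively = false)) // LANDSCAPE
}
```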
[0031] The handheld computing device 100 may be manipulated between
the first position (i.e., a single display visible from a user's
perspective) and the second position (i.e., at least two displays
concurrently visible from the user's perspective) in a variety of
manners. For instance, the device 100 may include a slider
mechanism such that the first and second displays 102, 104 are
disposable adjacent to one another in a parallel fashion in a
second position and slideable to the first position where only a
single display is viewable and the other display is obscured by the
viewable display.
[0032] Alternatively, the device 100 may be arranged in a clam
shell type arrangement wherein a hinge is provided between the
first display 102 and the second display 104 such that the displays
102, 104 are concurrently visible by a user when in the second
position (i.e., an open position). The displays 102, 104 may be
provided on an interior clam shell portion or an exterior clam
shell portion of the device 100. In this regard, both displays 102,
104 may be visible from the front and the back of the device,
respectively, when the device is in the first position (i.e., the
closed position). When the device 100 is in the open position, the
displays 102, 104 may be provided adjacent and parallel to one
another. Alternative arrangements of the handheld computing device
100 are contemplated wherein different arrangements and/or relative
locations of the displays may be provided when in the first and
second position.
[0033] In addition, the first display 102 and the second display
104 may be provided as entirely separate devices. In this regard, a
user may manipulate the displays 102, 104 such that they may be
positioned adjacent to one another (e.g., side-by-side or
end-to-end). The displays 102, 104 may be in operative
communication when adjacently positioned such that the displays
102, 104 may operate in the manner provided in greater detail below
when adjacently positioned (e.g., via physical contacts, wireless
communications, etc.). A retention member (not shown) may be
provided to retain the separate displays 102, 104 in an adjacent
position. For instance, the retention member may include
coordinating magnets, mechanical clips or fasteners, elastic
members, etc.
[0034] While the foregoing has referenced two displays 102 and 104,
alternate embodiments of a handheld device may include more than
two displays. In this regard, the two or more displays may behave
in a manner in accordance with the foregoing wherein only a single
display is viewable by a user in a first position and multiple
displays (i.e., more than two displays) are viewable in a second
position. Additionally, in one embodiment, the two displays 102 and
104 may comprise separate portions of a unitary display. As such,
the first display 102 may be a first portion of the unitary display
and the second display 104 may be a second portion of the unitary
display. For instance, the handheld computing device 100 (e.g.,
having a first and second display 102 and 104) may be operatively
connected to the unitary display (e.g., via a connector or a dock
portion of the unitary display) such that the first display 102 and
the second display 104 of the handheld computing device 100 are
emulated on the unitary display. As such, the unitary display may
have first and second portions corresponding to and acting in a
similar manner to the first and second display 102 and 104 of the
handheld computing device 100 described below.
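Emulating the first and second displays on a unitary display, as described above, amounts to partitioning the unitary display into two portions. The Kotlin sketch below assumes a simple side-by-side split into equal halves; the names are invented for the example:

```kotlin
// A rectangle in unitary-display coordinates.
data class Rect(val x: Int, val y: Int, val width: Int, val height: Int)

// Split a unitary display into two side-by-side portions that correspond to,
// and act in a similar manner to, the device's first and second displays.
fun emulatedPortions(unitaryWidth: Int, unitaryHeight: Int): Pair<Rect, Rect> {
    val half = unitaryWidth / 2
    val first = Rect(0, 0, half, unitaryHeight)
    val second = Rect(half, 0, unitaryWidth - half, unitaryHeight)
    return first to second
}

fun main() {
    val (first, second) = emulatedPortions(1600, 800)
    println("First display portion: $first")   // left half
    println("Second display portion: $second") // right half
}
```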
[0035] The handheld computing device 100 may further include one or
more input devices that may be used to receive user inputs. These
input devices may be operative to receive gesture inputs from a
user, and, accordingly, may be referred to generally as gesture
sensors. A number of different types of gesture sensors may be
provided. Some examples include, but are not limited to, traditional
input devices (keypads, trackballs, etc.), touch sensitive devices,
optical sensors (e.g., a camera or the like), etc. The discussion
contained herein may reference the use of touch sensitive devices
to receive gesture inputs. However, the use of touch sensitive
devices is not intended to limit the means for receiving gesture
inputs to touch sensitive devices alone and is provided for
illustrative purposes only. Accordingly, any of the foregoing means
for receiving a gesture input may be used to produce the
functionality disclosed below with regard to gesture inputs
received at touch sensitive devices.
[0036] In this regard, the handheld computing device 100 may
include at least a first touch sensor 106. Furthermore, the
handheld computing device may include a second touch sensor 108.
The first touch sensor 106 and/or the second touch sensor 108 may
be touchpad devices, touch screen devices, or other appropriate
touch sensitive devices. Examples include capacitive touch
sensitive panels, resistive touch sensitive panels, or devices
employing other touch sensitive technologies. The first touch
sensor 106 and/or second touch sensor 108 may be used in
conjunction with a portion of a user's body (e.g., finger, thumb,
hand, etc.), a stylus, or other acceptable touch sensitive
interface mechanisms known in the art. Furthermore, the first touch
sensor 106 and/or the second touch sensor 108 may be multi-touch
devices capable of sensing multiple touches simultaneously.
[0037] The first touch sensor 106 may correspond to the first
display 102 and the second touch sensor 108 may correspond to the
second display 104. In one embodiment of the handheld computing
device 100, the first display 102 and the first touch sensor 106
comprise a first touch screen display 110. In this regard, the
first touch sensor 106 may be transparent or translucent and
positioned with respect to the first display 102 such that a
corresponding touch received at the first touch sensor 106 may be
correlated to the first display 102 (e.g., to interact with a
screen presented thereon). Similarly, the second display 104 and
the second touch sensor 108 may comprise a second touch screen
display 112. In this regard, the second touch sensor 108 may be
positioned with respect to the second display 104 such that a touch
received at the second touch sensor 108 may be correlated to the
second display 104 (e.g., to interact with a screen presented
thereon). Alternatively, the first touch sensor 106 and/or the
second touch sensor 108 may be provided separately from the
displays 102, 104. Furthermore, in an alternate embodiment, only a
single touch sensor may be provided that allows for inputs to
control both the first display 102 and the second display 104. The
single touch sensor may also be provided separately or integrally
with the displays.
[0038] In this regard, the first and second touch sensors 106, 108
may have substantially the same footprint on the handheld computing
device 100 as the displays 102, 104. Alternatively, the touch
sensors 106, 108 may have a footprint including less of the
entirety of the displays 102, 104. Further still, the touch sensors
106, 108 may include a footprint that extends beyond the displays
102, 104 such that at least a portion of the touch sensors 106, 108
are provided in non-overlapping relation with respect to the
displays 102, 104. As discussed further below, the touch sensors
106, 108 may alternatively be provided in complete non-overlapping
relation such that the footprint of the touch sensors 106, 108 is
completely different than the footprint of the displays 102,
104.
[0039] With reference to FIGS. 10A-B, various potential arrangements
are depicted for the first display 102, the second display 104, and
touch sensors 106', 106'', and 108''. In FIG. 10A, the first display 102 and
second display 104 are arranged side-by-side such that a crease 196
separates the displays. In this regard, the first display 102 and
second display 104 may be arranged in a clam-shell type arrangement
such that the crease 196 includes a hinge that allows for pivotal
movement between the first display 102 and second display 104 as
discussed above. A touch sensor 106' may span the width of both the
first display 102 and the second display 104. In this regard, the
touch sensor 106' may span the crease 196 without interruption.
Alternatively, as shown in FIG. 10B, separate touch sensors 106''
and 108'' may be provided on either side of the crease 196. In this
regard, each of the touch sensors 106'' and 108'' may span the
width of each of the first display 102 and second display 104,
respectively.
[0040] In any of the arrangements shown in FIGS. 10A-B, the displays
(102, 104) may also comprise touch screen displays that may be used
in conjunction with touch sensitive portions that are provided
separately from the touch screen displays. Thus, displays 102 and
104 may both comprise touch screen displays and be provided in
addition to touch sensitive devices 106', 106'', and 108''.
Accordingly, a combination of touch screen displays (e.g., 110,
112) and off display touch sensors (e.g., 106', 106'', 108'') may
be provided for a single device. Touch inputs may be received at
both a touch screen display (110, 112) and off display touch sensor
(106', 106'', 108''). In this regard, a gesture received at an off
display touch sensor may have a different functionality than the
same gesture received at a touch screen display. Also, a touch
sensitive device may be divided into a plurality of zones. The same
gesture received in different zones may have different
functionality. For instance, a percentage (e.g., 10%, 25%, etc.) of
the touch sensitive device at the top or bottom of the display may
be defined as a separate zone than the remainder of the touch
sensitive device. Thus, a gesture received in this zone may have a
different functionality than a gesture received in the remainder of
the touch sensitive device.
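The zone behavior just described reduces to classifying a touch coordinate against a fractional threshold of the sensor's extent. A minimal Kotlin sketch; the names and the 10% default are illustrative assumptions:

```kotlin
enum class Zone { TOP, REMAINDER }

// A percentage (e.g. 10% or 25%) of the touch sensitive device at the top
// is treated as a separate zone; the same gesture received there may have
// a different functionality than in the remainder.
fun zoneFor(touchY: Float, sensorHeight: Float, topFraction: Float = 0.10f): Zone =
    if (touchY < sensorHeight * topFraction) Zone.TOP else Zone.REMAINDER

fun main() {
    val sensorHeight = 800f
    println(zoneFor(40f, sensorHeight))   // TOP
    println(zoneFor(400f, sensorHeight))  // REMAINDER
}
```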
[0041] The handheld computing device 100 may further include a
processor 116. The processor 116 may be in operative communication
with a data bus 114. The processor 116 may generally be operative
to control the functionality of the handheld device 100. For
instance, the processor 116 may execute an operating system and be
operative to execute applications. The processor 116 may be in
communication with one or more additional components 120-134 of the
handheld computing device 100 as will be described below. For
instance, the processor 116 may be in direct communication with one
or more of the additional components 120-134 or may communicate with
the one or more additional components via the data bus 114.
Furthermore, while the discussion below may describe the additional
components 120-134 being in operative communication with the data
bus 114, it will be understood that in other embodiments, any of
the additional components 120-134 may be in direct operative
communication with any of the other additional components 120-134.
Furthermore, the processor 116 may be operative to independently
control the first display 102 and the second display 104 and may be
operative to receive input from the first touch sensor 106 and the
second touch sensor 108. The processor 116 may comprise one or more
different processors. For example, the processor 116 may comprise
one or more application specific integrated circuits (ASICs), one
or more field-programmable gate arrays (FPGAs), one or more general
purpose processors operative to execute machine readable code, or a
combination of the foregoing.
[0042] The handheld computing device may include a battery 118
operative to provide power to the various devices and components of
the handheld computing device 100. In this regard, the handheld
computing device 100 may be portable.
[0043] The handheld computing device 100 may further include a
memory module 120 in operative communication with the data bus 114.
The memory module 120 may be operative to store data (e.g.,
application data). For instance, the memory 120 may store machine
readable code executable by the processor 116 to execute various
functionalities of the device 100.
[0044] Additionally, a communications module 122 may be in
operative communication with one or more components via the data
bus 114. The communications module 122 may be operative to
communicate over a cellular network, a Wi-Fi connection, a
hardwired connection or other appropriate means of wired or
wireless communication. The handheld computing device 100 may also
include an antenna 126. The antenna 126 may be in operative
communication with the communications module 122 to provide
wireless capability to the communications module 122. Accordingly,
the handheld computing device 100 may have telephony capability
(i.e., the handheld computing device 100 may be a smartphone
device).
[0045] An audio module 124 may also be provided in operative
communication with the data bus 114. The audio module 124 may
include a microphone and/or speakers. In this regard, the audio
module 124 may be able to capture audio or produce sounds.
Furthermore, the device 100 may include a camera module 128. The
camera module 128 may be in operative communication with other
components of the handheld computing device 100 to facilitate the
capture and storage of images or video.
[0046] Additionally, the handheld computing device 100 may include
an I/O module 130. The I/O module 130 may provide input and output
features for the handheld computing device 100 such that the
handheld computing device 100 may be connected via a connector or
other device in order to provide syncing or other communications
between the handheld computing device 100 and another device (e.g.,
a peripheral device, another computing device etc.).
[0047] The handheld computing device 100 may further include an
accelerometer module 132. The accelerometer module 132 may be able
to monitor the orientation of the handheld computing device 100
with respect to gravity. In this regard, the accelerometer module
132 may be operable to determine whether the handheld computing
device 100 is substantially in a portrait orientation or landscape
orientation. The accelerometer module 132 may further provide other
control functionality by monitoring the orientation and/or movement
of the handheld computing device 100.
[0048] The handheld computing device 100 may also include one or
more hardware buttons 134. The hardware buttons 134 may be used to
control various features of the handheld computing device 100. The
hardware buttons 134 may have fixed functionality or may be
contextual such that the specific function of the buttons changes
during operation of the handheld computing device 100. Examples of
such hardware buttons may include, but are not limited to, volume
control, a home screen button, an end button, a send button, a menu
button, etc.
[0049] With further reference to FIGS. 2A-D, various screens of an
embodiment of a device are shown. The screens shown in FIGS. 2A-D
are intended to represent the potential screens that may be
displayed. Thus, while multiple screens may be shown, only one or a
subset of the multiple screens may be shown on the displays of the
device at any one moment. In this regard, a screen may be described
in a relative location to the displays or other screens (e.g., to
the left of a display, to the right of a display, under another
screen, above another screen, etc.). These relationships may be
logically established such that no physical display reflects the
relative position. For instance, a screen may be moved off a
display to the left. While the screen is no longer displayed on the
display, the screen may have a virtual or logical position to the
left of the display from which it was moved. This logical position
may be recognized by a user and embodied in values describing the
screen (e.g., values stored in memory corresponding to the screen).
Thus, when referencing screens in relative locations to other
screens, the relationships may be embodied in logic and not
physically reflected in the display of the device.
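These logical screen positions can be modeled as an ordered sequence of screens over which the physical displays expose a sliding window; a screen moved off a display keeps its place in the sequence. A hedged Kotlin sketch with invented names:

```kotlin
// Screens keep a logical order even when not rendered; only a window of the
// order is visible on the device's displays at any moment.
class ScreenSequence(private val screens: List<String>, private val visibleCount: Int = 2) {
    private var firstVisible = 0  // index of the screen on the leftmost display

    fun visible(): List<String> = screens.drop(firstVisible).take(visibleCount)

    // Navigate right: the leftmost screen moves logically off-display to the left.
    fun shiftRight() { if (firstVisible + visibleCount < screens.size) firstVisible++ }

    fun shiftLeft() { if (firstVisible > 0) firstVisible-- }
}

fun main() {
    val seq = ScreenSequence(listOf("Desktop 1", "Desktop 2", "Desktop 3", "Desktop 4"))
    println(seq.visible())  // [Desktop 1, Desktop 2]
    seq.shiftRight()
    println(seq.visible())  // [Desktop 2, Desktop 3]; Desktop 1 is logically to the left
}
```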
[0050] FIGS. 2A-D may display a number of different screens that
may be displayed at various instances of operation of a handheld
device and are not intended to be presented in any particular order
or arrangement. Single screen applications and multi screen
applications may be provided. A single screen application is
intended to describe an application that is capable of producing a
screen that may occupy only a single display at a time. A multi
screen application is intended to describe an application that is
capable of producing one or more screens that may simultaneously
occupy multiple displays. Additionally, a multi screen application
may occupy a single display. In this regard, a multi screen
application may have a single screen mode and a multi screen
mode.
[0051] A desktop sequence 136 is displayed in FIG. 2A. The desktop
sequence 136 may include a number of individual desktop screens
138a-138f. Thus, each desktop screen 138 may occupy substantially
the entirety of a single display (e.g., the first display 102 or
second display 104 of FIG. 1). The desktop screens 138a-138f may be
in a predetermined order such that the desktop screens 138a-138f
appear consecutively and the order in which the desktop screens
appear may not be reordered. However, the desktop screens 138a-138f
may be sequentially navigated (e.g., in response to a user input).
That is, one or more of the desktop screens 138a-138f may be
sequentially displayed on a handheld device as controlled by a user
input.
[0052] Additionally, FIG. 2B displays a hierarchical application
sequence 140 of a multi screen application. The hierarchical
application sequence 140 may include a root screen 142, one or more
node screens 144, and a leaf screen 146. The root screen 142 may be
a top level view of the hierarchical application sequence 140 such
that there is no parent screen corresponding to the root screen
142. The root screen 142 may be a parent to a node screen 144. One
or more node screens 144 may be provided that are related as
parent/children. A node screen may also serve as a parent to a leaf
screen 146. By leaf screen 146, it is meant that the leaf screen
146 has no corresponding node screen for which the leaf screen 146
is a parent. As such, the leaf screen does not have any children
node screens 144. FIG. 2C depicts various single screen
applications 148a, 148b, and 148c arranged sequentially. Each of
these single screen applications may correspond to a different
executing application. For instance, in FIG. 2C Application 4,
Application 5, and Application 6 may be executing on the device and
correspond to each single screen 148a, 148b, and 148c,
respectively.
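The hierarchical application sequence of FIG. 2B is a parent/child tree: a root with no parent, intermediate nodes, and leaves with no children. A minimal Kotlin sketch; the class and screen names are invented for the example:

```kotlin
// A screen in a hierarchical application sequence (root, node, or leaf).
class AppScreen(val name: String) {
    var parent: AppScreen? = null
        private set
    private val children = mutableListOf<AppScreen>()

    fun addChild(child: AppScreen): AppScreen {
        child.parent = this
        children += child
        return child
    }

    val isRoot: Boolean get() = parent == null     // top level view: no parent screen
    val isLeaf: Boolean get() = children.isEmpty() // no children node screens
}

fun main() {
    val root = AppScreen("Root screen")
    val node = root.addChild(AppScreen("Node screen"))
    val leaf = node.addChild(AppScreen("Leaf screen"))
    println("${root.name}: root=${root.isRoot}, leaf=${root.isLeaf}")  // root=true, leaf=false
    println("${leaf.name}: root=${leaf.isRoot}, leaf=${leaf.isLeaf}")  // root=false, leaf=true
}
```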
[0053] FIG. 2D also includes an empty view 166. The empty view 166
may be used during transitions of a screen (e.g., movement of
screen between a first display and a second display). It is not
necessary that the empty view 166 be interpretable by the user as
an effective GUI screen. The empty view 166 merely communicates to
the user that an action regarding the screen (e.g., the movement of
the screen with respect to one or more displays) is occurring. An
application displaying an empty view 166 need not be able to
receive, process, or interpret input. The empty view 166 may display a
screen, or a representation thereof, as it is being moved in
proportion to the amount of the screen that has been moved from a
first display to a second display as will be discussed in greater
detail below. In this regard, the empty view 166 may be used to
relate information regarding the position of a screen during a
transition of the screen (e.g., in response to gesture). While
shown in FIG. 2D as a grayed screen, an empty view 166 is only
intended to refer to a screen not capable of receiving an input. In
this regard, the display of an empty view 166 may include an
animation or the like showing the response of a screen as it is
being moved or changed (e.g., modified into or out of a landscape
mode).
[0054] FIGS. 3A-K depict various arrangements and statuses of
displays 102, 104 of a device that are possible in various
embodiments of a handheld computing device according to the present
disclosure. For instance, when in the first (e.g., closed)
position, a closed front display 168 may be visible as shown in
FIG. 3A. The closed front display 168 may correspond with the first
display 102 or the second display 104. The closed front 168 as
displayed may be occupied by a desktop screen 138 as shown in FIG.
3A. Alternatively, an application with a single screen or a multi
screen application in single screen mode may be displayed in the
closed front 168. A closed back display 170 may be viewable from an
opposite side of the display when the device is in a closed
position, as shown in FIG. 3B. The closed back 170 may display a
different desktop screen or application screen than the closed
front 168 or may simply display an empty view 166 (e.g., displaying
an icon or other graphic) and lack functionality as an
interface.
[0055] FIG. 3C depicts a closed device in a landscape orientation
172. In one embodiment, a landscape mode (i.e., wherein the display
is adjusted to display a screen 148 in a landscape orientation) may
not be enabled as shown in FIG. 3C. Alternatively, the landscape
mode may be enabled such that the screen 148 is modified when the
device is sensed in a landscape orientation 172, such that the
screen 148 is rendered in a landscape orientation as shown at FIG.
3D.
[0056] The device may further be provided in a second (e.g., open)
position 174 as shown in FIG. 3E. In the open position 174, at
least two displays 102, 104 are arranged such that the two displays
102, 104 are both visible from the vantage point of a user. The two
displays 102, 104 may be arranged in a side-by-side fashion when in
the open position 174. Thus, each of the two displays 102, 104 may
display separate screens. For instance, the displays 102, 104 may
each display a separate desktop screen 138a, 138b, respectively.
While the individual displays 102 and 104 are in a portrait
orientation as shown in FIG. 3E, it may be appreciated that the
full display area (comprising both the first display 102 and the
second display 104) may be arranged in a landscape orientation.
Thus, whether the device as depicted in FIG. 3E is in a landscape
or portrait orientation may depend on whether the displays are
being used individually or collectively. If used collectively as a
unitary display, the device may be in a landscape orientation,
whereas if the displays are used separately, the orientation shown
in FIG. 3E may be referred to as a portrait orientation.
[0057] Additionally, when the device is in an open position 174 as
shown in FIG. 3F, a similar dependency with regard to the use of
the screens as a unitary display or separate displays may also
affect whether the device is in a portrait orientation or landscape
orientation. As can be appreciated, each individual screen is in a
landscape orientation, such that if the displays are used
separately, the device may be in a landscape orientation. If used
as a unitary display, the device may be in a portrait orientation.
In any regard, as shown in FIG. 3F, a single screen 148 may occupy
a first display 102 and the second display 104 may display a
desktop screen 138. The single screen 148 may be displayed in a
landscape or portrait mode. Alternatively, a device in the open
position 174 may display a multi screen GUI 156 that may occupy
both displays 102, 104 in a portrait orientation as shown in FIG.
3G such that the individual displays are in a landscape
orientation.
[0058] FIGS. 3I-K depict the potential arrangements of the screens
of a multi screen application 152. The multi screen application 152
may, in one mode, occupy a single display 102 when the device is in
a closed position 168 as shown in FIG. 3I. That is, the multi
screen application 152 may be in a single screen mode.
Alternatively, when the device is in an open position as shown in
FIG. 3J, the multi screen application 152 may still occupy a single
display 102 in single screen mode. Furthermore, the multi screen
application 152 may be expanded to occupy both displays 102, 104
when the device is in the open position as shown in FIG. 3K. In
this regard, the multi screen application 152 may also execute in a
multi screen mode. Various options may be provided for expanding
the multi screen application 152 from a single screen mode to a
multi screen mode.
[0059] For example, the multi screen application 152 may be
maximized from a single screen mode displayed in a single display
to two screens displayed in two displays such that a parent screen
is displayed in the first display and a node screen is expanded
into the second display. In this regard, each of the screens
displayed in the first and second display may be independent
screens that comprise part of a hierarchical application sequence
(e.g., as shown in FIG. 2B). Alternatively, the single screen mode
of the multi screen application may simply be scaled such that the
contents of the single screen are scaled to occupy both displays.
Thus, the same content displayed in the single screen is scaled to
occupy multiple displays, but no additional viewing area or
graphics are presented. Further still, the maximization of the
multi screen application from a single screen mode to a multi
screen mode may result in the expansion of the viewable area of the
application. For example, if a multi screen application is
displayed in single screen mode, upon maximization into multi
screen mode, the viewable area of the multi-screen application may
be expanded while the scale of the graphics displayed remains the
same. In this regard, the viewable area of the multi-screen
application may be expanded into the second display while the
scaling remains constant upon expansion.
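Two of the expansion behaviors described above can be contrasted as a strategy choice: scale the single screen's content across both displays, or keep the scale constant and expand the viewable area into the second display. The Kotlin sketch below assumes two equal-width displays; the names are illustrative:

```kotlin
enum class MaximizeMode { SCALE_CONTENT, EXPAND_VIEWABLE_AREA }

data class Viewport(val widthPx: Int, val heightPx: Int, val zoom: Float)

fun maximize(single: Viewport, mode: MaximizeMode): Viewport = when (mode) {
    // Same content scaled to occupy both displays: no additional viewing area.
    MaximizeMode.SCALE_CONTENT ->
        single.copy(widthPx = single.widthPx * 2, zoom = single.zoom * 2f)
    // Expanded viewable area while the scale of the graphics remains the same.
    MaximizeMode.EXPAND_VIEWABLE_AREA ->
        single.copy(widthPx = single.widthPx * 2)
}

fun main() {
    val single = Viewport(480, 800, zoom = 1f)
    println(maximize(single, MaximizeMode.SCALE_CONTENT))        // zoom doubles
    println(maximize(single, MaximizeMode.EXPAND_VIEWABLE_AREA)) // zoom constant
}
```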
[0060] In this regard, an application may have configurable
functionality regarding the nature and behavior of the screens of
the application. For instance, an application may be configurable
to be a single screen application or a multi screen application.
Furthermore, a multi screen application may be configurable as to
the nature of the expansion of the multi screen application between
a single screen mode and a multi screen mode. These configuration
values may be default values that may be changed or may be
permanent values for various applications. These configuration
values may be communicated to the device (e.g., the processor 116)
to dictate the behavior of the application when executing on the
device.
[0061] FIG. 4 depicts various graphical representations of gesture
inputs that may be recognized by a handheld computing device. Such
gestures may be received at one or more touch sensitive portions of
the device. In this regard, various input mechanisms may be used in
order to generate the gestures shown in FIG. 4. For example a
stylus, a user's finger(s), or other devices may be used to
activate the touch sensitive device in order to receive the
gestures. The use of a gesture may describe the use of a truncated
input that results in functionality without the full range of
motion necessary to conventionally carry out the same
functionality. For instance, movement of screens between displays
may be carried out by selecting and moving the screen between
displays such that the full extent of the motion between displays
is received as an input. However, such an implementation may be
difficult to accomplish in that the first and second displays may
comprise separate display portions without continuity therebetween.
As such, a gesture may truncate the full motion of movement or
provide an alternative input to accomplish the same functionality.
Thus, movement spanning the first and second display may be
truncated so that the gesture may be received at a single touch
sensitive device. The use of gesture inputs is particularly suited
to handheld computing devices in that the full action of an input
may be difficult to execute given the limited input and display
space commonly provided on a handheld computing device.
[0062] With reference to FIG. 4, a circle 190 may represent a touch
received at a touch sensitive device. The circle 190 may include a
border 192, the thickness of which may indicate the length of time
the touch is held stationary at the touch sensitive device. In this
regard, a tap 186 has a thinner border 192 than the border 192' for
a long press 188. In this regard, the long press 188 may involve a
touch that remains stationary on the touch sensitive display for
longer than that of a tap 186. As such, different gestures may be
registered depending upon the length of time that the touch remains
stationary prior to movement.
[0063] A drag 176 involves a touch (represented by circle 190) with
movement 194 in a direction. The drag 176 may involve an initiating
touch that remains stationary on the touch sensitive device for a
certain amount of time represented by the border 192. In contrast,
a flick 178 may involve a touch with a shorter dwell time prior to
movement than the drag as indicated by the thinner border 192'' of
the flick 178. Thus, again different gestures may be produced by
differing dwell times of a touch prior to movement. The flick 178
may also include movement 194. The direction of movement 194 of the
drag and flick 178 may be referred to as the direction of the drag
or direction of the flick. Thus, a drag to the right may describe a
drag 176 with movement 194 to the right.
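The distinctions drawn in the two preceding paragraphs, tap versus long press and flick versus drag, turn on the dwell time of the stationary touch and on whether movement follows. A minimal Kotlin classifier; the 500 ms and 150 ms thresholds are invented for illustration, not values from the disclosure:

```kotlin
enum class GestureKind { TAP, LONG_PRESS, DRAG, FLICK }

// Classify a completed touch by its stationary dwell time (the border
// thickness in FIG. 4) and whether movement followed the initial touch.
fun classify(dwellMs: Long, moved: Boolean): GestureKind = when {
    !moved && dwellMs < 500 -> GestureKind.TAP        // short stationary touch
    !moved                  -> GestureKind.LONG_PRESS // longer stationary touch
    dwellMs < 150           -> GestureKind.FLICK      // short dwell, then movement
    else                    -> GestureKind.DRAG       // longer dwell, then movement
}

fun main() {
    println(classify(dwellMs = 80, moved = false))  // TAP
    println(classify(dwellMs = 900, moved = false)) // LONG_PRESS
    println(classify(dwellMs = 60, moved = true))   // FLICK
    println(classify(dwellMs = 400, moved = true))  // DRAG
}
```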
[0064] In an embodiment, a gesture having movement (e.g., a flick
or drag gesture as described above) may be limited to movement in a
single direction along a first axis. Thus, movement in a direction
different than along the first axis may be disregarded so long as
contact with the touch sensitive device is unbroken. In
this regard, once a gesture is initiated, movement in a direction
not along an axis along which initial movement is registered may be
disregarded or only the vector component of movement along the axis
may be registered.
[0065] While the directional gestures (e.g., the drag 176 and flick
178) shown in FIG. 4 include only horizontal motion after the
initial touch, this need not reflect the actual movement of the
touch during the gesture. For instance, once the drag is initiated
in the horizontal direction, movement in a direction other than the
horizontal direction may not result in movement of the screen in
that different direction. For
instance, with further reference to FIG. 11, the drag 176 from left
to right may be initiated with initial movement 204 from left to
right along an initiated direction 210. Subsequently, while
maintaining contact with the touch sensitive device, the user may
input an off direction movement 206 in a direction different than
the initiated direction 210. In this regard, the off direction
movement 206 may not result in any movement of a screen between two
displays. Furthermore, the user may input partially off direction
movement 208, where only a vector portion of the movement is in the
direction of the initiated direction 210. In this regard, only the
portion of the partially off direction movement 208 may result in
movement of a screen between displays. In short, the movement of
application screens between the first display 102 and the second
display 104 may be constrained along a single axis along which the
displays are arranged.
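As a non-limiting illustration of this axis constraint, the vector
component described above may be computed as a projection. The
following Python sketch is illustrative only; the function name and
example values are assumptions of this example.

    # Illustrative sketch: keep only the component of raw touch
    # movement along the initiated direction 210, so off direction
    # movement 206 contributes nothing and partially off direction
    # movement 208 contributes only its vector portion.
    def axis_component(dx, dy, axis_x, axis_y):
        """Scalar component of the movement (dx, dy) along the unit
        vector (axis_x, axis_y) of the initiated direction."""
        # The dot product projects the raw movement onto the axis.
        return dx * axis_x + dy * axis_y

    # Gesture initiated left to right: the initiated direction is (1, 0).
    print(axis_component(30, 0, 1, 0))   # on-direction movement -> 30
    print(axis_component(0, 25, 1, 0))   # off direction movement -> 0
    print(axis_component(20, 15, 1, 0))  # partially off direction -> 20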
[0066] Additionally, FIG. 4 depicts a pinch gesture 180. The pinch
gesture 180 may be initiated by a first touch input 190a and a
second touch input 190b to a touch sensitive device. The first
touch input 190a and second touch input 190b may be received at the
same touch sensitive device or may be received on different touch
sensitive devices. The first touch input 190a may be held for a
certain amount of time, as represented by the border 192a. Also,
the second touch input 190b may be held for a certain amount of
time, as represented by the border 192b. The first touch input 190a
and the second touch input 190b may also include corresponding
first movement 194a and second movement 194b, respectively. The
first touch input 190a may include first movement 194a in a
direction generally towards the second touch input 190b. Also, the
second touch input 190b may include second movement 194b in a
direction generally towards the first touch input 190a. In this
regard, the pinch gesture 180 may be accomplished by a user touching
the touch sensitive portion of the device in a manner that resembles
a pinching motion with respect to the device.
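One possible test for such converging touches is sketched below in
Python. This is illustrative only; the helper names and the use of
dot products are assumptions of this example, not features recited
in the embodiment.

    # Illustrative sketch: register a pinch gesture 180 when each of
    # two simultaneous touches moves generally toward the other. All
    # names here are hypothetical.
    def _moves_toward(start, movement, target):
        """True if the movement vector from start points generally
        toward target (positive dot product with start-to-target)."""
        to_target = (target[0] - start[0], target[1] - start[1])
        return movement[0] * to_target[0] + movement[1] * to_target[1] > 0

    def is_pinch(start_a, move_a, start_b, move_b):
        """Two touches form a pinch when each moves toward the other."""
        return (_moves_toward(start_a, move_a, start_b) and
                _moves_toward(start_b, move_b, start_a))

    # A touch at (0, 0) moving right and a touch at (100, 0) moving
    # left converge, so the pair registers as a pinch.
    assert is_pinch((0, 0), (10, 1), (100, 0), (-12, -2))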
[0067] With additional reference to FIGS. 5A and 5B, the function
of a handheld computing device in response to a gesture input is
shown. In FIG. 5A, a first screen 500 is displayed on the first
display 102. A second screen 510 is displayed on a second display
104. The first screen 500 and second screen 510 may be associated
with single screen applications, or may be other single screen
variants, such as a multi screen application executing in single
screen mode. When in the configuration shown in FIG. 5A, a pinch
gesture 180 may be received at the handheld computing device. For
instance, the pinch 180 may be received at an off display touch
sensitive device (not shown). In response to the pinch gesture 180,
the first display 102 may be changed to the display state shown in
FIG. 5B such that the second screen 510 is displayed on the first
display 102. Also, the second display 104 may be changed to the
display state shown in FIG. 5B such that the first screen 500 is
displayed on the second display 104. In this regard, the screens
500 and 510 may be swapped between the first display 102 and the
second display 104 upon receipt of the pinch gesture 180.
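A minimal sketch of this swap, assuming the display state is modeled
as a mapping from display to screen, follows in Python. The names
are hypothetical and illustrative only.

    # Illustrative sketch: the swap of FIGS. 5A-5B, with the device
    # state modeled as a mapping from display to the screen it shows.
    displays = {"first": "screen 500", "second": "screen 510"}  # FIG. 5A

    def on_pinch(displays):
        """Swap the screens shown on the first and second displays."""
        displays["first"], displays["second"] = (displays["second"],
                                                 displays["first"])

    on_pinch(displays)
    print(displays)  # {'first': 'screen 510', 'second': 'screen 500'}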
[0068] With further reference to FIGS. 6A and 6B, a received pinch
gesture 180 may also be used to swap screens displayed in a first
display 102 and second display 104 wherein the screens are
associated with multi screen applications executing in single
screen mode. In this regard, in FIG. 6A, Application A1 600 may be
operating in single screen mode and displayed in the first display
102. Also, Application B1 610 may be operating in single screen mode
and displayed in the second display 104. Upon receipt of the pinch
gesture 180 when the device is in the configuration shown in FIG.
6A, the device may change to the display state shown in FIG. 6B. In
FIG. 6B, Application A1 600 may be swapped with Application B1 610
such that Application B1 610 is displayed by the first display 102
and Application A1 600 is displayed by the second display 104.
[0069] With additional reference to FIGS. 7A and 7B, Application A1
700 may be a multi screen application that is operating in multi
screen mode. In this regard, Application A1 700 may occupy both the
first display 102 and the second display 104. Upon receipt of a
pinch gesture 180 when in the configuration shown in FIG. 7A, the
device may remain unchanged as shown in FIG. 7B. That is, the
receipt of the pinch gesture 180 when the Application A1 700 is in
multi-screen mode may result in no change in the display state of
the device. In this regard, in FIG. 7B, Application A1 700 is still
displayed in both the first display 102 and the second display 104
after receipt of the pinch gesture 180. Similarly, as shown in
FIGS. 8A and 8B, a first desktop screen 138a may be displayed in
the first display 102 and a second desktop screen 138b may be
displayed in the second display 104. Upon receipt of the pinch
gesture 180 when the device is in the configuration shown in FIG.
8A, no change may occur, such that the arrangement of FIG. 8A is
maintained in FIG. 8B, wherein the first desktop screen 138a is
still shown in the first display 102 and the second desktop screen
138b is still shown in the second display 104.
[0070] In contrast, as shown in FIG. 9A, when a single screen of
Application X 900 is displayed in the first display 102 and a
desktop screen 138b is displayed in the second display 104, receipt
of a pinch gesture 180 may change the device to the display state
shown in FIG. 9B. That is, upon receipt of a pinch gesture
180 when in the configuration shown in FIG. 9A, the result may be
the single screen of Application X 900 being moved to the second
display 104. In this regard, a first desktop screen 138a that was
previously obscured by the Application X 900 may be exposed.
Additionally, the second desktop screen 138b that was previously
visible in the second display 104 may be obscured. Additionally, if
a second pinch gesture 180 is received when in the configuration
shown in FIG. 9B, the device may be changed to a state shown in
FIG. 9C, wherein Application X 900 is moved from the second display
104 to the first display 102. In this regard, the first desktop
screen 138a that is visible in FIG. 9B is again obscured by
Application X 900 in FIG. 9C. Additionally, the second desktop
screen 138b, which was previously obscured by Application X 900 in
FIG. 9B, may again be exposed in the second display 104 in FIG. 9C.
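The pinch behaviors of FIGS. 6 through 9 may be summarized as a
small state transition, sketched below in Python. This is a minimal
sketch assuming each display has its own underlying desktop screen
and at most one application screen on top; the names are
hypothetical and illustrative only.

    # Illustrative sketch: consolidate the pinch behaviors of FIGS.
    # 6-9. Each argument is the application screen atop that display,
    # or None when the display's desktop screen is visible.
    def handle_pinch(first_app, second_app, multi_screen_mode):
        """Return (first_app, second_app) after a pinch gesture 180."""
        if multi_screen_mode:
            return first_app, second_app  # FIGS. 7A-7B: no change
        if first_app is None and second_app is None:
            return first_app, second_app  # FIGS. 8A-8B: no change
        return second_app, first_app      # FIGS. 5, 6, 9: swap or move

    # FIG. 9A -> 9B: Application X 900 moves to the second display,
    # exposing desktop 138a and obscuring desktop 138b.
    print(handle_pinch("Application X 900", None, False))
    # FIG. 9B -> 9C: a second pinch moves Application X 900 back.
    print(handle_pinch(None, "Application X 900", False))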
[0071] While the invention has been illustrated and described in
detail in the drawings and foregoing description, such illustration
and description are to be considered as exemplary and not
restrictive in character. For example, certain embodiments
described hereinabove may be combinable with other described
embodiments and/or arranged in other ways (e.g., process elements
may be performed in other sequences). Accordingly, it should be
understood that only the preferred embodiment and variants thereof
have been shown and described and that all changes and
modifications that come within the spirit of the invention are
desired to be protected.
* * * * *