U.S. patent application number 13/627708 was filed with the patent office on 2012-09-26 and published on 2013-04-25 for unified desktop input segregation in an application manager.
This patent application is currently assigned to Z124. The applicant listed for this patent is Z124. Invention is credited to Alexander de Paz, Martin Gimpl, Paul E. Reeves, Sanjiv Sirpal.
Publication Number: 20130104062
Application Number: 13/627708
Family ID: 48137011
Filed: 2012-09-26
Published: 2013-04-25
United States Patent Application 20130104062
Kind Code: A1
Reeves; Paul E.; et al.
April 25, 2013
UNIFIED DESKTOP INPUT SEGREGATION IN AN APPLICATION MANAGER
Abstract
Methods and devices for selectively presenting a user interface
or "desktop" across two devices are provided. More particularly, a
unified desktop is presented across a device and a computer system
that comprise a unified system. The unified desktop acts as a
single user interface that presents data and receives user
interaction in a seamless environment that emulates a personal
computing environment. To function within the personal computing
environment, the unified desktop includes a process for docking and
undocking the device with the computer system. The unified desktop
presents a new user interface to allow access to functions of the
unified desktop.
Inventors: Reeves; Paul E. (Oakville, CA); Sirpal; Sanjiv (Oakville, CA); de Paz; Alexander (Burlington, CA); Gimpl; Martin (Helsinki, FI)
Applicant: Z124 (George Town, KY)
Assignee: Z124 (George Town, KY)
Family ID: 48137011
Appl. No.: 13/627708
Filed: September 26, 2012
Related U.S. Patent Documents

Application Number | Filing Date | Continued by
13605740 | Sep 6, 2012 | 13627708
13605482 | Sep 6, 2012 | 13605740
13605145 | Sep 6, 2012 | 13605482
13604960 | Sep 6, 2012 | 13605145
13604384 | Sep 5, 2012 | 13604960
13603136 | Sep 4, 2012 | 13604384
13410931 | Mar 2, 2012 | 13603136
13566336 | Aug 3, 2012 | 13410931
13566244 | Aug 3, 2012 | 13566336
13566168 | Aug 3, 2012 | 13566244
13566103 | Aug 3, 2012 | 13566168
13543678 | Jul 6, 2012 | 13566103
13543635 | Jul 6, 2012 | 13543678
13408530 | Feb 29, 2012 | 13543635
13410958 | Mar 2, 2012 | 13408530
13410983 | Mar 2, 2012 | 13410958
13411034 | Mar 2, 2012 | 13410983
13411075 | Mar 2, 2012 | 13411034
13436593 | Mar 30, 2012 | 13411075
13436823 | Mar 30, 2012 | 13436593
13436826 | Mar 30, 2012 | 13436823
13485734 | May 31, 2012 | 13436826
13485743 | May 31, 2012 | 13485734
61539884 (provisional) | Sep 27, 2011 |
Current U.S. Class: 715/761
Current CPC Class: G09G 5/14 20130101; G06F 3/048 20130101; G06F 3/04883 20130101; G06F 3/04845 20130101; G06F 2203/04803 20130101; G06F 3/1423 20130101; G09G 2354/00 20130101; G06F 3/017 20130101; G06F 3/0482 20130101; G06F 3/0486 20130101; G06F 1/1641 20130101
Class at Publication: 715/761
International Class: G06F 3/01 20060101 G06F003/01
Claims
1. A method, comprising: providing a device; providing a computer
system; docking the device to the computer system to form a unified
system; generating a unified desktop for the unified system,
wherein the unified desktop includes at least a first user
interface associated with the device and a second user interface
associated with the computer system, and wherein the unified
desktop emulates a personal computer environment; receiving a first
user interface input on the first user interface; in response to
receiving the first user interface input, displaying a first
application manager interface on the first user interface;
receiving a second user interface input on the second user
interface; in response to receiving the second user interface
input, displaying a second application manager interface on the
second user interface; receiving a third user interface input,
wherein the third user interface input reorders one or more
application representations in the first application manager
interface.
2. The method of claim 1, wherein the third user interface input
changes the display of one or more application windows displayed in
the second user interface.
3. The method of claim 2, wherein the third user interface input also
reorders the one or more application representations in the second
application manager interface.
4. The method of claim 3, wherein the third user interface input is
an input into a touch sensitive display.
5. The method of claim 3, wherein the third user interface input is
an input into a gesture capture region.
6. The method of claim 5, wherein one or more applications
displayed on the first user interface are device-focused and one or
more applications displayed on the second user interface are
computer system-focused.
7. The method of claim 6, wherein the unified desktop prevents the
third user interface input from changing a computer system-focused
application to a device-focused application.
8. The method of claim 7, wherein a fourth user interface input
changes a computer system-focused application to a device-focused
application.
9. The method of claim 8, wherein the fourth user interface input
is a drag and drop operation between the first user interface and
the second user interface.
10. The method of claim 8, wherein a fifth user input reorders the
one or more application representations in the second application
manager interface and the one or more applications, associated with
the one or more application representations, in a stack associated
with the first user interface.
11. A unified system, comprising: a device comprising: a first
screen; a first memory; a first processor; a computer system
comprising: a second screen; a second memory; a second processor,
wherein the device is docked with the computer system to form the
unified system; wherein the unified system is configured to:
receive a first user interface input on the first user interface;
in response to receiving the first user interface input, display a
first application manager interface on the first user interface;
receive a second user interface input on the second user interface;
in response to receiving the second user interface input, display a
second application manager interface on the second user interface;
receive a third user interface input, wherein the third user
interface input reorders one or more application representations in
the first application manager interface.
12. The unified system of claim 11, wherein the third user
interface input changes the display of one or more application
windows displayed in the second user interface, and wherein the third
user interface input also reorders the one or more application
representations in the second application manager interface.
13. The unified system of claim 11, wherein the third user
interface input is one of an input into a touch sensitive display
or an input into a gesture capture region.
14. The unified system of claim 13, wherein one or more
applications displayed on the first user interface are
device-focused and one or more applications displayed on the second
user interface are computer system-focused, and wherein the unified
desktop prevents the third user interface input from changing a
computer system-focused application to a device-focused application.
15. The unified system of claim 14, wherein a fourth user interface
input changes a computer system-focused application to a
device-focused application, and wherein the fourth user interface
input is a drag and drop operation between the first user interface
and the second user interface.
16. A computer readable medium having stored thereon
computer-executable instructions, the computer-executable
instructions causing a processor to execute a method for providing
a unified desktop, the computer-executable instructions comprising:
upon docking a device with a computer system, instructions to
generate a unified desktop for the unified system, wherein the
unified desktop includes a first user interface associated with the
device and a second user interface associated with the computer
system, and wherein the unified desktop emulates a personal
computer environment; instructions to receive a first user
interface input on the first user interface; in response to
receiving the first user interface input, instructions to display a
first application manager interface on the first user interface;
instructions to receive a second user interface input on the second
user interface; in response to receiving the second user interface
input, instructions to display a second application manager
interface on the second user interface; instructions to receive a
third user interface input, wherein the third user interface input
reorders one or more application representations in the first
application manager interface.
17. The computer readable medium of claim 16, wherein the third
user interface input changes the display of one or more application
windows displayed in the second user interface, and wherein the third
user interface input also reorders the one or more application
representations in the second application manager interface.
18. The computer readable medium of claim 16, wherein the third
user interface input is one of an input into a touch sensitive
display or an input into a gesture capture region.
19. The computer readable medium of claim 16, wherein one or more
applications displayed on the first user interface are
device-focused and one or more applications displayed on the second
user interface are computer system-focused, and wherein the unified
desktop prevents the third user interface input from changing a
computer system-focused application to a device-focused application.
20. The computer readable medium of claim 16, wherein a fourth user
interface input changes a computer system-focused application to a
device-focused application, and wherein the fourth user interface
input is a drag and drop operation between the first user interface
and the second user interface.
Description
[0001] This application is a Continuation-In-Part and claims the
benefit of and priority to U.S. application Ser. No. 13/605,740,
filed Sep. 6, 2012; U.S. application Ser. No. 13/605,482, filed
Sep. 6, 2012; U.S. application Ser. No. 13/605,145, filed Sep. 6,
2012; U.S. application Ser. No. 13/604,960, filed Sep. 6, 2012;
U.S. application Ser. No. 13/603,136, filed Sep. 4, 2012; U.S.
application Ser. No. 13/604,384, filed Sep. 5, 2012; U.S.
application Ser. No. 13/566,336, filed Aug. 3, 2012; U.S.
application Ser. No. 13/566,244, filed Aug. 3, 2012; U.S.
application Ser. No. 13/566,168, filed Aug. 3, 2012; U.S.
application Ser. No. 13/566,103, filed Aug. 3, 2012; U.S.
application Ser. No. 13/543,678, filed Jul. 6, 2012; U.S.
application Ser. No. 13/543,635, filed Jul. 6, 2012; U.S.
application Ser. No. 13/408,530, filed Feb. 29, 2012; U.S.
application Ser. No. 13/410,931, filed Mar. 2, 2012; U.S.
application Ser. No. 13/410,958, filed Mar. 2, 2012; U.S.
application Ser. No. 13/410,983, filed Mar. 2, 2012; U.S.
application Ser. No. 13/411,034, filed Mar. 2, 2012; U.S.
application Ser. No. 13/411,075, filed Mar. 2, 2012; U.S.
application Ser. No. 13/436,593, filed Mar. 30, 2012; U.S.
application Ser. No. 13/436,823, filed Mar. 30, 2012; U.S.
application Ser. No. 13/436,826, filed Mar. 30, 2012; U.S.
application Ser. No. 13/485,734, filed May 31, 2012, and U.S.
application Ser. No. 13/485,743, filed May 31, 2012.
[0002] This application also claims the benefit of and priority,
under 35 U.S.C. § 119(e), to U.S. Provisional Application Ser.
No. 61/539,884, filed Sep. 27, 2011. This application is also
related to U.S. application Ser. Nos. 13/217,108, filed on Aug. 24,
2011; 13/251,427, filed on Oct. 3, 2011; 13/247,166, filed on Sep.
28, 2011; 13/217,121, filed on Aug. 24, 2011; 13/217,130, filed on
Aug. 24, 2011; 13/247,170, filed on Sep. 28, 2011; 13/246,669,
filed on Sep. 27, 2011; 13/217,099, filed on Aug. 24, 2011;
13/247,885, filed on Sep. 28, 2011; 13/250,764, filed on Sep. 30,
2011; 13/251,434, filed on Oct. 3, 2011; 13/399,901, filed on Feb.
17, 2012; 13/399,929, filed on Feb. 17, 2012; 13/399,936, filed on
Feb. 17, 2012; 13/246,118, filed on Sep. 27, 2011; 13/246,128,
filed on Sep. 27, 2011; 13/246,133, filed on Sep. 27, 2011;
13/246,675, filed on Sep. 27, 2011 and 13/246,665, filed on Sep.
27, 2011. The entire disclosures of all the applications or patents
listed herein are hereby incorporated by reference, for all that
they teach and for all purposes.
BACKGROUND
[0003] A substantial number of handheld computing devices, such as
cellular phones, tablets, and E-Readers, make use of a touch screen
display not only to deliver display information to the user but
also to receive inputs from user interface commands. While touch
screen displays may increase the configurability of the handheld
device and provide a wide variety of user interface options, this
flexibility typically comes at a price. The dual use of the touch
screen to provide content and receive user commands, while flexible
for the user, may obfuscate the display and cause visual clutter,
thereby leading to user frustration and loss of productivity.
[0004] The small form factor of handheld computing devices requires
a careful balancing between the displayed graphics and the area
provided for receiving inputs. On the one hand, the small display
constrains the display space, which may increase the difficulty of
interpreting actions or results. On the other, a virtual keypad or
other user interface scheme is superimposed on or positioned
adjacent to an executing application, requiring the application to
be squeezed into an even smaller portion of the display.
[0005] This balancing act is particularly difficult for single
display touch screen devices. Single display touch screen devices
are crippled by their limited screen space. When users are entering
information into the device, through the single display, the
ability to interpret information in the display can be severely
hampered, particularly when a complex interaction between display
and interface is required.
[0006] Current handheld computing devices may be connected to
larger computing devices, e.g., personal computers (PCs), or
peripheral screens to provide more display area. Current handheld
devices do not include features that allow it to provide both PC
functionality and the functionality associated with the handheld
device, e.g., phone, text, or other communication functionality.
Instead, a peripheral screen connected to a handheld device merely
provides more display area for the handheld computing device. When
connecting the handheld device to another computing system, such as
a PC, the handheld device is typically recognized by the computing
system as a peripheral device. The functionality of the handheld
device is typically not integrated with the functionality of the
larger computing system.
SUMMARY
[0007] There is a need for a dual multi-display handheld computing
device that provides for enhanced power and/or versatility compared
to conventional single display handheld computing devices. These
and other needs are addressed by the various aspects, embodiments,
and/or configurations of the present disclosure. Also, while the
disclosure is presented in terms of exemplary embodiments, it
should be appreciated that individual aspects of the disclosure can
be separately claimed.
[0008] Embodiments provide for a handheld device with a unified
desktop for integrating the functionality of the handheld device
with a larger computing system, e.g., a PC. When connected to a
peripheral display and/or a display of a PC, the handheld device
provides a unified desktop displayed across the screen(s) of the
handheld device and the additional display. The unified desktop
unifies the PC functionality provided on the additional display
with the handheld functionality, such as communication applications
(e.g., phone, SMS, MMS) provided on the screen(s) of the handheld
device. A user can seamlessly interact with applications, e.g.,
open, drag, close, receive notifications, on the unified desktop
whether the applications are displayed on the screens of the
handheld device, or the peripheral display of the larger computing
system. Each portion of the unified desktop (i.e., the portion on
the handheld device and the portion on the peripheral screen) may
display different applications, information, and/or have a
different layout. Also, in embodiments, each portion of the desktop
may display similar information in different formats. For example,
the battery level of the handheld device, wireless network signal
strength, and notifications can be displayed in both portions of the
desktop, with a larger format being used on the portion of the
unified desktop displayed on the peripheral screen, and a smaller
format used on the screen(s) of the handheld device.
[0009] Further embodiments provide for devices, computer-executable
instructions, and methods for providing a unified desktop. Upon
docking a device with a computer system, a unified system generates
a unified desktop. The unified desktop can include a first user
interface associated with the device and a second user interface
associated with the computer system. While docked, the unified
system can receive a first user interface input on the first user
interface, and in response, display a first application manager
interface on the first user interface. The unified system may
optionally receive a second user interface input on the second user
interface and, in response, display a second application manager
interface on the second user interface. The unified desktop may
then receive a third user interface input, wherein the third user
interface input reorders one or more application representations in
the first application manager interface.
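For illustration, the flow described in this summary can be sketched in a few lines of Python. Everything below (the class names, the methods, and the list-based reordering) is a hypothetical sketch of the described behavior, not the actual implementation:

```python
# Hypothetical sketch of the input-segregation flow described above.
# All names are illustrative assumptions, not from the specification.

class ApplicationManagerInterface:
    """Holds an ordered list of application representations for one UI."""
    def __init__(self, name):
        self.name = name
        self.representations = []   # e.g., window thumbnails or icons
        self.visible = False

    def show(self):
        self.visible = True

    def reorder(self, old_index, new_index):
        rep = self.representations.pop(old_index)
        self.representations.insert(new_index, rep)


class UnifiedSystem:
    """Formed when the device docks with the computer system."""
    def __init__(self):
        # First UI is associated with the device, second with the computer.
        self.device_ui_manager = ApplicationManagerInterface("device")
        self.computer_ui_manager = ApplicationManagerInterface("computer")

    def on_first_input(self):          # input received on the first UI
        self.device_ui_manager.show()  # display first app manager interface

    def on_second_input(self):         # input received on the second UI
        self.computer_ui_manager.show()

    def on_third_input(self, old_index, new_index):
        # The third input reorders application representations in the
        # first application manager interface.
        self.device_ui_manager.reorder(old_index, new_index)


us = UnifiedSystem()
us.device_ui_manager.representations = ["mail", "browser", "phone"]
us.on_first_input()
us.on_third_input(2, 0)                      # move "phone" to the front
print(us.device_ui_manager.representations)  # ['phone', 'mail', 'browser']
```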
[0010] The phrases "at least one", "one or more", and "and/or" are
open-ended expressions that are both conjunctive and disjunctive in
operation. For example, each of the expressions "at least one of A,
B and C", "at least one of A, B, or C", "one or more of A, B, and
C", "one or more of A, B, or C" and "A, B, and/or C" means A alone,
B alone, C alone, A and B together, A and C together, B and C
together, or A, B and C together.
[0011] The term "a" or "an" entity refers to one or more of that
entity. As such, the terms "a" (or "an"), "one or more" and "at
least one" can be used interchangeably herein. It is also to be
noted that the terms "comprising", "including", and "having" can be
used interchangeably.
[0012] The term "automatic" and variations thereof, as used herein,
refers to any process or operation done without material human
input when the process or operation is performed. However, a
process or operation can be automatic, even though performance of
the process or operation uses material or immaterial human input,
if the input is received before performance of the process or
operation. Human input is deemed to be material if such input
influences how the process or operation will be performed. Human
input that consents to the performance of the process or operation
is not deemed to be "material".
[0013] The term "computer-readable medium" as used herein refers to
any tangible storage and/or transmission medium that participate in
providing instructions to a processor for execution. Such a medium
may take many forms, including but not limited to, non-volatile
media, volatile media, and transmission media. Non-volatile media
includes, for example, NVRAM, or magnetic or optical disks.
Volatile media includes dynamic memory, such as main memory. Common
forms of computer-readable media include, for example, a floppy
disk, a flexible disk, hard disk, magnetic tape, or any other
magnetic medium, magneto-optical medium, a CD-ROM, any other
optical medium, punch cards, paper tape, any other physical medium
with patterns of holes, a RAM, a PROM, an EPROM, a FLASH-EPROM, a
solid state medium like a memory card, any other memory chip or
cartridge, a carrier wave as described hereinafter, or any other
medium from which a computer can read. A digital file attachment to
e-mail or other self-contained information archive or set of
archives is considered a distribution medium equivalent to a
tangible storage medium. When the computer-readable medium is
configured as a database, it is to be understood that the database
may be any type of database, such as relational, hierarchical,
object-oriented, and/or the like. Accordingly, the disclosure is
considered to include a tangible storage medium or distribution
medium and prior art-recognized equivalents and successor media, in
which the software implementations of the present disclosure are
stored.
[0014] The term "desktop" refers to a metaphor used to portray
systems. A desktop is generally considered a "surface" that
typically includes pictures, called icons, widgets, folders, etc.
that can activate and show applications, windows, cabinets, files,
folders, documents, and other graphical items. The icons are
generally selectable to initiate a task through user interface
interaction to allow a user to execute applications or conduct
other operations.
[0015] The term "screen," "touch screen," or "touchscreen" refers
to a physical structure that includes one or more hardware
components that provide the device with the ability to render a
user interface and/or receive user input. A screen can encompass
any combination of a gesture capture region, a touch sensitive
display, and/or a configurable area. The device can have one or
more physical screens embedded in the hardware. However, a screen
may also include an external peripheral device that may be attached
to and detached from the device. In embodiments, multiple external
devices may be attached to the device. Thus, in embodiments, the
screen can enable the user to interact with the device by touching
areas on the screen and provides information to a user through a
display. The touch screen may sense user contact in a number of
different ways, such as by a change in an electrical parameter
(e.g., resistance or capacitance), acoustic wave variations,
infrared radiation proximity detection, light variation detection,
and the like. In a resistive touch screen, for example, normally
separated conductive and resistive metallic layers in the screen
pass an electrical current. When a user touches the screen, the two
layers make contact in the contacted location, whereby a change in
electrical field is noted and the coordinates of the contacted
location calculated. In a capacitive touch screen, a capacitive
layer stores electrical charge, which is discharged to the user
upon contact with the touch screen, causing a decrease in the
charge of the capacitive layer. The decrease is measured, and the
contacted location coordinates determined. In a surface acoustic
wave touch screen, an acoustic wave is transmitted through the
screen, and the acoustic wave is disturbed by user contact. A
receiving transducer detects the user contact instance and
determines the contacted location coordinates.
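As a worked example of the resistive case, the contact coordinates fall out of a simple voltage-divider calculation. The sketch below is illustrative only; real touch controllers add calibration, debouncing, and filtering:

```python
# Illustrative only: a simplified model of how a 4-wire resistive touch
# screen converts measured voltages into contact coordinates.

def resistive_touch_position(v_x, v_y, v_ref, width_px, height_px):
    """Map measured voltages (0..v_ref) to pixel coordinates.

    When the two layers make contact, each axis acts as a voltage
    divider, so the voltage ratio is proportional to the position.
    """
    x = (v_x / v_ref) * width_px
    y = (v_y / v_ref) * height_px
    return round(x), round(y)

# Example: half-scale voltage on x, quarter-scale on y of a 480x800 panel.
print(resistive_touch_position(1.65, 0.825, 3.3, 480, 800))  # (240, 200)
```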
[0016] The term "display" refers to a portion of one or more
screens used to display the output of a computer to a user. A
display may be a single-screen display or a multi-screen display,
referred to as a composite display. A composite display can
encompass the touch sensitive display of one or more screens. A
single physical screen can include multiple displays that are
managed as separate logical displays. Thus, different content can
be displayed on the separate displays although part of the same
physical screen.
[0017] The term "displayed image" refers to an image produced on
the display. A typical displayed image is a window or desktop. The
displayed image may occupy all or a portion of the display.
[0018] The term "display orientation" refers to the way in which a
rectangular display is oriented by a user for viewing. The two most
common types of display orientation are portrait and landscape. In
landscape mode, the display is oriented such that the width of the
display is greater than the height of the display (such as a 4:3
ratio, which is 4 units wide and 3 units tall, or a 16:9 ratio,
which is 16 units wide and 9 units tall). Stated differently, the
longer dimension of the display is oriented substantially
horizontal in landscape mode while the shorter dimension of the
display is oriented substantially vertical. In the portrait mode,
by contrast, the display is oriented such that the width of the
display is less than the height of the display. Stated differently,
the shorter dimension of the display is oriented substantially
horizontal in the portrait mode while the longer dimension of the
display is oriented substantially vertical.
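This orientation rule reduces to a single comparison of the display's dimensions; the helper below is a hypothetical sketch, not from the specification:

```python
# Illustrative sketch: classify display orientation from its pixel
# dimensions, per the definition above (landscape when width > height).

def display_orientation(width, height):
    return "landscape" if width > height else "portrait"

print(display_orientation(800, 600))    # landscape (4:3)
print(display_orientation(1080, 1920))  # portrait
```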
[0019] The term "composite display" refers to a logical structure
that defines a display that can encompass one or more screens. A
multi-screen display can be associated with a composite display
that encompasses all the screens. The composite display can have
different display characteristics based on the various orientations
of the device.
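A minimal sketch of a composite display follows, assuming a simple side-by-side or stacked arrangement of screens; the arrangement rule and all names are assumptions for illustration:

```python
# Sketch of a "composite display": one logical display spanning one or
# more screens, with characteristics that depend on device orientation.

from dataclasses import dataclass

@dataclass
class Screen:
    width: int
    height: int

def composite_display(screens, orientation):
    """Return the logical (width, height) spanned by all screens."""
    if orientation == "landscape":
        # Screens side by side: widths add, usable height is the minimum.
        return sum(s.width for s in screens), min(s.height for s in screens)
    # Portrait: screens stacked, heights add.
    return min(s.width for s in screens), sum(s.height for s in screens)

primary, secondary = Screen(480, 800), Screen(480, 800)
print(composite_display([primary, secondary], "landscape"))  # (960, 800)
```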
[0020] The term "gesture" refers to a user action that expresses an
intended idea, action, meaning, result, and/or outcome. The user
action can include manipulating a device (e.g., opening or closing
a device, changing a device orientation, moving a trackball or
wheel, etc.), movement of a body part in relation to the device,
movement of an implement or tool in relation to the device, audio
inputs, etc. A gesture may be made on a device (such as on the
screen) or with the device to interact with the device.
[0021] The term "module" as used herein refers to any known or
later developed hardware, software, firmware, artificial
intelligence, fuzzy logic, or combination of hardware and software
that is capable of performing the functionality associated with
that element.
[0022] The term "gesture capture" refers to a sense or otherwise a
detection of an instance and/or type of user gesture. The gesture
capture can occur in one or more areas of the screen, A gesture
region can be on the display, where it may be referred to as a
touch sensitive display or off the display where it may be referred
to as a gesture capture area.
[0023] A "multi-screen application" or "multiple-display
application" refers to an application that is capable of multiple
modes. The multi-screen application mode can include, but is not
limited to, a single screen mode (where the application is
displayed on a single screen) or a composite display mode (where
the application is displayed on two or more screens). A
multi-screen application can have different layouts optimized for
the mode. Thus, the multi-screen application can have different
layouts for a single screen or for a composite display that can
encompass two or more screens. The different layouts may have
different screen/display dimensions and/or configurations on which
the user interfaces of the multi-screen applications can be
rendered. The different layouts allow the application to optimize
the application's user interface for the type of display, e.g.,
single screen or multiple screens. In single screen mode, the
multi-screen application may present one window pane of
information. In a composite display mode, the multi-screen
application may present multiple window panes of information or may
provide a larger and a richer presentation because there is more
space for the display contents. The multi-screen applications may
be designed to adapt dynamically to changes in the device and the
mode depending on which display (single or composite) the system
assigns to the multi-screen application. In alternative
embodiments, the user can use a gesture to request the application
transition to a different mode, and, if a display is available for
the requested mode, the device can allow the application to move to
that display and transition modes.
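A hedged sketch of the mode handling described above, with assumed layout names and an assumed availability check:

```python
# Sketch of a multi-screen application choosing a layout for its current
# mode and handling the gesture-driven mode-change request. The layout
# names and the request API are illustrative assumptions.

class MultiScreenApp:
    LAYOUTS = {
        "single": "one window pane of information",
        "composite": "multiple window panes across two or more screens",
    }

    def __init__(self):
        self.mode = "single"

    def layout(self):
        return self.LAYOUTS[self.mode]

    def request_mode(self, mode, display_available):
        # The device grants the transition only if a display is
        # available for the requested mode.
        if mode in self.LAYOUTS and display_available(mode):
            self.mode = mode
        return self.mode

app = MultiScreenApp()
app.request_mode("composite", display_available=lambda m: True)
print(app.layout())  # multiple window panes across two or more screens
```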
[0024] A "single-screen application" refers to an application that
is capable of single screen mode. Thus, the single-screen
application can produce only one window and may not be capable of
different modes or different display dimensions. A single-screen
application may not be capable of the several modes discussed with
the multi-screen application.
[0025] The term "window" refers to a, typically rectangular,
displayed image on at least part of a display that contains or
provides content different from the rest of the screen. The window
may obscure the desktop.
[0026] The terms "determine", "calculate" and "compute," and
variations thereof, as used herein, are used interchangeably and
include any type of methodology, process, mathematical operation or
technique.
[0027] It shall be understood that the term "means" as used herein
shall be given its broadest possible interpretation in accordance
with 35 U.S.C., Section 112, Paragraph 6. Accordingly, a claim
incorporating the term "means" shall cover all structures,
materials, or acts set forth herein, and all of the equivalents
thereof. Further, the structures, materials or acts and the
equivalents thereof shall include all those described in the
summary of the invention, brief description of the drawings,
detailed description, abstract, and claims themselves.
[0028] The preceding is a simplified summary of the disclosure to
provide an understanding of some aspects of the disclosure. This
summary is neither an extensive nor exhaustive overview of the
disclosure and its various aspects, embodiments, and/or
configurations. It is intended neither to identify key or critical
elements of the disclosure nor to delineate the scope of the
disclosure but to present selected concepts of the disclosure in a
simplified form as an introduction to the more detailed description
presented below. As will be appreciated, other aspects,
embodiments, and/or configurations of the disclosure are possible
utilizing, alone or in combination, one or more of the features set
forth above or described in detail below.
BRIEF DESCRIPTION OF THE DRAWINGS
[0029] FIG. 1A includes a first view of an embodiment of a
multi-screen user device;
[0030] FIG. 1B includes a second view of an embodiment of a
multi-screen user device;
[0031] FIG. 1C includes a third view of an embodiment of a
multi-screen user device;
[0032] FIG. 1D includes a fourth view of an embodiment of a
multi-screen user device;
[0033] FIG. 1E includes a fifth view of an embodiment of a
multi-screen user device;
[0034] FIG. 1F includes a sixth view of an embodiment of a
multi-screen user device;
[0035] FIG. 1G includes a seventh view of an embodiment of a
multi-screen user device;
[0036] FIG. 1H includes an eighth view of an embodiment of a
multi-screen user device;
[0037] FIG. 1I includes a ninth view of an embodiment of a
multi-screen user device;
[0038] FIG. 1J includes a tenth view of an embodiment of a
multi-screen user device;
[0039] FIG. 2 is a block diagram of an embodiment of the hardware
of the device;
[0040] FIG. 3A is a block diagram of an embodiment of the state
model for the device based on the device's orientation and/or
configuration;
[0041] FIG. 3B is a table of an embodiment of the state model for
the device based on the device's orientation and/or
configuration;
[0042] FIG. 4A is a first representation of an embodiment of a user
gesture received at a device;
[0043] FIG. 4B is a second representation of an embodiment of a user
gesture received at a device;
[0044] FIG. 4C is a third representation of an embodiment of a user
gesture received at a device;
[0045] FIG. 4D is a fourth representation of an embodiment of a user
gesture received at a device;
[0046] FIG. 4E is a fifth representation of an embodiment of a user
gesture received at a device;
[0047] FIG. 4F is a sixth representation of an embodiment of a user
gesture received at a device;
[0048] FIG. 4G is a seventh representation of an embodiment of a user
gesture received at a device;
[0049] FIG. 4H is an eighth representation of an embodiment of a user
gesture received at a device;
[0050] FIG. 5A is a block diagram of an embodiment of the device
software and/or firmware;
[0051] FIG. 5B is a second block diagram of an embodiment of the
device software and/or firmware;
[0052] FIG. 6A is a first representation of an embodiment of a
device configuration generated in response to the device state;
[0053] FIG. 6B is a second representation of an embodiment of a
device configuration generated in response to the device state;
[0054] FIG. 6C is a third representation of an embodiment of a
device configuration generated in response to the device state;
[0055] FIG. 6D is a fourth representation of an embodiment of a
device configuration generated in response to the device state;
[0056] FIG. 6E is a fifth representation of an embodiment of a
device configuration generated in response to the device state;
[0057] FIG. 6F is a sixth representation of an embodiment of a
device configuration generated in response to the device state;
[0058] FIG. 6G is a seventh representation of an embodiment of a
device configuration generated in response to the device state;
[0059] FIG. 6H is an eighth representation of an embodiment of a
device configuration generated in response to the device state;
[0060] FIG. 6I is a ninth representation of an embodiment of a
device configuration generated in response to the device state;
[0061] FIG. 6J is a tenth representation of an embodiment of a
device configuration generated in response to the device state;
[0062] FIG. 7A is a representation of a logical window stack;
[0063] FIG. 7B is another representation of an embodiment of a
logical window stack;
[0064] FIG. 7C is another representation of an embodiment of a
logical window stack;
[0065] FIG. 7D is another representation of an embodiment of a
logical window stack;
[0066] FIG. 7E is another representation of an embodiment of a
logical window stack;
[0067] FIG. 8 is a block diagram of an embodiment of a logical data
structure for a window stack;
[0068] FIG. 9 is a flow chart of an embodiment of a method for
creating a window stack;
[0069] FIG. 10 is a flow chart of an embodiment of a method for
managing the execution of an application;
[0070] FIG. 11 is a block diagram of an embodiment of the hardware
of a unified system;
[0071] FIG. 12 is a block diagram of an embodiment of the hardware
of a computer system;
[0072] FIG. 13 is a representation of an embodiment of a unified
desktop;
[0073] FIG. 14 is a representation of an embodiment of a user
interface presented in a unified desktop;
[0074] FIG. 15 is another representation of an embodiment of a user
interface presented in a unified desktop;
[0075] FIG. 16 is a flow chart of an embodiment of a method for
providing status indicators in a unified desktop;
[0076] FIG. 17 is a flow chart of an embodiment of a method for
providing status indicators in a unified desktop;
[0077] FIG. 18 is another representation of an embodiment of a user
interface presented in a unified desktop;
[0078] FIG. 19 is another representation of an embodiment of a user
interface presented in a unified desktop;
[0079] FIG. 20 is a flow chart of an embodiment of a method for
providing a freeform window in a unified desktop;
[0080] FIG. 20 is another flow chart of an embodiment of a method
for providing a freeform window in a unified desktop;
[0081] FIG. 21 is another representation of an embodiment of a user
interface presented in a unified desktop;
[0082] FIG. 22 is another representation of an embodiment of a user
interface presented in a unified desktop;
[0083] FIG. 23 is another representation of an embodiment of a user
interface presented in a unified desktop;
[0084] FIG. 24 is another representation of an embodiment of a user
interface presented in a unified desktop;
[0085] FIG. 25 is another representation of an embodiment of a user
interface presented in a unified desktop;
[0086] FIG. 26 is another representation of an embodiment of a user
interface presented in a unified desktop;
[0087] FIG. 27 is another representation of an embodiment of a user
interface presented in a unified desktop;
[0088] FIG. 28 is another representation of an embodiment of a user
interface presented in a unified desktop;
[0089] FIG. 29 is a representation of a state diagram of a user
interface in a sleep state presented in a unified desktop;
[0090] FIG. 30 is a flow chart of an embodiment of a method for
providing a common wake and unlock strategy for a unified
desktop;
[0091] FIG. 31 is another representation of an embodiment of user
interfaces presented before docking a device with a computer system
to form a unified desktop;
[0092] FIG. 32 is another representation of an embodiment of a user
interface presented in a unified desktop after docking;
[0093] FIG. 33 is another representation of an embodiment of a user
interface presented in a unified desktop after docking;
[0094] FIG. 34 is a flow chart of an embodiment of a method for
docking a device with a computer system to form a unified
desktop;
[0095] FIG. 35 is another representation of an embodiment of a user
interface presented in a unified desktop before undocking;
[0096] FIG. 36 is another representation of an embodiment of user
interfaces presented after undocking a device with a computer
system to dismantle a unified desktop;
[0097] FIG. 37 is a flow chart of an embodiment of a method for
undocking a device with a computer system;
[0098] FIG. 38 is a representation of a rules diagram for different
rules that govern the docking and undocking of a device and a
computer system;
[0099] FIG. 39 is a block diagram of an embodiment of the computing
environment of a unified system;
[0100] FIG. 40 is a representation of an embodiment of a user
interface presented in a unified desktop;
[0101] FIG. 41A is a block diagram of an embodiment of a user
interface data structure associated with the unified desktop;
[0102] FIG. 41B is another block diagram of another embodiment of a
user interface data structure associated with the unified
desktop;
[0103] FIG. 42 is another representation of an embodiment of a user
interface presented in a unified desktop;
[0104] FIG. 43A is a block diagram of an embodiment of a user
interface data structure associated with the unified desktop;
[0105] FIG. 43B is another block diagram of another embodiment of a
user interface data structure associated with the unified
desktop;
[0106] FIG. 44 is another representation of an embodiment of a user
interface presented in a unified desktop;
[0107] FIG. 45A is a block diagram of an embodiment of a user
interface data structure associated with the unified desktop;
[0108] FIG. 45B is another block diagram of another embodiment of a
user interface data structure associated with the unified
desktop;
[0109] FIG. 46 is another representation of an embodiment of a user
interface presented in a unified desktop;
[0110] FIG. 47A is a block diagram of an embodiment of a user
interface data structure associated with the unified desktop;
[0111] FIG. 47B is another block diagram of another embodiment of a
user interface data structure associated with the unified
desktop;
[0112] FIG. 48 is a flow chart of an embodiment of a method for
managing the user interface when docking a device with a computer
system;
[0113] FIG. 49 is another representation of an embodiment of a user
interface presented in a unified desktop;
[0114] FIG. 50 is another representation of an embodiment of a user
interface presented in a unified desktop;
[0115] FIG. 51 is another representation of an embodiment of a user
interface presented in a unified desktop;
[0116] FIG. 52 is another representation of an embodiment of a user
interface presented in a unified desktop;
[0117] FIG. 53 is a block diagram of an embodiment of a data
structure associated with windows previously displayed when the
device was previously docked with the computer system;
[0118] FIG. 54 is a flow chart of an embodiment of a method for
managing window stickiness;
[0119] FIG. 55A is another block diagram of another embodiment of
modules associated with the device docked with the computer
system;
[0120] FIG. 55B is another block diagram of another embodiment of
modules associated with the computer system docked with the
device;
[0121] FIG. 56 is a flow chart of an embodiment of a method for
processing events in the unified system;
[0122] FIG. 57 is another flow chart of another embodiment of a
method for processing events in the unified system;
[0123] FIG. 58 is another flow chart of another embodiment of a
method for processing events in the unified system;
[0124] FIG. 59 is a flow chart of an embodiment of a method for the
device assuming the role of master in the unified system;
[0125] FIG. 60 is another representation of an embodiment of a user
interface presented in a unified desktop;
[0126] FIG. 61A is another representation of an embodiment of a
user interface presented in a unified desktop;
[0127] FIG. 61B is another representation of an embodiment of a
user interface presented in a unified desktop;
[0128] FIG. 61C is another representation of an embodiment of a
user interface presented in a unified desktop;
[0129] FIG. 62 is another representation of an embodiment of a user
interface presented in a unified desktop;
[0130] FIG. 63 is another representation of an embodiment of a user
interface presented in a unified desktop;
[0131] FIG. 64 is another representation of an embodiment of a user
interface presented in a unified desktop;
[0132] FIG. 65 is another representation of an embodiment of a user
interface presented in a unified desktop;
[0133] FIG. 66A is another representation of an embodiment of a
user interface presented in a unified desktop;
[0134] FIG. 66B is another representation of an embodiment of a
user interface presented in a unified desktop;
[0135] FIG. 67 is another representation of an embodiment of a user
interface presented in a unified desktop;
[0136] FIG. 68 is another representation of an embodiment of a user
interface presented in a unified desktop;
[0137] FIG. 69 is another representation of an embodiment of a user
interface presented in a unified desktop;
[0138] FIG. 70 is a flow chart of an embodiment of a method for
providing a triad control in the unified system;
[0139] FIG. 71 is a flow chart of an embodiment of a method for
providing an application manager interface in the unified
system;
[0140] FIG. 72 is another representation of an embodiment of a user
interface presented in a unified desktop;
[0141] FIG. 73 is another representation of an embodiment of a user
interface presented in a unified desktop;
[0142] FIG. 74 is another representation of an embodiment of a user
interface presented in a unified desktop;
[0143] FIG. 75 is another representation of an embodiment of a user
interface presented in a unified desktop;
[0144] FIG. 76 is another representation of an embodiment of a user
interface presented in a unified desktop;
[0145] FIG. 77 is a flow chart of an embodiment of a method for
providing an application manager interface in the unified
system;
[0146] FIG. 78 is another representation of an embodiment of a user
interface presented in a unified desktop;
[0147] FIG. 79 is a flow chart of an embodiment of a method for
providing a big brother application in the unified system;
[0148] FIG. 80 is another representation of an embodiment of a user
interface presented in a unified desktop;
[0149] FIG. 81 is another representation of an embodiment of a user
interface presented in a unified desktop;
[0150] FIG. 82 is a flow chart of an embodiment of a method for
providing a big brother application in the unified system;
[0151] FIG. 83 is an embodiment of a data structure associated with
a big brother application in a unified desktop.
[0152] In the appended figures, similar components and/or features
may have the same reference label. Further, various components of
the same type may be distinguished by following the reference label
by a letter that distinguishes among the similar components. If
only the first reference label is used in the specification, the
description is applicable to any one of the similar components
having the same first reference label irrespective of the second
reference label.
DETAILED DESCRIPTION
[0153] Presented herein are embodiments of a device. The device can
be a communications device, such as a cellular telephone, or other
smart device. The device can include two screens that are oriented
to provide several unique display configurations. Further, the
device can receive user input in unique ways. The overall design
and functionality of the device provides for an enhanced user
experience making the device more useful and more efficient.
[0154] Mechanical Features:
[0155] FIGS. 1A-1J illustrate a device 100 in accordance with
embodiments of the present disclosure. As described in greater
detail below, device 100 can be positioned in a number of different
ways, each of which provides different functionality to a user. The
device 100 is a multi-screen device that includes a primary screen
104 and a secondary screen 108, both of which are touch sensitive.
In embodiments, the entire front surface of screens 104 and 108 may
be touch sensitive and capable of receiving input by a user
touching the front surface of the screens 104 and 108. Primary
screen 104 includes touch sensitive display 110, which, in addition
to being touch sensitive, also displays information to a user.
Secondary screen 108 includes touch sensitive display 114, which
also displays information to a user. In other embodiments, screens
104 and 108 may include more than one display area.
[0156] Primary screen 104 also includes a configurable area 112
that has been configured for specific inputs when the user touches
portions of the configurable area 112. Secondary screen 108 also
includes a configurable area 116 that has been configured for
specific inputs. Areas 112a and 116a have been configured to
receive a "back" input indicating that a user would like to view
information previously displayed. Areas 112b and 116b have been
configured to receive a "menu" input indicating that the user would
like to view options from a menu. Areas 112c and 116c have been
configured to receive a "home" input indicating that the user would
like to view information associated with a "home" view. In other
embodiments, areas 112a-c and 116a-c may be configured, in addition
to the configurations described above, for other types of specific
inputs including controlling features of device 100, some
non-limiting examples including adjusting overall system power,
adjusting the volume, adjusting the brightness, adjusting the
vibration, selecting displayed items (on either screen 104 or
108), operating a camera, operating a microphone, and
initiating/terminating telephone calls. Also, in some
embodiments, areas 112a-c and 116a-c may be configured for specific
inputs depending upon the application running on device 100 and/or
information displayed on touch sensitive displays 110 and/or
114.
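As a sketch, the configurable-area behavior amounts to a context-dependent dispatch table. The mapping below follows the back/menu/home assignments given above; the camera remapping is a hypothetical example of the application-specific configuration, not taken from the specification:

```python
# Sketch of routing touches in configurable areas 112a-c / 116a-c to the
# "back", "menu", and "home" inputs described above.

CONFIGURABLE_AREAS = {
    "112a": "back", "112b": "menu", "112c": "home",   # primary screen 104
    "116a": "back", "116b": "menu", "116c": "home",   # secondary screen 108
}

def handle_configurable_touch(area_id, context="default"):
    command = CONFIGURABLE_AREAS.get(area_id)
    if command is None:
        return None
    # In embodiments the same area can be remapped per application;
    # this hypothetical camera context illustrates the idea.
    if context == "camera" and command == "menu":
        command = "shutter"
    return command

print(handle_configurable_touch("112a"))            # back
print(handle_configurable_touch("116b", "camera"))  # shutter
```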
[0157] In addition to touch sensing, primary screen 104 and
secondary screen 108 may also include areas that receive input from
a user without requiring the user to touch the display area of the
screen. For example, primary screen 104 includes gesture capture
area 120, and secondary screen 108 includes gesture capture area
124. These areas are able to receive input by recognizing gestures
made by a user without the need for the user to actually touch the
surface of the display area. In comparison to touch sensitive
displays 110 and 114, the gesture capture areas 120 and 124 are
commonly not capable of rendering a displayed image.
[0158] The two screens 104 and 108 are connected together with a
hinge 128, shown clearly in FIG. 1C (illustrating a back view of
device 100). Hinge 128, in the embodiment shown in FIGS. 1A-1J, is
a center hinge that connects screens 104 and 108 so that when the
hinge is closed, screens 104 and 108 are juxtaposed (i.e.,
side-by-side) as shown in FIG. 1B (illustrating a front view of
device 100). Hinge 128 can be opened to position the two screens
104 and 108 in different relative positions to each other. As
described in greater detail below, the device 100 may have
different functionalities depending on the relative positions of
screens 104 and 108.
[0159] FIG. 1D illustrates the right side of device 100. As shown
in FIG. 1D, secondary screen 108 also includes a card slot 132 and
a port 136 on its side. Card slot 132, in embodiments, accommodates
different types of cards including a subscriber identity module
(SIM). Port 136, in embodiments, is an input/output port (I/O port)
that allows device 100 to be connected to other peripheral devices,
such as a display, keyboard, or printing device. As can be
appreciated, these are merely some examples and in other
embodiments device 100 may include other slots and ports such as
slots and ports for accommodating additional memory devices and/or
for connecting other peripheral devices. Also shown in FIG. 1D is
an audio jack 140 that accommodates a tip, ring, sleeve (TRS)
connector, for example, to allow a user to utilize headphones or a
headset.
[0160] Device 100 also includes a number of buttons 158. For
example, FIG. 1E illustrates the left side of device 100. As shown
in FIG. 1E, the side of primary screen 104 includes three buttons
144, 148, and 152, which can be configured for specific inputs. For
example, buttons 144, 148, and 152 may be configured to, in
combination or alone, control a number of aspects of device 100.
Some non-limiting examples include overall system power, volume,
brightness, vibration, selection of displayed items (on either of
screen 104 or 108), a camera, a microphone, and
initiation/termination of telephone calls. In some embodiments,
instead of separate buttons, two buttons may be combined into a
rocker button. This arrangement is useful in situations where the
buttons are configured to control features such as volume or
brightness. In addition to buttons 144, 148, and 152, device 100
also includes a button 156, shown in FIG. 1F, which illustrates the
top of device 100. In one embodiment, button 156 is configured as
an on/off button used to control overall system power to device
100. In other embodiments, button 156 is configured to, in addition
to or in lieu of controlling system power, control other aspects of
device 100. In some embodiments, one or more of the buttons 144,
148, 152, and 156 are capable of supporting different user
commands. By way of example, a normal press has a duration commonly
of less than about 1 second and resembles a quick tap. A medium
press has a duration commonly of 1 second or more but less than
about 12 seconds. A long press has a duration commonly of about 12
seconds or more. The function of the buttons is normally specific
to the application that is currently in focus on the respective
display 110 and 114. In a telephone application for instance and
depending on the particular button, a normal, medium, or long press
can mean end call, increase in call volume, decrease in call
volume, and toggle microphone mute. In a camera or video
application for instance and depending on the particular button, a
normal, medium, or long press can mean increase zoom, decrease
zoom, and take photograph or record video.
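The press classification reduces to threshold checks on duration. In the sketch below, the thresholds follow the text; the pairing of durations to telephone actions is a hypothetical assignment, since the text lists the possible actions without fixing which press triggers which:

```python
# Sketch of the press classification described above: under about 1 s is
# normal, about 1-12 s is medium, and about 12 s or more is long.

def classify_press(duration_s):
    if duration_s < 1.0:
        return "normal"
    if duration_s < 12.0:
        return "medium"
    return "long"

# Hypothetical per-application dispatch for one button in a phone app.
PHONE_ACTIONS = {
    "normal": "end call",
    "medium": "increase call volume",
    "long": "toggle microphone mute",
}

print(PHONE_ACTIONS[classify_press(0.3)])  # end call
print(PHONE_ACTIONS[classify_press(5.0)])  # increase call volume
```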
[0161] There are also a number of hardware components within device
100. As illustrated in FIG. 1C, device 100 includes a speaker 160
and a microphone 164. Device 100 also includes a camera 168 (FIG.
1B). Additionally, device 100 includes two position sensors 172A
and 172B, which are used to determine the relative positions of
screens 104 and 108. In one embodiment, position sensors 172A and
172B are Hall effect sensors. However, in other embodiments other
sensors can be used in addition to or in lieu of the Hall effect
sensors. An accelerometer 176 may also be included as part of
device 100 to determine the orientation of the device 100 and/or
the orientation of screens 104 and 108. Additional internal
hardware components that may be included in device 100 are
described below with respect to FIG. 2.
[0162] The overall design of device 100 allows it to provide
additional functionality not available in other communication
devices. Some of the functionality is based on the various
positions and orientations that device 100 can have. As shown in
FIGS. 1B-1G, device 100 can be operated in an "open" position where
screens 104 and 108 are juxtaposed. This position allows a large
display area for displaying information to a user. When position
sensors 172A and 172B determine that device 100 is in the open
position, they can generate a signal that can be used to trigger
different events such as displaying information on both screens 104
and 108. Additional events may be triggered if accelerometer 176
determines that device 100 is in a portrait position (FIG. 1B) as
opposed to a landscape position (not shown).
[0163] In addition to the open position, device 100 may also have a
"closed" position illustrated in FIG. 1H. Again, position sensors
172A and 172B can generate a signal indicating that device 100 is
in the "closed" position. This can trigger an event that results in
a change of displayed information on screen 104 and/or 108. For
example, device 100 may be programmed to stop displaying
information on one of the screens, e.g., screen 108, since a user
can only view one screen at a time when device 100 is in the
"closed" position. In other embodiments, the signal generated by
position sensors 172A and 172B, indicating that the device 100 is
in the "closed" position, can trigger device 100 to answer an
incoming telephone call. The "closed" position can also be a
preferred position for utilizing the device 100 as a mobile
phone.
[0164] Device 100 can also be used in an "easel" position, which is
illustrated in FIG. 1I. In the "easel" position, screens 104 and
108 are angled with respect to each other and facing outward with
the edges of screens 104 and 108 substantially horizontal. In this
position, device 100 can be configured to display information on
both screens 104 and 108 to allow two users to simultaneously
interact with device 100. When device 100 is in the "easel"
position, sensors 172A and 172B generate a signal indicating that
the screens 104 and 108 are positioned at an angle to each other,
and the accelerometer 176 can generate a signal indicating that
device 100 has been placed so that the edge of screens 104 and 108
are substantially horizontal. The signals can then be used in
combination to generate events that trigger changes in the display
of information on screens 104 and 108.
[0165] FIG. 1J illustrates device 100 in a "modified easel"
position. In the "modified easel" position, one of screens 104 or
108 is used as a stand and is faced down on the surface of an
object such as a table. This position provides a convenient way for
information to be displayed to a user in landscape orientation.
Similar to the easel position, when device 100 is in the "modified
easel" position, position sensors 172A and 172B generate a signal
indicating that the screens 104 and 108 are positioned at an angle
to each other. The accelerometer 176 would generate a signal
indicating that device 100 has been positioned so that one of
screens 104 and 108 is faced downwardly and is substantially
horizontal. The signals can then be used to generate events that
trigger changes in the display of information of screens 104 and
108. For example, information may not be displayed on the screen
that is face down since a user cannot see the screen.
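Taken together, the open, closed, easel, and modified easel positions can be derived from the position sensors 172A and 172B and the accelerometer 176. The sketch below is an assumption-laden illustration: the sensors described are Hall effect sensors, so the continuous hinge angle and thresholds shown here are simplifications:

```python
# Sketch of deriving the device posture from the position sensors and
# the accelerometer. Readings and thresholds are illustrative only.

def device_posture(hinge_angle_deg, screen_face_down):
    """hinge_angle_deg: angle between screens, inferred from the
    position sensors; screen_face_down: accelerometer-derived flag."""
    if hinge_angle_deg >= 170:       # screens juxtaposed
        return "open"
    if hinge_angle_deg <= 10:        # screens folded together
        return "closed"
    # Screens at an angle to each other: easel-style postures.
    if screen_face_down:
        return "modified easel"      # one screen used as a stand
    return "easel"                   # both screens facing outward

print(device_posture(180, False))  # open
print(device_posture(60, True))    # modified easel
```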
[0166] Transitional states are also possible. When the position
sensors 172A and 172B and/or the accelerometer indicate that the
screens are being closed or folded (from open), a closing transitional
state is recognized. Conversely, when the position sensors 172A and
172B indicate that the screens are being opened or unfolded (from
closed), an opening transitional state is recognized. The closing
and opening transitional states are typically time-based, or have a
maximum time duration from a sensed starting point. Normally, no
user input is possible when one of the closing and opening states
is in effect. In this manner, incidental user contact with a screen
during the closing or opening function is not misinterpreted as
user input. In embodiments, another transitional state is possible
when the device 100 is closed. This additional transitional state
allows the display to switch from one screen 104 to the second
screen 108 when the device 100 is closed based on some user input,
e.g., a double tap on the screen 110,114.
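A minimal sketch of the time-based transitional states, assuming a hypothetical one-second maximum duration and illustrative class and method names:

    import time

    MAX_TRANSITION_S = 1.0  # assumed maximum duration from the sensed start

    class TransitionGate:
        """Suppresses touch input while a closing or opening transitional
        state is in effect, so incidental contact is not misinterpreted."""
        def __init__(self):
            self._started_at = None

        def begin(self):
            # Called when the sensors indicate the screens are being
            # folded (from open) or opened (from closed).
            self._started_at = time.monotonic()

        def input_allowed(self) -> bool:
            if self._started_at is None:
                return True
            if time.monotonic() - self._started_at > MAX_TRANSITION_S:
                self._started_at = None  # transition window elapsed
                return True
            return False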
[0167] As can be appreciated, the description of device 100 is made
for illustrative purposes only, and the embodiments are not limited
to the specific mechanical features shown in FIGS. 1A-1J and
described above. In other embodiments, device 100 may include
additional features, including one or more additional buttons,
slots, display areas, hinges, and/or locking mechanisms.
Additionally, in embodiments, the features described above may be
located in different parts of device 100 and still provide similar
functionality. Therefore, FIGS. 1A-1J and the description provided
above are nonlimiting.
[0168] Hardware Features:
[0169] FIG. 2 illustrates components of a device 100 in accordance
with embodiments of the present disclosure. In general, the device
100 includes a primary screen 104 and a secondary screen 108. While
the primary screen 104 and its components are normally enabled in
both the opened and closed positions or states, the secondary
screen 108 and its components are normally enabled in the opened
state but disabled in the closed state. However, even when in the
closed state a user or application triggered interrupt (such as in
response to a phone application or camera application operation)
can flip the active screen, or disable the primary screen 104 and
enable the secondary screen 108, by a suitable command. Each screen
104, 108 can be touch sensitive and can include different operative
areas. For example, a first operative area, within each touch
sensitive screen 104 and 108, may comprise a touch sensitive
display 110, 114. In general, the touch sensitive display 110, 114
may comprise a full color, touch sensitive display. A second area
within each touch sensitive screen 104 and 108 may comprise a
gesture capture region 120, 124. The gesture capture region 120,
124 may comprise an area or region that is outside of the touch
sensitive display 110, 114 area, and that is capable of receiving
input, for example in the form of gestures provided by a user.
However, the gesture capture region 120, 124 does not include
pixels that can perform a display function or capability.
[0170] A third region of the touch sensitive screens 104 and 108
may comprise a configurable area 112, 116. The configurable area
112, 116 is capable of receiving input and has display or limited
display capabilities. In embodiments, the configurable area 112,
116 may present different input options to the user. For example,
the configurable area 112, 116 may display buttons or other
relatable items. Moreover, the identity of displayed buttons, or
whether any buttons are displayed at all within the configurable
area 112, 116 of a touch sensitive screen 104 or 108, may be
determined from the context in which the device 100 is used and/or
operated. In an exemplary embodiment, the touch sensitive screens
104 and 108 comprise liquid crystal display devices extending
across at least those regions of the touch sensitive screens 104
and 108 that are capable of providing visual output to a user, and
a capacitive input matrix over those regions of the touch sensitive
screens 104 and 108 that are capable of receiving input from the
user.
[0171] One or more display controllers 216a, 216b may be provided
for controlling the operation of the touch sensitive screens 104
and 108, including input (touch sensing) and output (display)
functions. In the exemplary embodiment illustrated in FIG. 2, a
separate touch screen controller 216a or 216b is provided for each
touch screen 104 and 108. In accordance with alternate embodiments,
a common or shared touch screen controller 216 may be used to
control each of the included touch sensitive screens 104 and 108.
In accordance with still other embodiments, the functions of a
touch screen controller 216 may be incorporated into other
components, such as a processor 204.
[0172] The processor 204 may comprise a general purpose
programmable processor or controller for executing application
programming or instructions. In accordance with at least some
embodiments, the processor 204 may include multiple processor
cores, and/or implement multiple virtual processors. In accordance
with still other embodiments, the processor 204 may include
multiple physical processors. As a particular example, the
processor 204 may comprise a specially configured application
specific integrated circuit (ASIC) or other integrated circuit, a
digital signal processor, a controller, a hardwired electronic or
logic circuit, a programmable logic device or gate array, a special
purpose computer, or the like. The processor 204 generally
functions to run programming code or instructions implementing
various functions of the device 100.
[0173] A communication device 100 may also include memory 208 for
use in connection with the execution of application programming or
instructions by the processor 204, and for the temporary or long
term storage of program instructions and/or data. As examples, the
memory 208 may comprise RAM, DRAM, SDRAM, or other solid state
memory. Alternatively or in addition, data storage 212 may be
provided. Like the memory 208, the data storage 212 may comprise a
solid state memory device or devices. Alternatively or in addition,
the data storage 212 may comprise a hard disk drive or other random
access memory.
[0174] In support of communications functions or capabilities, the
device 100 can include a cellular telephony module 228. As
examples, the cellular telephony module 228 can comprise a GSM,
CDMA, FDMA and/or analog cellular telephony transceiver capable of
supporting voice, multimedia and/or data transfers over a cellular
network. Alternatively or in addition, the device 100 can include
an additional or other wireless communications module 232. As
examples, the other wireless communications module 232 can comprise
a Wi-Fi, BLUETOOTH.TM., WiMax, infrared, or other wireless
communications link. The cellular telephony module 228 and the
other wireless communications module 232 can each be associated
with a shared or a dedicated antenna 224.
[0175] A port interface 252 may be included. The port interface 252
may include proprietary or universal ports to support the
interconnection of the device 100 to other devices or components,
such as a dock, which may or may not include additional or
different capabilities from those integral to the device 100. In
addition to supporting an exchange of communication signals between
the device 100 and another device or component, the docking port
136 and/or port interface 252 can support the supply of power to or
from the device 100. The port interface 252 also comprises an
intelligent element that comprises a docking module for controlling
communications or other interactions between the device 100 and a
connected device or component.
[0176] An input/output module 248 and associated ports may be
included to support communications over wired networks or links,
for example with other communication devices, server devices,
and/or peripheral devices. Examples of an input/output module 248
include an Ethernet port, a Universal Serial Bus (USB) port,
Institute of Electrical and Electronics Engineers (IEEE) 1394, or
other interface.
[0177] An audio input/output interface/device(s) 244 can be
included to provide analog audio to an interconnected speaker or
other device, and to receive analog audio input from a connected
microphone or other device. As an example, the audio input/output
interface/device(s) 244 may comprise an associated amplifier and
analog to digital converter. Alternatively or in addition, the
device 100 can include an integrated audio input/output device 256
and/or an audio jack for interconnecting an external speaker or
microphone. For example, an integrated speaker and an integrated
microphone can be provided, to support near talk or speaker phone
operations.
[0178] Hardware buttons 158 can be included for example for use in
connection with certain control operations. Examples include a
master power switch, volume control, etc., as described in
conjunction with FIGS. 1A through 1J. One or more image capture
interfaces/devices 240, such as a camera, can be included for
capturing still and/or video images. Alternatively or in addition,
an image capture interface/device 240 can include a scanner or code
reader. An image capture interface/device 240 can include or be
associated with additional elements, such as a flash or other light
source.
[0179] The device 100 can also include a global positioning system
(GPS) receiver 236. In accordance with embodiments of the present
invention, the GPS receiver 236 may further comprise a GPS module
that is capable of providing absolute location information to other
components of the device 100. An accelerometer(s) 176 may also be
included. For example, in connection with the display of
information to a user and/or other functions, a signal from the
accelerometer 176 can be used to determine an orientation and/or
format in which to display that information to the user.
[0180] Embodiments of the present invention can also include one or
more position sensor(s) 172. The position sensor 172 can provide a
signal indicating the position of the touch sensitive screens 104
and 108 relative to one another. This information can be provided
as an input, for example to a user interface application, to
determine an operating mode, characteristics of the touch sensitive
displays 110, 114, and/or other device 100 operations. As examples,
a screen position sensor 172 can comprise a series of Hall effect
sensors, a multiple position switch, an optical switch, a
Wheatstone bridge, a potentiometer, or other arrangement capable of
providing a signal indicating which of multiple relative positions
the touch screens are in.
[0181] Communications between various components of the device 100
can be carried by one or more buses 222. In addition, power can be
supplied to the components of the device 100 from a power source
and/or power control module 260. The power control module 260 can,
for example, include a battery, an AC to DC converter, power
control logic, and/or ports for interconnecting the device 100 to
an external source of power.
[0182] Device State:
[0183] FIGS. 3A and 3B represent illustrative states of device 100.
While a number of illustrative states are shown, and transitions
from a first state to a second state, it is to be appreciated that
the illustrative state diagram may not encompass all possible
states and/or all possible transitions from a first state to a
second state. As illustrated in FIG. 3, the various arrows between
the states (illustrated by the state represented in the circle)
represent a physical change that occurs to the device 100, that is
detected by one or more of hardware and software, the detection
triggering one or more of a hardware and/or software interrupt that
is used to control and/or manage one or more functions of device
100.
[0184] As illustrated in FIG. 3A, there are twelve exemplary
"physical" states: closed 304, transition 308 (or opening
transitional state), easel 312, modified easel 316, open 320,
inbound/outbound call or communication 324, image/video capture
328, transition 332 (or closing transitional state), landscape 340,
docked 336, docked 344 and landscape 348. Next to each illustrative
state is a representation of the physical state of the device 100
with the exception of states 324 and 328, where the state is
generally symbolized by the international icon for a telephone and
the icon for a camera, respectively.
[0185] In state 304, the device is in a closed state with the
device 100 generally oriented in the portrait direction with the
primary screen 104 and the secondary screen 108 back-to-back in
different planes (see FIG. 1H). From the closed state, the device
100 can enter, for example, docked state 336, where the device 100
is coupled with a docking station, docking cable, or in general
docked or associated with one or more other devices or peripherals,
or the landscape state 340, where the device 100 is generally
oriented with the primary screen 104 facing the user, and the
primary screen 104 and the secondary screen 108 being
back-to-back.
[0186] In the closed state, the device can also move to a
transitional state where the device remains closed but the display
is moved from one screen 104 to another screen 108 based on a user
input, e.g., a double tap on the screen 110, 114. Still another
embodiment includes a bilateral state. In the bilateral state, the
device remains closed, but a single application displays at least
one window on both the first display 110 and the second display
114. The windows shown on the first and second display 110, 114 may
be the same or different based on the application and the state of
that application. For example, while acquiring an image with a
camera, the device may display the view finder on the first display
110 and display a preview for the photo subjects (full screen and
mirrored left-to-right) on the second display 114.
[0187] In state 308, a transition state from the closed state 304
to the semi-open state or easel state 312, the device 100 is shown
opening with the primary screen 104 and the secondary screen 108
being rotated about an axis coincident with the hinge.
Upon entering the easel state 312, the primary screen 104 and the
secondary screen 108 are separated from one another such that, for
example, the device 100 can sit in an easel-like configuration on a
surface.
[0188] In state 316, known as the modified easel position, the
device 100 has the primary screen 104 and the secondary screen 108
in a similar relative relationship to one another as in the easel
state 312, with the difference being one of the primary screen 104
or the secondary screen 108 are placed on a surface as shown.
[0189] State 320 is the open state where the primary screen 104 and
the secondary screen 108 are generally on the same plane. From the
open state, the device 100 can transition to the docked state 344
or the open landscape state 348. In the open state 320, the primary
screen 104 and the secondary screen 108 are generally in the
portrait-like orientation while in landscape state 348 the primary
screen 104 and the secondary screen 108 are generally in a
landscape-like orientation.
[0190] State 324 is illustrative of a communication state, such as
when an inbound or outbound call is being received or placed,
respectively, by the device 100. While not illustrated for clarity,
it should be appreciated the device 100 can transition to the
inbound/outbound call state 324 from any state illustrated in FIG.
3. In a similar manner, the image/video capture state 328 can be
entered into from any other state in FIG. 3, with the image/video
capture state 328 allowing the device 100 to take one or more
images via a camera and/or videos with a video capture device
240.
[0191] Transition state 332 illustratively shows primary screen 104
and the secondary screen 108 being closed upon one another for
entry into, for example, the closed state 304.
[0192] FIG. 3B illustrates, with reference to the key, the inputs
that are received to detect a transition from a first state to a
second state. In FIG. 3B, various combinations of states are shown
with, in general, a portion of the columns being directed toward a
portrait state 352 and a landscape state 356, and a portion of the
rows being directed to a portrait state 360 and a landscape state
364.
[0193] In FIG. 3B, the Key indicates that "H" represents an input
from one or more Hall Effect sensors, "A" represents an input from
one or more accelerometers, "T" represents an input from a timer,
"P" represents a communications trigger input and "I" represents an
image and/or video capture request input. Thus, in the center
portion 376 of the chart, an input, or combination of inputs, is
shown that represents how the device 100 detects a transition from a
first physical state to a second physical state.
[0194] As discussed, in the center portion of the chart 376, the
inputs that are received enable the detection of a transition from,
for example, a portrait open state to a landscape easel
state--shown in bold--"HAT." For this exemplary transition from the
portrait open to the landscape easel state, a Hall Effect sensor
("H"), an accelerometer ("A") and a timer ("T") input may be
needed. The timer input can be derived from, for example, a clock
associated with the processor.
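The chart of FIG. 3B can be read as a lookup keyed by the input combination, as in the following sketch (Python; the table entries other than the "HAT" row described above are assumed examples, and the state names are illustrative):

    # H = Hall Effect sensor, A = accelerometer, T = timer,
    # P = communications trigger, I = image/video capture request
    TRANSITIONS = {
        ("portrait_open", frozenset("HAT")): "landscape_easel",
        ("closed", frozenset("P")): "inbound_outbound_call",   # assumed
        ("closed", frozenset("I")): "image_video_capture",     # assumed
    }

    def next_state(current: str, inputs: set) -> str:
        """Return the new state when the received inputs match a known
        combination; otherwise remain in the current state."""
        return TRANSITIONS.get((current, frozenset(inputs)), current)

    print(next_state("portrait_open", {"H", "A", "T"}))  # landscape_easel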
[0195] In addition to the portrait and landscape states, a docked
state 368 is also shown that is triggered based on the receipt of a
docking signal 372. As discussed above and in relation to FIG. 3,
the docking signal can be triggered by the association of the
device 100 with one or more other devices 100, accessories,
peripherals, smart docks, or the like.
[0196] User Interaction:
[0197] FIGS. 4A through 4H depict various graphical representations
of gesture inputs that may be recognized by the screens 104, 108.
The gestures may be performed not only by a user's body part, such
as a digit, but also by other devices, such as a stylus, that may
be sensed by the contact sensing portion(s) of a screen 104, 108.
In general, gestures are interpreted differently, based on where
the gestures are performed (either directly on the display 110, 114
or in the gesture capture region 120, 124). For example, gestures
in the display 110,114 may be directed to a desktop or application,
and gestures in the gesture capture region 120, 124 may be
interpreted as directed to the system.
[0198] With reference to FIGS. 4A-4H, a first type of gesture, a
touch gesture 420, is substantially stationary on the screen
104,108 for a selected length of time. A circle 428 represents a
touch or other contact type received at a particular location of a
contact sensing portion of the screen. The circle 428 may include a
border 432, the thickness of which indicates a length of time that
the contact is held substantially stationary at the contact
location. For instance, a tap 420 (or short press) has a thinner
border 432a than the border 432b for a long press 424 (or for a
normal press). The long press 424 may involve a contact that
remains substantially stationary on the screen for a longer time
period than that of a tap 420. As will be appreciated, differently
defined gestures may be registered depending upon the length of
time that the touch remains stationary prior to contact cessation
or movement on the screen.
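For instance, a recognizer might distinguish the two gestures by hold time, as in this sketch (the 0.5-second threshold and the names are assumptions):

    LONG_PRESS_S = 0.5  # assumed threshold separating tap from long press

    def classify_press(hold_duration_s: float, moved: bool) -> str:
        """Classify a stationary contact by how long it is held prior to
        contact cessation; movement makes it a drag or flick instead."""
        if moved:
            return "not_a_press"
        return "long_press" if hold_duration_s >= LONG_PRESS_S else "tap"

    print(classify_press(0.1, moved=False))  # tap
    print(classify_press(0.8, moved=False))  # long_press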
[0199] With reference to FIG. 4C, a drag gesture 400 on the screen
104,108 is an initial contact (represented by circle 428) with
contact movement 436 in a selected direction. The initial contact
428 may remain stationary on the screen 104,108 for a certain
amount of time represented by the border 432. The drag gesture
typically requires the user to contact an icon, window, or other
displayed image at a first location followed by movement of the
contact in a drag direction to a new second location desired for
the selected displayed image. The contact movement need not be in a
straight line but may follow any path of movement so long as the
contact is substantially continuous from the first to the second
location.
[0200] With reference to FIG. 4D, a flick gesture 404 on the screen
104,108 is an initial contact (represented by circle 428) with
truncated contact movement 436 (relative to a drag gesture) in a
selected direction. In embodiments, a flick has a higher exit
velocity for the last movement in the gesture compared to the drag
gesture. The flick gesture can, for instance, be a finger snap
following initial contact. Compared to a drag gesture, a flick
gesture generally does not require continual contact with the
screen 104,108 from the first location of a displayed image to a
predetermined second location. The contacted displayed image is
moved by the flick gesture in the direction of the flick gesture to
the predetermined second location. Although both gestures commonly
can move a displayed image from a first location to a second
location, the temporal duration and distance of travel of the
contact on the screen is generally less for a flick than for a drag
gesture.
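A sketch of how drag and flick might be separated by exit velocity and duration (all thresholds are illustrative assumptions, not disclosed values):

    def classify_motion(distance_px: float, duration_s: float,
                        exit_velocity_px_s: float) -> str:
        """A flick covers less distance and time than a drag but ends
        with a higher exit velocity for the last movement."""
        if exit_velocity_px_s > 1000.0 and duration_s < 0.3:
            return "flick"
        if distance_px > 20.0:
            return "drag"
        return "tap_or_press"

    print(classify_motion(60.0, 0.15, 1500.0))  # flick
    print(classify_motion(200.0, 0.8, 300.0))   # drag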
[0201] With reference to FIG. 4E, a pinch gesture 408 on the screen
104,108 is depicted. The pinch gesture 408 may be initiated by a
first contact 428a to the screen 104,108 by, for example, a first
digit and a second contact 428b to the screen 104,108 by, for
example, a second digit. The first and second contacts 428a,b may
be detected by a common contact sensing portion of a common screen
104,108, by different contact sensing portions of a common screen
104 or 108, or by different contact sensing portions of different
screens. The first contact 428a is held for a first amount of time,
as represented by the border 432a, and the second contact 428b is
held for a second amount of time, as represented by the border
432b. The first and second amounts of time are generally
substantially the same, and the first and second contacts 428 a, b
generally occur substantially simultaneously. The first and second
contacts 428 a, b generally also include corresponding first and
second contact movements 436 a, b, respectively. The first and
second contact movements 436 a, b are generally in opposing
directions. Stated another way, the first contact movement 436a is
towards the second contact 428b, and the second contact movement
436b is towards the first contact 428a. More simply stated, the
pinch gesture 408 may be accomplished by a user's digits touching
the screen 104,108 in a pinching motion.
[0202] With reference to FIG. 4F, a spread gesture 410 on the
screen 104,108 is depicted. The spread gesture 410 may be initiated
by a first contact 428a to the screen 104,108 by, for example, a
first digit and a second contact 428b to the screen 104,108 by, for
example, a second digit. The first and second contacts 428a,b may
be detected by a common contact sensing portion of a common screen
104,108, by different contact sensing portions of a common screen
104,108, or by different contact sensing portions of different
screens. The first contact 428a is held for a first amount of time,
as represented by the border 432a, and the second contact 428b is
held for a second amount of time, as represented by the border
432b. The first and second amounts of time are generally
substantially the same, and the first and second contacts 428 a, b
generally occur substantially simultaneously. The first and second
contacts 428 a, b generally also include corresponding first and
second contact movements 436a, b, respectively. The first and
second contact movements 436 a, b are generally in a common
direction. Stated another way, the first and second contact
movements 436 a, b are away from the first and second contacts
428a, b. More simply stated, the spread gesture 410 may be
accomplished by a user's digits touching the screen 104,108 in a
spreading motion.
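The opposing versus common movement of the two contacts can be tested by comparing inter-contact distances, as in this illustrative sketch (the point tuples and function names are assumptions):

    import math

    def _dist(a, b):
        return math.hypot(a[0] - b[0], a[1] - b[1])

    def classify_two_finger(start_a, start_b, end_a, end_b) -> str:
        """Pinch when the contacts move toward each other; spread when
        the contact movements carry them apart."""
        before = _dist(start_a, start_b)
        after = _dist(end_a, end_b)
        if after < before:
            return "pinch"
        if after > before:
            return "spread"
        return "none"

    print(classify_two_finger((100, 100), (300, 100),
                              (150, 100), (250, 100)))  # pinch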
[0203] The above gestures may be combined in any manner, such as
those shown by FIGS. 4G and 4H, to produce a determined functional
result. For example, in FIG. 4G a tap gesture 420 is combined with
a drag or flick gesture 412 in a direction away from the tap
gesture 420. In FIG. 4H, a tap gesture 420 is combined with a drag
or flick gesture 412 in a direction towards the tap gesture
420.
[0204] The functional result of receiving a gesture can vary
depending on a number of factors, including a state of the device
100, display 110, 114, or screen 104, 108, a context associated
with the gesture, or sensed location of the gesture. The state of
the device commonly refers to one or more of a configuration of the
device 100, a display orientation, and user and other inputs
received by the device 100. Context commonly refers to one or more
of the particular application(s) selected by the gesture and the
portion(s) of the application currently executing, whether the
application is a single- or multi-screen application, and whether
the application is a multi-screen application displaying one or
more windows in one or more screens or in one or more stacks.
Sensed location of the gesture commonly refers to whether the
sensed set(s) of gesture location coordinates are on a touch
sensitive display 110, 114 or a gesture capture region 120, 124,
whether the sensed set(s) of gesture location coordinates are
associated with a common or different display or screen 104,108,
and/or what portion of the gesture capture region contains the
sensed set(s) of gesture location coordinates.
[0205] A tap, when received by a touch sensitive display 110,
114, can be used, for instance, to select an icon to initiate or
terminate execution of a corresponding application, to maximize or
minimize a window, to reorder windows in a stack, and to provide
user input such as by keyboard display or other displayed image. A
drag, when received by a touch sensitive display 110, 114, can be
used, for instance, to relocate an icon or window to a desired
location within a display, to reorder a stack on a display, or to
span both displays (such that the selected window occupies a
portion of each display simultaneously). A flick, when received by
a touch sensitive display 110, 114 or a gesture capture region 120,
124, can be used to relocate a window from a first display to a
second display or to span both displays (such that the selected
window occupies a portion of each display simultaneously). Unlike
the drag gesture, however, the flick gesture is generally not used
to move the displayed image to a specific user-selected location
but to a default location that is not configurable by the user.
[0206] The pinch gesture, when received by a touch sensitive
display 110, 114 or a gesture capture region 120, 124, can be used
to minimize or otherwise decrease the displayed area or size of a
window (typically when received entirely by a common display), to
switch windows displayed at the top of the stack on each display to
the top of the stack of the other display (typically when received
by different displays or screens), or to display an application
manager (a "pop-up window" that displays the windows in the stack).
The spread gesture, when received by a touch sensitive display 110,
114 or a gesture capture region 120, 124, can be used to maximize
or otherwise increase the displayed area or size of a window, to
switch windows displayed at the top of the stack on each display to
the top of the stack of the other display (typically when received
by different displays or screens), or to display an application
manager (typically when received by an off-screen gesture capture
region on the same or different screens).
[0207] The combined gestures of FIG. 4G, when received by a common
display capture region in a common display or screen 104,108, can
be used to hold a first window stack location in a first stack
constant for a display receiving the gesture while reordering a
second window stack location in a second window stack to include a
window in the display receiving the gesture. The combined gestures
of FIG. 4H, when received by different display capture regions in a
common display or screen 104,108 or in different displays or
screens, can be used to hold a first window stack location in a
first window stack constant for a display receiving the tap part of
the gesture while reordering a second window stack location in a
second window stack to include a window in the display receiving
the flick or drag gesture. Although specific gestures and gesture
capture regions in the preceding examples have been associated with
corresponding sets of functional results, it is to be appreciated
that these associations can be redefined in any manner to produce
differing associations between gestures and/or gesture capture
regions and/or functional results.
[0208] Firmware and Software:
[0209] The memory 508 may store and the processor 504 may execute
one or more software components. These components can include at
least one operating system (OS) 516, an application manager 562, a
desktop 566, and/or one or more applications 564a and/or 564b from
an application store 560. The OS 516 can include a framework 520,
one or more frame buffers 548, one or more drivers 512, previously
described in conjunction with FIG. 2, and/or a kernel 518. The OS
516 can be any software, consisting of programs and data, which
manages computer hardware resources and provides common services
for the execution of various applications 564. The OS 516 can be
any operating system and, at least in some embodiments, dedicated
to mobile devices, including, but not limited to, Linux,
ANDROID.TM., iPhone OS (IOS.TM.), WINDOWS PHONE 7.TM., etc. The OS
516 is operable to provide functionality to the phone by executing
one or more operations, as described herein.
[0210] The applications 564 can be any higher level software that
executes particular functionality for the user. Applications 564
can include programs such as email clients, web browsers, texting
applications, games, media players, office suites, etc. The
applications 564 can be stored in an application store 560, which
may represent any memory or data storage, and the management
software associated therewith, for storing the applications 564.
Once executed, the applications 564 may be run in a different area
of memory 508.
[0211] The framework 520 may be any software or data that allows
the multiple tasks running on the device to interact. In
embodiments, at least portions of the framework 520 and the
discrete components described hereinafter may be considered part of
the OS 516 or an application 564. For ease of description, however,
these portions are described as part of the framework 520, although
the components are not so limited. The framework 520 can include,
but is not limited
to, a Multi-Display Management (MDM) module 524, a Surface Cache
module 528, a Window Management module 532, an Input Management
module 536, a Task Management module 540, an Application Model
Manager 542, a Display Controller, one or more frame buffers 548, a
task stack 552, one or more window stacks 550 (each a logical
arrangement of windows and/or desktops in a display area), and/or
an event buffer 556.
[0212] The MDM module 524 includes one or more modules that are
operable to manage the display of applications or other data on the
screens of the device. An embodiment of the MDM module 524 is
described in conjunction with FIG. 5B. In embodiments, the MDM
module 524 receives inputs from the other OS 516 components, such
as the drivers 512, and from the applications 564 to determine
continually the state of the device 100. The inputs assist the MDM
module 524 in determining how to configure and allocate the
displays according to the application's preferences and
requirements, and the user's actions. Once a determination for
display configurations is made, the MDM module 524 can bind the
applications 564 to a display. The configuration may then be
provided to one or more other components to generate a window with
a display.
[0213] The Surface Cache module 528 includes any memory or storage
and the software associated therewith to store or cache one or more
images of windows. A series of active and/or non-active windows (or
other display objects, such as, a desktop display) can be
associated with each display. An active window (or other display
object) is currently displayed. A non-active window (or other
display object) is one that was opened and, at some time, displayed
but is now not displayed. To enhance the user experience, before a window
transitions from an active state to an inactive state, a "screen
shot" of a last generated image of the window (or other display
object) can be stored. The Surface Cache module 528 may be operable
to store a bitmap of the last active image of a window (or other
display object) not currently displayed. Thus, the Surface Cache
module 528 stores the images of non-active windows (or other
display objects) in a data store.
[0214] In embodiments, the Window Management module 532 is operable
to manage the windows (or other display objects) that are active or
not active on each of the displays. The Window Management module
532, based on information from the MDM module 524, the OS 516, or
other components, determines when a window (or other display
object) is visible or not active. The Window Management module 532
may then put a non-visible window (or other display object) in a
"not active state" and, in conjunction with the Task Management
module Task Management 540 suspends the application's operation.
Further, the Window Management module 532 may assign, through
collaborative interaction with the MDM module 524, a display
identifier to the window (or other display object) or manage one or
more other items of data associated with the window (or other
display object). The Window Management module 532 may also provide
the stored information to the application 564, the Task Management
module 540, or other components interacting with or associated with
the window (or other display object). The Window Management module
532 can also associate an input task with a window based on window
focus and display coordinates within the motion space.
[0215] The Input Management module 536 is operable to manage events
that occur with the device. An event is any input into the window
environment, for example, a user interface interaction with a
user. The Input Management module 536 receives the events and
logically stores the events in an event buffer 556. Events can
include such user interface interactions as a "down event," which
occurs when a screen 104, 108 receives a touch signal from a user,
a "move event," which occurs when the screen 104, 108 determines
that a user's finger is moving across a screen(s), an "up event,"
which occurs when the screen 104, 108 determines that the user has
stopped touching the screen 104, 108, etc. These events are
received, stored, and forwarded to other modules by the Input
Management module 536. The Input Management module 536 may also map
screen inputs to a motion space, which is the culmination of all
physical and virtual displays available on the device.
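A minimal sketch of such an event buffer (the tuple layout and the names are assumptions, not the disclosed implementation):

    from collections import deque

    class InputManagerSketch:
        """Receives down/move/up events, logically stores them, and
        forwards them to subscribed modules."""
        def __init__(self):
            self.event_buffer = deque()
            self.subscribers = []

        def receive(self, kind: str, x: int, y: int):
            event = (kind, x, y)  # kind: "down", "move", or "up"
            self.event_buffer.append(event)
            for handler in self.subscribers:
                handler(event)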
[0216] The motion space is a virtualized space that includes all
touch sensitive displays 110,114 "tiled" together to mimic the
physical dimensions of the device 100. For example, when the device
100 is unfolded, the motion space size may be 960.times.800, which
may be the number of pixels in the combined display area for both
touch sensitive displays 110, 114. If a user touches on a first
touch sensitive display 110 on location (40, 40), a full screen
window can receive a touch event with location (40, 40). If a user
touches on a second touch sensitive display 114, with location (40,
40), the full screen window can receive a touch event with location
(520, 40), because the second touch sensitive display 114 is on the
right side of the first touch sensitive display 110, so the device
100 can offset the touch by the first touch sensitive display's 110
width, which is 480 pixels. When a hardware event occurs with
location info from a driver 512, the framework 520 can up-scale the
physical location to the motion space because the location of the
event may be different based on the device orientation and state.
The motion space may be as described in U.S. patent application
Ser. No. 13/187,026, filed Jul. 20, 2011, entitled "Systems and
Methods for Receiving Gesture Inputs Spanning Multiple Input
Devices," which is hereby incorporated by reference in its entirety
for all that it teaches and for all purposes.
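The offset arithmetic in the example above can be expressed directly; the sketch below assumes the 480-pixel display width and side-by-side tiling described in the text (the function name is illustrative):

    FIRST_DISPLAY_WIDTH = 480  # width of touch sensitive display 110, in pixels

    def to_motion_space(display_id: int, x: int, y: int) -> tuple:
        """Touches on the second display are offset rightward by the
        first display's width to land in the 960x800 motion space."""
        if display_id == 114:  # second touch sensitive display
            return (x + FIRST_DISPLAY_WIDTH, y)
        return (x, y)

    print(to_motion_space(110, 40, 40))  # (40, 40)
    print(to_motion_space(114, 40, 40))  # (520, 40)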
[0217] A task can be an application and a sub-task can be an
application component that provides a window with which users can
interact to do something, such as dial the phone, take a photo,
send an email, or view a map. Each task may be given a window in
which to draw a user interface. The window typically fills a
display (for example, touch sensitive display 110,114), but may be
smaller than the display 110,114 and float on top of other windows.
An application usually consists of multiple sub-tasks that are
loosely bound to each other. Typically, one task in an application
is specified as the "main" task, which is presented to the user
when launching the application for the first time. Each task can
then start another task or sub-task to perform different
actions.
[0218] The Task Management module 540 is operable to manage the
operation of one or more applications 564 that may be executed by
the device. Thus, the Task Management module 540 can receive
signals to launch, suspend, terminate, etc. an application or
application sub-tasks stored in the application store 560. The Task
Management module 540 may then instantiate one or more tasks or
sub-tasks of the application 564 to begin operation of the
application 564. Further, the Task Management Module 540 may
launch, suspend, or terminate a task or sub-task as a result of
user input or as a result of a signal from a collaborating
framework 520 component. The Task Management Module 540 is
responsible for managing the lifecycle of applications (tasks and
sub-tasks) from when the application is launched to when the
application is terminated.
[0219] The processing of the Task Management Module 540 is
facilitated by a task stack 552, which is a logical structure
associated with the Task Management Module 540. The task stack 552
maintains the state of all tasks and sub-tasks on the device 100.
When some component of the operating system 516 requires a task or
sub-task to transition in its lifecycle, the OS 516 component can
notify the Task Management Module 540. The Task Management Module
540 may then locate the task or sub-task, using identification
information, in the task stack 552, and send a signal to the task
or sub-task indicating what kind of lifecycle transition the task
needs to execute. Informing the task or sub-task of the transition
allows the task or sub-task to prepare for the lifecycle state
transition. The Task Management Module 540 can then execute the
state transition for the task or sub-task. In embodiments, the
state transition may entail triggering the OS kernel 518 to
terminate the task when termination is required.
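A sketch of the task stack interaction, under assumed class and method names (the real module's interfaces are not disclosed here):

    class Task:
        def __init__(self, task_id: str):
            self.task_id = task_id
            self.state = "running"

        def on_lifecycle_signal(self, transition: str):
            # The task can prepare (e.g., save state) before transitioning.
            self.state = transition

    class TaskStack:
        """Maintains the state of all tasks and sub-tasks on the device."""
        def __init__(self):
            self._tasks = {}

        def add(self, task: Task):
            self._tasks[task.task_id] = task

        def transition(self, task_id: str, transition: str):
            # Locate the task using identification information, then
            # signal it so it can prepare for the lifecycle transition.
            self._tasks[task_id].on_lifecycle_signal(transition)

    stack = TaskStack()
    stack.add(Task("email"))
    stack.transition("email", "suspended")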
[0220] Further, the Task Management module 540 may suspend the
application 564 based on information from the Window Management
Module 532. Suspending the application 564 may maintain application
data in memory but may limit or stop the application 564 from
rendering a window or user interface. Once the application becomes
active again, the Task Management module 540 can again trigger the
application to render its user interface. In embodiments, if a task
is suspended, the task may save the task's state in case the task
is terminated. In the suspended state, the application task may not
receive input because the application window is not visible to the
user.
[0221] The frame buffer 548 is a logical structure(s) used to
render the user interface. The frame buffer 548 can be created and
destroyed by the OS kernel 518. However, the Display Controller 544
can write the image data, for the visible windows, into the frame
buffer 548. A frame buffer 548 can be associated with one screen or
multiple screens. The association of a frame buffer 548 with a
screen can be controlled dynamically by interaction with the OS
kernel 518. A composite display may be created by associating
multiple screens with a single frame buffer 548. Graphical data
used to render an application's window user interface may then be
written to the single frame buffer 548, for the composite display,
which is output to the multiple screens 104,108. The Display
Controller 544 can direct an application's user interface to a
portion of the frame buffer 548 that is mapped to a particular
display 110,114, thus, displaying the user interface on only one
screen 104 or 108. The Display Controller 544 can extend the
control over user interfaces to multiple applications, controlling
the user interfaces for as many displays as are associated with a
frame buffer 548 or a portion thereof. This approach compensates
for the multiple physical screens 104,108 that are in use by the
software component above the Display Controller 544.
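An illustrative sketch of a composite display formed by mapping portions of one frame buffer to multiple screens (the dimensions follow the 960x800 example given earlier; the names are assumptions):

    class FrameBufferSketch:
        def __init__(self, width: int, height: int):
            self.width, self.height = width, height
            self.regions = {}  # screen id -> (x, y, w, h) portion shown

        def map_screen(self, screen_id: int, x: int, y: int,
                       w: int, h: int):
            self.regions[screen_id] = (x, y, w, h)

    # Both 480x800 screens scan out halves of a single shared buffer.
    fb = FrameBufferSketch(960, 800)
    fb.map_screen(104, 0, 0, 480, 800)    # primary screen: left half
    fb.map_screen(108, 480, 0, 480, 800)  # secondary screen: right half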
[0222] The Application Manager 562 is an application that provides
a presentation layer for the window environment. Thus, the
Application Manager 562 provides the graphical model for rendering
by the Task Management Module 540. Likewise, the Desktop 566
provides the presentation layer for the Application Store 560.
Thus, the desktop provides a graphical model of a surface having
selectable application icons for the Applications 564 in the
Application Store 560 that can be provided to the Window Management
Module 532 for rendering.
[0223] Further, the framework can include an Application Model
Manager (AMM) 542. The Application Manager 562 may interface with
the AMM 542. In embodiments, the AMM 542 receives state change
information from the device 100 regarding the state of applications
(which are running or suspended). The AMM 542 can associate bit map
images from the Surface Cache Module 528 to the tasks that are
alive (running or suspended). Further, the AMM 542 can convert the
logical window stack maintained in the Task Manager Module 540 to a
linear ("film strip" or "deck of cards") organization that the user
perceives when the using the off gesture capture area 120 to sort
through the windows. Further, the AMM 542 may provide a list of
executing applications to the Application Manager 562.
[0224] An embodiment of the MDM module 524 is shown in FIG. 5B. The
MDM module 524 is operable to determine the state of the
environment for the device, including, but not limited to, the
orientation of the device, whether the device 100 is opened or
closed, what applications 564 are executing, how the applications
564 are to be displayed, what actions the user is conducting, the
tasks being displayed, etc. To configure the display, the MDM
module 524 interprets these environmental factors and determines a
display configuration, as described in conjunction with FIGS.
6A-6J. Then, the MDM module 524 can bind the applications 564 or
other device components to the displays. The configuration may then
be sent to the Display Controller 544 and/or the other components
within the OS 516 to generate the display. The MDM module 524 can
include one or more of, but is not limited to, a Display
Configuration Module 568, a Preferences Module 572, a Device State
Module 574, a Gesture Module 576, a Requirements Module 580, an
Event Module 584, and/or a Binding Module 588.
[0225] The Display Configuration Module 568 determines the layout
for the display. In embodiments, the Display Configuration Module
568 can determine the environmental factors. The environmental
factors may be received from one or more other MDM modules 524 or
from other sources. The Display Configuration Module 568 can then
determine from the list of factors the best configuration for the
display. Some embodiments of the possible configurations and the
factors associated therewith are described in conjunction with
FIGS. 6A-6F.
[0226] The Preferences Module 572 is operable to determine display
preferences for an application 564 or other component. For example,
an application can have a preference for Single or Dual displays.
The Preferences Module 572 can determine an application's display
preference (e.g., by inspecting the application's preference
settings) and may allow the application 564 to change to a mode
(e.g., single screen, dual screen, max, etc.) if the device 100 is
in a state that can accommodate the preferred mode. However, some
user interface policies may disallow a mode even if the mode is
available. As the configuration of the device changes, the
preferences may be reviewed to determine if a better display
configuration can be achieved for an application 564.
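A minimal sketch of that preference check, assuming simple string-valued modes and an allowance table (both are illustrative, not the disclosed policy mechanism):

    ALLOWED_MODES = {
        "open": {"single", "dual", "max"},
        "closed": {"single"},
    }

    def resolve_mode(preferred: str, device_state: str,
                     disallowed_by_policy: frozenset = frozenset()) -> str:
        """Grant the preferred mode only if the device state accommodates
        it and no user interface policy disallows it."""
        if (preferred in ALLOWED_MODES.get(device_state, set())
                and preferred not in disallowed_by_policy):
            return preferred
        return "single"  # fall back to a universally available mode

    print(resolve_mode("dual", "open"))    # dual
    print(resolve_mode("dual", "closed"))  # single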
[0227] The Device State Module 574 is operable to determine or
receive the state of the device. The state of the device can be as
described in conjunction with FIGS. 3A and 3B. The state of the
device can be used by the Display Configuration Module 568 to
determine the configuration for the display. As such, the Device
State Module 574 may receive inputs and interpret the state of the
device. The state information is then provided to the Display
Configuration Module 568.
[0228] The Gesture Module 576 is shown as part of the MDM module
524, but, in embodiments, the Gesture module 576 may be a separate
Framework 520 component that is separate from the MDM module 524.
In embodiments, the Gesture Module 576 is operable to determine if
the user is conducting any actions on any part of the user
interface. In alternative embodiments, the Gesture Module 576
receives user interface actions from the configurable area 112,116
only. The Gesture Module 576 can receive touch events that occur on
the configurable area 112,116 (or possibly other user interface
areas) by way of the Input Management Module 536 and may interpret
the touch events (using direction, speed, distance, duration, and
various other parameters) to determine what kind of gesture the
user is performing. When a gesture is interpreted, the Gesture
Module 576 can initiate the processing of the gesture and, by
collaborating with other Framework 520 components, can manage the
required window animation. The Gesture Module 576 collaborates with
the Application Model Manager 542 to collect state information with
respect to which applications are running (active or paused) and
the order in which applications must appear when a user gesture is
performed. The Gesture Module 576 may also receive references to
bitmaps (from the Surface Cache Module 528) and live windows so
that when a gesture occurs it can instruct the Display Controller
544 how to move the window(s) across the display 110,114. Thus,
suspended applications may appear to be running when those windows
are moved across the display 110,114.
[0229] Further, the Gesture Module 576 can receive task information
either from the Task Management Module 540 or the Input Management
module 536. The gestures may be as defined in conjunction with
FIGS. 4A through 4H. For example, moving a window causes the
display to render a series of display frames that illustrate the
window moving. The gesture associated with such user interface
interaction can be received and interpreted by the Gesture Module
576. The information about the user gesture is then sent to the
Task Management Module 540 to modify the display binding of the
task.
[0230] The Requirements Module 580, similar to the Preferences
Module 572, is operable to determine display requirements for an
application 564 or other component. An application can have a set
display requirement that must be observed. Some applications
require a particular display orientation. For example, the
application "Angry Birds" can only be displayed in landscape
orientation. This type of display requirement can be determined or
received, by the Requirements Module 580. As the orientation of the
device changes, the Requirements Module 580 can reassert the
display requirements for the application 564. The Display
Configuration Module 568 can generate a display configuration that
is in accordance with the application display requirements, as
provided by the Requirements Module 580.
[0231] The Event Module 584, similar to the Gesture Module 576, is
operable to determine one or more events occurring with an
application or other component that can affect the user interface.
Thus, the Event Module 584 can receive event information either
from the event buffer 556 or the Task Management module 540. These
events can change how the tasks are bound to the displays. The
Event Module 584 can collect state change information from other
Framework 520 components and act upon that state change
information. In an example, when the phone is opened or closed or
when an orientation change has occurred, a new message may be
rendered in a secondary screen. The state change based on the event
can be received and interpreted by the Event Module 584. The
information about the events then may be sent to the Display
Configuration Module 568 to modify the configuration of the
display.
[0232] The Binding Module 588 is operable to bind the applications
564 or the other components to the configuration determined by the
Display Configuration Module 568. A binding associates, in memory,
the display configuration for each application with the display and
mode of the application. Thus, the Binding Module 588 can associate
an application with a display configuration for the application
(e.g. landscape, portrait, multi-screen, etc.). Then, the Binding
Module 588 may assign a display identifier to the display. The
display identifier associates the application with a particular
display of the device 100. This binding is then stored and provided
to the Display Controller 544, the other components of the OS 516,
or other components to properly render the display. The binding is
dynamic and can change or be updated based on configuration changes
associated with events, gestures, state changes, application
preferences or requirements, etc.
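The binding can be pictured as a small in-memory record, as in this sketch (the dataclass fields and helper names are assumptions):

    from dataclasses import dataclass

    @dataclass
    class Binding:
        app: str
        configuration: str  # e.g., "landscape", "portrait", "multi-screen"
        display_id: int     # ties the application to a particular display

    bindings = {}

    def bind(app: str, configuration: str, display_id: int):
        # Stored so the Display Controller and other components can
        # properly render the display.
        bindings[app] = Binding(app, configuration, display_id)

    def rebind(app: str, configuration: str, display_id: int):
        # The binding is dynamic: events, gestures, state changes, or
        # preference changes can update it.
        bind(app, configuration, display_id)

    bind("browser", "portrait", 110)
    rebind("browser", "landscape", 114)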
[0233] User Interface Configurations:
[0234] With reference now to FIGS. 6A-J, various types of output
configurations made possible by the device 100 will be described
hereinafter.
[0235] FIGS. 6A and 6B depict two different output configurations
of the device 100 being in a first state. Specifically, FIG. 6A
depicts the device 100 being in a closed portrait state 304 where
the data is displayed on the primary screen 104. In this example,
the device 100 displays data via the touch sensitive display 110 in
a first portrait configuration 604. As can be appreciated, the
first portrait configuration 604 may only display a desktop or
operating system home screen. Alternatively, one or more windows
may be presented in a portrait orientation while the device 100 is
displaying data in the first portrait configuration 604.
[0236] FIG. 6B depicts the device 100 still being in the closed
portrait state 304, but instead data is displayed on the secondary
screen 108. In this example, the device 100 displays data via the
touch sensitive display 114 in a second portrait configuration
608.
[0237] It may be possible to display similar or different data in
either the first or second portrait configuration 604, 608. It may
also be possible to transition between the first portrait
configuration 604 and second portrait configuration 608 by
providing the device 100 a user gesture (e.g., a double tap
gesture), a menu selection, or other means. Other suitable gestures
may also be employed to transition between configurations.
Furthermore, it may also be possible to transition the device 100
from the first or second portrait configuration 604, 608 to any
other configuration described herein depending upon which state the
device 100 is moved.
[0238] An alternative output configuration may be accommodated by
the device 100 being in a second state. Specifically, FIG. 6C
depicts a third portrait configuration where data is displayed
simultaneously on both the primary screen 104 and the secondary
screen 108. The third portrait configuration may be referred to as
a Dual-Portrait (PD) output configuration. In the PD output
configuration, the touch sensitive display 110 of the primary
screen 104 depicts data in the first portrait configuration 604
while the touch sensitive display 114 of the secondary screen 108
depicts data in the second portrait configuration 608. The
simultaneous presentation of the first portrait configuration 604
and the second portrait configuration 608 may occur when the device
100 is in an open portrait state 320. In this configuration, the
device 100 may display one application window in one display 110 or
114, two application windows (one in each display 110 and 114), one
application window and one desktop, or one desktop. Other
configurations may be possible. It should be appreciated that it
may also be possible to transition the device 100 from the
simultaneous display of configurations 604, 608 to any other
configuration described herein depending upon which state the
device 100 is moved. Furthermore, while in this state, an
application's display preference may place the device into
bilateral mode, in which both displays are active to display
different windows in the same application. For example, a Camera
application may display a viewfinder and controls on one side,
while the other side displays a mirrored preview that can be seen
by the photo subjects. Games involving simultaneous play by two
players may also take advantage of bilateral mode.
[0239] FIGS. 6D and 6E depict two further output configurations of
the device 100 being in a third state. Specifically, FIG. 6D
depicts the device 100 being in a closed landscape state 340 where
the data is displayed on the primary screen 104. In this example,
the device 100 displays data via the touch sensitive display 110 in
a first landscape configuration 612. Much like the other
configurations described herein, the first landscape configuration
612 may display a desktop, a home screen, one or more windows
displaying application data, or the like.
[0240] FIG. 6E depicts the device 100 still being in the closed
landscape state 340, but instead data is displayed on the secondary
screen 108. In this example, the device 100 displays data via the
touch sensitive display 114 in a second landscape configuration
616. It may be possible to display similar or different data in
either the first or second landscape configuration 612, 616. It may
also be possible to transition between the first landscape
configuration 612 and second landscape configuration 616 by
providing the device 100 with one or both of a twist and tap
gesture or a flip and slide gesture. Other suitable gestures may
also be employed to transition between configurations. Furthermore,
it may also be possible to transition the device 100 from the first
or second landscape configuration 612, 616 to any other
configuration described herein depending upon which state the
device 100 is moved.
[0241] FIG. 6F depicts a third landscape configuration where data
is displayed simultaneously on both the primary screen 104 and the
secondary screen 108. The third landscape configuration may be
referred to as a Dual-Landscape (LD) output configuration. In the
LD output configuration, the touch sensitive display 110 of the
primary screen 104 depicts data in the first landscape
configuration 612 while the touch sensitive display 114 of the
secondary screen 108 depicts data in the second landscape
configuration 616. The simultaneous presentation of the first
landscape configuration 612 and the second landscape configuration
616 may occur when the device 100 is in an open landscape state
348. It should be appreciated that it may also be possible to
transition the device 100 from the simultaneous display of
configurations 612, 616 to any other configuration described herein
depending upon which state the device 100 is moved.
[0242] FIGS. 6G and 6H depict two views of a device 100 being in
yet another state. Specifically, the device 100 is depicted as
being in an easel state 312. FIG. 6G shows that a first easel
output configuration 618 may be displayed on the touch sensitive
display 110. FIG. 6H shows that a second easel output configuration
620 may be displayed on the touch sensitive display 114. The device
100 may be configured to depict either the first easel output
configuration 618 or the second easel output configuration 620
individually. Alternatively, both the easel output configurations
618, 620 may be presented simultaneously. In some embodiments, the
easel output configurations 618, 620 may be similar or identical to
the landscape output configurations 612, 616. The device 100 may
also be configured to display one or both of the easel output
configurations 618, 620 while in a modified easel state 316. It
should be appreciated that simultaneous utilization of the easel
output configurations 618, 620 may facilitate two-person games
(e.g., Battleship.RTM., chess, checkers, etc.), multi-user
conferences where two or more users share the same device 100, and
other applications. As can be appreciated, it may also be possible
to transition the device 100 from the display of one or both
configurations 618, 620 to any other configuration described herein
depending upon which state the device 100 is moved.
[0243] FIG. 6I depicts yet another output configuration that may be
accommodated while the device 100 is in an open portrait state 320.
Specifically, the device 100 may be configured to present a single
continuous image across both touch sensitive displays 110, 114 in a
portrait configuration referred to herein as a Portrait-Max (PMax)
configuration 624. In this configuration, data (e.g., a single
image, application, window, icon, video, etc.) may be split and
displayed partially on one of the touch sensitive displays while
the other portion of the data is displayed on the other touch
sensitive display. The PMax configuration 624 may facilitate a
larger display and/or better resolution for displaying a particular
image on the device 100. Similar to other output configurations, it
may be possible to transition the device 100 from the PMax
configuration 624 to any other output configuration described
herein, depending upon the state to which the device 100 is moved.
[0244] FIG. 6J depicts still another output configuration that may
be accommodated while the device 100 is in an open landscape state
348. Specifically, the device 100 may be configured to present a
single continuous image across both touch sensitive displays 110,
114 in a landscape configuration referred to herein as a
Landscape-Max (LMax) configuration 628. In this configuration, data
(e.g., a single image, application, window, icon, video, etc.) may
be split and displayed partially on one of the touch sensitive
displays while the other portion of the data is displayed on the
other touch sensitive display. The LMax configuration 628 may
facilitate a larger display and/or better resolution for displaying
a particular image on the device 100. Similar to other output
configurations, it may be possible to transition the device 100
from the LMax configuration 628 to any other output configuration
described herein, depending upon the state to which the device 100
is moved.
[0245] The device 100 manages desktops and/or windows with at least
one window stack 700, 728, as shown in FIGS. 7A and 7B. A window
stack 700, 728 is a logical arrangement of active and/or inactive
windows for a multi-screen device. For example, the window stack
700, 728 may be logically similar to a deck of cards, where one or
more windows or desktops are arranged in order, as shown in FIGS.
7A and 7B. An active window is a window that is currently being
displayed on at least one of the touch sensitive displays 110, 114.
For example, windows 704 and 708 are active windows and are
displayed on touch sensitive displays 110 and 114. An inactive
window is a window that was opened and displayed but is now
"behind" an active window and not being displayed. In embodiments,
an inactive window may be for an application that is suspended, and
thus, the window is not displaying active content. For example,
windows 712, 716, 720, and 724 are inactive windows.
[0246] A window stack 700, 728 may have various arrangements or
organizational structures. In the embodiment shown in FIG. 7A, the
device 100 includes a first stack 760 associated with a first touch
sensitive display 110 and a second stack 764 associated with a second
touch sensitive display 114. Thus, each touch sensitive display
110, 114 can have an associated window stack 760, 764. These two
window stacks 760, 764 may have different numbers of windows
arranged in the respective stacks 760, 764. Further, the two window
stacks 760, 764 can also be identified differently and managed
separately. Thus, the first window stack 760 can be arranged in
order from a first window 704 to a next window 720 to a last window
724 and finally to a desktop 722, which, in embodiments, is at the
"bottom" of the window stack 760. In embodiments, the desktop 722
is not always at the "bottom" as application windows can be
arranged in the window stack below the desktop 722, and the desktop
722 can be brought to the "top" of a stack over other windows
during a desktop reveal. Likewise, the second stack 764 can be
arranged from a first window 708 to a next window 712 to a last
window 716, and finally to a desktop 718, which, in embodiments, is
a single desktop area, with desktop 722, under all the windows in
both window stack 760 and window stack 764. A logical data
structure for managing the two window stacks 760, 764 may be as
described in conjunction with FIG. 8.
[0247] Another arrangement for a window stack 728 is shown in FIG.
7B. In this embodiment, there is a single window stack 728 for both
touch sensitive displays 110, 114. Thus, the window stack 728 is
arranged from a desktop 758 to a first window 744 to a last window
756. A window can be arranged in a position among all windows
without an association to a specific touch sensitive display 110,
114. In this embodiment, each window simply occupies a position in
the single order of windows.
Further, at least one window is identified as being active. For
example, a single window may be rendered in two portions 732 and
736 that are displayed on the first touch sensitive screen 110 and
the second touch sensitive screen 114. The single window may only
occupy a single position in the window stack 728 although it is
displayed on both displays 110, 114.
[0248] Yet another arrangement of a window stack 760 is shown in
FIGS. 7C through 7E. The window stack 760 is shown in three
"elevation" views. In FIG. 7C, the top of the window stack 760 is
shown. Two sides of the window stack 760 are shown in FIGS. 7D and
7E. In this embodiment, the window stack 760 resembles a stack of
bricks. The windows are stacked on each other. Looking from the top
of the window stack 760 in FIG. 7C, only the top most windows in
the window stack 760 are seen in different portions of the
composite display 764. The composite display 764 represents a
logical model for the entire display area of the device 100, which
can include touch sensitive display 110 and touch sensitive display
114. A desktop 786 or a window can occupy part or all of the
composite display 764.
[0249] In the embodiment shown, the desktop 786 is the lowest
display or "brick" in the window stack 760. Thereupon, window 1
782, window 2 782, window 3 768, and window 4 770 are layered.
Window 1 782, window 2 782, window 3 768, and window 4 770 only
occupy a portion of the composite display 764. Thus, another part
of the stack 760 includes window 8 774 and windows 5 through 7
shown in section 790. Only the top window in any portion of the
composite display 764 is actually rendered and displayed. Thus, as
shown in the top view in FIG. 7C, window 4 770, window 8 774, and
window 3 768 are displayed as being at the top of the display in
different portions of the window stack 760. A window can be
dimensioned to occupy only a portion of the composite display 764
to "reveal" windows lower in the window stack 760. For example,
window 3 768 is lower in the stack than both window 4 770 and
window 8 774 but is still displayed. A logical data structure to
manage the window stack can be as described in conjunction with
FIG. 8.
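By way of a non-limiting illustration only, the rendering rule described above (only the topmost window in any portion of the composite display is actually rendered) can be sketched in Java as a single pass from the top of the stack downward, keeping any window not fully covered by a higher window. This is a simplified sketch, not the actual algorithm of the device 100; the class name, the use of java.awt.Rectangle, and the single-rectangle coverage test (a real pass would test coverage against the union of all higher windows) are assumptions made for illustration.

    import java.awt.Rectangle;
    import java.util.ArrayList;
    import java.util.LinkedHashMap;
    import java.util.List;
    import java.util.Map;

    // Sketch: walk the stack from top to bottom and keep any window whose
    // bounds are not fully covered by a single window above it.
    class VisibilityPass {
        static List<String> visibleWindows(Map<String, Rectangle> topToBottom) {
            List<Rectangle> higher = new ArrayList<>();
            List<String> visible = new ArrayList<>();
            for (Map.Entry<String, Rectangle> e : topToBottom.entrySet()) {
                boolean covered = false;
                for (Rectangle h : higher) {
                    if (h.contains(e.getValue())) { // fully hidden by one window
                        covered = true;
                        break;
                    }
                }
                if (!covered) {
                    visible.add(e.getKey()); // some portion of this window shows
                }
                higher.add(e.getValue());
            }
            return visible;
        }

        public static void main(String[] args) {
            // Approximates FIG. 7C: windows 4, 8, and 3 occupy different
            // portions of the composite display and are all visible.
            Map<String, Rectangle> stack = new LinkedHashMap<>();
            stack.put("window 4", new Rectangle(0, 0, 100, 100));
            stack.put("window 8", new Rectangle(100, 0, 100, 100));
            stack.put("window 3", new Rectangle(200, 0, 100, 100));
            stack.put("window 1", new Rectangle(0, 0, 100, 100)); // under 4
            System.out.println(visibleWindows(stack)); // [window 4, window 8, window 3]
        }
    }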
[0250] When a new window is opened, the newly activated window is
generally positioned at the top of the stack. However, where and
how the window is positioned within the stack can be a function of
the orientation of the device 100, the context of what programs,
functions, software, etc. are being executed on the device 100, how
the stack is positioned when the new window is opened, etc. To
insert the window in the stack, the position in the stack for the
window is determined and the touch sensitive display 110, 114 to
which the window is associated may also be determined. With this
information, a logical data structure for the window can be created
and stored. When user interface or other events or tasks change the
arrangement of windows, the window stack(s) can be changed to
reflect the change in arrangement. It should be noted that these
same concepts described above can be used to manage the one or more
desktops for the device 100.
[0251] A logical data structure 800 for managing the arrangement of
windows or desktops in a window stack is shown in FIG. 8. The
logical data structure 800 can be any data structure used to store
data whether an object, record, file, etc. The logical data
structure 800 can be stored in any type of database or data storage
system, regardless of protocol or standard. In embodiments, the
logical data structure 800 includes one or more portions, fields,
attributes, etc. that store data in a logical arrangement that
allows for easy storage and retrieval of the information.
Hereinafter, these one or more portions, fields, attributes, etc.
shall be described simply as fields. The fields can store data for
a window identifier 804, dimensions 808, a stack position
identifier 812, a display identifier 816, and/or an active
indicator 820. Each window in a window stack can have an associated
logical data structure 800. While only a single logical data
structure 800 is shown in FIG. 8, there may be more or fewer
logical data structures 800 used with a window stack (based on the
number of windows or desktops in the stack), as represented by
ellipses 824. Further, there may be more or fewer fields than those
shown in FIG. 8, as represented by ellipses 828.
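As a concrete, non-limiting illustration, the fields enumerated above map naturally onto a small record type. The following Java sketch assumes illustrative types and names; it is not the actual layout of the logical data structure 800.

    // Sketch of logical data structure 800: one instance per window or
    // desktop in a window stack. Field types are illustrative assumptions.
    class WindowRecord {
        String windowId;         // window identifier 804 (GUID, numeric, etc.)
        int x, y, width, height; // dimensions 808 within the composite display
        int stackPosition;       // stack position identifier 812 (1 = top)
        int displayId;           // display identifier 816 (e.g., 0 = display
                                 // 110, 1 = display 114, 2 = composite)
        boolean active;          // active indicator 820: set while displayed

        WindowRecord(String windowId, int x, int y, int width, int height,
                     int stackPosition, int displayId, boolean active) {
            this.windowId = windowId;
            this.x = x; this.y = y; this.width = width; this.height = height;
            this.stackPosition = stackPosition;
            this.displayId = displayId;
            this.active = active;
        }
    }

A window stack is then simply an ordered collection of such records, one per window or desktop, as represented by ellipses 824.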
[0252] A window identifier 804 can include any identifier (ID) that
uniquely identifies the associated window in relation to other
windows in the window stack. The window identifier 804 can be a
globally unique identifier (GUID), a numeric ID, an alphanumeric
ID, or other type of identifier. In embodiments, the window
identifier 804 can be one, two, or any number of digits based on
the number of windows that can be opened. In alternative
embodiments, the size of the window identifier 804 may change based
on the number of windows opened. While the window is open, the
window identifier 804 may be static and remain unchanged.
[0253] Dimensions 808 can include dimensions for a window in the
composite display 764. For example, the dimensions 808 can include
coordinates for two or more corners of the window or may include
one coordinate and dimensions for the width and height of the
window. These dimensions 808 can delineate what portion of the
composite display 764 the window may occupy, which may be the
entire composite display 764 or only part of the composite display
764. For example, window 4 770 may have dimensions 808 that
indicate that the window 770 will occupy only part of the display
area for the composite display 764, as shown in FIGS. 7C through
7E. As windows are moved or inserted in the window stack, the
dimensions 808 may change.
[0254] A stack position identifier 812 can be any identifier that
can identify the position in the stack for the window or may be
inferred from the window's control record within a data structure,
such as a list or a stack. The stack position identifier 812 can be
a GUID, a numeric ID, an alphanumeric ID, or other type of
identifier. Each window or desktop can include a stack position
identifier 812. For example, as shown in FIG. 7A, window 1 704 in
stack 1 760 can have a stack position identifier 812 of 1
identifying that window 704 is the first window in the stack 760
and the active window. Similarly, window 6 724 can have a stack
position identifier 812 of 3 representing that window 724 is the
third window in the stack 760. Window 2 708 can also have a stack
position identifier 812 of 1 representing that window 708 is the
first window in the second stack 764. As shown in FIG. 7B, window 1
744 can have a stack position identifier 812 of 1, window 3,
rendered in portions 732 and 736, can have a stack position
identifier 812 of 3, and window 6 756 can have a stack position
identifier 812 of 6. Thus, depending on the type of stack, the
stack position identifier 812 can represent a window's location in
the stack.
[0255] A display identifier 816 can identify that the window or
desktop is associated with a particular display, such as the first
display 110 or the second display 114, or the composite display 764
composed of both displays. While this display identifier 816 may
not be needed for a multi-stack system, as shown in FIG. 7A, the
display identifier 816 can indicate whether a window in the serial
stack of FIG. 7B is displayed on a particular display. Thus, window
3 may have two portions 732 and 736 in FIG. 7B. The first portion
732 may have a display identifier 816 for the first display while
the second portion 736 may have a display identifier 816 for the
second display 114. However, in alternative embodiments, the window
may have two display identifiers 816 that represent that the window
is displayed on both of the displays 110, 114, or a display
identifier 816 identifying the composite display. In another
alternate embodiment, the window may have a single display
identifier 816 to represent that the window is displayed on both of
the displays 110, 114.
[0256] Similar to the display identifier 816, an active indicator
820 may not be needed with the dual stack system of FIG. 7A, as the
window in stack position 1 is active and displayed. In the system
of FIG. 7B, the active indicator 820 can indicate which window(s)
in the stack is being displayed. Thus, window 3 may have two
portions 732 and 736 in FIG. 7B. The first portion 732 may have an
active indicator 820 while the second portion 736 may also have an
active indicator 820. However, in alternative embodiments, window 3
may have a single active indicator 820. The active indicator 820
can be a simple flag or bit that represents that the window is
active or displayed.
[0257] An embodiment of a method 900 for creating a window stack is
shown in FIG. 9. A general order for the steps of the method
900 is shown in FIG. 9. Generally, the method 900 starts with a
start operation 904 and ends with an end operation 928. The method
900 can include more or fewer steps or can arrange the order of the
steps differently than those shown in FIG. 9. The method 900 can be
executed as a set of computer-executable instructions executed by a
computer system and encoded or stored on a computer readable
medium. Hereinafter, the method 900 shall be explained with
reference to the systems, components, modules, software, data
structures, user interfaces, etc. described in conjunction with
FIGS. 1-8.
[0258] A multi-screen device 100 can receive activation of a
window, in step 908. In embodiments, the multi-screen device 100
can receive activation of a window by receiving an input from the
touch sensitive display 110 or 114, the configurable area 112 or
116, a gesture capture region 120 or 124, or some other hardware
sensor operable to receive user interface inputs. The processor,
executing the Task Management Module 540, may receive the input. The
Task Management Module 540 can interpret the input as requesting an
application task to be executed that will open a window in the
window stack.
[0259] In embodiments, the Task Management Module 540 places the
user interface interaction in the task stack 552 to be acted upon
by the Display Configuration Module 568 of the Multi-Display
Management Module 524. Further, the Task Management Module 540
waits for information from the Multi-Display Management Module 524
to send instructions to the Window Management Module 532 to create
the window in the window stack.
[0260] The Multi-Display Management Module 524, upon receiving
instruction from the Task Management Module 540, determines to
which portion of the composite display 764 the newly
activated window should be associated, in step 912. For example,
window 4 770 is associated with a portion of the composite
display 764. In embodiments, the device state module 574 of the
Multi-Display Management Module 524 may determine how the device is
oriented or in what state the device is in, e.g., open, closed,
portrait, etc. Further, the preferences module 572 and/or
requirements module 580 may determine how the window is to be
displayed. The gesture module 576 may determine the user's
intentions about how the window is to be opened based on the type
of gesture and the location of where the gesture is made.
[0261] The Display Configuration Module 568 may use the input from
these modules and evaluate the current window stack 760 to
determine the best place and the best dimensions, based on a
visibility algorithm, to open the window. Thus, the Display
Configuration Module 568 determines the best place to put the
window at the top of the window stack 760, in step 916. The
visibility algorithm, in embodiments, determines, for all portions
of the composite display, which windows are at the top of the
stack. For example, the visibility algorithm determines that window
3 768, window 4 770, and window 8 774 are at the top of the stack
760 as viewed in FIGS. 7C through 7E. Upon determining where to
open the window, the Display Configuration Module 568 can assign a
display identifier 816 and possibly dimensions 808 to the window.
The display identifier 816 and dimensions 808 can then be sent back
to the Task Management Module 540. The Task Management Module 540
may then assign the window a stack position identifier 812
indicating the window's position at the top of the window stack.
[0262] In embodiments, the Task Management Module 540 sends the
window stack information and instructions to render the window to
the Window Management Module 532. The Window Management Module 532
and the Task Management Module 540 can create the logical data
structure 800, in step 924. Both the Task Management Module 540 and
the Window Management Module 532 may create and manage copies of
the window stack. These copies of the window stack can be
synchronized or kept similar through communications between the
Window Management Module 532 and the Task Management Module 540.
Thus, the Window Management Module 532 and the Task Management
Module 540, based on the information determined by the
Multi-Display Management Module 524, can assign dimensions 808, a
stack position identifier 812 (e.g., window 1 782, window 4 770,
etc.), a display identifier 816 (e.g., touch sensitive display 1
110, touch sensitive display 2 114, composite display identifier,
etc.), and an active indicator 820, which is generally set
when the window is at the "top" of the stack. The logical data
structure 800 may then be stored by both the Window Management
Module 532 and the Task Management Module 540. Further, the Window
Management Module 532 and the Task Management Module 540 may
thereafter manage the window stack and the logical data
structure(s) 800.
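Steps 912 through 924 can be pictured, again only as a hedged sketch, as placing the new window at the top of the stack and stamping out its logical data structure. The WindowStackManager class and its openWindow method below are invented for illustration and reuse the WindowRecord sketch above; the patent's modules divide this work differently, as described in the preceding paragraphs.

    import java.util.ArrayList;
    import java.util.List;

    // Sketch of steps 912-924: associate the window with a display, place it
    // at the top of the stack, and create and store its logical data structure.
    class WindowStackManager {
        private final List<WindowRecord> stack = new ArrayList<>();

        WindowRecord openWindow(String windowId, int displayId,
                                int x, int y, int width, int height) {
            for (WindowRecord r : stack) {
                r.stackPosition += 1; // existing windows shift down one position
                r.active = false;     // simplification: one active window per stack
            }
            WindowRecord record = new WindowRecord(
                    windowId, x, y, width, height, 1, displayId, true);
            stack.add(0, record);     // step 924: store the logical data structure
            return record;
        }
    }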
[0263] An embodiment of a method 1000 for executing an application,
such as a phone application, is shown in FIG. 10. A general
order for the steps of the method 1000 is shown in FIG. 10.
Generally, the method 1000 starts with a start operation 1004 and
ends with an end operation 1040. The method 1000 can include more
or fewer steps or can arrange the order of the steps differently
than those shown in FIG. 10. The method 1000 can be executed as a
set of computer-executable instructions executed by a computer
system and encoded or stored on a computer readable medium.
Hereinafter, the method 1000 shall be explained with reference to
the systems, components, modules, software, data structures, user
interfaces, etc. described in conjunction with FIGS. 1-9.
[0264] An application, such as a phone application, is executed, in
step 1008. In embodiments, a processor 204 receives indication to
execute an application through a user interface 110, 114, 112, 116,
etc. The indication can be a selection of an icon associated with
the application. In other embodiments, the indication can be a
signal generated from another application or event, such as
receiving a phone call or other communication, which causes the
application to execute automatically. The processor 204 can
retrieve the application 564a from the application store 560 and
begin its execution. In executing the application 564a, a user
interface can be generated for a user.
[0265] In creating a user interface, the application 564a can begin
executing to create a manifest, in step 1012. A manifest is a data
structure that indicates the capabilities of the application 564a.
The manifest can generally be created from the resources in the
resources directory of the application 564a. The resources
directory can indicate the types of modes, locations, or other
indications for how the user interface should be configured in the
multi-display device 100. For example, the several modes can
include: "classic mode" that indicates that the application 564a is
capable of being displayed on a single screen or display 110/114;
"dual mode" that indicates that the application 564a is capable of
being displayed on two or more displays 110 and 114; "max mode"
that indicates the application 564a is capable of being displayed
or desires to be displayed across multiple displays 110 and 114;
and/or "bilateral mode" that indicates that the application 564a is
capable of being displayed on two or more displays 110 and 114 when
the device 100 is in easel mode (see FIGS. 1I and/or 1J).
[0266] Similarly, the manifest can include a desired or allowed
location within the displays 110/114. The possible locations can
include: "left", which indicates that the application 564a desires
to be displayed on the left display 110; "right", which indicates
that the application 564a desires to be displayed on the right
display 114; and/or other indications of where a location should be
including possible "top" and/or "bottom" of one or more of the
displays 110/114.
[0267] The application 564a can also indicate that it desires to be
displayed in a "minimum" window, which is a window that occupies
less than the full area of a single display. There may be other
modes possible for the application 564a, which may be included in
the manifest. The manifest can be sent from the application 564a to
the multi-display management module 524.
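The manifest's capability and location hints lend themselves to a small value object. The following Java sketch is illustrative only; the enum values mirror the modes and locations named above, but the types and names are assumptions, not the format of the actual manifest.

    import java.util.EnumSet;

    // Illustrative model of an application manifest. Names are assumptions.
    class AppManifest {
        enum Mode { CLASSIC, DUAL, MAX, BILATERAL } // display capability modes
        enum Location { LEFT, RIGHT, TOP, BOTTOM }  // desired display location

        final String appId;
        final EnumSet<Mode> supportedModes; // e.g., CLASSIC and MAX
        final Location desiredLocation;     // e.g., LEFT for display 110
        final boolean minimumWindow;        // wants less than a full display

        AppManifest(String appId, EnumSet<Mode> supportedModes,
                    Location desiredLocation, boolean minimumWindow) {
            this.appId = appId;
            this.supportedModes = supportedModes;
            this.desiredLocation = desiredLocation;
            this.minimumWindow = minimumWindow;
        }
    }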
[0268] The multi-display management module 524 can receive the
manifest, in step 1016. In receiving the manifest, the
multi-display management module 524 can use the information to
determine a display binding for the application 564a. The manifest
may be received more than once from the application 564a based on
changes in how the application 564a is being executed, where the
application 564a desires to have a different display setting for
the new mode. Thus, with the manifest, the application 564a can
indicate to the multi-display management module 524 how best to
display, or what is desired for, the application's user interface. The
multi-display management module 524 can use the information in the
manifest to determine the best fit for the user interface depending
on how the device 100 is currently configured.
[0269] The multi-display management module 524 can determine the
application display mode, in step 1020. Here the multi-display
management module 524 receives or retrieves an indication of the
device 100 configuration. For example, the multi-display management
module 524 can determine if the device is in single display
configuration (see FIG. 6A, 6B, 6D, or 6E), dual display
configuration (see FIG. 6C or 6F), bilateral display configuration
(see FIG. 6G or 6H), or one of the other display configurations
(see FIG. 6I or 6J).
[0270] Further, the multi-display management module 524 can
determine if the device 100 is in a portrait or landscape
orientation. With this information, the multi-display management
module 524 may then consider the capabilities or preferences listed
for the application 564a in the received manifest. The combined
information may then allow the multi-display management module 524
to determine a display binding. The display binding can include
which of the one or more displays 110 and/or 114 are going to be
used to display the application's user interface(s). For example,
the multi-display management module 524 can determine that the
primary display 110, the secondary display 114, or all displays 110
and 114 of the device 100 will be used to display the application's
user interface.
[0271] The display modes setting can be assigned by creating or
setting a number in the display binding. This number can be "0" for
the primary display 110, "1" for the secondary display 114, or "2"
for dual displays 110 and 114. The display mode setting can also
indicate if the application 564a should display the user interface
in portrait or landscape orientation. Further, there may be other
settings, for example, providing a max mode or other setting that
may indicate how the application 564a is to be displayed on the
device. The display binding information is stored in a data
structure to create and set a binding, in step 1024.
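The numeric encoding described above can be captured in a small structure. The following is a minimal Java sketch; only the 0/1/2 display numbering comes from the description above, and the remaining field names are illustrative assumptions.

    // Sketch of a display binding using the encoding in the text above.
    class DisplayBinding {
        static final int PRIMARY = 0;   // primary display 110
        static final int SECONDARY = 1; // secondary display 114
        static final int DUAL = 2;      // dual displays 110 and 114

        int display;       // 0, 1, or 2, per the encoding above
        boolean landscape; // portrait or landscape orientation
        boolean maxMode;   // example of a further setting, e.g., max mode

        DisplayBinding(int display, boolean landscape, boolean maxMode) {
            this.display = display;
            this.landscape = landscape;
            this.maxMode = maxMode;
        }
    }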
[0272] The established display binding may then be provided, by the
multi-display management module 524, to the application 564a, in
step 1028. The provided display binding data structure can become
an attribute of the application 564a. An application 564a may
thereafter store the display binding attribute in the memory of
the device 100. The application 564a with the display binding may
then generate a user interface based on this display binding. The
application 564a may be unaware of the position of the display
110/114 but may be able to determine, from the display binding, the
size of the available user interface to generate a window that has
particular characteristics for that display setting.
[0273] When a configuration change happens to the device 100, the
multi-display management module 524 may change the display binding
and send a new display binding to the application 564a. In
embodiments, the multi-display management module 524 may indicate
to the application 564a that there is a new binding or, in other
embodiments, the application 564a may request a display
configuration change or a new display binding, in which case the
multi-display management module 524 may send a new display binding
to the application 564a. Thus, the multi-display management module
524 can change the configuration of the display for the application
564a by altering the display binding for the application 564a
during the execution of that application 564a.
[0274] The multi-display management module 524 thereafter, while
the application 564a is executing, can determine if there has been
a configuration change to the device 100, in step 1032. The
configuration change may be an event (see FIGS. 3A and 3B)
triggered by one or more signals from one or more hardware sensor
172, 176, etc. For example, if the device 100 is changed from
portrait 304 to landscape 340 orientation, Hall effect sensors 172
may indicate to the framework 520 that a display configuration
change has been made. Other changes may include transitions from a
single display 304 to a dual display configuration 320, by opening
the device. Other types of configuration changes may be possible
and may be signaled to alert the multi-display management module
524 of the configuration change. If a configuration change has been
made, the method 1000 proceeds YES to step 1020 so that the
multi-display management module 524 can determine new application
display mode settings and create a new display binding, which may
be passed to the application 564a. If there are no configuration
changes, the method 1000 proceeds NO to step 1036.
[0275] In step 1036, a new application mode change may be
determined. Application mode changes can also occur in the
application 564a, and thus, the application 564a can determine if
something has occurred within the application 564a that requires a
different display setting. Modes are described hereinafter with
respect to FIG. 51. The mode change can create a desire to change
the display 110/114, and thus, require the application 564a to
generate a new manifest. If the application 564a does sense a mode
change or an event has occurred that requires a change in display
setting, the method 1000 proceeds YES back to step 1012. At step
1012, a new manifest or preference is created by the application
564a that may be received by the multi-display management module
524 to determine if the multi-display management module 524 can
change the display binding. If it is possible to provide the
preferred display, the multi-display management module 524 can
create a new display binding and send the display binding back to the
application 564a and allow the application 564a to alter its user
interface. If no mode change is sensed or an event is not received
to create a mode change, the method 1000 proceeds NO to end
operation 1040.
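Putting steps 1020 through 1036 together, the rebinding logic amounts to recomputing the binding from the current device configuration and the most recent manifest. The sketch below reuses the AppManifest and DisplayBinding sketches above; the policy shown (dual displays only when the device is open and the application supports a dual or max mode) is an invented simplification, not the module's actual decision logic.

    // Hedged sketch of the binding decision in steps 1020-1024.
    class BindingPolicy {
        static DisplayBinding determineBinding(AppManifest manifest,
                                               boolean deviceOpen,
                                               boolean landscape) {
            boolean wantsBoth =
                    manifest.supportedModes.contains(AppManifest.Mode.DUAL)
                 || manifest.supportedModes.contains(AppManifest.Mode.MAX);
            int display = (deviceOpen && wantsBoth)
                    ? DisplayBinding.DUAL
                    : DisplayBinding.PRIMARY;
            boolean max = manifest.supportedModes.contains(AppManifest.Mode.MAX);
            return new DisplayBinding(display, landscape, max);
        }
    }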
[0276] Unified System:
[0277] An embodiment of a unified system 1100 is shown in FIG. 11.
In embodiments, the unified system 1100 includes a computer system
1104 and the device 100. The computer system 1104 may be as
described in conjunction with FIG. 11. The device 100 may be as
described herein in conjunction with FIG. 1A through FIG. 10. The
device 100 may be physically connected to the computer system 1104
with a docking cradle or other wired connection. In other
embodiments, the device 100 and computer system 1104 may
communicate or be connected wirelessly using a wireless system
and/or protocol (e.g., Bluetooth.TM., 802.11g, etc.). Upon
connection of the device 100 and the computer system 1104, the
device 100 can recognize the connection either manually (through
user input) or automatically and functionally connect the device
100 with the computer system 1104 to form the unified system. The
unified system 1100 functions to allow the device 100 to
communicate with, interact with, and/or control the function of the
computer system 1104 when functionally connected. As such, when
executed, the unified system appears to be a single system where
the device 100 and the computer system 1104 function in concert.
The components and/or software that enable the unified system, the
communications or functions of the unified system, and other
description of the unified system is described in U.S. Provisional
Applications 61/507,206, 61/507,201, 61/507,199, 61/507,209,
61/507,203, and 61/389,117, which are each incorporated herein, by
reference, for all that they teach and for all purposes. Further,
the unified system is also described in U.S. patent application
Ser. Nos. 12/948,585 and 12/948,676, which are each incorporated
herein, by reference, for all that they teach and for all
purposes.
[0278] FIG. 11 illustrates one embodiment of a computer system 1104
upon which the systems and methods described herein may be deployed
or executed. The
computer system 1104 is shown comprising hardware elements that may
be electrically coupled via a bus 1155. The hardware elements may
include one or more central processing units (CPUs) 1105; one or
more input devices 1110 (e.g., a mouse, a keyboard, etc.); and one
or more output devices 1115 (e.g., a display device, a printer,
etc.). The computer system 1104 may also include one or more
storage devices 1120. By way of example, storage device(s) 1120 may
be disk drives, optical storage devices, solid-state storage
devices, such as a random access memory ("RAM") and/or a read-only
memory ("ROM"), which can be programmable, flash-updateable, and/or
the like.
[0279] The computer system 1104 may additionally include a
computer-readable storage media reader 1125; a communications
system 1130 (e.g., a modem, a network card (wireless or wired), an
infra-red communication device, etc.); and working memory 1140,
which may include RAM and ROM devices as described above. In some
embodiments, the computer system 1104 may also include a processing
acceleration unit 1135, which can include a DSP, a special-purpose
processor and/or the like.
[0280] The computer-readable storage media reader 1125 can further
be connected to a computer-readable storage medium, together (and,
optionally, in combination with storage device(s) 1120)
comprehensively representing remote, local, fixed, and/or removable
storage devices plus storage media for temporarily and/or more
permanently containing computer-readable information. The
communications system 1130 may permit data to be exchanged with a
network and/or any other computer described above with respect
to the system 1100. Moreover, as disclosed herein, the term
"storage medium" may represent one or more devices for storing
data, including read only memory (ROM), random access memory (RAM),
magnetic RAM, core memory, magnetic disk storage mediums, optical
storage mediums, flash memory devices, and/or other machine
readable mediums for storing information.
[0281] The computer system 1104 may also comprise software
elements, shown as being currently located within a working memory
1140, including an operating system 1145 and/or other code 1150,
such as program code implementing the components and software
described herein. It should be appreciated that alternate
embodiments of a computer system 1104 may have numerous variations
from that described above. For example, customized hardware might
also be used and/or particular elements might be implemented in
hardware, software (including portable software, such as applets),
or both. Further, connection to other computing devices such as
network input/output devices may be employed.
[0282] Unified Desktop:
[0283] An embodiment of a unified desktop 1300 is shown in FIG. 13.
The unified desktop 1300 is the user interface for the unified
system 1100. Thus, the unified desktop 1300 is formed from the user
interface 1304 in the screen 1215 associated with the computer
system 1104 and the user interface 1308 in the screen(s) 104, 108
associated with the device 100. As a unified desktop 1300, the user
interface 1304 and 1308 function together to provide parallel
displays, exchange windows or other user interface elements, and
generally present a cohesive user interface across both the
computer system 1104 and the device 100. In other words, the
unified desktop spans or is provided over at least one of the
screens 104, 108 of the device 100 and the screen 1215 of the
computer system 1104. The unified desktop 1300 can assume the form
factor and function of both the device 100 and the computer system
1104; the
design of the unified desktop can provide a seamless user
experience across the device 100 and the computer system 1104,
enabling the user to access shared content, manage applications and
peripherals regardless of which system 100 or 1104 presents the
user interface action. Hereinafter, specific user interface actions
related to the unified desktop shall be presented.
[0284] Status Indicators:
[0285] A portion of the unified desktop is shown in FIG. 14. More
particularly, the user interface 1304 of the unified desktop 1300
is shown, where a set of status indicators are provided in a
portion 1400 of the unified desktop 1300. The status indicators in
the portion 1400 provide information about both the device 100 and
the computer system 1104. In embodiments, the portion 1400 is
provided in only one of the user interface 1304 or user interface
1308. Thus, status indicators associated with both the device 100
and the computer system 1104 are shown together in a screen 1215 or
104, 108 associated with only the device 100 or the computer system
1104.
[0286] The status indicators shown in FIG. 14 include a wireless
fidelity (Wi-Fi) indicator 1404, a Bluetooth indicator 1408, a
Network Traffic indicator 1412, a signal strength indicator 1416, a
battery indicator 1420, a power menu indicator 1430, and a time
indicator 1434. In embodiments, one or more of the indicators may
be selectable to provide further information or user interface
devices that may be selected to configure the device 100 and/or the
computer system 1104. For example, as shown in FIG. 14, the battery
indicator 1420 has been selected, which caused the device 100 to
provide a drop down user interface device 1425 in the user
interface 1304. The drop down user interface device 1425 provides
further information 1438 about the amount of device battery life
and a user interface device 1442 that can be selected to access the
power settings for the device 100. Thus, the user can configure the
device 100 or the computer system 1104 by accessing menus or other
user interface through the status indicators 1404 through 1434.
Some of the possible status indicators that may be displayed for
the device 100, and the menus or other information associated with
the status indicators, are provided in the table below:
TABLE-US-00001
Indicator | Menu content | Description | Label
Wi-Fi | Wi-Fi status | Presents the current status of Wi-Fi. | "Wi-Fi: <x>", where <x> stands for "Off", "Scanning . . . ", "Connecting to <SSID> . . . ", "Disconnected", "Connected".
Wi-Fi | Wi-Fi ON/OFF toggle | Allows for turning Wi-Fi on and off. | "Turn Wi-Fi on", or "Turn Wi-Fi off"
Wi-Fi | List of found Wi-Fi networks | Lists all the Wi-Fi networks that are found and known. The list of networks is scanned whenever the menu is displayed. When clicked, tries to connect to the network, and shows a dialog if a password is required. | "<SSID>" and an icon to represent the network's strength and its security. The network currently connected to is marked.
Wi-Fi | Command to connect to a hidden Wi-Fi network | Allows for connecting to a hidden Wi-Fi network. Shows a dialog for entering the SSID, a dropdown menu for security, and a password field when required. Based on the same command in Android Settings. | "Add Wi-Fi network . . . "
Wi-Fi | Link to Wi-Fi settings | Links to: Android Settings > Wireless & network settings | "Wi-Fi settings . . . "
Bluetooth | Bluetooth device name | The Bluetooth name of the handset. | "<device name>"
Bluetooth | Bluetooth status | Presents the current status of Bluetooth. | "Bluetooth: <x>", where <x> stands for "On", or "Off"
Bluetooth | Bluetooth ON/OFF toggle | Allows for turning Bluetooth on and off. | "Turn Bluetooth on", or "Turn Bluetooth off"
Bluetooth | Discoverability status | Turns on the discoverability of Bluetooth for 120 seconds (Android default). | "Make discoverable", or "Discoverable for <x> seconds"
Bluetooth | List of connected Bluetooth devices | Lists all paired and currently connected Bluetooth devices. The list of devices is updated whenever the menu is displayed. Items are not clickable. | "<Bluetooth device name>" and an icon representing the type of the device
Bluetooth | Link to Bluetooth settings | Links to: Android Settings > Wireless & network settings > Bluetooth settings | "Bluetooth devices & settings . . . "
Network traffic | Network status | Shows the current network speed. | "Network: <x>", where <x> stands for the network speed (3G, 2G, . . . )
Network traffic | Link to Network settings | Links to: Settings > Mobile networks | "Mobile network settings . . . "
Signal strength | Network status | Shows the current operator. | "<operator name>"
Signal strength | Network strength | Shows the current signal strength. When airplane mode is on: "Airplane mode". | "Signal strength: <x>", where <x> stands for "none", "poor", "good", "very good", "excellent"
Signal strength | Link to Network settings | Links to: Settings > Mobile networks | "Mobile network settings . . . "
Battery (phone) | Battery status | Shows the status of the handset's battery charge. | "Phone battery: <x>%", when using battery power; <x> stands for the current charge. "Phone battery charging (<x>%)", when the battery is being charged; <x> stands for the current charge. "Phone battery charged", when the battery is fully charged and plugged in.
Battery (peripheral, when applicable) | Battery status | Shows the status of the second battery, provided by a peripheral. | "Peripheral battery: <x>%", when using battery power; <x> stands for the current charge. "Peripheral battery charging (<x>%)", when the battery is being charged; <x> stands for the current charge. "Peripheral battery charged", when the battery is fully charged and plugged in.
GPS | GPS status | Shows the current status of the GPS. | "GPS status: <x>", where <x> stands for "On", "Off", or "Failure"
GPS | GPS ON/OFF toggle | Allows for turning GPS on and off. | "Turn GPS on", or "Turn GPS off"
GPS | Link to GPS settings | Links to: Settings > Location & Security | "Location settings . . . "
Sync | Sync status | Shows the current status of the Sync. | "Syncing in progress . . . "
Sync | Link to Sync settings | Links to: Settings > Account & Sync | "Sync settings . . . "
Alarms | Integrated into date/time menu | |
No SIM warning | Warning message | A warning message for not finding a SIM card. | "No SIM card, Emergency calls only."
Speakerphone | Speakerphone status | Shows the status of the speakerphone when in use. | "Speaker on"
Roaming | Roaming status | Shows roaming warnings. | "Roaming"
Roaming | Link to Network settings | Links to: Settings > Mobile networks | "Mobile network settings . . . "
TTY | TTY status | Shows the Text Telephone Device or Telecommunication Device for the Deaf (TDD) status. | "TTY enabled"
[0287] Some of the possible status indicators that may be displayed
for the computer system 1104, and the menus or other information
associated with the status indicators, are provided in the table
below:
TABLE-US-00002
Indicator | Menu content | Description | Label
Volume (and silent/vibration) | Silent mode toggle | Turns silent mode on and off. | "Turn Silent mode on", or "Turn Silent mode off"
Volume (and silent/vibration) | Volume slider | Allows for changing the volume level. |
Volume (and silent/vibration) | Link to Sound settings | Links to the sound settings. | "Sound settings . . . "
Monitors | Use Unity menu | For settings label, use "Monitor settings . . . " |
Keyboard | Use Unity menu | For settings label, use "Keyboard settings . . . " |
Time/date | Use Unity menu | |
Time/date | List of alarms | Lists all alarms that have been set up and enabled. Only shown when alarms have been set up. | "<alarm name>: <alarm time> - <repeat days>"; note that the alarm name and repeat days are optional. Examples: 8:30 AM - Mon, Tue, Wed, Thu, Fri; 10:00 AM - Sat, Sun; Groceries: 6:40 PM; Pills: 12:00 PM - Mon, Tue, Wed, Thu, Fri
Time/date | Link to Clock application | Opens the Android Clock application. Only shown when alarms have been set up. | "Manage alarms . . . "
Time/date | Link to date & time settings | Links to: Settings > Date & time settings | "Date & time settings . . . "
Power menu | Airplane mode | Toggles airplane mode on and off. | "Turn Airplane mode on", or "Turn Airplane mode off"
Power menu | Sleep | Puts the system to sleep. | "Sleep"
Power menu | Power off | Powers the system off. | "Power off"
Power menu | Android Settings | Links to the Android Settings application. | "Phone settings . . . "
Power menu | PC Settings | Links to the PC system settings. | "Desktop settings . . . "
[0288] Other possible status indicators are contemplated and
included as one skilled in the art would understand.
[0289] A user interface 1308 with other status indicators is shown
in FIG. 15. User interface 1308 is associated with the device 100.
The user interface 1308 may also include a portion 1500 that
displays status indicators. In embodiments, the portion 1500 may
include the same or different status indicators from portion 1400
and may display the status indicators substantially simultaneously
with portion 1400. For example, portion 1500 may include a Network
Traffic indicator 1412, a signal strength indicator 1416, a battery
indicator 1420, and a time indicator 1434. However, portion 1500
may include an alarm indicator 1504 that is not shown in portion
1400. Thus, the portions 1400 and 1500 can show the same
indicators, different indicators, and, in some embodiments,
indicators associated with only the device 100 or the computer
system 1104.
[0290] An embodiment of a method 1600 for providing status
indicators is shown in FIG. 16. A general order for the steps
of the method 1600 is shown in FIG. 16. Generally, the method 1600
starts with a start operation 1604 and ends with an end operation
1624. The method 1600 can include more or fewer steps or can
arrange the order of the steps differently than those shown in FIG.
16. The method 1600 can be executed as a set of computer-executable
instructions executed by a computer system and encoded or stored on
a computer readable medium. Hereinafter, the method 1600 shall be
explained with reference to the systems, components, modules,
software, data structures, user interfaces, etc. described in
conjunction with FIGS. 1-15.
[0291] A computer system 1104 and a device 100 can be connected,
physically, electrically, wirelessly, etc. The device 100 can
automatically recognize the connection and functionally connect the
device 100 with the computer system 1104 to form a unified system
1100, in step 1608. In other embodiments, a user may provide input
to functionally connect the device 100 with the computer system
1104. In response to the functional connection, the device 100 can
generate a unified desktop 1300, in step 1612. The unified desktop
1300 can expand or create a single user interface for the unified
system 1100 across the screens of the device 100 and the computer
system 1104.
[0292] The device 100 may then determine status indicators that
would need to be displayed for the user, in step 1616. In
embodiments, the device 100 can request or discover the status
indicators associated with the computer system 1104. The status
indicators associated with the computer system 1104 may be
incorporated into a data structure or combined with the status
indicators associated with the device 100. The combined set of
status indicators may then be provided in the unified desktop 1300,
in step 1620. For example, the combined set of status indicators
can be displayed in a portion 1400 of the user interface 1304.
Thus, in a single area of the unified desktop 1300 (and possibly
only on one screen of either the device 100 or the computer system
1104), the device 100 can provide status indicators for both the
device 100 and the computer system 1104.
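Steps 1616 through 1620 can be sketched as merging two indicator lists into the single set shown in portion 1400. The StatusIndicator type, the name-based de-duplication, and the source labels in the following Java sketch are assumptions made for illustration only.

    import java.util.ArrayList;
    import java.util.List;

    // Sketch: combine device indicators with those discovered from the
    // computer system into one set for display in portion 1400.
    class StatusIndicatorAggregator {
        record StatusIndicator(String name, String source) {}

        static List<StatusIndicator> combine(List<StatusIndicator> device,
                                             List<StatusIndicator> computer) {
            List<StatusIndicator> combined = new ArrayList<>(device);
            for (StatusIndicator s : computer) {
                boolean duplicate = combined.stream()
                        .anyMatch(d -> d.name().equals(s.name()));
                if (!duplicate) combined.add(s); // avoid showing, e.g., time twice
            }
            return combined;
        }
    }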
[0293] An embodiment of a method 1700 for providing status
indicators is shown in FIG. 17. A general order for the steps
of the method 1700 is shown in FIG. 17. Generally, the method 1700
starts with a start operation 1704 and ends with an end operation
1724. The method 1700 can include more or fewer steps or can
arrange the order of the steps differently than those shown in FIG.
17. The method 1700 can be executed as a set of computer-executable
instructions executed by a computer system and encoded or stored on
a computer readable medium. Hereinafter, the method 1700 shall be
explained with reference to the systems, components, modules,
software, data structures, user interfaces, etc. described in
conjunction with FIGS. 1-16.
[0294] The processor 204 of the device 100 can receive a selection
of a status indicator, in step 1708. A selection can be a user
interface action conducted on a status indicator. For example, a
user may hover a pointer over status indicator 1420. In another
example, the user may click the pointer on the status indicator
1420 to select it. Regardless, the user conducts a user interface
action on a status indicator to generate an input into the unified
desktop 1300. In response to receiving the selection by the user,
the device 100 can provide information and/or a user interface
device in the unified desktop 1300, in step 1712. Information can
include more detail about a status, as described in the tables
above. For example, upon selecting the status indicator 1420, the
device 100 can provide more detailed information that the battery
charge is at 50%, as shown in information 1438 in the drop down
1425. A user interface device may be a selectable icon or other
user interface display that will allow the user to access further
functionality. For example, the user interface display 1442 in the
drop down 1425 can be selected by the user's pointer.
Selection of the user interface display 1442 allows the user to
access power settings for the device 100. Other menus or functions
can be accessed through the status indicators as shown in the
tables above.
[0295] The processor 204 of the device 100 can receive a selection
of a user interface device, in step 1716. For example, the user may
select user interface display 1442 by clicking on or near the user
interface display 1442. Upon receiving this selection, the device
100 can provide a menu or other functionality associated with the
status indicator, in step 1720. In embodiments, the device 100
presents a menu that can alter or address the status indicated by
the status indicator. The menus or functionality that may be
accessed may be as described in the tables above. For example, by
selecting user interface display 1442, the device may provide a
power settings menu to the user that allows the user to change the
power settings of the device 100. The user may modify a function of
the device 100 or the computer system 1104 by interaction with the
provided menu or functionality. For example, the user may change
the duration of inactivity required before a screen blacks out in
the power settings menu.
[0296] Freeform Window Mode:
[0297] A unified desktop interface 1304 is shown with a window 1800
(showing a personal computing application user interface) and a
freeform window 1804 in FIG. 18. A freeform window 1804 is a shell
that can encapsulate a user interface for a device application. A
device application can be any application that typically executes
on the device 100. Thus, the device application may be configured
to execute in a mobile device environment but not in a personal
computing environment. The device application can be specific to
the device and does not execute on the computer system. For
example, gestures received with the device 100 can affect the
device application but may not be received or understood in the
personal computing environment. Device applications can include a
phone application, a text messaging application, a utilities
application, a game application, or a mobile application. The
device application may be executed by the device 100 receiving a
user selection of the device application. The selection may be a
user selection of a shortcut displayed in the unified desktop. An
example of a shortcut 1812 is shown in the user interface 1304. The
shortcut 1812 allows a device application to be opened in the
personal computing environment like other personal computing
applications.
[0298] The freeform window 1804 allows the device application user
interface to behave as a typical window in a computer system
environment. For example, as shown in FIG. 18, freeform window 1804
allows the user interface 1808 of the device application to display
at least partially over another window 1800. In other embodiments,
the freeform window 1804 allows the user interface 1808 of the
device application to display at least partially behind another
window (not shown). Thus, the freeform window 1804 provides display
functionality to the user interface 1808 that would not normally be
functional for a device application.
[0299] Upon executing the device application, the device
application can generate a user interface 1808 which can be
incorporated into the freeform window 1804. An embodiment of the
freeform window 1804 is shown in FIG. 19. The freeform window 1804
includes a portion 1900 that includes controls 1904-1912. The
controls can cause the user interface 1808 to complete actions
associated with a personal computing environment. For example,
control 1904 can cause the device 100 to close the user interface
1808 and stop execution of the device application associated
therewith. Control 1908 can cause the device 100 to minimize or
shrink the user interface 1808 of the device application. Control
1912 can cause the device 100 to maximize the user interface 1808
of the device application. A user may also use handles or other
controls to expand or contract the freeform window 1804 and thus
the user interface 1808.
[0300] The freeform window 1804 can provide one or more user
interface devices that allow the user interface 1808 to provide
functionality (such as features or controls) in the freeform window
1804. For example, a first user interface device 1916 may be a
control or device to access previous information displayed by the
user interface 1808 of the device application. A second user
interface device 1924 can provide a control to exit or stop
execution of the device application. A third user interface device
1928 can provide a feature to expand the user interface 1808 of the
device application into the entire available space of the freeform
window 1804. A fourth user interface device 1920 may provide a
control to change the user interface 1808 to a second user
interface that displays other information. Other user interface
devices that provide other functionality are contemplated as one
skilled in the art would understand.
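The division of labor between the freeform window's own controls (1904 through 1912) and the user interface devices it hosts (1916 through 1928) can be sketched as a thin shell around a device-application callback interface. Everything in the following Java sketch, including the callback names, is an illustrative assumption rather than the actual implementation.

    // Sketch of a freeform window shell. Control numbers follow FIGS. 18-19;
    // the DeviceAppUi callback interface is invented for illustration.
    class FreeformWindow {
        interface DeviceAppUi {
            void back();   // user interface device 1916: previous information
            void stop();   // user interface device 1924: exit the application
            void expand(); // user interface device 1928: fill the window
        }

        private final DeviceAppUi ui;
        private boolean minimized, maximized;

        FreeformWindow(DeviceAppUi ui) { this.ui = ui; }

        void onControl(int controlId) {
            switch (controlId) {
                case 1904 -> ui.stop();        // close and stop the application
                case 1908 -> minimized = true; // minimize/shrink the UI
                case 1912 -> maximized = true; // maximize the UI
                default -> { }                 // other controls, e.g., handles
            }
        }
    }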
[0301] An embodiment of a method 2000 for providing a freeform
window is shown in FIG. 20. A general order for the steps of
the method 2000 is shown in FIG. 20. Generally, the method 2000
starts with a start operation 2004 and ends with an end operation
2024. The method 2000 can include more or fewer steps or can
arrange the order of the steps differently than those shown in FIG.
20. The method 2000 can be executed as a set of computer-executable
instructions executed by a computer system and encoded or stored on
a computer readable medium. Hereinafter, the method 2000 shall be
explained with reference to the systems, components, modules,
software, data structures, user interfaces, etc. described in
conjunction with FIGS. 1-19.
[0302] A computer system 1104 and a device 100 can be connected,
physically, electrically, wirelessly, etc. The device 100 can
automatically recognize the connection and functionally connect the
device 100 with the computer system 1104 to form a unified system
1100, in step 2008. In other embodiments, a user may provide input
to functionally connect the device 100 with the computer system
1104. In response to the functional connection, the device 100 can
generate a unified desktop 1300, in step 2012. The unified desktop
1300 can expand or create a single user interface for the unified
system 1100 across the screens of the device 100 and the computer
system 1104.
[0303] The device 100 may then execute a device application, in
step 2016. In embodiments, the device 100 may receive a selection
of a device application from the unified desktop 1300. For example,
the user may select a shortcut 1812. Regardless, the device 100 can
execute the selected device application in the computer system
environment. When the device 100 executes the device application, a
freeform window 1804 may be created to encapsulate or display the
user interface 1808 of the device application. Thus, the device 100
displays the user interface 1808 of the device application in the
freeform window 1804, in step 2020.
[0304] An embodiment of a method 2100 for providing device
application functionality in a freeform window is shown in FIG. 21.
A general order for the steps of the method 2100 is shown in FIG.
21. Generally, the method 2100
starts with a start operation 2104 and ends with an end operation
2124. The method 2100 can include more or fewer steps or can
arrange the order of the steps differently than those shown in FIG.
21. The method 2100 can be executed as a set of computer-executable
instructions executed by a computer system and encoded or stored on
a computer readable medium. Hereinafter, the method 2100 shall be
explained with reference to the systems, components, modules,
software, data structures, user interfaces, etc. described in
conjunction with FIGS. 1-20.
[0305] The processor 204 of the device 100 can receive a selection
of a device application, in step 2108. A selection can be a user
interface action conducted on a shortcut 1812 or other user
interface device. Regardless, the user conducts a user interface
action to execute a device application. In response to receiving
the selection by the user, the device 100 can determine any device
application feature, controls, or other functionality to be
provided for the user interface 1808, in step 2112. The features,
controls, etc. can be functions that conduct an action or complete
a process for the device application and may not function in a
typical computer system environment. These functions can include
expanding or contracting the window, providing alternative
displays, entering information, executing sub-processes, etc. Some
possible features or controls are as discussed with FIG. 19.
[0306] The processor 204 of the device 100 can provide a freeform
window 1804 for the user interface 1808 of the device application,
in step 2116. The freeform window 1804 can be as described in FIGS.
18 and 19. As part of the freeform window 1804, the device may
provide one or more user interface devices that enable the features
or controls determined by the device 100, in step 2120. The user
interface devices may be as described in conjunction with FIG.
19.
[0307] Wake and Unlock:
[0308] FIGS. 22 through 28 show a unified desktop interface 1304
while in sleep mode or in a state after waking or unlocking from
sleep mode. Sleep mode is a mode where the user interface
displays are inactive or "blackened". As shown in FIG. 22, user
interface 2204 and user interface 2208 do not show any content
because the system 2200 is in "sleep mode." The sleep state or
sleep mode can include different types of states, including a
simple sleep state, a screen lock state, and/or a passcode lock
state, as described herein. To leave sleep mode and continue to use
the system 2200, input may be received from a user at the keyboard
2212, a user interface device 2216, or by other
means (e.g., mouse, power button, etc.). If such input is received,
the system 2200 can change states of a user interface for the
unified desktop during wake, unlock, passcode unlock, call
received, and other states.
[0309] A table 2900 containing at least some of the possible sleep
states and how to interact with the system 2200 in the state is
shown in FIG. 29. In embodiments, there may be three states 2904, a
simple "sleep" state represented in column 2908, a "screen lock"
state represented in column 2912, and/or a "passcode lock" state
represented in column 2916. In embodiments, there may be more or
fewer states. A sleep state may be chosen by a user and effected
when the device 100 receives a selection of the sleep state from
the user.
[0310] The table 2900 shows rules that govern the sleep state and
how to wake from the selected sleep state. In a sleep state, the
user interface 1304 is off without anything being displayed and the
wake action is any interaction with the user interfaces of the
unified desktop. In a screen lock state, the user interface 1304 is
off and the user interface of at least the device 100 requires a
specified and predetermined user interaction (e.g., moving a bar on
the user interface) to unlock the state. Generally, the bar or
other user interface device that receives the specified user
interaction is presented to the user after the user provides an
initial user interface interaction to the unified desktop. In a
passcode state, the system 2200 requires the entry of a passcode
(e.g., a numeric code, a visual code, etc.) to exit the state. As
with the screen lock state, a passcode user interface device that
receives the passcode is presented to the user after the user
provides an initial user interface interaction to the unified
desktop. Thus, all the states appear as shown in FIG. 22 but
require different input to exit the state.
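For illustration, the state-to-input mapping that table 2900 tabulates might be captured in code roughly as follows. This is an informal paraphrase of paragraphs [0309] and [0310], not a transcription of the figure; the enum values and method name are invented for the example.

    enum SleepState { SLEEP, SCREEN_LOCK, PASSCODE_LOCK }   // columns 2908, 2912, 2916
    enum WakeEvent { WAKE_ACTION, PHONE_CALL }              // rows 2924, 2928

    final class WakeRules {
        // The input required to exit each state; the phone-call row
        // (2928) may add call-specific input, treated the same here.
        static String requiredInput(SleepState state, WakeEvent event) {
            switch (state) {
                case SLEEP:
                    return "any interaction with the user interfaces of the unified desktop";
                case SCREEN_LOCK:
                    return "a specified interaction (e.g., moving a bar) after an initial interaction";
                case PASSCODE_LOCK:
                    return "entry of the passcode after an initial interaction";
                default:
                    throw new IllegalArgumentException("unknown state: " + state);
            }
        }
    }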
[0311] At least two events 2920 can cause the system 2200 to leave
the states mentioned above. The first event may be a wake event
represented in row 2924. The second event may be a phone call
represented in row 2928. When these events occur, with the correct
user input shown in the body of table 2900, the system 2200 may
exit from the state and allow the user to use or interact with the
system.
[0312] An embodiment of a method 3000 for waking the system 2200 is
shown in FIG. 30. Generally, the method 3000 starts with a
start operation 3004 and ends with an end operation 3032. The
method 3000 can include more or fewer steps or can arrange the
order of the steps differently than those shown in FIG. 30. The
method 3000 can be executed as a set of computer-executable
instructions executed by a computer system and encoded or stored on
a computer readable medium. Hereinafter, the method 3000 shall be
explained with reference to the systems, components, modules,
software, data structures, user interfaces, etc. described in
conjunction with FIGS. 1-29.
[0313] A system 2200 may sleep, in step 3008. The sleep state may
include the states as described in conjunction with FIG. 29 or
other states that render the system 2200 inactive. The system 2200
may appear as shown in FIG. 22, wherein the user interface 2204 for
the computer system 1104 and the user interface 2208 for the device
100 are inactive and do not display anything. The sleep state may
be entered after the expiration of a period of time, automatically
in response to an action or event, or in response to user
interaction. Thus, the sleep state can render the system 2200
unusable until it is awakened or unlocked.
[0314] The device 100 or the computer system 1104 may receive an
event in step 3012. An event may be as described in conjunction
with FIG. 29. For example, the event may be a wake action received
by a user. In alternative embodiments, the event may be a received
phone call. Either event may be received with the correct user
input to wake the system 2200. The correct user input is based on
the state in which the device currently resides.
[0315] Thus, the system 2200 may determine the state of the system,
in step 3016. The state may be set by the user or be automatically
set. For example, a user may determine that a passcode lock is
necessary and enter the passcode when configuring the setting.
However, if the
user makes no change to the desired sleep state, the device 100 may
determine to use the sleep state 2908. Regardless, the system 2200
determines which state has been set when the device sleeps.
[0316] The system 2200 may then determine if the correct user
interface input, associated with the determined state, has been
entered to wake the system 2200, in step 3020. As shown in table
2900, the required input differs based on the event and the state. If
the system 2200 is in a simple sleep state 2908, then any input to
the device 100 or computer system 1104 may wake the system 2200.
Thus, the user may hit a key on the keyboard 2212, push a user
interface device 2216 on the device 100, touch the touch sensitive
display 110, 114, the gesture capture area 120, 124, or another
touch-sensitive area on the device 100, or make some other input
into the system 2200. In the screen lock state 2912, the device 100
may require the user to complete some form of user input (e.g.,
moving a slider bar or other orchestrated movement on the touch
sensitive display 110, 114), either in addition to a first
interaction or by itself, to unlock the device 100. The computer
system 1104 may not require any additional input. Thus, a first
gesture may wake the computer system 1104, while a second gesture
is needed to unlock the device interface 2208. If the correct input
is received, the method 3000 proceeds YES to step 3028.
[0317] However, if the incorrect input is received, the method 3000
proceeds NO to step 3024, where the system 2200 may inform the user
of the incorrect input. Thus, a user interface may be provided that
indicates that the input was incorrect, for example, a pop-up
message that states "The password or username is incorrect." In
other embodiments, the system 2200 may revert to the locked state
without unlocking the device, which can indicate to the user that
the input was incorrect. In step 3028, the system 2200 may wake.
Thus, the system 2200 can transition, for example, from the display
shown in FIG. 27 to that shown in FIG. 28. The user may then have
complete access to the system 2200 and view the desktop or open
applications after waking the system 2200.
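A compact sketch of the decision logic of steps 3016 through 3028 follows, reusing the SleepState enum from the previous sketch. The input representation, stored passcode, and helper names are assumptions made for illustration only.

    final class WakeController {
        private final SleepState state;          // determined in step 3016
        private final String storedPasscode;

        WakeController(SleepState state, String storedPasscode) {
            this.state = state;
            this.storedPasscode = storedPasscode;
        }

        // Steps 3020-3028: validate the input against the state, then
        // wake the system or inform the user of the incorrect input.
        void onInput(String input) {
            if (inputSatisfiesState(input)) {
                wake();                          // step 3028: e.g., FIG. 27 to FIG. 28
            } else {
                informIncorrectInput();          // step 3024: e.g., a pop-up message
            }
        }

        private boolean inputSatisfiesState(String input) {   // step 3020
            switch (state) {
                case SLEEP:         return true;              // any input wakes the system
                case SCREEN_LOCK:   return "slider-gesture".equals(input);
                case PASSCODE_LOCK: return storedPasscode.equals(input);
                default:            return false;
            }
        }

        private void wake() { /* activate user interfaces 2204 and 2208 */ }
        private void informIncorrectInput() { /* or revert to the locked state */ }
    }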
[0318] Docking and Undocking
[0319] An embodiment of a method 3400 for docking the device 100
with the computer system 1104 to form a unified system 3100 is
shown in FIG. 34. Generally, the method 3400 starts with a
start operation 3404 and ends with an end operation 3432. The
method 3400 can include more or fewer steps or can arrange the
order of the steps differently than those shown in FIG. 34. The
method 3400 can be executed as a set of computer-executable
instructions executed by a computer system and encoded or stored on
a computer readable medium. Hereinafter, the method 3400 shall be
explained with reference to the systems, components, modules,
software, data structures, user interfaces, etc. described in
conjunction with FIGS. 1-33 and FIG. 38. In particular, FIGS. 31-33
show embodiments of user interfaces associated with the process of
docking.
[0320] A device 100 and a computer system 1104 may be provided, in
step 3408. The device 100 may be a mobile device as described
herein. In embodiments, the device 100 may display one or more user
interfaces 3108 and/or 3112 before docking occurs. The computer
system 1104 can include a display 1304 that presents a user
interface 3104. The user interface 3104 can include any window or
other display. In embodiments, the user interface 3104 shows a
slide show in FIG. 31 before docking. After presenting the device
100 and the computer system 1104, the device 100 may dock with the
computer system 1104, in step 3412.
[0321] Docking the device 100 may include electrically connecting
the device with the computer system 1104. The electrical connection
may be made with a docking cradle, a wire interface, a wireless
interface, or by another device or connection. Once docked, the
computer system 1104 may be controlled or managed by the device
100. Thus, actions on the computer system 1104 may be handled by
the device 100. Thus, docking the device 100 creates the unified
system and presents the unified interface for the computer system
1104 and the device 100.
[0322] The behavior of the unified system 3200, as shown in FIG. 32
after docking, may be governed by a set of docking rules, as
described in conjunction with FIG. 38. In embodiments, the computer
system display 3204 may hide one or more displays or windows
presented before docking. Rather, the computer system 1104 may
present a desktop 3208 after docking, in step 3416. The desktop
3208 can display a unified desktop theme with icons or other user
interface devices (for example, icon 3212) providing access to
unified system 3200 functionality. Thus, the computer system 1104
hides pre-existing windows or displays to provide the unified
desktop 3208.
[0323] After docking, the device software (as described in FIGS. 5A
and 5B) may determine if any application displays or windows were
displayed on the device 100 before docking, in step 3420. The
determination may include the processor 504 checking the display
configuration in the frame buffers 548 or other type of check. If
there were displays presented on the device 100 before docking, the
method 3400 may proceed YES to step 3428. If there were no displays
presented on the device 100 before docking, the method 3400 may
proceed NO to step 3424 where the unified desktop is also provided
on the display(s) of the device 100, as seen in FIG. 33.
[0324] If there were displays (e.g., display 3108 and/or display
3112) presented on the device 100 before docking, the displays 3108
and/or 3112 may be migrated from the device interface 3304 to the
computer system display 3204, as shown in FIG. 33. In embodiments,
new freeform windows or big brother applications are invoked to
migrate the displays. Thus, a new instance of the application
window or interface is opened in the display of the computer
system. In other embodiments, the display buffer is simply changed
to reflect the migration. Thus, displays on the device 100 preempt
any displays on the computer system 1104 after docking.
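The branch of steps 3416 through 3428 can be sketched as follows. The Display interface and its methods are hypothetical names standing in for the frame-buffer check and window migration described above.

    import java.util.List;

    interface Display {
        void hideAllWindows();                   // hide pre-dock windows or displays
        void showUnifiedDesktop();
        void openNewInstance(String windowId);   // a new instance produces the apparent "move"
    }

    final class DockHandler {
        // Steps 3416-3428: present the unified desktop on the computer
        // system, then either extend it to the device (no device windows)
        // or migrate the device's visible windows to the computer display.
        void onDock(List<String> deviceWindows, Display pc, Display device) {
            pc.hideAllWindows();
            pc.showUnifiedDesktop();                 // step 3416
            if (deviceWindows.isEmpty()) {
                device.showUnifiedDesktop();         // step 3424
            } else {
                for (String id : deviceWindows) {    // step 3428
                    pc.openNewInstance(id);
                }
            }
        }
    }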
[0325] An embodiment of a method 3700 for undocking the device 100
from the computer system 1104 is shown in FIG. 37.
Generally, the method 3700 starts with a start operation 3704 and
ends with an end operation 3728. The method 3700 can include more
or fewer steps or can arrange the order of the steps differently
than those shown in FIG. 37. The method 3700 can be executed as a
set of computer-executable instructions executed by a computer
system and encoded or stored on a computer readable medium.
Hereinafter, the method 3700 shall be explained with reference to
the systems, components, modules, software, data structures, user
interfaces, etc. described in conjunction with FIGS. 1-36 and FIG.
38. In particular, FIGS. 35 and 36 show embodiments of user
interfaces associated with the process of undocking.
[0326] A device 100 and a computer system 1104 may be docked. The
device 100 may be a mobile device as described herein. In
embodiments, the device 100 may display one or more user interfaces
3504 before undocking occurs. The computer system 1104 can include
a display 1304 that presents a display 3508 before undocking.
Thereafter, the device may be undocked from the computer system
1104, in step 3708.
[0327] Undocking the device 100 may include electrically
disconnecting the device from the computer system 1104. The
electrical connection may be made with a docking cradle, a wire
interface, a wireless interface, or by another device or connection.
To disconnect the device 100, the device 100 may be separated from
the docking cradle or wire interface, or the wireless interface may
be disconnected. Once undocked, the computer system 1104 may no
longer be controlled or managed by the device 100. Thus, the
computer system 1104 may resume control of its own actions. Thus,
undocking the device 100 dismantles the unified system and presents
separate interfaces for the computer system 1104 and the device
100.
[0328] The behavior of the unified system 3200, as shown in FIG. 36
after undocking, may be governed by a set of undocking rules, as
described in conjunction with FIG. 38. In embodiments, the computer
system display 3204 may hide the unified desktop, in step 3712, and
display a window or other display 3604 that was hidden during
docking, as shown in FIG. 36. After undocking, the device software
(as described in FIGS. 5A and 5B) may determine if any application
displays or windows were displayed on the device 100 before
undocking, in step 3716. The determination may include the
processor 504 checking the display configuration in the frame
buffers 548 or other type of check. If there were displays
presented on the device 100 before undocking, the method 3700 may
proceed YES to step 3720. If there were no displays presented on
the device 100 before undocking, the method 3700 may proceed NO to
step 3724 where a device desktop is provided on the display(s) of
the device 100.
[0329] If there were displays (e.g., display 3504) presented on
the device 100 before undocking, the display(s) 3504 may be
maintained on the device interface, as shown in FIG. 35. However,
no windows or displays are migrated from the computer system 1104
to the device 100. Thus, the device 100 maintains its display
characteristics after undocking.
[0330] A table 3800 containing at least some of the docking and
undocking rules and how to interact with the system 2200 during
docking and undocking is shown in FIG. 38. In embodiments, there are
two sets of rules 3804--one set for docking represented in column
3808 and another for undocking represented in column 3812. In
embodiments, there may be more or fewer rules. Further, the docking
rules may include a rule governing visible-to-visible transitions
represented in column 3816 and a stickiness rule represented in
column 3820. The table 3800 shows rules that govern the components
(represented by column 3824) of the unified system. Thus, the rules
govern how the displays of the device 100 and computer system 1104
are managed during docking and undocking. The device interfaces are
represented in row 3828 while the computer system interfaces are
represented in row 3832.
[0331] During docking, the user interface 1304 may display
interfaces or a desktop for the computer system. However, upon
docking, the interfaces and computer system desktop are hidden
according to the visible-to-visible rules for the computer system.
Further, if there was a previous docking and undocking, any
application interfaces previously displayed during a past docking
are re-displayed according to the stickiness rules for the computer
system. It should be noted that the device or computer system may
store an indicator of the previously displayed interfaces at the
last docking and access the indicators to reinstate the interfaces
upon re-docking. The device, during docking, will have interfaces
displayed on the device moved to the display of the computer system
according to the visible-to-visible rule.
[0332] During undocking, any interfaces displayed on the device
remain displayed on the device according to the undocking rules for
the device. For the computer system, applications displayed in the
unified desktop are hidden but available if the device is
re-docked. The computer system may then display other interfaces
associated with the computer system or the computer system desktop,
according to the undocking rules for the computer system.
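The rules of table 3800, as paraphrased in the two paragraphs above, might be encoded as follows. The enum values and rule text are an informal restatement for illustration, not the wording of the figure.

    enum Component { DEVICE, COMPUTER_SYSTEM }   // rows 3828 and 3832
    enum Transition { DOCK, UNDOCK }             // columns 3808 and 3812

    final class DockingRules {
        // Informal restatement of table 3800 (paragraphs [0331]-[0332]).
        static String rule(Component c, Transition t) {
            if (t == Transition.DOCK) {
                return (c == Component.DEVICE)
                    ? "visible-to-visible: move interfaces displayed on the device to the computer display"
                    : "hide the computer desktop and interfaces; stickiness: re-display interfaces from the last docking";
            }
            return (c == Component.DEVICE)
                ? "interfaces displayed on the device remain displayed on the device"
                : "hide unified desktop applications (available on re-dock); restore the computer system desktop";
        }
    }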
[0333] An embodiment of a method 4800 for applying a
visible-to-visible rule during docking of a device is shown in FIG.
48. Generally, the method 4800 starts with a start
operation 4804 and ends with an end operation 4840. The method 4800
can include more or fewer steps or can arrange the order of the
steps differently than those shown in FIG. 48. The method 4800 can
be executed as a set of computer-executable instructions executed
by a computer system and encoded or stored on a computer readable
medium. Hereinafter, the method 4800 shall be explained with
reference to the systems, components, modules, software, data
structures, user interfaces, etc. described in conjunction with
FIGS. 1-47.
[0334] Device 100 docks with a computer system (PC) 1304. As shown
in FIG. 39, the device can include two displays 3908 and 3912. The
PC can include one display 3904. When docked, the displays 3904,
3908, and/or 3912 can display a unified desktop, which may display
one or more application windows. As explained in conjunction with
FIG. 38, the unified system maintains visible windows from the
device 100 on the unified desktop. Before docking, the PC 1304 and
device 100 may appear as shown in FIG. 40. As shown in FIG. 40, the
device 100 may display a first window 3108 and a second window
3112. The PC 1304 may display a first window 4004 and a second
window 4008. Further, the device 100 may display an application
manager interface 4012 that displays windows (e.g., 4020) on the
first display 3908 of the device 100 and an application manager
interface 4016 that displays windows on the second display 3912 of
the device 100. The application manager interfaces 4012, 4016 can
also display windows (e.g., 4024) that are not currently displayed
on the displays 3908, 3912.
[0335] A representation of a display buffer or data structure 4100
associated with the display 3904 is shown in FIG. 41A. The data
structure 4100 can store information about which windows are
displayed before docking. For example, data object 4104 is
associated with window 4004, and a second data object is associated
with window 4008. Further, a data object 4112 may be associated with a
desktop displayed on the display 3904. A window stack 4116 can
represent the windows open, active, or inactive on the displays
3908, 3912 before docking the device 100. For example, data object
4124 can represent window 3108, data object 4120 can represent
window 3112, and data objects 4128, 4132 can represent the desktop
on the device 100.
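For illustration, the display buffer 4100 and the window stack 4116 might be represented by data structures along the following lines; the field and type names are assumptions.

    import java.util.ArrayDeque;
    import java.util.Deque;

    // One entry per window or desktop, e.g., data object 4104 for window 4004.
    final class WindowRecord {
        final int objectId;
        final String label;                      // e.g., "window 4004" or "desktop"
        WindowRecord(int objectId, String label) {
            this.objectId = objectId;
            this.label = label;
        }
    }

    final class DisplayState {
        // Display buffer 4100: what the PC display 3904 shows before docking.
        final Deque<WindowRecord> displayBuffer = new ArrayDeque<>();
        // Window stack 4116: windows open, active, or inactive on the
        // device displays 3908 and 3912 before docking, topmost first.
        final Deque<WindowRecord> windowStack = new ArrayDeque<>();
    }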
[0336] At some time thereafter, the device 100 is docked with the
PC 1304 to create the unified desktop 4200, as shown in FIG. 42. In
response to docking the device 100, the windows 4004, 4008
previously displayed on the PC 1304 can be hidden or closed, in
step 4808. The desktop 4204 may then be provided on the PC display
3904, in step 4812. A processor on the device 100 can then determine
if there were any windows active and displayed on the device 100
before docking, in step 4816. The processor can scan the window
stack 4116 to determine that windows 3108 and 3112 were open at the
time the device 100 was docked.
[0337] After determining the windows that were open on the device
100 before docking, the processor can instruct the PC 1304 to
create windows that are similar to or associated with the programs
or applications that were being executed on the device 100. For
example, if an Internet Browser was being executed on the device
100, a browser application can be opened on the PC 1304. After
creating the windows, the active windows 3108, 3112 are moved to
the created windows. In embodiments, state information for the
active windows 3108, 3112 can be used to provide a new instance of
the window on the PC 1304. Thus, while, to the user, it appears the
window is moved, a new window is actually instantiated. As shown in
FIG. 42, windows 3108, 3112 are now open in display 3904. The
unified desktop 4204, 4208 may be shown on the PC 1304 and the
device 100, in step 4828.
[0338] In response to the movement of the windows after docking,
the processor modifies the window stack, in step 4832. For example,
the processor removes the windows from the window stack 4308, as
shown in FIG. 43B. Further, the window stack 4308 is changed to
reflect that the unified desktop 4312 is displayed on the device
display 3912, 3908. Further, the processor modifies the computer
system display buffer, in step 4836. For example, the display
buffer 4300 is changed to reflect that windows 4124, 4120 are open
on the PC display 3904. The unified desktop 4304 may also be
provided on the display 3904, as shown in the window stack.
[0339] While docked, a window 3108 may be moved from the PC display
3904 to the device display 3908, as shown in FIG. 44. The display
buffer 4500 can be changed to reflect the move, as shown in FIG.
45A. Thus, the window 4124 is no longer shown in the display buffer
4500 but is shown in the window stack 4504 in FIG. 45B. Upon
undocking, the window 3108 remains visible on the device 100. The
other window 3112 is no longer shown. The display buffer 4700 for
the PC 1304 can now display the PC desktop 4110 but no other
windows. The window stack 4704 may remain unchanged, as shown in
FIG. 47B.
[0340] An embodiment of a method 5400 for maintaining window
stickiness during a re-dock of a device is shown in FIG. 54.
Generally, the method 5400 starts with a start operation 5404
and ends with an end operation 5432. The method 5400 can include
more or fewer steps or can arrange the order of the steps
differently than those shown in FIG. 54. The method 5400 can be
executed as a set of computer-executable instructions executed by a
computer system and encoded or stored on a computer readable
medium. Hereinafter, the method 5400 shall be explained with
reference to the systems, components, modules, software, data
structures, user interfaces, etc. described in conjunction with
FIGS. 1-53.
[0341] Device 100 docks with a computer system (PC) 1304, in step
5408. The docking is as described herein. During the docking, at
least one window is opened on the unified desktop of the computing
system 1304, in step 5412. For example, at least one of window 4904
or window 4908 is opened during the docked session. The docking of
the unified system is represented by line 4912. In embodiments, a
desktop 4916 is shown on the device 100. However, the device 100
might also have one or more open windows displayed on the device
100.
[0342] At some time thereafter, the device 100 is undocked from
the PC 1304, in step 5416. The undocked device 100 and computer
system 1104 are shown in FIG. 50 as system 5000. The undocked
system may be represented by the absence of line 4912. Upon the
occurrence of the undock event, recognized by the device 100 and/or
the computer system 1104, the device 100 and/or the computer system
1104 may save a data structure 5300, in step 5420. The data
structure 5300 is as shown in FIG. 53.
[0343] The data structure 5300 may include one or more data fields
and may be stored as any type of data structure as described
herein. In embodiments, the data structure 5300 can include a
timestamp 5304 of the last docking, an identifier (ID) for a window
5308, and/or status and/or state information 5312. The data
structure 5300 can include more or fewer data items than that shown
in FIG. 53 as represented by ellipses 5324. In embodiments, the
information contained in data structure 5300 is associated with all
windows that were open on the computer system 1104 before the last
undocking. For example, the data structure 5300 may include
information about window 4904 and window 4908 as shown in FIG.
49.
[0344] Further, other data fields, such as window 2 ID 5316 and
state information 5320, may represent window 2 4908, which was open
on the computer system 1104 before undocking. There may be more or
fewer data fields for more or fewer windows that may have been open
on the computer system 1104 before undocking, as represented by
ellipses 5328. This data structure 5300 can be stored with the
device 100 and/or the computer system 1104. The undocked system
5100, as shown in FIG. 51, may have one or more actions occur on
the computer system 1104 or on the device 100 while undocked. For
example, window 3 5104 may be opened on the device 100 while the
device 100 is undocked from the unified system.
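A plausible in-code shape for the data structure 5300 is sketched below. Beyond the timestamp and the per-window identifier and state fields named in paragraphs [0343] and [0344], everything here is an assumption.

    import java.util.LinkedHashMap;
    import java.util.Map;

    final class StickyWindowRecord {
        final long lastDockTimestamp;            // field 5304

        // One entry per window open at the last undocking:
        // key = window ID (e.g., fields 5308, 5316),
        // value = status/state information (e.g., fields 5312, 5320).
        final Map<Integer, String> windowState = new LinkedHashMap<>();

        StickyWindowRecord(long lastDockTimestamp) {
            this.lastDockTimestamp = lastDockTimestamp;
        }

        void remember(int windowId, String state) {
            windowState.put(windowId, state);
        }
    }

On re-dock, the saved record would be read back and each remembered window reopened, which is what makes the windows of the last docked session "sticky."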
[0345] At some time thereafter, the device 100 may be re-docked
with the computer system 1104 to recreate the unified system 5200,
as shown in FIG. 52, in step 5424. Upon the occurrence of a
re-dock, the computer system 1104 and/or the device 100 may
recognize the re-dock event and may access the data structure 5300.
Information from the data structure 5300 may be read to reopen one
or more windows, in step 5428. For example, as shown in FIG. 52,
window 1 4904 and window 2 4908 may be reopened as having been open
during the last docked session. Thus, the open windows at undocking
are "sticky"; that is, they re-open the next time the device 100
docks with the computer system 1104. Further, windows that were open
on the device 100 at undocking may be moved to the computer system 1104,
as represented by window 3 5104 being displayed on the computer
system display 1308. If a window is moved from the device 100 to
the computer system 1104, the desktop 5204 may be displayed on the
device 100.
[0346] Embodiments of systems 5504/5520 for creating a unified
system for the device 100 and PC 1304 are shown in FIGS. 55A and
55B. The software components or modules that provide for the
unified system on a computer system are shown in FIG. 55B. The
systems 5504 and/or 5520 for the device 100 and the PC 1304 may be
stored and executed in hardware as described herein. The software
modules can include a first operating system 5508 and a second
operating system 5512. The two operating systems 5508 and 5512 may
interact to create and manage the unified system. In embodiments,
the second operating system 5512 may control the functions of the
device 100. The first operating system 5508 may control or direct
the operations of the computer system 1104. Thus, the first
operating system 5508 may communicate with the computer system
interface 5516 that sends signals to the computing system 1304
through the docking hardware. Embodiments of the dual operating
system are described in U.S. Provisional Patent Applications
61/507,199, filed Jul. 13, 2011, entitled "Dockable Mobile Software
Architecture," 61/507,201, filed Jul. 13, 2011, entitled
"Cross-environment communication framework," 61/507,203, filed Jul.
13, 2011, entitled "Multi-operating system," 61/507,206, filed Jul.
13, 2011, entitled "Auto-configuration of a docked system in a
multi-OS environment," and 61/507,209, filed Jul. 13, 2011,
entitled "Auto-waking of a suspended secondary OS in a dockable
system".
[0347] The modules on the computer system 5520 may be installed or
stored upon the first docking of the device 100 to the computer
system 1304. The modules can include a device interface 5524 that
communicates with the computer interface 5516. Thus, the device
interface 5524 can receive signals from the first operating system
5508 and may send signals or events to the first operating system
5508. The device interface 5524 can communicate with an
application-programming interface 5528. In turn, the
application-programming interface (API) 5528 can communicate with
the operating system 5532 for the computer system. The API 5528 can
act as an intermediary that both controls and directs the computing
system OS 5532 or changes the operation thereof. Thus, the API 5528
can both subordinate normal computer system events for the PC 1304
and promote the events or signals sent from the device 100.
[0348] In embodiments, the API 5528 may include one or more
modules. For example, the API 5528 can include an interceptor
module 5536, a relay module 5540, an injector module 5544 and/or a
receiver module 5548. The interceptor module 5536 may be operable
to intercept events or processor executions that are put on the
stack for the computer system processor. Thus, the interceptor 5536
can erase, delete, or change the stack for the PC 1304, thus
controlling what actions are conducted by the PC 1304. Any events
that occur on the PC 1304 that are placed into the stack may be
intercepted by the interceptor 5536 and provided to the relay 5540,
which may then relay the event through the device interface 5524 to
the first operating system 5508. The information sent from the
relay 5540 allows the first operating system 5508 to respond to the
event(s) for the PC 1304.
[0349] Likewise, signals from the OS 5508 to the PC 1304 may be
received by a receiver 5548. When the first operating system 5508
wants to control or have the personal computer 1304 conduct some
action, the first operating system 5508 may send a signal through
the PC interface 5516 to the receiver 5548. The receiver 5548 may
then pass the signal onto the injector 5544, which may place the
event or instruction into the stack for the computer operating
system 5532. Thus, the injector 5544 communicates signals to the
computer system OS 5532 to control its actions.
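Assuming each module of the API 5528 can be reduced to a single responsibility, the four modules and their wiring might be sketched as follows; all type names are invented for this illustration.

    final class PcEvent  { final String payload; PcEvent(String p)  { payload = p; } }
    final class OsSignal { final String payload; OsSignal(String p) { payload = p; } }

    interface Interceptor { PcEvent intercept(); }       // module 5536: pull events off the PC stack
    interface Relay       { void relay(PcEvent e); }     // module 5540: forward to the first OS 5508
    interface Receiver    { void receive(OsSignal s); }  // module 5548: accept signals from OS 5508
    interface Injector    { void inject(OsSignal s); }   // module 5544: push into the stack of PC OS 5532

    // Subordinates PC events (intercept -> relay) and promotes device
    // signals (receive -> inject), per paragraphs [0348]-[0349].
    final class UnifiedSystemApi {
        private final Interceptor interceptor;
        private final Relay relay;
        private final Injector injector;

        UnifiedSystemApi(Interceptor interceptor, Relay relay, Injector injector) {
            this.interceptor = interceptor;
            this.relay = relay;
            this.injector = injector;
        }

        void pumpPcEvent() { relay.relay(interceptor.intercept()); }   // PC -> device
        void onSignal(OsSignal s) { injector.inject(s); }              // device -> PC, via the receiver
    }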
[0350] An embodiment of the method 5600 for receiving an event for
the unified desktop at the device 100 is shown in FIG. 56.
Generally, the method 5600 starts with a start operation 5604 and
ends with an end operation 5636. The method 5600 can include more
or fewer steps or can arrange the order of the steps differently
than those shown in FIG. 56. The method 5600 can be executed as a
set of computer-executable instructions executed by a computer
system and encoded or stored on a computer readable medium.
Hereinafter, the method 5600 shall be explained with reference to
the systems, components, modules, software, data structures, user
interfaces, etc. described in conjunction with FIGS. 1-55B. The
process 5600 may be described particularly in conjunction with
FIGS. 55A and 55B.
[0351] The operating system 5512 may receive an event at the device
100, in step 5608. An event may be any type of user interface input
or other occurrence that may require attention from the device 100.
An event may require action by the device or some type of response.
Once an event is received, the second operating system 5512 can
determine if the event is associated with the device 100, in step
5612. For example, if the event is a user interface input that
requires user interface output at the device 100, then the event is
related to the device 100. However, if the event is related to
output on the personal computer 1304, the second operating system
5512 may determine that the event is related to the personal
computer 1304. If the event is related to the device 100, the
method 5600 proceeds YES to step 5616. In contrast, if the event is
not associated with the device 100, the method 5600 proceeds NO to
step 5624.
[0352] In step 5616, the second operating system 5512 can process
the event for the device 100. The processing of an event by an
operating system may be as understood in the art. Thus, the
processing of the event may involve user interface output or
completing an instruction for an application. If the processing
requires output, the output may be displayed on the interface of
the device 100, in step 5620.
[0353] If the event is related to the computer system 1104, the
second operating system 5512 may send the event to the first operating
system 5508, in step 5624. The first operating system 5508 may
process events similarly to the second operating system 5512 but
may process the events for the personal computer 1304. Thus, the
processing may include one or more steps or actions that may be
unique to the computer system 1104 as opposed to the device 100.
Thus, the operating system 5508 processes the event for the
computer system 1104, in step 5628, and sends the processed or
executed actions through the computing system interface 5516 to the
device interface 5524. The instructions or completed processing
actions may be received by the receiver 5548 and sent to the
injector 5544. The injector 5544 may send the completed
instructions or actions to the personal computer operating system
(OS) 5532 to create a display or some other output. The personal
computer OS 5532 can then display the output, in step 5632. In this
way, the device 100 can control the functions of both the device
100 and the personal computer 1304 for an event received at the
device 100, whether the event applies to the device 100 or to the
personal computer 1304.
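The routing of steps 5608 through 5632 reduces to a simple dispatch, sketched below with hypothetical types; the forwarding call stands in for the hop across the interfaces 5516 and 5524 described above.

    final class Event {
        final boolean targetsDevice;
        final String payload;
        Event(boolean targetsDevice, String payload) {
            this.targetsDevice = targetsDevice;
            this.payload = payload;
        }
    }

    final class EventRouter {
        // Step 5612: route the event to the operating system that owns its output.
        void onEvent(Event e) {                  // step 5608
            if (e.targetsDevice) {
                processOnDevice(e);              // steps 5616-5620 (second OS 5512)
            } else {
                forwardToFirstOs(e);             // steps 5624-5632 (first OS 5508)
            }
        }

        private void processOnDevice(Event e)  { /* display output on the device 100 */ }
        private void forwardToFirstOs(Event e) { /* send via interfaces 5516/5524 */ }
    }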
[0354] An embodiment of a method 5700 for processing an event
received on a personal computer 1304 is shown in FIG. 57.
Generally, the method 5700 starts with a start operation 5704 and
ends with an end operation 5720. The method 5700 can include more
or fewer steps or can arrange the order of the steps differently
than those shown in FIG. 57. The method 5700 can be executed as a
set of computer-executable instructions executed by a computer
system and encoded or stored on a computer readable medium.
Hereinafter, the method 5700 shall be explained with reference to
the systems, components, modules, software, data structures, user
interfaces, etc. described in conjunction with FIGS. 1-56.
[0355] An event may be received on the personal computer 1304, in
step 5708. The event may be as described previously in conjunction
with FIG. 56. The event may be received in the user interface of
the personal computer 1304. Upon receiving the event, the
interceptor 5536 may intercept event information from the memory
stack of the PC OS 5532 and provide that information to the relay
5540. The relay 5540 may then package the information to send to
the device, in step 5712. The information may be sent in a message
provided by the device interface 5524 to the computer interface
5516, arriving at the first OS 5508. The first OS 5508 may then
process the event, in step 5716. The processing of the event may be
the same or similar as that described in conjunction with FIG. 56.
Further, the output provided by the first OS 5508 may then be sent
back to the PC OS 5532 as described in conjunction with FIG.
56.
[0356] An embodiment of the method 5800 for processing an event
that may cross the personal computer 1304 and device 100 is shown
in FIG. 58. Generally, the method 5800 starts with a start
operation 5804 and ends with an end operation 5820. The method 5800
can include more or fewer steps or can arrange the order of the
steps differently than those shown in FIG. 58. The method 5800 can
be executed as a set of computer-executable instructions executed
by a computer system and encoded or stored on a computer readable
medium. Hereinafter, the method 5800 shall be explained with
reference to the systems, components, modules, software, data
structures, user interfaces, etc. described in conjunction with
FIGS. 1-57.
[0357] An event may be received that crosses both the personal
computer 1304 and the device 100, in step 5808. For example, a user
may desire to move a window from the device interface to the
computer interface by dragging that window from one interface to
the other. This type of event would cross the boundary of the
personal computer 1304 and device 100 and thus create an event that
may occur in both the device 100 and the personal computer
1304.
[0358] The event may be received in the second OS 5512. Further,
the interceptor 5536 may receive or intercept the event in the
personal computer OS 5532. The intercepted event may be sent
through the relay 5540 to the first OS 5508, in step 5812. The
first and second OS 5508 and 5512 may then communicate about the
events to coordinate the timing of the information. Thus, the
determination may be made as to where an event started or stopped
and which of the outputs should be processed first. Thus, both the
second OS 5512 and the first OS 5508 may process the event or
events, in step 5816. The output may then be coordinated so as to
provide a seamless user interaction for such an event. The second
OS or the first OS may throttle the other operating system in the
device 100 to coordinate or time the processing and the subsequent
output of data, as described in conjunction with FIG. 56.
[0359] An embodiment of the method 5900 for the device 100 to
become master of the unified system, upon docking, is shown in FIG.
59. Generally, the method 5900 starts with a start
operation 5904 and ends with an end operation 5924. The method 5900
can include more or fewer steps or can arrange the order of the
steps differently than those shown in FIG. 59. The method 5900 can
be executed as a set of computer-executable instructions executed
by a computer system and encoded or stored on a computer readable
medium. Hereinafter, the method 5900 shall be explained with
reference to the systems, components, modules, software, data
structures, user interfaces, etc. described in conjunction with
FIGS. 1-58.
[0360] A docking event may be received, in step 5908. The docking
event may be created when the device 100 is connected to the
computer system 1104 as shown by the line 4912 in FIG. 49. The
docking event may occur in both the computer system 1104 and the
device 100. In the device 100, the second OS 5512 may begin
communicating with the first OS 5508 and instruct the first OS 5508
to begin signaling the computer system 1104 to control the computer
system's actions. Further, in the computer system 1104, the
application programming interface 5528 may begin to be executed and
begin scanning or monitoring the stack of the personal computer OS
5532 to intercept or inject instructions into the memory stack for
the operating system 5532. In embodiments, the first OS 5508 may
send an instruction to the application programming interface 5528
to be executed. In other embodiments, the docking signal or event
may cause the PC OS 5532 to begin executing the application
programming interface 5528. Upon the execution of the API 5528 and
the first OS 5508, the device 100 controls the personal computer
1304 as the master, in step 5912. Thus, any actions being conducted
on either the device 100 or the computer system 1104 can be
executed or handled with the device 100.
[0361] Upon the device 100 becoming master over the personal
computer 1304, the computer system 1104 subordinates any functions
the computer system 1104 normally executes independently, in step
5916. For example, any windows or applications being executed by
the personal computer 1304 before docking are hidden and those
functions are paused while the device 100 is docked. Thus, any
functions normally executed on the computer system 1104 are
subordinated to the master control of the device 100. One such
subordination may be the computer system 1104 hiding any display,
in step 5920. The computer system 1104 displays a master or unified
desktop on the display after hiding the computer system's other
functions. Any windows or the previous desktop are hidden behind
the unified system desktop or are replaced by the unified system
desktop.
[0362] Triad Control
[0363] An embodiment of the unified system 6000 showing a triad
control interface 6008 is shown in FIG. 60. The unified system 6000
includes the device interface 1304 and the computer system
interface 1308. As shown in the device interface 1304, a control
window 6004 may be provided by user interface input or
automatically through action of an application or other event. The
user interface 6004 provides direct access to certain functionality
including, but not limited to, a phone application, an application
launcher, a file browser, etc. The provided functionality in the
unified desktop 6000 may be ported to the computer system display
1308 in a new and novel user interface area 6008 called the triad
control. The triad control area 6008 may be a separate popup bar
that is navigable by a user interface device, for example, a mouse,
that is hovered over a specific area of the user interface 1308.
Upon the recognition that the user desires to view the triad
controls 6008, the triad control 6008 may appear on the user
interface 1308 as if sliding from the side of the screen and onto
the top of the display 1308.
[0364] The triad control 6008 can include at least some of the same
functionality as in user interface 6004. For example, the area 6012
shows several user selectable icons in the triad control 6008. The
user selectable icon 6016 can execute a file browser. The user
selectable icon 6020 can execute an application launcher. The user
selectable icon 6024 can execute an application manager. Further,
the user selectable icon 6028 can provide a phone application that
is executed by the device 100 but appears on the computer system
display 1308. Likewise, a user selectable icon 6032 may provide a
browser window that allows the user to browse the internet through
the phone device 100. The selectable device icons 6012 provide
functionality typically associated with the phone over the unified
desktop 6000 by allowing access to those functions using a new
selectable area 6012 for user icons in the triad control bar
6008.
[0365] A series of windows shown in FIGS. 61A-61C show a visual
representation of how the triad control 6008 may be provided to a
user in the user interface 1304. As shown in FIG. 61A, a user
interface device 6104 may be shown in user interface 1308. The user
interface device 6104 may be controllable by a mouse, a finger on a
touch sensitive display, or by some other hardware means. The
represented arrow icon can be moved from the device's position, as
shown in FIG. 61A, to a position shown in FIG. 61B. The position of
the user interface device 6104, in FIG. 61B, can be recognized as
being within a triad control area 6108. Upon recognizing the
presence and persistence of the user interface device 6104 in the
triad control area 6108, the triad control bar 6008 may begin to
present itself by appearing to slide from the side of the screen.
Thus, as the triad control bar 6008 slides into view, it becomes
completely displayed, as shown in FIG. 61C. If the user
persists in maintaining the user interface device 6104 in the triad
control area 6108, the triad control bar 6112 can continue to be
displayed and to be selectable for the user.
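The reveal-and-hide behavior of FIGS. 61A through 61C can be sketched as a small amount of pointer-tracking state. The 300 ms persistence threshold below is purely an assumed value.

    final class TriadControlReveal {
        private static final long HOVER_MS = 300;   // assumed persistence threshold
        private long enteredAtMs = -1;
        private boolean visible = false;

        // Called on every pointer move with the current time in milliseconds.
        void onPointerMoved(boolean inTriggerArea, long nowMs) {
            if (inTriggerArea) {
                if (enteredAtMs < 0) enteredAtMs = nowMs;
                if (!visible && nowMs - enteredAtMs >= HOVER_MS) {
                    visible = true;                 // slide the bar 6008 in from the side
                }
            } else {
                enteredAtMs = -1;
                visible = false;                    // hide the bar when the pointer leaves
            }
        }

        boolean isVisible() { return visible; }
    }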
[0366] An embodiment of the application launcher user interface
6204, presented to the user after user selectable icon 6020 is
selected in triad control bar 6008, is shown in FIG. 62. The user
may select the user selectable icon 6020 in the triad control 6008.
Upon selecting icon 6020, by user
interface action, the application launcher user interface 6204 may
be presented; the application launcher user interface 6204 presents
selectable icons associated with applications that can be launched
by the user. The applications may include a first group of one or
more applications that are executed on the device and/or a second
group of one or more applications executed on the computer system.
Thus, the available applications for the user, presented in the
application launcher user interface 6204 on the unified desktop,
encompass applications executed both on the device and the computer
system. The application launcher user interface 6204 may present
the first and second group of applications in two separate
sections of the application launcher user interface 6204. The first
section presents applications executed on the device, and the
second section presents applications executed on the computer
system.
[0367] For example, the group of applications 6208 provides icons
associated with PC productivity applications that may be executed
by the computer system. In alternative embodiments, the PC
productivity applications 6208 are applications executed by the
device that may emulate computing system functionality. The
applications shown in the group 6212 may be Android applications or
other applications executed solely by the device. Thus, by
providing the application launcher menu 6204, the user can select
one of the icons shown in either area 6208 or 6212 to launch an
application in the unified desktop. Further, the window 6204 may
provide a search area 6216 where the user may provide search
criteria that can search for a certain application within the
application launcher 6204.
[0368] An embodiment of a process for using the search function in
the application launcher user interface 6204 is shown in FIG. 63.
The search function can include a user interface 6216 in the
application launcher user interface 6204. The user interface 6216
may accept one or more characters of a search string through user
interface input from a keyboard, drop down menu, or other user
interface action. The search function can be progressive. In other
words, as the letters are typed into the search function user
interface, the unified system may search for applications
containing or starting with the search string. For example, the
user may have typed a letter "c" 6304 in the search window 6216.
Upon typing the letter within the search window 6216, the window
6204 may present a more limited set of icons associated with
applications that contain or begin with the letter "c." Thus, based
on the search string, the application launcher presents one or more
selectable icons associated with the search string. For example,
the number of PC productivity applications shown in area 6208 has
been reduced to a smaller number of applications, as shown in area
6308. Further, the applications shown in area 6212 have been
reduced to a smaller set of applications, as shown in area
6312. The applications shown in areas 6308 or 6312 may have a "c"
in the name of the application or may have a "c" that begins the
application name. In this way, the search function 6216
allows the user to quickly find applications of interest and to
select from a more limited set of applications in the
application launcher window 6204.
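The progressive narrowing described above amounts to re-filtering the launcher's application list on each keystroke. A minimal sketch, assuming a case-insensitive "contains" match:

    import java.util.List;
    import java.util.Locale;
    import java.util.stream.Collectors;

    final class LauncherSearch {
        // Re-run on every keystroke: keep applications whose names
        // contain (and therefore also those that begin with) the query.
        static List<String> filter(List<String> appNames, String query) {
            String q = query.toLowerCase(Locale.ROOT);
            return appNames.stream()
                    .filter(name -> name.toLowerCase(Locale.ROOT).contains(q))
                    .collect(Collectors.toList());
        }
    }

For example, filter(List.of("Calendar", "Camera", "Mail"), "c") would keep "Calendar" and "Camera", mirroring the narrowing from areas 6208 and 6212 to areas 6308 and 6312.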
[0369] Embodiments of a process for using an application manager,
initiated by selecting icon 6024, are shown in FIGS. 64 through 66.
In embodiments, the user may move a user interface device 6404 to
select the application manager icon 6024 in the triad control 6008.
At the time of the selection of the icon 6024, the computing system
interface 1304 may be displaying at least one window 6408 or 6412.
Upon selecting the application manager icon 6024 in the triad
control 6008, a new menu or application manager user interface
display 6504 (shown in FIG. 65) may be presented within the display
1304. The new display 6504 may contain two or more areas or
sections, e.g., area 6516 and area 6520. The separation of the
areas may be delineated by visual indicia, for example, a bar 6532
shown between area 6516 and area 6520. Each area 6516, 6520 may
display different information. The areas 6516 and/or 6520 may
display one or more representations of one or more applications
executing on the unified system. The one or more representations
may be a snapshot(s) of an application window associated with one
or more application(s) executing on the unified system. The
snapshot may be a last-presented view of the application window on
the unified desktop. For example, snapshot 6408 shown in FIG. 65 is
the same as or similar to window 6408 shown in FIG. 64.
[0370] Area 6516 of the application manager user interface 6504
can display one or more snapshots of applications currently
executing or being displayed on the device. Likewise, area 6520 may
display one or more snapshots of applications being executed or
being displayed on the computing system. For example, windows 6412
and 6408 are shown in area 6520. Thus, the display 6504 captures
the windows that were currently being displayed in FIG. 64. In
further embodiments, an application window 6524 that was not being
displayed but was being executed on the device may be shown in area
6516. The executed application is represented by window
representation 6524 in area 6516.
[0371] A user may manage the currently executing applications by
selecting or conducting user interface actions on the snapshots or
representations of the applications shown in window or user
interface 6504. Thus, if the user wishes to select, execute, or
move the snapshot of a window being executed on the device, e.g.,
6508 or 6512, the user may select or conduct the user interface
action on those windows 6508, 6512 in area 6516. In embodiments,
the user may move a user interface device 6604 to select a window
6412 in the display 6504 such that the window 6412 then becomes
displayed in the personal computing interface 1308,
as shown in FIG. 66. If a snapshot is moved between sections 6516
and 6520, the application window associated with the snapshot may
be moved from the user interface of the device to the user
interface of the computer system, or vice versa. When selected in
windows 6504 and/or windows 6412, the displayed windows in the
interface(s) 1304, 1308 become highlighted and/or receive focus.
Thus, in embodiments, the application manager window 6504 allows
the user to manage applications currently executing in the unified
desktop.
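The drag between sections, and the window migration it triggers, can be sketched as follows; the section lists and method names are illustrative assumptions.

    import java.util.ArrayList;
    import java.util.List;

    final class ApplicationManagerSections {
        final List<Integer> deviceSection = new ArrayList<>();     // area 6516
        final List<Integer> computerSection = new ArrayList<>();   // area 6520

        // Moving a snapshot across the divider 6532 migrates the
        // associated window between the device and the computer system.
        void moveSnapshot(int windowId, boolean toComputer) {
            List<Integer> from = toComputer ? deviceSection : computerSection;
            List<Integer> to = toComputer ? computerSection : deviceSection;
            if (from.remove(Integer.valueOf(windowId))) {
                to.add(windowId);     // the window follows its snapshot
            }
        }
    }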
[0372] When the user selects the icon 6016, a file manager window
6704 may appear within the user interface device 1304, as shown in
FIG. 67. Thus, the triad control 6008 provides access to files that
may be on the device or the computer system. The user interface
window 6704 may provide an area for information about recently-used
or recently-opened files 6708; the area 6708 displays information
associated with recently used files (e.g., filename, date when the
file was last opened, the size of the file, etc.). A file may be
associated with an application executed in the unified system. The
icons or user interface devices may be selectable to open the
represented files in area 6708. Further, window 6704 may provide an
area 6712 that includes a list of folders either automatically
generated for the user or populated by user input. The folder icons
in area 6712 may also be selectable to open a folder and access a
file within the data storage of either the device or the computer
system.
[0373] A phone manager user interface 6804, presented after icon
6028 is selected in the triad control 6008, is shown in FIG. 68.
The phone manager user interface 6804 may allow the user to select
one or more icons displayed within the phone manager user interface
6804 to conduct a communication session. For example, the phone
icon 6808 may allow the user to begin a telephone call or send a
text message by selecting the icon 6808. Further, in area 6812, a
call log may be listed that shows recent calls either received or
sent by the device. In area 6816, one or more icons may be provided
that represent data about frequently called or contacted persons or
entities. In area 6820, one or more contacts that are communicated
with often may be listed in a favorites field. In field or area
6824, all contacts may be listed. The list of contacts may be
searchable or navigable by the user in the phone manager user
interface 6804. Any of these different fields may be searchable by
the search function 6828 presented in the phone manager user interface
6804. The user interface of the search function can accept a search
string and return one or more contacts associated with the search
string. Each icon, in embodiments, within the phone manager user
interface 6804, may be selectable to conduct a communication
session using the device, regardless of the fact that the icon was
selected in the phone manager user interface 6804 presented in the
computing system interface 1304.
[0374] Like the phone functionality described in conjunction with
FIG. 68, if a user selects a browser icon 6032 within the triad
control bar 6008, a browser control user interface 6904 may be
presented. The browser control user interface 6904 can present one
or more items of information that are associated with the web
browsing functionality of the device. The information is presented
as one or more selectable icons. Thus, icon 6908 may be selected to
begin or conduct a web browser session using the web browser
application and network communication capability of the device.
[0375] The browser control user interface 6904 can be separated
into two or more areas. For example, area 6912 may include
bookmarks for the web browser bookmarked by the user in the device.
A bookmark designates a website that the user may frequent and
stores the Uniform Resource Locator (URL) for the site in the
bookmark information. Selecting the bookmark directs the web
browser to contact the URL and provide the website information to
the user. Area 6916 may provide a list of one or more names or URLs
for the most visited websites that may be selectable by the user.
Area 6920 may provide a list of downloads that have been recently
downloaded by the device from a network. Area 6924 may provide a
history of recently visited websites using the web browser. Any of
these areas may be searchable by the search function 6928 provided
in the window 6904; the user interface of the search function can
accept a search string and return one or more items associated with
the search string. Each of the icons within the window 6904 may be
selected to begin a web browsing session using the functionality of
the phone, even though the icon is selected within the computer
system interface 1304.
[0376] An embodiment of a method 7000 for selecting or executing
functions within the unified desktop using the triad control 6008
is shown in FIG. 70. Generally, the method 7000 starts with a start
operation 7004 and ends with an end operation 7036. The method 7000
can include more or fewer steps or can arrange the order of the
steps differently than those shown in FIG. 70. The method 7000 can
be executed as a set of computer-executable instructions executed
by a computer system and encoded or stored on a computer readable
medium. Hereinafter, the method 7000 shall be explained with
reference to the systems, components, modules, software, data
structures, user interfaces, etc. described in conjunction with
FIGS. 1-69. The method 7000 may be described with particular
attention paid to FIGS. 60 through 69.
[0377] A user interface event may be received in display 1304, in
step 7008. The user interface event may occur within the triad
control region 6108. Upon receiving the user interface action that
persists within the area 6108, the unified desktop may display a
triad control 6008, in step 7012. In embodiments, the display of
the triad control 6008 may be conducted as described in conjunction
with FIGS. 61A through 61C.
[0378] Thereafter, the unified desktop system may determine if an
event is conducted within the triad control bar 6008. An event may
be a selection of a user selectable icon within region 6012 or some
other icon presented within the triad control 6008. If no action is
conducted, the triad control 6008 may continue to be displayed on
the user interface, in step 7028. If an event does occur
within the triad control 6008, such as the selection of one of
the user selectable icons in area 6012, the selected function may
be provided, in step 7020.
[0379] The function provided may be the display of a window as
described in conjunction with FIGS. 62 through 69. The menus
provided may include further functionality that may be accessed
through the triad control 6008. The unified desktop must then
determine if a user interface device is still in the triad control
6008 or in one of the menus provided from the triad control 6008,
in step 7024. If the user interface device remains within the triad
control area 6008, the triad control 6008 may continue to be displayed, in
step 7028. However, if the user interface device is no longer
within the triad control area 6008, the triad control 6008 may be
hidden, in step 7032. If a functionality provided within the menu
is selected by a user, for example, selecting a user selectable
icon within a menu presented in FIGS. 62 through 69, the triad
control may be hidden and the function provided.
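By way of illustration only, the show/hide behavior of method 7000 might be sketched as follows. The class, interface, and method names below (TriadControlManager, TriadControl, PointerEvent) are hypothetical and are not part of the specification; the sketch assumes a pointer-driven user interface.

```java
// A minimal, hypothetical sketch of method 7000's show/hide logic.
// All type and method names are illustrative, not from the specification.
public class TriadControlManager {

    public interface TriadControl {
        void show();
        void hide();
        void provideFunction(String iconId);
    }

    // A simplified pointer event; "selection" marks a click on an icon.
    public record PointerEvent(int x, int y, boolean selection, String iconId) {}

    private final java.awt.Rectangle triadRegion; // triad control region 6108
    private final TriadControl control;           // triad control 6008
    private boolean visible;

    public TriadControlManager(java.awt.Rectangle triadRegion, TriadControl control) {
        this.triadRegion = triadRegion;
        this.control = control;
    }

    // Called for each user interface event received in the display (step 7008).
    public void onPointerEvent(PointerEvent e) {
        if (triadRegion.contains(e.x(), e.y())) {
            if (!visible) {                          // event persists within area 6108
                control.show();                      // step 7012: display triad control
                visible = true;
            }
            if (e.selection()) {                     // event within the triad control
                control.provideFunction(e.iconId()); // step 7020: provide the function
                control.hide();                      // hide once the function is provided
                visible = false;
            }
            // otherwise the control simply remains displayed (step 7028)
        } else if (visible) {                        // pointer left the triad control area
            control.hide();                          // step 7032: hide the triad control
            visible = false;
        }
    }
}
```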
[0380] An embodiment of a method 7100 for providing application
manager interfaces within the unified desktop is shown in FIG. 71.
While a general order for the steps of the method 7100 is shown in
FIG. 71, the method 7100 can include more or fewer steps or can
arrange the order of the steps differently than those shown.
Generally, the method 7100 starts with a start operation 7104 and
ends with an end operation 7128. The method 7100 can be executed as
a set of computer-executable instructions executed by a computer
system and encoded or stored on a computer readable medium.
Hereinafter, the method 7100 shall be explained with reference to
the systems, components, modules, software, data structures, user
interfaces, etc. described in conjunction with FIGS. 1-70. The
method 7100 may be described with particular attention paid to
FIGS. 60 through 70.
[0381] As explained in FIGS. 1 through 70, a device 100 and
computer system 1104 may be provided. The device 100 may then be
docked with the computer system 1104 to form a unified system 1100
consisting of the device 100 and the computer system 1104. When the
device 100 is docked, the device 100 and computer system 1104
respond by generating a unified desktop 1300 for the unified system
1100. The unified desktop 1300 is a single logical display that
spans both the user interface 1308 associated with the device 100
and the user interface 1304 associated with the computer system
1104. The unified desktop 1300 can emulate a personal computer
environment where windows and applications may be executed and
information may be displayed on the unified desktop 1300 for the
user.
[0382] The unified system 1100 can receive a user interface event,
which may be a user interface input into the first user interface
1308 associated with the device 100, in step 7108. The user
interface input can be a selection of a triad control button 6008
or other control button(s) associated with the device 100 or
computer system 1104. The buttons may provide the ability to
display an application manager interface 7212, in step 7112. Thus,
in response to receiving the user interface input, the unified
system 1100 can display a first application manager interface 7212
on the device 100. Optionally, a second user interface input may be
received, in step 7116. The second user interface input may be an
input into the second user interface 1304 of the computer system
1104. In response to the second user interface input, a second
application manager interface 7216 may be shown on the computer
system interface 1304. It should be noted that it is possible for a
single user interface input or interaction to force the display of
both application manager interface 7212 and application manager
interface 7216.
[0383] As shown in FIG. 73, the application manager interfaces 7212
and 7216 may display one or more of the active windows 7304, 7308,
7312, 7316, and/or 7320 on either the first user interface 1308 or
the second user interface 1304. As explained
previously herein, the application manager interface 7212, 7216 may
be used to reorder the display of windows in either the device user
interface 1308 or the computer system display 1304. For example, as
shown in FIG. 74, with a user interface input 7404, 7408, window
7308 may be brought to the foreground in front of window 7304,
which changes the display 1304, as shown in FIG. 73, to what is
shown in FIG. 74. Thus, the unified system 1100 can receive a third
user interface input to reorder one or more application
representations or thumbnails 7324-7364, such as those for window
7308 and/or window 7304, in the first application manager interface
7212 and/or the second application manager interface 7216, in step
7124.
[0384] The third user interface input 7404, 7408 can be a drag and
drop or a flick gesture as described in conjunction with FIGS. 4A
through 4H. For example, arrow 7408 can represent a switch in the
orientation of the thumbnails or representations 7356 and 7360 that
are associated with windows 7304 and 7308, respectively, in the
application manager interface(s) 7212, 7216. In this case, window
representation 7360 is moved in front of window representation
7356. This user interface input 7408 can change the display of the
application windows 7304 and 7308 in display 1304 in the computer
system 1104.
[0385] Any change in one of the application manager interfaces 7212
and/or 7216 can change the other representations 7324-7364 in the
other application manager interface 7212 and/or 7216. For example,
the change 7408 can also change the representation of windows 7332
and 7336. Likewise, a reorientation 7404 in the other application
manager interface 7216 can change the window representations for
7308 and 7304 and the representation of the windows in application
manager interface 7212.
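One way to realize this mirroring, offered only as a hypothetical sketch, is to back both application manager interfaces with a single shared window-stack model and have each interface re-render when the model changes; every name below is illustrative.

```java
// Hypothetical sketch: one shared model behind both application manager
// interfaces (7212 and 7216), so a reorder in either is shown in both.
import java.util.ArrayList;
import java.util.List;

public class ApplicationManagerModel {

    public interface ManagerView {
        void render(List<String> windowStack); // e.g., interfaces 7212 and 7216
    }

    private final List<String> windowStack = new ArrayList<>(); // front = foreground
    private final List<ManagerView> views = new ArrayList<>();

    public void addView(ManagerView view) { views.add(view); }

    public void addWindow(String windowId) {
        windowStack.add(windowId);
        notifyViews();
    }

    // A drag-and-drop or flick (e.g., input 7404 or 7408) brings a window to
    // the foreground; all registered interfaces are updated with the change.
    public void bringToFront(String windowId) {
        if (windowStack.remove(windowId)) {
            windowStack.add(0, windowId);
        }
        notifyViews();
    }

    private void notifyViews() {
        for (ManagerView view : views) {
            view.render(List.copyOf(windowStack));
        }
    }
}
```

Because both interfaces render from the same model, a change such as 7408 made in one application manager interface is necessarily reflected in the other.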
[0386] In embodiments, the third user interface input can be an
input into the touch sensitive display 1308. Thus, a drag and drop or
flick gesture may move the windows in the first user interface 1308
and change the representation of the windows in the application
manager interface 7212, 7216. The third user interface input may
also be a gesture in a gesture capture region 120, 124 as explained
herein. Likewise, changes in the user interface 1304 can also
change the application manager interface 7212, 7216. For example,
window 6 7308 may receive focus by a user clicking on the window
7308 with a device (e.g., a mouse) and bringing the window 7308 into the
foreground over windows 7304 and 7312. This change in focus may
also change the order or representation of the windows in
application manager interfaces 7212 and 7216.
[0387] The applications in the unified desktop 1300 may have
different types that are based on where the windows 7308-7320 are
displayed. For example, applications displayed in user interface
1308 may be device focused applications. In contrast, applications
displayed on user interface 1304 may be considered computer system
focused applications. A device focused application can be an
application that is executed on the device 100. A computer system
focused application can be an application that may be executed in
either the computer system 1104 or the device 100 but is displayed
on the computer system interface 1304.
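A minimal sketch of this classification, assuming the display identifiers 1304 and 1308 from the figures and a hypothetical FocusType enumeration, might be:

```java
// Hypothetical sketch of application focus classification per this paragraph:
// the type of an application follows from where its window is displayed.
public enum FocusType {
    DEVICE_FOCUSED,           // window displayed on the device user interface 1308
    COMPUTER_SYSTEM_FOCUSED;  // window displayed on the computer system interface 1304

    public static FocusType fromDisplay(int displayId) {
        // 1304 and 1308 follow the specification's figure numbering.
        return displayId == 1308 ? DEVICE_FOCUSED : COMPUTER_SYSTEM_FOCUSED;
    }
}
```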
[0388] A third user interface input that changes the representation
of the application manager interface 7212, 7216 may attempt to
change a computer system focused application to a device focused
application. Such an attempt may be prevented by the unified
desktop 1300. In that case, the unified desktop 1300 does not move
the window as specified by the user interface input and may return
an error message to the user in one or more of the displays 1304,
1308.
[0389] A fourth user input may be received that changes a computer
system focused application to a device focused application. In this
situation, a window may be moved from the computer system display
1304 to the device display 1308. For example, as shown in FIG. 75,
the window 6 7308 may be moved from the computer system display
1304 to the device display 1308. The application manager interface
7216 on the computer system 1104 may be updated to show that window
7308 is now on the device interface 1308 and not on the computer
system interface 1304. The change may also be shown in the device
100 application manager interface 7212. The fourth user interface
input that makes the change in the application window 7308 can be a
drag and drop from one interface 1304 to the other interface 1308,
a change in one or more of the application manager interfaces 7216
or 7212, or some other input that changes the positioning of window
7308.
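Reusing the illustrative ApplicationManagerModel from the earlier sketch, this move might be expressed, purely as an assumption-laden illustration, as reassigning the window's display and then refreshing both application manager interfaces:

```java
// Hypothetical sketch: moving window 7308 from the computer system display
// 1304 to the device display 1308 and updating both manager interfaces.
// Reuses the illustrative ApplicationManagerModel from the earlier sketch.
import java.util.HashMap;
import java.util.Map;

public class UnifiedDesktopMover {

    public static final int DEVICE_DISPLAY = 1308;
    public static final int COMPUTER_DISPLAY = 1304;

    private final Map<String, Integer> windowToDisplay = new HashMap<>();
    private final ApplicationManagerModel managers;

    public UnifiedDesktopMover(ApplicationManagerModel managers) {
        this.managers = managers;
    }

    // A drag and drop from one interface to the other reassigns the window;
    // the shared model then refreshes interfaces 7212 and 7216.
    public void moveWindow(String windowId, int targetDisplay) {
        windowToDisplay.put(windowId, targetDisplay);
        managers.bringToFront(windowId);
    }
}
```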
[0390] A fifth user interface input may be received that reorders
the window stack of the device 100 as represented in one or more of
the application manager interfaces 7212, 7216. For example, as
shown in FIG. 76, a change may be made that repositions window 1
7320 and window 2 7316. The change may be made in one of the
application manager interfaces 7212, 7216. For example, a drag and
drop or a flick gesture of the thumbnail 7316 or 7320 may be done
in the application manager interface 7216 to change the position of
the windows on the device 100. The change may then be shown in
application manager interface 7212. This process may also be
completed in the device 100 application manager interface 7212 and
then shown in the application manager interface 7216.
[0391] An embodiment of a method 7700 for providing application
manager interfaces within the unified desktop is shown in FIG. 77.
While a general order for the steps of the method 7700 is shown in
FIG. 77, the method 7700 can include more or fewer steps or can
arrange the order of the steps differently than those shown.
Generally, the method 7700 starts with a start operation 7704 and
ends with an end operation 7728. The method 7700 can be executed as
a set of computer-executable instructions executed by a computer
system and encoded or stored on a computer readable medium.
Hereinafter, the method 7700 shall be explained with reference to
the systems, components, modules, software, data structures, user
interfaces, etc. described in conjunction with FIGS. 1-76. The
method 7700 may be described with particular attention paid to
FIGS. 60 through 76.
[0392] A method 7700 may also be provided where the first
application manager interface 7212 is displayed, in step 7712, in
the first user interface 1308 presented on the device 100 after
receiving a first user interface input, in step 7708, into either
the first user interface 1308 or the second user interface 1304
associated with the computer system 1104. Optionally,
a second user interface input may be received in either the first
or second user interfaces 1304, 1308 associated with either the
device 100 or the computer system 1104, in step 7716. The second
user interface input may cause the unified desktop 1300 to display
the second application manager interface 7216 in the computer
system display 1304, in step 7720. However, in this process 7700, a
third user interface input may reorder one or more thumbnails of
the windows 7304-7320 in the second application manager user
interface 7216, in step 7724.
[0393] The third user interface input may not affect one or more
application windows associated with the reordered thumbnails in the
application manager user interface 7216. For example, as shown in
FIG. 78, the thumbnail 7332 may be moved to a different orientation
in the application manager interface 7216 compared to window
representation 7308. However, within the user interface 1304 no
change is made as compared to the original user interface shown in
FIG. 73. Thus, it is possible to reorder thumbnails 7324-7364 of
the windows 7304-7320 in the application manager interfaces 7216
and 7212 without changing the orientation of the windows 7304-7320
in the user interfaces 1304 and 1308.
[0394] However, reordering thumbnails 7324-7364 in one application
manager user interface 7216 or 7212 can reorder those same
thumbnails 7324-7364 in the other application manager interface
7212 or 7216. In this way, the thumbnails 7344-7364 in the first
application manager interface 7212 and the thumbnails 7324-7340 in
the second application manager interface 7216 stay the same or
similar.
[0395] Further, the application manager interfaces 7212 and 7216
may display thumbnails or representations for all executing
applications. However, the display area for the application manager
interfaces 7212 and 7216 may not be large enough to display all of
the thumbnails at the same time; in that case, a navigable user
interface 7212 and 7216 may be presented. Thus, by using a swipe,
drag and drop, or
other gesture or another user interface input described herein, the
user may scroll through the thumbnails 7324-7364 in those
application manager interfaces 7212 and 7216.
[0396] In the unified desktop 1300, a first application window 7304
may be displayed on the computer system user interface 1304.
Likewise, a second application 7316 may be displayed in the device
user interface 1308. As explained previously, the second
application 7316 may be a device focused application while the
first application 7304 may be a computer system focused
application. The device focused application 7316 shown in the
device user interface 1308 and the computer system focused
applications 7304-7312 in the computer system user interface 1304
can be displayed differently in the application manager user
interfaces 7212, 7216. For example, the set of application
representations 7332-7340 and 7356-7364 are associated with the
computer system applications 7304-7312 on the computer system user
interface 1304. These thumbnails 7332-7340 and 7356-7364 show a
darkened dotted box, which delineates the thumbnails as being
computer system applications and/or displayed on the computer
system user interface 1304. Other visual indicia may be used
besides the dotted black line, which is provided only as an example
of how different types of thumbnails may be displayed differently.
[0397] Further, the computer system thumbnails 7332-7340 and
7356-7364 may be positioned differently than the thumbnails
representing device focused applications. For example, in the first
application manager user interface 7212, the thumbnails 7332-7340
and 7356-7364 are shown to the right on a second user interface 114
of the device display 1308. The thumbnails 7344-7352 for the device
focused applications may be displayed to the left or on a first
user interface 110 of the device display 1308.
[0398] Likewise, the thumbnails 7324-7328 for device focused
applications may be shown at the top of the second application
manager user interface 7216 and separated by a line 7804. The
computer system application thumbnails 7332-7340 may be shown at
the bottom or below the line 7804 of the second application user
interface 7216.
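As a hypothetical sketch of this grouping, the thumbnails could be partitioned by the illustrative FocusType from the earlier sketch, with the two groups rendered on either side of the separator line 7804:

```java
// Hypothetical sketch: partition thumbnails into device focused and computer
// system focused groups, as rendered above and below separator line 7804.
// Reuses the illustrative FocusType from the earlier sketch.
import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;

public class ThumbnailLayout {

    public record Thumbnail(String id, FocusType type) {}

    // true -> device focused thumbnails (e.g., 7324-7328, shown above the line);
    // false -> computer system thumbnails (e.g., 7332-7340, shown below it).
    public static Map<Boolean, List<Thumbnail>> partition(List<Thumbnail> thumbnails) {
        return thumbnails.stream().collect(
                Collectors.partitioningBy(t -> t.type() == FocusType.DEVICE_FOCUSED));
    }
}
```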
[0399] The second application 7316 may be an Android application.
The first application 7304 may be an Android application and/or a
personal computer productivity application. An Android application
could be any application that executes in an Android or
Android-similar operating system, for example on the computer
system 1104. A personal computer productivity application can be
any application that typically executes on a personal computer
1104.
[0400] An embodiment of a method 7900 for providing application
manager interfaces within the unified desktop is shown in FIG. 79.
While a general order for the steps of the method 7900 is shown in
FIG. 79, the method 7900 can include more or fewer steps or can
arrange the order of the steps differently than those shown.
Generally, the method 7900 starts with a start operation 7904 and
ends with an end operation 7928. The method 7900 can be executed as
a set of computer-executable instructions executed by a computer
system and encoded or stored on a computer readable medium.
Hereinafter, the method 7900 shall be explained with reference to
the systems, components, modules, software, data structures, user
interfaces, etc. described in conjunction with FIGS. 1-78. The
method 7900 may be described with particular attention paid to
FIGS. 60 through 78.
[0401] Another method 7900 may provide for a device 100 and a
computer system 1104 that are docked to form a unified system 1100.
A unified desktop 1300 may be generated for the unified system
1100 that provides for the first user interface 1308 and a second
user interface 1304 being part of the unified desktop 1300, where
the unified desktop 1300 emulates a personal computer environment.
Thus, the unified system 1100 can receive an indication that the
device 100 is docked with the computer system 1104, in step 7908.
The unified system 1100 can then generate the unified desktop 1300,
in step 7912.
[0402] Thereafter, the unified desktop 1300 may receive a first
input as a user interface event that moves the display window
associated with an application from the device user interface 1308
to the computer system user interface 1304, in step 7916. As
explained previously, applications displayed on the display 1308
are device focused applications. When moving a device focused
application to the computer system user interface 1304, the unified
system 1100 must determine if that application being executed on
the device 100 is associated with a big brother application normally
associated with the computer system 1104, in step 7920.
[0403] A big brother application is an application that is normally
executed in the computer system 1104 that has functionality similar
to at least one application executed by the device 100. For
example, a Twitter application run in an Android environment on the
device 100 may have a big brother application for Twitter that is
typically run in a computer system environment on the personal
computer 1104. Thus, the big brother application is the application
on the computer system 1104 that is associated with the application
executed on the device 100.
[0404] These big brother applications may be associated with the
device applications to provide a seamless transition when moving
application windows in the unified system 1100. If the device
application is associated with a big brother application, the
device 100 may send a directive to the computer system 1104 to
begin execution of the big brother application, in step 7924. The
directives are sent as explained herein to execute different
functions in the unified system 1100.
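The specification does not define a wire format for these directives; as a purely hypothetical sketch, the device-side command of step 7924 might look like the following, with the transport and message layout being assumptions:

```java
// Hypothetical sketch of the device 100 directing the computer system 1104
// to begin executing a big brother application (step 7924). The Directive
// record and the serialized-object transport are assumptions.
import java.io.IOException;
import java.io.ObjectOutputStream;
import java.io.Serializable;

public class DirectiveSender {

    public record Directive(String command, String applicationId) implements Serializable {}

    private final ObjectOutputStream out; // docking link to the computer system 1104

    public DirectiveSender(ObjectOutputStream out) {
        this.out = out;
    }

    // The device, as master, commands execution of the big brother application.
    public void executeBigBrother(String bigBrotherId) throws IOException {
        out.writeObject(new Directive("EXECUTE_BIG_BROTHER", bigBrotherId));
        out.flush();
    }
}
```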
[0405] Thereafter, the computer system 1104 may receive, from the
device 100, state information, in step 7928. State information can
be information that describes the state the application was in when
it was executing on the device 100. For example, the state
information can have a configuration of the display or any
information that is being stored in memory that will be needed by
the computer system 1104 to continue execution of the application.
The state information is extracted at the point that the
application was moved to the computer system display 1304. Examples
of state information for a web browser application may be the
current site or URL that had been navigated to for the web browser.
Further, any information currently entered into a user interface
associated with the web browser may also be provided to the
computer system 1104. Other applications may have different, less,
or more state information transferred from the device 100 to the
computer system 1104.
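For the web browser example above, the transferred state information might be modeled, hypothetically, as a small serializable record; the field names are assumptions:

```java
// Hypothetical sketch of web browser state information (step 7928), captured
// at the point the application window is moved to display 1304.
import java.io.Serializable;
import java.util.Map;

public record BrowserState(
        String currentUrl,               // the site or URL the browser had navigated to
        Map<String, String> formEntries, // information entered but not yet submitted
        int scrollPosition               // an example of display configuration
) implements Serializable {}
```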
[0406] Once the state information is received or at some point
after the determination is made that a big brother application is
associated with the device application, the computer system 1104
can re-execute the device application as a big brother application,
in step 7932. Thus, the big brother application is executed and
emulates the application from the device 100 and provides the
appearance that the application window was seamlessly moved from
the device 100 to the computer system 1104.
[0407] The input that causes the big brother application to execute
may be the docking of the device 100. Thus, as explained
hereinbefore, the docking of the device 100 may cause one or more windows
to move from the device display 1308 to the computer system display
1304. In other embodiments, the first input that moves the window
may be a user interface interaction, such as a drag and drop or
other type of user interface interaction, which moves the display
from the first user interface 1308 to the second user interface
1304.
[0408] The directive sent from the device 100 to the computer
system 1104 may be a command from the device 100 to the computer
system 1104 to execute the big brother application. As explained
with the device 100 being master and the computer system 1104 being
subordinate to the device 100 during execution of the unified
system 1100, the device 100 can command the computer system 1104 to
complete certain functions. In this case, the device 100 may
command the computer system 1104 to execute the big brother
application. The state information may follow the directive in a
second message after the directive has been sent. In this way, the
computer system 1104 may begin executing the big brother
application, at least in a general sense, before receiving the
state information to configure the big brother application into a
common state as was shown on the device 100 before the window was
moved.
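This two-message exchange might be handled on the computer system side as sketched below; the launcher abstraction is hypothetical, and the Directive record is the one from the earlier device-side sketch:

```java
// Hypothetical sketch of the computer system 1104 handling the directive
// first and the state information second, per this paragraph.
import java.io.IOException;
import java.io.ObjectInputStream;

public class BigBrotherReceiver {

    public interface Application { void applyState(Object state); }
    public interface Launcher { Application launch(String applicationId); }

    private final Launcher launcher;

    public BigBrotherReceiver(Launcher launcher) {
        this.launcher = launcher;
    }

    public void handleSession(ObjectInputStream in)
            throws IOException, ClassNotFoundException {
        // First message: the directive. The big brother application begins
        // executing in a general state before the state information arrives.
        DirectiveSender.Directive directive = (DirectiveSender.Directive) in.readObject();
        Application app = launcher.launch(directive.applicationId());

        // Second message: the state information, used to configure the big
        // brother application into the state shown on the device 100.
        Object state = in.readObject();
        app.applyState(state);
    }
}
```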
[0409] Examples of the execution of big brother applications are
shown in FIGS. 80 and 81. In this example, at least two
applications 8004 and 8008 are being executed on the device 100 and
shown in the device display 1308. There are no current executing
applications shown in the computer system display 1304. At some
point, due to a user interface input or other event, the
applications 8004, 8008, being executed on the device 100, are
moved to the computer system display 1304, as shown in FIG. 81.
While the two application windows 8004, 8008 may be similar to what
was shown in the device 100, the application windows 8004, 8008 are
executed in big brother applications on the computer system 1104.
For example, the web browser function 8008 being executed in the
device 100 is now being executed in a computer system web browser
application. Likewise, the Twitter application 8004 executed in the
device 100 is now being executed as a Twitter application on the
computer system after being moved to the computer system display
1304.
[0410] An embodiment of a method 8200 for providing the unified
desktop is shown in FIG. 82. While a general order for the steps of
the method 8200 is shown in FIG. 82, the method 8200 can include
more or fewer steps or can arrange the order of the steps
differently than those shown. Generally, the method 8200 starts
with a start operation 8204 and ends with an end operation 8228.
The method 8200 can be executed as a set of computer-executable
instructions executed by a computer system and encoded or stored on
a computer readable medium. Hereinafter, the method 8200 shall be
explained with reference to the systems, components, modules,
software, data structures, user interfaces, etc. described in
conjunction with FIGS. 1-81. The method 8200 may be described with
particular attention paid to FIGS. 60 through 81.
[0411] Method 8200 may begin when the indication is received by the
device 100 or computer system 1104 that the device 100 is docked
with the computer system 1104, in step 8208. Thus, the device 100
and computer system 1104 are provided and then docked to form a
unified system 1100. The unified system 1100 can then generate the
unified desktop 1300, in step 8212. The unified desktop 1300 is a
common display that covers both the display 1308 of the device 100
and the display 1304 of the computer system 1104. This unified
desktop 1300 emulates a personal computer environment.
[0412] Thereafter, a user interface event or other input may be
received that moves the display of an application from the device
user interface 1308 to the computer system interface 1304 (see
FIGS. 79-81). The application display may be associated with an
application that was originally executed on the device 100. The
device 100 must then determine, for the unified system 1100,
whether the application is associated with a big brother
application.
[0413] To begin the determination, the device 100 can search
metadata 8300, shown in FIG. 83, associated with a big brother
application pool to determine if there is an associated big brother
application, in step 8220. A big brother application pool can be a
set of one or more applications that are associated with one or
more device 100 applications. The computer system applications in
the big brother application pool may emulate, or have features or
functions similar to, the device applications.
[0414] The metadata 8300 associated with the big brother
application pool may be as shown in FIG. 83 in the data structure
8304. The data structure 8304 shown in FIG. 83 may be any type of
database or other data storage technique or structure that allows
for the metadata 8300 or information to be searched, stored, and
retrieved for use in various functions.
[0415] The unified system 1100 can use the big brother application
pool metadata 8300 to determine if an application is part of a big
brother application pool, in step 8224. Thus, if the device 100
application is associated with a big brother application, the
method 8200 may proceed YES to step 8228 where the big brother
application is executed and the application for the device 100 is
moved into the big brother application. If there is no big brother
application, the method 8200 may proceed NO to execute the
application in a freeform window, in step 8232. A freeform window
is a type of logical structure within the unified system 1100 that
allows for device applications to be executed in the computer
system 1104 environment without having a big brother
application.
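The branch between steps 8228 and 8232 might be sketched, again hypothetically, as a lookup in a simplified stand-in for data structure 8304:

```java
// Hypothetical sketch of steps 8220-8232: search the big brother application
// pool metadata; execute the big brother application if one is associated,
// otherwise execute the device application in a freeform window.
import java.util.Map;
import java.util.Optional;

public class BigBrotherPool {

    // Simplified stand-in for data structure 8304: device application id
    // mapped to its associated big brother application id.
    private final Map<String, String> metadata;

    public BigBrotherPool(Map<String, String> metadata) {
        this.metadata = metadata;
    }

    public Optional<String> findBigBrother(String deviceAppId) { // step 8220
        return Optional.ofNullable(metadata.get(deviceAppId));
    }

    public void onMoveToComputerSystem(String deviceAppId) {     // step 8224
        findBigBrother(deviceAppId).ifPresentOrElse(
                id -> System.out.println("Execute big brother application " + id),           // step 8228
                () -> System.out.println("Execute " + deviceAppId + " in a freeform window")); // step 8232
    }
}
```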
[0416] As shown in the data structure 8304, the big brother
application pool can include two or more big brother applications.
The data structure can include more or less data and have more or
fewer entries than that shown in FIG. 83, as represented by
ellipses 8348 and 8352. Each big brother application, in the big
brother application pool, can be associated with at least one
application executed on the device 100. For example, in row 8308,
device application 1 is associated with big brother application W.
Thus, the unified system 1100 can search the metadata within the
row 8308 of the associated applications to determine that the
device 100 application 1 is associated with big brother application
W.
[0417] The metadata 8300 can include the association as shown in
column 8312. Column 8312 can have a simple indication as to whether
or not there is an association for the device application in column
8316. The column 8312 can be searched quickly, as it is binary,
allowing the unified system 1100 to determine if the device
application, shown in column 8316, has an associated big brother
application in column 8320. If there is an association, a third
column 8324 may identify which big brother applications are
associated with the device application in column 8316.
[0418] The associations between the device applications, in column
8316, and the big brother applications, shown in column 8324, may be
created automatically or manually. Thus, there may be a simple
indication automatically provided to help a user make the
association. For example, a simple name of the application may be
enough to determine if there is an association between a device
application and a big brother application. In other situations, a
user or system administrator may be able to make the associations
manually by providing the metadata or the association to the data
structure 8304. The user may manually set an association between,
for example, application 8332 and big brother application 8336.
[0419] The device 100 may search the metadata within the data
structure 8304. As the device 100 may be master, the device 100 may
determine if there is a big brother application. As such, if the
device 100 does determine there is a big brother application
associated with the device application whose display is moved to
the computer system display 1304, the device 100 can send a
directive to the computer system 1104 to execute the big brother
application identified or associated with the metadata 8300, in the
data structure 8304. Thus, the device 100 can read the identifier,
in column 8324, and send that information to the computer system
1104 to execute that big brother application. Further, the device
100 may read the state information included in column 8328 to
determine what state information to send to the computer system
1104 to properly execute the window in display 1304.
[0420] An application may be introduced to either a device 100 or
the computer system 1104. When a new application or some other
change to the configuration of the device 100 or computer system
1104 is made, a new association between a device application and a
big brother application may be made. For example, if application
8340 is newly introduced to the device 100, the device 100 may
automatically determine that there is an application, application
8344, on the computer system 1104 that should be associated as a
big brother application. Thus, if a new application
is installed, the device 100 may create a new row or item in the
data structure 8304 making the association and determining the
metadata for the association in the data structure 8304.
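A hypothetical sketch of this automatic association on install, using a simple name match as the (assumed) matching rule:

```java
// Hypothetical sketch: when a new application is installed on the device,
// add a new association row to the metadata (stand-in for structure 8304)
// if a computer system application with a matching name exists.
import java.util.List;
import java.util.Map;

public class AssociationBuilder {

    public static void onInstall(String newDeviceApp,
                                 List<String> computerSystemApps,
                                 Map<String, String> metadata) {
        computerSystemApps.stream()
                .filter(app -> app.equalsIgnoreCase(newDeviceApp)) // assumed rule
                .findFirst()
                .ifPresent(match -> metadata.put(newDeviceApp, match));
    }
}
```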
[0421] While the exemplary aspects, embodiments, and/or
configurations illustrated herein show the various components of
the system collocated, certain components of the system can be
located remotely, at distant portions of a distributed network,
such as a LAN and/or the Internet, or within a dedicated system.
Thus, it should be appreciated that the components of the system
can be combined into one or more devices, such as a tablet-like
device, or collocated on a particular node of a distributed
network, such as an analog and/or digital telecommunications
network, a packet-switch network, or a circuit-switched network. It
will be appreciated from the preceding description, and for reasons
of computational efficiency, that the components of the system can
be arranged at any location within a distributed network of
components without affecting the operation of the system. For
example, the various components can be located in a switch such as
a PBX and media server, gateway, in one or more communications
devices, at one or more users' premises, or some combination
thereof. Similarly, one or more functional portions of the system
could be distributed between a telecommunications device(s) and an
associated computing device.
[0422] Furthermore, it should be appreciated that the various links
connecting the elements can be wired or wireless links, or any
combination thereof, or any other known or later developed
element(s) that is capable of supplying and/or communicating data
to and from the connected elements. These wired or wireless links
can also be secure links and may be capable of communicating
encrypted information. Transmission media used as links, for
example, can be any suitable carrier for electrical signals,
including coaxial cables, copper wire and fiber optics, and may
take the form of acoustic or light waves, such as those generated
during radio-wave and infra-red data communications.
[0423] Also, while the flowcharts have been discussed and
illustrated in relation to a particular sequence of events, it
should be appreciated that changes, additions, and omissions to
this sequence can occur without materially affecting the operation
of the disclosed embodiments, configuration, and aspects.
[0424] In yet another embodiment, the systems and methods of this
disclosure can be implemented in conjunction with a special purpose
computer, a programmed microprocessor or microcontroller and
peripheral integrated circuit element(s), an ASIC or other
integrated circuit, a digital signal processor, a hard-wired
electronic or logic circuit such as a discrete element circuit, a
programmable logic device or gate array such as PLD, PLA, FPGA,
PAL, special purpose computer, any comparable means, or the like.
In general, any device(s) or means capable of implementing the
methodology illustrated herein can be used to implement the various
aspects of this disclosure. Exemplary hardware that can be used for
the disclosed embodiments, configurations and aspects includes
computers, handheld devices, telephones (e.g., cellular, Internet
enabled, digital, analog, hybrids, and others), and other hardware
known in the art. Some of these devices include processors (e.g., a
single or multiple microprocessors), memory, nonvolatile storage,
input devices, and output devices. Furthermore, alternative
software implementations including, but not limited to, distributed
processing or component/object distributed processing, parallel
processing, or virtual machine processing can also be constructed
to implement the methods described herein.
[0425] In yet another embodiment, the disclosed methods may be
readily implemented in conjunction with software using object or
object-oriented software development environments that provide
portable source code that can be used on a variety of computer or
workstation platforms. Alternatively, the disclosed system may be
implemented partially or fully in hardware using standard logic
circuits or VLSI design. Whether software or hardware is used to
implement the systems in accordance with this disclosure is
dependent on the speed and/or efficiency requirements of the
system, the particular function, and the particular software or
hardware systems or microprocessor or microcomputer systems being
utilized.
[0426] In yet another embodiment, the disclosed methods may be
partially implemented in software that can be stored on a storage
medium, executed on a programmed general-purpose computer with the
cooperation of a controller and memory, a special purpose computer,
a microprocessor, or the like. In these instances, the systems and
methods of this disclosure can be implemented as a program embedded
on a personal computer, such as an applet, JAVA.RTM. or CGI script, as
a resource residing on a server or computer workstation, as a
routine embedded in a dedicated measurement system, system
component, or the like. The system can also be implemented by
physically incorporating the system and/or method into a software
and/or hardware system.
[0427] Although the present disclosure describes components and
functions implemented in the aspects, embodiments, and/or
configurations with reference to particular standards and
protocols, the aspects, embodiments, and/or configurations are not
limited to such standards and protocols. Other similar standards
and protocols not mentioned herein are in existence and are
considered to be included in the present disclosure. Moreover, the
standards and protocols mentioned herein and other similar
standards and protocols not mentioned herein are periodically
superseded by faster or more effective equivalents having
essentially the same functions. Such replacement standards and
protocols having the same functions are considered equivalents
included in the present disclosure.
[0428] The present disclosure, in various aspects, embodiments,
and/or configurations, includes components, methods, processes,
systems and/or apparatus substantially as depicted and described
herein, including various aspects, embodiments, configurations,
subcombinations, and/or subsets thereof. Those of
skill in the art will understand how to make and use the disclosed
aspects, embodiments, and/or configurations after understanding the
present disclosure. The present disclosure, in various aspects,
embodiments, and/or configurations, includes providing devices and
processes in the absence of items not depicted and/or described
herein or in various aspects, embodiments, and/or configurations
hereof, including in the absence of such items as may have been
used in previous devices or processes, e.g., for improving
performance, achieving ease, and/or reducing cost of
implementation.
[0429] The foregoing discussion has been presented for purposes of
illustration and description. The foregoing is not intended to
limit the disclosure to the form or forms disclosed herein. In the
foregoing Detailed Description for example, various features of the
disclosure are grouped together in one or more aspects,
embodiments, and/or configurations for the purpose of streamlining
the disclosure. The features of the aspects, embodiments, and/or
configurations of the disclosure may be combined in alternate
aspects, embodiments, and/or configurations other than those
discussed above. This method of disclosure is not to be interpreted
as reflecting an intention that the claims require more features
than are expressly recited in each claim. Rather, as the following
claims reflect, inventive aspects lie in less than all features of
a single foregoing disclosed aspect, embodiment, and/or
configuration. Thus, the following claims are hereby incorporated
into this Detailed Description, with each claim standing on its own
as a separate preferred embodiment of the disclosure.
[0430] Moreover, though the description has included description of
one or more aspects, embodiments, and/or configurations and certain
variations and modifications, other variations, combinations, and
modifications are within the scope of the disclosure, e.g., as may
be within the skill and knowledge of those in the art, after
understanding the present disclosure. It is intended to obtain
rights which include alternative aspects, embodiments, and/or
configurations to the extent permitted, including alternate,
interchangeable and/or equivalent structures, functions, ranges or
steps to those claimed, whether or not such alternate,
interchangeable and/or equivalent structures, functions, ranges or
steps are disclosed herein, and without intending to publicly
dedicate any patentable subject matter.
* * * * *