U.S. patent application number 12/020472, for a mobile device user interface for remote interaction, was published by the patent office on 2008-07-31; the application was filed on 2008-01-25. Invention is credited to Nitin Bhandari, Dan Duong, Erik R. Swenson, and Alexander James Vincent.
Application Number: 20080184128 / 12/020472
Family ID: 39645203
Published: 2008-07-31

United States Patent Application 20080184128
Kind Code: A1
Swenson; Erik R.; et al.
July 31, 2008
MOBILE DEVICE USER INTERFACE FOR REMOTE INTERACTION
Abstract
Systems and methods pertaining to displaying a web browsing
session are disclosed. In one embodiment, a system includes a web
browsing engine residing on a first device, with a viewing
application residing on a second device and operatively coupled to
the web browsing engine, where the viewing application is adapted
to display a portion of a webpage rendered by the web browsing
engine and an overlay graphical component. In the same embodiment,
the system also includes a recognition engine adapted to identify
an element on the webpage and communicate information regarding the
element to the viewing application.
Inventors: Swenson; Erik R. (San Jose, CA); Bhandari; Nitin (Fremont, CA); Vincent; Alexander James (San Jose, CA); Duong; Dan (San Jose, CA)
Correspondence Address:
FENWICK & WEST LLP
SILICON VALLEY CENTER, 801 CALIFORNIA STREET
MOUNTAIN VIEW, CA 94041, US
Family ID: 39645203
Appl. No.: 12/020472
Filed: January 25, 2008
Related U.S. Patent Documents

Application Number: 60/886,577
Filing Date: Jan 25, 2007
Current U.S. Class: 715/738; 715/790
Current CPC Class: H04N 19/167 20141101; H04N 19/164 20141101; H04N 19/107 20141101; H04N 21/41407 20130101; G09G 2370/022 20130101; G09G 2370/10 20130101; H04N 19/59 20141101; H04N 19/87 20141101; G09G 5/14 20130101; G09G 2340/0407 20130101; G09G 2340/145 20130101; G09G 2340/06 20130101; H04N 21/234336 20130101; H04N 21/4782 20130101; H04N 19/172 20141101; H04N 21/8451 20130101; G09G 2370/042 20130101; H04N 21/816 20130101; H04N 19/15 20141101; H04N 19/156 20141101
Class at Publication: 715/738; 715/790
International Class: G06F 3/00 20060101 G06F003/00; G06F 3/048 20060101 G06F003/048
Claims
1. A system for displaying a web browsing session, the system
comprising: a web browsing engine residing on a first device; a
viewing application residing on a second device and operatively
coupled to the web browsing engine, the viewing application adapted
to display a portion of a webpage rendered by the web browsing
engine and an overlay graphical component; and a recognition engine
adapted to identify an element on the webpage and communicate
information regarding the element to the viewing application.
2. The system of claim 1, wherein the recognition engine is further
adapted to communicate information regarding the element to the
viewing application responsive to the viewing application passing
user input to the recognition engine.
3. The system of claim 2, wherein the user input comprises a
selection of a location on the webpage displayed on the second
device, the location corresponding to the element's location on the
webpage.
4. The system of claim 1, wherein the recognition engine is further
adapted to communicate information regarding the element to the
viewing application with the portion of the webpage.
5. The system of claim 1, further comprising: a state manager
engine operatively coupled to the viewing application and the web
browsing engine, the state manager engine adapted to synchronize
user input displaying locally on the viewing application with user
input transferred to the web browsing engine.
6. The system of claim 1, further comprising: a plurality of
overlay graphical components corresponding to a plurality of
elements on the webpage.
7. The system of claim 1, wherein the overlay graphical component
comprises a cursor icon, the viewing application further adapted to
transform the cursor icon, based on the information communicated
regarding the element, while the cursor icon overlays the
element.
8. The system of claim 1, wherein the element comprises a radio
button and the overlay graphical component comprises an overlay
radio button, the viewing application further adapted to transform
the overlay radio button upon a user selection and communicate
selection of the radio button element to the web browsing
engine.
9. The system of claim 1, wherein the element comprises a drop-down
menu and the overlay graphical component comprises an overlay
drop-down menu, the viewing application further adapted to expand
the overlay drop-down menu upon a user selection and communicate
user selection of a menu option to the web browsing engine.
10. The system of claim 1, wherein the element comprises a textbox
and the overlay graphical component comprises an overlay textbox,
the viewing application further adapted to fill in the overlay
textbox based on user input and communicate the overlay textbox
user input to the web browsing engine.
11. The system of claim 1, wherein the recognition engine is
further adapted to identify elements based on underlying code of
the webpage.
12. The system of claim 1, wherein the viewing application is
further adapted to display a user interface accompanying the
portion of the webpage.
13. The system of claim 1, wherein the viewing application is
further adapted to switch from the portion of the webpage to a full
screen view of a video embedded in the webpage.
14. The system of claim 13, wherein the viewing application is
further adapted to display a video-control user interface
accompanying the video.
15. A method for displaying a web browsing session, the method
comprising: rendering a portion of a webpage, the rendering
performed by a web browsing engine residing on a first device;
recognizing an element on the webpage and communicating information
regarding the element to a viewing application, the recognizing
performed by a recognition engine, the viewing application residing
on a second device; and displaying the portion of the webpage and
an overlay graphical component via the viewing application.
16. The method of claim 15, wherein the communicating information
regarding the element to the viewing application is responsive to
the viewing application passing user input to the recognition
engine, the user input comprising a selection of a location on the
webpage displayed on the second device and the location
corresponding to the element's location on the webpage.
17. The method of claim 15, further comprising: communicating
information regarding the element to the viewing application with
the portion of the webpage.
18. The method of claim 15, further comprising: synchronizing user
input displaying locally on the viewing application with user input
transferred to the web browsing engine.
19. The method of claim 15, further comprising: a plurality of
overlay graphical components corresponding to a plurality of
elements on the webpage.
20. The method of claim 15, wherein the overlay graphical component
comprises a cursor icon, the method further comprising:
transforming the cursor icon, based on the information communicated
regarding the element, while the cursor icon overlays the
element.
21. The method of claim 15, wherein the element comprises a radio
button and the overlay graphical component comprises an overlay
radio button, the method further comprising: transforming the
overlay radio button upon a user selection; and communicating
selection of the radio button element to the web browsing
engine.
22. The method of claim 15, wherein the element comprises a
drop-down menu and the overlay graphical component comprises an
overlay drop-down menu, the method further comprising: receiving an
initial state of the drop-down menu, including items listed;
expanding the overlay drop-down menu upon a user selection; and
communicating user selection of a menu option to the web browsing
engine.
23. The method of claim 15, wherein the element comprises a textbox
and the overlay graphical component comprises an overlay textbox,
the method further comprising: receiving an initial state of the
textbox, including initial text; filling in the overlay textbox
based on user input; and communicating the overlay textbox user
input to the web browsing engine.
24. The method of claim 15, further comprising: displaying a user
interface accompanying the portion of the webpage.
25. The method of claim 15, further comprising: switching from the
portion of the webpage to a full screen view of a video embedded in
the webpage, the switching performed by the viewing application.
Description
CROSS-REFERENCE TO RELATED APPLICATIONS
[0001] The present application claims priority under 35 U.S.C. § 119(e) to U.S. Provisional Patent Application No. 60/886,577, filed on Jan. 25, 2007, entitled "SYSTEM FOR WEB BROWSING VIDEO," which is incorporated by reference in its entirety.
BACKGROUND
[0002] The present invention relates to a system for displaying a
web browsing session. In particular, the present invention is
related to a viewing application for displaying a web browsing
session from a separate device.
[0003] Access to applications, including web browsers, is provided
for in various client-server environments. Placing a web browser on
a server for delivery to a client presents a large number of
issues, including issues with the delivery of the browsing
experience to the client user, such as handling interactive objects
within a web page. For interaction with handheld clients, such as
cellular phones, bandwidth and display size constraints pose
additional challenges in delivering a satisfactory web browsing
experience from a server, including dealing with any latency within
the system evident to the end user.
[0004] There exists a need to support full-featured web browsing
sessions on a diverse cross-section of bandwidth and
capability-limited mobile devices in a way that addresses these
challenges and advantageously utilizes a client-server environment,
as well as to support the use of other applications in this same
manner. Embodiments of this invention will address other needs as
well.
SUMMARY
[0005] In various embodiments, the present invention provides
systems and methods pertaining to displaying a web browsing
session. In one embodiment, a system includes a web browsing engine
residing on a first device, with a viewing application residing on
a second device and operatively coupled to the web browsing engine,
where the viewing application is adapted to display a portion of a
webpage rendered by the web browsing engine and an overlay
graphical component. In the same embodiment, the system also
includes a recognition engine adapted to identify an element on the
webpage and communicate information regarding the element to the
viewing application.
[0006] In another embodiment, a system for displaying a web
browsing session includes a recognition engine further adapted to
communicate information regarding the element to the viewing
application responsive to the viewing application passing user
input to the recognition engine. In yet another embodiment, such
user input comprises a selection of a location on the webpage
displayed on the second device, the location corresponding to the
element's location on the webpage.
[0007] In still yet another embodiment, a system for displaying a
web browsing session includes a recognition engine further adapted
to communicate information regarding the element to the viewing
application with the portion of the webpage.
[0008] In a further embodiment, the system further comprises a
state manager engine operatively coupled to the viewing application
and the web browsing engine, the state manager engine adapted to
synchronize user input displaying locally on the viewing
application with user input transferred to the web browsing
engine.
[0009] One skilled in the art will recognize that the present
invention can be implemented in a wide variety of ways, and many
different kinds of apparatus and systems may implement various
embodiments of the invention.
BRIEF DESCRIPTION OF THE DRAWINGS
[0010] FIG. 1 is a block diagram illustrating some aspects of a
client-server architecture of the present invention, according to
one embodiment.
[0011] FIG. 2 is a block diagram illustrating some aspects of the
present invention in connection with a server, according to one
embodiment.
[0012] FIG. 3 is a block diagram illustrating some aspects of an
architectural overview of the present invention, including a
server, an audio server and a client, according to one
embodiment.
[0013] FIG. 4 is a block diagram illustrating some aspects of the
present invention in connection with a client, according to one
embodiment.
[0014] FIG. 5 is a diagram illustrating some aspects of
multiple-user software architecture, according to one
embodiment.
[0015] FIG. 6 is a flowchart illustrating some supporting aspects
of capturing a succession of video frames, according to one
embodiment.
[0016] FIG. 7 is a flowchart illustrating some supporting aspects
of sending a succession of video frames, according to one
embodiment.
[0017] FIG. 8 is a diagram illustrating some aspects of a
client-server exchange, according to one embodiment.
[0018] FIG. 9 is a diagram illustrating some aspects of a
client-server exchange, including an accompanying exchange within
the server, according to one embodiment.
[0019] FIG. 10 is a diagram illustrating some aspects of viewport
move operations and related state management, according to one
embodiment.
[0020] FIG. 11 is a diagram illustrating some aspects of a
client-server exchange with respect to state management, according
to one embodiment.
[0021] FIG. 12 is a diagram illustrating some aspects of a
client-server exchange, including an accompanying exchange between
a server and network storage, according to one embodiment.
[0022] FIG. 13 is a diagram illustrating some aspects of displaying
a web browsing session from a server to a cellular phone, according
to one embodiment.
[0023] FIG. 14 is a diagram illustrating some aspects of a user
interface display, according to one embodiment.
[0024] FIG. 15 is a diagram illustrating some aspects of a user
interface display, according to one embodiment.
[0025] FIG. 16 illustrates an example computer system suitable for
use in association with a client-server architecture for remote
interaction, according to one embodiment.
[0026] One skilled in the art will recognize that these Figures are
merely examples of the operation of the invention according to one
or some embodiments, and that other architectures, method steps,
exchanges and modes of operation can be used without departing from
the essential characteristics of the invention.
DETAILED DESCRIPTION OF THE EMBODIMENTS
[0027] The present invention is now described more fully with
reference to the accompanying Figures, in which one or some
embodiments of the invention are shown. The present invention may
be embodied in many different forms and should not be construed as
limited to the embodiments set forth herein. Rather, these
embodiments are provided so that this disclosure will be complete
and will fully convey principles of the invention to those skilled
in the art.
[0028] For illustrative purposes, embodiments of the invention are
described in connection with a server or a mobile client device,
such as an example mobile client device. Various specific details
are set forth herein regarding embodiments with respect to servers
and mobile client devices to aid in understanding the present
invention. However, such specific details are intended to be
illustrative, and are not intended to restrict in any way the scope
of the present invention as claimed herein. In particular, one
skilled in the art will recognize that the invention can be used in
connection with a wide variety of contexts, including, for example,
client devices operating in a wired network. In addition,
embodiments of the invention are described in connection with a web
browsing application, but such descriptions are intended to be
illustrative and examples, and in no way limit the scope of the
invention as claimed. Various embodiments of the invention may be
used in connection with many different types of programs, including
an operating system (OS), a wide variety of applications, including
word processing, spreadsheet, presentation, and database
applications, and so forth.
[0029] In some embodiments, the present invention is implemented at
least partially in a conventional server computer system running an
OS, such as a Microsoft OS, available from Microsoft Corporation;
various versions of Linux; various versions of UNIX; a MacOS,
available from Apple Computer Inc.; and/or other operating systems.
In some embodiments, the present invention is implemented in a
conventional personal computer system running an OS such as
Microsoft Windows Vista or XP (or another Windows version), MacOS X
(or another MacOS version), various versions of Linux, various
versions of UNIX, or any other OS designed to generally manage
operations on a computing device.
[0030] In addition, the present invention can be implemented on, or
in connection with, devices other than personal computers, such as,
for example, personal digital assistants (PDAs), cell phones,
computing devices in which one or more computing resources is
located remotely and accessed via a network, running on a variety
of operating systems. The invention may be included as add-on
software, or it may be a feature of an application that is bundled
with a computer system or sold separately, or it may even be
implemented as functionality embedded in hardware.
[0031] Output generated by the invention can be displayed on a
screen, transmitted to a remote device, stored in a database or
other storage mechanism, printed, or used in any other way. In
addition, in some embodiments, the invention makes use of input
provided to the computer system via input devices such as a
keyboard (screen-based or physical, in a variety of forms), scroll
wheels, number pads, stylus-based inputs, a touchscreen or
touchpad, etc. Such components, including their operation and
interactions with one another and with a central processing unit of
the personal computer, are well known in the art of computer
systems and therefore are not depicted here.
[0032] Any software portions described herein with reference to
modules need not include discrete software modules. Any software
configuration described herein is meant only by way of example;
other configurations are contemplated by and within the scope of
various embodiments of the present invention. The term, engine, is
used herein to denote any software or hardware configuration, or
combination thereof, that performs the function or functions
referenced.
[0033] Reference in the specification to "one embodiment" or "an
embodiment" means that a particular feature, structure, or
characteristic described in connection with the embodiment is
included in at least one embodiment of the invention. The
appearance of the phrase "in one embodiment" in various places in
the specification does not necessarily refer to the same
embodiment. Appearances of the phrase "in some embodiments" in various places in the specification are not necessarily all referring to the same set of embodiments; likewise, appearances of the phrase "in various embodiments" in various places in the specification are not necessarily all referring to the same set of embodiments.
1. System
[0034] FIG. 1 is a block diagram illustrating some aspects of
system 100 of the present invention, according to one embodiment.
System 100 employs a client-server architecture that includes a
number of server application instances running on server 200,
including server application 1 (102), server application 2 (104),
server application 3 (106), and a wide-ranging number of additional
server applications (represented by ellipsis 108), up to server
application n (110). The term "server application" is used herein
to denote a server-side application, i.e., an application running
on one or more servers. Server application n (110) represents the
number of server application instances that happen to be running in
system 100 at any given point. Server 200 also includes user
manager module 502, which serves to manage multiple users among the
multiple server application instances 102-110. User manager module
502 is described herein in FIG. 5, and represents one of potential
multiple user managers running on server 200. Server 200 is running
one instance of an OS underlying server applications 102-110. In
another embodiment, server 200 may run multiple instances of an OS,
each OS instance including one or more application instances.
Server 200 also includes provision manager module 1205, which is
described herein in FIG. 12.
[0035] While FIG. 1 illustrates multiple server applications
102-110, in other embodiments, a number of different types of
programs may be alternately used, including, for instance, an OS.
Server applications 102-110 illustrated in FIG. 1 may run on one
server 200 or any number of servers, as, for example, in one or
more server farm environments. Server applications 102-110 may each
comprise instances of different server applications, or may all
comprise an instance of one server application. For example, each
server application 102-110 could comprise a separate instance of a
web browsing application.
A. Server
[0036] Describing server application 1 (102) in further detail, as
an example server application instance, server application 1 (102)
includes application 112, plugin 114, state manager module 115,
recognition module 117, audio data generator 116, audio encoder
module 120, video encoder module 124, and command process module
126. Video encoder module 124 makes use of feedback parameter
125.
[0037] Video encoder module 124 is operatively coupled to
application 112, and is adapted to receive a succession of captures
(122) of the user interface (UI) of application 112 for encoding
into video frames for transmission via network 128. The succession
of captures (122) of the UI comprise data that is captured and
transferred from application 112 to video encoder 124 by a separate
module, described and illustrated in FIG. 2 (image management
module 216). State manager module 115 manages state information, as
will be described in relation to subsequent Figures. Recognition
module 117 identifies elements related to output of application
112, as will be described in relation to subsequent Figures. The
term, user interface, as used throughout this disclosure, refers to
all or a portion of any user interface associated with a wide
variety of computer programs.
[0038] The encoding of application UI captures (122) is not limited
to any particular encoding or video compression format, and may
include a wide variety of video compression techniques, ranging
from the use of a video compression standard, such as H.264, to an
entirely customized form of video compression, to a modified
version of a video compression standard, and so forth.
[0039] Audio encoder module 120 is operatively coupled to audio
data generator 116 of application 112, and is adapted to transform
audio captures 118 (e.g., an audio stream) of audio data generator
116 into an encoded audio stream for transmission via network 128.
Audio captures 118 comprise data transferred from audio data generator 116 to audio encoder module 120.
[0040] Audio data generator 116 is operatively coupled to
application 112, and is adapted to generate the audio data
accompanying application 112. Plugin 114 is operatively coupled to
application 112 and command process module 126. Plugin 114 is
adapted to facilitate the interface between application 112 and
command process module 126.
[0041] Server 200 is further described herein in FIG. 2.
C. Client
[0042] System 100 includes a number of clients, including client 1
(400), client 2 (132), client 3 (134), and a wide-ranging number of
additional clients (represented by ellipsis 136), up to client n
(138), with client n (138) representing the number of clients that
happen to be engaged in the system at any given point. As
illustrated in FIG. 1, the different clients comprise different,
non-related client devices.
[0043] Describing client 1 (400) in further detail, as an example
client, client 1 (400) may include audio decoder module 142, video
decoder module 144, command process module 146, viewing application
148, state manager module 149, and speaker 150. Video decoder
module 144 may be adapted to decode the succession of video frames
encoded by video encoder module 124, where the successive video
frames have been transmitted across network 128 for reception by
client 1 (400). Video decoder module 144 may be operatively coupled
to viewing application 148, and adapted to communicate the decoded
video frames to viewing application 148 for display of the video
frames on client 1 (400). State manager module 149 manages state
information, as will be described in relation to subsequent
Figures.
[0044] Client 1 (400) includes speaker 150, and audio decoder
module 142 is operatively coupled to speaker 150. Audio decoder
module 142 is adapted to decode the audio captures encoded by audio
encoder module 120, where the encoded audio has been transmitted
across network 128 for reception by client 1 (400). After decoding
the audio stream, audio decoder module 142 may communicate the
decoded audio to speaker 150 for audio output from client 1
(400).
[0045] Viewing application 148 is adapted to receive user input and
communicate the user input to command process module 146. Command
process module 146 is adapted to communicate the user input back to
command process module 126 of application 102 via network 128.
Command process module 126 is adapted to communicate the user input
to application 112 via plugin 114.
[0046] Plugin 114 facilitates the remote interactive use of
application 112 via the system 100 described in FIG. 1. Plugin 114
may also be an extension. In another embodiment, application 112
may be customized for use with the client-server architecture of
this invention to the extent that a special plugin is not needed.
In yet another embodiment, neither a plugin nor special application
modifications may be needed.
[0047] Command process module 146 is adapted to communicate one or
more feedback parameters 125 to command process module 126. Command
process module 126 is adapted to communicate the one or more
feedback parameters 125 to video encoder module 124 and audio
encoder module 120 for their respective encoding of the succession
of application UI captures 122 and audio captures 118. The one or
more feedback parameters 125 may comprise one or more of a wide
range of parameters, including a bandwidth parameter relating to at
least a portion of network 128, a device parameter of client 1
(400) or a user input for client 1 (400).
[0048] The one or more feedback parameters 125 may comprise a
bandwidth parameter, which may include any estimated or measured
bandwidth data point. An example bandwidth parameter may include estimated bandwidth based on measurements of certain packets traversing between server 200 and client 1 (400) (e.g., the amount of data sent divided by the traversal time, yielding a throughput value), or other bandwidth information obtained from, or in conjunction with, network 128, including from a network protocol. The one or
more feedback parameters 125 may comprise user input for client 1
(400), including, for example, a user request for encoding
performed in a certain format or manner, with such a request being
requested and communicated by viewing application 148. The one or
more feedback parameters 125 may comprise a display resolution of
client 1 (400) (e.g., CGA, QVGA, VGA, NTSC, PAL, WVGA, SVGA, XGA,
etc.). The one or more feedback parameters 125 may comprise other
screen parameters (e.g., screen size, refresh capabilities,
backlighting capabilities, screen technology, etc.) or other
parameters of the client device (e.g., device processor, available
memory for use in storing video frames, location if GPS or other
location technology-enabled, etc.). None of the example feedback
parameters discussed above are meant to exclude their combined use
with each other, or other feedback parameters. In some embodiments,
video encoder module 124 may be adapted to at least partially base
its video sample rate on the one or more feedback parameters
125.
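By way of illustration, the throughput-style bandwidth estimate and feedback-driven video sample rate described above can be sketched as follows. This is only a sketch: the class and function names, the bandwidth thresholds, and the rate-scaling choices are assumptions for illustration, not details taken from the application.

```python
class FeedbackParameters:
    """Container for client-reported feedback (hypothetical field names)."""
    def __init__(self, bandwidth_bps=None, display_resolution=None):
        self.bandwidth_bps = bandwidth_bps
        self.display_resolution = display_resolution  # e.g. (320, 240) for QVGA

def estimate_bandwidth(bytes_sent, send_time, recv_time):
    """Throughput estimate: data sent divided by traversal time, in bits/s."""
    elapsed = recv_time - send_time
    if elapsed <= 0:
        return None  # unusable measurement
    return (bytes_sent * 8) / elapsed

def choose_sample_rate(feedback, base_fps=15.0):
    """Scale the encoder's video sample rate down on constrained links."""
    if feedback.bandwidth_bps is None:
        return base_fps
    if feedback.bandwidth_bps < 256_000:     # slow cellular link (assumed cutoff)
        return base_fps / 3
    if feedback.bandwidth_bps < 1_000_000:   # moderate link (assumed cutoff)
        return base_fps / 2
    return base_fps
```

In this sketch, command process module 126 would populate `FeedbackParameters` from the data relayed by command process module 146, and video encoder module 124 would consult `choose_sample_rate` before each encoding pass.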
[0049] The multiple clients depicted in FIG. 1 are illustrated to
indicate that each client may potentially comprise a different type
of client device, each with its own one or more feedback
parameters.
[0050] Client 1 (400) is further described herein in FIG. 4.
[0051] One skilled in the art will recognize that the client-server
architecture illustrated in FIG. 1 is merely an example, and that
the invention may be practiced and implemented using many other
architectures and environments.
[0052] FIG. 2 is a block diagram illustrating some aspects of the
present invention in connection with server 200, according to one
embodiment. Server 200 includes user manager module 502, provision
manager module 1205, server application 1 (102), application 112,
plugin 114, state manager module 115, recognition module 117, audio
data generator 116, audio encoder module 120, image management
module 216, memory 218, video encoder module 124 (which includes
feedback parameter 125), command process module 126, and align
module 224. Command process module 126 includes client interpreter
sub-module 228, and plugin 114 includes client implementer
sub-module 208. The components illustrated in FIG. 2 with the same
numbers as components illustrated in FIG. 1 correspond to those
respective components of FIG. 1, and thus their general operation
will not be repeated. While one running application is illustrated
with respect to server 200, server application 102 is illustrated
as a representative instance of multiple server applications
running on server 200, each of the multiple server applications
being associated with its own distinct client (clients are not
shown in this illustration). Additionally, user manager module 502
represents one of potential multiple user managers running on
server 200.
[0053] Image management module 216 serves to capture the UI of
application 112 (as the UI would appear on a screen) and save the
capture in memory 218. Any capture process such as screen-scraping
may be used, and image management module 216 may perform this
capture at any desired rate. Image management module 216 also
compares the last prior capture of the application UI to the
current capture to determine whether any changes have occurred in a
particular area of the application UI. Any image/video frame
matching process may be used for this comparison operation. Image
management module 216 serves to repetitively perform this
function.
[0054] If image management module 216 detects any change in the
particular area of interest, a delta flag is set to indicate that
the area of interest has changed. Upon detecting a change, image
management module 216 serves to convert the native format of the UI
rendered data to a video frame format more suited for compression
and transmission to the client device (e.g., color space
transformation, data format transformation, etc.). Image management
module 216 serves to resize the image for the reformatted video
frame. In the embodiment of FIG. 2, multiple parameters of the
applicable client device were included in the one or more feedback
parameters 125, allowing image management module 216 to perform the
reformatting and resizing based on client device parameters (the
relevant parameters having been communicated to image management
module 216).
[0055] Image management module 216 periodically checks (based on
its sample interval) if the delta flag has been set. If the delta
flag is detected as set during a check, the reformatted/resized
video frame in memory 218 is encoded by video encoder module 124
for transmission to the client device.
[0056] Client interpreter sub-module 228 of command process module
126 serves to interpret data received from client device 400 and to
translate this data for use in connection with video encoder module
124, audio encoder module 120 and application 112 (e.g., user
commands, etc.). Client interpreter sub-module 228 serves to pass
the feedback parameters 125 to video encoder module 124 and audio
encoder module 120 for use in encoding.
[0057] Client interpreter sub-module 228 of command process module
126 serves to translate client-received data for use in connection
with plugin 114 and its client implementer sub-module 208. In
communicating back user input, the client device passes coordinates
(of a cursor, etc.) relative to the client device's screen to
command process 126. Client interpreter sub-module 228 serves to
determine the corresponding location in relation to the viewport of
the client device and the application UI. Client interpreter
sub-module 228 then communicates the translated coordinates to
plugin 114 for use by its client implementer sub-module 208. Client
implementer sub-module 208 serves to translate from conventional
user input to a format appropriate for application 112, and then to
directly inject the translated input into application 112.
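The coordinate translation performed by client interpreter sub-module 228 can be sketched as follows. This is a minimal illustration only, not the patent's implementation; the function name and the corner-coordinate/zoom-factor parameterization of the viewport are assumptions drawn from the viewport description given later in relation to FIG. 9.

```python
def translate_coords(client_x, client_y, viewport_x, viewport_y, zoom):
    """Map a cursor position reported relative to the client device's
    screen to the corresponding point on the server-side application UI.

    (viewport_x, viewport_y) is assumed to be the UI-space corner
    coordinate of the viewport, and zoom the viewport's zoom factor
    (zoom > 1 meaning the UI is magnified on the client screen).
    """
    ui_x = viewport_x + client_x / zoom
    ui_y = viewport_y + client_y / zoom
    return ui_x, ui_y
```

The translated coordinates would then be passed to plugin 114 for injection into the application, as described above.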
[0058] Align module 224 correlates and cross-stamps video frames
encoded by video encoder module 124 and audio encoded by audio
encoder module 120, so that the audio stream and the video frames
associated with the UI of application 112 may be readily matched at
client device 400. Image management module 216 may also serve to
time-stamp all images, and the operation of capturing audio from
audio data generator 116 may also serve to timestamp the audio
stream, both for down-stream alignment by align module 224, as
would be appreciated by one skilled in the art. In another
embodiment, all alignment/matching of audio and video frames may be
performed at the client device.
[0059] One skilled in the art will recognize that the illustration
of FIG. 2 is merely an example, and that the invention may be
practiced and implemented in many other ways.
[0060] FIG. 3 is a functional block diagram 300 illustrating some
aspects of an architectural overview of the present invention,
including a server, an audio server and a client, according to one
embodiment. In this embodiment, audio is sent to the client from
dedicated audio server 304. Functional block diagram 300 includes
server 302, audio server 304 and client 306, with client 306
operatively linked to server 302 and audio server 304 via network
310 (via connections 312, 314 and 316). Server 302 is operatively
linked to audio server 304 via connection 308. Server 302 includes
application 318, plugin 322, state manager module 323, recognition
module 327, audio data generator 320, video encoder module 324
(including feedback parameter 325), command process module 326,
audio interceptor module 330, PID (process identifier) manager
module 332, and time-stamp manager module 334.
[0061] Video encoder module 324 operates as described in FIG. 2,
being analogous to video encoder module 124 (and likewise, for
feedback parameter 325 with respect to feedback parameter 125).
Video encoder module 324 operates to encode application UI captures
328 and to communicate the encoded video frames for transmission to
client 306. In the process of obtaining application UI captures,
the resulting UI captures are time-stamped. Time-stamp manager
module 334 facilitates the time-stamping of the UI captures.
Command process module 326 operates as described in FIG. 2, being
analogous to command process module 126.
[0062] While one running application is illustrated with respect to
server 302, application 318 is illustrated as a representative
instance of multiple applications running on server 302, each of
the multiple applications having its own video encoder and command
process modules, and being associated with its own distinct client.
Audio data generator 320 renders an audio stream (not shown) for
application 318. Audio interceptor module 330 intercepts or traps
this audio stream for redirection to audio server 304, and may
timestamp the audio stream. Time-stamp manager module 334 may
facilitate the time-stamping of the audio stream. Audio interceptor
module 330 may make use of a customized DLL to facilitate such a
redirection of the audio stream. PID manager module 332 serves to
detect and manage the different process IDs of the multiple
applications running on server 302. PID manager module 332 may
stamp each audio stream redirected to audio server 304 with the
process ID of its associated application.
[0063] Audio server 304 includes audio stream processing module 336
and PID authentication module 338. Audio stream processing module
336 serves to encode the audio streams received from the
applications running on server 302, and perform any conversion
desired (e.g., conversion of sample rates, bit depths, channel
counts, buffer size, etc.). In the embodiment of FIG. 3, User
Datagram Protocol ports are used (not shown) to direct each audio
stream to its destination client device; other protocols may be
used in other embodiments. Audio stream processing module 336
directs each audio stream to the port associated with the audio
stream's corresponding client device (i.e., the client device
displaying the video frames corresponding to the audio stream).
Audio stream processing module 336 may work in association with PID
authentication module 338 to verify and direct the multiple audio
streams streaming from server 302 to the appropriate port.
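The PID-based routing performed by audio stream processing module 336 in association with PID authentication module 338 can be sketched as follows. The packet tuples, the PID-to-port mapping, and the drop behavior for unauthenticated PIDs are illustrative assumptions; the patent specifies only that streams are verified and directed to the UDP port of the corresponding client device.

```python
def route_audio(packets, pid_to_port):
    """Direct each PID-stamped audio packet to the port associated with
    its application's client device, separating out packets whose PID
    has no registered client (i.e., fails authentication)."""
    routed = {}
    dropped = []
    for pid, payload in packets:
        port = pid_to_port.get(pid)
        if port is None:
            dropped.append((pid, payload))   # unauthenticated PID
        else:
            routed.setdefault(port, []).append(payload)
    return routed, dropped
```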
[0064] Client 306 includes video decoder module 340, audio decoder
module 342, command process module 344 and audio/video sync module
346. After client 306 receives and decodes the applicable audio and
video streams from server 302 (i.e., the audio and video streams of
the application instantiated for client 306), audio/video sync
module 346 correlates the time-stamps on both streams and works in
conjunction with audio decoder module 342 and video decoder module
340 to synchronize output to speaker 348 and viewing application
350, respectively. Client 306 also includes state manager module
351 to manage state information.
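The time-stamp correlation performed by audio/video sync module 346 can be sketched as follows. The data layout (lists of timestamp/payload tuples) and the tolerance value are illustrative assumptions; the patent states only that the time-stamps on both streams are correlated to synchronize output.

```python
def sync_streams(video_frames, audio_chunks, tolerance_ms=40):
    """Pair each decoded video frame with the audio chunk whose
    time-stamp is closest, within a tolerance. Each input is a list of
    (timestamp_ms, payload) tuples sorted by timestamp; a frame with no
    sufficiently close audio chunk is paired with None."""
    pairs = []
    for v_ts, v_payload in video_frames:
        best = min(audio_chunks, key=lambda a: abs(a[0] - v_ts), default=None)
        if best is not None and abs(best[0] - v_ts) <= tolerance_ms:
            pairs.append((v_payload, best[1]))
        else:
            pairs.append((v_payload, None))
    return pairs
```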
[0065] One skilled in the art will recognize that the illustration
of FIG. 3 is merely an example, and that the invention may be
practiced and implemented in many other ways.
[0066] FIG. 4 is a block diagram illustrating some aspects of the
present invention in connection with a client, according to one
embodiment. Client 400 includes video decoder module 144, audio
decoder module 142, audio/video sync module 406, command process
module 146, speaker 150, viewing application 148, state manager
module 149, and connections 410, 412 and 414.
[0067] Video decoder module 144 receives encoded video frames via
connection 412, while audio decoder module 142 receives an encoded
audio stream via connection 414. Audio/video sync module 406 serves
to match time-stamps or another type of identifier on the audio
stream and the video frames for synced output via speaker 150 and
viewing application 148, respectively. Audio decoder module 142,
video decoder module 144 and viewing application 148 all may serve
to provide feedback to command process module 146, which
communicates feedback parameters (not illustrated in FIG. 4) back
to the server-side application, including parameters to vary the
sample rate and/or compression of the video encoding, the audio
encoding, etc.
[0068] Command process module 146 serves to pass feedback
parameters of client 400 for use in video and/or audio encoding
upon initiation of a session or during a session. Such feedback
parameters may include one or more of the following parameters:
display resolution, screen size, processor identification or
capabilities, memory capabilities/parameters, speaker capabilities,
and so forth.
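A capability message of the kind command process module 146 might pass at session initiation can be sketched as follows. The patent does not specify a serialization format or field names; JSON and the names below are illustrative assumptions only.

```python
import json

def build_feedback_message(display_resolution, screen_size, processor,
                           memory_mb, speaker_channels):
    """Assemble client feedback parameters (display resolution, screen
    size, processor, memory, speaker capabilities) into a message for
    transmission to the server at session initiation."""
    return json.dumps({
        "display_resolution": display_resolution,  # e.g. (320, 240)
        "screen_size": screen_size,
        "processor": processor,
        "memory_mb": memory_mb,
        "speaker_channels": speaker_channels,
    })
```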
[0069] Viewing application 148 displays the succession of video
frames of a portion of the server-side application's UI. Viewing
application 148 serves to facilitate communicating user input
control, including user commands, to command process module 146 for
transmission back to the server. Client user input control passed
back to the server may include, for example, input from: a keyboard
(screen-based or physical, in a variety of forms), scroll wheels,
number pads, stylus-based inputs, a touchscreen or touchpad, etc.
Viewing application 148 serves to aggregate certain user input for
sending, such as opening up a local text box for text entry. State
manager module 149 manages state information, as will be described
in relation to subsequent Figures.
[0070] In one embodiment, viewing application 148 comprises an
application based on the Java Platform, Micro Edition, and portable
to BREW (Binary Runtime Environment for Wireless). In other
embodiments, viewing application 148 comprises an application based
on the Symbian platform, a Linux-based platform (e.g., Android), a
Palm platform, a Pocket PC/Microsoft Smartphone platform, or
another platform capable of supporting the functionality described
herein. In some embodiments, viewing application 148 runs as a
stand-alone Java application, while in other embodiments, viewing
application 148 runs as a plugin to a standard mobile browser, as
would be appreciated by one skilled in the art. In one embodiment,
viewing application 148 includes a number of modules to facilitate
implementing the functionality described herein, including modules
to facilitate browser navigation, client I/O and viewport tracking
(not shown), as would be appreciated by one of skill in the
art.
[0071] One skilled in the art will recognize that the illustration
of FIG. 4 is merely an example, and that the invention may be
practiced and implemented in many other ways.
[0072] FIG. 5 is a diagram 500 illustrating some aspects of a
multiple-user software architecture, according to one embodiment.
User manager module 502 includes worker thread 504, worker thread
506, and a variable number of additional worker threads
(represented by ellipsis 508), with `worker thread n` (worker
thread 510) representing the last of the n worker threads running
in the system at any given point. Each worker thread corresponds to
a list of active users, and each list may contain a different
number of active users. As illustrated in FIG.
5, worker thread 504 corresponds to thread cycle 512, worker thread
506 corresponds to thread cycle 514, the variable number of worker
threads represented by ellipsis 508 corresponds to the same
variable number of thread cycles represented by ellipsis 518, and
worker thread 510 corresponds to thread cycle 516. As illustrated
in FIG. 5, worker thread 504 cycles through user 1 (520), user 2
(522), user 3 (524) and user 4 (526); worker thread 506 cycles
through user 5 (528), user 6 (530) and user 7 (532); and worker
thread 510 cycles through user 8 (534), user 9 (536), user 10
(538), user 11 (540) and user 12 (542). The number of users
supported by the worker threads illustrated in FIG. 5 is meant to
represent a snapshot at an arbitrary point in time, as the number
of users supported by any given thread is dynamic.
[0073] User manager module 502 may be set to instantiate a finite
number of worker threads before instantiating additional worker
threads to manage further users added to the system. The number of
worker threads in the overall architecture illustrated by FIG. 5
will vary according to various embodiments. The parameters
regarding the number of active users assigned per worker thread
will also vary according to various embodiments.
[0074] User manager module 502 runs on a server (as illustrated in
FIG. 1) where multiple instances of applications (as illustrated in
FIG. 1) are also running. User manager module 502 thus serves to
manage multiple users in an environment of multiple application
instances. When a new user is introduced into the overall system of
FIG. 1, the new user is assigned to a worker thread (504-510) to
facilitate the interaction between a specific client and a specific
server-side application.
[0075] While the embodiment illustrated in FIG. 5 illustrates
multiple users being assigned to a single thread, in other
embodiments, a single user may be assigned to their own single
thread. In other embodiments, a user may be assigned to either a
shared thread or a dedicated thread depending on one or more
factors, such as the current loading/usage of the overall system,
the user's service policy with the provider of the respective
service operating an embodiment of this invention, and so
forth.
[0076] User manager module 502 facilitates load balancing of
multiple users in a number of ways, as each worker thread cycles
through their respective list of active users and processes one
active event for each user. The active events that may be processed
include: (a) send one video frame update to the client or (b)
update state information pertaining to the client's viewing
application and the server-side application/UI. For illustration
purposes for the case in which user 1 is associated with server
application 1 (102) of FIG. 1, worker thread 504 will time slice
between the respective video encoder/command process modules of
users 1 (520) through 4 (526) to perform (a) and (b) above, with
video encoder module 124 and command process module 126 comprising
the video encoder/command process modules of user 1 (520). A
separate thread (not shown) would be operatively coupled to audio
encoder module 120 to continuously send the encoded audio stream to
the client, when audio data is present to send.
[0077] By operating in this manner, no single user suffers as a
result of a processing-intensive session (e.g., for a high
resolution, high frame rate video) of another user being serviced
by the same thread. This will be further described below with
reference to FIG. 7. In other embodiments, more than one active
event may be processed per user. In other embodiments, a variable
number of active events may be processed per user, based on a wide
range of factors, including the level of underlying application
activity for the applicable user, the user's service policy with
the provider of the respective service operating an embodiment of
this invention, and so forth.
[0078] User manager module 502 may move specific users among
different worker threads to load balance among processing-intensive
users and processing-light users. For example, if multiple users
being serviced by one worker thread are in need of being serviced
with high frame rate video, user manager module 502 may move one or
more such users to another thread servicing only, or predominately,
processing-light users. Another thread may also be instantiated for
this purpose. As one skilled in the art would appreciate, a user
may be treated as an object, and moved to another thread as objects
are transferred among threads. The timing of such object moves may
take place at specific junctures in the display of video frames by
a client device's viewing application, in order to minimize
disruption of a user's experience.
[0079] One skilled in the art will recognize that the illustration
of FIG. 5 is merely an example, and that the invention may be
practiced and implemented in many other ways.
[0080] FIG. 6 is a flowchart 600 illustrating some supporting
aspects of capturing a succession of video frames, according to one
embodiment. Operation 602 (Render Application UI to memory) is
performed initially, either by a plugin to an application or by an
application itself. Operation 602 serves to capture the UI of an
application as the UI would appear on a screen and save the capture
in a memory buffer (218 of FIG. 2); actual display of the UI on a
screen is not required, but may be used. Operation 604 (Any delta
from prior Application UI capture?) then serves to compare the last
prior capture of the application UI to the current capture to
determine whether any changes have occurred. This delta checking
operation may be performed in a wide number of ways, including, for
example, hashing pixel blocks of the current UI capture and
comparing the hash values to an analogous pixel-hash table
generated from the prior UI capture. The hash values may then also
be available for potential use in any compression method utilized,
e.g., matching blocks of successive video frames, matching blocks
against a reference frame, etc. Alternatively, for example, the
application may notify the server when a change occurs prior to
operation 602.
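The pixel-block hashing approach to the delta check of operation 604 can be sketched as follows. The hash function (MD5), the tile size, and the flat row-major pixel layout are illustrative assumptions; the patent only describes hashing pixel blocks of the current capture and comparing against a pixel-hash table from the prior capture.

```python
import hashlib

def block_hashes(pixels, width, block=8):
    """Hash each block x block pixel tile of a frame. `pixels` is a
    flat list of byte-valued pixels in row-major order; the result maps
    tile corner coordinates to hash digests."""
    hashes = {}
    height = len(pixels) // width
    for by in range(0, height, block):
        for bx in range(0, width, block):
            tile = bytes(
                pixels[y * width + x]
                for y in range(by, min(by + block, height))
                for x in range(bx, min(bx + block, width))
            )
            hashes[(bx, by)] = hashlib.md5(tile).hexdigest()
    return hashes

def changed_blocks(prev_hashes, curr_hashes):
    """Return the tile coordinates whose hash differs from the prior
    capture -- a non-empty result corresponds to operation 604 being
    determined in the affirmative."""
    return [k for k, h in curr_hashes.items() if prev_hashes.get(k) != h]
```

As noted above, the per-tile hashes could then be reused by a compression method that matches blocks across successive video frames.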
[0081] If operation 604 is determined in the negative, then
Operation 606 (Delay) may be implemented by a timer (not shown),
before operation 602 is repeatedly performed. If operation 604 is
determined in the affirmative, Operation 608 (Convert to
appropriate format) is then performed. Operation 608 serves to
convert the native format of the UI rendered data to another format
more suited for compression and transmission over a network for
display on a client device (e.g., color space transformation, data
format transformation, etc.).
[0082] Operation 610 (Resize) is then performed. Operation 610
serves to resize the UI rendered data from its native screen size
to another size more suited for display on a client device (e.g., a
cellular phone, a handheld computer, etc.). Operation 610
may make use of one or more feedback parameters (not shown) of the
client device communicated to the server-side application and its
accompanying video encoder instantiation. Operation 612 (Store in
memory for encoder) then follows, storing the converted video frame
for use by a video encoder. Operation 614 (Flag video update
needed) then follows, setting an indication for use by an operation
determining if a video update is needed (see operation 712 (Video
update needed?) of FIG. 7). Operation 616 (Delay) then follows, and
may be implemented by a timer (not shown), before operation 602 is
repeatedly performed.
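The overall capture cycle of FIG. 6 can be rendered as the following loop. All of the callables are hypothetical hooks standing in for the operations described above; this is a sketch of the control flow, not the patent's implementation.

```python
import time

def capture_loop(render_ui, has_delta, convert, resize, store, set_flag,
                 sample_interval_s, iterations):
    """Capture the application UI (operation 602), skip further work
    when nothing changed (operation 604), otherwise reformat and resize
    the frame (operations 608, 610), store it for the encoder
    (operation 612), and flag that a video update is needed (operation
    614), delaying between captures (operations 606/616)."""
    prior = None
    for _ in range(iterations):
        frame = render_ui()                            # operation 602
        if prior is None or has_delta(prior, frame):   # operation 604
            frame_out = resize(convert(frame))         # operations 608, 610
            store(frame_out)                           # operation 612
            set_flag()                                 # operation 614
        prior = frame
        time.sleep(sample_interval_s)                  # operations 606/616
    return prior
```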
[0083] One skilled in the art will recognize that the illustration
of FIG. 6 is merely an example, and that the invention may be
practiced and implemented in many other ways.
[0084] FIG. 7 is a flowchart 700 illustrating some supporting
aspects of sending a succession of video frames, according to one
embodiment. Update User N, 702, represents the start of a sequence
of steps undertaken by the worker threads of FIG. 5 for each user
in the applicable worker thread's list of active users. A worker
thread initially performs operation 704 (Read back channel).
Operation 706
data to process for user?) then follows, where it is determined if
anything pertinent for User N came in from the network that needs
to be processed. Responsive to data pertinent to User N being
detected, operation 708 (Process network events) then follows.
Incoming data pertinent to User N may comprise, for example,
information regarding user input to the client device, such as
attempting to zoom in on a particular part of the server-side
application UI (as shown by a video frame of the server-side
application UI displayed on a viewing application running on the
client device). Operation 708 may include communicating such
processed information to its next destination, e.g., if a zoom
command had been sent from the client, the zoom command would be
appropriately processed and forwarded to the server-side
application before the worker thread proceeded to the next
applicable operation.
[0085] Either after operation 708 (Process network events) or a
negative determination of operation 706 (Any data to process for
user?), operation 710 (Update needed?) is then performed. Operation
710 may depend on a counter (not shown) being set when the last
video frame for the applicable user was sent, or, more
specifically, when operation 712 (Video update needed?) was last
performed. If the counter has not yet reached its endpoint, then
the worker thread performing the operations will then proceed to
operation 718 (Increment N) to commence the sequence of steps
illustrated in FIG. 7 for the next applicable user. The counter
controls the frame rate for the succession of video frames being
communicated from the server to the client, or, more specifically,
the allowable frame rate, as will be further described below in
relation to operation 712, (Video update needed?). For example, for
an allowable frame rate of ten times per second, the counter would
be set to count to 100 milliseconds (e.g., from 100 milliseconds
down to zero, or vice-versa).
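The counter arithmetic in the example above reduces to the following. The function name is illustrative; integer division is assumed, so frame rates that do not divide 1000 ms evenly are truncated.

```python
def frame_interval_ms(allowable_fps):
    """Counter endpoint controlling the allowable frame rate: an
    allowable rate of ten frames per second yields a 100 ms countdown."""
    return 1000 // allowable_fps
```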
[0086] If operation 710 (Update needed?) is determined in the
affirmative, then operation 712 (Video update needed?) is
performed. Operation 712 may comprise checking a `video update
needed` flag, as was described in relation to FIG. 6, or some such
similar operation. Operation 712 serves to determine whether
anything has changed in the portion of the server-side application
being displayed by the client in the client's viewing application.
If operation 712 is determined in the affirmative, operation 714
(Grab needed video info) is then performed. Operation 714 serves to
obtain any video frame information needed to update the video frame
of an application UI from the last transmitted video frame of the
UI, and may make use of a wide range of video frame/compression
techniques, including video frame/compression standards, customized
methods, and combinations thereof.
[0087] Once operation 714 (Grab needed video info) has been
performed, operation 716 (Send network event) is then performed.
Operation 716 serves to send to the client one video frame update,
or updated state information pertaining to the client's viewing
application and the server-side application/UI. Operation 716 may
post the applicable data for User N to the queue for User N's
network connection to effectuate this.
[0088] If operation 712 is determined in the negative, operation
716 (Send network event) may still be performed if there is
updated state information pertaining to the client's viewing
application and the server-side application/UI.
[0089] By transmitting only one video frame update (or state
information update) per user, the worker thread servicing multiple
users may cycle through and serve all of the users on its active
user list without any one user significantly consuming the worker
thread's time to the detriment of any other particular user. As the
worker thread's load across all of its supported users increases,
servicing times for all of the worker thread's active users will
gradually increase. Load balancing is thus inherent in this
approach. Additional load balancing techniques are described in
connection with FIG. 5.
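One cycle of a worker thread over its active-user list, per the FIG. 7 flow, can be sketched as follows. The per-user dictionary fields are hypothetical stand-ins for the back channel, the update counter, and the `video update needed` flag described above; this is an illustration of the one-event-per-user discipline, not the patent's implementation.

```python
def service_users(users):
    """Process at most one active event per user: send one video frame
    update, or else one state information update. Each user is a dict
    with illustrative fields for pending network events and flags."""
    sent = []
    for user in users:
        # Operations 704-708: read the back channel and process any
        # pending network events (e.g., user input) for this user.
        for event in user.get("network_events", []):
            user["app_input"] = event
        user["network_events"] = []
        # Operation 710: skip this user until the frame-rate counter
        # for the allowable frame rate has expired.
        if not user.get("update_counter_expired", False):
            continue
        # Operations 712-716: send one frame update if the UI changed,
        # otherwise send updated state information if any is pending.
        if user.get("video_update_needed", False):
            sent.append((user["name"], "video_frame"))
            user["video_update_needed"] = False
        elif user.get("state_dirty", False):
            sent.append((user["name"], "state_update"))
            user["state_dirty"] = False
    return sent
```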
[0090] One skilled in the art will recognize that the illustration
of FIG. 7 is merely an example, and that the invention may be
practiced and implemented in many other ways.
[0091] FIG. 8 is a diagram illustrating some aspects of
client-server exchange 800, according to one embodiment.
Client-server exchange 800 depicts a session exchange between
server 802 and client device 804. As described herein, server 802
may refer to any server-side machine, and may include a number of
servers, either located in one facility or geographically
dispersed, operating in conjunction to facilitate the operations
described in FIG. 8. These servers may include authentication
servers, database servers, etc.
[0092] Client device 804 initiates client-server exchange 800 with
operation 806, with a user launching a viewing application on
client device 804. (The term client device is used with respect to
FIG. 8 simply because the client device's parameters are discussed
throughout; the term client could be used interchangeably.) The
viewing application then facilitates opening
a connection to server 802 via a network connection via operation
808. Operation 808 makes use of one or more standard Internet
protocols, or variations thereof, as would be appreciated by one of
skill in the art. Operation 808 serves to pass the user's identity
(e.g., by telephone number, carrier account, etc.) and client
device's (804) display resolution and size to server 802, for use
by server 802 in the session. Server 802 then performs operation
810, which launches and provisions a server application instance,
with the server application customized based on the user's
preferences. In the present embodiment, the user's preferences are
fetched from a database (not illustrated) where they have been
associated with the user's identity. Per operation 810, the server
application renders in a virtual frame buffer. In another
embodiment, the server application may render to a screen.
[0093] Operation 812 then follows, where audio/video encoder
modules, and an accompanying command process module, are launched
and provisioned with client device's (804) display resolution and
size for customized encoding for client device 804. The command
process and encoder modules may also be provisioned with a level of
service associated with the user's identity, providing the
particular user with a priority-level with regard to other users
using the system. Subsequently, as depicted in operation 814, the
video encoder may convert and encode video frames of the server
application's UI (e.g., converting the video frames rendered by the
server application to QVGA resolution from the server application's
native rendering resolution because client device 804 supports QVGA
resolution). The video encoder module also resizes the server
application's native UI rendering to suitably fit within, or work
together with, client device's (804) screen size. The audio
encoder module encodes an audio stream output of the server
application based on the speaker capabilities of client device 804
(e.g., if client device 804 is known to be a cellular phone, the
audio may be encoded such that the encoded quality does not exceed
the particular phone's speaker capabilities, or a default level used
for cellular phones). Arrow 816 illustrates the communication of
the encoded audio and video to client device 804.
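The resizing step of operation 814 can be sketched as a simple fit computation. The aspect-preserving policy is an illustrative assumption; the patent says only that the native UI rendering is resized to suit the client device's screen size.

```python
def fit_to_screen(native_w, native_h, screen_w, screen_h):
    """Scale the server application's native UI rendering so it fits
    within the client screen, preserving the aspect ratio."""
    scale = min(screen_w / native_w, screen_h / native_h)
    return int(native_w * scale), int(native_h * scale)
```

For example, a 1024x768 native rendering targeted at a QVGA (320x240) client would be scaled to 320x240.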
[0094] Operation 818 subsequently follows, where the viewing
application's decoder modules (audio and video) decode the audio
and video frames received. The video frames may be displayed in the
viewing application on the client, and the audio may be output to
client device's (804) speakers (if audio is present and client 804
is not on mute), as depicted by operation 820.
[0095] Operation 822 subsequently follows, depicting an on-going
series of interactions (represented by arrows 826 and 828) between
server 802 and client device 804, also represented by operation 824
on the client side. Operation 822 depicts the server application
only sending information to the encoder modules when the UI or
audio output change, with the encoders encoding this information
for transmittal to client device 804. Thus, if nothing changes
regarding the server application's UI or audio, then audio/video
information is not encoded and sent to client device 804. A video
encoder of server (802) thus asynchronously communicates video
frames based on changes in the UI. Video is sent as video frames to
display UI changes, and not as commands outside of a video frame
format.
[0096] Operation 824 depicts the user interacting with the virtual
application, with the user's inputs being transformed into
parameters and being passed back to the server application.
Operation 822 further depicts the server-side command process
module translating user input parameters to operate on the server
application UI, with the server application UI accordingly
changing. Operation 824 completes the cyclical sequence by further
depicting the encoded audio and video resulting from user inputs to
the virtual application being received, decoded and displayed in
the viewing application.
[0097] One skilled in the art will recognize that the illustration
of FIG. 8 is merely an example, and that the invention may be
practiced and implemented in many other ways.
[0098] FIG. 9 is a diagram illustrating some aspects of
client-server exchange 900, according to one embodiment.
Client-server exchange 900 depicts a session exchange between
server 903 and client 904, with an accompanying exchange between
encoder/command process modules 902 and application 906 (both
running on server 903) also being illustrated. Application 906
includes state management module 907 and client 904 includes state
management module 905, which will be discussed in relation to FIG.
10. Application 906 comprises a web browsing application in this
embodiment. Encoder/command process modules 902 comprise audio and
video encoder modules and a command process module. References to
exchanges with encoder/command process modules 902 may only
specifically comprise an exchange with one of these modules, as
would be appreciated by one skilled in the art. In another
embodiment, a functional element similarly situated as
encoder/command process modules 902 may comprise a video encoder
module and a command process module, but not an audio encoder
module. As described herein, server 903 may refer to any
server-side machine, and may include a number of servers, either
located in one facility or geographically dispersed, operating in
conjunction to facilitate the operations described in FIG. 9. These
servers may include authentication servers, database servers,
etc.
[0099] Client 904 initiates client-server exchange 900 with
operation 908, open connection. Server 903 responds with operation
910, connection confirmed. Client 904 then sends its capabilities to
encoder/command process modules 902, including screen size and
other device parameters, via operation 912. The device parameters
may include a wide variety of device parameters, including a device
processor, memory, screen characteristics, etc. Client 904 then
sends a URL via operation 914, which may comprise a saved URL
(e.g., a homepage) or a URL entered by the user of client 904.
Encoder/command process modules 902 in turn communicate the URL to
application 906 via operation 916, and application 906 then loads
the URL via operation 918. Application 906 also passes the width
(w) and height (h) of the web page associated with the URL to
encoder/command process modules 902 via operation 920.
Encoder/command process modules 902 then communicate the web page
size to client 904, as well as the viewport visible on the client
screen, including parameters characterizing the viewport of the
client, e.g., a corner coordinate (x, y) and an associated zoom
factor (z), via operation 922. The parameters characterizing the
viewport of the client may comprise absolute position information,
relative position information, etc.
[0100] A screen capture of the webpage viewport (the portion of the
browser UI that the viewport has been associated with) then takes
place via operation 924, in accordance with a number of techniques
known in the art. A video frame of the web page visible through the
viewport is then communicated to client 904 via operation 926. A
subsequent screen capture 930 then takes place after a variable
sample interval 928, with the associated video frame being
communicated via operation 932. Arrow symbol 929, commonly used to
indicate a variable element, is illustrated crossing variable
sample interval 928 to indicate that the interval is variable.
[0101] An asynchronous feedback channel provides feedback via
operation 934. This feedback may be used to vary the sample
interval 928 based on one or more feedback parameters, including
client device parameters, user input parameters, and/or estimated
bandwidth parameters, such as bandwidth parameters based on
measurements of the packets traversing back and forth between
server 903 and client 904. The RTCP protocol, or a similar protocol
(standardized or customized), may be used in connection with
providing such feedback, as illustrated by operation 936.
Ellipsis 938 and cycle 940 illustrate the repetitive nature of the
interaction, with server 903 sending video frames to client
904.
[0102] Sample interval 928 may also be at least partially varied
based on the rate of change of the underlying webpage being viewed.
For example, if little to no change is detected in the underlying
webpage being viewed by client 904, then the frame sample interval
may be adjusted upward. Likewise, for a very dynamic webpage, or
content within a webpage, the frame sample interval may be adjusted
downward.
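One possible way such feedback and page-change measurements might drive sample interval 928 is sketched below; the thresholds, scaling factors, and bounds are assumptions chosen purely for illustration:

```python
def adapt_interval(current, bandwidth_kbps, change_ratio,
                   lo=0.05, hi=2.0):
    """Lengthen the sample interval on a congested link or for a
    static page; shorten it for dynamic content on a fast link.
    Thresholds and factors are illustrative, not tuned values."""
    interval = current
    if bandwidth_kbps < 100:       # scarce bandwidth: sample less often
        interval *= 1.5
    elif bandwidth_kbps > 1000:    # fast link: sample more often
        interval *= 0.75
    if change_ratio < 0.01:        # page essentially static
        interval *= 2.0
    elif change_ratio > 0.25:      # very dynamic page or content
        interval *= 0.5
    return max(lo, min(hi, interval))
```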
[0103] The user of client 904 may move the viewport from which a
webpage is being viewed, to view another portion of the webpage, as
depicted in operation 942, with x' and y' comprising new parameters
of the viewport. The new portion of the webpage that matches the
new viewport will then be captured via operation 944, and a video
frame of the new viewport will be communicated to client 904 via
operation 946.
[0104] The user of client 904 may again move the viewport, as
depicted in operation 948, with x'' and y'' comprising new
parameters of the viewport. This time, the new viewport extends
beyond what would be displayed on the server browser window, and
thus the browser itself must scroll to capture the desired portion
of the webpage, as depicted in operation 950. Having appropriately
scrolled, as depicted via operation 952, a screen capture of the
new viewport will then be obtained, as illustrated in operation
954, with the resulting video frame communicated via operation
956.
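The decision of operations 948-952, i.e., whether the server browser must scroll before a capture, may be sketched as follows; the function names are illustrative, with viewport and page dimensions assumed to be in page coordinates:

```python
def needs_scroll(vx, vy, vw, vh, rendered_w, rendered_h):
    """True when the requested viewport extends beyond the rendered
    region, so the browser must scroll (operations 950-952)."""
    return vx + vw > rendered_w or vy + vh > rendered_h

def scroll_offset(vx, vy, vw, vh, rendered_w, rendered_h):
    """Horizontal/vertical scroll needed so the viewport fits the
    rendered region (0 when no scrolling is required)."""
    dx = max(0, vx + vw - rendered_w)
    dy = max(0, vy + vh - rendered_h)
    return dx, dy
```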
[0105] The user of client 904 may also use a mouse or
mouse-equivalent (e.g., finger tap/motion on a touchscreen,
multi-directional button, trackpoint, stylus moving a cursor, etc.),
as shown via operation 958, where a mouse down motion is made, with
the new coordinates of the mouse being passed as (a, b). Client 904
will pass coordinates relative to the client device's screen back
to encoder/command process modules 902 in such an operation, with
encoder/command process modules 902 determining the corresponding
location in relation to the viewport and underlying webpage. In the
embodiment being described in FIG. 9, server 903 is running an
underlying Windows OS, permitting the injection of a mouse message
with the appropriate location information to the window associated
with browser 906 (whether there is an actual screen being used for
rendering or not). This is illustrated via operation 960, and the
screen cursor would resultantly move in application 906, and be
communicated back in a video frame to client 904 as described
above. In other embodiments being used in conjunction with other
operating systems, similar such functions may be used if available,
or some analogous other such techniques, as would be appreciated by
one skilled in the art.
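The coordinate translation performed by encoder/command process modules 902 for operation 958 may be sketched as below; a linear mapping through the viewport corner (x, y) and zoom factor z is assumed for illustration:

```python
def client_to_page(a, b, viewport_x, viewport_y, zoom):
    """Map client screen coordinates (a, b) to a location on the
    underlying webpage, given the viewport corner and zoom factor.
    A uniform zoom about the viewport corner is assumed."""
    return (viewport_x + a / zoom, viewport_y + b / zoom)

pos = client_to_page(50, 20, 100, 200, 2.0)
```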
[0106] Operations 962, 964, 966 and 968 depict similar mouse-driven
events, which will work in an analogous manner. The term
"mouse-driven event" is used broadly herein to include input control
events triggered by a wide variety of mouse or mouse-equivalent
control inputs on a variety of devices (e.g., finger tap/motion on
a touchscreen, multi-directional button, trackpoint, stylus moving
a cursor, etc.). Other input driven control events (such as a
keypad entry) may work in the same manner as well. The types of
operations depicted in 970, 972, 974, 976 and 978 have been
described above, and ellipsis 980 and cycle 982 serve to illustrate
on-going interactions as long as the session between client 904 and
server 903 continues.
[0107] One skilled in the art will recognize that the illustration
of FIG. 9 is merely an example, and that the invention may be
practiced and implemented in many other ways.
[0108] FIG. 10 is a diagram 1000 illustrating some aspects of
viewport move operations and related state management, according to
one embodiment. FIG. 9 is referenced throughout the description of
FIG. 10, as the two figures are related. Diagram 1000 includes
webpage 1002, with a width of w' 1004, and a height of h' 1006.
Webpage 1002 is illustrated partially rendered in FIG. 10, with
rendered webpage portion 1008 having a width of w 1010, and a
height of h 1012. Webpage portions 1014, 1020, and 1026 are
illustrated, each having left corner coordinates of (x, y) 1016,
(x', y') 1022, and (x'', y'') 1028, respectively, and associated
zoom factors 1018, 1024, and 1030, respectively. Webpage portion
1026 includes cursor position (a, b) 1032. Webpage portions 1014,
1020, and 1026 relate to operations 922, 942 and 948, respectively,
of FIG. 9.
[0109] Webpage portion 1014 corresponds to a portion of webpage
1002 sent for remote viewing, which comprises the viewport of
client 904 (indicated by (x, y) 1016 and zoom factor 1018). Following or
performed concurrently with either operations 944 or 946 of FIG. 9,
state manager module 907 of server 903, having previously
identified webpage portion 1014 as the current state, updates its
current state webpage to webpage portion 1020. State manager module
905 of client 904 does likewise upon client 904 displaying webpage
portion 1020. State manager modules 907 and 905 then identify
webpage portion 1014 as the prior state. Client 904 may request
prior state webpage portion 1014 from server 903, such as, for
example, via a back icon (not shown). As an intermediate step,
client 904 may display a locally cached version of the prior state
webpage portion while server 903 is in the process of
obtaining/sending the current version of the prior state webpage
portion.
[0110] Webpage portion 1026 likewise becomes the next current
viewport of client 904 per operations 948-956, and state manager
modules 907 and 905 likewise update the current state webpage to
webpage portion 1026, with the addition of internal application
scrolling operations 950 and 952 due to part of 1026 not being on
rendered webpage 1008. In another embodiment, the entire applicable
webpage is rendered (analogous to h' 1006 by w' 1004 of FIG. 10),
and thus there are no internal scrolling operations to perform.
[0111] Similar to the identification of webpage portion states,
cursor position sub-states are maintained relative to viewport
views and the viewport view's corresponding webpage portion. For
example, while webpage portion 1026 comprises the current state,
webpage portion 1026 includes cursor position (a, b) 1032. As
described in relation to operations 958-964 in FIG. 9, as cursor
position (a, b) is updated, so too is its corresponding sub-state
maintained by state manager module 907.
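The current/prior state bookkeeping of [0109]-[0111], including the cursor sub-state, may be sketched as follows; the class shape and field names are illustrative assumptions rather than the embodiment's actual modules:

```python
class StateManager:
    """Illustrative sketch of the state bookkeeping of modules
    907/905: updating the current state demotes the previous one."""
    def __init__(self):
        self.current = None
        self.prior = None

    def update(self, corner, zoom, cursor=None):
        self.prior = self.current
        self.current = {"corner": corner, "zoom": zoom, "cursor": cursor}

    def set_cursor(self, pos):
        # Cursor sub-state, tracked relative to the current viewport.
        self.current["cursor"] = pos

sm = StateManager()
sm.update((0, 0), 1.0)       # viewport state for webpage portion 1014
sm.update((120, 80), 1.5)    # portion 1020; 1014 becomes the prior state
sm.set_cursor((40, 25))      # cursor position (a, b) sub-state
```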
[0112] One skilled in the art will recognize that the illustration
of FIG. 10 is merely an example, and that the invention may be
practiced and implemented in many other ways.
[0113] FIG. 11 is a diagram illustrating some aspects of a
client-server exchange 1100 with respect to state management,
according to one embodiment. Client-server exchange 1100 depicts a
session exchange between client 1106 and server 1102. As described
herein, server 1102 may refer to any server-side machine, and may
include a number of servers, either located in one facility or
geographically dispersed, operating in conjunction to facilitate
the operations described in FIG. 11. These servers may include
authentication servers, database servers, etc.
[0114] Server 1102 includes state manager module 1104 to manage
state information, and client 1106 includes state manager module
1108 to manage state information. State manager module 1104 and
state manager module 1108 operate as described herein; their common
name describes their general function and is not meant to imply
they are instances of the same module. In another embodiment, these
two modules may be instances of the same module. In yet another
embodiment, state manager modules 1104 and 1108 may operate as one
logical software unit.
[0115] Operation 1110 illustrates state manager module 1104
identifying a portion of a webpage sent for remote viewing, with
the portion comprising a viewport view of client 1106. Operation
1114 illustrates state manager module 1104 identifying this webpage
portion as the current state webpage portion. Operation 1112
illustrates state manager module 1108 identifying a portion of a
webpage being displayed, with operation 1116 illustrating state
manager module 1108 identifying this webpage portion as the current
state webpage portion. The webpage portions are defined areas
within a webpage, and thus in another embodiment, a current state
checker module (not shown) may be used to periodically verify the
current states are uniform across server 1102 and client 1106. In
yet another embodiment, a common table (not shown) may be shared
among state manager modules 1104 and 1108, where states are defined
in terms of location identifiers of a specific portion of a
specific webpage.
[0116] Operation 1118 illustrates a user of client 1106 selecting a
second portion of the webpage. The user can make such a selection
in a wide variety of ways, including moving a navigation toggle or
a scroll wheel, using a touch screen or keypad, etc. In the
embodiment described in relation to FIG. 11, the webpage portions
further comprise an area of the webpage surrounding the viewport of
client 1106. Thus, in the instance of a selection by the user of a
limited scroll-down for operation 1118, the second portion of the
webpage may already reside on client 1106. Operation 1120
illustrates checking if the second portion is already in local
memory. In another embodiment, the webpage portions match the
viewport of client 1106, and a local memory check, like operation
1120, may check if the desired webpage portion had been previously
loaded and was still resident in local memory. Determining whether
an object of interest is in local memory can be performed in a
number of ways, as would be appreciated by one of skill in the
art.
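The local memory check of operation 1120 may be sketched as a simple keyed cache; the cache key and the fetch callable are assumptions made for illustration:

```python
# Illustrative client-side cache of webpage portions, keyed by
# (url, corner, zoom); the key shape is an assumption of this sketch.
local_cache = {}

def get_portion(url, corner, zoom, request_from_server):
    """Operation 1120: consult local memory first; otherwise request
    the portion from the server (operation 1122) and cache it."""
    key = (url, corner, zoom)
    if key in local_cache:
        return local_cache[key]
    portion = local_cache[key] = request_from_server(url, corner, zoom)
    return portion

# Demo: the second identical request is served from local memory.
calls = []
fetch = lambda u, c, z: calls.append(u) or f"frame@{c}"
p1 = get_portion("http://example.com", (0, 50), 1.0, fetch)
p2 = get_portion("http://example.com", (0, 50), 1.0, fetch)
```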
[0117] In the example illustrated, operation 1120 was determined in
the negative, and parameters for the second portion of the webpage
are thus communicated via operation 1122. FIGS. 9 and 10 provide
more detail regarding passing parameters. Similarly to operations
1110 and 1112, a second portion of a webpage is identified by
server 1102 and sent to client 1106, as illustrated by operations
1124 and 1126. State manager modules 1104 and 1108 accordingly
identify the second portion as the current state portion via
operations 1128 and 1130. State manager modules 1104 and 1108 then
identify the former current state webpage portion as the prior state
in operations 1132 and 1134.
[0118] In the embodiment illustrated in FIG. 11, operation 1120 may
also be determined in the positive (not shown). In the case of the
second webpage portion residing completely in the local memory of
client 1106, the second webpage portion may not be requested by,
and provided to, client 1106. Client 1106 may simply display the
second webpage portion and relay parameters regarding the second
webpage portion in order for state manager module 1104 of server
1102 to update its state information. Displaying the second webpage
portion may also comprise an intermediate step before obtaining a
more current version of the second webpage portion from server
1102.
[0119] The operations discussed herein apply not only to present
and prior states, but also to a plurality of webpage portion states
of a plurality of websites. This plurality of states may comprise
an ordered succession of states, as may be recorded in a browsing
session using an embodiment of the invention. This ordered
succession of states may be made available by toggling through the
website portion views of a user session, saved as a series of
bookmarks, etc.
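An ordered succession of webpage portion states, with toggling back and forward through the recorded views, may be sketched as follows (an illustrative structure, not the embodiment's actual record):

```python
class StateHistory:
    """Illustrative ordered succession of webpage portion states
    recorded during a browsing session, with back/forward toggling."""
    def __init__(self):
        self.states, self.pos = [], -1

    def record(self, state):
        self.states.append(state)
        self.pos = len(self.states) - 1

    def back(self):
        self.pos = max(0, self.pos - 1)
        return self.states[self.pos]

    def forward(self):
        self.pos = min(len(self.states) - 1, self.pos + 1)
        return self.states[self.pos]

history = StateHistory()
for portion in ("portion-1014", "portion-1020", "portion-1026"):
    history.record(portion)
```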
[0120] Though not illustrated, a feature of managing state
information as described herein includes enabling the sharing of a
webpage portion with another user, in accordance with one
embodiment. A webpage portion may be included in an identifier that
can be placed in a message for sending, with the identifier adapted
to facilitate transport to the applicable webpage portion. The
message could be an email message, an instant-messaging message, a
messaging feature used in a social network, etc. The identifier may
route the recipient of the message into their system account (an
embodiment client-server system account), where the webpage portion
included in the identifier can be accessed like a prior webpage
portion state of the user. A database may be used in conjunction
with such identifiers. As will be appreciated by those skilled in
the art, other ways may be used to augment a webpage portion
identifier so that it may be transferred among different users,
both within and outside of embodiment client-server system
accounts.
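One possible encoding of such a shareable webpage portion identifier is sketched below, serializing the URL and viewport parameters into a message-safe token; the encoding scheme itself is an assumption for illustration:

```python
import base64, json

def make_identifier(url, corner, zoom):
    """Serialize a webpage portion (URL plus viewport parameters)
    into a token that can be placed in an email or instant message."""
    payload = {"url": url, "corner": corner, "zoom": zoom}
    return base64.urlsafe_b64encode(json.dumps(payload).encode()).decode()

def resolve_identifier(token):
    """Recover the webpage portion parameters from a received token."""
    return json.loads(base64.urlsafe_b64decode(token.encode()))

token = make_identifier("http://example.com", [100, 250], 1.5)
```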
[0121] One skilled in the art will recognize that the illustration
of FIG. 11 is merely an example, and that the invention may be
practiced and implemented in many other ways.
[0122] FIG. 12 is a diagram illustrating some aspects of a
client-server exchange, including an accompanying exchange between
a server and network storage, according to one embodiment.
Client-server exchange 1200 depicts a session exchange between
client 1202 and server 1204, with an accompanying exchange between
server 1204 and network storage 1206. As described herein, server
1204 may refer to any server-side machine, and may include a number
of servers, either located in one facility or geographically
dispersed, operating in conjunction to facilitate the operations
described in FIG. 12. Network storage 1206 may refer to any one or
more storage units, operating in combination or any other way, that
are networked to server 1204. Server 1204 includes provision
manager module 1205 and user manager module 502 (of FIG. 5).
[0123] Operations 1208, 1210 and 1212 serve to initiate a session.
Operation 1214 launches a unique browser instance for client 1202.
Provision manager module 1205 uses the user identifier passed in
operation 1210 (e.g., a telephone number, account number, etc.), or
another identifier associated with the user identifier, to fetch
browser state information for the user from network storage 1206
via operations 1216 and 1218. Provision manager module 1205
likewise fetches a customer profile for the user from network
storage 1206 via operations 1216 and 1218. The customer profile and
the browser state information may reside in different, unrelated
portions of network storage 1206. Browser state information for the
user may include any type of user information associated with a
browser, including bookmarks, cookies, caches, etc. Via operation
1220, provision manager module 1205 provisions the unique browser
instance launched for the user as would be appreciated by one
skilled in the art (e.g., by using automated directory copies,
automated provisioning techniques, etc.).
[0124] Via operation 1222, provision manager module 1205 works in
conjunction with user manager module 502 (described in relation to
FIG. 5) to provision resources based on the user's customer
profile. The user's customer profile could include the user's past
usage history (e.g., categorized as bandwidth low, medium, or
heavy), encoding usage (also possibly categorized), etc. Provision
manager module 1205 and user manager module 502 together operate to
provide different levels of service to a user, based on a level of
service provider plan, etc. A wide range of parameters may be
configured, depending on the particular embodiment, including
peak/average bandwidth Quality of Service (QoS), video compression
settings for motion video (i.e., quality), video frame rate, video
image size limits and refitting, server memory, CPU and/or disk
usage. The parameters configured could be set statically per
user/profile, or the parameters could dynamically change, based on
a number of factors, including the type of URL being used. In this
way, the service provider could provide higher quality video for
low-bandwidth pages (e.g., news websites, etc.) and place more
restrictions on bandwidth-intensive websites. As described in
relation to FIG. 5, provisioning a user among worker threads may
also be used to implement tiered levels of service.
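The profile-driven provisioning of operation 1222 may be sketched as a mapping from usage tier to configured parameters, with a dynamic adjustment by URL type; the tier names and values here are illustrative assumptions, not actual service-plan settings:

```python
# Illustrative tier table: peak bandwidth, video quality, frame rate.
TIERS = {
    "light":  {"peak_kbps": 256,  "quality": 40, "fps": 5},
    "medium": {"peak_kbps": 512,  "quality": 60, "fps": 10},
    "heavy":  {"peak_kbps": 1024, "quality": 80, "fps": 15},
}

def provision(profile, url=None, low_bandwidth_sites=("news",)):
    """Map a customer profile to configured parameters; optionally
    raise quality for low-bandwidth pages (e.g., news websites)."""
    params = dict(TIERS[profile["usage_tier"]])
    if url and any(s in url for s in low_bandwidth_sites):
        params["quality"] = min(100, params["quality"] + 20)
    return params
```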
[0125] Arrows 1224 and 1226 depict a web browsing session. Via
operation 1228, the browser state information will be updated based
on the user's use during the session (e.g., the addition of cookies
based on websites visited, modifications to bookmarks during the
sessions, etc.). Via operation 1230, at least some of the user's
customer profile will also be tracked for updating the user's
customer profile on terminating the session.
[0126] During a session, user manager module 502 may perform a
state copy of a user session to transfer a user to another thread,
either on the same server or a different server. User manager
module 502 may operate in conjunction with provision manager module
1205 to facilitate such a move. While moving a user from one thread
to another is discussed in relation to FIG. 5, provision manager
module 1205 may also facilitate such a state copy by updating state
information during a user session. User manager module 502 may then
pause a browsing session, while provision manager module 1205
provisions another browser instance with the updated state
information at the desired destination. User manager module 502 may
then move the user session to the desired destination by performing
a state copy (which may include a directory copy, etc.). The
applicable user web browsing session may then resume. The timing of
such transfers may take place at specific junctures in the display
of video frames by a client device's viewing application, in order
to minimize disruption of a user's experience, such as when a user
makes a request of a URL.
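The pause/provision/copy/resume sequence described above may be sketched as follows; the classes are illustrative stand-ins for the user manager and provision manager modules, not their actual implementation:

```python
class BrowserInstance:
    """Illustrative stand-in for a provisioned server browser."""
    def __init__(self):
        self.state, self.running = {}, False

def transfer_session(state, destination):
    """Copy a paused session's state into a freshly provisioned
    destination instance and resume browsing there."""
    destination.state = dict(state)    # state copy (e.g., directory copy)
    destination.running = True         # session resumes at destination
    return destination

source = BrowserInstance()
source.state = {"url": "http://example.com", "cookies": {"sid": "abc"}}
source.running = False                 # user manager pauses the session
dest = transfer_session(source.state, BrowserInstance())
```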
[0127] When the user desires to end the session, the user may close
their virtual browser, as shown in operation 1232, and accompanying
operation 1234. The user's browser end state will also be captured
by provision manager module 1205, as shown via operation 1236, via
a directory save, etc. The user's customer profile will be updated
based on the usage information captured, as shown in operation
1238. For example, the user may visit mostly text-based websites
during a session, and have their average bandwidth and encoding
usage averages lowered by such a resource-light session. Provision
manager module 1205 will update the user's state information and
customer profile based on the session, saving the updated versions
in network storage 1206, as server 1204 closes the unique server
browser instantiated at the beginning of the session.
[0128] One skilled in the art will recognize that the illustration
of FIG. 12 is merely an example, and that the invention may be
practiced and implemented in many other ways.
[0129] FIG. 13 is a diagram illustrating some aspects of displaying
a web browsing session from a server to a cellular phone, according
to one embodiment. Server 1302 is illustrated, upon which a web
browsing engine (not shown) executes, rendering webpage 1304.
Webpage 1304 is rendered into a buffer on server 1302 in the
embodiment discussed in relation to FIG. 13, but webpage 1304 may
also be rendered to a screen in another embodiment. A recognition
engine (not shown) also runs on server 1302, which identifies
elements of interest on webpage 1304, such as hyperlink 1306.
Recognition engine may use any number of techniques known to one
skilled in the art to identify elements of interest with respect
to webpage 1304, including parsing through the HTML/XHTML/Document
Object Model data to obtain access to the rendered webpage
data/contents of webpage 1304 (including location information of
the contents of webpage 1304). In some embodiments, other
techniques for parsing webpage contents may be used, including a
tabbing function to cycle through components of a webpage. In some
embodiments, optical character recognition techniques known to
those skilled in the art may additionally or alternatively be
used.
[0130] In some embodiments, the information regarding the elements
of interest may be communicated along with webpage 1304, such as in
the form of a polygon map. A viewing application (not shown)
running on cellular phone 1310 may then match such a polygon map
with the video of webpage 1304 being displayed. In various
embodiments, the information regarding the elements of interest may
be communicated from server 1302 to cellular phone 1310 in
combination with the video frames in a wide variety of ways, as
would be appreciated by one of skill in the art regarding sending
two data sets to be matched at the destination.
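Matching such a polygon map against user input may be sketched as a hit test; the use of axis-aligned rectangles and the map's field names are assumptions made for illustration:

```python
# Illustrative polygon map of elements of interest; rectangles are
# (x0, y0, x1, y1) in webpage coordinates.
POLYGON_MAP = [
    {"id": "link1", "type": "hyperlink", "rect": (10, 40, 120, 56)},
    {"id": "box1",  "type": "textbox",   "rect": (10, 80, 200, 100)},
]

def hit_test(x, y, polygon_map=POLYGON_MAP):
    """Return the element of interest under (x, y), or None."""
    for el in polygon_map:
        x0, y0, x1, y1 = el["rect"]
        if x0 <= x <= x1 and y0 <= y <= y1:
            return el
    return None
```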
[0131] In one embodiment, the information regarding the elements of
interest may be communicated in response to viewing application
passing user input back to the recognition engine on server 1302.
For example, a user may select a location on the image being
displayed on cellular phone 1310 corresponding to a textbox (not
shown) on webpage 1304, for the user to input text into the text
box. As described elsewhere in relation to other Figures, the
location of the user selection may be passed back to server 1302.
Recognition engine on server 1302 may then communicate to the
viewing application that the location selected by the user
corresponds to a textbox. In one embodiment, the viewing
application may then locally render a generic textbox for the input
of text by the user, sending the text entered by the user back to
the recognition engine once the user has completed the entry.
[0132] In another embodiment, the recognition engine may send
information regarding the textbox to the viewing application for
the viewing application to customize the locally-rendered textbox,
including, for example, information regarding acceptable string
length, acceptable character entries, string-formatting, etc. This
information may be used by the viewing application to locally
police acceptable input for a given textbox, as well as for
formatting the display of input (e.g., causing password textbox
entries to appear as "*******" instead of cleartext). In some
instances, a textbox, radio button, etc., will already contain text
(e.g., have an initial state with initial text), be selected, etc.,
and this information will be sent to the client to appear in the
locally rendered textbox, radio button, etc. A state manager on the
backend server will also track the state of interactive objects on
a webpage, with the state manager serving to synchronize client
user input with the web browsing engine on the backend server. The
state manager may also serve to react to events on the server that
occur while a client control is activated (such as a control losing
focus suddenly), and may alert a user in some embodiments if a
synchronization was unsuccessful, including, for example, providing
a reentry prompt.
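The local policing and formatting of textbox input described above may be sketched as follows; the constraint field names are assumptions of this sketch, not the recognition engine's actual schema:

```python
def validate_entry(text, constraints):
    """Locally police acceptable input for a textbox, using
    constraints (assumed field names) sent from the server."""
    if len(text) > constraints.get("max_length", float("inf")):
        return False
    allowed = constraints.get("allowed_chars")
    if allowed is not None and not all(c in allowed for c in text):
        return False
    return True

def display_text(text, constraints):
    """Format the display of input; password textbox entries appear
    as "*" characters instead of cleartext."""
    return "*" * len(text) if constraints.get("password") else text
```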
[0133] This also applies to state information, e.g., the current
selection(s) in a list box or the current state of a radio button.
[0134] Cellular phone 1310 is illustrated in FIG. 13 as the
destination of webpage 1304 (labeled webpage portion 1308 with
respect to a portion being displayed on cellular phone 1310), after
the webpage has been transformed into video frames as discussed in
regard to other Figures herein. The viewing application executing
on cellular phone 1310 displays webpage portion 1308 along with
address bar 1312 (displaying URL 1340), toolbar 1318, and an
overlay graphical component, cursor 1314, with address bar 1312,
toolbar 1318 and cursor 1314 being rendered locally on cellular
phone 1310. Address bar 1312 and toolbar 1318 may appear in a
number of locations together with a portion of webpage 1304,
according to a variety of embodiments. In some embodiments, address
bar 1312 and toolbar 1318 may be user-configurable to appear in
other display locations, while in some embodiments, address bar
1312 and toolbar 1318 may automatically adjust their locations
based on positioning of cellular phone 1310 for cellular phones
with motion detection ability.
[0135] Hyperlink 1316 (analogous to hyperlink 1306 rendered on
server 1302) is displayed in the video image displayed on cellular
phone 1310. In some of the embodiments in which information
regarding elements of interest is communicated along with webpage
1304, the viewing application may be adapted to transform cursor
icon 1314 into hand icon 1315 (or another icon to denote an
underlying hyperlink) while cursor icon 1314 overlays hyperlink
1316. Address bar 1312, comprising a roll out address bar in the
embodiment illustrated in FIG. 13, may include any number of
functional user interface elements. Address bar 1312 rolls out when
"A" icon 1320 is tapped, and rolls back into main toolbar 1318 when
"A" icon 1320 is tapped when address bar 1312 is already rolled
out, or upon inactivity for a defined time period. Address bar 1312
displays the current URL and may also wrap around to another line
when a user is entering a long URL. Home icon 1334 allows for a
user to set a home webpage by a number of means, either through
cellular phone 1310 or through an import function through a
specialized portal, as would be appreciated by one of skill in the
art. Favorites icon 1336 allows for a user to save URLs for easy
reference, and may also be set up through cellular phone 1310 or
through an import function through a specialized portal, as would
be appreciated by one of skill in the art. "V" icon 1338 allows for
a voice/speech URL entry. In some embodiments, viewing application
may interface with speech recognition capabilities of cellular
phone 1310 for such voice/speech URL entries.
[0136] Back and Forward icons 1322 allow for a user to scroll
through webpages for previously entered URLs. In some embodiments,
Back and Forward icons 1322 may allow for a user to scroll through
webpage portion views. In some embodiments, such icons may be
configurable to either scroll through prior webpages, or through
prior webpage portion views, based on user selection of the two
options. "X" stop icon 1326 allows for stopping the loading of a
portion of a webpage, and thus appears while a portion of a webpage
is being loaded. In some embodiments, "X" stop icon 1326 may
display during the loading of, and permit cessation of the loading
of, an entire webpage. When a portion of a webpage, or a webpage,
has loaded, "X" stop icon 1326 is replaced by "R" refresh icon
1324, to allow for a reloading of the portion of the webpage
currently being viewed, or the reloading of an entire webpage
depending on the embodiment.
[0137] Zoom in/out icons 1328 allow for zoom functionality and
control. Zoom in/out icons 1328 may provide multiple functions
based on the duration of the icon selection. For
example, for zooming out, each tap may incrementally zoom out,
while a long press may zoom out to a full page view. Scroll icons
1330 and 1332 allow for scrolling in the client window.
[0138] Depending on the embodiment and functionality of cellular
phone 1310, the icons discussed herein may be touch-based for
cellular phones with touch screen functionality, may include a
menu-based user interface (softkeys), and/or may be selectable via
scroll/selection buttons or any other input features of cellular
phones. In some embodiments, a specialized touch pattern, e.g., a
drag, etc., may be equated to a right-button mouse click, with such
a parameter being passed to the server 1302 for right-button mouse
options to appear on webpage 1304 (and webpage portion 1308). In
some embodiments, right-button mouse options may be locally
rendered by the viewing application.
[0139] In some embodiments, web browsing engine on server 1302 may
make use of small screen rendering techniques to render all, or
part of, webpage 1304 to fit within the width of the screen of
cellular phone 1310. In some embodiments, the viewing application
on cellular phone 1310 may allow for user selection of a small
screen rendering view, allowing the user to switch between multiple
view formats known in the art. Intra-web page views displayed on
cellular phone 1310 may be tracked in some embodiments, allowing
the center of a view to be maintained across multiple view
formats.
[0140] One skilled in the art will recognize that the illustration
of FIG. 13 is merely an example, and that the invention may be
practiced and implemented in many other ways.
[0141] FIG. 14 is a diagram illustrating some aspects of a user
interface display, according to one embodiment. Webpage portion
1402 illustrates a collection of textboxes of checkout section
1401, specifically shipping information section 1403. Webpage
portion 1402 appears as if displayed on a mobile device (not
shown), such as a cellular phone, with toolbar 1439 appearing on
the bottom of webpage portion 1402. In other embodiments, toolbar
1439 may appear elsewhere or may roll out of view, either by
default or by user configuration. Shipping information section 1403
includes checkbox 1415 for selecting the "use billing address"
option (illustrated as unchecked), textbox 1416 for first name
information 1404, textbox 1418 for last name information 1406,
textbox 1420 for address information 1408, textbox 1422 for city
information 1410, dropdown menu 1428 for state information 1412,
textbox 1424 for zip-code information 1425, textbox 1426 for email
information 1414, and radio buttons 1434 and 1438, indicating "yes"
1432 or "no" 1436 responses to question 1430.
[0142] In the embodiment illustrated in FIG. 14, information
regarding the interactive elements was communicated along with
webpage portion 1402 to the mobile device displaying webpage
portion 1402. In another embodiment, information regarding an
interactive element may be communicated to the viewing application
once the user has selected a location on the webpage being
displayed corresponding to the interactive element. Webpage portion
1440 will be used to illustrate further operations with respect to
webpage portion 1402. In the case of a user bypassing the
address-related textboxes shown in FIG. 14 by selecting checkbox
1415 (e.g., by tapping the location of the screen where checkbox
1415 was being displayed), a check appearing in checkbox 1415 (not
shown) would be rendered on the backend server, in accordance with
one embodiment. In another embodiment, the viewing application may
locally render checkbox 1415 and the check, or just the check,
depending on the embodiment.
[0143] When a user selects textbox 1416 to enter their first name,
the viewing application displaying webpage portion 1402 locally
renders textbox outline 1442 and rolls out overlay textbox 1444
(the rolling feature indicated by arrow 1446 for illustration
purposes). As the user inputs their name, the letters entered by
the user display in overlay textbox 1444. Once the user has
completed their entry, for example by pressing an enter key on the
mobile device or selecting another textbox on webpage portion 1440,
the viewing application will communicate the information to a state
manager on the backend server, with the state manager serving to
synchronize client user input with the web browsing engine on the
backend server. In another embodiment, an overlay textbox for
character entry may be locally rendered directly where textbox 1416
is shown. In another embodiment, toolbar 1448 may transform to an
overlay textbox for character entry.
[0144] In the embodiment illustrated in FIG. 14, the viewing
application received information regarding dropdown menu 1428 from
the recognition engine residing on the backend server, including
the available menu options (e.g., items listed, the initial
selection, or state, of the drop-down menu), together with the
information to display webpage 1402. Viewing application then
locally rendered dropdown menu 1428, and will locally render an
expanded menu upon the user selecting dropdown menu 1428. Once the
user makes a selection of the available menu options, viewing
application will communicate the information to a state manager on
the backend server, with the state manager serving to synchronize
client user input with the web browsing engine on the backend
server. In other embodiments, other implementations may be used to
locally render the selections available from a dropdown menu.
[0145] In the embodiment illustrated in FIG. 14, the viewing
application received information regarding radio buttons 1434 and
1438 from the recognition engine residing on the backend server,
together with the information to display webpage 1402. The viewing
application then locally rendered radio buttons 1434 and 1438, and
locally rendered radio button 1434 as selected upon the user tapping
radio button 1434 (FIG. 14 illustrates an embodiment where the
mobile device includes a touch screen). Once selection of a radio
button is made, the viewing application will communicate the
information to a state manager on the backend server, with the
state manager serving to synchronize client user input with the web
browsing engine on the backend server. In other embodiments, other
implementations may remotely render selection of a radio
button.
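The division of rendering between client and backend described for the dropdown menu and radio buttons can be sketched with a hypothetical element descriptor; the field names and the dispatch rule below are assumptions for illustration, not the recognition engine's actual interface.

```python
from dataclasses import dataclass, field
from typing import List, Optional


@dataclass
class ElementInfo:
    """Descriptor for one interactive element, as the recognition engine
    on the backend server might communicate it to the viewing
    application (field names are hypothetical)."""
    kind: str                    # e.g. "dropdown", "radio", "checkbox"
    element_id: str
    options: List[str] = field(default_factory=list)  # dropdown items
    state: Optional[str] = None  # initial selection or state, if any


def render_locally(info: ElementInfo) -> bool:
    """Return True if the viewing application draws this element itself;
    False means the element is rendered on the backend server instead
    (the split chosen here mirrors the embodiment in the text)."""
    return info.kind in ("dropdown", "radio")


# Usage: the descriptor arrives with the webpage display information,
# so the expanded menu can later be drawn without a server round trip.
menu = ElementInfo("dropdown", "menu-1428",
                   options=["CA", "NY", "TX"], state="CA")
local = render_locally(menu)  # viewing application draws the menu itself
```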
[0146] One skilled in the art will recognize that the illustration
of FIG. 14 is merely an example, and that the invention may be
practiced and implemented in many other ways. In various
embodiments, one or more of the above interactive elements of a
webpage (for example, a textbox, a radio button, a dropdown menu,
etc.) may be rendered completely on the backend server.
Additionally, in some embodiments, a state manager module within
the viewing application may facilitate collecting user input and
synchronizing such user input with user input provided to the
backend web browsing engine.
[0147] FIG. 15 is a diagram illustrating some aspects of a user
interface display, according to one embodiment. Webpage portion
1502 appears as if displayed on a mobile device (not shown), such
as a cellular phone, with toolbar 1504 appearing on the side of
webpage portion 1502. In other embodiments, toolbar 1504 may appear
elsewhere or may roll out of view, either by default or by user
configuration. Webpage portion 1502 illustrates news section 1506,
including headline 1508, text section 1510 and webpage video
1512.
[0148] In the embodiment illustrated in FIG. 15, information
regarding the interactive element of webpage video 1512 was
communicated along with webpage portion 1502 to the mobile device
displaying webpage portion 1502. In another embodiment, information
regarding webpage video 1512 may be communicated to the viewing
application once the user has selected a location on the image
displayed corresponding to webpage video 1512. Full screen video
view 1518 will be used to illustrate an operation with respect to
webpage portion 1502. When the user moves cursor icon 1514 over
webpage video 1512, the viewing application locally rendering
cursor icon 1514 will transform cursor icon 1514 to hand icon 1516
based on the information communicated from the backend server to
the viewing application. In another embodiment, the viewing
application may not make such a transformation. In other
embodiments, a play icon may be displayed on webpage video 1512,
either placed there by the viewing application or as displayed on
the underlying webpage.
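The cursor transformation above amounts to a hit test against element bounds communicated from the backend server; the (left, top, width, height) tuple format below is an assumption for illustration.

```python
def cursor_for_position(x, y, video_bounds):
    """Choose the cursor icon the viewing application renders locally.
    video_bounds is a hypothetical (left, top, width, height) tuple the
    backend server communicated for an interactive element such as
    webpage video 1512."""
    left, top, width, height = video_bounds
    # Transform to the hand icon only while the cursor is over the video.
    over_video = (left <= x < left + width) and (top <= y < top + height)
    return "hand" if over_video else "arrow"


# Usage: moving cursor icon 1514 over the video region yields the hand
# icon 1516; elsewhere the ordinary arrow cursor is rendered.
bounds = (100, 200, 320, 240)
icon = cursor_for_position(150, 250, bounds)
```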
[0149] Once the user has selected webpage video 1512, the viewing
application illustrated in FIG. 15 will switch to full screen video
view 1518, allowing for larger viewing of webpage video 1512 (shown
as larger webpage video 1520 in full screen video view 1518). In
the embodiment illustrated, the same underlying video transport, as
discussed with respect to other Figures, is used both for
displaying webpage portion 1502 and full screen video view 1518.
Full screen video view 1518 includes a video-control user interface
1522, the length of which corresponds to the length of webpage
video 1520; viewed-portion segment 1526; and pause/play icon 1524,
which may alternately be used for pausing or playing webpage
video 1520, depending on the current play state of webpage video
1520. In another embodiment, the viewing application may not have a
specialized video-control user interface for playing webpage video
1520. In other embodiments, the viewing application may have a
different specialized video-control user interface for playing
webpage video 1520.
[0150] In some embodiments, the viewing application may play video
1512 as shown in webpage portion 1502, and not switch to full
screen video view 1518. In some embodiments, whether video 1512
switches to full screen video view 1518 may be
user-configurable.
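The view-switching behavior described in the preceding paragraphs can be sketched as a small state machine; the class and flag names are hypothetical, and the user-configurable switch reflects the option the text leaves open.

```python
class VideoViewController:
    """Tracks the view state for a webpage video in the viewing
    application. Whether selecting the video switches to full screen
    video view is modeled as a user-configurable flag (the flag name
    is an assumption for illustration)."""

    def __init__(self, fullscreen_on_select=True):
        self.fullscreen_on_select = fullscreen_on_select
        self.view = "page"       # "page" or "fullscreen"
        self.playing = False

    def select_video(self):
        # The same underlying video transport serves both views, so
        # selecting the video only changes the view state and starts
        # playback; no separate stream is negotiated.
        if self.fullscreen_on_select:
            self.view = "fullscreen"
        self.playing = True

    def toggle_pause_play(self):
        # The pause/play icon alternates based on the current play state.
        self.playing = not self.playing


# Usage: with the flag set, selecting the video enters full screen view.
controller = VideoViewController(fullscreen_on_select=True)
controller.select_video()
```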
[0151] One skilled in the art will recognize that the illustration
of FIG. 15 is merely an example, and that the invention may be
practiced and implemented in many other ways.
[0152] FIG. 16 illustrates an example computer system suitable for
use in association with a client-server architecture for remote
interaction, according to one embodiment. As shown, computer system
1600 may represent either a computer operating as a server, or a
computer operating as a client, with the general components
illustrated in FIG. 16 potentially varying with each respective
representation, as would be appreciated by one of skill in the art.
Computer system 1600 may include one or more processors 1602 and
may include system memory 1604. Additionally, computer system 1600
may include storage 1606 in the form of one or more devices (such
as a hard drive, an optical or another type of disk, electronic
memory, including flash memory, and so forth), input/output devices
1608 (such as a keyboard (screen-based or physical, in a variety of
forms), scroll wheels, number pads, stylus-based inputs, a
touchscreen or touchpad, etc.) and communication interfaces 1610
(to connect to a LAN, a WAN, a wired or wireless network, and so
forth). The elements may be coupled to each other via system bus
1612, which may represent one or more buses. In the case where
system bus 1612 represents multiple buses, the multiple buses may
be bridged by one or more bus bridges (not shown). When
representing client devices in some embodiments, processor(s) 1602
may comprise a controller, and system memory 1604 and storage 1606
may comprise one cohesive memory component.
[0153] These elements each perform their conventional functions
known in the art. In various embodiments, computer system 1600 may
be at least partially incorporated in a larger computing system.
System memory 1604 and storage 1606 may be employed to store a
working copy and a permanent copy of the programming instructions
implementing various aspects of the one or more earlier described
embodiments of the present invention. Any software portions
described herein need not include discrete software modules. Any
software configuration described above is meant only by way of
example; other configurations are contemplated by and within the
scope of various embodiments of the present invention. The term
"engine" is used herein to denote any software or hardware
configuration, or combination thereof, that performs the function
or functions referenced. In particular, the term "web browsing
engine" is used herein to describe any software or hardware
configuration, or combination thereof, that performs a web browsing
function.
[0154] With respect to some embodiments of the invention, modules
have been described to implement various functions. In alternate
embodiments, part or all of the modules may be implemented in
hardware, for example, using one or more Application Specific
Integrated Circuits (ASICs) instead.
[0155] In all of the foregoing, it is appreciated that such
embodiments are stated only for the purpose of example, and that
other embodiments could equally be provided without departing from
the essential characteristics of the present invention.
[0156] The present invention has been described in particular
detail with respect to one possible embodiment. Those of skill in
the art will appreciate that the invention may be practiced in
other embodiments. First, the particular naming of the components,
capitalization of terms, the attributes, data structures, or any
other programming or structural aspect is not mandatory or
significant, and the mechanisms that implement the invention or its
features may have different names, formats, or protocols. Further,
the system may be implemented via a combination of hardware and
software, as described, or entirely in hardware elements. Also, the
particular division of functionality between the various system
components described herein is merely by way of example, and not
mandatory; functions performed by a single system component may
instead be performed by multiple components, and functions
performed by multiple components may instead be performed by a
single component.
[0157] Some portions of the above description present the features of
the present invention in terms of algorithms and symbolic
representations of operations on information. These algorithmic
descriptions and representations are the means used by those
skilled in the data processing arts to most effectively convey the
substance of their work to others skilled in the art. These
operations, while described functionally or logically, are
understood to be implemented by computer programs. Furthermore, it
has also proven convenient at times, to refer to these arrangements
of operations as modules or by functional names, without loss of
generality.
[0158] Unless specifically stated otherwise as apparent from the
above discussion, it is appreciated that throughout the
description, discussions utilizing terms such as "determining" or
"displaying" or the like, refer to the action and processes of a
computer system, or similar electronic computing device, that
manipulates and transforms data represented as physical
(electronic) quantities within the computer system memories or
registers or other such information storage, transmission or
display devices.
[0159] Certain aspects of the present invention include process
steps and instructions described herein in the form of an
algorithm. It should be noted that the process steps and
instructions of the present invention could be embodied in
software, firmware or hardware, and when embodied in software,
could be downloaded to reside on and be operated from different
platforms used by real time network operating systems.
[0160] The present invention also relates to an apparatus for
performing the operations herein. This apparatus may be specially
constructed for the required purposes, or it may include a computer
(including any type of computer, depending on various embodiments,
including a server, personal computer, tablet device, handheld
computer, PDA, cellular phone, etc.) selectively activated or
reconfigured by a computer program stored on a computer readable
medium that can be accessed by the computer. Such a computer
program may be stored in a computer readable storage medium, such
as, but not limited to, any type of disk including floppy disks,
optical disks, CD-ROMs, magnetic-optical disks, read-only memories
(ROMs), random access memories (RAMs), EPROMs, EEPROMs, magnetic or
optical cards, application specific integrated circuits (ASICs), or
any type of media suitable for storing electronic instructions, and
each coupled to a computer system bus. Furthermore, the computers
referred to in the specification may include a single processor or
may be architectures employing multiple processor designs,
including multi-core designs, for increased computing
capability.
[0161] The algorithms and operations presented herein are not
inherently related to any particular computer or other apparatus.
Various general-purpose systems may also be used with programs in
accordance with the teachings herein, or it may prove convenient to
construct more specialized apparatus to perform the required method
steps. The required structure for a variety of these systems will
be apparent to those of skill in the art, along with equivalent
variations. In addition, the present invention is not described
with reference to any particular programming language. It is
appreciated that a variety of programming languages may be used to
implement the teachings of the present invention as described
herein.
[0162] The present invention is well suited to a wide variety of
computer network systems over numerous topologies. Within this
field, the configuration and management of large networks include
storage devices and computers that are communicatively coupled to
dissimilar computers and storage devices over a network, such as
the Internet.
[0163] Finally, it should be noted that the language used in the
specification has been principally selected for readability and
instructional purposes, and may not have been selected to delineate
or circumscribe the inventive subject matter. Accordingly, the
disclosure of the present invention is intended to be illustrative,
but not limiting, of the scope of the invention, which is set forth
in the following claims.
* * * * *