U.S. patent application number 12/878848 was filed with the patent office on 2010-12-30 for system and method for providing a remote user interface for an application executing on a computing device.
This patent application is currently assigned to EXENT TECHNOLOGIES, LTD. Invention is credited to Yoav M. Tzruya.
Application Number: 12/878848
Publication Number: 20100332984
Family ID: 37734975
Filed Date: 2010-12-30

United States Patent Application 20100332984
Kind Code: A1
Tzruya; Yoav M.
December 30, 2010
SYSTEM AND METHOD FOR PROVIDING A REMOTE USER INTERFACE FOR AN
APPLICATION EXECUTING ON A COMPUTING DEVICE
Abstract
A system and method for providing a remote user interface for an
application, such as a video game, executing on a computing device.
The system includes a computing device configured to execute a
software application and at least one remote user interface (UI)
communicatively coupled to the computing device via a data
communication network. The remote UI includes at least one hardware
device such as a video, audio or user input/output (I/O) device.
The computing device is further configured to emulate the hardware
device locally and to redirect function calls generated by the
software application for the emulated local hardware device to the
remote UI for processing by the hardware device.
Inventors: Tzruya; Yoav M. (Even Yehuda, IL)
Correspondence Address: FIALA & WEAVER P.L.L.C., C/O CPA GLOBAL, P.O. BOX 52050, MINNEAPOLIS, MN 55402, US
Assignee: EXENT TECHNOLOGIES, LTD., Petach-Tikva, IL
Family ID: 37734975
Appl. No.: 12/878848
Filed: September 9, 2010
Related U.S. Patent Documents

Application Number | Filing Date | Patent Number
11/204,363 | Aug 16, 2005 |
12/878,848 | |
Current U.S. Class: 715/716; 715/727
Current CPC Class: G06F 9/452 20180201
Class at Publication: 715/716; 715/727
International Class: G06F 3/048 20060101 G06F003/048; G06F 3/16 20060101 G06F003/16
Claims
1. A method for operating a remote user interface (UI) for a
computing device comprising: publishing graphics capability
information to the computing device over a data communication
network; receiving graphics commands from the computing device over
the data communication network, wherein the format of the graphics
commands received from the computing device is consistent with the
graphics capability information; and processing the graphics
commands in a graphics device to generate video content
therefrom.
2. The method of claim 1, further comprising: rendering and
displaying the video content.
3. The method of claim 1, wherein publishing graphics capability
information comprises publishing graphics capability information in
accordance with a UPnP protocol.
4. The method of claim 1, wherein publishing graphics capability
information to the computing device over a data communication
network comprises publishing graphics capability information to the
computing device over a local area network.
5. The method of claim 1, wherein publishing graphics capability
information to the computing device over a data communication
network comprises publishing graphics capability information to the
computing device over a wide area network.
6. The method of claim 1, wherein receiving graphics commands
comprises receiving one of OpenGL commands, DirectX commands, or
Graphics Device Interface commands.
7. The method of claim 1, wherein receiving graphics commands
comprises receiving Pre-Rendering Code (PRC) commands.
8. A remote user interface (UI) for a computing device comprising:
control logic; and a graphics device coupled to the control logic;
wherein the control logic is configured to publish graphics
capability information to the computing device over a data
communication network and to receive graphics commands from the
computing device over the data communication network, wherein the
format of the graphics commands received from the computing device
is consistent with the graphics capability information; and wherein
the graphics device processes the graphics commands to generate
video content therefrom.
9. The remote UI of claim 8, further comprising: a display that
renders and displays the video content.
10. The remote UI of claim 8, wherein the control logic is
configured to publish the graphics capability information in
accordance with a UPnP protocol.
11. The remote UI of claim 8, wherein the control logic is
configured to publish the graphics capability information to the
computing device over a local area network.
12. The remote UI of claim 8, wherein the control logic is
configured to publish the graphics capability information to the
computing device over a wide area network.
13. The remote UI of claim 8, wherein the control logic is
configured to receive one of OpenGL commands, DirectX commands, or
Graphics Device Interface commands.
14. The remote UI of claim 8, wherein the control logic is
configured to receive Pre-Rendering Code (PRC) commands.
15. A method for operating a remote user interface (UI) for a
computing device comprising: publishing audio capability
information to the computing device over a data communication
network; receiving audio commands from the computing device over
the data communication network, wherein the format of the audio
commands received from the computing device is consistent with the
audio capability information; and processing the audio commands in
an audio device to generate audio content therefrom.
16. The method of claim 15, further comprising: playing the audio
content.
17. The method of claim 15, wherein publishing audio capability
information comprises publishing audio capability information in
accordance with a UPnP protocol.
18. The method of claim 15, wherein publishing audio capability
information to the computing device over a data communication
network comprises publishing audio capability information to the
computing device over a local area network.
19. The method of claim 15, wherein publishing audio capability
information to the computing device over a data communication
network comprises publishing audio capability information to the
computing device over a wide area network.
20. The method of claim 15, wherein receiving audio commands
comprises receiving DirectX commands.
21. A remote user interface (UI) for a computing device comprising:
control logic; and an audio device coupled to the control logic;
wherein the control logic is configured to publish audio capability
information to the computing device over a data communication
network and to receive audio commands from the computing device
over the data communication network, wherein the format of the
audio commands received from the computing device is consistent
with the audio capability information; and wherein the audio device
processes the audio commands to generate audio content
therefrom.
22. The remote UI of claim 21, further comprising: one or more
speakers that play the audio content.
23. The remote UI of claim 21, wherein the control logic is
configured to publish the audio capability information in
accordance with a UPnP protocol.
24. The remote UI of claim 21, wherein the control logic is
configured to publish the audio capability information to the
computing device over a local area network.
25. The remote UI of claim 21, wherein the control logic is
configured to publish the audio capability information to the
computing device over a wide area network.
26. The remote UI of claim 21, wherein the control logic is
configured to receive DirectX commands.
27. A method for operating a remote user interface (UI) for a
computing device comprising: publishing user input/output (I/O)
device capability information to the computing device over a data
communication network; receiving control commands from the
computing device over the data communication network, wherein the
format of the control commands received from the computing device
is consistent with the user I/O device capability information; and
processing the control commands in a user I/O device to generate
output to a user.
28. The method of claim 27, further comprising: processing input
from a user in the user I/O device to generate control commands;
and transmitting the generated control commands to the computing
device over the data communication network.
29. The method of claim 27, wherein publishing user I/O device
capability information comprises publishing user I/O device
capability information in accordance with a UPnP protocol.
30. The method of claim 27, wherein publishing user I/O device
capability information to the computing device over a data
communication network comprises publishing user I/O device
capability information to the computing device over a local area
network.
31. The method of claim 27, wherein publishing user I/O device
capability information to the computing device over a data
communication network comprises publishing user I/O device
capability information to the computing device over a wide area
network.
32. The method of claim 27, wherein receiving control commands
comprises receiving DirectX commands.
33. A remote user interface (UI) for a computing device comprising:
control logic; and a user input/output (I/O) device coupled to the
control logic; wherein the control logic is configured to publish
user I/O device capability information to the computing device over
a data communication network and to receive control commands from
the computing device over the data communication network, wherein
the format of the control commands received from the computing
device is consistent with the user I/O device capability
information; and wherein the user I/O device processes the control
commands to generate output for a user.
34. The remote UI of claim 33, wherein the user I/O device
processes input from a user to generate control commands; and
wherein the control logic is further configured to transmit the
generated control commands to the computing device over the data
communication network.
35. The remote UI of claim 33, wherein the control logic is
configured to publish the user I/O device capability information in
accordance with a UPnP protocol.
36. The remote UI of claim 33, wherein the control logic is
configured to publish the user I/O device capability information to
the computing device over a local area network.
37. The remote UI of claim 33, wherein the control logic is
configured to publish the user I/O device capability information to
the computing device over a wide area network.
38. The remote UI of claim 33, wherein the control logic is
configured to receive DirectX commands.
39. A method for operating a remote user interface (UI) for a
computing device comprising: publishing graphics and audio
capability information to the computing device over a data
communication network; receiving graphics and audio commands from
the computing device over the data communication network, wherein
the format of the graphics commands received from the computing
device is consistent with the graphics capability information and
the format of the audio commands received from the computing device
is consistent with the audio capability information; processing the
graphics commands in a graphics device to generate video content
therefrom; and processing the audio commands in an audio device to
generate audio content therefrom.
40. The method of claim 39, further comprising: publishing user
input/output (I/O) device capability information to the computing
device over the data communication network; and receiving control
commands from the computing device over the data communication
network, wherein the format of the control commands received from
the computing device is consistent with the user I/O device
capability information; and processing the control commands in a
user I/O device to generate output for a user.
41. A remote user interface (UI) for a computing device comprising:
control logic; a graphics device coupled to the control logic; and
an audio device coupled to the control logic; wherein the control
logic is configured to publish graphics and audio capability
information to the computing device over a data communication
network and to receive graphics and audio commands from the
computing device over the data communication network, wherein the
format of the graphics commands received from the computing device
is consistent with the graphics capability information and the
format of the audio commands received from the computing device is
consistent with the audio capability information; wherein the
graphics device processes the graphics commands to generate video
content therefrom; and wherein the audio device processes the audio
commands to generate audio content therefrom.
42. The remote UI of claim 41, further comprising: a user
input/output (I/O) device coupled to the control logic; wherein the
control logic is further configured to publish user I/O device
capability information to the computing device over the data
communication network and to receive control commands from the
computing device over the data communication network, wherein the
format of the control commands received from the computing device
is consistent with the user I/O device capability information; and
wherein the user I/O device processes the control commands to
generate output for a user.
Description
CROSS-REFERENCE TO RELATED APPLICATIONS
[0001] This application is a continuation of U.S. patent
application Ser. No. 11/204,363, filed Aug. 16, 2005, the entirety
of which is incorporated by reference herein.
BACKGROUND OF THE INVENTION
[0002] 1. Field of the Invention
[0003] The present invention generally relates to user interfaces
for an application executing on a computing device. In particular,
the present invention relates to a system and method for providing
a remote user interface for an application, such as a video game,
executing on a computing device.
[0004] 2. Background
[0005] Currently, the platforms available for playing video games
or other real-time software applications in the home include
personal computers (PC) and various proprietary console-based
systems, such as Microsoft's Xbox® and Sony's Playstation®.
These platforms are limited in various respects. For example, a
given PC can run only a single video game at a time, since the
video game requires exclusive control over both the graphics and
audio hardware of the PC as well as the PC's display and sound
system. This is true regardless of whether the game is being played
on-line (i.e., in connection with a server or other PC over a data
communication network) or off-line. To enable multiple end users to
play different video games at the same time, an entirely new PC or
other gaming platform must be purchased and located elsewhere in
the home. Furthermore, the end user is confined to playing the
video game in the room in which the PC is located.
BRIEF SUMMARY OF THE INVENTION
[0006] The present invention provides a system and method for
providing a remote user interface for an application, such as a
video game, executing on a computing device. The system includes a
computing device, such as a personal computer (PC), configured to
execute a software application and a remote user interface (UI)
communicatively coupled thereto via a data communication network.
The remote UI includes a hardware device such as a video, audio or
user input/output (I/O) device. The computing device is also
configured to emulate a local hardware device and to redirect
function calls generated by the software application for the
emulated local hardware device to the remote UI for processing
therein. The computing device may also be further configured to
receive control commands from the remote UI, the control commands
originating from a user I/O device, and to redirect the control
commands to the software application.
[0007] In accordance with an implementation of the present
invention, multiple remote UIs may be coupled to the computing
device via the data communication network, and each of the multiple
remote UIs may include one or more hardware devices, such as one or
more of a video, audio or user I/O device.
[0008] By off-loading the processing of graphics and/or audio commands to a remote UI, an implementation of the present invention permits simultaneous execution of multiple software applications on the computing device. Consequently, a user of a first remote UI can remotely access and interact with a first software application executing on the computing device while a user of a second remote UI remotely accesses and utilizes a second software application executing on the computing device. In this way, more than one user within a home can remotely use, at the same time, different interactive software applications executing on the computing device, each of which would otherwise have exclusively occupied the resources of the computing device.
[0009] An implementation of the present invention provides a
low-cost solution to the problem of providing multiple remote user
interfaces for using interactive software applications throughout
the home.
[0010] An implementation of the present invention provides
additional benefits in that it allows a software application to be
executed on its native computing platform while being accessed via
a remote UI, without requiring that the software application be
programmed to accommodate such remote access. This is achieved
through the emulation of local resources by the computing device
and the subsequent interception and redirection of commands
generated by the software application for those local resources in
a manner transparent to the software application. This is in contrast to, for example, conventional X-Windows systems, which enable programs running on one computer to be displayed on another computer; X-Windows, however, can be used only with software applications written specifically to work with the X-Windows protocol.
[0011] Furthermore, because a remote UI in accordance with an
implementation of the present invention need only implement the
low-level hardware necessary to process graphics and audio commands
transmitted from the computing device, it may be manufactured in a
low-cost fashion relative to the cost of manufacturing the
computing device.
[0012] Indeed, because the remote UI device need only implement
such low-level hardware, the remote UI device can be implemented as
a mobile device, such as a personal digital assistant (PDA),
thereby allowing an end user to roam from place to place within the
home, or as an extension to a set-top box, thereby integrating into
cable TV and IPTV networks.
[0013] Additionally, because an implementation of the present
invention sends graphics and audio commands from the computing
device to a remote UI device rather than a high-bandwidth raw video
and audio feed, such an implementation provides a low-latency,
low-bandwidth alternative to the streaming of raw video and audio
content over a data communication network. Thus, an implementation
of the present invention marks an improvement over conventional
"screen-scraping" technologies, such as those implemented in
Windows terminal servers, in which graphics output is captured at a
low level, converted to a raw video feed and transmitted to a
remote device in a fully-textured and fully-rendered form.
[0014] Further features and advantages of the present invention, as
well as the structure and operation of various embodiments thereof,
are described in detail below with reference to the accompanying
drawings. It is noted that the invention is not limited to the
specific embodiments described herein. Such embodiments are
presented herein for illustrative purposes only. Additional
embodiments will be apparent to persons skilled in the relevant
art(s) based on the teachings contained herein.
BRIEF DESCRIPTION OF THE DRAWINGS/FIGURES
[0015] The accompanying drawings, which are incorporated herein and
form part of the specification, illustrate the present invention
and, together with the description, further serve to explain the
principles of the invention and to enable a person skilled in the
relevant art(s) to make and use the invention.
[0016] FIG. 1 is a block diagram illustrating an exemplary system
for providing a remote user interface for an application executing
on a computing device in accordance with an implementation of the
present invention.
[0017] FIG. 2 is a flowchart of an example process for establishing
communication between a computing device and a remote UI and for
remotely generating and displaying graphics content via the remote
UI in accordance with an implementation of the present
invention.
[0018] FIG. 3 illustrates an example software architecture of a
media server in accordance with an implementation of the present
invention.
[0019] FIG. 4 depicts an example computer system that may be
utilized to implement a computing device in accordance with an
implementation of the present invention.
[0020] The features and advantages of the present invention will
become more apparent from the detailed description set forth below
when taken in conjunction with the drawings, in which like
reference characters identify corresponding elements throughout. In
the drawings, like reference numbers generally indicate identical,
functionally similar, and/or structurally similar elements. The
drawing in which an element first appears is indicated by the
leftmost digit(s) in the corresponding reference number.
DETAILED DESCRIPTION OF THE INVENTION
A. System Architecture
[0021] FIG. 1 is a high level block diagram illustrating an
exemplary system 100 for providing a remote user interface for an
application executing on a computing device. As shown in FIG. 1,
system 100 includes a computing device 102 coupled to one or more
remote user interfaces (UIs) 106a-106n via a data communication
network 104. In one exemplary implementation, computing device 102
and remote UIs 106a-106n are all located in a user's home and data
communication network 104 comprises a wired and/or wireless local
area network (LAN). In an alternative exemplary implementation,
computing device 102 is located at the central office or
point-of-presence of a broadband service provider, remote UIs
106a-106n are located in a user's home, and data communication
network 104 includes a wide area network (WAN) such as the
Internet.
[0022] Computing device 102 is configured to execute a software
application 108, such as a video game, that is programmed to
generate graphics and audio commands for respective hardware
devices capable of executing those commands. Software application
108 is also programmed to receive and respond to control commands
received from a user input/output (I/O) device and/or associated
user I/O device interface. Computing device 102 represents the
native platform upon which software application 108 was intended to
be executed and displayed.
[0023] For the sake of convenience, from this point forward,
computing device 102 will be described as a personal computer (PC)
and software application 108 will be described as a software
application programmed for execution on a PC. However, the present
invention is not so limited. For example, computing device 102 may
comprise a server, a console, or any other processor-based system
capable of executing software applications.
[0024] In a conventional PC, graphics and audio commands generated
by a software application such as software application 108 would be
received by software interfaces also executing on the PC and then
processed for execution by local hardware devices, such as a video
and audio card connected to the motherboard of the PC. Furthermore,
control commands for the software application would be received via
one or more local user input/output (I/O) devices coupled to an I/O
bus of the PC, such as a keyboard, mouse, game controller or the
like, and processed by a locally-executing software interface prior
to receipt by the software application.
[0025] In contrast, in accordance with FIG. 1 and as will be
described in more detail herein, software application 108 is
executed within a sandbox environment 118 on computing device 102.
Sandbox environment 118 captures the graphics and audio commands
generated by software application 108 and selectively redirects
them to one of remote UIs 106a-106n via data communication network
104. This allows software application 108 to be displayed on the
remote UI using the hardware of the remote UI, even though software
application 108 may not have been programmed to utilize such remote
resources. Furthermore, sandbox environment 118 receives control
commands from the remote UI via data communication network 104 and
processes them for input to software application 108.
[0026] As shown in FIG. 1, remote UI 106a includes control logic
110, a graphics device 112, an audio device 114, and a user I/O
device 116. Each of the other remote UIs 106b-106n includes
similar features, although this is not shown in FIG. 1 for the sake
of brevity. Control logic 110 comprises an interface between data
communication network 104 and each of graphics device 112, audio
device 114 and user I/O device 116. As will be described in more
detail herein, control logic 110 is configured to at least perform
functions relating to the publication of graphics, audio and user
I/O device capability information over data communication network
104 and to facilitate the transfer of graphics, audio and user I/O
device commands from computing device 102 to graphics device 112,
audio device 114, and user I/O device 116. As will be appreciated
by persons skilled in the relevant art based on the teachings
provided herein, control logic 110 can be implemented in hardware,
software, or as a combination of hardware and software.
[0027] Graphics device 112 comprises a graphics card or like
hardware capable of executing graphics commands to generate image
and video content. Audio device 114 comprises an audio card or like
hardware capable of executing audio commands to generate audio
content. User I/O device 116 comprises a mouse, keyboard, game
controller or like hardware capable of receiving user input and
generating control commands therefrom. User I/O device 116 may be
connected to remote UI 106a using a direct cable connection or any
type of wireless communication.
[0028] Each of remote UIs 106a-106n can be a device capable of independently displaying the video content, playing the audio content and receiving control commands from a user. Alternatively, each of remote UIs 106a-106n may operate in conjunction with one or more other devices to perform these functions. For example, the remote UI may comprise a set-top box that operates in conjunction with a television to which it is connected to display video content and play audio content, and with a user I/O device to which it is connected to receive control commands from a user. As a
further example, the remote UI may comprise a PC that operates in
conjunction with a monitor to which it is connected to display
video content, with a sound system or speakers to which it is
connected to play audio content, and in conjunction with a user I/O
device to which it is connected to receive control commands from a
user.
[0029] Although FIG. 1 shows only one software application 108
executing within sandbox environment 118, it is to be appreciated
that multiple software applications may be simultaneously executing
within multiple corresponding sandbox environments 118.
Consequently, a user of a first remote UI can remotely access and
interact with a first software application executing on computing
device 102 while a user of a second remote UI remotely accesses and
utilizes a second software application executing on computing
device 102, each in accordance with the techniques described
herein. In this way, more than one user within a home can use
different interactive software applications executing on computing
device 102 at the same time.
[0030] The operation and interaction of sandbox environment 118 and
remote UIs 106a-106n will now be described in more detail.
1. Sandbox Environment
[0031] Sandbox environment 118 comprises one or more software
modules installed on computing device 102 that operate to isolate
software application 108 from other processes executing on
computing device 102 and that optionally prevent a user from
accessing processes or files associated with software application
108. At a minimum, sandbox environment 118 includes one or more
software modules that capture graphics and audio commands generated
by software application 108 for selective transmission to one of
remote UIs 106a-106n. The capturing of commands may occur, for
example, at the device driver level or hardware abstraction layer
(HAL) level of computing device 102.
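The capture of commands described above can be illustrated with a minimal sketch. All names here (CapturingProxy, FakeGraphicsDevice) are hypothetical; the patent describes capture at the device driver or HAL level and does not disclose source code, so this only conveys the interception pattern.

```python
# Minimal sketch of command capture by interception (hypothetical API;
# the patent does not specify this implementation).

class CapturingProxy:
    """Wraps a graphics interface and records each call before forwarding it."""

    def __init__(self, target, sink):
        self._target = target  # local (emulated) graphics interface
        self._sink = sink      # list standing in for the network transport

    def __getattr__(self, name):
        real = getattr(self._target, name)

        def wrapper(*args, **kwargs):
            # Record the call in "meta" form for later redirection to a remote UI.
            self._sink.append((name, args))
            return real(*args, **kwargs)

        return wrapper


class FakeGraphicsDevice:
    """Stand-in for a local graphics stack."""

    def draw_triangle(self, a, b, c):
        return "ok"


sink = []
device = CapturingProxy(FakeGraphicsDevice(), sink)
device.draw_triangle((0, 0), (1, 0), (0, 1))
# sink now holds the captured command, ready for transmission
```

In a real system the recorded calls would be batched and sent over the data communication network rather than kept in a list.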
[0032] In particular, sandbox environment 118 is configured to
receive notifications from the control logic within each of remote
UIs 106a-106n. The term "notification" is used in a general sense,
and may in fact include the transmission of multiple messages from
a remote UI to computing device 102 or the exchange of messages
between a remote UI and computing device 102. The notifications
provide a means by which each of remote UIs 106a-106n can publish
its capabilities. In one implementation, a device discovery and
control protocol such as UPnP is used to allow sandbox environment
118 to automatically discover each of remote UIs 106a-106n and to
learn about their capabilities.
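The patent names UPnP as one possible discovery protocol but does not define a message format. The following sketch is therefore purely illustrative, assuming a JSON payload for the capability notification a remote UI might publish.

```python
import json

# Hypothetical capability notification; the actual wire format is not
# disclosed in the patent.

def publish_capabilities(graphics, audio, io):
    """Serialize a remote UI's device capabilities for discovery."""
    return json.dumps({
        "device_type": "remote_ui",
        "graphics": graphics,  # e.g. supported command sets, resolution
        "audio": audio,
        "user_io": io,
    })

def parse_capabilities(message):
    """Recover the capability record on the computing-device side."""
    return json.loads(message)

msg = publish_capabilities(
    graphics={"api": ["OpenGL", "DirectX"], "max_resolution": [1280, 720]},
    audio={"channels": 2},
    io={"devices": ["game_controller"]},
)
caps = parse_capabilities(msg)
```

The parsed record is what the sandbox environment would use to decide which device capabilities to emulate locally.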
[0033] Upon learning about the capabilities of a remote UI, sandbox
environment 118 emulates the existence of a device, including
device drivers, having similar capabilities. For example, upon
receiving information about the capabilities of remote UI 106a,
sandbox environment 118 would emulate devices having the respective
capabilities of graphics device 112, audio device 114, and user I/O
device 116. This would include creating a software stack for each
of those devices on computing device 102.
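As one illustrative reading of this emulation step (the structure below is assumed, not taken from the patent), the sandbox might instantiate one emulated device stack per capability set the remote UI published:

```python
# Sketch of emulated device stack creation from published capabilities
# (hypothetical structure; the patent does not disclose one).

class EmulatedDevice:
    """Local stand-in for a remote hardware device."""

    def __init__(self, kind, capabilities):
        self.kind = kind
        self.capabilities = capabilities
        self.captured = []  # commands captured for redirection

    def submit(self, command):
        self.captured.append(command)


def build_sandbox_stacks(published):
    """Create one emulated device per capability set a remote UI published."""
    return {kind: EmulatedDevice(kind, caps) for kind, caps in published.items()}


stacks = build_sandbox_stacks({
    "graphics": {"api": "DirectX"},
    "audio": {"channels": 2},
    "user_io": {"devices": ["keyboard"]},
})
stacks["graphics"].submit(("Clear", (0, 0, 0)))
```

Each emulated device then stands in for the corresponding remote hardware, so the software application issues commands as if the hardware were local.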
[0034] The published capabilities of a remote UI may be inherently different from the internal hardware and software capabilities of computing device 102. As such, the software stacks created on computing device 102 provide an emulated environment that allows software application 108 to operate as if such capabilities existed within computing device 102.
[0035] Furthermore, the published capabilities of a remote UI 106a
may be significantly different from the capabilities of remote UIs
106b-106n. To address this, an implementation of the present
invention creates a separate software stack for each such remote UI
within a corresponding separate sandbox environment 118 on
computing device 102. Each software stack may be significantly
different from each other software stack. As a result, a
heterogeneous set of remote UIs can be supported by system 100.
[0036] Once created, an emulated device captures commands generated
by software application 108 relating to graphics, audio, or user
I/O devices, depending on the type of device being emulated. The
captured commands are transmitted over data communication network
104 to a selected one of remote UIs 106a-106n. For example,
commands generated by software application 108 directed to a
DirectX or OpenGL stack may be captured and transmitted over data
communication network 104 to one of remote UIs 106a-106n.
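The wire encoding for captured commands is not specified in the patent. Assuming a simple JSON encoding for illustration, redirecting a batch of captured "meta" commands might be sketched as:

```python
import json

# Illustrative serialization of captured (opcode, args) commands for
# network transport; the actual format is an assumption.

def encode_commands(commands):
    """Pack a batch of (opcode, args) tuples into a compact message."""
    return json.dumps([{"op": op, "args": list(args)} for op, args in commands])

def decode_commands(message):
    """Unpack a message back into (opcode, args) tuples on the remote UI."""
    return [(c["op"], tuple(c["args"])) for c in json.loads(message)]

batch = [("SetTexture", (3,)), ("DrawPrimitive", ("TRIANGLELIST", 0, 12))]
wire = encode_commands(batch)
assert decode_commands(wire) == batch  # round-trip check
```

Because only command descriptions cross the network, the payload stays far smaller than a raw video or audio stream, which is the bandwidth advantage the specification describes.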
[0037] As will be appreciated by persons skilled in the art,
because sandbox environment 118 captures graphics and audio
commands in their "meta" form and transmits them from computing
device 102 to a remote UI 106a-106n, an implementation of the
present invention provides a low-latency, low-bandwidth alternative
to the streaming of raw video and audio content over a data
communication network. An example of such meta commands includes,
but is not limited to, OpenGL commands, DirectX commands or
Graphics Device Interface (GDI) commands.
[0038] In one implementation, sandbox environment 118 generates one
or more Pre-Rendering Code (PRC) streams or commands responsive to
the receipt of DirectX or OpenGL inputs from software application
108. These PRC streams are then transmitted over data communication
network 104 to a selected one of remote UIs 106a-106n, where they
are received and processed by an output device to generate video
and/or audio content. The manner in which the PRC is generated may
be related to parameters of the output device which were made known
when the remote UI first published its capabilities.
2. Remote UIs
[0039] Each of remote UIs 106a-106n includes hardware and software
stacks for processing graphics commands and generating graphics
content therefrom, processing audio commands and generating audio
content therefrom, and for receiving user input and generating
control commands therefrom. As noted above, each remote UI
106a-106n publishes its particular set of capabilities to sandbox
environment 118. This may be achieved, for example, by sending a
notification to computing device 102 via data communication network
104 or alternatively through the use of a device discovery and
control protocol such as UPnP.
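The capability-publication step described above might be modeled as follows. The capability fields and the `publish_capabilities` helper are hypothetical, standing in for a UPnP-style device announcement; the patent does not prescribe a particular record format.

```python
# Hypothetical capability record a remote UI might publish.
capabilities = {
    "device_id": "remote-ui-1",  # assumed identifier scheme
    "graphics": {"api": "Direct3D", "version": 9,
                 "max_resolution": [1920, 1080]},
    "audio": {"channels": 2, "sample_rate_hz": 48000},
    "input": ["keyboard", "game_controller"],
}

def publish_capabilities(registry, caps):
    """Record a remote UI's capabilities with the sandbox
    environment, keyed by device id (stands in for sending a
    notification or a UPnP-style announcement)."""
    registry[caps["device_id"]] = caps
    return caps["device_id"]

registry = {}  # the sandbox environment's view of known remote UIs
publish_capabilities(registry, capabilities)
```

Once registered, these records are what the media server consults when deciding how to split functionality between itself and a given remote UI.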
[0040] The software stacks on each remote UI are capable of
processing graphics and audio commands transmitted over data
communication network 104 by computing device 102. The processing
is performed in adherence with both the original command
functionality as well as in a low-latency fashion. In an
implementation where the commands comprise PRC streams (described
above), the software stacks convert the PRC into video and audio
output to feed a presentation device (e.g., video display,
speakers) that is integrated with or connected to the remote UI
device.
B. Example Process
[0041] FIG. 2 is a flowchart 200 of an example process for
establishing communication between computing device 102 and one of
remote UIs 106a-106n and for remotely generating and displaying
graphics content via the remote UI. In the following description,
the combination of computing device 102 and sandbox environment 118
executing thereon will be collectively referred to as "the media
server", while the remote UI 106a-106n with which it is
communicating will simply be referred to as "the remote UI".
[0042] As shown in FIG. 2, the process begins at step 202, in which
an end user requests to run or start a graphics application that is
located on the media server for display on the remote UI, or on a
device that is connected to the remote UI. For example, the end
user may request to run a video game located on the media server,
wherein the media server is situated in the basement of the end
user's home, in order to view it on a television which is connected
to the remote UI in another part of the end user's home. The
request may be input by the end user via a user interface located
on the remote UI or on a device connected to the remote UI.
[0043] At step 204, responsive to the end user's request, a network
connection is established between the remote UI and the media
server via a data communication network. As will be readily
appreciated by persons skilled in the art, any of a variety of
network protocols can be used in order to set up communication
between the remote UI and the media server. For example, in one
implementation, the media server is configured to listen and wait
for an incoming Internet Protocol (IP) connection and the remote UI
is configured to establish a Transmission Control Protocol/Internet
Protocol (TCP/IP) connection to the media server when needed.
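A minimal sketch of this connection setup, with the media server listening and the remote UI initiating a TCP/IP connection, is shown below. The greeting message and the use of an ephemeral port are illustrative choices, not part of the described method.

```python
import socket
import threading

def media_server_listen(ready, result, host="127.0.0.1"):
    """Media server side: listen and wait for an incoming
    connection, then read one message from the remote UI."""
    srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    srv.bind((host, 0))          # ephemeral port for the example
    srv.listen(1)
    result["port"] = srv.getsockname()[1]
    ready.set()                  # tell the remote UI where to connect
    conn, _ = srv.accept()
    result["greeting"] = conn.recv(64).decode()
    conn.close()
    srv.close()

ready, result = threading.Event(), {}
t = threading.Thread(target=media_server_listen, args=(ready, result))
t.start()
ready.wait()

# Remote UI side: establish a TCP/IP connection to the media server.
cli = socket.create_connection(("127.0.0.1", result["port"]))
cli.sendall(b"HELLO remote-ui-1")
cli.close()
t.join()
```

In practice the port and address would come from the discovery protocol (e.g., UPnP) rather than a shared in-process variable.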
[0044] At step 206, after a connection has been established between
the remote UI and the media server, the remote UI publishes or
exposes its capabilities to the media server. These capabilities
can be published via unidirectional or bidirectional communication
between the remote UI and the media server. In one implementation,
the establishment of a network connection between the media server
and the remote UI as set forth in step 204 and the publication of
the capabilities of the remote UI as set forth in step 206 each may
be facilitated by the use and extensions of a network discovery and
control protocol such as UPnP.
[0045] At step 208, based on the published capabilities of the
remote UI, the media server determines what functionality required
for executing the requested graphics application can be executed on
the media server and what functionality can be executed on the
remote UI. The decision algorithm executed by the media server to
make this determination may be based on the capabilities of both
the remote UI and the media server as well as on the hardware and
software resources currently available on each at the time the
algorithm is executed. In one implementation, the media server is
configured to dynamically adjust its allocation of functionality
during the execution of the requested graphics application if the
capabilities and available resources change.
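One toy form of the decision algorithm described above is sketched below. The feature names and the fallback rule (anything the remote UI lacks runs on the media server) are assumptions for illustration; a real implementation would also weigh currently available resources on each side.

```python
def allocate_functionality(server_caps, remote_caps, required):
    """Toy decision algorithm: each required feature runs on the
    remote UI if the remote UI supports it, otherwise it falls back
    to the media server (assumed here to support everything)."""
    allocation = {}
    for feature in required:
        allocation[feature] = ("remote_ui" if feature in remote_caps
                               else "media_server")
    return allocation

# Hypothetical capability sets for the two endpoints.
server = {"direct3d", "gdi", "audio_mixing", "software_rasterizer"}
remote = {"direct3d", "audio_mixing"}
plan = allocate_functionality(server, remote,
                              ["direct3d", "gdi", "audio_mixing"])
# plan == {"direct3d": "remote_ui", "gdi": "media_server",
#          "audio_mixing": "remote_ui"}
```

Because the allocation can be recomputed at any time, re-running this function with updated capability sets models the dynamic adjustment mentioned above.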
[0046] At step 210, after the capabilities of the remote UI have
been exposed and the decision algorithm executed by the media
server defines what portions of the graphic rendering are to be
executed on each of the media server and the remote UI, software
hooks are set in the relevant software and operating system (OS)
stack on the media server in order to capture the relevant
functionality in real time. For example, the hooks can be set on
interfaces such as DirectX or OpenGL interfaces, or on any other
interface. The software hooks capture graphics commands and
redirect them to the remote UI.
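The hook-setting step can be illustrated with the following sketch, in which a function in a toy "graphics layer" is replaced by a wrapper that redirects its calls. Hooking a real DirectX or OpenGL entry point involves OS-specific techniques (e.g., patching import tables); the dictionary-based layer and `set_hook` helper here are simplifications.

```python
def set_hook(layer, name, redirect):
    """Replace a function in a stack layer with a wrapper that
    redirects its calls (stands in for hooking a DirectX/OpenGL
    entry point on the media server)."""
    original = layer[name]
    def hooked(*args, **kwargs):
        # Forward the call to the remote UI instead of executing
        # it locally; rendering happens remotely.
        redirect(name, args, kwargs)
    layer[name] = hooked
    return original  # kept in case local fallback is ever needed

# A toy graphics layer and a capture sink standing in for the network.
layer = {"draw": lambda x, y: None}
sent = []
set_hook(layer, "draw", lambda n, a, k: sent.append((n, a)))

layer["draw"](3, 4)   # the application's call is transparently captured
# sent == [("draw", (3, 4))]
```

The application continues calling the same interface it always did; only the destination of the call has changed.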
[0047] FIG. 3 illustrates an example software architecture 300 of
the media server that is useful in understanding step 210. As shown
in FIG. 3, software architecture 300 comprises two graphics
applications 302 and 304, which are identified as 32-bit
Microsoft.RTM. Windows.RTM. applications, executing on the media
server. Each application 302 and 304 has a different software stack
by which it utilizes graphics hardware 314.
[0048] In particular, graphics commands generated by application
302 are received by a Direct3D application programming interface
(API) 306. Direct3D API 306 processes the graphics commands for
input to a device driver interface (DDI) 312 either directly or via
a hardware abstraction layer (HAL) device 310. DDI 312 then
processes the input and generates commands for graphics hardware
314. In contrast, graphics commands generated by application 304
are received by a Microsoft.RTM. Windows.RTM. Graphics Device
Interface (GDI) 308. GDI 308 processes the graphics commands for
input to DDI 312, which then processes the input and generates
commands for graphics hardware 314.
[0049] In accordance with step 210, the media server can set
software hooks in between any of the depicted layers of the
software stacks for applications 302 and 304, wherein the location
of a hook is determined based on the allocation of functionality
between the remote UI and the media server. Thus, for example, with
respect to application 302, a software hook could be set between
application 302 and Direct3D API 306 if the remote UI fully
supports Direct3D. Alternatively, a software hook could be set
between Direct3D API 306 and HAL device 310, or between Direct3D
API 306 and DDI 312 if the remote UI is less powerful and it is
determined that some Direct3D processing must be performed on the
media server. With respect to application 304, a software hook
could be set between application 304 and GDI 308 or between GDI 308
and DDI 312 depending on the allocation of functionality between
the media server and the remote UI.
[0050] The location of the software hooks is tied to which software
layers must be emulated on the media server. In particular, the
media server emulates those layers just below the software hooks,
thereby providing the upper layers with the necessary interfaces to
"believe" that the lower levels are fully available on the media
server. However, instead of fully implementing the lower levels,
the emulated layers transmit relevant commands to the remote UI to
ensure proper operation of graphics applications 302 and 304.
[0051] Returning now to flowchart 200, once the software hooks have
been set at step 210, the graphics application is executed on the
media server as shown at step 212. During execution of the graphics
application, when a function that should be executed on the remote
UI is called, the function call or command is redirected by the
software hooks to the remote UI as shown at step 214. In an
implementation, the function call is redirected using a Remote
Procedure Call (RPC)-like communication protocol. It should be
noted that, depending on the allocation of functionality between
the media server and the remote UI, some function calls may be
handled entirely by the media server. In any case, at step 216, the
remote UI processes the function calls received from the media
server to generate and display graphics content.
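The RPC-like redirection of steps 214 and 216 can be sketched as an encode/dispatch pair, as below. The JSON wire format and the `present_frame` handler are illustrative assumptions; the patent specifies only that the protocol is "RPC-like".

```python
import json

def rpc_encode(func_name, args):
    """Media server side: encode a redirected function call in an
    RPC-like message (illustrative wire format)."""
    return json.dumps({"func": func_name, "args": args}).encode()

def rpc_dispatch(message, handlers):
    """Remote UI side: decode the message and invoke the matching
    local handler to generate and display content."""
    call = json.loads(message.decode())
    return handlers[call["func"]](*call["args"])

# A hypothetical handler table on the remote UI.
frames = []
handlers = {"present_frame": lambda w, h: frames.append((w, h))}

# A hooked call on the media server becomes a message, which the
# remote UI decodes and executes.
rpc_dispatch(rpc_encode("present_frame", [640, 480]), handlers)
# frames == [(640, 480)]
```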
[0052] Note that in an alternate implementation, one or more of
steps 204, 206, and 208 (involving the publication of the
capabilities of the remote UI, the allocation of functionality
between the media server and the remote UI, and the setting of
software hooks) may be performed prior to receipt of the end user's
request to run a graphics application. For example, one or more of
these steps could be performed the first time the media server and
the remote UI are both connected to the data communication
network.
[0053] With minor modifications, the foregoing method of flowchart
200 is also applicable to the remote generation and playing of the
audio content portion of a software application via the remote UI.
In an audio context, the media server compares the audio
capabilities of the remote UI and the media server and then
allocates functionality to each based on a decision algorithm.
Software hooks are set in accordance with this allocation. The
software hooks redirect audio-related function calls to the remote
UI, where they are processed to generate audio content. Depending
on the implementation, the audio content is then either played by
the remote UI itself, or by a device connected to the remote
UI.
[0054] Furthermore, the same general approach can be used to handle
the remote generation and processing of control commands by a user
I/O device attached to the remote UI. Again, the media server
compares the user I/O device capabilities of the remote UI and the
media server and allocates functionality to each based on a
decision algorithm. Device drivers that emulate the I/O
capabilities of the remote UI are created on the media server in
accordance with this allocation. Control commands associated with a
user I/O device are unique in that they may be transmitted in both
directions--from the remote UI to the media server and from the
media server to the remote UI (e.g., as in the case of a force
feedback game controller). Thus, the software hooks in this case
operate both to receive control commands transmitted from the
remote UI and to re-direct function calls related to the user I/O
device to the remote UI. Once again, an RPC-like protocol can be
used for communication between the two.
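The bidirectional nature of user I/O traffic described above can be modeled with a pair of message channels, one per direction, as in the sketch below. The event and effect payloads (a "fire" button triggering a rumble effect) are invented for illustration.

```python
from queue import Queue

# Two queues stand in for the bidirectional link used for user I/O:
# the remote UI reports input events to the media server, and the
# media server can push feedback (e.g., a force-feedback effect)
# back to the controller attached to the remote UI.
to_server, to_remote = Queue(), Queue()

def remote_ui_button_press(button):
    """Remote UI side: report a control command upstream."""
    to_server.put({"event": "button", "button": button})

def media_server_step():
    """Media server side: consume one input event and, if the
    application logic calls for it, send feedback downstream."""
    event = to_server.get()
    if event["button"] == "fire":
        to_remote.put({"effect": "rumble", "ms": 200})
    return event

remote_ui_button_press("fire")
media_server_step()
# to_remote now holds a rumble command destined for the controller.
```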
C. Example Computing Device
[0055] FIG. 4 depicts an example computer system 400 that may be
utilized to implement computing device 102. However, the following
description of computer system 400 is provided by way of example
only and is not intended to be limiting. Rather, as noted elsewhere
herein, computing device 102 may alternately comprise a server, a
console, or any other processor-based system capable of executing
software applications.
[0056] As shown in FIG. 4, example computer system 400 includes a
processor 404 for executing software routines. Although a single
processor is shown for the sake of clarity, computer system 400 may
also comprise a multi-processor system. Processor 404 is connected
to a communication infrastructure 406 for communication with other
components of computer system 400. Communication infrastructure 406
may comprise, for example, a communications bus, cross-bar, or
network.
[0057] Computer system 400 further includes a main memory 408, such
as a random access memory (RAM), and a secondary memory 410.
Secondary memory 410 may include, for example, a hard disk drive
412 and/or a removable storage drive 414, which may comprise a
floppy disk drive, a magnetic tape drive, an optical disk drive, or
the like. Removable storage drive 414 reads from and/or writes to a
removable storage unit 418 in a well known manner. Removable
storage unit 418 may comprise a floppy disk, magnetic tape, optical
disk, or the like, which is read by and written to by removable
storage drive 414. As will be appreciated by persons skilled in the
relevant art(s), removable storage unit 418 includes a computer
usable storage medium having stored therein computer software
and/or data.
[0058] In an alternative implementation, secondary memory 410 may
include other similar means for allowing computer programs or other
instructions to be loaded into computer system 400. Such means can
include, for example, a removable storage unit 422 and an interface
420. Examples of a removable storage unit 422 and interface 420
include a program cartridge and cartridge interface (such as that
found in video game console devices), a removable memory chip (such
as an EPROM or PROM) and associated socket, and other removable
storage units 422 and interfaces 420 which allow software and data
to be transferred from the removable storage unit 422 to computer
system 400.
[0059] Computer system 400 also includes at least one communication
interface 424. Communication interface 424 allows software and data
to be transferred between computer system 400 and external devices
via a communication path 426. In particular, communication
interface 424 permits data to be transferred between computer
system 400 and a data communication network, such as a public data
or private data communication network. Examples of communication
interface 424 can include a modem, a network interface (such as an
Ethernet card), a communication port, and the like. Software and
data transferred via communication interface 424 are in the form of
signals which can be electronic, electromagnetic, optical or other
signals capable of being received by communication interface 424.
These signals are provided to the communication interface via
communication path 426.
[0060] As shown in FIG. 4, computer system 400 further includes a
graphics interface 430, an audio interface 440, and an I/O device
interface 450. In a conventional mode of operation, a software
application executed by processor 404 generates graphics and audio
commands. The graphics commands are received by graphics interface
430, which processes them to generate video content for display on
a local display 432. The audio commands are received by audio
interface 440, which processes them to generate audio content for
playback by one or more local speaker(s) 442. I/O device interface
450 receives control commands from a local I/O device 452, such as
a keyboard, mouse, game controller or the like, and processes them
for handling by the software application being executed by
processor 404.
[0061] However, as described in more detail elsewhere herein, in
accordance with an implementation of the present invention, a
software application is executed by processor 404 within a sandbox
environment. The sandbox environment captures graphics and audio
commands generated by the software application and selectively
redirects them to a remote UI (not shown) via communications
interface 424. The graphics commands are processed by a graphics
interface within the remote UI to generate video content for
display on a remote display. The audio commands are processed by an
audio interface within the remote UI to generate audio content for
playback by one or more remote speaker(s). Additionally, the
sandbox environment receives control commands from the remote UI
via communications interface 424 and processes them for input to
the software application. Thus, in this implementation, the
hardware associated with local graphics interface 430, audio
interface 440, and I/O device interface 450 is not used to execute
the software application. Rather, hardware within (or connected to)
the remote UI is used to carry out analogous functions.
[0062] As used herein, the term "computer program product" may
refer, in part, to removable storage unit 418, removable storage
unit 422, a hard disk installed in hard disk drive 412, or a
carrier wave carrying software over communication path 426
(wireless link or cable) to communication interface 424. A computer
useable medium can include magnetic media, optical media, or other
recordable media, or media that transmits a carrier wave or other
signal. These computer program products are means for providing
software to computer system 400.
[0063] Computer programs (also called computer control logic) are
stored in main memory 408 and/or secondary memory 410. Computer
programs can also be received via communication interface 424. Such
computer programs, when executed, enable the computer system 400 to
perform one or more features of the present invention as discussed
herein. In particular, the computer programs, when executed, enable
the processor 404 to perform features of the present invention.
Accordingly, such computer programs represent controllers of the
computer system 400.
[0064] Software for implementing the present invention may be
stored in a computer program product and loaded into computer
system 400 using removable storage drive 414, hard disk drive 412,
or interface 420. Alternatively, the computer program product may
be downloaded to computer system 400 over communications path 426.
The software, when executed by the processor 404, causes the
processor 404 to perform functions of the invention as described
herein.
D. Conclusion
[0065] While various embodiments of the present invention have been
described above, it should be understood that they have been
presented by way of example only, and not limitation. It will be
understood by those skilled in the relevant art(s) that various
changes in form and details may be made therein without departing
from the spirit and scope of the invention as defined in the
appended claims. Accordingly, the breadth and scope of the present
invention should not be limited by any of the above-described
exemplary embodiments, but should be defined only in accordance
with the following claims and their equivalents.
* * * * *