U.S. patent application number 13/284810 was filed with the patent office on 2011-10-28 and published on 2012-11-01 as publication number 20120274547 for techniques for content navigation using proximity sensing.
This patent application is currently assigned to LOGITECH INC. Invention is credited to Jean-Michel Chardon, Nicolas Chauvin, Eric Raeber, Frederic Vexo.
Application Number: 13/284810
Publication Number: 20120274547
Family ID: 47067498
Filed: October 28, 2011
Published: November 1, 2012

United States Patent Application 20120274547
Kind Code: A1
Raeber; Eric; et al.
November 1, 2012
TECHNIQUES FOR CONTENT NAVIGATION USING PROXIMITY SENSING
Abstract
Techniques for content navigation utilize proximity sensing so
that user interaction with a graphical user interface is based at
least in part on both contact with a surface and contactless
interaction with the surface. A representation of an object
detected as being proximate to and/or in contact with a surface
appears on a display, which may be separate from the surface. The
representation may appear at a location of the display that is
determined according to a mapping of surface locations to display
locations. The representation is updated based at least in part on
movement of the object relative to the surface and one or more
distances of the object from the surface.
Inventors: Raeber; Eric (Redwood City, CA); Chardon; Jean-Michel (Toronto, CA); Vexo; Frederic (Bussigny-Pres-Lausanne, CH); Chauvin; Nicolas (Blonay, CH)
Assignee: LOGITECH INC. (Morges, CH)
Family ID: 47067498
Appl. No.: 13/284810
Filed: October 28, 2011
Related U.S. Patent Documents

Application Number   Filing Date    Patent Number
61480849             Apr 29, 2011   --
Current U.S. Class: 345/156
Current CPC Class: H04N 21/4312 20130101; H04N 21/4782 20130101; G06F 16/40 20190101; G06F 16/70 20190101; G06F 16/60 20190101; H04N 21/42209 20130101; G06F 16/93 20190101; G06F 16/433 20190101; H04N 21/42224 20130101; H04N 21/4786 20130101; H04N 21/4222 20130101; G06F 3/04886 20130101; H04N 21/4781 20130101
Class at Publication: 345/156
International Class: G09G 5/00 20060101 G09G005/00
Claims
1. A computer-implemented method of manipulating a display,
comprising: detecting a set of one or more user appendages
proximate to and out of contact with a set of one or more
corresponding sensor locations of a sensing region of a remote
control device; determining, based at least in part on a mapping of
the sensing region to a display region of a remote display device,
a set of one or more display locations of the display region; and
transmitting a signal that causes the display device to display a
set of one or more representations of the detected set of one or
more user appendages according to the determined set of one or more
display locations.
2. The computer-implemented method of claim 1, wherein the mapping
is an absolute mapping.
3. The computer-implemented method of claim 1, further comprising:
calculating at least one measurement that corresponds to a distance
of the detected appendage from the sensing region; and wherein
displaying the representation of the detected appendage includes
displaying the representation of the detected appendage with one or
more color characteristics that are based at least in part on the
measurement.
4. The computer-implemented method of claim 1, wherein the display
displays a graphical user interface and wherein displaying the set
of one or more representations includes overlaying the one or more
representations on the graphical user interface.
5. The computer-implemented method of claim 4, wherein: the
graphical user interface includes one or more selectable options
that each correspond to a selection region of the sensing region;
the method further comprises detecting a contact event by at least
one appendage of the set of one or more appendages, the detected
contact event corresponding to a contact location of the sensing
region; and when the contact location corresponds to a selection
region of a corresponding selectable option of the graphical user
interface, the graphical user interface is updated according to the
corresponding selectable option.
6. The computer-implemented method of claim 5, wherein the
displayed set of one or more representations is changed upon
detection of the contact event for which the contact location
corresponds to the selection region of the corresponding selectable
option.
7. The computer-implemented method of claim 6, wherein changing the
displayed set of one or more representations includes removing the
set of one or more representations from the display.
8. The computer-implemented method of claim 1, wherein at least one
of the representations resembles a corresponding appendage.
9. The computer-implemented method of claim 1, wherein the
displayed set of one or more representations includes at least two
representations of different forms.
10. A computer-implemented method of manipulating a display,
comprising: calculating measurements that correspond to distances
of a user appendage from a sensing region of a remote control
device as the user moves the user appendage relative to the sensing
region; determining, based at least in part on a mapping of
locations of the sensing region to locations of a display device, a
location on the display device that corresponds to a location of
the appendage relative to the sensing region; and taking one or
more actions that cause the display device to display a
representation of the appendage according to the determined
location on the display device such that the representation has one
or more color characteristics that vary based at least in part on
the calculated measurements.
11. The computer-implemented method of claim 10, wherein taking the
one or more actions includes transmitting remotely generated
signals to the display device.
12. The computer-implemented method of claim 10, wherein the
representation has a transparent appearance when the user appendage
is out of contact with the sensing region and an opaque appearance
when the user appendage is in contact with the sensing region.
13. The computer-implemented method of claim 10, further comprising
determining location changes of the sensing region with which the
user appendage is proximate or in contact and wherein taking the
one or more actions includes updating locations of the
representation on the display.
14. A user input system, comprising: one or more processors; and
memory including instructions that, when executed collectively by
the one or more processors, cause the user input system to cause a
display device to at least: display a representation of a user
appendage on a display of the display device at a location of the
display based at least in part on a mapping of locations of a
sensing region of a remote control device to locations on the
display; and change one or more color characteristics of the
representation based at least in part on changes in distances of
the user appendage from a sensing region of a remote control
device.
15. The user input system of claim 14, wherein the instructions
further cause the user input system to cause the display device to:
change a location of the representation based at least in part on
movement of the user appendage relative to the sensing region.
16. The user input system of claim 14, wherein the instructions
further cause the user input system to cause the display device to:
update the display according to a predefined action of multiple
user appendages in connection with the sensing region.
17. The user input system of claim 14, wherein the display device
is separate from the user input system.
18. A user input system for interacting with a graphical user
interface, comprising: one or more processors; and memory including
instructions that, when executed collectively by the one or more
processors, cause the user input system to cause a display device
to at least: display a representation of a user appendage at a
display location of a display of the display device, the display
location based at least in part on a mapping of display locations
of the display device to sensing locations of a sensing region of a
sensing device; and at least when the appendage moves relative to
and out of contact with the sensing device, change the display
location based at least in part on the mapping.
19. The user input system of claim 18, wherein the instructions
further cause the user input system to cause the display device to:
change a location of the representation based at least in part on
movement of the user appendage relative to the sensing region.
20. The user input system of claim 18, wherein the mapping is an
absolute mapping.
21. The user input system of claim 18, wherein the instructions
further cause the user input system to identify the user appendage
from a set of potential user appendages.
22. The user input system of claim 21, wherein the instructions
further cause the user input system to update the display based at
least in part on detection of an event that is uncausable using at
least one other of the potential user appendages.
23. The user input system of claim 18, wherein the sensing device
is a component of a device that is physically disconnected from the
display device.
24. The user input system of claim 18, wherein the sensing device
is a remote control device for the display device.
25. A display device, comprising: one or more processors; and
memory including instructions that, when executed collectively by
the one or more processors, cause the display device to at least:
display a graphical user interface; receive signals corresponding
to user interaction with a sensing region of a sensing input
device, the signals being based at least in part on a number of
dimensions of user interaction that is greater than two; and change
the graphical user interface according to the received signals.
26. The display device of claim 25, wherein the signals are
generated by an intermediate device that receives other signals
from a remote control device.
27. The display device of claim 25, wherein the sensing input
device is separate from the display device.
28. The display device of claim 25, wherein changing the
graphical user interface includes updating an appearance
characteristic of a representation of an object used to interact
with the sensing input device.
29. The display device of claim 25, wherein changing the
graphical user interface includes updating, on the graphical user
interface, a location of a representation of an object used to
interact with the sensing input device.
30. The display device of claim 25, wherein the user interaction
includes contactless interaction with the sensing input device.
Description
CROSS-REFERENCES TO RELATED APPLICATIONS
[0001] This application incorporates by reference for all purposes
the full disclosure of U.S. application Ser. No. 13/284,668,
entitled "Remote Control System for Connected Devices," and filed
concurrently herewith. This application also incorporates by
reference the full disclosure of U.S. Application No. 61/480,849,
entitled "Remote Control for Connected Devices," and filed on Apr.
29, 2011.
[0002] This application also incorporates by reference for all
purposes the full disclosure of U.S. Provisional Application No.
61/227,485, filed Jul. 22, 2009, U.S. Provisional Application No.
61/314,639, filed Mar. 17, 2010, U.S. application Ser. No.
12/840,320 entitled "System and Method for Remote, Virtual On
Screen Input," and filed on Jul. 21, 2010, and U.S. application
Ser. No. 13/047,962, entitled "System and Method for Capturing Hand
Annotations" and filed on Mar. 15, 2011.
BACKGROUND OF THE INVENTION
[0003] Various devices can be used by users to provide input to
different systems. Input devices such as mice, keyboards, keypads,
touch pads, joysticks, and other devices, for example, allow users
to control one or more devices by interaction with the input
devices. In addition, as such technology improves, touch screens
have become more prevalent as input devices. In a typical
application, a touch screen and a display are integrated so that a
graphical user interface (GUI) displayed on the touch screen
provides visual indicators of how user input can be provided to
interact with the GUI. The GUI may, for instance, include
selectable options so that a user can see on the display where to
touch the touch screen to select a displayed option. In many
instances, input devices that incorporate touch are separated from
a display on which a GUI is displayed. Many notebook computers, for
example, include touch pads. In typical uses, a user touches a
touch pad and moves one or more fingers to cause a cursor displayed
on the GUI to move accordingly. A button proximate to the touch pad
or sometimes the touch pad itself can be tapped to cause a GUI
element to be selected. Other ways of interacting with the touch
pad and any associated buttons may be used to interact with the GUI
accordingly.
[0004] Despite the numerous ways users are able to interact with
devices using various input devices, existing devices, whether
devices being controlled or input devices used to control other
devices, do not take full advantage of technologies that have been
developed. In addition, many devices are designed in a manner that
makes use of developed technologies cumbersome. Televisions, for
example, often are configured to provide high-quality displays. At
the same time, the use of touch-based input devices with
televisions can be awkward. Users, for example, often view
televisions from a large enough distance that incorporation of a
touch screen input device with the television display is
impractical. Similar issues exist for many display devices, such as
computer monitors. Thus, while touch-based input has proven to be
beneficial, many benefits of touch-based input are often
unattained.
BRIEF SUMMARY OF THE INVENTION
[0005] The following presents a simplified summary of some
embodiments of the invention in order to provide a basic
understanding of the invention. This summary is not an extensive
overview of the invention. It is not intended to identify
key/critical elements of the invention or to delineate the scope of
the invention. Its sole purpose is to present some embodiments of
the invention in a simplified form as a prelude to the more
detailed description that is presented later.
[0006] Techniques of the present disclosure provide for interaction
with graphical user interfaces using input devices that incorporate
touch and/or proximity sensing. Such techniques provide advantages
including, in some embodiments, allowing users to obtain a touch
user-input experience with displays that are not necessarily
touch-input enabled. In an embodiment, a
computer-implemented method of manipulating a display is described.
The method includes detecting a set of one or more user appendages
proximate to and out of contact with a set of one or more
corresponding sensor locations of a sensing region of a remote
control device; determining, based at least in part on a mapping of
the sensing region to a display region of a remote display device,
a set of one or more display locations of the display region; and
transmitting a signal that causes the display device to display a
set of one or more representations of the detected set of one or
more user appendages according to the determined set of one or more
display locations. The mapping may be an absolute mapping.
[0007] Variations of the method are also considered as being within
the scope of the present disclosure. For example, in an embodiment,
the method further includes: calculating at least one measurement
that corresponds to a distance of the detected appendage from the
sensing region. In this embodiment, displaying the representation
of the detected appendage may include displaying the representation
of the detected appendage with one or more color characteristics
that are based at least in part on the measurement. The color
characteristics may be, for instance, brightness, hue, opacity, and
the like. The display may display a graphical user interface and
displaying the set of one or more representations may include
overlaying the one or more representations on the graphical user
interface or otherwise visually distinguishing locations
corresponding to user interaction with the sensing region from
other locations. In an embodiment, the graphical user interface
includes one or more selectable options that each correspond to a
selection region of the sensing region. In this embodiment, the
method may further comprise detecting a contact event by at least
one appendage of the set of one or more appendages. The detected
contact event may correspond to a contact location of the sensing
region. When the contact location corresponds to a selection region
of a corresponding selectable option of the graphical user
interface, the graphical user interface may be updated according to
the corresponding selectable option.
[0008] The displayed set of one or more representations may be
changed upon detection of the contact event for which the contact
location corresponds to the selection region of the corresponding
selectable option. Changing the displayed set of one or more
representations may include removing the set of one or more
representations from the display. At least one of the
representations may resemble the corresponding appendage and the
displayed set of one or more representations may include at least
two representations of different forms, such as two different
fingers.
[0009] In accordance with another embodiment, a
computer-implemented method of manipulating a display is disclosed.
The method, in this embodiment, includes calculating measurements
that correspond to distances of a user appendage from a sensing
region of a remote control device as the user moves the user
appendage relative to the sensing region; and taking one or more
actions that cause a display device to display a representation of
the appendage such that the representation has one or more color
characteristics that vary based at least in part on the calculated
measurements.
[0010] As with all methods disclosed and suggested herein,
variations are considered within the scope of the present
disclosure. For example, taking the one or more actions may include
transmitting remotely generated signals to the display device. As
another example, the representation may have a transparent
appearance when the user appendage is out of contact with the
sensing region and an opaque appearance when the user appendage is
in contact with the sensing region. The method may also include
determining location changes of the sensing region with which the
user appendage is proximate or in contact. In such instances,
taking the one or more actions may include updating locations of
the representation on the display.
[0011] In accordance with yet another embodiment, a user input
system is described. The user input system may be a set of one or
more devices that collectively operate to change a display
according to user input. In this embodiment, the user input system
includes one or more processors and memory including instructions
that, when executed collectively by the one or more processors,
cause the user input system to cause a display device to update
according to user input. For instance, the display device may
display a representation of a user appendage on a display of the
display device and change one or more color characteristics of the
representation based at least in part on changes in distances of
the user appendage from a sensing region of a remote control
device.
[0012] Variations of the user input system are also considered as
being within the scope of the present disclosure. For example, the
instructions may further cause the user input system to cause the
display device to change a location of the representation based at
least in part on movement of the user appendage relative to the
sensing region. The instructions may also further cause the user
input system to cause the display device to update the display
according to a predefined action of multiple user appendages in
connection with the sensing region. The display device may be
separate from the user input system. For instance, the display
device may be a television and the user input system may be a
remote control device (or remote control system) that operates the
television.
[0013] In accordance with another embodiment, another user input
system is described. The user input system allows for user
interaction with a graphical user interface. The user input system
may include, for example, one or more processors and memory
including instructions that, when executed collectively by the one
or more processors, cause the user input system to cause a display
device to display information according to user input. The display
device may, for instance, display a representation of a user
appendage at a display location of a display of the display device
where the display location is based at least in part on an absolute
mapping of display locations of the display device to sensing
locations of a sensing region of a sensing device. At least when
the appendage moves relative to and out of contact with the sensing
device, the display device may change the display location based at
least in part on the absolute mapping.
[0014] Variations of the user input system considered as being
within the scope of the present disclosure include, but are not
limited to, the instructions further causing the user input system
to cause the display device to change a location of the
representation based at least in part on movement of the user
appendage relative to the sensing region. The instructions may
further cause the user input system to identify the user appendage
from a set of potential user appendages and/or cause the user input
system to update the display based at least in part on detection of
an event that is uncausable using at least one other of the
potential user appendages. The sensing device may be a component of
a device that is physically disconnected from the display device
and/or the sensing device may be a remote control device for the
display device.
[0015] In accordance with another embodiment, a display device is
disclosed. The display device includes one or more processors and
memory including instructions that, when executed collectively by
the one or more processors, cause the display device to display
information according to user input. The display device may, for instance,
display a graphical user interface and receive signals
corresponding to user interaction with a sensing region of a
sensing input device, the signals being based at least in part on a
number of dimensions of user interaction that is greater than two,
and change the graphical user interface according to the received
signals. The signals may be generated by an intermediate device
that receives other signals from a remote control device. The
sensing input device may be separate from the display device.
Changing the graphical user interface may include updating an
appearance characteristic of a representation of an object used to
interact with the sensing input device. Changing the graphical user
interface may also include updating, on the graphical user
interface, a location of a representation of an object used to
interact with the sensing input device. The user interaction may
include contactless interaction with the sensing input device.
[0016] For a fuller understanding of the nature and advantages of
the present invention, reference should be made to the ensuing
detailed description and accompanying drawings.
BRIEF DESCRIPTION OF THE DRAWINGS
[0017] FIG. 1 shows an illustrative example of an environment in
which various embodiments may be practiced.
[0018] FIG. 2 shows an illustration of a remote control device and
a display device in accordance with at least one embodiment.
[0019] FIG. 3 shows the remote control device and the display
device of FIG. 2 being navigated by a user in accordance with at
least one embodiment.
[0020] FIG. 4 shows an illustrative example of a process for
facilitating user navigation of an interface in accordance with at
least one embodiment.
[0021] FIG. 5 shows an illustrative example of maintaining an
interface in accordance with at least one embodiment.
[0022] FIG. 6 shows an illustration of an aspect of the invention
in accordance with at least one embodiment.
[0023] FIG. 7 shows the aspect of FIG. 6 as it changes according to
user movement in accordance with at least one embodiment.
DETAILED DESCRIPTION OF THE INVENTION
[0024] In the following description, various embodiments of the
present invention will be described. For purposes of explanation,
specific configurations and details are set forth in order to
provide a thorough understanding of the embodiments. However, it
will also be apparent to one skilled in the art that the present
invention may be practiced without the specific details.
Furthermore, well-known features may be omitted or simplified in
order not to obscure the embodiment being described.
[0025] FIG. 1 shows an environment 100 in which various embodiments
may be practiced. In accordance with an embodiment, environment 100
utilizes a content appliance 102 in order to provide content to a
user. As illustrated in FIG. 1, the content may be provided to the
user in various ways. For example, the environment 100 in FIG. 1
includes a television 104, an audio system 106 and a mobile device
108 (such as a mobile phone) that may be used to provide content to
a user. Content may include video content, audio content, text
content, and generally any type of content that may be provided
audibly, visually, or otherwise to a user. Other devices may also be
used in the environment 100. For example, as illustrated in FIG. 1,
the environment 100 includes an audio visual (AV) receiver 110
which operates in connection with television 104. Also, the
environment 100 as illustrated in FIG. 1 includes a video camera
112, a set top box 114, a remote control 116, and a keyboard 118.
[0026] When a user utilizes an environment, such as the environment
100, one or more devices may utilize the content appliance 102 in
some manner. To accomplish this, the various devices shown in FIG.
1 are configured to communicate with one another according to
various protocols. As a result, in an embodiment, the content
appliance 102 is configured to communicate with various devices
utilizing different methods, such as the methods and protocols
illustrated in FIG. 1. For example, in an embodiment, the content
appliance 102 is configured to generate and transmit infrared (IR)
signals to various devices that are configured to receive IR signals
and perform one or more functions accordingly. Different devices may
utilize different codes, and the content appliance may be configured
to generate the proper codes for each appliance. For example, a
television from one manufacturer may utilize different codes than a
television from another manufacturer. The content appliance 102 may
be configured accordingly to generate and transmit appropriate
codes. The content appliance may include a data store that holds the
codes for various devices, and codes may be obtained from remote
sources, such as from remote databases as discussed below. In a
setup process, a user may configure the content appliance 102 to
submit the correct codes to the appropriate device(s).
[0027] As another example of how the content appliance 102 is able
to communicate utilizing various protocols, the content appliance 102
includes various ports which may be used to connect with various
devices. For example, in an embodiment, the content appliance 102
includes an HDMI OUT port 120 which may be used to provide content
through an HDMI cable to another device. For example, as
illustrated in FIG. 1, the HDMI OUT port 120 communicates content
to the AV receiver 110. The HDMI OUT port may be used to provide
content to other devices, such as directly to the television 104.
In an embodiment, the content appliance 102 includes an S/PDIF port
122 to communicate with the audio system 106.
[0028] An Ethernet port 124 may be provided with the content
appliance 102 to enable the content appliance 102 to communicate
utilizing an appropriate networking protocol, such as illustrated
in FIG. 1. For example, the content appliance 102 may utilize the
Ethernet port 124 to communicate signals to a set top box. The set
top box may operate according to an application of a content
provider, such as a satellite or cable television provider. The
Ethernet port 124 of the content appliance 102 may be used to
instruct the set top box 114 to obtain content on demand.
[0029] In an embodiment, the content appliance 102 includes one or
more universal serial bus (USB) ports 126. The USB ports 126 may be
utilized to communicate with various accessories that are
configured to communicate utilizing a USB cable. For example, as
shown in FIG. 1, the content appliance 102 communicates with a
video camera 112. The video camera 112 may be used, for instance,
to enable use of the content appliance to make video calls over a
public communications network, such as the Internet 128. Generally,
the content appliance 102 may be configured to communicate with any
device connectable using USB techniques.
[0030] Other ports on the content appliance 102 may include RCA
ports 130, in order to provide content to devices that are
configured to communicate using such ports, and an HDMI IN port 132,
which may be used to accept content from another device, such as
from the set top box 114. Generally, the content appliance 102 may
have ports in addition to those discussed above and, in some
embodiments, may include fewer ports than illustrated.
[0031] Various devices in communication with the content appliance
may be used to control the content appliance and other devices in
the environment 100. For example, the remote control 116 may
communicate with the content appliance 102 utilizing radio
frequency (RF) communication. As described in more detail below,
the remote control 116 may include a touch screen that may be used
in accordance with the various embodiments described herein.
[0032] A keyboard 118 may also communicate with the content
appliance 102 utilizing RF or another method (and possibly one or
more other devices, either directly, or through the content
appliance 102). The keyboard may be used for various actions, such
as navigation of an interface displayed on the television 104, user
input by a user typing utilizing the keyboard 118, and general
remote control functions. For example, an interface displayed on
the television 104 may include options for text entry. The user may
type text utilizing keyboard 118. Keystrokes that the user makes on
the keyboard 118 may be communicated to the content appliance 102,
which in turn generates an appropriate signal to send over an HDMI
cable connecting the HDMI OUT port 120 to the AV receiver 110. The
AV receiver 110 may communicate with television 104 over HDMI or
another suitable connection to enable the television to display
text or other content that corresponds to the user input. The
keyboard 118 may also include other features as well. For example,
the keyboard 118 may include a touchpad, such as described below or
generally a touchpad that may allow for user navigation of an
interface displayed on a display device. The touchpad may have
proximity sensing capabilities to enable use of the keyboard in
various embodiments of the present disclosure.
[0033] In an embodiment, the mobile device 108 is also able to
control the content appliance 102 (and possibly other devices,
either directly, or through the content appliance 102). The mobile
device may include a remote control application that provides an
interface for controlling the content appliance 102. In this
particular example from FIG. 1, the mobile device 108 includes a
touch screen that may be used in a manner described below. As the
user interacts with the mobile device 108, the mobile device may
communicate with the content appliance 102 over wi-fi utilizing
signals that correspond to the user's interaction with the mobile
device 108. The content appliance 102 may be, for instance,
configured to receive signals from the mobile device over wi-fi
(directly, as illustrated, or indirectly, such as through a
wireless router or other device). The content appliance may be
configured to generate signals of another type (such as IR, HDMI,
RF, and the like) that correspond to codes received over wi-fi from
the mobile device 108 and then generate and transmit signals
accordingly. An application executing on the mobile device 108 may
provide a graphical user interface that allows users to use the
mobile device 108 as a remote control and generate such codes
accordingly. The mobile device 108 (and other devices), as
illustrated, may be configured to receive information from the
content appliance 102 and reconfigure itself according to the
information received. The mobile device 108 may, for example,
update a display and/or update any applications executing on the
mobile device 108 according to information received from the content
appliance 102. It should be noted that, while the present
disclosure discusses a mobile device illustrated as a mobile phone,
the mobile device may be a different device with at least some
similar capabilities. For example, the mobile device may be a
portable music player or tablet computing device with a touch
screen. Example mobile devices include, but are not limited to, a
mobile phone with a touch screen (e.g., a smartphone such as an
iPhone or an Android based phone, etc.), a portable music player
(e.g., an iPod, etc.), a tablet computing device (e.g., an iPad,
iPad2, etc.), and other devices with touch sensitive user input
devices. Of course, such devices (and other devices) may be
included additionally in a mobile device in the environment
illustrated in FIG. 1.
[0034] In an embodiment, the content appliance 102 is also
configured to utilize various services provided over a public
communications network, such as the Internet 128. As an example, the
content appliance 102 may communicate with a router 134 of a home
network. The content appliance 102 and the router 134 may
communicate utilizing a wired or wireless connection. The router
134 may be directly or indirectly connected to the Internet 128 in
order to access various third-party services. For example, in an
embodiment, a code service 136 is provided. The code service, in an
embodiment, provides codes that the content appliance 102 uses to
control various devices, enabling the content appliance to translate
codes received from another device (such as the remote control 116,
the keyboard 118, and/or the mobile device 108). The various devices to
control may be identified to the content appliance 102 by user
input or through automated means. The content appliance 102 may
submit a request through the router 134 to the code service 136 for
appropriate codes. The codes may be, for example, IR codes that are
used to control the various devices that utilize IR for
communication. Thus, for example, if a user presses a button on the
remote control 116, keyboard 118, or an interface element of the
mobile device 108, a signal corresponding to the selection by the
user may be communicated to the content appliance 102. The content
appliance 102 may then generate a code based at least in part on
information received from the code service 136. As an illustrative
example, if the user presses a play button of the remote control
116, a signal corresponding to selection of the play button may be
sent to the content appliance 102 which may generate a play IR
code, which is then transmitted to the television 104 or to another
suitable appliance, such as generally any appliance that is able to
play content.
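To make the flow concrete, the following Python sketch shows one way a content appliance might cache and apply device-specific codes obtained from a code service; the service endpoint, code-table format, and function names are assumptions for illustration, not details from the application.

```python
# Hypothetical sketch of the code-translation flow described above.
import json
import urllib.request

ir_code_cache = {}  # device_id -> {command: ir_code}

def fetch_ir_codes(device_id):
    """Fetch the IR code table for a device from a remote code service."""
    url = "https://codes.example.com/devices/" + device_id  # hypothetical endpoint
    with urllib.request.urlopen(url) as response:
        return json.loads(response.read())

def handle_button_press(device_id, command):
    """Translate a button press (e.g. 'play') into an IR code and transmit it."""
    if device_id not in ir_code_cache:
        ir_code_cache[device_id] = fetch_ir_codes(device_id)
    transmit_ir(ir_code_cache[device_id][command])

def transmit_ir(ir_code):
    """Stand-in for the IR blaster hardware driver."""
    print("transmitting IR code:", ir_code)
```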
[0035] Other services that may be accessed by the content appliance
102 over the Internet 128 include various content services 138. The
content services may be, for example, any information resource,
such as websites, video-streaming services, audio-streaming
services and generally any services that provide content over the
Internet 128.
[0036] It should be noted that the environment illustrated in FIG.
1 is provided for the purpose of illustration and that numerous
environments may be used to practice embodiments of the present
disclosure. Various embodiments, for example, are applicable in any
environment where proximity sensing is used as a method of enabling
user input, including any environment in which a touch pad or touch
screen with proximity sensing capabilities is used to interact with
a GUI on a separate display. It should be noted that various
embodiments may be described as utilizing a particular input device
such as a touch pad or touch screen but, unless otherwise clear
from context, various embodiments of the invention may utilize
input devices other than those explicitly described. For
instance, in many cases, such as where a display on the input
device is not required, embodiments described as utilizing a touch
screen may be modified to utilize a touch pad or other suitable
input device having touch and/or proximity sensing capabilities as
an alternative to a touch screen. FIG. 1 shows an example
environment in which user input is provided to a display
(television, in the illustrated example) through a content
appliance. However, techniques of the present disclosure are also
applicable for providing user input directly to a device with a
display. For instance, the various techniques described herein may
be used in connection with a television remote control device,
where the television remote control device sends signals according
to user interaction with a touch screen directly to a
television.
[0037] FIG. 2 shows an illustration of such a remote control device
202 and a television 204, although signals from the remote control
device 202 could be transmitted using a content appliance, such as
described above in connection with FIG. 1. In an embodiment, the
remote control device 202 is used to control how content is
displayed on the television 204 or other device with a display. The
remote control device 202 may communicate signals directly to the
television 204 or through an intermediate device, such as the
content appliance 102 described above in connection with FIG.
1.
[0038] In an embodiment, the television 204 may display an
interface which is navigable by a user utilizing the remote control
202. For example, the television 204 in FIG. 2 displays an
interface 206 which includes a plurality of selectable options. In
this specific example, the options are provided as interface
buttons displayed on the television 204. Each button in this
example corresponds to an activity that the user may perform by
selecting one of the buttons. For example, the interface 206
includes a watch TV button 208. Selection of the watch TV button
208 may result in one or more devices changing to a state suitable
for watching TV on the television 204. For example, a set top box
(not shown) may be put into an on state if it was not in such a
state already. Television 204 may be put into a state wherein it
receives television content from an appropriate source, such as
from the set top box or from a different device, such as a content
appliance 102 described in connection with FIG. 1.
[0039] In this example, the television 204 (or a network of devices
that includes the television 204) is configured to utilize one or
more other devices, such as a DVD player, music player, a gaming
device, and devices that allow communication over the Internet. For
example, in an embodiment the users are able to utilize the
television 204 to check an email account and/or stream a movie from
a remote streaming service. Generally, the television 204 or a
network of devices that includes the television 204 may be
configured for use with any device involved in providing content,
either from the devices themselves, or from other sources,
including remote sources accessible over the Internet or other
communications network.
[0040] In an embodiment, the remote control 202 includes a touch
screen 210. As noted, while the remote control is described as
having a touch screen 210, embodiments may utilize a remote control
with a touch pad instead of or in addition to a touch screen. The
touch screen 210 or touch pad in an embodiment operates using
capacitive proximity sensing. The touch screen 210 (or touch pad)
may be configured, for example, according to the disclosure of U.S.
application Ser. No. 13/047,962, referenced above. The touch screen
210 (or touch pad), however, may utilize any technique for
proximity sensing. As discussed below, the touch screen 210 (or
touch pad) may be used as an input device by a user. The input may
be input for controlling the remote control device 202 and/or the
television 204. Other mechanisms for input may be used in addition
to the touch screen 210. For instance, FIG. 2 illustrates various
buttons, many of which are common on standard remote controls. Such
buttons may be selected to perform corresponding functions.
Performance of some functions may be caused by selection of
corresponding buttons or selection of interface elements on the
touch screen 210, although some functions may be causable by either
the buttons or the touch screen 210, but not both.
[0041] In an embodiment, there is an absolute mapping between
locations on the touch screen or touch pad 210 and locations on the
graphical user interface on the television (or other display
device). Thus, each location on the touch screen or touch pad 210
is mapped to at least one location on the user interface on the
display. The absolute mapping may be a surjective mapping from the
locations of the user interface on the display to the locations of
the touch screen. The mapping may be a surjective mapping from the
locations of the touch screen to the locations of the user
interface if there are more locations on the touch screen or touch
pad than on the user interface. A mapping may also be a one-to-one
mapping between touch screen sensor locations and pointer locations
on the interface. Generally, the mapping may be any mapping that is
configured such that, from the user perspective, each location on
the touch screen has a corresponding location on the user
interface. It should be noted, however, that some embodiments of
the present disclosure may utilize a relative mapping between the
touch screen and user interface.
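As a rough illustration, an absolute mapping of the kind described can be realized as a fixed proportional scaling from sensing coordinates to display coordinates; the coordinate units and dimensions in the sketch below are assumptions, not values from the application.

```python
def map_sensor_to_display(sensor_x, sensor_y,
                          sensor_width, sensor_height,
                          display_width, display_height):
    """Absolute mapping: each sensing location has a fixed display location.

    Coordinates are scaled proportionally, so the corners and center of the
    sensing region always correspond to the corners and center of the display,
    independent of any previous touch (unlike a relative, cursor-style mapping).
    """
    return (sensor_x * display_width / sensor_width,
            sensor_y * display_height / sensor_height)

# Example: the center of a 1000x600 sensing region maps to the center of a
# 1920x1080 display.
print(map_sensor_to_display(500, 300, 1000, 600, 1920, 1080))  # (960.0, 540.0)
```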
[0042] Mappings between a region of a remote control device and a
display device may be determined in various ways. For example, in
an embodiment, a device in connection with the display device
utilizes extended display identification data (EDID) or extended
EDID (E-DID) received over a high-definition multimedia interface
(HDMI) or other connection to determine display parameters for the
display device. The data may, for example, specify a maximum
horizontal image size and a maximum vertical image size, which allows
for a mapping of the sensing region to the display. Other ways of
determining the display size for generating a mapping may be used.
For instance, a user may input the display size during a setup
process. The user may alternatively enter an identifier (model
number, e.g.) for a display device and a database (which may be a
remote database) may be referenced to determine the display size
based on the model identifier. Alternatively, some other method may
be used to determine an identifier for the display device (e.g.,
knowledge of the legacy remote control used by the display device),
and the identifier may be used to determine the display size
(possibly using a remote database). Generally, any suitable method
for determining the display dimensions and generating a mapping may
be used.
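For illustration, the advertised physical image size can be read directly from a base EDID block; the sketch below assumes a standard 128-byte EDID 1.x base block, in which, to the best of my understanding, bytes 21 and 22 carry the maximum horizontal and vertical image size in centimeters.

```python
def display_size_from_edid(edid: bytes):
    """Return (width_cm, height_cm) advertised in a base EDID block.

    Assumes a standard 128-byte EDID 1.x base block, where bytes 21 and 22
    hold the maximum horizontal and vertical image size in centimeters.
    A value of 0 means the size is undefined (e.g., for some projectors).
    """
    if len(edid) < 128:
        raise ValueError("expected a full 128-byte base EDID block")
    width_cm, height_cm = edid[21], edid[22]
    return width_cm, height_cm
```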
[0043] In an embodiment, the user interacts with the touch screen
or touch pad 210 in order to navigate the user interface displayed
on the television 204. Generally, the user may interact with the
touch screen 210 (or touch pad) by using one or more appendages
(such as fingers) to touch and/or hover over the touch screen 210
and/or move the one or more appendages relative to the touch screen
210. The manner in which the user interacts with the touch screen
210 may be sensed by the touch screen 210 (or touch pad) to
generate signals. The generated signals may be interpreted by one
or more processors of the remote control 202 to generate one or
more other signals corresponding to user input, which are transmitted
by the remote control 202 to another device, such as directly to
the television 204, to a content appliance such as described
above, or in any other suitable manner. For example, if the user touches
the touch screen 210 (or touch pad) with his or her finger and
moves the finger upward while in contact with the touch screen 210
(or touch pad), one or more processors of the remote control 202 may
interpret signals generated according to such touch and movement
and generate and transmit one or more other signals that enable
another device to update a display on which a GUI is displayed
accordingly. Alternatively, signals generated by the touch screen
210 or signals derived therefrom may be transmitted to another
device (such as the television 204 or a content appliance) and
interpreted by the other device. Generally, any manner in which
signals generated by the touch screen 210 (or touch pad) are
interpreted as user input may be used.
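One way to picture the signals passed between the touch screen and the interpreting processor is as a stream of sensed events, each carrying a surface location and a hover distance that is zero on contact. The event type and field names below are illustrative assumptions.

```python
from dataclasses import dataclass

@dataclass
class SensedEvent:
    """A single sensed interaction with the touch surface."""
    x: float         # surface coordinates of the sensed appendage
    y: float
    distance: float  # sensed height above the surface; 0 indicates contact

def classify(event: SensedEvent) -> str:
    """Label a sensed event as a contact or a hover for downstream handling."""
    return "contact" if event.distance == 0 else "hover"

# A thumb approaching and then touching the surface.
for event in [SensedEvent(120, 80, 4.0), SensedEvent(122, 78, 0.0)]:
    print(classify(event), (event.x, event.y))
```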
[0044] It should be noted that FIG. 2 is simplified for the purpose
of illustration and additional details and/or variations are
considered as being within the scope of the present disclosure. For
example, FIG. 2 shows an example where a remote control device is
used to control a display on a television. Generally, the
techniques of the present disclosure may apply in any instance in
which a touch-sensitive surface is used to provide user input for
interaction with a GUI. Further, FIG. 2 shows the touch screen 210
without a display during user interaction. The touch screen 210
may, however, include a display when the user interacts with the
touch screen 210, and/or at other times. The display of the touch
screen 210 may, for example, be identical or similar to the display
on the television 204. In this manner, the user can see the GUI by
looking at either the remote control 202 or the television 204. As
noted, a touch pad or other touch-sensitive user input device with
which proximity sensing is possible may be used in accordance with
various embodiments.
[0045] In addition, FIG. 2 shows a touch screen on a device (remote
control 202) that is different from the device on which the GUI is
displayed (or primarily displayed). However, techniques of the
present disclosure may also be applied in instances in which the
touch screen and display are incorporated in a single device. A
notebook computer, for example, may include a touch pad that is
separate from a display of the notebook computer. The techniques
described herein may also be applied to single devices. For
example, a touch screen and display operating in accordance with
the present disclosure may both be part of a single mobile device
(such as a smart phone, tablet computing device, or other device).
Generally, the scope of the present disclosure is not limited to
the embodiments explicitly described and illustrated herein.
[0046] As mentioned, various embodiments of the present disclosure
utilize touch and proximity sensing technology to allow a user to
interact with a graphical user interface. FIG. 3 accordingly shows
an illustrative example of how the touch screen 210 of FIG. 2 may
be utilized to navigate the interface on the television 204. As
with FIG. 2, techniques illustrated in FIG. 3 could apply to a
variety of environments and devices, not just those explicitly
illustrated and described herein.
[0047] FIG. 3 in this particular example shows a remote control
device 302 and a television 304 which may be the remote control
device 202 and television 204 described above in connection with
FIG. 2. Also shown in FIG. 3, as in FIG. 2, the television 304
displays an interface 306 which has various selectable options,
such as a watch TV button 308. In addition, the remote control
device 302 includes a touch screen 310 which may be the touch
screen 210 described above in connection with FIG. 2.
[0048] In an embodiment, when the user interacts with the touch
screen 310, visual indicators of such interaction appear on the
interface displayed on the television 304. For example, interaction
with the touch screen 310 in an embodiment includes touching the
touch screen 310 and, generally, performing actions in close
proximity to the touch screen 310. For example, as shown in FIG. 3,
a user is interacting with the touch screen 310 with a left hand
312 and a right hand 314. It should be noted, however, that the
user may interact with the touch screen 310 with any appendage or
combination of appendages, or with other portions of appendages
(such as the palm of a hand). In addition, other items, such as
styluses or other non-human things, may be used for interaction
with the touch screen 310, although the present disclosure will
focus on thumbs for the purpose of illustration.
[0049] Accordingly, in FIG. 3, a left thumb 316 and a right thumb
318 are shown interacting with touch screen 310. In this example,
the user is hovering over, that is, not touching, the touch screen
310 with the left thumb 316 and, in particular, in the lower-left
corner of the touch screen 310. However, with the right thumb 318,
the user is touching the touch screen 310 and in particular,
touching an upper-right portion of touch screen 310. In an
embodiment, as a result of such user interaction with the touch
screen 310, representations of the left thumb 316 and right thumb
318 are shown in corresponding places on the user interface 306.
For example, a representation 320 of the left thumb 316 is displayed
at a location of the user interface 306 that corresponds to the
location of the touch screen 310 over which the left thumb 316 is
hovering. Similarly, a representation 322 of the right
thumb 318 appears in the upper-right hand portion of the user
interface 306 at a location that corresponds to a location touched
by the user on the touch screen 310.
[0050] In the illustrative example of FIG. 3, the representation
320 of the left thumb 316 and the representation 322 of the right
thumb 318 each have the appearance of a thumb. That is, the
representations on the television 304 resemble the appendages used
by the user. However, other representations may be used which do
not necessarily resemble body parts. For example, circles or
targets or any visual indicator of the user's interaction with
touch screen 310 may be used. Further, FIG. 3 shows representations
overlaid on a GUI for the purpose of illustration. However, other
ways of providing visual feedback to the user based on the user's
interaction with the touch screen may be used. For instance, a
representation may not be overlaid on the user interface, but may
be made by manipulating the user interface in a way that indicates
user interaction with the touch screen 310. For example, the user
interface could have a warped effect at a location where the user
interacts with the touch screen 310. As another example, the
display may brighten at locations corresponding to a location with
which a user interacts with the touch screen 310. As yet another
example, interface elements, such as selectable options of the user
interface, may change color, brightness or other characteristics
when the user interacts with the touch screen 310 in a
corresponding location. Generally, any manner of providing visual
feedback of the user's interaction with the touch screen 310 may be
used.
[0051] The visual feedback of user interaction with the touch
screen 310 may be provided in a varying manner. For example, as
shown in FIG. 3, color characteristics of representations of the
user appendages vary according to how the user is interacting with
touch screen 310. For example, because the left thumb 316 is not
touching but is only hovering over the touch screen 310, the
representation 320 of the left thumb 316 in this example appears
dim and transparent; that is, the combination of the
representation 320 of the left thumb and the user interface has an
appearance as if the interface shows through the representation 320
of the left thumb 316.
[0052] The representation 322 of the right thumb 318, on the other
hand, appears bright and opaque, thereby indicating to the user
that the right thumb is touching the touch screen 310. In this
manner, the user can hover over the touch screen with an appendage
and, based on the location of the representation on the interface
306, knows where to move his or her appendage to navigate the
interface as desired. Because, in this example, the representation
is transparent when the corresponding appendage is hovering, the
representation does not obscure the user interface. For example, as
shown in FIG. 3, the representation 320 of the left thumb 316
appears transparently over a play a game option 324 of the user
interface 306. In this manner, the user does not need to move the
left thumb 316 in order to see what option would be selected by
pressing the touch screen 310 at the same location over which the
left thumb 316 is hovering. As noted, other ways of providing
visual feedback that do not obscure elements of the interface 306,
such as by changing elements of the interface to be still
recognizable, may be used and, in some embodiments, elements of the
interface may be allowed to be obscured.
[0053] Other indications of user interaction with an interface may
also be shown in addition to the representations of the appendages.
For example, in FIG. 3, lines radiating from the representation of
the right thumb 322 at a location corresponding to a location where
the user touches the touch screen 310 are shown as an illustrative
example. The radiated lines indicate that the user has touched the
touch screen at this location and therefore selected a
corresponding option on the user interface which, in this example,
is a play music option 326. The lines may appear responsive to
contact with the touch screen 310 (or proximity within a
threshold), and may subsequently disappear, such as when the
corresponding appendage loses contact with the touch screen 310,
when a determination is made that the user made a selection of an
element, and/or at another time.
[0054] Other variations not illustrated in this figure, but
described below, may also be used. For example, the amount by which
a representation is transparent may vary according to a distance by
which an appendage is hovering over the touch screen 310. Further,
while the example in FIG. 3 shows the user navigating the user
interface using the touch screen 310 utilizing two appendages, his
or her thumbs, the navigation may be done by a single appendage or
other mechanism, or more than two appendages. When a user selects
an option of user interface 306, representations may disappear in
order to allow a user to fully view a new updated interface that
appears as a result of the selection of the option.
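The distance-dependent transparency mentioned above can be modeled as a simple interpolation between contact and a maximum sensing distance; the function below and its 50 mm sensing limit are illustrative assumptions, not values from the application.

```python
def representation_alpha(distance_mm: float, max_distance_mm: float = 50.0) -> float:
    """Map an appendage's hover distance to the opacity of its representation.

    Contact (distance 0) yields a fully opaque representation; the
    representation fades linearly as the appendage rises, vanishing at the
    assumed sensing limit.
    """
    if distance_mm <= 0:
        return 1.0
    if distance_mm >= max_distance_mm:
        return 0.0
    return 1.0 - distance_mm / max_distance_mm

# An appendage hovering halfway to the sensing limit is drawn half transparent.
print(representation_alpha(25.0))  # 0.5
```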
[0055] In addition, while the illustrative examples in FIGS. 2 and
3 illustrate a particular interface for a particular purpose, other
GUIs may be used in accordance with the present disclosure.
Generally, any GUI that may be presented on a display and
manipulated using touch techniques may be used. Example GUIs are
GUIs configured for use with any operating system that allows for
touch-based input. Embodiments of the present disclosure may also
be used to enable non-touch-based operating systems to be navigated
using touch techniques. For example, a left-right-up-down (LRUD)
operating system, such as on some televisions, may allow navigation
only according to the left, right, up, or down directions. User
input on a touch screen may be translated to corresponding left,
right, up, or down commands to enable touch navigation. For
instance, a touch screen may be divided into regions, where each
region corresponds to one or more LRUD commands. Touching the touch
screen at an upper middle portion, for instance, may correspond to
an up command. Some regions may correspond to multiple commands.
Touching a touch screen in a corner, for instance, may correspond
to a sequence of LRUD commands. The lower left corner may, for
example, correspond to a down-left or left-down sequence of
commands. The commands may be transmitted in sequence to the device
with the LRUD operating system upon selection of the touch screen.
Generally, one or more ways of interacting with a touch screen may
correspond to one or more commands of a non-touch-screen operating
system.
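The following is a minimal sketch, in Python, of this region-based
translation; the three-by-three region layout, the coordinate
convention (y increasing downward), and the function name
touch_to_lrud are illustrative assumptions rather than details from
this specification.

    def touch_to_lrud(x, y, width, height):
        # Hypothetical sketch: divide the touch screen into a 3x3 grid
        # of regions and map each region to zero, one, or two LRUD
        # commands. Screen coordinates assumed, y increasing downward.
        col = 0 if x < width / 3 else (2 if x >= 2 * width / 3 else 1)
        row = 0 if y < height / 3 else (2 if y >= 2 * height / 3 else 1)
        vertical = {0: ["up"], 2: ["down"]}.get(row, [])
        horizontal = {0: ["left"], 2: ["right"]}.get(col, [])
        # Corners yield a two-command sequence (e.g., lower left ->
        # down, left); edges yield one command; the center yields none.
        return vertical + horizontal

    # A touch in the upper middle portion corresponds to an up command.
    assert touch_to_lrud(50, 5, 100, 100) == ["up"]
    # A touch in the lower left corner corresponds to a down-left sequence.
    assert touch_to_lrud(5, 95, 100, 100) == ["down", "left"]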
[0056] FIG. 4 shows an illustrative example of a process 400 that
may be used to provide navigation of an interface, such as in the
manner illustrated in connection with FIGS. 2 and 3. Some or all of
the process 400 (or any other processes described herein, or
variations and/or combinations thereof) may be performed under the
control of one or more computer systems configured with executable
instructions and may be implemented as code (e.g., executable
instructions, one or more computer programs, or one or more
applications) executing collectively on one or more processors, by
hardware, or combinations thereof. A computer system may be any
device capable of performing processing functions, such as a
notebook computer, desktop computer, remote control device, content
appliance, mobile phone, tablet computing device, and, generally,
any device or collection of devices that utilize one or
more processors. One or more of the actions depicted in FIG. 4 may
be performed by a device such as a remote control device, a content
appliance, a television, or, generally, any device that is
configured to participate in providing content. The code may be
stored on a computer-readable storage medium, for example, in the
form of a computer program comprising a plurality of instructions
executable by one or more processors. The computer-readable storage
medium may be non-transitory.
[0057] In an embodiment, the process 400 includes displaying 402 an
interface, such as described above. As used herein, displaying an
interface may mean actually displaying the interface or taking one
or more actions that cause the interface to be displayed. For example,
referring to FIG. 1, the content appliance 102 may display an
interface by generating a signal and causing the signal to be sent
to the AV receiver, which relays the signal to the television 104. A
device performing the process 400, such as a television thus
configured, may display the interface itself.
[0058] In an embodiment, a proximate appendage is detected 404.
Detecting a proximate appendage may be done in any suitable manner,
such as utilizing the techniques in U.S. application Ser. No.
13/047,962 and U.S. application Ser. No. 12/840,320, described
above. The appendage may be detected, for example, upon the user
moving the appendage to within a certain distance of a touch
screen. Depending on a particular
environment in which the process 400 is performed, detecting the
appendage may be performed in various ways. For example, detecting
the appendage may be performed by detecting the appendage directly
or receiving a signal from another device, such as a remote control
device that detected the appendage.
[0059] Once the appendage is detected 404, in an embodiment, an
interface display location is determined 406 based at least in part
on a mapping of input device locations to interface display
locations, which may be an absolute mapping, as described above.
Upon determining the interface display location, in an embodiment,
the representation of the appendage is overlaid 408 on the
interface display at the determined interface display location. As
discussed above, other ways of providing representations may be
performed, although overlays of representations are used for the
purpose of illustration. As the user moves his or her appendage
relative to the touch screen, the position of the representation on
the interface may be updated 410 according to movement of the
appendage. For example, if the user moves the appendage to the
left, the representation of the appendage may move to the left as
well. If the user moves the appendage up or down relative to the
touch screen, then the representation may remain in the same place,
but change color characteristics such as described above.
Determining how to update the position of the representation may
include multiple detections of the appendage and corresponding
determinations of the interface display location based on the
mapping.
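As a minimal sketch of the absolute mapping used in this step, each
touch screen coordinate may be scaled proportionally into display
coordinates; the function name and resolutions below are assumed for
illustration only.

    def sensor_to_display(sx, sy, sensor_w, sensor_h, disp_w, disp_h):
        # Absolute mapping: each sensor location always corresponds to
        # the same display location, regardless of prior movement.
        return (sx * disp_w / sensor_w, sy * disp_h / sensor_h)

    # An appendage detected at the center of a 480x320 touch screen is
    # represented at the center of a 1920x1080 interface display.
    print(sensor_to_display(240, 160, 480, 320, 1920, 1080))  # (960.0, 540.0)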
[0060] In an embodiment, at some point during interaction with the
touch screen, the user may make contact with the touch screen. In
such instances, when the user touches the touch screen, a touch
event of the appendage is detected 412. When a touch event is
detected 412, an operation according to the touch event type and/or
location of touch is performed 414. The way in which the user
touches a touch screen may indicate, for example, how the user
wishes to navigate a user interface. For instance, touching the
touch screen and moving the appendage while in contact with the
touch screen may indicate a drag operation in the user interface.
If the initial touch was on an element that is draggable in the user
interface, the element may move accordingly. Similarly, if the user
touches the touch screen and subsequently raises the appendage away
from the touch screen, losing contact with the touch screen, such
an event may indicate selection of an option at the location that
was touched. A double tap on the touch surface 210 may also be
appropriately interpreted (for example as a double click on an icon
on the display).
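One possible shape of the dispatch performed 414 is sketched below;
the event names and the ui object with its drag_element_at,
select_option_at, and open_item_at methods are hypothetical
stand-ins, not an interface defined by this disclosure.

    def perform_operation(event_type, location, ui):
        # Dispatch an operation according to the touch event type and
        # the touched location (step 414), per the examples above.
        if event_type == "drag":
            ui.drag_element_at(location)   # touch and move: drag operation
        elif event_type == "lift":
            ui.select_option_at(location)  # touch then release: selection
        elif event_type == "double_tap":
            ui.open_item_at(location)      # interpreted like a double click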
[0061] FIG. 5 shows a more detailed process 500 which may be used
to update a user interface in accordance with an embodiment. The
process illustrated in FIG. 5 may be performed, for example, by a
remote control device, such as described, or generally any device
with a touch screen that is configured to operate in accordance
with the present disclosure. In an embodiment, the process 500
includes displaying an interface 502, such as described above. During
performance of the process 500, the touch screen may be monitored
504 for various events involving user interaction with the touch
screen. A process in a device performing the process 500 may
periodically or otherwise poll for events and take action when
events are detected. For example, as illustrated in FIG. 5, a
repetitive sub-process is performed in which the touch screen is
monitored 504 and determinations are made 506 whether an appendage
is detected.
[0062] Determining whether an appendage is detected may be
performed in various ways. For example, in some embodiments, the
determination may simply be a determination of whether signals from
a touch screen indicate the presence of an appendage proximate to
the touch screen. However, the determination may be more complex
and may include other determinations. For example, determining
whether an appendage is detected may include determining how many
appendages are detected. In addition, in an embodiment, for any
appendages detected, the detected appendages may be matched to
actual appendages. For instance, referring to FIG. 3, the two
detected appendages may be matched to the left and right thumb.
Other appendages and objects may be detected and matched, such as
other fingers, styluses, and the like.
[0063] Matching detected appendages to actual appendages may be
done in various ways. For example, in an embodiment, when an
appendage is detected, the appendage will generally cause different
portions of the touch screen to generate different signals. For
example, with capacitive sensors, the capacitance
measurements for the touch screen may increase for locations that
are proximate to the detected appendage. The locations for which
the capacitance changes may be used to determine which appendage a
detected appendage corresponds to. For example, a thumb will
generally affect a larger region of the touch screen than other
fingers due to the thumb's larger relative size. In addition,
regions of affected locations in the touch screen may generally be
oriented differently depending on the appendage being sensed.
Referring to FIG. 3, for example, a region of locations on the
touch screen 310 for the left thumb would generally point upward
and to the right whereas a right thumb would point upward and to
the left. Some suitable techniques for matching detected appendages
to fingers are described in U.S. application Ser. No. 13/047,962
and U.S. application Ser. No. 12/840,320, referenced above.
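A rough sketch of such matching, using only region size and
orientation as described above, might look as follows; the area
threshold and angle convention are invented for illustration and
would be calibrated in practice.

    def match_appendage(region_area, orientation_deg):
        # orientation_deg: angle of the affected region's major axis,
        # measured counterclockwise from the screen's +x axis (assumed).
        THUMB_MIN_AREA = 150.0  # assumed: a thumb affects a larger region
        if region_area >= THUMB_MIN_AREA:
            # A left thumb points up and to the right (under 90 degrees);
            # a right thumb points up and to the left (over 90 degrees).
            return "left thumb" if orientation_deg < 90 else "right thumb"
        return "other finger"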
[0064] Returning to the process 500, in an embodiment, if no
appendage is detected, the touch screen continues to be monitored,
as illustrated. However, if it is determined 506 that an appendage
is detected, then a determination may be made whether touch event
criteria have been satisfied. (The touch screen may continue to be
monitored when a touch event is detected, for example, to detect
further touch events.) Touch event criteria may be criteria that,
when met, indicate a touch event. For example, criteria for a touch
event corresponding to selection of a user interface element may be
that the user contacts the touch screen and then loses contact
within a predetermined period of time. Other criteria may be
simpler, of similar complexity, and/or more complex. Criteria may
take into account information about timing of
various activities, such as how long the user has touched the touch
screen, how many appendages or other objects touched the screen,
whether or not the user moved a certain amount while in contact with
the touch screen, and the like.
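For instance, the selection criterion above (contact followed by
loss of contact within a predetermined period) might be evaluated as
in this sketch; the duration and movement thresholds are
assumptions.

    SELECTION_MAX_DURATION = 0.5  # seconds; assumed threshold
    SELECTION_MAX_MOVEMENT = 5.0  # sensor units; assumed jitter tolerance

    def is_selection(touch_down_time, touch_up_time, moved_distance):
        # Criteria: contact was made and then lost within a predetermined
        # period, without substantial movement while in contact.
        duration = touch_up_time - touch_down_time
        return (duration <= SELECTION_MAX_DURATION
                and moved_distance <= SELECTION_MAX_MOVEMENT)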
[0065] In an embodiment, the touch event criteria takes into
account matches to appendages that have been detected. Different
touch events may correspond to different actions by a user using
different subsets of his or her fingers (or other appendages or
objects). For example, in an embodiment, when a middle finger and
an index finger are detected, a right click event may be generated
at a user interface (UI) location corresponding to a touch screen
location selected by the index finger. The location of the UI may
correspond to a particular right-click menu of selectable interface
options that may be displayed upon detection of the right click
event. As another example, when a thumb and right index finger are
detected, a scroll left event may be generated for a UI element
(scroll bar, icon, etc.) selected by the right index finger.
Similarly, when a thumb and right ring finger are detected, a scroll
right event may be generated for a UI element selected by the right
ring finger. When the thumb is detected, a scroll left
or right event may be generated for a UI element selected by the
index finger. The direction of scroll may be determined in many
ways, such as by the left or right thumb detected, by the direction
of movement of the detected thumb, and the like. Generally, any way
of matching actions of sets of one or more fingers (or other
objects) to events may be used.
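The finger-combination examples above can be summarized as a lookup
table, as in the sketch below; the finger labels and event names are
assumptions made for illustration.

    # Map sets of detected fingers to UI events, per the examples above.
    FINGER_EVENTS = {
        frozenset({"middle", "index"}): "right_click",
        frozenset({"thumb", "right_index"}): "scroll_left",
        frozenset({"thumb", "right_ring"}): "scroll_right",
    }

    def event_for_fingers(detected_fingers):
        return FINGER_EVENTS.get(frozenset(detected_fingers))

    # A middle finger plus an index finger generates a right click event.
    print(event_for_fingers({"middle", "index"}))  # right_click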
[0066] If it is determined that touch event criteria are satisfied,
a determination may be made 510 of the touch event type. If, for
example, it is determined that the touch event type was an object
selection, then the interface may be updated 512 according to the
selection. As noted, updating the interface may be done in any
suitable way, such as changing a display of the interface on a
display device, providing a completely new display, or generally
changing the interface in accordance with its programming logic. An
overlay of representations on the interface may be removed in
accordance with an embodiment. If the touch event type is another
type, such as a drag, scroll, or other event, then the
user interface may be updated if applicable. For example, if the
user interface is scrollable, the user interface may be scrolled.
If a scroll event is detected with an interface object selected,
the object may be moved on the interface accordingly. As another
example, if the user touches the touch screen at a location
corresponding to a portion of the user interface that is not
manipulable through user interaction and moves within that area
while in contact with the touch screen, the user interface may be
left as is, although representations of detected appendages may be
updated with movement accordingly.
[0067] As illustrated in FIG. 5, when an interface is updated or
otherwise, the touch screen may continue to be monitored 504.
If it has been determined that touch event criteria have not been
satisfied--for example, if the user has touched the touch screen,
but not moved or lost contact with the touch screen--a
determination may be made 516 whether the detection of the
appendage corresponds to a touch or a hover. If it is determined
516 that the detected appendage is hovering, then a hover mode
representation of an appendage on the interface screen at a
location corresponding to the detected location may be overlaid or
updated. For example, if the hover mode representation is already
on the interface, the position of the representation may be changed
accordingly. If, for example, an overlay of the representation was
not on the user interface, a representation may then be overlaid at
the location corresponding to the location at which the appendage
was detected on the touch screen. If it is determined that the
detected appendage is touching the touch screen, a touch mode
representation of the appendage on the interface screen at a
location corresponding to the detection location may be overlaid or
updated accordingly. For example, if a hover mode representation
was on the touch screen and the user then touched the touch screen,
the hover mode representation may be changed into a touch mode
representation. If no representation was on the interface, then a
touch mode representation may appear. If a touch mode
representation was present on the interface, a location of the
representation may be updated in accordance with any movement by
the user--for example, if the location changed since the last time
the determination was made. As with other steps herein, the touch
screen continues to be monitored during performance of the
process 500.
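Condensed into Python-like pseudocode, the monitoring loop of
process 500 might take the following form; the touch_screen and
overlay objects and their methods are hypothetical, chosen only to
make the control flow concrete.

    def run_process_500(touch_screen, overlay):
        while True:
            detection = touch_screen.poll()  # monitor the touch screen (504)
            if detection is None:
                continue                     # no appendage detected (506)
            if detection.satisfies_touch_event_criteria():
                overlay.update_interface(detection)  # e.g., apply a selection
            elif detection.is_touching:
                overlay.show(detection.location, mode="touch")  # touch mode
            else:
                overlay.show(detection.location, mode="hover")  # hover mode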
[0068] As noted, representations of user appendages or other
devices used to interact with a touch screen may change according
to a manner in which the interaction is performed. One way of
changing the representation according to the manner in which the
interaction is performed is based at least in part on a distance of
a detected appendage from the touch screen and, in particular, on
the distances of multiple locations of a detected appendage (or
other object). As described
above, the color characteristics of a representation of an
appendage may be changed based at least in part on the distance the
appendage is from the touch screen. FIG. 6, for example, shows a
touch screen 602 over which a finger 604 is hovering at a distance,
out of contact with the touch screen 602. FIG. 6 also shows
a corresponding user interface 606 on which a representation 608 of
the finger 604 appears. It should be noted that the touch screen
602 and user interface 606 are shown simplified for the purpose of
illustration of an embodiment. For example, the touch screen 602 is
shown without other portions of the device to which the touch screen
602 is attached. Similarly, the user interface 606 is illustrated
without a device on which the user interface is displayed and
without a GUI, although the user interface may, and typically will,
have a GUI displayed. The touch screen 602 may similarly have
a GUI displayed, which may match a GUI displayed on the user
interface 606.
[0069] The representation 608 of the finger 604 appears on the
interface at a location corresponding to the location at which the
finger 604 hovers over the touch screen 602. In addition, in this
particular example, the representation 608 resembles an outline of
a finger and this outline is oriented according to the orientation
of the finger 604 over the touch screen 602. As shown in FIG. 6,
the finger 604 at different locations is a different distance over
the touch screen 602. The representation 608 of the finger 604 has
color characteristics changing according to the various distances.
For example, portions of the representation 608 of the finger 604
that correspond to locations of the finger 604 that are closer to
the touch screen 602 are darker than portions of the representation
608 that correspond to locations on the finger 604 that are further
from the touch screen 602. As the user reorients the finger, for
example, so that other parts of the finger are different distances
from those illustrated in FIG. 6, the representation may be updated
accordingly. Similarly, as the finger 604 is moved relative to the
touch screen 602, the location of the representation 608 on the
user interface 606 may be updated as well. (If a representation is
displayed on the touch screen, the representation may be updated
there as well.) When updated, transparency and/or other color
characteristics of the representation 608 may change according to
distance changes at locations of the various portions of the finger
604 from the touch screen 602.
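A minimal sketch of this distance-to-transparency mapping follows;
the linear falloff and the 20 mm sensing range are assumed
calibrations, not values given in this specification.

    MAX_SENSED_DISTANCE = 20.0  # mm; assumed proximity sensing range

    def opacity_for_distance(distance_mm):
        # Return an opacity in [0, 1]: 1.0 at contact, fading toward 0
        # as a portion of the finger nears the edge of the sensing range.
        d = min(max(distance_mm, 0.0), MAX_SENSED_DISTANCE)
        return 1.0 - d / MAX_SENSED_DISTANCE

    # A fingertip 2 mm away renders nearly opaque; a portion 15 mm away
    # renders mostly transparent.
    print(opacity_for_distance(2.0), opacity_for_distance(15.0))  # 0.9 0.25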
[0070] FIG. 7 shows an illustrative example of one way in which a
representation may be updated. In particular, FIG. 7 shows a touch
screen 702 over which a finger 704 hovers in order to cause a user
interface 706 to display a representation 708 of the finger 704.
The touch screen 702, finger 704, interface 706 and representation
708 may be the same as those similarly named items described above
in connection with FIG. 6. In FIG. 7, however, the finger 704 is
closer to the touch screen 702 than the finger 604 to the touch
screen 602 in FIG. 6. The representation 708 in FIG. 7 of the
finger 704, accordingly, is displayed with different color
characteristics than in FIG. 6. For example, the tip of the finger
704 is closer to the touch screen 702 than the tip of the finger
604 to the touch screen 602 of FIG. 6. The tip of the
representation 708 is therefore in this example less transparent
than the tip of the representation 608 in FIG. 6. Portions of the
representation 708 of the finger 704 that are further from the
touch screen 702 are more transparent in the same manner as shown
in FIG. 6. However, they are less transparent than corresponding
locations in FIG. 6 because overall the finger 704 in FIG. 7 is
closer to the touch screen than the finger 604 in FIG. 6 at
corresponding locations.
[0071] FIGS. 6 and 7 show comparative figures that illustrate one
particular embodiment of the present disclosure. As noted, other
ways of varying characteristics of a representation of a detected
object may be made. For example, as noted, the user interface may
warp at locations corresponding to detected objects. The amount by
which the user interface is warped may vary depending on the
distance of the detected objects. The closer the object, the more
the corresponding location on the user interface may be warped.
Generally, any way of varying the representation based at least in
part on distance of a corresponding object to a touch screen may be
used.
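The warp variation could be scaled the same way, as in this brief
sketch; the linear falloff and the maximum warp of 30 pixels are
assumptions.

    def warp_magnitude(distance_mm, max_warp_px=30.0, range_mm=20.0):
        # The closer the detected object, the more the corresponding
        # interface location is warped; linear falloff is assumed.
        d = min(max(distance_mm, 0.0), range_mm)
        return max_warp_px * (1.0 - d / range_mm)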
[0072] Further, while various embodiments of the present disclosure
are described in terms of touch screens, any input device
that is able to detect proximity and touch may be used. For
example, touch screens used in some embodiments may not themselves
display any information or may display information different from
that which is displayed on another device, such as a television. In
addition, proximity sensing techniques may be used in other
contexts, not just planar touch sensitive areas, such as those
illustrated above. For example, a remote control device with
buttons (such as physically displaceable buttons) may incorporate
proximity sensing technology. Proximity sensors may be incorporated
with the remote control device. When a user's appendage becomes
close to a button (such as by making contact with the button), a
representation of the appendage may appear on a display. When the
user presses the button, an action may be taken. The action may
correspond to the button, the location of the representation on the
display, or otherwise. As an illustrative example, if a user's
appendage becomes close to a "play" button on a remote control, a
representation of the appendage may appear over a "play" button on
a display, such as described above. When the user presses the
button, a play function may be performed. For instance, if watching
a DVD, a DVD player may be put in a play state. In an embodiment
where displaceable or other buttons are used in connection with
proximity sensing techniques, movement of the representation of an
appendage on a display may correspond to user movement of the
appendage relative to the remote control, manipulation of an input
device (such as a joystick), or otherwise.
[0073] In addition to the above, other variations are within the
scope of the present disclosure. For example, additional techniques
may be incorporated with those above. As an example, buttons of a
remote control (such as physically displaceable buttons) may be
force sensing. The above described hovering effects may be produced
upon light presses of a button and selection of an object may be
performed upon more forceful presses of the button. As other
examples, sound, vibration, or other additional feedback may be
provided to the user based on the user's interaction with a remote
control device. A remote control device may, for instance, vibrate
upon making a selection using the techniques described herein. The
remote control or another device (such as a television or audio
system) may make a sound upon selection.
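The force-sensing behavior described above might be thresholded as
in this sketch; the force values and the overlay and device objects
are hypothetical assumptions.

    LIGHT_PRESS = 0.5  # newtons; assumed hover threshold
    FIRM_PRESS = 2.0   # newtons; assumed selection threshold

    def handle_button_force(force_n, button, overlay, device):
        if force_n >= FIRM_PRESS:
            device.perform(button.action)     # e.g., put a DVD player in play
            device.vibrate()                  # optional haptic feedback
        elif force_n >= LIGHT_PRESS:
            overlay.show_hover(button.label)  # hovering effect on the display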
[0074] Other variations are within the spirit of the present
invention. Thus, while the invention is susceptible to various
modifications and alternative constructions, certain illustrated
embodiments thereof are shown in the drawings and have been
described above in detail. It should be understood, however, that
there is no intention to limit the invention to the specific form
or forms disclosed, but on the contrary, the intention is to cover
all modifications, alternative constructions, and equivalents
falling within the spirit and scope of the invention, as defined in
the appended claims.
[0075] The use of the terms "a" and "an" and "the" and similar
referents in the context of describing the invention (especially in
the context of the following claims) is to be construed to cover
both the singular and the plural, unless otherwise indicated herein
or clearly contradicted by context. The terms "comprising,"
"having," "including," and "containing" are to be construed as
open-ended terms (i.e., meaning "including, but not limited to,")
unless otherwise noted. The term "connected" is to be construed as
partly or wholly contained within, attached to, or joined together,
even if there is something intervening. Recitation of ranges of
values herein is merely intended to serve as a shorthand method of
referring individually to each separate value falling within the
range, unless otherwise indicated herein, and each separate value
is incorporated into the specification as if it were individually
recited herein. All methods described herein can be performed in
any suitable order unless otherwise indicated herein or otherwise
clearly contradicted by context. The use of any and all examples,
or exemplary language (e.g., "such as") provided herein, is
intended merely to better illuminate embodiments of the invention
and does not pose a limitation on the scope of the invention unless
otherwise claimed. No language in the specification should be
construed as indicating any non-claimed element as essential to the
practice of the invention.
[0076] Preferred embodiments of this invention are described
herein, including the best mode known to the inventors for carrying
out the invention. Variations of those preferred embodiments may
become apparent to those of ordinary skill in the art upon reading
the foregoing description. The inventors expect skilled artisans to
employ such variations as appropriate, and the inventors intend for
the invention to be practiced otherwise than as specifically
described herein. Accordingly, this invention includes all
modifications and equivalents of the subject matter recited in the
claims appended hereto as permitted by applicable law. Moreover,
any combination of the above-described elements in all possible
variations thereof is encompassed by the invention unless otherwise
indicated herein or otherwise clearly contradicted by context.
[0077] All references, including publications, patent applications,
and patents, cited herein are hereby incorporated by reference to
the same extent as if each reference were individually and
specifically indicated to be incorporated by reference and were set
forth in its entirety herein.
* * * * *