U.S. patent application number 15/861254 was filed with the patent office on 2018-01-03 and published on 2018-05-17 as publication number 20180136774 for a method and devices for displaying graphical user interfaces based on user contact.
This patent application is currently assigned to Immersion Corporation. The applicant listed for this patent is Immersion Corporation. The invention is credited to Juan Manuel Cruz-Hernandez and Ali Modarres.
Application Number | 15/861254 |
Publication Number | 20180136774 |
Family ID | 50287870 |
Filed Date | 2018-01-03 |
Publication Date | 2018-05-17 |
United States Patent Application | 20180136774 |
Kind Code | A1 |
Cruz-Hernandez; Juan Manuel; et al. | May 17, 2018 |

Method and Devices for Displaying Graphical User Interfaces Based on User Contact
Abstract
Methods and devices for displaying graphical user interface
configurations based on detected user contact are disclosed. Methods
and apparatus for providing embedded transaction modules are also
disclosed. One disclosed method comprises displaying a graphical
user interface (GUI) according to a first GUI configuration on a
display of a handheld device, receiving a sensor signal from a
sensor, the sensor coupled to the handheld device, the sensor
signal indicating a contact with the handheld device, determining a
grasping contact based at least in part on the sensor signal,
determining a second GUI configuration based at least in part on
the grasping contact, and displaying the GUI on the display
according to the second GUI configuration.
Inventors: | Cruz-Hernandez; Juan Manuel (Montreal, CA); Modarres; Ali (Montreal, CA) |
Applicant: | Immersion Corporation (San Jose, CA, US) |
Assignee: | Immersion Corporation |
Family ID: | 50287870 |
Appl. No.: | 15/861254 |
Filed: | January 3, 2018 |
Related U.S. Patent Documents

Application Number | Filing Date | Patent Number |
13800264 | Mar 13, 2013 | 9904394 |
15861254 | | |
Current U.S. Class: | 1/1 |
Current CPC Class: | G06F 1/169 20130101; G06F 3/0414 20130101; G06F 3/0488 20130101; G06F 3/016 20130101 |
International Class: | G06F 3/041 20060101 G06F003/041; G06F 3/01 20060101 G06F003/01; G06F 3/0488 20060101 G06F003/0488; G06F 1/16 20060101 G06F001/16 |
Claims
1-32. (canceled)
33. A computing device comprising: an outer housing; two or more
contact sensors positioned around a perimeter of the outer housing;
a processor; and a memory on which program code is stored, the
program code being executable by the processor to cause the
processor to: determine that at least two contact sensors of the
two or more contact sensors are being contacted concurrently;
determine that the outer housing is being grasped using a type of
grasping contact based on the at least two contact sensors being
contacted concurrently; determine a haptic effect corresponding to
the type of grasping contact; and transmit a haptic signal that is
associated with the haptic effect to a haptic output device to
cause the haptic output device to output the haptic effect.
34. The computing device of claim 33, wherein the at least two
contact sensors are a subset of the two or more contact sensors,
and wherein the two or more contact sensors are pressure
sensors.
35. The computing device of claim 33, wherein the haptic effect is
configured to indicate that the type of grasping contact is being
used to grasp the computing device.
36. The computing device of claim 33, wherein the memory further
comprises program code that is executable by the processor to cause
the processor to, after determining the haptic effect: adjust the
haptic effect in response to a change in a pressure being applied
to a contact sensor of the two or more contact sensors during the
type of grasping contact.
37. The computing device of claim 33, wherein the memory further
comprises program code that is executable by the processor to cause
the processor to determine that the outer housing is being grasped
using the type of grasping contact using relationships between (i)
multiple contact-configurations with the two or more contact
sensors, and (ii) multiple types of grasping contacts.
38. The computing device of claim 33, wherein the type of grasping
contact is a single-handed grasping contact, and wherein the memory
further comprises program code that is executable by the processor
to cause the processor to determine that the outer housing is being
grasped using the single-handed grasping contact based on a first
combination of contact sensors in the two or more contact sensors
being contacted concurrently.
39. The computing device of claim 38, wherein the memory further
comprises program code that is executable by the processor to cause
the processor to, subsequent to determining that the outer housing
is being grasped using the single-handed grasping contact,
determine that the outer housing is being grasped using a
double-handed grasping contact based on a second combination of
contact sensors in the two or more contact sensors being contacted
concurrently, the second combination being different from the first
combination.
40. A non-transitory computer-readable medium comprising program
code that is executable by a processor to cause the processor to:
determine that at least two contact sensors are being contacted
concurrently, the at least two contact sensors being among two or
more contact sensors positioned around a perimeter of an outer
housing of a computing device; determine that the outer housing is
being grasped using a type of grasping contact based on the at
least two contact sensors being contacted concurrently; determine a
haptic effect corresponding to the type of grasping contact; and
transmit a haptic signal that is associated with the haptic effect
to a haptic output device to cause the haptic output device to
output the haptic effect.
41. The non-transitory computer-readable medium of claim 40,
wherein the at least two contact sensors are a subset of the two or
more contact sensors.
42. The non-transitory computer-readable medium of claim 40,
wherein the two or more contact sensors are pressure sensors.
43. The non-transitory computer-readable medium of claim 40,
further comprising program code that is executable by the processor
to cause the processor to, after determining the haptic effect:
adjust the haptic effect in response to a change in a pressure
being applied to a contact sensor of the two or more contact
sensors during the type of grasping contact.
44. The non-transitory computer-readable medium of claim 40,
further comprising program code that is executable by the processor
to cause the processor to determine that the outer housing is being
grasped using the type of grasping contact using relationships
between (i) multiple contact-configurations with the two or more
contact sensors, and (ii) multiple types of grasping contacts.
45. The non-transitory computer-readable medium of claim 40,
wherein the type of grasping contact is a single-handed grasping
contact, and further comprising program code that is executable by
the processor to cause the processor to determine that the outer
housing is being grasped using the single-handed grasping contact
based on a first combination of contact sensors in the two or more
contact sensors being contacted concurrently.
46. The non-transitory computer-readable medium of claim 45,
further comprising program code that is executable by the processor
to cause the processor to, subsequent to determining that the outer
housing is being grasped using the single-handed grasping contact,
determine that the outer housing is being grasped using a
double-handed grasping contact based on a second combination of
contact sensors in the two or more contact sensors being contacted
concurrently, the second combination being different from the first
combination.
47. A method comprising: determining, by a processor, that at least
two contact sensors are being contacted concurrently, the at least
two contact sensors being among two or more contact sensors
positioned around a perimeter of an outer housing of a computing
device; determining, by the processor, that the computing device is
being grasped using a type of grasping contact based on the at
least two contact sensors being contacted concurrently;
determining, by the processor, a haptic effect corresponding to the
type of grasping contact; and transmitting, by the processor, a
haptic signal that is associated with the haptic effect to a haptic
output device to cause the haptic output device to output the
haptic effect.
48. The method of claim 47, wherein the at least two contact
sensors are a subset of the two or more contact sensors, and
wherein the two or more contact sensors are pressure sensors.
49. The method of claim 47, wherein the haptic effect is configured
to indicate that the type of grasping contact is being used to
grasp the outer housing.
50. The method of claim 47, further comprising, after determining
the haptic effect: adjusting the haptic effect in response to a
change in a pressure being applied to a contact sensor of the two
or more contact sensors during the type of grasping contact.
51. The method of claim 47, further comprising determining that the
computing device is being grasped using the type of grasping
contact using relationships between (i) multiple
contact-configurations with the two or more contact sensors, and
(ii) multiple types of grasping contacts.
52. The method of claim 47, wherein the type of grasping contact is
a single-handed grasping contact, and further comprising
determining that the outer housing is being grasped using the
single-handed grasping contact based on a first combination of
contact sensors in the two or more contact sensors being contacted
concurrently.
53. The method of claim 52, further comprising, prior to
determining that the outer housing is being grasped using the
single-handed grasping contact, determining that the outer housing is
being grasped using a double-handed grasping contact based on a
second combination of contact sensors in the two or more contact
sensors being contacted concurrently, the second combination being
different from the first combination.
Description
CROSS REFERENCE TO RELATED APPLICATIONS
[0001] This application is a continuation of and claims priority to
U.S. patent application Ser. No. 13/800,264, filed Mar. 13, 2013
and entitled "Method and Devices for Displaying Graphical User
interfaces Bases on User Contact," the entirety of which is hereby
incorporated by reference herein.
FIELD
[0002] The present disclosure generally relates to methods and
devices for displaying graphical user interfaces, and more
particularly to methods and devices for displaying graphical user
interface configurations based on user contact.
BACKGROUND
[0003] Presently, handheld devices are used to perform a myriad of
tasks ranging from conducting phone calls and sending text messages
to recording video, snapping pictures, and browsing the Internet.
When using a handheld device, especially handheld devices
comprising touch screens, a typical user often holds or grasps the
handheld device in a variety of different ways depending on any
number of factors (e.g. task being performed, hand availability,
comfort, preference, etc.). For example, when composing a text
message or an email, users often turn the device to a horizontal or
landscape orientation and hold the device with two hands in order
to use both thumbs to compose the message. However, when users only
have a single available hand (e.g. standing and holding a cup of
coffee in one hand), users often opt to hold the handheld device in
a vertical or portrait orientation and compose the message with one
thumb. To accommodate this behavior, current handheld devices
rotate the display and in some cases modify the user interface
based on whether the handheld device is in a vertical or horizontal
position. While useful, adjusting the user interface based only on
vertical or horizontal orientation is a crude tool that does not
take into account whether a user is holding the device with one
hand versus two hands or the type of one-handed or two-handed grasp
that a user is employing. Consequently, present devices do not
adjust the configuration of the user interface for the variety of
holds or grasps that a user may employ while the handheld device is
vertically oriented or the variety of holds or grasps that a user
may employ when the handheld device is horizontally oriented.
SUMMARY
[0004] The present disclosure generally relates to a method
comprising displaying a graphical user interface (GUI) according to
a first GUI configuration on a display of a handheld device,
receiving a sensor signal from a sensor--the sensor coupled to the
handheld device and the sensor signal indicating a contact with the
handheld device--determining a grasping contact based at least in
part on the sensor signal, determining a second GUI configuration
based at least in part on the grasping contact, and displaying the
GUI on the display according to the second GUI configuration.
Another embodiment comprises a computer-readable medium encoded
with processor-executable software program code for carrying out
such a method.
[0005] Illustrative embodiments disclosed herein are mentioned not
to limit or define the invention, but to provide examples to aid
understanding thereof. Illustrative embodiments are discussed in
the Detailed Description and further description of the invention
is provided therein. Advantages offered by various embodiments of
this invention may be further understood by examining this
specification.
BRIEF DESCRIPTION OF THE DRAWINGS
[0006] These and other features, aspects, and advantages according
to the present disclosure are better understood when the following
Detailed Description is read with reference to the accompanying
figures, wherein:
[0007] FIGS. 1-4 are illustrations of a handheld device according
to embodiments.
[0008] FIG. 5 is a flow diagram illustrating the operation of a
handheld device according to one embodiment.
[0009] FIGS. 6-14 are illustrative grasping contacts with a
handheld device according to embodiments.
[0010] FIGS. 15A-G are illustrations of GUI configurations
displayed on a handheld device according to embodiments.
DETAILED DESCRIPTION
[0011] Embodiments according to this disclosure provide methods and
handheld devices for displaying GUI configurations based
on user contact. In particular, various GUI
configurations are displayed based on how a user is grasping or
holding a handheld device.
Illustrative Embodiment
[0012] In one illustrative embodiment, a touchscreen cell phone
comprises two pressure sensors configured to detect and measure
pressure applied to the sides of the cell phone. The handheld
device processes the pressure measurements to determine how the
user is grasping or holding the cell phone.
[0013] In one particular example, the cell phone is displaying an
interactive map application and then determines that the user
has changed his grip such that he is holding the cell phone in a
vertical orientation in his right hand. In response, the cell phone
configures the GUI of the interactive map application displayed on
the cell phone so that interface controls are located on the right
side of the display. When holding a cell phone vertically in his
right hand, the user's right thumb is generally freely movable and
may be the most convenient digit for interacting with the GUI of
the interactive map application. By configuring the GUI of the
interactive map application to place interface controls near the
user's right thumb, the cell phone maximizes the usability of the
GUI.
[0014] As the user continues to modify his grasp of his cell phone,
the cell phone determines each new grasp applied to the cell phone
and displays GUIs based on the various grasps. For example, the
user may adjust his grasp of the cell phone such that he is holding
the cell phone in a horizontal orientation with his left hand
grasping the left side of the cell phone and his right hand
grasping the right side of the cell phone. In response, the cell
phone configures the GUI for the interactive map application such that
the interface controls are located on the left and right sides of
the display close to the user's thumbs. When holding a cell phone
in this manner, a user's thumbs are typically unencumbered and are
the most convenient fingers for interacting with the GUI, as shown
in FIG. 11. Once again, by modifying the GUI based on the user's
grasp, the cell phone configures the GUI of the interactive map
application to maximize the usability of the cell phone.
Illustrative Device
[0015] Referring now to FIG. 1, a block diagram illustrating a
handheld device according to one embodiment of the disclosure is
shown. Handheld device 10 comprises a processor 14. Handheld device
10 also comprises a memory 16, a display 12, an input device 15,
and sensors 18, all in communication with the processor 14. In one
embodiment, the handheld device 10 is a cell phone. In other
embodiments, handheld device 10 may be an MP3 player, a digital
camera, a handheld video gaming device, a tablet computer, or any
other handheld device comprising a display.
[0016] In some embodiments, the handheld device 10 comprises a
touch screen that acts as both a display 12 and an input device 15.
In other embodiments, input devices 15 may include one or more
buttons, trackballs, scroll wheels, touchpads, and/or any other
input device known to one having ordinary skill in the art. In some
embodiments, handheld device 10 further comprises a communication
component for communicating with a network and/or with another
device. For example, the communication component may be a wireless
networking device, a module and antenna for communication with a
cellular network, or a module and antenna for direct communication
with another device. Handheld device 10 also comprises memory 16
which stores software program code that is executable by processor
14. For example, memory 16 may comprise random-access memory that
stores program code for an operating system and user applications.
For example, memory 16 may comprise user applications including a
map application, an email application, a messaging application, a
camera application, an internet browser application, a music
application, a calendar application, or any other application.
[0017] The presence of two boxes labeled as "Sensor 18" in the
block diagram of FIG. 1 is not intended to limit a particular
embodiment according to the present disclosure to a particular
number of sensors. Rather, it is intended to demonstrate that
various embodiments according to the present disclosure may
comprise one sensor, two sensors, or any number of sensors. For
example, FIG. 2 illustrates a handheld device 10 according to one
embodiment according to the present disclosure comprising two
sensors 18, one located at each side of the handheld device 10.
FIG. 3 illustrates a handheld device 10 according to one embodiment
according to the present disclosure comprising two sensors 18, one
located at each side of the handheld device 10, and an additional
sensor 18 located at the top of the handheld device 10. In still
another embodiment, illustrated by FIG. 4, a handheld device 10
comprises nine sensors 18: four sensors located at each side of the
handheld device 10, and an additional sensor 18 located at the top
of the handheld device 10. Additional embodiments may comprise one
or more sensors 18. Furthermore, other embodiments may comprise one
or more sensors 18 located at the back, bottom, and/or face of
handheld device 10. In addition, the one or more sensors 18 in the
various embodiments may have sensing areas of varying sizes and
shapes. For example, as shown in FIG. 4, a sensor 18 located at the
top of handheld device 10 may have an oval-shaped sensing area that
has a larger area than the circular sensing areas of sensors 18
located at the left and right sides of handheld device 10. In sum,
the present disclosure contemplates a plurality of embodiments
comprising one or more sensors 18, having sensing areas that may
vary in area and/or shape, located at each of one or more exterior
surfaces of a handheld device 10.
[0018] A sensor 18 according to embodiments may be any type of
sensor that one of ordinary skill in the art would know to use to
detect grasping contact applied to a handheld device 10. In one
embodiment, sensor 18 is a pressure sensor. For example, a sensor
18 may be a resistive pressure sensor, a piezoelectric pressure
sensor, a strain gauge, or any other type of pressure sensor known
in the art. In another embodiment, sensor 18 is a sensor capable of
determining the area and/or dimensions of contact. For example,
sensor 18 may be a capacitive sensor or any other type of sensor
known in the art to be capable of determining the area and/or
dimensions of contact.
[0019] Furthermore, one or more sensors 18 may be integrated into a
handheld device 10 in any manner known by those of ordinary skill
in the art that allows for the detection of a grasping contact
applied by a user. For example, a piezoelectric sensor may be
internally coupled to a housing of a handheld device 10 such that
it can detect slight deformations of the housing that indicate a
grasping contact applied to the handheld device 10. In another
embodiment, one or more capacitive sensors may be coupled to an
exterior surface of the housing of a handheld device 10. In another
illustrative embodiment, a sensing pad of a force sensitive
resistor may be integrated into the surface of a housing of a
handheld device 10 such that a grasping force applied to the
handheld device 10 may be directly applied to the sensing pad. In
other embodiments, the one or more pressure sensors 18 are coupled
to the external surface of the housing of the handheld device
10.
[0020] In one embodiment, the handheld device 10 may comprise an
accelerometer, a gyroscope, a piezoelectric sensor, or other
suitable sensors for detecting acceleration, movement, and/or
orientation of the handheld device 10. In one such embodiment, a
detected orientation may be used in conjunction with detected
pressure to determine a grasping contact applied to the handheld
device. For example, a handheld device 10 may determine that a
horizontally-oriented two-handed grasp is being applied to the
handheld device 10 by detecting a horizontal orientation using an
accelerometer and detecting pressures indicative of a user grasping
the phone at each end (as shown in FIG. 11) using the one or more
pressure sensors 18. In other embodiments, orientation is
determined based on the pressure detected by the one or more
pressure sensors 18.
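The following sketch illustrates one way a detected orientation might be combined with side-pressure readings to classify a grasp, per the paragraph above. It is a minimal illustration only; the axis convention, the pressure threshold, and the grasp labels are assumptions and not part of the disclosure.

    # Illustrative sketch only: combining accelerometer orientation with
    # side-pressure readings to classify a grasping contact. The threshold,
    # axis convention, and labels are hypothetical.

    PRESSURE_THRESHOLD = 0.2  # assumed normalized pressure units

    def classify_grasp(accel_xyz, left_pressure, right_pressure):
        x, y, z = accel_xyz
        horizontal = abs(x) > abs(y)  # assumed: gravity along x when sideways
        left = left_pressure >= PRESSURE_THRESHOLD
        right = right_pressure >= PRESSURE_THRESHOLD
        if not left and not right:
            return "no grasping contact"
        if horizontal and left and right:
            return "horizontally-oriented two-handed grasp"  # e.g. FIG. 11
        # In a one-handed grasp the fingers typically press one side harder
        # than the palm presses the other (see paragraph [0030] below).
        if right_pressure > left_pressure:
            return "vertically-oriented left-handed grasp"
        return "vertically-oriented right-handed grasp"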
Illustrative Grasping Contacts
[0021] FIGS. 6-14 show illustrative grasping contacts (also
referred to as grasps herein) with a handheld device according to
embodiments of the disclosure. In particular, FIG. 6 shows an
illustrative vertically-oriented left-handed grasp. FIG. 7 shows an
illustrative vertically-oriented right-handed grasp. FIG. 8 shows
an illustrative vertically-oriented two-handed grasp. FIG. 9 shows
an illustrative horizontally-oriented left-handed grasp. FIG. 10
shows an illustrative horizontally-oriented right-handed grasp.
FIG. 11 shows an illustrative horizontally-oriented two-handed
grasp. FIG. 12 shows an illustrative right-handed camera grasp.
FIG. 13 shows an illustrative left-handed camera grasp. FIG. 14
shows an illustrative two-handed camera grasp. However, the
illustrative grasping contacts shown by FIGS. 6-14 are examples and
the grasping contacts contemplated by the present disclosure are
not limited to these illustrative embodiments. All variations of
grasping contacts that a user may use to hold a handheld device are
contemplated by the present disclosure.
Operation of an Illustrative Handheld Device
[0022] FIG. 5 shows a flow diagram illustrating the operation of a
handheld device according to one embodiment. In particular, FIG. 5
shows steps performed by a handheld device to provide GUI
configurations based on a user's grasp of the handheld device. To
aid in understanding how each of the steps may be performed, the
following description is provided in the context of the
illustrative block diagram of a handheld device shown in FIG. 1.
However, the described steps may also be performed by alternative
embodiments according to the present disclosure.
[0023] Beginning at step 51, the handheld device 10 displays a GUI
according to a first configuration. The first GUI configuration may
be any GUI configuration associated with any grasping contact
recognizable by the handheld device 10 or the lack of any grasping
contact. For example, the handheld device 10 may be displaying a
GUI configuration associated with a horizontally-oriented
right-handed grasp of the handheld device 10 based on detecting
that the handheld device 10 was subjected to such a grasping
contact in a previous iteration of the steps 52-56 of FIG. 5. In
another example, the handheld device 10 may be displaying a default
GUI configuration associated with no grasping contact or an unknown
grasping contact.
[0024] At step 52, a processor 14 of a handheld device receives a
sensor signal indicating a reading obtained by a sensor 18. For
example, in one embodiment a handheld device 10 comprises a
pressure sensor 18 and the sensor signal received by the processor
14 indicates whether or not the sensor 18 detects pressure applied
to the sensor 18 that exceeds a threshold magnitude. In another
embodiment, the sensor signal received by the processor 14
indicates a magnitude value of the pressure applied to the sensor
18. In such an embodiment, the sensor signal may indicate a
magnitude value of zero where the sensor detects that no pressure
is being applied to the sensor. In another embodiment, a sensor 18
does not provide a sensor signal if no pressure is detected. In
still another embodiment, handheld device 10 comprises a capacitive
sensor 18 and the sensor signal received by processor 14 indicates
the area and/or dimensions of contact applied to the sensor 18. In
this embodiment, the handheld device 10 may then determine an
applied pressure based on the area and/or dimensions of contact
applied to the sensor 18.
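A minimal sketch of normalizing the sensor-signal forms described above into a single pressure estimate follows. The signal encoding and the area-to-pressure calibration constant are hypothetical assumptions; a real device would calibrate each sensor individually.

    # Illustrative sketch only: normalizing the sensor-signal forms of
    # paragraph [0024] into one pressure estimate.

    AREA_TO_PRESSURE = 0.05  # assumed calibration: pressure units per mm^2

    def pressure_from_signal(kind, value):
        if kind == "binary":      # sensor reports only above/below threshold
            return 1.0 if value else 0.0
        if kind == "magnitude":   # sensor reports a pressure magnitude
            return float(value)   # may legitimately be zero (no pressure)
        if kind == "area":        # capacitive sensor reports contact area
            return value * AREA_TO_PRESSURE  # estimate pressure from area
        raise ValueError("unknown sensor signal kind: " + kind)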
[0025] As described above, a handheld device 10 may comprise a
plurality of sensors 18. In one such embodiment, the processor 14
receives signals indicating the pressure or contact area and/or
dimensions detected by two or more of the plurality of sensors 18.
In other embodiments, sensor signals indicating the pressure or
contact area and/or dimensions detected by each of the plurality of
sensors 18 are received by the processor 14 of handheld device
10.
[0026] In one embodiment, the processor 14 receives sensor signals
by periodically checking the output of the one or more sensors
18. In another embodiment, processor 14 checks the output of the
one or more sensors 18 upon receiving a hardware interrupt
indicating a change in one or more sensor readings of sensors 18. In
an alternate embodiment, processor 14 checks the output of the one or
more sensors 18 upon receiving a software interrupt indicating a
change in one or more sensor readings of sensors 18.
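A sketch of the periodic-polling variant appears below. The sensor object and its read() method are hypothetical; an interrupt-driven variant would instead register the handler as a callback fired when a reading changes.

    # Illustrative sketch only: a periodic-polling loop for step 52.

    import time

    POLL_INTERVAL_S = 0.05  # assumed 20 Hz polling rate

    def poll_sensors(sensors, handle_readings, iterations=None):
        count = 0
        while iterations is None or count < iterations:
            readings = [sensor.read() for sensor in sensors]  # hypothetical API
            handle_readings(readings)  # hands the readings to step 53
            time.sleep(POLL_INTERVAL_S)
            count += 1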
[0027] At step 53, the handheld device 10 processes the one or more
sensor signals to determine either a particular grasping contact
being applied to the handheld device 10 or that no grasping contact
is presently being applied. In one embodiment, processor 14
receives sensor signals corresponding to each of the one or more of
the sensors 18 of handheld device 10. In this embodiment, each
sensor signal received by the processor 14 indicates that a
corresponding sensor 18 either detects the presence (e.g. binary
value of 1) or absence (e.g. binary value of 0) of pressure above a
required threshold, or of a contact. The memory 16 of client device 10
may comprise a set of maps corresponding to possible permutations
of the binary values of the sensors 18 that represent grasping
contacts. For example, if a client device 10 has four sensors 18,
the memory 16 of the client device may comprise the following set
of maps: 0000, 0001, 0010, 0011, 0100, 0101, 0110, 0111, 1000,
1001, 1010, 1011, 1100, 1101, 1110, and 1111, wherein each bit
represents a particular sensor and each map is associated with a
particular grasping contact. In some embodiments, more than one map
may correspond to a particular grasping contact. Upon receiving
sensor signals corresponding to each sensor 18, the processor 14
may determine a binary value corresponding to each sensor signal,
compare the binary values to the set of maps, and thereby determine
the appropriate map and corresponding grasping contact.
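The following sketch illustrates this binary-map lookup for a four-sensor device. The map-to-grasp assignments shown are invented examples; a real device would define all sixteen permutations, and several maps may share one grasping contact.

    # Illustrative sketch only: the binary-map scheme of paragraph [0027].

    GRASP_MAPS = {
        "0000": "no grasping contact",
        "1100": "vertically-oriented right-handed grasp",
        "0011": "vertically-oriented left-handed grasp",
        "1111": "horizontally-oriented two-handed grasp",
        # ... remaining permutations omitted for brevity
    }

    def determine_grasp(sensor_states):
        """sensor_states: one boolean per sensor, True = contact detected."""
        key = "".join("1" if state else "0" for state in sensor_states)
        return GRASP_MAPS.get(key, "unknown grasping contact")

    # e.g. determine_grasp([True, True, False, False])
    #   -> "vertically-oriented right-handed grasp"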
[0028] In another embodiment, processor 14 receives sensor signals
corresponding to each sensor 18 of client device 10. Each sensor
signal received by the processor 14 indicates a measurement of
pressure detected at a corresponding sensor 18. In another
embodiment, each sensor signal received by the processor 14
indicates a measurement of contact area and/or dimensions detected
at a corresponding sensor 18, whereby the processor 14 calculates
corresponding pressure measurements. The range of possible pressure
measurements may be subdivided with each subdivision having a
corresponding value. The memory 16 of client device 10 may comprise
a set of maps corresponding to possible permutations of the
subdivided pressure measurement values corresponding to each sensor
18 that represent grasping contacts. For example, if a client
device 10 has two sensors 18, and the range of pressure magnitudes per
sensor is subdivided into four subdivisions, the memory 16 of the
client device may comprise the following set of maps: 0000, 0001,
0010, 0011, 0100, 0101, 0110, 0111, 1000, 1001, 1010, 1011, 1100,
1101, 1110, and 1111, wherein the first two bits represent a first
sensor 18, the second two bits represent a second sensor 18, and
each map is associated with a particular grasping contact. In some
embodiments, more than one map may correspond to a particular
grasping contact. Upon receiving sensor signals corresponding to
each sensor 18, the processor 14 may determine a subdivision value
corresponding to each sensor signal, compare the values to the set
of maps, and thereby determine the appropriate map and
corresponding grasping contact.
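A sketch of this subdivided-pressure scheme follows, for two sensors each quantized into four levels (two bits). The level boundaries and the map assignments are invented examples.

    # Illustrative sketch only: the subdivided-pressure maps of
    # paragraph [0028].

    LEVEL_BOUNDARIES = [0.1, 0.4, 0.7]  # assumed normalized cut points

    PRESSURE_MAPS = {
        "0000": "no grasping contact",
        "0111": "vertically-oriented left-handed grasp",   # light left, firm right
        "1101": "vertically-oriented right-handed grasp",  # firm left, light right
        "1111": "horizontally-oriented two-handed grasp",
        # ... remaining permutations omitted
    }

    def quantize(pressure):
        """Map a pressure magnitude to a two-bit subdivision, "00".."11"."""
        level = sum(pressure >= boundary for boundary in LEVEL_BOUNDARIES)
        return format(level, "02b")

    def determine_grasp(left_pressure, right_pressure):
        key = quantize(left_pressure) + quantize(right_pressure)
        return PRESSURE_MAPS.get(key, "unknown grasping contact")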
[0029] In other embodiments, processor 14 receives sensor signals
corresponding to a subset of the sensors 18 of a handheld device
10. For example, processor 14 may receive sensor signals from each
of the one or more of the sensors 18 of handheld device 10 that are
detecting the presence of pressure or contact area and/or
dimensions, but not from the sensors 18 to which no pressure or
contact, or only pressure below a required threshold, is applied. In one
embodiment, handheld device 10 assumes a default value (e.g. binary
value of 0 for no pressure or contact) for the sensors 18 for which
the processor 14 did not receive sensor signals, and determines the
appropriate map as described above to determine the grasping
contact.
[0030] In other embodiments, maps using decimal, hexadecimal, or
any other types of numbers may be used. In still further
embodiments, maps may not be used at all. For example, in one
embodiment the processor 14 receives sensor signals indicating
measurements of pressure or contact area and/or dimensions for
sensors 18 of handheld device 10, and processes the measurements
through an algorithm that determines a grasping contact by
evaluating measurements for particular sensors 18 relative to
measurements for other sensors 18. For example, if a handheld
device 10 comprising a sensor 18 at each side of the handheld device
10 detects a pressure applied to the left sensor 18 and a pressure
applied to the right sensor 18 that is a magnitude higher than the
pressure applied to the left sensor 18, then the handheld device
determines that a user is holding the handheld device in his left hand.
In another embodiment, the algorithm may use both actual and
relative measurements to determine the grasping contact. In still
another embodiment, the value of pressure measurements over time is
used to determine the grasping contact. In another embodiment, the
value of pressure measurements over time is used to determine
emotions/moods of a user of a handheld device 10.
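A sketch of the map-free, relative-measurement approach follows. The asymmetry factor is an assumption standing in for "a magnitude higher"; the labels are illustrative only.

    # Illustrative sketch only: the relative-pressure algorithm of
    # paragraph [0030].

    RELATIVE_FACTOR = 2.0  # assumed asymmetry ratio

    def grasp_from_relative_pressure(left_pressure, right_pressure):
        if left_pressure <= 0.0 and right_pressure <= 0.0:
            return "no grasping contact"
        if right_pressure >= RELATIVE_FACTOR * left_pressure > 0.0:
            return "left-handed grasp"   # palm on left, fingers pressing right
        if left_pressure >= RELATIVE_FACTOR * right_pressure > 0.0:
            return "right-handed grasp"  # palm on right, fingers pressing left
        return "two-handed or undetermined grasp"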
[0031] In addition to detecting grasping contacts, a handheld
device 10 may detect the lack of a grasping contact. In one
embodiment, a handheld device 10 determines the lack of a grasping
contact based on all sensors 18 sensing no applied pressure. In
another embodiment, the handheld device 10 determines the lack of a
grasping contact by determining that the handheld device 10 is
lying on a table or similar surface (e.g. by detecting pressure
only at sensors 18 located on the back of handheld device 10).
[0032] In one embodiment, the operations performed by a handheld
device 10 to determine a grasping contact, or lack thereof, applied
to handheld device 10 at steps 52 and 53 are performed according to
an operating system software module or similar software package
stored in memory 16 and comprising software program code executable
by processor 14. After determining grasping contact at step 53, the
operating system software module would provide grasping contact
information to an application layer of the operating system of
handheld device 10 so that the application layer may, at step 54,
determine a GUI configuration based on the grasping contact. In
another embodiment, the grasping contact information is provided to
one or more applications executing on handheld device 10, so that
the one or more applications may determine a GUI configuration
based on the grasping contact, or lack thereof. The grasping
contact information may be provided to an application layer and/or
one or more applications using an application program interface, a
global data structure, messaging between operating system layers,
or through any other means known by one having ordinary skill in
the art.
[0033] At step 54, the handheld device 10 determines a GUI
configuration based on the determined grasping contact or the lack
of a grasping contact presently applied to a handheld device 10. In
one embodiment, memory 16 comprises a database of GUI
configurations that may be retrieved according to the grasping
contact detected in step 52. In another embodiment, memory 16
comprises a database of GUI configurations that may be retrieved
based on a particular grasping contact and on a particular screen
displayed when the grasping contact is detected. For example, a
handheld device 10 may be displaying a home screen when a
vertically-oriented left-handed grasp is applied to the handheld
device 10. The handheld device 10 retrieves a GUI configuration
from the database based on the application of the
vertically-oriented left-handed grasp and the active status of the
home screen. In another embodiment, memory 16 of handheld device 10
comprises program code for applications that may be executed on the
handheld device 10, wherein the application program code comprises
GUI configurations and a mapping of grasping contacts to the GUI
configurations based on the application screen being displayed. For
example, a text message composition screen of a texting application
may be displayed on a handheld device 10 at the time the handheld
device 10 detects the application of a horizontally-oriented
two-handed grasp. Based on the grasping contact detected and the
active status of the text message composition screen, the texting
application determines a GUI configuration based on the mapping of
the grasping contacts to GUI configurations while displaying the
text message composition screen. In another embodiment, the
handheld device 10 determines that an application or particular
functionality is to be launched based on the detection of a
grasping contact applied to the handheld device 10. For example, a
handheld device 10 may be configured to launch a camera application
based on a particular grasping contact (e.g. one of the exemplary
camera grasps of FIGS. 12-14). The camera application may then
determine a particular GUI configuration based on the grasping
contact currently applied to the handheld device 10.
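A minimal sketch of the (grasping contact, active screen) lookup described above follows. The keys, configuration names, and the grasp-triggered camera launch entry are invented examples.

    # Illustrative sketch only: the GUI-configuration lookup of
    # paragraph [0033].

    GUI_CONFIGS = {
        ("horizontally-oriented two-handed grasp", "text composition"):
            "wide keyboard configuration",
        ("vertically-oriented left-handed grasp", "home screen"):
            "home screen, left-handed configuration",
        ("right-handed camera grasp", None):
            "launch camera application",  # grasp alone triggers a launch
    }

    DEFAULT_CONFIG = "default configuration"

    def gui_config_for(grasp, active_screen):
        # Prefer a screen-specific entry, then a grasp-only entry, then
        # the default configuration.
        return (GUI_CONFIGS.get((grasp, active_screen))
                or GUI_CONFIGS.get((grasp, None))
                or DEFAULT_CONFIG)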
[0034] In one embodiment, a handheld device 10 determines a GUI
configuration based on the determined grasping contact applied to a
handheld device 10 and on detected movement or acceleration of the
handheld device 10. For example, the handheld device 10 may
determine that a user is applying a vertically-oriented left-handed
grasp to the handheld device 10 while in a moving vehicle and
determine a particular GUI configuration based on the detected
movement and grasping contact. In one embodiment, a text messaging
application compose screen displays a particular keyboard
configuration for a detected vertically-oriented left-handed grasp
and no detected movement, but displays no keyboard where the
handheld device 10 determines a vertically-oriented left-handed
grasp is being applied to a handheld device 10 and also determines
that the handheld device 10 is in a moving vehicle.
[0035] At step 55, the display 12 of handheld device 10 presents
the GUI configuration determined at step 54 based on a detected
grasping contact. If the determined GUI configuration is already
being displayed, then the handheld device 10 simply continues to
display the current GUI configuration. If the GUI configuration
determined at step 54 based on a detected grasping contact differs
from the GUI configuration currently being displayed, then the
handheld device 10 updates the display according to the GUI
configuration determined at step 54. In another embodiment, the
handheld device 10 launches an application or particular
functionality and a corresponding GUI configuration determined at
step 54 based on a detected grasping contact, and updates the
display 12 based on the determined GUI configuration. If a handheld
device 10 receives sensor signals before receiving any user input,
the method returns to step 52. If the handheld device receives user
input, then the method proceeds to step 56.
[0036] For the purposes of this disclosure, user input may be any
manipulation of physical controls (e.g. physical buttons, switches,
scroll wheels, or any other physical control known to one having
ordinary skill in the art) or manipulation of controls or objects
displayed on a screen by using one or more fingers, a stylus or
similar input mechanisms, to tap, press, press and hold, swipe, or
provide any other input through static or dynamic contact with a
touchscreen or a touchpad, such as by pressing, dragging or
otherwise changing a characteristic of one or more contact points
(collectively referred to herein as "gestures"). As discussed
below, in some embodiments user input may further comprise grasping
contacts and variations of pressure provided by a particular
grasping contact. In one embodiment, user input may comprise audio.
In another embodiment, user input comprises physically moving the
handheld device 10 including shaking, turning, and/or any other
physical movements of the handheld device 10 performed by a
user.
[0037] At step 56, the user interacts with the displayed GUI
configuration of the handheld device 10. For example, a user may be
using a graphical map application on her handheld device 10 while
employing a vertically-oriented right-handed grasp. The mapping
application may be configured to use a GUI configuration that
displays the map in a vertical orientation and displays the zoom
and pan controls for viewing a map on the right side of the display
12 of the handheld device, closest to the user's right thumb, which
may easily access the right side of display 12. Advantageously, the
displayed GUI configuration is based on a user's grasp of the
handheld device 10 and therefore may be configured to provide a more
convenient GUI than a static GUI that does not change based on the
user's grasp or is limited to configurations based on
vertical/horizontal orientation.
[0038] At step 57, the handheld device interprets the user input
based on the displayed GUI configuration. For example, in the
mapping application example described above, a user's tap of an
area on the right side of the display 12 will be interpreted as a
manipulation of a zoom or pan control according to the GUI
configuration for a vertically-oriented right-handed grasp. In
another embodiment, the user input may cause a screen transition.
The subsequent screen may be one of multiple GUI configurations
based on a presently applied grasping contact. For example, in the
mapping application example described above, if a user presses a
button to launch a map search dialogue, the displayed GUI
configuration may comprise a search dialogue and a keypad along the
right side of the display 12 based on a vertically-oriented
right-handed grasp.
[0039] The method represented by the flow diagram illustrating the
operation of a handheld device of FIG. 5 is an iterative process,
the steps of which may be performed in different sequences. For
example, the process may proceed from displaying a GUI at step 55
to receiving one or more sensor signals at step 52 as a result of a
user applying or changing his grasp of handheld device 10.
Similarly, the process may interpret user input based on a
displayed GUI at step 57 and then proceed to receive one or more
sensor signals at step 52 as a result of a user applying or
changing his grasp of handheld device 10. Furthermore, as described
above, interpretation of user input based on a displayed GUI at
step 57 may result in a screen transition. Accordingly, the process
may proceed from step 57 to step 54 to determine the appropriate
GUI configuration for the screen displayed following the
transition. Finally, the process may interpret user input based on
a displayed GUI at step 57 and then proceed to receive additional
user input at step 56.
Exemplary GUI Configurations
[0040] FIGS. 15A-G illustrate GUI configurations according to
various embodiments according to the present disclosure. FIG. 15A
shows a handheld device 1500 displaying a GUI configuration for a
text message composition screen when the handheld device 1500
determines that a user is holding the handheld device 1500 in a
horizontally-oriented two-handed grasp, such as the grasp
illustrated by FIG. 11. Based on the detection of the
horizontally-oriented two-handed grasp, the handheld device 1500
determines that a GUI configuration comprising a
horizontally-oriented text message composition area 1510 and a
keyboard 1512 that spans the bottom portion of a touchscreen 1502
should be displayed. This configuration allows a user to use both
thumbs to compose the text message. While composing a text message
employing a horizontally-oriented two-handed grasp, the user may
change grasps. For example, the user may release his left hand's
grasp of the handheld device 1500 resulting in a
horizontally-oriented right-handed grasp.
[0041] FIG. 15B shows a handheld device 1500 displaying a GUI
configuration for a text message composition screen when the
handheld device 1500 determines that a user is holding the handheld
device 1500 in a horizontally-oriented right-handed grasp, such as
the grasp illustrated by FIG. 10. Based on the detection of the
horizontally-oriented right-handed grasp, the handheld device 1500
determines that a GUI configuration comprising a
horizontally-oriented text message composition area 1520 and a
keyboard 1522 positioned on a right portion of the touchscreen 1502
should be displayed. This configuration allows a user to use his
right thumb--the only thumb positioned near the touchscreen--to
compose the text message. Therefore, the handheld device 1500 is
able to configure the text message composition screen GUI to
optimize usability based on the user's grasp. While composing a
text message employing a horizontally-oriented right-handed grasp,
the user may again change grasps. For example, the user may turn
the handheld device 1500 to a vertical position and hold it in his
right hand resulting in a vertically-oriented right-handed
grasp.
[0042] FIG. 15C shows a handheld device 1500 displaying a GUI
configuration for a text message composition screen when the
handheld device 1500 determines that a user is holding the handheld
device 1500 in a vertically-oriented right-handed grasp (see, e.g.,
FIG. 7), a vertically-oriented left-handed grasp (see, e.g., FIG. 6),
or a vertically-oriented two-handed grasp (see, e.g., FIG. 8). Based on
the detection of a vertically-oriented right-handed grasp,
vertically-oriented left-handed grasp, or vertically-oriented
two-handed grasp, the handheld device 1500 determines that a GUI
configuration comprising a vertically-oriented text message
composition area 1530 and a keyboard 1532 spanning a bottom portion
of the touchscreen 1502 should be displayed. In this embodiment, a
user's thumb can easily reach the entire width of the display while
employing a vertically-oriented right-handed or left-handed grasp.
Therefore, a single GUI configuration may be used for a
vertically-oriented right-handed grasp, vertically-oriented
left-handed grasp, or vertically-oriented two-handed grasp. In
other embodiments, different GUI configurations for the text
message composition may be used for each of the vertically-oriented
grasps. For example, if a touchscreen 1502 of a handheld device
1500 has a width such that a typical user's thumb would not be able
to reach the entire width of the touchscreen 1502 while holding the
handheld device 1500 in a vertical position, GUI configurations
positioning a keypad on appropriate sides of the screen, similar to
the illustrative embodiment of FIG. 15B may be employed.
[0043] As described above in relation to step 54, in one embodiment
a handheld device 1500 may be configured to launch or transition to
a different application or functionality not presently displayed
upon detecting one or more particular grasping contacts. For
example, in one embodiment a user may be composing a text message
when she sees an event unfolding, decide that she would like to
capture video of the event, and then grasp the phone in a
right-handed camera grasp (see, e.g., FIG. 12), a left-handed
camera grasp (see, e.g., FIG. 13), or a two-handed camera grasp
(see, e.g., FIG. 14) to launch a camera application or transition
to camera functionality.
[0044] FIG. 15D shows a handheld device 1500 displaying a GUI
configuration for a camera screen when the handheld device 1500
determines that a user is holding the handheld device 1500 in a
right-handed camera grasp (see, e.g., FIG. 12). Based on the
detection of a right-handed camera grasp, the handheld device 1500
determines that a GUI configuration comprising a
horizontally-oriented viewfinder area 1540 spanning from the left
edge of the touchscreen 1502 to a camera controls area 1542
positioned on the right side of the touchscreen 1502 for displaying
user interface controls for manipulating the camera functionality
should be displayed.
In the embodiment shown in FIG. 15D, the user interface controls
comprise a picture mode button 1544, a video mode button 1546, and
a shutter/record button 1548. Additional or alternative controls
(e.g., zoom controls, timer controls, lighting mode controls, etc.)
may be displayed in other embodiments.
[0045] Similarly, FIG. 15E shows a handheld device 1500 displaying
a GUI configuration for a camera screen when the handheld device
1500 determines that a user is holding the handheld device 1500 in
a left-handed camera grasp (see, e.g., FIG. 13). Based on the
detection of a left-handed camera grasp, the handheld device 1500
determines that a GUI configuration comprising a
horizontally-oriented viewfinder area 1550 spanning from the right
edge of the touchscreen 1502 to a camera controls area 1552
positioned on the left side of the touchscreen 1502 for displaying
user interface controls for manipulating the camera functionality
should be displayed.
In the embodiment shown in FIG. 15E, the user interface controls
comprise a picture mode button 1554, a video mode button 1556, and
a shutter/record button 1558. Additional or alternative controls
(e.g., zoom controls, timer controls, lighting mode controls, etc.)
may be displayed in other embodiments.
[0046] The configurations of FIGS. 15D and 15E for right-handed
camera grasps and left-handed camera grasps, respectively, allow a
user to use the thumb positioned near the touchscreen 1502 to
control the camera. Therefore, the handheld device 1500 is able to
configure the camera screen GUI to optimize usability based on the
user's grasp. In another embodiment, the user may apply a
two-handed camera grasp (see, e.g., FIG. 14) to the handheld device
1500. In the case of a two-handed camera grasp, both of the user's
thumbs would be positioned near the touchscreen 1502. In one
embodiment, when a two-handed camera grasp is applied, the handheld
device 1500 defaults to the same camera screen GUI configuration
used for a left-handed camera grasp. In another embodiment, when a
two-handed camera grasp is applied, the handheld device 1500
defaults to the same camera screen GUI configuration used for a
right-handed camera grasp. In still another embodiment, when a
two-handed camera grasp is applied, the handheld device 1500 will
display identical user interface controls (e.g. a picture mode
button 1554, a video mode button 1556, and a shutter/record button
1558) on both sides of the touchscreen 1502. In an additional
embodiment, when a two-handed camera grasp is applied, the handheld
device 1500 displays a camera screen GUI configuration designed for
optimal usability with a two-handed camera grasp.
[0047] After transitioning to the camera functionality of a
handheld device 1500 by applying any one of the three camera grasps
described above, a user may decide to rotate the handheld device
1500 to a vertical orientation in order to capture pictures or
video in a portrait format. For example, the user may turn the
handheld device 1500 to a vertical position and hold it in his
right hand resulting in a vertically-oriented right-handed grasp.
Alternatively, the user may hold the handheld device in her left
hand resulting in a vertically-oriented left-handed grasp.
[0048] FIG. 15F shows a handheld device 1500 displaying a GUI
configuration for a camera screen when the handheld device 1500
determines that a user is holding the handheld device 1500 in a
vertically-oriented right-handed grasp with camera functionality
active. Based on the detection of a vertically-oriented
right-handed grasp, the handheld device 1500 determines that a GUI
configuration comprising a vertically-oriented viewfinder area 1560
spanning from the top edge of the touchscreen 1502 to a camera
controls area 1562 positioned at the bottom of the touchscreen 1502
for displaying user interface controls for manipulating the camera
functionality. In the embodiment shown in FIG. 15F, the user
interface controls comprise a picture mode button 1564, a video
mode button 1566, and a shutter/record button 1568. The
shutter/record button 1568 is positioned toward the right side of
the camera controls area 1562 to allow convenient manipulation by a
user's right thumb. Additional or alternative controls (e.g., zoom
controls, timer controls, lighting mode controls, etc.) may be
displayed in other embodiments.
[0049] FIG. 15G shows a handheld device 1500 displaying a GUI
configuration for a camera screen when the handheld device 1500
determines that a user is holding the handheld device 1500 in a
vertically-oriented left-handed grasp with camera functionality
active. Based on the detection of a vertically-oriented left-handed
grasp, the handheld device 1500 determines that a GUI configuration
comprising a vertically-oriented viewfinder area 1570 spanning from
the top edge of the touchscreen 1502 to a camera controls area 1572
positioned at the bottom of the touchscreen 1502 for displaying
user interface controls for manipulating the camera functionality
should be displayed.
In the embodiment shown in FIG. 15G, the user interface controls
comprise a picture mode button 1574, a video mode button 1576, and
a shutter/record button 1578. The shutter/record button 1578 is
positioned toward the left side of the camera controls area 1572 to
allow convenient manipulation by a user's left thumb. Additional
or alternative controls (e.g., zoom controls, timer controls,
lighting mode controls, etc.) may be displayed in other
embodiments.
[0050] In another embodiment, the user may apply a
vertically-oriented two-handed grasp to the handheld device 1500
with camera functionality active. In the case of a
vertically-oriented two-handed grasp, both of the user's thumbs
would be positioned near the touchscreen 1502. In one embodiment,
when a vertically-oriented two-handed grasp is applied with camera
functionality active, the handheld device 1500 defaults to the same
camera screen GUI configuration used for a vertically-oriented
left-handed grasp with camera functionality active. In another
embodiment, when a vertically-oriented two-handed grasp is applied,
the handheld device 1500 defaults to the same camera screen GUI
configuration used for a vertically-oriented right-handed grasp
with camera functionality active. In still another embodiment, when
a vertically-oriented two-handed grasp is applied, the handheld
device 1500 will display identical user interface controls on both
sides of the touchscreen 1502. In an additional embodiment, when a
vertically-oriented two-handed grasp is applied, the handheld
device 1500 displays a camera screen GUI configuration designed for
optimal usability with a vertically-oriented two-handed grasp.
Grasping Contacts as User Input
[0051] In addition to providing GUI configurations based on
grasping contacts applied to a handheld device, the present
disclosure further contemplates a handheld device interpreting
grasping contacts as user input. In one exemplary embodiment, a
user, as she is applying a two-handed camera grasp to a handheld
device with her left and right index fingers pressing down on the
top corners of the handheld device, may slide either her left or
right index finger along the top of the housing of the handheld
device toward the middle to cause a camera application to zoom and
may slide the finger back to its original position to undo the zoom
selection. In another exemplary embodiment, a user, as she is
applying a two-handed camera grasp to a handheld device with her
left and right index fingers pressing down on the top corners of
the handheld device, may increase the pressure applied by either index
finger to cause a camera application to snap a picture, begin
recording video, change volume, or perform any other operation
according to the design of the camera application. In another
exemplary embodiment, particular grasping contacts and related
pressure information are used to identify a particular user. In one
embodiment, this functionality is used for unlocking a handheld
device 10 and/or particular functionality/information contained
thereon. For example, a handheld device 10 may provide a phone lock
setting that allows the user to input a particular grasp (e.g.
pinching the top and bottom of the faces of the handheld device 10
in opposite corners between a user's thumb and index finger,
squeezing the sides of the handheld device 10 at particular
positions, or any other grasping contact that the handheld device
10 is capable of detecting) to unlock the handheld device. In
another embodiment, a handheld device 10 may allow similar
configuration and functionality for a keypad lock or for locking
applications or particular files containing sensitive information
(e.g. a password storage application, a banking application, a
contacts application, a notepad application, a file containing
confidential strategic information for a business, or any other
application or file).
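The following sketch illustrates one way a pressure increase during a two-handed camera grasp could be interpreted as a shutter press, as described above. The sensor identifiers, the squeeze threshold, and the camera call are hypothetical.

    # Illustrative sketch only: grasp pressure as user input, per
    # paragraph [0051].

    SQUEEZE_DELTA = 0.3  # assumed pressure increase that counts as a squeeze

    def on_pressure_change(grasp, sensor_id, old_pressure, new_pressure,
                           camera):
        if grasp != "two-handed camera grasp":
            return
        if sensor_id in ("top-left corner", "top-right corner"):
            if new_pressure - old_pressure >= SQUEEZE_DELTA:
                camera.take_picture()  # hypothetical camera-application call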
[0052] The present disclosure further contemplates embodiments in
which a user's emotion or mood is detected based on one or more of the
particular grasping contacts detected, the magnitude of the
pressure applied to a handheld device 10, or the timing of the
pressure applied. A handheld device 10 may provide particular GUI
configurations and/or tailored functionality based on emotions,
moods, or other physiological data detected by one or more sensors.
In one embodiment, the haptic effects output by a handheld device
10 may be altered (e.g. lessened/increased in intensity) based on a
detected emotion/mood of a user.
[0053] In another embodiment, the user ("User A") of one handheld
device 10 may input information to the handheld device 10 in the
form of grasping contacts and variations of pressure applied by
those grasping contacts which, in turn, may be communicated to
another handheld device 10 and its user ("User B"), thereby
providing emotional cues or mood states of User A to User B. In
one embodiment, User A's emotion and/or mood is provided to User B
through User B's handheld device 10 outputting haptic effects based
on emotion and/or mood information received from User A's handheld
device 10. In other embodiments, audio and/or visual effects may be
used. In one particular embodiment, User A inputs grasping contacts
of constant or varying pressures into his/her handheld device 10
that is executing a virtual handshake application that interprets
the grasping contacts and causes information based on the grasping
contacts to be transmitted to a virtual handshake application
executing on User B's handheld device 10. Based on the received
information, User B's handheld device outputs one or more haptic
effects representing User A's grasping contacts.
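The virtual handshake exchange might be sketched as a simple message
pass between the two applications. The message format, function
names, and the stand-in haptic callback below are hypothetical, not
part of the disclosure.

    # Hypothetical sketch of the virtual handshake; the JSON message
    # shape and the haptic callback are illustrative assumptions.
    import json

    def encode_handshake(grasp_type, pressure_samples):
        """Serialize User A's grasp for transmission to User B."""
        return json.dumps({"grasp": grasp_type,
                           "pressures": pressure_samples})

    def play_handshake(message, haptic_out):
        """On User B's device, replay each pressure sample as a
        haptic effect of proportional intensity."""
        data = json.loads(message)
        for p in data["pressures"]:
            haptic_out(intensity=p)  # assumed haptic output callback

    # Usage with a stand-in callback in place of a real actuator:
    msg = encode_handshake("two_handed_squeeze", [0.2, 0.5, 0.9, 0.4])
    play_handshake(msg, lambda intensity: print(f"pulse {intensity:.1f}"))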
[0054] The present disclosure contemplates numerous variations of
interpreting grasping contacts as user input in relation to any
number of applications or functionality including all of the
variations that one having ordinary skill in the art would
recognize as useful in the process of designing user interfaces for
handheld devices.
Haptics
[0055] In one embodiment, a handheld device 10 comprises one or
more hardware components (e.g. various actuators known to one
having ordinary skill in the art) for outputting haptic effects and
outputs haptic effects based on grasping contacts detected by a
handheld device 10 in one or more of the manners described above.
For example, in one embodiment a memory 16 of a handheld device 10
comprises a set or library of haptic effects (e.g. vibrations,
pulses, pops, jolts, and/or combinations thereof) associated with
various grasping contacts, which are output upon detection of a
grasping contact as single instances or in repeating patterns for
the duration of the detected grasping contact. In another embodiment,
the haptic effect will vary in intensity and/or frequency based on
variations in the magnitude of pressure applied to a handheld
device 10 by a grasping contact and sensed by the handheld device
10. For example, a haptic effect resembling a heartbeat may be
output upon detection of a vertically-oriented right-handed grasp
and may be strengthened in intensity and/or frequency as a user
increases the pressure applied by that grasping contact and may be
weakened in intensity and/or frequency as a user lessens the
pressure applied by that grasping contact. In one embodiment, a
haptic effect (e.g. the heartbeat haptic effect) output by a
handheld device 10 may be solely based on the particular grasping
contact detected. For example, the handheld device 10 may output a
jolt haptic effect upon detecting an upside-down vertically-oriented
right-handed grasp in order to notify the user that the
handheld device 10 is upside-down.
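A minimal sketch of such a library, assuming normalized pressure in
[0, 1] and invented effect names and scaling rules, might look as
follows; none of these identifiers come from the disclosure.

    # Illustrative grasp-to-haptic-effect library with pressure-scaled
    # playback; all names and numbers are assumptions.

    HAPTIC_LIBRARY = {
        "vertical_right_handed": {"effect": "heartbeat", "base_hz": 1.2},
        "upside_down_vertical_right_handed": {"effect": "jolt",
                                              "base_hz": None},
    }

    def haptic_for_grasp(grasp_type, pressure):
        """Look up the effect for a grasp and scale it with pressure."""
        entry = HAPTIC_LIBRARY.get(grasp_type)
        if entry is None:
            return None
        # Firmer grip -> stronger effect; pressure assumed in [0, 1].
        intensity = min(1.0, 0.4 + 0.6 * pressure)
        rate = (entry["base_hz"] * (1.0 + pressure)
                if entry["base_hz"] else None)
        return {"effect": entry["effect"], "intensity": intensity,
                "rate_hz": rate}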
[0056] In another embodiment, a handheld device 10 outputs a haptic
effect based on a detected grasping contact and the current GUI
being displayed and/or the operation being performed by the user of
the handheld device 10. For example, a user grasping a handheld
device 10 in a vertically-oriented left-handed grasp may scroll a
list displayed by the handheld device 10 by flinging the list with
his/her left thumb on the touchscreen, and may simultaneously
manipulate the speed of the scrolling by increasing or decreasing
the magnitude of pressure applied to the handheld device 10 by the
grasping contact. During the scrolling of the list, the handheld
device 10
varies the intensity of the output of haptic effects associated
with list scrolling and/or items in the list based on the magnitude
of pressure applied by the grasping contact.
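One way to sketch this modulation, assuming a normalized grasp
pressure and an invented per-item "tick" effect, is a simple linear
mapping; the function and constants below are hypothetical.

    # Hypothetical sketch: per-item haptic tick intensity during list
    # scrolling tracks the pressure of the sustained grasp.

    def scroll_tick_intensity(grasp_pressure, base=0.3, span=0.7):
        """Map grasp pressure in [0, 1] to per-item tick intensity."""
        clamped = max(0.0, min(1.0, grasp_pressure))
        return base + span * clamped

    # Example: a firmer grasp yields a stronger tick per list item.
    for p in (0.1, 0.5, 0.9):
        print(f"pressure {p:.1f} -> tick {scroll_tick_intensity(p):.2f}")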
[0057] The embodiments related to haptics described above are
exemplary and do not limit the scope of the present disclosure in
any way. The present disclosure contemplates numerous variations of
determining and outputting haptic effects based on the nature,
variation, and intensity of grasping contacts applied to a handheld
device in relation to any number of applications or functionality
including all of the variations that one having ordinary skill in
the art would recognize as useful in the process of designing user
interfaces for handheld devices.
General
[0058] While the methods and systems herein are described in terms
of software executing on various machines, the methods and systems
may also be implemented as specifically-configured hardware, such
as a field-programmable gate array (FPGA) specifically configured
to execute the various methods. For example, embodiments can be
implemented in digital electronic circuitry, or in computer
hardware, firmware, software, or in a combination thereof. In one
embodiment, a
device may comprise a processor or processors. The processor
comprises a computer-readable medium, such as a random access
memory (RAM) coupled to the processor. The processor executes
computer-executable program instructions stored in memory, such as
executing one or more computer programs for editing an image. Such
processors may comprise a microprocessor, a digital signal
processor (DSP), an application-specific integrated circuit (ASIC),
field-programmable gate arrays (FPGAs), and state machines. Such
processors may further comprise programmable electronic devices
such as PLCs, programmable interrupt controllers (PICs),
programmable logic devices (PLDs), programmable read-only memories
(PROMs), electronically programmable read-only memories (EPROMs or
EEPROMs), or other similar devices.
[0059] Such processors may comprise, or may be in communication
with, media, for example computer-readable media, that may store
instructions that, when executed by the processor, can cause the
processor to perform the steps described herein as carried out, or
assisted, by a processor. Embodiments of computer-readable media
may comprise, but are not limited to, an electronic, optical,
magnetic, or other storage device capable of providing a processor,
such as the processor in a web server, with computer-readable
instructions. Other examples of media comprise, but are not limited
to, a floppy disk, CD-ROM, magnetic disk, memory chip, ROM, RAM,
ASIC, configured processor, all optical media, all magnetic tape or
other magnetic media, or any other medium from which a computer
processor can read. The processor, and the processing, described
may be in one or more structures, and may be dispersed through one
or more structures. The processor may comprise code for carrying
out one or more of the methods (or parts of methods) described
herein.
[0060] The foregoing description of some embodiments of the
invention has been presented only for the purpose of illustration
and description and is not intended to be exhaustive or to limit
the invention to the precise forms disclosed. Numerous
modifications and adaptations thereof will be apparent to those
skilled in the art without departing from the spirit and scope of
the invention.
[0061] Reference herein to "one embodiment" or "an embodiment"
means that a particular feature, structure, operation, or other
characteristic described in connection with the embodiment may be
included in at least one implementation of the invention. The
invention is not restricted to the particular embodiments described
as such. The appearance of the phrase "in one embodiment" or "in an
embodiment" in various places in the specification does not
necessarily refer to the same embodiment. Any particular feature,
structure, operation, or other characteristic described in this
specification in relation to "one embodiment" may be combined with
other features, structures, operations, or other characteristics
described in respect of any other embodiment.
* * * * *