U.S. patent application number 15/633636 was filed with the patent office on 2017-06-26 and published on 2017-10-26 as publication number 20170308272 for virtual reality applications.
This patent application is currently assigned to EMPIRE TECHNOLOGY DEVELOPMENT LLC. The applicant listed for this patent is EMPIRE TECHNOLOGY DEVELOPMENT LLC. Invention is credited to Roy Levien and Mark Malamud.
Application Number | 15/633636 |
Publication Number | 20170308272 |
Document ID | / |
Family ID | 50149152 |
Publication Date | 2017-10-26 |
Filed Date | 2017-06-26 |
United States Patent Application | 20170308272 |
Kind Code | A1 |
Levien; Roy; et al. | October 26, 2017 |
VIRTUAL REALITY APPLICATIONS
Abstract
Augmented reality technology is described. The technology can
detect objects in a scene, identify one or more installed or
available applications based on the detected objects, and place
icons representing the identified applications proximate to the
detected objects in a display of the scene, e.g., so that a user
can start or install the identified applications. The technology
can also facilitate interaction with an identified object, e.g., to
remotely control a recognized object.
Inventors: | Levien; Roy; (Lexington, ME); Malamud; Mark; (Seattle, WA) |
Applicant: |
Name | City | State | Country | Type |
EMPIRE TECHNOLOGY DEVELOPMENT LLC | Wilmington | DE | US | |
Assignee: | EMPIRE TECHNOLOGY DEVELOPMENT LLC, Wilmington, DE |
Family ID: | 50149152 |
Appl. No.: | 15/633636 |
Filed: | June 26, 2017 |
Related U.S. Patent Documents

Application Number | Filing Date | Patent Number |
13821560 | Mar 7, 2013 | 9690457 |
PCT/US12/52320 | Aug 24, 2012 | |
15633636 | | |
Current U.S. Class: | 1/1 |
Current CPC Class: | G06F 8/61 20130101; G06T 11/00 20130101; G06F 3/04817 20130101; G06F 3/0482 20130101; G06F 3/04842 20130101 |
International Class: | G06F 3/0481 20130101 G06F003/0481 |
Claims
1. A method performed by a processor, the method comprising:
detecting at least one object of multiple objects in a scene;
identifying, based on the detected at least one object, one or more
applications yet to be installed at a mobile computation device;
and placing one or more icons proximate to the detected at least
one object in a display of the scene, wherein the one or more icons
represent the identified one or more applications.
2. The method of claim 1, wherein detecting the at least one object
includes employing a digital camera.
3. The method of claim 1, wherein the scene is displayed on an
output device.
4. The method of claim 1, further comprising: receiving contextual
information, wherein identifying the one or more applications
includes identifying the one or more applications further based on
the received contextual information.
5. The method of claim 4, wherein receiving the contextual
information comprises receiving the contextual information via an
input from a user.
6. The method of claim 4, wherein the received contextual
information is based on positional information.
7. The method of claim 1, wherein detecting the at least one object
includes employing one or more image recognition methods.
8. The method of claim 1, wherein identifying the one or more
applications includes searching a list of attributes, associated
with a plurality of applications, for an attribute associated with
the detected at least one object.
9. The method of claim 1, further comprising: receiving a user
selection of an icon of the one or more icons; and installing an
application associated with the user selected icon at the mobile
computation device.
10. The method of claim 1, further comprising: receiving an input
to disassociate an application, of the identified one or more
applications, from the detected at least one object.
11. The method of claim 1, further comprising: adapting the one or
more icons based on the detected at least one object, prior to
placing the one or more icons proximate to the detected at least
one object in the display of the scene.
12. A computer-readable storage device that stores instructions
that, in response to execution by a processor, cause the processor
to perform or control performance of operations to: detect at least
one object of multiple objects in a scene; identify, based on the
detected at least one object, one or more applications yet to be
installed at a mobile computation device; and place one or more
icons proximate to the detected at least one object in a display of
the scene, wherein the one or more icons represent the identified
one or more applications.
13. The computer-readable storage device of claim 12, wherein the
operation to detect the at least one object includes an operation
to employ a digital camera.
14. The computer-readable storage device of claim 12, wherein the
scene is displayed on an output device.
15. The computer-readable storage device of claim 12, wherein the
stored instructions, in response to execution by a computer, cause
the computer to perform or control performance of at least one
operation to: obtain contextual information, wherein the
identification of the one or more applications includes
identification of the one or more applications further based on the
obtained contextual information.
16. The computer-readable storage device of claim 15, wherein the
contextual information is received via an input from a user.
17. The computer-readable storage device of claim 15, wherein the
contextual information is based on positional information.
18. A system, comprising: a first component configured to detect at
least one object of multiple objects in a scene; a second component
operatively coupled to the first component, wherein the second
component is configured to identify, based on the detected at least
one object, one or more applications yet to be installed at a
mobile computation device; and a third component operatively
coupled to the second component, wherein the third component is
configured to place one or more icons proximate to the detected at
least one object in a display of the scene, wherein the one or more
icons represent the identified one or more applications.
19. The system of claim 18, wherein the first component is
configured to detect the at least one object by use of a digital
camera.
20. The system of claim 18, wherein the scene is displayed on an
output device.
21. The system of claim 18, further comprising: another component
operatively coupled to the second component, wherein the another
component is configured to receive contextual information, and
wherein the second component is configured to identify the one or
more applications further based on the contextual information
received by the another component.
22. The system of claim 18, wherein the first component is
configured to detect the at least one object by use of one or more
image recognition methods.
23. The system of claim 18, wherein the second component is
configured to identify the one or more applications by search of a
list of attributes, associated with a plurality of applications,
for an attribute associated with the detected at least one
object.
24. The system of claim 18, wherein placement of the one or more
icons proximate to the detected at least one object comprises the
one or more icons being stacked proximate to the detected at least
one object.
25. The system of claim 18, wherein the second component is
configured to use features provided by the one or more applications
to identify the one or more applications.
Description
CROSS-REFERENCE TO RELATED APPLICATIONS
[0001] This application is a divisional application under 35 U.S.C.
§ 121 of and claims priority under 35 U.S.C. § 120 to U.S.
application Ser. No. 13/821,560, filed on Mar. 7, 2013, entitled
"VIRTUAL REALITY APPLICATIONS," which in turn is a U.S. National
Stage filing under 35 U.S.C. § 371 of International Application
No. PCT/US2012/052320, filed on Aug. 24, 2012 and entitled "VIRTUAL
REALITY APPLICATIONS." U.S. application Ser. No. 13/821,560 and
International Application No. PCT/US12/52320, including any
appendices or attachments thereof, are hereby incorporated by
reference in their entirety.
BACKGROUND
[0002] The number of mobile computing devices in use has increased
dramatically over the last decade and continues to increase.
Examples of mobile computing devices are mobile telephones, digital
cameras, and global positioning system ("GPS") receivers. According
to one study, 60% of the world's population has access to mobile
telephones. An increasing number of people use digital cameras and
some manufacturers of digital cameras presently have revenues of
tens of billions of United States dollars annually. Digital cameras
are used to capture, store, and share images. Often, the images can
be viewed nearly immediately after they are captured, such as on a
display device associated with the digital cameras. Once an image
is captured, it can be processed by computing devices. Image
recognition is one such process that can be used to recognize and
identify objects in an image. For example, image recognition
techniques can determine whether an image contains a human face, a
particular object or shape, etc.
[0003] Augmented reality is a view of a physical, real-world
environment that is enhanced by computing devices to digitally
augment visual or auditory information a user observes in the real
world. As an example, an augmented reality system can receive scene
information from a digital camera and a GPS, identify objects
(e.g., people, animals, structures, etc.) in the scene, and provide
additional information relating to the identified objects. A user
of such a system can take a photo of a scene using a mobile
computing device (e.g., a digital camera, a cellular phone, a
"smartphone," etc.) and automatically receive information about one
or more objects an augmented reality system recognizes in the
photographed (i.e., digitized) scene.
[0004] There are now hundreds of thousands of applications
available for mobile devices. Users can download and install
applications ("apps") that are interesting or useful to them.
However, finding such applications can be challenging.
BRIEF DESCRIPTION OF THE DRAWINGS
[0005] FIG. 1 is a flow diagram illustrating a routine invoked by
the disclosed technology in various embodiments.
[0006] FIG. 2 is an environmental diagram illustrating use of the
disclosed technology in various embodiments.
[0007] FIG. 3 is a block diagram illustrating components employed
by the disclosed technology in various embodiments.
[0008] FIG. 4 is a flow diagram illustrating a routine invoked by
the disclosed technology in various embodiments.
[0009] FIG. 5 is a flow diagram illustrating a routine invoked by
the disclosed technology in various embodiments.
[0010] FIGS. 6A and 6B are environmental diagrams illustrating use
of the disclosed technology in various embodiments.
[0011] FIG. 7 is a flow diagram illustrating a routine invoked by
the disclosed technology in various embodiments.
[0012] FIG. 8 is a block diagram of an illustrative embodiment of a
computing device that is arranged in accordance with at least some
embodiments of the present disclosure.
DETAILED DESCRIPTION
[0013] In the following detailed description, reference is made to
the accompanying drawings, which form a part hereof. In the
drawings, similar symbols typically identify similar components,
unless context dictates otherwise. The illustrative embodiments
described in the detailed description, drawings, and claims are not
meant to be limiting. Other embodiments may be utilized, and other
changes may be made, without departing from the scope of the
subject matter presented herein. It will be readily understood that
the aspects of the present disclosure, as generally described
herein, and illustrated in the Figures, can be arranged,
substituted, combined, separated, and designed in a wide variety of
different configurations, all of which are explicitly contemplated
herein.
[0014] Augmented reality technology ("the technology") is
described. In various embodiments, the technology detects objects
in a scene, identifies one or more installed applications based on
at least one detected object, and displays an icon representing the
identified one or more applications, e.g., proximate to the
detected object(s) in a display of the scene. The technology can
use various techniques for object recognition, e.g., image
recognition, pattern recognition, etc. When a user selects a
displayed icon, an application corresponding to the selected icon
can start. In various embodiments, the technology can instead (or
additionally) identify available but not-yet-installed applications
based on at least one detected object. When the user selects a
displayed icon, an application corresponding to the selected icon
can be installed and optionally started. Thus, the technology
enables users to quickly identify applications that may be
pertinent to the context or milieu in which they find themselves.
[0015] The technology can employ a digital camera configured for
use with a mobile device the user is employing to digitize a scene.
The mobile device can also process contextual information, e.g.,
GPS coordinates. Some applications may correspond to contextual
information. As an example, when the scene includes a particular
restaurant, an identified application can be an application
corresponding to the restaurant. If multiple applications
correspond to an object (e.g., a restaurant, store, or other
establishment or object), the technology may identify applications
suitable for the current location (e.g., GPS coordinates).
Alternatively, a user can specify the contextual information to
use. As an example, the technology may identify applications for
establishments that are open at the current time, but the user may
be interested only in applications corresponding to establishments
that are open for dinner later in the day.
[0016] The technology can identify applications, e.g., by matching
attributes corresponding to installed (or available) applications
to the present context or milieu, e.g., based on attributes of
matched objects. The attributes can be stored locally on a mobile
device or at a server. A user can also associate an application
with, or disassociate it from, recognized objects, e.g., so that a
particular application's icon appears, or is removed, the next time
the object is in a digitized scene.
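As a rough sketch of the attribute matching described above, the
following Python pairs detected-object tags with per-application
attribute sets. The record type and function names (AppRecord,
find_matching_apps) and the data model are illustrative assumptions,
not structures defined by this application.

    from dataclasses import dataclass, field

    @dataclass
    class AppRecord:
        """Hypothetical record pairing an application with attributes it matches."""
        name: str
        installed: bool
        attributes: set = field(default_factory=set)  # e.g., {"restaurant"}

    def find_matching_apps(object_tags, catalog):
        """Return applications whose stored attributes overlap the object's tags."""
        return [app for app in catalog if app.attributes & set(object_tags)]

    catalog = [
        AppRecord("TV Listings", installed=True, attributes={"television"}),
        AppRecord("Menu & Reservations", installed=False, attributes={"restaurant"}),
    ]
    print([a.name for a in find_matching_apps({"restaurant"}, catalog)])
    # -> ['Menu & Reservations']

In practice the catalog could live on the device, at a server, or
both, as the paragraph above notes.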
[0017] In various embodiments, the technology can also identify
applications based on stored application "preferences." These
preferences may be indicated by application developers,
organizations, etc. As an example, when a user is in a particular
geographic area, a "sponsored application" may be identified by the
technology.
[0018] When multiple application icons are identified, the
technology may use various techniques to alter the user interface.
As examples, the icons may be stacked; some icons may appear before
others in the stack; some icons (e.g., for sponsored applications)
may be larger than other icons; etc.
[0019] The technology can also adapt the icons for applications,
e.g., so that the icons are representative of underlying
information. As an example, a restaurant review application's icon
may be identified for many restaurants, and the icon may change to
indicate a review for the recognized restaurant.
[0020] In various embodiments, the technology can detect objects in
a scene, associate the detected objects with methods for
interacting with them, obtain a specification for interacting with
the detected objects using the associated methods, and provide a
user interface for controlling the detected objects.
As an example, when an audiovisual device (e.g., television, DVD
player, etc.) is detected in a scene, the technology can
communicate with the detected device (e.g., using WiFi,
radiofrequency, infrared, or other communications means) and obtain
a specification for interacting with the device. The specification
can provide information, e.g., available commands, how the commands
are to be sent, the format for the commands, etc. The specification
can also provide information about user interface elements. Upon
receiving the specification, the technology can provide a user
interface that a user can use to control the device. When the user
interacts via the user interface, the technology can transmit
commands to the device. In various embodiments, the technology can
communicate with the detected objects by employing a radiofrequency
identification tag, wireless network, infrared signal, etc.
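The application leaves the format of such a specification open.
Purely as an illustration, the sketch below models it as a small JSON
document enumerating commands and user-interface hints; the field
names and the stubbed transport are assumptions, not a format defined
here.

    import json

    # Hypothetical control specification a detected television might return.
    SPEC = json.loads("""
    {
      "device": "television",
      "transport": "wifi",
      "commands": {
        "power":  {"ui": "toggle", "payload": "PWR"},
        "volume": {"ui": "slider", "payload": "VOL {level}", "range": [0, 100]}
      }
    }
    """)

    def send_command(spec, name, **params):
        """Format a command per the specification; transmission is stubbed out."""
        command = spec["commands"][name]
        payload = command["payload"].format(**params)
        print(f"would send over {spec['transport']}: {payload}")

    send_command(SPEC, "volume", level=40)  # would send over wifi: VOL 40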
[0021] In various embodiments, the technology includes a component
configured for use with a device that receives a signal from a
computing device, provides an identification of one or more methods
operable to control the device, receives a command from the
computing device wherein the command was identified in the one or
more methods, and controls the device according to the received
command. The command can be to control media (e.g., play, stop,
pause, rewind, fast forward, etc.), control a power circuit (e.g.,
turn on/off), etc. The component may also provide a specification
for the one or more methods, e.g., a hint for providing a user
interface component.
[0022] Turning now to the figures, FIG. 1 is a flow diagram
illustrating a routine 100 invoked by the disclosed technology in
various embodiments. The routine 100 begins at block 102. The
routine 100 then continues at block 104, where it receives a
digitized version of a scene. The routine 100 then continues at
block 106, where it detects objects in the scene. In various
embodiments, the routine 100 may employ various image recognition
techniques to recognize objects. The routine 100 then continues at
block 108 where it receives contextual information. Examples of
contextual information are location information (GPS coordinates,
street address, city, etc.), time of day, etc. The routine 100 then
continues at block 110 where it identifies applications based on
the detected objects. As an example, when the technology recognizes
a television, the technology may indicate an application that
provides current television listings. As another example, when the
technology recognizes a restaurant, the technology may identify an
application that is associated with the restaurant, e.g., to
provide menus, reserve seats, etc. The routine 100 then continues
at block 112 where it identifies objects based on contextual
information. As an example, if two restaurants are identified in
the scene and one of them is open only for lunch and dinner, the
technology may identify only the other restaurant, which serves
breakfast, when the present time falls within what would normally
be considered breakfast hours. The routine 100 then continues at block
114, where it places an icon representing identified applications
near detected objects. The routine 100 then returns at block
116.
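A minimal sketch of how the flow of routine 100 might look in code,
with the recognizer and the display call stubbed out; keying the app
catalog by object label and using opening hours as the contextual
filter are assumptions made for the example.

    from dataclasses import dataclass

    @dataclass
    class DetectedObject:
        label: str
        bounds: tuple  # (x, y, w, h) in display coordinates

    @dataclass
    class App:
        name: str
        open_hours: range  # hours of day during which the app is relevant

    def routine_100(scene, hour, catalog, recognize, place_icon):
        """Blocks 104-114: detect objects, filter by context, place icons."""
        for obj in recognize(scene):                  # blocks 104-106
            for app in catalog.get(obj.label, []):    # block 110
                if hour in app.open_hours:            # block 112: contextual filter
                    place_icon(app.name, obj.bounds)  # block 114

    # Stub recognizer and display call for demonstration.
    recognize = lambda scene: [DetectedObject("restaurant", (40, 80, 120, 60))]
    catalog = {"restaurant": [App("Breakfast Cafe", range(6, 11)),
                              App("Dinner Bistro", range(17, 23))]}
    routine_100("scene.jpg", hour=8, catalog=catalog, recognize=recognize,
                place_icon=lambda name, b: print(f"icon '{name}' near {b}"))
    # -> icon 'Breakfast Cafe' near (40, 80, 120, 60)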
[0023] Those skilled in the art will appreciate that the logic
illustrated in FIG. 1 and described above, and in each of the flow
diagrams discussed below, may be altered in a variety of ways. For
example, the order of the logic may be rearranged, substeps may be
performed in parallel, illustrated logic may be omitted, other
logic may be included, etc.
[0024] FIG. 2 is an environmental diagram illustrating use of the
disclosed technology in various embodiments. A scene 200 includes
three objects: a first object 202, a second object 204, and a third
object 206. A display of a mobile computing device 208 displays
digitized representations of the objects as a digitized
representation of the first object 210, a digitized representation
of the second object 212, and a digitized representation of the
third object 214. The digitized representation of the first object
210 is associated with a first icon 216A and a second icon 216B.
The digitized representation of the second object 212 is associated
with a third icon 218. As described above, the icons can represent
installed applications or available applications. When a user
selects an icon, e.g., by touching an area near the icon on a
touchscreen of the mobile computing device, the technology may
launch the indicated application (if already installed) or install
the indicated application. In some embodiments, the technology may
automatically launch applications that are installed.
[0025] FIG. 3 is a block diagram illustrating components employed
by the disclosed technology in various embodiments. The components
300 can include a digitizer 302, a recognizer 304, application
attributes 306, and an identifier 308. In various embodiments,
additional components (not illustrated) or a subset of the
illustrated components 300 can be employed without deviating from
the scope of the claimed technology. The digitizer component 302
can digitize a scene, e.g., a scene received via an image capture
device (not illustrated). The recognizer component 304 can
recognize objects in a digitized scene. In various embodiments, the
recognizer component can use various image recognition techniques
to recognize objects in the digitized scene. The identifier
component 308 can identify installed applications to be associated
with recognized objects, e.g., using stored application attributes
306. The attributes can indicate, e.g., information about objects
with which applications are associated, time of day, location, etc.
The identifier component 308 can also employ a server computing
device (not illustrated), e.g., to identify applications that are
not presently installed but may be associated with recognized
objects.
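One possible decomposition along the lines of FIG. 3, with every
stage stubbed. The class names mirror the component names in the
figure, but the interfaces are invented for illustration.

    class Digitizer:
        """Component 302: wraps an image-capture source (stubbed passthrough)."""
        def digitize(self, raw):
            return raw

    class Recognizer:
        """Component 304: a real system would run image recognition here."""
        def recognize(self, image):
            return ["television"]  # placeholder labels

    class Identifier:
        """Components 306/308: map recognized objects to apps via attributes."""
        def __init__(self, attributes):
            self.attributes = attributes  # label -> list of app names
        def identify(self, labels):
            return {label: self.attributes.get(label, []) for label in labels}

    identifier = Identifier({"television": ["TV Listings"]})
    labels = Recognizer().recognize(Digitizer().digitize("frame"))
    print(identifier.identify(labels))  # {'television': ['TV Listings']}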
[0026] FIG. 4 is a flow diagram illustrating a routine 400 invoked
by the disclosed technology in various embodiments. The routine 400
begins at block 402. The routine 400 continues at block 404, where
it receives a digitized version of a scene. The routine 400 then
continues at block 406 where it detects objects in the digitized
scene. The routine 400 can employ various image recognition
techniques to recognize objects. The routine 400 then continues at
block 408 where it identifies applications that are not stored
locally, e.g., on a mobile computing device on which the routine
executes. The routine 400 then continues at block 410 where it
places an icon for an application near a recognized object. As an
example, if the routine 400 recognizes a coffee shop in a digitized
scene and the user's mobile computing device does not have an
application installed that corresponds to the recognized coffee
shop, the routine 400 may place an icon for an application
corresponding to the recognized coffee shop that, if selected,
causes the application to be installed. The routine 400 then
returns at block 412.
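A sketch of the install-oriented flow of routine 400, under the
assumption that a remote store lookup reports whether each matching
application is already installed; all three callables are stubs.

    def routine_400(scene, recognize, store_lookup, place_install_icon):
        """Blocks 404-410: surface install icons for apps not yet on the device."""
        for obj in recognize(scene):                  # blocks 404-406
            for app in store_lookup(obj):             # block 408: remote catalog query
                if not app["installed"]:
                    # block 410: selecting this icon would trigger installation
                    place_install_icon(app["name"], obj)

    routine_400(
        "scene.jpg",
        recognize=lambda s: ["coffee shop"],
        store_lookup=lambda o: [{"name": "Coffee Rewards", "installed": False}],
        place_install_icon=lambda name, o: print(f"install icon '{name}' near {o}"),
    )
    # -> install icon 'Coffee Rewards' near coffee shop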
[0027] FIG. 5 is a flow diagram illustrating a routine 500 invoked
by the disclosed technology in various embodiments. The routine 500
begins at block 502. The routine then continues at block 504, where
it detects objects in a scene. The routine 500 then continues at
block 506, where it selects a first object from the recognized
objects. The routine 500 then continues at decision block 508,
where it determines whether the selected object can be interacted
with. The routine 500 may make this determination, e.g., by
querying a database (e.g., a local database stored at the computing
device invoking the routine 500 or a remote database stored at a
server computing device). If the selected object can be interacted
with, the routine 500 continues at block 510. Otherwise, the
routine 500 continues at block 514. At block 510, the routine 500
obtains specifications for interacting with the selected object. In
various embodiments, the routine 500 may obtain the specifications
from a locally stored database, from the object directly (e.g.,
wirelessly), or from a remote computing device. The routine 500 may
then provide a user interface to a user so that the user can
interact with the selected object. As an example, if the recognized
object is a television or other audiovisual device, the routine 500
may provide a user interface that enables the user to control the
audiovisual device. The received specifications can include
instructions for providing aspects of the user interface. The
routine 500 then continues at block 514, where it selects a next
object from the set of objects detected above in relation to block
504. The routine 500 then continues at decision block 516, where it
determines whether a next object was selected. If there are no more
objects to be selected, the routine 500 returns at block 518.
Otherwise, the routine 500 continues at decision block 508 to
analyze the selected object.
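The per-object loop of routine 500 might be sketched as follows; the
interaction database is reduced to a set-membership test and the
specification fetch is stubbed, both simplifications made for the
example.

    def routine_500(objects, interaction_db, fetch_spec, show_ui):
        """Blocks 504-516: check each object, fetch its spec, present a UI."""
        for obj in objects:               # blocks 506/514/516: iteration
            if obj not in interaction_db: # decision block 508
                continue
            spec = fetch_spec(obj)        # block 510: local DB, device, or server
            show_ui(obj, spec)

    routine_500(
        objects=["television", "plant"],
        interaction_db={"television"},
        fetch_spec=lambda o: {"commands": ["power", "volume"]},
        show_ui=lambda o, spec: print(f"UI for {o}: {spec['commands']}"),
    )
    # -> UI for television: ['power', 'volume']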
[0028] FIGS. 6A and 6B are environmental diagrams illustrating use
of the disclosed technology in various embodiments. FIG. 6A
includes a scene 600 and a digitized version of the scene 600
displayed at a mobile computing device 606. The scene 600 includes
a television 602 and another object 604. The digitized version of
the scene 600 displayed at the mobile computing device 606 includes
a digitized representation of the television 608 and a digitized
representation of the other object 610. The mobile computing device
606 also displays an icon 612 associated with the digitized
representation of the television 608. As an example, the technology
may have recognized the television 602 and identified an
application corresponding to the television 602 and represented by
the icon 612. When the user selects the icon 612, the technology
may launch the corresponding application (or install the
corresponding application). In various embodiments, the technology
may employ an antenna 614 associated with the mobile computing
device 606, e.g., to communicate with the television 602 or a
network computing device (not illustrated) to receive
specifications relating to controlling the television 602. In
various embodiments, the mobile computing device 606 may
communicate with the television using infrared, radio frequency,
WiFi, etc. FIG. 6B illustrates a user interface 620 displayed by
the mobile computing device 606, e.g., when the user launches the
application by selecting icon 612.
[0029] FIG. 7 is a flow diagram illustrating a routine 700 invoked
by the disclosed technology in various embodiments. The routine 700
begins at block 702. The routine 700 then continues at block 704,
where it receives a signal. In various embodiments, the routine 700
can receive a signal from a mobile computing device that a user is
operating to command a device on which the routine 700 executes.
The routine 700 then continues at block 706, where it provides methods
operable to control the device. As an example, the routine 700 may
provide a specification for controlling the device. The
specification can include indications of user interfaces, available
commands, frequencies, etc. The routine 700 then continues at block
708, where it receives a command. In various embodiments, the
routine may receive commands from the mobile computing device to
which the routine 700 provided the specification. The routine 700
then continues at block 710, where it controls the device according
to the received command. The routine then returns at block 712.
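Routine 700 runs on the controlled device rather than on the mobile
device. A minimal sketch, assuming a connection object that can
receive and send messages; FakeConnection and FakeTV are stand-ins
invented for the example.

    def routine_700(connection, device):
        """Blocks 704-710: answer a signal with a spec, then apply commands."""
        connection.receive()                     # block 704: signal from controller
        connection.send(device.specification())  # block 706: advertise methods
        command = connection.receive()           # block 708: command arrives
        device.apply(command)                    # block 710: control the device

    class FakeConnection:
        """Replays scripted inbound messages and prints outbound ones."""
        def __init__(self, inbound):
            self.inbound = iter(inbound)
        def receive(self):
            return next(self.inbound)
        def send(self, message):
            print("sent spec:", message)

    class FakeTV:
        def specification(self):
            return {"commands": ["play", "pause", "power"]}
        def apply(self, command):
            print("executing:", command)

    routine_700(FakeConnection(["hello", "pause"]), FakeTV())
    # sent spec: {'commands': ['play', 'pause', 'power']}
    # executing: pause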
[0030] FIG. 8 is a block diagram illustrating an example computing
device 800 that is arranged in accordance with at least some
embodiments of the present disclosure. In a very basic
configuration 802, computing device 800 typically includes one or
more processors 804 and a system memory 806. A memory bus 808 may
be used for communicating between processor 804 and system memory
806.
[0031] Depending on the desired configuration, processor 804 may be
of any type including but not limited to a microprocessor
(".mu.P"), a microcontroller (".mu.C"), a digital signal processor
("DSP"), or any combination thereof. Processor 804 may include one
or more levels of caching, such as a level one cache 810 and a
level two cache 812, a processor core 814, and registers 816. An
example processor core 814 may include an arithmetic logic unit
("ALU"), a floating point unit ("FPU"), a digital signal processing
core ("DSP core"), or any combination thereof. An example memory
controller 818 may also be used with processor 804, or in some
implementations memory controller 818 may be an internal part of
processor 804.
[0032] Depending on the desired configuration, system memory 806
may be of any type including but not limited to volatile memory
(such as RAM), non-volatile memory (such as ROM, flash memory,
etc.) or any combination thereof. System memory 806 may include an
operating system 820, one or more applications 822, and program
data 824. Application 822 may include an application identifier
component 826 that is arranged to identify applications
corresponding to a recognized object. Program data 824 may include
application attribute information 828, as is described herein. In
some embodiments, application 822 may be arranged to operate with
program data 824 on operating system 820 such that applications can
be identified. This described basic configuration 802 is
illustrated in FIG. 8 by those components within the inner dashed
line.
[0033] Computing device 800 may have additional features or
functionality, and additional interfaces to facilitate
communications between basic configuration 802 and any required
devices and interfaces. For example, a bus/interface controller 830
may be used to facilitate communications between basic
configuration 802 and one or more data storage devices 832 via a
storage interface bus 834. Data storage devices 832 may be
removable storage devices 836, non-removable storage devices 838,
or a combination thereof. Examples of removable storage and
non-removable storage devices include magnetic disk devices such as
flexible disk drives and hard-disk drives ("HDDs"), optical disk
drives such as compact disk ("CD") drives or digital versatile disk
("DVD") drives, solid state drives ("SSDs"), and tape drives to
name a few. Example computer storage media may include volatile and
non-volatile, removable and non-removable media implemented in any
method or technology for storage of information, such as computer
readable instructions, data structures, program modules, or other
data.
[0034] System memory 806, removable storage devices 836 and
non-removable storage devices 838 are examples of computer storage
media. Computer storage media includes, but is not limited to, RAM,
ROM, EEPROM, flash memory or other memory technology, CD-ROM,
digital versatile disks (DVDs) or other optical storage, magnetic
cassettes, magnetic tape, magnetic disk storage or other magnetic
storage devices, or any other medium which may be used to store the
desired information and which may be accessed by computing device
800. Any such computer storage media may be part of computing
device 800.
[0035] Computing device 800 may also include an interface bus 840
for facilitating communication from various interface devices
(e.g., output devices 842, peripheral interfaces 844, and
communication devices 846) to basic configuration 802 via
bus/interface controller 830. Example output devices 842 include a
graphics processing unit 848 and an audio processing unit 850,
which may be configured to communicate to various external devices
such as a display or speakers via one or more A/V ports 852.
Example peripheral interfaces 844 include a serial interface
controller 854 or a parallel interface controller 856, which may be
configured to communicate with external devices such as input
devices (e.g., keyboard, mouse, pen, voice input device, touch
input device, etc.) or other peripheral devices (e.g., printer,
scanner, etc.) via one or more I/O ports 858. An example
communication device 846 includes a network controller 860, which
may be arranged to facilitate communications with one or more other
computing devices 862 over a network communication link via one or
more communication ports 864.
[0036] The network communication link may be one example of a
communication media. Communication media may typically be embodied
by computer readable instructions, data structures, program
modules, or other data in a modulated data signal, such as a
carrier wave or other transport mechanism, and may include any
information delivery media. A "modulated data signal" may be a
signal that has one or more of its characteristics set or changed
in such a manner as to encode information in the signal. By way of
example, and not limitation, communication media may include wired
media such as a wired network or direct-wired connection, and
wireless media such as acoustic, radio frequency ("RF"), microwave,
infrared ("IR") and other wireless media. The term computer
readable media as used herein may include both storage media and
communication media.
[0037] Computing device 800 may be implemented as a portion of a
small-form factor portable (or mobile) electronic device such as a
cell phone, a personal data assistant ("PDA"), a personal media
player device, a wireless web-watch device, a personal headset
device, an application-specific device, or a hybrid device that
includes any of the above functions. Computing device 800 may also
be implemented as a personal computer including both laptop
computer and non-laptop computer configurations.
[0038] Although the subject matter has been described in language
specific to structural features and/or methodological acts, it is
to be understood that the subject matter defined in the appended
claims is not necessarily limited to the specific features or acts
described above. Rather, the specific features and acts described
above are disclosed as example forms of implementing the claims.
Accordingly, the invention is not limited except as by the appended
claims.
* * * * *