U.S. patent application number 12/698107 was published by the patent office on 2010-06-10 for a wireless augmented reality communication system.
Invention is credited to MARTIN AGAN, ANN DEVEREAUX, THOMAS JEDREY.
United States Patent Application 20100141554
Kind Code: A1
Application Number: 12/698107
Family ID: 36191169
Published: June 10, 2010
DEVEREAUX; ANN; et al.
WIRELESS AUGMENTED REALITY COMMUNICATION SYSTEM
Abstract
The system of the present invention is a highly integrated radio
communication system with a multimedia co-processor which allows
true two-way multimedia (video, audio, data) access as well as
real-time biomedical monitoring in a pager-sized portable access
unit. The system is integrated in a network structure including one
or more general purpose nodes for providing a wireless-to-wired
interface. The network architecture allows video, audio and data
(including biomedical data) streams to be connected directly to
external users and devices. The portable access units may also be
mated to various non-personal devices such as cameras or
environmental sensors for providing a method for setting up
wireless sensor nets from which reported data may be accessed
through the portable access unit. The reported data may
alternatively be automatically logged at a remote computer for
access and viewing through a portable access unit, including the
user's own.
Inventors: DEVEREAUX; ANN (TUJUNGA, CA); JEDREY; THOMAS (PASADENA, CA); AGAN; MARTIN (PASADENA, CA)
Correspondence Address: IVAN POSEY, ESQ.; DARBY & DARBY, P.C., 32nd FLOOR, 707 WILSHIRE BOULEVARD, LOS ANGELES, CA 90017, US
Family ID: 36191169
Appl. No.: 12/698107
Filed: February 1, 2010
Related U.S. Patent Documents

Application Number | Filing Date  | Patent Number | Continued in
11410517           | Apr 24, 2006 |               | 12698107
09483315           | Jan 14, 2000 | 7035897       | 11410517
60115993           | Jan 15, 1999 |               |
Current U.S. Class: 345/7; 455/66.1
Current CPC Class: A61B 5/0015 (20130101); H04L 67/12 (20130101); H04W 76/10 (20180201); H04L 65/1063 (20130101); G08C 17/02 (20130101); Y04S 40/18 (20180501); H04N 7/155 (20130101); H04L 65/1069 (20130101); G06T 19/006 (20130101); H04N 7/147 (20130101); H04L 67/04 (20130101); H04N 7/152 (20130101)
Class at Publication: 345/7; 455/66.1
International Class: G09G 5/00 (20060101) G09G005/00; H04B 7/00 (20060101) H04B007/00
Government Interests
GOVERNMENT LICENSE RIGHTS
[0002] The U.S. Government has certain rights in this invention
pursuant to NAS7-1407 awarded by NASA.
Claims
1-33. (canceled)
34. A method for providing wireless augmented reality through a
network, the method comprising: wirelessly accessing the network
from a portable access unit through a general purpose node coupled
in communication with the network, the general purpose node also
coupled in communication with a processor device; associating the
portable access unit with the processor device; capturing video of
a subject matter in the portable access unit; providing a signal
from the portable access unit to the processor device; responsive
to providing the signal, receiving information related to the
subject matter; and displaying the information on the portable
access unit.
35. The method of claim 34, wherein displaying the information
comprises displaying the information as an overlay to the subject
matter.
36. The method of claim 34, wherein displaying information
comprises displaying the information in a head set as an overlay to
the subject matter.
37. The method of claim 34, further comprising: sending commands to
the processor device from the portable access unit.
38. The method of claim 34, further comprising: presenting a list
of remote portable access units available for connection; and
establishing two-way communications with a remote access unit
through the general purpose node.
39. The method of claim 34, wherein receiving information comprises
receiving video information.
40. The method of claim 34, wherein receiving information comprises
receiving audio information.
41. The method of claim 34, wherein receiving information comprises
receiving text information.
42. A portable access unit for receiving information from a processor device through a network, the portable access unit comprising: a
communication device to wirelessly access the network through a
general purpose node coupled in communication with the network, the
general purpose node also coupled in communication with the
processor device, the general purpose node to associate the
portable access unit with the processor device; a camera, coupled
to the communication device, the camera to capture video of a
subject matter in the portable access unit; and a display device,
coupled to the communication device, the display device to output
information related to the subject matter, the information received from the processor device responsive to a signal provided by the portable access unit.
43. The portable access unit of claim 42, wherein the portable
access unit is a hand-held device.
44. The portable access unit of claim 42, wherein the display device displays the information as a heads-up display.
45. The portable access unit of claim 42, wherein the display
device displays the information as an overlay to the subject
matter.
46. The portable access unit of claim 42, wherein the display
device comprises a head set.
47. The portable access unit of claim 42, further comprising: a
codec, coupled to the communication device, to the camera, and to
the display device, the codec to compress video received from the
camera and to decompress video received from the processor
device.
48. The portable access unit of claim 42, further comprising: a
microphone, coupled to the communication device, the microphone to
receive audio commands for the processor device.
49. The portable access unit of claim 42, wherein the communication
device comprises an IEEE 802.11-compatible communication
device.
50. The portable access unit of claim 42, wherein the information
comprises video information.
51. The portable access unit of claim 42, wherein the information
comprises audio information.
52. The portable access unit of claim 42, wherein the information
comprises text information.
53. A method for providing wireless augmented reality through a
network, the method comprising: wirelessly accessing a network from
a portable access unit through a general purpose node coupled in
communication with the network, the general purpose node also
coupled in communication with a plurality of processor devices;
presenting the plurality of processor devices available for
connection; associating the portable access unit with a processor
device from the plurality of processor devices; capturing video of
a subject matter in the portable access unit; providing a signal
from the portable access unit to the processor device; responsive
to providing the signal, receiving information related to the
subject matter; and displaying the information as a heads-up display overlaying the information relative to the subject matter.
Description
RELATED APPLICATIONS
[0001] This application is based on provisional patent application
Ser. No. 60/115,993 filed Jan. 15, 1999.
BACKGROUND OF THE INVENTION
[0003] 1. Field of the Invention
[0004] The invention is a wireless augmented reality system (WARS)
that leverages communications and multimedia information processing
microelectronics, along with displays, imaging sensors, biosensors,
and voice recognition to provide hands-free, tetherless, real-time
access and display of network resources, including video, audio and
data.
[0005] 2. Description of the Prior Art and Related Information
[0006] Online instruction manuals are becoming more prevalent in
the industrial and everyday environment. These electronic technical
manuals (ETM) may be interactive. Just as with printed manuals,
ETMs may become very difficult to use and maintain in these
environments where elements of an environment, such as dust,
chemical or general harshness may be detrimental to the electronics
and storage devices used to display and operate the ETM. Further,
it is not always possible for a worker who requires access to an ETM to stop work to consult the ETM.
[0007] These problems are multiplied in extraterrestrial
environments such as a space shuttle or a space station. During
intra- and extravehicular activities, it may be virtually
impossible to access a traditional keyboard and computer display to
access an ETM. For example, during a satellite repair mission, it
would not be practical for an astronaut in a bulky extravehicular
space suit to type commands on a keyboard to view a display in the
extreme environment of outer space where the sun glare may make
viewing impossible.
[0008] Hands-free portable computers have been implemented in an
attempt to solve some of these problems. For example, U.S. Pat.
Nos. 5,305,244 and 5,844,824 describe systems in which a head-up display and voice recognition are implemented in a portable computer for displaying ETMs. However, these systems, being a single
user-to-computer paradigm, do not allow multiple-user access to
multiple computers, multimedia devices or nodes on a network for
accessing arbitrarily-selected data channels. Further, these
previously-described systems are self-contained and their data
storage needs to be updated periodically to be sure that the latest
data is displayed. Further, these systems do not allow two-way
communication over local and wide area networks to other multimedia users and devices, and do not provide real-time
biomedical information about the physical condition of the
user.
[0009] There is thus a need for a wireless, wearable communications
system allowing two-way voice, video and data communication between
local users and to remote users and devices over network nodes,
along with tetherless real-time monitoring of the local user's
physical condition.
SUMMARY OF THE INVENTION
[0010] The system solves the above problems of prior art systems through an adaptive wireless remote access network comprised of small individual portable access units linked to a local cellular general
purpose node. Interlinked general purpose nodes support
communications across different habitat modules or
internal-to-extravehicular communications, in the case of the space
environment; terrestrial wired networks such as the Internet can
serve as the interconnection of remotely scattered access nodes in
an industrial, commercial or home environment application.
[0011] The system may provide shuttle and international space
station astronauts with tetherless, on-demand access to data
channels from multimedia devices such as cameras or audio sensors
associated with other persons or in a stand-alone configuration,
and multimedia or data display from a networked computer terminal
and to the equipment control capabilities which may be available
through that computer. Transparent to such access, the system can
maintain a data channel for monitoring an astronaut's health or
environment via in-situ sensors. Though this system may be used for
the shuttle and the international space station, the system has
uses in many possible applications related to medical, industrial,
and commercial areas.
[0012] The invention is a personal communications system designed
especially for the space shuttle or station environment to provide
cellular communications access throughout the vessel with video,
audio, data and computer connect services. A small, wearable portable access unit (PAU) communicates over a high-rate link to a centrally-located network access unit, called a general purpose
node herein. The system backbone provides 2-way video, 2-way audio,
and a multi-purpose data channel between the PAU and general
purpose node. One embodiment of the PAU used for personal
communication has an attached headset with video display, audio
feed and camera, which together may be used for audio or video
teleconferencing. When used as a virtual terminal to a computer in
the network, the user is able to view and manipulate imagery, text
or video, using voice commands to control the terminal
operations.
[0013] Using the system, an astronaut may efficiently operate and
monitor computer-controllable activities inside or outside the
vehicle or station. Hands-free access to computer-based instruction
texts, diagrams and checklists replaces juggling manuals and
clipboards, and tetherless computer system access allows free
motion throughout a cabin while monitoring and operating equipment.
Along with voice commands, an integrated "touchpad" on the PAU may
be used for remote computer control through a sensor data channel;
this return data channel may also be used for other control data as
from a three-D mouse or data glove input device, allowing the
real-time video display to be used for remote, wireless monitor and
control of robotic cameras or manipulators.
[0014] Concurrent with information provided to the astronaut, the
system also allows external observation of the astronaut's
situation; personal biological or other sensors can send back continuous telemetry through the personal access unit and general purpose node. A miniature camera integrated into the headset
provides real-time video of the wearer's field of view to remote
observers. In this way, for example, a principal investigator
located on Earth may consult with a payload specialist on the
operation or troubleshooting of their equipment.
[0015] The system provides wireless high-rate data exchange. The
radio link is adapted to operate within a high-interference,
high-multipath environment of a space shuttle or space station
module. Radio frequency (RF) links do not require visual
line-of-sight to operate, but the metal walls and lack of RF absorbers, combined with moving human bodies, create an enormous potential for destructive self-interference of the radio signals.
The integrated radio and multimedia data processing technology
provides for efficient and high-quality video and audio data
compression for noisy indoor communications channels. The system
supports multiple-user access for video, audio, and sensor data
services in the wireless coverage area. Potential applications of
the system are in any environment where heads-up, hands-free
information retrieval or multimedia communications access improves
efficiency including tetherless operations/monitor consoles, remote
consultations in medical or maintenance procedures, and
hazardous/confined space activities. There are also in-the-home
entertainment/communications applications.
[0016] As with the space extravehicular activity applications, bio-isolation suits have operating constraints similar to those of space suits. They are commonly worn where there are chemical or biological contaminants, and any extraneous materials brought into
a chamber, such as clipboards or documents, also present a
contamination risk. A unit suitably modified for use in a space
suit could be used in this situation. This allows the user to use a
computer (log data, use a check list, etc.), to communicate with
colleagues, including providing first-hand video of work in
progress, and to maintain constant monitoring of the health of the
user.
[0017] An extension of the medical applications areas would be in
remote telemedicine. Many medical diagnostic and treatment tools
are being made portable and rugged enough to be taken to remote
sites. Some examples are an ultrasound unit that is the size of a
backpack, an entire intensive care unit of equipment built into a
stretcher, and a trauma pod built into a cruise missile. For many
of these devices, CRT or LCD panels comprise a significant amount
of the bulk and weight of the devices. The system of the present
invention may provide a replacement for the CRT or LCD panel as
well as an interface to the control system of the device, while
providing communications access through an interface to the remote
site's existing communications equipment.
[0018] Industrial applications include use by inspection or
maintenance crews in remote or dangerous environments such as oil
refineries, drilling rigs, power plants, etc., where the personnel
can move around with their hands and peripheral vision free to
attend to their own safety and tasks. They would be in constant
contact with the information they needed, and any technical assistance could be given by individuals looking at the return video images from the user.
[0019] An example of a commercial application is for mission
control and other operations personnel who presently must sit at a
display console for hours at a time. These individuals could make
use of the system of the present invention to increase their
mobility and efficiency.
BRIEF DESCRIPTION OF THE DRAWINGS
[0020] FIG. 1 is a diagrammatic illustration of the components of
the system of the present invention;
[0021] FIG. 2 is a block diagram illustrating communications
components used by the personal access unit and general purpose
node of the system of FIG. 1; and
[0022] FIG. 3 is a flowchart illustrating a method performed using
the system of FIG. 1.
DETAILED DESCRIPTION OF PREFERRED EMBODIMENTS
[0023] With reference to FIG. 1, a diagram illustrating components
of the system of the present invention is shown. The system may
comprise small pager-like devices called portable access units 100.
The portable access units 100 can be accessorized with different "multimedia" interfaces for display, camera, audio and sensor operation. Another embodiment of the portable access unit 100a comprises a wearable headset and microphone assembly 102a.
[0024] The portable access units 100-100a interface directly through a wireless link with a network through a general purpose node 150. The general purpose node 150 allows wireless-to-wired communication with a local area network 170. The local area network 170
may be electrically connected to a wide area network or Internet
172 in order to connect to remote local area networks 174.
Alternatively, the general purpose node 150 may be directly
connected to the wide area network 172. The general purpose node
150 may thus act as a router for routing video, display, audio and
control data packets between the portable access units 100 and
other, or remote, portable access units 100 or remote media devices 125, 180, etc., connected to the networks 170-174. The connection
with a network 170-174 may occur directly in electrical connection
with one of the networks 170-174, or in wireless communication
through a remote general purpose node 150a that is electrically
connected to the network. The portable access units 100 may provide
communication to and from remote media devices comprising computers
180-182 running specialized client software or certain commercial
multimedia Internet software products such as video conferencing
products that adhere to the industry standard H.323 for multimedia
data transfer.
[0025] Each portable access unit 100-100a may dynamically associate
with the closest general purpose node 150-150a when it is logged on
to the network 170-174 or is connected thereto. Each general
purpose node 150-150a records the associations and registers each
portable access unit 100-100a on a list of connections associated
with the particular general purpose node 150-150a. The list of
connections is stored in a random access memory device included in
the general purpose node 150-150a. When a portable access unit 100
is logged off or disconnected from the network 170-174, it is
disassociated from the particular general purpose node 150-150a
that it was associated with, and thus, is removed from the list of
connections.
[0026] As shown on an example selection list screen 190 that may be
presented on a display 102 or headset 102a on any of the portable
access units 100-100a, the user can set up a video, audio, or data
link with any other portable access unit 100-100a or remote media
device 125, 180, etc., that is logged onto a network 170-174. The
headset 102a may comprise a heads-up display (120 in FIG. 2) inside
a headset embodying a transparent color LCD device. Using control
keys or voice commands, a user of the portable access unit 100-100a
may select a local or remote portable access unit 100-100a on a
selection list 190 of other portable access units 100-100a or media
devices 125, 180. The selection list 190 comprises a combination of
the lists of connections stored on all of the general purpose nodes
150-150a. Users may further access a nameserver located on the
access node 150 for locating remote unfamiliar portable access
units 100-100a or remote media devices.
[0027] By selecting entries from the selection list 190, users may
communicate with portable access units 100-100a or various media
devices such as cameras 125, internet phones 104, one or more
computers 180-182 located throughout the networks 170-174. A user
may further select from the list 190 user names representing users
of other portable access units 100 that are logged in and
associated with remote general purpose nodes 150a connected to the
networks 170-174.
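The association, registration, and combined selection-list behavior described in the paragraphs above can be sketched as follows. This is a minimal illustration only; the class and function names are assumptions for clarity, not identifiers from the patent.

```python
class GeneralPurposeNode:
    """Sketch of a general purpose node's connection registry,
    held in the node's memory as described above."""

    def __init__(self, node_id):
        self.node_id = node_id
        self.connections = []  # list of associated portable access units

    def register(self, unit_id):
        """Associate a portable access unit when it logs on."""
        if unit_id not in self.connections:
            self.connections.append(unit_id)

    def deregister(self, unit_id):
        """Disassociate a unit when it logs off or disconnects."""
        if unit_id in self.connections:
            self.connections.remove(unit_id)


def selection_list(nodes):
    """Combine the connection lists stored on all general purpose
    nodes into the selection list presented on a unit's display."""
    combined = []
    for node in nodes:
        combined.extend(node.connections)
    return combined
```

A unit logging onto its nearest node would then appear on the selection list of every other user, and drop off that list when it disconnects.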
[0028] With reference to FIG. 2, the components of the access node 150 and the wearable headset embodiment of the portable access unit 100a are shown. Elements of both the general purpose access node and the portable access unit 100a include a communications device 202.
Data processing functions are implemented in the form of an
audio/video coder/decoder (codec) pair 200, one codec 200
comprising part of the portable access unit 100a and the other
codec 200 being part of another portable access node 100a or remote
media device for which it is desired to exchange signals. At a
portable access node, the codec 200 controls a digital data stream
which is fed to the communications device 202, which is implemented
as an RF modem transceiver pair with an equivalent communications
device 202 located in the general purpose access node. The codecs
200 serve as the interfaces to the external elements (including
possibly the user display 102a and the sensor 104) on both sides of
the communication continuum comprising the communications device
202 of the general purpose node 150, an internal network interface
protocol circuit 152, the external networks 170-174 and the
electrical connection or general purpose access node connection to
the desired remote portable access node or media device. The
internal network interface protocol circuit 152 may comprise an
Ethernet chip, memory and a network protocol chip. With this
architecture, the system addresses the issues of multiple-access
and data channel quality, through the implementation of the
communications device 202. Multiple implementations of the communication device 202 in the general purpose node 150 allow multiple simultaneous communication links between the general purpose node 150 and a plurality of portable access units 100-100a.
[0029] With the base functionality of the communications device 202
and codec subsystem 200, the architecture provides flexibility in
utilization of different external components such as different
headset 102a configurations, sensor 104 packages, and network
interface 152 capabilities.
The communication device 202 is designed to operate in a high
multipath space station or terrestrial indoor environment while
being able to support multiple users at high, multimedia-type
bandwidths. Thus the communication device's 202 physical (PHY) and media access control (MAC) layers in combination support multiple access, dynamic network association, channel error rates of broadcast-video quality (1×10^-6), and data rates up to broadcast-quality video bandwidths (on the order of 768 kbps per user, one-way).
Modulation to achieve this performance will be differential
phase-shift keying, of binary or higher order (quadrature or
high-order quadrature amplitude modulation); the order chosen
reflects the necessary user data volume to be supported within
fixed, FCC-specified bandwidth allocations. Orthogonal frequency
division multiplexing, code division multiple access, and frequency
hopping/time division multiple access may be used for achieving
multiple access. Spread spectrum, channel equalization, antenna
diversity and retransmission techniques may also be used for
improving the reliability of the communications link. Through a
combination of these technologies, two-way multimedia channel
throughputs can be achieved for each of multiple simultaneous
users. A variety of RF frequencies may be used, but the determining
factor in frequency band selection is the availability in the band
of a relatively large amount of spectrum in the space station or
FCC terrestrial allocations, allowing the transmission of
compressed video. Bands in the 2.5 to 5.7 GHz range are preferable due to the FCC bandwidth available, the compactness of
RF elements required at these frequencies, and the potentially low
amount of interference that will be sustained. The RF front end of
both the portable access unit 100-100a and general purpose node
150-150a may be interchangeable with different frequency front ends
for system use in different frequency bands.
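As a minimal illustration of the differential phase-shift keying named above (binary order shown; the text also contemplates quadrature and higher-order modulation), the following sketch encodes each bit as a phase transition between successive symbols, so the receiver needs no absolute phase reference. The function names are illustrative assumptions, not part of the patent.

```python
import cmath
import math


def dbpsk_modulate(bits):
    """Differentially encode bits as BPSK phase transitions:
    a 1 flips the carrier phase by pi, a 0 keeps it unchanged."""
    phase = 0.0
    symbols = [cmath.exp(1j * phase)]  # reference symbol
    for b in bits:
        phase += math.pi * b
        symbols.append(cmath.exp(1j * phase))
    return symbols


def dbpsk_demodulate(symbols):
    """Recover bits from the phase difference between successive
    symbols; robust to any constant phase offset on the channel."""
    bits = []
    for prev, cur in zip(symbols, symbols[1:]):
        # Phase difference near pi -> 1, near 0 -> 0.
        diff = cur * prev.conjugate()
        bits.append(1 if diff.real < 0 else 0)
    return bits
```

Because only phase *differences* carry data, an unknown rotation of the whole constellation (as from multipath) does not corrupt the recovered bits, which is why differential schemes suit the high-multipath indoor environment described here.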
[0030] Low-rate, single user implementations of the communications
system may be effected through adapted commercial wireless-LAN type products following the IEEE 802.11 standard such as a
frequency-hopping 2.4 GHz wireless LAN transceiver by Waveaccess,
Inc. of Wellesley, Mass., or the direct-sequence spread-spectrum 2.4 GHz Prism wireless LAN chipset by Harris of Melbourne, Fla. These radio
implementations, as with commercial implementations of the
industry-proposed Bluetooth and HomeRF standards, will be limited in user access and overall throughput, however, and are therefore unsuitable for real-time video teleconferencing among multiple users.
The preferred embodiment for full capability is to implement the
communications devices' physical and media access control layers in
custom ASIC circuits, allowing support of all system capabilities within a microelectronics architecture for small size and low power draw, providing a pager-type form factor for the wearable personal access units 100-100a.
[0031] The communications device 202 comprises a buffer memory and
a radio frequency front end. Data modulation/demodulation circuits
and error detection/correction protocol circuits are further
included. Various combinations of these circuits may be obtained
from Proxim of Sunnyvale, Calif., Harris of Melbourne, Fla. and
Stanford Telecom of Stanford, Calif. Alternatively, all of the
various circuitry may be implemented with an application specific
integrated circuit (ASIC), or a combination of an ASIC and discrete
elements for size and weight efficiency.
[0032] Three classes of headsets 102a may be used: hi-resolution
military systems which are CRT based and may be provided by
Honeywell of Morristown, N.J., or Hughes Network Systems of San
Diego, Calif.; medium resolution industrial systems which are CRT
or LED based scanners and may be provided by Intervision of Santa
Clara, Calif.; or low to medium resolution entertainment systems
which are color viewfinder LCD based systems that may be supplied
by Virtual Vision of Redmond, Wash. (the V-CAP and E-GLASS), Sony
Europe of Hampshire, United Kingdom (GLASSTRON VISOR) or Olympus of
San Jose, Calif. Typical headset display 120 specifications for the
portable access unit 100a include the following:
[0033] Resolution: comparable at least to VGA (640×480) or better, up to 1280×1024 with an off-the-shelf display & I/O configuration
[0034] Display: >10 fL/day; display brightness ratio: >2; brightness range: 2 OOM max
[0035] FOV: 40-60 deg; gray scale: >12
[0036] Eye relief: 20-26 mm TSP; 14/10 mm (on/off-axis) exit pupil
[0037] Uniformity: 2:1 across 2/3 FOV; glare: <2.5% image content; pixel contrast: 25
[0038] Focus: hands off; Obs: % look-around; diopter range: ±2
[0039] Mag: 1±.5%; Cont: >95%; motion sensor: 10° cone; interocular adjustment: 52-72 mm
[0040] Image enhancement & IFF: weaponsight, motion sensor and computer interface
[0041] The audio/video codec 200 in a portable access unit 100-100a
or other client device is based around a single-chip, standards-based codec that accepts analog or digital audio and video (i.e., NTSC or VGA), compresses this input, and multiplexes the compressed data with an external data stream. The preferred
industry standards are: ITU H.263 based video, ITU G.722 based
audio, and ITU H.221 based multiplexing. The audio video codec 200
in the portable access unit 100-100a can establish a link with a
similar audio/video codec 200 associated with another portable
access unit 100-100a or a remote media device 104, 125, 180 or 182.
The codec 200 in the portable access unit 100a outputs the received and decompressed remote signals from the device with which the link was established. The interfaces between the codec 200 and the communication device 202, as well as between the communication devices 202 of the general purpose node 150-150a and the portable access unit 100-100a, operate two-way with a bandwidth suitable for transmitting video. Of this bandwidth, the audio
portion utilizes up to 64 kbps and the data from the sensor 104
utilizes the required amount for the type of sensor 104, with the
remainder allocated to compressed video. At data rates in excess of 128 kbps, the resulting video is at least of video-teleconferencing quality.
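The channel budget described above can be illustrated with a simple allocation calculation. The 768 kbps per-user link and the up-to-64 kbps audio portion come from the text; the default sensor rate below is an assumed example, since the text says only that the sensor takes the amount its type requires.

```python
def allocate_channel(total_kbps=768, audio_kbps=64, sensor_kbps=8):
    """Split the per-user one-way link budget among audio, sensor
    data, and compressed video. The audio portion uses at most
    64 kbps; whatever remains after audio and sensor data goes
    to compressed video."""
    if audio_kbps > 64:
        raise ValueError("audio portion uses at most 64 kbps")
    video_kbps = total_kbps - audio_kbps - sensor_kbps
    if video_kbps < 0:
        raise ValueError("audio and sensor rates exceed the link budget")
    return {"audio": audio_kbps, "sensor": sensor_kbps, "video": video_kbps}
```

With the default figures, roughly 696 kbps remains for compressed video, comfortably above the 128 kbps threshold the text associates with teleconferencing-quality video.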
[0042] The audio/video codec 200 portion of the portable access
unit 100-100a may further comprise video input and output ports,
audio input and output ports, data input and output ports, and the above-mentioned multimedia processor chip for packaging signals for data compression and decompression for transmission. Exemplary
multimedia processors include the VCPEX chip by 8×8, Inc. of Santa Clara, Calif. or digital signal processing chips by Texas
Instruments and others. The audio/video codec 200 further comprises a field-programmable gate array, electrically programmable read-only memory and random access memory for processing and packaging signals for compression and decompression.
[0043] The sensor 104 may comprise a commercially available pulse
oximeter sensor or other type of bio-sensor. A pulse-oximeter
sensor allows the measurement of pulse rate and oxygen saturation
of the blood. Data from the sensor 104 is transmitted to the
general purpose node 150-150a, and then to any remote media device connected to any of the networks 170-174. The sensor 104 may
comprise an "on body" wireless human performance and fatigue
monitoring system that communicates with a belt-mounted
transceiver/control module. The remote media device may comprise a
processor 180-182 for display or logging of the real-time sensor
signals.
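The continuous biosensor telemetry described above amounts to streaming small timestamped readings over the sensor data channel. The sketch below packs one pulse-oximeter reading into a compact binary frame; the frame layout itself is an assumed example, not a format specified in the patent.

```python
import struct
import time


def pack_biosensor_sample(pulse_bpm, spo2_percent, timestamp=None):
    """Pack one pulse-oximeter reading (pulse rate and blood oxygen
    saturation) into an 11-byte network-order frame:
    double timestamp, unsigned short pulse, unsigned char SpO2."""
    if timestamp is None:
        timestamp = time.time()
    return struct.pack("!dHB", timestamp, pulse_bpm, spo2_percent)


def unpack_biosensor_sample(frame):
    """Inverse of pack_biosensor_sample, as might run on a remote
    processor that displays or logs the real-time sensor signals."""
    timestamp, pulse_bpm, spo2 = struct.unpack("!dHB", frame)
    return {"timestamp": timestamp, "pulse_bpm": pulse_bpm,
            "spo2_percent": spo2}
```

At 11 bytes per sample, even many samples per second consume a negligible fraction of the data channel, which is consistent with the bandwidth allocation described earlier.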
[0044] The headset 102a comprises a heads-up display 120 comprising
a transparent color LCD device for video signals received and
processed by the codec 200. The headset 102a may further comprise,
or have attached thereto, an integrated microphone 122 for
receiving voice commands from the user of the portable access unit
100a or for communicating voice signals to a remote portable access
unit 100 or remote media device. The headset may further comprise a
speaker 124 or earpiece unit for presenting audio signals to the
user. The portable access unit 100a may further comprise a digital
camera 106 that may either be attached on the user's person, or to
the headset 102a for providing video signals to other portable
access units 100-100a or media devices.
[0045] With reference to FIG. 3, a flow diagram illustrating the
method performed by the system of FIG. 1 is shown. A user puts on the headset 102a and portable access unit 100a, step 400. The user may
log into the local general purpose node 150 wherein the portable
access unit associates with the general purpose node 150 such that
the user is added to a connection list stored in a random access
memory device residing in the general purpose node 150, step 401.
Data is provided from the general purpose node 150 to the portable
access unit through the communication devices 202, step 402. The
user is presented with a selection list 190 of portable access
units 100-100a and media devices logged onto the system on the
display 120, step 404. The user selects one of the entries from the
selection list, step 406. The selection is transmitted to the
general purpose node 150, step 408. The general purpose node 150
sets up a connection over the networks 170-174 for channeling data
between the portable access unit 100a and the selected network
device, step 410. The selected network device may comprise the
processor 180 or other network client 182 for running a software
application, a camera 125 for providing remote viewing operations
to the user on the display 120, the Internet phone 104 for providing voice communications with a remote user, or another
portable access unit 100-100a over a remote general purpose node
150a. By providing control commands through the microphone 122 or other input system, such as a keyboard or handheld mouse, the user may conduct operations by transmitting commands between the portable access unit 100a and the general purpose node 150, which routes the control commands to the device that the user selected, step 412.
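The FIG. 3 sequence above (associate, present the selection list, select, route) can be condensed into a short walk-through. The data shapes and names below are illustrative assumptions, not structures from the patent.

```python
def run_session(unit_id, node_state, choose):
    """Walk through the FIG. 3 method steps: associate the unit
    (step 401), build the selection list from connected units and
    media devices (steps 402-404), let the user select an entry
    (step 406), and have the node set up the route (steps 408-410).
    `choose` stands in for the user's control-key or voice input."""
    node_state["connections"].append(unit_id)                     # step 401
    selection = node_state["connections"] + node_state["devices"]  # 402-404
    target = choose(selection)                                     # step 406
    node_state["routes"][unit_id] = target                         # 408-410
    return target
```

Once the route exists, subsequent commands from the unit (step 412) would simply be forwarded to `node_state["routes"][unit_id]`.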
[0046] It will thus be seen that changes may be made in carrying out the above system and method and in the construction set forth without departing from the spirit and scope of the invention. It is intended that any and all matter contained in the above description and shown in the accompanying drawings shall be interpreted as illustrative and not in a limiting sense.
* * * * *