U.S. patent number 9,846,308 [Application Number 15/047,110] was granted by the patent office on 2017-12-19 for haptic systems for head-worn computers.
This patent grant is currently assigned to Osterhout Group, Inc. The grantee listed for this patent is Osterhout Group, Inc. Invention is credited to Ralph F. Osterhout.
United States Patent: 9,846,308
Osterhout
December 19, 2017
Please see images for: Certificate of Correction.
Haptic systems for head-worn computers
Abstract
Aspects of the present disclosure relate to haptic feedback
systems and methods for use in head-worn computing systems. A head
worn computer includes a frame adapted to hold a computer display
in front of a user's eye, a processor adapted to present digital
content in the computer display and to produce a haptic signal in
coordination with the digital content display, and a haptic system
including a plurality of haptic segments, wherein each of the
haptic segments is individually controlled in coordination with the
haptic signal.
Inventors: Osterhout; Ralph F. (San Francisco, CA)
Applicant: Osterhout Group, Inc. (San Francisco, CA, US)
Assignee: Osterhout Group, Inc. (San Francisco, CA)
Family ID: 56094197
Appl. No.: 15/047,110
Filed: February 18, 2016

Prior Publication Data

Document Identifier    Publication Date
US 20160161747 A1      Jun 9, 2016
Related U.S. Patent Documents

Application Number    Filing Date     Patent Number
14589713              Jan 5, 2015     9529195
14163646              Jan 24, 2014    9400390
15047110
14861496              Sep 22, 2015    9753288
14851755              Sep 11, 2015    9651784
Current U.S. Class: 1/1
Current CPC Class: G02B 27/017 (20130101); G06F 1/163 (20130101); G06F 3/005 (20130101); G02B 27/1033 (20130101); G02B 26/0833 (20130101); G02B 27/0093 (20130101); G02B 27/0176 (20130101); G02B 27/283 (20130101); G06F 1/1635 (20130101); G06T 19/006 (20130101); G06F 3/016 (20130101); G06F 3/013 (20130101); G06F 3/017 (20130101); G02B 27/0172 (20130101); G02B 2027/0178 (20130101); G02B 2027/0138 (20130101); G02B 2027/0118 (20130101); G02B 2027/0194 (20130101); G02B 2027/0187 (20130101)
Current International Class: G02B 27/14 (20060101); G06F 3/00 (20060101); G02B 27/00 (20060101); G02B 27/28 (20060101); G06F 1/16 (20060101); G09G 5/00 (20060101); G02B 27/01 (20060101); G06T 19/00 (20110101); G06F 3/01 (20060101); G02B 26/08 (20060101); G02B 27/10 (20060101)
Field of Search: 359/630; 345/7-9
References Cited
U.S. Patent Documents
Foreign Patent Documents

Document Number    Date        Country
368898             May 1990    EP
777867             Jun 1997    EP
2486450            Aug 2012    EP
2502410            Sep 2012    EP
2011143655         Nov 2011    WO
2012058175         May 2012    WO
2013050650         Apr 2013    WO
2013103825         Jul 2013    WO
2013110846         Aug 2013    WO
2013170073         Nov 2013    WO
Other References

US 8,743,465, 06/2014, Totani et al. (withdrawn). cited by applicant.
US 8,792,178, 07/2014, Totani et al. (withdrawn). cited by applicant.
"Lightberry", https://web.archive.org/web/20131201194408/http://lightberry.eu/, Dec. 1, 2013, 11 pages. cited by applicant.
Bezryadin, et al., "Brightness Calculation in Digital Image Processing", Technologies for Digital Fulfillment 2007, Las Vegas, NV, 2007, pp. 1-6. cited by applicant.
Clements-Cortes, et al., "Short-Term Effects of Rhythmic Sensory Stimulation in Alzheimer's Disease: An Exploratory Pilot Study", Journal of Alzheimer's Disease 52 (2016), DOI 10.3233/JAD-160081, IOS Press, Feb. 9, 2016, pp. 651-660. cited by applicant.
Pamplona, et al., "Photorealistic Models for Pupil Light Reflex and Iridal Pattern Deformation", pp. 1-12. cited by applicant.
Schedwill, "Bidirectional OLED Microdisplay", Fraunhofer Research Institution for Organics, Materials and Electronic Device Comedd, Apr. 11, 2014, 2 pages. cited by applicant.
Vogel, et al., "Data glasses controlled by eye movements", Information and communication, Fraunhofer-Gesellschaft, Sep. 22, 2013, 2 pages. cited by applicant.
Primary Examiner: Choi; William
Attorney, Agent or Firm: GTC Law Group PC &
Affiliates
Parent Case Text
CROSS-REFERENCE TO RELATED APPLICATIONS
This application is a continuation-in-part of U.S. non-provisional
application Ser. No. 14/589,713, filed Jan. 5, 2015, which is a
continuation-in-part of U.S. non-provisional application Ser. No.
14/163,646, filed Jan. 24, 2014.
This application is also a continuation-in-part of U.S.
non-provisional application Ser. No. 14/861,496, filed Sep. 22,
2015, which is a continuation-in-part of U.S. non-provisional
application Ser. No. 14/851,755, entitled "SEE-THROUGH COMPUTER
DISPLAY SYSTEMS", filed Sep. 11, 2015.
All of the above applications are incorporated herein by reference
in their entirety.
Claims
I claim:
1. A head-worn computer, comprising: a frame adapted to hold a
computer display in front of a user's eye; a processor adapted to
present digital content in the computer display and to produce a
haptic signal in coordination with the digital content display; and
a haptic system comprising a plurality of haptic segments, wherein
each of the haptic segments is individually controlled in
coordination with the haptic signal, and wherein each of the
plurality of haptic segments comprises a different vibration
capacity.
2. The head-worn computer of claim 1, wherein each of the haptic
segments comprises a piezo strip activated by the haptic signal to
generate a vibration in the frame.
3. The head-worn computer of claim 1, wherein an intensity of the
haptic system is increased by activating more than one of the
plurality of haptic segments.
4. The head-worn computer of claim 3, wherein the intensity is
further increased by activating more than two of the plurality of
haptic segments.
5. The head-worn computer of claim 1, wherein an intensity of the
haptic system is regulated depending on which of the plurality of
haptic segments is activated.
6. The head-worn computer of claim 1, wherein each of the plurality of haptic segments is mounted in a linear arrangement and the segments are arranged such that the higher capacity segments are at one end of the linear arrangement.
7. The head-worn computer of claim 6, wherein the linear
arrangement is from back to front on an arm of the head-worn
computer.
8. The head-worn computer of claim 7, wherein the linear
arrangement is proximate a temple of the user.
9. The head-worn computer of claim 7, wherein the linear
arrangement is proximate an ear of the user.
10. The head-worn computer of claim 7, wherein the linear
arrangement is proximate a rear portion of the user's head.
11. The head-worn computer of claim 6, wherein the linear
arrangement is from front to back on an arm of the head-worn
computer.
12. A head-worn computer, comprising: a frame adapted to hold a
computer display in front of a user's eye; a processor adapted to
present digital content in the computer display and to produce a
haptic signal in coordination with the digital content display; a
haptic system comprising a plurality of haptic segments, wherein
each of the haptic segments is individually controlled in
coordination with the haptic signal, and a vibration conduit,
wherein the vibration conduit is mounted proximate the haptic
system and adapted to touch the skin of the user's head to
facilitate vibration sensations from the haptic system to the
user's head.
13. The head-worn computer of claim 12, wherein the vibration
conduit is mounted on an arm of the head-worn computer.
14. The head-worn computer of claim 12, wherein the vibration
conduit touches the user's head proximate a temple of the user's
head.
15. The head-worn computer of claim 12, wherein the vibration
conduit is made of a soft material that deforms to increase contact
area with the user's head.
Description
BACKGROUND
Field of the Invention
This disclosure relates to head-worn computer systems with haptic
feedback systems.
Description of Related Art
Head mounted displays (HMDs) and particularly HMDs that provide a
see-through view of the environment are valuable instruments. The
presentation of content in the see-through display can be a
complicated operation when attempting to ensure that the user
experience is optimized. Improved systems and methods for
presenting content in the see-through display are required to
improve the user experience.
SUMMARY
Aspects of the present disclosure relate to methods and systems for
providing haptic feedback in head-worn computer systems.
In an aspect, a head-worn computer may include a frame adapted to
hold a computer display in front of a user's eye, a processor
adapted to present digital content in the computer display and to
produce a haptic signal in coordination with the digital content
display, and a haptic system comprising a plurality of haptic
segments, wherein each of the haptic segments is individually
controlled in coordination with the haptic signal. Each of the
haptic segments may include a piezo strip activated by the haptic
signal to generate a vibration in the frame. An intensity of the
haptic system is increased by activating more than one of the
plurality of haptic segments. The intensity may be further
increased by activating more than two of the plurality of haptic
segments. Each of the plurality of haptic segments may include a
different vibration capacity. An intensity of the haptic system may
be regulated depending on which of the plurality of haptic segments
is activated. Each of the plurality of haptic segments may be
mounted in a linear arrangement and the segments are arranged such
that the higher capacity segments are at one end of the linear
arrangement. The linear arrangement may be from back to front on an
arm of the head-worn computer, proximate a temple of the user,
proximate an ear of the user, proximate a rear portion of the
user's head, or from front to back on an arm of the head-worn
computer. The head-worn computer may further include a vibration
conduit, wherein the vibration conduit is mounted proximate the
haptic system and adapted to touch the skin of the user's head to
facilitate vibration sensations from the haptic system to the
user's head. The vibration conduit may be mounted on an arm of the
head-worn computer. The vibration conduit may touch the user's head
proximate a temple of the user's head. The vibration conduit may be
made of a soft material that deforms to increase contact area with
the user's head.
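To make the segmented-intensity behavior concrete, the following is a minimal, hypothetical Python sketch of how a processor might drive such a haptic system. The class names, capacity values, and activation rule are assumptions of this sketch, not details taken from the patent.

from dataclasses import dataclass
from typing import List

@dataclass
class HapticSegment:
    """One piezo strip mounted along the arm of the head-worn computer."""
    position_mm: float   # distance along the arm (hypothetical units)
    capacity: float      # relative vibration capacity; differs per segment
    active: bool = False

class SegmentedHapticSystem:
    """Individually controls a linear arrangement of haptic segments."""
    def __init__(self, segments: List[HapticSegment]):
        # Linear arrangement; higher-capacity segments can sit at one end.
        self.segments = sorted(segments, key=lambda s: s.position_mm)

    def apply_haptic_signal(self, intensity: float) -> None:
        # Intensity is increased by activating more than one segment, and is
        # regulated by which (differently sized) segments are activated.
        total = 0.0
        for seg in self.segments:
            seg.active = total < intensity
            if seg.active:
                total += seg.capacity

# Three segments of different capacity, back to front on an arm:
system = SegmentedHapticSystem([
    HapticSegment(position_mm=10, capacity=1.0),
    HapticSegment(position_mm=40, capacity=0.6),
    HapticSegment(position_mm=70, capacity=0.3),
])
system.apply_haptic_signal(intensity=1.2)  # activates the first two segments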
These and other systems, methods, objects, features, and advantages
of the present disclosure will be apparent to those skilled in the
art from the following detailed description of the preferred
embodiment and the drawings. All documents mentioned herein are
hereby incorporated in their entirety by reference.
BRIEF DESCRIPTION OF THE DRAWINGS
Embodiments are described with reference to the following Figures.
The same numbers may be used throughout to reference like features
and components that are shown in the Figures:
FIG. 1 illustrates a head worn computing system in accordance with
the principles of the present disclosure.
FIG. 2 illustrates a head worn computing system with optical system
in accordance with the principles of the present disclosure.
FIG. 3 illustrates upper and lower optical modules in accordance
with the principles of the present disclosure.
FIG. 4 illustrates angles of combiner elements in accordance with
the principles of the present disclosure.
FIG. 5 illustrates upper and lower optical modules in accordance
with the principles of the present disclosure.
FIG. 6 illustrates upper and lower optical modules in accordance
with the principles of the present disclosure.
FIG. 7 illustrates upper and lower optical modules in accordance
with the principles of the present disclosure.
FIG. 8 illustrates upper and lower optical modules in accordance
with the principles of the present disclosure.
FIGS. 9, 10a, 10b and 11 illustrate light sources and filters in
accordance with the principles of the present disclosure.
FIGS. 12a to 12c illustrate light sources and quantum dot systems
in accordance with the principles of the present disclosure.
FIGS. 13a to 13c illustrate peripheral lighting systems in
accordance with the principles of the present disclosure.
FIGS. 14a to 14h illustrate light suppression systems in accordance
with the principles of the present disclosure.
FIG. 15 illustrates an external user interface in accordance with
the principles of the present disclosure.
FIG. 16 illustrates external user interfaces in accordance with the
principles of the present disclosure.
FIGS. 17 and 18 illustrate structured eye lighting systems
according to the principles of the present disclosure.
FIG. 19 illustrates eye glint in the prediction of eye direction
analysis in accordance with the principles of the present
disclosure.
FIG. 20a illustrates eye characteristics that may be used in
personal identification through analysis of a system according to
the principles of the present disclosure.
FIG. 20b illustrates a digital content presentation reflection off
of the wearer's eye that may be analyzed in accordance with the
principles of the present disclosure.
FIG. 21 illustrates eye imaging along various virtual target lines
and various focal planes in accordance with the principles of the
present disclosure.
FIG. 22 illustrates content control with respect to eye movement
based on eye imaging in accordance with the principles of the
present disclosure.
FIG. 23 illustrates eye imaging and eye convergence in accordance
with the principles of the present disclosure.
FIG. 24 illustrates light impinging an eye in accordance with the
principles of the present disclosure.
FIG. 25 illustrates a view of an eye in accordance with the
principles of the present disclosure.
FIGS. 26a and 26b illustrate views of an eye with a structured
light pattern in accordance with the principles of the present
disclosure.
FIG. 27 illustrates a user interface in accordance with the
principles of the present disclosure.
FIG. 28 illustrates a user interface in accordance with the
principles of the present disclosure.
FIGS. 29 and 29a illustrate haptic systems in accordance with the
principles of the present disclosure.
While the disclosure has been described in connection with certain
preferred embodiments, other embodiments would be understood by one
of ordinary skill in the art and are encompassed herein.
DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENT(S)
Aspects of the present disclosure relate to head-worn computing
("HWC") systems. HWC involves, in some instances, a system that
mimics the appearance of head-worn glasses or sunglasses. The
glasses may be a fully developed computing platform, such as
including computer displays presented in each of the lenses of the
glasses to the eyes of the user. In embodiments, the lenses and
displays may be configured to allow a person wearing the glasses to
see the environment through the lenses while also seeing,
simultaneously, digital imagery, which forms an overlaid image that
is perceived by the person as a digitally augmented image of the
environment, or augmented reality ("AR").
HWC involves more than just placing a computing system on a
person's head. The system may need to be designed as a lightweight,
compact and fully functional computer display, such as wherein the
computer display includes a high resolution digital display that
provides a high level of immersion comprised of the displayed
digital content and the see-through view of the environmental
surroundings. User interfaces and control systems suited to the HWC
device may be required that are unlike those used for a more
conventional computer such as a laptop. For the HWC and associated
systems to be most effective, the glasses may be equipped with
sensors to determine environmental conditions, geographic location,
relative positioning to other points of interest, objects
identified by imaging and movement by the user or other users in a
connected group, compass heading, head tilt, where the user is
looking and the like. The HWC may then change the mode of operation
to match the conditions, location, positioning, movements, and the
like, in a method generally referred to as a contextually aware
HWC. The glasses also may need to be connected, wirelessly or
otherwise, to other systems either locally or through a network.
Controlling the glasses may be achieved through the use of an
external device, automatically through contextually gathered
information, through user gestures captured by the glasses sensors,
and the like. Each technique may be further refined depending on
the software application being used in the glasses. The glasses may
further be used to control or coordinate with external devices that
are associated with the glasses.
Referring to FIG. 1, an overview of the HWC system 100 is
presented. As shown, the HWC system 100 comprises a HWC 102, which
in this instance is configured as glasses to be worn on the head
with sensors such that the HWC 102 is aware of the objects and
conditions in the environment 114. In this instance, the HWC 102
also receives and interprets control inputs such as gestures and
movements 116. The HWC 102 may communicate with external user
interfaces 104. The external user interfaces 104 may provide a
physical user interface to take control instructions from a user of
the HWC 102 and the external user interfaces 104 and the HWC 102
may communicate bi-directionally to affect the user's command and
provide feedback to the external device 108. The HWC 102 may also
communicate bi-directionally with externally controlled or
coordinated local devices 108. For example, an external user
interface 104 may be used in connection with the HWC 102 to control
an externally controlled or coordinated local device 108. The
externally controlled or coordinated local device 108 may provide
feedback to the HWC 102 and a customized GUI may be presented in
the HWC 102 based on the type of device or specifically identified
device 108. The HWC 102 may also interact with remote devices and
information sources 112 through a network connection 110. Again,
the external user interface 104 may be used in connection with the
HWC 102 to control or otherwise interact with any of the remote
devices 108 and information sources 112 in a similar way as when
the external user interfaces 104 are used to control or otherwise
interact with the externally controlled or coordinated local
devices 108. Similarly, HWC 102 may interpret gestures 116 (e.g. captured from forward, downward, upward, rearward facing sensors such as camera(s), range finders, IR sensors, etc.) or
environmental conditions sensed in the environment 114 to control
either local or remote devices 108 or 112.
We will now describe each of the main elements depicted in FIG. 1 in more detail; however, these descriptions are intended to provide general guidance and should not be construed as limiting. Each element is also described in additional detail elsewhere herein.
The HWC 102 is a computing platform intended to be worn on a
person's head. The HWC 102 may take many different forms to fit
many different functional requirements. In some situations, the HWC
102 will be designed in the form of conventional glasses. The
glasses may or may not have active computer graphics displays. In
situations where the HWC 102 has integrated computer displays the
displays may be configured as see-through displays such that the
digital imagery can be overlaid with respect to the user's view of
the environment 114. There are a number of see-through optical
designs that may be used, including ones that have a reflective
display (e.g. LCoS, DLP), emissive displays (e.g. OLED, LED),
hologram, TIR waveguides, and the like. In embodiments, lighting
systems used in connection with the display optics may be solid
state lighting systems, such as LED, OLED, quantum dot, quantum dot
LED, etc. In addition, the optical configuration may be monocular
or binocular. It may also include vision corrective optical
components. In embodiments, the optics may be packaged as contact
lenses. In other embodiments, the HWC 102 may be in the form of a
helmet with a see-through shield, sunglasses, safety glasses,
goggles, a mask, fire helmet with see-through shield, police helmet
with see through shield, military helmet with see-through shield,
utility form customized to a certain work task (e.g. inventory
control, logistics, repair, maintenance, etc.), and the like.
The HWC 102 may also have a number of integrated computing
facilities, such as an integrated processor, integrated power
management, communication structures (e.g. cell net, WiFi,
Bluetooth, local area connections, mesh connections, remote
connections (e.g. client server, etc.)), and the like. The HWC 102
may also have a number of positional awareness sensors, such as
GPS, electronic compass, altimeter, tilt sensor, IMU, and the like.
It may also have other sensors such as a camera, rangefinder,
hyper-spectral camera, Geiger counter, microphone, spectral
illumination detector, temperature sensor, chemical sensor,
biologic sensor, moisture sensor, ultrasonic sensor, and the
like.
The HWC 102 may also have integrated control technologies. The
integrated control technologies may be contextual based control,
passive control, active control, user control, and the like. For
example, the HWC 102 may have an integrated sensor (e.g. camera)
that captures user hand or body gestures 116 such that the
integrated processing system can interpret the gestures and
generate control commands for the HWC 102. In another example, the
HWC 102 may have sensors that detect movement (e.g. a nod, head
shake, and the like) including accelerometers, gyros and other
inertial measurements, where the integrated processor may interpret
the movement and generate a control command in response. The HWC
102 may also automatically control itself based on measured or
perceived environmental conditions. For example, if it is bright in
the environment the HWC 102 may increase the brightness or contrast
of the displayed image. In embodiments, the integrated control
technologies may be mounted on the HWC 102 such that a user can
interact with it directly. For example, the HWC 102 may have a
button(s), touch capacitive interface, and the like.
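As an informal illustration of this kind of contextual and gesture-based control, the sketch below maps a measured ambient light level and a recognized gesture to control commands. The thresholds, gesture names, and the display interface are invented for the example and are not specified by the patent.

def control_step(ambient_lux, gesture, display):
    # Automatic environmental control: brighten the display when the
    # surroundings are bright (thresholds are illustrative guesses).
    if ambient_lux > 10_000:      # roughly daylight
        display.brightness = 1.0
    elif ambient_lux > 500:       # typical indoor lighting
        display.brightness = 0.6
    else:                         # dim surroundings
        display.brightness = 0.3

    # Active control: interpret a captured hand/body gesture as a command.
    commands = {
        "swipe_left": display.previous_page,
        "swipe_right": display.next_page,
        "nod": display.confirm,
    }
    if gesture in commands:
        commands[gesture]()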
As described herein, the HWC 102 may be in communication with
external user interfaces 104. The external user interfaces may come
in many different forms. For example, a cell phone screen may be
adapted to take user input for control of an aspect of the HWC 102.
The external user interface may be a dedicated UI (e.g. air mouse,
finger mounted mouse), such as a keyboard, touch surface,
button(s), joy stick, and the like. In embodiments, the external
controller may be integrated into another device such as a ring,
watch, bike, car, and the like. In each case, the external user
interface 104 may include sensors (e.g. IMU, accelerometers,
compass, altimeter, and the like) to provide additional input for
controlling the HWC 102.
As described herein, the HWC 102 may control or coordinate with
other local devices 108. The external devices 108 may be an audio
device, visual device, vehicle, cell phone, computer, and the like.
For instance, the local external device 108 may be another HWC 102,
where information may then be exchanged between the separate HWCs
108.
Similar to the way the HWC 102 may control or coordinate with local devices 108, the HWC 102 may control or coordinate with remote
devices 112, such as the HWC 102 communicating with the remote
devices 112 through a network 110. Again, the form of the remote
device 112 may have many forms. Included in these forms is another
HWC 102. For example, each HWC 102 may communicate its GPS position such that all the HWCs 102 know where all of the HWCs 102 are located.
FIG. 2 illustrates a HWC 102 with an optical system that includes
an upper optical module 202 and a lower optical module 204. While
the upper and lower optical modules 202 and 204 will generally be
described as separate modules, it should be understood that this is
illustrative only and the present disclosure includes other
physical configurations, such as that when the two modules are
combined into a single module or where the elements making up the
two modules are configured into more than two modules. In
embodiments, the upper module 202 includes a computer controlled
display (e.g. LCoS, FLCoS, DLP, OLED, backlit LCD, etc.) and image
light delivery optics. In embodiments, the lower module includes
eye delivery optics that are configured to receive the upper
module's image light and deliver the image light to the eye of a
wearer of the HWC. In FIG. 2, it should be noted that while the upper and lower optical modules 202 and 204 are illustrated on one side of the HWC such that image light can be delivered to one eye of the wearer, it is envisioned by the present disclosure that embodiments will contain two image light delivery systems, one for each eye.
FIG. 3 illustrates a combination of an upper optical module 202
with a lower optical module 204. In this embodiment, the image
light projected from the upper optical module 202 may or may not be
polarized. The image light is reflected off a flat combiner element 602 such that it is directed towards the user's eye. The combiner element 602 is a partial mirror that reflects image light
while transmitting a substantial portion of light from the
environment so the user can look through the combiner element and
see the environment surrounding the HWC.
The combiner 602 may include a holographic pattern, to form a
holographic mirror. If a monochrome image is desired, there may be
a single wavelength reflection design for the holographic pattern
on the surface of the combiner 602. If the intention is to have multiple colors reflected from the surface of the combiner 602, a multiple wavelength holographic mirror may be included on the combiner surface. For example, in a three-color embodiment, where
red, green and blue pixels are generated in the image light, the
holographic mirror may be reflective to wavelengths substantially
matching the wavelengths of the red, green and blue light provided
in the image light. This configuration can be used as a wavelength
specific mirror where pre-determined wavelengths of light from the
image light are reflected to the user's eye. This configuration may
also be made such that substantially all other wavelengths in the
visible pass through the combiner element 602 so the user has a
substantially clear view of the environmental surroundings when
looking through the combiner element 602. The transparency between the user's eye and the surroundings may be approximately 80% when using a combiner that is a holographic mirror. Holographic mirrors can be made using lasers to produce interference patterns in the holographic material of the combiner, where the wavelengths of the lasers correspond to the wavelengths of light that are subsequently reflected by the holographic mirror.
In another embodiment, the combiner element 602 may include a notch
mirror comprised of a multilayer coated substrate wherein the
coating is designed to substantially reflect the wavelengths of
light provided in the image light by the light source and
substantially transmit the remaining wavelengths in the visible
spectrum. For example, in the case where red, green and blue light
is provided by the light source in the upper optics to enable full
color images to be provided to the user, the notch mirror is a
tristimulus notch mirror wherein the multilayer coating is designed
to substantially reflect narrow bands of red, green and blue light that are matched to what is provided by the light source, while the remaining visible wavelengths are substantially transmitted
through the coating to enable a view of the environment through the
combiner. In another example where monochrome images are provided
to the user, the notch mirror is designed to reflect a single
narrow band of light that is matched to the wavelength range of the
image light provided by the upper optics while transmitting the
remaining visible wavelengths to enable a see-thru view of the
environment. The combiner 602 with the notch mirror would operate,
from the user's perspective, in a manner similar to the combiner
that includes a holographic pattern on the combiner element 602.
The combiner, with the tristimulus notch mirror, would reflect
image light associated with pixels, to the eye because of the match
between the reflective wavelengths of the notch mirror and the
wavelengths or color of the image light, and the wearer would
simultaneously be able to see with high clarity the environmental
surroundings. The transparency between the user's eye and the
surrounding may be approximately 80% when using the tristimulus
notch mirror. In addition, the image provided with the notch mirror
combiner can provide higher contrast images than the holographic
mirror combiner because the notch mirror acts in a purely
reflective manner compared to the holographic mirror which operates
through diffraction, and as such the notch mirror is subject to
less scattering of the imaging light by the combiner. In another embodiment, the combiner element 602 may include a simple partial mirror that reflects a portion (e.g. 50%) of all wavelengths of light in the visible spectrum.
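The approximately 80% see-through figure quoted above can be sanity-checked with simple arithmetic: if the mirror fully reflects three narrow bands and transmits the rest of the visible range, the average transmission is one minus the blocked fraction. The band centers and the 20 nm bandwidth below are assumptions chosen only to illustrate the calculation.

# Rough average visible transmission of a tristimulus notch mirror.
notch_centers_nm = [460, 530, 620]   # assumed red/green/blue source bands
notch_width_nm = 20                  # assumed fully reflected bandwidth
visible_span_nm = 700 - 400          # visible range, nm

blocked_nm = len(notch_centers_nm) * notch_width_nm   # 60 nm reflected
transmission = 1 - blocked_nm / visible_span_nm       # 1 - 60/300 = 0.80
print(f"average see-through transmission: {transmission:.0%}")  # -> 80%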
Image light can escape through the combiner 602 and may produce
face glow from the optics shown in FIG. 3, as the escaping image
light is generally directed downward onto the cheek of the user.
When using a holographic mirror combiner or a tristimulus notch
mirror combiner, the escaping light can be trapped to avoid face
glow. In embodiments, if the image light is polarized before the
combiner, a linear polarizer can be laminated, or otherwise
associated, to the combiner, with the transmission axis of the
polarizer oriented relative to the polarized image light so that
any escaping image light is absorbed by the polarizer. In
embodiments, the image light would be polarized to provide S
polarized light to the combiner for better reflection. As a result,
the linear polarizer on the combiner would be oriented to absorb S
polarized light and pass P polarized light. This provides the
preferred orientation of polarized sunglasses as well.
If the image light is unpolarized, a microlouvered film such as a
privacy filter can be used to absorb the escaping image light while
providing the user with a see-thru view of the environment. In this
case, the absorbance or transmittance of the microlouvered film is dependent on the angle of the light: steep-angle light is absorbed and light at a shallower angle is transmitted. For this reason, in an embodiment, the combiner with the microlouver film is angled at greater than 45 degrees to the optical axis of the image light (e.g. the combiner can be oriented at 50 degrees so the image light from the field lens is incident on the combiner at an oblique angle).
FIG. 4 illustrates an embodiment of a combiner element 602 at
various angles when the combiner element 602 includes a holographic
mirror. Normally, a mirrored surface reflects light at an angle
equal to the angle that the light is incident to the mirrored
surface. Typically, this necessitates that the combiner element be
at 45 degrees, 602a, if the light is presented vertically to the
combiner so the light can be reflected horizontally towards the
wearer's eye. In embodiments, the incident light can be presented
at angles other than vertical to enable the mirror surface to be
oriented at other than 45 degrees, but in all cases wherein a
mirrored surface is employed (including the tristimulus notch
mirror described previously), the incident angle equals the
reflected angle. As a result, increasing the angle of the combiner
602a requires that the incident image light be presented to the
combiner 602a at a different angle which positions the upper
optical module 202 to the left of the combiner as shown in FIG. 4.
In contrast, a holographic mirror combiner, included in
embodiments, can be made such that light is reflected at a
different angle from the angle that the light is incident onto the
holographic mirrored surface. This allows freedom to select the
angle of the combiner element 602b independent of the angle of the
incident image light and the angle of the light reflected into the
wearer's eye. In embodiments, the angle of the combiner element
602b is greater than 45 degrees (shown in FIG. 4) as this allows a
more laterally compact HWC design. The increased angle of the
combiner element 602b decreases the front to back width of the
lower optical module 204 and may allow for a thinner HWC display
(i.e. the furthest element from the wearer's eye can be closer to
the wearer's face).
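The law-of-reflection constraint on a conventional mirrored combiner reduces to a one-line calculation: the mirror must bisect the angle between the incident image light and the ray directed to the eye. The helper below expresses that in Python for a simplified two-dimensional geometry (angles measured from horizontal); it is a didactic sketch, not the patent's exact figure.

def mirror_combiner_angle(incident_deg, reflected_deg):
    """Tilt of a conventional mirror that turns light arriving at
    incident_deg into reflected_deg; because the incident angle equals
    the reflected angle, the mirror bisects the two ray directions."""
    return (incident_deg + reflected_deg) / 2

# Vertical image light (90 deg) redirected horizontally (0 deg) to the eye
# forces a 45 degree combiner, as with element 602a; a holographic mirror
# removes this constraint, allowing combiner angles greater than 45 degrees.
print(mirror_combiner_angle(90, 0))  # -> 45.0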
FIG. 5 illustrates another embodiment of a lower optical module
204. In this embodiment, polarized or unpolarized image light
provided by the upper optical module 202, is directed into the
lower optical module 204. The image light reflects off a partial
mirror 804 (e.g. polarized mirror, notch mirror, holographic
mirror, etc.) and is directed toward a curved partially reflective
mirror 802. The curved partial mirror 802 then reflects the image
light back towards the user's eye, which passes through the partial
mirror 804. The user can also see through the partial mirror 804
and the curved partial mirror 802 to see the surrounding
environment. As a result, the user perceives a combined image
comprised of the displayed image light overlaid onto the see-thru
view of the environment. In a preferred embodiment, the partial
mirror 804 and the curved partial mirror 802 are both
non-polarizing so that the transmitted light from the surrounding
environment is unpolarized so that rainbow interference patterns
are eliminated when looking at polarized light in the environment
such as provided by a computer monitor or in the reflected light
from a lake.
While many of the embodiments of the present disclosure have been
referred to as upper and lower modules containing certain optical
components, it should be understood that the image light production
and management functions described in connection with the upper
module may be arranged to direct light in other directions (e.g.
upward, sideward, etc.). In embodiments, it may be preferred to
mount the upper module 202 above the wearer's eye, in which case
the image light would be directed downward. In other embodiments it
may be preferred to produce light from the side of the wearer's
eye, or from below the wearer's eye. In addition, the lower optical
module is generally configured to deliver the image light to the
wearer's eye and allow the wearer to see through the lower optical
module, which may be accomplished through a variety of optical
components.
FIG. 6 illustrates an embodiment of the present disclosure where
the upper optical module 202 is arranged to direct image light into
a total internal reflection (TIR) waveguide 810. In this
embodiment, the upper optical module 202 is positioned above the
wearer's eye 812 and the light is directed horizontally into the
TIR waveguide 810. The TIR waveguide is designed to internally
reflect the image light in a series of downward TIR reflections
until it reaches the portion in front of the wearer's eye, where the light passes out of the TIR waveguide 810 in a direction toward the wearer's eye. In this embodiment, an outer shield 814 may be
positioned in front of the TIR waveguide 810.
FIG. 7 illustrates an embodiment of the present disclosure where
the upper optical module 202 is arranged to direct image light into
a TIR waveguide 818. In this embodiment, the upper optical module
202 is arranged on the side of the TIR waveguide 818. For example,
the upper optical module may be positioned in the arm or near the
arm of the HWC when configured as a pair of head worn glasses. The
TIR waveguide 818 is designed to internally reflect the image light
in a series of TIR reflections until it reaches the portion in
front of the wearer's eye, where the light passes out of the TIR
waveguide 818 in a direction toward the wearer's eye 812.
FIG. 8 illustrates yet further embodiments of the present
disclosure where an upper optical module 202 directs polarized
image light into an optical guide 828 where the image light passes through a polarized reflector 824, changes polarization state upon reflection off the optical element 822 (which includes, for example, a 1/4 wave film), and is then reflected by the polarized reflector 824 towards the wearer's eye due to the change in polarization of the image light. The upper optical module 202 may be positioned
behind the optical guide 828 wherein the image light is directed
toward a mirror 820 that reflects the image light along the optical
guide 828 and towards the polarized reflector 824. Alternatively,
in other embodiments, the upper optical module 202 may direct the
image light directly along the optical guide 828 and towards the
polarized reflector 824. It should be understood that the present
disclosure comprises other optical arrangements intended to direct
image light into the wearer's eye.
FIG. 9 illustrates a light source 1100 that may be used in
association with the upper optics module 202. In embodiments, the
light source 1100 may provide light to a backlighting optical
system that is associated with the light source 1100 and which
serves to homogenize the light and thereby provide uniform
illuminating light to an image source in the upper optics. In
embodiments, the light source 1100 includes a tristimulus notch
filter 1102. The tristimulus notch filter 1102 has narrow band pass
filters for three wavelengths, as indicated in the transmission graph 1108 in FIG. 10b. The graph 1104 shown in FIG. 10a illustrates the output of three different colored LEDs. One can see that the bandwidths of emission are narrow, but they have long
tails. The tristimulus notch filter 1102 can be used in connection
with such LEDs to provide a light source 1100 that emits narrow
filtered wavelengths of light, as shown in the transmission graph 1110 of FIG. 11. The clipping effect of the tristimulus notch filter 1102 cuts the tails from the LED emission graph 1104 to provide narrower wavelength bands of light to the upper optical module 202. The light source
1100 can be used in connection with a matched combiner 602 that
includes a holographic mirror or tristimulus notch mirror that
substantially reflects the narrow bands of image light toward the
wearer's eye with a reduced amount of image light that does not get
reflected by the combiner, thereby improving efficiency of the
head-worn computer (HWC) or head-mounted display (HMD) and reducing
escaping light that can cause faceglow.
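The clipping behavior shown in graphs 1104, 1108 and 1110 can be imitated numerically: multiply an LED-like spectrum (narrow peaks with long tails) by a notch transmission function. The Gaussian spectra, band centers, and bandwidths below are synthetic stand-ins for the measured curves in the figures.

import numpy as np

wl = np.linspace(400, 700, 301)                 # wavelength grid, nm

def led_band(center_nm, width_nm=25):
    # Gaussian stand-in for an LED emission band with long tails
    return np.exp(-0.5 * ((wl - center_nm) / width_nm) ** 2)

led_spectrum = led_band(460) + led_band(530) + led_band(620)   # cf. 1104
notch = np.zeros_like(wl)                                      # cf. 1108
for center in (460, 530, 620):
    notch[np.abs(wl - center) <= 10.0] = 1.0    # 20 nm wide pass bands

filtered = led_spectrum * notch                 # clipped output, cf. 1110
print(f"energy passed: {filtered.sum() / led_spectrum.sum():.0%}")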
FIG. 12a illustrates another light source 1200 that may be used in
association with the upper optics module 202. In embodiments, the
light source 1200 may provide light to a backlighting optical
system that homogenizes the light prior to illuminating the image
source in the upper optics as described previously herein. In
embodiments, the light source 1200 includes a quantum dot cover
glass 1202. The quantum dots absorb light of a shorter wavelength and emit light of a longer wavelength that is dependent on the material makeup and size of the quantum dot (FIG. 12b shows an example wherein a UV spectrum 1202 applied to a quantum dot results in the quantum dot emitting a narrow band shown as a PL spectrum 1204). As a result, quantum dots in the quantum dot cover glass 1202 can be tailored to provide one or more bands of narrow bandwidth light (e.g. red, green and blue emissions dependent on the different quantum dots included, as illustrated in the graph shown in FIG. 12c where three different quantum dots are used). In
embodiments, the LED driver light emits UV light, deep blue or blue
light. For sequential illumination of different colors, multiple
light sources 1200 would be used where each light source 1200 would
include a quantum dot cover glass 1202 with at least one type of
quantum dot selected to emit at one of each of the desired colors.
The light source 1200 can be used in connection with a combiner 602 with a holographic mirror or tristimulus notch mirror to provide narrow bands of image light that are reflected toward the wearer's eye with less wasted image light that does not get reflected.
Another aspect of the present disclosure relates to the generation
of peripheral image lighting effects for a person wearing a HWC. In
embodiments, a solid state lighting system (e.g. LED, OLED, etc.), or other lighting system, may be included inside the optical elements of a lower optical module 204. The solid state lighting system may be arranged such that lighting effects outside of a field of view (FOV) associated with displayed digital content are presented to create an immersive effect for the person wearing the
HWC. To this end, the lighting effects may be presented to any
portion of the HWC that is visible to the wearer. The solid state
lighting system may be digitally controlled by an integrated
processor on the HWC. In embodiments, the integrated processor will
control the lighting effects in coordination with digital content
that is presented within the FOV of the HWC. For example, a movie,
picture, game, or other content, may be displayed or playing within
the FOV of the HWC. The content may show a bomb blast on the right
side of the FOV and at the same moment, the solid state lighting
system inside of the upper module optics may flash quickly in
concert with the FOV image effect. The effect may not be fast; it
may be more persistent to indicate, for example, a general glow or
color on one side of the user. The solid state lighting system may
be color controlled, with red, green and blue LEDs, for example,
such that color control can be coordinated with the digitally
presented content within the field of view.
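One way to realize the coordination described above is a simple effect script authored alongside the content. The sketch below assumes a hypothetical set_leds driver callback and a scripted event list; neither is an API defined by the patent.

def run_effect_script(events, now_s, set_leds):
    """events: (start_s, side, rgb, duration_s) tuples scripted to match
    the displayed content; set_leds(side, rgb, level) drives the solid
    state lighting outside the FOV (hypothetical driver callback)."""
    for start_s, side, rgb, duration_s in events:
        if start_s <= now_s < start_s + duration_s:
            set_leds(side, rgb, 1.0)   # flash in concert with the FOV image
        else:
            set_leds(side, rgb, 0.0)   # no scripted effect active

# A bomb blast on the right side of the FOV at t = 12.5 s, flashing for 0.2 s:
script = [(12.5, "right", (255, 180, 60), 0.2)]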
FIG. 13a illustrates optical components of a lower optical module
204 together with an outer lens 1302. FIG. 13a also shows an
embodiment including effects LED's 1308a and 1308b. FIG. 13a
illustrates image light 1312, as described herein elsewhere,
directed into the upper optical module where it will reflect off of
the combiner element 1304, as described herein elsewhere. The
combiner element 1304 in this embodiment is angled towards the
wearer's eye at the top of the module and away from the wearer's
eye at the bottom of the module, as also illustrated and described
in connection with FIG. 8 (e.g. at a 45 degree angle). The image
light 1312 provided by an upper optical module 202 (not shown in
FIG. 13a) reflects off of the combiner element 1304 towards the
collimating mirror 1310, away from the wearer's eye, as described
herein elsewhere. The image light 1312 then reflects and focuses off of the collimating mirror 1310, passes back through the combiner element 1304, and is directed into the wearer's eye. The
wearer can also view the surrounding environment through the
transparency of the combiner element 1304, collimating mirror 1310,
and outer lens 1302 (if it is included). As described herein
elsewhere, the image light may or may not be polarized and the
see-through view of the surrounding environment is preferably
non-polarized to provide a view of the surrounding environment that
does not include rainbow interference patterns if the light from
the surrounding environment is polarized such as from a computer
monitor or reflections from a lake. The wearer will generally
perceive that the image light forms an image in the FOV 1305. In embodiments, the outer lens 1302 may be included. The outer lens 1302 may or may not be corrective, and it may be designed to conceal the lower optical module components in an effort to make the HWC appear to be in a form similar to standard glasses or sunglasses.
In the embodiment illustrated in FIG. 13a, the effects LEDs 1308a
and 1308b are positioned at the sides of the combiner element 1304
and the outer lens 1302 and/or the collimating mirror 1310. In
embodiments, the effects LEDs 1308a are positioned within the
confines defined by the combiner element 1304 and the outer lens
1302 and/or the collimating mirror. The effects LEDs 1308a and
1308b are also positioned outside of the FOV 1305 associated with
the displayed digital content. In this arrangement, the effects
LEDs 1308a and 1308b can provide lighting effects within the lower
optical module outside of the FOV 1305. In embodiments the light
emitted from the effects LEDs 1308a and 1308b may be polarized and
the outer lens 1302 may include a polarizer such that the light
from the effects LEDs 1308a and 1308b will pass through the
combiner element 1304 toward the wearer's eye and will be absorbed
by the outer lens 1302. This arrangement provides peripheral
lighting effects to the wearer in a more private setting by not
transmitting the lighting effects through the front of the HWC into
the surrounding environment. However, in other embodiments, the
effects LEDs 1308a and 1308b may be non-polarized so the lighting
effects provided are made to be purposefully viewable by others in
the environment for entertainment such as giving the effect of the
wearer's eye glowing in correspondence to the image content being
viewed by the wearer.
FIG. 13b illustrates a cross section of the embodiment described in
connection with FIG. 13a. As illustrated, the effects LED 1308a is
located in the upper-front area inside of the optical components of
the lower optical module. It should be understood that the effects
LED 1308a position in the described embodiments is only
illustrative and alternate placements are encompassed by the
present disclosure. Additionally, in embodiments, there may be one
or more effects LEDs 1308a in each of the two sides of HWC to
provide peripheral lighting effects near one or both eyes of the
wearer.
FIG. 13c illustrates an embodiment where the combiner element 1304
is angled away from the eye at the top and towards the eye at the
bottom (e.g. in accordance with the holographic or notch filter
embodiments described herein). In this embodiment, the effects LED
1308a may be located on the outer lens 1302 side of the combiner
element 1304 to provide a concealed appearance of the lighting
effects. As with other embodiments, the effects LED 1308a of FIG.
13c may include a polarizer such that the emitted light can pass
through a polarized element associated with the combiner element
1304 and be blocked by a polarized element associated with the
outer lens 1302. Alternatively, the effects LED 1308a can be configured such that at least a portion of the light is reflected
away from the wearer's eye so that it is visible to people in the
surrounding environment. This can be accomplished for example by
using a combiner 1304 that is a simple partial mirror so that a
portion of the image light 1312 is reflected toward the wearer's
eye and a first portion of the light from the effects LED 1308a is
transmitted toward the wearer's eye and a second portion of the
light from the effects LED 1308a is reflected outward toward the
surrounding environment.
FIGS. 14a, 14b, 14c and 14d show illustrations of a HWC that includes eye covers 1402 to restrict loss of image light to the surrounding environment and to restrict the ingress of stray light from the environment. The eye covers 1402 can be removably attached to the HWC with magnets 1404. Another aspect of the
present disclosure relates to automatically configuring the
lighting system(s) used in the HWC 102. In embodiments, the display
lighting and/or effects lighting, as described herein, may be
controlled in a manner suitable for when an eye cover 1402 is
attached or removed from the HWC 102. For example, at night, when
the light in the environment is low, the lighting system(s) in the
HWC may go into a low light mode to further control any amounts of
stray light escaping from the HWC and the areas around the HWC.
Covert operations at night, while using night vision or standard
vision, may require a solution which prevents as much escaping
light as possible so a user may clip on the eye cover(s) 1402 and
then the HWC may go into a low light mode. In some embodiments, the HWC may only go into the low light mode when the eye cover 1402 is attached and the HWC identifies that the environment is in low light conditions (e.g. through environment light level sensor detection). In embodiments, the low light level may be
determined to be at an intermediate point between full and low
light dependent on environmental conditions.
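The mode selection just described can be summarized as a small decision function. The lux threshold and the way the eye cover is detected are assumptions of this sketch; the patent states only that the mode may depend on cover attachment and on sensed environmental light.

LOW_LIGHT_LUX = 10.0   # illustrative threshold for "low light conditions"

def select_lighting_mode(eye_cover_attached, ambient_lux):
    # Enter the full low light mode only when an eye cover is clipped on
    # and the light sensor confirms the environment is dark.
    if eye_cover_attached and ambient_lux < LOW_LIGHT_LUX:
        return "low_light"
    # Otherwise pick an intermediate level dependent on conditions.
    if ambient_lux < LOW_LIGHT_LUX:
        return "intermediate"
    return "normal"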
Another aspect of the present disclosure relates to automatically
controlling the type of content displayed in the HWC when eye
covers 1402 are attached or removed from the HWC. In embodiments,
when the eye cover(s) 1402 is attached to the HWC, the displayed
content may be restricted in amount or in color amounts. For
example, the display(s) may go into a simple content delivery mode
to restrict the amount of information displayed. This may be done
to reduce the amount of light produced by the display(s). In an
embodiment, the display(s) may change from color displays to
monochrome displays to reduce the amount of light produced. In an
embodiment, the monochrome lighting may be red to limit the impact
on the wearer's eyes to maintain an ability to see better in the
dark.
Another aspect of the present disclosure relates to a system
adapted to quickly convert from a see-through system to a
non-see-through or very low transmission see-through system for a
more immersive user experience. The conversion system may include
replaceable lenses, an eye cover, and optics adapted to provide
user experiences in both modes. The outer lenses, for example, may
be `blacked-out` with an opaque cover 1412 to provide an experience
where all of the user's attention is dedicated to the digital
content and then the outer lenses may be switched out for high
see-through lenses so the digital content is augmenting the user's
view of the surrounding environment. Another aspect of the
disclosure relates to low transmission outer lenses that permit the
user to see through the outer lenses but remain dark enough to
maintain most of the user's attention on the digital content. The
slight see-through can provide the user with a visual connection to
the surrounding environment and this can reduce or eliminate nausea
and other problems associated with total removal of the surrounding
view when viewing digital content.
FIG. 14d illustrates a head-worn computer system 102 with a
see-through digital content display 204 adapted to include a
removable outer lens 1414 and a removable eye cover 1402. The eye
cover 1402 may be attached to the head-worn computer 102 with
magnets 1404 or other attachment systems (e.g. mechanical
attachments, a snug friction fit between the arms of the head-worn
computer 102, etc.). The eye cover 1402 may be attached when the
user wants to cut stray light from escaping the confines of the
head-worn computer, create a more immersive experience by removing
the otherwise viewable peripheral view of the surrounding
environment, etc. The removable outer lens 1414 may be of several
varieties for various experiences. It may have no transmission or a
very low transmission to create a dark background for the digital
content, creating an immersive experience for the digital content.
It may have a high transmission so the user can see through the
see-through display and the outer lens 1414 to view the surrounding
environment, creating a system for a heads-up display, augmented
reality display, assisted reality display, etc. The outer lens 1414
may be dark in a middle portion to provide a dark background for
the digital content (i.e. dark backdrop behind the see-through
field of view from the user's perspective) and a higher
transmission area elsewhere. The outer lenses 1414 may have a
transmission in the range of 2 to 5%, 5 to 10%, 10 to 20% for the
immersion effect and above 10% or 20% for the augmented reality
effect, for example. The outer lenses 1414 may also have an
adjustable transmission to facilitate the change in system effect.
For example, the outer lenses 1414 may be electronically adjustable
tint lenses (e.g. liquid crystal or have crossed polarizers with an
adjustment for the level of cross).
In embodiments, the eye cover 1402 may have areas of transparency
or partial transparency to provide some visual connection with the
user's surrounding environment. This may also reduce or eliminate
nausea or other feelings associated with the complete removal of
the view of the surrounding environment.
FIG. 14e illustrates a HWC 102 assembled with an eye cover 1402
without outer lenses in place. The outer lenses, in embodiments,
may be held in place with magnets 1418 for ease of removal and
replacement. In embodiments, the outer lenses may be held in place
with other systems, such as mechanical systems.
Another aspect of the present disclosure relates to an effects
system that generates effects outside of the field of view in the
see-through display of the head-worn computer. The effects may be,
for example, lighting effects, sound effects, tactile effects (e.g.
through vibration), air movement effects, etc. In embodiments, the
effect generation system is mounted on the eye cover 1402. For
example, a lighting system (e.g. LED(s), OLEDs, etc.) may be
mounted on an inside surface 1420, or exposed through the inside
surface 1420, as illustrated in FIG. 14f, such that they can create
a lighting effect (e.g. a bright light, colored light, subtle color
effect) in coordination with content being displayed in the field
of view of the see-through display. The content may be a movie or a
game, for example, and an explosion may happen on the right side of
the content, as scripted, and matching the content, a bright flash
may be generated by the effects lighting system to create a
stronger effect. As another example, the effects system may include
a vibratory system mounted near the sides or temples, or otherwise,
and when the same explosion occurs, the vibratory system may
generate a vibration on the right side to increase the user
experience indicating that the explosion had a real sound wave
creating the vibration. As yet a further example, the effects
system may have an air system where the effect is a puff of air
blown onto the user's face. This may create a feeling of closeness
with some fast moving object in the content. The effects system may
also have speakers directed towards the user's ears or an
attachment for ear buds, etc.
In embodiments, the effects generated by the effects system may be
scripted by an author to coordinate with the content. In
embodiments, sensors may be placed inside of the eye cover to
monitor content effects (e.g. a light sensor to measure strong
lighting effects or peripheral lighting effects) that would then
cause an effect(s) to be generated.
The effects system in the eye cover may be powered by an internal
battery and the battery, in embodiments, may also provide
additional power to the head-worn computer 102 as a back-up system.
In embodiments, the effects system is powered by the batteries in
the head-worn computer. Power may be delivered through the
attachment system (e.g. magnets, mechanical system) or a dedicated
power system.
The effects system may receive data and/or commands from the
head-worn computer through a data connection that is wired or
wireless. The data may come through the attachment system, a
separate line, or through Bluetooth or other short range
communication protocol, for example.
In embodiments, the eye cover 1402 is made of reticulated foam,
which is very light and can contour to the user's face. The
reticulated foam also allows air to circulate because of the
open-celled nature of the material, which can reduce user fatigue
and increase user comfort. The eye cover 1402 may be made of other
materials, soft, stiff, pliable, etc. and may have another material
on the periphery that contacts the face for comfort. In
embodiments, the eye cover 1402 may include a fan to exchange air
between an external environment and an internal space, where the
internal space is defined in part by the face of the user. The fan
may operate very slowly and at low power to exchange the air to
keep the face of the user cool. In embodiments the fan may have a
variable speed controller and/or a temperature sensor may be
positioned to measure temperature in the internal space to control
the temperature in the internal space to a specified range,
temperature, etc. The internal space is generally characterized by
the space confined space in front of the user's eyes and upper
cheeks where the eye cover encloses the area.
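One plausible control loop for such a fan, sketched below, maps the
measured internal-space temperature to a fan duty cycle; the
temperature band and the linear ramp are illustrative assumptions,
not values from this disclosure:

    def fan_speed(temp_c, lo=24.0, hi=30.0):
        """Map internal-space temperature (deg C) to a 0..1 fan duty cycle.

        Below lo the fan idles; above hi it runs at full speed; in
        between the speed ramps linearly, holding the space within the
        target range.
        """
        if temp_c <= lo:
            return 0.0
        if temp_c >= hi:
            return 1.0
        return (temp_c - lo) / (hi - lo)

    for t in (22.0, 26.0, 29.0, 31.0):
        print(t, "->", round(fan_speed(t), 2))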
Another aspect of the present disclosure relates to flexibly
mounting an audio headset on the head-worn computer 102 and/or the
eye cover 1402. In embodiments, the audio headset is mounted with a
relatively rigid system that has flexible joint(s) (e.g. a
rotational joint at the connection with the eye cover, a rotational
joint in the middle of a rigid arm, etc.) and extension(s) (e.g. a
telescopic arm) to provide the user with adjustability to allow for
a comfortable fit over, in or around the user's ear. In
embodiments, the audio headset is mounted with a flexible system
that is more flexible throughout, such as with a wire-based
connection.
FIG. 14g illustrates a head-worn computer 102 with removable lenses
1414 along with a mounted eye cover 1402. The head-worn computer,
in embodiments, includes a see-through display (as disclosed
herein). The eye cover 1402 also includes a mounted audio headset
1422. The mounted audio headset 1422 in this embodiment is mounted
to the eye cover 1402 and has audio wire connections (not shown).
In embodiments, the audio wire connections may connect to an
internal wireless communication system (e.g. Bluetooth, NFC, WiFi)
to make connection to the processor in the head-worn computer. In
embodiments, the audio wires may connect to a magnetic connector,
mechanical connector or the like to make the connection.
FIG. 14h illustrates an unmounted eye cover 1402 with a mounted
audio headset 1422. As illustrated, the mechanical design of the
eye cover is adapted to fit onto the head-worn computer to provide
full or partial visual isolation while supporting the audio headset.
In embodiments, the eye cover 1402 may be adapted to be removably
mounted on a head-worn computer 102 with a see-through computer
display. An audio headset 1422 with an adjustable mount may be
connected to the eye cover, wherein the adjustable mount may
provide extension and rotation to provide a user of the head-worn
computer with a mechanism to align the audio headset with an ear of
the user. In embodiments, the audio headset includes an audio wire
connected to a connector on the eye cover and the eye cover
connector may be adapted to removably mate with a connector on the
head-worn computer. In embodiments, the audio headset may be
adapted to receive audio signals from the head-worn computer 102
through a wireless connection (e.g. Bluetooth, WiFi). As described
elsewhere herein, the head-worn computer 102 may have a removable
and replaceable front lens 1414. The eye cover 1402 may include a
battery to power systems internal to the eye cover 1402. The eye
cover 1402 may have a battery to power systems internal to the
head-worn computer 102.
In embodiments, the eye cover 1402 may include a fan adapted to
exchange air between an internal space, defined in part by the
user's face, and an external environment to cool the air in the
internal space and the user's face. In embodiments, the audio
headset 1422 may include a vibratory system (e.g. a vibration
motor, piezo motor, etc. in the armature and/or in the section over
the ear) adapted to provide the user with a haptic feedback
coordinated with digital content presented in the see-through
computer display. In embodiments, the head-worn computer 102
includes a vibratory system adapted to provide the user with a
haptic feedback coordinated with digital content presented in the
see-through computer display.
In embodiments, the eye cover 1402 is adapted to be removably
mounted on a head-worn computer with a see-through computer
display. The eye cover 1402 may also include a flexible audio
headset mounted to the eye cover 1402, wherein the flexibility
provides the user of the head-worn computer 102 with a mechanism to
align the audio headset with an ear of the user. In embodiments,
the flexible audio headset is mounted to the eye cover 1402 with a
magnetic connection. In embodiments, the flexible audio headset may
be mounted to the eye cover 1402 with a mechanical connection.
In embodiments, the audio headset 1422 may be spring or otherwise
loaded such that the headset presses inward towards the user's
ears for a more secure fit.
Referring to FIG. 15, we now turn to describe a particular external
user interface 104, referred to generally as a pen 1500. The pen
1500 is a specially designed external user interface 104 and can
operate as a user interface to many different styles of HWC 102.
The pen 1500 generally follows the form of a conventional pen,
which is a familiar user handled device and creates an intuitive
physical interface for many of the operations to be carried out in
the HWC system 100. The pen 1500 may be one of several user
interfaces 104 used in connection with controlling operations
within the HWC system 100. For example, the HWC 102 may watch for
and interpret hand gestures 116 as control signals, where the pen
1500 may also be used as a user interface with the same HWC 102.
Similarly, a remote keyboard may be used as an external user
interface 104 in concert with the pen 1500. The combination of user
interfaces or the use of just one control system generally depends
on the operation(s) being executed in the HWC system 100.
While the pen 1500 may follow the general form of a conventional
pen, it contains numerous technologies that enable it to function
as an external user interface 104. FIG. 15 illustrates technologies
comprised in the pen 1500. As can be seen, the pen 1500 may include
a camera 1508, which is arranged to view through lens 1502. The
camera may then be focused, such as through lens 1502, to image a
surface upon which a user is writing or making other movements to
interact with the HWC 102. There are situations where the pen 1500
will also have an ink, graphite, or other system such that what is
being written can be seen on the writing surface. There are other
situations where the pen 1500 does not have such a physical writing
system so there is no deposit on the writing surface, where the pen
would only be communicating data or commands to the HWC 102. The
lens 1502 configuration is described in greater detail herein. The
function of the camera 1508 is to capture information from an
unstructured writing surface such that pen strokes can be
interpreted as intended by the user. To assist in the prediction
of the intended stroke path, the pen 1500 may include a sensor,
such as an IMU 1512. Of course, the IMU could be included in the
pen 1500 as separate parts (e.g. gyro, accelerometer, etc.) or as a
single unit. In this instance, the
IMU 1512 is used to measure and predict the motion of the pen 1500.
In turn, the integrated microprocessor 1510 would take the IMU
information and camera information as inputs and process the
information to form a prediction of the pen tip movement.
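The disclosure does not specify the fusion algorithm; as one hedged
illustration, a simple complementary filter could blend the IMU's
fast but drift-prone motion estimate with the camera's slower
absolute estimate (all values illustrative):

    def fuse_tip_velocity(v_imu, v_camera, alpha=0.7):
        """Blend fast-but-drifty IMU motion with slower absolute camera motion."""
        return tuple(alpha * i + (1.0 - alpha) * c for i, c in zip(v_imu, v_camera))

    def predict_tip(pos, v_imu, v_camera, dt=0.01):
        # Advance the estimated pen-tip position by the fused velocity.
        vx, vy = fuse_tip_velocity(v_imu, v_camera)
        return (pos[0] + vx * dt, pos[1] + vy * dt)

    print(predict_tip((0.0, 0.0), v_imu=(10.0, 0.5), v_camera=(9.0, 0.0)))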
The pen 1500 may also include a pressure monitoring system 1504,
such as to measure the pressure exerted on the lens 1502. As will
be described in greater detail herein, the pressure measurement can
be used to predict the user's intention for changing the weight of
a line, type of a line, type of brush, click, double click, and the
like. In embodiments, the pressure sensor may be constructed using
any force or pressure measurement sensor located behind the lens
1502, including for example, a resistive sensor, a current sensor,
a capacitive sensor, a voltage sensor such as a piezoelectric
sensor, and the like.
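As a rough sketch of how normalized pressure readings might be
mapped to stroke weight and click events (the thresholds and the
double-click window are illustrative assumptions, not values from
this disclosure):

    # Normalized pressure thresholds (illustrative).
    LIGHT, FIRM, CLICK = 0.05, 0.30, 0.80

    def interpret_pressure(p):
        if p >= CLICK:
            return "click"
        if p >= FIRM:
            return "heavy line"
        if p >= LIGHT:
            return "light line"
        return "hover"

    def detect_double_click(click_times, window=0.4):
        """Two click events within `window` seconds count as a double click."""
        return len(click_times) >= 2 and (click_times[-1] - click_times[-2]) <= window

    print(interpret_pressure(0.5))            # -> heavy line
    print(detect_double_click([1.00, 1.25]))  # -> True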
The pen 1500 may also include a communications module 1518, such as
for bi-directional communication with the HWC 102. In embodiments,
the communications module 1518 may be a short distance
communication module (e.g. Bluetooth). The communications module
1518 may be security matched to the HWC 102. The communications
module 1518 may be arranged to communicate data and commands to and
from the microprocessor 1510 of the pen 1500. The microprocessor
1510 may be programmed to interpret data generated from the camera
1508, IMU 1512, and pressure sensor 1504, and the like, and then
pass a command onto the HWC 102 through the communications module
1518, for example. In another embodiment, the data collected from
any of the input sources (e.g. camera 1508, IMU 1512, pressure
sensor 1504) by the microprocessor may be communicated by the
communication module 1518 to the HWC 102, and the HWC 102 may
perform data processing and prediction of the user's intention when
using the pen 1500. In yet another embodiment, the data may be
further passed on through a network 110 to a remote device 112,
such as a server, for the data processing and prediction. The
commands may then be communicated back to the HWC 102 for execution
(e.g. display writing in the glasses display, make a selection
within the UI of the glasses display, control a remote external
device 112, control a local external device 108), and the like. The
pen may also include memory 1514 for long or short term uses.
The pen 1500 may also include a number of physical user interfaces,
such as quick launch buttons 1522, a touch sensor 1520, and the
like. The quick launch buttons 1522 may be adapted to provide the
user with a fast way of jumping to a software application in the
HWC system 100. For example, the user may be a frequent user of
communication software packages (e.g. email, text, Twitter,
Instagram, Facebook, Google+, and the like), and the user may
program a quick launch button 1522 to command the HWC 102 to launch
an application. The pen 1500 may be provided with several quick
launch buttons 1522, which may be user programmable or factory
programmable. The quick launch button 1522 may be programmed to
perform an operation. For example, one of the buttons may be
programmed to clear the digital display of the HWC 102. This would
create a fast way for the user to clear the screens on the HWC 102
for any reason, such as for example to better view the environment.
The quick launch button functionality will be discussed in further
detail below. The touch sensor 1520 may be used to take gesture
style input from the user. For example, the user may be able to
take a single finger and run it across the touch sensor 1520 to
affect a page scroll.
The pen 1500 may also include a laser pointer 1524. The laser
pointer 1524 may be coordinated with the IMU 1512 to coordinate
gestures and laser pointing. For example, a user may use the laser
1524 in a presentation to help with guiding the audience with the
interpretation of graphics and the IMU 1512 may, either
simultaneously or when the laser 1524 is off, interpret the user's
gestures as commands or data input.
FIG. 16 illustrates yet another embodiment of the present
disclosure. FIG. 16 illustrates a watchband clip-on controller
2000. The watchband clip-on controller may be a controller used to
control the HWC 102 or devices in the HWC system 100. The watchband
clip-on controller 2000 has a fastener 2018 (e.g. rotatable clip)
that is mechanically adapted to attach to a watchband, as
illustrated at 2004.
The watchband controller 2000 may have quick launch interfaces 2008
(e.g. to launch applications and choosers as described herein), a
touch pad 2014 (e.g. to be used as a touch style mouse for GUI
control in a HWC 102 display) and a display 2012. The clip 2018 may
be adapted to fit a wide range of watchbands so it can be used in
connection with a watch that is independently selected for its
function. The clip, in embodiments, is rotatable such that a user
can position it in a desirable manner. In embodiments the clip may
be a flexible strap. In embodiments, the flexible strap may be
adapted to be stretched to attach to a hand, wrist, finger, device,
weapon, and the like.
In embodiments, the watchband controller may be configured as a
removable and replaceable watchband. For example, the controller may
be incorporated into a band with a certain width, segment
spacing, etc. such that the watchband, with its incorporated
controller, can be attached to a watch body. The attachment, in
embodiments, may be mechanically adapted to attach with a pin upon
which the watchband rotates. In embodiments, the watchband
controller may be electrically connected to the watch and/or watch
body such that the watch, watch body and/or the watchband
controller can communicate data between them.
The watchband controller 2000 may have 3-axis motion monitoring
(e.g. through an IMU, accelerometers, magnetometers, gyroscopes,
etc.) to capture user motion. The user motion may then be
interpreted for gesture control.
In embodiments, the watchband controller 2000 may comprise fitness
sensors and a fitness computer. The sensors may track heart rate,
calories burned, strides, distance covered, and the like. The data
may then be compared against performance goals and/or standards for
user feedback.
In embodiments directed to capturing images of the wearer's eye,
light to illuminate the wearer's eye can be provided by several
different sources including: light from the displayed image (i.e.
image light); light from the environment that passes through the
combiner or other optics; light provided by a dedicated eye light,
etc. FIGS. 17 and 18 show illustrations of dedicated eye
illumination lights 3420. FIG. 17 shows an illustration from a side
view in which the dedicated illumination eye light 3420 is
positioned at a corner of the combiner 3410 so that it doesn't
interfere with the image light 3415. The dedicated eye illumination
light 3420 is pointed so that the eye illumination light 3425
illuminates the eyebox 3427 where the eye 3430 is located when the
wearer is viewing displayed images provided by the image light
3415. FIG. 18 shows an illustration from the perspective of the eye
of the wearer to show how the dedicated eye illumination light 3420
is positioned at the corner of the combiner 3410. While the
dedicated eye illumination light 3420 is shown at the upper left
corner of the combiner 3410, other positions along one of the edges
of the combiner 3410, or other optical or mechanical components,
are possible as well. In other embodiments, more than one dedicated
eye light 3420 with different positions can be used. In an
embodiment, the dedicated eye light 3420 is an infrared light that
is not visible by the wearer (e.g. 800 nm) so that the eye
illumination light 3425 doesn't interfere with the displayed image
perceived by the wearer.
In embodiments, the eye imaging camera is inline with the image
light optical path, or part of the image light optical path. For
example, the eye camera may be positioned in the upper module to
capture eye image light that reflects back through the optical
system towards the image display. The eye image light may be
captured after reflecting off of the image source (e.g. in a DLP
configuration where the mirrors can be positioned to reflect the
light towards the eye image light camera). Alternatively, a
partially reflective surface may be placed along the image light
optical path such that, when the eye image light reflects back into
the upper or lower module, it is redirected in a direction from
which the eye imaging camera can capture the eye image light. In
other embodiments, the
eye image light camera is positioned outside of the image light
optical path. For example, the camera(s) may be positioned near the
outer lens of the platform.
FIG. 19 shows a series of illustrations of captured eye images that
show the eye glint (i.e. light that reflects off the front of the
eye) produced by a dedicated eye light mounted adjacent to the
combiner as previously described herein. In this embodiment of the
disclosure, captured images of the wearer's eye are analyzed to
determine the relative positions of the iris 3550, pupil, or other
portion of the eye, and the eye glint 3560. The eye glint is a
reflected image of the dedicated eye light 3420 when the dedicated
light is used. FIG. 19 illustrates the relative positions of the
iris 3550 and the eye glint 3560 for a variety of eye positions. By
providing a dedicated eye light 3420 in a fixed position, combined
with the fact that the human eye is essentially spherical, or at
least a reliably repeatable shape, the eye glint provides a fixed
reference point against which the determined position of the iris
can be compared to determine where the wearer is looking, either
within the displayed image or within the see-through view of the
surrounding environment. By positioning the dedicated eye light
3420 at a corner of the combiner 3410, the eye glint 3560 is formed
away from the iris 3550 in the captured images. As a result, the
positions of the iris and the eye glint can be determined more
easily and more accurately during the analysis of the captured
images, since they do not interfere with one another. In a further
embodiment, the combiner includes an associated cut filter that
prevents infrared light from the environment from entering the HWC
and the eye camera is an infrared camera, so that the eye glint
3560 is only provided by light from the dedicated eye light. For
example, the combiner can include a low pass filter that passes
visible light while reflecting infrared light from the environment
away from the eye camera, reflecting infrared light from the
dedicated eye light toward the user's eye and the eye camera can
include a high pass filter that absorbs visible light associated
with the displayed image while passing infrared light associated
with the eye image.
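A minimal sketch of the geometry just described follows; the pixel
coordinates and the linear gain are assumptions for illustration,
whereas a deployed system would calibrate the mapping per wearer:

    def gaze_from_glint(iris_xy, glint_xy, gain=0.25):
        """Return (yaw_deg, pitch_deg) from the glint-to-iris-center vector.

        The glint is a fixed reference (dedicated eye light), so the offset
        of the iris center from the glint varies with eye rotation.
        """
        dx = iris_xy[0] - glint_xy[0]
        dy = iris_xy[1] - glint_xy[1]
        return (gain * dx, gain * dy)

    print(gaze_from_glint(iris_xy=(132, 90), glint_xy=(120, 84)))  # -> (3.0, 1.5)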
In an embodiment of the eye imaging system, the lens for the eye
camera is designed to take into account the optics associated with
the upper module 202 and the lower module 204. This is accomplished
by designing the eye camera to include the optics in the upper
module 202 and optics in the lower module 204, so that a high MTF
image is produced, at the image sensor in the eye camera, of the
wearer's eye. In yet a further embodiment, the eye camera lens is
provided with a large depth of field to eliminate the need for
focusing the eye camera to enable sharp images of the eye to be
captured. A large depth of field is typically provided by a
high f/# lens (e.g. f/#>5). In this case, the reduced light
gathering associated with high f/# lenses is compensated by the
inclusion of a dedicated eye light to enable a bright image of the
eye to be captured. Further, the brightness of the dedicated eye
light can be modulated and synchronized with the capture of eye
images so that the dedicated eye light has a reduced duty cycle and
the brightness of infrared light on the wearer's eye is
reduced.
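The following sketch shows one way such synchronization might look,
with stub functions standing in for the LED driver and camera
exposure (both hypothetical; timings are illustrative):

    import time

    def led_on():  print("IR eye light on")       # stand-in for the LED driver
    def led_off(): print("IR eye light off")
    def expose(s): time.sleep(s); return "frame"  # stand-in for a camera exposure

    def capture_with_pulsed_light(n_frames=3, exposure_s=0.005, period_s=0.25):
        """Pulse the eye light only while the camera integrates.

        Duty cycle = exposure_s / period_s (2% here), so the mean infrared
        power on the eye drops accordingly versus continuous illumination.
        """
        frames = []
        for _ in range(n_frames):
            led_on()
            frames.append(expose(exposure_s))
            led_off()
            time.sleep(period_s - exposure_s)  # light stays dark between frames
        return frames

    capture_with_pulsed_light()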
In a further embodiment, FIG. 20a shows an illustration of an eye
image that is used to identify the wearer of the HWC. In this case,
an image of the wearer's eye 3611 is captured and analyzed for
patterns of identifiable features 3612. The patterns are then
compared to a database of eye images to determine the identity of
the wearer. After the identity of the wearer has been verified, the
operating mode of the HWC and the types of images, applications,
and information to be displayed can be adjusted and controlled in
correspondence to the determined identity of the wearer. Examples
of adjustments to the operating mode depending on who the wearer is
determined to be or not be include: making different operating
modes or feature sets available, shutting down or sending a message
to an external network, allowing guest features and applications to
run, etc.
FIG. 20b is an illustration of another embodiment using eye
imaging, in which the sharpness of the displayed image is
determined based on the eye glint produced by the reflection of the
displayed image from the wearer's eye surface. By capturing images
of the wearer's eye 3611, an eye glint 3622, which is a small
version of the displayed image, can be captured and analyzed for
sharpness. If the displayed image is determined to not be sharp,
then an automated adjustment to the focus of the HWC optics can be
performed to improve the sharpness. This ability to perform a
measurement of the sharpness of a displayed image at the surface of
the wearer's eye can provide a very accurate measurement of image
quality. Having the ability to measure and automatically adjust the
focus of displayed images can be very useful in augmented reality
imaging where the focus distance of the displayed image can be
varied in response to changes in the environment or changes in the
method of use by the wearer.
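One hedged way to realize this, sketched below, scores the glint
image with a gradient-energy sharpness metric and hill-climbs the
focus setting; the metric, step size, and toy capture model are
assumptions, not the disclosed method:

    def sharpness(img):
        """Sum of squared horizontal/vertical pixel differences (gradient energy)."""
        h, w = len(img), len(img[0])
        s = 0.0
        for y in range(h - 1):
            for x in range(w - 1):
                s += (img[y][x + 1] - img[y][x]) ** 2 + (img[y + 1][x] - img[y][x]) ** 2
        return s

    def refocus(capture, focus=0.0, step=0.05, iters=10):
        """Hill-climb the focus setting toward maximum glint-image sharpness."""
        best = sharpness(capture(focus))
        for _ in range(iters):
            for cand in (focus + step, focus - step):
                sc = sharpness(capture(cand))
                if sc > best:
                    best, focus = sc, cand
        return focus

    def fake_capture(focus, true_focus=0.2):
        # Toy stand-in: contrast falls off as the focus error grows.
        blur = min(1.0, abs(focus - true_focus))
        return [[v * (1.0 - blur) for v in row] for row in [[0.0, 10.0], [10.0, 0.0]]]

    print(round(refocus(fake_capture), 2))  # converges toward 0.2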
An aspect of the present disclosure relates to controlling the HWC
102 through interpretations of eye imagery. In embodiments,
eye-imaging technologies, such as those described herein, are used
to capture an eye image or a series of eye images for processing.
The image(s) may be processed to determine a user intended action,
an HWC predetermined reaction, or other action. For example, the
imagery may be interpreted as an affirmative user control action
for an application on the HWC 102. Or, the imagery may cause, for
example, the HWC 102 to react in a pre-determined way such that the
HWC 102 is operating safely, intuitively, etc.
FIG. 21 illustrates an eye imagery process that involves imaging
the HWC 102 wearer's eye(s) and processing the images (e.g. through
eye imaging technologies described herein) to determine in what
position 3702 the eye is relative to its neutral or forward
looking position and/or the FOV 3708. The process may involve a
calibration step where the user is instructed, through guidance
provided in the FOV of the HWC 102, to look in certain directions
such that a more accurate prediction of the eye position relative
to areas of the FOV can be made. In the event the wearer's eye is
determined to be looking towards the right side of the FOV 3708 (as
illustrated in FIG. 21, the eye is looking out of the page) a
virtual target line may be established to project what in the
environment the wearer may be looking towards or at. The virtual
target line may be used in connection with an image captured by
a camera on the HWC 102 that images the surrounding environment in
front of the wearer. In embodiments, the field of view of the
camera capturing the surrounding environment matches, or can be
matched (e.g. digitally), to the FOV 3708 such that the comparison
is made more clear. For example, with the camera
capturing the image of the surroundings at an angle that matches
the FOV 3708, the virtual line can be processed (e.g. in 2d or 3d,
depending on the camera image capabilities and/or the processing
of the images) by projecting what surrounding environment objects
align with the virtual target line. In the event there are multiple
objects along the virtual target line, focal planes may be
established corresponding to each of the objects such that digital
content may be placed in an area in the FOV 3708 that aligns with
the virtual target line and falls at a focal plane of an
intersecting object. The user then may see the digital content when
he focuses on the object in the environment, which is at the same
focal plane. In embodiments, objects in line with the virtual
target line may be established by comparison to mapped information
of the surroundings.
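As an illustrative sketch of the virtual-target-line computation
(the object data, ray test, and units below are assumptions), each
known object can be tested against the gaze ray and the nearest hit
taken as the focal plane for content placement:

    def along_ray(origin, direction, obj_center, radius):
        """Distance along the unit-direction ray at closest approach, or None on a miss."""
        vx, vy, vz = (c - o for c, o in zip(obj_center, origin))
        t = vx * direction[0] + vy * direction[1] + vz * direction[2]
        miss2 = vx * vx + vy * vy + vz * vz - t * t  # squared miss distance
        return t if t > 0 and miss2 <= radius * radius else None

    objects = {"building": ((0.0, 0.0, 20.0), 3.0), "sign": ((0.5, 0.0, 8.0), 1.0)}
    gaze_dir = (0.0, 0.0, 1.0)  # unit vector along the virtual target line

    hits = {name: d for name, (center, r) in objects.items()
            if (d := along_ray((0.0, 0.0, 0.0), gaze_dir, center, r)) is not None}
    nearest = min(hits, key=hits.get)
    print(nearest, "at", hits[nearest], "m")  # content placed at this focal plane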
In embodiments, the digital content that is in line with the
virtual target line may not be displayed in the FOV until the eye
position is in the right position. This may be a predetermined
process. For example, the system may be set up such that a
particular piece of digital content (e.g. an advertisement,
guidance information, object information, etc.) will appear in the
event that the wearer looks at a certain object(s) in the
environment. A virtual target line(s) may be developed that
virtually connects the wearer's eye with an object(s) in the
environment (e.g. a building, portion of a building, mark on a
building, gps location, etc.) and the virtual target line may be
continually updated depending on the position and viewing direction
of the wearer (e.g. as determined through GPS, e-compass, IMU,
etc.) and the position of the object. When the virtual target line
suggests that the wearer's pupil is substantially aligned with the
virtual target line or about to be aligned with the virtual target
line, the digital content may be displayed in the FOV 3704.
In embodiments, the time spent looking along the virtual target
line and/or a particular portion of the FOV 3708 may indicate that
the wearer is interested in an object in the environment and/or
digital content being displayed. In the event there is no digital
content being displayed at the time a predetermined period of time
is spent looking at a direction, digital content may be presented
in the area of the FOV 3708. The time spent looking at an object
may be interpreted as a command to display information about the
object, for example. In other embodiments, the content may not
relate to the object and may be presented because of the indication
that the person is relatively inactive. In embodiments, the digital
content may be positioned in proximity to the virtual target line,
but not in-line with it such that the wearer's view of the
surroundings is not obstructed but information can augment the
wearer's view of the surroundings. In embodiments, the time spent
looking along a target line in the direction of displayed digital
content may be an indication of interest in the digital content.
This may be used as a conversion event in advertising. For example,
an advertiser may pay more for an ad placement if the wearer of
the HWC 102 looks at a displayed advertisement for a certain period
of time. As such, in embodiments, the time spent looking at the
advertisement, as assessed by comparing eye position with the
content placement, target line or other appropriate position may be
used to determine a rate of conversion or other compensation amount
due for the presentation.
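A minimal dwell-metering sketch follows; the region bounds, gaze
sample format, and 2-second threshold are illustrative assumptions:

    def dwell_conversions(samples, region, threshold_s=2.0):
        """samples: list of (t_seconds, x, y) gaze points; region: (x0, y0, x1, y1)."""
        x0, y0, x1, y1 = region
        dwell, last_t, conversions = 0.0, None, 0
        for t, x, y in samples:
            inside = x0 <= x <= x1 and y0 <= y <= y1
            if inside and last_t is not None:
                dwell += t - last_t
                if dwell >= threshold_s:        # sustained gaze -> conversion event
                    conversions, dwell = conversions + 1, 0.0
            elif not inside:
                dwell = 0.0                     # gaze left the ad region; reset
            last_t = t
        return conversions

    samples = [(i * 0.5, 120, 80) for i in range(6)]               # 2.5 s inside region
    print(dwell_conversions(samples, region=(100, 60, 200, 120)))  # -> 1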
An aspect of the disclosure relates to removing content from the
FOV of the HWC 102 when the wearer of the HWC 102 apparently wants
to view the surrounding environments clearly. FIG. 22 illustrates a
situation where eye imagery suggests that the eye has or is moving
quickly so the digital content 3804 in the FOV 3808 is removed from
the FOV 3808. In this example, the wearer may be looking quickly to
the side indicating that there is something on the side in the
environment that has grabbed the wearer's attention. This eye
movement 3802 may be captured through eye imaging techniques (e.g.
as described herein) and if the movement matches a predetermined
movement (e.g. speed, rate, pattern, etc.) the content may be
removed from view. In embodiments, the eye movement is used as one
input and HWC movements indicated by other sensors (e.g. IMU in the
HWC) may be used as another indication. These various sensor
movements may be used together to project an event that should
cause a change in the content being displayed in the FOV.
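One hedged realization, sketched below, flags saccade-like motion
when the angular velocity between gaze samples exceeds a threshold
(the 100 deg/s figure is an assumption) and lets other sensors
corroborate the event:

    def should_clear_fov(eye_positions_deg, dt_s, imu_spike=False,
                         velocity_thresh=100.0):
        """eye_positions_deg: successive gaze angles; True on fast eye motion."""
        for a, b in zip(eye_positions_deg, eye_positions_deg[1:]):
            if abs(b - a) / dt_s >= velocity_thresh:
                return True
        return imu_spike  # other sensors (e.g. IMU) can also trigger the event

    print(should_clear_fov([0.0, 1.0, 9.0], dt_s=0.02))  # 400 deg/s jump -> True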
Another aspect of the present disclosure relates to determining a
focal plane based on the wearer's eye convergence. Eyes are
generally converged slightly and converge more when the person
focuses on something very close. This is generally referred to as
convergence. In embodiments, convergence is calibrated for the
wearer. That is, the wearer may be guided through certain focal
plane exercises to determine how much the wearer's eyes converge at
various focal planes and at various viewing angles. The convergence
information may then be stored in a database for later reference.
In embodiments, a general table may be used in the event there is
no calibration step or the person skips the calibration step. The
two eyes may then be imaged periodically to determine the
convergence in an attempt to understand what focal plane the wearer
is focused on. In embodiments, the eyes may be imaged to determine
a virtual target line and then the eye's convergence may be
determined to establish the wearer's focus, and the digital content
may be displayed or altered based thereon.
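The underlying geometry admits a simple estimate: for interpupillary
distance IPD and total convergence angle theta between the two gaze
directions, the focal distance d satisfies tan(theta/2) = (IPD/2)/d.
A sketch, with illustrative values:

    import math

    def focal_distance_m(ipd_m, convergence_deg):
        """convergence_deg: total angle between the two gaze directions."""
        half = math.radians(convergence_deg / 2.0)
        return (ipd_m / 2.0) / math.tan(half)

    print(round(focal_distance_m(0.064, 3.7), 2))  # ~1 m for a 64 mm IPD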
FIG. 23 illustrates a situation where digital content is moved 3902
within one or both of the FOVs 3908 and 3910 to align with the
convergence of the eyes as determined by the pupil movement 3904.
By moving the digital content to maintain alignment, in
embodiments, the overlapping nature of the content is maintained so
the object appears properly to the wearer. This can be important in
situations where 3D content is displayed.
An aspect of the present disclosure relates to controlling the HWC
102 based on events detected through eye imaging. A wearer winking,
blinking, moving his eyes in a certain pattern, etc. may, for
example, control an application of the HWC 102. Eye imaging (e.g.
as described herein) may be used to monitor the eye(s) of the
wearer and once a pre-determined pattern is detected an application
control command may be initiated.
An aspect of the disclosure relates to monitoring the health of a
person wearing a HWC 102 by monitoring the wearer's eye(s).
Calibrations may be made such that the normal performance, under
various conditions (e.g. lighting conditions, image light
conditions, etc.) of a wearer's eyes may be documented. The
wearer's eyes may then be monitored through eye imaging (e.g. as
described herein) for changes in their performance. Changes in
performance may be indicative of a health concern (e.g. concussion,
brain injury, stroke, loss of blood, etc.). If detected the data
indicative of the change or event may be communicated from the HWC
102.
Aspects of the present disclosure relate to security and access of
computer assets (e.g. the HWC itself and related computer systems)
as determined through eye image verification. As discussed herein
elsewhere, eye imagery may be compared to known person eye imagery
to confirm a person's identity. Eye imagery may also be used to
confirm the identity of people wearing the HWCs 102 before allowing
them to link together or share files, streams, information,
etc.
A variety of use cases for eye imaging are possible based on
technologies described herein. An aspect of the present disclosure
relates to the timing of eye image capture. The timing of the
capture of the eye image and the frequency of the capture of
multiple images of the eye can vary dependent on the use case for
the information gathered from the eye image. For example, capturing
an eye image to identify the user of the HWC may be required only
when the HWC has been turned ON or when the HWC determines that the
HWC has been put onto a wearer's head to control the security of
the HWC and the associated information that is displayed to the
user, wherein the orientation, movement pattern, stress or position
of the earhorns (or other portions of the HWC) of the HWC can be
used to determine that a person has put the HWC onto their head
with the intention to use the HWC. Those same parameters may be
monitored in an effort to understand when the HWC is dismounted
from the user's head. This may enable a situation where the capture
of an eye image for identifying the wearer may be completed only
when a change in the wearing status is identified. In a contrasting
example, capturing eye images to monitor the health of the wearer
may require images to be captured periodically (e.g. every few
seconds, minutes, hours, days, etc.). For example, the eye images
may be taken at minute intervals when the images are being used to
monitor the health of the wearer when detected movements indicate
that the wearer is exercising. In a further contrasting example,
capturing eye images to monitor the health of the wearer for
long-term effects may only require that eye images be captured
monthly. Embodiments of the disclosure relate to selection of the
timing and rate of capture of eye images to be in correspondence
with the selected use scenario associated with the eye images.
These selections may be done automatically, as with the exercise
example above where movements indicate exercise, or these
selections may be set manually. In a further embodiment, the
selection of the timing and rate of eye image capture is adjusted
automatically depending on the mode of operation of the HWC. The
selection of the timing and rate of eye image capture can further
be selected in correspondence with input characteristics associated
with the wearer including age and health status, or sensed physical
conditions of the wearer including heart rate, chemical makeup of
the blood and eye blink rate.
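A policy table, as sketched below, is one way to encode such a
selection; the mode names and interval values are illustrative
assumptions, not values from this disclosure:

    # Capture intervals in seconds per use case (illustrative).
    CAPTURE_POLICY = {
        "identify_on_don": None,               # single capture on wear-status change
        "health_exercise": 60.0,               # once per minute while exercising
        "health_periodic": 3600.0,             # hourly background check
        "health_longterm": 30 * 24 * 3600.0,   # roughly monthly
    }

    def next_capture_interval(use_case, wear_status_changed=False):
        if use_case == "identify_on_don":
            # Capture immediately on a wear-status change, otherwise never.
            return 0.0 if wear_status_changed else float("inf")
        return CAPTURE_POLICY[use_case]

    print(next_capture_interval("identify_on_don", wear_status_changed=True))  # 0.0
    print(next_capture_interval("health_exercise"))                            # 60.0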
FIG. 24 illustrates a cross section of an eyeball of a wearer of an
HWC with focus points that can be associated with the eye imaging
system of the disclosure. The eyeball 5010 includes an iris 5012
and a retina 5014. Because the eye imaging system of the disclosure
provides coaxial eye imaging with a display system, images of the
eye can be captured from a perspective directly in front of the eye
and inline with where the wearer is looking. In embodiments of the
disclosure, the eye imaging system can be focused at the iris 5012
and/or the retina 5014 of the wearer, to capture images of the
external surface of the iris 5012 or the internal portions of the
eye, which includes the retina 5014. FIG. 24 shows light rays 5020
and 5025 that are respectively associated with capturing images of
the iris 5012 or the retina 5014 wherein the optics associated with
the eye imaging system are respectively focused at the iris 5012 or
the retina 5014. Illuminating light can also be provided in the eye
imaging system to illuminate the iris 5012 or the retina 5014. FIG.
25 shows an illustration of an eye including an iris 5130 and a
sclera 5125. In embodiments, the eye imaging system can be used to
capture images that include the iris 5130 and portions of the
sclera 5125. The images can then be analyzed to determine color,
shapes and patterns that are associated with the user. In further
embodiments, the focus of the eye imaging system is adjusted to
enable images to be captured of the iris 5012 or the retina 5014.
Illuminating light can also be adjusted to illuminate the iris 5012
or to pass through the pupil of the eye to illuminate the retina
5014. The illuminating light can be visible light to enable capture
of colors of the iris 5012 or the retina 5014, or the illuminating
light can be ultraviolet (e.g. 340 nm), near infrared (e.g. 850 nm)
or mid-wave infrared (e.g. 5000 nm) light to enable capture of
hyperspectral characteristics of the eye.
FIGS. 26a and 26b illustrate captured images of eyes where the eyes
are illuminated with structured light patterns. In FIG. 26a, an eye
5220 is shown with a projected structured light pattern 5230, where
the light pattern is a grid of lines. A light pattern such as
5230 can be provided by the light source 5355 by including a
diffractive or a refractive device to modify the light 5357, as is
known by those skilled in the art. A visible light source can also
be included for the second camera, which can include a diffractive
or refractive device to modify the light 5467 to provide a light
pattern.
FIG. 26b illustrates how the structured light pattern of 5230
becomes distorted to 5235 when the user's eye 5225 looks to the
side. This distortion comes from the fact that the human eye is not
completely spherical in shape; instead, the iris sticks out slightly
from the eyeball to form a bump in the area of the iris. As a
result, the shape of the eye and the associated shape of the
reflected structured light pattern is different depending on which
direction the eye is pointed, when images of the eye are captured
from a fixed position. Changes in the structured light pattern can
subsequently be analyzed in captured eye images to determine the
direction that the eye is looking.
The eye imaging system can also be used for the assessment of
aspects of health of the user. In this case, information gained
from analyzing captured images of the iris 5130 or sclera 5125 is
different from information gained from analyzing captured images of
the retina 5014. Images of the retina 5014 are captured using
light that illuminates the inner portions of the eye including the
retina 5014. The light can be visible light, but in an embodiment,
the light is infrared light (e.g. wavelength 1 to 5 microns) and
the eye camera is an infrared light sensor (e.g. an InGaAs sensor)
or a low resolution infrared image sensor that is used to determine
the relative amount of light that is absorbed, reflected or
scattered by the inner portions of the eye, wherein the majority of
the light that is absorbed, reflected or scattered can be
attributed to materials in the inner portion of the eye including
the retina where there are densely packed blood vessels with thin
walls so that the absorption, reflection and scattering are caused
by the material makeup of the blood. These measurements can be
conducted automatically when the user is wearing the HWC, either at
regular intervals, after identified events or when prompted by an
external communication. In a preferred embodiment, the illuminating
light is near infrared or mid infrared (e.g. 0.7 to 5 microns
wavelength) to reduce the chance for thermal damage to the wearer's
eye. In a further embodiment, the light source and the camera
together comprise a spectrometer wherein the relative intensity of
the light reflected by the eye is analyzed over a series of narrow
wavelengths within the range of wavelengths provided by the light
source to determine a characteristic spectrum of the light that is
absorbed, reflected or scattered by the eye. For example, the light
source can provide a broad range of infrared light to illuminate
the eye and the camera can include: a grating to laterally disperse
the reflected light from the eye into a series of narrow wavelength
bands that are captured by a linear photodetector so that the
relative intensity by wavelength can be measured and a
characteristic absorbance spectrum for the eye can be determined
over the broad range of infrared. In a further example, the light
source can provide a series of narrow wavelengths of light
(ultraviolet, visible or infrared) to sequentially illuminate the
eye, and the camera includes a photodetector that is selected to measure
the relative intensity of the series of narrow wavelengths in a
series of sequential measurements that together can be used to
determine a characteristic spectrum of the eye. The determined
characteristic spectrum is then compared to known characteristic
spectra for different materials to determine the material makeup of
the eye. In yet another embodiment, the illuminating light is
focused on the retina and a characteristic spectrum of the retina
is determined and the spectrum is compared to known spectra for
materials that may be present in the user's blood. For example, in
the visible wavelengths 540 nm is useful for detecting hemoglobin
and 660 nm is useful for differentiating oxygenated hemoglobin. In
a further example, in the infrared, a wide variety of materials can
be identified as is known by those skilled in the art, including:
glucose, urea, alcohol and controlled substances.
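As a toy illustration of the matching step (the reference absorbance
values below are invented placeholders, not measured spectra, and
the 940 nm band is an assumption), a nearest-neighbor comparison
against known characteristic spectra suffices:

    # Hypothetical absorbance values at 540 / 660 / 940 nm.
    REFERENCE = {
        "oxygenated hemoglobin":   [0.10, 0.35, 0.80],
        "deoxygenated hemoglobin": [0.12, 0.70, 0.55],
    }

    def closest_material(measured):
        """Return the reference material with the smallest squared-error distance."""
        def dist(ref):
            return sum((m - r) ** 2 for m, r in zip(measured, ref))
        return min(REFERENCE, key=lambda name: dist(REFERENCE[name]))

    print(closest_material([0.11, 0.40, 0.75]))  # -> oxygenated hemoglobin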
Another aspect of the present disclosure relates to an intuitive
user interface mounted on the HWC 102 where the user interface
includes tactile feedback (otherwise referred to as haptic
feedback) to the user to provide the user an indication of
engagement and change. In embodiments, the user interface is a
rotating element on a temple section of a glasses form factor of
the HWC 102. The rotating element may include segments such that it
positively engages at certain predetermined angles. This
facilitates a tactile feedback to the user. As the user turns the
rotating element it `clicks` through its predetermined steps or
angles and each step causes a displayed user interface content to
be changed. For example, the user may cycle through a set of menu
items or selectable applications. In embodiments, the rotating
element also includes a selection element, such as a
pressure-induced section where the user can push to make a
selection.
FIG. 27 illustrates a human head wearing a head-worn computer in a
glasses form factor. The glasses have a temple section 11702 and a
rotating user interface element 11704. The user can rotate the
rotating element 11704 to cycle through options presented as
content in the see-through display of the glasses. FIG. 28
illustrates several examples of different rotating user interface
elements 11704a, 11704b and 11704c. Rotating element 11704a is
mounted at the front end of the temple and has significant side and
top exposure for user interaction. Rotating element 11704b is
mounted further back and also has significant exposure (e.g. 270
degrees of touch). Rotating element 11704c has less exposure and is
exposed for interaction on the top of the temple. Other embodiments
may have a side or bottom exposure.
Another aspect of the present disclosure relates to a haptic system
in a head-worn computer. Creating visual, audio, and haptic
sensations in coordination can increase the enjoyment or
effectiveness of awareness in a number of situations. For example,
when viewing a movie or playing a game while digital content is
presented in a computer display of a head-worn computer, it is more
immersive to include coordinated sound and haptic effects. When
presenting information in the head-worn computer, it may be
advantageous to present a haptic effect to enhance or convey the
information. For example, the haptic sensation may gently cause the
user of the head-worn computer to believe that there is some presence
on the user's right side, but out of sight. It may be a very light
haptic effect to cause the `tingling` sensation of a presence of
unknown origin. It may be a high intensity haptic sensation to
coordinate with an apparent explosion, either out of sight or
in-sight in the computer display. Haptic sensations can be used to
generate a perception in the user that objects and events are close
by. As another example, digital content may be presented to the
user in the computer displays and the digital content may appear to
be within reach of the user. If the user reaches out his hand in an
attempt to touch the digital object, which is not a real object,
the haptic system may cause a sensation and the user may interpret
the sensation as a touching sensation. The haptic system may
generate slight vibrations near one or both temples for example and
the user may infer from those vibrations that he has touched the
digital object. This additional dimension in sensory feedback can
be very useful and create a more intuitive and immersive user
experience.
Another aspect of the present disclosure relates to controlling and
modulating the intensity of a haptic system in a head-worn
computer. In embodiments, the haptic system includes separate piezo
strips such that each of the separate strips can be controlled
separately. Each strip may be controlled over a range of vibration
levels and some of the separate strips may have a greater vibration
capacity than others. For example, a set of strips may be mounted
in the arm of the head-worn computer (e.g. near the user's temple,
ear, rear of the head, substantially along the length of the arm,
etc.) and the further forward the strip the higher capacity the
strip may have. The strips of varying capacity could be arranged in
any number of ways, including linear, curved, compound shape, two
dimensional array, one dimensional array, three dimensional array,
etc. A processor in the head-worn computer may regulate the power
applied to the strips individually, in sub-groups, as a whole, etc.
In embodiments, separate strips or segments of varying capacity are
individually controlled to generate a finely controlled multi-level
vibration system. Patterns based on frequency, duration, intensity,
segment type, and/or other control parameters can be used to
generate signature haptic feedback. For example, to simulate the
haptic feedback of an explosion close to the user, a high
intensity, low frequency, and moderate duration may be a pattern to
use. A bullet whipping by the user may be simulated with a higher
frequency and shorter duration. Following this disclosure, one can
imagine various patterns for various simulation scenarios.
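Following the examples above, such signature patterns might be
encoded as tuples of frequency, duration, intensity, and segment
selection; the specific numbers and the four-segment layout in this
sketch are assumptions:

    PATTERNS = {
        # (frequency_hz, duration_s, intensity 0..1, segments to drive)
        "explosion": (40, 0.6, 1.0, [0, 1, 2, 3]),  # low freq, high intensity, all segments
        "bullet":    (180, 0.08, 0.5, [3]),          # high freq, short, front segment only
    }

    def play(pattern_name, drive):
        """Send the pattern to each selected segment via a driver callback."""
        freq, dur, level, segments = PATTERNS[pattern_name]
        for seg in segments:
            drive(seg, freq, dur, level)

    play("explosion",
         lambda s, f, d, l: print(f"segment {s}: {f} Hz, {d} s, level {l}"))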
Another aspect of the present disclosure relates to making a
physical connection between the haptic system and the user's head.
Typically, with a glasses format, the glasses touch the user's head
in several places (e.g. ears, nose, forehead, etc.) and these areas
may be satisfactory to generate the necessary haptic feedback. In
embodiments, an additional mechanical element may be added to
better translate the vibration from the haptic system to a desired
location on the user's head. For example, a vibration or signal
conduit may be added to the head-worn computer such that there is a
vibration translation medium between the head-worn computer's
internal haptic system and the user's temple area.
FIG. 29 illustrates a head-worn computer 102 with a haptic system
comprised of piezo strips 29002. In this embodiment, the piezo
strips 29002 are arranged linearly with strips of increasing
vibration capacity from back to front of the arm 29004. The
increasing capacity may be provided by different sized strips, for
example. This arrangement can cause a progressively increased
vibration power 29003 from back to front. This arrangement is
provided for ease of explanation; other arrangements are
contemplated by the inventors of the present application and these
examples should not be construed as limiting. The head-worn
computer 102 may also have a vibration or signal conduit 29001 that
facilitates the physical vibrations from the haptic system to the
head of the user 29005. The vibration conduit may be malleable to
form to the head of the user for a tighter or more appropriate
fit.
An aspect of the present invention relates to a head-worn computer,
comprising: a frame adapted to hold a computer display in front of
a user's eye; a processor adapted to present digital content in the
computer display and to produce a haptic signal in coordination
with the digital content display; and a haptic system comprised of
a plurality of haptic segments, wherein each of the haptic segments
is individually controlled in coordination with the haptic signal.
In embodiments, the haptic segments comprise a piezo strip
activated by the haptic signal to generate a vibration in the
frame. The intensity of the haptic system may be increased by
activating more than one of the plurality of haptic segments. The
intensity may be further increased by activating more than 2 of the
plurality of haptic segments. In embodiments, each of the plurality
of haptic segments comprises a different vibration capacity. In
embodiments, the intensity of the haptic system may be regulated
depending on which of the plurality of haptic segments is
activated. In embodiments, each of the plurality of haptic segments
is mounted in a linear arrangement and the segments are arranged
such that the higher capacity segments are at one end of the linear
arrangement. In embodiments, the linear arrangement is from back to
front on an arm of the head-worn computer. In embodiments, the
linear arrangement is proximate a temple of the user. In
embodiments, the linear arrangement is proximate an ear of the
user. In embodiments, the linear arrangement is proximate a rear
portion of the user's head. In embodiments, the linear arrangement
is from front to back on an arm of the head-worn computer, or
otherwise arranged.
An aspect of the present disclosure provides a head-worn computer
with a vibration conduit, wherein the vibration conduit is mounted
proximate the haptic system and adapted to touch the skin of the
user's head to facilitate vibration sensations from the haptic
system to the user's head. In embodiments, the vibration conduit is
mounted on an arm of the head-worn computer. In embodiments, the
vibration conduit touches the user's head proximate a temple of the
user's head. In embodiments, the vibration conduit is made of a
soft material that deforms to increase contact area with the user's
head.
An aspect of the present disclosure relates to a haptic array
system in a head-worn computer. The haptic array(s) can
correlate vibratory sensations to indicate events, scenarios, etc.
to the wearer. The vibrations may correlate or respond to auditory
or visual elements, proximity to elements, etc. of a video game,
movie, or relationships to elements in the real world as a means of
augmenting the wearer's reality. Examples include physical proximity
to objects in a wearer's environment, sudden changes in elevation in
the path of the wearer (e.g. about to step off a curb), explosions
in a game, or bullets passing by a wearer. Haptic effects from a
piezo array(s) that makes contact with the side of the wearer's head
may be adapted to effect sensations that correlate to other events
experienced by the wearer.
FIG. 29a illustrates a haptic system according to the principles of
the present disclosure. In embodiments, the piezo strips are
mounted or deposited, with varying width and thus varying force, as
piezo elements on a rigid or flexible, non-conductive substrate
attached to, or part of, the temples of glasses, goggles, bands or
other form
factor. The non-conductive substrate may conform to the curvature
of a head by being curved and it may be able to pivot (e.g. in and
out, side to side, up and down, etc.) from a person's head. This
arrangement may be mounted to the inside of the temples of a pair
of glasses. Similarly, the vibration conduit, described herein
elsewhere, may be mounted with a pivot. As can be seen in FIG. 29a,
the piezo strips 29002 may be mounted on a substrate and the
substrate may be mounted to the inside of a glasses arm, strap,
etc. The piezo strips in this embodiment increase in vibration
capacity as they move forward.
Although embodiments of HWC have been described in language
specific to features, systems, computer processes and/or methods,
the appended claims are not necessarily limited to the specific
features, systems, computer processes and/or methods described.
Rather, the specific features, systems, computer processes and/or
methods are disclosed as non-limiting example implementations of
HWC. All documents referenced herein are hereby incorporated by
reference.
The methods and systems described herein may be deployed in part or
in whole through a machine that executes computer software, program
codes, and/or instructions on a processor. The processor may be
part of a server, cloud server, client, network infrastructure,
mobile computing platform, stationary computing platform, or other
computing platform. A processor may be any kind of computational or
processing device capable of executing program instructions, codes,
binary instructions and the like. The processor may be or include a
signal processor, digital processor, embedded processor,
microprocessor or any variant such as a co-processor (math
co-processor, graphic co-processor, communication co-processor and
the like) and the like that may directly or indirectly facilitate
execution of program code or program instructions stored thereon.
In addition, the processor may enable execution of multiple
programs, threads, and codes. The threads may be executed
simultaneously to enhance the performance of the processor and to
facilitate simultaneous operations of the application. By way of
implementation, methods, program codes, program instructions and
the like described herein may be implemented in one or more threads.
The thread may spawn other threads that may have assigned
priorities associated with them; the processor may execute these
threads based on priority or any other order based on instructions
provided in the program code. The processor may include memory that
stores methods, codes, instructions and programs as described
herein and elsewhere. The processor may access a storage medium
through an interface that may store methods, codes, and
instructions as described herein and elsewhere. The storage medium
associated with the processor for storing methods, programs, codes,
program instructions or other type of instructions capable of being
executed by the computing or processing device may include but may
not be limited to one or more of a CD-ROM, DVD, memory, hard disk,
flash drive, RAM, ROM, cache and the like.
A processor may include one or more cores that may enhance speed
and performance of a multiprocessor. In embodiments, the processor
may be a dual core processor, quad core processor, or other
chip-level multiprocessor and the like that combines two or more
independent cores on a single die.
The methods and systems described herein may be deployed in part or
in whole through a machine that executes computer software on a
server, client, firewall, gateway, hub, router, or other such
computer and/or networking hardware. The software program may be
associated with a server that may include a file server, print
server, domain server, internet server, intranet server and other
variants such as secondary server, host server, distributed server
and the like. The server may include one or more of memories,
processors, computer readable transitory and/or non-transitory
media, storage media, ports (physical and virtual), communication
devices, and interfaces capable of accessing other servers,
clients, machines, and devices through a wired or a wireless
medium, and the like. The methods, programs or codes as described
herein and elsewhere may be executed by the server. In addition,
other devices required for execution of methods as described in
this application may be considered as a part of the infrastructure
associated with the server.
The server may provide an interface to other devices including,
without limitation, clients, other servers, printers, database
servers, print servers, file servers, communication servers,
distributed servers and the like. Additionally, this coupling
and/or connection may facilitate remote execution of program across
the network. The networking of some or all of these devices may
facilitate parallel processing of a program or method at one or
more locations without deviating from the scope of the invention. In
addition, all the devices attached to the server through an
interface may include at least one storage medium capable of
storing methods, programs, code and/or instructions. A central
repository may provide program instructions to be executed on
different devices. In this implementation, the remote repository
may act as a storage medium for program code, instructions, and
programs.
The software program may be associated with a client that may
include a file client, print client, domain client, internet
client, intranet client and other variants such as secondary
client, host client, distributed client and the like. The client
may include one or more of memories, processors, computer readable
transitory and/or non-transitory media, storage media, ports
(physical and virtual), communication devices, and interfaces
capable of accessing other clients, servers, machines, and devices
through a wired or a wireless medium, and the like. The methods,
programs or codes as described herein and elsewhere may be executed
by the client. In addition, other devices required for execution of
methods as described in this application may be considered as a
part of the infrastructure associated with the client.
The client may provide an interface to other devices including,
without limitation, servers, other clients, printers, database
servers, print servers, file servers, communication servers,
distributed servers and the like. Additionally, this coupling
and/or connection may facilitate remote execution of program across
the network. The networking of some or all of these devices may
facilitate parallel processing of a program or method at one or
more locations without deviating from the scope of the invention. In
addition, all the devices attached to the client through an
interface may include at least one storage medium capable of
storing methods, programs, applications, code and/or instructions.
A central repository may provide program instructions to be
executed on different devices. In this implementation, the remote
repository may act as a storage medium for program code,
instructions, and programs.
The methods and systems described herein may be deployed in part or
in whole through network infrastructures. The network
infrastructure may include elements such as computing devices,
servers, routers, hubs, firewalls, clients, personal computers,
communication devices, routing devices and other active and passive
devices, modules and/or components as known in the art. The
computing and/or non-computing device(s) associated with the
network infrastructure may include, apart from other components, a
storage medium such as flash memory, buffer, stack, RAM, ROM and
the like. The processes, methods, program codes, and instructions
described herein and elsewhere may be executed by one or more of
the network infrastructural elements.
The methods, program codes, and instructions described herein and
elsewhere may be implemented on a cellular network having multiple
cells. The cellular network may be either a frequency division
multiple access (FDMA) network or a code division multiple access
(CDMA) network. The cellular network may include mobile devices,
cell sites, base stations, repeaters, antennas, towers, and the
like.
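To make the FDMA/CDMA distinction concrete, the toy Python example
below (not drawn from this disclosure) spreads two users' bits with
orthogonal Walsh codes so that both may share a single channel, and
recovers each user's bits by correlation; under FDMA the same users
would instead be separated by carrier frequency.

    # Toy illustration of CDMA channel sharing; codes and bit
    # patterns are invented for the example.
    CODE_A = [+1, +1, +1, +1]  # orthogonal Walsh codes of length 4
    CODE_B = [+1, -1, +1, -1]

    def spread(bits, code):
        """Map each bit to +/-1 and multiply it by the chip sequence."""
        return [(1 if b else -1) * c for b in bits for c in code]

    def despread(signal, code):
        """Correlate the combined signal with one user's code per symbol."""
        n = len(code)
        bits = []
        for i in range(0, len(signal), n):
            corr = sum(s * c for s, c in zip(signal[i:i + n], code))
            bits.append(corr > 0)
        return bits

    bits_a, bits_b = [True, False, True], [False, False, True]
    # Both users transmit at once; their chip streams add on the channel.
    channel = [a + b for a, b in
               zip(spread(bits_a, CODE_A), spread(bits_b, CODE_B))]
    assert despread(channel, CODE_A) == bits_a
    assert despread(channel, CODE_B) == bits_b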
The methods, program codes, and instructions described herein and
elsewhere may be implemented on or through mobile devices. The
mobile devices may include navigation devices, cell phones, mobile
phones, mobile personal digital assistants, laptops, palmtops,
netbooks, pagers, electronic book readers, music players and the
like. These devices may include, apart from other components, a
storage medium such as a flash memory, buffer, RAM, ROM and one or
more computing devices. The computing devices associated with
mobile devices may be enabled to execute program codes, methods,
and instructions stored thereon. Alternatively, the mobile devices
may be configured to execute instructions in collaboration with
other devices. The mobile devices may communicate with base
stations interfaced with servers and configured to execute program
codes. The mobile devices may communicate on a peer-to-peer
network, mesh network, or other communications network. The program
code may be stored on the storage medium associated with the server
and executed by a computing device embedded within the server. The
base station may include a computing device and a storage medium.
The storage medium may store program codes and instructions
executed by the computing devices associated with the base
station.
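A minimal sketch of such delegated execution, assuming a
hypothetical base-station server reachable over TCP and an invented
newline-terminated request format, might look as follows:

    # Sketch only: the host, port, and wire format are assumptions,
    # not part of this disclosure.
    import socket

    BASE_STATION = ("base-station.example.net", 9000)  # hypothetical

    def run_remote(program_name):
        """Ask the base station to execute a stored program; return output."""
        with socket.create_connection(BASE_STATION, timeout=5) as sock:
            sock.sendall(program_name.encode() + b"\n")
            chunks = []
            while True:
                chunk = sock.recv(4096)
                if not chunk:  # server closes the connection when done
                    break
                chunks.append(chunk)
        return b"".join(chunks)

    # A device low on battery might delegate rather than run locally:
    # output = run_remote("normalize-usage-data")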
The computer software, program codes, and/or instructions may be
stored and/or accessed on machine readable transitory and/or
non-transitory media that may include: computer components,
devices, and recording media that retain digital data used for
computing for some interval of time; semiconductor storage known as
random access memory (RAM); mass storage typically for more
permanent storage, such as optical discs, forms of magnetic storage
like hard disks, tapes, drums, cards and other types; processor
registers, cache memory, volatile memory, non-volatile memory;
optical storage such as CD, DVD; removable media such as flash
memory (e.g. USB sticks or keys), floppy disks, magnetic tape,
paper tape, punch cards, standalone RAM disks, Zip drives,
removable mass storage, off-line storage, and the like; other computer
memory such as dynamic memory, static memory, read/write storage,
mutable storage, read only, random access, sequential access,
location addressable, file addressable, content addressable,
network attached storage, storage area network, bar codes, magnetic
ink, and the like.
The methods and systems described herein may transform physical
and/or intangible items from one state to another. The methods
and systems described herein may also transform data representing
physical and/or intangible items from one state to another, such as
from usage data to a normalized usage dataset.
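As one concrete, illustrative reading of such a transformation (the
record fields and the min-max scaling choice are assumptions, not
prescribed by this disclosure), the following Python fragment
rescales raw usage records into a normalized [0, 1] dataset:

    # Hypothetical example: transform raw usage data into a
    # normalized usage dataset via min-max scaling.
    def normalize_usage(records):
        """Rescale each record's 'minutes' value into the range [0, 1]."""
        values = [r["minutes"] for r in records]
        lo, hi = min(values), max(values)
        span = (hi - lo) or 1  # avoid dividing by zero if all values match
        return [{**r, "minutes": (r["minutes"] - lo) / span}
                for r in records]

    raw = [{"user": "a", "minutes": 30}, {"user": "b", "minutes": 90},
           {"user": "c", "minutes": 60}]
    print(normalize_usage(raw))
    # [{'user': 'a', 'minutes': 0.0}, {'user': 'b', 'minutes': 1.0},
    #  {'user': 'c', 'minutes': 0.5}]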
The elements described and depicted herein, including in flow
charts and block diagrams throughout the figures, imply logical
boundaries between the elements. However, according to software or
hardware engineering practices, the depicted elements and the
functions thereof may be implemented on machines having a
processor capable of executing program instructions stored on
computer readable transitory and/or non-transitory media, as a
monolithic software structure, as standalone software modules,
or as modules that employ external routines, code, services, and so
forth, or any combination of these, and all such implementations
may be within the scope of the present disclosure. Examples of such
machines may include, but are not limited to, personal digital
assistants, laptops, personal computers, mobile phones, other
handheld computing devices, medical equipment, wired or wireless
communication devices, transducers, chips, calculators, satellites,
tablet PCs, electronic books, gadgets, electronic devices, devices
having artificial intelligence, computing devices, networking
equipment, servers, routers and the like. Furthermore, the elements
depicted in the flow charts and block diagrams or any other logical
component may be implemented on a machine capable of executing
program instructions. Thus, while the foregoing drawings and
descriptions set forth functional aspects of the disclosed systems,
no particular arrangement of software for implementing these
functional aspects should be inferred from these descriptions
unless explicitly stated or otherwise clear from the context.
Similarly, it will be appreciated that the various steps identified
and described above may be varied, and that the order of steps may
be adapted to particular applications of the techniques disclosed
herein. All such variations and modifications are intended to fall
within the scope of this disclosure. As such, the depiction and/or
description of an order for various steps should not be understood
to require a particular order of execution for those steps, unless
required by a particular application, or explicitly stated or
otherwise clear from the context.
The methods and/or processes described above, and steps thereof,
may be realized in hardware, software or any combination of
hardware and software suitable for a particular application. The
hardware may include a dedicated computing device or specific
computing device or particular aspect or component of a specific
computing device. The processes may be realized in one or more
microprocessors, microcontrollers, embedded microcontrollers,
programmable digital signal processors or other programmable
device, along with internal and/or external memory. The processes
may also, or instead, be embodied in an application specific
integrated circuit, a programmable gate array, programmable array
logic, or any other device or combination of devices that may be
configured to process electronic signals. It will further be
appreciated that one or more of the processes may be realized as
computer executable code stored on a machine readable medium.
The computer executable code may be created using a structured
programming language such as C, an object-oriented programming
language such as C++, or any other high-level or low-level
programming language (including assembly languages, hardware
description languages, and database programming languages and
technologies) that may be stored, compiled or interpreted to run on
one of the above devices, as well as heterogeneous combinations of
processors, processor architectures, or combinations of different
hardware and software, or any other machine capable of executing
program instructions.
Thus, in one aspect, each method described above and combinations
thereof may be embodied in computer executable code that, when
executing on one or more computing devices, performs the steps
thereof. In another aspect, the methods may be embodied in systems
that perform the steps thereof, and may be distributed across
devices in a number of ways, or all of the functionality may be
integrated into a dedicated, standalone device or other hardware.
In another aspect, the means for performing the steps associated
with the processes described above may include any of the hardware
and/or software described above. All such permutations and
combinations are intended to fall within the scope of the present
disclosure.
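The following Python sketch illustrates that equivalence with
invented placeholder steps: the same method executes either
integrated in a single process or distributed across worker
processes standing in for separate devices.

    # The step functions are placeholders invented for illustration;
    # they stand in for whatever method is being embodied.
    from concurrent.futures import ProcessPoolExecutor

    def acquire(x):    return x + 1  # placeholder step 1
    def transform(x):  return x * 2  # placeholder step 2

    def run_integrated(data):
        """All steps execute on a single device, in one process."""
        return [transform(acquire(x)) for x in data]

    def run_distributed(data):
        """The same steps farmed out to separate worker processes."""
        with ProcessPoolExecutor() as pool:
            staged = list(pool.map(acquire, data))
            return list(pool.map(transform, staged))

    if __name__ == "__main__":
        assert run_integrated([1, 2, 3]) == run_distributed([1, 2, 3]) == [4, 6, 8]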
While the invention has been disclosed in connection with the
preferred embodiments shown and described in detail, various
modifications and improvements thereon will become readily apparent
to those skilled in the art. Accordingly, the spirit and scope of
the present invention is not to be limited by the foregoing
examples, but is to be understood in the broadest sense allowable
by law.
* * * * *