U.S. patent application number 17/737,931 was filed with the patent office on 2022-05-05 and published on 2022-08-25 as publication number 20220266385 for microlens arrays for parallel micropatterning. The applicant listed for this application is Facebook Technologies, LLC. The invention is credited to Daniel BRODOCEANU and Oscar TORRENTS ABAD.

United States Patent Application 20220266385
Kind Code: A1
BRODOCEANU; Daniel; et al.
August 25, 2022
MICROLENS ARRAYS FOR PARALLEL MICROPATTERNING
Abstract
Disclosed herein are systems and methods for using microlens
arrays for parallel micropatterning of features. In some
embodiments, a system includes a laser that emits a laser beam, a
beam homogenizer configured to shape the laser beam into a shaped
laser beam having a beam profile, and a lenslet array. The beam
homogenizer shapes the laser beam such that at least a portion of
the beam profile is substantially uniform in power. The lenslets of
the lenslet array have the same shape and each receive a respective
portion of the shaped laser beam to output a plurality of laser
sub-beams. The plurality of laser sub-beams can be directed toward
one or more layers of material to generate or modify a plurality of
features on the one or more layers in parallel.
Inventors: BRODOCEANU; Daniel (Cork, IE); TORRENTS ABAD; Oscar (Blarney, IE)
Applicant: Facebook Technologies, LLC (Menlo Park, CA, US)
Appl. No.: 17/737,931
Filed: May 5, 2022

Related U.S. Patent Documents: This application, No. 17/737,931, is a continuation of application No. 16/376,156, filed Apr. 5, 2019, now Pat. No. 11,344,971.

International Class: B23K 26/06 (20060101); G02B 27/09 (20060101); G02B 3/00 (20060101)
Claims
1. A method comprising: emitting a laser beam; passing the laser
beam through a beam homogenizer to form a shaped laser beam having
a beam profile, wherein at least a portion of the beam profile is
substantially uniform in power; providing the shaped laser beam to
a lenslet array that includes a plurality of lenslets, wherein each
lenslet of the plurality of lenslets has a same shape and receives
a respective portion of the shaped laser beam, the respective
portion of the shaped laser beam corresponding to the at least a
portion of the beam profile that is substantially uniform in power;
generating a plurality of laser sub-beams using the lenslet array,
each laser sub-beam of the plurality of laser sub-beams being
generated by a corresponding lenslet based on the respective
portion of the shaped laser beam received by the lenslet; and
directing the plurality of laser sub-beams toward one or more
layers of material to generate or modify a plurality of features on
the one or more layers in parallel.
2. The method of claim 1, wherein directing the plurality of laser
sub-beams toward the one or more layers to generate or modify the
plurality of features on the one or more layers in parallel
comprises: moving at least one of the lenslet array or the one or
more layers such that a focal point of each lenslet of the
plurality of lenslets is scanned in a scanning pattern.
3. The method of claim 2, wherein the scanning pattern comprises a
plurality of concentric circles.
4. The method of claim 2, wherein the scanning pattern comprises a
spiral.
5. The method of claim 2, wherein the scanning pattern causes the
plurality of laser sub-beams to shape the plurality of features
into microlenses.
6. The method of claim 1, wherein the plurality of features is
formed by two-photon polymerization of a material in the one or
more layers.
7. The method of claim 1, wherein: the one or more layers includes
a first layer and a second layer, and each laser sub-beam generates
or modifies a corresponding hole to form a plurality of holes, the
plurality of holes extending through the first layer and exposing
the second layer.
8. The method of claim 7, wherein: the first layer corresponds to a
wafer, the second layer is a polymer layer in which a plurality of
mesas is embedded, the method further comprises injecting a plasma
species through the plurality of holes to etch the polymer layer,
and the injecting of the plasma species is performed as part of
forming an array of light-emitting diodes from the plurality of
mesas.
9. The method of claim 7, wherein: the first layer corresponds to a
silicon substrate, the second layer is a metal layer, and the
method further comprises performing metal-assisted chemical etching
to etch portions of the silicon substrate that are in contact with
the metal layer while leaving portions of the silicon substrate
that were exposed by the plurality of holes unetched.
10. The method of claim 1, wherein the laser beam is a pulsed laser
beam, and wherein the beam profile of the shaped laser beam is a
flat-top profile.
11. A system comprising: a laser configured to emit a laser beam; a
beam homogenizer configured to shape the laser beam into a shaped
laser beam having a beam profile, wherein at least a portion of the
beam profile is substantially uniform in power; and a lenslet array
comprising a plurality of lenslets, wherein: each lenslet of the
plurality of lenslets has a same shape and is configured to receive
a respective portion of the shaped laser beam, the respective
portion of the shaped laser beam corresponding to the at least a
portion of the beam profile that is substantially uniform in power,
and each lenslet of the plurality of lenslets is configured to
generate a corresponding laser sub-beam based on the respective
portion of the shaped laser beam received by the lenslet, such that
the lenslet array outputs a plurality of laser sub-beams in
parallel.
12. The system of claim 11, wherein the system is configured to
scan a focal point of each lenslet of the plurality of lenslets
according to a scanning pattern, based on movement of the lenslet
array relative to one or more layers of material toward which the
plurality of laser sub-beams is directed.
13. The system of claim 11, wherein the laser beam is a pulsed
laser beam, and wherein the beam profile of the shaped laser beam
is a flat-top profile.
14. The system of claim 11, further comprising: a beam confiner
arranged between the laser and the beam homogenizer.
15. The system of claim 14, wherein the beam confiner is configured
to collimate the laser beam.
16. The system of claim 14, wherein the beam confiner is a
cylindrical tube.
17. The system of claim 16, wherein the beam confiner corresponds
to a microscope objective without optical elements.
18. The system of claim 11, wherein the lenslets are arranged in a
hexagonal pattern.
19. The system of claim 11, wherein the lenslet array is positioned
on an exit window of the beam homogenizer.
20. The system of claim 19, wherein the system is configured to
move the beam homogenizer and the lenslet array together, along
each axis of a three-axis coordinate system and while the laser
beam is being emitted by the laser.
Description
CROSS-REFERENCE TO RELATED APPLICATIONS
[0001] This application is a continuation of U.S. Non-Provisional
Application No. 16/376,156, filed Apr. 5, 2019, titled "MICROLENS
ARRAYS FOR PARALLEL MICROPATTERNING," which is herein incorporated
by reference in its entirety for all purposes.
BACKGROUND
[0002] Many applications use laser ablation, laser-assisted
etching, or two-photon polymerization to pattern small features.
Typically a single laser beam is focused by a microscope objective
to perform micropatterning. While this method provides flexibility
for forming individual features, it is inefficient for forming
arrays of multiple features. For example, it is very time-consuming
to scan a single laser beam to form the thousands or millions of
light-emitting diodes (LEDs) that may be required for a
display.
SUMMARY
[0003] The present disclosure generally relates to using microlens
arrays for parallel micropatterning of features. In some
embodiments, a method includes emitting a laser beam, providing the
laser beam to a lenslet array including a plurality of lenslets,
and generating, from the laser beam, a plurality of laser sub-beams
using the lenslet array. Each one of the plurality of laser
sub-beams is generated by a corresponding one of the plurality of
lenslets. Each lenslet of the plurality of lenslets has the same
shape.
[0004] The method may also include directing the plurality of laser
sub-beams toward a substrate to generate or modify a plurality of
features on the substrate in parallel, wherein each one of the
plurality of laser sub-beams generates or modifies a corresponding
one of the plurality of features on the substrate. Directing the
plurality of laser sub-beams toward the substrate to generate or
modify the plurality of features on the substrate in parallel may
include moving the lenslet array and/or the substrate such that a
focal point of each lenslet of the plurality of lenslets is scanned
in a scanning pattern on a material formed on the substrate. The
scanning pattern may include a plurality of concentric circles.
Alternatively or in addition, the scanning pattern may include a
spiral. The plurality of features may be formed by two-photon
polymerization of the material formed on the substrate.
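As a concrete illustration of how such a scanning pattern might be generated, the short sketch below computes scan coordinates for a concentric-circle pattern and an Archimedean spiral. It is only a sketch: the function names, radius, ring count, and sampling densities are illustrative assumptions and are not specified in this disclosure.

    import numpy as np

    def concentric_circles(radius_um, n_rings, pts_per_um=2.0):
        """Return (x, y) points tracing n_rings concentric circles out to radius_um."""
        coords = []
        for r in np.linspace(radius_um / n_rings, radius_um, n_rings):
            # Sample each ring so neighboring points are ~1/pts_per_um apart.
            n_pts = max(8, int(2 * np.pi * r * pts_per_um))
            theta = np.linspace(0.0, 2 * np.pi, n_pts, endpoint=False)
            coords.append(np.column_stack((r * np.cos(theta), r * np.sin(theta))))
        return np.vstack(coords)

    def archimedean_spiral(radius_um, n_turns, pts_per_turn=64):
        """Return (x, y) points of a spiral whose radius grows linearly to radius_um."""
        theta = np.linspace(0.0, 2 * np.pi * n_turns, n_turns * pts_per_turn)
        r = radius_um * theta / theta[-1]
        return np.column_stack((r * np.cos(theta), r * np.sin(theta)))

    # Example: a 5-micrometer feature site scanned with 10 rings or a 10-turn spiral.
    rings = concentric_circles(radius_um=5.0, n_rings=10)
    spiral = archimedean_spiral(radius_um=5.0, n_turns=10)
    print(rings.shape, spiral.shape)

Because every lenslet in the array has the same shape and traces the same path relative to its own optical axis, a single relative motion between the lenslet array and the substrate steps all focal points through the pattern in parallel.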
[0005] Alternatively or in addition, the method may include
directing the plurality of laser sub-beams toward a wafer to
generate or modify a plurality of holes through the wafer in
parallel, wherein each one of the plurality of laser sub-beams
generates or modifies a corresponding one of the plurality of holes
through the wafer. Directing the plurality of laser sub-beams
toward the wafer to generate or modify the plurality of holes
through the wafer in parallel may include positioning the lenslet
array and/or the wafer such that a focal point of each lenslet of
the plurality of lenslets is formed on the wafer. A cross-section
of each of the holes may have a round shape. Alternatively, a
cross-section of each of the holes may have a parallelogram shape.
Alternatively or in addition, a pitch of the holes may be larger
than a pitch of a plurality of mesa shapes that are formed adjacent
to a surface of the wafer.
[0006] Alternatively or in addition, the method may include
directing the plurality of laser sub-beams toward a silicon
substrate to generate or modify a plurality of holes in parallel
through a metal layer that is formed on a surface of the silicon
substrate, wherein each one of the plurality of laser sub-beams
generates or modifies a corresponding one of the plurality of holes
through the metal layer. Directing the plurality of laser sub-beams
toward the silicon substrate to generate or modify the plurality of
holes in parallel through the metal layer may include positioning
the lenslet array and/or the silicon substrate such that a focal
point of each lenslet of the plurality of lenslets is formed on the
metal layer. The method may also include creating an array of
silicon nanowires by using a remaining portion of the metal layer
to etch a portion of the silicon substrate that is aligned with the
remaining portion of the metal layer.
[0007] Alternatively or in addition, the method may include
directing the plurality of laser sub-beams toward an array of
light-emitting diodes to generate or modify a plurality of optical
elements in parallel in an elastomeric material or a photoresist
that is formed on a surface of the array of light-emitting diodes,
wherein each one of the plurality of laser sub-beams generates or
modifies a corresponding one of the plurality of optical elements
in the elastomeric material or the photoresist. Directing the
plurality of laser sub-beams toward the array of light-emitting
diodes to generate or modify the plurality of optical elements in
parallel in the elastomeric material or the photoresist may include
positioning the lenslet array and/or the array of light-emitting
diodes such that a focal point of each lenslet of the plurality of
lenslets is formed on the elastomeric material or the photoresist
that is formed on the surface of the array of light-emitting
diodes. The laser beam may be a pulsed laser beam, and the laser
beam may be shaped to have a flat-top profile.
[0008] In some embodiments, a system includes a laser that is
configured to emit a laser beam, and a lenslet array including a
plurality of lenslets. The lenslet array is configured to receive
the laser beam, each lenslet of the plurality of lenslets has the
same shape, and the lenslet array is configured to generate a
plurality of laser sub-beams from the laser beam, each one of the
plurality of lenslets being configured to generate a corresponding
one of the plurality of laser sub-beams. The system may also
include a beam homogenizer that is configured to shape the laser
beam to have a flat-top profile, and a beam confiner that is
arranged between the laser and the beam homogenizer. The beam
confiner may be configured to collimate the laser beam.
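To make the parallelization concrete, the following back-of-the-envelope sketch divides a homogenized flat-top beam among the lenslets of an array and estimates the focal spot each lenslet produces. Every number below (pulse energy, beam size, lenslet pitch, focal length, wavelength) is an illustrative assumption rather than a value from this disclosure, and the spot size uses the generic diffraction-limited estimate of roughly 4*lambda*f/(pi*D).

    import math

    # Illustrative, assumed parameters (not from the disclosure).
    pulse_energy_mj = 1.0      # total pulse energy reaching the lenslet array
    beam_side_mm = 10.0        # side length of the square flat-top region
    lenslet_pitch_um = 100.0   # center-to-center lenslet spacing (aperture D)
    lenslet_focal_mm = 1.0     # focal length f of each lenslet
    wavelength_nm = 355.0      # e.g., a UV pulsed laser

    n_lenslets = int((beam_side_mm * 1000 / lenslet_pitch_um) ** 2)
    energy_per_lenslet_uj = pulse_energy_mj * 1000 / n_lenslets

    # Diffraction-limited focal spot diameter, d ~ 4 * lambda * f / (pi * D).
    spot_um = (4 * (wavelength_nm * 1e-3) * (lenslet_focal_mm * 1000)
               / (math.pi * lenslet_pitch_um))

    print(f"{n_lenslets} lenslets, {energy_per_lenslet_uj:.3f} uJ per sub-beam, "
          f"~{spot_um:.2f} um focal spot")

Because the homogenized beam profile is substantially uniform in power across the lenslet array, each sub-beam carries roughly the same energy, so the features patterned in parallel see nominally identical exposure conditions.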
[0009] This summary is neither intended to identify key or
essential features of the claimed subject matter, nor is it
intended to be used in isolation to determine the scope of the
claimed subject matter. The subject matter should be understood by
reference to appropriate portions of the entire specification of
this disclosure, any or all drawings, and each claim. The
foregoing, together with other features and examples, will be
described in more detail below in the following specification,
claims, and accompanying drawings.
BRIEF DESCRIPTION OF THE DRAWINGS
[0010] Illustrative embodiments are described in detail below with
reference to the following figures:
[0011] FIG. 1 is a simplified block diagram of an example
artificial reality system environment including a near-eye display,
according to certain embodiments;
[0012] FIG. 2 is a perspective view of a simplified example
near-eye display including various sensors;
[0013] FIG. 3 is a perspective view of an example near-eye display
in the form of a head-mounted display (HMD) device for implementing
some of the examples disclosed herein;
[0014] FIG. 4 is a simplified block diagram of an example
electronic system of an example near-eye display for implementing
some of the examples disclosed herein;
[0015] FIG. 5A is a simplified block diagram of a related art
system for performing micropatterning;
[0016] FIG. 5B is a simplified block diagram of a system for
performing micropatterning according to certain embodiments of the
invention;
[0017] FIG. 6 is an example of a structure that can be created by
the system for performing micropatterning according to certain
embodiments of the invention;
[0018] FIGS. 7A and 7B are additional examples of structures that
can be created by the system for performing micropatterning
according to certain embodiments of the invention;
[0019] FIGS. 8A and 8B are additional examples of structures that
can be created by the system for performing micropatterning
according to certain embodiments of the invention; and
[0020] FIG. 9 is a cross-sectional view of an example of an LED
having various components that can be created by the system for
performing micropatterning according to certain embodiments of the
invention.
DETAILED DESCRIPTION
[0021] In the following description, for the purposes of
explanation, specific details are set forth in order to provide a
thorough understanding of examples of the disclosure. However, it
will be apparent that various examples may be practiced without
these specific details. For example, devices, systems, structures,
assemblies, methods, and other components may be shown as
components in block diagram form in order not to obscure the
examples in unnecessary detail. In other instances, well-known
devices, processes, systems, structures, and techniques may be
shown without necessary detail in order to avoid obscuring the
examples. The figures and description are not intended to be
restrictive. The terms and expressions that have been employed in
this disclosure are used as terms of description and not of
limitation, and there is no intention in the use of such terms and
expressions of excluding any equivalents of the features shown and
described or portions thereof.
[0022] An artificial reality system, such as a virtual reality
(VR), augmented reality (AR), or mixed reality (MR) system, may
include a near-eye display (e.g., a headset or a pair of glasses)
configured to present content to a user via an electronic or optic
display and, in some cases, may also include a console configured
to generate content for presentation to the user and to provide the
generated content to the near-eye display for presentation. To
improve user interaction with presented content, the console may
modify or generate content based on a location where the user is
looking, which may be determined by tracking the user's eye.
Tracking the eye may include tracking the position and/or shape of
the pupil of the eye, and/or the rotational position (gaze
direction) of the eye. To track the eye, the near-eye display may
illuminate a surface of the user's eye using light sources mounted
to or within the near-eye display, according to at least one
embodiment. An imaging device (e.g., a camera) included in the
vicinity of the near-eye display may then capture light reflected
by various surfaces of the user's eye. Light that is reflected
specularly off the cornea of the user's eye may result in "glints"
in the captured image. One way to illuminate the eye to see the
pupil as well as the glints is to use a two-dimensional (2D) array
of light-emitting diodes (LEDs). Techniques such as a centroiding
algorithm may be used to accurately determine the locations of the
glints on the eye in the captured image, and the rotational
position (e.g., the gaze direction) of the eye may then be
determined based on the locations of the glints relative to a known
feature of the eye (e.g., the center of the pupil) within the
captured image.
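As a simplified illustration of the centroiding step mentioned above, the sketch below thresholds a grayscale eye image and returns the intensity-weighted centroid of each bright glint blob. The threshold value, the use of scipy.ndimage for labeling, and the synthetic test image are assumptions made for the example; none of these specifics come from this disclosure.

    import numpy as np
    from scipy import ndimage

    def glint_centroids(image, threshold=0.9):
        """Return intensity-weighted centroids (row, col) of bright glint blobs.

        image: 2-D float array normalized to [0, 1]; pixels at or above
        threshold are treated as specular reflections (glints).
        """
        mask = image >= threshold
        labels, n_blobs = ndimage.label(mask)
        # center_of_mass weights each labeled blob by its pixel intensities.
        return ndimage.center_of_mass(image, labels, index=range(1, n_blobs + 1))

    # Tiny synthetic example: two bright spots on a dark background.
    img = np.zeros((64, 64))
    img[10:13, 20:23] = 1.0
    img[40:42, 50:53] = 1.0
    print(glint_centroids(img))   # approximately [(11.0, 21.0), (40.5, 51.0)]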
[0023] FIG. 1 is a simplified block diagram of an example
artificial reality system environment 100 including a near-eye
display 120, in accordance with certain embodiments. Artificial
reality system environment 100 shown in FIG. 1 may include a
near-eye display 120, an external imaging device 150, and an
input/output interface 140 that are each coupled to a console
110.
[0024] While FIG. 1 shows example artificial reality system
environment 100 including one near-eye display 120, one external
imaging device 150, and one input/output interface 140, any number
of these components may be included in artificial reality system
environment 100, or any of the components may be omitted. For
example, there may be multiple near-eye displays 120 monitored by
one or more external imaging devices 150 in communication with
console 110. In alternative configurations, different or additional
components may be included in artificial reality system environment
100.
[0025] Near-eye display 120 may be a head-mounted display that
presents content to a user. Examples of content presented by
near-eye display 120 include one or more of images, videos, audios,
or some combination thereof. In some embodiments, audio may be
presented via an external device (e.g., speakers and/or headphones)
that receives audio information from near-eye display 120, console
110, or both, and presents audio data based on the audio
information. Near-eye display 120 may include one or more rigid
bodies, which may be rigidly or non-rigidly coupled to each other.
A rigid coupling between rigid bodies may cause the coupled rigid
bodies to act as a single rigid entity. A non-rigid coupling
between rigid bodies may allow the rigid bodies to move relative to
each other. In various embodiments, near-eye display 120 may be
implemented in any suitable form factor, including a pair of
glasses. Additionally, in various embodiments, the functionality
described herein may be used in a headset that combines images of
an environment external to near-eye display 120 and content
received from console 110, or from any other console generating and
providing content to a user. Therefore, near-eye display 120, and
methods for eye tracking described herein, may augment images of a
physical, real-world environment external to near-eye display 120
with generated content (e.g., images, video, sound, etc.) to
present an augmented reality to a user.
[0026] In various embodiments, near-eye display 120 may include one
or more of display electronics 122, display optics 124, one or more
locators 126, one or more position sensors 128, an eye-tracking
unit 130, and an inertial measurement unit (IMU) 132. Near-eye
display 120 may omit any of these elements or include additional
elements in various embodiments. Additionally, in some embodiments,
near-eye display 120 may include elements combining the function of
various elements described in conjunction with FIG. 1.
[0027] Display electronics 122 may display images to the user
according to data received from console 110. In various
embodiments, display electronics 122 may include one or more
display panels, such as a liquid crystal display (LCD), an organic
light emitting diode (OLED) display, a micro-LED display, an
active-matrix OLED display (AMOLED), a transparent OLED display
(TOLED), or some other display. For example, in one implementation
of near-eye display 120, display electronics 122 may include a
front TOLED panel, a rear display panel, and an optical component
(e.g., an attenuator, polarizer, or diffractive or spectral film)
between the front and rear display panels. Display electronics 122
may include sub-pixels to emit light of a predominant color such as
red, green, blue, white, or yellow. In some implementations,
display electronics 122 may display a 3D image through stereo
effects produced by two-dimensional panels to create a subjective
perception of image depth. For example, display electronics 122 may
include a left display and a right display positioned in front of a
user's left eye and right eye, respectively. The left and right
displays may present copies of an image shifted horizontally
relative to each other to create a stereoscopic effect (i.e., a
perception of image depth by a user viewing the image).
[0028] In certain embodiments, display optics 124 may display image
content optically (e.g., using optical waveguides and couplers), or
magnify image light received from display electronics 122, correct
optical errors associated with the image light, and present the
corrected image light to a user of near-eye display 120. In various
embodiments, display optics 124 may include one or more optical
elements. Example optical elements may include a substrate, optical
waveguides, an aperture, a Fresnel lens, a convex lens, a concave
lens, a filter, or any other suitable optical element that may
affect image light emitted from display electronics 122. Display
optics 124 may include a combination of different optical elements
as well as mechanical couplings to maintain relative spacing and
orientation of the optical elements in the combination. One or more
optical elements in display optics 124 may have an optical coating,
such as an anti-reflective coating, a reflective coating, a
filtering coating, or a combination of different optical
coatings.
[0029] Magnification of the image light by display optics 124 may
allow display electronics 122 to be physically smaller, weigh less,
and consume less power than larger displays. Additionally,
magnification may increase a field of view of the displayed
content. In some embodiments, display optics 124 may have an
effective focal length larger than the spacing between display
optics 124 and display electronics 122 to magnify image light
projected by display electronics 122. The amount of magnification
of image light by display optics 124 may be adjusted by adding or
removing optical elements from display optics 124.
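A worked example of this magnification effect, using the standard thin-lens relation with purely illustrative numbers (neither the focal length nor the spacing below comes from this disclosure), shows how placing the display panel just inside the effective focal length yields an enlarged virtual image:

    # Thin-lens illustration of magnification by display optics.
    focal_length_mm = 40.0   # assumed effective focal length of display optics
    spacing_mm = 35.0        # assumed spacing to display electronics (< f)

    # Gaussian thin-lens equation: 1/f = 1/d_o + 1/d_i
    image_dist_mm = 1.0 / (1.0 / focal_length_mm - 1.0 / spacing_mm)
    magnification = -image_dist_mm / spacing_mm

    print(image_dist_mm, magnification)   # -280.0 mm (virtual image), 8.0x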
[0030] Display optics 124 may be designed to correct one or more
types of optical errors, such as two-dimensional optical errors,
three-dimensional optical errors, or a combination thereof.
Two-dimensional errors may include optical aberrations that occur
in two dimensions. Example types of two-dimensional errors may
include barrel distortion, pincushion distortion, longitudinal
chromatic aberration, and transverse chromatic aberration.
Three-dimensional errors may include optical errors that occur in
three dimensions. Example types of three-dimensional errors may
include spherical aberration, comatic aberration, field curvature,
and astigmatism. In some embodiments, content provided to display
electronics 122 for display may be pre-distorted, and display
optics 124 may correct the distortion when it receives image light
from display electronics 122 generated based on the pre-distorted
content.
[0031] Locators 126 may be objects located in specific positions on
near-eye display 120 relative to one another and relative to a
reference point on near-eye display 120. Console 110 may identify
locators 126 in images captured by external imaging device 150 to
determine the artificial reality headset's position, orientation,
or both. A locator 126 may be a light emitting diode (LED), a
corner cube reflector, a reflective marker, a type of light source
that contrasts with an environment in which near-eye display 120
operates, or some combinations thereof. In embodiments where
locators 126 are active components (e.g., LEDs or other types of
light emitting devices), locators 126 may emit light in the visible
band (e.g., about 380 nm to 750 nm), in the infrared (IR) band
(e.g., about 750 nm to 1 mm), in the ultraviolet band (e.g., about
10 nm to about 380 nm), in another portion of the electromagnetic
spectrum, or in any combination of portions of the electromagnetic
spectrum.
[0032] In some embodiments, locators 126 may be located beneath an
outer surface of near-eye display 120. A portion of near-eye
display 120 between a locator 126 and an entity external to
near-eye display 120 (e.g., external imaging device 150, a user
viewing the outer surface of near-eye display 120) may be
transparent to the wavelengths of light emitted or reflected by
locators 126, or thin enough not to substantially attenuate the
light emitted or reflected by locators 126. In some embodiments,
the outer surface or other portions of near-eye display 120 may be
opaque in the visible band but transparent in the IR band, and
locators 126 may be under the outer surface and may emit light in
the IR band.
[0033] External imaging device 150 may generate slow calibration
data based on calibration parameters received from console 110.
Slow calibration data may include one or more images showing
observed positions of locators 126 that are detectable by external
imaging device 150. External imaging device 150 may include one or
more cameras, one or more video cameras, any other device capable
of capturing images including one or more of locators 126, or some
combinations thereof. Additionally, external imaging device 150 may
include one or more filters (e.g., to increase signal to noise
ratio). External imaging device 150 may be configured to detect
light emitted or reflected from locators 126 in a field of view of
external imaging device 150. In embodiments where locators 126
include passive elements (e.g., retroreflectors), external imaging
device 150 may include a light source that illuminates some or all
of locators 126, which may retro-reflect the light to the light
source in external imaging device 150. Slow calibration data may be
communicated from external imaging device 150 to console 110, and
external imaging device 150 may receive one or more calibration
parameters from console 110 to adjust one or more imaging
parameters (e.g., focal length, focus, frame rate, sensor
temperature, shutter speed, aperture, etc.).
[0034] Position sensors 128 may generate one or more measurement
signals in response to motion of near-eye display 120. Examples of
position sensors 128 may include accelerometers, gyroscopes,
magnetometers, other motion-detecting or error-correcting sensors,
or some combinations thereof. For example, in some embodiments,
position sensors 128 may include multiple accelerometers to measure
translational motion (e.g., forward/back, up/down, or left/right)
and multiple gyroscopes to measure rotational motion (e.g., pitch,
yaw, or roll). In some embodiments, various position sensors may be
oriented orthogonally to each other.
[0035] IMU 132 may be an electronic device that generates fast
calibration data based on measurement signals received from one or
more of position sensors 128. Position sensors 128 may be located
external to IMU 132, internal to IMU 132, or some combination
thereof. Based on the one or more measurement signals from one or
more position sensors 128, IMU 132 may generate fast calibration
data indicating an estimated position of near-eye display 120
relative to an initial position of near-eye display 120. For
example, IMU 132 may integrate measurement signals received from
accelerometers over time to estimate a velocity vector and
integrate the velocity vector over time to determine an estimated
position of a reference point on near-eye display 120.
Alternatively, IMU 132 may provide the sampled measurement signals
to console 110, which may determine the fast calibration data.
While the reference point may generally be defined as a point in
space, in various embodiments, the reference point may also be
defined as a point within near-eye display 120 (e.g., a center of
IMU 132).
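The double integration described above can be sketched numerically as follows. The sampling rate, the constant-acceleration test input, and the simple rectangle-rule integration are illustrative assumptions; a real IMU pipeline would also remove gravity and correct for sensor bias and drift.

    import numpy as np

    def integrate_imu(accel_samples, dt):
        """Estimate velocity and position by double-integrating acceleration.

        accel_samples: (N, 3) array of acceleration in m/s^2 in a fixed frame.
        dt: sampling interval in seconds.
        Returns (velocity, position), each (N, 3), starting from rest at the
        initial position of the device.
        """
        velocity = np.cumsum(accel_samples, axis=0) * dt
        position = np.cumsum(velocity, axis=0) * dt
        return velocity, position

    # Example: 1 s of constant 0.1 m/s^2 acceleration along x, sampled at 1 kHz.
    dt = 1e-3
    accel = np.tile([0.1, 0.0, 0.0], (1000, 1))
    vel, pos = integrate_imu(accel, dt)
    print(vel[-1], pos[-1])   # roughly [0.1, 0, 0] m/s and [0.05, 0, 0] m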
[0036] Eye-tracking unit 130 may include one or more imaging
devices configured to capture eye tracking data, which an
eye-tracking module 118 in console 110 may use to track the user's
eye. Eye tracking data may refer to data output by eye-tracking
unit 130. Example eye tracking data may include images captured by
eye-tracking unit 130 or information derived from the images
captured by eye-tracking unit 130. Eye tracking may refer to
determining an eye's position, including orientation and location
of the eye, relative to near-eye display 120. For example,
eye-tracking module 118 may output the eye's pitch and yaw based on
images of the eye captured by eye-tracking unit 130. In various
embodiments, eye-tracking unit 130 may measure electromagnetic
energy reflected by the eye and communicate the measured
electromagnetic energy to eye-tracking module 118, which may then
determine the eye's position based on the measured electromagnetic
energy. For example, eye-tracking unit 130 may measure
electromagnetic waves such as visible light, infrared light, radio
waves, microwaves, waves in any other part of the electromagnetic
spectrum, or a combination thereof reflected by an eye of a
user.
[0037] Eye-tracking unit 130 may include one or more eye-tracking
systems. An eye-tracking system may include an imaging system to
image one or more eyes and may optionally include a light emitter,
which may generate light that is directed to an eye such that light
reflected by the eye may be captured by the imaging system. For
example, eye-tracking unit 130 may include a coherent light source
(e.g., a VCSEL) emitting light in the visible spectrum or infrared
spectrum, and a camera capturing the light reflected by the user's
eye. As another example, eye-tracking unit 130 may capture
reflected radio waves emitted by a miniature radar unit.
Eye-tracking unit 130 may use low-power light emitters that emit
light at frequencies and intensities that would not injure the eye
or cause physical discomfort. Eye-tracking unit 130 may be arranged
to increase contrast in images of an eye captured by eye-tracking
unit 130 while reducing the overall power consumed by eye-tracking
unit 130 (e.g., reducing power consumed by a light emitter and an
imaging system included in eye-tracking unit 130). For example, in
some implementations, eye-tracking unit 130 may consume less than
100 milliwatts of power.
[0038] In some embodiments, eye-tracking unit 130 may include one
light emitter and one camera to track each of the user's eyes. In
other embodiments, eye-tracking unit 130 may include a plurality of
light emitters and one camera to track each of the user's eyes.
Eye-tracking unit 130 may also include different eye-tracking
systems that operate together to provide improved eye tracking
accuracy and responsiveness. For example, eye-tracking unit 130 may
include a fast eye-tracking system with a fast response time and a
slow eye-tracking system with a slower response time. The fast
eye-tracking system may frequently measure an eye to capture data
used by eye-tracking module 118 to determine the eye's position
relative to a reference eye position. The slow eye-tracking system
may independently measure the eye to capture data used by
eye-tracking module 118 to determine the reference eye position
without reference to a previously determined eye position. Data
captured by the slow eye-tracking system may allow eye-tracking
module 118 to determine the reference eye position with greater
accuracy than the eye's position determined from data captured by
the fast eye-tracking system. In various embodiments, the slow
eye-tracking system may provide eye-tracking data to eye-tracking
module 118 at a lower frequency than the fast eye-tracking system.
For example, the slow eye-tracking system may operate less
frequently or have a slower response time to conserve power.
[0039] Eye-tracking unit 130 may be configured to estimate the
orientation of the user's eye. The orientation of the eye may
correspond to the direction of the user's gaze within near-eye
display 120. The orientation of the user's eye may be defined as
the direction of the foveal axis, which is the axis between the
fovea (an area on the retina of the eye with the highest
concentration of photoreceptors) and the center of the eye's pupil.
In general, when a user's eyes are fixed on a point, the foveal
axes of the user's eyes intersect that point. The pupillary axis of
an eye may be defined as the axis that passes through the center of
the pupil and is perpendicular to the corneal surface. In general,
even though the pupillary axis and the foveal axis intersect at the
center of the pupil, the pupillary axis may not directly align with
the foveal axis. For example, the orientation of the foveal axis
may be offset from the pupillary axis by approximately −1° to 8°
laterally and about ±4° vertically. Because
the foveal axis is defined according to the fovea, which is located
in the back of the eye, the foveal axis may be difficult or
impossible to measure directly in some eye tracking embodiments.
Accordingly, in some embodiments, the orientation of the pupillary
axis may be detected and the foveal axis may be estimated based on
the detected pupillary axis.
[0040] In general, the movement of an eye corresponds not only to
an angular rotation of the eye, but also to a translation of the
eye, a change in the torsion of the eye, and/or a change in the
shape of the eye. Eye-tracking unit 130 may also be configured to
detect the translation of the eye, which may be a change in the
position of the eye relative to the eye socket. In some
embodiments, the translation of the eye may not be detected
directly, but may be approximated based on a mapping from a
detected angular orientation. Translation of the eye corresponding
to a change in the eye's position relative to the eye-tracking unit
may also be detected. Translation of this type may occur, for
example, due to a shift in the position of near-eye display 120 on
a user's head. Eye-tracking unit 130 may also detect the torsion of
the eye and the rotation of the eye about the pupillary axis.
Eye-tracking unit 130 may use the detected torsion of the eye to
estimate the orientation of the foveal axis from the pupillary
axis. Eye-tracking unit 130 may also track a change in the shape of
the eye, which may be approximated as a skew or scaling linear
transform or a twisting distortion (e.g., due to torsional
deformation). Eye-tracking unit 130 may estimate the foveal axis
based on some combinations of the angular orientation of the
pupillary axis, the translation of the eye, the torsion of the eye,
and the current shape of the eye.
[0041] In some embodiments, eye-tracking unit 130 may include
multiple emitters or at least one emitter that can project a
structured light pattern on all portions or a portion of the eye.
The structured light pattern may be distorted due to the shape of
the eye when viewed from an offset angle. Eye-tracking unit 130 may
also include at least one camera that may detect the distortions
(if any) of the structured light pattern projected onto the eye.
The camera may be oriented on a different axis to the eye than the
emitter. By detecting the deformation of the structured light
pattern on the surface of the eye, eye-tracking unit 130 may
determine the shape of the portion of the eye being illuminated by
the structured light pattern. Therefore, the captured distorted
light pattern may be indicative of the 3D shape of the illuminated
portion of the eye. The orientation of the eye may thus be derived
from the 3D shape of the illuminated portion of the eye.
Eye-tracking unit 130 can also estimate the pupillary axis, the
translation of the eye, the torsion of the eye, and the current
shape of the eye based on the image of the distorted structured
light pattern captured by the camera.
[0042] Near-eye display 120 may use the orientation of the eye to,
e.g., determine an inter-pupillary distance (IPD) of the user,
determine gaze direction, introduce depth cues (e.g., blur image
outside of the user's main line of sight), collect heuristics on
the user interaction in the VR media (e.g., time spent on any
particular subject, object, or frame as a function of exposed
stimuli), some other functions that are based in part on the
orientation of at least one of the user's eyes, or some combination
thereof. Because the orientation may be determined for both eyes of
the user, eye-tracking unit 130 may be able to determine where the
user is looking. For example, determining a direction of a user's
gaze may include determining a point of convergence based on the
determined orientations of the user's left and right eyes. A point
of convergence may be the point where the two foveal axes of the
user's eyes intersect (or the nearest point between the two axes).
The direction of the user's gaze may be the direction of a line
passing through the point of convergence and the mid-point between
the pupils of the user's eyes.
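The convergence geometry described above can be made concrete with a short sketch that treats the two foveal axes as 3-D lines, finds the midpoint of the shortest segment between them, and derives a gaze direction from that point and the midpoint between the pupils. The eye positions and axis directions in the example are made-up inputs, not values from this disclosure.

    import numpy as np

    def convergence_point(p_left, d_left, p_right, d_right):
        """Nearest point between two foveal axes, each given as origin + direction.

        Returns the midpoint of the shortest segment joining the two 3-D lines,
        i.e., the point of convergence when the axes do not exactly intersect.
        """
        d1 = d_left / np.linalg.norm(d_left)
        d2 = d_right / np.linalg.norm(d_right)
        w0 = p_left - p_right
        a, b, c = d1 @ d1, d1 @ d2, d2 @ d2
        d, e = d1 @ w0, d2 @ w0
        denom = a * c - b * b          # ~0 only if the axes are parallel
        t = (b * e - c * d) / denom
        s = (a * e - b * d) / denom
        return 0.5 * ((p_left + t * d1) + (p_right + s * d2))

    # Made-up example: pupils 64 mm apart, both foveal axes angled slightly inward.
    p_l, p_r = np.array([-0.032, 0.0, 0.0]), np.array([0.032, 0.0, 0.0])
    d_l, d_r = np.array([0.06, 0.0, 1.0]), np.array([-0.06, 0.0, 1.0])
    conv = convergence_point(p_l, d_l, p_r, d_r)
    midpoint = 0.5 * (p_l + p_r)
    gaze_dir = (conv - midpoint) / np.linalg.norm(conv - midpoint)
    print(conv, gaze_dir)   # converges near (0, 0, 0.53) m, gazing along +z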
[0043] Input/output interface 140 may be a device that allows a
user to send action requests to console 110. An action request may
be a request to perform a particular action. For example, an action
request may be to start or to end an application or to perform a
particular action within the application. Input/output interface
140 may include one or more input devices. Example input devices
may include a keyboard, a mouse, a game controller, a glove, a
button, a touch screen, or any other suitable device for receiving
action requests and communicating the received action requests to
console 110. An action request received by the input/output
interface 140 may be communicated to console 110, which may perform
an action corresponding to the requested action. In some
embodiments, input/output interface 140 may provide haptic feedback
to the user in accordance with instructions received from console
110. For example, input/output interface 140 may provide haptic
feedback when an action request is received, or when console 110
has performed a requested action and communicates instructions to
input/output interface 140.
[0044] Console 110 may provide content to near-eye display 120 for
presentation to the user in accordance with information received
from one or more of external imaging device 150, near-eye display
120, and input/output interface 140. In the example shown in FIG.
1, console 110 may include an application store 112, a headset
tracking module 114, a virtual reality engine 116, and eye-tracking
module 118. Some embodiments of console 110 may include different
or additional modules than those described in conjunction with FIG.
1. Functions further described below may be distributed among
components of console 110 in a different manner than is described
here.
[0045] In some embodiments, console 110 may include a processor and
a non-transitory computer-readable storage medium storing
instructions executable by the processor. The processor may include
multiple processing units executing instructions in parallel. The
computer-readable storage medium may be any memory, such as a hard
disk drive, a removable memory, or a solid-state drive (e.g., flash
memory or dynamic random access memory (DRAM)). In various
embodiments, the modules of console 110 described in conjunction
with FIG. 1 may be encoded as instructions in the non-transitory
computer-readable storage medium that, when executed by the
processor, cause the processor to perform the functions further
described below.
[0046] Application store 112 may store one or more applications for
execution by console 110. An application may include a group of
instructions that, when executed by a processor, generates content
for presentation to the user. Content generated by an application
may be in response to inputs received from the user via movement of
the user's eyes or inputs received from the input/output interface
140. Examples of the applications may include gaming applications,
conferencing applications, video playback applications, or other
suitable applications.
[0047] Headset tracking module 114 may track movements of near-eye
display 120 using slow calibration information from external
imaging device 150. For example, headset tracking module 114 may
determine positions of a reference point of near-eye display 120
using observed locators from the slow calibration information and a
model of near-eye display 120. Headset tracking module 114 may also
determine positions of a reference point of near-eye display 120
using position information from the fast calibration information.
Additionally, in some embodiments, headset tracking module 114 may
use portions of the fast calibration information, the slow
calibration information, or some combination thereof, to predict a
future location of near-eye display 120. Headset tracking module
114 may provide the estimated or predicted future position of
near-eye display 120 to VR engine 116.
[0048] Headset tracking module 114 may calibrate the artificial
reality system environment 100 using one or more calibration
parameters, and may adjust one or more calibration parameters to
reduce errors in determining the position of near-eye display 120.
For example, headset tracking module 114 may adjust the focus of
external imaging device 150 to obtain a more accurate position for
observed locators on near-eye display 120. Moreover, calibration
performed by headset tracking module 114 may also account for
information received from IMU 132. Additionally, if tracking of
near-eye display 120 is lost (e.g., external imaging device 150
loses line of sight of at least a threshold number of locators
126), headset tracking module 114 may re-calibrate some or all of
the calibration parameters.
[0049] VR engine 116 may execute applications within artificial
reality system environment 100 and receive position information of
near-eye display 120, acceleration information of near-eye display
120, velocity information of near-eye display 120, predicted future
positions of near-eye display 120, or some combination thereof from
headset tracking module 114. VR engine 116 may also receive
estimated eye position and orientation information from
eye-tracking module 118. Based on the received information, VR
engine 116 may determine content to provide to near-eye display 120
for presentation to the user. For example, if the received
information indicates that the user has looked to the left, VR
engine 116 may generate content for near-eye display 120 that
mirrors the user's eye movement in a virtual environment.
[0050] Additionally, VR engine 116 may perform an action within an
application executing on console 110 in response to an action
request received from input/output interface 140, and provide
feedback to the user indicating that the action has been performed.
The feedback may be visual or audible feedback via near-eye display
120 or haptic feedback via input/output interface 140.
[0051] Eye-tracking module 118 may receive eye-tracking data from
eye-tracking unit 130 and determine the position of the user's eye
based on the eye tracking data. The position of the eye may include
an eye's orientation, location, or both relative to near-eye
display 120 or any element thereof. Because the eye's axes of
rotation change as a function of the eye's location in its socket,
determining the eye's location in its socket may allow eye-tracking
module 118 to more accurately determine the eye's orientation.
[0052] In some embodiments, eye-tracking unit 130 may output
eye-tracking data including images of the eye, and eye-tracking
module 118 may determine the eye's position based on the images.
For example, eye-tracking module 118 may store a mapping between
images captured by eye-tracking unit 130 and eye positions to
determine a reference eye position from an image captured by
eye-tracking unit 130. Alternatively or additionally, eye-tracking
module 118 may determine an updated eye position relative to a
reference eye position by comparing an image from which the
reference eye position is determined to an image from which the
updated eye position is to be determined. Eye-tracking module 118
may determine eye position using measurements from different
imaging devices or other sensors. For example, as described above,
eye-tracking module 118 may use measurements from a slow
eye-tracking system to determine a reference eye position, and then
determine updated positions relative to the reference eye position
from a fast eye-tracking system until a next reference eye position
is determined based on measurements from the slow eye-tracking
system.
[0053] Eye-tracking module 118 may also determine eye calibration
parameters to improve precision and accuracy of eye tracking. Eye
calibration parameters may include parameters that may change
whenever a user dons or adjusts near-eye display 120. Example eye
calibration parameters may include an estimated distance between a
component of eye-tracking unit 130 and one or more parts of the
eye, such as the eye's center, pupil, cornea boundary, or a point
on the surface of the eye. Other example eye calibration parameters
may be specific to a particular user and may include an estimated
average eye radius, an average corneal radius, an average sclera
radius, a map of features on the eye surface, and an estimated eye
surface contour. In embodiments where light from the outside of
near-eye display 120 may reach the eye (as in some augmented
reality applications), the calibration parameters may include
correction factors for intensity and color balance due to
variations in light from the outside of near-eye display 120.
Eye-tracking module 118 may use eye calibration parameters to
determine whether the measurements captured by eye-tracking unit
130 would allow eye-tracking module 118 to determine an accurate
eye position (also referred to herein as "valid measurements").
Invalid measurements, from which eye-tracking module 118 may not be
able to determine an accurate eye position, may be caused by the
user blinking, adjusting the headset, or removing the headset,
and/or may be caused by near-eye display 120 experiencing greater
than a threshold change in illumination due to external light.
[0054] FIG. 2 is a perspective view of a simplified example
near-eye display 200 including various sensors. Near-eye display
200 may be a specific implementation of near-eye display 120 of
FIG. 1, and may be configured to operate as a virtual reality
display, an augmented reality display, and/or a mixed reality
display. Near-eye display 200 may include a frame 205 and a display
210. Display 210 may be configured to present content to a user. In
some embodiments, display 210 may include display electronics
and/or display optics. For example, as described above with respect
to near-eye display 120 of FIG. 1, display 210 may include an LCD
display panel, an LED display panel, or an optical display panel
(e.g., a waveguide display assembly).
[0055] Near-eye display 200 may further include various sensors
250a, 250b, 250c, 250d, and 250e on or within frame 205. In some
embodiments, sensors 250a-250e may include one or more depth
sensors, motion sensors, position sensors, inertial sensors, or
ambient light sensors. In some embodiments, sensors 250a-250e may
include one or more image sensors configured to generate image data
representing different fields of views in different directions. In
some embodiments, sensors 250a-250e may be used as input devices to
control or influence the displayed content of near-eye display 200,
and/or to provide an interactive VR/AR/MR experience to a user of
near-eye display 200. In some embodiments, sensors 250a-250e may
also be used for stereoscopic imaging.
[0056] In some embodiments, near-eye display 200 may further
include one or more illuminators 230 to project light into the
physical environment. The projected light may be associated with
different frequency bands (e.g., visible light, infra-red light,
ultra-violet light, etc.), and may serve various purposes. For
example, illuminator(s) 230 may project light in a dark environment
(or in an environment with low intensity of infra-red light,
ultra-violet light, etc.) to assist sensors 250a-250e in capturing
images of different objects within the dark environment. In some
embodiments, illuminator(s) 230 may be used to project certain
light pattern onto the objects within the environment. In some
embodiments, illuminator(s) 230 may be used as locators, such as
locators 126 described above with respect to FIG. 1.
[0057] In some embodiments, near-eye display 200 may also include a
high-resolution camera 240. Camera 240 may capture images of the
physical environment in the field of view. The captured images may
be processed, for example, by a virtual reality engine (e.g.,
virtual reality engine 116 of FIG. 1) to add virtual objects to the
captured images or modify physical objects in the captured images,
and the processed images may be displayed to the user by display
210 for AR or MR applications.
[0058] Embodiments of the invention may include or be implemented
in conjunction with an artificial reality system. Artificial
reality is a form of reality that has been adjusted in some manner
before presentation to a user, which may include, e.g., a virtual
reality (VR), an augmented reality (AR), a mixed reality (MR), a
hybrid reality, or some combination and/or derivatives thereof.
Artificial reality content may include completely generated content
or generated content combined with captured (e.g., real-world)
content. The artificial reality content may include video, audio,
haptic feedback, or some combination thereof, and any of which may
be presented in a single channel or in multiple channels (such as
stereo video that produces a three-dimensional effect to the
viewer). Additionally, in some embodiments, artificial reality may
also be associated with applications, products, accessories,
services, or some combination thereof, that are used to, e.g.,
create content in an artificial reality and/or are otherwise used
in (e.g., perform activities in) an artificial reality. The
artificial reality system that provides the artificial reality
content may be implemented on various platforms, including a
head-mounted display (HMD) connected to a host computer system, a
standalone HMD, a mobile device or computing system, or any other
hardware platform capable of providing artificial reality content
to one or more viewers.
[0059] FIG. 3 is a perspective view of an example near-eye display
in the form of a head-mounted display (HMD) device 300 for
implementing some of the example near-eye displays (e.g., near-eye
display 120) disclosed herein. HMD device 300 may be a part of,
e.g., a virtual reality (VR) system, an augmented reality (AR)
system, a mixed reality (MR) system, or some combinations thereof.
HMD device 300 may include a body 320 and a head strap 330. FIG. 3
shows a top side 323, a front side 325, and a right side 327 of
body 320 in the perspective view. Head strap 330 may have an
adjustable or extendible length. There may be a sufficient space
between body 320 and head strap 330 of HMD device 300 for allowing
a user to mount HMD device 300 onto the user's head. In various
embodiments, HMD device 300 may include additional, fewer, or
different components. For example, in some embodiments, HMD device
300 may include eyeglass temples and temple tips, rather than head
strap 330.
[0060] HMD device 300 may present to a user media including virtual
and/or augmented views of a physical, real-world environment with
computer-generated elements. Examples of the media presented by HMD
device 300 may include images (e.g., two-dimensional (2D) or
three-dimensional (3D) images), videos (e.g., 2D or 3D videos),
audios, or some combinations thereof. The images and videos may be
presented to each eye of the user by one or more display assemblies
(not shown in FIG. 3) enclosed in body 320 of HMD device 300. In
various embodiments, the one or more display assemblies may include
a single electronic display panel or multiple electronic display
panels (e.g., one display panel for each eye of the user). Examples
of the electronic display panel(s) may include, for example, a
liquid crystal display (LCD), an organic light emitting diode
(OLED) display, an inorganic light emitting diode (ILED) display, a
micro-LED display, an active-matrix organic light emitting diode
(AMOLED) display, a transparent organic light emitting diode
(TOLED) display, some other display, or some combinations thereof.
HMD device 300 may include two eye box regions.
[0061] In some implementations, HMD device 300 may include various
sensors (not shown), such as depth sensors, motion sensors,
position sensors, and eye tracking sensors. Some of these sensors
may use a structured light pattern for sensing. In some
implementations, HMD device 300 may include an input/output
interface for communicating with a console. In some
implementations, HMD device 300 may include a virtual reality
engine (not shown) that can execute applications within HMD device
300 and receive depth information, position information,
acceleration information, velocity information, predicted future
positions, or some combination thereof of HMD device 300 from the
various sensors. In some implementations, the information received
by the virtual reality engine may be used for producing a signal
(e.g., display instructions) to the one or more display assemblies.
In some implementations, HMD device 300 may include locators (not
shown, such as locators 126) located in fixed positions on body 320
relative to one another and relative to a reference point. Each of
the locators may emit light that is detectable by an external
imaging device.
[0062] FIG. 4 is a simplified block diagram of an example
electronic system 400 of an example near-eye display (e.g., HMD
device) for implementing some of the examples disclosed herein.
Electronic system 400 may be used as the electronic system of HMD
device 300 or other near-eye displays described above. In this
example, electronic system 400 may include one or more processor(s)
410 and a memory 420. Processor(s) 410 may be configured to execute
instructions for performing operations at a number of components,
and can be, for example, a general-purpose processor or
microprocessor suitable for implementation within a portable
electronic device. Processor(s) 410 may be communicatively coupled
with a plurality of components within electronic system 400. To
realize this communicative coupling, processor(s) 410 may
communicate with the other illustrated components across a bus 440.
Bus 440 may be any subsystem adapted to transfer data within
electronic system 400. Bus 440 may include a plurality of computer
buses and additional circuitry to transfer data.
[0063] Memory 420 may be coupled to processor(s) 410. In some
embodiments, memory 420 may offer both short-term and long-term
storage and may be divided into several units. Memory 420 may be
volatile, such as static random access memory (SRAM) and/or dynamic
random access memory (DRAM) and/or non-volatile, such as read-only
memory (ROM), flash memory, and the like. Furthermore, memory 420
may include removable storage devices, such as secure digital (SD)
cards. Memory 420 may provide storage of computer-readable
instructions, data structures, program modules, and other data for
electronic system 400. In some embodiments, memory 420 may be
distributed into different hardware modules. A set of instructions
and/or code might be stored on memory 420. The instructions might
take the form of executable code that may be executable by
electronic system 400, and/or might take the form of source and/or
installable code, which, upon compilation and/or installation on
electronic system 400 (e.g., using any of a variety of generally
available compilers, installation programs,
compression/decompression utilities, etc.), may take the form of
executable code.
[0064] In some embodiments, memory 420 may store a plurality of
application modules 422 through 424, which may include any number
of applications. Examples of applications may include gaming
applications, conferencing applications, video playback
applications, or other suitable applications. The applications may
include a depth sensing function or eye tracking function.
Application modules 422-424 may include particular instructions to
be executed by processor(s) 410. In some embodiments, certain
applications or parts of application modules 422-424 may be
executable by other hardware modules 480. In certain embodiments,
memory 420 may additionally include secure memory, which may
include additional security controls to prevent copying or other
unauthorized access to secure information.
[0065] In some embodiments, memory 420 may include an operating
system 425 loaded therein. Operating system 425 may be operable to
initiate the execution of the instructions provided by application
modules 422-424 and/or manage other hardware modules 480, as well as
interface with wireless communication subsystem 430, which may
include one or more wireless transceivers. Operating system 425 may
be adapted to perform other operations across the components of
electronic system 400 including threading, resource management,
data storage control and other similar functionality.
[0066] Wireless communication subsystem 430 may include, for
example, an infrared communication device, a wireless communication
device and/or chipset (such as a Bluetooth® device, an IEEE
802.11 device, a Wi-Fi device, a WiMax device, cellular
communication facilities, etc.), and/or similar communication
interfaces. Electronic system 400 may include one or more antennas
434 for wireless communication as part of wireless communication
subsystem 430 or as a separate component coupled to any portion of
the system. Depending on desired functionality, wireless
communication subsystem 430 may include separate transceivers to
communicate with base transceiver stations and other wireless
devices and access points, which may include communicating with
different data networks and/or network types, such as wireless
wide-area networks (WWANs), wireless local area networks (WLANs),
or wireless personal area networks (WPANs). A WWAN may be, for
example, a WiMax (IEEE 802.16) network. A WLAN may be, for example,
an IEEE 802.11x network. A WPAN may be, for example, a Bluetooth
network, an IEEE 802.15x network, or some other type of network. The
techniques described herein may also be used for any combination of
WWAN, WLAN, and/or WPAN. Wireless communication subsystem 430 may
permit data to be exchanged with a network, other computer systems,
and/or any other devices described herein. Wireless communication
subsystem 430 may include a means for transmitting or receiving
data, such as identifiers of HMD devices, position data, a
geographic map, a heat map, photos, or videos, using antenna(s) 434
and wireless link(s) 432. Wireless communication subsystem 430,
processor(s) 410, and memory 420 may together comprise at least a
part of one or more of a means for performing some functions
disclosed herein.
[0067] Embodiments of electronic system 400 may also include one or
more sensors 490. Sensor(s) 490 may include, for example, an image
sensor, an accelerometer, a pressure sensor, a temperature sensor,
a proximity sensor, a magnetometer, a gyroscope, an inertial sensor
(e.g., a module that combines an accelerometer and a gyroscope), an
ambient light sensor, or any other similar module operable to
provide sensory output and/or receive sensory input, such as a
depth sensor or a position sensor. For example, in some
implementations, sensor(s) 490 may include one or more inertial
measurement units (IMUs) and/or one or more position sensors. An
IMU may generate calibration data indicating an estimated position
of the HMD device relative to an initial position of the HMD
device, based on measurement signals received from one or more of
the position sensors. A position sensor may generate one or more
measurement signals in response to motion of the HMD device.
Examples of the position sensors may include, but are not limited
to, one or more accelerometers, one or more gyroscopes, one or more
magnetometers, another suitable type of sensor that detects motion,
a type of sensor used for error correction of the IMU, or some
combination thereof. The position sensors may be located external
to the IMU, internal to the IMU, or some combination thereof. At
least some sensors may use a structured light pattern for
sensing.
[0068] Electronic system 400 may include a display module 460.
Display module 460 may be a near-eye display, and may graphically
present information, such as images, videos, and various
instructions, from electronic system 400 to a user. Such
information may be derived from one or more application modules
422-424, virtual reality engine 426, one or more other hardware
modules 480, a combination thereof, or any other suitable means for
resolving graphical content for the user (e.g., by operating system
425). Display module 460 may use liquid crystal display (LCD)
technology, light-emitting diode (LED) technology
(including, for example, OLED, ILED, mLED, AMOLED, TOLED, etc.),
light emitting polymer display (LPD) technology, or some other
display technology.
[0070] Electronic system 400 may include a user input/output module
470. User input/output module 470 may allow a user to send action
requests to electronic system 400. An action request may be a
request to perform a particular action. For example, an action
request may be to start or end an application or to perform a
particular action within the application. User input/output module
470 may include one or more input devices. Example input devices
may include a touchscreen, a touch pad, microphone(s), button(s),
dial(s), switch(es), a keyboard, a mouse, a game controller, or any
other suitable device for receiving action requests and
communicating the received action requests to electronic system
400. In some embodiments, user input/output module 470 may provide
haptic feedback to the user in accordance with instructions
received from electronic system 400. For example, the haptic
feedback may be provided when an action request is received or has
been performed.
[0071] Electronic system 400 may include a camera 450 that may be
used to take photos or videos of a user, for example, for tracking
the user's eye position. Camera 450 may also be used to take photos
or videos of the environment, for example, for VR, AR, or MR
applications. Camera 450 may include, for example, a complementary
metal-oxide-semiconductor (CMOS) image sensor with a few million
or tens of millions of pixels. In some implementations, camera 450
may include two or more cameras that may be used to capture 3-D
images.
[0072] In some embodiments, electronic system 400 may include a
plurality of other hardware modules 480. Each of other hardware
modules 480 may be a physical module within electronic system 400.
While each of other hardware modules 480 may be permanently
configured as a structure, some of other hardware modules 480 may
be temporarily configured to perform specific functions or
temporarily activated. Examples of other hardware modules 480 may
include, for example, an audio output and/or input module (e.g., a
microphone or speaker), a near field communication (NFC) module, a
rechargeable battery, a battery management system, a wired/wireless
battery charging system, etc. In some embodiments, one or more
functions of other hardware modules 480 may be implemented in
software.
[0073] In some embodiments, memory 420 of electronic system 400 may
also store a virtual reality engine 426. Virtual reality engine 426
may execute applications within electronic system 400 and receive
position information, acceleration information, velocity
information, predicted future positions, or some combination
thereof of the HMD device from the various sensors. In some
embodiments, the information received by virtual reality engine 426
may be used for producing a signal (e.g., display instructions) to
display module 460. For example, if the received information
indicates that the user has looked to the left, virtual reality
engine 426 may generate content for the HMD device that mirrors the
user's movement in a virtual environment. Additionally, virtual
reality engine 426 may perform an action within an application in
response to an action request received from user input/output
module 470 and provide feedback to the user. The provided feedback
may be visual, audible, or haptic feedback. In some
implementations, processor(s) 410 may include one or more GPUs that
may execute virtual reality engine 426.
[0074] In various implementations, the above-described hardware and
modules may be implemented on a single device or on multiple
devices that can communicate with one another using wired or
wireless connections. For example, in some implementations, some
components or modules, such as GPUs, virtual reality engine 426,
and applications (e.g., tracking application), may be implemented
on a console separate from the head-mounted display device. In some
implementations, one console may be connected to or support more
than one HMD.
[0075] In alternative configurations, different and/or additional
components may be included in electronic system 400. Similarly,
functionality of one or more of the components can be distributed
among the components in a manner different from the manner
described above. For example, in some embodiments, electronic
system 400 may be modified to include other system environments,
such as an AR system environment and/or an MR environment.
[0076] As discussed above, LEDs may be used as light sources in
various parts of an artificial reality system, such as the display
electronics 122, the locators 126, and the eye tracking unit 130.
Further, LEDs may be used in various display technologies, such as
heads-up displays, television displays, smartphone displays, watch
displays, wearable displays, and flexible displays. LEDs can be
used in combination with a plurality of sensors in many
applications such as the Internet of Things (IoT). The LEDs
described herein can be configured to emit light having any desired
wavelength, such as ultraviolet, visible, or infrared light. Also,
the LEDs described herein can be configured to have any suitable
mesa shape, such as planar, vertical, conical, semi-parabolic,
parabolic, or combinations thereof. The LEDs described herein may
be micro-LEDs that have an active light emitting area with a linear
dimension that is less than 50 µm, less than 20 µm, or less
than 10 µm. For example, the linear dimension may be as small as
2 µm or 4 µm.
[0077] FIG. 5A is a simplified block diagram of a related art
system 500 for performing micropatterning. The system 500 includes
a laser 510 that emits a laser beam 515. The laser 510 may be an
ultrafast pulsed laser, such as a femtosecond or picosecond laser.
The laser beam 515 is received by a microscope objective 520, which
focuses the laser beam 515 to a focal point 525. In order to
perform micropatterning, the focal point 525 may be scanned across
a material to create a plurality of features. Each feature is
patterned individually, resulting in lengthy processing times to
create the plurality of features. Also, it is challenging to
fabricate each of the plurality of features to have a consistent
shape and size.
[0078] FIG. 5B is a simplified block diagram of a system 505 for
performing micropatterning according to certain embodiments of the
invention. The system 505 includes a laser 530 that emits a laser
beam 535. The laser 530 may be an ultrafast pulsed laser, such as a
femtosecond or picosecond laser. Alternatively, the laser 530 may be a
continuous wave laser or a collimated ultraviolet (UV) light
source. The laser beam 535 may be received by a beam confiner 540,
which prevents light from the laser beam 535 from escaping from the
system 505. The beam confiner 540 may have any suitable shape, such
as a cylinder. For example, the beam confiner 540 may be a
microscope objective from which the optics have been removed.
Further, the beam confiner 540 may collimate the laser beam
535.
[0079] A beam homogenizer 545 may shape the laser beam 535 to have
a flat-top profile. The laser beam 535 with the flat-top profile
may have uniform power across the entire beam profile, or across a
portion of the beam profile. The uniform power may vary across the
beam profile within an acceptable range for micropatterning, such
as ±0.1%, ±1%, ±5%, or ±10%. The laser beam 535 having
the flat-top profile may be provided to a lenslet array 550, which
includes a plurality of lenslets 555. The lenslet array 550 may be
positioned on an exit window of the beam homogenizer 545. Each of
the lenslets 555 may have the same shape. The same shape includes
lenslets 555 that are identical, as well as lenslets 555 that may
have minute differences due to variations in the manufacturing
process. The laser beam 535 having the flat-top profile may provide
approximately the same energy and beam shape to each of the
lenslets 555. Each of the lenslets 555 focuses a respective portion
of the laser beam 535 to a respective focal point 560. Each of the
focal points 560 may be formed at the same distance with respect to
the lenslet array 550.
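For illustration only (this sketch is not part of the disclosed system; the sample profile, tolerance value, and function names are assumed), the following Python snippet checks whether the central portion of a sampled beam profile is uniform in power to within one of the tolerances mentioned above:

    import numpy as np

    def is_flat_top(profile, tolerance=0.05, central_fraction=0.8):
        """Return True if the central region of a 1-D beam profile is uniform
        in power to within +/- tolerance of its mean (0.05 means +/-5%)."""
        n = len(profile)
        start = int(n * (1 - central_fraction) / 2)
        central = np.asarray(profile[start:n - start], dtype=float)
        mean = central.mean()
        return np.all(np.abs(central - mean) <= tolerance * mean)

    # Hypothetical sampled profile: a flat center with soft shoulders.
    x = np.linspace(-1, 1, 201)
    profile = 1.0 / (1.0 + (x / 0.95) ** 40)   # super-Gaussian-like flat top
    print(is_flat_top(profile, tolerance=0.01))  # True: within +/-1%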
[0080] The system 505 may be used to create a plurality of features
simultaneously. If the lenslet array 550 includes n lenslets 555,
the system 505 may pattern n features simultaneously by providing
the lenslet array 550 with the shaped laser beam 535. Each of the n
features may have the same shape or approximately the same shape.
If each of the n lenslets 555 has the same shape and the beam
profile and energy of the portion of the shaped laser beam 535 that
is incident on each of the n lenslets 555 is the same, then each of
the n features will have the same shape. The n features may have
slightly different characteristics if there are any inconsistencies
in the shapes of the n lenslets 555, the beam profile of the shaped
laser beam 535, and/or the energy of the shaped laser beam 535.
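As a rough, illustrative throughput comparison (the feature count, per-feature exposure time, and lenslet count below are hypothetical, and stage-stepping time between exposures is ignored in this sketch):

    # Serial single-spot patterning versus patterning n features per exposure
    # with an n-lenslet array, as described for system 505. Values are assumed.

    def total_time_serial(num_features, seconds_per_feature):
        return num_features * seconds_per_feature

    def total_time_parallel(num_features, seconds_per_feature, num_lenslets):
        exposures = -(-num_features // num_lenslets)  # ceiling division
        return exposures * seconds_per_feature

    print(total_time_serial(7200, 0.5))        # 3600.0 s, one feature at a time
    print(total_time_parallel(7200, 0.5, 72))  # 50.0 s, 72 features per exposure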
[0081] The lenslets 555 may be arranged in any suitable
configuration. For example, in order to fabricate a structure with
features in a hexagonal pattern, the lenslets 555 may be arranged
in a hexagonal pattern. The lenslets 555 may have any suitable
pitch, such as 10 µm, 20 µm, 30 µm, or 50 µm. The shape
of each lenslet 555 and the configuration of the lenslets 555
within the lenslet array 550 may be customized to achieve a desired
shape and configuration of the patterned features.
[0082] The shape of the features may be controlled by moving the
system 505 along the x, y, and/or z axes, or any other suitable
three-dimensional coordinate system, before or while the laser 530
is emitting the laser beam 535. Alternatively or in addition, the
shape of the features may be controlled by moving the material that
is being patterned along the x, y, and/or z axes, or any other
suitable three-dimensional coordinate system, before or while the
laser 530 is emitting the laser beam 535. Additional sets of
features may be patterned by moving the system 505 and/or the
material that is being patterned, and then repeating the
simultaneous patterning of n features in a different region of the
material that is being patterned. The features may be patterned by
two-photon polymerization, in which two photons having a single
wavelength λ are focused in a small volume of the material,
inducing a transition that is normally only possible at a
wavelength of λ/2.
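Stated in energy terms (a standard relation included here only for clarity; h is Planck's constant and c the speed of light):

    E_{\text{transition}} = 2\,\frac{hc}{\lambda} = \frac{hc}{\lambda/2}
    % e.g., two photons at \lambda = 800 nm supply the energy of a single
    % 400 nm photon; the material responds only near the focal volume, where
    % the photon density is high enough for two-photon absorption.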
[0083] FIG. 6 is an example of a structure 600 that can be created
by the system 505 for performing micropatterning. The structure 600
includes a plurality of mesas 620 that are formed on a substrate
610. In order to form this pattern, the lenslet array 550 and/or
the substrate 610 may be moved such that the focal point 560 of
each lenslet 555 is scanned in concentric circles on a material
formed on the substrate 610. For example, the material may be a
semiconductor material such as GaN. The concentric circles may be
formed at different depths through the material formed on the
substrate 610. If the lenslet array 550 includes 72 lenslets 555 in
a rectangular array, then 72 mesas 620 can be formed simultaneously
on the substrate 610. The mesas 620 may be formed by any suitable
alternative method, such as scanning the lenslet array 550 and/or
the substrate 610 in a line-by-line or a layer-by-layer pattern.
For example, the lenslet array 550 and/or the substrate 610 may be
scanned in a two-dimensional pattern or a three-dimensional
pattern. The mesas 620 may have any suitable shape, such as
parabolic, semi-parabolic, planar, vertical, or conical. The mesas
620 may be used to form an array of LEDs that may be used in a
variety of applications, such as various parts of an artificial
reality system or display technologies.
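To make the scanning geometry concrete, the following illustrative Python sketch (the radii, depths, and sampling are assumed values, not taken from the disclosure) generates the (x, y, z) path traced by a single focal point 560; each lenslet 555 traces the same relative path in parallel, offset by the lenslet pitch:

    import math

    def concentric_circle_scan(radii_um, depths_um, points_per_circle=64):
        """Yield (x, y, z) coordinates, in micrometers, for a focal point
        scanned in concentric circles at each of several depths."""
        for z in depths_um:
            for r in radii_um:
                for i in range(points_per_circle):
                    theta = 2.0 * math.pi * i / points_per_circle
                    yield (r * math.cos(theta), r * math.sin(theta), z)

    # Example: three circles repeated at three depths (illustrative values).
    path = list(concentric_circle_scan(radii_um=[2.0, 4.0, 6.0],
                                       depths_um=[0.0, -1.0, -2.0]))
    print(len(path), path[0])   # 576 points; first point is (2.0, 0.0, 0.0)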
[0084] FIGS. 7A and 7B are additional examples of structures 700
and 705 that can be created by the system 505 for performing
micropatterning. Referring to FIG. 7A, the structure 700 includes a
wafer 720 on which a polymer layer 715 is formed. For example, the
polymer layer 715 may be formed by spin coating or drop casting.
The wafer 720 may be made of various materials that can be
structured with a laser beam, such as glass or fused silica. A
plurality of mesas 730 may be formed on a carrier 710 and then
embedded within the polymer layer 715.
[0085] The system 505 may be used to simultaneously create a
plurality of holes 725 within the wafer 720. The holes 725 may be
patterned by positioning the lenslet array 550 and/or the wafer 720
such that the focal point 560 of each lenslet 555 is formed on the
wafer 720. The laser beam 535 may be applied until each hole 725
penetrates the entire depth of the wafer 720. It may be possible to
pattern small holes 725 without moving the lenslet array 550 and/or
the wafer 720 laterally. A cross-section of each hole 725 may have
any suitable shape, such as a round shape or a parallelogram shape.
In the example shown in FIG. 7A, each hole 725 may have a round
shape with a diameter between 10 µm and 30 µm. The pitch of
the holes 725 may be larger than the pitch of the mesas 730 and/or
electrical contacts, such as the electrical contact 910 shown in
FIG. 9. The holes 725 may be evenly distributed throughout the
wafer 720, such that no alignment between the holes 725 and the
mesas 730 is required. A plasma species may be injected through the
holes 725 in order to etch the polymer layer 715.
[0086] Referring to FIG. 7B, the structure 705 includes a wafer 750
on which a polymer layer 745 is formed. For example, the polymer
layer 745 may be formed by spin coating or drop casting. The wafer
750 may be made of various materials that can be structured with a
laser beam, such as glass or fused silica. A plurality of mesas 760
may be formed on a carrier 740 and then embedded within the polymer
layer 745. The system 505 may be used to simultaneously create a
plurality of holes 755 within the wafer 750. The holes 755 may be
patterned by positioning the lenslet array 550 and/or the wafer 750
such that the focal point 560 of each lenslet 555 is formed on the
wafer 750. The laser beam 535 may be applied until each hole 755
penetrates the entire depth of the wafer 750. In order to pattern
larger holes 755, the lenslet array 550 and/or the wafer 750 may be
moved laterally in order to increase the lateral dimensions of the
holes 755. A cross-section of each hole 755 may have any suitable
shape, such as a round shape or a parallelogram shape. In the
example shown in FIG. 7B, each hole 755 may have a parallelogram
shape with dimensions of 60 µm × 420 µm. The pitch of the
holes 755 may be larger than the pitch of the mesas 760. The holes
755 may be arranged in a pattern that matches the groups of mesas
760, such that alignment between the holes 755 and the mesas 760 is
required. A plasma species may be injected through the holes 755 in
order to etch the polymer layer 745.
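As an illustration of enlarging holes by lateral motion (the raster-style path, step size, and footprint dimensions below are assumptions for the sketch, not a description of the disclosed process):

    def rectangular_hole_scan(width_um, height_um, step_um):
        """Lateral (x, y) offsets, relative to a lenslet's nominal focal point,
        for enlarging a hole to a rectangular footprint by scanning."""
        offsets = []
        y = 0.0
        row = 0
        while y <= height_um:
            xs = [i * step_um for i in range(int(width_um / step_um) + 1)]
            if row % 2:           # serpentine: reverse every other row
                xs = xs[::-1]
            offsets.extend((x, y) for x in xs)
            y += step_um
            row += 1
        return offsets

    # Illustrative: a 60 um x 420 um footprint sampled on a 5 um grid.
    print(len(rectangular_hole_scan(60.0, 420.0, 5.0)))  # 1105 offsets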
[0087] FIGS. 8A and 8B are additional examples of structures 800
and 805 that can be created by the system 505 for performing
micropatterning. Referring to FIG. 8A, the structure 800 includes a
silicon substrate 815 on which a metal layer 810 is formed. The
metal layer 810 may include a noble metal, such as gold, silver, or
platinum. The system 505 may be used to simultaneously create a
plurality of holes 820 within the metal layer 810. The holes 820
may be patterned by positioning the lenslet array 550 and/or the
silicon substrate 815 such that the focal point 560 of each lenslet
555 is formed on the metal layer 810. The laser beam 535 may be
applied until each hole 820 penetrates the entire depth of the
metal layer 810.
[0088] Referring to FIG. 8B, metal-assisted chemical etching may be
used to form silicon nanowires 825 on the silicon substrate 815.
For example, a solution of HF and H₂O₂ may be applied to
the structure 800 shown in FIG. 8A, causing the metal in the metal
layer 810 to catalyze etching of the portions of the silicon substrate
815 beneath the remaining portions of the metal layer 810. A line or a
two-dimensional array of silicon nanowires 825 may be formed.
[0089] FIG. 9 is a cross-sectional view of an example of an LED 900
having various components that can be created by the system 505 for
performing micropatterning. The LED may be a micro-LED, which may
have an active light-emitting area 906 with a linear dimension that
is less than 50 µm, less than 20 µm, or less than 10 µm.
For example, the linear dimension may be as small as 2 µm or 4
µm. Their small size enables a display system to have a single
pixel including three micro-LEDs: a red micro-LED, a green
micro-LED, and a blue micro-LED. Their small size also enables
micro-LEDs to be lightweight, making them particularly suitable for
use in wearable display systems, such as watches and computing
glasses.
[0090] The LED 900 includes, among other components, a
semiconductor structure. The semiconductor structure includes
semiconductor layers 902 and 904 and a light-emitting layer 906
that sits between the semiconductor layers 902 and 904. For
example, the LED 900 may include a semiconductor structure in which
the light-emitting layer 906 is a layer of indium gallium nitride
that is sandwiched between a layer of p-type gallium nitride and a
layer of n-type gallium nitride. In some embodiments, semiconductor
layer 902 is a p-type semiconductor, and semiconductor layer 904 is
an n-type semiconductor. In some embodiments, semiconductor layer
902 is an n-type semiconductor, and semiconductor layer 904 is a
p-type semiconductor.
[0091] The semiconductor layers 902 and 904 are operatively coupled
to electrical contacts 908 and 910, respectively. The electrical
contacts 908 and 910 are typically made of a conductive material,
such as a metallic material. In the example of FIG. 9, the
electrical contacts 908 and 910 are both located on a top surface
of the semiconductor structure such that they can both support the
LED 900 when it is mounted on a substrate including a control
circuit. However, in some embodiments, electrical contacts can be
located on opposite surfaces of a semiconductor structure.
[0092] The light-emitting layer 906 includes one or more quantum
wells that output light 916 when a voltage is applied across the
electrical contacts 908 and 910. To directionalize the output of
light 916, the semiconductor structure may be formed into any of a
variety of shapes (e.g., a paraboloid, a cylinder, or a cone) that
enable collimation/quasi-collimation of light 916. Such shapes are
referred to herein as "mesa" shapes; and collimation and
quasi-collimation are collectively referred to herein as
"collimation". Collimation results in increased brightness of light
output.
[0093] In the example of FIG. 9, mesa 914 corresponds to a
paraboloid shape that guides light 916 through a
light-emitting surface 912 of the semiconductor structure. More
specifically, the light-emitting layer 906 is approximately
positioned at the focal point of the paraboloid such that some of
the emitted light is reflected, within a critical angle of total
internal reflection, off the inner walls of the paraboloid toward
the light-emitting surface 912.
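The collimating behavior follows from standard parabola geometry (stated here for reference; the symbols r, z, and f are not used in the disclosure):

    % Paraboloid of revolution with focal length f (LaTeX notation):
    z = \frac{r^{2}}{4f}, \qquad \text{focus at } (r, z) = (0, f)
    % A ray emitted from the focus and reflected off the parabolic wall exits
    % parallel to the z axis, which is why placing the light-emitting layer
    % near the focal point collimates the output.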
[0094] In some embodiments, a mesa shape also has a truncated top
that can accommodate an electrical contact. In the example of FIG.
9, mesa 914 corresponds to a paraboloid shape having a truncated
vertex that accommodates electrical contact 908. Base 918 refers to
the part of the semiconductor structure that is not included in the
mesa 914.
[0095] To enable further collimation of light 916, an optical
element 920 can be formed on the light-emitting surface 912. In the
example of FIG. 9, the optical element 920 is a microlens. As will
be described in greater detail below, the optical element 920 can
be formed from an elastomeric material or a photoresist.
[0096] Although only one LED 900 is shown in FIG. 9, a plurality of
LEDs may be formed simultaneously by the patterning methods
disclosed herein. As discussed above with respect to FIG. 6, a
plurality of mesas 914 may be formed by using system 505.
Similarly, a plurality of optical elements 920 may be formed
simultaneously by using system 505. For example, the lenslet array
550 and/or an array of LEDs 900 may be moved such that the focal
point 560 of each lenslet 555 is formed on an elastomeric material
or a photoresist that is formed on the light-emitting surface 912
of the array of LEDs 900. The optical elements 920 may be formed to
have any suitable shape, such as a spherical lens, a Fresnel lens,
a spherical lens having a partial opening, or a donut-shaped lens.
The center axis of each optical element 920 may be formed to be
aligned with the center axis of the mesa 914.
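One illustrative way to describe the shape of such a microlens is as a target height profile that the scanned focal points trace in the photoresist. The spherical-cap geometry, dimensions, and function below are assumptions for the sketch, not the disclosed method:

    import math

    def spherical_cap_height(r_um, cap_radius_um, cap_height_um):
        """Height of a spherical microlens cap at radial distance r from its
        center axis. Returns 0 outside the lens footprint."""
        if r_um >= cap_radius_um:
            return 0.0
        # Sphere radius R from cap radius a and cap height h: R = (a^2 + h^2) / (2h)
        R = (cap_radius_um ** 2 + cap_height_um ** 2) / (2.0 * cap_height_um)
        return math.sqrt(R ** 2 - r_um ** 2) - (R - cap_height_um)

    # Illustrative 10 um radius, 3 um tall lens centered on a mesa axis.
    for r in (0.0, 5.0, 9.9):
        print(round(spherical_cap_height(r, 10.0, 3.0), 3))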
[0097] The methods, systems, and devices discussed above are
examples. Various embodiments may omit, substitute, or add various
procedures or components as appropriate. For instance, in
alternative configurations, the methods described may be performed
in an order different from that described, and/or various stages
may be added, omitted, and/or combined.
[0098] Also, features described with respect to certain embodiments
may be combined in various other embodiments. Different aspects and
elements of the embodiments may be combined in a similar manner.
Also, technology evolves and, thus, many of the elements are
examples that do not limit the scope of the disclosure to those
specific examples.
[0099] Specific details are given in the description to provide a
thorough understanding of the embodiments. However, embodiments may
be practiced without these specific details. For example,
well-known circuits, processes, systems, structures, and techniques
have been shown without unnecessary detail in order to avoid
obscuring the embodiments. This description provides example
embodiments only, and is not intended to limit the scope,
applicability, or configuration of the invention. Rather, the
preceding description of the embodiments will provide those skilled
in the art with an enabling description for implementing various
embodiments. Various changes may be made in the function and
arrangement of elements without departing from the spirit and scope
of the present disclosure.
[0100] Also, some embodiments were described as processes depicted
as flow diagrams or block diagrams. Although each may describe the
operations as a sequential process, many of the operations may be
performed in parallel or concurrently. In addition, the order of
the operations may be rearranged. A process may have additional
steps not included in the figure. Furthermore, embodiments of the
methods may be implemented by hardware, software, firmware,
middleware, microcode, hardware description languages, or any
combination thereof. When implemented in software, firmware,
middleware, or microcode, the program code or code segments to
perform the associated tasks may be stored in a computer-readable
medium such as a storage medium. Processors may perform the
associated tasks.
[0101] It will be apparent to those skilled in the art that
substantial variations may be made in accordance with specific
requirements. For example, customized or special-purpose hardware
might also be used, and/or particular elements might be implemented
in hardware, software (including portable software, such as
applets, etc.), or both. Further, connection to other computing
devices such as network input/output devices may be employed.
[0102] With reference to the appended figures, components that can
include memory can include non-transitory machine-readable media.
The term "machine-readable medium" and "computer-readable medium,"
as used herein, refer to any storage medium that participates in
providing data that causes a machine to operate in a specific
fashion. In embodiments provided hereinabove, various
machine-readable media might be involved in providing
instructions/code to processing units and/or other device(s) for
execution. Additionally or alternatively, the machine-readable
media might be used to store and/or carry such instructions/code.
In many implementations, a computer-readable medium is a physical
and/or tangible storage medium.
[0103] Such a medium may take many forms, including, but not
limited to, non-volatile media, volatile media, and transmission
media. Common forms of computer-readable media include, for
example, magnetic and/or optical media such as compact disk (CD) or
digital versatile disk (DVD), punch cards, paper tape, any other
physical medium with patterns of holes, a RAM, a programmable
read-only memory (PROM), an erasable programmable read-only memory
(EPROM), a FLASH-EPROM, any other memory chip or cartridge, a
carrier wave as described hereinafter, or any other medium from
which a computer can read instructions and/or code. A computer
program product may include code and/or machine-executable
instructions that may represent a procedure, a function, a
subprogram, a program, a routine, an application (App), a
subroutine, a module, a software package, a class, or any
combination of instructions, data structures, or program
statements.
[0104] Those of skill in the art will appreciate that information
and signals used to communicate the messages described herein may
be represented using any of a variety of different technologies and
techniques. For example, data, instructions, commands, information,
signals, bits, symbols, and chips that may be referenced throughout
the above description may be represented by voltages, currents,
electromagnetic waves, magnetic fields or particles, optical fields
or particles, or any combination thereof.
[0105] Terms, "and" and "or" as used herein, may include a variety
of meanings that are also expected to depend at least in part upon
the context in which such terms are used. Typically, "or" if used
to associate a list, such as A, B, or C, is intended to mean A, B,
and C, here used in the inclusive sense, as well as A, B, or C,
here used in the exclusive sense. In addition, the term "one or
more" as used herein may be used to describe any feature,
structure, or characteristic in the singular or may be used to
describe some combination of features, structures, or
characteristics. However, it should be noted that this is merely an
illustrative example and claimed subject matter is not limited to
this example. Furthermore, the term "at least one of" if used to
associate a list, such as A, B, or C, can be interpreted to mean
any combination of A, B, and/or C, such as A, AB, AC, BC, AA, ABC,
AAB, AABBCCC, etc.
[0106] Further, while certain embodiments have been described using
a particular combination of hardware and software, it should be
recognized that other combinations of hardware and software are
also possible. Certain embodiments may be implemented only in
hardware, or only in software, or using combinations thereof. In
one example, software may be implemented with a computer program
product containing computer program code or instructions executable
by one or more processors for performing any or all of the steps,
operations, or processes described in this disclosure, where the
computer program may be stored on a non-transitory computer
readable medium. The various processes described herein can be
implemented on the same processor or different processors in any
combination.
[0107] Where devices, systems, components or modules are described
as being configured to perform certain operations or functions,
such configuration can be accomplished, for example, by designing
electronic circuits to perform the operation, by programming
programmable electronic circuits (such as microprocessors) to
perform the operation such as by executing computer instructions or
code, or processors or cores programmed to execute code or
instructions stored on a non-transitory memory medium, or any
combination thereof. Processes can communicate using a variety of
techniques, including, but not limited to, conventional techniques
for inter-process communications, and different pairs of processes
may use different techniques, or the same pair of processes may use
different techniques at different times.
[0108] The specification and drawings are, accordingly, to be
regarded in an illustrative rather than a restrictive sense. It
will, however, be evident that additions, subtractions, deletions,
and other modifications and changes may be made thereunto without
departing from the broader spirit and scope as set forth in the
claims. Thus, although specific embodiments have been described,
these are not intended to be limiting. Various modifications and
equivalents are within the scope of the following claims.
* * * * *