U.S. patent application number 14/447490, filed July 30, 2014, was published by the patent office on 2016-02-04 as publication number 20160029716 for a customized face mask.
The applicant listed for this patent is Elwha LLC. The invention is credited to William D. Duncan, Roderick A. Hyde, Jordin T. Kare, Tony S. Pan, Yaroslav A. Urzhumov, and Lowell L. Wood, Jr.
United States Patent Application 20160029716
Kind Code: A1
Duncan; William D.; et al.
February 4, 2016
CUSTOMIZED FACE MASK
Abstract
A face mask includes a covering member configured to cover
facial features of a face of a user, the covering member including
an air-permeable filter member; and a facial image provided on an
outward facing surface of the covering member, the facial image
being generated based on an image of the user and representing
covered facial features of the user. The face mask further includes
a fastening member configured to secure the covering member over
the face of the user.
Inventors: Duncan; William D.; (Mill Creek, WA); Hyde; Roderick A.; (Redmond, WA); Kare; Jordin T.; (Seattle, WA); Pan; Tony S.; (Bellevue, WA); Urzhumov; Yaroslav A.; (Bellevue, WA); Wood, JR.; Lowell L.; (Bellevue, WA)
Applicant: Elwha LLC, Bellevue, WA, US
Family ID: 55178670
Appl. No.: 14/447490
Filed: July 30, 2014
Current U.S. Class: 128/863
Current CPC Class: A41D 13/11 20130101
International Class: A41D 13/11 20060101 A41D013/11
Claims
1. A face mask, comprising: a covering member configured to cover
facial features of a face of a user, the covering member including:
an air-permeable filter member; and a facial image provided on an
outward facing surface of the covering member, the facial image
being generated based on an image of the user and representing
covered facial features of the user; and a fastening member
configured to secure the covering member over the face of the
user.
2. The mask of claim 1, wherein a color of the fastening member is
based on a facial feature of the user.
3-6. (canceled)
7. The mask of claim 1, wherein the facial image includes a first
image portion and a second image portion, the second image portion
providing a transitional image between the first image portion and
the face of a user.
8. The mask of claim 7, wherein the second image portion provides a
substantially homogenous visual transition between adjacent regions
of the first image portion and adjacent uncovered portions of the
face of the user.
9. The mask of claim 7, wherein the first image portion includes a
modified facial feature of a user.
10. (canceled)
11. The mask of claim 7, wherein a border region of the second
image portion provides a substantially homogeneous visual
transition between an adjacent interior region of the second image
portion and adjacent uncovered portions of the face of the
user.
12. (canceled)
13. The mask of claim 1, further comprising a removable insert
including the facial image.
14-16. (canceled)
17. The mask of claim 1, further comprising a display layer
configured to provide the facial image and enable dynamic changing
of the facial image.
18-19. (canceled)
20. The mask of claim 17, wherein the display layer is configured
to change the facial image based on movement of the user's
face.
21-23. (canceled)
24. The mask of claim 17, wherein the display layer is configured
to be changeable based on an input from the user.
25. (canceled)
26. The mask of claim 1, further comprising an armature movable to
change the outward physical contour of the covering member.
27. The mask of claim 26, further comprising a control unit coupled
to the armature and configured to control operation of the
armature.
28. The mask of claim 27, wherein the control unit includes a
ranging device configured to detect a person proximate the user,
and wherein the control unit is configured to control operation of
the armature based on detection of the person.
29-41. (canceled)
42. A face mask, comprising: a covering member configured to cover
facial features of a face of a user, the covering member including:
a filter layer; and a display layer configured to provide a
changeable image; and a fastening member configured to secure the
covering member over the face of the user.
43-44. (canceled)
45. The mask of claim 42, wherein the display layer is configured
to provide an image representing covered facial features of the
user.
46. The mask of claim 42, wherein the display layer is configured
to provide a first image portion and a second image portion, the
second image portion providing a transitional image between the
first image portion and the face of a user.
47. (canceled)
48. The mask of claim 46, wherein the first image portion includes
a modified facial feature of a user.
49-58. (canceled)
59. The mask of claim 42, wherein the display layer is configured
to change the facial image based on movement of the user's
face.
60-65. (canceled)
66. The mask of claim 42, wherein the covering member is movable
between a folded state and an unfolded state.
67. The mask of claim 66, wherein the facial image is configured to
provide a continuous image in one or both of the folded and
unfolded states.
68. The mask of claim 66, further comprising an actuator configured
to move the covering member between the folded state and the
unfolded state.
69. The mask of claim 68, further comprising a control unit
configured to control operation of the actuator.
70. The mask of claim 68, wherein the actuator is at least
partially powered by breathing of the user.
71. (canceled)
72. The mask of claim 69, wherein the control unit is configured to
control operation of the actuator based on detection of a person
proximate the user.
73-78. (canceled)
79. A clothing item, comprising: a covering member configured to
cover a portion of a user's head, at least a portion of the
covering member being an air-permeable material; and a facial image
provided on an outward-facing surface of the covering member, the
facial image being based on an image of the user and representing
covered facial features of the user.
80. (canceled)
81. The clothing item of claim 79, wherein the facial image
includes a first image portion and a second image portion, the
second image portion providing a transitional image between the
first image portion and remaining portions of the covering
member.
82. The clothing item of claim 81, wherein the first image portion
includes a modified facial feature of a user.
83-84. (canceled)
85. The clothing item of claim 79, further comprising a removable
insert including the facial image.
86. The clothing item of claim 85, wherein the covering member
includes a receptacle configured to receive the removable
insert.
87-88. (canceled)
89. The clothing item of claim 79, further comprising a display
layer configured to provide the facial image and enable dynamic
changing of the facial image.
90. The clothing item of claim 89, wherein the display layer
includes e-ink.
91. The clothing item of claim 89, wherein the display layer
includes an OLED.
92. The clothing item of claim 89, wherein the display layer is
configured to change the facial image based on movement of the
user's face.
93-100. (canceled)
101. The clothing item of claim 79, wherein the covering member is
movable between a folded state and an unfolded state.
102. The clothing item of claim 101, wherein the facial image is
configured to provide a continuous image in one or both of the
folded and unfolded states.
103-182. (canceled)
Description
BACKGROUND
[0001] Face masks such as surgical masks (sometimes referred to as
hygiene masks, procedure masks, etc.) are often worn by users to,
for example, protect the user's mouth and nose from undesirable
airborne particles such as bacteria, airborne diseases, and the
like. Typically, a face mask covers the user's mouth and nose and
is held in place by a strap, band, or a similar fastening
device.
SUMMARY
[0002] One embodiment relates to a face mask, including a covering
member configured to cover facial features of a face of a user, the
covering member including an air-permeable filter member; and a
facial image provided on an outward facing surface of the covering
member, the facial image being generated based on an image of the
user and representing covered facial features of the user; and a
fastening member configured to secure the covering member over the
face of the user.
[0003] Another embodiment relates to a face mask, including a
covering member configured to cover facial features of a face of a
user, the covering member including a filter layer; and a display
layer configured to provide a changeable image; and a fastening
member configured to secure the covering member over the face of
the user.
[0004] Another embodiment relates to a clothing item, including a
covering member configured to cover a portion of a user's head, at
least a portion of the covering member being an air-permeable
material; and a facial image provided on an outward-facing surface
of the covering member, the facial image being based on an image of
the user and representing covered facial features of the user.
[0005] Another embodiment relates to a method of producing a
customized face mask, including receiving a user image of a face of
a user; determining a covering area based on the image, the
covering area including a portion of the user image; and printing a
covering image including a representation of the covering area onto
a display layer of an air-permeable face mask.
[0006] Another embodiment relates to a method of producing a
customized face mask, including acquiring a user image of a face of
a user using an image capture device; determining a covering area
corresponding to a portion of the image of the face of the user;
and providing a covering image representing the portion of the
image using a display layer of an air-permeable face mask such that
when the covering member is worn by a user, the covering image
provides a visual representation of at least a portion of the
underlying portions of the face of the user.
[0007] The foregoing summary is illustrative only and is not
intended to be in any way limiting. In addition to the illustrative
aspects, embodiments, and features described above, further
aspects, embodiments, and features will become apparent by
reference to the drawings and the following detailed
description.
BRIEF DESCRIPTION OF THE DRAWINGS
[0008] FIG. 1 is a perspective view of a customized face mask worn
by a user according to one embodiment.
[0009] FIG. 2 is a perspective view of the face mask of FIG. 1
according to one embodiment.
[0010] FIG. 3 is a perspective view of the face mask of FIG. 1
according to another embodiment.
[0011] FIG. 4 is a partial cross-sectional view of a face mask
according to one embodiment.
[0012] FIG. 5 is a side view of a face mask in both a collapsed and
an expanded configuration according to one embodiment.
[0013] FIG. 6 is an illustration of mapping a two-dimensional image
onto a face mask according to one embodiment.
[0014] FIG. 7 is a perspective view of the face mask of FIG. 1
according to another embodiment.
[0015] FIG. 8 is a perspective view of a garment having a
customized image according to one embodiment.
[0016] FIG. 9 is a schematic representation of a system for
producing a customized face mask according to one embodiment.
[0017] FIG. 10 is a block diagram of a method of producing a
customized face mask according to one embodiment.
[0018] FIG. 11 is a block diagram of a method of using a customized
face mask according to one embodiment.
DETAILED DESCRIPTION
[0019] In the following detailed description, reference is made to
the accompanying drawings, which form a part hereof. In the
drawings, similar symbols typically identify similar components,
unless context dictates otherwise. The illustrative embodiments
described in the detailed description, drawings, and claims are not
meant to be limiting. Other embodiments may be utilized, and other
changes may be made, without departing from the spirit or scope of
the subject matter presented here.
[0020] Referring to the figures generally, various embodiments
disclosed herein relate to face masks, and more specifically, face
masks that are customizable to provide unique imagery for users. A
mask, such as a surgical mask or similar mask, may provide
filtering features to prevent spread of infection, etc. from
airborne particles. One or more portions of the mask may include an
image provided on an outward facing surface such that others can
view the image. The image may be a representation of underlying
facial features of the user, modified facial features of the user,
and the like. Furthermore, the mask may include portions that are
opaque, translucent, or substantially transparent, or that can
dynamically change between any or all of these states.
[0021] Referring now to FIGS. 1-2, mask 14 is shown according to
one embodiment. As shown in FIG. 1, mask 14 covers a portion of
face 12 of user 10. Mask 14 may be similar in many respects to a
typical surgical mask and be used to retard passage (e.g., capture,
reflect, filter, etc.) of particles leaving the user's nose, mouth,
etc., or similarly, to retard passage of particles prior to
entering the user's nose, mouth, etc. Mask 14 may come in a variety
of sizes, and be used in a variety of environments (e.g., during
medical procedures, for personal use at home, work, on the street,
etc., and the like).
[0022] In one embodiment, mask 14 includes covering member 16 and
one or more fastening members 18. Covering image 20 is provided on
an outward-facing surface of covering member 16. Covering member 16
is in one embodiment made at least partially from an air-permeable
material (e.g., a filter material, a paper or other non-woven or
woven material, etc.) to enable a user to breathe through covering
member 16. Fastening members 18 are configured to secure mask 14 to
the user (e.g., around the ears, around the neck, around the head,
etc.) to maintain mask 14 in a desired position.
[0023] Covering member 16 is in one embodiment a generally
rectangular piece of material intended to cover all or a portion of
a user's face. In one embodiment, covering member 16 is configured
to cover a mouth and a nose of a user, and conform to the contour
of the user's face in order to minimize the amount of air
travelling to/from the user's nose and mouth without passing
through covering member 16. In one embodiment, edge portions 22
provide a perimeter edge for covering member 16 (e.g., to provide a
cleaner appearance to the edges of mask 14). Edge portions 22 may
be adhered, stitched, sewn, or otherwise secured to covering member
16. In some embodiments covering image 20 extends up to and
throughout edge portion 22 to the outer edge of covering member
16.
[0024] Fastening members 18 are in one embodiment elastic members
configured to resiliently retain mask 14 on a face of a user. Any
suitable straps, bands, strings, etc. may be used. As shown in FIG.
2, fastening members 18 are provided in a looped configuration.
Alternatively, fastening members 18 may be provided as free-ended
members that may be tied together or otherwise secured. In some
embodiments, only a single fastening member 18 is utilized (e.g., a
single strap, band, etc.). In alternative embodiments, fastening
members 18 are omitted, and mask 14 is secured using adhesives or
other means. In some embodiments, the outermost surface of
fastening members 18 is colored based on one or more facial
features of the user. This coloring may be employed to match
underlying features of the user's face, and may be based on skin
tone, on skin color, on facial hair color, or the like. The
coloring of fastening member 18 may be spatially variable (e.g.,
along a length of fastening member 18) to match spatial variations
in the coloration due to the user's skin or hair.
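The spatially variable strap coloring described above can be sketched as a simple sampling step: read the user image's color at each point the strap will overlie. This is a hypothetical illustration; the names, the image representation (a 2D grid of RGB tuples), and the path representation are assumptions, not taken from the application.

```python
# Hypothetical sketch: sampling a user image along the strap's path so a
# printed fastening member 18 can match local skin/hair coloration.
# image: 2D list of RGB tuples; path: list of (row, col) sample points.

def strap_colors(image, path):
    """Return the image color at each point along the strap path."""
    return [image[r][c] for r, c in path]

# Tiny 2x2 "image": top row a skin tone, bottom row a hair color.
image = [
    [(224, 182, 160), (224, 182, 160)],
    [(60, 42, 30), (60, 42, 30)],
]
path = [(0, 0), (1, 1)]  # strap crosses from cheek into hairline
colors = strap_colors(image, path)
```

Printing or dyeing the strap segment-by-segment from `colors` would then reproduce the spatial variation along its length.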
[0025] Covering image 20 is in one embodiment an image provided on
an outward facing surface of covering member 16. Covering image 20
includes depictions of various facial features, such as a nose, a
mouth, skin features or imperfections, and the like. Covering image
20 may be configured to provide a substantially homogeneous
transition between covering image 20 and adjacent uncovered
portions of a user's face.
[0026] In one embodiment, covering image 20 is generated based on
images (e.g., user images) of a user's face, such as digital
photographs, scanned images, and the like. Based on the user
images, various facial features can then be printed onto mask 14 as
part of covering image 20. In one embodiment, covering image 20 is
intended to substantially replicate the appearance of the user's
face (e.g., to replicate facial features underlying mask 14). In
other embodiments, one or more portions of covering image 20 may be
a modified version of the user's face, or further yet, include
customized and/or changeable features.
[0027] Referring to FIG. 3, according to one embodiment, covering
image 20 includes first, or inner, image 24, and second, or outer,
image 26. First and second images 24, 26 are in one embodiment used
to provide a customized image (e.g., as part of first image 24)
that transitions (e.g., via second image 26) to uncovered portions
of a user's face. For example, first image 24 may provide a
customized mouth, nose, or other facial features, and a border
region of second image 26 may be configured to provide a smooth
visual transition between adjacent regions of first image 24 and
interior regions of second image 26, and/or similarly between
adjacent interior regions of second image 26 and surrounding
adjacent uncovered facial features of the user. In some embodiments
the border region(s) of second image 26 extend throughout second
image 26 (i.e., there is no separate interior region). In one
embodiment an inner border region (i.e., adjacent to first image
24) of second image 26 transitions without a separate interior
region to an outer border region (i.e., adjacent to the uncovered
face of the user) of second image 26. First image 24 may be
customized to remove skin imperfections (e.g., scars, blemishes,
etc.), modify structural facial features (e.g., to provide a
larger/smaller nose, thicker/thinner lips, different teeth, etc.),
or provide other customized facial features.
[0028] According to one embodiment, one or both of first image 24
and second image 26 are provided on an air-permeable material, such
as a filter material or similar material. One or both of first
image 24 and second image 26 may include portions that are
transparent, translucent, or opaque. For example, a user may desire
to have modified facial features, such as a different mouth or
nose, provided as part of covering image 20, yet have the image be
translucent such that others can see facial movements such as lip
movements while the user speaks, as this often helps others
interpret what is being said by the user. As such, first or second
images 24, 26 may both provide images visible to others and enable
others to see or partially see the underlying facial features.
[0029] Referring now to FIG. 4, in some embodiments covering member
16 includes multiple layers such as layers 28, 30, and 32. In one
embodiment, one or more layers (e.g., 28, 32) are provided as a
filter or other material, and a separate layer (e.g., layer 30) is
provided as a display layer. The display layer may provide an
electronically generated image that can be dynamically modified
over time. For example, the display layer may provide a
representation of the covered portion of the user's face (e.g.,
based on an image of the user), a representation of modified facial
features, such as a modified nose, mouth, etc., movement of facial
features, and the like. In one embodiment, the display layer is
configured to depict a moving mouth based on the user speaking.
[0030] Layers 28, 30, and 32 can be made of any suitable material
and arranged in any suitable manner. For example, in one
embodiment, layer 30 is a display layer captured between layers 28,
32. In such a configuration, one or both of layers 28, 32 can be a
filter member configured to filter particles during use of mask 14.
Any of layers 28, 30, 32 may be opaque, transparent, or translucent
to provide a desired appearance to others. For example, layer 32 in
one embodiment is the outer-most layer of mask 14, and is provided
as a transparent layer such that images provided by an underlying
display layer, such as layer 30, are visible to others.
[0031] Referring further to FIG. 4, in one embodiment mask 14
includes control unit 34 (e.g., a controller, processing unit,
etc.). In one embodiment, control unit 34 includes power source 36
(e.g., a battery) and one or more control modules 38. In one
embodiment, power source 36 is a removable or rechargeable battery.
In other embodiments, power source 36 includes a transducer
configured to convert flexure of mask 14 (e.g., due to a user
breathing) to usable electrical energy. Control modules 38 may
include a processor and memory. The processor may be implemented as
a general-purpose processor, an application specific integrated
circuit (ASIC), one or more field programmable gate arrays (FPGAs),
a digital signal processor (DSP), a group of processing components,
or other suitable electronic processing components. The memory may
be one or more devices (e.g., RAM, ROM, Flash Memory, hard disk
storage, etc.) for storing data and/or computer code for
facilitating the various processes described herein, and may
include non-transient volatile memory or non-volatile memory. The
memory may include database components, object code components,
script components, or any other type of information structure for
supporting the various activities and information structures
described herein, and may be communicably connected to the
processor and provide computer code or instructions to the
processor for executing the processes described herein.
[0032] In one embodiment, control unit 34 is configured to control
a display layer of mask 14. For example, in one embodiment, layer
30 shown in FIG. 4 is a display layer and control unit 34 is
configured to control the operation of layer 30 to provide various
images. Layer 30 provides an image on an outward-facing surface of
covering member 16 such that the image can be seen by others. The
display layer may be or include any suitable material or
components, including e-paper, e-ink, a flexible display member, a
flexible OLED, etc.
[0033] In one embodiment, control unit 34 stores image data
regarding images of a user, and controls the display layer to
display images based on the image data. For example, the display
layer may display images intended to replicate or modify facial
features of the user, or alternatively, provide facial movements by
dynamically modifying the displayed image over time. In one
embodiment, control unit 34 is configured to operate based on user
inputs to provide one or more desired display images. For example,
control unit 34 may store multiple different images for display on
the display layer, and a user may specify which images are to be
displayed based on any of a number of factors (e.g., user
selection, time of day, the location of the user, etc.).
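The image-selection behavior in paragraph [0033] can be sketched as a small rule chain, assuming a simple precedence (explicit user choice first, then time of day, then location). All rule names, keys, and thresholds here are illustrative assumptions, not the application's logic.

```python
# Hypothetical sketch of control unit 34 choosing among stored images
# based on user selection, time of day, or location (paragraph [0033]).

def select_image(stored, user_choice=None, hour=None, location=None):
    """Pick an image key: explicit user choice wins, then time/location rules."""
    if user_choice in stored:
        return user_choice
    if hour is not None and (hour < 7 or hour >= 22):
        return "neutral"          # assumed late-night default
    if location == "work":
        return "professional"     # assumed location-based rule
    return "default"

# Stored image library keyed by name; values stand in for image data.
stored = {"default": ..., "neutral": ..., "professional": ..., "smile": ...}
```

A call such as `select_image(stored, user_choice="smile")` models the user-input case, while calls with only `hour` or `location` model the automatic factors the paragraph lists.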
[0034] In one embodiment, control unit 34 controls the display
layer to provide different image portions, such as first image
portion 24 and second image portion 26 shown in FIG. 3. As such, a
first image portion may provide a generally static display and a
second image portion may provide a dynamically changing display.
For example, inner image portion 24 may provide a changing display
portion (e.g., to display lip movement, nose movement, etc.), and
outer image portion 26 may provide a transitional image such that
there is a substantially homogeneous visual transition between
inner image portion 24, outer image portion 26, and adjacent facial
features of the user (e.g., avoiding abrupt changes in the visual
image, color, etc.).
[0035] According to various alternative embodiments, control unit
34 is configured to provide customized images that transition into
the surrounding facial features of the user. For example, a user
may wish to have an image of an animal face (e.g., a tiger, a bear,
etc.) displayed. Control unit 34 may store such custom images in
memory, and control operation of the display layer to provide the
appropriate custom imagery. In one embodiment, control unit 34 is
configured to determine a transitional image, such as second image
portion 26, configured to transition the custom image into
surrounding facial features of the user (e.g., by providing gradual
changes in color, facial features, etc.). Any custom image may be
used according to various alternative embodiments, and the custom
images may be provided in the form of image data (e.g., digital
photographs, electronic scans, etc.) from a user.
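One way to realize the gradual transition that second image portion 26 provides is a per-channel linear blend from the custom image color to the adjacent skin color across the border region. The following is a minimal sketch under that assumption; the linear ramp and the sample colors are illustrative, not the application's method.

```python
# Hypothetical sketch: blending a custom image color toward the adjacent
# skin color as a function of position across the border region.

def blend(custom, skin, t):
    """Blend custom -> skin; t=0 at the inner edge, t=1 at the outer edge."""
    return tuple(round(c * (1 - t) + s * t) for c, s in zip(custom, skin))

custom = (255, 128, 0)    # e.g., an orange from a tiger-face custom image
skin = (224, 182, 160)    # assumed adjacent skin tone

# Sample the ramp at the inner edge, midpoint, and outer edge.
ramp = [blend(custom, skin, t) for t in (0.0, 0.5, 1.0)]
```

Applying `blend` per pixel, with `t` derived from distance to the mask edge, yields a border that starts at the custom imagery and ends at the surrounding facial coloration without an abrupt seam.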
[0036] Referring now to FIG. 5, according to one embodiment, face
mask 14 is configured to provide images in one or both of a
collapsed configuration 39 (e.g., as viewed along arrow 46) and an
expanded configuration 41 (e.g., as viewed along arrow 48). For
example, face mask 14 may include one or more pleats or folds 40
that enable face mask 14 to move between the collapsed and expanded
configurations (e.g., to accommodate different-sized faces, to
provide more or less coverage over a user face, to accommodate
mouth movements, etc.). As shown in FIG. 5, in collapsed
configuration 39, a single image may be provided collectively by
image portions 42, which in FIG. 5 extend along height 45. In
expanded configuration 41, a single image is provided by portion
44, which extends along height 47. As such, face mask 14 (e.g., a
covering member or display layer) is in one embodiment configured
to provide images representing facial features in one or both of
the collapsed and expanded states 39, 41. In some embodiments
(e.g., as shown in FIG. 5) the folds in collapsed configuration 39
result in obscuration of part of the folded material by another
part; in other embodiments the folds in collapsed configuration 39
are less severe, and result in less (or no) obscuration as viewed
along arrow 46.
[0037] In some embodiments, covering member 16 includes actuation
device 43 coupled to control unit 34. Actuation device 43 is a
movable member (e.g., a flexible member, etc.) configured to
provide movement of covering member 16 between collapsed
configuration 39 and expanded configuration 41 by folding/unfolding
folds 40. Control unit 34 is configured to control operation of
actuation device 43 to enable control of the amount of expansion of
face mask 14. For example, in some embodiments, covering member 16
provides a first image in collapsed configuration 39 that generally
corresponds to the user's face, while in expanded configuration 41
covering member 16 provides a second image that is modified (e.g.,
elongated, distorted, etc.) relative to the first image. To
minimize distortion while near other people, control unit 34 may be
configured to collapse covering member 16 based on detecting nearby
people. In some embodiments, control unit 34 is or includes a
short-wave range finder configured to detect the presence of nearby
people. In other embodiments, control unit 34 is configured to
operate device 43 (and therefore control the degree of expansion of
covering member 16) based on other factors (e.g., time, location,
user inputs, etc.).
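The proximity-driven behavior in paragraph [0037] reduces to a small decision rule: collapse the covering member when a person is detected within some range, otherwise allow expansion. The sketch below assumes a range reading in meters and an arbitrary 2 m threshold; both are illustrative, not values from the application.

```python
# Hypothetical sketch: control unit 34 deciding the state of actuation
# device 43 from a ranging-device reading (paragraph [0037]).

COLLAPSE_RANGE_M = 2.0  # assumed proximity threshold

def desired_state(range_reading_m):
    """Return 'collapsed' when someone is nearby, else 'expanded'.

    A reading of None models 'no person detected'.
    """
    if range_reading_m is not None and range_reading_m <= COLLAPSE_RANGE_M:
        return "collapsed"
    return "expanded"
```

The same function could be extended with the other factors the paragraph mentions (time, location, user inputs) as additional arguments feeding the decision.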
[0038] Referring now to FIG. 6, in one embodiment, mask 14 is
configured to provide covering image 20 such that when mask 14
conforms to the contours of a user's face, mask 14 provides a
realistic representation of underlying facial features. As such,
covering image 20 may be provided on a three-dimensional surface
and generated based on mapping two-dimensional image 50 onto mask
14. In some embodiments, image 50 is mapped onto mask 14 so as to
provide a realistic portrayal of underlying facial features from a
forward-facing direction. The mapping may be accomplished based on
a curvature or contour of mask 14, a known curvature or contour of
a user's face, an estimated curvature of a user's face, etc.
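The two-dimensional-to-three-dimensional mapping can be sketched with a simple geometric model: treat the mask's horizontal cross-section as a circular arc of radius R, project each surface point forward onto the viewing plane, and use the projected coordinate to index the flat image. This arc model and the normalization are assumptions for illustration, not the application's mapping.

```python
# Hypothetical sketch of mapping flat image 50 onto a curved mask surface
# so it reads correctly from a forward-facing direction (paragraph [0038]).
import math

def sample_column(theta, R, width):
    """Map surface angle theta (radians) to a flat-image column index.

    theta = 0 is the front of the mask; +/- pi/2 are the sides.
    """
    x = R * math.sin(theta)        # forward projection of the surface point
    u = (x / R + 1.0) / 2.0        # normalize [-R, R] -> [0, 1]
    return min(width - 1, int(u * width))
```

Printing column `sample_column(theta, R, width)` of the flat image at surface angle `theta` compresses the image toward the sides of the mask, so the printed features appear undistorted when viewed from the front.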
[0039] Referring to FIG. 7, according to various embodiments, mask
14 includes one or more features configured to enhance the visual
impact and/or the effectiveness of mask 14. For example, covering
member 16 may include contoured edges 56. Edges 56 may be
non-straight edges configured to better conform to a user's face
than, for example, straight edges. In some embodiments, edges 56
are contoured based on user image data of a particular user, such
that the contours of edges 56 are customized based on the facial
structure of a particular user. In other embodiments, edges 56 are
contoured based on inputs received from a user, for example, to
provide extended areas of coverage over one or more facial
features, etc.
[0040] In one embodiment, mask 14 further includes one or more
armatures 58, 60 (e.g., structural members or inserts, etc.)
configured to provide structural support for mask 14 and enable
modification of the appearance of various facial features of a
user. As shown in FIG. 7, mask 14 includes armatures 58, 60 spaced
apart and oriented horizontally along a length of mask 14. In one
embodiment, armature 58 is positioned at or near a user's nose, and
armature 60 is positioned at or near a user's mouth. As such,
armatures 58, 60 may enable simulation of facial features such as a
nose, mouth, etc. According to various other embodiments, different
numbers or sizes of armatures may be used, and the armatures may be
placed in any desired locations on mask 14.
[0041] In one embodiment, one or both of armatures 58, 60 are
movable (e.g., in a similar manner to actuation device 43 shown in
FIG. 5) to dynamically change the shape of mask 14 (e.g., during
use). In one embodiment, armatures 58, 60 are operatively coupled
to a control unit such as control unit 34 such that movement of
armatures 58, 60 can be customized based on user inputs or other
factors (e.g., time of day, location, whether a user is speaking,
etc.). In further embodiments, control unit 34 controls operation
of armatures 58, 60 in combination with controlling a display layer
such as display layer 30 shown in FIG. 4 such that for each
different image displayed by display layer 30, armatures 58, 60 are
configured to be placed into a corresponding position by control
unit 34 (e.g., to simulate particular facial expressions,
customized facial features, etc.).
[0042] Referring further to FIG. 7, according to one embodiment,
mask 14 includes a pocket, or receptacle 61. Receptacle 61 is in
one embodiment formed of layers of covering member 16 (e.g., layers
28, 32 shown in FIG. 4) and is configured to receive a display
layer such as display layer 30 (see FIG. 4). As such, display layer
30 is in one embodiment a removable display layer that may be
re-used as remaining portions of mask 14 are discarded after use.
Receptacle 61 may be any suitable size, and may in some embodiments
be transparent, or have apertures therein to facilitate viewing of
images displayed on the display layer.
[0043] Referring to FIG. 8, garment 62 (e.g., a clothing item, a
head portion of a burka, etc.) is shown according to one
embodiment. As shown in FIG. 8, garment 62 includes covering member
64. Covering member 64 may share any of the features of covering
member 16, such as providing a display of underlying facial
features, etc. Covering member 64 is in one embodiment integrally
formed with or coupled to the remainder of garment 62.
Alternatively, covering member 64 may be attached using fasteners
(e.g., clips, pins, hook and loop fasteners, etc.), a pocket or
receptacle such as receptacle 66, and the like. As with mask 14,
garment 62 enables a user to provide images on a covering member
overlying one or more facial features of the user, and can include
any of the features of mask 14.
[0044] Referring now to FIG. 9, system 70 for generating a mask
such as mask 14 is shown according to one embodiment. As shown in
FIG. 9, system 70 includes image capture device 72, image processor
74, and printer 76. System 70 may alternatively include or be
configured to communicate with and receive inputs from/provide
outputs to one or more input/output devices 78 or control unit 34.
It should be understood that according to various alternative
embodiments, system 70 may include more or fewer components than
those illustrated in FIG. 9, and various components may be
separated or integrated relative to the illustrative configuration
shown in FIG. 9. Furthermore, the various components may
communicate with each other using any suitable wired or wireless
communications protocols. For example, while system 70 is shown in
FIG. 9 to include both image capture device 72 and printer 76, in
other embodiments, one or both of image capture device 72 and
printer 76 may be omitted. All such alternative embodiments are
within the scope of the present disclosure.
[0045] Image capture device 72 is configured to capture one or more
images of a user. In one embodiment, image capture device 72 is or
includes a digital camera (e.g., a dedicated camera, a cellular
phone or other device with camera capabilities, etc.).
Alternatively, image capture device 72 may be or include a video
recorder, a scanning device, or other suitable image capture
device. Image capture device 72 captures one or more images of a
user and provides user images/user image data to image processor
74.
[0046] Image processor 74 is configured to receive user
images/image data (e.g., from image capture device 72 or another
suitable image capture device) and generate a covering image such
as covering image 20 for printing by printer 76. Image processor 74
includes processor 80 and memory 82. Processor 80 may be
implemented as a general-purpose processor, an application specific
integrated circuit (ASIC), one or more field programmable gate
arrays (FPGAs), a digital-signal-processor (DSP), a group of
processing components, or other suitable electronic processing
components. Memory 82 is one or more devices (e.g., RAM, ROM, Flash
Memory, hard disk storage, etc.) for storing data and/or computer
code for facilitating the various processes described herein.
Memory 82 may be or include non-transient volatile memory or
non-volatile memory. Memory 82 may include database components,
object code components, script components, or any other type of
information structure for supporting the various activities and
information structures described herein. Memory 82 may be
communicably connected to processor 80 and provide computer code or
instructions to processor 80 for executing the processes described
herein.
[0047] According to one embodiment, image processor 74 is
configured to generate a covering image based on a covering member
and one or more user images. For example, based on known dimensions
of a covering member of a particular mask and image data of a
particular user, image processor 74 determines the likely portions
of the user's face that will be covered by the mask during use. As
such, image processor 74 may crop a portion of an image of a user and,
after any necessary or desired modifications, print the cropped
image onto the covering member.
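The region determination described in paragraph [0047] can be illustrated with a short sketch. The function name, the coordinate convention, and the `mask_fraction` default below are illustrative assumptions for a lower-face filter mask, not details taken from the disclosure.

```python
def covered_region(face_box, mask_fraction=0.5):
    """Estimate the portion of a user's face that a mask will cover.

    face_box: (left, top, right, bottom) bounding box of the user's
    face in image coordinates. A typical filter mask covers roughly
    the lower portion of the face, so the crop box spans the bottom
    `mask_fraction` of the face box (an illustrative assumption).
    Returns a (left, top, right, bottom) crop box for the covering image.
    """
    left, top, right, bottom = face_box
    height = bottom - top
    crop_top = int(top + (1.0 - mask_fraction) * height)
    return (left, crop_top, right, bottom)
```

In practice the returned box could be passed to an image library's crop routine (e.g., Pillow's `Image.crop`) before any further modification and printing.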
[0048] As noted above, the printed covering image may be a modified
version of the user image to provide, for example, modified facial
features, corrected blemishes, customized images (e.g., animal
features, etc.), different facial expressions (e.g., smiles,
frowns, etc.), and the like. In one embodiment, the covering image
is configured to provide a generally homogeneous transition between
the covering image and adjacent uncovered portions of the user's
face.
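One simple way to obtain the homogeneous edge transition described above is to feather the covering image's opacity near its borders. The linear ramp below is a minimal sketch of that idea; the function name and the linear profile are illustrative choices, not part of the disclosure.

```python
def feather_alpha(width, feather):
    """Per-column opacity ramp for blending a covering image's left
    and right edges into adjacent uncovered skin.

    Columns within `feather` pixels of either edge fade linearly from
    0.0 (transparent) to 1.0 (opaque); interior columns are fully
    opaque. A linear ramp is an illustrative choice; any smooth
    profile could be substituted.
    """
    alphas = []
    for x in range(width):
        edge_dist = min(x, width - 1 - x)  # distance to nearest edge
        alphas.append(min(1.0, edge_dist / feather))
    return alphas
```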
[0049] Image processor 74 controls operation of printer 76 to print
the appropriate covering image onto a covering member or other
appropriate surface. The covering image may
be any suitable image, such as any of those discussed with respect
to mask 14 and covering member 16. In one embodiment, printer 76 is
an ink jet printer. In other embodiments, printer 76 is another
suitable printing device. Printer 76 may be located locally or
remotely from image processor 74 and/or image capture device 72.
For example, in some embodiments, a user captures one or more
images of his or her face (e.g., using a digital camera) and
provides the images to a mask vendor, who in turn processes the
images and prints one or more masks for use by the user. In one
embodiment, the one or more images may be taken from different
perspectives, capture different facial expressions, or the like. In
other embodiments, image processor 74 is accessible via a web-based
application and/or resides on a user device (e.g., a cellular
phone, laptop computer, desktop computer, etc.), such that the user
can take one or more digital photographs, upload the photographs to
the image processor, and subsequently print one or more masks
(e.g., on a personal printer, etc.).
[0050] According to some embodiments, printer 76 is or includes a
three-dimensional printer and is configured to print one or more
structural components for mask 14. For example, based on the user
image data, one or more armatures or other components can be
designed (e.g., by image processor 74) to enable mask 14 to conform
to a user's facial contour and/or to provide customized facial
structural features. As such, based on one or more factors, such as
user images, a particular covering member, and/or one or more user
inputs, printer 76 may print one or more structural components for
mask 14.
[0051] Referring further to FIG. 9, in some embodiments, mask 14
includes a dynamically changeable display layer controlled by
control unit 34. As such, rather than printing images onto a
covering member, image processor 74 in some embodiments instead
provides one or more data files to control unit 34. The data files
contain data enabling control unit 34 to provide one or more
desired displays on a display layer of mask 14. For example, the
data files may include different facial expressions, different
customized images, etc. As discussed above, control unit 34 may
communicate with image processor 74 wirelessly or via wired
communications protocols.
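The data files provided to control unit 34 could take many forms; one minimal sketch is a JSON payload mapping expression names to display-layer image payloads, which could be sent over any wired or wireless link. The field names and version tag below are hypothetical.

```python
import json

def expression_data_file(expressions):
    """Serialize a hypothetical covering-image data file for a
    control unit such as control unit 34.

    `expressions` maps expression names (e.g., "smile") to image
    payload identifiers for the display layer. JSON is an
    illustrative serialization choice suitable for transmission over
    wired or wireless links.
    """
    return json.dumps({"version": 1, "expressions": expressions})
```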
[0052] Referring now to FIG. 10, method 90 of making a face mask
such as mask 14 is shown according to one embodiment. One or more
images are acquired (92). The images may be acquired using image
capture device 72 or another suitable image capture device. In one
embodiment, the images are acquired using a personal or mobile
device of a user. Based on the acquired images, a covering image is
generated (94). For example, an image processor such as image
processor 74 may receive user image data associated with one or
more images, and based on the user image data and data related to a
particular face mask, generate a covering image for printing onto
the face mask. The covering image is in one embodiment based on the
user images and intended to substantially replicate all or some of
the facial features of the user. In one embodiment, a user
communicates user images acquired with a personal and/or portable
computer to a remote image processor, while in other embodiments,
the image processor is integrated with or provided locally with an
image capture device. If desired, the covering image is customized
(96). For example, a user may provide one or more inputs indicating
a customization preference. The preferences may include changing a
facial feature (e.g., eye, nose, etc.), a facial color (to provide
a different skin tone), or alternatively providing images
related to animals, etc.
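The steps of method 90 can be sketched as a simple pipeline, with image acquisition (92) assumed to have already produced the input data. Every field name in the record below is hypothetical; this is a structural sketch of steps 94 through 96, not a definitive implementation.

```python
def make_face_mask(image_data, mask_dims, customizations=None):
    """Illustrative pipeline for method 90.

    Generates a covering-image record from user image data and mask
    dimensions (step 94), applies optional customization preferences
    such as changed facial features or skin tones (step 96), and
    returns a record ready for printing (input to step 98).
    """
    covering = {"source": image_data, "size": mask_dims}  # step 94
    for pref in (customizations or []):                   # step 96
        covering.setdefault("edits", []).append(pref)
    return {"covering_image": covering, "status": "ready_to_print"}
```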
[0053] Once the covering image is complete, the covering image is
printed onto one or more face masks (98). In one embodiment, a
printer such as printer 76 is used to print the covering image onto
the face masks. The printer may be located remotely from, or
alternatively integrated with or located locally with, an image
processor and/or an image capture device. For example, in one
embodiment, a user uploads images from a digital camera (e.g., a
cellular phone with camera capabilities), processes the images
using a personal computer (e.g., a program residing on the personal
computer, a program accessible via a web-based application, etc.),
and prints one or more face masks using a personal printer. In
alternative embodiments, a vendor can perform one or more of the
image processing and printing steps and provide the user with one
or more sets of face masks. In further embodiments, a user can
select different image variations for printing (e.g., with
different facial expressions, facial features, skin tones, etc.),
and receive sets of each type of customized face mask.
[0054] According to an alternative embodiment, rather than printing
a covering image onto a face mask, one or more covering image data
files are created for use in generating electronic displays on an
electronic display member of a face mask. For example, referring
back to FIG. 4, control unit 34 may receive covering image data
files enabling control unit 34 to generate one or more desired
displays on a display layer such as display layer 30. The covering
image data files may be communicated from an image processor to
control unit 34 using any suitable wired or wireless communications
protocol.
[0055] Referring now to FIG. 11, method 100 of using a face mask
such as mask 14 is shown according to one embodiment. A first image
is provided (102). The first image may be provided by way of an
image printed onto a covering member of a face mask, or
alternatively, provided by an electronic display layer. An input is
received (104). The input may be received by a control unit coupled
to a covering member and/or display layer. The input may be
representative of a proximity of another person, a time, such as a
time of day, day of the week, etc., a location of the user (e.g.,
at work, at home, out in public, etc.), whether the user is
currently speaking, and the like. Based on the input, the first
image is modified (106). Modifying the first image may include
providing a second, different image via an electronic display
layer. Alternatively, in some embodiments modifying the first image
includes physically manipulating a covering member of a face mask.
For example, in the case of a pleated or folded mask, different
images may be provided based on whether the mask is folded or
unfolded, such that modifying the first image includes folding or
unfolding the covering member. In a further example, modifying the
first image includes actuating one or more armatures to provide a
different structural appearance to a covering member (e.g., to
provide different facial expressions, to simulate talking, etc.).
Method 100 may be continued while the user continues to wear the
face mask.
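The input-driven modification of method 100 (steps 104 and 106) can be sketched as a small selection rule over a set of prepared covering images. The rule ordering here (speaking takes precedence over location, with a default fallback) is a hypothetical design choice, as are the key names.

```python
def select_image(inputs, images):
    """Illustrative rule for method 100: choose a display image based
    on received inputs (step 104).

    `inputs` may indicate whether the user is speaking or the user's
    location; `images` maps situation names to prepared covering
    images and must contain a "default" entry. The precedence order
    is a hypothetical design choice.
    """
    if inputs.get("speaking"):
        return images.get("talking", images["default"])
    location = inputs.get("location")
    if location in images:
        return images[location]
    return images["default"]
```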
[0056] The present disclosure contemplates methods, systems, and
program products on any machine-readable media for accomplishing
various operations. The embodiments of the present disclosure may
be implemented using existing computer processors, or by a special
purpose computer processor for an appropriate system, incorporated
for this or another purpose, or by a hardwired system. Embodiments
within the scope of the present disclosure include program products
comprising machine-readable media for carrying or having
machine-executable instructions or data structures stored thereon.
Such machine-readable media can be any available media that can be
accessed by a general purpose or special purpose computer or other
machine with a processor. By way of example, such machine-readable
media can comprise RAM, ROM, EPROM, EEPROM, CD-ROM or other optical
disk storage, magnetic disk storage or other magnetic storage
devices, or any other medium which can be used to carry or store
desired program code in the form of machine-executable instructions
or data structures and which can be accessed by a general purpose
or special purpose computer or other machine with a processor. When
information is transferred or provided over a network or another
communications connection (either hardwired, wireless, or a
combination of hardwired or wireless) to a machine, the machine
properly views the connection as a machine-readable medium. Thus,
any such connection is properly termed a machine-readable medium.
Combinations of the above are also included within the scope of
machine-readable media. Machine-executable instructions include,
for example, instructions and data which cause a general purpose
computer, special purpose computer, or special purpose processing
machines to perform a certain function or group of functions.
[0057] Although the figures may show a specific order of method
steps, the order of the steps may differ from what is depicted.
Also, two or more steps may be performed concurrently or with
partial concurrence. Such variation will depend on the software and
hardware systems chosen and on designer choice. All such variations
are within the scope of the disclosure. Likewise, software
implementations could be accomplished with standard programming
techniques with rule based logic and other logic to accomplish the
various connection steps, processing steps, comparison steps and
decision steps.
[0058] While various aspects and embodiments have been disclosed
herein, other aspects and embodiments will be apparent to those
skilled in the art. The various aspects and embodiments disclosed
herein are for purposes of illustration and are not intended to be
limiting, with the true scope and spirit being indicated by the
following claims.
* * * * *