U.S. patent application number 12/406066 was filed with the patent office on 2009-03-17 and published on 2009-09-17 for graphical user interface for selection of options from option groups and methods relating to same.
This patent application is currently assigned to PHOTOMETRIA, INC. The invention is credited to Kevin Barnes and Satya Mallick.
Application Number: 20090231356 (12/406066)
Document ID: /
Family ID: 41062548
Publication Date: 2009-09-17
United States Patent Application 20090231356
Kind Code: A1
Barnes; Kevin; et al.
September 17, 2009

GRAPHICAL USER INTERFACE FOR SELECTION OF OPTIONS FROM OPTION GROUPS AND METHODS RELATING TO SAME
Abstract
A system providing a graphical user interface for the selection
of options from option groups and method for implementing the same.
A graphical user interface enables a user to see the results of
applying virtual makeup to an image of a person's face. The
interface may enable a user to select the color of the makeup to be
applied through a circular region with a series of option tabs. The
outer portion of the circular region, or the flywheel, may have
multiple color segments. The color segments are arranged so that
adjacent color segments may be perceived to be most similar. The
inner portion of the circle may present the history of the colors
previously chosen and may have an uncolored center circle, and a
series of uncolored circles surrounding the center circle.
Inventors: Barnes; Kevin (San Diego, CA); Mallick; Satya (San Diego, CA)
Correspondence Address: DALINA LAW GROUP, P.C., 7910 IVANHOE AVE. #325, LA JOLLA, CA 92037, US
Assignee: PHOTOMETRIA, INC. (San Diego, CA)
Family ID: 41062548
Appl. No.: 12/406066
Filed: March 17, 2009
Related U.S. Patent Documents

Application Number | Filing Date | Patent Number
61037323 | Mar 17, 2008 |
61037319 | Mar 17, 2008 |
61037314 | Mar 17, 2008 |
Current U.S. Class: 345/594; 715/781
Current CPC Class: G06F 3/04883 20130101; G09G 2340/10 20130101; G06Q 30/0601 20130101; G16H 30/40 20180101; G06Q 30/0256 20130101; G06Q 30/0275 20130101; G06F 3/0482 20130101; G06Q 30/0269 20130101; G09G 5/14 20130101; G06Q 40/04 20130101; G09G 2320/0666 20130101; G09G 5/06 20130101; G09G 2340/0407 20130101
Class at Publication: 345/594; 715/781
International Class: G09G 5/02 20060101 G09G005/02; G06F 3/048 20060101 G06F003/048
Claims
1. A computer program product for rendering a graphical user
interface component for selecting color to apply to an image
comprising computer readable program code, said computer readable
program code executing in a tangible memory medium and configured
to: display a graphical user interface component comprising a
palette selection region, a color selection region, a history
screen region further comprising a current selected region and a
plurality of recently selected screen regions; identify an
identified region of a subject image to apply a color; define at
least one color palette to associate with said palette selection
region; display a plurality of color choices within said at least
one color palette in said color selection region; obtain input from
a user representing a first color choice made by said user; apply
said first color choice to said identified region of a subject
image; display said first color choice in said current selected
region; if a second color choice is input by said user, apply said
second color choice to said identified region of said subject image;
display said second color choice in said current selected region;
and display said first color choice in one of said plurality of
recently selected screen regions.
2. The computer program product of claim 1 wherein said color
palette is defined by said user.
3. The computer program product of claim 1 wherein said color
palette is defined by a default selection.
4. The computer program product of claim 1 wherein said color
palette is defined based on a color profile of an associated
image.
5. The computer program product of claim 1 wherein said color
palette is defined based on said identified region of said subject
image.
6. The computer program product of claim 1 wherein said color
palette represents colors associated with a cosmetic product.
7. The computer program product of claim 1 wherein said color
palette represents a product line of products and their associated
characteristics.
8. The computer program product of claim 6 wherein said cosmetic
product is a facial cosmetic product.
9. The computer program product of claim 8 wherein said facial
cosmetic product is foundation.
10. The computer program product of claim 8 wherein said facial
cosmetic product is blush.
11. The computer program product of claim 8 wherein said facial
cosmetic product is concealer.
12. The computer program product of claim 8 wherein said facial
cosmetic product is eye shadow.
13. The computer program product of claim 8 wherein said facial
cosmetic product is mascara.
14. The computer program product of claim 8 wherein said facial
cosmetic product is eye liner.
15. The computer program product of claim 8 wherein said facial
cosmetic product is lipstick.
16. The computer program product of claim 8 wherein said facial
cosmetic product is lip gloss.
17. The computer program product of claim 8 wherein said facial
cosmetic product is lip liner.
18. The computer program product of claim 1 wherein said palette
selection region is a tab adjacent to said color selection
region.
19. The computer program product of claim 1 wherein said color
selection region is visually represented as a flywheel separated
into segments for a plurality of said color choices.
20. The computer program product of claim 1 wherein said recently
selected screen region moves in descending order of most to least
recently selected color choices.
21. The computer program product of claim 1 wherein definition of
said at least one color palette to associate with said palette
selection region is defined by said user.
22. The computer program product of claim 1 wherein definition of
said at least one color palette to associate with said palette
selection region is defined by a system.
23. The computer program product of claim 1 wherein definition of
said at least one color palette to associate with said palette
selection region is defined by said identified region of said subject
image.
24. The computer program product of claim 1 wherein a context
sensitive menu is presented to said user when said user selects
said identified region of said subject image, wherein said context
sensitive menu presents at least one action that is associated with
said identified region.
25. The computer program product of claim 1 further configured to
present at least one option group wherein said at least one option
group is associated with said at least one color palette.
26. A method for displaying a color selection region comprising:
creating a circular doubly linked flywheel data structure, said
flywheel data structure comprising a plurality of records;
associating each record of said plurality of records with a segment
on a flywheel display; associating each color of a plurality of
colors with each record of said plurality of records; calculating a
local cost for a color; when said local cost is less than zero,
swapping said record of said color with a next record of a next
color; and displaying a first color in said plurality of colors on a
first segment of said flywheel display.
27. The method of claim 26 wherein said calculating a local cost
for a color comprises determining differences in color space of
said record and said next record.
28. The method of claim 26 wherein said flywheel display is
visually represented as a circle.
29. The method of claim 26 wherein said flywheel display is
visually represented as an ellipse.
30. A computer program product for rendering a graphical user
interface component for selecting color to apply to an image
comprising computer readable program code, said computer readable
program code executing in a tangible memory medium and configured
to: display a graphical user interface component comprising a
palette selection region, a color selection region further
comprising a flywheel display, a history screen region further
comprising a current selected region and a plurality of recently
selected screen regions; identify a region of a subject image to
apply a color to; define at least one color palette to associate
with said palette selection region; create a circular doubly linked
flywheel data structure, said flywheel data structure comprising a
plurality of records; associate each record of said plurality of
records with a segment on a flywheel display; associate each color
of a plurality of colors with each record of said plurality of
records; calculate a local cost for a color; when said local cost
is less than zero, swap said record of said color with a next
record of a next color; display a first color in said plurality of
colors on a first segment of said flywheel display; obtain input
from a user representing a first color choice made by said user;
apply said first color choice to said identified region of a
subject image; display said first color choice in said current
selected region; if a second color choice is input by said user,
apply said second color choice to said identified region of said
subject image; display said second color choice in said current
selected region; and display said first color choice in one of said
plurality of recently selected screen regions.
31. A computer program product for rendering a graphical user
interface component for selecting color to apply to an image
comprising computer readable program code, said computer readable
program code executing in a tangible memory medium and configured
to: display a graphical user interface component comprising a
palette selection region, a color selection region; identify a
first identified region of a subject image to apply a first color;
obtain input from a user to determine a first color choice of said
first identified region; identify a second identified region of
said subject image to apply a second color; obtain input from said
user to determine a second color choice of said second identified
region; and, store a look of said subject image, said look
comprising said first color choice and said second color
choice.
32. A computer readable storage medium encoded with computer
program instructions which when accessed by a computer cause the
computer to load the program instructions to a memory therein
creating a special purpose data structure causing the computer to
operate as a specially programmed computer executing a method of
displaying color selection history comprising: displaying in a
specially programmed computer a graphical user interface component
comprising a palette selection region, a color selection region, a
history screen region further comprising a current selected region
and a plurality of recently selected screen regions; receiving
through a human interface device, an identified region of a subject
image to apply a color; computing at least one color palette to
associate with said palette selection region in one or more
processors; displaying a plurality of color choices within at least
one color palette in said color selection region; receiving through
said human interface device input from a user representing a first
color choice made by said user; displaying said first color choice
to said identified region of a subject image; displaying said first
color choice in said current selected region; if a second color
choice is input by said user, displaying said second color choice to
said identified region of said subject image; displaying said
second color choice in said current selected region; and displaying
said first color choice in one of said plurality of recently
selected screen regions.
Description
CROSS REFERENCE TO RELATED APPLICATIONS
[0001] This application claims priority to three U.S. Provisional
Patent Applications, all filed on Mar. 17, 2008, and all owned by
the same assignee. These applications are entitled "SYSTEM AND
METHOD FOR CREATING AND SHARING PERSONALIZED VIRTUAL MAKEOVERS,"
Ser. No. 61/037,323, "GRAPHICAL USER INTERFACE FOR SELECTION OF
OPTIONS FROM OPTION GROUPS AND METHODS RELATING TO SAME," Ser. No.
61/037,319, and "METHOD OF MONETIZING ONLINE PERSONALIZED BEAUTY
PRODUCT SELECTIONS," Ser. No. 61/037,314. These provisional patent
applications are hereby incorporated by reference in their entirety
into this specification.
BACKGROUND OF THE INVENTION
[0002] 1. Field of the Invention
[0003] One or more embodiments of the invention described herein
pertain to the field of computer systems. More particularly, but
not by way of limitation, one or more embodiments of the invention
enable the rendering of a computer graphical user interface for the
selection of options from option groups. In at least one context
the graphical user interface provides users with screen elements
that enable the efficient selection of context appropriate colors
that are applied to an image in a corresponding screen region.
[0004] 2. Description of the Related Art
[0005] Computer systems have made longstanding use of various
systems for enabling users to affect the color of what is displayed
on the computer screen. Early programs allowed a user to draw
shapes and text in a rudimentary form. While the earliest
incarnations of these graphic design programs functioned only in
black and white, color options were added as computers developed.
These early applications, for example PC Paintbrush for
Microsoft's Disk Operating System (DOS), created in 1985, allowed a
user to select a drawing color from a palette. The user was
presented with a variety of colors to choose from in rowed boxes,
where different shades of color were represented by a mix of pixels
using a 16-color Enhanced Graphics Adapter and compatible
display.
[0006] As graphical programs continued to develop more color
options became available, and truer representation of the color
spectrum developed. A number of formats for color selection exist
including the rowed format continued from early graphics programs
into later versions, such as Microsoft's Paint for Windows XP.
Customization of these color selectors allows a user to move a point
through a large square that contains certain colors, and also
provides access to the millions of shades in between. The user may
create a desired color by selecting hue, saturation and luminosity
values for a color. While some vary, most color selection user
interfaces use this method for color selection. Less commonly, a
hexadecimal value associated with a color that is recognized by
Hypertext Markup Language (HTML) browsers may be selected to create
a desired color.
[0007] Color selection interfaces, however, generally lack an
ability to allow a user to select a color based on the availability
of the color and/or the appropriateness of a color in a given
situation. Color selection interfaces that allow for the grouping
of colors into palettes often do so in a fashion that is arbitrary
to the shade of color. For example, many group all reds together
into a single palette.
[0008] The interfaces provided for color selection are generally
formulaic and vary little from program to program. Almost all
involve a process requiring a large amount of user experimentation
to create the desired color based on manual selection of the hue,
saturation and luminosity (HSL) values or Red, Green, Blue (RGB)
values. Others simply provide a limited supply of colors. These
interfaces also lack the ability to allow a user to select from a
wide variety of named colors, which is necessary in an interface for
selecting colors for transference to a photographic image. For
example, if a user picks a shade of orange from an HSL color
selection interface, the interface is unable to provide a name for
the color that the user could take to a local hardware store to
purchase a matching paint color with which to paint a house.
[0009] Interfaces for color selection also lack a coherent and
ordered method of tracking recent selections within the interface,
while maintaining the context appropriate application of these
selections as described above.
[0010] For at least the limitations described above there is a need
for a computer graphical user interface for the selection of
options from option groups in various contexts such as the one
described in further detail throughout this document.
BRIEF SUMMARY OF THE INVENTION
[0011] One or more embodiments of the invention are directed to a
graphical user interface for the selection of options from option
groups. By way of example and not by limitation, the graphical user
interface described herein provides users with an arrangement of
screen elements that enables the user to make color choices within
a particular context and apply the selected colors to an image.
[0012] In one or more embodiments of the invention, a graphical
user interface may enable a user to see the results of applying
makeup to an image of a person's face. This interface may offer
multiple tabs to enable a user to select a particular section of a
person's face to which the user chooses to apply makeup. The
interface may enable a user to select the color palette of the
makeup to be applied through a circular region with a series of
option group tabs. Each option group tab has a group of associated
colors. The outer portion of the circular region, or the flywheel,
may have multiple color segments. The color segments are arranged
so that adjacent color segments may be perceived to be most
similar. The inner portion of the circle may present the history of
the colors previously chosen and may have an uncolored center
circle, and a series of uncolored circles surrounding the center
circle.
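The adjacent-similarity arrangement described above can be illustrated with a greedy ordering pass: two neighboring flywheel segments are swapped whenever the swap lowers the total color-space distance between adjacent segments (the claims describe this in terms of a circular record structure, a local cost, and a swap when the cost is negative). The Python sketch below is an illustrative assumption, not the patented implementation; `distance` uses plain Euclidean RGB distance as a stand-in for whatever perceptual metric an implementation might use.

```python
import math

def distance(c1, c2):
    """Euclidean distance between two RGB colors (a stand-in metric)."""
    return math.dist(c1, c2)

def local_swap_cost(colors, i):
    """Change in total adjacent distance if the colors at positions i and
    i+1 (circularly) are swapped; a negative cost means the swap helps.
    The edge between the two swapped entries itself is unaffected."""
    n = len(colors)
    a = colors[(i - 1) % n]                 # left neighbor
    b, c = colors[i], colors[(i + 1) % n]   # the pair considered for a swap
    d = colors[(i + 2) % n]                 # right neighbor
    return (distance(a, c) + distance(b, d)) - (distance(a, b) + distance(c, d))

def order_flywheel(colors):
    """Greedy pass-based ordering: keep swapping neighbors while any
    swap strictly lowers the total adjacent color distance."""
    colors = list(colors)
    n = len(colors)
    improved = True
    while improved:
        improved = False
        for i in range(n):
            if local_swap_cost(colors, i) < 0:
                colors[i], colors[(i + 1) % n] = colors[(i + 1) % n], colors[i]
                improved = True
    return colors
```

Each executed swap strictly reduces the ring's total adjacent distance, so the loop terminates; the result is a locally optimal circular ordering in which similar colors sit next to each other.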
[0013] In one or more embodiments of the invention, when a user
selects an option group tab, a new group of color segments may be
presented in the flywheel portion of the circle. As the user moves
and holds the cursor above a colored segment, the size of that
segment may expand to allow the user to see the color more clearly.
When the user clicks on a color segment, the section of the
person's face selected changes to the color of the color segment.
In addition, the selected color fills the center circle. When the
user selects a different color, the center circle may be filled
with the selected different color and one of the circles
surrounding the center circle may be then filled with the previous
selected color. As the user selects additional different colors,
the center circle and the circles surrounding the center circle
present the history of the previously selected color choices.
[0014] In the example described here, the graphical user interface
components and the methods enabling the graphical user interface
components to operate as described are illustrated via a virtual
makeover. Thus, while not limited solely to such an implementation,
one or more embodiments of the invention are directed to providing
users with a graphical user interface that enables the users to
apply virtual makeup to an image. Users may, for instance, upload a
picture of themselves and use the graphical user interface components
described herein to apply virtual makeup to the image. Hence in at
least one embodiment of the invention, users utilize the graphical
user interface components to make color choices about various color
shades of makeup such as foundation, concealer, blush, eye-shadow,
eye-liner, mascara, lipstick, lip liner, lip gloss, and contact
lenses. Color choices are made by a user as to what color of makeup
to apply, and the system renders the chosen color to the image.
[0015] The user's color choices and the context within which the
choices were made are retained in a recent selection screen region
of the interface. The method described here is not limited as to
the type of computer it may run upon and may for instance operate
on any generalized computer system that has the computational
ability to execute the methods described herein and can display the
results of the users' choices on a display means. The computer
typically includes at least a keyboard, a display device such as a
monitor, and a pointing device such as a mouse. The computer also
typically comprises a random access memory, a read only memory, a
central processing unit and a storage device such as a hard disk
drive. In some embodiments of the interface, the computer may also
comprise a network connection that allows the computer to send and
receive data through a computer network such as the Internet. The
invention may be embodied on mobile computer platforms such as cell
phones, Personal Digital Assistants (PDAs), kiosks, game consoles, or
any other computational device.
[0016] The term options, as it is used here, refers to choices that
are selectable by a user for application to a subject. For instance,
in one embodiment, the options
are colors associated with facial makeup that are further applied
to a subject image, and the context for the options provided is
dependent on the part of the image the color is to be applied to.
In one or more embodiments of the invention the system is able to
dynamically generate options presented to the user and may, for
instance, determine what colors to lay out on the fly based on the
user's selection. The number of colors and the particular colors to
be displayed are generally dictated by user choice. If a user asks
to only see a certain type, category or subcategory of makeup the
system renders the color choices based on the user's input. For
instance, if the user selects only organic, SPF foundations the
system supporting the graphical user interface obtains the makeup
that is classified as such, determines the colors to be displayed
using corresponding color data associated with those makeup choices
and renders the choices within the graphical user interface for
selection by the user.
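The dynamic generation of color choices described above (for example, showing only organic, SPF foundations) can be sketched as a simple catalog filter. Everything below, including the `Product` record, the sample catalog, and the attribute names, is a hypothetical illustration rather than data from the application.

```python
from dataclasses import dataclass, field

@dataclass
class Product:
    name: str
    category: str        # e.g. "foundation"
    attributes: set      # e.g. {"organic", "spf"}
    colors: list = field(default_factory=list)  # discrete RGB color values

# Illustrative catalog of makeup items with associated color data.
catalog = [
    Product("Brand A Foundation", "foundation", {"organic", "spf"},
            [(230, 190, 160), (210, 170, 140)]),
    Product("Brand B Foundation", "foundation", {"spf"},
            [(200, 160, 130)]),
    Product("Brand C Blush", "blush", {"organic"},
            [(240, 140, 150)]),
]

def colors_for_selection(category, required_attributes):
    """Collect the color choices to render for the user's grouping:
    items of the chosen category that carry all required attributes."""
    choices = []
    for product in catalog:
        if product.category == category and required_attributes <= product.attributes:
            choices.extend(product.colors)
    return choices

# User asks for organic, SPF foundations only:
print(colors_for_selection("foundation", {"organic", "spf"}))
# → [(230, 190, 160), (210, 170, 140)]
```

The interface would then lay out exactly these color values on the flywheel, regenerating the list whenever the user changes the grouping.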
[0017] The visual presentation used within the graphical user
interface to present color options to the user can vary depending
upon which embodiment of the invention is implemented. In at least
one embodiment of the invention a circular color flywheel type
interface is utilized to present the color choices. The colors
displayed on the color flywheel show the user what color choices
are available for application to the image. The user may change the
colors displayed on the color flywheel by defining or selecting a
new option group. Each option group has an associated collection of
colors and when the option group is active, the associated colors
are displayed on the color flywheel. In the center portion of the
color flywheel information about the operations and color choices
already made by the user is displayed using a collection of
circles. The center most circle indicates which color is active and
presently applied to the associated image and the various circles
surrounding this active circle depict what other colors have been
or may be applied to the image. One advantage of using a color
flywheel to implement one or more embodiments of the invention is
that the colors on the color flywheel can be determined on the fly
as was mentioned above and will be more fully described throughout
this document.
[0018] The method described herein in the context of a graphical
user interface enables users to visually glance at a history view
as the history data is collected and gathered based upon user
selections within a specific set of option groups. As new options
are selected, the interface presents and distinguishes between the
currently applied and selected option as well as options that have
been recently selected through the use of dynamic graphical
representation. A number of items are kept in the history, and this
number can be as large or as small as is called for in any
implementation of an embodiment of the invention. The interface
moves in sequence from the most recently selected to the earliest
selected option. Upon the interface history being filled the
earliest selected option is removed from the list to make room for
new selections. One or more embodiments of the invention are also
able to recognize when a new selection by a user is already in the
recent history and has not yet been removed, and can re-sort the
history instead of duplicating the history entry and taking up a
second history position on the interface.
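The history behavior described above, a fixed number of entries ordered most recent first, with reselection re-sorting an existing entry to the front rather than duplicating it, can be sketched as follows; the class name and capacity are illustrative assumptions.

```python
class SelectionHistory:
    def __init__(self, capacity=6):
        self.capacity = capacity
        self.items = []   # index 0 is the most recently selected option

    def select(self, item):
        if item in self.items:
            self.items.remove(item)      # re-sort rather than duplicate
        self.items.insert(0, item)
        if len(self.items) > self.capacity:
            self.items.pop()             # drop the earliest selection

h = SelectionHistory(capacity=3)
for color in ["red", "plum", "coral", "red", "nude"]:
    h.select(color)
print(h.items)   # → ['nude', 'red', 'coral']
```

Note that reselecting "red" moved it to the front instead of occupying a second position, and "plum" was dropped once the capacity was exceeded.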
[0019] The interface is able to retain its history through
navigation. As a user navigates through option groups which are
associated with the same subject area of application the history
persists, allowing for selections by a user to be retained and
referred back to even between option groups. A new history is
created once the user navigates to a new subject area and the
graphical representation of the history is blanked. However, should
a user choose to return to the previous subject area, the history
for that area would once again become available. The history may
also persist across user sessions.
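The per-subject-area persistence described above might be modeled by keeping one history per subject area, so that navigating to a new area blanks the display while each underlying list survives for later return. A minimal sketch, with illustrative area and color names:

```python
from collections import defaultdict

# subject area -> selections, most recent first (assumed structure)
histories = defaultdict(list)

def select(area, choice):
    history = histories[area]
    if choice in history:
        history.remove(choice)   # re-sort rather than duplicate
    history.insert(0, choice)

select("lips", "ruby")
select("lips", "coral")
select("eyes", "smoke")          # new area starts with its own blank history
select("lips", "ruby")           # returning to lips: earlier history persists

print(histories["lips"])         # → ['ruby', 'coral']
print(histories["eyes"])         # → ['smoke']
```

Serializing `histories` per user would likewise give the cross-session persistence the text mentions.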
[0020] In one embodiment used for example where the interface is
configured to present makeup options for application to an image of
a face, a user selecting a set of lipstick colors for application
to the face would see their most recent selections represented on
the recently selected history regions of the interface. Were the
user to choose to select eye shadow colors instead, the interface
would be repopulated with options appropriate to eye shadows, and a
new history would be created based on those selections. Navigating
back to the lipstick options would repopulate both the interface
with options and the previous history.
[0021] Context sensitivity with respect to the operations to be
applied to the image is incorporated into one or more embodiments
of the invention. For instance, when an image of a person is
obtained by the system, the system processes the image using facial
detection algorithms to identify the various parts of a person's
face such as the eyes, nose, mouth, skin, cheeks, chin, hair,
eyebrows, ears, or any other anatomically identifiable feature.
This facial information is stored by the system for later use. When
the user is later working on applying color changes to the uploaded
image, the information obtained about the image such as where the
eyes are located, what part of the image is skin, and where the
lips are located is used to present a limited set of operations to
the user based on the context within which a particular set of
tasks may be applied. The system is configured to present
operations to the user that are relevant to the location on the
image where the user has positioned the cursor or otherwise
selected. When the cursor is located over the eye, for instance, a
mouse click presents operations that are eye specific. When the
cursor is positioned over the lips the operations are lip specific
(e.g., relate to the application of lipstick, lip liner, or lip gloss)
and when the cursor is over a relevant portion of the skin the
operations presented coincide with the location of the cursor
(e.g., relate to foundation or concealer). The location within the
image where context sensitive menus such as the ones described are
presented depends upon the particular face within the image and is
driven by the system making use of an automatic facial detection
algorithm or by having the user manually identify key anatomical
features of the face.
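The context-sensitive lookup described above can be sketched as a hit test against the stored facial regions: a click position is matched to the most specific detected region, which selects the operations to present. The bounding boxes, coordinates, and operation lists below are illustrative assumptions; a real implementation would use the output of the facial-detection step described in the text.

```python
# region name -> (left, top, right, bottom) in image coordinates,
# ordered most specific first so the broad "skin" region is checked last
detected_regions = {
    "eyes": (120, 140, 280, 180),
    "lips": (160, 260, 240, 300),
    "skin": (80, 80, 320, 340),
}

operations = {
    "eyes": ["eye shadow", "eye liner", "mascara"],
    "lips": ["lipstick", "lip liner", "lip gloss"],
    "skin": ["foundation", "concealer", "blush"],
}

def operations_at(x, y):
    """Return the context-sensitive operations for the region under (x, y)."""
    for name, (left, top, right, bottom) in detected_regions.items():
        if left <= x <= right and top <= y <= bottom:
            return operations[name]
    return []

print(operations_at(200, 280))   # a point inside the lips box
# → ['lipstick', 'lip liner', 'lip gloss']
```

Because the lips box lies inside the broader skin box, checking regions in most-specific-first order is what makes a click on the lips yield lip operations rather than foundation or concealer.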
BRIEF DESCRIPTION OF THE DRAWINGS
[0022] The patent or application file contains at least one drawing
executed in color. Copies of this patent or patent application
publication with color drawing(s) will be provided by the Office
upon request and payment of the necessary fee.
[0023] The above and other aspects, features and advantages of the
invention will be more apparent from the following more particular
description thereof, presented in conjunction with the following
drawings wherein:
[0024] FIG. 1 shows an exemplary embodiment of an option selector
in which option palettes are selected.
[0025] FIG. 1A illustrates the color palettes selected by an option
group in one or more embodiments of the invention.
[0026] FIG. 1B illustrates an exemplary embodiment where a single
color is selected by a user.
[0027] FIG. 1C illustrates a history of colors in one or more
embodiments of the invention.
[0028] FIG. 1D illustrates an example method for ordering colors in
one or more embodiments of the invention.
[0029] FIG. 2 shows an exemplary method in which the option
selector operates where colors and color palettes are used as the
selections.
[0030] FIG. 3 shows an exemplary method in which the option
selector of FIG. 1 operates with any available options.
[0031] FIG. 4 shows an exemplary embodiment of the option selector
of FIG. 1 in a web application.
[0032] FIG. 5 shows the high level operation of the option selector
within an application.
[0033] FIG. 6 shows a method for creating, saving and using a look
to apply items associated with the look to an image. The look
attributes are saved independent of the image.
[0034] FIG. 7 shows an image being provided to the system, in which
the user can select an already saved `look` to apply to their
image, or start on their own and create a new look for the uploaded
image.
[0035] FIG. 8 shows an exemplary embodiment of an interface
allowing for the `at-a-glance` viewing of an image with multiple
applied looks.
[0036] FIG. 9 presents exemplary computer and peripherals which,
when programmed as described herein, may operate as a specially
programmed computer capable of implementing one or more methods,
apparatus and/or systems of the invention.
DETAILED DESCRIPTION
[0037] A graphical user interface for the selection of options from
option groups and method for implementing the same will now be
described. In the example given here the invention is described in
the context of a graphical user interface component that is used to
enable users to select and apply virtual makeup to an image. In the
following exemplary description numerous specific details are set
forth in order to provide a more thorough understanding of
embodiments of the invention. It will be apparent, however, to an
artisan of ordinary skill that the present invention may be
practiced without incorporating all aspects of the specific details
described herein. In other instances, specific features,
quantities, or measurements well known to those of ordinary skill
in the art have not been described in detail so as not to obscure
the invention. Readers should note that, although examples of the
invention are set forth herein, the invention is not limited to the
specific examples given in that the claims, and the full scope of
any equivalents, are what define the invention.
[0038] One or more embodiments of the invention are implemented in
the context of a graphical user interface for selection of options
from option groups and methods related to the same. This disclosure
relies in part on the novel programming algorithms and user
interface elements discussed in two co-pending U.S. Patent
applications filed on the same day and owned by the same assignee.
These applications are entitled "SYSTEM AND METHOD FOR CREATING AND
SHARING PERSONALIZED VIRTUAL MAKEOVER," Ser. No. ______, filed 17
Mar. 2009, hereinafter known as the "Personalized Makeovers"
co-pending patent application and "METHOD OF MONETIZING ONLINE
PERSONALIZED BEAUTY PRODUCT SELECTIONS", Ser. No. ______, filed 17
Mar. 2009, hereinafter, the "Monetizing" co-pending patent
application. These patent applications are hereby incorporated by
reference into this specification.
[0039] Interface Enabling Dynamic Layout of Options or Color Within
a Group for Application to an Image
[0040] FIG. 1 shows a graphical user interface comprised of
interactive screen regions that, upon activation by the end user,
apply desired effects to an associated image. For instance a user
may utilize the interface to identify a type of makeup and color
and to then apply the selected makeup to a corresponding image.
Images are generally uploaded or otherwise saved into a computer
for application of the methodology described herein. Once an image
is available to the system the user identifies a grouping (e.g.,
makeup type) that will have an associated collection of options
(e.g., color choices). Each grouping has a set of stored values
that defines the attributes of the group. For example, if the user
selects a brand or generalized color palette of foundation to apply
to the skin region within an image, the color values associated with
each type of foundation that falls within the selected grouping are
stored in the system. When an item falls within the grouping made by
the user, the values for that item are used to populate the option
group. In the context of a
virtual makeup interface, for example, there are various types of
makeup (e.g., foundation, concealer, blush, eye-shadow, eye-liner,
mascara, lipstick, lip liner, lip gloss, and contact lenses). The
user selects a makeup type such as foundation and then defines a
grouping within the type. The user may want a color palette from a
certain brand of makeup or choose colors that are recommended for a
certain skin type. In other instances the user may want makeup that
has certain characteristics (e.g., SPF level 15 or any other user
preferred characteristic) and can select such a grouping. Each of
the individual items of makeup has a set of discrete color values.
When the user identifies the grouping, the system obtains the
various discrete values for the different items of makeup that fall
within the grouping. In one or more embodiments of the invention,
these discrete color values are then presented to the user for
selection and application to a corresponding image via the
dynamically generated graphical user interface described in FIG. 1
and throughout. In one or more embodiments of the invention, the
generated graphical user interface may not be dynamically
generated. In one or more embodiments of the invention, a different
flywheel may be loaded for each corresponding option such as
lipstick for example.
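The grouping-to-options lookup described above can be sketched as follows; the catalog contents, function name, and keying scheme are illustrative assumptions, not taken from this disclosure:

```python
# Illustrative sketch of populating flywheel segments from a selected
# grouping; the catalog data and all names here are hypothetical.
CATALOG = {
    ("lipstick", "BrandA"): [(200, 30, 60), (220, 90, 100), (180, 20, 40)],
    ("lipstick", "BrandB"): [(150, 40, 70), (240, 120, 130)],
}

def populate_flywheel(makeup_type, grouping):
    """Look up the discrete color values stored for every item that
    falls within the selected grouping; these values fill the
    flywheel display segments."""
    return list(CATALOG.get((makeup_type, grouping), []))
```

Selecting a different grouping simply repeats the lookup, which is consistent with the number of flywheel segments varying from group to group.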
[0041] In the figure, depicted as an example of one or more
embodiments of the invention, a color flywheel type format is used
to display the available color choices to the user. It will be
apparent, however, to those of ordinary skill in the art that any
geometric shape, or combination of shapes, can be used as an
alternative to a circle; hence, while some of the advantages
described herein result from the circular color flywheel format,
other shapes such as an ellipse, a triangle, or a polygon are
feasible to implement. In one or more embodiments of the invention,
a circular format may consistently generate the same interface
regardless of the number of colors within the grouping. In one or
more embodiments of the invention, a square format may be employed
such that the size of the segments may change to accommodate more
colors.
[0042] FIG. 1, which illustrates the color flywheel type embodiment
of the invention, denotes an option group tab at 101. Each option
group tab has a set of associated items, where each item within the
group has discrete color values that are presented on the color
flywheel. If the option group relates to eye shadow, for instance,
the color values depicted are for each item of eye shadow within
the group. A visual cue that provides the user with an easy way to
determine what the option group relates to (e.g., what type of
makeup) may provide the user with information used to navigate
between option groups. In general each option group is associated
with a color palette that defines a plurality of color choices the
user may select. The colors on the color palette are associated
with a particular item (e.g., makeup item) that is part of the
grouping. When an option group is selected the corresponding color
palette associated with the selected option group is displayed. In
the example depicted in FIG. 1 for instance, when option group tab
101 is active, the colors displayed in flywheel display segments
102a-102ff are those associated with option group tab 101. If
option group tab 101 was representative of a group of lipstick
colors, flywheel display segments 102a through 102ff would
represent the various colors of lipstick within the group.
[0043] As shown in FIGS. 1 and 1A, in the context of colors,
different color palettes can be selected by selecting a different
option group (e.g., 113). The selection of a specific option group
at 101 triggers the population of flywheel display segments 102a,
102b, and 102c through 102ff with the options associated with the
selected option group. In one or more embodiments of the invention
option group tab 101 is stored as a group of references to the
items within the grouping. In this case for instance, option group
tab 101 is a set of stored references to the items of lipstick and
their corresponding discrete color values. When the user activates
option group tab 101 the system obtains the corresponding color
values and dynamically populates flywheel display segments 102a
through 102ff with the color of items within the group. FIG. 1A
illustrates how the screen appears once an option group is selected
and the corresponding color flywheel (in this case screen region
114) is populated. The number of colors displayed in screen region
114 is not fixed but depends on the option group.
[0044] In the example given at FIG. 1, flywheel display segments
102a through 102ff contain various color choices but the interface
depicted may be configured to contain other options. In the
examples given, this region is divided into sections around the
circumference of the circular region 103 shown in FIG. 1. Other
methods of displaying the options within flywheel display segments
102a through 102ff are also contemplated as being within the scope
and spirit of the invention. Hence readers should note that other
graphical arrangements that place the options associated with a
group into an arrangement where the choices are reasonably
proximate to the grouping and the history data are also
feasible.
[0045] Currently selected screen region 104, which in the example
depicted here is located in the middle of the user interface, allows
for the display of the currently selected option. Screen elements
around this middle currently selected screen region 104 contain a
history of color values identifying recent selections made by the
user as shown in this example at 105-112.
[0046] Upon activation of the interface the user is presented with
options dynamically generated or preloaded by the application. Once
a user enters the application, the option group selection unit is
populated at 101 with options the system determined to be relevant.
This determination is made by performing a lookup of the options
(e.g., color values) associated with a specific option
group. In the examples provided, the options provided in the Option
Selector device of FIG. 1 are used to provide color choices to be
applied to a picture, in this example a face. The application in
this example is able to recognize different parts of the face to
which makeup would be applied and then make changes.
[0047] A generalized interface for making use of the graphical user
interface depicted in FIGS. 1, 1A, 1B, and 1C is shown at FIG. 4.
In the example provided and shown in FIG. 4, a user selects a
subject at points 401-403, each of which relates to a specific
portion of the subject image. Upon the selection of one of these
subjects, the
user is further presented with specific options relating to the
selected portion at 405. For example, if the subject selected was
the mouth, the screen region at 405 showing available details for
selection would represent lipstick, lip liner, and lip gloss as
options to populate the option selector interface. The selection of
lipstick in this example would then populate the palette, or option
group, selection interface shown on FIG. 1, option group tab 101,
and at 406. Each palette contains a variety of lipstick colors,
grouped by context. For example, the colors red and orange would be
grouped together in one palette, and neutral or naked colors would
be grouped in another. Upon selection of a palette, the color
selection is populated with the colors associated with the palette
at flywheel display segment 102a. Selection of a color from this
interface applies the lipstick color to the subject image's mouth
in a manner consistent with lipstick being applied to a person's
face. Selection of a new detail, such as lip liner, repopulates the
interface with associated palettes and colors for that detail. The
colors available in each palette are in one or more embodiments of
the invention options that can be applied to an image of a person
in a specific way. Lipstick for instance is applied to lips,
eye-shadows to eyelids, and blush to cheeks.
[0048] The color options presented to a user are therefore provided
based on the context in which they are to be applied. In the makeup
example, the palettes and colors presented to the user relate to eye
makeup, foundations, or other makeup-type applications. Outside
the context of the makeup example described herein the methods
described here may be applied to other situations where there is a
grouping that has options for application to an image. One example
is the planned decoration of a home, where colors are applied based
on a desired `theme` or `look`. Applying colors to an image is not the
only embodiment of the invention as textures, patterns, text,
fonts, and any other form of graphically represented information
can be applied as option groups and options for this interface.
[0049] As previously mentioned, parts of the interface comprise a
history region. Circular region 103 depicts the history interface
in accordance with one or more embodiments of the invention, where
104 through 112 represent specific positions in the history,
ranging from the most currently selected option, through a number
of recently selected options. Upon selection of an option (e.g.
color value) from the interface (e.g., color flywheel) and the
application of this option to the associated subject image, the
selected option is shown at currently selected screen region 104,
which is used to represent the `currently applied` selection. In
keeping with our provided example embodiment, a lipstick color
selected by the user and applied to the subject image would then
display on the applied image as well as in current selected screen
region 104. FIG. 1B shows the invention in one or more example
embodiments where a single color is selected by a user. A visual
cue in the form of an arrow or other display element indicating the
selection made (131) is provided to show the selected color. A
large patch of this selected color is shown at screen region 116.
Before a color is selected but while the mouse is over the color
flywheel portion of the interface, the colors are enlarged to show
the user a bigger view of the color associated with the selection.
This mouse over effect is depicted at screen element 130 in FIGS.
1B and 1C.
[0050] FIG. 1C illustrates the use of the recently selected screen
regions 105 through 112. Upon the selection of a subsequent option
from the flywheel display segment 102a, the newly selected option
is shown at the currently selected screen region 104. This begins
to build a history, and moves the last selected color from 104 to
the most recently selected screen region of 105. As illustrated in
FIG. 1, the currently selected screen region 104 and recently
selected screen regions 105-112 have an associated progression. As a
new selection is made by the user, the new option occupies the
current selected space of 104, thereby moving the most recently
selected options around recently selected screen regions 105-112 in
the manner shown. FIG. 1C shows in an example embodiment of the
invention that a user has selected various colors which are moved
around the screen region. Upon these regions being filled with a
history, and a new selection being made, the interface is able to
drop the earliest selection from recently selected screen region
112 in order to make room for newer selections to be made.
[0051] Arrangement and Ordering of Colors
[0052] The flywheel is part of a graphical user interface where
users select a color from a finite number of colors, typically
between 50 and 200 colors. Rather than displaying colors in an
arbitrary order, it may be preferable for the colors to be ordered
in a fashion where successive colors appear similar and where
families of colors are proximal. In the virtual makeover
application, it is necessary to layout colors on the flywheel in
real-time because the selection of colors to be presented to a user
may vary over time (e.g., the collection of products being offered
by merchants may change, or a user may only want to see colors of
lipstick from her two favorite brands).
[0053] Colors lie in a 3-dimensional space, often specified in
terms of the three primaries red, green, and blue (RGB) or by their
Hue, Saturation, and Luminance (HSL). In mapping colors to the flywheel,
we are projecting colors from a 3-D space onto a 1-D circle. Unlike
the standard color wheel, which is 2-dimensional and where colors
are typically organized in polar coordinates with the hue defining
the angle and the saturation defining the radius, the flywheel data
structure may be a one-dimensional sorted circular doubly linked
list of colors for example. Rather than performing this
dimensionality reduction in a data independent way (e.g.,
projecting to the flywheel just using Hue), the projection may be
data-dependent, using the set of input colors to determine the
projection. This is important since the colors being displayed in a
flywheel for certain applications (e.g., display of cosmetics
colors) may not be uniformly distributed through the color space,
but are often highly clustered. For example, lipsticks may contain
many red-toned colors, but may contain very few blue, yellow and
green tones.
[0054] To assign a discrete set of N colors to N positions on a
flywheel display, we define a cost function
COST = sum from i = 0 to N-1 of <color(i), color((i+1) mod N)>
[0055] where color(i) is the color assigned to the i-th position on
the flywheel, and where <x,y> means the distance in color
space between colors x and y. For the subsequent method, the
distance between colors could be any metric. In the implementation,
the metric may be the square of the Euclidean distance in the RGB
color space. Finding the assignment of colors to positions to
minimize this cost function is a combinatorial optimization
problem. Finding the global minimum of this cost function is
computationally expensive, and so a greedy algorithm can be used
which is fast, but is not guaranteed to find the global minimum. A
greedy algorithm estimates the solution of a problem by making the
locally optimal choice at each stage with the hope of finding the
global optimum. In one or more embodiments of the invention, a
color palette has multiple colors. A circular doubly linked data
structure has nodes which may contain information describing each
color's coordinates in color space for example. A greedy algorithm
may be employed to sort the circular doubly linked data structure,
re-ordering the colors so that colors perceived to be most similar
are adjacent. Once the greedy algorithm completes the
sorting process, the flywheel display may then be populated with
the color in the first node filling the first screen region in the
flywheel display, the color in the second node filling the second
screen region in the flywheel display, and so forth.
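As a sketch, the ring cost above, with the squared-Euclidean RGB metric the text suggests, might be computed as follows (function names are illustrative):

```python
def distance(c1, c2):
    # Square of the Euclidean distance between two RGB triples.
    return sum((a - b) ** 2 for a, b in zip(c1, c2))

def total_cost(colors):
    # COST = sum over i = 0..N-1 of <color(i), color((i+1) mod N)>,
    # i.e., the sum of distances between neighboring colors around
    # the circular flywheel, including the wrap-around edge.
    n = len(colors)
    return sum(distance(colors[i], colors[(i + 1) % n]) for i in range(n))
```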
[0056] FIG. 1D illustrates an example method for ordering colors in
one or more embodiments of the invention. At block 150, a circular
doubly linked flywheel data structure is created which may hold "N"
records. A doubly linked flywheel data structure may consist of a
sequence of nodes, each containing data fields and references
pointing to the next and previous nodes. Each of the nodes in the
flywheel data structure may be associated with a screen region on
the flywheel display for example. Each of the colors in a palette
may be associated with each of the nodes in the flywheel data
structure. In one or more embodiments of the invention, the
coordinates in color space for each color may be associated with
the flywheel data structure. Thus, for a given color palette, each
color will be associated with a node in the circular doubly linked
flywheel data structure.
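One way to sketch such a circular doubly linked structure (class and function names are illustrative assumptions):

```python
class ColorNode:
    """Node of a circular doubly linked flywheel data structure; each
    node holds a color's coordinates in color space plus references
    to its previous and next neighbors."""
    def __init__(self, rgb):
        self.rgb = rgb
        self.prev = None
        self.next = None

def build_flywheel(colors):
    """Link one node per color into a circle; node i corresponds to
    the i-th screen region on the flywheel display."""
    nodes = [ColorNode(c) for c in colors]
    n = len(nodes)
    for i, node in enumerate(nodes):
        node.next = nodes[(i + 1) % n]
        node.prev = nodes[(i - 1) % n]
    return nodes[0]  # any node serves as an entry point into the circle
```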
[0057] At block 151, variable integer "n" is set to 1. At block
152, the Local Cost is calculated for colors n-1, n, n+1, and n+2.
The Local Cost represents the change in total cost that would result
from swapping color(n) and color(n+1): it is calculated by adding
the distance in color space between color(n-1) and color(n+1) to the
distance in color space between color(n) and color(n+2), then
subtracting the distance in color space between color(n-1) and
color(n) and subtracting the distance in color space between
color(n+1) and color(n+2). In one or
more embodiments of the invention, the calculation may be executed
in a tangible memory medium or a microprocessor-based computer for
example.
[0058] At block 153, the value of Local Cost may be considered.
Should the value of the Local Cost be less than zero ("0"), the
flow may divert to block 154. Should the value of the Local Cost be
not less than zero, the flow will divert to block 155. At block
154, the order of color(n) and color(n+1) may be swapped. At block
155, the variable integer n is incremented by one. At block 156,
the value of "n" is compared to
that of "N." When the value of "n" exceeds the value of "N," the
process flow is diverted to block 158. When the value of "n" does
not exceed the value of "N," the process flow is diverted to block
157. At block 157, if the accumulated time for the process exceeds
a timeout value, the process flow is diverted to block 158. If the
accumulated time for the process does not exceed a timeout value,
the process flow is diverted to block 152.
[0059] At block 158, the flywheel display may be displayed. The
first screen region may be filled with color(1), the second screen
region filled with color(2), and so forth, for example. In one or more
embodiments of the invention, the flywheel may be displayed on a
display monitor for example.
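The pass of blocks 151 through 157 can be sketched as follows. This version assumes at least four colors, uses the squared-Euclidean metric, and repeats the pass until no swap lowers the cost or the timeout expires; all names are illustrative:

```python
import time

def dist(c1, c2):
    # Squared Euclidean distance in RGB space.
    return sum((a - b) ** 2 for a, b in zip(c1, c2))

def greedy_sort(colors, timeout=1.0):
    """Greedy adjacent-swap pass over a circular color list. A swap of
    color(n) and color(n+1) is made whenever the resulting change in
    total ring cost (the Local Cost) is negative."""
    cols = list(colors)
    n = len(cols)
    start = time.monotonic()
    improved = True
    while improved and time.monotonic() - start < timeout:
        improved = False
        for i in range(n):
            a = cols[(i - 1) % n]
            b = cols[i]
            c = cols[(i + 1) % n]
            d = cols[(i + 2) % n]
            # Change in total cost if b and c were swapped; the edge
            # between b and c itself is unaffected by the swap.
            local_cost = (dist(a, c) + dist(b, d)) - (dist(a, b) + dist(c, d))
            if local_cost < 0:
                cols[i], cols[(i + 1) % n] = cols[(i + 1) % n], cols[i]
                improved = True
    return cols
```

The result contains the same colors, re-ordered so that neighbors on the ring are similar; it is a local optimum, not necessarily the global minimum.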
[0060] Color Selection History
[0061] As previously mentioned, the user interface has the ability
to populate option groups on a context-dependent basis. In the
makeup example, color palettes and the associated colors are
populated based on the specific area of the subject image that
colors are intended to be applied to. In this, and any other
embodiments of the invention, the circular region 103 is capable of
persistently remembering selected options during navigation of
other palettes. So if a user were to pick three colors from one
palette under `lipstick,` subsequently change palettes and pick
three more colors, all six would be displayed as recent selections
in the history. This persistence is capable of surviving during
navigation of other areas of the interface. As previously mentioned
the palette and option group tab 101 and flywheel display segments
102a through 102ff are repopulated with new options upon the
selection of a new detail. The navigation to a new region, for
example `lip liner,` would create a new `history` to repopulate,
and previous selections from the lipstick would no longer display
in circular region 103. A new history would be created based on
user selections under lip liner. However, should the user return to
the lipstick detail, the palette and option group tab 101 and
flywheel display segments 102a through 102ff would be repopulated,
and the previous history associated with that detail would once
again be displayed. This allows for a user to navigate through
extensive options and sub options, while keeping the histories for
each section separate and persistent through navigation without
requiring separate instances of the interface for each detail being
used. It will be apparent to the reader that the use of histories
that are connected to each group of palettes and color groups can
be applied to any embodiment where multiple option groups are
available depending on the context of selection to be made. For
example, wall colors retained in a history would be persistent and
remembered through navigation of other areas of home decoration
such as wallpaper options.
[0062] The interface is also able to recognize whether an option
being selected by a user is already present in the recently selected
screen regions 105-112 of the history, and is able to reorder the selections
within the history region to represent the new selection order
without duplication of the selection in the history or the
unnecessary dropping off of a selection from the recently selected
screen region 112.
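The history behavior just described, including reordering an already-present selection rather than duplicating it, might be sketched like this (the class and method names are assumptions):

```python
from collections import deque

class SelectionHistory:
    """Models the currently selected region (104) plus the recently
    selected regions (105-112): eight recent slots, with the oldest
    entry dropping off region 112 when the history is full."""
    def __init__(self, capacity=8):
        self.current = None
        self.recent = deque(maxlen=capacity)

    def select(self, option):
        if option == self.current:
            return  # re-applying the current option changes nothing
        if option in self.recent:
            self.recent.remove(option)  # reorder instead of duplicating
        if self.current is not None:
            self.recent.appendleft(self.current)  # 104 rotates into 105
        self.current = option
```

Keeping one such history per detail (lipstick, lip liner, and so on) would give the separate, persistent per-detail histories described in paragraph [0061].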
[0063] FIG. 2 represents the use of the interface in an embodiment
relating to colors being applied to a subject image. At step 201
the interface obtains the palette selection and populates flywheel
display segments 102a through 102ff with the colors associated with
the selected palette. At step 202, the selection of a color from
flywheel display segment 102a applies said color to a subject
image, and displays the color in the currently selected region of
104. At step 203, the method then proceeds through decisions
relating to the history region of the interface. At step 204 the
interface determines whether or not a history needs to be built, or
whether it already exists. A history will need to be built if more
than one color has been selected from at least one of the palette
options provided as described above. In the event that a history
does not exist and is required, at steps 205 through 206 the
interface can perform the application of a newly selected color to
the image, while representing the currently selected color in the
current selected region of 104, while moving the last selected
color to the recently selected regions of 105. In the event that a
history already exists a further decision must be made depending on
whether the history regions have been filled with options already
or otherwise. This decision occurs at step 207. Steps 208 and 209
occur in the event that the history has not already been filled, and
performs the application of a newly selected color to the image,
rotating the color history around circular region 103 to make room
for a new addition to the list. Steps 210 through 212 occur in the
event that the history region has already been filled, and performs
the same function above but instead removing the earliest color
from the list in order to free up a section. For example, if
recently selected screen regions 105-112 were filled with color
selections and a user selects a new color, the new color is
displayed in currently selected screen region 104 as currently
applied, while moving the last selected to 105. All of the other
recently selected colors rotate around, with the color occupying
recently selected screen region 112 being removed from the list to
make room. At the end of the decisions at 206, 209 or 212, the
interface then loops the decision process.
[0064] While the following paragraph may appear duplicative, it and
the accompanying figure are intended to illustrate a higher level of
abstraction at which this invention can operate, with a number of
options, option groups, and history interfaces dependent on the
context in which the invention is applied. At step 301 the interface
obtains the option group
selections and populates flywheel display segments 102a through
102ff with the options associated with the selected option group.
At step 302, the selection of an option from flywheel display
segment 102a applies said option to the associated subject, and
displays the selected option, or its relevant representation, in
the `currently selected` screen region of 104. At step 303, the
method then proceeds through decisions relating to the history
region of the interface. At step 304 the interface determines
whether or not a history needs to be built, or whether it already
exists. A history will need to be built if more than one option has
been selected from at least one of the option group options
provided as described above. In the event that a history does not
exist and is required, at steps 305 through 306 the interface can
perform the application of a newly selected option to the
associated subject, while representing the currently selected
option in the current selected region of 104, while moving the last
selected option to the recently selected regions of 105. In the
event that a history already exists a further decision must be made
depending on whether the history regions have been filled with
options already or otherwise. This decision occurs at step 307.
Steps 308 and 309 occur in the event that the history has not
already been filled, and performs the application of a newly
selected option to the subject, rotating the option history around
circular region 103 to make room for a new addition to the list.
Steps 310 through 312 occur in the event that the history region
has already been filled, and performs the same function above but
instead removing the earliest option from the list in order to free
up a section. For example, if recently selected screen regions
105-112 were filled with option selections and a user selects a new
option, the new option is displayed in currently selected screen
region 104 as currently applied, while moving the last selected to
105. All of the other recently selected options rotate around, with
the option occupying recently selected screen region 112 being
removed from the list to make room. At the end of the decisions at
306, 309 or 312, the interface then loops the decision process.
[0065] As touched on above, FIG. 4 shows an embodiment of the
invention in the context of a hypertext page where the options and
option groups presented and selectable by the user are presented
through the internet to a client computer. In the example provided
in FIG. 4, an image is presented to the application and areas of
the picture for application of colors are identified. In this case,
the eyes, skin and mouth are identified as being areas for color
application and associated with the selection regions of 401, 402
and 403. These areas can be further separated into details of color
applications, for example the selection of eyes would allow the
user to further select eye shadow, mascara, and eyeliner options,
whereas selection of the mouth would allow the user to further
select lipstick, lip liner and lip gloss. Upon selection of an
identified area, the palette groups are provided at 405. Selection
of the mouth region on 401 populates region 405 with the detail
selections of lipstick, lip liner and lip gloss. Further selection
of one of these options from 405 populates the color selection
interface at 406 which has been described above in FIG. 1. The
selection of a color from 406 applies the color to the subject
image at 407, while populating the color history regions of 103 as
described fully above. It will be apparent that there are many
embodiments of this interface, whether operated on a local system
or through a computer network, and where the subject having options
applied may be other than a facial image having makeup applied. In
a home development embodiment for example the subject image would
be a graphical representation of a room, where the subject
selections of 401, 402 and 403 could represent the walls, floor,
and ceiling of said room, and further options would allow a user to
select wallpaper, paint and fabric. It will also be apparent to the
reader that the interface is not limited to a set of three options
in each category, but that any plurality of option groups and
options can be presented to the user, and that other embodiments of
this invention other than those described in example could be
used.
[0066] Context Sensitive Menus
[0067] FIG. 5 describes the use of the interface within the context
of the application. A subject image is presented to the application
at 501, which is then analyzed by the application (see
"Personalized Makeovers" co-pending patent application) and areas
of the image are identified at step 502. Steps 503-507 obtain the
options and option groups to be presented to the user through the
interface, which are then provided and selected based on the area
of image where options are to be applied. Upon the selection of an
option by the user the application processes both the subject and
the options at step 508 and presents the user with the resulting
output at step 509. The steps are repeated for each subject
presented to the application, and each option selected by a user.
In our makeup example, an image of a face is provided to the
application which is then analyzed into areas at 502. Option
groups, such as the lipstick, lip liner, eye shadow and foundation
options are collected and presented at 507 to the user. Upon
selection of a particular makeup type and color the application
processes the selection at 508, and presents an output of the image
with the selected colors and makeup types selected at 509.
[0068] When an image is provided to the system, the system processes
the image to identify the various features within the image. This is
achieved using facial recognition or other image processing
algorithms, or the features may be located manually by the user. Once the features
within an image are located the system is then able to take actions
based on the user input. The image in this example is a face, which
has been divided into sections for application of makeup. The eyes,
mouth, and skin have been identified as areas for application, and
assigned as `hotspots` within the image. These `hotspots` allow the
user to directly interact with the image being modified and apply
options relevant to the hotspot. For instance, blush is applied to
the cheeks and lipstick to the lips.
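A hotspot-to-menu mapping of the kind described could be sketched as follows; the mapping contents and names are illustrative assumptions:

```python
# Hypothetical mapping from identified hotspots to the context-menu
# actions they expose.
HOTSPOT_ACTIONS = {
    "eyes": ["eye shadow", "mascara", "eyeliner"],
    "mouth": ["lipstick", "lip liner", "lip gloss"],
    "skin": ["foundation", "concealer", "blush"],
}

def context_menu(hotspot):
    """Return the context-sensitive menu for a clicked hotspot; a click
    on a region with no specific identifier falls back to options
    applicable to the entire image."""
    return HOTSPOT_ACTIONS.get(hotspot, ["whole-image options"])
```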
[0069] The location of each hotspot is determined by the system
once the facial features are identified by the system or user. The
eyes, lips, cheeks, eyelashes, or any other facial features have an
associated set of actions. These hotspots, which may differ from
image to image, are based on the facial or image recognition system
identifying the features. In some cases the user confirms or
adjusts the first attempt the system makes at identifying the
features. In one or more embodiments of the invention, when the
user clicks the mouse over a hotspot the system presents actions
that can be performed on the part of the image associated with the
hotspot. In one or more embodiments of the invention, commands that
can be performed on the part of the image associated with the
hotspot may be activated by other means such as a user touching a
touch screen or through the use of a light pen for example. This
provides the user with context sensitive menus that are based on
different parts of the image having been given a feature
classification by the system. An eyelid is thus identified as such,
as is every other feature within the image.
[0070] The system uses this feature information to present commands
to the user that are relevant to the feature. This means that a
user, using an activation command, most commonly a left click on a
mouse pointer device, is able to access a menu wherein the menu
relates to the area being clicked. In one or more embodiments of
the invention, commands are presented when a user clicks the right
or other buttons on a computer mouse. In this example, right
clicking on the area of an image that was identified as being eyes
presents an eye-related context menu. The user may then select one
of the operations on the menu and perform direct manipulation of
the image. A right click on a part of the image that has no
specific identifier would present the user with options that are
applicable to the entire image. These context sensitive menus, and
the options they provide, are associated with the particular image
upon that image being processed and shown to the user through the
interface provided.
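The hotspot-to-menu dispatch described above can be sketched as follows; the feature names, bounding-box representation, and menu commands are illustrative assumptions rather than details from the specification:

```python
# Minimal sketch of context-sensitive menus keyed on hotspot feature
# classification. Feature names and commands are illustrative only.

# Commands applicable to the whole image when no hotspot is hit.
DEFAULT_COMMANDS = ["adjust brightness", "reset all makeup"]

# Per-feature context menus.
FEATURE_COMMANDS = {
    "lips": ["apply lipstick", "apply lip liner"],
    "eyes": ["apply eye shadow", "apply eye liner"],
    "cheeks": ["apply blush"],
}

def hotspot_at(x, y, hotspots):
    """Return the feature whose bounding box contains (x, y), if any."""
    for feature, (left, top, right, bottom) in hotspots.items():
        if left <= x <= right and top <= y <= bottom:
            return feature
    return None

def context_menu(x, y, hotspots):
    """Return the menu to present for an activation event at (x, y)."""
    feature = hotspot_at(x, y, hotspots)
    return FEATURE_COMMANDS.get(feature, DEFAULT_COMMANDS)
```

A click landing inside the `lips` bounding box would surface the lip-related menu, while a click on an unclassified region falls back to the whole-image commands.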
[0071] Look History/Saving
[0072] In the examples given, a number of beauty products can be
applied to an image to create a personalized makeover. The products
are applied to the specific areas of the image to which they
correspond: lipstick and lip liner to the lip regions of the image, eye shadow and
liner to the eyes, and so on until a full set of makeup has been
created by the user in this personalized makeover. This completed
set of selections made by a user and applied to the subject image
is then a complete "look." The term look is used herein as a noun
to describe the overall style and appearance of a
subject, such as a person, personal fashion, or interior
decoration. The term makeover is used as a noun to describe the
creation of a new look to improve attractiveness in a subject,
conform to societal fashions, or simply modify the appearance. As
such, giving a subject image a makeover through this interface
creates a new `look` which is made up of the options selected by a
user through this interface and applied to the subject image. For
example, this look could contain a specific style and color of eye
shadow for application, a specific color and amount of foundation
to apply, specific colors of blush and thickness of application,
and so on until all desired makeups are specified and given color
and quantity values within the look.
[0073] When a user creates a look by making various makeup choices
and applying those choices to an image, the resulting look can be
saved and later applied to a different image or to the
same image. Since a `look` is the collection of options selected by
a user from the option selection interface, the `look` can be saved
and stored independently of the subject image used to produce the
look. Also, `looks` can be predetermined and applied to any number
of subject images. For example, if a user wishes to apply a `high
fashion look` to their own face, the user can provide their own
image at 601, which is processed by the system (further described
and referenced in the "Personalized Makeovers" copending patent
application). In processing, the areas of the image are identified
to correspond with the mouth, eyes, skin areas and other parts of
the image. The high fashion `look` contains data relating to the
beauty products used to create the look, such as a specific brand
and color of eye shadow, lipstick and so on. This data is then
applied over the subject image to create the look. In other
embodiments, the look could, in the context of interior decoration,
contain furniture types with an associated style, wall color and
texture with associated RGB color values and/or a texture image,
and door types. It will be apparent to the reader that the concept
of a saved look can relate to any number of options selected from
option groups associated with a subject, presented to the user, and
applied to the subject.
[0074] Alternatively, looks can be created by a user, and
subsequently saved, and selected by another user for application to
a new subject image. In this arrangement, the image shown at FIG. 4 element 407
is an active rendering of the subject image, with areas of the
image identified, and the `look` data applied over the identified
areas of the subject image. Once the user has made selections that
he or she is happy with, the `look` is saved and associated with
the subject image. Other users are capable of viewing this database
and determining which `looks` they like, and can then apply the look
to a subject image supplied by them.
[0075] The information that defines a look is stored in a data
structure that defines the color values and region information
associated with each item that is needed to create the look. The
color values are those the user selected while creating the look
and the region information defines what part of the image the
colors should be applied against. The various features identified
during facial detection are used to define the regions for
application of the color values. This information is saved and can
later be applied to any image even if the look was created using a
different image.
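A minimal sketch of such a look data structure, stored independently of any subject image, might take the following form; the field names and JSON serialization are assumptions for illustration, not the patent's actual storage schema:

```python
import json
from dataclasses import dataclass, asdict

# Sketch of a saved "look": per-product color values plus the facial
# region each color applies to. Field names are illustrative only.

@dataclass
class LookItem:
    product: str      # e.g. "lipstick"
    region: str       # facial region from feature detection
    rgb: list         # selected color values
    opacity: float    # application strength, 0.0-1.0

def save_look(items, path):
    """Serialize a look independently of any subject image."""
    with open(path, "w") as f:
        json.dump([asdict(item) for item in items], f)

def load_look(path):
    """Load a saved look for application to any image."""
    with open(path) as f:
        return [LookItem(**record) for record in json.load(f)]
```

Because the structure records region names rather than pixel coordinates, the same saved look can be applied to any image whose features have been identified.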
[0076] A method for applying a look to an image in accordance with
one or more embodiments of the invention is shown in FIG. 6. The
look attributes or options such as what makeup was used to create
the look are saved independent of the image. At step 601, an image
is provided to the system and analyzed by the system, where areas
for option application such as color values are identified. The
features within the image are identified and the user is presented
with an interface for defining what makeup to apply to which
features. Choices are made for instance as to what lipstick,
eye-shadow, foundation, blush, or other makeup items make a look.
At step 602 the look is created and applied to the subject image.
At step 603 the look is saved in a structure that contains the
chosen combination of beauty products, hair styles, and accessories,
and is associated with the uploaded image. A second user
can review the interface and see uploaded images with the saved
looks applied at step 604. The second user can then upload a new
image at step 605, which is further given a saved look to apply to
the image in the same manner as the previous image at step 606.
[0077] FIG. 7 shows the image being provided to the system, in
which the user can select an already saved `look` to apply to their
image, or start on their own and create a new look for the uploaded
image. At step 701, the image may be uploaded or selected by a
user. At step 702, the image is analyzed and the image is broken
into sections for the option or look application. At step 703, the
system determines whether the user wishes to apply a saved look to an
image or to allow the user to create their own look through
the system of applying options from option groups described above.
The user is then presented with the interface shown in FIG.
4, where the image at 407 can either be unmodified and ready for
option application at step 704, or can be presented as an image
rendered with options from a saved look at step 705. The rendered
image is presented to the user at step 707.
[0078] The look is applied to the image as layers over the image
using the order generally used during the application of makeup.
Foundation for instance is applied before blush, etc. In one or
more embodiments of the invention the process of applying a
makeover is as follows: a makeover is represented as a base image
and a set of layers that may follow the order of application of
real-life makeup. Rendering proceeds by applying the lowest layers up
through the highest layers. The low layers would include concealer
and foundation. Blush might be applied on top of this. A hair image
layer may sit at the top, with reasonable choices made by the
system about which layer is best to apply next. The makeup
associated with the look has an applied area which is a defined
style. This can be represented as a mask which might be binary or
it might be an alpha mask containing numerical values. The mask
shape may be predefined. It might be warped by a user. For example,
multi-toned eye shadow may be represented by three masks, one for
each of three different eye shadow products, and these will be rendered in
order. The mask shapes are generally different. The user may modify
the masks so as to, for example, fall on the crease of the eye and
extend to the eye brow.
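The ordered-layer representation described above can be sketched as follows; the layer names and their bottom-up order are illustrative assumptions mirroring real-life application:

```python
# Sketch of a makeover as an ordered set of layers, each pairing a
# product with a mask. Names and ordering are illustrative only.

APPLICATION_ORDER = ["concealer", "foundation", "blush",
                     "eye_shadow", "lipstick", "hair"]

def ordered_layers(layers):
    """Sort layers lowest-first so rendering proceeds bottom-up.

    Each layer is a dict with at least a "name" key; in a full system
    it would also carry a mask and the product's parameters.
    """
    return sorted(layers,
                  key=lambda layer: APPLICATION_ORDER.index(layer["name"]))
```

Rendering then walks the sorted list from the first element (lowest layer) to the last, matching the bottom-up order described in the text.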
[0079] Products like foundation, tanners, or bronzers might use an
automatically constructed mask based on automatic skin detection as
described elsewhere, or a predefined mask whose shape and location
are referenced to automatically or manually detected face features.
[0080] A rendering process for each layer takes the partially
rendered image to that point and renders the new layer to the
region defined by the mask. The output of applying that layer will
be a function of the makeup properties including its color,
transparency, luster, gloss, metals content, and the underlying
image color. There are different ways to model how each type of
makeup is applied. One method is called alpha blending.
Others use the separation of specular from diffuse reflectance and
apply colors to these components. Others use shape models of the
face to determine specular components. Once a layer is applied, it
is not modified by layers above.
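Alpha blending, the first compositing method named above, can be sketched as follows, assuming NumPy float arrays for images and a per-pixel alpha mask in [0, 1]; the flat-color layer and the array shapes are illustrative assumptions, and the system's actual blending model may differ:

```python
import numpy as np

# Alpha blending of one makeup layer over the partially rendered
# image, restricted to the region defined by the layer's mask.

def apply_layer(base, color, mask):
    """Blend a flat makeup color over `base` where `mask` is nonzero.

    base:  H x W x 3 float array, the partially rendered image
    color: length-3 sequence, the makeup RGB color
    mask:  H x W float array in [0, 1], per-pixel opacity
    """
    alpha = mask[..., None]  # add a channel axis so it broadcasts
    return (1.0 - alpha) * base + alpha * np.asarray(color, dtype=float)
```

Where the mask is 0 the base image is untouched; where it is 1 the makeup color fully replaces the pixel, with intermediate values giving partial transparency.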
[0081] The masks and their positions, and the product parameters
(color, transparency, luster, gloss, metal content, and flecks) are
called a look. This look may be saved to a disk or other storage
device. The look may be loaded and applied to an original image.
The look may be applied to a second image of the same person. For
this to be effective, the underlying facial features from the first
image must be brought into correspondence with the second image.
This correspondence induces a transformation of the masked regions.
The foundation and bronzer layer would still use the automatically
detected skin mask. The subsequent layers would be transformed
according to the correspondence.
[0082] Fast rendering can be accomplished using rendering at
multiple resolutions and backgrounding. Each layer is rendered in
turn at low resolution, then higher resolutions until the full
resolution rendering proceeds. In an interactive setting such as a
makeup editor, a user may be modifying the makeover (mask
positions) faster than the rendering can terminate when at full
resolution. With multi-resolution rendering with backgrounding, if
a mouse event causes a movement of a layer, the rendering will
terminate at whatever resolution it has reached and rendering would
commence at the new position starting at the lowest resolution.
This provides a high degree of interactivity. As the defined
`looks` can be considered separate but still associated with a
subject presented to the system, it is possible for the same
subject to have multiple looks applied to it and viewed at a
glance. Conversely, it is also possible for the same look to be
applied to multiple images. FIG. 8 shows, in the context of our
makeup example, an interface allowing for the `at-a-glance` viewing
of an image with multiple applied looks. The original subject
image, without any application, is shown at 801. A copy of the
image with a specific applied look is shown at 802. This look would
be a user-created, user-saved look containing details of the
beauty product types and options being applied to the image. A
subsequent copy of the image is shown at 803 with a different look
attached. This allows a user to compare images with various makeup
selections (options) applied and make a personal selection based on
the images presented to them.
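The multi-resolution rendering with backgrounding described above can be sketched as a loop over increasing resolutions that aborts when a user event arrives; the resolution schedule and the callback interfaces are illustrative assumptions:

```python
# Progressive rendering: each pass renders at a higher resolution,
# and a user event (e.g. a mask being dragged) aborts the sequence so
# rendering can restart at the lowest resolution for the new position.

RESOLUTIONS = [64, 256, 1024]  # illustrative pixel widths, low to high

def progressive_render(render_at, interrupted):
    """Render low-to-high resolution, stopping early if interrupted.

    render_at:   callable taking a resolution and producing a frame
    interrupted: callable returning True once a user event occurred
    Returns the last frame completed before interruption, or the
    full-resolution frame if no event occurred.
    """
    frame = None
    for resolution in RESOLUTIONS:
        if interrupted():
            break  # caller restarts at the lowest resolution
        frame = render_at(resolution)
    return frame
```

Because a coarse frame is always available within one low-resolution pass, the user sees immediate feedback while dragging, and the full-resolution result fills in once input goes quiet.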
[0083] Computer System Aspect
[0084] One or more embodiments of the invention may be implemented
in the form of one or more computer programs that when executed in
computer memory may cause one or more computer processors to
initiate the methods and processes described herein. The files
assembled to make up the software program may be stored on one or
more computer-readable media and retrieved when needed to carry
out the programmed methods and processes described herein. Within
the scope of a computer-implemented embodiment of the invention,
readers should note that one or more embodiments of the invention
may comprise computer programs, data and other information further
comprising but not limited to: sets of computer instructions, code
sequences, configuration information, data and other information in
any form, format or language usable by a general purpose computer
or other data processing device, such that when such a computer or
device contains, is programmed with, or has access to said computer
programs, the data and other information transforms said general
purpose computer into a machine capable of performing the methods
and processes described herein, and specifically such as those
described above.
[0085] Various embodiments of the invention may be implemented as a
method, apparatus, or article of manufacture using standard
programming and/or engineering techniques to produce software,
firmware, hardware, computer-readable media or any combination
thereof. The term "article of manufacture" (or alternatively,
"computer program product,") as used herein is intended to
encompass a computer program of any form accessible from any
computer-readable device, carrier or media. In addition, the
software in which various embodiments are implemented may be
accessible through a transmission medium, such as, for example, from
a server over a network. The article of manufacture in which the
program is implemented may also employ transmission media, such as
a network transmission line and/or a wireless transmission media.
Those skilled in the art will recognize that many modifications may
be made to this configuration without departing from the scope of
the invention.
[0086] A computer-readable medium suitable to provide computer
readable instructions and/or computer readable data for the methods
and processes described herein may be any type of magnetic,
optical, electrical or other storage medium including disk, tape,
CD, DVD, flash drive, thumb drive, storage card, distributed
storage or any other memory device, location, approach or other
storage medium or technique known to those of skill in the art.
[0087] In one or more embodiments of the invention, the methods
described here may not be limited as to the type of computer they may
run upon and may, for instance, operate on any generalized computer
system that has the computational ability to execute the methods
described herein and can display the results of the user's choices
on one or more display devices. Display devices appropriate for
providing interaction with the invention described herein include,
but are not limited to, computer monitors, cell phones, PDAs,
televisions, or any other form of computer controllable output
display. As used herein, a computer system refers to but is not
limited to any type of computing device, including its associated
computer software, data, peripheral devices, communications
equipment and any required or desired computers that may achieve
direct or indirect communication with a primary computing
device.
[0088] In one or more embodiments of the invention, a
general-purpose computer may be utilized to implement one or more
aspects of the invention. In one or more embodiments of the
invention, the computer may include various input and output means,
including but not limited to a keyboard or other textual input
devices, a display device such as a monitor or other display
screen, and a pointing device and/or user selection indicator such
as a mouse, keypad, touch screen, or other
input/output devices known to those of skill in the art. The
general purpose computer described herein may include one or more
banks of random access memory, read only memory, and one or more
central processing unit(s). The general purpose computer described
herein may also include one or more data storage device(s) such as
a hard disk drive, or other computer readable medium discussed
above. An operating system that executes within the computer memory
may provide an interface between the hardware and software. The
operating system may be responsible for managing, coordinating and
sharing of the limited resources within the computer. Software
programs that run on the computer may be executed by the operating
system, which provides the program of the invention with access to the
resources needed to execute. In other embodiments the program may
run stand-alone on the processor to perform the methods described
herein.
[0089] In one or more embodiments of the invention, the method(s)
described herein, when loaded on or executing through or by one or
more general purpose computer(s) described above, may transform the
general purpose computer(s) into a specially programmed computer
able to perform the method or methods described herein. In one or
more embodiments of the invention, the computer-readable storage
medium(s) may be encoded with computer program instructions that, when
accessed by a computer, cause the computer to load the program
instructions into an accessible memory, thereby creating a
specially programmed computer able to perform the methods described
herein.
[0090] The specially programmed computer of the invention may also
comprise a connection that allows the computer to send and/or
receive data through a computer network such as the Internet or
other communication network. Mobile computer platforms such as
cellular telephones, Personal Desktop Assistants (PDAs), other
hand-held computing devices, digital recorders, wearable computing
devices, kiosks, set top boxes, games boxes or any other
computational device, portable, personal, real or virtual or
otherwise, may also qualify as a computer system or part of a
computer system capable of executing the methods described herein
as a specially programmed computer.
[0091] FIG. 9 depicts a general-purpose computer and peripherals
that, when programmed as described herein, may operate as a specially
programmed computer capable of implementing one or more methods,
apparatus and/or systems of the invention. Processor 907 may be
coupled to bi-directional communication infrastructure 902 such as
Communication Infrastructure System Bus 902. Communication
Infrastructure 902 may generally be a system bus that provides an
interface to the other components in the general-purpose computer
system such as Processor 907, Main Memory 906, Display Interface
908, Secondary Memory 912 and/or Communication Interface 924.
[0092] Main Memory 906 may provide a computer readable medium for
accessing and executing stored data and applications. Display
Interface 908 may communicate with Display Unit 910 which may be
utilized to display outputs to the user of the specially-programmed
computer system. Display Unit 910 may comprise one or more monitors
that may visually depict aspects of the computer program to the
user. Main Memory 906 and Display Interface 908 may be coupled to
Communication Infrastructure 902, which may serve as the interface
point to Secondary Memory 912 and Communication Interface 924.
Secondary Memory 912 may provide additional memory resources beyond
main Memory 906, and may generally function as a storage location
for computer programs to be executed by Processor 907. Either fixed
or removable computer-readable media may serve as Secondary Memory
912. Secondary Memory 912 may comprise, for example, Hard Disk 914
and Removable Storage Drive 916 that may have an associated
Removable Storage Unit 918. There may be multiple sources of
Secondary Memory 912 and systems of the invention may be configured
as needed to support the data storage requirements of the user and
the methods described herein. Secondary Memory 912 may also
comprise Interface 920 that serves as an interface point to
additional storage such as Removable Storage Unit 922. Numerous
types of data storage devices may serve as repositories for data
utilized by the specially programmed computer system of the
invention. For example, magnetic, optical or magnetic-optical
storage systems, or any other available mass storage technology
that provides a repository for digital information may be used.
[0093] Communication Interface 924 may be coupled to Communication
Infrastructure 902 and may serve as a conduit for data destined for
or received from Communication Path 926. A Network Interface Card
(NIC) is an example of the type of device that once coupled to
Communication Infrastructure 902 may provide a mechanism for
transporting data to Communication Path 926. Computer networks such
as Local Area Networks (LAN), Wide Area Networks (WAN), wireless
networks, optical networks, distributed networks, the Internet or
any combination thereof are some examples of the type of
communication paths that may be utilized by the specially programmed
computer system of the invention. Communication Path 926 may
comprise any type of telecommunication network or interconnection
fabric that can transport data to and from Communication Interface
924.
[0094] To facilitate user interaction with the specially programmed
computer system of the invention, one or more Human Interface
Devices (HID) 930 may be provided. Examples of HIDs that
enable users to input commands or data to the specially programmed
computer of the invention include a keyboard, mouse, touch screen
devices, microphones or other audio interface devices, and motion
sensors or the like. Any other device able to accept any kind of
human input and in turn communicate that input to Processor 907 to
trigger one or more responses from the specially programmed computer
is also within the scope of the system of the invention.
[0095] While FIG. 9 depicts a physical device, the scope of the
system of the invention may also encompass a virtual device,
virtual machine or simulator embodied in one or more computer
programs executing on a computer or computer system and acting or
providing a computer system environment compatible with the methods
and processes of the invention. Where a virtual machine, process,
device or otherwise performs substantially similarly to that of a
physical computer system of the invention, such a virtual platform
will also fall within the scope of a system of the invention,
notwithstanding the description herein of a physical system such as
that in FIG. 9.
[0096] One or more embodiments of the invention are configured to
enable the specially programmed computer of the invention to take
the input data given and transform it into an interface enabling
dynamic layout of options or color within a group for application
to an image, arrangement and ordering of colors, color selection
history, context sensitive menus, and look history and saving, by
applying one or more of the methods and/or processes of the
invention as described herein. Thus the methods described herein
are able to transform the raw input data provided to the system of
the invention into a resulting output of the system using the
specially programmed computer as described.
[0097] One or more embodiments of the invention are configured to
enable a general-purpose computer to take one or more color
palettes and the color choices associated with each color palette,
from memory and transform and display the graphical user interface
component on Display Unit 910 for example. The user, through the
interaction with the Human Interface Device 930, enters a selection
of a region of a subject image. Processor 907 receives the
selection of a region of a subject image, transforms the color
palette and color choices data, and transmits the information to
the Display Unit 910 for display. The user may interact with the
computer system through Human Interface Device 930 and may select a
second color choice which causes Processor 907 to process the
information and transmit a signal to the graphical user interface on
Display Unit 910.
[0098] In one or more embodiments of the invention, when a user
selects an identified region of a subject image, Processor 907
transmits data to the Display Unit 910 and enables the user to see
context sensitive menus that are associated with the specific
identified region of the subject image.
[0099] In one or more embodiments of the invention, Processor 907
requests the color records for a particular color palette.
Processor 907 then calculates the arrangement of the colors in a
color flywheel display through the use of a greedy algorithm.
Processor 907 then transmits the signals to the Display Unit 910
where they may be displayed to a user.
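One plausible form of the greedy algorithm mentioned here orders the palette by repeatedly appending the unused color nearest to the last one placed, so adjacent flywheel segments appear most similar; the squared-RGB distance and the starting color are illustrative choices, not details from the specification:

```python
# Greedy arrangement of palette colors for the flywheel: each color is
# followed by its nearest unused neighbor. Distance metric is assumed.

def color_distance(a, b):
    """Squared Euclidean distance between two RGB triples."""
    return sum((x - y) ** 2 for x, y in zip(a, b))

def greedy_arrange(colors):
    """Order colors so each is the nearest unused neighbor of the last."""
    remaining = list(colors)
    arranged = [remaining.pop(0)]  # start from an arbitrary color
    while remaining:
        nearest = min(remaining,
                      key=lambda c: color_distance(arranged[-1], c))
        remaining.remove(nearest)
        arranged.append(nearest)
    return arranged
```

A greedy pass like this does not guarantee a globally optimal ordering, but it is cheap and keeps perceptually similar colors adjacent, which is the stated goal for the flywheel segments.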
[0100] In one or more embodiments of the invention, Human Interface
Device 930 accepts a user's input in which the user may select a
first identified region of a subject image to apply a first color,
and a second identified region of a subject image to apply a second
color. Processor 907 may process the metadata describing the first
color and second color and store that information as a "look" in
Main Memory 906 for example.
[0101] While the invention herein disclosed has been described by
means of specific embodiments and applications thereof, numerous
modifications and variations could be made thereto by those skilled
in the art without departing from the scope of the invention set
forth in the claims.
* * * * *