U.S. patent application number 14/027472, for a face recognition system and method, was published by the patent office on 2014-01-16 as publication number 20140016836. The application is currently assigned to C-TRUE LTD., which is also the listed applicant. The invention is credited to Avihu Meir Gamliel.

United States Patent Application 20140016836
Kind Code: A1
Inventor: Gamliel; Avihu Meir
Publication Date: January 16, 2014
Family ID: 41465533
FACE RECOGNITION SYSTEM AND METHOD
Abstract
Apparatus for face recognition, the apparatus comprising: a face
symmetry verifier, configured to verify symmetry of a face in at
least one image, according to a predefined symmetry criterion, and
a face identifier, associated with the face symmetry verifier, and
configured to identify the face, provided the symmetry of the face
is successfully verified.
Inventors: Gamliel; Avihu Meir (Pardes-Hana, IL)
Applicant: C-TRUE LTD., Rehovot, IL
Assignee: C-TRUE LTD., Rehovot, IL
Family ID: 41465533
Appl. No.: 14/027472
Filed: September 16, 2013
Related U.S. Patent Documents

Application Number   Filing Date    Patent Number
12919076             Aug 24, 2010   8600121
PCT/IB09/52720       Jun 24, 2009
14027472
61133711             Jul 2, 2008
Current U.S. Class: 382/118
Current CPC Class: G06K 9/00241 20130101; G06K 9/00281 20130101; G06K 9/00979 20130101
Class at Publication: 382/118
International Class: G06K 9/00 20060101 G06K009/00
Claims
1. A method for face recognition, the method comprising: a)
verifying symmetry of a face in at least one image, according to a
predefined symmetry criterion based on a geometric property of a
polygon formable by connecting known face elements as appearing in
the image; and b) identifying said face, provided said symmetry of
said face is successfully verified.
2. The method of claim 1, further comprising capturing said image
of said face.
3. The method of claim 1, further comprising detecting said face in
said image.
4. The method of claim 1, further comprising using a skin detection
method, for detecting said face in said image.
5. The method of claim 1, further comprising using a Viola-Jones
detection method, for detecting said face in said image.
6. The method of claim 1, further comprising using a Gabor Filter
based method, for detecting said face in said image.
7. The method of claim 1, further comprising cropping said image,
thereby removing background from said image.
8. The method of claim 1, further comprising resizing said image
into a predefined size.
9. The method of claim 1, further comprising improving a quality of
illumination of said image.
10. The method of claim 1, further comprising using Histogram
Equalization, for improving a quality of illumination of said
image.
11. The method of claim 1, further comprising using an intensity
map, for verifying said symmetry of said face in said image.
12. The method of claim 1, further comprising using a gradient map,
for verifying said symmetry of said face in said image.
13. The method of claim 1, further comprising using a Fourier
Transform phase map, for verifying said symmetry of said face in
said image.
14. The method of claim 1, further comprising measuring symmetry of
each one of a plurality of input images, and selecting said at
least one image of said face among said plurality of input images,
and wherein said measured symmetry of said at least one selected
image of said face is highest amongst said input images.
15. The method of claim 14, wherein said plurality of input images
are at least a part of a video sequence.
16. A method for face recognition, the method comprising: a)
verifying symmetry of a face in at least one first image of said
face, according to a predefined symmetry criterion based on a
geometric property of a polygon formable by connecting known face
elements as appearing in the first image, wherein said first image
is associated with respective data identifying said face; and b)
updating a face database with said first image of said face and
said associated data identifying said face, provided said symmetry
of said face in said first image is successfully verified.
17. The method of claim 16, further comprising: c) verifying
symmetry of said face in at least one second image of said face,
according to said predefined symmetry criterion; and d) identifying
said face in said second image, using said face database, provided
said symmetry of said face in said second image is successfully
verified.
18. A method for face recognition, the method comprising: a)
verifying symmetry of a face in at least one image, according to a
predefined symmetry criterion based on a geometric property of a
polygon formable by connecting known face elements as appearing in
the image; and b) controlling forwarding of said image, according
to a result of said verifying of said symmetry.
19. An apparatus for face recognition, the apparatus comprising: at
least one computer; a face symmetry verifier, implemented on said
at least one computer, configured to verify symmetry of a face in
at least one image, according to a predefined symmetry criterion
based on a geometric property of a polygon formable by connecting
known face elements as appearing in the image; and a face
identifier, associated with said face symmetry verifier,
implemented on said computer and configured to identify said face,
provided said symmetry of said face is successfully verified.
20. An apparatus for face recognition, the apparatus comprising: at
least one computer; a face symmetry verifier, implemented on said
at least one computer and configured to verify symmetry of a face
in at least one first image of said face, according to a predefined
symmetry criterion based on a geometric property of a polygon
formable by connecting known face elements as appearing in the
first image, wherein said first image is associated with respective
data identifying said face; and a face database updater, associated
with said face symmetry verifier, implemented on said at least one
computer and configured to update a face database with said first
image of said face and said associated data identifying said face,
provided said symmetry of said face in said first image is
successfully verified.
21. An apparatus for face recognition, the apparatus comprising: at
least one computer; a face symmetry verifier, implemented on said
at least one computer, configured to verify symmetry of a face in
at least one image, according to a predefined symmetry criterion
based on a geometric property of a polygon formable by connecting
known face elements as appearing in the image; and a forwarding
controller, associated with said face symmetry verifier,
implemented on said at least one computer and configured to control
forwarding of said image, according to a result of said
verification of said symmetry by said face symmetry verifier.
Description
FIELD AND BACKGROUND OF THE INVENTION
[0001] The present invention relates to face authentication and
recognition and, more particularly, but not exclusively to a system
and method for automatic face authentication and recognition, say
for security and surveillance purposes.
[0002] Currently, two popular applications of face recognition
systems are access control and security screening.
[0003] Access control systems are used to authenticate the identity
of individuals before allowing entry into a secure area.
Specifically, the system stores images of personnel authorized to
enter the secure area. When entry is attempted, the person's facial
image is captured, and compared to facial images of authorized
personnel. When a facial image match is detected, entry is
granted.
[0004] For example, U.S. Pat. No. 7,050,608, to Dobashi, filed on
Mar. 7, 2002, entitled "Face image recognition apparatus",
discloses a face image recognition apparatus.
[0005] Dobashi's face image recognition apparatus includes a
registration information holding section in which a reference
feature amount of the face of at least one to-be-recognized person
is previously registered. The feature amount of the face is
extracted from a face image input via an image input section, by
use of a feature amount extracting section.
[0006] A recognition section determines the recognition rate
between the extracted feature amount and the reference feature
amount registered in the registration information holding section.
A feature amount adding section additionally registers the feature
amount extracted by the feature amount extracting section as a new
reference feature amount into the registration information holding
section when it is determined that the determined recognition rate
is lower than a preset value.
[0007] U.S. Pat. No. 7,221,809, to Geng, filed on Dec. 17, 2002,
entitled "Face recognition system and method", discloses a method
of automatically recognizing a human face.
[0008] The method described by Geng includes developing a
three-dimensional model of a face, and generating a number of
two-dimensional images based on the three-dimensional model. The
generated two-dimensional images are then enrolled in a database
and searched against an input image for identifying the face of the
input image.
[0009] Security screening involves capturing images of people in
public places and comparing them to images of persons who are known
to pose security risks. One primary example of security screening
is its use at airport security checkpoints.
[0010] For example, U.S. Pat. No. 5,164,992, to Turk, filed on Nov.
1, 1990, entitled "Face Recognition System", describes a
recognition system for identifying members of an audience.
[0011] The system described by Turk includes an imaging system
which generates an image of the audience and a selector module for
selecting a portion of the generated image. Turk's system further
includes a detection means which analyzes the selected image
portion to determine whether an image of a person is present, and a
recognition module responsive to the detection means for
determining whether a detected image of a person identified by the
detection means resembles one of a reference set of images of
individuals.
[0012] U.S. patent application Ser. No. 10/719,792, to Monroe,
filed on Nov. 21, 2003, entitled "Method for incorporating facial
recognition technology in a multimedia surveillance system",
discloses facial recognition technology integrated into a
multimedia surveillance system for enhancing the collection,
distribution and management of recognition data, by utilizing the
system's cameras, databases, monitor stations, and notification
systems.
[0013] With the system described by Monroe, at least one camera,
ideally an IP camera is provided. This IP camera performs
additional processing steps to captured video. Specifically, the
captured video is digitized and compressed into a convenient
compressed file format, and then sent to a network protocol stack
for subsequent conveyance over a local or wide area network. The
compressed digital video is transported via Local Area Network
(LAN) or Wide Area Network (WAN) to a processor which performs
steps of Facial Separation, Facial Signature Generation, and Facial
Database Lookup.
[0014] U.S. patent application Ser. No. 11/450,581, to Chen et al.,
filed on Jun. 12, 2006, entitled "Three-dimensional face
recognition system and method", describes a three dimensional (3D)
face recognition system.
[0015] Chen's system has a first data storing module for storing
three dimensional (3D) face model data and two dimensional (2D)
face image data, an input unit for inputting 3D face model data and
2D face image data, a signal conversion module for converting
analog data of the 3D face model data and 2D face image data to
digital data, and a second data storing module for storing the
digital data.
[0016] Chen's system further includes a micro-processing module for
analyzing geometric characteristics of points in the 3D face model
data stored in the first and second data storing module to
determine feature points of the 3D face model data, and assigning
different weight ratios to feature points.
[0017] Chen's system further includes a comparison module for
comparing the feature points stored in the first and second data
storing module. The different geometric characteristics are given
different weight ratios, and the comparison module calculates
relativity between the feature points to obtain a comparison
result.
SUMMARY OF THE INVENTION
[0018] According to one aspect of the present invention there is
provided an apparatus for face recognition. The apparatus comprises
a face symmetry verifier, configured to verify symmetry of a face
in at least one image, according to a predefined symmetry
criterion, and a face identifier, associated with the face symmetry
verifier, and configured to identify the face, provided the
symmetry of the face is successfully verified.
[0019] According to a second aspect of the present invention there
is provided an apparatus for face recognition. The apparatus
comprises a face symmetry verifier, configured to verify symmetry
of a face in at least one first image of the face, according to a
predefined symmetry criterion, wherein the first image is
associated with respective data identifying the face, and a face
database updater, associated with the face symmetry verifier, and
configured to update a face database with the first image of the
face and the associated data identifying the face, provided the
symmetry of the face in the first image is successfully
verified.
[0020] According to a third aspect of the present invention there
is provided an apparatus for face recognition, the apparatus
comprising: a face symmetry verifier, configured to verify symmetry
of a face in at least one image, according to a predefined symmetry
criterion, and a forwarding controller, associated with the face
symmetry verifier, and configured to control forwarding of the
image, according to a result of the verification of the symmetry by
the face symmetry verifier.
[0021] According to a fourth aspect of the present invention there
is provided a method for face recognition. The method comprises: a)
verifying symmetry of a face in at least one image, according to a
predefined symmetry criterion, and b) identifying the face,
provided the symmetry of the face is successfully verified.
[0022] According to a fifth aspect of the present invention there
is provided a method for face recognition, the method comprising:
a) verifying symmetry of a face in at least one first image of the
face, according to a predefined symmetry criterion, wherein the
first image is associated with respective data identifying the
face, and b) updating a face database with the first image of the
face and the associated data identifying the face, provided the
symmetry of the face in the first image is successfully
verified.
[0023] According to a sixth aspect of the present invention there
is provided a method for face recognition, the method comprising:
a) verifying symmetry of a face in at least one image, according to
a predefined symmetry criterion, and b) controlling forwarding of
the image, according to a result of the verifying of the
symmetry.
[0024] Unless otherwise defined, all technical and scientific terms
used herein have the same meaning as commonly understood by one of
ordinary skill in the art to which this invention belongs. The
materials, methods, and examples provided herein are illustrative
only and not intended to be limiting.
[0025] Implementation of the method and system of the present
invention involves performing or completing certain selected tasks
or steps manually, automatically, or a combination thereof.
Moreover, according to actual instrumentation and equipment of
preferred embodiments of the method and system of the present
invention, several selected steps could be implemented by hardware
or by software on any operating system of any firmware or a
combination thereof. For example, as hardware, selected steps of
the invention could be implemented as a chip or a circuit. As
software, selected steps of the invention could be implemented as a
plurality of software instructions being executed by a computer
using any suitable operating system. In any case, selected steps of
the method and system of the invention could be described as being
performed by a data processor, such as a computing platform for
executing a plurality of instructions.
BRIEF DESCRIPTION OF THE DRAWINGS
[0026] The invention is herein described, by way of example only,
with reference to the accompanying drawings.
[0027] With specific reference now to the drawings in detail, it is
stressed that the particulars shown are by way of example and for
purposes of illustrative discussion of the preferred embodiments of
the present invention only, and are presented in order to provide
what is believed to be the most useful and readily understood
description of the principles and conceptual aspects of the
invention.
[0028] The description, taken with the drawings, makes apparent to
those skilled in the art how the several forms of the invention may
be embodied in practice.
[0029] In the drawings:
[0030] FIG. 1 is a block diagram illustrating a first apparatus for
face recognition, according to an exemplary embodiment of the
present invention.
[0031] FIG. 2 is a block diagram illustrating a second apparatus
for face recognition, according to an exemplary embodiment of the
present invention.
[0032] FIG. 3 is a block diagram illustrating a third apparatus for
face recognition, according to an exemplary embodiment of the
present invention.
[0033] FIG. 4 is a block diagram illustrating a fourth apparatus
for face recognition, according to an exemplary embodiment of the
present invention.
[0034] FIG. 5 is a flowchart illustrating a first method for face
recognition, according to an exemplary embodiment of the present
invention.
[0035] FIG. 6 is a flowchart illustrating a second method for face
recognition, according to an exemplary embodiment of the present
invention.
[0036] FIG. 7 is a flowchart illustrating a third method for face
recognition, according to an exemplary embodiment of the present
invention.
[0037] FIG. 8 is a flowchart illustrating a fourth method for face
recognition, according to an exemplary embodiment of the present
invention.
[0038] FIG. 9 is a flowchart illustrating a fifth method for face
recognition, according to an exemplary embodiment of the present
invention.
[0039] FIG. 10 is a flowchart illustrating a sixth method for face
recognition, according to an exemplary embodiment of the present
invention.
[0040] FIG. 11 is a flowchart illustrating a seventh method for
face recognition, according to an exemplary embodiment of the
present invention.
[0041] FIG. 12 is a flowchart illustrating an eighth method for
face recognition, according to an exemplary embodiment of the
present invention.
[0042] FIG. 13 illustrates cropping of an image of a face,
according to an exemplary embodiment of the present invention.
[0043] FIGS. 14a, 14b, and 14c illustrate a face recognition
scenario, according to an exemplary embodiment of the present
invention.
DESCRIPTION OF THE PREFERRED EMBODIMENTS
[0044] The present embodiments comprise an apparatus and method for
recognizing a face in one or more images (say a still image, a
sequence of video images, etc.).
[0045] According to an exemplary embodiment of the present
invention, a database of faces which belong to known individuals
(say to criminals, or to authorized users of a classified
information system) is used to store images of the faces of the
known individuals.
[0046] Optionally, the database of faces instead stores one or more
features extracted from each of the faces, say a biometric stamp
extracted from the image of each of the faces, using conventional
biometric methods, as known in the art.
[0047] Preferably, a face as captured in the image has to comply
with a symmetry criterion defined in advance, before the image
(bearing the face) is forwarded for storage in the face database,
as described in further detail hereinbelow.
[0048] Optionally, the symmetry criterion pertains to a statistical
model run over previously received images.
[0049] For example, the symmetry criterion may be based on a degree
of deviation of the image (and thus the face captured in the image)
from an average image, as known in the art. The average image is
calculated from the previously received images, using methods known
in the art. In the average image, each pixel's intensity equals an
average of intensities of pixels in the same position in the
previously received images. The average image is likely to be
symmetric. Consequently, a comparison made between the average
image and a captured image is indicative of the degree of symmetry
of the captured image.
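The average-image criterion described above may be sketched as follows. This is an illustrative Python sketch, not the application's implementation: images are modeled as small 2-D lists of pixel intensities, and the threshold value is an assumption chosen for the example.

```python
# Sketch of the average-image symmetry criterion: build a pixel-wise average
# of previously received images, then score a captured image by its mean
# absolute deviation from that average. The threshold is an assumed value.

def average_image(images):
    """Pixel-wise mean of equally sized intensity images."""
    rows, cols = len(images[0]), len(images[0][0])
    avg = [[0.0] * cols for _ in range(rows)]
    for img in images:
        for r in range(rows):
            for c in range(cols):
                avg[r][c] += img[r][c] / len(images)
    return avg

def deviation(image, avg):
    """Mean absolute per-pixel deviation from the average image."""
    rows, cols = len(avg), len(avg[0])
    total = sum(abs(image[r][c] - avg[r][c])
                for r in range(rows) for c in range(cols))
    return total / (rows * cols)

def meets_symmetry_criterion(image, avg, threshold=10.0):
    """Accept the image when it deviates little from the average image."""
    return deviation(image, avg) <= threshold
```

Because the average image is likely to be symmetric, a small deviation from it indicates a largely symmetric captured face.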
[0050] Optionally, the symmetry criterion is based on a comparison
made between the image bearing the face and one or more images
previously captured from the same user. That is to say that the
face of the user in the image is compared with the face of the same
user, as captured in previously received image(s), or with an
average image calculated from previously received images of the
same user. The average image thus bears an average of the face, as
described in further detail hereinabove.
[0051] Optionally, the symmetry criterion is based on symmetry of a
polygon, which connects selected parts of the captured image.
[0052] Optionally, the selected parts are known elements of a human
face (say nose, eyes, or mouth). The known elements may be
identified in the captured image using techniques known in the art,
such as: Viola-Jones algorithms, Neural Network methods, etc. The
centers of the known face elements identified in the captured image
are connected to form the polygon, and a verified symmetry of the
polygon serves as an indication for the symmetry of the face in the
captured image.
[0053] For example, the centers of the right eye, left eye, and
nose, in the captured image, may be connected to form a triangle,
which is expected to be isosceles, and thus symmetric. A successful
verification of the triangle as isosceles (say by a comparison made
between the triangle's arms) indicates that the face captured in
the image is indeed symmetric.
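The isosceles-triangle verification above can be sketched in a few lines. The coordinates and tolerance below are invented for illustration; in practice the element centers would come from a detector such as Viola-Jones.

```python
import math

# Sketch of the triangle criterion: connect the centers of the two eyes and
# the nose, then verify the triangle is (approximately) isosceles by
# comparing its two eye-to-nose arms. The tolerance is an assumed value.

def distance(p, q):
    """Euclidean distance between two (x, y) points."""
    return math.hypot(p[0] - q[0], p[1] - q[1])

def is_face_symmetric(left_eye, right_eye, nose, tolerance=0.1):
    """True when the two arms of the eyes-nose triangle are equal
    within a relative tolerance, indicating a symmetric face."""
    left_arm = distance(left_eye, nose)
    right_arm = distance(right_eye, nose)
    return abs(left_arm - right_arm) / max(left_arm, right_arm) <= tolerance
```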
[0054] Similarly, the centers of the eyes and edges of lips in the
image bearing the face may be connected to form a trapezoid, which
is also expected to be symmetric, etc.
[0055] Optionally, the selected parts are segments of the face in
the image. The segments are identified in the captured image using
image segmentation methods known in the art, such as Feature
Oriented Flood Fill, Texture Analysis, Principal Component Analysis
(PCA) based methods, and DFT (Discrete Fourier Transform) methods
(i.e. harmonic methods).
[0056] The mass centers of the selected segments (say segments
positioned in parts of the image expected to include known parts of
the face, say nose, lips, or mouth) of the captured image are
connected to form a polygon. A verified symmetry of the polygon
serves as an indication for the symmetry of the face, as described
in further detail hereinabove.
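The segment-based variant may be sketched as follows, assuming each segment is given as a set of pixel coordinates; the mirror check about a vertical axis is one possible way to verify the polygon's symmetry, assumed here for illustration.

```python
# Sketch of the segment-based criterion: the mass center of each selected
# segment is its mean pixel coordinate, and left/right segment centers are
# tested for mirror symmetry about a vertical axis. Tolerance is assumed.

def mass_center(pixels):
    """Centroid (row, col) of a segment given as (row, col) pixel tuples."""
    n = len(pixels)
    return (sum(r for r, _ in pixels) / n, sum(c for _, c in pixels) / n)

def mirrored_about(axis_col, left, right, tolerance=1.0):
    """True when two centers are mirror images about a vertical axis."""
    same_row = abs(left[0] - right[0]) <= tolerance
    mirrored = abs((axis_col - left[1]) - (right[1] - axis_col)) <= tolerance
    return same_row and mirrored
```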
[0057] Optionally, the symmetry criterion is applied on a map
representation of the image. The map representation may include,
but is not limited to: an intensity map, a phase map, a texture map
(i.e. gradient map), or any other map generated from the image
using standard image processing filters, as known in the art.
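Applying the criterion to a map representation might look as follows. The patent only names the map types; the horizontal-gradient map and the left-right mirror score below are illustrative assumptions.

```python
# Sketch: build a simple texture (gradient) map from an intensity image and
# score symmetry as the mean absolute difference between the map and its
# left-right mirror (0.0 means perfectly symmetric).

def gradient_map(image):
    """Absolute horizontal gradient of a 2-D intensity image."""
    return [[abs(row[c + 1] - row[c]) for c in range(len(row) - 1)]
            for row in image]

def mirror_symmetry_score(gmap):
    """Mean absolute difference between the map and its mirror image."""
    diffs = [abs(v - row[-1 - c])
             for row in gmap for c, v in enumerate(row)]
    return sum(diffs) / len(diffs)
```

An intensity map or phase map could be substituted for the gradient map without changing the scoring step.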
[0058] The symmetry criterion may be defined before the images (or
features extracted from the images, say biometric stamps, as known
in the art) are stored in the face database, as described in
further detail hereinbelow.
[0059] Optionally, the symmetry criterion is formulated as a
threshold value for symmetry, as known in the art. The threshold
value may be a theoretical value based on theoretical calculations,
an empirical value derived from experimental data, etc.
[0060] When a face in a new image (say a face of an individual who
wishes to be granted access to a classified information system)
needs to be identified, the new image is tested with respect to
symmetry of the face in the new image. That is to say that the face
has to comply with the symmetry criterion before an attempt is made
at identifying the face using the database of faces.
[0061] Thus, according to an exemplary embodiment of the present
invention, the symmetry criterion is enforced on all faces
identified in images, using the methods and apparatuses taught
hereinbelow.
[0062] The symmetry criterion may improve accuracy and efficiency
of identification of the face in the image. For example, in order
to meet the symmetry criterion, the face is aligned into a position
where the face appears symmetric (say a position where an
individual looks straight into a camera). Consequently, there is
produced a significantly uniform face alignment amongst the
images.
[0063] The uniform face alignment may ease identification of a face
in a new image, through a comparison with images in the face
database. The identification is eased, since the uniform face
alignment may increase similarity between face images of the same
individual, especially as far as two dimensional (2D) images are
concerned. Consequently, face recognition rates, such as FAR (False
Acceptance Rate) and FRR (False Rejection Rate), may be reduced,
and thus improved.
[0064] Further, when an individual has to align his face into the
position where the individual's face appears symmetric, the
individual is less likely to use extreme facial expressions.
Extreme facial expressions (such as a widely opened mouth) are
known to pose a problem, as far as face recognition (i.e.
identification) is concerned.
[0065] The principles and operation of an apparatus and method
according to the present invention may be better understood with
reference to the drawings and accompanying description.
[0066] Before explaining at least one embodiment of the invention
in detail, it is to be understood that the invention is not limited
in its application to the details of construction and the
arrangement of the components as set forth in the following
description or illustrated in the drawings. The invention is
capable of other embodiments or of being practiced or carried out
in various ways. Also, it is to be understood that the phraseology
and terminology employed herein is for the purpose of description
and should not be regarded as limiting.
[0067] Reference is now made to FIG. 1, which is a block diagram
illustrating a first apparatus for face recognition, according to
an exemplary embodiment of the present invention.
[0068] Apparatus 1000 for face recognition includes a face symmetry
verifier 110.
[0069] The face symmetry verifier 110 verifies symmetry of a face
in one or more image(s) (say a still image of a face of an
individual, a sequence of video images of an individual, etc.)
according to a symmetry criterion, as described in further detail
hereinabove.
[0070] Optionally, the symmetry criterion pertains to a statistical
model run over previously received images. For example, the
symmetry criterion may be based on a degree of deviation of the
image (and thus the face captured in the image) from an average
image, as described in further detail hereinabove.
[0071] Optionally, the symmetry criterion is based on a comparison
made between the image and one or more images previously captured
from the same face, as described in further detail hereinabove.
[0072] Optionally, the symmetry criterion is based on symmetry of a
polygon, which connects selected parts of the image.
[0073] Optionally, the selected parts are known elements of a human
face (say nose, eyes, or mouth). The known elements may be
identified in the captured image using techniques known in the art,
such as: Viola-Jones algorithms, Neural Network methods, etc. The
centers of the known face elements identified in the captured image
are connected to form the polygon, and a verified symmetry of the
polygon serves as an indication for the symmetry of the face in the
captured image, as described in further detail hereinabove.
[0074] Optionally, the selected parts are segments of the face in
the image.
[0075] The segments are identified in the captured image using
image segmentation methods known in the art, such as Feature
Oriented Flood Fill, Texture Analysis, Principal Component Analysis
(PCA) based methods, and DFT (Discrete Fourier Transform) methods
(i.e. harmonic methods), as described in further detail
hereinabove.
[0076] The symmetry criterion may be defined by a user of the
apparatus 1000, as described in further detail hereinbelow.

Optionally, the face symmetry verifier 110 uses an intensity map,
for verifying the symmetry of the face in the image, as described
in further detail hereinbelow.
[0077] Optionally, the face symmetry verifier 110 uses a texture
map (i.e. a gradient map), for verifying the symmetry of the face
in the image, as described in further detail hereinbelow.
[0078] Optionally, the face symmetry verifier 110 uses a Fast
Fourier Transform (FFT) phase map, for verifying the symmetry of
the face in the image, as described in further detail hereinbelow.
[0079] Optionally, the face is a face of an individual who is a
collaborating user. For example, the face may belong to a user who
may be asked to move into a better position. The user collaborates
by moving into a better aligned position (say a position where the
user looks directly into a still camera). A new image of the user's
face, as captured from the better aligned position, may be more
symmetric, as described in further detail hereinbelow.
[0080] Optionally, the images are a part of a video sequence, and
the video sequence is continuously fed to the face symmetry
verifier 110, say from a surveillance system or device (such as a
video camera which continuously captures images of a secure area),
as described in further detail hereinbelow.
[0081] The face symmetry verifier 110 verifies the symmetry of face
in each of the images. When the face symmetry verifier 110
successfully verifies the symmetry of the face in one of the images
(say the face of a criminal), the image is forwarded to the face
identifier, as described in further detail hereinbelow.
[0082] Optionally, the face symmetry verifier 110 measures symmetry
of each one of two or more images of the video sequence fed to the
face symmetry verifier 110. Then, the face symmetry verifier 110
selects the one or more image(s) of the face amongst the input
images, such that the measured symmetry of the selected images of
the face is highest amongst the input images.
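The frame-selection step above may be sketched as follows; the per-frame symmetry scores are assumed inputs here, and could be produced by any of the criteria described hereinabove.

```python
# Sketch of selecting the most symmetric frame from a video sequence:
# each frame's symmetry is measured, and the highest-scoring frame is kept.

def select_most_symmetric(frames, symmetry_of):
    """Return the frame whose measured symmetry is highest."""
    return max(frames, key=symmetry_of)
```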
[0083] Apparatus 1000 further includes a face identifier 120,
connected to the face symmetry verifier 110.
[0084] The face identifier 120 identifies the face, provided the
symmetry of the face is successfully verified by the face symmetry
verifier 110, as described in further detail hereinbelow.
[0085] The face identifier 120 may use any of current face
identification methods.
[0086] Optionally, the face identifier 120 identifies the face by
attempting to match between the image bearing the face and images
in a database of images, or between a feature (say a biometric
stamp) extracted from the image bearing the face and features
previously extracted from images previously received and stored in
a face database, as known in the art and described in further
detail hereinabove.
[0087] The symmetry criterion may improve accuracy and efficiency
of the face identifier 120, as described in further detail
hereinabove.
[0088] Optionally, apparatus 1000 further includes an image
capturer, connected to the face symmetry verifier 110.
[0089] The image capturer may include, but is not limited to a
digital still camera, a video camera, etc. The image capturer
captures the image(s) of the face, and forwards the captured
image(s) to the face symmetry verifier 110.
[0090] Optionally, when the face symmetry verifier 110 finds the
face in the image non-symmetric (i.e. when the face fails to meet
the symmetry criterion), the face symmetry verifier 110 instructs
the image capturer to capture a new image of the face.
[0091] Optionally, upon finding the face non-symmetric, the face
symmetry verifier 110 presents an appropriate message (say a
message asking an individual whose face image is captured to look
straight into the image capturer, etc.), and the image capturer
captures a new image of the face, as described in further detail
hereinbelow.
[0092] Optionally, apparatus 1000 further includes a face detector,
connected to the face symmetry verifier 110.
[0093] The face detector detects the face in the image. The face
detector may use one or more methods for detecting the face in the
image, including, but not limited to: a skin detection method, a
Viola-Jones detection method, a Gabor Filter based method, etc., as
described in further detail hereinbelow.
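As one concrete illustration of a skin detection method, a widely cited RGB decision rule classifies a pixel as skin when it satisfies a handful of channel inequalities. The thresholds below are one published variant of that heuristic, not values taken from this application:

```python
def is_skin_rgb(r, g, b):
    """Simple RGB skin-colour rule (daylight variant); returns True
    when the pixel's colour falls inside the heuristic skin region."""
    return (r > 95 and g > 40 and b > 20
            and max(r, g, b) - min(r, g, b) > 15  # colour is not grayish
            and abs(r - g) > 15                   # red and green well separated
            and r > g and r > b)                  # red dominates
```

A face detector built on this rule would scan the image for connected skin-coloured regions and treat sufficiently large ones as face candidates.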
[0094] Optionally, apparatus 1000 also includes an image cropper,
connected to the face symmetry verifier 110.
[0095] The image cropper crops the image, and thereby significantly
removes background details from the image.
[0096] Optionally, the image cropper crops the image around the
face, leaving a purely facial image (i.e. an image which includes
only the face, without background details).
[0097] Optionally, the image cropper crops the image, along a
rectangle, as illustrated using FIG. 13, and described in further
detail hereinbelow.
[0098] Optionally, apparatus 1000 also includes an image resizer,
connected to the face symmetry verifier 110.
[0099] The image resizer resizes the image into a predefined size,
and thereby standardizes the image's size according to a size
standard predefined by a user of the apparatus 1000, as described
in further detail hereinbelow.
[0100] Optionally, apparatus 1000 further includes an image
illumination quality improver, connected to the face symmetry
verifier 110.
[0101] The image illumination quality improver may improve one or
more qualities of illumination of the image, say using Histogram
Equalization, as known in the art, and described in further detail
hereinbelow.
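Histogram Equalization itself is standard; a minimal grayscale sketch (pure Python, images given as lists of rows of intensity levels) looks like this:

```python
def equalize_histogram(image, levels=256):
    """Histogram-equalize a grayscale image, spreading the intensity
    values over the full range to improve illumination contrast."""
    flat = [p for row in image for p in row]
    n = len(flat)
    # Histogram of intensity levels.
    hist = [0] * levels
    for p in flat:
        hist[p] += 1
    # Cumulative distribution function.
    cdf, total = [], 0
    for count in hist:
        total += count
        cdf.append(total)
    cdf_min = next(c for c in cdf if c > 0)
    if n == cdf_min:          # uniform image: nothing to equalize
        return [row[:] for row in image]
    # Map each level through the normalized CDF.
    def remap(p):
        return round((cdf[p] - cdf_min) / (n - cdf_min) * (levels - 1))
    return [[remap(p) for p in row] for row in image]
```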
[0102] Reference is now made to FIG. 2, which is a block diagram
illustrating a second apparatus for face recognition, according to
an exemplary embodiment of the present invention.
[0103] Apparatus 2000 for face recognition includes a face symmetry
verifier 210.
[0104] The face symmetry verifier 210 verifies symmetry of a face
in one or more image(s), (say a still video image of a face of an
individual, or a sequence of video images of an individual),
according to a symmetry criterion, as described in further detail
hereinabove.
[0105] Optionally, the symmetry criterion pertains to a statistical
model run over previously received images. For example, the
symmetry criterion may be based on a degree of deviation of the
image (and thus the face captured in the image) from an average
image, as described in further detail hereinabove.
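The deviation-from-average criterion can be sketched as a mean absolute difference between a new intensity map and an average image accumulated from previously received images; the exact statistic and any acceptance threshold are assumptions for the example:

```python
def mean_absolute_deviation(image, average_image):
    """Average per-pixel absolute difference between an intensity map and
    the average image; larger values mean the face deviates more."""
    total, count = 0.0, 0
    for row, avg_row in zip(image, average_image):
        for pixel, avg in zip(row, avg_row):
            total += abs(pixel - avg)
            count += 1
    return total / count
```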
[0106] Optionally, the symmetry criterion is based on a comparison
made between the image and one or more images previously captured
from the same face, as described in further detail hereinabove.
[0107] Optionally, the symmetry criterion is based on symmetry of a
polygon, which connects selected parts of the image.
[0108] Optionally, the selected parts are known elements of a human
face (say nose, eyes, or mouth).
[0109] The known elements may be identified in the captured image
using techniques known in the art, such as Viola-Jones algorithms,
Neural Network methods, etc. The centers of the known face elements
identified in the captured image are connected to form the polygon,
and a verified symmetry of the polygon serves as an indication for
the symmetry of the face in the captured image, as described in
further detail hereinabove.
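As an illustration of the polygon criterion, the sketch below mirrors each left-side element centre across a vertical mid-axis and measures how far it lands from its right-side counterpart; the pairing of elements, the axis choice, and the tolerance are assumptions made for the example:

```python
def polygon_asymmetry(pairs, axis_x):
    """Mean distance between mirrored left-side points and their
    right-side counterparts; 0.0 for a perfectly symmetric polygon."""
    total = 0.0
    for (xl, yl), (xr, yr) in pairs:
        mirrored_x = 2 * axis_x - xl   # reflect the left point across the axis
        total += ((mirrored_x - xr) ** 2 + (yl - yr) ** 2) ** 0.5
    return total / len(pairs)

def is_symmetric(pairs, axis_x, tolerance=3.0):
    # Tolerance in pixels; a tunable assumption, not a value from this application.
    return polygon_asymmetry(pairs, axis_x) <= tolerance
```

Here `pairs` holds the centres of paired face elements, say left eye with right eye and left mouth corner with right mouth corner.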
[0110] Optionally, the selected parts are segments of the face in
the image.
The segments are identified in the captured image using image
segmentation methods known in the art, such as Feature
Oriented Flood Fill, Texture Analysis, Principal Component Analysis
(PCA) based methods, DFT (Discrete Fourier Transform) methods (i.e.
harmonic methods), etc., as described in further detail
hereinabove.
[0112] The symmetry criterion may be defined by a user of the
apparatus 2000, as described in further detail hereinbelow.
[0113] The symmetry criterion may be based on an intensity map, a
phase map, a texture map, etc., as described in further detail
hereinbelow.
[0114] Optionally, the face symmetry verifier 210 uses an intensity
map, a gradient map, a Fast Fourier Transform (FFT) phase map, or a
combination thereof, for verifying the symmetry of the face in the
image(s), as described in further detail hereinbelow.
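One assumed way to apply an intensity map to the symmetry check is to compare each row's left half against a mirrored copy of its right half; the normalization below is illustrative, not the application's exact computation:

```python
def halves_symmetry(intensity_map, max_level=255):
    """Score mirror symmetry of an intensity map in [0, 1];
    1.0 means the left and mirrored right halves are identical."""
    diffs = []
    for row in intensity_map:
        w = len(row) // 2                       # middle column (odd widths) is ignored
        left = row[:w]
        mirrored_right = row[len(row) - w:][::-1]
        diffs.extend(abs(a - b) for a, b in zip(left, mirrored_right))
    return 1.0 - sum(diffs) / (len(diffs) * max_level)
```

The same comparison could be run over a gradient map or an FFT phase map in place of raw intensities.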
[0115] Optionally, the face symmetry verifier 210 measures symmetry
of each one of two or more input images (say images which are a
part of a sequence of video images, or a video stream). Then, the
face symmetry verifier 210 selects the one or more image(s) of the
face amongst the input images, such that the measured symmetry of
the selected images of the face is highest amongst the input
images.
[0116] Apparatus 2000 further includes a face database updater 230,
connected to the face symmetry verifier 210.
[0117] The face database updater 230 updates a face database 250
with the image of the face or with one or more features extracted
from the image (say with a biometric stamp, as known in the art),
and with associated data identifying the face, provided the
symmetry of the face in the image(s) is successfully verified by
the face symmetry verifier 210, as described in further detail
hereinbelow.
[0118] The data identifying the face may include, but is not limited to,
details such as a passport number, a name, or an address. The
details may be provided by an operator of the apparatus 2000, by an
individual whose face is captured in the image, etc.
[0119] The face database 250 may be a local database, a remote
database accessible through a wide area network (such as the
Internet), etc., as known in the art.
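The updater's behaviour reduces to a guarded insert: the record (image or extracted stamp, plus the identifying data) enters the database only when verification succeeded. A dict-free sketch with illustrative names:

```python
class FaceDatabaseUpdater:
    """Minimal stand-in for the face database updater 230."""

    def __init__(self):
        self.records = []   # each record: (stamp_or_image, identifying_data)

    def update(self, stamp_or_image, identifying_data, symmetry_verified):
        """Store the record only for a successfully verified face."""
        if not symmetry_verified:
            return False
        self.records.append((stamp_or_image, identifying_data))
        return True
```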
[0120] Reference is now made to FIG. 3, which is a block diagram
illustrating a third apparatus for face recognition, according to
an exemplary embodiment of the present invention.
[0121] Apparatus 3000 for face recognition includes a face symmetry
verifier 310.
[0122] The face symmetry verifier 310 receives one or more first
image(s) of a face, together with data identifying the face. The
data identifying the face may include, but is not limited to
details such as a passport number, a name, or an address. The
details may be provided by an operator of the apparatus 3000, by an
individual whose face is captured in the image, etc.
[0123] The face symmetry verifier 310 verifies symmetry of a face
in one or more first image(s), according to a symmetry criterion,
as described in further detail hereinabove.
[0124] Optionally, the symmetry criterion pertains to a statistical
model run over previously received images.
[0125] For example, the symmetry criterion may be based on a degree
of deviation of the image (and thus the face captured in the image)
from an average image, as described in further detail
hereinabove.
[0126] Optionally, the symmetry criterion is based on a comparison
made between the image and one or more images previously captured
from the same face, as described in further detail hereinabove.
[0127] Optionally, the symmetry criterion is based on symmetry of a
polygon, which connects selected parts of the image.
[0128] Optionally, the selected parts are known elements of a human
face (say nose, eyes, or mouth). The known elements may be
identified in the captured image using techniques known in the art,
such as Viola-Jones algorithms, Neural Network methods, etc. The
centers of the known face elements identified in the captured image
are connected to form the polygon, and a verified symmetry of the
polygon serves as an indication for the symmetry of the face in the
captured image, as described in further detail hereinabove.
[0129] Optionally, the selected parts are segments of the face in
the image.
[0130] The segments are identified in the captured image using image
segmentation methods known in the art, such as Feature
Oriented Flood Fill, Texture Analysis, Principal Component Analysis
(PCA) based methods, DFT (Discrete Fourier Transform) methods (i.e.
harmonic methods), etc., as described in further detail
hereinabove.
[0131] In one example, the first image includes a still video image
of a face of an individual who enrolls in a security system, or a
sequence of video images of a known criminal the police wishes to
store in a database of criminal suspects.
[0132] The symmetry criterion may be defined by a user of the
apparatus 3000, as described in further detail hereinbelow.
[0133] The symmetry criterion may be based on an intensity map, a
phase map, a texture map, etc., as described in further detail
hereinbelow.
[0134] Optionally, the face symmetry verifier 310 uses an intensity
map, a gradient map, a Fast Fourier Transform (FFT) phase map, or a
combination thereof, for verifying the symmetry of the face in the
first image(s), as described in further detail hereinbelow.
[0135] Optionally, the face symmetry verifier 310 measures symmetry
of each one of two or more images input to the face symmetry
verifier 310, say images which are a part of a sequence of video
images streamed to the face symmetry verifier 310. Then, the face
symmetry verifier 310 selects one or more first image(s) of the
face amongst the input images, such that the measured symmetry of
the selected image(s) of the face is highest amongst the input
image(s).
[0136] Apparatus 3000 further includes a face database updater 330,
connected to the face symmetry verifier 310.
[0137] Optionally, the face database updater 330 updates a face
database 350 with one or more of the first image(s) or with one or
more features extracted from the first image(s) (say a biometric
stamp, as known in the art), and the data identifying the face,
provided the symmetry of the face in the first image(s) is
successfully verified, as described in further detail
hereinbelow.
[0138] Optionally, the face database updater 330 updates the face
database with the images selected by the face symmetry verifier
310, or with one or more features extracted from the selected
images (say biometric stamps, as known in the art), as described in
further detail hereinabove.
[0139] The face database 350 may be a local database, a remote
database accessible through a wide area network (such as the
Internet), etc., as known in the art.
[0140] Apparatus 3000 further includes a face identifier 320,
connected to the face symmetry verifier 310.
[0141] When one or more second image(s) of a face are presented to
the face symmetry verifier 310 (say, a video stream of an
individual who attempts to walk into a secure area), the face
symmetry verifier 310 verifies the symmetry of the face in the
second image(s), according to the predefined symmetry
criterion.
[0142] If the symmetry of the face in the second image is
successfully verified, the face identifier 320 identifies the face
in the second image, using the face database 350. For example, the
face identifier 320 may identify the face by matching a feature
extracted from the second image (such as a biometric stamp) and a
feature previously extracted from one of the first images and
stored in the face database 350, as described in further detail
hereinabove.
[0143] In one example, an authorized user of a classified information
system enrolls in the classified information system.
[0144] A first image of the authorized user's face is input to the
face symmetry verifier 310 (say a passport photo), together with
data identifying the authorized user.
[0145] The data identifying the user may include, but is not limited
to, details such as a passport number, a name, an address, a
role, etc. The details may be provided by an operator of the
apparatus 3000, by the authorized user, etc.
[0146] If the face symmetry verifier 310 verifies the symmetry of
the authorized user's face in the first image, the face database
updater 330 updates the face database 350 with the first image of
the authorized user (or with a feature extracted from the first
image), together with the data identifying the authorized user, as
described in further detail hereinabove.
[0147] The next time the authorized user wishes to log into the
classified information system, a second image of his face is
captured live, say by a still camera in communication with the
classified information system.
[0148] The face symmetry verifier 310 receives the second image and
verifies the symmetry of the authorized user's face in the second
image.
[0149] When the symmetry of the authorized user's face in the
second image is successfully verified, the face identifier 320
identifies the face in the second image, using the face database
350. Optionally, the face identifier 320 identifies the face by
extracting a biometric stamp from the second image and finding a
matching biometric stamp in the face database 350, as described in
further detail hereinabove.
[0150] Consequently, upon positive identification of the authorized
user's face, the authorized user is allowed to log into the
classified information system.
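The enrol-then-identify flow of this example can be sketched end to end. The symmetry check, stamp extraction, and storage below are hypothetical stand-ins chosen for the illustration, not the application's actual implementation:

```python
def extract_stamp(image):
    # Stand-in "biometric stamp": the flattened intensity values.
    return tuple(p for row in image for p in row)

class RecognitionSystem:
    def __init__(self, is_symmetric):
        self._db = {}                      # stamp -> identifying data
        self._is_symmetric = is_symmetric  # symmetry criterion, injected

    def enroll(self, first_image, identity):
        """Update the face database only if the face is symmetric."""
        if not self._is_symmetric(first_image):
            return False
        self._db[extract_stamp(first_image)] = identity
        return True

    def identify(self, second_image):
        """Identify a verified face by matching its stamp in the database."""
        if not self._is_symmetric(second_image):
            return None
        return self._db.get(extract_stamp(second_image))
```

A positive `identify` result would then unlock the log-in, mirroring paragraphs [0144] to [0150].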
[0151] Reference is now made to FIG. 4, which is a block diagram
illustrating a fourth apparatus for face recognition, according to
an exemplary embodiment of the present invention.
[0152] Apparatus 4000 for face recognition includes a face symmetry
verifier 410.
[0153] The face symmetry verifier 410 verifies symmetry of a face
in one or more image(s) (say a still video image of a face of an
individual, or a sequence of video images of an individual)
according to a symmetry criterion, as described in further detail
hereinabove.
[0154] Optionally, the symmetry criterion pertains to a statistical
model run over previously received images.
[0155] For example, the symmetry criterion may be based on a degree
of deviation of the image (and thus the face captured in the image)
from an average image, as described in further detail
hereinabove.
[0156] Optionally, the symmetry criterion is based on a comparison
made between the image and one or more images previously captured
from the same face, as described in further detail hereinabove.
[0157] Optionally, the symmetry criterion is based on symmetry of a
polygon, which connects selected parts of the image.
[0158] Optionally, the selected parts are known elements of a human
face (say nose, eyes, or mouth).
[0159] The known elements may be identified in the captured image
using techniques known in the art, such as Viola-Jones algorithms,
Neural Network methods, etc. The centers of the known face elements
identified in the captured image are connected to form the polygon,
and a verified symmetry of the polygon serves as an indication for
the symmetry of the face in the captured image, as described in
further detail hereinabove.
[0160] Optionally, the selected parts are segments of the face in
the image.
[0161] The segments are identified in the captured image using image
segmentation methods known in the art, such as Feature
Oriented Flood Fill, Texture Analysis, Principal Component Analysis
(PCA) based methods, DFT (Discrete Fourier Transform) methods (i.e.
harmonic methods), etc., as described in further detail
hereinabove.
[0162] The symmetry criterion may be defined by a user of the
apparatus 4000, as described in further detail hereinbelow.
[0163] The symmetry criterion may be based on an intensity map, a
phase map, a texture map, etc., as described in further detail
hereinbelow.
[0164] Optionally, the face symmetry verifier 410 uses an intensity
map, a gradient map, a Fast Fourier Transform (FFT) phase map, an
image processing filter output (as known in the art), or a combination
thereof, for verifying the symmetry of the face in the image(s), as
described in further detail hereinbelow.
[0165] Apparatus 4000 further includes a forwarding controller 470,
connected to the face symmetry verifier 410.
[0166] The forwarding controller 470 controls the forwarding of the
image (or a feature extracted from the image, say a biometric
stamp, as known in the art), in accordance with the verification of
the symmetry of the face.
[0167] In a first example, the face symmetry verifier 410 may find
the face in the image to be non-symmetric (i.e. when the face fails
to meet the symmetry criterion). Consequently, the forwarding
controller 470 blocks the forwarding of the image (or the feature
extracted from the image) to a face identifier 120.
[0168] Optionally, the forwarding controller 470 presents an
appropriate message.
[0169] For example, the forwarding controller 470 may present a
message asking an individual whose face image is captured to look
straight into an image capturer (say, a still camera), or to align
in a position in front of the image capturer, as described in
further detail hereinbelow.
[0170] Then, the image capturer may capture a new (and hopefully,
symmetric) image of the face of the individual.
[0171] When the face symmetry verifier 410 finds that the face
successfully meets the symmetry criterion (and is thus successfully
verified as symmetric), the forwarding controller 470 may forward
the image (or the feature extracted from the image), say to a face
identifier 120, as described in further detail
hereinabove.
[0172] In a second example, the forwarding controller 470 may
forward the image (or the feature extracted from the image) to one
or more destination(s) set in advance of the verification of the
symmetry, say by an operator of the apparatus 4000.
[0173] The destination(s) may include, but are not limited to: an
email address, a database server, or an application (which may run
on a remote computer, on a computer the forwarding controller also
runs on, etc.).
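The forwarding controller's gating behaviour can be sketched as follows; the message text and the callable destinations are illustrative assumptions:

```python
def forward_image(image, symmetry_verified, destinations, notify=print):
    """Forward the image (or extracted feature) to every destination when
    the face was verified; otherwise block it and request a re-capture."""
    if not symmetry_verified:
        notify("Please look straight into the camera and try again.")
        return False
    for send in destinations:
        send(image)
    return True
```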
[0174] Apparatus 4000 may be used as a stand-alone product, or in
combination with other systems, say a face recognition system, a
security system, etc.
[0175] Reference is now made to FIG. 5, which is a flowchart
illustrating a first method for face recognition, according to an
exemplary embodiment of the present invention.
[0176] In a first method for face recognition, according to an
exemplary embodiment of the present invention, symmetry of a face
in one or more image(s), (say a still video image of a face of an
individual, a sequence of video images of an individual, etc.) is
verified 510 according to a symmetry criterion, say using the face
symmetry verifier 110, as described in further detail
hereinabove.
[0177] Optionally, the symmetry criterion pertains to a statistical
model run over previously received images.
[0178] For example, the symmetry criterion may be based on a degree
of deviation of the image (and thus the face captured in the image)
from an average image, as described in further detail
hereinabove.
[0179] Optionally, the symmetry criterion is based on a comparison
made between the image and one or more images previously captured
from the same face, as described in further detail hereinabove.
[0180] Optionally, the symmetry criterion is based on symmetry of a
polygon, which connects selected parts of the image.
[0181] Optionally, the selected parts are known elements of a human
face (say nose, eyes, or mouth).
[0182] The known elements may be identified in the captured image
using techniques known in the art, such as Viola-Jones algorithms,
Neural Network methods, etc. The centers of the known face elements
identified in the captured image are connected to form the polygon,
and a verified symmetry of the polygon serves as an indication for
the symmetry of the face in the captured image, as described in
further detail hereinabove.
[0183] Optionally, the selected parts are segments of the face in
the image.
[0184] The segments are identified in the captured image using image
segmentation methods known in the art, such as Feature
Oriented Flood Fill, Texture Analysis, Principal Component Analysis
(PCA) based methods, DFT (Discrete Fourier Transform) methods (i.e.
harmonic methods), etc., as described in further detail
hereinabove.
[0185] The symmetry criterion may be defined in advance, say by a
user of apparatus 1000, as described in further detail
hereinabove.
[0186] The symmetry criterion may be based on an intensity map, a
phase map, a texture map, an image processing filter, etc., as
described in further detail hereinbelow.
[0187] Optionally, the first method further includes using an
intensity map, for verifying 510 the symmetry of the face in the
image, as described in further detail, hereinbelow.
[0188] Optionally, the first method further includes using a
gradient map, for verifying 510 the symmetry of the face in the
image, as described in further detail, hereinbelow.
[0189] Optionally, the first method further includes using a fast
Fourier Transform (FFT) phase map, for verifying 510 the symmetry
of the face in the image, as described in further detail,
hereinbelow.
[0190] Optionally, the first method further includes measuring
symmetry of each one of two or more input images (say images which
are a part of a sequence of video images, or a video stream). Then,
the one or more image(s) of the face are selected amongst the input
images, such that the measured symmetry of the selected images of
the face is highest amongst the input images.
[0191] Next, the face is identified 520, say by the face identifier
120, provided the symmetry of the face is successfully verified
510.
[0192] For example, the face may be identified by attempting to
match between the image bearing the face and images previously
stored in a database of images. The face may also be identified by
finding a match between a feature (such as a biometric stamp)
extracted from the image bearing the face, and a feature previously
extracted from an image and stored in a face database, as known in
the art and described in further detail hereinbelow.
[0193] Optionally, the first method further includes a preliminary
step of capturing the image of the face, and forwarding the
captured image, for the symmetry verification, (say to the face
symmetry verifier 110), as described in further detail
hereinabove.
[0194] Optionally, the first method further includes detecting the
face in the image, say using the face detector, as described in
further detail hereinbelow.
[0195] The detection of the face may be carried out using one or
more methods including, but not limited to: a skin detection
method, a Viola-Jones detection method, a Gabor Filter based
method, etc., as described in further detail hereinbelow.
[0196] Optionally, the first method further includes cropping the
image.
[0197] Optionally, the cropping is carried out around the face, and
thereby leaves a purely facial image (i.e. an image substantially
without background).
[0198] Optionally, the cropping may be carried out along a
rectangle, significantly removing background from the image, as
illustrated using FIG. 13, and described in further detail
hereinbelow.
[0199] Optionally, the first method further includes resizing the
image into a predefined size, and thereby standardizing the image's
size according to a predefined size standard, as described in
further detail hereinbelow.
[0200] Optionally, the first method further includes improving one
or more qualities of illumination of the image, say using a
Histogram Equalization method, as described in further detail
hereinbelow.
[0201] Reference is now made to FIG. 6, which is a flowchart
illustrating a second method for face recognition, according to an
exemplary embodiment of the present invention.
[0202] In a second method for face recognition, according to an
exemplary embodiment of the present invention, symmetry of a face
in one or more image(s), (say a still video image of a face of an
individual, a sequence of video images of an individual, etc.) is
verified 610 according to a symmetry criterion, say using the face
symmetry verifier 210, as described in further detail
hereinabove.
[0203] Optionally, the symmetry criterion pertains to a statistical
model run over previously received images. For example, the
symmetry criterion may be based on a degree of deviation of the
image (and thus the face captured in the image) from an average
image, as described in further detail hereinabove.
[0204] Optionally, the symmetry criterion is based on a comparison
made between the image and one or more images previously captured
from the same face, as described in further detail hereinabove.
[0205] Optionally, the symmetry criterion is based on symmetry of a
polygon, which connects selected parts of the image.
[0206] Optionally, the selected parts are known elements of a human
face (say nose, eyes, or mouth).
[0207] The known elements may be identified in the captured image
using techniques known in the art, such as Viola-Jones algorithms,
Neural Network methods, etc. The centers of the known face elements
identified in the captured image are connected to form the polygon,
and a verified symmetry of the polygon serves as an indication for
the symmetry of the face in the captured image, as described in
further detail hereinabove.
[0208] Optionally, the selected parts are segments of the face in
the image.
[0209] The segments are identified in the captured image using image
segmentation methods known in the art, such as Feature
Oriented Flood Fill, Texture Analysis, Principal Component Analysis
(PCA) based methods, DFT (Discrete Fourier Transform) methods (i.e.
harmonic methods), etc., as described in further detail
hereinabove.
[0210] The symmetry criterion may be defined by a user of apparatus
2000, as described in further detail hereinabove.
[0211] Optionally, the second method further includes using an
intensity map, for verifying 610 the symmetry of the face in the
image, as described in further detail, hereinbelow.
[0212] Optionally, the second method further includes using a
gradient map, for verifying 610 the symmetry of the face in the
image, as described in further detail, hereinbelow.
[0213] Optionally, the second method further includes using a Fast
Fourier Transform (FFT) phase map, for verifying 610 the symmetry
of the face in the image, as described in further detail,
hereinbelow.
[0214] Optionally, the second method further includes measuring
symmetry of each one of two or more input images (say images which
are a part of a sequence of video images, or a video stream). Then,
the one or more image(s) of the face are selected amongst the input
images, such that the measured symmetry of the selected images of
the face is highest amongst the input images.
[0215] Next, a face database 250 is updated 620 with the image of
the face or with one or more features (say biometric stamps, as
known in the art) extracted from the image of the face and with
associated data identifying the face, provided the symmetry of the
face in the image(s) is successfully verified (say by the face
symmetry verifier 210), as described in further detail
hereinabove.
[0216] The data identifying the face may include, but is not limited to,
details such as a passport number, a name, or an address. The
details may be provided by an operator of the apparatus 2000, by an
individual whose face is captured in the image, etc.
[0217] The face database 250 may be a local database, a remote
database accessible through a wide area network (such as the
Internet), etc., as known in the art.
[0218] Reference is now made to FIG. 7, which is a flowchart
illustrating a third method for face recognition, according to an
exemplary embodiment of the present invention.
[0219] In a third method, according to an exemplary embodiment of
the present invention, symmetry of a face in one or more first
image(s) is verified 710, according to a symmetry criterion, say
using the face symmetry verifier 310, as described in further
detail hereinabove.
[0220] Optionally, the symmetry criterion pertains to a statistical
model run over previously received images. For example, the
symmetry criterion may be based on a degree of deviation of the
image (and thus the face captured in the image) from an average
image, as described in further detail hereinabove.
[0221] Optionally, the symmetry criterion is based on a comparison
made between the image and one or more images previously captured
from the same face, as described in further detail hereinabove.
[0222] Optionally, the symmetry criterion is based on symmetry of a
polygon, which connects selected parts of the image.
[0223] Optionally, the selected parts are known elements of a human
face (say nose, eyes, or mouth).
[0224] The known elements may be identified in the captured image
using techniques known in the art, such as Viola-Jones algorithms,
Neural Network methods, etc. The centers of the known face elements
identified in the captured image are connected to form the polygon,
and a verified symmetry of the polygon serves as an indication for
the symmetry of the face in the captured image, as described in
further detail hereinabove.
[0225] Optionally, the selected parts are segments of the face in
the image.
[0226] The segments are identified in the captured image using image
segmentation methods known in the art, such as Feature
Oriented Flood Fill, Texture Analysis, Principal Component Analysis
(PCA) based methods, DFT (Discrete Fourier Transform) methods (i.e.
harmonic methods), etc., as described in further detail
hereinabove.
[0227] In one example, the first image(s) may include a passport
photo of an individual who enrolls in a security system, a sequence
of video images of a known criminal the police wishes to store in a
database of criminal suspects, etc.
[0228] The symmetry criterion may be defined by a user of the
apparatus 3000, as described in further detail hereinabove.
[0229] Optionally, the verification of the symmetry of the first
image(s) is carried out using an intensity map, a texture map (i.e.
gradient map), a fast Fourier Transform (FFT) phase map, or a
combination thereof, as described in further detail
hereinbelow.
[0230] Next, a face database 350 is updated 730 with the first
image(s) of the face (or with features extracted from the first
image(s), as described in further detail hereinabove), and with
associated data identifying the face, provided the symmetry of the
face in the first image(s) is successfully verified 710, say by the
face symmetry verifier 310, as described in further detail
hereinbelow.
[0231] The data identifying the face may include, but is not limited to,
details such as a passport number, a name, or an address. The
details may be provided by an operator of the apparatus 3000, by
the individual whose face is captured in the image, etc.
[0232] The face database 350 may be a local database, a remote
database accessible through a wide area network (such as the
Internet), etc., as known in the art.
[0233] When one or more second image(s) of the face are presented
to the face symmetry verifier 310 (say, a video stream of a
criminal who attempts to walk into a secure area), the symmetry of
the face in the second image(s) is verified 770, according to the
predefined symmetry criterion, as described in further detail
hereinabove.
[0234] If the symmetry of the face in the second image is
successfully verified 770, the face in the second image is
identified 790, say by the face identifier 320, using the face
database 350.
[0235] Optionally, the face is identified 790 by finding a match
between the second image and an image stored in the face database
350 (or between a feature extracted from the second image and
features stored in the face database 350), as known in the art and
described in further detail hereinabove.
[0236] For example, a police unit may wish to store a face image
and identifying data of a known criminal in a suspect database. The
symmetry of the known criminal's face in the first image is
verified 710, say using the face symmetry verifier 310, as
described in further detail hereinbelow.
[0237] If the symmetry of the known criminal's face in the first
image is successfully verified 710, the suspect database is updated
730 with the first image of the known criminal, together with the
data identifying the known criminal.
[0238] A surveillance camera may capture a video stream (i.e.
second images) of the criminal at a crime scene.
[0239] The video stream may be used to identify the criminal, say
using apparatus 3000, as described in further detail
hereinbelow.
[0240] When the symmetry of the criminal's face in the second image
is successfully verified 770, the face in the second image may be
identified 790, using the police unit's suspect database.
Consequently, upon positive identification of the criminal's face,
the police may arrest the known criminal, and use the video stream
as evidence against the known criminal.
[0241] Reference is now made to FIG. 8, which is a flowchart
illustrating a fourth method for face recognition, according to an
exemplary embodiment of the present invention.
[0242] In a fourth method, according to an exemplary embodiment of
the present invention, an image of a face is captured 800, say
using an image capturer, such as a digital still camera, a video
camera, or a surveillance camera (which constantly streams video
images of a secure area).
[0243] For example, a user may approach a face recognition system,
which includes apparatus 1000, and the image capturer.
[0244] Optionally, the image capturer may be triggered to capture
the image of a user who approaches the face recognition system by a
smart card reader connected to the image capturer.
[0245] Upon insertion of a smart card into the smart card reader,
by the user, the smart card reader triggers the image capturer, to
capture the image of the user. Then, the captured image is
forwarded for symmetry verification, as described in further detail
hereinbelow.
[0246] Similarly, the image capturer may be triggered to capture
the image of the face of the user who approaches the face
recognition system, by a RFID (Radio frequency identification) card
reader connected to the image capturer. The RFID card reader
triggers the image capturer to capture the image, when the user
inserts an RFID card into the RFID card reader. Then, the captured image
is forwarded for symmetry verification, as described in further
detail hereinbelow.
[0247] Optionally, the image capturer continuously captures images.
For example, the image capturer may be a surveillance camera,
which constantly streams video images of a secure area. Upon
detection of the user's face in the image (say by the face
detector), the image is forwarded to the face symmetry verifier
110, as described in further detail hereinabove.
[0248] Optionally, the image is captured in a two dimensional (2D)
format, as known in the art.
[0249] Optionally, the image is captured in a three dimensional
(3D) format, as known in the art.
[0250] Next, the face in the captured image is verified 810,
according to a symmetry criterion, say by the face symmetry
verifier 110, as described in further detail hereinabove.
[0251] Optionally, when the face in the image is found to be
non-symmetric (i.e. the face fails to meet the symmetry criterion),
the image capturer is instructed (say by the face symmetry verifier
110) to capture a new image of the face.
[0252] The image capturer may present an appropriate message, say a
message asking an individual whose face image is captured to look
straight into the image capturer, or to align in a position in front
of the image capturer (say, a still camera), as described in
further detail hereinbelow.
[0253] Then, the image capturer captures a new image of the
face.
[0254] When the symmetry of the face is successfully verified 810,
the image is pre-processed 880, using one of several pre-processing
methods currently used for face recognition. The pre-processing
methods may be used for sharpening, grey scale modification,
removal of red eyes, etc., as known in the art.
[0255] Finally, the face is identified 890, say by the face
identifier 120, as described in further detail hereinabove.
[0256] Reference is now made to FIG. 9, which is a flowchart
illustrating a fifth method for face recognition, according to an
exemplary embodiment of the present invention.
[0257] In a fifth method, according to an exemplary embodiment of
the present invention, a video image is captured 900, say by a
video camera, as described in further detail hereinabove.
[0258] Next, a face is detected 901 in the captured
image.
[0259] The face may be detected using one or more methods for
detecting a face in the image. The methods include, but are not limited
to: Viola-Jones detection methods, Gabor Jets based methods, skin
detection methods, histogram analysis methods, or other methods
(say methods based on edge maps, gradients, or standard face
shapes, etc.), as known in the art.
[0260] Viola-Jones methods apply several image processing filters
over the whole image. A neural network algorithm is trained over a
training set (say a set of already processed face images). The face
is detected by running the trained neural network and searching for
the best-match values that predict the face center location, as
known in the art.
[0261] Gabor Jets methods use a convolution of a Fourier
Coefficient of the image with wavelet coefficients of low order,
where the values that predict face location are set according to
empirically found predictive values, as known in the art.
[0262] Skin detectors analyze an intensity map presentation of the
image, in order to find the pixel intensity values which comply
with standard skin values, as known in the art.
[0263] Histogram analysis methods analyze a histogram of the image,
say a Pixel Frequency Histogram, after applying several filters
(histogram normalization, histogram stretching, etc.) on the
histogram of the image. The filters applied on the image's
histogram may enable separation of face from background, as known
in the art.
[0264] Next, the image is cropped 902, and thus background is
significantly removed from the image, as illustrated using FIG. 13,
and described in further detail hereinbelow.
[0265] Then, the cropped image is resized 903 into a size, in
accordance with a size standard. The size standard may be set by an
operator of the apparatus 1000.
[0266] The size standard may improve accuracy and efficiency of
identification of the face, since images in a database of face
images, which are substantially the same size as the resized image,
are more likely to be matched with the resized image, for identifying
the face, as described in further detail hereinabove.
[0267] Next, one or more illumination qualities of the image are
improved 904.
[0268] For example, the illumination qualities of the image may be
enhanced using Histogram Equalization, which modifies the dynamic
range and contrast of the image by remapping its pixel intensities.
[0269] Optionally, the histogram equalization employs a monotonic,
nonlinear mapping, which re-assigns the intensity values of pixels
in the image, such that the improved image contains a uniform
distribution of intensities (i.e. a flat histogram).
[0270] Histogram Equalization is usually introduced using
continuous (rather than discrete) process functions, as known in
the art.
[0271] Optionally, the histogram equalization employs linear
mapping, exponential (or logarithmic) mapping, etc., as known in
the art.
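By way of a non-limiting sketch, the Histogram Equalization step described above may be expressed as follows, assuming a grey-scale image held as a 2-D NumPy array; the function name is illustrative and not part of the claimed subject matter:

```python
import numpy as np

def equalize_histogram(image):
    """Re-assign pixel intensities so the output histogram is nearly flat.

    `image` is a 2-D uint8 (grey-scale) array. The mapping is monotonic
    and non-linear, built from the cumulative distribution of intensities.
    """
    hist, _ = np.histogram(image, bins=256, range=(0, 256))
    cdf = hist.cumsum()                       # cumulative distribution
    cdf_min = cdf[cdf > 0].min()              # first occupied intensity bin
    # Look-up table: old intensity -> stretched intensity in [0, 255].
    lut = np.round((cdf - cdf_min) / (cdf[-1] - cdf_min) * 255).astype(np.uint8)
    return lut[image]
```

Applied to a face image before symmetry verification, this tends to spread a narrow dynamic range over the full 0-255 scale.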
[0272] Next, the symmetry of the face in the image is verified 907,
using a symmetry criterion, as described in further detail
hereinbelow.
[0273] Upon successful verification 915 of the symmetry of the face
in the image, there is identified 920 the face in the image, as
described in further detail hereinabove. If the image is found to
be non-symmetric (i.e. if the image fails to comply with the
symmetry criterion), an image of the face is captured again 900, as
described in further detail hereinabove.
[0274] The symmetry criterion may be based on an intensity map of
the image, a phase map of the image, a texture map of the image,
etc., and is predefined before the images are stored in a face
database.
[0275] In the face database, there are stored images of known faces
(or features extracted from images of the known faces, say
biometric stamps, as known in the art), which also meet the
symmetry criterion, as described in further detail hereinbelow.
[0276] Thus, according to exemplary embodiments of the present
invention, the symmetry criterion is enforced on all face images
the method is used on.
[0277] The symmetry criterion may improve accuracy and efficiency
of identification of the face in the image.
[0278] For example, in order to meet the symmetry criterion, the face
is aligned into a position where the face appears symmetric (say a
position where an individual looks straight into a camera).
[0279] Consequently, there is produced a significantly uniform face
alignment amongst the images.
[0280] The uniform face alignment may ease identification of a face
in a new image, through comparison with images in a face database
(or with features extracted from images and stored in the face
database). The identification may be eased, since the uniform face
alignment may increase similarity between face images of the same
individual, especially as far as two dimensional (2D) images are
concerned.
[0281] Consequently, face recognition rates, such as FAR (False
Acceptance Rate) and FRR (False Rejection Rate), may be
reduced.
[0282] Further, when an individual has to align his face into the
position where the individual's face appears symmetric, the
individual is less likely to use an extreme facial expression. Extreme
facial expressions (such as a widely opened mouth) are known to
pose a problem, as far as face recognition (i.e. identification)
is concerned.
[0283] Reference is now made to FIG. 10, which is a flowchart
illustrating a sixth method for face recognition, according to an
exemplary embodiment of the present invention.
[0284] In a sixth method for face recognition, according to an
exemplary embodiment of the present invention, symmetry of a face
in one or more image(s), (say a still video image of a face of an
individual, a sequence of video images of an individual, etc.) is
verified 1010 according to a symmetry criterion.
[0285] Optionally, the symmetry is verified using the face symmetry
verifier 410, as described in further detail hereinabove.
[0286] Optionally, the symmetry criterion pertains to a statistical
model run over previously received images.
[0287] For example, the symmetry criterion may be based on a degree
of deviation of the image (and thus the face captured in the image)
from an average image, as described in further detail
hereinabove.
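As a non-limiting sketch, a deviation-from-average criterion of this kind may be expressed as follows; the `max_deviation` fraction and the function name are assumptions for illustration only:

```python
import numpy as np

def deviation_from_average(image, average_image, max_deviation=0.2):
    """Illustrative statistical symmetry criterion: accept the image only
    if it does not deviate too much from an average image built from
    previously received images.

    `max_deviation` is an assumed fraction of the average intensity;
    it would be tuned empirically in practice.
    """
    diff = np.abs(image.astype(float) - average_image.astype(float))
    return diff.mean() / average_image.mean() <= max_deviation
```

The average image itself would be maintained incrementally as new verified images arrive.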
[0288] Optionally, the symmetry criterion is based on a comparison
made between the image and one or more images previously captured
from the same face, as described in further detail hereinabove.
[0289] Optionally, the symmetry criterion is based on symmetry of a
polygon, which connects selected parts of the image.
[0290] Optionally, the selected parts are known elements of a human
face (say nose, eyes, or mouth).
[0291] The known elements may be identified in the captured image
using techniques known in the art, such as: Viola-Jones algorithms,
Neural Network methods, etc. The centers of the known face elements
identified in the captured image are connected to form the polygon.
The verified symmetry of the polygon serves as an indication for
the symmetry of the face in the captured image, as described in
further detail hereinabove.
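A minimal sketch of such a polygon-based check follows, assuming the landmark centers (say from a Viola-Jones or neural-network detector, not shown here) are given as (x, y) coordinates; the element names, the tolerance, and the function name are all illustrative:

```python
def polygon_symmetry(landmarks, tolerance=0.05):
    """Check mirror symmetry of a polygon of face-element centers.

    `landmarks` maps element names to (x, y) centers. For a symmetric
    face, the nose should lie on the vertical axis midway between the
    eyes, and the eyes should sit at equal heights. `tolerance` is an
    assumed fraction of the inter-eye distance.
    """
    lx, ly = landmarks["left_eye"]
    rx, ry = landmarks["right_eye"]
    nx, _ = landmarks["nose"]
    eye_dist = ((rx - lx) ** 2 + (ry - ly) ** 2) ** 0.5
    axis = (lx + rx) / 2.0  # vertical symmetry axis between the eyes
    return (abs(nx - axis) <= tolerance * eye_dist
            and abs(ly - ry) <= tolerance * eye_dist)
```

A fuller implementation would connect all detected element centers into a polygon and compare the left and right halves of that polygon against the axis.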
[0292] Optionally, the selected parts are segments of the face in
the image.
[0293] The segments are identified in the captured image, using
image segmentation methods known in the art, such as Feature
Oriented Flood Fill, Texture Analysis, Principal Component Analysis
(PCA) based methods, DFT (Discrete Fourier Transform) methods (i.e.
harmonic methods), etc., as described in further detail
hereinabove.
[0294] The symmetry criterion may be defined by a user of apparatus
4000, as described in further detail hereinbelow.
[0295] Optionally, the sixth method further includes using an
intensity map, for verifying 1010 the symmetry of the face in the
image, as described in further detail hereinbelow.
[0296] Optionally, the sixth method further includes using a
gradient map, for verifying 1010 the symmetry of the face in the
image, as described in further detail hereinbelow.
[0297] Optionally, the sixth method further includes using a fast
Fourier Transform (FFT) phase map, for verifying 1010 the symmetry
of the face in the image, as described in further detail hereinbelow.
[0298] Next, the forwarding of the image (or features extracted
from the image) is controlled 1070, say by the forwarding
controller 470, as described in further detail hereinabove.
[0299] For example, when the face in the image is found to be
non-symmetric (i.e. when the face fails to meet the symmetry
criterion), the forwarding of the image to a face identifier 110,
or to another destination (say a destination set by an operator of
apparatus 4000) may be blocked.
[0300] Reference is now made to FIG. 11, which is a flowchart
illustrating a seventh method for face recognition, according to an
exemplary embodiment of the present invention.
[0301] A seventh method, according to a preferred embodiment of the
present invention, uses an intensity map of an image captured, say
by an image capturer (a still camera, a video camera, etc.).
[0302] In the seventh method, the face is found 1101 in the image,
as described in further detail hereinabove.
[0303] Next, the image is cropped 1102, say 15% on each side (top,
bottom, right and left), along a rectangle, as described in further
detail, and illustrated using FIG. 13 hereinbelow.
[0304] The cropped image is resized 1103, say to 100 by 100
pixels.
[0305] Optionally, the image is modified, using histogram
equalization 1104 (say Linear Histogram Equalization), as described
in further detail hereinabove.
[0306] Next, the symmetry of the face in the image is verified
through the following steps:
[0307] The image is divided 1105 into two equal parts: a left side
and a right side, along a vertical line passing through a point in
the middle of the image.
[0308] Next, an average pixel intensity is calculated 1106 using
all pixels of the right part.
[0309] The average pixel intensity is denoted hereinbelow as: Right
Avg.
[0310] An average intensity is also calculated 1106 using all
pixels of the left part, and denoted hereinbelow as: Left Avg.
[0311] Next, the left side is transformed 1107. For each old pixel
P.sub.old (i, j) of the left side, there is computed a new value
using Formula 1, yielding a corresponding new value for the pixel,
denoted hereinbelow as P.sub.new (i, j).
P.sub.new(i,j)=P.sub.old(i,j).times.(Right Avg./Left Avg.) Formula 1
[0312] The new pixel values P.sub.new (i, j) form a new image,
which comprises the new values calculated for the pixels of the
left side, and the original values of the pixels of the right side.
The new image is denoted hereinbelow as: I.sub.new.
[0313] Next, the new image I.sub.new is flipped 1108 over a central
vertical line, to form a flipped image denoted hereinbelow as
I.sub.flipped.
[0314] Then, for each pixel (i, j) there is computed a
difference 1109 between the intensity of the pixel in I.sub.new and
the intensity of the pixel in I.sub.flipped, using Formula 2:
Diff.sub.i,j=|I.sub.new(i,j)-I.sub.flipped(i,j)| Formula 2
[0315] The resultant difference is denoted: Diff.sub.i,j.
[0316] Next, there is computed 1110 the symmetry of the face by
dividing the average of the differences (Diff.sub.i,j) of intensities
of the pixels calculated using Formula 2, by the average of the
intensities of the pixels of I.sub.new, as formulated by Formula
3:
Symmetry=Avg(Diff.sub.i,j)/Avg(I.sub.new) Formula 3
[0317] According to an exemplary embodiment, the threshold for
symmetry (i.e. the symmetry criterion) is set at 0.35. If
Symmetry<0.35, the face is successfully verified 1111 as
symmetric. If Symmetry>=0.35, the face is determined to be
non-symmetric, and a new image has to be captured, as described in
further detail hereinabove.
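Steps 1105-1111 described above may be sketched as follows, assuming the cropped, resized and equalized face is held as a 2-D NumPy array; the function names are illustrative:

```python
import numpy as np

def intensity_symmetry(face):
    """Symmetry score per Formulas 1-3 (lower = more symmetric).

    `face` is a 2-D float array, already cropped, resized and
    histogram-equalized.
    """
    h, w = face.shape
    # Step 1105: split along the central vertical line.
    left, right = face[:, :w // 2], face[:, w - w // 2:]
    # Steps 1106-1107 / Formula 1: rescale the left side so both
    # halves share the same average intensity.
    left_new = left * (right.mean() / left.mean())
    i_new = np.hstack([left_new, right])
    # Step 1108: flip I.new over the central vertical line.
    i_flipped = i_new[:, ::-1]
    # Step 1109 / Formula 2: per-pixel absolute intensity difference.
    diff = np.abs(i_new - i_flipped)
    # Step 1110 / Formula 3: average difference over average intensity.
    return diff.mean() / i_new.mean()

def is_symmetric(face, threshold=0.35):
    """Step 1111: threshold of 0.35 per paragraph [0317]."""
    return intensity_symmetry(face) < threshold
```

On a perfectly mirror-symmetric face the score is zero; random content scores well above the 0.35 threshold.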
[0318] Reference is now made to FIG. 12, which is a flowchart
illustrating an eighth method for face recognition, according to an
exemplary embodiment of the present invention.
[0319] An eighth method, according to a preferred embodiment of the
present invention, uses a phase map of an image captured, say by an
image capturer (a still camera, a video camera, etc.). The phase
map may be calculated using the Fourier Transform (FT), as known in
the art.
[0320] In the eighth method, the face is found 1201 in the image,
as described in further detail hereinabove.
[0321] Next, the image is cropped 1202, say 15% on each side (top,
bottom, right and left), along a rectangle, as described in further
detail, and illustrated using FIG. 13 hereinbelow.
[0322] The cropped image is resized 1203, say to 100 by 100
pixels.
[0323] Optionally, the image is modified, using histogram
equalization 1204 (say Linear Histogram Equalization), as described
in further detail hereinabove.
[0324] Next, the symmetry of the face in the image is verified
through the following steps:
[0325] The image is divided 1205 along a vertical line, into two
equal parts: a left side and a right side.
[0326] Next, the right side is flipped 1206 over the vertical line.
[0327] Next, there is computed 1207 the Fourier Transform (FT) for
the right side and for the left side. The resultant phase maps are
denoted hereinbelow as I.sub.right and I.sub.left respectively.
[0328] Next, there is computed 1208 the difference between
I.sub.right and I.sub.left, using Formula 4, where Diff denotes the
difference between the two.
Diff=|I.sub.right-I.sub.left| Formula 4
[0329] Next, there is computed 1209 symmetry for the image, using
Formula 5.
Symmetry=Diff/(number of pixels of half image) Formula 5
[0330] According to an exemplary embodiment, the threshold for
symmetry (i.e. the symmetry criterion) is set at 35. If
Symmetry<35, the face is successfully verified 1210 as
symmetric. If Symmetry>=35, the face is determined to be
non-symmetric, and a new image has to be captured, as described in
further detail hereinabove.
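Steps 1205-1210 described above may be sketched as follows, assuming the pre-processed face is a 2-D NumPy array. Since Formula 4 only writes Diff=|I.sub.right-I.sub.left|, summing the absolute phase differences over the half-image is one plausible reading; the function name is illustrative:

```python
import numpy as np

def phase_symmetry(face):
    """Symmetry score in the spirit of Formulas 4-5 (lower = more symmetric).

    `face` is a 2-D array, already cropped, resized and equalized.
    """
    h, w = face.shape
    left = face[:, :w // 2]
    # Step 1206: flip the right side over the vertical dividing line.
    right = face[:, w - w // 2:][:, ::-1]
    # Step 1207: phase maps of the two halves, via the 2-D FT.
    phase_left = np.angle(np.fft.fft2(left))
    phase_right = np.angle(np.fft.fft2(right))
    # Step 1208 / Formula 4 (summed over the half image).
    diff = np.abs(phase_right - phase_left).sum()
    # Step 1209 / Formula 5: normalize by pixels per half image.
    return diff / left.size
```

A perfectly mirror-symmetric face yields identical half-image phase maps and a score of zero; the threshold of 35 quoted in paragraph [0330] would depend on how Diff is aggregated in a given implementation.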
[0331] Reference is now made to FIG. 13, which illustrates cropping
of an image of a face, according to an exemplary embodiment of the
present invention.
[0332] According to an exemplary embodiment of the present
invention, an image of a face may be cropped, say 15% on each side,
along a rectangle. Consequently, the background is significantly
removed from the image.
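The cropping described above may be sketched as follows, with the 15% margin from the text as the default; the function name is illustrative:

```python
import numpy as np

def crop_face(image, margin=0.15):
    """Crop `margin` (say 15%) off each side of the image, along a
    rectangle, removing most of the background around a detected face.

    `image` is a 2-D grey-scale (or 3-D color) array.
    """
    h, w = image.shape[:2]
    dy, dx = int(h * margin), int(w * margin)
    return image[dy:h - dy, dx:w - dx]
```

The cropped result would then be resized to the size standard before symmetry verification.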
[0333] The cropping of the image may result in a more efficient and
accurate face recognition, as the identifying is carried out on the
face 1311 itself, without unnecessary processing of background
details, such as a collar 1312, which have nothing to do with the
face itself.
[0334] The removal of the background details may also ease
identification of a face, by introducing increased similarity
between face images of the same individual, especially as far as
two dimensional (2D) images are concerned.
[0335] The methods for face recognition, as described hereinabove,
may also be used in a variety of systems where symmetry information
may prove helpful.
[0336] The systems may include, but are not limited to: 2D or 3D
systems, security systems, access control, HLS (Homeland Security),
ATM (Automatic Teller Machines), web portals, or any application
which requires recognition of the subject.
[0337] The systems may also include: passport picture capturing,
standard image capturing (thus enforcing a standard for image
capturing, say for e-Passport or e-ID generation, as known in the
art).
[0338] Optionally, the apparatuses for face recognition, as
described in further detail hereinabove, may be used on-line for
real time applications, or off-line (say on a database of face
images).
[0339] The apparatuses for face recognition may be implemented
using a Personal Computer, an embedded system, a FPGA (Field
Programmable Gate Array), or any other computing device, as known
in the art.
[0340] Reference is now made to FIGS. 14A, 14B, and 14C, which
illustrate a face recognition scenario, according to an exemplary
embodiment of the present invention.
[0341] In a first recognition scenario, according to an exemplary
embodiment of the present invention, a user approaches a face
recognition system, say a face recognition system based on
apparatus 3000, as described in further detail hereinabove.
[0342] The user may be asked to get closer to a camera (say using a
message displayed on a video monitor), as illustrated in FIG.
14A.
[0343] Next, an image of the user's face is captured by the
camera.
[0344] If the face symmetry verifier 310 finds the face in the
image to be non-symmetric, the user is asked to look straight into
the camera (say using a message displayed on a video monitor), as
illustrated in FIG. 14B.
[0345] The camera captures a second image of the user who looks
straight into the camera. As the user looks straight into the
camera, the face symmetry verifier 310 verifies that the user's
face in the second image is indeed symmetric, as described in
further detail hereinabove. Consequently, the second image is
forwarded to the face identifier 320, which identifies the
user.
[0346] Upon successful identification of the user, a relevant
message is presented to the user, say a welcome message, as
illustrated in FIG. 14C.
[0347] It is expected that during the life of this patent many
relevant devices and systems will be developed and the scope of the
terms herein, particularly of the terms "Camera", "Image", and
"Photo", is intended to include all such new technologies a
priori.
[0348] It is appreciated that certain features of the invention,
which are, for clarity, described in the context of separate
embodiments, may also be provided in combination in a single
embodiment. Conversely, various features of the invention, which
are, for brevity, described in the context of a single embodiment,
may also be provided separately or in any suitable
sub-combination.
[0349] Although the invention has been described in conjunction
with specific embodiments thereof, it is evident that many
alternatives, modifications and variations will be apparent to
those skilled in the art. Accordingly, it is intended to embrace
all such alternatives, modifications and variations that fall
within the spirit and broad scope of the appended claims.
[0350] All publications, patents and patent applications mentioned
in this specification are herein incorporated in their entirety by
reference into the specification, to the same extent as if each
individual publication, patent or patent application was
specifically and individually indicated to be incorporated herein
by reference. In addition, citation or identification of any
reference in this application shall not be construed as an
admission that such reference is available as prior art to the
present invention.
* * * * *