U.S. patent application number 17/359456 was published by the patent office on 2022-03-17 as publication number 20220085971, for privacy protection-based user recognition methods, apparatuses, and devices.
This patent application is currently assigned to ALIPAY (HANGZHOU) INFORMATION TECHNOLOGY CO., LTD.. The applicant listed for this patent is ALIPAY (HANGZHOU) INFORMATION TECHNOLOGY CO., LTD.. Invention is credited to Juntao Zhang, Qixian Zhou.
Application Number: 17/359456
Publication Number: 20220085971
Document ID: /
Family ID: 1000006178511
Publication Date: 2022-03-17

United States Patent Application 20220085971
Kind Code: A1
Zhang; Juntao; et al.
March 17, 2022
PRIVACY PROTECTION-BASED USER RECOGNITION METHODS, APPARATUSES, AND
DEVICES
Abstract
Disclosed herein are methods, systems, and media for
privacy-protected user recognition. One of the methods comprises:
obtaining a biometric feature of a first user; performing
homomorphic encryption on the biometric feature of the first user
to obtain a first ciphertext feature; determining a candidate
ciphertext feature from a predetermined ciphertext feature set
based on the first ciphertext feature and a predetermined graph
structure index, wherein the predetermined ciphertext feature set
comprises a plurality of second ciphertext features obtained by
performing the homomorphic encryption on a plurality of second
biometric features of multiple second users, and wherein the
predetermined graph structure index is generated based on
similarity among at least some of the plurality of second
ciphertext features in the predetermined ciphertext feature set;
and determining a recognition result for the first user based on
the candidate ciphertext feature.
Inventors: Zhang; Juntao (Hangzhou, CN); Zhou; Qixian (Hangzhou, CN)
Applicant: ALIPAY (HANGZHOU) INFORMATION TECHNOLOGY CO., LTD. (Hangzhou, CN)
Assignee: ALIPAY (HANGZHOU) INFORMATION TECHNOLOGY CO., LTD. (Hangzhou, CN)
Family ID: 1000006178511
Appl. No.: 17/359456
Filed: June 25, 2021
Current U.S. Class: 1/1
Current CPC Class: G06V 40/161 20220101; H04L 9/008 20130101; G06F 21/6245 20130101; G06V 40/171 20220101; G06F 21/32 20130101
International Class: H04L 9/00 20060101 H04L009/00; G06F 21/62 20060101 G06F021/62; G06K 9/00 20060101 G06K009/00
Foreign Application Data
Date: Sep 11, 2020; Code: CN; Application Number: 202010952277.X
Claims
1. A computer-implemented method for privacy-protected user
recognition, comprising: collecting data using a terminal device
and extracting a facial feature of a first user based on the
collected data; obtaining a first ciphertext feature comprising:
performing a homomorphic encryption on the facial feature to obtain
a first high-dimensional feature; and performing dimension
reduction on the facial feature and performing the homomorphic
encryption to obtain a first low-dimensional feature; determining a
candidate ciphertext feature from a predetermined ciphertext
feature set based on the first ciphertext feature and a
predetermined graph structure index, wherein the predetermined
ciphertext feature set comprises a plurality of second ciphertext
features obtained by performing the homomorphic encryption on a
plurality of biometric features of multiple second users, and
wherein the predetermined graph structure index is generated based
on similarity among at least some of the plurality of second
ciphertext features in the predetermined ciphertext feature set;
and determining a recognition result for the first user based on
the candidate ciphertext feature.
2. (canceled)
3. (canceled)
4. The computer-implemented method according to claim 1, wherein
the predetermined ciphertext feature set comprises: a
high-dimensional feature subset generated by performing the
homomorphic encryption on the at least some of the plurality of
biometric features of the multiple second users; and a
low-dimensional feature subset generated by performing dimension
reduction on the at least some of the plurality of biometric
features of the multiple second users to get second low-dimensional
features and performing the homomorphic encryption on the second
low-dimensional features, wherein the predetermined graph structure
index is generated based on the low-dimensional feature subset, and
wherein determining a candidate ciphertext feature from the
predetermined ciphertext feature set based on the first ciphertext
feature and the predetermined graph structure index comprises
determining the candidate ciphertext feature from the
low-dimensional feature subset based on the first low-dimensional
feature and the predetermined graph structure index.
5. The computer-implemented method according to claim 4, wherein
determining the recognition result for the first user based on the
candidate ciphertext feature comprises: determining, from the
high-dimensional feature subset, ciphertext features corresponding
to the candidate ciphertext feature as comparison features; and
determining the recognition result for the first user by comparing
the first high-dimensional feature with the comparison
features.
6. The computer-implemented method according to claim 5, wherein
determining the recognition result for the first user by comparing
the first high-dimensional feature with the comparison features
comprises: determining whether a comparison feature matches the
first high-dimensional feature by comparing the first
high-dimensional feature with the comparison features; and in
response to determining that the comparison feature matches the
first high-dimensional feature, determining, as the recognition
result for the first user, that the first user is one of the
multiple second users that corresponds to the comparison
feature.
7. The computer-implemented method according to claim 1, wherein
the predetermined graph structure index comprises one or more graph
nodes representing at least some of the plurality of second
ciphertext features in the predetermined ciphertext feature set and
one or more edges generated between the one or more graph nodes;
and wherein determining the candidate ciphertext feature from the
predetermined ciphertext feature set based on the first ciphertext
feature and the predetermined graph structure index comprises
determining, from the predetermined graph structure index and
through index query, one or more ciphertext features close to the
first ciphertext feature as the candidate ciphertext feature.
8. The computer-implemented method according to claim 7, wherein
the predetermined graph structure index is generated based on a
hierarchical navigable small world (HNSW) algorithm.
9. The computer-implemented method according to claim 1, wherein
the method further comprises, before performing the homomorphic
encryption on the facial feature of the first user, pre-receiving
an encryption key used to obtain the first ciphertext feature
through the homomorphic encryption, and wherein performing the
homomorphic encryption on the facial feature of the first user
comprises performing the homomorphic encryption on the facial
feature of the first user by using the encryption key.
10. The computer-implemented method according to claim 1, wherein
the terminal device is offline or has poor network
connectivity.
11. The computer-implemented method according to claim 1, wherein
the method further comprises, before determining the candidate
ciphertext feature from the predetermined ciphertext feature set
based on the first ciphertext feature and the predetermined graph
structure index: pre-receiving the predetermined ciphertext feature
set sent by a cloud-side device, wherein the plurality of biometric
features of the multiple second users constitute user privacy, and
privacy protection is achieved by performing the homomorphic
encryption on the plurality of biometric features of the multiple
second users to obtain the predetermined ciphertext feature
set.
12. A computer-implemented system, comprising: one or more
computers; and one or more computer memory devices interoperably
coupled with the one or more computers and having tangible,
non-transitory, machine-readable media storing one or more
instructions that, when executed by the one or more computers,
perform operations for privacy-protected user recognition, the operations comprising:
collecting data using a terminal device and extracting a facial
feature of a first user based on the collected data; obtaining a
first ciphertext feature, comprising: performing a homomorphic
encryption on the facial feature to obtain a first high-dimensional
feature; and performing dimension reduction on the facial feature
and performing the homomorphic encryption to obtain a first
low-dimensional feature; determining a candidate ciphertext feature
from a predetermined ciphertext feature set based on the first
ciphertext feature and a predetermined graph structure index,
wherein the predetermined ciphertext feature set comprises a
plurality of second ciphertext features obtained by performing the
homomorphic encryption on a plurality of biometric features of
multiple second users, and wherein the predetermined graph
structure index is generated based on similarity among at least
some of the plurality of second ciphertext features in the
predetermined ciphertext feature set; and determining a recognition
result for the first user based on the candidate ciphertext
feature.
13. (canceled)
14. (canceled)
15. The computer-implemented system according to claim 12, wherein
the predetermined ciphertext feature set comprises: a
high-dimensional feature subset generated by performing the
homomorphic encryption on the at least some of the plurality of
biometric features of the multiple second users; and a
low-dimensional feature subset generated by performing dimension
reduction on the at least some of the plurality of biometric
features of the multiple second users to get second low-dimensional
features and performing the homomorphic encryption on the second
low-dimensional features, wherein the predetermined graph structure
index is generated based on the low-dimensional feature subset, and
wherein determining a candidate ciphertext feature from the
predetermined ciphertext feature set based on the first ciphertext
feature and the predetermined graph structure index comprises
determining the candidate ciphertext feature from the
low-dimensional feature subset based on the first low-dimensional
feature and the predetermined graph structure index.
16. The computer-implemented system according to claim 15, wherein
determining the recognition result for the first user based on the
candidate ciphertext feature comprises: determining, from the
high-dimensional feature subset, ciphertext features corresponding
to the candidate ciphertext feature as comparison features; and
determining the recognition result for the first user by comparing
the first high-dimensional feature with the comparison
features.
17. The computer-implemented system according to claim 16, wherein
determining the recognition result for the first user by comparing
the first high-dimensional feature with the comparison features
comprises: determining whether a comparison feature matches the
first high-dimensional feature by comparing the first
high-dimensional feature with the comparison features; and in
response to determining that the comparison feature matches the
first high-dimensional feature, determining, as the recognition
result for the first user, that the first user is one of the
multiple second users that corresponds to the comparison
feature.
18. The computer-implemented system according to claim 12, wherein
the predetermined graph structure index comprises one or more graph
nodes representing at least some of the plurality of second
ciphertext features in the predetermined ciphertext feature set and
one or more edges generated between the one or more graph nodes;
and wherein determining the candidate ciphertext feature from the
predetermined ciphertext feature set based on the first ciphertext
feature and the predetermined graph structure index comprises
determining, from the predetermined graph structure index and
through index query, one or more ciphertext features close to the
first ciphertext feature as the candidate ciphertext feature.
19. The computer-implemented system according to claim 18, wherein
the predetermined graph structure index is generated based on a
hierarchical navigable small world (HNSW) algorithm.
20. A non-transitory, computer-readable medium storing one or more
instructions executable by a computer system to perform operations
comprising: collecting data using a terminal device and extracting
a facial feature of a first user based on the collected data;
obtaining a first ciphertext feature, comprising: performing a
homomorphic encryption on the facial feature to obtain a first
high-dimensional feature; and performing dimension reduction on the
facial feature and performing the homomorphic encryption to obtain
a first low-dimensional feature; determining a candidate ciphertext
feature from a predetermined ciphertext feature set based on the
first ciphertext feature and a predetermined graph structure index,
wherein the predetermined ciphertext feature set comprises a
plurality of second ciphertext features obtained by performing the
homomorphic encryption on a plurality of biometric features of
multiple second users, and wherein the predetermined graph
structure index is generated based on similarity among at least
some of the plurality of second ciphertext features in the
predetermined ciphertext feature set; and determining a recognition
result for the first user based on the candidate ciphertext
feature.
Description
CROSS-REFERENCE TO RELATED APPLICATIONS
[0001] This application claims priority to Chinese Patent
Application No. 202010952277.X, filed on Sep. 11, 2020, which is
hereby incorporated by reference in its entirety.
TECHNICAL FIELD
[0002] The present specification relates to the field of computer
software technologies, and in particular, to privacy
protection-based user recognition methods, apparatuses, and
devices.
BACKGROUND
[0003] The rapid development of Internet technologies has also
promoted the development of rich and diversified payment methods.
Face scanning payment is a payment method that has emerged and
gradually become popular in recent years, and is widely used in
shopping malls, supermarkets, self-service vending machines,
etc.
[0004] In the existing technology, face scanning payment needs
real-time support from a cloud side, and important processes such
as facial feature comparison are performed on the cloud side.
Therefore, good network support is needed. With the popularity of
face scanning payment, there is also a practical need to deploy it
in subway, bus, group-meal, and similar scenarios. In these
scenarios, networks are often offline or network signals are often
weak. To prevent such situations from adversely affecting the user
experience of face scanning payment, the face scanning payment
process can be delivered from the cloud side to a terminal device;
correspondingly, the pre-registered facial feature library used to
recognize users is also delivered to the terminal device. However,
a user's portrait can be reconstructed from facial features with a
high degree of similarity, so delivering the feature library in
plaintext poses a privacy risk.
[0005] Based on the previous description, a user recognition
solution applicable to a terminal device and able to protect
privacy such as facial features is needed, to better support
services such as face scanning payment.
SUMMARY
[0006] One or more embodiments of the present specification provide
privacy protection-based user recognition methods, apparatuses, and
devices, and storage media, to resolve the following technical
problem: a user recognition solution applicable to a terminal
device and able to protect privacy such as facial features is
needed.
[0007] To resolve the technical problem, the one or more
embodiments of the present specification are implemented as
follows:
[0008] One or more embodiments of the present specification provide
a privacy protection-based user recognition method, including the
following: a biometric feature of a first user is obtained;
homomorphic encryption is performed on the biometric feature of the
first user to obtain a first ciphertext feature; a candidate
ciphertext feature is determined from a predetermined ciphertext
feature set based on the first ciphertext feature and a
predetermined graph structure index, where the ciphertext feature
set includes second ciphertext features obtained by performing
homomorphic encryption on biometric features of multiple second
users, and the graph structure index is generated based on
similarity among ciphertext features obtained through homomorphic
encryption; and a recognition result for the first user is
determined based on the candidate ciphertext feature.
[0009] One or more embodiments of the present specification provide
a privacy protection-based user recognition apparatus, including: a
feature acquisition module, configured to obtain a biometric
feature of a first user; a homomorphic encryption module,
configured to perform homomorphic encryption on the biometric
feature of the first user to obtain a first ciphertext feature; a
graph structure index module, configured to determine a candidate
ciphertext feature from a predetermined ciphertext feature set
based on the first ciphertext feature and a predetermined graph
structure index, where the ciphertext feature set includes second
ciphertext features obtained by performing homomorphic encryption
on biometric features of multiple second users, and the graph
structure index is generated based on similarity among ciphertext
features obtained through homomorphic encryption; and a recognition
determining module, configured to determine a recognition result
for the first user based on the candidate ciphertext feature.
[0010] One or more embodiments of the present specification provide
a privacy protection-based user recognition device, including: at
least one processor; and a memory communicably coupled to the at
least one processor, where the memory stores instructions that can
be executed by the at least one processor, and the instructions are
executed by the at least one processor to enable the at least one
processor to: obtain a biometric feature of a first user; perform
homomorphic encryption on the biometric feature of the first user
to obtain a first ciphertext feature; determine a candidate
ciphertext feature from a predetermined ciphertext feature set
based on the first ciphertext feature and a predetermined graph
structure index, where the ciphertext feature set includes second
ciphertext features obtained by performing homomorphic encryption
on biometric features of multiple second users, and the graph
structure index is generated based on similarity among ciphertext
features obtained through homomorphic encryption; and determine a
recognition result for the first user based on the candidate
ciphertext feature.
[0011] One or more embodiments of the present specification provide
a non-volatile computer storage medium, storing computer-executable
instructions. The computer-executable instructions are set to:
obtain a biometric feature of a first user; perform homomorphic
encryption on the biometric feature of the first user to obtain a
first ciphertext feature; determine a candidate ciphertext feature
from a predetermined ciphertext feature set based on the first
ciphertext feature and a predetermined graph structure index, where
the ciphertext feature set includes second ciphertext features
obtained by performing homomorphic encryption on biometric features
of multiple second users, and the graph structure index is
generated based on similarity among ciphertext features obtained
through homomorphic encryption; and determine a recognition result
for the first user based on the candidate ciphertext feature.
[0012] The at least one technical solution used in the one or more
embodiments of the present specification can achieve the following
beneficial effects: the privacy of the second users can be
effectively protected through homomorphic encryption, and
homomorphic encryption can be normally applied to a terminal device
for features comparison to recognize whether the first user is a
certain one of the second users.
BRIEF DESCRIPTION OF DRAWINGS
[0013] To describe technical solutions in the embodiments of the
present specification or in the existing technology more clearly,
the following briefly describes the accompanying drawings needed
for describing the embodiments or the existing technology. Clearly,
the accompanying drawings in the following descriptions show merely
some embodiments of the present specification, and a person of
ordinary skill in the art can still derive other drawings from
these accompanying drawings without creative efforts.
[0014] FIG. 1 is a schematic flowchart illustrating a privacy
protection-based user recognition method, according to one or more
embodiments of the present specification;
[0015] FIG. 2 is a schematic diagram illustrating an application
scenario of the method in FIG. 1, according to one or more
embodiments of the present specification;
[0016] FIG. 3 is a detailed schematic flowchart illustrating the
method in FIG. 1 in the application scenario in FIG. 2, according
to one or more embodiments of the present specification;
[0017] FIG. 4 is a schematic structural diagram illustrating a
privacy protection-based user recognition apparatus, according to
one or more embodiments of the present specification; and
[0018] FIG. 5 is a schematic structural diagram illustrating a
privacy protection-based user recognition device, according to one
or more embodiments of the present specification.
DESCRIPTION OF EMBODIMENTS
[0019] Embodiments of the present specification provide privacy
protection-based user recognition methods, apparatuses, and
devices, and storage media.
[0020] To make a person skilled in the art better understand the
technical solutions in the present specification, the following
clearly and comprehensively describes the technical solutions in
the embodiments of the present specification with reference to the
accompanying drawings in the embodiments of the present
specification. Clearly, the described embodiments are merely some
rather than all of the embodiments of the present application. All
other embodiments obtained by a person of ordinary skill in the art
based on the embodiments of the present specification without
creative efforts shall fall within the protection scope of the
present application.
[0021] In the one or more embodiments of the present specification,
the provided solutions support a terminal device in an offline
environment or a weak network environment. A cloud side can perform
homomorphic encryption on the privacy of multiple second users and
then send the encrypted privacy to the terminal device to recognize
whether a first user is a certain one of the second users, so that
the privacy of the second users can be effectively protected, and
ciphertext data comparison can effectively reflect similarity among
corresponding plaintext data, thereby ensuring accuracy of a
recognition result. In addition, the use of a graph structure index
helps improve efficiency of determining a candidate ciphertext
feature, thereby helping improve user recognition efficiency and
supporting serving more users. The cloud side here can be
specifically a corresponding application relative to the terminal
device, a device on a server side of the corresponding application,
or a remote third-party device. The corresponding application is,
for example, a face scanning payment application, a fingerprint
payment application, or other applications having a biometric
feature authentication function. The following provides detailed
descriptions based on this idea.
[0022] FIG. 1 is a schematic flowchart illustrating a privacy
protection-based user recognition method, according to one or more
embodiments of the present specification. The method can be
performed by a terminal device, which is referred to as terminal-side
execution. For example, for the application scenario in the
background, the method is applicable to a terminal device even in
an offline environment or a weak network environment and can
achieve a better recognition effect and relatively high recognition
efficiency.
[0023] The terminal device is, for example, a user mobile terminal
device and/or a public terminal device. Common user mobile terminal
devices include a smartphone, a tablet computer, a handheld device,
etc. Common public terminal devices include security inspection
equipment, a turnstile, a surveillance camera, and a cashiering
device in a bus, a subway station, a shopping mall, etc.
[0024] A first user and second users are mentioned in the process.
The first user is a current user to be recognized, and the second
users are users having pre-registered biometric features. Whether
the first user is a certain one of the second users is determined
through recognition.
[0025] The process shown in FIG. 1 can include the following
steps.
[0026] S102. Obtain a biometric feature of a first user.
[0027] In the one or more embodiments of the present specification,
for face recognition-based face scanning payment, the biometric
feature is a facial feature. In addition, the biometric feature can
be a fingerprint, palmprint, iris, sclera, voiceprint, heartbeat,
pulse, genetic material, human tooth bite mark, gait, etc.
[0028] In the one or more embodiments of the present specification,
the biometric feature of the first user is obtained through on-site
collection, which needs to be supported by a sensing device
corresponding to the biometric feature.
[0029] For example, in a subway station, if the turnstile supports
face recognition, the facial features are collected by using a
component such as a camera of the turnstile when the first user
passes through the turnstile. A specific method for obtaining the facial
features includes the following: a data collection operation is
performed on the face of the first user by using an optical
component of the user mobile terminal device or the public terminal
device, and the facial features of the first user are extracted
based on the data collection operation. For example, a face image
(which can be an ordinary face photo) is collected by using the
camera of the turnstile, the face image is input to a pre-trained
neural network model for processing, and the facial features are
extracted by using a hidden layer of the neural network model.
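As one illustration of the extraction step described above, the sketch below stands in for the pre-trained neural network model. The architecture, input size, and feature length are all hypothetical, and a real system would use trained weights rather than the random ones here; the sketch only shows the pattern of reading a hidden layer's activations as the feature vector.

```python
import numpy as np

# Illustrative stand-in for the pre-trained model mentioned above. The
# real network architecture and weights are not specified in the source;
# a single random-weight hidden layer maps a flattened face image to a
# fixed-length feature vector.
rng = np.random.default_rng(0)

IMAGE_PIXELS = 64 * 64   # assumed input size (hypothetical)
FEATURE_DIM = 128        # assumed feature length (hypothetical)

W_hidden = rng.standard_normal((FEATURE_DIM, IMAGE_PIXELS)) * 0.01

def extract_facial_feature(face_image: np.ndarray) -> np.ndarray:
    """Flatten the image and use the hidden-layer activations as the feature."""
    x = face_image.reshape(-1).astype(np.float64)
    return np.tanh(W_hidden @ x)  # hidden-layer output serves as the feature

face = rng.random((64, 64))       # placeholder for a collected face image
feature = extract_facial_feature(face)
print(feature.shape)              # (128,)
```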
[0030] In actual applications, to accurately obtain the facial
feature, support from more optical components may be further
needed. The collected image may further include a projected
dot-matrix image, etc. that reflects depth-of-field information.
For example, the facial features are obtained based on structured
light, and the optical component can include an infrared camera, a
flood light sensor, a dot-matrix projector, etc.
[0031] S104. Perform homomorphic encryption on the biometric
feature of the first user to obtain a first ciphertext feature.
[0032] In the one or more embodiments of the present specification,
the biometric feature is the privacy of the user, and the biometric
feature is protected through homomorphic encryption to prevent the
privacy of the user from being disclosed. An advantage of using
homomorphic encryption here further lies in the following: if
similarity comparison is performed between biometric features
obtained through homomorphic encryption, an obtained comparison
result can also effectively reflect similarity among corresponding
plaintexts. As such, the privacy can be protected while the user
can be recognized based on a ciphertext.
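The property relied on here, that computation on ciphertexts remains meaningful for the underlying plaintexts, can be shown with a toy scheme. The source does not name a specific homomorphic scheme; the Paillier-style sketch below uses deliberately tiny, insecure parameters and only demonstrates the additive homomorphism, whereas a production system would use a secure scheme with properly sized parameters.

```python
import math
import random

# Toy Paillier cryptosystem (additively homomorphic). Demo parameters
# only -- far too small for real use.
p, q = 293, 433
n = p * q
n2 = n * n
g = n + 1                     # standard choice of generator
lam = math.lcm(p - 1, q - 1)  # Carmichael function of n
mu = pow(lam, -1, n)          # with g = n + 1, L(g^lam mod n^2) = lam mod n

def encrypt(m: int) -> int:
    r = random.randrange(1, n)
    while math.gcd(r, n) != 1:
        r = random.randrange(1, n)
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def decrypt(c: int) -> int:
    L = (pow(c, lam, n2) - 1) // n
    return (L * mu) % n

a, b = 17, 25
ca, cb = encrypt(a), encrypt(b)
# Multiplying ciphertexts corresponds to adding the plaintexts:
print(decrypt((ca * cb) % n2))  # 42
```

The point of the demonstration is that operations performed entirely on ciphertexts track relationships among the plaintexts, which is what makes ciphertext-domain feature comparison possible.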
[0033] However, if other encryption methods are used, comparison
between corresponding ciphertexts may be meaningful only when
plaintexts are exactly the same. However, in most biometric feature
recognition scenarios, even for the same user, a biometric feature
registered by the user may be slightly or greatly different from a
biometric feature collected during subsequent recognition based on
collection conditions. Therefore, the two biometric features are
usually merely highly similar, but difficult to be exactly the
same. Based on this, if other encryption methods are used, it is
difficult to achieve the beneficial effect mentioned in the above
paragraph.
[0034] S106. Determine a candidate ciphertext feature from a
predetermined ciphertext feature set based on the first ciphertext
feature and a predetermined graph structure index, where the
ciphertext feature set includes second ciphertext features obtained
by performing homomorphic encryption on biometric features of
multiple second users, and the graph structure index is generated
based on similarity among ciphertext features obtained through
homomorphic encryption.
[0035] In the one or more embodiments of the present specification,
for example, the execution body is the terminal device. The graph
structure index and the ciphertext feature set can be pre-delivered
to the terminal device through online transmission or offline
duplication. Then, the terminal device still can perform the user
recognition process based on the pre-delivered data even if the
terminal device does not have a good network condition. Homomorphic
encryption is performed on the biometric features and then
encrypted biometric features are delivered, so that the privacy of
the second users can be protected.
[0036] In the one or more embodiments of the present specification,
because the ciphertext feature set includes the second ciphertext
features respectively corresponding to the multiple second users,
the multiple different second users can be successfully recognized.
This capability is well suited to recognizing members of a crowd in a
public place. Certainly, this capability also conveniently applies
to recognizing a user by using the user's personal mobile terminal
device.
[0037] Further, the ciphertext feature set can include a large
quantity or even a massive quantity (for example, tens of millions)
of second ciphertext features of second users, thereby helping
improve a capability of recognizing a large-traffic public crowd.
In this case, recognition efficiency becomes more important. Here,
the recognition efficiency is effectively improved by using the
graph structure index. The graph structure index is generated based
on at least some ciphertext features in the ciphertext feature set,
and is used to quickly determine, from the at least some ciphertext
features, a ciphertext feature that has a relatively high degree of
similarity with the first ciphertext feature as the candidate
ciphertext feature.
[0038] In the one or more embodiments of the present specification,
the graph structure index can be generated based on a neighbor
search algorithm, for example, the hierarchical navigable small
world (HNSW) algorithm, the navigable small world (NSW) algorithm,
the K-dimensional (KD) tree, etc. The graph structure index may
already include needed ciphertext features. In this case, when the
graph structure index is delivered, the ciphertext features do not
need to be delivered again, thereby reducing data transmission and
storage costs.
[0039] In the graph structure index, the ciphertext features serve
as graph nodes and can be represented as vectors. A degree of
similarity between the ciphertext features can be measured based on
the distance (such as the Euclidean distance) between the graph
nodes or the cosine of the angle between the corresponding
vectors.
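The neighbor-search principle in the two preceding paragraphs can be sketched as follows. This is a hypothetical single-layer, NSW-style greedy walk over a tiny hand-built graph with Euclidean distance, not the multi-layer HNSW index itself, and the node vectors and edges are illustrative only.

```python
import math

def euclidean(a, b):
    """Euclidean distance between two feature vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

# Hypothetical tiny index: node id -> feature vector, plus neighbor edges.
features = {
    0: [0.0, 0.0], 1: [1.0, 0.2], 2: [2.1, 1.9],
    3: [4.0, 4.1], 4: [5.0, 5.2],
}
edges = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2, 4], 4: [3]}

def greedy_search(query, entry=0):
    """Walk to whichever neighbor is closest to the query until no
    neighbor improves on the current node; the stopping node is the
    candidate, found without scanning the whole feature set."""
    current = entry
    while True:
        best = min(edges[current], key=lambda n: euclidean(features[n], query))
        if euclidean(features[best], query) >= euclidean(features[current], query):
            return current
        current = best

print(greedy_search([4.8, 5.0]))  # 4
```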
[0040] In the one or more embodiments of the present specification,
a reasonable similarity degree threshold can be set to help
determine the candidate ciphertext feature. If a relatively high
similarity degree threshold is set, a candidate ciphertext feature
that satisfies the requirement may not be determined. This means
that it is very likely that the first user is not any one of the
second users. In this case, to improve recognition efficiency, it
can be directly determined, as the recognition result, that the
first user cannot be successfully recognized, thereby helping start
a recognition process for another user as soon as possible.
[0041] S108. Determine a recognition result for the first user
based on the candidate ciphertext feature.
[0042] In the one or more embodiments of the present specification,
there can be one or more candidate ciphertext features. Each
candidate ciphertext feature corresponds to a second user. Based on
the candidate ciphertext feature, whether a degree of similarity
between the corresponding second user and the first user satisfies
a predetermined condition (for example, is greater than a
predetermined similarity degree threshold) is determined. If yes,
it can be determined that the first user is the second user.
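The predetermined-condition check described above can be sketched as follows. The function name, the (index, similarity) pair format, and the threshold value are illustrative assumptions, not part of the specification:

```python
def recognize(candidate_sims, threshold=0.8):
    """Return the index of the matching second user, or None when no
    candidate similarity exceeds the predetermined threshold (so the
    recognition fails fast and the next user can be processed)."""
    best_idx, best_sim = None, threshold
    for idx, sim in candidate_sims:
        if sim > best_sim:
            best_idx, best_sim = idx, sim
    return best_idx
```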
[0043] According to the method in FIG. 1, the privacy of the second
users can be effectively protected through homomorphic encryption,
and homomorphic encryption can still be applied on a terminal
device for feature comparison, to recognize whether the first user
is a certain one of the second users.
[0044] Based on the method in FIG. 1, the present application
further provides some specific implementation solutions and
extended solutions of the method, which are continuously described
below.
[0045] In the one or more embodiments of the present specification,
with the development of sensing technologies and big data
technologies, in actual applications, a vector of more dimensions
is often used to accurately represent a biometric feature. The
increase in dimension causes an increase in processing burden,
which is not conducive to improving recognition efficiency. Based
on this, dimension reduction processing can be performed on the
biometric feature and then a biometric feature obtained after the
dimension reduction processing is used for a recognition process.
However, the dimension reduction processing can cause a reduction
in accuracy of the biometric feature. In consideration of accuracy,
the biometric feature obtained before the dimension reduction and
the biometric feature obtained after the dimension reduction can
both be used in the recognition process, thereby achieving a better
balance between accuracy and efficiency.
[0046] Some of the following embodiments are described based on
this idea. For ease of description, the following uses two concepts
"high-dimensional" and "low-dimensional" to distinguish between
dimensions respectively corresponding to a feature obtained before
dimension reduction and a feature obtained after the dimension
reduction. It is worthwhile to note that high/low here is relative,
and a specific range of high/low can be defined based on actual
conditions such as different biometric features and different
solution implementation capabilities.
[0047] Facial features are used as an example. If an original
collected facial feature is 512-dimensional, it can be considered
that 512-dimensional belongs to high-dimensional; and if the
512-dimensional facial feature is reduced to a 64-dimensional or
32-dimensional facial feature through dimension reduction
processing, it can be considered that 64-dimensional and
32-dimensional belong to low-dimensional. Similarly, if an original
collected facial feature is 64-dimensional, and the 64-dimensional
facial feature is reduced to a 32-dimensional facial feature
through dimension reduction processing, it can be considered that
64-dimensional belongs to high-dimensional and 32-dimensional
belongs to low-dimensional.
[0048] In the one or more embodiments of the present specification,
generally, homomorphic encryption is performed on both a
high-dimensional feature and a corresponding low-dimensional
feature to respectively obtain a high-dimensional ciphertext and a
low-dimensional ciphertext; and a low-dimensional similar feature
is quickly indexed by using the low-dimensional ciphertext, a
high-dimensional similar feature corresponding to the
low-dimensional similar feature is determined, and the
high-dimensional ciphertext is compared with the high-dimensional
similar feature to determine a recognition result. Based on this
idea, efficiency is improved through low-dimensional indexing, and
accuracy is improved through high-dimensional comparison, so that
both efficiency and accuracy are taken into account, thereby
helping achieve a better effect as a whole. The following provides
detailed descriptions based on this idea.
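A minimal plaintext sketch of this two-stage idea follows: a cheap low-dimensional search narrows the set to a few candidates, and an accurate high-dimensional comparison decides among them. The brute-force candidate search stands in for the graph structure index, and all names and thresholds are assumptions:

```python
import math

def l2(u, v):
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(u, v)))

def two_stage_recognize(query_high, query_low, db_high, db_low, k=3, max_dist=1.0):
    # Stage 1: cheap low-dimensional search yields k candidate user indices
    # (a brute-force stand-in for the graph structure index query).
    candidates = sorted(range(len(db_low)), key=lambda i: l2(query_low, db_low[i]))[:k]
    # Stage 2: accurate high-dimensional comparison over the candidates only.
    best = min(candidates, key=lambda i: l2(query_high, db_high[i]))
    # Predetermined condition: the high-dimensional match must be close enough.
    return best if l2(query_high, db_high[best]) <= max_dist else None
```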
[0049] The ciphertext feature set is pre-generated, and the step is
performed on, for example, a cloud side. Specifically, for the
second user, a biometric feature of the second user is obtained,
and homomorphic encryption is performed on the biometric feature of
the second user to obtain a second ciphertext high-dimensional
feature; and dimension reduction processing is performed on the
biometric feature of the second user and then homomorphic
encryption is performed to obtain a second ciphertext
low-dimensional feature. The multiple second users can be
separately processed in a similar way to obtain respective second
ciphertext high-dimensional features and second ciphertext
low-dimensional features of the second users. These second
ciphertext high-dimensional features constitute a ciphertext
high-dimensional feature subset, these second ciphertext
low-dimensional features constitute a ciphertext low-dimensional
feature subset, and the ciphertext high-dimensional feature subset
and the ciphertext low-dimensional feature subset constitute the
previously described predetermined ciphertext feature set.
[0050] The graph structure index is pre-generated, and the step is
performed on, for example, a cloud side. Specifically, the graph
structure index is generated based on the ciphertext
low-dimensional feature subset. In this case, the candidate
ciphertext feature is determined from the ciphertext
low-dimensional feature subset. Assume that the graph structure
index is generated based on the HNSW algorithm. Specifically, a
second ciphertext low-dimensional feature corresponding to each
second user in the ciphertext low-dimensional feature subset is
separately used as one graph node, and each graph node is added
based on a node adding method of the HNSW algorithm (a proper edge
is created based on a corresponding policy in the adding process),
to generate the graph structure index.
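The node-adding policy can be illustrated with a greatly simplified, single-layer sketch. Real HNSW maintains multiple layers and a heuristic edge-selection policy; connecting each new node to its m nearest predecessors is only a rough stand-in for that policy:

```python
import math

def l2(u, v):
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(u, v)))

def build_graph_index(features, m=2):
    """Simplified single-layer graph index: each new node is connected to
    its m nearest previously inserted nodes, a rough sketch of the
    NSW/HNSW edge-creation policy."""
    edges = {i: set() for i in range(len(features))}
    for i in range(1, len(features)):
        nearest = sorted(range(i), key=lambda j: l2(features[i], features[j]))[:m]
        for j in nearest:
            edges[i].add(j)
            edges[j].add(i)  # edges are bidirectional, as in step (3) below
    return edges
```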
[0051] The execution body of the recognition method pre-obtains the
ciphertext high-dimensional feature subset and the graph structure
index. For example, the terminal device serves as the execution
body to pre-receive needed data, such as the ciphertext
high-dimensional feature subset and the graph structure index,
generated and delivered by the cloud side. The cloud side is the
side opposite to the execution body, and subsequent steps are
likewise performed by a device on the side where the execution body
is located.
[0052] The process of recognizing the first user is started to
first obtain the biometric feature of the first user, and perform
homomorphic encryption on the biometric feature of the first user
to obtain the first ciphertext feature. Specifically, homomorphic
encryption is performed on the biometric feature of the first user
to obtain a first ciphertext high-dimensional feature; and
dimension reduction processing is performed on the biometric
feature of the first user and then homomorphic encryption is
performed to obtain a first ciphertext low-dimensional feature.
[0053] Processing parameters used for processing, such as dimension
reduction and homomorphic encryption, performed on related data of
the second user and the first user can be associated with each
other or even the same, to reduce interference factors, so that a
result of subsequent similarity comparison can be more accurate and
reliable.
[0054] For example, assume that the second ciphertext feature is
generated by the cloud side by performing homomorphic encryption
with key A as the encryption key. In this case, key A sent by the
cloud side can be pre-received, and homomorphic encryption is also
performed on the biometric feature of the first user by using key A
to obtain the first ciphertext feature. Similarly, assume that the
second ciphertext high-dimensional feature and the second
ciphertext low-dimensional feature are generated by performing
homomorphic encryption by respectively using key a and key b. In
this case, the first ciphertext high-dimensional feature and the
first ciphertext low-dimensional feature can be correspondingly
generated by performing homomorphic encryption by respectively
using key a and key b.
[0055] The candidate ciphertext feature is determined. Assume that
the graph structure index includes graph nodes representing at
least some ciphertext features in the ciphertext feature set and an
edge generated between the graph nodes for index query, one or more
ciphertext features close to the first ciphertext feature can be
specifically determined from the graph structure index through
index query as the candidate ciphertext feature. The advantage of
low-dimensional related processing in terms of improving efficiency
has been previously described. Based on the description, if the at
least some ciphertext features are, for example, the ciphertext
low-dimensional feature subset, the candidate ciphertext feature is
determined from the ciphertext low-dimensional feature subset
through index query based on the first ciphertext low-dimensional
feature and the predetermined graph structure index.
[0056] The recognition result for the first user is determined
based on the candidate ciphertext feature. The candidate ciphertext
feature already reflects a degree of similarity between the first
user and the second user in a low-dimensional case. Therefore, if a
degree of similarity between a certain candidate ciphertext feature
and the first ciphertext low-dimensional feature (low-dimensional
comparison) is high enough, it can be determined that the first
user is a second user corresponding to the candidate ciphertext
feature. Certainly, to obtain a more accurate and reliable
recognition result, comparison can be performed in a
high-dimensional case.
[0057] Specifically, ciphertext features corresponding to the
candidate ciphertext feature are determined from the ciphertext
high-dimensional feature subset as comparison features, and the
first ciphertext high-dimensional feature is compared with the
comparison features (high-dimensional comparison) to determine the
recognition result for the first user. For example, the first
ciphertext high-dimensional feature is compared with the comparison
features to determine whether there is a comparison feature that
successfully matches the first ciphertext high-dimensional feature;
and if yes, it is determined that the first user is a second user
corresponding to the successfully matched high-dimensional feature
for the recognition result of the first user.
[0058] In the one or more embodiments of the present specification,
alternatively, a high-dimensional feature can be compared with a
low-dimensional feature. For example, the second ciphertext
high-dimensional features and the second ciphertext low-dimensional
features are constructed in the same graph to improve comparability
between the high-dimensional feature and the low-dimensional
feature, and then the first ciphertext high-dimensional feature is
compared with the second ciphertext low-dimensional feature or the
first ciphertext low-dimensional feature is compared with the
second ciphertext high-dimensional feature based on the
corresponding graph structure index to obtain the recognition
result for the first user.
[0059] Based on the previous descriptions, more intuitively, the
one or more embodiments of the present specification further
provide an application scenario of the method in FIG. 1 and a
detailed process of the method in FIG. 1 in the application
scenario, which are described with reference to FIG. 2 and FIG.
3.
[0060] FIG. 2 is a schematic diagram illustrating an application
scenario of the method in FIG. 1, according to one or more
embodiments of the present specification. This application scenario
includes devices on two sides: a cloud side and a terminal side. A
biometric feature is specifically a facial feature, and the
terminal side specifically includes a public face collection
terminal device disposed in a place such as a subway, a bus, or a
shopping mall. Multiple users pre-register respective facial
features in a cloud side. The cloud side performs privacy
protection on the facial features registered by the users, and
pre-delivers data that face recognition needs to be based on to the
terminal side. The terminal side performs face recognition on
multiple users on-site based on the data delivered by the cloud
side, to authenticate user identities and perform a service
operation such as money deduction for an authenticated user.
[0061] FIG. 3 is a detailed schematic flowchart illustrating the
method in FIG. 1 in the application scenario in FIG. 2, according
to one or more embodiments of the present specification.
[0062] In this application scenario, a graph structure index is
generated by using the HNSW algorithm, a degree of similarity
between graph nodes is measured by using the Euclidean distance,
and the processing functions provided by the homomorphic encryption
algorithm and the HNSW algorithm are used to improve processing
efficiency.
[0063] Processing functions provided by the homomorphic encryption
algorithm include the following:
[0064] HE.INIT( ) represents a key generation function, used to
generate a public-private key pair (PK,SK) for homomorphic
encryption, where PK represents a public key, and SK represents a
private key.
[0065] HE.Enc(PK,x) represents an encryption function, used to
perform homomorphic encryption on plaintext x by using PK as an
encryption key, to obtain a corresponding ciphertext, denoted as
c.
[0066] HE.Dec(SK,c) represents a decryption function, used to
decrypt ciphertext c by using SK as a decryption key, to obtain
corresponding plaintext x.
[0067] HE.EncDist(c_1,c_2) represents a ciphertext Euclidean
distance calculation function, used to calculate the Euclidean
distance between ciphertext c_1 and ciphertext c_2.
[0068] Assume that there is a data set: Data={x_1, x_2, . . . ,
x_n}, where x_i is an m-dimensional feature vector.
[0069] Keys are generated by using the key generation function:
(PK,SK)=HE.INIT( ).
[0070] Homomorphic encryption is performed on the elements in Data
by using the encryption function (for ease of description, some of
the following steps are represented by using pseudocode):
[0071] for x_i in Data, c_i=HE.Enc(PK,x_i), so that a corresponding
ciphertext set is obtained: EncData={c_1, c_2, . . . , c_n}.
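The HE.* interface above can be mocked for experimentation. The following toy is NOT real homomorphic encryption (a production system would use a scheme such as CKKS or Paillier); the additive mask merely imitates the one property exercised here, namely that distances computed on ciphertexts under the same key equal the plaintext distances, because the masks cancel coordinate-wise:

```python
import math
import random

def he_init(seed=0):
    # Toy key generation: in this mock, PK and SK share the same material.
    key = random.Random(seed).randrange(1, 2**32)
    return key, key  # (PK, SK)

def he_enc(pk, x):
    # Mock "ciphertext": every coordinate is shifted by the key.
    return [v + pk for v in x]

def he_dec(sk, c):
    return [v - sk for v in c]

def he_enc_dist(c1, c2):
    # The masks cancel in each coordinate difference, so this equals the
    # plaintext Euclidean distance (in this toy model only).
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(c1, c2)))
```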
[0072] A function provided by the HNSW algorithm and used to
identify the closest ciphertext node is used:
HNSW.findNearestPoint(c_i, EncData).
[0073] For a specified ciphertext node c_i, a ciphertext node
closest to c_i is identified from EncData, and is denoted as c_j.
Details are as follows:
[0074] dist_i,j=HE.EncDist(c_i,c_j) is calculated sequentially for
c_j ∈ {c_1, c_2, . . . , c_n} to obtain a set:
dist={dist_i,1, dist_i,2, . . . , dist_i,n}.
[0075] After the elements in dist are sorted, the minimum value min
dist_i,j and the corresponding ciphertext node c_j are
identified.
[0076] The HNSW algorithm further provides a function used to add a
graph node:
HNSW.Add(c_i).
[0077] Specifically, for a node c_i that has not been added to the
graph, the number of layers for the current node, level, is
calculated, for example, by using the equation
level=int(-log_2(random)*self.level_mult)+1; and the node ep
closest to c_i is sequentially identified starting from the highest
layer of the current graph, where ep at the previous layer is used
as input of the current layer, and ep can be calculated by using
HNSW.findNearestPoint( ). By analogy, the node ep closest to c_i at
the (level)th layer is identified. Then, c_i is sequentially added
to all layers of the graph from the (level-1)th layer to the 0th
layer according to the following steps:
[0078] (1) At each layer, the ef nodes closest to c_i are
identified by using HNSW.findNearestPoint( ) starting from ep, and
ep is updated.
[0079] (2) After ep is identified, m nodes are identified from the
ef nodes by using a heuristic selection algorithm, and c_i is
connected to the m nodes.
[0080] (3) The m nodes are also connected to c_i.
[0081] (4) Then, ep is transmitted to the next-layer search as
input.
[0082] Steps (1) to (4) are repeated until node c_i is added to the
current graph at the 0th layer.
[0083] The HNSW algorithm further provides a function used to
generate a graph structure index:
HNSW.index(EncData).
[0084] For a specified node set EncData, the nodes in the set are
added to a graph to generate a graph structure index:
for c_i in EncData
HNSW.Add(c_i)
[0085] The HNSW algorithm further provides a function used to query
nodes near a specified node:
HNSW.query(c_i,k).
[0086] For a specified node c_i, the node elem closest to c_i is
identified from the highest layer L by using
HNSW.findNearestPoint( ), and elem is used as the start point of
the (L-1)th layer to continue the query, to identify the node point
closest to elem. The step is repeated until the 0th layer. At the
0th layer, the k nodes closest to point are identified starting
from the current point and returned.
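The per-layer greedy walk underlying HNSW.findNearestPoint( ) and HNSW.query( ) can be sketched on a single layer as follows; the adjacency-dict graph representation and function names are assumptions:

```python
import math

def l2(u, v):
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(u, v)))

def greedy_nearest(query, features, edges, start=0):
    """Greedy walk over one graph layer: repeatedly move to the neighbor
    closest to the query until no neighbor improves on the current node
    (the core of the per-layer search in HNSW)."""
    cur = start
    while True:
        better = min(edges[cur] | {cur}, key=lambda j: l2(query, features[j]))
        if better == cur:
            return cur
        cur = better
```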
[0087] The process in FIG. 3 is described based on the previous
descriptions of the functions and the processes. The process in
FIG. 3 can include two phases: an initialization phase on a cloud
side and a recognition phase on the terminal side.
[0088] The initialization phase on the cloud side can include the
following steps:
[0089] Dimension reduction processing is performed on
high-dimensional facial feature database Data_high (the previously
described biometric features of the multiple second users) to
obtain low-dimensional facial feature database Data_low. Assume
that the facial features are produced by a neural network model.
Then the specific dimension number of the high dimension depends on
the number of nodes at the corresponding hidden layer or output
layer of the neural network model, the specific dimension number of
the low dimension is set based on, for example, the specific
scenario and a recall rate requirement, and the dimension reduction
processing algorithm used can also be set based on the specific
scenario.
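As one hedged example of the dimension reduction step, whose concrete algorithm the text leaves open to the specific scenario, a random projection can map a high-dimensional feature onto a chosen low dimension; PCA or a learned projection would serve equally well:

```python
import random

def random_projection(vec, out_dim, seed=0):
    """Hypothetical dimension-reduction step: project a high-dimensional
    feature onto out_dim fixed random Gaussian directions. The seed keeps
    the projection identical on the cloud side and the terminal side."""
    rng = random.Random(seed)
    proj = [[rng.gauss(0, 1) for _ in vec] for _ in range(out_dim)]
    return [sum(w * v for w, v in zip(row, vec)) for row in proj]
```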
[0090] PK_high and PK_low are generated by using HE.INIT( ) and
stored, and homomorphic encryption is performed on the elements in
Data_high and Data_low based on PK_high and PK_low by using
HE.Enc( ), to obtain EncData_high (the previously described
ciphertext high-dimensional feature subset) and EncData_low (the
previously described ciphertext low-dimensional feature
subset).
[0091] HNSW (the previously described graph structure index) is
generated based on EncData_low by using HNSW.index( ).
[0092] EncData_high, HNSW, PK_high, and PK_low are delivered to the
terminal side and stored in a database on the terminal side for
use.
[0093] The recognition phase on the terminal side can include the
following steps:
[0094] A terminal device collects a face image of a first user, and
extracts facial feature Vec_high (the previously described
biometric feature of the first user) from the face image by using
the pre-trained neural network model FaceNet. The facial features
in Data_high can also be extracted by using FaceNet, which achieves
better consistency, thereby helping reduce interference
factors.
[0095] Dimension reduction processing is performed on Vec_high to
obtain Vec_low, where the dimension reduction processing algorithm
used can be consistent with the one used on the cloud side.
[0096] Homomorphic encryption is performed on Vec_high and Vec_low
based on PK_high and PK_low by using HE.Enc( ), to obtain
EncVec_high (the previously described first ciphertext
high-dimensional feature) and EncVec_low (the previously described
first ciphertext low-dimensional feature).
[0097] The first K ciphertext features (the previously described
candidate ciphertext feature) closest to EncVec_low are determined
from HNSW through index query by using HNSW.query( ).
[0098] EncVec_high is compared with the ciphertext features (the
previously described comparison features) corresponding to the
first K ciphertext features in EncData_high, or EncVec_high is
compared with the first K ciphertext features, to obtain a
comparison result, thereby recognizing the first user.
[0099] Based on the same idea, biometric features of multiple
second users can be combined and then homomorphic encryption can be
performed, and a corresponding graph structure index can be
generated, to support simultaneous recognition of multiple first
users, thereby recognizing a large-traffic public crowd more
efficiently.
[0100] Based on the same idea, the one or more embodiments of the
present specification further provide an apparatus and a device
corresponding to the previously described method, as shown in FIG.
4 and FIG. 5.
[0101] FIG. 4 is a schematic structural diagram illustrating a
privacy protection-based user recognition apparatus, according to
one or more embodiments of the present specification. Dashed-line
blocks in the figure represent optional modules, and the apparatus
includes:
[0102] a feature acquisition module 402, configured to obtain a
biometric feature of a first user;
[0103] a homomorphic encryption module 404, configured to perform
homomorphic encryption on the biometric feature of the first user
to obtain a first ciphertext feature;
[0104] a graph structure index module 406, configured to determine
a candidate ciphertext feature from a predetermined ciphertext
feature set based on the first ciphertext feature and a
predetermined graph structure index, where the ciphertext feature
set includes second ciphertext features obtained by performing
homomorphic encryption on biometric features of multiple second
users, and the graph structure index is generated based on
similarity among ciphertext features obtained through homomorphic
encryption; and
[0105] a recognition determining module 408, configured to
determine a recognition result for the first user based on the
candidate ciphertext feature.
[0106] Optionally, the biometric feature includes a facial
feature.
[0107] The feature acquisition module 402 includes a collection
module 4022 and an extraction module 4024.
[0108] The collection module 4022 is configured to perform a data
collection operation on the face of the first user by using an
optical component of a user mobile terminal device or a public
terminal device.
[0109] The extraction module 4024 is configured to extract the
facial features of the first user based on the data collection
operation.
[0110] Optionally, the first ciphertext feature includes a first
ciphertext high-dimensional feature and a first ciphertext
low-dimensional feature.
[0111] The homomorphic encryption module 404 includes an encryption
module 4042 and a dimension reduction module 4044.
[0112] The encryption module 4042 is configured to perform
homomorphic encryption on the biometric feature of the first user
to obtain the first ciphertext high-dimensional feature.
[0113] The dimension reduction module 4044 is configured to perform
dimension reduction processing on the biometric feature of the
first user and then the encryption module is configured to perform
homomorphic encryption to obtain the first ciphertext
low-dimensional feature.
[0114] Optionally, the ciphertext feature set includes a ciphertext
high-dimensional feature subset and a ciphertext low-dimensional
feature subset correspondingly generated based on dimension
reduction processing, and the graph structure index is generated
based on the ciphertext low-dimensional feature subset.
[0115] The graph structure index module 406 is configured to
determine the candidate ciphertext feature from the predetermined
ciphertext low-dimensional feature subset based on the first
ciphertext low-dimensional feature and the predetermined graph
structure index.
[0116] Optionally, the recognition determining module 408 includes
a comparison feature determining module 4082 and a comparison
recognition module 4084.
[0117] The comparison feature determining module 4082 is configured
to determine ciphertext features corresponding to the candidate
ciphertext feature from the ciphertext high-dimensional feature
subset as comparison features.
[0118] The comparison recognition module 4084 is configured to
compare the first ciphertext high-dimensional feature with the
comparison features to determine the recognition result for the
first user.
[0119] Optionally, the comparison recognition module 4084 is
configured to compare the first ciphertext high-dimensional feature
with the comparison features to determine whether there is a
comparison feature that successfully matches the first ciphertext
high-dimensional feature; and if yes, determine that the first user
is a second user corresponding to the successfully matched
comparison feature as the recognition result for the first
user.
[0120] Optionally, the graph structure index includes graph nodes
representing at least some ciphertext features in the ciphertext
feature set and an edge generated between the graph nodes for index
query.
[0121] The graph structure index module 406 is configured to
determine one or more ciphertext features close to the first
ciphertext feature from the graph structure index through index
query as the candidate ciphertext feature.
[0122] Optionally, the graph structure index is generated based on
the HNSW algorithm.
[0123] Optionally, the apparatus further includes a first receiving
module 4010.
[0124] The first receiving module 4010 is configured to pre-receive
an encryption key sent by a cloud side and used when the second
ciphertext feature is obtained through homomorphic encryption
before the homomorphic encryption module 404 performs homomorphic
encryption on the biometric feature of the first user.
[0125] The homomorphic encryption module 404 is configured to
perform homomorphic encryption on the biometric feature of the
first user by using the encryption key.
[0126] Optionally, the apparatus is applied to the following
devices in an offline environment or a weak network environment: a
user mobile terminal device and/or a public terminal device.
[0127] Optionally, the apparatus further includes a second
receiving module 4012, configured to pre-receive the ciphertext
feature set sent by the cloud side before the graph structure index
module 406 determines the candidate ciphertext feature from the
predetermined ciphertext feature set based on the first ciphertext
feature and the predetermined graph structure index.
[0128] The privacy includes the biometric features of the multiple
second users, and the privacy is protected by the cloud side by
performing homomorphic encryption on the biometric features of the
multiple second users to obtain the ciphertext feature set.
[0129] FIG. 5 is a schematic structural diagram illustrating a
privacy protection-based user recognition device, according to one
or more embodiments of the present specification. The device
includes: at least one processor; and a memory communicably coupled
to the at least one processor.
[0130] The memory stores instructions that can be executed by the
at least one processor, and the instructions are executed by the at
least one processor to enable the at least one processor to: obtain
a biometric feature of a first user; perform homomorphic encryption
on the biometric feature of the first user to obtain a first
ciphertext feature; determine a candidate ciphertext feature from a
predetermined ciphertext feature set based on the first ciphertext
feature and a predetermined graph structure index, where the
ciphertext feature set includes second ciphertext features obtained
by performing homomorphic encryption on biometric features of
multiple second users, and the graph structure index is generated
based on similarity among ciphertext features obtained through
homomorphic encryption; and determine a recognition result for the
first user based on the candidate ciphertext feature.
[0131] The processor can communicate with the memory by using a
bus, and the device can further include an input/output interface
for communicating with other devices.
[0132] Based on the same idea, the one or more embodiments of the
present specification further provide a non-volatile computer
storage medium corresponding to the previously described method,
storing computer-executable instructions. The computer-executable
instructions are set to: obtain a biometric feature of a first
user; perform homomorphic encryption on the biometric feature of
the first user to obtain a first ciphertext feature; determine a
candidate ciphertext feature from a predetermined ciphertext
feature set based on the first ciphertext feature and a
predetermined graph structure index, where the ciphertext feature
set includes second ciphertext features obtained by performing
homomorphic encryption on biometric features of multiple second
users, and the graph structure index is generated based on
similarity among ciphertext features obtained through homomorphic
encryption; and determine a recognition result for the first user
based on the candidate ciphertext feature.
[0133] In the 1990s, whether a technical improvement is a hardware
improvement (for example, an improvement of circuit structures,
such as a diode, a transistor, or a switch) or a software
improvement (an improvement of a method process) can be clearly
distinguished. However, as technologies develop, current
improvements of many method processes can be considered as direct
improvements on hardware circuit structures. Almost all designers
program an improved method process into a hardware circuit, to
obtain a corresponding hardware circuit structure. Therefore, a
method process can be implemented by using a hardware entity
module. For example, a programmable logic device (PLD) (for
example, a field programmable gate array (FPGA)) is such an
integrated circuit, and a logical function of the PLD is determined
by a user through device programming. A designer performs
programming to "integrate" a digital system to a single PLD,
without needing a chip manufacturer to design and manufacture a
dedicated integrated circuit chip. In addition, at present, instead
of manually manufacturing an integrated circuit chip, this type of
programming is mostly implemented by using "logic compiler"
software. The "logic compiler" software is similar to a software
compiler used to develop and write a program. Original code needs
to be written in a particular programming language before
compilation. The language is referred to as a hardware description
language (HDL). There are many HDLs, such as the Advanced Boolean
Expression Language (ABEL), the Altera Hardware Description
Language (AHDL), Confluence, the Cornell University Programming
Language (CUPL), HDCal, the Java Hardware Description Language
(JHDL), Lava, Lola, MyHDL, PALASM, and the Ruby Hardware
Description Language (RHDL). The Very-High-Speed Integrated Circuit
Hardware Description Language (VHDL) and Verilog are most commonly
used at present. A person skilled in the art should also understand
that a hardware circuit that implements a logical method process
can be readily obtained provided that the method process is
logically programmed by using one of the previously described
HDLs and is programmed into an integrated circuit.
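As a purely illustrative sketch (not part of the claimed subject matter), the logical behavior that a designer later describes in an HDL can first be prototyped and exhaustively checked in ordinary software. The Python example below models a one-bit half adder, the kind of combinational logic a PLD or FPGA might implement; the function and variable names are hypothetical and chosen only for this illustration.

```python
# Illustrative sketch: a one-bit half adder, an example of the
# combinational logic a designer might later describe in an HDL
# (e.g., Verilog or VHDL) and program into a PLD/FPGA.
# All names here are hypothetical.

def half_adder(a: int, b: int) -> tuple[int, int]:
    """Return (sum, carry) for the one-bit inputs a and b."""
    s = a ^ b  # XOR gate produces the sum bit
    c = a & b  # AND gate produces the carry bit
    return s, c

# Exhaustively verify the truth table, much as an HDL simulation
# step would before the design is "integrated" into a single PLD.
for a in (0, 1):
    for b in (0, 1):
        s, c = half_adder(a, b)
        assert s == (a + b) % 2
        assert c == (a + b) // 2
```

Exhaustive checking is practical here because a one-bit circuit has only four input combinations; the same truth-table style of verification underlies the simulation step of the "logic compiler" software described above.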
[0134] A controller can be implemented by using any appropriate
method. For example, the controller can be in the form of a
microprocessor or a processor, a computer-readable medium that
stores computer-readable program code (such as software or
firmware) executable by the microprocessor or the processor, a
logic gate, a switch, an application-specific integrated circuit
(ASIC), a programmable logic controller, or a built-in
microcontroller. Examples of the controller include but are not
limited to the following microcontrollers: ARC 625D, Atmel
AT91SAM, Microchip PIC18F26K20, and Silicon Labs C8051F320. The
memory controller can also be implemented as a part of control
logic of the memory. A person skilled in the art should also know
that, in addition to implementing the controller by using only
computer-readable program code, method steps can be logically
programmed to allow the controller to implement the same function
in the form of a logic gate, a switch, an ASIC, a programmable
logic controller, or a built-in microcontroller.
[0135] Therefore, the controller can be considered as a hardware
component, and an apparatus included in the controller and
configured to implement various functions can also be considered as
a structure in the hardware component. Alternatively, the apparatus
configured to implement various functions can even be considered as
both a software module for implementing the method and a structure
in the hardware component.
[0136] The system, apparatus, module, or unit illustrated in the
previously described embodiments can be specifically implemented by
using a computer chip or an entity, or can be implemented by using
a product having a certain function. A typical implementation
device is a computer. Specifically, the computer can be, for
example, a personal computer, a laptop computer, a cellular phone,
a camera phone, a smartphone, a personal digital assistant, a media
player, a navigation device, an email device, a game console, a
tablet computer, a wearable device, or a combination of any of
these devices.
[0137] For ease of description, when the apparatus is described,
the apparatus is divided into various units based on functions for
separate description. Certainly, when the present specification is
implemented, functions of the units can be implemented in one or
more pieces of software and/or hardware.
[0138] A person skilled in the art should understand that the
embodiments of the present specification can be provided as
methods, systems, or computer program products. Therefore, the
embodiments of the present specification can be in a form of
hardware only embodiments, software only embodiments, or
embodiments with a combination of software and hardware. Moreover,
the embodiments of the present specification can be in a form of a
computer program product implemented on one or more computer-usable
storage media (including but not limited to a magnetic disk memory,
a CD-ROM, an optical memory, etc.) that include computer-usable
program code.
[0139] The present specification is described with reference to the
flowcharts and/or block diagrams of the method, the device
(system), and the computer program product according to the
embodiments of the present specification. It should be understood
that computer program instructions can be used to implement each
process and/or each block in the flowcharts and/or the block
diagrams and a combination of a process and/or a block in the
flowcharts and/or the block diagrams. These computer program
instructions can be provided for a general-purpose computer, a
dedicated computer, an embedded processor, or a processor of
another programmable data processing device to generate a machine,
so that the instructions executed by a computer or a processor of
another programmable data processing device generate an apparatus
for implementing a specific function in one or more processes in
the flowcharts and/or in one or more blocks in the block
diagrams.
[0140] Alternatively, these computer program instructions can be
stored in a computer readable memory that can instruct a computer
or another programmable data processing device to work in a
specific way, so that the instructions stored in the computer
readable memory generate an artifact that includes an instruction
apparatus. The instruction apparatus implements a specific function
in one or more processes in the flowcharts and/or in one or more
blocks in the block diagrams.
[0141] Alternatively, these computer program instructions can be
loaded onto a computer or another programmable data processing
device, so that a series of operations and steps are performed on
the computer or another programmable device, thereby generating
computer-implemented processing. Therefore, the instructions
executed on the computer or another programmable device provide
steps for implementing a specific function in one or more processes
in the flowcharts and/or in one or more blocks in the block
diagrams.
[0142] In a typical configuration, a computing device includes one
or more processors (CPUs), input/output interfaces, network
interfaces, and memories.
[0143] The memory can include a non-persistent memory, a random
access memory (RAM), a non-volatile memory, and/or another form of
computer-readable medium, for example, a read-only memory (ROM)
or a flash memory (flash RAM).
[0144] The memory is an example of the computer readable
medium.
[0145] The computer-readable medium includes persistent,
non-persistent, movable, and unmovable media that can store
information by using any method or technology. The information can
be a computer readable instruction, a data structure, a program
module, or other data. Examples of a computer storage medium
include but are not limited to a phase-change random access memory
(PRAM), a static random access memory (SRAM), a dynamic random
access memory (DRAM), another type of random access memory (RAM), a
read-only memory (ROM), an electrically erasable programmable
read-only memory (EEPROM), a flash memory or another memory
technology, a compact disc read-only memory (CD-ROM), a digital
versatile disc (DVD) or another optical storage, a cassette
magnetic tape, or a magnetic tape/magnetic disk storage or another
magnetic storage device. The computer storage medium can be
configured to store information accessible to a computing device.
Based on the definition in the present specification, the
computer-readable medium does not include transitory
computer-readable media (transitory media), such as a modulated
data signal and a carrier wave.
[0146] It is worthwhile to further note that the terms "include,"
"comprise," or any other variants thereof are intended to cover a
non-exclusive inclusion, so that a process, a method, a product, or
a device that includes a list of elements not only includes those
elements but also includes other elements not expressly listed, or
further includes elements inherent to such a process, method,
product, or device. Without more constraints, an element preceded
by "includes a . . . " does not preclude the existence of
additional identical elements in the process, method, product, or
device that includes the element.
[0147] The present specification can be described in the general
context of computer-executable instructions executed by a computer,
for example, a program module. Generally, the program module
includes a routine, a program, an object, a component, a data
structure, etc. for executing a specific task or implementing a
specific abstract data type. The present specification can
alternatively be practiced in distributed computing environments.
In these distributed computing environments, tasks are performed by
remote processing devices connected through a communications
network. In the distributed computing environments, the program
module can be located in both local and remote computer storage
media including storage devices.
[0148] The embodiments of the present specification are described
in a progressive way. For same or similar parts among the
embodiments, mutual references can be made between the embodiments.
Each embodiment focuses on a difference from other embodiments.
[0149] Especially, the apparatus, device, and non-volatile computer
storage medium embodiments are basically similar to the method
embodiment, and therefore are described briefly. For a related
part, references can be made to some descriptions in the method
embodiment.
[0150] Specific embodiments of the present specification are
described above.
[0151] Other embodiments fall within the scope of the appended
claims. In some cases, the actions or steps described in the claims
can be performed in an order different from the order in the
embodiments and the desired results can still be achieved. In
addition, the process depicted in the accompanying drawings does
not necessarily require the particular execution order or sequence
shown to achieve the desired results. In some implementations,
multitasking and parallel processing are allowed or may be
advantageous.
[0152] The previous descriptions are merely one or more embodiments
of the present specification, and are not intended to limit the
present specification. For a person skilled in the art, various
modifications and changes can be made to the one or more
embodiments of the present specification. Any modification,
equivalent replacement, improvement, etc. made within the spirit
and the principle of the one or more embodiments of the present
specification shall fall within the scope of the claims of the
present specification.
* * * * *