U.S. patent application number 16/333776 was published by the patent office on 2019-07-11 as publication number 20190213313, for a personal authentication device, personal authentication method, and recording medium. This patent application is currently assigned to NEC Corporation. The applicant listed for this patent is NEC Corporation. The invention is credited to Takayuki ARAKAWA, Takafumi KOSHINAKA, and Masahiro SAIKOU.
United States Patent Application 20190213313
Kind Code: A1
KOSHINAKA; Takafumi; et al.
July 11, 2019
PERSONAL AUTHENTICATION DEVICE, PERSONAL AUTHENTICATION METHOD, AND
RECORDING MEDIUM
Abstract
To provide a personal authentication device capable of easily ensuring security while imposing little psychological or physical burden on a user to be authenticated. Personal authentication device 300 includes: transmission unit 31 that transmits a first acoustic signal to a part of a user's head; observation unit 32 that observes a second acoustic signal, which is an acoustic signal after the first acoustic signal propagates through the part of the user's head; calculation unit 33 that calculates acoustic characteristics from the first acoustic signal and the second acoustic signal; extraction unit 34 that extracts a feature amount related to the user from the acoustic characteristics; storage control unit 35 that registers the feature amount in storage unit 37 as a first feature amount; identification unit 36 that identifies the user by collating the first feature amount acquired from storage unit 37 with a second feature amount acquired from extraction unit 34 after the extraction of the first feature amount; and storage unit 37 that stores the first feature amount. While identification unit 36 identifies the user as being identical, transmission unit 31 transmits the first acoustic signal at every predetermined interval.
Inventors: KOSHINAKA; Takafumi (Tokyo, JP); SAIKOU; Masahiro (Tokyo, JP); ARAKAWA; Takayuki (Tokyo, JP)
Applicant: NEC Corporation, Tokyo, JP
Assignee: NEC Corporation, Tokyo, JP
Family ID: 61619133
Appl. No.: 16/333776
Filed: September 11, 2017
PCT Filed: September 11, 2017
PCT No.: PCT/JP2017/032682
371 Date: March 15, 2019
Current U.S. Class: 1/1
Current CPC Class: H04L 63/0861 (2013.01); G07C 9/37 (2020.01); G06F 2221/2117 (2013.01); H04W 12/0605 (2019.01); G06F 21/32 (2013.01); H04S 7/304 (2013.01); H04W 12/00504 (2019.01); A61B 5/117 (2013.01)
International Class: G06F 21/32 (2006.01); H04S 7/00 (2006.01)

Foreign Application Data

Date: Sep 16, 2016; Code: JP; Application Number: 2016-181897
Claims
1. A personal authentication device comprising: one or more
memories storing instructions; and one or more processors
configured to process the instructions to: transmit a first
acoustic signal to a part of a head of a user; observe a second
acoustic signal that is an acoustic signal after the first acoustic
signal propagates through the part of the head of the user;
calculate acoustic characteristics from the first acoustic signal
and the second acoustic signal; extract a feature amount related to
the user from the acoustic characteristics; register the feature amount in a storage as a first feature amount; and identify the user by collating the first feature amount registered in the storage with a second feature amount extracted after the first feature amount is registered, wherein, when the user is identified as being identical, the first acoustic signal is transmitted at every predetermined interval.
2. The personal authentication device according to claim 1, wherein
the processors further process the instructions to: provide a
service to the user when the user is identified as being identical
in the identification.
3. The personal authentication device according to claim 2, wherein, when the user is not identified as being identical in the identification, the providing of the service is stopped and the first feature amount registered in the storage is deleted.
4. The personal authentication device according to claim 1, wherein the processors further process the instructions to: when the user is not identified as being identical in the identification, emit light in a color different from a color used when the user is identified as being identical.
5. The personal authentication device according to claim 2, wherein the processors further process the instructions to: detect whether a specified time has passed, wherein, when the user is identified as being identical in the identification and it is detected that the specified time has passed, the providing of the service to the user is stopped.
6. The personal authentication device according to claim 5, wherein, when it is detected that the specified time has passed, light is emitted in a color different from a color used when the specified time has not passed.
7. A personal authentication method comprising: transmitting a
first acoustic signal to a part of a head of a user; observing a
second acoustic signal that is an acoustic signal after the first
acoustic signal propagates through the part of the head of the
user; calculating acoustic characteristics from the first acoustic
signal and the second acoustic signal; extracting a feature amount
related to the user from the acoustic characteristics; registering
the feature amount in a storage as a first feature amount; and
identifying the user by collating the first feature amount
registered in the storage with a second feature amount extracted
after the first feature amount is registered, wherein, when the user is identified as being identical, the first acoustic signal is transmitted at every predetermined interval.
8. The personal authentication method according to claim 7, further
comprising: providing a service to the user when the user is
identified as being identical.
9. The personal authentication method according to claim 8, wherein, when the user is not identified as being identical, the providing of the service is stopped and the first feature amount registered in the storage is deleted.
10. The personal authentication method according to claim 7, further comprising: when the user is not identified as being identical, emitting light in a color different from a color used when the user is identified as being identical.
11. The personal authentication method according to claim 8, further comprising: detecting whether a specified time has passed, wherein, when the user is identified as being identical and it is detected that the specified time has passed, the providing of the service to the user is stopped.
12. The personal authentication method according to claim 11, wherein, when it is detected that the specified time has passed, light is emitted in a color different from a color used when the specified time has not passed.
13. A non-transitory computer readable recording medium storing a personal authentication program causing a computer to perform: transmitting a first acoustic signal to a part of a head of a user; observing a second acoustic signal that is an acoustic signal after the first acoustic signal propagates through the part of the head of the user; calculating acoustic characteristics from the first acoustic signal and the second acoustic signal; extracting a feature amount related to the user from the acoustic characteristics; registering the feature amount in a storage as a first feature amount; and identifying the user by collating the first feature amount registered in the storage with a second feature amount extracted after the first feature amount is registered, wherein, when the user is identified as being identical, the first acoustic signal is transmitted at every predetermined interval.
14. The recording medium according to claim 13, further comprising:
providing a service to the user when the user is identified as
being identical.
15. The recording medium according to claim 14, wherein, when the user is not identified as being identical, the providing of the service is stopped and the first feature amount registered in the storage is deleted.
16. The recording medium according to claim 13, further comprising: when the user is not identified as being identical, emitting light in a color different from a color used when the user is identified as being identical.
17. The recording medium according to claim 14, further comprising: detecting whether a specified time has passed, wherein, when the user is identified as being identical and it is detected that the specified time has passed, the providing of the service to the user is stopped.
18. The recording medium according to claim 17, wherein, when it is detected that the specified time has passed, light is emitted in a color different from a color used when the specified time has not passed.
Description
TECHNICAL FIELD
[0001] The present invention relates to a personal authentication
device for authenticating an individual.
BACKGROUND ART
[0002] Since personal authentication based on individual differences of a living body (biometrics-based authentication) is less susceptible to leakage or theft than passwords, it is increasingly introduced for the purposes of identifying individuals, confirming rights, and protecting security. As personal authentication technologies based on individual differences of a living body, technologies using, for example, a fingerprint, a vein, a face, an iris, or voice have been known. Among them, a method using sound information can perform personal authentication with a general-purpose, inexpensive device such as a telephone or a microphone, without preparing a special device.
[0003] PTL 1 discloses a method in which a user is continuously monitored during log-in by using a biometrics authentication method based on a combination of a fingerprint, a face, and a mouse movement.
[0004] PTL 2 discloses a method in which sound information received
and transmitted from/to an auditory organ is subjected to signal
processing, the processed information is stored in a storage device
as acoustic characteristics, and the acoustic characteristics
stored in the storage device and newly inputted acoustic
characteristics are collated with each other, thereby determining
whether a person to be authenticated is an authentication target
person.
CITATION LIST
Patent Literature
[0005] [PTL 1] Japanese Unexamined Patent Application Publication No. 2004-13831 A [0006] [PTL 2] Japanese Unexamined Patent Application Publication No. 2002-143130 A
SUMMARY OF INVENTION
Technical Problem
[0007] However, the biometric authentication disclosed in PTL 1, which is performed at a predetermined place or time, has the following problems.
[0008] Firstly, in the case of personal authentication performed by acquiring biological information at a predetermined place or time, there is a problem that a user is forced to perform an operation for authentication. For example, in the case of personal authentication using a fingerprint or a vein, an operation of a user, such as putting his/her finger on a dedicated scanner, is necessary. Furthermore, in the case of personal authentication using a face or an iris, an operation of a user, such as turning the face toward a camera, is necessary. Furthermore, in the case of personal authentication using voice or bone conduction sound, an operation of a user, such as speaking a password, is necessary. Therefore, a user bears a psychological and physical burden at each authentication. Moreover, in the case of personal authentication performed by acquiring biological information at a predetermined place or time, it is difficult to authenticate a user (a person to be collated) continuously at all times. Thus, when a user is intentionally replaced by another person after the authentication, the replacement cannot be detected, and the security level becomes low.
[0009] In the personal authentication method disclosed in PTL 2 and
using an auditory organ, acoustic characteristics for identifying
individual information need to be stored in a robust storage device
in advance.
[0010] Therefore, in view of the aforementioned problems, the present invention aims to provide a personal authentication device capable of easily ensuring security while imposing little psychological or physical burden on a user to be authenticated.
Solution to Problem
[0011] To solve the above problem, a personal authentication device according to a first aspect of the present invention includes:
[0012] a transmission means for transmitting a first acoustic
signal to a part of a head of a user;
[0013] an observation means for observing a second acoustic signal
that is an acoustic signal after the first acoustic signal
propagates through the part of the head of the user;
[0014] a calculation means for calculating acoustic characteristics
from the first acoustic signal and the second acoustic signal;
[0015] an extraction means for extracting a feature amount related
to the user from the acoustic characteristics;
[0016] a storage control means for registering the feature amount
in a storage means as a first feature amount; and
[0017] an identification means for identifying the user by
collating the first feature amount registered in the storage means
with a second feature amount extracted from the extraction means
after the first feature amount is registered,
[0018] wherein, when the identification means identifies the user as being identical, the transmission means transmits the first acoustic signal at every predetermined interval.
[0019] A personal authentication method according to a second aspect of the present invention includes:
[0020] transmitting a first acoustic signal to a part of a head of
a user;
[0021] observing a second acoustic signal that is an acoustic
signal after the first acoustic signal propagates through the part
of the head of the user;
[0022] calculating acoustic characteristics from the first acoustic
signal and the second acoustic signal;
[0023] extracting a feature amount related to the user from the
acoustic characteristics;
[0024] registering the feature amount in a storage means as a first
feature amount; and
[0025] identifying the user by collating the first feature amount
registered in the storage means with a second feature amount
extracted after the first feature amount is registered,
[0026] wherein, when the user is identified as being identical, the first acoustic signal is transmitted at every predetermined interval.
[0027] A personal authentication program according to a third aspect of the present invention causes a computer to perform:
[0028] transmitting a first acoustic signal to a part of a head of
a user;
[0029] observing a second acoustic signal that is an acoustic
signal after the first acoustic signal propagates through the part
of the head of the user;
[0030] calculating acoustic characteristics from the first acoustic
signal and the second acoustic signal;
[0031] extracting a feature amount related to the user from the
acoustic characteristics;
[0032] registering the feature amount in a storage means as a first
feature amount; and
[0033] identifying the user by collating the first feature amount
registered in the storage means with a second feature amount
extracted after the first feature amount is registered,
[0034] wherein, when the user is identified as being identical, the first acoustic signal is transmitted at every predetermined interval.
[0035] The personal authentication program may be stored in a
non-transitory storage medium.
Advantageous Effects of Invention
[0036] According to the present invention, it is possible to provide a personal authentication device capable of easily ensuring security while imposing little psychological or physical burden on a user to be authenticated.
BRIEF DESCRIPTION OF DRAWINGS
[0037] FIG. 1 is a block diagram illustrating a configuration
example of a personal authentication device according to a first
example embodiment of the present invention.
[0038] FIG. 2 is a configuration diagram illustrating a specific
hardware configuration example of a personal authentication device
according to a first example embodiment of the present
invention.
[0039] FIG. 3 is a flowchart illustrating an example of an
operation of a personal authentication device according to a first
example embodiment of the present invention.
[0040] FIG. 4A is a graph illustrating an example of a transmitted
acoustic signal.
[0041] FIG. 4B is a graph illustrating an example of an observed
acoustic signal.
[0042] FIG. 5 is a graph illustrating an example of an impulse
response as acoustic characteristics.
[0043] FIG. 6 is a block diagram illustrating a configuration
example of a personal authentication device according to a second
example embodiment of the present invention.
[0044] FIG. 7 is a diagram illustrating a configuration example of
an earphone and a peripheral device thereof according to a second
example embodiment of the present invention.
[0045] FIG. 8 is a flowchart illustrating an example of an
operation of a personal authentication device according to a second
example embodiment of the present invention.
[0046] FIG. 9 is a block diagram illustrating a configuration
example of a personal authentication device according to a third
example embodiment of the present invention.
[0047] FIG. 10 is a configuration example of an information
processing device for embodying each example embodiment according
to the present invention.
EXAMPLE EMBODIMENT
First Example Embodiment
[0048] (Personal Authentication Device)
[0049] Personal authentication device 100 according to a first
example embodiment of the present invention will be described with
reference to the drawings. FIG. 1 is a block diagram illustrating a
configuration example of personal authentication device 100
according to the first example embodiment. Personal authentication
device 100 illustrated in FIG. 1 includes transmission unit 11,
observation unit 12, calculation unit 13, extraction unit 14,
storage control unit 15, identification unit 16, storage unit 17,
and service control unit 18.
[0050] Transmission unit 11 transmits an acoustic signal to a part of a user's head. The part of the head to which the acoustic signal is transmitted is, more specifically, an area where a cavity is formed in the head, and may be at least a part of an area on which an ornament or a device for producing a sound effect can be mounted or brought close.
[0051] Observation unit 12 observes an acoustic signal after the acoustic signal transmitted from transmission unit 11 propagates through the part of the user's head. The part of the head serving as the propagation path of the acoustic signal may, more specifically, be at least a part of the skull, the brain, and the sensory organs constituting the head, and the cavities among them.
[0052] Calculation unit 13 calculates acoustic characteristics of
the acoustic signal propagating through the part of the user's head
on the basis of the acoustic signal transmitted from transmission
unit 11 and the acoustic signal observed by observation unit
12.
[0053] Extraction unit 14 extracts a feature amount related to a
user to be authenticated (an authentication target user) from the
calculated acoustic characteristics. The extraction of the feature
amount may be performed by a predetermined arithmetic
operation.
[0054] Storage control unit 15 stores the feature amount obtained
by extraction unit 14 in storage unit 17 at the time of
registration of the authentication target user that is performed
whenever a service is started (hereinafter, this may be described
as first registration). Moreover, when each service is stopped,
storage control unit 15 deletes the feature amount of the
authentication target user from storage unit 17. That is, the
feature amount serving as a password is stored and deleted for each
service even in the case of the same user. As described above, a
so-called one-time password method, in which a password is changed
in a short period of time, is employed. Accordingly, it is not
necessary to store a feature amount in storage unit 17 in advance.
Moreover, a feature amount is stored in storage unit 17 whenever a
service is provided, so that it is possible to secure high
security.
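The register-on-start, delete-on-stop lifecycle described above can be sketched as follows; this is a minimal illustration, and the class and method names are assumptions, not taken from the patent.

```python
class StorageControl:
    """Sketch of the one-time registration scheme: a feature amount is
    registered when a service starts and deleted when it stops, so no
    feature amount needs to be stored in advance."""

    def __init__(self):
        self._store = {}  # storage unit: service id -> first feature amount

    def register_first_feature(self, service_id, feature):
        # First registration, performed whenever a service is started.
        self._store[service_id] = feature

    def get_first_feature(self, service_id):
        return self._store.get(service_id)

    def delete_feature(self, service_id):
        # Called when the service is stopped; like a one-time password,
        # the feature amount is discarded rather than kept.
        self._store.pop(service_id, None)
```

Because the registered feature amount lives only for the duration of one service, a leaked store reveals nothing about past or future sessions.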
[0055] Storage unit 17 stores the feature amount related to the
authentication target user at the time of the first registration of
the user. Hereinafter, a user, whose feature amount is stored in
storage unit 17, may be called a registered user.
[0056] Identification unit 16 collates the feature amount obtained by extraction unit 14 with the feature amount stored in storage unit 17 at the time of the first registration, and determines whether these feature amounts coincide with each other (i.e., whether the user is identical).
[0057] Service control unit 18 controls the provision of the service. For example, service control unit 18 causes the service to be provided when the determination result of identification unit 16 indicates the same user, and causes the provision of the service to be stopped when the determination result does not indicate the same user. It should be noted that providing the service indicates, for example, starting application software for providing the service or maintaining a startup state of the application software, and stopping the provision of the service indicates ending the application software.
[0058] Preferably, service control unit 18 receives permission indicating that a user is a legitimate user who can receive a service from, for example, a service administrator before or after a feature amount is stored in storage unit 17 at the time of the first registration. In an example of the permission to accept a user, the service administrator inputs an instruction for permitting first registration to information processing device 1 to be described later, hands over earphone 4 to be described later to the user, and prompts the user to perform the first registration. It should be noted that the action of the service administrator may be implemented in service control unit 18 as a software program. For example, it may be possible to use a software program for another type of personal authentication (e.g. fingerprint authentication) that is highly secure but too burdensome for a user to carry out as usual (continuous) authentication. In such a case, the administrator prompts the user for input via an input device, such as a touch panel, provided to or connected to information processing device 1. The input for permitting the first registration by the service administrator may also be performed by installing a physical switch on earphone 4 to be described later and operating that switch.
[0059] Herein, the service in the present example embodiment is a service that does not need to identify a user's personal information (such as a name) but that should be prohibited from being used by others, or a service intended to detect the absence of a user. For example, the service includes a voice guidance service provided to a specific person wearing earphone 4 in a specific place, such as a department store, a museum, or a conference with interpretation. In addition, the service also includes a movie or music appreciation service provided at the fixed seats of Shinkansen trains or airplanes, and a movie or music appreciation service distributed to an individual via a smart phone. It should be noted that the design can also be changed so as to identify a user.
[0060] FIG. 2 is a configuration diagram illustrating a specific
hardware configuration example for implementing personal
authentication device 100 of the present example embodiment
illustrated in FIG. 1. Personal authentication device 100, for
example, includes information processing device 1, sound processor
2, microphone amplifier 3, earphone 4, and microphone 5.
Specifically, information processing device 1 is a smart phone, a
tablet terminal, or a personal computer. Reference numeral 6
denotes a user to be recognized.
[0061] Sound transmitted from information processing device 1 is
subjected to D/A (digital/analog) conversion in sound processor 2
and is delivered to earphone 4. Earphone 4 includes microphone 5. When earphone 4 is mounted on or inserted into a user's ear, sound produced by earphone 4 is echoed in the ear, and microphone 5 collects the echo sound. The collected echo sound is amplified by microphone amplifier 3, is subjected to A/D (analog/digital) conversion in sound processor 2, and is transmitted to information processing device 1.
[0062] In the hardware configuration example illustrated in FIG. 2,
earphone 4 is an example of transmission unit 11. Furthermore,
microphone 5, sound processor 2, and microphone amplifier 3 are an
example of observation unit 12. As illustrated in FIG. 2, it is desirable that microphone 5 and earphone 4 be integrated such that their relative positional relation does not change; however, as long as the relative positional relation between them does not change significantly, the present invention is not limited thereto. Furthermore, as an example of earphone 4 and microphone 5, a microphone-integrated earphone, which is inserted into the entrance of an ear canal, is used; as another practical example, a microphone may be set on a headphone that covers the auricle, or a microphone may be installed in the handset part of a telephone. In addition, an acoustic signal transmitted by an earphone installed at the entrance of the ear canal of the left ear may be observed with a microphone installed at the entrance of the ear canal of the right ear, or vice versa.
[0063] It should be noted that the extraction of the feature amount
may be performed from both ears or from only the right or left ear. In the present example embodiment, during a predetermined operation requiring personal authentication, the user is required to wear earphone 4 at all times. Thus, in the case of a predetermined operation over a long period of time, it is conceivable that the user experiences pain or a sense of discomfort in the ear. In such a case, the user may appropriately move earphone 4 to the other ear. It should be noted that when changing the ear to be authenticated, the first registration operation to be described later is necessary again.
[0064] Furthermore, calculation unit 13, extraction unit 14,
identification unit 16, and service control unit 18 are
respectively implemented by a central processing unit (CPU) and a
memory operating according to a program in information processing
device 1. Furthermore, storage unit 17 is implemented by a storage
medium such as a hard disk in information processing device 1. The
same function may also be performed by mounting a miniaturized
information processing device 1 in earphone 4.
[0065] (Operation of Personal Authentication Device)
[0066] Next, an example of the operation of personal authentication
device 100 in the present example embodiment will be described with
reference to the flowchart illustrated in FIG. 3. Firstly, a
service administrator inputs a keyword that permits first
registration from a switch or a keyboard of information processing
device 1 while facing a user, and prompts the user to register for
the first time. On the basis of an operation of the user, the first
registration is performed as follows, and subsequently, personal
authentication device 100 performs a personal authentication
operation as follows.
[0067] That is, in step S110, when the user wears earphone 4 in his/her ear, transmission unit 11 transmits an acoustic signal toward a part of the head of the user to be authenticated. For example, in step S110, earphone 4 transmits an acoustic signal toward the ear canal from the entrance of the ear canal. As the acoustic signal, a signal widely used for measuring an impulse response, such as an M-sequence (maximal length sequence) signal or a time-stretched pulse (TSP) signal, may be used.
[0068] FIG. 4A is a graph illustrating an example of the acoustic
signal transmitted by transmission unit 11. In the graph of FIG.
4A, a horizontal axis denotes time t and a vertical axis denotes a
signal value x(t) of the acoustic signal transmitted at time t.
Hereinafter, the acoustic signal transmitted by transmission unit
11 may be called a transmitted acoustic signal.
[0069] In step S120, observation unit 12 observes an acoustic
signal after the acoustic signal transmitted from transmission unit
11 in step S110 propagates through the part of the user's head. For
example, in step S120, microphone 5 detects the acoustic signal
propagated from earphone 4. The detected acoustic signal is
amplified by microphone amplifier 3, is subjected to A/D conversion
in sound processor 2, and is transmitted to information processing
device 1.
[0070] FIG. 4B is a graph illustrating an example of the acoustic
signal observed by observation unit 12. In the graph of FIG. 4B, a
horizontal axis denotes time t and a vertical axis denotes a signal
value y(t) of the acoustic signal observed at time t. Hereinafter,
the acoustic signal observed by observation unit 12 may be called
an observed acoustic signal.
[0071] In step S130, calculation unit 13 calculates acoustic characteristics of the part of the user's head from the change between the transmitted acoustic signal and the observed acoustic signal. The acoustic characteristics include, for example, an impulse response, or a transfer function obtained by performing a Fourier transform or a Laplace transform on the impulse response. The acoustic characteristics include, for example, information regarding how the acoustic signal is reflected and/or attenuated in a living body. For example, when earphone 4 and microphone 5 are installed at the entrance of an ear canal and calculation unit 13 calculates acoustic characteristics of reflection in the ear canal, an ear canal impulse response or an ear canal transfer function may be used as the acoustic characteristics.
[0072] FIG. 5 is a graph illustrating an example of the impulse
response as the acoustic characteristics calculated by calculation
unit 13. In the graph of FIG. 5, a horizontal axis denotes time t
and a vertical axis denotes a value g(t) of an impulse response of
an acoustic signal observed at time t.
[0073] Among the signal value x(t) of the transmitted acoustic
signal, the signal value y(t) of the observed acoustic signal, and
the value g(t) of the impulse response, there is a relation
expressed by the following Equation (1).
[Equation 1]
y(t) = ∫₀ᵗ x(τ)g(t−τ) dτ (1)
[0074] Furthermore, among X(f), Y(f), and G(f), obtained by respectively performing a Fourier transform on x(t), y(t), and g(t), there is a relation expressed by the following Equation (2), where f denotes a frequency and G denotes a transfer function.
Y(f)=G(f)X(f) (2)
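Equation (2) suggests a direct way to estimate the acoustic characteristics: divide the spectrum of the observed signal by that of the transmitted signal. The sketch below, in Python with NumPy, follows that idea; the function name and the small regularization term `eps` are illustrative assumptions, not part of the patent.

```python
import numpy as np

def estimate_transfer_function(x, y, eps=1e-12):
    """Estimate G(f) and g(t) from Equation (2), Y(f) = G(f)X(f).

    x: transmitted acoustic signal x(t); y: observed acoustic signal y(t).
    eps guards against division by near-zero spectral bins (an
    implementation detail the text does not specify).
    """
    n = len(x) + len(y) - 1            # length that avoids circular wrap-around
    X = np.fft.rfft(x, n)
    Y = np.fft.rfft(y, n)
    G = Y * np.conj(X) / (np.abs(X) ** 2 + eps)   # transfer function G(f)
    g = np.fft.irfft(G, n)                        # impulse response g(t)
    return G, g
```

With an M-sequence or TSP excitation, as mentioned for step S110, X(f) is nearly flat and the division is well conditioned.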
In step S140, extraction unit 14 extracts a feature amount from the acoustic characteristics calculated by calculation unit 13. As the feature amount, the impulse response or the transfer function may be used as is. That is, extraction unit 14 uses the values at each time of the impulse response, or the values at each frequency of the transfer function, as the feature amount. Furthermore, it is also conceivable to use, as the feature amount, values obtained by performing principal component analysis and dimensional compression on the impulse response or the transfer function, or to use the mel-frequency cepstrum coefficients (MFCC) disclosed in NPL 1.
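As one concrete illustration of the compression just mentioned, the sketch below reduces the log-magnitude of a transfer function to a short cepstrum-like vector with a discrete cosine transform. This is a simplified stand-in for the MFCC feature cited from NPL 1 (true MFCCs also apply a mel-scaled filter bank); the function name and the number of coefficients are assumptions for illustration.

```python
import numpy as np

def extract_feature(G, n_coeffs=20):
    """Compress transfer function G(f) into a short feature amount.

    Takes the log-magnitude spectrum and projects it onto an
    (unnormalized) type-II DCT cosine basis, keeping n_coeffs values.
    """
    log_mag = np.log(np.abs(np.asarray(G)) + 1e-10)   # log-magnitude spectrum
    n = len(log_mag)
    t = np.arange(n)[:, None] + 0.5                   # sample positions
    k = np.arange(n_coeffs)[None, :]                  # kept DCT orders
    basis = np.cos(np.pi * t * k / n)
    return log_mag @ basis                            # feature amount
```

Keeping only the low-order coefficients discards fine spectral detail, which makes the feature amount more stable across repeated measurements.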
[0075] In step S150, identification unit 16 determines whether the current extraction of a feature amount is the first extraction for the user. As a specific example, identification unit 16 performs this determination by using a counter memory that counts the number of extractions, or by searching storage unit 17 for existing feature amount data. When it is determined to be the first extraction, that is, the first registration, the procedure proceeds to step S160; when it is not the first registration (the second time or later), the procedure proceeds to step S180.
[0076] When it is the first registration, storage control unit 15
stores the feature amount extracted in extraction unit 14 in
storage unit 17 in step S160. In step S170, when it is detected
that the feature amount is present in storage unit 17, service
control unit 18 starts an application program for providing a
service.
[0077] When it is not the first registration, identification unit
16 collates the feature amount obtained by extraction unit 14 with
the feature amount of a registered user stored in storage unit 17
in step S180.
[0078] In step S190, when the feature amounts coincide with each
other as a collation result and it is determined that a user to be
authenticated corresponds to a registered user, the procedure
proceeds to step S200. When the feature amounts do not coincide
with each other as the collation result and it is determined that
the user to be authenticated does not correspond to the registered
user, the procedure proceeds to step S210. This determination
corresponds to one-to-one authentication.
[0079] In the one-to-one authentication, feature amounts of a user
to be authenticated and a registered user are collated with each
other in a one-to-one manner. In such a case, a registered user,
for which collation is to be performed, may be designated in
advance with a user identification (ID). As a collation method, for
example, identification unit 16 may calculate a distance between the
feature amounts, determine that they belong to the same person when
the distance is smaller than a threshold value, and determine that
they belong to different persons when the distance is larger than the
threshold value. As the distance measure, for example, the Euclidean
distance or a cosine distance may be used; however, other distances
may also be used.
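A minimal sketch of this distance-and-threshold collation (the threshold value, feature vectors, and function name are illustrative assumptions, not values from the embodiment):

```python
import numpy as np

def collate(registered, observed, threshold=0.1, metric="euclidean"):
    """Judge two feature vectors to be the same person when their
    distance falls below the threshold."""
    registered = np.asarray(registered, dtype=float)
    observed = np.asarray(observed, dtype=float)
    if metric == "euclidean":
        distance = np.linalg.norm(registered - observed)
    else:  # cosine distance = 1 - cosine similarity
        distance = 1.0 - (registered @ observed) / (
            np.linalg.norm(registered) * np.linalg.norm(observed))
    return distance < threshold

stored = np.array([0.2, 0.5, 0.1])                     # registered feature
assert collate(stored, np.array([0.21, 0.49, 0.11]))   # small distance: same
assert not collate(stored, np.array([0.9, 0.1, 0.8]))  # large distance: differ
```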
[0080] Furthermore, in the above, an example in which a feature amount
is stored in advance in storage unit 17 has been described; however,
storage unit 17 may store a statistical model instead of the feature
amount. The statistical model may be a mean value and a variance value
obtained by acquiring a feature amount multiple times for one person,
or a relational expression calculated using these values.
Alternatively, a Gaussian mixture model (GMM), a support vector
machine (SVM), or a model using a neural network, as disclosed in PTL
1, may be used.
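For illustration, the simplest of these statistical models (a per-dimension mean/variance pair, scored as a diagonal Gaussian; the function names and variance floor are my own assumptions) might be sketched as:

```python
import numpy as np

def enroll(samples):
    """Per-dimension mean and variance from repeated feature
    acquisitions (rows) of one person; a small floor keeps the
    variance strictly positive."""
    samples = np.asarray(samples, dtype=float)
    return samples.mean(axis=0), samples.var(axis=0) + 1e-6

def log_likelihood(model, observed):
    """Diagonal-Gaussian log-likelihood of an observed feature vector."""
    mean, var = model
    observed = np.asarray(observed, dtype=float)
    return -0.5 * np.sum(np.log(2 * np.pi * var)
                         + (observed - mean) ** 2 / var)

model = enroll([[0.20, 0.51], [0.22, 0.49], [0.18, 0.50]])
# A vector near the enrolled mean scores higher than a distant one.
assert log_likelihood(model, [0.20, 0.50]) > log_likelihood(model, [0.9, 0.1])
```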
[0081] In step S200, identification unit 16 waits for the passage
of a predetermined time (for example, one second) and returns the
procedure to step S110.
[0082] In step S210, service control unit 18, for example, ends an
application program providing a service such that the service is
not provided to a user who is not a registered user. In such a
case, storage control unit 15 may allow storage unit 17 to store a
feature amount of an unauthorized user who is not a registered
user.
[0083] In addition, service control unit 18 may end an application
program for a service according to a request from a registered
user. Furthermore, when a registered user detaches earphone 4 from
his/her ear and thus extraction unit 14 is not able to completely
acquire a feature amount (echo sound), service control unit 18 may
end the application program for the service. In such a case,
identification unit 16 may notify service control unit 18 of the
collation result and the service stop not immediately but only after
several collation attempts, once it is found that some reason prevents
identification unit 16 from acquiring a feature amount.
[0084] At the end of the service, service control unit 18 instructs
storage control unit 15 to erase data of the feature amount of the
registered user in storage unit 17. Storage control unit 15 erases
the data of the feature amount of the registered user when the
application program is ended.
[0085] Thus, the operation of personal authentication device 100
according to the first example embodiment is ended.
[0086] According to the first example embodiment of the present
invention, it is possible to provide a personal authentication
device capable of simply securing security with little
psychological and physical burden of a user to be authenticated. In
the present example embodiment, personal authentication is
performed using a characteristic in which acoustic characteristics
of an acoustic signal propagating through a part of a user's head
are different for each individual. Since the acoustic characteristics
of the signal propagating through the part of the user's head are
internal characteristics of a living body, unlike characteristics
observable from the exterior such as a face or a fingerprint, the risk
of leakage is low and theft is difficult. Furthermore, since both the
transmitted acoustic signal and the observed acoustic signal are
necessary in order to know the acoustic characteristics, there is
little risk of their being acquired and forged by eavesdropping.
Furthermore, since all a user to be authenticated has to do is to wear
a headphone or an earphone with an embedded microphone, or to hold a
cellular phone with a microphone embedded in its receiving part to an
ear, the psychological and physical burden on the user is small.
Furthermore, when the personal authentication method of the present
example embodiment is used in combination with music distribution, a
transceiver, or an information distribution device that transmits
voice for communication, it is possible to provide the user with
personal authentication without any additional physical or mental
burden.
[0087] Furthermore, the acoustic characteristics can be acquired in
a short period, such as about one second, and it is possible to
keep authenticating a user at all times while a service is being
provided. Therefore, compared with a case where authentication is
performed only once at the beginning or immediately before receiving a
service, when there is an illegal act such as substitution
(impersonation) by another person after the authentication, it is
possible to detect the illegal act.
[0088] Moreover, whenever a service is started, feature amount data
of a user is registered in storage unit 17, and whenever the
service is ended, the feature amount data registered in storage
unit 17 is deleted. In this way, since the feature amount data used
as an ID or a password is collated only for a short period of time
(only for one-time service use time), it is possible to simply
secure high security. Moreover, even when a user authenticated with
one ear feels discomfort in the ear used for authentication due to
long-time use of a service and desires to switch authentication to the
other ear, the ear to be authenticated can be switched easily. In such
a case, it is preferable to perform authentication switching
(permission of a first authentication by a service administrator) in
order to secure security.
Second Example Embodiment
[0089] In the first example embodiment of the present invention,
personal authentication device 100 is available at all times
according to a user's request; however, depending on the content of
a service to be provided, an available time (for example, up to 2
hours) or a time zone (for example, between 12:00 and 17:00) of the
service may be specified even in the case of the same user. In the
second example embodiment of the present invention, a description
will be provided for personal authentication device 200 that
authenticates a user when an available time is specified.
[0090] Personal authentication device 200 according to the second
example embodiment of the present invention will be described with
reference to the drawings. FIG. 6 is a block diagram illustrating a
configuration example of personal authentication device 200
according to the second example embodiment. Personal authentication
device 200 includes transmission unit 11, observation unit 12,
calculation unit 13, extraction unit 14, storage control unit 15,
identification unit 16, storage unit 17, timer control unit 21,
lamp control unit 22, and service control unit 28.
[0091] Timer control unit 21 controls a timer preset by a service
administrator. Specifically, when a time or a time zone preset by
the service administrator has passed, timer control unit 21 notifies
lamp control unit 22 to change the lamp color. In such a case, timer
control unit 21 may also notify service control unit 28 to stop
providing the service.
[0092] Lamp control unit 22 controls the color and flickering of a
lamp according to the notification from timer control unit 21. As
illustrated in FIG. 7, a lamp 7 is installed at a position easily
seen from the outside, for example, on the surface of earphone 4.
Lamp control unit 22 changes the color of lamp 7 according to a
service providing state or a user authentication state. For
example, lamp 7 is yellow before the first authentication, green after
the first authentication during normal operation, red in the case of
overtime use (time-over) by a registered user, and flickers red when
it is determined that the device is not being used by a registered
user.
[0093] The operation of personal authentication device 200
according to the second example embodiment will be described with
reference to the flowchart of FIG. 8.
[0094] Steps S110 to S180 are the same as those of the operation of
the first example embodiment (see FIG. 3).
[0095] In step S190, when it is determined that a user to be
authenticated corresponds to a registered user as the collation
result, the procedure proceeds to step S191. When it is determined
that the user to be authenticated does not correspond to the
registered user, the procedure proceeds to step S193. In step S193,
lamp control unit 22 changes the color of lamp 7 being currently
displayed (for example, it blinks in red).
[0096] In step S191, timer control unit 21 determines whether a
current time exceeds a time (a service time) preset by a service
administrator. When the current time is within the service time,
the procedure proceeds to step S200, and when the current time is
out of the service time, the procedure proceeds to step S192. In
step S192, lamp control unit 22 changes the color of lamp 7 being
currently displayed (for example, it blinks in red). Timer control
unit 21 notifies lamp control unit 22 such that the lamp color is
changed. In such a case, timer control unit 21 may notify service
control unit 28 such that a service is stopped.
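A sketch of the check performed by timer control unit 21, under stated assumptions (the two-hour limit and 12:00-17:00 zone come from the examples in paragraph [0089]; the function name and signature are hypothetical):

```python
from datetime import datetime, time, timedelta

def within_service_time(started_at, now,
                        max_duration=timedelta(hours=2),
                        zone=(time(12, 0), time(17, 0))):
    """True while the session is inside both the allowed duration
    and the allowed time zone."""
    within_duration = now - started_at <= max_duration
    within_zone = zone[0] <= now.time() <= zone[1]
    return within_duration and within_zone

start = datetime(2019, 7, 11, 13, 0)
assert within_service_time(start, datetime(2019, 7, 11, 14, 30))      # in time
assert not within_service_time(start, datetime(2019, 7, 11, 15, 30))  # over 2 h
```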
[0097] Steps S200 and S210 are the same as those of the operation
of the first example embodiment (see FIG. 3).
[0098] Thus, the operation of personal authentication device 200
according to the second example embodiment is ended.
[0099] According to the second example embodiment of the present
invention, it is possible to provide a personal authentication
device with little psychological and physical burden of a user to
be authenticated and with high security performance. Moreover, in
addition to the effects of the first example embodiment, when a
user uses a service for a predetermined time or more or when a user
uses a service out of a specified time, the color of the lamp is
changed. In this way, it is possible to allow a service
administrator to visibly recognize the use of a service by a user
out of a specified time. Furthermore, when a user uses a service
for a predetermined time or more, personal authentication device
200 may stop providing the service. In this way, a service
administrator can more accurately manage personal authentication
device 200.
[0100] In addition, a sound buzzer may be used instead of flickering
the lamp. Moreover, even in the case of a legally authorized user,
when use outside a specified place is to be prohibited, the place of
use can be managed (monitored) by using, for example, a beacon system
when the user is indoors or a global positioning system (GPS) when the
user is outdoors. Management of the person using the service, the use
time, and the place of use may be realized in any combination.
Third Example Embodiment
[0101] Personal authentication device 300 according to a third
example embodiment of the present invention includes transmission
unit 31, observation unit 32, calculation unit 33, extraction unit
34, storage control unit 35, identification unit 36, and storage
unit 37 as illustrated in FIG. 9.
[0102] Transmission unit 31 transmits a first acoustic signal to a
part of a user's head. Observation unit 32 observes a second
acoustic signal that is an acoustic signal after the first acoustic
signal propagates through the part of the user's head. Calculation
unit 33 calculates acoustic characteristics from the first acoustic
signal and the second acoustic signal. Extraction unit 34 extracts
a feature amount related to a user from the acoustic
characteristics. Storage control unit 35 registers the feature
amount in the storage unit as a first feature amount.
Identification unit 36 identifies the user by collating the first
feature amount acquired from the storage unit with a second feature
amount acquired after the extraction of the first feature amount
from the extraction unit. Storage unit 37 stores the first feature
amount. It should be noted that when identification unit 36
identifies the user as being identical, transmission unit 31
transmits the first acoustic signal every predetermined
interval.
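The flow above (register the first feature amount, then re-transmit and collate every predetermined interval) can be sketched as follows; all four callables are hypothetical stand-ins for transmission unit 31, observation unit 32, extraction unit 34, and identification unit 36, not an actual API:

```python
import time

def continuous_authentication(transmit, observe, extract, collate,
                              interval_seconds=1.0):
    """Register the first feature amount, then keep re-transmitting
    the first acoustic signal every predetermined interval, collating
    each newly extracted feature amount against the registered one.
    Returns False when the user is no longer identified as identical."""
    first = extract(transmit(), observe())      # first registration
    while True:
        time.sleep(interval_seconds)            # predetermined interval
        second = extract(transmit(), observe())
        if not collate(first, second):
            return False

# Toy stand-ins: collation succeeds once, then fails on the next round.
calls = []
def fake_transmit(): calls.append("tx"); return "x"
def fake_observe(): return "y"
def fake_extract(x, y): return len(calls)       # changes every round
def fake_collate(a, b): return b - a < 2        # fails on the third signal

result = continuous_authentication(fake_transmit, fake_observe,
                                   fake_extract, fake_collate,
                                   interval_seconds=0.0)
assert result is False and calls.count("tx") == 3
```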
[0103] According to the third example embodiment of the present
invention, it is possible to provide a personal authentication device
with little psychological and/or physical burden on a user to be
authenticated and with high security performance. This is because
identification unit 36 identifies a user by collating the first
feature amount registered in the storage unit at the first
registration with the second feature amount acquired after the first
registration, and transmission unit 31 transmits the first acoustic
signal every predetermined interval while identification unit 36
identifies the user as being identical. In this way, a user simply
wears an earphone, and personal authentication with high security
performance can be performed at all times.
(Configuration of Information Processing Device)
[0104] In each of the aforementioned example embodiments of the
present invention, the respective elements of the personal
authentication devices illustrated in FIG. 1, FIG. 6, and FIG. 9
represent blocks of functional units. Some or all of the respective
elements of the personal authentication devices, for example, are
realized using an arbitrary combination of information processing
device 1 as illustrated in FIG. 10 and a program. Information
processing device 1 includes the following elements as an example.
[0105] Central processing unit (CPU) 501
[0106] Read only memory (ROM) 502
[0107] Random access memory (RAM) 503
[0108] Program 504 loaded on RAM 503
[0109] Storage device 505 storing program 504
[0110] Drive device 507 for performing reading and writing of recording medium 506
[0111] Communication interface 508 connected to communication network 509
[0112] Input/output interface 510 for performing input/output of data
[0113] Bus 511 connecting each element
[0114] Respective elements of the personal authentication device in
each example embodiment of the present invention are implemented
when CPU 501 acquires and executes program 504 for performing
functions of the elements. Program 504 for performing the functions
of the elements of the personal authentication device, for example,
is stored in storage device 505 or RAM 503 in advance and is read
by CPU 501 when necessary. It should be noted that program 504 may
be supplied to CPU 501 via communication network 509, or drive
device 507 may read program 504 stored in recording medium 506 in
advance and supply CPU 501 with read program 504.
[0115] There are various modification examples in the
implementation method of each device. For example, the personal
authentication device may be implemented by an arbitrary
combination of different information processing devices and
programs for each element. Furthermore, a plurality of elements
included in the personal authentication device may be implemented
by an arbitrary combination of one information processing device 1
and a program.
[0116] Furthermore, some or all of the respective elements of the
personal authentication devices may be implemented by other
general-purpose or dedicated circuits, processors, or a combination
thereof. These may be configured by a single chip or by a plurality of
chips connected via a bus.
[0117] Some or all of respective elements of respective personal
authentication devices may be implemented by a combination of the
aforementioned circuits and a program.
[0118] When some or all of the respective elements of the personal
authentication devices are implemented by a plurality of information
processing devices or circuits, these may be arranged in a
concentrated manner or in a distributed manner. For example, the
information processing devices or circuits may be connected to one
another via a communication network, as in a client-server system or a
cloud computing system.
[0119] Some or all of the aforementioned example embodiments are
also described in the following Supplementary Notes; however, the
present invention is not limited thereto.
[0120] [Supplementary Note 1]
[0121] A personal authentication device comprising:
[0122] a transmission means for transmitting a first acoustic
signal to a part of a head of a user;
[0123] an observation means for observing a second acoustic signal
that is an acoustic signal after the first acoustic signal
propagates through the part of the head of the user;
[0124] a calculation means for calculating acoustic characteristics
from the first acoustic signal and the second acoustic signal;
[0125] an extraction means for extracting a feature amount related
to the user from the acoustic characteristics;
[0126] a storage control means for registering the feature amount
in a storage means as a first feature amount; and
[0127] an identification means for identifying the user by
collating the first feature amount registered in the storage means
with a second feature amount extracted from the extraction means
after the first feature amount is registered,
[0128] wherein, when the identification means identifies the user
as being identical, the transmission means transmits the first
acoustic signal every predetermined interval.
[0129] [Supplementary Note 2]
[0130] The personal authentication device according to
Supplementary note 1, further comprising:
[0131] a service control means for providing a service to the user
when the identification means identifies the user as being
identical.
[0132] [Supplementary Note 3]
[0133] The personal authentication device according to
Supplementary note 1 or 2, wherein, when the identification means
is not able to identify the user as being identical, the service
control means stops providing the service and the storage control
means deletes the first feature amount registered in the storage
means.
[0134] [Supplementary Note 4]
[0135] The personal authentication device according to any one of
Supplementary notes 1 to 3, further comprising:
[0136] a light emitting means for emitting light in a color
different from a color when the user is identified as being
identical in a case where the identification means is not able to
identify the user as being identical.
[0137] [Supplementary Note 5]
[0138] The personal authentication device according to any one of
Supplementary notes 1 to 4, further comprising:
[0139] a timer means for detecting whether a specified time has
passed,
[0140] wherein, when the identification means identifies the user as being
identical and the timer means detects that the specified time has
passed, the service control means stops providing the service to
the user.
[0141] [Supplementary Note 6]
[0142] The personal authentication device according to any one of
Supplementary notes 1 to 5, wherein, when the timer means detects
that the specified time has passed, the light emitting means emits
light in a color different from a color when the timer means
detects that the specified time has not passed.
[0143] [Supplementary Note 7]
[0144] A personal authentication method comprising:
[0145] transmitting a first acoustic signal to a part of a head of
a user;
[0146] observing a second acoustic signal that is an acoustic
signal after the first acoustic signal propagates through the part
of the head of the user;
[0147] calculating acoustic characteristics from the first acoustic
signal and the second acoustic signal;
[0148] extracting a feature amount related to the user from the
acoustic characteristics;
[0149] registering the feature amount in a storage means as a first
feature amount; and
[0150] identifying the user by collating the first feature amount
registered in the storage means with a second feature amount
extracted after the first feature amount is registered,
[0151] wherein, when the user is identified as being identical, the
first acoustic signal is transmitted every predetermined
interval.
[0152] [Supplementary Note 8]
[0153] The personal authentication method according to
Supplementary note 7, further comprising:
[0154] providing a service to the user when the user is identified
as being identical.
[0155] [Supplementary Note 9]
[0156] The personal authentication method according to
Supplementary note 7 or 8, wherein, when the user is not identified
as being identical, providing of the service is stopped and the
first feature amount registered in the storage means is
deleted.
[0157] [Supplementary Note 10]
[0158] The personal authentication method according to any one of
Supplementary notes 7 to 9, further comprising:
[0159] emitting, by a light emitting means, light in a color
different from a color when the user is identified as being
identical in a case where it is not possible to identify the user
as being identical.
[0160] [Supplementary Note 11]
[0161] The personal authentication method according to any one of
Supplementary notes 7 to 10, further comprising:
[0162] detecting whether a specified time has passed,
[0163] when the user is identified as being identical and it is
detected that the specified time has passed, providing of the
service to the user is stopped.
[0164] [Supplementary Note 12]
[0165] The personal authentication method according to any one of
Supplementary notes 7 to 11, wherein, when it is detected that the
specified time has passed, the light emitting means emits light in
a color different from a color when the specified time has not
passed.
[0166] [Supplementary Note 13]
[0167] A recording medium stored with a personal authentication
program causing a computer to perform:
[0168] transmitting a first acoustic signal to a part of a head of
a user;
[0169] observing a second acoustic signal that is an acoustic
signal after the first acoustic signal propagates through the part
of the head of the user;
[0170] calculating acoustic characteristics from the first acoustic
signal and the second acoustic signal;
[0171] extracting a feature amount related to the user from the
acoustic characteristics;
[0172] registering the feature amount in a storage means as a first
feature amount; and
[0173] identifying the user by collating the first feature amount
registered in the storage means with a second feature amount
extracted after the first feature amount is registered,
[0174] wherein, when the user is identified as being identical, the
first acoustic signal is transmitted every predetermined
interval.
[0175] [Supplementary Note 14]
[0176] The recording medium according to Supplementary note 13,
further comprising:
[0177] providing a service to the user when the user is identified
as being identical.
[0178] [Supplementary Note 15]
[0179] The recording medium according to Supplementary note 13 or
14, wherein, when the user is not identified as being identical,
providing of the service is stopped and the first feature amount
registered in the storage means is deleted.
[0180] [Supplementary Note 16]
[0181] The recording medium according to any one of Supplementary
notes 13 to 15, further comprising:
[0182] emitting, by a light emitting means, light in a color
different from a color when the user is identified as being
identical in a case where it is not possible to identify the user
as being identical.
[0183] [Supplementary Note 17]
[0184] The recording medium according to any one of Supplementary
notes 13 to 16, further comprising:
[0185] detecting whether a specified time has passed,
[0186] when the user is identified as being identical and it is
detected that the specified time has passed, providing of the
service to the user is stopped.
[0187] [Supplementary Note 18]
[0188] The recording medium according to any one of Supplementary
notes 13 to 17, wherein, when it is detected that the specified
time has passed, the light emitting means emits light in a color
different from a color when the specified time has not passed.
[0189] So far, the present invention has been described with
reference to the present example embodiments and the examples;
however, the present invention is not limited to the aforementioned
example embodiments and examples. Various modifications which can
be understood by a person skilled in the art can be made in the
configuration and details of the present invention within the scope
of the present invention.
[0190] This application is based upon and claims the benefit of
priority from Japanese Patent Application No. 2016-181897 filed on
Sep. 16, 2016, the entire contents of which are incorporated herein
by reference.
REFERENCE SIGNS LIST
[0191] 1 information processing device
[0192] 2 sound processor
[0193] 3 microphone amplifier
[0194] 4 earphone
[0195] 5 microphone
[0196] 6 user
[0197] 7 lamp
[0198] 11 transmission unit
[0199] 12 observation unit
[0200] 13 calculation unit
[0201] 14 extraction unit
[0202] 15 storage control unit
[0203] 16 identification unit
[0204] 17 storage unit
[0205] 18 service control unit
[0206] 21 timer control unit
[0207] 22 lamp control unit
[0208] 28 service control unit
[0209] 31 transmission unit
[0210] 32 observation unit
[0211] 33 calculation unit
[0212] 34 extraction unit
[0213] 35 storage control unit
[0214] 36 identification unit
[0215] 37 storage unit
[0216] 100 personal authentication device
[0217] 200 personal authentication device
[0218] 300 personal authentication device
[0219] 500 information processing device
[0220] 501 CPU
[0221] 503 RAM
[0222] 504 program
[0223] 505 storage device
[0224] 506 recording medium
[0225] 507 drive device
[0226] 508 communication interface
[0227] 509 communication network
[0228] 510 input/output interface
[0229] 511 bus
* * * * *